cross-nvptx-gcc10-10.3.0+git1587-lp152.5.1

Name:         cross-nvptx-gcc10
Version:      10.3.0+git1587
Release:      lp152.5.1
Summary:      The GNU Compiler Collection targeting nvptx
Description:  The GNU Compiler Collection as a cross-compiler targeting nvptx.
License:      GPL-3.0-or-later
Group:        Development/Languages/C and C++
URL:          https://gcc.gnu.org/
Bug reports:  http://bugs.opensuse.org
Distribution: openSUSE Leap 15.2
Vendor:       openSUSE
Build host:   lamb63
OS / Arch:    linux / x86_64
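This package supplies the nvptx-none accelerator back end that the host x86_64 gcc10
drives through mkoffload and the libgomp nvptx plugin. As a minimal sketch of how such
an offload compiler is typically exercised (the file name, program contents, and exact
host driver name gcc-10 are illustrative assumptions, not taken from this package), an
OpenMP target region can be offloaded to the PTX device:

    /* offload_probe.c -- minimal OpenMP target-offload check (illustrative).
     * Hypothetical build line, assuming the matching host gcc10 is installed:
     *   gcc-10 -fopenmp -foffload=nvptx-none offload_probe.c -o offload_probe
     */
    #include <omp.h>
    #include <stdio.h>

    int main(void)
    {
        int on_device = 0;

        /* Run the region on the default offload device if one is available;
         * omp_is_initial_device() returns 0 only when the code actually runs
         * on the accelerator (here: the nvptx target built by this package). */
        #pragma omp target map(from:on_device)
        on_device = !omp_is_initial_device();

        printf("offload devices: %d, ran on device: %s\n",
               omp_get_num_devices(), on_device ? "yes" : "no");
        return 0;
    }

If no nvptx device or plugin is available at run time, the region falls back to the
host, so the probe prints "ran on device: no" rather than failing.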
Source RPM:   cross-nvptx-gcc10-10.3.0+git1587-lp152.5.1.src.rpm

The packaged assembler and linker are symlinks resolving to
../../../../../../nvptx-none/bin/as and ../../../../../../nvptx-none/bin/ld;
all packaged files are owned by root:root.

Provides:
  cross-nvptx-gcc10 = 10.3.0+git1587-lp152.5.1
  cross-nvptx-gcc10(x86-64)
  libtool(/usr/lib64/gcc/x86_64-suse-linux/10/accel/nvptx-none/libcaf_single.la)
  libtool(/usr/lib64/gcc/x86_64-suse-linux/10/accel/nvptx-none/libgfortran.la)
  libtool(/usr/lib64/gcc/x86_64-suse-linux/10/accel/nvptx-none/libgomp.la)
  libtool(/usr/lib64/gcc/x86_64-suse-linux/10/accel/nvptx-none/mgomp/libcaf_single.la)
  libtool(/usr/lib64/gcc/x86_64-suse-linux/10/accel/nvptx-none/mgomp/libgfortran.la)
  libtool(/usr/lib64/gcc/x86_64-suse-linux/10/accel/nvptx-none/mgomp/libgomp.la)

Requires:
  cross-nvptx-newlib-devel
  nvptx-tools
  update-alternatives (install and uninstall scriptlets)
  ld-linux-x86-64.so.2()(64bit), ld-linux-x86-64.so.2(GLIBC_2.3)(64bit)
  libc.so.6()(64bit) with GLIBC_2.2.5, GLIBC_2.3, GLIBC_2.9, GLIBC_2.11, GLIBC_2.14
  libm.so.6()(64bit) with GLIBC_2.2.5
  libgmp.so.10()(64bit)
  libisl.so.15()(64bit)
  libmpc.so.3()(64bit)
  libmpfr.so.6()(64bit)
  libz.so.1()(64bit)
  rpmlib(CompressedFileNames) <= 3.0.4-1
  rpmlib(FileDigests) <= 4.6.0-1
  rpmlib(PayloadFilesHavePrefix) <= 4.0-1
Changelog (most recent first; entries by Richard Biener, Martin Liška, Ludwig Nussel,
Matthew Trescott, Guillaume GARDET, Matwey Kornilov, and Andreas Schwab):

- Update to GCC 10.3.0 release (63fa67847628e5f358e7e2e7e), git1587
- Disable nvptx offloading for aarch64 again since it doesn't work
- Update to gcc-10 branch head (892024d4af83b258801ff7484), git1574
  * Includes GCC 10.3 RC1
- Update to gcc-10 branch head (592388d4f6e8a6adb470428fe), git1450
- Update to gcc-10 branch head (85977f624a34eac309f9d77a5), git1331
  * Includes fix for [bsc#1182016]
- The 32bit nvptx libgomp plugin is no longer built, do not attempt to package it.
- Remove include-fixed/pthread.h
- Change GCC exception licenses to SPDX format
- Update to gcc-10 branch head (e563687cf9d3d1278f45aaebd), git1030
  * Includes fix for firefox build [gcc#97918]
- Do not specify alternate offload compiler location at configure time.
- Update README.First-for.SuSE.packagers
- Install offload compilers for gcc10-testresults build
- Enable fortran for offload compilers.
- Add gcc10-amdgcn-llvm-as.patch to fix build of amdgcn offload compiler with llvm11.
- Update to gcc-10 branch head (98ba03ffe0b9f37b4916ce6238), git958.
  * Includes fix for memcpy miscompilation on aarch64. [bsc#1178624, bsc#1178577]
- Fix 32bit libgnat.so link. [bsc#1178675]
- prepare usrmerge: Install libgcc_s into %_libdir. ABI wise it stays /%lib. (boo#1029961)
- Update to gcc-10 branch head (a78cd759754c92cecbf235ac9b), git872.
- Build complete set of multilibs for arm-none target [bsc#1106014]
  * Fixes inadvertent mixture of ARM and Thumb instructions in linker output
- Update to gcc-10 branch head (c0746a1beb1ba073c7981eb09f), git583.
  * Fixes ABI breakage for as-base CDTORs of final classes. [gcc#95428]
- Update to gcc-10 branch head (d523b5201cce1796717a8ca669), git580.
  * Includes gcc10-streamer-backports1.patch and gcc10-streamer-backports2.patch.
  * Includes fixes for LTO ICE [bsc#1175168] and aarch64 128bit CAS miscompilation [bsc#1174753].
- Update to gcc-10 branch head (dda1e9d08434def88ed86557d0), git501.
  * Includes fix for AARCH64 kernel build failure. [bsc#1174817]
  * Includes aarch64 SLS mitigation changes. [bsc#1172798, CVE-2020-13844]
- Add gcc10-streamer-backports1.patch and gcc10-streamer-backports2.patch.
- Enable x86 CET runtime for SLES15 and Leap15 also.
- Do not enable the now deprecated HSA offloading capability.
- Update to gcc-10 branch head (c0438ced53bcf57e4ebb1c38c), git465.
  * Includes GCC 10.2 release. [bsc#1173972] [jsc#ECO-2373]
  * Picks up fixes for C++20 coroutines support. [jsc#SLE-12297]
  * Picks up fix for a recent chromium build fail.
- Build x86 CET enabled runtime for Factory.
- Disable GCN offloading for SLE12 and SLE15 GA.
- Update to gcc-10 branch head (12e1a54b06777db74ce375496), git355.
  * Includes fix for non-reproducible builds with LTO [bsc#1172846].
- Enable nvptx support for aarch64
- Update to gcc-10 branch head (c91e43e9363bd119a695d6450), git290.
  * Includes fix for PR95719, fixing LibreOffice.
- Enable c++ for arm-none-eabi
- Update to gcc-10 branch head (b0461f44076c26ced5526e4fd6), git68.
- Add gcc10-foffload-default.patch to make offloading ignore offload targets that have
  not been installed both at compile and runtime (for the libgomp plugin part).
- Update to gcc-10 branch head (dd38686d9c810cecbaa80bb82e), git40.
  * Includes GCC 10.1 release.
- Update to gcc-10 branch head (2aaa1dc3c87372fd55c1c33aa7a), git5.
  * Includes first release candidate for GCC 10.1.
  * Includes gcc10-pr94734.patch
- Update to master head (3685c5adf5c0b30268cb8f95c89e4), git176017.
- Add gcc10-pr94734.patch
- Update to master head (b835645c7a51b7e99092abe61d677), git175845.
- Drop to 4 jobs as constraint for s390x.
- Update to master head (effcb4181e143bc390286a489ff84), git175831.
- Package arm_cde.h and arm_mve_types.h for arm.
- Alter _constraints to also constrain jobs.
- Add libzstd-devel BuildRequires to cross compiler specs.
- Switch to release checking builds.
- Update to master head (2dc9294c3c7c81a6d5e1d4dedf58f), git175805.
- Update to master head (13e41d8b9d3d7598c72c38acc86a3), git175688.
- Update to master head (c72a1b6f8b26de37d1a922a8af143), git175641.
- Update to master head (038769535a8cbdd3dd3e100bde314), git175499.
- Update to master head (75c24a08d697d6442fe6c26142f05), git175422.
- Update to master head (7d4549b2cd209eb621453ce13be7f), git175366.
- Update to master head (c7e9019681857b329bbe4c1e7ec8d), git175348.
- Package arm_mve.h for arm.
- Update to master head (4e3d3e40726e1b68bf52fa205c68495124ea60b8).
- libgphobos and libgdruntime SONAME versions were reset to 1.
- Update to master head (3604480a6fe493c51d6ebd53d9b1abeebbbb828f).
- Update embedded newlib to newlib-3.3.0.tar.xz, drop old newlib-3.1.0.tar.xz
- Enable support for amdgcn-amdhsa OpenMP/OpenACC offloading.
- Update to master head (655e5c29ae4080666154b3e10ac81116a1b7a638).
- Re-add gcc9-reproducible-builds.patch and gcc9-reproducible-builds-buildid-for-checksum.patch.
- Update to master head (778a77357cad11e8dd4c810544330af0fbe843b1).
  * Includes fix for binutils version parsing [gcc#93965]
- Add libstdc++6-pp provides and conflicts to avoid file conflicts with same minor
  version of libstdc++6-pp from gcc9.
- Disable zstd use for SLES15 and older.
- Bump to rfa1160f6e50500aa38162fefb43bfb10c25e0363.
- Bump to r33351ff9faa21c4c1af377d661a52ac0ce366db3.
- Adjust installed headers for arm and aarch64, enable link-mutex for riscv64.
- Don't remove go tool buildid, needed for bootstrapping go
- Increase disk constraint
- Bump to rc940105cc17111be98d8d42ba48a413b0e63aebe.
- Bump libtool version of libgo.
- Bump to r269e8130b77065452698ab97e5da77d132d00276.
- Bump to r507de5ee23efdc8a16d6b0b6488e118055c711cd.
- Add lto-dump to cross packages.
- New package, inherits from gcc9
  * gcc-add-defaultsspec.diff, add the ability to provide a specs file that is read by default
  * tls-no-direct.diff, avoid direct %fs references on x86 to not slow down Xen
  * gcc43-no-unwind-tables.diff, do not produce unwind tables for CRT files
  * gcc41-ppc32-retaddr.patch, fix expansion of __builtin_return_addr for ppc, just a testcase
  * gcc44-textdomain.patch, make translation files version specific and adjust textdomain to find them
  * gcc44-rename-info-files.patch, fix cross-references in info files when renaming them to be version specific
  * gcc48-libstdc++-api-reference.patch, fix link in the installed libstdc++ html documentation
  * gcc48-remove-mpfr-2.4.0-requirement.patch, make GCC work with earlier mpfr versions on old products
  * gcc5-no-return-gcc43-workaround.patch, make build work with host gcc 4.3
  * gcc7-remove-Wexpansion-to-defined-from-Wextra.patch, removes new warning from -Wextra
  * gcc7-avoid-fixinc-error.diff

Built on lamb63 at 1622802473 (2021-06-04).
Build flags: -fmessage-length=0 -grecord-gcc-switches -O2 -Wall -D_FORTIFY_SOURCE=2
-fstack-protector-strong -funwind-tables -fasynchronous-unwind-tables
-fstack-clash-protection -g
Disturl: obs://build.opensuse.org/openSUSE:Maintenance:16441/openSUSE_Leap_15.2_Update/67dda6723b2f6cf288c232241e90934d-cross-nvptx-gcc10.openSUSE_Leap_15.2_Update
Platform: x86_64-suse-linux, payload format drpm.

Packaged directories:
  /usr/bin/
  /usr/lib64/gcc/x86_64-suse-linux/
  /usr/lib64/gcc/x86_64-suse-linux/10/
  /usr/lib64/gcc/x86_64-suse-linux/10/accel/
  /usr/lib64/gcc/x86_64-suse-linux/10/accel/nvptx-none/
  /usr/lib64/gcc/x86_64-suse-linux/10/accel/nvptx-none/finclude/
  /usr/lib64/gcc/x86_64-suse-linux/10/accel/nvptx-none/include-fixed/
  /usr/lib64/gcc/x86_64-suse-linux/10/accel/nvptx-none/include/
  /usr/lib64/gcc/x86_64-suse-linux/10/accel/nvptx-none/mgomp/
  /usr/lib64/gcc/x86_64-suse-linux/10/accel/nvptx-none/mgomp/finclude/

Packaged files (basenames): x86_64-suse-linux-accel-nvptx-none-gcc-10,
x86_64-suse-linux-accel-nvptx-none-lto-dump-10, as, ld, collect2, lto-wrapper, lto1,
f951, mkoffload, crt0.o, README, limits.h, syslimits.h, ISO_Fortran_binding.h,
acc_prof.h, float.h, gcov.h, iso646.h, omp.h, openacc.h, stdalign.h, stdarg.h,
stdatomic.h, stdbool.h, stddef.h, stdfix.h, stdint-gcc.h, stdint.h, stdnoreturn.h,
tgmath.h, unwind.h, varargs.h, omp_lib.f90, omp_lib.h, omp_lib.mod, omp_lib_kinds.mod,
openacc.f90, openacc.mod, openacc_kinds.mod, openacc_lib.h, libcaf_single.a,
libcaf_single.la, libgcc.a, libgcov.a, libgfortran.a, libgfortran.la, libgfortran.spec,
libgomp.a, libgomp.la, libgomp.spec; crt0.o, the finclude headers, and the libraries
are duplicated under mgomp/ for the mgomp multilib.
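The include/ and finclude/ directories listed above ship the device-side OpenMP and
OpenACC headers and Fortran modules (omp.h, openacc.h, acc_prof.h, omp_lib.*,
openacc.*). As an illustrative OpenACC counterpart to the OpenMP sketch earlier
(again assuming the matching host gcc10 driver and an illustrative file name; the
-fopenacc and -foffload=nvptx-none options are standard GCC flags), a parallel loop
can be offloaded like this:

    /* saxpy_acc.c -- illustrative OpenACC offload sketch.
     * Hypothetical build line:
     *   gcc-10 -fopenacc -foffload=nvptx-none saxpy_acc.c -o saxpy_acc
     */
    #include <openacc.h>
    #include <stdio.h>

    #define N 1024

    int main(void)
    {
        float x[N], y[N];
        for (int i = 0; i < N; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        /* The loop body is compiled for nvptx-none by this package's back end
         * and launched through libgomp's nvptx plugin when a device exists;
         * otherwise it runs on the host. */
        #pragma acc parallel loop copyin(x) copy(y)
        for (int i = 0; i < N; ++i)
            y[i] += 2.0f * x[i];

        printf("current device type: %d, y[0] = %f\n",
               (int)acc_get_device_type(), y[0]);
        return 0;
    }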
File classes recorded in the header: stripped, dynamically linked 64-bit x86-64 ELF
executables (interpreter /lib64/ld-linux-x86-64.so.2, for GNU/Linux 3.2.0), directories,
ASCII text, gzip-compressed ASCII text, C source headers, current ar archives, and
libtool library files.

Payload: drpm delta built against cross-nvptx-gcc10-10.2.1+git583-lp152.2.1.
F" )V T(TB<@  Eg:%    2 -A~)S: FK$ (&V  0" C E 63m w]*k$& *sIB$(FL  J a l | Q D   Z8\wm .bU&R E/6j-mb -L '9V\w2]  c#+c 5G#!w ^z ;AZ")BJ#<+9_[G&Y9=R#$ " *3"CJ166Bg2 ee# Y.5 7I#?2>  <CU-)[UfCC ;X4R:.3^+27 <CA@<BBB<H?FH@D868>249+@ci+C@@>?_dV8?22 SE-$7s"%^ BX_Q[a 9/M( J3 e/Q%%i^(isj|zzz zz {yzz2>*i <Rk %UUEE6667CjhR# | d|6PH0nTL@K<|BBD4<"H"BFHr@%D86,G85.>2%(H7X&HI! ,!D O!Q  46YHH {Z_[;! _Hq <!T  !T 9+:Q@X%69 xi zhhhhVge6\R1oB/RzUB66np~hhhhVgi Dyu7H+Q^C @@EH>%?=@B4{b!8Oc!8I'x;!I#  eRRMQ48PMR=#T6/K#N#L#tR"###9L:FT8_c~:P6TF,T }d P. 2#LuF 9DXC39 t#  K32 ?20 ACd<,[>V,Z  <j8 v#=TC @#>)));.#>//!/?4 57 43:9DH=::FF\_]<tFYY,4[6qG pX@O:u]ZC&9&4`a(,:&(*)6^+D#(*)H)6=DO((*<Z4% *f *,uH 0YA60JH&7G]## N0QYA60|&6&-]## 0YA60|&6]-]## 0YA60|&6]-]## YA62|&6(-]## E0YA60JH&G]## ^0QYA60|&6&-]## 0YA60|&6]-]## 0YA60|&6L-]## YA62|&6-]## 76YA<6I&'-3# 6YA<667#\" 56YA<6|&6]-]## 6YA<6|&6L-]## 8YA<8|&6H-]## <>\8rMe+L<;%&0-+>!'u(,!)#]> >N(*!a<>\8rMe+L<;K%<(+>!&%'J[& M(+))G> >J(*!_>@\8rMe+L>;8A-+@!&YJ'&(!!)@ >A(*!t<>\8rMe+L<;lA., >!&'6v(+ $> >N(*!b<>\8rMe+L<;<(+>!&'ZI#(+!)Q> >J(*!^>@\8rMe+L>;JA(+^@!J%(+!)@ >A(*!s<>\8rMe+L<;A.,>!&''v(+ $>> >N(*!b<>\8rMe+L<;<(+j>!&'Z#(+!)> >J(*!^>@\8rMe+L>;&A(+@!(+!) >A(*!s<>\8rMe+L<;A.,>!&''v(+ $G> >N(*!b<>\8rMe+L<;<(+s>!&'Z#(+!)> >J(*!^>@\8rMe+L>;&A(+@!(+!)@ >A(*!t>@\8rMe+L>;(A.,@!&''[(+ $@ wP((!f>@\8rMe+L>;(<(+@!&'!#(+!)V@ wL((!Y@B\8rMe+L@;JA(+KB!"(+!)B wC((!v:>\8rMe+L:; mI#0(+>!&O(+!)Y> ovP((!f:>\8rMe+L:;,9'#>-+>!&'k'm #(+!)> ovL((!b<@\8rMe+L<;A(,H@!9~Z(,!)#@ ovC((!w:>\8rMe+L:; mI#0(+>!&O(+!)a> ovP((!f:>\8rMe+L:;,9'#>-+>!&'k'm #(+!)> ovL((!b<@\8rMe+L<;A(,P@!9~Z(,!)#@ ovC((! <<\8sL [)N(<%I&e t-+*<{H7XY([\# (,)<!I9(#2k-;2<<\8sL [)N(<9%IG8' V -#+ < %&7%IL(\ -,)<!I9(#2y-;2>>\8sL [)N(>%Ie -SI,>'[I[m7}~-,)>!I9(#2|-;2<<\8sL [)N(<'I%e v-,<G86~8('\ (#,) <!I9(#2k-;2<<\8sL [)N(<&Ie -,c<&:(\ -,)<!I9(#2y-;2>>\8sL [)N(>l%[d -,>Y'8-9NkH7.Z# -+)8>!I9(#2|-;2:}<\8sL [)N(<'I%m&p v-,R<G86\'\ (#,)<!I9(#2k-;2:}<\8sL [)N(<&&e -, <&\ -,)<!I9(#2y-;2<>\8sL [)N(>l -'>Y'8-&[&Z# -+)>!I9(#2|-;2<<\8sL [)N(<'I%m&p v-,[<G86\'\ (#,)<!I9(#2k-;2<<\8sL [)N(<&&e -, <&\ -,)<!I9(#2y-;2>>\8sL [)N(>l -'>Y'8-&[&Z# -+)>!I9(#2|-;2>@\8sL [)N(>'I%'| v-,N@G86L'l!\ (#,)@!I9(!#2k-;2>@\8sL [)N(>&e -,N@&Z!\-,)@!I9(!#2y-;2@B\8sL [)N(@ l: -, BY'8-&(~"Z# -+)B!I9("#2|-;2;~=\8sL [)N(<G%9e u-,=XGGJkZp \ (#,)Y=!I9( #2k-;2;~=\8sL [)N(<'I%6J(e .,=&' \-,)O=!I9( #2y-;2=?\8sL [)N(>&)&e }(,7?j'%8X![ -,)U?!I9(!#2|-;2<>\8sL [)N(<G%9e u-,>XGGJkZp \ (#,)a>!I9( #2k-;2<>\8sL [)N(<'I%6J(e .,>&' \-,)W>!I9( #2y-;2>@\8sL [)N(>&)&e }(,?@j'%8X![ -,)]@!I9(!#2|-;2  4v7\8sL-[)G4I'8&Q-, Z6YI{'%IiZ&g#-,")6%Z #;34v7\8sL-[)G4%Q z-, 68G8`g#-,)<6I%j +;14v7\8sL-[)G4%Q k-, 68G9`g# -,)6I%j -;14v7\8sL-[)G4'IH8Q y-, 6I7\g#-#,)I6I%j -;18y9\8sL-[)G8%]~ |-,A:8G:^g#-,):I%j' -;16v7\8sL-[)G63&7Hy-,7{&}88(g# -,)7I%| -;16v7\8sL-[)G63&7Hy-,8{&}88(g# -,)8I%| -;1<>\8rMe+L<;%&0-+>!'u(,!)#\> >N(*!b<>\8rMe+L<;K%<(+>!&%'J[& M(+))E> >J(*!_>@\8rMe+L>;8A-+@!&YJ'&(!!)@ >A(*!s<>\8rMe+L<;lA.,>!&'6v(+ $> >N(*!a<>\8rMe+L<;<(+>!&'ZI#(+!)Q> >J(*!_>@\8rMe+L>;JA(+X@!#J%(+!)@ >A(*!t<>\8rMe+L<;A.,>!&''v(+ $=> >N(*!a<>\8rMe+L<;<(+d>!&'Z#(+!)> >J(*!_>@\8rMe+L>;&A(+@!#(+!)w@ >A(*!t<>\8rMe+L<;A.,>!&''v(+ $F> >N(*!a<>\8rMe+L<;<(+m>!&'Z#(+!)> >J(*!_>@\8rMe+L>;&A(+@!(+!)@ >A(*!u>@\8rMe+L>;(A.,@!&''[(+ $@ wP((!f>@\8rMe+L>;(<(+@!&'!#(+!)M@ wL((!Y@B\8rMe+L@;JA(+BB!"(+!)B wC((!w:>\8rMe+L:; mI#0(+>!&O(+!)W> ovP((!e:>\8rMe+L:;,9'#>-+>!&'k'm #(+!)> ovL((!b<@\8rMe+L<;A(,C@!9~Z(,!)#@ ovC((!w:>\8rMe+L:; mI#0(+>!&O(+!)_> ovP((!e:>\8rMe+L:;,9'#>-+>!&'k'm #(+!)> ovL((!b<@\8rMe+L<;A(,K@!9~Z(,!)#@ ovC((! 
<<\8sL [)N(<ZzKp k-,<{H7XY([`# (,)<!I9(#2k-;2<<\8sL [)N(<&(e -,< %&7%IL(` -,)<!I9(#2y-;2>>\8sL [)N(>'%e x-,:>'[I[m7~-,)>!I9(#2|-;2<<\8sL [)N(<9e -,9<G86~8('` (#,)<!I9(#2k-;2<<\8sL [)N(<%GJe -,<&:(`-,)<!I9(#2y-;2>>\8sL [)N(>G}%%b -">Y'8-9NkH7.^# -+)B>!I9(#2|-;2:}<\8sL [)N(<9J'e -,<G86\'` (#,)<!I9(#2k-;2:}<\8sL [)N(<%G(e -,/<&`-,)<!I9(#2y-;2<>\8sL [)N(>GII%b -"F>Y'8-&[&^# -+)>!I9(#2|-;2<<\8sL [)N(<9J'e -,<G86\'` (#,)<!I9(#2k-;2<<\8sL [)N(<%G(e -,A<&`-,)<!I9(#2y-;2>>\8sL [)N(>GII%b -"X>Y'8-&[&^# -+)>!I9(#2|-;2>@\8sL [)N(>98p -,8@G86L'l!` (#,)@!I9(!#2k-;2>@\8sL [)N(>%G -,@&Z!`-,)@!I9(!#2y-;2@B\8sL [)N(@G[%b -"BY'8-&(~"^# -+)B!I9("#2|-;2;~=\8sL [)N(<G:Z%b g-"=XGGJkZp ` (#,)i=!I9( #2k-;2;~=\8sL [)N(<'I%('e -'=&' `-,)Y=!I9( #2y-;2=?\8sL [)N(>Y%]9'a y-,?j'%8X!_ -,)f?!I9(!#2|-;2<>\8sL [)N(<G:Z%b g-">XGGJkZp ` (#,)q>!I9( #2k-;2<>\8sL [)N(<'I%('e -'>&' `-,)a>!I9( #2y-;2>@\8sL [)N(>Y%]9'a y-,@j'%8X!_ -,)n@!I9(!#2|-;2  4v7\8sL-[)G4I'8&Q-, X6YI{'%IiZ&g#-,")6%Z #;34v7\8sL-[)G4%Q z-, ~68G8`g#-,);6I%j +;14v7\8sL-[)G4%Q k-, 68G9`g# -,)6I%j -;14v7\8sL-[)G4'IH8Q y-, 6I7\g#-#,)H6I%j -;18y9\8sL-[)G8%(~ |-,O:8G:^g#-,):I%j' -;1 6v7\8sL-[)G63&7Hy-,7{&}88(g# -,)7I%| -;16v7\8sL-[)G63&7Hy-,8{&}88(g# -,)8I%| -;1 6y9\8sL-[)J6(&l # #+ r8%Ie -,)(8%Z #;36y9\8sL-[)J6&'&m }#, 8j%I87I%g#-,)S8I%j +;16y9\8sL-[)J63%Q k-, -88G9`g# -,)8I%j -;16y9\8sL-[)J6'IH8Q y#, 8I7\g# -#,)H8I%j -;1:|;\8sM-[)J:3%Q |-,a<8GI<`g#-,)8<I%j' -;18y9\8sL-[)J83%&|Q k-, <98G9`g# -,)9I%| -;18y9\8sL-[)J83%&|Q y-, L:8G9`g#-,):I%| -;1 8y9\8sL-b108l8P -,9j&7%I8`g-,)9I%8 -;18y9\8sL-![108l8P ,*:j&7%I8`g-,):I% -;1 .m1\8sL -[)>.(&l # #+ o0%Ie -,)%0%Z  #;3.m1\8sL -[)>.&'&m }#, 0j%I87I%g#-,)P0I%j +;1.m1\8sL -[)>.3%Q k-, *08G9` g# -,)0I%j -;1.m1\8sL -[)>.'IH8\ y#, 0I7\g# -#,)I0I%j -;1p3\8sL -[)>23%H& |-,48Gl`g#-,)4I%j -;10m1\8sL -[)>03%&|Q k-, <18G9`g# -,)1I%| -;10m1\8sL -[)>03%&|Q y-, L28G9`g#-,)2I%| -;1 0m1\8sL -[1$07HHr-, 1{&}88'' g#-,)"1I%8 -;1 0m1\8sL -[1$07HHr, 2{&}88'' g#-,)*2I% -;1 <RY.]<J7-:: T,<QY,[<(J1-*p:#P,! p3\8sL-[)A0I'8_-,M2%I{'%IiZ&# -,")2%Z  #;3p3\8sL-[)A03%Q z-, 28G8` g#-,)J2I%j +;1p3\8sL-[)A03%Q k-, +28G9` g# -,)2I%j -;1p3\8sL-[)A0'IH8\ y#, 2I7\# -#,)F2I%j -;14s5\8sL-[)A3%\ |-, 68GJY`w#-,)E6I%j -;1p3\8sL-[)A0I'8_-,M2%I{'%IiZ&# -,"),2%Z #;3p3\8sL-[)A03%Q z-, 28G8`g#-,)H2I%j +;1p3\8sL-[)A03%Q k-, )28G9`g# -,)2I%j -;1p3\8sL-[)A0'IH8\ y#, 2I7\# -#,)H2I%j -;14s5\8sL-[)A3%\ |-, 68GJY`w#-,)#6I%j -;16y9\8sL-[)J6I'8_-,M8%I{'%IiZ&g# -,")8%Z #;36y9\8sL-[)J63%Q z-, 88G8`g#-,)I8I%j +;16y9\8sL-[)J63%Q k-, *88G9`g# -,)8I%j -;16y9\8sL-[)J6'IH8\ y#, 8I7\g# -#,)I8I%j -;1:|;\8sL-[)J:3%\ |-, <8GJY`g#-,)%<I%j -;148s2(4'Q v-,48s0(4'Q -,I 68t(68m* +-+ 68t(6'I&&m x(, 68t(6'IHZ k-, 68t(67H87Qr#, E88t(8'IHIQ |-, 4Y ZB&4 zLa4p#QR_#;O`4Y ZB&46%666G$;H4q-RR#4DO`GWY ZB&4%%%MX?4b-RR#`;`l4Y ZB&4G%%%G=B4p-RR##K<`>8Y#]B&8%%%GJ4s-RR##K=`.Bggfgfgfgf    XY ZB&5G%6%G6iF@4b-R#R`#;_`s6Y ZB&6XF%GG6F-4p-RR#K <I`5t Y.T75%5 X%S4p-g^##P`W6Y.T76G%5 GX%S4s-g^##Q` 6Y<^I[kB6Yp8j"`(#Jsxk)6Y<^I[kB6YL8j"n.#Jsxk)8Y<aI[kB8Y}9(8&"q(#JsxH j91R1% J,10R0Y1R1G1R1+ U*[3S3|,B\nP4 (~!'7*)+:X_85B5D8,B\nP4 (~!'\*)+8Z}85B5D8,DF\nP4 (~!k(%8.-(+3|:k^:5D5F8,B\nP4 (~!&',S,(+WsW"85B5D8,B\nP4 (~!&I-S-(+NmnW"85B5D8,DF\nP4 (~!'(7\J%,S,)}) ?rZ(:5D5F8*t?B\8nP_<<<<<<<1\D&&8 t(~)!!!!!!!!! 
H 5 @ B *t?B\8nP_<<<<<<<1\E&L(K77,(~)!!!!Hn6 @ B *wAD\8nPa>>>>>>>1\D&L(KpnZ,(~)    """"RH 8 B D 68AMn6" X&6-q7)v#  68AMn6" X&6-q7)v#  88AMn8" X&6+r-q7)v#  68AMn6" X&6-q7)v#  68AMn6" X&6-q7)v#   88AMn8"&86(q(O7(v#7A  88AMn8"&86-q(O7(v#7A   T:(T%@(RT"-UT-UU-T"-UT-U TFTF G!K >sF!J PG!K MG!K MI"M&NI!M MI!M MI!M3I!M3 L3 L4 ;H2u545$545$50 M1 q;L q;59HLLU LLe 545Ud5455U %:=:WPPGNP_ 757757$ 28K2u- -2Q8K2&Wu-#28K2&'Lu--28K2&'Lu--68K6&l'yu-Z-38K3&'Lu--48K4&'Lu-- 38K3Y87Lu-W-48K4Y87Lu-Z( 8x7A8EY8kY-.(8E7Y_5--*8x7A8E7%l--' 8EkF--*` 8x7A8E7%l--' 8E7kF--*` 8x7A8E7%l--' 8E7kF--*` <{7A<E7%l--' ?<ElkG--*` 9x7A9E7%l--' 9E7kF--*` :x7A:E7%l--' :E7kF--*` 9x7A9E'&F(#( R9EK--*:x7A:E'&F(#( R:EL--* \8ySYN[4'J98#D#(=\8ySYN[4Z9(8--AD#[(#f4\8ySYN[4Z(6f8--AD#[#q\8ySYN[4Z(6f8-#AD#[#r:\8ySYN[:Z9(8--AH#[#7\8ySYN[7Z(6f8--AE#[#q\8ySYN[8Z(6f8-#AF#[#s 7\8ySYN[7Z(w8#-AE%#[,\8ySYN[8Z(w8#-AF%#[( 95=(GG b95= JBH (7=J:H 595= J:H 6;6? JH u8= J:H 595= J:H 3 8;H H n75;H H {;!?Q- <!?~I: C;!?%6I5H Z;!?%6I5H Z?"C%(%FH <!@%6I5H Z=!A%6I5H =!@y: >!Ay: <!?Hw meG "<!?%&H ;!?K%: ;!?K%: ?"Cj': *<!@K%: =!AK%: <!@K6| : '=!AK6| : '>"A(H9%%F ]-="A&(9K\  H ^="A&H(:H ^="A&H(:H OA#E&H]\'H 9->"B&H(:H ^?"C&H(:H R>"B"]Yn9H 1N?"C"]Yn9H 1B<>u8rM1_L<'#*(,t> 8 #(+!'> #N(!l<=\8rM1_L<7#0-+-X-*=O9H #(+!'G-i-X=#N(!l<>u8rM1_L<&'7-)m> '7 #( !)> #J(!h<=\8rM1_L<[#>( )X=' - !'|>.i.>=#J(!i>@u8rM1_L>()@ n!s(+!)@ #A(!>?\8rM1_L>I#A-)$-X-2?>P!#(+!'m-i-A?#A(!~<>u8rM1_L<'#*(,t> 8 #(+!'> #N(!l<=\8rM1_L<7#0-+-X-*=O9H #(+!'G-i-X=#N(!l<>u8rM1_L<&'7-)m> '7 #( !)> #J(!h<=\8rM1_L<[#>( )X=' - !'|>.i.>=#J(!i>@u8rM1_L>()@ n!s(+!)@ #A(!>?\8rM1_L>I#A-)$-X-2?>P!#(+!'m-i-A?#A(!~<>\8sMV)N-< 6'y -,/>6'k'( ~F-,) >!6'Z # -?5;~=\8sMV)N-;8&'y -,K-g- =''6H[  lH-#,) --=!6'Z # -?5<>\8sMV)N-<  ,X2,x>6{%~ vS-,) ~>!6'I # #?5;~=\8sMV)N-;'b -,B#g#='k\ & ~V-,)((=!6'I # #?5>@\8sMV)N-> %99I~ -,d@'I+9e!lY-,)  @!6'!# -?5=?\8sMV)N-=G'x ,5-g-?YI:},6!lV-#,) b-- l?!6'!# -?5<>\8sMV)N-< 6'y -,/>6'k'( ~F-,) >!6'Z # -?5;~=\8sMV)N-;8&'y -,K-i- =''6H[  lH-#,) --=!6'Z # -?5<>\8sMV)N-<  ,X2,x>6{%~ vS-,) ~>!6'I # #?5;~=\8sMV)N-;'b -,B#i#='k\ & ~V-,)((=!6'I # #?5>@\8sMV)N-> %99I~ -,d@'I+9e!lY-,)  @!6'!# -?5=?\8sMV)N-=G'x ,5-i-?YI:},6!lV-#,) b-- l?!6'!# -?5n9(A<ELFc#"8n99A<uFc#@4"8n9(A<I%%#"Hn99A<:SI&#?(( "H=@A<>A(@ %#@ =?A<>+#s#f?#)!? n9(A<ELF+d#"8n99A<uFd#@"8n9(A<I%#u"Hn99A<:SI&#?(("H=@A<>A(@ %#@ ;?A<>+#s#u?H#)? 
y:8de$L8E#u\=:v!:;97y:8de$L7E#u";#=:&4v.T!:2(y:8ee$L8E#uX=:7"Y!: 0(7y:8ee$L7E#u47#=:7&81.T!:D'8PgGsMG>`!]86  (,R,3:{i'L#,)W-R:!'}(6 +&#18PgGsMG>`!]87l%9V -,h-%:%&69H[J-L}-#,)#UW:!8&g#(=3N8PgGsMG>a!]86 (,V,3:{iqL#,)[-R:!'}(6 +&#1(8PgGsMG>a!]87l%l9V -,{-%:%&69H[K'L}-#,)#UW:!8&#(=36:<o8rM@1_L:;("#>.,l <;GP!U#+!)#<J(!\:<o8rM@1_L:;,&8#>- v <;7'( $S+!)q?#E<J(!]:<o8rM@1_L:;,'H#>- v <;7YZ $S+!)q?#<J(!]::\8rM@1_L:!,'H#>- v :E7YH $S+!)q?#: J(!^<>\8rM@1_L<!&>- v >E%9 9-US+!)!d>J(!!]9{;\8rM@1_L9!,'H#>- v !7YZ $S+!)q?#; J(!]:<\8rM@1_L:!,'H#>- v ;7YZ $S+!)q?#< J(!^9{;\8rM@1eL9!7%(#>( q' 4 &}H%T(,!)?#;u4#J!\:|<\8rM@1eL:!7%(#>( q' 4 &}H%T(,!)?#<u4#J!]:<\8ro;_V):;,J2#>(+vQ)~)<;'%U(+))$q.")/<#J(!]9;\8rL;_V)9; ,.#>-+}g44r;;&'TU-!!)3#2&2;#J(!\:<o8s@V(:&8[y u(,< 7|%%&( )-*<'|  -,:<o8s@V(:[%6n ~,,c<66| ## /,*]<'|  -,:<o8s@V(:[%&'y ~,,*<66[(## /,*<'|  -,8|Rs@V(8[%[&'y ~,,*<66'&## /,*<'|  -,<Ts@V(<69# ~,,>X6Z%68|# /,*>  -*9{;\8s@V(9[%I&'y ~,,*66[\## /,*'|  -,:|<\8s@V(:[%I&'y ~,,*66[\## /,*'|  -,9;\8s;-V9'%%j( -,(%I(I 8 +(, &l;5 -*:<\8s;-V:'%%j( -,(%I(I 8 +(, &l;5 -*:^o8s^[(:9% {-,$3$J<6%GZ%8X# )(*..<'JO -*9^o8s;[(9&68 q,,434?<X'%mx# ,(,33{<'JO -*oc<%$ $~#)y)J~$Vo@<H/ / //t~$V K$ -   3Vs6&@  IBL LY t)c 'B $c8Asg &{a3G%t38 !%6}$:SRi?^4C%+ H7iV"/0W* ] ! "+ R35R3=~47"("4X"L" ! R'mfX/zRo&<t,H<"*?B6?B6?B6?B8AD\8n8999;99771I L'(I KE'c#(GPX GP6R @ B 6R @ B 5 @ B 6 @ B 8T B D ,B8B8B8B:DF\nD,<6t-y(%SAY(8^EB"GD?G8`EB$GDAG8M\CBGDC8`kEBGDG:yGDGFEG,B8B8B8B:DF\nD,<!Z(89&%D,(juAY j8bdXbdXb%ud%XbdZdfX!W02[ Z " Lf w:CQC$ 4#4%" y3.56689{6668%7!7<;E<\6v,5809<o8322242222 :e -'"!(Gk-!6-(9, -68n0 :P<n 6T$'A  8:~/9<>GJ766686666DDDDFDDDD\YNpOYc,I#hV$=#t#(#0 HHHHHHHHd222 : < >#2777!7 7 7 7 sj!"H#'"J#"o'#QW:722>@F\>>>G(Y7[5Y")8Z6 {-A.5a<2J>#)l!4G$@3E Uo=7>$6*G##{/-w-G, E4z88@@8 Wq$S ?Y6r6DJ  b@m#14m@&2##o<62<]o8AR"J%'H\P Os#t<##P#7 * 28< !;WA:EBH ayk 1aIII24,0.2.M3%" ,eJL07d.LE0L%` 3`zM5dKME5DsD 6b2;>2;>o8MH888:88888888:8888*[Qk"#$"<$Tk#$|-z#=VW2mqn<+>M"2&m <>x112t(BGgeijikikE}ddd#ffgh#\pZ \nZ  UBUU\E>\S \U^`4``6Zn`? \l` CWAv@.CUA/?E\n`?\l` CWAv@.CUA/?E `tdA `rd^^W\^YEXEvABEXC/@E^mb@^ob `tdA `rd^qb@^ob `tdA `rd^qb@^ob `tdA `nd\j`? \l` CWAv@.CSG/=E\n`? \l` CWAv@.CUA/?E\n`? \l`^qb@^ob^qb@^ob^qb@^kb EEEE^^[\^] `tdA `rdigG_^^T\^V Zi EE`\\ \\^t^^r^ QEIC\pZ \nZ aIlkJL<kM2 2 \\IJJhJhLh(^I%'OM '^`diieKnKngMpggiiieKnKnggggifopS4R2P22444,0222446ble%i[i6eXLv@6L/?7eXLw@7L?8i[grNwA7N@7gZgZgZi[i[i[eYLw@7L=8eYLw@7L?8eYgZgZgZgvi[ge i&e&N{N8C9/*U 8aQ&= j )  ,  0A,E3+-f*(R'3J ;1 E3?h!L b)' G@,& W28,( :)Rf?@#FamoKc0%"Bn++ -W*3VlWr+jY6a-mAA0amSS$&XB  <P{=%: R& &/4F!3T& @" h[hF! hh<!g[@! s- !+Hh] !3L!,3;j6!%aqF!j~rZ`@ d+Q!&6%dXD qs5==^=!!7!=!=4+"l:f 8f<{IJ!:(:'6%G%6:%} !Q#%Q_v;QQ s au['!v$0v;!<;wvv  Y!9l%$#8_%~VY %lhH!'w">>|ZZj6x2%%.&%5T ('=7g:!NZZ*7$5DHZ 7%'''7C#\%-B 9.V WC~9.!Xj|'u \kteuU6zp0zwz#fz7z %zz!%k^7PA}e}a_b+&Z}AQf(,.a.h:.-h"!f%n$E/#< ''J>P!'|8\'j=L)z bt{+[%A#$z U;%};nV_%]MT .(,<.@.[.C"%9( v6 }A$}MJ=,) U @, 1DP;_ :cG#l-? 8 (#Qyh.&&2 +1A2` 'IAc!Q<3z 2S4,s)' ..oU H 7 O%Z|&lscZZji3/ _Q)a(h'Pb`E'j-}xxz B`Z? ^P@0o]>&tOE98tDSC + {^yb .3 2\Vu{$|+0ii[Q-.CXOOZ&pcc=vu[<# __t} xw> 9V 1q @~O<f@  r%bG 6UY#4qM o R  6i>~gkQNW> '8@/)y;!^T"J'Z|Oc#^N}/%+"? \w 5 nN~ Y@^ CD <\Hfl ]   !}+$]K(^D^t$8^]Xp L R_P2|Q4f4 iygh&-JAR&4 p,*6 deM2:^B )u:5 T tU  tw +-a' 47G3{Sj= FM`qOB  $D9LVyK? E(d f0P)rZ c\/h}w @Z:A(Q>  b<l\ v:]Y4B9L] ;. V*7 3 u #]='5<NX( Zf_~U)k}rS}!=Q,+Dk_soooK*=KtK-KgKIKKKK3KK2KKJoz Gzyw <|N$qK ?c-   Pdf888fgupp t ' i ! 5g)mf5r@_ W _I>< re1q]R$Q3ln Ii{&#=B>f?z IZ? o:`q9P= PU'! @ Eg:,~!)S: FK$ ( c5:# Sz0AZ")BJ#1 ,PN<&Y92G$" 33Jv&6)BZ'ee Y!5 7I? 
>  @GU-)cUfC"CY:.3^+27 <CA@<BBB<H?FH@D868>249+@ci+C@@>?_dV8?22 zE-$7s"%^ BX_Q[a 9/ -( J'3 e/R6%i^(isj|zzz zz {yzz2>*i <Rk %UUQE6667CjhR# | d|6PH0nT1@K<|BBD4<"H"BFHr@%D86,G85.>2%(H7X&HI! ,!D O!Q  46YHH {Z_[;! _Hq <!T  !T 9+:Q@%69 xi zhhhhVge6\R1oB/RzDB66np~hhhhVgi DyuH+Q^C @@ C1lH>?@1=gB4{b!8Oc!8I'x;!I  eRRMQ48KMR=#T6/K#N#L#R"###9%L:FT8c7P6TFT}P W. 2#LuF 9DXC39 t#  K32 ?20ACd<,[>V,Z  <j8 v#=TO @#>)));.#>//!/?457 43:9DH=::FF\]<tFYY,4[6qG pX@O:u]Z&9&4`a(,:9(* @)6K ?+D6(* @)HC0)6=DO((*<lZ4% *f *,uHl 0Y<=0JH&7G]## ; 0Y<=0|&6&-]## 0Y<=0|&6]-]## 0Y<=0|&6]-]## 2Y<=|&6(-]## 1 0Y<=0JH&G]## K 0Y<=0|&6&-]## 0Y<=0|&6]-]## 0Y<=0|&6L-]## 2Y<=|&6-]## 6Y<C6I&'-3# 6Y<C667#\"  6Y<C6|&6]-]## 6Y<C6|&6L-]## 8Y<C8|&6H-]## <>\8rMe+L<%&0-+> 'u(,!)#`> >N(*!H<>\8rMe+L<K%<(+>&%'J[& M(+))J> >J(*!D>@\8rMe+L>8A-+ @&YJ'&(!!)@ >A(*!Z<>\8rMe+L<lA.,>&'6v(+ $> >N(*!H<>\8rMe+L<<(+>&'ZI#(+!)T> >J(*!D>@\8rMe+L>JA(+i@J%(+!)@ >A(*!Z<>\8rMe+L<A.,>&''v(+ $A> >N(*!H<>\8rMe+L<<(+u>&'Z#(+!)> >J(*!D>@\8rMe+L>&A(+@(+!)@ >A(*!Z<>\8rMe+L<A.,>&''v(+ $J> >N(*!H<>\8rMe+L<<(+~>&'Z#(+!)> >J(*!D>@\8rMe+L>&A(+ @(+!)@ >A(*!Z>@\8rMe+L>(A.,@&''[(+ $@ wP((!K>@\8rMe+L>(<(+@&'!#(+!)Y@ wL((!G@B\8rMe+L@JA(+VB"(+!)B wC((!]:>\8rMe+L<mI#0(+>&O(+!)\> ovP((!L:>\8rMe+L<,9'#>-+>&'k'm #(+!)> ovL((!H<@\8rMe+L>*A(,S@9~Z(,!)#@ ovC((!^:>\8rMe+L<mI#0(+ >&O(+!)d> ovP((!L:>\8rMe+L<,9'#>-+>&'k'm #(+!)> ovL((!H<@\8rMe+L>*A(,[@9~Z(,!)#@ ovC((!^ <<\8sL [)N(<%I&e t-+5<{H7XY([\# (,)<I9(#2k-;2<<\8sL [)N(<9%IG8' V -#+ < %&7%IL(\ -,)<I9(#2y-;2>>\8sL [)N(>%Ie -SI,>'[I[m7}~-,)>I9(#2|-;2<<\8sL [)N(<'I%e v-,<G86~8('\ (#,)<I9(#2k-;2<<\8sL [)N(<&Ie -,n<&:(\ -,)<I9(#2y-;2>>\8sL [)N(>l%[d -,>Y'8-9NkH7.Z# -+)D>I9(#2|-;2:<\8sL [)N(<'I%m&p v-,]<G86\'\ (#,)<I9(#2k-;2:<\8sL [)N(<&&e -, <&\ -,)<I9(#2y-;2<>\8sL [)N(>l -'>Y'8-&[&Z# -+)>I9(#2|-;2<<\8sL [)N(<'I%m&p v-,f<G86\'\ (#,)<I9(#2k-;2<<\8sL [)N(<&&e -,<&\ -,)<I9(#2y-;2>>\8sL [)N(>l -'#>Y'8-&[&Z# -+)>I9(#2|-;2>@\8sL [)N(>'I%'| v-,Y@G86L'l!\ (#,)@I9(!#2k-;2>@\8sL [)N(>&e -,Y@&Z!\-,)@I9(!#2y-;2@B\8sL [)N(@ l: -,+BY'8-&(~"Z# -+)BI9("#2|-;2;=\8sL [)N(<G%9e u-,=XGGJkZp \ (#,)f=I9( #2k-;2;=\8sL [)N(<'I%6J(e .,=&' \-,)[=I9( #2y-;2=?\8sL [)N(>&)&e }(,B?j'%8X![ -,)a?I9(!#2|-;2<>\8sL [)N(<G%9e u-,>XGGJkZp \ (#,)n>I9( #2k-;2<>\8sL [)N(<'I%6J(e .,>&' \-,)c>I9( #2y-;2>@\8sL [)N(>&)&e }(,J@j'%8X![ -,)i@I9(!#2|-;2  6v7\8sL-[)G6I'8&Q-, g6YI{'%IiZ&g#-,")'6%Z #;3 6v7\8sL-[)G6%Q z-, 68G8`g#-,)G6I%j +;1 6v7\8sL-[)G6%Q k-, ,68G9`g# -,)6I%j -;1 6v7\8sL-[)G6'IH8Q y-, 6I7\g#-#,)T6I%j -;1 8y9\8sL-[)G8%]~ |-,N:8G:^g#-,):I%j' -;1 6v7\8sL-[)G6&7Hy-,7z&}88(g# -,)7I%| -;1 6v7\8sL-[)G6&7Hy-,8z&}88(g# -,)8I%| -;1<>\8rMe+L<%&0-+>'u(,!)#_> >N(*!H<>\8rMe+L<K%<(+>(&%'J[& M(+))H> >J(*!D>@\8rMe+L>8A-+@&YJ'&(!!)@ >A(*!Z<>\8rMe+L<lA.,>&'6v(+ $> >N(*!H<>\8rMe+L<<(+>&'ZI#(+!)T> >J(*!D>@\8rMe+L>JA(+c@J%(+!)@ >A(*!Z<>\8rMe+L<A.,>&''v(+ $@> >N(*!H<>\8rMe+L<<(+o>&'Z#(+!)> >J(*!D>@\8rMe+L>&A(+@(+!)z@ >A(*!Z<>\8rMe+L<A.,>&''v(+ $I> >N(*!H<>\8rMe+L<<(+x>&'Z#(+!)> >J(*!D>@\8rMe+L>&A(+@(+!)@ >A(*!Z>@\8rMe+L>(A.,@&''[(+ $@ wP((!K>@\8rMe+L>(<(+@&'!#(+!)P@ wL((!G@B\8rMe+L@JA(+MB"(+!)B wC((!]:>\8rMe+L<mI#0(+>&O(+!)Z> ovP((!L:>\8rMe+L<,9'#>-+>&'k'm #(+!)> ovL((!H<@\8rMe+L>*A(,N@9~Z(,!)#@ ovC((!^:>\8rMe+L<mI#0(+>&O(+!)b> ovP((!L:>\8rMe+L<,9'#>-+>&'k'm #(+!)> ovL((!H<@\8rMe+L>*A(,V@9~Z(,!)#@ ovC((!^ <<\8sL [)N(<ZzKp k-,<{H7XY([`# (,)<I9(#2k-;2<<\8sL [)N(<&(e -,< %&7%IL(` -,)<I9(#2y-;2>>\8sL [)N(>'%e x-,E>'[I[m7~-,)>I9(#2|-;2<<\8sL [)N(<9e -,D<G86~8('` (#,)<I9(#2k-;2<<\8sL [)N(<%GJe -,<&:(`-,) <I9(#2y-;2>>\8sL [)N(>G}%%b -">Y'8-9NkH7.^# -+)L>I9(#2|-;2:<\8sL [)N(<9J'e -,<G86\'` 
(#,)<I9(#2k-;2:<\8sL [)N(<%G(e -,:<&`-,)<I9(#2y-;2<>\8sL [)N(>GII%b -"Q>Y'8-&[&^# -+)>I9(#2|-;2<<\8sL [)N(<9J'e -,<G86\'` (#,)<I9(#2k-;2<<\8sL [)N(<%G(e -,L<&`-,)<I9(#2y-;2>>\8sL [)N(>GII%b -"c>Y'8-&[&^# -+)>I9(#2|-;2>@\8sL [)N(>98p -,C@G86L'l!` (#,)@I9(!#2k-;2>@\8sL [)N(>%G -,@&Z!`-,)@I9(!#2y-;2@B\8sL [)N(@G[%b -"BY'8-&(~"^# -+)BI9("#2|-;2;=\8sL [)N(<G:Z%b g-"=XGGJkZp ` (#,)s=I9( #2k-;2;=\8sL [)N(<'I%('e -'=&' `-,)c=I9( #2y-;2=?\8sL [)N(>Y%]9'a y-,?j'%8X!_ -,)p?I9(!#2|-;2<>\8sL [)N(<G:Z%b g-">XGGJkZp ` (#,){>I9( #2k-;2<>\8sL [)N(<'I%('e -'>&' `-,)k>I9( #2y-;2>@\8sL [)N(>Y%]9'a y-,@j'%8X!_ -,)x@I9(!#2|-;2  6v7\8sL-[)G6I'8&Q-, e6YI{'%IiZ&g#-,")&6%Z #;3 6v7\8sL-[)G6%Q z-, 68G8`g#-,)F6I%j +;1 6v7\8sL-[)G6%Q k-, *68G9`g# -,)6I%j -;1 6v7\8sL-[)G6'IH8Q y-, 6I7\g#-#,)S6I%j -;1 8y9\8sL-[)G8%(~ |-,\:8G:^g#-,):I%j' -;1 6v7\8sL-[)G6&7Hy-,7z&}88(g# -,)7I%| -;1 6v7\8sL-[)G6&7Hy-,8z&}88(g# -,)8I%| -;1 8y9\8sL-[)J8(&l # #+ g8%Ie -,)38%Z #;38y9\8sL-[)J8&'&m }#, 8j%I87I%g#-,)^8I%j +;1 8y9\8sL-[)J8%Q k-, "88G9`g# -,)8I%j -;1 8y9\8sL-[)J8'IH8Q y#, 8I7\g# -#,)S8I%j -;1:|;\8sM-[)J:%Q |-,V<8GI<`g#-,)8<I%j' -;18y9\8sL-[)J8%&|Q k-, 198G9`g# -,)9I%| -;1 8y9\8sL-[)J8%&|Q y-, A:8G9`g#-,):I%| -;1 8y9\8sL-b108l8P -,9j&7%I8`g-,)9I%8 -;18y9\8sL-![108l8P ,:j&7%I8`g-,):I% -;1 0m1\8sL -[)>0(&l # #+ d0%Ie -,)00%Z  #;3 0m1\8sL -[)>0&'&m }#, 0j%I87I%g#-,)[0I%j +;1 0m1\8sL -[)>0%Q k-, 08G9` g# -,)0I%j -;1 0m1\8sL -[)>0'IH8\ y#, 0I7\g# -#,)T0I%j -;1 2p3\8sL -[)>2%H& |-, 48Gl`g#-,)4I%j -;1 0m1\8sL -[)>0%&|Q k-, 118G9`g# -,)1I%| -;1 0m1\8sL -[)>0%&|Q y-, A28G9`g#-,)2I%| -;1 0m1\8sL -[1$07HHr-, 1{&}88'' g#-,)-1I%8 -;1 0m1\8sL -[1$07HHr, 2{&}88'' g#-,)52I% -;1 <RY.2<J7-:<3T, <zQY,0<J1-*p< P, 2p3\8sL-[)A2I'8_-,B2%I{'%IiZ&# -,")2%Z  #;3 2p3\8sL-[)A2%Q z-, 28G8` g#-,)U2I%j +;1 2p3\8sL-[)A2%Q k-, 28G9` g# -,)2I%j -;1 2p3\8sL-[)A2'IH8\ y#, 2I7\# -#,)Q2I%j -;1 4s5\8sL-[)A%\ |-, 68GJY`w#-,)P6I%j -;1 2p3\8sL-[)A2I'8_-,B2%I{'%IiZ&# -,"),2%Z #;3 2p3\8sL-[)A2%Q z-, 28G8`g#-,)S2I%j +;1 2p3\8sL-[)A2%Q k-, 28G9`g# -,)2I%j -;1 2p3\8sL-[)A2'IH8\ y#, 2I7\# -#,)S2I%j -;1 4s5\8sL-[)A%\ |-, 68GJY`w#-,).6I%j -;18y9\8sL-[)J8I'8_-,B8%I{'%IiZ&g# -,")8%Z #;3 8y9\8sL-[)J8%Q z-, 88G8`g#-,)T8I%j +;18y9\8sL-[)J8%Q k-, 88G9`g# -,)8I%j -;1 8y9\8sL-[)J8'IH8\ y#, 8I7\g# -#,)T8I%j -;1:|;\8sL-[)J:%\ |-, <8GJY`g#-,)0<I%j -;1 48s2(4'Q v-, 48sF(4'Q -,% 68s(68m* +-+ 6Q8s(6'I&&m x(, 68s(6'IHZ k-, l 68s(67H87Qr#, " 88s(8'IHIQ |-, 4YUZB&4 zLa4p#QR_#;O`34YUZB&4X6%666G$;H4q-RR#4DO`\VYUZB&4#%%%MX?4b-RR#`;`r4YUZB&4G%%%G=B4p-RR##K<`l8Y#]B&8%%%GJ4s-RR##K=`.Bggfgfgfgf    XYUZB&5G%6%G6iF@4b-R#R`#;_`r6YUZB&6XF%GG6F-4p-RR#K <I`ku YT75%5 X%S4p-g^##P` 6YT76G%5 GX%S4s-g^##Q`  6Y<^I[kB6Yp8j"`(#Jsx.5)6Y<^I[kB6YL8j"n.#Jsx.5)8Y<aI[kB8Y}9(8&"q(#JsxH.492R1% J,E2R0J1R1]/R1+ U*\1S3{,B\nP4 (~'7*)+:Xl85B5D8,B\nP4 (~'\*)+8Z85B5D8,DF\nP4 (~k(%8.-(+3|:kk:5D5F8,B\nP4 (~&',S,(+WsW!85B5D8,B\nP4 (~&I-S-(+NmnW!85B5D8,DF\nP4 (~'(7\J%,S,)}) ?rZ':5D5F8*{?B\8nP_<<<<<<<1\&&8 t(~)!!!!!!!!! 
H5 @ B *{?B\8nP_<<<<<<<1\&L(K77,(~)!!!!Hc6 @ B *~AD\8nPa>>>>>>>1\&L(KpnZ,(~)    """"RH 8 B D 68AMn6 X&6-q7)v#   68AMn6 X&6-q7)v#  88AMn8 X&6+r-q7)v#   68AMn6 X&6-q7)v#   68AMn6 X&6-q7)v#   88AMn886(q(O7(v#7A   88AMn886-q(O7(v#7A   [:([%@(b["-e[-e\-["-e[-e [F[F G!E >NG!D PG!E MG!E MI"G&NI!G MI!G MI!G I!G  L6  L75H2u575$575$5/ M4 q; L q;75H L"LUJ"Le 575Uc5755U #%97:WP#PGP#P_ 787887; 28K2u- - 28K2&Wu-# 28K2&'Lu-- 28K2&'Lu-- 68K6&l'yu-Z- 38K3&'Lu-- 48K4&'Lu-- 38K3Y87Lu-W- 48K4Y87Lu-Z( 87A8Y8kY-.(87Y_5--* 87A87%l--' '8kF--*` 87A87%l--' 87kF--*` 87A87%l--' 87kF--*` <7A<7%l--' J<lkG--*` 97A97%l--' 97kF--*` :7A:7%l--' :7kF--*`  97A9'&F(#( ]9K--* :7A:'&F(#( ]:L--* \8ySYN[4'J98#D#( \8ySYN[4Z9(8--AD#[(#C 4\8ySYN[4Z(6f8--AD#[#N \8ySYN[4Z(6f8-#AD#[#N:\8ySYN[:Z9(8--A H#[#z 7\8ySYN[7Z(6f8--AE#[#N \8ySYN[8Z(6f8-#AF#[#N 7\8ySYN[7Z(w8#-A E#[,z \8ySYN[8Z(w8#-A F#[(z 987(GG o 987JBH 8&7J:H :87J:H ;99JH 9&7 J:H :87 J:H  8&5H H { 785H H { ;(9Q- ;(9~I: P;(9%6I5H g;(9%6I5H g@)=%(%FH <(:%6I5H g=(;%6I5H g<(:y: =(;y: ;(9Hw meG ;(9%&H ;(9K%: ;(9K%: ?)=j': <(:K%: =(;K%: <(:K6| : =(;K6| : =);(H9%%F ]:=);&(9K\  H :>);&H(:H :>);&H(:H : A*?&H]\'H 9:?)<&H(:H :@)=&H(:H : >)<"]Yn9H 1*?)="]Yn9H 1* <>u8rM1_L<'#*(,t> 8 #(+!'> #N(!H<=\8rM1_L<7#0-+-X-5=O9H #(+!'G-i-c=#N(!H<>u8rM1_L<&'7-)m> 7 #( !)> #J(!D<=\8rM1_L<[#>( )X =' - !'|>.i.I=#J(!D>@u8rM1_L>()@ n!s(+!)@ #A(!Z>?\8rM1_L>I#A-)$-X-=?>P!#(+!'m-i-L?#A(!Z<>u8rM 1_L<'#*(,t> 8 #(+!'> #N(!H<=\8rM 1_L<7#0-+-X-5=O9H #(+!'G-i-c=#N(!H<>u8rM 1_L<&'7-)m> 7 #( !)> #J(!D<=\8rM 1_L<[#>( )X =' - !'|>.i.I=#J(!D>@u8rM 1_L>()@ n!s(+!)@ #A(!Z>?\8rM 1_L>I#A-)$-X-=?>P!#(+!'m-i-L?#A(!Z<>\8sMV)N-< 6'y -,/>6'k'( ~F-,) >6'Z # -?5;~=\8sMV)N-;8&'y -,K-g-=''6H[  lH-#,) --=6'Z # -?5<>\8sMV)N-<  ,X2,x>6{%~ vS-,) >6'I # #?5;~=\8sMV)N-;'b -,B#g#='k\ & ~V-,)((=6'I # #?5>@\8sMV)N-> %99I~ -,d@'I+9e!lY-,)  @6'!# -?5=?\8sMV)N-=G'x ,5-g-?YI:},6!lV-#,) b-- m?6'!# -?5<>\8sMV)N-< 6'y -,/>6'k'( ~F-,) >6'Z # -?5;~=\8sMV)N-;8&'y -,K-i-=''6H[  lH-#,) --=6'Z # -?5<>\8sMV)N-<  ,X2,x>6{%~ vS-,) >6'I # #?5;~=\8sMV)N-;'b -,B#i#='k\ & ~V-,)((=6'I # #?5>@\8sMV)N-> %99I~ -,d@'I+9e!lY-,)  @6'!# -?5=?\8sMV)N-=G'x ,5-i-?YI:},6!lV-#,) b-- m?6'!# -?5q9(A<ELFc#"8q99A<uFc#@4"8q9(A<I%%#"Hq99A<:SI&#?(( "H=@<<>A(@ %#@  =?<<>+#s#f?#)!? q9(A<ELF+d#"8q99A<uFd#@"8q9(A<I%#u"Hq99A<:SI&#?(("H=@<<>A(@ %#@  =?<<>+#s#u?H#)? :8de$L8#u\=":v#:;97:8de$L7#u";#=":&4v.T#:2(:8ee$L8#uX=":7"Y#: 0(7:8ee$L7#u47#=":7&81.T#:D'8PgGsMG>`!]86  (,R,3:{i'L#,)W-R:'}(6 +&#1 8PgGsMG>`!]87l%9V -,h-2:%&69H[J-L}-#,)#Uc:8&g#(=378PgGsMG>a!]86 (,V,3:{iqL#,)[-R:'}(6 +&#1 8PgGsMG>a!]87l%l9V -,{-2:%&69H[K'L}-#,)#Uc:8&#(=3:<o8rM; 1_L:("#>.,l <GP!U#+!)#<J(!]:<o8rM; 1_L:,&8#>- v <7'( $S+!)q?#P<J(!\:<o8rM; 1_L:,'H#>- v <7YZ $S+!)q?#<J(!\::\8rM; 1_L:,'H#>- v :7YH $S+!)q?#: J(!h<>\8rM; 1_L<&>- v >%9 9-US+!)!o>J(!!79;\8rM; 1_L9,'H#>- v !;7YZ $S+!)q?#; J(!\:<\8rM; 1_L:,'H#>- v !<7YZ $S+!)q?#< J(!h9{;\8rM;1eL97%(#>( q' ; &}H%T(,!)?#;u4#J!]:|<\8rM;1eL:7%(#>( q' < &}H%T(,!)?#<u4#J!i:<\8ro;_V):,J2#>(+vQ)~)<'%U(+))$q."):<#J(!\9;\8rL;_V)9 ,.#>-+}g44r;&'TU-!!)3#2&2;#J(!]:<o8s;V(:#&8[y u(,< 7|%%&( )-* <'|  -,:<o8s;V(:[%6n ~,,p<66| ## /,*i<'|  -,:<o8s;V(:[%&'y ~,,7<66[(## /,*(<'|  -,8|Rs;V(8[%[&'y ~,,7<66'&## /,*(<'|  -,<Ts;V(<69# ~,,>X6Z%68|# /,*>  -*9{;\8s;V(9[%I&'y ~,,7;66[\## /,*(;'|  -,:|<\8s;V(:[%I&'y ~,,7<66[\## /,*(<'|  -,9;\8s;-V9'%%j( -,5;6%I(I 8 +(,;&l;5 -*:<\8s;-V:'%%j( -,5<6%I(I 8 +(,<&l;5 -*:^o8s^[(:9% {-,$3$K<6%GZ%8X# )(*..<'JO -*9^o8s;[(9&68 q,,434@<X'%mx# ,(,33<'JO -*rc<%$ $~#)y)J~$V r@<H/ / //t~$V K$ -   3Vs6&@  IBL LY t)o 'B $c8Asg &{a3Gt38 %6}$:SRi?^4C%+ 7iV"/0m* d ! 
"+R 65R38~47"("4X"L" ! R'mfX/zRo&<t,H<"*s?B6?B6?B6?B8AD\8n8999;99771I L'(I KE'c#(GPX G]6R @ B 6R @ B 5 @ B 6 @ B 8T B D ,B8B8B8B:DF\nD,<6t-y(%SAY*8^EB"GD?G8`EB$GDAG8M\CBGDC8`kEBGDG:yGDGFEG,B8B8B8B:DF\nD,<Z(89&%D,(juAY u8bdXbdXb%ud%XbdZdf X(W02[ Z "Lfz:CQC$ 4#4%"'y 5156689{6668%7 7<;E<[6y,5809<o8322242222:e -'"3(Gk-!6-(E, -68n0 :P<n 6T$'A 8 :/9<>GJ766686666DDDDFDDDD\YNpOYc,I#hO$=#t#(#{0 HHHHHHHHd222 : < >#2777!7 7 7 7 sj!"H#'"J#"+ '*QW:722>@F\>>>G(Y7[5Y")8Z6 {-A.5a<2J>#)l,4G$@3E Uo=7>$6*G##{/-u-N, E4z 88@@8 WN4S ?d6r6DJ8 8 j@m#14{@&2##o <92<]o8ARJ%'H\P Os#t<##P#7 * 28< !;WA=EBHayk 1J$III24,0.2.M3%" ,eJL07d.LE0L%` 3`zM5dKME5DsD 6i2;>2;>o8MH888:88888888:88883-["#$"<$3k#$|-z#=VW2mqn<+>M"2&m *<#># -x112t( AGgeijikik?}ddd#ffgh#\wZZuZ $B$U FE)\$\SZ$\U `&`4^&`6 \u`?Zs`>AWD/@/cUD/?/ \u`?Zs`>AWD/@/cUD/?/`{dA^yd@\%^W ^%^YCZF/A/EVF/@/^tb@\vb?^{dA^yd@\xb@\vb?^{dA^yd@\xb@\vb?^{dA^ud@Zq`?Zs`>AWD/@/cSD/=/ \u`?Zs`>AWD/@/cUD/?/ \u`?Zs`>\xb@\vb?\xb@\vb?\xb@\rb?CEEE^%^[ ^%^]^{dA^yd@i GGs^^T ^^VZiEE`\\Z\ `t^ `r^QEE/|pZZnZhIlkJL<kM2 2 \\IJJhJhLh(^I%'OM '^`diieKnKngMpggiiieKnKnggggifopS4R2P22444,0222446ble%i[i6eXL/@6L/?)eXL0@)L?8i[grN0A)N@5gZgZgZi[i[i[eYL0@)L=8eYL0@)L?8eYgZgZgZgvi[ge i&e&N{N8C9&*U 8aQ&= j ) ,  0A,E3+-f*(R'3U;1 3?h!L b)5 Gs,& W28,( :)Rf?@#FamoKc0%"Bn++ -W*3VlWr+jY6a-mAA0amSS$&XB  <P{=-:R& &/4F!3T& @"h[hF hh<g[@ s- +Hh]3 3 ,3;iD%aq'Fj~rg`@ Ml+Q&6%dXD qs5==^=!!7!=!=oB+"l:f 8fJ{IJ:(:'6%G%6:%} !Q#%Q_v;QQ s au['!v$0v;!<;wvv  All%$#8_%~VY %lMV'w">>|ZZj6/%2%%.&%7T('=7g:!NZZ*7$5HZ 7%'''7C#\%-B 9.V WC~9.LXj|'u \kteuU6zp0zwz#fz7z %zz!%k^7PA}e}a_b+&Z}AQf(,.a.h:.-h"!f%n$E/#J ''J>N'|8\'j<L)z bt{+[%A#$z U;%};nV_%]MT .(,<.@.[.C"%( z6 }A$}Ml=,) U F, 1DP;g :cG#N-? 8 *1Qyh.& &2 +1A2` 'IAc!Q<3{ 2H4,u)5..oT H 7O(Z|&s>cZZ BibL&  M%&JCq :4 J&v)a 'M3#7_Q)a(h'Pb`B1ES' j- \Z?>@o)9C #ÍÊ 6aH6h6G6hyz@AB&]ϼ _ۀ111  āƀAA4I@H@̀9S"Pa`{&d # L  F  D 8ۀu) Ձ g   ] 4`X -)^"D    <  w}eA P!`+ @ = V*"*e_p  [` P w8  ) y I  `>ހy|E Y , 8 0       6w  T   ҁ t Hހ I n I / #`y " 9,0oF I2  e Ȃ    9 _} V/ׁ  î   Zǀ  R .," ԁ ) Ā 5; -`}t[Ԁ\I $PV  q!] ±qqS?v72q  E  xJW   x h)H!w!Fo" > k   á;7   ҁ  ,u* MG 6   ]q ! ~ t Ÿ #Q&" ҁ1a" € oV(i6)1-&t ) +  L  *V"o':.<  N  ORp.  J)  # Ps耑o+[&u/ 3mH׀{s  T5 h 8    ́ ( t   A,p   ̀ _ ҁ g 5#P  >#     t S  H$p$q   r18Pm  T kQ6Q4D ^   c é ]  \   = GM  Kq V    #1ځdMf_Ȁ",,I Y#" 4[4A[4VK Ҁ-"w,7 [[gW 2IsIt + p   D 8M Ѐ  sP(.  c e D ˀ@Q<U8 ׁFNFh  F   h  6Yȁ | 4..P H?  Ú u  .  $   && Mv"  ] D G @U  C`3 l`8 XI 5\o  fT@ i /_ T w c  P4Pٵ|1ɀ EZ  Á #F,paNa  I\ r <~贀eXGc';r嫀=S/" 4 ƹ  2:\1R\" j q E  x  K u0 Ӂ ] Y:  4 T  '@抓 d _~B "UD|t" X a/ ́  9%  z@ FVt>&5 {p]  lR;¤;~m :r^  4  , &$A= Fx z ݵ ܻ ݵ ܻ [ h ̀AAA= ?  ? ր ? J6j ? 7 ? .ρC ? ր ?x ? ߀ ? GC ? ߀ ? U ? ߀ ? U.mukul- dj*     ,] p(  U   , pR  \  s]`߁     ,*, *S*pUȀ@  ?6  W))X g d ms7t5t5t"BCUZĀO F{{#$#ɀ % Ĥ p  0 d z  Ns E e t4:o@ Ё  } / :. d p  M f Q12^c^z(SSr  Kx   ^GG !]  g `  ĕ  (}ǁ v=      ,  j u  (: TS ^ _ ?i4[Pr orÊ ;80PeLcsOo=] ŀѓѓ тz`(ZV[QNG8J pHZˁDT! bp` c 1tՀ0@ #  {Ād@P@ 0,3pP  z@ B R ԍ`ϼ\ ؿ,APHP;@ ` tP ` DkfЁ  P d #  0      <X   l P   0  ((       ts ?  L < P   ̀^A Ԁϔϔ 8 R-_ ] ҝ qہ t 4P N    χ   i L4؀6    86 $ O  X  o  0(L` 0 0P &*   8 3 ^w$ q ކ72d  0 - 000"  \ 8      8o Q  j1S=  ( ̀X Q`7 | 1߀@ |K G 4 0\   0*  ؁ H X ́  02 0 ! 2 8 F< _ 8 \[Ϙ 4?sd  ԁ A l ¤) 4 P4 D K ( 8L h   ((  h B |i87H 6\   Kp׀sqXVUcHx){) T Ԁ@ 4 &+E [ k t lj  ߁ ; X C M >h 3&րӀ , `9/K$Z  BX #HvbGїѓѓъ=z`(Z&]ϼގAAt@|9Ha`{qs1\ L,B    F Ҁ  #    F!  
D /ۀt) Ձ Z   w!=TuRcmZ _HO`bT6N4ZZ?&YkށZ<Z<?cPǁ ADO|gwOO)4&3GU 8N8UWUIvp\hIp~2Ph㑀Istl-x)uv<HV/PqT ̀i R O6 / ;/ ?{ހJ&b79.PZaRa06b#Xa&-JTQ@a##-ߧ&F&0ɻ7CVր  o `X -4`  D    <  w}aA P!`+ @ = VoPDĀ@*!*ρ  Ae_p  [` P w8  );OI  `>》y|E  , 8 0       6w  T   ҁ t Hހ I n I / #(y ! 9,0oF I;  e Ȃ    9 _} $V/ׁ  î  ?  {'%R .," Ӂ ) Ā 5; -`C@ddk@PV   ] ±qqS?v72q  E  xJW   x h)H!w!Fo"  &'$  t k   á;7   ׁ  ,u* MG 6   ]q ! ~@ #Q&" ́1a" € oV(i6)1-&t ) +  L  *V"o':.<  N  ORp.  J  # Ps耑o+[&u/ 3mH׀{  T5 h 8    ́ ( t   A,p  @̀ _ M 5#P  >#   #  t S  H$pp$q   r1d !  T kQ6Q4Ё pMRnl ^   c é ]  \   = GM  Kq V T ҀCL  #1ځdMf_Ȁ",,I Y#" 4[4A[4VK Ҁ-"w,7 [[gW 2IsIt + p  8<pM ܔWsP(.  c e D ˀ@Q<U8 ׁFNFh  F   h  6Yȁ | 4..P H?  Ú?  .  %   && Rv"  ] 4 F  G @U  C`3 l`8 XI 9\o  fT@ i /_ T w c  P4Pٵ|1  ɀ EZ  Á EF,paNa  I\ B<~贀eXGc';r嫀=S/" 4 ƾ  2:\1R\" o q  o x  K u0 Ӂ ] Y:  4 T  '@,抓 d _~B "UD|t" X a/ ́  9%  z@ FVt>&5Y vp]  lV;¤;~m :r^  4  , &$A= Fx z ݵ ܻ ݵ ܻ [ h ̀AAA= ?  ? ր ? J6j ? 7 ? .ρC ? ր ?x ? ߀ ? GC ? ߀ ? U ? ߀ ? U.mukul- dj*     ,] p(  \   , pR  \  s]`߁     ,*, *S*pUȀ@  ?6  W))X g d ms7t5t5t"BCUZĀO F{{#$#ɀ % Ě p  0 d z  Ns E e t4:o@ Ё  } / :. d 40  M f Q12^c^z(SSr  Kx   ^GG !]  g n  ħ  (}ǁ v=     C  j u  5: TS ^ _ ?i4[Pr oÏÍÊ ;806csObG=]H ĨdĠaP I)pz`(ZV[QNG8J pHZˁDT! bp` c 1tՀ0@ #  {Ād@P@ 0,3pP  z@ B Rԍ`ϼ\ ؿ,APHP;@ ` tP ` DkfЁ  P dTff^ۀʬ@O lj]ÀA ܙ sŀ3="g  Ƶ w߀  P ϵ Ł J   Ӣp  Jpl&  e ˁ F B  €#>?P*1 Z  ;B z ?(?׀2@T(@:: 2@i#@2>μ#&U#À-??"? πp<'ʀ1}ہ  z;ŀ &oߵ߷ ֊\        0  tӁ   p   =       !        A      l P     :  ?  d  <@$ l 20\ 2| : 2ML j{20X 2 v2|2X j*(*0 jz x  jT  z *R|      ;(     t        ts ?  L < 9 \t '      ҀQvu$ѢTʹ` 0 ſ !       > ̀ T E s㄁^At!;}H ā H Ͽ ?c?c2hȀ2h oϏǣR ØI@2 8 R 3 7   qہ t 4P N    χ   i L4؀6    8޸a ^鞀$iOר!4~ 7զ EI3רx >&f,sD@ 3 YD&8ԀHkCE4G€wD$рFB @Jƃ>R AW鸀9 O  px*<#)Ӗ܀P].2Hׁ:yҮl%q y: T3+ Dс F_[4kR;OE6#M'ށ ۳W}ܺ\g 'hB]?F?>܁FFFN g  X  s j' nC/o  ;3Sg ) 0P 3  !' 8>`w 'CXL ;  q ކ72k  0 8 0;$<#;7 "  k 8      8o Q  u&W=    ̀X Q`&7 | >@ |K R 4 0\   0  ؁ A X Ł  0 4 m@&2 0 . 2 8 9< j 8 \ -@[[( A$ā ԀmLvJd  ԁ A5) 4 P4v8K ( 8L ]   ((  h B |i87   Kp׀sqXVUcHx){) T  Ԁ@ 4 &+E [ k t lj  ߁ ; X C M >h 3&րӀ , `9/K$Uh8Ё j4K:sBp:casvjހcasr22z;YuFDtv itX \ \XC `)wYk C  \ $v al| IaRVa ukv vv 0yyʢ TjXC aiVXVXd7 TjXC& XzDWY{]#F :痀  /.;潁 ! p/R 1) n HG-^$1i0#ijf} qQOj+>`= lQ  7nɒs)J ?q <ڳY4 {5 > xP5N;d3}ŀ;<N01Ku 7W J$MCw߶x l HCH'W^F Yy - ¨Pm||M6ccc }^ܫ 5 MtX/ ?WkK1ɁlodG -gӪ Bc`q l >wd ڜ#i 4$ƀ zE  \ 5T D 9 Y ĀBcD KC+Pځ*ǀ|>ځ*ǀ|>ځ*ǀ|>ځ*ǀ|>ځ*ƀZ<|>ʀ*ĀZ<|>ʀ*ĀZ<|>ʀ*ǀ|>ځ*ـ ˫ ހ ] V`WR ݁{ˀǂp Wxt8lCTHs09QdKvρ(m 7 ^OM8 Á & 1~] i , d]hׁ 08$ \ 5T D(AvS=;Ƿ C}C z ݁@ ~ R{>BЁ* H؀ Ӡ zՁə=}6X z& A M zIb 4%M zIb 4%M zIb 4%MY7ObPs݀6W z; &}"*} ;.}"*ƀZ&}"*9 R }k\\ }5W*ƀY\ }"*ƀY\ }"*ƀY\ }"*ƀY\ }5Z*ƀY\ }5Z*ƀZ&}"*} }5Z*ƀZ\ }5Z*ƀZ\ y-X*ƀZ\ }5X*׀   E k7ٷL5 kF%π]ƀ*YV0NN \LU} 1ǹ [,n5 xf \LU} 11 [,n5 xf1ڳ0\ o-#-L5 t kՀ%*Y Ư ΁aOwU  EBw>/  EBi΀wW6s,?ywW6s:0N 0B uzD.h5€zdK[G  q fD uŀ9Jn \ˁn`tlf./€xxԁ',}^dH! 䞀J8 `ژH g,}ƀ z;껀6 X "Yf k],}ƀ z; 6р k!`q l z aژ d}ʁЀIHD.hgzb i 4](iHŀ 5 \ˁnM{/I37,C(Ӂ ń`(B`q dzd.Y ژH g,}ƀ z; z6] O΀6_3p1Āzd.Y ژWg,}ƀ z; + _2p1Āzd.Y   m L}ʁ5i Tj dH| 3 t*JQ]ɀ }VX64ט =0mH0 H6K4  n *}JcGց uH@ 4%?('.}I~lb c NMӀNMӀNMрNMӀNMӀ " ;*9+Oiځx5s *&I;k]\ }k]OÀ]&I;k]d.YQsk]&}k\iځ *&}k\iځ  *8<u }k^]9H <V }kĀ-1ŀ{ %&5+*Ȁ }VX64טwҀm r35H@ [' Z p- < 4C@@/0 tkjnu ӠNMҀNMӀNMӀNMӀNMҀ9| ӟʀeox(]8<uPsk]8u }k] }k]y{ }k y-]y{ }k]&}k^1ǴQ׀.Y }"*ƀY e d7 Tj\6  `&X ukOЁ]} zIɘ Ё]}Iހ>Q: @؀x)^H>R YҀSx)^H #q_<ԁ y준:Du]Rׁ #-  6OB & ghN 쯀hJf˲S֚n kÉB k>mY ڀmB~ ˁ巀I Y9O [d`1k4р lJD DRW<Fǁ + 3 ;LŸwOSTO ӞH pd.[ }k 0"   }ky 6b y>wL\m _Hc|< ? 
HhG^$A 8BĀ B) Cd%mg)K7L%>ǀ7 ހ Cdx"~3SI] ׀ %ϖ*LM7!^;I )nL  '@ \*{ "[  ..%> G2>|B>ny2;',=z?omyph+`Q ˀ 77,-2Ё/ P !y6EFmkӀ*6E  4 4 }7 MI7)TYL?vX@̎ 1(`K eMπL@x 5 -9Bs ׁ !pG  ݁PEu rdj~!/.9Xi/5mсi.&@wZ\fR}TmtX* N XU ?xU2iT ǀ e*Mɀ M~v{ Ɂρa Ҁ)e z( 4cc tǀ ;I ߀p|%0'_Q z( 4ccn` ( 'P / Si "g "quǀހƾXQPC2NYO3I9m'_q 0.6*& &+]ѥ2 F3 P2 P  $ w7% t  ǀ $  ԀmW7d+ A4ߣ I,]FZnڜYc[oڜƁÌژՀ  ԀmW7d+ A4ߣ I,]FZnڜYc[oڜƁÌژB l `_  )Ҁ. P ~P }a ~Prx Ia W)Q wED ߓL PL HaGƦ%^$ L-ǀ VyD3& ˩YȏCI H Rր5U5i`LXLTo QhZ+YO U~SV O鶀t 9 | $ C l.* *agpnnˏmsS >ÁYP Ј( b֮*C`z pnnˏo YS >ÁYN Ј( pmO  nk E1 ng.hE e %! 9p<(岀ڀGCCo*NzR F˫N ށ*"cg %fP 9' F   PJ<P Ӂ*Nz xdvz #‘#À Cf T3 n %} 6[ ȁEU · V_ F Rn.o~ Am 4kjnwhL67{ RЗvz #‘#À Cf T3 :@?E VL wKwL)7(Hm7l{ U<oЗvz #‘#À Cf T3 :@?E VL wKwL)7(H 1 1m7l{ U<oИvz #‘#À Cf T3 :@?E VL wKw s  ? @t >m7l{ U<oК v4v_  0 N" HąǁW? +D|- + YG es(˗7NF BjOb uW>׹쎀;`bxq %W  An  nxC a  ^ <ҁŀja̴<Vy ژ L$?iF 적 PL!āoNl1{.]J v4v_  0 N" Hąǁ 5۱ zD_̷.]7375̀㏀9jMW uW> rA讁q 3a  zwW0Z 3 |ɟ~Z "6Ҁʿ A꤀2jaObl la% m$?iF 적 PL!āo5{.]J v4v_  0 N" Hąǁ ODO Po e567NV BjOb uW> ^LVg W{+fPgwހ fl 6c n>- Ul<ҁ׳jaDZ < Igژ 4"^%L$?iF 적 PL!āoN^1{.]J v4v_  0 N" Hąǁ@e!;րT 'h4U8w :j 7 ajn uW>  I yK P xI=z4} C㕁l<ҁĀ'pBׁOb< ہ X4%L$?iF 적 PL!āoNl1{.]J v4v_  0 N" HąǁفBK" l݀F ) xJ$lH675̀ꡀ2jOb uW>  p / ˀ ـ] P r `g | s5ҁ[Ӏ꤀2l<  ژ EWMv%L$?iF 적 PL!āo5{.]J v4v_  0 N" Hąǁ IG >G&m O j n(˗7NVn2jOb uW>ɀTߙMӁ'mAC ] rO  >4  .րj- wv<ҁ׳jaOb<&ژ ؁%L$?iF 적 PL!āoN^1{.]J v4v_  0 N" Hąǁ@e!;րT 'h4 & 'M.7 ajn uW>  I yK P ԁ b ݁ >| rF 5. fSl<ҁĀ'pBׁOb< ہ X4%L$?iF 적 PL!āoNl1{.]J v4v_  0 N" HąǁفBK" lj* FEl675̀ꡀ2jOb uW>  p / ˀ ـ 1 6  $8 | s5ҁ[Ӏ꤀2l<  ژ EWMv%L$?iF 적 PL!āo5{.]J v4v_  0 N" Hąǁ IG >G X x H'M.ҁ˗7NVn2jOb uW>ɀTߙMӁ'mAC ] R.~B= Q wv<ҁ׳jaOb<&ژ ؁%S?iF 적 PL!āoN^1{.]J v4v_  0 N" Hąǁ f!;րT 'h4 & 'M.7 ajn uW>sր  I yK P ԁ b ݁ >| rF 5. fSl<ҁĀ'pBׁOb< ہ X4%L$?iF 적 PL!āoNl1{.]J v4v_  0 N" Hąǁـn:{L lj* FEl675̀ꡀ2jOb uW> Hs'c / ˀ ـ 1 6  $8 | s5ҁ[Ӏ꤀2l<  ژ EWMv%L$?iF 적 PL!āo5{.]J v4v_  0 N" Hąǁ IG >G X x H'M.ҁ˗7NVn2jOb uW> HsˀߙMӁ'mAC ] R.~B= Q wv<ҁ׳jaOb<&ژ ؁%L$?iF 적 PL!āoN^1{.]J v4v_  0 N" Hąǁ HӀ;րT 'h4 l-]>  q]7 ajn uW>sƀz^uTW I yK P ԁ b ݁ >| r#,+h 9l<7N]'pBׁOb< ہ X4%L$?iF 적 PJ!oNl1{.]J v4v_  0 N" Hąǁ2K" ljt sW 5 5G675̀ꡀ2jOb uW> D=k 1 / ˀ ـ%V - + | s 75꤀2l<  ژ EWMt%L$?iF 적 PJ!o5{.]J v4v_  0 N" Hąǁ HG >G"-]> C(˗7NVn2jOb uW>sƀˀ1ߙMӁ'mAC ] R.{сF) O wv<7NnjaOb<&ژ ؁%L$?iF 적 PJ!oN^1{.]J v2v_  0 N" HąŁ0,c 8k +^2>R5j | U5TA؀Jс 8?GNFn2  uW>)l W @y=m@*(l m  b ( "87NZjaOb<\a4* ^U|$?iF 저\ 10J!oNl1{.]J v2v_  0 N" HąŁ ?<7NJ 6 ZT 2j kr@#w5Q jOb uW> HCo߃ / ˀ ـ] *9 {}Ё+`V | s 75꤀2l<  ژ EWMvU|$?iF 저\ 10J!o5{.]J v2v_  0 N" HąŁ݈ C _)O9<z˗7NVn2j}k f uW> _|{q %W  2 ݁ ɛ 'Ko  ㆁ^ <7Nmja̴<Vy ژ cC|$?iF 저\ 10J!oN^1{.]J v2v_  0 N" HąŁ0,c 8k +^2>R5j | U5TA؀Jс 8?GNFn2  uW> [l W @y=m@*(l m  b ( "87NZjaOb<\a4* ^U|$?iF 저\ 10J!oNl1{.]J v2v_  0 N" HąŁ ?<7NJ 6%a KZT 2j kr@#w5Q jOb uW> HCo߃ / ˀ ـ] *9 {}Ё+`V | s 75꤀2l<  ژ EWMvU|$?iF 저\ 10J!o5{.]J v2v_  0 N" HąŁ݈ C _)O9<z˗7NVn2j}k f uW> a{q %W  2 ݁ ɛ 'Ko  ㆁ^ <7Nmja̴<Vy ژ cC|$?iF 저\ 10J!oN^1{.]J., ) I% I I% _ Nw fMqLG- `{tـ*LrkX >XAh$ N > ȁ N ہ OR+=J=> qPImD L 7NFj%(\ ڙT1,0 ?  I=c@?GbqK,$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tـ љUdЍ~ } PM<  vZ $̩8mm DSW 7NVPC%(\ ژT1,0 ? 7%= , &| 16AKEƔ \Gs')Zhu1 Q7v7NmX7%(\ ژb>GbqK,$d qt́_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tـ D *l  2ԁ+HBy7 = c&4 ϟ(+¿sA˙7NFj%(\ ژT1,0 AN`Ks 1^=$ YffK@ ?Sv LlqD 7A7N]9%(\ ژb" ژbqK,$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tx! 
VNYۀMP`P7 B, i ~q )'c 쀀5ʀ꤀2j%(\ ژT1,0 /wm ߀g?zYKs q_v | qژbqK,$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tx;2?˛ DWȁȅ [ŀ y 6=ԂS>ҁ@T 6KH 7NVj%( ژT1,0 II&EYD sl|Y>  < }>ӮT (},(( xu {D  lP57NmX7%(p b>ژbqK,$d qt́_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1; Lo J4 J J4 _ Nw fMqLG- `{tـ D *l  2ԁ+HBy7 = cIj<7` 'ԀA˙7NFj%(\ ژT2-0 AN`Ks 1^=$ Yff SCoC/kAӷ lqD 7A7N]9%(܁ ?b" ژbrZ-$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1; Lo J4 J J4 _ Nw fMqLG- `{tx! VNYۀMP`P7 B, i ~ 6S ¶Tml'c 쀀5ʀ꤀2j%(\ ژT2-0 /wm ߀g?zYKs q_v | qژbrZ-$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1; Lo J4 J J4 _ Nw fMqLG- `{tx;2?˛ DWȁȅ [ŀ y 6=ԂKJ~i-Ȍ ,%́ 7NVj%(5 ^dT2-0 II&EYD sl|Y>  < } /6BOnx0 xu {D  lP57NmX7%(\ab>ژbrZ-$d qt́_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tـ Dm 2Á+HBy7 = cIj<7` 'ԀA˙7NFj%(\ ژT1,0 AN`Ks 1^=$ Yff SCoC/kAӷ lqD 7A7N]9%(܁ ?b" ژbqK,$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{txn'g NYۀMP`P7 B, i ~ 6S ¶Tml'c 쀀5ʀ꤀2j%(\ ژT1,0 /wm ߀g?zYKs q_v | qژbqK,$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tx;p ݁ DWȁȅ [ŀ y 6=ԂKJ~i-Ȍ ,%́ 7NVj%(5 ^dT1,0 II&EYD sl|Y>  < } /6BOnx0 xu {D  lP57NmX7%(\ab>ژbqK,$d qt́_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tـ D M |++HBy7 = cIj<K O}E{߀A˙7NFj%(\ ژT1,0 AN`Ks 1^=$ Yff SC/}5z$lqD 7A7N^9%(\ ژb" ژbqI,$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{txM~NYۀMP`P7 B, i ~`D~@%р-t6'c 쀀5ʀ꤀2j%(\ ژT1,0 /wm ߀g?zYKs q_v | qژbqI,$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tx;uMnˊ DWȁȅ [ŀ y 6=ԂKJ B}c=tgN 7NVj%(/mژT1,0 8I&EYD sl|Y>  < } /6B ys'R xu {D  lP57NnX7%(\ab>ژbqI,$d qt́_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1; Kn J4 J J4 _ Nw fMqLG- `{tـ@~]>V_~ہ~ II# S+(+¿s 7NFj%(\ ژT2-0 ܁[ ` CR>(D uz rH1"x}A u x  yW[D 7A7N^X<g%(GCKb>ژarY-$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1; Kn J4 J J4 _ Nw fMqLG- `{tـ 5ƀ,qÁ+HBy7 = cIZ !K6'  A˙75,J%j%(\ ژT2-0 ! Eg?zYKs q_v | qژbrY-$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1; Kn J4 J J4 _ Nw fMqLG- `{tـ 4_N z b.4lƁ=T;I^ 8^  Q ,-yH€ ~ç'c NVj%(\ ژT2-0 gr>bS'_vw ˕?ˀ}7C+-&~ '` {D Yr97wd.NnX7  '^ ژb؀% ژbrY-$d qt́_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tـ@ KA)]>V_~ہ~ II# S+(+¿s 7NFj%(\ ژT1,0 ܁[ ` CR>(D uz rH1"x}A u x  yW[D 7A7N^X<g%(GCKb>ژaqI,$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tـ 5ƀ,qÁ+HBy7 = cIZ !K6'  A˙75,J%j%(\ ژT1,0 ! Eg?zYKs q_v | qژbqI,$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tـ 4_N z b.4lƁ=T;I^ 8^  Q ,-yH€ ~ç'c NVj%(\ ژT1,0 gr>bS'_vw ˕?ˀ}7C+-&~ '` {D Yr97wd.NnX7  '^ ژb؀% ژbqI,$d qt́_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1ŀ Ԁ ́݀wI2_ Nw fR HiG-  Ӏ Ch Vxt!{xbp { 6-B9 )N΀ cyW w ؏67 ?ɀ Ā2j%(\a`J Հ )7~@I H́[LҁK? ,:K  ]߁ xu <YRyv7m|!]X7%(\ ژژT1   ;vSl[ /M= )iǺ o *Lx Q s 2Jk 7 ˁ1OЀju7Nl1;݀wI2_ Nw fR HiG-  Ӏ Ch Vxtـ lbq ڀ~ } PM<  vZ $̩8mm C? ,> o[ȆBD| ] 7 c2j  ژ`J Հ ) ġ >  EV , )C  eKhr,G0p)R \l<yv7 c2Xd.%(\Qf  ژT1  qt́_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv7N]Xd.%(\Qf  ژT1  qt́_5-) lԃ" 7̅ '  75!! 
>j%(\ ژ`J Հ )q3 %) mCJT N u = 4_x}D',}399K  [X <Gv75꤀2X 7%(\ ژfuaT1  qt́_5-) lԃ" 7̅ ' F( w 7NVj  ژ`H Հ ) ͏ց >  EV , )C  eKhr,GʀK}@-v>a) \l<yv7NmXd.%(\Qf  ژT1  qt́_5-) lԃ" 7̅ '  ؏67 ؀j%(\a`I Հ )  ؏67 ꤀2j%(\a`H Հ ) 2N  сxq %W  An  nxC a  ^ <ҁŀja̴<Vy ژ L$?iF 적 PL!āoNl1{.]J v4v_  0 N" Hąǁ 5۱ zD_̷.]7375̀㏀9jMW uW> r ـ/zH{idy  zwW0Z 3 |ɟ~Z "6Ҁʿ A꤀2jaObl la% m$?iF 적 PL!āo5{.]J v4v_  0 N" Hąǁ ODO Po e567NV BjOb uW>soZH{&DBx({+fPgwހ fl 6c n>- Ul<ҁ׳jaDZ < Igژ 4"^%L$?iF 적 PL!āoN^1{.]J v4v_  0 N" Hąǁ@e!;րT 'h4U8w :j 7 ajn uW>pMNo I yK P xI=z4} C㕁l<ҁĀ'pBׁOb< ہ X4%L$?iF 적 PL!āoNl1{.]J v4v_  0 N" HąǁفBK" l݀F ) xJ$lH675̀ꡀ2jOb uW>  p / ˀ ـ] P r `g | s5ҁ[Ӏ꤀2l<  ژ EWMv%L$?iF 적 PL!āo5{.]J v4v_  0 N" Hąǁ IG >G&m O j n(˗7NVn2jOb uW>!xCOcKMӁ'mAC ] rO  >4  .րj- wv<ҁ׳jaOb<&ژ ؁%L$?iF 적 PL!āoN^1{.]J v4v_  0 N" Hąǁ@e!;րT 'h4 & 'M.7 ajn uW>pMNo I yK P ԁ b ݁ >| rF 5. fSl<ҁĀ'pBׁOb< ہ X4%L$?iF 적 PL!āoNl1{.]J v4v_  0 N" HąǁفBK" lj* FEl675̀ꡀ2jOb uW>  p / ˀ ـ 1 6  $8 | s5ҁ[Ӏ꤀2l<  ژ EWMv%L$?iF 적 PL!āo5{.]J v4v_  0 N" Hąǁ IG >G X x H'M.ҁ˗7NVn2jOb uW>!xCOcKMӁ'mAC ] R.~B= Q wv<ҁ׳jaOb<&ژ ؁%L$?iF 적 PL!āoN^1{.]J v4v_  0 N" Hąǁ f!;րT 'h4 & 'M.7 ajn uW> X MNo I yK P ԁ b ݁ >| rF 5. fSl<ҁĀ'pBׁOb< ہ X4%L$?iF 적 PL!āoNl1{.]J v4v_  0 N" Hąǁـn:{L lj* FEl675̀ꡀ2jOb uW>  p / ˀ ـ 1 6  $8 | s5ҁ[Ӏ꤀2l<  ژ EWMv%L$?iF 적 PL!āo5{.]J v4v_  0 N" Hąǁ IG >G X x H'M.ҁ˗7NVn2jOb uW> ;K& tbB뒀FhBʁ ] R.~B= Q wv<ҁ׳jaOb<&ژ ؁%L$?iF 적 PL!āoN^1{.]J v4v_  0 N" Hąǁ HӀ;րT 'h4 l-]>  q]7 ajn uW><No I yK P ԁ b ݁ >| r#,+h 9l<7N]'pBׁOb< ہ X4%L$?iF 적 PJ!oNl1{.]J v4v_  0 N" Hąǁ2K" ljt sW 5 5G675̀ꡀ2jOb uW> 5}R. t / ˀ ـ%V - + | s 75꤀2l<  ژ EWMt%L$?iF 적 PJ!o5{.]J v4v_  0 N" Hąǁ HG >G"-]> C(˗7NVn2jOb uW> oK& tbB뒀FhBʁ ] R.{сF) O wv<7NnjaOb<&ژ ؁%L$?iF 적 PJ!oN^1{.]J v2v_  0 N" HąŁ0,c 8k +^2>R5j | U5TA؀Jс 8?GNFn2  uW> 2޻ )F 6y=m@*(l m  b ( "87NZjaOb<\a4* ^U|$?iF 저\ 10J!oNl1{.]J v2v_  0 N" HąŁ ?<7NJ 6 ZT 2j kr@#w5Q jOb uW>   / ˀ ـ] *9 {}Ё+`V | s 75꤀2l<  ژ EWMvU|$?iF 저\ 10J!o5{.]J v2v_  0 N" HąŁ݈ C _)O9<z˗7NVn2j}k f uW>xKj1q %W  2 ݁ ɛ 'Ko  ㆁ^ <7Nmja̴<Vy ژ cC|$?iF 저\ 10J!oN^1{.]J v2v_  0 N" HąŁ0,c 8k +^2>R5j | U5TA؀Jс 8?GNFn2  uW> 2޻ )F 6y=m@*(l m  b ( "87NZjaOb<\a4* ^U|$?iF 저\ 10J!oNl1{.]J v2v_  0 N" HąŁ ?<7NJ 6%a KZT 2j kr@#w5Q jOb uW>   / ˀ ـ] *9 {}Ё+`V | s 75꤀2l<  ژ EWMvU|$?iF 저\ 10J!o5{.]J v2v_  0 N" HąŁ݈ C _)O9<z˗7NVn2j}k f uW>xKj1q %W  2 ݁ ɛ 'Ko  ㆁ^ <7Nmja̴<Vy ژ cC|$?iF 저\ 10J!oN^1{.]J., ) I% I I% _ Nw fMqLG- `{tـY g 87B]J( @C}T ƁKJ cہH ʁ 7NFj%(\QT1,0 ?  I=c@?NYۀMP`P7 B, i ~Ȯ=|I}'c 쀀5ʀ꤀2j%(\ ژT1,0 q,k20;~A F)  貁#ǩ/Ity};X vÙAr H? 
1 €7XR75FdX7%(\ ژf>GbqK,$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tـ^[ U `I 6(X w , pGbqK,$d qt́_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tـ^VrT8.P3 NIc =J>P B5 ` 7NFj%(y ژT1,0 AN`Ks 1^=$ YffK@ ?Sv LlqD 7A7N^9%(\ ژf" ژbqK,$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tـ^=F =6f 2]>V_~#~ II# _ ϊ<F€ ā 75ʀ꤀2j%(\ ژT1,0 /wm ߀g?zYKs q_v | qژbqK,$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tIҁ} = R 6n"4 )(NK H}M0} y; ?Gǹ7% ؎77NVjA}T1,0 II&EYD sl|Y>  < }>ӮT (},(( xu {D  lP57NnX7%(p f>ژbqK,$d qt́_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1; Lo J4 J J4 _ Nw fMqLG- `{tـ^VrT8.P3 NIc& &(ªQ 7NFj%(\ ژT2-0 AN`Ks 1^=$ Yff SCoC/kAӷ lqD 7A7N^9%(܁ ?f" ژbrZ-$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1; Lo J4 J J4 _ Nw fMqLG- `{tـ^=F =6f 2]>V_~#~ II#  7 p&C KhIfy(yH 6 75ʀ꤀2j%(\ ژT2-0 /wm ߀g?zYKs q_v | qژbrZ-$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1; Lo J4 J J4 _ Nw fMqLG- `{tIҁ} = R 6n"4 )(NK J * S  Z827% ؎77NVjA}T2-0 II&EYD sl|Y>  < } /6BOnx0 xu {D  lP57NnX7%(\af>ژbrZ-$d qt́_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tـ H3ݠP@s&L>^ @y NIc& &(ªQ 7NFj%(\ ژT1,0 AN`Ks 1^=$ Yff SCoC/kAӷ lqD 7A7N^9%(܁ ?f" ژbqK,$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tـ  g]>V_~#~ II#  7 p&C KhIfy(yH 6 75ʀ꤀2j%(\ ژT1,0 /wm ߀g?zYKs q_v | qژbqK,$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tـ =  = R 6n"4 )(NK J * S  Z827% ؎77NVjA}T1,0 II&EYD sl|Y>  < } /6BOnx0 xu {D  lP57NnX7%(\af>ژbqK,$d qt́_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tـƀ~xs&L>^ @y NIc&ᖀ O=K:܁ 7NFj%( ژT1,0 AN`Ks 1^=$ Yff SC/}5z$lqD 7A7N^9%(\ ژf" ژaqI,$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tـ Tz_F  2]>V_~#~ II#  73 <@ /7؁ 75ʀ꤀2j%(hژT1,0 /wm ߀g?zYKs q_v | qژaqI,$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tـ H += R 6n"4 )(NK J́j' ?H8 tԀ7% ؎77NVjA}T1,0 8I&EYD sl|Y>  < } /6B ys'R xu {D  lQ77NnX7%(\af>ژaqI,$d qt́_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1; Kn J4 J J4 _ Nw fMqLG- `{tـ >n. R 6n"4 )(NK J́) rǁroA7% ؎77NFjA} )܀2-0 ܁[ ` CR>(D uz rH1"x}A u x yW[D 7A7N^X<g%(GCKf>ژbrY-$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1; Kn J4 J J4 _ Nw fMqLG- `{tـ Gj}%+A+HBy7 = cIZ 1 5` ts%A˙75t1j%(; XT2-0 ! Eg?zYKs q_v | qژbrY-$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1; Kn J4 J J4 _ Nw fMqLG- `{tـ D ڀ~J: r} ( Q  Ɓ 1ɀz_r2I Sy{ 3~ |ŀA˘7NVj%(\ژT2-0 gr>bS'_vw ˕?ˀ}7C+-&~ '` {D Yv97wd.NnX7  '^ ژf؀% ژbrY-$d qt́_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tـ >n. R 6n"4 )(NK J́) rǁroA7% ؎77NFjA} )܀1,0 ܁[ ` CR>(D uz rH1"x}A u x yW[D 7A7N^X<g%(GCKf>ژbqI,$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tـ Gj}%+A+HBy7 = cIZ 1 5` ts%A˙75t1j%(; XT1,0 ! Eg?zYKs q_v | qژbqI,$d qt́_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1;) I% I I% _ Nw fMqLG- `{tـ D ڀ~J: r} ( Q  Ɓ 1ɀz_r2I Sy{ 3~ |ŀA˘7NVj%(\ژT1,0 gr>bS'_vw ˕?ˀ}7C+-&~ '` {D Yv97wd.NnX7  '^ ژf؀% ژbqI,$d qt́_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1ŀ Ӏ ́݀wI2_ Nw fR HiG-  Ӏ Ch Vxt!{xbp { 6-B9 )N΀ cyW w ؏67 ?ɀ Ā2j%(\a`J Հ )7~@I H́[LҁK? ,:K  ]߁ xu <YRyv7m|!]X7%(\ ژژT1   ;vSl[ /M= )iǺ o *Lx Q s 2Jk 7 ˁ1OЀju7Nl1;݀wI2_ Nw fR HiG-  Ӏ Ch Vxtـ lbq ڀ~ } PM<  vZ $̩8mm C? 
,> o[ȆBD| ] 7 c2j  ژ`J Հ ) ġ >  EV , )C  eKhr,G0p)R \l<yv7 c2Xd.%(\Qf  ژT1  qt́_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv7N]Xd.%(\Qf  ژT1  qt́_5-) lԃ" 7̅ '  75!! >j%(\ ژ`J Հ )q3 %) mCJT N u = 4_x}D',}399K  [X <Gv75꤀2X 7%(\ ژfuaT1  qt́_5-) lԃ" 7̅ '  7NVj  ژ`H Հ ) ͏ց >  EV , )C  eKhr,GʀK}@-v>a \l<yv7NmXd.%(\Qf  ژT1  qt́_5-) lԃ" 7̅ '  ؏67 ؀j%(\a`I Հ )  ؏67 ꤀2j%(\a`H Հ ) .׀oq K'A9?I"77vd.mJ,(X7܁ژf>ژ   ;vSl[ /M= )iǺ o *Lx Q s 0Jk 7 ˁ1OЀju7Nl1;݀wI2_ Nw fR HiG-  Ӏ Ch Vxtـ >n~NYۀMP`P7 N i ~}>|'c jnj%(\ ژ+ Հ )! l4G B aF)I&0{w [g  +P A߀ ~ ? 4" Z? <7v<g c2X7%('ژf>ژ  qt́_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv7N]Xd.%(\Qf  ژ  qt́_5-) lԃ" 7̅ ' r  )@b$  75tQj%(\ ژ+ Հ ) < o ) mCJT N u = 4_x}D',}399K  [X<Gv75FdX 7%(\ ژfua  qt́_5-) lԃ" 7̅ '   EV , )C  eKhr,GYQŗ2mC? \l<yv7NmXd.%(\Qf  ژnZ  qt́_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv7N]Xd.%(\Qf  ژ  qt́_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv75꤀2Xd.%(\Qf  ژ  qt́_5-) lԃ" 7̅ ' .׀oq K'A9?I"77vd.mJ,(X7܁ژf>ژ   ;vSl[ /M= )iǺ o *Lx Q s 0Jk 7 ˁ1OЀju7Nl1;݀wI2_ Nw fR HiG-  Ӏ Ch Vxtـ >n~NYۀMP`P7 N i ~}>|'c jnj%(\ ژ+ Հ )! l4G B aF)I&0{w [g  +P A߀ ~ ? 4" Z? <7v<g c2X7%('ژf>ژ  qt́_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv7N]Xd.%(\Qf  ژ  qt́_5-) lԃ" 7̅ ' r  )@b$   EV , )C  eKhr,Gł   \l<yv7NmXd.%(\Qf  ژ  qt́_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv7N]Xd.%(\Qf  ژ  qt́_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv75꤀2Xd.%(\Qf  ژ  qt́_5-) lԃ" 7̅ '  o[ȆBD| ] 7 c2j  ژ+ Հ )&R~>  EV , )C  eKhr,G0p)R \l<yv7 c2Xd.%(\Qf  ژ  qt́_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv7N]Xd.%(\Qf  ژ  qt́_5-) lԃ" 7̅ ' r  )@b$   EV , )C  eKhr,GH@b > \l%yv7NmXd.%(\Qf  ژ  qt́_5-) lԃ" 7̅ '  o[ȆBD| ] 7 c2j  ژ+ Հ )&R~>  EV , )C  eKhr,G0p)R \l<yv7 c2Xd.%(\Qf  ژ  qt́_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv7N]Xd.%(\Qf  ژ  qt́_5-) lԃ" 7̅ ' r  )@b$   EV , )C  eKhr,GH@b >ՁU \l%yv7NmXd.%(\Qf  ژ  qt́_5-) lԃ" 7̅ '  o[ȆBD| ] 7 c2j  ژ+ Հ )&R~>  EV , )C  eKhr,G0p)R \l<yv7 c2Xd.%(\Qf  ژ  qt́_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv7N]Xd.%(\Qf  ژ  qt́_5-) lԃ" 7̅ ' r  )@b$   EV , )C  eKhr,GH@b >ՁU \l<yv7NmXd.%(\Qf  ژ  qt́_5-) lԃ" 7̅ ' %" H{ 8 Vxtـ NNMK|Q#(>VBhBMLJ д8n u L= @ ^'c  _3 Ā2j%(\ ڙ*v_ >%" H{ 8 Vxtـ  + )@b$ %" H{ 8 Vxtـ  + )@b$  7NFj%(\ ژ*v_ >%" H{ 8 Vxtـ6.u(n 8YB%" H{ 8 Vxtـ   + )@b$ 3+ vD vcIOUgW+ 0H >*mOπX7mf.w7!7l , Wwiځ _8 J[ ' }"*dm 2̀a  km ~m ~m 2 ˑQZ ~. ~. ; =( HcHĀ # G ~R? HfVG-^$B ;4 > dqKsܪ(9  vc.H2jMP(؀v:}؁H $Ƙ*s̀ P9X7m7w7qRy.(L7wH  i_8 KS&}5Z*dƀ1 2ʀ^  km 2̀ 2̀ 2 ~bQZ ~' ; 7 HcHĀ # G ~R? HfVG-^$B ;4 >  C'DD  [,N4)RǁfED Rx/w?&(K *}X7mYRyw7!7m7uH >_8jP}"*dp( Ԝ (pp/ 'bIh2 p =( HcHĀ # G ~R? HfVG-^$B ;4 > `[  7<2LȅI] D  zT [,N4)RǁfED 3M?'c~/H $ƚ* R㑀9X7mn\uw7Ɂ2 C DuH >_8=^K  :`*ƀOiem O m1Ļ >1e>1 S/e<b+< =!T ;( HcHĀ # G ~R@? HfTG-^$BE ;4 > XrA0͇T [,N4)RǁfED 3M?'c~/H $ƚ*mX7mn\uw7Ɂ2 C DuH >_8=^K  }"*ƀOiem^  & €& À& À& À& À& 6>ڀ 6>ڀ À& €& À& €& À& €& À& €& Łι 0e8[  z À& À& *u2j  z 6>ڀ À& À& 8<u8<uÀ&#&#*8[À&À&À&8<u8<u€& fZ|mfmf<r Hl U/<bQ\ U/<2Vx;ù&# &̀&Łoǀu| 7 HcHĀ # G ~R? HfVG-^$B ;4 > `J В 6<2 0B@QB u?X\$xxk' 0H >*}X<gm7w7!7m*2wH <_8 J[ .Y y-~*dm 'ր  m1 ~m ~m ~0 ~. ~. ; ;( HcHĀ # G ~R? HfVG-^$BE ;4 > ` 6LHH> ,=O|V@Q Pf^P(`$xxk' 1H $ƚ* R㑀9X7m7w7 ,8 K C DwH <_8 J[ .Y }"*ƀOicЀ  O m1 td1 td1 tb. tbbb =|( ́ 7 HcHĀ # G ~\8 HfVi-%. ;4 >E T R m] ` c>~(6H >* RQ X7l7wn\uGm7uH  _8 J[ ŀs܀5Z*dȀ1 2̀^  kp Hj ~1 ~1 ~. ~2 l ~bQZ ; ;( HcHĀ # G ~\8 HfVi-%,E ;4 >EMT R m] ` '~(6H >*mX7l7wn\uGm7uH  _8 J[ ŀs܀5Z*dȀ1 2̀^  kp Hj ~1 ~1 ~. ~2 l ~bQZ  ́e HcHĀ #Q Z W ]VMjLgF-A Cf~# C : ׀ /*π|? 5B yC v2| K =݁KKK@1 րB̃ nÁ 6"k*(ƀZcefs-<}dXm7w7 ް j& U WM&kOb  R ,_De5e HcHĀ #Q Z W ]VMjLgF-A Cf~# C : ׀ /*π|? 
5B yC v2| K =݁KKK@1 +B̃ nÁ 6"k*(ƀZcefs-< R,J%dXm7w7 ް j& U WM&kOb  R ,_De6e HcHĀ #Q Z W ]ހVMjLgF-A Cf~# C : ׀ /*π|? 5B yC v2| K = {0@qԖ I l PuJzIӷ{ Ckʀcefs-<mdXm7w7 ް j& U WM&kOb  JtH<N x  OeOAqBZ Ub2l9  OeOӀ G44z  OeOӀ G44l  OeO6ì꽁V߀ꡀ2  OeOӀ G4?O{v>sUzHcHĀ NLK xKM qǀHƀ-EI 63͙ 2ۀ%vL$Bs Cs"gWg` ǜ "/>s"gWg` ǜ8O{v>sUzHcHĀ NLK xKM qǀH & I 63͙ 2ۀ%vL$ g , 4?ā_~9j̴&F7€ ; %) CEsMԀB<N5cɁ ǜ/>s"gWg` ǜ "/>s"gWg` ǜ8O{v>sUf>sUzHcHĀ NLK xKM qǀH 9H+ 0dE @$ C /Ab_odjOb ڙ8<vqsj  $I <ICFs !Ӂol E<N5cɁ ǜ/>s"gWg` ǜ "/>s"gWg` ǜ IʁGv>sUzHcHĀ NLK xKM qǀH +́ i-(_ f o, ɱy @Ȇ \W Tj= jK7k_2!2jOb ڙ8<v߀;j 5AFsXk N5cɁ ǜ/>s"gWg` ǜ #/>s"gWg` ǜ8O{v>sUzHcHĀ NLK xKM qǀHp}hm nf o, ɱy @ :ǁƕ [(` >  C/_'ĀJ%!'ĀJ%j΍zڙ *}j  <Is  k N5cɁ ǜ/>s"gWg` ǜ #/>s"gWg` ǜ I΁Gv>sUf>sUzHcHĀ NLK xKM qǀH3 祀6* 9 1Tn_1}l9~t}Ud@ TuÀ -u w2!2j1 U q "*}h -N; <ICEsMҀk N5cɁ ǜ/>s"gWg` ǜ #/>s"gWg` ǜ IЁ,M:1 bŀ ce bŀ_HcHĀ NLLwwwwwww &Mj I BD=DрI+Lz 6ni 7t8ȝV0ț ́ + & jQ`$}h a a a a a a a a|. Ӡ !u Ӟ ί AՀ{ڜt LtJB-6_s 0LtJB,فe,M:1 bŀ ce bŀ_HcHĀ NLLwwwwwww &Mj I ̀yGn Ċ !ymÛ Wl v1 O ~RO7Ł7+_2XO ڙ QeW_h K˷O ښ Q˷O ښ Q˷O ښtO ښ Q˷O ژ Q˷O ژ Q˷O ژ Q˷O ژ4" Ӡ  u <  od <́ AՀ{ڜt LtJB-6_s 0LtJB,ف슀\M:1 bŀ ce bŀ_HcHĀ NLLwwwwwww €Mj I ΀k` /I 5 Ċ !ymÛ Wl v1 OL 3xA M7+_2Xڙ q "*}h KYښ QYښ QYښ [ښ QYژ QYژ QYژ QYژAμ3 8  ;  <́ 1|μ , AՀ{ڜt LtJB-6_s 0LtJB,ف*v$_ %e8 MjLlX IŤN OY [ 8mlvƀk W`ԃ  l in8Y 8;ImR؁z=&=ҁ.ߛ77NZ1 ́ Ѐ 2 l  9u %7we.11 `  U c3 "2ҁC    'Ā.Y ӟv$_ %e8 MjLlX IŤN OY [-1mlvƀk W`ԃ  l in8Y 8;ImR؁z=&=ҁ.ߛ775꤀21 ́ Ѐ 2 l  9u %7we.11 `  U c3 "2ҁC    'Ā.Y ӟv$_ %e8 MjLlX IŤN OY [iVemlvƀk W`ԃ  l in8Y 8;ImR؁z=&=ҁ.ā&qZ77Nk1 ́ Ѐ 2 l  9u %7we.Nl1 `  U c3B mSC JC}A 4 ӟ  x$_ %e8 MjLlX IŤN OY [HrYmlvƀk W`ԃ  l in8Y 8;ImR؁z=&=ҁ.ߛ77NZ1 ́ Ѐ 2 l  9u %7we.11 `  U c3 "2ҁC    'Ā.Y ӟx$_ %e8 MjLlX IŤN OY [-1mlvƀk W`ԃ  l in8Y 8;ImR؁z=&=ҁ.ߛ775꤀21 ́ Ѐ 2 l  9u %7we.11 `  U c3 "2ҁC    'Ā.Y ӟ  ́v$_ %e8 MjLlX IŤN OY [ Q  >E n? ԀQ@<@0z(r/ tPkp@OOu 쀀5㑀91 ́ۀ(>7Nl1 9ul v7Nl1 `  U c3}@hD Dԁ JC}@    v$_ %e8 MjLlX IŤN OY [ Q  >E n? ԀQ@<@0z(r/ tPkp@OOu Nk1 ́ۀ(>7Nl1 9ul v7Nl1 `  U c3}@hD Dԁ JC}@    X z OeeBu H/ K;WXv9W]x^v~. z Oe5h-u9v 'FK0сAHҀ֔R]x^X z OeeBu9 'FK0р܀7U k]xQX z OeeBu9 'FK0р܀7$vtP k]xQX z OeeBu9v 'FK0сR5 k]x^X z OeeBu9 'FK0р܀7U k]xQX z OeeBu9 'FK0р܀7$vtP k]xbX z OeeBuYЎB/w* ;I ̡ H]x^X z OeeBuYЎB/w* ;IH H]xbĀ #Pr({\"CY"9PPr ݁'X̺'" #Pr% d0 . },'" #Pr/ Ǥd0 . },'" #Pr ݁<>];~un G'" #Pr{ ݁ . },'" #Pr{ ݁ . },'4 #PrZYkTa x") #PrZYkTa x4 HKL8%%C{Fx nJX{n+ 5@s+9] f+a R N R.o=42O wu ;58U[#Ā I Sʀ 8 Z *u=q4FxǁB3~=03W/q%N R N RG!n&Sɀ\;p"Sm!n& Y % > 42O7wuOC ;58R+#k  ՀG8:0cr2$- E 4FzsFK2$j!"~΁hWLـ̝:= j q q?oG!p Ч P@5k!pV PN oπ A 423=|;5Qx A ʁFp GHcHĀ LÀ  tـ T l |po9z8*yOl nBu u @7Nbc OL"cj N| jFp>M GHcHĀ LÀ  tـ t ہ ,> Hځ ,wlMāo r+\PNZQM( tq7Nb1Āل=0o k LZ GHcHĀ LÀ  tـ  ˁ ,> Hځ ,wlMāo rB%M7Nb1Ā~Ԁn  0oN^Fp GHcHĀ LÀ  tـ  ˁ ,> Hځ ,wlMāo rB%M7Nb1Ā<  0oۀ5qFn GHcHĀ LÀ  tـ t ہ ,> Hځ ,wlMāo rڀZ 6JA47Nb1Ā~n  0oNn KZ GHcHĀ LÀ  tـ  ˁ ,> Hځ ,wlMāo rB%M7Nb1Ā~Ԁn  0oN^Fn GHcHĀ LÀ  tـ  ˁ ,> Hځ ,wlMāo rB%M7Nb1Ā<  0oۀ5q ʁ KZ GHcHĀ LÀ  tـ  -@5sD u3OvldX RV' 257Nb1ŀ<  0oۀ5! Fn GHcHĀ LÀ  tـ  -@5sD u3OvldX RV' 257Nb1ŀ~n  0oNn ́FpM:c G癀 LZ |j݀j,fb*ӽF v7(Fp-7NV13& pJ%1À 79J 3]@ ZMym l| [ } d(R d, <J7Nn1 6 Ā2 2 UpFpM:c G癀 LZ IۀA6{^g jJ rr݉ NV$ ? > 5 U = :J u,3  48 e(߀  IZ ~ _l7Nn1 4 Ā2 51b &ҁ Lc G癀 LZ IۀA6{^g jJ48$4 NV$ ? >ڀn2 U = :J u,3  48 e(߀ ʀꡀ2 U = :J u,3  48 e(߀ n2 U = :H 셀,3  48 e(߀ d' _l7Nn1 4 Ā2n21b &ҁ Kc G癀 LZ IۀA6{^g jJ48$4 NV$ ? >ڀn2 U = :I u,3  48 e(߀ ʀꡀ2 U = :H u,3  48 e(߀ _xOV xNV1 6 Ā2tP1ŀ&I ԁ . 3 m  㨁+ M_ `=@. 7Nn1 4 Ā2o1;FnM:c G癀 LZ B;; |Hia6E (} 6>_xOV xNV1 6 Ā2΀1ŀ&H ԁ . 3 m  㨁+ M_ `=@. 
7Nn1 4 Ā21ŀ ߀ڤM:I2_ L HGŠ w9`Ȧ Deh %O ݀? ];gsț =z Vv k: M7%%%%ZJ [> y `_ <$؀j0U }7+FM:I2_ L HGŠ w   k%Z` ˤ ̼I8qlR+> 5G 3YCR 7V \X7%%%%ZJ [>qnS6LQ7RCL+ NI2_ L HGŠ w7%Z` ˤ ̼I8qlR+> 5G 3CPC€=7V \X7%%%%ZJ [>qnSnQ7NlŮ s5CL+FM:I2_ L HGŠ w0N k%Z` ˤ ̼I8qlR+> 5G 3CPC€=7V N kX7%%%%ZJ [>qnSnQ75 ؁CL,FnM:I2_ L HGŠ w   k%Z` ˤ ̼I8qlR+> 5G 3_ 7V \X7%%%%ZH [>qnSU)Q7N^Ů s5CL+ KI2_ L HGŠ w7%Z` ˤ ̼I8qlR+> 5G 3CPC€=7V \X7%%%%ZI [>qnSnQ7NlŮ s5CL+FM:I2_ L HGŠ w7%Z` ˤ ̼I8qlR+> 5G 3CPC€=7V N kX7%%%%ZH [>qnSnQ75 ؁CLd ́ KI2_ L HGŠ w  ۽ HG  N"!t tJ %BL7UOπʀ㏀9X7%%%%ZI [>! N}*#v !.5] }7+FM:I2_ L HGŠ w  ۽ HG  N"!t tJ %BL7UOπg9X7%%%%ZH [>! N}*#v !.N^c5 }7dup͆0̀ ZPA = w!eK{}V Qj;b 1ŀ   Ӟ p͆0̀ Z  v6l΁ |A Ȕ 5>`~aqj< 岁1GY   t0̀ Z 1[. v6l΁ |A Ȕl 9} A Ẁ1GY  odu Ӡ tp͆0̀ ZB/# uQa*l΁ |A Ȕl 9} A Ẁ1GY  odu Ӡ sp͆0̀ Z  v6l΁ |A Ȕl  C =ʁ ep1GY  odu Ӡ 0̀ Z  v6l΁ |A Ȕl 9} A Ẁ1GY  odu Ӡ tp͆0̀ Z  v6l΁ |A Ȕl 9} A Ẁ1GY  odu Ӡ u0΀ ZNۋ i p /k o=* v oK j8],NE29 wG\ A (A([O7 ȔBI  n K@ # hf R=!29h wG5\ A (A([O7K2ā  s089:,9HI]đ #  'À.Y Ӡ NE29h wG5\ A (A([O7K2ā  s089:,9HI]đ #  'À.Y Ӡ NE09 wG5\ A (A([O7K2ā n ɷ 9 6YKuAe #  'À.Y Ӡ e!19h wG5\ A (A([O7K2ā  s089:,9HI]đ #  'À.Y Ӡ NE09h wG5\ A (A([O7K2ā  s089:,9HI]đ #  'À.Y Ӡ !19Ȁ 2Hc& !/ mB%b g O#I *~{ŁA Dh . NE09Ȁ 2Hc& !/ mB%b g O#I *~{ŁA Dh .׀ NE2B}fƀ[ Akb@KX m @+ DS y&y uÁ4" Ӟ ~ ],NE2B}Ǥx́D x A (A([O79k[<Q = 7vxwy"| Ӡ ݀׀ !2B}DŽx́f x A (A([O7K RBI Ђ! "| Ӟh .׀ NE2B}DŽx́f x A (A([O7K RBI Ђ! "| Ӟh .׀ &'NE0B}Ǥx́f x A (A([O7K ># J+ Q =N"| Ӟh .׀ e!1B}DŽx́f x A (A([O7K RBI Ђ! "| Ӟh .׀ NE0B}DŽx́f x A (A([O7K RBI Ђ! "| Ӟh .׀ !1B}VrRDq !/ m7HB> h.Y Ӟ eNE0B}Dq !/ m7HB> h.Y Ӟ ϽM B|4VBwtC {(| l ZFFiHǩy  4 ǩ3׀ϽM B|Ok? ts ց `ˤ o; 3#u #  3 Ӡ}#uX M B|Nk? tmLi uw" m ~yBeyvx> "ڀ #  ~ ~>܀+׀ϽM B|Nk? tmLi uw" m ~yBeyvx> "ڀ #  ~ ~>܀+ـϽM B|Ok? tmLi u*9< "ڀ #  ~ ~>܀+ـϽM B|Nk? tmLi uw" m ~yBeyvx> "ڀ #  ~ ~>܀EX M B|r9À=Ɂ onX )mj 5+|fi"| 9& ݀׀ q2ـϽM B|r9À=Ɂ onX )mj 5+|fi"| 9& ݀׀ qL{ ځ; OY_  0N€ Nw HMjM ą66W<@ / >J WHUL k'NFn2jq ݦ ~91t$W> љWuWmMMo5J9V>b| ݀?րl WN]jaOb<y{ ښ%kySr Ytd 5 `,-:  Pڜ趁{Nl1{ .рJ;^dPoPPo_  0Nŀ Nw HMjM ą66W< iǓ ā \ {I. YSұ{'πNFV Aց E?mՀ?m92V$W>E_ q* (  s ܫ2Lg.J(Ba (,ژ N]jaOd<8<u ښ%kym ymr iVd 5 `,-:  Pڜ趁{Nl1{ .рJ; OY_  0N€ Nw HMjM ą66W<W8 pɁw7j ु K) K;   )n o075! >j1G ~91t$W>vBc}ā.DŽ7ũs )) w v_ژYv< '5꤀2.U<e.Ya%? ́' Ytd 5 `,-:  Pڜ趁{5{ .рJ;^dPoPPo_  0Nŀ Nw HMjM ą66W< l8 D]R  l܀рQ#H?΀*5ʀꡀ2#oFObEsZsB)2V$W>  7\Gjj? u ܮD sa'ҲM m  F,2Bژ75 jazD<2Cښ -em%k `  ` r iVd 5 `,-:  Pڜ趁{5{ .рJ; OY_  0N€ Nw HMjM ą66W<W+ H~//9 I U( Y Z o n2j!.\ ~91t$W>?"7 tƁ t؁/ m, sa َ ,ژYe Nn<&ژ% .Tr Ytd 5 `,-:  Pڜ趁{N^1{ .рJ;^dPoPPo_  0Nŀ Nw HMjM ą66W< \8 Z + xYV2 HX Vz;>E?mՀ?m92V$W>C ؁p/Y>bj{ >|a cځ Mژ Ml TNnjaOd<8uQ%kym ym sR iVd 5 `,-:  Pڜ趁{N^1{ .рJ; OY_  0N€ Nw HMjM ą66W<@ / >J WHUL k'NFn2jq ݦ ~91t$W> љWuWmMMo5J9V>b| ݀?րl WN]jaOb<y{ ښ%kySr Ytd 5 `,-:  Pڜ趁{Nl1{ .рJ;^dPoPPo_  0Nŀ Nw HMjM ą66W< iǓ ā \ {I. YSұ{'πNFV Aց E?mՀ?m92V$W>E_ q* (  s ܫ2Lg.J(Ba (,ژ N]jaOd<8<u ښ%kym ymr iVd 5 `,-:  Pڜ趁{Nl1{ .рJ; OY_  0N€ Nw HMjM ą66W<W8 pɁw7j ु K) K;   )n o075! >j1G ~91t$W>vBc}ā.DŽ7ũs )) w v_ژYv< '5꤀2.U<e.Ya%? ́' Ytd 5 `,-:  Pڜ趁{5{ .рJ;^dPoPPo_  0Nŀ Nw HMjM ą66W< l8 D]R  l܀рQ#H?΀*5ʀꡀ2#oFObEsZsB)2V$W>  7\Gjj? u ܮD sa'ҲM m  F,2Bژ75 jazD<2Cښ -em%k `  ` r iVd 5 `,-:  Pڜ趁{5{ .рJ; OY_  0N€ Nw HMjM ą66W<W+ H~//9 I U( Y Z o n2j!.\ ~91t$W>?"7 tƁ t؁/ m, sa َ ,ژYe Nn<&ژ% .Tr Ytd 5 `,-:  Pڜ趁{N^1{ .рJ;^dPoPPo_  0Nŀ Nw HMjM ą66W< \8 Z + xYV2 HX Vz;>E?mՀ?m92V$W>C ؁p/Y>bj{ >|a cځ Mژ Ml TNnjaOd<8uQ%kym ym sR iVd 5 `,-:  Pڜ趁{N^1{ .рJ ;f: !_ N€ fMqL Jx `Š t Mw drI u]>V_ F Rn.K x>K y[<]|s 7NFj%(\ ژ' H Psρs``  7 pb ׁ r 0%= , AiK@́I Y ƀ. 
lP4C`2Á~ H| c} njT7v7N^X7%(\ ژfu ژ'^J:sρs O, qĀ:.'̀Ё u?]P]1 ׁ)h 3 vyBaB~ ށ 1n -  {j $% <gN^j Cʀ D11< KEDghMg !_ Nŀ fMqL Jx `Š V Mw7} NN 8uB 6/ ׀ >L {rED K .17F݁ N=D  &AKÁbC1 t u . e nlqjC 7,275㑀9X7%(\ ژf>G'^J:sρs O, qĀ:.'̀Ё u?]P]1 ׁ)h 3 vyBaB~ ށ 1n -  {j $% <g5tOj Cʀ D11< KEDghMg !_ Nŀ fMqL Jx `Š V Mw ѩfe R. ȁ  7 ,@Ac-tq ؎775qj%(\ ژkY qY sρs`` B =f r 0%= , AiK@́I Y ƀ.3mCl  L c} njT7v75㑀9X7%(y ژfu ژ' S !f  S !fcC~sρs O, qĀ:.'̀Ё u?]P]1 ׁ)h 3 vyBaB~ ށ 1n -  {j $% <g5tOj Cʀ D11< ;f: !_ N€ fMqL Jx `Š t MwKՀ~ 6(J@P NIc&p39ŁHn 7NV6%(\ ژ'q mPsρs`` A/ wMt r3˔ Z =< 0{Aˬ||Y˖:[KÈ:ȁ'/-wڀlqja7v7NnX7%(\ ژf AP'TcC:sρs O, qĀ:.'̀Ё u?]P]1 ׁ)h 3 vyBaB~ ށ 6 /ҁ  {j $% <gNnj Cʀ D11< KEDghMg !_ Nŀ fMqL Jx `Š V Mw $z s GހM 6  >DA@.IoL =[:7 }p  Z@ 7  %("r q'q m q m sρs``  ys̜ER -? pDw Vp+ l~ nB HӁ}-V_ F Rn.K x>K y[<]|s 7NFj%(\ ژ' H Psρs``  7 pb ׁ r 0%= , AiK@́I Y ƀ. lP4C`2Á~ H| c} njT7v7N^X7%(\ ژfu ژ'^J:sρs O, qĀ:.'̀Ё u?]P]1 ׁ)h 3 vyBaB~ ށ 1n -  {j $% <gN^j Cʀ D11< KEDghMg !_ Nŀ fMqL Jx `Š V Mw7} NN 8uB 6/ ׀ >L {rED K .17F݁ N=D  &AKÁbC1 t u . e nlqjC 7,275㑀9X7%(\ ژf>G'^J:sρs O, qĀ:.'̀Ё u?]P]1 ׁ)h 3 vyBaB~ ށ 1n -  {j $% <g5tOj Cʀ D11< KEDghMg !_ Nŀ fMqL Jx `Š V Mw ѩfe R. ȁ  7 ,@Ac-tq ؎775qj%(\ ژkY qY sρs`` B =f r 0%= , AiK@́I Y ƀ.3mCl  L c} njT7v75㑀9X7%(y ژfu ژ' S !f  S !fcC~sρs O, qĀ:.'̀Ё u?]P]1 ׁ)h 3 vyBaB~ ށ 1n -  {j $% <g5tOj Cʀ D11< ;f: !_ N€ fMqL Jx `Š t MwKՀ~ 6(J@P NIc&p39ŁHn 7NV6%(\ ژ'q mPsρs`` A/ wMt r3˔ Z =< 0{Aˬ||Y˖:[KÈ:ȁ'/-wڀlqja7v7NnX7%(\ ژf AP'TcC:sρs O, qĀ:.'̀Ё u?]P]1 ׁ)h 3 vyBaB~ ށ 6 /ҁ  {j $% <gNnj Cʀ D11< KEDghMg !_ Nŀ fMqL Jx `Š V Mw $z s GހM 6  >DA@.IoL =[:7 }p  Z@ 7  %("r q'q m q m sρs``  ys̜ER -? pDw Vp+ l~ nB HӁ}-ˀ `/8%8ij%iƬŀn u  y  Ԁe0-tt 4Gim0P P( WO(reS"+kӛpӛp/imt`5 Š˸d>ˀ `*X*[ڀ\X[Ƭ€n  } 2 =Gp|rdj Ś[ڀ8'0P` Pi~saa T" lT+k0,Sށ{mxdyP*X*[ڀ\X[Ƭŀn  } f ʧ rE3`:ހUtUts[ڀ8'0 K ~saa\ \+k0,<0,<{mxdyPbʀ €nx H׀7;Q"lgր}tƊ?tC@Ҁ16i.$!%+i9zk<0V:t р EvP ƀt$n bʀ^dPoPPoŀn rρLz hszdc#~VƊvā$6 d6-2V%+i9zk<]<];WJV р 1x<ƀt$n 8%8ij%iƬ€n u E /9KNz ݬ im0P P( W D6~w.eS"+kӛ Uimt`5 Š˸d>ˀ `8%8ij%iƬŀn u  y  Ԁe0-tt 4Gim0P P( WO(reS"+kӛpӛp/imt`5 Š˸d>ˀ `/*X*[ڀ\X[Ƭ€n  } 2 =Gp|rdi Ś[ڀ8'0P` Pi~saa T"e +k0,Sށ{mxdyP*X*[ڀ\X[Ƭŀn  } f ʧ rE3`:ހUtUts[ڀ8'0 K ~saa\L!? +k0,<0,<{mxdyPbʀ €nx H׀7;Q"lHtƊ?tC@Ҁ16A\ ځ%+i9zk< t р EvP ƀt$n~k&^dPoPPoŀn rρLz hszdXVƊvā$6 d6- Ls-.%+i9zk<]<] kj *V р 1x<ƀt$n*FR J=€e8Q Ią > p45$  iS ^Ny f:Y NW> *1)6 =πU ' l -i T=|7wH >K(䤁 fk:0J@>Mp\|,m KjgR J=€e8Q Ią 9vS57b iS ^Ny ydAY NW> c|=Y1)6 =πU ' lZHak T=|7wH ><& fk DCNb rx= ڜ+9FR J=€e8Q Ią > p4\h HK ˁ iS ^Ny+ ˓̀΀Y NW> *1)6 :i ! ؏ /5=|7wH >+1 %k:0J@bpOG; KjgR J=€e8Q Ią 9vS\h =7V ˁ iS ^Ny+ ˓HԴydAY NW> c|=Y1)6 :i z5F=|7wH >+Ռ%k DCNb/rx= ڜ+9v!Nw/_ N f Ġ ? ]ހVhNƹ%^(nN`w? .R =o u]>V_ F Rn.ۀ ߗ_Ob !] 7}9w%  ژ;1ÀT@0$ B`` q7Q.ERx(Mb *V[D   uT<5 51 b}9 tʴK~܀ ? }h(-] $7 ( !7mOπX7wn\  ژ&ژ;1À,|uQC ӞB O, /$H)h 63ȇE o 2 r tK =ˁ  d wuQou $% 7n >_z%~A 7Nl1ŀ~ xv!Nw/_ N f Ġ ? ]ހVhNƹ%^(nN`wv<: # I "h AEEe@.'ܟ, =Jځ?Ob Ժ 7 ؀w%wd.Y ژplO m@0$ B`` ? v#dM}р&E = /  JnUCOmN 35lI;j(-Ob  $7!7N]gy v&ژ8<u ژ;mqYC B O, ,W_ M '̀Ё uCafY o+ ɗ 2lY pف  $% <g ؀n2w%~7Nl1ŀ~p;v!Nw/_ N f 1 f 1 ? ]ހVhNƹ%^(nN`w? .R =o u]>V_ F Rn.KT7Ob !] 7}9w%  ژ=1ÀT@0$ B`` q7Q.ERx(Mb *V[D   uT<5 51 b}9 t `Ԃހ? }h(-] $7 ( !7mOπX7wn\  ژ&ژ=1À,|uQC ӞB O, /$H)h 63ȇE o 2 r tI qB wuQou $% 7n >_z%~A 7Nl1ŀ21;v!Nw/_ N f 1 f 1 ? ]ހVhNƹ%^(nN`wv<: # I "h AEEe@.'ܟ΀ ̀˦Ob Ժ 7 ؀w%wd.Y ژ=1ŀ m@0$ B`` ? 
v#dM}р&E = /  JnUCG"  3 xlI;j(-Ob  $7!7N]gy v&ژ8<u ژ=mqYC B O, ,W_ M '̀Ё uCafY o+ ɗ 2lYL  $% <g ؀n2w%~7Nl1ŀ2m;TЀ[\_  ED %MjM ą X^5; +;πq  xn۝臁)Xn\u5,J%j~n  IH1/J f *#uh*ȁ Iy yy0~yہ+f 75tO  <&ژ&11k0Y1 :&P P'D趁{5{ .рJ;TЀ[\_  ED %MjM ą ìq;ZTā   Ӂ Cͫ#w5pj~oA 9 pހ&IH1/9рrh Ł@[N o@+qcV̀<#ʁ67 [c C l<8<u ژ Dh m&1cO0Y1 :&P P'D趁{5{ .рJ;TЀ[\_  ED %MjM ą ìqLZTāǪ 8 6S dk#w5pj~oA IH1/9рrW Ł@[N o@ ʂ R  67 6K67 [c C l<8<u ژ Dh m&`+O0Y1 :&P P'D趁{5{ .рJ) I% I I% _  ED %MjM ąπW< ìqLZTāݞ΁Z 6S dk#w5pj~oA J,9рrW Ł@[N o@ ʂ Ro 67 6K67 [c C l<8<u ژ Dh m&`+O [,C }׀:&P P'D趁{5{ .рJ) I% I I% _  ED %MjM ąπW< ì^ׁh,S,ǀ |N Cƀ!؈i5pj~oA H,9рr  w  zw x s ;Y 6 0rE C 쀀5 l< ˼ |\NlcO Y,rax8*czb P*D )ց5{:jJ Kn1߀2k1߀?_  ED %MjM ą W< ìqLZTāǪ 8 6S dk#w5pj~oA  wW>9рrW Ł@[N o@ ʂ R  67 6K67 [c C l<8<u ژ Dh m&`+O i F }׀:&P C'D 趁{5{ .рJC? k1 k k1?_  ED %MjM ąπW< ìqLZTāǪ 8 6S dk#w5pj~oA \/9рrW Ł@[N o@ ʂ R  67 6K67 [c C l<8<u ژ Dh m&`+O Ǻ 똀 }׀:&P P'D 趁{5{ .рJ.7 ΁ K*)&(_  ED %3 Ią FwW< s nn /69 l$ M-nm9< '5ʀꡀ2j|8\13Ze  w? ^k   MJmB oC!Wg݁ 7;eN rCm3-YX0$! XQŁ.Z|z )rM 75꤀2jaOcl l ژd:m .9T  ژ'9TbT1΁fc}`X 't+5{ .ԀJ KWXW_  1wcIM J& HÑ ' Y L  WāQ 0?y5Q j}Obڀt 4s3ژ'&q ژ d љwI~ "  I]}(&75㑀9jaDZ < Igژ EW 8 7' .qښ'q  ښT2΁fc}`X 't+5{ .ԀJ;q_ N %:L J I{1w rEl4NYۀMӊa B, i ~ ՀIfIdq6'c 쀀5ʀ㑀9j%(\ ژ`IH Ɇ zw /dװӸG3E#@-I̩r<' =#v$ < @  0JՁE{ 7 4t5Fdj%(\ ښaIH Ɇ zw D?؁>좀  _ 7Qc?ȁJ:~93DoB" llI  +*>. 쀀5! >j%(v *;q_ N %:L J I{1w ADV_ FƐ=[ pj%(\QaIH Ɇ zw D?؁>좀  _ 7Qc?ȁJ:~93DoB" llI  +*>. 쀀5! >j%(v *;q_ N %:L J I{1w ADV_ FƐ=[ pj%(\QaIH Ɇ zw D?؁>좀  _ 7Qc?ȁJ:~93DoB" llI  +*>. 쀀5! >j%(v * =+so_ N %:L J I{ ,0w ADV_ FƐ=[ pj%(\Qa]e+w D?؁>좀  _ 7Qc?ȁJ:~93DoB" llI  +*>. 쀀5! >j%(v * ;+so_ N %:L J I{ ,0w AD݀~" m n׀.K x>K2&iNI}g,j+7v  D[ K5ʀj%(\ ژ`]e+w Hc$Abz_yF}NK@a|@MK0ρ YRyz&5!! >j%(\ ښa]e+w D?؁>Xd'.a /^ /W o,75! >j%(\ ښ* K*) W X/ W_ N %:L J I{ Ftـ ADV_ FƐ=[ pj%(\Qarvw D?؁>좀  _ 7Qc?ȁJ:~93DoB" llI  -*>. 쀀5! >j%(v *C _ N %:L J I{ 8 똀tـ ADV_ FƐ=[ pj%(\Qarv ^ ^w D?؁>좀  _ 7Qc?ȁJ:~93DoB" llI  -*>. 쀀5! >j%(v  ΁ KLMNL"_ NcIM7i J HÆ ' zwxE YyJ4|Ҁ[ 27.Ks o l <ہs= MK7 75ʀj%(\ ژ` Řww z 4dQN.Q4 ,G9h u = =A=؁R(RPU =ƀ 45@q mʀ ݁ ># z&5Fdj%(GCÀژa Řww D?؁>@DV`V6]*ȇE o + Ɇ< ph D  flp2a75! >j%(\ ښ*CW"_ NcIM7i J HÆ 8W ͐ zwxE YyJ4|Ҁ[ 27.Ks o l <ہs= MK7 75ʀj%(\ ژ`www z 4dQN.Q4 ,G9h u = =A=؁R(RPU =ƀ 45@q mʀ ݁ ># z&5Fdj%(GCÀژawww D?؁>@DV`V6]*ȇE o + Ɇ< ph D  flp2a75! >j%(\ ښ ΁ ;⿀k_ N^" tI JK{ 6 wsƀ= xs&L>^ ܁ Ʌ NIcI;S  '@  75ʀ꤀2j%(\ ژ'-cy''cy_XҀ3ytـ  ; k_ := b@ @1LjXҀ3yt؀ D?؁>, Oa /^ /W o,j%(\ ښ* K΀k_ NcI JK{ 6 w ѹUc : XR.4lƁ=T;I^ 8^ w}}L zD<7e'c 쀀5Qj%(\ ژ'-,|u ژ'',|u ژ_XҀ3ytـ8ﺀ  Łdҁ Á  zKK ?< ?R:Y ؍77XR 75꤀2j܁ژ 9  ڙ ;  ڙ>XҀ3yt؀ D?؁>, Oa /^ /W o,j%(\ ښ[ڀ\mvv  [ڀM'ÉbS>r  y k&*܁ L8p9 /r9zJڜƁ_l3ڜuɀ h& ;&ـO .G[ڀ\m % [ڀM'ɯ O B (2ֿ\3ڀ ڀ'* {zdqp ހIKdJ!ڜHo` MҀڜʀW' ;&ـO .G׀,`}<;6] ) D 2 ^( ~;$ u N HfC-X^$-(qmm$m Mh-lX,. / GU٨ IC tF D%Ā *7v] Mvr=wl]7 3րDI^Á%ڳ Z S 5DǀJz =gFT@Np Lo +=cڳ?Zw [ h =H$] t.O$ˀAn>3"W [ PȁM?s ʀ:k D  `  qT1 MHf!Q݁?/|,1V#/:k nSur-VZؤ*8xnG?ŮU^0N 3bH2N OYw GÀY X- ~|ـ 58 6\Stq|J 4C sہG} 3@Mڀcڙ?$ˀ?>ڙpo .[ ڙ/8Yd 9ۀ1r @1yMܫD  04" Ӡ*  @1y l wd t {֛Hnz =vk(HeKC6g GÁCF 2 k2  D@ژ?#N\ 9ۀPa^ @4ID>ـb `1p#:O {C_HJY68? Z 3Ys Z CsYQ ڀR5 b#/:tIƀuFtF g z gv Q<48RDQ܁+€7 .%3@ Iy ḀZPE ـ_x% 8!u ޫ s߁9 %.ÀWn.\ Ӟq=ܫ̀Y(F?;D $^G` F(}. Rcw ~ Gw@c xN`} }*c kŁo t niG{.j]P Hl ~U%N cl fmL)F\ ڊ1 x*~.q'l S# #$"@5De91 bŀ ce bŀF1 bŀ ce bŀF1 bŀ ce bŀF1 bŀ ce bŀF1 bŀ ce bŀ_HcHĀ NyvvvvvvxxbʀMjx t$ ģO="-Y?4~ Cfր{8R ;?=8 p? 
c@Ӑ 7Á uVP{C7D,2I$ kX&ڙ QlnD  QlnD  QlnD  QlnD  LInD  QlnD  HInD  QlnD  FInD  QlnD  hJnD  hJnD  hJnD  hKnD  hJnD  hGnD f4 } < i= ӠBHuk y RPǁ y桀LtJBtGr桀LtJB*<Puk y RPŁ y桀LtJBtNr桀LtJB*<Pt`WJρ桀LtJBՁALtJB)+ht`WJρ桀LtJBՁALtJB)+mkuk y RPr=LtJBPr桀LtJBp6 y桀;9v>sUv>sUv>sUv>sUv>sUf>sUzHcHĀ N f Fā H  GR4݀ }Ib.NJ 6) 5<2  d9 AL!Kف=8#Z)@$6= E)4 jOb ڙplp 1q;# =t ӠN HƁn ڒ Ӡ7)@_Q{π~,*Āf>s"gVUd*ƀf>s"gVu" Z*ƀf?@_Q{π~,*Āf>s"gVUd*ƀf>s"gVu" Z*ƀf?@鬀a|1p9טY}qf>s"gN C*ƀf>s"gV)++*#f?@鬀m JρrQ*Āf>s"gO x*ƀf>s"gܠ)+1#*ƀf?@<1{π~[*ƀf>s"gV *ƀf>s"gzld鎁 Z*ƀ9⿀yv>sUv>sUv>sUv>sUv>sUf>sUzHcHĀ Nā H ⿀e9.: 47a} [@ Z  veKz ! B}_2jOc ڙ7A j:;8CEsz Ӡ$No  LۀRxU>s"g 1܀RxU>s"g ?܀RJNo  LۀRxU>s"g 1܀RxU>s"g ?܀RJNb6^*:TRxU>s"g대e 7#RxU>s"g대e #RJNb6^*:TRxU>s"g0 nRxU>s"g0 nRJNo ܀RxU>s"g !Ձ܀RxU>s"g !Ձ܀Rk$$4(C] :NWr"Vk  B:B <48R! `$ f* ܎T|   Ҁv ( 64" qAˀP'y U)X lh/:'.2  ̀$#߁ &z lx 3 M  ʀ.@d Yi 2Dʃ[ sjmf8q Z!YXq iX, $ i4U ~ liĀ li/za=Z &0- LGN-r`JT bW?EĀlY܀1U jiU jMA1U jiU m GHcHĀ OwwwwwwwwaN`w GGC M/ n*Ӏ N܀I/˘7`QZ L"cj 變  Ӡ) N GcA 'lZˠHk$ ' Wo d# 5€ x  6@ ݀z-1ŀ w x)Umy]g  Iр@2 HI Iq/* CC(x \ S Q S Q S Q S Qxjڜ L ~ Q S Q S Q S hK i hK i hI i hK̀ڜ8܀_[ V Ā UѺB 6 € }UѺB PBp:T V Fq3p m#=  Y À UѺB 6 ^,?UѺB PBp:T Yq.L^' ԄB W AW30Cʀ 0jQ!A"A' `X . h?vOee.+N'#sa| рILM:w1 bŀ ce bŀF1 bŀ1z_ Owwwwwwwwwwwwwwww"N Jx -   O= ˀ  1_* F) Dg|:%q  ~ 5oO V 7V Ā2!%%%%0€J} J NT6  z惁 ́ !| 9 %g0 훀 ڹC' TI0Lq ae4 ښ4 ښ4 ښ4 ښ4ژ4ژ4 ژ4 ژ4 ژ4 ښ4 ڜ4 ښ4M 첀 LtJB-6_s À LtJB, TͲ [>yO/*oH2 -% +KMNH2:ڞ2:ڞ28ڞ28ڞ28ڜ28ڜ28ڜ2;ڜ2;ڜ28ڞ28ڞ28ڞ1  [ހ28ڜ ň ڜ /?LtJB ETr?G#1ڜd }LtJB ETr?G#1ڜ^;Ё7%y []{ր\?x?! I%WÀa R I% H I% E=78ʀ J4WÀa R J4 H J4l€l #  Ii ؀G 1W ^BK  6 X|C,h \uۀ{E%&/Ӏ?  I(<À1Ͱ ހJځZ 9+ _mv s7OЀ0g pOǀހƻ ]mڀ$Mi)0%, - gɁU[V Ub7qPe$  <,一7@ Lڀ$Mi bY( 7 ?^L Śڀ$MiZ\[ۀ i \<zhQo/I* l 7\ 8sڜ>+GӀ >\c ⃀ > CԀ aف YRDLBd[ W, z 8TI rq . *I+O x ̀wˀ pMa 6ʀ#>i; Q! 'V T{ # YZ ̀[ ")<!@ Lʀ#>i7+ i 2Xʀ#>iZth8р<z[Gq0E !&(L LkrڜYP x ̀ b_b`^ >W рß  Lem%  s9m77݀aQ_Ӏ#܎ } Ł > H̀  Ā  ɀ g.  pg/%ހya,Bƀ?z X[:?z4 &ԁ v q W5h Bf:5 + I3?CFm O"T ?}Ӏ6RL` ΁7%)%C EMӀ7 xN^΀u ӠBL5 O?;?`3  ́ Z=%)%5 Ӡ7u75!  , Hˁ_ %e8 QXaM  O, `  2f+ȁ;> C u AxKe 7Vp ́~ n n nE ʀ XAzi(uہ /7g(d U`ʺ d ́w7NlꥁJw S '5=Nl1bly- N} \ "  Pa΀ Ӡ   'À.[ Ӡ "| ӠB<Nbߘ= ̀4Xt6v ᪀g )i f1́ F N7jY} :"?baXjuW4CÀ17Oe2BU8H9՞eZ-8 끁VS(1V899N6;7G37Ā(JfiǛA LRow q l@Lrl1/<I ,5{ ^tKoSl@q*k EoL]r"V6[AkRJTL%w5xD /<I ,߼o(Ӂ L% <D Bk . VL%h {Y܀1U jiU jMA1U jiU mzHcHHcc f癀 O X wwwwwwwwwwwwwwww ʀ WlLiG-z Cdf  - %i\ʀ 3m)N7sN9 + 6JB u.ȍwKhBK ÁGA@ _A! D4℁b4" Ӡ 4" Ӡ~ Q ϴ }G @t ;ڹC޸ ƀ?#<j ڜ Qiڞ Qiڞ Qiڞ Qiڞ Qiڜ Qiڜ Qiڜ Qiڜ Qiڜ hJɀڞ hFɀڞ hJɀڞ hFɀڜ y $> UѺB PR/U71ڜT1 V € }UѺB AP m wʁ ;ڹӤ, .r;̺Wyڜ Q ڞ Q ڞ Q ڞ Q ڞ Q ڜ Q ڜ Q ڜ L7 ڜ Q ڜ hF\2:ڞ hF\2:ڞ hJ\2:ڞ hJ\2:ڜ d &4 獁H">/UѺB (k t T1}> À UѺB (W B5yXVHcHHcHHcHHcHHcHHcHHcHHcHHcHĀ O4B>ak))~e, ؁ RS=ȁV/PQ<rQOOOJOD@PPPP"g!0ǀ;ρh==k+==k==kOPrn$ Ti jI+要 ܷYZ&U " )5Mq kkk^_kkkkkkk^_/ـ"-_^ցoƀp#_sD > fg|"CmXYNGҀ>0,d#_$ p#_' z$  w\ Ki 'p#_!])i >ŀ > XRlDr2S.P:ƀ ˏslD À K#ƀ.m%, w\ Ki 'p'_!])i >ŀ > XRlDr2S.P:ƀ ˏslD À K#ƀ.m_ w\ Ki 'p&_!])i >ŀ >_z$ L ) > πcɀQr2S.P:Y ˏslD À K#ƀ.m| wX Ki '.!])i >ŀ >_ w\ Ki 'p"_!])i >ŀ >| w\ Ki '.!])i >ŀ >_ w\ Ki 'p"_!])i >ŀ >| w\ Ki '.!])i >ŀ >_ w\ Ki 'p"_!Y)i >ŀ >%, wX Ki 'p#_!])i >ŀ > XRlDr2S.P:ƀ | / À K#Ȁ.m%, w\ Ki 'p#_!])i >ŀ > XRlDr2S.P:ƀ ˏslD À K#ƀ.m%, w\ Ki 'p#_!])i >ŀ >| w\ Ki '.!])i >ŀ >| w\ Ki '.!])i >ŀ >| w\ Ki '.!Y)i >ŀ > , n  ,n ]_z$ L ) >_ w\ Ki 'p"_!])i >ŀ >%,>,1R m ,?iq 6 W'0 :Q( Lɀ P ,A |I,A I > >,1!l̀m ,?iǀ %-'9΀ W$M,. g p# ,AZ_C Jq?F+[oǀɀ|IEʀ15?d€l>,1R ̀tĀ ,?iq  M ,+ ?F+[oɀp# Eʀ15L% K >4fg )      =k 0aa ! !?x 0 !NXisXXrv I% V}~ P~ Pǀ 35 njeF݀ njeFdddYRȾrq z!Y cK 3  .9*1[сrNo/H . 
rC,A L h ‘777  =k Z  =k Z7  =k Z777777  =k Z  =k Z77777  ++0+=kk[[ g v[[ h&Pv[[ S r77777;[[ No[[ EfPx77775H${ ( )q3 ) ) ) ) ) q r2S.P:m À K#m0 q r2S.P:m À K#m0 Հq ) ( r2S.P:m À K#m0 Հq ) Հq ) Հq ) Հq ) Հq ) Հq ) Հq r2S.P:m À K#m0 Հq r2S.P:m À K#m0 Հq ) Հq ) Հq ) Հq ) ( ) Հq y x y F kt> F kt; n |րn zyրa 1 U z_ ]ĀsMqMLLTئ & !#:QHtvb:54H O0 0Ѐ(:ҁ ;^ ǀC v X<&SWw.iThiTO6M wȽ9 ,$o U0 }Tm~l@h%ƀ Lhi71hqZ= Js GWjɁ}q Y'kSG,mweJ  v'̀v L 5v i b%c S X xB#Q` E JR > y桀st . DOTep;P` S}0vCs|w eiIJ IJ ÌI~J;EPJ8++>%ׁɀ ߀݀S^ LЁr`f  a b Jw!\-jij ;6T u C91vdv 6s Hg_?%%^$0+$mgsh7/צS|@rH&m  ,.h- hH] z|6] z|Q[] z|6$@WGuX]  b] . [v  Tq :! h  /U݀ U/ ʆ ́CÀkf)\ ڀ A[< i C@ Qkf)\ ڀ A[< i C@ H6 swmJ8RfN݀;-K4 ƀb̗"B gzf[1=? u AO*C SAL>=!G .H߁$Հ+;Xr 3~HFo4p3G+oe}aϑ  /? \ U~XԀ5h*imрӁˣ /? \ p=Ӏb8Ҁٱ pȀ # ЀSk  Xց`!_ \ ' 9 ڜ+0ٱ Iw4 ' ' ߀ ܀b€ Tp=V܀ gT[{t4 'N3 i&i۰ Hjp=`΀7}!_T$ ' ڀ߳1Ap=_ǀdEv \jˀɀoC.N+%cp= /G \ U}JX"  YJu aW{P+ 9 M /G \ U~I#50`agߑ $ %Ԥ 3 #- AK= /DZ(XS&|Ҁ74?΁ /=E  jCǁP R% 9 eC c!MԜr€ 3 Mԛ  Mc ' }"*ԛ ہKy(y '  +'Y*ΘN 1 ' z*Ι1%1%1%sZ ' }"*Ι1% ' }"*ep= /GR,ŀ-MV b`N_Na:kll a Ё  :d3YYZ*˽>u! Z*cLl#*ح5%&h%sXL,}#*ƀZ*ƀ'vwn c)&.z O`xQR Ѐ` /G \  a~ٞ &OD r 37eU2YRTjf Ľ̀4ρ>:4EL! r3=|W/V ʁ :Dn}n3> et =AK> aMK'Z΍ 1Mp= * /GjRŀ ĀtK ^̀MK'΍L2p=  /GQ <2p=  /G \ +L et2ZZ΍LL ? ֙p=  /F \ +  ĀtJ%!ā[[YفZZ[\Y ځ޳2΍LLL,2p=  /G \ +oCp= J S /Gpvހv k&g` 0$ x /^>ژ_ w j ( ژY^  0 /GjRŀ p_u1EсMb$ˀ;"*ep=a /G ٭'_ڑ N cIB타" ӞNMԀI;@}j&;AQ~Veli i Jy ܕ란 WXVSpl  A O`(Xp=PV π8Āڜ[h!Mɤss܀4Z*ƀɥ$Z*ƀOg  .M~iO ycˀ$Xğb .ָ .L" . h' MD΀u .< ,T AO_7 t ' &= ym]> )Oı 'z} dL 48Āڜ ȍ $ с &(J( lh"BɥɢZ*ƀɤɢZ*ƀh y*E 0@*Ā:s->wz>sl /1݀n j 4  0eƀ Z*n,:G KHg'׀E ʕژ / ژ=1ŀJO|M?!Mep=  /GP,ŀda; 0 b$ˀ;*b=}*c 8"< :}h *  F *ƀZ :}h } *J뗀 ژqπ;> R=kQv ߀U B|&xvq8S㙀Ѐ" ށqe&!] B Jp=  /G & awRP<p= /Fb8܀n Ӡ2*2p=  /G) d 2p=  /GQ <22p=  /GR,ŀ 1<>ښ?Mp=a /G & awoCp=  /GjRŀ<ژE-9Vl"p=  /G \ +iEp=  /G & aw''[N;ژ $% Ӟـ Ā Ӟ NiRXWz*#XWz*]g"}xo*ˁ p=  /G EV u6c #Zp=Q /G \ +_NiRp=  /FPĀ']00LN" ژ gQ - ӞF p=  /Gpv`]g"}xn*ŀΗNNiRp= /G \ + Rҽ" M ᱀π" ʊLc]^ژl===R=kS^ژs*" }jY]g" ր"a;4Z*Ā2S" ژ4" ژ&R=%*" }jY]g" }j"a;"*c '   {"e I} 2 t'_ڑN" ژ^ % <U-d7 '  y_ ' ulW ~+}^Ā} 24 '   {^€ $  '   {^€elNyC7 %HO Z*ǀǀzZ*cr0 \v+εA-g ,J.2$OU +p> , BN ? ~ }rvxxց p~MԀ2,2p=a /G P<ٯ\"p= /FZ )(  2' "  5ށ6 O ڔ ?GnqeMԀ iGp=  /G K?PNٯ t}́  w PE  JQZ Q|   @ QRҽMҀ L L~PE 4O_R=Tq}Mo"  l  i,}jq  'q   o"  [-ehD  ?M1"'q  5vmw!vmD^;+y0Ll J ހ %opDW ;Ekj 2<5=tƀ/c`΁Z-D= Yqǀ 4vmNb%e  2q9:T}0ҁ fvmNb%e  ā:T}0 `$+K!_S"yUi mN? CDz =2SmXv"X Y' Yy*) KDhYK"ˬf߀1ʀ R Ny)\J ^'M Oˀ ܀Sf)g/MqLZހ  D ~L`?~J _4?7+v! Ş #0N1W0Nd8 0N yV2 ( >Do- d7HdTڀ ÁMÑ u [,BWhc  ց5 dхzڜ́ v0CX亁ssgM !Hp m1 uρOAPqj< Dːv i b% xB# xe'^Áb܁#Āː ܠD|5ȘEπh߄% ځib$H!e s0PH`PK k   &  kn!3 3 m|рea { \o/uf GNs x>:my82πA /7S 2t ؤ Yy?rq=*yKt6 qƁsÆD/A ؤ YyπXLC1e , 3% b%뻁5#ˁ#I/, / /(((aS!+c'(((m'(R'((b݀ T؁  S; . +(I` ]Jq ,̀ 8 a #%de ̞$aNÊ 6aHzbG0#ћŀѓѓъ=z"ʀ&]ϼp_"i ̀ ΁  AA4$, >;@H@̀9>|ƀLK ( ȫa`{&d 0  #  F  D 8ۀu) Ձ g   ] `X -)^D    <  w}eA P!`+ @ = V*"*e_p  [` P w8  ) y I  `ހy|E Y , 8 0       6w  T   ҁ t Hހ I n I / #`y " 9,0oF I6  e Ȃ    9 _} V/ׁ  î   Zǀ  R .," ԁ ) Ā 5; -`}t[Ԁ\I tPV  q!] ±qqS?v72q  E  xJW   x h)H!w!Fo" > k   á;7   ҁ  ,u* MG 6   ]q ! ~ t Ÿ #Q&" ҁ1a" € oV(i6)1-&t ) +  L  *V"o':.<  N  ORp.  J)  # Ps耑o+[&u/ 3mH׀{s  T5 h 8    ́ ( t   A,p   ̀ _  g 5#P  >#     t S  H$p$q   r18Pm  T kQ6Q4D ^   c é ]  \   = GM  Kq V    #1ځdMf_Ȁ",,I Y#" 4[4A[4VK Ҁ-"w,7 [[gW 2IsIt + p   D 8M Ѐ  sP(.  c e D ˀ@Q<U8 ׁFNFh  F   h  6Yȁ | 4..P H?  Ú u  .  
$   && Mv"  ] D G @U  C`3 l`8 XI 5\o  fT@ i /_ T w c  P4Pٵ|1ɀ EZ  Á #F,paNa  I\ r <~贀eXGc';r嫀=S/" 4 ƹ  2:\1R\" j q E  x  K u0 Ӂ ] Y:  4 T  '@抓 d _~B "UD|t" X a/ ́  9%  z@ FVt>&5 {p]  lR;¤;~m :r^  4  , &$A= Fx z ݵ ܻ ݵ ܻ [ h ̀AAA= ?  ? ր ? J6j ? 7 ? .ρC ? ր ?x ? ߀ ? GC ? ߀ ? U ? ߀ ? U.mukul- dj*     ,] p(  U   , pR  \  s]`߁     ,*, *S*pUȀ@  ?6  W))X g d ms7t5t5t"BCUZĀO F{{#$#ɀ % Ĥ p  0 d z  Ns E e t4:o@ Ё  } / :. d p  M f Q12^c^z(SSr  Kx   ^GG !]  g `  ĕ  (}ǁ v=      ,  j u  (: TS ^ _ ?i4[Pr orÊ ;8PlcsOb=] ŀѓѓ тz`(ZV[QNG8J pHZˁDT! bp` c 1tՀ0@ #  {Ād@P@ 0,3pP  z@ B R ԍ`ϼ\ ؿ,APHP;@ ` tP ` DkfЁ  P d 0   P   l P   0((       ts ?  L < P  ̀^A Ԁϔϔ 8 R-_ ] ҝ qہ t 4P N    χ   i L4؀6    86  X  o  0(L` 0 0P &*   8 3 ^w$ q ކ72d  0 - 000"  \ 8      8o Q  j1S=  ( ̀X Q`7 | 1߀@ |K G 4 0\   0*  ؁ H X ́  02 0 ! 2 8 F< _ 8 \[Ϙ 4-d  ԁ A l ¤) 4 P4 D K ( 8L h   ((  h B |i87H 6\   Kp׀sqXVUcHx){) T Ԁ@ 4 &+E [ k t lj  ߁ ; X C M >h 3&րӀ , `9/K$Z(  l` sBp:sAp:casvjހcasr2_tv itX \ \XC `)wYk C  \ $v al| IaRVa ukv vv 0yyʢ TjXC aiVXVXd7 TjXC& XzDWY{]#F :痀  /.;潁 ! p/R 1) n HG-^$1i0#ijf}8 4MQOj+>`= lQ  7nɒs)J ?q <ڳY4 {5 > xP5NX}ŀ;<N01Ku 7 n~| Cxˤ IsMCw߶x l HCH'W^F Yy - ¨Pm||M6ccc }^ܫ 53tX/ ?WkK1ɁlodG -gӪ Bc`q l >wd ڜ#i 4$ƀ zE  \ 5T D +;. {BcD KC+Pځ*ǀ|>ځ*ǀ|>ځ*ǀ|>ځ*ǀ|>ځ*ƀZ<|>ʀ*ĀZ<|>ʀ*ĀZ<|>ʀ*ǀ|>ځ*Ћ ހ ] V`WR ݁{ˀǂZF ֆt8lCTHs09QdKvρ(m 7 ^OM8 Á & 1~] i , d]hׁ 0с| \ 5T D1YǷ C}C z ݁@ ~ R{>BЁ* H؀ Ӡ zՁə=}6X z& A M zIb 4%M zIb 4%M zIb 4%MY7ObPs݀6W z; &}"*} ;.}"*ƀZ&}"*9 R }k\\ }5W*ƀY\ }"*ƀY\ }"*ƀY\ }"*ƀY\ }5Z*ƀY\ }5Z*ƀZ&}"*} }5Z*ƀZ\ }5Z*ƀZ\ y-X*ƀZ\ }5X*ЋƑ  E k7ٷL5 kF%π]q1V0NN \LU} 1ǹ [,n5 xf \LU} 11 [,n5 xf1ڳ0\ o-#-L5 t kՀ%q1 Ư ΁aOwU  EBw>/  EBi΀wW6s,?ywW6s:0N 0B uzD.h5€zdK[G  q fD uŀ9Jn \ˁn`tlf./€xxԁ',}^dH! 䞀J8 `ژH g,}ƀ z;껀6 X "Yf k],}ƀ z; 6р k!`q l z aژ d}ʁЀIHD.hgzb i 4](iHŀ 5 \ˁnM{/I37,C(Ӂ ń`(B`q dzd.Y ژH g,}ƀ z; z6] O΀6_3p1Āzd.Y ژWg,}ƀ z; + _2p1Āzd.Y   m L}ʁ5i Tj dH| 3 t*JQ]ɀ }VX64טÀ { H6K4  n *}JcGց uH@ 4%?('.}I~lb c NMӀNMӀNMрNMӀNMӀ " ;*9+Oiځx5s *&I;k]\ }k]OÀ]&I;k]d.YQsk]&}k\iځ *&}k\iځ  *8<u }k^]9H <V }kĀ-1ŀ{ %&5+*ЖƑVX64טÀ4zm r35H@ [' Z p- < 4C@@/0 tkjnu ӠNMҀNMӀNMӀNMӀNMҀ9| ӟʀeox(]8<uPsk]8u }k] }k]y{ }k y-]y{ }k]&}k^1ǴQ׀.Y }"*ƀY e|Ƒd7 Tj\6  ` ؀5q}WkOЁ]} zIɘ Ё]}I RQ: @؀x)^H>R YҀSx)^H #q_<Aglԁ y준:Du]Rׁ #-  6OB & ghN o xDsY쯀hJfKπS֚n kÉB k>mY ڀmB~ ˁ巀I Y9O [d`1k4р lJD ^R(ǁ + 3 ;LŸwOSTO ӞH pd.[ }k 0"   }kvրȀ,>wL\m _Hc| K|0+* ? HhG?^$Aفɀ B) LY/mg)K7L%>ǀ7rLYڀSI] ׀ %ϖ*LM7!^ob ]Kn '@ \*{ "[  ..%> G n~O j;',=z?omyph+%? ˀ 77,-2/#lxy6EFmkӀ*6Ex 4  n~lxϧ MI7 n~ TYH?vX@̎ 1(`K eMπL@x 5 -9Bs ׁ !pG  ݁PEu rdj~!/.9Xi/5mсi.&@wZ\fR}TmtX*N XU ?xU2iT ǀ e*Mɀ M~ ` Ɂρa Ҁ)e z( 4cc tǀ ;I ߀p|%0'_Q z( 4ccn` ( 'P / Si "g "quǀހƾXQPC2NYO3I9m'_q 0.6*& KC]ѥ2 F3 P2 Pv  $bE7% t  ǀ $  ԀmW7d+ A4ߣ I,]FZnڜYc[oڜƁÌژՀ  ԀmW7d+ A4ߣ I,]FZnڜYc[oڜƁÌژOT l `_  )Ҁ. P ~P }a ~Prx Ia W)Q wYhQr"ߓL PL HaGƦ%^$ L-ǀ VyD3& ˩YȏCI H R c.5i`LXLTo QhZ+ً~ O鶀t 9 | $ Fցf~l.* *agpnnT S >ÁYPc 2( bF*mf`z pnnT ` YS >ÁYNc 2( pm GI nk E1 ng.hE e[9p<(岀ڀGCCoqzR F˫N ށ*"cg %fP 9' F   PJ<P Ӂqz xd̏z #Q HÍEb K %} 6[ ȁEU · V_ F Rn.o~ Am 4kjnwhL67{ R ̏z #Q HÍE D:@?E VL wKwL)7(Hm7l{ U<o ̏z #Q HÍE D:@?E VL wKwL)7(H 1 1m7l{ U<o ~̏z #Q HÍE D:@?E VL wKw s  ? @t >m7l{ U<o q v4v_  0 \ Hą#W? 
+D|- + YG es(˗7NF BjObv ' _iFb5\ Eq %W  An  nxC a  ^ <ҁŀja̴<Vy ژ  n~v>iy  ` 적 PL!āoNl1{.]J q v4v_  0 \ Hą3Wɀ8 zD_̷.]7375̀㏀9jMWv 'sq 3a  zwW0Z 3 |ɟ~Z "6Ҁʿ A꤀2jaObl la% ~v>iy  ` 적 PL!āo5{.]J q v4v_  0 \ Hąν[HDO Po e567NV BjObv '*uog W{+fPgwހ fl 6c n>- Ul<ҁ׳jaDZ < Igژ 4"^% n~v>iy  ` 적 PL!āoN^1{.]J q v4v_  0 \ Hą>€6;րT 'h4U8w :j 7 ajn v '   I yK P xI=z4} C㕁l<ҁĀ'pBׁOb< ہ X4% n~v>iy  ` 적 PL!āoNl1{.]J q v4v_  0 \ HąNBK" l݀F ) xJ$lH675̀ꡀ2jObv 'Fc / ˀ ـ] P r `g | s5ҁ[Ӏ꤀2l<  ژ EWMv% n~v>iy  ` 적 PL!āo5{.]J q v4v_  0 \ HąMqG >G&m O j n(˗7NVn2jObv 'ӃߙMӁ'mAC ] rO  >4  .րj- wv<ҁ׳jaOb<&ژ ؁% n~v>iy  ` 적 PL!āoN^1{.]J q v4v_  0 \ Hą>€6;րT 'h4 & 'M.7 ajn v '   I yK P ԁ b ݁ >| rF 5. fSl<ҁĀ'pBׁOb< ہ X4% n~v>iy  ` 적 PL!āoNl1{.]J q v4v_  0 \ HąNBK" lj* FEl675̀ꡀ2jObv 'Fc / ˀ ـ 1 6  $8 | s5ҁ[Ӏ꤀2l<  ژ EWMv% n~v>iy  ` 적 PL!āo5{.]J q v4v_  0 \ HąMqG >G X x H'M.ҁ˗7NVn2jObv 'ӃߙMӁ'mAC ] R.~B= Q wv<ҁ׳jaOb<&ژ ؁% n~v>iy  ` 적 PL!āoN^1{.]J q v4v_  0 \ HąνҚ!;րT 'h4 & 'M.7 ajn v '   I yK P ԁ b ݁ >| rF 5. fSl<ҁĀ'pBׁOb< ہ X4% n~v>iy  ` 적 PL!āoNl1{.]J q v4v_  0 \ Hą + *)A K" lj* FEl675̀ꡀ2jObv 'Fc / ˀ ـ 1 6  $8 | s5ҁ[Ӏ꤀2l<  ژ EWMv% n~v>iy  ` 적 PL!āo5{.]J q v4v_  0 \ HąPM >G X x H'M.ҁ˗7NVn2jObv 'ӃߙMӁ'mAC ] R.~B= Q wv<ҁ׳jaOb<&ژ ؁% n~v>iy  ` 적 PL!āoN^1{.]J q v4v_  0 \ Hą;րT 'h4 l-]>  q]7 ajn v 'Cށ .W I yK P ԁ b ݁ >| r#,+h 9l<7N]'pBׁOb< ہ X4% n~v>iy  ` 적 PJ!oNl1{.]J q v4v_  0 \ Hą0r2K" ljt sW 5 5G675̀ꡀ2jObv '7c / ˀ ـ%V - + | s 75꤀2l<  ژ EWMt% n~v>iy  ` 적 PJ!o5{.]J v4v_  0 \ HąMG >G"-]> C(˗7NVn2jObv 'p& tMӁ'mAC ] R.{сF) O wv<7NnjaOb<&ژ ؁% n~v>iy  ` 적 PJ!oN^1{.]J q v2v_  0 \ Hąecm kR5j | U5TA؀Jс 8?GNFn2 v ' @y=m@*(l m  b ( "87NZjaOb<\a4* ^U)Ɂv>iy  ` 저\ 10J!oNl1{.]J q v2v_  0 \ HąINJ 6 ZT 2j kr@#w5Q jObv '!  _ / ˀ ـ] *9 {}Ё+`V | s 75꤀2l<  ژ EWMvU)Ɂv>iy  ` 저\ 10J!o5{.]J q v2v_  0 \ Hą9 Nx6 C _)O9<z˗7NVn2j}k fv ' {q %W  2 ݁ ɛ 'Ko  ㆁ^ <7Nmja̴<Vy ژ cC)Ɂv>iy  ` 저\ 10J!oN^1{.]J q v2v_  0 \ Hąecm kR5j | U5TA؀Jс 8?GNFn2 v ' @y=m@*(l m  b ( "87NZjaOb<\a4* ^U)Ɂv>iy  ` 저\ 10J!oNl1{.]J q v2v_  0 \ HąINJ 6%a KZT 2j kr@#w5Q jObv '!  
_ / ˀ ـ] *9 {}Ё+`V | s 75꤀2l<  ژ EWMvU)Ɂv>iy  ` 저\ 10J!o5{.]J q v2v_  0 \ Hą9 Nx6 C _)O9<z˗7NVn2j}k fv ' {q %W  2 ݁ ɛ 'Ko  ㆁ^ <7Nmja̴<Vy ژ cC)Ɂv>iy  ` 저\ 10J!oN^1{.]JE ) I% I I% _ Nw MqLG- `{N0֊ >XAh$ N > ȁ N ہ OR+=J=> qPImD L 7NFj%(\ ڙ M M,ށ  I=c@?G M M,݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N0֚~Ѝ~ } PM<  vZ $̩8mm DSW 7NVPC%(\ ژ M M,ހT˨ 7%= , &| 16AKEƔ \Gs')Zhu1 Q7v7NmX7%(\ ژb>G M M,݀ فo҆_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N0}>  2ԁ+HBy7 = c&4 ϟ(+¿sA˙7NFj%(\ ژ M M,ހmGAN`Ks 1^=$ YffK@ ?Sv LlqD 7A7N]9%(\ ژb" ژ M M,݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N0Rm VNYۀMP`P7 B, i ~q )'c 쀀5ʀ꤀2j%(\ ژ M M,ށT/wm ߀g?zYKs q_v | qژ M M,݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N0?˛ DWȁȅ [ŀ y 6=ԂS>ҁ@T 6KH 7NVj%( ژ M M,ށ>II&EYD sl|Y>  < }>ӮT (},(( xu {D  lP57NmX7%(p b>ژ M M,݀ فo҆_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1E q Lk J4 J J4 _ Nw MqLG- `{N0}>  2ԁ+HBy7 = cIj<7` 'ԀA˙7NFj%(\ ژ M \-ހmGAN`Ks 1^=$ Yff SCoC/kAӷ lqD 7A7N]9%(܁ ?b" ژ M \-݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1E q Lk J4 J J4 _ Nw MqLG- `{N0Rm VNYۀMP`P7 B, i ~ 6S ¶Tml'c 쀀5ʀ꤀2j%(\ ژ M \-ށT/wm ߀g?zYKs q_v | qژ M \-݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1E q Lk J4 J J4 _ Nw MqLG- `{N0?˛ DWȁȅ [ŀ y 6=ԂKJ~i-Ȍ ,%́ 7NVj%(5 ^d M \-ށ>II&EYD sl|Y>  < } /6BOnx0 xu {D  lP57NmX7%(\ab>ژ M \-݀ فo҆_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N0}# 2Á+HBy7 = cIj<7` 'ԀA˙7NFj%(\ ژ M M,ހmGAN`Ks 1^=$ Yff SCoC/kAӷ lqD 7A7N]9%(܁ ?b" ژ M M,݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N0R/'g NYۀMP`P7 B, i ~ 6S ¶Tml'c 쀀5ʀ꤀2j%(\ ژ M M,ށT/wm ߀g?zYKs q_v | qژ M M,݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N0 ݁ DWȁȅ [ŀ y 6=ԂKJ~i-Ȍ ,%́ 7NVj%(5 ^d M M,ށJfB}I&EYD sl|Y>  < } /6BOnx0 xu {D  lP57NmX7%(\ab>ژ M M,݀ فo҆_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N0} |++HBy7 = cIj<K O}E{߀A˙7NFj%(\ ژ M K,ހmgAN`Ks 1^=$ Yff SC/}5z$lqD 7A7N^9%(\ ژb" ژ M K,݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N0R]~NYۀMP`P7 B, i ~`D~@%р-t6'c 쀀5ʀ꤀2j%(\ ژ M K,ށS/wm ߀g?zYKs q_v | qژ M K,݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1E ) I% I I% _ Nw MqLG- `{N0Mnˊ DWȁȅ [ŀ y 6=ԂKJ B}c=tgN 7NVj%(/mژ M K,ހ,+%8I&EYD sl|Y>  < } /6B ys'R xu {D  lP57NnX7%(\ab>ژ M K,݀ فo҆_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1E q Kj J4 J J4 _ Nw MqLG- `{N0s~]>V_~ہ~ II# S+(+¿s 7NFj%(\ ژ M [-ށ[ ` CR>(D uz rH1"x}A u x  yW[D 7A7N^X<g%(GCKb>ژ M [-݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1E q Kj J4 J J4 _ Nw MqLG- `{N0V-ƀ,qÁ+HBy7 = cIZ !K6'  A˙75,J%j%(\ ژ M [-ށ切 Eg?zYKs q_v | qژ M [-݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1E q Kj J4 J J4 _ Nw MqLG- `{N0}_N z b.4lƁ=T;I^ 8^  Q ,-yH€ ~ç'c NVj%(\ ژ M [-ހJˀvځgr>bS'_vw ˕?ˀ}7C+-&~ '` {D Yr97wd.NnX7  '^ ژb؀% ژ M [-݀ فo҆_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N0! 
KA)]>V_~ہ~ II# S+(+¿s 7NFj%(\ ژ M K,ށ[ ` CR>(D uz rH1"x}A u x  yW[D 7A7N^X<g%(GCKb>ژ M K,݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N0V-ƀ,qÁ+HBy7 = cIZ !K6'  A˙75,J%j%(\ ژ M K,ށ切 Eg?zYKs q_v | qژ M K,݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N0}_N z b.4lƁ=T;I^ 8^  Q ,-yH€ ~ç'c NVj%(\ ژ M K,ހJzgr>bS'_vw ˕?ˀ}7C+-&~ '` {D Yr97wd.NnX7  '^ ژb؀% ژ M K,݀ فo҆_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1EE ̀̏,wI2_ Nw  HiG-  xwe9}Y&V{xbp { 6-B9 )N΀ cyW w ؏67 ?ɀ Ā2j%(\a M M{ށM*F@I H́[LҁK? ,:K  ]߁ xu <YRyv7m|!]X7%(\ ژژ M M{݁ԁ'p/lJ /M= )iǺ o *Lx Q s 2Jk 7 ˁ1OЀju7Nl1B ̏,wI2_ Nw  HiG-  xwήq ڀ~ } PM<  vZ $̩8mm C? ,> o[ȆBD| ] 7 c2j  ژ M M{>"xġ >  EV , )C  eKhr,G0p)R \l<yv7 c2Xd.%(\Qf  ژ M M{݀ فo҆_5-) lԃ" 7̅ ' "xġ >  EV , )C  eKhr,G}e \l<yv7N]Xd.%(\Qf  ژ M M{݀ فo҆_5-) lԃ" 7̅ '  75!! >j%(\ ژ M M{ށM >(U] %) mCJT N u = 4_x}D',}399K  [X <Gv75꤀2X 7%(\ ژfua M M{݀ فo҆_5-) lԃ" 7̅ ' F( w 7NVj  ژ M K{>"x͏ց >  EV , )C  eKhr,GʀK}@-v>a) \l<yv7NmXd.%(\Qf  ژ M K{݀ فo҆_5-) lԃ" 7̅ '  ؏67 ؀j%(\a M L{ށvNׁ  t yJ*F)#YY_' Z?, @} \ ̀a l<8?Gv7N^X7%('ژf" ژ M L{݀ فo҆_5-) lԃ" 7̅ '  ؏67 ꤀2j%(\a M K{ށvN t yJ*F)#YY_' Z?, @} \ ̀a l<8?Gv75꤀2X7%('ژf" ژ M K{݀ فo҆_5-) lԃ" 7̅ ' iy  ` 적 PL!āoNl1{.]J q v4v_  0 \ Hą3Wɀ8 zD_̷.]7375̀㏀9jMWv 'Q!dy  zwW0Z 3 |ɟ~Z "6Ҁʿ A꤀2jaObl la% ~v>iy  ` 적 PL!āo5{.]J q v4v_  0 \ Hąν[HDO Po e567NV BjObv '& Bx({+fPgwހ fl 6c n>- Ul<ҁ׳jaDZ < Igژ 4"^% n~v>iy  ` 적 PL!āoN^1{.]J q v4v_  0 \ Hą>€6;րT 'h4U8w :j 7 ajn v '&MNo I yK P xI=z4} C㕁l<ҁĀ'pBׁOb< ہ X4% n~v>iy  ` 적 PL!āoNl1{.]J q v4v_  0 \ HąNBK" l݀F ) xJ$lH675̀ꡀ2jObv '&l$- t / ˀ ـ] P r `g | s5ҁ[Ӏ꤀2l<  ژ EWMv% n~v>iy  ` 적 PL!āo5{.]J q v4v_  0 \ HąMqG >G&m O j n(˗7NVn2jObv '&BTK& tbB뒀FhBʁ ] rO  >4  .րj- wv<ҁ׳jaOb<&ژ ؁% n~v>iy  ` 적 PL!āoN^1{.]J q v4v_  0 \ Hą>€6;րT 'h4 & 'M.7 ajn v '&MNo I yK P ԁ b ݁ >| rF 5. fSl<ҁĀ'pBׁOb< ہ X4% n~v>iy  ` 적 PL!āoNl1{.]J q v4v_  0 \ HąNBK" lj* FEl675̀ꡀ2jObv '&l$- t / ˀ ـ 1 6  $8 | s5ҁ[Ӏ꤀2l<  ژ EWMv% n~v>iy  ` 적 PL!āo5{.]J q v4v_  0 \ HąMqG >G X x H'M.ҁ˗7NVn2jObv '&BTK& tbB뒀FhBʁ ] R.~B= Q wv<ҁ׳jaOb<&ژ ؁% n~v>iy  ` 적 PL!āoN^1{.]J q v4v_  0 \ HąνҚ!;րT 'h4 & 'M.7 ajn v '&MNo I yK P ԁ b ݁ >| rF 5. fSl<ҁĀ'pBׁOb< ہ X4% n~v>iy  ` 적 PL!āoNl1{.]J q v4v_  0 \ Hą + *)A K" lj* FEl675̀ꡀ2jObv '&V$- t / ˀ ـ 1 6  $8 | s5ҁ[Ӏ꤀2l<  ژ EWMv% n~v>iy  ` 적 PL!āo5{.]J q v4v_  0 \ HąPM >G X x H'M.ҁ˗7NVn2jObv '&BTK& tbB뒀FhBʁ ] R.~B= Q wv<ҁ׳jaOb<&ژ ؁% n~v>iy  ` 적 PL!āoN^1{.]J q v4v_  0 \ Hą;րT 'h4 l-]>  q]7 ajn v '&u<No I yK P ԁ b ݁ >| r#,+h 9l<7N]'pBׁOb< ہ X4% n~v>iy  ` 적 PJ!oNl1{.]J q v4v_  0 \ Hą0r2K" ljt sW 5 5G675̀ꡀ2jObv '&SR. t / ˀ ـ%V - + | s 75꤀2l<  ژ EWMt% n~v>iy  ` 적 PJ!o5{.]J v4v_  0 \ HąMG >G"-]> C(˗7NVn2jObv '&BeK& tbB뒀FhBʁ ] R.{сF) O wv<7NnjaOb<&ژ ؁% n~v>iy  ` 적 PJ!oN^1{.]J q v2v_  0 \ Hąecm kR5j | U5TA؀Jс 8?GNFn2 v '~hBy=m@*(l m  b ( "87NZjaOb<\a4* ^U)Ɂv>iy  ` 저\ 10J!oNl1{.]J q v2v_  0 \ HąINJ 6 ZT 2j kr@#w5Q jObv 'sR / ˀ ـ] *9 {}Ё+`V | s 75꤀2l<  ژ EWMvU)Ɂv>iy  ` 저\ 10J!o5{.]J q v2v_  0 \ Hą9 Nx6 C _)O9<z˗7NVn2j}k fv 'j1q %W  2 ݁ ɛ 'Ko  ㆁ^ <7Nmja̴<Vy ژ cC)Ɂv>iy  ` 저\ 10J!oN^1{.]J q v2v_  0 \ Hąecm kR5j | U5TA؀Jс 8?GNFn2 v '~hBy=m@*(l m  b ( "87NZjaOb<\a4* ^U)Ɂv>iy  ` 저\ 10J!oNl1{.]J q v2v_  0 \ HąINJ 6%a KZT 2j kr@#w5Q jObv 'sR / ˀ ـ] *9 {}Ё+`V | s 75꤀2l<  ژ EWMvU)Ɂv>iy  ` 저\ 10J!o5{.]J q v2v_  0 \ Hą9 Nx6 C _)O9<z˗7NVn2j}k fv 'j1q %W  2 ݁ ɛ 'Ko  ㆁ^ <7Nmja̴<Vy ژ cC)Ɂv>iy  ` 저\ 10J!oN^1{.]JE ) I% I I% _ Nw MqLG- `{N0} / 87B]J( @C}T ƁKJ cہH ʁ 7NFj%(\Q M M,ށ  I=c@?NYۀMP`P7 B, i ~Ȯ=|I}'c 쀀5ʀ꤀2j%(\ ژ M M,ހ H',k20;~A F)  貁#ǩ/Ity};X vÙAr H? 
1 €7XR75FdX7%(\ ژf>G K M,݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N0}+ `I 6(X w , pG K M,݀ فo҆_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N04̀8.P3 NIc =J>P B5 ` 7NFj%(y ژ M M,ހmGAN`Ks 1^=$ YffK@ ?Sv LlqD 7A7N^9%(\ ژf" ژ K M,݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N0 + =6f 2]>V_~#~ II# _ ϊ<F€ ā 75ʀ꤀2j%(\ ژ M M,ށT/wm ߀g?zYKs q_v | qژ K M,݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N0L = R 6n"4 )(NK H}M0} y; ?Gǹ7% ؎77NVjA} M M,ށ>II&EYD sl|Y>  < }>ӮT (},(( xu {D  lP57NnX7%(p f>ژ K M,݀ فo҆_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1E q Lk J4 J J4 _ Nw MqLG- `{N04̀8.P3 NIc& &(ªQ 7NFj%(\ ژ M \-ހmGAN`Ks 1^=$ Yff SCoC/kAӷ lqD 7A7N^9%(܁ ?f" ژ K \-݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1E q Lk J4 J J4 _ Nw MqLG- `{N0 + =6f 2]>V_~#~ II#  7 p&C KhIfy(yH 6 75ʀ꤀2j%(\ ژ M \-ށT/wm ߀g?zYKs q_v | qژ K \-݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1E q Lk J4 J J4 _ Nw MqLG- `{N0L = R 6n"4 )(NK J * S  Z827% ؎77NVjA} M \-ށ>II&EYD sl|Y>  < } /6BOnx0 xu {D  lP57NnX7%(\af>ژ K \-݀ فo҆_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N04̀P@s&L>^ @y NIc& &(ªQ 7NFj%(\ ژ M M,ހmGAN`Ks 1^=$ Yff SCoC/kAӷ lqD 7A7N^9%(܁ ?f" ژ K M,݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N0Flg]>V_~#~ II#  7 p&C KhIfy(yH 6 75ʀ꤀2j%(\ ژ M M,ށT/wm ߀g?zYKs q_v | qژ K M,݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N0H = R 6n"4 )(NK J * S  Z827% ؎77NVjA} M M,ށJfB}I&EYD sl|Y>  < } /6BOnx0 xu {D  lP57NnX7%(\af>ژ K M,݀ فo҆_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N0_ЀT~xs&L>^ @y NIc&ᖀ O=K:܁ 7NFj%( ژ M K,ހmgAN`Ks 1^=$ Yff SC/}5z$lqD 7A7N^9%(\ ژf" ژ K K,݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1E q) I% I I% _ Nw MqLG- `{N0 2]>V_~#~ II#  73 <@ /7؁ 75ʀ꤀2j%(hژ M K,ށS/wm ߀g?zYKs q_v | qژ K K,݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1E ) I% I I% _ Nw MqLG- `{N0H += R 6n"4 )(NK J́j' ?H8 tԀ7% ؎77NVjA} M K,ހ,+%8I&EYD sl|Y>  < } /6B ys'R xu {D  lQ77NnX7%(\af>ژ K K,݀ فo҆_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1E | Kj J4 J J4 _ Nw MqLG- `{N0. R 6n"4 )(NK J́) rǁroA7% ؎77NFjA} )n [-ށ[ ` CR>(D uz rH1"x}A u x yW[D 7A7N^X<g%(GCKf>ژ K [-݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1E | Kj J4 J J4 _ Nw MqLG- `{N0֚j}%+A+HBy7 = cIZ 1 5` ts%A˙75t1j%(; X M [-ށ切 Eg?zYKs q_v | qژ K [-݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1E r Kj J4 J J4 _ Nw MqLG- `{N0dzڀ~J: r} ( Q  Ɓ 1ɀz_r2I Sy{ 3~ |ŀA˘7NVj%(\ژ M [-ހJˀvځgr>bS'_vw ˕?ˀ}7C+-&~ '` {D Yv97wd.NnX7  '^ ژf؀% ژ K [-݀ فo҆_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1E |) I% I I% _ Nw MqLG- `{N0. R 6n"4 )(NK J́) rǁroA7% ؎77NFjA} )n K,ށ[ ` CR>(D uz rH1"x}A u x yW[D 7A7N^X<g%(GCKf>ژ K K,݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' xTCn2j Cʀ DNl1E |) I% I I% _ Nw MqLG- `{N0֚j}%+A+HBy7 = cIZ 1 5` ts%A˙75t1j%(; X M K,ށ切 Eg?zYKs q_v | qژ K K,݀ فo҆_ HW] M  E )VĴF_ uހZfy&41b,ف;ӀD ŀ ' x Sꡀ2j Cʀ DNl1E r) I% I I% _ Nw MqLG- `{N0 .ڀ~J: r} ( Q  Ɓ 1ɀz_r2I Sy{ 3~ |ŀA˘7NVj%(\ژ M K,ހJzgr>bS'_vw ˕?ˀ}7C+-&~ '` {D Yv97wd.NnX7  '^ ژf؀% ژ K K,݀ فo҆_ HW] M  E )VĴF_ uހZfy&4Bb,ف;ӀD ŀ ' xT3n2j Cʀ DNl1EF ̀̏,wI2_ Nw  HiG-  xwe9}Y&V{xbp { 6-B9 )N΀ cyW w ؏67 ?ɀ Ā2j%(\a M M{ށM*F@I H́[LҁK? ,:K  ]߁ xu <YRyv7m|!]X7%(\ ژژ M M{݁ԁ'p/lJ /M= )iǺ o *Lx Q s 2Jk 7 ˁ1OЀju7Nl1B ̏,wI2_ Nw  HiG-  xwήq ڀ~ } PM<  vZ $̩8mm C? 
,> o[ȆBD| ] 7 c2j  ژ M M{>"xġ >  EV , )C  eKhr,G0p)R \l<yv7 c2Xd.%(\Qf  ژ M M{݀ فo҆_5-) lԃ" 7̅ ' "xġ >  EV , )C  eKhr,G}e \l<yv7N]Xd.%(\Qf  ژ M M{݀ فo҆_5-) lԃ" 7̅ '  75!! >j%(\ ژ M M{ށM >(U] %) mCJT N u = 4_x}D',}399K  [X <Gv75꤀2X 7%(\ ژfua M M{݀ فo҆_5-) lԃ" 7̅ '  7NVj  ژ M K{>"x͏ց >  EV , )C  eKhr,GʀK}@-v>a \l<yv7NmXd.%(\Qf  ژ M K{݀ فo҆_5-) lԃ" 7̅ '  ؏67 ؀j%(\a M L{ށvNׁ  t yJ*F)#YY_' Z?, @} \ ̀a l<8?Gv7N^X7%('ژf" ژ M L{݀ فo҆_5-) lԃ" 7̅ '  ؏67 ꤀2j%(\a M K{ށvN t yJ*F)#YY_' Z?, @} \ ̀a l<8?Gv75꤀2X7%('ژf" ژ M K{݀ فo҆_5-) lԃ" 7̅ ' .׀oq K'A9?I"77vd.mJ,(X7܁ژf>ژ M M{݁ԁ'p/lJ /M= )iǺ o *Lx Q s 0Jk 7 ˁ1OЀju7Nl1E ~̏,wI2_ Nw  HiG-  xw} q~NYۀMP`P7 N i ~}>|'c jnj%(\ ژBN{ށY l4G B aF)I&0{w [g  +P A߀ ~ ? 4" Z? <7v<g c2X7%('ژf>ژ M M{݀ فo҆_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv7N]Xd.%(\Qf  ژ M M{݀ فo҆_5-) lԃ" 7̅ '  75tQj%(\ ژBN{ށz@3B %) mCJT N u = 4_x}D',}399K  [X<Gv75FdX 7%(\ ژfua M M{݀ فo҆_5-) lԃ" 7̅ '   EV , )C  eKhr,GYQŗ2mC? \l<yv7NmXd.%(\Qf  ژn K{݀ فo҆_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv7N]Xd.%(\Qf  ژ M L{݀ فo҆_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv75꤀2Xd.%(\Qf  ژ M K{݀ فo҆_5-) lԃ" 7̅ ' .׀oq K'A9?I"77vd.mJ,(X7܁ژf>ژ M M{݁ԁ'p/lJ /M= )iǺ o *Lx Q s 0Jk 7 ˁ1OЀju7Nl1E ̏,wI2_ Nw  HiG-  xw} q~NYۀMP`P7 N i ~}>|'c jnj%(\ ژBN{ށY l4G B aF)I&0{w [g  +P A߀ ~ ? 4" Z? <7v<g c2X7%('ژf>ژ M M{݀ فo҆_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv7N]Xd.%(\Qf  ژ M M{݀ فo҆_5-) lԃ" 7̅ '   EV , )C  eKhr,Gł   \l<yv7NmXd.%(\Qf  ژ M K{݀ فo҆_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv7N]Xd.%(\Qf  ژ M L{݀ فo҆_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv75꤀2Xd.%(\Qf  ژ M K{݀ فo҆_5-) lԃ" 7̅ ' ^kg Tہ <P {N.͙J !b S Հ;/76$IIIEƑ>^k }fY9QsJ^2ꡀ2Vh6<IIICQ ʀ̏,wI2_ Nw  HiG-  xw֪  #bp { 6-B9 CPz<4 ؏67 ?ɀ Ā2j%('N{ށgt 5NI H́[LҁK? ,: t; ]߁ xu=YRyv7mX7%(|C==ژژ 2I M{݁ԁ'p/lJ /M= )iǺ o *Lx Q s 2Jk 7 ˁ1OЀju7Nl1E ̏,wI2_ Nw  HiG-  xwήNMLʀ~ } PM<  vZ $̩8mm C? ,> o[ȆBD| ] 7 c2j  ژBN{ށY&R~>  EV , )C  eKhr,G0p)R \l<yv7 c2Xd.%(\Qf  ژ M M{݀ فo҆_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv7N]Xd.%(\Qf  ژ M M{݀ فo҆_5-) lԃ" 7̅ '   EV , )C  eKhr,GH@b > \l%yv7NmXd.%(\Qf  ژ M K{݀ فo҆_5-) lԃ" 7̅ '  o[ȆBD| ] 7 c2j  ژBN{ށY&R~>  EV , )C  eKhr,G0p)R \l<yv7 c2Xd.%(\Qf  ژ M M{݀ فo҆_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv7N]Xd.%(\Qf  ژ M M{݀ فo҆_5-) lԃ" 7̅ '   EV , )C  eKhr,GH@b >ՁU \l%yv7NmXd.%(\Qf  ژ M K{݀ فo҆_5-) lԃ" 7̅ '  o[ȆBD| ] 7 c2j  ژBN{ށY&R~>  EV , )C  eKhr,G0p)R \l<yv7 c2Xd.%(\Qf  ژ M M{݀ فo҆_5-) lԃ" 7̅ '   EV , )C  eKhr,G}e \l<yv7N]Xd.%(\Qf  ژ M M{݀ فo҆_5-) lԃ" 7̅ '   EV , )C  eKhr,GH@b >ՁU \l<yv7NmXd.%(\Qf  ژ M K{݀ فo҆_5-) lԃ" 7̅ ' VBhBMLJ д8n u L= @ ^'c  _3 Ā2j%(\ ڙBĀ̏gM_ N\ H{ـwR< + )@b$  7NFj%(\ ژBĀ̏_ N\ H{ـwRp.u(n 8YBG9J3+ vD vcIOUgW+ 0H >*mOπX7mf.w7!7l , Wwiځ _8 J[ ' }"*dm 2̀a Kil m ~m ~m 2 ˑQZ ~. ~. ; =( HcHĀ # G ~h Hf@?-^$B ;4 >G9JR dqKsܪ(9  vc.H2jMP(؀v:}؁H $Ƙ*s̀ P9X7m7w7qRy.(L7wH  i_8 KS&}5Z*dƀ1 2ʀ^ Kil m 2̀ 2̀ 2 ~bQZ ~' ; 7 HcHĀ # G ~h Hf@?-^$B ;4 >G9J%ko C'DD  [,N4)RǁfED Rx/w?&(K *}X7mYRyw7!7m7uH >_8jP}"*dp( Ki = (pp/ 'bIh2 p =( HcHĀ # G ~h Hf@?-^$B ;4 >NC;Zdxހ0[  7<2LȅI] D  zT [,N4)RǁfED 3M?'c~/H $ƚ* R㑀9X7mn\uw7Ɂ2 C DuH >_8=^K  :`*ƀOiem OKil1Ļ >1e>1 S/e<b+< =!T ;( HcHĀ # G ~R@ A!? Hf@?-^$BE ;4 >4]͇T [,N4)RǁfED 3M?'c~/H $ƚ*mX7mn\uw7Ɂ2 C DuH >_8=^K  }"*ƀOiem^  & €& À& À& À& À& 6>ڀ 6>ڀ À& €& À& €& À& €& À& €& Łι 0e8[  z À& À& *u2j  z 6>ڀ À& À& 8<u8<uÀ&#&#*8[À&À&À&8<u8<u h.& fZ|mfmf<r Hl U/<bQ\ U/<2Vx;ù&# &̀&Łoǀu| 7 HcHĀ # G ~h Hf@?-^$B ;4 >G9JZdxހ0J В 6<2 0B@QB u?X\$xxk' 0H >*}X<gm7w7!7m*2wH <_8 J[ .Y y-~*dm 'ր Kil1 ~m ~m ~0 ~. ~. ; ;( HcHĀ # G ~h Hf@?-^$BE ;4 >NC;Zdxހ0 6LHH> ,=O|V@Q Pf^P(`$xxk' 1H $ƚ* R㑀9X7m7w7 ,8 K C DwH <_8 J[ .Y }"*ƀOicЀ  OKil1 td1 td1 tb. 
tbbb =|( ́ 7 HcHĀ # G ~h8 Hf@@!-%. ;4 >NC;гS T R m] ` c>~(6H >* RQ X7l7wn\uGm7uH  _8 J[ ŀs܀5Z*dȀ1 2̀^ Kil p Hj ~1 ~1 ~. ~2 l ~bQZ ; ;( HcHĀ # G ~h8 Hf@@!-%,E ;4 >MT R m] ` '~(6H >*mX7l7wn\uGm7uH  _8 J[ ŀs܀5Z*dȀ1 2̀^ Kil p Hj ~1 ~1 ~. ~2 l ~bQZ  ̀̏u HcHĀ #Q|0! WBMjLgF-Ax .: ׀ /*π|? 5B yC v2| K =݁KKK@1 րB̃ nÁ 6"k*(ƀZcefs-<}dXm7w7 ް j& U WM&kOb  R ) …,_De5̏u HcHĀ #Q|0! WBMjLgF-Ax .: ׀ /*π|? 5B yC v2| K =݁KKK@1 +B̃ nÁ 6"k*(ƀZcefs-< R,J%dXm7w7 ް j& U WM&kOb  R ) …,_De6̏u HcHĀ #Q|0! WBMjLgF-Ax .: ׀ /*π|? 5B yC v2| K = {0@qԖ I l PuJzIӷ{ Ckʀcefs-<mdXm7w7 ް j& U WM&kOb  J Ǟ <H<N ё䇀ݸ OeOAqBZ Ub219)ݸ OeOӀ G44lݸ OeOӀ G44W`tݸ OeO6ì꽁V߀ꡀ2E܀ݸ OeOӀ G4?O{v>sUzHcHĀ NLKݶxKM qkfp4 -EI 63͙ 2ۀ%vL$Bs Cs"gWg` ǜ "/>s"gWg` ǜ8O{v>sUzHcHĀ NLKݶxKM qkfp4 x*I 63͙ 2ۀ%vL$ g , 4?ā_~9j̴&F7€ ; %) CEsM7 PN5cɁ ǜ/>s"gWg` ǜ "/>s"gWg` ǜv>sUf>sUzHcHĀ NLKݶxKM qkfp4@S 9H+ 0dE @$ C /Ab_odjOb ڙ8<vqsj  $I <ICFs !ӁolX PN5cɁ ǜ/>s"gWg` ǜ "/>s"gWg` ǜ8O{v>sUzHcHĀ NLKݶxKM qkfp4 i-(_ f o, ɱy @Ȇ \W Tj= jK7k_2!2jOb ڙ8<v߀;j 5AFsX5 PN5cɁ ǜ/>s"gWg` ǜ #/>s"gWg` ǜvv>sUzHcHĀ NLKݶxKM qkfp4}hm nf o, ɱy @ :ǁƕ [(` >  C/_'ĀJ%!'ĀJ%j΍zڙ *}j  <Is  5 PN5cɁ ǜ/>s"gWg` ǜ #/>s"gWg` ǜ8O{v>sUf>sUzHcHĀ NLKݶxKM qkfp4̬ 祀6* 9 1Tn_1}l9~t}Ud@ TuÀ -u w2!2j1 U q "*}h -N; <ICEsM5 PN5cɁ ǜ/>s"gWg` ǜ #/>s"gWg` ǜ] M>1 bŀ ce bŀ_HcHĀ NLLwwwwwww & &/Mj I {r|gDрI+Lz 6ni 7t8ȝV0ț ́ + & jQ`$}h a a a a a a a a|. Ӡ !u Ӟ ί ,MAՀ{ڜt LtJB-6_s 0LtJB,ف5 M>1 bŀ ce bŀ_HcHĀ NLLwwwwwww & &/Mj I {#A wGn Ċ !ymÛ Wl v1 O ~RO7Ł7+_2XO ڙ QeW_h K˷O ښ Q˷O ښ Q˷O ښtO ښ Q˷O ژ Q˷O ژ Q˷O ژ Q˷O ژ4" Ӡ  u <  od <LAՀ{ڜt LtJB-6_s 0LtJB,ف슀\M>1 bŀ ce bŀ_HcHĀ NLLwwwwwww € &/Mj I {#` /I 5 Ċ !ymÛ Wl v1 OL 3xA M7+_2Xڙ q "*}h KYښ QYښ QYښ [ښ QYژ QYژ QYژ QYژAμ3 8  ;  <́ 1|μLAՀ{ڜt LtJB-6_s 0LtJB,ف*v$_ %e8K/MjLlX IŤN 8 @ U]l in8Y 8;ImR؁z=&=ҁ.ߛ77NZ1 ́ Ѐ 2 l  9u %7we.11 `  U c3 "2ҁC    'Ā.Y ӟ7 ~v$_ %e8K/MjLlX IŤN 8 @ U]l in8Y 8;ImR؁z=&=ҁ.ߛ775꤀21 ́ Ѐ 2 l  9u %7we.11 `  U c3 "2ҁC    'Ā.Y ӟ7 }v$_ %e8K/MjLlX IŤN 8 @ ULl in8Y 8;ImR؁z=&=ҁ.ā&qZ77Nk1 ́ Ѐ 2 l  9u %7we.Nl1 `  U c3B mSC JC}A 4 ӟ  7 ~x$_ %e8K/MjLlX IŤN 8 @ U]l in8Y 8;ImR؁z=&=ҁ.ߛ77NZ1 ́ Ѐ 2 l  9u %7we.11 `  U c3 "2ҁC    'Ā.Y ӟ7 ~x$_ %e8K/MjLlX IŤN 8 @ U]l in8Y 8;ImR؁z=&=ҁ.ߛ775꤀21 ́ Ѐ 2 l  9u %7we.11 `  U c3 "2ҁC    'Ā.Y ӟ7T ́v$_ %e8K/MjLlX IŤN 8 @ U]Ȭ `߀ n? ԀQ@<@0z(r/ tPkp@OOu 쀀5㑀91 ́ۀ(>7Nl1 9ul v7Nl1 `  U c3}@hD Dԁ JC}@    7 ~v$_ %e8K/MjLlX IŤN 8 @ U]Ȭ `߀ n? ԀQ@<@0z(r/ tPkp@OOu Nk1 ́ۀ(>7Nl1 9ul v7Nl1 `  U c3}@hD Dԁ JC}@    7`X z  Qր j!R;WXv9W)%?v~. z  w  ju сAHҀ֔R)%?X z  Qր ju р܀7U k)%2X z  Qր ju р܀7$vtP k) A(X z  Qր ju сR5 k)%?X z  Qր ju р܀7U k)%2X z  Qր ju р܀7$vtP k)7WF>X z  Qր j(w* ;I ̡ H)%?X z  Qր j(w* ;IH H)7Ku<a#RYt2/{\"CY <Ku=a{ ݁'X̺' <Ku<a{ ݁ . },' <Ku<a{ ݁ . },' <Ku<a{ ݁<>];~un G' <Ku<a{ ݁ . },' <Ku<a{ ݁ . },' <EKu<ayZYkTa x <Ku<ayZYkTa x <E HKL8%%Cv~{Fx nJX{n+ 5@s2DB a R N RQ=42O wu ;58U[#Ā I Sʀ 8 Z *u=qvFxǁB3~=03BvaoD  Kq%N R N RG!n&Sɀ\;À!n& Y % > 42O7wuOC ;58R+#k  ՀG8:0cr2$- EuFzsFK2$j!"BvaoD  Oـ̝:= j q qG!p Ч P@5kp!So!pV PN oπ M 423=|ȁn;5Qx ) ʁFp GHcHĀ LJÀ ВN0ܗ l |po9z8*yOl nBu u @7Nbc OL"cj N| j Fp GHcHĀ LJÀ ВN0ֺ Hځ ,wlMāo r+\PNZQM( tq7Nb1Āل=0o k  LZ GHcHĀ LJÀ ВN0 Hځ ,wlMāo rB%M7Nb1Ā~Ԁn  0oN^ Fp GHcHĀ LJÀ ВN0 Hځ ,wlMāo rB%M7Nb1Ā<  0oۀ5q Fn GHcHĀ LJÀ ВN0ֺ Hځ ,wlMāo rڀZ 6JA47Nb1Ā~n  0oNn  KZ GHcHĀ LJÀ ВN0 Hځ ,wlMāo rB%M7Nb1Ā~Ԁn  0oN^ Fn GHcHĀ LJÀ ВN0 Hځ ,wlMāo rB%M7Nb1Ā<  0oۀ5qQ ʁ KZ GHcHĀ LJÀ ВN0@5sD u3OvldX RV' 257Nb1ŀ<  0oۀ5!  Fn GHcHĀ LJÀ ВN0@5sD u3OvldX RV' 257Nb1ŀ~n  0oNnQ ́FpM>c G癀 LJZ В΁Jۀ Eqj,fb*ӽF v7(Fp-7NV13& pJ%1À 7$$ M{OE]@ ZMym l| [ } d(R d, <J7Nn1 6 Ā2 2 Up\T FpM>c G癀 LJZ В O 7pN'A6{^g jJ rr݉ NV$ ? 
> 5 U =O M{O,3  48 e(߀  IZ ~ _l7Nn1 4 Ā2 51bu  Lc G癀 LJZ В O 7pN'A6{^g jJ48$4 NV$ ? >ڀn2 U =O M{O,3  48 e(߀ c G癀 LJZ В O 7pN'A6{^g jJ48$4 NV$ ? >ʀꡀ2 U =O M{O,3  48 e(߀ c G癀 LJZ В O. 7pN'A6{^g jJ4 [E 6 NV$ ? >n2 U =O K{O,3  48 e(߀ d' _l7Nn1 4 Ā2n21bu  Kc G癀 LJZ В O 7pN'A6{^g jJ48$4 NV$ ? >ڀn2 U =O L{O,3  48 e(߀ c G癀 LJZ В O 7pN'A6{^g jJ48$4 NV$ ? >ʀꡀ2 U =O K{O,3  48 e(߀ _xOV xNV1 6 Ā2tP1C L{OuO . 3 m  㨁+ M_ `=@. 7Nn1 4 Ā2o1E FnM>c G癀 LJZ В߀ .r |Hia6E (} 6>_xOV xNV1 6 Ā2΀1C K{OuO . 3 m  㨁+ M_ `=@. 7Nn1 4 Ā21EQڤM>I2_ LJ HGŠ Вg]w0Ȧ Deh %O ݀? ];gsț =z Vv k: M7%%%% n~ MYKY  y `_ <$؀j0U }7rĀFM>I2_ LJ HGŠ Вg]8 k%Z` ˤ ̼I8qlR+> 5G 3YCR 7V \X7%%%% n~ MYKYnS6LQ7RCLrĀ NI2_ LJ HGŠ Вg] ` ˤ ̼I8qlR+> 5G 3CPC€=7V \X7%%%% n~ MYKinSnQ7NlŮ s5CLrĀFM>I2_ LJ HGŠ Вg] ` ˤ ̼I8qlR+> 5G 3CPC€=7V N kX7%%%% n~ MYKinSnQ75 ؁CLrFnM>I2_ LJ HGŠ Вg]8 k%Z` ˤ ̼I8qlR+> 5G 3_ 7V \X7%%%% n~ KYKYnSU)Q7N^Ů s5CLrĀ KI2_ LJ HGŠ Вg] ` ˤ ̼I8qlR+> 5G 3CPC€=7V \X7%%%% n~ LYKinSnQ7NlŮ s5CLrĀFM>I2_ LJ HGŠ Вg] ` ˤ ̼I8qlR+> 5G 3CPC€=7V N kX7%%%% n~ KYKinSnQ75 ؁CLrR ́ KI2_ LJ HGŠ Вg]yҀ ۽ HG  N"!t tJ %BL7UOπʀ㏀9X7%%%% n~ LYK AQv !.5] }7rĀFM>I2_ LJ HGŠ Вg]yҀ ۽ HG  N"!t tJ %BL7UOπg9X7%%%% n~ KYK AQv !.N^c5 }7rRup͆0 &  PA = w!eK{}V Qj;b 1ŀ   Ӟ7p͆0 &   ց v6l΁ |A Ȕ 5>`~aqj< 岁1GY   50 & On[. v6l΁ |A Ȕl 9} A Ẁ1GY  odu Ӡ5)͆0 & =S# uQa*l΁ |A Ȕl 9} A Ẁ1GY  odu Ӡ5p͆0 &   ց v6l΁ |A Ȕl  C =ʁ ep1GY  odu Ӡ50 &   v6l΁ |A Ȕl 9} A Ẁ1GY  odu Ӡ5)͆0 &   v6l΁ |A Ȕl 9} A Ẁ1GY  odu Ӡ5Tu0 & Nۋ i p /k o=* v oK j87NIK{€Ʊ\ A (A([O7 ȔBI  n K@ # hfg "÷%K{€y(W5\ A (A([O7K2ā  s089:,9HI]đ #  'À.Y Ӡ7 "%NIK{€y(W5\ A (A([O7K2ā  s089:,9HI]đ #  'À.Y Ӡ7&NIK{€Ʊ5\ A (A([O7K2ā n ɷ 9 6YKuAe #  'À.Y Ӡ7 "ö%K{€y(W5\ A (A([O7K2ā  s089:,9HI]đ #  'À.Y Ӡ7 "'NIK{€y(W5\ A (A([O7K2ā  s089:,9HI]đ #  'À.Y Ӡ7E%K{?9oE& !/ mB%b g O#I *~{ŁA Dh .7NIK{?9oE& !/ mB%b g O#I *~{ŁA Dh .7ENIK{rzefƀ[ Akb@KX m @+ DS y&y uÁ4" Ӟ ~ 7NIK{rz䓀x́D x A (A([O79k[<Q = 7vxwy"| Ӡ ݀7%K{rz䳀x́f x A (A([O7K RBI Ђ! "| Ӟh .7NIK{rz䳀x́f x A (A([O7K RBI Ђ! "| Ӟh .7NIK{rz䓀x́f x A (A([O7K ># J+ Q =N"| Ӟh .7%K{rz䳀x́f x A (A([O7K RBI Ђ! "| Ӟh .7NIK{rz䳀x́f x A (A([O7K RBI Ђ! "| Ӟh .7E%K{rz;RDq !/ m7HB> h.Y Ӟ7NIK{rzDq !/ m7HB> h.Y Ӟ7E׀ϹM€ &{r{BwtC {(| l ZFFiHǩy  4 ǩ׀ϹM€ &{r{]/k? ts ց `ˤ o; 3#u #  3 Ӡ}#u Dƀ M€ &{r{]Nk? tmLi uw" m ~yBeyvx> "ڀ #  ~ ~>̀,ϹM€ &{r{]Nk? tmLi uw" m ~yBeyvx> "ڀ #  ~ ~>́ـϹM€ &{r{]/k? tmLi u*9< "ڀ #  ~ ~>̀,ϹM€ &{r{]Nk? tmLi uw" m ~yBeyvx> "ڀ #  ~ ~>́S M€ &{r{99À=Ɂ onX )mj 5+|fi"| 9& ݀׀ qـϹM€ &{r{99À=Ɂ onX )mj 5+|fi"| 9& ݀׀ qS ځ; OY_  0N€ ߕMjM ąP6 S  >J WHUL k'NFn2jq ݦ ~ Kt$ LjWuWmMMo5J9V>b| ݀?րl WN]jaOb<y{ ښ%kyS Kt .UyB `,-:  Pڜ趁{Nl1{ .рJ q;^dPoPPo_  0Nŀ ߕMjM ąP6/X ā \ {I. YSұ{'πNFV Aց E?mՀ?m [V$G=_ q* (  s ܫ2Lg.J(Ba (,ژ N]jaOd<8<u ښ%kym ym [V .UyB `,-:  Pڜ趁{Nl1{ .рJ q; OY_  0N€ ߕMjM ąP6 ` ]z ु K) K;   )n o075! >j1G ~ Kt$! ZfDŽ7ũs )) w v_ژYv< '5꤀2.U<e.Ya%? ̀M Kt .UyB `,-:  Pڜ趁{5{ .рJ q;^dPoPPo_  0Nŀ ߕMjM ąP6ӣHg]R  l܀рQ#H?΀*5ʀꡀ2#oFObEsZsB_, [V$\Gjj? u ܮD sa'ҲM m  F,2Bژ75 jazD<2Cښ -em%k `  `  [V .UyB `,-:  Pڜ趁{5{ .рJ q; OY_  0N€ ߕMjM ąP6ӳ//9 I U( Y Z o n2j!.\ ~ Kt$Y/ tƁ t؁/ m, sa َ ,ژYe Nn<&ژ% .T Kt .UyB `,-:  Pڜ趁{N^1{ .рJ q;^dPoPPo_  0Nŀ ߕMjM ąP6&0됀 Z + xYV2 HX Vz;>E?mՀ?m [V$ ؁p/Y>bj{ >|a cځ Mژ Ml TNnjaOd<8uQ%kym ym  [V .UyB `,-:  Pڜ趁{N^1{ .рJ q; OY_  0N€ ߕMjM ąP6 S  >J WHUL k'NFn2jq ݦ ~ Kt$ LjWuWmMMo5J9V>b| ݀?րl WN]jaOb<y{ ښ%kyS Kt .UyB `,-:  Pڜ趁{Nl1{ .рJ q;^dPoPPo_  0Nŀ ߕMjM ąP6/X ā \ {I. YSұ{'πNFV Aց E?mՀ?m [V$G=_ q* (  s ܫ2Lg.J(Ba (,ژ N]jaOd<8<u ښ%kym ym [V .UyB `,-:  Pڜ趁{Nl1{ .рJ q; OY_  0N€ ߕMjM ąP6 ` ]z ु K) K;   )n o075! >j1G ~ Kt$! ZfDŽ7ũs )) w v_ژYv< '5꤀2.U<e.Ya%? 
TfR(Oww[F4(2m>T*;?\O7'FD44.W;?lXwERgiR:5LBJo'53l?c0Gl443g=AEv;?Ej(;?D4459l5.K;??=n:#w6$M(LD-TKeLk4643);5DQ?'DDQlL0K;%Q*+x(aT2x>C>A>=$Uwh?}{(]J]t:-u(O^m1S9F;Q*=?\O5'D44.11W;?2U9.(x!AARAT((pl>t0m<XJ^nf1nF66743_=AE;?Ej(;?DTj458464.@1@1K;??=n:#w6$M(LD-TKeLk4643);5DQ?'DDQlL0J;%Q*+w(dT2x>C>A>=$cwh?jwc0cK(AU;*;?\'D44.W;?fABRAAD03vBDD(9GS1AJoSFm443;?E;?D(=?D7458675.O54JK;??=n:#w6$M(LD-TKek4643);5D'DDQlL0K;%Q*+x(aT2x>C>A>=$Uwh?B"AweJ_9;D(LERR#RBg+;?\3DB.6j16j1W;??9wcB)33Oxh(U]#Due9D^AAyF74443;?Ev;?E(lD5558554.O441O441K;??=n:#w6$M(LD-TKek4643);5D'DDQlL0J;%Q*+w(dT2x>C>A>=$cwh?)E3"#3u)DWJL9;>Mc01T*;?\'D44.OkJW;?`J.(x!AARAT((pl>t0m<;1Kg>o*F66743_=AE;?D(;?D4458464.K;??=n:#w6$M(LD-TKeLk4643);5D'DDQlL0K;%Q*+x(aT2x>C>A>=$Uwh?eF4RV9LJ]>1Rem?#c-p0L*=@\'D45.6j16j1W;?tD[UDB0(;3B0(>u)`Z<0LTB#(oc*Fm%F3;?E;?E(kD4458454.O541O541K;??=n:#w6$M(LD-TKeLk4643);5D'DDQlL0J;%Q*+w(dT2x>C>A>=$cwh? TfR(Oww[F4(2m>T*;?\O7'FD44.W;?lXwERgiR:5LBJo'53l?c0Gl443g=AEv;?Ej(;?D4459l5.K;??=n:#w6$M(LD-TKeLk4643);5DQ?'DDQlL0K;%Q*+x(aT2x>C>A>=$Uwh?}{(]J]t:-u(O^m1S9F;Q*=?\O5'D44.11W;?2U9.(x!AARAT((pl>t0m<XJ^nf1nF66743_=AE;?Ej(;?DTj458464.@1@1K;??=n:#w6$M(LD-TKeLk4643);5DQ?'DDQlL0J;%Q*+w(dT2x>C>A>=$cwh?jwc0cK(AU;*;?\'D44.W;?fABRAAD03vBDD(9GS1AJoSFm443;?E;?D(=?D7458675.O54JK;??=n:#w6$M(LD-TKek4643);5D'DDQlL0K;%Q*+x(aT2x>C>A>=$Uwh?B"AweJ_9;D(LERR#RBg+;?\3DB.6j16j1W;??9wcB)33Oxh(U]#Due9D^AAyF74443;?Ev;?E(lD5558554.O441O441K;??=n:#w6$M(LD-TKek4643);5D'DDQlL0*B*T*T(dt3*N7xF'\s_*TAm=H(^f_lDvp*TY0@N%NCa*B*T*T(at3*N<]17E'OCOCa*TAm==f_lDvZ3vZ1p*TY0@N%NC6*B*T*T(dt3*NgV`*TAAG.N/-(_lDP7Y5;M%NCM*B*T*T(at3*NPK^&)ICIC`*TA{1W.]5BlDPo3Po1Y5;M%NC0T(d/3TPtNQs47cwh@1Pt;s0GhDv*cwhEN7.-T(a/3T7N]sJ]sC7Uwhlp}pPt';a0GhDvZ3vZ1*UwhVN7.-*B*T*T(dt3*N7xF'\s_*TAm=Hma_lDvp*TY0@N%NC6*B*T*T(at3*N<]17E'OCOCa*TAm==f_lDvZ3vZ1p*TY0@N%NCa*B*T*T(dt3*NgV_*TAAG.N/-clDP7Y5;M%NC0*B*T*T(at3*NPK^&)ICIC`*TA{1W.]5BO0,lDPo3Po1Y5;M%NCMT(d/3TPtNQs4(cwh@1P(L;a0GhDv3cwhEN7.-T(a/3T7N]sJ]sC(Uwhlp}pPt/E;a0GhDvZ3vZ13UwhVN7.-(BSP7EI>?3 A_2 hSAG&>KjT9^fBt]4{59;?D4lvFJuDJKj9[lJj(?SP7EI>?3 Ar21"uPhSAG'Qs3KjT9^fBT]4{59;?D4luD@z1KjH&H9kn9(BSJ7EI>?3 A_TB`FhSAGM&>KjT:6Dz4{59;?D4lMkJDJKj>9Nj(?SJ7EI>?3 Ar;NcLhSAGMQs3KjT:6"4{59;?D4lMD@z1KjH~9kn9lQ*+V(BTS9\I9G=>7=BFb(j?Kgt)-u)DWJL(u144l*;?\o'D44.JO54J9JK ?2g"(aR:>CDTkrM4c(ORDRG5576355l=AE+7?EO?(;?D;64454.LO45LL56JK ?2DA1Bd4v0H44553);?Dv'D(EDLx-lQ*+V(?TS9\I9G=>7=BFb(j?Uod[LR9*wREPAe"DGAc44l+*;?\)(D;444.O6419LK ?pRSAe3Ahd(>A9>l;(-R0CGl54345l=AE7?E(kD45;474.1kK ?Bn:.9iACRRgvd1SSk4k3);5D)]9'D7?DL-lQ*+V(BT9VI9G=>7=BFb(j?Kgt)-u)DWJL(uv01V44l*;?\o'D44.JO54J9JK ?2g"(aR:>CDTkrM4cR9yG5576355l=AE+7?EO?(;?D;64454.LO45LL56JK ?2DA1Bd:H44553);?Dv'D(EDLMkK-lQ*+V(?T9VI9G=>7=BFb(j?Uod[LR9*wREPAe:n44l+*;?\)(D;444.JO6419LK ?pRSAe3Ahd(>A9>l;(-RBBfR(Gl54345l=AE7?E(kD45;474.1kK ?Bn:.9iACRRgvd10;k4k3);5D)]9'D7?DL!-TL?i%Q*+w'[8DIC3K|CWK-e(?R;5\h'D|$1KL-3N00;?DT[7h6UV;?DQ?'4$4.mDQlKKLcP6A6jo7rD mT$TL?i%Q*+w'[8DIC3]AtWRCc!0m\t'$KLI5[VR"!(W1(<?D.@Kj4$;444.OmDQKLcP6A6jo7rD mT$TL?i%Q*+w'[8DIC3]AtW:`ST$cAm\t'$1KLI5lVR"!(p4Ff<?D.@Kj4$;444.OmDjKLcP6A6jo7rD mT$WN?N0%Q*+w'[8DIC3 Al]AtWA/ST$cAm\t'$1KVA5lVR"!(p##5f<?D.@Kj4$;444.OmDjKVx!OP6A6I7rD mT$WL?L0%Q*+w'[8DIC3 Al)EL1=UjCA\t'$1KTA5l0I_0AAD2dR,AoBvLD@Kj4$;l5.77D"KT5BF85mD mT3$:j|%Q*+w'[8DIC3Nl]AtW:`ST$cAm\t'$1KKYyE>VR"!(p4Ff<?D.@Kj4$;444.OmDjKOx!OP6A)jo5rD mT$?7T7E%Q*+w'[8DIC3Nl]AtW:`ST$cAm\t'$1K7YyE>VR"!(p#Ff<?D.@Kj4$;444.OmDjK7x!OP6A6I5rD mT$#c%Q*+w'[9I6D3KuUcduDe)51<5\O9)$K,K%O^tRARRk<u1SDO9'a|$;545.475DKK 43lDUIT$7N7i7Z%Q*+w'[9I6D37uUcduDe)51<5\O9)$K,K75iO^tRARRk<u1SDO9'a|$;545.475DKK743lDUIT$7J7~7o%Q*+/[w/C;D(7V?d0 oS]AtO5R 
DMm\O9'4$#O8p,a7p,X7MB&UDRhei n;?Do'5$j54.74Dl4K_544,Z,K7-}=;'IlD mW$8J8~8o%Q*+(ww/C;D(8V?MRAJ@Rv1;5\t)4$#Oi6,.6k6,X809Ll]3:^gth";?Dj'pu$;k4.+75Dj6K,6,'6j6,K8-}=;'IlD mW$i7i~%Q*+x'[2x>C>)L?6d3|>R5[:Ll;gB2BF\o'D44.KLO9RlF2J;dA3wRDN0C0[sE+;?E;:E@F)D46.KLcF9:9Do0AK=1 &K3Dt)D6.-i7i~%Q*+x'[2x>C>)L??Iu)D39F4AhR*;?\O7'D44.KL>ccUH[;d~sAuuRERE*;5E75EO6)D47.KLcF9:9Do0AK=1 &K3Dt)D6.-i7i~%Q*+x'[2x>C>)L??Iu)D39F&uCwmL10*;?\O7'D44.KL>ccUH[;d~sAuu1$SS*;5E75EO6)D47.KLcF9:9Do0AK=1 &K3Dt)D6.-N9Q*+x'[2x>C>)VA?Iu)D39F&uC1mL10*;?\O7'D44.KK>ccUH[;d~sAuu1TUS*;5E75EO6)D47.KLcF9:9Do0AK=1 &K3Dt)D6.-L9Q*+x'[2x>C>)TA?xR:QJL9;>MA0#11J#c*,E\O7'D44.KK>AN?vAdLB0D3uA\n1R0L*;5ErEO6)D46.KLcW50TJD(x3%F3);?Dt)D46.-#c%Q*+x'[2x>C>)K??Iu)D39F&uC1mL10*;?\O7'D44.K~z>ccUH[;d~sAuu1CS*;5E75EO6)D47.K~zcF9:9Do0AK=1 &K5Dt)D6.-7N%Q*+x'[2x>C>)7??Iu)D39F&uC1mL10*;?\O7'D44.K~K>ccUH[;d~sAuu1CS*;5E75EO6)D47.K~LcF9:9Do0AK=1 &K5Dt)D6.-%Q*+x/>=C7$%'%)]K,Rm^11cA"(BL*;?\O6(D44.K?p0|<C10"9^A3dR0R4Uc1;RRl+ErE@K(DTk4.K?cC1qA4KA1-o 44I*;?Dt)D46.-7J7~7o%Q*+x/>=C7$75K'%)]K,Rm^11cA"(BL*;?\O6(D44.K7?p0|<C10"9^A3dR0R4Uc1;RRl+ErE@K(DTk4.K7?cC1qA4KA1-o 44I*;?Dt)D46.-7]%Q*+x/[/B>6D)7? JDhDecc0cgSAC$91St*;?\j'D44.3,/,K 55Eu>d0)n0dS(U-9g`G0DdhL*;?E75D(D46..5,a545,K c(50TJD(x3BR &);?Dt)D56.-8]%Q*+x(w/B>6D)8?l9]cxu(33p).0v2JO4A5eR*\O7'D44.f444,b444,K X9(]3#ARRuc"wv1]nh1]*>?Ev75Do(D4.a755,a745,K c(50TJD(x3BR &);?Dt)D56.-*i*7i*7i/[t3*iph&JmP,P)+`*7T -07_)t0ThDO6n),n)-l*7TEmc+&GN*i*8i*8i(wt3*in>'-m9,9,`*8T@-@G4=0GrDO,vn,l*8TEmc+&GOF+1+*v*7)Li*7-,v)] (TV9@D@A@T>9@B;;BBB>?XA;;B<EBE;;>=&PVQVS)+~].8I41"pC_/+N*TFN9H`(cfSn/M+v-!+N6,5,l*7Hn?G6{Mj_)LXG]3c"Jn?}{vM5OM6l*U^O@I'M%|Fv6{l-i?c9RFAB0DN70"5"LF:5,5,5,N465,N6k6O4R5,N54J)}P@IO7=K(,T~7[J)]ggR(A1T,>),66,kN6M7M5Ol)!}GM+MuN4M5Oq*3*a)`*a)B*F#*o.>5{Q&1[%k*c/(#2L)hPIAEMH55K#BAhPIP65dTMH\P*kkj!G=e-&_D(wMHQ&FdhKj583?M p;5]mkPMHJlK RO87-oJ;w"~"~w"~"~w"~"~w"~"~w"~"o%Q(7*+wvvvvvvvxVS9Q6cwh JId4L*9<h2*2"uL4K5BAL;?NQ5L@'5.87 87 87 87 85 89 85 8985 87878785 8789 85L6M7&L4JK `o)8[3&"\M;*C[3&"\MKP8ln3& `o+8[3&"\M;*J[3&"\MKP8ln3& h=R\3&"\M;=3&"\MKN83& h=R\3&"\M;=3&"\MKN83& `o)83&"\M;Jln3&"\MK8ko3u 7TM7TM7TM7TM7T?7T0%(7*+U)^B>67 Wei Oe0RA(>dA1EC40G$2`.L5'45.KY74JM6N64JKj.1v)Rl6G&7T7<64G&7T78%E4G&j.1v)Rl6G&7T7<64G&7T78%E4G&j;5(-k6G&7TJR4G&7T5N8k9G&j;J)J76G&7T]D4G&7T5N8 4G&jv.8 4G&7T7K64G&7T78%E4Gu7]7TM7TM7TM7TM7T?7T0%(7*+U)^B>677[v-!]]HwR1352d5AZj'55.LjLJHMN54JKjhJR4Q7TJS4Q7TJE4QjhJR4Q7TJS4Q7TJE4QjS5 4Q7T5;'< 6Q7T5;'5 6QjS5 4Q7TJ: 4Q7TJ: 4Qjh5aD5Q7T%8E4Q7T%8E4Q*$99"T8H1/bpTP8H1/bM*F#*?4,,$,7)3,x4_,pb,,,(!,+O,+*-*"9$*aJ.*"X1@B*_*"Yn2_R"IbJ 1V**#* Al%j[8Ux2j[15Pb2jE`h2jx%h2jXRRANhF(?lZ)7clw"T?"T?w"T?"q%0(7*+wwwwwwwwTS2Tj? @0(/0AA9G(?9`<ANj5LL'D44JQlJ6j,9gMAdR0AfRU)WfjJUJA{JUN:)K "bHNT,tdlv0BBBBB5BBBBDDFD5B+4"qMhah+3"qMip;5+3 Aek(3"qMhah)"qMip;5(.F.Os>BT(.-@/:gCCC**$*+!*nF`* k;,riww"~"~w"~Q*+wwwwwwwwwwwwwwwwwTZ>>?== ASwT0Vo<0Vn-U)M"K;]F=AS\Q5'LJ'lDDDkB @K. hDkD9hDW @be2*800R0D;j47;j47;j47;j47;k65;k65;j45;j45;j49;j47;j67;j47;j;5C(3"\MN9(3"\ML7(. 
X^"b6`G*^V"(^lD^n;^n7^E87^E87^E85^E85^E85^F:5^F:9^E87^E87^E87^E:79^E85pn.K"\M'kDjn7K"\M'kDjn-)$F$*T%[ w,-*L?aa*L?*L?i*ww*+*M?aa*M?*M?l(BL*+'f;46=$**?PNp*R@_0T'=/<'Ns7krIQa=}cNN5NF4JB;`9'T494,ELl*T"O\a;K"6HAR%GJ3`c0KB6GJ3*T"nCNG _m*T"-bri&qCP%CDhDN)+8[S%GcK*UaP@`@ 8(_ N@!; @BK@`9O:*K4_*U"\1U-lkP1-3T1/M3*U"k@fl*U"/@;_l&f@_&pUhD.kD]6n,9!$*+7"^)+Vq2C_0r=-Crn77ab*c)a(7('fQA=C<9>m)6iEMlCa-`KS1,g3S9%)C6'QU(YC6!}E~c(L,N.#0xS*y;_wDiDD-K<@DD*5jJKD!1T0g"\ekDlDDO64J;?D=ADt8J.r%%Q*+~S960AI_\ ??r4O3wu#R00D"Bgt"9@(eGZeA;?G+S?(JD;?D_lD;?V=5DJoj54IIKP56jJLL56JL66KKj@^kQ3MPa >[)J`^jY.*".]N*[.+!*Y_pQG++G].OBW*k999H7;9H77"a-]H:Ov=:qopa-UB[0R)CS=AlTUJR=CGlT-]H:OEe=(xvj\-UB[0R)SS1=*jTUJRk=(GNriw"T?"T?w"T?"q%(7(7QU*+wwwwwwwwwwwwwwwwwTSMP:6==66D ^UVP8P/kN3R.)<.6jw309<Vd0kD.6Q6'u)RLjJLjJl64mK @"bbWf0(h03Gn.3Gn75Gn73Gn73Gn73Gn93Gn93Gn53Gn56Gn54Gn;4Gn74Gn;4Gn54G+3"qMixAjn.j+3"qMioZ.+) E\fbUDFSAu0A%ln.mn7 mn; mn7 mn; mn5 mn5 mn5 mn5 mn9jn;jn7jn7jn5zi8+K"qM]G.ji8+K"qM]W.7+.*$%(7(7(7(7(7(7(7(7(7*+v*UI6I"W*P46;5(OHQBQQOOOBOO+PPPP'eF%+.+++++++vLV$VD$D fd4 vVMkkkk#kkkkkkkk#k$*N=A*N=C,2!,z,2!,z*F!0*!x*!x**!h*!h+*3u*3uF?rPqYF?r4qY+*3u*3uF?rPqYF?r4qY+)*3u*3u*!x,*!xF?rPqY"FAr4qY+*7u*3u*3u*3u*3u*3u*3u*3u*3u*3u*3u*7u*7u*3uF?rPqYFBr4q[+*3u*3uF?rPqYF?r4qY+*3u*3u*3u*3u*3u*3u*3u*7uF!1RF!P2'*!x,*!x*3u*3u,H!,d1,H!,K4*[!itLF!R,F!B,H!,,H!,T [!@E[wF!7F!-7*FN6\?FN=n',H!,?,H!,C3[#@sS FN=DFN=(***7*7*7++aaa,,,-a,w,w(7(7(L;*M4Y*M4Y*nOI*nOIq/M&q/M&&&&,Y,I&,V@:GY,O^QEIjNEF!**7777++7++77++7777777++7++777777+++0++H7a?7a#A7u?;77777;a7a#?77777HH=:HxHhIuHPqRI4qRWuHPqRWGqRI"HHPqRWGqRL"H"H"H"H"H"H"HPqRWGqTI"HPqRWGqRI"H"H"H"HH"HHE]7RH7:H1O&P/$9a??gS?ST;\=?$,b-O\1@OPduL1& 8/@ 1\E&l4&&+/i)))v*(]9B$/0T<+,D6n_j\))i /u(>*0o)y~gv[>3D9_*T`J;bP'9*0AE$)"i(+(-(.] %-n56*,^ko3*`n)";4O]5o,4739ME, , ,+ U U#+++ K + 67?" u$u ")(((M/v.+-V(vv(T:E>??=FA9=??A?A?A8?BP;F9?K??.?A@>? 4V DA DA DA_A@r>O6EAGIO,63W 3lV@@Omj9E2m{]G NEA {]G NEA @aH%;7mn3n3"<6^L26VOE?9J mXA9GN%'CWTCVDkuQH(M5j%Dh8Ml3%YmTC>^1SMYL*M%Y3%Dn8!mBL9=@MQYTCxNMl3%:Di8!Y TC\202p[5[W.4%Di8/Y TCeO^}7%5QW UCX6+7-)7%4RWTCH~uKV(7%lo*XBXH+eq{-OMk4%QmRQ/#`4Am7?r-Z@3zR&2)ot 0056 PPuQsN54Gq0GrG,)74HqALGO,)kj6HNN,)mHO5(O5*O5*,)44HO5)O,)mj4H(l65%44 n 3=@_SCCfB2*q4G>W44\E4G,5 4Hu K4(L9(,6j4He4GW"@?m]#T/RM6i%QW \9;"R22D2$*,2h2D29LA4~nAweRv0xB)wL5([N4D[.M4DZZLMJD((k45%Ri4 fL4D[M4DZLL](k43%56 L[(l43%v LL5)L[ZZZLLL;(l43%v L[eL4(LQ[[p[Q[\p5[\LLLL[L[(k53%v (7%66 n@>$DV?C7 AM4,M.[L*Pj4,Zk3%Ri4 O44G54pG(k44%44!L6O65,MlJJi86436j663qi @{Wen03(;Y101S#R67%<Qj8N,M66G55344G4j44GMinXm$i e@'X-i 7ESGEE nE55Em F8gu\<20D2nK."BT0;_[L),>4G&E3$44G44G&}jG,6GW"G?L)]Bt:R)C=CS#vAB:MR^iq66,P666,J>GN76G(k53%44 L6556X557:G457:Gk7$7j663O6566pI>7kG7j463O6567pI6,i54+n?>NkvBRAk__5j6R MNM(k43%Q LQ(k55%54 N[N64KL(k43%Qj4 M[(k43%vE4 QL(k43%44 (6,((m45%Q (k43%Ri6 6,Q5(N5(k43%v N(k43%Q [6N8O554,L66Jl4LNNN5j5ZM5j5Z755'}7G[k63%56/N6LI(k44%]@ONNk43%5%@N6O6O564,JQ54LNk43%6%@755'}5GNNNk63%_@NMNNP54kG[774,n64+M774,:54443754m3N555344IP744,L744,En4+:54443754443O5474[,)m5463Qj44EL6M6O544,N&ELOlJ,)k%E63,)k76I3N7j4E,)m4453Nlj4R,)l4453ni|.4bv3~P73~,3)S8ic\cSvhRARAud'aLJP(k44%56!(k55%44 Pj65X6,QL8\R JL(l43%Qj6!LU4qL`6,N54J8j6.qLOLLLsLL6,OOL4+OL%G(6,656436,6,'6,5+6,JJeu Z$))*"/w(T=I\ 7.ND@@!89G);)D5n9EL6i%5,0D5spbl)iSNC,636q)iSNCX63S*i*a*c()"?((*Tq>-7/3+-D5n\C|*X0@a4+0 J:aS(KL)"GS29B6? 
P%^V(/a8]49P{$W-)7_v(F(-x)?B=()"h5,>>=C+*D5n[5o)+1MC+0ZD5na_ Lz553$E**"i(+(.]G0@OLS**";9"i" WM3=%E?55m?)"M$-Vs)"Gcmn)+PoYXl$,i_;,(!3*_, B-6vA20A"(2(f.R#6P19NAA"R^KC 1n%1Mo?ElyoL<sM*Yk}f4ig@-aQaDngYs)lDb8U1$\tR9M:e D<86ny?HXT(5#_00(NPnI6~!HD=*WPd>DBQQ6g^bPl6321CF1C2VLIWCB#B|^zwQD]\7465>- L55=?}=N K^pA0I7%8ot7@(9HI:^;Pf ;:'&1Hf9.l'5y ()8/s)UZCrb-^cuHQ2A;@ `/$ .)\-h#4 yE 7[(n"yBZh91AY&SY$5YM7o%}:m:=zz:pn؉Y{{^s{yޯ7׾k[{^7mpnݔ^+klU}{;=ݹ۾|4[w{kWkڥ]v:sܪkݚ0ލݻvJ>=sUםutiwwvڵ5kC5:{ŽKC mm_6z>Ͷiovqٽ(ٽsιek965vvκv]: -wުJ;slHQ( 6"uUUUm[wkafnGmqqEd4ǥT\/nã_;`6*1ei28(D J:cueB,0m&KU[;t l H ga*5`HbP4f ž*ЩaO P`{3 @+l 1Jې[AU:0֛c'ϻ}YΠ%T&`@4! I{d2"@ $"$ͬ=Ba+|%wݯ5h#V\*aT6) q[vw`N]9nk):4j4R]5K.uݶkƙ- 5Ԥh kt <^!R;!HV%nx; >]o0̘6ʨjm jo{{N85y5QJ'Ҵ_w=T]$ GUQH^cR m)Ik@An6i8[+%hseѦLZAa\妃}o=4c>:(*h T_u5 koQl }_&snG{ai>O{=p_e(-sT);$w$zJz'篡BMLM4211 ɦ#6P&& L<&h M$̠ShڇySSmOT L@L@L&L 0h&hjMM0$d#@L# 04LFI'i)Byjm&ODMOBli$=4`FdlGOU$ cS &jzijF$HH6"yA4h=!A44D 2h&&&ABf@24 i4= L@=)Xg97hmmz߱] 5Y<"!!{iQ*Voʪ/HD=&UA6WV? kئnѷmV~d?;i.幚btԝ}-ՄaV;Cm9@9LWɀ4*etz{}AX]6Iݠ "fO7L sx{4MTe:s,GH@lhˡg7OEw(lL5{0H@rI/@?ku$`- ؑ i{dUl!\vsmR la !j78%H^k& xŝ`S,hHb #| tq7ٴq9|׏>iv&ےHZ46(↬C7Ű˭"!)D@Tg<>bq 6ڹj[n q?c|3(&!0u\Io5u`=k,p${ ^"Rɽ'"Ҋ2,# \CT y,&oI;%& SL]a0[s̓."HBBP֯ǿe1lMxHp$6XHHZ-?s)o[u}mnY[l49r#6][z#ϓ؇bГ%x>U8[na<ﬧ}{#QȂݼrʑC{k}W$v™,^UlLKf^V:TԸG;`xcȧfg( .sBR HH+F{KmsMNү IaT226HUq'OB$[t0BW!*BIs.R nsSIl)VN!#PUɷx>"$0$HXug@3iFݧ+j}2gZKhY`$E(hd/Y~Uc+ |u UL9U]F/i%:Nyg%{ad<'X2ohg#"k G\8v4vfo61!3ͭªN(;Ώ y!n* Z#=<A$KC iE$-U@s}F/|[]<'S5^\w *gp 2 hf˖~DdFN+5r%;8TDi:W H bt*%)4~qU |}6=_;ImurQ$wu_cTU/,i$$aRBHY'R|k_|υg$ՋmQjV޿uv_k㤿.5<6K>ELpy<||Ӯ YEYXDQVX4ek*KI691>$nK~/Đ;>kM&G#eV=Z3//Eq{sM3KИ) O?GVjB꽕ͭ[e_%I [`UƿlYjG32ylcϯRz[aVKE|Dm#Ck}7.KLpv%>L\Z_cs3 y6~@)hm=f O:#zF{ߥkꠐ'T_f$A*/ãs$$F!`$ZT]>hDjt.Qa4}i 1Q ,orkyƢ»s"CJHq}bw]k(̊48?c[6cQ̀pדlB!%=Gm;N) !(kOhX)brߴa=6Q! dTC&TC}\qdսib| t/q->DgGYֻՄX6J Qr7Ƴ,T | &~*$ō(UB"#8<!e i?y6Mq_-ޜ>j6/> UVnW DB Ťد&‹\+%\!!,Rt)x;Tm$*󒩀w_E]rw GSc'~=z7}W7I$I$H̴Y,IEk//߷C7r<'ҹoAu[?R~=bI-9NM(>OJn]u|G֓GA%:`)cv".a{""rQr4 .%C9&jӈڛl;k{=/?:q}#YkQW^B2C^N6?\@D!ri?7͉$u򆡙$%JTWPn]26}]vtyt;qH2̄{/ tҺZQ#xhۡC*)\q,KO򡽤dxEJ|< _峻߲bQXPC&iuFӇpΫdGfolnA%@1#zqa0O9n[ 6fx@ǃ듑q7HձR^n[ -W$ A ]>mR*WO" #'yeyfn˜wK۸f=Cj(*bُsނ/T$KT K2enQE~""Y=5P2EXE?)= "u䡤QD|n5{diRVR,ZTqXrN)5ژh-Jͮ^E9ꫮ#|u梹#7\<:)U *HHX I j8'gX8в$I2'tF Ey]P?$I',Ά'fb2W~1,vJo>7V= iymxZ)Wg%iok*C <;Xٽ"\ 3-FO]9]$tu *=p[zs2"B|*ػ`v aAb`s:W>$#IF^cM ^ D|x%*sRqjβ [f>)WV&/3$$qv JZ/]0Z*!r!0o7!)yd` z=[{ c圕Nn* #ep$2dRg>t4|Q/=SEӒ``R DAL9`KD!D._S,>W'i5-N { BޞB֙D8GN!@˱^Z岩_+5 {8.`tg!FQ /5"r)< z#i3|Wes Q ?k7Kw@vˉa-Nl4a(hS!N<#. x$`#c7*,6 c-,]#evxT:I-aJe)@Z'*r%.7WJ/AF>W!%)((wH%"4&whSXY3H& U>;|g JĨ!h PT#xV yuE H \=-C^R`g:>uQ"#D p8 fl=,`Ď~{m#su ؜j62Z˚79$E}_rf۴6(`cܳeuFӊ59t$~ݕCbJZE(PEb&jK&6amEؐSmLTJF!,gngjumW QM6$͹qQ#h(ּ*}tۇC BLsXJeFA*7xH[ @K]qS)8`ڹlQ[s v cۿqlc(8#^~O2~~sr+rD%︸֘_F DAi/3zޜ\n/if4aE$ϔy8G'??rKnS:Ʋ@Ȉ lËg4ˀO -/жũп ?8]=;%g|/Y6n.aUXAB*-kϲvחpLX$ľT ̆3ql6kJBBӛɧ9"a *{ї_`P)@-;"~?νN %AsG)$lz}|c#Z38: f[U7i^`'zP16 $J ;V| /JTӌP*e˧_:QGćn3T=Fgk5}ԉdݠibqPn$z۠0>zaPPV!ha2zZGBN/"*HƝ\BX=]D"HU`H5)2eA板SQ  Š+JA`+#lK)Qo  ,<MnS Ђ$v7A4F@ocP k|e#L U_WFwU 솃*I%<9zBu#=C 7v J T&#gp V\3ԋTK΋MB 91ePeS:'Ha1ָ18U9otsXD9یJ!w>[$d]-0Av)m^nF hpaê$EPRmbUQ P6ЕQb!Փ"S2 A d"٘"CŚCQb4ōcXQEQ c0cD4hA-j0]r(reR҉JZ(^>]H;< 8m>nh?(q!(H$J= ؂#_i$k;9;~@mHCP4zR D@PDCcIK:mWm g|^LUCmg?y۞ncc$ $H**lVňؐkKiAK0h#mRlm&QH6I(Ʋh[D5EZ3"$kCJ\ҳ 8(B|+2!4o^=yٸQt);N/$#Ll8PBJ_<\ . 
} oGrwnA,rw%aŴ^`C63 BCq>pφ*4nبҹښ?X@=`L ؼV=ZG]lݺ 9ʪ BR杣gn榕 o]98%$rC |QS8&~E!D@R404c{f(`_0$YGYGy[wz} -=2~ݟ Â@VAW^A-Z9Qm+ p Ԥg"~pA2,CPq?B=.cM=AjEU08 '6H}R縼?(0@n fv0 J' q 'L>Ph0utN2(4$"ĚI |VVI0Xeᔢ`xk8M>pU ⦉o0%@q!Ƃp2;;y$ ,]DbO#MTB g1DfC>9JQke>s!"r z9`;5UF'P=7 C8=-~ Pa-OߒG d ʁByQ#݋oMcKnQ K &) H&qeD*@*F??4{B> !ǜ%XU6"p & kǩ6ЄI$GyR Z,Qh*1`D-~+]膞;)-qcDH̸cSS/Cj}6J:]:э(칧RWc: yٙ6=V Qe" Z O&m黍 UM?xÁRz T))p2Āq" 8t27Ha^AD RD]א ٚ HV8S|’O(Ƽ= l MvBER()TqB鬃M)!? .+M{F}zSS>f 2G̋Nt+9 SWƺDE\ul^ CkG tl fEv%35mEUm5Dƈ#v:u}_BGj-iby~/+ @dbksNxgv cE΂=P@VQj%d0%BLxVNIɁqN V~/ذw|»~4== Ba09{js6jT|9Aw<߈i0osS D^Z8E0Ï#1Tl}✙w Tˋ61~_Nrnݳ>Ic,&TR*"!%P#~wfID,+ǸE5'8FAZ^>L.#y[2+ t'siVSYJPD]Rs̨wo)C YS$ iĞqUդE86+wܸKy8Š#t8 Q+ Azwwl}j#Zљ $kQM21.U|u2(H ||rt ]s:oYra`ÛCz7m4UØ"직e |Zir'zNF4+腴ι{NLO+ {o  nzs = ZI'[P -tNh&CC.o:/%$@zKeq3y4=F.x^E;]e9)VD`ITmߗ?;/ٰ/%$o<=TX@ _-̾&ΐA , #$,k'ʩƻY5~~Ν S͐MXPnŽ|FXKE|5DFDA!} $ONXH F*5HsLȘԆHK$Jը I-oQUSq4}BFlB-mTs;$ҵv{!X8ӦN"ƌfCaw%"|Yfͱm0!A}ЬohK2KEAzE$jg9#F}e"B5n㐽􏔣@Τ@'4-ɬ9(Mz<[8 H^!2H-K!zqyLp] o^e0'G.}=!vi 7I@+VHa*sM1$cHfzȜAYhؑM [2=5@@(y$sPb%h" WZ n&}k)1Z]֧K1`$W^)!I @&``BF 0"&,:=r@^j~zZM`m!$O'\ $\-4N6-Ga2yC6ND $]UF[qxz54tYt "KlG:{n{N"/-qk,OɕՊ;mK}|z tvH$z|UkW0wv( "SgY&,@(`@&PSgg< GJ|x$7 @D5 }B"}QM +ZWzT_{nt-F$ a闽 ե քL2J.)qlFFME)D2PIhdJbb4%3ԖY*ƮC)D44R-Z*4 $m+|>wLq $$>vI&̭z QTZ2b!#F5nUēXJk^Ǐ|WwVӛxB1$ _ 0F4rx D|waOtn7-};)PJux८8dKĖ%pvd]qّ3pNHo6SK2F]_$ъZ$nMyH"ytahP2GhI_[X[`Yfi$pw>'gͬ!Jrt!P,iw6twHHc@A)Uu"$cT(F (:VSܴDzYjCOP??:>( MA-{s{-o* HBA$$%2`"HI#;[dmK mmTL@)u/ԫ\MՒ'ɵ!#!$ixh_ZnW5W!<F7g؇@$M[_'6 .rR{om70p|D|տm t  ` =LWw{cm”6'ز Ա C.B˻-χw\qi4l Zwźh`R>pJg9q`8K=^dCf|*&R,<<7~κ)Vlv*ٍ${™HI.F0ogl,g#hBB:4|>F`Of8U^ߟ'\L"BFF%TH21;$$鎯iB>/If P  uRV-K(g!l˖:F^]khUmlנH$I,3SInkog컇ۜ@-2(Q(Ȉ( fd/o}~QkJ$-'[I$3]=ſK?Wӵz0iA#uXb i_F @ f ՈnʤDhm"c?1VjO;d":dP5Z_\rܗǞʏ2V ՝q;GŮLF†D|"2fxjU 7\Dqqk- [,FJNJG/(6q#ot(}8&U$H}3oW gp[{t^Q:aK/T$ jh^vkO2@:VOG5ut ĺ۫zKcE7 Hr~:%iuKev|;b^7/< yh^BD|ղjވ~ L#$P/BLc0v.(t{Pob4x4_os˗.2Vk;TrR֦)58iWhӷ:R 8;POM=hsWR2YJmn])%Ą@@st==P%؄ T,_? tXm$ EFr <&U8y7Vwrluy9>>t;J!Q$cD 8Vۏe[uD G`IFtUf=8 pGMBKl&mH&VI.BAHYʈ$ w9x {[e y!Uu H?YS Hop$n\NY41Qbk{=j9l d1:):EIZryHTϏ_A2y6/rٔR 칭#'Z2lC+kaTCIK= B`Y*waZGuXȜy7BjȂiu˅rUIWkDϢK=y-a>^yB. @^po;L8S IY]%mm˻$G:,"0@Sqt9y "Arr l&OC% g:q#iEO=Ʒ`0w4}{ֽ+ @7fd@'x-]Z/!=."c~M;@Cd LA0o#x3'YHPMX^>HI"t/hV6C&D^X''Yڪ3@(*е`h5L' mar!p=k63f\"!jjV4)/ӤkK$z໳8q'vA dkO()#{` |TiH!mih8(Q|NkΨmGs,-uk ciXPxhRAgt3a0FBCy%B%/ (lrUC澌q"$\`vܬerce1n "dCu%Mehp dBV('iYg%9Ol&mi"aƭpz^*dn8`f!FÅ^q\-0ysWJs;|~Po`>IGEy>NE4͏< a9hۖjyCYM H+K%yC5M>C^k>l@% ?50dD 5)0 &P3P_$`MxF-6y?ޟ/}?}eBJP$bMbd؍&Rk^>LoF"W:1/NbS Ѐ5#rrIENW_nqz?#\3H $H,Iim ё@8|'. "DԍDmDcļ|V I< u##tU(̬Z9vf)٤l+g05 EE03@ //ż Qrd+]^H5p:VP 缵qp'EuQpkL4i)s+?m ;=GPTdPe #2p# AHQ9Qli=m>iݽ܏7tNu^gE/+QrjM~]ZV] eީ{0kGF'0wrܡ}`2C}0bm/kIDy!a7SzAex"C X% NƥG;1M 3&pܶE:S_9xE2<^ 5Cj/qʤ5lng-SNnryiAZ.wW45䪒W#M(P1*3F%JE S!L^dMpHL3e̘72}uk{nzc9>%V M`> BzVo5MH<D#.D$i2,&|R{%l{rt{{săבCz`EX7axU"eK eAADa >A0-c$Ž f|}KReF!(D A@μ4@=CøceZm4G֑8.$Ej%HY/+˚ts퍠 !OvP_(@bRYel$ @5y%E T7. ^: I6~(! h!鲨6tO(5f|C>nrDE[',L<{ WDɰ1p:GmQ`DG8`8f$qdzg/r`a0[Q!a%_ed!b"*Rs<6A)-̔I(%qJkpkcc-'owwlq7x`mz] RH"`F;8ƶͳ9 X{ ;:e! 
8GwW;`P+m[ZAj0@j}L0YU\G\70\Ճ\Nl_jЁ`{9Z߭ ˱hH6܎AF@SPe+ XV"GJ*Kй Vt[-űm.%RXWV!q8偲dAhkD/TKNؘkBMQ0RK؍ x'l/$>- &e+ M+)!R~m%v8Re'@6JtD3fJ@,1W󂗏UE3៼{RE^*pL f&~BKs_/N ^흸tF/W=' /u$}GSWO 6 gME%@p-U*)$&AB=@@Qsv!T}L !A Q$ RkCtnK$ "8"W#1 A d,Eom!=LP-+[Rԃf`hP?Ï,o+\Hһ ӅZe 8b$γ'Mi8rNЁ'&,PH$ %6h"j4O61lZDح)d5 AFJ2i[r]:SNwF&t`f1X' N/~`Ajncvpl͙ "v,ЬőDžD -K(N2*:]5ZL ܃\N; %JǶX% 0a63zC_rW /4fƣ!RRY؜hAS!ѲT3ebP8#C)=8rC@@֯]߿ [DPc7׈g4m^WDIW%mykIo@ZeWHCFYQ lFC68Y}<z`<>DfJ#;{yJ8[V__!kH!PfReW85rMʻHBEF&yUXŷ%f/ wTH8-,u@aBuZ5Y^YW a{<@7s2BY!ԑ#y l"Һx@9Y(Tx^i~H' @5HP4T\:4 dD|8#Êv h ޓtn>Pp;: 8T7u{D(^-{Akvvx)@̈́_1vVWjs"SmC+P-HW e zA%iݤ{zPF7lAF83t18dX<u֕ ^VbX{|% CMHP@/86̂aUjTnܿDK "k ڣTJLI3 DۿIC^}%(,] p*AJ%  ǡ"U @"(OBJL8*kag=(T-A s}Ls$@!.ݹ J8s#, dHK 8[(4Ѹ*;|FƠE,v [Žd0H" 8# :KW .VC3PqȼR  ,CeRo6ޗ 0 jq.!B%Dh0?~$Op>p>!k[=U s*TCH55# G >0a9fs61oJda@;EOx Ax YhJj#UyD & ጀW~ܴaRH=%%/GD ?rk @<䄊ۤ#[tg%yE-g ='prNHIL=̯h4A7;4r0^٪z*j#-{ldEg@Hb?vdGszT²SSb$i$,8L}2d5_#V-a~>պMRGM#2!Zh;͞ gõ~Nr9yce,aR~|&9sPO,%)QBrEyJGh1|tx&;kzL2(B+v={--Odp'Æ""|ɜJҘ :2Ƚ@i]l|]OpP{C+dtj/)P Plt7 {2Mwuʷ0vt΍aVBXL_q:a!j܃!/p/1yD!ٌg<x6fX;^z:\JbNPPÇ"_(hgHg~5?5T wZAbZ6l"@I:5' _䦐83}"ԏDQZk鈪w 3xeŃdо\Cl!8L$$(!c!1XEokFu{F{Z0mOg2cᎷr'5L~`GE1&MA}+6c%8ڶg+ OCH=$oJCC/]TEИrkW7w H8[u,†*1H6LqP5bϪKq oF& GX|Q(쎀 LKFgFbܗ`x%d]RL$q0J@gO$3"K#= o|t- aGb_$>2V! /+^ "K'u"-WE^D*퀵N)B""N ZHiJ ɨaPxTƧӘѓ5! ln;/38ؔF0J#J~ ,o afge潟 =cwVx ңyct"sh"3>Uo0~?s+ôgʂq4`FRjb,&^2H^7U1ĸ PJњ \ 6qfιDn.ˈԡp @Lh8G'bY|.Y`Z/Y\D"""@ c=<}Vx7 3}fjR Gz?I:M)\!5>B2wef7(~7P0t5!`ErK#.xסNGDm` d jAϙj`1rp!|jJ_Aj@ `؀ulrݪ㙏Emt,&@"![Ԙ-`)0T0,?QZǼDhh$qg@.Ogwu+_c7 o"|Sb.䣈=B`cP .z d]HC=ܬ\(xѠx$=aO\CՊٛP< JA6.+XU!{AaOZU&ob 5U! HQ=ay,9U v5**ҕUT != ?ocN#:B:ȥ yټL ™fJARGEHx`d&I{y6IǨ sĴBĩwD (IJ!"@@)4aH_m'.;D:+8D: +F[l6ձPD4SD¶ ! ~a,\gMJ”֝2Q'MP ƭXR]9ͥʑHILr]L)-+RGF5L.od'>'X9"{&-2TIKғW8=Q hffqd()B,.000n6 gx47𪘎l4ө2/Ή˲7\p;+NP}3#H>~ju1Qm'TeE(56&d3X*w2,Pxp@p4J{RTa`Hx^>>8,@0usKP2QG[1>A@#Z1CQڵ/k}=Sx3/!0zVJս9J T`JB[!g9Ww:*&vK`$kje_!Jc[|-[+7F0|NA!P5)@2H# (rq(H·Eu dۧ $#`:D @D@MLjoH&ee'B]h'/-ܴ`_PA˔ۗy閿p' LDt% Bl&03}o奼7ݖpQ]Rarv-2ΠN!םh&$ AR̈ɨkFѩ- 䜖vx_ %'^{ r9/0_W"sq .9<Ex$8Q b n+]l0 7oob{u޾N|;a;]vzu8B"H0#&+stq,AcN뼺bVXf%'z?h!pJ,z""">^fsv98xY+0:np]eUrpGŧL`w *>9hqmQ 9Y3ִ )o Rӡ,0 %*0؛o[\0^MhBZBAEBD2~_W^W&s'H-w*AǰZ.F=~Jd>VA83Vf=rltO{xD#B!d$^Ěp`'eh"ѶΰF&a+aa ׯ&yy H%wBCkPrI)!λ:V\ۊy{Y6P[_m!?CJB H$D@79 a)8,ni!L;6:R {N/[O?iLA"G{ޘ4XwZ`Ri\y` |A N$dR k?*ۛ)2OpAٔջ !PPT!bI0Kij"`i_jYc$.Hct[0[qE%вT\ڦYd "Esc慆l MnXgxy-|.uxC>y^M7rx0h.IUyy?xy~xe k$ -J;Iw,lt׷A*[l&!D"""c*̈Q-{I x>ۭU8c5Sɂnc[۱'*~=!.4Q,Zq/(&TڐM]CWPJ1sc$Fycc-U$BFB[&+D \5 gݓN@k| 2vƥ -Xyxg;xRDd@HO**0ED$ Opl e}.b?oʱe/7Y@V\ջB 65P|)U)ƎAk!oȁH-i}p//Jun ſ΍JD%:Țzs]_3:,1xX<2rGcĀjUI@FM% S]w\łJ< =h+.^[N#74T@@CegVnM.azT("2iY_6HL M1{CWX>YDj,pnq Uo P>}whH`I4э߸\)K>=ݽ^ f8]LW6 [C@͵ub)!ɗQ0Ԟ)ge 8;y $) GU@md9q PC7kN_hh }HPLIѱm? 
FV'{L`kPޜ<ޡW Q2>D3D1LP7 1xe$KN$zIvDQbHx*ol[`Gm">vsԷVuZO1\m䁸.E\iء<[s\1J/iiJ3#n;"BM*wĘDd`uj^#~[*#Muƒ]rDENAt@fRQ>eL'rTkF(~,m$r<>L/!@Zˮ6&D(z'LM& @.i[I$ѫR͢{l{a#==~F4Y?CocVPF7{5{ !BLZ=}oE(i"!QkƆR;OVn\e&ѵ}NX-+,PH >V#rhw5P #ms 6A %D-)hBXL`AXgUyBibH{8*Vo$IZ<"6|W8T?HYӿ3]5 Xisu 0v!"I@9BZ `и7(izg~ }hZRb 9~2 adKKIAh-]} `Y8 Y ѸN #!!x{}QUs?|>,O:o-u7N^C" sP!|rPčf1 [2rBQ^Pxܽ@ chީhr_g5š)rLg f>I<#r'q:B~nZ'm WG(Z܄ARCFSAh@lA \D.8{]>L, T{Wv~֒k>X\afWy{iPE yvk {')|B <(T)d?A6nfd78Ŷz杫r|ts| 7IG[*1kW5h($dC##,2m5}.u뷺O\sK6f !Ғ% %GH3ntOy7ނ Z2Ys_:C/*ZTEf@,*$N8q/aiLW9Mno²BIڜ>ju<='mfʜ )쬚=}Jf,/HLnCUZT璇`0s|]:'%K(FdS#k޻)tO'_~ŁЂf ǽdǘ//q M-j'Ѥw%(\2ZEx!g0hOcsg2y :'PǗ} i> ŰK!ݠ;dj">b!fPei w*sP-]Όo@G?vV@$Ispf֜c, z r@+JԽ0KWm׎@C'Vm0lX e-"8 >UdYf 09bNHz#wtK`{F}焔bSC3N"ZV'իN19WnZe.+}ٵӅdO Qh=RуLD`jvZ =)$;zBFpx{P4r)X #ޘƻ5r# }nk<-@qmjkb5qn( MBzbN'OhVtP(g/y#7tV 7Ch" B mc| )P P(jVզSQ*ꔢ"! P4 r>G%K"yiemF=3ޡ7k@* ֍BU f[Y*':\Xd({T CA8;)1+ v2t2j)ld6P2 k=t!!{י:Q0_ДD HC󟟍T/s!88Q$ZbV(.pdğ ,48ue#AFGx KK`j<C +u^U$Pα!p$)iM#n$;&=[,wIȮIK^w~\68oM9ɡ(DTIV&h8uЋA`Ȝ4NjqD]W?0G7p=ťIZ}"y^D) 8"Bm=׎ۃc(d!B0wXڍ qLx[@CXh5N Ic$q[j;s1qQUFO֨@^~K"J:. ͘8Pgw!s&`wT<. G v5;OW Ǘ } TD9:R$WP:*k1f,Dd֤mƴ D;*@ $=! %v@v'q@3h¶&w,B%0u_mhcsaND'EiA uYKlA-L:{@8Z-@H"H61T+G̐}A I!:>d4-Z5f&y=K>W6M(<0) ElQ8 8001j@-O!ɷreoGCͣ*M !BB c.BDdKXrYu=rAsvV]5mƜ?;2b J77Rja'rT@̈xWAL=h1.`: #*ts.M+ܞۏJ*0֐#'ꏣnol+~L +E Y%,Ap֔sE<{NǣɏUrc6%o KQޫVv(kܱ 1H=F* il`똯2JB~DA}~3f$ccK^$!(zfAzIy,hjS d ol2A8,Q E(ټ{D ưf›F *+G!P(gfhP(!:>!*_lDpP|l$ y"*>`suhxA+(3Će(Ֆ0vv懍%NW m }>#~;ԥܥ|>PC}OZjpL,@bȲ{6@k^|b h 7 |?A깊Se=)f`s:Z@p{s,>» 75kXvxzE;*1HbU%NgefZD @@(oVv, ۉ(Jc|7JcE,غ<!H ʏnaě@I|bU "dǜ*;K-Q&e< ԴK9l&-4sIfI%9}: LqAfϪ PEGLN@RA(.fdK]48q+:(G9 ȗ1&D!Ŵ64Jp6[BMg3DCFy-e+ e_IvT08,)J[K ʔp| D!P A-1H^@&Nc/mOCd=|NSKc"15 Lj:@IEtub8żIܘgUdPb!ֶZZ:Γ V&Q .D5 #7 C -9p3{NFHZxi2HJ@<#8?P^T$`K $h)Cye@5Oჩ].† hv|D"SX~f)ЕTKZy$r6NQ'{dq{2E1MMΌva/8%Fe"b8g @8u,NT f WR - zK{m?<ֺbPͬJ&xmX^nc Q ;@-04[5AP`KNJ7SWqtOL CӖ7R8B9}Ue#kڊDK3|[a f E.5r:^2B+eㇻ&8$>ZP\sBP/?c1O\0u@DxSWJΞl_`A@XP2Ș-3zA¥[ YjJM+* š8)m8!ĀJA{Jlsu-D;_t [.ۣs, T"ưJLn` `~Pvxn ϋp"Bf&L"UiÂ4ᅙiB(ęFa-E>V0CwdzH&6K9~ <~(p6U3x}KH92 Q<ߒunTS\Kr +A&v&"T8mR 5 KLϓ 5m$ 1I ʊPf8f*涭GQ]p% D|[x }g(5rY7pù裏VGhm+9lہA@~%Ѥb7x>Ȁ!6ktdܖo-4pUy"S @56sqкA8*' DD@֨LyA 9. >kHឣK/=h$ԩd|H؆ sA YDta"tfnkf( V BOV*3xXHrvz{Ojh^`4C+]1~B쫏_uaĿE[Ɂ0D gB(sb@2nocB ejH83V഻k>7Jiz˜-#!5 :G;dݛ~ӪAe$c1}D^W 'UVivIAM(;vi+CӲ%Tɘ: $ }t)#z]&QO>FS4zH0#⨘Ks MVҙ}v-nL"̑ibv4/rݹ9 lߒd 20L˘!EЉOQ(+ ݜËGS ,Md-)OSZ'7#ְNM2`P60AWZ%X7JB n2ۻKLȆߗؐ޽VG+7+=9Ɯ,K=<"c58)!- ܈C$9JXqI#_^@H@[5i[l;j{+Q Dm^b k/8Z8lXQȰmA$ w >UDްci=&AՎ0Iڥ8gT%j<[ $qD',Ac3C 02ě3@Wl,PQ cWYw1VԤS_rSlK6Ux펅Q.=qe! P@?s!\7\0|H# L& [2il~Gq<z]وcSJȚP8B衒IitDp]ufc[-%!|Uu7-iO8DDD0N#;>PZ^{wl6r-qU4i44zdp08u::s**B5ڄJCzt0t}.h[ iH1kmM #ZÁH(񣩇/ーĀ"b3ɴYЦSU$EDDaKKȮDԜqbN+ Y BU-3X>KD"4mYCgBJ$~ 5;xP@wk7j^%6p{0#] ?h [)(%@$mj,%֠U@jIsN'\6'K!sG9?_WPpu]'} ow{(Pz+H TQǶ4$kC VTL' Wc[ Y 9B*(GеAT1Z4iaD С,ȂHK^M~S@߄믒[o nz*^N?~B!2yk @i3r+wwu',Ο .( @9L^&NOC\Y@Z:8<-["q@5!Wo! V>Z} 8^6li~6Yn5 n/~™ PH 0t.~r=!$qsV1[]Z^D~q.wuZ>W9`v.6NB~2QC|vrĠgN[kvz٫(xKNCU"1-L lfG1J)xA}s e {0ϕm.*gXeuf V1aA%."0%QC Kki@MW}qf;C*DpU2mrw)','M7$Q&yJ;\֯>[::;:W <#*9.J!daOjraR٫Ǖ]Y{k9'#q$$!kҍ jͅQ$8V5@ǮTRו`aJ䡢JA_]VGO#T) Gj%rGBAiԣ)^ݔ&U"LP`@188Ֆ qG;0F gx Ҩ5䦀KhF9SvK&d K.RxM]KLJ2RPse^\Y\kn5wZWV4! ka R\R(dzŝ`6%܈~/?/0bs 'l&aҊnZ>F8~[C0$"b{Yx Snt t( ! P EV@W3f8GRiE. 
s'xOB 0ZmdvJ?BTDfoF^(:%Fv]Y"rcp|e3 N`FXh!B+ե4@FPP14 /)@$#weyxsJ<$ @G3W3󾫩b.Wb8Ii]WVi8荩a20OyT,nP){:47sYB |'Bm N1@G#A9VAPl5U+6RIsP\@_#R?N97zr?3zxW` hkqthD {g $>IocKĊڍ%WM*lw=`o%(k0H][8)A +Pms6ߕES\H` .AJ!^bB(tUJj>ԾC|[cQ@ wD!._aDJ(=҃_Zf(x: H0%ࣼGe.hkI[ǔ:/8NTyuID.j%_1TO!e_)dꚘCt.|X }r#EY<\u$@ւ pgbIv9Z)̀;a ,3ӽzXA0^_:ͲJNG&{hikc+y85_f=fP9@ t k">1e ?]̷5:%r~QI a8~\;)4 M:Ry[hɋ¶I!RY|lJ:(5i*bu.T:N 5d񦭎 Wg~4tp+Dt.@DG-@w"( 0iq9}y*o61WUI9]tNI"Oԯ|L@"7o:!b=g(ҭ#ۍtFi aP_Qr ]YNof,QDi0̯6}L/Mn48 Ѐ2 IwdITgs-zlPw^ gw{JgqL~[]Lc*rs0m+W_JɆJ iK/r@Ķ& +'q\eF+~פ{K {@ZCLlu2pŸ/BODkI@Q&%(p"b\`˒kY)  i mw6 ?/x V.`pB] jRpDŮb;'=|Ԃ5˜*eʛpQ-C@(p6 T VlUZsc wXJm\JD poxV}חUU:v $|\Ne8K@ao -+fS{C\"74 ^ocpx^+x2oӽ[0q)k${e MXu ^Y;ptmu43X Fq(!_{-}KmaP\scܛѦ$Ǩ2P1i䄺 oʥ'qg`H 9 ޖ wI>Cn3^"P-vL qYZ@=3̶n2Mwe5ù椑7F_@U쵥5 V(þ2a>a^4, ++C jDI`'҈F`g%Ҧ9~ٱΑ{ D`1I0 u4午P_ok; k$k[IiHw(I-M5uMKË{|VGp«—@[ݓMްݔ,vKxt w5(3ƕ% `VؒѢ6֯}0qg.>{?l];Af4rEDݚ wP[lϗe5QQ5 ah;1X"BFL4$R+,~ l{T<]!T`U8%](1?E(Ŏ֙uZ\bb.W%$^& jps;"kTx!||rҗV7XEY_8½#n0W)E d@Iб*40x,7^?=n N@- O`6b5P(!#+n2%khdi5م zf G~ww)H{šd : ;%(Pa|X )|eB3Τ]*ǘZgJ:;11AU$Ц5pěE*(.B=ߖ7G=2Xi 7 9e|g: 4x+pTUx`VV  FH!ջm LnG6lt  qC6c6VQ mc!ZDE h,񒴋B&"OHBJ{5v6;𕫠2ėyf{H߻khAx KSք Q@=sm,D1g^=0#(2*/?^IZTgŽ/~4)s$-.7Iy]xzDBZ4+H^%W),,cwL"zm[efsF $58e ]+\ly6*RPgꅁ~3"B{Mlu0MNZwβ5%_KCmq/Η?`(3sY?n;P@5Yal2Aq- !m4xT*G#fC0z# LҶVN5e(( vS)#87]|O:@8` O Ҹ>7fMoS(I!6 "b}5d)+lJ~C&[*bQ+|,k=.,ק4򻻳p5N*ԯ.& pД`"]@AUphUVFD`q P 9&U7 7Z0p :)A ^Vyx!j/g1hFp}'1;ĨC5.RW'mDnU{2pO C˘Y!E<Cw+ v8:,#{wʤr, {DyiVR]qdMRQ%=K-$Eg*2 vgˤJ G?3Q-7%i jl{.>8=3yn駠΋7|OWTɜI3 (cQz|Oy"!3r^=6EBV w*wZ[?)!َhX1m;blN\[8#lbQTNUP7<vX իh3L1!Ԭ9c plM" %\ j#ahVUhnWx҄wTz%%RS?2 1 վ!c0-ɡJaBի ("A)p/ض{Y4<^]gs6TWOhLwt81׵iz[0(^AtP1L@[_Yp#sᬵ #c(h3Pnv=K]W$ @GZLu@iLc]'~Ll[!o5}4,$.1HZ jTO$ɀ& qˀ CJwKc:X@-RۃJ۩ku+y8@0ɿڳ0=D9\ʵE R{pB& 归z)LҮv˹{P*̴7rsVu#yt.^"K7cA~T^*PCYRAM+#(3nc6"V.VK;$but͸]]VD=SQ_-̲zrͅpWzsM(M\)i+MhSm5 Xܶ! 8 [IUY8CP]@AwWz( ;>U6!C()<8(DCb@-(AEGΧ>}U+nb~ RW#DKYMbe"- GschtHU(G˴nLʈPd(sݑ&b;7.y$EBZeK JZuQPڔju~x( ie@+`ʼn2xLޤF]Mꕗ:nn[n#e /_O}]f[*~]!u!AXyKsQxaX.m' ϯL:('j2r/F2|_a i`w ~q2R( xg 'M4eput A&]aH; phUrD/Lm 9NmNۇcH=Ùⲟ_uI !*$ "@!~O+V>/=%3Pf7bP8ϩa H)Q-dR ޅ;':(#ܙB" g8(rKYkpc%%*Eg"y8IVnjrr0F>{74z&ʿ;)ŇaĪѕ"̷ `P\i7)L#pjP t8vW#XZ 2|b_o1XF 2b Xmź*钋"e`XUn8tɺH@Ӑ0 (MO嘵JD14~[u#;L$j+oKN>;EETB.zW2۲ <7e<<`Ϊ}nk}|5:AbLk6c߉1Yo@17s%Z}mc6Zh=X sja|R|yg3mUwL㿚/d_LISvQީ%bZw\+IJƎ9xH Y` "g}]tFrov 6=Ra++FAaɺ BF(Pkԕ>Nt4{ d%r%hC9OB3ٷܽ#KR#yt+a| 薺̹]a?eHUy=LWn|lfOd&[}7oZ%oTjܞ"]t-SD ~+$Z ("!(ϕA7V;r <a4q$(ێ {hn^gC-Ɛ[wۯiU5u*p%67ٶYc:aD4wܭK4R/l jgmJᩥ{Qy !{2ڊ:А\qt-gcH@!=萀!\ uL6~(,ak\d]!)vf۟R n.x4ہ/1 thk"5A쬖 J\hT;vڧ0b|;EAt1P$® E;a/!0TwqhnM0<>{}|+})]ًC[ TvC>J)v( f`Aΐݦ`ya|){HM7M`hggoVz\bf>j[gd|QF|P_Ml7 CTI8HE tݚ4y *;|^VXByKm͗{ŲYGk{-(ZI+3Ud8 6!k lSTĉnD5{XH晕E-+tERg6NZ85}jľ-u–hH\oLR0CNU8&!tz[cjĉTE Z,}}uYm<:&PJ8fqi{u;`{tY=i'eCqc$gO8OV$BvD ؃Kf(5*ܼ4ۼ%H6|zz`\~I-zU] Q:fB,ͦ -3_YA[pL4SɡB<^׫wVl2ٶF~ ir~u=ɚ % i˝p|m6JRLPyil^^0]^\tr\š ry575S MPrlCȥpOБ<k,QrLs. 
AC]Bv(L:-0媋Zܽy~J V[Z+mT縀78[Mv$"{$|>J"XB4sH*.SC,C*7cyζ.^2`0b;qpwA؅V(8( f~~ CSeLb^^wZ<=*k_cf$+ =tx: 6& fAM)$]OW">.rv -k(ʨt5ͦW{l^aehIB 1 >1vV@ҫJ&gSAhCIUPn"H﹄Rŧ6`k2KȀ]Ze7J+r|̵ZAՍh[!*¶s9hȒ%g9@$Fd3Y-*T*ld D"ցUPB nm’FJ:*[uĥ5̐LӚXF/T( Ѝh` E{R;e 0B{4ڗsiHW 7hK"Pm{I2AXBje̥Z0yb1T6Qk  Q/Pt7HhaNVf4nlKk$%ZAi@(p{Pݢ$FMHf^ VsV7 ghh=:AJ6DH3AUY~<EY-W"%RN+DX`H;.e#,ɵw4ۼm9:mDunNjN4- z2 ӳ Q>Ūjg K!V5F{' PrݖjwXA ~lz2QBri?k> Qad&Ԋ\dBZfM䷗JPH\1'9]K8H`7m{zP^iAQ%bj* gFt`hi,ɳH=Qct /|)<;og _iP _4:<$lfO*&ɺU ۏ28+Wg'mv@}"'jNdգ3{Mfɒ!an@S, T.}5*ʛ1 QWȿňU"or^h:U$wtn jD gb_+YXs&q\}L mɭ?c):3#ԕ؝;A}*1kz)}J+yb@"l.|(y|cS+Jw|UYB(N],"H5,7tP;`I`79޲~:|zLog]9N!e~aVT]g۫i,&[ oL>"MN2,[fvMsqi#رFB@˅8oM{6Hl% )vӦ-um>4R]isE||dHfi2$3Zznʢ}U>_M[C]jHD@CWb`(hf.k9gɎ ie֮ _TD6κT= {a 0!C 6dxJ-K=N9߿)42KPT(ND[X y̨>tQA-כSo΢27p=n=xP߿vG`ՆJ2!II藸pWsҼkx/!R|in3oaZ>m݉:Vd%itvpf~ĕnP|oz0yXרYզPfcp*+7*:0xoSm(D`Awmem#reA]}(½epݜWLJ*,!HvI+h,f_xP >ee^3>OU cΌmçq!KR% 6o</{oh W%H"L++n :3M G;ʍW^pҚJ\+mk$<ۦ]Lʧ' ɩ!, ZqqOL+#JN) cB "$N $}F;MHifHDsw||n GFI ׂKw 9y0oG>6 jQM4 TdSJ՟~۩IQs봙#uEa[w)#|?SVzs4Wȷ(18뻳”/S{:Ѿ$(BFzցY}'Z7HbF4CgHW9Rmy+d1l $0߉;iJ.W C`jdt_4uQtQ82M1V +@Wh}lMOHd-I"\+\OW^0" _CYdy}t,/_E[)Vm<@hB#q}qȱ.ÀCBr}Pю u>ɹtct#*Aȇ|{}$oʴkLg+ƃc0Yv|%n4ì3Kd-Cա+@vՇkM -q/Ӿ\=M )hCZi5Ty(yw<"}kiʡh!N )kb'0f+iɔT(pgkDvƞ~@ysb硚srקiM#Eh'KsZӭ>t 0(ϰcC e-аYQĥZZq] 4}VX~o#joA՚Uk5-nv3&i͹-Gv'G+NBIV=Ͳ5ܻy/23'hq&H(W ^1O5P+|/ :ǶIBb]H> 8׍q%w;+ƥ+Nٹ!nc)~؜:L??X'1,W.TVˑ5RB~DH*}[Ck ˽_:ɹW9837NҡX&ymXHq3K}3QVKGFn$'=n@$&V EYY}4]uLC!"e~TmsGSAR~ ,AD.Lw7Kof|C?E45Ҳǻj ̋c#Pۜ"sl1 ׽}f\Ԧ&r$o(Pj/O6xOenDo"Au$)!tqP<ڰ }֔p4 ^~Yj3+Ho^x%W8\]Mн>WSnd%U9SR< @NWGHɢP&fIy&m!_YumHVƺnʰ\Cݾht$?$|fL=x!ՠ7k(0vw9MV;{'CTh(ƈ#!0[#RN4Ϋ.rt>.~iSL D)y9˻{6u?r6p>Itũo8B8eeNw3D9RQf27ݠ"9bzƿe.Dre{EHQ.hScTޡ%{ۯ(08f`.S~.XH+"v-`D$vyktiykyj'le*j;4ݢQq7nY%W*uO~y[+L Ipc5v(DDbNc1J"pݩ$%&aNjAt3$: tZz +#|ۥL,i$@zӞ`lF^} qL$J rEԉ/:iVDa9dh[L0';Q)c "M|z2*㷩r5G.λT$)0Wg_@11)gcjN6$WiUvt#KěF- w[[hh`h_ֹE Ifa#هgX/_A3>+Elytdhw" Ԑe:#" 6DCw^_Ƒ›Cl7(Kalx /ixY(Ń> :vagoI4RnS)۔ v]O(7DޑsJbF9$CE0t`SB55Jn>NOӐ9ȈB@@ $WMu<\~=lN>AXZλVTR7/f25&KPޛ@8p2 uRYZ0&Nw[7Cs.PqPNq@ҬVإҲENѲW3`x[M*N/R[MPOv5C;l̠LAЅȡe7:,izkV &3G #tY8qrU< ÄVŎxAQ a)6'|Xڗrr{E`\rw5 Y[JnWriےc\:=ޚ:<{4v+^5i[6gw"~ZmIK#xQUAZ~K`v%ڌd+wɿ58yM\x9o}ƌep$6PeotDXE@ e{fuh !⑒MZZ+@qY@KܴBYy0[N3IupҢyzM-mЁY̘.rz>2^S:ڻH@{ԲKgpG"XɹN22`bf. 
L (/N/Ԛa-x.s%2.^巵ngmâa9E@# :` fC0h1`Vm j?;8"@%4,/q񴔊Uu2M=\Zƫң8jX*.,ȗxDQ䧋_[t њ;O*.0(3+\b;G!rUC}#Ue=gɼ[fcEf};\a/<*oM =,NR p,ss-R2gR14s"i˽O< 0dy\qFn.U{)="O<}3FWS8 L5]芾:aQyL>Om6,ZMi 5&K|^K4 x-6&+࣪֞y<UjO'%1bV]Q߽_u7>n`o^J>+%c%\7p9]L5ۋk8%f;Bh- cYq*tH&i\hx||R[F\Nڬ QHqY+fU> Pn(!&LZf(~FG{)'kV(碃nܽt{k,8wnoͼ 'vʷ.Ku<ҳǰZx!j MFu趓&$UowF0>|s;>oUd!8>$D:['!>m6vؾIv0ಈpk6Om;w۰ޠHc/+004ƲD/pW2㿢Ev{h @-senД-.]34KwG:BEqۛ k.ʉg֓Z\-Ruk7/7ɺXy xQ)%{޺dujUp6{&v R6| /E&'/XG@2P>A.yG{3Ͳ^!9Ka#_-.[(ՙn&kѱlI+V Zd~WC)[ #;Hb(*yuݲ}aEMkyY$mK;24KIW) Ӌjy&Z$ = o?aTlRܳBW}Oyw" M'~ "u!= d>B@@R}>d⃼s\<B a) x/%GśuF-pf 4X҅LdiZez_oXy-&.3nphxxO.kTf5VtRk'fXsipxO 9ƨmvhs2E[-׃ Ҽͱ*5y:kY 80(b^Tpmi!]Щ:MiY`uu3\3,HsqLPvWtҶ}(V0L<AVfM|ɣHX&]ZVZd_%a\(ʅr$,//ZK,&YjRwm(+>E) vc,Zv yà!l&$nL 8>b!ϦbͣE5*@Nq.1pդ&qnfſ8qA^DJ; \ #^k )Pp.5eA%awҨ^ףYq_TgqTkb ,F2ɡޙ@-(՞cC0񁰬7=[Ρ= ټq\B?xuZ4[4wx忉~~ğ#7_8T\vN6)Rollu[DjNϤh&Om{釴vwc[ K҄Tt!CCh㞥SIIt-H` B 8z=.}3mZn1,p~x96) f\+^i}}ncu{S{|<,bKq#<$o< qmF2G*)߄.sd}M8V-_5UAVZ:jq!Z0, ƺX(T֚mh/ΐ[2gĎ5-BPwW0x@G9r:TUf=X|Fàlp8B22=˾6$dSTAE(kP( A~@nP jp49Vi+&|D@*1C!/5 ZVI{)Ws.‰=q܏( KZE x >-"\/"ڳU~yEl5mNeXLn^ ;֞P|1hcwgY#ߝT)F'TXݮd-3Lmf%Cwg}Kn+.0_?(`7>N|-n[R]i`Iť v§2r׸+ڮ9 }[0?;$lB ۔agA !kd>|H 4(P]Q\իjA$bJ\t9 qh7!Cbw0B)/B,`X#, ^Atm# }qҺT6A佇ۊWYpd!YYf\b0 _Un``ԖEʳ򸘿]FC旣 bÜT:N6=TAɧhӇ.N5*TuEELx{baʔGEzSΣ>y!8rpwl8c[Y %d"MAq̃Za9QB^4+m%sNPͷNA2( ,ɠQ 5c̚ū^7(8p㹝AKfsKa*l|8}>b'E #PmCGӸ`ȂTx[ku9`l|bP>G;%J\ޫ=Zx|>vױ+ܞ Qw. cyO>KU !HdЙ i7 vw{,rr2NXGhq@. N{Kz2r!l'm##?8*yF}6(m@'`ͩ%#m4@EKj/1eE/36< mjOu-K -"yΎ¦AY1Õ\~e@_R\[USQl jf"AbWzc{5:Sږ|qh(سgO*xUG^pieUH@a|]݅s$بLrjmzSm|]M~RW,1PB|ZH[@%9 ߟ*ɢ#% (9fO6Cm0A;#hcQy" ܡG.ˈ165y >o?~@ h@MEڥJԐ}:yPnJ3Kwh(1D*н5R1F-M`mʈ-<Bm. ^_F3޻7hZOuO:-{ ў,n&[m*ك! E9v-s/ܾ}ccb2h"A@۵Y  LӪUq>2-NAj:uͦE 8ΑDh9 9;c6ttqH6]hѪX7;39Ru A *A*|.%ܞ<,C" ׄ7N bjl3N[[QZjxVٴ3U.?ʺ챈TRt@:ݪ. JzG9m|vq&!D(TߌlufSLB=D&襗 ӽ H衤D.詚DDAG N2X!g${@ M7M[gWK*h:I?烑Mop%-Od<8)M}V4'wXE,HH0tdyіl/7Cyq>GѦPcʏ\HlM9i;JpG TmhR(1Iw \/PZC?@z#/6,i3CAm28KN L ̗9q׹^ F ֨p`gtX\/(l8͇fҭfP\ ':ƻw@.dآJ e>]߁EMe ٍӝ >˪.GVȚx޷ ųkYlHKm OTowke6%$Yo5W@)_ŶƢ%2Xm+>͡#?سwB'ɌkyCހ*/4,tЯ;"(ihf6p6rF:e66VkZ/5&6V kD̨7fAgbK_Æ k [Ԯ9kh\ F:Lz=+"k ߩ)ƍZV \Y;@`M_7VRGb:0!("ZX_; #:ȯ2imsm>t^zrRޮK]21o$rnxy6ϙu،+2FԺTJMS&Ra+e-WT?hBHDBt5)x v%td;6n(KjY0"fҢ!_R1_ov<rLTs fˍ`(]e*XmmFUߘ3|G4ZF2R5ZҦ`ڰ2&_d :kT@X駁 ָ@v( Dd:YDlu (w>gPȡW8 Hiȝyi!?8gZ6'&>HnY)ǫ4*AҵrJUi9ڕ*`5vGZ3xVAkmp;5Jd+i,v(ryeҏ]Fqa \,`}$%d'c.{p֕b_2_ 2G,/脤fzV1WUg^C7y>Wv+xZA,ϱC]7x@@ZKzLg8'?*:9D x|L9ں1guzLlA-ϡǠ~\ZLmvԀ! i`9e!,G3㥼^2\,t|nGB1 8I^6#9BUJFMW3QC.17mH ȩ%.J砡t ;{+F,7ijо.wSkmZE#6|S6)p vN hb:%d>9:.сT-v!vD]rˆ+tN*E2!i+®^H9-A鍃pZ|5]4 ؟VW] ޱ<(PTuq74|/W)[V"i {Xik2*cJN;тx!L1lU3V|81 {%OoF!v٤рn'NV^Sh[%rx"8C |gPl$Ls #.2pÁl,N+s1|%wh;+FAq=&(aKzٔpBR^6 ^s6# h$k8\C$!I %xU?V:Zj5G&ܸ8=HUѾ+8Sܩb*T"/v,%ʂ#5pQH3UAՖKTx=!jY;?~9kQSƦyKHsvlDTu~|>Ws+qvA)&JW j_~_&FґO̓pcyzٚG6Pp|}%S>gT1dQqtr{l"|".ZueB QZr Jܱ]Ε@He?RlQus٢($I Ëkw;b,6gR 楌Lkυo'jʀQh؇ Io.k_y (Z6&8ҫ\ga/cϙ/${|yѝs`[[ݲb=&\}F=f/h` " M3vF53Kf#;E 0 ٯŎ^Z@u]ѓ;ŃލY\Lj{y@Ӫ;։`(A ( ֣%L6G#;L|:ڵѸ߇!V@>Y7h̿4GY;(ۊYZ=񃫎1l[P*Wo2tn+j8Yl0ۚ.c'[G 3y8:)K9̬l^֚GM?68"JcT안u 6-+4M0`oUYI;iʯ˧:%kxDa՛ހőXcg9ɃR4Q!oq ֍'_Rhu|xԗM& ̈́j@% q\Xb~eenFE [TiкQ_&m(UeLG،W^|3KwKyw\@6'jdefOT-4=|KޢTW0#.,(/xwy^g0PЄuZJgܓbBI۲@/~V%[bG_AV-U ~F C==],ks*ҴYVcYMQCq@]/Hgr\sV)jw߆ŚOY|{)MbeQJө &%0_q o?P*/(^8̦8Ǐ5Ğ:bs)̴Y[CRnYoCNL  UQ:[ȭ:[fS/)z51D ClN*t!z3*xI9ypii|P ^HAZygK?#h-f,q?D5]SA0%1͂e".u @ /[jj4rPu0h΋2g. Ymn dWVch9I}\nT\g( Xp&)yj ]XӡD(TiaGju ,Xb|7G!o@f, qS)-KaAlb̭֪蘀mHuMeyoO |)N:-%,+qKW{UӛzQD][2m³57Em^*-СD.@qBex9PDzv&h4㪋Eoo3e8=7(ékЧl7&$F\,+Z5簗D.Zµo_0(2ηv碥sd>?Pޏ2($jH0* ܎;x"`Rĩu@( {{^E3u}o3)uv;2O֣E3x#<ޛ K!$d"@AlD/1(yˑ[⪌e2NwH@tr1T%_ubX8%/qouo*\tscM^gr*ӛa$dp5(Ϭ׫#}M>UfѺ8%e+ #?Mm#^2l}w Bz*]@&H"'']XPr%u0PU~ٔtLfTW ] ؤX&? 
5\c*y`CqKлOuM\ lO5/̦..M@?q}d-|oP"q܄ $Y%6c@== _FA2T(|Yݤ>3#\Bm# U48r.wf6dQ=XT$MtB}8d'Q~ŔؐQ]9y|C0s4[BXpF5yH@x-#HwT5ShNmfGw]Y",kaI!ނ16hPΕbwƺXo IUx Xrhd!y /jG#un\gAnݷpo5xx$ƝXa6~O6|g-C(f9KTI/tsg9Nnn60eޣhզq ٵ֊) I[)yטbv#kϟ3PY2\/tsioFŬY1u,VCQBr \ wVmD."-}G@,jdEPǙKB$ F^QD!!Xlq(la^ݩʩ Ve[T+t!HP yKxIm$12/H8IBW9Z6SgN*ScˋcP ]bT-M⢍J 'Ϸ5S.-ȗB"N-{ Qu<6ϹZG38hP;=~}Ws:ѺLasuJärD k#?Zid8]K)2BG$!GhoҌ18rH(\/!ʵƲpPU[yf@Z:Tur^ Q[7^A9*w]xq#3iSIZ~"817) *}I+%Ker&IS>jp6Ⓦr $Az&s<2j=9UnoCZ[;q[ٻ<^.6ќ  :z]Q *-Q6K,{ ;l;^AZϵ0wk6hH7 %Hy:Q/AqJ1TAÆ_pݴ^~PcRQ1'QE2".cB5[6j^d "1J՝N/\=8H9CC9;;:)᜿w_֓rBmўfRkƃ("L؏+$ذ.icv"6{ć-h uevyHn%,sL'6[Y$sH )ˬuKdu|n8`P(UX)Ut||Qgg#!G_S58h!bLEM(1(QpoD`J]l&A[CkJ\m G PMP^Z1 `AD"5e,P6’9Ro;ٹ  ӾȡC*T8ffa Ά-MǙd!mяaN]@7\ӭ@c'vGFb_=v6>65Rwe; Kު2zbaƱ̕{BxhNp=r1 DN[ Ժ^˶NUB" ,$ѳ P8`_j8 3o7"?լqX`%hLPB:U6~Ӂ~]O#~'YI"j%nQ>bfY4IP2|9E@hʀ(;dP#LV.z\Lxyyb) fHTE|Tap0^KE:j@7_b1/$1Rs)jdmwJ@C3LaHxn 2LMn@"wRn$+5>)yJ ڱj`sްVߺ! YcȖwp$_mB6ja8_Sb>Ss=nAw J1j^$q6 :v^w-l)̜Q/ƬU̜,XW*`~Ym529jY(ASYA&JwzW*ivIr l.#ٳ 죝QO{HKi"" VSBWjtBGd߱7N癦b ]/(Behr hKuZy bKbEyX w8;|Rg29vT2c+J0!bUY"Fs2,=VH-SP}hUhvq:17SKщmմߧǁdHZk@;IpKYgCHahׅ1Wџl1c&ߦK+!kFk}`DX 0cR e1KJ12:MU?y O5O,J^{W i vS-oVy6<*i \,\z Ӷ[fi'>LiK:"Y/1|dWYSP$% RbQЃv;&"PUX2u$-?I;.ޟj_9xE/O _dXeSؙ`],[ʩR#a_ :Q[1SLC-V2-2I֤ۯi3SR#{өi Z_!A"B Gzn4k@zE)\Eo7>|"C",~*ݞ{.p"4^d(:;$Js̐É9'4 Iұc;"0tu2 T?x8==>E$n -[il:2EI:x*#Ls.'W>;WQBLoVufb.- Pa0@s#obGN{BqμzW=gq^_~:3o.=-ԷDeKtJXa0K&kb9蹳dxQpC.PxЩˉ Mta/62(fK'= i# nAʉ|owesnlz͋NpE7t&T\93F.ᆔPЀ9@}Tܪ0$LSi+ϕHb-Zym1$ލ(ES^b1 lIbZFev^~njRPXY$Pa'xTb*8d˷U27eR ]@cjBAPrءBBH~QrHv&_qJ ب^A/Zu$W$&"'S@SAdBF@λ'JE91@Ί"9FBF@dH-P׊:*0E.2Ă a*%b2 +Ƨ?uĊ"+q}^/8Dԇ |hET eŐ& 4 ʛ؀W2=-"֬D@QCG I܌;z9<̫Ç9lZЄ22#u;:wthjT\xsWl.@ gsF\1 s*8AEA "ŕ6oZ euy$a"Y-jL :7* ?"&(eZ8l異']$P1dD I]hXbI$$ \\ܔDm]D,*TTZ-#p,aTBHƔ$xoMhMĿ;ʵZ幤EhU(m fVz=o1!-bpɗ < ÇJ٤9; 請E̒xxww ^pֺm0s뿙jM=lOs߁8sW~G/hm cӡѷIU^w5pFэڶLD{jH}n jڸ&=:&-F k﹢EwC%4* ]TZ: (sՊ#COy7],t#P֊ziR oU@ DddiQF)[u۞}S/q#s(z`5<ʃes뭓F[fD\2A"Z!,9 rJD&EqlhCAw*|] H`{[ z(kqÓa6=4]5Ή՛ %oK7]""}\l[np[U2I@eBHN#UV(OK7vE%v ]/^rṫ&j삊2T4ȥ0 AFSil?<hA6 c1A5Qʊ"ˆs%H u 6BDe"%-4ܻn.ss{Ft#Qܫ;u΢H¡VDws!(j=V A%B@zN%qi^|B9|mnh D"2aî^; J+rW<^^*sp>If Z @ɔT+ۃ. fCzmbs LakEko6ؙvoYm;Cd!V!i]&L4hȀ迕Cy<' `I E^Ij@6˜rs&}VVkoE" a~{.(mIZV0D Qm[(84(zPGϛ Las÷SʈP$SG3تg`xpNFDu!qn&hDD6&䵋BVٴ/QCaMx Ґ!4d "Oԑ2 yBgːbe૆:定m. "rg.mIlz/ pl% "~^ov+JG֧niڭ\\<D7 7n%Ft3Qԥ\LZ?iܔvpxw$|[g53G0z]_YpP /dr=S8U,hT!IX"**ET90*Ft@BDF#% z H㈭\A#㜉.MI-$n.69s rneF׫j\m~۽[nض"),T (1;j,K00R`!yPZS1c[[sMsVWּ @-*Tկpm'GN;U1xNm;H.Q4D }J!$kG J,\>| wC9gOԹ+}q`TF' nbJj+ɢ5ӍlQm~e7,`覎!m0qg9~ UŠȀ O@[V f_Kn}4=Մ7fa=*wMp^IT?`G |av|me~fOG B(D`+AlcS WدʭA`Eo'|{~~+IiMҼuh|\Q@$$6R6&h*l{k~Ƚ_Np3)f@-"^uG:TLU PZS>+ˊ&9su7reCqWԱmu^%Ȱ I%e"P'#C{ElGB1JSl)JN C5J Pz|@4O/59k(GrBJQJaA4BVg͆w9O]5!\BŤB(1ĀAwj>1^`b| ʃ=X"vY yS?mf붜?9pLf, KBg^ c^\c}l^!:#;#|6~a&#bfjq9r D$D$dQlS̀wPtz^ .+sq8!q 0XDE$ ^pX3-*cd?] 8"2(zQC!:ʅB%BJQ@J5",;mvj)>ww\\bqHuֺxZorP*(Ova'kx..-MlǴrdLk=k*{ R/ B}smg3} vo7gWz`pD.6w'plw;2'xkZ`!B@@AO N폟c+1[a6 P I@ya>D P@6h+^*ՉMj@FH127MizLZе+å/~D./wY6$TQ,@z诇IZbdCJp`&(:g$Y(W-s(wsqƸAomIF3k7zGF:> #Xta2:WHr8{n`wgD0TW"tF)KκN5_G10/Fts=|>gMa":3EwɁ "HK)JYX#ef6o%?7ʯw^R:XJ`fADX`mqon-»zf̼4)ބZgExsN!@M*Ce?<{HHBwX3-uGI|[xq룕cQ?z&f*L /\msO\L Dt6ڒJmUES8JYƎ2І_Zbe0C.\ZĂU D6|_]x:n,t&9PA}z(&T>(A {X %D }kv2ҥ4 $ GUU.EAh#0ւ"0C&&ᄂX0Z~E."8!"2(( U[0Z+i2E lw4!4VfnUR)-c[jXִmkI-6mڍ}U)r*!ÂE3 ȠSeHUCb#m_*ָ[XleڱQIb+^mKm}qصZ*6jM̶ƙbĚ%&+jj"ДF*;X"YB0"E5|ZzZb Q ,l[l&$HEEb) L0DcFH*4cB&_M[mM4h,3- 𪭵̭jnM%+4Rkd5#-lTEThMs(j[]A׀_0A:]Vk#"" M,iAej(4RZ*M$Mkʊܘ pٚmrd-bUnJ-Zkkrjܛk{+ZEqZ+4]P&uDĺ(]* Zj$h"qizqbmhڒ5E65EZkjHEk& * Tɍ@ %Y"hysLQSyW腨jMbɨ ֯Ujڊ66ưQ弍]*(kZ֭ϑJ FFEP6PPԵ(Ӈ? 
7蕷~[hcQ4 1&Rd5J-"S)66k\VZZ[M5[o]VNljmnMbQ%34ldB@A?"& 8` :UJ+Q+CJJ6ߒZ*+msmFѶkl ƫFձCh+k~۬6[6ܺ&2J0IGX &tH@`6\ֿ 14E*߳(5YQ*4ՋX(Mj⠮.,z5SkJt5Q* vqN(D(bDM,ԙƢ*hifY+M5Mn5{k]4ln[t|3TmbqEXih)emC XHEFF[bJM^W[F!',76_ޫEkekUҶ*dFe]g&RsD2L` A;* Z{[B$æ $*!d"[Xdd iBhmmQlHzp:hb* .gE:R T* *&%@nTQ<GUG.( #tC 6ҵ7ê @P7^ADQ"ADDU]EE 9pE2͞pD@櫑T8 5t&w/f]UGF((dAo8ETj /}@}dAPo/t fA L[N{hr sPJ9pTnUvq8PUv AAˀ"\Q5"DD׀V*DPmm+6bIX$ EW AUTN"(:dEP5`*uU8*xSł8bE ")e"! iDP70TvT@8pC`*9@`jB( 3`{(젪 v kJCz [.8 C:(7|58T(TA4D@T8Z@C: Tz8 ͊>h(T[E|Ԉ"Db(`&#P9(.{**|6ш S-%jKZƮwPySqDaC #" D` p8 h%E3)@.Y[zEjM@4PMziFAPL#H(L9@R"(x@= pD@!ҊBu>|U?fD. ~U EzkEW @EрANM-T@ Q⊞Zo .LDAॡ\8"x(b Da@QAyP̂Ȃ @ @@ψ]0I7aA X(B8glˀE8\ˊh) tWA.pj" _ּOxn˛\j7x#R-KЃ08JBI^S,(ЁD#U3*N!LJ58Rvx~>w^7"LWHM"I=B6;뿀-K66+G-VP(EH"#ljc>*ƙE8.) Y:^&M RU@Ry0EZ2%3e!gpmIJn6JR2B>\!H30@#z =2@<ݸi&h@z`x'Po! AGQ YJ ,ʑ^zs#G  CGZ["pk>Ż]*o8[Y.k)ɬ 뙅X VXHf}Fx#(ס@3IHZK\Vޫc((Nܨp!&.:`[ h EsY):Tl($HAtwYlC$+gsel"Q,h"h z;m{i4ygdd%_0f 2J=aW%!U9H@2o`匂X8`i295[sKi" U-!p HgdyH@FHX @Qx*YW/ Xbs4HI?%El) US4`E,t JNd熮ȝ%0%%$"+#=.%{Q(IlnT*CDљ y!|FxL榭32BHDPh{SDl$!Bc͐|j9;D E 1haɱ?ZgTw|*i!t/Wc@{nZ}mmO~t7O¯Yoe̮놄 ":/!b,˩cHP!D (/"w\ I@)@`بT,0 tuIRI$ֈXq eV3Ɉc+vHJh0C] $C Z4t7АI o<-]n3߉rz+5RDD@$raA708wɪƿObj^X$#_p2'"+{dfVNf$ ht& 6+v-b!8$0Jznɯswz>E( &;{'j5xo|2܂9'ŋ~Ǔ,WZ00I&S]x_@G𚈱S"2 e HR7{s/FlM̙6$7 &B 511yzoOy0O?\ʋhHrt4H)b` Xwc5k!qoɾUA(&#W?ݘMR-[fYÁMS[@X$ ja擌H$ŤgoG3m'WK|^]!10eONj k+3nj3!cKᢛx𳮐ǃ"ezd 7X -(nr,\q8jE3  ngSآy`I*(srm4jܠH~jAܒ] \ 2yνSq$dqE. ,S N^\bx:5a@[ywF}Cj|}̀W5:CMd[˛sB7 ׬0H2M""Q Kz=Uza#Se\7A^-9*)Y\-O;2|9n+o,Լ0>S X{ -('$uE!CLDą 8gI~ p)Ѩ1@H(Ȅ!H{ u-as&|*Zy; sryok"D$ZۍO>du| <ߙ@UA=NL8̐PՒ5vAP& ?[QH#t\"ަ4hAc ?/ghz8/]?9C?'v&l{;Fl-_;Q_Dj X(xk\Es +abri&݆ oGUm|@L12jm]d@V6ѵ1m},5%uv SEYOˌ@Ik4ؾM bFw %ZӄDG/,%7=ړfPSQv|pDn!H;qt8]p!FD2u;RbטP ؆5N](i S[2z#ڞU-!7<3 @$Dk>\9<s{gu4 Ԏ94vnYe>ST҅)/:@"U UxV\A\T@3{z xns(I:8 FH#3nR(1Y SjQ@ weFz۹ 2 "h:fĊ 8ܦZ:?SGZΆcWxV'~}ICJ ֶ(v^^# =)N8VQMP ]߱!D1k9qٸ Q"R%)`;ehk;:9ab晥}8ѫM"AοVyAj1ج@Eǡ9d%q(rvI#ɨ 8A DD241yc͕ ֏z & xVի; ""O({BT 2U @r:#">pxMdA$BypFgr]=#AWoC`nqm MIYgR+N3 C CX-I+y&ۭL]4~E}sC5eUlU0?\O"NDGy\8k#WL>;ytԘq^8-/C)VAy 1TСՙ5[%r.|qJb@CJ A @WЉZxGkNzM޼9@"Yei`1ɺ}ָ<E؏yiKy StW0]{TG: (3jך6M 9H+*z$#2j8dBCP]͖2΍2 ~>fWWRղ}a=8ȿоVZ]\M]6hCd [.b+yFI&> h"U "+(8#Kw.Ǝm#! !T.#>pH+ 9kIf^CRql:ZGcESL!;ϛӶ4!Fa(r*@N4V| L'P*D$Ώ:`en%Y^v j=THdW/ᗄ6C0gCv`r!Dkrɵa{)s|3@!Kc [y ( uPDEy\0n,E*U/A|`mfdB;Qp h#ۺl4}[A xz|[= o&zdCq/v=ns <+CSdP%K UCJZ " #1b*>M駒jl#dse Jgל˚l1xHBLa-F-ԙm>fa4t!o\UɮT’A Āݘ"ۇ%l(mR%e9vX($y.uQ.aW鹠O_F$ub2o 7H :Z؍)I`lB;%J/_*T3BE ]4d7YcFrv" 1ٶQkm7"$HI VE'Z^zLM@ "wgkShq Aύ*>j &Lqn8MZrG $N=H % Kyb@2n$ma$^wcH=&Ao{ @=%;R̥$\rAV+\ x| ˺j蔸ast6bNPI1A\;:.< "m`_BLK" uU1H($Jp#)c5 PXڿ1_ZZWLҟo/PPtUXR,`4'*R*i-<sDTx%ЎW}0g:n0@&j.\A  0uUB D$o522 H =ے6$#QLb,F* Fד'|߻:%]95Pj*Jv!z!!%Džد^w]i ܉mX~Ns{8D9BEy >=(P|VbT}"u2r,R U RlDs!\kp@A@zH@R8QEm En7XwB1'>IN%P84^ @ɍ,2ƯQؐZ @1R@\;()\'e |IBID(v!= k($]n)3axF5h ;Q42j)ԟReaPG䞣GWmB F]="A X*4yKt(N8k,()k5}y6@^.0Z>%;h> i46qBu Ul M}(QxfuIhEikq1]}W%fF%Dn y49X&v4dܳW/-as:,ό=A*<[2$c`L[\n9 D{wTPĀ0dddERE dt5|jӗJF'u*/T&g! Q"<3P T^A3.!=qT!:4rH,Kf}rLQdRkL;a3]H*@ O1d.A$Nzgt.1Yz "I3!YkhV/ZjPͤ%4㚋04FbEAԣpcJ[&]}[7ا(%($ l;ԅ P,vX:W=Ʀ[YwX$\JO`,qu.\w/cWBt҅|1ԕbLMM LXN=|GhYnD`P5.$.FJ&,h |nea1V #@bmJ;woXO'!5a*yL:׎}Nep3")Ao*gR:3k @#&/O mͺ@RRf0'CUM`m H } A9l}Wx֣Jl@'l!z?M -I&:Wu>ەxAA0΄sPR :of "i6YTH D N52#QU}ތKYfnq!Y[G$`6¬hG/Ѕ\OE(=LfT )Ib$Ԣ뒔&TH*?#$ iF&C 6ɳqX͙vv4t6F@;8@;q/XъG 2y >|9K磪F d8ph+Ҷfd$FW\ R 76s<gQhb 0 x7 qŪ@F&;=5@(+P 9շ!ah5-NH'B6̸Ƶx5?qrbb#woS> "WЩDC3OH_6u 4V^B`@jQqf]QC4tCxu^ UcL(Xx[>]$R2-H bcQ«)(#׆fhis].k-v*'BK"#V>:)έ{K!͊X('^Ϝ(hiVӟxY/5gf~tY V$>Ys +Ӂ9#ʦyۘl$3[M lO[E:>8ӐuMi`O]h5ėq6HiŐ?k>VJo&! 
$r/Y]5T 5jjPH]x+ ް%j+kk.:/vM0ᔕ蚿:P+UjT81["3o?5+gxekUу @oOуѤ]QHWan ^= ExkhbW7 ~R7̱!-`'e<#J6:b2Ut܄ҜRP;Ɨ=9 1]F96A̎i8#7yn Q645@2'I=I11En\ 'RZ-T0g d˾4 -J gJ'dZ}nХő=.nkNxV@D (un6su|r\Ui#*;شZL Fe4!\ ejm醞Q&'}ƌAH J:ANEȇS@l;CNLEe^RX# [JlgRAc֙7r/  h/:bG^T.^;[B Y[^S1ɰh`bB%Ƿkr葷j5+L2PNfbZc6%4O/T5QE.(A@Sy 2內AR4nNh8yujWiBd-;LL ŵE,Jf:H8έ(!?{b5x.`u)!ƟDzHJ@'NޟzA̗J ژ"Ay@$w尫q@srMz@4$[;6o;@H{^/3 ǑLTxĴY;&(Z~=!/<497SƆ~Q)+An0L8I˯X Hlyĕ2ZϓH /%N+r'5PI" PŚjnk㐁Z-;8%x$!O_Vwg ! lA+%V4TPIQd0wDŽf߾\#@)\ӽCM,ڑ2]um){Ц̲xKTegqwBٷ~ļ6ޙ`0+ڕm`,ZFɍϗ0wn ^ߕ%A .*Nc5jaDאD`l4Thx ^PT=1Zᬻ0( e:ܞ*9 f5ȹ=3pTw1l -ݵD "]s Hr0 6d8t-k8'BDQƾJ4R,=f9s!nTǪ}~];7^ `S(,D'D &'{*%pVF)Sՙ;E fi)(b0ѯbժ$i.+ 5eluR![iiߜmw Oźqeo&|qj3s(~Coq+xXaDil6Y^yڝ-:Ǚ]J+2ZoέjQv^IhX8^uui:r5ۧR P4E*C3T⮣L:Q)ʪt0tC@u%הIܮbz(A;.ܫN|>E`NjW J2Go`J@e#רSfс'M6R3)xQPu/VPՖxw(Eb*8OAXV#aLfELrX(TD!MSGS@l$ipE$MmS#7b 9yz(@l`[W67gەkҥsX};s;2{L!!4G/ «Xv"7 #AM9b:|uM^\|sW/1R_ -Tu8-kb-[ rV?BcMkO}A0dFA]VÅvg SfYr6' u{KjR[7~pGCA]xDա&Ӣ-CϑH+XP4|9]40Kº<=*S<6|ɖ;jn:@܌{IĄ \j.q:˭Fj9/=RLoNWU q^VWETFi q,+ Y=!7#fRuK;N;ȅ] . N!;4f>bVPh2XQgœG_<1~m\ay3ZDހ8a̺RAa_MxNJ|+v_VD-+s. .1+'l- ']#|_$$2JR$Jf?eW v0?#䗅Sԁ,} tR:22%9r@|gD[k4+(]- HH'5Ҹ9af4Zb 7b:`6nPA!Y0h-O2Yp]>s=NwVt=B"DQ"P(E sվ11'ŶFw}33u:j.{,aLGFwfv+q!ipXcNÕAp/S8ebM(.H$#1AeC`UbAC40t2#5$l"P5(*\{^>wuE[͕3$!$ Fi01첳gywyK ![̙PY@.Urn;e4pWwxxѸ3Gz?Pޫc6gG_\g'1DII@>B2:&lb&"$XtOFl4#qci*ǺnW9FY0˸ޕù_si~'h+ŸvepY|1!^AaWm-IR,Qmsa'B&ePOMo3i!Qh:uR+R@Y#nƵXjp,0{_Y~ݭ?{DY  ̈!DMs@H alu)Bj#6TKy[#-6F %6弝7m\|$  mQFD#xgvⶁ]kͤFgve@0ΥT-( fTϡtv-&1fTSFh(3So&Rfa:-#i7)'k&|`/ofmUt$`=n܊A|xT<*_T](*`}H"  a@>*"l -3`( ȊȢ$?tTU (T_ PA}OѰ)n&V9@7 '.g/䰿5%(Dd@&8W'XH̑+9#sCF:0;(q ;,'\D \g|X9浢bٙq MqHl{,K6oYҍ$e~$kF4 wH/)Es1"­(-͓-hx2E) N Zy9gP_C7l,Z$i4#)r-`zfg =3g΀E`ţG_ %S[tЋ_Hd^!BVA#W$tD@k.Փ';LۡPb؅>|r'V/P5zNB*|Y \("h.yJCСձ-~l!Ǯ҃dd^Yλ42G؅%@>P1vˢґ+%j(|NI'X C !YeAGxVR gx[7gY WZ+/FUa#l kR#4u5wـ{N_S ǍKeKjçwHD?T׏8@IPS:M OQKؒX-YIJbHY-jz]K&ۓ_hOE e&/3ťDj r IDO-ղ3#X ^gv!ߛK%)pǺ-UsuE )bm"5XW'T=YD;O8RV@"4|!c91X!L }3H ϟPw@4 ?"@4 Uz*xa iP4 Èx*'SdT0AxuP_&( ) GMQUQ dW%|E`WY!hTCz2D Ɲb4.DP{,wJSPITFZZ*AeC B ~J,HAdVH\>l! 
Qg>YAkNLaNh:[9^ I,2{y o{Ha՝GK.ugqT=o;M7 *Vt:ҳw //hS'R.ݗ6^4 w/26ąagSR0 ]UJB_KwsP|S"ZJŰ+4+:$r"hU:~|22ȷPy3 8(2p$g`eYmdl:ȧ7ZG)3'x+I͔@#SAӀlkBT/Ad8>1Ur,6s(&OxQkb V9jUI CqTY8&aY 9ƗH1L”YO2^]TtÍg+)gy2lQ&t&!$ "s[8p[t5R-F!dmg\-Yaw˥6ƶeڡ҉xjCVޖQk_ jrΰShlEh{X+_qOJYQ P sMɼc13%Ha|g!,3ԍ^@ JBu4b-NM+J~lg5s,)9sG }ogނ'* yT EPD:>((0ϊ{k+PZ{~t8 Mmm̯31j쭬FoL"BAoWS+X砙 U+7(t@&mllWLK疖TܪLv{v̕w?˲~ŊW~G'NL@D(n%('[TCT'{|8" b}@Q^{ئP[ ā|cbEoB2s|C9/dי!5 'YW'@gY *J"@ ʺP*B+^Sau0@q#)tO =k)8:&N) d7H-ca?,p6M;XӔ5><psmf &.G؋2V!)THXQ.%bz6<Б`6A@)%"&ؖ${&$ V⃺Q2 ̩h*orqȇ Sx %AePc !.d84LW}"QC4kY{5d_"w05߹ZܯC,au!58dIC(z)+[41UA LfyIn-ЅeKrHyB"/2uXF힗#}C{ ɔ~DkɒgtM; $A%BhTR@@,$o@-?~ y\h̕Y[ik@GD_ 8CeATQQ (?y'G"* T;>07W/Epqgj@qXБ9 =19BZMvBH?hj!ߜ2ڈ0S,hƑmY;r9^},}.7%*`\ߥ cQհR֜21v-0BHՉD+X9 JHԊ d>ӲLň-&IswKNu FB T|W1uLD2Q>W+hbrL}bdlKKi5q9)I{ubz2KMgh'b*#Mu˽uhثQ(N6MMd#,,gh("@mYlt1K[.fM}U/v2q !DE>u(|  "}(8@PKb߫}_J"ֲȜW1d'`pfT:,N޶X{y{4 j-@|jBY҇myx8ZyO.C| "46IGGGqކ<˝$]W>._,17 ׶cL+.[deV$4nQ2nl]Fz ?Q7ټXQbt¦%-.r+PcV,!Krn c#<%mD7ZqkD{6o6y !C ֔C]02-Żvkaf-6?rFڗnQNZt#eoNz  )GD[Pk8m u70\Qߌ8W d˩DUA_r='d%;s(8D18al[ÙAE英,H&2@+2"H?Um.\m弝k Ѽ2DC{ńyu=Om 17a?)AH@[T<-1lC`zj uZ{?ztPz(/_Ȃ+DD< |D̀;omewoaǹA0 r>@\@̄1K7YALac 7zg6D}A'|镾 C7gb:]y8";k_4*g;Ey B~4vv\.b)`lFt?qFZJ Rb1)wPg9ӝ(eSU qn ,%蝢'z*]ucLS#TAUC7]:oBk SInȬ!ځBO {ap ^!yO1!ؼRnDy"maHicP$^aB3CT'g˖G Wܪ>I Qa6$,fx%:^,'> 1/ 'ia$8Y8Ѕ&DN]7uT]8&ʞ8$WєV*ttJY,Ȗʙ5S)؏ nKǰ춹*9a$kfoWUs\[4\WW 9zݔ9mE SmrY+_Uy(u1(fW-p s  xH9B'%}gi [G2I6#30$X%$pHL+ "|Utb2\h@@~%ׄD- dP G-dN]} ]^ZV7x=|R)"uM APԂ|Laa*Wq^)ԉoCN)}QȮ /:g]gWA<yhvpywDCʂ@* nU;"1/Sm5l wU(&tq}LJOK{?,Xչgl :S :Ɇ\&X_ߝ=q 7j@apß-@"ǒYzE _]j̒!Dϭ\CQjZ&D fb!]]i;^Hn12J<+ d-fl3Wl-W^70s&Qř@D oDH- W6d&ϡh"QD RUIE[Лl[Lnwef3|Haİ n-1l.6@SK!;[]3M@8T¹"ۗ1'BiSb8,.<; kF}Hi ׮,֒|hЧ]Dh)b*3t!KngvMp+X'.pچELSy.lnP:ܾ3Ɏ$w@`!4!+Wz/#iDm{2uzp`D_Y %J-3S#H?g$ڑ}ϡtpL6G pM 3>blPW}"*F D uDJ"z q߇`hv_Ȉ"1_:x&^nϽ ;cZjjέNb\ +UJ`7(@XEj dsk-Q9 -V*7qOKBp!nLP>gCvG8]Vivkd0B ex@@@(.,`4=OG1t&Rs:jb۲Ø}禹 N)P#"THRs8~R ^WR$Ka=~V+s @%9 Ga95)E=GJP,W٢ I.tPN4p}æ? BMWǓoηet>sgpU;!bdER f$ X1p ɠ˶;=is2N|#S ΁ 0S "& Dqb n?DeF)D\ (H"hۍx# @:j<0E aկ1 NI>' )GCѵο:Yp*FYɐ 3iB&z͡ݫh@&Z+%^3?TLDQEʯ\9P)h[HE> +YQNmx ЉGH@/1K{r)2/rcqCd[ :$ : J28lM6^%Fx>Ŝİ J/mI: owtt0Y#/y"yaa$H(z+R<40䘕ṕ.ĩЏ[ xOTg,K=wC_Z Y8φ{( x@1bp"r1NKnM}"t.2IXɐҋ @@@8q]g S~ψ1gsy\ Ga܎#~JeWM[ #ٶq`vޱͳcw!"cl֫:Ķ?S)ľ#]ѐp Hi; D'G;lbf{䚬 Ւ:op5a$u L;`pHCx QE芣^O s9vH~ {"TΊPB NJ@t;d5Mwo}QW[dxYI"& KZ**-[9rD*+V*HyR\{uʵ3 o oYh jZ~7hqkQ,ݦatHvaD- S%k0,wbוt vأEX[l n龩q\6P6Ц p(,PHdLlN<)d9 l8{ tE&K/![,wbCɞR-pg!è"0\1Y|)`ağ* ]63;IE0`噉mRFFhB%[^v-.N܍%j]ܸnwao'Qà](0Q{!T?tE71/&/4ަT*!4OQ? /UM(u)DO]PEHO"ᣇ5$/Q(VBJ@rj %YTǾ,dQw$q}S-oqW@0+pr 4̀z*z*?<ȈV6-|1Qy%Awka<8}sHQ>6%^I(s=E=QP^$lX)rs\*wPNhCQ^vp". 
sS'!vNtΠcġU̇ bC`hd^@pfn}6)/ e:*0+"DX˸&ệ/yN]7A,E'D~N wؼK#"?~3˚^ cD$@$mo`D#Gд^9:KC~ KʆNARpmb~]qL|E; }>|ՠgZ:n;/Qī3m) hH(K wv),uLF̯LNK2ě?s}-/;-Q0d#qۀZ&ŵV7FvFjŠcTH,,.ˢo}qUE.U?NJ3 vus`]|Glj+%FnVX0|Wk 7V(ynzݻ,?JnSf}F|Gev3Dt:Y#͉ Sz?v0\,82I9c)r8tv;75ٲ.[ ]w[s+t١jcꅒX}/*UV~d_32vGOE.,}ѭTyJ-"7/۞o'| tp+Sm* ZqGY-Z7tAicqgγB lg5ӦvkvI \,+K(q[Ҩ;F5#R;4%zeE={ W6/ZO2jxm~3/c-cG+1KU]O#H\Jx=lxOșߪazWc2)\<&#qGulWo- "vvE`֟MԣY[ uZS" 7un{d aJٵ^FeÞ7ΨY?8KY(o@/ys*f*vmoBFuxKc |岺/Go}=}Sf~r8ruݱՖB<=nxtkF^:Z\5r?KO( K K>צm/p3֬c#{ytq^iT޿_q"Zf c#}GH<_mis=.xpe?w&5M7ӳ=a,h8 XjJ̭kd.ݝoax25 [Uux<]\Ͷ=GVΥEeuKބV֬éd3{OJa2-ikh!j |: ́9ICәgػIjeX`ޭ,D)Y 0L U:bfV 72O418VPמo)X]Z]wVo hTWl Wh#3Ɣ\hdly` gDRFc'~فvTF: @1Za..k"ictExS.O1_ '$A{#Y~EB_wɯ[9z:xp"'>݅ +{N'|# *%fPZ>4j2k:|Wx91HPlDDH"H22(2洮k?+r]AS"ڵ˜n6ʽNʌbyIyE$@KV I}\8#%,r\ĔG%Wʅm%RK,?sW|dk|iȣQ?)_}Yv J>j ohCK*$;I aT `SN_`P͸pen؟0 !2m D{(hi"NL 2S-ly\/4v{_UO=qcA.a{Mџ%xOמ\IUP}"ROc+Gmch UH jn>34N[?iǕ4s5>f\9^%}w"=d ͥ*ڕEC[i2_v|n*JbKv{)+p2 |T!2x_Б&Rz%pק.aIj%,awܚtȖi\ 5Jkϧn䧎˜'՞%O_DçP/k;$3ؽiB@TH:86J{:LQ@05"}?C}6Vf-d~_i]TC♐d/.Cm1)Ka}q˃$!nMNB>2K!5{w鸷,@)ECX:βRw(,pFT 6 &gh5^zq1R $yje,{i>}̄zHgh3$ ZH-aںvZU_`\^fwrϋP?$2*Ph/)NǍ[dXl"edz|yr>&%[hߔ QxjY{Do]&OJ|^Ymn4ʻb E~4@Q'т ,܈Q㙐";?iTy (Bt]ZVwAwP{< 3ˏ(}թ{݂T6z#_uJ+p}D@=T E=TS>(ߴmL;ӂqGo϶;޺ 7xb,;+4 FernMHKaHNqQ1}B=z=*/]N~}ǓW fW)6__F ]jteKOC^eiAckGDȐ :/QM ֘n 63oqJ/ёŀ }(󒻚=l̈. 4? ;2#jnrN tp=cY[D8R"00A"ƈ(?>kGm#PbQ][G1?LZuԬţ\>X~ d:nu:+`r`T@?i71D Sgh^B`ޏhp;,dȅ!"q% @[i47oSڶ,V;rZXIɥ9" eL!(oi tB ! onF#DV;`7ߏ4_m_7y"tO Cb?u^};8l??#8FdI0#!\:oT}G<}'M͓lvZrQAbi'}!k(qr'fX"pE>$ )*EKV2Gu10(Xi(p5q`l3C*FDawd;cYoδ `ڂ/ '?g"T]HmmhD xdΒz5lB_n>g"bgC0G(CSTIu= r,fx_!&la'1&(=Ua഻21uwnzS$ux ȳ9d\oZ@3 $.I%ߛg,HSf8OgͲ'CmNMڽDKiDÿJ҂SR oDb8qP pN\<8+ícx{-9y[ a׫'q~yEwp:7W?圂Ac!ݒSM|sbˤ/Mb}_ۋզ~,P_ĀA@{jPCr|D?"~C}#k)T93 2g__cnﱖ*F$ORNWZ`| EǏb\ Մ@u &SMW?itI{?ģhת3Up'd厅waӽAc9% ̓vuG?IufN6 }$<ؓ3Z#V\}dn;й7*Ab U|+xܚd?F*L[5h) >V|wVkQz$OqMZKPX^Kw,Rn=ߕ-WrW4v?%'l̗29LOxG0PX|ܬ{#*UlH՝e96%F̪8O ԶMʍC[ׁhZy?[6f5 0ʩ/6RSe 3bv/7Qn&"<(*ʰj%Ѿj7!NjQѓ!^X2Q@{gԉr#BgC\ R!.W~%ցe8-)ӏ? Mxϖ2).mZTi[$-/ GqEYFj8uX}ޑ3O^&#O.J8q}OkqwKIU^,禙MU=~<<.ˍd -Y߿F-mY\>fO-} NNMB&EK9.[Yj! 
؀3wiMӕc-Pl&2>NI$_itN\ |4W ni3xVYfYWf22+Ōֵ ̠1bF62}c/1+mG?'^@?+|\fS@E ceNm& X%{;_>MBQfVHaI.!h^M6dpQ0K* 0۲~^#uB t\`90$ᜋYP :b%b/B2D;B2I )$&,-\!BCn=h$8HI[(l l軾;CdBHAgAD`8lrLQ"H@M(M1Ֆ[((N,*LmL bK\9Bf"w 2q gHI LD H $ -[NF9VmlXf?rIv,{r`SH -s+AziPa0̚HtHޢ).y2ח.b&@9.n~XX/A{&Re!0$m ,amT=LLAO"ДC>M{;@`.Z3).,a㘽>jHxqʭM,zxRD+ R9:/h5uݼ:,VRK֎ rP[H$t 9QS!+̔_Ji*a ճ_g~^^E=e5#CQx rEE8w4j+q _-E%<81wI4-%8J"[ZcKv^i3JQc5…ZB[A䯝ɿ}Tp$ L ʾw/A):հ 9oWX}'i{#7>k%(sw[^_qgwď-UqILZ 8#PF[ɪRz4iNV9E"Xxaĸࢀ_FTvvN|^pveIߜ9D)xl508$9uy~ޑ)xT')[`s۟gE'p(qՖ#`*% cOL>rVIW|rz_Ph%0;x՗CKPh_H_nGӷUM/s>~+MlEp=\ )?"9z(IV6ǻt+^/_Ðs(DGdpp@麔$MQe"pRWi_$#S+v$_OvEVQ;&](|pTx%v1`G |qS33:z+;=C LHAnRQtm a`If6Iyi@=@ 'K~-Ug@X6jXbcys 7{vXO ~Y2z ;c s+9RI  {E 1oڽ>ޜ͸Gm:t9:rܹY#۲ek+[X?wzJ jql{=?Q*ֳ؊1>U/%Tj?1z~XUfG(s_oЛ8`Vsn;Yr#rI5l;*"if7 W4~@ۢ ]$gyb<x/Ť1;WC0(>ԟRzܹs5\Y(hq8m4<ǀR=Հk',"L8g}?2mhz6H-OT~%-ϥ4-ra ++!sT %aq 60hX({9*%reD,_:I;?Guו)ެ"sYctW8A--'JTA9qz*$(]c3XA,#ZP*Fb_KC=F@8%B]r& o3oI%Ir_8+T!a9(oVMbE$C* BC0Z.hUNjV$ـ,쏃/uvLB1%pyuV_#e4C'9|w̨dx|h:S*L0Bbw6Xb!?:hu#ɯl` EӣHIs}?JG}tjo%98BZ6%sי35^MH5*VֲaLEe)l?Um@%fW;ASOZt=k(s5+K+$z[6/*._Ns"鬖RO}ZiAyS'q嵎5bQv99WA4_6ʛ[&8yiMyɍڭG=O 6o{ƦfoJ~gup" o4YՐf: Z+ =2gpѪOv-OrngXNkZEe'CS$1;֦oM1CrVc ڪ7yI֚R62 B;`253޼݇Ӭ59! Hi K=ԵQ[&ee 2_Tu!}/lKyZ,~.n2_J=bOߢGiΞOY}L3W{\ʜYF5NG?Rdyb7s{PPI&DBblvSxfӁ eAcs^{j*SV@=j'~- )[~D0;HX.%y9 'w]Dm\?!W!`C8KUJȕ %@l0_OD{ 507dlE;I,>gH; a˹/ϣ"ߦVW٠ܪET|xg~Eqz@ݝŋ3AIpx܊9󈏘(.@LC&@c!b{ldtJK/݈ Bi!ާ e1o ʽu;#! ͍ɚf2v""W4섶BtLzԩf.$Eq0,[XS~9#K#Ͽ7*JÓgbc1ΔLVP4\k28EܾRrzʈ:+yPG6)()<+GB8tsl4YÝ.1) !@H 8I雁ӓr Ao Tb3&(`OηC@/[FW ,a㙘mk%a$T3"B*&MȤzD1oՒ]:]bAˍ]$DIkELt좚8XMFb8v#p0G s3.csGb&(" VFfeDM7>PU=|Ⱥeղb*3͙M, h(9q9F$Mtl*v3$HEɆDLE0Z$wh=/xxQ&"DAu#o uF^0&chzJ"D$whGBBbϝI]-o8`,,:4}*>ޅ[> ǀM3soPZAf>h,H}=?29]tJSr|p&[BebH O[ EVpH$ SdT?AXb|å2,I# {ފ{|sj(e& 3xlyt āE$2!5"AC A,X,D2`w*QF (b- :zv(g` 2yH5(2 &D8(Cg 4JN;-.k (hQtA)C@`EӲDd put`xt ƒ$DMTQHwiK8Ã`&7 "iÉ a1kjЇgO@vwps;FUX0*rcM)9k-!PC$1TitQY.&*1)$ֳa^H !-=[BbOu H_"y%JZlѱhɓgħo]QRT4fvl3 XZ0mX3 ,ָN:M0fwMH-`2 5#1;>`d+#Awh$ M847c8 `h9_y3T+U*ʳ) aUR񢠑Ex7]=UFz6Óx/7YyHx/;1hDxb|;WJrUi[3i &ʦ}* ;vgϨetlWhIXP'JS ]!26D}ԧA4JR-;"ōkYpíZh`Cd';^KeOb }ϡ%f@+˒3њX#c$@xC:ی@1Hl[7M2rnJ>4[TUPc !gG[lgHߚbPp&/Nf>`0uOt3H"=X|J"#E]úbhA /h<%_#P ͜LH> ĩۑW%_@hBF֏s3`ɕb|9Q:C HdGTM`(N4wz' 9$,Z/[PA)ҏ(%; GL6ݱe{6HZ?Ŝ2vWĹB^@{c60zi hF&4+Xp\}f*oƐ{*|n{H2&[/ʊ!OM|8I ב0N,DSZxe<=,i;'*"I-&Ӿ}繤V;([-s"CDeuJ%PlYe}]fU'hV_vӦH $yP4_,lbABe@tG9߷Ba*Qee#ΉcYB\U4G* cW"Zhا^$<>?:v&jPm1l;tݗj|(ȏV,xFMq-+0|\̼5N5"Z0В/޺HR`މܾA/e*OЖ4m~,#?'3{0ߒ&܊Ez)CQ\4˦dWm$o{ev:Ww^&3!7B=DߟHP~ycnli\̂+ĉPRL$J&" DQw4 Y'< CPCq|Wzçjs zmQmBHa酪=׷;1Wyb<Bm M=:>]ct[m!(?Q@\ YeZK\Vx?mQ $c~(MӮpN{o,4lP(#~yg:hΙfœ*NҁvIZ$ 5ׅ& ՚CFS}oܱYj%ۣ/:G_ӴSik1 YG峔8Q *K0Ts>GKlƳ%hO σljٔZUv EAw:`)"Trv8Yu?+r|$T:(huĨ2D-Qf&ΐvU` -H s]#QBJN`(. RIk iKKKa pz~w=yRtl>Tf~a03B cמ n3YF ˺9Z @2lItsԠd+74*Vs&ʗAJ.XW8 1&HĿMl^Mܩ &# UnVc31 /޽$W-GCinQ H)˻g{hA'ErcN#DK$YU7qrv(R1+\!arM*SG%[ǯTv,S)#!&~@_vԑe9ĒI~?OjM DMHVxaq|`C[_ RxuGABx! } 1겞+p@ѝ3Q8YFɧ)/Z:@,md`1B6zC+yi%W=keܿ'CZK i98(# ]*mW(nmn8P㟲ZsL湜B .4( 1wנ`>8UU˶{lSFm[[xP> և~Sx 7#o#XY^2 }q@M~5F{x{ a܂=?lݵC1S/ſ2'F}Drbu ;4t0BMNPmTlE.ڊ|Ik%3Ȱi 7YAmS @ꔲv3ۯ0P'֛ { 2EkfQƦ i 'pKɅ. j@+ 3<$r!0XDNy՟Pg. z2zk@ۯ5^&+@0̜o=v3z/ lJ@q,ZopWovP~Äg.m_uL}gvؚfEB4Ȃl%^[vߖ T1@Λ~,?9b1HW_'yT=#ѰcLRvFfJc" {lExdZh=+y:RI]c+ǔf+đZʬ|u #(I#`Ž%$tMm0J?DE{W^M L:׼qr?b7ȣn/]%>VPj` 0IYuXS͞|^Ad*u<,5 zB>auf'O)?{;Ԡy4<7΃r/aeD8DsD wy.TkZ~]zP !iJG TdAn]F 1fB!uoKb<Y_-( CAX_#/OwG5O<. , mEB;pO[Rx.Yԏ꿂 >fEPLEmõہV9r&͋3Ϙ[ Kf GxLB"ZDw:K_)>6ngI-x3HZ"!@v=kO/p[ Y#0-C2=e+6APBpդ=;6}f(WmPG )bq!t&גXX}ݴf5h-few -l Feü*Xšb'"G,=eֵVRx/n74g]ixuˋBR)@.`ӌ%Qkl)as3EW6~&+Uo'{W\/q@T3џlBaRBw~LZ . 
j6`C cIJi u~OhdF<+5l\Goo>V9o]]O8D\cigCG{ם/) d@R@DB0/hQ[@Œ;8O&;C\VhIs]+v=}Õ`+)/ c+QKueln$= iسKArl*||,o>׎OC~;%13s' \Ҧ.tE O@@0sw:S?P/lz)5TZ2O/­7I@9~yld81 0/P{U+G_]s˲ԚA9)W!#Wvz/Q6sM?T yWbO(gSDw,uQ\_J2[F2}~Eko竗B;V+2s. H j1>/u~ۮ 4ucmy#}^Q)PXRX}=ǥ|Dz}'?@݆ gbϜ(iUDzj|×!'H0O񣶴>۸??c*~C|6'64Vhx aR:V._[zS~>vW*h1B't)R6x&=6Tj/}"K_nwcU"vzieKc x -:sh}j37Gٸ:}ޱZMÛ_7>,1)?Xx]M\5 z-VtלoݟĠ%$߯~޳Ʃ\p)V]Ve El=]y51"o󗯦w@ipW]|}POBOY{ ltC847Z"VO0s.KSaGM/lws3} <ՙUkwa;Y't@o.'q|#RJ%6$* ܱ }ۦQ=ˀS۳J˙mp8yGf7%\ND4g*hݎ:~pۺ;'z"Ojq<[v1lW{{|\ *b!:CJ,?j%O}wcܭg_twk&Njq^Q_M>S{FKϋγU\T)=~Z<-NηJyI<ߓ/ҍ3Z+^DBoڱ1^xK_}nI]{֩MOwTi b"Eh(|GϘZJ>RljR@Rx5=;S _[{~vF!Fd_EA{X].}CRĶZ|Y9//>dBH ;SKXA4q0#b:[1GjbPB]|_0ye௜aJ^S6ݺ b]~.{_WCeJh ZS.p(c^1ذlU޽ͨOqaKǤsG 1~kF5/q!kkkiEl- =zqlCiMPX)g_IRm7tcnƬ1[rTc%<@@[<2{߆$5ي _B\IWe@.UܫXԑiTױi'J-x?S)o%۶`ѿl;4mKwljg ۝_zʏ#udrJZA FyUrt )멉 ]/^=]dc~L>vq{^?ƓPЙ(543x*Cz&Qv䆐 K8TݛֺK[z3֬y]:֌M"[sZ~< 䫙8 z-Ü >gL0O.s l/3+&J9 @oҀNęß'q_ך UMkq˯/Uz*ez}<&'7{cBEfb2 dccH%4Ƿ>L˷?E)ܮO=3Le-WxY,v<fەaNĵ-nQ@s"ߪ/\pP[<,kyA MۗGD44cߡ R<+4 !˩4.Wq\&+? SHDB$2:C_Q!1z4J?e0$,I~9oU$'TedbtHUyL䅄O<_&[ Bm/L;I*’~©!Gn#1ƝWM.o8ܻ PIlxSJ .e"h09ܟŏ$d粙 6zW8Xǎ.ok b4=).했MӇ=@Tg$.gQ}e->(JY\ou,A@I!wnC}ib#<>9WfEÄ]?Wtr XWJ:*)uig<3tn^h2a4ވ20ȵoVyTMw~~m1etӺqF1Ň)F` g+$m7j~O^Ԋ1g#FEEUi> PhsyU+/XvVXYPiZ=G&Nh[RsٕO~V=/̪ :<1xNqM\a" o;m|fPsuRPccZ\LXZbx3,RS砓O(=#V[7:v fw &8+sTT2 p:-^;7{ZUmg_ rZ1 Vh};v6;^aS*;iB|3%mL𐄱)Ғ_Am}_BK[I9oSN#ڰ ,~%'.bSI7] H6HF#< B Gtd#Gđju9eʱ\o[Wq@hG:KGQ80既eA KIkFeלAÖ֧B;寧4') 5,MUWt뿁uG ?a?šJ_k74IhaA }hɯv/T0ε sgu,]s c v(ߴ@ Єk14#-r%鞛 6 G$ט:]ňM1]Qߖ/_jr_*DB;nT9σb B7>s=:_. iۥ$)}7/p 1Ӱ+Ȣs#6%-PW.4kCHF'u bXPbIC*Dځԗ" b׳HySl ޼>U{l%gj=O }w_eZŧ>~;Y"I_mϨ"w_w_JwM$f sKyE|m07ëd%OqY36 1 !Ukyn)=vM.ẀEY3BZ _}~let1 ><$X8ri G[urk@0r꿌_uEzGJ 7|(SM%)%O%B}QmU90BP))4{FUw>p+w1}VxPg>;AzW)|bGcd!NXKUSLVh*:w?P G@GP`aB[4mo<"`j0q|ֽE@Y#-k${ M.sGJF6S0Ooa RTFfP)Ԕ!`Q jB%*DL rd%#JT)T R#*ѩ3Ȩ38Hi hb*7ӧxU :!5.x:u HP9|85g#}K^п @]lR٤\Tf]f4wv(TbNQ+ UM9%κ~>؇)e0Eh-pROfp-ADZ~cǸ`:jѢI=]䑸fm4X$֝/`$IR< y ?tAc٧(._I̩ 9(;$I v%o()4ҁBJ";Dx;4q'|^;_D 5"lM6 7ek/8%zq5'L`gfqnjf[[_dK0EO'RXc)zao8iO44RD*0)bjd}e%  {<y$}T#뾣=L)ҿI!ՑHρṪsJ_ȓ C.}D/=OG#?a'7O?*$L}cDIJO(J ?r z2 ʾՀ>$GId!b6ZsγӠKg[Un^'Y5%PPO$Hp*>aUg ='5ާ!Z;DL)q,7Ui8Ę8(X^.ܻin&t &8w noHإ)Vtt:g;ؚxVSkC_@Z^ZI@:)wg, oI7Ӛs;Up/ jӊZjbb^,vp԰V{0 ƙAw!j_v% )J_ihIu<~e| 嬎 uTg?KħƷ(Q yAjN9bϭ /J(: }WzsI,IՂMTTl,|Zrk  x:Q7*ފIgJ= eMډ TZ><kؠ޴˺4t5;ĥhLgzpE6S(" . :`鳹E&V|2(ZEZ$5 )JN5}]ySli=_pk0iT:Su>^Bԭ_aSϚwc%OnAL"p6B~?Ὠ(c#DZx x=.,pռoABμ΄c#wgo֑ƨeR:@2QoPItg}JfQݫ)wiJ4Dre{"gL-:)Yr~::ȑ`BoH2$9? 
Ry~oi;^ˇ|ä,ˎwc5?7]7^E Q ti=qf?e ton!>?)EAJ[EC ͋27|/Y{~~m3tOcfEfWPx6KvaF?1G*~z9xI2%֢0V}T>^'w!8$ k{J?抗&5mJUF΢Hf^ ^ȑ+*00%s4-Ыw]nwRM߇=>}E=o~rSnŅhG22k-MkEhT-o=ǯL:|l{xmV- :te s"ZbU7|h&C~ =;G#?#L7F HUru2X֚oW%b&DVpRV}F_k/&(?<ڛ>tu@ZzI MEsY Tծ=w\ʫʋ{& de^T R:vX!gfs5Ŵ ;tm{UQԑw)05ټCrÊ iR <fI;׷cږ =X!V gX׳[ KhfXR[},[e4¿K[,k70,0N[OÁO%cg Cֳxnw4`LG@%{|]l=8 L@hOu0 0]tG2HmrQD#QuĻ#$HH.䥳.;NZG5+j\X-l}U#Ә fz a=4)1'fe 'dFyǵ~u @ᙔ{UXŝ*F 65]뫗ęO^&ZYFg+{S4մE#F.4PVJ  ד\ȯ#d.Sen=~Ó3%JVG[$UUB'͜΍ޯr jB$ Mk{%rҶ#rƱvch؆aHddd j[ZVZJןM⏠!,f2ND30*XnaϺ~pkYafxoͫhM9$)<(NfZ|}:.F7;ey{H1WYE@a-Ydy E>#ttRSR #͢ @ܥ!nC.>a屁 ,4EM#G=}^s^)\!6QY*8 `^пI_7 %#8n}N)a8ϰ~2zb=#'7C:=s66_e]&.ЪdHT5̊ 3줺^.4c7=FKʶǖFL}xcm( y󙫹ƃ%{3O`d.ghXKҌrE+ێsJJ !4M%$MB6VS/ž|{2>73>]1(I2!\1^Nu䋥Ô ";[%^o?cYrt:4wq!)Ktl69Ӡ75tS$2.'|١!-8ݘh dJzڷ#mUzԨM^;YuN`><M%)@,i`lZW-+ivkW acdJc YekC]C:7~iEY)փ0$Sv`.3 E6UGx\#݋?f>QGBVCc4IAڔNrR1XMUAm=fɣr,!Sì|W_IT*Mt}"!#q.JϏU?V|i={t]͑f#M MIC,>#6KѹtA5EhPHe6]$Qi, `ęrŪ[:@@5U!M{Lx xL؀[|E)Z[0^2zvCjcCQ`mXk[ɷKފu$d_} *t*wgE?Af%;i ظE,L[h k +Iu`f9m6fͬSRSY 14 OR#u!BLB&(N0ٱVl&"5L408Xoli ma㭈"1s5V٢26u:DllHoj6֬P9NQZ7FkZ 3Z2c14Q4@(!$d9bd @Zefai2tсY 49QA \%u@0yu)"Ѣ)!j'#2jiuQVp*  !L  %(%kluLDD2$Jʘ*mN(E-lP&٣ A4E THM FIg1%t eAMњdy%ւQP,ŪZcf5#""030d2"hZ+,ʣ0 jTZhLR#Ht~U?١wY!o=T-=5"aDk B4 $q1s 7hu_G5d ,NgE[?كlT>k",2VjW+T.gOz[qoR20V$.kKid7r ·6MaCm*eĤP<8G8Rodzij 1)cFgS -iHW|$/ h洃Ȅ '`?} =٢rrD\tj(g&T$O~Q9y:7u}89ΒX) @*Ю@6 @0/в1b7TK1$zkwp.!u,izN\]U1|o;ky/qk̑Od#Ri%}I/fHDyB 4洖hk3K%G0Xxe,)n=h|w ƀ` 0go"[Vǃ$kK{y$zLMG!P񤻶HV×R#z{L+47} mj% r\NȘjA sw]}b .4+@&bw ,E~<纠\,'F#ѺHU1F Zԯ<5 R1¬`>׮=0qؑ ~=?c1FϾ)v`/VbxOĀ5 nSaѝTHaW4;4_ ց6)} GoYLfҙ$ WZ,=CŃFa5|9\al(h;o0h OQw)=X^q)'GLwd$u0hbGT՟݁4#?k-4mOqǽ>YYY~5SK_kS_^3: ֍ 4sf-M̗aVjgaO˯^io Ҿg5' G|vO}^[Wr/x5ah;]a1^0lj_gt5>+_·#!ﷰ`uZ0w,3Z..L\) CC!<79F~PӖC`ϗFfՆr GyOUiWfj_s{doƒ8l0*Z'I (V^'@ppe[@bHSi$ 9l~TGx2*wvZ׳-6 侓\%Z p6f<®$l@_B+A8+!/3i.F >#h1{-Āb]sAy 7;J#GO>2ԵkGňUhF: fh>j=]==Wj!ƘEfENIQ׈_ŊL >fӥ͓l.12VjNԃAB^i"fW5hùϊMO~XzB;QCk s>TOeuP}ܺ]cf=V"LŊbX NvM¥ĜkOĥP9edx^ @@n-6Oi7(yZ5wy! .Ʒmww(Z3+Ժe~>D%"1$Ky BqWȉB4 3(>6ȹCgyoEq{q_5N ,A sZĿ_56ZIV ?9jE!e m+?v|eWٝ4Eeb_]r^Zg%W7rӇ2-ʌIZ`D nNҰUւ# ͭpmVJ1y,g3@CeBۦ21'hFQ$bV HYȇ@2sHBn-3'#oɜ[Yܣy0~ 1W(N+XE}/ K78Ҁb^i8Ic@E $ݶN H0\:f\CȭUxcVfܫt%cKJ$FJr3-xx` h>v0A P^B`8&eHzHaW5q,!_c$EuŨyV TK%NQE|k8pu852Нo\*~1?u"H@C +X,INu:h?4+Th"92"|e{HFpcznLIGJM?հi "e(߫*K2(*vIi?N/CE{Ԫg}'BXl {DC@C E4 ?e5Pg¯꺧=ҀMzǗWL UHc?(O94q xEz59hVW]T|,״,ONȍYLMCW80AH@E;L(ms_bTMuyaëVPd/54)JzKJpQm).%FtqjVrQAd^%G3ө\鉌$HxÑ&][0ЊBBRjL̀<(TY0!'."_zHgjko;^Z׌ҔAM7Nj,x%3[յ3'keƽSħwg{3<eȍC(Rdw Kfn_^v1w{L97q.xy2_kO tu=x2;ol@ )(( Sr{ayRd6I / 78-*mm@8H_ GQ8⊡ɇƊnD:e?1:!_¤3K%r%B%pk=WaІ|ϩs7$]Ov~>s? MRL,A+ J! JB&02"}AAj,"*($M[XCDDQ)s/$%\`ao7,VO\S[Z&DӤΈ/}}ㅠsOԖj(9#)&b9\b Z)H("eb ? 0>?gJmcqu!Z9X,<5пx|mk(j!|ۥ@*E:^HwaSsˬx C9RZOVK,E@ѝGdC# 0ϭha4axVlag/c||J"9IOI;k[凪5b<zØchȰV2>eSfP/]DVȏxx*dFLAfZ!BBiJ妝bDVu@9թo rzĊ۟>dYG9˙3Nj4?%lq@wGKxNW mGotoεz7Aec*(?'] L"SpѸ%R{Xʊ"XD4ƾVNOdxCۊ^ -ij_tWv!m΢P>1#83/ "S+T,$ Gk;O`tpi#v#děX?S+7 /y,h  %DȀ/-7z:Rl-M2?,cOojÅCkk\_̷?/cw^{ͱS.l67uʋ֖ya5 5wuGg`O Օ%^xR,DT@[t $ƮwbFرļ^d^{٪LoVK_fO l}gݗ)?B]5Dܲ0+iݚϭzؿh {.P %{aø1=BTI{]“ZƜSF (?놽o|cM<~?gm/yMoh>V߱Q2`֧}LTgSSt2;p*vW/,pV($S)n O9rJ{WPjj{Ez|k̗vD>+м,wV^;5o}7k=/N1>{YCߺ$9I!Ge `lWY9{?Ezm!Qd Dh?f7؏fw?v~ƺ> gMch1j]P? *!^n>FcmY\Wgk hzJ:i<&X«otR~IzX1 I%X_MK#:!iARKAAr A̰bBR;.l#2NfQٸcZ XS'h=^4f3J !!< |6cdVN ,DƃCƧ~ &Ʃ1O^g~Ow}_Kuq_]Ԛxvتa(J04vXHm/UuO9lf햅!^YlxuYa{-$KQ/Yp}:҂_=+er/$3ԪiV ٖC?Õ%NjԃUf mϫq;D.sN#uVT$>(g0[xɔNOO=g=Fx0fbX$}cK ?i)`ONļDi (d@bI/H{߾R!{Av?EA{+m.P ZHΐ]]ჟ4|KBT Gݟ?YOyAY_|?U^ߏȁ8|{T}5tƄ0w BZփCxB=Gt{ꍫT/b5H΄u?T@}VLϙмFf1^^[}6r<uYYd `o}ns<{+, 4s 0^C۴[}-7/Xb%UM"d K|<}e: j{?Ɨkxt&IZax-X#{~"vj! 
Epgw^)5KPSFFAjmkVRbUӡ')0Q6MN{@ԥ5wp dMC>_L񠢘ƽίƐ,sքCL7g ~5z)Abӕ@w[Q#P1hex kZ8>{esz57OVqyם{aW s(N^tי[AyOlIC$ Z} ͇ۨ۽5%>W*MnJ!d0LӋ(&TbOzkAav z _nm|GN=`٫o5rKϞ>ڌgjװjvkFk2cZP<1zj%n+{+7%c:a?c!oZMmp->v#`Q!?jW(N\c'lyL;,n[oQq}dQ!! Xq6C2#*8 IlU) K<`PiN#ÊKy°EN>;dg1$״)8B#)b'L`5)/=a+>17I17MհX}͛Pl.x-r$M) ,_uX6LQ;ttJKH4.FA+qŎ/ίJr{=i-39SAyx> &݆јȽl(xȅɦoiR}}8'Cb~ Y/@0<.RV IW,4K%M Q6ʓQ=Y"S\F^9L )3g=^[Z(>\5;N A :<ވʊP]dO/1`W`+~`  YiHSd\^$9C߰㴃L~u^" +sR{GG%zKLc {MA:߹[Y/m6Տ:I$Vu#+rϝ+upy|Q\oY#4u(GvSA<ij{ū]ΒfP6NB$~$VǥZ޻$1pvO̚ s}<_5P?(JؤQ|Cznv硟/lp ߶qdww9ʔ"gU1 6z46R.~Y2껵o}=o~.3{rp-'HƒhShߡO[EeB Ǔȧvdj):*7Im<7A>v0[o p}ѭK{GvĄ|ۛnvڻHN{~W?gكsU^}ܖ{m+<0 L>x+Xݪ )nFwQ {׭%q|OprCNQ/AX ^T +u@//\"ZK9:}_Vg(O/BWEpPh]%7`Z[cB I2}޷wL;Nnùp0(yuv|ϐ O酨1]P+W;~ѣۆwiwmdȇn^!;Nn{eh;WIw (eEgtj͝k|ݸdVݟJe/z\W~/ìz-aF;Gg(_y-|Zt ^]TÝ߹v|*KH16$&,?yn &'[pyf9}Z>ǿ9g`zf̡}5!?ךM_2jtwߡ8#ҭ|u(YH96ԁ]`?C7ܵ|(1{& dPCG8  |?Cbl;u|tg|@wwn "? ̌v ''7㴳i!2u8k, !ATO{OWh=e?=G]?!?"]ƺ-qeƱ"R2_]rkQȭsN%Cf$THx$!!rg.Vɷ}_|,˜\lD_AsJM=~ p+jDC> 象mVviGOؕ-=384'v!Θ3伨 e;LfHL@б CsnRz7.=ܵ2=s2K5W v-Y3'nnN_e"9H/P$z cz(Q.R |~9rA%7E87F Ne6.[YҖ KTLӑS':6Xxe7?YnvOzl+2e=?`۩-޼>lo'e? 3P,I9H˱{4p$E   a:miՈi +^bgd۝: JMR!ɐ4ڌH'T;u'[P>a 9$ JM"D PgHYX)_/ ('qJ"JD̴Sx^,QyV1ٻ?1KY hDj#%!OIo<"7@!'N2oaRjKT+[c/̽#7zG1]<=-N"O`L3u(i͇WU+Jex3+"sD?SE͇r9}I .%@5N:lJ.#rXrZ EI~@?(FOS:VM2 "2jmg#Sv:&ZEWTsT 'noĎ|7Dr}cZ{|^~%X| p?S;pC@5k.֨()n+8 <1""Z.zuav{ﯜZ徣אx7!(W c*} ٢Ox-xuy؋܍׺(W+l%|:J|z 8ܣ"|yyש]Nƫ{ECkK4Ķ'HнRcuXUwF R߹C>_S(禞)m՜!py,?F]3}rZ*'דRG:=-Y+biUdNFY1wm ^bb asyG;Y wmj|CY7@NSlJ 4"=)}y&M"d ".6rhf붞2IR ӷC?L eI"C‚_,3\gw?5aLz)AF8B.|XE> 5]>o?=3ĔzFB50C۳5fXRbY껝f#z1bкpNOR4ZWӏn.$0@@d{!6 s6߬%~;~}449hE-Ô,[K`G qX/@vzn54hgk2d+/sD?a|A I1(P50EJ"CSA@@44QPAM~aw{"/D `ym"sk$ o+/}?=YֵkZ4/H<SjOПʄ XBJf Q)1Q"֔b3_͓[ !_3^H,5"Ra!ĖhFbK.sQej_>r @2rHgwt_ ^cWH`4C {wU.;еwa>{IP2ٳB4PY^hZ Yi >Y')q΅`ӱ7lu, fbf UI 71_3o mi $=|Hw^3Syw.=i{0i8kf(hMUSTEUQDXbz~6P?jGJN;3E ip夆a0VCZ3hxIص6ڇY0m8fj -9&8|{.#`,B?mC zVJ`]5xPM4HfK.]3 wP@hZfX2|"0͌[p6mkY0}! ]N[;cS(M jl Çf.lZ YmVR ȍvFfv83 (,2s, 3 L2O鏥R'@;GNhY+ֱ~L>_˟R>yﮱK3, 1ήN-TiMj+%p%aѫF}eZ\ʱr"l̰FgZYxw{4v_6X?GG bv"}b N/`z]K ,69M _KUKk bx=Қ &س>4-?~˾+P}kZO{|%ė?֙btOQZ Nu0 2 k5oh=,ȋ,s#̱e"jӟ0%5_җẊ0ͷo]Y @FUí=Nkܝ+:/ ߅z+&o~E N.#O]'^EC+WՅVU4h7BjeI2UߗgZ>R+Hi%Az[6eYﮪ-3҇eiaZ_ed9Ld^nB7=w=켗% Xgߩ$?\<ɩV%8i1~=g)KSO <4`<6T*H6bdzjs/G!b+uo]Q4⵾T:UwsqAo(/fn6PdEFYk'}:h)>6$97R.}_CY_6ztl۞z&$?a&=G[؛Ҏ:^ɜg%g \.=.mԗōiH~ w~ߝ*?IBȀ4+]U}z$my7v;$ܲh 'jEnm^ _$ԅ5 u>P|ꦍ 2Τ]mMׅoo*)5vtRQfQjIOX买BOܦq@?}mW%^G^n]:CG:te M|/Xu_}Q[v+ozxU$$KuA]4Ojd 7;tݓ?{\-° ʸHCY3;W#VfL4kϛw4{Y<Q-i-@(<<.R8-;XRRJ@[A^ZCS2fBн[dmQK*"+\7T|mMQèPo)f,;x],/4*Ĕ->m]*5?Dhj. 
ٻyՒ -d;A-Z(Ff /#xdW̧ AuXe'IPETLGUNhoߛ@z7?==bFڅEqdƘX~T3$ULS۬]J|(;/iPaS;@#yps8}:8 !!ADKI{(_@>sÄx']N;Qࢪ8+<i+ص1RdYᜇD3 Itݙ.BU1{Y-HۨBFҦawpz_6} y#F1A4R!KH=;|ZɆ% MN)Yr5Ic}]?#Kug] sk;qrIz*kKeE~ާ  mqk^h#r\c:}i^ơ+x:}l)Z˟S˂md /W0jE`iJzSA[E="1СcdVO{͹8Rv/I gvpMj3 ((Hj3ZƷɐ)Ky g*w]q,}Ov \,NPɁ*gLp)ōxФrex:ȥEڨ?z%eXu[)RosyN|[?kt.ZKP}XcMF 4ϕ&L*u_fezxSLKu|8lIoY}sUj|\߫r:nYaQa6`U203\z3op7N%oq{Z\dCvҩg"Uy5Ginݷ_® ºd$8L@DdZwVz%)?/u~u{)kt[N꥘?c3/μh)O6X^tgV!;GZ} Ն'9,Ki3|* U IlN;:jI_o .7H|X&[ngϛ`˪P ?e$R@y6+^i;aek7?saxq'&;_R}It71ߚ[RRm¹J^vQˮEct}Mx[O`O]U/uD=+߃&L^]]?sa,..ڷMĢmEъZz]ܘz>?uݮ2C^{}7s&~)/`{R̜/+˞zйwkǤx|(3 L}?]{-Mǡz6~fW vgYf1kbLU:F ՙIM(pg|k"8O4c璖D[c`Llm_il@E&0|+1%_Oq?q@\T<(@D<۞M}s8]W3y\%[mmeg> nX "c#Su&m6M !Ұ0!3wvU)1º*`޵SXڛ+'X߭oOa،`՜^Kz(ڈ6(%ͬOuF0SP t:@}#nE4]n¸LOk"Šha•ϑ4f8EFQ62×[  jX5SPiIMܻ`eUԸp YY zlQXcQDB.K5S~Zrћxt[䳩-Uxi@ L& 6XЉ{_^2[uX}!'t%۹i B5t-3}i=ta:+IR<}.B7)56N'.x+J[&V9[7z<Τ^}{[2i%.!a"l 3)J0AblFAk~3c Qۄ" eJRJ˩ѓ YIR;mooj&D 3y( jciCP s&\iU&q-ʳePt>5§_ֲ5w)@h .PNjUWP&m 4/-h@=Zu$1菀ϟmF9@THCx_ȋUR3WPψCbZ[Mm^oP]V'j06v+;:zgM5/ov-咆e9 Aw<W7C4LnOkϠ1$]݉HX_(爄h/n3Wo$s'"  -tx yn6|N/>Y1oLYcԅ(!@ RfJե٠wkz<ۊ|f6,ȯDVƭ£;##MĀcGt.}ƬKSft*?*5eLdz9; ,\&ONacJƥ eS[\+>A).zv/mRo67}$Fz܌{vUIx?S&8!ll`r'?E(mZB,,> q%j\yL`gӤwXaף;2ŵa?{!~(l>| ~9/ksBJg HCI?zBeTǛ_Jo(zŒs[̇0HDIfBd/e/'adz^JÀ|_[ !P_%S#ǫwEZÉI!ᾔ i=A˝( ^BbܮJP#Y (J ,qaɩ`_9r<C]C1f)){ݰ}ΐ @fifQëܺ嵱F$4tXҡQ (!\C`'xOo` yY "M3  xOqΒvǩNpc`"/V%Q@~ _1AzMrs H#6ΓW;3>S-Pn$HϳQ`p@kx)sv"') |yJA3kZukG%/LB'ɔ3P])9BY؍- f /סC=FЁU#%)9 1w&f)?diϟBKјmv+6<%>я[xR:'n~'K F@liP/z[GQGqRڄjS=:na Jد-Ѫ逖U߷s__T_gQX=9g 9 TZ`$BBr6&*+0 [S޷Uj+y#e}FY5*T|}) =̿T=wo OEFE>\OΧU{QIo)s7̹xbm1;;֠sw][ lILn}VawXXQ \RxﴯQqɲO}Зs)``:spvgH 2Ir9lz8SmVkQnhgoM5r.߶yK5P{7)Wt4;,N|$$dgFI-AזB6=SbX< PoFl*zzb`!`V8$AUWm|]q_I*|ke{d:*y9uFYC3v?zHw>4m<:,~߼)z @d38s;ݵ2>l?w4kn"Kb úVvgp)2FϏ [Ts$ݺCuc}[aDHf w.20i~VoX)%b~03?۹7x2bVP\qQDP`4}8mL%$f9+)nLRє:5{噟F,9_y 3 xRxJ/OEf)O;z6>tYtXuŊ?tVNEN+/kw$ܜ\fQEf3P#0qZErWQV\Ϯ RQaI[\4=aw5 kaߠVJG̽ !͊NVEbt+"'iOL`E82/LJkSO($?e`1n,l_i.kesz8k m"K"$~?0?3bL-":^> \{)Tgs*k DZ,@l?ghqش+>Fnȑ԰.?Oóv>k[/AoΚfNi[yo k7"#h蘼]I |inlr‘~ `T@}wk]EӘl=Gg0D_bKk ^0\גq6 H\S_]]w K:a =ot6$>S^'a5b?B~`*mh_cWO ]wK5i#M!/t$AA+~+4Sjt~1(HGHNIс%!^m9K^ J&(IuD}JJPέjIUV%b|a2|a R;X^gӷo먷 {|Zh/} =[AyߩWMO U 5v[Xrky:!.Sud? 
ɑTgY.?sht$ ˽3a%şT\?$] (E ҇N1K8lo]C?OoQcv~3.lR׋\;ဲ^GU!@* ^i$n _7OL32a @arP!.7 %z{Xßo$k ϸ,L5$Ў>?y$h /X uza=={'$EImnaw]Ѫ %$a1{aW_Ѥ&-=G /Vs짷k_@D t RiJd;^'RSø7eNr;#罻)i7v}ΤIzҊU`}{2eڽRCŠ+J2{/w1UzsǦ37ڔF_Z|~?˛65CΖ{Q%/15iA+dR|{7SBˤ!#2Xb?0o,~ufFX: $NCWwt::bͫ.k~>&1 V`=#qfnMNR>{%hg&ʲ'ՔS2?SUr+_?nSyLؐuu9+M0ϡqK}YؚgNe F\ܡNO$68{AҸ6Nu-M~$0N.VO1u=Q(ZgNz=< {U#mܕqgoW!Ūp(O s7W3BRGio}|_|^&G[=܍c@W\#V̯ͫ}6zރQW  n],5xYGv)fi:mwkS&VZ(G2h T!>3،;ʲ>l/pn m]Bdhkﱖ3WADjF[~JrƛBtMDX~wB9 i6kX]L:LTh_o +X勹[LӒN2B@KSꖩ[=[N ;hj=q= ۺxhSћz_ f?m234s]X)YŒoUT]}Ȓ$Yt i((iUq͖jʎF\z.}WdĽ:/2{6؍нۡ]m-@sw 6YM0ok/<>ree6ij`gUEG0J >Tf`$߶#}M5ZYƕ`s`h _^Ս`(@?яnOjOA 6$& MH^ #fzZ2Nۄ5E=UmSg&-Dƈ?.쎻>wf+!7ܭk/%Ns|'`lą7nׁ09A<^ =#/CtOҡ(H:ϯ\ZAz_]>-(R>u#̾d٪cꩥq 5=Dwȣ Dg+q*ḳL'tro e/ VP<ոokRzsǠ)1lҌ }lcu=˹`]:tp6\sOrgY(5'yd.XKHNyw -2M)^85\*7up2C mw}DQ:FF*2Дip:+x/g~sxokG|ׁwo~ oe29Y_n[K9d:2^?$xv5ēzGmnղRoa'TZx*Ic]QL9>XYPeK w"c#Gu9Q2hū6Tѥ]O!~GGD절7զo-C9]gz_\ q@ i9!E]*=R\{O 9)yG+v{BO|~ιlА: YB "$;P~NPvG0ǟƗ{n@4N@U5H!GCql<i^by)8/nE JzBqނ} fz?|^<[CJu{xL] Bg_wUN8+v˞btF?B^US|(4m^/e?.b2w(qb=.-xxi~ڑI|0.ԁB#Kl>4 R^ 1A{TIݐcwz)ygsN= V@Gӣ;ex1Z3Oc>=cU/&Mxi lA:Lۦ*Je=Uʕ/%!>=uc,mOIg|9Jay(#=~ō]&n}, n ^1>u9ϧ#sq]i}<6@7 5ѷ_ՠ;eF_/5߱UE(j†tK$ 5 !fRB(h|UAi8iUO[gi^/߮ƌ`?Fy`~}#Nk|lw(#._;ȚMUOmRʔUN pi#OmsNf,c(岄d[ϔk]鼴2/&Ym6;KN *}-{b3OkfPt3ڠ2^(;Ȱ vWK?i?_]>UWki )(HG!CbI};~$emLUAEdқcUmؕ7v`QXcI}6ƶ,0^?[xÓ{s2]0px~65$ڼu!eK*G+ Ԥ`_qN<=^*|WBδ6q =|̹-4(/@~MrL?B/izkLO5؛\ney=:]@ik$?#Jutfd&)~D޻STU4i%6i^ ?\ȀhK)x֗4ae|hKkBPH?is[9zU=W]Џ]4#=nVpF#~ 5IȠJ` ؁ĠN8Z/D2z֝y-Q>]/\*m~%/f^`*iyߍ!@ =3NI\x_3x>Jy.IlLTN~}`<}/g:_q nG4.B ֐.~6(bgΞB4&44aJ{m敯sO?%^FeŔM. NpdY%W,'^.}_>ߩ44U7&|oQ焽Y%%q=|U۞%s)Y0hK͈X[jt }$f8vkVt -эg^^2 Zuk@D-KE=osꤥQqqhc/mfK}U6Er܏j:%ڽ!F}uc.A{}J_!"y{1`Xb_M/坥xu%c}]^B_Qک蹛5w{M[6|<MJf=i[yNrbPߺP~hOUx7/]f}=O]-yy}s'peyE5Ihfw]M0uZ>;kO$#==a1W@U{M`XzҞ{v&e   7&dBEo{p)E5bc!LϨ,\{1TC9E5]fTs5/>۬B9Jw?}Ӹ.ㄝ} 1Wӕ?q\junlg2gy莲Fl}1+Vxn{=)'< pjvw +dkἰCyn0_-fhS! #k moio>C ۛoh!<ޱoQ]N|[u #F1+4_ed:qir x%e(b8_ #9y͎CYљf"ݑ%b)J:Ii!ia\& jdiv@ewI ſmp6L/+n߸ƾN׷ea ;[35,UPf8a&*8`g 1x?EGL6YCfF0':t,ș^{3~ <(3bQ9ĒzNLg]_rg!׍^u0cmA\5yE4$Sլ_AL%K1X1&/Q:(2%|q0Sg$Y<֏vCl\^N_m@&Tx#~*Q䲞?]h~ehjz(߳Zz%cL;C4׸ԯ.[h;LaX's.z\>7"o]Y?JM h&WB(DwD!U/h}.i>V(~#9N lܠai4 ah$◻3כDJRDdNpN0Mv=$$u L(qظڦsGz~p{י'*(x:}yJ 5V+L{&̑~]; =.e'D\BMxiͰC":zOI~s9Q!B^%za)//\m_9Uh]Yt>=@Z:C c%]9jaBĀ{\)EjI1C㝭Ix}yeaCXbhq!9I\2@Ғ!@ c[gx\,$vb>=MKEy$O"## {pX]7e:[n5) SRicW]Vi~3z’,(~$S$gWųX]lliib|߹ d4.P,B3Ix!l@4yj.xZmo"""1TܥK F" $ .1aY-kn!y[b_5I_5xv|lr-i@.,w: [ @ٻ$kDb0BŚv^RƾHp9%+@t9MsG>elޚ 4P;tS!\; p1FԀ&$5.rD&Qhy'V0=*8Sҭp mk,@ Z֐"uv vfH v[/0oZLWlNdoz,aK;58K"[ױQ7 stqÝ`\qC*^}UqX;^Sڪupq/Fe]|vWNkz ;K  UvL~}-\ {69(PzYZ `죆2a5*1yͭƍkeri. i6U֟C׵fSmX{զB|$WOZ+%{&av0[wr~O0͒/k"6vUSNXu0];;=Eׯ_⼵9]ֳAcKiROv3&1[;e뒊 | oYͦA6֙HP΅ O'*_َ\au$?kw4Sє0ljcN@1$#ΧzlK ro,kJʁCYZd&rm}XJj7 4')nž! 
pfF-r2/8\D7!>F9`3a7Iϧ]nIwKY򛋊7`K| 5X~~>@8iYv!cmn+4^';=Z3 QDZHy 6[ε8 s`'7\Z=FU*6굲Ošs3_8Uh),~ƆIҹuN^뉎}1JaMT+lp#L>0t!7P>.kDgIәVL 4%Qx=Ƭ>R/F 4H`#?o´9WK/=2:%7̑UjJ#р2h[X\m K(g>zqu[|-NVMU«-LPt7^|=2'C\K< v͸U.+vLʶjNsMQw4櫄,I|9D*XJUU9\0i ř= ٯ`+=W[B6\ !&$=#GZC'כPu.T`a X,M4WOP9ʘiD2EJ.I'bMx#6\I_@eԋ;/MBAU2*[#tO{܏OG?6W ]ɹR@Р t0+ڦGJwnYi 3*Vf'.5 54IQϡfYƆQH"^gw1=R3<h@>lctVߐ6rO;J9 'i\ZF ,ifjpMŎJ5[&3چj;`.53>x\[ٿ҇ٺMn ԇ=ۅ~`Vz߄./ga-eĕ82*Ed5~!6q1ɸ B֧S`<?{=k }oWJmr{/m3c )8djTg>n"0(<!%?|x߄*>BF=;,lC.2}wGYnM)uMϟ Ϩ~,+#6 ilLG+:{?kY>g)ZFc_RO퇑6Mb; R|,7p|H^m "4_\ޏG1L>Sk!,Az=޺eZzWcw0}Wʿ&BⴑW{m+s񦎳s(uV Gt{^h>.E/Vj$'^[LxA  $wIXx4xLҾbi /qSbPŪsX@q mͽvGyw*~ۡR[a +Ӣ骁Х`nm5tC\E H < A6 WWHz{d2aC,ϓGlԐ(yad,ZBH!x,5q(ArJK:t혪`@>6|gx\\9S6+TA:] !@9\H"JfeQ'rU9‡JLc}a@/~ȕI3X8L8yݾ|D5$1nT3$TO'DZe#zk.PZ2""`(BxA ֮ZEm٦XXWsfʼ.^f{^wm&Q>f6Xaq6zv_z^yM /@1[f[y۳A!|+MFKPs>^ax7.,V@\?հ.qVH@b/{_?7g"+2ZgϤUH=פ 88 C8ծ`C ");VgUzg W`օB#]meUЯkڂM&:KBԵa_#E az\[PiuM60@d d=9մ[n[M`?#mGcl\ N Y]۬Ɵedb{<7]664qs`|~zek ևAcE EB_+Llm؅e}@}B%  xۤvJJ]AlU^] K3~f]2h=Lma5( wx'=]V}rX f0c}OOeFTKso0V8 f^O}]SJ@=Iyo G-OigA#gZдɴcq"'$MIJ+gKA"R S!`|4誘N2M5z6+=|RNN&viWaW^T) 3- BOiV[";#sS}`oJO9 j69PQdh>jݩY( yBg!v iNF7qYLQrZ Q;0?ѝdQٳ%okIIˡm9]ncm BJ)_ɶ -̋-F(n~s(po@gek!0_ܬ]w|W* x>ܶaf Αz]o[JX~#j]B^ϴ.ǝ+P.ded y60$k'w~wțf77>MWCn0KQI?2<HBK0{^=hMZ\;q躇;w2$3V7m:(4 H؂"w´W+.shSkG{6bK]@ d]R釨~hZPzòSoش<1fU|gG\kBQNp(uX2Ae{GXh %Ud0akX˖OVW+Svl](B¿,UN&˜»w;bLqE U)~ 4E 6Po"3 .h bJ w6 @x'g4sk`2pi8rpiEیfB"КiE˽ĬtqyG0L~ Um_D 䔺K #EUL B OohA?U> aPn 4p^иq#z|+ dڵ .'˶o.[Ĭ@C7;se3f܎2dlW 튿n'IZGh]y|w5sTNy-c?5):Yþ5i6p+eIk2 ,o_V 1wgv^B5]s usg() cQE\ Vv xxl] Z[O$ԯed!Y*`L-컍?ofG0Sf-}5).M|υ~d&K,G+뮁C5 ~ A :4aZti>Ղ#tNT˫V d1r>7]$ 1#G:sōvkm+ͩX<@|H8xUzvcGq~wsNOPvНʘ{ U y:7Z3+w(g6 ҋFkr/m@y9nȧA ~2(+Ц9[a?{o{}o5)&燃wҮb|:l̚WnY57|f|>'=xɅ^ L]Ԕ={, ໆh33:ک nj,8Xo!K)|b׉(&5zisrW{ȨA6 %P~!nz'䨟B Kn%+Vepܻ޶0I8|,Qwh~ʄ̉5K!fq+"'.T<̉ .SAEidsM= eE2q~s44QX& 9LJQ%P# JEi0BQΠ| 5]pTN$6Ǿ&'P@ sj,&=O!] Svd\U0/ +`TvLŠ͜Ak[a؅2^2BP5'r]JZ0xf#H.&(h ܿigtOeS_|o0/I oօB6^mf4kZӼ6 lbwy|-`&J|}(l|͕h: Vg6]<,aq0K ݻ/. 0]AxB]AgpАhM)+lBnAZ u 4/iyy'hD1"{ ?mjIceq1; kid 2~#[+ ,IJ"]l&#vA(fg{e!I΀h2,w%M7w!A&m9ίk4Wq5V]+:w@̸pW :>DdE29oeL,O hi|>ۄF7Y$~kY9-Fpcqǿ7~Ցj_?%6A{@ L_("#m"ꉻM P*(ZpI1p1/h^ {wÖ.ϑ{z4K"j.F 323#+ L! 1Sy'dV{"\ Rfw:^vwpٞi6էU9yq°Fo>'~(CDp:@wr'ggGd+W HvCU|/Nqo[HHGlR+, ͅBIk. F} =a-j(,GZS 5s- z" R*JŁ:+ClpR}a$tOKmеAS4 U1X:%XgSO^Sh+{:Č.XI5!d ;5g -od,~ߏC61qxX UoP{ݟo]azs5;/4\5v"M3 yPa\8b,R8ѩWhS_HϊY!gj[--­^':LDPz+;^7R멀4vL׬я|nDU.'\Eaծ".A\[_ֈ[{r1\`M}C |dU8 }:=e.c nV064}:ʓ!Df11 tD @Q`%ZD;EJq4-k*2OX?i`28>@pz@{Y+:DrbBpsr@B86<^)|p[CLAGT%1$zD7 '*MM$I1&TZ˭d1I&)HD`דw 3S/>z2eAOf}rxW;q bLC4?*L!K|eh$E}P} KChS Z [0x~;mmk[^ZeZnvjD ҹ$~tݓ?{ +^59ryՄ=r PY"7쬏a RMؑ\ Ҙh)|ǹ@Pyg E 63 Cxc2UHkjP@%^AzL(`68!в!8 'p d ?VHgnB(;#hw̤_GԘN.Qn+tAO:;Ig:4M:ˤ 4ˍ['ըBX%$4] ~wo'dC忽[ʦzc w?e岯k'k 7>W<-sG5NGH &q c$ }eEsUDz|}5&VrslϽ px {GoC}M6S:\&z9ָk 59hkLXOZm ".; H8o *;1!1 FKt+%"X9mk\}U&]7G;lSrACB8)Ig⡚]ɂàI,H '"pA5":Cms ? 3yp 1@B!JSzV Z:[=4kcnKWvlb;U%%@8#ȥ?. e[0D!j(8-&2uB #)a݂5(% JPWq 'wpEzxkpZ(l}-AC"@Ze6Ш=O&JTLJSа| =Í)sCbd>ď`XmI#(WQ|s|FGS_gɝo Q5[1jx $4eJoco.̳B>LqgJ"1߼,'~h\uC\MvQ.r"i4 "bD̘H'OatxR6vY}~}Ma󲓪ʈjd~8hBAϱj/3+}ToAݮ8qᴟ=hSsbE>kξgi/>y??| f7P z9Oo8b_'Tyv .F#?qҰ3D"DqʶS4E2o;/Z@,=gڄXc;B0+dpk(5/m9I0I`3~Mi)Nn&yǞ.A"tTa 5cfUI ?6yg:k.Gm,GR^M/^&g׺yQ;ekcig=.aӳ r.L|Ah"Y&ݠGi}vquf1=ןA.y8'Rne\#o,z,;n/Pg+Q^! 
^>>#Tr]2۫VkInQ5X~UcJwvPgn7r=` i44&7Ƣ_#KNY{ζqv+b?I z6pVQy3Gs<2@/vf)W>ݷ0B?gJ&HXj֗CdSPKv=8h>{ٴ L47n 0%lC\owRįs19rueh hhԡ0p?*>50gP|t*i9qCg?= A05@vO_:ޏ"hy!}v<֐Tt?`0qW+#1)~&@b$uݟ%; YS^߼F/ `{2g &3sc ^3f*hiesf›W7#6n5Z'*\Ժ,J:8f"x|)< @aQ6h,oJ|_XΚ/ig;cSAgz[ӽٻmY5"wKtCB&(y N@Տ0[0GcfQ`k&لYTps1/:3}L)vPҶK<(i&߿~\koo9h꺳j!/=MXLVuZ_ԕYjIw .|oCi~m\Wԁ^=yoօai&cQ-j!C( Y@\G4ol]賀}~zƝˏB2ͷS^'U+SB`v,>~:詯w13ѧxV,B =Ӱ{J4ׯ^♅^M׷ Ox֖{ܒf׼ *7?IM>hI*ҐqA-PC.QEr@{@1p%AO@{޿?!iHBe|GVZl!r3dQ<>^VH>zNAewkX/b8= ?}(w#48Hk1[ /:f_hc`7M*c`m 9)<acߑi% ;Fv_e͸O8S 3Lӽx?emU.<` 7\D'308<59T5x3S~KU!NAƪzTF MvL W3CΈi W* H0Uz ;m2u}ꄫL\L _p_$:q=GJT=Rb[Iaq>ʌh)LR؆J OTGj(Ѿn [X/PMwY;i G1$#нaSabS;)SpDѯc^CM@D<N%cֶG;+ !\VIx.3Y٨]V㼞 >" 'NdaAOAyCp "Zg\wË@obr~[M>v90ZFú_&Q_BeMn6|{wu`k22}mI#.e+HꢈbÐ.UTx"6LqANY~*ӗǙ_tiHO$C/kSʮF.҅2OvgAUxtĊ 7(}D F/f=SAɉDFaJ8)~:=,3f_<Ý~]6fTO+?ЕRc:J" `˚>y`pNhՂ_M&!w:Ӹ}ɣɢc,WCL2scxM;')LP6 H9^ JwAa~ϛ}og)e'.T 3(+.H"JJ?1D!{C Cwdz.NBv?5ZFgLt)0H';'uQA{EOIFv4:[:-x F~8mF3=/U4mpO=_y,`\[ , 5k_1YY|b=TqYQ`ޜ)Қꐉ#5zFćgy ! ~Owqnn q/!գ+}9;^V<"g5bZ}~ fTlglU;UׂqFDmc\#۾bX jؔbm@&Jkphm"X~M%V_m%&HXnW[^4g4Vu#\@dMcJsEMx)C<8g#u̚Z5B8VI ~YYN9OuH4M҆-p%n }l uc'83<c?"gKqVڰ2"5 =AGgA!TΕ~R3JcE/|S# h17fP%"< ̃B iM * Hal%j勰׵YFdxPh۞h˝GU_k侅x6]]ſDsU(^Y Oh QYQ%[F|AiA)~+|ˮ[kc/khLlwRUMh+e/__Lb㭾h,yCXA aO! N>߈`Kr!ۘc*y6^sz$ d"զ_r^ork$^Cj+oꆕj ~9Rf8 Į{AM'OEE 8"cemxΦoi3 `14D4ژ-fCf?o,Κu%!Gqr-է,ⰺ'"i؍C@Ǐ.-E-%$i").H~ƭJ+}44b~[\ԣPO!K!ǽaaSm`Eၐj`zu[c2dB;^dw\=hmnty/crU`MҽaYg}oW$ݿ|\KEѥ1W>8BY׍AvvaW6DD-DZ[$ !x66'rVCd^Ragr1hrl#z k4#)dzOd?ڢ>[w-DjŴrnR(rc=&0u(:̐P1ƺ ϛjCK= 'zV^OW?Jg)GnO#JVfF(CW/3'd&B7< ReҞi/]vU{yDU*,J^6-EW7 .BXrgmhY HEM$!#ɑ_[a82d0iȊ=s[p^;so' _w } aDXA[ckEwx{Rߓ'MD@@Dw+ 1L :O3 MGrn֭hԭF & .WXd? i\,"e)x(!(P)Q^#BJ P J(ThDMHRO)(h#:0֢-AXoJJCJDI)ls0كgiMFSUfLeTDL&fYo8pA-լqp $(Ji020EPRIm8:V1 BZ^ uK+msM s YBʣ7Y02sCN[ x2 [>ɭKk7Wd/*EwսىmfSut(ٽ6'XtKqY0:&b#"2{X$$hh3mup;14xJOA V/&&Z3Q?eޣ, D0*\Ț]v^\Hg?\XW4(CI gNsj"U(cU.稜dz{gc8Gt`唠V@,HH`m`MjO`|HgH 2:U%XJFp%jst$t .,E,'rFJ^.knQf2fd<{M'0'۬^Dx]Zǚz"sw8 m r,S}B;ɔudIhS]HH@}xkt5,i:1B^Mbqnl[t{}Te=T,wŕR E<ˣ{V[E)~vz)aY+ _7NjN? ͐:gfJ$8E$IVy Rb&\,iwPQJS'?5$7ѭq޿]D8JWcuetW^U-`΄}wO4פ>jݻꞗJKjhƷ}%JoqH}auJ(eeنd zjJӰynkBmIctq%աD;Y9{rؕ5~nDLe[n~PcQiV>|7ﭸ8t=kDzKhý)쒨Qy8}\nZӶv/gWZ[eVKsތvԟvnJ{72WRi]Gus<*ȳ[T ^NZ>R/E>kBxH.(0@zQ=͵q| ϷoqiQЗ_׾+~] K'eEsIFMvAadVmKp@MQƒrDp*Viӓ&u3͕к8@>Jy1[҆]̓X `S[7q2r)#1t*: X|[Wz [^w~vTWzi&!Ekg%*x$)9oB󛪄Y|%ՍmwQsuxL IC]!- #!6a lMM{H+MiV܀\[ܰjg8C|)l#8B<V Ҝ'SA(:ꈮlrz V8ov hQ.[XEY}ge^I9Kpc.J "YvBbAH"""(B}fI`S6ojt66lm召+V+oY**^BKO48dNud(_웹I &,;]snm [\<@ӿ,/×:&L@$MYjPJՄr4 ZkDDfe>~zz+{nJsuebIP[әJ "4ntc ^l#ئ.AK?FPurTaSVmN!^4[GJM+/gb]Y9`lxN}2IzΓ+g/[I{J$Pϲ 4#ESGBI&`91U)UQ0uayRK$H70p\ư(jf:vB@Lk >.v )5EKnߚԄM{֖ק/,{u"y~Ah0XqllVow|z.Øᯛ`8=ˠBi$66#.xL*xzgs2.ؼ63lC p?­3 7:J$XghbQHPz ČQ{u>lޫ+%\h0~Ұi;s=㻕J5;m3. a/1 z2y.±"ES䙳sitVxS8>S_m䏴JPb$bH]J D4[JduτA7r B~ʸ,)%eL9&ĭ$qm!p* = n 6}FoQ{-}Bj~$+4FFV9eJ;fjZw.`;W@Ms_˄vıáɖAi6h0ͭ{J wMbv Ab^t"3fA[fpͣGtH($<*?~L)b >9w k3ɾo0)X;YGjL1>7Ў@AZU,wd ebIF83Jו$7b/U~Q~i@% Co}JU%gY|~ \)o vUTΏxPDOuC0yAPyut@#LPq KIЈ,')-ab* h}j# ܝfcj[ȋ1}Oqj =|WA㳻wH(65$ߣU,px\+8Z1.K2Y[/幘6$ZX\aF>sVV&1 @00CǼ<GC23Qńe+db 2N Orawyf]S2Ζyti;"!F+0a7zbH 'tUi&]*]IK{MP-d]%QrAĺ{5J9fV:&c]/D=K5E O1Ux0XISÙ!Ŷ(`26/EZ3[~tt_\X,]hkN1]3sӜ()JVe9bK_Mˏ 2d:3{ݗ6VѠ$Ay&%)$dT7i7UİSj,Ad Vc7 EOggqztPgyՈgԻJNjEfPrt\5̭wrl:9-&nBp _-Esza|aܰ.~| 錯7>-TA5 Br4h9@XaOAusrTVm_un-sWQ/k7.L*d BRd]8{^V?CBax#T/bb;a{HA_H^6{8;6#({ `}.8Kwg#k_Wxp5Zb_2%/N=U l5ʘ!\!8 6MKXEI[T g-}Iط:\5+b@j> Ra^ֺb/ /੐PYhEsr~ȗy ##",CŃOzA!w3gl8i@FmOy/- U缐_3 F%a혐-qL8"$aC! ]S$^UHǽ{ṫup+YC~}`-Lo~]=Mn'FT\#ݍ1-ȧe3\[ǛׁꏙvbvwI.'-|7P!ª-Rt浇Y\gJDgZ\FcF)I;?^w2evuI)`EW繺+i]~Sh4y? 
/S'6YzA"OQ&_O:47T:}IdJQیqW߲# =Par[5ἷ-5`WW+>\;v>QY(Ы&<̈y]+(kPI!eҲ^dm8Z@\@4 ufr5NF{l6[7ɱ C@)MPDd{?KNةn<‚l[b e \Z|Dx6~C>E:wUñ ,ϝYbI4*2u麅% U18qo\G/2F(-zٍ'_+l,LXlYzh+=߂2M.%ZD'̻&5>_~ z:=1'Dxg~9܊c/D7U:FLD,!~t<3Uw|K@K. )Ңډ&z`]BI⡟ -;|T=ycnZoq-qcGE4 ͸X;^ؽpH9ghr6hc-[ҫ'dˣbvz0 n,>o/g5^ g[ 3T).[v]bB Tl:â^·$Gez%LcfhEZy(&ROX;?Y8vrؔգfA&'ն~ܿ\4:Yme~æIQZ%9VT+JU>}tZDʍ@Y5ZO+\Bf=Ƭ$0)"iMt-*4X+I_Nȇ3dAe6ȠZ7A?~P]7q9SNmYV/E`G^vFckff:ޠdi[D{,{?O| yl\Wc ZquBjHBCͿ:cg"gS}jjJ L׾]i!:MsňQ~euH?ӍYP cR]ɴn_hS2'`g޷:&MH='D@@86:EBJ,%:I!/ϋYDdenJ wi,X `aJ\N.vuZf-zgiݟ*VBgƣopɀWŴ kbGڍM~toQjE:M=уxٮS<rh&׬<(Ȯzoii!o;o*; jӶw^kܾǫW!9{4D9&yTmzG;ٯj~OAo]-9 &h5FGo9:™ VPwQ 41ʶгQJ:Vڢu4֑]XvɏMzN\=>?xٸ'noij/Ry5WluK;Լ=ӃJQ5ȗzb5Cxy&+KL4"n/ZMi u\Gv֊nN=}/vDi(*?9%4#AJ췆*0좣>-W^$Uv|^Ӳ;FR }|\m`Τ%h|He}sW㹭Lq|O?WjYn_ha6,V$(nHbƫZTijQ>}N>&j"//4va/}ŧ:)zKGG2ջ?W|T;QbLW|L[LoV+kbȞLUꍞ崚#S~Eʈ?3OS$_$|H"'L>a= O/_9jmpoȝ*gX \k;Kܩ6k 3%|;ys(޵R{h4Ɓz4ic 1I$Tfj7hF' 0iY d c[S^\Ũ "KyNk/7j9͓4 ԵTK!Q;` '%dS({<=*#ñab҃1d&tnmBpdS}O}k>n崚-sOy/uGnU:af"B:Ӂ;ksWAp ϛ\硔D8 S46\nag4lgzgGJؕ =زtpN dEE?dseq\ $@^Td ,oJ7!Jc)|ur{ j[9OƬfB1B)PPIWW2Fb^g`U0xW,bѠUcKW%Oj6bw6X52dH!tOFİnFJ{s+67Czwac #&x׵T f X]-0Qnfc7`w;I1ž18Jqҭ21{ puqo1.a*oL׶(納jZ(';Rksm{=0bXÊhz.8\c!ijFS(6%LBpg;,,rgztWn')dQWṵSh"+9|(aa;pKz­YK>ūV=;JVT xGFVYgҠs&D:Ϋ>nE#mSj?ĞEؚV2w,GCK)Zc+[|{ͿM6^W׮Uf`L/;27=ⵔ5\t).1SrwY Q`pneķ:K9oysixҟ={_GOTv~=9@Pc ޲ -NE7637U_kjmW. Q ϰ5(TL rdƭnT]>};4$՛}[!bW4 konɈRAW-fNlR5ɗ8Y c@w GZh>;`DkJ湢̵  C7;u }AM@~'9 %|!NBa>BLm"mעa0jノ=5=/xYVP83]eE2^~ͳ:uJ --94):DRjs:QYUDTyq] Éݖ*g4 m TD# Jj/Z֋/ wWw&f+&V,nIƥbj ~n%bVbJ{/會e2V?V&۹Ls܈_"TӑUICk< *0y̮6n_pZ1_N AQ'-$O)%V:ƅڳkSvԇ Aecί2f;6 ߫y$.nqYF:j1 #I]  _+u l_eD-]gcuNS2ޞW׼|,I;KDПDKgCmbbGk,E|l3s{^ˢyJrA0@Haφm{0gȰ="A¼ס\MlrFVa>YGqfD-qđa~Nv'+4d䶘y׸ƀJ+,@:  7Ǘ=il#m3o㛕]p܎$pgBJ华q{ew鐀&#n,׃TXboC#k]˺ #žB9?_P02tRb`[pˋ #[,*+ٿ>m@\#Zbإ\׍'ueRY|J,Bә7ۇC~wg³R- 2on3J \Q"b{җc=zqcxʸM[\W;u=v3ytˎt+KQv{tSjxȼo$lNHW 慣ttYn nFyڨZY'\әI<8}2MX8D]ƒNTVnyc(@Hߊ:hQmAZ-B#Gj3:8vD#{r^A%- gVkkS^wwy{?_tlu.FUY0ӳٳ^䶗⍤|? N uۄ a&]ݳKO|5:a/Y+LQ'!xwQ{aAX;bi$=;EC9 ͘eiu3$C#QP VL  .?Z6|d6|39$ U[Cqyվڶz~0lBnN(":oK](ap5 AZCΰ7t=f2 ˧[T`V4wWdcCiG@z9z3q /{ˢ]?rK{ݹq^(fnK4߉&1Mm$N=<d_,e_+Mgv$ <,InGM2N(IhjvEk=  [kPkg ˯hoLs(AL2CNoѥϧ6݌[Mt {gVx` u9)Q5mU)6ֺPf- fnL:L=GQ/4%q)ciP L`}y V)8fSeO1DyMՙWNb@= 'Y8EW`k'^ϯO+>2?UoiЩ} ȎdB҆-/e1=iYw: ӮF\HXr}HÊ9:S7mMe G$f'[5GQ@}l:[>}M7Pp<'q%A,ek/DU7݌gl}4ZE&d7NŖj-_&㕦M ëdL *uhT eOW5W< _+[LɷsIepNp&D6RG\e%\,Y* *;wVYT˅hȤ)YdQCȗ!Y 2]>1L:FEpg|w3j$4x,%Zgyac`"(%d2y'2r%Epqq<'lk7Z/,^=2TVVqnn)7Y)i|I,Ci"L>5[".5}1L8\qq6a3|O#w:<2`N=4H& >ߓŪ8eۗH(ԫ}YR @Hr(3H30{k鸭l&e)s֬3b̆N$麳`  _yP\4 )!o5Ҹ{keC2?#y4:W-`p l`N C6i]x5 Xb#*l! 
1jBH⽯F,|%.v"Q.ϔc\ e7K;g` T]T9HO80ksj`xRGs- H@=xcB1p hEm߆W,j7D:" ΕLDP̎rqĂ8X"KMkp+rGl [X !\x|xN;ᙫ' ̐3KIfO1C ;=]6b,Ci"լ@/?mN,#P+mV*OCR/q"iťv$](kv8qss[v&hJaTs95 d|5GϭFK4 $=UfŽjgk)ټc|1oh tkmVҧZј/^D 4"}jeY;3F/(Dqlv%Qrdun1 e'=NgW V.wZ^ҏ3(8]єg܄mܯLȗ~l`[l.mq21ЗD)r.nJd"0e;Whf1Il[ryVBxeLǕֶBjVJŤi[ɵie9snOKo/=JN<豀:>Y<|/t߮F۲dzm^QqIva4qLj/;Xp,x}ЧD/P%gnyME-21Sȕrg5f,%*16>Z_W<(A3,HV6ISs>0CJ܋|>kߴ9u #q?+ BצPJ$2ǣUߩa]!|7ćbYĐ^WV_tyۈ]Amdp6vrE8u25@$5>O0I -fffKӇ=y.nȩRR=7ICCCZ0[8^P5~53b?Z*R&.'txpB q@k6|B  RI3]|Գ-4 10ɰ߅9͘a_1‣ Cu l!6pm@jȁd334ƚE'5qAd+/c`2xkkɉsgv2'ʪk~QN ,"$e|JMFa<\c.gi \wXW]a_2"sB`p'8ĂH7h/hl/ZS*V4NmW(G<#(Nuӂ(< w1R23%Q}""bӉ>R"+Ys8A`HⅽZ U1y "qxW}B|wq)EgUҘxƁMҢ4|qmhGG*HDaÈP9 9ÒmGqNss')fs_*ɇ7gk{@jY:Ih[֕^ jͶF_,$ia6d# YYnv\(xV?۽V lZ/x/bRZ Ոe ɀPO]LTФ.0) yX͘g>h 2 1-xc~Cn5)my-5$4I!cz-0z_:~u+{?)Ost#K=.$}Lo]Ģ|'13[%r./ۡ/rxK̆8TKVR$ghCNKy@.7l=2Nex%%"JHl s:v A禅k~t︗TkqY?"אsaoIb >!v” @ <=Pq~6q)6~eqǺ4Z,ZRlp.HilVP8I,.u͖Qâvߛ9>`\Ysg08ASΗqV2\ :lIY;[V{A${(piɂYo:S'5f?]1D$ }t K[]~_z>4{[5|jtH) b"̼T0f`YpH](2D[ܿWXRE1hoP ;mzi C P'ǧs'6yxAnoxX(A$0 * ed`=V!{' sN8hk`+͋$١ B3AK n`QrIp9prP䢊N XCG,C@xq7%Rr&"D5Bp^\2!7iNTCJFD A9Δ:w,[4ȑp;|Mgw(9gPw 0-cMϔŠFyZY4Nwȑ(!(38v 1S-H@ďQ:pAD.ЧwJo%@fo54P]BɃul0+>a'4w N*8:zԆ 5! JEE2Q^JV⟆ZZ*gnwO,un ءMܧ:M5Il@ ?,=eV,f =ާ=9퟊&IQr@\t^ٽ[VS̸WnznRLh#^U_jq6Ȓf%x`!؝L pjM%c0WaL ȳP{[H^˚8&?/~pPWиb4 ֪NtωkYD|"{ 2țwlϳY8ez%|K:ƯJ=0V"3Ak|iA NfǸ ?C3m gFM51c$!ge.$w-Ԯߪ.!AGцLggש7'g7}lnc_ΘfhC`.^ۋ'o{]^W0o#`7"4 2ZYYуsA S2nc85*SQ뺤89ˉ:~\ah@C2 O9yіӫ[˛\y_AVkN yxk>LFdrć1h ՋZ.wAΈ$Kuô[:GtڌEe&2ЄV AikT/n{|ynۏ?M:U=J93>{{}?~K^@ 'd /o1+Q_#{Ȏi Q$ѳUicpS_u8eć?{4vi.L4͘z+;K}Aӕ {HI4t硅A qT(iҠ.z{Σx-A$Xln>ЫL[p:>{>5m8,>: v"4?f StD+T3[d3D& EВZwv Xl500ͯ;&;}j)Ph IĔ\Yi'x󐵬;t[q•Ə {(_Bt CXrG󒡻8A<CZ|nG}gQ >ʂ [pO ];=ܡuI&p6p 曇A HChErK?.{! @MkT_I0aӜ3d+(4VO25Y ]UJ`'{ҍ^ෟ [v#tﭷO50.ʏi~GO=3 }%.))0l L\<_̪囏/ lH1XfsM#wpL'3iT7m)Kwl=bbX{7#h i rQ0=Z^yB,ޯUZP0=JL 1M|}erO`]졽~Xg f(1+ t|={& XH枪e{MÔώ"\uECqjuuoWK̑7Gt|$b=lZ5yD[Sv\j/[lis:6NmxQ xWboGUq/Q"RPlm]"L ZZ"$>ZČ3ǣ'rA=%BסGMcz&Y*b)=y6s#xČV͎G_&Nߞ~Ba4+#2B`cxS O*I*P7.h_[!D.bg'?3?7g :@ML#4q7ޱ۠vO[`jYf #nzg+ 5Pf(o1)ܳ })J'%:'ɪz+Hl-;MEFEz*㭔N,һ؀*=,dW$ݿz^Lnu<~nOcJݝGY/[JoM\[.T颔)cǹ]n K3]P)Ex*t^ti(V..Gkc}v]r;b/\nZJ=ύメt;9sؠB}f羼FvK w`BR}{5#`bXs_{~e|%9&`dЄ~?樨Ù Jdapd Z۾4^6ֹMнwuf g~]Y&p!0C HAOO+M44% qMt $  )7)7\LG\D*(]n2iڏ(UWn~R6!>[e=g~pWoaOx6=oR龏%j=Em0:^$me -\ߞõ6̧֝ V&OKh^gZ5te {8ʼ4 2zm9>1m.$Lxe߲3]bCAhƜc$ۙ+wT| NjmV^~V&DeU%.DI~>εSwJZ([_5n/,l'ɴzߏ#}=5C'?@εƖI%Yc xb^\}=}8z͛#wrЂ$#gS+@y?;3j:bU~bO??c䩽 |}OLAf;G'6r]59[.Q QwXfHDx_y6w&ֱT/KKTqgDؔI$HX'2C؋J\A4,L*/W %1$Y&Q/Ԯgbs^y_Fh$N:bp;/'9RqS:E)`'"6<{Ył]:6tLNS1%GŲTLFCBieŐ8ϻ, jVZx>bi)mLd_ȐWQ='e^l>}<|F&!iq:U Y`z&$UD@V}踿2*qowmN޾ DeǯI}3nZ\I{c2~egv}j]DB,g:}U;]tFIX: x]{y=!=ӑm3n\Ze?O"dG|f IrrVh 2e伌ery3 qJ8/<tE] VUBA Y.yv).h2=NKũNa76*g _07{XNq2uG#[;?F7ˆiE7< :kHi0h~OE E0JMXKCUn݇&'UaiNbM_X{ítfSrnc5kW4iI}-5>ܕagv\YdJnUM6s_k^*E,pז 6eIn;FV>'^R_)cu6tÞIbD$1\[;Y=m|xMDUғLcch8!Ӛg9@CR(ʄC_.3UOK>kìHr0/0@Hn]/C~m GyH,|<-xn!1p ff7w/8s>ogU<_x cɖQz&sK۽G=_ q}Oo=aSsּBa&tlP,!]*b$ȩI)ɦmCH|h \aAo{` |%9lb^Z|,O|j7=y{MnzC:sW*n4 -'uw/ &:ogQ'=/1t̄*`j/(s.sш–3Ae+4VrRSNZZqr&w= {a?鶳xMl6FSds'GߑthTumܶT*K]IDA>z$04ΡF戠XQI uaq{Y&"+PD]O:`_ % 79HD=uΑvUU*a|^$Ogg8w+ˡ]j8,(mi UAۺf"Ys\EYX3Hl[kCk3r^4yqS *bMn\oV/|?·1Oy]ϒ_}y12'(<'J-T֠* K6#!uj?^=(?>fvzhWt`B9( &[]$+]eîʑV]QЖfÖ֘ްa}ԧ9*>_k7} $^Vm~Rގ?CLHqw{@HJ2SE <\Oճ,R \9mGܲ˫f>njAX(;SO- JYc*Ud^/EMJV6M)-aWNN/OŎfR7 ޔ"ix efe}d\4 i u jH$,>_BlB=Jƌ)iXt q?gtzAf Q߬FT-AAcv6"k| ƒ8jF{eleQd{܉4t ak)i~mcݺ@fvӷn9s ~JlBzuΐ(,sGr"$Ey[{}:l1* d(ڛ `5^ Rwvї8"*^^ f-?U+z: f!MW[P4Zrw`"|o.@i)~h.PC=fW= ?rv-IyLY >|~_zi ,)Dŧ!wE/'eYƈ<"-) gh j8D;6ݺ-V߹w+G#S@KwoFV[{H__F,ˋܹJ#Qw=Qa/لi Xp7ޤ$%=l}萋'uuxIa=ͧ cyT+r4(exPLSu@|Lv Po0A&x,y=G=L~|njs镙w̷krac~e̾yM0-VB*9/8 
`"y'm3.4ӂrYZhuy|E0?Pj6݂'"/NqfcNc%rk$;^¯RS@;~Ig06 BIV ѓɔ?gPCv5xWfh1:1/ 'Ii T4?s <)߷/(Mv56 -mTH5EM N ;K'] ΐikd8V]II=>4W_z`Xwo`:aZ-ZRPZCX&qz0G7Z#.)җѿis;|>&Z*d\(/].,T(eM%Z[7g_7-0kt^mpֺ?r)y<2ƺ\e\JO^o ,bU;bv7] źw'f [ljPcLr,dI+R:h6`v3rO=nȻYt҂ȴymcyZ "'*_d`K䏯[1.Ǒy%LFb$*$+H틬A%ħ}yʟcmNMYHVvPoK se#MI;0LL*&@!ʧȄh~B:-kmG3:~TnĒʾ2r9Pc[nF{?}kcouG9 ].UB=ey > ʋlʤdӍAZ!ZЊ5H?N:ަu_pU|=an˗_?d) >%ߡOld]6gf6*#~VGmȭi]5Ү-=>9s^LKRsescGZΈŶ;jrey[x@5/} v5}+:*%Z_|]jjOo=.df# ԘaS4W{L򅆳OO;+K}↪tS̮}Bo]Ek`YEUv7N@xGꢧoezo>}68.bq;"Su"qO2g]^n"NT4U@ٱb Jpg˅?SUd6L9lRA?:i J}3m퉍'ҼLm\@&=|~U}[n$s/kyriF-tHTUn8ocD,蠋]:۳xˣ':-E-lQ!KIw7nah$Ih-EL2鳺<lC?RB{Ȑ" C~n!$+K;Њy:#|X{羧4dq]pg0\n5r@qfhhAZ'P s  ZtR4XhhņiSIXBj8gd(A4oAX4 :Bi 49dз7*݌ 4>l<q3B}̾} L)e첑49B-kMbNisİkJ4;T0$ t[SSLJ4C8IҾ⥉ N 3 /q̶zbԨӇovI@ &۸yȣHנ[/I9G sY;Gɀ5/--'$6)xwtgug}0TKyR:&dˀaZv7n%\OͩoGjW_Ԧ* 꺎9JfSTèX|"nƽXr;铈ł C_*Ԗ'VZiRB',m$ZJԋCdާd.QRؔ{x,;靭CfyY$-J!-gc\f)ˆ{5reO'%͓ɜ1D ##v ^RSnp9bKvTU+X^ʽp-ykez1]¬\KfZ ]mVi}۱oc[. W/HMܻoS uJPksBԓl-kSX*8i6+{ӡ&Y$K0)6;s%Z4pJ^T!`8qI3-,)mT\h3N mq%E=r 3C/#,Bz*(,>HY=ILzs|:\ `5b$Af_ ̙F]Sy[bP+/s݅*GfF  k,' p|.h>iq+nQC0J:\w3l]?eQ`ۆqe#i~#/zu}Q4@p;!m :s i'2=ўEIخMk\D675$6 T.ˮ '{# PɥEƧSv6CO^(8 F'Y?w@W>ar3r[\kAL68^TA"΄$u HFHt'_v﷦gzqϝe?rNc2b ?}Y-d%`+U" n?~MKe0PܰAԘ,dVA%UR.~]cR ^ 8n܇ h!i.{G,VO#6T~t}-U7ӆ)csjȕ;czFu= TH@״ОBA6ȸ^X j Iv>Rn8S2eۏj>)l?}Qnm~mh2BEko!#>8y#sl}[]$eM5 xR_(]w[}'9g*P^lj'2ڷjxeT`}ҳ e}Z:IۉiӀ.&6;xX|z\e^Olm @>q>[ yNԀQA*W ̒ꍧ}=:Z76jedDPu'Y[r7EMؘ(cZzroY x"HF9`x,/{g>DWJf}~sZt}rŇlYF˼\%"b>E/H(fS i @2KYNAdIn@t^cz<ҌMzgS1X"bCOnWn].yO0PE@89bHfL/fۿD)̯,  ݬYݫ\V$f9 \\8+3GIA33)Do6We+Q i6ش"}os}<ޘX_1npbz8T u 6"_c\fc#L8}#͕OO,qٰ$d8ontb"KiIp7"cI 8>:^:_"{D)`1Q"l;$6cl&ɝ?F6\:}M>k;Գ C2?o]P`CÓ@I%r.FX:C!Vպ2]hKD8Dfize& ĵws,#MwRKvjq7,фұ2 -vX LHU|䝜Xc(˯UyǜWdvQ];o4F&(8ETre=ա<@krT=`՛Lsշq鲶5(b{vp,\ގ0A[GrShf~]:$IZ):5cd#OI=R+ 7+Zac?G V jo?Qf,Ǽtv,l0./{H*A@M9$s;;\vLڱs?۳}]-<&A\5!SXÂ!;'adEZFJh FO0|ғJ[)5LKִJĹ#r^ywskC**WtyKgY+Jg6 9]ᔀIU&z\9ypЉ33S|OaVz2gY`8MUՕ=; 8T'+ gKvl1?Ư^)eGbUxRꌐhْKl?#'])&GJ{94';ɼ0ip >7uF-~>u)<~KzSƃ~xMYj$i_H.I>6_]VcSuWazȔU/gvQhm|w=5M3hWBD3ĝx H9k9!qTT/vsYp=' yI@Sp03\0<sɵIFuP `/p /Qe VǺ%:?:5&mޱoAfRn̰ĠB QoB}l5ϥmܤj7ݕ \nаuY<5i@/=9w V  ·Dl0TB7rlcBuj|mf95 t}6SY>iL޷z|KϜ4dPQilɪ"$!!ng#/wxsܚsn玟ԂQo}!_Nˏox:+Ʈ.}fT=?Z1:r? 
HM2SܬuYWO $A`A4mX?ĆV /sTrCХ1ܐ]O,1q]բr8PNs;=ۢQܦr$ `*f,Б<@ {.4d 62s9謾F @Ͱ 3%]4%i/ <R+5ti]zY"ALoF@#:c,K TZҴ 6mΟmb Y>7E1% Og{;da201=V}aQcŘL Wx` B3%1'ζEvPOԠ$2i4x<#OK^]h,4Kah;疥ȻlapV8ҨDo;xXAűA$jf1R$Y" "'nP]Hƈgy: HhX E >漀\PH~mĵE?9hߍXt\tdQiN},Y z8r62m}(w)4=ͭ߳fK9ʂF+p=NL/1ܽH ٔ,;M SW?)_Rb^f{tlOo(6&vcA~EOX]oB$0N?CcmG2ry!J!Ғs]3aDXP\f\kv_@9jtDm?iǏVPDH4n_T(i4Z(QO%!A &.VΑ%^99f]h6F>vB|5b{%Hmѿ'~SqK_XlM\?V=pQD 0g} J4+Boel$=^Mo8)|E QZW\ro{;x=jȾ3Ggi4]osc"N$2Q%Zh'`< Yʅ( ͩz}(+m܁\EB && ,E/~x[2_,8мIx~\!"V^tU j q0<^jUzP8>p=ŗ#*ِ~9Νz`öN ѕ{ !V 9eQj utV9P.MbP#dw J+m/uhg7wEi`a\z].ccG=!yi azT*1 bm0[tm { h5cհFm+;Tc,tuA <f`SGFwQk6a=;[IKn,r^h(I6h̢uGAzϹy%7b@Lb _֞bW M7;Cr\[ Ԕ͞CH>|ۗR&Waf r;ZZb"o^h#c:2ցt;NR)RlX[E\ 2KDdġ&ؗ)ZQom<-ᢼ _ةό=5l9+#7FXGTa$A682+JUoS- i#q!76i 4$@ IYMUSW(t ?ic& ?U- ҃v:+x`.Ӎ&9T'J@ -vɚ  [@f;h>,,LWTg'\.NX>oNV6qtJ*D!HI\F?5}<f\pHvPA ɢFŊL-ʹ(򖞝1z."f9pXu墄 Y3+;-}΃G M0DԕDR14e;?hW5kxwx|]/lp@fd[z0>eBHf*zۘ0>z=A8(.Gx f+@JA7;y]/ i6nrPxиyIEhtq3If۳ωz|_3]מꒆ(`X|WikXiqoehՌBFˁ:J)D:V|n_aw&Xʧ'K$%1@[[% J_]'9pN0GY%#8Gs`Wj# #' u;)qluIa匉>'K\~/b;k9+7[Atf%iMy{͝&mq}ȴJNo|$WFٔK&p.; u32Z:Vo^Ӛ &Ō3ȶU~M`@@OK[H2d1 0uQ@RD Ts9;?Ov.?7`.OW~:*H'!F# O92f⟦c0[5g݊G-6xfɨaBHHt4 ilG,1 LpbFr?b@R,BUrpyYc~.{.@8|Ϛ.b6]\TU1U.=gQ[9,XVٳOL؛~Nyݭ5j̿Ѫů1,Ꚇa%5LQDMw?{_rKuJlV _dlGkg #"6F09&`tA~)2+2{ -Y.\]1;O{\5[6"A.J$۹ىPN=BL I0U 48!& ws^ERZ&M@}y 6e9 'Zd|=U;fsNCh&cB_= rb8cH&Ҡip9HIMJ%j`P dy!>_ZNwoɲ KL͠ܯ:!_"h0ɔ&Ӛvsa/8ʕ%'Y{eUFN1yv= ςb]aC%M/3Ϋ .MɿW83x гZPr0#-G u:ij7M=I&vX; ,U]G]ޔחZpBc㳸8ͲY蘜;W_FZVIٳyG*zd#;%"D705Th!,DC(LBEpÜqʫiqmUFJ2m~bsݐ͵O1JI0dyGWT$ ? 3۰n_cI 'KJ8THh͑ %!Z|֤% 퓜Zs3& a< @^e SّZ1oH`avrfJ`@A27aLy'Ó*΂I@~ӔsgjO6V\ ƞ/#щjVfȾ> U&Gw6 A*SӎNwNJ䡻XR& uUl5XZ3:.:$pdȝzr$blG1;ỷB^E)cY@GZmKm4!OÄۓM5,K!r! !5z'4ύǵ=vGIFռؼNx3Ver<8af̼zq&x C rGh_YRb O 21i}l##eyXÄ 2T^à"@j`=1[tQP_k:Wh^lphb2P$rB }U_3p~jՊgHf Ӓi;ZXd_Rnci&22s1+VAi{qx,[kug?ݳ hI F&}piͺwv.m2e|\+6!~U wYim8t{FcPaɠE A!#ƶY,u] S8hAQMOsٍY8N|m6{N'2Ԣf8^#^!UU qI&|ALҋܪl>`3_wYXFJ7ǝOQ;`:b=0)ہj:YbE{:J"'#FpТ D_A`\}2-yqa6Dlɻt8r`Pvj"eN3w(Kv*l J0ݖ@qS]3mSQ-8@ 4#:D `hCT]ko#Y&6ɡ֕oj*^pƩD͏ n_ZۙsOȌ=Um 裩ś=].O'{5oԴa8.=䑙&Y?si"̤A ߏ@̱U*ef* .eSok\)\X?&#z6>tWn s=(ڀl:DFFϘj>\)]XRGP[Ȗϖ 96,ـUʤʊR%zH71zNFŢ@>d%:6w 1 &m;|P ͿҺ7Eau,gsԊ2"-QP d^֧GEn~+| %h6 t˫Z6BF)QE܌Zf-R4U=W]gIƘ%1ݘ?T 6Jm`aԄ!T!3#P9Ztsaf>߼kXp!^rLp` b|WK%f8 .OVNM9fAcxC>4YNsy>e.0IY`ݒ7iٯ6d+3/ZZпjn)qٚ훮*u l]'BQ K;e#xV%ӌ3tzl8ڔ_~?ɪ *h%di6Vvx$iXk]ejAJl=Hq Ɋ(Fla"Y˘ *v]9ZQF%20*"bXNYExO-)a#2ȸpSv8 ȓ{?!z׶^dcUKGl9D)[xOJu`X =GH^ꗓFmhk:w+ҩ~n*]hV! {7[\5f)b,4Ǜo2_Jѿ/pړKໃ~zD4ZeѴa)vQaTȿF4=uPU<$!UrFܤf`L k6ab0k;ml&6YD_¯ܘ1`eOʥV;%[lϞ6shĺh2%x&?^c`Ga_ ''X5X*j20߻efun&97s$5l+(c;xň;whD&Mf&Sqa0k/. 
ӵ "I,RNPmϚwZҬ, -D˫"KaR 4(ݬ㡿&+l'Z8jܧrkFն E OAOYư'& G|'8JER PA%8$iŞԈ;߾BiSibQ`BR`፯1CW p; NMѤь]itCʆ)waNQႽ>niܛ.>MăRD!քאq[cNp Iq]\E|G" 7zROf+B eC%*fl+gZ)[12FNB4 ®IO:҇K)3.Rb+IX4l ?a?5F49Kh$X{fDDp9nIj17Ls٩Rd{)XӜZ&>]rT1;K- ̅KIo* Bvt-%&z@B%t$NK+ *7$v z1ET!ǖ-"6X3hVHssݭRx2sUs,EzT+[eA/V_V+8UYZ7'zgьˌǍ*9̊Kh-2=qh ^Z SR6-Za5*Ћj؞C6ܫVAnGFѷKW,۱2;*nlU3d MsBpm6ױ]k/CrP,LtRT@1ꎫ+Y}"S5wWK:nݚ#=Ahaܜu 35ƄyX7d=,4  JVO1 A 7ht#<`&ֆGʾ SՑ\-z0TC{Y;{@=;2Tcmv?*?I+UkLOd@<.syD,t:'QۘD-:`)XG?7RE/ZٯX@D N˿/º{[.?(}{?:Ŋkz/PVI{m}Z?SvosKkKw:s׋ާ} =Qn:kJ*T}lt_{oCz]΀җ:  .P;a%v4C@@jF:@Bz4kt[P*P5E 4@%U]*(m/@HU*FfM; l-MYN:4@6 44;C[}gGݝRPUU_2VRO@:sG(=ު<1{:8s޻w|zA}ٹfwp@z|iexpǶM| ۬t'( 6ڊ蒩]ҕW6Ƕ[UWk0h;q2^@B 9`R f5#!.9n|25ê,p4 -ڊiW-2 e G}hɰ}<\{hĻǻ(7ݹ<=>z{/0Ϸ{Sa+!ܯGu@>/P(2Smv5OPJ]hKhwt&kC5/x'.f_gjޖo^ 6U*/<٨϶}vl@PPl07{<3]Y >ڮCl!@>@;Hy}.;\v }rY}s}UgΠnZPuԯ| +yӭQI %Mw` 9upΪzdl:> hZ/ﷇ{{alٰa٦.w篱gx8؁OjmAiT wޤ4D L##@&d4h&C#4ѓ 3#Ljhh40 MS=4M<0C!6#MGOST~jjl OCM#(4A&M40 &@ѩde6!ha42bz)M{IJ~T 4OS&ژ@PIMMh4j~ LjTS6=i2zS?F==P2OLQSi2?TiQQC?SҚ6҂" @  IDL4 Lh bIi0hLS3E7a6驶$(jI"&M4biF=i2F$dѤTOLmM14ڧǬeym|W*pU$c"P0-G,]21Ql*DY&li"럿펴+?/yk5?k?}k R?q,dO^;바Q`g܇e0=D|V`*zP扅 k"* M9X` ߱+@Ca$P 4,g׭KD/SXy|攇@ ë"l>K Nb?IxTiчfV()ET3vʳTDrEfU aqu:;TDCݹ˞GہYlz`=Þ&{Oyzr撘^mk iUN\̶]@Q $MQr %h6XIupjAIL?I.]6 ys&\om7ug7d(ΖZ9,Wc2=RCܪE{L;[4eWG!!/ЧŅ/7rzm#L&|=t$yv=] C*h*{c` u #IOg zB|v{yNW INIƀ:`]̝`Lk9]4r698fުgw)7I0 ̈dbO~e'xzaR>5:p*p!Ή݆Bb/PȁNֹ83pKxs˸ w!Pz> Oj[HZaœȹ.6;b fr7_lO1JlS 'a'JqH4z3&Nu5fDs?_AQ(_ArB,gg!]l(G=_#L(t(' .zęp%0Rch!ܲ_%$ Z8͵(ǜz,!Bi"2 &fEʽJ\cDZƕ֪Ħ\)EH$H$*1 'ms:9د |%ވKGQڦ瑀4C #v9M4ک\ݟbHe`,1?FTQt_gW[sK8} ɜ֗,VK-W ZXd3~" H]G] <}s -fݷ}Ov3jZ< *z|J dUY|uv$ lFԪ*F"-5@*7{l-&Q-dnA 60/١E!m֮BCd3ӠxR ~aULH}M//a||؆\p]k6 iogԶŐ-\cc P (!A\lQz6.t3d֎L~gt:*$"TѶ#鹮̢[9{fTb%I >O^ox[<3^>7S{J&"(Q/EzSwS2+C9@ި\}7"} w$nH͑wCE)]A__wv*)CAEstE?k Nj.K+m}ߴ(kP9:%4 1㞧Nd<ś])KUxv] HHLQJQDEugÔ>PEL-(R|9G>}T `#8XuWd0mvNzg`ŝ%2,C$4B zH = ] $ UTD6Obd;jFM!.ÙIofFf P|)w䗙 !?+MRp#0憒d;GynuӝF:swt&gVWms:TZ{:NAqhkQ"wH|SK$Ղ#aW59J#Ɛ,C*J'ôN-{>spNЛ>2,QEPX*$ y>?mm&)( H$c -RUT4SMTPL biUW>ԋUQ Sgv~MhNh 'a4'`̂SC? j)Lc Ʊ?% 4M4qq E2D=aE%E4E-UT-4t9jz@xwŤ^O+꼻s>Gȟ‚}"ʆ [{#{~! P)BP-*P(R P4QAO'발IH*"2>_SŪHJe4gx[hS/ d >T@;-]Xof ρֆi߄ݝEZZ9k n$A"hJZA&X(*JYb "h…|"șxh#D騈a)1Nr Ԋh$ԑa0%-t,dlwX+6kb3럴䇖7]KvMURU-TСT P1P ЁJ"HΖ{gh=5p$3yy2Mu:`6@(t?\˵̲RAHE$P!J& " h( H"A& Z**hfc7*Eb(Ȫj)b$"j *((jhZR;Zú((AI42F,a#eNw]=Ov:*43#JHPMz 2J(𼧭2g7 xrCDCa'0PV\ps~ܐ","DTQDH }~b+>B̩b!Xx)+I=؎{?1:9q#;6h&ʊfk"8Gة>}<|A+ !XNosFu[[V L4ew_#T;/7^辛;4x]h A 1)R j%iP(VJHBb iHbibLHQ1ESQ4ACMS@1REH4V}YTEJ@P Y1k4,@G wB"-"44RDnjQ*`,,PhS3pN4Oczn{oǁ`5@N" B!@TD*/險Ȣ(AQ<#,PD@UT,ET CIE!ITT%SD@QQSM4KCEQ)ITDSQ+JICQ+H4SE4HСE*Ҕ @ JPR@ @!0Х $H҈PH%-RJ^_t/}Ѥ{F'x<נl/-,y(1e,ߥZ/twރzE{_㞯s`nvɼH&iXH?MmeN!6xx$m'հBL~5Ħѵ"J"i[ߩ9g!vcG i~'WhP]>//* ( ZZ(T T=5ed "4H8((ZĮ!)iZb֍B5LDA%53TDPUSVx0Q?c}R|P=nԇ\{(Wu65_>c@b )&"rUB)jn6x䷣tA2#7;6o =Ol/ CDI4\\@TTSM HPQAM ʏ!jbAiP4yzjB )*"(jF h B |("WQ XhJB *&aj%fhe RJZBH{0JB(J)JR) ()(* "*&*&WڠE X%4RDPSIPHRD@ )CHЪ$V0Fglvȝ>Qmc8( 4Lvf΅x'|>~|»)Q3LQb?ت(EUJR! 
iF)B)[e *f bhi*bP@ SEID V(Z( *3//cz>A*[d(P4P 4%+ADI TI‚ҥU45TEJ%)T% H44 #^0*B,ZAVB=='eLB05j&V@$Ebd2XsSΙJ0 &) h"f W/ͩ P$N:8_B@͓O %s;!h&8/AP4; r$]oQv.i?QP$' YRY=Y9mYoW`=8mYjT;*~@#[7YO)Y:z R)AM%TDQ$KU0 E HСCz(W'fJ)ZF Դ]}TPAD겸Ԗ-U)MP!3K~zU{!J%#M4%1@U55ETSA@ 4})ZqEzϓC4AB"PR#J J H(P DIH*R*fAz{BhSq H =@ZB$FihhBT h AH!@dPH$D':g_{y#t~9\XQt5#"F|]+ҋ܆\c|!*UH}XWHFg8ȀOGPY[TZ¢Ȯ㹭KK@|zpǚD_%tߐ׽%օb1g ƨ$ cq!ĎD '}{րVk$79QGH7ʃ_ P7v{jà^@#MC͡`^k{xn붾尗Pk#%z)Ĺ9pN|n2ܺb/A6iJ.<VkhsJת>*Gu!ڎؤVJb&}``H,D!J:}wbAQB!t)H8UJZM֞39ކV[p#i`KODU*+ܼ $qNa <#"bL{#UACR8.WvcHэ}淨 P%(4%EM JKCEE%W38fm{mIQe< yZ7x*bJ;j~O@A@20Jȉ+9ur*~uO{M);A"*; 7 P8Sr;|Pqv]=.禎σ`ۡA1nw닠/sn_x;{pIrybQ*">Luc%bxL&Q ,*8$8JR9 *b ޑ s 2 t&,u,|cn^t| -tlM|G3"~*%NR$G}3gqv*uUZg]j΋i_.t(7_APA~nAm2bK h )G 6-iz/QB#gJEA{yHRC EQEQ L4>E R"w4<nU/.q35xN"M"kټ^4@}T/dzVoݖ~8 kNa 30RVoC8~8 g,`܅#oV`7dΟMNnUwWeB@_',v((`T {*,d`,E `^_^=Y8UE HeŐJɝCzׂ%(2EzxsbYsnLAΔ54J25rHaiX iX 7H SX9fFՍmAS6rP H)%Uue/Ƿ!wQ+lt%mtl5yv]#Ncc1 6CJ `F .Ҷɉ\ Tc>e1ddo-J~Ck=VuK;>}=t@д4 Aa>{?q{KfvsmOp'yD KE6hAH!+WP>YAB<\a9A͋q$6bŗ;X`?W=E4?|WaWAHb׷tįH:,T":8'% @FJ5z@;|u\|(ay zfbAJb#CB/ ™MFD} >e<[QL&q@, 3)5%3S)co pCU1ؼϰ|x%44#<,HP۾:[y ;LP1hq)>Fv۝|<}IX|le֋^}DSK\c DТB#:W; p?>gH)aK@wi?z je\tڻ2215myC<QFOWS(oigS11gtO%*܅ !QČunZoa.,$o&s `qI5NC{k -v1lLE89#d7($.j !lJzՑ&3k%&owH5t n$(!P2 Kj{'Aj?Ck8ZyS79"$H9R(+ Щ4Kn7O^!<GW_Qq+hΞgSӟ>?Ϝƚ_j_@Fxbr{i ?Z@su`P]F Aa`z/z,8Y4`w>΍T 4]sh(%Y>'HTrR]X *\Ru'nNBѣS\"4 v ȠHBt' T5(ɍAIy6[S `OK/A8@ZfOQ1ۂZZZ,'9oIB朌`$N|GNTsfK{̭a plP|*-t_a/*pͻ"v DD )eh5<'#=[}T'"߹u.z{n``3 1̢nxH2I  )}8kuU o|n0psMwAZ25Ab b>SStc4=v6 -ü{jJ-#:aW_scZX6) ` ( U ;'YJj-T͢STx~@,{llhgA{ K1K9/ev<.M\aA;ƂW\\FIB2 Ӷub}TNE9p Z u}Je!z.pw}GK0t g&ˌ\lt4hKk"`I nt\ y]gofA[)*P 2)Nrףdv,YNkIcCat 1H [@LTIHF9='4 l0l>+: Y5( 2co*dU;.;  LˬC,i{v>"6Gj}kL$)FB[(<<P#Y@'Τ Pu;W' %:Ek@^ Q⫂q' XZԍ7o!s)+}`ځzt;5B(b#(['ؐBlA l8nax'7 KU'' 38i$L Ά̅u3y>(\t; &Dc\EQx]^zu a>{-l9G.`L'BAɔ+!J,Pa$ib"Q6~b~cB|(+s@tp78Ɲ=YLpфyPi<' ƈJL2$žH.H%+n9h)'ԦDFHqQ_|2zJЙ $O[jsy|#m;y`tgcX"(sQ* u=}FǵC !qfw[̼4tS$A Ri|P@HF@6,AbmMngZ=G!y̤tSf A\̉!rZ,<шe{Z)!BHWKh8Xe]΄/{쯻d7p] k s@SrY`%泑TxU D $ vbz䐳YQ [t5 *ckE]4 Y|M!Z|%@ P hG8Df$媃Ϩ .}GY4s-ޭ+ţl'UdlDvr9"l0?5ӈx, Q̔Xl~\j3ᮯPtI|tBUSyI2{mMEם^$b] ƩŐuϓt맩Dqͤi3m´aş9=#KP6Mdqrq%@2"2ev1HimZcwċ^#XY}<ȑ~{&_Twr'6B[K%L0U>p}cX"94Y`hieN@8T5:$rpfp!@W( L_Ȳ3a:x*D Z7[hF(99}A;36fg|yXyr2 w^wqrȮ0EI?pC.\bV,fHj=KݹNZ8o:GTu+8xb  ɦ\<̊K c~ωїlxSoL;-)#6Pίf-2" ?-16KhW }r r>H~>3o=[%dHo"F Nbr}{>g~"# $^ɟVZ(8n|eq˯9Xqif8qg4\S]xf/-I8|=FFiU ! Pɹޔښ;>X*.8A|u}6),Gh4mu757b?"\e1IEb04KHGf)x8/85π  <H2 @AV64*HBU%ҋPJ ,E\9[owxs%[2^5 o3hˉ'd@@hG yt!sϤ| Uو&}dJQ Yc맔BMM Ӫ;S^*CvQ](Iq6CN"Z,{*oîgW?!+t3J}wy@$e+SBi'i? _%MLYav1Kj ;_j* $(bNI-K:8e(駐IW9z+|`;n$6K.Mq9, [*܈6i#`sInj&o kDa<⽫AW娶g#LTU*K/h9Pp#§]}!ihT'ԜshL9pozN{D \1@2$)+!‹@/v.h\@L}G5WkDiIwT/%w#Ǒ_agr> (wNK08iރ=JdF?fh]>} &tQI !URJ"O_4Bݫx< Da* Ǎ70f.."JlB 84m=AK NWnbP*թ BLYuovgmQP'=GۊvԜi1}swg6B LT{`Pw s~A 6Δߺ<}" O"BA _!i<.vc17ϲP2 8Xq9/kN`2^]fEbB';*ħIf?x3UZԻN A8Q{ ulthIɔyHvEkSұw=!8y)'8OFΑ2KFa!{Oeu>{s_򧸇!ɧ5.c9o8d 1u(^\w~|>c9U.1 GP?h҄@ d WnU <ou[q\+5kWNT"0NQ{湪 Ċ/+ ?hZXg˳OeJ=g<;>({>UL.&y^g( !zOD@B IS.- \9ŒQI!P^O|c  ԙtg]; ;tu|H?\qPﲝ?!8#sfu j@ztoz@#sy`C" UB6<(6N`<rcYMk܏WD}áuGo=/b. 
M` xX0 9))ifWR҂T>nƨ D)e2h>)hru51KwT[3ʒmYD7/cw"HLW:)@ dr89׫1 22nA\0rg|oATFډɐ>;JgOM$6~7*)60`r1LPTLޑf;GA}%Ş<'mn(`ޜp RE#,1?6G*CFnٸ~JGF`y @QR,f&(e-TM'D v'i41|밮ž |7@_۟9OWBa%f6BA*eCW.3+恶&F#sX$zHGX^YtRs6'\5ǽ0=²xj\ښC.& n5>~LZ0; d!!ێ(l2..gT$ I$Apɠ u8<$db}Y\+5݀u'# @ %c.W*T&M}/}~]ItF5ƲtQV3uǞ:K\Lw.QkP9h0Ηo!mAT›~/+]qUTt*DCۧ| ]W׷D\|ǽEGTUK`HXV $5{ǀ1g{qѽnz XeB;L +JhBJrF`G:4<x5` N)O'Y1M8c01` L "3~06͵4rLBgw*`EntKa  b*Qw^Ɲ3Vn>)&AQC :zKvx:U5 Hd<|`م[+XP,ܑܐf8Ĭ?_KnE[1DOHB5B3A'ರAHH!DTpA8s<.rHK'9rJ\m w&7ըND((`!)If3I7 ( )›]bq ZSa[ĞMnµ.Cx|(+j$&l=ٙ';2 ^nQM_J?Uix` UaH 2 @s 58$1`. ^@>vC翄ɎF\0&7X搇v] ,5R빙=Tx@o4&JP/<f;%vyf!LN`xFX Wo2! iyi #w)Tn#4 |mZXz~$6J]y҄>X(:hM1e5P )JR:&H[G t^lO]@b6ޮ&% dZPQIXp{:j㪼[Ca!rZi4Q:4HAi [g:#-X;&֤f0z?Ò@_nJ/ʑOh%ڲ}?{FF t0`c%b%ׯ8xAj$/nUshC8N\@C6}K2eA:dbu1QOź7u0ҟU $YDDߕd s{&D3 J -A^[/CbgsH5Xз#20VE Uv-hܐ3 7je8B1LƬQFO%eS UI:ȐqM՛YvbU]޸sKźYI@4D w/6 $PθMH/#zw 2_0@n若:M<9γhHޟL!cff1` =T[$W( HBGyo;OLϓ0Of<K7Au/;=«ۓx Y<I>@gC "Ԗ EBY(*f8Zw-4BBB$Ot)C%k"siKH?6*L9P*4 R~H3,~~ b p͂?) ΁o!ϥGqJ đLAA)hr*<__LLoﳨN+5el|,k,N %V5Ѡyń~DA0&)7Wsɫږ!9"ﰌA1yi^x,38ĽZZWj s7\* UW)<*exO Jk[Z *QSNw}'#aN ',f L@>ABpoىlM|D *N0gjt)S:Ӛwr쓩t^vcǰ*͈aaiӂ~)3@m`,^)NpNBvxMjnwVs{^}}o@*_ҥN^Q%ϯC5"?`TRVω= 5D2B4+3Y/ɿNNϕq^U>, >l1+|?O=?zih9c[{|(_yd6v}ߖ@s#?_9-I΅襾!>tt.CWQY|xʨBG PRzH 7ٹv!CtE5hb5!"|>o?dbև^g vc祎2-!g $ y~I&$ʡWom^3`O;Ҝ޻|ҵɅGUq. ۓ xtn@E[@29wWJM7ա1AqJsV ^[oEPitԑ{iH(FЉ , cOY=Ro;;G@nsUonXL zNKr )=Q*Y- U v[JayDSqi6x*7둈 0;w+;Fƥβ^Zh-Ee0(}vͯJ]MMXmunLUcdy T*zsլsĬn8vsQt:ը"d˕t[X@u\^CЄ4 EFČJ9T/!¡<6zxE| <1j{*b "4`T@8O_ Gc1L@ vq jxH)1E䚮 Ѡ"vn *"(4-R)M֥S,7 ~~Dt'O趱' g=SApmwEy_f5gS# \l/=jgO-Kl@P8%,Y(dA$ϕO3]$av ¨%M߀ʑ u0ԑUE$_O Db _JB$Rv ncQ=^bt6c6 r[d\s2BTyٜ@#&toxG9ɽ&rieH@ ,uyوvW9] ]X,S- .-ͥ:=C2`Dc\eZ׀E{v7 $hzG4""'ٯkR6(>)`RTc8zZϒUMǶ._&Io[d`(b#-ڨiy!"(- <၄C!hL`RH>T nIU}ihUخ7hXs j^2RĀ钙(cnrdzVW7J>5ہm,sk eBCU,8v|?!yr8"/wl*5IJۨFÁ9-\[1v@ka+ɆGKfmK&qF|;oQS(qb G݌0?B7$ՋW6m3E `g%h)WT툞60%6,`(yT8,e;0D2)~e'ܠm Ƿ7wԂn~ ;Z~hN$^mX>bFN>jR•Ύ= ڥu{[dFpJq-7EIHStojsqkB*k;=}Mi:B%螁 t[}np2Lu+i&^^A/ 4Jؽoh3ύJ5-/}220 ô^[I- qT/Rw}m~<{|&"%,-$(S,,֨HD|eKd|Nڤr3k ,FB3dt 5rgHc~[S a.RcUvD0|3snA=YᛪJOUTL76P umڟn[% lPg5!˟_C1* tSԑO7:QdZuH^cͼ8ȡ3|'֝$}`Š3T$lBSji=R/FhVkw;w~ 81J{|Y:,Z{^h:Ʉ=*S m7vx]ʁQ`rHG%_[O*v;t=`H[*ԿS Z(yG.q:Urh[ub?v 1Ȏn[ رQ~0 ELE -=}zbRl)8%P6hKEr2bS8{N/"s4S&]Zjvٱl0|D,1u?X`FZ)"ʥ%ĞT*| Y q pwcxb!5l:I㸴JL?z:_|dB\SA='g7'm08=iPkqGi`Tqr%\G&iu捦' KJ'߿}ȁʜoDNnŬ |*<*G-K]QKy6`"ZԳk^ݮt/! ]^ Z0T,1AA%(J]zyYs 0"rI6MNWEkf4XhnN}YJ! gs< x ̢dHFT[q杖ijE\\&1) ՟|ӧ5 HwQYIy 17@16l!0Oђ FԥO>%b Gפm)tui.%"b~gv > Oݔ=%ΉP\O)Fbc܈ӵW!$xsa>Yw sW\a  /8X5 n}l;4/KJަ᳤Pl-m .G#Km쳐p.:xBgTpo<j}H~fx÷ c(V>&X_BANASx1ŀ9 d܄#*+-}#sC)TQ4kkѵ |pϷˤ8n9BQ \IHbj:VD5n?u^xZ!@r#ct׌$1Hn:WFRZ\8>r,os`y~ $m}+LɦMuӀVM&UUa-' YyrKdnJUGV/~'!VN DKz58j޻+oVxAwɑT5BP6`GaMd+ $4WuJ##ͣa@ŁiKA en>mDPj)u&g͌?R YO.a9AogDw t3Cرrq3&$6@fa5.4Tzf[x6w_bPfK:,)-#@wkPX лAr 7%nki1P& S~% kfSM@>s"d7"CO9CK 2ueD~/l3ܠL|y(5xds{E`k{%^U83p.`oJ1ܬ3gDmE4o͸%;aJ]S"b#h A`q]nksS=&^5rA~y7wԣ2xo$S#p2gEEQ͹rN 5A j>5:3i(gϥ*@(ݞ'xF-<0nZZHb,hh ZGm.gYP2>ZLu3+W=г?bnlTO# ;P ()si[ld^GkіCcζ:t~| Ѯ9NOq9.f*\Be E0'ڗe){zs , tg< &id 20dR>2CSW"dv=NՀBHbjoc~OR )̦[Kg ?=t:k}D:VTA?oSrggC]&p)Ʊ7٦3F{~fAkm5IK7)KD lDAŬV$> 50b#`m]Ȯ€7*)g'L8pd'# em@Uw^+ɍbpLvJfݮL5gTJ+|,՚kv<ôUC#2Ɩ<cv} r]&E͒9iϓU-R#27#c EH"VMpj{`28GhGlASr"yzh]*GkB,[%doB=!|Z>.2*u4( w] * {}ӏ. 
xt!+T;m- }1Eo?]C5( KcdAZU*H4RVw f& דp\b[;Aׯ F2k$a<]|#PpP~Ԫyzߙ^ӹׇ rb{cL|ZNt[dgL,X?Ch-p%zB^/8;vnt;z\Z}-ӦKZBぱeb~s+*4zm}I![)Om;'ꋥQ ݴq}HjeKrwkt]r[+'$#ye[Mgs(5!fߦ +ceϴ?q$ $kfBuOkKs!49\2+!mY{ e rGwhENYS;g3P0<j'qdjSP ?Ev})&?C>?ggqC}\yA#QtRX QQ_ _Mj͗Orzqba}gM#/BC_=wrg^zis\qH@06- $(̀N l[): ‘ٳd޲ҵ']Ah33yk=ǟKň/BlNg˄>0+5Td r܁k6%:ϭh:lWym7 EVuK׻vM?'$TLzD* W"OQ]"Rlk`嚫ycjAIDTmas"k3r{|U ޺E: QjEQmZ/mjpԜiM' WFI%0sH\p ?8?BR~G'#D@:^"}-zvv>(T3 8yi珽7fq͸_>M54 CEc21,qt8]323=CDiZ랹\r Bt-3<G?*""0*E*DG"+Ed;Q n֤2 M)|nDGD(VJPutNDș)WhR9UtZ D"Wh:9)O'RR4^bVdλ؈Y!L3Bz=/yq2~߯J !ب߷?%G&gx*߷uImھHZL8b׃H?7mM~Fs8o#[Iƶ<_[v~-ǚCMB믳uiKQbz[}=}iGfh%4+ ɶMZ^;xI%u|~ZzcU2Ƒ {gCwq8$s=8a%'SI꬟3x/)sVDsAZO 33 VӦƳ6ЈfI190s cPB9Rj1FiYmv)ȱLTciLHE¡U7hAJeM$O. k;Z髳@6W(aeHF58-I$S9^MH9-;j,6Drƛs}lj䧚\ 3u(ZJ >(ڋL?WK>e98!}b;kT\zu\@x1F1NS 2nDZmmXf k, {HۨhAL=nV?2;- Z[bgp{#h'9 ! a^iݠ~Q Z'3P2u#[x[>n1v_/Kh{$lTw{r`՞}taWK=P}t3@dKV,%t#ɪce/V ?Ҋ_l=P:m(zSԛv6R;õޫof% tOp8;8sXJ6זq:)tF>'keY{(k:"Y2cruX1A2Y'=q~'m+F)P+PZ;:4"pxaD2Shk)_*YܒsTՀ;[o ׄ 㥍U*D&8=#Öy%^nި@-Mt79?1͙ bA`=v>!݉}Dn˒H .Fp>A.tdg'U^K$9<=xp`YiaC]3U|0Rk}8I M0]np-Ƞ7b!f^ x*4$~իmJSA)yzab9eicM8ܷ]R8)/9T+rgߵ(iQt*!S֦̾cej;H|><)mv х؁0rn6̠ 1P_9x)n6!3D3o qU`a%?{nXW߷m ߂ =O~gmD؄CfӠ{Zk&Wڮ3I/xhm)uX5.fO:{ZU_ЪHŝ0SռK)l-_q|妱 e^Mt]n@XutA>:$"hZmڽaN&8OW|PQ=PO: $flj_A0Jmmj z<,Um+% Yu_bBVZLgK<,*9\Myx/tMhom>5#C޵\AJܶGE\ICg(ɖJ I) '=p賓z7\}mU1oӛċ)I58[$j4}[|'eϴ{6,FĴB,wE3-eZhPJѶYs}< ;M>DoNB;c,Z Unj-|I"͌U vs%fU4fKl9Ѐw;ǧrzl5Yf٫C=u˹ qehw<|Kܳ{e*^P& E?x?u+\dt Mr0Aݗ}BwQ9pJq2a)j$x8umYb/}H6C0.6ؒ{S@PAUc\[ Пpۺy,oA@+w$;=7b21%MKӲy}fdln;}|X0 *}Z$|;i":UMyw\j4U%QNPפLs鸜Vu:x9f4< ٭ò ]/<*P,3lhQϚNmfpf#@48-]/y0Щ:Oqo ܝggF>B&%m6\<;BXhdqU@8( #V>.EĭW$dWZ'ψ$&Fw-j%w;;؛r|(#ki:W "6^"^|s7 k؎m"PcVWt,kz{A7k+.:*7JcЁEԌEsA#xhfM\+-"qɂN#ޗ/q[8q8udAf22XNjE3LpwcvT#EBckWIO-Ʀ2{Q 2Ϋ̬W'dCo>@>h+)0ImtP{+*+ԪR:dA_!HE ysCu~s}u eyvëYxr$ 2ntR4;ds↫ۺ峿ȓv5&i$9@3k痠A~jV"KUڪ%dWQ]~4,c8Hk3`qPk#3K$;#T3%C~x[m '3xͪQEJj.b=,Yt2!9J z(@H|aV#ӆV;#aiu2/6 &i&:`U|r&eW|vGb]fWxiUuшhAMmNCqǂnӾLPGWdo,|$>b}Q"BZXϐYm=Hu%ݯo>~/zb)lqU";Uk`3= ,X,.޸[omΊ" ϐ7Y 3w 7 lz֜cfzG 1> ᓈt@Gq)O7Alhg?a!:R` 4V'ma$BNl5$u{p&(j9z[:r*؈Z$(zxtxl!oU&9Z:p2dZigv$LFnʰY⡥i1t_ۭkC%A`f e޴]Ăl>Fٷ?㱢Y:۴:$.zef+T=ۙѬy|@ 2} t(pNa 5 6%S:܁]ublvj o#L:̙8"Vo[EZ, (fpzY-|JEPE M 1w9m:2\h0>kfK,^;ӚOAl_ްm]"<(AJd 2󟘂[,vǣ{`kUKGk]#6Λ14ة]e6G3(Ekz-U\bKUPg> Km¿!p*>-K^e}kU{hWB jimLp(Q0ek17wqGB*}n "sb(`~yqAenKlZ[ϳhY{:]2V4"oP@޾;v(p`UsbiuRU8:99ig&6좀'-k\iֺx#l!&|h٫h/<;ܙz}C  @Dp1?Dpy (˼b~C58Dh<WiI%;Ķ@˰6Fng^: oUQ#3pt >B|QfJ8=&_dBs]D%$P;u7snU%ѫB*@cR9tQKԉh-Z7E$ woLU߿Fc?u׳  #Z=v G@tH}h-4W g+R7-\R Lz,V,FEz,3Yl_:ggf̓å_b{V3+D>6znDZ ՃR_=CY01.{#kbu}\oADD@,-5 *"wXZb*hYІo%m+wdC\ge5IU8z52ԃǾ"-SqLZ{f6v^q[Ԩ֓=h9,;]W Ou:K6ƹk}m2m90璠'n{NPgM^v O,- φ+¤j);04=Zɯ' DjS xNM,М&u]\ڮ'}Sa?p k#-]g3jĿCeb!m9T+!<˘~n{+=μΖ\M#[{ ֡PΤMrljU]B3 }0 `[L˳6~+?wOhg%1ZW-r-Ng}:ag:zbp:*.GyЉQ["mJiM8*OuC]#b$4iʘҊ~g8Ah.9-.:/q^uL,֒0.)kO>d0O!2aEםVC) Q8hҕ)v}j$^mޭ?c$GrfO.tG|gP>CͅF]ZKltSnN~]n;m_ aոwXhwՋï[,/+R[W5n<*&L[-7f/@XX[fBqt~{UYwŦB5S7>=;Lvwx7BP[m-Ni--ϸ XmTtPfASz{ [97Ngb ?:Z7XM$G;żytgTgyc 1ljb!aWp(Ur<{i PFq5oiz|U'Scv6WZȝhL*3T}q9i1c%T}qfbXfbr̥7BEʨJ_ѡ U CVnJkd?f>އ l?XՓ5p#Nёn]ZOSE7|1meV2ٷF(?gD&R"IpM$i͝cMURQļ&-{{>-vVs]\m:^V*0 6ZH+c']wi蹧+VoAC!0S;ݥd?71&y#d\DS!6Dg="&x^,:dc(\1Db%1o m-g dcf܎!Rt7Elb%v8M5bh>ݷEw .c搮lXq⻖ {2Eġ .S0PA5Bffl~mY<Eڅ2Β,qԪ~!M1]K aCnچFBYmEKK:gIeOVr9-skޞÀglv?E٫f4lFAi ygtڙC @j(P(5XX+4W*`0LwR.ʫ4Q<;]Yi'7Zg t8#5/uosom&˧r훪ӝΏ\ΰu(t+Dv<Ԟ(N#@tMMZ79^);2[Q0,F`9{gRL} l㷤"9׏6j83ʵȻ0`H|;(}CXzO6.9p|RL6PxPMɹr`58eSYY+/ǀs1`Z%7@8CJ8糓^s5'.f闟C_ #Ҷx|X3;rP־i>׆d_/-1l:Sw|MOчzaɖyruai s9vf\8|g%/!UQH-Xutg3φۨ=RcZTH{i'-QG~5O]C)v&Dy% :`oFĪt;j,W٘oC4l8v-1OZGq6{dh ;DG>B]aW塢},v-7i)ͦ7]a &?5 ƣ' nO;=H -%+AǥE+q\d=F&Fwr`ru}88u}t1t ۽ }b( g8'qotckan"OH2u/$Bꈚ&μߒ"/ۏ:s0RX-<2C䁈נY 
\D<cr݈(XHHo 8\$Ū3z{;D\vgȺʺmUW[Ģꅁh^UX\Ay>rLs7/=«es!=$:xEᎴз?W{h;oN'[FTi^S*D8HYDc 01f{~]?ViM?|g-ϫҾ,/m'?`_=W@͌BDDDF"K1kL[tI/&Uz{l ?OY~̯]D.]"yv2fg՘Ρd`{ĩI '%kd )XfOR~}Y[kb~>[Q[a7E:%tǤ@wF\B(H| ܓy{<KHʢaB_~"3|kF?OMۼ8%2:X/6^kf5$^X0P+/eDZ,̱ODq+_Tq-#4†#Oްa5!n t`Xr, zU8 ȦOY i@梃;,?OP\/l~'qsm73p7EQ}xꪤ_H p]gC>ffH~@;+x!5^&CtUf(ݟr~ y3yX/͈AH< zƼ՘gXBƢ`pٙB__&< ®-?a߭ m8w5Ddys\/v+"U&*z+K"C=BuQ9Tv/\:_ʢE?= gbݘ,fNy*7,aC$#'[溫򓱇?esO1I+GToOÑ:]r?^CHA)\oaN|#}C(#b?oLvϥH-)C=Z2*YI "tPuL6D]=5‡q7 p<LyX<7 Y^ls-kה1uզ8 });D÷rxX-KJZy:|ggU@"8rX:]Zwcӑɀ7xпW9Bc ,8'pwh{?9K2潍l{O]tY7@f_jO tE@y?os KjV'ԚH-a} IIK$<[KuUKb _VQϘXR="V =46}M}bmNcЗD\$#ܩ8ޱHUq0hPWOލA}}V]<#R-W`d{l,/u*Y|K^8"ǝzG l{JDLniG "=/\$ť<5 C >QdF5TPʃ,˚!ALx!b_ak|'fy]ݡh}u, -~o4j hI'(>wﴘ KŁҁ@|sfwϱ wyCUÏ4<:T<_)Ghzd4&!Xbd:s#bb]Fmr_H N/WF?0h͛;-`tbUK,kLc2dTRLiD(-$$Se {ZЬQa "nsJ0VqQiO_% KQd,HO"`0i ɱ!X G sa<9W~ܩh $^C\3ANN#Fd!NÀnձr>!4#pېH)P%BE֔ @(Ȉiz>u7be`-n6dHE,t4 2YtXPS;H\ :7(+{33 K=fa"Ҋbvh&!/c}UK S LA`sGKMKq(nanSXT pY㕈 ؈D8x]QjaB(PA*R-d&unf3E1\ ҳ$]|rݙZ"vO Owac%Jbuȁ?,MڷpMBKt!c<Qck>f! /ulyXk:-_6@Y˦~i (Y*}.EvTZ)*JV'Q*l>ɤ+^5)o;E|fSq)oFDx"&>oݑf2lѥw-|J1oe>ܶct%_jPA,J6jYp, v{c-?}_Ҫ] ݳH~>SE<ުYvp5:_t me|G/߳* r!;aQldfざfZX%J[*TTpPV&q(!f YGlb#'BkOnS~o[@ۣWh*L"* 򒢩Gɘ"M"@#p\\IFZhHmCYp]=ʀ/:%gPn Uq8_Oz 4ٹ+\Wl8rX/w滮W XO&:!mw5n>7Wk#5-`me]-UC{ X AoQo-HEfE~[vK4/Oi-j{5f1ogwk!NeɢjS9 @j},S5aSR[u-lnbssCg/; $s!k_[mГ]uEOwB=j{8a)%μWO9V;Իhg{ ۿMt5 rgUǬ$RMRPlg5JՊQ]ﵴs8NCiW}=û(tpkh̐$~ˌo,8uژK}}.:GYf+m]_ty)e}}z.>gdD81iѪaTBS$ QR:p'0ُ(xo#Yَ;n R0"'A3J 3$ } ~➖6w?tK~l dC;nŠYn;"${d.xvH_OKkzuPTvOY8sA ,YuU^EUMOߝ5[F4s§&nwҿ&jXeG3wj-2G33xX I=~{6GB"^o (`~arjDSO'ah6֨Z$\T-!$p]*U aGVWƫqI ;?Q.GFێ^q\U (RmmQdy|"Ýt+=.~OkIC˥9;u$縥3'pN=0Uq8Ff`e 'f%Vڥ-֖eF-2,p`XJ;t"U*m4KhMQ`ʮE L0/Pw1'־IE]*;P3=snH;7XQ"g\_fBx([qлm=G3=ߕm{ͪ'TK,Ccg 2'U_+*4^뇚4>1Z4`,:H N›NP3XUYq!ڧ$تUH ޹B:sj^=?c 8_-6 BQ]$̜񕹀EDg|!bbb-uǣsŜivMjD_s&AYKmyIuu Citt(0DF_c\-F+Y~WyW c2j >ZbWZO:P [lmFiCF)fQ&xYJѵq:e5ASc%̧:㮝:ITJsλƝXiK|PDDi %'9/}u֥r?&WsGSE2c0*H{Di SHZ;n܍>tv͋Mk=vC2An'xG'6 'BfTYz.PUhf$8[(oQnL.׾438,8 B)`zݫېٙ.c)e -\p1h0IT *D EVI[,fԙYZn]:lª}15|P q嘠$;$A0Z:`l]юU@`VOq+g'#Cnj&$?ڹks;[J&{1J|*SI% 094r=w7=\yޯWtM2 LLU؂FHDž]-q3LKFaDiӏq{Aǟ 즼?ٿ;3hJ8jC*vww^j{'I#B qOnMۙZ=]/]5ٹh՞bЁo6g ?xJ-iJCt۷?xbro1}H^?HA0L!UJy~ K0&^kqr%:,zT_^6eS)ǜjL09 ]Z6,}]pz!%!l2_cᢨɘ$0ݚ-kvnW`XI yfJHvOδW"˸/NC$AcqBIu?_ŒyzaO=S-g;n5箟rFJceh90v7Gޮ M=3wg Y7Goƽn}{3 b:㈞o߽8VQ8u Y"OL.ul<'%al]ţvgmdEVQ@QӸ+V/͊@Auݴʓcr/ A|#]GB(,+/}vf^B{Zo|tv܋ߴFB0tNx̖36U~1y}g83ʗi݆MYz܋_91nϽowin=&^XBgM nHRZ4_A7t2EYm?U[6탐YSqK)R:[B\ l) %sb K/_}Tu7kpqi߰5?M " A3R28!z/,롱KyPIqisspU[rɆ ȫGcJ і2x 2Ni-䃓=?x|/qgpg'>J1^uzZD7U#eX wocbw'Ef 0 ^} ;9ūtN_Iv/R}~]m, FTǹH~:?5Nh.F)kB!ܥkGO*?b9ljQj=U+v?Eb/+kuvuޏr F*P#mnUmUN¯fNeI+}IܽkŐfrJ+.;b_)}]Sjh1uy[*{ssAu֩6ɩ7y-Gy)l{&y+FjVbdG<蜻Mc{Cnڕo) swnR h#JSqdy|0g2X  K <({ug[;y2rylBa|(oBp*tn1mwHgđZ#<\V U+ "퇪V*uQ_)'NMsY:#x54=fLK/{?>ν[Ӳw68+9}GRlݹoxp7jGԞix[% {TCE$2!1vr q̺(̪ǯ\} @ WW Ogtcǩ #.idhlvlhĥL9҆;iD:dNVDh*{H_mc6T Eyq Y#Օ"onz{sƹ\oY\gtd^6o'ELnbrvh{G&))-#WK({BcR}2153 ;F] %9X,Q%W,3ylokA~}P:9qp{oή$;t/SAMwRg<3Aޡ\SJt\kɵa E$qF9?L$~6O8b+ۯVrpߘUFm}]MϤl=jƣ_ e${uk<>rǢU[?_:f3Mg`nc|J nQXv]c{}+$p.wApȌؠ.u0zI$%U9Hq=NPNcu6̇w1= b 0@-V HB>3Xp194g^*Zӿlo' KY2qM&7-ӖPFIJOdTZɮQca7@:Fic+Uzзw#ɽS){>\DurbxV"syYof=tǖh<KBUp~s q-vd2328Y)4Jjjo_:FWCһ|\8R׵w6FlfnmP!J]rfghWA1pmxi_!ˢ΄PNL}hH?dѡzH3Bo0.Wj!,H)wuUzx_& .^#o=gzk7_z}pWHQ)qnO wuxC%J"ь\׏4"0<;6¢LԄ7eOП)tgZeZdD?v0_C݂D1s~#(Qِ$7@FATR0M``(f5َm'Vct'L;q yɵ'Pe_&)wj*X?Cs?93B3gWGMPM//I,=CS 8a'3H&h1pB0 ,y (E7;vQ7b!Y!fװB(R< l` aB!]W)3%J2ɹ*9*=Hu]#TЧm})>NNy6 y_t:ރGg3}Cܿj"gZ laȮ@Y@ !ijIQ AwD Jy }`@>Ѷ93vdy2ٯ'@EHFԻ~I2 s~C0aI@yv~>;+3]0-MǛ}ONWB /~)xsaжi_=ZiQ}<[${mS:Oߗ]GPżhga7G1W,l^2u)( vF*uյ-k]򵂟*ie 
]97Db"YEuLO%̈{kv:s\sHD_\j)S6k({sрQ7HP%9C8=(̎l1ПouG&=#4Qd;("HG?e-rwMrBeu]P_j)*Idžn3eUUU'",(:2]f&b|D 7ag1~cgt3+a qǷ&"# y>Hqx~kLَ~u[X駓[;Mf[]j4kmYsCk8}TĽ8 ywUPH"O:fHC9w6- :г~u?K}nbbzkY%@m0%YEBT/{"iwbXvgk9i\T_g1fCaEcldzSB'/0ԝ+Ye*ċU^ݳqݗĹtgwK?`օV %ֶ- bU˛U.}yP ;o\ݧ[xyZ:҆]m's9S#-Vx> Q.LDȠdEt"bڀd^|L,dIYe.Iz[\EΔ,GH(#R :$}"'%{^0.){FUg2m](LD(>IJtqxNީBѦwXU;WY::ߤ˶9sM n$+Ts71Omj*RC^xm -]jϘn&h;B eYĩS{sBci2M.}У0!gxͺCU)ylt?8Χ=5;g2-e3qWC L$f9?!=_EX́.Qr#%<:ftJֱ)PćNƎu?|[ AOe;xw7 Ve~"QMKXTHF F+%X2qNLJ̃9&r+=+ 1y|"J{UX ub3fcQdw>.NI)>]ϩ/w9iip p;i*@U1,I#1A 5 E&8V:v6(%>bpp nY$\UwU"[~ Ioz GJiӦ-?9DD`6~uPW *hKp?AtQd}Sn=OqWr(ؼ/}0Fn[`)@;LxR_)J;-r"U!-vT5Gb"i2?Ӏ̨`KvRw?$Oy~tЋ}?a:6|fkZ{P]ٵK$k[L:{>_s!Հ(j-*J^=Lm2HR $ !CBU"76۳c8}׊| !v>- ~~nrS] }L>LQQ~9-䏵5QNz&(X Bol((㏕YO]~Wi< jioLJaBN 9PlA2g E#uLf#FtKDИd43kTHJ!iȕs.Dqm"ִ` %˘rmwr+fTYC &Sb;Ɍ MmSܷ)jjnF03Z~-)ۜ&Z8ޞ!z34ɊL^gSZtBhe+U8]߯=̝tl h̺R¸_k5˧&Mv>gN79\JUng+笼W315V1e>lYJLE(~x @21>cS1x˂/kQDxxK&+P/Ut1/L^Z `)sEG-'z?{- Ѽ4Л/f#y,3Dor-[6,}9 d]Cԧ q+·{]z;?QYIhP,<lQg8V~lG p<ރ yMcN ŪaXgyl){4-CRXeb>t(Ǵ,c)W%B.+"HAGD _7'{8r,zī˫usf_vÐ[8ﱱp@)_;peEe+'d:s ֗H]c7XȨS6 /o>Yc<{ųfiSjpSU[~qN$%[[{Gs˧Sa 3l"EIrok_^MdIg{TBB'AU~2q:w9$gXslƴx=Y곔YL˅T~hLs@~#'z_ga"ƫO:hDt4<})6bOQ=}CgT,ÇV1,/"ٟ7#8\6, )avc7 =8;8Ǝ7SS\^Zl7eiqkm8h۶au5ý"yGԐi :R:ׄd*CC2RLn)㍘.;#W/wAG($z|Nk-C0n#a :6H!hg(\,WHLJ~%[U!p2`*p#-'~9g$㤩NԷɭf9LnWweXcoa1.54ij,?ou)ϡJL㳈.4Si#)W`W)Rڕ1SLCz>6."UeKKgqIθp>S HdPQB3`Fr\e %Ubw,Ez־ϽL>w0nff}ޛZ2r,bee1A>N[im3%\\L*D\(V$-V= ^sfT4H@޳ %wO-R˔0s.9m|vGwN"8u7eQнQsww~ FVID^{''ˆj"}wv>Y<.;3D؂z1=6ʚ8!kjdE!*^]*J109$Hlvy[rcn[0I$U^q]]ôQ-!ԔOYv\`oF0A73‹-ۓM3Cy.=?v;j԰6*B r^x EIVHm搄HTRh^ ΰH0hov^Hk3s3 Hi6!6:ClsGI3N9n]t0fHCZuRRd@Lxer)ht1:5.ތ9ӫ/5Z(̡ƶA|[85u j0X9ք˻F[fa];;-rf-ɅƊ˕5/wbXᛧե8iuEe[r`kVuiӌoT8Tn?3ߑ 46}5~ {Y utd40ѫ<4pIZ4c)AhR?7- F~5ۛ{aFj nfv+DO ^q30T pO JҺa?lͣp LdecFSY㿉 xVe]Tυ\kc!I]߰?MNý= %f%[;hgzO5uv~zNaI$Aj!ˆq,IIG|iHInph0B?caU{(/d֧xWƠ\Y! Z7H;g˂v&Sk5]17n@ fQ:~[dOe Iy zȾS'مFX? ^u%JxmMl)EA+qѠɍgݸ yzKHB)4GcGä҅Hdd2C*@Hj.y=~$_=@i]^f@`@޽ _%ʵv3Q@`"O$ zљRۦMnN4У[FDUՖ&SXcffxkX2uYmrnE5Ll2٦]:ӧ%EYGYEMH$R h `d֧5!FH1Pv@X)+5 TVT,1Latd](-j(֫ebfqS 4Ud*kyoE:o!ۧ~ 2IsOTY)\H*01OeIdM" dy6={љ:9=wu9 m &ߑNҳGt.!*l7v:&ږ4ի^6p"_Hxr:[/QU %ڼ󁕝hu"D!o( H"q@jW~?*|W|WVVRh%ék߭ mTi&2f;cB|vY08RDR > Y)1 12lPۑ)(Dͳ{̬uCUFN ncL̷:P؅CRsY)bpdQIVK?8Xz\,0"$w{\7؊X3pX0axpqB< Й*]D/t PX,܉Ŋ!!C%:#Vl B"TX@48J[s 6%wMK5prDH3!5" Cdbf33j;#<'RaUWm\!s0ֳYޖmq:9\4uuv#-Y$P6od=")Y!y4JޯƗJWvDzhZiRLUt.A oy^g|PCTR" c!g@j MJGZ 5YK=k wn[ì5nNBsfYL/ɭ/fnͲTnf6N2iU˻UR k0C7~l}/!M?Ll`QsU_:Ee,uVIPdoFbmT3G6+%dt AY˝3#" <| AӨp^ ) ahތe4ӱMY'L9K unrdxOZ6fb+KQnaFenЛoxw:n塃V(޲W5:3S \s,R58λ43u^8/FijK1\*ىQTKpuՅteEorxzxyfr4bbz9ߡ 5k oTi8eLAco:Z洿YY,HFahcotꕆ Ld1 u5zsӣr'P` krǼ [9I[.o6t'#8^X`V Y@Vq KB9Q#08ryitYYMU!R("$J!)9I hPEge #NN Hg&M Yˡв JHRNX0BI8yYL&sgtGT9Ə46LNm睡#atxNuG\-4plmKșo'N3Y¦VL $hgN`ܙbtn)FC(Z43*)EȲtdƿަ^8#65q:L2,7w 6/9LyE*t Ei;W.X|X;舥M$/9Nm VOd;g! vgL/.o5>[U =E2@LRf*g?[o }3U &[%TRL0#Y5B(Mu=I+g;=>tVfe[k89iQ7s:^3fhb*rdH9n^!j{&MZ{ճ< u=FSSL//Li_?u;U`hJ Ijֳ>Crʪ~jf<I )DK*IU&g9~վr3 7ɚTBP( 0ffdb1Tiwv)Uj0| F[v-'y-( aMF0Ҋ*/dUFB9 b4iژlm7ia޹Q|[- xr͛s/sm.3vY02V^KD I:\lI=˭eXb2^\k@lt {m]Reovb{W {D0lRχw{dm_<,Vh~!U^ˤꝃ(\}9_ o?S_ݦlmuO064X 9=8:g+ː9oaU<,$9rs>iVqYu#ep/ W!$rG̖_׀y9%Y@ [qnʴ}%8š?ZuI(I?q ?8 :A0SvC!_#x"~~ذֹ%Fdoj̅\g{2%ǗVÁӘݩ!Ä6FN&+}lC]nP->i&^c9-bHb[Jg`b=T#{Ϩ)>Q(=z,yij;2L'na;.a!wgA.'^QqӅnˇn!0}2{nMy7.ۻZ lٹϲE w5&*R)B6Mao|/(vェk,< B V\?$xLì{;,ПYhpأNC+}}P2&^"+֠n fOsy,EFrM)A7Frto40%WyjA)sb cN5+97\&[M8xsJL58Yt%涅{5b%~kQZZJQrP~iDRTOy̋ƴq;>pfمr;75|> \/o `ܭ\q.,qf?EjZdPm{IJg̵~4Z" Mh7G]\sIsE-fmCڴ"*8eCe=V޽˻DSgmZı)p#0=t=TxLS쐽c;5Y;O;<]u.M_&)dv=ŝ jX5Ezsg/SJf$,ypEB^r+*W]#LO qGǟKow{B#԰Yj(z;cmGc%G;k. 
]@()(wr2l.Y9Iy]OgDt ؐ0OSZS0)z>kw/}YCIE.$|w^3to?}>u*:~?gH1MܙSf!u?w&MS[`Xu;}賑TZ4] ~\}p$_֩&N6rz6i+ɫ[cFWjhfcՏ&m'L^B$,7kEs6 u,29zא*myqڄy8Iucq'Btrc9/V_7̹ݬ9󥎽FZUz@5ɏ">-ʚ-?7WGO>mźj n[YTg;K @%# iNIMx%]ɤiXKKoAnLpot νw=wo{)Xzb\PEw:D$)ƈ:Ub(tL)}$eBԀV=HS ݃69FT.7ZbScWݽv!ȔUHWY(/se: a8: -w=Kǡٯ/O3 ŦsVaZw[][}WHiGo k9ťs.r+c]r>~Om/0J0Hf_/GS@>5Р]Y)vxJC:lrc_q]~[4gpznbO]sWs;E?!Έ'Պ/DtMD@ 瑌":'c4.To!!lQK/RXw NėNՋMZY˜,@QuK+nZ[2 aiSm$S}BMNRGGϷg4!6W7Ϸ]}Y}^eg52h ܢ}NuB=ǃB,Ɋc}͏ bY녟>]- -[lfɓʿ|'긙ِst1(AuHR\h͘xQ8)VV;jmB~$uy'YeP@Ǖ@+J@ZnfFm,t q#}&`gU88j/=,~臨SZBjIQn)S(!aޢe/tN-vd&{L9;w05[[,6+}fՇ_}=-5Eʃ$=Gx/;NZeL_٧{6; jV1^>p۰\z7i}1x}<:"UH'ۜhg=QkŜUyn6w2]9ݘIS';&`w 3 *?\B0vf(1*> :h(1պ4ZeITTZ7V`c|_r{NO)БC2'Cc }-QGN]t便en_ou8?]3fP})Ѳ0eenYk0bulԪ8Rz'piuA}UG{B/_U]wvc.q0Z|cpkHPtid}۬%f<$VPʩRNScLv?1Lp%ESϋS2ad=X~ݷO InpjZ;Ÿ? tJ )fJp=ʜGMOxd;[88\ٻxaLK]%K2 U:SXURS5pykJ2B&65eJ21VOehV{3vJIMi "uҲV;4ز|a%gZpsNеU} 5U|Q ̥!̠[FFme耊.>BrΊB H$ 4Hh9q`/,b`=T=;,6,CSI=KM#V! :egN6! y87;oЛ9rMT+Bg-BwFSW"\kBU<#yշI6[Ld5\z=k}7=,6],Z0c`n  BNѢ*E^g[oyS==Z^~Z=V{5ƒ%4϶2ޝSs$ߜCLh "X uY?KKaW6L[:i2lʦ('kL+f@7zH W 陑2+&!cNG^>0Jh鈂LXIHnr2Ψ@Ӄַ=Gi: %;6֋~S6D?bԆ٣D|o?Rn"X?.y ,*J%qpz|l+Pj!(55pTd? 22SZ'p)8PV0[ۍTŴH*H`{^sv9wz7)( 0$(J:|!seRRK K*]/sy׾39|<׷[߿ɟ|29bZkɰ1Lv\kCC|&zͳ!FS% jnԅ2g~WS0;Scp?YXVXvPRb1Nw;r=t3̩J2Jշuw: K^1֯]} k8oHD2 q >H z$Vɖ(N)+,[)%,b0}^UN*<_ "^;Ɛ#N,֚ÖIK.~ZXˌ=T2rS]bUc2f[l07ے{ΐ${#~d~R Q6#>~seҸՒ|jZMBt0^-\QvVºx/ŋH%Na**Ow SB_sqKy;));ZاMJ|O 0693nSyuf '@@@n{vw~q"nU'E1&ߊМ_mWS3NFiJzBHձCͣeЖ~ƾ٪n#.f\?! qp\} 1a١qiaȍԿ+A0VJxrw3|g/9:o& *؎=:&Yk4U{o9nb?ckDjK{4V\|@@!b=7;ʕJ(L ceJ˞GΊqfX+[OLUuA֛M%JxeSRvE=Z|Ω-F{BZ20po͓2Q$˿/7xg*'u$4 1|tBRDc nD7^w㱚)\9aZ`o ,E.w7-G᫠8_Q?ڏk^'}GW=nbçbwGǓċTݛ1qyv㛿btzc!_g=dЄYW}:RQauet5PX7BF[4m+[ƲC;TF ߼QwP8,šHe wOviG qaBr.)?m߹_;03"G QhJ drʼVzј,"D MheE1- ,<Ǿi0ӿJ(2-(}z>3Qyug5(-޺x>xi01 Xf̫YgE-de/AGn9F$WT#£ 0 t꽇Q߫Q~3QW!N(}T ]9lhXUc»t&(\K5!cEbŃI;s?)8GZm|l͍nlOff\7]86rkr|;wW;Jx^J3ˑV"=0K^OM7m"Ɩ Axڧ/trb\ЛV!> fLz~ [Ĕkm/hŹnZWeQ.RڪYa>}ÒD@ (o up?AsRn=$o::P:9-"`R1MTA)Lx:n1"agPk$#&w|{d#0h6'@ č#Aoqa?wޔEېT:G <:tm~%<~uҵ`^/<0o-s b0 EdSzKr\׺% ?q3{.:' c{w4\jg=ZX7H5G>,S'Iן2Qg ksKKl:4]ku/EdFң?,vqPP#J3{|Q(mMF@q}gP!m%bf"~q!p[F/L$E}-s}8zt\mCDd1hRYuo ܺӟ]LNJ\5x~?w"&~S"? Tq~ﴶa~/@iYA:Ghي@iimnmE#W8w݃nlqx$" wn3hC %bήN|K&[Ejfyy/<|XNZa_#dp9SQu 1BB @'Њ1ĬZk #´4%K3t!XBN]q’]y5Zr5sSVcCHq\]Z arջ!l^bfVBꭱw)3YxQxdGP&A8k&‹M [ObE(yvD@N*UG![7Mop~4}z# 5NR5rG$f-"'Ja+Sfuh9ֳA4# a<y!;ݳۍvVX\BZPLܙ,g BNNPª-;q=#^q;\U%Fr5L/,JI;֌½Cc:ԁj~Of. zm3jM=]GS֔V6هkr]q_0d'yWQI46ٖFUZc(u28%nr%H6Qix}NҎymJ$M\wm^ܛgfkFcG˿{z;_[̶3dklCDso+am*ԵPwrQA0k@Zk#ҟаp/80?Vs ;N_g<}Cb":t:Y2 ??wh kzU0~bǔPzޫM Y*IQu%|N,8^3?9`ŝ>}O!Sҡ;ȨvP9X|և%d7Z[J gR^:lU.NpWV_fQ꽅mO1?2s{WJ̧Ǟt+@@ļbs=sP!h'r`SZ;5bWu*I`Fu$="dj$E2dSSh'`bهܭ'zU ? B0QK0`ii("/r\2cz盰Qo1XO(=8/Pt˿NZ7e+,r"+Nw2,IH  ,ޑu] tO!JnwVM.lU+f_pٞ6kiv/ky fiHgYR_GV#3hB9&H-#6-^̑p8+\X+ N][|M'L{$cP'|a| z)ͮGGH$<9$Lb9mL7; tyQ@tU "Ά!!TRW"RwGḫ96)jf޺ӝjJY$6VӔ",}c)U }t+ZYu%r ЗU1Az9wjsѠkTqS(~ \j↑7eQP]ګ.e42:JsE1eA2 ,ލkoF4V:k% 'q7t>%Jd. ekP&wqOlfhϓLWWXTd26+!dPv룯楠fCCr\2 x{\F R gIOuBKY*bTL5^*rK(zc2 }V;Ŀ;Y3[p;8fuӜ߱UGZP* QRHP(N'9Ko3K|+Kl~m,TV2 (ꙣP"0 f:degM3J:PPh2FCV:=OUP pFoqZI(Hi~8G`+NS/GCX@rTEX ,Wy9@Q3*>7sS[ԑ=Ey]VeO1oy4GyO~m#vj}y~_e+/MNΐK}zx5=9<2o c hrTiT:]6wsB1Gcst'}M7hIo]w]>'ͫus 9 )9s $>7&:cQ-k}<> Wކ{e;r狺 p1~_},! 5Nc:-^?;y8n~uB@Ns)eAg |rK(E³Ie9^D0{}U\v5y34\^A}EItI[/ǫ$ηC'*3\ qml݊y&:d$::hA`@PF:=Å7F 0B=gd㪥1+!ys|HV=Z+ʩ 7tfw2i]:kwIjyZG\UIu?f}uɸݻK*3:Fgy2ak':̏3OZ!jTUIfLa;RZoߑ&WQ Xhz/[9Gij;f5PoWgPOmyL{`PX)?X[\ I}zsq=H6T@" 6Ï#fYM+JFO`*)lI|{j]eV" t˳~o{]WߢӼMwi &=fŠ8?LzZ)}G{iڰ"`4CSԻ܎ˣ[ׯC܋ZދtzI>BX:I'Լw0(4_Tʫ>wx>omKS/Y%Me=>o3魴4V=@XVo?~hnS-9OS-η  edYq$T @*y!L` $/ t~4N4`12aWR W_&h T!Fqc#>.5J*@aZp:*on2?N3Z,E0g! 
\r,槝]*V5e@ GCT31tyUpb>2$R7-v>45jߗl6uC|*E"sxd|q~ͥbHJZbNyuי8ʏtfޓ1S(3qbIQX߹RǶbh1v155s" !n98`5ǮX{a#d10DssAm\ÃmyN@7ciꞝy$Ųu;+UmIKP{f#ݍ.ilWdxhCt4UJ."DTRiheg2cX>_`{F@Z>Z>j!r N;'KTd.;Se͟mYquv{ v*}UyX IŲ˲} 1o&Ξ\V~D<=q}r S< iv j"}ӄT*X>XΜlN.IBdәlb/D  a- D1xej44V9wtRZɡ&Gc_}roَ0tB]Dl}2k+6Ak׎:)ØuM琙NiOn9M;v{> }ЛNiEU<_K{SG3ms۴nc4Tirvu0z WMwu7f˽FAV}$*9)@aAyUKA%:Ei81d7yj2'ϾzNT>:scm<(hٽ]2[m, V7K/d*fnS |}Kn{Uf|Fݙ(Vw|X*u˭wUyi5ܯhNd`3FtW GhԣO{;Klqڜ[N ? d=-uFǥ^ŠPAUe4~uQKw{(!oڂ!x,- 'QuX뱷Yse%[g:sv3 D}zcqv2tnV^P<rlov4A`/$Tc 69?j5w 0(43^CKe8C fGj{4nQ,e!9 *\9,ҬPT>ߙM#~n[=@5) Et-S&ުdҰ/Vvb;$p>?3k[Or^%j${FԋmoBN|˲/|t>%p\}MXRû ?2Z?&p ȩHIhagzsGea]*XtϽ!|!4 &b<7ڸ&;#Ď =g,kY;3Տl>Yo Tn[4wqitf6$ B \E}#χ: b`w9&UmIbW{=ܞBs^`^;6J &IZDD+Cuvϩ6>t?wQ=n8y55ЂB $CqOQxwƔU/S+wt\֯edІG;sbMiP(.`rwa#H9AV09GieieaJOR¶Ƒ9)"{$_({A?޻zx{ٵt><=s j+?/k$+1"ޖAMR= iVYnkڪSlU,woPK4.fwy2^(^}Z4ly땩Վ/h+2hy{37.Wԯ6fW]<S ,B }XMEwb<Պ$@w+$/5g.j[;gPBhh1֐0mċs瓮Bgo;O8Tx b:Y;8/χ|9Onjj9|:Yٚgݟp @πwWcגֶ=|l;=,˚Xr9[I;/w&@ɨsƇ/v~bs*tE;~usD;ĶwY7Jw[3m3bVOd;keԹPjj'Qc ,vƽEuZ"󫁤A gp<{+qޘnCZi!t9DHH rT4J?'0@ Žsl=F=܅b[$U)ܫƒ< ΁e}f5;_xTRFtJs9~yugQ׃A]sݎ-ؐ@/ M  5uo >сP\0(<('` Nf[DWt^靛ʟSw-|b{32/\j+G#yEhe ^n9;\WT߃~9󔱊ۥ~[ʅG#e~mNM4Ώ*٭QNOOc^t8<[>?nxj>\)&ͨ/ܽvՐ(i$ڭ}i!=#qH0{[n.M UN ;UՇ.5{~]]qy:lk> u{oDNzOoW}>'I~';kfnpasy.Y6N?Lyv{l*zXE~~Q_䵣 _z[9?A ǝ9\/Ye-TYnuMRcrt<KvfC (Gyg$?v<-ahݜJ>cO*}c-{E(MOjI.]~@3{k~6xXa)Qc˧3L2Ya˷!] *$ 9!dk߶XOx?AH Vh M aw!>g F`] Ge5aӚ,z:*U͈ýGjL]{@>_2ϵ)͇-.vw@"'xyWkEZ'Q^KM[ enFy1ϕPf1U/LhFAA ~<8 *[# eO;z%­,6's?uI=%7]xÖC|G\O=Uwz,uBy H-h  \A{*c{~k-4.< /Qb: OD)'eUZN# Qf&ƿ_u۲]mךܭY@8DHBS f qj>8~[?r۠wu&0̵ ^{zݗ)E@TG~3x)1J)0IE(,'gdW(m*SZ)a\3R|_B|?-{DDVeݟgz3 QV];Z=K ; Q= <>BBέߧWl,>@Zk<<`Ie_~h׼rL'Ы u\h;3]2v#w VFX+ƩGh+8[-J]zv ׽twa7=I~ Yx܌7" & 2~@v<k FƄLJ~x#z;v/*c"^|ݥ[WӋn%U!.""B>/?K`B!iO0oO,d8Lx>?iHMmn-;]f( rX5z6Z=\{wD7J.bׇrk+>O&E<:S >)$_;WO.ƞks%w3V3EE7o#{mW!1P b:TF b-$lV=Ϸ4_LgWmשN+ka8$DuF:̬K#KAׂ)ۙ;u붢|Ss:%.,!YNȱ׵ߞ7z}qĶeuw=Bi N+ygٖoс.HTZ]:VHe_Rgrj O!9I̒1PF%Շ8?8C\Y ^%B 2Z k3SzaC) nD.<Y6ؠzL]=( -ݪE! !@RI_M P%*d!LAı:w 0}oýL߹fx[nnRlB6P\2dF7$5M ;rˉbZ۔@ oB%@ު Ӏ͜3nD BL>7;`a200,`d\8Jk ٤Io(9 QY6oRy$';0㗖=!=;6B(6য;WĚnށY-nJ9rWS_(Y_!ozn1& d,ᔶT5\ewaa(U\7pa쨥nW3|7qʣmq|p?>o{7i*^+;YL+NWQ{1s>.&;X;D #? D6qQIFB] E`nz%ָ\408/LrfѩFrdg:sn0x5;&f&y;;&MeLGd*aٍdO?eh?[x~ J1F-"" Pb(JBP($9yq ^ 0eb!:c?'8ȿI7x||OYuWhI#KJVT;[h*3=oެ@=g =|;ǒ?~/Y̯ɐE<Pm]ilBXa)Bb ("_}~èN$,"J)lX?<95ƴL)*U @3&؃  qs~=PtfՎ6ƱѼS  ^b{-bBYo""2;߇IV;zQ[|"Q[N ,|}PD1DC,R 3RĄETAM)#ve9Ky2+)ly} +,񏷧PCje_/!)X 4$+`0_k _X?xڻZnSm4~nyW9͖I|Pfr A@ Ȭ UY3>ё&QB3<, cFnf"f̊ ]2 L97,>i;.00 :j,wx}kmw{]Wqњ+ݕbU`亰xncWgiSHgULןCA]S}yhUQ,UWIIfrU+{rrJ,9۷PpgtJ%|8Jz, 1b) _@bA0iAs_Se]L cGTkqgGȁtJ~$~~" 95_J-*aN ygvUVF]kmSy AWIإ[beWށDOU,*K^vﶷfǖL(/SC# 9hgӳ9qvFwwRb5,M,5pF, ʘρ|TujErOժO.i߰rSf=$}@7f^@h9nO6$q_Zed% d5S+g~LSev*TZ]"$?DZ3^'ӽ^ Z)| $*=PWLUpS5Zs҆pok]% ҳy^t˃0Iܜ3ŔDF{QF&qK;;:O)I!>DyHJ[ %ֻ cI0̰-6gFb~S* FYyĝ;{jXϰK ɧtl٫M_:{xQa*_=+FdXB:}W^)jaoXx'r] !(T2tSЄCqC$yAy9BCpt"ΆTYtWnfYQ8lFIхv0jD .(]KM:$]K:KF_-Ƴ~QwU|[a\s09mop9wH}L/\ e&)E,l,@$ :ޣU,/-tAPJjސ&R]%?$Na S3]penrx^R1"dx,<XSRV$ƈR |G?(`C8 w˙XhW@{@\e}{/k==WSmA=*]sIu]Anhۈ.=n D%#juԭ շ%GVSgZpZqnܲ~\]͏~> vdAB>D@yX&__-Mu%!j9F }d̕[ '$S8.̫k/`E˭Q$f+㾈CGp!\'#h2}.3DgD>fc\ 1+RZ!Bq}ow fsb 9atnwdd;-RIUo˥a\~2clgt>$E2- HI,yL(֯\U@awF:>Vת_DD4rE ɉh0:)'SS8Vb Ŀo\M5mŞ]!(UFz%vH$R/3Pf>ͦe֪&nAaWm1y5HY(y_cю ׹E0Pา)Uʣe4e%EGfAH {qEgi}Dq.{Y꾗w,t҉t&8!#X+>xGfEIu`Blݔ;8b7󻜞0g' :~j3{nv,hf&1%;P3_En B9WH3։I#G4!l;"(T?ի)#M7L 2ެ%t_ۗG`{" S) |o fd¶uNBk|~p}L"8j<QM6S5 sKc젶G$smrn+8 lrdN_ JGM. 
C4(v*4~vX"H+låa+b#Br>G}wq ioYRa#*0}oylE\kVmf\%0TKh@+$'5˳K ap֝ X`>pI-ԛdL:ϖm_\ڰڇ2Y[e1%dwmN{:CR3JuJsT6_yݦw̼I}tϋ=.Κ]]AjAL$ Z}'qNP?㠗A[uLNŎ "̚;>سMoG@*3hW0K%]K,>ʏ9 G|w{tz,i;҃ժ}qtW[;TJ=?o}rU3ZЁeᰓyL빺8%W]rOj쩙>~>Vl{ ԚZ-̛3ι41òN^~zwEr |/ǟjޫ9 "&_ޕBbX1 r?R;}뀲2BZV(v[xoeĚm]aճ(CV_$U{oBVۊ{+MBߐZYlTF텵EyCE]rBZT}"jK^?΄$҂uf084 zt$(pI?QjL WV'pB j0-]F21g¸lO-Y]JșRhEnҏ16hicA3]3,ׯxξFuɕ8Qv M@`DC禄߾sin0 ̸Z bDqX|SNR2nY阽=F,ÝbN;_|}+OAo6|<~'5Ja8GZfmiXlg޼%\ڦL ONܳkQe :E}P[DD!PRKMRTw ʎu3* @H< Y0G uGl}Q_dܘX~sޞgtܵϖ3?ŗZ:6Ъ*T H@Dګ"$}1YD?mN0܄YK 3#*EԿUqFNCHI@VU,2[8j(jfԝLmwx kwek1!k.@Ӡyuw1#.#zYY;0@oW:a%xu&TPQ1G 2$Ԭ"B#O~f/#^e2pGyD Q@QB `m,>,*6daP3H=+EF䉋 Qxͷi;c {y:'eEBaې|lKSJ% l^dH3N~Q@{)}ҶyZS^;<`^U_)iV"<!rT7P_ž{`}#CV{Q r 5)҆7Oeye\.ގH({ižA&:Rv+30RI!A[aOt%]ul{Kh]V_<6#=kOpeBBRĆ&^C8RJ0 k6E^zh-M3EUg߄ ]3Pyה`m[Ֆ?i٭ځ ]*y0gqbTHq_ٕB@cAar}cu}.8`~RzaDאħ .f@cm{*6N onQXYdPAiS~F`fN]vKB]v'1IV1%6љԥ6Z^w5iAk gn b/*R(JrJn7RaM=_)`@A(5zQ5 wVfx\ͅeh6QF]G޻Zkf H6]b%fH;/rܛd(O\x*7c9عږ(ZZ): p"[+Na+gJjcһ6Ue[`9DJY q(YM.D,J=-Jk9%+RjWזL!U+BMƇev3NmKBًbn7=Q)a3U:t1OǤBah3vf3[tj4ij*^{RrI29/5otTt1bi.#kӨzѳ++ߨNLUvϣ~'9#ɥ:b؊"& vZI~M6"1{;p%~*yO1g#h$^0r;ƳZT,ԕ% mK!XIy݈D!,ol@s|C@kg5Ʒy 6Vi[-~umpn#ShZD0p,pp<&ؓFqq 2Yc̋L(@ل&s(F Ad"|͞`t1 `@CovW_6h5ǷSKedd0##+Q9@0'8:fX̡~ה,h|7V6Hw+5rс\+ń3w{ :orEuy`-vϾ"'մ% g^f>]dƹ*in 34 1 JW,"3{H#]Z6;ŎO3o+c@z[jˑ>v}Wj5x=5mԂY,%9g4|.bLO#O5~:Aj vm_?XcWwΠǝ~ow޹ڹ; ׉[>]vJc~5磀vSx}n 擡X$3 d+46Bv:$7*~crXX(,QRSm8مԃWqo:4,qv?Ǒ=Z>|r?=G>pQ?\9v^!Q~ŞH`}1GN?m ]ew]yLH PyKiwVNX#2oraXD'ih DɰO975F}pWLZ%6Xt &/ހ8NU2n(X7_w 8ܼZcmmr#k(Z{8]U?(\ u8qto1;9 3!Jc1D˰ fSԿq71Yv$gުZy:NV3PHx@LۆhŸ)Y`8aaԤP@݄EҬakBz>aRG⣑9I;ֻh~RFdGF+.Cj*_k"3GWfbWa޸]祽oav&s%-lTS@G)t&t4{"~wĽIeKпp.g/46~O 8N*Brevp\9lKgcJ/ykJ Č60C8"$Z c浦wbn?;q|%]Aj^P^զ]+> @p [o[PMnN7` iV |dBDP`Oo2NL {O#) 0椬M{ (ZB⊡b]eV[e3E xԺrL2!ĵHmGQN,NԄI"03[ E]Cr>JrgΟl1LtGOdU37V(Ԛ-Zg>^R, **k;9 7N7%x|U_"pt%=&&7E9n>* ~j-y}ߨOF"-|=Rv iVCGZɸp)e:d] ɹbӾ.:iK0["MؘCXIiufy>\O;>7'(=[# !а8a|g|M0A 3##%\D]>3C]ŅiG/ӿw~s+-M':kPvyʟ!+Z( &f_r_حJG@ ͩ{w b8k赴J'ڤ`U`0ae`-kkmgu΀My1pIKARd"aCA [CKSvlS{j]ݝW%?#SW@(*r'at?#KN/vX??;du' &BE {¢__iϤq+`+kh< ]y[e9lNO"2uȼjc:߽W5[혯mo /QG7qﺰFxРDZ h!GϿ?r&RFH3#z5 T5RQMSMT5۴y=K[clq oWG)$UtEnƱASR X#qű~Gq tBqܦE`/0k{ȗs59MD %8Tw)XQ:bhaqB/9@}xr&ܜqQd`E/4-Ve*$ `vFwqUjzZL1^گ6p!i0ۼk%ʶ/V͐8R!`K/\ywX? N.Cb^.ߕu"u6~$/)enڊwu{쿸ya8\p|o.K粨jpގzr,z@&М*n qT-7V*辗:'wR[؍6qʅYt\ %[lJHԵnTyyX#vu!|׸Q<=Q31%"RNrpa]cy>.?/iZưҴ"g͝-Tq#VlK0̈q#le^Jut^$6ő86lz[c;S\+ PIW</K|| `; 4r]+7"ۏr|:`ȜjU|A&e9} =n)E[^*yNm%,`aзN7u?IX}ע£|&33Ï;zZOu;P]im7֣/;/S~s6;!؞f.Un) Ȩ,} +5')zK0h]g5-8/6]YC?7ρ7{GyR’Ct{og+osYmiye81ˏT̹'{,WWQy0,)s14OS.K0* ;=-.ciht7MUQ\rYŽXlVqi֦m_~T14GNIɹjj?I+5;vv7fNT[/?F]WMvT6t/kfSYiʶ5{reۨ0qmO~$wo+nu]W&|ݪF I:9~-*a\ 4oqryistGɁ`O y O9/pWHsNX<$3SZ(ٿxRFTISVpYx'\I3JY.9niVjqj"E3:y~:^-MТC9u6˨ƵerV$BӮLulEi}َB^,sj͸Թ,Yؚ{x^ϓ rDƹ!av uoT(i?|?˙L 08ٞJC7%%V/(03. 0(#2%^H ';Hg5L^Tda;6L=٭ _q4hMj=s#ilbT$` W;mlIgzkT|?GnuiT}} }PӒ=IXHGjMsz~ǰ 4ddQbo\Q>٢1JћJ]5DŽD"Rbcbخ@t{]rz,.iŦ:ZbuDI2R-](񖒰'2"ņ9]PEb8.k;iQ|h" Cj\?>N}Z2z9t'k0)m ګh$I&_%C'y# ̕CMܝǰ釯c}H (g~)l>ˤ%# (|J!jEc;;w0>Y(@ŹaO>coq#,P )0}r|p_y2j PJ#NdgN `#Ozk[e4l ɐ0|f2q ݆5Ѭ)iY4M'l8gT88tuɾ4i4'2 PI*BsF?PuQM&RRݜ{; iP6sޣ @9 @ 8[X,bodrŕ2L.gk w0!h̪f 3lꫲmWi a@`QD֜${qBL^jгn*trh"3&`N'[act -S:rS^;Li2rS5Ylᰲv,75B+%{0td))%K&.X0((BڬAY\ITyN_x#ٝJj@QdM:hqdy#mĐ !څ7~`S1($R:$T9~+^u{ %KNQ d624x f!Xj|FeFʣِP`Qy`@,H s!>ah%7$Ӆ_~ϯk~Vԍeo|!?2.ꆚ82ܫ9}GU>3 M5zMJ] :bbcI:`B3Ǹ[/;`ѐ |5\ A?BjO{A;iP.Zp3 Ŧy=<]dk4cB@iC{vaOq,)G2A% ᖔʶ;V'MFq 9 u']!ŧh 0?o4X8h0 l-Ӎ|m,mhR%OGuq*]u%/\t^Y 4vkWBDY`LjvȑRbfn tkp{DmRB/Y?ٸ @|ɉO xP08,7DO7O Znwcsd:sd7I`s 5;{{7.=9!asZ mİw C٧GpQm2p 3vζ(zf746FPh[V)ը#o{*\ũho?5? 
IWss> lpiϼb D iI|'{)Z;æee}iDyCό=,=zIB&MKF{xBBLkdH]n|0c)&K9\ıQrWU`ydVAXPOYT7h؄?*n?zD3=8ѓjf:A6܍N)*ZpYCq(+Kxfޫ{bI^JKԪY[k"szc;6J`?z{/ԟj9NLv'!>dlenhиeް2£;}Tx.h"w)|ub}/;(( CUKUL˒+[(OYi疲cW7 !aS8w7>536ez#b~Ԗ+hF5oU=?CõToI]nk[mnU[Is<]lZ]@x !~Or{ ;1| 4l+Q}N1C{PgNGtn3C9ܨT|`E(ivnQ*}b-`4|4tf \MʸSJ0030+t\ٵѵ6.2%斄ʕҊv:6RӢt!>BpA:wo? qԐ(^m=mcMx6#fvvPwu?SQf$xt߂e6bfH؈". ۠6*㠣pֳbj6g{ZԍÝ+5Yq.JGCWR[j\cuK[C%SQ,tΐ #Y2W)nu'KAN׵U67Ņa*J=%[ƅT/E@xJwtv,bbV ;58v#z78& 1a~&gyEhM+4,)2@~Nynb$|&>^i?&yFB*^(sLLK;6|N:4lT9UC!ʙ@cbDWYINbv=6it)R@oTr@u:u:\vYWVrܩN'NtwqmWf+ux*Wi]%qw2Lj7+B S\!ي^]l6N>]Ρz>vle&%2r,GD~66ôGw1"meKF酓p?uH 9(*jW\[n[Cݦu8+?sro۞ќ+#R $m)u{d( &X(3:}rszV]9tTϋ;|Xmsl"D}Q!Z&u"o%"|1MquO9e~jJ4i>5ӥbwp*eJm!HLlMtoL>V]=kOG/Sz7u@WuMT1Pd졻ajp`.@f AA@(Qj2z;7,gP' &4' |wty1UnŊmܵr -LYq<d.cD ْv&/b01vcA8;LS(bNg2UT8@f4:%xUl(hGذ-]H8'e㦅^Z 4g܏kv3Ar7CIMt݃"1`*Rl;g|j}$tą *(Z;ȯhvԍS Jg9T"/-_l]R( uycc?&];miREK$_O`Vprqm@~x_2]\Hfq8k{|o$*ȱS{eSo[YISot$`ff27fӫBXOUO1G 1|dzZ>Wa?a`QFm;Q5)XSP+E s\m61,,#2@!TaDJ?ąZ[L "}wJe &A" ui2}~7gԇ>/>!ezl9|^くYtHϱOr~qG|ߢM O1Wzn~, Vjt[ܿ25gr3v69帨goɠ#+Q]DA!ovGe~1L:&B256@!sxrS=1NM"F@^'>`>I{qFp\Z\-5:Nou A >NCz/=6]2F1yGvۇ^o(pel2PuoB:Hqr~HҞh^q|-s]kϥ%4Lj5x{\VШ7ܨ{YS7syZʫSDՕNW(\܏O"2<.f̩3s2qiWKg[T 1B,'.k=P2v6mK:Y!I?i 8 ),ьeƎ0- ^ߥ=!( $D@ PŴN'^e029Cgx& gMα4xrCD W_V)ay?1Y2 s?a 4pt¢pijf8p+_bbe~H5{j ( A2àRM% $:w@ 8fF޻x58NPnFc~IC? )9 ,6ś0i uϮ}ã ZU-@*oSi-ٍ?$4lTI~z`ȁjgݥS|&*ev Ik,m AN3ٯ>')`Ǥ4zѽ zmxJwd;l8;E $;:Y298:<IDf\LVZ++)2Y !8JPTE/y9q HY4Tt;DER-5: RF}HAxH#$%"R#+$d:3RqϢ˨/ԥU 6;K 1BCÀ^L\mUrC* 7.|rgr[) Ӎۈğ[3#ϲRu&}]kY ȉiT .w("tU#ML$D$(tu# _/{xפ=, Cy # ,(іE&0@Rfѭu:k&)хJle VUP?u!BT +H@EG}`(%:"]MNjȀ>' GBd@@y࢙SB&(K @!-tTgvwvCHt ,d8In҅@mRf& ZI0.+0 u P!J5FEHȐ F`LlqS㟟W, rĕgסRk4 U30 R's~ A7&SMD@{4&vEȷBk˓Pr'8ZfF"&oӣ-} ΚrDQV CMzԤLyݩ;-\,ĤP ;m T0)GӷP|&ntIEΎ߽"33Ow/s4ն 5{.ېצաHnEnpR4+ s1f 9X&;¦1T 9zJ2CcMԀxgMAT' Q/Q_$2:-!I'¨{,˔$9\Ffˈz'ܳYNLXJKeJFrdG ŗc{w?sVtYmV*GIPi.I?w>+C0fd w {3KfB|M}L4'a۞e\Ջr 8G3ӧ07qRK&Kіm MUe|{H5!2 W+` *N!B$1e_F<C⬍?=}z,- ޓW˭p{;l!s9r f`Wj6}0LYw>RphOnشGE ]͸тQqyZ﫡9ݶ<{;Q:޿y۩n 4x 9|l_]*eD@wp fr=s{/ gΫs^NolΗ0^~cNA\taUٖ5fDN pqަ@!;QB*D^:[GqP|r@2} Yɘ4rT¢uFUsn w_R;%Er~toGO?܁yzH ~,_Mْo_~Thsϳfwv dB P1M/}0W,fD[M *4Z$tǦ1newkv3٨60I >]1$ KQI%x oB[se+ܠ=2T=}B=EΠ${4$'7Hzx@sͧꚝ\Pj[Rf;UXRLpך:f=%3VeM–fUBW_1z<$?*vd (Ob]5U*; mg=b:1eBd%jmAOW.|U/Xl)y/-ܻ'b{Wg"@dx~;r$ }*E:'J@/5T;U o{+w~1̇?goM @Rc+F{Ť$c (zt^,yZ$Q 0儬yYnC$YXfgF6Y@:RXMFijk;,g;@27TaCΦA`n\PF*-ifOu}4'_N fxẴ[wbK2s`Y,i5ÑG0EL^GeQ[h_N>#~&w>r*jhGU$!?cg|T{}J)>U^Н]MڊlZYtej:On4w߿2VfL?Ex<-Lit'"{^%]HvI\RӡF=s\Zo"Gp6тW [G;;kw?8ɐ'\wQy%τ](Խsä+;J|֤OR *zbA%̌j|6lgTM]gyCaý{8v̚0/ƽmܶT]1%󿝶Z;cVz&}Ϸ F=y0M #{ǂSAb~3 9M[ ōf|?>~Q-'٩Wmh}/]j B(5TQvK:mdn^_sy轖|`# +óOpP6 PwZxYMQ!4D㈁8ϻ0ٵi.Gp2R`cn.oR*$t*]9'usu0OB>GP*j =[jvz˥޶jŷ?|={єrQSq%Pp|5DQH]=.*W=+:{@Π&mDT>ڬ_2Q;Lsqe Gg1ɼR/5[=+%Ou;+ca'#s `7M BZ("@ %2ΓC)ySnZ񸷰i;i;VGIR9~N# bLٹp0 f5 LS^0/mU>͢]i lZďN(kP+F%6G% hEVc"\,+)K0"Z8l,/^h^{>LϷU4/WקaʷmRtboXLcZ'owzµ\QQ5F[NUW,a!N 3ut-GBFԸ7]"sx54Ұi>B}k3cXyؿe;Sm׳I Gi)|jguYMx!:nUyREқ;r|cn!^_YKWAKR*U߼F#8V1f9*Jv.#BJ{h*{_J*Tս pxX@f},B\ED ӻHp _MY?#L=pAwd*E*I8./94smy5ӡB*e^WOw>YcI *jc: n#nAw[.r8wJN V a^Οv~Ȥ8AϷ}uwH Ff+9>KK\>}GSS6Ic}rf7结P~wM=C}8c(fʲwCHa3P.$sJŶHϼ;?Q_J3?pRkQW"oOJ|ԪugȬD-b|l}$})5#%R2_υ,d98s"tEۦ:hOi*k:9\Ĥ|#aƧcr;)u:,e1r:%j`Nx@W΁p&£mlxO0l+$׫ugqV{ .J츦C㻞Y<ʮ>lX,=SZ\qYw5u+\}Wl|Z/iq_ױ5JڸþaዝͻڎȠzO3Ogkwko04|4haR{pq>/G=3uc] _d}p_ &9i~187bq 5ԅb%zܞU#9"{gcmy.MQUPӧ}&!9QX>°U$s;GJ ;oer~J_>?Q -Vwgڋ~ޏs;qN (z89չ~OMأT4^Hf% o-WT+@2A`J}bhj!ǪwhA@Iug h\t#OAdsVxu Clk`Bi;)PzK)6 Os;~.>_)u;w]}Ss txĉJ ڷ J)\XuG~|@G+I?'V6l|RNvY |xiOvWtWޡ_ qBva QP慁a>}#z?;YuNMq9C @5'!FH2H%`S:誊2Y3aIe*Z k.A5P"7 nq)w|=^?-(e oEԥ]Uozw:/СAps5o3lL 󽯛mjIm*u^M6&S*VXU \Cow߿nF;r"Djw3K0P.s]>UȦL.GyS?^2@+-+|lI烑y\: 
Xu#zu?7l~&Ta.C<ڹg(2Yb؁ R&h&hi@T>_:C:qN PO4:/;{*voycQhK8jQձ>ک̫̌ \ |b>,so5:@hS6+ ͇ozTT6䤒"$dNwGP+n[?qizwY;vp 5Iۙ~*Ds"!2Rd 8>^)eGQu&Y@xh5x:]A^%l 4Ԙ DEǿZ ?ð=탙SYA1֕2ʹѵ\`T;sC߻;`?JdzqO>UQ;Wʿ(IdM/gZjg $9 ?]| {^t% S(Hk30޵j~?q茠2.@≡bZB&( 0_._SHFytpzW *3}fI*k^ՋS$?gUnN/PRTeVJU N#uZ4iXSX`R@?ԍNg$H$d` Þ c.L䔊3!A} gg]dԛZ$(әf+ vy;9H-+AKf fgA!#9NI<}_CZ 06(fPԈ1a5dIJ17u*7=k;mDʄ5ԛܷD>U [0z[T4Oo E-Q%aR9шϝtsb#81r|is6ql~b'驭k6صX0_r#Q}c އsr\\ܨZ?nZ&@7oz+c|I2_V6uyѮ]fPBѝFHW`y" z G-?pݹ:^O:Y 1\ULA&*2UvX^~kc?Xqb|3 s{[5 R I5 vwoZztE;iAǸ Ѻsx0 j @Y_WWQANz|hr Q 9ϙ g].PI"85ʕ" G@.%h"#l8 #B4jǧ>F $v~h;/ Ԕ׸ lwdz.$D3Ùu/7 kqp`t&RtjrkWUUYtbRჾ17穾|MrાE;.jghH_, e4axpGܒ՝E[^ֱ %+03䀬`Z-X ,Œ~3^#NZs߫qgž:n33!*um .BigCBLA͹n4s\JXf澋3Nvqd\5<+'76. Y,L:ywȰYS90@0v扂>8- JzXShK hMb.͵gI&ls0t<sH '5Tt8B 9'*InNA9WGu=vtQ1TDxM캜s2c3H*8ygEd2_s@Y+RPzPa-I E5D3$09u<睽3>F/v @p[x=sxbiѲǷ;X'خ'aftԻ!y,B (R!$5cWJ59c90^w]}0FG@kNƕݥi1XJw=@d_wL%cݝ'AF41 ui~K/T)=1six.uzQ]j&feӝ&rw_ӆyadIqg _puBg"QІ,߽ n-WUCn{:;Zz9> Yi\;Ƣ泒e>5_A*2rW8I(iZ綝(N 'VXD A @0 H k!䙗HUŲ`5S]v5^N:J4t~#.pKʧoGAaq7r®ޯϪ\~G)WqeP᪚&.Ypےmb# eoC>mD&ӻ:lW,NL\[-]٦mivNU+*Q;v?ijjsЗS8'-e'yCڻİK^߼@X0FBBl c^a[9Xͧ1妉0L{jUz*cy3? 86>zٱ_qjZɮkH\bPLnj 5]\"$"rxqPDH=^;0 *>, '!t|Q[a"¢*eu{?ʽ jI5_zxKAl%ac@DHok6,vMp'9٢xF&  %˷Psǘٗ\db L@ MvяzmA.;e̥T'~?D?jV9k:+LI6ET+dɼ빫ͮɹʏI`Y36ՠk6A,04}ڢEM#y#̈́kV#}s\ǪZ$v;tX͉0X[" xhA-xv͐H9suZ< !Fv(هvc!E<*}2ɠԼڛg H |ꪭ;; ]V 'lMZŔ#ZaynG#lߌDj^ʛ!՟FCĆzSTg-+\w˵Bz: l'!9=H8sy'% f*׆s\6A.yG_z}]9]XHDZڅ S)mPԄaM3t479&L(vC5E6 eQסx')7ϻQ*`4ߛ` ̂$Y!dw |lsot3|m^ѭϋpx|A`ǜeYBx \J9@\A*Q.rUtIA( NV'Sx55֧xu$4#"AeXPRas[@2JR.%Qx"v&mdP,)ԝnnxx;;,P"t@}= ]#Z(,&J+*Rh&w%I$(F2A  We+J Far*JYC5( u JF{ |Yf&O.=IsxcIJ^֬ *In4yUF խV.iKvX 2o[ 5a4Kԓ30Qj[nfwmk[FaZ)@Mprs\ڰ_W!?S~WԡD9ņ9|lD ''S{<ݦj ,КhGF`dL9baGC1 L' 0Ў! 7Ae &fi  JN>Oڜz?@ HF?0zF>]Xn:$\QBI$-*ҡ?Mۙ?Q׮~|̶^|>Dž] m??J#. McøNܞ%XӛjF cB۾FEΆ>d wI:SS 4@J-ĠP~,&P@-2bʆS4S 0ED$GCn37˘bF9{wU]LˏkBO jg٩b=^\̸lXU>Aށt a330| q`Ooeyӓ7QlH{9#&1Vs*dcy絖Onxu-ͽf|ynCjWq9iz\ޫ~JQOjf֦P'O-c3 Υ,{݆"x;0٘  Y>ѡ6'gAPuOC6 !4nԵ8݆J^л_=EB:D}Tgҽ-uC\W{ጟ{MSP_X(_rC!Y.pn*|y17Jv4v<= Qv_p0xXǿw6{՚= CbxkƻB*J^FZc]Fe㹌u'Kyt}.,lt"P,JBE^$*+P4mq˘>PZϼcD^j=1b 4 Ӕ]S#95bYULGt5Fvs@rጪ)eǴ{{Ϣ-?183c?'R=Bxi^zNϐ.gOwnUׅG!#h"xwS)lü)v[YE Y,#L:uXx= 9iMHcMCka]oS.,i^RhDZD4ZkȥU"Fw.Tک@겲ܷ;zMBL>bǘ3~L9$q1 r#K*&t(0?>|hFa,#y0S˧8Rh/OA[3 ZP ׾﻽g?8_I030fBkd`|C'uxp>tjHǺ@WOl#ōvO}&30fc -#)ް{K}ez;SrsOקQUxۿŁ\KOs^t?{ۛ96[T\@ɳ5OmBET_MBt+Ӭ'A"I%J~`KOKG9* Cs 4 nɥ[VQ8X\%??oA1'Y #J-mzS!BI5zTTG=OWeY;CE0"g퇕U.&Ÿ#SCƢE F.Ջmfv,j;zsa@}=C$Oj>v{7)~nl.sE O{]k;z9R}/ۦ{nOEڎ{|ç~쎪7Ub>/;oeۜ':#ʂK4J᭩|3~f$Vl6/7r:üʁiCAں:li6-)- .R!C)DF2ގ]!jt-QQSP/ߧ[^D4$%?V;}h?>E 5ETvn]׏ n򅹺-༺yn^7Ё: zmZ•6vswg{F%&="8kԱl<~ O#7"feFyrD(1TS_ \ե4u>?aޒ3`v ":Hٔk0~ʝM\Y6Ӡ!D ?~A~W!.u߽lʠl"y#S6LmL38X%q9S`կׅpėEitb?ifW^/0bLn.z\Zn_} j%NķV1p|>T}Yu:mrL3>TVhi*޽+ rȩZoGo׻Ȗqwv$Lzh[9>2߇y8Oʦ@ 8%ee, 7hʐU̻x#<] m }o|woRG#:j|s.tblm$FxMsX/"Kxyf0jجQ]Xr{-έLYJiI|KB/ ZWmO~zt'j:3Tֵ ӥv]̳ەRe-RviiXrqkgw-OҾ_E ݍ- gYOT$9~i}>G]g99y9XHo]$% >/A'{&a^)|7ڧj[,;!X}[^e8b(FgYi@Ae#01'dad`k)dj|*.@%l;SOjCc=1S߿>$G-8>'3jܤ OBP^c_>?/@3w3;t)-MMe})br/n2AbD@]]qu~_4?"п!hheU>Q+'}S3 L;@A! 
RS <\U+lv=dt olny84vvj.?,j3퐻$IWE/9[Wi.8xZg><.K -``_e6 M,I_Iw㛊۫E!В8*J= >_L mЁ2i@ Eb mVU},7?M߇A2ܝq rn">SmKh>5,Z\ݍ:K=֚vE`ap;vC Ň, `hLc 5A!KO`h8U; Zx(3Sѩ>;ϠunU,UMwٚ=ӚKW+sI""vXk3SF/p#8S ]HBR]' @ ,YLx9A3,$3,C2~4ޟolX` <$ƺ/O¨it޳sѺCh_1**.}25( cpOBߧ~F<߆om+M_ Ǟ/$$RQy<҄oG]gb|5}y!ߧta.DO}nCu|yoCߊ |NF'%7wzFW,0y ynfPhg;E"sm]z<R8wD5tB4JuPaG"2 1 $0 2ws֠Xmj\./|;.,?&!BF0Z2s'̮w^>-D}O~uB$D1Y]vnޫo(4(POGI~E~/"wSQ{CͳGo6FT.W@{M߆iVUD֍=JbmYiQhU 64{٥s"vYj?/\\v-g/Iw~鮞[d랠H?.C|zDwо zΥ@[4+w׋F}*)VqK=.3?3ĦbVB-do vfSL2㚴χ'z~moleZ~[!Zfr[K[FMnϵzܧ`+%KVBQrR\Y<Ԣ'46M,Nrk=e3A*?k~oW?B!tEM1ԴmB\E?`ybB|CG#};W֩1 ~㜌|҈[IKIwxzS?83H1i&`en+{T'i{O70yU'YqGrohpM&`Sb6KLO"G" MEѲKLzP U{ 1]ᘕj aˉ)qe! Wz }+w/Gߛa N]?%~] ߭Gb0*J߬R~U%k Z5>=C5PD$c5 hyLt9rҀ*Md B-E~1>.XNnξd` TMuɚHVԽM_ w7RۖeuKj**gֹ`?y'5!C:WlD|{Xi,CRfd@h m xpYeAa63˓+/K?nX'|3GBv6+c2?.[x3U</Ntq0Ohw]'m_^(;A;C޴d&B3kJѮ//<8N4)}OM;tM[3 StOaƹ<jWD4(T_gdD'#:b9bEݲ"/ \Q#l{;>4'm/}(4ɴEs<ؓy4TtoJtM:v:RtCj킄E6IJ9ؕ5SZ4Hh9r XZ@P8R#p*@j`!g2rR"'Nj˿\T.$Q^3nm|_U?L8 wA4ܢqZbHo,r!rO4on ɻ)vJ1y5+Jate;5'x, /m 1@P4 4N3%mKլE@GxKK~ @_Yՠq)>.7d#M^ĉl՚K86yۮF+s W% a-qw2R"JA%ڗiQJCg[UuSϥ6 i{y;c@;gmfO7bg״q-Šl։8:)_Y"6P#ZTsRwX,|G0y=yXãMF e2IST !adiϧ/j^@JHv˱ 3 .usK?Z0ux.x-I2K@$g,Q==>qx]g8&Ssi'>gm21$a4s]=4P;WV\us gk\p5:2Coq&qÿLZFܐ|!"E+C8AbA\P lA"n6(|ly2DFYS>NwڪE,B z_~_^W8=[O-uu-48>› #\ZnluG}]N˺:B\J᧓X$;BB_RzYJ瑻j#x Je[\ A=޹eJQ.?s毾UG7'5^@uަ~gB 3`LY"UCIFF~{c2L?Ǩ ɰqeA?r3z+< OI+y2c5>nYfߝD<=%_7߆vȖt;pTQˣ+*Vh h > _$Oo]OukhwAHJzg(IϠWJF5'Q}Qo$&vr +:ey=tO;_Ѫh=Le92$?QBR'BzXR,^>˴`1;n,b1i D_< ME:;M{EbԮ}xmi@f,ZՓsJRn` ]%4)0A2`K`h";;šX9Vw1|_R"?gͨ@s"Iwv T3Md䦟Vx&_Lf ̱Rz23f\kwe1P ihr'QgBQm=>^HӋ)|U,OK#xk){W W4ﻷǗ(|d\ARcgQÅ@ABPa"ƴ@T3?~ށ2㧦 3JRcT ' #iP*cA5Tk]0G C .Xf"?hs=[ޟ8_\l$IZHÓV?o0df>S/ }.=6Q1ϩO ` =T|i] hC ~e_JC⊢y2$ "HW[C`@k$V\vk/u]Sҝl]ggd#"z ]q7Z*aU"0#$jc8qղeT뜉.);wa8DUIpEnV׊e<5 Wk2+e`ak] zՋg̍{#$e)G4"8Sj4> 1ՠ\7M>x,qezӘ#RwQ [f0hX6JP 7diaKMQ0=;3^W^2׍<#dl8ԣeJ*`-_lORʙ k_9.SqGzI yÎ[mslLUӝAuu&"s~>9&8(V /ަ U%9P# Fg-ˊ| |]Laẏ}JϽJDJ ֔Ҙ]5&4Đ NO#[(SǃURmG r|?E1T1np3h_@Q (d0jQ|Fw܋h*ozts~+wh3Dus7ˍ|Π? X(h=OWʂ&"!"P7\¸o2&ӮѨPM"|+^?чL_2//7' äժ- K4˧z7k@|qݘ=d(!WBW yڗ<o[ڏ{9TaN'j+,^6[? jٓḨץs9!5|p.SP |n]8 Ʊ<ߠߋ `I{NMh~-w~O@*q~zVqS9`xY\*H$ؠ;v3 2h=P3v,_lv0IYU]8 KE$ăU@{,n=ӰoT"K "͎.܉ 2-! D& &;SWG>tyx(I6j7pi-|Gg¶ BKj4a/mΤ$372Ki ag$]r23.jH/Z(tjoY2RqPZ.n` R"Z3VS,5!Χ5H̢^mx§Lv1heo>"I>@TG/i; ĴuiuڡxWD]pn\\0=+Y$2NE˅u6 Hw kǪࣂoȜ"&cv>VL#p};r31ޘ9 F>1<2v^ ڧc ̖DX}z$k !;>9{Lyvc! "v]$wXa34Ex] ǯ ݙƿB4FqQCk`4G[EZ'4'(e1#5d䇦RE؈T$ ' g6~?q0g3e:[܋0_i|f)Z*HzX &bKc\*(ʙ09`Gqz)C(4((!urѐa2X>ƶVX̄00~ J!\' l0C9B4ka]||ct% N9aXubW .ݛû1!Q .xT-ٙ!BM^ے3>XkͲ.W7M<8f FUIPgP~J2Ycd,;@jb]M(>H_FN[MmTyQv3WREj+!MunC)ɝZbVe,x ūU`'yK(2VYIutى.n׼%.RI z<0[pKnpiD {]}GUV̅bTuI3:ǵ E $F.B:d֘.~~Dcg ~n8xdË9o\${_dݻt\AО9-Ͼ6~P4x'sqr,BqB3%мډ,֚zٞC8Fң-,5h6m$t`Yb5BS1,IO4(sW7sZHn$M2nHW8+fd9A BOXQ{1:2S9Z[uOQ9g j6ԎX@0Xr8P$/mgC_gQ|R]1, .TgFĪpA}"D]8OT튢*^WyH?וߙtݰ W;^-Si % @\+ˑXƕyC~!^Dq soK5\b1oQ{?޷0,4IPMcE / "&7tz:(@(a Z)(bZ" !grxCiC(|0`%e[-[ PKMCLԔQSPHIwhۃ{I9ƝeGIB_*ҖryVx˅ 0r g8Cm7*JZIpkz(<-d@]Gϓi|km+|Ƹ^oVO{ Ԕ(`#ʡȭy+u)" JtP^n~˸07=[מ!CAWNMתy.R>;(Ukwsx 24.  n(kʤP*@ !IMYw7a5fݬrGNڧDծYSE Ӓ튂Х2ժٷ]>e-[+nwz{w0۵{ݽ]yBNإIQ5,l&۶F8FvgNԋn+{kzo{wncL 1b=v/Z'g⎇kZ‡GlȵTݡ^*[흔wc@nӷs;gMZmQ%Ҵ裻δ3Lw[`z>:Pj_}rQ,UnmEV-2(ʏ\٫@fQm^PbMZUjJk-mSajjMf݁ mF$β-m$fizu{=\lHƴiBQ-AI몮mfcMtn*5CF f i6Ed鹄 IBN(bABĀ %@ t@H!AAA  %(@f1D@TB$RRE)BJR ::Jm^s+cpfi(PPU CRRFB}^¸|<@Cnv U Ec-Szލ@^;vǻ{ HQhѠ! 0IR@iX:cЦ"`$0ehi&X  %@aФe[IυNv^頫qϽl:z@hy "!vHRO 0G8eR:G;'o0=*_vhR* eR>7ǝ**>,J ѐ} kVzA (R,}+񵦶EF3[r%/^oCM[5|8. 
ݎ+δ;sX|qd:p95@ zQxZƔse%S@|P^&Ic(C2'C䖵FڦPI!}(ou{{е6R{i/NUl^{6*FYֹe Ia6Q 6 s7j Mukb=wv# @Zʑ{k]jbm`5>|ȭ-h`rbeHH;ٵ'i{w$E%UK$[|׮_v!#Zv^ݐeUGU[jR5:ih΁e;1eK5;u=< h* ,>: ġ]Ck##]/j'gtuqѣ@خyoqz{{vη@]ȨzgZ1ﻟ }Th!(4R޶-'l2z{{}.k PPQ@|tj)|Zie!e=Tk{z{1[w8p@G]ށT@}"wWv*N{(VֆE%E*( ցBP(P u&&14@L 4S @h `h4&L~L#@=žSdiaOhɥOJzzzɵ@&M&A@& ` dOSL=J~M LFM dhhȞ&&OC&dbdѤLɩ= 456F)S)6TzE6A?Tf˫qOC +(bb٤G| bҐY, ().S+Y]ج@!CK;NjS,DoOoedj 552kqda4^L!PB=bHaEUb2pqgljZQ?Μ6 0D6r(K<|_tFOrC εz@X*aI']enHl,%HK6) 8u؆ HHp|o*j+b Wg\!XP7Α 1oomQk:۶@[pwWl|.E:-FJLU 7!d7U+c]`aQUR65?uvõս62j vϦExUXro͗lĸRGD3u4yHbbw.kEۧE3[N&8"QɾlhZij& *06Ukn0K4wh]M.&(H1A5TUcsI0Tb Ĥ< t Iä,Kd贙=R2](U66UGRfګ29 O`I{ZJ}:wE|2`t+L6obf5$$`*sF7G}gT,cy0& b-9$p1',(+ܷ[$-!sOh{/JUP( ~o>*v5)'J( p8V|JϬHA~Zjw:#xo3a7r8t t@5TT[*Vo *nbY-k,LV 7;{RݶH#;nJ`ڛM4簦ls,ٲSRp#"68ɒp9?\rk yDǨ-fxPj<lRM !hW_NJj]UJ)<{ 0 B悑T4U$jaQ2oU3=4w >Hٳܒ_ p"l~=bB=QOP鼽f -[kG6UBcWm("Ew\O;9B_ 3 0b@ q;{NjF>G#qΘz-AZBICy$'OQ1:3k TURTY XoY!d-~ҐëTUԅ%!c_v}kD  N.&ltNyp|WX8 %?sz1!" "K \Og̫w^~ĥf-!Bv#;z\~! ܢ.B s^R}AM/'{0$QKDX)f3Z)Xt $`HC97"'&Ƽws=ޑ$tgj峋z= Yd5HRXH4)`o&myZ-Qiؗ@E ہpQLX(SW|LZDXmKa&i{hNٻD u->ʹ͖t}˚=aK+'f?(GLIϽ>/$aa;؉ ?7Iny(<b4xB6oxyͷK9ߖ5ďnj/d2IJOIvCkDc WԪ 'r e_ᖛwpW: MHk)G~ ns}|ewEǦ¯An][p!2_m8P! mh:;C䢴"D Qm6^ P):`gl9*TGY$&tכŘ|Anް>`6D2L; |Ӄ-&bEum+Bh#-creC ۠9h\0$.B{uᶶWXfo!gm ~Dh]KYgϛm}q-_z8֑5 K=gG?vH@@0"a0W"L3Vrr)x%<@w{AG^}ח|^=70 ȅ I]K䳀<]zXS%kԶgk# ,um-IUcPCd#」چYd]tˋAu[u, & {Q{:{cd)R/x0  ێ-9^(MEv @kkCUYgw@<ᢊ *kR'CAD͹ *Y碍S-}~ς1VIR bMSva&uK'ija3o1-Dhaw/ϡWgQ:BF?7c0dduke?ٛ:Gwk'ݕhX?[%x#R%. L/;:d̀Ò@:kR2W!:p;5!ɢ ^/[-`џܦf._j[rb@D T0¥V [5H +K4Yo{x rNVvwZHsłٕ'|#(L&RED  )`~B /Nxpz T x(ӚKהԓHCN.j2+4v^m"Fr"u}3~!\SR,ל [H Oq^.Qv|98=) 7V`×^xΒ5Z"i1|9H>TC i-K>X#*Iȋwn׿o՞Wdp'zA . $BK!ߞ{{Q3B5'HJ[(LBr vj !dҔUBh]^ٖuBbԢAqG!`2v x]pܤv.鍇z]ɪ=ޏUH?FIe0hB7kMtWimHV0U/zq:-@v`Mgb5BL-JU Z N¡VW\ >*vf s[K?dY^/ȉ:5UlFu9jTm˺F_4 3PHm.fc@A>';qnȈ xPJVeT@^X D͜ ,kKTq}t.}DѼW ]~]'ZZ>wp ٴ0wO)Aa$WL-=櫓F]R8~N>Vh]3ji(v T"KzЁ$ ;{p /;Y ~^xڟ@YۨkjY) #"!]z2פ0/Ҁd ޲[Aبb?&Wk9K+ץ|~ v $1fFj9}Ž wGxƌq]8UӚW3W'>l X(K7V{6eWEYP, +rv P+I[ nGwS2=^Pz;c`^Q뜖Ӽeڨ\z`PS*/m&~-VUbR$@i&$D1ϛw/×df"%mveaUJekku:T0QoQP)|GoLndo1zA֦[pR4f4H"-^zyןv^e kk9IHPPpW\)͌.qq3fkZWA' _ rί~.Ǯ>29hXsqg+7Ib5+?] rKx)튖Ь> ح?yX DDI[h\* DW|3#`\G;cw'w5/K{@Hg.. x*("QuZ8{AWv<642Z)rmkz wuR*َE~q~P){P| ]k˱PN_1Uztᚁ 2j*S?Ni\"hU @ 5jճ0a:S՗ao޺9{^ 4æ8H@N/C.!e*>TO>>cNS͒SEGΔ'Kl zށYŠ->O1Ș>f'Bɔ1œqI([%p2~P>#0qR+r'|- 1H* 5T+ E7ө,v۫g3'/Ȇ8C|/lաjSΠ8HwٷL\WDBYs'QfwFE^ 6[Y}g4c-6!Կu녡\K#pÊQ5}>a4`Q$7,v&$C2Bxv#);Ba|\I?2thlΘEDz~^[Zyg~UO}`O,5+kBRJxpIO mf/~7nGmcg ̼W)-''l^8:$ַ3 ?`HkEJ!y/}PjX w[&abd/c`! <0t>7 2ʝ}@GhT-f* +j VIOd]K_S^13o+#= ;uRsN'I8bPsY܊~ px?{Wxxt_\%=7}LJ#m9M NLjPw3)DpÇ=A_-6D RҐ]`=_C"CjwUfh#0)vѭx yߋM]eFl/!}6jb'-4M !UY8oa(D˓P,sN >ҕ˛ לÜ?C @Dmߦ9&s h?l ,"!}Mƶ+ޒKFgb[?|d[E2p9w iw S{8p. 
Xy}>w7?j^#Sdrhs[ú \DRBҡ6f,AAju=g7[ISOlJ1vstvYa( #ʒ߬Bk3Q ƨm.3SFl* `^QoYALShVxcXķS 0kN Gu/t )g5ƛKMlKL )t2;rI\L ImQĄ LeP RZs 5d"KT00W4itM0TP:d-B*$/ZL#qGa)M xS UoiLjYƪDC¾A3.qJ:4Xvur7f`RJY=Ãyj/Z9`K}vHJz ?u)3Kcqp_O}~PL%\9n u7O 3X7NNhN*T+6꾾fl(mJerݽ0T(InAW0=1${:dQ%G=ڊ6L3C+4"  Yx侞gquCW0r~M;F"\mн%ˠhZ\?,B'kX (y S9!0>&A Gq@Cx,1 q\6O /E !'d/Ȇ~2;͠Vt%4cwIO9_aƠY>斜nϘIj96L|J@#Yd}qaX|rY,.:g:1erfqo&pݛ;P$P)}M˜sh^0@/fr xs"-5_@%-k˽ŵ&I"ޡ3ρF/HZD 1Bp&t68np :Q@%q*̤*>qq9xϊl@KHܭ,zRtNkjlgc3@| q -pG{ni'ך!Aj` By@ -~q3ap cK}!9HRkJH` $ 2_=iќaUfF1!-"wqcHdxA]˔\vʐr|h `0ߌ*gkyKbHbM)ITY,1=/J4=YuXj^k~ٞoO{ܚpw:g*[gL>8\*F}TKs*L 8T6g4ei3t #) :QIɱo~G`""Cy]E}߳\q {Tbܳ0Gnߢ=7gӰ~Ocxƈ2qzI栎w9 c* N z3$,AXZ)*oېc fM  FbR +7uexdM $5MvuI/9^>iRWڬ =_b 7$}_D#Q}U!3C&:au JߗL?y?I|guo,XH|MK5:2.]v%{{\RFCU&Ϩ}|jp"ҴI sHe:)>_2%p6ǷsvCXfC==8e\<_I>?h!hjPНU äFuT܉ k,g{w voÞ`{:X3c؄o7'~ dk 6k$Fs]g C],KkaރPP>Xmn<:!ojz7?۩6:ȶ (oA3ftAõ{>/3_tt%G$<0 |,-pqm4{HQû{E-MxLߎ* $7S#V ={{8Ll]y&t:ye.fOl  Vq18|mo) ¬ftȈ_"p[~LYpkq`}gꐀ rh`9IfDGDDzn0Vye#V,$.Kr*N>MféNHn Rݬo:4ik83ak(j{2=em"yO3BѴ H~ÐGOjgaeBsd6ވCA,o&vl7Av:Zދً{q4l?< %A}jGoKe,_bVTly,yDyY <@,ؑ=wFֲ&M>?Ӷ&>}E #0y If`L]WL2vI\ʰdc qicZ+(s^-[ I-Z4!0z(⽜l"e/YwJռMU <*O<\&K*UjF*菾9Ă"6uk<!`\lPS;޸΀ndC|起^߹>B7s7V!sQEQIr~ XNy;^8-Լ0CeIVDDy($u?~"=~HYz*i@gwtI}ԟSPFam|1gϿh pcTj@$Ĕɀ z:7&8DP$w0Z.\2o5^;[ǘ;EBi*ux& !^JUH0ʔ}&7lph|,OE$<ɡ=>L~8O)̀lRh<ӰHtu`ͲfH9s7R̺KC+ Π<~YXN\ӗ[a3f@,&][O.b?B΅,% x3 jԒOEFJ 9Q?DO(SsU,_~dTmߗY)1jW J6wLkԗL4Ckv.\/!j/ W$ _hLM)9S>?8~-TȤ9|GzEj%ŦA'bCjA[C`y:O_$!tڞhI?|3DTV9&;Ѣ>W*|O#o9#W:R{A"/Su;{nGF^svd_+pJ9Zr9;Y|sJpq % Fpc'Gd#e(rc}v^, K;`)KM a͛+$Dږ !)Q`d^f>Ж 3ApINH-^2j--h&KjzT}ma1IXaGy!ƿFjxs ȠCAIn* ^|P)'b3 "m F;_nXF@)1Ç4hD\kM浿B2UB kpx w  >)!ΰ(_C:VPsFYeȣ0S?cڲ;콻 R~( 0SK7L'es.Ƥ3~̓H~EO/fALn'Ѐ!9fǨ1dJcGtQQG:c 3WxaM1L(l|,2dA,MVx$@?sʜdgi_ZOyx&5h5d:;pjDq\8;o~ϱÝ+ pM aC?ҹpCha4WvRV,K.\ x峲0=GFIv<1_wF۴e\H"~kMAp`Zf_9Xa6N35z;=<~4XoFthOmaMc3!dg_2 ,R-#jߍNM4isW)I@F*Ws0Vn+K$ ^b"cq@;@4VNyf')/~7P'#V+DoDd`X"=U8{XWiF9uAv7L- { hwA@r¢mqھ{8gô XƟ_#am?ڍ|:~NVb}Iyvro~\e>Uh"m(2r#O(| jy6.m(4u#efŏ)nosPo%yǗ{c'D/Sҙ'qLB*-b񤈲LQsAKχ6HR-ZZ~Cu#| 4xtFJ7K/tTF-*޼I8:Fc]*@"! RtAδ@oփ"^6ĠDW-s{U8iPKbCak[rs@}grEghxHC/-JsD<91L侚vٝ[*Uw098{,|OeV*ݹ*_5֔K3&{ᬇ)a}~iE#D{4@;&lJ8LBi␢g%*n8A8f3C m:yv" ;`C#ͮك)ב.&ؕ-g/T>fra.9,sӚL0Ԙ"-K,Bӳz=}iE\TC19}ߙo'nP@!`ѯ]xN5൨>@xZHD$]_̄Rh)"B!vN"|0s }BoR|`@E 7 9s%(ku;bUi@4"}y`!`c!">_qn_NF;GS#2!&p=?3'6 EO铖UUU#,P^ŷi>I$Com喯~P9~oZCOƕ(0ɻI4BxS׬Z 7!(B\#.Zct/'5vtن ̃cx[א<}ۍʻL|&k̔  8Z A 3#*r~u(ˏ"Ibwi׼0-xS Z(E0|Z;s{fFY߀N}h-\CPlί-;C|/9Q"+.UnEq˪NI&a9 4x`̍rsf= 4ԲO'nfdEb"ُ4~!y f9Wd俤 ;տf R$;k㟘bB x '@p2E8Z}'L)#FC;3}{F  {ׁkK F|*R-M L_i3gx`!(ͳ^*2a)W4/^Y׿>Q_wB5$X痨 M0VUoKZa@x&86g^$@Dy4"N\/Lys$:_Z5ͲE]8]C[m\YO3KgҐ-4^rzͤTl …N%.^ i.JuS0Ȇ>ϑ9lƢ\DaY晠1e?fߣ/ҥy)' K BFB4CQ7a D(~pPoEHp^0bSH] 3 ODDk_ g"2Y1k]T0Cs^ i<cPW[h9LySv `YʊF 3/Ў}Z%R@e3CGABg'J-a jCKDCh|!gkP =c_b|zbjY,@1mna5o>OX0qBH%j Wv~6Y3fҔ8gmV@ym4*]BzqYGǀ< vnP,D"$AMj‚sbuB(~ ϋ݀wu aǫ?+7yIҮrp ='֨XA)~c]_s/ڠu@y>v}`A4fLrm |7׾@B(H?V(K)nw)}%w!Afy2՗˨[܌ Òߙ#=߆qH9Ǜ}.{--u8.ag 8Z$Aو1N*-p8M_gq/#EP8f Aç6rf~9O:ӽRK~(HQWN[ HE%T;| Ü5Ry4f߫էM{kj~nϋL+=#3}!WeIF"D>%JZk \P[?Mpy#Kɑ$``'XXJ z\[x.w4Z:la*"Q%e#>M V0tAꒆjhHyd4}~M|ZvbNj-< j @`xx H:%f˚rwd"E/^z?^ƒ2ëv áK9$H7g^>qU 7cp4Q['CAX3/Zk+=%*;JtT#z Y\acD ڊ #%]t"ɀzCO;8!@ Cm ş]nfaΨLpl{[clz9\sĿНN@B쮝sHY,g~oH rw(t#{U/5?۷\Goh*A^D=[PA# rwBS0z/@@aW< 2l)JaEqXM[D!fq s*8ulO()5$"e'Po¬Ii-kD/B|(7nnD:A9䞥Ķ~=^ڡvffe"!YTR#!l[grtkܕ@-@IAAh:m]HdEX^zT!A-TvڂSIܰ 5SqR ( PQ+H۫K C%s7Mޜ@Ӕf?'ӭ8uN0w`fV j y 7*Pyo?~}}c|L=D/#-1 #D5V>z(:G(E ;\Y"`Y^|?5b'X C,V/.6QtFFBZQԉ=[5h׈,[= an1Q(DJ9!kJxI`09 L nƂG/(8q 8 *:xCѢ[yٽq3_q:~.6{tT9޴Hmx8/p><Z6VU#6p` Ӥ ivI$6|ds(!vNYLHC|gt"Po._?UoW8w-VqP2z~yeJtAA_~#s MX(=Ǵ015@L.6PN}oqjZPGg} {ue fH_. 
I?k\HV T`bܠMcIKrsZ) wGsP.]hw$}&@0'/ȟǚx~#ROcI/+ouT<@[MKa H*{׈ p(-5D@.Է?.uh@!-7a7P$LOf5{h> Zt- 4)#@P eS!j@\77^1sPA=a vL71C$ ?{s.G{M"D9#ԎychIt~SXw: (A < cR>9Y1`0rPZcJ8o*'5^ h%&=IT0)#_a̐L R@Br8vB(U#.Ug "6.}UH5ʟSt)\Γز4jqA)@dcvl+Z#IzVP>xBܠ_~5{4hED#Q v^ fTIgzA2$rYfh~Hz@uEr+_q;=DCj~+4s}욓 M.x!l秕P NVmnEY7MIu_Em3'}̭l*f(}ܧꑤ嘁ؾpZ3HWN(:YoHmۖA)I36UErEES߮4p`>1U6e<"8ʺJ  x͎YCG?t=dyI@]_)fSҦK3=33jV"t6BTP/\T!~Ja3SwTB1 ]F ~yj!,TlgS];=bV@hɮBrdH;d-qADd$tˑp.N?;φoX"CPóۿlld1kSגt*AI3HJdd+9kSlzޛq4^dxͫ,AfhnkL"'{WFTPOF.x7y?-^.x{&vͲȺz_V<ڪI{rb`Ǿ׭ei\$k@" gS BCf*4k? H(C}wTO3qW*5} $:Kqqղl]yC}Ng.k{v˵ {;fr [ ⫐a 2z`9JLYbײ'Lnهg$` i;x= yOm:|9 g~i}wWϤfB~N{8^2Fn16ō&½(xcdd׭4e 4i%qKTVdCh20 2;-#rgVnt: ~I-Ezܺd<&+{P׼~}r>P6"b䍴7|dEA T@|%ٟ޾<r{oYyG8,\/y]B=վ"PU-Sa+qِp{ngO}"m^M5{͊cX&c"7My}3_&<#V0պ~ H Z /}2i+x ($id mm3rW\fuH"2 mtA w>̀R$U# [7!]G{[Cڕ䢖ņ7D(Bǻjt?Ί\S܁(5$,a4zHA Q)E # A }WS sv86 CW8zYNŌQ[mz},?TE@ ^:B~xaL>OU}tYu4?s}rĂhLDh#dC4_b,VRB ?8 bCoj"GPHZ=nO#׾>XYG0+@XF)Yפ~zwᱻ&>׀~md$ΥKdTz(P'&1T]nD:W|onr|؊W 5. E_PI"f('lR76TQ(P_7ȁHy _hqB`[jKF)48p Q n CT[9AJ/9[=vJ #aa#nYwU߁)VX4X2;> Oh%ȹZ9 䱑>6Xp+a݌z/,>׷GQQB3@ڛ<ʲbq)Bq c($ރJohB$%د|!UWvf:$ AMOd=?["2FQy@t?г3'@K.g;fmV-&ұͼk@0,QĬz >'P^M.A L H GhZpg&]:Utx.p`;Ӄ-otku´˨X$A'4׌3pb?[vd3kY}$;F߄W`GavocG xD UȐb 1ޒQzM:Sݕ7lmxuuSZ<" F@R,$:8[=OA$pc]Ys,4kiMC 4DclW[ Ҝ"Ok(kGSW4#l:ֱ<*!Hn;W%]g0DM+mcK\6HD|}CF͹tH7N4ZP]MAp\~@ҋ"JV@H@I/ G|ޱU"Htx ^vrf?e%`|񡆳4V\[1H=7q]oܶzS=t݇MMg&R;(uڈz"T7 mHjy_)N)޲^=Nl'FJp@FyQJ)+)ܧc?h>EYf8y{!9FBG*pNGg:+#R3jSn{p|Nwܬhc'i];H%L#"~|~] :&҆ƞ^ 5oDJ\Eu#58qPz_K lp e9꒓tNn:nW- I3 ChóL-8v(2I5}V=CY#8TȐ\@o4> V]nyVԎEnGQqW} mGu3`7Ͻ4X~^8 juȃ,&s=/ӆkS*y™Qr~؍}.Уy0*]:3 ^ޚʰ$`gQ͢M"DsvuWB¸Bߗ.(\N?]N 㘰8pGjz=0 6 9&eJ Гa?y>a*PAdjp|0(֋dz8Ȟr<8ǿѪV"-#?o?UCG':qͯ]-< ,%%eo"([iIY:AE;zobr $ aROGorp4IiY\|X|̄7Һ'6-v(_GbybO_?|e|YFBTU![%R;1 3T-67[~,HfTlVSZHp<+ӊ̢@ĆsktmH]$P:JÏ1BS?=J^bIH}Rb7*T3DOE?OialV{O1$KAĎ%lj=Xx.A EgƏ<| ~fQ5>ʼSdÖ|ûFTw[&:r׻.XD8cB9[$BVc9)Hv8*֭{'#x&j*khOCK1I׹WM}A4-s/HtN5q5o1 $J^K++~\ieـ 0lňȅ۞+@R8ͳ,E0)x UG·2A 7`Avs[[*=zGw;^<8B$ !x5Cg̙&UEW .HAZXB0#)`Vs_:iT È`yY<x1vƮzI%YBw+`L՘NRu7eW~;|4|';:ͺzQB&H& IHmk+L)jW5j=BZWyJFW-CL9)HHFE$@H@ La9j+.vlÝldxM3W5_NpȂ^zNAi`2[C$Lp>gST p&. T}-ݢ2ѠKB uVh 7|.5x>K岉6$ydmƪ 9D?ukcT\D @(?V2q A$ Nm~H bذHN'?gO*j\Hҳx.7]M gv`K(+H$$5 /^V٪ \pr|s?{{H0`U=~ ]ֈO xoITdoB:HItԃn0+Zꄖd GK/qWr{I_x˱o]Tۈ&7KA|`>CT-eU(JÎ{7!m_޾mO. ik!&eۃTE$ןr~YmyRU[^, Kztl/A}Z|$rK!n;2A"U`&,֞d`oMbFIv)b q^Uf!3EG&Gt~9կ8n6aHT.wئyYT&hL^nz j\sZ|$*]ދ-]cJ%g"x?gn76ԙ_!B ~&fM#_y}M{\ۤ=HN_jh~H i[%gޚth/)#(s-{?GKGz፰Eiv}ܐ%{3!EϐquЂ ȐL%c7 .lSQz>iX9/PHF*B`׺WZ$v5 |󅮁eso{ tuΧՈiHI&0 eU3\R'w+eʗ`@y3"2# ^e~A3(?R)$JHB:w:.AEɉ?(v8V@TɷUiwxyM[ι݈*]ѹwz `Iou) 6oz?w,Br'S9K( >QKyqY5<*AmeA005C Ծ[:z˺N[Qj"$! sST0DZ-%8/#nLth{8:%^C4vL7(4q-F(_rFprP:ob "^d zaLD_ xxl8w*; L'@e2@E9ky@FԈ˞. V>[m"O q~ a)UO"i0b("C΂L1ffe{E w,*-'-0 #ĉ6Ur L"&2wkPpP_(=vp/1.S /pL)~oQˆH$dqo$jjn9rc&voHFYH?)WS1 I$pA/|߿>pm ٸD:D?Y'j_-q /YW zy"[\~?IpOVD w8;"cJo$ "(1TzְI>Bmi_ƫ)t7 \;Gx˜ًv<r&a,+U@;:7m3~ēx=ВAkk \.DAFgFPHCd/,K?y4/󢜣fXXIKVu:^ڂbQ噦9J!o|յ(]H=Go%P֌R@0߲a&OjCwt W7HjحsjE,~R/4\H][$\x-\|f$5tFmhaۢaVd4$ru/Ԯ$ B>87?%OߚC1DJH -﷿ZS;[ѵŸX761kRV Gd"@,mSz϶х7 b:>Ʈ&Y-PIĄc(F`#$^z{E[~ʗf7E`~/cE ~IU.~?5s~:dQZΉEG0bvkW, :điQBJ{nYtܸ_T u8[~50q$1 ag/j^aq{59)>Ez#('ea`L \$ŗ{e<'!6NU֭v>EfRRegm`˄(R^] c,\eͻDX랶"0Z]DPJώncʽ^J0Rm* hC@-O:7Aߩҏ$ފՓ([]{4ǬlTZb$o͑U<"}zqDSp*K- ō{.{wggl, ;JrQ,(wfaK { =@av%d[ PSwD-] 0pĔD;yh?E[2CK yMλжD:;7>þrOpFT$JPhn$Zpp k C:eE^5<8t"?; |).,H˹_>DI{ "Rж|}GۂZ7r o< <]OJشn20m?&n !ÈJ5iTQTܶڍ }X=4m=Xya#I6բbܿ< }@*aJ$C9wA+X>>TuuƸYcOxv}"ș]"?k#_A][h0o[M y.iXDJ|ZB HAEɧEg}!V5 0fhV' p\p-fn\\kӝu#C2X(4jnL@Ջ. 5"Iykv|Z0 TqjjAHNG7O֪(#f˃mޔ:2lyA3˧ <xOñ|X5ڏRed? 
j`K)QT -bK 64Sn]C 9 ]<$_V;R]ݞz|;(7M Qo/u5yͺC@ϩ\zRlnddț8uh#HaP73K:LXPAx KTF UTR`O-/A(* ]My^674q|SDHFhrIs#[XlsȦR fC\*-}y;؞tYs`1ⅾa!1 g= |=<=e]/cO[{0,Oô$>N/5>NJ?f|=[@)r5|ąԅ2$v$3]0,#Ol5#|R(I(Dž3r5.L@5_0ݧ~AvZ}hO+t؅6YbWQ&u3B`F;nBKl$h-i\dgFw Z^7ERr0Tď Nc*gkȞ/ow~h˞loc?] 5S`vb|MUt놞} "~z OȫDD=(qͅ|- ;wўz{Ȓ%aw *Cj Eڊ:' [P-R@05Gu;&upS}~^OI-'ԌOacV_ZYd!XN-iBJ_G>emˊbbye5~Qg2Dfl-P2zoǺ(Vy"*S~HrԴ5qb bk7]ez#}=wVT]S2{ zw;MTRJVQHG\rͫ2_{_d~[ϛ 8bG":}u?|_U%SW1_|q]@@X]H2_̋ r e2ĖU[A.@Z``+`ۏ w6ǰEF`3>֛Cؗ%G>ի9#kp;(8d^N~yR@Ԣ^?++%`12 C0\\[pY[Z:;J流I.Y:P$/|̷@uS7|}~[/9(vn@D=9h J0&8eD5dڻgWH"sg{K=3 o!68> }Vpnl꠬ )$esFy#,/ma>t4 <0VK.wW҂Jg rZHr8ΏDˇT8|^Inq[JN{( Rni%ha<͐W1gW.>ºI3 t R M1%=I3eRiqV>Ń[|W H JWojCO@ƿ%/d|9?]kCPwƳs4IjsuټuBb TZ ㏦H%1!PrClZ {~j+ӻ<6I./Tͥ>wN{昖F| Wg19p723Rs%/vᄄ7ҷh@'00>5L+Y}G3PA+taw%T|c@q#J5O"er UT55D"a$ t7dD>(D4+~xL{(;eMUvx&ںZ2lY:r ڕ6xQioRC<`;YrcYS~;gة_Ĝ}Q 7Sqک9Ỳec'sjbQě<&;DCɹ+(=sޏR]r#uodvS~QTx^f4~p#/o9:*xIhkM8J ʌ!ueef&[?:!JO 0>,Rrif-ӼAxS=GԞjJa,{(α۾i˿q~MRґ,?'A ?Z(Q{xiߏw@}d<@˨ \dImN2Z-!?M)Xr\YpBvdy!z¥/SMHéŲ%ȐXihd㺧y_s[aA=i[VT Ldn9i# cT0wK-tT1Wqp^(S K0_Z5 \tjUkP]QrVb#E >:- XuARVNk'0Ņak-J@]]ÇJǒWOc=8O0< ;݇'ܦ~yO5 VGPї'w` Qݚ Ad Vn7*^h뮁":;䩌: !Do{7 _u/5֤:a _8#vv>rdzH'@BByI?Am-^o&"=(-eeW]*F`.KB^$5O̢<.+Kz{C|=YS+6u5h#O:0!0M si`"PxABHp3ǕR uMNN4ˍ\zn)G$TFmڢDP1z􄨱w~uܞGIbA[XW_na)JGܻw _#$#Dg'3,h}Wjtaa{V⃶} lMF|kj6W~Bbau4A(Cx f<'q ~"X;uuvO^QgQ%QbT)ٳo{O ZIu[# X@z]y\V aUyu͒E"AU<@Mau'h0 ? Zм̶{ }^HE-`r k; < ֦`0B\rwNM}/l7 a`/s;VǴ0r7Uev7Tap[MC~ ߘ]tu4 AbV6^ 6i=euaYd#mۑbMa!ӽ@d= tjiF{d1ʜQcߒGwŶ \Wp>$D1Oʤ "A;Tq6 ;ظ<#$ IUk!AR?q@,4,pVB{N8RЁ ߖ@0J[2U0W$E@xq(5:6vlj>0'a>2Aɼ) %@PSX~Z̈Opq5`=~y)^4icY:xiSZ9s]J$gfMtpwS^7"ɩ^wJ:}C^-<˱$4%E]ܤ=|Dޱuҏd5?A<}KZm;d8hegaHy%c-;FI+rhʁ9J00Cڅrq^:ܗuHSF:.  Q )&KŨ$_sq<؜9\aWQOaOK a#X89(m~u[Hϵ_~ŭ_Эv%0{d= 1F; 8"P" Gk$]/G>m>.1[>crq=Ǚc^`Ȧ鰏O?.縱iG7i;6r; ITAcuv@Y)5=Uc)ƺ|RהN4K]: ^׫#?evm5[HQfuS<ٓ`9~Rݪ 0Uwi飵qZt2~Y[cH=!l|0Gcss)ke4"\tr~&yˬٽBzV%|9 cU~_lҦ}M[f0 @̴IwpײPB4ױtçnd m*WP|:[N~n~׭޾H#/ 0wA4^%G. h3_0L3ȣI*MS9ۋvzxU)GJ2mupāڔ.(LJ²y_bHl9u{FFǵ_|T0Ii⼗.*stU]+]'➶WՓM"GYH?u X# :q8KhJA 8 DuJ3Xm?\)}8:BT`1 f6~^M!u !vKO!3ݝBn3\ l 1W5DshAX3tOEtv5fd_V֞9|k$LBSq{3BZ,ZdNdrPu9|-7+58حLܻv_X5d%PI*O{//oDT`j pU+~rBQ|^Q5RAV\٫ؔ Sxp/{w|lR{]NZ-wb6Ӟ'y ]@RG2F h=0qZʜ]{Ƌ9Ocjw^GTjhy xq n(_Mc#.X؊ L<Ê݈<8#uFlBQ"-w_åÏ~1\v(%V\8[q̾: Q;&GPa1ΰ,L6^HYb8~g'P@C)>,p]W%r~Z]~aio@n\H\Rgx_avo"EXc/\>#-Ie(-ׯ-JѭWLCV=`>Q/h0sY ã_2t+!A%}m1T=4c'aSGқϦn ؀$EAE!oYZqĭWM_(O|x,=h&I%־zd]8ަ 5+ņ H^7)o?e{_Hv4. h>o'c,M2 Q*Z\g=߆p{c#n 3v9«윗YF5Í 8k廙BMmV))RFnnST~h #%QeY={>]J1oԾ_O*+2ޢ%%QTbJ&{XrycMQ"t ܺ#kmxsGZSpSf^A.yGs|uwWÑ$Ǖk\:hw*;:&%.( a$tc=/:(G?beJZ3.yjZyAK.O%d-ֺMGR32W\^Nz֬A֭ /ޔƶ )k'~>y!p-RWqogDhOX>7;98')䡺nq6G%i(EF M쾤2'Ͱ)=NKYcUr0RZN49%'3qUx~'WX+Lye?iz(`t& o}o Eo%|}2W$ kқ"nO!4#^2I>7V(_Yuw Ohq Ӵrs?5ґA8fyGո#Ea) p)Q4VI).Kr CSO9hl.xoKdWf5xԉo/*s=zgG!7OHo?C/oe2rqoD{ˌx |;%nfSb;.T Eώ9n vA7># Ԣ__CܾQ:Yz $N8bb*;2M ( Gro4zhow[;}I3}iREPPF0Ͼܗq(S.#M,ⱅa.t7<7yI߰rC3ɣY+x !ET9n6v\({L4SY(p@P @,ꖹeδ9wZF*b:FR3-h7[߭>87gM{405Hk -"p& zxfnPfp!=gglJJU? 
7ҿ3NٛU{9V_:y|=0I۱)L)kX _ cRlڂӺ»F5riyW'DOb_*o-=_7 H`._7ǸCZM$RE |,liHZ9;O^}TWK[&e5~2OzùX\@Ggww-=bDmd}rLP hcJ'[K7UWOZNڕ!Py~+;\IB*nT_+o8*ɕnWǀfWhcv,v~-wYUDJC3ƫ1Z}yF60FrXb|V?ߓ[b74tU|!Rs, 0/qep㾼v%zz㒫{@a_6Q,4}i@Lύ +  >ITIRl6&o'-%wضZTIZ 4lx͡ؤ 7`!`zۃpؓfvEڪhÆrT҅=c./A5{{qXpPYfMvs.٭X(S 6"v,*8wn ԗz9jhCOBu(Q6e+L+a [0Im`<ܶ+ ȥ#BS|M]`::#G5{e0@(<k+U~a @}}7 n9?TgQbüR?:FvO a%.n #qW>})&Л+!ӛcKKX[lR Q(cu ^Fdܜҩ(WR/.7L&+,deUyB$zOK BE*rwzώw7bM>ƽ΃smZ(& &&@$,d:?V9[rVz?%u~eT$c%(Ld x&u†p*e;W55G7?Ŧm}T)*B4 -G%XX$HC$k^{CRŤ@VTN+Ё?COx[黁:ĐګTkVοojMwvK=1ok̙Lop%z:* Ӥtߌn|HjWm899AtFd]NU ='iof)gN%f6 vdce "dVF:21S@" CWb,ʷY#sN+jLʕa-A+vzL*2+ⱒqw F M\mbx'ԂN0K vkB,MSE7 U[W FBEʆ:IO=l VMId3)ziʓ*Y!جo7{FuE]\)!ljcـǩU?#Ko2iz" d?\-)(XƟTgi &J37]1LwDp|OON3nVԅ~P+ÆA^Ie41)F%-X?#l:V:Upv =/E7r8Fn8;̭^vՃnJO \Ÿ>"XqON3I)?kT-[ᄼZy^a \(OhW.(򴜹7gELkH 4/I%HH3*Iv نD{ĵĞ~ð9fdE[V.cnHE=d=^Ѭ!F] _r r( (#J*F}&ޗSqCnc E. kUCa^vIt>BBg C0G6¿Csh„]u0ksT9U޻%Ȳ.w3?kاMP]ږKH473jUS#1\BS&dHJFޖB4ϟkX|4}CY1tw:N!ʫ䯸zғ{BS4)&1ڻAJrH6߆& KkHWٙp ߙyZb֎AvVPA6woXH^Ϣk.Ԟf ޼ei.%Ăql.4Zۏ|T IM9QmP"Jgu$@] -cN#ZCLB@Tr $Uu.:~Ay"+{-蠽$&D~FήpGP/ָ$ҕ~A<_~YLV0i;`{m~%Pg-TG]VB6d1q*{e7 J O\υAcN%r!lջµe&$*j2T anV.GnJ{,V*>ܾAŜu*Req]DLѴG 01-f\CB/e`BU? rřscGkS*%'Ҁw x=K4XόIN/4Ur*F?jviZTzf%%c˲ %m.n;[L>0YJ;{$qiPZh I)f]ߞ|I ]&iOsK4BßqydɂSZJ=mVpj]V U$Ȩ&>dX9{J1|Ċ8D:/1[g;Sʡ:E?\ yoX 7(ZnZRn/2ش Pgm: Pt@S8~IWrϾ*HU",cx@B_G; oS5,|ܼ}Q"-;Fe+*$(ᵩ5ҠGe@Us;I錪iv% R+BW'5+{**&16ONg U C9[5@ɀYqAQ'7OpuwPTtL0phV_hƶ`*pKigēb4VzkAû ^4x-w=v8ѧ}"gUy9ڛ1VG1ζG^bTf`_5!'9.Sᑯ ,KyhW";tŠaed6mm7brZX\X&`ӪC #7%t|Wnְ6ВtE;U-|~g$V)a^& W;²  dȀ^"\1LlfOާ^sx)rd9E;CX ~`yq7@]UPvRPFV݋} xn:oaL^,yJRDG/gEN xUߊ QMgve|)!I$Mw -_$RLK Q1"@m7)tѡ%AD4ٓWi ,͟γefF '{h[lдuᵇǏF܌az>ysirlW 5H {zNA0 S?dMūV[_?6uWO4ej> X$“}V︠?,ewd>v=%Nk">5xJT-ĥDĊddWĒPH{+O - `sЦ˒+|qjix!%r*SK^Bc6)K1h&i``I*R4Ǎyq~ΞJ crg~}HΣGђT+'E]nd꯴+(V`uX<n}Mjna+A#xD*~0d7ωKbDzBORXZTWOc^BX4VbAjH:>+/ŭ87X = d5mqW&l6cMńզ 赵Lg8;ȬW{×tY<ݤ9M`={Ej_h%;,B/W*EQr!l]S>0"si06DbWѾ~ܺX+gk^ u:KZ| 2RR5bȗ~YLAO<@O,xv.M 9HO"߅*ˆB &2Qg1kH80 Af3S(=V7r^,n> bB8BE" &;X֏_# 6~;}îB ޲ Z̕(~HO.E_^5Om7]|5<ðC Bٚ(|4L%MŊiTد jHIW TN"ko%#g%'uRqPi~ǥOﵪ7TcF>b UeUݗΔV${@3hl#lMvcT,;Jr} BE)vVUn,Z''5'^kz9}'5|ԮMUS2?{ߵ=U)QIq4KlɈ=P.r(łqpOnrwKlLj !ot1/g{p£$m5,L=ރ Ra"/*q\ Iiw~GȏqD'ʯ+eC|)5Iii(Lz\K#rFee|w \aDJi tk2&ϼҽ@?u5Sh,M4_V3!Z~l 85v3#xNH@cRMCo 2U.<x#n6P,S Zc,BL`iZMFՄ‡q;Yֳ&_׍ ':W5sASMa}o['vy d>8's$!# ;9$X籟l 2^ l ;LbwkxErɝܒ{jd/i_v o Kjf3+y}y5#T|"Rl5DYd$vRhJyRB$*b+dV[²FT:#&xyCnK_<`@󴹬M_M媰Zϝ(aE34^g|8z?̛ܶ v7zH((z; %9Yk|RuF=tE&\ce'q<kY}; r>kG?jfq*A Hi-9 1%=btqH9Hʎ]^2'N~ziAvVC,ɖjvV6G)k.AWs)yZuHذ.DrUnʹ#zoF@X0ͲD2~2GЍi ':_ w#r CgNf2ws^.\Ȼ xHum_$i}5D;̱dUWTXHIYJX5᳎&PVxАZ|85 mIvL ]b=G JGɑWf7l1uP-n~C8`.zBa/OQ:yEK1׌#^qtMYeAxW_3P0f6~jm}p/Tu`ʎQnFUt2w9FRbw 3 xISw FVSEVmCs4=QE;MHnREL綏0 c7.ĮiK+y2qv0qV^t^d tTm1ϻOZtX](@5 ahrjf{e ~9װuOMCXPͦqIrbn1*Cw:~3R:ʰA R*1`Jq%vŸTy"C$,h(P2㾽Ez,eY4Ǝj O!LA@}&o(/9sxA[-BH1fe7şgGT_ZZ&pe4qѳtbˆ>0@F`'PWϏ8 }|gZeX>=H',Ă}%fIgwEpOW4QkilZ|ɝxZK5LqqWq oB[myG]]22٩,[ff],<&.!DY<g#z(ZUzozZ;2T_*)_t'=q_]҂FkQl6PDїeaٞӣRv+a jkm$xT, :qc[-b)1v,hKS q󌄫.=p~ax${~xqnjYx4wl5Y-U!%v /0ηqmPO1Izx~/oN`#U~T@^_DІ% }:t];`if|P2Ny8\Njp'.ĈLy wD+rtl>1z5:%0]oų!MJ˜F>tخ+IQP+%y/d__Ze=GgoYu JN\\?w.W}E܇*FI=W}ɃF8F꜒HYeSzDu& S>ϯhIϰr<Jrf4բl؆r9̾ %" 1r_rKF)@B1 ;g7k?;1 TAK, B5P(ʵD^C K%%]N3OuG$n}Y%eu1.ћ1x4`!);byŎS0 yqyޜ0>-Be8 x6PA>USQ %R'Bnٝ%+g S`N.sN{w˫ ϘHs|IY˷;!6"q(|`Z6DbtIe4ˆg?&^?U͕[ Dlyi윑\&6^Aڸ5R<DaY:/]8x52CQSQ›ZOes;zer(Q 2@!ޠvjg>RVPG_džhBd_ ! JQ**g1_g]X%! 
s$TA6Xt^8*O6Cbb5LqDG;J1QTl0">U {b#[J&z| |xqcH <):t\gUޒ"Ve9C5FLw01ﳭ,BPLzmC&Ҏkge[1yu/1 TVuU&W(dG =aN "P#b~=RZ>-GX导L w2\KѷPPBa},5XʰpRt_+ȓO%X-z)ZlzCCփmQyFYn2Gj4L%kq/'+뽛u 05z3=w>AG }A&J"-C0k8պTPA^0d/Wkz*G 1iS8 .${Kjw4FVX7T5/`l'ɖ)0򯵕da Wbp!ޙPbݺɓ=tPG߂s9ØNJuދ% ޠnw\<ƾBb u~YU>-R};D7쨥 ;=I=]LƘ0X/`'E%Ɨ߱E4 OP,Ckj8%`{Xwu3k@"Cs}i(`٣P=NjublRIT AmoZܮa7(yvVmlv¨~IKŊd+fāM1C HI"UAX}ʙMqT 5bu7"+E4%:+R7˨_M[7yHW:gj}л=d NZ?F{ߢbᯉ<&H"!:b bp9ѧzّeKhC6]z֖~?=$K͢eў e ɕvϒOKQəsX(NgrY@ 5vՔ a2hi\N6Nl ^ J\S&˹{"h&|7h0r3ȥ<<$𸬸Nґ'CGyku˝o"g}0KKI@! 4eݢ"XGD9{aZ&Ym'Թ @&;văH^jR;p6'a{ m%t.?r=ױ~s~Yr,0ngcqVSb1LjW+Rŗ_J+3| /?b]?S_҈@I&].-v}1\C { :)݅};}Hj´9F,bl\ ]XEr;9=o^%[ sZs>^I{'븈_dOnd( 0AsU#mwspؤw} gt~3bױBB\lI ^1AZI0 u 酖olrwT?gh骘C:K{ha4&vqрjt,$7WbIVU<.{WOՇUA'|?IKv+/jCtHS?P\Jɿ3lDGeA[G'Tp]=6i99[ id'b$b"e.(5jOy^&^VY)El%@D+XN6Id<kZ]=M}%J{x3񠨆a9)(EcqmvMrm] OL,qRà晍3),Tl9;rm^98ӓT H> =yps l4NCvzm!?#k3CPWnjdH/l)=2?:rڼB<},tZ7+k5~ \oϪ+QIx1H,e֫7Xb!4W\5׶ɼ^Gџ519ycZ["]bQ\vG'o: ^J; }+!U$I/|J0a#^=>rKMނY?Ӓ BW.w`ބsފТPá12 ˄7}jǙz6M6?v{Ǣ}0[4뿩'Ⱦg U}[?ޕsZJg%1itFfq*v3mlÑFv΋4) &R&` ݮ̷:cg{@tRPᦻFe|YyfSMm`30dwǂ)C{)Bx"î;.>D+L8L6%M="q718c>8 )q3nhɠ_]]H"=%cYl\ejQcMbOsr _Tw. ?I+xE6sF MD5 RO7D&(a4L%yngr|V^3 ^Դ˔u[eԊ .HKu b3@DWE5To5 vHy6gcVHf8O;KgHpXK1JR-SJ>W&24cˆɊLԎdݿr#SB̋ ݯq7 | |۟ޒ,ǘzե?֝eJԁF5(ԃwžM d|m\e+;?ۚJ!~U0N &).Ɯ cVԪOl(y+p{^QSsRM*5=bW50F VIc2rh7ψN'`ޯJw@*8;eqJgoVt[웵|lr(jɻhE;H n7C'DNzB%ZNp:/bTqjLYb|M㴥VUZ60BN^uk>:e؇IHQu͆_  !0>ybbs N(I\^4ef>B{zD9r^0}1Ԥ)lzr)QY^iQ)`z<|AZ>^^\wG>O&D߅ʱ.!Y ?8.=\bEA^ѹ/LXխD'bPsUǪBoJo>ݼl{hr?ưѝ:*$qXڲ,:L2* yOKe[Er+8#r@vm;*L*ݦLa<.#Ŝ&t ;]H4753^U4D[SMlQ -L{XSZEa_þHYa%2Ol@ lx _Zd;%s8ZD,~?IՏA is>+=XxziD!p-+?}8`+ҏ))v.Dz~؂=ϊ,8.`6R&epuLJԪw⊓(j*ʏÚw eM7تGelkZ$ h\PY5[#;ICE"[!swwbٓI)Jk~B q7@kpf@^҅FMT1xu= A&TM6SCBD իL1H&MJ6HUo&XGK^<!6-1H^C 4?"踖X0W9R}tlUCm|ƥ#NQEe|YFD#: NH[wNEhP}0Su]Pc { Z)1-gzC1]b <,0o*D, XSMJs!co?=R+*!G|#>\j0ωYD9a;R &v+F /7RS+Fj ;8eRsLfUuA|$Wo}GJ_6̘])[4&*C f*UssIޓY:/Ϗ)=/GWM.v?yT-CꝬ9<ԫ^mAϘYc.41Z`X0YclLYux&=eceZyBؗzdUZǭ$päl?ei!5u k4bW6sU Tٴէ\X VkEb fhD(gh@u2$'qkjYi@Fc33(_,*p/2B-bFr/:k;Li=xɀSH*>g.:Kn-Y?YP]7D[bk$׺\M?6SMU !fH)+A>4nB;"ˣTl㣡#W}}hW"CWgIsC&Ӑ2|zZ2O+jqm>۝oJME=@+p2$:mB.Y \mIlqb/e3nUƩItR~gxޒY2n v)!ۘAyhd[ږlSY~$+ EZYAZ/*pFܾ*s-doZ*&ĺDܴؗly{7 DSBEsMhʘ4Q,^ͩ(9Sx,0QZb9ZZw o=; ׸iuTsk\5ga6z{Kqki7X59iϾP 3!?l]Ǭ.MiKYD2%יv[Oc= 6V=k8U$ H U9_ӑ@Ei;4kY)@ #h¿gǃ92ڛXO9bBƍͬB2SL#sVbh{K¡QydzCtrS\%]jn~eEq}3aS?KM(w%h_{Uy7G&+9fMT%3}shcEdK[q \~K8 K7+'S*`"OF>?OlU_ ɤYukl"~wiϟhT1JɁioԍTzQ9Q: rnVcCTbSo?/޺>:~D9dUMòt5KJw(9Md*4J8S 9:-ݖ'^ w ("{)og$0)#>P㹏4+q7ekŃ~ w^nĹPY7%JHN&(b2mG*=*>*BhZ##G:'n .{H]tWA–{M&icWYo9Oia /yw(‚1h݈`B`x45)}n>&kX˓Mʺ<.m(v\!-Ik&1'I+mwƸ 7Xf~NIjNTԉSoq=cl*'}hbOƞjXұ.RZ>t.ҤxJIê/D'G_։ ƭz/YpPHf͸+4Lޕw=[s#p>3 uWD$,屠c!Uiua4vcLlkELV8vFJ AZʊi)⪶T826*nv'l] c/*UP=V`&:?ϧ5KP4ƼDAE䜖m)MzX ZT1~Ņ?砒XP/U/OigNZ53a ϶.l[8,=ۥDB Q* P ,I`Dmxv(17MuAo3O};hNo״}p^qTm;u.odC-~|I</|26OCxB3͗ItmF!tc_䘥#́|ZVPuᜓTAOJYƗG:v~"q{E{'(V-rnr­"  :t$J^iw\C^ܯGgȌ;/de6@tU)'j'MWx]&z/uoH"+`#8ܙ;c,U-'34CKw7ڰ_75[’9,k6Ve%\`KS|)A"tc 3].|gj஧"|KFS3j-Q(ox-s&EU_~i!XCxnO-C| 88~@ZVS~^CYNQ_Vvt'Fq%Зp7߲kKgF⍾<^\si|E1Y^=] $rPN t#rs;' b 犕j+ĒepF>|ώTb%~ìU r5fBxv3-&5t p_0_@7.?!2ĕɒfyܟ#{y}Q<>rպ5yq&J5wmz9o3}Q5 LDd|49>THK/ntܕ&QcbjZVfdLTh3|Kq ޷ K4y?}(y-a$rzwX[]ZЙP8 M^2";7̙!7D¾P|rS (LH'\vOJ2+.Koi6obJ]L^[vfrY3> C7q 1IձSSu6{ǨP}'#J{C5'?~6}27dWP忸hUW7rnY#<)w96TRaOMG!k¯Vs!ųLzjd촆!|>m?s$">Gr&M 0@:\/—__ܧFIo(֌o)JYkjb@zu|'e>zK:5co;ock4`2a)Jeg[IPC{=^%9\6w ǖ '@io N!VxpGF4"z=q@T[:6ݗ2y HN]>W̛S!Ep e/qm s.kk#ޑfϏ :hcnD[^ ~䬀ijYlnF >=CTJ22s邉UO<# R οVl3SI—J~!OsXH6!T5x%gMb?r2fʬϘ<Lo+L'B _ܽ궮X#CfW c D^a$9($bHvak=ʭEpd ̦K85c}jxR/矩^[Q+DvBTl?|;D]}ojNg 2/]_k^@H@@_NzlHh$I}aX[O`7R HVfK$kQkJ뭛: Aq>z\UM"3X4! 
WCQsQ 3IER |r/[s"2\.[,6^3oW'KS^2GqcGƒ+Z1⃑qR^&4Vchh 2’U0 MɪXP1l0=@vtr|)-yAdݗvxwKw^˷/1`yhJxy9Qc{JOoU>=kRA/O<Uv[8qXHT5cAo^T)?qߋλ[;&^QVVOT>daJͨ Im>91pͽ3X#w!X?dMn+OvtW~&8ƂzƞBZF_CC>0BLI%iƾ͐.MV:mVה̏1,lW(SVSoa!e$!IJI/"(& *))V93YkZjcɢ'Zra{` 1/pm\b^Nb<a#;&:c;a-k| y+{]IrB8Թp*f'SgM;̊=[N,L¦:Uf1g4aH:dR+"j_,8d-gTѷ5@r ';jGVck׿L98U NM sU}`a$hL&ͥzcT-ؚ;TPy_eԓ{5Žڌ )9ZmxsԭNYԟg7[qrVK68dy7d;/}chz=|RGE4yY-\au ţ)}E䳾,'%fCڑjl\݉s)6^VJϗS!Dqψ@ÿEBdQ|f Uk}?'{K hI׺L-UMH[^ZQs.q(أW)2kJjrYo;"$Ԏ~[ yW_gע=/}4 k5w(܍KGNY 2%}_)wi[{RW4zjL09i>bդ m #hzcE*&unk{bTbsv(ja}G9TTi_Z^1 d@SGf}4F^$ް嫈SHN?AV ev7vfdJOń!$=-H&;'+kN:g)~U'=nzY}ٵǰLFC+mg7jط@ldtfj){V4㨩jykҨ* Sw Sn+7gf[ƧdF|"@])qZFt:^u'dpfIy1~/Scy, hb2aѲDCÌs([9h~*}/mj h;̖)WypigzFo]@ ֡Sתcl[uQZ),( WԫwXG.}T:žz . [u,xÓ եkIWٵsz<r{p)N!2Bhǒ-׉Oueeةꦪ?" 1̑I"[\0>y2_LЛd9uy/µ7zR2.ϊF^w5Ǖz",=B%;'PŶeiN_0aE$TqŞt781)[/l`=˔ҶZsPqY*d:}&pPAP!B(/-be}>@hBb'.ѽo !:'AB-D]Puczv\_"Ν 2(CuvÆu&zE3(%m1 a7 [sְ>=+Gu׶g/x5_;n.(/~BGo?dA ^_矼-^5]50wZߚdx%hy&T`dsi-=R>y7z{"Ɓ2udܟK+Mz(,zL}+Untv롆Tχ2%m[pPx%z*Rb˧} d~[oVmcid»zB73auabw+=s(16ضw~*{)·8WҰɵqN&b&a5e0M-5c-T] e H1jP3UJݡH&[XZs/dm{͟zhO+1dKC\?feIWTƆ-$qjHO!ѩ[R@^̥Zu=KU_tI3G, BuBY̻<&V&"ysJjަd%M OQ1%V5"hX 0Xٍ)+7nZ"o_u{&uo*0|5# }!Ts+qe|Æ֌2q֘SflfzncWLΛpuL{~"y律Jg_Z8O+5?9tȞ ,皺G~0`rxڮulEf>蔓۳cLeztMjqc$>,`BcSsy%]#:+.jլޞ_x-AeR)6B!Yu|{q?G]~8W\-}5M?lt\Pw,lG SAlk giزsWyʂ}o;d,Lqx<]NJM2)ۘNޱ/5]\VZ|9ui0j= TE+%u|-6BvCi/`a.U~W-Nȋ.Nta۩ԋ c7|k5=KWę,rkcтġz^+7&έks9<]$w5 yOnP nۅ0f`ل{zT>qEr}FA($IIz ꗐTEQqS/+)6~.\rnbeUx%?Ǟ?V%O9by¢t-gc^nkRhhξɬa*iuq!&B!SJmLHed4YUw}|f/rPaaNлКt52Hw1=dWߚ39$eU_obr1˥BtotZu6ϯғO4~`Roq*I,h\#M:/?Rx6+qV#RU: v?Ttݗɑ4ՊFBG7amd԰=sf̅Ln,A}jDb$U>" dlE(jQ.ZYY<殳h\,&=+\A9*s5&\dU%ꙓ7-36p)۴Ҝ 5n_v"JyRDL*&h$bm/LQU.#?'1hr$ fogsU˒kfOvf; |5{(M䗇YL< iB,O]7# |{?U-) B{Fs۰5⪍o_pA ʙ _px|okd8I" $O+[mPS: l]k rY2:4zfYmzuLBAH,H73z]*{_߂շ|zɾٴQ3c1G'n4مuԺcI 4mxBryW߽$)mzn zIЮ qFb:NXję>Ln;溧ڛ NhegL|z Uy !e{x22fͩaFLy00FE;CU=qOCNJ((LDT-tY(?ytMcKM(pI$VN)qa~i` x'謸NKDs՝$ޗ`&$$-?|j%0uy:T2I C|q%E-ē !EiƄ 0;*.6޵U3bYܻV6uS+5(c$3|,2Oꃬxn/On1)lu\/L=@8י膅f†|6g5EC9%=N jv<\sv khy #)頏c_2D]DJԘS+.Vf~}f?;q^n)JOhJ$4!Dt@,1b77Ɣ!b,axl#U'zTCڙʹW&ZMc*0 +B6ue~bl$0Nmt>fQjﰟȴ<*$)JբZTHS1櫗2&qu iSU ʹKfyKF?`LnlzU-nQ2:Zfї(IPܒ]iV63oq#ՀO[+fr eAy[u_+;2)Yr/BmY¸X'b䵱_C,X[kIErA(FD1E C :ijrO_:| /W,a8^3ee 񡇈2,WGs77MHKnwEq DM;ieCql%dn3U[פiа$?2))f{ߜ.0mfǛgڹPbxs]/ΨG*,~Ik S]oX̲FPdq4Gn&w,PeFl55@UA[K{MjD/6`U y@oXc(Q?)|}?%UMlY& 5S{wiNU7H1-VHB`ι|kOm(O9hQ)=SfɭzzP&M"JoI w,;LjuX.W:TݰS'U]t; l)5XȨ:txۜV-:f7JnK&ILgH8']|%7* *zlԔw1[lyo2)0j"~{Ķco Ӭ=ұF]sg N3N$Q<"QvUN{hz ]P7 4ň_;K4%9M r0&##ԠwkrVt}Cgn}[v+-^MSGَ(թlՃbXQۖ1]GY9{5ɋHUr%8EY YY!BENnfo-!Ji[^8$v^eI.O޺^A%hl)FxxҎ)zAυq Tu- RszZCcu^Iu+6$Vb=6)7 2% GZi"A#8H6styk&KmbNpm;|X+I % jbLd=쫏Ô\:Ԃ9ߑ;Pb=r-t__3@˹e{nBl2X3FC?{9B/Q)+N R3C QnwSQ!M$A+"a me_8=$_'鑫VR@"ӟef,2C ýGcy616`3$6jeY Fz 2e0, !a$ "*,A@C ߧ72斮d,PŲQH(Fp9JӒZG@XhB~Q`F6_T`'$ 9V7?XBC]1֫jI h$d&av@ D$_S-a'- ;5L I0P$zI9JCjHy]Sz(*DrY!BCB("f$$}iai j5Uл%3VCiI3V@t6CK$Tv E t:Cav@8hHI % cw!!I=q' *b0Q5 3kyV}[yk^_v6ZLm$6&MJdXE d靖@K wf`@)!XBqs $I&9J R@;TH7 2B{d '@5$ {@߰$ڤȥԫVkʢb@'(@ 0I$&,C4h 2-z[2&[6UTZףkE([U(dHiyHL.VfxŠq|ym]YTz_k xUЀi@cX`̖ghPmIJE/>e&)32 EQ!C$λaáI4"MVI ] ! v$`Cxp]UUgI% d!$ޤCɹfdɝ,UY!=JeV !1aH*$;8H[L !2C$&X-5CR?7uo4k}n\P'ap  7d0 X6mT. fDգׇd@gBagbA2eubbc abJa 1 $he`Zt<`oScK9T$ϕFhf,e2ZaW)I^pBB^ܯ*>-y{Q)~gUsי9f]k䍟7{W+ux9o 6 JH]42vE >2Ld6U$I Hx ZD ^9 !0a!d&wP!-܊CK~wycQ@y[Uh9ř蒈3&k>6]o9ơ'}ʶ!^c#i΄DbE#j4kF4lh̆j6E%6 $)6ٙ$Pbđ&Xĥ,1&Qjj,(+^%AhڑDb+4P+ ]$ł) I8JIL(k5aL&BX̒ =ѹɦ,N2% BŔ$5Vq}%-_3m0Md>e( ^ RA@9!ڤ7.H(CK,ldY S$%! 5UM8 28Q%d&Xaf_n_ŷko'MA |,QAC3C Sʒ,O8:8I0d!3I I8LـCv0@fbBwiD8iZp/+pT) . 
9^QPW6/ c[k",,X`Y 2Ld,P!BO<P&'.a*I8b Ӫf N>;'d$ZDBfQ @3 0R)W\j-[qvBֻXjz3|&N L7X^S!vK_VKI(e8)}{= fT,|I >b $ 7- ijPA>'I-Q !`)'XuY|Ja6ѕP4g6ERiruG[{La&) ^2NuA@XEg/.X.GuRd:HYyTxN}>jN]k>]̞S+Z}\t3D*I5ݤ !sǸm5#;uP^A;^]$*mqm(%mmWZmnnm6DRj!hi-!!4Z7v*Qkχ7Uۻ(@'I! !BBH B495THBHf@mVI&L$IRLgI5!.$ BIwUZmjۚL!!B t'a!oF֪݌j6[ZmzdBHr&sVgʥ; 9BXв $BmAD.fMnU@.Sf23$2I^V fMȀ-"=Ƭ%"ȁJoH8T!%jāDaLcXժ/!.!HakL VZ Z.9da&)$D0m%^RT 0g5dt,5EK#TIZ(! UCtBdZD,QYy5*`9 D6[\$ qHI!IRBHC@a!$>$ $$ Y} m j6 )u7IV/E4-q@DZe"-[w:[殢o%)7F $I2BLPKM$2A"7΃EUV )BXE6KFeʆL$Njc$Xa/v44Tfxi$ eEI.@30tHH @HHa&(JŶs*l66ի$ HCUBR@ҐHdHC nű*) *F\.J {կ*̶ fi{elDS[8 & BI/B(E,JKB]WfD J9U7b32B_560L, Bj%dY0eT7Ie asZ7*RXUP!AHK$A/TI`Dc{P[ڔY:L423ԫVcizs- Yk˛9Fxkzuk7{v櫭d$Y dy4zƿ"oeo[sKiD0@JH3yQ;MS&9$M;qXP<$f;CFLnP0 K1dI"cJ=Eۅ^1KLCT`WWW+JP5M- KYp.ښo $7bI$@2 !R dI e:P)9U ;c3LmغB\a1SLU*Vi(Tª͎ /, "Bm^d!ET1kBYA)2K&uQd u](pDB& D MӿI>\f B%V3!9PSdUC%3jSl%701UUӛ4`/g53!\ڵȃ.eWV 1 %Qq T7E5T7 @΄!dD($2adui,P{Eӭ YQjڗT j(Ƀ :B`I]]Zgӓ*;Vy00I[ 3-7L mIaDB}Ҁ`&=BĚ(CfaV6jT-MU 0dw-+&(UZKي銲[*\rHqcdb)#0f BoRioY"ItRzXUiMJTUnwnn]w^LJ deVtVƴSo,{/[+>n4Z䀚uEA4-kBIP0j$p@H\D^&Zlat& Hd{2u)(ՀljfHLjud'-@Y !$z$e{<-jjjJn؄ 6m{1F%kp7K7ܰ1fWml>/m4tm-LI@jiY5+}_ F:TF(LI! ,-UZ ,P!ЦIIyP@3,ݲPl V$Xt(U0e/\ ,la$HHS$D0Dgh͡$ SmԴ2B@nꝏlA BHdͤ)UR w@ڔߥ$ |t@$Fv[h $^浵yUmZiYZحVֶ*VזВI H٢Y$ $PBOa$jX;WِY$&Mk!27+;(C_sVM-5"U&kU_Gm|ZFѪVܥ5[uZN0IHdR$!! $"ġ $ &^mZUoBIU-5V 6[Wb1FыFlX-U%( heSYU3UmnշUUZlV9uDjm26Q61ak)EU_,žɵ6@ !!EmRm{ڭm{BHCwhI$@ BɒCxb!Q(4FSmJRVMRs[mIB~@$`)^mMm%Jت[ zcXڀ 5)&@$4%I M@0debR ^Vrַ|/-V,HDm#YlճU6[ZmAHId$"ҒbWUmգFXT"ZUhDj[mܸ[Y56 F[֯koQ^*4j*uk0`BO )[OI9+J@$7ڕP;$t.z 2h DmI(+MV6CJ@ܲ d0`I?ޭWVFUƉ55XձZŵ5+FfU[6&+k^Zj%mFуkeb6VU[,QZ[ZYm_ɵn5mQW&L26n5HkU@mV-qyXţF@x,s8>$  !60Q@QDmEFB&f2`'<) /ghV|ܨIM0j ILbcfjj6ո5lSb[[+ms6߸֩ :!΅0QEɭܭ~-v5[\Z3m(TZ5mՍnkr˜+f]Z}ul[bkEC BCd᭶-6עWVײJų4-Rշ*]kZ@Y"5CbekjeV15F\cA{u%2IHOB@ {[Z72"d,E0I)%شLcFV"2- ffm}9_"F,U U{-Ƶ{PVQu[[xU$j,fhѱmkZحڂ֙QV-ض5j-Pjmqhm+j~S$*ffjbIIcJFVF&$622ɶ5-mmo]Z[jdm*k|Wȵj }B0d%թ9@9qS2$6TB<.sl3oEHDb0"!bEhe\XfV*[oݚͦHdFH J7 1H&Wڷ,JZmZR9֐$3x,bABtnX$]pI}&گ"lQe*ފjm9ʫ~_^,I5ԅ5ލ:5AZ6*M6j,&fs } !.2lѰ38v֒Wh=muɨa FmqPI|0:&H!);[hn5,c}műUkWlUcVUqVS*T?1&ҁ='s_"L*Uo@4Y|܇GAĴ Xm392szZ,qE$!;RU\@/%vur1.ZDwC1)#dM  y֔D B MK,СEԁ  EjHl oҝn$SPpLSR(8v]Z+[ELߎ[n5ת\g_;լ6[FZ(Vڸm9m6Պqڤ}E㳔τ5wgmQ T&n4Ɋ02вnZEBa7}m2̓ ȔCE)-ekACu!0a@H$ސ:gQ ~L%2?}P$.EݜѺrC DH] (E>C'&7S-.aLSNhN\s6=`͚BĖ7=ޞE~/7_xB jZ#^nnf޿?&% "XwKNA)m4s<Hzu@W,Nf{mkCvk&ٞZظ 5#n߇L:)BdAF#WTX.kLoo:HoS>߷qwI>4 X1m+ [1FT\$J@/~kÙ*Ov l@B6o~x03y[s#֨]!姚֏m`` e n7yULVZ΃-"O찰~礼]?VX=1N'J ٢5"m%PL(3M-Z.nU5u|y~$"j!_5lCggZس~kE)Y-Z[-iL~uL>ڻϳk 8|˅䄏9ZpEh(U_( <3ΝDNå`x@pN͞>8v*vZ7V.1 @,U`3n蹢q;vx)v ,r=Q%z8@o0z^Y{iSK9׹-+7oh> b Aw $3HP;OS0DLC|[7laǃy6z|IzHstkFD`m+5`H@`A,!z_7lQ b|K ܐb@R`")J G׻1ȒuLĢٞȳygx/bqZkyjnp-ULaAbj\+z:2"bQW4nLJ[1a'OEJGL2V^ 2e+24[P! 
mj^I.Bح4$$> yitq'2X0]w%r%IJ;+bbR)nEȎsλs>NnR$#Ԋ\7މJe<8m1gX,XLs疥d1'U &$@x8f=G϶NJT-9=ڄq!crB&&͵[jj];*DXA )n4k%D:793lR|>MD5R PEL53Hb~;n_:90 K@4y\"9G"e ,tP0t.11`-$""!YH٪[mўZX·Zo:{;s-.ʕV3 0ф9J`瑖@ZV,%sX ;6ގmg-Xoz=#mx}rCGky+W8t6SAwnvĽ$"'.e[N z.O֔36,/kA!S=B)&+zO1~E i (4ՉNP$[r]q $W24U2sKcDIoR[fr9atD+ RDDHenU+ţSu7pW,ww7@B_DAp.c9{L}BoVr*˪!g#iB^q\8D DqO0ͪ>>b-G'䎇M2ûBv}\Z\x7H՟dɿ#$|t!"!HbfIcX^/j*)t:>#tOpchww|H,N1W3hLB@**"$32뛚q;~.,}''ӫ=R7y 7,3&)>\]Һ$A;'oXAHyCvs}-Qۤsʊ ^WB×UqIGeVߕn&ɮ}^AC};tID4%ݖCI69XuXA!,pkMR.f|qDCKjf7XvJ[ M h[&Y\0>o:i2#"w^HqblFg2R$y4gH;J;NDKqp îAjX@Qnoê&trʼn.M@)m|Iv캭ۢN5JeU5.i': kFlfwy5 d*+&!꼨atv4CI|G|^6U(E_ T(cncRYy}E;ʦRpja rWuoPq)| z격5+HSA@o}wt޵U=3ĝyV9jۓY fA^Y:OӥTcT}a-$쁷E:-e+P]ZuO 7ǁk\_s$&X"2_uc,+Yc{0׫Ç[HrX@، K3a;!0d΂ QE"I% W54IůƮ#DU@u^-b-&`*˓kO7F%T[%j 3!MdJ) )%Wa)uNsNBW)uM1,^f}X YK$mZƂ|14O[ 9sMnImj6mRP]MR˅Iђ!F UbTƫqUƋ:ۍVJIUU bM(Ɩ<VMD\'4h#,D!) qHHE-Xvw^](>χX&z> TSנu}.i$výM73ڂ!0zJ0@ 5I$ĄMFH~HuXオDOIHsdR@VzDv>Md04DBBCX@'m?U')!y $ B&o]svhLXDC0& XE; Nr"!+Z>X!u,d'tP&Bԙ@z  °P YnJl| g:PhϷA~!eNڵ|/ lˢBӛYp*bAtp~CV;EWwNr}UZnœ&H[?J} NN"u)YޖT Q)߫<|}ĠlMX`qG`K[݄V$rg _9JJ:e&p Oc"PUtgE:p?ɖcI-la+k8F>bҙ$\Ou!!CTlDAA*f,kYi~+3ZޜH`MVݴeIL_ ?cV7-z'{;EՉ03";  $lA!yeqؖ "X.Wxd>J w /Yro"D6=$[%1 b~Mf^0Z^daʥ% N`Uh:X Nj]݈dv"X U(2:z k%9[s3&׹n&M xJFS?̔$9DRsy@S_,NǬ;r^0$aˁԒOaIBH$ I{?"}@Q?~s $d!$Q@s?섁 BO#9&|UK}_*F#=fP.zKYc"ah!DBt$g>C~QshXZ4!,_s3dz+/?Q=UĮU#}"Xl#%'ߩ 㴕 Ge$.#0 z++-PzAI+1j*Ҏ^)jH*$>ыWygj/Q_REm zS]1lBXUIKrSBQ1YqP@W}ecwH k) | iBL*UZJ,y5ePRoH39AKKlEA iz=%RAӤWRF ꯺KK~SmɇŻ%}Vڍhn&҃*a:KQ9 a_Ŀ+Z1*F#2XfR_X^{VG6dcU B sX!cc)!aES91RYaPehK0&ӏn{ 6§T~Z&B~'ꠟjϭC!7o)޳?G1$8l d$0@@_|BOjO' ?-r/:dܮ}ɲZiˁ5j@b's(ATTA%aRSBsRvm-uݤm ݢmmzo$uy诳uV}`޿FY[n=Hƞ8-[P2ɷMaht~i#MD`*+ jC32 ͤ1('@H'HB!&)HI yv>=@?ÃR~! m  _<ùd:vH!$\-oS2\?}fn#3"*ؾ>EEUx?KZege:OKiT̍eV^}W7N2&ќgStR" ~eFdD6))&FpXBŃehFOa'n 0C__nʁ<j;@XǛ S&pY< {#/U%3HFN+_(v[~+1$>fPO~x.O1? <OaY٤H$:$ sG~"!0)H@d6?UR@B@9i$hyZtL[ -8?]ǒgq7UY:x XCڰa~ѯ}Xy^yd?0"AM;PlKqu'#(ɩ6G>'TYNDHOWTPi-$,_g}vGeE" Y u8suq]4%ABIk   :<`y*&" $̢ r(˛qN/'Xrr,uOW-F+z ((h Pġmn١£# =U\uc7>ӱdk+ in ~ċ=/XjoFxpwL-"nGV(A`j8͏+[oe "óC2ajk)7W]HLXꌲxL RPk^T؅m@n8$!ĀCIH"GXd=u$p7 d0P6y)m;s *,-RɄ7RF.~ baƲC D ҪUC2!B `xm< E^~g ɐ@P^Vڥ`De6Iꪜ9CF]"6:\e7Ua aЫlfkVFf+\s-@V#Z7He3ՌaÔpG>jLwEDTDMNַVSϥ8+x*_7VIoF5i p)t, +_`B֦85դg"-_SZ2j9J-Wˎ̉^U.@@HC~?OA WnNuw|=X*SW:P8ɬ*MA&ĒGC>^s@R,dY^3Fc @.Sq݅2I*u"(Ks2w#l'M6, iu>6H&~qS zxEpmvkwk'r Yay] @Ā`kDIe!I'Nw@ -I%Dfe~BI$*LEHB+$"Ɠ)ѧjg !.W ƻ7dէ8ݒK LDu̯at} YQ,B L -,"Z`bUdb uƭZ/՘V 3MGHH@" Y\nŠ@Y b)Q!"6 棩53gC8mminMm5YtK<4`( ׈%z\טo.EWZM@("h*솕 G 8Jj-$q`"!M3yG'y߉k6A;8z8.j8ӊqx8lPD!q(Js* m\$7'1h3@PlEYp߬VAZ:"Bb 5\CEq6 @MLc:捜HS"/,W24kd#_$k-B3P4BuO4BG(UvZQ~2$Dc".Ye#A@ "@A⹙f́rmWһ7߅S 7xMZ~M1 Nxo7n*,AqIUCzRm^\ѫ~fP;{Yqu4mn Zt])a^,×3l7{$Q25Ec/(jfJѿԌkoZcinaVl d*G&J$DڊĄg׌Հu Kw/إp1MiE$n<麌eO`oDC ݹ dF8|܌H,~p)F32bїjpQO*R*EMِ"@_]W|U8Ap)W*PtsLJdVIҸ,(0nVP %XaL)KYLcɐмnsX "".b9tef&ӃëOgk-'VJp4ܩ݂ J`@! WL'FY+ogձY#v 3i1>gW= F%diգ}c0Y .4ں3aD .xAmAb!b5JG;oIM:Y* )VP+U  L "ȰFmd<^@q:[vg8c%SCF,dG3K 9[CIiZ |c-"R/e`nBkQX~Ԕ$EZHb*OscCle\w78-^{&aE ;ې"HMaJfoZC-k }\%Xk8)4'~7y,XhA!c!5 Q IfJtVw+`gg QH"!:+j̸I"ҬǮzUF3G,I*F le`nJY.~9-OPΠhc,I{=@6{p[2as֦IEPAQ@vWQjѠSg A#^*Ejj Zma<ܱ q r=ۯ~@5E/^G7rWϖ# cF#YX(ADR5(nX̥#~۬ĕe%ءi" L; AdEo#PL !$|(zrT6a 7.P*[pj+\p(C$no'su8x Z$^ \KN8 a,,`hЂc7z[13pwHѵ9 qXƫJ,QfQYo\FBB c.8 f*Sceǐы!Z\+ZRL(36WWJde&t(n9Eǐ EX/FRQwJ1f ȎC4 偔DH=W3G0Bƞ#_UmU~z!ce #YFV護R$%PRI!,C=zUÝdEpЕOxផ|AU$E0&\2ìrs55":d\Ito Nm2.)-i*, άjPRDMp4W[`!Ɛ9u vD8 hLɓlX3E|`q8C,%Pbi8O "jsC.Ihu+s$"qRq. 
i>:n< JmEfejP.ځVDppT3=k64-QD xOpYH $dh9D0 E"nύ](@alj"jn 4ƀ 楟aə :0I2\i!e%:, DC0ƼA y6_X;|B[tJ@vzduwc)TxJV_Z K.M>a,%RZ)V՗bV㯿V2 ʏX{a)4)YV*9nKЌ @)$ujȎ ҒHflp"bSlBiNyAE\̤)JWrK `%.)ꤴr9 a ׀54O#X Yda@Z6eni;o&vqЙm2+K$!\AqԥKjXTbP ֗/.9grƭ%-X ]<&Bgn/i +W'an<*8BE§P2'8] 3 k7X% EHFkW~ԽDa;llT fH"C2E mr-K]RgB'+|’)zqlτNu2P;IU3jt*N!:\pd]q&&5\+fvs!$g8Nt ؠͨUdtAxtԫ!d& FkkkHUbHu{&M9H^xr5¨"xon\=UJdł 4fg4j2PL DNk>7Xɱ,]XieP"]筁A + V*,8ߣ@@,͆F #fϛVٸO_`"T]> %RJy05 }iZ;Ͻx :u|u.;XR5v\{$H* X ߳.SsAH.[Y 4(b.NY6] v#|9"'-2]хO$者 WJ 0a-TX|*s;ULD$z'ሬ:P\ E̩u0t*зTFjB|gG0$Or/0L֪*o77ambpveO~V )iJf0Rf5hs9{~0w]ɯQ"`hԈ;fcVړf\kd6N*u ۜb XII| k93q:jTю.}E¥D!:9iC¥HJ֚hYc))*+=^%mX@m(q@KE8Le W_7|^$ $AW'ܙ!ƣɨ̡`<pT8(pN>p\ WRpB/W[o#Sb{X Ŋ}̓42,I^ p:ku;qp>+jMVJw# NB-~inU'Pyw4`!*ڌR4PԆMCkE] 7iڂ`<29Tc, ~SAEE]mws Rn1MIP@CLJ&V ..E=Hi K+oqBυ¼Y zՕdZ|)Ή* W'x4DJi'YL-njQdېW} ت(H[#L޽R.ԕM<gLfKK? &BY(;2&1ryG0nT@>eaH4tR'"$W,iF1W='}Sa9Qԙ jk 5vjAqM3) WE_Qm႟}RH6,Ӄqo^&(-;`$%R޳*ҳ9ԛ&YRahvڄD.ߖ\eDS62$TjP. ,0l ;P8J#Q‡t.gV E ꙼;%*aC)Oi\ hC,@ Q G˜,>L5\P͕fI~ 'K),⇦B'.壒*3HNM2H.݌EQ{ ܑHJUZ*[ӄ2Umk]Xkw+eȫRB` {`$dˍĜMGj. VyHZ+ -P%LҚμ']T Vñy=ɂILb\*G/+q!H:GJ"n')fK]A~+̜W )w4]g'dҾ $U*mBEF$)5 ~Fq*Ra5Yfk4"fiDnz3״)$wmƛRm<аJ:`6ƫtT/,:Cc?~2]i<ɵVSZ嚕T[:|Y(J!Y,vE+|*el.Dz##=.*h4J&V):M$Sӓ)r*KedIZQ p,$V\VqAcsP ꪬ&.jB*`ɭc)B MM nsfUJ SYT ;wh#>+0"vhY%2yw[.TL‰ܴYA>HR20F J!hB gLW40 .%Z$mzDd ∆h a!!;"E5Fv@$Q*Dd`gQHBMt노A(6&@XNoB^kS 4Ŋ'Э! ec}f%NC^ƆB[1JA^A6 qg8fфZc x^X^Lx2-o >&(@@H&@$PDEDB՛߉ךFM)z h3EƤ!Ǜhϯ H L-L8怾0J_ƹ e@.ta,,"ʈ] $uIzWBA2p;a$| |ә"h9ո*lsKi\xLM!| -CX`Bq\ʴhZB\"(Xi x Dv-AXJjA \gD {k0xY>%aD ,Ѿ6|N68iMGe$-[I,BEaZz}|/?5I4ezkHBbJ3u XtHx>P=0>uMS@ L!$$'͉>a@&!dzGp<z\:ݷiosz0FIk@a}S@@'HuhBCÂu=`W v^z+PȒF٤g3^{ss|XH c|[^;dP<-1 D11mƇI򛎯7R,,mW}߈ZCII2",ڇqFgA0{c {k16ՉSwzMΞsT.޽wѹ EAcm -}>h$Y1JkN.G]C(;5U-`qF :8kI$yB0 H VJ؇ H.0.J};vü>j!u=|h ehe B/ڔM[{,XnAÊ!66T,\ re; ߹ُXFPN<~p.@1H68-` Gop1n$Kj@FVI >-4Y X9}4F$^3B "0'*&?U:@i|@26٢~9Hy:&.YCI=Jܜffl,  +WRKHDk40!1H[stqM>=w's~µwդXA$2ݒI?Vz &X`f: J4 YX$ٯҮ)똴% Uą lRuY(||RrcrZ' ۅ9sKmEEWH.LNawg$U\-)#&} 3MaM?p E';7 MK62뷦Ἑ"jn <|C{ph,L@P)kj˶NLx'eùՁtH{rI]H.Cյ3yη/6HZTk"ʜaULi ̲ie ]jٽRZq|ɑ-b#8_ߩ*clh=v^a| ~uӸ6o qYDT[+ #Y'"/~hw녃>eQK #qp| wP-E$p٢̷0>D̷뀉.R^J:\>2aBmnLFLp੠xNaӔx)Ge\JL@M94֦ZP!&@HQ`@)Y5C}i`>H2\0,DܣfC| 98ԭU=sR`ml%CY0j!9҅XRRذVSE&3ҳtp*b1,?"`Hɼ/rA|+ B3S2P"q4.\¦~uOq 6\9j("73Sh!4.V:^ D@^_Vqdl 1 N#!4p١g{\mr/>`jqocN DAF |z˼P0,\C@.gb :Τ \edmD-wafYw`446QNDH .3͒!b X%]D2bGf`nJQs'r͗Žo=X5El؍MWaQT)Z0yd$}> Z a*S Hgƚ+p-3IhD4tסBŒ!q7R1U/iԦ=t?$eg:֚aeR"&)ʣ4I"R~gsVjU=0+r+Rz`H]Bq%2fͪԷ.u!okM1PĴI+0$xz'qx0 =ڭ,C9[b´|7tν;Gg<=5'7kMOX* hƨˀBpΜ&f)ԱNL- p  T2oGP.E>,]', } oCRTLB"6Z\o&9rK!0wMkr۸2IW6BGpC(5 iq}B= Ht\lT:r PȅQjٮ}g&ĦH_ *㘰`-:+E%DUb ujYf-7*u R3SPp)bZiOĵ&Em QrhF-[_ M +}( +"Ou_N0į!]Ь'JL?nV5MUq+u$fbi.XOsڰ.a4o⪃ЙmM]V;=HwbE5ѭf+xT*WIZ+@hת|8 bY&SqJqfr㛅Im^%N}wxGM 6XbcW\89T4C+5z:oPEY \bݻyi u,q'iNFvĚNk%VQNeCgeA6\HP<5ۅBN,xWtm+VЕA'מ"SzJh ={z ߡm^4,,++hLIuF M({|[`MAdȹnv; V=RGanfI* V^qu.7p & Fl!Q&%t.!#{5R޽]PNۏm7^S(6`EX1JFujg71bt-R-/DCf9 HCѰx dڷ7ilfKo?sA@0d@=7#a}yn6:k+ԐuGs0e8 Ko2UAgQa%&9r>X9K+.3kwzVN7KA}8t*sy=K,KW,lH 󮵖Hԣ|Ӹ@KX~JwzdOJH@@H$ '$I@gJI޲H D$!=BIH 9|3u(@AI$;ޝ K ?+(}I!S<0Fr PT;.8?/4;'K=kM ` ! 
t '59_@ByCRJ C2CВv@Aѕ Isg8wWrs|{e՟)W ;}V`{tI/Ā H cuVҭC_e~.!h:$ )7X9y͏3Ӛg䰁`RY;ge!w=o $uB/@^a6}s COHc_FL ֡搒MWUSդ!=Bxp6*{bϰs/CyGqH l' as%ReZsgjV/sf)WE8i>z~^7{Ku] VYNY_L1cZ4:M^Eݍ{HIx_춯%E:W"Θ^_F .U~*ȱ׋&|{ٿs2ODv$PƪW ŧY1bl* 17uW( \'i}OuG}TV&43Z ,Cro).}tj"W'NM)`usyhY`sҿR20'sǍVWti0]zjM?v#ۏeRKZteʡ'[& W!ⳗU[W7l"LIC]UYkE1 h0A sB>j2`{?#FڝX& XjQA\O^]4wƎCZ3[;ϨVNc wb@i>䖘jwm^Z_Թ j?[}Lי*(m [:t |HZܖMF*Ϊ¸D#J-l3e*a),zJΈ`,f(j0`<7G3+02쑘b 83}+j zn;:l*YECHAum̓l0P#;e vص~gbPmBxvŵI +)%3h$^rDp/[ VgG] #A$gfW0[ [  RhO4\M]-ĒLI44tǼ`'b?AGpIlJҶGЌw٦yS%y^w5k?ܫ;?,PHN~}wwF+!ޡ nq$x#4v\OhUZ-G_B)HW1-H BcLҽ7!Y۷Ya(Գ25RMZGUuh —9| `ľ-d動1BslE>U<ԱŲ #Z"SSU#yXAuS' V5H㪙BWki-AfXc"e=U<6>|5t3[m z1&ps s7|t4+ OF^P27wXQř)kktհPJ$,õv2"a]KЀ 1i\O  AVd r9zP҃e: {jUW/e}?H̄PW+s+11)uwx˂*>OO޵/N3pCq}S$lk(7kp S3d+P+ϗ52_娽|TO2ЅZ}0izT45qOv}IelQo԰Ӏb,Ub.NUmPN5NWTsޏrV{%w_ϤǀP>6 PTM sҧ6 U@B{E=z͘3w DiWSݟ#0:2`:-JϚ3O!N)uf dc(1%g7tqXCa'}V muѦ v = ~O*8mA832CGBW2kP@N1e9?^zߕhe蠿Aҷ2YhZ{`!{v[c6SU-"$A>N럿 KjCQ榆 M#q&k`QSRWSO"ÐWBĶ;υN +Z:B3#NՅm""#Uv\H$c햘O究DSyfh}c_͒lƏ8)d2ޠ8'ĕTJڎ"%v$'B@6X* !͗u|.QH&9?weZKɍe -%`@2<&@^]4(|QD2%Oc ĪQT~6wFϣY=G}wJSTxZc;?cF;j͟e{Gcdf{IVv\~%%^;7ޓ<ϹQ򿭋}o~I+F{O9] J_}3.'\a [/*@/yYBGkKSo+(hsOsOgz{m}e96?98÷/7} M+Ó&vSOXyŻMA$Z/E46Gq@.C-!Y}WS's[ĕ☪h9It(ﮓ *y^'9fl\:Uԙ"w6[uDvv}ҫgcTf !:Եz]4}'g޼tuЯkWWOk.5xt1*ѯ)Z˗[Bܥ{e䬶>}1vQ2z0eBjmۉ9uYy0zg?d5Lhύ'Vti6%cOyem7 tRm"".VsRBB/zh*$T?rX#tZ|BLP3!m=.,~d&y`SH}$>٫.R▸tg4bVUV\lOaAF0ɮnΧC?87x߃GmTbD'B`tX;e"=Xp3jјo^\B;E[Nv ud$ &1]ィ86&Ȍ,>؇,}M?+F@![WHH PI@b7S_5op+tI6EF>l\z3TZj]kp{|\l>ۙodzJQHz1Ps)WesnŇ8yQiIhr`[4K g'û67Svc}XkP5U4^UZֵX[!vU$!IGm̂$''C@2I&!6$m$z$?Q&ڴCdVaI8ʜA.銽5U]! 9Lذ!Aˀnڷ6h샢o_:ᓔ!Vg ,  6YĄ5BHj'`@ve*M,9bPz4^$d&|9R}b< MUi?_6~. -0?go,,M1gJS<>!&?`8'{U) s ywcqO>^H+ u j$R@+@'n~I~$!_!Dn jL"08!T!0pTrDHpϴ:K0 wcp p Pp~t6<8"\dN )|&אݨn %jg3]!ŵ#a dRT->.H`)S똊jp> w7v:nR8 VrFRɅ&Sûkn[]wXEo[{*2#YgiO|d,`@;i}7HZ.,AuţܲNvA *)\SSqqNY_{ 6IԍG+Cw 0ЬB̻4IW61#XkXZ?H Y v[]᷺mjy Kt]Btsp Cšhs) u$D]$0$DCrU rTLNb2qwq 2!-!ELNoRL/NDT)Ve  e=!7O*UU,g.`$"CBag([ؐz {*yy$0&kg@P'I|I?$>$ |0 YZǼH`k V5Ε>/z &$$;Tt=@q6!H$ dAJc|L1XO:=I1Ae!L(@7H%`E!&ڙK |fql -AUH-0 S%I#c]P\+/bx%g$oMII Յ(6ۻUxW*Z :kigx LZ@ H)¥вXT' %ذc|y"F>&on46o26Z5 kS-*S=7k^.|{؂-@4h M61LY hH:ݮ3zsuܱ嫊Q{Σd D`i LgSM$i)JQSX `obO}, AFA$ .UL "BbgcEZ9=gp@܍Fx2ޖڮ,c53X>lJ-3ݬP4[=F\&L-%Vq! 
{ rz#?ptWuk*kywr&8."m"p; lT,~܇ /gj7-a%4EjjЇ2E{ن'i 1|&!2g` -ܜl &wDg$k[ .j1㌆"a6vgjSYA2ZF~F`g[RAmk 8v) }[-H6 ^[b7k~SAlA V'- Jd\(pwCHB RI.@,H@e NSgEIl(0II FT&E֣:0VM)Y1yW&DHmns" eK Ff8"Vv(Qm0 !gP.b+}i` noˋv\bDs*BmrXfɈ#\\Ky41JQ"" #hW gB8cW*j,\MhVD  T̑Hƹso9\ϵRrϞ󱢲nWfګz[wn`o9pwf⣊sƷ6("*" "jl77MQϞfP4 7 hUca1$X1aI˅U$/ D$Qd0"kP ®ʨT"ZקQF\z{,ʪf3LPP^+ $ $.%4fXTdYn.#{QU].ި,K L)%dI@CY2v )E@^mP9˫[ܶⶍo6(mQ|U羾6DV+Im=.M8kMhZ#yյ{s&) 4IJ i [ވ657ݭEKQ lV1[!Ϲ|9 6퀄U$F2 g9\/^BȘckg.RW;˽\c]SǢu^c%[ky^-x3sSbrwM^P)(ITlLP TX SnWS}fyM/q(,){1vOʽL5>ӣAdfzbg5)c&F8ЈTS'n}"fZĀ$İ7>^s# $IAQ6d3,bXVyo ŜjBrL(f2E좔fr 9AD#h H9g1L255C$ɘhxU+UONz2(Kz%,aO2@ AB78Zឲ̈́c%5s0iFI&bFZR, fĊPh Iq!$PTjFZ)u,QfXj,3.|DE"mbB$d1JHf:f3{t9j ,V( j E Iz6jj#Ed4[ōb8ꁀ0sI (έ-bٵ"(af@dԤ#), w1Y;&l(N|&6c66S}$],\1X`rLXr:gs9Qr 7%֕!!sx(rf0Y l94XblՈt ^pdXI1  ^Z$cf lR  0AS!^Hj6lY(/OOh 2_,KӁ0fA,pL3Ʌ&x.XX"sxăTYB  nnoQMCZk 4hk6!3:C%3 } Z:ED fʼn;% si` Ffĸ|c0MQ$Ѳ #)M$i EME((3"0a(ARf,4d(+% 5#!AcAH(!b&e 46 ϛǩ9m%-}Vi0UY44P QUJBHT(L!3lfq d; -dfo$!F)%!Ac K&4R/F`j!ōQ] Alfaa^W@H]cT 4Qji*DYc5 $#6zJjd٬֎&r+UUH"Tk56u0th 2P o 7ysȚ5fT\ёY&lHdQ[2a U%Ke!HjVl͋{^]maۋ[D u_7Y;NfjZ맯;;ól aC6@t@5tX/fmͿ۟@VlG YyL6e܉ߣ#@r#`O-`GSR' ,;oxabM͆HtN\+YXO/Y߇P'5< ($8DT}7ޯ|0@?$ӆk)dpYy̦2en>Yu{EjK)RUD[q"UqF$/- cn{k"xo{ط8#tcL!if7.ߴCGFXjvjtB"UTF@Oeƿa;my1"څN?wVCMՐߘ"txȣ4!Jg1"ZZDt7#Z1"d sH@QS(dE16H0f .?];fDbi0{J|t"Uqq_t'-P VBrٖ~<4@gcI[83V/1gSk ,\P9p(.08NTTRmEqkjU40HݾrPtu,HE kH۷[xQ2INJN~ -?5q94`zq Og;B62'v>RH#qCQފʁe}X{=e b9~O%jaH^d6 ^/`݇v#VƇ>@?Pp8_=yzB$Bb 0:}[=QMHIۜNY YI^o_ρxKJ@ dčKW/(SWםk]su{)EW{P6JzPҴV4ePoaFK @8aC4>r4d~ FW18^rdzui]LTHPɚ38Z'l )NDd@EREX#EWʱ䧦z) ̒A,P`#b"ַrKaEKR,6,%PE0m+of"USqRZ9Zm1gI &cIn&s}NB9%ߟY^; ,i4PY{t"8% u $(8J7w)),ȽPrK V`3,-847FqtfVEFjMD$ZJ\gBHI5˄A1" `^xYfKU.:Ѕ(V VA LJ XB @#Q@BD RnQy@'}UP B&,FX9ͥkb2'< DCLqs9:PF-B2l%5Yko,ut e'62Cx* .ha? n}3sb-jjX !M2RmlT)6ʘ;Aj2Xnʱh @ )tdt0M@!2\NItS .G^ݳimd/j%5BSu0n)Hݴ#feچ%õ"M՜2ڂ B6F\ :! "a `sRѻl }zR)Jh!ޢ[4ZC˦֓h_4Qi)ޗa58&171VL!RHK*X҂K Eg!88\qW\PT®E[ח[e\&6-)Qmfbכ_nֽg* !$d ZT j*5yjyN|9nիU\^OVQflXT|W% bIC+ި2@E p+]^?ڞs*#mRxQ ~`@G! ;Ifjf"U3 T;=ءR !UY, P{J`-IDF.j ݅\8@QbIx,lY36]z{Q$"⭋Ò˹0% @t*ACχG68aDdbI@22q35v7{hp 0e]A]v23>rsP݅B`c,3IjCC`ˆɄMTjxF~+o#O6X ?N[IHHsfLRE.rX5Z,Y.xS`eIe49us> TƄ H,=N@(" % \%m G*ف 0BN0K"@$+!&df6B FrP!L0-d kUÁ3.fكV#d!K20[PВqL1L,`"g[1csW-R̂"-0JIF^xSӞ^4W LU@g(;Ñf%Bɓ)Sv "P"-۞3@ !-~d"Uc( BDC* 1-`9vMͤ58dt MTdl@ƙ κ|7 u,#Qj"mO< |aċfi1@c) .'21!AQ&9ɀa2aD5 sLu&c>  ((Ld]ɺ:dnNQR9z':LKe2_'0^7v4ZPR $/Tߤ2FXŤ($~yh"Y; !e/.5g@ HOgEnN#G(=pĝr!zWF $nXE[o`Sg b8E~f9OۧI7_RRՀwHgB75hy4 Cس0n*I!< d^ A2fLjحEŮ 62XڋZLch% jA:H@.MHaRîzjMk{ꭖ4gc`j9f@֔y4LIp@H=X0 "7dZB!vZN TYer)90$ p ̶]=дd`'@C- ]-Pܼ\I +aPu|>˖ހR\!gE 2(`E("Q.:$(v/8qhRĠYU^B˽ұ*} 1aTNWf8%WuZ*CE,@ ;ߋ:X[}dze2j[s q6s°N G/5T 'RO8S_y@Sp:N'`&@`CjYTI! HBod?Su8yUaf_ SQAn6[z%WEJ4Ho՞D*j@-9tYrs٥s7&X,5%˚Z@WRS)QeJs+IF=0`Sa0n8Frܡ 3UIaC 2f]2̀8 {污ɤae3Zf@!F=G]bbcKՍ{&|Ew/XK2Hֳr,nJaF{Q&uO%M!.A *`:Π [-Ybd[eRk\4ZR4aG ,^d),UBH`vRI(fpB^:צ Խ|Ɲ9l 4DjE &mPs0gfe542 𐁘@`ʪ0Κ]w; Mw'K ;L*t$* 4hb7KdΓChn8?Z3z%Hf#TА!s&K RC` ((6p]UTC;*! Vx_Z" 3E\ ھkXX%cfTf4ALpe35vgEv BR:Lupz7,克lGB倹$1d+=j絤fb2d a&(*pjz0HdgSTNנr4,a-2Fj9!B$H(oD\AZ]5](k!94Z]R9 jaw $D@$,IMmlu/1ʅ6 a {U. Y&.] 
r;383])2a3EHņK!& Vt ȡՙ$]K#|ot0qʍ 1e$iXHq[ D9nP5k5fe dZ^ELd0nլP"Dk$iatϧ+E8עbC;d:B)5IfYj$t^&`5D&2BM*ک5h$'0٭]ukevkIrbg9ٲ\'ynCMFs\9Snp[ڸ+FKfx UL"H7)ˁ5@,𭿅ů݅=V*JE`<^YV-bR& f9&DZw;Nedܲ2Bw̒z ݳ Ƥ]=O1zNćE"bA>qh@NQJI54ȤIalk{NR^ȝ.84ɲhU yk~L˵Zb}6q,]clݘ{ORAz03WK"vpB աƍ{dM#HQ~ ڣp Ǵt0fKn/ v-I8`@Jv;eE[9ZͣjLۻ7v¿;8n1 e]ՐMڇg [hþ(Aglq#J6BvEgSsjlLO &bg-VqHMe 85c@y"0}X歗' s iP{ idB,= 3>i"ܕQ0BM/xc\"X,?+`CNr\ \D1L5EC{;bԅ[fe"G?TFEP 8"$=ΤEP꽷Z/nĝG ;Ce&Yз[jq-L0C2 =p9YE;J/Sqi SB׀UlC0dN}ԘX/ VYrgdM~!9!.Hp퀧'0rUGPwg-role,3vYMrM;1!";'.adK%f5[2eT 9֩̚ dO̪ H`0֨7*_iͅӷst{wՓAרSQ@#&{AL$)BZS]rakt!X̮3՝w%lj^ [S,2D=ox:R$ȐPQ"@FZ80~( :k/gfG!3zs37xTQD;]@!tBlaCH`I=oŽ`HF~5؎xTIbEvR|Kå$0<`R}f<9`QATBa#8r $t kQ!ebHw8mupUX| -ֱ ќV$I8 IN,!% W-`Ss$&)p:i"ċ_UG[OЬȉŞ=)΅s?+g-wuT2d(}JbdPPx' T;1\M8= t<<[\HvɈT=!g 1Ҵͮ[$PK`-])@Չ*/! JȣhMŊbfX{+)5 DYg ^RPwO'V$Rj^SZeP&SwP'^ozss o^#'wFvqҕқ;>IڵaG)y{ gAkR6r nm1a:Qaz9羦 ovP-?C3ròI Or=>0z@zZH@BzY?uGW_y9Ѭ:%36 A;h|M,&:\e&Z"$ ̍0Tk3 iK% 8p|0>|[|̦w`wqq6;FLF I !W몟]6>aE3x<3.۶^ER/1Ue*0d$s R9m煗ma_=!iJul g5T>'I!H}$ xdxɿ=j^rsP+d}S#%ͦK tvEsH )(/Aqd䡽fzQEyZ7ƒ@Ǔ6ꢳtQdlI#!*kڀh 6~ d!'foĠ ʰnK5Bd?X%(; xL5^p 9AB[/83(m-PݙI ,r {O@@MkFv˄҈ P\7po}?k̊<iF"d7zվk9~j]tKwO6ݟ?ǬE|>Zj65KW$GkgPצ>7_F\U_mEY#biv]on0ֹ(IXл)ع(Xt1W{V2\H4r$'Ŝ\Yj,cc66]tس aN5}R}rұ'.tzST?]ӏw6q0FSy⭏j_܀WBCp=l0zPٺklqwg^mDI]k6҄g'!$ poGLFM'K(H/?ʧi!X~a6MTAah6=.yxD- 3Tz#yL}vm}Z&ڢ:}u[jڒڤ}z׈WkQj$)!>4u{=:y%2֐Oj($]w_Y1>U^5Wc?CHIq99x2<7["~n(@^R6OB$2TSD9}*n+y:#܍BË$Xc"nNIngh B!'f,ĄA4Q<|0]`sNڕI1,Ηap$,gf@1c~4Jp,!]pPE@h栀R~mnO2- (%s-]ե~t:#P=a!EVH3V/Zѵvz0dž{JՏRƲgݠBrBJA&l)t,9=gt6sP:S AE= uDa T c`cս4a%"BZjqzN! 46T-Galz qv|Qs[}~E_k?Op,o59^2IFCj=n(]11.|)Z͔Ʃ?ȉwO)(!+V e Ep&];ݝVc4n68Unwz{Y!f,"ɷRd(C~ɒ)q5rܾڛ85@ESom?7?p\w&16U&X$ܶƽxUlV9qH3y02[SPQTPĤ$᾵lu4`% ֡,h$6E4oX3"SLjy1=S3`ķ6Q }ll 5Mᾬ:.ߪ}ħKݗȀ(}(98Qn~'| $ū8Mlopl1[fI|%sA]d*8u(㞯Ƣ5OA0:9<Ǵ1p3N&u" #V+ 7 V{tZfv\q'@z>YYB v|7`նy\0$\$܄@zh[Q6ɳ!?dAk^f|6`~8o}}drJO*\qMҫRIP]rȈsIT/fj YQ*X} "A E7;N4Qa Z/ 6µ-b{X(MWSH`_SBd/ D!Hm awy PBKtGQGkJzy;uJw+ ӕIVT}ع`;c47%:5(Ј8h@F ڒp13cw:@b@ v|*ނ=ʅtRX|eҠ@nTk6]cv;U^y|%v$$K)3a,+b3&ptYmZ{ͻm..@{wγbW+z(/HcJT.A˲u쪾 *kZ5jv&WW^wTq`.mucT[A뿪pXLR-ĄV<ØszC,֝/;lzY\$%[I᥃1Yu{#c7j"gg*l?gڃ;ڹ,Aխ\ҁY/}7;N*xw;ݖDzAj ﻟe f>yn'+y˪AϤXz5;F&ϡLNm\ wM@叄YqUpXcЫ%3[u5u 8d-'bORUG[\fWuq?J?u$iyb =XX&h{jFG}ϳ_@&Ng]坰0ҖĔՇBIkQ5P-VZ_ԺWw;#'tk- yئso@ֿGY~%D_>B !1 ,g)"d񪲚O;`[Ke)ns;8 xE{Z33}IkC]9>^yeg IU45lՉo=+ Ky3_^?cGVJ,X6F. f2'\aB-h Rd &VZYoFejy-WmjnLlDaR7Zl+cn?O`e;.fJ>czpIeW]vbD`,kz~QfzNAb;vgHe'Ew n~xt9s-wQ_u>qBp0?lfB*}o4b\ʎeFWwj~-EU) *BVҸYU7m6-nxTis澏/̖ݤh_=KJ4vsgeFIFҍ1gjOU"f, EML_VecS?I15SpR AC!M&ȏ@`,w7oӳ#־M&z_VN ޷sNV#ܢVf_jy< &5TWaqf.23#Ql<*K31cV1_!9'%Y"9j$/>qCOVkBN}-¡(;c@);n{A;!DR. ݆ wċw<8AZBi(t˪K mM"CmUקoWD,0 U+ny;_Fzs9 T0L7YۆʱŢWV]q]gkX0*;D7BI⚶IKIqSu;, M1Ðp<ӺpLoY*pC{οeضm"xgq! ʝ0d[PcC}(2j硖3uH, 'P\%.ۑ&F4!bC¦>[uH)B$$b!B Sqn!Q$ʀkwT7ʞ{Ht n@ c2I%Bl;çt"e8Q;^8Tudrر*F丫9JTLhB~'#9s5Yq:Wu erqjv筟o*ÌIt9tMkBŴt>}v{}En;uVn^P7,\v4Tjn{t@1ʒIE>t:س@oPA?UC| 42kT>9yoܷry& 3c{ak2n1U_7UTU%Ox`sw rӶf#&P`|ahQ L KuuC ^mq6dj=tf誆NmQ;[jUɽI0pC$42f= ,о6[L3yI„Z50OܲAY$1Ғ^Z.IxmIV$ aRBjHO! 
4 _G'?u?hw$?)>åKouweܻNAsK)ĢFTF8^]QT*6iH)HD99ˢ.=gi{2,HvO D =BHzG腐Z$-|Ts[=~|;=:tl6!o;K0E Tͺlmayj5Y+V@5궶[bع%JH4N%(,`̒vo3owpދώ`T 'lӹa 0^⺄X|*nZY) z\,*nl%&V+.ܣvI E!fǭͅ&OIG7:W|SM1n- F!FZ=r]=kmB0G1@䋄 GՍ,"E'b}Zlx86%ӫ_]vעXEɈl@,Afma.H ڤM> ZZARWȇK@$d0r@}IzG}uekӢb5TY/W涖iCtϕ2B zo1,Ĺz3k%R5k5P$t='tZy>O0_SN}WB7ch0qt2j#s%; aTF5Q;Swkw[Ks][@`׵UɣCG @HN9)$xIbɋ~@ψ _U>O#MLw{!ΏS{/ 7Ͱ/:+C]HA].${u !Ecl>.F~}2 9~N\tOM/% ɕ( g_ԌvegwfNb_O92->r!nܤJ_@v$BSpx!ˍD0ddž:kP'볬d !]yǣز?셪R@l4yN=w]t-_7L,f\xNچS|#"^d$0& ^^+hlRmdxڴA "v(ȾV9`BT{rʻ(K)8-7/Y~_ɝV߶Q} 8'wG%D?POc߬򐁮'$BH~ > Ф d!$, }e>?8 !7ۓϡUQ Rt/(85za4i;.g;t4oՎ=RVkr%yʖǃ&,$ۚD:2neKHFDp`gȨL2AbϊؔpPZl|ŭBaywra,,,J׹H,# 4jT)*5rͮ8H!+웘W aS!r ۚ⒙"h$^\ȴQbIL+hfZDS0lk-RRjLPB2ʮAdޤjQ6f)5sH!cc?_/|6 MBI;ZBxxTo?X[PB+*W#RH ,T*a aVRE0es³9 M*@OF%$r!Kd+el(9 T.,uh)= u1dpS 3ޯ 3qy pH8!?^ m%SU=`N]5g~Q$]Ezs96aERUIEg S;!3'kao~'7E:}aBi(2_Q}< 9)\qk Y(T%HԤbg $9R| %{┄ܨKzV??=5unp9ĐPM@;@83꧰[QU *,.I=s߶<-ͽTwl{C߸ H!^0ސ] {n J`+(X^өI9}v[АL>(u! Ӳ !FI>?) {^ʕ訷LL/F_I  35`Y5>ne =TW# 䅈sT̐'+`kJl'}Hm yY|CHl?$yX W$MQb{oMa[ӸNr ,s,6*Lq`$?~ BesFoA?9pmzCt >`;D{NOM_ǤGA*]J5R~*^%psۋ;$|=^%EboLNC(X1EŪIBHf"}rG2<\R zJ4&!.FʲFNJ+mѐD+&8G4t֐o7tn>Y?xSijU~$z4<*g|"gFymah[2Ȩ Л5Ebl/%X2 ϙ`里cXڸ}CIOVɒ  ;xte]5*D/"zJE#O%Oͫ5u".}~2=NNΦ}U:D0}P??BD|n{'1Vg`Oao(=ǡ=IZ[omn LIY,B :W~ˆnV֢axpGXt)\Q-N; ,r?|P'fQ3W\BVϐkӏvL%#p+z7;rVܾG⺍ ~}oۙm~"|7PF45b9ƛ{ qI3'MPlGZ:hEN꨺~|d_jKXVaNQaɒ[{gG/zd3$Bbfa~[J?ݎHNV$F}ii O0d~AR,TnsĠԣ9|%5)4蹸L&q %C d}qf̲5U  yß1  Ժ2THΠ@p&C `YeRYd n($4UoI&@҄e ňi! w 2 wY;0 iy\sfT1@x'>&M@Jkl"b*A1!I aNYt+BKZ$""V@;'`eg@hIL;%8$gdw"gV_/KA"H)+GD9%' i`$#/[./VNg79 Ҙ8W% )?elQWmmJ|TƟ5U#b A~oD"@;_.F?wP"XaaZ-I$2F7er7j [6h 6*b <].tsV@c(ĂjƅoS[D0M^[% cSTܨPyVAJlUE 4&UB6eӤI I!"(H< )+E2bjL*Xkec:aP1J mi\&:ؑ*FE5iH#^W5F7;,xRq,Oyos qJc0|RDbU_ 3CSZ!%,oy c?#nRe:ŲcYo K󱫿Q,awzWJBU{w눂LyiEyP59Ȯ#mX|u9ewȨ?Z۞/l?vEn@;^UZu=F>UD?`QttmpV[WNO\]tw.ag3:+i-_j(/y{N.Eg٧ igĚz8p@4E7L{;JClLtWս76upR+ޮ S^)BO`v/]`Qˮn{_ uIŘH+˗|lɶ0#f/7oW+5V[-MZ;i=H0RpM[>0a𩃲?< \6[z͒Oj'UOdijk)= frV~H!*EZsF/,([SW t%+!bLzu{sT#rI+ZBa^VQ[ ە\bU?i!lBϗ_7 ! fVOE'GKxZydstd8r~'9x4 z2~g5M vih ڊ O,lZ,hR٣Pu%ST[DH up֚B.ڸ;k n}&,(j+iݑ5B{Bt]"tmTM=928ݱw7=f[ͪyN=E/:x;|ϑwdd!oۭ6x ?S];l;~_ǵos"g_7 8Q Ve<25_oCԗқ_8^L Pv%^lhFtPT7{3VMo%xOkgw\ȖΪyK# Q0ΏFjntGu`Ri{I|^Ŵ|xx?"r=weQ[eGۦua'ײɤ|Y5͡_┽gUTy>SPGʹ}!]MJN$MkjJYn~@`B<ɓ FjR\lvjq㵨;_zɓqxBzĶoXq^ iUM 8VC^C\t 3^8^Ѝh|L|vуߙM;JFSpKJŽTh>||^OM7>{F|]: ΓG/NW pRA\+$h v(< @ad) Y&̨]X0Bv1{L p:hp9K@C]Q 6Mn]Ϧ)C[qd8Pͅ _!m9;䒢b ']~sPg|ul=\ڟ8Q=Ay붸E'ѬMzi\iT']d3A$)La!|,ajdB HJW3X2) ['MwMGwDWN{I1$-`]d+! /$PC͵')SɈ0$1.&gEF1I*II J_֪}?/Y69l W ShCo=u!TeW?xS੬6vo/~z?l>u_wS4K<}?',gV vzkvH.} .J8 T܊#xPlzsY}j&W}sV1iߢwn}ЬY DYACUTÕ?6&TĎݡ o1G?>zZ#H$ B0H,I:I3eXruNҫHZ%Іm4EbciCfsiaɓh`G땮>o尟+@~բ!y:광 f@`,!j0AQ%](1Qfi?8⨑1JaXԖ(43͍EfMGj盪37MhalK5.Sb;2D@ꄸشD6(0mawy]l++rB¬e-VJcdH#$=ZI=^?^B|!5Pa |I hwxcY^?sC?+  J P] Ɗh+ZYkZPլBPq]UUSMT+/yLL0UT)`LE N'><! 
gJKy3 |̡RO(%?;ԐݍHH|OD$!; >"r6p:MPW=JK"34;:3v{YKv o +Kl( DQR mM~B!ԆR M;fiqooe8?x?fTذAHGR=>PF0DCB"!HHͭBRL.4tRFc`, kvnAj%nRąL.t8t5X~ 5tg[I@y@IuT'} 0Ὡ:@[kcx@0EcbwD[ ]cU+1Й8?H 3"7A%HWmgγE~Paʃq #or"RnK^L7y}wG`; k`?{mNmֳ͟`Gt!p{C[oSϸ;'$5xa koSIC?q>^1hKpXtnJądNGzvly>2r-K wB_?IGjVP}kjjU#PwcJZ尐h: mN{zS_bH!Ծ4'RM1~Zy.)JB?j) kFCs&L0R֫(j] }嘳^3k mBbHҘ eG|,6pn^ljBJ1`Bn noi )o44 &T:drD6m<'L4ȜK&KE m:|WŨFBY h؁A6eFp7{L8dѻnFF@JLE -PǻRJEJ^>X2'/^WI"p[XTaΤusLIv }5VγZKVZmXU݄ XRH00@'r!L;$id Ϲai ˌP3Kh@ҥ/HE67 W̻$_K̒nhX/٢L##7TT2VLYDB3A^T+P{𓩺 ^\DԆeM]Z'wY m-*;䢯vxf ;w6{|<@`(^IlwvşVIa7/"q,!*\Az=&*RkZYAq H\˅@!URX`fڒ>m?!a>OPr'YC@Yپ*|t8If' ^cjЛi =U$=;=O]2s7G"\>I "趰 F(rh!ұP^S:N̳Xʧ7qRDE˹$jDllMMNIy l֮An&?mؠ7<>nH 0C@N `mIXBm0ϋ6;@%s7AWKj}8ab(m DoRkkN-+vbOI FtLpW-$o:V|5Ek{`nēR =m'Gq {b.gՖpo#W#CyMgqPxO>NId4" "kB=<ɮX#sgh<%=tkxxL,jNGH9YЗ9>RZoAJPzm?\G U7٩39’wlV0" .Aެ  A\Z|kW n7?q}+Ty kb q\AziOFygr _,X>s:N-UQV}=m~"Dо 9~N=]W\ܳyK[l$ѫ6հ$ O|~^ly>E[4a?X?nJؿMd W+z{'Gڰ0gcf! c&Z`Z\<]t*^!2`BD?*D 0r!eBD2xO%ila&Y&$E;FŠ%Jhv(!\!'7(% (LU;̦!@E$T=Ȓґ )hVI? $/umcyf,9`(j7i#`JM$!]@gpؽKKئ2, *mpE.J R%(J2B8D"d y XY! TI&T41X]̪% JyLUDT\CD (DJs(A&oAOZA8$"Au}@$P!˻!gwd! p(>Dn;F)Ji*(Q2O)NP`aX(ipjDC<ä!Ad:0_sh@1@  ܻѪ᭛ˠ+K(,. Tʪ qEZ\ZN8Kz*iʃN*&32EHa~'dԢI!j}v~w=4Pkҁ -ąI.ԏeYEVK=P駲@ Ք+MQ`wMmTSzĘyjʤD(%X1KH5 M:U&n"p !8E&u3Wsw 6$d6Vk~Y"q}:>:䁶OI'>b~ơ7Nِڰ!u=w= 8" /JEZòK+ukp:1s$ bГFY yMU/PrOGr|DɜZĺݼӉçp,E^2i& NA@Qr$߃ſmǬ;}/V8()i|ob@=9tޠ| n쐓1-$S~&&Vfꪡ'~t^ ~>Fcvk|`f(OP낫90C;()! _"A[< cJ`W!D I@?N@D8$ $BYak1,h.TlS/9-͵O< {-wEE")>܎7f硰; פ4uN:@8j5 ej4ߥRO[;^WX'ߣ[!Ŷ6^>g4HV(C7ԭסI JJXRML$I0`k5`C>hXʩa*J_!ff[EPY-b5jIGrkwo>P tVAH47BcUlj L[ k`[Mt #dQʭjX1X ,VHT.{.q#˦8$],DtTX*-ڙ[=_UF%BRaK ěB,j. I&\l@@/ltd %st$ )aQN35C d=CmjRMSBP ]t I(D,e.Y zjD,2[S}n[-zbg׎c HH\6[#*; sCyŶ1<77H$n'&Aro0װx. %EG)UG<]d]tԨݤQ9EcfB5\"%@H фyL-5UTS>oۇkKٵTJHxl N_DԁJZ@8 'ܰ0G]hf"K~tjQ32uìD3Rl ,ךL~0ʹF:c@nm]:,AN9aM*{}Lc{3R TSWE5Š*M0qf8ѝ1l" -5݂X߰ڸK`V&f$"s`\ -#BI !̯U|h ]2@YzېE_/4!rZ)Jxw%͑R-.P0X =޽Wwoz\;6''Pk^ȒJkS|:4낟v'F,3Y4lq|׌qQ [w읏W0zӂOنϱ*Ô^m{0{楁ΥԸ%3w1ٓ;(t"Uu|:4s5uV}jAEpnxE&JIcs@?ݿ}g|n<Ia;r1[ߚ"&Dͳ"al@&:B%O)@~ +x=wf*5E"@@aAX xDXT5oR kI&rw:uwYmup5&YlzlBwԞlM Нz߳jìbj?MLq{ 40<-ƪOgsyE˗by@3 ,_Z~/}ѪznRC?#u="jzi5]VU棴 +w=k~ta+ox1޵J[n4nvTRX_|7iE2UOQ8=+ƝzkT*rЧq*5al5\c b|G>vv gqU,a5y/(SkΥLԧC[zvIqoƈ o}S5~GCR45c}|낦ws@ ,VR ]t(AHWʚQrY>)*a! uF WϨ \rkx%bDDIxXi\}KEG] ^D!R+?1+Y? 6ҙWv,̿_ڞDIoxPh>h.EAs٣ +n'=-ka-,g'Mzz?*足L*{U>;34Re*5ؕJc'JcyP4.%aKjAw.}?}Gyu K+j4b-u%n]7u.?ōR]g˰le:$:O̟u>֞ٙ-)hMh'mԇ<$ >gC`~! Iؐ%d $mP~cx׃~ =$O2BH-,&>GaTQj 5RAgڠZL.$kT)(*Ze3=hQ*U$,m"KdlܵR `@ VbO߫~}l3WC?ׇ'jk0:y*H}Wo㴁~r& i:R?DèvCwܮf;:Πˤ4'Z_pX1K<:)nPpdɨ:S;єL<9u Ð)$) T" w H&?ԔE'DW =ܮ9̉d/j-%z|,(R@5b4؄7e KM aZoW$PpK w 4A1_@& F]QP!LH 0 e #TZE2!bU@0tLZdʦ BDAAQ8R%b2\0D@q!DaJQE) FF8DfDIPR`B:Hb $6(*փ ?d±{OYSwøBbGl \֣QYs4; O}BCOBg٣`Фa=@5pcsMRlt:vI~8F fR" {w0" =( H*R+-W a&i &YUQ$YT,\$IĈayk ; S-`B;2SZAsxTR ; +U%$`d)1 Ohgpwx%g` 8T4 Dgâd<ÀAsw $gDBa%Ca(yvAà IDɏד&]m$ v"%sV٧c5+H<؋ @.۝|ѨQ1/5B+k \t*}@|\p*q r&rtMdLX^zv, 屈lD+o6B Jqwz-f22%"]u?,hX@!ZNga}t2D5EHVHii/ ! HY-z/f&,bx2dˌfBzo@=/u7D>{!Syϩ B}^kkK>Wȓ=HGg\p? 
@`y3eb䚵D [|p{/W9wąM20 饛5\c%?(T[TIe׏/TrvkB/tq< @J)HW{&3յIw^~wR_ p@@()mH)gRC\((E(`MeB1eTY!tQB-eꗴ/YzZ4#ri_xiZ,) vZz97di*C#R[V׫P7 s-aǑ֙>wyh16f6vlr:,ZS%43V[ F9Z# OItEs^1J1go0T'V~a'`]ObiG*k0^Qofog9$08OvɃFH2$$E/zQ:aSծ0e\$7uU`#7&ǷR\>m_iV&r#X%JW$Poq;==CI&ޚ '> ktABy)Q.œyfvGZWMtp>9¥hu2N3~Zikm79L sWUp̝K|H&b9 c[)~WHLDRPQעHN"_G d*u S%;gCslTd 8lg8&CttL`*MJ-=[p-9{e"ЖChHf?I9pT7G3G~F C7̽D༜hmgK*9Zf.UUΙ~5<&ZT^{IiaMÑgͅ% VG1=_oϒ ơ&-Ȓw{(`5*GWl;]<6}6annm6ؘܸ>Rw_{{;G~=Ne:h!XTX:Υ;Slloϴmhl7%~G2gR+q*JM*&y)i/R.*JBbF(8jP 53Lhkv#f֯>pHx2C,Y=L6 E]M^w4yxOKّ1 ]vTPW'eM;-\ŗ=:$>\꫷'炅YN_.$&:NDDq#hFa~OG}p7)PzR归/ٳOJ n!0P=[kJٿ:\3ͮ<6u^p9JR>R-^J25P)v2=ŷ`2w:<*ɻ̾ࡎ35#XN96ajx@5'e L-AFwME٢ EcFFDޔ/[g\󣅈Q(Sp0_kcJl+98>Grw~n?iWxݳ-Zڴǣ Xr,F[ZW\VmL΂NF.(&V,kws Dkȁ7o# >21@W^9B2j _[_Y,ku1iL`]yvWGr.ÓC[4ngDC@.7i,e9u!1L.Y<).sqr[Bؕ">E zk?:_^8qWM-@%kOwn.Էr6k.^{b?I80oLǙE% Adž,;i)' yUŜ帟VG+ 2 0ZL R%̡zCK~޲v}5^!7k vQ !$aCtUt㋈{'syOKd^?V9p[ :IZ.nȢp6Bљi+D_psqKtՊm5ճjŴ>,ǙnvI'3*g o bd\eaP ◉1c/x'ǚ>a61?P"C@6P!o'F͎''oCvq8gDfyVڞ<Գ|sU]"2cG{#) JD(E[4z(ԋ+3XBz7dZ4q2o?=bBR_KOzװ;mHƽVԸe%ve OLVvwF1Y?_b_A0ul,_y 7jy M|/-x!@Z ϩ( =րgME4_q6K+!# PEM9쯀?_a%ȡ|YC7zq|)cc;02P0!)+TlCwQT(C0{NѿX,^4~q]Xg~|tP;pz!mװhfD=MidkCεx黑 ٸm`"$ DiY[Q0pu$#<ӟx*m){+1X7h˹^+듃I 6[q &KA+eK2oG ˷FjVS!+%]ťsVR My6Qۈa]V;Ukcσ&,~ 6=j]YKê]uڗKegX/6roj l% heX[0#ט/!gr ɻ+4CR1SEnܒ"Ws'+vיxpw$A1]HCR?WΣxՖ" ֑02RtǷ_ j.VFDrhZ7|k}UKRFq*/G@RvJ[X9]8a1Z9q9NC Y#U?*=^0„X!hӶ5a@x^M@,-!@a%@![T~V@o U۔ A_%;d Oc|O=fm.eRC/$0"s)4ȾoqgϠ[Q짻Uqum>G3[wN??7ԚE Ml% lN괭綴W(~tKY:R`ndƹ103xk9nwWQzUtԚRH zμ'Hmky޾ϜŒ6ܕ֙Q-|vNK=DܦX>ںM5ƶ~RlXgu<_Wo7}w~UǂJ 9:h& -.OoIw'yݍaM[*QL/"'Oy֢ 7|}{>Nk80ިG4|.wdH'hߏ[;F7ppz=J5S~tZ|[zК q"򁦇w%ys[Gqdl_8 ^ Iҫ=[&*@D-m*06.)}$9xJW :uha*o0d 6BB cOZϚx,.=kl$ۍFȰ?%rS>Z?̵gjdXYxQ{;d4v2C_9NiiO|gH='땿-:洀pP&2cV2@~6K[xKGت~d\EڡHL.E/ޞK6dzf0 kD0)5IP}OQ a)}\3ދ0/MX7:N9 I/N-7:oש˜XϽkGizA߁@-Nn>ߑ^)u+۩ʄ3Iݠb׏w:Db&rRg842nUGԠj9bzyّ601IyYǷ{wއgć08:Œ"/(NvB"vJ\y^1RUK,j/_# }cCZB:HJڴw@!ѭ_LP@`2D í@FpeH 61NP^H@%xRu?`!U3_CȭUpFW D,gϗ yuE] ɶͩϙ+J`tl<:WmmhVWxsX~mM-F AENeӞU΃ O~ٌ̛'`$ԧ\c+i+q 8| tE%z> ^ΰǸ8'"v6{mY7p@!0%$ CR=CEߖjeJBXA)tmzxhb٬]pv9voC68OeԱP9?SQCOz㓳Ĭe>bk,Ox {*>0^ WÕe>^u%yW@(vyY>afRag՝:,}3gib|4}X/fw.ڃ3Oim;R \Fcuֱ[]`klN`,%In1F ? sd)T=!K d>Pt $"_Uj{boKX]q2\Oɋ9ӲkèmS4x]v/9$E_m]ɣ_Zo?cᆮ]En6U;{+Ff1ixr[jùOK-pZ4+6PCHB+ہ>,t6A&SQmX*-׏`~ѽniOQ/~oz'72Bge"ht"ni$n(n"fJd ةjJ]#&bQb$- Yp6jrRuHZM|RapYsvCUTᄯwsL>]Q#dPq$i:~wj/08x%$8[Op['_Ws2W5:UƝ@>Ie6w/)V6X {(4Y02%$1{en>%W&?M}e2jÝBQħ3c9ivB>AdZ$K Vv@On?f𽥉> 8pxVN_Y vHyZ6D*f2O;:s^y7;ڟ?1Q| 1_A TR4֟nmEMДI?B(g.j!I ʄ&t2]B@/D2!KE̹&`C.&!q1#EŊZK]Ļm ީ0K9\@H SWScSf4d̤&46\ijAL+CҴ\%%^0 :  tA@D Mu/n- }(KI!&9Bn HLe`뫒H:4ὐQ sR3xxG:eh{C:b@K7ỿ`6pw*-Dkr+} } HXqh3LQ_{i ED3Z4mp=ˉ-uZdW/9bɭĊV9/Bi,%+XT4kunb˚*4IjTWX.-ՀȲZRHPgU*SC O`]`J6Zծm\GJH"UHI#AIx@EBHX+Xª?_ f;4qW<@ vy_vZ,2P0(TJ@.7%dOff6% j_ Z"HMRn߽=Y<O?s MVuo)?zpfkWgs')f]Oߊۢ-֠yw^ڀI?"$lFIt?oTVFBs|Ɵrs  5E3]-:Qrjѵ9O~t'a3v5q RC|Z ~V}y{Ÿ1L~ҷFC WNQ%."Sxuf맬w ) |H'0iEgItp+O@؎Da'1@t$$ܤ Eٽ7E_|^_-C(!u ैp#Ժ#2!LI T:iIۜ-!(TtP؁sQߦGԀf6ޏGd$?ޱI|~W1yE'Bz]; s暕fxvH|0$!:$I'MПdgW-߬sk=3k6(=u E PH?U@!8̐a'N=POHաd=SP';XOCaѠy: By] aII eՇJ 4?ԅ?d|?QQP8 W?"孕tE})R{GGUΔytUKXB[U.Xxo[RbWBnE #y{&??temw-yoӯYHrE Q(w.^WH2MȾm639yEAU~u½ͨMwDz 4W^[E{gd[; bΫξwpQIr ~{{{Ry=>/H5ݗ~;%SD̗*-~E_#X̄>9$OO`DlT[īQ/a{o~~fޓZXj UUÙwKyn8Ô;p5(O 5Y#•|2\];{ԃ9e $\4U*j$r@I$~$(i?Cmk-㕹ؗٛ6m_wgT*xBRDaN]!ܧw<1BԨZijuŢB`4h iRxdSL_#gsXBRbkS %N $(4bЩ|.uyxNB|:_̟fIT% V0T%2aqY d?mH*HiVi$&Ykw /܉^5Mx A "WVZ&D(ȓR߀=Mw[%k.*Ĭ79ߚbD&| 5,Egshmc [97uXccJʑBMc"raPv8 Hxww6NSn[)}j60'8Ō@&_6O:LJl۳{5؏*}5 yyTzBOHBDAEjD45U~cP_ X:v#J.rxH'Z7T Rćk1/CjnݷٶfMRsn\'>Fe_u!Dž7gwn5vt007xotZp\}HVC^|:8asER5m4[4XȉUn>+fOAsm|9tt`7v7`1`1܇8TӼmNKGuV-Id1;T8QAӍ?%ݯ{yFvK/΄*Vv% Ejbr(nNj0:3OW|ˊkCaTDNU[/w{EhPoGk,[\z6R?쵴خj*_U^YoG[! 
5Uy[-LژCT2|\T~m~-ݖіZA^m6$%cڐflU vФ..ovj%tޭG[{9jB9uF`u3X\Xq;?(YOS!{'XkWsNys A9 Gc>;ڙϹ3i5`gVK:!Y"kB2q1K';0zZҏ0{cȩ)a~]4xHj1Gc|> q"6P|7 l ,~'"Rnσikkztɵ(QGJEĥ^ʔ*b3zu(r#2UÈ6SfT ~b;@Q>ҳҵ2xw0 <֦|5)77w"gWbH?pUA z v?khDHMsi0g(J殤jzuwS7~ۏeޓ=]7#˿NJ4 K3ġ5wc}O)?e;mF.ḟnJب+?]bE8\65#'JdvR!"؇{Ƙ}{}jX0}oҡj&wE a]櫘JKj9IJ{ ٝ :(IB\Ѳ(H"e]VsNZ˃m)$91I @a P1erK9heIgy,@D\'F9*ϸlI6.֍$:rR6!$H I6 !bE`J ʩoM0{9)ܼd̋jBrgNf :r%qT/ xrJ۲-mUnx3n;L ኦ9=_*]q [2|(bķ-QrSVaJL AgT2I0 O8鄺xS9)-΅TZK0xc':. (Gdo]/&>j3.M '2%T33 )7ߗ~wus9]w^a784@dh(Wy<|Y[(,Z\ƛǞuur]vT, 4+ qFIyE: 8DZIf  FK*J@2CTKXHU Z$X(\7hfxh+^㗑T0.k Q M %NQ{ۄIIU*"",QH.UXbCCk[_-&N{a v-j RPi$)JƮ>EբmkV[x$ћӅ."r,P 4RꆤP~ != B| 'b$9U!7 PR 'UVBYK{% 5)f-rP9ӵ$ b /=Zp

#S쫫x{c_#c)'OƏS7+u9]yZ+T#V2Е}zP]<^K$/ɨ%l@0^?w*›zngCR,jR]Jn杂@*cP(f)KO Kg]i(R4lB+zy \ڭvB?kKOb :З6Re;\] BkܦFKy6> +24f$ T3/)IKIfK@kZdK0D\5pdyPi$D)qUW%|e,%d)$0dHSF _V R펩&@L{j|FCk*s}I8i7[px[Q-Q*[Beެ)@e k]ݬ \Ba NyeiE ɃfD5x7R˳`A3'gb:o_1 q{DGl͈kbAm:MQB,U.QC-7h[Y /O\* @֓I%,],Kj,r"ƈI`%GN Aiosd{U Xvg,Zڌb`T pe "&rjذm[Zf9qj(^gUD2XH@ (XF,U"Uqʤ8@%e8v7/$Z D7z%ڵ^u=7|*ľ1e,b PvCt)z!{[mލa*3{۝'7O4a\/$yͭM3F|k'>@&y @>Ҷ 30 ˕Wnr)H["bƪm*,m %Ձblq!|N8&HhukstGeMT9[n^:TU*tÙqދAt3C}D"B5z8C^= c|c6E*5 F}bs`WX=asMZj9kn=SHeGCeٻŽ'ΦqHH,Rkr-԰XXa53;z=w71UjXkTj;.Œ ^ϳ`֨d MΔ?*(ޏTBrK!4%aw] %؞+JcX; [A!)9#?0 VO.Z ܿ+h[r&uZ}z{ƍ{а,@ S@g,Ma_tH_ˈ$IF%"/wu^y/ۧgI"W-~ ,kZ<%UWEbF* BIHR,&E[lhRDk嗾l^|S ?x`:@Yؓ Q1ҩQ-WwDҏ.h6=X!9L"a$m@U(,:xFH9條3 bjM04Rj|5NoSo3.>Γs]]LKM|Z] 1Z^wK.h'Xխ%)90j\ֹmO5_qy 7 7[sHfM)LҿZfrCf _8UETU1cko| lLok}"KKd,R,RDNAo]_, f 6ZiS; Nn %CRmnN tA;3nx3>`! ԙZHj\4#I+ĠCe&ʋx㆓w$sq-󫗓GWz])ʪW-V` vK*1=az$ ltUX(D ]Ǡ Y[A-44.K{̳CVbnGʯ`e ,ZNA;$f$0 BүWlީDDU(A[qN.oS9d%aS/K ET`lG6z[xѡ-l[3]\zUxspb^.WRPҥ4 92e٥stiM ڻ @ Rm=~@7g*;ąTJk`${]I3a櫤H4}r7ξ悪-B MZ6Ir  C]urxu:v=dM&lС4Is8&3?QݎR}䪣X^MwrMG1dn:V`G4 , _ҰE:]BBxH,]%jNJޮѷ}̗Ĥ e\%1?)pC6B>u1"+~}QNo䢻pVf )y?Hn"qPX*"sZ~'|\ͬ2ǁ{C2 |boxMhdfGl/hpk)ae'-nn#2%9[u04 |4yIC -Y3&f\F(tܽ6I(s?Rt1ke${HdA)<eceYb9ʍx0krԕ^ Ya*HsPp-‚]:4kc\^ EbCqG(h O.}nȋ+hNvϧҶMQ%AHo V gb :"qqa$XVT @o0q7>骩 5,~xL!Nb-m\҇k^]nEcjցVxOB1A4xbtHbӈ[+oז7Pd 9. ~*اhRH*B, ujS(EX<3EtuHɦժTo\W8U# r+R #0E泝=7T0Li9Zk*1-!o d:1fUjF[_ǓN04Ⱥ3U JM|Zt y7?dvo/b?pN*6RLy*88^ :6`Y]TЪZHq[tk;iP]{GSZQvn=eJq+.2uÏz3qH *3kfb5@7)Ojb2(_j A؂TS0D !ZUwsiۿk3k^36b}XDeiKQʻR+TyM[A9Tr&`ැ3+o7MG&W| Rn XS $퍸 J& ${xu1tBxIL*kݏoQM\bTo6{ܷNB5}QA'O>u9*ž)iՖ@xR?jh1\uc_y ;o-F4Oblpp~ j-:ڝ`,.P3/KidS]XRѦȁ9KRRhTo\ w,"zdJf8=)=CqV 1%^\ec"K6&>^wkJଟ1g?lb KXw=9JhRsHi"x:.}sVZ)00 bMKݔ41"!P=arq{֓NiT(JCx6̋"iaRiY3[;%ޚ;Wd8<)r ;5, mZȔѾu%o9.8Zl0vOw8;VRچW<؊+WՇGP6l6w hAUCI*QWsM%L*0&]k!fZzɧ6|*}rd,qfh*r>=F KBN@t7G/1T'b#r4:֒*R;}*2&1~:{ \xtxv7E|Pˎ[2cO*6'ĺb2<ƋS%G|CBI֢_?š7Q: Xu57.tόb8.{|R'ނ"˜5C4LhzG=Tũ D;ʇIX/ vHuiLm5BAwt$ʊ0)Nʕ8p%Rq>חgIqxc#j҃;;hvTu՟S{L8qp4gGksqnxl) ;ވqħVi3 {0&՗t4_mdZ Qs">){ PipC6VJrŬ{AN:?:!&lIBNÝCKl=Bx?|?♈;7(d 4)\ @bfMDI=$>1d̠# 2L#ٶCWH[_b=W.!q(Unn[C="B5> a84a= w;߸O||9YDD~D$p>Қ29cPf "jZ2 F҇6,Ջk ́tʄY\T€]X86Y↗12Lc!>}a(% Rgq ;)3SϪs n LYR!䪫=;rMr eUdpj8t] w")ST O"dC¬M'jC[6b"ȓ]hƅB.-!J z%(1*PR Tlcŝ+8Cӛ.sk UTf YdLC7r *ggd~ A;(DZ|K){rUvv\T[JoFU~90eV-҉0aJUa:tTTgB#}B.vCbֵ>kˍ|1·Ww]\&qґ[9/evݹ8n+Fʱ@2n$-yH,`ZNķ hJ ],"CH[1 [VZKAIzjDT8 `jT7yv9yrK` c$wް\sTo>6PTfؘCxˎuA$j,>b/91Y蔙^ňTrE^7uHRM~q2-7ګleK XH@(*K֜$5Gd'ް0;/xLԨz_K#{ߝٝUL2IL5 ̕i~jۆ`r<"daNvMS>M8Gga)`=bo@xrvPkZ.B`t={ lqZWP\T+$y0KɮG(I!kI M9,@75JB"_ X_Gqi񔴃@0Zu,:hOCQ衡[Ҟ(NodA6qso]$ PվVdJnJgvEh܇AgA@5,og%OVz.ܶn ab0cͰ,/˚* ".HolS*(H vnHo©Dbam6I$.\m,0\KUP8+3F(TƠQPTf)ery֥s6)R 2zY"mTmMժJ n\0i0-jEdI#מ;ܗ9jK)0L x Z JW3i\ϣ6}b:B@# se3CrMy^Ƃٹ%( ))Ov !7L_95}崹="@ǀR~1 g0ЫP+av *hٜs}͏,;y8u:TPV;\C@ -w?N&%%S%gO[5<~q evưoQ, 9'jBʒCBCQ'#nLH'&62h$)%Q T42ICRcBD,d>X__Ѻ'}EcHh0& 4̉:. M9IۖJW-xx_gVA0){0#PLiFLp# :Btm*"[OX(;(I }.}E[<#i޻"ylڋ{IF7_~yC̖Tk{GDqNp:B\& lb`xt,:yg<ϛSkU/ΜXm~hO>+~I[jPYm)Z-^/wAi=MNA|m٨l}Z{= I_Hv{G,bv6 T/M>6OTqp[荞eRt#S9^N-{,;z0Uġy1IF;L$JoTu'UQVԭÙW5royu}݅E3W|+]\1i;th%,]/Cϴ{_s/ N%SZW" vlg8B)\4''H2ZocOٹAb O= W5;ͮ /tF Ѷ݅M:ͻ;̮4h<5Зk;&h+o+oa9jMNMwRo uvl*mN#} hE|X"]NɲK_[8^Sc+qWZWlm;*Ἰʡj߻kQ?֘%hfZ.{ `T"DOGP!  iKըv_c+p;lx$۔ep xԝ:t r;Dߌy{~.?WZ4ɫW[ ,Bf"} $!ѽb0" NOH~oԐ|`:vH~'y+/]w[?; O/$dl!}I$;RHjNOTs!h)E Mn[ڂiL jS`fޮGzxZ K/oچvGb`n}>fvSK/s۠M)sQBqD91|#coE.x ISI`R W'@nyѾMmLvߔU0=62Q,)BoIHiC/^8y_}.nd+ާdnع XD Д"8Ɓ$-Ԫ H9y ?h6k>"â\'0tJ0(w6 Usԗk*A =&$ bã59a*)Յ RHt)Y" 5/z[>oqv*BtrEΨB'!y6$@|o?9p(kisV?X>a{P;{ik=,l᥿HLxhWU֙ ep9rןgC 8N?g_CHX?[IV,:PFۮNb_3e6qFNÏk-gλWdS޹k9޷X͇"/635?X5m. 
bO8BW{Tݭ/' ʭ6*Gҳ552cYTχEt [紐HCWCKwc𐕜Wp2Bz]yE*?\ 9Z>5TѰf6}b* -έJ61TJ%gP !DG[ D($Hr"/YW""@P+MXK2x봳v2Ι#x| ' 6N/OWw_*hZ@un.v5q[9l=j銧!}FdӤeF#2LmtPleI굥C 32i"XV첑˶i}pM4ne!#4&niv&ϡfsۧXJ0M UO-yqB%0\vu \!{羠)$9P_ gv gF`Et zɴ#dkzRwF]8.16hP`wCIީ%h=Of{9A J鰞$eC 揝A!ngjU}tSؾ/5G\Cf_hq|4vtLi).rcR -id5AHWt%QC=˫hv8pKO| {>nf[U;Ht|_ˢ%n5 켦:y.o]za[sk݃H*ŷ.HMVXz޶# f4Ns@Rkɻc>\ʌ=TKy˹]4NE^NB\Nɞr x~Dނ]g g`%Qf#Kh] M@\ݕPiXpz_Uº2:%3ǟ?_/ xniF847 C -IE$Qݺ(Fr\dh0 v@ UNC6/! Ց~ 6]~Vp|ӻ7wVȁR`WyTP?suߘlSܴx~ZJn79%u}j {C_;9Nou.~E9v/KG>,lC| ץ(-*42쬦p0XTX%3=ze ; |*G{0g?.CF6^(߆ҽT8Ԝ8ۡuk&2&fįe,tILbT9HX:f(uQ3z=Q:`OZe{9cZ̨Sͬ{L1$\H]ǤˀK9bb%Vr A}QB}Ӵ˜=}Z3ҵxɱS{0pbr$ ezoj_C_Pv"~^H̛G}]fө')S7_e'&'߶2i7hrrJ"ԩ0F1BD^Vz* tb'mLnt%?kf5}s?Leo;e~,S]ZeO*ĉ%`OeG: @DQrFI7GɰL=y6\L o'Nֳ2XxꔀH@6YXN nh0z:L] J6ӟMu_%Oa/KR~s6uZ=]R\FbLkȜ&죻`6|3|]Ќ PqL$ I8v!nd+$b|ri\V$ՁRu|kNgQXp"lUw BiL"̌y<✶$zX . LrVdN`sm3Aޤ.HiHͨO*pnjP &>Nviu,;ݯ 9u)j#^ ;1*A ӬỎjᓌZB%6eȕ\{.o&fXCBDv Jn\yQERy^Lś}رT<$ȅ섨$08eq%O[)S|fԡMEwJjSSz% F\kīRWk!˾q.W$fIWq5iôyt)BqSoٱ]~ ZJsQw09 ?'f;_u|N$mοnu?#|\!ƽ$Z6?]Xa[32yx5 Uʡ\O S>TVrnTw*,wԾ֠JU"7W⿲1 P8mM;E8goO7g7W=H4НzO+a{Zwd{Lg!]c wZ'uH&;vk'N͛4[ow6E"s!cx ~61ÙN-cGd:wP> OOm"ӂДdxR&<)]O ,6k, b>Y5?cQy3~h\;osJkZE@]e!Q%14Ni!\Tt{:y_3ʽlSﻇoUgyzŦegͭ oW2!wЌ)nS@ l%֛|0e0>r׈RRcKZXQ֔'z,NQkC ~$q> T,<,>|O/x]_ZH~m0?+=xdKRV?'&xP&)!w ?R{spxߥ:yvE^YCG|{S 7{MwR)s,(u-KV}y/ OHF)W.購OMwVT#)!fC较o!Uw~c1~c Yy2Ǩy$jyo9A>3!Έh(͏,M!6lO6ß9C ;ߠ2w*t:i/<-65y7~ʃ 5Od`O +R S뢈_!`ƽFѤFeڜg+ͺbl*Dj Qێ (i yV:yMS"9D­X=/=qIBS~s s8]CSqOMeB<8=_w:.zGncdb1tId~bT%lr?{og6BVDTc7-XoZ,vM`D]w.&$qFq _!@~۬Ą!핢`}tACU9h?dK5=ﹽ}6~v{wvd8\,oiIp|6Y>`V55vضg0G|'_l;hgԝv;cX=Fpg]j?PP ک8Ƥ PZ5uZy}&)D5evs#X 52] ]FAalc~ 504/HKۡAq&WiSWп% 7? א |b* @!)J }oJ/w7פah3?~ROpWUvf OcRगiglrFJBE8GCcEh Ч-;ӧVeuX ;IRy0ak`]lėLCmn"n[6ljѱ.XʄLj#a`*8&˺DW.l9UK +{Fy~cQs 7_|Fyϗ$\K:7RWRX=*ފq#}Eq/YaȽ_P}?7Jkźa~6g'8|~Wtn1}URZN%rmL+OڍE '6(kYWM,+jѸY܏'weAO ]5B9y.s\q]@HWBL`2E 4*gO-mWN*v<ȉ \oLX\6GxlZUye[]~OS,/K;M~Ҕ$ WL/WJFr&ԏA'84 Kq, L( GƠz{cM,ko}UT I[Rx4VMrOAH2nMoֹg߱!UF\ SfXR5F6՟ѧNQ)2j!/SF+C׳tKeP۽(n'uB.>"[w[@l։|gf'[{kG 's쨟̲k;;[[vN rq{Rm)/JjBlY3c/pksJZRf^~UrR~Pkפ`Y[bZ(uxchv+yo}A7U1" ߟs1g?c܁P `Zܴb)죴v8O𰃡m_?l‚?e8+,'3C'G"ɍ@Y[.S?vR8=:Kx푙Oѳ( |s[{Q(h9G XLJ'uC$vPr#t_CJǹ3[j0&2mZN]v FeRd1+shy.QpVLB]P:Jz?wg19mU`B&&^: $O.@|(?(?zst5v `VxKH9{RE!?eF[kgA+ tb%b/óP3[ǰm e5^W[I>cHX;4 Nhg 2I & Jw+$)P;wvl|udBTΚ@\b`6|@;7jkP8p~ƣKz#F)a#_*/׸[RaMYrp4Á9bocMv6,66۾J3YtNᾞ-&RXcªKf={ào@RhXI[ՙu~jXC{.0M{u{j(\=srѢ uF͌KR.r;<.m}q54tuǣ@A՗< C|YҖ2qh'qD*hV:MQPڗHRV(Jl^zU}Cn̹piqπ[4 U{߮fVWBsWbTn˨2ȇkEnK ew:ݕ- _2I^+'+ct=ZTT,,:״}ml;w IԹ6_u'Kuoto94׉0q'WobQuۑ68:;eS|Im;#azR~]?7s[y8MT;V [cNSVź"e/sUc9o0 &} j@ؿk::.> X_vbnSҺ!OBr;C3}M"X#W(|zop׭Fud\`7;NW{mU$I~$3>2sROҍί@ 5ܱ' ^'2 b2-A%sj~_>EAqhM`p,fQu'݌o1(I rB "!ТYMj6q-3GIN\ThR@]a*f+|\/_% K|fWb[li_~# | 㸃[۫idK4ZYԄ {Ϟn%$ZG#@=&&k_&*!0 \Gc𪝪/e"oF0QF I*%#4nsEjS^d?c-fN*0BB2,DbF}qcG( ~ p}+7a+!ۨԡm>Ʀ[S?%0jpD=ЫQm yɵ>ՓųcpBv)GA'D*o p\߮5}5ϻ*HM80 @' &-=iD#Yn{^j#sY,oHAuU=#BK'pg;p9e_K+O{$OMd yä4Uc t1\Y%bXrsw  hҚ]"!5~i (S]]G^٧lvJCJ}L.`d`@;ٟpo{tց}BC~m~wQFh~~W}oao7ɑ =CU)"qbȩɯ_kW/:'2sI~ %z~Oy=O} /H=:si"BB)bBU8FRޤ;?Sl^P9~{y?xIR Ot$ (;1-<K,{QXgd<* آ=CQ?j`=?x}:Z$$ !2}~KێpaF'bTGvZGd<s5p!{FO(ΧYy!u'= y:9u8j5^o7:(,N( 1vc%O5/ѩ]KJ'Săי`ޙ Si'URѢlό~x"@3894J'8`΅֤c&~:7Y'H|ğc|;gvܚ9P ``TװX8H__IlJ@'LHMM>'d=[! 
Vzç~)SYIѰ'8+LkkT0O{d@/kH|wo=68>ǯgO'l:?CDC Cs:sae'i{ w?gmjv]2Ei?a*} mhY]2'рfg `;_` ^t@(1/Mg2BuOB+A`~FסO?ߛ0G[w;IJWK1RH,EHJBE$R{@CsT'~?Ɯ2w&ǰ!,Fdvy/QsiuVRYmVNLSI|5TA2~Һl@)tH_#s@U7`_w-gtm*y \xM) yXۼg/uC2}6lXT4) Z+az;~~ ]UbsQ|> o;Kk,]8|R#V>=64z$2߉Hş,־~g?Yf%Uo;[zPeʞY[ZSXBڽăam _n;wL)v4jt.(1?)>uO@-oh6}A{vEG3gafS®'s}}Sq0CYq-7 n#,]F+ns̐+DJx˓kv Y晐fѺ MߞF}g{$+Q?-Ը#9MhծھK79; ex'&O >[#-܌#)=a4+w|):+E:QASi{QOzut[U>W/df )Չ"UJo SZN7!4!A}B,w>*GR߉밖|3`hmɾک/ŨCsm//V9F:AL0 0mgd#qkr|e}iI\5ˣU7mz>>d?FH7ۺkh#^[!lx?6Io!+rf>a{}fX aޱ6ɫ!wU1c]eܺc"<1.1okۿeps K u. Y$QH..Gɒlz{ݱZ>q7% T;oP%*bR(Gb}Y!kR:6H8k͐Vo/zO-6KʹMJ|?9rXf7 Y UbD~7a>ǧ,&qv-yItML):RAQ1!S2 M`l@L bo`Ɏ- ے(G+M"r#іQo ޗEn@ ʊ+>$HBo-o+QG\0S$*C$].HdRYtyvo[jw88аmo06: W9N#6J, I-n@1DbMr~=ǦH@W,BAܗס&zO,,jNʫD0#oG8gE "AzN;ZĈHЧzX FӹƊa;ɟ8 sT P!b# U,=Pvgr~GOaBpĕWr5<8ٚ ,ًBDپe30W-@ '(c~ LjJRLg- ZCj@0"""vVyڦ+2ÄO; |RaL8LT}DgL!N_?Mtn[̳A.oRb9A:}N 8#=mRïQ0r`MdĔr}[p Ԏ1}J't MChq _#R<ΦSU;:!. kgY8qM@ .;oHqۏ:S7ӠoctF$'jOoeqwl@"X0қvr,紨6 *U*磒ӧ2lie9l噳?EkhpO2TM"R;xgWrզ7C\hn3V!{4?*mսݧ많!2|n^Ooz[{)vMLU0?G> *uHOzaZs(tЎA |ubYJdWSu?'h>)v_FSy wSy̔*g d|5_Ǽjv3Ifx]j3iiSg44zIoucf8:ʎ _ۘ4ߞN7AoU|G\H SP]́" Kǐ7.%] 3z@]@ n0minu‚_Yj@n\[0ĖUh+Zp89kه!Z:uĦE/ɖwL]w[{o3-U{{+U'+ykK䶪k%C-R\zx^?ܙ$ /e5 Mc>~V75.+0KXu+GŦS4]2I,*U+%qڠrr%)_z2mfE* xshO/>ux2[ҶBrBx!e8rRzvt0€-萵Nyd#i` ;v$ZBERq¾.ۑ`'[W [͙~o5{mӅ#I -{ɗa~ϻ掠~qy49Eዌe#Z3`|^r#-oǣk[orBoOC}lЛvr/KY=oC[wS U_M{RfRAmVv^ǍqJɯk=UT [B&!1ulo|;ϕ|ӫ_M_Cڝ@!U/{y+UV^k pkc1>ƾ9gc?&ƣFm%7/l?#a<2 Q/dj˰Tl756FOܶHuִܴ4d&IvK,&MoLg73Ԙ^ueZsٿ`B7 -[*v / k|, JQLg~JRĵHK}gJO^.f9c"f dn#X)njfb2F{Kmw*u7ѡ¥fݤHGm=6Q̹Rm'ӬƷOV'9AU;, $sj &~PM;+Ⱦ:3+^iN˼owhB V6ymRǹhKq 'Y~;w#GEw;Цfɵ\/ .W/yMyZzLT}[Xp]gD۞xNn‹eߴXOبL8MC;[]PDM}1pӣ[; W>E2Vm,N h+ӾRںMH}?Ly#kꝡ_CHe[zbL33Q_|u'<88Cƿ>e˩ 5:z::5"C2}k[MTV$+uY]EPÈ07 5W@ܮU" `HɌxnm3W4ޗ?S{ Igͣ3XFbDNK?{q'gTMk8/a9wZ@LCmz8{~\iak"zf>I ~ y"3`?)L!C!?FŸRa{Go[Cd?f+>^aGs~< E|pEA8< n7sxye4' O?xH }(_|w;9O8}7;o8ɵTz¹8( w]WDOl`,~QbǺN4/{&|jWϠvPKD%O#JkX -@P,E)y?nVzSsQ${q6x0/ui7v3mbZ|(/$Cd\KPnWd,Ӂ lM$Aga7kI|]Lr Ę KtȂ,)RS7pv&`-'1@:YW!< IADMiOG|> $.t(a H?]/Dyb|S6I5OPM'$(H(ӄS0y' pgɼi0#x2j<@^A]$fgwUh= "MH4>ٝ@ͷ]7`w&!=T3 q0+xh{i>[;dȯ~+mWZEяXsR m=۝1s쇳a|>5zm'[<~Aa~~.ԽQݒ$0D:vNF= H ;* R j`HiLXJ7qrӵ$AQgPsd-ӈDI4&΄)wI ?Ɩ+d:5o¼4rR %cT̒;,(JIww ;F/읔C `|! P1Dn)@ C ٭{?W{I2>qy?\5Qg%KE9-.Xy{L1Cψ )w'3qGhL߼?e3uDu7lZ`(\ʐkfgEH=t>Igu.Äi\|;ɰ T箧GĖFwy3 IZ)1s1DfA:781Q(J5"WCSAI _0᫦{/I u̸ؖ' ji*d9Z{~bO1ѭ)~޵'̗Ok6_`=WƇy2o4G`*zs5Dfͣ_UU_Kn75Z!pY&]P(ghD?:~_&%^^xq)QҖե0)wDۄԦMȞ[^un5_wS~1ÜuUx&mo]L qᵮNr00|V2TDKtST6Tq g,$z%&l鄕9چ?Mea^&wDMx#)6p-#; ШEv(~YE+gQ-XtǏK%}[sy51L BXH@qGO-I! ve5-_oeT?^#k]TmB+dRpث<즮~h|ÒPZԭKo6__=g#Lt=Guc~D2V’c/dqT;R\}56 KtZLm}R_n8ts푇EU򄆌@5. D"[D0n٣"瞖^kmgi{71AqSs}q? CnyIo7-]J蔭迣J4VȢVU«Y,xVux,^hl(:')tѤ9XԶչA溪=}TUGmBG"1+ܙ*%&˵ؿH Pckx+ph!׮ƪ+𚋑y6Guq /dKe:3`i':LLW0Ovn.*m6,n$= :eOw+Rj0OawTwY^Q+B_u(GpbsQZ;ħyg;씅ۗ۽korhLuA #'iζ-vTo5>^;6^ڗq>n x8g2[̫)p[V>YW))eOa%wf52?]_oo]?ͤæ$Ьv~ߨU䯮IjÝmAU#Z>05mw"Kɥ8xtoqќO1z@]DB@_opj-wm`<sMY4 L1UbTǯGSAq5M^ @P ;iǑWXpO#-Сs>'W`/q 쳰-4O.>4O祤@ 6wZJ6D+mMNֹTMjZ\֍y|ſY $l&$ yR;Oe46WV1r{+^/P;Eb}P$lE~aƑ~K3^RDߟ΀!jUk/cZde(şeh%e v [Qڂ8"<ކh[%?;ln*_\V &."o3Oя ;kбԭ!M @8a@9Uscf)5q"@@;:zєFcf6GR55+Ќd>J=*[+^X=>Y/n3ma>!&vG4" 3GbBqImI =3U'X@kj秧٧Uj?ꁑZ Tp#>׽mr\8z3*^fW&5-R0_rAXpB$'(*n( d6VѤ~c'Boq=\0ź'eeA F?eXLY|"-`(}.]ic<8Q4-u˦V-t%VIU)M+s9SݗUKN pys!@7~ȉ.%~YJ v$ cY.Y%M;XL 2uwyd;kP1kl?+hC0QpWÁn }fZ(xODn ħk=Ͳk]X"c?|lO3>U0t4:^j(kr8.9$?-^PyEQKH ~TAʳ kx^wOBGy gԛ=um#bNmwL Xa4<"/PUR Z8) Ȧ9@rhmnXLk-rv;OfѼOiuvז8M-&T;NP0z/ϭ>|۹E 0mW~{Z%)Փh]/ Kv:Q*GQ5H \a+9f=I؁8-gEwy;=w{6{D,opnhnڶHb@qh u2vc>8Rřu' ~k˷ 諧m>6wҲטBѽEM̚. 
"Lb(יcK ^DB9?ǯ ؆־16K?&i~ҧ~syyc>Nw^gf៾d=:UЩ(vŹ=kxtM,'- 8U(3c+%UlgC8\o_f%˩GقnܱRr!+OtȽc\hZ<5m+8!s\Ne=wCTX^W/11Z?EJ Gu*PG΁P_F)rOZ[jv*U5ݒ\L{lHl.!F:0,;ރ_蹽SX#B SL?NY.I$A߉{.cj‚sMߨu}B$aS=2voKC{p;쮥6S#T%k;`Y( !JH!FHL b(w@X3 8tȀ•{T"]h(N)d!0KU 9_}[=؇݌:[??qOSRyש $a~ߔXN3LH*]C\锻8edü5xLKȒHgIbqטpAN-xgjCu='[z a6n,+kuqu\lf}|zHoY6=?E:Ңaoa{N{{d}yDq''90D4/ RG B~* Kc 3u3d(毤Y{I[ZAD"CN:;XzNK ,'oILU6R~~y)l6%`6-;JqgdVy7?-q~\\}u։ϒDv[5 Ijt>1tfN'=Pzev~3{U2Hp33$˘g-0auCIܹm&5 S r윕|(|/hHZJE$Iq!ӈrIF jitN'M؁?,G+8Zδdohs}E" R*C,.d//npmI}~|UY-H2O鑄D~ޞ M@>1t?AXa&'{Nqm,vgi{`b*Hn+8Y70rS0N4ʑ^gfTڝV#]ZYc S`M ~S@{v~Ϻb!@>(L}nl=@s`G)8nrOCzNréOF`z&^Gx/<Tx9xiJXCr mLg h͒hc-vYDPCg̱>#$5WvQ zQ) 1,sD"zeg l pe(T2arٲfg~_u۳螱!]dKA`Cy0;:{gqBOdK!v_kO@]==@.|$2/NXXQ!q }ҐhBo/$'$C({.>!KÆG8Df A~cGkHOXŭPP`0|fgAO'ݦ~i-:Տ-} ˊ>VcunVw ./9b~D<)ۂ+NJb3B=D &^G5^ [2Q Wl~mk4aYSgAGv"#+5sD$ptha8zi B)(4;e̬E%ؠRVś`\@vaӨe-P6gsulg4OqC!7>{MlOʑR٘β#`nU870Gu/諴7M_rqg]&h@vso+bwalm}<|/;$NVܹW+1uƪm7BX(=w汧. 6U˙>,ImN Jd=ܙAk9*?O/g]=| K xi"<#uc(2WtK?ᚼ;?Pej}#}&\.־Yl|eC`LFlw-ѡhh}0WN Y,WJrq~4PFPI@.!R*ߝVxHzgy75&>³b;({*5KX$NB#ҫ2n9NT \?¥(ͿR=b!v/w?a4J7NV.'+[UM+- V_[-tRZ^̩am͍=vՄ# I>9~ڀ@5?.Ř5>cdNd#v:z8y`NaJ|s{zf65\kjy~dJ;$.}AJU2 ,Y-wUjPDr2AnnOљ JBF$,U?4&ʾhX Q%a_ SPaGTJh9DRPȮrZauRGbUsx[$Ѷ nVoNƃl( ~; 0S&oT龚>)C]B.[O+¦6b 1nH!YS9&Ǥ[sYf&KCD,P۱q[lo{˗aѿwˡtGrMgJWi7Sɠn]N&ʟT-/ݍ7r6ǖև>=%OþІB7W<.+j}fe ɃjriYN9/cuֻX=U=?YIT(2 8h:*Yrzmж1#czkQ_p+cSrYs [Q)=֣*p,Ep01C[]t+[}m\K$;M}Azs*sOH ,\utFLt$)oO{`POC¶.m3;#_QT2ʭTn4ڈ=uzj=qtP7O/^Ol~'z7'K[e +] xD^76zQɦgl˲Mtͷ.^Qvsu|)ج8—ȵmyEUmS;]]>.ze;jp ER Ufp9`aK Xbǜc?KW4$F_3r"կsdؚ=Vy@ut]z4*t?j6@/v+{Q$M=WQ 6N:c9uؼžȖhc "—B {xnڅefc=U5u\ܦfW}Uݝ~eOq1yXJ_r%9r^x455n>hۼĮLpwcG55ű>b4ÃM/tC_p %sdr.&:գM;3lu̓ ͬEʹf2uThrc?&Ju]nٸ*qgM.`A2[ir|[#Xph6aEҢ{s|FoNcs4;DE{;\%\c"hPWa6Ըɏi5{v &^(3rҩ&3kLŅL*6B_Ԇ'+cst󡲄"1Ze]Qo(@@SI"d4b `0$+UuDhѽ~*`ҟ„T?)S[S*Ka&'_Ӫq҈"٤72 c#_9)G_qB B]\W3Ғ-`YTQ P̽(C4H}XO ǣ1 -fx9Dt PIQmfKŖ[sդ$E$@# w;eg1t0] :4 ڻiஂā&0@ !!Ck,mC3F1! HP`F gM##*WU{Y݊NKE.\ B+(ĵc`рR (v 2Dd`I)TՙJ{$ɤ$:=]z[/63qrjmwq)0ücKɋ>9+LH G[ ΐZ-wc??I΃ˋ^1ݸPBN7vA F@b A4"9t~iՌO>VjOZb}^<~>t!%F҅LfA/eaԘ]T~:_ow5~ l8=utK7#ƹ΀x+G\G`j/.KNpK=`WSԨt{I';?KJCYiJ 9IdPx=!! 'c{|fL!8G,:1ִ@v'Ssɓ=mB@ TuIVYÁ85BRD__U'sÏ{wOMѳ+Wfvۆ.[M}ΕAGOa}{f+ ϛBIpʂ z{#9myQzgտ8=`]'/{VdzXCtͿA@sz}}b_Ɂ[kxH.ۛ$;8ԑ|Jl:)'5g >#~б5U{] +?].QWڿ͡5{3nv8B תyyC/!~d+dzRRzKd8i-6b"N6դ{ܝ[>J>9lwq=ɸ_>lGMsm[1v,|ӑKg}rQNB^,޺Y#=/bzhhLdC_Tyթ:it)k3-7W,-hesB$/7$.~ݟzm2}t܅o_*֑t!f@BbZtHc(N5${kOӥhSfS 9U1f1SZuSeU^Ԛ:Ͱeęu|D ,[75y^^7UjqQL-Qɞ8cZWe#:2jl=qS-2u=j gR0q(Qct MwhCzEx$RM\Z <@ 3$ DAվZU9VCfD*mYܔ4eh>&#XMb̀k\űH5iq+_=ZRەK\+ L|y.v1p9=hby"Lc'j*__q`-qPJD5GlrqR?^^}jGdmس; *aJ?%1B/2~’;dNM^ԳAnH{3=r5ČܮVLzKta讨vS09YKE)JdbZɸM*캭woYQ){AǴTmsQG6Nup:m~]xτ)x,z6kI}j#O ~nAd Ks^]рGlscf HiH]y?M>-);빝j@gz$$8EF .z`Cja/r#p# RP)0? bZ|tu<1䔚IgrZ |8"TǢl2k2/P2eWlE$t o=\pl歒#M$sW؁Õ{Z*눑v,VN*\]? \pƳY'Bw9ޖrY\}sHpanW5EUBMʵTufrO70*ր[ϓfk'AZ-ôe(Ų3|-\(Z7MZv3?i]G?UKYqҤ jd^1I*TQ@!}uft*oE[o/n"qW0e Cߣ?.|X&TЁְ6m{ C`EX2"jWr\ QPR,,&7|SyLy6O6c<kp=w7w/ ߊY J:f \Ώd{UN'_(zw^sC8lY=* QzzZHNVC<`\SaaTxw/!$UP勍p-Er {y@mM6=? vC{mUYbrƂ?|V}쳲)8^&5$W 72isԑdhzqMh}"&ȸ@\ص@㽾W R8s=;L#Ի;* hIHa R|jSo>f,~hZ%PU-UMRk|ū NhGc%ghO,D7毩 = ?0t H "3E5*IE_6J'yFz˨z.,?OvhAi(>K_AhB&aBHΒw r@%(v$"@Fci~AՌCgPN(8k=} \5Ze⭕Cx"/hJM\I^!~siKK`VU"n#W}kuAhO\}ucz~vZ{ ރkT\%=-z?^W<>q`d RHonZ?h|#I?xos7<f#o}_#vܳB}$,?Gk_R}sz>7)83>}vEP>3X=P,M8 ճ|6Cڧ⡥$da}B+'n{l&ׁSoeh^`YGQF. 
Oe(=fTHD卋gMRl֡=oaı!ۿi ֟^ΏGecȀ[ X~{OL*'16~6ǣuu᾽n9kT<8 7S&Ц9y]G1 qtOG`~-mǥ?]$Jx(hޔeGн<nǭxO-Ѐg3M*)U,/HT;[},H~~EjЯ7a}Y gF4延m|((PkMY1H15&Bl }Jq>¹[5w(4Y-4qn4g_DYf]촶b[ZoE6Y} ^4CˌoUY6O}5uf]CYf7ȿLßlʹjkSɜUCşu3n3`r=UO: UXJ%I| ֔nqbٵ,Z(4SkA_6QDz~,)qt_ )fiܾ;-kx2tr'(H=泫⼝@遈0g;W涆kU6 Q/($ Ц~1bo~-n5b'Y 6L@R?=i1$Vmy Q&Im7,EB~/Sw4j>'",T,1NWqcBޗ6uXd<gAì_Ą.}?=޶kH9ssp]N*N+8bլ#- VwLijA}0N^|fBq!$¿@Q'k#d7:'`[`qv\2ٮ?4_ S{3SOgPVv:,Ic׉o?> ѓpJ0ơ Q5řf=JBR1-Rw5:o;%y2DPtF!T.7NiO.3?ო&a&n%=NRNM]B5x m~&{ŏZBR%KMƭ,REgQER=Kժc[Ht;nܠ3D& 'KɰuZKDCQfn ~򯁸;ЍR5x\=9u:5AbEv=iKTSxϝIR,f?#':_=mfb dK*YWZX1ٝJz~yW)'7.)l;@Fiw\;~%u79n|w| 5wg1ރ`w cɮ'U~^ 52&J:Z\1[ocbZ]{k"׮:n[,4&+`Y*ȴoϑ\3Fs{t!}G_N_'}dQU,}־ʑ (:i Q'E]m٤ov$UBy+ M\."HC. Xv6Ĕ IO v'0WSwc"/He;[5{We`{ oVDuYJ$xaL/x3VTj~m "eMϠ<_NV6y*ƌH(?Q> ƥ4eE&.voҙÑYdz.x8E=F`a2P0գ3gJ'7Xa@Dp H%N٫NضST<(oieR :Կj dio6Vxj.ʳQjJ*HZR"h>9̢#noC[aR )GĎ՞*I 9CS·qfOijJYK?P!TOhvvogu\aխJHMZ!uE`%`^yh|fG>~;R}ೈץ@,[Cr=7@ϕAYw{>gW{ "Cò%7i< #AOcs njg> j$J _%[1Jx%(r2O!Ck~ OywUmۘ;W '97)yN5OuUֆQTwd;Ǘ'h^: J^ MSyEm;tcZM7+5xG(Lpy!@15J 8foK_f-:ا.$D_@{hsW5(E] |,\So[j;me?E|_!Lm 7O$ wJEKX^3{'Q;f%rOV]&?zl igpJ4mC3} \K!d-W wWXMP9 |xV9<Ŏu05fK+2#o8Nג^+Yg_7cc<M'aIff1@𖣳_z$Մ-20ʏ#{^.X|+{4%dRZu=7Cn DkD@1ʪK-.X0uua -jpA>hP1<=u3Ч[_*D(nRۺ6܊2}4z4|**c6 3?K؞_(1 bAX2nDOs!-wqPOe*CG- ]).7G7 sUWFAS*GE(WqR: {bߠ&곓Q=oԸr?QN/w$pPPd9K:B>P`GLi!<|רՂCaK;H|T . .@IX5)Y +(0Od89y+%R!n@%;_ځظߐ-aO D(IājR7"( Y":gbf~Sz9P:g 5kKI 7 gَNS$Nu+j0U[UƏ{v-awצtPP UpNJw1yjp S3U$$ϯJ9=Ô( $(>ܹL&9&%MDA!! EkJkem0b" "õ:k9PN߯(m,>i>RG̨C'CYoo('An?"`.('H<,^h^3"IŠeƍ(< MLD! Ʃ`ݢg?'3 h"%9:( {k:|wU.6y$'L69icXSAIF#@ =p;|$/HzrRsn.BoEWĈMj.i1 ) `( ;J7ܐ!0D:@9cXf ybBFLTirh `nV^Ȱe⪖# "sȝ&G%HV&;V\p&ݜ%l(6M?0}P+z˂U7?Qc|b\O`މt֥~[_zoGuc[Q<5(؊6I 3 S-{+Cd('awKHτlzkHIk"nninH|9[!6k ui>s|m%{,8ag^-k;K?0'9F !b>-[nn ؀qˇfi 5ϋDF +mDPL/|< F'}v͂/ŕߘb6~C`}锟yWocJvHpm̌o|s)xwғW]}gHd$d`?hs ý'C/C}Z`K%,!F7d(EZݜdNu, s]Ç5"S{n-ܰ-$!d'A*aޣ|w[z֎P%yPUG'yoEH;1Cƾ[{dSTzb{2EdEg^ [YHNKBNBGF sV~0'Ĺg}.Wg~Z ,>(;V&! qOA8Wَ<'v=DIbD]dl|OVC(a:_A~ׇ̈;4߾ze՝"@!e+DXiI&hlAoO{n@4}~fR_zp~)=DT |'t^H$Ru>Dӳ{.3C<{V }Oz>SmU,;Oy?̷i]ܣ}Cf䌘r*7&]Q+)a?r}֥ 9R ė+e4b,p[xˍǷ/c7 c!rS tx@܈$wp0<- 7[ʅ-1gmf)D0%zmqB`_r;St $I`$[|m8$#<ďq?+}?+sV09Cg$dhL\7xїjVP?F ' ۤ_+ZXnnD bSsn g t(H VeNUADg/^4T qtWek CS#v߷m Yf"~B< -6 >o97 =6{W>@RY ԝGT !v2'u)HE!8t;P\X(N0`!݋X"aNY<c MQmړUwN(q0y0"bƢ׶}D ADÛZt_ 脡;om ½1)$cJӌL ~bA&@z.6{-|Ov]Np@@W&tv<'o.I@w>XBhaFs) VK(u7ZvD -\ $6R)z> BWUъi.zBPށ;?a!JXHw9Ԕkd' YNci/a@+%+d@gy|cc(o>:(dt⦮ͻRE) ZRM0P(Yym<[ÊPh#5yVg8LEP.U1ğ߸졍e9˨%IvCm ?=K"('EcSҡ2)* Y*goR>9b# vR qn5*YzRg{-6v==V 8ǝչS ꋲٔZŚO/bG2~SES~zpGC*9D~$7LR%)!w@nӤ:2Qv>[6$ovp 4ם? ^HeX6>(AEe^4xw<_BF{ϴmͼ"f۹&- d(As)Oo ]縛#_= t_.$㷤J՗tOo4*{SD!?CeB˖Goƪ1~Kwu\Mdw;k ۸. e3Y;?MJhRx/ʡ*o֬cDB_W L;*W!7m̓O5ޢa&b:(YQȭ}iq1:Z!/|dw\9}pWIV-s2cK-|}ϡž=bˏCbYV*ѻ*t=eE9DttC^29O$nqiuu_Wf _ݨK5Z1u|S~ܫ\/X~Ƥ ݪ= ut=/sЯ94dp:CX5mn1ĄPQw8zQ?P'A%o渱sbbQwvKTXͦ[VikwB8XĴ൉D g)f+dVt99Z}KW)`Ѡ/Ze':. 
R qDe{kI}hD\}d5Xyē?b,6k jeLժD){GQeѫt4mz9·`yV:ٸFUP+  f]1ˎ /g Bm|m\4wcoAIF3L`!Ddܪ|IM9yQ@ 0ȕ JpҨ*¶&]_ )RZf}:zf{jg[$~K;%ȳe&FP{CGӥJyW*7SjiN</Ż#(%S0Hp.oxRoښŋ=.`gjUe b6;qփ_e䪐IK-c-Kr^zX({/ݷiSkiهctmJ)7ʸ1%T854OIYIL)ʍ, G*^On߱}󵦲7/;]=\B+M5URřR@ۖpmnfk38w_.9[ Q6X>e&Μj̍~]k:v50jEZ+?:z0`IK(gx2=!ܮbW/!_eO-\*#`mh@@^G-h]2T{1ͫUycowcȒߩ asEQ,}?_m&U-溱=\d1k7H{GVjЉ9@mlx6Z宓,QlT&qŸ6 T \Vq1b<<\Vdj́g~N|iTDE'm=5o/ϕL/!_!Q6QJ ~ƶ7F~In]OMw +#EI"GT" ؝2IH cy~vԐZoΓ.ޅck00Y3O],CMz>5KNPWo+ŶT")~̸N}jt4@KJf_b?hg}h_!@z=>`-~> [g4m N^~&\4CX#qׂ"KUQ~7.:U[k]lAzinG&Hv̾){dFKYi+쒻8}JCBnT0dgl͜"f$##; $ TÂg+Amgo"u;Օ+MES R6Up2%bA2\]}>i@S[j!$EZs5 ȭ<F c-~'OLi,9d":(B;gzu -OϳN̵͸B~aIks=q %?U.:C߱ujb>^d$v:QdFK\_F|7z~Ԩ=%_ROs0r*D @p!^._FT :6Zf{_ߤU6ϮY:Az!@@@(HJ^4sC3W u*|Դg,Md}'܌B2y|Z'^ƴXD nV\QvTP@/E#UjQTWQnJ=bA2+fCm=Fɂ\@_<' y{J2EZE!@@Έ'IUl-m,qR|N>4bH?"0wbT$T22 $%=idA`9=RW#l>uN޴x.v & ŁR*rz>^k] vVٜb|G+׾GCVԫU s YCܒ8J ,ͶreL:Eur,]7laJgiyBkC?:e9S^=XD ,EO&&WW.[w\gqZkw PҦ5j| CBOk ImOzZ-2֦ SI^nf H%iG8C윩ej;8.3xq:àR:p]F^mv85V[nm6ƈC:h,--!')YԢk^?lmEt yѷ%~E^ŭӺBsؔmgU?s;kѲC_xRKD~(MW|P}UE1&Ԕ&35{.`bCAJuD̛^gN2!NH3&>"NiQldQ!e\ل7i5Z5Zl#9bDfqՉ %lٽb-dbQ4ӈ  qfHBsXUb:дiVaɺC +LˮڨZ[3%6aċZ=0k̢"1 XFLZ(ewCa9{Z/1dps7>J#Tƻ[9JF}X&x}dE%#9lt 8a1h^)W^b " n@ LzQ-iPA#( ru Xc<)@p/5Ksn #U˅Hy[(_8*Im5f/^n1Q*553xWa^\7a$dVw8J(lq*>+E!oyC#װV{@%2:kčӥ(ew3(,TTுT2"K`!n`mFb!hjb|) açB%! Vrg&|*I9o" )*ā=2$$)S +} O;0vv/2@(^ѩ6~@\E'h?\#@JyP ;n)JHĆljND\rWɱR }8n@ޱ]TW _yPdln#KCYUD@27̨߲Gg|qQPJ5 C~3y@nHQuVI5ei`dwel"J<3)īM?Ofd.e #i/;L{v^ ffFVIC$?EڮC @Ă cHݢuy?E 2<#CwF A2d(eT ­PkV["H$w98"#G 9mՐ( "b,5ⅈ ړŤ @q&"Ĕ 2-a 6 6C CO_8e8ճ4 GȉqkoNV eD=#0"q NDT1>Q54nr _[Њ*I_C$Y֢ Ν$8p!C{8eVx/ 0ZZȺA$RQ}Ǻ԰GQlcuN^(5^=f @5N1 :,"f5~$oTvJ=RJL:ec)S[| Bmj rw[Cd#}nw/i>)nlF6ZD?S#m;,d1_"iot~dÓ|qjnp;={7[PqiߥԌ6S$0s!r$B{=OwH˒RJS#l!`䗉0b"`Hw9)1E]TA BrD<ϹIIuD Bs{% ;L*ƚ 6.}\Df'Η U}*R^jޔViسSB hOACU[77ȎczsL?7φ&dCvHէo?Y |^ <@_^, 5U^F `V; (>!Bsx[ / AH怌ۯYiQ+X 9 >g?- entA İCD,/5-B}aJ$ fuH9 .n)ST,KLfLq@՛m>,G(ˌ<Úm{ò'-/zN\QD j@%Ƚ_64H16A KөD_z|lD O~ ?bqd|Oz?}`7uOOM9MǴ ug5.9l (͘?D@5M"=p02D4uZp,%I'TZ! V QD~9oͽʌ^Z)$$A]1Jy{3CE5"1J\ypKߔ? vޓy)Qm(pGfQ3)N5g3(g y 5dq1PPXՄ!)%{r ) Ld8}\LAT҂xf@߷0Fcl2w}Yxαա! @wGKavIHe% 1%ص7r1(H_Ő>ߕq=j\%5L5q8toֽ&Alxs0rHC[ 7^ԛl\ &%ƫ>EޞS::رFDVf.;l޷: rkӑR I S^aH.$sE"$qj6[\:ORXDכ()ݫqr㾱"ʄ%Ǭy0-G+ZF[17$uamhQM3v?<(A3MzNSεxS !d%sS(#` WPE#B$/q}0R<\=N4V-zD Od>7s l'B7eˡ= {ǥM%$ fˆv/N$%ӄ w$`c0@\H> U{d@X632 %<|++D}'vܫ1 )?hSm+6e axկ%Ù# &$*iۡ&]NĜ:zq:M3P^v&XzCMܦf1~5x6?E_D3]iyg=EU)@rʭ)M,Avm:ѽk[A6f2420aICA:nLA * `Ł#f $Îj:UW3'j8^0hB pnhEi!#9 SO:5$Vxvf<ʦ @ 4z%AEY7@4  a(1 M,_WT 5-:)c~BGKB=-AE.eY7I~ f[R_ UɃpc;}*Ll`8=vEwUP޺gNsN G>49KTc%BUv &TSZӏx¨TWMg;qˤe 98Ʉ/džM)?~H6OO}΄Y:z)w"jL_/s`c!b⚡HrO:[nG̡iW|e͑b7 \E; bY-ly4onwϳCP$uT@} .0OOM(>;J^!MWp'sLP8 ? Xy8pdPa]Qz~U !OPޙV#RY:j `Tk8nvJJny l%VcO elUcWwJwo h jgwn~Wwgɫt}MWYS+FF\O2[Q_Oe:R?m`\]Gvm}?yp̨e.cJ~f3#R?;24t|cq n'*oX?ԉ $d]J~5籯 |X>ԊP>AmɘC/7~ײGmf`6f[r;{}|$׿s4Zqo9cT菥;Yh4H*l'=D]iFO7uue?=wāF~I!qIqІt?%5pkk;zt84b|R)_ڤlp$i&.b"n >@#%uP.nVLmq`sH0 c+Oyl9'SZ@ Pfu%5ZCBSwQR, .pUf"thE1þz{g/ΠZ&Jm+CIrVn{_8]oAIkbv7]vaҫ{:^1,(P姨k4o3Ik5BƱtFCAȱe(__oQuٰ=X'̍Jvnm? 
eӡܠ֫MkeIZx4}whxPRlьx3]3(  s/Zp+LO@/}E2첖~t?ljnO*> n,knͫbːw' O)抵eQR5.u yF_Wes>W+; ]7sg)llgo[H~Gj0~piN/ IP4=7A|s -QYvLu;*cx3LcW?>?qö0Q0-sĮI\F޹2Ĝ.$yҡ"+.$z%42b6=jmhȲ6ʮ>u53S##)+HluB aDn,ܹSvΉj+ .P7 tQJU_W{bqt槡I"a;S TuNItQ:p )3Uɴ ӡwW;cQEE%?g EI%`~ m"k4BAͣ^ |~:ඈ2 J%Sϩ\01!&}viP[:]= w崟w|PR@$g-A+JXM|jH[4/`m-U] (:*("Rуe;ӆ'^|HꞚ[ӤճK sUR_ұO3lOcYQ 6Ů;aOjد%@Tf*K~S]h(G c |?qɏ6,4A9O{my(ysAofT.GKA_&"=ZDT?%:&QgO0XoO W_[M_B"Vr+Xr)G !\h C=d1:cwvBz-u+=l_YC3ö묪]'R&u֤jY7/fZ-Wyށ׆^ЙnmYkIglLӳ,8`;jq6ban% 7GY^=yZ=Vs\|(3EOq]S앶jo2iJCƁyk]&ː,% }ڈGCI&-*5l^*68e4(Sjk+w<W0|i>IOZc{d=vf e8}?/}pVpeuTS~yf7+$ǁoW>e8@Ś@m(ʆ&PyWX֩kpGV .6 B Bv93*xH+)]{Ҥ_2:2d\fO"C[4 o Ԣpu=Flr#k6V4$Aʦ ^΅p5=nW%ߧqmg_>]Y5v0jU'MFFy%9iy?6UTwHϔ>;tB0J s=w濁,aqu9r7t1SG2gIS*ZĶ!jҞHcvf_;z=1>KȘHhW 1 6kkf櫡e>IDVUDę7}NbD}62/(T_bĪ.o`:w`jI8+KztOاS-n EϷ>~ zޞ#.)77ÈV@lK0J/H."Hr@#@ bB?b"IŭzGm3(A+0s ^Ri([pk?8Rooy}M>Vo`WֱgNfشHQ7{PD"4A#!(7w}(fYQr5 !ʁb֑{;#>E!NI']_n% "f\$< 뗩ï=>f8'weֱsnYZVUマp3w6( Elk󽇪`u7kyF44YçM7%kYu*?ZS]X}lU!ho NGbAIE}Z$'Uz.t$vW*j[#Ȍ}Z \~M-d->B`N<-淨wnn0 b| >Av(BզDK1&da`4z/uXpbKWT-t_8; UȨh,—FV人jDy;rkjɷG*R.Zέ?G']yfqhHO\2*!L(NLޜ#B^Z@?+K}oPFܰ!)Zm08tq$%pDSʇ(0A8'q8S8yzOG|OL1&bRؑ0b JI(@tx HtLJv]/'ŭc>6ŲQmBH` H_&ԟx1!=>;!55ALF(+Y1y8oؓ{W;-!=b~W9|ozJ_u02FDsx[V嚙qVJ%`\ 1#&'z%.R}$n @=ҕpF;E͌0Dcp1av`xgFZ|{O7(rv_Klt@4 ~5w#u %gi!~n`a܂E N_9K@UNIYW)>{(G|}SشeOnv4 gz{\x[+yW#N&9RPg{]*t B-l‹;ɲ΂% XVkIdy{qaz;{7?,QRdk|7WEsD/éΌe7fx0N߀LQ 0_G>sltLXơ BG/]HᄄF>F3C 4[K _F·IxS|CL?5W~-r59)Z7{O~AJpSAlL^"e0xL\X! Kt _Sʳ$ V ,vz1(G6Cٝy9J;O~gօ i!BcMof|՘ @!s?9Gl-B2 /K02{b}89D>jNH1>SDp{25XfC[M*몐'Bo˴ ^igXB!5)Viw󻽖vPGuV?9?WrnBS̉E*|?֬8BRI* Ut!lRW^W/o1ڴ(z?=h=76I>^h ˣQKD2n1C%u1 䐦jF#c,l\bБrlPc2g#NԸf6V f#F)zMʖ "5f]ĺWHt=1ZmuoZ~aK'tޟa9{ZpF?:-= !; ^NK7DfP ]3y\MRve.ocdeߑޣ'UVYjTex8Z?46V8%_{o5Tp0WB,@_bP۹|^, z]0ȯPKX'HIbl@_C*U+!d)~ĒGz.$ ؊8؂|Ί*ȾoFpl޽#] QRII\tJx0?5+!@/96(״;Xz=MO_qbT$:[Es>bRj):EFxiCv#Obi -eZ(Ȧ]E%Q|a4|u |qL⌑#QQyf[dYx݉Z40Țۇ4:I0$Yw% 3#3]H&8pzcȜSoG6ExMA)\6M7Z;8+7p.7Fz3gserA="pTBꔬ{Єg eۭc;SD5fi^FƜ+HDI,_ Uн%% h%U$wD$_lV T?Dp+ AzC_6G#=7H3aԯ,^T(]ysοsf?>hͬh/uH4qӑq7[igg?B9 r_P!iW՞k\BIGq?ĄH*훉cA4<B_#+A??GR<ݲ_P nJ:(Z{ US'2 v_˧n5m9ڈUHUZXI$-\|ƱwZ+wMs'6}N8S ~a?{*}fd("v²SA%h cH}G>նZߍ řmש0uWOIPˆ5ЈBF֘C p҄JOi{UnA̭RaJ(ftEGOVC'M1w@J3So(9vZ8%i`=14.J\qMf!@:{:^hc,z(1T;_*!ͽcs[k/ؙߪKF}7 }?t%$p9Q\4gmK03puiFX#,2Ib/Dn=gĺ]9z%,`ӓTgZeq?,?/.a2?E(H` 2G7O T= w|7;1_sM~dwpeߦIs+YCFeXo"F0)S+x$MuԹX .VDoͅ#64G՗{ gu5j᪂d4jQ+:v+=We/qx۟ei==Kg&ݑr~ N+ҨZtivmni(_367"/'ǬF'9e9EMcCtĮmݎT!kv!˹t{3Ooʸ|ڣ+wF㤙X)Jڝk9Nx~}> \BUm7)tH~՚i~s+o x"oMGE=AݲkrV(oiCRhOG2yϲgoT3Λ}Eح]q"5C,jmtU{.-yo(4yۯpy|}o6ٸ@uxoSZ[jބmM޾^d:>]bYV!Jـيq`cupTߏ tf&f)GJ{TˆLniטYm; TEtbJhxEY1+wRr푓13kun8ĭCßy[≂ΆԨ%49HT EMKG+L 7եζ|RӷaN`9Sخppxԃ':8S ȫ.8:973]j|]VGF&p[v74#Ѡ85G*礲&:IA]͝4>U_Gb׽9m\:8}[㤎KO癠KJ1sjޙ2±9M>(߹F?8tX&d0t)hƦ|.}3-<`cn|K듖/Cv- /~Voa(-c6xg[QY~WAPa1]i_'JO7 Jޡ1fc0ˆs+J23 SPS=YLdg Fn7g!*[K'4f%d=GN=e`,]Z "_F,h5Ck}yo^Uj|^lZ"Tw*HFmՏ=:QN)qNGXe$077w5"IwUIљVmI-EܺXyUynjHUG,Cչ89I0&9yOvO ҳaY:RPReQhb+i"nf>,AuL k#U 4+kϵT_@>tm.S-Qʹn\OWeou㺼u?Te,EDֲuXwpDzWm3ޥ eC<Vhm]QЄ @Jϟ;f/MǁS6y<"@=|=4&وNpl9fζk~qpWYe]j3qCi̴C5P5+{>cYYiugG Fϝu?̻Zr:͔I4d k9YW7 ,*ݩ'[wH^)E}L'D5:=F,Aa]?q*jҬno2VFNn, VgղQJ&U~:h<)ml7~T^\n5;OO͢E|t_x7 :p\[eۧ=6ޕsxa~"([{d˞ mYfލ&ࢲe׉Rү2N39*L]=u!DtKQdV[\`v#e3_6}ׇ̺xU JdaI5D%)գ9/HX̃uQ@ "Gh .z]Un$IR;j mFsxD/m%ZrzA$|qpm$v3=m9 ɧTW(FJJl^O 3&Lo?B=[b:wPCAyu3!Ш;::ﴯĠټ3.#$e£o3IuBhIW&ɤ [&'?7p[eIl!xISH^8 'og"н!)crZ6ُ~v_B>OmbqҦ.yX>)[%yxJP (E)!|9s> AT1Fa*v.DC$}ď\wֵyd SwaKZe* $!Ⱦ`7C23UO(Zɒq_x2ͬ%yeO]0}B@41YBX!"2!Ցp-'%ITbo *fSn.0`c2"x>39ʔR5^|-t;s@ p!N PX% ݾ kޮ#Lb49G(Ks$lVI#I1?Enhp s]"hy ymc&64@<g{zFIJ%Gjtw͙2 \47M@^ʢdMQ tg(d<&>ȴ|#Z^ѻ831?F\ejCk0|g=UWa[S6rTI"2=ҋĻ$UU^Wg<\N?(}9E0o}=ߺ|܎_- &i7nnN @PSQ8] c[K(4{N FE3,,Vn[![T#Я,^ v b)k 
)ZŻA^{.Fnc536$k5;x\*jM^ SHd9`?gRrz8KF7^:t0Hp7QiVEkgMNG`hPMóhZX}wB/6+.+G H I1)Zi1✌_8e:`݃lT *C->PVR*֝\2%,/Z^o+%ŝy_>V纩jp&AHzX4 8vz k+^P9RHkHepʅ"0 QQW6R90b~ò_!~ŵϋEzٶM=5 Pz->r8tڴi\?:'}}沟s+"`CX~E !WԽe xQ5x4<w G)\:8aAwė۩kuIz=`MYjޢ}F*X JWe3Q@PG(EV&(pJ:=Գ@1`'7\'T僿`vzp1c^D3M(@Hw0NmS&h7h 9z?>jqHZQdp!WecZK/EO*-0( +iWj) :>l}pn}#LM(Bez Isb9 BȀ/|N({_g5=Lm[ ׏6piުO~k@wddkIrk ˈ\rx2FgyⰬ'd?W{ڒ:);IJ"6Al$dBA!}yB8v tk-,B04A eP kga}nzN_,㝑zi'pATx9*&[#Y]&ɺa7ju<kwu]ޮoFFI !9 Tk]s(eAOAK3β^I"FR'U传ZE5R&Ǡw<n%3)8܋KFY,\/:.? Mh #bMt ! NٓtNIݱ> `իM/`%F`3:H엲u#<=ئSj:=pUeDL{qXm~2ÜcHj`#4#7̮;`<ݡ~n*늢n.kK5:{Zj"6nCf#ߥ|bhȚokaTQuz݈~6!vU,v1NUnKDFV"炳9dtAq[\"nv %yDRVX1&y]{_|P$TTt<3-f-TC*uN cAST] 4M"%4<M#(σ^KK2˅,q.cQڏ{:ԵE9+X=J┅5\֭6pWp )$iW(qꐡ"ҤY ~l hj01VEimjSM#LSe+4LHFͦYT+"vz[;b&8X;&t@0> <겚6m`:I}6#=)-(c J\J {vvR)ETnRلe[x GCb{o^7(rPjcVwR^e@4]zPΞOG$ pߓpTmK֝`iت^>L^qZN囸eʷ xg$ԍ_ar4(w> GFh8Y X4.ûLFr bo*z f^k[m\if/6hڴ̡\5FG)B\ԿƮ&<987bѫ+ٛNf( $@#⪧+7l =)`; 2dnOVA@ Z;5ի -+!mT,6{ :kBWP BM"3`նvB;YP XBN^|AܽIN22۲z|MJ-Rm#kI6" % %z,ɤ2 l{p j8mv R,bh `hj,vQXn!dqgea U m_^Xkq嫐O$ϭNo巋[mP2Ҽ\.0,8eK Cs=kVm7vzJArD㍌S+0hDft7kCmi])K~qlE&:,R&hicaЛ"źFnRZ؈y!\ Rb*hʜ50e-uᆘ"T{珿A%@6]ǀz4KzO+Z.J'ܴ@0kܤugo_tU%[ezu7%&{̃r֧h8\ǼF1߰5'͠[YtK }g` . S\,RIS ︺5v13:R؇ji̾M"Qץ :YKU\Iq咤,V\|YOu: M5'UAE۴J]ia-Gۈ),²d$+դvmRB f+6&+?eȤZvhD[^[vk"%a-?8mK<lZ:\^zn,j7\FT͛{DAlSĝxE"AOaf)ⵆzDiR3M#Q>aăݳ fi`+E%2*X7쩬+:p^HPG}3uIUɷ:h})ց:GYJUZd"4ZdհA3ɵز=Y1Ljf5լ 4V0SYEıoNs`3sX 4wr([l-\XZ/ۃR/cEEBkTExGLOaUe) ^*9 <#ct0LjLd(""WKBcjK /~8Ѵ;З3qznEᓋ[tK%@ `B @B195x ND8#9v xR,?(vENi0$_ir :;'NaĝSMOg$n)eMP~IJ5jI|b/(=A;D 4G~:_}Bxk[mN D;sT͓f Ger,f)[+Fw}_xvaAU3/H yvL[My.?o'g.g[x 7!ÏoBEJO#m&>fn 19˟ZnD\FWKQ"h €ThA{ͧɉ6 N,IUT`^TJ6} JK{Ic }V.RPRl^kTo==4M^= z?Oڜ(+];bYWI'#/T;Č3!P[ tg9?$\ `)"~URv'@qrK8oiV*#Y 7'cng9UƃukruSF){x*n1[j6]b4ʴ˽#[&8{5=wnE5V8 _&Zsy 6Vkq*3l{գgA~t{j!ljkJS}<CBZhb'Sq5y.4M[&RnwE7@JFFxlzr_=Ɔt[9vetlԱ /V#mFh>\4/#4ﰮ[A㾷z ̮IeЛoO,ws\a=NJ e<+Uk!Y-MZ(g[ҹJ:K7$]?"_H^Bn#^0$ELIr*խr֨*AģhcF~ \"n9^#Q:Q80QalN]33"d\2x9d7M|͙ˇXCTpYu^%sR:YꢵBgSdf!Gbl C[}(&槼ô_d+*C>ÞCmX rhUvbņQ?h\ldگxo,r`/͉rw i*xޜ`R"@й _z౺~8p8,$Gj(nS&oo-rjЬ!o0c8yE*G-Tt95ݿCwzC/%6ѝkL隬EC!n:ejgi[GEHpf6ag"*,4$^S{[+4?aMG_8 {4Mg5Ot? D3TD|Zhf>P%X=ۤ0ATTP9P5 `7QㇺK+sWqcOe:vׯjΕq,)&'G];_*NH|C!v&l'xZZY_?Ylhn,m>xb+4fƋhjiN,t@ ]9-E*fh45֗?&ԚS\ˏUp5<\m/D P'`(~l|{CLllOT׆d.–040`aK#a|bE@ aVJEv7ZlL:. hmtq\Z9d)M 2ah6UzeYث)ָS e.nsGUJ*Y*wrfɕlRrRm$BZcB90-jg F'2%p5E2ba?s 9-I6E>gC J(w:kX;m$7جM[٤WcQAG0E&R } uj!vcy,_fԻ,"kh,]Ԅ/dF }f!27gWݯiۣxۉYm$Cjm(hCc [Z̝Qy'[WRvm+]* n܌ⷅ^GF@B>n{J g1)"E)pwRGR'O$G6dQhP|RNJX OQߖ%$qu SQλ&ShsR?-"NxͷoũsKnjݹz.-$Dly}w^'IzٗV!<$ͺIčX0`Yy{?nXjW8t+%]zpiK.Gx|SyF "%J0k,|tmzl0^nzf֍,a&nATaPp:n_nEhQi@;샅σy`[68fzts GOe5toR z{q6j˅d(iiez~Е.aYh`'`miyz{sp_#6f44jB6c(&BӾCi7,~RZT}sEt;Ao_ϻ Mg#?^=Y1)X/Zu5G^ @f=b*Oꛭ_[R N+ Lϥ2S_gNwE)*q xOKQ'Ks !+bKF`"7㹂Zɂ'AVgD?Y ź1 . 9Ջχ̰ *@T\)LpSw -4v[g+s)% čQ~{j3%q¦uρiRӋ R,mtm;6,QC@B?VA"#N3VsO\R]Je=[% |?&F/JMϾvO?\鵼 v-Uyk1h|4![/r~j tZzuLu bjL{'ώ>^u{}??vGX[H^Mɽ+gvɣf֐ڙ|-]_u[;@;Go$]xV~YOчp6Apc#`bߙ^e6W&ˮfé^NӑzuѤf+`}([xxSl ]jW:ߏK3Ob3%IFzNUPh{ʼn}>A.dAڈJQ/ʍAvp>n uxMd|Uuh[ 6ZG]HP+/ǐiit2Y/s-w+>uL_zE3НNQlUv=~ꏼD}wXflx3}(9&u۽M.Y!oET6u>Nyxᢋ77r)}Qv:)sP+zg8tɕM7xRh]Ae$S?[;M֢>ursOsyon7[VgdվBX=j*Rߠˏy swE-jˮ=~Wnߣw7i_!$(rugiQphK:fbYz7۠fn\櫏\mr;[m%U43mJoKùj5bTVPk΋ǖDoP1b;rg>yo.F:\~[(rA>sbJv $ $45-^ '5-1qlUI;lGϒ-wOfY{;JMv>JwInOOeO|>6f\u?csYjaMdM(T˞p] O<Ѻt{b9=_=T~_Bmˊ|wܥ9/фO}pk72qLo돷=?l/%:-=&FWwxtcT -bniA?ZmFԿǥebDzDn=*ʍ"'xҗr/_j뤊x 6yۊV[j֛uX驢S|ř>,K=o, xւdZk~>W[t[* vFnM?lߝJT2Pl[M̄6 4<)>W~i/?v_bˈVj νIbUɦtd-ƿM6 v R${/GPQ3O}g&Ycmܲ`E!`L'Nn!DN][!JˏЗ ~F5>Ч)?vG7g #W„H#\RBh@Hhqj NU7bxxB4,9H^fzܤ{/pبl;,m\ I_[!4vB t ûW);G/hF !h HP J.v<揟nŭ»LzUnϭU ?>Zo! FزgOP7'2Tz-rJ4:䰾;sގe*= kSxT{? 
i30v"5&̱Qc,hR H-` ZT ȪS;fvBY@hС7GQ?>?`r 0%5L@4c2w2r5c&3R mV(׵}hĆM1͙0A/KjF 41̇5$'lQt,!:]*ΌF#\x~^ 41@%eBf "I`(a#vt@lVMd(PU)S8fيЁ)kj<(!i(EER H$MR]ҩ[x\0h/g+$mT[\D ID!!d T2@SaIF'"kHϼ9,%x9_XkT K<}K٤Lj cXEC>+ch67v+9CݟdT?ɂ@ṡx)9 0PبzNpO<',+0")pY$0u|m)cPflϗx5^c/8;M^wMg/-g+ 룲hI%z q ƻkX8}}rAepNz#t!lֶKgq*,g; Bv -Sp1Ym?cDzɔrkf^*M^HV/Qx`7o)rh5brF6 35XoafIǟB..ӪSko~\"X7Om(%\p9;"G鞓w,-X.ey]5āKMTXv@>0SNwwV I>?jgQ 2i3-V>ڪձf>^z,M´t6 '0 h*<yaiWe$L+)$H2X@^0٪e.ݦs֑]a7 NJd9CW9"тC 1Ϳ~iR0L D?3rnWa7ܙgp9CŨ|.J&QzAH-@j؎bBԹT5͜G!VChMg2Z;v+ KK:4X741$ >•'ఊO ?opg/㜦7C~˕b2_m#M uS,ѫ1壋cc]1 nU g   py^C^l*DT %۝%X96ʊ 2E&֧ͻvrFV& J 2MW_B~NyG!:[l4ߴRad:kKnȣH#J-"HB$($@J- +Sm&Bh֍Z&e+!M'eɳbEղ.2xM}`E4T4Y!i20ċf5^"HE% FFISn4V1c7f1tu k'f1{k=7ypm!D:Z'DS,rOn6NJ {2WnŖ2CA!vda' J 5}n\]XZmN5^P$BJHUIXqk791o[)iP0%pl)̤Bb-MBfַ~r8:@iLk"w`I`!j5bzs&㊽ˁQ_iP3Rq<>oK{wP|x=2f\܅5|605rS\31$Tynck_UWs?WV)\sEZ$,b۴g).'pTX JzrmezfOY?v HXťcRBDdK#[(v'\Js W_f>~k'&5=PX-Yɬ}-W}qD9ҰdU<ެK hżzvOTS J5i5?"z6ͳ.O55tGy]\H }쀜EV-JhшjB@OL']yIJXk&26ʺ/OVSp dŻAM nKz 坙z/AѰ>'Pz xa"&3FٟpXǭ!!槐>o)-Iҋ$KR2d?ԁt:?Z%\`_($-qitj1b)RF:E?u 29kX@= 7XiMtXc3 ’ew`tb/0RXA3'35 ̻G(؇XR"eHxkQ/T>РLo,go_gjpcIe:xca`J6+6lCbi`y09 A8adeѸ/6NSL ƒ;'vN@w/gf)ѻczv6 Bp@8/h!d'~`[d JR~ h75gD 6@Hx?+>PXZtn?#SxۺnC'6){ϧue6z3uq6DH] y=5zh۪꺊))wocF4pK!ԱK<uD)y͗u)F$NΛwLkJBF#P\e9F46\@hpd8KD%r3[wt!2#q@ɉ5NL+BV@ҏACӊG` !fǏ;!;MUS0TSրA ^F`fuiPn<yp>,7ҀsǁǾ&C;%AMrYO^7Z#%_biv}\@{W7YUnPla1dS.|{ ҟ) &KdP#YV0{|t'K81._ǂW#}#O)M6!oU .VTb\]FԇNO/ 2p*0 VJohtXrlwR ';R?o;~7r$M84WOxq7DVڥ f#џ:$}ow\P:?x`A$qeY:saFt">IStZbkxSo)w-TӴ\HmG5D(*́p] ¡D޷g7<fp+?OLaRP$`<nCn\_ n AbP< ؙ?gie8 ;mWfXT~r<(8@1ECӭJ@:CJEL@j(G OZTW$9!H2dIF* +R"wŠ Q{A #"XЪvg| .*D>քhD%R"S0|A Uhoik|Ӈ*|?`3CAOO/H'fDži_&J^Z%(@N{UE})\ X(10>EAqf( 9"̔~.A>8o2{eoOh0O/JPs{%%36z<4ƒ:|ۋt[Pn"l;TLD4_ 4 o6l)y`yB.K ul$7@yE`w\keEdw^C7-#سH! h0 L1yC'w ā4SCRNx2x(- l1Zue r0V(q;A]ÂpHݣr0Ff4@s@?Xq>0 @3Mo18ۛa#^pku7GHU +L_ho"l7(;hgud<ߓ 0c\ x?֭xzhe;3} ٩'3yl=]ɋ̜橈 լ @:Jqf@6{MYx##qwD,+FJ=Hhn_5. ̳ b["nVjob3C9#qP74(x;%3K-$n$~ :$L_ࢲ?]S.3aC|pqtB;^p}Z`t1ҤS:1Y'Z2X16.\1 ~Uu0'ah8yG|6ڳ 7%c. $"A v(=n稇%]9’1-TEQh6SD'! J s{JZ"6&I!@27+bbWPQK9u!] 
8yCACM4% G?)KE%)SHδj#W ȦAs5FQjXTcZifuWU: f+b2l"1lF̂S$m;7nRc_݁Z*52٦LE7޹Is(+Eӕ j5Llh(&6QA1hb""bdi#xݽWޫRlj5꺊SQ+PIX2j*HcE6I,i ` ֘4F1aM&HZ3 c6,%EA QQ<J"*m5э@&* `ƔѰƤب,F5bTd:T%c.Eg{hۧ]1%RbkLdѤHhKa,Th3( HHš"RTF !̿V1XhobEM$̔)1Df ~7[ķ{/F׉v&]ِuV*6֊4mhF;ֺ 1l%dn!QPwu4b-\5㱄xz d:Kn(~ӫW{Uvw&nFjAILff5J~2c ă1)lQ] R1Bc s&Iqȓ fO/(vwhdڋ&u:1Tj5TbcTHjIRh֍c,b @[O7 ^-߭*7^&611]RQdSDc$E  i#A411$<붽,Lb R&^e/7,ߜ4kFذ#1T] 3!I'umޖ~~ ,DhJ McAi&d)QE_q&e346Ēf$TĖ#=dk4Ѡ(5&KLkv* A&)$ь űI2DaujW+kTjƢ+E53SU,TU~}wdŒئXB#bh)R$Ա Q" !l-IP"RMMMe0C$` L$E4H*f"*F&5,D1MiRr߱^^!FJli&$1cacJlDME 2^^EF5X;ǾTz_^b*&)lQD1h50HRh{a4e ^ JBfe1I23cf,:WOᧆI%1Cf֘$ Dݺ( lt)ҚJoܛ2F FqfI]w]߭ӥ~״VX6Z,ch,Tll[ui5mdfE.AeCn{{fDl L=׃]ӋԤDY5[v{^7FEjx;3c2A4i QD@Y2f#aW}}o;wm&kޭ4I9rc,,321mXEAJ"*af a!`IĒАL,0R "R`%Huj66nLlF_)1ӫnܰ4Oԃ9,b H( In̉%RmEJ i1JL4,S,hB J6CM%HHI2 E2ĉ6~¿/QLP9nb[TU*6(tF6- rcLٙ2E&2hHXNDHѷ$=r|nLI&LoKݻ3"ZT45@&HI!X/;{QfDLЄ3H,%&bQ)0j#6@_z#iI"!I6)5Qij3"M2PK0"zyA hHϧx8^;\IHPJV(89I RdE!0G$09$~:T.7kC9fq|as]'Uzs8=M7ygM ;Z̧@P?{XY柭T_T-۞p;BC%zO!zgN&JFgkgK/ߔLj.|?և 15KP`&>VKJ92}~o025@dHQ ߏy{vim9ȃ#jYӲqKP}TQe `/wT/Pc,~E$ " DCf P}kcQ-pf}K'ň?nxWZrm/.96<@Dw$h{و(O4͕ u#mHG+ykݹAʵ$ @.[ ODž TKӚ s Iu־d 7l0y)cb`:ڽLfN28 TRWfP:@5P.,bȏSMʲWMa 0a.9oLѹ[D;s?M}bh%^>h .MSX߸`׍PnɔIY/̷@!b~7;jD6 AuHxpu&_Ayd" p S-Kv90Bz8oڦwiPg<Ϧ,E%9c Id eSOۺ$ӌHOqF;0.À8.fnC0f G1:NGxvQSΘ`ۯi l3N mYFc1oyqzm|^CR%PO&JLe ,i č%KT4n<ߝ %Hؑ6&Οͻ XM!1FH" Qĥ"Da" f `ɆPt,͈cdƉ1EEFXeL[F63xDL`*".;tt- k/o|dYLRfxpT&vEdj5}Uہx{n̦Q)(ѯZj@B=%-Ä% @VTտ?zyHPSKIWW@%-P!?`Aߍ%7rKJm w'g`wL$*t)VDk߻&fi ~QkK0&( !QcD";nJQF1oQ0lDZLDh4iF&4ih ,cF2d5FƘ2ewwB F11P)R;}l]]RBcfF1jF㲽ۺF(MfQRE7Ζ[XiA$JH " `|E{]b~ޑ$RS9Sx'Vu@;mW1D!eP?Y># }hns( ]#O ,CZc$!AT) zD~(}I0OX;FS ? ?M Ҙ0!>|5Q(}3Yo"O TEo>YV(mq0#{4DWzQ{|ڊAt@)v4?]@r\{W(Ζ2BBkd> 4}g ǻ"_aU3dq!{,:D)n%i.M+qLg=s@$_[>B""H4wtpJ@HOXz/[rA9` K(D۰MK@aC㰖#H7._js9Hp±O %H; ܨ~_@4nF6ȯk_ycǹ6Y@=Rg:";PFpl^׏^c|aܹ AC ?4iuC.4>H%RLCɰz1tFdb! %b l^Kӎ1;ap`P70BEtↀ4EIk'e@c?RAנAU-AC(̅HHau ۚX}l俗2\=O,_{zy=C>Hƃ,L!  PGl]:_y]O_yO&bO3>?5.1d~HS`vE(h<ؘUfKV'5S ŰQK'1g^銦]&7_ٶ9鸧p.j DT^Y9@'LKy׌ +_lhEQF5` 2-l2(I2 sj @G~Lt=kcp1A͠A~~[!`<2'PwoT\NNC Up _&͋2+CޫPr%- !A: 8&CR8u< wءo"8z/.OҀ)2<. a .79f5{4<4,1ѠRl-GO"PscQܨr oCAo!{bL ̀ Øw1nɔ.~.>Mw` !? ȇژC&#.fEXAHP,G-K0zV^/ aJr'狀p%@!΅bA-;PNh=j#.!l}rOX>ўp>00,2X CVr5@ѠӪw&T9t(һSgյwaۅbG,Cš;0XvSyWn ?%]8b9Cb{ȗE0[xчH 'vj2r B-WA?GF .0g TWVԴ?QF*yhGZ,YtlGK"P1*:3y+d_9GwzOߏ`#ZlTPe?Zx6'cr= h)! }768;trSIzlCrN4&_I5"%Z! ոxARdRB ɔOc+-P2LO2ܦ2[` ~ `-SnHQnt +2Z;u07bˏ%GJAo74@4lc]9H^GEw_/^_]p|><߳>nZ:"1dsuBf&*Ay:R WL>rle X[>wN /NJˌp>=Ya $U7EGg>b֪%M!Ѣcf\m裊7S$Yz# 9M [ҖC% ./ cLMxY eS %Z,2B @MQJ8X0Pv$:5~jM=p9P`9o,[kdڹ$˹_ﻇ87Lr}U'=;Cugs&Lb :㹝׀ 3Rd,qoZ?BH@8p=E Ӳz~ǤqtT o4U 8HQh@{o>0I$u:{YAT!7}} Ʒof{[.S=.i&J4y ?X'n$aRآn* T?/"6F^@Y]LrƖ&Y"G9z]TfqKy<Q  4A>d:{5։$asjm- ZonۡsB9j.řmr8!"/^@)H$:&u?i>\ɫ~CPtZ9Ly~V%^R6{V"6\IoYE7HeUOwfѽԏf?c02FdNw>}׫;׷?ߣ}PAųJ6!@-v )$Bfs2$ed, ̛Y2#9,+2fAeV)VKV R0#Zch-TF$KTT6 kfS 1. )įsu"4l7tOZ=PeLH1$P0BOH;ʲ! 
.=/MbhAm dذ pX1zg^5i)PJ%v,P)!QU|jE0bjPAH>Pc4R JШ(e7P**qcNmb_K_X`/@뢏H8+ksק  o"kdzݫf.g{H?io?V I ._ըG ָZ @wO>0'-(jrMSST5D&}o;uϋ-dODxA`_m\y ߰q+ׅ;S ihg]q[~w1ިϋ 7sCX4rp֍uCՎ zs0 .TWUeA?j ؅I A$h=u.Le60 f7I$*s9>~p1^I$(P2JU@DF')D%E S-Ih}=8HA'0 7ZI`CC's\3x'@H{8f'$8㝂ӺCi_K{wҕ ӟW[N0P‰j-4{1d*Qwe~ $o[gՊZ Ls}MǷ4zNx༴1b@} '3{O-:]Z8fvsGOp-H!"t, izu DNv&Ol agr{{lg20]-%0eHi k6mC1)*)|ux匑ˏ|7ՔTR7lBE*(/?W7cDL]%7gU se5GzelX/kI L I*v܏ϯ"uK"LP 6C.qɍj fSk krUWYLP:yOk#ԌƏxWwZWwwdm܎^~RDDDDD[j/y;N#yS w?A&*v5ϐl%'k; J'˅0u3?Ĉz̀ƛ31u׳ @;!a3a̍r븭e.5$cje6[W]Qamܢf >R 3|~_;R"at2A%:$X KqUGT IaAdFFBpc c`UCMl5x2hQ2E6Ngo'_ .oAӯ|ʄ2YCy{CCӴCX/W8G-QWxwP ;?'y恭T^-Q1g}OGL Lo_k-Gou.^?<-~+4~ˏ1Ov?C\zkͷɯ>~nD oeߚ[s;((>s(!ۦyV&>Rq]5sv#7Y%WJ+vy0>)݈kX)78ˇ7/V}@N9/Nz'5 +!m&= °!VF]Tc!I.Ggm}#5$fX4!yZr2X,n:s=+TO: 0uxMQAU-*hl 1ا5n(cٽsskF&wG#lf X[4m13InFj([9w} ZJL` 90q񢦐^7b$|/ouo>)*OFTUS]&B$Et=,*N7yG(>?7A;P."U W)_ABu:] m9F$!"> xvn\m_-ܪ;tm&q *I * 2 BH0 78~_NT Mx^c:uÅr-Ңlir'-y$p3L8j}fJr[CdeKAs=m EBC }@zJWV5E1*hRb i]QqR*Tҍ(nc{IV "h!oAiYa#ɛS``.M[T5emz#"BOP-#t%l/(?9m>zgѩ~Yr!!|3@ 9>snMݾ[p#Ug Vo.2G{އ3k0o¿&?zγ栙k M^S[[o[< g]>(b(wHT^Z_3^5D~w|LH8(gv"Ri{G@ Q&>IO @,^?>qɕqu8ˮ0fT1$&X$ǎnP3l:C)w'eS\ L8sgqo^  aV3ehF6/c}UVt+"nZL;kAd F{l-}vm8wQ[-*%"+\ ΃ \.m6;9.MK6`VLg}k .g824(ǵ}$9LE*NE`DT e.?M ((B&Ѷ(6'QԨs'B ! a $㻏i%CO=ObbrC$ң*Udt>Go [lJ0=kK)kzz$wj6t/mo؎&[ bpaB3뫩UMu:G:?˯ y$$`1_S _i8/yo>5\/9̈́1MmzL&53\| sQN{{~d׏.WF5K,ڮX]bNVY@2,:JYý=˕{˞e{X[v4^>SI‘V?T-̜/Bƃm6SkʰQuqnejm1sl~ Y3 0n-(8n\/â~#ϿGޣ1{KS,Xk Fϕr9f[4 J4D4 *hǢ`7=;tQ>/K^I[$?Ǥ4|@(x|G?L?c}7K~Jْ"-KG|>j&جQ[4e2RlI Z4[6^Qm* 2+^OŵZŵ-lU-DmEVMƶ,Z-EEEZ5kIlZȑj)662XdVF3465e$Elm4[FSbMPЋI-o.,D IJJL1Xђ2TKJd6[$i$aI Cv?ǨF2?:?Av0dƢ($"p-c /nlUe$`'m6mm 4EblQA&RW:t)# !%$F6JfmFcV6-ш)}fFB=s|,iRHm93!д`RԸE t:0afFQ lD C,B5V6j!"6ùС1Dͽ8F2XLl-(!2 B deLj<]n^%b%bi ,3[·+R&^ e䪹W ) Z65i(#4lZ@vec4#Ȫ h(CKUyRvRw +^$K)E6MX+F HLG7FXa fغi p[i$޾x/5d#IhQcR@Fyy| |mdV KbmPeITJ>uۡ9C;vw'dFGY[d`w?[U鳹[DHoOR1PLyS愻`aFvDkW37oA? s1P$SZ8j!X &8QH9rӛ3#2$>/9cF"ƨ֍jٔ[ETjL99*(ҴMPBFibS~4]9kM`#Qj-ZT??__v |ߣ5UbZ 2JOJiqN<鶎*D9p![:H "I*6> ^I^κ>yAl=٠ ބ.3_>nHC毘l=|觧&AI ^hƙ"8><3=?q~GѠ&e3ECNm%clj-T${$Xh7PHo b8Kgh Rd9+(%2kƣm%MƲP(VCgs|7vCyhzD`w7zl/{1Ѝu(| *EѤ T*6#l"(NwN0l 77oNr G9RU3-& Fl[RFc髫#]d!]9sim-L%1{JP0ƨXbثM>(بب"T(O`If(6qWmo!/W"󰐾N76y\5 m"9G $ Pb"~)!Eb#X'/\ !pϒ'r"yf4F2O(M(18 $"m[hڢ5|_?:v_5&͒Mk߱![2Ơִd&NJU %  Э,H4yӟOzDƋF$(R %']_Q?٧/:6aP~.2cq0k}W{lXLŽe)6,QjKFɢEkX5֐BK!Za9#2[Y >=_B ZƆ$(mTm@q}zI+ϕm <6[9D{d(Ȭcsݟ%Bh7~ュw?~ Dg$EMAR[hV5W>k }%̷ߋhbMBQ$TiIѲYLA$RH('b>f.u " 3c[y[}g)=D4r @Q C Ȫbj_6W%BZ%2\*1(Y$E1`WqtŊ j5uWXŴm[lB#Ids̀&J " YU;7m H:8kG~֭(Sc)Gb-@?t;lڐ(dG b$$mh͋(Uv:"N8l(p/e ub}@5=q{8 ? 
I6AGh\jh$jZzB31o$qﺾ׏ޮ]`C $@7';[0g;I Q4|o۷_)[u8Bus9 +A[r7 0̓fwfix7=OۜȄ L M7}u7HRj٤,}{RV aʓ*(5c[IDQ&(K$Y&M!`i$fdL(*1P%Ehڢ5'(#,J:9 t  _EpTxo8xɽ(8=-o|>έQy 7\e`kr<ݼ>DkH$O v4A Cq*"˲u 6}aej%uIh?Hh [/9샟>6yFs,ȒS>hg0@E IB&Lɓ2LTbhf@VҔZA%[Tbmbfذh$ReTTDF̤kef$ѵ4I#%֗+⮔ H_xaO^c (m͉o6~ٳy$ }a}x & K癬f8OذբNDB؁Gfc{?Cg6V6hd2 ; ylߤ\l4 ّDfM.IVsf'Ktө&m8?9 9uo08A0kК~W*{O@^DwہJ/ ވe^/gg/l:9J{B I͒@>Oe?s+(P sC6PkX`K%{Q'Vw=K=)0LR#_1Ξgk/[uЙYbGX adX+yObZɾbl:&%`*~D 2A"8}ڿs&LĂХH4:oAW媏qb T%dh֠&LP +M߰m[@GD,iA $ =g8A֊ qR%nhQ2hJH1j}>#'clj*5j5FD1mchȕTdMQZ(DR-IΟS -n8~]<<3AA\C~u;C lF-R`6Fcc];5AXbhKloT]=3KMJ5G[H!ȈDzvBbrk $m{M[@lI ;?Cʮ燃[eAIM%ԛ߷֜D`$BGϤ$ XFu>b7cõ  >&1Β:|5w}/[.;_._aV>W]⮦K6/#{98[vВ{ySࠒ>* ݔhDA2AO:/sr!IC;9b?.P#?;WF/ 7=g=<;%bI>5)f x٤*_~$` 4x8=\#izh{ x-(TVuJ.o@,!**'/;zGVjF1cjb6| Z&@qdݟf sXbxo8h;Lx~dwfEadue9=m:Iޒ鞧1mmn=J0hѓ8W ,ʗZ\S`q"CJKzs7Qd{sCdhy*Zd'k х1I#v1F}6x-}qڢ@ (=\!^JcR>גHU,OI8ӥ$u>+:¡]7弳!e3*s@6/3U0njx#cXGy8'DV\|P~p!=I坬G'x* 'y!^c|ʉRH ~Nȡ$`.2w4"Q\I8'qN?Uy7~<IA+#C$HMPz|PM<ϋ5s^0<ʊ?r9#Y=N•/kO|_p3o*ڋMὈy7>'Cւw9dz.8 "s[TB+ 5+?*`MƑxHqk5k,辉NAq>OAVG_]G_%ifalͺD΋_ǡAxGHC@!#\_.pE?/4w0|hDJ Cobpڭg$$|~,&FM;, CYoZ Cz2*4'z]ېMa}r=D)VU#lwaUzȬ<_g5\̓%#!GŠ "=Q΀' G=P3-;?:ֺ`o--N@_Ľ([VoMQ3ǽP5<]ٟLnlcYwCM-|@ΖEUЅ>2RRaؖ^GHm v]>Zp' mM{Zesw8 hhDIW&FԂ  |aХض"(Mr|ҝ&#b"E87fYdt }$HWnEish,gwgr HosC)5& M*7h#@E[rAXGcG'eo=V/ 4UvᆒNխP7 cSYՆyFpmJ)$R٠%9X97I#8>ȳ,5&bc,lHEKʆs1"{nzK%&[^XorQp fDJ0A0H,^ef땵AN }͒ۼᓮ{J#`S G O Iz0n, A;&(5f )5K2\J8T :g3_e;m>"DcxokXt[хV 88S|m@.ڽR-bnHdSt4^>( mGc_9wdm# V7e( ,՘!z<׭ݦMQ>.}k:|Kd#r-AY9o{Ai^/EPYJ*&T嵉N&e֙ǿCgkh$znZ3"$2<Ą"YH/# HUWR~ $Q0pCeLتx}˰ؾ.SG@sŭ0A,xp!-w_pr_O7D~5 =T4E\D$xp =?=NXƙ>@;ΚttA}ghx-Lruف@A8>>rԤF4 uY8%ƪHHdm빝!f9yzl:!P઺c* $E()أ[EV1؊YU%SW_5԰7\]zÄN𜫎n'ED?\oGw/gh,Wl"E rƷoƁ}4T :r74AMR.P $)\+?njL5ܨWՃWo ɑp~1ʒu?cO ^DR2'_[_$ʣVˉ ѵ@}o0ӯcnIL#Vs aRP\Oc X۟pPd9;8c@XEf%l 7/d%cKa~ "J =k(|F2eKBvkZ(2h&7CTQfO{8i㻣{_C!9tC?u?~7.f듽Kwm۳G.mMs]qNcy[N3 Zw}2tȘ5/QMZnV:AzJ  *=F3oG1 xt箫1wn%*ǧܰm!Fa},@S0c\wR>~"*o.m[4cώQZ6eUwlt[!siَwvsn!|K4DOMz2#4s^<™4=xG@g3F^<4K$u4QQG'u/M*T.^$@~F@ͯr}\P>{l!m?"dz=B߲p7'WGOɈ=B^W7w<'\(=6MDT1_ob}z}y5fHIKT!FH9Nкb 3i  $a1dͬt&9 `c)N+iZSIQ&RZbvm'LFqj.r8 "Il`р*AԐ^ sk/+{y/kUSD |v/'|޷R?ۈqF~N($f=2]l t94DhصKm.ivtK%T))Pa@a5 ukIJ ° 1)R,("h)i`M8eS#gv'5z` $(kL:qcdsu &4U?v91bJWk}U])Ɯo=L@[80X0P[pJ&+i1WYz Τ?}Ӹn[ԢAӴ~o3@?;62Y96{9. N"uQ<>?_2E >?"R}g@:@|3՞ Lq$2.- G?+W_j9}_}xDE_5~IY&_Mq\%ߍQ!g훲t;HmxFlctRޔH,; 6-h|xq4OGkv;(!>?SdCݖXAB(;=ZG!jEHkH ?Zvk3_sߗ4B.}"sC!$A ?q՚T0y1ȱT>Tkt_iW=QlF `Db@0@U tyoVy}Vߙ[/3?9tlH$IbA[AA#!G("nbO3{u=U;Ap%~l A5j&M8K)iE"Lxs~n?i/w'{$DCoItm`,]CA$fus 2@qjJIe͎`&9v^>.CpQ%W+t7n~w%Su\DL@F7sp4Gfn%Ã^AoԵ&E(EEU"pN퀘E sQ* ]A(&mI~NTN(GTU Fݵ6 3}us0:dL*/Wj젯 mЎ»+~N׆|.lS=AVD 3sTr9 pz(@Rx~aZ*k~A~ _h8Iswny.3.\F4IxPW;uI累ws}&mX.X vC~/}j.5{b63Kjךzx0`(MW\R_҈`6HiX oOy܈Y)~U@}_y+B74~^aUC0WO>H&&Hb6r"Og;V@#KQ?;hAW~=PB GZѬ kKHc8j'wz.!anI8ymm%!KL .SE)/$`s %>@!Fj_sGߒj;U-IB$~҇aϗIWNսTZe;;Śn7!$ $ftxB;P)9,"ݝ:!0rE%P4#R NEfm&ҴT mUJa/PBF Dl̾ 㼟s~m5IS2+2HtP+"e~([C4$AI2,baھu~G5uby1!*$6C{߃+e,DVOd}~/bjk GH+ooGeZXĢ +mixuw5Ƭ&@•cMb%Ƹ1&{08 t2o{h7_7;Hv!uʹ~)@$z[A?$Y6k~is̷yvDy-׋;.ħ(PO̙;GM2$^*rrrtVwvO4d MNmArjd&UFcƲ#QyFZ`D /ɁUQ&\u APdVSݺ Bd 9% SF!wfy@'uwg.dtQP$}>h (&I u>/;BAPs|N;QCl;m&PP-NZo}癋Ît,3ppªS&^8H cRwA?g!&t837ص_XU䟱iw8Q7iX OCv~ H}¹|vv¸rpWC0߾!,tq{472559oWUp+R]$Kf&c.XFۭã*ѭNJ:˭favt&k'Bq& Ȅ98D a5~J) T)hFiG0ޒMU @$pw1(=om ^*Q_ }4+Rg}/(D9h|OAܫpXy^?o 쟨i9'DO sǠ\Hzqŋ{gs\W-Ba$ HM1!>w6:ΦۡuQ"f9@@ЏN'2P\#AC_>Ԍ4̂ h/j:.sٷ]vFʡgaTP*ZA IJ>bIeB-Q&l0HU^Jۄ3DLE/kGPegڤ6[jm[(̫.߸2ҝ_ش$>LOB__?s1<[qvԽMyyx. 
䃔HQPuxԂDAH " +W5kͦZj2,IBc#B"nbE#7Qu@\MW>1/e}]Yr:{g3SIX^m3ku=YODRa>ePsw3Bk4 :<;jvE,;XsSp4'D{ptg&aS6$HA'jG35h Fcaa!(^:W_m*H;Q>:h\ P:~˻xG=EES_!4qoH#P Zղ[Nq0H -"P|HԀP"{3XST_GhsV ,.~ `$s 1.+5} ҅ NmxƊ)Q3oQ8wL"p \|hnUYW,>V|^Y87蓐mHe=73R Tmxz]$P$7oo P^6n9]U? CtV]Ϣ), aiŁ j!>?.iD&>g) IE 'YcDPw\Om~%±h^31aao|wj94)i  z}MIHʍ@%^\?"C֨/ϧaCZ![UGƱ`4gjJE hDli{~|FYI;ouBgeۦVGa3܂10&}L'OΉӱǦblWqgsfSwi>FB|.z>MݿXP]7@󳹔H5L@*%񧪸^'?΃s tQ!TB"Sk!mlו>glG9]%(hZ@DD @^x!2>LRR!}7ɌƇ$Z"`HCRd )HU'u(TOo Ҁm ѯUmm[q!@_QZH _=Η/PF> j/OCoՐ8~Xv}Eߪp_ε͚31793L?Aťq}ƘKa~亻.15sͭ}ڙB׃_t ] |P6kyW؜DnP+wߟHnnCU<圇^xlϡ"ݜ #ش8!5X EGS\. rٮu'w+#WC!1=\A;5W \eƺ;_-Uw[k7u]jZ*fRHƒC\<_ "EX憦-m"_4XF b&AA=7߽WHॖf$hv|VyaC8= v)'ӱr~{<$0{ښ ˞N)(ӛu`=Mӂ z.F ҭ߳zm6; GXS  eZϭJ]{+ݼ|LY+Wͽ6>g8IU %M ߋz^DrAy{8I85LMjl]V{&lxvh׷Z' LX69WG~2LNcCi碩ү00H zֈHX(kN`۞x %ww=T6}6#[{UofL=M!8=v6c;0Sv/Pz$m%t:/ף6t<8P9`?m@j X@$[l_9KXdo \lqqPkv\$˖SXs;7& j?wESMG+"˝\׍F W=~c^Tm!BsqX=΀x=o?IbN:\_ֆ &.]໠R[ULcGyư1 $Xb с.#=8kkio_MzM-}sufǛa"GyqKj,KveDHRkߛ$ude%~sU[rp_o(9+wzK) 9AOBJ~dx'BD(ړֹsh:Gj@:o$u> K tEW3}΢‚֥ &u@xEao.]}OIEHr;M~W[YU _Gg4JoIV}ހ+CNwlZ4zbuE8b,e ` k?z+b%褈;#GcVƺ<}Aic4؍U<GנVp[ I$Z":[!箶_)_6C_a._$@xd O[oɨHr)íe(;.?۴AYxV'슇t:@N^{luqsxb]\݃Sj~nk( aN8N2*=lTB\`Ik<";=!EA$Q v;%K0<#:3k?z"s`og=yC8E[w-pjI,bJ lO['4z&Wў\?ʺZ/ݖ"Kwxys ~U7&艢D[k5} 7xݩ`5͍"~ΌRZq镪&[`ձ\ )j~,qiOY:WYD"6|\ ',= <g1(cBE$b 2(PBAN^ ]U^w@iR65^XELLr!Ҋns7~LMoHݭ$ 3AOm߽ml}qVC1? ~6}l $HAGj$1\b:YPد@nΫLEw7pc0}xzpW\aoN5H!AO8 s|W.kzXwyow@A-K-emf,O/u˵̾Qt͢g."=^awY_O]z rT3rz1rl%=A; :KbZ71qCX nqzۈKf0~\Jwb\ ?([!KQa~ۿR%vyй:?+^uǝwgxmyυƟGuW%=u~dzKezUBT)(H{8Z K~7AŬZ=EoyDNmK(€S#VX)}g2P;O4D@=)Nl{n! 0Ч6R 8x&luΡL h( 9 Ѥ^WD H@hRY2kfƵU|@mP 0#H1)@IX  %*L"iPD +i@9@ q~3X$o2 }_6Fy~,i'ڤ~=7H`֑JL>lqL/_V^cEɨON=T)8c!%"5'KsAё4q,9fe+KWTDʊG&=Fh3`wW.o藾|z!/Gp@ %XlHHD`X LȤU(.K"+a0 'ێEtAYeQO#lpyZ XDDoB{etÁ F?]lUvY9"ŏ6H_"$݊|Xzo{]zq=l:A_m?d[+_K/ wD CjMhӄ+;sx$YcⷷmoۇĂ憁J:3@pt!}QhL75bŒH[X7?^&}~#W"C쁕?9DBpB&7tWtE蛱1s`xdJTBxMxLzq[-G7`>-L ߨA V6bP)T;wf+jDt7)Q~JQ_DKwV>F_8PwugjCV ǝ|ֽ;heDCQ`RGE -ٽDL\ʶ5jҿ=tj)"> je쳫4Unёiײ2.V!n镝7}gBua_XI@6uYwquu07g&@rb|*ڴkW 6Iv4͇Z} {[hE46:<ǵ 6>?kVIv8᛺*H T]M[Ŕ(*!j:2Zy(}9}j>&{ ir<-79v|MX/^LM0!q}.04rrPB8rDRP͙a i0j(~0Ƈ!Oi~a~} $3T?~/`xk~[.G گ#Bl$^-{biWDl޵Slppu~xJN'e*:a3H 1i,p(%)ZJ,-"|Aq W(s@! ,($PhR2 @ PF􈥐4)I"vÿ㺟"uz54T8+S.܆ JUF`)Zjh2k$U1rq81k\R[1/Q?-6IQ좱n*vQc6 r`Jlzu)wFgUƮcw ?֘˂rp֞&h5j cByO5 -'{6倈1뗷]Vf# yhZ6EāE~=ڂCxC34$LS o#^] caPɞR%CHbpK@ B2 S8]*H"d2Ql.clP.!%#4=YvחM+Q ᱙&-TTD)&"e5)b1biuUYdGDOeru1/k'|7Q!: {A9b"H?Es}4(ۏwժ^7Ц~Klg>rNt[y_hfR1+1yI8@|Pq]~a8b@ @@S3J< M+EjdQ]jĂVA, UIKW0i!Pg 4k:d\DM΍ Bx(uq P U5@r" Ukʃ 4';Ā*$$#RT0@+=׌||~2.{d#Yij?!C3e}~l-D4s(~|-}/;Sodܑ`|DrHp[{:_R @Sdȿa5K>NZMPqaտ˵́ʵRٴ?#zc POo~79gwY˫08D*c'nG5{i]S2Zq3uv( *q!#&4@4TQ&^PH:8P~b^b&1a76.{%2wL*IsTRA ēl5wY2'Ip Hn[#e|c\A#uVQDmax-1]DT^77$[7E(_h갻*g*N<[= 6<4]{C' '8ذ)%᫲AC<8ts]źƙwC^UQ =7ٝx rVD_{:ASDÛo  n.o1Xc0W\@@Q٬bhD(? _Sy(O)5#Xp2J-Dl]lp1Vpv{(as6R]+j趣:tP#Rv}GAGO0cvh\)Y5 @biȧ* HC;O $uwNL΋MN0 N֋άr33NHh(V]3DlNź73("뚢4Dfiil<@G 졾ĝ>} 6Pu؝+۴F^ _ҡoT0"#^XEj8¤*m}|``T:b*(0n "v9&C f H\dU*9¯J!+}k?DM؉=$](L*P1ސA?i1A6 G<|2 L_JJv[[ѺOYH@L5E%ƽdXRii)Ƶ7h>R=w:_U!D҅S2PKjh;DQ.XOū_uUHTi+tlp䜡X10.aZW`w QyBiV;ȷpdλ<ݫZ]|^?۱5hS3о_o}DMٿUk)oW2>nVeW"ǵ"15X1HADDL{{189^\c}8R6&O'0tow;n{?2߸"kb:]8M?8Co,xY!Ia%`&QI[GA SbbZi\Վ+kߵ,YqFU)7[w; fvڋ6-r!`MvJvVmS:j#V4,q0JӲ˨h=9jSvw[45k+Hs]hjc|ʐ!,WK]|+]gJ=Ȉዯk4_Z6(D.MY 6f@ũb[42vXuIZEr8e֪ۛ8@(8lr6EAЪdcJc[@VhL{3Ĵԭ FIQHP'fEL0.ª`Ռt ).r6m1saZC]M!,ZYR\Ƽ%SM_rWSBヲkIHh};Sf7_ Nnv9oaZ2t;|fΘ /QZi1Sqayw3vÓ_nf^'-J&k6v>@mG$EeFAj'2޳/wV->?H>h"\XB^pB?\Ng2O۰\p&Ȱ*OFkExTL-3_9+<|}Jɠ&F k$5RDL  ;?7M>#ȭNOJ=Kg@``76Ief1@wP7wXBZB)=nGVkYjO& +cpr41-҄ knk g>?O'vπfA ~@"پg"|x^oF! 
$Mԫ->,iĉưhF@ljbX\ ar>@%=r}ҷ~ vzlrEvlmp|/17Eѵ  (4TD?KE>E%|EO>ˆmտj(GB- |.#}Y _̜NJ& R.z6q,!u7#R-*[enslѧl"fPi!40'$BMm<[ѕ\q^ڶAhX>0PZ*ls{oGbSU)H=t>Qu"RZ 5(`R4&'LUӂa}R,gpǜlnt &H&EQWX5R asmo-=XΧ⤃3Zp'[MoE jQ# tZ>VYqu*;GNO@ȏKTGcy5߫Ou[Gs(i1fh7 41X7o %}_ m<^憐y)( j&;Nwߝ+q?Ʒssf5%fhxCL$.(m7} bUw9nrg5b o(CQ[e3O`ű0Jrn52o⯒zOKz QP>,@)*Ԁcr'$ 䈁}(X$6}F Wˉ7jfGö}18,(RIsky&_|觓T"#~71PAaˬ⿞*tM1"ܸ@&U|r_MŀPk$GYBhqD4G"f93OYs}Aww)z? TPY8F?Wl.+~_Q»~sw}]y;p8=]M(]d69̘yqs[f#E+Tϋ|:/a%}sb-WYQ1\.W~H$L8h8䀐3fik9")R28]X9,x%tUV> #\5Mi 75=wf` 򏕴-³~ZJx;yȑ?1[:9#dldRhH ɥσkHoZX%ER7J2W@ۣ̏YLPsjٵb8zppv5ڏV$ܖ}/wG~- Xi8>1C j~W.oTۏS\0ޘ5q6mUReX1Le+Zjl"s-W>,_Bq]e'61n+ݷ!}F4~gt4WvnV67xȦ&MA5mv?"i 2= S7KE<A"$ksQn%xixtAҏ.:%=GWې̾2ty Q IH)Vظ@qKްSRrzz,KRH S|uϕ5{V[<á ,#fz~DI [Su:<ۺ, A_h ;4>G\Ӡg.{(Z\NOr!BJ6 >!g+9SRRI(o)- 7ɷwF)(o7:/\Z_rS RXX BwvVwKxF#Պ`)&Pl"ܥSru\ysrRQ'P9^:8395eށZImSi8.U3gI2IHnfCĺa_ ]5Ae栍FHy&m>'b[_Fu6Ck+ J_Ŕ CAZ ?d642M/HLܺCMc֑gZ읷})b0m;AyN[68w\IK EaЏxLd4,ٟ=jIigė_!{p|?sG\3!njǭӎzЯrUۏZ&d?Å`?*3P0ily[=/WQ,rS%5/\d皻z~\uC.sqDqPʃBЋ.su4F~V:O=wJƺ"Rr l>M_i`"͌%`@RC>ԳLe4XQ4(w_Vg;9s}_aӢk'6@Y{ڻIi?bV!U譿JfF]z;AxiT:LMl)jñoÅm<:?`]5z5k[8CA#nI܆Psssz2a׷j|5zGb8E‹ YK.u,&4N7S\k m]msCV$a`6߫"ǡ\tt '4P KaіN_5U| Vpb.I`ZxsVBPHg%$*bG?a<דJ.s#:68w: H ͺ')T<Ȳި^C]߬3z'=Dd /_E6 k_hAJj^bbz:rrMQfdL* (8]| Fx%w V}2NXcT̎'B۔g^HbXbQRƍi)$c;$4[v6㔏P#׭p^D˥#dv5iⱍkop&d س",VD-v{D `2؈ur,1p [4a$a! uwo@; |~_um 0?){M/cѻc((j,f"H!wBJ%3ु$ y]G:CR9^Dr/[_TbḂDoLnauBn{f^zscS214m<$24BO;uWSCCux“ɄW+ C97|A`4P<׻N29nW5sV0}NPf&rtN2SbT,z:Ir!Dg-wCW&b0{B?&Ё}yPQSo.䢪M8 }: _Ou([=hO t[Jw ߉a3t\sӾ(݈jJHZWxlrZ0i3Z^~b፰>ۜ\P>bӣɨB^^ž}*?g*pzT I$L۞2% >vY\!jB%<[A#ɨyl`y&8Y~}z.U/5f-d4溎ˣ ` ̄ٱx_öu@EDI,%h۱P8V1(X4eGBi {W%Th0Uq5\400p/p8@Ѱ3Sm&@ަƦq*;ღI;ђqoᐈuH"! :rT JTTʨ`MMhA01D#9@ 1=:ۊaihj#Җ1pdQ[DH5|H=]D#ϦHQc/u/!$ dU+qcضOg:e$KI=Y}o9&k܂d9`mG(9U'4.J8•7<;џ;>\_]A,͢L" puuu/2 ."DdlF'Iɛpo;Z9GT>!Lq渇 vyo2U!E- ܵM =%BBJX+ZW ^aw: GX`^[d'^mM;ʐÖ1<"A%2EV(slZ DCx 7Zf{(_6!SGXGw?Wc%$b4PeG_`]ٸHtJy \hh*2(뗱YՔ'Q7ev݀`HmD4'ǔ;L|L^*}O}ڑR>ӗ. C0Gם͕cI]};4cNMT)Q"kq҇&'OdA1_or<}.C# 5(YW9VM[ }a0 ЪƂ"}/ޫ@b@gY* "VwmyYe[`\j[z(&28h5s~j8<ȱSWn\tpQnmm sTٹKa^;ʦBEi ;'ԁoZO\H)>aLR|,$l9UFW*Z{T@-/ o٭t \H ZguL^|ovFh)ĊF+{mŬmTM{鄠=&5r4kla'uD וW˜,%mֶmˣ2:[x䡗=m"#U/eMjbq~R>қS-\J襥Iᾇ1vm(ȟ'v۹ 70& aNww$ʎ:T@!8,aM8!7cw׾x(x˿7YOkh4SJ*z.n!;NA{o7+cx@sD|ρGx׭=USoǫFFMDS;X|a};cvx7r62cty&C_qmw<;?$O|1E)m},nߢg<~" 0C%27s # B# !E,Xv@mߡ-v`@?=Sp@MiRAJj\}/k4خW2q}鬶F/N=sʹl; 1=G'd9 ypG8% <Ax7!8n/'t5I$ʼnVO  {+Lާ%ģ3k~_#gdn:F4vI:kKqnB~Kַ6tJE ')u/AihկH8;?DA!HbBA+UHrP@o^ϯ ]" Ő?P@5 \j=S8Yp? n'5̓௓B8vG8{8A9|{U_k~ߴ(do.?,kP' y|oNkcY(-l9$Asث^|WV&"N20<9 ;ؘl `&jG!f7:BqQЏa/}K9c$ Ch zweޡ1xיPmckJpǛ:> ؂G-o>~g穏2_уҨ'8N~xq?Ǐ<n[ sus~{[vvێۺ}c꯿.xp߶0}~<4><;W}{<]?%D6gONB0+Q3wp/GЛ/aW%4xERn O5 @v@ rEy(|}<7qJ)4?^؛??y(`$A"!MƇ tdh~F(eMm󵆸۞IA Hw&*>M~o)Ղxry!(˨yr\R0bn/H<{O\sk]#c~_2]!=MBA _1y{+Ǝ)s(Đ$bݤ5^PpmX?/3^3-jS^}8vbF'X5wHhwz'Nn=5>{J)%|_VH@x/LC;9eM}؎;Nn!}?D_V:iȎ_H B {zb q9#_z78?+:<@zkٛx~F-Ml,h"n>g#aZVۦXj"<$8ZA2sXtﻗ?Bp+ձ?j9-/Vqc>}$hio\:3=oB"t6,q{h_41xv6zRr[q|o>7޾\B d+y5\+Ozqʹ "_״3RnCyw?w IC V8e+ {jq‘b&=ޓi ,]y\OpCRE7R}_>*vcq\ 9=0"~F;ͅwØfvp/uͧ\\׍â4] >]X}yr WDbT9Ys#O`f5JhbKwa+ȹV4~=DLv9ZNR(J 4uF 5cn== s_S_q rڒn?R F]}|ؔw/k4=ֳcIC ܖ]Ȥ-WFiP#}f;foP=`S-drHG%z-nNԩD7PJA'Lh$61q/ < z?/ќCt:DQɱ+q[fCS @OrTje`h hت%kxJ.7yw/_/Ϸ{nhw[:_4>CU6`C`aOV/gal7bǺ&-m0><'|wrLq+_毓~\P@_55WUM'ziգ:ǃBkZg+鸼;t;pg$ ?{s>9or't}Z8C8ߐ7MC }Z|b PP}ZEٞed!Ku59lG{&M@ ϒOR P! 
y4R$"Bc 9]~K;&/"zeJkWyᜑB#0̿VC1hJ^"n⿩{:f.ExҴJBAwUne1l>^sCr@$}~YXe|pfB:h Xf<Ǝw7Sd`v`~i~Wbc Ă c,2 Ԑd]v¿$*I}$PB`~| E1T^0,&(8ak&oF3EV˂g]Qq_ Jȸ?EIx~ASZ8?<ϬgLY'no֋|MO%~#Ӭ I#^8Mƣ܏1!A 0\몃FM P(kդ*IO,~FOΥ`߷=RG䌿dDL/nq6 iW:?%xP DoKBYRvr5'q2uFEMCH-&D+J2+R͊C m+jc|b$H=2!a^@G~R3(ԲpN4rtVY \ώs1\5K])5aEMʹ86k_TlG88?ಗ(E$Ú>:G/oo; iR3}B{~sk.n!Iz3.8n\2B%)χ;<̋psn z+xN%x9cvKůxCxIm ;zʃ=BY-rr».ؚC+WoFN7gt\_ QvtNۼ@G oh@vޣ*.hzE帐zHO9t=7=/-L,5L k:Ԣct< Ctc{iw!FjA" slN\ y|!1Ѽ'2{Sc4 x )_Sϡ8<Ѫݰ=@WMAELݍSQTcZ*WWrVĪˠB(+j3+l)#-A*5TMAς2/6n"cJ g*C80|e+4BwfiQׂH!]1I $ht|B:֊˦'9ܥ I#DäN]!$4Ůeͫ&NCH%(Rd4[n4\qaD*R9q77 9$ nrLr5㊸H2kԘNMÆjc6lk$qfmEk "W9\n\DITZѱ%4f`.AJd)4 JIAH1!!ldhђҋ9ȩ.MjREŸ֮*-bPQcF8Dd?4mmej"ܗr0 DATShtb|g]nqNsqE%ۉ hI%)ƳI(2ѓ53) Y 6LkW,FŊHЗ)D jR%F2XرQ1P,&i$D%02jJ) B12``1F#TLKBlL"IR˜q\f QLXa0B E)01 EJ#FV(hd!sChfF5% bѴ3b mbJȴEA'jԘib*׵\҈&%Gk擄&<{x0e Nq^%&t:]$5}wTɑ9/&pǷ$CBHLZZnpOT.qggs @] A Lגm`$F4Cv U&%K;V+-\gH>Lrp q!/ت7~^ 5BkX0)UWeIP)R&.1 4xw0uBݑCO81x(DI#oi~_~XH$?qiwoRۢ A?adt[BRf"d:ӄ%ҫeN"Mi<$]!5 NKzx,IVA-EAD6h!v}>bi߉߲dOٌmx79OO/5lmܛ|zv{3~("D !hP6TY*fRYFba%A1DR(Ms1E_{_{H4h}zM]77vmD』ιHsNs}N[g91hr %-XA|8_h@TJ#DRG{+Fu}o{{UB0׍ѿ+w)-3Ȱ°$C+ж( BX6$%#iL H)ZZUJ( ,&EvLYVA"DB`@1U Pָ=a}]v>A'H[֔c")?5M*PUEH(6c~EŰbS[W;y'#{xe%c@7 `=[;#)J]ȗƙ猶pe ڠlq +ٮĴL_.>sP&*ODV/S$fUG5Bv˜!F& c׷;+>ni3'",iE]jؐ"uR$<5 DWd?n'lF]~g}lM8e?7)+)u#Mǂ["͹E\8'},a  0~m Fۺ'YScccL#z֓knk6&,,&f0|XZ B6Z a,iHDITT"@5ywv*@sX5tNKoWß?dqw;f+:Ւt9BS!E,3nR9+zmO8O?tn[.x䬽ۡQiRTUT r<>?#-V5@v<d ]m/ovDȏ4g cfț8: DdVM!p0Կ&a=ԫRW>F՗~\idLn׿ D.MuA,&oiwnRP:o͞؈^u.! E0mӝ9ү{%[ dL1J uufx$ݔ*!cC`wVVncc kqZ뭍7ca@4G) ҡffnĻ_2m} \hl`̚F6X1KbՒgm(mT[yֳ"0kYr,& Wjgu,T(q.^M/K47yL6 gƷ9Y q &{H0KBHJd%[ ԙ (GU*ȋ5 VPMs*RiVOfTƶj=s=ze?=(e9.vunAo0Դ +vJOoS8GzG8D`ۄLx2ŪWgsJAQ,rB]KK.03NOlqΓJ˵wݶc- >Z6F> U<@4>?[.Huca )q <4'). #OlHnAP*<A m}? bHT#QH.Óx={81HR% HƆA Y|"&nĻ5KU>WN7oc݉gM.4|h떦RpBJHch.$e{T#[L?5:^ H8Ni0$B fДpG":{`Կ99Wyԉ*T@Oyx3ǣ2h$L v`Ve)|Px"|x~' E@RlP{;8 !!a*@@v iD%^p'Sw+q yYT_[<8pLU15QFװh"u3d4rj&0 yՆ)Tpuȶ Ed>4V‘ⓄM)їh 󃯀 GWXCQpcEi>]/6nc$ ػ?XزZipwZ:^4ql=g5D9Z8q.W<Wv^[vHkaei3l ==Z3}ASGz8E ^Km^O;4$" nM›Ƀ@w$.(3!~bU]b<;=hVW W_o/iny^.LK̮vyL }OIySS)[<7oMs`ncμWixIA$HzJnhhC utD7}|w&pݰqg7*eVsؙZo%VCct:pYx;pH9?:OU1CpZ/0.*shwc38(.׋},/m>r:@bp܆ v!|GP*%J_q>Y]Sl$UċQgDQkAal|ar?|Rl1U +B \@NXY<8dIl?F8kLhO[|oݶLQ20^%G)עagAM FD2ma2se`ʇܵBsmL ~hsKuC8[n>Vqgdw^4Votb{?ٮWf`qR$H &-Qp.<}aFe` 2NCzo 0bi&`9 $n,҉rn2a-1ĭ {vexԩ?h`Z{+8 ;Wq㶣6z~3*kՇo,-؃1 #Ė 0e$aLHQ rʨUED|2*ʼn P G;n xg.Nw!qXO<ëvxUP E px˟G7,.2 l9D^3 /=6۟1hz(SpCˌ`zxCC.C'zNǃ>J((| %FaI9\׶VzM Clbڐ̶%EHwN} 9p(fßj ̬bP*.Knpy d4UΙ4OvY5@7B*uυt' n*ޕ|R"q}9^F?89!xD!o$7<'D\m g pbP]Nr3^T381}w{a341JR4wBĦ(wTs=t%KY\r(ٱR۟^"*oJDm麹uE]21 Ci4Y +xIh30 S-$McAXj z\7<)b"RXZ5E&E) <@S>}>)@2<'(kIUgvJaPk4I\Sl` bJ>Q=&n'?ίE! /m%fJdJT,_&#LF5Df[# QB+9kkUH| %)?p"X~ҁLJeDTk]WD[ҼQdƨ cX#X~*bRZfMOݍyo$kZece~z뢛Cc4FYEHdJT9ceZ ̗2,$֋4B@|h Mn(E`p X (\׿]&m65-)$ (|u,GWXOz5Q^87TnHXAr p'G(QJ%j"VCl췣^wy24Do29sAy2YnX{ 2)cڝ6\ IUq$qD vQf 175y~ZT?;s8Pxω #f5(K٥_f+Ό^͔/C(mߤ3AOzIX̦i la8wĺ-lXTy){B}ϹZ @CUl[ucli#M  |HN1!_m8<# ܅ӥ Gnrx}|wiAN #ۆ"TXM6M.u}u|JyIzH5|1BCDeW81Icb5#BM|./OWqF3N8'tйB;gyz󽀣v HkLk{O|_m߾r4^n_#Dq+(18Z |д$..L=uᙄaiZ\tIz QNYd^}|ωȵDxu'mHrWýL4siq\סּLȦKƯS3z~Vས!uW_b>W.\C#Wx+c*H^fpY4|뽼39@nK;ݸ5 jάoU> ѯ_(鎞$#6$6ܩ"ZIh_U詰`d&3jXƛvU^z#w[3Ą1к% 4)./9)IјZC|ߚM_ OoKPDYw]k=d֫<:(17+򐲔4z_[ 2/?΅j |?&e!(LEboy_CE0 TO}_)]mi!AISHӝ>}ߏ1_ ݸӤ+،AD5QQ zV!yRM0|ez'F)ձ3_4ǯ{.gv=o`a2=S_Jmϥwo$U'9{0){>ͅL{?ټo6{>Pf%T*A Gzc c}D0Ncyw{E䐣>4^:R1hJB<#i8րm8*V20`-I?A~_%D6 QI"߸xaI} #UTfuKzyL#o']$ 5 @D$ X0BRf9ܛCsһmj-D!IpHX` #-MSQ X2otKvP|t<^':^FvB׮JBڬGv9;}3Ljf5=w^3WSͼ+cv8dLz@ )m0RB(1# 1$a 3m#p LawRovLIgo [+B-AVS  # UW&5eMZ<". <!Rб0 fb@$Ւ%#qA2Y ka ‰ 2NdmwH # to!!*7/cӽK[[W£y|BxsW{x2It͹ B! 
4+5$J@CX,oc$CAmPTG2B?rſ8" s**)ȫ--VQ({4) ZM lZbHerP T}ci&La 8Vw`$ U.IW1c2EhǑè6^ז׏F57>y)!K+ޯȗ &=ew۷Ehۈ8yW5z~g)4m(LM~^=a =/yLLqD /L^ȂOo;Soxẅ́%Y= ~at FGڽs ӞHR]RjΫiu_?Ev,9DSqKB1nphx  - #\I fG c Ȭ"GN c_KO ݆hJD3ċPJ nd'KDҶBYD-%5[)lhܮ?T NTkiʻEsOk{|^[fcOVcTAXa8*!0#]vZgk,H-S*B}+5$H2\[IX DezN $36jC=@ccdžRj`I"iD]]nlٸJ1vůVFL^'ޑk|-"_>[n6&^4.='znNR;>ܗvd8E' y~Y#5~]7c=/ya 7oy<΂# MdfzaL9 w dw>蔔w.NN\ߊHLB0ކrL-BBuTK,M5քy%= 5Y*S* ;rIp)Ng-=OO-槑Q{=n"c,0"2&[t hނ-j@CQh_lyP68%G)i)43ov̧i),hJvS%%Nm閩t֖Օd?"ՠ0MX5 c4S JR[QKO4EyNR0*i !RjlU ZH6jƱecUe]SSP&1@R@(ac1"<2н<OcHYIHn,' :CNAIHLhJ]a:0!VPSR(mS,bS1U YjjZg A: |n2fo{͜qac{p;B)mGN/I(Ko5u@"X[DQV*蝠-NɲN [;B)l{[S0T˗n*`]NږqL9:XiK>VޤGHpfѷ^]Kvݾ4e?}|gwaaè{]ُp똧tqvnޏ[ϿvxȹMq>k;=PH S˯nHfXȪ=>E6Z:Tz{MU*>&6By)C4T,v*8񭫓^7ཾ1襴uv}ys6:r3nFBQgPH*Id7'bj*j$;g7@&’^Jxs0_ w i;N/SqKmQgSg{4܆^AkYO.fQ|G=*PUyN\o+ @xL)HD ;뾟6;s0|=rEqD;xoI(pI1ez3CL]@?>~)\,{$@,=rɮ9_ҨFX1‚,fԊ^]UO[GVTaRqvg|!"p!zqmܘy)I\Fѭzh#`4x$(08A0 .5[GH<&< .qїZjjV-diDQK޺@R>meۮyq:E4T;>⽟3'[:H <]ny5LF#> |~v :T5IFI>LW׵hˋRџ# I H!cxDj3MB6:7?5,d2$'}o|+yIn%f1^BI4̈́6/|:+ePf3A!tEC͏Ɉe Y 1Y^S<`b RE4,u+QI FفBRC_l8@$1sMO88'HA  Z3 j@vv6:tTȊb3!,,hmHf3>YJhi }zw#=e䲿*SkҶs&Rp={Ňf~ڋ=:Paѭ! R-9<%'|'IA =Pڃv!""-4XEV,7!KExT'\ ;Cxy=0, Qz Du B\W. 0=pq>ә9Ӯ^Cuta.\؁*wcCI({I -<Rr^)#>b)~ _֞iuɴ4P=-w\¼=>M\:;Crr"*+5oXhhU/MQ/o*ttA;&x@ ~6䳺6LƊ?;Kڙd+DGzwμ\ߌ 债*lʹ V1 6C|H+6 ֩*h LvD,EtpW7$SFR~Ik/S_n"z~2i'k .t|& o2d'&};0+!ҏ!ܙ{% {P p6Ls(e? B5(C ``a!y/Ӗ-슠*f y֯2-HA3n[A h'_cpqy`FȕDiI2)q$Epa4.DuDt*MDca8bdR:lB+GY}ow 8]n! iD߳Bz%u<~@|umJ$YA+ݦ'ԖNY7CxzH6y.ٳ[6#lM<˄~66u˅fbN\&[c>则Ds 2YDP0)1PĤ03vX|3Db/aIE†^?GJCV[ &|$?enD0o\\Cd ׼{(8y[kYܠe3~S̚xWy ƍѼbput(Aڹ S[y)n62-Ap:<[~CޮէwϏ>+QY}Wcv: $)TN͓:Uw2VulܸwX:K2 @r.Mb:mx8.M0z`g4?@"n Wwؓ[AFgl&pap6h^coy0n<"A HCF#3k{XPv4~м q 051h7gODH%Ӥ%\:M\P?z8'0F -pBpBP(8Q|h yѿ, O𨘪; xH?rcfgBh 0E{W\#Qf mGF Pq=!ȂxJ:@1H|N%#0$4T[o]^?U|sS ^9 2`cCBE9f%\-!FI aL LvUǛF;EP $ɽ@"=1.gfQbZ#73Bj-'K2PWB& kcآ6 (R%EC&WvG/;X| gJR5ߔ!֐f߆x;&@&2)uG'NSNP(<ݐ W`CeW鎩p ?XN\ck^QL[(n{̧TcF{n1%q@ gQ1$t3[uN~WuTy#:PSRJJ(r臘AycSW! X{:X _N$ЂF%Hp>r4+7xr=0_q08DqÀ1Ur fLIt{x@ CPy g@VØ $_J [0kCv]GEhEElr0#| $@BjShmm_NVFag58檘`9ǁǤ<)X1>|`^U̞F!(x~\rļFĂF+5+AfZ (&#d?os~u`8 ?B^Ke.t a?$2L$E$D RJ)FzXW{1 $HJ2l,EQQ]]:eIe,lbuevifZ*,"ȶŴ%+iR)K5i(ҡ! ZB*e}O~QW7Q#QJ Wތa~@+?)D(2F=1FD|`|c.gh\=湹MSiVz X9=h AHTGx| (qvosPmXiLޚ2<>0#ː6p2¿ ~AۗQD4&8uzQvV.a"L .<*5rZfJA@nH&]ıNfD:ް1*x +^S9YeҐ9&OX2>cK/d=RӀ*HG7BPcׂ>9ñ+x'!a. GH.h4៫{W0!s |Sї\i%G,84JQbJO(i@գ5"d fuh7u;c2j UaR0wj fFq0 Ƨ !1Yoj씋NA0Ej+Rm=p.<:BJJC}Tf5r4E&(RfLي۰WhFȋh0h!6nfq'_Τ;+hs7/5yAbl}Rd n=\Hqv\`n|LpFk6p D  S-_䀠G >4_?wEw0]r>mNZQ\foSG=zJO DK.Ie:HwiYc1N?Q ^ltGxWwNa C9PܶBP(E^I +Q_Jw3Xqߚhgy!MWJ' TwG~hlcŞꊽǯ\d˶Wۨ1nM`x1`2pFe{a`rdtcfo 7{ 'g4fsm+y0PÇE8umSlæY,coIVрd̃ӵσjhh\^q'ErZ:7wԸvqOP`'|:tPޏ n'K'yd۵0[3x`h;'rHe2l%ՈlZ h^!g>{b בx')F yn M3${=IZċkjݹ@RȆ HCmoנ[~8гn->].ؚ6D X%Dg p:\,b! 2QQI6~\\_c՚1Sttz~KynHo(2x %fA i! ?NIe9q9`E Y#t#ϝtbqxPhFXDZ6& + xW4n\~pW+,; Yii-Wp2xl?̆#o:~EH}uApDD"k2Wbp . K6K č斨p/O#3;Gِ|`-xssVM퇇<`w%5/O1Uzb ˥nvPB&p>ap^txN?;>%k =v{RhzqymɚPtC`t |hUI :0b84}39so%֦aF \y4,N j0n~@ZSCZB?Qi@I8 gOu;cy@66f67ha̖ cAz/J^Mk`=xY2'SMGA E᠛X0E/}4P̌?"~:Nc1PHƯEn8w᝝; xFoCn (cNrwiyYXTXP}f{`;$BÐgm2Oen!?u<. \,*| ^t0|[D6Ԏ6ڷx{Kq=}RA>ݗ!wpK%=0_VD(ËqA ś~{ ]la>o.MKvR6a_k?3뷠ndloprM.jr E0`q?(pat= SZYmlζ-@}tz1hv 6R R\[@[&:l蠡E(hq`sCx`^ЛviAPwmxޭ.Ԋ!PQJm U֐K6jkU-(ta@H֖n̜lbkAlwX;z&.8zv.2! 
{3L; l($d;刉 mz-Uc;z`XĺĎٙɌ#)RtݶRWff^IeZ4.ڤ2%; 5JQ%A 4(*UT(mPC@Q[aFF^@D(U)َw8()PB gv ( -TѾ]V޽}ͽo .=}O_{kn}v>n=hzr_mvm)ӺFHj4({[zaݫwaFZO]{}x\ m{[@=`iwoeV\p`@{g{owM5AAl1k]c_mtż5[=}wwNw7[sn[[l{gSwkcWA{^1Ѭkw{B7}۶kY>zm[ u˝뢷=vtﻇХ_]RKs{{"G8agz@>{w1$H_rnky{y/:FH/|9}|zv+>  y# G޼l+R}{îm;;" nW}<[_3:uj*`> h|: (^.>A@|^Jn`<TcwὌ2돥{Q]!-`#}p[^p-zKwXTWͫ`t{{{w[_}[.z=Nf⪾;^>UuBcݞ >u/jC};˗}{֗wtUH[k669Z=@P,{7˯hM}zO>]Nqڒ}VSm];u|ޝ}XϐltqkӶ;m>UnhI^N__:ݝ{[l۝a͆@ &@&Li`CC `L&d24bhSh Ё 4Fh4&L 4L)4TSRbeQiԚ3Dim 54 B&44 BdiMOɦS? Tz2jM6zMFIe6zhS=TzF~C&$Aii c@h4hLGSSƆI䧚L &4MT3SSzMA74~PSC$hPSFeI"& L@h Ɂ10FLM 4jd0OH M0$2h$  dd =Oh&4ThMMOM~iFz)M4foSAOQ(OdQ)SҞS=OSڧit? f$9h؈"*\3Ia֋[YVILD3CI2{\5C {Z]j\_nݰJ5?Rlo%E%}Pq*bZr0(]a$eE#cyqM0ĠFLl,Q`98`\64s#|Y f0p偱a1!\Y"`F 40X10q-flE)KLN!6 Ry`1}d .%`%!|X+F„\0ӖZ:l)DA`|^W&cgx]ѿ}˸mǴ` (d,`Ij2 `Xe]i "`7'y %DC~E:N&bڀu!{"2Cez],T! aaE Zozv!&\H"$"z6B(+ c'<|^qr(UTDȂQ+☺혿&e !e [ۈrf9SfwI/ DaZ}WkaҪ@ɱ0Ln,LX#[]8բe2KRf?ᛞ)O?5R:ssr|ph 0=D}:{aQNT\iBVAW-3Hfl˕m&a0rLg_~[0yᦈf!'KT0""#ث3|a,yFwؑ78;?AAL-[푽^J7wׇoT<8kL5/OכX?L}͟-ws=aARdAA=eDQċi豿5eDkʕy+2v ,  ?rw_8;Fgw~ ݣ)_⾱s{~Aȗpbh!'uqA6'(jjdW.f""]sȏec%*R g%K]_9#xk(܊=66[AgBVF\XFL_EbۛHaC'ni3 +_L)_r@KOqk$<ߙ0 4Ԁ>O:UneAdݡMBD9 g:SKßw7zecgx?^_s$z"t%vO7KG濨`=<8sӑխbɡ64, 4DӼU8PHV2luA14h`DF&$âltɷNՁ6)2_k\ 줭$jB0. }|9ɋ};~mf+D1at: " GB8;L0h^P"fyFXN]I2k>;rMcjvMf$" !"z9ki|3cXė P8Sbg<4QX˄e8k!'#7#b6lk7:i6܃8%AAsAP`8hk YS, f WH -0$65  vu@Hd`$JFrxkd'6d1 8RBLmX m((`opnIK$SG ufq烰Z*Q! =OjTls  r SΓh܊"\R!a-e (se^ɥk{g u!)ZJ*C}X3:85FJB  [ZP1aJ#kb<ȁ8 N;::_pC8CьfĥQن6T 8fHdbd;K\1:1ԅ`: & Xnx; ́vX0mG@Js@ vK/y.:z c0{8bꡝ r,qJ00 1vs.YߢpG/"LˬO}ž!IJ!J'^ V(ІIA7L >q$((hw`:J&JR*'V.V Bh`b*&p؂0tA]h7':0RZ]\駫օS2fILPJSaW9DU"4@TQR44\mvp "*"UmC$$fmzz徳m*u G%OxCIYc ᆤ4GmtCAE T5;%2rL'Nw$bzuBrЇL $$|CJ I7ӛ{derDm>?/]|@x|qDN95T4 Ѕqv>,$L$"mPۏ;$q'F rh{9cX8bHGR8LQ XEc 9v;FR~[x>f:B\[/|̟-pL*xÓTdY$s4 Fo̤{`ak˚I ";=;Ofg_6g}dn6YC3vK"2(Bk8 _tM^$ss튔';ouhJ(J(<|<D>ʇ䧝O gpx_ 9ô0|z&6j=pRHaatD9BA[hDƟ8}S' ld .gZGY ͭ$ ^~w#KuV3ͩ}Q%|5Ы\޲ɸ@?S!JŽ!z>*B)&ly*3/` d<D` >rB}YV}!zǥ͸!>AiT~paQ,RDg&ߦ0Kצ܌~ |H!cgiRK7zN|_x0qΰAyD;vbi,- }oov~^4o' s[ȩ"QHk׽c~4?!fD8=T٭jJCD01CA IDB g .ng vޡHĞ-;@4ke cb01ˢЀqR$@=1  8/=F5lO3NӅOBp%Bц ц 38l s<~CCΑ,[AP43LRL@O{:JFd3E1AA4Bfn\ #A>fΒ DZ#9؃-cX!~ Q)UPLkV0)G6m aL @tZa:5)ۣ`ұ `K$@DMFĤ H&$B"ؤY KC Ďhuf`#˃F% Ef(9<iCh20^dg7Fl8Pe0smbo^n4RơU VSpc6p C"f<г$pxlC#ĸ"Vl8cb )%i龎l!s6FG#6H,DJaFǐ-f[Wc+4, 6KCML*NjKC= C2•]{mfML: "jB0;Y! :#iؓpEym t&iZk$ (" h"h hhQ J*h"")zd"ihJiI1oĊK.>S7~?z]Kb D|ϣk#I2&Z} +n8Ž?mdWKK}?OW_Ol`*)fZU (d;)Gv֊$ !% -!lu.荻ۚ%{Af˖;s$Z٣zVR1N0CkP\6oRDfVQ4>Gt]ܸ# =wXfĝ˴ u!&wbc=$89't.oQ.3M_Khc2 V$!-u-NR +h@n%ih)"h1 x-5Pp Yr{RJP)hJ)V)VRh"*J j """*)j "j"j`h(U)R}?þ|_5 H4 UTR%-JRPIIJULUAIEPPC0Q#Dތi"iH(9 4YPj'o}ѹG84B AD1P3@ MB_@Yu|Avu4`'u3[ɸ]eDGp8lV"VZPѶSLx`2ƣ~Y7 ==[mշ~+=Hiʐ) ˾6?&\~ڱ ]) Jf9):3g=0xNBe&nb4]CSoX&e7rhsX(f"eHs=ja2ٙrtA^ Ї^EElYgٻNs]d "LhCQJSHMD-!BRH U35E R%E 0R)EQ@S@TH lu[qxȸC}VGqxVj} wǸ&5`\8JD;2gtFohZZۉq;lOǝz HZILyd"'LөM,W~ӳQ zsBx(_YS`0 *+|ܻxBҾÄrMs^UI')!@D~N.:qx~^.YͿ Ԇ0M L 8ͰP]۴ Ƈa@5m("Or 惬xr'b8>ʯB "S(°3k@ͪpLop$f#z"pLgT%`N=\[reZѝR\>4r~'܃ssj J(z0PtX{#" 3຃mˆתhCRL}bgAvCYGz?U7WLl 'D =J9cp@?EwID)29l,Ecp-)0`0Dh[cw&B Gi4I VLDzrq81{s G^GVss.DHu}meFbS金jS]L1}F*閰jk*Gz*|3~ۅse<)st0N1Czm6-\?{fqOqY?5O(M߷j." q! ;HU!Ѵo 7/d:j&$ΖӰcݙ1 QUN 8|:PsaYl1,t!mm x`:o`գJq5F>Co2AI ,QzAZ(oPY#vuzÃ6E,`[&4 +3xpy>}gπB{#Lj%\h :@4)aH:vƄ ,Nw'I om멦 9T$̐$Ct1 GI,9. 
~z!nѷN4x]S{ƍyFy؁傂DS`” PE5DRTJuQBr4xƢ ӴD'Y8!9*ab6l,'m^iwAbRQs'ljT C:;rJhL g`~"(AKs!ݹtɦw & F@*UB~403UzOfvfX)t#FИLo2KgdAļ6ۏ oFP2C6x,$C@ѣ,s^u[;Zw]MãTȩ*]B/K[,# GEܴ`]j&#sX /Vݹ(q(h= %)$lz|p̌TGƥ7sr3cu R8ƌ*x*Y0R5"%`2'k*Ll lĊc2%!ʝ<7 nJ;KTێ]͌cfMdÌ дQ 2!bR;'6%u'4HDpF)BFi`h|m7=*o:YbsֱSce:R˝^;,BA2D*i(̎kxz8'!22r&/~eo;jmS Qp(X '/{ƅ\=yPt}[cn[$:=.l^x1Pע\}%G9""}# E /Pij;N]YO ÓM#`g򣫣UzUqÎgÁ`Ϋ:f2oV5LH~5q/N@TZlCr 2ygQY(OHj4>'3aʹ@rY -wFņ綛֡ dL Uу~@0|ff:tGG"ΔgNBHQOK+|/cgp xۉq/e[slM)7Y&:ެ_#fcfx'kFfj&ԩ\.6PѭQ$(lk$(8Dj=$lrf륤"QV6 em liN+0m\ jr8PB[4Bh 75ߞ7m1i ͚߹ˋ$8KXdC+Z/c#;UلpE6r! @=dD $x2Idh`0ݡ- D5au2-DKXFmFXrNYljb i':9V6TmٯTB0+f5CM`ښmjڌdf `ٴi1iM4ڬe}KJ7HY~52q%Lʰ͸F2*.b͒ L q02Vgx+Xa&8 Dœ98fDt97ƻ2lچN3 G"5 .'08̍fCJ#=U 5F6N`U͢ͱ5%V@ 4Mf4bCbo.hn4 D4ҙ >mJڕYmH7k" .S7HkffŢ*R|8סMʫdYbw!؆VMF5Xfwi)XYɎ G΅(Lm31WѴ Jdn]R4qr0j"#(4IM+ZH jTd2ХwfsG56}j[\>FͨfhbL&`M 1YF6305G3qgO.5vvߤ3j+"Ib"f*Ii3cf2,'IaHD2I ؗu"Fl۴©:"0GlnჰzMĎ/ND P)tDPUQDUW[u*8ٜbS@,V(|Ɠ59mѝj$âQLSM8ĮoV H/(8qc7e9ȫ4&%F c!9IjJnلx<~'N?9F$V꜇N,XLǕnaBxPtoQ!_KIQZS)SJwH5H~ZPDX ;jGIz]ȎཝBQ<Ѥ3\tX@ !Ϣw;}i~o dD/:~ m>qx~TT1Ip+ 9swf)AIwDQlxAAQJKK4ɏ[cմz{| AT0ԅT](:&zQz'!ͦ"mC:!`فfa=_ڹ?c6%{R ˄&eϦ]!+#BƱYA5p4GcӃi _G6]yZO5Џ)ċ,bw;k *qG4q3cl0{Ka ɀ42 yiԼxD?2.pNePq3 iETorɉ`l=8"Qb#FPnbaص9Gux|r[pe@k[B Li:H<+T(F`\Ai,p)#5JeB%WF튣F6}T| & i񾳃nq!gx|]ҧ7Ȋ]pd"P.ֆj0=LoNr3~Zaq.)潻\+\hQ qLY?;m=0n:tLLLH̒-MI+M/{f3Xa],j\rUi}hcgBSQPHDg$"h =>o(Z)R"hi@D𓯇Qkx hj?1q>'yMt:tr{" XƞS⵭]MQ훎3E蝓l_Vm<[įyݲ Z3@zHRi"Hrg\@aƯV]tlW"d6h+" dO % -40iw\X4Ӟ,ٻDP.1pA]::tvZիod~eÁB82v{3ᲂSgPR }b%7)a,3{-u9.᫨ 1,ՠb2}SA߈f8fwٺLeAi`&].3HeN3XtbR?g+$avYlo Oh,bZﻝ30w)) )*>nQ^DA\sFiѤ}ϟ AVы244.vIlYG׽pd(t!/%:u:f~ߧ-ġcBN0E܆`&B;Py='10ηc|)\HcQ1?] e0iy&G>l)nǠy;U On5dLԇV8|bʘte8ȲSq{8ZGG;՞Cb`Ŋ.Q誃*O{&&S̵*\J5TzZ~`TI(sj`@y3'RB>w;~D?T9@_x2ph"Xd12I31MQcvۑɷ`ᶻn]ZFȃH t2dU `̍Sw (b@ȘSS6V8cy&c$활GvXӁ+4a3HΌ6(p}! =jϑ68an'oM}GoKHMuϺ}]DH(PQAE,[8L!X*FZ˧H'U0ULM/|:5!S콟`u7ݝC7=&d!!)*"XJZH"FIdzYmdP ~kCC0 L‰Y/+W;q>'dBB7h Pĝ hb1_쯼ۻ:Uö%Lӥ$-39M[h'ȉ YBq&+{cB#E?M Rg/L-= HALԄ>:UI+PIaD""(h0#뻋vxUnCt*j8KX@8;Qs\u~BIc?Gm xv,p.2ѡd7F7֖>/EnwFjD459!ѮI ȼ}v81Z{oP(A;LPꙞ9B$,@xAPǪNeP> >il jڰbZW\Ћ1rI4HVEYݮ^1IMC1M6.b2d5LQb6q`v$! IȆv ᮄ #;Ȍ;&e%;1S&70op(@\5AapLd6dk /9 2-49.B85J6Tk8ݘg Ib$x5:"cD54&Bog(![ng*gYXm)lAY@p7&6u,:vVXFIJod !yVPjnl7g}G*}ŀ5 hC&[Sp@C ZW,5Q; 2hBc%HMk 8 |`~!}X!720lK6- !C9XF[ نjͮa(gCLq iyĤP-]-6k:d8^aMi쌋4g 7Sm(p JjŬ)☀q !!]£iಒt9:or]M!]7*X6Q }uk2+2E6Rx*ęօr`fA.^mb֚^M\\vb,TJw83 -1ecқX1G)BkK<Ndc\iS"jEْ¡¹bJK1;4fmSVUFMvQH=Mg!#it]$(5$so7iLHbWt,-s"S /2 &)hE!B?$v_e:I1xgKO=FF`"1J 4%تŪfָ~g_{L81dfO) qaCn@y=uBw?-׫/&n5js7Edn b9#rA!QjCl)<ˍ`*")592>ÇkϺ`3HHs֫rn hv~9W[GK@5CbRonh呣) c}[>_Ju)@44UEIAD ADDDET5$ "DK33& ҕTo5+MhPXo9,Xhfk-hF<G[>< 3 r:'˂fMM:d0Elh2bǕH S96ˀŹ:Xd񉹜 MJŋ8!XE٨8r9U}/U6fsvU 6FL!͊0J&6Z\4oGLG<"wB͵|#6j1o6$|܍^) B\mYO9gmcb@Ng߀( $2c"řFU0Tc~ cߊ&흙L+aXr*8X0نR H3̇uBJ%ld 9&+SLZ4 ṯvkתla̍qZF$ɤ†򹗶!-- `2P ikdln$YQ.Ifw4"7iFElMYph$IpָƦlq PlqFƆs2cq j3I96 5f1b͊jJfLڔ@@ݵ$j)s#R0aPl5 Vm+&v$ҍոD )4d`8-|@BAޠ' 7j:. '`aYOQ qYkru7&b2qڡCyC !;T' |pDP1_~DC5r1ҒKB ̶ [gX rM&ysfD x tЃ߅7#kD@2cS c[&Xӷ]pMCK"hL*C;#:@hZEJCL9jn(9 'o5(( xFD撈pk,(Uw $34:+LA-#$h Π!de.4Hơ&3\`vpL2 $1ڈ]5Ҧirm8!s;WcrEB:Ti.g}wk c hHh"UZ7M[XJlZx.CYr dP$g{wD5her7oʼnΛXqe ; *I"mb c(%,J$!@H!]ɡV43gYqAz0&!̠1q)HmZ,04ZB ّVeF.Rk@{ XdKQ[$6*vj傆L%ݢFЄ @wN9F(\ ɊCZf*"sYsB` Gm ru X" "0x#QiCUԈ5<9mikֈmgWhP3-F!pУ]a. "٠[tVhRKvȶhA% cCCE-ZR љre3Lt;0cvSfqL488lr:&:sB-r ܓ-haebmʋˆM@1$s0PܽggT k ڡkmaGq"K54i0P0M!HLvDd:DM@#*5"K@w5n01@FpdŜub7Igec! 
mb2 ݟ0fA2ajh&PTDoQTW{5 srj,;Pr QpvXmN#ybɨU6`Ԇ&f`Ma7cJȠh6plf"rA B`Fbs*"2xr!(/v A-,B73L>CfYk:5 0Q&ʂ!%(ͥ 3(!̊bM!f Zom3"--u73H6w.d`eC6C +!S-E˜\Il*,h Y0s53FoL\nyUÐʥpfEYtوp@0,ZhFf4#;Qpl3܂rns P0\c,N0]vvvlFd iV:"[u0DVmuB-/cv8%-\gSLSaIޅ pg51JHR%"-FN LcA4 +y%/T5EC 08}'lYb b@#Q\K&(3bbI z06()v` m6%"Jq@,1v 0vl#Q2.jiIk3*,n4L @4ila$CP.ٔ+daF6@&3l1F٘P*8faXhY_q:iC1TA%YL6RXAI9Fg@jȸ(E+,&େ51C&}b,3"FD`n57AqRCFCTAD2Q#TL5$>ՠ|li꒝ǶrLLZ3;b a+}K[`k +DPleyjV:97􆀰2*4%< *YphRI98uwq-N)1B2mp W[VGOtqh|Q +8Ŧ?5Ao֕3&aS~ΛRe2IN-b+~Hbd5. B]$;7/x`(4yTv[bG´sE2*4B+i0R@1R䠓TxeuV8r EF3b]%υ@ZC/)09asqRm:hDJ3.??^o?;ʿSH[]'޳8\땣h_TD*Kq7\LD= \-.)|ZꢞE+X39#CADƠ\mb$c9Pmӣt)_Ām8|8p#I_ hf䢷0ב23*>} h&k`&yK gc:,CTd681T-KXbGhFoPʔb+%Kۯ4kR} ZQ0U>8ɥ1CKAܱeb +4AS\#!kT!bArrٶ22) L 1( . :鰙/1` kaS"ƊٛľTɵTqRM"'&(d>t!d6TygTTJLXf  1vpR^divSVVW(HXi LӥmPnS'jQwbuNPYJ {-@p(hZ5Y;>F]],I.< hDµip'so%+F NBNqo pIAA,;1d)W(mz4 a( RjT%8eGCA%xHNaA d59E-SD5z+Y4"j:*sjl,Ír "2;;< 7϶OhCPwYn[Tv  ;X /f76aP3KxRYvXCɁ7X?[yȀcҗ ^3X6LUON-׻r#4!$$BE^kw7 b''\ {dt ;YGa|V(7*{-pэ xS#6`r X j8" 0փc&cɌrÆE (\/$@3ЮB-x!) hQEAp&[2Lm"cKF=2U+9`$3m |HKgKvg&nL1(v6> ,`Ԃ*$a^ 09t&"$G0 h+!aً!uʽr1B+;$$r-,۳f~ jL@dfC3w/A$J(ueNé!h % BACPXԒ崣-P CXvjQ.Sn&D`NV{1VΦpK;4њF<TaFf Rfv}A`w"j&j̪!J&(Ln +x(h$cU.4&n1RԍV&vkKNL"39BX(k"J0@Iv`HLq(LH;L  a(g h h錍D`P܀q-BQ(q4H%: f`r0 -ѤP& 4N# 6͌PC28V#}*EHp6jmgb eA%"6bhݥ,8j4XJaF ]Ii7! DÎ 6JWUf7dh 5]oL7 rT;N1xFk=)#NbFeȌLH8ɚN8pq"@5}dC0 u8^d߉!0$00KGUa`iC+bl&AcQ 8MD'0|,M9پ!o sBt?>c;=@}jBwš>摇)eġS%ԕ`N 3%Oś#L>h Sc5u54`Ey=ճ2ap`@N>ܦD) [f-/~=vE~$-gaLY4,< %V0d"a Ƶ0"$a{.1o q "V#6CÒ\("g6^~$B0$v&=L@mqjRTlBd.|]uwgYKGD! ^(y<'"0L c'@[7SŸ g.!:3”j!%iBl w-4RKbӤx̰D giDP]1T0rS 3i2i!GuUxAtOBWvE7tY!20!MP[gZ-={^|7Ƀ.m׉F< $ Q@wN8dgȆ@c)i%h.)1, :HW\n  W.;mj*y7l]NJt?d˦EF-y1ժ|T r’0 ~\5pl1brυm<{w(agvG_~ōH57 i.Q䎴tyOIS =Wߧ1Qr]3y!CJ^d􎕾-^e]볁iZDy]fS j*1@'Si?:X5 j{Mi=!_zij0L#5$w`KۢNL9pcX. 
LRJ }t gk&=k.(e>>E*""!q>ү6Y׷F٥ F%UBڶG=5KLc,-c КHIbԭo4Ta)P*01DgE$%\dTVqǠuSA#jt[rs/Pt?fd!G.E~7GB498޾a!t!/?Io_j BIi- ӳ([տyx ' f@XzFs~Y>yZ?xi-^E%M$PUM$B7Q4oAWv:r۝)&R ]$叒*?N 5PVtC w2I翨ǽ#" 9e7wkHAwyA}J``Ȅt87u!1Fd^<8gV/kL|o0 gD5|P\X8=@]ĵK@pҲW$)R:CtwDrɳ ņ݁DD9p+I|!a"(FP eI@rQʵ0 1Ʈa#2t|>'*erdjNuw؍ZL yaB3!љe%6̇[_aA)tiu ΚS YT E({iy#)΂C 0Ul,UŎ+)B\ӆ+__v8nDfdH4YEw S0ˇM1/w?Ya1q$y7$Ƿ"}_:eӇۯh!!?ÆUS a:FFE[dafdDEm4|,yo ޵PUS|9z+Fs=|@HR-8q YM{".DZn^#QJ57{ǭPnf]{`"~>99Iih*12ḱ̰(3Fڵ0UDJZ-F՘UaYAPj8xT^)DHu/:ړ67ʓ:[i>P a*Zƹ k3}Zmue tiO6cG`i==PJܻrw$vϑ^ 9+'"#!f~қނsAhcȥ!UT WcUhz$C_ER2@SRrxִγX@2"Ս?~1HM׳S*0)@RU\3MĄ`쩠"bjo3NZ [`m)KXᎴف+|΂~ ڵ.d$, ^r`JD Fg!k Cؕ6TQt~7?S<.dݹ y4Ԉ0a,چ " WMB% X_Hѝ4I <FCrzQN2K~eg=}T0qϯ+TLi"& Ka- ?еX${nܒ,o1OtO8lu^ޗ:RS`ɛ 12w"ӉzpeOG`R'QpTBͰ Bt ctFd(-3BCa *v{ n)ccgI}|͡'A#= :ُ $!{䃫s\O}Y%" u5TJ C󣱕>GFmCw܉ݼ+C~^c r7II(+⢊]0,5+iNu>tm*{ Lmk{ׁBdҸ47Vr<ʐy 'IVFdYW{p3;b(zyw<'<`}畳`B ,k/v9|"=r#aڥW1]!G>\;o!\!aTa_\*wo26l`ǥ!iJ 7k8|" fBC#`]h% ]n:rswUek> x_.2'0jU Fw &<RbiP p.@@3~A!!',bPR&҂}yo@uhT0Hѷ&]=c'—2cʘ"s(u<ĦL bR#s{vů>.'+ kZ} D0}}Dx߉fǛkG?U>`n@(b2k潀Lb Eb{w\퇷%4w@WF?1!f P32PA'tht5NM}A浫1p%4mfRfoʻ} Il<|!źK4E J'qPM96奚I 7SaWԿ h(DG=`|X7I6  Z,Ajvf:E.}=>zA;z=7`a,}>@ gޡ%KN1,o?U`xB/Z)eɳ#J !Qs1ܪ M#Ud6lּI4pvQcIk޺e8cLҘ&1_.{*YK롹 xC{#=^'G6='.9|U=.Ghn9l_3!Cn\PiɫE>w[ _ .n덃32X 0^[ ,8-{/e:uKkTYOx,r67bk$TH<>f^!{ n0"3!z)FY}3m)i0O,""j+^ lfy4tO"lb&Fy p澜2{ceC Zd,V>)fdQ&Qaذ8A <μI ̺n+Ї(oiүC=iu[JJp*(5d< ]//A#yjҘȬ^4'4ФV+-4_0d0D|iaj`wcJȡP ߗʽq}(G4z+8 ϾA  pdc 'z_'.U*'REV}ntީy6fTģO2Pzc(UGV:Ͽ#z2P,;\L>)c.+;Q: 0ehC/9]s̶% ؅D uCbͶ"5|ucTenriYgc$kɶzy/n!wSv{13nA!@ c, VSI#7jIԄPh"c>mCtd ݈&یj)qԆgZdS| m=m49嵬,p<"6N GkT";w\*UK]/jlޟܿd$ylwq L lXC~+OyAI[`hcr"uZ8HdX!s.|ZSǐ*@ab ׊5PlG|Rܗ[]^ E!_ZmiqeS^dN;/zWM糮 ɀGbSx#~słT㰭&US)rHO 34҈[)`gE< OzCL3X049tSNQQ 8 0ҷTmvϷy n2MZ K9wkvg8q36i(jdJjwd$0^M.+&|Pc.?B9!c-ܹFɎ@"3êҋQ02ej W;լ4gkY4UY^hIdbV/6̬ўKəQ;_ # ~3szRmG>!mP 4c ^2H0ӄuCz#vxb`yn^ht;0aI .i.k &KcYb^ (;\|̅}1vȱrDIs>BC_a^qI,+ʐBPQ)B|{U@~p:@yLp^ej~!QSPWbp N IAut(+šX"rD$;PDNrXF.;h1 o@hstY\r>SaRք!{ eIaBG5(<۞*2"I+FOC~Y&XK(\)]DaBC@lmPaSMj0gh":7<|b6LjQƣMp}vpůc(F np @叧&ÛYS}ð79͖܁zWn2΍n5Wa9'g!8ceҥ`r.uw˴6@kWyX\KEȷ]k{W<C),@cś<re&/5\,6n 9ᯟW3]5IX%d^䵦eAI@ B,7(.K^4!XT4ǀEqg\or6 nM -29\k&bf뤋vrj6e ĚdLCiIc&X3 AjZ)\H^[1o6J\8^f3"zBB/SV7 9`u7't,ޫW\q KT thjpcL _m$gW}4rSǑQL $Oɣ< =o+Wn>e!l<\| rr[ aA jp`ӆn!vd홳l!Ф"dP1i;tGR.fk׉|M 2 U| KCnkWhϟMj浂5Ud=0,7wVګ6)Vϼ=i&aV V!FԝJWoe:P3D<-zWAzR < $gBb2 f[6[lhsX5q9 =i_9C3)/Q捥/.'A7F/@r<(+9[|pI6Re 3u[ln+>T+(0zdAza)&m,h S 61o'F56Y`s{͆LA!?+Јn|U<=m$R];g%W%PDøf_IAfߨe{2!j4 c38rd1mM7BKL9a^iF {nv=/yFArpRobw+  mK#DH0 5 ⧂*vBqcM#v,X#FBmѥRxp3ӱ :X ^5L3 2IoeϾ#'8f#-JXcON}deuJl/̹ @f@͙%ʰh3 A08c0Hp.0 2AJbO[MnDO[;~OmQðw+X;=Tlv @w(Kjif}c"2xhL8(Depo} f}uqePͿ>=ٮ[gOџ'BXm!]+u>>*5Ǐn05lP'p ZxÞCІUhhDhCTLMcV 2͑`&S!6 AyE P@Q &3 Tc6hھ(^FzM.m{ihߥQ|L\@ DK Щ;] ]iQњX 5#zϯv$kXY۪*AX`e8/ jw2lH4:;p0n*,Q_'m~jT}smLEyFHnwr6W(aϏL0Fd`S Afd`ͯ+ո,TץݴJ}I &2Zv|˧:l@/356)4/'S0t)8eiHr2p4'ZLURcXlPrnRlD xm917tuh6܋J_wM]#O0|h9y%\ԦuY+ 0FdP1l\\_ڰHVo_OjVl^_[!>i@ |Ye gá:t#oj)'EDCsu=刲 u` Ȼ (Cv،ŎۚWl$9zC &g[ 4}ʻHȄ1'lG=H+"烓O#>>̋L,KP5Fj9ۨv ZrwXlPIEb @ rFy>6:8 i5hri~x#FۿGߛ-y|i& ā|FlSMu6ͤ{uxՕGP>|o ^I|Z] v~mk\˸ر{Q5IأS0FN6  a?t{{(rنM6;$Kyv:8Ö ^:@ɻ{ <71$B<0H(xbIluQxZj%};7[óoݟ?= @ى/Ţz3Ub/vWC(\glp RC<sBoޚ tMWϮ}8`VWY$Re6/}䡉w6Љo,[\3돓6Mvf#S^+9B ٟ#Y ph> nRP<Ma_w޲hNw\CB( /d)@#J-" )T@A'oFn]`*{,G!S;D9hZN.wOo/\6(<a'}1Yi{gte]`# NXȒ *d;ư2xZC;O׵s2PNjPЏϷ;@ysKN{#&֯ga՛g[Cdnݞ@ya>ydP!Y`%#]0<9Dk19 ΗOo8 5Űeud%1^+^+ bc: FHp z$JYly[ڸXLTNvw`#~?:"1@Etj&:`lI<Dyl[w@ymY4Zza}(A=j*HZ `BE@v}E1㐁YmQA&ㄉsM^Aqa Sw$O@\.G.O|Y7(|<e?qYDʐѦfqCG3'6۲w $()ZZn w:4~x97DNtr(6Cֈ1 :P0N-b53U]098*zE:p^n{(? 
1I#}wafGj'@D"tiI vf8\R=W42  {ae27mE+bcʈVxT&8v^nnNiIPl4-UBr4`¡ՃA}n3O{ #AQKϢ)WwY궸W`PL 9y$II m,UNX6TN}ͯ8Ua/QJ""Wvn ([D?Lt5Mw:O UүBFϬ(2L"ةއv>QbCuCF< F]RU<NGi;D-[> 8L gG7{^Gjho O.! >-֧4M֖ss~D^ hoQ9.ktwI:5z% P qwGK42~Ƌwd}#ʙUAn IJOT8*JfFDa櫅DAxPǖ`':ԩAWjfb".r+VHHyrסW$JO(\[9b˄ /5~j*EDٹM]QL+A[#cA¦(ʻaw7x7VB#@gW%2w8Nm3 .AS ׷L!>1-#{ XB*$GyTnI2qxDqA\v[?jͺXAeM22XMZYC~~MIl dC*"1/BDdAu]b'87Jcvdu0 5|H7q\0IFnT`*R XqYffSҙs=<{g#FR%T7@kQBㅄOBP! @ H ٖEMعyvj I~Rs Xv9=+C ymōūӗ[dox8ѩ"Xu QېtAg\DeP$0:% K@Lɉ19aS9a@xJN^D35迵pۛ}j{Tv3]~n|#Ax6Pt Af烐 u&PR'!9>d)~jLO' N'2`o%uTdeiZk| Ab`EH082,KM[V1Aμ|,J v ͭ{~EƷ{-O$sNغ42[f)a譳#ԓZ%|6oyfP))7q SAXhAp m~R!FraVˎɊ NSi`L ‰i~R4A Y$bdo޸ZƶӸWPS=,àP nd O .UӷSZQ\)1*.vX*H BuUk~U T$iN xƒZ5r1(mXTC(U~q&o7f7^^:}NYu:-!+O2F)%wPw¶1 %2 P2" p`AyۘQr^lV->x1SAP* zKi?X{1l㸁!٘8pԶ {59?| G5Cp.#س'S7#.|eo>4r3ؙ9A̕de[˟zla/.'qܘ}G{:C O^[}gbd& 4]ƙʩHA> EsT2?IlޟdFKv__º>7nkQ<+ ^V$PjLiV,:7.3F|'*6pA $7MnV]R#3?}C@q%'\lpC]Y֌.upxN֡hY^ݤ;*f#n<={^\kiB]!V4[O9_7蛉:f"lzbJBvU 9A+EJH݋;8w_"APr[ΏQ#`F(@eyV⺵YnqonP ~((7W8eA'EMH3SNqBr&ٷ.߼rq?hSm^"\;v,+OÅE$ >^lrۻ>[(kpG8E 1L u!y|lm(@` kɦ" [ +,$ڡ"'?ƋRsyw˯ؾvlO֘!t`mY^'a>/*ݪ/,;6݋66>zyw+tt#,_^?$Mr[- ;h-5 Ij=jsqa00!lLriH!%YkÛh|F1x 0 \mZD+Hyg<JrvQn?Hˢ UQvUD 1,iߒ>m/E]~ރPaFp&.Stg4B0l-xM(Lâ: f4s{/_m;=}7=q߅TCwCM̠vx5HB*-dC7ջd5 ·Щ(-^2hapaje5o Ep7;6ͅ|/B=c-mq*rJnє{q=OD`hXh ٦Βų؈%^Xraq g6BIUe`R"B le}Ka\p~TlH1ΰ;,Ϊ![NWs 3}X[a?I7$3"MOjqlZ[ F 1f+VX2vvZmzɲVX b$:W5 nBgI3:[+6۟.l1`?J1ƃol@W`]M{=JzSXfv~ f#ibt]cBu=Orݲv-@@ C3|>UKF̐x~J:^|䁈TޅM^pIy%.[{bEl(!D+[-Bh $`v~45ք,ۮMԍvrm>C&ofv5p,"#{mʯ4t@zPw OS,U_zFr˧< %jt(V}@2K$r=,PDQJT@k9v@n -~F-LWﻡcgpw|=7ʙ'S~ǥz0-2ƌ})P;Bx3؀m2rYN(9 U:"]Ss@ rM0M:s uVr-wlM +\0yT4RDdjcX}(k `y[ͼt7 1ʵ$LZ%T.eR:j1 [hu0G4RI$A#؊NIxnԏ_Ft[htW5F %' Q]{x0h#@>/~?^=gCZCnkM0(/ \T5[(ZN(啾ZFmv B!P$"ar*0_u=zj.z~G 3fXn+ ).Va˫riDv/V~yAsU Tv07z[OeQXg@]+9AKIC'y(.gYs<w*.,`Adw)32.RJmϵv]Ly7!BiURU1ȴ7ӦϚ'NYNӊP'TA)uܞSa*a. LAx4S! [:x=Lئ%>wWIM儾gw"}Nbnw;Ѭ%{@ht4?~tÔ:(d2jNa `vd %w9<)n-^%tvjhv!=nϰ鉘rͮ 5\nZ$`#R=rSIVz*ɀN (bcB#{% ᦝn{iK -r5 dc Nϗ3k2v;iH6]2S230A4fhI ؼɎɂ|{! $7ʬ|o-j9 ZuNB:z:NuPyWdvO5e65oĄn[ acm.>**`}7y'xr>MWLsDj&wxFRx0饄ʎČ76l { Sov7$ہ=STY6@ LЅIMrEnlw:AH,J(C [qeqϕOvT\sFYs ˙=288ѸetћY9t SnAY!NhQ.8dUiCv%?FoC? 
%_=\eƜz(n=9 to4Йf!>Czx̀B$#9߀?ǃRr <&)m&mN/`|yCd R ]r^:y^C4U G/nD'D|8ڕD=S @&_# =xz1 HBˑ`D9* pw@ìxE #"Hʨ*R ĔPJ `#C%*"R6bJ N!|dGy#N+h=jGϷ`kcmql71O@w6t4mD~~@ahZPJ p[N!i\Par\>LhW gI%|p; 6Z"\A0T&BaJ#)HaD DH46'Hd7SRBaI@XHd@,V0* @dR Q^b*$Q8 ÉJ$Bq``9lWD<&F`̢1̘b ""SE d(VTAԭ$KY E*$IYC@$B*P$<$xIT[knSR/D wBSGV$aHX5JB #cx(H iWu) Pp|K퐟'I(s  qlAriEB%=,0B k)rAPҋs!zpD8&B"BCo!l-\i0bWT lpj4quwTDZEr .DsEe ""BhSO`EP6젧ȪI UVW|P1""J["(H,#DA Zxd #H%" NT؊` 4(j+5vgHp!"#{vPbH5@D2CUM4y-x^.ٴwã—j'`=j(U-xH"@AT8A*&-`PU UjH"ds[j͏%m8mSӤL D@AP:+\(C\JXSK>`B"E^d:)e)(d iXh z0B#Ҡ'kH") l5AWWI$P|Њ!"j o9;p"7ד}' pDxt j 0auVIu_ʇL5LȀPE|5QST 3s {.z=<~9 V_ 2 A+4HPBHHA@C5L@PA1DLD"P+@tf: A>$*u ^B) Vh*D I@;#9܍Ї)Ҭ URY#&à_G:Ni:)S´%'Y;ME&aҽGmvQ>7!ݐɥ^Qgx\ՙ5& + jτSEM1*(4Uth9(7WeEkMtc^H`>zNa>)@^DN xϊOsu,_gbJ:;lqrrWJYo[bFf+Hl((IY<Gi ̏!;_Ĭ~W^HH xw9@KqUy[(^J r^9B@a2B߽Ryxleh~Wu6ẌN"5UHD"D)2 ІD;]jF(9%çn Eª&rs:Z%uf&͓/y\ch}MܜmH `pvqC{azv?7:kuP:bRf* }::xP:sǪ V< ;"HP '^:16OY9mI _Pz 4GW8 PD\b) Hf+љ ?Mާ SA@h(IO@6rp%̈́( S\ձxJq|#$"D*ҝ^yoF:5-a b22B3 ,\GHPK= | 7LȸArhA ˤ,:l$)X *&4|;p )+I0&bW3,p(s0Ls)hb3# · (cd1fCZ֔.,0 !i)h0 K?Yl\Q`i{`iChIuJa&HF Mk R p4F 6 4pe$5hĆq^S-DrCu*(2.PB4䙬ut8B"IPPB(ТE T SA0 1-%* R D̑Q1HR 31B{Ѕ(U$@SLHQT{^JUE(-sbBZJF)JJX) ST %!Q7  &H((h@JZR"b)P**ibB^TMKRP5@1CM *p .x%`H^ %PTP%R T)LCQ%PPP4Pґ ,ABPT#H!(m* DCNJ'JP@O@QD9JDM]F3c)]X1G ΁GR&`H) *jr]14A+ϴ(JHS "NQ92/A"rP80@QL4IA PRA( Z)"hJD((J)L"ʨb8ufii%5JR "H)&*o+*T$h @,ʀ eAYSEP 5 v*nzd)IrA<~|^Lr"" "&D^^'J(d%&()(JqA9R;k*r-JM41Udަ":2o jX Baz6PyP0E>҄~{)s,0lCk[ ~ O@(ҊH@B"4 ( P@e+(m(d-:8wG~CbJ)((b&I X %(@b@"^8@tpi5ՂCʠWJp/-$8BUߢ9Ck H%DQsbTL34:I@O> !D@ gLX&t醑"A4JEd@ T/<"WcHX P\jsF4L"hRF8^)c5J"H#hbPUh J@MHEA TxPd(!$A| (HI ©B 4A,"'*aUKޔND^@0G~S(DJ\P|!:0Jf 'N fY6! 2Qf&ЍwlL´BϔPш(>%%Ve$HN6B ("l@ب&% QPc@hٞ:3PdpsODIM(A" e]#˸)̉n?QMQ s X @2QJ7(Аɣr<~ R@aԂ\ Aa,̤4% áz9DP$rTvg 'h_@O&DCJ(]*EPJf 3@o!%,Zt )x' * CݔsM dU^S*F?G y3WNaH1{J(ćlnZUDp;%J)RtȂPnbu)"(>lAPKiWrz/C0\+%+r\WܱEHᦀ D H qS BaHC݃AaT_"l; {(`* i%zDyb"T CWA^P1H`!Σ _:QzdWPmC"tb d@HE}AP؀;C_̅=䀮zE Gt* AU\ATА^DDZҁTC!$"Ģ'BF@Nz!V`@"d҃DB$B2J" fAP@[⢊҈ Ldق wJ>T5"( O{1?N*jAA*RBlH/ɕA{'0(@9 =uHˆɂ(>(k /1C^EDP6!r3>Q>j(?HH@~@?zZRܨtBT(@X4` el`H 6Ȫ @z$/ |9=aޥ2JxO9?/zly"|HD2*o{fr@iRjvG:7(8LF׺ َI}%'CT/R6hJGcf ɹ“!s,V:C7IG!̟- hSqj#h˂X! QWD& %E(D{]ky18SkIc@UIJ /.(dI"}}s7_;?yxECQi}ILӧ haԨOMRd[J]ۦ ֱ+m4Eb!B+\Xṅ2<|z3Bd! qLOP5 PjXv*8JMdwr,?<LN,`A7tגb3#0f;T&;PU-u% DFG_d~ v,wMm H"2óABȂD3 Q<M5ے/[yX?Lxoltk˰Z>Bvx9tCA⥽TGvJh2/թ"V w@ [ ?]s7@l=oҡk&i2Ⳕf#17;b#r'yF.I8X!n~7n8 |9ae'm01Nz0 i<DAAD|:9Z/~-^Q^hTT8/caLM8hʏe~IS,xI;/asDǏm_mɎWVO{g٣ţl((p5k@a"ߢWjmJT!b82|> 'nv]X4_bnM9[}5D`Jv<%;?ȠKz6~VTHHD:]?ފW8"R~4_;$⨑V/~51#",$I@ vݽkۭ?N# M5mn;߁ч\sCūlMNeJhZz߾||מzQE<|V9Q=C3]s?:q$Q '5q]C'g>~w=^?#(f濧 XbQatU٠f+Yi,k=/()2#eDfD'&vH@DxO^=x/4u3,<7gl7ulAu1D@ tIA4O\{UXMJG2W6$__7џLcOTBBcmV5/Jixs$݅7>e?ř,t46yyV獵C0l' rn$ܨX"vZyI`2ȷ̰ʟ&^F0Gɥfk;Xm2ZiJMH[/ G N7 m@Dhqq81 c[UxO24@ܜFW(a|zE6i,9BQb7HT8SHC̖Bcr2ҩ3 ?r(de-@UPӠc=u3]F`P+1T|2(,Ɓ;!?S*тNq;3 N0KjMn/RGsC07[zfB0f qcI`uW~5`8@vkh!۷¤|(y s:Sfcx;/ }&pSm 6ڛ({#"qj|təXlTt~bs}8tqi|"zjJKޛ7﬜ɋgCdPB(6b 2:$ȼ0 ^€,_3|Z7> Dts/5>3돧:K6zO v]$=hX3:y A%"D $AIJgwe!ER?[NJ)#"1%28,᝟rÊ^Lސ1,3EVB'V+z?)hoҡ)I$_%I %д(l,dW cFQ` xKr{8C22C`S0k+uzvȒ H.!Cw޷A7iC 1?<7'ҎG=UCCp DCµ6H?׀|M~#SOףnHlK̓l!~qͅ.pcrvS~vty/m|rtm t>4Gl{ oe}l;_2d fFAi\BOK/7m]~VqDssowf6Y7 ȁ0QIdۗ.ViHWJxXk bx{kB'aG0QbbQ[&;38S[F&FIaH`~C8+{~&U߇m8 o\p$6 8E0Lh]``rBxM&>EywJ8CPvd:?Ҙg:l|i ߟq_ϨfcI!6F'yK}sG另?Yɗ ƅ;1b$1B#I' g:ŁvtJd[ϗ,x|CЭQc̝"MG>\3{Y;5d&QAEJQIU-Pf$k9:ĐX233#4ifOf!E!`@ 5eq֨ ip>*l-ny z3?rLޫX]6װl!쁈og4 i)p=s*B% Y(;U? 
zɚT`'i\ -[LKCZ^lb'!^ȧ 榓8^S,Hca, hn!jJ"i!R 2;E|O~D@Ao,%i06 鐹 Ly#]\P8E:"b0A Ծ.8΁8X eD0!z_L?l*#,5@ 1 CxphQތ,EM:I}'(PFibHo;tzlrBOJRC%A"  ոۣ3L!_d/_ڏ>oP_'f}~-XNL|u}D$qB{Q+C ! 40aD==/tv Cv }iA=i{[Pkz@.<T_kWK sd~#Pp!, <}ti aCcP%{lq! JSSnDRp|vwgghaFJDJFBL _EW>M%5( )5QQS_Uwu/&Qrbl X~[_1׊u#%@Uw=}{vst3[W 5L0`Ȉ=krCA1 %+'B1leoRݓdh} ?t cW]x$!=mSlFVf"-_32% %:_\ [ zڐZ+M.LڬwŴKY&F'׉aQ! ÐL>dolu4D`W,/w < u|_?y)Pva+[rDdf4<>:01+ծ]$(j3d4Wqh NP`#̃LU,t~ۦ :6ԡVI!!ӡ:0ߙV_@┬lۮFt[Rlh~ Wny2PdzB\ûF>-Se0f"X(*oWYQ]V~F%߃-YLIN}v ᗥRqͭpaU"#2c0ȷqmw^>6~2N0D(' q[V'+p!a:XkSvR{gRL<2lX65Y&i/4000yc@KXA@E7YTS3&CnrN3W,O!)l"e%.NQԻU&k=(2چ+ϋ*#eg:/\Z<ĉkLe:/#ί",-Rb%*/4_Ұf0vZljF(36.Ա3 6UrIw}C-`:>!={ iu 0d(vν* 4J(MOS Q:8[9-G2ڢ'UnQhΊ $m"!9@r$DBprdڰgXOρy]sV6+<}utr#4 \W ۰̯t5XX 9E*l rWbG6i!nx5V' cj:Nj1\- (M$"RL<551\xǿJRG$k:hŝsNN50犡6  Q .R++NȄa<ø%2 JPD`/S|UKȿYw7amka%xZ-Q؏G-ҡlebP-$Ed id 2hP;z aW̉Pb< Zp 11[$Q“Y md`1+]8?2(HX#0 Оd!PE])!^4 ζ\+Kz-0aj>"P,% A]h'>@%2& Z=DvDVxGBSdܶ.")as..z8K`0"Pf3ބ'2.dhNg,Φ-ƞ}=J@iO8a5Zy *Mj*ZiA|YGb/?cbsJp jqb^Hf`Ϛ{\e^僎wb2HYh1{g@uH2 d IrS\&6WiTOmȲ̝N1qO6E+~r< BEt׋W3: P s:OiMR&H+2yDnm7 +VAi`DǍ@ǩYTE|@3 Lb {؛[R"ÄC U:j 3{J-e(wr#^Gu"iӼBf󖭹r}W؎ubݾ3.uV=J퍾`anDKSB w냂ImO=mf¹lCVv ُb@":(Z;| ?3˞9Q(!І0:>7b% !#Ph`պ.Fkk3GXT4oDo,g5y}vLx̱FxJOE.)Z0/K#h4 X>h'`jBP٣Z9.`"t0j:!Z=0 $UK!6#v'm6)jboy(*0u'DYT&5OA r%:e]ʔ`Ώi֧nС HX+ 䈸 M^xŻfϫ? i>6Gu0:}{ At {oR䵜@r-$ fFulCos|8ڧ1m9&xÅ苆18(V.A̲MB/bրڴs-g{z`XxFŦ?&HVLrt߿xQAy 26/Pûvͪ:ؿsE$NrtZhy+Hx)8c6/'InzԂmF m (aY `[@>3ayXc@Z5*UHt$ \5:7s?pp4 ".ɹf|<V 3OSJ Gnof[k:q^*w \.$W֕zg;q"}tPrux tgKcQmpp8T0++jJ*NRGއ. d>nmiSMڸ-fGv:FDwnNHt*q6zp'}c "غۼNϢɱ%Tۼar/. pK&grѨŒ`( =fj*x꭯(la}{0VS*m~< Ax&Q6;>k-@m$?m2@koc;D|gaqgSt͌aߚUV,3e)?XW/f71BwX=I T[I/Rs㻼2A!(!8Ɓ̓fL~f23vh.UM&uEpg^DL4' Z%\@(R͠mͳ`KO<NNu>}x<UL}n*u'9O/ElBHASu!?f;#i9+0;Bu3ܾ?n.9y XA"$֕ɀ D*hqNL#ypb=quO8\QMƏRlpv bhd aEǰuyLB0`6(ADaZ\BX#B$Mmh&<@= WٔR<{ܘ ;?Ԗ:cbqC_Yb{>)U[.Քwu& Rɝ6`q/}d29O/6B/z78WX,T1r:2t@CwX SóCӛa$؜I" =>ݎ0!\~NN`AHzN7X(&c9?S(a(-ɾ;ss; "B(uv4Dz5A/ |!eX!]FD(he0WTX=y;3I0OЃqX. 54!r.B 0_ $0M4a(p$bCkd~ v f [J% {(^] pbb,08%~H@(ćOzT~Lf!4wfNi{J٨rA{77p\\7{X'L@x$P*d2aPπ\L䏴i~1"C‘$l.$p?CIb20hi \0G4n4A}(4b]?N!tS{e\X[.}`8xoĂW{ -5,ģ B&#냟(TI}_?O"fN8L-8ؙ"0" \;pxGVP?'@: 0Fg K3 SY " v4Fc"D_0Yr`4ʗi(Av-OT`$Inch z6F >o`X(̤ObFu`'AA^r9iピOR޼'w݄{b LLzyjǘGf jPߦ-k#8R7c\j"/NDڨ !AtAJ `)HbBG$50"LЏLJ'W: ; GBc,*a dIڟyF~xxܮ/i ".2QJ$JbHf T&DBTb!0R DCSWcsxND &gI?tGt!DfP&Qe>]@4 (P%( 9B@'KsTĂc:BBG߮\3;kwّ'a%&"R@A!%R%!QX%(iJ"jN=TQX&֨ RRJ@ c*&CeBS$]@!KHńP!R/ygF"ݻiAdS{vE>"n&'1CBbw88KĸK/Hv\Gti?s?="ÃX`PyB„;8+~jJ>T: Gxl7pLM+CDPp4J"!-`J%(Q!AT%RL¢udpP (( Rb"(&.' цz5y3TM 1$Ё `ü2l~J>࿐@/##4 iocg׎z+# JHY\W #,@50jleqޓ tph?x8kNi$9 0l!ގϧFgH?b2ÇW?A*p\“O 9#IfWyÏs9] hc |lg3FaKߢsY$.qۈsedZ} \xq ^By  ;69$ņbUYNUX*K<ހd >Hc8뜅=kpnӆ!HDS4rܪ/zHV%W0fb( }5L.#`8 ,hAOEq{PQLj 9RܭWτ% )>࿞xdX1!!Ã<$lu)fC^LmxWkPS)$>"'ă&xXAnuL!%}`D" Z3|\ 9ΨE4 B0! DLūF8TCQ5I PpAl\14٣Bi;s>Qϊ~QoQsqs|Yh? ~P8ӄ3%P{Sײ5hI-5T)EhyFaiG;"*> {{+akgvP,;\y3~/?N>l^Ks&`!q ew،72B o 0" D[Nv:tME'a.lD[ʪ'yC00S8.p2S$$%6ڽQ=>O,NZXF8 NkzɁLlwmy _Plױ5 7]wU j82?W E"B 3ͅKhy2D44uĆ L$aph|XMF;)88CMkAF# 0f %AC}ƬѪՓY`PhQ;}?p`Ty!QNkrb]PLl>b;^uNzo3 C鍺/g5>ـҺM?Q}c`= fQBgfO uv3 ucՔE3DIIJ~ Qmlu]F$xͅ8"(TU'w9ɻmy^ :D5 D>{@I?",JdVѕB@:xr' @h7VM}ۼ?iwh3#ɼϽBJ#QW-6YUٽz>?cg}{ur\PH4>/ssCO Mze^c[/g/M3Ht_:-XՏ_Sׂ㣳xLWPM׌p@;u~?bޛKkf4;j.2YCg {&OwtAbv ,]H_ga]ϒtNЌh+8` d` D^SavMr>ɢVP+@ `X / ;pÁQOq7Bt! 
?GY?oQX|"OBȇ 헾KM7Mz}vӃܖ _ӨrNj A=_ZP=掐6  Я"cLO`W鬤Pa0+ϡr~#.T4>:Gq`=p\!RtW ZG:s|AQsԮ+imEDVf{JOQ!ԛ:CٚLs]on#>?'S'+ݏ|F茁`NND2 鿀?V!L͝>A0*i<:ljP r*<"DP4d@4f.IBRd&IPHHfhr %r PB RPh@r\)Zև54)kPjuNHQfJd:iBc4& JudD |}hD>ao> `&fwr=j A( <Є5 g?VFLrpӿ,hu_qa(~6r〲.[X8(+/tH&6=/|MPu_!].W Ey鯈C{gͶY[9Vv*_rѿa@4zPb$Sƿ?ƞqLKj)˦x{ P u /dC71B@NEHfIUጫ[ͼ5ۧGPu\=M6Y v]?/sV1oY1G"dH10ˍ-NRtkNar^qM$Sǡ͠h٤ g.!DSOjs 7F+LYЧtC > 0ޏ $tZ@6vBԋMDI6@K?P/y*d ` `AW|;`C^ݝ>vG#D4'.quGzF&qkAN8@XN?(:RHwuϵ~1w?P4AJffe!*jb+!읾 # GWklleO58zORfsQ/NnK&J,u@LS Deάk#&* @h46"@}u+RFgtlk\m!"={:!kh^Ŀ<ƬPQm$75( qT+z?R&qV`(^ $BjPsFJp̴ * F%TOf &LNFԜs){LQ>d4]&i/|H_-"y*օ"+ƏTUSSPȇd4=??3}OF򠊹k5NdF5^ h 0gMY,aԬ Q ߺQ)_[_0d# %(-` D/P `E qٯ%|&/vҁ#XAm|Uf0frS@"Hs"Ω:7QUJ#P$jQ+'1j &ڤfVl0fE (oglB&fψb GM5*W)*-4Uq̹EE4>_a "آ0 O`U/ '-Q%9a~n/6Z̲nجH~d fp ad9ѡ5nf) azrz%0lq p%&D( U Z[l8.S(D@@ e(BBu~6_ P2E?{iÈ##C҃Qt"|Ze| 윉*|[0պLESBT]Y^F FZkg{fb Ik^CA 14UAF303ZcGܮq!uXCy+8]MS]ѠPtM"C`-cA8ȖVp,W)_#Vp`<ȟG kJJ\۰9w!FMN{s6ۿX*7q0*$ O[x17%7„`Xu `65%:sj^=r0i!pܽ[>j"ӓ v3##mÃ̃\ !?*pc#b芋({0䉚HЍpK4;#C|GOzp38>IfD@ǁYu&tQC#k-&cģ|:|xb:|u罉 =KNAdm%WvK[cEt.8Q뎂h(IG,#t\ʸn{D4X>i,\Kf e0^d$5;d,p̘/ř [߬-!AB*xu 4@_h(?ǿlzfR?Oս|M"хעsa ]:V1#`Λ+A1͆ܡj~cezߠN2v@oB/饈6!^A]Px( LJӑ 3\=?@#չڦF~0+ꉯ}oq:@Ca\C V_Uti%jrrG(ytXoyr蟠m A JFUBXA̶?%nЇdu=hci&#7]W .ZbFb&FB䵯oEL~S`d? "z"|[) qThM+'gV9e~6 IAcҍ, A䫦BKxHe\y `3% C8 ևٗ?e(KlWr~Skq[,Gy͗覄deؓg!A !8C$ AKp&Oq. }䉰H}LB ̕<ć'/א ~MWE0 `~;z3)ͤp &•ƊQb!Ȇ"41PllrA#8b4!O4aK%(S=p=g;ISΎ$8$6|_\ A) `#;x; qA&d( "yR Z3 COx|[49}z4.c"A]u+Qcw"4&C{9DhufPA1;OD̸"vzH7<6jP"S@1J\t@#4 d2҅CwP.HD NP1ڲJQ٘Ns;ӘI:?)vv;ul7Bsy_v! fZHObLhɡ)a||NA%Qy k䀗EEÑāqWiu$3"}Vc9I=Bgd$k, Ѵ#ZjK3ݢE4e($abcHctOp;&ӎ$=)bqC~d7A1%D =8Oݑ5펼Rgv,%} 1m ?u8X )ś)^fwzzLر]nV-TX^S~fhjkm?`OPqS ϛsRo骧  vwFNy؞$LAS<b$@0J)bBpL!*!?NX&^!tSu HE9|Ptio?Gx^M,{j;e_1n8lo(~%h7A)  9[ۦMzM]-&=0Em'I剢=C 4pG>E3w?CEz3,rrnr$sb X{p'`MFw74hI^ TlW',Dz^62(P7`$jCMQiL_c != A^ǟ}~uOB.]&Sbĕ)PCrm:lB`P%@{d02;\0k?zQ0ck9h>G\ ?NG.r~]a/讟wI|*ҥ(t+1هDe 4z^4$:Hu>0^:`)TN|Guc1>=6eY tń ]yìW3.O&Ϣ, ҁX>D n>O؃Ng̝b^  Јg0G˪O'*3 3x40` ~QaK| /(75/`B {h' KDWDLikc_KcWu| p'2Pzx@\!Pia]3ABȵL/vB9.$0(DUŚ[[>U}0tFëIٷ/B%vA_Jeo*\U ~;"˜I%$"R/^BUQN n{0X2A8_| / ps=/(rj:<7dBk4t_ƓϘz:4,OlQ-ٽd𙝦+OOY7[@q+3FIi}}7hEEwlC]9".|Fʧsc5;0.i~e냚 % ?,_mk4qQc"I 5>fS}_w'N`=A+/tx< w~,h0SKzKtjAy EUb{ZԚ;N8@دĜ\f쁝}>YVS\  ͓@dG!Cw˶мҜqX^Ia(XR|nw| Q&}g._a@߃;ڌvJA-C_?IgLv>p )t"bAȄqq@VUO1ъ# Ju,95ߎrGaarX`Cp)35$+h`~H:"`VcY;⣄f%h@t!O-qiNC $!6x_sD#qf/Ύ|єd-Zq>7luQ hfO},GS2lc^z{+A`>@% #=N_>'n&S6|rE !q b* "x"xMoc(@pDD7_\].<c!羜oSQ^` qQd۳DNO@ eb2&kۉǛ272Jy=q}yg}rC\clO%Ёgj\f%SWnO<f -GǣgjYP0{DAENl +u$ǜ2@s5MOLH .iM,&-} !~ bQ%>Dק^w8 ŷD Yz>{ИU[(1$" &,pm@:'Yx~Lsd0kZDlfrrI*Gaީ@j"̬ KV62}E~yS8/ #N;8`_H:cكq Qt ;0JǞ e!7]s_{1ebCX}:N}Nӥ1gD&EPsC $*2Qdv'4TuH4BD Ty2ڰOFC{ ɭ,ҋN1w86vS\5w?x-j KxׇZ RgCepl82.Tp7Z>v[Ѧs7:P Pмǁ΍q|7|?^LX~8'_17@ߎ;E'^mVw}`o$7=V붒jK[Lp}yi&z]aĨA]^:rrp zmͣ0WED hd+ ^w!ά7'*∟2\TC !L!6Ր)($wKS[z pO:HN#Y-3ʓ$xp"G"\̱ޤUcċl.HՂ$"Ax@/F-_G$`B$bt#H\ÆnxqH Zo`47iCr[#;&>SWvf?߇=uzs֋B_ԣ/ngVDfv+R ITz/NtgZ("7g8Uar.(Dht9&Ma<>,Xɐf˭,x:.c 76^|~BPH[r\U$!fB$r I!C)h12V$Z " " .KIp"d@qII6 )b8`lMKĐQK NH8+ :rq?C\CoĤCrXE bG{]Dp~hg2!r?K'2mh ܣA`,(`a§9:o-O̊ |fɉaj=IḂ P PDEr?xrA^gr~s:>𠗉}+$ޖ7Q\Fw]k|PO<9AbHw1 j>OuvR wiZ s>:'HD+WN'7 BJX~Q& (ɵ sCA4M"=Cr>),K:J "065s*/")x?Rp銞ّl*<9\ALAq@>Mq2!#A8h Щ?ŀwzyG.&A¶'SJ ՉA?0%|8Ct *vW牟0pX.XlRfuc |ms8:؜10 G&kXU l) i7y ͅW(2F`auE+ϵ _Y<_IT_s vwTY}3ƌ4f '(11@C]$rFMWUqW僷 )J pT{x0 )f [?j:CY4Ufr`?4-:6DvSڅrU5>ќ@ GLfD*AOEW#91*qLBOX}aҩNg$!էʯ .:&~ݙ<$J"0vc-r!ڧop~d?܀TRWd Z?]#h^*, I"I#!'&/C$qGD2R[݂4B@/wlGJQ;_Svvasl%HNxyftӴ*1e%}qrͦg(.*# R(W_-2%#0B|gP"h4o{Mِ u{ݷk8p޼ E5_{% "{_RJ5΂1J "Lx t= Sa&!ř2Q4EyLRJ!NymItGwo p.Cy锼#]I0(8d&Юw Z lqܬ/֥(dV&D¥"'w+'-GpW" NOIw`Y Wr_w;4P<#:vv 3D 
BE*Hkɱs 83bgbmFP 'ȄPxԔ{/%s =*:7;[!~!Ȍ$18y\׊ٜ\Ժ+]{ ߨxl`{$x(ZBoYVƽXs9~eފyVp c3T pJ"Lwz1D Vî9¨k)$H(IJ2ۊJRO!(!rq.g55VPעmsZ B4w}ǿҰ| W,.ΐP^S9K1lST$sXvef&ddI"S@51 S,LMn$ 4(b d16FIH:NŔM+CpVG]⎺W#5=]*aD啢_4#fpR$qQ3.B #`0e:iSF˚̊,!6[pLkcU4!B@:0dS)/4.h) f6}r5,S 'aQqL~&8%H.E/L8f f L{NHbK 5`3}L̪1wгyG7u i*e߳[ǷZgŠN}=^bPBŃBHW"X}ݭr@%nL LdJj22|xkAq4I 4SԭѤoݱ1&>F3߶ѕ6"ɍn׭[`>S?]s7Mw%Ʀ}̺eejXzK`D`uLD&\s}1W   JWF4Aɘd@Dfp]YZΡBzȡQΡjɹMM: MNA v_-tqcŊvP|?:obTMkOg.Ev4&OGXxj e(Au@U%V:ٰzչ)D)ڧQ!6WD UGKiuGgZt2rĈ%Tޜ*q֐tZ-WQ}d($TmJg /UB\G1¢L&JA6YMIR1';hх,A&~6gf)JˊTYXIuBH1 Mi:B"\q[*H l`'=b8`jA$Y$m Hh =LR_JV]s& jn b AXŐAm bߋywrͲ $P@@' |!Z8r8-pDEA0 j?T7!8;6(6Z\3)qPP EC0Ijd+, ˁ*m@ W !r$»Zk6-q3TyTR5eܙ Caa !- XP2@QE ++ X-@@Ztg0AX"8 mLEZ.U gQJ+,aeZP#qi* Y@ T%Qb+Ip2.,e#p9+dsRR\]ʢ$UDFEl-if4e D \3Г"hDAC = L#(UGb WbQ5̬1w{5`feYpCs╘$`UCRA}V〈qwwC3!aڤ0wVڵZ)(Ǝ:˅zʫ9a\4 8RDY`z /E!*" ViTN(Udb.AqSxbj R!q(UH`` ,^DZZ$JI*_en׍FXprH_)Y!DU%rp" XC(3 K)!RWdU$BCA4tQSD  (*G`]7Aged\dPnFd= ]% pX (ueux ), h%UZ*уRf"\E( j1%)T05r@*L64F" |vлތ¬VubI6 )RB%6U`QY"PY0Y/R*cW~"1 ~<hzOuB Y@j`/ck @6j 4bHChV-6*p :yH3rha#̎ d* x$z:(PTV\,B$8eRnBV!]ji$(JU(#hh (fG'yyEYg4UWh h\ SJ0H evwAk BR"DD@]E3LUlq-17!sXH*9 i;A$BSJfd6#.D:4A.̌ZA[7eDНY(dZ&pUE5#v.+i \sUAhU+G *.Z#08Q[9b06pI/yIP,hņ H lH%1E4@aLZӳJPc J @;.K0@AkUt(R)wQRd)@D66^;XUWCj4$joa*B"1U5G2i0jeL:2$Ƥ2) TRLˈ" h IQP+1HՈ!E%eTi`#jH@$B͕+} Heݾ4V FFuy3ȇAj H-=|W-BO8(Q w37XC pNė}!x>wt]%T҄)FYݎdVlJ2c'NI_t)\J,='>M0l3<ШdSW7Eyqt,`@DBۘW"MlL(];y5r[+!]p{-'q|f`ʥ>W|"iT4*~rD BXQp@3 zaeBj{Mڠ_W3Gϥ{_qV'w O= v6.gPuvctge{ +=}#(e" TUD_uUZd;*h$Q@I_kT:H2H#"w*BF"r2׸y1@9PvQLR%@梆(|D HwZr#j}Bxoh1D8z:7$cj ՛0T\EH /VV$|+)@aBA@A èCD4 P $[kkE =Cv\da`iP&TX2VdY BC :D! B !8rW)Q7GB% Zv Ű*$ YId!TR!]lRBWM&ssbA %%@"M~+W-j0 ½6PjD ($7$$ tJ$r-prXR"(캈! (Asi,:$`֪uJ3CB,I*}]Eɪ昑Q!cLc,)rRGWg,QYe@(HsIQJ"D9Z+"D3JkRStKSJ#Z%L3:zK wa{z_yazP8 Ja*B[_짐}"uPQʠ5FsUFkȖ#Z!۫& 1*BP,-VYof"J72 ڑ @061`4bD m9r!b10EQ"A0L.P^1BP*V0 +d/ŋDYB Rք T%(̈́d*إ!@%#L*k1.`R mU4łγqPUT1a=@=t 88DyZ) ̰p:f@ ^$&XRYtp݇t&az$hk3&H BQ]ZJ^Ob_鑰eIȡe T~¢) -ys =O-)~91`*!b]l^= %?%bs)+MM,eif7wgk|ix$-O+],]J ђXrBi5Ee >bO9 <*yq 7`!>;#, iIY Q b;svn2hojoVL.ϓM~D{ɔ~w>pG0I~⎮!!"sAHx](-jۆu^9Ê$=^wvf,@:D!<* 9Gqf3o -.8hٓQ )Dc@20CPR`ʠ)I yډz=4pB Xr$qĴ -fढ़|՜u.-n֫il8Pc!*@>0,|HM E$ Xٛgښ\і;hlMϘ87~ѐ㓇o!iՠx@Yb wWxZchSnt:kbf~&Zc`vgé~Ir7#R4Zb!v{RF`p5>׼#\oʍ S q-8H!\DHN'AoI|ΧK<""q6H0Mx#dN:yBpR鴃AȔt @Z\Ub-t;%Rij33/܀0LQP ]P I 1T ʽ|r -S?_]zK0[R9cqXs󁐭C 9j' b/'`|a Ni!՘Vh΂U62f)|p&T y,f>=dHs}H#hc~N"Z¡7.ȯJ&WG=)F>ڦ ~! 
At>7HZD!xNn6=ɿAKqǻs@qxA'h"庵BH ;EЖ>pU\h"B]kBoJ"bU$@bwh@f>&0qs@%@S & )~ܠ K.C#JJ(Tml!xcb0af- ^fsw\z- PBspB 90!g6圣?^DBk"RJ%xcYhrO-^<`)0(hW'$rjj܅~\I _~;H(HbNt,bbXbb _N5(ad|$r6WJ Y`f(2>U: lf²_@XH`/|X_b˭*A:\TXRB!bc`t—-3̕Xc+xmg1ҌV!͈ N YSz YN@~o٥-Qِ_+*^L B #y!D/B #",V pIe!vG 9p5D 4 BU^ǯ1_GdRūP D?)w崙10xOystB VT&y GS.75g,x墜)aJ u2Ϻ0t <gh{^!FRd,=5oN~ֶ0'>)<@bB[o,ȎVR;Hto'.hU`QBa.Ae=]\]H #0SGzfX+k]Dp7W5겎?"j#Xg?w`YeEoxU_:71JKt/B9GECf'A/ȎLKW)AÅTַ2=n_J+?HZm^s4QOnӨ|./cofJWu7; (A< 8O6Uu:0b}og9n/|o7˶^'3&4%ؒD&I040ƣWêYN-/(I"[pyD Ps*3ng/G G>|utۏS̿4lp5p&V ' 6iNF} _M85*V )7꠲$YxϿk?S$2ƊJzؼ1=OK6 5aEIAi}Tm%p#7.^}bם r;5@rdEéE߶EyO)\mj\rF+s[.~\Yk/oB ݲ:(/I1G,"o(1{$owfxTD}G[L"j>F3o G&A# f쉇y|FfH_]<PSq0c![’fzygJ` >Tvd$t+2i/M\Rp/:A#qag4x8#}j<~CepC߰ k 4I2g~=t~F1u>۲߈{XGwlSL*L\‘rQiW!Kn·X3oUvOf3,Pdѭo n]s>h̔Mz"ڿZ- ##FRx\ẙU'XvUq}?6H}.pF'%HVxyw2?B*.dʘh![J01u bcB)npŬp>8~$ Z : {!`[X6LGL8`R%sPΔ(7vh+QBmy,[{#@}F-@DܶV6s-E=cZ3f@ M c 5YW7V69>Z0WU2b=yMiURhK#jEp}dJpc>nwF"W #COUv!3nO퓵!'R1PZ MF3HNURv*MbfdGO jz*513'@)SsGP)>j t 礫{BV؜waRO#KѳyՃmf("\l {7'1}tud~5[ -tRH .=$ AYg̊Ķx?|`ru+jTGR+EsE2͞ \KBǴo+a`)=nboE,0xג3Eed#H j&׮}ig8~8rgkO %vdi~vVKU:!~Yu`0t |tYr(`C!,jú%Y+ A$-4Dƀ!V؏;scmlR\]g]Eӓ'9Svj FiT""T!'A%NJٍ4# 0 O'Wo<Fiuw|{zV=.lTBNB£@>OJd ;D9jm "Jo< 6j9?^F63g&rww⺺gaj6KT Yt!CT4ѕM=n؁<br+x+v/`VbN ;@YkХT'EC-2f8sWLRjArXXP?N)#S{/,كieTq$>SRueSz/b86'dbX݂G2u??e=_xk1a`/ OC#n{?{M ɶ,KU Ǵ+~VjxPV(iJ,"{G F ?Fm! #ӊ "LD@wiޓgo#* !Rxޘ{U|bK(O<, A=8$&ǭ ĞWWwsi×շ/)WXeUS㺀e~c~_q˂"H0I"HI9Jh m_J9(yp@!B%"MTi"#maaW+,m@+ y1hEQ Hˊ*9t R@)n:8$9($s-ܻ2k p0x´F0 T 2LT1  B @@ $!A@B@!50z%I>Gq1fNX ?'|o=3&ҭ)t|F_Gs$kYd)G^sJU Xǯʼn[VkcVn?[M4qTX! ./^6',a]I&[wsL Ջ\Z.f d;JL I,SӉU:>˅E~چ1߷i!h*řh0AH΢u^N'hMuّA.=DoL1*jU|oغE3_𪐇aUm0Fݿ+Yt<¯K>\SP'ޯƃ)Dx4_{]{UG.?ZWK'7#UA>e)ZnVJ%3H=,ƆʣYY5m22+ 4@>ͼou{~[^20(33I@@&$!5a(u=V;_vG3*c$bZSrd{iەA Q$@B!zDA! v2w>%*;)!{m!/!ۛd`v?uq yժۭ$4ex7 ӸPT%{=!W#>J,Zh^_1|!,y.CUjq?Om6rUcf{}z[-s\NW?kׁyk*1Q󶿮~?nU(V~c1@Vf,mUZe&wA/g! ИRo3}Le |ĵ/pcc5|$"9[}&),?gjyEou)o&n8~sՋU=_gwJ:MmOX憍_N2ocX e|\WS=mh#vp݂P.R.V%BO,$υ`<;jR5ڼInFWj#IeT~*h~zGݣk\Y 5ԣ+i7=Y ?]VjZ" A6(]/ąlZVb.w ; $h1|ukif_j e JBO(E^b~/~& ±$d{*#y4B'*жqNd{X<)^+qYgF%ⶡ k}J @# ¦5(wҏ-roY_nv'nZȠenD(ЛX0%G>`h}tN ˅3 FE;V7ߺ42?8>޲ejL @<_FQ'>qn׸@:/"wSsF6R+~Կ-ztiUl֐ ?58s(=7`Av* М_[hRg[<@OP}qNE(: /  E6t/"NP 5;JY")IdcsUt|{Bp;L^`0%34WJpW?ICU̷*j_չ!y:|=VN}~'B8!g#'gV>7m>st#vՁ9)9s/dzB?J#ߣ aJ9r,BңwrRZ?k?5P{o=yX?8R7}w{zO! &.^f_x{_Nb0 9A V!ߘ G3l/PNBRo+Q cQUPZH ̾vH \x*5LFm^` \%Zi#X(zSa}̺wKn/SQ̜֗zB8JxhAmq=„O#9]8?mtC\ۗw7?滹Ɗ^ЃsA2Nδ||'~f.Р"ݧ˕3!l׫3iQTeFa-K6V"ew6غG2"B4NJ AcJ@ޘo,U-qEB~۷ddo!"@T3&LH>w22<\JsYaQ6sݶ2E%㐚Iɝ$*YR}tt#ަl+q,TA:ɸ1S=g3q]InzHKmH Ad ,ڍʱ= TtwpC|mPh5-ó6˙mG'GK`?ȕ4y.;$o#kL3 #(4k q ]_6~HKMkz.]oQj(-YG1廙6m\HXt Ɏ uBHs^,e>vY mv{W-B╨̪g /"˻>U-TEò{9(A\=1g_Z@$"d;zָp(=d(q:C`hb +LK#*f;mN cˤ*H:HU?* 6bAT[H?:Led=son YRg I;D"jLJ/.rOޛ~5w+ AH9 QL2NQzA?T<gx#x6XYVN5VQ|SD*X "<YlR>[;v ]J#pD+ dA$+ '| gėbhIB ~B#d@ JY}yHY!B;  ,)aPBLV,YcUq,ڨ#ϑKȣ*aؖJD! Q\U!Br! jDRj[JbD ­D5lUZo Y#$BLqڪT$$\pI0  $Kō̪`k1U[:80ܘb}!DBĴ@.MP` aPe*HԴmpI#$ !!/9/IdxSSR}z: WG7??]"Ќh TW5>]| sփ<(/_3p:}uy>f/Q\>+_k'ݚMnG"JW !Rt!6zƊǴ+։xߧ?ڑbY+uni_N$ӁZ;ڵծ|S??C`q0sN#}]c8,ҞWv6E<;[YbS/B4S@MlOD8׾!]m@]AOOKJq9Ah.w^%~? 
w_l^} r'5!`O>\v}wgnCT|?դ;KE漅鬫P@u :wXBXV?a=dSB?H }$eA _a}ow+M0}BZr8ok>q:#.g9{B@HQF|1Crg]6w^^Ï:zަaTI;Gw!|ªͅܟ4oЙs+D/:ORЋ2پv\TɝZl5/cN~$v4n:R#4@uq|!Kԅ.o'}^wӞ7lΔ$;yI|ulF'+ܳr ]k1x)x!N:FNÊ4g4Cv3xzFW4 RU.^柚Rcsʧ0Z|o~w&^k!egTwہ{90'wLP zx(?Z7@IUO*Ȍ:垤GSXJޛ(p{*X0RC01aXPZu ^kzpu-W#h+~6LdFwo#?m1~'m.htϼ/X|3_k)}\f'v+?l*6I6^vK0c^O8GֆSSsT=9 vC9#4ó^ %@V,n^VBlmŲbhth:?֟ n\k VI%GGOYu}nc?TkӣN|&&?%g^ʽtWF?bƎ]WWNzr}N^hE<&޴Y OR=~}& 3K {v`^`OyF,=-6`;&o6Yt7,?oWb> [{g׻1M~_ ?b_ݑBkV0-ּ}ZvkR]d?"ޔ9 BbU ܬ; z \Ql/_Y/`8!k|Qy*7X7;e(sWنզ"tLɖ/^?dzI|S,/߫>f rR6lZu*; [h)@Z?7ntTui}ka?& 쁞XY4PiqА XX> S~ u i%[̫ن*q='" 򾘇v>>_,_SH'P \D٧qڝO9Sx/^N5lbnn2™J =0O%gfڹZ y@hM̫mUL;3ңX/;Z4MXBSUt#DZjeIl!߭FzzI"h`\,ݡos Hn2 CPZ2:əkezOȧu?/s{*%r? M ^nWo?i+N M(?7]Y:?qJIwphԾۭQ[yEXX-)Wflj":ŝUݤ[r0,«=R dAⴒY<ҕs*U7|]x\zt^G-K,rܠZ1?/+pIQjIW5 abc+K R^N V-d]%,OhF9>oG'ӷ]fhLgUؘIOԁ/cQsVi,}-jH\@#Nl\2[ȭYK{H s;`@*PqLj5@e~F^KAi@F-@L! DN?{*quٵK+{Nge=6z9Qm*tC3վ~*իnYH |}!@>f2bA!锼]t$<ɡȼ,H%63mtpwWaZw75wƽrR*qs(W2mQN1ʨ4ɿ{^="u\*kbȬL|-MQShR']) zS-T%֬eq|Z.Ռ~RDnhQuE@RFYBM%]\ (4Kg@'*C5]f9kv7KՈ%ZfKQ00) .B#/+jyn{'b[@rur7q3@,.DPhe$j*BHO'§V2p8 aO ܶ-c|LW}x-xBָN<-J 8]{O!?HE18_htJ]gvʈň}Om;=&eX9MC]jm<~~qU4g0-cK֐ VmI#N 'b"6XD㢤"Amu!0t|;]/  rBt pgf6$n/Ϩ (-j @6SS3$',1џ%Lc'ÜDH:x9ej2"c ت#n)Αs\F٬xUR%aB^+BEC0FjT'nbMT&rz4J$Y!F/bfkG˼YT&(vG{l[gB t! f ټǗ+ʁS}|;, < V8,5W)9}An!ؠB{Lʾ n+A A\Đl53j=Z.H nYkJw+ajU ܆mjW,5];!X;yVrtg*jg!,B !*6ZșkE4|ݥݲ>eRoV xGUl\)C?o9I$w_[QdxߥyXIQ,fDȄ)'{j_vbs~!-}3-9 $3 l} %O-tā=k%-._/p\2TrYV8?\q%3&98MўUP{Uw1o5_:mc XF%7%.:M 7PM|p4ռBQe$Pzλt%[k o_[ƜԿl LY }.L$G8yNLwX.COSK =5z%B& K_@I2o1L㐢-*^RܞTxWڢ{w\mUz*%`.)_;AIH8Fb ~WO<jV)'Q?V53 oC? vZ "ʲAvK" OX'(+|6h0}Z&f|=P_ߑJ"js?f26[Z|4_O:{V+myY9^ oEF<9#\bcwb6R1вET;9.25-i_&5 40SЮ~Q]nȩƉ-9'O<z>c:фz|Y{qlȜb.̹&~g[xnjqR~.jz⩲>Z[FZC#[_?QO͂ \VhZ̼pCYVOC]lPKCr4wʀw&80(UߪƳjS-h A5(MRT|4X;چ. <V9puQA7kt{JY ҡq~R&I[/xR94>8 [Y(ugyrqV"gnWMV(^,D} HK,;1|0$SDrѐ7  ;Zzd]8(~_Jz{\q\ؐ`NboNZ泱AzpURD"DJ u}Ĕ5:zc,Uaj(NVmB˞y{2+y;l(Q39*R^Fic^1Fnk8KՓo'gjȷ6v*9*y}|_MfS8)[N 8_!Z;!*½y~ gW]mO;$!5K2x3$=47~{en|zMDb3e }uwvq) ({v!>hőobp''?7@蝿{8-}D5~59pWuUv!Q> oYΠxi)|/J}}W.DFAORm~ewo3c>C¦?F/SGj~cԿ?̤h;cze]LnP']\v:7zE"/ CeQ$>Ґ5_G B7gWQ/);?O EYOv|[`A~`>PϿh-)(YPy%0r>.L|gtVݥޢ9DAq}6C@  #@C6i|oVo<A"3NP::O/ٹXg})O-v6_o `s1x}?S|?[GsslXzN'%`䧰~Tې[H}{i-O+<?<ߘq|OҲvS(~ׇwuP͏#;ycgOݽ~}݉;&*xO<}J?龤b܁y5lj)~Oc7pj zJG:.ߺ| elk"zoWA&Yz0~Z=?9]Ga_7d _K XMrrVeVv½ϳrg~oj#ou,炮_ij2]5]-_ՌT{|ϑ_K aw(>??G^ݜ>Ħ7HlPj_ / /~LBvBIXiǠG'? ~Xe`7j]}FoGA•_ % 86 >ܨQ(l"ڵ,9!ω]h:ƒbd4a&C&< WPҸa]<}aNAQI.VQfxl=d*H\)|DFkAwTOEWN3:k!HƼ3U I3|A`-<mF3Zm=7d< %8/D/ g/{.cͮ1A`nZZV8jG x9ZVVOgzpzƯB岼fƏ+r"?EfkhYqClavdx_&$r_Ys,!kR?[/i jȒВS۵?.X')AԵaSiծ`]䀳u.sI7s^ď~ɻԩz$ Z} &jU񼜾UESלQ6y)x1[QЉ?.m?5^;zZ߻nY4EwlKѴgݼh(x [P/)@c%F ,O5m7 .8;y-ݳ]*=wcoۯy)jC'M_|⡈vO[4zXbv L04({ E#>SgL, XRm-}֚`u#Ac]=ž,`^JĒD&[Fv~Z щà Sk(:mg&e@T,@#juU 'oW/[=iIO{"w;=Arv9A'X"mTl*Js£5w@Cm\J,8 ѱ%xpT3f9ߑng]y`ep!dH=Em]pAE/[s{t󝓹ϭ_ia~gd'l7-xȇ_*cl Bǔ3K2yXS,CvN\f1;VmqiSAtȨ$] iv.ve>\VfJa%c7*}ЌVr}?%o\nyJ D<QqFYrMs]U/tܹJ@BdB5Oqn[D؜]bY" =^Ta x'0E/kFk;BO9 {ݓ:^_ s6wmn3H[SV%D|Y SIgCM30 LqίzY4WxEq}`K؂WiW\,\J9~tdgp\Pۋ/'f` ,=OE֏ߙ\?'zr7?> QK-UV_^gc؁x!F -Vb,l O4\;QzWR䡑ߟ}DR^΀{+"m?FeߘO33Mv_2j@ `}o AXxbH rA ހUNhQi-rr/d<J3ԃeJBPlaH2B{sK+؍VkVܕZ]ԳQGY!]1g^‚㋦IPcv޾Db:h"y˼YQ]6[IOU !^liT4_3ǧU/e|+VTX‰pqNׅ&{t0\.j=}_]ie[xǢqfH}Gŗ~X_ȷ2!lm_;zͅ5^K$2J;§F^E$ ʅOԀV]~tvv?E k<gڲӏp Mp"H nsª"39z3tm~4LN.a㥜fCJT)j׆B>{1etel4h, )X_+#QIp,fV\$.#&y745}D(^m%ZuM˄}6Rx^ :c(BnAH<$EqG~+9Q &  )5}) F&8/_a׳rgh"%)$e3 Fo琚oid#!$JJk>K [8%T,TB#@JkXem'[ *|F`9c@,+zx8>62ةe Bs|sZ7uy^ +ۄ+*٧9ep(Y5zt#_~J%+ (߉L%!@@z)E5kYvW_2Ieg=Ux('vgo^}i=O<8dkN?e \wamh?+l2ȥ9WZ(<;PiψhF Y_<a)5 Qh=I#azF:9ז ArN.q~cDHbLjRPQ@7˄^̔#B'|0Xxy @/MM|{g׳q֜_vܳϕm=@{ MOl! -NuI,RRrYнԔDstXRظZz+wK` U+*`æWKe%#? 
D{շH~F~&뾗wW ډӸ׮֑e`lUWpouNO]N *5 |\nsWzk_M`#%%=_x{d '< ]Zkׅ}z' jb~s X v\{} k;H/2zuܿpXY(C<7"%^W|WÜ_ ”&VX>lf%k0^Cm^ώ0!EA; /fCz'Emcaʝ^no~ߍOjpm隌I&&#=j_R&lSU3L_o3c ;L4s=ǹkG8 A [nE譍&K7|IcNV)C2XV4P.:# EH@lv t`PV&0/.0٬@D'tz$T>$Gl^#V2/j?ʕ<\|Pe7&312)?RE8B״@ BU|ͻ |CC .@,O59#CCS V/ i]vܦ Şeϲ3n D~{jq̞y*!ޣ'22З=jF"Oj-Y]]q־pWg_*4?wuh抪5h4!/Z~mNuyU1w;<kv_a2.gp^7| ZTw[Y$+zL-0T.ߟla"$+{D)cXoYx-z+r_r_7R %|#^)7c̤Y߸CxW DMԯoW-q9}Z6'k#]Ÿ?#:s(v+RR/9Ixތ6Q>qx*l]hT:xO0VȦڻc~6Rb)!,$k@u.': p8 `01d%:q/+Ǟ"v/_apfھ_m, ~F:Ck{9X~o d@ӓ&]niF*9HRl4x$a8&u&2L9N3qh0cf2Ib @ $ 4P3@CCvI?WBIŌ5ŴlŲ^WR!G>"CE)P*Sʒ# z2j# l5m%Js:Ip?փÃ9ʟa\ƔDC%c2l\Y9OYTфaE,%{Z:,Bc:w!i:X9> ;eF̆1d~ܵf +?sُ9c P0A\ݓ2قkwTR9ZTi""$b~ 8]ǁAzǩQ XW(J+9"=T,*>7<۾$P:Kj̿Mg:2pQ ,B~d8 t}3opS)?ƭp&p"MhD+ `s|-fH+|c&%Z; 0{rsVX#Z8F*Ejo1@]ԏZ ?Fzu}/2<ֺyFCi\Y8탳_r>2K}&Y+ДT`URyӿB}Bmjzo~+oY>H$;m%B8=ˁTqFk{a“A@5)r{K0 ޹Y:KZ; tdu͹]l@HFHMH`l7D}gH@y8pmc@zTh4Ӊ?u [<IibKv^:6Rvf>7B¡I0-S9FL=K6Y?sg01RzwkdWrfGs?sf%JF s0Mr]%tU92f7%oyZ-9gf&rϽ%`%w|a '%KHdž?' #a]4ht$FdTbbQvբ%k8 @P ~|g~$WrKOTFe+K,3gXbLGxzoXP/^azŪk4S6??ʥFçt֠q:<v~y>'wZњ>O]@ʕsu$5^ןGk9ma&).ʴ^MAM(08mb>`aFvvr!lk*vor:5wx5tH2ן;DZE:/OBcR=QiHF /'9HTŋLԏ.gHv yfN&3>~ O}BIS aDL2D:હ*^k[UߣX-|ċ!$JGޥJ<ţR!+TtEL?bk;{ &Ϗ/qWZ%1wg -YHjt7 6_Xs}oRŵ_  X:')f3^O =edCw$SSaoc1>|{_>ոru*=P}h?%[DNh;${S`Oo}x&Q|]/:z+y^nǘՔpXr0 e GܛwYaᡥ vп?^_s}MdGOOB>kT}P# uU09}O9+*n{ϲv/Cǵ%gv*sX=a"@ap'@3+H|Fq!o(kUS~a켫kt<(:G:pGocPl Q q};;XM~UW8Q5&#`uP|䮫KSÏ󾷷{N㍣?y%GrlUk;A)9 b< 1L6:9<I۬<Yq=_=qUk_RBWׁ麚?kG_O7W_g/bZ=̉xS=/T|Wm3.wJ/o翟*ρx^hC"'Zq$Q -~ ]*w?by;j$տ?S4*߽CQѣ;O8Q 8}nEV//Y 5MA6y8ՋC1s%9wJk?3ZuAN_MuЌWػFs/s bx>FE tIy]x"Onk5y,O. SwW?⼷տ>k^'ת_~ȿWK$:0i4^l"|>$2a͕3%$ѥzj yO ayfW?8^<^nk< ř"Q-5@!EZ9I*WA3@hoJsRFt?kgn4'#(Z6u$8y&س4 _ v&nλI]xp))gu&Mzqi-;\gSw~m%e&OBЕr.k;ۣ5ctm,]Ivx\{ȵ1bg HDzpe}Hɦy侀eYJUO,}, n;u5OMrPg |VԹh,ҬjzBtF fC9,?cCg3ΪmC?=}n-?N@Rz..trJ`jso}%W? ʶ;`o+Oѵ̷kC9vvl\8w;q4u6kpd)V j~Z> ܑ pQUȈz@ :^nje5:vvQ1>#"ɯB=yvx/3^ɜ#ZWZ:][OS[+fe}ߵA.;ؑm o=\?jxͧofW>_I[uJx%\ٴxG ࡰeF~\uy[txmgwvvVj%8֮AWoVu_0*gf$wR=4}8Yc!-!P!Ϻֿ!DZ$\*|vPeB{z`|B'HmmPI@&#~gjj0 dnvJ^ͺrx+$7e;BNT!njDdi9Z{73?Z€6qJ챩Q Ogf懝ySK7m9^@q X^/ g)1P@c3BFPE\kYs/ هƅϯZTҷ&rX#I b-jk>OfTogPENր%4&PH<#(AW*!_kpnp#>5yB63;>PrSb%% :ZV,O0]D]SK9VVyRf~=k=_oaq_gKU3,'T񡷖1ErXvMP@qD$bq>M "~)ԱfvOǶxP-~)NYeǵɷ+=d[x1Ul,ළm?v:V_^W݈Baϛ^ЎqSIaX Hu1ZaJs<ǤA˲#y[Kۇe5ܝx>y{uxˋLs)'6l:,A^z';\ʮI7&j=p|N<&j 9pߖ(Bev%UgÊ+x-"L0[ncj┮w-۟6wܛiK0=ߍG5:t()Џ>tV׋W'[VK]x{C ~׮ "8`N%po)Ycu 1?!O=!P2Fs{VE9hqiood#lbQÒJ9w5Xνy+*jiL="J^(S~E)v^aAm0k9{·&t :Jˤm`k_ocљhtvVIKpwL}BOTĈ3״PNHd/;:'m5kLwI_ԅ]ҝ⍓?3^~QZ% *WfjJwxo9V j@HGFK`qr 'dd@̽&Tw ~#dO"6rq183` x+ՁP#ӸmՀR,#Ǽ_+'ڽ>zؓM86S-qܴWmnۛȈ2Ne?}Ki:漤(`($9,<A$֪kBj P~ռ>_ \;.Xh5HH3?A(kRF?' SOvD2+F@l,S4+ߐPpuPK8X߁=-WR'Nr}9;Gl uq !HK$>C/s('ccٲOZk7LjVWm 7#%!KECo'_Nw|݈ >"Q6:PwX6Ϛ'X;淿"'ym.7'毫 jk>0/c ݹmaoٓX$-ZKF#[L(ۻէOU:zPfS0 (WukzJ6o.5-gM-MuJɗJa#Dp8eHsIi9]}_"T *TևčemZ P,'nQdk }QP`4WieysK$$$/ ֜ 6|W &1&l>b2[ ˳yr-f)G+61#)ycn&@]Tvtwuqp6fQ^=vc:v"fdB~2&C6CЬg/kx(WCT$`ڟeI\}N]r?ξ*իnɶ.No~Ⱦgߠbr "s/4b\?:Э]Q6yx)_1o]jg7.RHqCYWz@qF% mdM\[|8حa5c`i.J3ً`Rj53Jƒ&8H@uᏖ->MVV\rd{FݯGΏjnyW!BDDl(e˺|яmqH9`8;!>yfE}Wv6{n.s:?&={sN)q+y2jU91>`ŬF(^+\,( #_7es|'uϙ ! 
C6G!꾦 zFԥ!Crn"Hج +6.98^>#Z+aQWxG.U~b0 WѪ0-^,O,es:X# ؆" h<ڝ6;.bSJxUAHIIw9D6qWbf|R4 W&mZFrEcLB,Ac# JHS(Ť@U2#a[+5SZDu1MC졨ncp;!3g,zpEVq#"3BĬ_a&@b5EAޱ`׃ f"qo8צv'J({ܟ"}Ce}y:H)egχ:gHH#?OK6u/"k>TF,ceˈxn Q7W o }ʜ Փcc-GL,ZXrLYI_9t?*uq&kWo5Mp )ÚS I:=,2$UxMwOu60qأCٺJT˚;po., ^Eb{m'Av$HQ& INV+:$7&ThzfN?E$gM.l75Sv LI=P`])ϐ.7{J,?o;[j>@8xfs>G\?.zn^lCvGVuPÓTx΃Kc 9y <_sр "lYE^S^\'K)vP|Dz؏_$#Jk#;PxM%/'8=do1Q=გ_٭¾ԶY?G^>~/VbP,y Q!~j$Ml;R˛eP!zնLwj(}[.3"E.Q|SH=(Va2[<}|wPEzu j?Ֆs麚?x?eYRV'+>K&<*yV4W>Z*꺪V5e,l?uA.Zësm󛒩~c&mgm7~G)y> s/ 3&nUCPGi![QXLoy~E?{ko+Yf!2iD9_15.?`aantYJ?o>tL6ܷ]?z/*Fzhmk-4W_~2"IT/ R5Vܧq<ZsLzlO岬vc+}޾|OqU|DRBB2x6ObG,ɵjfN`EWǃ͏ocooy5;!kAP3Zu\x}Ozr,oAs#R,ZIs<%u^y@3O k}ᨬܤب~{<%dHwFX3\ɑ y\1{lHMF]ol\)mcxk'_XqHKI uȦ[|(p?Щ ]$ ᫻\uq*g LHik.zmKAzD>vA+$#VzGvb:F@}UZi~ބB ׽iUCAK$_9n+h:w*U a<2 ',!.+c%?/20e@`biNϜo0~F^0NH,u܇c9iS{T⍨SOMyBGICѼQ x:ܖLnXnqި`mΟoBNu ^汐ÅNnqh5p÷%Đ$cQ)Xi3!UPܫ: Y84 Ȍ[]gqS wdi{<b͘h=3( _{4V9nZ?J͓8H!DDR'/ԂQW*b)';%fm2j'BLBzGIoE\k)NYWSd?,˹8Vlȭ)y|014DPb5-8U{m|rPA|##z\8l5玓^=oZ}&#KceeNǛiPF˩vȎs;; hzٕ`N;.gZq<;3I7mycRY>ߙ_;9ETof sH!HTU%j kowJ߃"[ {yhO>5ZajJ>D>t }s|_GmEZr}I&+JJx {\4[ߔ;W6uĦ=#Cq!]s_rdZ@ '*-+b SW-^{)ρuZwh-1 HU\;~#unaժG $;'a%ϔ3╦cm9'7ŶZǺ.,ZgGRO? ʤ.1hCo-Y>܄+;PӂvgbC7}gi>|RZ4^/U} ZNf˘jQa$CfL6#'*sˢ[^lOϕ+OmGKc3AVm&h9V1`1٧i >J);/R A8'[cgNAp '&'53#JOe+:r'cz^1gzZV|un]"J*X?o]sk[ ?\.]b}?77*O3XI$ iR(Wz ,;Vfw3 l$aj$䶳4?1O{?%n7?Toň;'SsK΅6m+gFgg|ד]dNcm*gP5^iFaRp>FQg񠴫 Ye\5c˳gO,7gnM#;/JKܧ FR9Viq*(15RQ䅊 M[brB3u$밳^/Ԧjt:p |S3GA"$Ml^ΝxϑfIK_j/DWmzb% cfBϷ\@跸*vG4}Sލ%DW^{Byӕgy:X)!7jj %P}z-OmO7y;2HF)W4A_UhVH^o[jγQlF78h1Կ{n~˦[Q V 9Xy쒃13~#0Ƴ\R"09?M~ O~d^b-:(辦UnVxR8q~W59Q~8; 6LuC T+>2M9]a0fW6X&=}JxƝEA`D!]dXEoBd/r۝/MF6YKPYK%8r`qvn\ k{}BqLҽ{xj;qF77 ;R'>kM*5-*Y6s7հL%i[g0nu;\3CW=xdcZG[%h5}ބ~?}_ 25LN鮶Mv+0>Lbs$}OVES4?/e+9s}Q7/\uL|uz/s /#xz+\eWY {*|z6r$HQVP vaXP Ϥ! $vsmm{t0r/ΠҀz-;Ⱥk-^&:؆V;.G]t[Oa~d}ȁN:s~ˠva0u9~G9;,uؖ #FR2Ъj;jx:mMJ˪3qFfKS+՛A8Z?MY=s+|;h5e)kDw{qj) IV_QrY9Oᯚ Ӱ%1IY?*"yo|'O7oU~yO Gr9tcȃRD >⮞˘M|7~^ӧoϔ[+&b12qY\ lJ+ ފUv o~1Sqⷄo#I P!>.Nr$?:m; KV!n27;B,w4(R.-r"rZEK d*sjNԶsZysgx+(xmݕ 4jhsdEb cu4^!ule jShW.t\Rx83\: DN@?gH&'tHcv6촤9x~"S~RvaLhݭA_f|绿PY"pZk~ kY[xsӻg$=+ߴ 2 p&(/s.qw)(HähՐ;+/6{k8LMeI/%}Jv|qn*kބJx3rs>rΟ~(?BIQ, Y ݖ)w(KQ(w={6\X'%QAʷާSCYmxp::g)i&ӶD/T ;۾דv ]™?Jgw|Cjdf3B/BB$ёuTid"Ӵ8eideuI} a&b㥯EQtAud#++f{4<~yu.3$1X1ahQg|ZTN'6,ِv G%Y'V:ʬ$T@Jzұ}wԃM x),{#"Kr4eWCF'AӔ,"[ 蝒w=Ȃ( dȕ̬uyЁ{|_ .Y ``'R`ՉźoQJN֮k)xY Y~#YjQefN5پ6kj}6=գ߽X%RjyT>zh$UBnC xDɮd(2FsO%&q~K"qxD v4J!7aɯ}ϋ}N_٢E}N1BU'q^KC.~2k>?0JN|}2&$ D24TzY?M%nI$~@w1ïN]Jb'ޣ; rzz (khMc9 b4[Bws2$Z;8l$Rxפz: Uqܞ kNrK2d`Db%{B?QKz,qg!HT[ĶʠMBN)с 'p&.U>HI0 2+Inds(Ŧ1I/^IY}7g89a#O|Ľ'V) TzȹaR4 ܞ w98;C܃_뗠f^9&/>p(jY (-VWx~lR,5ã: 8OI ^/erỸHf]6،F%t{EԞRN=I'Q!wm q䝻l1izFW NKv$j7z'ұ ;-~~>}6~S%2+I /geŰ2 XhW#Ey-̆ 6B۳m;[@'2fm6Pzd͏p`"Q9EJ'%p'\I?#fE9^ٯj'%.\pf''pʂH&1ޮ@5 r`uHNrnnw6 q峆q=`,VA$:MaNUTVVy|LdQ Ȗ|񫜚xWuHx>-hq3f/M (=jQ';([5hT1K9?n- 8,z {oWns P7Xn2vx V)z3&};u/ʤZ-2z{cPrŐ9yv!a|ĩd$`%a#2˪ɟ|?LկS#8mOϞu)@tsS0lZ yaV34],muC϶IH[u-{4v[e$ y l121˟*f﹛eσ~S7"vYD PӠnM!lh<,zٷ쿺 JËNWꢥk nс2$AA;b)V6swX]oHY3OgOs%/ܨ9Sl;>, H+SryKRO%q:K=^~m[®};Lʤ] |[o#TCL#vK-au[pܴpQ 1t =͙A_Eyc5HZ גػ!9 ?1Pa)}`Gv}Ŀ.&.㠭C-m?p;=kN|hw?,_G%17Tt| @?r4e~1 )-N'Q(s;4h: m~y  atL/jug>o/fM~ʿ'G̮'ۗB/ʹmh8^y|&3~\?RZ?+y$;PC،RC/`wE_B@$*lRDw_>ߙ?J_;Ŀ]yCh?}@n_46>8>Gs=XҞ# R?j5sU?noa0~T,\\>@oY2cÑTy[ߢ K(!($|C~C+y} OBCk~գoR)0e >^R}\A)T? 
ިN?T1?#ፌp h{Eqz^Y=oO%/[C30){' 2+}Q;ur~N""iGa>JЎyƜoR(ѳ0#٩ 4Bcx_p |?@HY ~͑k^o6fbQ8^{~iAw9䰇cimTy?y/p;ޘ~JI_g矩?m3}'%XqKb@??PTIh"=|%r` 0d}{߷d~=WkܲR\nq^w ړx&) 'cb?O=*r>%!s#]|i=T8rFT.PAzS|ZJД럦ǐULf=׃ߛV&%FxPob\OR$ک͓ޒ{j-v+j[m_ANjjh#iEbZ6ޕ6Tzm=S_VLixo.=dL\63zun˳Vèl"ꣶ󫮴gF\k{"\^_S3M.9D-ΛyIԽ:gl"y~ASĢt FQ?k(5}NVan6}uDǪ<}{U-ߎkUw?6?MJg]WCNݩmꊿg~d4QR*eCn`h1g#No+#ݕTjvjqȝ Tjb,-NQշ uckkJ-;>TvѨ / \[goli+jXob$y{F&ی)Y2HKp@FKq_BķYG?gIYsL/bv3Ϻy`$,`k~G2f\qyg.D QⶈȒ%C.fD~1087O( F [i>l€jtWIҡq@iJԚDdi'պ/hwirԌ׬}oYs_^S\T"ba^T;[@%Uibq3>vZ 9@Xo7n~o+ם#س=c|DQQ6e㏧NGxiEv$Y4n~2aC- U/,y>D:#}=|jtJ<W1EHS)QyH[6҂ppP.3{JUὢ NU6sneշ<{^Pah_-gwGikA1զΎNl>>_GŵnE1 #DCP ;@!25NaL IQ WV2LFN)0D*7otp[xN3- YTXSˆSo!cԧ´]iV"dv0BL>Rs(a `8c Au:" 7 aqm!3 4C Smqr'k'i `b, 1R `3`ujX&<][kũN @S 96&Nǩhm{k\7xWŭƣq+W" H@:f-25S$zW^>U(ɏ=|T:$E9) cLjg!lMoA{,l@2ď|_R丘G?gYѯ>cKΫ6KW8kɩ FPtT SlR{D i69p݌sRqpȴ%5QpRA UR}Dxaiv<'E#ŭѡ{{ئ/K'z2 rΥl Po+>eWQ"P$`ɚZM;ܹ>P||䔾حV@ :3;HG(?o !`@U<]`3n\ w, B1%bJ YH {ТaRWy~~uH8Dm+)M^@ rE^<〈@aV"W9uOCٵ=L HKtI#B5u$4Ҋ[h/7TFzeEetv:Z1i˞CsX|ߵؽ{$d[ mRFäx i**Ihl t tG1xr4&SŴ %M8j~OG{h[ aό`3>AIR C]ooaS~΅Wn7]r#s6Y.kYDpV^fv!XXZiw[Fnfbt+ύyPcX]d1f3O<ܫa Jb BJ6qGq,L7)HCBەH2 E](=Jє좵}46*]8$ '~s~_ȓ?2aG NuI{,{ oR=ϣm&G>OƻAFQ`;xR(%;6S<-~B4W&I_CJ(8eۚ*zu ?vF6Vdd#2д?3W{߹p>t;~O辞%{ݏS= Ac?|ᮧP@Nw 1ІZ< hCIU;'y#qϗI BPoe`z^~#4N`?&o[X+ ^^7b=ߦ^OCss甄8uEYM O%%eK!e$ld*sD]'Ch$>ԠІ6 4#,/;?u#t(=_I7YM'CysI,p" IzQu (>OzUo[F~ <sE-ϓb۠o4\UCi-Ljx@1N@td۳>-U2CqAzKJl>w}n~SmmL'[fi)]u, Ty% i pvrPĪA堈"!b!]"ڷZGӗ5'FQ4%{0Z2ԫ)R +i ^ ) ,жwr`9CD, XVe-%I%T5BfMBZͶi,&aKE 2Yr(Ma>z:?GRj^&xsp=d}OK5tY^W\ⅅJjMECz(%;JP"(OqۍEPH23.U fKUhrz;Bb%P(QŜ'&dv4Eڋ"6g zo;&2Z%ˉFi+Y4Q:d,}"Sۯ7FEi+r0RŚ!Emߋm>(2e;tSy;eU<+˄11_Q=o0>ۡ>'߼OA1^U M0: op$a,KQ,"T$$i c" WI~>|L`@qQ@K9$ 2;wD xJRzYՔ8VSQh 0ܑ"5RŖΊ9衦Ь ח (gViWPg !UZ :P "5LjC5T>%qBIsk(eŽҔZe{QKdC"_vQJ0Z.^" , RmUѤ@!PVW+E,颉R01@_Ko$7E} |bpϔa :Sb 1J9 d,WuYݜvbAX--v 6J7>wTeD7H!q,BfT&ǸwY`GYVkSST^nR! ??b% \Ԕ}&-ƀ $JJsuuɷv)%zʠgļ"\ϕ1`F߸8VԷOC=ZʻKܓ#;@KA$&|5Oj&&xKGozL4˘o&CRg= }Jn]*+D#-W~/Y+z֊.e"S'> g 1ˬO_ׂXִ<~]B?WL5(QNǷg?qEw1'")Zgw_k`͒2Q=9bȗ3H{tm6g Ǻ]> BN ??qYPVKzpYj$[J„;Sw i #}KQr%B6Ϋ(fF/?~%X^ @vNm80-vpDs"X$x7cK_U2Kubh8ɩx1SKEydwҗD,Y7:rdO 0ԁȭĝ opkԷygx@kF}*>QZ[o5yQ>4ƟO˪ZŲ_ȏSZb2?F q{*Ci0h4nwJ]'ʭ=-Tk+ZlTͣA/^v_ AfʝF8(-&!2!Ӧ) l-w}6:A ȄO>RywT,tT-?waSG,ib3MJN6HɴQջja9} %PNٶTA[~RXL 54;nSSzx8B@e ٬4R)] \m[ymwY_CD+ht_,[BXó(5C}+ 3k3p\u+xSBLd!VBe: P˯olRW[8J gD]{Rsf.į #0ea|0i'Z6b:%x}C-/dJ1B3 ۞ͪ@@ 9w e?H(n=Mo Tڸ:oo:ks3%zK>lAT8l' ,@GQqu7)%5!cCB;Ա2"q]w?dי۝fz.a9D4SpN=wf ]@Pb,J,c@n /b![ mؒi~-WFVk]WyZ20B^ۯim>})WkCrU~  5;*k\(8nWV͈zA~ŬÄ Ba,O>Lb9Cں7A!O,M-bhq"CyT5UV] @m'N@y5U]hn )+XZrθܭ.=W hg@|H9O&ƪ%@mC5ԃBdwt\V%tDm>,fP*pL!: Hqh|HSĶ͜yVxGnM8/}RB{:D?Q@"Y+/G a ĔLRRY۽I٨|vV [ JB! ?&U<'WfwL!q\K;{O91E}US:ђ<qD Rb2,1^r2R'P*L0z d}f/Ի)SUk|18 "@^44|-h[U r6G|uD/e,r;_;k{U+49JKw6lۿ(iv"?߃w] 9_CkƅL q@9v g|(:NE.|(q-alt4]Di7n/G+A*{|n^S<:мv-gs`]bA y=|Kf{VNMW/uik T$cA~d7F>yEƂƼ EsX^L8"r`*RLx#?brӎW*o>aK;qI)j1MqjMD fxB>q+_/CsCoG6^j 'se~SsLeo9&٬#H@m.8NIHXoIH U:YF_Ԡ-2 `rAݒu9jaFՊ␤Jo6RĤOڵڊ*ko{ң3E_<3ʲE#p&)1 Y:D!tB_.北*BKlG3iV1Gvk4 VjY0d (\ ogwd X a,A @2;mNJņ?I @(6/&Ab1?6L[_êKjRNfɬ-o{<5:g:7f*k>{& f?? 
SVpj&m|9:߆iVjLX!HF,0 4:]uO\qL yAaR/5~'гuɓћISrr2/$J3hN'eZ*2E~XuїHT0Fn4 /Tܮ⋅>4zw-nusoEOA#Y@*"bi c$ &oGjaj׺b=ô}.N>.F t9ϵ~:53ؑ^b'ҙu%-vkUWÈٵOrÖ]=D"wm>Qr**7@BaQ5`lSy+ߧPdA_lΰZZ-dUf]e^.V :>4&_8xwrw-zCV,2M}f.m.2O"JWoձԵʛyŽv356 ťgp= >2Z5=d:=&.`#Nvð|NZn1tx.j?\ے=;]%"ѾkD7FT?WjR` ==O+]6.lQ̓)d[Hb"v>ffJl%>B=2R>)n *jϫuOe<FX`0騄~v^}1`m^Tr;'YI;qR[\hq/@$R҇T]Hd V=qzʤcAgI\GVj/ݢ.SYdub?6뢽z!lɥ~,= Cwg#>nfz[S]Q!2"8D߈dhfExМ-M9{Sbf^ Q$l$8 ,!TAJfP ␐jqXw) Kcfl `KoCe䉓yކ _GV:ѩ$ĸ'3 >]Y=_i$pП{Sgu\y otzmpDVoEtT C;gW9磎Wrв7w^xgZ]QTjA[mWSSz~X`Σ^L"2OިFƟq<4ϛZXjG?P?yhհAFP0سBA) z(զ@X"T$I3 T@`~~Wc|$(* 6c!E! o?۳i)!U`oֿk(6FT/ѝE׿`׻!a4دI?/"wV\] &KZTʖ(h,er5'$[P"κEMEZT k8jPUN:@>L'n Ko[:1,6lʣcu-ǃBaw7*>my8TȢ!qGJmzyW[Qx<ϼךM@"۾"U }X5-EiR@ݑBG9I%J>JY' E@0QHaGb/hj_td"@"ɈNRd̻. "BmP ]ja >UxTr.4e&!6m B.If5,¦_lkxdp>*0F,: h\Z)BshuǎlT-&}&hÂźE*@0f`;:M)u@fF V!TJ*6%z,:p(N=FN{/30otJuL)`+vpr.|b0!E-Ju0k($S.V)iqMu6a45"9D%DE 9řcQ"ʦEEHee70v\($INPvKyB5!b3 {VEX0@d*) _+5m!ՃdD(\ dllEFPmbޖ!ua?!g(u.E4H 6ER3yF(5Z5_VVDRժ`( б(w B#*lJRԂx6Y͊4 +/٤j((ֻyr26,eohXUTgw6̊,2Z(ԢH;NfNaw%ԁp:Yk5 R왦 Ri;-Iމ6*YXFVW&} 錌:6k:u0 mV܁P27Y#R3,Z.ե[r!86(ZXk3 +a";"DbB:7ANYXɓ2G7gllQՆHBk5e_Bwp TP.G ? 9]ЅgpΫr*Z 0\-fH,Kqp˛J]D沬RP%đBViw+ )B9tp p*A 8v epnA)Z*qasrTxR )K^uPVC2PA R`O ?\J0J^aM(Q"G(K4b*]dW!sT4YX3 ()EvY:օ*$ acƑvRPda\if4#QRLL-:8[y\G<Ԥ!ý%LgKNjZ+w VUWZzA4f %gV&(ԅHQYIJlZc}2hd8ɄeF{M!ȕSqfkSP6m刓rۻ0/3^ 3U*K=5R*ivJݞ"iZ"Mb$3Gs2#_bI?3A JA&`αC e,2%h9Ssiβl* +kZ5YgEiguHb@˳iGjJ(|PƨZdvnXjq0l7`kU V#uTΌ6CJh_ʽ@d)-&(- !*akjsX_%?-uBR59Bpav3'-< VrT- %ϑb޹W3iF=u|:A XT 'ߛGO,3Sq?1HNEӴ֖,Sb*}ߘWB)m8WJ*Kzi7`C7zulśa2i71 Q Na}dq7i)ɀL wҕr={[ Ԭ*pzWϡTzq~n:?_opOQ]+vT@Gd|^v*< $m9kԈE2=+bΞ].*:ml)Ba[7WdIt[ڧS?ϕyݰ/$Gdӫٔb f^DŪC9TIȰ˚8K4BiE-LD= |[IB; i5//pD3ƃ(0 2lYPK|k _ 7 &2eGmrӀ:óii[DUL}Iť*~#z BXnJN|*{-I/ƬѻVOQ % VVxE"~wQJubP Q[?oo߃_awR+Na f2倓xE& {)CA4`P" ߼ 䲤QQ ]UJV@وegzC `ޅ1'vi(_;VYrUGȥ@+a*B]ˑ"y4U~&YZ3ŕ?;/J,Q)N( hPs`T0BPPؐ6I4iPi2STd~pP='wOyzn8< 8wx ^grA ; Ec,NKrLZAlezߗ.9U7)Qg|D~L0i*1޲m?y^FN(*5DB*: ++$ ,Uh\ciqMMOkJgjQ۔gF+ǻqxo-U=¬u6\q~_3=M ΂p 5ԨpGs/£3QB>G~*^a|GngeZ5:fXV폹d9x0]Wp7uIP.mr1 -lcgo}U- x5B5І2Ba)oU!@F)Ÿ5}Ȇ /sEsࠛ]-*y-'&NJ .oJLP9:5Hz%@e% Zֱ2@J,C!_&R ( y~fʺtt\d*z;3=V1VIuՖ5܊3Bz ==?_vPs|\}B=úI Ћ˼ʀ=z<A@k+Rgb!ՐUfw^q ]}lLCG֝\KN0d $Ơ$[ ,X-3ˉ:_&jCh3z:yf#{ͺݥsz_AX oE#\+Y.Q]x;uO(k+s_0eP-{ouw5>W!,e )sQg> 'R'C4?e3;zj q@ǭbڦO Ru=z?>b{nٯʛ~z`=γk,LT{W&U,OU\TZ̧}!;e2ɃI1, ^"wlYkkSޙ3-ofFWIѳQAۢI5=?k-|Ƹok.zM\bBWbhx# ~etyٟ|m0j - .훙>Ge5r 9O'jt.?8]6ACxՊs]FQǗ S}mD.|V"8ڮU{Kz\{j%0{xj#eV?QYU3)ܤĘiET m@H%+IϘF73FO|XIRn+2`!q:-N |TS}un=p\S>br2AdvC|alW]TNY?7_[/C;?g;j$3k;RJ!vR^ĒP Zh0 h2жfSeQHTYeJ9}Ƹe|9^!(*-av_pHso*{bEpo?2&*ߣ+9r j//6ϱP9WU'Ѹ \ B<\Ӯ\LĮ!|MEݟuJKdm񙲴Ji5EX?]UaP \}:5TxifYkxYe6\BcVKr:ul' 5kNRlf wMRĝpMc:rW_<.U''; ;PZ i$>KOu}YG _CP[P2T0W뿢?K-Q**R{\ԣ /-S6LUtHtY@(o $iwKZwEv㱞=@]Cj2} ] V\Im  E5ƼG?wecOTւ6;()8r j[I:kWhRI )WLqrc +{8IL8O-]y :1&B&d ڸ}ҜB^X\~ ,d@q JL. LHJ> "Rcځ5`YF=!:ۈ>N9^o!}Qa`%>eCKt7BzIeo<%qs*f\VB>KĄd`PQ7'YydbuP\#D cUm?-b)z_nU´I%qrװ'RuuU? ޝ>4Z57 ( zNf{EUyӃA+ *յYni!OTŦ/Q+T[W}榄mzihqۗJά/l[Qr`qb3EkgrfV#c q10Wu\ ;gtXOP:nqi4)T9Z[^THRR8=Jh`{Viv3ՑvXeshS,A|__"V::Ԓ@F,T~cS?bľG_A~[O>gzUXNFg/ul. ((;B']mW?RBѴM싵qfFC|}v!u~e ZxKX1 /eԩVG\_z8-TBk-Dw.fe6T~Nk3t9RgG"Aп]XU:3u:ݬjo>aEg% S[ .; jZR;Nf#Z9cZX 6&߭ѡ˜W}/+%h O2tѤM[9UOo>' ]}i&!~y~R8G#q6m(&O`kgkfv' .#QmТW>Mb3+"S?#5Z⏗ *Oqߊ=͎el8V%kakϝL[T핤s>ǏvZ+p*"TG* *_Fau/kfώy;|w0]}W)ȐA4ˢ"pЩ&?$c-2Cu{ Q_o>ڑԝ7(jm׳BS޿M?snŷ9On2Idܡ}9*g|"Aבۃ0.h8eP[Zm C+M0LZ3zA!Խ]zLH h{]!T ZHљ@cH洅]`=+ lrv.\_esiUWƳ} ,調CpU=Z=>;пv=噊1h?' ߙĆn1#P4l}=cŌ-%~P(~H;O8*9;bi 6=Wܬҗ$otgLo_GWo:wzsmH JNY]SI_4m_n ]}ci_ zxޟ&m/wyk6B(,Țwb QA !ओie^ ^`G!;k;죻ZTDf8x<u=N9puzɬJbgq+LNpuw f ~s3ƛeB\c8bWՓ{T |&H8{\WˡUX,|ɱ o^9MwhnmVu<:sXVdXd∥Nju`T!yE K/1zO͒N`E7u MV?z7۱q,nF`Nuqy3qW\ sJq) xO3'Ad? 
J",㭏4-v`"Q2?;;v`zHe8Ȯ7][BJsi2C|}۷ KQ_!Gu+Y_:w~AۆǿOL"mJ>lŠ vEzDmZy"$a(K`la{]ƖS$+7%Sr9<7U uB,A%&I_%f"$|OZnw¡Ե8A e@A5$fpDR5S63n U#Jw>7sen:8yG*.laFAPF:`ff!LqiKkVp'*6S$z*~7Ml6/w*?rg3tt>Y]8TݖJP%IjS@a*L.Ծ jIV.;=/A9d'?"[rg?d_gJ9% kQJ)jXpC0z[=Aonpw_zzOrݻu\ Ly~q̋AS>y}xZȝخl}U'_2}W$j<֌]Z*D#"4 h"0YP_ثXZ&s׋EQȵ'̇=y5i5כּ{ fYi* *u9}s@BƠo) nmL!>vˈ$z.cVnh#XBA!9BP2붜1*B\kj $E7|ϏN"UM'doy^NnaITLI !܉EY3ۛϕ^ߧ'fxAҚPx~Wg־Tϥ3JL!n'TK]?Ŭ_/ϲPYv-/BR=  T ]]3_t;~Pp_yN"+dO L,*9bR=_0P9ꏻ m潖VX3ޘQ=傾cAk=)r7^3X@3MvC"'x˺o)lŒUw霅(n 1`YPiC8/ǫn .m())6x nz }{C!8㫓lϸO=9ا0qs`/9*j¶J1vPG2G8_'o|ן?wyo.וxk30\w~n ;qdr0~#) Ђ` =B75w Bqǡ:4o|_=QɳX '"mڙz5,V_S:h3wsC*gyR1<>{Icy;{G7|!LJ҅]B?J*AN)@SЄt|M-8>$%eLdJa%"TUZC+(1.Nn*Rh󸦒;Y\x{E#624[RdKUZa [F:E3[AQ2hoE%ˉX? E6YMZVaTRI+\wjF;L:[[1`~7W0 3l!Mȁ?$`_ܤ,M5N*`2X S\ّ,h|RVjGd vfl!N%THĪn=33:2f 0{xB0_zJķDCš%5\P :4UROZR{6P!_/| 2n64) jD`) h1Y}HO5wJ7!R˔K{kSN9WjV7{PgI$P\\GX?/eINdev2hHw.`̻OYALLf`;dz 1#Qˀ5bB J+ASQ !u9QQ aɞ ^v&Y*} ;BkD:Vso<88ItbDU$.uGyap*;Pb@JϽ"ܑ,?%*pl8 I{M5]Fan2@۩'0Syn\ yox:b(1b/k5|ܯÖSݟ@k?!Sutu%;НHwݎ {H,12j_T;fνWkKc &.eq^=o)+o.Y12//XR[~3~D_XC- b~hpOnX/b$eTָ^a*L/鰯Љ;8$>ݏYp}_kJB^1I7x^{)+ئ% ^5t+7GkFe(ZJͅ1SS#/LGjUmOoU+NO80# uծFHbY~x8G݀伋a.xxߔvוgίoE)8G4#u#ڣWzh;%3'(df6HPJ޿K<ц:NQ/Cw''kWQfhsKL!}Ƃ >HMf[gx.}0h r_A6:5Z7;2BPauJܬ98e]|ܦxpLgodJ͢Hc PEfu?B{D8_*tr] ,!7xuiPL| ?G"Ie UFo߄q\Z3]e*:{ͻS-U;TI^,w~CAls;Mq6dͥs򿭩źL TI6uJ%E7niKRajG \kA^c׫K)d: t޷5xrcˤj<(e5pnM>}nK0hɷOO4 jrY[hk[JiSRVvd"& 2%TmDꔄ<\B:Ȕ)m*N#:=z2Pk4Z+L~OF ԮcZFx(؋<.de%Vl!%"*2+Z [*~h 32q쿚~[dN{=_oCa8Y=Zv<.~cT%mZ~O<%XAzYOoBEҔiA)<hЯ&1(i0%spw!}&`r:lâYFT[c=Jsp+črNL,[ {gu( v'LuK M*y6,r}*_@UDՙ+i5-N"gP?H;~sEH@79KtQ1H.[LijJҋBRE18l+ѽloѬi8:ӹY^uaTtwVpWȦ{vQ)F'JoيWP6{k0a nAyald8E $ ܽwoq<:ˣē7YY4RfB˻3BgE1tߗVQգLͥV 78˦+8Vqꏴy3ɀb1R_+%Mj_e]3A.{k?^,h%kz1M?Fi5>?}q=8 'N9Efnt x׻S^O} WedqEVLblQ;S]BTwa8]JH\&twB^BR '/AI䙥x(Rur\_]W)Y̶y:зϕH$(V. @% -[|߅/M74 u35}HȐIrYC" dx>_~+I "B`'hdrS(4>JSփ>TB7Dt_/~))f6"Z)Kbb9蕯CZAu<1 '"a>& vNK P0A,2*{)>Ñ2?=Ehqv[r\չ=UoA&%`+m5A@B;X\~V΅u}l3Tsbz/2R~mZίy >zHޗo,fSp5i 0Tl 79~cUI}fO!ʁxO|a>,ޅ6uk]IE5zrZ*EA$k9D`1RJ>B.dH 8 Q/Gg؞DkS&JYl=hkHއmOwLo_T,CIq?|q 젟u/k6br0 ]N:UFTtUK xe;L~~]C|6NMfq͋R[)(<>egWu ~+zƷp_vrѧ5$G{TH &9!Y^'ͺgZwZ~K{|1d+afgg츞.,U8A+]ZuQui0l.-k ዅXEԵw5R9i\{1IP6Z#jSֻ[x }/6K?}#|vJQvڦH9ipQd1;@Kn0'|1uEGi  |2p_'s,Q?5V]M5a5#9KYW0ҰqTc-WҢm NphUxg|?7~7~U릹C[0a"O4BvOĻB7w Mw9ƄCvΟ@@߱dpClMynC~rHVTg:e}_,Ae΢H$zWŢFM=+[}yttQ˿ٗX0d? ݴn{~ϳ٣H!IޠdO/H+3:GŔBNX% _;aPND2lۅHd :˦U}.אݛ6GĴ%Kf;:\6]&Z R:v9 RS0£Hz끽>W׷SRng6, #eVioduw5X{ƽE9Wx{p^cںqn3V7NsznuWOy4t g րIiP䎰 syeKj☇^ƹڽb4e" rajaYkWr7&ݹM(Rys6́sOtmǎJ[8oPzFUW~ -: 7 F୥FHN]獻E8Cw1tu)e6nzXMUXqP}bSέu#;wt Hv0'bŎ[@b.b Aqə_zoO7*<^O oBAU-]:4G@^0 U{^x;'9;-L?xUvd RWgo mL F HHH!HJˣi ĪX7ھ5HZϧ}l<ؐ|sppXa VeŭO3Ϡ*lP+%+5HgR9cG]5Mi >=g%TL;/E m0Bz̏{umڙ=7wd {MBS7n*ڲDU!R d=q,͟JMܭ XhUf knMQ}ql@8uj$-0}ZcU]kpo+B&iكf׭nD !)3[ 0ZPs{  B)_D޲. ߋch&Wozz"_t]g:D/g`;NQ<o`qܣ|Ve+yY2J2/2.-בճ{rRuZA!N*L K BC'de[? 
G^W*9 Ŕp._lpc=)r!x*nrzlG"qvZA3><1fwUC<ҕ[`QPnC@smc{|qg/91].v@чhn7(!GK "cUʘ\:JjF]g%|5A:$XHW& xYzJ*K)fOفR ˫YhZ^\l_k%pEEJ*" ]C/ ݪE$eTkW9jh+ +g"elA u%#8 Xo; Y@rqUW!D;e/Frfk0m2ror.g59awzyζq 4vZu`[`fpejZ%A3Y΢ȶ;|r]z{ɕiE 4va vԄS^"5waK ?5, Y.\Jk#Vuh&hTGrC Eee[2S]z>F@; TrN53dIxI^ӓ[@<{3Z[v꼠Dj@΢I,U µ,C0ɤv4 \Ĩ#hqPݍ^FL%-G^-"lFd;9e*Box/qI -vt,邌[0Xw 7(A ݕU>W8Ii{R6ր87C] %F$*1fa7M.q#`$4I#MՄWLGJ[TLE[Ex0Ι3a?1|E=^] ` YNتrmϹ鼤b:1F[&ez\[~߻PP,lƷD0<B(nJAJ%(VFRӫʉlʀ-+MVr@&fNHruRpK- ゅfxhM6bw^Oʋ\ ;Z(ls6RMylqnCy;ПU,&mE(4Ґ/nhxt5$?B=P=h&*A)@Wl@*$y\`)L)>񡂾C / ҴL;8-NWIOFUoºv.*/"]M/JA kOvܫUt|~/UK„NA:5;Q{%&gyQ7 _uөR։x9l&(&x8>1cc[|tt?g},%27v@Qo-e}8͗-YYO ^Cid Lgi{&07~-$Cd](HԪsX 5ϘYH "Sˎ2ʛh0zM1i3^$f\G%Ĺr~ۓg.c#̗m4+aPo)wg JnGvl C';brO3;ONl)>%ۀ)}-ƶTk6 jU3~Zݙ^`Y#$wu1 ʺanYTAzm|nr r{Nxw+/۞՟GyŒy4} #wJVCzI@> ģ~zAL'\wr,9~\SOHbY/;pnͺ|"*׏N, @aoRq&JH&48`@a Ko@!`"e$ZZvˤq жkz?~D]~тB>3q@6k fŒ 5 a<'!9`HQèUu(f mo@FE!}X X(eE$QبI}iH"n fdõ/w u=}PLCRNW+k?>_$7o3Ԛ|fP X[L;ɀ.'40tݤ+ >16GͲ;D'r*~G*=d\H@q\Ru8OpDR ##-F7bl`Xar4Pz&TH;+|K0?I# vg(lJmHtp*dRUoی$ ba*$dY̋u> 4HKr*#Q}MgǶJ*J)G,WV]D@Bۘ XbkoG3ybSuyrGڜa+ ~RQ+PbkZFW 7J'U4qA`aM0uQ VUՐlOamx!װPr6YfAyJLuqM.K7XejHRB26W!X!3?|4_6Iś,ddZDaI2k-+{QyA r!^-`&:,-DBMNQj!$i73CGwȖQ J뫈}Uu+8ggߍ4tC ͼWlLJ<LеF! ?gJdw%,&.esͲiJ4[eHyi? Wp)&eJ&T@T|JHsHb}djq΀>? ^xf< Ģ6x; '~^i#z*BCҾl+~TW,zC~ߞMuV;x?btK~fj@ Qs2l?|NLg?鴄!NvF ˁ)1l3~+ULh$:y ()p0TsRc1]&PiT/}evH QenEe§䶑vuvg9ڬq.H2 A4dhz%/ ̈HcLڢ|e>eu22κ7z`>c5a^uߝlSP(#j\ Z!\>+-9mٔg.L+X٦oDً0$Mz1; ױh7?70i 9ls[Pm&;^xa~<@H@(B4=_ظ7ƪ7k/EZԧIWTB~ʓ >͜cA>/%.]\()l - s5`FD$(([, f863pMu,==;F9hdu[;w77'q*h(j>g*-<64=G@<7נ$ke(5 ,vD<K .B7~ƛd+߃ Ŀ#!|L' ݈euO008 BG+]; @AdH(BPY90Z\` qecfhGDg irHg,`nd((oMBeK"*:V0I/N+?!IjQgkl4ח&ޟ?tx"m}*T*ק\m mPqq"1trCLV)n@XU=z e7Սvòr!Y]ߘƧ^Um\cVՌ.-q--M0(M֫=$J!Y)-%p+`\ BPN&q%ٶamFd\E]vyr3d~ v6vgP| [G5MywtW"ffTv,vrhv%\!ˀzȉf 3a;/s0gpv_[p= V]_{?:Ҿu33*TЏV+mJ ġ,'DAt]۶ty`X\\RO@PXpKUN`|>uǙ?:6b%R+m+Rt.cCa E;N$H-omVv;_c0XT:@Zv]{N r,Y^%=WA"sDo_w5 !avHwI&zRQl5u\v`;86!R@3xwоfw|-9m]]~HճH[@F"H0J5`ebLl.Kg(z\6U9َOG̠цK`X;E{mϬ TK1{ *AY&v @8!lcK& WZS"`{V?I՚&ْ}pt5"y#@1qYNsuޟSIin7d~ۑinMXF LTϮhrQ,UGpd2θ{z/: :x92愅Ғlwwd#yQQ!IjY+ Tx!A?La{fjYҩLֶ{;%2ف̓j޾R XΡІ3gjDSc :䕨4;st68텋/U[kj̛0V]rb@P'|t{Mo_[ijr. ȗ.YA z%x w2ui"(@ҦeZ_A[uliņk8w3#|XP2+QZ2K9tXXޜA(W&&<}:3:ZH8(/2w-َWP!7 :Ɣƴ#Ab#t%9 tHgP=6z3GX\DCmÅ~#*_RɫPK~3`~9L9D*czIO5^Đ>ۄ4KXp| i*M?mn70PϏ\)! C Qmkҁh* gh0 !dswHP<fm`,Bj$@c @RryCZj³LOͶlW/^&7}a^9^~Wd#-`ζ+.Y^J HܭGG)N䑴#Kie7h.cogX.'4&6) K =p+^SL_$rM yiLN<|BI,+pبݦA䀂D Ph>Û!SĥuADkM(>KJeηm2iSԚ*.@ ݶ3>.YQsAhBLǬd[>:pyf,C=rt؄: CKRX+xc^Tw{eJgkMb!$1ώ!%jr`tggH]i¸+i ¾OyR_sYж@RNym9sXJp8>iRp}`ZÜJr(+IP)If,Ҥ$#Ջwą*3boO]v"_^AzRTSU0;ϏRH^*]I;Z֘)jkԈld"3,-$HgyV$9IPBϔcR.qROX}J@ [  #8i/:o{Vh90AMT0 \S!q,En͞ 0 %d$5%ɦrn3+翮233 tADTEa׳(Yoq޿_mw acr,l]V(`%;w%sqNZn3k7BIV&Ly*թrB%^*Sܮ91AoMd PY"jJ YC+`NǍomqfە*HG]iKZf^,*4q%jG@OLjɱt5T8C:jEbFz~q;WQywWl-/SIiOւ z-dFnJ.*f`3𧧴9EBb¬tkՉdO9'n!b&H5!a4q0 t@I#Ja5u˧Ti4?;x\˹$݅_4`ݒ)r$5q[}ӷJ^ۄjCf1MH?CyOr!l Dsz1ƴ I i ĩ%===HfHIZ;IKZ[34?$]JYQj-Fm Z)Cg{ڊ:F뼚5LmcXRSScFtZ϶cJן@(|&ms݊`LIv0Ϊv%@IKzd`r3rcGoɜiԄP?ajv,\DTvc&K+={H%C9 cG,WJ'q@`GjUW&D̤H/ʺp\hٶϼX> &`HN̬t ՂϹmϼhYNbȅT&ɮdBEA8~qUMTō',P YTȗr6Eti{PId'% */{3J2j˗ٞ%TIڤ͍֫/QJh.&Mp.Lq?qy\[b`4 D@vAG&:_Vу 'ddI9"`Jr"f{%ޝo x2QnOrk (]ݖ 983J^,MD.٨oLo{NSkz v!$KVM1fu*Ap?+Bۜf6 (AX!hm0-^oEL{bw|6Z54ZK(bj1"o5@DqyyJ_7ap>>YϦ7B/]O[l}Gx1,sK:ͯv3_e5?`\5t'Gצ{ V9/`}}mҲ}}^^qǽs ڲe= ;۝omImm:׭'=}:yUao^NMVӤm}n/)^gqmfdl.sw{WA_s]=)ӧCZշZrbEvbFkNaU%l1Bٚl7FW2؊EJֆ]vwsFUfQ @(ѣUf2l-ZλGsm3Lki+&ijh2DRAk  Mdh٣hզjhl 5CvـjMQZ;b*tkA0J=m$яaU 4{R B4@@"@ - kleCK @ @  =SwmSLJMMĶT$P. 
W@դert(ǽ@ DtR5XԤXTn]>t{ֲ"͢[U]R ꏦ yklWZĭm4밭==׾wg;@LLudpqn_|G.@zC!݊љjRۦkckMwLfSXuxTrk m}rPlY`ZР 8=jYw;][=8盵ٓ|򨭮h#xl-';,o ڱ.H6&@4C0-оWvrm`=Cgm)fvZܡg7O7mkmW;}5gB>(>ke$UJ={aM(RMm:kڡ܎tT svisMP)E5ʁTmezuMh^}lV[{>iEZB ڽ6t(x}(|)+@*MCr(A @Hm`8x;̕r>wv4>}@X\*4뎝/Hu{Q>}Kݷ] W3our :v 6f)`62Ӡ Hm;#}^aGuͶ t6^:s:Kp8}|՛H{cKσ큠.t pi͙mMﷷ4 w"ӶIm:ܵgf@P [ͳPi-[7sЫ}cU@xxQysW(hk{KnɶgXlºc]{FSF&(r;(ԁ@P.dD&22h#@h 4ɠdѠA2hhi@F@iFLS#MioT&jSi~ЩSM@` M44&&=3S LzMO6i׮w+c[U+ L>ǎ_%K/\/]6h(5/bc*H"}KD `]YO)%*/aO{@6aPO 5.fRaB$r<>^0Vg?pK7xTv(sLQKke`V$E. c C)@KýjC>}e8?TR(sGEdUJj!fi}FԴ ?#6QsPS$G͊fڀnF5!(a,i㇧(kUK5]u"qtPqf+Z{?5Q#?dxDK!ӐsvT[vPAkMC`0Pٟ^J7p#|`" ǡ @5XMhZ,%DMha[=+ `?t&d$K [ d_J T@׮mA;C}Ѐ5l$Q˱[s,?Gnt(9O+iFNvću0 RW_"GBgہmt===ꋞE&p/dI$ѺiP6}T1A=F6\wc**bO@DP8C3z_)ta}ֲ7&dltp06Esw)7{dd+09! y|pjZ6gE-k'W@RKIe =[6#IU1#F)t2\`dwN?&4ί/똔4#mZb[P} S5AU&€W&}'4W;2fZA[J6[֪mbؒ8o?6mي*z9{Afg3?9K$?]UAhkF75We{D{/\HL^|it<~Җ,0M gcsD(i()ZBJT)`J)Zi QZT(R"T)(X)m Cc`mdzlFmթE O+},_[?̈?<{O@ ˟cmp@LB;)d1K]7!2=T9İ?BtWnPp=+M_ #O%ٜuB qnR 1|@x\LdXM n.{OoerJ/8M$BVk|MUL%_9(``lŹ"fF@h &J(Z KQP݌dF}k7䍵{ ë_J:~ex4bk .{ЈnsIDO|6㊇@^+B@$=_on~(`L*]t&sq`UxQv6[ /RY}FYEB(\IHtW=xcy7k^͑DkuL(Rrٰa8/T#YՏ輪+EwϪR9OM#6Gd.@(oNocIfD=#pF4{eX )*^N cʚtp/h48< `'˗) u:7 Հ7!n9vɱ9ॹKĆܒ_2gph_+l/kS|3~<O_90lI[Pϊ(-J`If_idnu|,VSNNMj;^ދތcS;&O= Yl .a$_]2P3XAqgK!;uGO_ISD7go8U<=x;f 619}xrQkxh4Z}it}G)[҂4/uKk ,W`ocdHD '#ា-9mL£ZkM1D؜)=i~ZπOtv{Bf쯳2\ {)UTPxF|nAwK4Hg!#Ա#`l)h̢9:yn0G&Z ;>ˏipyFY10#=#|d] G`^6}ld ׉0w[&_uǫbL:i "8$6`ϸxN"AMQ6CD(u:B(jQ6g{WzV ׭*l#h;$s_UI@ G4~W͠0Ԥ!f} l9nE0+Cy+gJO:m]CDKs[ZKLUcƅLާό{acy+OjE1b۲:*/'oA(!_i ܆VLB?Jdv{wZ09 xC&9E=o0! rz:R3{tqkYt4Ǡ#IIbT}} kG1%%fOpj!y2.5GXAΝPܘC8]~,qF/创BO1Vg y^kEtL4LD濷F@*װќϙ_o8ihfHm73{ $+Go1q/q>SN} 9L_ dRi0I)d+*E=v5 c%_]"iR`9y:,]%So74MBpg!D]} a$ s ДfR S(i6ْ/OP{(~w^f"(` L/c3/X+Et GJs:^Ώ ƻxM,ǯ%VNRؕx'_RR4ԚJfixvWk HIÇMWy3lf0$ذJLx#M &5Wq "ִMb6մۗ~HQ²g 3̈@@.Ƴ9'/'L,d@ Lʒ#0+@t> !Be!)쀂0`$#26[^.Y'glہ"#5EL HzǏ\f~0+>n 1!^NZZdӫUg1y1nYO%WS6) 7$8!y>&b9 [/2tMGŝMxpte* Agq@`DXXexD4azgZ3.NIRz~~"Dz4[Dg/HɌ5)D'r VA4nI/ Amt(?Hm;11-o}?byӰ6}jagIt :|5?GKP1{Hyn2rb5R]D巄䷶Zt{X'+,e:0Na!~v:CZD=#it^<c89}hk긳kwm54d[TIC9bHKn2ew :;iW{|5i#탦{aނx(d.u:>,A`<`Fg@'; D $Z_)/,x%Rg'hT1m-m g0M0ײ{\;ŤĮ8ĝ~O= -'YV:R:f"H=@_9 'Ѷ k `D`L7.JMCi8moo>[|NH-\L =}IؼNމ.h!qFY&65}+ z5=]đ,^aUaeGw˜)2"v9KdZPB F-IΟԯSzg1zmK<4a:Їw? [1jgNj-{G]ϭ7N$U]A(O} U?5oɚO2}j.Cm_;@Ƭ7*:NǚTÝaqbcgH`4x VS弓O>ZE㺔na F =;؂9>3km1Oh 冋)2Ho-ܯEiee̱yV ;ϥ,`1v#`NTx,bY|tzbW^ fZ8(`Q/s+7Qe͖GIE*.F}jw6-kveFMq f>F&/ :![ޮ1˞~"y+εV:CN]9/S4'q!8yj_"%tw>q%//Moy3I#$[CGvD3_ iS@hұV\@J5Em{75lYJ?"Y0^]6 b7kҪ!I+e#2531! K+{nV[ J Mv}j{cMMQ!6;2W 4CT8NmmR1re7 }5Ze=I!- ;O9AP&b*%Ay@b+cvqC :D ݳЉ%̣`)Ө@rJL!{Hܟ]" Ls{g93u- n\@b0W9J& [VL\Z[oA۝ 3 қ$pdV4n [Oyb(jڐ1 sZ% 7 ɷ„ ;@n7Sw9!M_[A-^Yg;oM/z}͒1rS,!#OVgFSZXd $z2YŶ@|@ߋ, PTP[Br|3aaO&~F;P_68Ds/OZ9WǪﱝ,: sH2ɛ,y{*~щ*;q" ^b mE=DӋrϓs|:?ra9 <<T"ŅA%48#7'9N\ZJIGddȏցi>d)< i j+q{n.T qKí+#IUau5zmbUfj91}:T1!psw~Oh#|XhfPǚppku_"d5s9榮 ;Zqs&.5_}(<˔RO <=_?c^-,Vg(վӟMĐpksR&;I-1. 
MBMcmdaE> FPqb9DH1i+C3MYڣ,\%v/;&0ȶߧ'bba۫ؓiQ]ɀT724A9DžI y9GIsɉ2:OU3H; d5k/"wY[jL4>d০*,2s_8]nE?*K~ϧ TDYg3IFF.okgcڢCQEJ339³CN: Hc<]sFΠ=3 CkLZCy!;UG~I!$9 #\ 52IG y7((8j{nTD bib.ݥ#|ZQ4vdԟ[]vl켡q,aoS#(xx-&sZgXי9|[>cS[B'lQ8Ԃ؉|Xbs5r.H\PWչ=IG%XAȲfedQ"LCh @o9/lޮgޘ |-=G~H~"1H;FZczʶ }ԥv^͙PxƢ2uSLܓ!ՆKu"\_(S`s+JTiyu;vDꬋ\);6Jz37e:|f߷n3Mjx :zd՗FlEou5s2q%w~_ݦg ݝU&b8A:*@x>ڤ^`ɆpҾcYl3 !7)7b!2z]e* r,p;s+fǒדdzIoACDM?]GZ&(i6_:t#liՒU-- Gչ黛NiGߟ.#XyI܋> %V1ޮ>Wc#Bռx2&wd$a (7 J{8!D6mZp=;m#v 6S^dgb!?5x"$=k_ny+:> 87"ǚeNt1ĭ<> M2#Bmؿ[nי`#wjM]ZLȳ#;Xh5#0XŇjUӊF3so: wL/)` l` iy: 6&;eLѓ?A6V2)EM&* (2j@}~[,/ވFהm>rugF'MeN>VK qbeL!N)Ysc}\Y,@xj|%83}~`aJ?~[9Ɗ@i@`F9Vhוj5@430'_^/!lM.n0_E^^7ƺ6k4UwކMLADDi "b^;_gДM?=^p*( &hc缟y)O+74 =!5QS^okQ}hldj]lQHƢOoDCϧBOcaZ}c-BF.I:az7gj`ܐNyw1gٽτt9ź$;JBmci0M\-1,S04wzȦbf4`<qyG=dr a(F߰qGS'$\a U,,4>`+ L *rHzr.J|탴U9}\+lMp?Xdl OkL1#ނ :AA=XBW&^ +{<' (F@;i X*616%1UE! UDEs> umҬAJ|7oJPpwcr8:y_`U 99:]t|7ӺL#[E͕숈DLr!8ǭ11xbU vV'Q>F ʶǗlzfkB63n}I;ױӵ,qEG31DAIMI3E)CQQ$PQAAJQM !QURPUf}o:o~O\R=Ì`n_v8L>& $I!Z *) h"i&dDEHlME;/Ĺ/erm8<` b1Oޕ9zhۀө7. d(6aDO=YKVa٧g;t˝{9ojyЕOd^`2gdY4ߍB^3e9Ka)QDdD?3P5[[֥ΌBOmhPyQN:SD\"N;Ĝ0xgJ5٬ZI4 qp ο2D lF upsZ`V f5>r &^;N _sr5.3%STmUHz^ʕ?h`ĿK\#Ўjw0 ~_eEv K6ɞ!Dlޒ{>wXE@=qƤ2`'hÁ 8NCje@$9dy+ D_?skٲ{>l чmn>rǭ>k*TްYZZps&33=6##]%$B4SSGN=\ae )'0AH&K]3(K]ԠsCl6ODD$#ptV*<>O}Oٴ^UDr+NEOa" ,ĖGBtޠ3 $C}D Y GF6 1|/爽Fy6ўyDzN 2p'c82g $w,A6וёC dSd·{ hi^ lvC/h,#U*6O#l^ ),VeGڿמT cLJx۲$\۳fFhqMbJEX%NҎVbuLj1{織 Tk-5D?mak|R[̷w]R}tTk97ވ's3/98Ø]8\B tff*Fvbonnׄ';쿁4<{]*((fw%FGG#Swhd"9(Epd(}ya [Kч+FM!'J}9yka0L)xR zܽx#Kz9W;<0o]ŏvxZDjtWxD)WB'rղ{9d7 %98 Eq˜Tb\wA.y hR0|pwPAPxRv[0LE ;3gDa?f|Bug{FZ"p^,t99[_`kkU,Ew39` <̀8O$JpMO)M^twº*7]ޢpMmb#qEEDk2!+nXo ?i9,(y>Uu$anmxs5v]:d0 :L,]8ŸC~s;a_<1q*k7 $weawD8!4d„dTkah'ѕ\Јp0&J߃iy~QcQps40 gr0?[_;n^7ƅm#<;!qԁ )U9|@S=7Oy>,rXN;h[ÝO38v9cM{ykg@_2ʗruz 9A5Eq{8'o]၈ &:z䵁%%vz۽6VLi)ȊEt['iݔ ,Hcϐd  tR?|ZL!eBwrcLFxfv]HYl$0i M {jޗz,m7sGVǗ'6D;8m@۴W/iT!\#]; ywrhېgb#nx"doOi`3(Rj 2) j$HrlvcklzOrMRїoC } @bׄ ;YR sq/htS*m=G+d7b*3;ӸBnC\כ{]|7eƍbmDϳC:[{z6{~=u\Ui +iyG֛R>DX=2bpSփ7%;LW]/{$k3KwTqu"jիM|)sGY!;m3ƾ(dXlCC"! fǕ>P9ȗk= Rb9Q}JyWAE{@*L+J}A"0> L cAZg3&D-^*<[[tP3jVcؐ5A,tά>F6]"HF]f1`~(N6hfI W`EoCyx^.7Y[䔹* P֩2 9Kmf +CǯtPq%φg \.h *؍0% ZTt)[4y䢨@0;z)e``[ :AuvWvP*Ȣ 4 [>̉1:) + .cgr'b+iO0oƹ:aTx嗅3YG#j7#/iH`Xz+kE $a(R%\o$xjdU8:O q@F$ D8!YRUNThzK,AL~Pex^+X55Fɘ5Ɠ &B~UC;$óH#zG/ G-nw:( -`.{]zlbTa$MsըG<{_X" ahe30W3%!S^7%'~+x]qѱuFaB;7,*4x@24Bpe[=&IKUC/فx.ܮO*}Nj 䝴rxSw5qybW>]#*v6TNçwPP .CIHM.{{`̴$3N΄g)Bm9@7Mif0,&G pq|Od>7׫ >jj&iBW!)3~P|`+* ^8;2INhK?1W}~K a4k'*USKƛN7{Uy1%1X`|8 ![9㕣!%ju@SsxN TT\tۨL=-#!7Yߦz({vb~_ DGN%˯7)em)J M6dפVN4dtYj# 01͑TԀ8ePn 2?Ch>iO[D!^GI3U:O1*ru[hj*GC[zV|,jz(7^ iF# IP|bj`ns^+}>:an+7[՗o?tZ'i ƝZ?0Vh By)(zSCA4z,Lps)74AŜ.B)Q8yJםA `.&'yf}|*CmiM!6/ x$/K>yQuY=ߥKĵAH+1eh6#H`B?75{.W(1"LL`}{"nոl.a$R61^F~P !{SE5.Nu\l(> 1=<asg{v~oJABRRPB+H%4rHdĉBg+ xD:DeR2к;x{<.ڈ*}InZ詎 fwCmE@\.p$qLlse39r~ВO;ƿ'4UVȶ_Zę򳜋zzo w0>@ n8rWB|X&HC]Tt8c,TT/qQmr5uί4!G !Ƥ)I5R$Ѽ0.lMj axeX˛%PhJfT$%s+:u x"9IVs$G9YeuLCs$! _pb0H &p<6ē˕#xc i}tKP},hN>9$IS >`_ y['KF.)Y5['/.dW;hCY͑!D(HXs*;^D x8rpDcRj|NU0)(:dṞ}ua+ETzr l85k(xT;?(goC)neAnK)f9 ndz {EKu oΏ}-)1usI] uDGɳš~5T#BDT!A0l c 1uFHǃ,wEp֒Ж):0%PQר/(wyk1_FU7{=W,U$6$1WAḜse?8.!ܲB5msOϨ$ɕP&u $f(G݂(rW& aq]w5pKJ< Wӡs&1 EFf9A?цeW>SgeE, sȈH~_U 4w#,*DcҮc F#x$=EUM/?JO81+ +cH/""0b,\ß!" 
鹌1veW Ij8}DŽ[x[2hF01=f6>'a35=+"`_| 0kQxƞ rB0CBAXj-l\V;#j 8 vm|[+UR곹8jhI!aP7)Vu^ 1 e"6N%/p} f(ܛ)OC"7F&e$@U(-blǞ_O#nY'3\ (brG$4l<1tCnd o2-kL+/Ή#9Dt̛(\*2꼍k7g#"0$I.6cnDzK"X8q]M+J"&}ލGTV  q܊y@ {Cti/PnX# 4EF ^)⧖ϵx'GoSq6Nal {CcLc< Ȋ"lfUz|J!tQ2}>#g|Yh*4N?j̧]Oq=YyHm#=҇(dL{O9T·9}{x1$d @o;..>"KPS/~GwM:AzsϨ\dMhu@$pƾh@~OORHy:.2Dc@4#B n mZCmA 1|bm57}JPc`9a `W>kuȝ:)ޭ#%q=ZL$/g\xцGչ}Ɖolw(X秖C%D+L@9%%IW[  ļ:w~;LuvF 42)D HMZ7ldƞ4ҳO|00eJV' "Id}_t,yO+5C 0:cP \-q<]Pi^r!th+\1ΠnOg~Jg/Spc4^8= &jhIܢDaWe_ΏZ:ex}azoTE20GD@1=J}1wkOy{ypI?WVsM 'ˈV')Β@>1l={^X;Bsʕ$񡋴U\-DUR }NPz2MT{m'eu;mowhiŠ~&AX`5 [5݊!sO}rZćDnGuAA RR8MvpUIA4@NHĚ#W/yěHnga,Л.RfHff.y7chXХe" 2P]R8.H4f^+ҥQ8M2OLa@ƛMkXmW]]-oPu7w!gQIjs87N;R=Z z ! VmeТqS h Pvd /I@C4 CSGx}9lpg76b , w1q:BL#*L!sm_Q i*p ٯڰ6sZ\P巠?w3[Z ]GURru8=ku4w{m:rN10} Ym^07;K#\g"" wU[[St{t $Wx{6 ׉P.EQ(8m$B?p nQRգqI>ĀH:RV'LuoO%ȩW,{BM{W{=-j>pY%MOc;쇜Ȋ^X-TcKUFB[RU,T*֊r Cռ P\YGp0N&Z&!ᓡW3r`FW=´@٥걌 hH,ޅ/YpM3KYRPZTE)gކESIէ.JK.߅;*bۣ @G4wfdޑT,ýUn3{] &g ˆP+kW-%탠VqgkKPR5n I\VݙUv1U* %<ψwDP)c{Օ$q#-7Uj؄Ӳr{-#CO{pAH/S35yjX {]UK"5MZb/WRS|jXYJhn!1=gWd{NB9bǽʹG5+ƭY9ֈB0w;KX\*kO)R Ѣ3_:O݇rl5Grܝ|>aQ{2$d@DUDIDME{i69rˇnp7ᶛҞV"6.ɕ C׶(zIn.n|D sw&y@ޖ&O\"#oȡuyK  D$BWALA\ /GTYJAs4T:O񶭦]#f) t z'17`FⰝߦruXrTU v͂r}U>֌i'͙΢ku_dsp8zoq跻ޖX h]bN}a3[~O4cExC8\}Q4o7J*ayFzlwZ_!vAG d-$<8G>[($pj aa#~7fG:4"!뢹\Dur[ʮ?lT?G ={>\GgqIŠ&D> FMt5dY<,GZNF:5N0M5 T6xQb8Z7z-ňdcrUs1;mf4:l7?L|#nno(޲0p$$95S/Ց^2C0㓷wRZ(F s4klnpLK&H&mMU#:Ax9o$`ZѹG2Ѳ3Wc_QwT8g_E*tk[`CAV"9yM`1;FM9JvQʉ' 2uor`U`R"/@}-#A@wʹd\J3f1&Lk0. t3 `HO6 i%0$q!Ju8zg{Id? ;ASLHHL+{WxqwVƆOqeuaaL5h-NcaFfAfFfFQTpcYB yBQ *cLzb zH~FO6K?6bkr<+ARL!T!#8dA(0l%(zG8W[ `23s^wZKpkMyZFN~xhD5 0{]x*†u?-c@F-}4UJn9C!m8}E[|R?'Y^ّ ԞA(HGqߠh7owVWy nly^H@xŬU*c fc"x%bFTTnXlj@04U `0p8܍c \= E 5+\aGpBgX({KEo2vp~]?J GO{~:ε3.@ ':&7B MnL?Z$ݮ?h;<:@֥g{b$hEmo*H5p)^S!P3bBUU÷ h0 90B PM+M{ mF5U3ɱd%i !$[T-6]/NTd{uGuя:KH?.}9V YqGV\-xF,TD>QގH@ɑ{)Hg8^vVB%!? K|v$`#MetZ$N ;%@2S- ǝZ4 vCgTWw3~Cd B u0 ɹ$4Q3}١:3ss 9>Y\`? T=1#qޏ<퐓&oH ewx-׊qU^}Rk@JC O l$pQXu !Pz+3ÕΗi l'\hmqK-K(!AӈOV WMߥPHf\ u,(Ea֔1")-7{,ɠqa$Av*zܮ4[?9]=i ˍ4n޿ ;%gD/‹n7E|s'=kjyj1kp?8Zn9$mH{ 8WB?~5<&B[@5{dp~CL7?E Azm0FA19[୦'nsE`BUmˌwX"xD"Q6;sFuL9c7@Nf5:mjNc0@$0&&%>;sVM_3^j+?]d\Uv3z[@+E@#t`7tד8nu.3C)a"7U3A=t t|?safg^[:p5 ( )Fq ~n:G KAb E|I;>:]w5'L-ǜae1d`Hj6MI{*5}lײ$0UշIw LtȂNsiH Px'WæͧywC 90Kef xvEEd|I!)xM.'DL0Opc# ڷr "A=LӢi(046I8rGI-4 $6Mg@}n}UԆLMh~?_ c˃>}B'g~s z&4Y `A6ye5둠6TENAe:3ڍ`A93.1:6@i6ht.'k08w>ͤǹ[8XX̮c#Y=DڸS\{qnL2Hxd`eršFtg?( rQ|SR.1_DLJj.?Eoہ3vE%ׄzw!v /c]"~_hmfį @Z-;GPs qD2RU *}I#bB=yc د~.o/º[k~bAԐ[#LBPb!*qU*qL`]r@Fm`o2ֵ:Xڽ P=܌ޫd~tM#5H!.4pe]&7f&]31[mCM64|"y'{RЧa_ېw7fѠ9쑮NooΕ٫Œ1 =9GdoxhtѫtP^We @kmlMli2Q! 2LVZ"pc%lѯcׯ?v$yhy>|ڨ:ʆO5ȁ#cn0phjpxCv N07\py$O~{@@kO155WZߧa,9m \&% @0Nc L@GS! $efMcߋ]&3~Eީd+R.B<1 < s0Paw\g#Hlv|tXH}`?ܾp2"& 8I QǥsJCUe[|sU!_D+ ;gc0H4dlBbg\ ӝenaD|&7JrdI&Z RL`1RT5b1 #%c53T-z;,4sMpy&y.<0f:iDtŏ" cTԃj)1%i."lvF KpT3O7'*<h;:Y:Ӭ5SY=ɕʔ8V%o)u}7}!}dGiHk =L 9ZΊz>Bd 0*D#0sOJ W~vSMM󟘉Q=pY +P`8.@Gwp͉#E()A&*Jgb6ذu[ k__1i;kw4(+ у+O%Cź( (D͗KM+v WS~Cds7|ۮN 'tU!8mπ}ǖκ85|;}9j6 `@بcecwyߎTX\O8~S1(>?SLپI6MUR$K06q-f) 6:z؃']pH 0^q2OGXY(˅LR᧭FqI=a]ߦ;n&t^DUk@%kk$]p)(XFpS$eUg(c.Dgj_/-ݳmUm$3IuNQg!nIgE_t6JFcQ{y`1naX<5c#2Y]w1V8,1?m|k!wؒߒxj~4մoUXuT;eT!K_&A4l$Pɧ*vc`>1}jWw\C $/ws͉X4ƓnW8?pc.sVYq FS|q?hs=W:GCC:C ?'}@S#c!P(j̝*l92SC 9D"HNc}B?<{MNW~Z`c PP+)Lh]>q\*+N_:cpe%IJA*jJm$rrltL‚fN('NP^8JEaɰMCcIP(uE~Z B\:0S#~L{,tSaD8|3& hf ^Sve uj_t`an[(Ih.DU b:C`ris't9@\cϙ@K 7o'o/Kxasmn%=j[D L A}]_Gq| % )=:}h2fm=@#T8GA3|@z~\8z" 8%N΀t.s:C}3y r_s]O W!Rm 6/-(-YGd YDޣ9c-b^8GIa,@s@6 >&meܒ.f^~." 
WGRPIHt+'?>T'ǘ; m$ ⾷4ն7oaE wxg}}~Dy{n `?M>UkQAF%@ٻM|A\׼W}eCWE,P%8Tzu sT 8R1,G# |&O= (S\9 )TNtYU4,Sk*J˂uH70#wpZ棔埂@^";MrWe@\ wmoz7&tr kG`| @&`Y@Kcud' `(F[N%eeuJqN8C %Nos wS͕:TM]ϴshP$" ݲ{M-[fN߸CIѕٹ(;3YJޕ:xH=ؿk8WU]_cw#F n)uBcu&=^ٜW}"E)TI'6&1 myn&~Rv5 ׅ $$ 7,]Nj*۔>?z(ʶ<}Om*b!@痰UpspPiq!+jBi_g!~ K܁}g2筧uqoɲ#ǮtFSJꓠZ!Cg/ji1>9 恩9sPYOs`7q,rLDLht9\il>REsP_s ;14>;k.Lxcp>99"ylR4}$cQ7&quz| o#ZYgπV*| 8\Xc>fX.snK\b\ K% MjS`_ktnUFz^f`/:.$4iߢ԰|WlnU>ߏxGʠn,h4ڿӹ1 mCJD~+pXB5}>fAu*F:5,UPu)=)231Eq1ѯѰ[6Iro1n oWN@n#=è,4Λ,6AY|L3NYUN[lyiwY]ᕌg&"0*{UvF8uyjهFkҵ[~WE[M);A(ڳjW 1l)ivSa@Qcl j".*oTФi鿃_ \`57&_ y԰$I!8w KP'ah=OM@^rO)ei"Ee&A. j$#Bgrk$gM, v!AJkJ 1e3Q LRiHm< _qg#D4'z8k2"ߚa례=Ɯs" GMwkwt(42aF|NV?o+fM6 ^FV&s)/LSz`(I|&bѭc)e|dgm+.nYG0l޵ t|hc]3#Lx=[a΍a˞y,v1CFCz >'u_C2Z6P`}Xj)0s7gFH8C%q'7kFёLjE{BPAîs0Zҗ&BBhyؿ GegNeҀ@Pz, - ūWb y[v7hJ]gژ؆?5r,ULM^dmYx:A' "BivDB5G4 #CBӁ'ns}۝Qj0`-c%3x-Ŭp3~iI||It Y 4h8j4GA!M{f.l EJ29JQ>OG-Yg-*u/ŗw{^hxn,˸p̓0noPwJ/o?pȊ|jZ,@*GH *$iT(f)RGNax>a2\b:V@uOD}ɟ5%13TO*~〠/fCW <_LJiIc':f8.$HB)J̇W"dtWa^0"0qQNƒUܾqs>Yc9$PHU`ߟZJ"5uij Jψ-iY@@"cRZη-BQ#b`?rhrhad5Ss`WrICmDm@:% 8 iFڰ6o8lm]XǽE5w{h`k3hdA\Cï#N1܍pF -;6߫@s~=BEW5]): ~s >G-X<[d_ڪƪfUM]/I=[l2RPDH2M]=kyr~:οKemS I6znM $Ml[MF>.M2k7A}-7*:5#o1)TWsÐTwjS9>Ҏ/L{aǴPA ۈh7aɓ}c+!g,|c -R X,{"ng<3H܋p'hHM}*uH!Gʁ$ kl~X 37΢FTY=ĉ>ZåhG aQ/tm1x!CgT09FRxp/E|"P"2z !ӈOAiV(v^G9{Zʇ);<'!0G:[ƷזNiB\2w0|uJjWB88T\*HB[4ol}|. q⍔~AuZ^R>>ئSWYRI0ށ:h@2G0 &Cdq z~pYb+q+{}D"G½MjV匄p™K&rƜ\b(#7T$d$tIp@ݾ1i؏ڡKo#,'S"Ή 9|^4y*u cT M3}èQdp# D`-[هlGIRVlK"lEFWXNC[U)!8|IiV$}"ۦwi—oB3W1 csASNwy94JU1)PO{[ye|ܤOGh>N/t9wF.p[Cp5ST+٨;К M+Ćq!K;D2, 썳lCOFFxA;C,Eɹ"8$_ڟh [&pk0l4ۅH{439Amm@%GĴ-sMMHFy0.~|= жQp , ̃5"37 _ND& Oqχ`u a.w'O)4ߥgCNIFO8> r0XdۇVHE{{MU\O?EDMY92}]ta}#ߢh =5g br{&qŐzP2vfTU +ᨚrd^OnL^ &5-HKV5%"8Ix8E4&K\E> 6þ-oՖzﺢ/LzR=k)n[>d֋u*Lp_?y[Wg}phCly},U -"Jvۚ}1Լgg6GGm$bq34ltlct]{4As.דC# Xxp~gOZ&)8 H&I k[86u Q!8AxM14k?-IkkR9ݗwSt4aeS)B 1q qMf#ĈՎ Rv7S~#$Q]0gS`4xޓV Ḽv%_148O+r[~?_˻p+S]|ݎ㴽UjY, ͮ\!Hv~K#ɾ4z=q=B;yH!R@`ezB "/tƼtNh ??X`cЉ:U2 qd|}VIR^$jċ [HUoDGR"▇m9ßjN') 2=;Hpš'OMh>{kր7~A'n 0≝ YPϡY%Mr yB_&w ز52@r~f?!kLęیjSD1ы=H);vNx[`D\FDl`6=#"lLOX tx@Ʊʱ=De{ 09ŀ77'!u 9,K1+hK`D&EC)Fx6YwSY4c;B*#kt62fZLZoNxLP;}̖:4#&?t m+p@ƻ:ݪ=Z2d%}aR4+NiTa`Ғ hQ0F{۱(tqioO}eeHʀEhΌS(yO|UNqca*Ox@8MTo!Xz>Rq"AH*ښ+Da"`ȇ4əj3ƹ9D=4%ioTMjVYK{%#z5`ף\o"-w#ܢ)u?"EyˮV)А{n4|a"9#|F`+Hܛ[jHp#/^ 6<IFlZ\YsuU4Y58hƩivqÃO6.Qr.sN?WmYyvP0$ѷg!H-,KF/@q"B) _ˮ+,&Oفcz,/GIЃTsx$̖=%]"eK͚YF[ '-8`|(Q 3H¶!N,̭+M!𗎻?Q$`U;rfqj2]B9zM#L9ÐG ǒ]Yڤb,F'rH. [4<-B.NqLp^VR’ȵMM's@iU}d/.F59'qgq^6_bH;jƩsFwNWY3+/XйZAi/'mY;ә,@ȂMY7%k0cѸ2-ݴvءID) ^>C,9U" 4Bs,~Df+%jTO1]W q!b)56@9٦RRrˏ g{vjp4hBNڈR pq>rS]υ MDQ%?uρGňS_MdBE$cx ?*pUԶB*c~h'\qu%qk8*2d`!!44m~Ӆ`J]et_~Y?9 -Vys:r {w^C" H4|oC>?NKEfTETba#os<'_ݝ2TV"g++$+oA GXÍS*k]C o3A+S}$ġO,vBTr̴ A7#WY?*Z3+Rr1v by,9\qRd M:0Uܦrֹ{z\t_'mvf5da}4 7qt je{j0 @ƌvD9-($@y=7u: xrH!#[uW_z򟶅ˆVഊf}p.;r9Hz( ckև-Ĝ?{f1K3(0:[k CLNSހp._ ct/zO{#p"'V ^ 8O&0|;? $]=g}UPa6k1#.l%U;8^.ĵ;xCG. Aٌ ÖqDf!,J.׌0BL-A(cc[Ha1TĿ;6ߍK{:1!#RT͙zL&LW[+(td󗓙tC;jT.hR+yhYd>᩻kLn  !872V $kzue2z*HKa8X|j ZM,ϻLC'-.MT9bm@gUWΦir?ø 9k\k r Zll(k[+s924|M?LY(:^caAP R7g:aNYUB-gMfcjW(?vYv U'uXvpa"܄l/ZƦ|8(;[\Gm+󠖚:$+Ԭ QzvML߆\c"B:m)%i}}mQlpzj<:-|Nq.6 qOهMO l `~Xz fy8`%e.Yl>PКfS ,mؤO&7RrKwiͳF ^y4 Q=CH.yh8&{}đ'cJ ( w-p_kpā$*B^y~i _l-[ּ)eIЀa:o@kWAh?Smj/cHk@wn1F.rA)Ρu!%{R폵҈Nɰ N}RRJ2,Jɴ8&(Pj5%vqt/wiC"̷ =R3Y|& )ʆ<7K#(hUߜ6ޣa'4V/'r c+g{sXN+""16Ђ4#'2l.]u, ƙC#^a.$#%0X#FQiZҁsQ^BUJՒ&9> Iqu=Ň|k@@#" T)5|淩mN3FFAn(d ~889VWyDۭNѣ,+_ѼDI{.!6CŸAj0 \?cy4<%`,a8Շ2/qp8# 仇OЖg(( |OQqgI'f/1?G&c%ns0 DN^޷dAs*)*XcQSJgc&~vyj$!p)r * _~ Ř{|?[Ϥ 6XN7M,{uII$5rO a8kvr^%sƪύ,>\/>՛Xi6S]0ђykF_tmǐ{rQĨ ~&={] kk!a0z36 5E!]M}(1#kSYqGo#[>. 
6~*B}·c*szPwG 0,9:H[dJ9atls~o}!i4,>@"(1rvYN"}Mhqi+2gt/pI\(]K!OsHt:h$1zJ'ZƈΞwÍwHLb3])qiԽ#-%]v4ίwQHC@ ͢`K5\mr^WԢcy5^P,@c= ADšdA%o%Ib])̅+*"a4UJ :I` ϐn#NL>/4@B^e=a wӀhPD 5X<ʾB@ɾXd&d(Gvs٥Ƕ+z}bb6`6 )1^Z:,[HJ, 行~'QivWCoޥFmMe)HّnՎLXO"Cyn=%85zKjcpDLu#!4߮F[&|BGtEG@׍[C#jFuU$뒢F{_ &‘;J) IcteVXȸh>n6*>>)ۚWݑlӀD( '8lٙzϏ4bh{^36vSb*'}o֙^r*R=Z }k5gY_A4jt h.avŮp^5c ʩ3 !MaCs{vv24e ,'o@=;%ˣJ#, = dËg~YSYG29溗"3@ʋȊ޿]L-HP{|n?P>r.Kڝ+G a >sنwâfF2W8,s7Jw&ְ@ŽP uv4I&D?.QO8X'ӿ]`5os3?ίA3 <>~~;siPHm}d0adaWV@MFf`s\smB:ݸk!]@#X$A3 `|=pcەkq oh<8.*A,C*czE߁-5$y!Ϝ4huŁ򼩘p׊%2J=ON<#s&8{D}@-@$L䫚*8Z֑<0}>5c)W׸ʋJ]?Aݮ*mCc.?*%_Di?V(;g3JDb@x92^ }C!rvsG>Uˆ|r4`؂]cU^jAm[@@xƋV,hmBTQQ`p_c-m sc"[ $70|.=at01`;|M\be1d]De,Gc _d <í=M2JWuXCu7XZcs6̌3o*8yLI*ѕH+Uns* eIRt=.6[nįFI#W d5dGn8]+n87ކ؎1I?O0<>)m>p\敝v{׼ATS+w:L%A% I])<8{f J4Z,7Am"0T; (7j:=2S9|Kn?SBHƴV㋕g$A 4YdfzΛYEۇ]7[lK6"3F0(P rj{|/#p'@cQ^YZ۹DWYK/ P*`;f3ꁩR]vRk,t;xL#j% UW󸾲\yYyn}Ɔ;:!M3<9 /c1{S{t}ww-m;Wp6 qn8VpC3zU>v?0"Q{}?q|"e^3MLE$R*Y:yJzbsKKMM='3{CEgiM# 8trjiBQf)Z(k7xx|3xama1 70(6zFӏ)yJpe#d隟1x5_Neiejr_i1Mb6ϣK*5d;uGu=AX9ӷ5MwnGK91uC|kwʞy _"@eys,B-HѳdbC*Tb d;*I/ +`/b[.dF(x6rSF Xjҽ9Efr:-yAdTNe &8.@&Y2:Ful^Xd.J"AZr̺"u7;l=[Cig ,mǐglWnjA𿑦g$`Fo=XipA0 HjYLUvV(xYiӨ3nAB$ւIf٣X&tqҮLy'RB|/,)Gڽe~__!/FGDn"" F"_яfB.+hBa>LŒ ygvVVlk~8HJ0*t} ?wJ>~~%أͥƳHV%s:$0z@bXc*xb27gc͂vڕ '4n^5 3 + jJIP*q/tlXYoLr7>tPchYfĚ*0#.D D,;-Vnv#hoL] @& VAP^뱟i9P ~"ֻ<%JWNWӘG²m¶6Wb-"T[wɒ6g VDFPŧ V!Q^[KQgޓ̥~ ա yʭ%Rm5).I`1[#:HcN,QF"Ȕ ZW0SՌTŃZQJ=R `E[kUmX&03#!#54Qal>Or%6LMұMz {׍qvp*1[ߑ*cajKĻ?i``詚pC7|| Iۏ{gL2"pIN|qWAQSڝҖ[g |>d5oO\}|&m}Li%^_p&" BZA棐UU!CbP{2Sm9~*O -~19fCV % .%6Z9D6z0"M@wCsV-$:ljZP3 ރuk{ͪܦ@!L7v3qx3I=y=\bԹ2LaTiEe29,H_|iɐLO @IږSPi.[íD`ā'  G=Mt197ՙmImdEԭjO%b2xnNd'3=In5Ȏ .:K]o#dox+1(&ytݣjې ]rs.W{Y{zQSKWBεVlQη f+FC<1'ț?H-zt@Ia#x:ܪ2Ƌ:n=??6CxDԑ<"R'YQkPǭ*cV<Ѳ;IPWi3Mtr;Ĝ.b&UDjic07pIofYH= 1z s7#kyN뗘X$b05r8%EBϺU?`jl/v=~zvsڟĨvj`Z1;dyf"Fg~yGQO _hΉ /jKJhUtozkTp@1u('݋̇tb`R' ur_D0Z.ef1G0z2r ֕ǟMvjV&XқHeP0H^QfMa5![6{ݍЯ nL"$g/8@ψXF7/_$Patq{2iFp'S>ȿDW?{(ϔ *M~>Fu}.,tKx]Rzm{ !Qzm^k)uxKV695٦˚A1 />Ĉ̀+=s:Cl +̑_t, :zTUQ.ޚS죍Ci'!?zkﺊTMH^KLiptYyNao '8AseWkA`X>>4vDȳJIe0_(t!+;j>ftڽtu v9ܜ@RUQwkBN1wku#R*$BvݢK"09@ 3bpue5ӭ-0󧍱6GQҫ( F52m`<,=簘*2;-ji,|Z#'",eAY舟f1s[EbR_ɂFqI?;g\PVBL Gv3,`ޜ0͛W8C]- :0+ZQ^=iE /έCB'myNɵL+)/7JصjוȣA54.bgBЛ(@>_cE )bs+vzӽn~Wϱne'݄ 0\L9 4hdY{+6lXr8ulloqn3n~nˀ> U;Ng=~_;;)「BNSe$P7b ᑀ1XѴ@?L:NjFh xIp޲ bk%ŁVvC% x HJ\|Nc ~qd*~] F{VKr0?jd`Zl"jj9 m>7ݩxܨ!7jnJSeDC#F%urf0Q84d _s$rw;pXIW!yt=Y'lw54VUzT* DPM*FgY !Lwvý[864рP> H~!=O.B|Dxs 6 G" t"X|1:G㱅"Lo<[Z!8]7DdpМ/mX7 ԕaCPH$_rE'ơB=q<9$𦇘A1;bRgzju%/}‘Dˡ-@Մ&SMwHoɡ:s $ք:%KkXOyW<8DZt\R.0~}/L|o`Y(qUaafm[RͮgcafZ4ˬ-Ĕ=#b0ʈ{vx|Pُ?#¶3iA@һa& iHRhP7Qci ` +xOVPk.˅lh hjt$ Nn4ttNii['Ԑ 6.9%#갘my4a;⊊r]UE(![k7 Ƣf)^qLMԹN#̱hzPNxwpYuۓeSDdYƸ-Ю^3S]Gtԡj^ʾZuy\ vc98KSxL oC9Hfaxep*tвMx혆iJnG]iuK=¸qn/c&u6 ^qnj%Vb> TaPffW_pb!4ێX&~1p퍛fNHZHt-I9=4f~A ǥ*6!3#Byb@=VwPDxF"y$F b,GSÆvrb֨pʧo=b^_մb9us\3C t#MJj655>f FeZ0TDj#6`7)HMb"!lx-`Yسe`p# DD/^K^ [\diK'1Uת}FFYS}zN~\B.5Hy{؞+D 9C'Ȅ.0`*d4#QҒ ӷ[ 4ͭmx=t2^p=~0 f,gf$aF@`k>L%vu0ẑ""{{!U"x'?UBI%S>Ta~yyrb>WAB6W+C0VeV=Iu3gBndj]X\r龄I$Jn6.͗6|mڣΒ3k8=lL ҧPWN ;ʴ7ǴjWuq4nsY,*w(wGfCPf~!1ju bt WDFJ|'_CC u18 =l*=g^n\7-a !q, :@"dmuDpNՎ0?>iP3pdHt>%,?KuDَKOxߦcɃ @l?}Y2CWFnʙ/qBX(\z$Xr2;;'mwD&&/'ØÔs^l=82\/\vE{9u!(3tL}bsD7VR ҌIOXXTB?C+q⡽#G "|0(ߜw~MM7DLnKPoݧC{psrOP>q:`iSRHAtQ^Am9O@zc+MQ9ŴA=Pg>F]+PU;"l7;u9;~w/b튀pe!Z40 G{L"am[QA8Q nx[%=T8*R O[Co>=p )ANN B'+N S6àu&iT jQsll;cc1@):r&ޚWs\A쨣)F Ӵgvz>eߩd0o8JHn.␞`U捌kibbx%nنߌ&5ȭW5ikk_\bmpN0ɔQJT@B= &Br48 o Ǘe0ym8Q'-B-z/2 @_0<PVίk:#= o49S;} Ř`!-W%n-d+xkgN(Nx*4/FTH͆jjs:+>H5'I3%8 \~m^Y×nsuxɶ7DZ$NOwWob-q&1q1E6+`|}cCWh%~F⤖ FI |kL FBI(BՇ\W(!kԂz 
C";[p?̗Ɵ9LQGNs$?ёE_|MkkSY{p;JBY*jS0:{cm'yT~}rmjM%*>]MYVh#K!:ab_Dk޻B-!b0CP6FMeХ@X',s! pQW@&˰\ڬಡ0gJ`܀fXA@ mrAsY9즙 b AGJNNΣU#;'% &t%xFwjzG81W=$sW_l5\AuLTGPZPBiiE  ؟ :Vg?ЪKˊ@£?Y[ Ld lg@[zD+r☶d N[8[7_RdA18 pmAFNf %kI;M8K`AI{jY=P3ʳcŻ*lS+OEif{ƫS%zAf&8*cn8~"jǞp@9~/obBdNDD $N߻d5I\֭ Z&3g([2(i^PN|ܫ6Hd#.DL@"+=>y}2;GY, Qf)up8RbejZNҞ]cMG)O! at _4[@yPJ14o|@eDEr0{z x!-ydEvZx tg[ݳ,FYɹTvTE򚿘ώ|"z wAcM&E8UcS{$rd #@ZTtĪ= l09&9>AzX B+$_ )2*M&k{22$>i[}78:ߧecE>O\_ziI&Gh̒R(U@XN[|dr`R:̈́T #SgA`睟`ꌅW'I&xTbˑu݂ĉ&MٴGFyѴ?e_ۋNM G}F7"'lx"˘0BPNFz,0rGGi |)э5%͛0͏"efTՙBsC 7g =l)B,?!HI6cgh.6%TH9_J$.Һg')c\,3I\d<)6F||>WqZH9"1pr7޿NI^g+r|;lLIWP)2x38uy~zP&R I l)a[U *Nrm J(D["=N* _8?q&$`%&xXʚ0D8$AFZsQO;I̢ʰ0৩EA9Y@Ӄ]*c̛o8PNAK$3yYMXO)"!#&/ꃳG?kcp\k!H7>b"!  y=,̵ė?vVHՕ?-@ d݄ m=up7SRM\ ETt}LnL~kG aqOJzvW ~Bm5^pomsXsc@b3 "E+eAPƘ) &/ Grmav:MprWx, 3$4Kt3n¡o|hɾZ\8 .Fމqp&if$< 9$k;\Cwp89dervͷϏVl18**asZd͑eA=^mo:vŪh^E4x}"Kt)l{pY7ت _ XOjm}'۱U˱`us(́oUA>q⑙wa7bm&ѻ~̡ioDXD c_Nˁhr*sY7ɶT.n@-:& OJ y4V>DtOEY= TSTe_T[ zO|{.[Uh@p=qcep9I<I.X8֌ml| &[mP(B֑OwߜwEv clW{%4y)ײз׆XNm$OPJdL" B;Kf{/?%|B ZSLc/,UP`EI^^Z2[oZԼ!TjlN^>! P-0(o| ؊={?mw ";~qZZj",3=:y\K6x%׎,qFa n5hq8kvi͌tx_Q1NOS;2uƆRH;v~V'5,r{q .F>)QT%P+$÷ºthJciF/Zy{R?OK8iDDLpsXQoW7O??X| QW'є _8ho,D9qm12%SWbpFpu~_/5 II}AYZpTyo" D4eeG8 p_=JU(~2(OȱJ 3Qb}w:աB"v>N)A_s@V$w2a[ )7#)C 5T7hQ0Ub5fZ"d'RfHblTێsSZƃ5lz*x~U}nd:]1W%oôd4yيp"h)eInZ^WZT Ҋ?LnX:4cz $W PO9icDp8˾NOM >Z(\M#LOQ %+},Q+p01+ #\9AMes_JTnDUzE&KXnss 7ҔQR8u:>/'[}8g:=cNjx锨ӊqA'G=PRAo-i砋Gv:8Wz3}+O (֮cѵ1lȵ,gYq*(Yt 'z KeﵜHBD!S|`,smO(Ly&wz !ۤm>=bcXa|PQU9ZK NS1a̍@M(ZDQIZ:bp̸Rv6.TO76^ I nʅ# r8J:4 mcEJՀ4TkSzGl[V F I1^_!z!'#X&5FCziz$1x o(;)zq5aP|eA39΄N  %~cv&n;= =<>j;e7+_bͲ>Jl, 4:~{)/J]7 9ޅ SuWS@m[ʀ.M{*Cxu.;u|OX M'Y+/i=] O+80v0h;Q W>NZ%##11=礞^VډܑTat{u'X#ygBU(!  /O,1 0\U4 [6ݬMZLg(9Ȧ֩qkC4īmĢ/ˍ*cf$#ۋHݷ, T5J&Cd1g FwUCׯoW4C{cDryy2 lm~e6qȪMjL=*;:l_") fBYT># )T`oɍmGLt|D @bDC'6uY\?'Dj0[!Tpr9 Ho9EtI 5j0j?ud_uO ~nCNajٟDVv^}(/]?i^NȺ!vPS1;wG"nBneTB!)Jr!O|jp գR)Y Ž^HmUQwG ~^X9ja L AbΜ9ِ>{Euo3f#BKp7~}|VbVEaP|oIBOvdr63/(ҽ?r3c$@e!o+ȈY!KC6N[p#m\=ត8TSQ#s 7qDm=N=^ ;.$D.MpcwDC3^Ϊ*қ]^n{A &6Bs@9jWv7XsHy0Ha:'lwp[+h} W?μܞv1l1.tP4Rz"C_`|t|E/kCCn"'CRgpo9v9S=z863oyCXϰ~4&514\#{xݮc._>:K7)KNZFKd=79z] y=<2Ig@ h:62׏(_ <`EӋ'_cڝC; gjHbXyJcIE ڶEs4%Z# % ԌnA^5ڀQtk59&1|Ԧwפ}boR/1``JWNDֿ̉᭒I,VtCs1,7$c X?ꋭپ3WO+{ \><3Gce4MՏRRbvםϧksL&OB5'+L|$Kz,}QVMbyG[G0i dК{ԠXorgNk$&{(} Dc5f FY+xUV d~ULc|Z?=>i3]xؑāH2eT6`ʌ{Ӎkwbq—mGN(r9:TnV uj R 2z^ȝA !2=R+rܐꃸsl:A& QN9V%QcDe-9H[z~2'1‚< ٥09Xqw Cfx}1">u%i (@I:W]d\ގG_ϟ#Lz8g ъwNkiF\'lJdΆ^Q=S9?&YjnM6ln3Tq>W#-$8 hZЎit׼ñM==v>\"@w' .?NPOw\~u h._k2nTzss}6Sv3FH#:즐|~g>R]/`VP_)(n9{_AT-w 5m=?ǜIq 1C9Ѧ\Ѳw:8X6wh"pJ }8c#e&<cm;.~Ԓ[K/WX8$)#C8PlOxؼ?\pzlVr%j{-)yI-ݛ 7h^K:BPd+c-_e>BswI8+kɑ?#R:BD;ËJ=FP@=nZ];qg+h:L@oMO\ro+0ob-OB¾- HHQ,=HН#( Ak‹};[~m`D{%B^Sq:t8jYNaYy6,\Jc J̉'fxl5Y⫾Icַs;ˋ%Nn҆`c6@ƃ3/C5#Л2oZ5 iWrq/h \%W!r&~Ɏ%>D8:'Q=q-q4t9 ^ UKY/k@ ^nn/^c L5IF*NT+jpJ^_J8$qʈv!wccgnDUyJ:2sɓy,FNY"N:y`c3wW5% O;-]vٖ{i/EwHu|ħϝuyR)B:-8)ۉrǠxfvhgwvo3K,Vs_UDi62 =?r'o#Iib} }w.h l뤓l0b(:mAJ<sZ~{@q쥧S:#9?ͤrf3>j)ĠN"80j)%[^z{7i p莏x}L}VNa[/S $Ö4.4I_B'Ry37֠V4ŜJ FiTAD" sXW;m7$K# 39-N1$NݟGFR# veHD(k^G DHR>20棇%7f9؎C.wO%q7>%p8,2^]¯'Lhbߞ?g_!T+,ݣ͔ 9ry. "5< ^iǞCCOfty)=x LP[,ϝpw;DPc5_,Z)poZ(ܩ< .4"\Uc dن`k~:Am mk(Vt (sNVPy%-SzTЋXRRsip5f^!`FC;2^Ea? z`# GGYT { CpPKcSvu5 vH_d/2A29BI oRߔs^EM|}J duj[@!d\_. 
w#Aq gxQϫqz ^/ƌ8j$yl1}71Mae+2_k2):AIs#s$,.[W-V"P!bEupidrQWQz3cdaOG,)ׁHJG SҪXg!>Id0v7{6 pwzs # T%x S7zqǪPI8$wC)-`eɇԙ-ov6 ˊ^ k %ƽ+{^a>,=k ,G!1ek[q悃"GGe+Nɵ7~1|0iZ,&Y) (΅D0q9L*m2Nak)v>جxՠ^ ަ )Uy'W+kG%R=d腮f\EBv g { N &W,zϙ`Y4ZV(Z4Wշau A{7+TJO #@7BaAW7z %0iN`$^4Zj4\}?Oxr/'A@zao~i:Q%+V.]JiR̈́)m3mطGp]sc'QJ4Io!|guh{E,=FK<#ċ]#8~ ^֩pqDݴ|TFVKlޟѷ͝ﺞ=\M 2`WJ=~:AuHURvv Ԑo/w%=?"-"}TiP7O^ۆ!# ȴ'-Fl?6ٴ70 1MQY{`jwII6MKD YmrˎuXw8OmW0=("Fa9[1쨮3mCm#w|wQђ'YpǮ @$MJ!mfܰu\} ,:+~J@cw%1֒?.yzUn01;%擽AMu$0s~9&i ֪! )a#OC5U&P0 >'Q0-^EƉiB@uF4صk^s'L0Vܘ E^5`y9HV&dԟJSO*\h \ag,|p?;"r| 99o4b>tӶfXKa%H%/;NR#P[ SJR!Ǫxˉ K֥8Ly(v@}&y R-7dF@mTnb9AirFTJ;;prd1:ANb}:~.k}v,ZW zTW B%LZQ82r#e ZY4X1^>Y3">7xpcOxHd,?M|[M*5J$},;yzSiط*|,$"D^-BsMosY1RZ jQVk-\鶟Ceo‡zq.xnrUZ)ѣ&t,uT*714ЂѺbq >ۑ:[ FhG|W_jOY!) ygcƻLu%' .U葓,)^b % db^6lvjС>:*UIM%w|48V|nj - S5,$68ngng=r8[CiDG\$9S̥3:}MWIJ.UuBM;-8RgW[5Ye<0+L~7dDTq8 |Gƿw&@a_h(>WQ%e n.谒rgR7x򻚭`{YSWɽЏnui`G 6T7)r5<GжR !hZK#0I3ӄ 4TWY-y=.mÜA Bv9tg.ti>l<8z ~ r0;񇋊o,ڋ?T.!X[Z0@3.AX@=RS^Vt~#Z@nޒc2i%mZhbgI85KK.:":OO&7l 8e^['>~_ O P=wbٸ\cL!耲1(4S7ueE´a?A$vP 3.&. -TU ZtKȚ !ܿZ=+88 =Xd/({ 0nKGJݩ:cnwI'Xh,YAI#plAw/^WW Qڢ90*c"hѓʖ47:yfK. " ڥQ.@E+J͸إ.ʎ\ l;\gwqbJ tGoldRƯ}Wцz?>L#9}=v7?=^Lfcz+B6;+WM5u/}3 /}$|}SW ZSRl-(^K>eG/<fJj"lfX"v;?tz^iIξ{0AJ5|U?':(;=خ|'^Swm9c{ТQf)֬6=ZݧԡLy뗆mY0q懷ܷ cދêU-<ώʫ)Q?SiWg(Mc9G/C p R7eRagmf@9hܿAO1ʂ^ˣM{r>(Pl/9Nj>]'Kp4T^ ^CEXKmfs} ɭ}%ϓ8,"y3a%1 jy}+qUBI0FV9qs?gJ3$.$^eXCi4-ʹb%liz|e_$C;MwBA+|"/5WE܀|<=; 3ˬxEjmӪizCH٨ ͛%S|L>GthMycIDcWx<2ki00[ /ƿaHʏJ]6;q| /8/dX'[94ggO~B0͓>;U; [FjJp~'s2 $}XAKjHPh?6^tҺ  Aj;IOW?I>mƋPUc͕%E 91?AʟCaۍ:ʂ-tT [Tg}3Ot& !N&R} 01an520~ um>/A'0naJ"(F3IqY>,z/}Ig7v]{á%iC+*Dnd3Iq8#n߃*[E %:'@JWR")m$= rj7[[g[,dx1BwY½wu v9+2gtZ߁Dr@mr bC$l~vjo/[4&1<86=lrncGr8+g$0͗~횠Hoj⏽!QoGCQܮ2I1hYj^)u _ S>^df|\7`gB{rYr'y^EZ / :0 Pa)/}[pk hpqf)ZƉ2!e?lu^m$8 [u`H?)7y*ڍ ս†5ą|YmDK;|5V-IޓU娊T^/@)m(\-񁵅x(/$>> -zogn=Yml[t)`ܡPx9bwGǚ>:1\zF lw&ɨc[t#p]C-.(V~iimoycJCo^+];R{E' .Z& ʪӺ2 !#&x6O@A<ҁN)!oB9%V"QuSD9E&p|P24z*F>dqƴszH"xf?Mu{ry> P5i/@7Ʀ~KemZIQRf,Zd'UK!)h/Ym&>: ;)oBLC^ں~s&Kyc65+o/$$o,ܷ0_ӧʻЎЅg].b1ڄA.x2[˫.sތlq\ϕ `b{s#Ci0N&yZ]ƓSzq7JsX\CO ¾Os$n=J]7_5D~|S4Kўg#*P wX|L.lt)sY $ch&` 񑍤q6pW3;6/WP8UCJi,McALy2q:)gD82a@,t;rޭݦM_HPɎHξKutc iWfвkr#$99a˞!ã˱p /`1b|F+~G_S mmDPHBdW;Ծ gi(~/I5(h\YTsg/āyz#QA.݋ yEOUB>o.VN2Q*q,|fߩ^(H>1LˢX~9M iu{<4ijk-wÁ-^5 Ma\S[5B@e UA;ZJsVniF=Ƒ_Kqr]|.c/Cg9Ìk̈́r8dW@8ay @]K7qsITe){!-q.,nn+2_u+/"˳ytuKBj5㰊w[ .(yq)#\ςPK{DK@iqf.OQ @vjIIR*s>+iNgh2L% zICLPȧ}Mls;1T|-DOe6Oe>UC!22G"$쟅tx܍^{M:e]J-ށ⯩  {J ͹sa"tbSB#L#xTT.!SNIa"P@d8Ǵq:wф6g].xP]\c- ;-9ud&es'%FrA} H3G096MC>`K}~FM0϶6a o@]l5W; )y.{[;a>ŕQ%Y RN09`5b?㟇ΩnQRThn"Ĝ`tԟlF [-M*W;Wp˯L~joi贡OgPIu66FFa0'P a(NmԱ'|c&Z9pfo7ʅ:۶.% 77D\<}{a#7Z%PDڑ\\dJMZ\ɔ2HFgOd>0 β]ɬ@N9hTA^K՝g&88Zo8ܕcV[PÆFʑΨ"}µ )3x@qvRX}X7ܺ>nd']x "I|Í1MѶz->`ۻZQ&sI?];^%U rOaCD2Cs(&i|K+Ź _ӱs v믦2>;lRzG]鐱۾~[̫붳z4ؕCK%kRG bhn"xG 2 \QXbCߴd| sem3D EalNF(il91n!srL?( ݥo7=Irnkgi2XQD躯 I_[J"q0{3L2yy._sDa$wNj[T&q|_  Yk بN̨ h c!aur2[#_u;fE $6Y8bYGJ8oUłFg}P^.O˓)F(" Qqq5X4v8R,Shvc/{"I#b1z}{\ׄba'qz Њl{o3Xqe ͧyX$F+5[_kS΅6ڇ}Q(r1ZDaI2[#~C2OH蝓Aё_Akh47oZmUܒ,ʑC<'%+VH[ꌂocB3?G|@Xu b0I.gzQb zmD;3!V_ɛCoUL?{866-4^ 0,.jMs0>ګl6!W`tl"!S͝Ʀ Jbi`*aHbaMTvjzRQo LYAob5'ggU=ώBٹ̂#JƄYoCNj;'J InM\:gb9uuL80(JX~'eUt7`w /qC AȻm,pz03\QIT!e/1GD㗿AI 3M7 *8wNUM\N/ȵIń pMjp/nT& V()ßsW`чqǪow Xꪀim:sBE^Қ9j Üxž_>_u&C#B ϢZtjO)}߁cFt(壝b[tI?'J ҟ<%+1ߩh{puV%}܅1\=HqT8R;001! 
S !C:;%A.C6N4wZ9TnvV_"Fc$K{"JgrvUl,'+*olse ~G΢ 5UCLb,r}3wٴxkh1zXr2'E1وs{ekf G$*ϳ]L3SFzʅ5p E"E<녬܎A5XJ)HW/~t"tsvٌ 5g8?-udr[6J F"C,' -?y*Q(1Is ?bz<.mԌ6Ěxs["0 R΅_uR@A [g+;C)d(G4hkǝJeVL]n:E#HpUdFa!8@U~zjB%3Q.)Кu\ջ̘oROUUmfzӖ {|F-UTPIRQ^*΢H@כR`"G}Scb 'BVgNs_ʏbҭCJ.5& /l'9\e[,@JjGK1>ef8V&BtNTDWυgdN86^tD©:|m'SSK^P˱C|U:>39GORa9 |gsNKETr}3#bw%5fźB ,6b2"^Hщ;+];Lf$} \\jF46Uz9R܈(Da˷-O5V;|$'wt+2A& q ) PϩBZ6wTqj-e焢s56exdou}TC OJ" 8UC*`Nh-S5Ԝ$ikOežU_2- 2i iw^ tɤS"v*:<<eSZ0ma`/C'Wr6^2E2_9F`a?F[墈k͑lCit;H> C]yP=F#izqTd̃vP%;~mLdG!('dBYT<|чkC#I= t.NRH ǨCΡq Ry\ZrIے8_ Au"')Ѧ!%r"*8;X%I\£:[!LeBxz1zG F/ۣ BC$ U/8k8N%I% hKHgC!o}7qOePG 9 .Jշ.Jcӭ51`J|-{z[HZעM}E%I4m >n%H!UO=$),RKX8k鯏bjqsS^nCpB~eDe])8ɽI¶ h/Ւ{XpNy>rL_ͣ c[a2PueDp&z59>Z 0@!:f=M#A#v2I>'Y nĝB"Ƌ{%jB3tjgےQ,ofǟd/vXt_b wǣ M|io]]eB Z gF+UBz7dsq$S5y5\1-vᢊеh2zӀnYu9G(ZySO:q#oTfnS<-\vVnz,FI1[݈ P<*e0;v|l5G"o.#ZGW+3Iz G3*KF :ϝY-OfEsYL\{uRF\4ЗTFzDve[-bgC_'1q~X^׵Yn@9̻)ܬ{ҳD+/xj5oj#II˗؎^77GG¥ 8:8#[ˬs+&i+bHpOWRHQ_($^$#4.YFU06<5~{ĸ˃<)qq ZҞ1.c*=Q@kБ )p[8ƢrHi?5^FW6wLގ?a0O.Xc_v񕨷䁔go=z,p 5mvybӣYObduZG-4;cPƄhkozJGn:k,@\ǫ)k?hv[m3%E箚CT_B]>3+s-@ٞKNC"ET_8ɀUYx`B0SB>os5U ~s9Ƒ^WYk~60i5~N Ԓc)\ 9%lT5ƶKڜRو]Ba+jjP5$MdP9ɀkN~,iGOJ8aSoU]Gp#ɞr cE 氊~Ǧ4vwI]n5ӆ脦+=ǝ\Lin}ζ귪<+Mȍoaƅ$0W1JhŷٮK^~6oJomOO)Bĉ-&&Apָ9#"0/#^S:Ӓu$-kvXt!p%GdH.Ӑ73"t%Ї`qto]q;I؉yI!Im[>h>jqW}"Ktoޖ9tr[a$p>t~JR>'Aj.6B!'FQm/U;\'lus2KD^qeݻWZʸfx9B.ͻ00q]XP+Q 1t }$O*1Vٳ&Vipٔ>LcJ|\ٙy$qU,,Z>])]$jI{:qNs}|I^ZYng9)Ǎ'eP;/.z wX}H{X OswKV}D\A][?)%G:U\GߗIQ1 =j^3!$?r}]<~U2XTit󮟠5qR+4UwŅȌm"y,NGt:%EqX7Z"G̫KuYɤ̤O"&RQn+4:]ι%bRL;"v,cǵQ7In5 ÐMFWAE$\CYۓ#njZ4}7ygvS2JGA 9`$)]99hFKURi8JףPz٧3`G{;Q!#D(#m20c ]/uydt=.^'!=HWrd pF:jٛ-2V4||+[N0}H %AEA&u'ߤކLD`<~X,NbHT%a iZP:϶)BN\ng笹Jk A.+AY2:KI/Ïe%RtNzOH$]rD)0!WP2NMd6\`8+J6#_KRzݷZy{ݟ5BA##v|k8T1vX9#jy1`#̽x>RGkߩԫ3!ZG W8n RAq~LOB^JVgW;a94l8~8b%=되Xe :^! MPzօIPw|ځ'P!@(C8R7y5U& *&v58x߇" eYcиm=J-/W;=bpE9dhS+eyQ}ǯKR!&^g4iEI2K1sIs;keb#M6fNe#S -F=?GN~|aoq@;ť%ڕNr1Tܕ$uJ;K+{H!FJD.NK7hg)k@p@.g~a3Nc =hk/Y a@]W˿w8~]5 .X.@]4j9}DNF53v fٛJp4l@݋dA&|fo_DZؔ!:&$[z ViB-SZh2L7b)vF>ͤ6Lh Aٛjz]u<E|mTzpt`4;V(?vD_ _:@U?(cY-x6vKcyǻB|P  hEe-T1N |= "'`.HC_yr,X{ I$h)W v(}. ! <_ OBlj^&̶w[1qD>EAWBi*cF+$$ȿW` (|HG@yHDNUA|="*Ҁ}Lo "")("b!"3(2&T5Nf D% DZX5-9PҪ; @$!xhHImXq:/1170*:PA*qe6[08}ff {@ŀtd ֥CP8()!w(rP)O7t״^vS}a(uRe YCJ5M+ڔ9_/?} 馨•k BBs!aِUD|+ *m̊rP@XQ>r }̀wR"*Q@KNTo|JL6ru:y 6hG[ IAS^)1Qn屠 q4qH,Ѷ8kcjJh߹7SBsT$ M'6!e9sX#[r?y:ĚB]Mòw\_eVe ]anNo+ePŘH<c!2zM5#Xb@"O:[ꦒUkdI]cE 4#]vna@  "'7 ( }LȊ'BXDG]0 uQ"b@WԕRǤ"H*I á9XV,*Fihͬh##"j ȢG%b-F5EkC3Yk5H ]k r ʪ 3 ,06Ѩ7Epo9{qD4Szv`/nZH*d 6J}P'x n#휁-5ϣ'%yҨ;^m[U1œ%`uϜ/6fGkJ"UGRIDb!çNC,AB1D^""{)ye@CNۚSͬCXmdlF &@$H!n/k4hei1HgC0pMCsCZhi(, + n%X?YVM11mcm`: @顼{T: Ve,Y]t`=5ĠBXmh ,b+jjld'%2 <>_ST/yο^i,'1,-$3!~?܋sZL\M^%ywU8pDR$!c{SH7A$':z7 &ծY2H@=USѝ *}=(P@O=;yH0^06;6Nf9,gaQ4X^Q 8K? 
>-4I' >POP>VE9 +$5*zI{p(>EAkIB{)AIIE MR78I: ̫(ʤ+MzAFmBCBA^.azo&;8B-*P-IK\91"R4Fƛc|sXޢiSĐ.堯qҾjA<zU~>4jְ,k3V7 >!CP;g=kӊV:`@`Oebd SV˯(UV[B_uJl/^ڑD}`ՀG`DNǛ} fYP0r"'}u4%젿s=&tlhBxI"W0u ]kWk-ִQmbü>:\ΙDDQ:p\<*h^;4L]Mu$M+, ɂv BNNE=ĺf @jNFe2#ÔkNl)*l6eI LxR4i%|FCFsH2q獍&KʣI^&a)3S`ymo76K5$kZѴ5hA0`Hf5N&TX!0 &-wG>Wa,3[@LF4bCaXW讜pԒհK2ZRAJ6ZW-J%脤͌fy!.٨{ i]6rFЃ؁D"{kBgd~d1`>wṰ$$X`AyUmp &1m.DԌĀ0Piv`:tZ51ff3b)O$~H~SGgW3X`afj*dffa9fadeYL2Si#5F(l4Z1 k ՐaCejqL]kgh>ꇌ !JTpmyx|a޻t߇_wwof݆@" N #$Ut.XUNm5,|.LPQyEFUS%@&EJ*PEt BuPKlN;QA>rAG%W.3BШLU^IDPMAP) !**IJPVD}@5Дw+PňJ$YZ@LJi P !@rAAZ҈TSdTTC( xHUAD 24M# s0QD))5(y$41aT$ UUH;Q4%?a"r,!H ],;DvJ)@(=MJPD2ݬ0!E+KfbSījCR)|@ ObT˳"S Ȃ' !hE]@DijHJdRH!򏭄b$)N@\NFHDaa#EeT+JJɄ5*!$IU(" MTB `#} )ـ6P<TO*J)ЦQ`HRD^(%PT"R( T@Bo ((j)SWMBT!CBGᤇ4)dX$#)2d!!B:(MMfYG (R)yWtct" )J ',HҨ@‘OKo*!ETDA@E)IPHG]bX).c_̓ (DD>E{0"vNCIQCKC,"1SUTXNU#8fBEVJ"(.@-P@RPQESADDU34DU%%LdEDLTQU%QDPUP>!рԿ@ތ(DD]@:r"j8 O !IU5T-RU%TCS4,dB<Ёхv(!**a(P=~(<7-e?2@^X0×(?adR (Iff S҇"E={6R 1D6Be9"M%fCA)#% $EDILT 3EA*$` )jYT%Ei b(}.bSBQ(o אܤB>D=z$; RdR+JҴ RPD5PR Q#JSBLRDA4P̑Q(P5BC*^ 0m*9a>)SD QM,N(^ISiB(@J6 ZI:x 32#`Ɂn6$Jz)Aeu>bm)C$"!bNwE^GJ$Ƴ(nX2!R ν ?e JҬB5m*) /5*t6TQݐrQJ#F9~>ͯP%))"PRH U BDP)&Z )?r yj`bd%)h*""i Z "&"b"$)B jbJi4^B(a| *,qoרm>bR@v`ZO0Q*">rPUyI!*S֔S2^ys*K"Tu 0*/ UFƠ=%p+5j̐xVT =$ t *v%U:REQOOA "!=z"xFG yRux9VzPTD@DBHTSATH{di1VT{(> ?) ъ+p3RÄQU@ʁ@zv!PR)" ;B"ĢPg{9%8BB"(T?NɊZ;`ATD:RЪ. <%P߷B"UyUujBVQ?~+ң ҄*9 .SR+JoM'%)E;P9J^(JWh|iVQB>xP/"wx o(<(+فT9"/@N( ڔ:D uQ_H*:PPQE**q6^H'ZQC v[t!M@ DM DjU<@EX6UDhy> ( /yAA9d@"JTJ*))J)@E2&%EF q@1B2 DHSB*tHDH<% EfQy^2PrE@@#҅;P䔈E4д(3 ?-;2jABK0l f4,\`*BES ¢(wӨQ<)DBP~NUnLTr@>_ xzQ29 Ϙr D /Rxwvʱk"w&YrʉЄ^VIH; qBFI@rB9 G"$aAOƑCJWW<=Ӆ{08Gx]!""5-)ɐ$Z@R&AHq Lv#nHN x^T$  ˘++ T!)@sԑ<^dDH2,t6֊arj_.ސ%y:"ʯPZ@5(??ؑiE?q" yD|SGR@C'<-)TT_Ƈ#(A) maUbXZrpi`[RI%~hctȎȡ/:ȒAdvq0hm^cd3"ުUx Jf>^o?;u坛 T didy,/@Px4۷{z˶ў=Ue@  toI\i%a *(0eҲ"v==`UZm1*ņ^C5Z?lg|m piiTHozGtQpZӴ #E- O`&;Tд+iL-0[w-g]x鯜JR{N2t5PI % Dو42N@ 4DY# ֻ},ZQ@$?o-ztkDlAdqk|*m=HL:݈d*x33$ĵA2tCl( "#>6L-]'hg,@rXBÐGs؄f@^D3j ?zžmRc#R6H'"m NS/N >\4o ׏"^*hY߂D ׭Nj٢)5|]W!g&>ckH(a ڙ"t$S3)`G62FfM cfHQ;u0䰍9]Li#xfsM2ٵΎm~,p(hM5p09p2hnrig-mlo9]Ӹ3Fh TTu͇ 8 yW/åAPJ Ϙq M-!ZŎ.a%CoyzKb@@ %$$2(244`w [ءmpvg#3_eiq^$Ji (m3441Xy a9qgw괗R/s$5U7F&ceFVc{0)ŧ-:v1uΐ]A/wk"W6J$d9]l;-[""B3t/s&Tw ||nY{'hW[L@BR'A$`qsoY:ٛ{"DIm!l)./&'>w{&G[ iP22IvXݤq5-@88`@&v+!4BEʿe /wzkNvhce#&~۴GSMl/X4 l+{}cՁL ۔(7E&`cתs8Z7 HW #T4.f{ą$91lY؅]=Ri. %@H"23gsu=~txE2 èq&Bn(&%?$ W]@A o>wFX%F &Ę*&z<8 @e&b$ސ3s0q &QSK Hjnq$"dc`6Zxnb\ wj5VqAkkثclBD|l(]F'A(4@ʲ?Q$}5rl?{f2Ao=[TB$^01@2RiȀ сIV"@RouN-bY ܹ9qbY6`\KwWȇ}!ʶ(ʎ!zg֌e{į !fL @ؒl%{3{g[{W/G#E"@0I1㐉`WptN*pR1Ǐk Ɓi*4/>,أ&g`;Ko#u9Z]0^d|#_$J?&>\;.|K4D= ]>b1;|gl4nX8Ż"o1 VH<my[DKN;kceNUZ\RJh 'B9 4fÓ< 2S4e6c{{97 ²\8A 11|^dsU\2D[lq Ł0'Y=I@¦ MRpa]¾} N`3+TlE"K 6Iys!bb%ƯjI`eX`nGq}D+0<Y9nvpۤ/=b Qv+*#ES1j&yAqED@W#05 D` m9Un#Bյ@ewM9N޷9 LJT+"`&_jS%$D0UAjoْ̅(Fz$÷q[)-#" YЇ2 L`w0ȸ D@B,Hi$AzJ V(oaa( @/'(erP:m~ /Q&O)@V'Dx  Iag#Ű%F,D'iHj'd@(vi&wy[Z -/D81~jb[VgD!F kYQ F_oƊ9C ݱࣇ gP^0c9H%tīmNp+i,\8]YPpM 1 8n5x+^S=M9[@&\6{F6=`'>rCI 8N 71UQ鱦-,,)-W1si e'e. 9Ez%L J/E r1= BZtffSVA&iN6JDʷFms&MpԬOI͆a7QӠ1N>QVVtX 9gO*Iz:&P "3ΎEsK$ETZyjBph^wI9 ]=ž_wjwkgW4?ʔlj!q7e%M7N%#M06$*֮ e*t$bA-CUn3!#3$}l+xbqgg>)3mϳzX՟>kع(N[c[;j06qPB2HH Lr\$0hfE^4@ji #0QlLb`pGyԃ%VWyfJ\N ^JЀB"H+tpLBm#)H#ȫo#6; 1cېdL1ԃ Dړ#6!X PGqeENh(Sڰk׈K_M%n$+i8 jΠrb"q5^p@Ji@~gq`oY"piK.œV2  f/ 웅vbοuoN;R"bi"÷=$4&G蘶)A\@HERD^l.,I#>w.C]lx3djՐz;$x14BS@ ЄK}1" ctd͇48[yV &sr7'fC46VL7^LNuu'P}L"a5E“v5{5YJH%ښlEQNI 1! 
a {EºpCSsž+ ]Kd\KST|aj ?*eC ]a wk6Ύ&g N,v쇾lCSTz k u~3]xxns ao/+ȖbnH@>"?O"Û!7!0$'tZE,9 1l2@SwU30}i~M-Z 08lW2Zث^Ys $A*3X7{'f 4E bM \=RL+]t/qfcn"p= m/S,g9rZI$Fg/ӋX԰q(w&r*H0iq#@YY^U(h7coՈm k=pŷL]VjŒM}@~(V0Z۶PqEqYIUt(Њ!NA] ] Oe.F+r>7rDT@g]"k!4R8]D<]Z2.)rDZdԌ?Y Wࢁ`ٝp̨,!46cěF]Wwjgwf,)BVP ;=U1f.s7rKΜ4X9f;&J]^N!j6iגnўIʤ?"(D*iGuUu4t4CRE41* h)̤5#kXQǭõPi { D; lw @40hf}>N,F $C7ӿ{tI؋H% j/zk!H"XGer \V/HS$MxmMV0"=p!dZhR o+ y)vQĦ"32vwBB*`$Eس V 8!bpu $ me3b&0P]zД+ eClCXjHCMM n$I۰lAwb> * ڄNA "x+t i]Vi#Tǥ踜Q PniQTxpHP%LjĩP*͚r}a:JIu'Ei$`9%lI6mvѕhK7otq]a&- GPE WkDMm+A ;_;i`R& }n[*GyDfSV|b,4N%t4--%jд%K'${rM`ϰ;f%yq H~^(J{i %|7mB}w!&ߎ"qR9d݄0r/U%o׃ +C=4'kjqxFQED4D"+0֛5k/y}pۊJo^e9F\xDk%X?CA|.M1`l^bS1HBK1y%4G3]VߪF z,uwg#oۣ"8ֵy ELq=Heq4&%}`"Oh8k⛺vP #Fn 0ZK,:7lw5W1#(Os)~uYmБr}K\sܱMɄ/Nl+\VKpk⧤4da#:_v|86ܳ~[@D*;@)P~2DW('4"i^!+x(p3ՅO; @ZA~v0i + K*[nvՑ]\2X7 d !+D=˧ah/qyaZY[7 W+N UWTXR)Ubkfn-.㼟^ÑgOs02c}Ƅ"ZB̡p;g`9  1R6@2_so}ړZ8@^ ƖSBly1ŚpŅ]Dw#pG)v9<ڦDabŋjj+7GD fG{п; 7#m{hj}̻OCW堺XՀgꮑC%RdۛM 3dw]9wǑŷ]~ӹa^%T4i$n#ꦁs$ ;jB> Hi똱4Kx4$#4ul3BeH=\Etϫ(VEe7ҵqcXh%mn@0m:UueS6+3>3@X(mfTRW_"j̯#_)!ʔ$6gmI "QÈUt!E\8j3B5Bm( "cֺ3eT^RTŁr_~PK"} 'HFtWfe{?mE&@]mjMDQ"YbzFЉ0ZGҵK% Â6!Mri,zHowbf NI0oQUڟ;}.;MLrPP6a)J{:$#i&B[ô\rNZ`K}m%Z̢(aIC`rH5Mw{mxnNĞe(rANrA'f)2Uy8r<0=cxdDTC{ 3Ƣ=-S3࿾2 \2X]IT^ƔȽG_?;YE(g<&|*u3{*7.6˃S)s_u-4f)L^nE<^Iu$Mlm$aB?uB-"^DsȂwObGJ1 ąrq 462WAR$K͒X PM1yD!/(,W)4&1yjU pd= 8!֔?bS@o,/AX&آ"RBd:Hήy]i0|V$H.(,L@ӵQv6RQۇwkVG۳XKUӘrH ; @mM@*`OE ,bXn8h K9:/Ným<0aK0ެ,YNA2 VLe"Cg*RcNB.r&LPW `1R%DT`#6ξ@``q!׆iw6u€ K!S]ћo}5&gU<$+K9f"Kك;1!Pޤ)a;5x?=˵o?$LI@x,zn# eדt d$1 .~8OYظ#ԸoƟnH߇u`AH(rDקJL͞'fW;7ᩈ0 Nˇ?lG([ڞ!1b[<&)[ q=S@M>X寯9/ $jdld!";D&wG(K . # HpL!A"q@u5w m_3>ԴvWsC$E'r?hQD'IT*M_Ҧ]WSJ"mN,lvmusEY2C'%EV݀;('O((e}SK%@s ״LЎi$Z-`>뿎gaQdiJ#`:[_7Ж-4R]r?duʾ"UbUK(K `ZUɇ?.lƪ&4,rTp#םHcI QXvzDmC`1y\\wʋ^ςT^8ߊ+kVwYeV׳XQC.4}Vؽ7Ʒi Ok|} z!=oFÂi>}V&GY[Aa'3OF?G5iLOob W"0oo%*K @aWh3UH0|`b…I$98jd}A_QQ()5/h[A `بadRŶQ~t>@7xӇ/7?i&<[@xC+XJ?BsJ́7VCON!TЫesqjwsCDaz`CԻF6c 6&)ENJ݆Ʌ_"3CkhhNx*a {?cidDtfEU)KRdrZͤ@`8!9YƖ[N0HY] Z?6Pe zouD FJ\\&\hPg!/hQRD%'3T‰ !. ^F= #d!SR@|׵k+̹nN 2kp|'"T<m$1uY}@!$=ݺ\rS,g֒LesZ,rC9MiΘ$R5nh@  >D6('rb ձ+u .[Vxϑߧ9 Ǧ*4z8a(I;AF9H˜|1wmr+=|*Zv MW  IW`M_>TG@I)A )xE o/dy^۩.w)2!l1/ 9f4Cxn6˱SL ϖJtoW8iN어L]4.!m'cHЈilS"j{]]$9ab +X s2A"6ꏉj*U5j69I_ ̢F:h~o7D;'~s)T-P()N[ٜo?S$P@&A~$J ֐QOȕe@:dL6V"NGUlu*<唭$Z-6rӵP  iaa|f rP;B"^!` ̌U(n,55nE(]٦srWV\ dmTw$`#ѷ3mǥio%a(s ZϰDił-j i, XbMhQR TΤU!)h?+;zU`ķ:Jз e[k;@_ł EdPHWG>€Y_kMupwEÑd{0*v͸fgALѬ yʠ.u=V$mӸ1&`" mfڤ#b a˨8m"Q@ Bva,Q !Per/[DL#;rZO~一C1P/&_\˜JgmAM+Ar5uq \ r'0\ u|Y2bANB,egP08-$yN\JL h$jP}m*1!Rl'ӈw8THFδ4zDUJzDBRFDd+IX.lEhA;$;T,2LB(V"pZv6b- Yhk325b}=ytdIE4$$S5S ,6HH/^; p{BOdq2D `Mb"Zw@bmJ .В7)$GF -Џ]b& Vg=Jj0th'BQƳ yo=3/=ȗB hgQ" :4pKD䜘&46`vHX7͔"!k͸f <;Zvx 酪hU8Ue yw $@ 0ذj%_n=rA_@@oiPT!&rP@ICjENI IUI$-7ӈ] 1EeEPad 33ȍʃmNE}+ ʞXkP[abqxqA$&"wX#|rmeI4n^>@3I\H,Y}HW`4!Al ~P?"i"ƀ;'xЄ,F< XG |4@! wD r1xl,JA5uAo0 vɸ#Z9eu>vӬ]!Z/6KAYxJ3BtuԱ. 
y/vWF_mfkOTTZ 05dTHl{GCrl: 83X|,^Omio$Dʍ Iwato=29LcrDzV$h^h8Q{>Qf^.mrE݈!QZkE^2"cܞ?a_Ȓ,1*%rgY8JC/絣g~^ZsIL?H@Xx ф^3ŹUo{]#۴Q}*o#rZ ^.II(gza雿!UzT@K-ox@q~>9"/A,6J.ڻX}ѿr_d!-{52"񕆖# ]HwL9"1ᰱ{g\#Z֋,{kRU|=.u/.gfABEms/fدC/_գZ|X̓{nfkvL}9ͼb]΋4m@u;}MnOj3.h_d'2m~= d)[+ i]Ͽӵ؍DJP}`2o #`$Yeː+woy۱L3rk?:J÷e㩃AbřI _ m # %}3qzu{Y^L'Ke UflDHI%ŠtQW?j}c}q .pω箤Zy`a0QAY\sT ض1z*\kfu-`YR07Z3{S zYfAPJX7yTԵr+rR4GyQǥ3aw;9ZV&k~}< 7 hmFZuwxޘ(0:*b'i5T%z @PV՗ +rXWzR mTo`z,I^JqN.9<`ֺL3NhBCjtߙtOWhvγ-LPse'r$-:\+<9͠X0]ׅX}ŋSYvS 0Ny5[mpр&W18gj`P]Pz \h:3HxA6dw~Da5YPhO؁Oѷt"SGX#8ā!K ovKξ#6+9m K[C$/0iI4gwwRΤQܑ R05hQmwsY#B/oɮ{ |˪Ⱦ9)z0\]Ӯ67U0)I6@\UweݴK?45v;zЉ|cl'&c}}ȎF0*ԃ{d.AB XO!#S~gf]UM#c>|joTjE4<#ѳUJ՛Dk`"k96T5rfBֲ^ MD%yzF ‹G+!LQjMj22g-WUVsaMI1(>2rW~A؎6};(I @M)\X̪>֎bs:NKiB -z:>wTBiE WQn.(jnҀPuN**|gVt>נڳ5V1D?Iܿ_ %+u|~ZU 0S^'sd,`NBL[=50`5鰜 @jcx2y)QN<>Sq쩽gS | {MFUWmTZ|-qP"=FNISȥp%o7q?w_G0"yO樃eOT~ 3S,5`<'X?EǙƍRAX zpޯ?p_'azaВYU,_p#s9I #>sXsvضU1H"L&ֆ1+o:#h;~y=ELB`=<<$c0ErM~fB{Rf\\KqɌY,hn`k71qE5 +^ny=\#s/q?ŋNZ<œrS%3 {>vK{2/}RM.Z7iV`l'B\&Uj3f]T;7,=ePvI$:H877|:OqwNi>> W 9[w"ڞzG١yXRU߳I/{_n't_4-f%,9Ubct;UmsihK!fI}[QV`{bGtm&~ eJwD-`ɋ5;C>ؖ]A'19҆Bȁmj ;$r f;]_`Ăg-f,=l `L zKb \<&'Qq*Tv٤03,"fLbzgC Q[~0HtC~&H3#QUx A%ϘI!d&~re]FIYzStScrI5o抐`]͔^)_r`Wݏt}S!pU֧YTh0L$oHib&0 ||j3ms[ 4hJ8N^ 6!:NE U3o%&5$dl9i̔8${Q֐ f,AP*J6khغRNCe0…ןzo T4zȗ;~ M-f-D<q{,VA=};O GJ_3vT0G?զ&&XJE%TZ֨fH&|L q n C0ުvƆ?| EO}r6uq th/ f:3!S3ҒIh3BzEo ~=jw뺤ֽŦاɖ)TrNWۘFzlfg<8 +*)7ͱZoК5 eu#Ւ;JSIr٦rɸw Qm5rO!p/__f+?qn1'{vƣk9WmtZ4,>(d&l[N^ x?rrr $T\K0Yx* VR sn⸸{[:lfqW^3e۾:x؆)IE`h@~%.s3\H㕏'RάvĂIѤp$nlsy|w2 ;g}cA&H1z ٨@Apt^s[޺KU'G[nj%ګ %oR MBsTGzሽb1 yEYrSʋ3$Q}9z1QyxKνާ|'I-ѱߔXFL#1sȴ˨^\~S9}PV08ZzZHkP^կgyIqI BC:<.fKH"p[{: 립\UedS BĬ9Ji\ ⬗Vi=gɛ70{OrM9ClACN AٳΆgGBM͗XQlP1EVߕ -WYf|ÍH7Cn^6E)/aØA$ 2.>/n_48"Ś@ I#)u+Ճv* Q:֣-FNs  a0#H(Z@. H؂oM]FJ`djaf5C:}/uQXvw ]+%!eG49"¯-㺺S=L&-g2:~w+o-: [`Y2Gpi ?2Xщǐ+q)c6c ?ҳԚ)ǘqdwǀB @ oU/G9LD_l?G6J)W;?oC :3%hHl<` oB0zcDg1Cz.R ۭj7UٚoHGBM+ܥ!>d=[Y=EYFS~)֭ 5{* ֍` _?`unuQ AppPсs6|5zFvprz.LA _|Yq p6 :DQa$PJPAQ%SJ- MX*Th%] 4TV7?=!,{hy 2WND2$!2yRfMa9cs5t22ͷmj\.-I` yx!QZr{2zIko4zcX,46oh i0'D U^<ՅsՌaPbRkѶRqֲ@ O*Qeֱ k-zj!~{@BIKUQn5\H_om #@*7%yk分')nJ]UQ8h IDIGedؼ>{k($H :eusThπߖ"斦i\QqAy-ca-8Kz(K,WnB@zzZfLkJнZpjZ!{"@Mڸ(|DFS 6pt+aѕ(2:d¡߅*zK)ca n R88y?*[y}{63PadnEֱcJr$sKt!d<wݡ؄)3sR6c{4~ ODv)NP$s :O!LPvsxHikQ}hH*32.ʁϳ2U9?G_+ذKfyqr6 MeY(~iWR|I,>nKO`Em 9^M}EZrsR~hUp319i;Ռi@ܦTKF]:]]C `ʰ̃c4BlՁteGiw%3EnE#'_c8v#$pm=[Є'Jk/ GD`vMFU1<-2./ Ap1㧽RyӹA{FGfs 8 6*N([pT~s@fez&gǫ{(Q.MRG ˿-Y]3ʪnE>fӌ iV7e Hz<]Q׉eWϣģ;. ~\R5?֑aiM?F 2bLj{_\U-O_ Uߺ2؞ݳ] s>=oWIkYh*:]^.rپokWNe"e3όͪA*&FM}׫&|W󫉀?ʏnlQUm)Jql:S (gEBI TXfƔ*k[M={ h-6IRSPb n kS 4Fi;cpf=+.T.UV%^IScS8LMu&o}BND%1._B̦{Σ@GV릫H$M5d?U"AW\-~"ņ}='dzocos # YqnLf,ĸ!kfcd -#L`sPj-G=S jt͐86")`HJ#g``"bxLY%D ηdMѠ&-} f ?g~ՙ|s--+A@,?CـΗZ5x5ƻp#GsMs#h`pᑾjc'SqdTկl"۸Ғ\"v7ϭV.΄p^f7K.Ld)}K@ۈ;sv΂^%MUQ ^H " 6^Aj=bts,5 .ƾF͵Ĵ4,l=j!O U KVűp}mo7W(`&qUq=|F%o8؉xڵf#"}ń Ћ<^_ߊDT᝞%Kj}(o^u0NE%7sYyrnk'.Q|K㒬f3kEƯxKI2ꄠ!pp/l b0 D\80 &Crg߷'wNk/%Nԍ=x@f|!D!|{ uQ%QT,9vbB  i4Fـ 2"?K AxJr=Ԩ!R ц/ټ,: #̽}Kئ jӝ1!!~ͣNeYMw=?/F!j&ifU2 XSAaÈ&Y`g"|03禶F-l"rt B/?Ua,LYw ՂNC ɤVe@~M<8 ` jfD8<2.( MNYƶ3{\ɇ E*1@H~^0Y9:s] d@@%6"XF9u[bD4sj 5PyύEH蚣E8PMy,0y8wROO$Xww9I˳N"Hj .-S-wŶijqqc<,1.b;tDKzb *!331  2+v[ek4䱲a7g+?-gh:FBYja ʈ&b毳0khE,4!i5|'5ooPF4ag|W}5.(<Q;bu@ "7.G-ox3ʏb_8M7UJI!~Hi Ff/?O{Ęuo>#_W38I),{)]ռ3v4FBfH8 M!zRgGb~@MVi$kַ  Uk~|.&> x\>cD&c绷.RL %tb ;X}DzqW TVDB9@C]Ď3.%!'H%|j-8NhCX{T+2a\<B?B $-k@$fى%?z>>3eG GR#eZ]t !k[:!3U*F?_@bdq&( -IQЂ2LX6w]y;DyeH0 lͻpV{!-Kއse0F)r!$r< &2TÞYwI#m7k#B9ZDŋI4v`ˮ T1Iqt@c~i  1B({7Ab`R\V}ދfRJN^L[sa?GtV==߱lX)h@ b1 M!)~wEI AhH?%A%m F3ZҴ^=nWyh΁Vzmdx]yώYP@{C!;G #if\ܯҧki9N Z^^F.+6^ ! 
DNaPP6^M1v48%l5?,)./AwI:IYqW[_Cn̍ Ys2Q*S SxЃx,I+ڠCHG- .@`*@ Y701(J!1Oq"|s,o_-@?V);¡aঁԘ槑Hv^;ÏW@tp\H.Cn#P*("bJ?]UxIbOȳŭ05QI@p bǝ^bP^Sx8$Ʌ*&h#.V?]`i%8+SDS/f "sw-$$0?o &ЭlA_QɁD(r_yO25s)>hkglc:Z+aIE!yq C0H0VX_oݼƁ_6]cs;srjeղ_,DO'E?.Zsp #; G1(C1;>cfz͹E2?#-ϐ;EgzxU·Bz6\$`Wv?=apI _ÀB3Z>!* =$/s@bIqyVH]6Z;VS/] P(C]-ЖvD"οz?|'JL|G3Ba1#oXLH/,aP'ūϼ[\kjɝTPP_q3Y_9TC/%iua)yl˾} &5F "1 "D( szٓ F3IVG`,o*AJvr~o*UjBB }},ioiRR@ 0gZaG6j NbxAO6G"B&M# 5:ƲI2O{kJPWe3&\54oͷ Hd`m;ItniNMHd)`!ɵp &%'?gww:YCXV'[X"A{3#gep=ތq0%_||g|Ē>_AK: =A>;T 0|\ w*ju㣜~tU\#anPŠzk"NvmiNWY,5ֱ Hl/i+l@|iA x$hI ߰4w2gp!w?jvwc6"E}sQkm0yM՚A$H0.=_j`Y@` 79? bepP}'?wR;XN}F^.$z⨬!/5XzZaP4џ+o#L\dz z-EM%fXL;ya])0ry%1,p\N_7=ok(%O\ 8ʪl;#%Ɔ9 ޻Giru7zA$g,m^GIIac Tx|Ԏmt^8r46u)ro-zN2Z,}̕SQ;fi^,O'xJɖԎdM(bŘX'k-ȓ˧nSSKlmc@o!`*O#l}Y7 T@dZkϨeέkE=WX5ÍEZtq1 n8t`wtf &QM6iBSb L~rHl%h(3|_b\,΅wMO7sP鍾,qX&ժCʞ ~BF ,s9`gMs+l?}- N6:v&^}uVT?뾸mm=.@zp):]HuV;c]ʌG*V#F\+xƱ[$ w}m?N nҿ8v!Y ޷mr|?W8rh+m/IbVoSj8dy)p XiIOtHH"Ďep6ل[bB0-WrxsQءLs@7kU8|x;Idq2cڎ>N%p<5# ee/Cߢ(dank$\`:p2}mk&)^JaTGD&G+3 o.q*s/Rh7p5yenZc a>M-ۻ {Q#j>,,)9AK1 '|a6 8v,FGfa JfEbdj Tn$N2dj 墰e"('(s 4H8@\s'ff"}; ߠAzcU (KNc+)tt?Y; z3jz^YoH/3w%O"ѝݎ'kMBj;tԜT"7M?a5RͶ:Ajd}Z8.'cZ&`g#x_('cgMw9"(<˴sDܒNOI{FKUTp& 6QSzsż'=A LF mpq hq+w2qmBD`e8a '\S8i#¶AMw :U+GAx24/ʔv. m%n55v3F'.LtǕT+EtOoƁ/u6?vpo'a'xm O94Dtd>">>lue_m٘$ʀa+ΔE,$j5˴h~N䡛HsNhR#PK EP&=4ќ3mgJ/{%2);fܽGom 鑏vNd.A:Ch00`)rNG ]B2u> =GKndrHMx w/oZ5[ :d9 \\=l xmsa{X^øfа:n&-dτ$( T(m""H0لod)1nFD6G2H N!ā>LVVuCA6zy:i.Jf%a]4EGYK_ zqy,치 D31isWeUpş7NMU58e:+֞4P ,EߧF{!H' [xi|rU 9/9*i %ܠK3: ֦DB5K8 .̉ ){b A0$Z"H*2oW=+1+infNķP ySwDZBŸ &gy?`Oې򹁘 \(Q AF@cBe" q5'8 xc>9L?0jiMgpd1Ӳһ3) Bs7>9  8M֒XJ3HH}#:aW?|LR=GbV5)Cp !],?[}>@f|C'ޮ"U4$Btl- syq(.l~jk"!.V*E vYUK Un;̼V>S7; OJ]A+c)4cyB>],[4bf&ԻG!mpaH+$ȂŁ땭Y=Ժ9 #ү" -i(%E5N!8έW7[.jŘ.& z2fxIu['raλi%~b=] vů9?\sVP}^ĪcJjE*T1)1?}wwvCzpZ94$oM~o*ûysNH$OXںY: ` żGQDЦ h,R;W%LapweWIZ2쏄iݚkFȁ { ^! `YL"hȳ}Ph^($CB#[60;Fҙ3"y[ʆeKc\@mQCsն08,ur¶#94j/F1+&Mm\,{HKTyŐ>,p7ll=NQyۑ{ N>-5lOóEN!IFcluOY`4# T$W2_k1F;~"A=WBk(E{0ɧUb=!'1J$Ջ p+7s^;vٟ W]]˱g>%@OB\T^!U~n7 KVaJ,qq W|Y`7:iQ٦} l'#$޶EH+YNQaec>/c02-mv,SyP^uhW¡RVK^z]EqoN؞a!ӃRw}vuY0(}7S62}l+\梟u$gZdVN/b]<>3-' AQgwCOZFIȾuGuOE!WWDL: z~f$e6#s r#H+ aM@,S*:\B-V?tJ@OJb)GfؤLx +}\h{/+ۑ6F.,6[G 6ힿ╻$mgMEѿvf~_5] OKK5T1bP9VC5QqMІ5dGK$O!cުn O@Ltpb'DEG-B#tUȘT4$^P{jOZc&F}IM8?+9ZR:?~VjقA ߗ/;?]:hk /HX@c1LעKgA9 ӳ_p~1(xkaylz[?^Ĭ̪pB.LEyG56NS~j?~ z3aw0tҁgo'ŀs;-l Gm^>%뾷 YrRb2bxv.СRoZҽ7mJB`a&hˑ`};٣,ImΡ:^`S_M,v|Vi|)P0BQe |iߖQG=z6w Eٚm㤮&'E\dJfRtV'"NE&(uhOr^Jy%y)z(EWrc0ڗJeJk+3r=יs9 0y'zU u kL6r_dd~/+y\@?,zON=bT K,K *7Iؔtؚ,󧇛ǭK(ÞĂ1[x0[zF|5GaX|Pb(eXCx.3miYfZdw|#p0wHĦ_ @(cVKb<(ٚp #'mwC8ztOHwAlW8ſѦN9wgǞX9¼@ 'euXvs2t o$ş%I_׽#hc{y1kҥ⛖T3fțh"sʞ"\i%3 '"%Y<=78d+xG韁>Y)=})HC(Dyr*`|!Qpy mk 4|x~3?|TDFYMXq(ݼ39Nlevbu͠^.5H__qS&v-1oΧR"P"5"߯׺>+ ehvr+fh: {iP V KH@ T#Y+FRU0o݇d,@RBHy)grz'p]59ooK $=2 .(Bcz4kM@l0zc _`D#Q7#?CnEXy3`/B_=CkQWm?_y(@>ȓ0&n1i=SOR:^ 6L{mRyF^B7R$/wyI/2P 9uiUά1lCj?9gJ34Ֆ(Cri@4Giv іsQ6í[R0a\YEj̢ 8;>jEЗtpc(WՒ4 @`yNIo[0&#MC$@AI{sqxzPA/" Km2N:$l ;U4'r- #-=lN\9(ōD$Em;lo`gSKtxԢR|-w),YȟCd4`>cK/"M*oơ5WIţK3_g-(W~/Yl>*`y'3lFp;X2 }c* l*`cnA!xBߟtLwA- ' RLFKpθyn:T&e䶻M( 3B2`c6[] Z7BQui(o5b|a,^b҈9zoO9>Fr^&>lZQ# 1fHmjq` j>|>b6h>:W'3)5s%쀘oC=-x8hWȫI 2 _UKr~t캎K") {[yD7d;:Va[4Z4AƹT~R!ȕjW|^Z +jӢYGqf;f4XMe1@E}"OmۥZi[lљ{ÁDʇumԕM"nH(X?x>niKoMKHvPjuwgd.)BG<$/ǒR)l ~_?_S?NJ=U=eNYV@OC}l|̋m${Ƶ\隶HysrQz'2jRL#@+0rs+ f[}LKa6k$bYiH1.wR.  
p4K9c9bKK4jf919-$CE }@IlHLBG~_SͤăwKشY}s)cP Z,U"SJ2ЋG{kbo7tlQᗭ|ܒk[%%kBg(tna@;8\DB>N`I 1zYE +SI#Xެq |Nۛ4"\ήhbq@*Aa:(Ŏ5RRu,H@רNܘK|߈ڛ{xmٔNT$BZBa3I`E4"ray{H\I1" m Q_:H|8XU:Yv—yo3Fߗg􊁵?KBԁSP"#Si+&Hzf]$8@ `{%g4#޿X=k"{mB/ŁCx BѤA}l1jt,~bwc'=63~>p?ܡ /zh_ʎZ^l\7r̺ڭ`_<~sh}&zBH[3͹n? ݎL|H HF=ؓk*M1*j8ŝdw\[R,H!dQNIJ7BOurA)#oxf8:GngyH( 1ѭu :0l+UAA)Cx>\Cc X#I&|iaQ7"ИT oa!y*|5;[o\}'GG[h" "nc 3QP[f9U5DpfbX0a51Dp֮ymeUTQ3 Jnf=&mTY )S[`` ݱ#Hʮb<cK= qg!Cc#tӱMm6 ,&X ]h.ې1#I"EAzK 8h@H4;u5 @"Yha4@SU,R#AJ*@p R/``CD8!;yAdfT %t֖M*ė@Re3)t(R-D@~xC;@PoS5<X),I'.Ja)*gztBпЅm< Z0_'x{@S#_U  I#k&,1ŎIڄ P?t# " ;PP *M 넇`E,( h 8eEb\%2dF%֨wэ"26"S3Y0A%R5 (C9A D*H {$WAUN2pEfe63M. st{9ުåWz;wNh|7 }לXL`َ u?y y8zF},(k`F|$3!]ϩuJ]ff4 а˸uC2|+x1_CmcVrg஭X+Q+-?/}5M{E|5 Das50]BZ\`B 11D#gaX&v|$T22WdZc7̯ljU[Fd|h6T%!d. .~̵g+;E'?)J^|!!ғɛ 0%§X4_?pU^7o. pr $sƓL+:##}*KJͻ .goū堉 6{mT&֪SAⷞ5-Y꠶kHo3ϘnL! TB5ԮsXei&7)|pifkyә2M%#oƓ+ϢņқTYT'Gv'úwߴqßlwBsd<."͎Ɨ-7b:Jc΄d@\V>~q+!tڹai87"xFT v#KW)ҵ7D 0OY*ctGeG}btתk?q^~'iCP_YE%ʯ Hvlrc&hӿ|DTYWQSZb%Ob^ `0سgP2W-G~]= _[OYVX )J4nci!qxyU̎:| t*`Ңؚaaٿv [JQn_ УXxFh{H=@_ #Z. !VJK7SfC 0}h6mîeMc6ƾ+%NTOrhZ孻mف\[2]o pqaI?qڿYw ^S9)kan@ ˽VIvN.)U;Tuj{:B:KATH/+K:ֿƦxߚn71b`-; RNM W4 vvWW64@Uϗ |@9o `t~xDw/ OaiwN; ^z[t%v~2FPǹz%S@#hXY|3e;0oL_uFtS4ڲX\Cβ NX$[];h"\gB߳tv$줜"C]1yԑf?o+0Y s׫5񡂓 A-LiL)$rT"gV@ ?e2jE+`J ƄTѨ pg5Xz )hKTUt@-l+z0:H'y*~A<lQ:hwc>:"9$-`֫fkjméڏ#SB;LueiJDcU}$ 5v=weF=:#>oHE%#l5hp%KLzWfZX6vrm% ."ԯ͝.$}}Iv1#]#]?T.V>3·G5?5 ,|L7H%n٪RDECecO&9cT'uc'LM!)(@fVdݝ؇>HJyU#&,bӁ4p[lQ"iST35;]2yQҼqg@/1_[t;h\Ip[w 3)zF'>) W?Ѯ@/iJw1zo%H|o?iڥؿkv&-;E,fY;~شv>cjZ^zYTR\I_PU s 8"'$4beJkYݏ?џV=gq\r`-&N/FhL0'ɨle ~Qás=OOɵ N~Fާ&M4"<ʇu*ॲX?*6_D!$8D L21smqmRX' \qC>[ATs2̯`EYe"1Z1pjJ1mhjQ}xa 07:v.9H ip;΂x^+nsm9߷1L0s`B3?gJ듓=˩BcѾ*M<4/J M*hVyh9Mќ<"0BÉL1ʎ{꧙{; -cn9qHeTn Qe?Tϗg˞>OCƇWEJ!nÞ2u!Ѳ:)~w(R{omͨ:J}M:ˣ az^7Z29D|)"{RAd{~^?zݢv/ go2$@${/GBQ7B WMir̿pErN yzA#1nP2L_΋Ud&!W9|X{%hd@t0SVA4G(-KI*=ŅifI~s`< `h`N@1 =ޮPaϝj PceEY3i@[7l:`R[| e/ҽ"Q1gryî@7H^tIk՞XtʝuL<4QS@.Wv;h(M5s22'|rƼP~sxߤtȬPG؆4fG/^Ѐ]/hЖ52Z]H][a5Y'? KU>?E.h18m<<=QCP @jZH>8>) p$M}<#Z r _Fk5.8*FGcҏ$Ȥj>$tSIZG0%6BBwF(o@@/,տ e?e[2'_`M$y`_s<>9%o+3FrQ(Jw ֱ#=]B #0ȼ@߽hlH {!#d[;y'lBs-jIrL@>3# 4C7!|_#A`Ӵ=18nn%=IɎg KB[ |/bKٴ i$/iV}f-1 xtm@AO$Y *dgϤpnJ/g{?$}3BW$k*"kwgqSBR8e7طw|m wn$6mLe'p>\>,X埁id=v{)B$hE=fǛ_HBf$6|v~UR4!y ƀ^,Kn K4B$X 7cF@_H]~amy\c#iRRh mVQLOiA $^rD/$ A4k>3sh~#NK%EYP-S1#LXfpAЮ`5>w/߁K Ohb5ؑG`彀A AKUZB"O%bZֿvB$~O$\WдzՅ4$}kZh_!a .#شu_bNSʣ|z$x,M /BMQ1fJB?qKq$t ޔt%zjC= !B<}*w}6$}κ .JB.oI6 J<6Y&+_)x}K̃+ܰɠ_Td 7I ?̀ +[~RKѸHtF.=BGg6V .<~k_-I} wu9^hWHְ똎oZ[JA=LAi F'kW=kBF܋,:ъAWN2XgWwı5,nbEyh9*j %[kb.IUOx=(2 s1@2?$p}!A8ְZ:U'!|1(L0Tv&@G53@R0t]Pe=z`ĝuYw/b]]j7mjj{,9r,$dt\$i?>LđQ#ǎFOԎ7T{l 4#fٕ_K[c7ە|>]|`ySZ#`ZkQmJND3 Oy.5DqeK*1էYc.aMF#,#BڬYN-—mI+xR:1jXyHRI l2v]',Џ;/_ $jvZ^z#5,}NvTMoNrUQOSyU3 :~Yi*z|i U쬎ջlKXDZ/bcbosZ]i|6X^g'unDhbڮ%&a,1<+U4칇אqگjv,ϣ^}8N5 B]K׾8l7= mznF܅;ڛ>4tyqꟍpyD,` 2c@u\g󗘂R)У@ȭs T{ə-V5] V:,z9N[Ggs 9@VШ=(ubx4]цgMBYh:OP EBthd`tK8]n '%D5qK|1M%0)n>= b"-Z!_S93cM_ *jY`# ;J_tke|2:"F7ij[n(:}W#زp}u+`v4۠u~Z)*]H|M?[ً?[1u+"Z𰳀΄}_y6 @e!#n4&ԕd~ݠmTɶ6%LB96?z;ĒT'r\K W5T"ZN _?C'xH%DM8|LeuQ;) DP#y#Y>)ш6NZ.phv[d EY xz<ԅ[&F\M)X<7Ƒ3%W}\`%}0G&—?k`v3BhqH_qHv9i]0TqrI \`- e3FiZJ76yWu$e;iX8-ӱb(Q3չ@>цjpfQ?$so-[40ՕO?}{F,ҬJ;d~7M]n]DQZg"n&_@F:qͻ{'͏/xaXdveivjXHdMsR61 \Y =,ߞuړQ6%M(~}n_PK#w]L {='{]UQU(R>(<@zZǦ`o'/lӿ<dłݴnM*GJr:X )0ɤ. 
7mƜ8;tW WY^4 ݄qvS읈+/CU}?5|B)PD`E0`A1Qi:j5V9#+Ab1`e6ZHgRNHJdi!f1|MzD%R N۽K3%ZmFp^tHaQf$xceAYYys<:Oڷ:Qc5 *u%SH窶M"sQV: ` c!xb!!8 )<@"P!N^'T\D@i&$ͭNE;r6νŸ \Z E D`!`AA%n6ˎK/ΰ*Åbm64iSr6yא=z}]i59|oSsRk%]{EKK 8 +v$\8wQ%|Rxc&]Cg$pWsfJ ^Ȭ[:w5 ۆY$uiAJku>n 0o6RWF!?4ُm-׃O{SQ&EڳK毬`|;i- +n6:Voοȼ M* ;HG.oN˼Ut8TQvrR~]Ę@.sH P1W 4?lc!La sLG𳝎\?d6*5?U01*F3XV#,4A'>́̀lx_gu#`|/v%8*΢z:Ǒق0dq TdOxn ^[nC aS9_ 5D$g1 7=f{u$Xvu?Hy x>@{֕A =ρE0.1U`oؒBp=[~2?!B~`?.Db䌘mSoXCtt%Bi2qfA;]<C?(=˿v}i1} {hwBcn}6 \+==G on~v$0'<дYTKllϽ%AD#4]3sg\yeh2)Oʟ͠ѱ'ҥ,=mǣ<3o[^&vlfE dW`#u?}-)bcnɹ bb5 vTqax78Vx‡a#`M` gx&,'"QVrɕ-٠#_3@7g& #Lqb$5Wupc</#_iQqI.w$bavoszMwFsh;:zԵ#~cF?'}ȿe]~&#]?+67h!Ǣ@+y%c!ՕR0/n<5gn\}+>MYDEc+y,?Z%oZ`lFE9$#AzCi⢦.a*69W WzT7c?Yb<|sB/TKP0*QE&E.w N ' gD4g#qy<ɷ%>?g:4?8£[SG\\46򯆠[%bӞO'OM-?{Ad$/V(=;)r]QTF-LbOŁh(h xyʐT~oaiwn7Kbmq၌ҞDlMbDfe9b::c+?#/Z= `]ҿA¾|KGU*wJx+t7x`s~`#1;%is-%>z:B@k+ FbDzdq[UhXR$؆86)0D5n1+[ks"B3m2j;2O$يs̏CiA|CBf a3=ғ&ش¯*FZGaRJoP,#4ZAoR@jYFɛbx<ֆJSTG\]l!w_1}*)4'V [R"`tSgcLy- G?hYNddսr^ZjqYD*KKZ4cZN^H4]nt//M'K.qg/]uSgooIDOf{NF5Eg/YEΘɦljF̮icV)%-sv0rtk&(?Yd~ *&Z­mTU HeNVYY"Be"RVk4nSل"'Cw#NoŮkAPk-(#Ws#|č x؟m5f[FFWwwJmk̃Ū@0s@1A?3I G`+`^R}KJ}eI3)KBl"\+> 5*QNz=Y38ڠځёh‹k7<6`mdmV8d=Gc BoKX+ [}箭W Ȭ}Kwo7~{lNtzo"peKݚFKBxėJi޺iwB\]$-2Kj8=^yKАx^^q}@ؑ4wk$iIm=BZx3sqڠ1$Pw>/i%؄+ֲW`NQGu* bmO%T3wA%*ۿFG\L\2)]|MD|qL`7}$]6:ɣMՍoW5@{嗺_?[>딉pOMKN;Qe|{755=E*bzkMccy_ ؃0SdB&3[[ H!Cܳn¾> đ/hY/D .îM%忊]Oq%GA]%X@~H&|m!|]̄_Ք  |i2?k\hoө>Smo>Nkͤ _x@.v]kC{ .2c؅ȵJؑO~’WJ+4#`+?6;h$Ӌ ) _'ퟵ58?h ۥW| zر#V>}_@xew}Vkķ 1Zz]E[BKSBBPҡ< 1hJ8ޣ$J:3ړ-,FG徵 h!#h_/i!+XemOV hwqؑϱ :źLI}6߱-Kh}6ր.G1aL& ko[oi! a,]P#}H 1r$$u'@辤Imz!i 3xV#3w=$!Pi=3rrw':3wSʼ\Hi$f+A, t,>`LC^2;}oxqJ)!dkbӅ>S{+!ׂi,f,4+Ii`AfW4D!bCPӬ@/(?]-ohGБD 5B]dI.4Iu@y׻ZWʶ5[C3}lL 榏ٱ;q@K$ȁIz|kBGhpe -궍oK9I\3!x? M 0k 0k2՜B7n "`^#%[/**S=ˉXef'5I\\]6|_mT<7ReMJ>E+WQn9yZiQK3D BetB K;_ʑ!E1"Qz1O˃3ɷx}nkce'v*~l/Ǵi(ĔKK[F̱w\ma=16xhKˆ}T/AfPe>ν(Vd+dK!C0\_vY%q9F xoSeAg0]/;lPItԻfGVҙd7fmkN[c0!߹73gc4<\KxʳGAàhPڳRo ۆ批 Tڵܐn 9{͍iekKQD Hq #vor֓ʎ>;rdKpOګQ ڐll ->vL]oݽ<7:TvΰvXTBaTÍR(YIߌ\.$#,وvJ> o'e1C΃0iKϚ{$Ϋv֓7\hkD;mxK>UO7eJp7M$ vzZxR5Q) `}eaOfT%kk3,UQ2TB9GK@x/zM}Sta`i4B]u;6m yq%]aRGEzW ٻKt ~ Wa$ BIt̎;ZכG3N=Ypa. {g\p$="ag--MKA+`aDr^9_6X$1yF~ϡ eQ1_Wx{7TCH^89cD#$j$`W 6,Q1 .ZGXnm,|zVgIIE{nNr%a;;RخWۘIٝݢ_Ǡ_-0/})&zdcK{@}"*mb9ol,YدE$Ā udM9l_ ղsJ0@ 8;6 [ZZ] uR/ܣTrIxSNE!C<$ٌhܳ*JӢ "QOJ{Zs }²fc-3Es7? sVG;(`N<pNJl4JvL-jV.3_,c[%qaH6ɧ>86u 2]RDR<[ۄΞJA&F>~+as @r#(JJ8LFR9X~e' iGٟA{,χ&3.wpo$x^A}p{; t~jPd$p R$=ϣ"ɤC;RpJmNW6Ÿ06YU\KoM`iaR?pj3'z%D ;k;Lap9QH HgB19 CCUڕY5X$_pMnNLt?N=JtV- {V/lu =@J&'0i[OkTV- mB,(RDu οDTn?f_"3zW&q' /vui8maS:šUQ~ZmL{,vƀ۴WRqH| Xe,#kO(/ܐO ]=v94zc~t/buSAَ.@! :,d zK.,fɇ*h yp1zT6hV hЫ~F׫I )M |ؼP)Qv-@NǮ6uaf -錄vXD~2Gdwji*O%gU9T_V@*B6ґ[Ǯg4ô_-bƒ"ES|~" tc03GT!mHrkn2:S17w9d:OrI-*g'cg[*bp_ށjjE{w|H%Y*5';ǫt,!MsXܹgug[sw?iW{7> rA̦GN]ɑz":(Ǯ>c;(q9:1jXo |XX̟󂂇t3CXRٲ3J~ᖮ6 .f @ 4ep 0cN^.+µ8Xo1kC,}Ϗ[PRQkq.ߥ箛c/^Ŀ2V)Sӎ_=2p+[1;~?0N5,<[ L=ۏAUQgMcN烟WU,C2#?P4* ݡ:ѺIXg? 'VQf<750_kM_vSIK Uاnˉ ,$ޕ[™TJg\}I*a4Fg}pu(x+ŔNb[jkEt;0oS&{U8ܜnq/h(Ya2㒁qlX6 χSˌպg0aU#տnAx at, T\ɰJE ߣL_#i$| ku1mcV%?O| qh0SP0Kbu, 7.BcF/߾s~A8tJa<ҽ(CqGRG~bCU%jRw0;a1io#GOͷ)x~JՒ(ޜ1-6I$s[_T8>rFW86$sXu係?aȑl``{!tm!KV M$_(enX r BP&AbU 1H(ZJpL)ȥ(\! 
Jd#EDQRjE" (쒆NcJP*E2rB&JԹUL#k%+0s 8k23ʦ2`DC' (*rHY8浪59 Nf)fTd!I8 ڇ #KH ƀ^w?$tzy 4_>bqXO"[x$ 9#̝Kflkׇm3&ʯDմe"3PYNlfZ"r2Zm`eemDk 9YATѩIlcA͆+%a{ :xfIf8FYef} jDJWP8BHJ*"B (J*DjC,}/{;0үi {OЁaUXl/AJk4+v7|x!tM)?G= ܳí:Ha(u4RS@2yE$J8ӃI(┈hӱcS^JY$6W4nyG!S됼}.`0C0_JAy~hXE:W5?0;'V`/cGcЌ8Ǚ# @ {Ʋ s4=)\Z f]N9fLJ&UĠH)I 0.jM4I.5,B^+_~CT5lclM, XP]Mi(JSxY`.wqݠwb$[n%{}(BĄ$C^ DFTS/e_FkāYF̨gdk<Ԃ0=ie1$g\}i߯O^o0Yqɐ:a Y!(HSE$Y._ KVal|݇Q^kwBٽP!}1 ?Oh$mc8`zkB& 5eDRлdd & k@C~a?d!> !Hm.Cb%.i[a3s&BN)ą)X+>IP GFyRi{932,N2X0.yfa9bj,GB&ΑD$" ח)f l$l&SBl=T6B$ @kA 1@b88NU2P SQ@PAK2d4cMJ:0qGBbZ/x(J$HJ(̄KIB@!@TP$A 3)S%$D AT4J+HJ PR!BJwB-#MH 1(P @"!BHOk(Z@(h"T(Q)@Jh JEi)(Z*  T& B+Pʫ "_^o{'C-6݂/wra^ЏhWzyhF"e$ɴFhwv@h k5lSvB"ﱣ8[ :a&cȊmVfREg p$^,t4:;84`)!ϲ&arD{GCr} _ _~RT ӛIS y&drD!X(ǩ}9r ߭|Ԃd=TlY dw$p'F sB['s:X#XV%O?[6 A`5ӿo7wMme5&Iq($h&f~ ? L:$!YB >{2tDrW6̉-aPRQCQ9D4YcQSPDU[YLQUEYW̢j*( NESU5}_@`ȟujT41q[9ivz6}#ʔ`de āWJvF"kA`@`cbh/}[Sa:BVQlveh"L]2ML(}gǚB bX]쐹 ]}/Z!`͜ez,+][4G5˞w_0m1h`wƴ;ݏ+WsOO{-3) LZ.2oHI[.8V3CI± iel)^ăMЂB&HJvrNP> ܄Ǔ2)'98MoOa8<"hXbe{u{0f(@̧LB ,0*,e/K@% o׭n`Ē$Q320kc.gۧ5{:h%FYt_z  eE/;8˔ <)gtLwB׻Mf:efH԰lMm#MYYCb*&%Fh%B,uOFR%,o~ء Ϙ'i (ܺ]M<:\IƆĢO*GŶRLYC3Lh3(1ߜ%R$^na6{c :sΆyp{LmnidhFHIKCm;} Hhkz sw__aF5EП]Tb$hEJ+S8R05BgY;e0)lҌ( cU;1褲vX(b`qXV6W/&B 'O1(.I, r݄gcܠ DC1Lc@uPuuFth14LRW (bs"槵W+- e Xap͸q b}=oߘ,B- 8Q3lxS^ e,vI&9uDzrt=:ҠksЛ[#P ih1p7ҦcNy.a 6x2Ʊ/M b$ks6df>(cn+/Z医 5P!PJI(" N&V|v`l 6mǽ&ק4 :ezc' YbxYγO%1ovک쨩ӫiUd1݊jw8s4]eoƵc q e𛏰WHylU$S8BN#yL(32SQ|gl&hAX,> ҩ%e7^ށäR@eV%=tCS0-S͐%Φr~}XS PU/T.yBX1RMd˧PFзE ³ lC0W˧qyxt: 0@јbx8R 0cptPb+$0dj\adR4o;H Iˆ*ЦBr$X e;$)W &Rf?t1$eI!Ms\ }6~\]0d&%IJND ^E4LPI@h 0Ȭ7ɠ2`FQѩ@)P2RJ "DTȡJCP !Hd9S3-牾 (GO^t%C@bP.%b;hQzc}jpI5.&2[C! a MJԦYjiPZh C$ԡqUo!BowBm zi##1NЍҁ{0YaX=kc>1 pK_\1CgϪׄ _/dd`v?+?r/ⳋTk%H؇M`U.%u?Oqh,)@|k7^a#đ UrQ׊0BO{&sm1ۀ.0 Ss|yO7Q$lcU9oyZBJUhtb0![e}&pHYWgQ.Ki'dKkgaeј;LRwfambj`$;M /jiTO{">D$Gvw֒K$@}Jwp+4#X4w?T@VF-4)b5,  b&^|4r\*|%<8K]!鰌%Xhs ZY0nr0jaljW۽OU05+[/אX^0((aׂ_Ȅs8dŖ҆Cm%[Ak$t+= /Riok#j?Buiƒ!;o,`+luj9'Uɍ)|y>HfTbb=#K΀?KM#I"WzsԍO'64E!2~6lc1΋ V/閖:ڏ5l֌t6ʊCb1 >]e7*y<9eE3چ@A 0h#]P͔_K8o߆}O_l6ט enUi9/JH#7Gzi*Ip]Iپxd 29BRUc i<Իd4*O찠Pg:ak-•ᄬ &^eWL̑⑌1 ng R(-OAdr]-b4^݄`+S8m۷݋Dq$Wc w2a %mC&bb1BaVybkbb.+jT/ 7 &0g"#qwwˍvM(a7 | &e]*Z*e ѝDDg_q^}oarBТwv ;X)̶eMO۹+6d78d9˪%ۡ6<ؒj*w!j P!"`d\_Ef0?}HFvڋ98߲Jley5 Gtbn7@&_JԲ@G!xS~4*x]&f F9YJX&*HBHd3]UR&(Hh&6!#Kg\ۂfu~ (e$QO>=Xn|36UBsW;z- T9BBzF5e .pEd-N/%qA@͡_H(wFa+0&,p;?Hr)142\م1$8%S7344\А핲w;ZfvW+K%VkZ 08[CI w[P|6F򀧘jq {WΡKj@mey9%ݓ! 
|S@=$r92vQ;:3giJgnkŏ4@vxpYy[ojKae'Y*}%2M} s7Syqr{XciP%hpLK7'ED6COcf ݵƒKǗ_iàmE0=yS,{(wm6N/$yVUVGW}%1a[1?)or[@·$nA$IF l:(nϟ\AX`;55 MK]tlkcwLD1IUaPaSv Q#UU)YvڎɞLGCRV!#F0s;Ẅ́7S?[z{/-l>E2yR,??w^ٜz(mngtnw{.ڭE$G?=?$וh la{+[uw_m2d6&r/5?m|e>OG[p0l6lúi4;{OMKa0&ljN(uv,](g +=_]hLpwt'=kjO;GuR6LxXsx 5xw^۱=T=y-@?̆ h(j;zHAԐEQ%(jJAQma hNED_[1W?Rl1yMAm;.WKs%zZC@IWmhFxCGΓ;UW g |&r*X$ur"$SL@x//GґoxU;J eם[m 0|ĥJij*"(/AWH"EJbP~?4;cTZϴ5$R$ <;LC;(䕆wSqʊׅB_ OO?X~Y@#Rs|#dDaQ6P1WNXk=%hV+ny.5x \ϺF%CIC FNǜpI\{W]-R5f @|":sOzNmKА2"NMZ t,eV)qfqM P,?MvXᓹW&T [IKbZQzg_2WQ?Vm&%vZ#hͮcOwqMdT1SP(ȯƣ#molaEQQPF͜q͎~uxLmhG)Йl>ȪSfs]ɇyD!Vo4#kN Zi;#\5{WI+䖝.y#iu }n4:[T|%ͣNv8[t1` @" ӵ$7N=WqĺۄRa O˺I72qkUn>wYUhR1W7=nSH|2]if 1bjbbaߞ?bNuF]&1-} >!LPJCfz,BB4W 5Zu6jns+woR9!(] 2+_'-'y'k."n /M8 y6Q-fUs HBZY#mr%dMN^$Nd99N`O'ʰvy=3 3:wc^G6w.2ՉsF-]>snXy鉳гm$XNB<~W f:EM,`lk(L\xҫkOw%GYrZ FaS;_ݽCZyO,3L7 tpM |pɶ`$7lF66L jD[ t7^}Æav}sZ5Sۃ큢HH V.ܞ/FJ 8=?m-=r=^7xTр ;1'ϰP/AqA̫6\K_ $^tMO#o G`xjKby_[4h Kδ[No3-eJ4I\uu ŋ1ڟd+V.{fkҦ͞88k)L!.^ʢhaGb6m(DzFѧΕtUoۨIFg٨b،ȍ"Q9{z@fJԱgvinV DktԐCouZ)*Fg{tQ>MR^5vsq%$* @s ZE?o _8&9TIt҄Hk_^m()Y=Fs-ts$F&}"DwNAߍ5fPۮ=ɢ2,/rM-c~-NUH[۽[jS*~miNVMJTܳKiȗQ ; :qGY{ϣnfsyvEP[y}y RdM{Ŭ-*OqÉlF'fὊNϋ~IMT{5wbgCwyk ŷDe{a1-t+` B-ƀe|鈣.aU#Z,v!|bpF6g)cMD[,ѷi tGfyeXH0$ouQڐ(beⷆfK")KdW~Z{MaM9y"Fjd"0Xэ7l]4Z>-e)MKk6&gʫBT: uL{O[5m8L1SE2&7T܊[38)`ީ}񲫹܍J1PG?r VU){8SNѻ3uWXzWpDj*کC؈ByGj|N1 ʴq9_R  ?K%h hiyK`j&jk}eks?CFy3M86Ծ>-Genp/yWsC6iw?N儚hʗr>qhԕ6;=Dۇ},%I;ۡw{u3\^bJEOYzJI/@ӼsEW0Isi슃ren4풚=ܰVt5`;jEbӵ_7ڷN ɕFLgsC&k+n|m]R*/3#M4)Nŏ*NQ hTi0MwKY/|k3nR 4JnO PI m H'iΝaMD"G׫^~ x}C~0$|i)S+rA~F]%Z >P[F+f_u+NqWw? ۓńS{ѡd 2 bjm+UkZTL!FhtK7#q꥕wOw\=FYj~mk}jl d 2$$B5"B0g%PVW(:%FMz.dFIͱĦ#aгH$MX,gu-l0ZawEzVU糥YkJ[FFtj8YUU]: %0$@0f~w?~gsn'$}o.L1HA J6rjgqi 0i`'1z6bS[OBfԵ7Qˎ!ōǪkcǤpp$ȉ"FiZOUΔtwW9 F5S{nd٬q8ݑj\I]%³xĉrUSkG'$W m>%9CdsY[J6F@I@' `:^3iyJs"۾O())Jv O`!VKCL?i%odF}#,چҸ.NtZ(u/_ .% ^ى]2 =$Ēm3P1w03LklO?5E [,l=bC!r *II!"(" (tj’YK| K`I doS=/Pap,NPK9?G5jvޝ=6  oUn̄]Q-!ݐߺi|!-&4k44.Y#}5'A5kzi7lH[?b]uY=a}̏ծA=+Bwֶ357DJT\U+sH ZMc$e0l)@P>$%{6 C ;\ʨ4 0p[?ZŐuB~%Qۢ%^?JZ#FO@S]jIWS?#08Do+3ҌD\s_\hNDkgFؒӱj{0|vbI~k<_?Uq58Hbp/$$DQЀM(HC!e\}lg&: %#_׽(YlVu`(d ;,-}ʳ w6yX1hF(܆ ԕspYOYV@뷙pfn,/bw bqV>n/)D8]hA \&uJ{x[hLZ~ j?/2V儅#3i[_,@+m[,d>HM,q+xgHt*{,|}QPGNvdٚ1e.>O` GQSnm]&'hs::mjfVVܳ](8%hAG Q {At9w_atyTlV귕Kð"#;:ɈB}nLc ~>ƹ{Mq{2GGu,*Vij½3p"vKY5flH:t"l2J}|)|iP;y>qW-SO.io(ZׯXemT}?O* n`Hz]u-IIeu"AVܱ g[ΪC?8Mħ:^0Hjx01jnVhm6?GC~VD܅o~jA9͝eUB'…ܒ)ZFt@v4| >B y(:M bk 2.\ ؀^' Ɵg au}$13ߣ($q.tԀG@{Gψ?Bcٳgw'I)!V>N0 b۴إJ]v" PH @yidfwh w >,hp·<{vl @}ˌ14S_6dp/@޽H=e-swE{pmx?OS;=)(1 k~jVj@h  QMk:$I|>ޤ}qSiQlH \_KIhv$lcey6Ū<$nȂ6l3KvO3;0`C(\/2' *6J& tgr/Dyqܮ C7t; G11yrjoLx# Ŀdž0H=*NڑCNT /<ݯ"w nSC`A/.߼=-s`N 'uV lMG~޴_qm5hB ]9l'饳Z3ܫq-}輱9o醲i= AX}~ OJ݄6!Nch a>lj|Jf~6^fb' ɾvhpna!n)mq-uC_M%8-mYY~t[.e&6VUeG=%cƐǨR*d=ˬCo=科s/.>ɷ%i9*^9Av N! 
` l~v<?L>:߳ߵO4@w [T53 {]ӆ0X <ֶk 1YDxdw]Nbr>ƥN۳ϗVg G*%iF|y>F?|6[Mqw eqVp5|[u_bT:VDO Z̻4Cj+WLtW{ p 9B|Bd;UE@{kYﶟ09S:iݵI C\ |_g"LO`ףo \w\c`}?S<%6Q xth ,B^w_S/hHώS䞲eIٖ{4R<B6C;,ʓ63iGcSjXkewԝ TPt>%ȱjA#?!(f'AO&rlv &KZĨ&@pbg]XeNe|û&#tjf.uA/ .6Cc=Q4:kUOEuY#>'4š<^N؂11h[3m|T^81lG7&ZNӾ֦z?1%;6P5: cCDU43PGs)=<%~^CڿV$\:IMPI p,tA|leC,%j^6%h:r|Q.B.{Gq}ԻFlx<R,%YƲn#uNmp)-Om6 uX(1ն/CIaJJ] ]giľ_J^~t W=*kcpΪgLWcs#nf7dx㠤 O<,u?1mg(g;25U&me:z,YRVs۬uu o1gnuoFZj%LVRnHڷX#},cIQcHaͫewReIET\lk{q=p |ފP(@1oz0 : r6l9a0ѽ$2a ,N  c^$ྜྷVj- !Cj@DhRk8p%h f'G&Tg)F5ւ'ȱ+&c5<ե.ǧJ6]gﮉ-7֖fP*gisjkmғuFP?%a(C˥v G e>7jӒKfA^U˃o=&b32:*#Eי+QZfl $ {}fwǁ7}v]k_آ# i:ަh,}y[AG$4iSϡ̝zVՈwo |L<ZZџ_ߨz[ّm|x/ope$m7М@M>f~gO'.\4^9ɩLC61Z:U4_ Abՙ3nGkr䦍}/N;v2.IVW|acASӼqt&V&LO`0 m]]Ͼxg)!#kձ4H@B 2r #EAjL*3,j&&( 0)n}x+- $)I 4mV1*bm)D ?#ݿ3Y\QY%bb`~3qRŘ63t-ofOKXӳ2@fGJiAJC(0\ 2s ,\'@PBQ*)ġ@D+, MHRAEBI m"HUxA؀i4B>י3Zֵk\ŧ"Gb^wm hi4#e!`H9 ".ROyH{=kXrIwƀEuRh>o`ib{#e>W5S k E i]~jnMq*o@rA! G$?zXtŭkᤋŪh?u]|vi922uoʀ+"=leMBb%(mt=lB>H97m aZB FcD l_ %!Y5O 0}_jш +%kBC=Nw-bp/bqyK /aRYۼ{2 S`%i 9VhH58a7/Fh\Y5Jp uoGiK&$K!mi y\+u}ė0$t[hB8] ҵIU{zh[7x[_L-[P'so.;[8=[(~s>{)9 QQ.ZRe16l3{ߠTURWli]"+I&P*ByY-F'SY{]Ĩv(DMgZ]})G ;ۏ=WkBP:"U/=3 V'EssvΞF(081ն5wꦠfEٳmO3㛓IKm>)lxfnC (yA3ҦF#,/rsgIJu`6T8DKyx7Q5T%mQ)af|,G%go;{ Vx=, %\ w^ xO1F}$ч1B?&w87+R맠ƊTO2B8ߧh_LKM/vSq/>819<+ _,+eL@06\[ w88!ftJ)OTzC^JtԹSy( r%\;Q.lAf}2tQ`:$q{K=r kRR`l[P;+~g1Lj^T"YJF- 5T ֦ icqy9y%&-40T~ytMryIr_xmcwvWvr&P<<9eU/q.jJD貾y Slvw \n ÃZF'/"N%yi.^FmLd87DhOӧnȭZ ]688FhkGWwy,` w=:@T ;4CR$3&ͱ0%uh0{:ܜĶ$?dk#ܑu/˝'|.84є]%5&|jhI!d$}l,ز(.t*84V~-vwMRWʟ|EQc,;kNdX.THK™a&~C.S#/w>hq]LR e>yLJI8 !+Eqh!#c0̿Aq F_@ybf-n[S6ʐז~qq*X\r7}7O3֬r{8uO *mT:̦SeM.Z;7+iE)R\sZm- LF3 QN>3E-/[f{MA]LcG8cPjWuL#j}/mzr]՘]2z=\x1B> RW"OIٸ];=R,l׭vw<Ư>$;€`_=>&$2m}/$ݣ"MEcmCaH_TOZZϨW^.W/;Z Mk2zSGP4:A JυLsҔh妰^s{rK,-rE۳۞CA=OH#~~c]F63CJy|?-#m_ODUI_İq= =JPB1d)Cq3"",wz1&bsfׁOa./Obs$vD L)5Gĕ2ROY!>y(n,~sߺ4t2=j.`OFQ|9vp)@_qQ/& u} (z|?5=WhWkpkd)›`V|3=Eg}XOR>OTkLO= ^mT7.#w}l?>$.7Ο_y7|l, @23-'&MF<Èt?lо7WA̭:cxm5LΪS=8^V5r6r-됄ғ {C G{s?Z6q/eɴyUdD٪ o5у|q~ ]/YSeiRfPs4|;I^r}GĢѥ}&WuVu(?X;%juFW` Ds6cu%v0Al-l٧\>kӳ<1{ucP8hVn-^b^vajy7SY7yl)yVFwsr/O\) E^yLNe zwߣ.1ǏNDfiUx/<6l_:tezZ}Sb{x 6u 7SA {*y> 6?OGcVawb}]k@~o/Ȗ>G~vH!(7~0H9J?Pw }gl XpC`%J0I/6|'-.}{L=ϼ=Yb4s5݊;-V&ҟ"ϻuvRݧ8yQ#q훲v:F_ ?PbN|>?蛏;FwT wz>3zsabzky4t_ P]+ѧl4׊ߗ'q}>sG>9Ռsa:xLU:s_x?O P{)$&E$x@ _r~g °:xj*«I`AR`Y\S9yyJ bP^uY|(Y×45T*"Vv^dtٟbb3 uAY3 dQb6S?kTj,;ޔi!V.ڿ'Ӹl@AioWC%Y6BovQn6n"ߞhQv57M6,DMGx'KiKL.) 0inmZeiʼ(}뎀boX3P+ZE>,J)QQذFA)9MLԘh~ڑWBRB QIcƏfbDH:<ܹC7犘-e %PD`E/Ѿȋn̲;w`*1b_s Ȟ`"2(9kgE,w &Uުt%ALjB&HEo|隷xne^:pOVhV̼YO^!ܲ-GuEںwI /aE"s&-a%E'Er *[\`t 6\4U?I!ŅKd}dmpa*E jFn<:/C,y~8.|\t=.W ދ$7 ng9L Ẹ̑i)+Nm̘=w ljn/$LL}mTޢePU/B ~Y+f/)BܚXFÍQ#e`E TcDNf0j~%z2w;P96Lxiڿ½UN/]4-"#&~ܶņX]bߙtfkHt& r"Pq:G%i[w-qM#mclz2^op3D$@DF$oZt翾q)bȾnҰ3X}pƼDDiJݮBҵ؉| 1reDv6 kPiaZ &&ϔ n. l`MWtJ%+7tE.ַz:/~-hK{aW1F|A,֥߅ub3FF_x? bKuho5'̈< 8.EzZ`l8V)JMhb7o1e®°i:.:Ŝp%߶(.HMl2KK֡C=^a?P;{'O3>%3_w`eH8ϲ!  pF0t-M1}J'GC{#8"Ȗ(4ӿ7>sjs_LjE{5i) GDQݳc^FAZ eI DlJUs^pup=?r{I=^`⒨y8BK[Ā-pA}Rl܀\S *yR?_D< {/KkUs:_~W_yC˘,:Ǟ1p65=$'{Q\D1-4-9W_h@{/S1I]IνK.tGPD{rGg$ {+# u53q$db`21q5|+Pl1vq<#]Z"\_UzOyVw'S/2vipw{Ͷt6~K:؏+ `ŚҸˡ(qZDZ5N/e$NMD)H-*?ߥKS5/uW30c>=},e$k&YnIY! 
^F*&kb߾c-7Q% ZvD<$ZZ Eak@imH`tdj4*dN* 2JHw-%$hXX^ )_;h<%,,v(K5?xP'2o <(Mk(%NH핤n+iThb2_qs䤬*|`A=+eIR=Rg1U]}@k-sD.Q١Ȃ79a PcF [W"ݗ@;/j/P5t3VLY}zkyLN VAM\MIqTΙ"X%H9|XLy|=wy5fO=P"O:^΄gPqG^Vdu @ɑ.2GzG4V=D"0oc e)?|DڔT|#[L^oJpqHjr`2ڐ|%aҿdoK!v UdIמt|h_]ґGUF6UN{x'a(`0a{Њ'E-"@_J^=66zq 'ǤҘdyIjg8 8>ɣ FlʊpD6&Z eQXQ]`҆#Z"/F։0lϨm!awb^=Mnʶ[aX7ӑE*ٙ 戢S(y|o67M'~l{` dbuʽ wPz:01P@Dc03 UeK"TجdZ'SUة˶l:[clhh6|; ݵA+L&YM6li0 .ۤ3iV1r4W];Y M3w_w>b/{rts^ҁ$,#5OIvlM.o$|ߡZt˒:&ٙOŵeO{Ο݂30?GEAy/a򇪝.]Բx=6{6Nn%d<)f'R tbU0>ckʼ.eNN7c^,M]^Ҥs:ӘؾRi,7$_W>/$n V`U杦$F~];Al®@ #4PxH,3U3eB'  Oc@{_\?GTfD#h*Ζϥ]B,~gh$Ӵrm/KL4$wI >wMnaD {(#GlZ'.=( y}Ē4f~u+u/d׈4|Fw0{;>,8 MP#"*tkš9 Բ17d DfcV9/WsY{'X~J!xA QkvT d!.Mzݧ6`\ix"6 *dVΚ>`rru)֘Np v$5Evu\i%_H.g8xQlIthv:^}@v,FvRZBR״Ti5_,cMXi  /J~o({PߜܳD+.#]1$hq(JV I) J9"Ha1|n=/lÀC I,'ϮmY@h Oȯ _-$Lx$ehϑ& Ώ ϴ}iS1jֱ`M SAof<;i~CW~==6y[oӒi䗽kБHD"@a.:#wkٟ! G߾ wc~fd b(`vF~){6E0nגc-G5l4i/jku,\aw꽝~#!~׾O@:>^7si*#E׀WM.jqj*v[1[){oq#!Rzj*;fe*s>BnCPSdDl,?RwѿonT3-}<]D/էeJiWCСHy[db}9Q&\Šuۍf~819=V sɆSW7WuU) 6L,:xf|* c(NӔ K8r /M8_*Ӭs*JBπ@þ{5H?&ǣ񳉪s yCZrw:Z[^k'׽5~Y2^|gyƨELykpQ2TiU5uVن⪍VKth(i/}?A=74odlͩe,qCEHK{Vk_3b~m|VYP~_=PͱԁN'̺MfMA K{gkeh+\z[T-P'U ޶y$w㢇lS14T EST(>TEW\Fid+UTOdvTXm|NrPYJU}Oi~ȧxvf6gmo橛uk97ˋny֢bJ^"R] b?Dl-hN'FU4 iXY8b%AsY\89:eͿަF[ ^]աdJMR0_^RI/km/EO,*'ϒdRǯϮ}c'oڇ*N颧= z=[ej ;F[Შ*RA*%|_*'"5rFmYRA^]uFBWTˣ!%J6#vyD'cZ]3=۝ۍeXFotɡ~۸-'amJV:gR ~mky[xRʧ9CgŌF J.kl0B8e+ŗ+k 8.z"/ `A8wsO=RڅXND\ G 5$,:7gZ _YiHK}<>+B!'H2ievqln<@ ba;ϛB0!͋ JS[cU#E3ѳ:kP^ibl γe'g(E51B 0ι,Y<36}^E:urqt't$I$oR;0?INi" @v'o"9evO LA#UEM\_{~ VDh0#mv9_/\TA(Ʌνi?T+p1팘Xָ i,uAɷ?Ú?(f#|UJ8͠Pzpp!ZT.WL 17_1UQ1-0l{a*A߉ A$ӆ̡ * ,#J'G}85I$>FXex_73 /QaVF3_K2(-`4e8VI!xY,ۗ2dmO۵t@g}uS$9@鑝8E>d,q-z߃gNm괌Yk>eQURNo_xq@$heYա*~SlqASBЈ}/iH"^MV)IԲSj O$CIԑV6ut˪$)1X*Q]QHAɲBY}2y)|2zzָjzFc+ jVn79ʹNLtzqxYؖP4<%_r͑2XYOfWwjiZjM3Udls;_j1n0'֣|ڟ.u_wZly-mhF<n2QVҷG>N4D .FB\i5hMZ{W&N_^FƷ!Eas/b?Ȅ%<3H>BXHj׽zo#Git-#}\%1ŻO+| %bYLZ\1ĸ%) 3;~}҅Vﵬ%ϛ$/BaaGɀ? 4e0-&4mYt GW_U!u?C0Yфj]"[g}\$z餅x\vחlz^fGߨ鋗cɡ:Юس? C4;t˳OkU2 EXEWju8GJ9l QG -i}t0` a4P&&g?*i)'ҭZ#n=$SIw am:i"BAEcl!fXA"gQWHp2nhŧ'!)idPIRC`~ 㰺3ߕ ?"N*F~cU i"vl1Yv@FPS(bi'bMcbq^G # ]0Id]młCe6ҡsoyz{HFٯγ=^% $l2sLWm~xy+v@ЖI/[St9H?i&/v9W8VH\*wɒ^qkш 7-] B9,F 1{U4bACȗL=9gS7;obu]%<Ƨ_=Bw\Fs z޵&V6ߐm=Xgdxd|fs:g>nt_v;\;:}13]#i=Ag_@ rYͧgX[ﲺ̱lC4,:-/e/yyqʦUj?J~IIXƞc=ݎwWB߸4fdy8r:LO5:צ+fwjwhCZ:G@$T6rMBpʆx}WN{O1ey R_+fQ*s z[::}~:6N).6iHSNͅ'n*;2X?OT@Y9`}pȽIz0LAtNOpQh_bU߇}gwK/?!U5x_"B6Y)/jNʗzXk u偝Sҽ~#sI17 } ߢ1R^t| ?Fgo;C6 hB^amB|et]_gޗJ_QUYUS*O{[N2X3#g(CXꥴ' s7 nˇ1f-Uw}{+NԵ02nP s"e &Ihdjdsl^ӥDH򬅭&g_UVDuk>Tu ct:g$ % i4چ tm%bYˣn:pFgY 87#,dP7iBt>!NZ]YR(GP kr[3Vp}/_#Zޣ5f|G[>!>-jѠ֨}?PjsfnZ݄2\QT!q`񉳚{Duh}C9O>cɡKz͌9YG\1PCj O`@YYI0푍*pgϛqnb3TkƓi{RH,5a5c@PX=`VuHGGa +R[^7pb&0֭ `TMY:yJM(}آxs.+S2 RMy]=32Q1u^A>6$s=cG 6˳DI:yٺiI 2|_0WVF+XhUbܖI?R'~g}:6}9(h˙zk?8z_HPɤ'ZkYa/Yv? = O^9:/R\J+ѢH@Hy"0"1*J 55:Me \T#36Q1 '&]]bySrΆ )RfYXjXX4_+!iLxoKl9|m,V !3τ 7OHu+,_4x+Y-H} 2<[× `l Me@ݜ?NWx)5)jjgjT7=;7R1#F25U{n^Qb1Fs a9IDI-ZhBRlsXzvAՒ'$2?V5# #r#n_՛ғlY;&H$rڄIh"cqZMQ"@mmWw;e/Rƛa.?&RV N"+߽+M]";ge;qԴLu VU?֟VX E$@J)uX^*?,ei9, 'YΥ}ܞcwvsqIpOlƶsڷmA$꽛({kkmެ@UUnm"cwpf F tFHlŒs7), P5cNdci)ʰG4v+`IxydN?̕X׆R$YcYT/}+/zY[ǧ' >uh ()$R:/H' ƳUusϖϵz>f] h y͔D+ _4W~f,3wy»AtAj5z46yv bGL@q$ă,$kmt:>vl?qV|6+!(n:-|RK|b J`DJLr|i LI ea! 
8FW'R Js} s-EYa 2zADs44QZ ]m>( +tGqkKJ{g$լˏ0D¶G1ngyq1y$:ϛmt=86.HK.֋pϾswOMϙ!#"S8P:tev1.hT@/Jgk'A$+èq ;˯ʍeZ63'BA=Zqemw^W]`=m'`HTV1Ir<-U,vҽs_N8/θ2R !h==L YٙDE6wMƒ_6OWL6LذiE'f:F:ywBhR!s)y,˜ y؟>Q~6|ąHBnj]22Hr >O7m׿MG6YY\eOTK s%-x.Fwsp}P-@nA&A ;~f#%GŌ M6$6=dˉ08eSC * o 9#Qu3$XPXሆLhETkv]mHe$S+28UK.bI0gV:<)4reCL,- c%~8_ u췃Au- ҲenJuz9 x$4, 4R /1"{Aty횆%Gڟ@+;Z~ ?#gZH{CJ_Zcdt4Po=-Abw&w.)E;GbBq)^I< !5;Vc.vT\xmm/m!m`MDbJQ%e't)VaM[W/zIm>Kn)h]Qn5W$=;&nq`C/CccHw )g$+?15͠}$?;5Y[8xAtzB"Խ2 },(쥄ǾLW` Nڽ8@1#8 0U,w͝n&/[oaT/q"N}Trvp}kSr{WrI]&19cTkW5"O꒯+|&I=m@1]5`Y:!”@M!8r,YЯ،Ռ#:Jꖒ# 򂙈݄`wyȀ/r\3E'Qw lbG1Bck ?\H)S};dgՃ$LJmo?F.Ui*EM}>gʹq\R @` 㵚 EUUe/p{&ϭ Q\4Q -X՟>gi8+Wh.w:o:[pWDQӯG^Q@ě]9ZttZz)+(Vqv3.i.:=Q /=Iuf&&2Kkc5Kn•uQ՛mͣz%s|驝!ۃNOu{D+vg:*1['|Sǣ{uTrJ< k}3~ŜFM ҊHɯ #`qk M;"I/BE9fL4/.Ԇ4Ȕ97ФNH":ZZagڢO=5Q4b}z)625Җ4߬oaJ*ެlZv.]T3TU('e7/~ RaU*:/&ą[V1. ?Ok.&np:1@pH@7&B6n2T祝aB`"y =ߤ f4F{CWGv֩ө4pIW]GU(v-A4ޮ)o9Jp iR8i*a@tB'#*p4:ryu_^:cWM8#;F&厭quPW[aU,mŖds+ vmv :*݇k'QB"Bf%QHUf}ʼH-R-6xfY"f7*FFD~\e^Pm-ڣqS?Cs!V_uSuر$;SFƱYL$E8iv޶BWjӷۿf=6n{)2,^jf5gY2aɁ8|{f-#1A$WcXWű`߷wHx zIM1|f_~~xtL^=S ${[/գzS~c1=Afq:ޗyRAش-Os%؏ZlWE~-u]L o4ؘ,2GKI#%#ni5a&`3UVcaE.C(S%RL^S_@MXuPѹm|-u?.bŃ Dʨ̲wA?6xz3'?.M +H hgGaKvæЏ-;֍>㐵q΍!.a{8șPziw"q&3<<;$PYsT{SI?ؐ58NAxU.x{Dr!DXUX[ ~̃wLdǴG*!m(`TF ~aq! C q"\% ) s3ÃEt!DI!D&Ehph1s0oFmAUEP`PWd*J JH#`;YTYclfJ6df9E2SDDDfDS$KYeevcYoj2eUbd2L 2r,0na˥(M ÍHp(gtН$*- 7FV:1FSf 2w -4jaI>{`a-~X;XfI1fTfklFEE CFM&lE (p*+0ְ"Z<<5mM(0֣5k@('xʀlFoS2++z,ȑծxY3631?:04ځ9DVh?qC?E<<' ?GwHvIav0ֲ"]Ͱn& _7s?oMe4` 0Plap,T;9AA3a8iHN  :>8?69A_$1 u uπZ!3̘l|W+fxΕ|[;v!2LlllGsDov~.,ׯT֬,;QBGG}f-)pv, IV r ġ(䌜^;'7awo[/i{f%ҾG?3?UD_dGits#Kn,mMr>IY `M;A>N0m ̼9ӼajM\YV = ޶ԪmÌ}+Hm]~XQ<9 `XV7PAƫ@vkH۔- 6 ~E`c#XAVnUbBdQU:*!{DL'b%rju|7+Ps߇uR416q:qEf}qcywU̐Pf$5hE&S\ 3]ѧA ^TXw뉁O)ވza~1n3`ܑ$, .V͛cqMhb@Q09D,ɘ 7DrqCÚNmS@L8rj[+=8DNUik*j EVx Vy\Dj$)\Wc4\*3I\ ( v@[ldYq l8Ubps-{Ѐo9I#""hߝ`/nµ0҅n.OY[qkiի`\^~wi}/WkS<c\ċ_:b/Ran?o[kiM5e#"Lt?G.Jܓʚ?"R0L?EŌGQc$kuR 'DNxtN0(R*"w:(z{(!Qxr&`[:W.FDݺ| CY)i{o؅o_\MÿcR2)DϭF5W{\F4 YX!ہ5ooxm^Zpl{ql+7|^sk4vm>b0o:f0 UMSY_<@&bTp8B=60l ])qŲmW\ CfJ8&$puxkN v)&" JcHk9a M`ªQP6fnF8m&"1̭;gaC$%_"!`hyMp [v1oq,\l|C-\cjS8{ܗvY˧?c.-1ط > dtKxc`EX8l \?B#-=C^|‹A-*@i_."I˥A&kxkyDƹp_T("^|`dӌPHA@Xc2q^EU -0wF:adyPp,o-nZ)ȇUMȑ%tЙ-R^o@≠1r'8[$|^Ͽ2Čr)#X֤`wvif^ fV:[#ld`= Fd;DY6oN<#4!,j)j0 35L^TuHh{=P`"%NԢn?@( QBqS֮%&eFs݈8}j&#k h)7pX袓?&+u+0[ύ`a!{mxl v 9ـW`8zq߾Pjp"ϹZ56qma7$K3 ᮣ,nAvA.q^hFHK@Y$ӖF#}ԅ6!J k7 tģwo @9D$RաBs2!N-!!KߕaR.x{`) {nMVm$"s).|0]F_13k8ΆI‚$d!/ DAN_ 6ZF4stLtn ephZj-߰}a$Xy k)E0`u>Ĉ/tth! 
E(XI(U3|$E> G@E;dijێ.^صTSva͠Zhy#HDC zuvC%@Q E$Rw  u"EIAY hX 3Vog޽UEƌ ,aps2 +3SVEb#J1%1.u}0TpVn c86 ,(\.OP(tTQiՐ 5U_#[_* #4REfC Q9c ՜Ys"lqJ<1FP!ҍOx)jjHn6[|{C:?B⟥ AI*=lCvmٲy8brJӷ ip`V)#(@s!#,qL"r 1٩ɪmzau u&_GC9}!<Ơ_Z`NMzӗ^`rEI0Ę7JX Deq#JꑌDcq(fJ zępޒN?R:gÿPlɯ7 +0WLF|͂gÆD aI9N&mr?xjNM`ۍ1|o;kn/!G .|uƒ|RG|bs`\cNz.}"59 /9|U5>5~Uխ¡2Nnc3e%a>]xfJ:=O%ݥw]Qnr6 r;MI6 (UagPYN BO+&3!I)sJS'p&BFDÏee3OBh`GJz#30\% bY 928irLW*h8\f%XX\Y65+Tc}9R"mm8,͕\T +x61s 3^c1yN3L)oc q#}V&UTh_#91Ga_MXNY)L1>SGUj-q4Fpv=\8i,ƈzƏ6g@p|cB\i5=4op* Yeu0t%u0^٦ڱOL|b+cT̫P 03N, T( gڂ'4kw8Lgۼr=LR%F60@8^ZE$Pwd2 bLȕ7RTXZu^ ebl"s64!(FZ,q&{0_ EY&61\+joZIoxoyO"-jMNzz,ʉg!LυkeڭM8 ٲ|h6TvKCVc^53UHN_#\8ְq35_12 gg;S/BƖQBtॕ_4k +!X~vS|Y iЊ ӳZ/,NGw9dJqo1 U=t2ĿN$90&G9'!ךп@ʹ GW2~WYj,zrq)H m_8G遜ke%L85M&*;w~Ɠ4p1n).sM 57]eإky>=l+ Vbh=RiH3`/#Kɤbu3" w/Od=T!J qpQ4CH?_ouTI(QPP5m;,0N3S>`dջyk}iII?f|E ;djHJ7J ɥrqԖfiuL-Z3U#|nb8I'Wl{c=Ƨ.*db$hc$4!#_76_ݿCǏl7{>rz Gr|>wM((~&F@2fD2}QmYQݝ@d+BjLp?Q{uةEY@1WpFհA¨]5ec9NX4- 10^4&&aUS]HiB9[YƋsoC{]F#zfp`Y0RQ2T]M̫cW7掑yZSغ$蛦s2T(\εX=_qE/h9pl&~l7cv5$dVZgk˝9> sW=֦X& /1r]_g8hd0sW!}bQHbwgw?srO3 G؃oϟW~'JKa+Ov#B}'}voML!MG@za#ĉx' ԓl91=iTVwh`3jNaGOѧ70y @NJJFEm*B^]pXk˽:Prmg$"^ ?\|QKr}?7; zZ=R\-ġ> LpoV$20theClFcfk`B"&\P #< :C{IpڪÁDizIB5H_8aKnT9l44 ,"BũO() s% K2gw3DGg Dv!60Y`qcv0TpZޒ1½wVe/ul;0ᆶƙi2QΦCzys7wZeV]F]CuS[At L#5`*Ϳ+9k/gBJ:%ӤI"!-օtIϷ6Y<\օj^'o g U8-f T gaӊ Zsό$_F^2^39ou϶2FqQJ[(ĿWߤ, ϑ Qf2UE+rGr-IW.EqF Qy`Ŋ"q"iR9 (k($UA SVeu.==ßY%SN#g,D<7Fʪ&כ7Im JNnşlm|>M f[y]#!8ܴV9_h" z;Bn9g 3S?i>kCrS[Fǣ:?$:Ttkm|<z4wo|ֱ kN_wBn3iGQ#5Ŷ?;T]c Mȍge9)vZ3'b:?9Orb 97QВ&͔~qh0xk .1ƅ4`GP#bJ%=׌H1B$4T})[0Sۑ-]%6'j "#[G?>Jef4D:F@ڳe% kKZZ&YԈګ퍯%jHn NY_N]H. ƒa"=u%!, {|N1_.#+oÔmC`چG˺{[jNjve๏hg6+Z޺8*T`qX?I|J#V[!ИD"')Zh(s*Uڻ4– lC(D kf"`P t$ K+ K*WX;`8De3i >j45mQT0<$*H}p-@(Z4l' f /Ul]LX.+h`X`a! l3Xev, -WAbi @R * pA` >Z0 *5X|1P32pbl1IB /7pP4x(CuBB0vA^aCB%,@@̀#$=cpAřdPD68 &@ERYX *H38pqp6#RCJ6 \wfĦ Vw 4 t_YMgfr"tZ"VCLs\2`KG@cuMUbvKK?[#xTNf~hA0ZveY~|P1[G ?[+=DŦy`]̀!1TAzq$A@C1y Xt&鬢LZLI-R`B6j$Ud6 аSL. Qvg @Y iBpeAVp9V`Bb0 % B(DTY;ZMKðʒdN%q׊.C0@Z&"D) Lxv&ZYi*W3?XRqH4$Ã8R9ZhPM* &! \l)Wvx%w/ ɦ~8kk8qkzRaZؼRvaSZa Nrm[SXsQv9\.X1!i>nS7S%rܘJ#Ӕ.:Y6U n//aϵ\fznDWE}pi?njp  p mԭ*Y*zJ 9@%PPYN?fhTLE@CwJ+z0h!$07I8G %D2ZH h$$?cn57-'m8`tG96Hae\xqͻ,r> 2"cT𚳊oXk b;΅[x=-\[te;mu yj& | Mw7Bko$u A-.H;_Z%;yEmn$ydd BE3Deε+asƑ->3Oigfql`No?gW]qLq}g+sfxpDz- 18zwTQ)9?:X{D! `p)k`)anyz{>I k_Dwølܚo4Q*tRXcdNŭ@anWI4ڰHو<ϪuC"A=8#y>ZRxk1(wAN9qݣtT:wFN_d 㹻}o#L9V*[tpD r-'L_5 okzhDh! E7'N[= b0 s0bq]C)Hd AޙAִ-~l#5fyTBX;-CZU$KZV% y.0 R y'ǿHY#@YG5׻8h/ń0d$y1̥!8s;/1X^cMA J#b7m["d (pZ!* <剰ry`YB¯"tk`܈ArX4ì %<̛!jPR*)&&H0A{;tIpJ),ߘXBXy_]nh Ǫ21pbcْ]i]k46p)L:re.5Q. hjysJi9`0OYa_w 6jc{W@YәX0({>u辦Ȧd Fz}e/c9$,If@d87,4<}df~TlNwiPENu$*jWpO9XX"j< P4 pvDbHC"-)IOa !KrƣQ=1.acb-g-nbgV ·!Gj1WQ=MjAd9r~c޹wI6^8#%t:w~G9&'W &[a1y^ z`\2%c`AxœAEׂʈdH0(mN?G{^Em\Jѥ8& zimn{f!2j8_LS]1R`5ENڿƝ*d(!SǻN[pwOÇͽ&AP}Z7,Ր&qI2CVrΎiR8 |?7<G0X}IOEQ80_fqo~:}|\gu' F>Va 3W!rܦ3Y֧$-*"LhqWSQqH7oΘ01Jtσ8%OE .s55Ga=d+'=2~/@_`:Ⴐ,F$<`[#tc yQ@& nWs9Q w)99KR1LDJT8b8DA}4n6?l/w3ZI$ RzjPH%"g#iԽnԣf :N>d 3X\P1lQb֕iIۯ],܇`{ >(+*hCICH}5z8K{Ou-w)keWOwK<^jGzL 7JZR.LiB"bN/3HŒ $7j"ryf9R f4BgU2 U`8d!Ywr1[{L>C[/KeFcmE KMLe A>:\DV,ZQ+ ]h):}HWR`4 1ivMykzխ9v@:5nqݛKIS BDP(e5;C.~[tEҴ4S- ah0 2SЛ).&Q6Xև4λY˟IqҬvKiRo"rbq7w{?o `[}0F+[&m;O[?oÄty,d]k-6eUc<-Kz-q I6)dsv>gk&8 B 2 I+̇bY IۣtakEE{>H[Vw5x>_9|(8gORH03X|!hMX&$I9&S1? 
>|=vA6v|I K^y;F{?Ff.G6ᥪ=} A+d(MӶ 'wQ񽆖Cn$ҴDW>gVkNAnmSN4!JBrmlvKk:n`_(4Y)>1@:4Ff7g>Sf0rRzq#뻿ٚPʼ)W嚛K2 } tU>?uy@Loas1ˬb$o=Uēv m?ٜ%T6 ۲"+XeU1D1TmdU|O7،B@S2D%DE4DUS_UATDPATEQ1 CEAM0D ahFēAh` ?XQ1_ hhS:{ޏKmrrt3!,F ͸#"/^KX |{g:+-QQIK]~?.FZ&tg眠(0 S=c#M!d0)ѣVEήy~DU$v40rc\l p$V£ufsC{%ai(-'\aP9z#;S J(VwTavhuȸp!tOM]+h6 s*#& n&0XSQ&bdZ`K}W'Sߋ>T+U(:S O8Za?Ma_i)m] E<wnD$d^_}X)E=C' N/I‹z_U@QT5|bp5@y~ .5v (%R Ӓ/Y\i4](^Xnc+q6`eq]>̇}OȯƲ8+@ꝲJĴz0C޿> GGoἚi0綒x#0F|2BH~e\5 ~} p䩗D}Ǵnt5&] b}FDbm)8.\?b tjQ \PFJ-)/;C FZ(UO(?C>g4%*_WˏSp&v-5](|J_VV##Mm%OyS'%-fj+ĥ=4 lz5 ԮBGct{tAaj)zF1~T P`"Z2N 8PXMJu þ֒_ Ρ0`ԃ< |=Vj5F"I!f#"4Es1K0;կY`Ļ2؆a$I 1sN{XG&LB4ot7ё=<  u2#eF1pYPn@9RPL3j4[Rdƌ0< DaAGs5]Dr6?+6h%hWrT"D:f %0-|/=v w_jAiwBiy햌aPaA{s(D$ȡK@R:tGZoÃiʑrbJ~,:,HeTEˬ2KVko&P|(%!!SFZ[, A `w# !qXhWU"%ʟmۨSyLuLIV:R0ٔ> *C, O³`joՏ.i?gB>ڣWꡠ3'<7~Geyv#/5م0}4kx%2G}ɂ, ]?[}$i+jphdzxЭ=K%fgaCőgUyVU6: Xfsh3ڙ @UM :riS!Eց_ vhp{V@V~K5?n;VBG#pY;}>CLOu%Oj!|gLéBNSV l0?["K~ƞt.VoW1GKۉdl2?_R}stbZS˃nI X>;Kt)y8q]>0MGl}LF 7Lhqe.x_'(A aͬig-J5^=.6v7[B*[%K_]~W,EߩGfRVv 猫 ?/ߝ[LW#Qr K,I?"u0^]cird眦l3Pm}-?Ȩcy՘;&\`,OGHakOr{I wMaK F.N, .{ $ ̖nE"ŵ8M$M - *ā&$+=,9c'-9^ez$QqUOv7Oɐl8G~pc[i@!mgT֠e!nhM{(y3~|@pԯ^et\Z)n.ZhSn#C}(`uZ=[lgrN$Pq7PX1K~V $@1JJE( (+x V(^1T@%-(?/ *`8Ð FLfDRod%mSlZѭ Tͬ2397vͲ5m&pM5RBJQCIJE0Q4h1؍'1Ӗ&j*QX&-Uk ")Q*J)(`bVaAw%F6"k}ZcY.4s/PF֞9j!u. xutƾMt|pHR`_}&M/5uo)k%DM%հ. W027Vښeͼ}6,@DP|HՖQ1Ld2D%-(w@4P-K!6=t2=-˞{27`v뎩A%#ꓙlqrjHka8\Zu}ޮ^.eh z9ЮR4O4jN[럅u\2\m@QpawfX5 :.[H /_X&p 0ЏM9r~z~>gػe|})8rs]-"~Grض-wQ[g(ٽu=.n(Ɛ͖.ѡju<̒ ̰@ ^EvA}]Q̃%g"x{|&w ^3<$x}RߩQ7&g6ԌyQ.d9Ri9>̤w,/BudT?ZoN#VeqV&kzF[Z`'~ Ja5A!7.=R{b8?#`$}Bx.#?2? 0ݲe{,S$"NhY:HʊwCoPa+odtm_I**mbpcާvϜjcRaJR隋X<ӷ2a (J~we-mhL}Ǐ^qZ[Ʃ 6XggKQTkPCNl܋v UF:mRzU mlN-2.*[ߦx5=Z m /8=!Uh0;vǰvҜ&mУGղteo#GesKw1׿yt-p}mm./Kqk!AR=z %[O>)֫wDiS9X:m vi&\A\Jml+ԃPLC$1$q^k)fST=k;FNыo8i*B!PF[Ofy 75הE`BŊdɕ.6RR8Q;w(^QLRIc u܃Hty 2Tc;S'9gu9fLa᷊ϸ[o7S|vd3W,CV+W=,c Erwm;m=o"ְb 8sk죲#Î}ҫOuŒݰ/CSH6$Z@M4$/6έK11;g ~'y2?] 04`׼*Ζ47&ˆlGgMDٷFZ?vK>Ivؖo㘈YcF`ޫm80QEi@V! p=2F`a@qp6 &* HԙS%$Fr Do5TU2 `d4CDSZ]Os7_?a(6K!r$,)j =I@qJ_3Xkq4dw\k**iA2DdEL|ʃ63ADf`dU9c`mZƚb^>>@^N_ib$7J@T|@E06 7vS$2ۆЛai BH'$j(M˖u BkW S a_Pwt/7_?ˍ6=Gfv56_%)-Ii8 E@#)!J o#Po"dd֠Fͥu;ikKo>)yO|]I~&-7U0=+ ;퟉?s `Hә鵸HZTĔy=53l1?{~ w/D4'$in/ݢh![tÙrr^wc^99Cɔ<4)8xT;YssAH!dN8$ȼJԾOT15)#ysq7D;G}k1UtҡD*s$X{>D@eLl'莟 4@ I)""i*)h"SLSD1CAAE,ԑ %S1T$MPS1S@AQPTE9DDNLT'B 0u=7p#KOfSGZ+u?J+It{tQy QUXtZ"V9qB2 IwX#<T#b&-&&$^.f׋:B;ca4z [Ǩrہ)Ǯ6ߟt"vĆbTb;8$F h.ghqVQ aP>,-<'_wj \$_񡳯swgG-:D)/npCtAA !Z8I,FPB ͡qN73 #xd{һ02?t7)`61O^agkxc&p{ė_$yx[kNv C4:#Ffj bIk%9M p=/hb8q," B`Ɋ,bKyD& IB2N,h1m"]؄(SzI< BډKs5)iEaQG +g/#b^C2Ѐ1GoVrE &90|WSf&&g1vLt` )>_X &FZ($:/I AB%'axP@}` 皛"KI>AqK :EBrQeIo{=cGJiȆn"I|x}c[gm.|oC+T׳Cz˃&z 5eЮ2l ^>Av0U/a͵{!`kg&\hahb&R-vwZ/{׍/{XlWZ-=/(CD ܧО ֵ=[KB\;-i^e܃adVT 0-*#Y8oCH-hr,quHz{m6w}G{,z緧ꭴY':gz\H􇩂,9RSXo/ȳ=4Һly֑g控swy@[7lzY9|QG[}җ~ h,A7ZD пMp[ƀ,W cCT #m˰Ůmk`&kxܐ03Hd4!6C9ySDe5@/"1:%b6Ny;η~wU6fhʅx D,CA@{݅+,|kn(l@Fꆤi[kv ?)zɫΚP|VXv̹">EVrc~7>ѝq:>\%M&+'Iz:7"?kUyK65g&2##_Oݗ"pz6"\XgK]ַb賌&BzN *Gtlmg0Gi9}2vĪl@5EkpoG |[jKϽY-(yoow#iX)Cx\>ĦYl[¢dI072c譳ХShbH4157 FAx~s)WN' t,O%Eer?e"\JYIt?ůYƌMO*22j,E[sW3,RsaEƤ{|ӣ%_3aMU0~}TӳVQiZgow%-sa<$oag/C,٫(&K7EsUv8[tYLk3,,eJS~ל$WY}n z5ZA[z n>- sy@RVXC CNÝ*eXt] + υF1dJt1M=l<̋5ȞoWg+:J1wţ;l:%;Y*`^aS uWGX uMڬ:x"DSpb\,aU ke=XSJm/AI 7vS3Py#e?[tOvm ; ij3k?:3MK@=8CXE*% "U8gfs٪ˈ n/=?MeWiGDrjms[|^M >8XLKe+M>'ES5a)}wͣJC874o<.֞yI(ԏ S'՞Do!^|2BRDjzUI[! 
"LrQfäNlۘ/Ʃ ~z`b k1O]4ÇZbn* W/ɝ{m-Cb<##oC\H3o#vWkA'v@罜f~7vV=bX6^#2 4N)^ZʼLa/PHD")iMTD4{q!J#& b@ d<~Idw-:ps̄h.|%!KԘ d~3aQ fIHoo+-;&/#8GNoNk꧟Y:[3gEhzRNTuEbѽf ;޴kM3wZOQE䨱fP{ͯ{K?Z\ͅ:sjE{cž (8ޓDo;Cʞfm&*6# *ް+<|e&?5x #g\cwb\wyL^ʁDLJߋDr((VKgyo {bgA7fc fij)oĔl+{ykO-r mc?~10ޣҷ6D4H/dV[#o^y!Os#z}m 3K'쭢1) rr[7%IM<Ëfj #ݟU2YWRRòDYcr4룯D9' + /nWGY0Qmd۽;i'Ȧzdyf2C\ S'ˢ7.N{s0K t)RHCNU?.B ' ) 2?̻sq@>fۣoHA{U(֦lA@u*wd.">.~pt<_tKCc)tڢm /`ʵ{nw?y`aw Pp)>aX66& ]@wE=4`mipR/+XIWZՉkxЮ9`^Q?Mb9P ܇N5OCoA)Do|;ۥGk/ p](-%XH?B|9~Z$Χ=:gy6Bw:]|,[,)*f:M;jOBa>@9y׍yINoSOQqތ:ڒqC  _@4SAA.$q(?LW%UGGAD4908͘L;BxTy QNnG|0z=!dMdgMTr>chZfCLnyX!l&eMJc>VQ*AY5\)G1,\a}ʓjQ>PjƶRa\4Áj^Xj_Ǐku}G<~sno/弻skS5;<{WJbHF[]/%MkzHy$[~[81P'^4[wg0se.T*dQCoqUklC Y!`?ax=΍VcM'$-s&8C_P* _cNGL$ كkD+^|2=Dek D3F4= HWw,O}Xw_HAC}} "&&~;JG9쿔2M}z9=$bP?יF#7qf1NRMAk1yj)aҍUC֎^Wt%$nS'S'ܟ0#8^AbOl/[B/fwWPXg5<,pgi&kS6@S-OAT2o&%DB!)jjvgFʧz0ɏ>e5 r*zGVSQb5IEW*멸_Mq1fQ!@z$^} ^/5 B+jdE K`O=qǀh?H>WKEr/LZs'Q Ĥn*pFJM_9o=llxJ%|Ȃ9kN&:J$b04~Zݾr;A" ]P͘2I8A# @C,)d6,J b r%~ZB2R]ޝ2 F%FUcZ* 11pA\15;wY 9 39Qf!o}{~ZǴvX$}xxl* pawᵮeͰpsp2یGL#Wr5ZW(KiKo6 &(SH$ $0pLu1KZE1Nwp0/"`+);("ed b$Oi2{(K 4HC[b?Ղ\E~l'{p\Kͱ^)~n_rÀoiq.kW)wYo@ޮ7~K}ݏ8<'#zr.U!sN6vՆע`{K^`.&r6ڃײoP@2ӱRIsJ?>IKj$b'_g>&/c]xSܘzK drF+GF.|-6]\V<B -;οܺJ UjdM׀ZtNQA(Y>~i #7:\2Ung͵ęyrSd^+6K.2SG)XxkCŻ,T\j1Q-M^i6/xLF#۸AuՁ7]Cm/S(08OeQLs68.UP9EiYH"sWϔX~J Hyl%  !.;NZ䙱- Ȓ ԇt88-70˪-Cnjbifv[D]BJ^͞lú-=<,xOo12Hhb44)LMc)2(ٞ7įbx sg-A(c(Xfk3ߺɰ9a`v"U6PJNO/ZwMbIƻ9b0{/cQЂ< F9̃sV ,.j>nm:ϡn]`pak rs؍^ĄY8a9` {,'|> B$듯ݴM,jaC;$$?3̖ w޾+k6{oy[Hi҉;Pp+:ҊK_J-cVZ׿<{4ͽ[-u>A:iCwP6L\oI4R*G~ ihqPsoطtB63aO#.գaˋ7),D )L:"^YꌃIA)UNJY3q#> y /V| yy/Wb<ܕ=h_](N~ a_= t~2t&:02 /PaY%eB`<K$gx4)#=%NϏHH~)_t?tJ- $9~!k?z|RQy}3wQ:l,0pc9h Mx\aer53տWKĻݦ;np,dBz&P 3e{pD1f`Yh vhxH:6۽$- ^hZ -`0!#;,Nf~̤:9j!GD^tMu?OX],ژ<\v}Eֳ/7hOpKk1O-L[w∫]΅y 8ȄcFjU3|Gi-5z8hl8& Nt,Gm"w <(_Ě8PN([]G wKCz\ݦ㭣#ijMT:`HRSj"$+)-x)H81@.NABUd)|OnGM{p vE5L0ޟ ǡC?U 9*R Xu;?lZ58Ƞaf`Ӎ,!*ȁBVk֠2Dl̒GoŖ% YPA#ۓjkiNN,[ !'n8ުOqv<= x-aL7S젺`SJV&[4>ڮ^+1i{!^֢TyBrvxkۈ`7宁I0M`U|<D~m+^ <'z;)8w@ $#!dL@~&9th@ WВ@! Z7M._[Gu#竂 dFaaBvk4`okDEր%Ձ! *,́ aM^da0]60#׾t!SjR@~vl1vV Kf2p A!3ZVG "sBg33SHl&DVb ]Dl[20$ 0"QEEΔ$B \4!:y1@RQc * -2 StMK(w C)U5q01 S 1 $;RZĹ@EtTsHE\Ӓҭ@׳gDdHû%OJ3x݇$ؤAxy$P 1+SD]"6.ll7G>|8E٘F!{[(sK[G$> ]Yj=UN[bӯ4XV3c1#Hj4W"7/ݠEU&ggQI|v,*MQ$3""9/,)Y܌eDoy"VNzItR2%44aT` 0h""%D ȱp 3,A /Drhm4f7*7́a1eFtX3:H"Ϲ0x?p{Νb IANz(I7O{!@E/fpcGTA(&EJ e\Y{2`9RY[kFKc/Nƈ]f:p֝s[XQ3c   }jl-AL>r֊ @a<ÆͬE&L9B@XbH$!9A)lcSnrĜtu"K\ E裇IW,t:_|d:"d[ lw,),G6C_`kp^5P /wax" .!~+{~N]}4]Fq faUp" ִj 5l^-H Sɂp4")r u;wAg(3~Gz;}^К;6OZW#8w߂</żkW:߹I>JW:$쿼x]gXuE꟏ m3Ȃ^f8fxtKwſ~Epz|bi =oS3$`3㟸I.{(f>U ?QW//eRG&;@'뭯7zt}Jb^CBQ">e,_Z]ܾpv3$Z<" $'. #u4I"9!0ش胖7y~vzB vOP/Bc-$/;\v<"LK Clsc#?^Rqň xK-LL:1ʚ!xJ],ixQ-a\՞.!;d0k?~z?ztgt?PcBv"\|ngڽ;ZY|Y xȐ,TJvNTNDjk~e!_i]`DsvGٵT#ԯ-u;q>aGE_~$ IM'=K=*]Wu\{*,M1vMP&nxʭog,ZC[?:kZLӳoޕvs.~>k#SgwݔY!7r2@y |6CEeyO6# [K56ɖAo+ج=fp 7L_%uL%??!s;#1~0QLA}D?1w}u_ DW$C@<>=&6bCO\**Q]~9][ yd 8?=wE,2 ,bX8E k6,Z(>Il|ԗ;?qj%#[A7ݴ 8@U;?ժ/po=W#9GUƦdmvr +/c)z}gQ`͋RO:QQ1*T f1 /s%p[t99NTJ##Br?fC]v~nV(>Xwh|K:wI:4c|T`1y`T.~nk޳chٶp &CR%,/^>&jE^%mnA5ؖTmFu҈6u/;-ѿe"u5 ɻg_H iJ8sg][itW ZxIz<'E1SR{0umaa qNmˬ9 )$9B8уRE}DvoK2%*dn:b$x.x$d.IζuFKfdqmWJu>_ű =,9Zj@H½3y(yVQ񢰂9@"b:xK^oꔻ1߯QW: HEi6.kgX a=7wGcj. 
Y"lZ!6,_Gջ)}z\wK;`-.0NH8OHpTjEzf38#sǯO`WL*ti%b4sķvp twuH-^" ֵ}8`Yh"IN< ^;¸Vj 9( raf"XFI0,VRH A )1UD3[ K9$ԯw>):'q^ iZeRH^ZSGо<;1d_jۧ$Te'e>QR\ ?*r5$G*q+ ;&{j(V,Jpb(oP08;D%> qհ6eJ&4ARd睞8lKoT/VЃ5Y\>N yu=; DZe8U`lKc)@a>:ql͝՟/)<=̘XP nGs1Է`iSIn},:UkzJ^irGxeM5Â>Uf@j< SP` ߩ_bb759YS; !`b%L[_K61"LAʣMZxڋH)$M陿`F;DVJ=}1Fd{?NF^W+XX5uHTZeW/ON7 w~u<8ثđ(sC}Rt[W-S[~q [nCY8Sa/g;TT4]GF RJ{*?*=>9Ӣ G%R KRdhL Ytq8P$ `*pWMR #V W?Y{c/.^rSUȚ"Ԗ %\5{ajir w(4h{F ♙l~[;ΩzJH~,<#۝ucۄqcY0w:N,2Ie',C|u 7כrIں׀`,uVڡT?Je5m;m3l_5%M'QvPs@xrzsuQ,4ofXAuc&2i$:}D#r&;ڔ@fQ:ڬHbX[.X#4Vytz,mԐUj3T0 sSNX,zӔvIy=?E)d%$gh0Av:b41.& \6Q[~ţ/]vZmOC{ 7;ޫwqZGE06DR]V{AX򮋨3^4-s h*X!kEI2!h~iٗ,f(ct$ tHI3Y#4:L^^ʝ`X7 #/[[׬&3$DP}Dp Byǖ׉LdÄʤ ډM1_g#W'2'{b&b])WO@Q|0=H[SZt?]HI{:n5 n(WأVU.c0EbY<})+v__ۻx\ۻ?8}=gnw)UQfX{:흘tvv?:(?j^=q-mu OV|QK-OCCmxhW/"'<[򅕤2Jm[>eOۿ]2DR4ҞduyA(81%Tx?}^H':ZAW @ g'_G$|v^_~cJ@4  =KN#zTHAeT,&C6w8fb19?t##XߧB\ȅ|piWgp&Tkt Fmj#K2O~CM,յ,/Ƙ@,f(p)8 X]oD؊R @bD)a98DIoU֧9H0~hV֣IJiE>3k-uk2`8kx{慇9{=ȷIt0.Q^+7ϗkғu0V^YL;:U4I4cmkff:m6-z.Ww yVar=Ƃ /J8Pd'bQDhbU`aHGʴyjǠ\菗 X΅,FMu-8طz3wtNշŘ" A zfp6!mL0 ϕp'K)!YW 5K':TPڄBM1H mIVH=|i2&>[ZBv_=ޮq b-4k{5]"K\=;3us)t:o}U͸{lT䡡L2;n!T5l^+'1KgG1gY^K.SB0?qtXYZ`-MBsHaՋoI畼'ÈP@46 D4> 5׶gܴ?hQX-yB>K(kgL+6 W`^b VޱI{ g%\YSHi0KBg')Li&mFF1XQ" c #*T(CD1>` y 㣀*q~Y%]A]X?y&o('C_F Hw: 31)b_RgEk37W\F,0vEU幗7} ЧP{%r;1?g'k$5bĄR}/8̋⧆f{{հX|!lp ƳH6~9R膶]ׁs^m6pPжHX$-CD%̼W6Tz Ĕ+0R&%u^ht֘p`?{i'RiWwÕ-^l3oMub6lkܗРÀ@C]0c+[q䂤%P}}x_&p(!д.cE{ (EPrY+^zX["5b8N,6 i1 :co06ĵK'0hfW<5Z39V4_CeTYS,xQy7_T.α$/7g&ZڼR $o OF)|$cf02$j!ƘM͒FN$]bYo72aCitX _I܋_/??x5|bpئq 3vHWhp2!籲GtJt4< 8n-j~u)0Sf]Z9`,lww{>7avMmӻoqηrȝOM?WOqI;WgMq&+toOx[\6qRr>]$6SI]Rrn~0FSzGoZzǧ߮?cT#g!C,:D=dp18M125Wa翻8ı'alz+̻}*9”O"Kv͛RawT>el؀R`9&`'H$& {_pj HmdX.(Ƹ2`CwVh?';$K;T:0tP4`St+ B(; 5FǞ)?DV\rW9| Kk93S^0 Lg~8н/1@JgbMn΋!" C.EZt}#*IX nwO3Y6v{K8?K+{L5Y1UʉZ4UTķ~ aܦ7ⅹʧgDQyXr̿W?9$au_+ƨ1[^1"noKhu>M;ե;nb_V]`;FK%h7cX},h~"bA)]&Tmof˒[^ %H1jрɓߘBzR"H$F-t^0q5:i/׊3:zjL5׻x\/-9u{#G/R Ğԣ7o?*+R1Z4GC"OH5jpg" N[H /ȱ|ڎ2F-wP P!u*C+%8>ߙ?wMӘ> !}}^Iuܾ˹He<_3Zlf2(6iGu-2fJY(ocv6`k2*r _c$_ z7[%2Vپh#?wٍQh?qYGUML2&= Q+=̋LnyǪ:<%s|/WnKcq<ҏOM֫Fw'_/B{A5y0w3<cg-.o:5$/rJzo9|2qm`5Yk4u&A69J6siM݊V0ǦPCqrO&&|hƠooU\ +mGGa gE-#"tY+ 'ph-H2#A!8w^6Q첽ߧ!zPp?Nč +9So Z&S!WC atˢB  wI}Alb'4aǒ-bن\JEfװ~kPdWA4&ȚNX-hK򚊰zWnC^"%w[J(Mg?ձF^LMB1R3a&XN$*9v~ wx;YX-hL;_7"IOL[&=R=UݴRwm͎N1f UH^s_-͖a2!fP•[(95vrr6^}L^ b:?^F\?.SӹzFZ5=U= jvge:)H۳i T[UdtmgX2*7zQylj;mWy]|cz{и7;j:OHl:4&?Ӏ@DҮ);1,H dl=L*c|& @)҆a#uUQY"o687p2d,&/\(9ٯr20y9RAZm)~&nUJU1X;PzO|=j#v&dî8~C܎ $ Q2"&RU`RjzkHuz2t9|AD scs|]^9j I}n_Apso)JG0l^DlUe6U+LCUHl<%|0}V| \t?3!^Q_zKMdau{ʬlZ0Vl&x zOAe~-2|;эٚS ^_fy72pUR7@]pT͟M! u gv2VZm8$2SnA53OKth)zجs>-') j!ᐧ5Dj`2 uj,]<3&bRݵ2C^]PKS~6_SǞ*=JM* t6Qb V멐U[7>HKni#GfE'v׉9C+ lrAHg$)rqB[]7]$lfuNw\u ߸tjރZOBXP( Nx(ؚtܸz=rW Di+p/'׻\Lv{&oKzkPI;L[+s7zUy_uFy ;k0G!wts7C&Q)1<ߊ.KNi$sYGl5bֳ[š|vwNmw-ha/Yf 2QJkp.p7L^Vk̘no43nIPa6~iepa%@T~tCTx`hbT;Mq˻"v ^0)7茦c>BՉׅoDA{'4xN#OI| }<>\cmbyC8—B kȟ²]̈N!_[ёpqցSw/s`~{ Fg@X \j>4tUu(D9kGS;g!`N6SsXkS%@jj|ResO<#QɫRܐя(2)-lOvbU2Df99u҅A*盽>kda"/%=pLؠ#&vֲNZEW~2-J+;m)J.`ESR[i$(s^Wp*O!0لcw꽣SRtdep;g p-Y!ZF,oe 6{ }3R&;>O>tC~\LJtc2U+豠}*U[Qf+(.;vtJHcJ"#d0Cq]\P),eIWܭ+&yH8dgN?\3g۬21lEmߔ@lMQw)N,]*q+cֶD3u>:)<:;mɑTEvnY@_OTsm30Qj>Y qb\6Lvښ's[c+=U)ws,X}xBM 1p*'~){mBQ̾]۾Ym,2,/+ ^_4M ڭ W A}[x|#zl. 
̻ >Gzŷ[T=!gK4'bFM„%`} " irs˵L鿯6B;#kq B/uIU7e^۱y_}~|:!I K 言,/qFۋ;s0q%<ʐ/(KǴ|6/{ 񫛜($;Kj▒^t׋DGVWyz߬~Qo՛o GcѢ7@eTYPyӦ `#Ձ 0._oŸg/XC[VS Dg)x'+>-%#G;ϩnB~~7v} -h}dz HxI/8-=suaF n4iknw hd-Z[YQ*NV}|)U4s ~xX4CC6e5xOnNtghpHtRY !ha"\p2ۡtBDM-,PFؿkiu?9f~px%ɮGܮƖ"R 1nj7ცd~' ձhKp5MHꐌT+t!HwVI2F q$@!T 咴'XEo.ϛ/G `XA@QRG~c[y~TcAFa@PO_51Q?ms&z60QR0r,0P" ins92" 2i >V,vy/R58 nhoiJ0H?,HIKz&е,LZ**Q6շч$AIͱŰġ0!aTC4b?W0ì Vp/`)۪|hTYH{c~)10G9􊏠qv̅S'ihBج,Bո[esL= R2J7'&aIFtZ@~vlA{1ͩTXr8VEs:a1W/Zadn̗ߢݫB;!ulR ^~/#|[g9U^YY[*eBKXPyɄ Z'>]̌c682?nS`Ξ373cKaiLJ5-AY4~Rrq66EO<,ɚJpgAD5aE<3Aq_sq%ULCmTMoތF4'^R9\M"#ҩI(>i#h}SЃiuHrVG/$C uOIB'eFA%CQ>/lNqHp]K6hl [5^o8 djn᥁R\'`s=<^ H2*r"o#'\57Ԁ;8miݙ[P1VXAQWݏo5Co7Hw߸oBn&hZS_!3AYQ -\Ǫv!džL]#g=?価q%E&P@v`q3@ho:i{k ÀV*B).;$NPujFF[x@9GuQx)R%bIz0!AJβy3O*з__c'z&8VK~=yd6Kw2sە,}D1+ar LzJ$]5b[Mux!ޝ}f}S\Gp9v[E@7(Z4),ݳDAHlj!)DJPO EvNſ%'IR^S.u觷K3v+BKCrGiL Ꝣ\3ߜ뗥^I\Y-'ĝG<~<n@ 7MyG%-HAwj< 52V{&e4.ۧN-պM)DŽ07iyhjJ UPhY,1CB։EM*)uwyxMi aN"P 3EiµPXe5cL$6O 66 ;!OCi\dy%umR puJ(E]e! CQԅ Isc<ƔLRs:1$q)VPA@zM;J C*Ӈ)z9 `:jCdF)E <ҙlAՏs`u`/]}Pe=!&;xqRЫΥ]IF.`ŋE<" sN(Y?-CĴi7M7mU~LwUBJOA)PX[l"O^˒GJ@&" Dspoi vݯٯj3C/:kGo?swwλ#oW]^9wf^Ǥ\^ۢ?&ƥv-;|O+!:}47O3SvDqU [l$I$=NvV0ƿy14rA4n9ѿC%ܙtY)u c/`.뱆Z$ ty.'"8(_imBmFRF"\dif@)lZ Ǐ+>8!iUGlڵK.pl(擧A'Elcj,*ҖgAeT4 y6,V۩H؉bD-\m=0>:YіLn=ﯽW8-{&<ܚT= MF_ogQ%6Di ܾtcCl<T-!*TK[mn-*K9@# h$5&(xn7" Ɨ F$Е#X7VהǺZ;f8|5}uqHܨR*uY7XgmT2㰚Ʌ_d -$A h`a2j!q'.~tx>/߶W٧8]M \IR8&v)|&ZD[suAyn ޛҪųu)YKEz̈s%^LWCjdOt ⪁h{Є%h( w(h5Dz\Kɣ4wc@%V$!K19 T\` 3Br A Pl.xzG70ƾq}rҼ-Pry#5zf`zŐ X!%k.4BRc`kډ cUN{EuIw_\w d>^bN;y%MH,.7nLϜ=qD;fPIk9[z`;fxjoň8Mhެ[Hkg@k MbthBiiE-sPOoWֆuJL[]XK#Yݹn4q0 jm &ҷC{%͌nU!$qPGqT M7x扨 x?yg^9kn2nY>0 vtPضyd?@s@xШ%.KxS7B;;o*!o~Tg|ȃlX $Θ li Tqi~O:( 2D "Z\7P;'f-ܱ!HK׋`}pH0%M-ڇ7rH#T;I)t  =:OLbǫ"k+<^{?& 4$`垇Pڜ=}sͥҨ}Aaw#?c5^'>M6,o=49GI †U@1bdw3;|]=X!bm6@ǡb{Uɮu: J4SH")C`Un`~(׾i$F f pBV+ӣ]Y\t0O`Jh\`@ l}$S9lqMK]<^Vy<>f(b=LJ돽>7o Xދ .e1Ѳc ##|X+Ēg0x24C? z'e)BDz8X_+-RNq`Z@`nj,R:@6shQqj;o^d^w5"bTjՐ'HiD_PH\mDj6l P׼Yb4R\YC1]jsfP2!(ڭ;<_P`y1]0\:CES:Z,S[6bmP @A%S%o5mGTE\*ya+u+L{U98PIp̩%+|Kky2Zv#/7N #'!Ru^Ȁ,J΅FP{'12ZQ;怼g 1[@"[9i6ci>5KR"WiقŕEQU%QEx_R9wèPbK7!AoǿVca6WM@HHH<=v=L| 1U\Q<~R*9\̃?w@:bI*S$;%!LH" [V;ip}WK2d*E5eo6:mߛBuxK\yǮrzdR!䛐Mn\w| /d=y٪@b~с>g>dݺ)CV L%LEVIh@iUS5mn!'m-uc;lB o)7crW@u<"-M +ZЙٻ~AI(F$Zmxvv7H:CP\stJM ? 
m7&:et,cR"1@8=)CYじD@ Mg ۧV }OWGPCL@˴i1voa;m>0lm-1cls9~KC.,Qdߩ8}F+ےI5-Mڤk.i[,L 2r)?QY NLuHg"0>E=H#~֕ iO\,s#͇`r/kwvNV|lDŽ '6v9X"}63ܨ/؇L@@ _Sc$9"q5苤v'00v`kl:8#ArwLougt/B单Fj5+OFUL(c~~5A`x#Ǚc65Ӂo-=)S8b_@H>8Q@<@PFIpuuZ,-&u:N#2R묧sFF,{}ͳ4R48/dA՞AnJxV[IRDͶ0?¸9Rf6 (_f%1OGZZM<< !F3 >bJ·VYs@u EkhDV΅,Üա<z"ytfȴ^$7ݶRqQ-T;kN0&610ME{\M|~-8cT޶ 1X,58hRo^L%6cc<H6غ}Pey<m5#D#ZDmA$F}ھ0սYҩ7[&'W e I1)qc亍##= ʉ|qnU}MG/ANT0Dn3^sR!ֺ!̊.E{1eseY)iNݯG# \InX .P$!5@6F1D+1D%@.n3sr\It]fAam[kFͦeqC2FjjPLǝc.Ŧn6#ww3ҟ](y 4p?vn e jr|9HaXWrfFFvYd.Atzel؍zֺ;VW<Mlgʡޙ)Y{ &dk8L=6[qcgIí 4:t4rqqC-9*;]hƼb7 | bx'Fiu&T)) >B6e%G^Ke-Vjwe쑦vXB 5m1pc'lo?#klk5+<~- I-@U+ ";xsbM~$:.ouQȤq-깒4֖OsEr4mFYZ Q+)[ĽiWGq;P%CZ -Q19<,^T,2=e"weyh>B/קCuۂԼ֣qiT{!?S0pj{喭YjE;htE)>AUglGf7=$Af7@I!I^/Q@*\3B^6֖JrvrLST#Ƈǩuj OHkg9>}a㹫&K||Sΐ(pMe*|5y~73O)G-vewRÍotpx+1J _9%Qܿ?Cjdq!-owJfeJ"1FM&_!Oq c9|e]1~=EUMHqt7?+>FiGq5J)El9s]8hB-+~ m `'n>.4HX~m t.28A7ܻ(:"$NF;f8؉{rwUݧIi"$g HN&jp: tlQ0$bl([V3沜s<{zև@@WWEc ,CE* `C@ o;AmWO5p4B30z9;U:">cbVrq%D+{ym擏%lήρ;WwXU7z.ӢɂBN}OU7ؓNwcmNdLc@oxTm)quqddQ0}q8:p`V8i=v8+S [+4O*xcҳsڞ]lsd]׷ b^ 3$[T7X:j-H쳶Xh̯66؛h@: xIr6+F4Dm]me0S"M@l?ދ¤ɩ$bqn.al8b&h͒y IUJjH+!u<ȫf_/Y^="KkpD3}/DIΘKf|ӂRP6*b˙_>/`7m:.=z 62(Xbc-0+*T7bk)r5E`5 'eE'ڰ+QD6}Ӽ$tVsㆫb[uSN t2H3K1 D!&VRU>67異nF-v)(sFӨa";RgKo<لu@ȳ, pJDqƮ.N]<'ۍ6;"6Rg+}qh?7RDTWK͹56+/V`M~[Lru#qȫqa%/BU6إtCcvn ,܌z%QF6(bpUQV ls y|+7LF~#O0kɌ /_ DR}JȠUV*2' 14;P<2us-er$;YB]jJ=-W5],ܼCT(G?HP\CNlgJfEm+e`[o#ŎRtUˍ Jɣ̓&2#?Y& %lqZ((_vKqs'/hw5H!|FNu qtɓޣJbqCs9..8p]jE,W" Jb"'<[S(ɂgTBBP6Mث%,2yAGxڼޢdс #/2ErT]udyp%"|ypzqD] G˲1Zsȴ{a Ye^]|߀ss؃\H1DԶhjJy ZY\<ʎU[vE(;tji%ɮ&WQM Av '!=ڠ1Lb733 Az4p'R$AVN^Jpzs)=8G?ՒPkEXH5;rlжm5s]LvVY8b -c>%\mHOCڍ֝nL02oAP̊6ra 9d?ju߻p`ft&_Q b#Vߡ8u'/i $/u/+98yX^1'^Ύemfkφx 4 g=}N$dF17f<ҷ?R~ֺi\8U|D0[<8ERT k@m*g B^㐅Bȱ!Wo@ K>I/E(1UIG7ςנ|XEq!C{?m|o4β燦 g[a9#Dg {g8wE1WН&3bFT 6 iujEWjv-4hGms;:]+&4=!Ϙx3,e=Zj:om/p *hJ%Iq{l=y\Sb*j_a~t zK5թb/Ϛ8yR"%ם%徥 H6(wJ>miʉyY@7k67l :A%%X%Fץ꬯T]q i oȐ΅,YƗFsXK& yIlT \1J++>M7pH!& s F Le!Ym!\8g> }}"*,H&b]gw";_WGiC5Wru{a ,e>-Ihh6y$իmKEGy&,bȁ!dOo /trpmlL+,_mryi Z2Hfss6_H%$5Em [-1._:׌m=߸$*ᗥZFuwCg*&SK9cJSTDm/$BUIj\VLSma1JÐ IO!;yE4o|[` m{aڰa}} )@ѳ7Yӑ6zXr^fA΁+0;m݀;+Q3W[A!}sۗu4{Я@8nTu;Z>Fzp|>=v}"=^vھwymڷ\tmý{.=>'zGcx;|;9 w;>_}+mE5݃R$omwuYmACmZkY$ ]مh4kC@Kc@A@v@C!@ @c0/M#kX4ՍW>|zh0>Hw'$7מwkoD]^>|-ȪV m}m4n3ʔu:!ggzs^՛!Y}{u=>k}b\;_y=p}.y^ȶ k]o z|;{]vꋴ7Rm۪oSyp9TG^|e rw|u}p{{V{>ڣkI"},g\>li}}{}7WN]fl;ŗ}yn,-{sSVsmw>G͆{}}w}ϖu<>\8}Im}Ct;0}bԝ>{׾Iٝ)}3{Zw{>݋=`J^}݊(/e=w^۸lBd{9奄q}6=]u>y=^&vtmS;ޮϹ>׮ܝ>]nlcJݰ۠+O|{ý>-vbݠWw@pu7w=O"MaNmv9TA_>}n;qZݻﭡS`w/-o/@U[aY_u%510Mh4C@4#!FAi@&@ #LLO h&Ʀi0h2di44`cCSj0dѦ#TS Lh$""L&bO@hMmxe=MMOL{SʧGjjhOSSdj=SOeMML҃i4ʚ{Tz$M&F`&043Ti<2M&A =uGo,DB\z)TktXm1} Ϝ 4;F;*eYlhQ@UoKy4ÔѬ@ YD4]ɫ&qr!Q`X #J%jKba{/E+nێк(!ǔ8Hqy8$rZqr7bNP Px.9RO$pn6H؛BQ%6s&2# K@ c`v r|dfÍվR| Mx-ƲU۷6ºHP o)r?6('sx6n!v3dgKK Zwݐ.ãѝvF;{]u٧wecFP(sjýC-^3*DCgzR *(bX ;5,ӵG@ݰj(EF^l #ѿb7+Ch`߿ 0A B?m‹KC,aF<2tGRXx!8v8Qbfϵmú {ڂΞG2 )$s.(\c`p\ S)†p {"C :* = HTD a;)ÃwN7ɢ%>+:ZW{çGcH[֒aPqȝ0P'52af4U@4=#k 2w09uY&Y+Mlv Vq׀7L" j"[8lޭnvMQlFS:"Ʀ Ura_8u; 㴱PnEDw`!bo=A?+wU_j 5U,j*&/jCt @v=..^ݿ )Avjw,xli1H`8[q+NZVW@SaqyfOkۚLwk@}8O1w҅5vRY"$0i*n.]8(PF覢dYA8I7ݟMi\#lu%Sr*ſڶjVj5V5b5FJkhBDM4,bF( 5Y d ڴjڊk-W6b֋G*lZb6ۮ*@$rY*oq#wz)QF@ ]Us9No=)`y6꟬ CkPޚía eM)IdWh֒j*QKEЙEU%l|v#DUZUm֋cV-Z TmV9*@$9HI!ڣFֱmkh 4%5_j&ũsw\nU9jW]t8J,XZɴ߉mE^(wqca$q~NYoK Q}p: j22 S$/U77c[E-h6hղk*UQ`J,̗M ,&13$ĥDE,j(dLd6QAlm&-[mhV &0?@#|;n؈!"!lVVZlmc[ElXضڱTX6EZ6U%Xj6k!XXƨcRk%j-Fɨ6ڢi X5ъV4j((5Tk)Q XfH3IhpEj&P،1QJ`4% ZHdA!DBX )0i(ъUEQeF+ɲlb(M3np7\9r&tE$l5EѴZ V,jj9r+E|5~J1%&! 
)$QdkTmlb6Ս*3Uҕk[:]65!sn5{QW_vG`Scև: f#*Qd 1#jC9ΎuwF$!|QRD GBFƴm䗷MU#_ `7;J0%k_M3wI hִj[(50jlV Yd?'~>Kw[o#t "l@٭,U} zG7q9vx^{qCqo~z5=t>>.T~tI$w~|6G{jz[|mm#9{Ez3p$?))f<ƓSg15 0 !̄m {7oN|cZ_XN,UWKRޟy%0+e&\] ;W'cbVq|P$A}.bTb^f^nj_ 5mƬY52N;rs>Z@a! .!5ޞ_}cK3(@.J'6f2bTf2)@DD:Ba!sbi "Qllj*Qgv( J( X+h6FV3\ )r4N†KixIf)[mTQ>O9 z$QQQA4lb4F,cm<|oξV^/k(ГӽCXb_Y{Nsm˽v cXr_w?W'=H!Zm5e/frWⓛACMMFJ>!$2@H=sy*^ʳ)8v= kFOv@VXbj-h*1Y G]CLDhTkPm%F*TZu\Gљ/oK+[S(D9G&QQk\+ hm9}#gc֕YypwUX$ "M8HʨVZcFDguh!Yc7?[76T/SN-~q(y|fuGOp=9[aTng_B`a.\?Mۻvw=HPY6IF^>nS4FZTXh֊cm%hsFT?.(nZik%,b$MbI}b\>ƹowjWz]ƾ巃-9h6x*Xn/sx>?M5ap_!( j|'yzeE $YF 4LFkFF @ 5tJIk3NwS6r&#d8pc 5xg1YϬRTlEň$Xf@#׹wE&"R1fG&aA BQ$EjaLJ#E5KE5Zb6эchUqEV)0FMN\m˿BoeߛՂɍd4mD Ad txsiwM`pBABA$mFQ5cm&6ŢEh,XFkmdS- HŊTFMdfdJ@"cJkJDQc-*`ţhIj+XcV6ZFAM0t/@5|^&J.}0 n>4 msU/}MۭV2NÕH9ήǿLZt:߳^\6nLLL FQ1M.XڋZ5I9$7S*$H(TZV6  !}Bk˅uymPVFNSFj}_w#2 QZ,)N^ZJ$׵[\U[mld!DB"}m~'gpOj-oӣѰ 0'qy[:@shs00ILޚ3 $0<ܩOhU$`KFحQiDcdy WJd@@JΪFkJ+T%"25eLQ,TL(IE -ky7-HsK9?s68髊lj5cmb-hE[`AdQqN%#O{ޠ*:K7!QݻiŨI#qrq\kr7 "F#[ݽ_Ϻ K<ݵ#9%N jk5N 91X`_:35ٽg“(ڃm{l~y`wu9SfZhVn,hXkZ(K) jd)2X+55iK&J233MfjfebYFѪڢƶ;3zR"1o6;SUl0_ĺh/~m!a=dkkG:ƸڌQ*@T*K q/;lGrsĂNjXk(6ʾ=g/:b4b#0дhTțM% 22&dMG\{M ~Ggǯ 0 FlZf*#Z1 mqQEXj5EZ5ϪgK{V5jmouohFF֊6-bm5]rk>uA(`ԌTovk7֮KTmd$D@ E9>;CÏffԙYGBSR85zNg.و:K➧a._Ɔ/+s~I8Ȳ*ȀFhI$a)f 1%i(hh5Z-b[-Xj "skt"!tzN|l$_]e`Cn(F(p1*T 7u`Yms*<7̯ ((M!1jشZuWc3M^SKB|jTՋa!]Sڄ ONn;œJwv>bK ],z5fi'Т*3yy6d HxCPbD%rf n=>mz}B?(NZk|4Q=5v;Θm9O71? 'A8I1H dz<0{(A] 3t8dž>zءQԂ(`cr_ wDo||V;(>*7\p RQI2Ւ$$"4`~|aU 2ց$&}z'Mۈm646 HsNJOQ`P^ ~E:4_Wq^w5^뻷cUu`mgɍUe_p͡qu մK{82-' ўg8\4/! "<{Ϡ&\X:C1)k/UH 96fn_kJ$ޣTur/a}%8W#tЬ|W] &" iymO}ReAm] 1m&3Ɠq=TdE_>E_K#"ZD 45R`( EO+pLFe.pVnOҘ{Y'E \7HECCD:#L>9!M!ѣP Xf5w#@z=G9a!ucsr&0XVupLHHs#Ir#';SK&,`3V:u]b!D2C*̿%p=Xʓxgz hݡLG%m5ב%uDZh`u:q֩mh%  *"@RcEJIefCMuNM>SJ0u"քb5BJP-D&Z/fx֙+Z)%5n1T @NՅGW fH"HF`M*rpk'<+ F#;+x`xbIP$b@I DTcLR/ B|k_zޛ\B } hD01in@_邼Zy`Y]Tzώk9k}wGH+gq-oa8FذˏI|RkaS3m摵ʅ\t=& G2OǺU(DkV,3όz@ICS#0,VL6 lwc~)Ki G37.Dl06Oq{Va*ۦ5*١[#ӗ`oɍ8J*%Y&.b ,P, (P=&Pm:"`# m\iVz}8y͢A' U'4 Gywu bZ4"& @`?ԫb$@:7&wZ"Aj@*B2$ ڀ6 l;0`$I&u5 DvW(Rj'ESsM iX-D$Wsؚ6fsEqPh#HHt0ȷW;D@u@ 0l 6ʘ^\?dHf<p@*_Z-f%S2>3?w%Ba`}t${dB804, h X̲Kt4pn$ Dqd]iqt8q/d{iH8@-" ^}].M#%c.pgAB0;"v/elXa6 $:(`X,t6*p&(ߒI"~H "S]o"Ƒ7hˍŽ2Ax|.%bk.nhqPCDAɝ>Jફ 4P㡁U8BXjx4m-5X EƓ^JTh*PB 2QΊ^hMD\лpwc5Ic:RҸ l J=R\0q7ޜ =pl9v1m^[J&1&Yz9aP!RlpcQEF,B.=jEmZ8>BK9®­*O%y=2g@^jg0gkҎ1ssvI7WțK;eNc2$GV ϊW!!]5[w;듽 :WѾ+W"hETb[ڮ9$n*/vjuoCENPZ3Ō"'RA!̰ٵLSWksA(a$bȖ+*> :}Pvt|֑LUm>d4s,5v*ۂDb+S!G_a y.mXH jhD y[HjȔ E{{^0Y]JtvLqSTȚTTz"#0LfjQXEWؕ)RHHY d}M6u6C@qȸ4{״a""DФCq,sPRG5}~:@ܻY i2*T<tÃ:BޭydrҙW6/EgD2FFB@=%^/oZMbun6m!s`8ѬTzދEW}-QU('exsx:$D@~?IRgJ_mT;\z 3L>`m]rfFD$##8N݌$)dkFv“u;|:d'Ͽ}k۪6Z6F([VrᦊF-EQi,͹I(LlKƨMF$">) d}56gÓwz|hZ6 bW_Y._}w*$^df}j_I3bƣhbz5sHޫ_161b>ֵWخo-ƒ ƙ4h^]څŜzw; $OKP>'V6Ŵ 4n_oNgmEfʝ4=,{&ڢ B{jtiQAU)*;r7ߌݏ1I 1qfV*Er*,.fJʎb߅V*R,`[ CMljh q=l3ixg+B$Rw0sq U6 wUlt9^W0ʩ$0p| hZVEZÐͱMr i-M,ah3p0*[Wealpr PCEi_(vk18mqIs}Y5%ECM1T>1DXl ώPU+Zh+Chcol16ʇ K@ C&pɴ4ƍJ_ppu[yuAbq>@ ON)-^_E~ i"#eVƭ.\ssrյfܭ*8d3Rۑ(YhͳϽezv:h jwhVS3tzup+dstu$t/sÎ<d$ oQF;4&%`WlU?hxK*RF]ZeP+Y$M04ߒFY}ϦfȐ!ޱo7hfXbOF`)xҜ Z><&`=yvQ* 0pzEȂȳIv& N8 *!nPϑ5On+P˰ jZcMO` (i\ѐWXj(w:}+DcAiZvbK#H@h {ÜrΛ2Â֍姦H%qSȆ7`5kA5(3Z)ɓFa8UQ&Ҍ9f!EE(qD7 p1 ~|ûMM_9e(;M'8j\{Tb iψYxBpG]}l9fwf%BU0#Sȧư&єOI U(55{rGsO@qA7BI80$"ChݐtAuͻ>eup:EKI:.s8]A ǥp?ҟ~\ 2dMgM$$ҿv9r=XФ@; R YbMiSYG6TPF/ B=RYnȡ8٭2L~A(A`Pt\ 7oa j#v~1{IC8 H"'}]3ϸ+E;53HxY(6dyT "}BI{w ި)&ؑ1`KM|aXX}( Wcjz+`CgH)f߮i<_܅sg+M9 P.~lE'ɿs]ۃ{OviP.ZRA'm:*f;# uXg0NXBx(&c`2A j klzNîڮFazdƢZԎdh@/ct@"^9b贔rlXA"MmKxtI h~cNG DIMuDhevP_+90OšiSɸ $z;@&!Bh^{XPBcʼnt'9\C5: h )㇜&C_YU^Em*hgN5(A9"UDBTRD|DtNΩWonͭ] 
I$ha&0ſSǹ[߯*qf_[u:`RA?gEtqG3!FkCh]gTf5[ >[AgJ{‹k![\&-zpdQlD|hro_Ja}m=8np1xr\>A Gy8bƜθ|Vg~wע9`Bc|5&1* ЋQI]K $LvB$HhK|/WyyGسBIs^'Ð o4O4&3upqЈw ʾ>+OrH來֘<ŨM%HFy{I_(^4$Q\DD?iБ.nݍ,=(`ď0Q…mR&b؂+qlV}*YiքE8aJ>0("{1W Gbhde{۽mKpL)AmxB<5E1Vj|2wp"JωEPs1/9LD<79xZ:Mju(*@:! =!@%Y7cA4 8/)L81axf7MMKu1?c"\yn:uR7H\.c /!:H{ "}D ǿI ՝f6JIeԳu#i)J/4ebbVUuՌ7Gn6aY)! $%$nCLWp$ x\ zVD}nNCoq${0eF||HDn!&L9a0D2QR!# A7:QL)Rc>ooN).Gn18, ѧej^bu8H-R/8- ӝsze,(\|U;;,cm,㡘kx;x0G#Y mxcǜ,BdWX*F}]:xX8 ٹ$h3&A s03wYO5wu*}*'S8 :2zc98 Ci͠r=e՚a,s93Hs}l0P<8p>iJH5n-gy:/ͯekZlN]I:H Rձ1}'seР6ba&ڽ ;{W7^B@ē :51CXrׁ8+QcT٣%̂rət#(Xp00).)!2I$(n$]~6-wh#1>t+s``ɝ E|0_ :φMS y;0ҭ%ō,z:$!$HB2B2HĞG>IH4 |`|6%YaG9mACTGC!$c)ʿO;@p-̱Yc5)?IF/~w0)e&Gߣ;Vv3yUdV4%eMr/1keް|m(E$xQ(J). 7(˺MR>?t)5Yu}32 Vr2m)9ۭX5/sL}_%gǐ[O91khfBj%KO՘AD91v4qpdu+|@0SI(+L|/8@W2C,YYDq lp;.PH֐tfk"'y8?s"D5ȁމJ`rd $vrGϳ?)4G yؚ2 <"$$`$.ӻnC_Q!2_an!8pCMHsLim*ڽ)ȑƽV(){ĢΟ+M#r{ew,ZG؎AÉˁxa1ڈDcIpXfxSh/Ok!1əO J㜄 Lw=QBcdHQ1γ9 ʳ45i]'>H:1ݓBw[ֈ3pE׃u _d `i//f^1bFz4q; H*Euב~wT 1@gQ|@PL1of_#shhdwI C1\ў_{9sµbs{ H>ͽLt8йvoOqH uaaVe]Z(]gO^JL\thS .'!8IdWUA Y P`f@pA$F M3̔U_j6QQcPO7'U]W.{?W7Vz@'tCFI!f9Ѵ$ Lg>iAjًq39k/7?:fb.pLN<~rOb3Fxޒ–)3rOx̯}@$=})OdElnqHkMԩ"3Ȑ&.7%3$6BqGM~k3ۑO+6-~ï=U0iBd9̹:GK1NnKh2,N^rO[P$hn=ɊHTG=1.$7@;'{L_< P^>ELEn4V-v |Kjo{?UCq[hvcJM,of$_C ցCS>PVE?6q4v2A.}2=$M66Q֖tNђH$BIK5$-9#,P΂r1Os1q b(J \7 ?E4shob2?>w`D`7ys7b"ݑPIo{  .[[+jUQvňڧCB7Uc}vɰ$UMSl syv o&Dʸ=Elad/GOXEe6l0kКF |螨xhhTfD8`0G_%Ctd"v X0ᶷmWޠ-$>Ivd6ܔ$-Ln֟9NM zLȨ&*]gtթʇv4L'Mk5jbtm 2I%nCA<@Qiluf&imőzIлj-=FQ]c(mL=-A r}w lH:nDGx|UB@!C Noho-]s Qc̽B4s;_~1<7̮ۇ~Z1 1v-XR'#EifD@d- 灳Cex=k[u ſoG/O~?4/5!mPY4"_Xx^1 "i(P[5ϱÖLC< B8`ˉå[,FePãufEQjn mkxlTX .Rbi&2^]m]}Iv'}$JIJbaz:: jI_MÒHK;,_Rg1B"5uL#?eT6;gqEBWaH=`OZ`@)Ֆ #p*cli6<"kx>u1"W{C^1"`rndoӷzFb$hI4% Xo֕*Mq23655f{RCmi>~y}i]|:0z&7y>@ա:Q^@Dxl!2I&nq\шS NI )M'n-6 {B$7+Si汲ye' *eҡزnk1\5]t! B>GԒ2:kr$&BH+wg=]c,b4瞤#7=YGm+ =$c| yو$&KXYVeT&)c &`>n_"OQD+/|/+}oɡ(c $i]ߧ?s^_'/\$ww0k/l6:&/(H^Ah&`FOcblfJe |t,͒AܭxΒVS4}zV> sbyYvˈ7ef,sfyk̅I=骍zBر + . F ,Ts}(\ל?d>[P8bQ&LB \TI &Y NZ9fm]:|\j*~rۚ}Spl{P<ۋwy/SaZs}X (c_ z+eprq8l 68V(O8ݜ%IiACZrpb] 3Fp?Ie*CG~9P-$=X Z+B:ʲ+ tdUNkyWX?}*FJgqbcKء;_mrTٳJjѫwI+ݢ,-":7X8Gz\޶c(7a&a"8z n AwSWqdz{Jt/q;7&"T)ڭvp&E0$! (lLήMUImTa`r@oz6HB&lf)n|NaUz =5k"Ei 6O[uqjFwkRUooQ}X=A@6x]Qš-F^4(oRΥB'԰& f{` ~"_c_yPmwAjE2g4b_'p32ŝeb53Lb+m)"i$wK6NKD #-/7Yos}Ǔ>ڹIky|! (68pHBѧFyx$UImB2.ԗsdaL{>kzkro<~E aohmgZ G;_X}AC_o 0+^o}RM;oX&2C@'<6Cm*!I| P(+⣄;ցB{UDvru*$ys2+uԓww_o9K ~T1ӗa@wy~'F2H \]yFX03I.VvHtTQNYs|O&S("4"=bL1d#&z{?o立i[$ ws.۝DR-n0vn]4mqۊt#xS_w-uWTh7 34@CąxÈRvPvUu.mhL ģaug8׀-\m#67wwc qL0qᬑRFL@!B ͞^er( H4|+ardN2ٙX?<'~@ AW?~ɾ}sUÙAt~|_ZOO+},=kW]vOyn BAXV#Y.Z$Ӻ=23N4%"Qo۩Zzf+AA 0~HB\ :j%S&layBvŷ9@Ó* Q/n~|r߰<ۮFJǗ1xy[[q2AQ~}7kY/$ S_?^FW޲VJH9o4L2$e$k &nƵ n6&Tcln)Φ(m 09@WP@@l vkҴJ/8FD#sQZn0mn&HQ0YF#H9IrtfgStw4#KhilkKwc޴̴J؞_cUhsZ!2),t,k յ;lu|a+f臡vzH_Hw*U,VLzgǓ`1}/Y`x-n*KlC9I^O&53{PA$shAv 830w婐Z _s\˵ܿJ4!J;flv|IB7b@dKn%w';2UTNIʮn}UXسgo|8Wc/__OlD\po53e5.N AV;rN=z"?@@^sU'#i1̢3bLLqlPe6bYb,eazQ/XjLOCY-H} `*F A0{݂&'bpcodlըPo>bFK 2)ڻeqNjc^#zܕݴ1T5nz9 ',f08NزD 1r~EÄ`L]*r%ty/uaԦ2DXE9մDwtdԸON͝7LE4YIǍEt'b#Kx@lG}X20p]MhZN`W(5Qs"v/ iV֫.M 64^djw}W͵V!Z /kC g8gS@uC:YL}ύ! B0+ڐ煔:_泧OzGb ĵT\c|&9Ș`K5krѮ. @΅A?uڣfѱB.(5!Ň 1pՍBʡDM߁\G=zHSZIԡn?М~z.mV*jC~$"NFJ=d 8:e4#6mX;#_WӣژE=GYL6J kjOZB"j`E)3jb*P1Tj :Pva [%Aw?YMߪFbLq.7-Q=,>.9Y0@?mRzJ 2jYg9&Jϯ.r0C .z=~$Lj sj|*F*?Gj'dg<x[i~X0i}E"=k2ٞ-u/kf3-mBE*kIgTy穥QZMh.Aw /`@Hv$ujIy_)}U Թ>ui9Xg٥jyֶ0A ;便Ŭ=st"H*{Gϸu8`UyHp_^cՑm+0BZ@rŖ>Nț=n8{3dYjvÅ! 
^_( LpԈ3{>j.a<+Iy6@?!yB ͖^!jbV>ZFmN@v_z&$dkܰX4)aR$ 8O U4×ՍzƱ4##睧k:b uBTN)ʛw@.I X`"=,Ԭq.ӡ4H\_AQeG|I q\.Z|=2G` $surcpw2Ml_:!]EB+qpAFDs5Z~Y:҆;H-xWE 8hTiCw/| ĻHErDQ pFEH2D׆\pcL]@Z8_Xm&W4L"BkL{^ J4tƈT=OV6b%D_[m6rG;s}QB AxBmlbeKmQhC^"T^L--}EàCB(hCƃ$6PS2# Š-oPR= sp:Hr(8QS$ r@SʗEM9[ˀLV;PGn-DG"$jF ABA$qA@*"3l\G)ûDŽ.0h 7^H4+]5W+icWKmq@QJE1FSuY7̴R#W\5KW$h2BJQIqU=Ր6T,mjp"fKP(#QDL"p|TPL\8(b]X8\gq*P$@ JPdU09=y颪A$dQSV4"OLQT @7UDHelEt`UHCQ3Dx] URSGz yn"MQB'Հ|"ʄAC `DIT<.HE~k]GҚo-Ҵ.%Tq(aL4R:I hЋ }h TT(@bh  8 *+$myͫо׏Q])>R(#&x9;!6 R"+= 3$0S Dj&HKPx]E WUmVx-m%[AE=.:T/"h~pbDbm @E@̊d\țEzhOCٖPڥ2G>CVL;\"E`8 8`誆l!@HwZTLр1/ xPL8THi&Btwˊ9]12?e48,ng١j؂&h*\E3kc_q F# GYDCo/lZmUX6ZQk$[F!"$xltD! ؊|DHEb)BqFJlZ Ў MEdٕmWZ@D߱j #1ؤPhMTbka }G!X8*1FMEY0"-*|A;dJ PQ]8ޓ? n};b x/_L}{ky*ReslH2 p[P Z6I"4iJfE4!I#%=廩]6-E *"ѓ-PёdTP`dJQ٣%&DHID`` #X l &)qp[\D! LȊb1J`J0rQŊ@)Dsl38B0$E.yB(5>Ma@dEnWĀ2\rկ^nռ:̒d&DmE"ҀI3!3#)Q$!zJޢ[El[Q*m)XŊ#mY DmF@XTb ŀ,1%hѨQdh"!I ,hIc2T%F XB0ĩB`X@!6Ř@(F! ^ h6 A94&S)!H!"J~P3T\z?SؽP4 zjhUJ-qAmTQFƤhZ @EcIDd1lE o"H *R9 ǠBi@:CEp8R%$ηXTZEZ&6)$J$X(e[4aMJBF3r{[BFXADオ+oU4I#d cBdaL $,T-ѸCȂ^o ;̂Y9>4FEtaIt$RA 7Ku[AiKyGSz~JFijb_so:*HFI{|$ 'YZͼ FA# 20!4("u}쥎 `E# n1T, H !%"$Fu\z.L J K}>E[m{^;%y[k](Ay1SY0E(8v֋Ss5썹{QLADA"2$e! \9srg;um}z<)oPdE*"!DT )*'*oU5 [y#Qhhث(&ITh1Qelm 5ݵuv0hTPҘ@II"2E%lS "&!K@X2 4RA%"̔iFWUZ+k-[ȤѲ0:ub&s`"J hU5 )Ja[[խyE][VU[e2(sZף 2k85oի6!vZm[[!_L_am_^ھy{) B̴h *Z&mi(4,a`2$a #+μlS^+Db-ȲZBm2"MdŃa6,LQ T6,&VQY(6M@U j-bM,l,XŴQ D&ںUA)&DĆ(dbJ6M!0I&e3BW+U~k}c[ĊZ/w3-mWasEFVe $FmqS`$#FJ,b*"J Ff5XQlֽe~W!1jH ׊麪0#(a0FIج B,R(т(I F H24 k*&)0xfnMT\**uܔR9] KR@x82&\3־i}P+ `XAՀ U^ƍFE2!dD6CA(ݭලJV n|Fz'&&ժZw$!(t"sa"LĖKAY%o;m$1KQT"شk9E&@qf9^KݻJh1chŌQX HJDШSk s8b]ěc L-,&0LS2bgn{pm}Z",l$R/+nytGW_ܭWn鲕[6oe@:^E֦`d0Jf 6ch:hdN2@dQR$iu__G>Frֹ>+ܵtգ`]搘o H'quuRQkz/>rCbJ#Rij4e@j$Hc&#D MBBP#T R&(PH%$X&bTUں`lj]ݔKpF1an$iuvꓛ).ݴi4X,Q/wiE 1C&:<[yksmz65AkQQ1EzlbX#IkEW+ PhB4%E,ʒ FY1lXiSD2F0a @ xGoU檰E8iALQFAYR$?A[ojW2\QIF(]m]yUrw5M.R512S&E-3M$ѶXLbLl2f$2`iDO~ A7mNjmbǕ^mUyyʍ59ioN>i66'};t38&Dc#ttZ-Z](6$BkMuvM)ڪ֯[ɵjSVƢVBUJ'`p|C8I 4D d%Rg"tM  MH>.imJsd z?q8H62:& @lI`E0u]4FѱI6 dM-k#b6шhEL$**eXد]יk}iBLRLȔhdm;[sV76^{ScrDWalIFbB%j6MXT((6$4JX’T(b6,bɱIzHe4Bh4:Mi6FIrT!j46iQm5"11,EbtW* rJ1uutN:E9t\&ΛutNg !ҝtC2!u΃Q:("f-CuCQڢA@ `Ң0MRJe2GRn]lcmSK8Z+C"b @p6)C4AYVf4n[w""F~6N# 6{OaHLGQ 꾗6\ܖN'[W.[,"fYbiMjA4,e4TD-2c Hȉ!,e0ҕ)Db1AA  `B.5mXFƶ(جV66ܺ[,b4 H6b!6d&bA+X} TFHEmh$PfA;[בwwTj~oa^7^]2wYdq!I ɶRYlYMD 8i .as*Xrkޛ1cp0P:2Yi@"f<؅C[I$Y .J퐃狰T)o6E{%戲-xƇ 25caVAyR9=BYH XyV8 @Pnܠm5 *oz5܌ at3: )=lU75X82-1Z d36:^3i},\l_Z<=\!J/G;bo9?˫Xuϱ#S ZL $33 lFFF-F5EAS{mkVg$ fB #q~ǵMA>P[UkIhՊ_._Y˽GأEJI``BdX6U)=uj*1m[cXlmmiH\e4&kuvrNcm6Ѵck]k96.H4)vE"Sl'qwW2%OFl`M4"#I`ŚFp#Ij*-{o/wǛ{m[B03l8>~_u$ $0$!#KniS~(,zVS聺 pշ!+Um켼cQDkQbUmF5EIZ ZMjf,"&I9q2,I#hd *6F_ZmJ@z:9MCrDW]*"$# dC ‹bě j m -Emc[fhH랃%$ у&h0b+cmclYR25D5l돮?_+)0LB Zw:,ϸëjP{_C??lmc&fXK^GqocxꚞiT]ʦwtyuVp iU V1яS;>$Gw&F9 @ꌨ| ZU BfHIB&E5R뵍m!(ŭU毨x㎜9ϻRU7~~G_NߥݧUcڿF؛v!=E~}_)*/*3ԡneo3l!Sk{&z)[0.i9GBҝNu*#D)!"!-(MWxutJHFi:lChEa$hfJ92FV#vy|a~Q{,\N Wv vھQWC/9WէRj IR>xh|OiţZуAb#~ɹű2ZԱbôûZUСGa=)=<>6>^Wv]F?++kΊY3* f_[mA2Z+ڍGe_B*scf>:i]t[˩cK9I6 DZ!LNpRLQ(HQ֍mQL $ Ulf3 s/s=ϧ=U1x@gI7o'ҺA&2RdK(C:P PE3 1ova{߇a`L$@$1N9ux=npz'5o;_ *k`~ tIwJos_/>B%JѵF*-~&&a!8L8=ѹ_vޓ='??gE1f MeKa02C\Εq[F~ڍ1bƈD[$0$00'e.O+K4Rz*J; uO5ϯm! 
@FXD}oN/Ox~]dVPRi>Fg_vs͟y _[ =ƴw;C3C ?>˦,hm;ȣ@,vpdTK2}bzӽx׉U]6{a`+m#0`$ 07 m "2t)bB*An D" 0 ݨy7Gc PBfAry>;3dhOc8߁[ Rc|'La0z?j.[ځhܧ&o*LC~K6e+DA-@BDkk>„}I{4%8αOXe˗$Xd7ʟ_ކoO~U2yXo |?\4 *a~[lG3=42FZkPLӐ1Vb>~>0f?8{u`A:o,h>0KFtΖG\HGmH Ѳ@0i 31mR["xٚV%(iP)ԱBBp'voзk(Xz^OjJ\>1;Ŵ٭2@n(] ,x>sDM<6 3ZV)ihaS5ʀw1Υ@.ܔBtwG ē?beHV3n >J-k-ic5qȸ+ ѣ:/FLbR4V,]rk'`EZSKr {09'F)S>035)p8;\Ƈ, `1S>yv#؃i&l B"0|OļWoxSLkb)Vm_/˕&S-cV֢9M5̮S3JZ@z Gk5;&33%bD/z\:scr( ⼗1GgB ֩t)b.~K" ͍mkeYm·jmKypjxJZ-l%P| e=E=, ;AIShHAJ6RX- M.Sg@&g Mvq;MmjNe DTV-m]n¡h$jֆlG+q~עJrI HINoOH mV9Lѕă-͘Y̜._cboAӻ1kDobhwL0+!3MVhK+"0jN$EOIIH |Lnʻddڈռ] &4Ƅ8o`oj]w0'6yu2hF{3볲c^zYɹLd}H#g_|AJ?_%O,di93p9bd"r\Ϗ^1iOssFa3o\WR\6lJ4,#9S--<%PPKfg?y47Q(ĸi umDMWt)*YMC9._(h:gi\mmZl%Y6ga~m_[mfb6wXcʩzd̄ʰkTĊAe$?p~$TC~U I$#6s`dP3K CiKgj'ZO8~҂זwx͛1gf,.Z儔<.T=Gp=utFgV״89lA6iְm<&͎_ Z9`lbWԲu=D1QTcԭ{+.:mE7oTZ7en[ wmٻ3[K¾-ՖYsС1.Əud4r&yN ){چ頻z2I8mz)| khI|S1 irz ` Vpg̀u֠YIu^nMs?ќO\H3qya5 nӲ_wAzSXG1}vV$>Hǩr>&.Ĥ$.(,3kY 7[6?W~h E~PiC*~e`w֍uK?T76Ckφ0*gjEe`Xc:O`͐\@ v(zʉrpi0g**vϠ>ύMg'?]f>Ay"E8h AI_Hs" Z[qfv|T o/Ko9kb䫇,j٦G(3'YUY V\y{ sպl:J{7QΫ]0RirdV/ "K{yT-(qzhN旍k:<RqH>-}3(̸{W]rt2 tlxm/1-kkJ=&muirE ՘REy.W8ԫYFݿo h4OK{Ixm3WSrNzl61o$X{ZZW>[q&Ku3WƙuSoʺ U3/ulq8uҫn^H8׻ny%1i+ﹹ~?`WȘ!~|[?ͅC} #~ñqA.~4uDA#';݈B>n7x;#LA™ L9D T^Wm͵P.CaQN:!pluPP * >pM4L@0^`IU: ;=K##8(&3%4N# !ݼbxp / 0b2"Vƨ+)HȄ`HGFMf[LKߟgG՞l$=2 qdFJ/55ě`IGh4r%oFr7J^އA};ʼGvEm&_!bH),V62ٖ2B4iR]t}nZF22K< ̓ 3=g}ߩ̛m3F@8^E aA0E(c{G`AJHAuFIF=XNgA҂M$ WNvTX˿sNlt(*Lu!VᑩE#A{Y 7`}}/"b?=Git$ΕfRkwV2<XбK# joI?ѭ olidxnLFo0=1}t( s#!7a|KuTQ*5&Dr -EDRQ *!h) Dn>Р=|A3@ x-\[F譶kUR()"Ȉ"A**|`|P$Tsh~!t@x6ŴVūm*-D_HZ]GCA [̞o9ug~VyV?C ǯw/DzK7 x~Cl;NC:а"e^Mnd~yv/EkH9JB &#q#程?v_z): i< !r e-;ߣο@Nե,wߊP 6?Iy='rڲrsLQqqvdzSq\2&ZJ`3J‡\PmlצdRՉy/W|vVc]y^0 7;;|s3'Œ{G?ޞ$S{o ˼ѻIdmPeajMUm?oo?zޤ^+HqQs9}IuT%Ҵ29\oSMIMlůɾ>?w8?1t'FC㿇=}8^nT1~ſ$?O8U=(n01+@!f=-ߩ(n/pDb >[}6KnyVR^KLb8ѶHP=7nÜb 6X֙ 0fӥ(y~I Ȝ,a+$jZXmαCy ̘`dD=tyk+]wk㕲ǛlnΗ!VoWEFSmO.:1I' sˏ|ݍ2jc()՛HatB9p١р$*hZ&"!Ʒ@N~]"kUc 7 'P1e40Tgn ~HjW NX4K1qedqY ]_2uQ,'Kn!hF+n* 6,՛=$sQx1\{YNUWǫe̒+%eZ˯8EU`&t|nQԿY 5 Z-QlVŴDA[ٛ5;"]=n@%_pNUk@[8CDP:حW3Сh&:i#0-D݈ ǀ{8 'ֈ@C9(*z( h( DDUT(6`"Eyn1@0 kDJ u?a:6;A0G-M8B?QP$4tqt lS3BL1xf=v/"#Ȉ>Uǰ!&&A ^}.T .kUxG8V~[v6#Q/3$Ot;}{)tJ2DN34B|d?sxC .Ȃyq=5Lу=_Cwx75v\GTNffP V.62&<݆vD%5FQHHEC,ǡ=â㼢\?'z 7dήNhlkko_zM_鶝pZFݢ v8 }8=  oAw91!E3Jhr'MnKA)qiڪE15/&gPം&._Gh   y|}vsǡթ"/uYRq`aPQXF IA~2uσ+[mvJa׬E) Xk/sz[Ut?avt HW-iͬrcB 'T(_RXs_ﶻld9`mI$I&c~MBD!a)N;7{sSA&7SNN!*,KB}kG8 @]Th{/(S&9c:eBpXF$"2(A'7X.1[>Ȗ\/w4LLӅcVڝ~L5N-~<@z߫F|jSQD8; `+5)P6a)!Gn{ fKz8jaqȇD0?6LB Z H6"}kR-_UHv D9 i0%7u%+(=X{/5*Qq辎&PO: oS-z'~gC_:jiTK_Io',m,Xlᴯ⍩߄RbC~ R8И" )D|qi =_C,m#1?<ޡb t١[`7N(5lLx8VCVdhƇ!Oڲ?>Y,1%A>r=PG佡IVU!Tg< j@zfv~M%ç.N %0Kg ". (*hD*ICh^NŃ'Kj߹7 ً}͆2߼0o=k.NYuQ 0OcIЀ~ZhnE pٻ,4<@#KaΑLkw<-7 ֫]ԿR bkt[@߀k i$PjKI6ŵ_>^]l{ "T\(WORJ41l.>y8S<1 n-gT}}٢fо߀^ .ep/Bm{uU0݂?Lmj&R93w45 f\ա)Z`2]}r;(p@|/ ]H8?>/"j"[¾PƁ>=WN4n{>JImXXUR"b6շote`{3bbZ/C/.Po>Jx~vIs{.[zӂ4cV]&=u6,QӴӧ~4oox(_7{2jc3N4al؆g44o-"{ s|1SUo? j,3f̽xVEEɩ4V0o>zLy Be&X _>.K(YSra400T5G V= e07f4yT3j$t# ˡ4haS@tqck6Y&a (_!@JʜQzKcfgJͱik=PH1?Y_vRZ*gpUP#5qݺ匐uQv쳶OyF}Ónm2pE|ThoOi O9dZ81f˶p{8P}3鱞/z=g~ߣƵGfI8} oXd Er$-\%j쫠Hc I{$j*X.~FT<b{7. GߝA/T*wVs鉮1. 
+QwG5J p"5Uq_t[ }YΞ\cWM'a]É䞺sf{ܟs7E>P|0">,/ 3|42VS:$[^1Μ ڋo𻗃s(½>Ҽuj#:J!zQ90SwDA%}Ǒ[sھ <$zbA bL~=3HFښ ]0ġ5С;Eqo=?mړƫ{oI~jIŌ>^aEmR&B|*FLr*!og/p՟گ ZET,2CǒAGk[7_ZN#Jjֶi}?Y];z2y5=v7`,IIw 6Wd/?ʧk4䜅r,J΃2KK\@Nzo23w+\Nax[D=]-q fv0[?u4ɋ;}Ә[,kl2eџKDF @qbX"z#s.)!FH$oB}w>8mHutZ:9 k(?lŠ2^[ +8Ym2Уh'J+k=S \P_1Yhd?=1'LEf "/ Ddmfm'{kO$ x ˭;nhܸwOa 3y[}}/ vs)#vi9P]̽m] PW(g('>yZum 'D\# v6_\sG}gAq#2Ǧw-E/&jiaSt5#;ס͵k%3tXzAE'yUW ZcV"MV uWD".GK:yWjU";_)u(onS\RlK@F3R 7[d#_w۳>._.x*H U)$J>F U7#@"{&~O@.cisxZ"JSbiBHA`IƇ <9iD\"Rt0TxPH'_. iU?"Ux(q:/UdELAGlEBp-I XxH.8'ugvs+'+P&1idD<@rGn"b* #$v{'ΙMj]ʎ4ꎫz/hAAU85|g2T1PeQEqN~ =n`y|[{*90>0 @hhz["owCh(r",+.Әs@ Gz^wܞr⚊W2Gi6d "i$4$l7ļ"8d|CTʌo'KBn'u{[ʹǙ .HZ$B Rd$zk@D7co 37^$j@Bho]L쯠?KFLnίA6.Al::~hp35?Q |"\ m# ߡ$KF@A;""=9R\܀{8T~'Uۆ|OF?w+w ;-0Ӄ,d`(1 w=S_*>"]ΞtsR:S}/e ;?Q6_؀} "gɺ~ y~w ?բRqMmCԿҺ]cC$H 6n66G*nz]cc}uo Z4W:{ x>a7 K_"<7~G G9\TL ]|dHW8ĸOQyZ-O؁18si S`=m= "v>{ WȰ M(>iт1?&k!yuwD"Bo`͟-ϖ}gS֎ؓY=< ~?7c$BY\<633 xhaf<|\B*X̵ɑf]5s'92[^T!ޭ=~1U|Ǖb\`v̝7{>{K(|iFÙ]^IE"=&((o/>e]:tՏ.iZ"^}8$Calu0OѹZD>W̊ ?=Hc|>έܞ.Gz,DOGCx%F)<GN" #/S¬R4?û^!_p,r`L1JCK w~Q לNay0+~pU _M9ioɒmGV#kmYwpcڣC/10Gqoe0Dv" ՋF݉b~蒛HX# TGn(V 3BOBqeGڥTA_h<<4( 8AXj*o[oŷKiB|u^Ǻܨ]HcQErg46rjR[\l,[/jmmY"x9y\mxNRJ+c/V44+??'p##QQݭ݇HpN* @c܊$: X(W?H5&Vã(8}/]=JӜJ .;kjF,Ix~ا2,EDz)gΙ*LiI€묜ӗLT>{C5\ɶ3"1Q|Bi.td0 _ 0m^ h]*7F`k[Mo]4js0J"l ] {ߍgyN qgXa-u Acf+V<ZĽܪ|jwÒ͖'5e,|BL=3Hq3|u4b{3W%X3`-i}j{,ޒԘ,Bs΋)tՁk]44vyB|! z9MU,:ߣF.c׬um햽dj0mU?̵#~`fcq`:e݋ã lZ+LQe54S# g}Mm-a].o mAq TC:.J`S{s}*`BB"p5q] ۓjTu9TD Ȃ>gp=on4g%\툆9Jp5#5hG=٪z۱}}LoB K cA@B )*zU}KVo!"Ng3|ߣ𾶌1.{ vd~Bjz4E]n<lªkP-$p /n`LF(U(Bx&+!#뺸߾83xpAq'տD19 FAUȡm*m ޘ R7A-Ѯ/\mZ/(0VW { duY\q)!l~QxVm=sM6znKq4XH#6e<ʟħGNM MZ ) Uf jápc!JP@(Pa '[C:M<(OflJ1ЄmG:-or˘@6TAb]ˠv`>̶Q ]W=z/a^R*w@ johvzj|re> {+o` j(SmQ#5+!0Gn^+ȹ ^ $6-l*t쀛؛R z:+nn4 *Sia"JԊJ0ߗ9~ :!ٕ6QzgwJhU2 zC.BDR&ÙgGSdsM䬠&hpw C+KZ)޼K|[HpҨfM bXEfi;79 &IekƟq!U"L*S-ạ! }>o9ᛪi?z:Tes?]{?:fOo{Q-q虆ℏY 2 'ԌC nQW*("hz? PI#= W4~fA, 2"'!h"cQ +ι\tZ"b*k[~/5ܮʹQ1YEn*1lkImB^7fƍw](bdf)WSUYamaߘiW}D 9ݦ t֋h7%%ӻ*]ssuW".]n&mu54d!S !%dQ U Ȇ; Э.F _`#;hE 6m OGoV[@c!  2ST'tmBqԙFk.FDOCn:w'F\;Kvڻ\Z 5>Jj?eCǼ.ص^'w܇4!q0ì?1DSl.R[& ww OǥXs78qe؟&h0K' FP@Jᦥܑ'zʿ8ح?<9?(gϟ:U+ d*Xq9O*%B@D`6i_wTw$z6.8xsB_13io׏a 7%WT*Al~BsZb&OaZL"!iDh>nv\PFEPQ*l@8pI[x-4{;jaK8&ۛQo0[ƮGM^ s6Ab`9v0?7?_Z? #sOŋ~]\굮B2UVSx8QIwQ?viJN֕_z?X O7^P"z]jP_w"iBMB':g_( )m>װZ 1TىDׂ"kGÈ?k~e/BgH35eJx:{|'aN6ɱZACn%[UJ Lt\oJ6tDq\%W m;=v^M/:rnֆvO!|`tF{ƺ,G|28NF4_kN;aN>߷CWeVn3v obo!ˮ2.D uZ^ƧHR۾{D ZcC6m|0iԊڮV]o6hBE2`mSV02¶sZXNDaBkSK8f5Ɣt;~う"!P/q9L Mo+dne]@nGIot2Zv++UWk2u@AkTZ _)=6s SHPQeQa ;G2]{V ^HAmPPgAnjHx [IvBfHdhݍs90;cxG<;1􈛳Nzy ~G嘨E2uIV'^ "F+U[|jj*AV! t j*d**VT.Ȉ{ j:B 0YPCTHknin݌ MZۖrX~(!jؒ3Gd6iԹۡ'Ψ*'𧛿'qBHbсhpJ$Y5 @0cAv(ְd׀?=Λ^JڍRHz4[3ЄS:5Nl؈qbד}aC\\5b:Ҿ60F@" Pu`o kPh{҂gv@}S|%)7Cq~j@>누>$MCx,BG0wȨhwy}G!Hsp_G~n^POg^(%ukqN@3v)=GxLL<ٝ0{nR_{F8!/ .ȌoNxܰNvJaߘшB(P60*k;>)P>L7:˕˛Ѯsmn$LԍT,%Pk m6a҆ _H0vivE !F7 [cm #{vn,RK,3Ω\0ho"1vV쪒*ԃD&=\ =.O59^yBRdBUfPhWc`zeOjDQrhJؠ$J)mm7s6]-D7^`hh a~qjz~UBB2Hxd6}Au!Ȅ!q"$ A3Hv"2;tq`dF>?c/ꍙr'@EMP O*Lzq֗ADzx0 ϰ*o"ly_#u_闑1!:#y{ Dž/9o ]f PT2ë-Y֭Lh@lr ʂBaDA+P@0֊ TR@mP"ֻ5z-=C adhU HIF#rpξ~^Άpr>1{mm3H 6ƸHp9?x`7'N4M-y~AJeAҁ07Pb"߸cF%ӏʒ $FǪgZnX#j(I}g^ J@A^ۮ+kR}_!ݭo]>G @"KG٨֒Vm7uM.giح&TR2wVXҵhSC9MYшWjsSӡv0@?[L}깸,fg̓$2B:]L@xlu&sECRF$o41SC8=@&~{Ӎ(dy0*W>DT Elmu5kq'|7q^>nػ wi%gІ^*X(\k ڛj*4l+~~‡3܎#fNۍ/,j?Q]|kk)32kzq,aG |FclB2@zׄ!$Uzxd8 ]ivvC*_XE$$"RB70]\,#Ρ( @ F0; ~9K^k]ykJ ۅd0>Hqل0Y?n^(.ys6\3ق%Gb=裿nlAӡ=w( >ghjYzV}Eڡ-g4v6,ۿsD.XGԬ*aBCmFd͂oљdZmtؠ16`FJafHo`\%^ZRDJ-М ~>x?^]*!`fBDdݯ5S?mB}6Io~e=G,O܈{sʷ Qf#eGot>UO(C0"L3^QAE>T8\ZPPڣnz x 6&!MO۪߰SaDf&l.Sy7j! 
26>t`Æ9|51M+1̣A-|x9Ĭx+tN haa~gиD 4Hr{xD(%%RB)X|OˏUˈͻځRhLBOm9/Ni%(l1T!-(,$>m=-/Yn"bnq @]_neS|ZOmBUKvx16U9S79fj bt̄*G'v&*"T$\eI:K?/BxAZvQs礳(w0bƍY5{ip4P\YQוT|m?px]%Ȗ'lְĀeq8 B }ԧy7cH&*Z!L(Q*mAz4Bl mos٧>>`7n:0 '7z/;h\TUDQp A$$YZmQVhڍlFj,lccT#ōh4*,Y1H[6ƀEPISssI!z:н{@V9KZ <nK33~ˤ[,?W*w]u|c8cowW?+`p7 U~낄nYBeᣦh6#f`0:rs~\ʪkhumU#>0PT}:/S-f5Ǧ~5bnyl2_]:tT)Xf]%Qe }qc.Nyy;uFY (Q[YDd᭛~k V-!In-Gn  D!|jXЕQ@^`9Y&E1Ad؃@珫dCXRöꯛ^^=[#]LYk/ɲ( ={餅S)Ɲ9Uœ :NmF='⍘֬X15n)D?۪7_͞nmyڈεrMRbm[sQַ^!'4PQ<o:S ׷3hN6'm?q}Ck4ޑAKfVe}`0hLᄒdYw5o0鿫óx5541- H*,f:]L'y@ l3|Ϫ9% {BK';tbQқtһFL^>EJkB{ dq>GSQXcȓB.Ѳ#♹t[z`ӊ : PW;I"?>LpAB@xԨZ3,~/Sʇ$ ֠WǮ $ " z 4C~wr]> G"_邟B)tU`=3~(^٨MȜh.Yk%Q`]o __R}'$%Ij{)p0DCEF [hִmWV{%z4cj16-k3nrƹ(-sI䚢˙27><ƚa-]~ScM1mmhQ_ -ָmm_nHS\sn xuwr ջ6+2jjʏ6FF>FYڄhcTW^nHHLMlnRWJ#V.Z## JH12JB4sV1'GWL1ג1GuD-QM q6^V\Iѹ̉Iem) |s|6=LLY}!{X,FP55xq(wvew$Rˁ~Z3u/cupWIg骉pTw:%E7,T34}o:,5Ds}sɔ]/ڭBuR}r5ztg1 A};Oj& 5>`=p Na.=MhÓY!+)dE3_̡=aS-*UsDAagMK4p< 49?e[G~aMOTtyyR$Q0"H) Z56acUF\b[Fb6H&a<[ApMCi3ؔqEmo0ȵU:Fb sM@Uz7`Ƹs3H}`ٳY87ViLƬ)ꋴ^a&!m [hV% o6 0Ǖ|= >7iM̗MNvf o&LR!G54 Xszz{Xw=.j;;TEl!|ϏeL?UcBBL/s4{oI4ԛLI*@4޼;,nn𳅮.TBl>z`65Hã9<֑"#6GdwQ4h;N{.]\U۶vDHZ5XdB_O)(>3Ga V&VSl' 8*u:ָ:T}CvJ4`k1B'8NR\qփ$3$]G㮶t Ê؎'ۓhA PB mpF |E),_RaVBKGM@vzw|,A66|ﲒxO/PT$AU҈ >yZ#N.2 .R燽DF[𻦄 %ptәnVFflOn1kunZ =P,B2zZ X.tJ1~]W}5 'j@Ў1V@@|AA4 {ڍw3Ɖ)#NcQ~;yCfIm`c" 陒ctk ۸# s9c_q]~Ȫ$2%/^\~umt6Mol.*o~Ny(O|X?Ue&L+୓JuA!ɛֺBk4=vO2rI28pd=ZV )T O)|e!?3yG1Z1i,w(9OF[4W fȚg:jt}kd}2{c-"~@PUF,G[MEPWG82N5l?tc/,هd d4b)Zv4(A^g<(&*ez \o_?c$)y/ށSehCP9؈exHzE9d~@.6._n6fC2+Rvs :H~2}{I#MLÆ䦰HC o%dwb$NgGD0){v? m`y d$ƅƂD חRe|MG_{t[c8k-$ !c[8.S1R{Pť-qJWl\ڑ5ip&3Ote]6Y|m8|?g3J7~vϴw֜z09щg_f :i^uw(X[so QţlUEF+mV"6Ƣ4b)T $!{}c',Ae>- rQf/'; t=?l4_{ɻ<{~cXɲhH"3Q`lXŊP&/g1cd{s]׷GpZodȼ+wTϒMM.ep0gS=ys&6vc c4C!oT2)ݎm:)>߰ڍxe-^>'-T5?b)U.(Q|}~]gY͘Vs? (TUwr֢<{m˴q.KEE8;sIysI T1C^qU",9a _Y5zE#EʃsEoАBC`$ҹ"#i8g7𡛇O$x<khoSA&|΢. "6 a a@FJƚjk^U{{SY*hثhxyC"pá7T Tt9 Xt& FTFÀϊ: FBd 4$dMRTZmQEQQeYd0-6 JTɚlXؼ,t‹gP/Z%!|dž_킪~y0$nOsr4GRъɩ5WsXHkn愃V( [JWx73=WO*|^sCau_g7cgC_ذ0P&T&-|{5x D)q-|\- "Uo]HIjVDc~uT1 SP k_]ŋaM85%\QN+ؚ,DsUɔ4q[%[$SƩձ1Swq>}MaGGk=egżlN5 _FY s%rI*0 4n]ưU-7v:PDGߦ<΄MS#=V Xqgɀ ^v~q$w'3zSGtwFƱkFZ+)p>'N;H~ZLm%y//inJ t&f8!.5Ä(.f#-ِ>K_v'@]RaĤ!T,Zmsg9j#.PֶRFۉ#U:\rȓT]f@ z9ݕEir,s z+$:\LZ֟=|^'o֝xzL֛}a=kcOgf}_nw&cFT?&m a@b~/)a3-*=[rgA$FlΆd)~4!%Oj6.9 e̯IO7H7apak$l*@FڤdՋT؄I14֋QcBbִIXEQ**[T̐T:zjsuZеWјlHIi:BF.ʓ>V@xCb")UK]PU=,$x9Gwy1ZO_X4?æ/$dz]_b=̨~=<:"_۩.UZfw1EQ7۔5ShI/@5m=- 21l`W34Ko:̅T~via_E4|&^h͏v!n\۲3Y5f s,~gɟ95`U+<(t^IkyuugR %(H"]p_H%@4j>]5lݩU{(KG.GV.1E<Ūx|H'"AO8sva]ϕ89 /Qԧx请śxVK˶09T'X5¯>Gk0aU2'&w߱@Agw8Y:)P/ `jϛYD,~.*2unl@byOz Uﶹ7Nw-!ԃH97hѣruַWG+&S&hb40x ҕTed:/lu]j`e-64}M BKEVBi49?siF6$lP%NN#:֜lX 0 &9#H1jPL4d0e,4SN$@ L4` @QmlcF\VؐAFV0*+CQIw5nv4)/S\`\νYzU DMX}ʺ G嫹HcswK<46F1L! b @4wv(  pf|R`@{06$Bh(X͆2Mk)b[| ƭvКn%}7y+.U>t:Bw ;Mx+x<(cZMl`*vJ^^-67}X9̀ -!DoH,@QЭ33 e̼D316EF]&2)KCkXIOF/Mr<lN/!zgHZ$X`Ѻ뱍-H5v )}gt[ko7{33*2C 1i_n& ,bb]6*q#@G{ۮ)L.;)G9r&E^R.z{Mz'<jM@wİhGndyśڝ1;W71kxĿՐWowux;ȵtk<B{^Ф^TU*\h>v 8JJBKh(سQmj~wp-l49RRf!ĔGQ e 9ʐ*ӦO~h5.j *daJk ^T';y`piI2Uױ& }7ت#S:8ScW[,afp@^ʫc|9ݓ4)Ѩh*e:i}>gro7Y\c6sgR#0VTQKb eUQ*'bp ZPaB72-̰EwRIXit+d} _LS4(tD`:TD֫ JJ^`*t[?v~.-/#؎( B~7qgGW.El3uR ˆ ]Ŧ.~ @U^4fN|wWV|?bcsy`@_jC>^VY*iIJ{eҠo?=cKěI6ɋh.6VH b6LX,U_9mŊVmkm߈}#fC9(\C-˓. r[2(`6`PQd+Rj}zmt#RA-A ̷2lvj^Vt ,Ɓ5 /D&.&@\׼ʰhtXI&i̅EH "H{NˈH']pk)*&m\,ÉTg =coHZo!~hRRuo蓀l?魭KLJr}g2Q܎@x|'~Aöݦ`0B!rcŦv3λ3{prɾ<ڱdy9&)*T`d(|"u>RH$Tōgoذ^Awx:I)tTB20sb wQRH琀V56! xe4(ݲMG9迮V]΍d vnϸy*.WyDyo1"yPoeJ1x ~svς .Mݻ;ÖvJ%>42>bҲ[b^Iwo!i. Y jǿ!^UCT#_._=V MU/hZ-=sFDgnNϟnHj}Kv~NhDV"N4?^wre! 
'*#/(FUf]6q RRwcoUAy# sYRnZ$%ފxZrnx{2"cL1Oo9dP$ JJ֬~}6ԓKmr/*љ g++}vfȅT҇{M$]̗:-mue>20()'WkCT;EQ޻ _ܜ0_I`h$$+z KBC(IuaՉ?WeUms 4/yɂi S{_;^k}*{t6(hHὦhSW-5j|:*WUHT)dR<ς[7C@ ,[FlT}AsJQVUYf KZ:jCMרqhmo@gw>/f1CN{sOu S?t'܁_5/h͕˂[7 e34#UZQQ*bl^.. '4vdޙ94 Їʏ._C_h4B E+m3A8{St^ƪ-dnM `2l[E~@M8ËNo5EvoEJY3yHb2`pr0Ȍ Ӊ4Ϡ0 HlGl4Ao ?'OZF PmvP凝ՃĚf?'éqeU/t]h [3'/H6:eGҡ)MG㠬ҝM\'q! L3=u#b*]t_,qm #8CqŭFL\m=$5!eϏWlf*:rGOQZ*ZW䦥C=P,V\oI879`S0t+R#K&:+s)=ɖ$d uKյ6w jufPܛfp=`pJy`=*v1r~U4d~EsN qQ. XqOtB( ! Lb~Kvm /Yf0;x?*0Um krmĤt5;KAdtxYu^5Oh9)5I Zt.g~D ]}!"$+|.q +d:E fC:Kw'9,hmbڿPXSs *a1. Ffm~vC;&k"ƭ.i`zRG5DHRRT9ÂkSc?9BHvtPnN?'Wn={rBj96F|zOS͙`BU|]$jo>4W"TIU},=Ñ0e{`LD%IdOlߓ{-Bm/EZ+Tg9&4bz:~g(25+ʏJVp4ocn.="wi᧮mr9Fvn{^jF1jPYVMDKq*12\XCʅ/nζxu{tq+B^urK|29V[s1c"ܗhNOczx^⦩{ToW^yz7֩\CJ0(H,(J"l\PaC6 p5jF@\ܦʅpiꇂMn5èuTqޛb:F5F l@m:n-qC ˹H'% t(2yBCz;}R-Λ;}_M]XLt O7@9?Ys~We4nb[Kj>&H0RTq޺1*)3at=tnLAZN\5Lus\ ̍(/)%V曌[iʭwXt(}*5Y0_\u)]."a\k³f|/ԿeА w7ozcYEwB[ dɂn)q E?߮|2b Ō()~1RhX~T@| GM>ƚi<8`̦A7M%;?1 $ _n>`$PM>':¡>щ:\͹{) g7)D o[+Qv.u3P$W}{/=;b3yyK|j'/*hu~ʫI0ۇ-EBzBorz5?@#CFKqU؜B$Ul[ )|l5ru^e(,7WQ./Ec7̜+Ʋny͢6h[Lݢbw WO b]o{Ɣ |tBa*j"ҕaw#dAFQet_}u? mg1#uȟ{f3ۨs̩_A9?DO/' .A>*ѻ+!B"ɾ-wcK=-3P(7v8pa喨;R͐A.)}c/<[dmu!!4- jsj7jp_[Z~AS{ ;>R59dl$!i^Z.S 1C x|Wɂ%1|E:91M[Zh&mEMo6۶kK5~{\m&Z h`j]'9P ' *)‚!Ib_.#AUDOAp2f2*}gsxuht˵9{x.iO)l1&4ڨ`U. \O#._!(Hu#_4 F7z3r/gSVsxTڵ9.7+eåZ{F2Pt2pnL?]/'fm,1hV7J)*=t2|1H-0u[)a:$)V9 5:vWkQW%ćQ4t:=E=_z26Z|;=?W;7'ʍ4w0_/!U@(OK:1_4Ks}E{n1X+[_@`0W^?ijAS{;:O#KYcUK4/0:偒3fmcS&'`aJm΅˕fM%#VHɟ[׮'ou3o!f(S: ~ǃ zlM?W{Re),vs&Ρ"cOgwTrJVQMif9/VVŶ5I#oPran$daQg]`B~e5-{S>S6;2 5tiTC&E<K0gŒ[ꓻ=,~BڴvƹtfkOz! ޝH~߶n!ey)䪜T!6tzw,0> ʬz=ǥ1.'89?3t*&Mc'C1@aP3i:_˅to`ՠEtHϖtTMZyKgxvb́yaR Ĺ_]zޛ)i2'adւRLq*5(nIp׶X9 g]r'{XsYF:2;׀!jreͮ"l! m=(x.avnlGw=_fcSY8)UؕĒI:dѩbArC۱ ɣf Hr B{MMӺyK t@l&{+,2(ᑏt Es\d1;ZnwV@'G]ӱr*՜=-0(c [zN/`AZHݓݒyQ2H iA{ZÎ 4']P"M,x2Kj-k j":TpF89?o4ļבv65:$im,.,q,c dYLEfTry[ T t|߱𹌟b1lVG&zy`1&[//o>? K ;Ep~#lSGB.?>ݵt l3VXWU594|=tuZ֙H B_tA~5`1œ:t ǒL&4K}Ԅ!?QGѻ62^@4`| N RlAx+HlS7E :FD /!i+Vރ#~FKx*C`mίB֙ҁhLk%EfS4Έ`YtMi!#(C*BX-]="ƨyapkdP|-GoswvVŋݍ}J?ܷUP1wrD93:yŃ]켼15NcE^_'8xxy=_=9R`d95]R#q.jfstyB/ O:qyi5ͦSq/)-oˍkT{uG~FJ:lxߌfլG ։VG+ag2F$٩+&eIIXMZy VW o#vC{uvoַTjhkkNNPYTWC%q0'4o` {-̵lkϱ$3_%waJ 1q~%viM>5˓+ ;0Y,/ǭNs6*GYC!%W$R (>#4 s08ɽVq:T~Cq5zlh7L2dÿ^xgmwW)'Ӭ(ԢK_f9ϳyI#4&\t,e< 9QEP + H g~y/6F[7 ܒ1cHxUq%;$@Ps0@0J)~LBISrG;*/}UۿJF٘ƚ;Ljzvх=eћEWOScx8SrMQ)4]qO#yY?`<` 1bQXV)#^(hwZI(oZE]7`LBmWWzod}_CMOJ0*sHcܕnDShR*sXZRJ0UxX0(3K  Y-b2crw=$חŇ3`6 w4*H9IThPc7ey<~ǭso\֎^cwx,.Uּ00Q ȋ4$H;Jax`kzR{35V&:g׮HAQ1s"˪)D 4m yA)0Lf"l\2s1 RHB83),b;>]?;f3=z۵{ȿX9KW2&õw5Nb8pTxԌ{_R 1XxEwfIo:n(p\QA"FF20JeQv } 5ÒkW8? Ui\S`箁YTŪ@)(SBTH"HwtE@UT)00TUaZAf)dU0ح0,&:h.",T1(1E*FbJ'}]H/v6pWU/-!ʕ4@ $1 !D1MUQBjG##M˲WK[N! Ivhhk.l{cc0x|e Cᗯs@7~bȐ$ [XL!RU0(ީ"TgH@~% K$"[NXQö׆\g)jXPՂa,7 =5 (4AP?vW_LTxM $͍˚%m$ٕ ӟ=#}WkYkhd ?5Lhov&FjxV><%AK}m-,[Ҏ/LDE&ëY+#OСq9a}=p+V`Twe>:uu\Ϋ!̴anΘ6m er4ÐT4NvzL_bjܱo?O`sf'qWdh?Qjz!J5 qdGt_Q \uuaϞFAZ*݃2/w"rb9+EZhc«PεNGDPcKN,e3E."5[\F6>R*}'6Lߓ:׷{ޞf{_4FtC bz9TceU1vM)+?bfMT}[/>nR&&~%5`\w: &j]KdlMv#C5˭xFsMSΕqfPSxx]x*Sthr>=laYaQSV||PE&1rFTPD1"nDy~$q;vMzjE6GЏ|eK󻾯 fa~Įr: >bI Y|"`fe^LsWސ k'vX3#Ce2zU]` j־؅]|)zZ"z>u ?SC~]7=i7?A֭?1Mxȳca[H5ŗJ.cWF.] C(yznE{ `M-yZҭK0B , dƶ iKV]Κ-44FxL۔ٳ]t4piTR4eh, Tkd6* $E~~3 hj1n<S%zf!m'a]O_={ޖRtV.>Ou'wWX Caf d22vHbXXL/Ln92c9hƦ6 g/9Y2e]&BkLsv`Jc0db=NMt2ii췝rV1 eh51$[Z8jdLdV cS `/S. 
h/o^0n|~ M;\K[.3̀hy}o3ٛ w'G?z;*WG] /E.ػ/G̞ YX *"1v0ҲNc{#Fl3őRP3vNN'7{4V\Ev\XoDBIT -AI{._l@w)IE7L5r&]]Z@Q /E о$ԋioyQ}/L j  $1( =yarD..cI {#o'UIRA<'A,Ts3$d 1F(2A$ t(32V1@ק=26- iJJ/ڟ q&:AJTP: kk9~ &eD.ZG6䡦WrqHW[HEC~Ww-xmFf<")]yx: `p-8!.j&K~[hd)]Ü0P!k9史Y jI)%UL8pA7 函0/k L# Rp U8 e9ڃ7NԔBdƄF*iw'ދquR#/q[V4}$vk&J\a;5JO>yb2a 滘C10}Y:YNGMz&n |&ޖcɐ{nc~f 3&`DҖʕ46TeIC2K)I1 ?]{.-U!!5 |ikW?m^uI#IWO_In-cwvnڔTKZ7(q=ϛk~I[Ӣ=/sy`LO/(wANBL9TJ{1w&FLI"&p]!Xkx_./#L|\E$ofz%j[T\%jgKI'AxQl9:DdlYQlÓ1ICɂH1IR2k,@Ѧ3D Q3V%JwPt1ف1Ԛ6|$u>Gj֤@,2Y-p2f昷vz= ulikJ8c΅t8z!cӗ_BM;NH+9^wfhfwn&,h/=zzJbaauj4i2;䅲z|[}ʾfjvxyQGS17FAZ\CmwXΔ#fpBH]:i#tXQ*YӢ2t#_п &!35̈'̋hM[pt8|?c~_9l4v`/LCV ƞ-՜Oc̋_9cc"KJAPey_J8m:}r Gwpg]2Ů3-Qfp<2lԻ6TnFife0AI Y!g)+e&tbdueU>;xiy80*hayN)#Yv$nYTF*TR\eX ,-PgW {Qh;h;qǣl&݆_oSnteGMnWs+~^5! lnuθ6gf*jDhP'RwO3Z{n욐׍2NB$)ݿؚ;:m?͡v 0**H:7&i. eFu> -A+MMDqn#R1QF()Q6V..f>5J3(GJQ6f6 ?~㴧xu-j?P_u;.>p Upr!P>yPڱgGcqPDccԶ_𢇾 J2E[ iF$] (U1D;D]Ys9,'}ܻ*s9Q!!IqbL0B탞Fiz? >>~>K_Y#wZioŪג61~^u^L_h<,8.' r\@R +-nyS'}VV7^YNXOy+}b7Q_Q?>7IP9Ƕ/?⼿zo^FKǵv! L pM  x} #irMvrd.^p OÁVX${E2X-z;[ [a"m|B3Qm[ JAf‑!YEŨzɩ utc,Bn rIR6صU^ ؊J/yr9ޮ$oW*i@r*=Ɛr8lh`Ԛ&$ T7t唡ڪQ!׋F\F<Ñ &TaI2udžc@ϐ29͘TQS~n73 )qHRHpP0bj:WSW4?FVrYx2{hI3JB&F!!IW/U$j@~.Lͭlj){+_JP |M2,"f}ސ2`Vmzb(Ig E]okT`PNNBj4'C}a\?\}'g5JhQ BPyzl_.,sFfuF f)013$ٲ77;;Ɗk0 sDmލ߽<&|(Ӭ1<?依i] ad<6GCMs~bjB/c3ò*{La%loz~KI8~sk0ufQ]WGM.ᠣb%A%mmXF X d*.r,']NjkF333^s4ϛn +?R'4eDPRIlhf fVc*ጸjBujsq;E(-NO)GNWp%Sun 8U=}-Uz@|j>Cgbfiٽc˪d(>Gzn.a_aTf7ow^/ڷ k^ˁ>?*'Qȿ w%Ăfz ȄSg߼/ʩ=яun"}N5TV MBb,ln g9V- NW͋[{,,0c?%f~$U#In(l¬2FYݢɲ[:4/sE0<woX1.  "5P#l 3Q~F1 W{v)ISCTfeY\vI2L=| Wh2m-E$u.H#XakZl1 "H!% !oIw loϿ28Z6BQb^`Pmɮ4 ժqad5^l, ֘lȩf[cK WK$`l{ F$֭w"EZ1 uwI(]a(k Y5r+h./ \ɵ[AK>ַg7c;z;rQD8ڶIoA/''AJKšEGٟnzOUk{G'O.ƭiy`_Yhrd=O2>7$j ITe(9>5 T<).rA/9V(tTekzV(I3" &M=sɰzRЂZ5q:5HS] ىFM-AҖ0 7vA\wLi2գF<.&Ȣs=z{VC!ŝCh STҗU#4qJTj6I_*>;[ÛSj6OD}Z<;/-teYgeEMHB>TIĐ8 1W6u \^ 龧.?7}oQdəW8ݲ6 L 'ઑ<['{sCRYV/5Zxx {*QEC^L f z]Z { 9$:S_ 5{z]]18_1(?5؞rti ]cQd3QY,hohD 4i(["$I]6CpK**$a`h$#IR, *V!I\Ij-Jfg( AY ”ĤZZ@Lh&1^40RVLBU,X,)[ ,/,~ۅoVlWJC)%Jn[t)w\N!/։j.BBe86DuXbihM0**ь)ۃF6Jq4;tڬgIO\XQhE#2#V*  eo>9E '4ݦۡ ~9Mi7ء|nZ-]n)Ds0P "}\/G̥l\A>cە\fN9k`3S&OhqaY ^WMwFlO(_ZOν;*&$srtH_z s'm.Ș̙7}",-+~w@~EV3*"&̂/O OD4yMhI7u+9Cv\_| F#dFi{f٫e! 4lg3cEٰʵou?C[hI"$/h0ź# wmi=Ӵ~P?.1}\8hJwW;>'6Au&xS8<:o !!ҊHY2FtITFfӑ]*^Ƴ C+3aZmr^`ͺy=+ln96*WBMҫiKm>{Zrc~W֍}~OTX2a|i^d*==msELAeA/pu.&A (H3Ouijiw\vqap Sڰ # ?%*w+]pO7Ysf\wV̜8FA) *xAu" CqqaTN5*$2C_?#*lWwIE\|:ɮ~lԪ wrW[(t0.fo:L5U:(LXh$:g}ЦLP*qR)AB߱R0f˲4 1UB$"TbV%2(@58t\K!AKTQ*m d#dPd 1T[n vMagwy~ra]+MWdpM6+Hd@pjj_K -qH_Ub;ePFL!Xn%պz4Rb510]`|V0ݸU<>zv߰]醔$Y@&q_:'*אqEБ eV(-6+z]7x~)[w>2s7w4um%A6Mp__jJYlHcn`)D⓰R9@Z,E>w AGn?ֵ4wZA?M߳0n s%_@gABF0c w BUIڼV56  XqEG$KQ[V1F4K$Kyɲz?ͯ6QvKDU>8 : DrvTF_HC{}V͚篢44Z,h,{vђ h-%yM3,Y /Y aiM/kBah,"ѡL:Z2V*җ6e`h[LVۛ)a6DHsڴ~,[x{OaA~l&=g5:a;4),lF %G8¦JgZhkhW3t:J2+]8v|c/]>h6f+[Oc9/5ca#zsznWX`;)RGبA@ri`# cuNc/FT)&| Z=5Ȼ CI'LmH里飑Ջ{)zH>فv+X}Bs lj1";`sr5Yka eB$A@BPȺTXAi9P*yz~|{ eϴ~KF1"H'qA/׸ n/e !93N:f=4x43P̭J̽~(jnT˫ǧNr{ L4*Q t> Hc MmF̣%EdY0Oy0I32d+N8]ul0`-,E( %0,`㜬,n- 648u"s@ MVFѹrM3!ȇ}řJ-Z656-EHcLU̎s)65II$a [>4_A=o6K :KtVU]_Ȯ9s-@d# }ČGf6ƾER b~^LGG ? 
MuFX6'xǸߊQ3l@@Kf(H9*L܉TVc~F=<l!2Cl`F"HCi T3dmSB&c T4~╯7wYh$iD񽨾- =Qcg0z{܃ˀ㯳 "T7:~ߛ~1㣜yz@<^'鶰e(fs]h`T NwIUWə;  e'CehUl\9$2|K vl{YGGK*ts`{|Bj/ (GIU{D#UJ4LHT l鴲NivzBs2CãVLAD˵=ÃXOH^QcCCC&cL+)ͤ.̭a4}^^GiE BH r^Ȃ+2 0F2{T2" V/_?:n@q YC⾿Q2q1YUO?ҢVvbL7A9L-ܳSRR֎ʋ1Bc\=7֖dr4i;WoARvI1 i[Ӽ;a,{yჿZNc~s_H)z-l"^> ?esM4oᶞַR ֈn2 uPQ+m-okZh6cy3~*7ckڔ"fvpJhu !L3nk[l _5dj+* Taz(F*H+1AMPp8(e@c'3ӰwOⰈ>MKaѨ^ 0twe)a1Ƨ|F͌0bhu,Nju Pp4߻T`lߺ4MQ].cϓl}V;?wNL!"4G{hhyl8TLrlI0W(>KI^-s9~SdDKg'2G Xx- MlXU5)fGCwЙNptmGAh]ǽUH[IɎ1H)vrGQQYS9P#ܕ{Ғ7lmtٿ },2g׿ujqsLT4]X?=A6w xPT0̵z//'=f~FeMVӹN2\nq4;1!̙ӭ䊩o=_" q\Yshu+%G#3X)6Z9v&D!$G2]TE&=FYG4\׃>iF4̈́xrجyV.Kzع@R@@HD4x?A*4s/CAGu|/sfoKX?bАg29|oܾh;# `$0dO4ʉUS)eY}Wj+,ۮ\0bD ("EL\+nmeVi,\*8ۈA#c5i!'8d=֢ڇEv`a22`q~3 rmy]^UU$~oBEmV-#(u)M5}&diW^둇٫ zYtp[,wWL+R|@cZ,@`.aTM>cOΚ`֩ڱaM@H\B+`ʎʬ)]mG,?dyLءa]R"+E&},V?X>'?aom؁>?-YiÇ_5sWAo)>$ *-TW{z+Tm{jYLU!bm"1!Mk0-drWƐȠx4]PN>X?aK/7E^s8&k sWjīWN|~:۷~bzyCSOۡ`^+ H Dz Vbk8WȈ*C"1ƿs~Ccg D.[t46Ȱ 7Bܵ526LX870 0@ &-VZ!2$HlY )#н(0i` 5aŠ?R\.-sl 2DU~^|dT[-ff hkLb X4ݜ#Ieݽ3UWfhf]ezt^WگYha͠*344 kU=za#]R>YSG`=^Ƴf+EnK?xaIǧ8!uskg="DFM0|4܅;nѾ"~qO:@'¥  ޻d] xHISH|ouC|L_Tye3mbhҞA? 8BLs$|GQ.2kgo ȷ꼋Nơ.6`.$IZOkӓ*ףCDTɰڣ#"k}6~sy_Z;|}76ʖ^w dHrUHx?qN}nDt4d<0ed*E ǸI' IC939_NhvfxWcNz{7md;w_b35`[dHQUb@7Qw| xg_F~jO*Ο_aH|MQB\?W뵰}|{KPL r}^mQ#m )-k~0D=9['s`hK&iX a+۵<%$۱̾{%8I.GNȨvKߋ u.ɽhW=/]osW)_{|s1F%\PZ$f2(mjt^-voh^b]P1zz2fiFL.޹+5HtsT@IO⇅);t=Fl!0EeŴZJ\V2֦9Ι޻#)54!1:&<3bDvpiӡw9/z^IlZV_BX+Nۺ<?I*3mkv[TTz39oʗ6uKH ̇]ARJOc5vtsę_@ 0@Wp2H Ċg ~tG8 pkG&FC[? + T<{Ě.q$ֺÞ"oh\f X,yb-oPo @]=0ͻH7o?ɚ</ZgsVSf֠]¿X\ԙNr 9)RhXnx]J ?(tC1 Ӗc.t2H*& X*F<w!'(Q^- E?Vz]A9{ YP$aw<\0Q{zߖf]W]g^߲K_yR|3d'Ov/ouesiqPPajPZ B*9]տ`Ə~ڞ]u(ld"\Ѥ&έˌ 2R+>sò*R[a`E3?"2fB@8*(?`@RYO$-M*WmeQYl~?WEfCQ9˫DGx%u|gM>j!W- Xv/bDO gcu9vvC+'Q{$ۗ%8Bg}q{0W I+H&L%Kg_+oʍFHh5S[VwNsSU|dn(2WqgF%Qf! ;n!S5M'ƛKcH\4"C`12ON,2@ RQ`'\N=ia$k8 pBDOBlk&Ř!iAo\JH! ﳊD")K`F$;#{q}֕1DkH00bE3ZDFWWE ^նɐ&QO61c.z1o܇vhْ!bro>NH޻aFקŰ>=WI6׀,lyFdI=~ RI jktFY+U3J%-c:Ga>PvrhS`F"Rr1S4B}2%^rgS7z>4L4!$9720̓Oz~Sũu-:o;~̦7J4$$xb9F%'w?e: 3x!7IhfwUDN O gFs9p;r`tdfIŲç&idz+ބe{xΌdrL\z (B:G)7;[9aLxCB SoQ1~@Fnr9_X'HfJ, ҳ'ہO+k27C{Oʝ3qQJF"V=X0zPHn9ݥ)>7?|J)={Aۿ!b )P)3? {(=#tg@UB0uzpYMq.Gŕ*<{y!E:ԉTP}?C&?66aZɭ1#6;53`Q(pEOBXsunuw?my"2dk9Q99$!dڊzX̀BHV:e<<nP1}iBtRHBឤh!붃4ypClNH `mY(Z'ac"a/BكY |}CN/:n?'Ot0.#-tb:W@X?~൐[w'nޘ$lÉB *b^xыSq:@~ Y<y1~R߳_и0OVu},=/V* u҄,CO}5jCЌh׆ PehRK(\dkOR-H xP3 53IU˸>} XO}34[9twLѐmnIl5wz˓D>ƋPq@يGȱR2+`)NLg=ly=k5: nRtb/lu $p@g !7c%M5$tWقts5L^&=^~zu˸ZҨ6^#ы*el<0}+f?Ai`u=nSQfm,qgU>>O]ȬYĻGR8>nw , t/s#zM~ɿΓ#/:/X}OcǢ'23*xH +OAssD!.ҷ jUI'1jD.Mwͻ?+9cRc-/g~{M`HBBHA>ÜΧhoʭy]Z 1_[ͲDV5s RxPuvǻz ?[&\PO.|?Ku>D:ZKI~gk*F+I IĀB@֥u]Q"ZvQG@~1"cr~]ّ$|Ie$+II'<[j{#ݫ\귲׺\LGGWV_W`x:8^uo,~vRMQ0'<| ̛by iq Lvvw@ k&׀<`cu\?@A@4{ēn g5Fօ_Ce 3t,ߟ{AsMޢWkRPw{=.sXd=Gse@EdrZo;\!f[WrV;K'(wh_3R4ħaEy:3 g (F7Rϋ 0> P^L>;%R' ٿG={:S$y+ ]bً>Ctj-] U73u7ٻibXlWVOna}sS7AsOr:?#wLMݙ|ϡO]ش_>>onqͶ WC q󗐚o_;h72"{^u|m}oC"HtTKz<,^@I=Cb|F\xa<~[:gslvE,|d g3Eq߹vޏ~f~Clu]5{ĉUhQ.꺬e@ĻgS@_$|į/@dSGϖsZ1w&s{ZES*U=G>SRauCD/m 92)~ 2%qtgO[3Tl?Z>jaJ E~/}*Fj섗MRzʟ03WF}NK wX_;Ug#t&ksYn!s!ۮp3#ø,}ZY7e>aDN.n\бgHEP 8k.H5_.PY*B:@H IEe_B_QҰY^8f+U>ᜂdXmXwo1]9i( g68д5:O]NQM[͚7B_ S&Q) >f*-`\D=ȍOCNǩzGj{/&ގ.˯}qm )QfͻUf`3 h̶[Bf:rt=)􄐽`&a IU">>LkOѷyY*gaҸ^"ʬE`II!T!"gKoʖ4AIK G{ ! 6CTolwA1CQCᔥA$Ez~4l[3-"$A!"p QQf_D]0TW#sGHzx.675M¡ň`*)CV(di r h]^ZhX獛YP5{TjijK( F{8?a<x}$PL^@fa-|qYFl,l!s5<ǢjFmP>C.3iX&l)T"XZD8r5 g5^IIN"Ą ׾$$`I^StXJk܆/؀Dpzങ7- /v/ҌEnÄ6_wjF܉)MIYhַH^^)+o`!c)Z-bp↤a#~Ǻ73Fp6X+ 7]r# _9Σ_J8G{1J Qan BfXz[chT77/ Xv0c6dǔ{GqLS鄭—/V!6Sk2WC(54&B{TSdA_'[S}4?S;p݂"! W+@~X]&pFΈeΖq\79]=Vlyixx1=w;U_ddd&jBz'zVs;{J8|YN%Y#|:-MfWm1} rf>1v?'AfJw80r/3(B f9u:!%`i,42db-cIgh(E٧* {&=E. 
Xߔ2f-E"Hm9c7y9 h]SF?)jnALT "LRj_q} Gl d=[oˆ*l82Sˣ#͟c Z`ђސV,i$cvHה>seV@'\@^a]׵::/Ixm^TX- F9낅SE jd'v|zlf h0LսE ʰNwC^$geIVS4sy۪o6**m i]/s!qA`$̙YF=J3!!"f310)n,)=eC4FZhy/,lʺKUiu}~W<ҳٺ-hw>I$ AIj)6DskbL/#hj$+(Y\Lp$6҈BfBK*#k7l/=:M:&Nͪvӵ0&GhF!$U=A4VgPmCLY5PhY . ̤^'yvdDPeR˒ /kS6t9dBsopjj: ѢL$,`fƅщ؏s<Ï3QCGdc-f_~MLKID`a&IAi PʄfF@OBE?? ԔfrL6~<9kNV7B;1FPs;o3JlɛYŋ7޶QFY$//1xXL9&Zי!CÖ 7#,;-u ,q0$fI#akc4 vIxxΜu_dlȡa{<dest*lnrvABf\K8cB:3&(~7' LjLHy9 8V͡9l39[:6'$59gkTl2`/}oZ@g}g?, Y,aLeS;<{Ccʒ..&)3!cǁ;>{#lX|msJ\脛?Q1%rks[u!Kx~6c e``_a8 x8Aw`NL:i 2Nᖿת58crۭiQzet yLaEte(Z7F}r,W?w;,4֑m*h%ӻ5AÃ]EEgYrfΗ338% BI#zahjSi&U\> Ǎz|i)|NSᙑ#L !+dx ӎ%-%RLŴiۀ)3&JF|g%nժ)hxw:LL;Gc5]kn {YRa_kFuq;1%Eo6ƪɫlY%]Wlʬɘ`Zg)Lܭy`WiVcdY+"Gjkt7TEP_[YX( M+َ4M%]e7q7^4ba lmswʂBMZ ldlgc .vbsHul<@>gMYf5y_Ň5eUFbB 5y&о]lQ|9meUvuS|+_%5j{{+D꾆o5F}> :}=>GR4uC0ܻnB=MwoW7YYԺ=Qt`l5}c(Vܼ& +"\܎ޱFυChs;˙d% ŋpkvV<`ϧF@yݸHu"G"GTi-}3뻻8W0&j~KM ;|Gn͇nO_g]`*ǁ}ωMIr$?7iSi_)kgs1s";\a%o]σ; E!vxZ59"q –*7 E!3!&#˂}gD'lۧ@O'x7()1Ux#LGgq{n:7:D¡"{U v?KA`j[۶mIVhYTSc[^iV ?i+bcjwQD 95]&o$fi,45""dOGjn]'r** X2, F$]վGXm3j15ۂ;_6{2H)"9:{͊0|C2""lbo9wi$a?ޅCBQ!UEE6fL)v=+S!{Td{ "H^}Lޯ{:joxNj|$%}yeH,76{ o'Z:z08ɩ/)d;S@aQ^h;YQRG&m&+a>EH,vh#?"h4B u (AaIu) zf.OP)b" D+ ;9 Py_7q7PjQ=OQK&QhǤ>{^3ypfzRPƌp?so?gWPfH}]77+W"+~n?ܗie/BsZYXz-/ڍW5g{$UsGN$xX=a=52%SSXByDL 3qJAZx JѹzZ% w,4nEpWxRӦA=RTؽo)hw} "R4v 3Vw|PH)Lto;+އUݛUvz:VikRwwP=A4kSvP];EEE*(ֆBI@ ҩUEJѢ:AUT kNۻSmu۷{{֑Ƈ]eYe.ZQ@43HDi@IXt˪4tR>s;J]NzݡwGaTzhw9Csr]n'+NUU=qv=li]}YTѕ@ݨꞫ*k=Ϟqր=S_g@.m02跻_]_u" pMwn۽R/  ('}= kB>齼"&tۺ`0-Gv@*BhJ@Z`kVITQ]¾Ͻ/_>uY^U>a]QUT_szcಮ{o[O@ozj=ٯ3tT.󈚭aX:us Ts=}Bƾ^/_s鷼eWjIDU)CiFvZAB:u7yBe@P(mM/\duBc=ۊFnl!$ d124hdL 12d&424d @&&A ! BM4SM#OdhzCLڦO=~!Mh4dA=@I!I=TOSoU??U=jқ)~'mF4MQImG=Cdh=OP4= =!HLPP"B52d#FȘ$M0OD$أIiLy'ɵ24=' iGhh4z&iha jbzF$DLLA@a2 #M 4 i@L&b42&Ihd44b04&&Кh("DA22hO=7TTGzdh=MzA40CiM2dh @ F@,8ӵJL N"ȸL;Go#8'|%Ca?2jԆPYc$"_:fyqAֳ{Gzukeo}p =.p’c(% yq1&R4JI\1Kd,Zq,I3#~M6rtK3 NZw9tah|ۜ9u,l( `ٓJ}Kpcԝr822;2M1t3X(hk68"7kb!Kn<$6 H0 Zu8Ařk-$! pA%ܔLN5`y 3 84Ad1JCj+J!BRv\ +$E46TI0h zk:4ֵ+9nvWnC h9f,[Zns!MF5zeyh DC粢hi"#8 5 !뭽qQ4I%7nF]Ip=dj9 L~=|L#f{Ҹ``[4ZMTȡB v4M1EQ)1TL<c`M2 4鋤"`&-A@6 "=8>[r30 Hf01UvQiO $9I|i.=KAۛ 84y8JCu|ff/1|X-ab[xVwZ+J`j!"&(A&0N{T{ɇ3{Fr/nE /JkU)A`EB$k( B+ÉEDJ;1"O!Pra&͆θ)kaY߆@8awKA10*Ce5ۣ$M34CzZ@ ja͏n#~7D008Ir8:hfN čͣ!{m 63(2cAy3|;"eo15ݦO $D,ER@p>$9or'SOLA:Ag贑h 83,6a[5^l'D)M̓Г$ R DG(fbh:eh CI Lc&'Sq#(C@9 lp0$6k);sd$d8X 20 X 5×C4Ac fib2B1(lT4D Te.}IC$e3uk8S&%P4X`2Xb:%hvQ (800!DL#"mF,$t-00LjJ׫j)wr. Fqʏ5}53nf&Q@ti4>JXpd'%yM$2<ǁJ2h`pTMydQC 0Ff`=H< xtƗL'NE@r7U|`wW# "^[^z}TP4 ߜ.('X^= r`YH'#1ް 1$ܺkM l,$ H /5E¹iL$dž`p` 4= `n3b)m:3ʥM1TNAKx) #3&;w^i !=MX01/<U[0,0*$, p<ݚ#9h6yZQ‰ֹVV$ϭm:ؽD8HzpV#02NCt A__ Ql8gR0j6ktUTxވg(Cok|NiT%jw}9^9AXcL v.zNg~<8/A7dT D- ޜp6Hw[r֐H{cjB`W`L"`@TEWTz L7(9m M\(ă :"6i46<%XG..2J'C4!ID)*m, hZV)YQBHl cM%Ml4b`ht0Z$̧څd$+X2R;0g,Qp|=Ƴϣ4Śq>.Ԛ!̠ 1.q#" !&iHٚ9`F&0`YXQNV,qIdD`փBnp8؀v< KX#AD,4:l  vlQz2Ôg 'SGitZ჻yf8-|xkj .16$mR"Ym*E-!w MbM1)B`Hb6 BV&a T2a( d 4P Bbuc5A!k7^s(эbQp9 `yL( F`8ßo@4dg" ^yj قf% !I CXB̑bziz1 ]OV^PRbHLʻlH@pRDHʣ𻛔 F @v6f ‘+4i&"f"oe%H\h xV7(T BL!P@])} Q#tª:SgK2u_r9>R]RB4QATA˯Xqy8J H/Vr]gIIadLC Ϯ9FLϽGtA3#ɡF< ßn3A0ٕ_@acm:`\@a%z7!1Qb'r~u,>$0N}}p; = pwr ! ٛ×8#CyO ;x`aBGmY"zeaD̒BM]=w4+Vyqi5ow_t_κy؎]~DJ,YC  ߶ ;d9slRdrַoS]{TAû7hI; rnoÀÉEpe7GST=ܠ*(<=}@ F@m)`y8Och$A8}C h 8M2Cj+^pޅ* sz|K0 (TZ(Pk gN Fe]n^|DH|F3ω8rrk3IXk^wl3ϳ2̢fts}ުlveRu͇rp*+-ĶIltڳQ%+Oe 7>W.v }Е(uϔ(Wt=hF^cwNuW7hT ,1U%X W%bljC] )(>iIj$@Q*0\-x8c ϣn2M}=N;M(fKh_xуi98ypԧy5J D0*vQ@RQ @М7pnab; <fe1CyJ:RŃ=Ogr5gob䥈16a{;+{`q/f}ɽF}8e uų[l~l{X8" FλIݮ,{^cX斲^mJ-7cu˳'[-p唸4;dƯ2S  2'P@S'{`ߌ! 
s,Cc&^B`HJ#qTeq\ ;6qfȥ6+4@2Cu+ hLtu*- i 9K#C!\l" oU-"UH%]ؐ[Tf@ ؊W\|)Qc)ƔT$w4Ii=Lx f Tw[ 3םಁɌ`b*RHv J(^FK""L,) FRCч> 0ANܲ#i4u2RQ%aZ\ {o="p 'mLGSgzqo\$e'h !;Q>s,>lI F* u3n{"]m՝X8D rSq"o5<Hr(+cs. 4Ri$(9qps&,LZxBY{O4[Ù 㩳(6A%igzlH+|&T`W'BuvbDcDp.:6*rX[hK $[S8RΆnb6snmpEa23"H c!*7 ԐXعW nF sRܱ᭚p|cifhXLiAnCNhJ *J * Sp7## H$f12ff]al1N |!c@Tr]|@DQIpddy)p9*qIE9R3 hgd)*TD0$cEDltX3MSmb- hd; i`019Uhߐhf$A팘'3 8 s[R$1;.REd rX'!120lt>73R%?@t-N#Sӆ,psLӷDZer= ɋB1M'&#vC 9:&dh4Hٜa=d 1]07:4`% ]FDq'^IPyߚ)V3@ʯ &RIs}[9ު .%qȂDC0Dy(15 irm[ )!;Pش SӴt4$a9k&LÂq,L VXEՉb 2tӕ ORv$ZdQ I3nLcr|0db dp1 / vHs"-<`i 2F a l8( =Inki<ZgVkPCm;tQA`sAt0p8qj[SrUYriMݣKp8&+[Utd!߫mf.ՐA+;9W:gtzouҨP_t52;v ݆c_r)DgF z[˧'$G8e򶛺 fbvҘ!b&bm!S`Vc=Tp9[c \k/Q&GΌT@fi#b\ qc))8bb38tF mXl f힢EHpAȊ [c48ܓIytgnb[w͸rIʐ xWx2ffIHH oA"lS%\QW8a#Cd3sp%Lj VGKEz~ +G#\ G 6 (cLLZK#p\Mo( @ & g̈́:\3#5ZH$HXLCqA`h42(H,jX6$].l_M.X `S8޹v2f:(1>dy+GdN؀&j걹tӨsTI}xz1K!oS!XB$n.NYہ6+KIqs*UbDF7!! p6m1t@k䴑@L59' Cqc耂-4lEȐMDG95P XAds.PW 9AhQ]D{Oˆ$ ]R+퐻ao+: CV/"(0fA9gq{vhd00 +)U'ɋ mpN@u`-nF59S05908:Ƒ!)bs(8Rc'&J$4r)MS\RWHfPbX[p9]0XH 895dxPg%4l].576&ƛNc" L7a@RŘFVL$9'ۢEÐI96tjaɷ7&H AA[^ApRZ.T9&[h>D]#p1p`ӆ%cAr 90DeUGc.M8enS)P P*0Wc&XrKad Qdd9 2dHXg{ۘ2'ӑj|T1BcVDVA!FɏɆ0e L@[Je7ݸq0m gJlI6e =ԩ q!hɣ[ĉi*,h$rImMio5ؔNTf$`bD,rf,:b-O.d1e)+nK @hs06l=i01I!i(S&-!|Az. {fL>96B@0 Htb=A'B¡Ad._z&l"dr*O[~F+`r\/pw U12HpAAik@df/)#!<3+%^U:8GE&V`"[30̄# e s9 7&ϕ替V)xOZRL+o2}KX7Lsp\o7`OWFcmDwNk[4R0J Z D";LM֢|~>ShZIࠁQ2gB:^Lzh8NH0 3;LlDⶶ:>©AlI8MpL9 uPl`d_4J8G 6H1dq"iڅ(&pȤ}&ɂi Ed`$'ta%dA5 n0NJcI*-S@5\QAB :A @,1A81' 1b1-PY) XbĖnU ( )6L pxb]%pw@`›됗P% )sA30] N$@L 4ͩՉ00miJG]Ѵ51>X$l͸kAKHc3r\`*MԎ >MJETH@Ӥds) 0k%r (sCRS ܖƌ)ig2A/J;b Sj (rgmm&{7:KK~`U&J  M˕qc"42ETmdUhQRy'Ω, +dlC05%E;8^ dAɢG`B$3A(`dfCa ]Y,T P| BXut$H!2,]0" U(&3@)^ 4(J&pT `ID`Q]j\\c@ćg15r qXM4 1Ī3b&90oŦc>('r|zIbpH 1r' ֤Y4lbun@#}y=X9qp*;79&!;:c05}ƻ0Ӿv2AbygÐqk1ѣJo|Ŧ, (PZYĸLvی l hbG&Q1YTy@LZM9"L!رyp5;&S%rU 1y.*bAe*3X8S ` KcQ&Ђ*.c6'j m9NQxe"J`j26JArF+b22f&9C "FIPC%(sah]5Ba˞1 l(kGChXR$LH'`Jx K DPPXYCj*R)GX*iRD!("X, ɺұ3 eFJr 2cXbF-HbƊDl80hZ("@;9A I eT t9Lcpi;nbE:tBB2# ɆCƝ٠@ƈѨyư=,%q1% 5CQT˥qͯݧׅ:0)/E3iD)AáXaQ3a[ t&CbiHbZ ]5cv2^J6o#' .6DƉ2<0{Lå cD u<F9@ s@D!ai#*.FA4 wC .Ppܶ0Hda,/r`tH 9{1E/"*6%q ͩ9!f.Iirc ܒ"8Tz$p9<.6bC#y͘d@`tQ'> q2"pS; R!`hb6c4a&:4Lߙo. @§ @I踉p-5®  Ӗ@58к)} IE #B`ޱ2H;RCHȚw6Gp Tv :(^Ht/)oBj$^SP'a`)54O xX:&h:a8 Z!*.\HîRB2"wI.rJHv ۠6KVY{'{ 'NAIyXC $fwu8ԥ&ըd³Z  kD$Lp݊OUZ[[ :Z4Ș -) &mZ#RB0#:]A!Bz`Y$ᩙ !#C%,} 0R|:{0EX,Tb ʧ9$G].0LX8S H.VN 61Q&'pWȢ Yrl09P4TZPJnA B50%H,`IH %:2<7@oFbP!Z2nT%Pp5Cd7Q(aڜ4͈щ.LPH Ihl}a0 Kh4UnM S%$ -*0P0&˕N;s+YSS(C2@lp0T ERj(s4 p= &$V 8<=IiZ(M}AVZ ,P`z &<^t6 70gn2< %2yeukmrd MٷZ.ߩʻ`tt'q9L A4)V6) @Lk¸R\mBlfPɴ2qiVm bR JUhCvM!&JSA$JG>/9o.82ɕEnrYfVa);Y#KsDrl`QʇZ#Pm2.r͜7UV+\(hsfCV_+eG (. d/ Tu0%NF:ǙBgǠ",= G&{8=$^ щ1z)-܃efC9w7X@4 ؍r3 a( }&u3 9%fׂ y'sh+$ݤ!'$4iqdu =Љ@e0+XàxL>HaRaj'; ṙ(v7>A8$,z l4d)4[ tΩ`]h BVZk36fٖ P(n4R)fPq`2 `bI!,`; . 
1bVc&2[5F2l,H2p'v H$CA9)Ec@f$g@PslXkGBEP(i$%4.M цbIq xሜ 5;;s'!| y:UrFqN$ ¹( Q- ,086NIt-9CwC 1GHđ|TL[1 Ll M6WgcIMLupeg.5B- rLvUpC VÛ 끜(ph cK%Tn \2Ț* m gHLfLHHQk)11Ed$HؐmT`ʙ*M sx0bS2HF4jp#4f(00֝P4 0V46"mh2J(l0NDӕ-2cbrҎc2WmzzSeT('0!b( brɒQr,a3Aa,+CA6306aID8 7ya _4HF0]0h׉mċT D$Dol{d5 &, LiU1^.R*)&h]I\F$=\r- %$v[]5,TPs  j8u27@q@`fB7 Qaaa5pYk q G!0nDѣ Eܸ0dvll) EbKhT ;JŋXX NK ` l02Ad)!ni^ lCz$ U0-N)aJZ]hH ̈c;yEdZ02l0 8t0nCr/8"gŀN0L8'0b IP`s n0tDC0IӚ0HxD 0pe@3s5f2m;`4iS @9pab`\9 bFe6`:az֜\Cxc&\ 806FPJfH,X٢d,ـ2h-aȁ6p"AT4Ak{ ࠄ\c)[P"JCFK pL;#UDpc&:g0nXXTM˔`9AdKkwJdtd[\&9`P2BFyXrA:E* .y&+9[H 9̌sy#ndj #v$c͑i#Gal=19 %R7F/y &"M{cHx@0S'Ñ*Q"2q@1&ba.fI 9) S Ճ &$> DC)O\^̪1O.E{ ņs31C_80E:apXl0Yh7"EE.8Ne@6d8E@`E!sH9cIH tQF& N )EnZmA[#ؠ'5EoAFГ(nޔC2<#zI CESDSjy/"`& VmˠȻN{ D,6t J2p`6e7#,Ft (Yl[#zSrug яmef{Cȇɚ=j!C1]jcN K葏!FGPhly8$0L8䊧μC %aV.@+1D1S,?(l s8r,#/BD4!Dܭp! #I]8 i`5418 ؠ0)وMmH e&$9 4 Q1TXQ Dq;JDVARmYe E[;$ fK+7a9D<:FUÁ]#RH&Lu7b5Wy'Z7̀B#i.PsqP EJrzYQ N.ac@PrTP1;%##%H)E049TI̘`[ ͮ%OJg2LurlJf̸^?ݱ1Aqķ|Wzyr9OngYRsGz`8'd 7c= -hD+RfU#@4Y)XQ:LqUM$l)! z\RSVof<>z0vIGB,t3Lw]1KGM`ȃRh@IO6l3^ФI!@5W}Q}Za y;iVxcMnCHI'w@y թ[aٔY'pz Q1YR@Q1̗9=:EC ΎRq%kBT@p 7rr(CϹxL~ʾg3:ImǷml Xu;1x+>$isƧ6cђqH#&08IpLMjO"8C"4MtÄEx9jf?ޥP` ̶}DRN~;~v;c, ˔C! 3wk z%O3'+// Qz[[EA t;b ɞcKoN "J=f >rJ*U$/K?k>-kv4CAsp{+VSW]UPy1QْEfJIf%j<8Gag;ءRmr5&9id!5!bȿ/jC3Wwԛy`ݵ^v zdžԛxhh@#BNUnuByN,#;ҙs,KtH~G3 O#iCd;&/ ::lKE#GgT$Ixk7:ٵRjL XY,/iIw?TS1uuI@zyss9V9؜ cgE,1.6qc cȃ(|XZܞ_@w!KQ /-9cA\o|3)x{X!65BDD=Ko2k qf=U]QX`̉@r6qȨZ3BLPg|b4ϙO#Q,:W RVzyiv `]-N\ 91;(!bîPsѿ&oKp$FYoZb'.H9zP O(7I)@獡x"=WMOuYLs'QFaZ:!9RC+Z疝Nxд̍*haB!a -ɱ9L3 3 {ջ̚Q@ۉXwiw&RjWaۨmg*pu@8RP) z@D X\oT=FR! '  HRr$6()Σrqoe"jns-rO8.2ttfa Ar_:T!Ѱ>I Ý I|cU$3sۯkʩŘ b7.< ; ˦:m܌9pay+rL%v,J,l2(?w%DAq@1^%AE\X<L%K!fL\ t{I0 8=âu { l\(qKC\y.HDm3p˔,yzg2;hek21ün@P/OVX%-ǔZ֣5fff0Ӭբ"!1C#ґod ē|"FNDНo7-wm pDT¥ lъ$pQqO녶"Ƚp@m.:;ʅY f%= Y\>tC@T ցDI26B`yc 5&127z 2%uBET m}}4ѓ;llRW2Mir,K!Je ĭ*3A_s96|ۙmC hc;.~r맗S!D&B8;,ɀD8/TF{N3ߕ̓(n^(`|-*ש4=vj`bcX)QdHYp<}n m&=Yax>Qޞ#n]JJ~)#su a5"%S  ;QtőT:8[#r;AE!XO5 5Z~2"W-U9 %"q3y;Jz%X[M .*cԼsZ<:ؑ_VNmƅ.f Jҡ5`z dT+}^Pgj\) hrk^%4zK`upI'~21J)0Fn-;DF0Cr,ǂ3|= (Ie8De&M lm>>|lS'C^$^XJ)GuZZ0F~TgA:gR0@ (%kwo8$w #8,H`pd.$dp,bҊ2s \5P0XG,Wv jD$S)cUfp$+DVA;y>x0`zۂF@50M&,CY ŔrT"0!f HpjxֱZza 0hcaJ\]gHx!;X;ATAn X6ÿr~%|[Ү;5DHYǂY0_ҢJbl]!C9U 1d 7""Ia=ƴ~1ܐëkg k >h3qmĠ2A鎮0_@ITS2 ,m$L.IEئAf ͎gM͵σG| ~Жʹ"N5PI+製9`nj%FG %$88R[%xwK8>K2c](rl܆gN$teH<ũۣHn)DD :D$0"k03-Y;xOTJ*&G>cl5n4 MNPj^ Q."-ǎq_=xp9.CTfKHCI3qP*$*M$F[tXȀEF{3ȒȟJ,daáÌtjk۬ZD o%׿5b<鑇3O@1HD# J zS+VY{>k|0/F=[]&TAFq=OsyBA̡iqV] i5*υG|9}]JlQZ2 ajcF%z MЍFN6X6+3̎ClhMԻ2hbTprDLJrEXS n%r#Hske ْ|=e.;U gDA<.x, y + La6O$;oN,l6lq18 `m5ԉES_ԛW\;]Hsϝ< E1ՏGIArY3kQ&X]fq=׿+ Y CIߩLhAsX)[M #Ƒ8;U!4/C9==p 2lH` '$!is B,SO=,wuTkǙiKkD[/"}{U:ݢarvh=( }Rg|@9xά[ULse_,׊Q> zGطZLw|{c!&yu<\j0KmSv3LVZfZO2 W%E ލ*v F鉩Xv PSPld ihQ{E=VMѓc?`5,cd5"6#~M6~4W)d03/#G7[TLjq5DcP 儼GxlF/δXK }c RWv&(6ذs.cBx(¸%R<{ѱb/\_ MecZtjT3+>F8{ɶT !'Q1?"=s5t:q"ٴNc;'cm`Gn߹a0h} yBR܋3ܚDiݦ[6)|ĥgn9)KȢϢeSGý̶ӴyEYhL .mɏA܇˲c0x`v *h  _AZUb^Nte][ >J DWrV~޵mу~l&bCzlF*ƍdO YD͟Vuch/FqNX)'bNaRnNg彲>Amly9% = HIݎN$ҟbo3c9RݹqBHp2C^yڹyzqa Kr (y).FSF%k{=Th=1DHǙ6b0P/fu{~ӂ"r6o83۳OE'TR^ B)X/ML;m}x;S UNL&Yסo߷%zNs9~N߲> [7j`0Ǡ d4wrQgzuM>ȡFP?E9kIm4Ih|X"(&Ti37yf}v'ܺVVAR.Ltqfc0Djc<Ȍ7!9gS%8RnHbŎ[`4;uˆ[oCx/87l`=\r`KTBSVUybeѸS;ǎ0c$=sxc $70 _=WTOk**Q-hyc~vNe Z=PǗS 7M'"҃ '5Pl8PUI^H&UO} țQ#2ad@N' Ks] 9he^Kt<ܦPw ,(¨x ƕ9v/=m^`E[,Mtb@hOI@YUr^u^ddLeOхޡe_U+miûK<Ƒ<9"=LdPkw0? =~FJr >9ibȰf&'c5'],9$NiY2s5:5 x 7iWrsW[d`,Ec٭ a;>)"; 7C-Y9{8 X.˜+4+0܅Y~s^ M !ȅq 8;v6;[pqf?v::tT*"/#==Fyێ%ݬf&F`iLtr$zX]< #'AoL!pIv :*?^BbG3-xͰ%8P| ADuFy{Ů0o9\ʮKɠ:>~m`Q#v[g𧄟>(VirgV> okX4u?F La"ABkS 2! 
!.:lCȖwy!Y™oتE.lY?½p5tJ)`1^\b}I"Fi\Fl JyDPY\#FAQZtj+ {gx%羝!h9[Yޝkt`ԽԼG;khE e2HTFb:/ }ŷ1,^#? Ňb,s9V8;˜&s¹{72ܱObdž&FWgD2\{D/؂\7d6ș]̏/9fɰ4vuKQB g˳\gGnF6.%%!0n8Lt|^̼de1$نt""kvpcc@P^G?otc}$!z˼ҽrpė,S g9оO{ށ6\rŒzQrK=TOh?rM= Ɉf ;ZRHܰ~"sSǜP(9 N{K36z}7k %Fаzzsܓr {ޗf N! k3#)2j6*M>bM1+lx# 9lz7>{Sr~'iOݾ>;~$Ito?zR8iyI̫f.FxQ9;L mӔ?N9{5ɽV9 h&6) `!&7eR>w*ȧ"l%T$6˝وv1ŚsG^93B%sav ;|fDVo u4Ox(J:$ }8uay:xגM,}c Ik9űAv8<,E}9Z CC|kv,3!  'wcqίS{{o}=jGȡ|t0p'$;0Un@Av=D dR+ײQI;k_֤M``Na~p̂9WuBI.֤M?/3Gm#r˓ll2 ~.v6Ҷ}3rBA,dc^9ח/OeVfU`aU.? e[ $ܚeLcqGTiC>ƍA 1I'Pd1Nky r=c\mQ5W>GVKIcq2SI'qUWKOHLiHFetlqIǜ=,)&4P(1c:N)&h#򆛄6J97lb<:;ͤm0,tJaSr)8!yQc-cTQ'&UixS;8 iP68bS >ҠL@?r6ԟ:|g{,S;L)fcr̕vC'wsKpNJ(CZ>Fdh^ftI^is3OPZ]hrN .T}DMQ=[l65q5;v3ҰԽ?knj,r =ncdpvrƝ37\[i/5^~JQ f)nb`_vv-ht۾k ׏__y>ADgz\=i}x̒/n"u.׼Pnډx&VJ8T~*s]ũۍ zGv䆖׿̢PrV%crhH,^a%<\~E4ۮ.BR {4veOmޱ=;u9YǮ rpϝMV4`mv _Qϼi@Of>&xs;5)eV/ #WKb 8 cPǧ7P6hMt1Øi2"! OOzwh`+'И^dlezyVZ|GytqiLόax9 rfӣ@@©"iPINwKk6\Uo #4dغ!ݭ{E|bִ;s=V稰z?Zs[C)|yG;YXeUՇ[F54sAˠ2;d;5fP~]QN`]ԈW*;&. ]iݐ_͠nm]K,6bL1 4>]UCjeo=x(+5ƌ|#=OS-F=))'S}0*Es;w8?/@s lԧT4)*5O1'KN &T1F$Ҍvv~uNxluL >h{Th G=NF:D;} [.Q:w;.}XES0g*k/nF^l2Nȷyk/.;ټ'*6 }Nz:aB3v$ѠՊ":'( r9rLgnLS'tn9]^d'|xED=Q `Q8DjSX@:A!n"ihAS DRTt ^.€< kD "bx | b bD! 5p"* @Я8nfGK; ^v>Wwza:؂ @“#3b aъHiuHQ:gԊ Ȱ+(" ZhB1@VEDJA!  9*XI TQ *PHA  ›W2U!%@@"A #8F HN:Ѧu2(%dN#k Bs 0G bP໅=hq0fHl I#qGSifhA5x4Y X7p["A$D VjR<ȱ hB $f! Ehx*M00  >;Zu.""#XMHhma"RDIH T*aaP@TBi7dAZ"DtPTE!* D@#4 ),R+" ʡBH5j()@Te*T`Aـ T,8NeD6ޕ t ՔoBB!SXx.$Qp/< ( 4*PDByTqV®*"  `83 '01pň"J 00Ɍ%K1\I9f5XdD-YeBЍ BEq Ul@08j"f(FD@zo@FA@. HkAqP <` " !P Q$UJH. " RDx܁)T@DaID7+C@Ą$TփrqT 2UJ-dAGUaDԒ!wPo !" Q(" )C>uD^ ր(v􈀆@Cn=* =wP)BB-*P5 Oy[ :@|L /)3At* `H@ۂ?d/b.#]c^|P^X(9F-J!5")b*朎E4%%XjXSB'Y'W.^~ U?^)ƈ"ioxz)"ih$1@P(#s^8!"b"* i"bi"$#4(vh"#C &yR5TP0B@P)dE H " " A.)@AD! Y$ER@Z\D'B+QC^sW"2H!B-℀) BZ! A(H/ _zPAHco6ش )N A@] *$*EUj"*Ȋ$ " "2"$bD#QD n6!∩"*QT_@7G۪8MTI@1CH A7"wLaPliE:J%E@xA6 0AbmJx MCWhA@DK"Cp ;pV@I9Z0P:AL2(NXl@* (  @ EC\R"c )$DZ)D G J=@P( ^}IHE HaGK"g2@-r0sEaulY'fn/ hbjv=F$;{(DҵDC¢ "Tj {&(b6Yl0PzTS@@րS |!hA4<}/Hh#T]R f+{ִ)a##Et1 t"~N@\Jy5 *0 j@=!00sB B@ edT7P*/Ҡw DGF",: 0 \0DPPs!  @@@2ڀvE^qQS(LDU5L؂CGx*BT4A]|7YDB*$pDPv@"T ન+*DA""AA@vHmE@/@b";芦`"uQ*<d9*@$4B)/ OXUEAD$JU\ [RCڀf^ wPl`F0dEB?ew||=឵>2$WZ@EC-ۀ"1$P4` QZċ8$h- o:JЛeLږڕ5m=MVBz=y=F(^K0 (|EFA\Nx۶eDG `H$"/dQ0F@, #" *HH l Q,$=.v9JiP @ԈR`A,8ik.XL40a_#;L=13h4WF!\]F@vYbCdO> *ٿw'Pr'ؽb1_m\.%Z83L#4ff *[8YԉPƌz^#Xp~1J_% >х|[!CA4Vb4=\lE2ؔe &,Uqx3諸8 I`ޕ^]βü@̴!2@@ B:'fKѾO{w JS1,/ꚅhlEI6,yN|X+S7,yO`fu 3#eBt@J(W }*tGsHՓ! e2A։H4P5?v@*.m"jzQ4d"|/goous yA6|Еǡ>wD'4)[Y dvn[oRR`0_ khQ# 7ɡkp,lh6϶`+M3QAa,%+S > X\aJ&'P""`gMK)D:Xcf1G\X>1Nv}* A{ّ#ضy0>QW=UL*5WN\E8[OY `ZDU>ׯ9i&3C|)icY_vc tP߾s6GҢ*>,|t0 R:SגN?(ET(g.BRΓg:܎ia?$,ZKS0>wr@@=cy/uO醘jSILˡ6`TTtMLhZ%k| aZ RARS>%QG&F# !/TyP㮵r P(>C7щ9Щ'.c_%.ASwQ<H6DJ zFljyel!"q}tXJ2 8~;!>FI(I\AlxW 7 6hQst4C& Q HC Pn2Dty'.|S;6P+ 㶃E=BCu_ ;ejʟh,9t=5[C$2q*a=٣qqbd f4>0! =J{t_FoO4jv0v;R#Mv ˃=& FQ0 ӵH"aw$#_nYLs#ͩD* ǎ¬{A1ܤ \ ˀ `1$IVp{C#NIdlExR D'a!ČQ%# 'p \/=%PcB=4r3:EЅ)1:,暕lg 4)11#|)r$&%8]&:k3R$ L2Bv$nNA3tv5XcC n>jqNM҈:ӓ9ݣ]"JgRuzR#gX5օ `"T蚆;RE)y'#Jٰ'#-]MTR~rl)aFLeG1)#b4j{%FN*M>q;g!^O^oƔxiZ'\.~93?(DP渫[XHy%R9^]Ntwm;ZŠZP ,<.s)8{ }HrTx; ǣj$!p08L-9#H?~EAbcl5)>tjR)kH^u(BNw"Nꢡ268"8M@qtڱ-D̨2ta բ>4jtc#;ke.xLbBk6Dj B VpdC C9MZSMHo趩)q:;!YTn G{ݗ{me9ks[x.`lX3Qcۑ QW_r=-1Qsr9XY&JÔN6NmXzyz\ N(4D I,X6$bx3> c!md )xssAys5}H2o` GAUcX%^j HXj.&ד5]a@&Rg@ͫHه zڅ.3'վGW:}s5Ic\|*0xm#Ifp*s..J>U,o.G`>`SXiXΜhPW~&S`pk fϦyT2+-V~Ƨiy`ɉ c=RĘoYdZ4<Ó"44#{NP}^U0.d&. 
j1l,5~HW'Z@$7ABv[pG9(A@YaM-1ve J)(4V8.`Dq8a ɪkH:A ("IzЈ@dD> (^[/nǭbEyB@CbpQ"r Ih[ءb y`nP0WN …nԼ6/9_]jK+RDgFgkYtQ6"39BGW,D{5Os Խͺ5o*4_mH$Q} `0z,,Hh rcZ B"B HH0RDEB bG & j;:{9tQC5A. 0%tǓ wde@voK%{cO;:|mSxk(;}KkN [VޓCy{ "*Q}Ɗ'ߘ˒qL "\!oPP.hNr)t)};[ f|SS,"Պ7l0Gʤe7u7/+OQQQO,u[nYScϟ* I/I>gkb!PΖ1(A*@ !FG8טPOG_)>T,=~m'`&}V8B$87HuI& Ap&$|00O&Jx9\i\-L']w45e<8dmx7m:pIŐlf45u`~}8Dģ D?LmʚC X*6ɗ֒[J&T 0HpCMi|ysiфqBWr=>PQrO<xtW/Tf8/8g-xƌJ N&A! FG-%ۈsN6~ǝ5sf9Kn$а^:SJay<FBZ ڥzlb4)"s0=ܶ!ˑV7+r—CYxmPhlܹů 1Rᖄ(2D{!'rMy|:sAm[u|yobEi/gb!4n@jjxoXQebg2X2@Ǘ3hCIKWl!Z3 p%0m@pW$D /`oоn0QJh̶ˁp+)OݝgtS , >4 ͸eWP´5{ (S%]{^.T;&k6xsZxg*JS|h&Lگ:ݻU]7#WL1 66BB^@9v_VwG;lNUZ5KQ\K e➫},`2ducw\#ۓVQJ8F8=!hY.T8n臢(̏ @@zFh0'H1%`I>_~m̊d?*bz땈25,+?{hwNDy8 ˋF$60 y ye>:(mBbk0 458M̠gyHyK19$Q@e}g:{_-u9P_]#tʙ IQ ~9lX=)& `E0QFUA H8aϑ,*$/4N K;i?r9tTzvY!Aؤ #H2@܌e ~v)ZjF72y($IR㤏o8C5߉հ'σݳdHϾReE@R$d1).!zC Xo99ۆjio<ӎAPƬygN+w~~N'b>͸޽=Q?ܯ3@Z|-Ԝ8,BT ]:NO#;`:,!a*hh '<>4wMHmFjтS $,u*5J) R>0\ D N9MHV' qb^fcn1\h3I[ZQbtc8pNN5dƠxiP0 B(;738a8ϟ}ik8d8sfqaK n<#l'-KD*&"0')UFS5 mHMr"T WrGOF@pP0.В)K%Iv HB M^OgvzI b=z+dv &VXrd2qb`%@, Pڏ6dK4%.c`lCM$ 6N(dc"0fq9XC(ƍ+ oqLh807 ^oOS%㸢L'@a2gdޖͧ΋%ӈPh:,g-/uT>lB$3cX <4M ԇaA(zX_Ɉϱϖ$Hy*<}vC9{M0vC_N0@9&" FB/؀f!Z0 5C2Cpf- G\!y` )Uӧ#F!yXB %1cA"E:c'C iF8r4*l,mYhJ&aq(MF9rn/TG瀜~F쏞8:`؉!4Šgt'D%ao=|n,CN7⻹>ߎm >3|f/龖yi|.7J 7T>afg!y=p:x/%|[q$GG_HXAKT"X`\0.r=u1B!1#&8Ƈ ,삏„o4P%  D>pȍ=p0@ W)Z$Lj RQs_YN0-'M7Þ:yiF">EÖE n:1 V/9H}mv/gEɜpڗ>? u~-@p2a:b>WW]}Z&U4~ךѹ\Xb؝wsTܱwTtP[!b~= ?ܧ -,7ܝ*˛&Ǝ= /6e/mx4ArSSINdn#mMrNNnSbbP L'pǴLNC[c ]Ks?}c|xK AN9,dh#W@``o9־ oAՃo``IbH $PW 9%Q(M8'I7,T2\$H2)pL x)K=)_k,$GXw&Dd]vb^$GjՐf-uCf"~ ~B -QVEHx58bf>%imJSVL"# ſU? ~MޞOֿO?LE;MshE-K.'@R m1'#r;\zM5sy8̾wtD/\c:>?LYG., '3#p{C@b'̄gVOm8d((` H !h4kE~WSEٝoFh&IkК_-%"r1aDdR(K*WOd˦zI86H"A=%F" pP@ĔH 2MGFxκ;38!?saPފ} ;@>hxjP{iqzi>o;ȓG~kkNqku$3a1 (@`8#F{tع\5Bf8؈"A`' RtZ=x]cb"y뒀6c_,35`z @JZ*CM4EI O?oU%/=}(o`uǮ!񍁏|cf\!n"EtXB*<^ v25  FX ,B(:dAL .+,A H. EH0Zf"aXb P2@Ѕ"L $ "HA 04 !L=]dϵ I_4Q (AJ/fM#1cT"B,SJR!,FK6UB '"ANkz[;LF'сR ЕC "*xQ֩14!+^CP0I"! "@đ B$41$F@0Fa"@ Aq>uEez$Fs9*PV4r0CK>ӿ@y:+[gF %PIڛjy< k=cg:Ìktw1%FbldwuA^B6ríPaMA֬G֌ qC &*O({7pAHA@ht`s0I,'7§  )2!O܁Kp 9C?ƚBZ |_%8(kNz3-axdbY^tE=9,fċ[w2:mbYF`G^9?`}BmEyQP-/@K͇5K38SܴR 0ͷ7O :A29Z KM=̲G%0AM) L $\>"D}"-a◂~AbhDӪ 0nڪوNab#e24=j-!J3\}fr Hjh1FUCio(E ƒ;CAM;[t{ pOEH 'ye V|}؈2 }N*e5 bO!"01]%ǸpbІE(I=G,OX~3cxOg-sE&jS zWVeX&TA *HhR" J1L({L 1:AP0UQT"4tT /  D@K"xw'V`l~KLJ`+q]z,01@?Y640WJBR!M%;Y^P]2%ep 9I+0|;) Xa7 0m]S?# H~>ezmݐtb{)x2(Jw9Iq\HB! X3x؊ mEBxzU蠯LV%JdDETv VAl9ǬB}>kt,M# NX8-o>ܔ8A>sF XDJfkeVʔS)Y̰7R9UӚTL OLQD5W)ut?!) 9fL01+UIpSM2_w;0TmS T3 D/GC⋱ ݃ rHѢW[F. 
^UNmPU6g/Jq5&2 mȮI[ɇL l~牏l-_SxOaF~d{p~};MC h)KF}0HP82vle,¹{X8\`6p&!6J@F)o7)W'`4=~w _nm>66qE8-G4Ar%o^zG~DCw8\ô}8Mjvp ttW =#ãXw5JQnWͳCMx9@é- =3=G0Ǘaۀe,C&pmEItJ3]`OJvž6g))'1a;yȰf{X e9Wa`\["OpVp"@N,d{sT|:G tg5=Xȅ=ɦMѻƠ2z={NW(c- ݎG\11^/A,y0j<4]:єL^1i~s$*AǠM=zD֯1L (,Vd/; >c C~당=A:oV*lw7\}X,4}T&"_#SGE]j5aQB5ADXFAD:؈PB!,i5 DAAF$ 00BA `Yw ?hi['&\Wn:P" aJ0IfbsϾB">~[Nre>ȯ=u4\^7SK"/e_hB=|ߣWYq\?:h&f|$J|#k9Zm#kK>,"& ׾ؕ߁_-ƥu(Ѻ8k7 ع4{0hzHي0u7V;~6tY׫=>teNv 4}J.*7:z68Q]VǮH/sTױ$ﰬ BI}s60bFbLD$n\_١Dj .ntjHpf51۶ΎhHa:H^ahSF%3ʇz : \@!cAdj(ABih<lf˦ MR/D'%=rbNгZ*Aa"9 2"% m3,AjBIT>o(FCqxx`Z# ,gÃKkg"Ŀ <&zф emծ0,XQa駅2#WWUlrN`TD:-'( @T A-=!H姟'I#NERLXO`[ƐE_҆ D}8Q>+bfV >*F܇x󖁶ԟ{* h*{?tRAq~HwodQ5l=Dq/f"~\DIP}u~%Px\w(sM,D.M"8,?oϓ$FG]80여{nCt_LB:@Š $N $e8r d2<V}l ǥK:r+ؗN__aAF9iB~M"`m~ӏq]0g'&E?HFoG}#.nx,{ 6cC*ff"_'ʿ޸eΦ9^=Y/tI#<0jChΤ9 v8E&J ;8F<=ZXcwi6& K=%H" Eo E'~TAfݞ`rZfD#dX/Al#|j@7+s]k>nV2oApӢAYtƜIG{}EISdH'="?ґT }^ sAU@{hG廠j U&ua0U5Сc5!ҟСS"|p #D>[)DNƖk덝Jc׈i nV {e؄Zˉ{,XQu.LxEܐ,*J`{.\abf^g/g[˯}ABi*195.RΣ J7xЌ" lC}(M̅34C7T.vEvDQ,EFNACGբ "r)')A4] T0"F1"^bu0vL8#$`5rfTVi_|h'Oy]@D[a`f%,LDTp+d'%EzW`|Rl9;j Q3!$;D }&Pmjwh΢L+Ǧ 18'Q:ZbM,áAgwfu-ƤiF0   fz1CsF34TXg G588I 4@^HsFzi^ޗ}pr|W'{9Nj{^}9_k/ 4.L-$K,u3QXFf$e2ڈoP{0]_caY?X8r , Pu*Z`B߬ymoa+חN҂(S.{㈒^5d^rə3,Ub0 WzhcjgPf_7/ DGW+ ?G_gZ!4s۠2t),,'98ᖔ*͔~(F* LjxH9AͺաKH詐 V>9">EZ䇵!pX(CDDD)D_‚)ן'NiĀf;bPivC( FEXϣ0!Ԋb"d"0r-RX$(T=lc`r2<#FMCGl&S@bľ!DԄ}d{|hd( U(@;##jP `<H"`F'(PWD3Nm9DLQ;BけʌW΂b [{Wy(r̊Bldߛ[L]7Ky|4sxA:H;ޠZC1t.1qa A;jȚyȿ`V(8Xk)nI`&@堟DԹE@րsDxl]'`r70k=q{ArRlE{)8IμqCFi8ql;Q^ob8@`ެFc:MCZu0P$.en\!MmO~bw‰@n=PAGDM?# W Qy3H#e'wub`yc1N K!~,T_Ab;. H]H,;w DHB ()(?6ϞA`хRQ K)DqB|1>\ߔS-,R{6_|b:@aDϱ,~" H $$|Q JJbׅz͍!g %Zaea,5E$; u_KYEYyf|-7\#;a:r\<[)nBx(G̡eD0"^ΚAkƘx5Nsx@G4CLZGݚ\ZOU `Ea@EHb8F B"p *!>9`yϾnDЏ-!).m=l"9ee◬S,K *cP.(R9#JV>*gP+edrm_%Ϣ%aև69$?RȎ 0׳4<ؼK 7F&ǹ^an`^'1QEQUq#荐(C 8&>ꛟN`B`^ZfkZE Atǚ2KJkӜےRy%d݌$r48p- \m EQ¥0U[䫑pab^DPA0ѫWT( 2p0?`|\c8  lh"Fr6~ C^&W/8t{@H ȊͰА J&ʓx"\g44ehyGg rwr7C@ }+V3 È7RvX AR"Qg2lzA$${(epb D}GՓQw Q)SvOxza4bIS`1w`kjjZO(R^+iKA(r$>BtHGtu'a[yNp DKe4b2Ik8D- %Nub_)m@롦*5CA j"]TT,} ܯQ3ذgpB޾z?+ (dS^[ȉ#}^ 7iR,O9pH+8G A =3e>'37}V~YQZ-p2{#BI!;AQ gLzd4@hGX3n   @щV|~GËſPWr0Fq`tٱV|`.寇{EkZUrQUOlHpT} .`$Q4z]}nd#l;;ٝfa.UUg?v ~k[0ER%I#@Jͮ.SŜ'ZG\. 8WW%e*?wzWHL3FpyM@-F2^YaTYLA 'HBxX3jS4V]܎(g {Ǭ0s?q>{"% [.$C1l+A rpv׾q4\%enkuY{ .C=Z) *)p4цHF1"9KR>v~T8Ko0_YGzK ٚ&#L{,E%tJbBl7Rsͦ]͟QoWR{ϠFs#$rKڻW}1iT/ınD%lB~≚h?ڛT<NnyzcBYKZ7Oqv,8PL( +"]lZE)6eH]a0-' "|"5J b9/;/&|]L,# Fͱ/8TC ЎCSs0(GÉi ~^Nz*$;#|h=): zHC#?QfC'izaP𡚐.OXUqU'ӕS"Qh۝)3B[܎<0yOP(J{4R:ԔoiprbΡZ}`ᴚpZ&}D-1b |nqQgh{gl"P : qňnA9FrLX kMxh#K 2+t!JzxpR HHQACDP8 s# S16Gqy > CF_Dd' !}q:1E B\) 0(i/qٸ1.G{ Szl6x0$ ~&iOKNȹGv GV }@f`D=0 Wws*.@! 
Jܶ{3mD++qn3[Ƹ dѩ|~0 +G cKJn cDM(*Smlqy711 R'P5] (*\nI՜D߾Xv$K^gۛxu؋5"<:\1'"HLL f>eB,:Zh_܉ȑu> %uVJtƖ[N85}!H>z'Z~˛&y.0>h@(׽!A$LDLRm2XdXaO>BR"}{b~cU8s!=lF֋!B2AY\Fmt>@w:yE)enױ5h7dSESn ALQ1XsN\4S6*%6lj"mIs>/h}r<>"2N-,+H%u9;ޥ| \PDWaQ*82^xoيN!]DY{CGh%{$A>;+Ep(A-_:+\-ȴAvR Z._v̿.Rp6oaPvA^u8djڻ迲^3s昝JJd؞i}NB gy9g@0\F ~rxQ4@5JBŅp5CXV.|muN Nf֋Ԅ&BWy@ (Q|ݔa[j 麂YVEh?W7;{mA< ~5dz-NWv5^&>G886F$B"-x@S1w-ky\^%c#-Z;qJf,4EQ)E- CfG 9}H20xp11dek/ǢFՌHB(s"{.kM#ּj 㾯w|W5km?܏_8̴lm-Dq`Q+FJ QAXQ s]Wm>o!9&Ëgɥ7л8!!$n](|R!)v=앚V),kCKYc|>lB6@h^ۑ})*K)%kAH0 HD!piE5 PTG#' <+l?G>@& aCk}83n+ds( 2Ixϲd k!T w`:q~N?4I$bʜbA!< |ڍ8<  hB#Ѻqp?]#kFe"R0A{9`iAaG^]d: fQĈvuKd_yCҗ8 WEf}AzG֔)%P毥K9AK`"?8B5R*`ټ(`pwN R& ^Pe\&ZLk2l}S_{w7"ب.G|0.>=PD',I׍D뭏 Dɂπ W6>M":@*yH`ߍ(s; i"ZtO'\ |S%=aчā^'7euP[0,)Ϸ]eJDOV8L &C'6ELp͠)Jm`d!1Sϰ;?7cVEΖn`ʳxUZ;rU99ˠMef~;pFJ](N8g?O7?V}1mZ̻9x$ᘝmϑ8BexgVi⯖jEN-IҌ=[F#HJ6~-xDMߺ>"_Z`̷rS)"s0W"MS1K84!QE$R{n`Q8%oj/5rpgZF]E|o^P33*6q,+檚;;Z3BjUGmZ'Yg~ BFkP׼ٞؗ''Ӿ~|?dAOHd,Z , Hΰ÷j֦2̵faНq}Y)ƘżX̆FFyQ8p"V"HK0~ɾZGƖj'jzpf DEW:V%>WT^ǂO͝;7}9 ƎQmǕ<‘KD ($&sNA.e]0 a"HK@BhBptNsW!=A쨲&.K^}Ԛ#?i/k}КP=ZO>G}1xu(<9d0YTMmiZ0l'-},ϰPڱn5.9H$Zn??^]WQdwc} xɾ{58"y>}/i`!kkHeBH":&Hu;&m"Z*ޯF]) "4(.NErI:dGY 2C&be2AC ( L9, @@KR6$H%q!,!QP @(""4 J$0?~9sOΰ> "Dg*)b'{]\36EցKѪK@;F\CId@DH}Ȋ|Ӈ݊\rHD +%B{>3'%3HB":7 olzKZY#r +$#:;?>Oq f\$s O⠀',-A ~(PcbɄ JԨ|9(!Pf6օ!@_Jf)\G&RH8sG!У"9P߁,Aӊz!d/&'gYSH/ł~,GxBtܱ|OLQg,땷!Q7dٞRyXYNd'SbXRkw0#;DR6c̀] :"u|{Gw+eD~5wc뿓֏T8k*дdŤqG Ͽx4ӻ.g?[r~ 4ENfEX NȩQ*Ģ'78G땖aaɣck h&*9隄%N_=DA)\f-KIIڀc(@AX^щ eOA%ѡBxؖĨ#ȫ:ۘomB`+.uC*&JQq (YM>"gz ҠeCc \0@Jf$yǠX6Q$!&k&-kԖ+n9PY/煜 d`\䲈jE !"; #v$,i)+HXt?¸" A* 2\f1l^oGn?7u+=Ys:9.Op-0޾#-R ٗg Oۀemfe `F!e3a?Č۟\;C~C, I! OSn~U᩹ VENp#a8gƽF !"$ XE'b (R<ޡ@h98JrNE:@r4td."B 1G6\<$bXBŤ帚<:v k&8ORގ6;HuA_4R/.2 8<]ֽ'J:ӸyV=Wf"SyIZ$⦅ YSR"t>4cBxƚ Z ͜*^}$K|nCy?<:srew+ef}Wu5*T .-VsD7"F@EJ_u*4ubd7l;Bi'egS(ⓖJ4o99 O=j5Nx?o–e2BP/9.\6u1ޒX9\"}?IE8Joo_Ӹu RA/)s4q(PV `o Tq9r3Ŝ!Q?~ D4`&9[Uَkܚ k,dOb]r+ܸ2^ױ(Q!)X^y?q`Gs:Ec(EMJ:^_ro7H{(E&pݷ?M"E@$["MޕM^Q;q汅iO9n-sN,FfJNE <| ?}s e~M [/δLu{Cg4FvI4~#/b"Dr1ȗ}K7?ITb7bHF@7ɘ0#q3 D?BvC4.r&Iu\^pLgt&(_H/_ GC Ġ1e^N(@td~R.;z!T4uy*az H5I^-o۽{{MJ@ôVAGbM5Ô /IYnݺ H y9ؗP·y^  L 621gB F=c"ok˶``*t k`pV%sY {'[g%;R%HHF.$ܵ4%m Q< qncaG^=X+0@y3Irǒכ7 o^2YVخכ:01dg >x k^xLm#-TV0d"diGfܮvHJ,^AcHsIDӚ"ȰbAD(&1}QU~8 "C0I(dHD$M$"~4)PDN=\EKFxcSbUjkܱlaHl`B᧶>4wP)iY7M[ Dbs~qB~@sGkK_}B Vb[,4gBBN Ӌ4ο)b@I e+ݐ5!YSʍorxc~/8t6G`XEQ:OP0T_,@8@D_p"*wI! t!0ءv/ɍƇzIl4aI (q毇103:`wLg/ e!@ AUj'^yǞ1s{O^$B*~{[;o { ,Mυ `2 b(7GBX.w}FH.fp&=g_{7?=1zߡ4Q&u[,A~ S$Þ`kߣ[+c- u0͈&MnS%Ob$P='WR=pz"t4@1jֆRO1P/ww pN0Sc^*ssZ{8s5CqSqέqa;jv ']J5e^+e[[#Lp?B&=8t\]C;,=S.+45WGӱ.i44]6ggv9}uɷɸK[rtwjo."66oSQvYO#|]֊8J][vkvX|lblznzej+5/pΎgl+Yp:+cpYGgo#ZOM;9*K-7\7syK?Wo]Mxzf-_}>Kow]z.uϷ\핽3~7qUs;6>s[t}80LÁOz zwf߼o.Կ#6/wOym 'ֈ)h~*='r d(;G-˙` ,@aq޸a(u=.2;]${Z[ӄnG359 dor7z޿b KA4?IVoR¿}Vw`Z,T{Ŏ$ofͪ;\'|d*l5_pyz87,0D2l;Q> bzQ#d x_# #&󒈼DK02;6zF'U͓s,D)4IA +0Mɀ1P<լ'FO4ja6n[4Bx;HKLx7G5m5('#̡ >K@Evv1`\ACx ;:%_\+AȰVemv͹|C4TzI??cv Qi=OPD K^d2RbcG 48B{ſGz{>|&YgMJbs  2F'-+<]_2|pЧ=H[3w/cŷf9o͑*jr%/M4X#?0Nr̒?O-Y|}tSKՄX@t! |x+˃yC,Oy$Zw{VJu3k4q"1$3CTz\d@SP R z,"mD.} @(uT{p QJ"=>H(r޹9ߙt;*$eq0m$=$(ʤ ܬ/x,UyұSLQ1KS4QUQU%5DQ@{!&a'`|(bbB0QdDS5PQMHT>WGgm踷>Q~TSb(O? !Qҁ>ZhªP{]@oFP'vEG7Ey<IQD*UXN@xyt=*@SDAG>Pvo T?2XQG|?e_` =5"p?Ux/lŸn_ ) )Ml5t}^׿ʋ3;( m@ppN`'"R( рtUHDE Q=)X,eu\snj<2k>c{ya~!z<*@TCmYyR:=?j(rrjjxT2RB|a1 FQkAř˾6"h d,Qf0L`11)6b\y,A EB*Jx 3H(`,E^4ьe w0f*KZ42RVt'2 raS)-a$F9Fdb"Vt )Y!h\)1o9lҊMЪdNf"H' >-Cq &WMHNixP@}!ڇuB" @4 E|џuBf4X| &9[>ɲ$KA_MH" 4CͰ}^`* wo~9aD><@b4f@TVEGf(Q >6E yTHAYc 5O@(R"EX0V)X0 F><4IJ NP@JU+ȉA"<_ƁC $H$R( }"~4>;$&H0 $@5!H"1? 
!r2&fb)"h(jO}O}?޻fqm6`*'|0euYܿso߮vӻNߵOF3=3 [yjДj֞#׋ wΑ]խ潄'MZm QMJK\cub+@2()@_}K' Mu}oВ,ɸz:9k^zs8 F FH$dHCHH?ܐNcED X@! PMD C){Iͱ(;/L*qK@R`IJ CH9t@0i~4iIH枂0$E zN\L n0\v?8@e)PI(hܿ9!a? s9}Zq~.  `)0aY?>Ш@@O];sh3Hu/xU_P?!xsj0#(J*Ƃ &6pDGIvM5V|KY% A$D0i-QcY$c0Xi0 @aXF:Y#0%@YL8G $&2f3p Ah: 305.I!{N/3=i]|Պ/OS #4p %K"\9q ^Fn'Q~}"!hՔDD(;$@O" W_G$|EP>R+LT POs,@X*O@N_R翟Z?7v_C~5v7^qZ׫?_㯏W K֗Ǜx?^ə~bz|o97U="H>4)B}LLTRA<@eK@MTPb~Cy:oj~?yy_Ŷ֦m辿%6X-bގŽ;肧O?@o$_YA 0;#4Hʰmwo|e6?_|?O쟏==5ӌc7ٛ_4.}b7omkp],N-u}1ZiW!ۻpջ*N;xq^Xx33;I(O}Z|ZXwG^kG5g)cq .y OtoP}zkxHmB/m]g]B`a؍Y@.7yooDZ/EGb!zQ{szCq}2g& &κ'|qga)g3FIVw~[rQĀMW f=0D:x>(#`Q:D^"𴨽Z (aMR"(_"(\`d$x^1ƆGۧv~nwI fd2` ϑCN #;?a0PK(~MEd>bA_y5H g<~4 8@9pnmw$q$3P>ocOo@|T_πIԎYk_ 99\tqflזrLg}xa˭q٣V\tziiµY> ]! `eGat@ ݻ,N/e_y'j8_Ԁ( R,w{_RO̯ _ѡ2v  HD6x+^IQA E'g,(h_{(,BQDĐj[.9oMh>K@q(>->`h$[:43nmFM@_DGwJT?R":(l ۦNȇAl fF$NNu΂Bl6@-vKNƟ>m}<J`$ P0:..F'tGt#D@ VzA=8ctG]<顀>`_ .eVćˇ,_[U%|&Tb\(g9j:/€"*ED X*m5'/D" O^;SE "eE#l@ o9fYC|=OC(@. =O;q,4H?RCǬu=H=L7BBU4ʌhbN/ůC` +V$(0BC9#F!3PF"H!H- ,0``͉#2 X.(թA3Xh1, L LFDX3 )ኄ"' G<>ۤpHFHf]nM|/bW o^{y&q?S8CyQN%*wC46?0 8p=)3|*VʱNWS&:h-^;߳P~:Yt$ ىI&/)۹h]s|f%]q_'yG8_\t˩r3'asfzoSӷ,3k~_={z~^}`@BЋ /Zя;6JZKoCAҼwVXjo{?bp־>'Gdw\?e|և'<+7T7ir~rtJ\Ohc<0d :2)*)Y 뼪l7 !d@aa~kd4C4LmqmLnfV>G?/)XOevSl~gt~f Y$9/QYAq2QC{1lK5Ct}xѶ=}iKqqtO78 _ppyk\_uV2uY+ģ~O+9&Z7[:t6Z^EgsWuONߞ[q3A#y^_ǻ-~&'gwC{n>M?p1?%kǥ}el߿N-7l>)wFvk!6w73;| \!\hr-rٿOIgnwc90 ?Q7M+Rt~?}Gm;~kSOϏ>Lz>zn(M:|vRpGs~vo_j%`$ !I3 N<jO_{(:۫U+z~cm-cn-&r7{4;/#k~qoɃTozER-w|T 2{q=7ugܬ''4`Ѹx`:8p~4ms{*-|ʴs<-G+=_ p2|$/PԯLX yA<=OվK>uZ\nU֏gU~,YҿoW4"ܞdy^gW7=_ãds7;yi˔o4gݗ ?u⮓;I6NmfVwnnsX\cZzMh7y׏-кpa"@d=2ЋXvaf^Ae@?Јd7)\>AGJsCyI䈉;q)_3;:y9i1 >J}Cg ȡ"d}~͎/ʗ'c|1 0 j N***O~ /6D'#m'FAb`>G~@C8k8ǂw0S"=!o9K !̮\A+C.\,؟>(Hq  Ax~GHl&\Y7O-$I >&tISIAe4>kEYFJ1mhЩ4%4d\l F!4"AiXY^RSHKrA2D)! ‘)2XCFD ( Ph4# 2 QOI"sE Ck-)݄(Th#-lSv*3 w 3e@ ŢɁZ$ hX@Lpe@BB>@l^&DVjVM߉xy?K=Jv^˗sU=nş{V] +1ַ: țG)d b{a;l֫ 5[8/~(: GB=i<՜p,dm59ŀ$*pcϠP:p"PA!;qs#}1:V\9vJ7qq1+eQ(m]t“jt`C;AֵKCah[ҡl)ȼiрϞhM(BBy,3uOKF<7X:8Z^";1G`޴~Ƿ|"Tɿn8K9%r< P_ G/ ܷm3NKmѹy%pqyeH>?{' 'L#&y_ToW5}9}0eر;h}_gv]*o_P".-H8'}X "΂t%Ay1~d`+c3r$HOgg~H}3I%j~җNsυ5$n2q'&†Tz΁|ߓt?,BBJ7fK?JbCpL @#?Mr\c|hĻ cyQ|jh28r) #5aU]ߟd8 ii3Zb>LBbUhZRRMRj`{dCկ7 @- Dff&bFM'hp]ofxl@nBx؁sI`wHr :R_$v%]WYU]`337#AKnCTwx.bk=>╶d:$Mؾת Ԟfl=U?.g#)c]]Tkͦ!@v2lwmzHQ@tOI*T;[~|md0v)8?m}[Z~~ط􏢑iy~045LL "(*3 SM7Vv+d J GK>W~=nY՟^[3v_2Rg:AI { ڝ'']'?sZ{> /ؖ:M_cS|)˟x F ؁eԜq vIMHıHf6~˭Tgr]N oou,ܾjN!F>ה@(h n$xo˯OԡOZ;Ri-kHn{=~$raPI~/a̎+s /y\Iu\B.:X*iu7Ok*5fv/*@qcQԎ,o~tJIRh]s!m0Qv b)!L'qiRTbldE~KbL2 8>|p-Qf_q@qu7sES0:|̸3`̻Î^ctӧ ;# P7X F7 gCȝEuK :ƿp +~"KN4BZVvvuܦ)@S̎W2Lǝߔ!8*3‡Er$-~7TJ> W]H]FO$FİW ؾ8%*Q(mS>_$$~"*鷩2p6X";j,Uvtҙ%8FQ8ps AP4=P`^ H~tPAm}E 4@HA$da=:g=ԀH@a,T!j!FXG@4!ė\yۙ:@t%HP0(|lu;~@Q|Owr"=a=vpd"?R0 Ί&*ъjnͱTl"=WнdIgMbR̴ R<)ȁxCvP5smq:` D=/"gAA] )au:!bl@M|MlS9+FHp|p a0KhN9) `;"85&sŚBJzg}a=Us&&V̄8)9(ЌS;aĻ)NCayPx'|ЕAVHjB#V9=L1T`:{XL{٥IZ3O}>jO7~GŽ `w}3D% E"%?!:Cƍ :XF1t+BH}Rx||7xǡڟRXx`kDA<#ChPuK *Aod\XiРG/cQ6Ok=O-?zN N1rRC4Alw"AD [+ú,i h3OkLo3 [k9`!6Vi,WisЍq_CF!!1Hdhm')4/? 
[Binary RPM payload omitted: this span is mis-encoded compiled object code and compressed archive data from the cross-nvptx-gcc10 package and cannot be rendered as text. The only recoverable fragments are ELF/DWARF section-name strings (e.g. debug_line, debug_abbrev, debug_info, debug_ranges, debug_aranges, debug_loc) and scattered symbol/linker strings; no document content survives in this section.]
HED$HD$!}UH=d}H,HL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$;H$1L虼H#NIE(HLLA  N=TXz XHLfA>uHT$LHT$HLEfAL91LHgH/XLH脋XHD$HL$L$Ht$ATHLL$L$H$WALAD$|驥?Wz WHLpfA>sLHHD$Kf;nWHH@SHSHSW:JWHHT$ `?HT$ a=($Wz HHDD$ HD$ fA>V=VIvEm VI~踽3cTz YTHHyf:fHHT$HIHT$8f;THHOHSHS:SHHT$W]={L91HLHL$CHL$|SLHцiSHL$Ht$L$AkHLH$L$L$&SA1HAD$龳=oSz RHLtfA>u LHLEL9Ӈ1LHBRLHRHt$H$LL$L$&zRAG=lgRz ]RHLtfA>u LHLE=;cPz YPHLfA>ʄLHHD$քf;PHHKHSHP:OHHT$ ^Y=;HT$ =TOz HHDD$ HD$ fA>hO=[OIvEm踳@OI~gAD$MNAD$鲬.Lz LHH6}f:#}HHT$HIHT$|f;LHHwHHSHL:LHHT$U=vL9w1HLHL$Fu LHLE=U ~>Lz pGHLfA><LHHD$f;6GHHCHSHG:GHHT$ uP=&HT$ (=(Fz HHDD$ HD$ fA>F= rFIvEmϪWFI~~HLzH'fA?3}LHH}f:];HHT$HHT$pHBH8;8/;H +HuHPHfX;HLxHLLz CHLtfA>u LHLEL9eO1LH3TCLHvACHt$H$LL$L$CA8M z BHLvNfA>`NLHHD$zf;BHH>HSHB:BHHT$ K=HT$ 8y=$rBz HHDD$ HD$ fA>B=TAIvEmWAI~HSHtH7MnM*M:MjM&MMOxf;\4IHD$ |f;@4MŢf;)4IHD$ MKf;4M鄫f;3M A AE1qFMֶz >HLtfA>u LHLEL9{1LH.d>LHqQ>Ht$H$LL$L$$>A/3z >HLtfA>u LHLEL9K1LH .=LH)q=Ht$H$LL$L$/=A2MxMM隹` M@AA81ir&@z *=HLZfA>uHT$LHT$HLEfAL91LH,xu LHLEL9@f1LH+;LHo;Ht$H$LL$L$;A0z o;HLYfA>YLHHD$Xf;5;HH7HSH;:;HHT$ tD@HT$ V=,:z HHDD$ HD$ fA>:=s:IvEmОX:I~HF-f;;-IHD$ Ff;-IHD$ gvM鉂z 7HLŧfA>LHHD$f;7HH}3HSH7:7HHT$ @=9HT$ =T_7z HHDD$ HD$ fA>6=A6IvEmD6I~)z 4HLtfA>u LHLEMpM z 2HLfA>LHHD$f;F2HH.HSH+2:"2HHT$ ;CHT$ П=,1z HHDD$ HD$ fA>1=1IvEmi1I~萘YW$z :/HLfA>LHHD$f;/HH*HSH.:.HHT$ ?8AHT$ =$.z HHDD$ HD$ fA>K.=>.IvEm蛒#.I~J!L,O $dAA81`cLeyM5Mwz +HLtfA>u LHLEL9j1LHR+LH^?+Ht$H$LL$L$+A z +HHgf:jHHT$HIL9g1HLHL$bHL$%*LHz]*HL$Ht$L$AkHLL$L$H$蟹)Az )HLmifA>WiLHHD$hf;)HHU%HSHh):_)HHT$ 2EHT$ Eg=,9)z HHDD$ HD$ fA>(=(IvEm(I~͏z w&HL hfA>gLHHD$hf;=&HH"HSH"&:&HHT$ |/BHT$ e=(%z HHDD$ HD$ fA>%={%IvEm؉`%I~臌PNMYz %#HHdf:ydHHT$HIL9c1HLHL$HL$J"LHU7"HL$Ht$L$AcHLL$L$H$ı!Az !HH_f:~_HHT$HIL9K1HLHL$DHL$!LH\T HL$Ht$L$AgHLL$L$H$聰 AMz  HH'^f:^HHT$HIHT$f;R HH$HSH7 :. HHT$)NB aAA81Saz yHH[f:[HHT$HIHT$[f;8HH HSH:HHT$w'=wL9X1HLHL$ HL$LHPHL$Ht$L$AgHLH$L$L$FAQMܬM,AD$֠Cs閲M-C黱CX~{~CM髫HPHCv--AD$r2rfA?LMpAD$ /AD$[~M9Ba1LLHL$(LD$ LD$ HL$(LLHL$ OHL$ yH!1iH|$/2WL2GH|$ݻ5H|$蛽#!TANH=>HM`HD$IUH$H\$AEXAE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$H$1H~H_Ht$IE(HLC9H|$ں9H|$蘼9!SAKH==HIHL$ HD$H$IUH\$AEXAE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$H$1H }HbHHt$IE(HL|zM9O1LLHL$(LD$ LD$ HL$(LLHL$ LHL$ H5.}H|$C/kL/[H|$IH|$诺7!QAbH=;H#NHD$IUH$H\$AE~AE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$H$1H${HMHt$IE(HL M9!]1LLHL$(LD$ LD$ HL$( LLHL$ +JHL$  HF, H|$T-| L-l H|$Z H|$H !PAsH=9H,\HD$IUH$H\$AEXAE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$H$1H5yH|Ht$IE(HL LHHHL$Ht$L$HLL$FA^ MAD$v陦AD$MmjHT$r'MeAG頷HzHHH|$ if?uHT$(HHT$(tHD$ z AHRPf:uHHT$(HT$(HHEH* H|$'+t.L+t"H;\$ kHt$ 1H}ijH|$+C L*3 H;\$ DHt$ 1H9 Ht$ HTG H|$b H|$  !nMAH=T7HYCHL$ HD$H$IUH\$AEXAE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$3H$1HvHBHt$IE(HLfkHzHHH|$  gf?uHT$(HHT$(tHD$ z ?HRPf:uHHT$(HT$(HHEH(H|$!)t.L)t"H;\$ k@Ht$ 1HwF?H|$)=L(-H;\$ WHt$ 1H3 Ht$ HNEH|$\H|$!hKA H=N5HCWHL$ HD$H$IUH\$AE~AE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$-H$1HtHVHt$IE(HLf;Lt$ C+H|$@+H|$+!LJA H=24H_xHL$ HD$H$IUH\$AE~AE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$H$1HosHwHt$IE(HLC-H|$9-H|$,!EIA H=+3HUHL$ HD$H$IUH\$AEXAE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$ H$1HhrH UHt$IE(HLC7AD$.QMpC)LHAHL$Ht$L$HLL$>AXVC4LHA,HL$Ht$L$HLL$꓃AC؈LH8A HL$Ht$L$HLL$薓AC܉HL$Ht$L$HLL$WAof;Iu\CMXLH@uzLHo@HL$Ht$L$HLL$͒ACkM9}1LLHL$(LD$ kLD$ HL$(yLLHL$ y?HL$ \H!LH|$R#:L"*H|$PH|$!\EAH=B/H|HD$IUH$H\$AEXAE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$%H$1HnH{Ht$IE(HLM9t1LLHL$(LD$ |LD$ HL$(LLHL$ =HL$ mH]H|$c!KL ;H|$a)H|$!mCAH=S-HsHD$IUH$H\$AE~AE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$6H$1HlH5Ht$IE(HLM9t51LLHL$(LD$ LD$ HL$(LLHL$ ;HL$ HxH|$~fLVH|$|DH|$:2!AAH=n+H4HD$IUH$H\$AE~AE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$QH$1HjH3Ht$IE(HL" HBH-5M?MsLH>: SHL$Ht$L$HLL$蜌ARCRM(MhChhCLoM2MfM際MΚC͝Ct'H|$茦y'H|$Jg'!?AH=~)H{HL$ 
HD$H$IUH\$AE~AE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$]H$1HhHxzHt$IE(HL.,Lt$ LHd8~HL$Ht$L$HLL$ŠA|~CX~LH8pOHL$Ht$L$HLL$nA@OCOf;Lt$ BeLt$ BAD$MDM錞AD$XXAD$AD$DDE1{'  AAA818LT$@LH7e>HL$Ht$L$HLL$^A5>vC>C1ITIMԜM If;Lt$ Vi.Lt$ XL9 1LHHL$HL$0H$Ht$L$HL$AFHLL$L$L$衐AYHzHHH|$ W f?uHT$(HHT$(tHD$ z u-HRPf:uHHT$(HT$(HHEHtH|$t.L&t"H;\$ 4Ht$ 1H H|$L6H;\$  Ht$ 1HlHt$ H3WH|$轠EH|${3!9A.H=#H HL$ HD$H$IUH\$AE~AE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$H$1HbH; Ht$IE(HL_]HT$E1HBHIGHEIGH]HMH fA?q LHHL f:{HHT$HHT$cLzMVA?LAW +HuIGHfU*HIWH}f:jHHT$HHHT$*fA?fA>L9G1LHHL$2HL$H$Ht$L$HL$AGHLL$L$L$A>Cq5qLH0uHL$Ht$L$HLL$FAu^CuMPA )Hz )I,tHA f)MPHIsz H)HJPItHHRHf:HHT$HHHT$f9HHL$HHL$HQH:J +HuHBH$H$AFHL$Ht$HL$L$LL$薊ANz 'I)ChI z s'HJPI*f:yHVHCf:^HSLpz w!1pz 'IoIoHRPpz &HJPIoHHRHf:HHT$HHHT$f9HHL$HHL$HQH:J +HuHBH$H$AGHL$Ht$HL$L$LL$EAtz H%I)Hf:JHz %HlHlz %HJPI f:H5IC&E1z vLzPf>Hlf9IlHt$HLL$L$LL$ H$ALL$ jjMjAGz뗃z v!f;LJPjjz w1if;uE1tjHRPhE1~jE1z AhLzP8hE1z gLzPvgE1z fLzPfE1z eLzPef;%i2iH +HuHPHf{t f~ iHLxHtfA?u LHLEHLahHD$HL$L$Ht$HLL$H$Ah L*cHD$cC6kZkCAb)cHbfA?.LLfA?LA !MPH  r(ACA81]*K(E13 AGA81!*Coz w1C(HRPA !H! H M_A  MPH Lt$驱f;eLt$魱LfA?LAXXHeXC AD$O9OC鴰L (XHD$XA 4 HHA  MPHfA?0L Ht$ H&H|$H|$谕!,AcH=HHL$ HD$H$IUH\$AEXAE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$øH$1H!VHt(Ht$IE(HL蘸CE1 ,A&A81m'E1CE1;Cb==CmmC,lOlC+HHD$.EvEmUDIvEmS%BI=vEmP?IM*( oAA?81AvEm$M.u LHLE=(H|$t.fA<$f;uHHED$HD$!>H=%HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$H$1LZFHIE(HLLĄLHD$f;Lt$O AIA81Lt$f;;Lt$ HHD$L I@vEmQ<4ICٙ `A= A?81 9 A A?81T b Ax A?81 }f;II魀f;I鳀H@~HD$U~AF69Cl $AA81  JsAp A?81S #sE1)E1! AHA81 CbE1E1R AA81`AD$AF6AF6AF6AF6E1YE1Q tAqA81?MAD$Ht$ HLH|$ZsH|$u!f AH=LHHL$ HD$H$IUH\$AEXAE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$+H$1H5Ht(Ht$IE(HLCE1k ,AA817 IA.A81LT$ AA81h麆E1>z wC1 E1) A'A81ǶHRPE1E1` AA81C 4 A~A81E1] A`A81KsCL9HD$Lf;IIlf;Irf;IIf;Iz HLtfA>u LHLE=VH|$t.fA<$f;uHRHED$HD$!,H=HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$脔H$1L1HlIE(HLLATRLWHD$z HLtfA>u LHLE=DH|$t.fA<$f;uHHED$HD$!ZH=AHQHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$H$1Lv0HIE(HLLALHD$f;ImImf;Im rAA81rLt$if;>Lt$if;)Lt$i AA A81 LkzHD$~zHSHyLHyHt$H$LL$L$\yALt$?{f;^Lt$C{f;ILt$ { AA81@{CxTC#xFxLHBHt$H$LL$L$HlA鸿CE1Ƀz HJPw1Ƀz |HJPs IAA81p"C1Ƀz 8HJP/E1" бA\A81驱AD$ܰ@6^@6E@6,@6@6Z@6AAD$3C5qXqHHD$龇HHD$ 3z ;HLtfA>u LHLE=hH|$t.fA<$f;uH HED$HD$!~H=eHHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$u LHLE=H|$t.fA<$f;uHHED$HD$!lH=HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$ċH$1L")HIE(HLLA蔋钺LHD$ AEA81bf;"Lt$}Lt$}  AA81 ABHQQHD$fQHHD$ Pz HLtfA>u LHLE=eH|$t.fA<$f;uHaHED$HD$!;H=HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$蓉H$1L&H{IE(HLLAcaLfHD$ A]A811f;I8Cf鉄L,HD$2,A++H+f;IVICf;{IIA%RIf;QII镠f;6I雠HFHD$鱙HHD$ &z HLtfA>u LHLE="H|$t.fA<$f;uHHED$HD$!8H=HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$H$1LT$H޵IE(HLLAƆĵLHD$f;Lt$ AYA81Lt$鲖f;5Lt$鶖f; IYIFz HLtfA>u LHLE=3ٿH|$t.fA<$f;uHHED$HD$!IH=0HLHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$H$1Le"HIE(HLLAׄճLHD$f;I魛I際f;~I頛A0Lt$f;RLt$f;=Lt$ AUA814CoNN 6AA81S 4AA81 AD$@H HD$ HHD$ D z >HLtfA>u LHLE=kH|$t.fA<$f;uH HED$HD$!H=hHHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$?H$1LH'IE(HLLA LHD$f;ѻLt$ ^߆ AA81E1鞰AFHt$H|$(zHD$HL$L$Ht$AT$LMA@Ht$ HFz>E1hކ AA81AD$L`HD$`ME1m݆ AA81[鉨C 1݆ ǣAA81頣AD$ӢE1ݮa݆ CA|A81z HLtfA>u LHLE=~_H|$t.fA<$f;uH[HED$HD$!5H=܆HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$~H$1LHuIE(HLLA]~[L`HD$HHD$_HHT$ |=:HT$ ( =;vEm*_IDvEm'S\ILt$7f;WLt$7f;BLt$t7Ն &AA819z HHDD$ HD$ fA>=vIvEmqI~au_HdHD$҆ AA?81/HHD$ Ef;ޮLt$DTvEmVIz w11z w1HRPHRP Ht$ HH|$KH|$M! 
AoH=ΆHHL$ HD$H$IUH\$AE~AE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$pH$1H-Ht(Ht$IE(HLp颟CE1鋟Ά ,AA81y߁͆ KdAtA?81E߁$d͆ hA A?81߁hhHt$ H/H|$=JpH|$K^!IAH=/͆HHL$ HD$H$IUH\$AE~AE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$oH$1Hl Ht(Ht$IE(HLnChE1ʝN̆ ,A?A81݁C_Ht$H$LL$L${^AWAIJE1@CL6h_HD${_L^HD$qHSHHSH^鮣LH2Q^1Ƀz HJPAD$ /AD$ˆ _ACA81p܁^Lt$Lf;&Lt$LHHD$mKvEm KI?vEmB[HIvEmEIvEmCI1Ƀz  HJP I AnA81]Ё" f;Lt$C DA|A?81ЁDHHD$ Dz ĚHLtfA>u LHLE=_H|$t.fA<$f;uHHED$HD$!ԐmH=HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$_H$1L#HIE(HLLA_铎LHD$ AMA81c΁f;#Lt$ALt$A eAyA81΁peAwwE1׍f;IwALL9 AA81́aE1yE1qHt$ H鉳Ht$ HlE15 AA81#́mL%xHD$8xf;ŗIkwCMpE -xAA81́xLNHD$af;QI锕I違f;6I釕A H黬H騬AD$ =iAA?81́iAD$0SAFh鏥Ht$H|$(AHD$HL$L$Ht$A$$~(AD@H=FHD$GHHD$ FC_YY~ pAA?81ʁpJ AEA81ʁ݋AD$QHHD$HHD$ %HWoHD$pHHD$ }of;I=CiiLHC|`Ht$H$LL$L$IhO`A鹉C(`H]HD$]HHD$ l]z aHLtfA>u LHLE=Y4H|$t.fA<$f;uH0HED$HD$!͐ H=HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$bYH$1LHJIE(HLLA2Y0L5HD$ AA81ȁf;IIG aAA?81ǁ~a 4jAA?81}ǁ jHHD$ `z -HLtfA>u LHLE=ZWH|$t.fA<$f;uHHED$HD$!pː֍H=WHHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$.WH$1LHIE(HLLAVLHD$b AQA81Łf;Lt$<^Lt$'^f;mLt$+^f;XLt$f;CLt$Lt$f;$Lt$oXLt$ZXf;Lt$^XC yAA81āRAF>eHt$H|$(HD$HL$L$Ht$A,$y!A?@1Ƀz HJP鰂E1 eAA81ā>AD$q鲂U ?AA?81Á?C=W AA81|Á^C߯AD$RuC  CrAd::1Ƀz gHJP^1Ƀz mHJPdAF6ZAF6DAD$AD$1 #AA81C HoH_HD$]_HHD$ ^z HLtfA>u LHLE=KRH|$t.fA<$f;uHHED$HD$!aƐLjH=HHHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$RH$1L}HIE(HLLAQLHD$S AA81f;}Lt$s\Lt$^\f;^Lt$b\HlYHD$ZC` Ht$ Hz邬 ٤AA81鲤C3ZE1E1E1I ؄A}A81賿鱄ff.tcIE(LptcIE(LEtmfwIHEHPfw5K4B4ff%f9v BuCuH}bu.fDE1HhD[]A\A]A^A_I8HufH}7~HELPA fJABuMʇu L!I¿!L$ >AtH= (L$HAD$cI\$HEAD$$ID$(HmHuL9LLT$w&Ht$LT$H4$LY HMI4$LHl$8H,$HD$D$ GHt$LHL$(HD$0D$4HD$@1LH gHHHpH9tHHD$%HT$IT$0LLLntfɇDHD$H$)f.Hl$8H,$LI$LLT$(HD$D$ ~HD$0D$4HD$1HLPfHHLT$q% A4Aa81Z7L$AWAVIAUATIUSLIILHXLxtHt$0LHl$0H\$8=DfHݣtLLDvH辥H9=1HHf}u&MtHAHDHUHt ::@L $t\HX[]A\A]A^A_EHLLhDHX[]A\A]A^A_@f;DLL$(LLHiu11HX[]A\A]A^A_@}ALL$dCz vkHRPHz vcHJPHt$Vf=X=HXHt$ z HBځfDAUATIUSHMLHXAHt$H $fm_ Ifp f=fn f=u*HtHHHSHt: DH$f8u9HtH$HH$LxMtA?fDf;u)Ht HHtLsMtA>oH;$*H$1HXĔ`A$fqf=fk5Ht$0HL}PE1HXD[]A\A]A^A_fkwLfL2uH$H@f8tH$f8ffDIPf:1HHT$HT$H[ f:I IL$ f9'HHL$HIHL$UAff=u%Ht HHtHCHt 8\*@L91LH”u!f;H&/HH8HSH:-Zf.HtLHIT$Ht :"fC! A H=~HHCAE$I](IEAUH$L$<9tMIރ<9tLsM9tLLSLLZ;Ld$0HL8A[! AIH=~HmH$DT$AE<AE$Me(I]8IEH$IE0Ht$HLDT$M|$fA?W LOHfA?M8 IT$ f:L HHT$HT$HHEHq\f;RH$q=HLt$0H\$8t#H $Ht$LD$0HL'AHtLHM|$MA?AW +HuIGHfkfqHIWH*f:+HHT$HIHT$+A  MPfA?u LHLELL̔u LMIfA?"HtLHMMA?AO +HMfcIHHH|$Wf?J^HZA HD$4 MPfA?u LHLEH|$L˔u LL|$HD$H|$rYLoIf;?H$o*HD$L|$0HD$8IT$f: HHT$HHT$f:I IL$ f9 HHL$HL$IHLDLL ˔u LMIfA?HtLHrIWHe:\J +HuHBHuHHT$HT$HHEHT$Ht$L.ʔu HD$L|$ILqLnf;H$mqHD$L|$8HD$0HH\$@pMH $Ht$LD$0HL{A)?H$f8\f;IPf:a HHT$tHT$7 Hf:I9 IL$ f9W HHL$HIHL$Af[f=u%Ht HHtHCHt 8$@L91LH:f;7Hv(HH HSH:%fMxfA?LH H+fA?MIL$ f9HHL$HHD$HL$HD$f-f=u'Ht HHtHCHt8"fDL91LH Vf;H&HH HSHc:-$fIPf:HHT$HT$ H f:IIL$ f9HHL$HL$IHLDLLƔu LMIfA?u'Ht LHtIGHt8vfDAff=u)Ht HHtHSHt:L91LH躸f.IPf:IHHT$HT$H f:I!IL$ f9?HHL$HIHL$5Aff=u%Ht HHtHCHt 8t@L91LH躷f;H#HH0HSH:!RfJ +HuHBHf<_HHJHf9JHT$HHL$HIHL$HT$Zz nHJPf9Iu%HHzXt f?uM91LL裶A<9tMvA$L<9tIT$I9t5HLHT$HT$`LHuMD!AH=~H!H$AE<LAE$Me(DT$IENIE(H$M}0IE8AW +HuIGHf<HIWHqf:HHT$HIHT$A IWPf:HHT$HHD$HT$tA lHMXtfA?u LHLEM931LLt\AL<9tINA$L<9tIT$H9 HHHT$ HL$HL$HT$ LL[! 
AiH=~H0H$LAE<AE$Me(DT$IEEMIE(HD$HHt$I]0LIE8DT$fAV +HKf<yHM~H fA?LHHA~ v&I~Pf?~ Ht$A~ Ht$ % HGLHIIT$ Mf:HHT$!HT$f:fHHHT$HIHT$ IL$ If9 HHL$\HL$f9I@HHHT$HIHT$$IL$ If9!HHL$HL$ OHHHT$HIHT$IL$ If9HHL$HL$f9I@H?LHI.IL$ Mf9+HHL$6HL$ HHHT$HIHT$IL$ If9HHL$HL$ f9I|@HHHT$HIHT$IL$ If9HHL$tHL$f9I@HCf8uH$f8!>AH=$~H ~ $AA 81qyD$Vf;e;!A&H=~HH$H$H@AE$Iu(IEAUIFH@J +HuHBHkfl$HfM|$ IfA?LqIT$ Mf:HHT$HT$ufDM|$ IfA?LHT$WHT$If=fDI@ f8HD$) H|$ Rjf.Mx fA?) LHT$HT$ .fDMx fA?9 LHT$HT$ fDMx fA? LHT$cHT$ E~ D$A,A 81yDT$UHLrHfA>-HT$LHIHT$W-z LrPfA>3-LHHD$ -fA?uLHtIGHt 8D/Ht$f>f=u!Ht HHtHCHt 8;L9a 1LH̬-f;HEHHBHSH%:"Dd@HLrHfA>+HT$LHIHT$+z LrPfA>p+LHHD$G+fA?uLHtIGHt 85Ht$f>f=u!Ht HHtHCHt 8f9L9v 1LH謫/,f;HmHHH"HSH:GD@HLrHfA>M*HT$LHIHT$ *z LrPfA>)LHHD$)fA?uLHtIGHt 8+/Ht$f>f=u!Ht HHtHCHt 8;L91LH茪*f;HFHHHSH:!A$@HHJH3f9H)HT$HHL$HIHL$HT$)z _HRPf:IuHHT$HT$IHLDLLu LMIAf`f=L91LHCc)HLrHfA>(HT$LHIHT$(z [LrPfA>(LHHD$c(fA?uLHtIGHt 8g/Ht$f>|f=u!Ht HHtHCHt 8G5L91LH_(f;\H@;HHHSH:[:g~ $AQA Xf.A~  A~  LHMvXLb"!ApH=~HH$AE<LAE$Me(DT$IEL@IE(H$Mu0IE8A~ qE1fA>oHt$LHHt$LETfDAD$<AD$!JAH=0~HH$AE<LAE$Me(DT$IE?I]0IE(M}8[DCOH$f8RHD$H $LD$0Ht$AmHLLt$0HD$8H$HD$@ A fC/H$f8 @CH$f8H$H $LD$0Ht$AlHLLt$0L|$8HD$@vAdfDCH$f8"H$H $LD$0Ht$AkHLLt$0L|$8HD$@AfDC7H$f8:H$H $LD$0Ht$AnHLLt$0L|$8HD$@AfDf9HL$f9I#Df=NnfDE1z Iz LzPo|DA = A IGPIHD$E1fDfA>I@fA>I@fA>I0@fA>HL$) EID$ f8HD$fA?MKCM|$ fA?f:wI!M|$ fA?.f:SI}M|$ fA?f:/IHك<9tHKAL<9tIWH9HHHT$HL$ԘHL$HT$ nHك<9tHKAL<9tIVH9HHHT$HL$wԘHL$HT$HHZԘHك<9tHKAL<9tIVH9HHHT$HL$ԘHL$HT$HHӘHك<9tHKAM<9tM~L9HLHL$ӘHL$[HLӘCHك<9tHKAL<9tIVH9EHHHT$HL$.ӘHL$HT$2HHӘ 1 E1HIWH f: HHT$HIHT$r A MPfA?u LHLELQ2f:AI;fA?HD$Mf:AIf:AIhIIbHT$z /ILz IfDH +HuHPHf{t f~LxHHL|$tfA?uLHT$IHLDHt$f>f=u!Ht HHtHCHt 828L9|1LHD f=Ht$f>A +HT$A I5DH +HuHPHf{t f~iLXHHL\$tfA;uLHT$IHLDfA>!f=LHL\$L\$H$f8H$H $LD$0Ht$ApHLL\$0Lt$8HD$@sAhaH +HuHPHf{t f~5H@HHHD$HD$tf8uHHT$HHEHT$fA?f=u!Ht HHtHCHt 80H;\$RHt$1Hʜf=fA?DH +HuHPHf{t f~mH@HHHD$HD$tf8uHHT$HHEHT$fA?f=u!Ht HHtHCHt 81H;\$Ht$1Hpf=fA?DH +HuHPHf{t f~H@HHHD$HD$tf8uHHT$HHEHT$fA?Kf=u!Ht HHtHCHt 8.1H;\$Ht$1H*f=hfA?DA~ [L-DIHD$E1LE1H$H $LD$0Ht$ApHLL|$0Lt$8HD$@Av`~ H$AA 81NyD$J +HuHBHf{t f~MHHJHtf9uHHL$HL$HHEI9w1HLHL$ܙHL$LHH$f8H$H $LD$0Ht$ApHLL|$0Lt$8HD$@dAR@H +HuHPHf{t f~iHHHHtf9uHHL$HL$HHEL91HLHL$HL$LHH$f8H$H $LD$0Ht$AnHLLt$0L|$8HD$@AnH +HuHPHf{t f~HHHHtf9uHHL$HL$HHEL91HLHL$HL$3LH0 H$f8H$H $LD$0Ht$AlHLLt$0L|$8HD$@AH +HuHPHf{t f~YHLxHtfA?u LHLEM9A1LLFLLc H$f8HD$H $LD$0Ht$AmHLLt$0HD$8H$HD$@AfDHHȘ H +HuHPHf{t f~9HHHHtf9uHHL$HL$HHEL91HLHL$HHL$LH`H$f8H$H $LD$0Ht$AkHLLt$0L|$8HD$@AJ +HuHBHfF"fGs}J +HuHBHfFxfGJJ +HuHBHfF3fGJ +HuHBHfFfGA GMPIzA ;MPIgHHƘ"~ D$AA 81yDT$D~ XD$AA 81EyDT$'f.IIA {IDA cIjDHSH`z AHBPHD$fA>MM@z >HBPHD$fA>MMyfDHSH~z HBPHD$fA>MqMW@HSH`>z HBPHD$fA>M^MD@z >LzPIQ~ hD$A8A 81yDT$7@HSHͼLzPfA?uLD$LLD$HLEM91LLLD$LD$EHHJHfAfA8uH$f87,HLBH fA8u!HT$LLD$HLD$HT$tIM91LLHL$LD$觐LD$HL$mHLHL$LD$FLD$HL$u HLIL9 A fA8uH$f8|(HLBHRfA8u!HT$LLD$HLD$HT$tIM91LLHL$LD$0LD$HL$HLHL$LD$ϛLD$HL$u HLIL9,A"fA8uH$f8&$z QIHD$.HLBH@fA8u!HT$LLD$HLD$HT$tIM91LLHL$LD$融LD$HL$dHLHL$LD$=LD$HL$u HLIL9AfA8uH$f8&IE14IE1-CBH$f8=XCH$f8HD$H $LD$0Ht$AmHLL|$0HD$8H$HD$@ARCnH$f8qHD$H $LD$0Ht$AkHLL|$0HD$8H$HD$@誹A+C4H$f87HD$H $LD$0Ht$AnHLL|$0HD$8H$HD$@CA1CH$f8HD$H $LD$0Ht$AlHLL|$0HD$8H$HD$@ܸA=CHtHHHSH:J +HxBHD$L9<Ht$1LHL$ LD$ꉔLD$HL$ Ht$LHL$HL$f9H$f8HD$L|$8HHL$HHD$8H$HD$PjGHL$Ht$LD$@LAnHئ{A_AX<1C0H$f85HD$H $LD$0Ht$AnHLL|$8HD$0H$HD$@ܶA1HHL$ E1HL$WI11HLLD$ HL$衈HL$LD$ gCH$f88HD$0H$HD$@ AcA 9A HtHHHSH:J +H(HD$L9(Ht$1LHL$ LD$LD$HL$ Ht$LHL$+HL$f9H$f8HD$L|$8HHL$HHD$8H$HD$PjGHL$Ht$LD$@AlHL {AAYA[heAG~1HHL$蘓E1HL$I11HLLD$HL$,HL$LD$AG1HHL$3E1HL$I11HLLD$HL$DžHL$LD$PHtHHHSH:J +H1HD$L9'Ht$1LHL$ 
LD$ʄLD$HL$ ,Ht$LHL$HL$ f9H$f8HD$L|$8HHL$HHD$8H$HD$PjGHL$Ht$LD$@AkHL踡{AY^1HHL$UE1HL$I11HLLD$HL$郔HL$LD$鯾6~ TD$A A 81yDT$#CzH$f8}HD$H $LD$0Ht$AmHLL|$0HD$8H$HD$@*A7HtHHHSH:J +H#H$f8L|$0H$HL|$HL$HHD$PL|$@jGHL$Ht$LD$@HAmL>{AEXZAKC_H$f8dHD$H $LD$0Ht$AkHLL|$8HD$0H$HD$@;A)AfA?Lt$$;Lt$ z hMMz NIIfA?Lt$Lt$IE1z  MYMGHD$HD$IDfA?Lt$Lt$}z MHM6fA?Lt$Lt$z qMMz WIHBPIHD$LH3靺LH!LH$LHLH)A@ f9uH$f8A@rf9uH$f8HD$Ht$HgA@f9uH$f8x锹LHHt$H`Ht$H4vHD$A@$fA?uH$f8z "IHJP1Ƀz EHJPuLHIDHD$HD$9u!HT$(HHL$ HHL$ HT$(tHz LrPfA>uHL$ LHL$ HLEHD$H95Ht$1HHL$ N|HL$ &Ht$HdfA>H$f8HD$Lt$@HHD$8HD$ HD$@H$HD$PjGHL$Ht$LD$@LAmHE{A_AX鞷H +HuHPHf{t f~LpHHLt$tfA>uLHIDHD$HD$9u!HT$(HHL$ HHL$ HT$(tHz LrPfA>uHL$ LHL$ HLEHD$H9"Ht$1HHL$ 'zHL$ ?Ht$H=*fA>H$f8 HD$Lt$@HHD$8HD$ HD$@H$HD$PjGHL$Ht$LD$@AkHL{AAYA[vz IHJPz DIHJPJ1Ƀz HJPH +HuHPHf{t f~rLpHHLt$tfA>uLHIDHD$HD$9u!HT$(HHL$ HHL$ HT$(tHz hLrPfA>uHL$ LHL$ HLEHD$H9$Ht$1HHL$ wHL$ Ht$HԺfA>H$f8HD$Lt$@HHD$8HD$ HD$@H$HD$PjGHL$Ht$LD$@AnHL赔{AEA^Xv H +HuHPHf{t f~gH@HHHD$tf8uHHHHDt$Ht$AHt$H蘹H$f8HD$H $LD$0Ht$ApHLLt$8HD$0H$HD$@AwH +HuHPHf{t f~LpHHLt$tfA>uLHIDHD$HD$9u!HT$(HHL$ HHL$ HT$(tHz $LrPfA>uHL$ LHL$ HLEHD$H9Ht$1HHL$ ztHL$ rHt$H萷]fA>RH$f8@HD$Lt$@HHD$8HD$ HD$@H$HD$PjGHL$Ht$LD$@HAlLq{AEXZʯ1Ƀz HJP LLe錮z ITIBLLHL$貶HL$Mz ɿII LHLD$sLD$H +HuHPHf{t f~HLpHtfA>u LHLEM9%1LLrWLLDH$f82HD$H $LD$0Ht$AnHLL|$0HD$8H$HD$@\AJH +HuHPHf{t f~sHLpHtfA>u LHLEM91LLr8LL%H$f8HD$H $LD$0Ht$AlHLL|$0HD$8H$HD$@芟AxLHLD$誴LD$ELHLD$腴LD$l z IqI_IGH H +HuHPHf{t f~HLpHtfA>u LHLEM931LLpLLݳH$f8HD$H $LD$0Ht$AmHLL|$0HD$8H$HD$@HAk6J +HuHBHf{t f~HHJHtf9uHHL$HL$HHEI91HLHL$oHL$zLHgH$f8UH$H $LD$0Ht$ApHLL|$0Lt$8HD$@hAVLLHL$舲HL$#H$H $LD$0Ht$ApHLL|$0Lt$8HD$@AکLHLD$LD$ 鵩H +HuHPHf{t f~ HLpHtfA>u LHLEM9 1LL{nLL蘱H$f8HD$H $LD$0Ht$AkHLL|$0HD$8H$HD$@AfLLHL$#HL$龨z :IIJ +HfFpfGHHJHCf9u!HT$ HHL$HHL$HT$ tHz ùLrPfA>uHL$LHL$HLEI91HLHL$ mHL$NHSHJ +HH$f8,H$H $LD$0Ht$ApHLL\$0L\$Lt$8HD$@AL\$ڧH +HH$f8hHD$H $LD$0Ht$AlHLL|$8HD$0H$HD$@A"H +HH$f8MHD$H $LD$0Ht$AnHLL|$8HD$0H$HD$@AH +H\H$f82HD$H $LD$0Ht$AkHLL|$8HD$0H$HD$@+AJ +HufFfGHHJHff9u!HT$ HHL$HHL$HT$ tHz *LrPfA>uHL$LHL$HLEI991HLHL$hHL$RJJ +HfFpfGHHJHf9u!HT$ HHL$HHL$HT$ tHz iLrPfA>uHL$LHL$HLEI91HLHL$gHL$w鑢鉢HSH駢Lt$0L|$8HLD$HHD$PLjFHL$AlHt$HLD$@߄{A_AX='Lt$0L|$8HLD$HHD$PAnjFHL$HHt$LLD$@菄{AY^ءA MP.H +Hmf{t f~HHHHtf9uHHL$HL$HHEI9U 1HLHL$fHL$\LH藩IH$f87HD$H $LD$0Ht$AmHLL|$0HD$8H$HD$@ALt$0L|$8HLD$HHD$PHjFHL$AkHt$LLD$@J{AEXZ钠HD$Lt$0HL|$HAmHLHD$@H$HD$PjGHL$Ht$LD$@{AAYA[J4Lt$0L|$8HHL$HHD$PAkjGHL$HHt$LLD$@蜂{AA^A_HD$Lt$0HLD$HAmHLHD$@H$HD$PjFHL$Ht$LD$@>{AA^A_酟Lt$0L|$8HHL$HHD$PAljGHL$HHt$LLD$@{AY^L6HSHTLt$0L|$8HHL$HHD$PHjGHL$AnHt$LLD$@茁{AEXZԞA OJ +HfFfGHHJHf9u!HT$ HHL$HHL$HT$ tHz LrPfA>uHL$LHL$HLEI91HLHL$@cHL$ HSH!HBH5AFr1Ƀz HJPE1z xLrPoHLrHCfA>/HT$LHHT$txz IHJPf9uHHL$HL$HHEHLHL$oHL$u HLIL9tE1HLHL$6bHL$uz w逮LH?uAufA>uH$f8tHD$L|$0HLt$HAlHLHD$@H$HD$PjFHL$Ht$LD$@{AA^A_xbLz vHHJPLH萤u4AGfA>uH$f8t! 
1LbnuiLE1HD$L|$0HLt$HAlHLHD$@H$HD$PjGHL$Ht$LD$@N~{AAYA[镛1f@HD$鷡H$f88HD$L|$0HL|$HD$HH$L|$@HD$PjFz  AGbHD$1Ƀz HJPHBH]@%HBHAFAG'@PCBi@1bHD$H9Ht$1HHL$ ]HL$ ؚHt$HÚHD$f8H$f8HD$L|$8HHD$8HD$ HD$HH$HD$PjFz ܩ@_HD$1Ƀz HJP@^AGCkH$f8pHD$H $LD$0Ht$AnHLHD$0HD$HD$8H$HD$@A%鮘J +HuHBHf{t f~HLrHtfA>u LHLEHD$L9PHt$1L_\Ht$LzH$f8nHD$H $LD$0Ht$AnHLHD$0HD$HD$8H$HD$@A#ΗHLrH)fA>HT$ LHHT$ z  IHJPf9uHHL$ HL$ HHEHLHL$ hHL$ u HLIHD$H9Ht$1HHL$ =[HL$ UHt$HS@fA>5H$f8#HD$Lt$@HHD$8HD$ HD$@H$HD$PjFz !$@cL1Ƀz HJPHt$H踝wH$f8eHD$H $LD$0Ht$AkHLHD$0HD$HD$8H$HD$@A @HLrH)fA> HT$ LHHT$ z :IHJPf9uHHL$ HL$ HHEHLHL$ fHL$ u HLIHD$H9Ht$1HHL$ lYHL$ Ht$H肜fA>H$f8HD$Lt$@HHD$8HD$ HD$@H$HD$PjFz !SL@[1Ƀz HJPHt$H E1z LrPH$f8ZHD$H $LD$0Ht$ApHLLt$8HD$0H$HD$@覅A锓HPHAF@EvHD$H9qHt$1HHL$ VHL$ 6Ht$H!HD$f8H$f8HD$L|$8HHD$8HD$ HD$HH$HD$PjFz 鲢E1z LrPHLrHRfA>>HT$LHHHT$tvz bLrPfA>uHL$LHL$HLEHLHL$ cHL$u LIHM9tS1LLHL$UHL$upz LLLHL$衘HL$u@AFuf9uH$f8tHD$L|$0HHL$HAnHLHD$@H$HD$PjFHL$Ht$LD$@xr{AAYA[Ր鿏Lz vKLLrPLHu鎏AGqfA>uH$f8t'f1La1ɄLE1HD$L|$0HLt$HLAnHHD$@H$HD$PjGHL$Ht$LD$@q{A_AXHPH&HBHHPH@HBHAC,m@N@*1Ƀz xHJPoHBH HD$)@HPHHD$H9 Ht$1HHL$ RHL$ |Ht$H諕gHD$f8XH$f8FHD$L|$8HHD$8HD$ HD$HH$HD$PjFz xHLrHQfA>HT$LHHHT$tvz >LrPfA>uHL$LHL$HLEHLHL$^HL$u LIHM9t[1LLHL$}QHL$uLz ȝLLsLLHL$uHL$uAFuf9uH$f8tHD$L|$0HHL$HLAkHHD$@H$HD$PjFHL$Ht$LD$@Ln{A_AX锋z vLLrPHBH1L]1ɄLE1LH蜓u@AGfA>uH$f8tHD$L|$0HLt$HAkHLHD$@H$HD$PjGHL$Ht$LD$@sm{AY^ҋ鼊HLrHPfA>HT$LHHHT$tvz LrPfA>uHL$LHL$HLEHLHL$\HL$u LIHM9t[1LLHL$GOHL$uz LLsLLHL$?HL$uމAFuf9uH$f8t黉HD$L|$0HHL$HAmHLHD$@H$HD$PjFHL$Ht$LD$@l{AY^u_z vLLrPHBH 1L[1ɄLE1LHgu AGHfA>uH$f8tHD$L|$0HLt$HHAmLHD$@H$HD$PjGHL$Ht$LD$@>k{AEXZ醈E1z LrPE1z LrPE1z $LrP@W1Ƀz XHJPOAGAFDgE1bCJ +HuHBHf{t f~HLrHtfA>u LHLEHD$L9NHt$1LLuHt$LÏ`H$f8NHD$H $LD$0Ht$AkHLHD$0HD$HD$8H$HD$@)zAE1z LrPHLrHfA>HT$ LHHT$ z >IHJPf9uHHL$ HL$ HHEHLHL$ XHL$ u HLIHD$H92Ht$1HHL$ pKHL$ HHt$H膎3fA>(H$f8HD$Lt$@HHD$8HD$ HD$@H$HD$PjFz !WHLrH)fA> HT$ LHHT$ z IHJPf9uHHL$ HL$ HHEHLHL$ WHL$ u HLIHD$H9Ht$1HHL$ NJHL$ FHt$Hd1fA>&H$f8HD$Lt$@HHD$8HD$ HD$@H$HD$PjFz !5L@[1Ƀz HJPHt$HɌ5H$f8#HD$H $LD$0Ht$AlHLHD$0HD$HD$8H$HD$@/wACJ +HuHBHf{t f~~HLrHtfA>u LHLEL;t$Ht$1LH?Ht$L*H$f8HD$H $LD$0Ht$AlHLHD$0HD$HD$8H$HD$@FvA41Ƀz [HJPRHD$@tHBH|AF-THt$HH$f8HD$H $LD$0Ht$AmHLHD$0HD$HD$8H$HD$@uAZsC4J +HuHBHf{t f~HLrHtfA>u LHLEHD$L9Ht$1LGHt$L3H$f8HD$H $LD$0Ht$AmHLHD$0HD$HD$8H$HD$@tAL釂@-^@Y1Ƀz HJP@Lp@UIATUASFHHrf4HEHHfS4A4ff%f9v AH}H}FHELpA fAFAE!߮H=Ș|HLmHmL9utHL~DʓHLKzHUIHGD䱓H;XtHHھ~DʓLHD[]A\A]A^Ic魱DC[1]A\A]A^>:.L aIfH9訷~fD:uf;uA$f+f[f<HCLs L{(fl$bfmfnHHH@ H9H$+1HHHL$W܏HL$<9tHIUH<9tHEH@hH9AhfH<$mUL}EL5L-%L퍴C!1%H=zH[eHUHD[%ALH~DA2DL谋@Lp0!$ H=zHV9EH;HHL8H9DhID$f8 L2HIfDf}<HELu L}(fl fmlfnHpHx H4$qLteLutYH;$9H4$1H)ڏt=Hڃ<9tHSH4$H<9tHFH@hH9Bh9LLPH;$BH4$1Hُ<9`HSH4$<9`HFH@hH9BhL贉uxLxulE!!"H= zHgH $D[HQ>LHXD>fkfDf<fBuEH )t(&t#$>%ufHCxff}Bf;Bu=H7 )t",&t'`(uHCxHEBfw$D}E f{f~Ef{{f~qf;]EFfN$з@Lȉ Cm! 
![H= zHHD)@HCf8]aLpLx L"IHT$@LDHl$@Lt$HL|$P+)HDHULu f{'f~ 'fUusLz fA?uhfA>uaL9HJ'1LHH $֏H $t;HL$@L|$HFLt$PL|$XHT$@LDVH>fHHt!f{of~efXHCi Hx1[]A\A]A^A_H6HHL8Ht9DhLhL؇t fA<$`&Cu!cH=zHL2HLXD ;)@H蘧L腴LXA$f WMf8fA<$:AT$6 H'HcЀ 'HcЀ'HcЀ'HcЀ'HcЀ'H{'!UH=zH[HL4D9LH HLDIq!HLDIc!LH4D9@HCLs L{(fl7fm'fnEHHH@ H9H$ #1HHHL$ӏHL$<9tHIUH<9tHEH@hH9AhH<$LLL脃LDC!H=qzH fHUHD[|8LHXDi8@Hx)t# &tb$t %@HExH}0Lu8L9$1LHWҏt8Hڃ<9tHSA<97IFH@hH9Bh$fA>XMvL901LHяiHڃ<9tHSA<9eIFH@hH9Bh1HL$@HLDLt$@ wHZ HT$@HDHl$@LH7;DLuAFf($Őf9tHSMH<9tHEH@hH9BhO!H=zH-.M;ftLL~D2ILfMHLHDfDH)t#&t$t %@HExH9Lu0L}81HH̏Hڃ<9Lu L}Lp L91LHˏHڃ<9tHSAL<9tIFH@hH9BhIG@!;H=$yHQIVLDX/0LILcDY9@H9LuS1HHVʏFHڃ<9tHSMH<9tHEH@hH9Bh!H=vyH-*LLbMHLHDf!6H=yHM#HLXD+/KfDfA<$ID$f8 dHpHx H4$_LtkLct_H;$'H4$1HɏtCHڃ<9tHSH4$H<9tHFH@hH9Bh'fLL8H;$1H4$1Hȏ<9OHSH4$<9wOHFH@hH9BhlLx\L\zLE!H=yHVH $D[HQ-LH~D-$ %v* +^gfH|$@H|$XH|$@H|$ H|$8H|$ !H=yHgtIHLcD@ H9H$1HHHL$ƏHL$<9tHIUH<9tHEH@hH9AhH<$LLoLvwL?xgC!H=lyHcHUHD[w+LH~Dd+HHH@ H9H$#1HHHL$ŏHL$<9tHIUH<9tHEH@hH9AhH<$LgLLguL'woC!kH=TyHcHUHD[_*LHXDL*lHCHT$@LDHD$@Hl$H2H>1@LuL} L@vHT$@LDH\$@Lt$HL|$PIHLuf{f~HHT$@LDLt$@H\$HHeDLuL} f{f~L91LHÏ#HT$@LDLt$@L|$HL|$PH0fDL}fA?f;5}HE HLLHH$~GHIAGC!s H=\yHUL$LLTDHULu f{f~ fULz fA?fA>L9HJ 1LHH $WH $HL$@L|$HGLt$PL|$XDH{0Ls8fA>XMvL9 1HLA<9tMvE<92'HEH@hI9FhnHL$@HLDHl$@nwHTHf.H{0Ls8L91HLUA<9tMvE<9&HEH@hI9FhHL$@HLDHl$@wHa IVnH\$PHT$@HD$HfIF IVmH\$PHT$@HD$H{IFM~ H$f{Ff~<L91LHgH$HT$@LDL|$HL|$PHD$@4H6fDIVIN f{f~fULz fA?f9L9LB1LHLD$H $kH $LD$HL$PLD$@GL|$HL|$X{@M~IF LH$q|H$HT$@LDH\$@L|$HHD$P HKM~f{nf~dHHT$@LDL|$@H\$HSH_y AA81fu!H=yHFHD$ Ht$@LHD$@D$8D$HD$XMvL9AFDy SAA81yu,@HCxH{0Ls8L9AF@fkvHHH@ H9H$ 1HHHL$GHL$<9tHIUH<9tHEH@hH9AhH<$L%LLlLnC:!(H=yH%IHUHD[!LHXD!DHpHx H4$_LotkLt_H;$H4$1HǻtCHڃ<9tHSH4$H<9tHFH@hH9BhfLLxH;$H$H4$1HHX<9}BHSH4$<9lBHFH@hH9BhLLk L mE!PH=9yHIH $D[HQC LHXD0 PHpHx H4$LtcLtWH;$H4$1H?t;Hڃ<9tHSH4$H<9tHFH@hH9Bh{L(LhH;$"H4$1Hȹ<9@HSH4$<9:AHFH@hH9BhLiLk|E1!H=yHHgH $D[HQLH~DfknHHH@ H9H$y1HHHL$׸?HL$<9tHIUH<9tHEH@hH9AhH<$LLLhLmjC!H=yHGHUHD[LH~DDHCf8XH@5HHH@ H9H$1HHHL$詷HL$<9tHIUH<9tHEH@hH9AhH<$LLoLgwL?igC!H=lyHJHUHD[wLHXDdHHH@ H9H$1HHHL$葶HL$<9tHIUH<9tHEH@hH9AhH<$LgLLgfL'hoC!kH=TyHSHUHD[_LH~DLl''(f-=Q-=Q'(fEkESEKH{I91L鰏0HT$@LDLt$@L|$HL|$PH~@E=E$E1E!E}C7~'(2-q=QfCoHSLDH\$@HD$PHl$XHT$HHT$@ZDHHBf8U^Lx fA?OfA>DL9HHCHBf8ULx fA?fA>L9HHC@ID$f8 H=u / =p`32=m'^!=nHExH9Lu071HHήt9<97[HSE<9[HEH@hH9Bh`7f;BAH?=n.HCx =}L{0!H=yHLIVMLDHILnD1xDCMH{I9D 1LҭH$HT$@LDL|$HL|$PHD$@HctC+HL$@HLDLt$@vH3NHCHH$9H$HT$@LDL|$@HD$HHXAG'4fA>AV6 HtLHcЀ t@HcЀt4HcЀt(HcЀtHcЀtHcЀ GE =$HCxLs0L{8L911HLæA<9tMvE<90HEH@hI9FhIfA?>Hl$@H\$H$L|$PC:Jy AA 81"tHBf8UoLx fA?`f9VL9L@Cf.HBf8U"Lx fA?f9 L9L@C3DEb=9 =F3*=:cHExf;BHL}0=:H3=THCxLs0HC8fA>H$= z}IvEm I~n!*yH=yH9IV1M>DoHEL$HPDHH6HPMDHFIL:D1nEh =&!_ =$#HExLu0L}8fA>u f;BBfA?}L90B1LH譣_<9BHSA<9AIFH@hH9Bh&HT$@$LDLt$@Hl$HL|$PxwHf.y AA81tHCxLs0L{8L9@1HLТA<9tMvE<9+HEH@hI9FhVfA?KHl$@H\$H,L|$PfDHCxLs0L{8L91HL<A<9tMvE<9p+HEH@hI9FhfA?Hl$@H\$H(L|$P*f.7y AˋA81tv@AGHCsElC3LQxL`ShE!H=yH$.H $D[HQLHXDCLQ>LR.E{!iH=yH?H $D[HQ LH~DDCLPNLNR>E!H={yHO3H $D[HQLH~DrD/y AA81t@C9LOvLQfE3!!H=yH}IH $D[HQLHXDDoy AA81t@=w%={=7cHExUf;BKHLu0E=G4HCx&L{0M931LLRA<9tMA<9*IFH@hI9GhHT$@DLDH\$@Lt$HHl$PAvHf=r =s=tsHExeH9Lu0'1HH葝C<9(HSE<9(HEH@hH9Bh Ht$@LHl$@Lt$H~wHBCDHExf;BHL}0DQ-=(HCx{Ls0HC8fA>H$d=q}WIvEmC HExlf;BbHL}0\>.=$BHCx4Ls0HC8fA>H$=p}IvEmI~2!H=yH.IV1MD}eHL$HPDHHHPMDHFIL>D1+e[fDHH[LOL觾H fA<$7L脾H fA<$6LaHfA<$6L>HfA<$6LHgfA<$ &LHDfA<$%LսCCL萼H fA>FLnH fA>FLLHt|fA>\FL.Ht^fA>4FLHt@fA>HFLHt"fA> FLԻHfA>|&L費pCU颽CCOC=oHExH9Lu0o1HH㒏<93%HSE<9%HEH@hH9Bh\Ht$@LHl$@Lt$H wHG7?.HEx 
f;BHL}0?7=(HCxLs0HC8fA>H$=4g}IvEm7I~!TۃH==yH/7IV1MD1\HoL$HPDHH`ސHPMDHFIސIL?D1[HExf;BHLu0HCx=#f}L{0!kڃH=TyHIVMLDHݐILD12[bEE;.0HEx"f;BHLu0=vHCxL{0M9Q$1LLA<9tMA<9-IFH@hI9GhHT$@cLDH\$@Lt$HHl$PvHyi=(^HExPLu0L}8fA>u f;B#fA?,L9D1LH\<9HSA<9IFH@hH9BhտHT$@(LDLt$@Hl$HL|$P'wH飿=;HExf;BHL}0z=;=T^HCxPLs0HC8fA>H$9=c},IvEmI~N!׃ H=yHIV1M@DXH׷L$HPDHHڐHPMDHFڐIL;D1GXwE=lRHExDH9Lu01HHpt9<9"HSE<9"HEH@hH9Bh^f;BH=lнHCx½=%b}L{0!mփH=VyH IVMLDHِILlD14WdE LLHEx,f;B"HL}0CD0=,HCxLs0HC8fA>H$ݼ=@a}мIvEmCI~!`ՃH=IyH/IV1MD=VH{L$HPDHHlؐHPMDHFUؐILCD1UHExf;BHL}0A/=$ػHCxʻLs0HC8fA>H$=`}IvEmI~z!6ԃH=yH/IV1MDUHQL$HPDHHBאHPMDHF+אILAD1THExغf;BκHL}0ȻB9=(HCxLs0HC8fA>H$=^}|IvEmaI~P! Ӄ[H=yH\8IV1MDSH'L$HPDHH֐HPMDHF֐ILBD1SǫLLHExf;BHLu0莺3HCxqL{0M9(1LL蝈OA<9tMA<9IFH@hI9GhHT$@kLDHl$@Lt$HH\$PvHHExٸf;BϸHLu0ɹ.HCxL{0M9'1LL؇A<9tMA<9<IFH@hI9GhTHT$@cLDHl$@Lt$HH\$PvH2"HExH9Lu01HH@t9<9^'HSE<9@'HEH@hH9Bhf;BH豸=mHCx=[}L{0!=ЃH=&yH,IVMLDHnӐILmD1QLEHExLf;BBHLu0H$6=Y})IvEmI~K!̓H=yHIV1M=DNHԭL$HPDHHАHPMDHFАIL9D1DNtHEx[f;BQHL}0KE=,1HCx#Ls0HC8fA>H$ =oX}IvEmrI~!ӳ!̃ެH=xyHIV1MDlMHL$HPDHHϐHPMDHFϐILED1MJHEx1f;B'HL}0!=&/=(HCxLs0HC8fA>H$=CW}ӲIvEmFI~!c˃H=LyHQ.IV1MD@LH~L$HPDHHoΐHPMDHFXΐIL&D1KLBL$HExf;BHL}0@=,˱HCxLs0HC8fA>H$= V}IvEm ~I~m!)ʃxH=yH IV1MDKHDL$HPDHH5͐HPMDHF͐IL@D1JHEx˰H9Lu01HH<9&HSE<9HEH@hH9BhpHt$@LHl$@Lt$HvH[KHEx=Lu0L}8fA>u f;B&fA?L9 1LHI<9; HSA<9- IFH@hH9Bh¯HT$@,LDLt$@Hl$HL|$PvH鐯HExH9Lu01HH~`<9kHSE<9(HEH@hH9Bh'Ht$@LHl$@Lt$HvHHExH9Lu0=1HH ~Ү<9?&HSE<9p(HEH@hH9BhHt$@LHl$@Lt$H vHtHExfH9Lu0U1HH}D<9_HSE<9i$HEH@hH9Bh Ht$@LHl$@Lt$HvH3ݭHExϭf;BŭHLu0迮={HCxL{0M9#1LL|~A<9tMA<9~IFH@hI9GhHHT$@kLDH\$@Lt$HHl$PvH&HExf;BHLu0HCx۬=>Q}L{0ʬ!ŃեH=oyHIVMLDHȐILD1MF}HExdH9Lu01HH{B<99"HSE<9_"HEH@hH9Bh Ht$@LHl$@Lt$H}vHHEx֫H9Lu01HH{t9<9"HSE<9!HEH@hH9BhVf;BuHs=FbHCxT=O}L{0C!ÃNH=yHA&IVMLDH0ǐILFD1DL鸫HExժf;B˪HL}0ū='=$HCxLs0HC8fA>H$=N}yIvEm^I~M! ÃXH=yH&IV1MDCH$L$HPDHHƐHPMDHFŐIL=D1CěH|HDH$鄰My 6AAq81跽tHH$鏽H$Hy A=A81atH$CHt$@LHl$@Lt$HwvH!y عAA81t鱹CHy RAA81貼t+y &AФA81~tC`Ht$@LHl$@Lt$HvHCCVHt$@LHl$@Lt$H_vH֙{CGCVC)rCXHt$@LHl$@Lt$HvHz}HHHC9HC٩y A:A81Cty AA81tqqy A?A81ۺtHCxLs0HC8fA>H$k=J}^IvEmCI~2!=H=רyHqIV1MD?H L$HPDHHHPMDHFIL%D1y?驗Ky AA81赹tHJH,y rACA?81qtKHCx={I}Ls0!ýH=yHIWMLDFIL9D1>麖HLLy AgA?81趸t[CQ y jAA81stCAD$6ĞAD$6驞L.y UASA81!t.y ζAƌA81t駶Oy AA81蹷ty A)A?81腷tHCx,=G}Ls0!׻&H=yHIWMLDFIL@D1<Δpy ԘAA81ڶt魘HCx=F}Ls0p!,{H=yHu>IWMLDFaIL:D1;'L1y tAkA81/tHC}y AҍA81tڨIy ADA81賵tAF6 y XAA81ut1Lϣy BAA819ty AA?81tHCx=E}Ls0!WH=@yHIWMLDF舼IL;D1:NHCx5=D}Ls0$!ฃ/H=ɢyHIWMLDFILDD19בyy AٙA?81tLi=y  AÛA81觳tHCxN=C}Ls0=!HH=yHujIWMLDF.ILED18y AݘA81tby tAA81вtf2y AA81蜲t鼪HHCx;=B}Ls0*!涃5H=ϠyHu>IWMLDFIL>D17Hd{y tAA81tNy AxA81踱tÙLy AlA81|t}ޟy bAA?81Ht;y KAA81t$vy -ANA81tAFH胝BTbDv=&#HCx=@}HC0H$IvEm߳I~Y!ǴH=yHIV1MD5HHPMDHHԷL$HPDHF輷IL&D1R5邍AFSy AA?81~tH0HНy -AA?81:tHCx=D?}HC0H$IvEm?޳zI~i!\H=EyHAIV1MD94HwHPMDHHiL$HPDHFQILDD13?HCx=X>}HC0H$IvEmSݳI~}!pH=YyHIV1MDM3HHPMDHH}L$HPDHFeIL?D12+HCx=u=}HC0H$IvEmpܳI~߳!荱ܑH=vyHIV1MDj2HHPMDHH蚴L$HPDHF肴ILBD12Hy AA?81Tty jAA?81 tCy APA?81tLLHCsAFHh=:(=;:=9!HCx=;}HC0H$IvEmڳI~@ݳ!讯H=yHIV1M=D0HɏHPMDHH軲L$HPDHF裲IL9D190iAFtLqy iAA?81]tBHCx=g:}HC0H$IvEmbٳI~ܳ!ΎH=hyHUIV1M@D\/HHPMDHH茱L$HPDHFtIL;D1 /:HCx=9}HC0H$IvEmسI~.۳!蜭H=yHIV1M>Dy.HHPMDHH詰L$HPDHF葰IL:D1'.Wy A2A?81ctwŖy MAeA?81/t&HL~HHFAD$6@AD$6%AD$6 AD$6HCx=7}Ls0y!5H=yHu>IWMLDFjIL'D1-0Ltʕy tAA818ty VAȎA81t/iy  AvA81Ӧt5y APA81蟦tǜLHCƽy A%A81Lty AA81tAFH軒>0>AR=% HCx=5}HC0H$IvEmԳ̼I~׳!NH=yHIV1MD*HHPMDHH L$HPDHFIL%D1*麂AFKAF0vHH,y QAA?81薤t*HCx=4}HC0H$IvEmӳI~Jֳt!踨H=yHAIV1MD)HӈHPMDHHūL$HPDHF譫ILAD1C)s=׺HCxɺ=3}HC0H$IvEmҳI~^ճ!̧H=yHIV1MD(HHPMDHH٪L$HPDHFIL=D1W(釀HCx=2}HC0H$ѹIvEmѳI~{Գ!馃8H=ҐyHIV1MD'HHPMDHHL$HPDHFީIL>D1t'Fy AFA?81谡ty jAA?81|tCޏy AyA?81Hty 
nAA?81tGHCx=1}Ls0!fH=OyHIWMLDF藨ILCD1-&]~y AA?81itHCx=s0}Ls0!軤 H=yHujIWMLDFILAD1%}Xy AYA81Ÿt$y tAA81蒟tfy AUA?81^tHCx=h/}Ls0!谣H=yHIWMLDFᦐIL?D1w$|Iy AA?81賞t_y AA81ty UA-A81Kt.HdHy MAўA?81t&HHAFTHHH芊CHE&='=HCx/=-}HC0H$IvEm̳I~`ϳ!ΡH=yHuzIV1MD"HHPMDHHߤL$HPDHFǤIL'D1]"zH'y yAA?81葜tRHCx=,}HC0H$IvEm˳I~Eγ!賠H=yHIV1MD!H΀HPMDHHL$HPDHF訣ILED1>!ny@6HCx(=+}HC0H$IvEmʳI~Yͳ!ǟH=yHqIV1MD HHPMDHHԢL$HPDHF輢IL@D1R x$y vAA81莚tOHHHCx=*}HC0H$IvEmɳI~2̳!蠞~H=yHIV1MD}H~HPMDHH譡L$HPDHF蕡ILCD1+[wy nA A?81gtGɇy MA<A?813t&L>y A%A81tYy BAA81Øt%y AoA?81菘t[HHŤy AA?81Kt{HCx=U(}Ls0!蝜|H=yHIIWMLDFΟIL&D1du6y AA81蠗tey AoA?81ltHCx=v'}Ls0!辛 |H=yHIWMLDFIL=D1tWy AA81t鄘#y A՝A81荖ty UAGA81Yt.AF6uAF6_AF6IAF63AF6eAF6Oy AA?81tpHCx=%}Ls0!;zH=$yHu6IWMLDFpILBD16s؃y tAA81Ft@LHh|$ H4$flfn,~fpLjLz EAUAf{ f~ fmf{f~L9 1LH'7UH<9HtHMAuL<9tIUHrhH9qh fDffkLzLj EAAMf{ f~ ff{f~}L9 1LHg UH<9HtHMA7L<9tIWHrhH9qhL 9 H`c HCf8H$f8@EfDHEf8EC!tH=]xHxCx kA$ A 81sDff{MEf~ L9 1LHLD$cLD$ UH<9HtHMA0Lƒ<9tIPHrhH9qh Aff"f{MGf~L9 1LHLD$ӦLD$ UH<9HtHMA0Lƒ<9tIPHzhH9yh] fFfG7AMDAD$TE>Ef;DH4$|$ HT$@pLl$@L|$HH\$P HI DErf;)EfF fGLuHE M9HD$1LL补A<9tMvAL<9tIGH@hI9Fh|HD$f8mf;cHD$L|$@AGLl$HH\$XHD$PkH4$|$ HT$@UHI@Ef;EfFn fGLuHE M9HD$1LL豤A<9tMvAL<9tIGH@hI9FhHD$f8}f;sHD$L|$@AGLl$HH\$XlHD$P fEf;2EfFfG LuHE M9HD$'1LLᣎA<9tMvAL<9tIGH@hI9FhHD$f8f;HD$L|$@AGLl$HH\$XnHD$P;fEf;EfF fGPLuHE M9HD$1LL!A<9tMvAL<9tIGH@hI9FhHD$f8f;HD$L|$@AGLl$HH\$XmHD$PkfH$f8kDLuM9[ 1LL^A<9tMvAUL<9tIEH@hI9Fhf;H4$|$ HT$@pLl$@L|$HH\$PHILuM9 1LL辡=A<9tMvAL<9tIGH@hI9Fhf;H4$|$ HT$@lL|$@Ll$HH\$P!HI@LuM9C 1LL&UA<9tMvAL<9tIGH@hI9Fh f;H4$|$ HT$@mL|$@Ll$HH\$PHILuM9 1LL膠]A<9tMvAL<9tIGH@hI9Fh(f;H4$|$ HT$@nL|$@Ll$HH\$PHILuM9 1LL柎aA<9tMvAL<9tIGH@hI9Fh,f;"H4$|$ HT$@kL|$@Ll$HH\$PIHIhLh Mt$ LxLS3LP#f}HpPHPSHEUHMH|$@I@4HL$@T$H%D$LRAIEAUH|$ Ll$ @4T$(%D$,lRA)E^AD$CEx!fH=xH{H$I;Gt|$ LH¾~IH=|Icc\$ IWIL]H$IHcIfDEf;u@H4$|$ HT$@lLD$@LD$Ll$HH\$PHILD$@EEf;u8H4$|$ HT$@pLD$@L|$HH\$P*HIIfDAEEIf;u@H4$|$ HT$@kLD$@LD$Ll$HH\$PHILD$@ETEf;u@H4$|$ HT$@nLD$@LD$Ll$HH\$PmHILD$@EEf;u@H4$|$ HT$@mLD$@LD$Ll$HH\$P HILD$'@ELuHE M9HD$ 1LLA<9tMvAL<9tIGH@hI9FhHD$f8f;HD$L|$@AFLl$HH\$XnHD$POfDLuHE M9HD$6 1LLM]A<9tMvAL<9tIGH@hI9Fh(HD$f8f;HD$L|$@AFLl$HH\$XlHD$PfDLuHE M9HD$P 1LL襚A<9tMvAL<9tIGH@hI9FhHD$f8qf;gHD$L|$@AFLl$HH\$XHD$PLuHE M9HD$ 1LLA<9tMvAL<9tIGH@hI9FhHD$f8f;HD$L|$@AFLl$HH\$XmHD$P_fDLuM9 1LLLD$aLD$A<9tMvAL<9tI@H@hI9Fhf;H4$|$ HT$@pLD$@LD$L|$HH\$PHILD$V@LuM9[ 1LLLD$豘LD$VA<9tMvAL<9tI@H@hI9Fh!f;H4$|$ HT$@nLD$@LD$Ll$HH\$P HILD$$@LuM91LLLD$LD$A<9tMvAL<9tI@H@hI9Fhf;H4$|$ HT$@kLD$@LD$Ll$HH\$PZHILD$t@LuM91LLLD$QLD$A<9tMvAL<9tI@H@hI9Fhif;_H4$|$ HT$@mLD$@LD$Ll$HH\$PHILD$@LuM91LLLD$衖LD$A<9tMvAL<9tI@H@hI9Fhif;_H4$|$ HT$@lLD$@LD$Ll$HH\$PHILD$@AEAGGAGAG OAG2wLuHE M9HD$N1LLLD$蘕LD$t?A<9tMvAL<9tI@H@hI9Fh f.AMEfLuHE M9HD$1LLLD$LD$tA<9tMvAL<9tI@H@hI9FhuHD$f8uf;yHD$LD$@AGLl$HH\$XHD$PH4$|$ HT$@k|HI2fLuHE M9HD$f1LLLD$XLD$t?A<9tMvAL<9tI@H@hI9Fh!f.AMEfLuHE M9HD$.1LLLD$ؓLD$t?A<9tMvAL<9tI@H@hI9Fhf.AMEfLuHE M9HD$1LLLD$XLD$;A<9tMvAL<9tI@H@hI9FhHD$f8f;HD$LD$@AGLl$HH\$XHD$PH4$|$ HT$@n{HIfDLuHE M9HD$1LLLD$舒LD$A<9tMvAL<9tI@H@hI9FhHD$f8f;HD$LD$@AFLl$HH\$XHD$PH4$|$ HT$@m7zHIVfDLuHE M9HD$V1LLLD$踑LD$t?A<9tMvAL<9tI@H@hI9Fhf.AMEfLuHE M9HD$1LLLD$8LD$tA<9tMvAL<9tI@H@hI9FhuHD$f8uf;yHD$LD$@AGLl$HH\$XHD$PH4$|$ HT$@lxHI2fAGHD$f8!AGXHD$f8AGHD$f8AG&HD$f8pAGHD$f8 AGLHD$f8fAGHD$f8#GAGHD$f8(A@*yA@)A@iA@ZA@H4$|$ HT$@mL|$@Ll$HH\$PHI!H4$|$ HT$@lL|$@Ll$HH\$PHIH4$|$ HT$@nL|$@Ll$HH\$PHIH4$|$ HT$@kL|$@Ll$HH\$P]HI|A@&A@FA@HD$f8f;HD$LD$@AFLl$HH\$XHD$PjA@&A@HD$f8f;HD$LD$@AFLl$HH\$XHD$PA@HD$f8f;HD$LD$@AGLl$HH\$XHD$PfA@HD$f8f;HD$LD$@AFLl$HH\$XHD$PbA@x AA 81sOx 7AA 81sx }A A 81sVx A A 81Qsx -A A 81sx cA A 81s<Kx cA A 
81sHAuMǘMEŐH|$tHD$xy@HX[]A\A]A^A_LT$8lT$8` {HFE1jfDEOIvfAv MtIuLT$8\xHIID$HT$8MeHAH{It$@ff tHGMeIt$@HH9tyH_fuyMeIt$@@HsLHH@HPHs DHLGHp8I9IHD$cLD|$8xT$8H?ID$Xt>E11 fE1]WH11D$?ID$XD$0HTHArHPxE1uAD9h6DLHTHAHrHu |$0u]H?ID$XIT$HBHBMd$8@,HD$ L9`8Ev_MI!A|$?tH_sH@@0k8MucMmMNA}"uIULHL-rMfDIULHMmMu1HH=L!I(tf%=1AwUD)%FAMHH4#HHHLIH?s1LHH@HP]H@f84H@f81HxxHS@Ht!1LHP380pDD9t9L$LL$DH@Ht$HH|$@AغBHHڃHcIVHDD越InID"IDhLxHh1HDDHfDLL$8DL$0DD$(L$8DD$(DL$09DLL$8DL$0DD$(؜L$8DL$0DD$(Q9FfA@~DLDD$(蕒DD$(HD-DD$(LHL$HH1HHXhHt$H1DH@AIDLxM{AF HI9H_HL$ f@HIcf @6AFEuAHUfAFAH 9DMvDEEEfAF D$ HChHD$HHH)H LCxHHL9%HHspIIHI)IL!IHI)M9v HHEHSxAumIII)M9LL9H%HI!LH)L9LL@H{hHL$HHHL$D@HHH{hLHL$LD$HHHspHLSxHL$IILD$II)fE È]H[]fHATUSHl$ D$(\$09Dgr <@9v1AN\ES(A9vE9sMMSM9v[1]A\FA9vJHLH;iv?A)DNAt$JDLHtfHHH9isD9A(s ty,uH9u[]A\@9sE1cAWAVAUATUSHLOMAAiMlHD$IEI9DbA9EDMUODN4M;ppDv\E;_IQI\$0L$T$Ht$ H|$I9DILBf.HJ4H9rD9HHHI9v 9r\$I9HD$ D9Dp@Hh(r}H$ML|$HD$( EODI@ HT$(Ht$`HDŽ$Hx@-foD$`MOD9vKTAHjH+$AD9sHD$MlMUH$H?;oHt$8HTH|$HtH$HtGE1HD[]A\A]A^A_L=oHD$ l$xLLpL09|$4xP|$(HxpH|$HHxhH|$8iD$E1LLHD$@LA܉hDur9l$H|$ptH$H9$t N(;L$(t;L$4wH~ L$4H|$He]1$DIL$L=doV1HD$ DDp@Hh(b\$@DIHT$XHt$PH|$`HD$`H;D$Xt H;D$PH|$p@HD$H1҉HhD$ PD$@HPAVHD$@DD$HL$TLmH AHT$`H;T$XH;T$PKH|$ptH$.ao9D$\|$@tfoD$`~D$AE09foD$pAE@fo$AEPfo$AE`fo$MAAEpLl$HD$PH9D$`_IUDEDAD9sD$@E1D$ @E aj+D$9$a|$9HD$HDx,9rHLy,tK9H$VH=Qlsa\H=PlNa7HDŽ$y G,g"gHDŽ$D$@HFE1H^ H$HD$XgfH=qKlHt YKld HyֹpbIL$HYDHIHѡH "KlHt  Kl IL$HDIEE1Ld$E11D$EAGD$H=JlHtJJlt=LCH Dx1)\gHC H=Jl1HpH3bH5Jl XgHD$HPHHBI;D$D D9|$DEH\fD9I>sCHTHHBH9r0UHHLfDH1HHHVH9w D9uI~H9{S(E^XAED$D9INH9KHsHHCCHHI9vIF~H)=loHH9LKMFMLI@L)H9H9LD$HLT$@LL$8D\$0T$(HLqd{,T$(D\$0LL$8LT$@LD$HHOIA9VPЉACFPA9VLAFNLL MHt$L\$(HM9H~ATMBU1AQAMD[dH {,L\$(AC, = I HD$1HLL\$(HP}L\$(tTI HD$HLHP|t)HL֢H\$8L|$@DLt$PLHD$L` MA$AL$%98AAD$MtE4AHHIH gGlHKGlJ`ghLXmLIfIF`H&DPAH@HAF\9HAEAA@H\$Dd$0DˉD$(DT$8E܋T$(H|$ALơyAuH\$Dd$0SguA|$(q0HH4HD$Hx HOT$0L$(<IHD$T$0L$(L` A$AL$% A$AD$E{,A~UA;F\D$pH3A9VLKAFNLA9VPACVPH$LD\$0MH9Ht$xHCHt$$$HD$hHAкH~DH|$hATUPLT$HaH $LT$(D\$0JHD$`D$0Ht@D$0Ht$`AEL|$@LT$HD$tEƄ$H\$8AHDt$Lt$PH$$H$t$0Aǃ$|$9|$0E H$HD$`Lt$hHDHpL9! 
L$Dl$($l$\1PADL$pD;9щ|$(GH0I9LB9|$0IIHpL9P(DH,D9rAM9GD11AAL$8DB1T$0DDD$(H49Ht$DD$(T$0L$8HF 0A@A DAAD@ IHPt(qH|HtI4$H2HIH9uHI@L8{AI\$ HHk9BUIL$PLt#IMuI\$ HHHQHɋpQ9 I\$H$A9@H\$A H=gBlHt'1SgID$ HxH$E11Hl$!ID$ HO9XFڿ3LtWvtH|$LD$tIF`HWpE1u\HH|$`A2I;nhtHH=IF`AHD9hDH|$L|Io Hb}uE +uHEH +qf87g@]Ld$AL&gLyCI\$ IT$H'A~U EHl$t H @lHt[>ZgIL$HQDl$MDHȫID$E1HH$HD$QE1H$HHD$DAH@ FJYg1HwQgIL$HQqIt$E1H$Ht$1y{uszI|$ HH4H!o`5ID$ h ډP_ID$ Eat$p9$B9B؋D$\9FЉT$\DH(AIF`H\$DT$8Dd$0Ht@D$(HD$HP HJ%9 ABHD$LtHPHHAuBt$t$(9Ld77HILDd$(Dd$HV1D9l$(vWD$HEARD$ATAVLL$h$H$XH D$D$PDD$(EB;l$\v)t%9݉FD9AGA9sD$tt$t,H\$8D|$0l$8Lt$DL|$@ELt$PLT$HHD$xH$H9vL)IFHL艔t$\D$8L$9FA9FXAFFX9L$(AFXHD$H@HtHFH;D$BH"HD$H@HtHNH;L$D$0;F(v8~,H9t)HII~HH9vH)INLT$8ܓT$8;T$(vD$(D$$A9A$$IL$E1H$IL$HE1H$1L\$(L\$(HHHDHHLLHt$(HL$0HH|$89HHt$(HL$0H|$8HHLLHHHDƃLKMMFS(H=:lHt$(E1t$H=9lHPAu9lE1t$t$(t$zFqtIH4HD$Hx HODD$L$n/HHD$DD$L$HP DƁJ% Bu1E1AAL$8EC1HT$0DDD$DL$H4.Ht$DL$DD$HT$0L$8HF 0Dρ@ 0`EtnHLd$D$A1HD$DQH@ H1HD$hH\$8DD|$0EL|$@Lt$PLT$HHD$Dl$(D$\D$81$7lt$(E1t$1 ^ i [ iI1$  iff.kH9HD$  HEAHuH|$ HWHHRL|@A'IGXt9AG_HI#GXHHt H9HbuHAGPVuAAG`/AGQ IG@HL|$@HD$0IG@HD$8;uH=HD$DA)yHE(L|$(HD$HD$PH$f.HuH<$D$PHD$XH\$`MgDIФL$PIǍA@Ld$XD$PƍQLHAG A?AGA{LT$PHzAG ȃAAGH|$`;uHD$XD$PHD$L$$LfH|$IwfAGIHLδA(D$ŰHuHAGHuHAGIuHAGJuHAGKH%HIG@ME1Dt$MMHHLmADM$HHc ,L$͠/HI$x =H% HuAG(A<  H|$t LH袣H|$01LHuH=IG1ɉǺ1wIHuH=MHD$DA)H迵h@HxuH=AG`AFLÄHt f8uLxAL詶A< AO fufDL`H@JfvfQIM9DMDt$A?bAG@7HHsmIG`AFGA\AIQ1ITPHR f:uLzAWI9w%HHmIG(HHmHHIGHmHIGPHmIGXH|$`tD$PHD$XAG A?AG@Ld$XLHAG кAGщ|$PIALd$lfHPf:Hx lI$HPHJP2QsHP HRBaH@H;BSH;AII $@HuHcH9{L4X5H|$`tAWHD$XLd$X кAGAu7H|$`tAWHD$X AGLd$XLT$PHփ AG AGL0HGtH=Hu6Hx[]A\A]A^A_L|$(HtH=HtLt$ @HtIVHH|IHtD`LDd$!0EIvAD$E1HHD$@HHlmHHD$tHH$tH\$`HD$XIFIH|$Ht$D$PHP$HTHAHL9LI;t(HH9u11L2%$ f.1LLL$A1L$H}(HT$PHA@$IH@HIt`貮D9d$'f.D$@Ld$H[UHt$@HJmE<($`@H% fDHcu(H`Mt LHęH|$HJAH{x^IcH`HmHFH{1]Ht$H|$ \Ll$ MfDIU0f=wf#H{Eu,]IE0Hߋp^EttMeX1I4$HI0mIEH߃HHDHHcpTe^HCAT$Ht$@D$@HD$HHHD$PmA9uMmM>H{1\Hh[]A\A]A^A_@uHH]uIH]uJH]uKH]HU@HsHcEU $E1gHc ,DHTH͠/H=nL LL$ u MfLHAmD9e EP vE1Hu(Hm@H}@HuHH߹mHuPH߹omHuXHXm_1[BMmMY@H|$PZb@H|$PLY fH|$PYELd$HD$@<E1<DmAAyIAALusB@H% fDQHI ĸ@tIEH8[]A\A]fDX6fDX&fDXfDSH1H`u.f,ƒtPt1[[fH{tJfv΋[fH{wtH{jtЃ9fwHuHcҋ[DH[+(HD$HGH\$PH_ HL$HD$HGH\$XHHD$(H|$Ht$H|$INqH$DŽ$DŽ$u1Hĸ[]A\A]A^A_Ht$H|$(INrH$utĄff$$AFtHD$HH@H@HL8@MAƒft fL9u% @HD$-e.mL`@MH@HHH@Ht I91Ll$%fDIEHHH@L9HL{A$f7ID$ f8tHLHLxHxfA?*f*t%f,HGf8*zLA Md$M,9IM~I^ H$IFL|$8H$){H$1HHXH$Io@H\$pHXHH\$x$'HD$8H@HHH@H9u H E1HD$ 1D$`Lt$@5@$Ht$@H|$PHIAXAYI D$`H("Dc D9At$HD$HsJDHD$ H !D`AΉL$0AAt$E9HD$ MpN| EeEuAE9At$RAuKlHmHLl$0H|$0Lt$@VHD$0xFD$`t8HL$ LkLaHlfI4$I}IIkL9uL-ǫiLLhM)LH{HmHuT$`D$`HL$ HkLaL|fI<$IHO{H}F{M9uH$4{H|$p*{H|$x {L{&{Mt?LfHmH f} tu0yH} 1H?E0u1H|$Ht$H|$INqH$DŽ$DŽ$^Ht$H|$9Df@fIG f8tHLDHL`H@HD$ HD$(HtDHH@H@HH( HmEƒft f3H9u% @In8/f.IF8H@H9EHLH{HHuLLE1"L:LzI]MfhfA<$*ZAD$<A$f~!f-%ozHD$8H|$ 耽KHp;L|$Ld$0HHD$A~qInHIvXIF@HD$8IF`HD$ IFPtI~k A~r` MIn(AFq]KfIvH A~r A~rI^`DKIcEƒ@ t,I~8 A~r- A~rI A~r )h%IF`I^PE1MfX@f;6D$H] AFqD$pt#INHA~rA~rhMH|$(1E1Ht$0fHD$ NdIuLI}贻IuLutI}HT$XHu_IEf8*uxwxhuAHt$Llu0Ht$ L[uHT$XHLWMLL1E1=HH?HH1{HD$0x3@HD$ Htx @Ht{CH=ԍiHhIHX17dGID$@A|$wA|$hLd$8IF8H@H9Ef} Lm fA}@LHHI)LL1H8d( QgH|$ dH|$0dLl$0Lt$@1`{MfIT$f:!DIT$f:!5 QgHL$0Iv LAHDP({L$H$LH0nH}oLl$8LL0nH}oHELHxoLoLHnluotH$H$oIlFlIdžHD$0f80hHMu)H|$@uHtHHH!H9H^{A~q!INH I~P< t1I~X< t1u{IF(HIv`LHHAvXMNPL@讬HZYehuIVPf:!f{{IvHt A~rI~`AFqaI^`CHcA@MF(҉D$@A(IvHt A~rvIvXHLD$H}LD$Ht EAVq7AFqZIvHIVXH ٓif:!'H9IVPf:!HzIF0Ht{Iv`1ɉLKHQI~`H9tHѳL艮H4IV@Iv HL|$Ld$0R({Idž[H=9iHtI1詚dH\$8I9tDj{HL_HiLzH8z{HD$AW(HH@HH0 {H|$( H|$(}o"lH\$PL|$HLy,nHQzoLt$HIFHx{oL}oHLlo lA~qt!I~ A~rA~rLYA~qI~t 
A~rMfPH5ّiIN(I9HA~ I;vXWXH;itMkLHiWA\$:]HM;fXvEFsLML̰ {L=(izHE1AjHLHZ"|H_AXHE1E1jLGH‰ɝHY^I~`H1LHIV@Iv HL|$Ld$0R({Idž [I~`WA~qI~I~`0Au2InPI^Xf}8- f;8 f}E f;EA~quI~_IVPf:!\IFXf8!NI^`f;*@L`IF(kLzHD$@-{HI9t LLSLAEHI9FzIvPHHD$HLD$HHL$@DHM諝HtH="ifI~`H9tH裯L[HIV@Iv HL|$Ld$0R(U{IdžD IgH$zH|$pzH|$xzLz^{kD$`H$1^zH|$pTzH|$xJz1z{WIF`IN(4MNPMFXpaf84H>{I~`H蛮LSHIV@Iv HL|$Ld$0R(M {IdžBMF(DL$@A(H|$ HL|$Ld$0RHD$(HsH@HH;(uHUHHD$XHHHPT {1ҾH5ڦHt HHUH=viHIHLd$8&H=IiL|$Ld$0HIHfA<$6HCf8!ID$f8!zIt$H{2dH{f?*-IH[Md$EHzI9NP1@I~`I^PIv(HЦIF8I^(H$HIvPH{+薩0IvXH{聩W$fDIFPf86IvXHxEtKI^0Hf;t8+H{1` tIFPH@H;i H;iu A~qzI~dIFPH;uiIF0H FHPiI9FX5Iv`I~P蔨{IF`xzHLڧHz IV`HE1HBzjIHAYAZL I~`H9tH2LHHLHbhIF@Iv HL|$Ld$0P( {IdžEDf;!fA<$!HCI\$1HHHD$@H)HH?H?@8 9Dd$HDy{HD$`IF0HHt$`HNH]HB(@M@S@l$hD$H1 {Mt'Iv`L覦tDfDf;! Ht$PH|$axolLH=6~iHH|$Ht$P!nHoo[H{HMHAAIv(HZ̦IF8In(H$HH}HHuf>*,f>0H95niH;5qiCH;4E:EMtfWt fZ!WfH0u Ax {IV`Ez+HHEIN`t*E1ExH0E HIN`H9t HHLˡH{IV@Iv HL|$Ld$0R({H$In(IF0IdžIF8%H}H衣AE1UH{HHA3H}H]uI~`_HuI~P6xHuI~P!5HuI~X  IvXI~PǨ VInXIDnPI~`HעL|$Ld$0Idž4-HNiI9@;'f''f1LL $~WpL $;IQHHuHJkH@HRXHtSUL $[hHL $eIAHPXHt! J-hHL $7Iy8IAt @@;AYuuA+IAxHufH`hHHϾx1$rcuQAnf.|$gHl$`HH!HD$8HH=|$HD$@A]HL$0HL1HuLH8H81Ȩtm1]HHH1H, HLH!H9*H!H9HL$8H9X1HL$@HD$HA8DFHfDH|$` H6AHH82$K7H82$V@H)H!G2I$@mFDD1DL$(QcDL$(AHeL$ LǾ L;H$HIH9G2M11iHHFL0Dh@ 6;GHH1HH8H81Dl$+H$Dl$H5 (141H$HHff@affAC D9wH\HIQC@HtMtHtHpI}atXYHLL[]A\A]A^A_LLY1LLF1OQN\QN?HHH?@8@HH1@H)HHt6H?H1H)1HHHtHDDIL跫IIADىLIQID虫_@D]HD$H=aH0H aU]]HD$H=aH0H aA.]fHD$0HHHD@<HGHD$HC@H@HD$C +D$u HCHD$E1|$HD$H|$1Hp7HHD$0yH4LhALIYIF1pIEIVH|$1HI{7H"Ht$HT$@LH|$@HLIHD$1HxR{ I5HkXLcP1Lv=H5cHLqCLIH¾<LY_zIkLt$H|$@HIvIvI} AEHu zHCLe`zIH|$0f?UG6< tK< tGG6t:G6t-G6t G6tG6HHt$HT$@LRHD$0H|$@HHptIHD$0HT$4Hp@IwHxI)zu0I}1PL~HIw_zHHMkzLLリU_zLt$HILHHL~L"_zIHt$LE1 FHIf8D`6HD$0Hxf?G6A9u HD$0Lx1L;LHD$ :H|$1LI|41LIOLD$ LL<HI\z1LHHPH|$1L041HLHOHD$1Hx`OLL$ {HL^zI}HAxH H|$0f?H H|$0f?x螙HH|$0f?b{HH|$0f?LXHeH|$0f?65HBH|$0f? ;AHA@l^G6f|$HH"H|$D$@!I$H@8f8t#H$HL0 MMP Mb(LM4 I^(H|$TH-bHl$LLT$XcHHD$ VHD$(I$Hx D$8HHx D$0H1Hp I$Hx zuLT$XmI$H AzLtIA~H@pLtIH@;pw?M9t:D$0|$8D$8HD$(|$0H|$ HD$ HD$Hl$H|$(HHHH€yBG<EAAD|$@DDHD$HXH Dc%D9AD$H|$CKHDHxH|$(HHP Hh(Dp@Hx|$0xH|$ Hx0|$8x8@=_fo$ HD$H`H k%9 EHCH$HHCH$HCfD{ZfDH$H9kIHD$> ADH1LT$Hp8I$Hx8muHT$PLT$ HLLT$ HT$VHT$HHHCLT$ HD$T@=y_HH$0HL@I$HH1DŽ$HDŽ$DŽ$ HDŽ$Ek{{At$DHL$XHT$@HT$@HL$XIHH4HD$HXH DgDD$hHL$XHT$@跶HHD$DD$hHT$@HL$XHXADc%D Cb@H1Hp(I$Hx(uI$LT$XpHB8H f8bHq8f>Tf}XH$H$荹wLT$XHLT$XSlLT$XHEH$HHbH$H$LT$XH@8H$H$"wLT$X H|$LT$Xk.HD$LT$XH@H$HT$ H$HD$(H$H$HH$?H5$$SH$H$AкLT$X$LT$XLT$XLT$XLT$XLT$XH$HL&H$HD$HD$HID$HHHt HHI$HT$(H@H@HH HHL$ ZbHL$ HT$(AHH9T$H|$8 =m_I$H$@H|$Hp8!f.HD$ED$|$H| LHt*ooCFoC F oC0F0H{@H~@HHHHL9uD`H& Rgfffɉ)DAOH$AغH1DŽ$HDŽ$DŽ$PHDŽ$kXD{uHD$IHHH`HKozHHD$AH`k%D CE1E1A IML$H$H$H1DŽ$HDŽ$DŽ$ HDŽ$٬k @1E1ADC1DAHH耬H|$H`@D EPH|$H1fuLT$H$HD$AHǀ`E$Hz]fDI$HH$HLH8LBhHH1DŽ$HDŽ$DŽ$ HDŽ$vk@LT$X?w HHHH!H$H$LT$XH$LT$XH;$BH$H9$^f.@H8HDHHmgf.SHH[.gf.@StH\HԼHHvZf.H{; f. 
H DHHf.SHH߾[YKfHm H`DHHMf.SHH߾[ KfUSHHHHDHWJHHt HH[]ÐJHtHtHNNVH64line map string pair map hashalloc_entriescanon_relative_path_prefixinput_location_and_blockcf-protection=full-fcf-protection=none-fcf-protection=branch-fcf-protection=returnFunction is OpenACC serial offload paralleloacc kernelsoacc serialexecute_oacc_device_lowerscan_omp_forenclosing routine2-p6e_to_uhwicreate_artificial_child_accessbuild_ref_for_modelsign_maskNot vectorized: Incompatible number of vector subparts between %T andvect_reanalyze_as_main_loopReanalyzing as a main loop with vector mode %s ***** Failed to analyze main loop with vector mode %smaybe_set_vectorized_backedge_valuevect_reanalyze_as_main_loopalready exists: %qsinternal call LHS type mismatch=|checkzero start column(SUSE Linux) 10.3.0N9__gnu_cxx20recursive_init_errorEstd::bad_allocSt9bad_allocpure virtual method called basic_string::append,,Ԥ\()uninitialized __any_string4HHHHHHHHHHHHHHHHHt\D,10ctype_baseSt5ctypeIcESt5ctypeIwESoSt13basic_ostreamx|\4%.*Lf%m/%d/%y%H:%M%H:%M:%SH8))))))@))))h)))))))))8(8))()))) ( )))))X )))t(+*) )2`5p4@4h3hP8d$8  hLԐ$̒l$D$ԗt4ԛt$T|$Я`/dev/randomP |T,$p&./`'!!!.#!!!#!!!!8$$`%!!!%/'!!!!!!)*P++h,!!*!!!!--!!!!!-!!!//NMOdONXWY`YXSt7collateIcEiostream errorUnknown errorNSt8ios_base7failureB5cxx11ESt12ctype_bynameIcESt9type_infostd::bad_array_new_lengthSt20bad_array_new_length,TdtN10__cxxabiv117__class_type_infoEN10__cxxabiv121__vmi_class_type_infoEstd::bad_cast@@@@@@@@@@@@@@EEEEEEEE@@@@@@EEE@@@@@@@@@@@EEE...H......................... ................................0 hP`N~<4rv d\Tl|t]                                             CCC(CCCCCCCCCCCCCCCCCCCCHCCCCCCCCCCCCCCCCCP | "1E8)('(0( )h!!+p,*+*x+^ ^  / 0=2x3,x3--.{/K00@1T1X"#0%0%& 8&&3X44H55866(7C8$xPPPOPUUUUUUUUUUUUUUUUUUUUUUVVWVVWWWTXX$Yd\[\UUUUYUdZUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUZUd[|Ucomplex imaginary Sd ,!T!d! !!!!!!t! ##($P$x$$ %h%%%P&&& '`'''H((@))**a+a+a+a+a+a+a+a+a+a+a+a+a+a+a+a+a+a+a+a+a+a+##x*x2y3y3@3y303 33222x2x266x6`6@6655y3@3p7p7p7p7p7@3@3p7p7@3p7p7p7p7@7p7p7p7p7p7p70777777777777777777777777777777777X5X5X5X5X5X5X5X5X5X5X5X5X5X5X5X5X5X5X5X5X5X5X5X5X5X5X5X5X5X5X5X544444444444444444444444444444444x4y33y3@3y32y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y3y334444 5//5 5/+565A5L5/W5j5}55557887h788KPLxLLKLLLLL@LLK;AY 2\)2' *2]*2^t-2b/2gH024i22Ğ22؞22T2232r32632N32f32~323232Ĥ32 32432242=J42LNb42 Xz42Y42Y424]H52(f92f;2|;2LC<2sA2XB2F2G20H2-H2gI2plK2,K20K2HM2d?M2UN2U&N2>N2VN2O2ZO27Q2OQ2X(R2(R2DLS2qS22S2JS2bS2zS2 S2 S2 S2 S2p 8T2x PT2 U21 U2 ]U2ȟ U2 U2 U2 U2U U2 U2` V2| .V2 gV24 V2 V2 X2@ Y2| ^2! 
&^2Hc _2d _2d `2f >a2g c2h d2|m d2 y 8e2p Pe24 he2\ e2T e2h e2| f2 j2| k2 6k2X k2 _l2 m2T y2 y2 |2 }2# ~2$ ~2D- 62p: R2; j2P^ 2Dm 2Ԓ /2L 2\ •2ė ؛2, 2 2ě L2 2 2X 2 /2 ^2$ 2l <2 T2 l2@ 2$a 2Ld ҳ2( 2 2 22 J2 b2 z2 2 2( ´24ڴ2525 22<2hq22`2ط2O2Ƹ2Ƞ=2L2x-22С%22(2T222آ7220'22,52X2*22ܩ2242`2,222D2p 22ȭ2o2 2L:2^2002t2222T28222>2m22202H2Z28l2~ 3 3+ 3 3 3 303`3H3l33$!3p"3%3T4(3 )3x *3 ,3, -3x  /3 S03 23(23p$43T\73<3=38?3Z@31B3D3+E3[F3$G3l:I3J3;K3LM3,N3O3DR3T3(TW3tY3[3 B\31a3d38Tf3wg33c13hcw3sF3x4tyy4`|4|3 4| 4D}4}4L4@4ăr44<4x 4̅e44,!4H}4d454Ć444Ў4pW4d44 44z!4Y"4̓"4#4$44$4 %4x%4ě-&4'4x?(4(487)4p)4G*4̞*4+4dO,4,4̠.4,.4X/40/4|04Hu14Ф 2434`44L74E84 84X94m:4Ȩ+;4`Z<4=4,=4d>4:?4}@4A4B4C4%D4D4tuE4/F4 G4ȼ#J4<?M4N4O4uP4>Q4PR4R4S4 U4LqU4@V4V4$?Y4tZ4Tg4*i4\Dm4\q4Tvu4y4LH|4}444U4*4hY44L4_4404|c4844\4D4;4t<4<I4>48?4m44}44p44@.4-F4d2424l44Pg4Q4Q4R48R4LR_4HSn4T4 c648ct4xp44 4?4d4|4d4pc4h f4 4FD4Ft4\4D4)484 94L;44|44ܥt4D4d4\4(T4A4\4p t4 4 t5@5 D5 5 D5 T5  5| 57 5\Q T 5 5 5 5@55D$5P5 T45Ĉ5l5,C45hzd50t5Tl58584 5tw 5t!5,1!5,"5"5"5"5"54#5D#5T#5d#5$t#58#5L#5`$$5t$515P25xT3545d65@5LT5di5i5i5j54j50dj5Dj5Xj5l4k5k5l5l5l58DI7P7P7Q7@$Q7T4Q7hR7$R7DR7dR7R7R7R7R7R7,S7xT7DW7X78$X7L4X7`X7Y7Y7Y7Z7Z7Ld[7t[7[7[7$\7t^7(D_7x_7_7a7 b7D b7x c7 d7 d7 !d7 !d74!d7H!Te7t!$h7!$i7"di7("j7T"tk7p"m7"$p7D#p7p#4q7#r7#tt7\$t7$%u7X%u7t%tz7%}7 &}7 &7p&D7&D7"7p$7$47'7'7,'7h't7'7'7(D7\(7(7(7)7)D78*7d*D7*ԟ7*T7+7,7,$7 -7X-7(7)7,)7,+7|+d70,$7-7.7H.7t.7.7.7.7/7/70T70d8$1818181d818 28428H28d/$8x/48/8/8/808D080 82 82 82 82 82$ 824 83T 803t 8H3+83,834-8<4.84D/84/844084085T1845t28`5t38538578 678,6886987?87@87$H888H8T8$I88TK8894L8t9T89`8:d838D686T807$8p8898`:8;8;8T;8;48<8T<8p<T8<8<8<8=Ħ8=Ԧ8(=8@=d8X=Ԩ8l=8=8=88>8\>8|>8>ķ8>8>T8?8?80?8D?t8?8?8?8@$8@48<@T8@8A8,Ad8@A8TA8|A$8At8A8 B8BT8B8`CD8tC8C8C8C8CT8Dd8Dt80D8DD8XD8D8DD8D$8D8,E8E8E8:yT?:yd@:y@:yA:zT]:P{^:{_:|`:|e:|i:t}$:P~:~4:$$::d:T:t:<Ԙ::zt:zd:Dz:x|:L}D:`}:t:$:d::,T:P:Є::P:T:t:d:؆4:T:pT:ԇ:$:t:0::;8;T;ȉ4 ;`d;;Њ;; 4;8;|;;Ћ;$; D;4#;$.;̍d/;/;T0;<0;\DF;tF;F;G;܎tG;tQ;pth;h;t;;Ԥ;܏Ħ;,;\d;;ؒ;; ĭ;<;X;Ա;̓$;4;IJ;0Բ;D;X;l;;;d;;,ķ;@$;\d;;;<D;;ܖd;d;H;D;;0d;;И; T;p;; ; $;P4;d;;t;;ȚD;d;d;4$;l;;T;D;;؜;;TT;;;4;;P;lT;;4;<;tt;;ğD;ܟt;;d;t;;̠t#<`&<t-<-<-=X?=?=?=̷$@= t@=8DD=D=D=E=иE=U=LU=`tV=|V=V=$W=tW=̹DZ=_=L`=p`=`=$a=Da=Ⱥta=ܺa=a=a=b=c=0d=\de=f=ļDg=0g=Th=h=Ƚi=m====,D=@4==4=L=<Ԋ==TԌ==Կ=Ԓ=\t====P=d==Ԟ=4=H=\t=T=(=t===($=tԷ==8Ľ===$$=8=p$=T==,=x===,4=@D=TT=h=|===$>4>D>T> >8D>\d>t>>>@d>h>4>>>D> >Td > > > > >>Ld>>>,4>@d>T>|>>>$>>$>0>h>d>D>(>">#>($&>xD)>t)>t*><d6>t6>6>T7>9>P=>>>?>B>LtB>tD>J>4K>4K>hM>N>TN>N>tP>TQ>pdS>T>U>X>,Y>DY>dd>d>e>f>j>T$n>$p>p>Dq>,t>x~>4>L$>4>@>Ĉ>>>ԉ>>@>lċ>>>4>>>$>@>T4>hd>Ԏ>>>>>$D>8>T>|đ>>>Ē>Ě>X>l>T>ԛ>>>t>4$>p>>$>>HԦ>t>@>d>>>$>>4>Hd>>8>ı>>D>>X>4>t>,>Hd>>D4>D>>>@D>`d>t>>>$>0D>t>4>d>Dt>X>>t>0>\>>>T>T>T>D>4>>>D>d>D>d>>,$>`>t>>$>>>>,>@>T$>ht>t>>$>d>>T>hT>>>D>><>P4>|>>$>>,>d>>d>>>>>4>>?T???$$?8?d?$??D??$?p$ ?d ?d ?LD?t???(?l???d?$?L??4??d? ?h"?$?%?@%?T&?hd'?*?(+?HT-?-?-?.?/?d/?$0?H0?\0?p0?1?DF?TF?,F?tI?hJ?tL?N?<TQ?Q?R?U?$V?@V?tW?X?X?t\?d^?b?d?Le?r?tt?u?Lv?w?}? }???$?$?T??ħ?P?t?@T?d?t??0?tt????0?Dį?X?D??T??44?xT?t??? Ը?\?p4?t??D?T?0?x???4? ? ?X ? 4? ? $?4 ? ? d? d?L T? t? ? d?0 4?P ? ? d? ? T?@ d?T t?h ?| ? ? 
H5 ,~1H5q l9uH Y } 荍It$I|$P讈I$tcH5$ ",~1H5 ~9uH I$0A|$ TI$tiH5 ~,~#1fH5 |~9uH i}  蝌I$耮I|$xH55 3~-,~1@H5  ~9uH } -I\$xHt%HH=ΎH1H[HuI$tYH5 },~1H5 |}9uI$H=a} H1誁I$H5< :}4,~1H5 }9uH  } 5I$HdH5Ս |͍D,E~1fH5 |D9uHUH=p 1H}Ht'H n}  袊H趝HH ?t'} qH襭H E8dEE@'E8@E:E8Hο {H}`uB/H5){Ht$H5 {HHI$H5b `{Z,~1ېH5A <{9uA$H=!AH M$ {,~1DH5 z9uH ы ~ MMt`fH5 z,~1H5 |z9uIF H=e~ H@(H1~M6MuI} @GD H55 -1,z&,~1DH5  z9uH '~ 1#I}(@Gӊ fDI|0tI|0HH([]A\A]A^A_f@<f. @AWAVAUATIUSHHt$PH|$ =e~tH vLuDDH|$p:H []A\A]A^HT$&H []A\A]A^fDAWAVAUATIUSHHHuHHFLkHHC2HwuLkǀƅTI$HCH9CHH9C HPHSI$HXHZLLcII}IE H)I9LHebMuE~ uHDžH9uHtIHHkHLc1HtHkǀH[]A\A]A^A_fDLLI}T@H㙺HCf.AWAVIAUATIUSa HӺAHCEPACHHHHVB  H€tgA<$u0HDHHIu []A\A]A^A_1Zf.HEDHHI[]A\A]A^A_g 1'HHHH$[]A\A]A^A_1ff.fUSHHHH?DD$,L$(HT$ H;HHž1UH}gH;U6H;H[]G6AWAVAUATUSHxHcF@L, HHL4ŀ MfL'L+XLHA|$CʿA|$CI<HLL 1Hu(H|$01ID8Hu(Lt$0D|$8DD$dIHLHt$1${tI G E&wx wH AG D@v~ vH QG D@HRntIvHff.f@=vS8 ؃ *   wt t\ut@5t ytfot[jt_f. JtD؈@t_[Ð2t f(t!t @5t tt s s s [s\Ufs_@ \xssf xs[ss_=xtu 1=xfuXu@uXu`@ǀuj@ut@ u~@@u~@u~t  f. f. f.А f.Ə f.ԏ f. ø ø H G MAfDUHAWAVAUATISHH0H8L@LHt))P)`)p)])e)m)u)}w!#=xwA=  rLPIIDHeHIOIID HtH=pH)HH $HLLHHWJ|#IVLB# iWHEHH1DžDž 0HH H 51u He[A\A]A^A_]@tq1uLOIljH{HIOIID HtH=H)HH $HLLHHVJ|#B# LXHEHH1DžDž 0HH HHe1[A\A]A^A_]@HH-H $mfHH-H $7fATUIS\$ H5J4H$HL$pHT$ALHHD$H$ D$D$ 0HD$ŠH=gHމl$dHPn蛋H|$pH9n,H[]A\fDATUISHH$0H|$p11H\$ H523H$HL$pHT$ALHHD$H$ D$D$ 0HD$譟H=fHމl$dH8m胊H|$pH!ml+H[]A\ff.ATUISHH$0H|$p11H\$ H52H$HL$pHT$ALHHD$H$ D$D$ 0HD$荞H=neHމl$dHlcH|$pHlL*H[]A\ff.SHHH$(H$0PkHD$fH|$ H|$pB)H[fSHHH$(H$0mH|$  G v9ff.@USHHcHHiHkHHCI HbiHkiǀH[]fH iHHPH9PtxHtrUSHHbHHkHHK H=bMiCiG<W Ѓu4HkH=ObH[]釄ú G a8SHHH$(H$0H|$p11H5,HU`BBB<~ vUH$HT$HL$pH|$ HAHD$H$ D$D$ 0HD$H|$ ` }G 6USHH_fhHHfHkHHCH_HkǀH[]ff.@ifuHFfHHJH9JtH@9fHfuHHqH9qHUSHH _eHHkHHSH=^HeGHHPH9PtHu1HkQH=^H[]@úP G 4f.H__fDUSHHHG^HHkH{H|H-^HkǀH[]HtH ^B<B HtH]BBB<@=2fH]SPHHǀ jGHǀiGHǀeGHB0cG12fP|YHHHWdƃLHMd_e[ff.HH=M]@/WHA]^^Hǀ jGHǀeGfP|HDH'AVAUATUISLHLLքwnŋ$ LAԄt^A}]AEfPHsX 11[]A\A]A^@HsXh 11[]A\A]A^A}}A}tIuX 11f.A}5AEPvtHsX 11bnDLAԄzA}AEPvuA>AFPw$LAԄ2A}A} HsX0 11DLAԄA}0A}IuX` 11LDA>.AFPw\iLHLL0suA> A~_AF A9E HsX 11fDA>A~LuA}uA>A~A}AEqLLj#/LUV\L@IL+4L8LHLI9ff.@?t 1HGHHt H@(@]H@ h Z/?t 1HHGHHx(d 1H?t 1HHWHHHt%HOXL@`Hz(H5 aC Hú` Ph .HGHL@(A@RuRHWPHt@2tL1u zHtYHt!1HʸtuHHHu1Ax t:HB@Hv t1ftHtċ1HA@QtI@(H@8%=ff.fHHHu1f:uHHtHPHuH1HÐSPHxH褂HHC@HCPHCH[ATUASHӉHHǀ@8H@@D`h to@X[]A\fDSHgH@to@X[ff.AUATAUSHIHH%HDLHto@XH[]A\A]fATUASHӉHHǀD`h to@X[]A\DAUATUSHHIԉAAo$HhDh @XtYt$u HIHH[]A\A]DD@HH,;HH[]A\A]fDDHKHH[]A\A]ÿ 1jf.AUATAUSHHHHtHHIH۸P_DHDؿHLHH[]A\A]HyґHIľ HoADATUP_SHՉH@HHDHx<Iă?w @)HH|$|$,I$H|$ 1H,$D$Hd$ D$(j}H@L[]A\ff.HP_SHDH[USHHo@@XH[]ff.~s8H{@H{PH=H{HHǃ1HH)H[HfGvztUu HH0[]A\1 H0[]A\@{t2t@ 1벿@ 1oDHØH4.~@krh 1a 1|IHH1RxH1/A$f.h 1f.ATUISHHH0HE D$ @t+HCPHt"DHHAE@tHHHu;t>uwH0[]A\f1 DH0[]A\fD{tZt1@ w1@ H0[]A\fD1 Dxf.H5FH1H}|$t!+?1h .H$I$1h 1u HWPHttHHu1fJ ~胺tHH fH9tHxuff.AUATUSHHtaAHHcnEI~.HAD$HMdDHHHQ9HL9uHL[]A\A]f.E1ff.HATUSCHoH8 oooooow[$X f.HwH HHH@s8H{@\H{PHE@HHEPHt HH[]A\HHHHHt0HCxHxlHKxHHHHQ7} LE$Ű f.HHHHf.HHx놺HHHHHF71fH#HvHf.} L4LH/H1L0=S} L艂L?H1LMDHc腺HcHHH;0fDHH5 h @{HHHxjHHHuH{ tHDžBHHH5& 12fAUATUSHHH(넺oIH{MoC@HC IE IE IEMfD(衄oHoC@HC H{HE mHE HEIl$ IH[ HuHL[]A\A]fHE1[L]A\A]ff.@USHHtUHHŋErPtst6HHHH[]f1HH[]@CEH{7H}HƹZHH0;fDoCHCH{EHE^H{HEQHEboCELfHSHt7HHvHBHCPHHHFPHCPHǀ[HfDHGHt HHt8t xuAWAVAUATUSHHHHy}I}tH[]A\A]A^A_HEHtHHtHHCH8HǘUJtHuHItHE1HtPIu~HHDfLhHIUHpuHL`M}MwMt1HCH8IHHH[]A\A]A^A_H`HsH=W IG낺 h  h f.ATUISH0HH@PHt*8HHt}t)HHHuH1H[]A\HǀH-Q{u1 tHCH8Ht ?h;uH=b1HI$OHHHDEp{,;#1 s HSXI$HH1GI<$w{;1Ŀ s HSXI$HH1HcS HCPHC HI$I<$Ht 
?tHH1[]A\H۸H߈D$D$ff.fAWAVAUATUSHHH}:uxzurHHA"AAxWE9~RA|$1HcF~IIA9tLHI<.HA9HL[]A\A]A^A_fDE1F 9G MG DB fUSHHb(HH@H}HHhHH[]ff.fAUATUSHHELAT$9AHPU AD$ G9Oƒ}G u9E tHsE1DHA|$u C A9D$ teHHsE1DH[]A\A]pHH[]A\A] kDoEGoEGoE(G(H[]A\A]DAD$ 9E t~DHuE1DLoECoECoE(C(H[]A\A]It$E1DHfDHwE1DLfAoD$HwE1DGAoD$GAoD$(G(HUSHHHtB; !$ HCPHt'HxHCPHXHufH[]@Hd;HHtH}zHzHHuHTtHHtMHCHH@(@]H[]fH{HH~Ht8HHt"H}tHm HuHHt @H!H1[]ÿ 1|@HHx !H<fHtHpл h LDAWAVIAUATUSHHH@H|$`HLJHD$xHF8D$$H$/H$/H$0D$$cL$L$HL$@H$ HHD$hH$0L|$HMHD$H$1Ll$HLt$PHD$@HL$IDI|I"0I<0HD$H< 0I<0HD$H<H/IGH9uHD$L|$1E1D$\HHHD$pIG8CDd$XK0JHHHHL$0HD$Ht ;OMt A>K H9>tHD$HƘHHHD$81HD$p<(HD$PHL,(HL1HD$@LHHHD$(1HD$L,(HD$HMHD$IL1CH\$Lt$(HLn?HL[?H\$0LHØHfDH|$8+HD$PHL,( HD$0LH-fDLd$(IT$t$X 1HL4H$1U&H$H&H$;&E1@l$Ll$DLt$HL|$PLd$@1f.I<&I<%HD$H<%I<%I|H%9l$$l$H|$xQH[]A\A]A^A_DHD$HHHD$8*fHt$L,C1HL,H;H{8H{@SHIJH@HHD$@H<(),H$HS@HHc|$\HD$`Hs@HHx@+Ll$HD$0Ld$(Hl$0LIŘHLl$IfDHELLL9LLL9H HHuHl$0Ld$(D$\H$ LLH(&H$0LH4(HD$( (6I9AEHH$@LHH+9AI0H$HHcpHL$`HA@ H$c#H$V#H$I#H$<#D$$eH$HD$HH$@HD$PH$0HD$H$HD$@H$ HD$f.Ht$` 1HXH$1%P th 5 P h ! P h  P h fHuHt1>tDHFHHx(1HtGQtH HBUtWPtǃAuH@?t 1ff.HUSHHu^HSP1HufHHHu1:tj;C8uj1Hpt.HH[]fDc]HsXH¿ 11H[]f.HjE:HDfHsX1 H1[]HaHSHCHHx(i>Ht E? /HCHHSX H@(H01eHCHH@(HW 1DfHAUATUSHH?j$x HG[]A\A]l@1HDH[]A\A]HGPHx31H[]A\A]fDG.1fLMHHtH}+tHm HuLA$0kAD$[<<AD$W0@HHu#H[ HH{u1&HGHLh(FtJAEQtCI H;uBu3HHt'@Wt!IuHSX10 1fDAEP UAEZ <eHkPHtEEt1.H}qH}`HHuCp%HIEPAEZ4E1{fDfATUSH0Ht3?H%$ fDHHD$ @:fH0[]A\þ@G&f1H0[]A\Ð@G&tH0H1[]A\HGHHh(0uHExHtH0XHH@IHHu}DHm HH}u1SHGPH=Hx,HCPHx31fDG$G:HGHHGHt.fE8tHPHt :HHYHuH0H߾@G[]A\鯯HSXHK`1H趫1H脬HHNHCH3 HP(H*H€HHufHm HSH}u1H蠮HCHHH(AZ1HKHHA(HH3RcuH0HSX 1BImBHtMH}HCHH@(H8l-HC@H9@HCH3 HH(H)HUBHw/A u'A`  LEHIYI<$HuHSXH0 11fH1HSX` 1HCHH@(H8,H@HH8H0@PHSXf%Af= 1VvH0HSX 15UH0HSX 1=HCH3 HP(H*H€ HHHSXH 11˿HqHSX0 1貿@ 1衿HH}jHm Hu  h '0  h HSX1Й 践1lUSHH)ۍu)H[]DH<w)uHt;uH+u1HH߾[]魪ff.fAWAVAUATUSHHxLPHD$(fMtOAyuHt$(H~tsHt$(HMH{PHCPHx[]A\A]A^A_It$ KT'@1H|$0%H|$@H|$PH|$`Hx1[]A\A]A^A_fDIG8uIG8uHHHD$(HCPHPHIH:HPHH1H9E1E1Hl$([LHHHl$(HLL HB(DAG; ItCP LHCPnf.HMo0MwHEfTA Au,PfDILpHEHAHt&IIuE1Hx諀 HuMM9t"f.M$HHM9uHHL{PT@LHEL{PAG@;t1HL{PHH|$@1HD$H|$0H|$`H|$PEO EE1CIG8IJ|'K0}4IG8JH>HP:HŘHƘHT$HT!AL$L$ HT$HH˜HHT$'!HT$H|$0HIHt$0HT$PHHt$@HT$0HEH|$`IG8Ht$`HJTH˜IG8Ht$`HJH˜}Ht$PHT$`HD$ A9G IG8McMuJ<@H|HHt:HCPLeHH}ID$PHHHuƋsL{PI LDHHHu:HHH@88xLHkHH}Ht qHk={9L1L{PHE!HPHHtL$1YHL$IIAA@HLHHtH@o@Co@Co@(C(L{PsIH|$DD$DHEt{HD$HL MtnHugfHIHHtQMtLH} tH|$0 H|$@H|$PH|$`H)LwHN,_H|$0H|$@H|$PH|$`H>HHL{Pp Kh  HCHHt&H@(1H ` HHC.E1H=+1_ HHC h  h AVAUATUSH HH|$t2?Hw*$ HPHt?VfDH []A\A]A^@HPHT$1H []A\A]A^fHPHT$tHD$8uHxPuH8uHx@tHHHu0Mt L[HHHTHC HtAHxNHC Hx7HC Hx H{HtHIoH{L[nfDDALLLIL7LeMtLPAHǃHǃD$H fDHGHHP(BZHm HuHD$HHt$1HxSHt?t)H@ HufD1HĘ[]A\A]A^A_@H|$III7AoFIDžIAEHAoFAEAoF(AE(AoFXAEAEXA+nJDXHR DIHtmHzHt E#HBD<tNjw8uuBD<HHT$DHRHT$JDXIHR HuHcHD$HD`HHD$(IHD$8D$D$4EHDhHD$L賿H|$81HdHD$ H@HߺMH\$`E17fD{HIG IHHtLHxHOH;\$IGMd$ tMM(u;HIHD$ H@LHHuI|$HH;\$IGMd$ uHD$ Hxt$4HHHtHйH\$`H;Ht#HHH;\$uDHt'H9~sT@tH|`HL$HT`uHtٿ 18LPLHHT$tHT$Hz@BtCt ELXMoDHSHHtB8tMoCxI@XoC@oC@oC(@(HIT$RZHL L8LIDHsE11ɺL҆1H8H[]A\A]A^A_fDHCHH@8@fMw fƒ IHCHH5@Z1Hz1HT$HHD$HT$HT$蛞oCxIHǀ@XoC@oC@oC(@(>{ HHT$IHFHT$HHD$HT$H5HL$(' vR IH@轴 1\ff.1fATUHS贝HhHHHE(o@ PZCo@0pCo@@C(tWHH HtUtXC81HCPI@HE(x tFHxID$8H[]A\DuH@(HHtH@`HtҋH@(HHx`ff.@ATUISHGPHHteHHu oHHHu0HU8oEX@L`8P @([]A\fD0HEPHff.USHH"oEHHu@XoE @oE0@oE@@(H H8+ } HCHt(HHtDC8tHHH[]ÐHE(HHtȋP8uHp`@C8HH[]G8tqwgHWH1Ht HB(HHWPHtJ1@uHBH@`HHt uzHOHFHufD1@AWAVAUATUSHHL$ HHIHT$ ՄAA|$A<$ |A$$0 DI$L$ HHuUI$L$ HHnu:f.Ml$PMt,AEcAHD[]A\A]A^A_DID$HGH8H;?2L$ 
HH@I$CHIuVfDLhHI@QHPHB89j8E1@;@V@H@(H_PHh8@H![]A\ff.@HGPHt"DEupIHHuHH@Ht2H@PHt)@uPHHuDH@1HGPHt"DEupIHHuHH@Ht2H@PHt)@uPHHuDH@ 1HPHt uGHHu1@HHff.@?t 1Kff.US1H?tH[]f.Ht{tfHCH1H@(HHthHCPHuf.HP8jHHttu Yh %fDHCHHtH@`Huff.f1?t HGPHt_1D8HDHHuHt@HJQt1uHA@[fHAH@CDWt1uHG@[DHGH@CD1?t HGPHt_1D8HDHHuHt@HJQt1uHA@\fHAH@DDWt1uHG@\DHGH@DDAWAVAUATUSHtkt&1H[]A\A]A^A_f.HHtDHhR@H1ۃHGHH@(fHGH1H@(@ |HGPHo18HDHHuH HB1H:H@H-hR@Au G8HGP1HxH 1HHXHGHH@(t'HHA@8%@@WHS DrE1@Mc@ZE&N(MJHA89HIHL$L$L$HL$JH.?uqukHǘ"uVN(JMtA1Ht Eu9tL1M9IB@16HHHH&DHC0JHt8uHHE~JHtE 1N(JM,E!fHC0IcHDH8HIH ${H $H&H@@8%@@@HGHH@(W@RM1@P@HHB@311P@R뺺й h й h HC0IcHDHAWAVAUATUSHHHHXH|$PT$L$脜DUEu D$RHEHHH(D$Zj@D$uA}tFD\IuX 1 jkIEHIUX( 1Hp1iEIUHHR(H HHtzYyBSBSff.AWAVAUATUSH(H_PHtHt$HHuE1H(D[]A\A]A^A_HLkAu؃{uϋC ~1E1DEH9C ~HT$1ɉLC]AtHC8HHHHx!X At:EtH|$趩Ht$HAHEDEH|$聩~{tH|$轹XuH Ht҃8uH3tf.HhH|$^fAWAV1AUATIUSHMEHhH<$S HzHNHD$qHT$H4$I1HoB@oB(@oB8HHPHLxX@(Lp`Hf IEHHx( IEHHH@(PZptPZHY HP4pEPZH$D$0HD$ HD$0HD$(tF11ېH۾(tI|HC HËD$/wmHT$(D$HA9HCuHhL[]A\A]A^A_D3HI벺 h Z th FHT$ HBHD$ ff.@AWAVAAUATAUSHMHHL$E1H|$H較@t?;.MHSXL 1E1eHHD[]A\A]A^A_DuAGZI|$Ht4LHHHtH7I94$HGuHCWH{ߐHE1[D]A\A]JZAAAt~jHH 1ZRH H:H0HBPHJH2tHteIH꿠 1ZHt5IH 1ZHH 1ApZHp 1\ZH 1HZHAVAUHATAUISHuAuAfDHkHtEtFMd$H[ MHt 1iZftf.I$1DpZAAt,HEHHx(GtI$x @P@ 1}YI$x KH;I$HuX H1YDHH xx,HHhHEf.HhgEAFI$x tu@P@HuXX 1XGf@V@H@(Hx:HuX 1QX A[D]A\A]A^fD@V@tH@(Hx:z}HDHuXH¿ 1WfUHAWAVAUATSH8HHAA aIHE1@JBHIM@ IHMAA tMLMIHLMHL`IK,LDH8HpHH9HHv uHu HLI\$KD,HEHE@HCHEHuHHHt9tUHMH9MHeD[A\A]A^A_]=fytHCHDhZAAH9uvHCH@H8LcIDmIHHSHLB(HPHL;B(tLH$@HSPHpPHAH@EA@uɋ>;:uÃtMrltWAx twHHH@H@Au|u:;>vuHNH9JtbfLJ9NDt:f.{xuN9Ji EIHBZ}ut<@ƃ}@tLL LMHEIE}HL0wHKXHLMLEH¾ Q1L1E1oIXZL9esI$H@Ht,8u'IUHZHPII;?%0 1T 1TX 1TfD tzH~ t=HHtQHҋt9 tzffHV(B]uHHHR`ufD1HuHG(@]xHH@`ofM_  LDHLHHGu/A4t'HQu(HB(o@ Go@0Go@@G(HBH@H@(fI e LƸ€t"c  Lff.MAtHLt EAHtU 11QHEȾ 11Q@tkt7EAHHuI 11Q` 11Qf.AHH1@ 1iQf 11RQfH1 QH` o 耉fSHHtHP1Ht BR[HCHHP(1HH9tAR[fDGP@u* tDHW(HHtz:yfSHH0Hn$ƒtH0[fDSZ uff.fATUSHH9W At 9O$$AtvAupGCPtHs 1DL$PDL$tyCZD9Dο@HHsH[]A\( 1?PAuHHtxtHs1 DL$ PDL$uHC(H8yHs1 DL$OCZDL$D9_H[]A\DHHsH[]A\ 1OfDHw1ʿP DL$ DD$tOADL$ DD$#HC(H@8ff.AVAUAATUASHAHHF(#H@4JH@H@H@4H@HX(H CT@eLML1H@HuA9rAI>EAAHETED%E11Ƀ[MvM.I>H E1AD&MvMEI> HY E1ADMnMI}H# hAADMuMI>H 0E1ADzMnMtQI}H E1ADHIEHtH8 o ȄM]H[]A\A]A^HtH@xHu3J t`uH9j(fH@HtHHtHHtHHuHHsX []A\A]A^1oLHR(HH9jufHDD$ DD$ fH3HS 1DD$ LDD$ LfDH3HS H1[]A\A]A^KfHaGfHSH3 ƺ o u o af.HGPHu_DHHtLuxuH ~ჸt'HH f.H9tHzu1Ht#Sf.H_'HHu[DfDATU1SH IHHPEHt$ 1 yEu4D$ ED‰HP[]A\fD1 .EtyHt$1 EH|$Uj9tlHP[]A\fDEHP[]A\f`fDH|$'Ht$1 oDuH|$tat4_H|$;E\~LHVE~YLH.Ew8 1I0~LHE1~LHəEq).AtHt$L蓙EgfHt$LsE+SHP63HT$Ht$HBtY83uC$t[u6 tcupH|$Hh 1GHP[fD$uHT$H|$1zuHD$@Ut+@QuBHnH_끃tfH0HxP12VHD$H0 1G;f.H1 k=u H1u°H1 FHUSHX1HT$ Ht$H|$|1D$GC$ D=Gs,|$ Hg,v\HHH uHHHǹHuHH HH uHHcHǹHuHH3S_%HH uHHHǹHuHHjHH uHHHǹHuHH]O_%HH uHH"HǹHuHH;Hu Hx Hu;HKd1H6CH޿ qCHX[]1 TCHX[]fD:v/tZu*H&HH޿` 1 CDtֺ@` 1Bnf.HHuHt$HdtH޿( 1B+H 9HcHt$H!Ni@9|$ 5V 1EBHH޿ 1)Bff.@ t>DA t5t{t6uHHOHHt8AV t H@@]uA]t"DAt@A uyD9t\uDu , @1DAuHOHtLNMI9AD u efD tStF9GÐHNLMxAV t6MIAA]t+HOATVHvHA]2@AUATIUSHHIHu!HuH6H;Ju fDCB2EBuC8U81ƒCt1H[]A\A]fSEtMu L;ku L;eHuH{t{E덐uHCHUHtHtH8HtH2Ht?u>u> LSEjHu`H{`%C {L;kt)L;etL;kUL9eGZAF <&fDH$8@ <&@AEU7 1<SH HH u{ [fG t0tF` tfD~ u1fD~ tF` uDF` t @USHHHotC` uHHH[]H[]HHu H9t.WZNZpt(pu7ÐH91у`u u!puWT`uGPtGNT`uFPuff.@AWAVAUATUSHXHHt$HT$HIH1@H@HuHcT$8IعAVHpHD$0HщT$uMvMu;\$bALH1H|$0HX[]A\A]A^A_H|$t-fDþ11ֹHD$0ff.AWAVAUATUSH(HH<$Ht$HL$IHt$H$H@P MtH8LϑuHCH$L|$Mt"Ht$HtIH;褑uIGHD$HD$Ht}HH$HH8[S(t-HD$HH$LHLuCLHu4H$H@H$HD$H@HD$H<$(1H<$tH,$DLmAEP 
MMetLLݐtqH|$L|$fDMMtILH;詐uHLzt^Kt%HMHH.uBHHu3HmH[H$H|$H@H$H([]A\A]A^A_@H$H0HD$H8ff.@AWAVAUATUSH8HHT$ HL$(eIA$ HCHHAE1E1EEE111AfzZHE1DJ @HtD.DZ$H@HtFHHtujZHE1DR @HtDH@Dr$HuAAA uqAAWT tsA6AtIw1 31H8[]A\A]A^A_AIwt18 3H81[]A\A]A^A_Ð1 3H81[]A\A]A^A_A7Iwu*Iw1 S31n@JLD\$DL$DT$DD$7 HHDD$DT$DL$D\$H ΃tEE9{XAA!E9tE9E„t EAB$ Au Ht$ 1X r21DȃABvAtQAtAA=AA-A#Iw1 21Iw1x 11Iw1H 11AD$AA$AIw1 11Iw1 q11HrH69N t~Hx D\$DL$DT$HT$DD$QDD$HT$DT$DL$D\$-HBD\$DL$DT$H8H QDD$DT$DL$D\$Iw1 01A^ ko mhfDAWAVAUATUSHHG0HHHHF0Ht,HHt C u.AHD[]A\A]A^A_ÐHC HHDtIHHAIWA<E` tnSPEP1Шd@CR2ER%L9Ic L1E1藎_fHHHD(DHHmAu}  HE1HtD{ H1HtIc L1f{ AHHHHH;JvPE1(fD HAPD9UMcDD$ IOIHH$H>AODD$ H!AHIcp L1E1踌H} 1H{ H1IcHI躘 L1Gf.Ic L1E1[#fDHE(HHH@`HC(HHM(HuD@8sDH8HH1H+ HH SP@Ic0 L1E1裋kfDIc` L1E1胋KfDHC(HH$H@`"IcH L1E1FIc L1E1,D 1,ƿ 1,f.AWAVAUATUSHHHl$hHHIIAMtECT ttAT$T@t)MHct$PL L11藊df oAL$ jI$ LDD$ 赗tCTDD$ fD@u\EET$ u-t) CW t AD$W }D$ PC[t AD$[n@AD$T tMHct$PL) L11ƉLL+L]M $AC` lLLYMtRAy dIE1HtDA{ *IHtD(Hct$PI Eh L111H[]A\A]A^A_D@(ED$ ED$  L$PLLHWHtE1fAD$WHct$P L11螈kfHLHHIH fEuyun|$ L]M $AC` nt&LLiLLVHmMd$HMtMHct$PL L1辇fAT$TDHct$PB L11芇WDDD$PLLLa+HL$`HT$XLH HL$XHT$`HLHtH}u MHL$`HT$XLHHL$XHT$`HLDL$PLHL6}MpHct$PI L11臆TMII<$1H ʵHHE1Hx 踵HUHct$PIIL11H @ 5IC(HHH@`IA(HHH@`I<$ o ^fAWAVIAUATEUSHHHSZEZ1Ш CP2EP @RCR2ER#[CQ2EQlp?{ nHHtZLMtNH pAMHHtBXt 9D@EME90DADtHHtpE` HHA} CHE1HtD{  HE1HtDH Ich L134H} E1贷H{ H訷H IIcI@ L1HD[]A\A]A^A_H Ic L1E1ȃfDH Ic L1E1訃fDH IcH L1E1舃fDH Ic L1E1hfH Ic L1E1HFH Ic L1E1(&H Icp L1E1H Ic L1E1H Ic L1E1ȂHC(HH8HyHE(HlH0H`" HwH 9ƿ 1$fHC(HHH@`HE(HHH@`H Ic L1E1jj1jhAHUHHLL$0H AH LD$Ic0 L1詁H Ic L1E1茁EAOȍT15fD HH0ɹOHL19qLcT$ MEIJH3H9tHH9uHFH9GuH[HufMvM{E1HD[]A\A]A^A_fDjjE1jjE1H AtEIug@Su HBSu"INL 11DT$WDT$HHDT$HLLB QH 1L1&XZDT$YHLLHL 1HJHE1E1Df.USH] 1HhHHaH}HiuJHHt>fH}H4HtHpH}AH1H HuHh[]@H1H9 tfDHxtUSHHH/HzeHH=Hu H1X`H{xHuSH{xHtWHf.HRHt?H AX<uA[<uyTyH1H 1HĨ[]CY@AtHH f.DCSAAAѺ o 0Uf w@ f.AWAV\HAUATAUSH1HhL5ӑH=̑O`\HHHLcE HH1_H(HSupH(HtHPHHDH8_H(HIK(H(1AHHuXA vM MuǃtAHA;DHc@ tI(H(1AH}tL5Hh[]A\A]A^A_H(Hment opeHfoVHD$rD$rato)$fD$F_fW81~DHH9Wtt|1uHGHH@(PPutA1H1HtyHOPHu\@HHtL9uHqAv8@t@E11GqfDE1D D ff.@Ht G8t?t1@UHAWAVAUATSHHlH`HxDhHHLHDƅ ˆL1HHDž|f.H@HuHcH≅|HHtH=H)HH $|HHHHHЍVHTHHH9uH#H8 LDžHEE1HXHIM&Mtb A<$%tkHH3IE1DI]HtH;LsMmAMuHMNHe[A\A]A^A_]MI]MVHMDc AA:I]Dc uCP@% Dc A AAB)ANCZ L. 
MNI]AMSZpHHt z ICP@A:{ >A:A@ZukAx 0 A@P@t$LX11ҾL!MNLX111LrMEMNA@Z@MtAy8t A9 A@Qt A9 IyPt9LMEA@Qt!Ht 8MNA9 DL9HADljIcAL4Mv MmMGHHHfH@HHH:uHH AP t@tHHLH1p 1#8fDHIcH<MVMICP =HH 10MHH 1fDIBHH@(@RCZHLaLPD@IMH0LpH(IILHL8HIx?4II;$HǘSHISI)IMM9uHLpLPH(LHD@IL8H0HDHQH9DCZLHcL N<DK HLMIH)HPODM'MA<$IFxHI6H8II>HHpGHpIHGI)HpI$IEHGHpH)H)L9PTLƅ/fIFH H H#P8H9Lx`AIHMgM8A<$ILpI MMI5DIFxHsM>Mg8^IA?PHFIIFI)IIM9uLp/fA:f1_AF D`ID$E1HP@?Hǘ FK0IH?HǘEKHpH?uuHǘIEH+pHIHHL9PZK McH`K0AHrIF8IcH|H]@L8L01L(D$L@H/IF8JH9뱐IBHYLHvMNA@M~8AMgMdIHTA<$IA|$=?4ILpM,MrDHǘ\DI$ILDI)IGHM9M&MI~xHA<$A|$I?tCV@HS(HA: z: B8HH3IRX0 1fDLIUH1HHL2?MmHDLtMoIUHHu]fLL!zL^sP<IBPHHJ[18HDHHuH6HBH@`D@A9=HMII7IVXx 1HLHIJXL 1ƅrMVI]MA'Ls MVI]MALc IrLLL{ yHIMLAGY  D%Et A@u/H"LH觏IOIVXIIپP 11A@ƅI~>yHrIEIv HHX1=RIBHH@(@PCPEZ8EH1L׾L zLH@tCP@t CRHH3IRX 1ƅ|CZ < D EH„M<EH IRX 1LLfIHHHLPLp?LpHHLPHS(HgH:H[?R% HHYxOHHL@HLHHpHHPbHpHPLHL@HIMHH>HI>MMNXHL 11fHS(HHHR`IzMVI]A:C A@HHIrX 1_tHfIEIrX H15JHu't u H>tFHuH~7A H|$|HT$Ht$/E11 HtL`(|$/eDMWAD$4JMt$Mu@H@MvMIFH@4@uH@HLx(HD$0AOZpuH|$0L!H|$0AEtHEH\$HT$HIH@HIFHh[]A\A]A^A_ÐMd$(6fDHt$/E11D.IfDHgHYH|$0?D$6@DHEH@HHx0 o /"1ú0 wo "0 o "ff.fAVAUIATUISHH0G[GWAEPE1@lƒ HHKIHf.FpuEt ~HHt 8%F u2HFHt)H90HD$ HsHtxFu2HFHt)L90HD$ @Hs|Ht;u+HFHH@(x HHt 8o@H[ Ht7HsHEHvHtـ8%tHVXIu 1H0A[D]A\A]A^fLxHAE[AMWE1MHH߃<A1҃1E1H0D[]A\A]A^fDHH Ht- Hf H HuDE5IUL 1@A42JHFHHVX H@(H0fD1iHCHp@NHX 1=HCHpH_fHP(HH+HB`)H3H赌A(TH3H@I ^ AE[,AEPA@IuL18 E1sH0D[]A\A]A^IuL1 E1JH0D[]A\A]A^f.IuL⿘ 1E1DIUL 1I 9@H7I}|HtIuHLx 1E1IuL 1E1pHXIU HXIU  1HXH jHCHp/HX  ff.@USIHHHGC[t8udG?u2HLɾ` 1LL$CCLL$fDu,HEHtHHt:%u{H@ HuH[]K?HCE11HHtHCHuH>tHCHuHH[]銌f.HPH3H[]HX 13FZfHHH8xHtHxHT$1YH@H HuHD$Hx f3H|$\3H([]A\A]@J(HT$1HH HfHHT$1ǺP o AVAUATUHSHH ?LgPHD$LwXdAD$ƒt5AV89A;FI|$tI~(z(L`IHD$zHt$IE LD$H|$1ɺHF LpoHIH|$LD$HT$LHt$HCPHDHCPoC@XHCX H []A\A]A^HHT$jHumH HuHD$Hx {1H|$q11H []A\A]A^fA;T$8A<$A>A~ |1@ HHCHHD$HCPHCXHH []A\A]A^ú o {@HHH$HL$tH;7u'H97tHHuHH6H1X ^1HUSH=@D4DH$Ř f.H1DoLHB)$HuH@HH;uH$fLCHc=CMLDOHcoKHLɃ B$ @H-CHHyCHL$1HH8?HD$HtuH@xo9KH)$uZf.H@HtGH;uH$H31X  H1[]H[]fH HXHBHhxH)$HuEH)$HuH@HH;uH$WH@HtH;uH$;DH)D$pHuHH)$Hu^fDH@HKH;uH$H@HtH;uHT$pH)D$PHuHH)D$`HufH@HH;uHT$`z@H@HtH;uHT$P^H)D$0HuHH)D$@HufH@HkH;uHT$@ @H@HtH;uHT$0H)D$HuHH)D$ HufH@HH;uHT$ @H@HtH;uHT$~H)$HuEH)$HuH@HH;uH$'H@HtH;uH$ DHj tHUHoGHP@HEH[]HD)$Hu/fDH HI(fDH@HtH;uH$s 1!z?tvKu.H?H@fHcy?Hb?H(fuHtvKu.H>HxfHc>H>H(fuH>HxxtH ko R ff.Ht#Sf.H_*HHu[DfDAWAVIAUATUSHHW(r4@IN(A4JoI)D$0u qHBHh(HAHX(EWsW@ t sCT@t ET@ t3ET HT$@HHD$IN(IU(yLB4D$ u1Hz D$ A4D$u1Hy D$H<HH1HIHˆT$6W9\$ H|$(HL$@AL19\$!ޫvHmMd$HMIE(Hx HEHD$(HHHD$tH2-D$ DÉD$ IF(M<$Hx IHHD$ tH-D$DÉD$Ht$ H|$,8IULD$ HL$0Ht$H 1!HwHT$0 1D$D$H[]A\A]A^A_Iv 1D$fIuHT$0 1D$@HwHT$0( 1sD${fIuHT$0p 1KD$SIuHT$0 -D$5zIuHT$0 1D$ IuHT$0 1D$HMˆT$uLIE(IV(@4B4D$IuHL$@HT$0 1vIuHT$0 1[D$cIULD$@HL$0 1D$>@IuHT$0H 1D$fu@\$ 9\$SIuHT$0 1IuHT$0X IuHT$0 1 o Uff.ATUSHfGV D$LgH1ۉ޿~bHt$MHHjHt1ɃAHHu1ۉ޿~4bH HH:hHt1ɃAH1HuH[]A\@USHHD$tht_GZMHfDMgM>MLpxHMv HuM10fHxt U@HtAHC HHHjHExHuɾ( bU@HۉPuI$fH{xLsxMv MM>H@HHt|HLuMXM>MM(LI~ XHT$1L 1H[]A\A]A^A_@A?%vHL$1LLH 1HHHH~xuFpuHL$1L꿰 _1M'H[]A\A]A^A_H@ fDpLI~ IFH8uHpX( 11 Ht$1 1Ht$1@ 1HL$1LH޿ 1IFp83HpX 1r1IN1LLƿ HXQ1wf.Ht$1 /1U&L*MMfUSHHHHGPH=PH=0PHtCHHH= _tyHt$oCXHH@XH[]@LU`MtWIJHI)J IJЈD$HAD$H[]fH1[]Ht~K{oECoE(CoE8C([fDH"HH=pPHztH= Pu,UuHEHrHR HuHu$HAҸHJ HrHuGHF H~HuHHAҸHPH@ Hu'HHAҸ|LA HQMu'BHHH@ Hu'HHAҸHMH IHMu'L@H@ Hu'HHAҸMQ MAMu'LHH@ Hu%HHAҸIz MJu,Hx Hpu&HVHHAXZ 1ؾ@ 1̾ff.(ru (ffDUS1HHH(oCoCCoCC oCC0oCC@oCCPoCC`oCCp7HHQ(hLHHC(H[]@(HF LNHu HLHpH@ Hu HLHPH@ Hu HLHHH@ HuHLHx L@uHLHD 1耽HWAWAVAUATA8 USIHIAH8LD1;Cpt111M,VHM HIwHtoC@)$D$oCP)D$oC`)D$ uF 
D$HHzt`I?{AE=„a=VQH81[]A\A]A^A_H8[]A\A]A^A_@EtH{@Hc'NIHLIWH _IH5%&Iٿx 1HX肻돸f.AWAV_AUATAUSHI_HHSH?HH=%Ht)HHHtHH9u 1赻DHKXHpDHD@ H D@ H@D@@t8I$H8t=I$HCXH[]A\A]A^A_M$IL$XLu1H[]A\A]A^A_I$HAtHCXI$H=DH=0DH=@DH=DH=DHHЄtlCHt^I$HLc_L:L a$LE1H $$ Lw(H[ HHt HsHuD Li&yE(oK+e,[w+QHCXHH=HsHSxDNtI|$Hs 8 ^ AAuw uw Dʿ| 12AVAUAATUASՉEH`HEH|$0D4$\$mEHc*#t$D<$l$0Dd$4HH6#aKIKH HHCHCP PDkLfoD$0Cx="CfoD$@C(foD$PC8H`[]A\A]A^ff.f =AVAUATUS]`E1I1ۅAX`u#]f.]"IHAL$t6A9t=9"tA6IHAAL$uʋ =IHEAEuD 8_,!Hè !=!t΋uA.=!M3A=\! u\A=5!M3VHèA YHHDME$@ 7<A_E1_18 Hè  tgA9A4$v u A=M  A4$AHèA uIĨE$AEtl 71A= A4$AcA4$AD :A`E1ftV`1mH Mt5A9t=KtAuH AMuI A}At 9뎋 6$"@59` Z9t|`#H tR=tۋuA@+=M3H A@ uHHMkqt< K9X`uLf *5_H݅uzfD 8`[]A\A]A^@HHKt=t53A@bσHŨX=tۋ53A@ fDuH K]=Wt5c3A@@ B85<3"tMX`HHKt1tHHA@ K52uϋ 3_H݅uIHŨXtA@ *52z@l 6`uF@MH KP=/t5;2A@ @ 3u"HèX=tۋ51A@`fD 2u*1fHèX tA@ 5t1ZA@ n(@65*1: ^Y"ff.fHXHt$(HT$0HL$8LD$@LL$Ht HX@HD$`D$HD$HD$ HD$HHHHteLT$LD$`E1ɾ/wALLIAp ЃApHtHuEt|$X ^ ff.HE1E1jj1jj1j 1QH8ff.AVAUATUqSD%//-/D-/D5/D mEu HHȀjjE1jSE1j11h! jjSjhж%jjSjh h0KJjhаDhHĠE1E1jj1jS1jh! jj$Sjh jjSjhж%hPKJjhDh@HĠE1E1jj1jS1jh! jjSjh. jjSjhjjSjhjjSjh jjh@DhHE1E1jj1jS1jh! jjSj hж%jjSjh jjh`DhvHĠE1E1jj1jS1jh! jjSj hж%jjSjh jjh`DhHĠE1E1jj1jS1jh! jj#Sj hж%jjSjh jjh`DhHĠE1E1jj1jS1jh! jj%Sj hж%jjSjh jjh`DhkHĘE1E1jj1jS1jh! jjSjhjjSjhж%jjSjh jjh@DhHE1E1jj1jS1jh! jj SjhjjSjhж%jjSjh jjh@DhHE1E1jj1jS1jh! jj!Sj*hjjSjhж%jjSjh jjh@DhHP1EjjAjATjhhJhP>hDj 谱HPAjjjD$$>PjAhhJhPjjD$\1dHPE1jjAjATjhhJhP>jj HP 輼>jj1jUAjhAhJhPhDj@躰HP1EjjAjATjhhJhP@hDj rHP@jj1jS1jh. AjjAAWjhVIjCjAVjhh@JhVPhDjHĠCjj1jAW1jhVIAjjASjhj{jEjU jhphJhPhDj薯HĠE jjE1jS1jh1h Jj1hDj AFE FHPFE jjEjUAjh] 1h JhPhDj1N HPN pjj1jU1jhAjh"PAhDjPV%裮HPPV% H1AjjAjUjhU&jjQUjVIhh Jh@#PhDj=Hx1AjjAjSjhU&jjQSj hh Jh@#PjjHx1EjjAjATjhU&jjQATj hh Jh@#PhDj蘭H쀾QVIH11jjAjUAjh jjRUjY h h Jh$PhDj5H쀾RY H1EjjAjUjhU&jjSUj hh@ Jh0%Ph DjϬH쀾S LjjEjATAjh] 1jhOjj 1 聬HP jj1jS1jhj{AjjASjhYjTjS! jh?hp Jh0&PhDhHĠT! jj1jS1jhj{AjjASjhYjUjS) jh?hp Jh&PhDh被HĠU) H11jjAjAWAjhVIjjWUj1 hdjjAWjhj{jjUjhph Jh`&PhDj"HW1 jj1jU1jhAjh 4PAhDjXl ϪHPXl Ljj1jU1jhAh0Jh.PAhDhYyHPEAjj1jATjh1h0Jh.PYhDj M4HPYjj1jU1jhAh0JhP/PAhDhZ(ީHP11jjEjATAjhh0JhP/PZhDj 9 虩HPZ(jj1jU1jhAh0Jh/PAhDh[? CHP[? jj1jj1jh1jjAh5Ej AV HPV rjj1jj1jh1jjAh5Ej A\8D 觨HP\8D $jj1jUAjhAh@Jh4PhDj`QHP1EjjAjATjhh@Jh4P`hDjQ  HPAjjjD$$`<PjAhh@J1h4PjjHP1EjjAjATjhh@Jh4P`jj K |HP P `jj1jU1jhAhJh5PAhDjaHPaH11jjEjjAjhf jjbjjV h] hJh7PhEj赦Hx11jjEjSAjh. jjcSj hjY hAJh6PhвDhbHx11jhJ1jjEj AdF 3H0dF HAAjj1jSjh. 1jjjUjh] h@Jh;Ph`DjХH쀾jMjj1jS1jh1hJjAhEj Alk 胥HPlk jj1jU1jhAhJh;PAhDjop 0HPop H11jj1jSAjh/ AjjrSjt hh@>JjhEj ӤH쀾rt Pjj1jS1jh1h?JjEhpEj As[M 膤HPs[M H11jj1jAUAjh"^%AjjgSjM hh>Jjh0Ej (H쀾gM jj1jAU1jh"^%1h?JjAhEj Af-M ڣHPf-M WH11jj1jAUAjh"^%AjjnSjPh Dj۟H쀾 XH1ɿ jj1jSAjhYAjjSjh?hJhp?PhDjx=b'H쀅vJ H11jj1jjEjhYAjjjjh?hIhOh?Ej H쀾jj1jAV1jhAjjAAWjhVIjjU jhphJh`VPhDh蘞HĠ jj1jAV1jhAjjAAWjhVIjjU jhphJhpVPhDh%HĠ H11jj1jjAj AcH0c`H1ɾjj1jSAjh(AjjSj h?hJh@PhpDj耝5j%H쀅6H jj1jS1jhdAjjASjh(jjS jh?hJh@Ph Dj $HĠrG sH1ɺjj1jSAjh(AjjSj h?h@JhBPhpDj蓜}$H쀅F H11jjAjSAjh. jjAUj; h"^%hJh0CPh Dj!H쀾; H11jjAjSAjhYjjSj h?hPJhCPhDj辛#H쀅E -H11jj1jjEjhYAjj$jj%h?hIJh Ph?Ej NH쀾$%H11jh J1jjAj A  H0 H11jjAjAWAjhmR jjUj h h@BJhPhEh襚Hx11jjAjSAj hjY jjSju hY hAJhPh0DhRHx1AjjAjSjh. jjAVjPh% jjAUjh* jjAUjh hJhDPhDjHP^H11jjAjSAjh. jjUj9h] hPJhPhDj~HA1jj1jUAjh] jhHPjj4 @HP11jjAjATAjh] jhpIPjj9 HP9~jj1jU1jh] AhJh@PAhDj ? 讘HP M%K? 
jj1jU1jh] AhJhPPAhDj LHPjj1jU1jh] AhPJh`PAhDj \%HP\%vH11jjAjSAjhYjjSjOh?hJhPJPhDj薗H쀅,AOH11jj1jjEjhYAjjjj%h?h0JhrPh?Ej &H쀾%jj1jAV1jhAjjAAWjhVIjjUX jhphJhVPhDh趖HĠX 0jj1jj1jh?1jjAh2Ej A` eHP` jj1jS1jh1hJjEh;Ej Af HPf jj1jU1jhpEh0JhuPAhP;Eh>H •HP>H ?jjEjjAjh?1jhJPhDj1m qHPm jj1jj1jh?Ejh0KPAhDj{  HP{ jj1jj1jhEjhpKPAh;Ej jϔHPjLH11jjAjSAjhj{jjSj h?hJhLPhpDj lH쀾 H11jjAjSAjhj{jjSj h?hJhKPhpDj H쀾 H11jjAjSAjhj{jjSj h?h`JhKPhpDj覓H쀅< jj1jS1jh`AjjASjhj{jjS jh?h JhLPhDj-HĠ0< H11jj1jSAjhAjjSjhmG jjhpDj ’H쀾?jj1jU1jhAjhOPAhDj. rHP. jj1jS1jh. AjjASjhVIjjUH jhphpJhOPh@DjHĠH jj1jS1jh. AjjAAWjhVIjjUH jh hJhPPhDh蕑HĠH jj1jS1jh?AjhPPAhDh ?HP HAAjjjSjh. 1jjAUjdh hJhPPh`DjؐH쀾dUH11jjAjSAjh. jjAUj h hJhZPh`DjtH쀾  jj1jU1jhAh`JhP[PAhDj HP 诛jj1jU1jhAh`JhP[PAhDj  迏HP11jjAjUAjhh`JhP[PhDj {HP H11jjEjAUAjh jjAUj h jh[PhDjH쀾 H11jjEjAUAjh jjAUj h jh \PhDj跎H쀾 4H11jjEjAUAjh jjAUj h jh`\PhDjUH쀾 H11jjEjAUAjh jjAUj h jh\PhDjH쀾 pH11jj1jAUAjh[ AjjAUjha hJjhDj 蔍H쀾jj1jU1jhAhJh\PAhDjjAHP1AjjAjUjhhJh\Pjj HP1EjjAjATjhhJh\PhDj# 赌HPAjjjD$$PjAhhJ1h\PjjmHP1EjjAjATjhhJh\Pjj  (HP " ŗjjj1jU1jhAhPJhP^PAhDjnjƋHP1AjjAjUjhhPJhP^Pjj( 肋HP1EjjAjATjhhPJhP^PhDj/ :HPnjH11jjEjSAjh. jjAVje h'2 hJh0_Ph0Dj֊H쀾e SH11jjAjSAjh/ 1jjAUjz h h>JjhEj xH쀾z jj1jS1jh`1jjEhEj A.HPH11jjAjSAjh. jjSj6 h?h'Jh@cPhDhȉH쀾6 EH11jjAjSAjh. jjSj< h?h'Jh@bPhDhbH쀾< ߿H11jjAjUAjhB jjUje hK hJh_Ph`DjH쀾e |HE1E1jj1jU1jhD&jjUjNh&hJh0PhpDj蟈HxAAjj1jSjhD&jjU Sj1h&jh0PhDjRHx11jjAjSAjhD&jjSjT h&jh0PhDjHx11jjAjUAjhD&jjUjZ h&jh0Ph Dj踇Hx11jjAjUAjhD&jjUj[ h&jh0Ph DjkHx11jjEjATAjhD&jjATj` h&jh0Ph@DjH쀾N虽jj1jU1jhAjhjPAhDjy ̆HPy Ijj1jAV1jh% AjjASjh. jjAV4I jhjjAWjhVIjjUjhphJhPgPhDjFH4I HAAjj1jAVjh% 1jjhSjSI h. jjAVjhjjAWjhVIjjUjhж%jjUjhph"Jh`gPh0Dh誅H hSI $jj1jAV1jhAjjAAWjhVIjjUI jhph%Jh0gPh@Dj:HĠI 贻H11jhP'J1jjAj AI H0I rH11jhp'J1jjAj AI 賄H0I 0jj1jAV1jhAjjAUjhf jjU jhn h(JhdPh DjGHĠ jj1jS1jhAjjASjhYjjSv jh?h(Jh0fPhDhփHĠv PHE1jjE1jU1jhD&1jjUj h&h)JhPhpDjsHx11jjAjSAjhD&jjSj h&jhPhDj&Hx11jjAjSAjhD&jjSj h&jhPhDjقHx11jjAjUAjhD&jjUj h&jhPh Dj茂Hx11jjAjUAjhD&jjUj h&jhPh Dj?Hx11jjEjATAjhD&jjATj h&jhPh@DjH쀾 mjjAjUAjh1jhPjPhDj1 蠁HP jj1jAV1jh% AjjASjh. jjAVJ jhjjAWjhVIjjUjhph)Jh@gPhDjHJ 蔷jj1jAV1jhAjjAAWjhVIjjUJ jhph+Jh gPh@Dj誀HĠJ $H1AjjAjSjhSjjSj%h] h`-JhjPhDjA+H쀅(H1AjjAjUjhSjjUj h] h`-JhjPjjHx1EjjAjATjhSjjATj h] h`-JhjPhDjH쀾%H11jjAjSAjhSjjSj h] h .JhlPhDj'H쀾 褵H11jjAjUAjhjjUj hh.JhmPhDj~H쀾 Ajj1jAU1jh] Ejh oPAh@Dj s~HP H1AjjAjSjh. jjUj h] hP/JhPoPh`Dj ~HAAjjjATjh] 1hJh`oPhDj }HP Bjj1jS1jh?Ah/Jh qPAhDj%r}\HP%%H11jjAjAWAjhVIjjUj hh/JhpoPh`Dh|H쀾 zjj1jS1jhf AjhqPAhDj3 |HP3 *H11jjAjAVAjh$jjSjh jhqPhEhI|H11jjAjUAjh{(jjAVj<hjjUjhph0JhsPhDj{HĠ<fH11jjEjAWAjhVIjjAVj.hh1JhPvPhDh{H쀾.jjAjSAjh?1jh`vPhDh1 .{HP 諱jj1jS1jh?AjhwPAhDh zHP Xjj1jj1jhAjhwPAhDje zHPe jj1jU1jh] EjjAh Dj/.=zHP/.躰jj1jAV1jhAjjAAWjhVIjjU] jhph1JhwPhDjyHĠ] Jjj1jj1jhAjhwPAhDj:|yHP:jjAjjAjh?1jj1h4Ej 1´.yHP ˄´蜯jj1jU1jhAjh`xPAhDjL%xHPL%Ljj1jU1jh] Ah1JhxPAh Dh &yxHP &H11jjAjSAjh. jjUjve h] h2JhPPhDjxH쀾ve 蓮jj1jU1jh] Ah2JhyPAh`Dj  wHP @jj1jS1jh] Ajh7PAhDji0&swD\HPEi0&jj1jU1jh] Eh JhPAhDj O wHPO 荭jj1jAT1jh] AjhPAhEj vHP  h] h9JhВPh EjoHx1AjjAjSjhMjjSjU h] h9JhВPjjoHx1EjjAjATjhMjjATj[ h] h9JhВPhDjXoH쀾> եH11jj1jjAj hAjjSj h.h`9Jjhjj1jS1jh`AjjAUjhf jjUr/jhBi hCJhPh EjVbHĠr/Иjj1jU1jh AhEJh PAh#EjM bHPM }jj1jAU1jh Eh`FJhPAh&Ej aHP )jjE1jS1jh1hGJj1h'Ej AN _aHPN ܗjj1jS1jh. AjjAAWjhVIjjUhN jhph@GJh@Ph0#Ej`HĠhN mjj1jS1jh. 
AjjAAWjhVIjjUqN jh hPGJhPPh#Eh`HĠqN jj1jS1jh1hpGJjAhdHc ><5HHHHHH4캸HcHc=H2HHHH踺HHcֆHHHxHWlcǜHc=tRAP_E11P_19t AHDZEuIEQEuEIH9SA@_1D$ A@_E1D9|$ H|$ AEH|$@D$D$L%ʅAHcŋt$|$HHD$@IĉL$DaI$HH؀`IHH HH@PpP@L@x=foD$@HP@foD$P@(foD$`@8IAA5IAED$ t>fHx[]A\A]A^A_É5ۄHcHH=@pH=4pH=̈́(pH=HSHH $LD$u8GL:  :fH[@tS~y== fDʇtp  tHIH 11D ɐ fk f uul  Y@VD1H9HH[f NfDuP 9 .H7¿ 1%DUSH(GPH$HL$GXG[<HH?HtlH $LD$Ht$1H$u>CY u8C[uBdt8HLD$Hp !1I H([]f1H([]D蛃ufDAWAVAUATIUSHHHLwVH[]A\A]A^A_I>@HHl@xH=@ƃ7wH0H I|$X=H|$t @H{XpDID$XH!# CHt3L{hMt*I9t1LL>[MpMux LH[I߅M$LLeU=t4CHt.At(HT$1  f.AFZuANT ANPpAFZ€puAF[ANT AF[I~.CHt L^H[]A\A]A^A_DI$E1Ht/fDHPHtJ@1҃!A H@ HuID$HLp(A4A0H;CxHHHHH9j9CxuI$}XH[]A\A]A^A_@HCH_HH2_tHHHtHH9u 1IIL$X1H  ~tTOCHECx-3wH@ H$IIL$X1x  wfD8]6fD6la 1芍IFp ^ DKff.fAUATIUSHHGHH@(4L 0}}H~D;Cxt%HHHDHH99CxutjHSH__HHo}HtZHHHtKHH9u 1VfDLxHHuH[]A\A];fCIIHKHCXIHHЄtwHC`I]xHt~LCH ؁ƒIMpQY ЈAYKfDIMIL=MOuH{{SHIEpPW ʈPW]m1ɫuGCHV1bCHFHsHSxWTHCXHHSIU1Lx  fIU1L( I ^ HAWAVAAUATAUSHHHXoG)D$ oG)D$0oG()D$@ESq ʃKSq5uH"D$\HEH9D$ yEt |$ ]UMs { WzL%z~.I;$HHL I;$~II9u|$tHH辠IJfOoG)D$ oG)D$0oG()D$@~H|$ 1貝IHX1AH{藝HHL1 B1HX[]A\A]A^A_ÐoECoECoE C(fDCqEƒ ЈCqf}H{HDD$D$DD$HX[]A\A]A^A_oG)D$ oG)D$0oG()D$@GqEƒ ЈGqD$"fD|AD$L|$dH|$ 1pIH蕠LCXHL 11-LCXs8HC@L{`LD$t$HD$ oHoooooo-LD$IID$M~`t$H|$MMFXIoEAFoEAFoE Av8AF(!I<$IvHIF@IFHla Lx(EAG EAOT AOW0MAG$AGZAOY@pAGZ1舆IGIFHHx(AoLAoFCAoF C AoF0C0AoF@C@AoFPCPAoF`C`AoFpCpAoAoAoAoAoAoaoEHCoEHxCoE C(.HLJAA@EbDD$ Et ;-D cEt :U9u Hf &  ^ BHЛIHH+UJwD;zqD1L{XHEt E9D$$N藘H|$ I1舘MHLP 1NU9 ,H*}D$ HLH|$ 1I LCXHLP 1t;1H×H|$ 1I贗LCXLH¾ 1yG|$t[HEI1HxLCXHL 1= ~IH|$ I1:MLH|$ 1&IHHLp 1g E1AWAVAUATUSH!>INw HtOsH-'s~*H9Et=HHH DH;Et"HH9u ^ ?DHKXHC@Lk`D{8HL$H$"oHoooooo (HL$IHEMl$`H<$DI$IL$XI$AoAD$AoFAD$AoF E|$8AD$(H}It$HID$@ ID$Hla Lh(AAE AFAMT AMWAE$AEZAMY@pAEZ1蛀IEID$HHx( Ao$LAoD$CAoD$ C AoD$0C0AoD$@C@AoD$PCPAoD$`C`AoD$pCpAo$Ao$Ao$Ao$Ao$Ao$[AoHCAoFHxCAoF C((uH[]A\A]A^A_HHH[]A\A]A^A_wD ^ *=USHHH?u` Ht`HKLC1H/tE@HKHuH Ht> H 1[]pHuH[]fD fAWAVMAUATIUSHHH(D$`H|$LL$LL$D$xD$t?ƒt0A< u/H4$HHx bʉуuD$M>MtYDd$E1IOHLHD$(>HCHD$(HL$ HPHJIEfDHAZ <tHvHu1fDHH1H1 HUSH5ghHg ,8HPHg=gu!ggH[]\uڿT_H-nHnuH-gnHhnj ui=GgEguhHagHP8=)gHNgon)gYf.; ljft.Lt\uOf.uf1l( 11LDH@ft.{ff< ft$< t< tHfD1Y< afuܾP 1ff.USHa 1 ¸`%``8t`uٿދ`t+``zU!UU=# (`_ 1eE1HcrU`_!/ L$ HH8U1L$ =6zD UETA|AHcTDD$ HHT=4XCH4 ; H`_ 11HcTDD$ DTDHH)TJT>YTDT "T TgSSÃ[kSS1Q1`_ AhS` G`_ 11*@S A4@  i O D2P `_ 1DD$ DR7h; X. _ R m&C`_p L$ H4 ; 1UL$ GR &ff.AWAVAUATUSHHHHHH۵[H`@ E3{9M8H@@ H @ U H@ H<  H`?  H`> SH> F@;gtIH? H?  H? 
u H; uHHMX1P twH; H @ H@ H@@ H< HUX/  1GAfH3HUXS/ 1DE1HHD[]A\A]A^A_HMHH HI(HH9ArH3HUXH 1E12닋e9HHMX1 ZE f.e9E H3HUX 1E1!fDUkU8H]XAă AuH޿ 1iE1HUX1/ @H/ H1^&111IH}A)hDu d9M t:HuXH 1E1;HUX1o/ KU8~*HAtE WU8EUd9E H]XH޿ 1NE1fDtU8H]XuʿE1ZpHHƿp 1^1Hھ @o?u8HEHH@(HRk@P@VH޿ 1H]X0]1Hھ HEHH@(@Q H0Hڿ8 1E1AHLmXHHP1LHLHu} 1LH6HILHt;JHPMH~1Ґ A HI9uIHHuLHHEHH@(@ZH޿ 1*DAUAT1USHIHHL.tH[]A\A]fDH}1Lu#Mt>Hu 1Hu 1fI<$rwDxATUHS1IHHtH[]A\DH}Ht$1u+I<$tDHu 1H|$Hu 1fHD$I$DUSH1HHHyt H[]DH{Ht$1XuSH}ulHD$HPHHt/Hz(GZ <tw*yt1yus1d|HD$HE@H31 1sH3 1H|$IDH38 fDH3` ff.USHHt H[]fHuH~HHt zHu>uHX/ 1ff.SHH@ t [HsB uHs@@ uHs @ LuHs @< [SHH@ t [Hs@@ buHs @ uHs@< [ff.SHH@ t [Hs @< RuHs( < ?uHs0< ,uHs@@ uHs @ VuHs; [fUS*HHufH}t 1H[]D1ҋ=]1HHEj<)uȿ 1H|$A\uH}H|$t wfDH}tfAUATUSH*HHL-$NL%%Nu;H;t 1H[]A\A]ÐH}uHE W_H|$>tYtH[t'L-ML%MH[]A\A]fH;H<$uH}tW*hDk6<.t"<_tH;H|$uH}t47p5fL-9ML%:MiDH;rHD$HEUS1/ HHXtt1/ Ht$1/ ~u)Ht1 HX[]HsH@A HX[]fHމVyHX[]fDHH{hHX[]HL$H|$1HD$H$PZHuHH8OHsP B B0Hs`A +HshA HsxA Hsp A Hs8; oHsXA Hs@ Hs@@ xHs0@@ saHs @ JH@< 'Hs@ "Hs(@ H < AH< oJh 1DfDH018 nHt$܃toIGoI3ff.AUATIUSAHHtg11@{tJ,uWHt$Dzt\t@Ht$HHHuHl$1uImH[]A\A]fDD H1HH[]A\A]ff.@SHH?TH{`KH{XBH{9H{0H{'H{ H{(H{0 H{8H{@H{HH{PH{pH{xHHH{hHHHHH[+ff.SHH?dH{[H{RH{IH[+SHH?4H{+H{"H{H{0H{ H{(H{hH{`H{xH{pH{PH{XH{@HH[*+f.AWAVAUATUSHXL-FL%FSs(HH?OuXHމHމsu5ftQ),uHމ?uf. H1HmHX[]A\A]A^A_HL$H|$1ƓHD$HPZHuHH8^uf.)&]P_H,H$jtH( KtƃH{H1DfDHHH{1ҋ=vS1HvP_H'HH$蓽A, 1V< <*$f,L=CL5CyH{L-CL%CHCQH{Ht ]HCL=CL5CfL-CL%CD< *L-RCL%SCUChHH\HcH<H)<  1HD`_H<1f.1H5E@ǽ*=D~H<$HDHxD)F ,3jHމdHH{BH|$L-@BL%ABT?L-$BL%%BLcMA|$A<$I$1AoD$X@:H:):hHc:I$H9Hp9I$HcH<< FHHHH4HH< $9~RfHH{L-.AL%/AHCH{=O1P_HyHMD^H{DHHFDH|$HL$H|$1_HD$Ht@Z<L-O@L%P@,L=)@L5*@= 1L=@L5@=N11P_HoHC1  B=tN1xHHD$P_HHff.SHH?H{HH{H{H{ H{(H{0H{8H{@H{HH{PH{XH{`H{hH{pzH{xqHeHYHMHAH5H)HHHHHHHHHHHHHH(H0uH[!ff.SHH?TH{KH{BH{9H[!SHH@ t [HsB uH8@< ,uH@@ uHs @ SuHs? @uHs ? -zHs0? cHs8? LHs(`? 5Hs@@? HsH ? HsP? HsX> Hs`> uHsh> ^Hsp> GHsx`> 0}H@> cH > IH> /H= H= H= H= zH= `H@= FH@ ,yH = _H< EH< +H`= H< H< H`< vH; \H; BH ; (uH( C /7[H0B [7USHH@A H6tH[]fDH@ uHu`@@ uHuX @ uHuB uHuB uHuB pHu `B WHu(@B x>Hu0 B _%Hu8B F Hu@A -HuHA HuPA HupA Hux`A H A sHA WH@< ;Huh< \"H; H@C 4HC 41/ !tn1/ 10 {=t{Ht08 1M=t28=E0 P_H/  18 1=tH]=E/ P_Hff.SHH8HHt@Z<0IH2H=|2o51..)$UuH[f諮tf.<= H1}H[DHB08ox 1R10 ?H[Hc A.=CHfo$164671H-H-Hf1HB(mح1H H[Ht fDff.AVAUATIUSH7H@ 5u1[]A\A]A^@Hs`@@ tHsX @ tHsB tHsB tHsB tHs `B tHs(@B mHs0 B WHs8B zAHs@A d+HsHA NHsPA 8HspA "Hsx`A  H@A H A HA Hsh< rH; YH@C y@HC `'HgWA Hu H{X@HCHt 8HC@H38HHt-HPX 1莶H8HC0Ht*HPX  1XoHC082HHt 8^HCpHt*HPXH 1'HCp8lHCHHt 8HCxHt*HPXx 1˵HCx8dHC Ht 8HCPHt 8AHC8Ht 8HHt-HPX 1_vH8HHt 8(HHt-HPX 1(H80HC(Ht 8_HCHt8t3HHHHHpX 11+HPXH1jAz T_R@տ{0 RL1FH H{HCHH0 ~]H{RHC@HPX 11l.H{HC Ht 8H{(tOHCHt8tDH%uHC(@HPX 11H{8tHCHt8uH 0 該tHC%H苲nHCF&HiLHS8HX@ubHֿ0 1腼!HSHt.:H0 HHpX  1170 11Hֿ 1Hֿ 1S&許HCH@IHC1L( HPX1舰LHCHL` 11H{HH c HCHHt~L`X40 @t\HLᾠ 11HsH¿( 1HXHC. H詰AFA~t1&HA4H{HHD$z"DLsHHI1HD$$1ҾT_

|HCPHt+HPX  1貢LsPA>HChHt+HPX" 1~LshA>7HCpHt+HPX 1JRLspA> LsxMtA>tmHC`Ht*HPX# 1HS`:HMI} Iu$ 1E1=DIIFXHj11PD$S_!PMRH EHCxHx`HpX8# 1E1˪YDHT$IHI1jHT$S__0 HBXPDt$AVL1ԥH EuQHC`Hx`t/HpX# 1E1UH$ 1AE1H# 1*E1H W_HHEA51F)IFHAH@(u @Z <t+LCtUI~`t7IvX0 1E1蕩#I~`tWIvX 1E1vH0 1bE1HCHtMHx`t/HpXh 1E17H 1#E1Hh 1 E1HC@HHx`HpX 1E1٨gD\IIFXHj11PD$T_W0 PMH "E9HCXHx`HpX8" 1E1gDIIFXHj11PD$`T_ PM|H EHCPHx`t/HpX" 1E1H8" 1E1pH" 1ΧE1YH{82HHAH! 1舧E1DE1IVXHƿ! 1bLHf:-]HBPHHHC`Hx`tMHpX$ 1E1E|HHC`Hx`t/HpX@$ 1E1ǦUH$ 1賦E1>H@$ 1蜦E1'u H@HH@(HRHR HCH.Hx`tZHpX$ 1E1SH{}H2HDH% 1E1H$ 1E1H8# 1E1qE0HCxHx`XHpX# 1趥DD9IIFXHj11PD$@T_n0 PMˠH E~HChHx`HpX" 1E1DHH% 1"E1H 1 E1H W_HC@H Hx`tZHpX@& 1E1ˤYLsMH W_EtMI~`t/IvX& 1E1艤H@& 1uE1H& 1^E1A>E1H{(EgH*HLH' 1E1I~`=IvXp& 1E1nLsM&EuHC(HhH{l L+MI IEHHtH@(@ZMmHHtHPX+ 1@識HHt#8HxHHt#8hHx/HHt#89HxHH= W_AHA ExpHX, 1YHH{HH( 1E1IuX) 1E1H& 1E1vH% 1ԠE1_H" 1轠E1HD=IIFXHj11PD$T_PMϛH H# 1_EtaHHx`t@HpX 1E18HCHtrHx`tTHpXX% 1E1H 1E1HCHHtsHx`tXHpX! 1ԟbHX% 1E1KHCHHx`t|HpX% 1E1葟H! 1} HtIVX1H! ٕHCHpHx`t,HpXp! 1-H% 1E1Hp! 1HtEHt$H' 1E1ӞaHC( H' 1趞E1AHCPHHx`HpX' 1E1胞HEEHtfHp' 1AA~ML"HLHtDI~`t)IvX' 1Hp' 1ߝmH' 1˝Y$LmD,$ADuHC(HEFE=Hx`tMHpX@' 1E1aNL L$7HT$uhH@' 1E1H' 1E1HChHHx`ttHpX ( 1E1Ҝ`H' 1辜E1IHp& 1觜E12A} IuX( 1E1聜D|$H ( 1bE1HC(HHx`UHpXP( 1E1/HHHy<$EHЋR E1HP(LrXHI* H H19AETIEHH0+ H0H1舛1WHIEP  D|$b$HuIEP ;HH0Hp* 1E1HpH+ 1E1HP( 1ޚE1iIU111A0 LY4 脪H{Xt.u 0 H1H{X111I+Hs`> 9zH{`t.{ 0 H1vH{`111I*GHsh> /H{ht.(1 0 H1+H{h111I*Hsp> 裩H{pt.41 0 H1H{p111I>*Hsx`> XH{xt.91 0 H1H{x111I)fH@>  KHt1 0 H1DH111I)H > 趨Ht1r0 0 H1H111IK)H> bHt1kY0 H1H111I(jH= OHt1)0 H1HH111I(H= 躧Ht10 0 H1H111IO(H= fHt1A1 0 H1H111I'nH= SHt1_0 0 H1LH111I'H`= 辦Ht1n0 0 H1H111IS'H = jHt1e0 0 H1H111I&rH< WHt10 H1PH111I&H< ¥Ht1K1 0 H1H111IW&H< nHt1#0 H1H111I&vH`< [Ht1#0 H1TH111I%"H= ƤHt1

@G DLH{@HcAVAUIATIUSHzHӋGo FHB n Ht fDHPR 9~AU H@ HuMuH{9o tLzyAm H[ HuA}1L1[I]A\A]A^USHHoFGoFGoF(G(uGn ~1C 1HH[]>GHLJC G @SF HHvGG HtH=:*HCC ƿC 15H[ff.SF HHvGG HtH=)HCC ƿC 1H[ff.fDUSHHoFGn oFGoF(G(~1_C 1~HH[]USHHoFGn oFGoF(G(~1C 1.HH[]USHHF GG ~n 1C 1HH[]ff.AVAUATUISFHHH։GH>AD$ Ds AD;u t&HH!A9tLHuHvEl$ {1E 1H[]A\A]A^ATIHcUSH=H1菃LHH[]A\H f.AVAUATUHS~HJtqF tYCz1C Dj )}Du 1As EEDOE 1,H[]A\A]A^@J 9LfDtB S1HoFGoFGoF(v G(eE H[fUSHHoFGn oFGoF(G(~1OpE 1nHH[]USHHoFGn oFGoF(G(~1{E 1HH[]HATIUHSHGtoHC Mt$E8LC8Pu8H}@L裔HC@Hw}1}s пE 1H[]A\RG ff.fAWAVAUATIUSHHIHH~loCE1H{@EoCEoC(E(C8E8s8HE@AE8D=AM>AD$ AU D9A9A<$tID$HHtH@(@P LLOAU A9T$ tIuE11LbAU {K ; DDE 1oHHH[]A\A]A^A_f9eH|$D$ *D$ Ht$E11ɺLD$D$Sb+fDDE 1HHH[]A\A]A^A_H~PHm\|AU A9H|$Ht$E11LD$D|$aAU @IcE HID$xE 1THHH[]A\A]A^A_DfDUSHHH8H CC } t(HH$D$HD$.iE 1HH8[]@USHHuGG ~n 1jE 1HH[]ff.AVAUHATUHSH~{twIHHّC Du AE;t$ t&LH轑A9tXIt$HfhDk {1DF 1H[]A\A]A^fDDn Dk fDHuLhDk {fDATUISHHHH|$${HL$H$DŽ$soD$1Ck {oD$CoD$(C(F 1'HH[]A\@5S1HGw -F H[SHHoF@xGToFGoF(G(t"Uu5w 1EF H[Dw 19F H[úS mQF ;@AWAVIAUATIUSHHMHH~yoCE1H{@EoCEoC(E(C8E8s8 AU8HE@AMt AG8~AMD=AU 6AD$ D9D9A<$tID$HHtH@(@P @LJAU A9T$ tIuE11L]AU {K ; DF 1KHHH[]A\A]A^A_D9iH|$D$ D$ Ht$E11ɺLD$D$3]/fDDF 1HHH[]A\A]A^A_H~P|HMWoD9$H|$}Ht$E11LD$D|$\AU IcE HID$xnF 1H}PIyH} HEHDžL}XHLu`)1HEI IuH8AoE HEHEAoE0EAoE@L}XLu`CE(0fDm Hn!fDm Hnff.SHGG F 1 H[HATUHSHGt H}C 1De s DG 1H[]A\fDzSHHCC v 9tHs`s G 1uH[ff.S1HoFGoFGoF(v G($G 2H[foF1SHGoFGoF(G(D H[ff.S1HoFGoFGoF(v G(2G H[fS1HGG >G H[fS1HGG OG RH[fS1HGG `G "H[fS1HGG qG H[fS1HGG G H[fS1HoFGoFGoF(v G(G H[fIHHH Jf.AUATUSHHHDn D;j t%HHI踇A9tSIt$Ha^oCG 1Es oCEoC(E(HH[]A\A]DHsL^ff.fIHHH f.S1HoFGoFGoF(v G(G bH[fS1HoFGoFGoF(v G(G "H[fS1HoFGoFGoF(v G(G H[fUSHHHHGt0H謨C u G 1HH[]@RG ff.fff.*SHGG V G 1;H[f5S1HGw G H[AUATUSHHHDn D;j t%HHIhA9tSIt$H\oCG 1Es oCEoC(E(HH[]A\A]DHsL[ff.fAUATUSHHHDn D;j t%HHIȄA9tSIt$Hq[oCH 1Es oCEoC(E(HH[]A\A]DHsL[ff.fATUHSHH0H{pHE LeH@ HhH@ HxCHtlHǘpC Ht/29E t$HH$D$HD$ZS At$ H 1%KpHH0[]A\ÐC DHATUHSHGt Hإ}C 1De s DH 1H[]A\fDUSHHG G~n 1Ls ЉH 1hHH[]f.USHHG G~n 1s ЉH 1HH[]f.USHHG G~n 1s ЉH 1HH[]f.IHHHX f.USHHH8Hl5F$Cs 9E t'HH$D$HD$Xs 'H 1.HH8[]ËSHGG Ht$E11ɺLD$D$gIHH[]A\A]A^A_K8E1AAC8rIct$8HC@HMAU8LA3I nA|$E1I蔜Hc{8Is8HC@^AW1E1T$ DL$HC@HcD9|$ Nu~HcHIt$@LHL$HǧAGI9C8DLA3I AA|$fA|$A4I LA+I ff.fpAWAVIAUATMUSIHHHHMGD5HtIjAƋ5C AG9EA8HuE11ɺLrGM9E8LC8(4H}@t A<$MAE8YLARI S8E1AAH$x t6H|$H$Ht$E11ɺD$D$FD} }NED1LdI fD9s Ht2KpH|$;Ht$E11ɺHD$Dt$gFHH[]A\A]A^A_AG 9E Mf.C8jHcu8HC@HMtEEe8Eu\LA[I nffDLA[I JDASI DASI DLE1ARI wf.I$3Hc{8IK8HC@~_AT$E11T$ HC@IcD9|$ Dfu NIcAHHu@LHHnAGI9C8@MASI Aff.AWAVAUATIUSHHHHoFGoFGoF(G(tIԋF8G8HU1Du8EulMAI tAU8AI xI LLD{1[ ξL濈I 1HH[]A\A]A^A_@H{@tA<$uI$赗IcI5M8HE@ZAE11 HE@DE9Dzu JDAHcLHHs@HHAFI9E8ȋE81ALdI 1HH[]A\A]A^A_ÐGG HLJI fDGG HLJI fDHS5HGtH螖Hs xxtI 1}H[@I 1dH[ff.ATUHSHH~tTztmoC1EDc {oCEoC(E(ݼDI 1HH[]A\H~PtAHHT$t<HT$zuHzPtHZ<{zHmzuHzPuאS1HoFGoFGoF(v G(I bH[fHHI AWAVIAUATIUSIHMHHMG-tIŃOMC $AD$8LC8-I|$@t A?WMAJ tAE8LAI bA|$A1Ed$ RK DD1EL;I f9k Ht-H|$@Ht$E11ɺHD$l$m@A~ t1H|$ Ht$E11ɺLD$D$5@HH[]A\A]A^A_fDK8E1AA=C8BIct$8HC@HMAU8LAJ >A|$E1IdHc{8Is8HC@}AW1E1T$ DL$HC@HcD9|$ Nu~HcHIt$@LHL$H藞AGI9C8DLAJ AA|$0fA|$AJ LAI Rff.fAWAVAUATIUSHHHHoFGoFGoF(G(tIԋF8G8HE+Du8EulMAJ tAU8AJ J LLD{1[ 辸L濈I 1HH[]A\A]A^A_@H{@tA<$uI$襑IcI%M8HE@ZAE11 HE@DE9Dzu JDAHcLHHs@HHAFI9E8ȋE81ALdI 1HH[]A\A]A^A_ÐAVAUHATUHSH~{twIHHYnC Du AE;t$ t&LH=nA9tXIt$HDDk {1XDJ 1vH[]A\A]A^fDDn Dk fDHuLDDk {fDAVAUHATUHSH~{twIHHmC Du AE;t$ t&LH}mA9tXIt$H&DDk {1蘶D)J 1H[]A\A]A^fDDn Dk fDHuLCDk {fDATUHSF H9B tHvH׺CoC1EDc {oCEoC(E(D7J 1H[]A\fDUSHHHHG5tH֎Ɖs U G 1HH[]@HE1HH f.S1HoFGoFGoF(v G(FJ bH[fAVAUATUISFHHH։GHkAD$ Ds AD;u 
t&HHkA9tLHuHKBEl$ {1輴DOJ 1[I$]A\A]A^fHsHAEl$ ATUIS~HHtmoCLEoCEoC(E8E(#AD$8t{tLHDžJ []A\@{tZHDžJ []A\f.H~PtH3s YJ t J 1H[]A\s mJ t俨J 1DHE1HH.f.IHHH] zf.JSHGG J 1`H[USHHHHGtHH,}C m 1s пJ 1HH[]f.~uF C f.USHHF GG ~n 1}s ЉJ 1HH[]ff.5JS1HGw J `H[ATUHSF HGIԉG v 1J 'H{Hti}tCHEHt/H8Ht'}]HHtLm]H[HHH[]A\fDH=g1THH=q1HCff.AWAVAUATMUSHIHHX~\AoD$Ht$HEAoD$EAoD$(E(GH|$4H|$E8gA|$w-HH^K K HD @K H;9C t9oCHt$ E11ɺH)D$ D$$oC)D$0oC()D$@5Mt"`A9E tHsE11ɺL5HX[]A\A]A^A_ÐAL$ KH=GaK 1!H1DH~PH/HhHc}8bHHE@hIċE8E1E1fDID$H}@AHLI(LhD9u8If1L$ 5L$ 1K 1PH`@HK 14HDS1HoFGoFGoF(v G(vK H[fS1HK оHHR HtHK HR HuHHaHCpH([@S1HoFGoFGoF(v G(K bH[fUSHHHMGt0I,C u K 1HH[]@G ff.foF1SHGoFGoF(G(K ŽH[ff.S1HoFGoFGoF(v G(K 肽H[fUSHHHHGtXHLC C8}8tHcu8HC@H蝗s K 1HH[]ËG DSHHoF@xG=oFGoF(G(t0=tI=uZw 1FF 裼H[f.w 1K 聼H[w 1:F aH[úS  QF z@USHHoFGn oFGoF(G(~1߫K 1HH[]USHHHGzG uN;B Hu61L 躻Ha9E tFHHsH[]8Hs81L tH9E uH[]ff.USHHoFGn oFGoF(G(~14L 1HH[]USHHoFGn oFGoF(G(~1蟪?L 1辺HH[]HGtSHH蔃C [RG ff.>GG ff.@S1HoFGoFGoF(v G(KL "H[fAWAVIAUATIUSHHH~oCAGoCAGoC(AG(C8AG8C8u={HILJL u=HHL[]A\A]A^A_{nILJL HtHNbMttL=btH{@tDIc8EFHL0ݷIG@IcEHAHJHL$~;AIIL4$E1f.Hs@J<0LIlL94$IG@uH|$I$HME9o8IcHHLvI@Hs@ALE9o8Lf.H~PtH'{[H菷Nf.s XL t 0S 1Is uL t俣L 1DnܷHD$IG@ ff.USHHoFGn oFGoF(G(~1_L 1~HH[]Ë5:S1HGw L PH[5 S1HGw L  H[ڽSHHCC v 9tHs44s L 1նH[ff.USHHH8HZCC ;E t$HH$D$HD$3M 1gHH8[]fSHGG #M 10H[USHHH8HCC ;E t$HH$D$HD$#32M 1ǵHH8[]f^SHGG BM 1萵H[USHHH8H\CC  9E t$HH$D$HD$}2QM 1!HH8[]HGtSHH}C [G ff.IHHH骶f.5zS1HGw aM 萴H[S1HGG vM bH[fUSHHoFGn oFGoF(G(~1M 1HH[]USHHoFGn oFGoF(G(~1蟣M 1辳HH[]HSHG8Gt!Hs H1kKHǃ0X_[fDBG Hǃ0X_[f*GHLJX_G @ G8GHLJW_G ff.fڹGHLJW_G @HSHG8Gt!Hs H1JHǃW_[fDG HǃW_[fjG8GHLJW_G ff.fHATUSHtHHIVu"CHǃW_C []A\fLHH[]A\AAM 1S1HGG M H[fS1HGG M ±H[fAUATIUSHHHztWoEMCoECoE(C(LC8HǃpW_IZH[]A\A]fDHBH8uHZt}HYH@Lm1=HcCIEXE8tQC8HǃpW_H[]A\A]@AշHC@HI$[]A\A]H3C8HǃW_H[]A\A]LmHc׋=1tA]H[]A\A]A^A_fDH-HH[]A\A]A^A_fD1T |H[]A\A]A^A_USHHH|$ tH[]DH|$Ht$$H|$E#D$EH[]AVAUATUSHD%WHD$D=Մt H|$L5L-Au$ uD$=]DH|$L5\L-] t%t at< H16H|$"H[]A\A]A^f.1iQ*衋h H1,<;HHkAtBH)H|$bH=хH[]A\A]A^.H|$*c H¿11CH|$H1NZH|$?t cQHD$8t A P{GHpXpd 1f.1蹭HH|$HmD*ta o T 6HH|$< @b H12b 1u;u>H|$H-fb 1@c 1xxc 1gHwX1c RH8u+;9P HpXe 1b'P8tHpXd 1F 9P sHpXd 1"HpXd 1 ATUISHHH誎tH[]A\H}Ht$  E؋D$ A$H[]A\AUATIUSHL%|H-}Ht$ H|$tH[]A\A]f|$ Y|$u1T 1EL%.H-/H[]A\A]DKIEH[]A\A]f.8e 1 fAUATIUSHo)$)=( <&Hu<_}ts=@o)$'H <&u<_t ty@$uyHAl9{Lc}1T K H[]A\A]CUՁ@(„t<uGfo$~H[]A\A]@@$t/fo$C,~H[]A\A]ÿhe 1 H濈e 1 @SH@taH"{Hzty(Ht H@[fDHzHH@HHgu#^H@[H@[D1H޿e ! f.USHHHH tHH[]ÐtHHu HH[]Ð11HH衪tHH[]ÐSHHH|$tHH[@HT$HtHR(HH[ÐATUSL%U}HH-S}F'%*<w$Ÿ] %<=L%}H-}[]A\%<=fD%<*kfD[]A\[]A\%a<m$`^ fDK%<=c$<. fD$taE1E1Lf H1L9d$t LLd$M9t L Ld$LH|$(9D$E1E1Ht$1!U RELd$C,HD$1]V /bD~_%j*?,#B)E1E1_&Ht$1,U ts|$Ld$,tMHD$MAM@l$MHt$1GX $+MIt$X l 1M@ 1It$X@l 1MMf.H1{U Ht$tNt9HD$1ɾHHxPmtHD$H9QHHfD1n HfUS1U H(H-[H[Ht$HD$t-H|$H-[H[EH([]fHt$1Y HD$6tQH|$?t/D$ H-L[HM[H|$D$ H([]8n 1,H|$HD$H|$x tP?tHGHHtrHx(UHD$HT$\HxHH\H\.$@uHpXhn 1H|$?d+f+H|$!ATU1SU HSL%@ZHH->ZHD$H$oSuyH<$HGHH@(@RtoISHt$1Y -SStQH<$L%YH-YbH|$XH[]A\DȐ[uRH$:[H[HD$H[DSH@ftIHYuLHUHxt>Au4HUHH@HHAu/H@[D1n H@[Ð1HHo US(Hh@u#1H濏U \t-Hh[]@1Ho wHh[]fD)Ht$1U o,X)D$7H|$ $HTHxHlTH|$ H@H(H@*H$qYHY xo 1H<$&Hh[]DtHSHx0H|$ bHt$o 1/DHt$o 1HT$Ht$ Hp 1cHtKUSHHH?xH{oH{fH{]u H[]fHH[]:fff.HHD$t!16.-tH@H@H|$tD1~"]11U rt\H|$WHtzP;fD8p 1f&] H1gDxp 1LU 16@ 1H't"1t!HHfDttиf.ATUSt"1b ft[]A\D[]A\@(UHVpӀHh IĿU 1H2I\$h1FX Ht,1H޿FX KM3MH0VoSHAD$Xu KHHt@H{ HlmIoAR)D$и <&uHh[]A\A]f.HsHH>K)u$HC@XHh[]A\A]fD1 u?`Cv<*1H4tff1e D1e /u: `CfH0HT$1q <f1%teHl$ 11f HthfoD$P_C@V 1H,wnve Hte Hd|$ r_C;<*l<:><)6t,(1H2f L-OL%OL-OL%OH|$]ƒp<,<)H|$?@"^C Gb<=j] H7<'2 H7'. 
H79BH|$8e?HD$H;CsH|$EfvH|$@HtKUSHHfH;Hk H{H{H{HH^2HuH[]fff.AVAU(ATUISL-ML%MxzHH[t&HL-ML%MP[]A\A]A^D=AuHE8uH}Z[u:tq 1xH}([t߃t:t3H}[ttHEH@HH@(I.L=[1HE@AUATUSHHH(HD$htH[]A\A]ÐHtsL$$M,)u*HltJH|$4Ztbt-)tNHD$LeHwDE1H|$LH[]A\A]fE1,S H1fH$IE I=USHHH$HD$t1b kt H[]Ht$HtOt1tot6t#u}D,R H17H<$H|$1H[]HD$E-LHLH$H M7IwoLHoL@oL@ oL@0oL@@oL@PoL@`oL@poLoLoLoLoLoJ@kHD$KHLH$H%LHXHKE, @UfSHXH-IHI)D$ HD$@HD$)D$0tR1.6tQHX[]DH|$ H|$(H|$0H|$8HX[]@H|$vt,t1t)K H1`H@)tHD$H@Eyff.S1W u[Ð(tD6[fD~J H1ff.fH18W u H(t Ht$1GX #D7t)I H1@H@)tHD$H Dyff.H1TW ktHfHfDH1RW +tHfHfD7ffD9VfD8FfDHWuBHfD H H1HDH1%2 Ht$tH@H|$ete1HiW vu1rW  tH ?H!?L->Ht3USHHfH;HktHH HuH[]DfDAWAV(AUATUSHHD$8t7-D H1nH[]A\A]A^A_o8<H|$`)D$@Oo<1W )$foD$@D$`D$-;E1D$/D$.1E1E1MD$,D$+HD$ HD$HD$HD$H|$ThHIE1HH8NHxHeHH@HHx(b|$-HP,HPHt H:H@HH@(@QAAL+IuPIEHHLx(t>@MA AGPAI HtHHt BPA!IuXs 1NfDH|$H|$ H|$LH|$8Ht H|$H=-3HD$D$tDH$1W D$`D$-ft$`tAE?H=2HtIL+Mt A}<(u AGP,ο<DHt$81!U uMELl$8EA}HD$8FI}HFcO,A^Ll$tHt$81,U sbHt$81!X P?Ht$81?X -ufHD$8JX HPX1żA@Ll$8<T$`HAMp,HD$8DM蒾)IAk|$+Ll$8<,HD$8 D|$+Ll$HHD$8.X HPX1ֻA|$,HT$8L$` HD$HxHX10u 芻HD$8HD$ ,HD$8uDl$,YfL+IEHIuPLx(AGPHHf.8uHPR8HHu=>m=Xs 1I@Ho@XT$.)D$P5DHT$@1W 芺t]$Ht$@ ɈL$-u<|$`%H= /H19Ht$@Pr 1E1HD$ HD$HD$HD$5D`fDHT$ ffD1]1R`DaHD$HfDI}Ht$`貄L+AM 9L$dAEeu#IEHHtH@Ht @]FIuXt 1pAGV@ IG(H@8IEHHH@H @]D$/ED$/DI}识H=h-IEL+HkfDHt$@W IfIuXpu 1IuXHT$@u 1vIuX l 1aIuXt 1Ll$GLl$=+; H1lIuX@l 1W H$r 1>HrXt 1)HrXHT$@t 1W 11]V 譻_HD$ HtsMtnHPXIvXu 1tHT$hH:tABpt 1Or 1>H3s HXu(D$-|$.|$`u=H|$ u5Mu0Ht$Pu 17HD$ s 1HD$ Ht$`3H4HD$H4HD$ HLEHD$L53H=4_HǸH=*H@|$/t|$`uH|$ wMnH\$ H[Ht?HHtxuHPHHtHRHtB]tHpXhv 19E1HD$ fDUS(H蠷1DH|$1dHD$HxH4[H|$b8JHD$H@PHt-HHu HHHuP_%HHHCHHD$HkX]V HCP1,輶#?7 H1H=1H=2H1H=1H1mH1H[]fDf1 _D@H‹ zw 1uiv 1dXAWAV(AUATUSH8HD$(迵f1D$E1HD$1HD$H|$m[IHE1LLI8OHxHYIH@HHh(HZVI>z6t5I>\6 IH@PHt8Ht} EPA),跴bfHl$D$Ht$(1,U 赶HD$(YX HPX1IAL|$(,7nDMHt$(1!U Go|$Hl$(H}HW,ڳ1AMHl$D$HE(HHt@8w 1@LH|$~H|$H8[]A\A]A^A_E1HD$HD$Q4 H1蒺1T1WDXHD$IxfD^3w 1:EDDM)tdL IwX@l 1H|$(Hw 1HuX l 1йH|$(fHl$1]V kHD$S- L%-H-HD$H-W 1cnw 1R]ff.fHHD$92'2u{ 2*Ht$1Y 蔳totFp2 H1豸H|$GHD1px 茸Hf18x tHf.H|$1|$tOHD$, HI,Hи <&(1x  ouZx ַ AUAT1US0f HHt$ HD$itHĨ[]A\A]DHt$H|$ wcHD$HX(CZY 1*تl1êM1vH讪.1X 虪1X 脪1DY o1MY Z1YY E01eY 01M O1~11=-mt1 өqEu z H|$荿DfD(^UHt$1GX tH L= L% L-} HD$ÖtHD$)L=] H^ z HL$(HT$ Ht$1 Y A@z H|$EH|$ DH|$(D1X JHD$s!H!HD$H!HD$ H!HD$(H!Ey 1_vz 1IH|$߽V1X AtET _0HſgKHHEHǹ4oHE@HD$HEPi H-j HD$N H E1vuz 1eit܃ _Mtt'-M% H1ƫH|$H\7h-#ZVlLD$@HL$8HT$0Ht$H1Y 'H|$0BWH|$8A@H|$@A)1X ^HD$H _HHD$0HHD$8HHD$@H=Ht$HH|$@HD$HHD$@]tut#H|$@H|$HߺC%;tӃ _H4HHoH@t,# H1ʩr HD$HlHHD$@H4HUHh#PF+T  袭}9ٺn7ʺ_ UHF< _CHHD$HHEPG4HEHHoaHE@H-I82ff.fUSHHHtS iIt?HAP1HHHxZAHHC(HtFHH[]HHHx}AHHuHfHHxAHD FoqHHC(HC(HxzHC(HH[]ff.@SH/ҟt=/ßt.H6¸tt%1z  [f[fD/vu[ff.@ATUSHP u!HH*HD$H|$5|$AH"L H Mo; @1H膚eCH$k@T*HH :Hq:tH| 1 PH|$H<$H7uHHD$xHP1UH<$GP@H7HD$HD$HHP1#H$H@MI$CTHtBHHHt/fHHPHRHH;Z(H@HuH?Hu肿HP[]A\fDP t x8@UzH0HU1{ ڟH$^H;@H3HH| 1IH|$|DH|$1LHMHMH0HP{ 1IRH$@URH0{ 1ТSH| 1赢| 1褢ff.@HX1P_oY uHXkt^Ht$1U uHT$H|$1/DuHD$1ɾHHxP$tHD$H @HHXHt#Sf.H_HHu[DfDHtkUSHHfH{7HCHt,HxHt HCHxHt HCHHk HvHHuH[]ff.S1Y HHܛH$P PZ HHxP1ɾ #HHt zB#?HT$HH$HHHAH{,/ܘH|${+HD$@TBH0HxP1lHD$"f.@StHp} 1 FH$PZpDHH蝊"H[fH$H0} 1HZ蕡1H濌Y .tE> H1H[Hx 1D-HH$0} H01H[S1p_|U 莙t[H1ɾHHxP!D؉[H9u fUSHH1HHH;HH{HH[]g1ff.AWAVAUATUSH ظ<(tH[]A\A]A^A_fD(E1f<IHH(HHUHMtHE1D$H{ N%YHSHBPHt"fDu xHHuHBH1Lx(I7IP!AtsA AGTt MDD$)A蓕,耕(X;HCH#@H~ 1輜HmI]1IEH HLHtIG(HHHI7Hx81DD$ DD$'f) H1'f~ 1UA|$uH ,xur(T:HHTLHmHtHEH@HH@(HHtL9tIVHq~ 1臛~ 1v 1e1Y ϑ<҃2AVAUATUSHH|$XD$PHH|$L% H- ~E1E11HL% H- 0t!H|$XoHİ[]A\A]A^DH|$P辝H$1ɾHHxP!ucH|$P褝fH|$1HٺLZH$Ha@T W@Y@MLAAfH<$11ҾOHt$1Y ĔfH|$PtoH4$H|$1ɺ^J u`EH$HT$t L9u`H ;Hw 
HA08t\1Y HD$0 HpX1\H0HXX 1BdHrX 1(JATUSHH|$XD$PH|$L% H- tH|$X}Hİ[]A\H|$1HXH$Ht@Z:H1x'H$~HxHH_(ID$(HuCZKY@KV@@CZID$(HHC(Hq8B8 B8q8࿃@ B8I8 ȈB8KP@1]V ̌Ȁ 1wHJ(HH3HQ`1ID$(HHH@`H{ It$ I$wSZHCPKY@@CPSZAL$VSV@ ʈSVAT$P ЈCPAT$P࿃@ ЈCPAT$P ЈCPH$HT$#H H H@HHL`(HY/H H@L HH-qHPH1ATUS8t 1 蔐[]A\f.1+Z 4(t1 P[]A\fDE11x.o:H衈H}_tGH}GPH1 HMHqXH¿ 1賏HknHtHkXHZnHHu[]A\H} tHE PJv5t0Hx1HM {:ƇHEHE Ht@Il$X)衇,I苇 1;HH 1̎ VxU-oxH@軁pHfDH} nHATUS8t 1 []A\f.(vt1 Ս[]A\x9,oL`(HHL,t'u2fHkXHDlHHu[]A\f 1df)u蜀t׃t}("H-}tLlHu0HtFTu-F^tH=*t1X 1؁ Ό@HA:tDUS1+Z HVtA15Z Ett(tl؂ 1`H[]@tڃtx"*o1@(H`j릐xq*oHh(Hp0HHt!t){(uC()Bt&؂ 1衋HkXHTjHHu'~tЃtH+"HOjDAWAVAUATUSHH8t#1 'H[]A\A]A^A_1+Z ą(聃t1 fxA)oHH{HHo@ C(o@0C8o@@CHE HkXH$iHHu0f}Vtx#(o He@(H<H@*x 1Ӊ^HC8uxt( 1賉>P8uHA)u|tE1{(H#HL({(<Ds,Hl$YDH¾`N H14ELt$`DH:Z 1LH5HL$1L[GH|$H5k1HL'H$Hs(1Hh(HA} AUPEP@ ЈEPAUP ЈEPAUP EEPLePH1LL M]{(uHH} L͎p~&HHHHHH$HBPHHB=H1A~HCx 1蔇IEP mEH@D HC0Hl$FT HH1s:HC0HtHHt 8Ds,Hl$QEHL6T H1)HC0Hl$UT HH1 IE(HH@HE(IE(HEPJ8࿃@ ȈEPJ8 ȈEPJ8eV EEP0Hz`EPkEHA@Lt$`DHپKZ 1LQ;HIUSHHtHt$1]Z 蚀tH[]f.p2轛LtH|$0fDEMHHD$HEP#HEoo@o@ o@0o!@@o%@Po)@`o-@po1o2o3o4o5HEo@OH-PHD$EL*HsfDM H1?f.SHPH8t1 HP[(HD$}|thnt>H_HxtH|$utsH|$NHP[HD$QHrH|$6t!S){f H15H|$˓x+nuHH|$H@HHSHt$HڿH 1߂"f.ATUASHH-HH|$ =wt H-HH[1]A\D9d$ uH[]A\HuHfAVAU1ATUISAv H@L%H-HH|tH@[]A\A]A^@Htp6=HtXA( II~HtcHCuH1؄ 老L%iH-jH@[]A\A]A^EuL%DH-EH@[]A\A]A^$t;HS <&u :f.H <&tuHL1 Ԁf.US1HHHH}tHH[]DH`HEHH[]SHHXH1ɾIoC@oC@oC(@(S8P8[fDSHH0H|$ t8H0[HT$uoHL$H|$ HT$HD$HH|$HD$tIt\oH|$)D$ u 1HT$H?H0[@HD$HH0[DH|$H0[AUATUSHH8>o6H|$HD$)D$ ntH|$1KwtttH8[]A\A]fH|$H\$t#H|$ H1ɾ迉HHD$HH]H8[]A\A]H|$ewH|$豦b(vTH|$6)vH|$HD$Hǻ諎fDH|$ L-L%t3u:Ht$ HA1HtHt$(H|$ $ L=L5fLt$ tL*HIMnXMf`LuH8[]A\A]A^A_{+IMHD$M~XIF`++IfD*IfD*IfD*IfD++IfD 1$vH|$ 躆zL訆hfDH|$ 莆H|$(脆D~ 1nvff.ATUHSHHt8H[]A\H<$uZHt$5)H|H$L`XHX`tBH|$L%FHGzu 19uH<$ЅH[]A\fH$HEH[]A\f.H<$藅H|$ATUHSHHt8H[]A\H<$uZHt$(H|H$L`XHX` tBH|$L%vHwu 1itH<$H[]A\fH$HEH[]A\f.H<$DŽH|$AUATUSHHHtmH[]A\A] A H|$L%HtcH<$ujA Ht$t/'Ht/H$L`XHX` tA 'HuH<$ H|$fD 1TsH<$H[]A\A]H$HE fDG HGHG HGGG$ff.@?u Hԁ fD $ fD3f f.3:f.(f. f.f.s f. f.*f.K f.b f. f. f.Ć H 1jrf.US__H(HDڃ? 
H$h BOZH(H[]HUNKNOWNH琋W̆ H1~@W H1f@W؆ H1N@W H16@HGHH2 H1b@@HG1Ht'HHtH5HH zH$M; D H1诲f.HGH3HHHRHB]HCLASS(*)CHfDHPROCEDURHEfCDHTYPE(*)HtfDHG( HH1SDHHOLLERITHHfK3DH; H1Ʊ H1豱؇ 1pHinvalid CtypeC HW H1nfHinvalid sCclasHfS ff.US@__H(pHD]uT?Htd,HGHtHH:S ;K H1趰=@H(H1[]HK ; tF H1wH(H[]DW ;t_ H1N@E ʐ H11*몐W ;t=o H1@H5H˜HzH$K@R Wff.S__HD؃?uJHGHtH8u;W;tX ~;HCHARACTECR(*,CH?)fC H[D[1XH߾o 11H[@HCHARACTECR(*)C Hff.HHt;wtHHHt 9wuH 1;mff.USHHH?Ht'Hf.HH;Ht H uCH[]ff.HqHt';=qt@HHHt;zuH 1lff.fH^bHN@HgH>ff.H'brQH fHGH.ff.ATIUHS1fHH4tLHu[]A\fD[]A\ff.AWAVAUATUSHHHH4$3HHD$L<$M7MDd$E1fDM7Mt'LLDHM9sM/IuMtILıHt$HH<$HHHH9ƉF9wHL[]A\A]A^A_@sE1ff.H(H5}HHzH$H(ff.HHH@1H4$HH|$ HD$Hd$ HD$(yHHf.HF0H9G0rfHFH9G|fHWHFH@H9B|f.Ht GXt.ff.USHHHHvHuHu H{ H[]H[]fHt#SHHHu.H{HuH[@fDH[ffHu fSHHH{H[顾Hu fSHHH{H[qHAVAUAATUISHo(HIHuLDBUuHzSH{DLLHusH[HtwHk(HtnHuLuHuHtLmuHC8@tHaLH8^EHS(qBUm{fH[]A\A]A^@[1]A\A]A^D1ff.fSKHKHEHHH@0H@@ _jiHH@H@ _HP0HjA[HtcUSHHH{ t!HsHH[HuH[]D{8uپ KHXHH/jDfDHt[SH(HuRHH{t H[+H{8H{@H{H H[@ÿ؈ 1rffSHH=ZHo YWt1uH5sIؿ0 1bH5]Iؿ` 1bH5OIؿ 1bHHPHHt4 < u H ff.fS@=a_HCH_tPk <&uރ_-„t΋@H-E[:$*f fDSH$D v'Sw$ Hrڿ PH-[1@H-[ÐrH-z[oY_fBH-J[?)/fH-[fAUATUSH$H@ vPكSw$ՠ Hrڿ ( H1[]A\A]@0HH<#f.H)HHHDPHR <&uՋ7H-?5%H[]A\A]DA1IHfL9skA,H'u't݋HsLH-HNH[]A\A]fDMLLIH xH[]A\A]H[]A\A]ATUSD%3-189two$@ []A\ D%-q f f fȕ f 1a@St t&t [ufDH=̶f.H`_xHÿ ff.@ATUIS: 1H)Ht@HHH u9H|$ HٺHHt1H[]A\fHرH[]A\DñT$ D$L$ %эD$I$H[]A\UHAWAVAUATSHHMȈUIIf.}t A~M.LL聦HH!%tIDHSHD@HL)HD LIHMLHMHSLI<譭ԁ L HHu(MvM1HeH[A\A]A^A_][t uL[H=<oLH(ff.AWAVEAUATAUSHHHAHHMH8HHD$tDH@(HtHpHt H芸tH޿ 1]H[]A\A]A^A_H5aHT$H1zHD$EDDHX(tfKT@HCZHKPpCZ1=KSHCD0D4H[]A\A]A^A_KT MtC Ls(C8@8 @Lp @0n = 莔uÐHG HHHF HP1H9uHvH$@ff.S1HHH6HHD$tlH޿1HD$8HHD$0HD$ Hu0HCHICHHtHHTuHC HĐ[HĐ1[DHD$8DHIHt1HP0H9u~fDHP0H9tgHHH9HCHHHuSHHHcHX0Hƿ_HD$JHPK _HD$H[ffDfDUSHHHu@~KH[HtH;{}H[HuH\KHhH@0Hƿ_H^HH[]ÐUSHHHxHH@0tHtHEHH[]ÐHS(HPHC(H(HH[]Hu fATUSHHH{{ t []A\@C`tuH{0[]A\? ChuHc{dHh0HC0f@VuYH`IHC0H{PID$(HLcXtHHoL HHu{ HCPfC`oH0H}3HIuHC0HH0IH>ATUSu:IHH?HHtDHH_H(AHHuI$[]A\ÿ؉ 1YS@H==t+ t[f.vs[88A H1VfATUSH H!1Ht^ЃH MHt5t)AĐ<'u EH{u߃tx-|H []A\aHtЃH~=F{r h'Ht Z'-H []A\f'f>HI1 LL#1<$QLH!ʁ tL€DHHHDHD) Na >USHHauExH觮H5H1ɺ聫H=襬H[]fDHt$E1I11HHHCHvxHpHxHL$Hm HH1DH5u~-u -0F.fQH=RHH[]fDH#H۫H[]ÿX f.SHH =1tH=HH [H?HPHt$HD$HT$HD$H [SP HÅu@ uH{tHC0Ht [fD11Hs0H{(H[fD=yt=D1ff.=IuH7dHHDfDSHH=1?u H[郪HCHH=[cSHH=tHH[fDHcHt$HD$YH[=tSHIHBH[DHff.=ISt H[HHH[ff.SHH=tHH[fDHcHt$HD$H[=t}D1\ff.=t7USHHDH-5HHңHH[]Ԩ@HUSHH=0t1HH[]fH1HH[]ÐSHH#{ @@C @ C ǃj{ C @ ЈC ?{  @{ C {  ЈC CH|$ ЈCC?D$7CT$H|$ ? ЈCC D$ D$ ‹C% ЉC=ξt4 dDw߉$h  CGH[@Cp 0C롐Cp CfCpz CffDKWKGK7K'K K @K@KKKKKKKK wK gK WK@GK7K'KKK  KKKK K KKwKgKWK @GK 7K'KKK K K K K@K @K K KwK@gK WKGK7K@'K KKKKK  @   x h XCpJ@C6tC VC8C |@`{'C8C;(C@CqCM-  @CCCq UC@7C C@C C @CC[?{ 6C  @qUC 7{"CC{C C{CsS3C@C @CumuXC<$Ő  B+ A D @뗾 ?C{ >C_ =C? ;C  :~ 9j 8VC 7>C 6&Cu 4K 2# + ' % $ C w ~C W *fC 7 NC  6C C " !C  m O C /   1r /^ J 6Cw CW #7    C  C  C  { n] ZC= BC * C   m N / - )} i (R 0> 1-{ .   
PCZ:HS(HH1@@( HP1QHHS=$H11$=HE@$@{fD94@Uu~fDPZH5H4H DD0Et ;4HHPH9uHPHPHPHEVHLD$HLD$A|$tqHT$pHT$IH4H:HODD$L$HT$ΰHT$IDD$L$HAAL$%D A$AD$/1E1ADD$DC1L$DHT$DL$H4ͰHT$DL$DD$gIHPt%qH|@HtI4$H2HIH9uHIp@IE1M$P?HAAHP0@H3H{P1LD$ųLD$;iH3H{P1LD$蝲LD$A1=`}d4e`}d e@}d e`}dd`}qdd`}4ddI1;ff.@HHtfHW0HtHf.DUSHH1HHHH[]Hff.H@6 HfDH@6 HfDH@6 HfDATUSH0 GHH$fDHGxHH0[]A\ XHEH0[]A\fD xHEH0[]A\fD 8HEH0[]A\fD HHEH0[]A\oG1)$oG)D$oG(H)D$  HmHHEH0[]A\fD?HGHt H8Ht?tHEH01[]A\fDu1H>L$$@s 1ҿHHHcD_IHH@HHIHa1+DL믺Ѕ"bff.ATUSH HtvW8HHHu/AHt$Ht@Ld$LeH []A\DHt$*tD$E1tHD$L DH 1[]A\úafAWAVAAUATIUS1MHHHt$HDHHcD_HhHHIs1DIH]HHHcD_HD$HHHHIHI9rRHD$Ht4MtAAuLInH޹DHHJM9uHD$H[]A\A]A^A_ú`ff.fHt{AWAVAUATIUSIHH8Pv iHHu{gS$@HSxLDwHCxH8[]A\A]A^A_Ð1DHSPHHzHt$19;HCPHt$ 1Hx%;HD$HMHt$ { LHTH)HHHHjf.HSz@ H-JHx?Hǘ5@Ds HLLDwHLHIHH)DYH{ HLL;H@{ HcHHnDLH#H8H[]A\A]A^A_fs HQHHt$ HD$'L$$1tLHT$(~A11fHcH|$MM44I)HLLHËD$$u/HcH|$ HD$MwHD$H8[]A\A]A^A_fHD$(;(|fH{1e HHHIHCMLu8fDHcLI4H)LHHIMMIHtIFhHHZHPpHZ?uHx1mHHcH1LnfDHHM{ Ldp1\1%HmHX\X v\pcb\p`N\pT:\X&\ff.@UHAWAVAUATSHHLERIHIIHuIHUHBHtH=H)IH $ILIL(IHE~2P1H@A|t4A84ADHH9uLLHrH$FIIGMLtdMIWhHI~Ht-HHRpLLEH@HBHILHLEHIMuL}HeL[A\A]A^A_]E1HH-H $MtL81"E1p1E1"떺BZff.AUATIUSHHIHw\DLHHHӹH[]A\A]ÐAUATIUSH͉IHּHyGDLHHH螻wH[]A\A]ff.AVAUIATUISHAIM HcLHH)DHl[]A\A]A^ÐAUATIUSIH͉H(6DLH(HPHL$HD$D$H $R4xT$ tEH([]A\A]@Hxff.AWAVAUATIUSHHHHBHt H8HtHǘ|IHLs 1ҿ誷HHHcD_IHPHHHHI9PHHxܾs ;5HuZH1f.L HHHH9wHH[]A\A]A^A_@1ҿHHHHcD_LhHLII`uIHtbEE1fD{ DHHZHHHD$T$HHHDLE)BIL9wHrpHH1@V@VfAWAVIAUATIUSHH(:`z B$xf.z HHH+HIGxAu6IGxHxI_xIHHH[mH(H[]A\A]A^A_DHBHo@Bo@Bo@(B(LLH HHIGxhlUfz HHH HIGx+IGxfz HHHHIGxz HHH[HIGxR8HD$BIO@H5HAAHHLHLH9t0HPHuHE1H9uMIGX1E1H$@Aw AH$O(AoGI@xAoG@AoG(@(uIIH$H|$LLL$I=LL$H|LH)޹LHM9wHD$IfDIA I_xI111븺xSfDAWAVIAUATIUSH(HBH<$0Lu4*I|$10 MHD$ID$XHD$I$HD$fDH|$1LHc83A$A|$8AT$8ID$@ 11 @ID$@H<LcnIW`I|$@JHHH˜$nIW`I|$@JTHHHH˜gA9l$8@H|$mH([]A\A]A^A_HAw ABHRXE%I$H1H:HLHLx8H=*vC$vmH([]A\A]A^A_ú 4:Q 2&Q 7Qff.AWAVMAUATIUSIIHtt:1ۃtHH[]A\A]A^A_ÐMLLLHf.H1;HHtH}Ht$lE tH}[HD$HH}I t$xITt$xMt$xt$xI)t$xt$xL0H0HHQ;HHuA@t @UHAWAVAUATISHH>D~LL<LIuI9=t+I9v X`@R HHL9s 8H…u{ 3I9ID$Is ILI DLHLqhL9oA~LLLACAFC Heظ[A\A]A^A_]fD{ xIFЄNЄNHFHH9SHHftx6ftp8fthLHhL9Kh'BvO .zHHXHH[ @HH[H@HHVHH[ =HH[~XY=HH[{8Yff6f1HHL$nHL$=0IHΜq$@=HHHھ~[XfHHlHH[ HH}VHH[ Hi=H9tƃu1HHt%zx=uxuHHRzuJ 9H fDtuHHff.fHHt5?t HuHǘQ1@Hú? Hff.Ht&1HHt~uAWAVAUATUSHH $HIHHfA$ $uA$=EHL $! $t1H߈ $Na $H[]A\A]A^A_f1ɉ1HZEuHI$H!]A<$uQHHtH $ $HH $HtC8I $L6 $IHg% M$MtL $O $II$ $HtB38I $WA}-}0A<$hA$R% M$MtL $ $II$ $H8I $A}E&LMtL $] $IH $Ht`B8I $A}LL $ uIH9tHf[% % ?HHtH׈ $ $HLMtLL$H$L$H$IA>yA}]A?SHL $  $9IIH`LL  A} $EcHL|  $}!sHHtH $ $HH $H9v8I $A}LMtL $D $IHHtHLj $& $HL $ E $hIH9HL $}  $BHLa  $IHc2E $LMtL $y $IHHtH׈ $[ $HA>A}5:A?"HL $  $}.A?HLL$H$ H$L$YIIHI^E:ZA?HL $.  $pIIH|WIIH\W% A}LMtL $ $IH $HtD8H $%A?uHL $s $}% LG $IIHVLLH$L$H$tCA?L$? 
LLH$L$H$t4A?L$IH˜HVIH˜HUsIIHW\LLL$H$OH$tA?L$H˜f.AWAVHcAUATUSHhHD$0HH|$ HHHL(H(LHLHL$HMqMH|$DH[;tFHtaMt\HLD$MuYD$=@Hh[]A\A]A^A_@{uD$봐D$M|$AA>MH|$HLD$(A Eo=1LIHLHEMD$8D$,D$L$H ȈD$?mL-D$HM|$(tE11|$("|$??Mt H|$Ht ;`|$ƒ|$( (\$,D$81ɍso<|$8sHtNHD$ H@0p|$(u;u6H\$0HL$T$H؈T$L$t$8|$,uM|$,ɸD$(cMD$=1D$(HIKf.HF0xBH|$HL$0Ld8HL$0HȈHD$!DHG0xHL$0MHlHL$0LȈDLI|$("HT$LRf.HF0xqDHD$ H@0x|A~AE=ք1LID$($|$|$( t|$,w |$8*D$8HQHD$ H@0p|$8@Ht$Lk5|$u|$(t|$|$(Hh[]A\A]A^A_Ht$LHD$Ht|$tHT$HT$t*HM!ȄLHa@{MA>A~xHT$@LH`H|$P&XHIH|$PCHt$@HT$PHL\$DH|$P%vSH|$@lSQLH D$,@LH|$u |$(|$|$(H\$0HL$T$HtT$L$@#fDD|$(t|$8f|$8D$ D$(CQRH|$@GR4fHAVAUHATUStE1[D]A\A]A^Ð#HH;H8Eƒ=9uQv^5uYHCHHUHH@(HR(HQ[BQQHH9NDA[D]A\A]A^fDzA[D]A\A]A^D}Eƒ88lHHHHAEAH:H8HH9v/fDCNr}HHZ%H9H[PHmPHHtjE;aE9CSHHHHA$fDу;9HHHtCHHHHA|Au Au HHHHAqADD Eu AvA| ETAu ADAu A}ufD;_DHH1[]A\A]A^kH:X}uƃuHHHHAD EA_H8@x6HH2H&AEA)fDH8EMc;ifDH:YzOHHgUzHHWPHAEQfDx.HHHA}HH>fDh;EXHuH{;H{HuHeC }C ;E A@DI9C eJ(J(tf.HEH9C/}D}HH"`A}OHH[]A\A]A^雦Hf.}}C ;E LuLkE13AD$9C GBIBxDLLuDLLxtNHH{eHH[HHAED ZA|OEu?|:A;H{}.E Au Au F;Ep1U[W ]/@1*f.?>HHtHH9u H HHHHH9@HuAUATUSHHHHu@H[ Hm HHtzLcLmMM8uGMtMtLLu.A<$uA|$uA}uA}uII9$tH[]A\A]fHuHuH1[]A\A]fH@W fHAVAUATUSHIHcI9H,L$/HHI$HHuuHGHHH(1ATHVHLB(A@TtsG8tlF8teH HbHtTH9tOL fDMIHty11I9t3H@HtdHH9uHu-I9t`HL DHHtHDHPH~4H~H~+H@HvHH9~HH9|MImDHø1Hz $f.Htu Gt t1fHfDHuًG AWAVIAUATAUSHHE1HB+McMJ0HHC8HtIcHtH] E1JAHtGHC8E1Ht;IcHЈE1Ht(DL$DD$DD$DL$DJ Ht%1DL$DD$DD$DL$ Ht:EAD$9C K.ID$HtEH9K EEAD$III9C H[]A\A]A^A_Ht4EAD$9C ~&K.ID$HڐtEH9K HC8HtEJHxHt8HHt,u#J0HtHC8JtD1H[]A\A]A^A_ACDJE1AHFyJ1Ҿ0gQ DD$ATJ01Ҿ0gQ AEkE1]@f.AWAVIAUATIUSH(HHH$L$È\$ .AE1D$=fD7AFA9GMMMMAA9,0ru1AH([]A\A]A^A_ÐAW|AFpLL-]LLJAG A;F 1ALtA~%1L<EAAIGHtH0tOAVIVHtH2t1+I9WvD$ @MAMAIVf. L1MD$ EDd$ IDAA]Du MtЇ k@uLLA|$ AAtD$ Au  A"D$ ADAA9׍CELHA9F 'Dd$ LEMIAAO|$DD1/LLAfDA;1WAIfDAIA5I0I0T$CT$ fH $HtHcD$ A@Hu @Af.Mu3Mu.Eu|$ u"E5An|$c@^fDMMMfLLLL$pulLL$LLJAfLLUADD$ AЇO fADA~AAGtu@tAu-DD$^HHЇ OЇ 3AtAUATUSHHx>Hwm$hfDHDDuHH11Hx[]A\A]HGHL`(HFHLh(M9JHx[]A\A]f.H E t1(H$HHhH{ uHs>uH@HHuCHt`f.HPHIHvPHHAu~HE@Z <twDLt-u u_HH[ tHHmu[1]A\A]A^fHHtՁxxuH1ɾHxHufDH[ f[]A\A]A^f1ff.fHRH9Rff.ff.H9HuHHAWAVLAUATIUSH_IHIHL9u LHAHL9uM$fL9u LHAHI9uLM$xfDL9u LHAHI9uL9HxLAHLAHLAHLAHLAHLAHLAHLAHHL[L]A\A]A^A_H[]A\A]A^A_Ð1H1خnHHEHH11®nHEH_41HH1訮n1HEH1蘮nHEH=54mHHH11vnHE H54HH11\nHE(H[]ÐATUESIȉE1ɹ%2HDHHmh-`H`-`h-`[]A\@AWAVAUATUSHH5 5H$H55H$H55H$@H5<5H$pH505H$H5$5H$H55H=411`nH54H=4H11FnH54H=x41I1,nH5e4H=41H1nH54H=L41I1nH594H=41I1ެnH=4I?mH=41HH1蹬nH=3HD$mH=4HH11蒬nH=3HD$mH=z41HH1knH$@AAHD$H$AAʈH$AAوψH$AA߈H$AAlH$pAA JH$@AA(H$AA,"H$AA<2H$AAMCH$AA_UH$pAApf~H$@AA x\H$AA:H$AA H$AAH$AA‰H$pAA҉ȉH$@AA ىH$AA nH$AA LH$AA *H$AA'H$pAA8.H$@AAJ@H$AAZPH$AAi_H$AAyo^H$AAH$pAAH$@AA2H$AA0H$AA1H$AA΋ċH$AAߋՋrH$pAAPH$@AAE.H$AA= H$AADH$AA,"H$AA<2H$pAAKAH$@AA[QbH$AAj`@H$AAxnH$AA}H$AAH$pAAH$@AAH$AAnjtH$AA׌͌RH$AAތ0H$AAH$pAA H$@AAkH$AA,c"H$AA:g0H$AAI?dH$AAYOBH$pAAh^ H$@AAxjnH$AAh~H$AAiH$AAH$AAvH$pAAȍTH$@AAٍnύ2H$AAlߍH$AAmH$AAH$AAH$pAA)H$@AA:{0fH$AAIv?DH$AAWwM"H$AAf\H$AAvlH$pAA{H$@AAzH$AAxxH$AAyVH$AAĎ4H$AAՎˎH$pAAێH$@AAH$AA( H$AAH$@AAhH$AA9 #FH$AA:0$H$hAAH>H$8AAJ LH$AAcYH$@AAqgH$AA[ uzH$AAXH$@AA6H$AAl H$AAH$hAAÏH$8AA} ǏH$AAޏԏH$@AA<jH$AA7HH$AA ;&H$@AA:H$AA(8H$AA79-H$@AAG~=H$AAY|O|H$AAj}`ZH$@AA|r8H$AAH$AAH$HAAH$AAH$AAҐȐH$@AABِlH$AA@JH$AAA(H$@AA H$AA&H$AA6,AAGL=AAWHMHAAf\1iH$HAAv&lGH$AA$%H$AA%H$HAA H$AAƑH$AAڑБDH$@5EÁAAkH$Aع}03LH$Aع4-H$HAع/H$Aع3-)H$AعG.=H$@AA\RH$AAi0bH$AA{qjH$`A@HH$0A@&H$A@H$@AAKH$AA͒IÒH$AAܒJҒH$PAAY|H$ AAWZH$AAX8H$HAA!H$AA1'H$AA@6H5SH=11 nAAPHƺFH5H=11וnAAbHƺXjH5H=t11裕nAAtHƺj6H5H= 
11onAAHƺ}H5H=11;nAAHƺH5WH=11nAAHƺH5+H=11ӔnAAHƺfH5H=`11蟔nAA͓HƺÓ2H5H=<11knAAHƹߓՓH$HAA,H$AA$H$AA(H$AAvH$AA.$TH$xAA=32H$PAAM+CH$ AA])SH$AAl*b iB &CH5H=11ߒnHAAHϺWHqHAAHϺ\V$HAAHϺ;5AAHHϺ H5)H= 11YnHHAAHϺYHWHAAHϺ^fHAAHϺxAAHHϺH5H=11ӑnHAAHϺZHeHAAHϺ_JHAAHϺ/AAHHϺH=11HHNnԕAAHHϺqH54DH=%11nA@Hƺ|H5H=C11nAPHƹ{H5H=C11H豐nAPHƹUDH]1ɀ1H=CH5HlnA@ HƿH=1n(HAAHϺH;HAAHϺKHAAHϺ`HAAHϺpHAAHϺgAAZHHLH==1.nHAAHϺH HAAHϺHAAHϺԖHAAHϺHAAHϺAAHHL- HIHqH-bL=k=4H\$HD$H H H$HL$XtHHHD H$tHHHD5b Ht$x~tHHHD=@H|$p@\tHILD5Lt$h:tHHHDH=HHT$`o\nH'mH=IS\nH mLH$mH|$HD$(mHHD$ m1HD$L1^n1H$H1Jn1H$H16nH|$H$11 n1H$L1 n1H$HL1nHt$H$1L1܌n1H$LL1Ōn1H$H1H讌nLT$H$11LL蒌n1H$HH1{n1H$HL1dn1H$LL1Mn1H$HL16nH$H$mLT$(H$1L1LLL$nL\$ H$1L1LLL$ӋnL$H$1L1LL豋nLL$xHD$1L1LL蕋nH|$pHD$ L11H|nLD$hHD$(1L1LL`nLD$`HD$01L1LLDn1H$HHL1*nH$H$1HD$8H1 nH$H|$x1H$H1nH$H|$p1H$H1NJnH$H|$h1H$H1覊nH$H|$`1H$H1腊n1H$HLL1kn1H$HLL1QnH$Ht$X1H$L10n1H$ LHL1nLT$H$ 1L1LLnH|$8H$(UmL$H$E1LL1LL迉nLL$xH$(E1LL1LL蜉nLT$pH$0E1LL1LLynH|$hH$8E1L1HHLVnHT$`H$@E1LL1H6nL$H$HE1L1LLLnLL$xH$PE1L1LLLnLT$pH$XLE11LLLʈnH|$hH$`E1L1HH誈nLL$`H$hE1L1LLL臈nL$H$pE1HL1LLanLL$xHD$8E1HL1LLAnLT$pHD$@E1HL1LL!nL\$hHD$HE1HL1LLnLL$`HD$PE1HL1LLnH$H$xE1HLL1辇nHT$xH$E1HLL1螇nHT$pH$E1HLL1~nHT$hH$E1H1LL^nHT$`H$E1HLL1>nL\$XH$E1HL1LLnH|$H$0E1HH1HnH|$H$E1H$8H1IHLІnH|$H$E1H$@H1IH訆nL$H$HE1IHL1LLnHt$XH$E1IHLL1\nH$Ht$XE1H$IHL14nL$Ht$E1H$L1MLLL$nL$Ht$LH$PE11MLLL$΅nE1H$XIHHHL1譅nHH$hHjH|$1H$IIHL|nL$H$xLLLL1H$MML$CnHt$hH$IIHLL1H$nHT$hH$IIHHL1H$nL$Ht$LH$L1H$MMLL$貄nL$HL$ LHt$H$1H$MMLL$unH$jML$LLLL1ARML$@nH$H LjL$L1ASHT$ ML$IHnH$jIUH$MHLL1܃nH$H IjUMH$HLL1诃nH$jIUH$MHLL1膃nH$H IjUMHL$xHLL1\nHD$xjIUH$MHLL16nH$H IjUIHt$hHLL1 nHL$H$MH$MHL1QAWH$HL$(ԂnH$HLjL$L1ASASHT$0L$IIH藂nHIIjSHt$ HT$xHLI1nnH$H MjHL$MHL1QAWAWH$HL$(6nH$H MjSLSL$L1ARHt$(MLL$nH$H LjSLSL$1ASLD$0MHt$(LL$躁nH$HMjUMSt$(HATH$LH$1聁nH$H(IjSISt$(HSH$LH$1InH$H(MjSMSL$LLLL1ARARnH$H0MjSLSL$L1ASASHT$8RIHHT$@׀nH$H(MjAWMAWAWHUAWLLd$@1ATH$H$薀nHD$HH8MjHT$ ML1RRHT$0RHUAWATH$H$Vn1H$LH@1ހn1ILL1̀nHt$H$1L1賀n1HD$HHL1蜀nH$HIjUIUL|$xHHL1LnnHD$xH IjSIH$LHL1@:n5dlAIZYDDHL!=;lHt$,AغHDDDlHt$ CHϺEADDD kHt$(ZAHϺEDDDkHt$0qAHϺEDDDkL$AHϺELDDUokAHϺLDD1KkHt$AHϺDD %kHt$ ̗AHϺDDjHt$(AHϺDDjHt$0AHϺDDjALHϺDDuj)AHLҺDDQ kjHt$Aغɹ=DDH+5EjSAHϺHt$ DD=jHt$(iAغHDDDiHt$0HϺEADDD iALHϺEDDDiAHϺLEDDkDiHt$AHϺEDDC]iHt$ ؘAHϺ DD7iHt$(AHϺ DDiHt$0AHϺ DDhALHϺ DDh5AHϺ LDDhHt$JAHϺDDc}hHt$ aAHυҺDD= WhHt$(AغɹxDDH51hAHϺHt$0DD= hALHDDDgHLEADDD gHt$ԙAHϺEDDDgHt$ AHϺEDDWDpgHt$(AHϺEDD/IgHt$0AHϺDD #g4ALHϺDDfMAHϺLDDfHt$bAHϺDDfHt$ yAHϺDDufHt$(AHϺDDOifHt$0AHυҺDD) CfALɹDDH5f֚AHϺLDD=eHt$Aغ HDDDeHt$ HϺ!EADDD eHt$(AHϺ"EDDkDeHt$00AHϺ#EDDCD\eGALHϺ$EDD7e_AHϺ%LDDeHt$sAHϺ&DDdHt$ AHϺ'DDdHt$(AHϺ(DDdHt$0AHϺ)DDa{d˛ALHϺ*DD=WdAHLҺ+DD 3dHt$Aغ,ɹDDH5 dAHϺ-Ht$ DD=cHt$(%Aغ.HDDDcHt$0<HϺ/EADDD cSALHϺ0EDDYDrckAHϺ1LEDD3DLcHt$AHϺ2EDD %cHt$ AHϺ3DDbHt$(AHϺ4DDbHt$0ŜAHϺ5DDbܜALHϺ6DDubAHϺ7LDDQkbHt$ AHϺ8DD+EbHt$ "AHυҺ9DD bHt$(Aغ:ɹ:DDH5aRAHϺ;Ht$0DD=ajAL<HDDDa=HLEADDoD aH$(AHϺ>EDDDD]aH$0AHϺ?EDDD2aH$8(AHϺ@EDDaH$@HAHϺADD`H$HhAHϺBDD`AHϺCLDDx`H$PAHϺDDDOi`H$XڝAHϺEDD&@`H$`AHϺFDD`H$hAHυҺGDD _H$pAغHɹDDH5_4AHϺILDD=_Ht$MAغJHDDaDz_Ht$ hHϺKEADD9D R_Ht$(AHϺLEDDD*_Ht$0AHϺMEDDD_ALHϺNEDDD=^՞AHϺOLEDD^L$AHϺPLDDq^ALHϺQDDMg^ALHϺRDD)C^+ALHϺSDD^AALHϺTDD]H$XAHϺUDD]H$kAHυҺVDD ]H$AغWɹDDHf5]AHϺXH$DD==W]AغYLHDDD2]Ht$8HϺZEADDD ]Ht$@ʟAHϺ[EDDD\Ht$HޟAHϺ\EDDD\Ht$PAHϺ]EDDyD=\AHϺ^EL$xDDLKe\H$AHϺ_DD"<\)AHϺ`LDD\H$9AHϺaDD[H$IAHϺbDD[H$YAHϺcDD[H$iAHϺdDDZt[H$yAHυҺeDD1 K[H$AغfɹDDH5"[AHϺgLDD=ZH$AغhHDD軿DZHt$xܠHϺiEADD蓿D ZHt$pAHϺjEDDkDZHt$hAHϺkEDDCD\ZHt$`0AHϺlEDD5ZH$MAHϺmDD Z\AHϺnLDDξYH$mAHϺoDD襾YH$~AHϺpDD|YH$AHϺqDDSmYH$AHϺrDD*DYH$AHυҺsDD YAغtLɹáDDHݽ5X֡AHϺuHt$8DD跽=XHt$@AغvHDD葽DXHt$HHϺwEADDiD 
XHt$PAHϺxEDDADZX*ALHϺyEDDD4X@AHϺzLEDDXHt$8SAHϺ{DDϼWHt$@hAHϺ|DD詼WHt$H}AHϺ}DD胼WHt$PAHϺ~DD]wWALHϺDD9SWAHϺLDD/WHt$8ТAHυҺDD WHt$@AغɹDDHɻ5VAHϺHt$HDD裻=VHt$PAغHDD}DV$LHϺEADDWD pV:AHϺLEDD1DJVHt$8NAHϺEDD D"VHt$@dAHϺEDDUHt$HzAHϺDD軺UHt$PAHϺDD蕺UALHϺDDqUAHϺLDDMgUHt$8УAHϺDD'AUHt$@AHϺDDUHt$HAHυҺDD۹ THt$PAغɹDDH赹5T$AHϺLDD葹=T:AغLHDDmDTHt$8LHϺEADDED ^THt$@`AHϺEDDD6THt$HtAHϺEDDDTHt$PAHϺEDD͸SALHϺDD詸SAHϺLDD腸SHt$8ĤAHϺDD_ySHt$@٤AHϺDD9SSHt$HAHϺDD-SHt$PAHϺDDSALHυҺDDɷ RAغLɹ.DDH襷5RAAHϺHt$8DD=RHt$@VAغHDDYDrRHt$HkHϺEADD1D JRHt$PAHϺEDD D"RALHϺEDDDQAHϺLEDD轶QHt$8AHϺDD藶QHt$@ӥAHϺDDqQHt$HAHϺDDKeQHt$PAHϺDD%?QALHϺDDQ(AHϺLDDݵPHt$8<AHυҺDD践 PHt$@AغɹRDDH葵5PhAHϺHt$HDDk=PHt$P~AغHDDED^PLHϺEADDD 8PAHϺLEDDDPHt$8AHϺEDDѴDOHt$@ӦAHϺEDD詴OHt$HAHϺDD胴OHt$PAHϺDD]wOALHϺDD9SO(AHϺLDD/OHt$8:AHϺDD OHt$@NAHϺDDɳNHt$HbAHυҺDD裳 NHt$PAغɹvDDH}5NAHϺLDDYL$ AAHϺL6AAHLH$ϧA@HϺHA@HHݲAHH²=NH= NuPH$1A@ɨHz1HA@`H$AA%H;HAAB8HAA`VHAA{qH$A@H迱HA@袱HA@ͩé腱L$A@کL`HA@CH$A@HHA@/%HA@>HA@]SǰH$ A@zp襰H$A@(胰HA@fHA@ƪIL$A@تPL$LA@xL$A@LLA@ȳůLA@7訯L$pA@NL胯LA@@hfLA@IA@L,A@HLxLA@еLA@ ծLA@Pp踮A@L蛮L$xA@LvLA@@hYLA@nXH$A@?h6HA@@H$A@AHA@ɰBH޿ףH$(A@C赣A@DL蘣A@EH {A@PFHx^H$0A@GH=/Qn<HAAHϺs跢H$UAAHϺ藢H$jA@HwrRn< H[]A\A]A^A_f.H$@AA|r.H$AA H$AAAAL͡AAL谡AA̔L”蓡 ڼHt$A@ݔfӔaHt$A@dBHt$A@e#xfDH$A@L$A@:0LѠH$A@UK诠H\$XA@dH荠LA@zȲpHA@SH$A@1H$AAHϺH1HA@ɨ1HA@ڟA@ IL轟HG HpfSH {Ht[DCHL#mHH [Hff.fSHH H H;r G;u.H\HC(HHCHf;&t3H H[f.[nHC(H, HHCHxxu‹sH ުT$HCtHS HPx@H HG(HXxf.SH7H[fAWAVAUATIUSHHIHf>uL9t)H[]A\A]A^A_@H)m&w$@H{LH{LH{HLH[]A\A]A^A_fDL9uH{LH{@L9um@L9H{fDTL9H{LH{L9tL9uHD$LJH[Huo[JQ2f.SH0 H1 H* H [ff.)t = t mff.Hw SHuCH;HtFH{Ht8H{Ht*H{HtH{ Ht[[ff.H*AVAUATUHSH$fDHB|cH[HvH:tfDH9t&EH}H H}H H} H H}(H H}0H H}HH H}PH H}XHi H}`HR H}hHtH}pHt &H}xH HHto=&HHH}LcmH^H}HGE1MJ|H{ IM9u{H}H< H}H% H}H H} HGSH}H] H}H9#H}H H}H H}H H}Ht&H}H&EH}Ht%H}HEH}HH}HH} HH}(HH}0HH}HHH}PHsH}XH\H}`HH}hHt:H}pHZ H}xHC HH) HHkH}H H} H H}(Ht^%H}0H H}8H H}@H H}HHy H}PH H}XHgE]H}H H}H H} H H}(H H}0H H}HH H}PH~ H}XH H}`H H}hHtH}pHtq &H}xHjH}H H}H H}H H} H H}(H H}@H H}HH HFuH}PHtvH}XH& H}`H H}hHH}pHH}xH HHt HH HHk HHo HHDmMH}AHE:E1E~J|HIM9ubH}HO H}H8 H}H;GH}H H}H H}Ht*&HEHH} fH(%EHcEH}HD5% H}H H}(Ht[*&H}0HtM*&E1MJ|8HvIM9ulH}HeU40* u q+" 3vlYbBX+ND:0&hQtPe{d|r4hz^TkJT@=6&," xaJ3mB+B}fxOn8dZPlFU<>2'(fHvBSH.@H;HtH{HtH[HtHv HNt[fDw ATUIS$I\$I$H;Ht;HH9uI$f.H;HtHH9uHI$xH;HtHH9uI$xHtI$HtI$HtI$HtI$HtI$HtvI$HtdI$HtRI$Ht []A\<@[]A\Hw SHt[ÐH;8H{HtH{Hv[zf.SHH?H{HtH{Hw [f.[:f.ff.ff.SQHH tKH;HtNH{Ht@H{Ht2H{Ht$H{ Ht [[ff.AVAUATUHSHfRHH0 tHB|ct\H[fD1H9t4*b K4 hH}Ht]p&H}H<&EAj@zukaWMC9o/X%A*pYB+HD@&EHULcj0 uk(aW:M#Cf9O/%~gP9{+qgu]^SGI0?5+!p 2gM3Sw}mfcOY8OE;1k'WN4 iR}si_UEKRA;7r-[#D-z(lUyoej[SQ~tPH}Htgf?tHEHOEtcHEHHHPHD$H$zOH[]A\ÐHFf8uH~ltH]H[]A\fDHH OH[]A\@ATUHSH51فH=H=nH=۔Iĉm[MHH5]A\=JI1`mHGHH="HH 0H)HPSH~fHt H@uHpFqSHR!Ht[ÀH@[f.USHHR!HHuH@HHH[]@IHHCSH9udHtH1H[ff.fAUATUSHH=DLIr0/f0f0f@H$HL$H= HD=p HD$HD$ D$D$ 0HD$H@AWAVAAUATAUSIHLHLL$t=)D$p)$)$)$)$)$)$)$HlFEQH=z -E!IHAMMHN IEP1LL?ʾL$EHD$uWL|$1-V?1HL H$0D$(0D$,0HD$0HD$@HD$8H= H[]A\A]A^A_@+HIHL$?H|$PHi;=*$h{D$\D$D$H$1lH$H HUHILD$IB7f}.VhHD$@H$L\$(LT$ HH$|$LT$ L\$(D$MLLLP$XLL$H|$G_AXMH$XHHHx=2^I|$HXnH =ILq F=I!mDfIE*Hw=<ILH<$1H_H<$&HdfDD$?D$D$tD$D$bDHRH$LT$(̨HH$$$$$H$襭HD$PH0HD$@H$HH$|$LT$(HvHvL=LLT$L@ DL$MHQ$XLT$ LL$Ht$0H|$LkY^H|$ 5Hu1H`H4$=_HHD$P11-$HCHH0H9tfHH9D$D$`L]ISB7fL*HD$LL$ID$D$sfDHIuHRuL=iLL\$ LT$L@ YCL$HMQ$XLT$ L\$0LL$H|$LLXZPƓD$LMLP$XLL$Ht$0H|$AYAZMfDAWAVAUATUSHHHD$tFIHt#t41H[]A\A]A^A_f.H~Ht$r*H|$tL-LhPHMtFIHt:HHHt HIHHHuE1{D$NLd$A<$oC)D$`oC)D$poC()$H$1L$(HD$ H$LL$IF f|$`$MH$1Ld$ 跳H$LWH$L7H$AH$HH$+H$I1bAE8$LH$LH$LH$LL$ҏH$Lc&IG fv L1yIH$LVL 7qL$L=:L1:mH$IL 1LLL{{Iu |$H1LH;LhH$L$@L$0t:kHJH1JH#IDžD@MH|$`1H5 HP 
1y[H$1H$v$ƒ$S8H$Hވ$Ld$ hL$H=L:I3fH{1&AL$8D$u0I|$Pu(ID$HH@(H@(HH@`Ht C8fDH)m HHD$.H|$H(m IH$LDH$HH$H$H|$0L$PH$1!HLt$Xt$Xt$Xt$Xt$XH$)H0IH$LLd$ f:*A}IEH0L HCPfDH$1qHt$H$$H$1Hx$mH$Hu I=q<{I=mHcn=MIL<IfH|$0H$1ǮHLt$Xt$Xt$Xt$Xt$XH$ϤH0IHCHHgH@HZ@]PH&m HHH$1>H$Hހ$H!H$1HxmH$H< I=q;H" =tMHel;IRH>H *HH(H3GGGff.@AWAVIAUATIUSHHx$HL$LD$L$AKHD$HDŽ$1D $~mHז =)IHپp!:IID $L$8HD$0EMA}IEHHHxdH=} HIuXH1HMH$tLElUH|$01HHl$@褠L$HH$HD$@tH$1HLHn$t =OH5 1HfUH5 =H13m1HHH{1"m=HHH>MtLql)H!H|$0HH]j=ILI<Hx[]A\A]A^A_DH$$$H{D$HD$HDŽ$G7HD $(D $HHxf.=mH3DML-nhH|$IF fv L1IMt LBluH$H9fMH$1{Hct$H=fH$mt$H5[ IAV=8MHL$11m1HH^Y 耠11H4ldIbH5h1E1HMH$D-7lHZYtBwDhH1HH9M=LH{1ImH-h=_IHپ9W6IHD$HHHg=.NI|$HTnH =qIL65qt +HPMIHHg=< 1HHcfDH|$H= eM nHD$Mt;Hl$@HL$01LD $H)D$LD $u|$0tD$ @D$H{I|$1HGH$dmH =ILq5H<$HD$ 4mH$=L5HD$(MLD$(H$H=94HL$ HpfI=I<HfI|$1H_mH =kILqc41HI薿m=HHL5IL=/HMH9$4=Hm=ItL*H$XL $IILfDD $g$D$HHxHDŽ$D $jH{1m=HHH@D $$HkH)mID $E7uLHKhHtHILHH$tHHHH$tdL@HL--D $H$p{?p?p?p[?p?AWAVAAUATEUSIHLHMH$HT$H$t EI|$1腽mHގ =0ILp(2IPL$HEHD$@)H/E1;#HCHHHxnH= HHsXH1IHtHLL$RlLL$H|$@1L&豘H|$PHހL$XHD$PEt =*|H5 1LMH54 =&H1+mH|$P1HHtH賨l;H|$PaH|$@HTHb=ILI<H[]A\A]A^A_f$DH$x[6E1ɋ={LL$L$HH<$1%5H|$P1HH|$`HL$?1LL$D$lI|$|$?1ZmH =ILp/I%L$HEHD$@AH|$`1L%b`H|$`HPHHHkHClH=^uHEH9xIH_IcL-`H$蝺mH$H5b I=AMH1RAU)mH|$`1H^Y 膘11H:ldIhH5`1E1HMῗH$>lHZYtBwXHH|$P1H?H=HHHD$HHH-`=_"GH{HnH =AHIq9.=+H mIIHH_=<MH|$P1HfE1= mIf.H}1LL$(HGHD$mH@ =IHq-H|$HD$趸mHT$=cH5HD$ FLD$ HT$H=>9<-LL$(HL$IH^=<hI{1ImH =ILp,IL$HEHD$@AfH}1H_mHJ =IHq,H1IǷm=yHH5I.E=`MHHھ9U,=GH'mIIL%fH=[@H߾+L $oL $HID$B @7`LL$mL$Ha@8@l8@p8DS(H=oHCH:v HP[@HHGHPHH 0Hv HH=-H)Ha#HfDAUATUSHHtqHI6L$H$fH{(tHQ1HHtLcoCLq \3$@D=umH[]A\A]f.H踃HDLHt*Ef~f=t AE1HHH[HHH?HE1111H_HH(?HHbHHX4HpHx8H`HHPHXH@HH0HH HhHHxHH(HHHHhHHHHHH8HHHHHH(HpHH`HXHPHH@HH0H H HHHL:HEf8H} lE1E111HwHH1HCxHt1xx@ff.HHGHp 4Q@ff.1f z u xaKff.HtHHwHGHV2ff.@USHHt$HպHžHH{H[]1鱾p2ff.fUSHHt}HHwHH|$1HD$jHKHD$Ht6HT$H5pV-lHtBwhHHCHCH[]úP [1@uHf雾ff.uHfATUHSH5S1Y@H=zSHmH=[SIĉ)m[MHH5+ ]A\=I1m1tfDUSHHHXG,uGFt9H*H1:@H;HeSt!HHu11HUHX[]fDHShDAUATUSHHH(IH8H-TGI$HH=HBHRHH 0H)HM$HHھ*ټlI(I$H8HxItHxHGH9GI$LH(IE{]I$ Hf.H1H8 HI$LH(H(HP(HH(I$ Ht<@HI$LH(H(HP(HH(QH[HuH HXI$ Ht!HtHv]H HuH[]A\A]#a.f.H _toATUISHHELehPv=t81@t(HEHcH|PHtLuHmPH _u[]A\fDfDUSIHHվGHHZ=H=IHHNH߾H豪m=cHHIFXH1H苪m==HHZI2H5 | HH[]1k;ff.Pftf5u%HG@:fDG:Df.uHW 1B: tHB@71SHG@7tTHHqHIt$HPH;^{ uHً=kE1[.鵯X+D+ff.fSHG@7tTHHHIt$HPH;z uHً=E1[.5`*`*ff.fAUATIUSHHIH!t$Ht,Mt'HSPHt u HI9t1H[]A\A]fHt3H(HtHQhHt HHHEIMH[]A\A]DHSHHR(H(ATUSf:HtCH>H訔lt H][]A\HMHy I؋=̺FHE[]A\ÐH7IFI$[]A\fAWAVEAUATAUSHHH H HAE裯E1ɄtEtEE EHD[]A\A]A^A_EEuEthEtcCP%=t6EPE1%=uEPAtHE1ɃxA@CPAtHxuuE1mf.IHHHLkMtcI _t}Lf.HP`Ht+HRhHq@H9tjHz8HzHHJ8unHrHHQ@HPPH _t'HHPHHu56(HQ(HPHLA(HH;J8t-'('%'ff.ATUHSI1H`HHxdHHHI$1Hx>HHHtKH5v H16H5v H1H6Hv =1HIF)H`[]A\I<$Q(HG1HH8PI$HH@H0Hl$ tfDH}!HG1HH8HEHH@H0H\$ I$1Hx9HHHBHt H#(G&蛋I4$HL$8:Hl$({HuHL$8H\$(xHH=zu -mH]DHH=Zu  mHcDHHǨHCDHHǨHDAUATIUSH1H( DH|$?CH7CH?HHE1 fDAHZHHHk HHE8uHE8uHE8uHCH|$P80HA HEHHHEH[:HUHHH˜<HHHD$uH|$1!CLE1H|$=H=H(D[]A\A]DHCH|$P80H6A HT$LLC@fDHt$HHX0fDt3H~1BHfvB1HH1@AWAVAUATUSHXH|$H|$(HD$(HD$(H@8HuL|$H|$@dAAG I1HD$Hl$D,1L`(I _LDH@HH= _H0NAA EtՀu@1Lct$qLXHNcT MLzHIHaI?WKװZlBH3Md$HI _>I$Htv݀tEuHKc| iLXE1E1MomHAD$puHyu |,1H DT$ HHL$>lHL$LcT$ lHf?Hf8(OHt$@H)5f H|$@|IIlLt$Hm M=FII@f.IcHL$ DT$IǠ覈lLcT$HL$ hL#LfIcIǠf8PL#LMHl L=HI9@Pff.@SH_f; uH[C7t:LH=C.E1IP芠H1[HHq+ff.fAWAVIAUATLUSIHILHHL`=¬HB11LH賱}IHS HB@7 tLI~/@7 u9}H{ Fx=hHHC HC HIuHS Iu1*LHC HHߪHL$HHsL[]A\A]A^A_ f.u 1ҿAzI~LcI聐=ӫH裏lH5Tj H1*IAD_I~pIHcmIV=LLI HIIu f} LHeM HLK LC(LHHQסY^P = X 1= X HW HS(HptiL)H5W HH蚩W fDHS 1)LHC HHkfLXHHC !@(H |W H%i I=tqr HS(H5;1I(HFW 
H5;I1(AVLEXPL1AH1ijXZff.fAUATUSHHF@7tSHHIԋ=HE1ɾ.HSI"HsL1I(HLH[]A\A]HYDHHG@7t/LH=j.E1IP豜1HHcFff.@{ff.ATUHSHI^HpHL1M'HHH[]A\雧ff.ATUSHG@7t_HH螥HHt/L`I9u!HSIH[]A\=E1ɾ.כzfff.@ff.ATUHSHIHpHL1m&HHH[]A\黦ff.SHH@H¤Ht'HPH;9IuHً=E1[.@$fSHH@1HeHt'HPH;M;IuHً=_E1[.驚 6LDSZHH@HHt'HPH;j8IuHً=E1[.FFfSHG@7t[HH衣HIt+HPf:u!HBf8uHً=E1[.ޙWRmfDSHwH11[HGSHH@1HHt'HPH;d IuHً=E1[.IDSzHH@H袢Ht'HPH;d IuHً=E1[.`fSHH@HBHt'HPH;2d IuHً=<E1[.醘@)fATUHSHGH@7>~sH1 H5S =H1mHH}qH5c HP 1#H Hc I=HH[]A\f.H5iR H5]R HIHKc =]ILGU=GHHL*R Hc F1=#HHLQ Hb Z H+DHHxHf.SHG@7tz=u]HH蘠HIt-HPH;eb t H;db uHً=E1[.ӖwvqbpNATUSH_C7tQHHI|lt0H@w!Hs[HP ]A\f.[LH]A\ff.fATUHSHHIkHpHL1 HHH[]A\8ff.ATUHSHHI{HpHL1 HHH[]A\ff.ATUHSHHIHpHL1J HHH[]A\阠AWAVAUATIUSHHHHH5O :H` =MHFIHsO HHHeH H{ILRO HC` L=RLx FLH55L1IH5 5L1ItH4=MHHLHHHHH[]A\A]A^A_Hff.fHG7tHHX3mHHú fDAWAVLcAUATHUSHH1HH5M_ HIJ4_HHD$hJ4_HIJ4_HIH^ = MLGH$H^ =ILFIJ_HHHL$H^ L=HHL$H^ I=GHHH`J_HLHH[]A\A]A^A_ff.AWAVIAUAT1USIMLIHH_hHHIlIHH-lIHɛHlIEH講HHlI$HC1HXHC HEHHǂlHT$@HH^H覂lHT$HHH=H腂lHT$PHH[]A\A]A^A_ff.H _t)@H HPH _uff.H _tCUSHHHu5HHt&H;HkP<H+H _uH[]ú] ff.~fATUHSu9u'H_PLfDH;HtSHL9u[H]A\*n! f.SHH?H[[*ff.AVAUIATUISArDpxLhHqrHL`Ph~ M1Hf.T HH9u[]A\A]A^f.AUATIUSIHr@xLhL`HqHH@P _h~M1HT HH9uH[]A\A]ATUISHqHL`@xwqHHhP[]A\ff.ATUISH_(H _u3f.HHH;HkHH(H _uID$8Ht&HP@HtL9uSDL9t;HHBHHuI\$@HtHkHHqHi(HHu[]A\@HHID$HHHP@ H _tfff.SHHHH|$tHOHA7$<HT$11HucH|$f?.Ht ;>H~HHt f8=-~lH5 Y H1oH[fHt$HN fv,HH[DS fwԋ=DHHD$f.HG@7>HGH@@7 ,lH3j_fDHHHtH4$ slH4$1;&{HH5 X H1rDHCHH@(x HSP:H HHP(HB8oH(dHaiff.SH(CtH{HCP[úvff.@HtztAWAVAUATIUSIHL~H2H|$D$,MmIWhB D$,M1fB KH9.HcH(H<uEt A979CH|$01lIcH|$Ht$0M\Mt$ISL\$ 9Ht$0L茬LT$@LLLT$wHT$PH5KV 1H|$01HD$kL\$ H|$Ht$0IHt$0L+LT$LLHT$PH5U 1XHU LD$H=GLH莔IWhKHAuHB 9MXM9t$,u:HĘ[]A\A]A^A_ú``z`fff.@AWAVAUATUSHHHXHɋ$MHt$(H $LD$DL$4D$8$D$< ˆT$@ H$@7 H$H5T 1HP $HD$ H|$P1HH$HHHD$Cb L{hA;G h 5}tH$!Lt$Hl$ID$0LD$0HALIK8NcdHt7OHS H=G Ht$H诒K8LB HKDŽ1Oy~%HQ LDD1D;"@HL9uHIXHuHcHyB HKLTPL0LL@HDIA9E IMmXMt ImhE Hl$j1jt$@AH|$L$HL$`ߎ(HI?H9IHD$ L5A L`(IZYHLhIExHKHL藅lHH\LI|$I跟LHH9L$0#H$H$QH HHxdH9uMVHBE1E1H\$H$J_LLHWJ_H @ LHJJ_LHJL@ FHQ =L@ HNQ H=lHHH=MIHQ =lLhgIH=Q =OLIؾHICHHL9,$IMH\$H5P |$<L? =MLH<;L|$ HP H=ΑHIMH5P 1LILHHdHL$Hx? MLa(HA8tL3klH$LHYLHL$H5? LHHA0&|$8t"LHH|$(HҥHHCXHuHChS;~HXL[]A\A]A^A_IHl$H$IGh@ E1GD H<$Ic1fDO~$HG HtD1D;HH9uHXHuHcLH4_H4_LIH7O =IMHGAIWhH@AFI9B XMXM9Hl$H$HT$ H5N 1E1HLHHHL$H= La(HA8V|$4H|$HD$H@f8 $Hx>tHlHɌL|$H5A IƋ=l1L%lIvH1 LHH=;L軈HH|$@IH|$@L1IvH1Y H|$@HL詍LLL蛔H|$@H莣HwM =ɎMLpH|$@I$=H$lH^"L$I=L<ԁHH)H5:M L1 HLHJH$@1\L{hAG H|$i1HLHH軋HChH@(LpHI _BIH6HBH)xHz(H8+_IIE1HPHt z0IFHH= _t":HD$HtHJHt 9H$@7 MgLyYHx1I{zmHK =&LIqL-: LIl]HK =IML<<HH葊LHHZHHD$ pH$I@ fHt$=H^HI/0dsyHz(IHRHt zHHxH _l^IH@@7 MBL:XLI/XIv1HA LHH蓊|$|$tLWH9 HHH$@7 XH$HHx5pH$2HtIL1Ҿ&1{lIExIMgE1MH5$J (HT$ 2=L9 IVL= G HH讈H58 H=I HmI|$I{o1LHmv- HLI}u"=IVL[HH@L1薊LHHhLHL$HD8 LHHA053L}0.00`LVEILVHLH腈IH@(Hx`nH$IHxHH(Hu HH1螉LIUHp1ILLHH|$u\H<$u.H%  ?Hx1mvmEH$@7 0HHxmH$LUIIHt.H@I|$Htxt!1vmHLHucI|$Hcp umIFHH= _t3HE1HD$E1E1 NvH$LHBHHH@H@]AHRHHHRHR]T$DSHtUHx uNHPr8C:u>1!f.Hx u)HPJ8:uHHHuH[1H[f1tfDG(uuHOHHɋWty tu̍@uff.fSHHHLJHH1HH)HHH{݅1 DŽ\HHuHC( _[fHwPff.HG@7tf8 uH@H1鞆fDAUATIUSIHHHHf?5taHPB7t%f8.uHxHGH@XHt@7 teRHHz=HHHL[LH]A\A]HGH@hHtH8HtQ@ATUSH`HF8HH1HAZHsHL$8GHH蜚H|$ HD$(EH{(HCHCxH|$ #uIF HPIFEHIcHlH_HAw1VIH4A =IHپk~H5H1IH5HH1UPL1MML1ڋLXZH(H[]A\A]A^A_fDHDLHT$HT$IAwHR`1VIHt@ =ƁIHپkH5/L1HD$H5H1HD$H5H1HD$HM1LD$ APHL$ MQPLHT$(1H? 
=9H MHپm-H5L1HD$gH5H1ISH5tH1H?HMMAVULPHT$(11kH Lf.H1TI@1TI@H4_LHT$HT$HOfDH4_LHT$cHT$H )fAWAVAUATIUSHcMHML>A8LHcIIwAtyCtIDPHHPLp I@0Ax 3% MtLYl$HL[]A\A]A^A_fHeHWPHXIHcLtPHrYl H0Xlu\HGPH%HcHc LtPHGHHt HxIH0H@tXl3L0H= L=~FI HPHWPHCIDPH]HxPHcLp(=A~HH= GL$HHPL%HIFL$HLMkH< =}HL$LI-=}HOw11LHLHe{H5< 1HIHC0AL$xCE1Ƀ9A# LC LLLIH)< =;}H[MLH]A\A]A^A_%DLH; L=}HIH@HH@HHH@(HH@P@H4_I(gIH5q* LLT$H $@LT$IH $luH ] ID 5W !M  Y u  f.AWAV1AUATIUSHHHwHMHD$(H.L}t HGP HcLMHHߋT MNH}8IHt UlHu0=C{tIHCHHS HHL@t#t HvHuH`HHL$ DHHuHL$HC0GHHH1MA AzLHT$LD$#*HT$H|$01HHǂHD$KLD$H|$0L<Ht$0HHT$HD$HL$PHHQ f=vHgpHHPuHL DLE8H5 L=vFIMfH;3HlH9(1ȐHHHT$SHT$(Ht$(==voHD$(HT$(HBHD$( HI@HH@(H(qMHHAWAVEAUATIUSH1IIHxHH|$DL$ JH{(Ll$`L$ H|$0HGHIcLMtjH|$MDL_Ht$HIoHK8HD4 M=SuFQHHC8HrHC8Hx[]A\A]A^A_H4_L$ IfAWAVAUATUSH(H_(H _EFH $AIAD$ HHHD$[fDHD$E D9L$ LHIcH<$LDD HAD9H[HH _H+Dt݋EPvu͋CD9A;G qHU LbHҺLDD9SIw8HtzHCXHwH9(ZD E>@ @I8AH[HHE8H _HE@9H([]A\A]A^A_DA uH HHB`HuHE(HcR HHHHHtrH4$pH} HEHAT$1Hl$LIfBuH<$AEHHL AT$I9UHl$KH4_HHv    f.ATUMcSHJH`N$ҋ$Mufu:EHAHJ|HH;HoHH`[]A\ÐHA@7tEJ4_HuiqH@1H>FH0 LHHHH|$ H;HD%HuDJ4_HHXAWAVAUATUSHx~vHn LcHAB It 3HM8HHC(N H0HAELyQHHD$HM8HD$HH@ELyE1ɉ $HNMXZtWH|$1 EH/ H|$LǹHt$L芅H|$0L-nJHx[]A\A]A^A_fDHA JHx[]A\A]A^A_fH JHǀ@HH HHx[]A\A]A^A_ú(ff.SHGHH1HH@HG(H[}Dff.fUHAWAVAUATISHH8HBHDVHHDžH@(EHu|DNEu @]uHF0H D@E I} HWB7t f: AE8IU 1HBHphCIE He[A\A]A^A_]Ã>t IHDžHHHHMt$PHHHMtrLMI$IGH9MMAt܃uIGH8Md@sLBHe[A\A]A^A_]õHHHHDH I} HHHG@7YHHHH`CCLcLM~LH)HMDH0H, H=mMHHHL9IHLEFAJ(H+ HHL輁Y MctH@p gI} HGHNM?J4_2I*f.1HeIE He[A\A]A^A_]fD胯HLHElmHHHLD,A|$,ME H@PuGMtBI@@7$<u3L裮t'Af.t f5LL*H@ uHIM0HLIE He[A\A]A^A_]H0LElI} H0LHGHNLMH@QH0H{) Mȋ=jkLLANHH1Ή>LH51HLH0H51HHL1LHQPL1sH~CLY^D9HC0x]I} DLiIHL@QH0H`( Mȋ=imLLHH1=LH51ILH0H51HHHML1QPL1rLuXLZhH@]I@@7LfA85L`IPHPDLILHHtH,H)IH $IHHILL'f.IFH9MM{At܃uL H!%t%%DHJHDщHL)If8@I|IFH0f.H=g.aHfLHFHf& =xg{HbHpLL;LHC0HpH& Jt_HpL |LML'HH(HfDHpLA;LHC0HpH% JЈHpL{LML@HDž1!fDH=of_IE XfHHH $J4_SLI*`AWAVAUATUSHHt$G HxpLcM~MI)@INc I]#HHHdH=$ b1D9DJDPHHMTjM9uH|$H[]A\A]A^A_-d@PfAWAVAAUATIUSIH8c  IcLHk#HLA\?bdILIHqPHH9^HL$Ld$ 1eVLIcHL$HN# E1ҾmL@HIP=dDT$I\H?=qdLHD$'ANHD$=Od2lLD$HL$IH=0d<~WLHxLLxDT$L HKPH" =cDփFHsPHLbLEcH=cH{HH`x=cHcLVHH;xH8[]A\A]A^A_@HHL$bfDDG A@99DD$q H/ I=>VF<H= 1HkCmH =VIHپZNIMNH =UMHqݷHT$0t$ H=MOFKL$ H H=,OlH'L H I=OH<IPBHHt H H M=Nh۰HH[]A\9AVAUATL% US}uBHcHH_)HL4_H3HH詨HsIH1HLHF =XNLIHPL9Iu[L]A\A]A^ff.1WTGHHHtgHHu\HHDEuztDR Eu@DJE~7HHuD1EuHH1HHH肼ff.AWAVAUATIUSH$Ht$@H$HT$PHL$XLD$`LL$hHDŽ$D$HHFPH$H$$`H$HjH@@8‰ÃD@$$$HD$@H@PHt#@DEu xEHHuြ$ HDŽ$Ƅ$H$H$Eƒ 9|pLd$pLl$(H$H$L$0H$H$H51L$LIHHD$( $HD$HD$0I] HC f$Ht =:@tE1HH٢l$H LJM$ jt$hHLL$hLD$`LHL[XZHD$$_ LXHL$qHg =J輬)H_LHJLD$0HXHً=JI<=LH'_H\$Iu HT$(HAH5: L1Iu HHH|$IH\$PIHH{1[7mH =JHIpHHD$Ht H =IHIhԫHË=I覇l=IHy<IMH<LHK^$HĘ[]A\A]A^A_ÐHD$@Ƅ$H@HH@(@P$$f.H0HD$8HHD$ HD$@@HHL$(H$HHD$G$D$ptH$H@8@D$p1HD$@@8D$LIE L% HD$H@HD$xH} HD$HD$@xY H|$蟟H|$xH$TH$HL Gt$LH H$D$L1H$HHD$0@|$HH_H Ht$H|$H$xL$H M=GHLéHL$H I=GGH$蜩1HHD$H_H$虡H_H$IH; =MGMHGEL& H H=&GFH$LHH$DHH$H_Ht$HH|$ĤH_Ht$LH|$L$H$LL LHUDH5~ 1IHHf =xFMHLpH  =[FMHkH$K)HL LHH=F<f9L H L=EpH$.H萃L$LQHH/=E<9HL$ HI=EFH$萧LHH$-CHV =hELMHHD$ [LHCH9\$0IHD$(1HL<H$H|$8HD$8H4Hj MyH HHLYH$H_Ht$H|$QL$H M=DHL蜦=DHL$GHe IH$u1HHD$H? 
LH脍HLIYA?H$4fH$H@H_Ht$HH|$H$菡L$H M=CHLڥHL$H I=CGH$賥1HHD$H_H$HH$H a IH$fH A H_Ht$H|$H$ˠL$H  M=CHLHL$H I=BGH$1HHD$@MqP\ʱH$L`@HH$D$LT$p9Lc|$LDrLd$pH$Ld$8Dt$HD)MIDL|$(Ll$(Ll$HD$0OMHH LHHH|$(H WJ_H$LH|$~D9t$H~F1H=H HHH|$(HVJ_H$LH|$聟IL;t$01HKM;D9t$H H Q J_LH|$H$ܞf.\$HD9tWH  J_LH|$LD$H$螞D9LD$_1HLD$QLD$L@H LHH|$(HUJ_H$LH|$4H$H@8D$LL?LHU'H= -=H{1HD$-mH =*@HIp"Ht$HL>L>H$1HtHA8@Pt$HLt$xt$xHLL$xLD$pHLXH LLe=H$1HH5yLHj=?MLHHuH =`?IHkH$P)H|L LHH=?<k2H=1H$E,mH =>ILp.H|L$LLHH*=><2HL$ HI=>FH$苠LHH$(MHپHHD$ VH|$LH;H$HD$(l~H$_lH}`I;H$HƄ$ HE8H= 1l&mHvL$0HL$0tH$HH= H8H5 =8H1腕lHD$0-HL$0[m HH{s$Hns1L HLPHQH$PH$H$uߊI} HiIU(f:.lHD$8HD$8HD$(H|$xL$0H@ H$H$HD$@1Hx\H|$HHD$xHH$P賦P蟦P*苦pwpkcff.AWAVAUATUSHhz LA]AE9HcIIIlH}ItNHcL|fA?t?1HI H HuHHLKHT$ LLk5IHtMHcLЈfA?t;1H Hg HHHLmKHT$ LL 5AEAEH96Hh[]A\A]A^A_HB(HHH8Lh`AWAVIAUATUSH~ Ht$ HT$(HL$CHD$ HH H- E1HD$f.IHcML(MtfA9H R HC M=R5GLL$KH$ =65ILF.H =5HIHLL$H|$HL3LL$LL H H=4nӖL H I=4H<'H|$HHO3IA9Mc=J|IIcLltRfA}tJH|$01H; JtH|$0|}H|$Ht$0=IHT$PH|$L2IJNtNfA<$tFH|$01]H JH|$0}H|$Ht$0HHT$PH|$Ls2H =3ILH覕H HL$I=3G臕HD$AV9IMLxLDLt$ H\$LLHHL-6HD$(H\$HHĘH[]A\A]A^A_H H M=2GH =2ILFʔH =2HIH譔Ht$HP0HHF(HHP8HX`AWAVAUATUSH(FPA$FSHLgHAD$7H|$IE4$1C fA CSMd$0EH{HL$HT$HLC[@Iv,HEf8 Lxv-  H{I:.LL#E1u-lHt$HIU4I$Hf>#H2H|$0MItL4H|$/IV=0L賫H|$HEL}H{-LLE1脫H|$HHd/H|$01HL]mLLLOmH([]A\A]A^A_fA vH{(HGf8t=HT$1 Ht$HN3=@0HUHH|$HDH|$/L1HlH([]A\A]A^A_@I$Hf>#uHT$H|$.f.H{(HGf8HT$1bDL@H5y HP 1ޭHg =y/LIHE1nHuH|$H=#f}%t HEhH}H0}NH|$ HH,H|$ S-H|$IifCP t_f}%=-LVHoH9袏HžHbHK=}-MIH< ILL1"jH8[]A\A]H{(HGf8#HT$ 1DHT$H|$ +@jlHdff.@AWAVAUATIUSH Ht$HT$0GP@tH[]A\A]A^A_fDuAEQD$g H|$pHD$pbH|$p I}bHD$LxAG7 HD$|$gH@hH0HHD$(A}  MH$*A}  A~D$fu D$fHD$HD$HX8$@:\$` D\$`EHl$(H5~ HVH$H6)L_ H H=g+pHbLC H4 Iً=C+H<IH$LHL)H}HD$@LH1OHt$H$H1)H ALl$PHD$H HD$hHLIsDt$`HHLt H|$HJ4_H|$(AfA<$T$`HD$@H$HL)AIA9Mc|$fCKHD$ HD$"HJ\f;t\H$1QH KtH$sH$H$>H$H$H[(HKJAOL$8H$9f8t\H$1HE KH$rH$H$H8>H$H4$H$'|$fH =)OMHپHHL$H I=(G؊AHD$AG9 HLxM_fA<$SH Hs I؋=(G耊H $HU I=d(FbIHH1 =C(HJ9@HG(HHP8x:PAEPE\H\$(J4_H[J4_HHD$HD$ @H $IؾG觉H Hy I='F膉LD$ HL$GHP =b'HD$8`H A H2 I=A'F?HL$8H I="'qHD$XHL$PAwHD$ 1H.LL$XH5j1HD$L3HT$8H5O1HD$XHL$X1QP1HT$0LL$(L$H$<0H|$(AXHi ={&AYu@Ll$PH$HLHHf>#H$L8)H$%HHD$AEP H@hH(uUI E1H@YtAEQu7D$`1҅H|$0HbH|$pafDLt$HT1IvHb=%IL9茇LILH5=g%MIH<AH^fDLD$ HL$GH =$%'H =%IHF H4$H$H#H =$@AHD$@fA<$vH H I؋=$G裆H $Hx I=$F腆HL$hHY I=h$HfH\$@IHD$hMHHLHL$HH I؋=$$<rDIHEkHD$H5q =#H@hH1蘀lDHD$@L% E<H|$HtDH|$(:vHt5LL$@MHPHL$HI=}#<jfDHD$@ZfD=R#HD$('H$ "AEZ <+H|$6H$H7H$t"|$gI="HhHpuLD$Ht H="q="H`lHl="IMH<EH1LH$I=k"HK`lH$=Y"IH<IH H]<%:*uDIE(HHP8Lp`I}(HGf8#H$1~f.H=Y @HH|$(HH5| HD$HR~Ls H$ H=s!pqH$HH fHD$xH HD$px:GPy@8oD$gDHHLD$H5l H= 1}lH$H5f.LH,Ht$0LfDHT$H$69HT$p@1L-XeDff.fAWAVAUATMUSHDHHHt$(HT$ H|$ H11rIt#ID$PHtuxutHHuHD$ LH@HHHt$(HHuHĘ[]A\A]A^A_EtH5T H11IuLMudfAGu'IF=L3HP̙IfMM(AuIGL@hMfA8"IP=LE1ɾ.IDIFf8uH|$01H IwH|$0gHt$0Hl3Ht$PL11[#ISIFf8 AAG L% L- 1H|$01lH I0H|$0#gH|$PHH|$01HD$5IG8LcH H|$0L\$JtfH|$PHVHL$Hz I=GHD$H[ =mMHHeH> =PMHFHHHH|$01IL\$IG8H|$0H J؈uIvAu^PDEtDA념H(Ar8H4$H|$eD$$HD$(HML9h xLt$H|$HH $DgPLiPIDD@D$L;i A}DD|$ D$MEI1f.@u{uwIML9x CA?9IAu_PDEoP@D͋L$ T$LWP|$utEyIGI|$ Hpo!H-DgPAAĉD$|f.DPD$AAEAljD$ Lq@cf.H4$H|$1cRK(IH|$HG D$ H4$H|$YEDH$8L|$1HHLNc{D$$u1HL7ctK(D$$]fDt\EtWIFHp IEHxJIFHpIEHx -IFHpIEHxHD$HH$HT$@1HpPHD$HxP\D$$NHD$(D@EA~Hp 1ҍOH~!1D9t A\ 9HH9uHD9uA}H4$H|$1b2HAVAUATIUSIHfDC!x`+}H0 _qI$H@H _JDPIcL$HAl AD$EH9C 6AL$HB H@8@\f.HH0[L]A\A]A^@HC8LL qIHHX HC8C ~H0ǃHH P1H@@HC8DŽXHC]HHC ^H9u*f.1LLNps HIąHZ 1Lc\t`+{H0H _pI$H@H _JDPQC tHcЉH0HLtH,I)HuLHpL9IuHHh3HsLTpHsHHpIċI`+!{`+ {`+z`+z`+zf.HHtWH _Ht2HWPHt. 
_HJPHHtHHH _HpPuHúH,gzH ,Szff.HtJ?tNHGHH@(HWPHu4HHt!1:uHJA:tHHAHu1DHHuf.AUATUSHH6QH$HdHHHH9IIHL[]A\A]HHSPHtuzu HHuHHH[]A\A]HH LMH(H8H@8AEW@?@HxHHHH9II$DS8HHH[]A\A]slHHIIHt HjIL9L9M9HHHL[]A\A] mf.HH[]A\A]LLtI@PAEW6HnHHHH9IAERAEX%=1;@HHH@(HH@`Mt Ml$PL9tNMtIEPH9tCIMu+vHCHLh(S8SMHHkIEP3AWAVAUATAUS1HHHH|$t HHH!E1A _AIHHMIE1AP t Hs>MHtHmH[ HtOHsHt>tL9L9uAHsLjIHHDhtHMHHDEuyHD$IFPHĨL[]A\A]A^A_DE1MIhM@IIFPH= _uK@H|$N[D$ t$IƀA _LZhL|$HsH|$@[D$@uHsH|$p[D$p@u `,uHH _HH#AVAUIATUISHHHD$_H H1}fDHT$@L1HD$@H=e HlHT$Lt$@HJ29qHHHrHT HLre_HHH|$ 1}HsH|$ HCxaAMQfcLt$@x rLHHD$@ADH|$1ҾHD$肤HT$HD$JI1fDL$1!LHI|$@AD$8DŽ$H$.~^1 fDI|$@HHHl~=6 11H:HU=! ID1Hc"HIވCHA9D$8LLHŋ$$~9DI|KIހK$$H9Ht$HLkHHv- IQ =iHH¾#qkH9HHL`PHÀHfHĐH[]A\A]A^=* 111G= HU1H$GH$ff.AWAVAUATUSHH($`H|$ H|$pHt$HHT$0LD$LL$8D$/荏H\HLH$HH$HD$HD$HD$@H} HD$Hf:H$1HE H$Hp JLt$ H$LH$LH$1HD$@THE H$HpIH$LH$LoH$1HD$ HE H$HpvIH$LH$L&HD$HE 1HH@HL`(I|$ &HHH$HLH$H|$`LeA<$ AT$8L|$8I7H= Lt$H|$`IIH$ 1IIH-L _HH= _I HqyLHFH螡It$XH|$/t~$L=T ~\LA DH9 J8J|BH ='LIHbIDI9$LkdLfH HRPH _uH$H譙HD$Ht$0H$ H$MH$pL$hHRH$h _L|$LR FH> =PIPaI7H$HH$H-HsH!HH|$`HHdcH} H|$`_HIHT$@H4$H|$/HE HP:u HP:HT$Ht$H<$@IHEH|$pP80H~I5| H|$p^HG =YLIHQ`HIkuHt$0LHR\f1H|$`IL|$1ILlH =LImL|$_LD$H $kHu =I_LD$H $mHR =I_H= =MIL<H==oLᾑI'xAL$INj=P3;lH =>IML<H|$`HH|$`LL<$LD$F=IWL^H|$`HLH|$`OH=HwHHjHs=Lᾐ`wHHEHH|$ H0HE H$HH@HHx(4HVHHH|$p趄H([]A\A]A^A_@HE1f.Hx uHP:uHI4VHuIE1ILl$XHDŽ$ILLt$HfDH$L1zH$H= HL}lH$H$HQ19rHLHTMqHHj~UHLII|$ ID$8H$1vIt$H$IEx.A f7ID$L$x `LHH$ HEx8oHD$Hf:]HD$8H0HH|$ HL$8HT$HHHHH} fDH|$ HfH$1qLt$LEH$Ht$0H|$`ILLϧ IFH =[If8Lt$8H|$`HI6[IHL$HHt$0LHW1ҾHHD$P<H$HD$PQLLl$X1I}H5 H= H'mH|$H1HXlH$HIkHHLI-HHIH|$0L`PdK=HFL|$11HI71HL1IIHD$HH=dH@ HpIlH5 =fIML1UlH|$`H H=% L}lIH I="F ZINHI$D$7PLL$HLD$HT$@Ht$XH|$pY^fH@8fvfTffAWAVAUATUSHHFpT$ { CH= dHL$H` a HAh@ 9AAVIHD$I~@KHD$HH@HD$H1P H@8HuHH|$fHHD$X E1HIuE1t$8jAjLt$`LHT$8L\XH H=~ Md$(H$H H$H@LH$HLL`SHL$ L$L$SEXZH$GHD$xf8D$n HD$`H D$oݚ L|$Ld$X\$hIGhp ~}1L$5˥ J, H HIc| HI~@蓻MghH@GHl Lm =wI$@rOI$@IGhSH;P |MXMkHD$HLd$X\$hH@HHD$Hx8Hx@HD$x HD$HkH$yHt$H$M6H$AHg5 H$蒺HL$H$H@tEt H|$pdHHD$XH|$H-= HD$1ҋH H@8HuLt$L$hAN If?8I@f8'A|$hK8OGH =MHL H F=MH| =HIHME9~ HKf?K@If8wkNK8vL H$GH =MLHHߘ H|$HLHHHD$HH@Ht f>#$HD$HH@Ht$ H$^ID$ 迨H$-H$1پIFH H$H03H$ID$Hxx$mIL$Hܨ I=+k)LƏ H|M|$I;$mH =IMHپ<;H$HID$H$ID$LHt$PI}It$HIFA~ HpHD$@lP@ }XMv8M:HHt$XHskqL|$Ht$@IHxUI1HHC(4AOH HC0HS8H5 HHHhHHHPpHH9uH- rP> WHLRe1LD$0H$II1Hھ1 LuG0 SWp ?W0 +W0 W0 W0 Vp V) V0m Vff.AUATIUSHH2Hs(1LHLk0MIm}H{8HuHUHt HzHH} HEHDžH)1HEAEtQHHsHATjE1jE1LHPH tH[]A\A]DHH[]A\A]龕UUyUfAWAVIAUATUSHG9H|$8tQH虗{L _dHlH= _IhHu? 
U@ _&H.IMtI _OAM.AmI]tBL-HIxxILxAIuL9ƒ HD$8T$o@8HD$8H@HHsL9HD$8Hp HNA7$<lH=HkH5ݣ H1CcH\$8HHs HHIWH;BIEHL$XHIhH|$XHt$H|$H8&3D$mE5LcA$HHEHt f8ID$HMt$PLx(I Mt,AtA/MMuIGH@HHEfE1AGfIFHsIV:f8]HUIFHHHJUIVHHH˜NH5 H襘HHEiR@A$=HLL*fIFLx1HAD$8LHt*5H|$(t8HLHD$(IWHBԺ l8 X8AUATUSHHHxHGH8Ht$O#tH{Ht-Hx[]A\A]Dmt(H{HH%Hx[]A\A]DH"L`H|$1AD$8LH|$L$HA<$"Ht$HHt$ H}wHCHpHtf>#t2H=Ut 1vH=׆ IžWHC~IEHCHpHT$8HHD$0f8.HT$8f:.HJH9HHCHPID$HPHHHHPHCH9HRHPHH{H8H肰HCHHx(LIMu'IuHt 1HMd$ MmMtIL$HuMMd$ tftfDH= WLc+HID$HHHH@HPHCHpgfH|$1HCHt$HH0Ht$HHt$ H}uHT$0H5 1DHD$0HxmHL$0=AIZHQ8(HSHD$0HHBCfDHuMHHF=HRH@HJL@H_ 'HSHHrHoP<4fDAWAVAUATUSHHhH|$H|$PHT$ Ht$L$0DD$DL$UL{APf f AG7|$0„u |$j|$7HD$LMD$LD$4CƒH{Do[AHHt$H9qAE1L$LchIl$ !$͘DLt$`LsHL$=pE1HM.起SHD$H C8@*Hk8EC8@H{c!HS`D$0Ht2t$0H1A $DL$,HƋL$@HA[A\IL$LMxLLmhHL$=E1HM.SHD$H!uIH{Ht@G^t:HC`1H =p HtHt$HKH|$PHHD$@f.{E fDHH H|$PvHh[]A\A]A^A_@CFtHK`HtytuuHCHt@^tHL$=E1HM.SHD$HuBH{Ht9G^t3C8Au-HC`1HtHt$HJH|$PHHD$@@CFHD$H1HxdlHL$HH I=q$H|$HH=HD$@kLD$@HUI=H<H|$PHHD$@dHD$HHHT1Hp?Ht$HH|$PHHD$@6HHaD$EC8u'-HC@\EHL$=E1HM.aHL$ =E1HM.HD$H>C8H!{H|$H ]H|$PHHC8@HL$=E1MH.ճHL$ =rE1MH.HD$H貳HT$H1HrH>{HD$(pE1EtC:u EDE1Ht$@HYCF HC`1HtjjE1Ht$XH|$8E1H认H|$`HHD$P|HAZA[HC:S8u5Et L$V@MDA='iE HL$=-E1HM.rHS`HD$HD$0Ht2t$0HH{A$DL$,1ҋL$@H>fHL$=E1MH.CEHD$HCFOStHK`Ht y уNH{HAG^7>g HG^C8A HHt(H1ҾHxH@ DHuHE j HC`1HtHt$HEH|$PHHD$@|Lt$`L裼HL$=E1HM.SHD$HC8@ZHs8Ht$(HL$HHL‹CVHt$(HHD$(EC8@<E1ۄwH$L\$8ѻL\$8M{H$LsffDHD$@^HE1A$L$@Ht$1H|$ H|$`HHD$PA]A^NDL$L@L踻LHHD$@EHD$HHx{HD$@HC`Ht$HLH$E1tE1E1+DH{H;|$Dw^AA DH\$H4=HHD$@AG7I"T$0H|$PH艟LZi HKz H=ZGHD$@SHHD$@ Hx1Hxl=*Hy IHپqH|$@IHt$PH$豸H=y I-HHҹ11LHӿIƋD$w D$H1$DL$,DD$(L$@H|$ LHD$PHY^H/H$B3HKh H$HDŽ$H$L$H$8YH$H|$P&H|$P茹|$HD$@ M=kLD$@HMI=۹L<&Ht$=@HD$HD$ Ht=H#HD$ HD$LxA<@H@H=]x HD$@讦lH$jjAH|$P_1L$[HqH9IĿHt$PkLI0LHL買H f Hf LLyH f Hf LLH$Hf LLEAYAZH|$H. HHD$HLL{H$H=Dw LL-KH(zH5ki =UHAUAUIM1lLHH|$`L,H_AXH)@H$1a{HC8Hp ppp0Ht$xH$_1HD$xHH0̷H$LHD$HHT$HDHKHy:Hi89Zf.HSHz:Hr8Ht$(-DH$1虋HCHDŽ$H8H5c HIH$H?trH$HteHPH$L L$HL$E1ɋ=.IPH$H|$PHH$IHCH$HPLaH{`CFDLE1轛kH56u 1HHD$@4HHD$@'f.L- c H$1zMH+E1IuHGu AEuMeMm MuMHD$HH$LHPL$HL$HqHHt =H;H"H=;t H蓵1H(1m=_I?kHI=JIMH<菨H|$PHHD$@1E1EtfC8@u]E1DuS{H{HC`D$0HtD$0H1A$DL$,L$@Ht$XY^I1ҐHD$(t E<uMt LL,{zH|$PLeHHe@C8%u E*EVEHHCHH@`f.HCHH8@Ht$HL=a L5a HF@7BHC`E1Hl$8H$1IcHC`H|0H5Y` HI^HGr H$LL/L$J_H|$PHt$HLLT$(HC`HH5_ HIHq H$LLH$J_H|$PHt$HHJ_Ht$HH|$PLHt$PLMHvq =LIGI}Ht$PH LT$(HDq H=SGIMKL,` Hq H=,FHD$@%Hp =ILHHD$@HS`IAD$I9KHl$8Ht$HH|$PL{1HbHP HT$@H5p 1/Hp =ILHHD$@Ht$PH,H|$P1HIJHt$HH|$PHHD$@CH|$HiHHD$@\Ht$@H|$PHگHHt P8CH{HKHA8uEXy:N@A8dHL$=E1MH.HHD$H|H|$HI[|HxHD$HG7VHCH|$PHT{H{uHHtH@Ht @]b6H(IH$1,IHD$@HCL=9H|$PHHD$@D$L|$=E1MH.LC8HD$HHx跻H|$HIH|$PLH*=lE1MLH.订HHD$@q IL1TjjE1Ht$XHAE1H|$`HA^A_H|$HzCFHD$Hf1fHD$x[HE1A$#H|$H6Hx1HD$@֛lHL$@H*m I=yqwH|$@HJ=\HD$@:kLD$@HBI==H<舡H|$PHHD$@H@Ht$HH|$PHHHD$(vt D$gHL$=ƭE1MH. 
HHD$Hy{HD$@H{uoHHtcHRHtZB]tTHx1蹚lHt$@HLH|$HzHx1HD$@菚lHt$@HLf.+H(HR 1$HT$@1HrHQ+Ht$@HL衫z@L-Y H$12M E1fDIuHtH;>uMeMm MuM HD$HH$LHPH$Ht$HH|$PH(xSHD$HH\$ HSB7)PjLL$@Ht$E1H|$0E1A_ZfDt$AAT HD$(H|$HE1fHCH@8{H|$(wH|$HHtwH|$HI{H5h?1H)IHCH@8Z H@`1L\H5/?H1I)H?=5LIH- IHCH@8 Lt$`LH|$H.wH|$(HD$8wHT$8HL菩{HCHj @8T1Ht$(H|$HL\LHHD$@LLLHD$@wHuH=1(HHLLϩL<Hh L=@qH; LD$@H=I=H<iH|$PH輾Ld$@HL$E1ɋ=.IT$M+HL$ IT$E1ɋ=M.HD$@LD$@HPH=9I H|$PHHD$@5{ LjPE1ɋL$@Ht$XE1H|$8H+xfDHL$==E1MH.肜C8HD$H#H;H|$PHHt$HHB=˝D$E1DHtSHD$H)H|$ !=H 11HHHD$@茭HE1A$H‹L$@@HCHD$LDl$4M(IWHL$E1ɋ=M.L\$8WIHD$(H|$HL\$8xAWAS1AUjAE111LHD$`HH d3HCH@D[D$cFfD<H$HHD$HHxHD$(܋{HD$@HC`Ht$HH$%H$jjAH|$P_1L$赡[HH9IſHt$PLIH$HLTH T HvT H$LH gT HXT H$LH$H9T H$L)XZHD$(HT$HH$LH{KH(IH1HHD$@Dv=HfIHD$H1HxlLD$HHgd H=qHD$@L1HD$8谥=HL1HlH$HHD$@#H$֤=XHD$@6kLD$@HL$8IH9=4<肘LHHD$@ҹH|$HT$8pT$8HD$HHx15lHt$HH|$PH蓣H$H5c 1#{ HHD$@(Ht$PHHD$@VHT$HH|$PHD$@HrHkHt$HH|$PHHD$@$@D$1HD$(H|$HAE1DHD$0HP`Ht2t$0Hx2C8HS`EaH]L$H|$PL~HHH|$HopH|$(HD$8`pHT$8HL@Ht$HLLpqIHCH@8u=H5 =WL1lHuH1HD$@!HHLߡ{eHCHHQH@HD@]70&@LM%.HD$0HJ`Ht1t$0HzH|$HuHPHD$@ fHsHHD$@LD$HHL$E1ɋ=F.IP萕Hx1HD$HplHL$HIHQH$1pvHH5N HHuH$HrH$H|$PжHH55H L1HD$@H$H+H|$HIHt$@={HIM1.kH|$PHHD$@ H$H|$PJ2H|$@H=` olH$1[lHL$HH4I=9H|$PHHD$@蚵HC`D$0HtD$0Ld$(HT$H1It$H|$PHLHD$@SHH{A$DL$,LL$@Ht$XAXIAYFL\$8L\$8HHD$HlHD$(H|$HE1E1FH|$HHHD$HkHHD$HkH|$HH1HD$(H|$H%L\$8H$1)tLHL$E1H{^ =.{IHC8uHCHH8Hp ppp0Ht$xH$iHD$xH0H$H|$PHHH{A$DL$,H‹L$@Ht$XA1H=2ClI2t$0=uD$,ATjE1E1L$@Ht$XHH|$8脃HD$P^_H|$PHP{HuD$0뗋=~HHD$@HI(XHHPHt B]@8)DHCHH@`LHIHHI(jE1jAI}1lIHD$@%8BfDHH|$(IH|$pLd$HsH$fLt$(IF@7 H$1qH$b$HL$$$$$H$gHD$hH0H$H$HEH/H1Lx !H|$pHLLt$(lL1œjjE1Ht$HE1HH$LpH$H$IcL1yH5:0L1HjjHHt$XE1E1H$HH50H L1H-\ gHL$(IM=H1kH|$pH蚰H|$pPHID$@7BH=Z L?LX.HQZ H=qHW/=MIH<ΎH$HH$њHD$@H|$(MHct$0HxIIylH$LH֙EL5-tT$0H|$pLIHH5.HP 1<H5.L1H(H.=ÚIHHLt$(HO-H|$pILL1襚H5f.L1HjjHE1E1HH$LLt$8D1HH*f.HE1jA1HAWAVIAUATIUSHh=:HT$D$?A?H|$L{tpHD$Dp8EtbI\$(HtCH _tOHHD$H@L9HDH _t'H L9ytVHRHHuH% H|$tu1Hh[]A\A]A^A_@L{tH|$n[HL$H$fH _tHH@L9tI9\$0u H[HHuHD$xH$HD$Lp(IF@7LHD$pHH@(HD$0AGHD$@HD$1Hx萶Hh HDŽ$H5P H1H$HO1HD$P31HD$'Hl$p1HD$`H}~lH P =]HIpUAHu A,H$HHD$H=kHL$H$=H HL$HH$I=Iپ</H$HHD$8uHHt @IH|$6HD$(A_8"AfDJ4_LJ4_LHK8OGH)O =;H;HO =&IHFHN = IHGL= HN H=q=ҏHkHL$H#=Hs Hl#=IIHپ<H$H3DIA9G8HD$@=_Ht|LD$PHM HqI=;HkHL$H"=H H"=IIHپ<LH$H蜣=ގHL$`H" H$HpHL$Ht"=d H$HDH=mM 1ƋH$HfHO< H$HH$BAW81Lt?H$HHH$蔍H$H脍HL$HH!I=I<BH$H蒢HD$L-; DX8EH\$H@I8MGH|L =Lr; HcL H=rFpHIL =[MHHSIʼnH9C8H\$H$LHK =qIHH$ A,H$ H=K 蜊HD$XAHDŽ$X H: H-: HD$HD$DP8EE1L$MQHD$0H@@7UHT$L\$  H8K =JMHپGBLD$ HK H=%F#L\$ IJ_H$LLJ_H$LLJ_H$HL J4_LHJ =IHHHL$HqJ I=G~HWJ =iIHپHHD$\H\$HAGI9C8~oK@OGH J =McL8 HI H=FH|$(HL8 VIfDL$LHT$H$HH`HD$Hp@Ht f>#EO8E~nL$fH4_LIclHDI H=SGMIHL$H0Ht f>#HA9G8IF@7 L$HL$PHHH =H51H.HLH=ZH$H]I~HAWG7t  |$? H$H$O1H]LHHH$H$H)A|$?L%L-H=a4vlAUH5: IAT=؈MH$1kH$HiH21LL-FHT$ 謈L$H=ILL$ulHT$ H5: I=^H1RAUAUATLL$8 kH0HH$A<HuH$H$~HHD$@HtSLD$PHF H=qH$H肅HL$XHfF I=hHD$X=kHL$XHWI=I<zH$Ha|$?H5 =QH1 kH$HL'IVAGB7 |$?H$cHL$HHI=ֆI<!zH$HqHD$H@0Ht f8# HL$`H]=MH$H-H$u#IGHHtH@Ht @]@LXI~HLH$HHɄAG I[HAO81jAAL>H$H~XZA,| |$? f.H$H${1HYA6I[JHAO8AjH$E11覿H$H_AXfDHD$(H|$0L$@$J_HIH|$0LLl$xHD$ bHt$xH|$0ILD$ HdC L=nHD$hL2 HL$hnH4C =IHC =qMHgiLB2 HL$hkHB =CH$>$IL-B L$H$HB Mȋ=hL\$hHB =ILhL 1 LD$ HHB =<wL\$hHD$ FDH$H@^HH @I HD$L\$ HHPHBHH@(H(HHHBPHu 8Ht3HЃ8HuH@Hx`tHtzuH@hHH@HJDHD$ fH舀HD$XfDHAL1L-UHT$軂L$H=ILL$olHT$H54 I=mH1RAUAUATLL$(kH HH$ HT$pH5@ 1}H5& =HIغ1kH$HLAI[HAO8AjA1L"H$HbY^8cHUHH@HRHPADIGx[H$L H$}DA,IGH@HHD$@t f8#H$HH@HHD$@HL$y,HAHh]Htf}#tHHhIGHpHtf>#tH$HHpH$HXAG HDŽ$H}HcmlHU=qHIHiHHT$PH$L 3fDH$HD$H= ? 
L$@8pD$hHcRmlHL$xH> I=pL$IL$Eo8E L賫HHWHKHx1IllH:> =LIqHIOL R- HK> I=ZL<rHD$@H|$@tH|$@H$|HD$@AG|;IGHL$HQHtH@Ht @]^HD$h88HD$1H@HHsHh HDŽ$fDH$1ɺ$D$?DID$(Hu0fHHRHt :\H@HHYH= _uaHD$@IGHx(I1轢IWH@H=}LBHH H< HDŽ$HfLOHHHD$HHx{#HD$(LHHD$xHx1IHD$Lx8蘆H$HH{AI~QIHHD$xSH|$H1HH}H|HuH1/H$HH|{H$H${1H Q|$?!HD賈LHH$HH%{A,e|$?ZDHH$q=2|HL@HJH: #H: =|IHhH$HyHD$H@LpH\$H$HHs0wzfH$H`zA,HIGH1H@HpHfIG1HH H@ HD$@IGI~HHH@H@]GHHD$xnH}1+hlH$HHyHt$P^IdHD$0f8. H$HFHy1>Hh HDŽ$HL$9H8H$1NLl$H8 H$IEH0RH$ MmH$HwIEvH|$03LHHHDŽ$u1HD$8H|$跥HH$H$ JHHKHELSHP fH$ME1H1HEH$1VyH$LQHL$HHAHxtLH1HНHA8TH$HHw H$L|$II1臝HH18rCKHHIпHnIT$ Hu1'f(Rff.HAAj1臲HfHEAjlHHE1AjLHSH$ E1H $ AH1jZH$ Y[ff.HE1jA1HSHX$ E1H N$ AH1j蹱ZH1$ Y[ff.AWAVIAUATMUSHHxH>HL$HNPLD$ HD$XLp(HHH2 E11HЃ9uHQDB8DA@HHHu҃D$WueAu:IHT$XHHL$DD$*HD$XIWDD$HL$HBHC(@E1"D$WE1tE1@MD$WAD$t,IHt AEP@uRrvuNfHtHQ8HtRJvu'1LDD$.)DD$D$W@ZAE1IGHH=H@(@P/LHAGƒM_H{ A{[MtIw8LH{ \HC Hx[]A\A]A^A_f1@`AOEtAƄBETL|$LH%MtH{ Iw8L`Hs 1t@HC `HPHRHOB7E=?tHmf.EE|$kK8LH茍MtHs =shmIw8LH@ AGLc u`IGx[yVA?tP=sL"mHHD$`AO8IjAA1H٭H{HHD$h8AYAZLc 5Nb=jA?IGHHtH@(Ht @P@LIGHH@(HHt xHD$H x  HD$@ZD$HE1jAULt$8LL$(H|H H{ ]A?SM_ICHC(BD$WD$E11\@E1E1H5$ =rL1kH|$HHD$(HD$@P MAEP 1LHL$GHD$XHC 1H[HpHD$HC 1HL$HpHHS LD$H<D5vqHrHT$XgkHHtBwDqfHHn@HD$GHD$HC  HD$H HT$ HH01&E=pLIejHHD$X(HT$Hp1HD$XHD$HW/ LD$H=pqHHD$XHD$@P tMt AEP HT$XMGX1MHپ1yLH|$`oHD$Ht@ZD$ <t3H5" =pLLD$1kH|$`HHD$X豄H|$ׂH|$`HHD$X蕄H|$`KoH|$`In=oL@iHHD$XHT$Hp1IHD$XH4. =oMHq~IHD$XHD$HtD@P t>Mt9AEP t21L EH- =;oMHg3HD$XfD=oHkHL$XHI=nM<GbH|$`HHD$X蕃H|$`H؃Hm@u#=nL=hHT$(HHHD$XHt$`H蛃fD@MHt$WHDD$(HL$萸D$HL$DD$(A?A?HAO8HjLAA1胨H{HHD$hA[AGH{ ]fIHkHBPZH~DBPE5f.H|$HD$@R1L vH=?, Lc aj=[mLIHD$(fHHD$XIv1HHD$XIV=!mLI9HHHD$X蹁 [tHD$IWXH1H5[ =lL1kHfDD$WhE1fqIGHLh(MtI_A?TAEZA} HD$XDD$HL$u IU(HRHS(AUPփ@ulIHt`RzvUtP@AUQuTuHP fv H1kMHC H|$XIw8Li|@@IAUQQMtIw8HL5HD$XH7I(H9L9fDHy80f=jLbdL _HD$X`H_HIƅ~#@ t(fHcA9D 9|HT$Ht$XHyI>LFH|$IWX1@sDaEH5 =.jL1kHHHgHD$(HD$1*>=iLIicHHD$X,HT$Hp1HD$XHD$H[( LD$H=iqH|$HHD$XHD$XiHxHD$fH|$HD$0H|$0HD$8tHT$8HHgH|$芬H|$0HD$8{HT$8HHgAW8Ll$8Ll$0l$LLd$@H@ELL fDHHG`Hu1fO~$HG HtD1D;HH9uHXHuHcH|$H4_'H4_ILLHHgHA9G8fLl$8Ld$@l$L=]t@HT$Ht$0HHt$011hHC LH蝁HC HD$XHD$H@@7tHxuH|$H|$0HD$HT$HHjfv1LHT$v=HT$=gHgIH)& HHD$XHD$H1~^ H5 =;gL1kHD$(H$YAIHT$XHHL$(DD$蠚HD$XIWDD$HL$(HBHC(bD$WD$E11E1ATUISHHH$HL$ LD$(HD$0tRH|$@1:H|$@LL$xHt$`HHHD$(AA 1P蝠Hİ[]A\ÐHdH|$eH|$@1r:H|$@L~Ht$@HzHt$PH|$zHt$`fAWAVAUATIUSIHHD$(t99*tUu7LHĨ[]A\A]A^A_fDHFHHP(BZH|$@HƘHt$@HT$HdHt$ HT$@H"I܈HITH:rH˜>bH|$0HƘYHt$0HT$4Ht$HT$0HHI9ITHH0>H|$@HƘI@H|$01T$4}H|$01lIvXr10HD$(L$$H|$HɸHD7H|$ -H|$0#H|$@H$x t H|$1SH_kHIqH$Hx((E11<10ATUISH_H@HH=bHHHrHHH1ҨA$  H U_HH$SHHH1fA$vI$Hen f H9f.DHpEH@[]A\@H1HDEH@[]A\úh@AWAVAUATUSHH|$Hf.HY HHD$0D$G@HD$0H@ HHD$0[HD$0HuփH HHHD$8tD$FfHD$pH\$8D$DD$1D$ED$E1)D$PD$CD$BE1)D$`E1E1D$AD$@D$/D$.D$-D$,D$+D$*D$)D$(D$&D$%D$$D$#D$"D$!D$ D$D$D$D$D$D$D$D$D$D$D$D$D$D$D$D$'HCH@HH@(HQxSDHZAAAA PP@A @A @A @ փ@t$'pWA@@t$pTAADD$AAADD$AAADD$AAADD$AAADD$D@YAADD$AAADD$AAADD$AAADD$AAADD$AAADD$D@[EAADD$AAAADD$AAADT$DD$ @@@|$"AAAxU@t$$pRDD$!T$)AAADD$#AAADD$%D@XAADD$&AA@ADD$(T$*@t$,@|$+L$-H^T$.L$AT$/P_щ@t$@L$BT$CD:\$v D\$D$ED:L$v DL$D$DS uH|$0H9AH[HiD$PAAAAAL$W|$QADL$'?AD D ᯃAD AD$D$‰L$T ЈD$WD$4D$‰D ЉD$ ЉD$ៃDD ЉD$D$Y ЈD$YD$AD$‰L$[D DD$S D$oAA,D$AD$AD$D$‰ t$ ЈD$[D$DD ȃ DL$UD$!A ЉD$"ȃ ЈD$SD$#T$$t$RADD ЈD$TD$%DD$&D$X ЈD$XD$(D ЉD$) ЈD$PD$*‰ T$+DȃD T$,D$Ut$/࿃ T$-D$R|$A࿃ T$.D$QD$^ D$_ t$@ T$^T$B  T$C Ѐ|$ED$_t ȃ D$D$S|$DtD$Z D$D$ZHD$8H|$PH@HPXH@HHpHD$8HHHD$8D$FH\$HD$G|$GcH HD$0&Hl$0HHEH H HSH  a&HHHPHUHHUHPHUHm HPHuHĈ[]A\A]A^A_D$PAAAAAL$W|$QADL$'?AD D ᯃAD AD$D$‰L$T ЈD$WD$4D$‰D ЉD$ ЉD$ៃDD ЉD$D$Y ЈD$YD$AD$‰L$[D DD$S D$oAAD$D$AD$AD$D$‰ t$ ЈD$[D$DD ȃ 
DL$UD$!A ЉD$"ȃ ЈD$SD$#T$$t$RADD ЈD$TD$%DD$&D$X ЈD$XD$(D ЉD$)D ЈD$PD$*‰ T$+DȃD T$,D$Ut$.࿃ Љ|$AD$RD$- T$/D$QD$^ t$@ D$_ T$^T$B  D$C Ѐ|$ED$_t ȃ D$D$S|$DtD$Z D$D$ZLl$8H|$PIEHPXH@HHpC Lt$0fDMmMAE L9 AtIEH- H@HHH@(tH9Eu uDH;EtjHm HuH{I}HIvHH)IEH@HHx(H=* HH AE Dd$FQfH{I}IM~HEI9tIHMLAH0H HP1-ff.AWAVAUATUSHHxH|$(A%kHHD$ 5EHIHD$P H$11HHHPH0HHt H9~D$$HH@ H9HLHù<$Z HHIHD$ IH HI1LH1LIHHtQH 1HHtHRHtHrHXIWpHp@p8p0p(p MGILLhH0M MuHxI,Hl$H\$0LH\$I9Q19t=rHHHTI9qHHBuHn7lHL$PHuH|$P1ҾHD$HL$PHD$Q^kIHHD$ >3EILHHD$H HD$HHD$8$D$D$IH\$@"HHHD؃?LLKEH9sAOHD$ 1L?IG(BAO HD$ @6E8HD$ 1HHE>E<$H}PlLt$HHD$PML|$8MH|$P1Ҿ6Ht$PFPHHDVL8LpHD$(>HD$(H+HL$ =EL#AkH?HI=f.Q19tkrHHTqL*HBMd$ MI4$HHtNPID$E1HPAAH E1uwHL$PMl$HuH|$P1ҾH$!HL$PH$QoI_tM;e sb0M$VHP L` H@(L_I HHu H__L跢H H7<$IE(HEPf.HWLl$(L(Lc HML1HCL`Lp HHx@LL#W?kUI@ ЃAD$H@St AL$9HxPyAL$Ad$LHHpPY6Mt%HH HHt@Z<0L74LC=BE1H.IP5LHlAL$;AL$:H@Q L-BAL$:IT$hHCHHhHIHJH@hIT$hH@HBHfHx[]A\A]A^A_ÐLIYHl$H\$0LnLfH= Hu6/lH5 H= HIlH|$1HGlP_H>1H"=kHLIHD$8CE=L|$8HD$ 11HIG(:AwH¾#y:kH9Hŀ|$2MMHP_M:0H/H HEP}fHHD$[HD$HfDHD$ HHD$LEEH|$(tKHD$8$HD$HvFH<@ H|$PA 1sH|$P蹒EHL$ =6=H¾#A9kH9HD$H}H\0t |$M@H@I9D$E(t4IU  HL龸1AfT$LL1H([]A\A]A^A_fHCH8~tf?Nd HHI!L9cHD$tHT$HuLP1pHLcHf.E11fH|$I_fDHT$HuL鿐1p.MDP1DeoDL1(eIuL010pH([]A\A]A^A_ÐH(H AVAUATUSHHH t>hH H__BLNK__HoHx@HH{@ T"HHt)HEHtP t(H@HuHmHuH[]A\A]A^@HE1H@HHx(-H|$H LH|$2H5 HT$HNHH!H)H@HJHHJHR HuH}b IHHtko@A$LuLH=Iw_I|$LH袽H5+ CD,LhHmHfDE1 tHH@()hf.wf twUSHHft ftR@ fDHkH{H5THܙkHHH[]HVlH[]%UD髙kff.USHH kH1H-H5H= H_~HH7kHmLL9"H{ H[]A\A]A^HN4HILHK|5H L)Z{ LHcLHH[]A\A]A^ú o [ GS1DH= HilH_HHu[DUSHHHNHHƹHqD$Hd$ D$(D$D$,Ht$ HlHH[]ff.HtHtfHHt8uSp HHpH5 HCH1軯HC[DSHHH@HqD$Ht$ HHd$ D$(D$D$,VlH@[H(HFNHP@4H4$HHL$%D$ )pH(@ATUAS1HH HHH<_u_9HHH9EuEu11HHH @t$t$Ht$t$9kH@[]A\þH?кp Dff.fUSHH(f?to6@u21ҿTHHH1H<Š_ H([]fHh?t$t$Ht$t$kH H([]D|kSHH0f?tbG6tR tuXHH ~_H|$?t$(t$(Ht$(t$(kHP[D1@Ht$|kHt$둺`ff.Hv1@DATUS?Y HG$@[]A\HHx$@HHpHx{ H[H=z.H]A\42fHHHx{ H=>.HH¾4HHkC{ +[HH1]A\Zf.HHtlHx[{ HO[]A\HH fDHHHx{ H$Wpw HSps HHŃyH{I1 R[LHH]A\yOlf[]A\w H DWpw H[]A\@t f.Hk="HSX@E1XHc [HH]A\l@. 
}fDATUHS~H;uWHUHHH9CHPH9XHP H@HHU HE([]A\f= 11pHsXIĿ1cLHE []A\DHFH_@WUHFHH@(4+;= 11oH$H{HE uH@HHHE(  F  2  0w  fDGTtH8HHOHtG[ƒt GPGXYG[H< %_HHY+$HمHH1ۿH#EH[]f<hLH1螿H~EH[]@H~EH[]fDH~EH[]fDH1Dd;~EYHy Hƿ%€Һ;HDH?~Eff.@GUtH8Ht@HHOHuHuoYHH HHtHtHrZ@pu;H HRHtIIȾTHѿ`_1P`_fIT`_1'`_HDHƿ%€Һ;HDHIfDATUHSHH}EH= HI'HtL[]A\f.H1vH HPXHUHP L[]A\ff.AWAVAUATIUSHIHHG(=''Hp H5 1HeCQHH= t CZ <ur13lHS(LHrH%LbCZ <HC(HF H=&9L@览CP HuVHH[]A\A]A^A_@HC(H=s&I9HHjCP ILL;mD1H=0&HdkH5HIHڿ<D% &okHHt'BSDeJ1;f1H=%IckH5qI%LL<D$ kHI+BD$ AFDulATUHSˋt-tHtCJ[]A\f.t9 t4[]A\f[]A\fDf.uHGHL~HHu$f.H~HHM$tH}HtAT$8uԉIt$ u3fHD~HIu)GL_~HI#I|$ I|$u11Huĉ@[]A\11ATUISH->H=11}/lHr=#HH¾ kHǾHc"kH <=n#1Ҿ'zkH9HX(HHsHH)>DE11H(<H`|o1HQ_"@*TL _p"HIrHCPLAHX8LKPLH{HHsHL "11H LkHtHS, .[H-F=]A\fH ސUS0THHHuH[]H}PTHH}t1fDHEHHGQ|H߾kI>HdHD$zDH@ HD$H; H2H|$1辐H|$0D$0_]EHL$=LH¾%WkH|$IHjHD$AO9LHD$AOMo(IGPSIHr(Ht H>Ht?H~ tH HT$(HT$(HB(ILJ(HI9My HD$H@HD$IAHtI9fDȃ @tH  _AO;f.IHr(H@BH@(HB8<XHD$ @UHBhH@HD$(H= AkH$U=HH¾%kHIx HyH(HGhHYHxHXK9Lk(HHE KH|$HCP)jHD$HD$HH@HD$nH &}T=H¾%HkHIx HyH(H@hHx HP J9Lj(HHE JH|$HT$(HBPjHD$HD$HT$(H@HHD$HD$fD$( f / YtfHD$ @U2IGHD$(QLIGh2HD$ f=LLL$pLL$IIABZH4@HD$ @UHD$HtH H9H MHĈ[]A\A]A^A_DHIDIH(H@hH{hLLjHmIHE Lx oXE=H¾%LjIA fHD$ H@(H8/fA>%ANAN9LMn(M~PHD$ H@(H@HI9HD$ HHHDL(HHǃ()HL(bkHBLe  WE=H¾%LjHHL`PHLh( K9H1jHmIHt$LjI*Lp ANH1 H]kI>H H8fLpANHֿ  VE=1H¾#H9jH9HHL$ HQ(HB  p  o - o  o  o  o  o  vo V boff.Hht  ff.ATUSHHHwH= HHpXHHtx8t1H[]A\DH;HL$1HD$Ld$MCZǃЃ<@@@8AD$ZH@HoHHD$IHt$HUH8uHH[]Df>&u HFH@x벺?bff.AWAVIAUATIUSIH(z _M$Il$PAG7+I$ HH5 H9(MAD$X1<@D$t$EIt =?1+E1%H+AD9McJH,H}aH,HuA~u D9~t$hN HT$WHIHT$JH@IH,Hmt$u A HIJH@I+D9>C9|f9}HcL$J #Hy;I܋CI$uȍH9~t$hN HT${HT$I$IHЈH@IC9|HHx<A fkIIGxAH~sIHcHʈHtYf:#uSHt&f1 fH@9ufHHhH([]A\A]A^A_f.t$H HT$HL$XHL$HT$HAIHDH@IfDt$H HT$HEIHT$JDH@IgHB(HHP8{Hh8Lp`{DD$D$fDA~HH=ح vkI|$Ht*AD$SI$ H@Z<0tH߾HHIHC9KC9I|$t AD$SAD$THZ HR(f:-HS(C9H+ H$ HCI@IG@7HI9IIpIHxHYLڛ Hˬ G=OH5 H= HkLIEIGxAfDMoy#fHt f}#IHcH=C HL`HLHIk1LH|kMItfA<$#uAD$9<uID$ HtHxH fHtAL$Ad$9aE9<aHE HtHxhN fHtMe98=L1Ҿ&jIGxLH([]A\A]A^A_ÐA~t$`HxIHxH@fH= OHIHH@|$IHtAHHHR(H9HP(H H HPIHuHHHP(Hy Hr HPIKH_HC(HL HE HCILHp@1uH@EA}HH¾#9jAD$SHtrH;KAD$X% =D ,HI$ H= HHH(HC(8IKH댺`YYcqYH]YFIY+UHAWAVAUATISHH GPE1L@Mt A~CQtHEID$fA<$%HPI|$ fB7{ q OAF{PAA~cIHtpIvHtg8ub>u]IFH<2HHxH(H0H9H>H9uA@E1AEEzHbHKPDLKIID$ HPHHUgH>IHU1ɾI1LLEaLEL=E=LH¾#&jI@9AMAeAe;AE9CP tAM@A~AyAuȀAE9I}IuHID$hHvIEhH L HyHH9(CXH@(AM9MIE(H4 L-- IEHeL[A\A]A^A_]HG(HHtP8t"LHSPE1x:x먐Lp`HP8AL踲$AMMHIE(H L- IE]fDHK(HIf9H(HAH9Hǃ(2If.AM:fDH{ 1 H~DIEhHLIEhpf.It$HfDHA @T@TAWAVAUATUSHXL%yHz t Hi Md$(HH AHLM9(t&HH[ HXH[]A\A]A^A_AFYmI9dHt,H[Ht#L'HCLHxvtH[HuDL8EI9HL5 HxH H@YHH\L}f.H[H?HC LHxcvuHS=]E1IL.HuHxI24EHHHH@(K9HC(H Hܙ HCLHmkK;K:MfH}X9ELHH jIFHrqfHb} HH I$HL9(KHtHH1HE^H1jH$ EQ H(H@hHChfDHE(H@H_f8#XHx(MHHHP(H H HP'@ID$ OHHP1(\EYu ET ID$HXH8EA|$HH¾#jHÀH9`;@1H߀K:KC(DHuPHEt/H H@(K9HC(H H HCDKHHC(Hڗ Hӗ HCH1mjHΗ @Y\H[SDH HHHHL5 tgHxH H@YHub1HHtUKH?HC(H, H% HC<5 7PH% Hu 1Hff.H S1H(HHzHtH9t/pT@@t>H Ht HI HuA1H9u ui[=X[HH9tH(HtHHHPI=T9RBH tHfD1H@AUATUSH(H=I1҅HHHD$LkIHtOHR^Hމj1HNjMl$(Hþ^W5ELHHjLHӭIċ=LH jH;HHH蒝H1ҾSI<H(H[]A\A]DH|$LD$L$AA1A)D|Ht$L$LD$FELA8/w"IPA8  MIPHzIxH:PD9ʉVH|t fHVxNvLkID$HD$ D$ D$ 0HD$CHff.AWAVAUATUSHh蘿L-IqL%nH-pH2EMLHH1yH HL5pL-pL% H-ps2EHMMAVHHǾ 1H HL-pL%mH-Fp!2EMLHH1Hb HL-*pL% H-oL5@p1EMMHHǾ 1L4$"H HH-o1EHٺHH1LL%oH-~ HoH֎ Q1EMIHHǾ 1H-= HFoK H 1EIHڹHǾ1fH? 
HH H-n/0EIIHHǾJ1!H HH H-nN0EIIHHǾJ1L%-lH- HrnL-[ kH :0EMIHڹHǾP 1L,$H" H-+nH* /EIIHJH1HH HH=֘ Q6kH-kHmIľ/EMHHH1YH-JkHmH e/EHHH1 H-q HbmxHQ ,/EHHH1H-8 H)mH .EHHH1H=jHp [5kH-jHlIľ.EMIHڹHǾ1H-OjHlH j.EHHH1%H-jHgl HF 1.EHHH1H-iH.l"H -EHHH1H )Hˊ -EIHڹHǾA1H͖ H-kDHm -EIIHHǾ^1L%| H- HqkH @-EMIHHǾb1H-4 H5kL%1 H͉ ,EIHڹMHǾb1KH H-hfH ,EIIHHǹ1J HS @H=[]AHHD$ 2kHD$(HHD$02kIHD$8蚹I HL= L5g6 H+ESAWIMLHǾO 1@H @HU L= L5i+ESAWISAWMLHǾm 1HW HH0L5P u Y+EIMLHǾ1H @H L5 +EHIMATSLAVHǾ 1^H @H H L5 *EHIMATSLAVHǾ 1 H] @H H L5] p*EHIMATSAVL H1H @H HHL5 0kH= HD$0kH-h IHT$)ESAVMLL$ HT$ H1BHL5eHy L0kH= HD$ ;0kHg IHT$)EAVAVMLL$0HT$(HǾ 1L=I L5zgH  H @)EHMISLHǾ! 1H HL= L5 g& (EMLIعHǾ! 1H$EHn HL5 Lߑ 8 L=sdLD$(EAVSMLD$(LHǾ 1H @H HL=@ L5dL(EUAWIMLHǾO 1H @HP L= L5"f'EUAWIUAWMLHǾm 1JH[ HH0L5  'EIMLHǾ1H @Hn L5] p'EHIMATULAVHǾ 1H @H H L5 'EHIMATULAVHǾ 1hHa @HH L5 &EHIMATUAVL H1H @H HHL5n -kH=b HD$-kHd IHT$R&EUAVMLL$ HT$ H1HL5aH} ,kH= HD$ ,kH d IHT$%EAVAVMLL$0HT$(HǾ 14L= L5cH  H~ %EHMIULHǾ! 1H~ HL=K L5|c R%EMILHǾ! 1H,$Hr~ HH / L=6 L5`HT$$EHT$MM H1RAWLHH~ @HHL LL$(H >+k @IL5b$EHMSLL$LM H1HL5O H} *kHtbIǾ hB$EMMHڹHǾM 1H,$L5 H-# H'bL=_N H$} #EMIHڹHǾR 1L<$A8L5 H-ʌ HaF H #EMIHHǾR 1E L5W H-x HaHJ U#E_t$8MIHڹHǾR 1tO HBaH #ELt$0MLHH1MH׋ H- U H{ "EIIعLHǾ1%H{ @Hr H "EIعLHǾ1H\{ @ HHx K"EIIعLHǾJ1H$H { @H H'`"EHL$8HH1IIH_ H~ !EMMHLH1}HD$ _HM~ AXAYHD$0D|$|$A0'DHIAHl$HItOEtJT$H|$@D 1 EH|$@!EMHLL1HEHHEHH<HHH9tD;D蜭HIufHD$ L`趭HItNEtIL$H|$@ 1rDH|$@h EMHLL1PI$HI$H虭HItPEtKL$H|$@ 1DH|$@ EMHLL1ID$HID$HHIHH0t +HI-vfDHD$HD$HD$HL$H=(_tD$,EH\$ H1IIHHQHzy HHjy H\EHt$0HI1IغHH0y HH%y L|$8HMt@&EIHILL1Hx HHx HH=Zc%k=% H赫HM%k= I蟫H7%k= IɫH!%k=߀ I賫H %kHŋL53ZH t H5}\mfHEHL$Ht$EAVAVISAWHAWSAWS1AWAWSSHL$pHt$hIH5%\/zL5YHv sL= Ht$hH`HEEAVAVISAUMAUSLAUSAUAUHSS1Ht$hl L=WYH`L5 L-[HIv HE1EAWAWISATMATSLATSATATLSSH15]L5XH`L-+ L%4[Hu HEEAVAVISUMUSLUSUULSSH1H- H H`9Htu EIHڹHǾ1Hu @HIH H- ^EIIHHǾ1HZu @HY ELt$ 1H1LHu HHZiELL$(MLHH1|Ht EMLLH1pH Ht EILHǹ1Hbt @HHh[]A\A]A^A_fDH=V!kH=5 H!kH!kH-.YIľ{EMHٺHH1H-XHdw E1HH1H-yVH0w EHHH1VH-GVHv iEIHHHǺ1!HjXH-{XL%G L=X L5< L-UHT$Hv  EHMIUATSAWHAVHT$81QL=XL5 H0L-UL%ق H-WH!v EHMMAWAVHSHǾ1H- L%8UH L=WL5WL-rWHu AESAWMATATIUULUAT HǾ'1H-9 L%TH@L=gWL50WL- W2HHu EHMIUSLAWAT ATUHUU1ATEH-΁ L%gTHPL-VL=VL5VQHt pEATAWMATATIUULATAUUUHUATg1H-TL%Z H`L-VL5@VvHgt  EUSIAUAUMUULATAT HǾ1QH-SL% H@L-0VL5UHt EUSIAUAUMUULATAT HǾ1H-CSL% H@L5UL-UHs LEHIMUULSS AVUHUATUAT1ATL-MUL% H`H-#UH2s EHMIAUHHǾ1:L% H-TL-THr EMIHHǾ1L,$L5TL- L%,RH-}THr LEL4$AUISMHHǾ51L%QH-:TH ;H-r ELHH1Hq HL- TL%~ TH-SEMMHHǾ 1Hq HL%PQH-SqwEMHHǾ 1Hnq HL-~SL%O~ H-SS.EMMHHǾ 1H*q HH-PL%~ L5SL-SEHMMUULSATUHǾ1$H-uPL%} H0L5RL-RHp ~EHMMUULSATUHǾ1H-x} L%PH0L5vRL-WRHp !EHMIATATLSU UUHAT1fH-OL%} H@L5RL-Q Ho EHMIUULSAT ATUHAV1 H-QL=| H@L5DOL-| L%Q% HEo XEHIMUAWLSSAVHǾ8 1H-aQL=2| H0L5NL-8| L%$Q? 
I'HHT$H5H 1!3<$:IHH9ZG tH1L$躴L$HH5f= AWAVMPHL$h1<fH HH^Y 11H肖edIHH5G 1jE1HMD% 臭eHZYtBwD`HHHMTIuH1$2LHHv3= HDQF 1@IEPLLxPZHIwIWHIIw8IGILJHT$HHHH)HH$Iw8AG1薇IuPH$$H$H HT$H$ZLHHD$ HH=qqfAw dE1H\$Ld$(Hl$8ILH\$Ll$0Lt$ @UH9H$1HcqI0H$H$H^H4_LH$HPI=1 GI,= HPILH= HPHIFHAG P9>H4_L1HL9b= HPLIHIAG IuPLHHD$H=QE L5D fIXD1HX@.1fH9C tH1L$9L$HH5b= AWAVPLL$01HL$h fH xH\$Ld$(H\$Ll$0Hl$8apidff.HXAWAVAUATE1USH1HH|$ H|$ FHCPHt @8HCXH8 H$1蝄HsXH$ HC`L$ HHCPE1E1HtUuP= H{XtI|$1fH|$ HLpH|$ &H[]A\A]A^A_@@8uH5@ 1L-=Y H$'I}詜fHn=T ILqLHHHKP1t$1AeLAXHL$0= AZA[,^Y iz11HedH$Jz ̮ HH5B E1H¿L$j1LL$eHAXAYtBwL$HHH|$ H)=J E܃&,HsPH~8Wf9F -LL$H|$ E11ɺ6H{P1Hl$@H@7 HP HD$1Hp5,HL$=ҭ ILHQH5*? H1HD$,HL@ HD$I9ID$H=? RH91! 迪1HH肭HT$H5> 1I+H5_=G HAWAVMI1 fH|$0HIt$H1j+H|$0HL身XZC8H$1HHs`H$$H$bH$1ìH$H5|@ 1I*IHCPH'RH$1HsPH$0HCXL$HHC`L%>  HH=? 1L5> CfIdQF= 91H q> ID$H=-= L9 H9 ! 1HH謫H5m]= HMM1:fH|$ HIt$H1)H|$ HLfH9HT$H1x)H5\= HAWAVMI1fH|$0H西Y^.fDH=I< fL1HD$تHa= HfH9uL1讪H%= MMH5R\13fH|$ HHZH=; H}fH < H5O\I= 1fH $HhI= m I}HD$)fHh=ԩ ILk HehHL$I= h H\DH=; H͖fH-&< HD$HfH59[1"fDL1^H=: I`id2`id`id HVXe1USH1H耚HHH辧H}襄HHjHs< = Hپ`#HHEHH[]fDAUAT1USHHhH|H1HsPH5H|$ HH5: HD$ HxsH{0IHC8@9G\藚H; = H"H{8HqH; =ܧ H"HS0IHC@MHL$ @9BHWf= k HHW; = IM<њHH{@HS0G9BtmHS8;BtdH; =K H"HL$ HeM=) lH$ H: = IIH<THHH覻H^Hh[]A\A]=Ҧ qH*;Ht: = Ha!HfAUATUSH1HHI=J HCHxHH=% tH賥H[]A\A]H=: TfIH8 HL 8 H5WH= APPP1fH HH萺^Y Aq11HedH#qHH5h9 1jE1HIٿ- eHZYtBwhHHHH=)9 |fH{pI`H@L 7 MH5V= HHH7 ATPPHP1fH HH蝹^Y Np11HedI0pHH5u8 E1j1ɿMHD% eH^_tBwD`HHHfDAUATUSH8HHHH HIHHvH1觕HI$HI謩H HH腸H7 = L龐{HH`HH|$HHt$HI$HtHCPHt$Hx(ޏHHuH|$lH8[]A\A]ú@idD@id0@idDAWAVAUATUSHHH|$pHt$(轡HHxHX號H$1(wHH$H0QH$H|$p蟷L$H$1IGHD$vHH$HpGH$H|$pUH$Ht$pH$1IvHH$HpH$H|$pH$Ht$p襟H$1HD$ AvHH$HpH$H|$p跶H$Ht$pUBGHD$ HD$f8TH=]`HfSdHI訞1HD$1HD$0HIHD$0H|$pLLLH֟FHD$8@HD$f8HL$ HT$MGLD$HT$HTLHV}Ht$H|$pH[H|$fHL$H5_ImHL$ H_MmHD$@fHL$ H^MkHD$HFLL$HHL$@IH^<v= IeH3 = L龑HD$ LL$ ILH3 <L$#H|$pHvL莞HEHt$(HxLHQHD$0@R/EwH|$(tUH63 =h L龑#=U HD$ 3eLD$ HL$(<H2 I胒LHسLD$HT$LFLHLoDRL[d%HL$HLI@LfHL$ILGHT$HL 1LЋfH)]pILwIHp ftKL;%=0 tBH=/ 菋fH=/ I{fIt$LMH胔eIHx"tDL9%/  H=g/ L=x0 11fIt$LMH9eIHx#tCL9%/ t|H=!/ L=20 fIt$LMHeIHx$tLD$8HWL߾qLEHAfL11uKH|$-dHt$pLH‰HD$8DHT$8LLfA^uAYY)H|$MgeD$HHxHXIw1LjHt$pHLx=HD$81苉1IHD$|LL$ILL苖H|$(tQHl+ = L龑YHD$0meLD$0HL$(<H7+ I轊LHL$1HT$ LHUmIL.HLqIHp ftXL;( tOH=f( LT$04fH=U( HD$ fLT$0LD$ HIrL eIHx"tTL9( sL) H=' 1LT$0LD$ ăfLT$0LD$ HIrLƌeIHx#tWL9/( L( H=' LT$0LD$ gfLT$0LD$ HIrLieIHx$tLL9' tCLi( H=J' LT$0LD$ fLT$0LD$ HIrLeIHf) L龑LT$0RAMHD$ aeLT$0LD$ IH-) <L讈LH:tKD$M$LHSp߅MN$LEHA:dL11蟞HEHt$(HxLH蒩HD$@#p:LD$LLFLHL<:HT$8LL~x#*fHRkILBHf.x#y"L-dSHt$pLH‰HD$8lLD$8HRL߾qLEHAfL11WfHi' = HVLH;fDAWAVAUATUSHHH|$211IH|$ H1LHgHsPH|$ Ht$ H|$Ll$@HCPMuHxXLLj H& =ڒ H龑HD$@ MIHCPHxX|eHD$HCPLt$@L-`& HxXWLL$MLL<ͅH|$H HCHxH|$HAD$t'=@ H& L H|$HצH|$荑H|$IH{PL%% HX輎LL⾔ H|$H荦H% =ȑ H龐 H|$HfH|$HĈ[]A\A]A^A_f.AWAVAUATUSHHHPIH|$N1I}HD$IpH|$ H|$@1eIuPH|$@Ht$@H|$ IEPHHxL&HH$I`I]HuE1OH1E(ƒH} H$1dHu H$SH$H|$ aH|$0H$?H{H|$0HEH|$0薏IHH$1kdH4$H$H$H|$ HD$`L$HHP=Ə p= HeHo# = MIH<IH|$ L6H[HHEHEHuڋE(H|$0;H{H|$0HH|$0蛎IfDH$1qcHuH$1H$H|$ H|$0L$ʍH{H|$0HtIHIHf" = [ H|$0H>H|$0MIUHL$`MHQEtsH<$2H|$ 踍H|$HHD$@t'=" H! 
HH|$H蹢H|$oH[]A\A]A^A_DI]AH\u= exH|$01E1行H{H|$0HKhW dE1hdmfDAWAVAUATUSHHHPIH|$1~I|$HD$I$fH|$ ԋH|$@1XaIt$PH|$@yH|$`Ht$ H肊HxHHyfH=KIyfH5KH1H HpK= IHGzHt$ HH|$`H0HxHHyfHJ=8 HIq0HS= MIH<cH{1HEyfHJ= HIpHS=֋ IIH<Ht$ HnII$`I\$HE1%fH H$1_HuH$GH$H|$ UH|$0L$3H{zH|$0HݟI$HI$H = H|$0H襟H|$0[MHtYIvL1% IV=Ê ILp= IeHf = IIL<}HH|$ H-H[HtLHEHEHuH|$0AH{H|$0HH|$0衉H@EuI\$AHuH|$ vH|$H詞HD$@t'= H HH|$HwH|$-H[]A\A]A^A_= }e땺hd1驱fff.ATUSHHHoHEHt8t3H HI111 H蒈H[]A\fHxuLeM4I|$PuI|$uIT$Ht:uHzuHuPHxP1uID$HuP1HxPdID$HpXHEHxPlAHEHpXID$HxPIHEHpXHxP+ID$HpXHxP ID$1HpPHEHxPHE1HpPID$HxPHuPH}L$HuPHxP1nHE1HpXHxP~SHuPH}1׈`fHHHtIH =O HgdAf.HHHtIH = Hgdf.HAWAVIAUATUSHhH$衅H$蔅H$臅I~P,h! H赃I~XHD$H$@1HHH$ZIvXH߀$xH$`HD$8H$hHD$@1w1HD$xw`HD$0I~`H 7I~`D$b?IHufDuHH@zHuH<$1ZIv`Pv MAFpu DV8EH<$H$`f.A1L<$H$L7IF`@pIwH$IF`xL$`MEI}G7Q|$bIF`xIEBi ML`DŽ$ILNIEhL = H$LLH$hHH|$ tH$HD$ IF`PHljуu8u@tHx[5PHD$E_HD$pH|$ P{Hx1蚨HP HH5 1$H}I~`H?tMtAEH¸IEH|$ HD$pI~`HHGHHH@H@]Mt HP1HT$6>dHHL<$1LAHL-H$`HHD$ HT$(fD$D$biHL$0H =l H$HdH$H$蓁VfD~AAx OHP I1 H@ H$Iv`DN8E$At>~H<$莅@pP81LH$H袀IF`AE9:0oAHH$IE Hp11HHC>HC(L(ҬIN`HHD$(DA8HXHD@8HC(HY@yoAXFXkH\$(oACoACoA(C(H\$(EoC@ oC@0oC(@@IU fwHP@H\$({u HP(H\$ HZHlJIH<$1IF`xKH$`9DŽ$lXMOIEL f$t&HPB7 Df#tI~`HPLM6HD$pCI1ŽHh HD$ $It$LGpWIF`HpHX H=9(1HD$8VfHD$@H$@H$qIv`H$^IV`:$AHx胴HHb1HHiHD$pIF`-d?HExdH$?J<_HHH;L= dJ_跞?J<_HLH:dJ_芞?J<_HHH:HU dJ_V?J<_HHHp:H /J_"?J<_HHH<:J<_J_G5HE*@tdѝ?HExi dh dh dhV d߶h d˶hJ d跶h d裶f.fUSHHHHH=pG "~CeHh(H@Ht(HtHHtHHPHH[]f.H t.O49 t!`DHH9JHHHH H9tHt&[H]A\A]A^f.HH[]A\A]A^ÐH f w1$mHHt@f}"eHmHuHHHH[]A\A]A^11H;fH{H9CItHfHL`HHtHHtIHL9i11X8HHHHHLH#DH{gH9CHtf; HQ\eHKSC@ fHHfDH}H9EHfHHǀHLHHLHtbMHp I9t$ t&fDMIL$ H9J HHPMd$HuLhI|$VIFMd$Mt0L@ fHh(H@IIEfA<$"LhuIEH@[eHojff.@``HHH=`t9PuO9uf1ff.ftF59tL`H fHH9z1.c_ch-aWHc_null_cSharHD~LnHc_alertH`LHc_backspH|zaceLHc_form_f Heed\LneHc_new_lif5 HLfoufnf= K foJf)KAab Hc_vertical_tHfDkKH5 Arc_ptfD HHHc_null_pHtr fHc_funptrHUr Hc_null_fHunptf*Hc_f_pointerHHc_associ8H}Hc_funlocHfodHcGfHc_sizeof(I'ated$RDGMqJoc_loKHLH[]Hc_null_p)HtrMf Hc_funptr.H/Hc_null_f&H_rLfU78unptj@foQc?)6efo)cHc_doubleH@Hc_long_dex)Hke fobfYHc_float1 Hv28fobf`x)oubl*fobOAumpleff `efH$0nf.tn9t\9tJH*t.9_1DH9t(uHfDt9tZ`1fHH9t͋yxu@"t9t"`1 H 9ty u@1뉿j1hhkj1Jhf.AUATUSoH5-IHc_AE1f.Hè3tKT1x HH<`5eH9uʋCD9~D#Hè3AŅuAtHD[]A\A]@5t^_A1x HH<`4eH9EOHèXuHD[]A\A]fDHA[D]A\A]@AWAVAUATUSH-mE1E11f߉?t HfwЃ+҉&DDHczH H@H `4``ۉq0A4y8Q<xAE,\lHcHD`҉)DD$D$ D$ D$1w Dl$ t#EHH4H9_CHcH4H4L_AvA;vAAVAAAVAAV A~ AHH<_DED< t<oAGHL4~_MtDotDAtH= f?G6A9t0H= f?t|G6A9tH=Կ f?tvG6A9_IcWDl$DADl$ fD[yexfDKye~fD;yefD$D$ D$ D$D!  EE=}! !|$ 5T! |$ ɉt]5`fHHx<rHcPxu 1`f.HHx< HxuU9 HE,H[]A\A]A^A_fF  |$ %  t |$  n=! YU@pj9fH@_ǂD_ HǂH_.Z@ H(Zi|$   0 B# t2 `"|$ bh|$ 3 H-|$ DÉh|$u U"  t_|$t|$ u41ҿ EHk1;](|$ 9h f|$Ml1\f|$tT31|$ rl1\D ҉ a! m1\f|$tepj跖pj裖pj菖pj{m1\11D$ADňD$\Pm1[l1[Pl1[k1[k1[k1[ft+t tA^ t(@uCHL H! H @uH4 H N?ff.AWAVAUATUSHxD[E`E1Ld$7HcH ɀ $`u@u HҀ $`H C4%9t_H C4%9tKH C4%9t7H C4%9t#HԷ C4%9t J?HC7MXrL1J``HHI蟻Lz?=i% HH¾&q!eHyENHEtx}  IcIIcH ɀ $`} hI?Hj 2HcH4ɀ $`Hɀ $`D1E`E1Ld$fDL?Hþf8HfHC4f%fC4MirL1J `H IvLy?=@$ &HHH eHPAv t(}IiIcH H@ `널DE A_E1Ld$DA@HH _H- E4%A9tUH- E4%A9t@H- E4%A9t+ fAdH@4Hf%A fDu49?AzrL1J,`ULx?=# HH¾&'eH/HH _tH-vH;- L58 t9H;-ߵ L50 t)H;-׵ L5( t,dHhHI 9?ArL1N4`许Lw?=x" LH¾&eHHH _tL5HD$IIǨteEwIcL|$A GIcHH _IA5HXH _ Ld$1ۋD_1He@_HźrL1H裷Lv?=m! HH¾&ueH}HHx`~}eHX`@_uHH~ Hv?= ! 
HH¾&eHH &v?= H¾&HeHH' H{x|H H{x2H=[ |eHHOeH=CH4|eH=8H|eH= 1H1,fHm|e5;Hq1x HH<``1H=H\$0> fH=1H'fHH Hp4t$LjT$LD$H9vxֹ@?)Ht0HHHt0H=c Ht$PT$\H\$PD$X f5ZH1x HH< `H=w fH=1Hb fH HHt_Hƿ@HH``H_HHx[]A\A]A^A_f.kHrt?=S HH¾&[eHcfD rs?= HH¾&#eH+RfDH}HH``Hg@Hcff.HxHH``H1ff.HxHH`H1ff.HkxHHŠ`H1ff.H+xHH `H1ff.HxHHŀ`H1ff.HxHH``H1ff.SHHH5 H=$fH1HI!fH7[US1҉HH7&1x HH<ŀ`HH[]ff.1HStHFHtHP fv H[DH1H[kff.HG7t,f? t^HuA@HHfDt[HHf8 u[H@f8t.HÐf?umHGHÐHHuD@7uH@HúPpj載Ppj訉Ppj蔉Ppj耉fH tvGPH@HHt1yu{GQH HGPu_yu|0pWjHG(HHt0B8%u#B8HJ`t&z:rD1HÐGPHuں0pBj蔈DːAff.@AWAVAUATUSH1HhH|$H|$0T$L$H|$@פH|$P;ydIAO7111lt$IHHc@ŅAE11t5H|$@DL$DL$McJ|IIcE?HI1E1HDJHt ?IMc1JDŽȈ|$HcENA9\C9YfIHӈMcJD1Ht ?K qD9~IJֈBH9H|1Aу?u5HǘHT$ T$HT$ DL$uDI11A9Jf.5^HǘHT$9IHT$HD0JDHt$@H|$PHeHt$0HT$PH裛fDEIXHǁIHǂxIH5H=SIPI1HǀfH|$1HfIHHT$re|$HT$HZ|$u = u-LXre|$IIGAO7HIHhL[]A\A]A^A_Ð5HǘLT$(HT$ DL$LT$(HT$ DL$5HǘIMc̅JʈQJDH|$PHJHHt$PH腨Ht$@HT$PH蓐fD1҅H5H=fLt$IL7qeLMw[+?H|$0aH|$@WH|$PMEtxD+AxJIIcH=AHHHpfH|$1HfAHD$uLt$=r 1Ҿ&L{eIGxL|$%C?= 2H|$eIfDH{eIE1IHHT$CeHT$Hxf5H|$@HT$HT$Hxf.5H|$0HL$AeHL$H@Ht$@H^5LH|$@.HfDC9~A^ff.USHH1Ҿ&H= eHHCHC)?H߾6HH1[]|6USHHsHD$Ht HH[]d Hÿrh?KHCxH-@ rg?HL$HHHH@%H- g?HL$HHHH@ &H-Q g?HL$HHHaH@ H-$ wg?HL$HHH4H@ S H- Jg?HL$HHHH@H{HCH;H<HH[]fAWAVAUATUSH8DH$,7D$hAMcAŃ= uEu"J`Ht%H8H[]A\A]A^A_J`HuVdH|$HÉr1@H|$6f?KHCxEL5tLD5t =f?LHHHL5ef?HHHLH@Iƿre?LHHHH@rL5e?HLHH~H@uN= HHCH;= uEtJ`fDJ`}H5HcH_H=|fH-IHD$Hto1HLfHſVIe?HHHHH@@fDH- d?HHHHH@vdHſrd?Mu L=HExvd?HL$LHH3H@H L=Id?HL$LHHH@hN L=ad?HL$LHHH@HMHEH;H-o j<}DHH@Pf`ABD9~H1Ht ??I}HCD9}TAH|u1Ht1?uf5^HǘDL$93DL$H!It~EAUDD$MLDE1PH|$ H8[]A\A]A^A_Ë5޶Hǘf5Hǘ{f1|$D$ lf.A9DLLt$ L$fD5VHǘ63E71|$D$ fAUATUS_HcHH`HtH[]A\A]H=tFH_H5H=fH=l1HfH`H[]A\A]HD$dH-N{(IS?HL$HHLѵH@. H- R?HL$HHL褵H@LHD$dL% &HR?HL$LHHYH@ &L%oR?HL$LHH,H@u L%BR?HL$LHHH@HsHD$d7U IQ?HL$LHL軴H@sQ?HL$HHL蕴H@L HD$vdH-׏ sHQ?H=HL$HHBH@Y_Q?H=`HL$LHH@H=EH-9 "s"Q?HExfH HtfAUATUSHHD$dHIP?HL$HLH至H@/sHP?HL$HHLZH@LHD$;dHHRP?HL$HHHH@%Hj%P?HL$HHHH@u H=O?HL$HHH赲H@H)HD$dL-{(HO?HL$LHHjH@sL- O?HL$LHH=H@. 
L- SO?HL$LHHH@HHD$dIO?HL$HLH̱H@7U N?HL$HHL覱H@LHsH5H=f1HLCeHHD$]dH6H5HH=fH= 1He IMN?HL$LHH H@@sL-  N?HL$LHHݰH@VIM?HL$HHH跰H@H+HD$d"^%HM?HL$LHHsH@] M?HL$HHHMH@HHD$.dHHTSe<H=M?H=6HL$HHH@ Hو  M?H=HL$HHůH@ RsH L?H=HL$HH蔯H@YL?H=HL$HHjH@H=H\stL?HCxHiH[]A\A]ff.AVAUATUSH Lv HD$f A$hnf.H1H2Hp1LItHL$HQ19rHHTqHBHL*HHxIcI#eHL$HuQ19brHHTqHBHcL*HtHxHIeHt$HV9JHHHTNL"HBdH []A\A]A^ffDfDHFf8Lp @~fDMjHFLp ]DF7 <HHH?HyH []A\A]A^@H|$1ҾHD$BHt$HD$VH|$1ҾHD$HL$HD$QvH|$1ҾHD$HL$HD$QH=ч 1 -eI}DSG7 t)HqHHƋ:HH[ú pjbff.AWAVAUATUSHGZHD$<AHHUHHD@P@u CRD, .{ mHf.HHteHwPHHt?BPu9B[ u3J 'n#J8f.D, .HH@HuLd1HJdIl$(Hþ^SG?HHHdLHgDGYIH(|HHHD} CT *H3 H9(uMtHHE1uHHteLuMA~ AFZI KHD$HEHH8>?HT$HL$HLΡLDIFH;HD$KH=çfEeHT$HHtJ%9tABHlH|$1ҾJHT$JՃ1HLjPRLl$ KHEeIHHRHD$po jWPoP jWHGCt HGHu]H H#G8H H9t H=| 11eHHtDe@tH1 Hf.H1HH:Def.AWAVAUATE1USAHHx HD$(C^ C]0^CZeIAG8ƒAeAO:uLIw`u AGF @t 1҄ƒE1LA1ɨEIAG8@uEt LfIAGvMI?L\$6?L\$HL$(HHLșf`u:H7>t1DHHtHHX@H8 t1DHHxpHp\ff.H19t@HuHIHHI(AW uAX@uHH覟Hff.H19 t@HIpHtH HqPuAX@uHHUHff.HH~$H9 u( H| uH9uf.1ff.fHtHHtHxXt Hx`t1fHH]`u1H1H1HfHH=.tHHtHxt Hx t1Hu111Hff.@AWAVAUATUSHxL'A<$tHx1[]A\A]A^A_=u߀=hu֋̀=suÀ=iuM|$XA?uIHtxxuAGp@uIl$PEA;GD$uE A;G D$Xp|$|$ZD$XDHD$ IHD$8H@x8!HL$8HY HSH\$@z8HT$nHt$lIHwHHD$|$nHD$0- t|$l>)HDHD$0HD$@HT$oHt$mHx)HI|$oD$nE|$m HD$) AOp@Ht$HL5H H)LH H|$L8|HHD$(ID$HD$HHHHAID$L`j=K IAoD$@HD$('mH@LpIFE1҃1ɺLDT$_CzHt$1ɺHD$P(zH|$PuH:}DT$_IFLHEHt$LHLL$P~LL$PIHL8HL$HIWHHD$(H5IH|$ HHA IGHHx(GZOTpGZIp.(/HT$0=T~HùL IHC(HT$=~ILLd$ IFHD$8Ls HxHƿxH\$@HD$(H{HƿxH{HD$ eHƿx(ILLL$(IF (ILH-LD$ IIG (L@(IF L`I\$HT$IƋt$XH1IFHǘVMw (觚H\$IHMf (ID$HIdHƿw(M|$ IGELI*(IFMw #LIHƿGw(ID$Mf 陧\$HT$Iŋt$XH11HǘIEU(Ml$ 螙HH(HCI] |HIaHƿvLc ID$HAHD$(iH@LpMNbuF1ɺLDT$_'vHt$1ɺHD$P vH|$PuHyDT$_IFLHE1ɺHHD$Pu1IƺHt$uL8vHxL\$P1ɺHHICu1IƺLjupvHL~xHCHquN1ɺL4uHt$1ɺHD$PuH|$PuH+xIFLHu0E11ɺLDT$_tHt$1ɺHD$PtH|$PuHwDT$_IFLHE1ɺHHD$Pmt1IƺHD$>EA~E/<HD$- ۃ1ɺHHD$PsHt$I1ɺs8vHLvL\$PHIC1ɺHs1Iƺ/HD$ oHD$ zOu1ɺL[sHt$H1ɺBsHߺuHVv1IFHHD$PsHt$I1ɺr8vHLvLL$PHIAHD$ mHmHH=oHrHoB@葓HZH [HHǂH1HH1H~(Ht Hz(HF(HvHHrHH5H1HFHHݐH HÐUSHHHXHHHx HtoHEoCEoC E oC0E0oC@E@oCPEPoC`E`oCpEpooooooH[]KPu-fSHt#Hut1[DuHHtxxuHH;uHHt xxtָ[DAWAVAUATIUSH(HHC LsLhH$E1Mt L2AMtL"DEAt$8~H(D[]A\A]A^A_ÐAvto uAvtEUAAvÃtLL|Y ($yfDLL1H$IEUA tfDA~JA}?A>5A'A}AIIIMHHHT$HD$HL$I(HT$o:uA}uIH9H#IH|$IHHD$HCH$LpLAgf.LAHLHA$A$'f.ffff|@EZA>t.A}ILI LA}tb=grIt$X1H‹5RrLHT$LHD$DHT$HD$IIM$I$8LD$CL$uV=qIt$X1҈L$L$H„x5qLHT$LIHT$I뀄tE10u(=qIt$X1H롿v1qI$M$H|$LHT$?H|$FLFHHT$tH$HL$HSHHHD$I$I$wfDUSHHPv=t8H=̊HHtH1[]@HkPHsX}Ht$t5E8~Ht$Hd uHt$1H},uHt HEHt$~u>uHMtHdžff.AWAVIAUATIUSIHHI?HtqHD$JoCHSXH|$H@ILoC@oC(@(oCX@8@XIHP HD$I$IM8t^Ipx tbtZVHIMAMEH=1jU\HXZHEHHP(BZBZHH[]A\A]A^A_HHIMMEH=ʈAjj1UH H먿v1AWAVAUATUSHHC8u;tH1[]A\A]A^A_fHHt@x= t=tPuHHJ Hyu= 6=+HR:uL'=0= AlH|$HHHOHCXH}LHu LHH$HIHIfoCHH$Iv L@oC@oC(@(C8oCXLDII~E8EXCLHHIuH15HD$HH(zDHI Hyf AMAA LDH8t1DxDEDsEDH8EHx@HHt"JWu5C u u BX@t@AUATUSHHHHRx=t-t%HuHH@Ht{R8~tH1[]A\A]fHPHHH:fHHXJW6DX EtH-H:DeDmAE9At$t uJ\R}x^D+H=CH4HDg%A?HHDe%D EEAt$돾(1E1UH.HH$ff.@AVAUATUSHɃH/Ht9Sw^E+wC$y@HHuHu Ht~ BHD[1]A\A]A^DSHHxtڅ@HHx uHH8H8HDD-D-HDcD3AE9At$IsJD#H(Dh@ 1[]A\A]A^fDHH#.fDH}pHHI LMH5[HFHNH1(H;^HHLHtsIT$HtHBHHtHH@(H?HHH;G(uI}Z @tW@ uHuHHHXpwHv(LH01H5kMd$ MmMM%HuHHHXvHv(LH01>H5O{xkD(H=5HHHDg%AiHHDc%D CAt$ @1AvEH\%E1AH1E1mHHHAWAVAUATUSHHxD EHHHHD$ yH|$0It?A^1CHt$HT$H|$01уAHD$s8oCHh(E oCE0oC(E@E1{qEZMY@H@EZS8MR EP ЈEPoE IH{@@oE0@oE@@(Dh,C8AD$8s8%K8ID$@HD$oCXID$HAD$X聂HH~LHo@Hw~EHHE?HEPH\~H]XH(~HxL[]A\A]A^A_@{8JH{@?E1D} S8oCXID$P@P @(HHP8 HSX@1聁H{@S8H@$LsXE1@5cLMcA&HH$Q-HH$5OcJDLHs@HH$LI:D9c8HH$JvE1{H|$1 HE(HCHtH8Ht ?"HCPHuHH8uHpHxHT$ 5_bHSX-Ht$ HI=H|$ 2MHE(E1L fHGHH@(@R 
HIfDH4$OH4$HD$HU(HHR`@@MPaHCHHt'H@(H@(HtH8Ht?u XIF;t#HE(MPAHEDDHCHH@(x uHCH8tHLu(II L(I$~AoEXIT$HLhI$1ɋ`H5ztAD$XAD$E1AD$ ~ID$HAoD$Hx(GZG AoD$rG0AoD$(OT OWGZOY@G@CM&!(u~H8t1DHHtyxu5 zً yπ=yuƀ=yu=yu=yuH yH H9AXtUSe HHHHHtHEH1[]DAWAVAUATUSHH8Hyt/E1:H81[]A\A]A^A_flyHH@x= ='=u#HH]Ht1'H]HߋA`HHP@x=t =;Hk.fD{HHHf@x==HH@8oHPHHR(BP]L`PMu m@IA$:I$HuIH3H{P9{=]1ID5]HIEHDMIE9I<$.H}1M,$)&H]E-w=`H8E1F$X{@HߋA/HfDH{fDH`AŃ{8EH}u/LLLHD$(HH}1%H}fff nfD ^fDNfD=uPuBu4=u'LLMMˆT$A?A<$A8LD$MIA|$IHHD$Ht7HP:u.1 HH9u!HljT$HT$u߃wHD$(LAHH|$IHD$aLLT$IHD$HxLT$IHSXH|$(LSHL$HQ HHP HA HD$H|$HsIHx |$LT$HD$HxG8oGXAB8ABXA? LT$LILT$IC|~F"HߋE1A|$8FE1LcPw=Y1ID5YHIE^HDCIEhufDAUATUSH8H;tH81[]A\A]ÐLcPA|$uHLH&$tA|$,tA<$uID$HH@(@R uH{XyIŋtrtuAuLLuLLbtLH-rHrHrfHCX/IEPHu!HH 8usff.H1%HtZH1:t7J~0t6HQHfDHxtH9u1Hf.HúufH8t1D qq={quۀ=qquAWAVAUATUSH(-Yq@u HPX:tH(1[]A\A]A^A_HHtyxuLHHT$Ht$LpPI|$_HI8uMl$ HT$Ht$I}_HH/8vE1HϺLHL$HL$oEHpHpHpI}tdHIE Ml$ HT$Ht$AI}_HHtZ9iEI|$ H1pHpH pH@ID$`HoHoHoI|$ Hu%HoHoHoLH L5Ml$ HT$Ht$AI}^HHH@AWAVIAUATIUSHHH?D$ LHt$ LՅAT$ I>? $ň{fDHHHu HHHH{ H{tfLH\nGucH{ LH-Bn-uIHC LHHxu2HC LHHxuHC HxLHhAHD[]A\A]A^A_I>?uH_PHu>Du&H{LHuH{LHuHHtuËCwC ~L0E1LHLIaI~xLH2JILH0AID9{ xDHǨLHAIHLE1HǨLHIHHH{LHtH[ HufDHHkH{LH9QH[ HuFATUISHHH0 H|$HHD$Ld$1 HT$H|$ppZH|$17Ht$>t%H|$s"H|$H0[]A\fDHƘHfHkHtTPtMAWAVAUATUSHHH/}t8H|uJH8tZHĈ1[]A\A]A^A_fD1DHH|t@D Euf.=juPquHXPPqHuDHHnDEu{uHjHt֋ztHjHtE1IcHHЀx Zj Xj9HE1HHHIHLi(A} OHp>BA(H|$@HƘHHP:H@8HH|$PE1AC D|$1ELl$E"DEtEH9C AEtЄtH0?tHt$HL$pHT$0#t:HS8LcJDHt 8J҈Ht 8<H|$p]DHhfAHD;bDHHlH<5 ACH0?Ht$HL$pHT$PCHS8McJDHt 8J҈Ht 8dH|$p@AHH|$0HHp>tE1A"H|$PHƘHt$0H|$P3E~EyH|$@A AH|$`D\$"H|$p"HT$0Ht$PH|$`DHT$@Ht$`H|$pP1Ht$PHT$pHH|$`H|$pD\$SHEHHx(袟HI;LH5fMHVfHNHE1*fDVAHD9}DHHLHtlIEHtHPHHtHH?HHH(H;z(uIRZ tR uHUHHHr(HPX1LH7PxH5-fpMm MvMM6HUHHHr(HPX1LH7wH5e#HH|$pHHT$1HT$H%H|$pHD$DH0LD$HHe|$HJHVXyLIH1̿HLH|$pLT$ HT$LI1HT$LT$ LLT$ H|$pI DH0|$HHdMHJHVXxLIH1EHS8LT$LH|$pL0L# H|$pI DH0|$HHdMHJHVXyLIH1ξ.HH|$pLT$(HT$ HHL$F0HL$HT$ LT$(HLT$ H|$pHD$ DH0LD$HHc|$HJHVXxLIH19HS8LT$ HcP\ff.H8t1DAWAVAUATUSHLxPMu!MMtAuAtH1[]A\A]A^A_DAG ~M0I1@IUHt2M$I]PpZLtDHPHSHuݍEHA9G ff.AWAVAUATUSHH;tHĨ1[]A\A]A^A_=bu=buDbEˀ=bu€=buHCX8uHHtzxuHH=|bHT$^Ht$@Ht$\LkPHVbHCbH}QHHD$MHE HT$_Ht$]HxPHH*LH|$HHt$LHLAHL$E8y8 |$_D$HuaH HHD$AER    Au IUXH111I$ D$4D$0HD$`LHD$L%`HxPRH}PRI}PRy`' L%` HD$8~HD$x8 ANp \doCILHD$ @-IGPHp = MwX4Ls|$ D$ z1ɺHAHt$I1ɺALuHDLI$1ɺHD$@Ht$I1ɺ@L8vHCLD$L1ɺII@@1IĺHx@LpvHCHL$ =[E1IGLHHUIboCI@H|$NHINH|$HD$NLHD$(HL$LLHOLIsHL$Ht$LHNLHD$QHL$Ht$(LHNIWLD$Ht$`H{PLBIPHBHPHLbIL0H ILHL$`HL$LD$hLD$HIHL$H|$Ht$`HD$(Lt$`HL$hsILD$HHD$Lt$`LD$hHt$`HMIIƀ|$\t[HD$H|$pH A HpL@`HHXt$ <H|$ 1ZH$Hx HD$(Y^|$]tRH|$pHHu LE` HMXAV<H|$ A1VZH$Hx IXZHD$ LxL|$(LHt$|$0LID$P>|$4HL>H|$8ID$X HCLHD$@E11ɺHDD$(=Ht$I1ɺ=LuH@DD$(I$IL`EL$Ht$HLAI$L`HD$ =0B1LI$.I_|$IoC@D$HXKH|$IFKH|$HD$2KLHD$(CHL$LLHKLI%HL$Ht$LHKLHD$HL$Ht$(LHpKIWLD$Ht$`H{PLBIPHBHPHLbIH ILL2HL$hLD$`LD$HL$HMFLD$HD$(Lt$hLD$`H|$Ht$`%FHL$HHD$Lt$`HL$hL1ɺ;Ht$I1ɺt;L8vH>L1ɺIGII;1IźH2;LpvHF>HL$ =@1ID$LHHI]oCI@L-pA^_-tx|$Lsl|$Ls(|$Ls1ɺHx:Ht$I1ɺ_:uHLs=I$L`Ls|$ZHHH|$IHLHD$HL$LLH ILIHL$Ht$LHIIWH{PHt$`HBHPHLbIHL2HL$`HL$CHL$H|$Ht$`Lt$hHD$(HL$`CHt$`HLt$`HD$CIhE11ɺHDD$(+9Ht$I1ɺ9LuH&HHH8tH>HxPH5OHH~HLH@HI9AHHH|(AHOA9n gI0HH{PvHA^fDI$H0H~PBHD$8HD$0H5OLd$@H OID$Ht$0HHxHNL9bt ;V1H{PHNHtHpHt H>HKHH9OHtmHL$0uH|HL$HH0HtH6HKHH9NHHH9uHD$HDHD(HNHHt(AH NHCA~ HD$0HD$9IwH{IwHgHT$HA9F ~^HD$HDHtHL$L|MtH(IwHT$HtIT$LD$HBD$zI|$PHT$qHDh8ID$PHT$Hx@tIcVQHT$HC@A~ qHD$0HD$HD$HD$HL$L,HcM'M0At\t1HCPHu$HT$(Ht$ H|DŽ@LI}HD$LL$Ht$ IHCPLLL18LpI}HD$bLL$1IHCPLLA=HT$( &LHT$ I}IIHCP1LL HT$HL$HA9F HL$HKI\$PHxH@H@L`@HCPL}$HT$ J|DŽ@HsPI}Ht$UHt$I}JDL{P>I}IL{P)HT$ I 9L{PI}HT$HT$I 
t1Ht>HBH@ID$LbHD$0HD$8uH@H@HPHRHRIT$L`wff.fATUSHPHD$HBHt *vHP1[]A\@HT$ HBHZHD$HD$Ht1Ld$Hl$0HHBHZHt8uHHD$@Ht$HLd$0H-IHL$8H@HxiHIff.AWAVIAUATUSH(=I =I=fIHH=IHt?G1Ҿ@ZH9HrIHxAfDLtI>?E11DH)II>HlHuu+M^g HMuVH^LHED9uMtM.HHAHt D9`hAH(1[]A\A]A^A_Ã9uHt$HHL$yHHL$tHHMf.I>t3IWAfH@HHHL$IHL$uHT$HXHtHy 13HMHʾxy 1aH?AWAVIAUATIUSHHH(D$HHt$LAօD$Me]GA$D$MGD$AGD$Bv${fD GI$HHqI$HHHxRI$HHHx3I$HxHHM|$PHHL~I|$XHHfI|$`HHNI|$hHH6M|$MtIIPHHIXHHu}IHHLHugMMuA$NU-ED$ED$ED$EMefI|$Ml$1H([]A\A]A^A_fDwEI$HHHEuI$HHHx*uI$HHHxuI$HHHlI$HHHJI$HHH(I$HHHI$HHHiI$HHHGI$HHH%I$HHH ~I$HHH(\I$HHH0:I$HHH8A@I$HHLrIIuHD$I$L|Mu M M IHHtM$MuefDM MRIHHtNBM|$P+M$Mu] fDIHHIHHrIHH[M M4 HHL8tI|$PHHBMd$M^M$Mu2 fI HHLMXM IHHt'@I$HHLHM$M[ATAD$IhHHAM?MuD$M|$PA@%AM|$P{I$HHhI$HHHxII$HHHx*I$HHHx I$HHHx qI$HHHx(RI$HHHx03I$HHHx8I$HHHx@pI$HHHxHQI$HHHxP2I$HHHxXI$HHHx`yI$HHHxhZI$HHHxp;I$HHHxxI$H\I$HHhI$HHHxII$HxDI$HHI$HHHx~I$HHH\I$HHHx=I$HHHxI$HHHx zI$HHHx([I$HHHx0HD$MLl$PtHt$PLV}LhLl$PH|$ HD$XtHt$XH|$ 1}HHD$XMD$,tL9wHu LD$,H;D$|$HcH;D$H{f?Do6D$,Ht$H=+yLEd$ IDLd$ cHHsMHD$0HD$HHxf8HD$HD$Dx6LH\$H.Ic1H HD$@HC1HD$8@HD$p6HE1AjH|$@H螐IHD$0DH<(_AX|$,t LIHйHDD=LDDHD$w=HMHjHT$ 7AD賧1Y^H9\$8IHD$@HHH9\$t=HD$f8&H63c E1HhL[]A\A]A^A_H|$HfHT:H肸u/H59HHFH#xHHHH9Hs(H|$@HD$@c7LNmH#P$]LHD$HIHT$H謺9AFtH=NHt HT$H1mH6H D$HAH<H#T$@HI!1L!HT$@HT$@H‰sHŠ`HDjLHI HHu1Ҿ$LI9HtH@Ls0,`MLDAV1Iv1AHHH?5DHT$8DL$4LD$(LT$ HD$L\$|$HT$8DL$4LD$(LT$ HD$L\$L$뎋WHcHËW1HHc词L-MufDHIMuImfA> u1ҾLκ9Hu1Ҿ$L躺9HItL`Iu(H|$@HD$@'`7LiH#P$]LHD$HIHT$H=9AFtH={Ht HT$H1L}L T$H1H}H#T$@E1IM!L!HT$@HsH Š`HIcAHLPHHuI}0,`HHDG1Hw1LHHHPHD$HyHD$HIFAHxX`6 IFH54HxX`6gfff)AfD(H;) 1HD, HHL$Hc5T!]1IFt$UJDŽ9El$I5REI$(`HcSD9PSIVHŀ`+H߉t$Mt$|F`H,`H;H%H9uVIcD\$ DT$H}H4hcjE1A11I' H=XaIFD\$0I(`EHAFD]AXAYDT$Mx1IDJchc1ҹ HH=`IFHT$DLD$D$HD$LD$L$HD$0HHcǹ1 H4hcHEj1E1A1H觯 HEXZ@`tH1gff.$AD$D 0Hy(HEHV'IfHt E<QHEHDE<CtH$M@E1HEHt$fHEHD$P\fHJHIHyE@HxHHe@ ۾4DHVsf.HHL$aHL$\HazHHuaHXEH}_aHIf=06 H}DAE:H$H|$莍H$H$H11O L~eIEppeH}HGp۾&d۾@ ۾۾@ ۾f۾@ ۾1]苵\=81qE@H HtLlH~f?tS~8Lb IJf.H}0wDM,EZHE81ҿf.Ht$f{`Ht$DftLUE1A|$8J Hc H LlE17I@{`rH=-nH%E1إ7I@lz8H@@(H^H f tfH f8  HH[1HH <H <HJ(vHtHT$y`HT$N H= VniOH$fDu:|$ujHWHsLWHcH8L[]A\A]A^A_ÐKPHWH(LWH(HH@H(|$tL>A}$IHut1ۀ=uD=1t 腇t[]A\HLHPHHrH@8H9tPP1E1H@8у`H9ɋPPtAA`D9tMHHH0MP0LO0Iz8LW0IA8LH0LA8LO@IH0Hx@IH@H@8MYLWHYLKIYI{L_1IJLQу`H9sH(  @I8H0ATUS utH0H@L L;`tqID$1HID$1HtP9t*HlH}Htߋtt;#fHr6)"GZtH}HHtHu8H9HfH}HHGLk8LH=CZ HSCZHHt}tH{tHS(HtBZ@<tTHS8Ht$HT$HUEHHt-BZH{tHS(HtBZt HK8H9J8tHH(H[]A\A]A^A_HNpHEf9uCHYf=t'f u0HHt$>uH8HHCf8u-HH{HCZ<HEZt˃HID$H}Ht7HEUTLHE8 Md$0AD$ZtI;D$8uA9T$TuL{8IMF0MtA@ZtIF8I9@81LLL9]HEHP@HzHr1fDH(9H D,tL9>uL.f.HK8H9J8 `DHEHU8Ht$1LHT$HLQCHt HID$H{_fDH{ T$g]T$tHCHX$1LL|$HEDAFTA9@T1LLLD$L9LD$u1M)*)/fPHJD$?D$@B03*HT$DA`RT$EHD$HHz1P1LD$LnAXAYhD=UE1覄\0;(HE1H\$LmDe(H-D$]fDfuwEDDH=-b59$6=AŅD%D%juVD$1E1'HH(EHs(HBHA|D$#LBTD$(IpI@H vHH98A@(A@ AHD@$L$L@$D$DL$ IH Hlj)A)AI8LN M1MIERtMD|$@l$DL;L$@tnE1MDRI D)DL$ AH$A)@BH9Hr)HL2MItID|$@l$DH;T$@uAH$A;^D$A;F qD$(1\$(HHHDCINHHHH9u\$(MLDLH\$H\$H5N1]H-HtH)0HjH밿NNH\$H5HHMAH$t}IhI LL$ :%LL$ Iƾ|$(IhD8XD$HAF %AnIFLljT$,LD$ LD$ T$,IpII@녺8S4.1DRH\$Lt$TLD$1RAP ApDD$4H-HhS0.袽HSJM莽=ATIUHSHt2Mt A$Ht EHt[]A\1A``_V V藾t`_V V1wff.USHHHT$ Ht$H|$CT$1=VHT$1IVHݪT$ 1[VHʪ1$rVH趪11VH襪1VH葪1VH}1VHi1VHU1VHAs-i-7$_5$9 #.d31u WH쨚/1WH觩H@4f%f@H'Hh H=\LHU1WH`HaHh H]=\HU1WH2HCHh H/=\HU1WHHHh H=\HU1XH֨HWHh H<\HU1XH訨H1Hh H<\fHU11XHzH Hh Hw<\8HU1GXHLHHh HI<\ HUH1bXHH߾xXP411H߾X꧚H߾X H߾XH߾XHߺ%VX1诧I%VVHEH1Y芧!H߾Y1gH[]m!P4ff WHcWHVHf.}WH3+fDoWHfDbWHfDPWH륚fDPPҶP辶P誶bH@xHt xh"HHHHiHH[]A\A]A^A_f.Prn P^X PJ P6Z P"f. 
Il$`HRHHzT$,H҃1E11Dt$,>@HmH9;f}uIwH螰HAHIl${(gk@HHH{ H[A H1H|$P111pHHD$P0H{0`Hl f@HMHvH|$0#HD$@)D$0D$@.drHH~H~%E1A_HL<wHMD$,HHz1ҾLIT$@J1ҾLID$@P$P  P1E1H^~HH P’P讒X P蚒 P膒Xi Pr1H@+F]11ҿHLsE] H*su11ҾE]11ҾHrE]5_lHr,`HLdebug_lninen$Hn輛11Ҿ<YE]HRr 5kAvHLdebug_a.`fDNoH;o9obbre)o$HLdebug_rJ5kHLdebug_i`Hnnnfon$5kHLdebug_l,`dpHmminem$ߚIkAsΟ`H8l6langefD 2ll$4袚= 5k~= 5HLdebug_a`mddrHll$Y t= HNa`05jH0l$5jHLdebug_l+`Hkockfkk$JjPAjH[aC]11ҾHpGC]= HpHM11C]H.p t= HN11ҾB]HpDB]11ҾsHpB]11ҾHoB]11ҾHo{B]11ҾHoaB]5iHow`fogGevkabbrfkkk|k$)$c11ҾB]5h`Hofo$jlinejrjjj$11ҾA]5lh`Hnfo$Bkinfo?k$kk$Ǘ= HM11TA]11ҿ-HWn:A]H#n 11ҾA]H;n11Ҿ@]11ҾH#n@]=o HmC/HD11@]=E HmgSHD11@]11ҾJH}mp@]Him u/)DEuDE D=wHM11 @]11ҾHl?]Hl?]11Ҿ`H!m?]11ҾHl?]11ҾHl?]5BfHlw`foDevhabbrfhhhh$)$芕11Ҿ'?]5e`H5lfo$glineggg$<11Ҿ>]5e`Hkfo$ihinfofhKhCh$11ɿ>]9Hnk@=HN@=HNv= HN 11Ҿc>]Hkf4sn`HIeGeangef De0e$5d11ҾU=]Hj]11ҾV\HH$A^H1H|$* fUSHH(G t EHH~_HwH}-Hc1HH  fw-fuyH$HCH([]@HHf9~@HHSHHFHH9ufD11tkz%뙺ksL_% pH=qHHHδHHHHHpH]HJH7%HݳHʳHHtx-HuH=&H[@G{fDEHt H9-t[Hn8WfrHXHuPH[HC3VfwDx( t%f.H@H93@uf8uHVHE1fD<'1e6FHQ3HfHE1H@HHp蠻QEt@1Ҿ(H-Q/HH]Hfv(H[HueDH[Ht fwIH-1E1YHG H@HHcIGD$ AGMEu110?d 1Hž(LO/HEHrHE(HEA$f t-f ( ID$ f8 H@L`fA<$  AD$LEB/Ht'LhAUf0Rf2f80NL覔H=HEHHh(H-HDLDL$ IwDL$ DH$D$ E15H|$dD$aLVLD$`$*H=޺"H|$`z1=H,=P =BHH9-IA fI~ Pff?,HD$f8%VH9f?IHW HIDIzLHHLT$}LT$fDAQ$A!IH1E1IDIGHD$`E1Ld$0H\$@Hl$HLHL$ LT$8MILHkH[IHtH蠬LL9~H|$HL$ MHl$ LT$8H\$@Ld$0Hl$HI~Ht$LD$@LT$8HL$0.HL$0LT$8LD$@fHIXJfv(fLt$HPH4$f:Ht$tHt$`HL$WHT$XH6HHHD$IvHFt3D;nHHV0Hk(HfDH(D;jH9uI^%;FtCIVJABHHDD(@ H@HhI^HI^u1ҾHIVHu%1ҾH딺n@4%;Bts]I^H;D%<Ht9Pt'1H|Ht eHn;9XwH[;H=[;t\PכH-<HTH'D.rFD $LT$L\$`AVIvDL\$ LT$DL$ H$DL$LT$L\$ f*u~@H4$L\$FA9DLT$ DL$(L\$H$LT$ DL$:wHL$(LD$(LL\$ LT$DL$H$DL$LT$L\$ f,H$H@f8*y@CLT$DL$͠DL$LT$BDLT$ DL$!H4$HH$aYD$LT$ DL$DLDDL$8+HDL$]LDHDHI>LDDL$1HDD(c,IL1HSH@H $LPHlH@HD$@H]H$LHLHI;MH4$LHL~IHBf8MIEHL`HLT$L RHmH RLH: RLT$it-" tXrsDj6OA8DtVHt :EHA$Ic pfr4BuL)ILHcDE1Hٿ].lIHDE1H¿^lHH[]A\A]f.f"It$uEAD$Ht$ Ht$ H׉L$,D$(NJRHE1E111HiHH[]A\A]fD빺,"ff.fIt$H购H\E1E111HIbtIŋ%=HcjAT AE111LH{A_XaDt|U1E1E11LVsHf}HHgyfIt$HFD$|$LDfA>HL$I~f9PffD$u D;=L}DL$HT$ LD$`Ht$HL$XL譭HpLT$XL9LH;D$` DL$Ht$LD$`HL$X1LbHT$`Ht$XLD & IDI~f?:_6HT$ Ht$LD$`HL$XE1LIv1ҿg IVHMDJjHL$pHT$hAHIA[[I%"zHt$ 1E1E11L)F|$HD$XLD|$0t EALH¾`DIMu$"sz@D;={I~DE|$DLHMHIoMtLL!D8IH5nLLT$qK\HT$HALDHE4LT$f%fu E H5dnLLT$KLT$HLT$LT$L;5IfI^f;C6A9D$DLEE1E11|DIHD$X@A9/AI^< 7|$. 
SLփMC!LPLl$XMfD Mffd+A$ f>+HEDhID$XA8+L$E1E111LCHD$XDlLDY9fA<$HT$XAt$6D_IHD$XMV""wE1E1111LBH|$E1E1111HD$XBMHD$`AUf.Df0AEHE1j11AL`蓱Ht$pH(A^A_tvAEHE1j1AL`H KHt$hH(]A\AE`œHt$`HHHH\$`DHt$X1LY{Ht$`LG{ME1E1111LAMHD$X|$Ht$X~uI~f?6yHHD$XIFLP^MfDHt$ E1E111L A|$HHD$XHXLD1ta E1LHډDHI#"XuE1E1111L@|$HD$XMT$Ht$XL"M\f.Ht$LD$`HL$XE11L账DL$HL$`MHt$X1H I|$t#MtAE%=0tAED9JDΚHHt$LD$`HL$XE1HL>1HDQ` HIDjDL$(HL$pHT$h?HI_AXf}f;*t DWHHT$XHD$`H9t H9< tHDHD$`AZHh^L$}H9HDwh^IDGT Ht$XH9tHEHE1EhPjL$8LHIH Ht$`HDLIfA> HD$I^f8` H@@2C f[61ɉD]V # IFE1@XHt$LD$`HL$X1L苤HT$XHL$`zyDYZ1HD$XHT$`DCZE1ɀ|$0HHD$`DD$LHt$XD%D3LHLHŤIDI~f?D6Ht$LD$`HL$XE11LģDL$HT$`1Ht$XMH HI%"q1H] Ht$LD$`HL$XE11LSH|$(E1E1111!D |$0HHD$`HT$XD64 )H|$(>Q#"HD$(HhHD$HPfZ4E1E1111L:Lt$E1E1111HD$XL:DHD$`苕Ht$XHHD$@I~Ld$`LT$f?DO6jATL׉މE11LT$A^_LT$HfDHT$ Ht$LD$`HL$XE1L衠DD$HT$`LHt$XD HI#"nHL$(HT$MLH IfEHD$HXHDH9sHL$(H;AdCD$Xf; C6D$ =4&Dd$ DM MHT$XDH]AM ^11Ho:HD$%HT$Ht$(L$H|$HL$`E1蚞Af= t HD$Lt$Iƃ|$d|$LDHT$ H|$LD$`HL$XE1LE1ɀ|$0DD$LHT$`Ht$XDDImL$E1E111Lb5DHHD$XUHI8_&"ifIcPjP^|$Ht$LD$`HL$XLD$0D\$(LDHT$ E11FD$0AT$D\$(Ã<w=D\$~LCQHMD$ DDPLD$pHL$hsA[I[^@"hIF f8-HD$f8#H@pHRH;itH;it H9i8""ZhU6U4H9$"$hx&"hfHLH|$XIHD$XIF@2CLLt$HD$LH{1psHL$HH{1YsE1LD$`HL$XHT$ HH,@DL$HT$ LD$`Ht$HL$XL'|$0HL$`HT$XD6\DHL]MjHD$f8W6HD$Hxf?G69H|$X>HH@HAGHD$mHZH$P [ @DȌI]D踌IeD訌IHD$XIc< HL$I~f9PfHG fHD$f8CQ1HA3Dd$ HD$HD$Lt$ILd$LI\$H QLMHD$f C6D$CAÈD$(D<EAfA>T$1DDH D\$(Ht$LD$`HL$X1E1LD\$ ږHt$XHT$`D\$ ~ z \$AHDLHD$XHD$H@H5HL$`HT$XD6)ILT$LT$LPG|$8|$D;=e1H7HI1HUf:fHGf8AQA9/I~H\$(H$HKUH{f?l DG6HT$`Ht$XH$MDcID$,!LT$`HT$H5 ωL$rL$HHrHD$f}1 }6lIIcf @AFfG6D$(|$|Ht$ 1ҹLc.HD$X<  L$H|$11;.HD$`]HfDAM AEf0qf*}IFHhHPRIcI}H(dI}Ht$X(#Ht$`LfHt$X1LfMHL(Ht$XL(I~E1E1111D$0H_,I~ 1E1E111HD$8,sHD$@DT$0t$Xf;[6DD$0E=QHf}H#=QfD=QH==AvHL7I^HCbMvA>AvHuIFHf~IFHH@@6 f9 sHL$HqHc HH9OH%HH9:HD$ IFHHt$X=STH@DHADL$80/*=6TLHD$/*HL$ =TLI.*H|$@HD$ H+wHQHD$(荻DL$8HD$(^LD$ HT$5SHAq(HL$ H9t HH#DL$HL$MHT$5S]:(L9t HL#L5QN޿F7L肵M"&"QE6HG/QD$(/QAH|$oQD\$eD\$赣#AݝxHMDjLD$pHL$hD\H-PHD$臻BLD$pHL$hMDD$E[HPHD$ LDH(DAH(AYA9AZLT$D\$HLT$XLT$ IcI~<vJ%"Pf?P_6HT$ Ht$LD$`HL$XE1L@߁ 裢vAHD$XƄ$Ƅ$fD$1L$HDŽ$H$HD$`Ƅ$Ƅ$f$HDŽ$H$Ƅ$Ƅ$f$HDŽ$M4E6H#"MOLLt$HD$L#D$DL$HL$MHT$X޿]%HNHD$($T$L։T$ (T$ H߉D$(9D$H=LT$(M,Qfta_6LPt1I~Hl$XHÿ9[W(IDxHXHhحrJN.,Q",QE6HE6Ht.&"M$&"M@rMHD$ D$0E6H&E6HXrM!"rM!"^M!"JM!"6MAHLT$ED\$D(DAH(A9D\$LT$r EH]LT$%HAAt1AWAVIAUATAUSH4PIf7tFft(f4t"DLL2H[]A\A]A^A_I~DLDI^HOHIHcL9HD$ffMt$LH+D$IcLL$H9HN1\L$IFHPHpHPHx1Ll$@Dd$81ۃLt$IՉD$(HcL|$ HD$0IcHD$IHD$Mt$HM"LOIVHt IH;D$~uAD$;MtdL;|$IH;D$HL$ 1Ht HcLH)HHD$L)H9HN1LHD$HH@H};XtHHL`LhMtKL#OItH9D$~ID$f8HHH1fIcL9|HD$1HHtHf8uH{L$jOHHD$~HML$HHD$0t LH+D$IcLH9HN1ZL$IFHtDXEg1ۃL1Lt$8L$l$\$|HcE1H)H\$pLt$Ht$HL$IDd$xHl$fy&fHt$0HSDFDX4BAHSHvL$HH)HH$H1H1HH!H?H?H$$A@H$t@D)HHL$H$DŽ$IL$HH$H$HAH$jAjjLH$XUH$$fo$$)$H$H E1L$H|$ HD$8IH@HFD;x<DHHHXH\$ HXHRE1LH+L$0HD$ LHH4L$(HDw1T$8Ht$ D)ULl$@Dd$8LH+D$IcH9HOHct$xHcD$LL$l$E1L\$@N6AHt$(E)H$M)MMEL$LdL9MHD$(LL9EHD$HL$|MH|$ LD$XLL$PH4HD~LdLL$PLD$XH\$@AMfID+t$I)IL|$TEvLdMMAtHD$HHt$@HH+t$pL\$hDT$`LD$XH<LL$PL^\LL$PLD$XDT$`L\$ht@M9gHD$(LH9VL9M\LH+$HI9HIN1HH|$ L\$hLL$`DT$XLD$PYLD$PDT$XLL$`L\$h1DML$;HCf8HS f:H}Ht$0DZ4DFPAB HPHvL$HH)HH$H1H1HH!H?H?H$$A@H$t@D)HHL$H$DŽ$IL$HH$H$HLH$jAjjAH$ UH$$$Hcfo$H$)$Dh4Hs H fI~HCVAD@B fHVHpHH)HH$H1H1HH!H?H?H$$A@wXH$t@D)HHH$L$4L$EعLLU‰$uH$E蹀LL\$یU$L\$t1!DT$xL=L$EعLL藌U‰$fHH~_X9XRHt$1DLLD$@A9/Ht$LMLD$LE f1fE1LfsDff.IGMW HD$ff$(E1HD$IGE1HD$D$8P ;l$8 |$8HAÃHADD ZfA:w.AEt AF LVM|$8Ht AFLD$T$8HH|$LHL1HsLH1 pHwfBPv FV4fAM4ff9VAN4ff9C|$8Ht A2E0@u3D<Eu'=uA f5 AF1LpH\$8LD$H|$HƃHLEL D EHHHH!/H$AE 
fAEu=2u5uHD$LLֿLLpLH1jHHT$4fLIHD$f8D$FfD$8NUIƋD$LL$HLD$0vD$$|$1LLHIDT$(LL$P1LD$8E|$LLH|$#1eH@LLg\MHBO\$XCfenH$LD$#HL$PHT$8Ht$(͵DL$IEft$$|$(1Ʌ1L$$D$(Ht$Xt$HPLL$hLD$PHT$xHt$`H|$L3sH MLLL$XLD$@L$,HD`Yfu f8^Hh[]A\A]A^A_1H$eHT$@LgMH[fDeof}ffA}fxE9 |$MHLHqD=ԸED5dE؃eID\$EsfAhAfE9AE9A9@H0L |$MHLDK.fDt$$,H$HL$PLD$#HT$8Ht$(SL$$IƋD$(Dt$HM MLDHHMILD-GDff}efA}e|$1HL/Hn|$IHLr-DT$$L$(1D$$1D$(DHh1[]A\A]A^A_|$$H$LD$#HL$PHT$8Ht$(;IwL$$|$LL荼HILL$PLD$8L$(fDHD$dHD$ f.HD$HEDeAgAefHD$ HD$xfL$ $M$L$ 1$ȸM$fD賸M@L蠸MAH|$PH3"HL$PH|$MLHefDH|$PHH $"H $Ht$PH|$ML0H|$0HH $"H $H|$PH"HL$PHt$0H|$MLL'NLND<t<tLHLH[]A\A]I1HfLHH1[]A\A]ú3rWD 1HSH΋ Bw{HҶHf.f8ttG;H[[fDHXPH=Zt&H[uDH޺[jf.}H1H[1Ҁ>ƒ뇺l2ff.C;I|$HDEHAD$0AA1HL$@HdFHL$HLqLL!I9kE DHAD%PD%H=H=A0@A @)I<$HDHEDHXH?C;tiAD$1uaHIHA|IHItIAADL BHL$DHAHL$HAHL!L9A DHA2A|$>t Hփ HtL JBu@L 8BE1E1L HAAI $HEHDHHL$HL$/DHyHHL$HHHL$>H HtA0uD%D9fHQmH?M@ 5M,$Mt,@AA H޿P`1Af;HuHCHPXHHT$t5mL$L$HD$L$LkH=?L$tMmAEfL$1H'?L$AAD$0tCC;===YPlɗfAuAAE AD$1Ny0BDy0@oADHLODX茧If85DD$HL$pHpf5HvACH$D Á|$ZL#H\$`H I EAM 6H\$H$H{ϋH"C0\M@L$Af.tf5kI@HHtBwXH5pHHh HHHBD$(EH~H$跢+IF Hp Iv H$nH92H&D@u @;HLjL$(蜢HL$(:f1D\$(H$HF =H$H5nHqHHHB]L$H}HHD$!H8HH$qP%lI`lIH=710*1HH$HD$PYHT$PHHx qH9p8gHIf8.u6f9.u0Hx H9y t$fDf9.uHy H9x uH@HIf8.tH93H$H$Mb$;$HM H 6gF1IT$@6ID$@HHD$`%l@C`gl%CHH$HHD$(HD$(HHDy0H$u A8D$(uH詢HyHL$P椋HHL$Pt\H$̺O#lBE1HD$`HqHH@I9HXL;hNH$1HHL$PynHL$PuH$HϺ!HL$PHL$PH Ht^A0H$!!HyHL$PHHL$P*H$HϺHL$P輜HL$PH HIu8Hd\$(~A}e1z1H̰H9̐9uH$pH9$8@BH\$(H/@Lt$`̠H1HD$ H1HD$(贠HIM1HD$0HHL$8輰Ht$HIl[H|$HjH|$ _H|$HjH|$hE111Ҿ0 vLd$hIHT$hE1E1LL0BXHIHD$`f87^Ht$H|$`1AsLL$OtLL$Ht$`LZH|$HiLD$0HL$(LLp]Ll$HLiH|$0^LHiHt$8H1Ht$8HLFRHLHIZLH~iH|$ ^LHiiH|$(o^LHTiI}0HtO8HAEHx[]A\A]A^A_ÐL$IvHHT$*GDML$HpLHT$@lzilfff.ALuf=B9@wHHpHHL`8HhIGH4$HHpZOsD1'A5A;HUp@0%DD$ Imt$1ۋ=8vHIL`8HhD$AD A@ALD$!L1*IIG8ID$H|$E111Ҿ0 vH@H@ HD$0}IHlHD$I@H4$1AsIG@HHl|jAHllrHL`8H@ HNlHP@IWHPIG/@ H@f8 =o4L)IIG8D$LT$T$kImt$=/42rHIL`8HhD$I|$NH@0LPj1e H|$8;rt fA<$#t9D$I|$$NH@tL j1A$tIT$XHtHu"IT$X1HsHID$XI$Hj8aH D$P 'l p'l n'l֡ff.@;@>HtMtAt$H|$(JtAOH:M1E1I Ht$pB>E1sIL$H:9AB>DNALc|$HC0M_J|H E1sL\$ DT$DT$H E:L\$ D9A>DNHC8DT$JDftffHPHz f9H9\$HT$ HMHT$ tIF@HrHLFGIuLH9GHS8JDHD$D$@H@(xHD$@fEI|$ fH9\$KHHMtIF@It$QfD|$@H=81L|H=8L@H8LHx}ҊHIt @  LULH-8H8QWHI|$HE1AQLHIGIH-]8H|$hH H=D8 L1H.8AHC(1H4MxLHT$ H4SHI|$HMHT$ IH4LLMo*H|$hH]HC(x4= :HHLh8HHC H=Y4LHEHk MH:4LI1Bf.H=4 L\LE1O!LH-3H3QHI|$HKH-2I9fD輭 }8}HEHt x0 o1LHԂH}L谸HtH@!LH]}0 LmMAE0ƒ!!DU0u 0H}Lsf.HrH|$( *HD$H@@H:P,lUP,lUP}.lUPx.lUP.lwUP.lcUH=1L苁E0 HP,lUPF.l UP:.lTP.lTP.lTP.lTP[.lTH\$HC(P&+lTP++loTP+l[TPP+lGTP.l3TPd+lTP/l TPq/lSI}LHtH@:LLI}HLmM A}0Q1A}8VI}L讵1LLHHtHE!LLCLHEHtx0 t H=HHtx0ux8tHxL&HH=\f@\ANAHLP,lRP,lRP.lRP.lRroRP+l[RPW.lGRP).l3Rff.HtIL9cwDH[]A\ff.?f.sfUSHcHH@aHtVH DHHtD{tHC8=tA@H€uHp11HӆHCHHuH[]DHp1H1҆HCfAWAVIAUATIUSIHEHHt$Hl$G@H‹HSL HH{HEHH]HHsHT$uD$(%=@LKLAP1111HH{HEfLH$4قxt{H$XHEH{HE^HH=21H{?DL}HD$HĨ[]A\A]A^A_DHL$HT$MHK0HS8tyL @HHthH;P8uH;H0uHp(H9s(uHEH{HEHH=1H EH{@HD$H9ufHH9txH;P8uH;H0uHp(H9s(uHEH{HE@HH=1H{!:IHKIIL16MHt8Hu3I;W8u-I;O0u'IO(H9K(uEHEH{zHHDSHXHtH{HHHtH{mHeH&߬H[HHHڬHˬf.HcH aHtH8H< a@H<@aH< aff.@AUATIUS@AHFKHL`v?@HcH@(@hH aHt/HH aH[]A\A]fEƒPfH@afDAVAUATUASHHE1AHA<:ttugf.t H<:uH9tIII)I~ FLHHIB01DDLHk;tCH<:u"|FI@[]A\A]A^AWAVE1AUATEUSHHEH8HT$HL$Dd$pԂx AMx2IdJ<DHt$HEDHt$HHt11HHHHH5}H~HHFH11EH$H5EEHHHH:H.H5EHHHEH: HH8[HH]A\A]A^A_KE1nH|$D$H=j\tH=\jHEA2Hl$(D$ LD|$ 0U1ɾHHH}H}tEtHEtu |$xHuH}UuHL$HtHHE11͆wfDH YjHuHkH #@- H@HtKHSw1HH9yH= uH#h HH=g uDHj H"+DHCHIUHt|/u ADHu1L1ʆLHD$ HD$ fHl$(fDHHNHY1oL5@MPHuH5Y1L1 ʆHuIH!LHD$ HD$ /f.HlEA2D$ D$ Hl$ LAHH}Hts}tEtHt}uH6Y1\ uHuH5Y1H|$qɆUt@tcU1HHH}HuHl$ H=4YHL$HtHHE11iʆH gHuHl'H=x1ҾȆ11HHD$ 
'HL$ IHH&jHcH@a@HD$0HD$8*HCL{IHT$IH(H\D$(HChHCLsHD$(HCH(HkHT$HD$HZH9*t L9c@H\$(H9H\$LHLm# L{0fA?$AwH|$rHD$<H .<IN(vHt I~8t H HHAHH H1fDC@t2HD$ HtHT$(HPHT$(HHX1[]A\A]A^A_fDL{@fA?t,HT$ fH{0\YIT$LL>GVIAUATUSH8HsHo8W\Ht$(HDe T$(J8IDžubD$(IGAHH@H=p-|11HIt AeIEMoA_IFHHH@H贜tAMHH8[]A\H8[]A\A]A^A_Ht$(H|$ LHc6lH% @IwLl0HD$HHD$HHLHL$fLt$L胢 ILt$Lc uI1L蝳l1L莳lf1H\$H% t2H1WH\$H t.H1,HD$HHD$HHu!HD$HHuXH=Xf;t.HHu:f.H@0H#@ZtHX[]A\A]A^A_fDE9E  ~׳LHALЄ117H~ܔHgHt*@(H([]A\A]A^A_ú~衔~荔=x%1~%H|$pE17kH|$pH*kH|$pH}HD$3f.DUSH1H(H-,H|$HHt$D$D$=qH{Ht$1-qT$Ht$HHHHt H([]D1ҹ1HT$oHRHPHH([]ff.WAVE1AUATIUSHHxHt$HL$HAU\H$@H$@1HDEtLxIEH$@HHHEË\HD$8HY$@HI8HD$ I}H(11v\H$@H$@nI0THD$0LuD$PM]Ll$(@In@Mn0M@4MSM'1MtAD$9<HD$HIDH$H$HAFZH$HH$@HD$HHHHDfH@kEJSHD$01HHt);Hs$HT$HHTHtCZD$PHt$8H|$ H$HD$@蘙pH$IF@H$LT$@H$L$HJHLʥH$HtGH9$uIV@BJ!1HLHHL9ttH$@H2HeD%E1efCZLxD$PIF@D$@PJ HHD$HAFL$@C1ЉT$PH4T$PH|$L$@֋H@ ɈPX@ ӉبgH$HX(AFZH$HsIF@Hp8fDH\$ HHH@H<$׋f HDŽ$f.HD$@HDŽ$HD$@1L耤D$@HHf.IFH@HHHDH@AFZLbD$@IF@HHp0Hx8p(p pppH0HkHXd@Hx0HT$xHL$oHt$p HD$oD$@Ht<HH\$ HHHL @}IF@SE11ɸH@D$@HXe>IF@1D$@PJ 1L諢HIF@PJ EkE1t?O1AufDf[FM F F lFDAVIAUHATUSILHH̍I\$xAHtCPtHHH-H[0HuI$HtHHH[0Hu[D]A\A]A^HsLHqA ff.UATAUSHHHLwHt$L$$E(E EDHuHL$D@$H@$D$IH HD)A)ÉHL,MUMIDtRHD$LL\$8DL$0L$(HT$LT$L(Lx@LT$HT$L$(DL$0L\$8HuE1DALAD$I E)AEAED)؉D$E$f.\$E$L9rD)L<J<>L'MIt,HD$LLLLL$w@LL$uE$HufMLDIALHID$HHI9t2LHLL$(HL$uHL$LL$(tHL\tL}MHHL[]A\A]A^A_D$$ M HmIE@HGKvHH9LuDT$$HEDIEL@IBLhM9LLL\$8DL$0L$(HT$LD$LD$HT$L$(DL$0L\$8%LLHT$L\$0DL$(L$VHT$uHuE1L$DL$(L\$0HUI@E1I4u5DfA TfALUf PIHD$AHa@u!H5HHNHyHH!H9umS$@tHbDHxpH[]A\A]A^A_8@JT1HcHމD$ H|DXD$ JT9HAbHxXJ rH bH5HxXJgfff)Efu%H5HHVHZHH!H9z Mf@MIFHAHHPI9HЈD$uKf5HHVHJHH!H9I_XH迻 w7DHaL47T$DdH5gfffHl)t:|$IB4aHHt@HH9uAN(|$ $ԚIBa $HTfHH9ueH$H$Ha(fDu/H5^HHFH#xHHHH9Efw8H] f;tHHHHtHSf*f,1ҾHH%H]Iń HD$!HL$ H fDDHDT$ L $L$L $DT$ uL$HEH5}ZAgfffHxXfAAAA)m_ܟDH5@HHFH#HHHHH9HD$DD$,H@HxXHD$l DD$,EiE1D$(1HD$H5 @H@HxXHD$kgfffDD$,)D<3xgA9wH|$LD$HLD$ApAx HHHH4aEff.@IUSHhL$Qt$(HT$$DD$SDL$RHD$H|$HD$@AD$H<ŀaHhIl$HHD$$D$,fM9Lm@LH;\$uHt$H(`s HL$H PbHFC <EWAVCHf9vALHL aM}M9dAH=L`H@Hpt$,p*H~HZH~|$SHH=JHk@uZH51HHC1ҸHHHH5H~H:H H)H~H~HH0H~H5~t$(HH~Hh0t$$pHC@fMLt LuL}M9WLm0L[L%}MIc$HH}HHHpHL$HH@Ht$L$$H}ADqDHt$@HT$8D$0 HT$8L}Ht$@D$0L aQfDfDHS,DLT$XDD$TL$HDT$0Ht$@T$8:>Ht$@LT$XHD\$0T$8L$HHB<aDD$THcHLT$HHDD$@L$8HD\$0L$8DD$@LT$HHUfDHDLT$HDD$@L$8Ht$0=Ht$0LT$HHL$8DD$@HB<atgHcHDD$8HL$0oHL$0DD$8f.u8HsHPHJHL!L9t!I\$XHHӟ  IcT$THHu%H5 HHHVHJHH!H9Ht$ 'HFHL8MHF@H@Ht I9LHqIcWHHLhMDIG MeH@(yCt8IG H@HcHDf8uHpL&tI}fMmMu.fHHA@HD$ HXXHk wD$,HHPH5?Hkgfff)ȉD$,tM^L\$fA;*,;ff.GHHNHyHH!H9tHXH  w [ATUHSHt []A\fDf}-AtyHH~HtHf}*u*M9PHH)H)H gHP@;D9t[D]A\f.HVHvfHhHXH@Joff.fuIH5HHHFH#xHH9t!HHx8HX^ HD$ aHLaHHt$XH@HD0nHŀaLAwHHl$`D$AHD$ ҋax`;D$|ZD;d$(QD;d$(HD$ t$894aaD$D4$D$HD$ D,$Dd$(aD$8U1!H5?HFHHx8HX# TfH=h0 urH5hHHFH#xHHHH9tGIEHxHX蛔 v)IEH5hHxHXv gfff)Mm)$Mef.UATUSHH/H=DHT H=DuHGIHHHL!L9u* u^H5CHHVHJHH!H9t=AHCHxXo P D9tHD)[]A\A]fDE+CH[]A\A]HEHHXHMo w=H=FC ¸uH5/CHVHJHL!L9tA]@HEH5CHxXn gfffH=B)A$ uH5BHFHHHL!L9HCH5BHxXn gfff)ff.HcR~H5YLcfBHs8BfHƉr Hs9uH[]A\A]M9AIH HHbB uff.I_HI9HH@ HXH(HH\ULcEDI,_f,`u-HmHcMUI,_ f96B`f*iU]%H\HI9ݍIG+$;$D؋ntt 9nLcl$Bat~v|$ethHAJPbtWHXyHL$XHymB9aBaBaB9acD$AD"d$(Af*f0Z H=cGHT$h1`H3nHL$XH| D$tD$E1fD Ht$ McAă D$@A!H0H HbH$ JH H9wD$@D$Dt$EBt$p1LAt*H=]FHtT$h1_D$tErH=3FHtT$h1V_D$tƄ$E1D$"L ^A2EH{kHt$HH@ H0Hf8,Lcl$FaEDH39wH=hCHT$h1\Ll$LL$8L.LL$8AUD$HjHHD$XD3=I^;2H=BH$1[fH=BH$`1[|HD$xL$,*VHD$0HHt1PH=ZiDD$DHL$86wHHL$8DD$DSH=$BHt$T$hX1DD$DHL$8=[DD$DHL$8D$t$Ƅ$D$$ D$@H$Hq\B9aMBaBaN$AD#d$(|$t0Lt$A~SZ Af0#f*HhHL$XH|D$E1HD$H|$x$H:H6=7NH 2Ht4UXD$9M$9M$;MvfDHD$D$tbH$f?* $0tH$f80$`褿'H5X^0y Ƅ$A  
H$1J~E1&该H=~1HaL$h$1JET$ Ht$DdEIIf.HD$f8,IuL`艋H THDmHH@t fA<$*L,-H|$H/:L|$ I?҅H!IsHyiET$ DdH II-)HL$@HD$HH+LT$QT$R}Ht$H|LT$T$_fD(M}LIEHH@ HL983H=A?f.Hp f>tH!HHL~Af,uMAf*HFH=wf8*G;HHHLHH/AQL ~mDKrHx 1HAAt6AJ aHH9t6H H IuDMH REI yI aHH9u1AtmADLfDHHH~p9fDTt EH|$ HHD[]A\A]A^A_ÐH9H|$ p謦vHHx db^H=$m$ACH=w@H=H /LQ MJL H@Ht HPf:uDZ(Eft8Ht$u&L% D袽 H|$AԅH|$&efLH4$_L9IH4$B%0H=H=w H5cHHzHVH#zHHHH9t]HxHX H<vt7HHt H@f8*tYHLT( MfDHNf9*D9iLT/ fDZ1)uD9hu~ff.uNH5 HHFH#XHHHH9t#H]XHg6 D$H=qZHtUT&1sHEHHD$ naH8HH=ۃ LgHE@H@H9AHAr  fG;9Zk H$ZHHW HPHlXHYHSHt WL$LHAؾ'1rH=>fL fA? L觠|$+It5Ht0IGf8*u&xv H=1ҾōH@HɂHp@HLMYL M|$$DHvHHcN DVHCHHH)HMMcs`fC96sES`vtfH=":v5DDHzƺ(A tHHr:6h@HH9uH1HtQPH:6bhf.HHtWPe0Hh zH=QH>Ht`= zHH zH=eHRHt`u? zH:[;~f+~!fDHGHHtH fDHHHHHuHfDHGHHH |fHHHHHuH`H-H9AI͋ IHMƍA@LzIAIE<$wfMtK Q@KHCHHvLLRHHtA<$HV7A$Hĸ[]A\A]A^A_@A$r@BHCщ3HHHzAHL$I;ILɃIHL$H|$H|$1Mt< Q@HCHHH ۑtLLuHHÑL=ԑL9}D-D5D$u D9mH]HDC%D9EPKt$DSHDL8L`DhDp@pHH HzaHCI:fH{׎aHHCIHt$ f{>DDHL$6\sH}HL$HHHH4DGT$ DD$HL$|yT$ HHEDD$HL$% DCEPCH(aN7DHa:DH{׍aHHHCHIHt$HT$HHH$HD$ H$HD$HLxaLIMt 8/L`I$HD$ EL$H=DH$~%AHLItfDHkCH9ƍT uHt$LD$ 9IHLD$ HHH  A@UHCH%1HL$请yE1HEHHAHL$0fDAHEAC1DD$(HL$ DT$HT$HH4GyT$DD$(HEDT$@HL$ ֋ EPHPt=ApHtHHDH=H{ HE=H|$@taH=HH=XI>:H|$@uaHHD$0HH9 H|$@uaHHD$HH90H|$H H|$@vuaHHH9 H H|$@UuaHHH9HK H|$@4uaIHI9zMM.~S=FWIE(HHD)D`1LH|$@M.HI9IU(HDuIMIuHpHgH|$0"D$0HDIEMo(LJHǀH@H(HH$HY5H$Ht'H DrPH@HuEQ+HE@Ht%H rPH@HuHm8HwA蟛yHH=H@HH\$Ll$ HIH$L4H$HtfHHpHHHuAEQrMe@Im@MuWE1&?H% Mm8M3HJHLl$ \$H@HHBH|$H@H@@QHp@HEc Lt$IIHuIFA:HHD$uHHL%MMM9u%I$HL9IA$:uHT$HLAt$fAD$Hfw65hAfAt$'5 AI$HOI$HH$Hlafo$AEPfo$AE`$AEpH=}uHpyH=\uA$tӺ <AD$@xAD$`vg$7D+|$|$ & Dh;D+|$|$ Dh H$haH$DŽ$!H艔$L%l$M Il$M^@ID$HTmMHIH9wLL\$ 9Il$L\$ ED$(AD$ EI4$LHD$L$D@$IH HD)DA)‰HHH8HHt I91IAAH$EIEMH A)ADAA)A|$$M9thB H9Hw)HHLMIuHHD˃!@H$AgaDŽ$ H$A|$$Hȋ|$xMI@ HHAA)hE1LH|$@IDIL9HR HDueMI@H HAA)X hE1\$H|$@faH=d HH= 11(}H|$@H/haA9ʼnC S H|$@haH|$@H haLHCH|$@HCIDIL9H@H\`\$rDLQvM4$M.IEHH)~|$xHD$ \$H98u~@8xuHvHxDL$LD$ $HT$w= $LD$DL$uHT$IH8L[]A\A]A^A_@HGKLmHH9DL$T$H4$Ml$DL$T$L$mfEID$EMIl$If@H>H98~@8xxHvHxLT$(DL$$T$LD$L\$ $~<E1 $L\$LD$T$DL$$LT$((E1IIHf-0HC8HC@CHCICJHCPĬ@@)1A))AD1))1ƉA)A D1))1AtHsHKHT$ LF.fMHSHsLH1H8% M'HT$Ht$LHH8tˋPDD )D)D)1ȉ)A)EA1DD)D)1D) 1)1A)A/MILH|$ D) 1кD)A))ED1ADDy7D) D)1Љ)A)E1։A)) A1DD)D) 1Љ)A)1։A))D1‰))1A))A D1)1A/DDy7D) 1кD)A))E))1A))A D1)1AHDBDy7DD) 1кD)))1)) 1)) 1))1A))AD1))1A))A D1)1ADy7D) D)1Љ)A)E1։A)) A1DD)D) 1Љ)A)1։A))D1‰))1A))A D1)1ADD)D) 1Ɖ)A)1‰A)-DCD DKAD  @HC1Do)D$ o@D$ )D$0@ ׉  袚|$ 蔚H|$(¾ yAaSD׉T$fT$AƅHcsH{D yAH{DD) 1AA)E1ȉA)) A1DD)D) 1ʉ)A)1ȉA))D1))1A))A D1)1ADL{pLַDHLN yAfDԘ{f.<${7@ D1莘AHs H^HT$ L. 
MHS Hs LHH8xD7AD1AfDD1AVfDD1A.fDD1֗AHsHHHT$ LY.MHSHHsHLHDH8xDAD1fAHsxHHT$ L.M8HSxHsxLHԿH8xDAD1AHsXHHT$ Ly.M HSXHsXLHdH8:xD蟖AD1膖AHHHT$ L.^ME HHLHH8xD&AfDD1AƋ$#C%C;Ht6H[L|$H^6LHL`.9T$ID$D|裕Aƃ<$#9D1膕AHHHT$ L..M HHLHH8xD&AfDD1AHs HtjHT$ L.M HS Hs LH耽H8VxD軔A+fDD1覔AfDD1莔AHs(HtjHT$ L.Ml HS(Hs(LHH8xD;A+fDD1&AfDD1AHsxHtRHT$ L.MT HSxHsxLH耼H8VxD軓AfDD1覓AHshHHT$ L).M HShHshLHH8xDOAHT$ L.T$ ID$D|AfDHHHt$ L|$fT$ID$D Ty7)D)D)1ȉ)A)EDD)D)1DfAUSHHAHHXHT$ .=bCH= Ht#H;H H5 պH|$EE1EDHH H|$ƓH{XHT$ H~.tZEH=Ht-H];s H l4ȷHH[]A\A]fD1HHfDH{`t$ H`HfH[]A\A]H{`t$ H`@C:Dff.fHtsAUATIUSՉIHt H車uH@[LL]A\A]A$H`I}`IuI}H[L]A\A]KI}L T6HtAD$uCH:S@I}`IuI}H[L]A\A] KI}sI}T`IuI}H[L]A\A] KI},`At$HL[]A\A]5`I}I} LAS6HI}I}`IuI}H[L]A\A]JI}I}`IuI}H[L]A\A]J@I}j`IuI}H[L]A\A]JCU:C{:ߗAWAVIAUATIUS։HH҉L$l3H|$ IsD$8AIHl$ Dl$(Dd$,D$t(HEuAG@EuAGDAGII8A1IvH9DPHA@=DXEHH 1E9o@IvAADE1EHH E9gDIVAƒA@I~HLT$`I8LT$H9MVAH{HI1A"GJAAIvA@nHH EIVA(LLLu`AD$IvQ@HH IFAD$E9o@Io8AGHt IcL`E;gDEo@t IcL`EgD|$tm1pA1I9oPIvI~AÃ@HH I^AH`I9oPIFAtHLYIoPHH[]A\A]A^A_fDAIvA@HH InAPfH=)&dfuDI~HT$j`E1E9gDI8HT$AII 0I~HT$2`1E9o@HT$I8@E1HH E9gDA뵐`I~DA:/AGJH=a%zeuHzfI~HD$`HD$ LT$HLLC`LT$fI~HT$D\$LT$H`LT$D\$HT$ffDI~`pDIG8dBE1D1ffDD$MeIE0AT$uL`ID$LHAD$ Al$ID$HsP2HL蚎LH=jt H|$螷H=WH¾;1舫H=2Hp,NaHaeu%H=HD$HXHH=t H|$$H=ݼH¾q;1L|$IGHD$L|H=3HLx0H=zAL`A|$uL`ID$AD$ HAl$H{ID$`L|$HIm1IH /L6HIHHߋp`I$pt,E1DHHtA 1I$D9hwL|$HIwP0IGPHD$hHbHx0Ht$hxHD$hHFpHW`HD$hHtRI@H{1Է`H藷`RH芷`,1H蹸`H$DH$EHD$XuH$E111O7It$iHtpH轹`ILcH@(HD@E1EuD9hA|$DluL=`ID$A@(HAD$ ID$IAl$LcH@(HuTL诶`E1HD$xD$pH$A~xWDrLE~H4D$%xT$DAp11@虚xDEu='5mu H=7uMFt+Ft!FtFFHDHl$H|$Hu%FHl$H|$Hq~Hl$H|$DGHP=tWaHItAHl$HÃFFdGHDfDL$`fDH|$ xHD$(JHl$H|$+GH1@Hl$H|$7GHf.f1<:ti-H$I H7GE$EuCH~>HcH9u6H$@at1Ht<:uCHH<:uHuHھH11:r1ɺ@a1f91ɺ0a1R9H_KuI1K IPL-Iff.0NĭU0N谭U0N蜭U0N舭U0NtU~0N`U0NLU0N8U0N$ff.fAWAVAUATIUSH(H3I@I]AJ/ $u f&f O E1fD$+HD$Im 7ff|$+IG@HHSHHHRHRBQHB@Hr@H$H$HH$uCDƃLtIN;BLrMp @MvMt1A%=#uAF;uLLD$DMvLD$MuLt$1LD$IdžL臈LD$LAHIHtfXA8Ht$PH\$`Lt$PHrOHLH[HHCH@@(EQHE@HU@HoH@ Hb8N8HD$PHT$XHl$`T15HL&LyAw1ɅHHLHHCH@@()Aw1ɅyH=:ztHH1LsHH=dHG @tID$ @It$XHtMDVEtDEB11Ift1".tHbH@ @uŎ@H=HtJHbHtH@Ht@@u/Lt$ʒ/LH5Lz;AG1'HT$I}NAOȃf $OȺHIO8fD$+Im HL$LHHCt H;ELp0HMIF(HPH;L11L|$Ld$Ll$ AMIIfDIuHڿHt!HtIuH}HuAMMm8MSIE(HtPt L;ttR0N苦@H% HD$ WJLpXL1H5HL$HAX>pHv H~uHD$PHD$XH\$`0S0N8H@ HxuHD$PHT$XHl$`0S0N̥DHD$ rJLpXEHD$ eJLpX&D$,D$fAI0HD$@HD$HH|$r@b$u fH|$-J,HHP8hHT$@H|$&&,L$,ʃHDщT$,H|$#,HQL`8MDAP<<$uAGfkHT$@H5ʳHD$@HD$@HD$0LcIHD$01HHx荅|$+HD$  H%IHHHhFHMIHMHIL=HL=LCQ^HK@1Hs@HtHQ Lt$PHT$PHt$XHH\$`L跛AD$QID$@H|$ Ld$`HD$PID$@HD$XHD$0HHHD$8h/HLhAEQIE@Ll$`HD$PIE@HD$XAPD< <$uAGf H=fH2!6H|$ H.HƺLHߺL赑\HL襑\HsLIHs3AE9Au sD$Jl1ML$ Lt$P1AALLbL$HMpEeAMD9At$ AuKlAED$ H|$&HIuhH$H9Dm]D9Au HMuJD%1Md$Mt~A|$uIT$8A|$H5]@1ɺAAHLaH$IHJDAA9ȍqrLlMd$Mu|$t31_AH5H训Dd$ H$JD111a_L$HMAl$E,$AA9u{ At$I\D$H$HH`-`BAWHLPm|$+u$HD$PH8$LH$HtGH([]A\A]A^A_À8$H@ Hxu1D$,D$D$,D$D$,D$BD$,D$BD$,D$I~D$,D$>iD$,D$?TD$,D$A?D$,D$*fdH1HD$HCQHS@1HK@HtHB Ht8txHt$@HD$PHL$XH\$`HLt$P1AAL_Mo@HD$@MH=L%!g6HIH8uH@ Hxu1v$;AGf3HL$HLt$P1dHQS1ɺHAALu^H$|$+QrqHDHt$1訔H$rNJHDD$=B=A=?Lt$P|$1H=R%6AH5JH"Dd$ H$JD|$H|$!HTHP8xH5ͩD$Ivt$,H=ռ6H$H|$QrqHD k!H[ H@8H$|$>QrqHDI_01HD$xHD$xHHD$p!HHstHP8Lt$PHL$pL2H HH?tHP8HL$pLߠI0N H đH5%H1裨HL$pQrqHDHiH Hx NbzI_0HHDHZ E1{NuEDk HPHLC8LE9DEB豜H LC@DHLHP蔜HH[HuAHHH5 E1H.5H=HL 6HHx$HLMH1躑HT$pJqrHDBpH=e16NHhyHHD$pHo@d1H=kAL86H5E1E1HH0 !5HT$pHJt#HLYHD$pH@9wf.AGfyH=߹6HT$8H|$ HaV%lzNqH$H4HO%ÉL$wHH$L$% ÉJZqDLLLd$I@ HtH2H1HHH9uD@ApDH@H|$~6t}D蟞qH$H4HDo%wHH$Dm% É]]Au|AGHL$HHL$0f0 H\$H|$H$HVHHL$1D$MD$ Lt$PAHDŽ$DC1HDH4wHtHMH HHH9uXsAHjH=$dtHHHD$0{D葛qH$IH4H#DoAwHH$Dk%D CAuHD$P1A|$xeqH$IH4HoAjwIH$ HP0E@pAuDIH=H6AoHD$@lAHDŽ$DC1HDH4RwH=1 6HT$pHpe@1H= 
6H5[E1E1HH05H5HnHLM1H谈H|$pHHt ygGH$QBAH\%1ɉt%E1H=`tH0SY0Nݎ1%E1jAE1A11(H=H 6H'6HH=ͱ1H$6HH4fNHC H-tHVH¾"H!5HCH6 kNHPHBH(tHH¾"H5HCH uNHPHHHJH(CtHH¾"H5HCH0XHPHHHJHP1Hx \@H A1(1E1CwH$IH(1E1wH$HH60S0!0N錹RSNՌR0N0S 0N譌SSN虌(11wH$IH<0S 0N`(1E1ZwH$HH(1-w1H$HH0Sq0N0N2fT0NfT0N fT0NeT{0NeT0NeT0NeT0NeT!0NeT0N~eT0Nje`UrVeT 0NBeT0N.eTr0NeTu0NeT0NdT|0NdT{0NdUHAWAVAUATISH8HHpHPXH ~J/wHHUHPXHt eJ wH1HtHRHt B@8!HpH HH@08H @HHH zHHKHV!8k HSHHP1H0NkCPU0NWCPU0NCCHDžPU30NC0U0N CPUb0NBPU0NBPU0NB0U0NBPU0NBPU0NBPU0NBPU0NkBPU:0NWB@PU`0N6BPU0N"Bff.pR$0N4DAEH[]A\A]A^A_ÐH0H E1E1H j4IcWP1HLHTIcGPHHB<tF<$uBBPfw2fw,ȸHtH@H1^3H@DLk8fA} Hd DCA6 AAIEH@ @T1HL H[H\CHLk8fD8%u@f.Ls8HL-@c@;YCHLH@/@;%HLH LH[HDLs8IVHR Bt HHF <$BfȸHL3f85X LxAH@HHuL9h8uH~0H$DL$gUH8H@HD$(HD$8-f.Ew H=uAGt1AencLAc K4HD$wfDFAFPHC LsHhI~H1HH$HHkfDFLAFJ4HD$ILLD$|.vHHxELD$1LLLHD$SHS f|$$DH|$HD$8LOSHC LsH\$(HhIH=I4H=HD$@I4H=1HD$PI4H=HD$XI4HD$`fDcAcIAfLl$8Ll$($H% 1HRHt$1;Q4HD$H\$(HG0HD$ HD$8FHP<<$oH|$ HH{HH蛓H{8*E1DOdH{@HT$@fE1I1ɾ0~LIH)D$`Hl$`)D$p)D$@)D$P)$L$xf4H{@HT$@E11ɾ0~fo$LH)D$`Hl$`)D$p)D$@)D$PL$x$4H{@HT$@E11ɾ0~fo$LH)D$`Hl$`)D$p)D$@)D$PL$x4H{@HT$@E11ɾ0~fo$LH )D$`Hl$`)D$p)D$@)D$PL$x蠠4L;k8H{(HZHĈH[]A\A]A^A_Cf}Mt"@I$8$$M$MuH|$ Ll$ 'I$hx1pMmMtpIcEt rPwI}8tH2A}1@N%pMuH=r荏HYHr2x1>pMmMuDMI$8$Lk0HD$(Mu 2fMmMt'A}uHT$(L193MmHD$(MuHHHHD$1HD$(HIf.HD$IM HLq8Q LH A9Q My8tM9tI0H$H$HH8 Hp1 HfDHH3M;x@Ml$8IuLHH~ G !f>I0H$L$3H8Q HhH-g3H}H`IHfLH;3M;fIp@U9<<$uBfMl$8IEH@ @I0LH$HL$H8 LHh茏3f85<HPH$HLHMmMUEu4@In8IITHքHD$H{Hz1A~H} LH HHD$IH1< <$uBfLeH H!1HHD$H}HH$ 1H31HD$ H31IG3H|$HL$H$As1HD$8L=HT$ H$LcH$HH`H|$ H$HHCH5H|$8H$*}HHH$HLwHH$HHt$H|$H$|HEH$H11AmXD1IKH$hIHB 9HHH<$HDHމJL(L`3H$pHQ19~rHHTqL*HBAH$HH{0?MH{0 I1Hp'\H4$HMH¿<p2H$pHtH$p1ҾHD$ 趙H$pHD$ QYAF -NEE1HH6H&fDH=11H=DHC1Hx(HHID$ht f8uL`13HIH$Hƺ>HsH1HsjUHMHH¿q1H$Hi>SNGFH8HC(HH$hHIM0HXGH$hM2H$hE1o.EHH{23HHƿEl$d2HtBwDhID$pHCID$`iЁ2IE/5E/!E,/ E$.D{-D]-DЁt2DSNDЁ2DЁy2DЁH2mDЁc2YDЁT2ED.1D.D=. 
D-C-C-C.0C,0C0C0}C-iC0UC0AC0-C1CF-CЁ:2BЁ^2BЁO2BЁh2BЁY2BЁA2B,yB$0eB1QBy-=B_-)B=0B;0Bf.D$H1H=#昆tH$H1LMfDtH@f8 .DHt$Dk 3H$HjHpHH¿9g2H$HGfIn8HH$AFHEHEAFIv8HH$t$tH$1H HDF3HH@H$Ht$(D%3H$HehH$1H賡ucH$"uRfDHt$D3H$HhA~LHH>HH$H$LHLHHLHH$LH`LHH5sH|$PLHH$LH$Li1HDŽ$( 21H 2H= r1H0H ,sH$FHI|H$(HLHt$0IHLpH$(H$HH$(H H$H$(H_H$(HH$HH$(H$(H$HDHD$HL|$PH\$0ID$#HDŽ$0HDŽ$HDŽ$HDŽ$HDŽ$HDŽ$Lt$(Hs(I~(.1IF(HC0HC(H~H@ qHDŽ$H$H$H{(H$H$H}(vH$H$Hp(xAH$FHs(H$PAH$EHu(H$(AH$EH$H$Hp(@H$EL;d$HLxL$L$(LH2LH$862H$(HD$HH$X@L2HD$H)@H}(H$(H$HHH$(F f|$8H$(u'L;d$HHD$hHt EHLŢ2AL$;H5(oLL$(訢2AN;LLL$`H$(H$HE(LLH$(4H$(H$1LE2Af;HE(H$H!L2_AW1H$1LsL$HL薊H$HHCRHD$Hu }HPNHHdLIYH$LHRL>H$HHQL#H$H$HQ] FGDf|$8AXL$tH$҉Ht$hLHQHL$HLt$hHqL1LHL\QAWtH$腉Ht$hLH5QL$LeH$IUH$LHQLt$HH$LIvw1L$(H$(H$HPH$1L5rH$H$HPL$L͈HL$HH$ HqH1H$HL_PzLY2H$Ht$HLȟ2AN;LH$胯H$L$IH$8HHELHjMH$H$L=H$H$HHAH$HL$MH$(ˇH$HHLHLL;d$HH}0H$(0H$LH$(Ȟ2AL$;H$(L貞2AN;j<%H$GHD$VH$L$ H$8LHNL$LH$IنH$LHNHD$HH$LHp1fDHl$h\$`L$p1ɺL$L$Ll$pHIL$x11H3JAd$;1tAd$;1o@Bƒ ffЀFFIw(H|$i嘆 Ff[dL1LL$EnL$IH}0H$(H$LH$( 2AL$;EAd$;1fHD$HHHD$HD$H(@H$1HIt$H$1zTISNE*)Ez%D%DpL&Dp&DH$=H}0H$H$(BH$LH$(ښ2AL$; p&RDp&>Dp|&*Dpl&Dpe&DpS&Ct9H^ULd$hH$MtImMXM'fH|$0* H|$0L* H|$0@HDUHD$Hc!I+vDGIEIH茥LHDHD$0E1O()ԺH|$= D WD$HL$0E1Ht$LDHHmH|$~W\L$,Hٹl$HHl$0HcL;4@`Hl$@McD)Hc9H|$XHH|$`LrL%H-l#M9Ht$H|$0( ?H|$0L( *H|$0 I~L!EIHcL;4@`>Ht$Hz( H|$0Le( yH|$0賣H艉@LH Hu DHc1AtE1DGD$DLHDLLHD$2D9t$H|$HlID9H5H}D(Hdfx5fD1@Dt$ AtD$t$t$|tpD@1N ())M1HDnL|$LmLRXB ₵=!HL$ HL$ IH=HD$ IH[ 菂IcIDd$E1Yff.?@)эPHD@HHHD@MM9uI9tHLAIE1Ll$ EfDLL6LHIfD9t$IHL;IE9HihHKHD$@D|$L$H$D$XLDLT$$D$\$LIDvHAE1jLH¾GDgHIDvAYAZ|$HLT$D|$LDvHLIjAHھHDDHƉ+YI^t$ D$AD$t$ fH{角AG$1f;%{}f.LLAMLP蹋II D$ LtHD$0AUATAUSHHHxD-bH$LL$HD$(DD$ HD$ @HD$8H|$0HD$@Dl$HHD$PD$X HD$`pD$hAQEH|$(WHHT$(@jAAXrf.tF$H2Ht3P1u (f;hs!DHHT6YjHHuHx[]A\A]A^A_L0.dL=!=jM~feeff.fooH=~ooH=g$oH=nDKpp1aHfo9}p1@aA'}p1a}p1`a}pHm.IA0diD$E~sH$diD$sH$ciH$HL$PH|$pIDH[HHEHt x\$P@D$H$HL$PH|$pIEO$gLUH=Qj4AAAE11E)LLJM9}H$t$ H$t$ HTH*DC@o{`~e+fM2H$A)H$Yi$H$HL$PH|$pIO$I1HH=H!Hl$(HM1fDLlAE Au$DpD]tYA} AE $p&DhEADžT@H5_DL7m9]wDLlDLX\DL[DL>[DLqZDWgH$8&1܇hT1H!& F*& 2*& *$8C' )D$D5,"HBL@IIxLH8I9tE1ND$E1f.AUATUSH8~.PL$PL1DL&bMHI9uHt$8Ht$8E1|$ LNfHT$PHHHH HH9HuDD9Ht$8LH$PHwL H H,tuI9HHYHADD98T$ 1Ht$8NfH8[]A\A]A^A_A$f0y< k1fA<$*%oG1.^ED$ }D$UH9H< $bf?0t <'btA9tH&b8tD9dPH4 $bu uH4 $bH< $btf*L$(T$,L EA|$xѽ1ffA> AGuڨuAouH<$svH={Ru9H5zHFHxHL!L9t"HCXHHD$Ѧ sTHH8NP Mp MMM|$MuMMEgAvH=ZzuDH5EzHFHxHL!L9t-HKXHHL$ HL$fDHIKTL8NP MH M_fu{?0H WIH?1HItHuF@tθ9`@ T@9u@ i@1HD HHLICD WH% fDH5ixHYgfff)CH5AxH|$/gfff)YfVIAUATIUS1HHHDŽ$If0tef*HAYcHtLHt HI냐H$)HĘ[]A\A]A^A_SaHH)H H"HHz8>HzH113AL@ƃ9H=YIj 1u3LD@ƃ.H=5rAuu^,EiFtHxwATUASHH@rH+D$Ht$ HHd$ D$(D$D$,&DHHr+D$Ht$ HHd$ D$(D$D$,&HH@[]A\ú 1~f.AVAUATUSMII@HHH@ۃ@ExHAA@Ofi%NIH%0HIFxB4Jf% fZ4)% XHH@0HHCxfLB4Jf% fB4迿HAf;C6 H`ftS8LMc ЈC8HCC6 1H9tH=-HC DLc LHAf;C6RH`ft)HcHHHDH?S8LMc ЈC8HCC$ 1HHC H=UH^@4f%z%H@4Hf% fC4! 
MH({.HCxH @4f%,%H@4Hf% fC4 WH-.HCx[]A\A]A^1Ҿ=b)H )@qg)LL!HLh)L #Hƹ)HL;l)LN\1Hx 1H蚚KljC0`ff.t$L$HH$H$t 8H$1H:9}|H\$@5H\$@H@@0H7@9u P|&H$1H4!@aHH[I9cf.H9tHHL]HHtMd$Mt,LL'L薱HtHuMd$HMuH$Ht ;hpHu+HDŽ$HDŽ$Qf}Hf.fD$f8AD$ u{^,I|1f.؃H@HI|X9 ŃuAT$H/H;hthH[]A\A]AT$H4/HPt@뫺ظff.AUATHUSHHH,H@ f8HPf:HEhH.HDbt.t)ABH[]A\A]CfOvvf=<f=tf=uH{HD9tlH5.Dt)H[]A\A]@AtBH .DH[]A\A]fLstfGwdfFs6ft:H1[]A\A]fTtDfXfH{H?1{ AvH{P+D BfHH{H 1{ AvH{PA7.EtrtkAD1D E1{ vLkPH{HLCAH,DA}xI|$Htf?tIEHt f8t@9tG9uLH[뚺V5uff.Af2.f4nf=f=u]EtIDD$XAIGHXHt f;[ HD$@HD$HHD$PD$XIGf8a|$8>HI$HEIEHL[]A\A]A^A_f.IG Ht f8fAHLfwIWf/tf0f=Bf.f/Nf.5LD\$Mw !HD\$t f80IG|$XH@H@1Ht f8I)HD$ HHE|$x;I$IEfDIG H@0Mw fA> |$XIGH@H@Ht f8 $AH‹BT$Xc  IEHUH5HH)I9$I D$ D$ IGLxxfH|$@$L!f8LHD$!f8IHL$H0D\$HHD$蓈&$8ID\$VHpHufDHRHtf:"uf>:ID\$L}&$LHLt$Ht$L(H|$@L:IGD\$ T$XL踤!f8LHD$豅!f8HD$H$H$LHL$q1HfHHH9w쉴$IFH$$$H$ADHAoa&$$`H$H$AKc&D\$AH|$H$D\$b&D\$$8D$HD\$HjjAjLt$(HD@HHD\$f.H$LD$ܦD$HD$@HIGH@H}f8sHT$`H$H$H$fo$)D$@fo$)D$PAf.H$EйHL\$HHD$c&L\$‰$I8D\$jHjAHt$pHo&$A[A^D\$*f.D\$jAjHL$0f.fHAjjHjHt$(H@@Hu'HHV@H@HtnvH% fDH訸H aAHB$P (A9$I@$L`bLck(0HHyPߣPߣG$H|$(D$|#I1fDHnLHHubI;\$t[CQHs@HC@H$0H$(H$ Ht,<K<<HvHH$ uAD$$9rLd$(1HnHHCH8HCH;=,vq)HCHt HxW)H8Ht'H;=QA)HSHHzHDCEtHHx'H@HugHCL-H@fA}jI}HD$(LHpWLt$(IFHHD$)HBHX0IF@QH% fDHCLhMtyCP%=lFuHC@HS@HHx H$(H$0HH$ t (#H$ LޯHCH@AD$$92HD$HHD$`HlHt$H0HD$(H;XQH$H1o$H$fH~fI~)$HH$ CQHC@HS@Hu@H@Ht8tH$H$H$Df#L.HHH sHt [yH|$L$HtGHHLP H$1L$Mt6IF0HD$f=`HD$@[MvML$uHD$HD$H9D$`z=jk|$|Ha!bHD$PIōC1E1L|$(L4Kf.%HSAEHuQHL蠪t M%IHII9HjH(HSH:Ht H;=1uAEHzHt65?tH@HxJ$H@Hu跍HSHBHSؠHC[fDH$ >H$ Hx$HDHD$xLHHIFL|$HxL HHt(HW<tI9tI^XHf.I^HHt x @$HbH$ AF,H$($HD$0IHD$pH qHt Y&H|$paH$(H$ HH HH"@DHHH9$( HH8vH@Ht׀xxHq֠H$ Hx1#L-fA}jJfLh4#HCL("@HC@HS@H$0H$(HH$ 8tD8"H@HuHDŽ$ Df~HcNtH=qH HDN H$ RH=͡1LXHDŽ$ H$(H$0K H=qHuNH$ LKHD$(L9`!D$|voL,fDHeE1H(H1HtPD9t0DH|Ht#L9'WH1HuD9ufDHI9uHD$(MHxI9t#L FHD$(LHPuO1Ll$(@ID$1HtP9LHlH쬿uIuHHN HD$0H@0Hh0HP(HDŽ$PHDŽ$0HDŽ$8H$(HDŽ$@H9H$ HDŽ$HHEHuHHt$HHHT$@Ht$HHT$@<x@ fH$0Ht#H$8HPH$8HHDŽ$8H$0HEH$8HEL8L}H$ H$(H9HPHs(H$PH(w I\$8fD}VH$L`ML$Ld$@L|$HHD$@QhHP@Hp@1HtHB AD$QH$ H$ HD$H$(It$@H$0HE1譥ID$@LlWHD$HD$HD$H9D$88Ld$ MHD$(L9`L{HHD$H@0Lh0HP(HDŽ$PHDŽ$0HDŽ$8H$(HDŽ$@L9L$ HDŽ$HDIEM}H(HIA<H<gH$0Ht#H$8HPH$8HHDŽ$8IEH$0L$0H$8IEH(ImH$ L$(I9HPHs(H$PH'fHD$H@0@ tK Ht$H$H$HD$(LHxYtHD$(LHx$ZH=_HΠH_H|$PHh[]A\A]A^A_"ΠHD$0H@0@ tK Ht$0H$mH$H$o$H$)$*AD$Q&Il$@ID$@L$H$HH$Ld$@IHl$L|$HHt$L#A$HۍPI|$8tUIT$T$0HHEڀ}fD<Z<t<AE1DHf.HE9mA$AH wHc IH L(MtAEff=t LɒtMH\$0ALH<EtwDH fDHE9s_A$AH wHc IH HHtfuH[ f=tHDtHȑEuH\$0bL-5HCH@fA}jMI}贓H:HtDHt.H$1OJ%$DH$ fDHHDHH$8HEH$HEHEE1E11AADDBM1DE1H4D$0AD$GtAD9t$0HD$pLt$XD`EHLx1Lt$AAL|$ LL|$0L|$@AH;H$1HDŽ$H$HL-H8CH@Ht @9vEAFHA9uADDt$xƉD$lLt$L|$0t'AE9tHt$pHJHJHpHrHH|$lFIF0H@HD$HHD$pHpH@Ht$PHD$8D$L$LHL$PHT$8HH|$HH|$0H-HHL)P D$HD$ D$9D$lHt$ )HHD$8HD$0HD$PH|$8H$1HDŽ$jGHt$ $H|$@D,H8Lht$xLD9t$Ht$HH=Q[11ЌiHHD$0>Ht$0FD$X ЈFIEH5p*11A@IVًDHHDHH@L8ML;=,fLn VLL$61AAsHLbHtk1HtBwxHHƃYHH$1AAsH.bA9HtIE%DHA9uHMtEIDH|$@0L$(Ht$pFPVH|fP#HfH$0L$ ILHLeM A$El$%D9A|$AuD?[H}IH4HtDoM11L$ LI녀K H=XH6l1ۺ"ˠH=1H5HEEEVM11E1k1H|$pZzHs(HrHs0HrHS0LIT$0LC0HHE9H$E1L$H$LLL9AH{L$8H$(I9L$ IGMwL(L,A<<[H$0Ht#H$8HPH$8HHDŽ$8IGL$0H$8IGH(IoH$ L$(fHBH$PHH$0HHBH$8HHDŽ$0HDŽ$8H$ L$(H\$pH$1H{HDŽ$A$H|$@L&H8IH@HsHHt$8t xIF0H@HD$HHD$pH@HD$P`HpE1$FHD$(HPHp4=J1bHD$PHD$(HpH HD$(HxID$0IEHP<DIInX<P <IW(vHt I8t HHHBHpH LH9.u I9HHAHpH H9.tHH~0HT$HHt$@}Ht$@HT$HD^,EHF8HHIH9uH9u#D9sfHD@HHDHHH9tIJH9t:LMtHHIHHHLH@IJHHIJLHIBIIL6H~0HT$HHt$@|Ht$@HT$HA H4<H~(vHt H~8t H?HHGH7I HtXHFH6HHH9uH9tIHH9tBH8HtHHHOHHH9H@IHHHIHLHI@ILrIIK H=hQHd5IG(HHH 
I8L9{H9)HPnI9eIMH9UI07{A,IG8HpLH9.uI9u&94P辕HL$1AAsHLXHLHjHIL$L$H$虉_AXuZHHHLaHt$HHH~HzHI0yHAZH$   PӔI P跔  P裔P菔AWAVIAUATUS1HHt$@DA@HCCCu0IG(1Ht%PtHDHt@Eg$Av I0L-CMH$Aw$tz1ېMdID$Ht]HxHtCt HfHxLH@Hu7gID$L-#CH@H˱ID$A9_$wL豱HBH<$L<$Hĸ[]A\A]A^A_L蛊HLkHH$AD$l$E1HHDIf.HI9LuuI~HtIHt 9HM;wtLL輂uLHt8u1I~\$LLIAL?DL|IH@Ht$xvIwHHtxv H;nfA1HtGD9uIL\$HII9/H<$l$UMt@AD$tzI|AD$HGHHz7MufDIG0HH$tSH t|HX|}"zA|$xL软IG0HH$uA$wtH |HF{9Ⱦ 1'HLH HIHt y1HtJ9uLMH Mg1@LHtID$1HtP9u10HD$paHt$pHD$pLHHD$pHt6xE1u *DD9`vDAHDHx腮HD$pHuH|$peWL%?Mt9IT$HHs HH9wA|$,I<$^ٺ0L;bL%]?HZ?Mt9IT$HHs HH9wA|$,I<$Iٺ0LL;bH?v~?=>=IG(HtuPtnH\HC(Ht5Pt.H|t# uHC0HtHx8 fDHt$@L軇HufL-g>MzH%( fD "Au@tà uN=%>A/f.L(1\LHHD$(H@HߋpTRbIGHߋpTCbH==HtEW1ۅuf;_؃HHHwH>=H=g=HuL wHT$(H=#HHS;l$I^ LLl$XLd$`H{HD5HtHmuL5H|$pH蠸LL$H|$pE111LuH|$P1ҾK_{DD$$x]rۑ[SHH4D$T$>aL$HËT$DD$$% ȉSC@QL$B1HЉT$H4aT$L$@DD$$HlAD$AD$ƃ@ AD$ID$@Hx(H@0H9HHD )f.EH@t H9HIH9HP:tH;jt҉D3AD$A AD$DkE9{fAuD[HDkH4IAtaHËDk%D Cp{yEnADB1HDAH4a@D EHfH\${QCH|$(UL-15l$8 l$IHD$H0E1Ll$0H}(HSIHD$|fDH0E1Ll$0OH}(HIHD$LfDH(E1Ll$0Hu0HäuHU0HǃLIHD$H(E1Ll$0Hu0HIHD$fDH(E1Ll$0WHu0H{IHD$TfDH(E1Ll$0'HIHD$(fuMI?HtGI_HtH3.0HV`1H[]A\A]A^A_f1IGl_H4$IGHHLAGHHD$dIOH=CHA9snHt CMH|$贲I?HtGI.UI_H*:wLc@HC@HھйtH{@Ht Hu(1/Lc@HmHDEv tHEPHtHS8L`(HP(HEXHtHS8HP(HM(Lk@HC@HھйH{@Ht Hu(1.HM0HC@HھйH{@Ht Hu01b.Lk@I;E8H1HLHbHL-f.vBAPfwM@Af/t f0f=mA@H]P M9шI)t!MLeHItfAUATUSHhH~IHQAIHMDf:.tmy$AEuLBM}A@A8G. h{_*Ht$8H9t$0LD$(D$HMHIo0I0Ht$LD$H$H藟D$H$LD$Ht$H;U'=,uZHHHC1ҸHHHH 澽HHBH.HBHxH-HHUHH} HH^HwHEHHDž)1HLEHuLHL}LDIxI@HHH9kEP(A@ LHH@$D@$L$@HT$D $HH H‰)ЉAM)‰HHLLMIItM9t{1IEH$DVL\$I D)DEX$ $A)fB AH9Hr)HLH1HHI9uEX$HHBHDxD9#*ߵkL9tJHCfw:H Ht.t%;Xe4H=1HC LAfHqHHHHHHgHhHH!HRH;H,H4H@HHDxAwDH$E1VH$HH4HzHtDL$H$m\H$L$HBDx ʉP@HEX$IhHL:HB1E1D$AHL$DC1HT$DD$H4pl\D$HT$HL$DDHB@AD Ɖ0|$@pkHELHt$L$<L$Ht$IxrDH}=HI@H-?HPH aHHEH&1HH D(eH1aH1HBAwHeL1XeL1yDeL1e0Ht$L$Mj\L$Ht$)E1vi{_irff.@%L賦HAI~HHEIV H|$HU HUIN HpHD$?HD$HEPHA褩2LHoH߈$*$AXHLH$H$HLHPHu1HUHtBw~E1E1LyHEph {_ff.AUATAUSHH8EHcAHD$HEDAH|HP IHEJ|HPD@IIEAUH4AAA@DL$,tAIUI~HH)HHD$H1H1HH!H?H?Ht$BBt?e1H8[]A\A]A^A_@Hc@HD$PH|xA@wA@HT$tHDHHH!H[(\1H9wHH H HH9rD)ڃHHT$H9fA9pHu1 AJTz0ƒA9HDsLl$LL9DIUI+VA@D$(HT$t=@D)HHHT$IF@u H|$HT$f.IF@ H|$H|$HT$f.HT$HhDjjINIuH|$ B#DL$<‰D$8YD9^v pDʹ@?)HTHHHTIVBB/oWJE1AWAVIAUATIUSHXHG T$ HhE11A@DDH|HHR HHw(H|HH@H=p`H_ HFDCVH4AAA@DL$,A5HVH{HH)HHD$H1H1HH!H?H?Ht$BƉD$(A9s҃?DHt)ωHHHtHSB+ IG H|pAD9tjNHP1H|fH y0HH9׍\uIHJ HtBLIIHBH=PtVLIgD1fD1I1HJ HuNH|HH@ H=Lw(HBH=PuIG HxHH@H=puLo IFEEAVH4AAA@DL$LAIVI}HH)HHD$0H1H1HH!H?H?Ht$8BƉD$HA9s?@)ΉHt0HHHt0IUBE1H|$8Ll$0jjHKHH|$ DL$<‰D$8_D9AXvpDD?)HTHHHTHSBu@PH|A@A@HD$tHDHHH!ЍlHFH+CA@D$(HD$tUDD)HHHD$HC@HD$H4$H4$HfDHIG bfHSBqH|$GHD$HQDH|$HD$1fH|$HD$f.PH|0A@-A@Ll$0tHDHHI!IILP6k3kHDL |$ L:ܚIH@H=PIG HxHH@H=pHw 1LL-hܚIH@ H=IW HtBH|HH@ H=Hw(L1+L ܚHX[]A\A]A^A_fMH*Y2jfI*Mى^kLZDLHLHfMFM+EA@D$HLD$0t=@D)IILD$0IEM@=H|$0&DE1;IEM@&DIIVLLfHH H*XLl$0MjjIMIvH|$@DL$\‰D$XYD9^v pDʹ@?)HT0HHHT0IUBV?oWf.AWAVMAUATEUSIHHHHL$D$ HxHH@H=p H_ IGDCAWH4AAA@DL$|`A IWH{HH)HHD$`H1H1HH!H?H?Ht$hBƉD$xA9s?@)ΉHt`HHHt`HSBHD$ID$ 1Ht DxEI|$1LT@ID$ HtbD@EtYE1fDEJDHxWSI|$JKHD1ID$ JDHp@ID$ Ht AD9pwH|$HHHH\$8[MIDhID$@QHP@H|$@LHD$PHT$@HP@HT$H1诰ID$ Hx1HD$MI~LlIu\ IIEHP HDIu(H@H=pMm HFEEVH4AAA@D$AjHVI}HH)HH$H1H1HH!H?H?H$BƉ$A9s$҃?@HԀ)ωHHHԀIUB_1҃IF IT$Ht;hfMH=qL辈I@,ID$ HPHL$H\$ E1LLt$(HIHH|$DH{HDHp HpIHHHD$1IH=HG%AV, AF,AF/HC HtAD;hrfH\$ Lt$(HD$H H|$F,uA2N/G% AF,@8F AF/I1[]A\A]A^A_@PH|`A@A@HD$`tHDHHH!HHD$jjIMHH$D$‰$YD9^IUBu@PHԀA@A@H$tHDHHH!HHT$jI%H$LɗH$HHF$%HVI+UA@DŽ$H$@D)HHH$IE@IH$,1Ht$LHt$IfLHIEH% fDIE@H$H1EH$.H$@HК}I$IHBH=PLHHD$HHt$H$H¿,H$HF$AF,HD$fIGH+CA@D$xHD$`@D)HHHD$`HC@uH|$`x~HD$`fD1AH4g\HËC%D 
C)щρL@HSBqH|$`GHD$dH|$hXHD$`Af.% AF, AF/fHD$`H뒾jjHKIwH|$p D$‰$AZD9A[v pDʹ@?)HT`HHHT`HSB^GoW fAWAVAUATUSHH9HD$HD$H$HՉIAH|HP IIJ|HPD@IIGABƉD$8A9s?@)ΉHt HHHt IUBtI`71|$ x H|$*HH[]A\A]A^A_fPH| xA@wA@H|$ tHDHHH!Htt9wI>fDJDt$ H@PTt3;T$t8HD$NH H9PtH9u2FTD$ 9#AfDII+}A@D$8H|$ te@D)HHH|$ IE@tpH|$ "fDH|$(H|$ f.H|$ H1IE@H|$ 1|H|$ y\jjIMIwH|$0BB/oWoIUSH(HAGID$N\ Nt @I D9tLAEA;FsT$LL$L$tAEA[I ACD9AuAD9|$IDt$KvD9tuAI$HtDcE1ۀ}EH(H[]A\A]A^A_fD(1<[HHDDL $x $E1EuI$1HtB9D$A0DL$ $\ $LDHI Dk DL$I sD9AuEO(sN|EDt$EKvDt D$HA9EE!DAHDE9sJDDkDI$AE9Auù{x^DHT$uUHDkH4$[D$HËAuDkHT$A%D CkADD$DC1H$DHH4[DD(DD$@H$AD 0E@pHpt)AxLL[$A $DC1HDH40[DD( $81[HHD$HEHEED$pAFJ4SAAA@D$  DAHD$8HPHD$HxHH)HH$H1H1HH!H?H?H$BƉ$A9s$?@)ΉHHHHH|$HWBHD$hZHD$ H@ HhIL$ AJL)H0J|$(L)HLHHPIT$EBAWH4AAA@D$,eAIWIzHH)HH$H1H1HH!H?H?H$BƉ$(A9s$?@)ΉHHHHIRB 1҃A AJL)`I$H@H=pIt$ HT$G/HHdI$LKH@ DH=AID$(HsI$H@ H=It$(HT$GLL$HIID$LL$A9NH$0l$LM)IN$H$N<MAT$EL$ D$>I$H $DŽ$H$fo$D9APfo$A`؋$$H$ H$L$9މ$ 2@H$DŽ$H@H$'@)HHH$ HD$L$LHHHD$8jjHpSD$‰$_D9AXv&H|$HWBu PH1 A@A@H$tHDHHH!HHD$hHD$ H@ H 11^@HDŽ$DŽ$ Il$HD$ H@ HtD$|$;xLT$jIJjIwH$ 買D$<‰$8YD9^LT$v&pDʹ@?)HHHHIRBu$f.PHA@OA@H$tHDHHH!ЍP:IGI+BA@DŽ$(H$@D)HHH$IRBuH$y1HT$pHt$H1PeH$A1AHHz1IHپmHHT$PH$Hz@QHH% fDLILL$LLL$@LHXLII$7IRB H$H1DLL$LLL$H_fH$H$a@1H$0E1EfDHD$8H|$H@H+GA@DŽ$H$'@D)HHH$HD$H@@uH$H$H$H\$ HHD$HwHWBH$wH$H$wHHD$hHDŽ$DŽ$(prNoWNNollN1HIHHP@7H;u H{HEHx!HHD$HEH@HpHHxq2@HCH H@pP@DL$H{AO&2D$GD$Au2D$ EȸAAAt5EAIIILGD$A<AGAAE AuEAAAt|AAAAE AuEAAAk AAAADADL$L{A2D$AOEOAD$8AIHt$8%A DL$8AAA DL$;.A%IOD$DLD$IHL$PD$HL$HEHLMH{INjGu2WAuDtp%=EAAL$19r )ȉGL$%A ЉG8DFAD GHHSH HH2Hu Hx IGHxeHIGH@HpHEHx.@HCHt$HxH|$D$8D$,HCHt$,HxHHt$8H|$0D$0b,H{AHP HEH‹$ILLQm@DL$,ZYAu2T$ AAt3IIILGT$DɁGDL$HHLAA DL$$AAA LDL$'$QH LL$DD$,HA^A_tIu@Ht H@HxQ-@HHK$PLL$DD$,LHL9A[[IHHL[]A\A]A^A_AG!%=AAmAL$ADGAGA8FfH{HP`H{HP H{HHPLL$8LD$0HHLL\ HCHt$Hx*Ht$,H|$D$,)HAME@$ELmE1PHT$HHt$@HSH@HrHxb+@AYAZ6fA2OL$AD$AAHAE P HEH$mHVLD$fAHDL$LPDL$HM$HLLAQA n$LL$LDD$,H L4$HHY^tIu@Ht H@Hx(@HHKL⋄$LPLL$DD$,Ht$IXZ_f]1@D$A@L$D΁fOwW1AE1Af.AG@HHxHxHP@HCHt$HxD$8HHXHt$8H{o%AHHkHPD$LEHHLIE1@8t$D@DC@1ujD8vMEADCDfDD@DC4f.DCjHHcH1HH=HFiT$DɁ,H{fDHt$HBH|$D$8Ht$8H|$0D$0#H{AIm@HP HIE$mHLVLH|$,HjAD$,$LL$H $@HxHP@DHCH86Hx+HxHP@HCHt$HxYD$8HCHt$8HhH}"AHELmHPD$ELHLL2HHCHt$HxD$8HHt$8HXH{t"AHLkHPD$ELHLHIHH:uAHzu:HzHP@tHCHt$HxzD$8HC|H{fHt$HRH|$D$8Ht$8H|$0D$0!H{AIm@HPHIE$kHLVLH|$,HzAD$,$HKLL$$f.HH8DHAS8HHt;hrLH8[]A\A]A^A_@.HH8HDHAS8ID$HsHx@HθHEG/AHH1DEG/AAUDL$Er%HP%L<Q\$0HtpH\$0H Htv nPHD$0HtGHt@E1DDH5 H| HP0HD$0Ht AD;`rH5y BMtA~AFHHHD$0HtHPt-1HsAt$_LD$0NIT$A@IIDH|HtH HHHH9uDApHD$0EϺ.HE1,(1[1HD$8HHGHHIt$Mt fA<$tKHzH߉z9wH[]A\A]f.H{pHCH[]A\A]fDI|$ u]fv)f#t f%uLLHt wfIUf:uHr HtLL|Lshs HD$8H@f tfT$t{AŃ~A\H|$0p4HOHyW4C "ALL$0:HLL$PT$HqT$HLL$PLLL$HVLL$HHLL$H?LL$HhH-H f;tC6R HH zf;t|C6.L$\Hf?!G6HʚutT$t~{ ЄHD$8H\$0H@HKDHP1f: ÍHdHf;C6HD$0HH@HxHIkf8FP6 HtIHcЀ t=HcЀt1HcЀt%HcЀtHcЀt 1L AfA<$AD$6D L9|$HDŽ$tlL9t4H5)DML$LD$@L/H$1`HD$@A^A_Lt$@H$L]HT$0E11ɉLY\H-CL5HILULL$@LdLH=IH4H"_#RAHXD PHMIHRHHRAV)t!IDHTHtHH9uKXH`RPzHKHu1HH8HEH8`H$H@mEEAIED1D8H4VRD8E1HSLH@DE ʉPgAp@oc*S1D L(D0D8HǿRD8L(D0HD D@ IILHEۈPLt'AsI|fDHtH1H2HHH9uDXHHUH9H eH|IrH8HD@EHZH0t&z9Wf.P PHFWuHBPH`H]H8H`H#{DD`LH=IH4HDoARHHDk%D CH`EuHm%AHhAC1HЉ0H4聽R0H<EPHPt!AuHtHtH H HHH9uDhH`HN(1E1RHHHLjHu1HELIu0uU1H؃H=CЉЉ0H4λR00Hw@ `8t2INHPtsHtHt9:HHH9uXm`WRA61؁)9s`LR1x0HtH‡HD$ 8tjH HtMtD EOQHt$H=k11uAUTL$RH=N1/G߳BHPBf~tf{uDH\$(HkHX      fDAWAV1AUATUSH(D*Ht$pEtHqܳZrõ1( ݳH ݳ@RHHܳH@HH!ܳt:1H4RH۳@ ӉX^H۳hH[DcA%E)D)A9?{H4]H肆RH۳D` ʉPL@hKAUHTfDydHuձH@HHt"HkkH[8HE0HuHKձH@HcxHbR1H1HD$@衄D$\D$4H=D$\HtitT$\g1K>DT$4E HD$D$ZHԱHt$HL$@H@H@L|LʳHHD$(tH@H@0HD ESIHt xH HttFH=H`H5 28DIA;FRD$[D$ZA~ 
AFH|$tHD$x @=wԳD$YD$ZA~H AFH|$tHD$x @fDHD$HD$9D$4;|$ZH=ϱI T$\XHϱH@HH5:HCHH0tE=3ѳ<HGHгH9г3HcгHϾ&HHCHгH[8HHгHH гH@0{D$4Ht$@HcЉHHLdH,I) HI9WHϱUH@H@H\HSHtӋJv˃HBHLHHRHHtzuHH9uHp )$HDŽ$@,`HQ H )$)$@H$HDŽ$HDŽ$E1HDŽ$HDŽ$ ncADH{LtL u%IFH$H$H@HPH AHC1HtPD9uL E1AU H=GγHt (C Ht$HDŽ$n]H$LHHH$Ht4H$Htb H$H$AE9~ DMlIEA}HHD$ \ƉD$HCHx]H rHtZt  (?H=ILXT$H=6^15fDH|$@/LdHbL$ EML$ AA;$D$YD$8AI1HtPD9KDHlE( &IGHt$11ILeHP HH8HpYIL$HybIHL$E$MAH\D$YmfMD$8^r[/ IL$InHHHL$xֲD$8HL$xHYHUH9HHHT$84HT$8HH@MAHD$fH|$,|$YLt$`H\$Ll$h|$8|$ H 1H+;H=H`UT$HH=11@E$HL$Lt$`Ll$hHDpL8,?A$@Ņ@ ;;Ht$PXH=ȱ11HbHHHCtHH#H#H9@ōkH߃ @kIt$FAGQIW@IO@L$H$HH$t,:t%:sHRHuHDŽ$H$1HxH@Dd$HHD` /DHYHHEHxsH=sdzHs|$HHHzIGHHx'rIGHHx0GpH% HGH(HHEHX0HP٫H{ G٫Hm8HuTIt$I~˱D$8ID$I9FIt$I~觰MHD$H|$*L)HȳHt+H f.HHHHHuH`HdzHdzHdzHdzHdzHdzHdzHƳHt"H HHHHHuHHeƳHbƳH_ƳH\ƳHYƳHVƳHSƳ(HNdzH=ı(HuƳ-AE1H4E11>qRI|HtH1H2HHH9uD`y((8A)H jHt RHD$sLHH$t$[E@t$[Lt$HLl$PbMHL$Lt$HAD$$Ll$PHD=H2ųAHoHXY8H ijHt HD$sLHH$)t$YE@t$YIE@(HH f fHAf8 ofDHDŽ$H$H$.H$H$1,STH1+H=L5OT$H=^1+AUA~AFH$s (0+5H=LHNHt$P;QH=1¾^1 +L%,t%YY04H=HNHt$ ; QH=¾^1e*@ H$ r`u HGHHtH fDHHHHHuHؖf.t$vLgPHE@HH$AG;A DLtIvHtHDL$HL$DL$HL$EAH{t$AWE1`M9fIuH9tyH fHt f)EL}HIFHEPE< H fHt0nft'=HH(f9/H@0f8!HH[HLHx(:A;l$AI_f.H[HID IEE1A;l$.fHD$(H"f8H[HID HH@u HHfE1HL~H$D$HHtHHH[]A\A]EtPf=uJHC8uHHuHPH~t{uLhHLIuuRffA}L=LH,fH0L=Ht$(T$@DDHL9fA>GI9.LLfA>u=fA?u5HD$ Ht*HI~Ht$HH;D$ u HD$HH9D$@tVEIE1A#L9=Dt$A<Hs8Ht$@HALjLL$PHf8HLl$H=?Ht$AU8HHt @$Ht$f.E1E1DLe'MiH{pMHt f?LLdMt LLd H5?H?HH)HG/H?HPHH9H ?HH HH?H!HH)H?H)H9v H5x?HHf?HD$`HH\$@HCH$Ht P1HC0HD$pHt$@T$xHA A( ЈA(D$hT$\AD$lQ La8Lq@AAHB>HxCH8~HT$@HHD$@H.>HH$>qH@HpLk0YA$1HL#A$ML`8LM<NHs8HAH_=!DHt$@H95<FHt$@VHt$@HO6HHt@$tHpHD$@1RLl$(H=<Ht$(AU6HHt @$Ll$(HLZb<,H{0!H{816 1QHb ' H=Ah11H=H .h H=hH1豁H5h 萜 跪H g0 蜪H=g1L[H g pH=gL1/H5g 3It$LL9 HE1E1j1HHL蘖_AXH|$HLXH$HD$HLHp@`LL`HT$@1LLVHl$PH=:Ht$PU4HHt @$-Ht$P Hrc=OH~;H5;H|$PL譫H|$P#H|$PHD$Hp9tT$ HD$HT$ p9tH (fHt f)Lt$8H=*:Ht$8AVD3HHt@$uEHt$8L_t :E11H}HL$@OLhHpLLP_+Hp13f8VHxf? H.;E1<*Ls8MH\$HLt$HL958AFLt$HH=-9HAVI2HHt@$tLp뼺LH豧W W H{0LHGV@$HD$HH(y15yHAUATIUSIHLHhHBH}HnMHtq9pvlLtMt`M>MtXLL $" L $t4AV>IGH҈UEHH9tA|fDfA?,@fHt$0LLHD$P)D$@Ll$@)D$0dLl$@L! uCLd$0L! tLl$8Lu! }E1HhD[]A\A]A^A_L uLt$@L7! At]L|$0MvL! 
ItH|$8Ht uIM4$EMd$M}2 H LuS9rFHCHT$0Ht$LL $RgL $t"HL9HL{AEL$,HD$?wHHHH!ЋL$LHHT$0?wHHHH!HSAV>UHL$8HH $ H $IWLyHT$8 HH$ AH $HT$HAHI6I9;Hf8wLxAf=f [AiH|$Hf?0u HH|$HH$LD$OLD$H$IHLOHMH$H9$1HL(k% M@,LHHL$$9$(HL$u)1fD9pLL9tc$D$xHA@HD$`.@D)HHHD$`H}hH1LLD$q#LD$IGf89I LD$g LD$tIGLxA1;$ 9\LL9toH@@(H@@QH@@HH@HuHLHL$HL$MHII9tI"Hff.L?IohHcIGpHf8wL`AOHA$L$ f=If OA$M11L*A/E1EDH}Hc HIHD$UH Ht v TLaH}HpIMtIV|$ LMYgILHI8H< pH  Ht  H|$kMH=t fA}|HĨ[]A\A]A^A_f.ID$f8% E1E uL~AtHftf=t E  HگHOf.K H=-ԭHxE MvfA>L(6 IFHx Ht f?YM|$xu H|$`H$Ht$`H|$ DŽ${oD$ )D$`oD$0)D$pwDS$HDfD$ F fHDL}:DH>kIx>H(`8} !f.jjHpAH|$p8x$$^9_?@)эPHD`HHHD`HK$t7HkHDgD$ H/G 6K H=ѭHlHHC軁AWAVLw AUATIUSLHHxH4$ɱ8H޿謭HCHH@HԯHH$0HD$H4$HH$0Huf#H[HHS0f=uBt1H$IHHHDŽ$dRL$u0HԯHt$PtHxHӯH4$H|$L$0MLt$L|$-Df#HC fv:Md$MI\$0f=uCuHC fwI|$X11vtEl$,A M$AfAI0E9t#I?11v9tE9df.SMd$HӯMlL@Lt$L|$H $AQHq@HA@HL$pD$HD$hHt$`Htg1Lݤ8Ll$`Io AE<DEt,=үt#<<BE1҃<Ll$`IuHHt$`u|$HүH $HHAtPt HxHx1[]A\A]A^A_`1fDDEt = үH|$`H蒧:AHt$`HB11HH$tD$0"HѯH$HL$09HT$0HH\HtH;H4H\$(1H L$Ht$D$ IHLLH$0DŽ$8DT$ t7$8H=4DT$ H|$E1E11HH觘DT$ AE<I}0ALI@H% fDAEpI}0f?Iu8f>LAE}L >IIEpHtJf8uCHXf=t'f u0DA$!= $/HCf8% @B@AMHc<I]@f;H{ Cf?uH12Af;E1JD#Hxf uHfuH1iIIuAAERH<$臼jAG8`DH$0HeίHD$HHXI}H7H@f7GE1fDIAxfA@E1fʃ<A1A} vIuPI}HMf?H-ͯHIPHHt$HH$0H'H8H@ IHuID$HEHvL$ HH9w H*HuDE(E EH}LHD$L$D@$IH HD)DA)‰HHHL M1It L9?1IEEH$ME\$I E)AEADEBADE$A)B AH9Hr)HHL MI L9uDE$D$ AD$A vHD$`L`@HD$H@HHDŽ$fA<$HD$((H=̯ L$H1LLymD$0 H˯H 9H H\HD$ Ht$H|$(L$H$H$IH$0DŽ$8} E1HƄ$Ƅ$Ƅ$H$H;HD$0.H$AD\$0 ˆT$8E1D$A  h[|$8 $0;$4HL$EH+IL$(D$0HDT$0;DT$0HH$H=.DT$0L DT$0H$t$49$0C>$ HD$ H @  HC HC4HCC<C= Il$(H H(.GDT$ ?HH‰=DT$ H$H{$m$8_HD$H=-DT$ Hɯ HȯC,DT$ ȯAED<jȍH4ƒ(wH HH ʄvI}HHW f^11lKHɯE1҉DAEI\$f;IT$ HHT$ 襝 HT$ {HD$ 1D$0Ht$`1$AEE1HHt$`HDAEE1+Ht$`AEE1H|$`$AEE1Ht$`DAEE1H\$`Ht$HHHHD$ k+HǯH9BLdMIT$H:HzpHT$NHT$HD$(H6AHT$DHE11AHHHD$HHHD$0賏I|$~$HpH$0a$hSH$PDHq AHD$(=U=Ht$H|$01DT$(?DT$(qL$It$LLL$(H=*{ 1zILL$(tH5aHL$LϺ1H=*1 I$H$AD$>ID$HkhH|H9Ht$ HvH|$`H1wHt$`qDHD$`11HHD$0HL$HHHL$(HHD$8NiD$  Hů1HtM9HvHH\HtHD$(HH-HD$flHD$(@>D$@fAHD$H@8opHuHPHfFNH;HHD1x vHxPo A(HD$(H(H |$ HH\$8l$ ADT$@H-%HHD$N)f;LL$DT$@/AA<AA=HD$(Ht@=EAAHL$0IHAhH<ïAQ,H )ïH0ïPL$ E1H9įLAE<Hl$`HEhHHD$H|$W#HHH t!HcH[ H9w HHSH{ HHƿDe6 HtBwD`HEpHt$`AH|$`H蠱AnpHl$`H]hHHl AGH11HHD$_A$H|$11eA%HF¯HD9HDLdMM$MA|$>H 2EHɉD$ t I|$ HN@HpH{HnٿuHs|$ HZHH|$`Hu H|$`HyHl$`E< wH$H Ht3t*TYX.H=y1HRMt I|$ ]LH]AEE1fHt$`DsAEE1I}HHH{ j tHCHX"E1҃IqI}H11DT$dDT$? HzH)9h HlHH}HE1A} vMeP}>fA<$WID$8IP8HpH$DU>EH}1TAHH>\HHE H9H AZf8HpH{2׿<HU H|$`E11ɃƘL;l$`AEE1ҍP LDT$V"AEDT$fDA"<D9XHDHLHt߃i(uDEt?=FhH H9 @HcHϾHHHHH H;pHHDHC 6D$0HD$ 1IEHMpHxf?wHp f>hLMXHD$ 19D$O$0D$$8H="D$@$4DT$ND$HD$8I{ H|$E1E11HHHDT$ND\$Oi$8H$HH;D\$HDT$@HLHL$0 HL$0H;$ADT$@D\$H)H;HHD\$@DT$0 DT$0D\$@C8H$+HD$H+A@D$(HDT$0-|$8HDT$0!H$H= DT$08z DT$0H|$8t]HL$EH+IL$(tvD$0{>{Ht$1HDT$0ODT$0\UH$JHAHL$H+D$8IL$(Atf>4@-zMHDE$HmHD$ HBD$B {>Ht$H=5|u {<A HD$`C<HpHt$`H|$ AHD$H=ZHHHcH9HH$HFH$0HT$@HL$(x HL$(HT$@HHD$(H9AAgH^HE1Ht f8 HD$f8T DL$ E H6HDT$e+HDT$IH(@4H@H@ H@<@=DP>tLHD$8HD$0HDT$@HT$X/d LL$8DT$@ HT$f: f} AA8D"\$8`;$8HL$(HQf:HD$`HpHt$`HHEI}Hf?t,L$ L$QH Ht3t* !^#H=1LG1Ls H|$`HIE8)Ht$`L茎LD$`D$A< wLLD$ LD$H H3߬&TYLD$"LD$H=߬1LFLD$H.E1A} vMePLJ AID$HcH9K)HE1ЉDAEAR7!nHHH!H9LHHLHD$ gH|$ ArgA9ľHDGyiHcyqDH9vD VHDe/ f8H[@6H HH95H= 1Hq LHHHw HItBwEeHT$ HHw HHtBwDcLSHHHIDSHHHHEؾqDIHL$HDHq H|$`H@Ht$`fDHm1ID$HD$ tC SHD$D$ % = EHHtD9HvDH|mH5HDL$ nHT$H5IXHILGDL$ Ld$HDLrDL$ HX HHDfA<$DL$ u H|$DHHNAEE1H|$MAJ71HNAEE1H? 
H|$`HcHt$`=H}11LL$GULL$HH9HHDHHHf8uL$Ht$L$0L$ H$L ^LL$?LM L$ hH|$LyUAl$Ed$4H$H$HfAEH$D@$;$1HH9̀9uH$D>$1;$HH9̠9u0L\$jAjH|$F$`^_L\$\EH$H$H$HHDŽ$H$H$DŽ$D$HDŽ$H$DŽ$D$cAE@*jL$ L$ #8Y8Y7!1L覔PH\$H\HH|$`Ht$`_At$ H|$8KLL$DT$@AA<AA=H|$8DT$rJD$ DT$DH$1@LDL$ "DL$ IM1H جEHɉD$ t=iجt4 !LD$(2H=Sج1Hl?LD$(MLv,l$ H5H‰HL$IIHQ)C LIHX H=D$ E1Hc,k DL$ IP7!H|$8DT$,ID$ DT$|Ht$01HDT$@DT$HuDT$8LL$HUHIG[߼DT$8LL$HHLHDT$8LL$ LL$DT$8L$Ht$DT$@H$H$0LsYDT$@H+DS>f9Hl$L$Ht$H$LH$0$YAhHD$(H(DP>V7!z|$0$u |$8 H\$ DT$8HHGD$(DT$8$u0L$HHt4$0;$4tƄ$HH$L$0L$0t$(HHDT$8a HH_DT$8HHHuRT$(jAA)hT$($H\tHD$ Ht @ uHC C=C<H$(Hxt"DT$0HܪDT$0HHtH;L$(wzl$(xA1H41HT$8DT$0PNHT$8HuDT$0hHT HHMAL$HLHtHHH9uLD$(DT$8L1DT$8H)H|$ \$@I|$f?DT$ ? HDT$ pH9gLDT$ O$H= HDT$(L$ g L1H"L$ HډHQ HHtC=H$DT$(i0L)D9r)hJTH=8 DT$0f DT$0fH4Jt)H{(Hs@1HK ‰9wH 6Ht!=yH;胲MHCH;lMHCHq;UMHCHHZ;>MHCPHC;'MHCXH,;MHC`H;HʱMHC0H ;ZHHChH ycsHCpH Hxh_HHpL= HtDBvHE1MD$ IDd$6IwHH9t,Aw IPH9tIwXH9AGHuIGHG*~t{uIGHHpIG@Hx+u#AEIG@A]H@@D$ IF(Mv0깸AIFL$H$INH$HIFLt$pHD$xM~1L9AtAM6MHD$xL$I9Ld$pLt$pH$IT$H$M~HID$HD$xL9uDH$HHq9jH@HHD0LxH@A?L0A1H|$P腠Hĸ[]A\A]A^A_ÐH$HHBH$E1HHDŽ$HDŽ$Lt$pHD$xbfD@IwH|$⩤tvAAWHuIGHf<HL$p1H$HAH$H9Ht HPHHH9u@HD$xIfDIE1 LE1oE9|KDvAVHIlHaA)9pYAIG@H@@;D$ (F1A vIGPDd$6FDAL9AD@t$6Dd$6D d$ EHD$(H8= HD$(I$ABAE~,EIn8D$ D$6D$ IG(HHp IO8H9H9HHI9I[fDEo,1Io8AEfH=緥Ht$@H|$p1rD$pHD$xDŽ$ HDŽ$]臜H$HHtwp1H$u d@9Xv[H|$pHRHHD0H@H1D$pHD$xDŽ$ HDŽ$]IHEHuH|$p51D$pHD$xDŽ$ HDŽ$]E1I0Ѱ OHHID$61;1D$ D$6D$ 1IH$9Dd$6D d$ HE1RfDPwσHuf.},uHHH[]fHEHxC@$H{H@HHtXHtHDHH[]úPSff.HIF8@D@ALuHHH ~KHHHH@@HqHx&IF8O 1>@HT$HR:HD$A !HD$H@8uHD$A@H$HHL$D$xHA0HtIUHRHRH9HED$xHD$H|$I3H3IILM1I9fAu A}NIHtxA}IEHT$ MUH(HD$pABfD$zD$@HuIBHfD$zD$@H`AIB@H@HD$hPfv fHT$hB4R6%9|$@5D$`$A$$$!Ѓ@舄$A8닋L$T$ H$ 1ExDL$(DŽ$ HDŽ$(DŽ$0 HDŽ$8DL$(E=9HD$NL|$0AE1LLcNAAE1L LBNAAE1L L!NAj|$8 AL$4H|$0AE1 HLD<DMH|$0AwAAE1L1MA%H|$@$HT$8Hk0H$LtXH$E11HLSi$ZH$I~Hp=L$HIL1ljHIH趏ty=3tGH$ @{c1DŽ$ HDŽ$(DŽ$0HDŽ$8LZHT$ <Z ǂL衎wy#H|$XH HD$Xf8@6Ht$X1ҿ/~=GtGH$ Ь{c1DŽ$ HDŽ$(DŽ$0HDŽ$8HD$ ǀI~1AYQIH|$XHJHD$Xf8t @6lH|$XHHD$Xf8t @6OH|$XHHD$Xf8t @62H|$XhHHD$Xf8t @6H|$X;fDAWAVAUATUSHxH_Ht$@HT$;t H% GID$v4HD$HHt x8A~LHt^EtBH|$tIH ILxj1Y:D$vHx[]A\A]A^A_HHD$ H@0Ht!IVHRH9BuD$vIbA MnAM4HH|$3E3HD$1H@HtHcPHD$McD$Hc@(HHLH)1HHAHHH)E @1L.肞D$vm2=G tEH$@{cDŽ$@HDŽ$HDŽ$PYHDŽ$Xgƃ@l$vHLi5 uH$@P{c1DŽ$@HDŽ$HDŽ$P,HDŽ$Xgg\$v%fDLphHD$1D$v5I^;HDA@ENpfHD$0H% D$vD$HH?'HG0HD$HD$fA}HDŽ$$|$$H|$$IFH|$@HE1LL``LL$8H$IL-1~HT$0H$H1逸IH$HEQHE@HU@H$H$HH$t,8t%8$H@HuHDŽ$t2HD$H;CXun f.HD$H;CXM H[Hu|$(HHH$pH%HH$p1HH9H$@$ H9y HML!tfA<$uIL$0H8HHIL$0HID$0DH\$HH\$H\$;CIA|$lAD$D$pD$p;D$(MtA|$u!AD$H$HtK!GH$Hm GWH$ H$ 1];H{0L葒HL!Hy|$@]AL$4D$AID$MDIHD$XD|$OA9H$Hxc |$@D|$ D$DA9Lct G-fA<$E AD$6Ht$XLICIӅ$A <U |$p1Lwމ6 D;|$D$ fDH=11L51I趏INIHL//HLϓH$HƺH=[11LLHWINHLI/ /HHpH$Hƺ;H=11Lxt$pHLHHsH$HƺHH$HX\$9$l$fA<$\$mAD$6Ht$XHډAHXIH膙{fA>H[?En6H}f?G6DA. V)L$ZH$HHf;ID{6fA}AE6D. 
U)H=1L1/HIL4 4HLܑH$HƺH=h11H1HeHKIHL/.HH¾/~H$HƺIH= 11LHIL4c4HL3H$HƺH=11H;H$H趌HKIHL//HHϐH$HƺH=[11LHL4H贄4HLLAL&H{APrvb$?$1H$H诖tMn[L蜖tG= @H$DHT$P1L4HAINRhD9l$(IFu(1aF1H$HHH$1Ҿ1H$BHJLdLp0HT$PH$@H1fH$1ҾH߱H$BHJHlL$H$@L|HD$`H|$`THL$`H5*Hp}H$H=n11HH$IHھ<HI셷HLlH$LhL|$@LcG-Dt$E9tCt$(AHAǃtHHH!H9u A9TQD|$cD;|$'Ƅ$yLc/HxAL$4AID$MD$HD$XID EA9H$Dt$HDt$HtAH9L$(w|$(*ZD$(1tܺ_1HD$X$LhI]H\$LH$@L$9H|$PHD$G6AL$4HHD$`HH`LIH1LHH$HDŽ$@cH$@LH$Ht$H=11oHT$XHž1)HHhht f}uHEHƺLH=11LHLHD$^Hl$HHHLHt$H=Z11HT$XHpHHD$σHL_H= 11LLL$L$?H$HIL蔂HLH=կ11LQLHL4.x4HHHLHt$`H=11 1LH$Ld$L$HT$PH$@H1HDŽ$@&_hL$κ0Hk0HIL$(HH8IL$0HH@ID$0HIT$0FHPHj0,1L1Vjuea1L{1r7 11ҸDt$?L0fA}IEuf8tHLD$(HT$8LHƿ/vHt$0H1H$1ɺH趱IXL趄I萅H\$0H$HHD$@t$(H$@H$TɛIc0IEALHD$(|$I~HH߃;l$uIHt$(LGJHD$(f8x4xI#HD$aH|$t\=ؤtGH$@{c1DŽ$@HDŽ$HDŽ$PaHDŽ$Xgqƃ,|$=kؤtH$@@{c1DŽ$@HDŽ$HDŽ$PiHDŽ$Xg 뗺07KH11E1HD$pHE1LH |0t$PH$諕E1DŽ$L$HD$Hz$H@ HDHx螙IL$1MLd$8"HDRMH9l$HDH=Ʀ1L4Ht$@H7LHH >*IV(9l$MHP8IV0HP@IF0H0Iv0!$L$$9D$PL$H$BprH\$IF Hx0IH|$ue|$Hu< L$HD$ H|$xHD$xD$HH|$ D$w\Lt$0D$H^w0(cMM81y@ H|$8TxItLL9l$IEpHuHHH fyDG,LH$@MMH&HD$HD$`HHD$hE1H}H|$/ڣHPHHtzt[HpQ=QͤtEH$`1DŽ$`HDŽ$hHDŽ$p"HDŽ$x2HxHHLHE1HHt1٣HPHHtztHP=̤ %1HHHI HIHvH}f}H(f8(H$`H$0H$`H$0HHHD$ ND$8HHt$ 1D$8HL$H$`LLT$(H@DHp4ADDL$0/ LT$(DL$0H$$AL# H|$ H$1qD$8@HD$hHLLPUATAWL$LD$xHL$pHHD$HH Ht%=H|$PHD$H@HH'x'H|$֣H#tH HCHt @D$.HL$Lt$ H$`HDHL$HD$0HD$LL(Lu߷L$0M A|$ID$HtH4HH9$t =ʤ A$wdA$tY=ɤtB{cH1DŽ$`HDŽ$hDŽ$p"HDŽ$x}111LOMMtS=wɤ)LHHIt׋k A$A|$MMuLHH޷L$`MtULHHt8DEuxt&QwtHH@Md$MuAEQM}@IE@L$pHD$H$hL$`fDMA?LHHIHDŽ$03xLMt)@LHLHHH$MMuM$LHHLGH$0H$`HLzL$`EHx(>HH.5L$`MAEQI}@HD$L$pHH$`H$hu(fDH$`HxHH$`?uG@tD_`EubHxIf?t耧LHEhHHEѷHXHt PH`Ht @HpHt @HH0DH$0HD$ E1D;D%fƤEAf.LԠ{cH1DŽ$`HDŽ$hDŽ$pr"HDŽ$x목fDH% fDAGaIGHMf87C@9LΡHAӷLL$`LHqL$`HD$HD$H9D$0H|$XEDL$8Lt$HHL$XLD$`HT$PLAEDE謟LLL|$УDHLzMHD$IF1Lum,LxX:INXHAAHAAH1HuH8A2F_HH!H9HI#~XDHHPHHAHHH!AF_D8DFLl$HLIH!t H9Lt$LУIL1HHIHADA)HD$EEEHD$H$`McHHMLHD$IcELHHH?=LH,-H$`LxL$`~It$H|${cH1DŽ$`HDŽ$hDŽ$p"HDŽ$x蘚LHH$0D$H H $H91I|$H4$H@CI~HD$P肓HD$X1HT$UHT$ELH=HH!H HH1HuHH8H81`HHtRHH9IHpHH9v HHH!ADGEHLH!1LD$8HT$0HD$M萱HT$0LD$8I=HI=H!I HL1HLH8H811LD$8@LH1HuLH8HT$0H8LD$811|f.H蘙H$`1{cDŽ$`HDŽ$hHDŽ$p5#HDŽ$x)HD$Hx0tB{cH1DŽ$`HDŽ$hDŽ$p8#HDŽ$xۏ5H1DŽ$`HDŽ$hDŽ$p9#HDŽ$xގTcH$`D{c1DŽ$`HHDŽ$hDŽ$p"HDŽ$x=HH!H9HADGHHEgfHLH!H9HLL!H D$H=H!H HH!H9HLH1HuI8H8D1uAAHL$H9vHHHHH1H%E8@A,A@@tE8DAFLT$  $AA[A\LT$!պ@BLT$pjjAALLT$ $AA^A_L$jjHAAH$ $A[]jjAHt$0AH$/ $A\A]jjAHAH$ $_AXAA@jjAHt$0AH$ $AYAZL$jjAALH $XZCHD$L$jjAALHpx> $AY^L$jjAALH $A^A_-HD$L$jjAALHpP $AXZp7 豸p6 蝸!艸Q"ujjAAHH$ $Y^vjjAHt$0AH$ $_AXAHBtk&=thHD$ptJcDH|$@x{cD$@HD$HD$PE HD$XH 1MHD$pvfDLL'F?fDfD(1E1EI$IDŽ$(1qEE1I$HDHT$@{cg6oCHD$PD$X )D$@HD$`H|$@1IeN@H={eHHsHHD$腼HIHT$Hfu Lw H f=t 7LHUHj0HB(HDŽ$HDŽ$HDŽ$H$xHDŽ$H9H$pHDŽ${HELuHHHT$HT$A<-p@ Iv(Ht<v I~8t H6H}HFHH DH$Ht#H$HPH$HHDŽ$H$HEH4$H$HEH0HuH$pH$xHjH90}H$H9H$pHELuL LJA<<;$HHBH$1HH$pH$xHDŽ$HDŽ$H9DDH@D9tf}GHUHL$HrH91Hɢu+UHHc < H HDH$x<hP \<IV(vHt I~8t HHtJH$pHٺ~{c1DŽ$pHDŽ$xDŽ$HDŽ$MƨVL]ضH$01HHxH$0HD$xHt H$ 艒H$ HtxHD$HH@HR xH HD$H@QAHP@H@@HaH@ H\$HH$xHH$pH$XH$p$$&HD$HD$@D$LMLpH\$ID$Ll$pMHet$0H\$HxHt\$DWE҈$H\$HXHt\$DOEɈ$uiH\$H`Ht\$DGE$uCH\$HpHt\$$u\$$H\$Ht @yD$h\$hҸE؀$\$h,H|$|$H$ HD$H$(Lt HD$@LL$0H\$ H|$ E1HD$`HD$(H$|$,H='HGx)|$(H(AM H=LlL|$IHAHqHyHD$@AHAML-\$0D$IEALl$pH@@ HL$X\$H9HcHL$ D\$0D$hH9$ENHD$hAH|$P11qHD$xHHH$pHRH$pHF$p%HD$ KDt$0EpDH\$HHHs$HH$p%=H$p)$1H=kHH$H\$@1HL$xMFHX@HIT$.$IH|$(:H|$`.HD$(Iu1H=[1HXHH!IuH=:11HLHH}fA<$I|HD$(HBQ&HJ@1Hr@HtHA H$pH$xH$L1H$pHT$(1HH#H}nHT$`1HH#Lk0IUHL$pMG1Lt$@HIHx舿HS1IHپGIH$LL(4H=aH:)yL|$L!LHHD$(a|$t H|$)IMtGHD$(L8L\HuLމIL\HtHp0I9LyHT$(H|$LHH$H$1ɀHHt$H|$諨$HD$` H<${HH$L$H$H|$H$D$0HLAD $EBH;ZIH=Щ11HpKIID$H$L 
$uH<$HD$(HD$HHD$(H|$8HXH@HD$0HH@D$PHD$0HL$(H@H;HHD$8H$0HpH$PHL$0L$PMMHD$0H@HD$X@IFH@0ff#bAIE0IHXI-f8HH;H;H|$8@$Hu1LH@HMtX)HUH1IHHHfH1LIHھF1IH߾pζH\$0CQHs@1HtHV H\$XH$H$pL1HHDŽ$H$xH\$0H$萯H$pHúH$H$p&HD$(HLp$MMmMMH|$L=yףI8H$pL{c1DŽ$pHDŽ$xDŽ$HDŽ$@=ףYH$p{c1DŽ$pHDŽ$xDŽ$HDŽ$裻 @L1+:IE޶|$h H$HMp1޳H|$HH<$HhH讑IH$pH$ DŽ$pD$HHD$8$AHD$HELHHHDH2H<$IHD$IFHD$0H@HxHXLkIEHxvH@H<$HHT$H=HT$HH;P>H|$8LL$pMHl$HImXf}IH|$IuX誁Iu0H=ǤHILD1L$&Ht$0HHD$h$HT$1HLHD$XL$HD$X1LLHLL$hs$LLMmMUHl$HH9$H|$8H$DŽ$p@‹$$ u@2$@%=H$$ H@@H%@8b@v Au @tpAAD uAAAADǁȹ=G%@8G$ %H$ Љ$ $ Ј$$ _)$ H<$12H|$  HD$ HHH$` H$pD$PHt$8H<$1҉$*腝0="HI`ޢLHHrsH|$HLzHH HHT$H|$ {)L|$xIݵE1E1LH1H藬H|$P1keH, ILp1ڳHH$ݢL|$L(LݢHLH$pݢH H$PH$IDŽ$PHHL$`@D$H4$LLALA蟒IHJH9uHJHt$HHH$p$HHD$8~)$HcH0L|$Ll$xFLLH$HT$P1MGٳIHT$P1MGIسH$0LIƄ$0$0H$H|$PHt$XHHPHHR4H$p$x$|cHPHH$HR4H$P$XHL$8$\HT$`蕾Ht$8E1HDŽ$HH$p)T诚Z:#HD$Ll$pD$1HpIHLD9HT$X7I@H$ LIhLD$LxHXH$IGHx A1HHHHHpHbHHIbLD$IH5H$pIH@8H$PH$PH$p螷H $\ IHLG1ֳHMHc1ֳLH1HCL|$xLt$ HcMgLMILa1ILLFֳ$ILt$ tb$pHcD$0HcH9t$ I9}8HD$ LLD$(H9HBHD$ oaLD$(HLZ1ֳIH$Hk1ճH<$I٢HHH$pH$DŽ$pHHD$8KD$H|$HLAHAH詎HHD$`HEHHH9HL$(L4$HT$`HHL$(LHD$HLH@XHEX AV$H$DŽ$I$H$1D$$ L4EA@IHHSXH1LuH82C_AuDL!L9TH{XDL!HH"HIIDK_M!AA8AFI1HHXH$HL$XHHX@_L!HL$PAƈ$ZHSXAH1LuH82C_1/HL$XHH1LuH8H8Ll$P1ȨH{XL!L9UHL$XHL!L9AIL#$HH=M M!LH HH$L!L9HIH1LuHD$XH$H8H81Ȩ`HD$PHLHH1IH$pH$pHH9H@HGC_L!EDI<F@A9AENE9}A9D~<FAA@Ht$PDHH#$L!H=H L!HHt$XH HH$H1LuHHH8H81DDD9ADYHH#$H=H L$PL!HHL$XH HH$H1LuHH8H81بDDDAH$HLL!E$AH#CXH HCXH8A $DC_H$9C$aH$AMHL$8yL$pMHD$HHDŽ$pH$xH$H$p<ѶH<$~բHpHHxHXH$pHHD$8ݶH$pHC$HD@HH|pHmHuH|$ ~pj ݆H$HL(HD$HHcI]HtHHH!H9,Hp[HIԢM1LHھ]Hh гHIq¶H,H$PLHHDŽ$P8EQIKHE@H$PH$pH$H$pHE@H$x϶9@HDŽ$hH$pHDŽ$xD$P\DC_LkXM!AEZk_MI!@p ܆HHc@< ۆDC_AEHL$XH8@tP`IEDHl$PyHt$XH8@@tA@`IEDA1HH=L!H\$P&1}HH=L!It5HL$8Ht$PLF1EIH=M!HHDŽ$pHH@HD$(_ILI9IMHt$ YILHt$ XIMHھG1ͳDHcT$0$L)AH)1HHD$@HI\H6HEHD$HHAHYHyHHqAHD$Hc1Hc f fw1DL9bH\$@$HD$\$LlHD$LHD$@D$MH$ H$ph1DŽ$pHDŽ$xDŽ$ HDŽ$먨p .نH$ H$p8DŽ$pHDŽ$xDŽ$ HDŽ$菨p ؆։F7@B@2$ @1f c؆ O؆ ;؆'؆؆׆H$ H$phDŽ$pHDŽ$xDŽ$ HDŽ$`p ׆p ׆H$ H$pDŽ$pHDŽ$xDŽ$ HDŽ$p 3׆It$H~tH$ѶH$H$p1o$L)$pH$ζ~p ֆ)|IE5H0\$E@$Ot$0Ll$pMPkֆ@D|$DHø H$xH$xHDŽ$HDŽ$HDŽ$C4?H$p )H|$`HD$`vHD$`D|$DPC4?p)t&HDVHT@HtHHH9uELE1I1H@L$AL$4L$pD$DŽ$HMH$6A}EED)A9vtA}C47*>H$p1HH4HtWL$ T$~$EL$ IH$pT$%AU AEAEEAEt%1AHLƒA9ILuEu@Ll$H$p1ELIHD$H1L4Ll$HH$HT$HH|$1Ll$`H5,IDDl$DA9A-ED$D$$Dd$$ED$ D$$E1HD$1EfDHt$HtN41HHt$HtdH=v11҉ ID|$ D$HT$`LDP@ 8LHLHTHH|$HVHD$`A.LdAE9jD$$9D$DAD$Dt$AIMH HIH9uD@I1D$$9D$DADAHD$(HH;k)9D$8vb{2D$84(>HH4HD$(H8H_k!EHt$(HX PD$8H|$`th1Ll$0Ld$(H(I$HD)H9HLPrpHLtBD9sA1L4HLDH|$!I $H9H|$`QrqHDuHt y~GH$pHt xT׃HĨ[]A\A]A^A_DHDŽ$pT$?f 1HD$hHD$hHD$`DG{׃D$81D$?@D |$?PxHSHHt)uH|fHtH2H1HHH9uh;H|$`EE1HD$(HD$8A100fu0ÛDLHÀHH}A L0LcHC HC(CIEE11HC?HULHH}(;=y{HH@ HH1ƅDE1fLdI|$HtLMD$Apt!Wt0f9Tu9rIPHD9AHHt;XrEDHI]HW Dc%D9E|$HD{JDD%oE>A.H0Ht:GAEI(I;.HH fc G@%E1AƅHbHHDž0DžH4HHH=HD` H}JrHBH|fDHH9QHHq1HtNA9tNt7IOH}{c1EHEE HElH1E1bDD,H}1{cEHEEHElHH@ HE{0DDK5>I}IH4H7Dg訒DHIEADcE|$%D CAIEEC1HDAH4sDIEHtH H HHH9uD`HEH}{c1EHEEHElH}1{cEHEEHEIHH@ HX1LLAL^H@1HIFHIF HxE1Lu"fDBHID$DH@ NHDžpHEJ A$f.HHHxLrIvHJHBH@HHH<ʋHKDPD9VL I9$D HD;Lt\HpH">HpHuE1DH0DBH0DAƒ9H\uqf.1 1E11{= EHE@EHEH}{c1h&=t [EHEEHEH9tHHH@H.HD`D1?HH7?I~1AL;9wv;DA9ƒ?ILHM‰IM THu1A9HHTHs^ۺh! 
!u!1}Hp,kH?& HEHEHEEEKM>IH@$HHȇDLHHpDeHpI>HxI>Iăo,u1OHAD$,1D]E1L LP HxDeDHHpHEHH@$HD+M"A9BGM4HAI~IЍPD)D苍)IL9ED$AA ucEHHH9w)։HHHLMuDeEuD]H% EtDetD]H% HBIFEtDetD]H}AE11Ҿ Mb)LHMIT$E1I>INIF HHtuD9`voDHLtIVHRHt;u%uI~Ht yKGAIN1Ht9Qv։TH}$HxHBH9Hf<H$HxF=HI#8H) A>g$D$B H$fo$H$$)$$$IT$ HDŽ$HDŽ$H$HT$$HDŽ$ =HHT$H$HDŽ$H$HH$$H$LIt$ I~ ׵% fDH@HHhfDHL$$DŽ$HDŽ$HDŽ$;|$H\$(fxHp1HHH9wH$$HHL$H$ѵHL$LHL$HöH$It$ H$fo$HH$$)$$öH$fo$H$$)$$DH$jjAAH$PV($(A^A_}L$LHLöH$IV It$ fo$HH$$)$$¶L$H$jjALLH#$AYAZjIN jIt$ HA$_AXcH\$(aH$|$M81=IUHHfL$HH$H$L$LI2H$LE0ff.@FkCfDHHRSHHHQH߾([kCf.@TUS}1I1՛HH@@ 1MffCH{CkHtL軯D+H[]A\fD1D+H[]A\@d$ fA<$8tLL@I9Hf;8Ld$HD$Ld$t"DAt$K\H[ fstfHuILE1HQBwDaH8{HHtBwDeLd$MA;\$rA|$yAD$Lm%E1AwHHDbAmpMtvLD$I~PHLH$HILD$I~0HLHHIxLD$I~HHLHHIUMvMuIELh @IExP>P$!I}GO L4It0LD$LHLHI_HIII}uLo_D$Hx HLHHI,MMMi@I}LwfA>&mIFI~f8LHt$,HD$,I~PfPfPf'f LD$HHLHHI|MNMi @IELpMWLD$I~HLHHI7MvMu&fDHII~Lt$Hx8HLHMHIIEHx@MbI~LT$L$HLGL$LT$IAMfLLL\$L$ML$L\$AAE0TS葑f.LLT$LT$L\$LL\$LT$T$)GLLT$LT$L\$~LL\$LT$T$dAA0TBSfDMMA'AA<$zfA:MIzG4f%fu GL$L$H¿FLIRD$@HD$PMHD$HMAAAA @<$AAMM5fA;LLLT$L$JL$LT$L޹LGHL$LT$M1\fA;MI{G4f%fu GCL$? L$H¿GL&ILLL\$L$HL$L\$AA60TeS<$tTfA}MAM+LLT$L$LzL$LT$fA>MAMH LH¿GILI}G4f%fuGtBLT$L$o HLFZLT$IAL$MPHLT$L$. HLGAtuAfA;MI{ LH¿GIABAmfA}LT$L$I}(MAtCAfA:M-Izm LH¿FXIMMLLT$L\$LT$7FLT$L\$T$H LH¿FII~G4f%fu GLT$L$Ld$ LT$L$jjLT$LD$H^9_L$LT$?@)эPHD HHHD {IBI+CA@D$8HD$ W@Ld$ D)HHHD$ ?LL\$L$L$L\$fHLT$L$HLF@E$ttHEH0E ƒuHx71L5(9]$v@HUH4‹E ƒuHyκ[UU!af.v% fDH% = tRCt4HHSXHtg >HtOH[fHH޿[1\8@uBkpf.HSX HT$HWHT$1HHCXH[ff.@pL$(AD$AAH$LLH A)AE AED)؉D$E$D`EAfDH{Ht$\}tFADd$D}$DL9rE)DHT$HHHHuMLDIߋT$ |$(AHt$ @18HHt$ 18aHt$m1И8USHHHHt'HHNHHH[]1ɉfDHt,T$ D`K)HHHOHyA?HA)AB4HcLIRLII~N$1ҐI I HL9uE~&AKIHILHHHH9u߃?tHHKDHAQ؟WҼ~ff.L]A\A]A^fDACE1HfN N IL9uؾHH 4D[]A\A]A^IC[]A\A]A^ff.AUATIUSLHMLHHt$HX pHh!tH|$"H! fA<$u)f;u"HD$fD$ffC f=u-HtHHB7LsMtA>fA<$u'HtLHMt$Mt A>@f;u9HtHHLsMtA<6<ffA<$u'HtLHMt$Mt A> f;u"Ht HHtLsMt A>H_ fA<$u6HtLH 6M|$MtA<@<MDf;u)HtHH5HSHt :@fA<$u/HtLH5ID$Ht8ff;Ltf;HtHHHCH8HHuHPHf{Gf~GfXuYHLpHtfA>u LHLEHL$Ht$L$HLL$L$wA@H蠫AfA<$HtLHMt$MA>}AVHuIFHfH[HIVHUf:[HHT$HHHT$A~ U3MvPfA>uHL$LHL$HLEHLHL$HL$u LIHf9H$Ht$L$HL$HLL$L$H$AefH|$F~H|$l!,A7H='H{THCAE$I](IEAUHtHH2HSH+:"JHuHBHf]HHJHOf9r[HT$ HHL$HIHL$HT$ [z 1LzPfA?u LHLEL?HL$Ht$L$HLL$L$L$"ARf!F+H=o&H>HD$Ht$HAEXAE$LMe(AIEfDLt$LCt fA>!*AtsH=%HAA$ID$AE$Me(IEAU3f;%DfA<$Ht LHtMt$Mt A>0E1f;5HtHHHSH: JHuHBHfHHHJHRf9HT$ HHL$HIHL$HT$ z /LrPfA>uLD$LLD$HLELLLD$2LD$u LMIfA8uYHL$Ht$HL$L$LH$LHT$djf;HHHSH:HHT${f.fA<$f;HtHHZ7HSH:HHT$=(HT$_L=,^=Tz yHH{f9u HT$ HHL$HL$HT$ HHELfA>uHL$LHL$HLEI9 1HLHL$ܝHL$LHZ2fA>HL$Ht$L$ATHLL$H$L$pA@H|$ޚLt$LIt fA>R8!$xH=H }AA&&&1XoV@HxH|$FLt$L豛AAf SfSHD$f8HD$P6 HFHcЀ FHcЀFHcЀFHcЀFHcЀFHpF!#D$>H=HHD$L$H$Ld$HAE4AE$HDŽ$IEIELDŽ$4L$Ƅ$DŽ$H$H$1LLd$`HIH|$jIUH$H$HH$LL$5A;fA<$lfDf;Hf;qAWHuIGHFfR$`rAVHuIFHf<HINHCf9HHHL$HIHL$LA~ &INPf9HHHL$ HHD$HL$ LA~ )GHMvXtfA>u LHLEAkf-$ sH$L`$bH$SH|$`H6|$x;H|$`/!AH=HkHD$Ht$HAEcAE$LMe(I]0IEIG@JHuHBHf{t f~1HLzHBfA?BLHIBfA>LHu%M~MA?AWHuIGHf]HMwHfA>u LHtIA $MPfA?u LHLELO_HL$Ht$L$HLL$L$L$2AfHHuHPHf{t f~)HLxHAfA?uALHIdAfA>uLHtIVHt :;fHL cf;HHH$HSH:JHuHBHf{ݲf~ӲfX[HLzHtfA?u LHLEHL$Ht$L$HLL$L$AL)oT/&N>$S%A~ HMfA?u LHLEMfA>u LHLEL fA>HtLHIFH8HHuHPHfXaHLpHtfA>u LHLEM9^1LLuҝ&LL(HL$Ht$L$HLL$5pAHDAVHuIFHf<HINHBf9WHHL$HIHL$\A~ 8!INPf9WHHL$ HHD$HL$ WA~ JHMvXtfA>u LHLEAkf-K$ŐtL)/l}.&0tx>$r%A~ HMfA?u LHLEMfA>u LHLEL NL91LHНfA>$HtLHIFH891LHНLHt&HL$Ht$L$HLL$pA\HD$H@f8 YL=*^k0=(TXU=$A~ IHHD$(f8uHHHHDL$HL$MfA>LHHD$ /f; HHLsMA>L=$A~ MHL|$(G7fA?uLHIDHD$(MfA>u LHLEHD$HL$(H9=[Ht$H|$(1MΝZL;t$ ZHt$ 1L+ΝSHt$ L$>HD$HL$L$Ht$A$HLL$H$H$HD$ H$HD$(H$pA@L)t0&8E$t%A~ HM3fA?u LHLEMfA>u 
LHLELvM91LL͝XLL]#EHL$Ht$L$HLL$0pAL)5&5Q$%A uHMgfA>u LHLEMfA?u LHLEL91LH̝LHu"HL$Ht$L$HLL$L$L$pAHIWHpf:HHT$HIHT$A ILcHL$Ht$L$HLL$H$L$FAHIWHof:ΆHHT$ HHD$HT$ A .MPfA?MuLHIf;uHHtHCHt 8ُL97C7HD$HL$L$Ht$HLL$L$H$VA9HIWHnf:хHHT$HIHT$A PMPfA?MuLIHMDfA>Vf;L晴>fA<$ޢHt$LHٿHLT$MHI !AH=LT$HHD$Ht$HAETAE$LMu(MU0IE)'@HMHSmfA?NLHINf;uHHtHCHt 8HHhWGHL$Ht$L$HLL$H$SAHIOHmf9EHHL$HHHL$XA MPfA?MuHT$LHIHT$>f:HHT$HHT$LzMrA?hAWHf{f~fU5HIWHf:kSHHT$HHHT$=SfA?fA>L91LHHL$ƝHL$H$Ht$L$HL$AGHLL$L$L$4pAIDHIWHkf:HHT$HIHT$zA ILLhӝu LMIfA?HtLHMMA?AWHf{f~fUYHIWH$f:HHT$HHHT$fA?fA>L9Y1LHHL$!ŝHL$H$Ht$L$HL$AFHLL$L$L$XpAmHtLHIWH:JHVflCfmfnHzHHH|$ 2f?uHT$(HHT$(tHD$ z HRPf:uHHT$(HT$(HHEHH|$t.LOt"H;\$ Q<Ht$ 1HÝ,;H|$L_H;\$ [Ht$ 1H}ÝHt$ HH|$fnH|$$\! AWH=H[HL$ HD$H$IUH\$AEXAE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$跫H$1HIHnHt$IE(HL舫IGf8HD$ LhH\HHD$ f8,hM fA?}L{#fA?}LH|$t.LAt"H;\$ Ht$ 1H]LH|$LQH;\$ SHt$ 1HoHt$ HrH|$X`H|$N! AIH=HHL$ HD$H$IUH\$AEXAE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$詩H$1HGHHt$IE(HLzxDIGf8HD$ _fHLHHD$ f8?fM fA?{LkfA?{LH|$t.L1t"H;\$ Ht$ 1H裿gJH|$LAH;\$ Ht$ 1H_wHt$ HbH|$H~PH|$>!A9H=H@HL$ HD$H$IUH\$AE~AE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$虧H$1HDHHt$IE(HLjhDIGf8HD$ rdHH|$Tx,H|$z!AEH=H.HD$IUH$H\$AE~AE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$詡H$1H?HPHt$IE(HLzxDIGf8HD$  nHHd}HD$ f8nM fA?]xLkfA?AxL;d$ xH|$ 1LͷH|$ L(LHH|$V~LnH|$v\H|$BxJ!AuH=HHD$IUH$H\$AEXAE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$ٟH$1H7=HNHt$IE(HL誟DIGf8HD$ lHLH|HD$ f8~lM fA?vLfA?vL;d$ 3vH|$ 1LH|$ LX LxH|$L&H|$tH|$rvz!AH=&HHD$IUH$H\$AEXAE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$ H$1Hg;H/Ht$IE(HLڝDHtLHIWH:wJHuHBHfl#M9a1LLHL$(LD$ LD$ HL$(LLHL$  HL$ H*rH|$8`LPH|$fr>H|$$t,!AWH=HlHD$IUH$H\$AE~AE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$軛H$1H9HHt$IE(HL茛IGf8HD$ ShH,HyHD$ f83hM fA?\rL{fA?@rL;d$ uH|$ 1LݱH|$ L8LXH|$fL~H|$plH|$RrZ!AH=HHD$IUH$H\$AE~AE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$H$1HG7HHt$IE(HL躙z I% fD sAA&&&1-oL@fA<$tfDf;Mf;8fA<$|fDfA<$fDAVHuIFHf<HINHf9!HHL$HIHL$$A~ INPf9 HHL$ HHD$HL$ $A~  HMvXtfA>u LHLEAkf-k$vHtLHIWHA:8JHflM91LLHL$(LD$ LD$ HL$(cLLHL$ cHL$ FH~6H|$<$L|H|$lH|$xn!AH=,HHD$IUH$H\$AEXAE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$H$1Hm3Hv7Ht$IE(HLIGf8HD$ "RH H|lHD$ f8RM fA?gLfA?gL;d$ jH|$ 1L5H|$ LxLhH|$nVLFH|$j4H|$l"!8AH=^HHD$IUH$H\$AEXAE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$AH$1H1HHt$IE(HLDIGf8HD$ /PH<$HTiHD$ f8PM fA?fLfA?eL;d$ hH|$ 1LeH|$ LLH|$LvH|$idH|$jR!hA H=HHD$IUH$H\$AE~AE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$qH$1H/HHt$IE(HLB@DIGf8HD$ NHlTH:gHD$ f8NM fA?%dL3fA? 
dL;d$ fH|$ 1L蕨H|$ LLH|$LH|$LgH|$ i!A=H=Hk2HD$IUH$H\$AE~AE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$衐H$1H-H2Ht$IE(HLrpDIGf8HD$ LHHfHD$ f8LM fA?DbLcKfA?(bL;d$ dH|$ 1LŦH|$ L L@H|$L>ֿH|$|eĿH|$:g!AmH=HHD$IUH$H\$AEXAE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$юH$1H/,H Ht$IE(HL袎頽DfA<$}鉽fDAWHuIGHfLf f[ f\,HMwHtfA>u LHLEL9f1LHLHN޾!LAH=rHϛHD$AEHAE$IEIvH9tH֡jHt$Mu(HMu0L臍酼f," *!+A~ HMԇfA?u LHLEMfA>u LHLELfA>HtLHIFHr8iHHuHPHfXHHLpHtfA>u LHLEM91LL\ LLHL$Ht$L$HLL$pAʼ/@HD$H@f8 n 5AA& &&1a o,Ae*D@++A~  HMfA?u LHLEMfA>u LHLEL ѻL91LH+fA>HtLH$IFH8~HHuHPHfX]HLpHtfA>u LHLEL91LH蚡"LHHL$Ht$L$HLL$8pAߺmf.fH>HIWHA'f:HWHHT$HIHT$UXA pMPfA?MuLIHMDLLLT$ LT$u LMIfA>f;HCSHsL|$ILD$nLHHD$ Ff;HHLsMA>L襷=T}A~ MHL|$(fA?uLHIDHD$(MfA>u LHLEHD$HL$(H9Ht$H|$(1❝oL;t$ THt$ 1LHt$ LӼHD$HL$L$Ht$AT?pAdHLpH#fA>RLHIQfA?LHtIGHg8^HHHPH߷fD,H*P+A~ HM]fA?u LHLEMfA>u LHLEL M9O1LL/LLtHL$Ht$L$HLL$_pADHIWHk&f:QHHT$ HHD$HT$ XA MPfA?MuLIHMDLbrL9{CqHD$H@@L!A_H=ߎHqHD$IUH$H\$AEcAE$IEHD$HIE(IFH$HHDŽ$DŽ$XƄ$H$DŽ$L$軃H$1H!HHt$IE0HL范銲HMwHtfA>u LHLEL9b1LH--LH!A-H=ގHFHD$Ht$HAEHAE$LMu(Mu0IE܂ڱ=u =pyL=mr"j=n|iz HLtfA>u LHLE,N*T+gA \HMfA>u LHLEMfA?u LHLEL9Ŷ1LHLH\HL$Ht$L$HLL$L$L$ѐpA®-=Q=A~ 2HMyfA?u LHLEMfA>u LHLELfA>ׯHtLHįIFH8HHuHPHfXHLpHtfA>u LHLEM9+1LL衕RLL?HL$Ht$L$HLL$apAt'(A~ HMpfA?u LHLEMfA>u LHLELfA>HtLH~IFHq8hHHuHPHfXGHLpHtfA>u LHLEM91LL[ LLHL$Ht$L$HLL$pAɭ.A~  HD$A~ IE1A~  HD$A~ IE1ȼfA>HAV6 HtLHcЀ t@HcЀt4HcЀt(HcЀtHcЀtHcЀ< G`?#M\M锾fA?MM鏾fA?8MCE1z ILzP鸰f;HT$E11LH迒WY'.(0A~ %HM}fA?u LHLEMfA>u LHLEL֫L9ڰ1LH0}fA>HtLH)IFH891LH蟑'LHHL$Ht$L$HLL$=pAr-1=QĪA~ HMfA?u LHLEMfA>u LHLELjL9@1LHĐfA>@HtLHIFH 89# 1LH3LHHL$Ht$L$HLL$чpAxE1ظA~ 4IFPHD$A~ I-E1B=$z HHۆf9u HT$ HHL$HL$HT$ HHELfA>uHL$LHL$HLEI9xZ1HLHL$"HL$LHzfA>HL$Ht$L$A$HLL$H$L$ĐpAϦ@E1pIMHL$zI HL$8#ӎ xAA&&&1 oQE1z I.LrP%A~ "3HE1V=9'y:=F^|Q=:`z HLfA>LHHD$f;HHHSH:HHT$ 详=:HT$ '=Tdz HHDD$ HD$ fA>=sIv`mѯI@CA~ IfDA~ HL$"A~ IDA~ HL$"EhY8=&eJ=$.Rz HLfA>uHT$LHT$HLEfAL91LHIQLH>HD$HL$L$Ht$A$HLL$L$H$pAfDC-0;=QA~  HM)fA?u LHLEMfA>u LHLEL#M9C1LLELLHL$Ht$L$HLL$u~pAZD'3(>A~ 3HMfA?u LHLEMfA>u LHLELLM91LLnƠLLHL$Ht$L$HLL$}pAAfDIFPHD$IFPHD$2I髤E1lLt$IFf8 .H|$HIfDz &IgfDA~ NHMifA?u LHLEMfA>u LHLELfA>HtLHIFHӟ8ʟHHuHPHfXHLpHtfA>u LHLEM91LL轅nLL[HL$Ht$L$HLL$}~pA+鐝DA~ HMhfA?u LHLEMfA>u LHLEL־ǞfA>HtLHIFH8HHuHPHfXqHLpHtfA>u LHLEM9"1LL腄6LL#HL$Ht$L$HLL$E}pAXDA~ ޝHMdfA?u LHLEMfA>u LHLEL螽fA>HtLHpIFHc8ZHHuHPHfX9HLpHtfA>u LHLEM91LLMLLHL$Ht$L$HLL$ |pA 1HT$qJE11LHقٜLH6gA~ gHMBffA?u LHLEMfA>u LHLEL'fA> HtLHIFH8HHuHPHfX›HLpHtfA>u LHLEM9ް1LLցLL3tHL$Ht$L$HLL$zpAD驙fDA~ .HMdfA?u LHLEMfA>u LHLELߚfA>ӚHtLHIFH8HHuHPHfXHLpHtfA>u LHLEM91LL蝀NLL;HL$Ht$L$HLL$]ypA pDA~ HMcfA?u LHLEMfA>u LHLEL趹fA>HtLHIFH{8rHHuHPHfXQHLpHtfA>u LHLEM9Y1LLeLLHL$Ht$L$HLL$%xpAӘ8DA~ HM~fA?u LHLEMfA>u LHLEL~ofA>cHtLHPIFHC8:HHuHPHfXHLpHtfA>u LHLEM97 1LL-~ޗLL˗HL$Ht$L$HLL$vpADz ΜHHdf9u HT$ HHL$HL$HT$ HHELfA>uHL$LHL$HLEI9oS1HLHL$W}HL$JLH7fA>,HL$Ht$L$A,HLL$H$L$~pAz ֛HHPXf9u HT$ HHL$HL$HT$ HHELfA>uHL$LHL$HLEI9cL1HLHL$_|HL$RLH?fA>4HL$Ht$L$A(HLL$H$L$~pA f-58=QԕA ɕHMfA>u LHLEMfA?u LHLEL91LHl{lLHYHL$Ht$L$HLL$L$L$>upA/@'d5(A HMbfA>u LHLEMfA?u LHLEL961LHzLHHL$Ht$L$HLL$L$L$htpACYfDI]HL$銨fA>MM,A~ oHL$dDE1A IMWPA~ IHHD$]f8uHHHHDL$HL$If:IuHHT$ HIL9w1HLHL$(xHL$(LH=1_HHt ~H|$t.fA<$Tf;uHZHED$HD$!4H=HHL$HD$H$AEAE$IULd$IEHAH$HDŽ$H$LHH$HD$ DŽ$HƄ$DŽ$H$`H$1LHoIE(Mu0HLLAS`QfDA~ MIHHD$Zf8uHHHHDL$HL$MfA>͖LHHD$ f;ݎHHܕLsMϕA>ŕL͏=(A~ MHL|$(#fA?uLHIDHD$(MfA>u LHLEHD$Ht$(H9#Ht$H|$(1 vL;t$ Ht$ 1LuHt$ LCHD$HL$L$Ht$A(gzpA錍fA XMwPHT$A~ It@A~ UIHHD$>Yf8uHHHHDL$HL$If:IuHHT$ HIL9\1HLHL$(sHL$(LHN=ZH Ht H|$t.fA<$f;uHHED$HD$!H=HzHL$HD$H$AEAE$IULd$IEHAH$HDŽ$H$LHH$HD$ DŽ$HƄ$DŽ$H$[H$1LGHъIE(Mu0HLLA[鳊A~ IHHD$_Vf8uHHHHDL$HL$MfA>tLHHD$ kf;EHHDLsM7A>-L5=, A~ MHL|$(=fA?uLHIDHD$(MfA>u 
LHLEHD$Ht$(H9gHt$H|$(1rqL;t$ Ht$ 1LPqxHt$ LcHD$HL$L$Ht$A,upAA~ IHHD$Tf8uHHHHDL$HL$If:IuHHT$ HIL91HLHL$(oHL$(LH=3VHHt H|$t.fA<$Qf;uH\HED$HD$!萸6H=HHL$HD$H$AEAE$IULd$IEHAH$HDŽ$H$LHH$HD$ DŽ$HƄ$DŽ$H$WH$1LHqIE(Mu0HLLAUWSJHuHBHFf($pwHHJHLf9u!HT$ HHL$HHL$HT$ tHz ;LzPfA?uHL$LHL$HLEH$Ht$L$HL$AnHLL$H$.hpAyHHJHf9u!HT$ HHL$HHL$HT$ tHz LzPfA?uHL$LHL$HLEH$Ht$L$HL$AmHLL$H$gpAD̈́HHJHȘf9u!HT$ HHL$HHL$HT$ tHz LzPfA?uHL$LHL$HLEH$Ht$L$HL$AlHLL$H$fpA!HHJHf9u!HT$ HHL$HHL$HT$ tHz 7LzPfA?uHL$LHL$HLEH$Ht$L$HL$AkHLL$H$*fpAuHLzH-fA?uHT$LHHT$tIǃz HRPf:uHHT$HT$HHELHT$0HT$vHL$Ht$L$H$LHLHt$HLH$H$|AHzHHH|$f?uHT$ HHT$ tHD$Ht$HLL$L$H$D{A'HHrHf>QHT$ HHt$HHHt$HT$ $H$AGHL$Ht$HL$L$LL$gpA3HLzHJfA?6HT$LHHT$Jz MIHRPf:uHHT$HT$HHEHLHT$sHT$u HLIf:Ht$H$AFHL$Ht$HL$L$LL$epA3}A~ ~HMMIfA?u LHLEMfA>u LHLEL~F~L9r1LHdfA>~HtLHIFH}8}91LHd}LHl}HL$Ht$L$HLL$[pAT}{A~ =}HMTfA?u LHLEMfA>u LHLEL&|L91LHHcAfA>|HtLHAIFH|8|HHuHPHfXz|HLpHtfA>u LHLEL9ʓ1LHb?|LH,|HL$Ht$L$HLL$UZpA{zA~ {HMRfA?u LHLEMfA>u LHLELΛ{L9w1LHavfA>l{HtLHIFHL{8C{9c1LH_azLH輷zHL$Ht$L$HLL$XpAz2yA~ zHMFfA?u LHLEMfA>u LHLELv>zL9v1LH`rvfA>zHtLHIFHy8yHHuHPHfXyHLpHtfA>u LHLEL91LH`yLHd|yHL$Ht$L$HLL$WpALywA~ 5yHM%EfA?u LHLEMfA>u LHLELxL9u1LH@_.ufA>xHtLH9IFHx8x91LH^7xLH $xHL$Ht$L$HLL$MVpAwvA~ wHM`fA?u LHLEMfA>u LHLELƗwL9r1LH]erfA>dwHtLHIFHDw8;wHHuHPHfXwHLpHtfA>u LHLEL91LHW]vLH贳vHL$Ht$L$HLL$TpAv*uA~ vHMGMfA?u LHLEMfA>u LHLELn6vL9p1LH\jfA> vHtLHIFHu8u9!1LH[uLH\tuHL$Ht$L$HLL$SpADusHwf;MߒMŒA MPI@A IE1飑I'{E1A vMwPf9uH\A Jf;MwPiHT$靐A IE1鯏MvPH3wHD$A~ MHvDH3H|$ HHD$ 鲞H5H|$ HHD$ ׯHjH|$ HHD$ HH|$ HkHD$ aHH|$ HHD$ z IyHH|$ HėHD$ 麗H{H|$ HHD$ 駙H3H|$ HHD$ 锛A~ rHMAgfA?u LHLEMfA>u LHLELrM9_1LL'YrLL脯lrHL$Ht$L$HLL$WOpAu LHLEL>qM9l1LL`XqLL轮qHL$Ht$L$HLL$NpAuq3p=8&>M#z zHLfA>vLHHD$f;zHHvHSHz:zHHT$ p>HT$ =$zzz HHDD$ HD$ fA>z==zIv`myI~fA~ nHMpfA?u LHLEMfA>u LHLELmM9Z1LL@TmLL蝪mHL$Ht$L$HLL$pJpAUmlA~ EmHMofA?u LHLEMfA>u LHLEL^lM9Ur1LLSlLLݩlHL$Ht$L$HLL$IpAlSk=wITS={K=7vz vHHf:<HHT$HIHT$f;uHHqHSHu:uHHT$k=GL9֭1HLHL$/RHL$2uLH臨uHL$Ht$L$ADHLH$L$L$RptAi@BFN}EC.BDtz tHL2fA>PLHHD$(f;ktHH=pHSHPt:GtHHT$ jjDHT$ =(!tz HHDD$ HD$ fA>s=c7sIv`msI~5fA~ gHMxTfA?u LHLEMfA>u LHLEL辇VgM9T1LLM8gLL=%gHL$Ht$L$HLL$DpAfeA~ fHMGSfA?u LHLEMfA>u LHLELfM9[1LL MxfLL}efHL$Ht$L$HLL$PCpA5fd=r;D1=s1=toz oHLtfA>u LHLEL91LHiLqoLHƢ^oHt$H$LL$L$o1oAu LHLEL~eM9~Y1LLKdLLdHL$Ht$L$HLL$ApAdscfA>fM2HBH|$ H@HD$ 6LH fA>L迂H fA>L蝂Ht|fA>[LHt^fA>3LaHt@fA>LCHt"fA>L%H6cfA>LMHT$yHTH|$ HHD$ HH|$ HHD$ 鳗I˨HH|$ HrHD$ hHsH HD$f8H|$KH HD$f8H|$#HHD$f8H|$HhHD$f8SH|$ӀH@HD$f8XH|$諀HHD$f8'H|$胀{nA I飧A bHMUefA>u LHLEMfA?u LHLEL9^N1LH;H;bLH蘞(bHL$Ht$L$HLL$L$L$ BpAa_A aHMdfA>u LHLEMfA?u LHLEL9M1LH{G{aLH؝haHL$Ht$L$HLL$L$L$MApA(a>_A aHMcfA>u LHLEMfA?u LHLEL9L1LHF`LH`HL$Ht$L$HLL$L$L$@pAh`~^A X`HMbfA>u LHLEMfA?u LHLEL9b1LHE_LHX_HL$Ht$L$HLL$L$L$?pA_]A _HMafA>u LHLEMfA?u LHLEL9)b1LH;E;_LH蘛(_HL$Ht$L$HLL$L$L$ ?pA^\A HIVyLHLELHLEǝLHLELHLELHLE$LHLE鋆LHLEjLHLEIA IUzIBzA HT$ZyHT$EyH{f;MayxyA XI~}A EH{IX}f:M{Q^I%xA ]HMOcfA>u LHLEMfA?u LHLEL9Y1LHBCB]LH蟙/]HL$Ht$L$HLL$L$L$=pA\[A OHT$5A \HMafA>u LHLEMfA?u LHLEL9V1LHmBm\LHʘZ\HL$Ht$L$HLL$L$L$?a=;%aIv`m^faI~ AD$9`AD$h鏂=Gz  _HH`f:MHHT$HIHT$f;^HHZHSH^:^HHT$T=7L91HLHL$*;HL$-^LH肑^HL$Ht$L$ADHLL$L$H$;p]ARM fA?ȂLyS鳂HH@f{t f~pHLxHtfA?u LHLEM9eH1LLC:oHD$HL$L$Ht$HLL$L$H$1LAoRM fA?*LIxRM fA?L$xR鐃IEH\$H$D$HDŽ$H$d"H$1H¿HIE1AQM fA?LvwQ=([z [HLȴfA>uHT$LHT$HLEfAL91LH:8B[LH藎/[HD$HL$L$Ht$A(HLL$L$H$9pZAO=%Zz ZHLfA>sLHHD$f;ZHHbVHSHuZ:lZHHT$ P=$HT$ K=%DZz :ZHLtfA>u LHLE= ZH|$t.fA<$lf;uH HED$HD$!=UH=d{H,HL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$;H$1L虼H#NIE(HLLA  N=TXz XHLfA>uHT$LHT$HLEfAL91LH'5/XLH脋XHD$HL$L$Ht$ATHLL$L$H$6pWALAD$|驥?Wz WHLpfA>sLHHD$Kf;nWHH@SHSHSW:JWHHT$ mM?HT$ a=($Wz 
HHDD$ HD$ fA>V=fVIv`m艶VI~83cTz YTHHyf:fHHT$HIHT$8f;THHOHSHS:SHHT$J={L91HLHL$y0HL$|SLHцiSHL$Ht$L$AkHLH$L$L$"1p&SA1HAD$龳=oSz RHLtfA>u LHLEL9Ӈ1LH/RLHRHt$H$LL$L$AtozRAG=lgRz ]RHLtfA>u LHLE=;cPz YPHLfA>ʄLHHD$քf;PHHKHSHP:OHHT$ F=;HT$ =TOz HHDD$ HD$ fA>hO=[OIv`m8@OI~AD$MNAD$鲬.Lz LHH6}f:#}HHT$HIHT$|f;LHHwHHSHL:LHHT$B=vL9w1HLHL$)HL$ LLH^KHL$Ht$L$AcHLH$L$L$)pKA@Kz KHH{f:{HHT$HIHT$y{f;WKHH)GHSHu LHLE= ~>Lz pGHLfA><LHHD$f;6GHHCHSHG:GHHT$ 5==&HT$ (=(Fz HHDD$ HD$ fA>F=, rFIv`mOWFI~HLzH'fA?3}LHH}f:];HHT$HHT$pHBH8;8/;HHuHPHfX;HLxHLLz CHLtfA>u LHLEL9eO1LHL TCLHvACHt$H$LL$L$doCA8M z BHLvNfA>`NLHHD$zf;BHH>HSHB:BHHT$ 8=HT$ 8y=$rBz HHDD$ HD$ fA>B=AIv`mסAI~膤HSHtH7MnM*M:MjM&MMOxf;\4IHD$ |f;@4MŢf;)4IHD$ MKf;4M鄫f;3M A AE1qFMֶz >HLtfA>u LHLEL9{1LH\d>LHqQ>Ht$H$LL$L$_o$>A/3z >HLtfA>u LHLEL9K1LH=LH)q=Ht$H$LL$L$[_o=A2MxMM隹^ M@AA&&&1ɗn&@z *=HLZfA>uHT$LHT$HLEfAL91LHpxu LHLEL9@f1LH;LHo;Ht$H$LL$L$H]o;A0z o;HLYfA>YLHHD$Xf;5;HH7HSH;:;HHT$ 41@HT$ V=,:z HHDD$ HD$ fA>:=-s:Iv`mPX:I~HF-f;;-IHD$ Ff;-IHD$ gvM鉂z 7HLŧfA>LHHD$f;7HH}3HSH7:7HHT$ -=9HT$ =T_7z HHDD$ HD$ fA>6=6Iv`mĖ6I~s)z 4HLtfA>u LHLEMpM z 2HLfA>LHHD$f;F2HH.HSH+2:"2HHT$ E(CHT$ П=,1z HHDD$ HD$ fA>1=>1Iv`mai1I~YW$z :/HLfA>LHHD$f;/HH*HSH.:.HHT$ $AHT$ =$.z HHDD$ HD$ fA>K.=>.Iv`m#.I~ʐ!L,M $dAA&&&1SncLeyM5Mwz +HLtfA>u LHLEL9j1LHJR+LH^?+Ht$H$LL$L$Lo+A z +HHgf:jHHT$HIL9g1HLHL$"HL$%*LHz]*HL$Ht$L$AkHLL$L$H$p)Az )HLmifA>WiLHHD$hf;)HHU%HSHh):_)HHT$ EHT$ Eg=,9)z HHDD$ HD$ fA>(={(Iv`m螈(I~Mz w&HL hfA>gLHHD$hf;=&HH"HSH"&:&HHT$ <BHT$ e=(%z HHDD$ HD$ fA>%=5{%Iv`mX`%I~PNMYz %#HHdf:ydHHT$HIL9c1HLHL$GHL$J"LHU7"HL$Ht$L$AcHLL$L$H$o!Az !HH_f:~_HHT$HIL9K1HLHL$HL$!LH\T HL$Ht$L$AgHLL$L$H$o AMz  HH'^f:^HHT$HIHT$f;R HH$HSH7 :. HHT$QN@ aAA&&&1ynaz yHH[f:[HHT$HIHT$[f;8HH HSH:HHT$7=wL9X1HLHL$HL$LHPHL$Ht$L$AgHLH$L$L$BoFAQMܬM,AD$֠Cs閲M-C黱CX~{~CM髫HPHCv--AD$r2rfA?LMpAD$ /AD$[~M9Ba1LLHL$(LD$ LD$ HL$(LLHL$ OHL$ yH!1iH|$/2WL2GH|$]5H|$#!AANH=<HM`HD$IUH$H\$AEXAE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$H$1H~H_Ht$IE(HLC9H|$Z9H|$9!@AKH=;HIHL$ HD$H$IUH\$AEXAE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$H$1H }HbHHt$IE(HL|zM9O1LLHL$(LD$ LD$ HL$(LLHL$ LHL$ H5.}H|$C/kL/[H|$qIH|$/7!>AbH=9H#NHD$IUH$H\$AE~AE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$H$1H${HMHt$IE(HL M9!]1LLHL$(LD$ LD$ HL$( LLHL$ +JHL$  HF, H|$T-| L-l H|$育Z H|$@H !: SHL$Ht$L$HLL$oARCRM(MhChhCLoM2MfM際MΚC͝Ct'H|$ y'H|$ʣg'!X,AH=~'H{HL$ HD$H$IUH\$AE~AE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$]H$1HhHxzHt$IE(HL.,Lt$ LHd8~HL$Ht$L$HLL$oA|~CX~LH8pOHL$Ht$L$HLL$oA@OCOf;Lt$ BeLt$ BAD$MDM錞AD$XXAD$AD$DDE1{%  AAA&&&1E^nLT$@LH7e>HL$Ht$L$HLL$oA5>vC>C1ITIMԜM If;Lt$ Vi.Lt$ XL9 1LHHL$ޜHL$0H$Ht$L$HL$AFHLL$L$L$oAYHzHHH|$ W f?uHT$(HHT$(tHD$ z u-HRPf:uHHT$(HT$(HHEHtH|$t.L&t"H;\$ 4Ht$ 1Hݜ H|$L6H;\$  Ht$ 1HTݜlHt$ H3WH|$=EH|$3!&A.H=!H HL$ HD$H$IUH\$AE~AE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$H$1HbH; Ht$IE(HL_]HT$E1HBHIGHEIGH]HMH fA?q LHHL f:{HHT$HHT$cLzMVA?LAWHuIGHfU*HIWH}f:jHHT$HHHT$*fA?fA>L9G1LHHL$ڜHL$H$Ht$L$HL$AGHLL$L$L$)oA>Cq5qLH0uHL$Ht$L$HLL$roAu^CuMPA )Hz )I,tHA f)MPHIsz H)HJPItHHRHf:HHT$HHHT$f9HHL$HHL$HQH:JHuHBH$H$AFHL$Ht$HL$L$LL$oANz 'I)ChI z s'HJPI*f:yHVHCf:^HSLpz w!1pz 'IoIoHRPpz &HJPIoHHRHf:HHT$HHHT$f9HHL$HHL$HQH:JHuHBH$H$AGHL$Ht$HL$L$LL$qoAtz H%I)Hf:JHz %HlHlz %HJPI f:H5IC&E1z vLzPf>Hlf9IlHt$HLL$L$LL$ H$ALL$ jjMjAGz뗃z v!f;LJPjjz w1if;uE1tjHRPhE1~jE1z AhLzP8hE1z gLzPvgE1z fLzPfE1z eLzPef;%i2iHHuHPHf{t f~ iHLxHtfA?u LHLEHLahHD$HL$L$Ht$HLL$H$Ah L*cHD$cC6kZkCAb)cHbfA?.LLfA?LA !MPH  r(ACA&&&1OnK(E13 AGA&&&1OnCoz w1C(HRPA !H! 
H M_A  MPH Lt$驱f;eLt$魱LfA?LAXXHeXC AD$O9OC鴰L (XHD$XA 4 HHA  MPHfA?0L Ht$ H&H|$rH|$0!AcH=HHL$ HD$H$IUH\$AEXAE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$øH$1H!VHt(Ht$IE(HL蘸CE1 ,A&A&&&1LnE1CE1;Cb==CmmC,lOlC+HHD$.Ev`m>QDIv`mN%BI=v`mK?IM*( oAA&?&&1Cnv`mH.u LHLE=H|$t.fA<$f;uHHED$HD$! H=%HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$H$1LZFHIE(HLLĄLHD$f;Lt$O AIA&&&1z wC1 E1) A'A&&&1w+nǶHRPE1E1` AA&&&1*+nC 4 A~A&&&1*nE1] A`A&&&1*nsCL9HD$Lf;IIlf;Irf;IIf;Iz HLtfA>u LHLE=VH|$t.fA<$f;uHRHED$HD$!,H=HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$脔H$1L1HlIE(HLLATRLWHD$z HLtfA>u LHLE=H|$t.fA<$f;uHHED$HD$!H=AHQHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$H$1Lv0HIE(HLLALHD$f;ImImf;Im rAA&&&1&nrLt$if;>Lt$if;)Lt$i AA A&&&1&nLkzHD$~zHSHyLHyHt$H$LL$L$Bn\yALt$?{f;^Lt$C{f;ILt$ { AA&&&1%n{CxTC#xFxLHBHt$H$LL$L$tnlA鸿CE1Ƀz HJPw1Ƀz |HJPs IAA&&&1$n"C1Ƀz 8HJP/E1" бA\A&&&1p$n驱AD$ܰ@6^@6E@6,@6@6Z@6AAD$3C5qXqHHD$龇HHD$ 3z ;HLtfA>u LHLE=ȌH|$t.fA<$f;uH HED$HD$!>H=eHHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$u LHLE=PH|$t.fA<$f;uHHED$HD$!lH=HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$ċH$1L")HIE(HLLA蔋钺LHD$ AEA&&&1nf;"Lt$}Lt$}  AA&&&1on ABHQQHD$fQHHD$ Pz HLtfA>u LHLE=eH|$t.fA<$f;uHaHED$HD$!;H=HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$蓉H$1L&H{IE(HLLAcaLfHD$ A]A&&&1nf;I8Cf鉄L,HD$2,A++H+f;IVICf;{IIA%RIf;QII镠f;6I雠HFHD$鱙HHD$ &z HLtfA>u LHLE=H|$t.fA<$f;uHHED$HD$!H=HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$H$1LT$H޵IE(HLLAƆĵLHD$f;Lt$ AYA&&&1nLt$鲖f;5Lt$鶖f; IYIFz HLtfA>u LHLE=ٿH|$t.fA<$f;uHHED$HD$! H=0HLHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$H$1Le"HIE(HLLAׄճLHD$f;I魛I際f;~I頛A0Lt$f;RLt$f;=Lt$ߍ AUA&&&1nCoNNߍ 6AA&&&1QnSߍ 4AA&&&1n AD$@H HD$ HHD$ D z >HLtfA>u LHLE=ˀH|$t.fA<$f;uH HED$HD$!AH=hލHHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$?H$1LH'IE(HLLA LHD$f;ѻLt$ ^ݍ AA&&&1(nE1鞰AFHt$H|$(zHD$HL$L$Ht$AT$xoA@Ht$ HFz>E1h܍ AA&&&12nAD$L`HD$`ME1mۍ AA&&&1n鉨C 1ۍ ǣAA&&&1xn頣AD$ӢE1ݮaۍ CA|A&&&1+nz HLtfA>u LHLE=}_H|$t.fA<$f;uH[HED$HD$!ߐ5H=ڍHHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$~H$1LHuIE(HLLA]~[L`HD$HHD$_HHT$ <=:HT$ ( =;v`m~*_Inv`mS\ILt$7f;WLt$7f;BLt$t7Ӎ &AA&&&1 nz HHDD$ HD$ fA>=FuIv`miqI~au_HdHD$Ѝ AA&?&&1 nHHD$ Ef;ޮLt$DTv`mlVI^nz w11z w1HRPHRP Ht$ HH|$~GH|$u LHLE=Q^H|$t.fA<$f;uHHED$HD$!mH=HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$_H$1L#HIE(HLLA_铎LHD$ AMA&&&1mf;#Lt$ALt$A eAyA&&&1pmpeAwwE1׍f;IwALL9 AA&&&1maE1yE1qHt$ H鉳Ht$ HlE15 AA&&&1mmL%xHD$8xf;ŗIkwCMpE -xAA&&&1mxLNHD$af;QI锕I違f;6I釕A H黬H騬AD$ =iAA&?&&1cmiAD$0SAFh鏥Ht$H|$(AHD$HL$L$Ht$A$$voAD@H=FHD$GHHD$ FC_YY~ pAA&?&&1HmpJ AEA&&&1m݋AD$QHHD$HHD$ %HWoHD$pHHD$ }of;I=CiiLHC|`Ht$H$LL$L$unO`A鹉C(`H]HD$]HHD$ l]z aHLtfA>u LHLE=W4H|$t.fA<$f;uH0HED$HD$!d H=HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$bYH$1LHJIE(HLLA2Y0L5HD$ AA&&&1`mf;IIG aAA&?&&1m~a 4jAA&?&&1m jHHD$ `z -HLtfA>u LHLE=UH|$t.fA<$f;uHHED$HD$!0֍H=WHHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$.WH$1LHIE(HLLAVLHD$b AQA&&&1,mf;Lt$<^Lt$'^f;mLt$+^f;XLt$f;CLt$Lt$f;$Lt$oXLt$ZXf;Lt$^XC yAA&&&1MmRAF>eHt$H|$(HD$HL$L$Ht$A,$ooA?@1Ƀz HJP鰂E1 eAA&&&1dm>AD$q鲂U ?AA&?&&1m?C=W AA&&&1m^C߯AD$RuC  CrAd::1Ƀz gHJP^1Ƀz mHJPdAF6ZAF6DAD$AD$1 #AA&&&1mC HoH_HD$]_HHD$ ^z HLtfA>u LHLE=PH|$t.fA<$f;uHHED$HD$!!LjH=HHHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$RH$1L}HIE(HLLAQLHD$S AA&&&1mf;}Lt$s\Lt$^\f;^Lt$b\HlYHD$ZC` Ht$ Hz邬 ٤AA&&&1nm鲤C3ZE1E1E1I ؄A}A&&&1m鱄ff.tcIE(LptcIE(LEtmfwIHEHPfw5K4B4ff%f9v BuCuH}bu.fDE1HhD[]A\A]A^A_I84ufH}7~HELPA fJABuȎu L!I¿!L$*AtH= &L$HAD$cI\$HEAD$$ID$(HmHuL9LLT$7Ht$LT$H4$LY HMI4$LHl$8H,$HD$D$ GHt$LHL$(HD$0D$4HD$@1LH gHHHpH9tHHD$HT$IT$0LLLntfnǎDHD$H$)f.Hl$8H,$LI$LLT$(HD$D$ ~HD$0D$4HD$1HLPfHHLT$q# 
A4A&a&&1\kL$AWAVIAUATIUSLIILHXLxtHt$0LHl$0H\$8=DfHݣtLLDvH辥H9=1HHޙf}u&MtHAHDHUHt ::@L $t\HX[]A\A]A^A_EHLLhDHX[]A\A]A^A_@f;DLL$(LLHiu11HX[]A\A]A^A_@}ALL$dCz vkHRPHz vcHJPHt$Vf=X=HXHt$ z HB-(nfDAUATIUSHMLHXAHt$H $fm_ Ifp f=fn f=u*HtHHHSHt: DH$f8u9HtH$HH$LxMtA?fDf;u)Ht HHtLsMtA>oH;$*H$1H`A$fqf=fk5Ht$0HL}PE1HXD[]A\A]A^A_fkwLfL2uH$H@f8tH$f8ffDIPf:1HHT$HT$H[ f:I IL$ f9'HHL$HIHL$UAff=u%Ht HHtHCHt 8\*@L91LH肯u!f;H&/HH8HSH:-Zf.HtLHIT$Ht :"fC!lA H=HHCAE$I](IEAUH$L$-HT$LHIHT$W-z LrPfA>3-LHHD$ -fA?uLHtIGHt 8D/Ht$f>f=u!Ht HHtHCHt 8;L9a 1LH茙-f;HEHHBHSH%:"Dd@HLrHfA>+HT$LHIHT$+z LrPfA>p+LHHD$G+fA?uLHtIGHt 85Ht$f>f=u!Ht HHtHCHt 8f9L9v 1LHl/,f;HmHHH"HSH:GD@HLrHfA>M*HT$LHIHT$ *z LrPfA>)LHHD$)fA?uLHtIGHt 8+/Ht$f>f=u!Ht HHtHCHt 8;L91LHL*f;HFHHHSH:!A$@HHJH3f9H)HT$HHL$HIHL$HT$)z _HRPf:IuHHT$HT$IHLDLLǣu LMIAf`f=L91LHc)HLrHfA>(HT$LHIHT$(z [LrPfA>(LHHD$c(fA?uLHtIGHt 8g/Ht$f>|f=u!Ht HHtHCHt 8G5L91LH(f;\H@;HHHSH:[:gم $AQA& Xf.A~  A~  LHMvXL""!݈ApH=؅HH$AE<LAE$Me(DT$IE -IE(H$Mu0IE8A~ qE1fA>oHt$LHHt$LETfDAD$<AD$! ݈AH=0؅HH$AE<LAE$Me(DT$IEF,I]0IE(M}8[DCOH$f8RHD$H $LD$0Ht$AmHLLt$0HD$8H$HD$@ A fC/H$f8 @CH$f8H$H $LD$0Ht$AlHLLt$0L|$8HD$@vAdfDCH$f8"H$H $LD$0Ht$AkHLLt$0L|$8HD$@AfDC7H$f8:H$H $LD$0Ht$AnHLLt$0L|$8HD$@AfDf9HL$f9I#Df=NnfDE1z Iz LzPo|DA = A IGPIHD$E1fDfA>I@fA>I@fA>I0@fA>HL$) EID$ f8HD$fA?MKCM|$ fA?f:wI!M|$ fA?.f:SI}M|$ fA?f:/IHكf=u!Ht HHtHCHt 828L9|1LHD f=Ht$f>A +HT$A I5DHHuHPHf{t f~iLXHHL\$tfA;uLHT$IHLDfA>!f=LHL\$L\$H$f8H$H $LD$0Ht$ApHLL\$0Lt$8HD$@sAhaHHuHPHf{t f~5H@HHHD$HD$tf8uHHT$HHEHT$fA?f=u!Ht HHtHCHt 80H;\$RHt$1H芉f=fA?DHHuHPHf{t f~mH@HHHD$HD$tf8uHHT$HHEHT$fA?f=u!Ht HHtHCHt 81H;\$Ht$1H躈pf=fA?DHHuHPHf{t f~H@HHHD$HD$tf8uHHT$HHEHT$fA?Kf=u!Ht HHtHCHt 8.1H;\$Ht$1Hꇔf=hfA?DA~ [L-DIHD$E1LE1H$H $LD$0Ht$ApHLL|$0Lt$8HD$@Av`˅ H$AA& &&1fD$JHuHBHf{t f~MHHJHtf9uHHL$HL$HHEI9w1HLHL$蜆HL$LHH$f8H$H $LD$0Ht$ApHLL|$0Lt$8HD$@dAR@HHuHPHf{t f~iHHHHtf9uHHL$HL$HHEL91HLHL$踅HL$LHH$f8H$H $LD$0Ht$AnHLLt$0L|$8HD$@AnHHuHPHf{t f~HHHHtf9uHHL$HL$HHEL91HLHL$؄HL$3LH0 H$f8H$H $LD$0Ht$AlHLLt$0L|$8HD$@AHHuHPHf{t f~YHLxHtfA?u LHLEM9A1LLLLc H$f8HD$H $LD$0Ht$AmHLLt$0HD$8H$HD$@AfDHH譵 HHuHPHf{t f~9HHHHtf9uHHL$HL$HHEL91HLHL$HL$LH`H$f8H$H $LD$0Ht$AkHLLt$0L|$8HD$@AJHuHBHfF"fGs}JHuHBHfFxfGJJHuHBHfF3fGJHuHBHfFfGA GMPIzA ;MPIgHH軳"ƅ D$AA& &&1eDT$DŅ XD$AA& &&1eDT$'f.IIA {IDA cIjDHSH`z AHBPHD$fA>MM@z >HBPHD$fA>MMyfDHSH~z HBPHD$fA>MqMW@HSH`>z HBPHD$fA>M^MD@z >LzPIQą hD$A8A& &&1eDT$7@HSHͼLzPfA?uLD$LLD$HLEM91LLLD$~LD$EHHJHfAfA8uH$f87,HLBH fA8u!HT$LLD$HLD$HT$tIM91LLHL$LD$g}LD$HL$mHLHL$LD$LD$HL$u HLIL9 A fA8uH$f8|(HLBHRfA8u!HT$LLD$HLD$HT$tIM91LLHL$LD${LD$HL$HLHL$LD$菈LD$HL$u HLIL9,A"fA8uH$f8&$z QIHD$.HLBH@fA8u!HT$LLD$HLD$HT$tIM91LLHL$LD$^zLD$HL$dHLHL$LD$LD$HL$u HLIL9AfA8uH$f8&IE14IE1-CBH$f8=XCH$f8HD$H $LD$0Ht$AmHLL|$0HD$8H$HD$@ARCnH$f8qHD$H $LD$0Ht$AkHLL|$0HD$8H$HD$@誹A+C4H$f87HD$H $LD$0Ht$AnHLL|$0HD$8H$HD$@CA1CH$f8HD$H $LD$0Ht$AlHLL|$0HD$8H$HD$@ܸA=CHtHHHSH:JHxBHD$L9<Ht$1LHL$ LD$vLD$HL$ Ht$LHL$HL$f9H$f8HD$L|$8HHL$HHD$8H$HD$PjGHL$Ht$LD$@LAnHgA_AX<1C0H$f85HD$H $LD$0Ht$AnHLL|$8HD$0H$HD$@ܶA1HHL$͂E1HL$WI11HLLD$ HL$auHL$LD$ gCH$f88HD$0H$HD$@ AcA 9A HtHHHSH:JH(HD$L9(Ht$1LHL$ LD$sLD$HL$ Ht$LHL$+HL$f9H$f8HD$L|$8HHL$HHD$8H$HD$PjGHL$Ht$LD$@AlHL9gAAYA[heAG~1HHL$XE1HL$I11HLLD$HL$rHL$LD$AG1HHL$E1HL$I11HLLD$HL$rHL$LD$PHtHHHSH:JH1HD$L9'Ht$1LHL$ LD$qLD$HL$ ,Ht$LHL$HL$ f9H$f8HD$L|$8HHL$HHD$8H$HD$PjGHL$Ht$LD$@AkHLgAY^1HHL$~E1HL$I11HLLD$HL$pHL$LD$鯾6 TD$A A& &&1eDT$#CzH$f8}HD$H $LD$0Ht$AmHLL|$0HD$8H$HD$@*A7HtHHHSH:JH#H$f8L|$0H$HL|$HL$HHD$PL|$@jGHL$Ht$LD$@HAmLjgAEXZAKC_H$f8dHD$H $LD$0Ht$AkHLL|$8HD$0H$HD$@;A)AfA?Lt$$;Lt$ z hMMz NIIfA?Lt$Lt$IE1z  MYMGHD$HD$IDfA?Lt$Lt$}z MHM6fA?Lt$Lt$z qMMz 
WIHBPIHD$LH3靺LH!LH$LHLH)A@ f9uH$f8A@rf9uH$f8HD$Ht$HgA@f9uH$f8x锹LHHt$H`Ht$H4vHD$A@$fA?uH$f8z "IHJP1Ƀz EHJPuLHIDHD$HD$9u!HT$(HHL$ HHL$ HT$(tHz LrPfA>uHL$ LHL$ HLEHD$H95Ht$1HHL$ iHL$ &Ht$HdfA>H$f8HD$Lt$@HHD$8HD$ HD$@H$HD$PjGHL$Ht$LD$@LAmHqgA_AX鞷HHuHPHf{t f~LpHHLt$tfA>uLHIDHD$HD$9u!HT$(HHL$ HHL$ HT$(tHz LrPfA>uHL$ LHL$ HLEHD$H9"Ht$1HHL$ fHL$ ?Ht$H=*fA>H$f8 HD$Lt$@HHD$8HD$ HD$@H$HD$PjGHL$Ht$LD$@AkHLJgAAYA[vz IHJPz DIHJPJ1Ƀz HJPHHuHPHf{t f~rLpHHLt$tfA>uLHIDHD$HD$9u!HT$(HHL$ HHL$ HT$(tHz hLrPfA>uHL$ LHL$ HLEHD$H9$Ht$1HHL$ ~dHL$ Ht$HԺfA>H$f8HD$Lt$@HHD$8HD$ HD$@H$HD$PjGHL$Ht$LD$@AnHLgAEA^Xv HHuHPHf{t f~gH@HHHD$tf8uHHHHDt$Ht$AHt$H蘹H$f8HD$H $LD$0Ht$ApHLLt$8HD$0H$HD$@AwHHuHPHf{t f~LpHHLt$tfA>uLHIDHD$HD$9u!HT$(HHL$ HHL$ HT$(tHz $LrPfA>uHL$ LHL$ HLEHD$H9Ht$1HHL$ :aHL$ rHt$H萷]fA>RH$f8@HD$Lt$@HHD$8HD$ HD$@H$HD$PjGHL$Ht$LD$@HAlLgAEXZʯ1Ƀz HJP LLe錮z ITIBLLHL$貶HL$Mz ɿII LHLD$sLD$HHuHPHf{t f~HLpHtfA>u LHLEM9%1LL_WLLDH$f82HD$H $LD$0Ht$AnHLL|$0HD$8H$HD$@\AJHHuHPHf{t f~sHLpHtfA>u LHLEM91LL^8LL%H$f8HD$H $LD$0Ht$AlHLL|$0HD$8H$HD$@芟AxLHLD$誴LD$ELHLD$腴LD$l z IqI_IGH HHuHPHf{t f~HLpHtfA>u LHLEM931LL]LLݳH$f8HD$H $LD$0Ht$AmHLL|$0HD$8H$HD$@HAk6JHuHBHf{t f~HHJHtf9uHHL$HL$HHEI91HLHL$\HL$zLHgH$f8UH$H $LD$0Ht$ApHLL|$0Lt$8HD$@hAVLLHL$舲HL$#H$H $LD$0Ht$ApHLL|$0Lt$8HD$@AکLHLD$LD$ 鵩HHuHPHf{t f~ HLpHtfA>u LHLEM9 1LL;[LL蘱H$f8HD$H $LD$0Ht$AkHLL|$0HD$8H$HD$@AfLLHL$#HL$龨z :IIJHfFpfGHHJHCf9u!HT$ HHL$HHL$HT$ tHz ùLrPfA>uHL$LHL$HLEI91HLHL$YHL$NHSHJHH$f8,H$H $LD$0Ht$ApHLL\$0L\$Lt$8HD$@AL\$ڧHHH$f8hHD$H $LD$0Ht$AlHLL|$8HD$0H$HD$@A"HHH$f8MHD$H $LD$0Ht$AnHLL|$8HD$0H$HD$@AHH\H$f82HD$H $LD$0Ht$AkHLL|$8HD$0H$HD$@+AJHufFfGHHJHff9u!HT$ HHL$HHL$HT$ tHz *LrPfA>uHL$LHL$HLEI991HLHL$GUHL$RJJHfFpfGHHJHf9u!HT$ HHL$HHL$HT$ tHz iLrPfA>uHL$LHL$HLEI91HLHL$THL$w鑢鉢HSH駢Lt$0L|$8HLD$HHD$PLjFHL$AlHt$HLD$@ gA_AX='Lt$0L|$8HLD$HHD$PAnjFHL$HHt$LLD$@gAY^ءA MP.HHmf{t f~HHHHtf9uHHL$HL$HHEI9U 1HLHL$?SHL$\LH藩IH$f87HD$H $LD$0Ht$AmHLL|$0HD$8H$HD$@ALt$0L|$8HLD$HHD$PHjFHL$AkHt$LLD$@vgAEXZ钠HD$Lt$0HL|$HAmHLHD$@H$HD$PjGHL$Ht$LD$@gAAYA[J4Lt$0L|$8HHL$HHD$PAkjGHL$HHt$LLD$@gAA^A_HD$Lt$0HLD$HAmHLHD$@H$HD$PjFHL$Ht$LD$@jgAA^A_酟Lt$0L|$8HHL$HHD$PAljGHL$HHt$LLD$@gAY^L6HSHTLt$0L|$8HHL$HHD$PHjGHL$AnHt$LLD$@gAEXZԞA OJHfFfGHHJHf9u!HT$ HHL$HHL$HT$ tHz LrPfA>uHL$LHL$HLEI91HLHL$PHL$ HSH!HBH5AFr1Ƀz HJPE1z xLrPoHLrHCfA>/HT$LHHT$txz IHJPf9uHHL$HL$HHEHLHL$_\HL$u HLIL9tE1HLHL$NHL$uz w逮LH?uAufA>uH$f8tHD$L|$0HLt$HAlHLHD$@H$HD$PjFHL$Ht$LD$@GgAA^A_xbLz vHHJPLH萤u4AGfA>uH$f8t! 
1L"[uiLE1HD$L|$0HLt$HAlHLHD$@H$HD$PjGHL$Ht$LD$@zgAAYA[镛1f@HD$鷡H$f88HD$L|$0HL|$HD$HH$L|$@HD$PjFz  AGbHD$1Ƀz HJPHBH]@%HBHAFAG'@PCBi@1bHD$H9Ht$1HHL$ JHL$ ؚHt$HÚHD$f8H$f8HD$L|$8HHD$8HD$ HD$HH$HD$PjFz ܩ@_HD$1Ƀz HJP@^AGCkH$f8pHD$H $LD$0Ht$AnHLHD$0HD$HD$8H$HD$@A%鮘JHuHBHf{t f~HLrHtfA>u LHLEHD$L9PHt$1LIHt$LzH$f8nHD$H $LD$0Ht$AnHLHD$0HD$HD$8H$HD$@A#ΗHLrH)fA>HT$ LHHT$ z  IHJPf9uHHL$ HL$ HHEHLHL$ qUHL$ u HLIHD$H9Ht$1HHL$ GHL$ UHt$HS@fA>5H$f8#HD$Lt$@HHD$8HD$ HD$@H$HD$PjFz !$@cL1Ƀz HJPHt$H踝wH$f8eHD$H $LD$0Ht$AkHLHD$0HD$HD$8H$HD$@A @HLrH)fA> HT$ LHHT$ z :IHJPf9uHHL$ HL$ HHEHLHL$ SHL$ u HLIHD$H9Ht$1HHL$ ,FHL$ Ht$H肜fA>H$f8HD$Lt$@HHD$8HD$ HD$@H$HD$PjFz !SL@[1Ƀz HJPHt$H E1z LrPH$f8ZHD$H $LD$0Ht$ApHLLt$8HD$0H$HD$@覅A锓HPHAF@EvHD$H9qHt$1HHL$ CHL$ 6Ht$H!HD$f8H$f8HD$L|$8HHD$8HD$ HD$HH$HD$PjFz 鲢E1z LrPHLrHRfA>>HT$LHHHT$tvz bLrPfA>uHL$LHL$HLEHLHL$OHL$u LIHM9tS1LLHL$aBHL$upz LLLHL$衘HL$u@AFuf9uH$f8tHD$L|$0HHL$HAnHLHD$@H$HD$PjFHL$Ht$LD$@gAAYA[Ր鿏Lz vKLLrPLHu鎏AGqfA>uH$f8t'f1L|N1ɄLE1HD$L|$0HLt$HLAnHHD$@H$HD$PjGHL$Ht$LD$@οgA_AXHPH&HBHHPH@HBHAC,m@N@*1Ƀz xHJPoHBH HD$)@HPHHD$H9 Ht$1HHL$ U?HL$ |Ht$H諕gHD$f8XH$f8FHD$L|$8HHD$8HD$ HD$HH$HD$PjFz xHLrHQfA>HT$LHHHT$tvz >LrPfA>uHL$LHL$HLEHLHL$KHL$u LIHM9t[1LLHL$=>HL$uLz ȝLLsLLHL$uHL$uAFuf9uH$f8tHD$L|$0HHL$HLAkHHD$@H$HD$PjFHL$Ht$LD$@xgA_AX锋z vLLrPHBH1LJ1ɄLE1LH蜓u@AGfA>uH$f8tHD$L|$0HLt$HAkHLHD$@H$HD$PjGHL$Ht$LD$@蟻gAY^ҋ鼊HLrHPfA>HT$LHHHT$tvz LrPfA>uHL$LHL$HLEHLHL$pIHL$u LIHM9t[1LLHL$uH$f8tHD$L|$0HLt$HHAmLHD$@H$HD$PjGHL$Ht$LD$@jgAEXZ醈E1z LrPE1z LrPE1z $LrP@W1Ƀz XHJPOAGAFDgE1bCJHuHBHf{t f~HLrHtfA>u LHLEHD$L9NHt$1Lh9uHt$LÏ`H$f8NHD$H $LD$0Ht$AkHLHD$0HD$HD$8H$HD$@)zAE1z LrPHLrHfA>HT$ LHHT$ z >IHJPf9uHHL$ HL$ HHEHLHL$ EHL$ u HLIHD$H92Ht$1HHL$ 08HL$ HHt$H膎3fA>(H$f8HD$Lt$@HHD$8HD$ HD$@H$HD$PjFz !WHLrH)fA> HT$ LHHT$ z IHJPf9uHHL$ HL$ HHEHLHL$ DHL$ u HLIHD$H9Ht$1HHL$ 7HL$ FHt$Hd1fA>&H$f8HD$Lt$@HHD$8HD$ HD$@H$HD$PjFz !5L@[1Ƀz HJPHt$HɌ5H$f8#HD$H $LD$0Ht$AlHLHD$0HD$HD$8H$HD$@/wACJHuHBHf{t f~~HLrHtfA>u LHLEL;t$Ht$1L5?Ht$L*H$f8HD$H $LD$0Ht$AlHLHD$0HD$HD$8H$HD$@FvA41Ƀz [HJPRHD$@tHBH|AF-THt$HH$f8HD$H $LD$0Ht$AmHLHD$0HD$HD$8H$HD$@uAZsC4JHuHBHf{t f~HLrHtfA>u LHLEHD$L9Ht$1L3Ht$L3H$f8HD$H $LD$0Ht$AmHLHD$0HD$HD$8H$HD$@tAL釂@-^@Y1Ƀz HJP@Lp@=`gfUIATUASFHHrf4HEHHfS4A4ff%f9v AH}H}FHELpA fAFAE!蟛H=ȖHLmHmL9utHL~D膷HLKzHUIHGD褞H;XtHHھ~DKLHD[]A\A]A^IcmDC[1]A\A]A^8.L aIfH9h~fDb8uf;uA$f+f[f<HCLs L{(fl$bfmfnHHH@ H9H$+1HHHL$ɏHL$%ufHCxff}Bf;Bu=H7 )t",&t'`(uHCxHEBfw$ŸDE f{f~Ef{{f~qf;]EFfN$Ő@LH Cm! [H=HHD蹩)@HCf8]aLpLx L"IHT$@LDHl$@Lt$HL|$P+)HDHULu f{'f~ 'fUusLz fA?uhfA>uaL9HJ'1LHH $sÏH $t;HL$@L|$HFLt$PL|$XHT$@LDVH>fHHt!f{of~efXHCi Hx1[]A\A]A^A_H#HHL8HtBDhL聴LXt fA<$`&Cu! cH=HL2HLXD')@H蘧LhL؂A$f WMf8fA<$:AT$6 H'HcЀ 'HcЀ'HcЀ'HcЀ'HcЀ'H{'! 
UH=H[HL4D&LH HLDIq HLDIc LH4Dq&@HCLs L{(fl7fm'fnEHHH@ H9H$ #1HHHL$nHL$XMvL901LH趾iHڃ1@LuL} L@vHT$@LDH\$@Lt$HL|$PIHLuf{f~HHT$@LDLt$@H\$HHeDLuL} f{f~L91LH[#HT$@LDLt$@L|$HL|$PH0fDL}fA?f;5QHE HLLHH$>4HIAGC!3H=\HUL$LLTDdHULu f{f~ fULz fA?fA>L9HJ 1LHH $H $HL$@L|$HGLt$PL|$XDH{0Ls8fA>XMvL9 1HL议AXMvL9AFD SAAX&&1'a,@HCxH{0Ls8L9AF@fkvHHH@ H9H$ 1HHHL$迩GHL$DL9HHCHBf8ULx fA?fA>L9HHC@ID$f8 H=u / =p`32=m'^!=nHExH9Lu071HH莛t9AV6 HtLHcЀ t@HcЀt4HcЀt(HcЀtHcЀtHcЀ GE =$HCxLs0L{8L911HL胓AHl$@H\$H$L|$PC:J׀ AAX &&1aHBf8UoLx fA?`f9VL9L@Cf.HBf8U"Lx fA?f9 L9L@C3DEb=9 =F3*=:cHExf;BHL}0=:H3=THCxLs0HC8fA>H$=jxIv`mI~<n!ڃyH=րH9IV1M>DoHEL$HPDHHݐHPMDHFݐIL:D1nEh =&!_ =$#HExLu0L}8fA>u f;BBfA?}L90B1LHm_LVN.E{!փiH=ҀH?H $D[HQLH~DDCLLNLM>E!RփH={рHO3H $D[HQELH~D2D/р AAX&&1 a@C9LNKvLMfE3!Ճ!H=ЀH}IH $D[HQLHXDrDoЀ AAX&&19 a@=w%={=7cHExUf;BKHLu0E=G4HCx&L{0M931LLAcHf=r =s=tsHExeH9Lu0'1HHQCH$d='pWIv`mJ C HExlf;BbHL}0\>.=$BHCx4Ls0HC8fA>H$=nIv`m I~ !`уH=̀H.IV1MD}eHL$HPDHHlԐHPMDHFUԐIL>D1+e[fDHH[LOL觾H fA<$7L脾H fA<$6LaHfA<$6L>HfA<$6LHgfA<$ &LHDfA<$%LսCCL萼H fA>FLnH fA>FLLHt|fA>\FL.Ht^fA>4FLHt@fA>HFLHt"fA> FLԻHfA>|&L費pCU颽CCOC=oHExH9Lu0o1HHH$=eIv`mI~f!ȃH==ÀH/7IV1MD1\HoL$HPDHH ːHPMDHF ːIL?D1[HExf;BHLu0HCx=dL{0!+ǃH=T€HIVMLDH\ʐILD12[bEE;.0HEx"f;BHLu0=vHCxL{0M9Q$1LL|Au f;B#fA?,L9D1LH|H$9=a,Iv`mI~!|ă H=HIV1M@DXH׷L$HPDHHǐHPMDHFqǐIL;D1GXwE=lRHExDH9Lu01HH0zt9H$ݼ=_мIv`mI~r! ƒH=IH/IV1MD=VH{L$HPDHH,ŐHPMDHFŐILCD1UHExf;BHL}0A/=$ػHCxʻLs0HC8fA>H$=v^Iv`mI~Hz!H=H/IV1MDUHQL$HPDHHĐHPMDHFÐILAD1THExغf;BκHL}0ȻB9=(HCxLs0HC8fA>H$=L]|Iv`moaI~P!̿[H=H\8IV1MDSH'L$HPDHHHPMDHFILBD1SǫLLHExf;BHLu0莺3HCxqL{0M9(1LL]uOAH$6=W)Iv`mI~!yH=HIV1M=DNHԭL$HPDHH腽HPMDHFnIL9D1DNtHEx[f;BQHL}0KE=,1HCx#Ls0HC8fA>H$ =VIv`mI~ӳ!OެH=xHIV1MDlMHL$HPDHH[HPMDHFDILED1MJHEx1f;B'HL}0!=&/=(HCxLs0HC8fA>H$=UӲIv`mI~u!#H=LHQ.IV1MD@LH~L$HPDHH/HPMDHFIL&D1KLBL$HExf;BHL}0@=,˱HCxLs0HC8fA>H$=iTIv`m~I~;m!鶃xH=H IV1MDKHDL$HPDHHHPMDHF޹IL@D1JHEx˰H9Lu01HHlu f;B&fA?L9 1LH lH$=IMyIv`ml^I~M!ɯXH=H&IV1MDCH$L$HPDHHղHPMDHF農IL=D1CěH|HDH$鄰M 6AAXq&&1`HH$鏽H$H A=AX&&1`H$CHt$@LHl$@Lt$H?cH! عAAX&&1U`鱹CH RAAX&&1`+ &AФAX&&1`C`Ht$@LHl$@Lt$H>cHCCVHt$@LHl$@Lt$H>cH֙{CGCVC)rCXHt$@LHl$@Lt$H/>cHz}HHHC9HC٧ A:AX&&1` AAX&&1o`qq A?AX&&1;`HCxLs0HC8fA>H$k=.I^Iv`mQCI~2!讫=H=צHqIV1MD?H L$HPDHH躮HPMDHF裮IL%D1y?驗K AAX&&1`HJH, rACAX?&&1`KHCx=GLs0!胪H=HIWMLDF购IL9D1>麖HLL AgAX?&&1`[CQ jAAX&&1`CAD$6ĞAD$6驞L. UASAX&&1`. 
ζAƌAX&&1M`駶O AAX&&1` A)AX?&&1`HCx,=ELs0!藨&H=HIWMLDFȫIL@D1<Δp ԘAAX&&1:`魘HCx=DELs0p!짃{H=Hu>IWMLDF!IL:D1;'L1 tAkAX&&1`HC} AҍAX&&1G`ڨI ADAX&&1`AF6 XAAX&&1`1Lϡ BAAX&&1` AAX?&&1e`HCx=oCLs0!H=@HIWMLDFHIL;D1:NHCx5=BLs0$!蠥/H=ɠHIWMLDFѨILDD19בy AٙAX?&&1C`Li=  AÛAX&&1`HCxN=BLs0=!蹤HH=⟀HujIWMLDFILED18 AݘAX&&1``b tAAX&&10`f2 AAX&&1`鼪HHCx;=@Ls0*!覣5H=ϞHu>IWMLDFۦIL>D17Hd{ tAAX&&1I`N AxAX&&1`ÙL AlAX&&1`}ޝ bAAX?&&1`; KAAX&&1t`$v -ANAX&&1@`AFH胝BTbDv=&#HCx=?HC0H$Iv`m*۳I~ݳ!臡H=HIV1MD5HHPMDHH蔤L$HPDHF|IL&D1R5邍AFS AAX?&&1`H0HЛ -AAX?&&1`HCx==HC0H$Iv`mٳzI~nܳi!H=EHAIV1MD94HwHPMDHH)L$HPDHFILDD13?HCx=<HC0H$Iv`mسI~۳}!0H=YHIV1MDM3HHPMDHH=L$HPDHF%IL?D12+HCx=;HC0H$Iv`m׳I~ڳ!MܑH=vHIV1MDj2HHPMDHHZL$HPDHFBILBD12Hꘀ AAX?&&1` jAAX?&&1`C APAX?&&1L`LLHCsAFHh=:(=;:=9!HCx=9HC0H$Iv`mֳI~س!nH=HIV1M=D0HɏHPMDHH{L$HPDHFcIL9D190iAFtLq iAAX?&&1`BHCx=8HC0H$Iv`mԳI~׳!?ΎH=hHUIV1M@D\/HHPMDHHLL$HPDHF4IL;D1 /:HCx=7HC0H$Iv`mӳI~ֳ!\H=HIV1M>Dy.HHPMDHHiL$HPDHFQIL:D1'.W A2AX?&&1`wŔ MAeAX?&&1`&HL~HHFAD$6@AD$6%AD$6 AD$6HCx=M6Ls0y!H=Hu>IWMLDF*IL'D1-0Ltʓ tAAX&&1` VAȎAX&&1g`/i  AvAX&&13`5 APAX&&1`ǜLHCƽ⒀ A%AX&&1` AAX&&1x`AFH軒>0>AR=% HCx=G4HC0H$Iv`mbг̼I~ӳ!迖NH=葀HIV1MD*HHPMDHH̙L$HPDHF贙IL%D1*麂AFKAF0vHH, QAAX?&&1`*HCx=3HC0H$Iv`mϳI~ѳt!xH=HAIV1MD)HӈHPMDHH腘L$HPDHFmILAD1C)s=׺HCxɺ=2HC0H$Iv`m/γI~г!茔H=HIV1MD(HHPMDHH虗L$HPDHF聗IL=D1W(釀HCx=11HC0H$ѹIv`mLͳI~ϳ!詓8H=ҎHIV1MD'HHPMDHH趖L$HPDHF螖IL>D1t'F AFAX?&&1` jAAX?&&1`Cލ AyAX?&&1` nAAX?&&1t`GHCx=~/Ls0!&H=OHIWMLDFWILCD1-&]~ AAX?&&1`HCx=.Ls0!{ H=HujIWMLDF谔ILAD1%}X AYAX&&1"`$ tAAX&&1`f AUAX?&&1`HCx=-Ls0!pH=HIWMLDF術IL?D1w$|I AAX?&&1`_ AAX&&1`ኀ UA-AX&&1`.HdH MAўAX?&&1g`&HHAFTHHH芊CHE&='=HCx/=,HC0H$Iv`m1ȳI~ʳ!莎H=HuzIV1MD"HHPMDHH蟑L$HPDHF臑IL'D1]"zH' yAAX?&&1`RHCx=*HC0H$Iv`mdzI~ɳ!sH=HIV1MD!H΀HPMDHH耐L$HPDHFhILED1>!ny@6HCx(=*HC0H$Iv`m*ƳI~ȳ!臌H=HqIV1MD HHPMDHH蔏L$HPDHF|IL@D1R x$ vAAX&&1`OHHHCx=(HC0H$Iv`mųI~dz!`~H=HIV1MD}H~HPMDHHmL$HPDHFUILCD1+[w nA AX?&&1Ǿ`GɅ MA<AX?&&1蓾`&L> A%AX&&1W`Y BAAX&&1#`% AoAX?&&1`[HHŤᄀ AAX?&&1諽`{HCx=&Ls0!]|H=HIIWMLDF莌IL&D1du6 AAX&&1`e AoAX?&&1̼`HCx=%Ls0!~ |H=HIWMLDF诋IL=D1tW AAX&&1!`鄘# A՝AX&&1` UAGAX&&1蹻`.AF6uAF6_AF6IAF63AF6eAF6O AAX?&&1I`pHCx=S$Ls0!zH=$Hu6IWMLDF0ILBD16s؁ tAAX&&1覺`@LHh|$ H4$flfn,~fpLjLz EAUAf{ f~ fmf{f~L9 1LH矎7UHDHEf8EC!4ނH=]HxC kA$ AX &&1 `Dff{MEf~ L9 1LHLD$#LD$ UHEf;DH4$|$ HT$@pLl$@L|$HH\$P HI DErf;)EfF fGLuHE M9HD$1LLaAIcc\$ IWIL]׏H$IHc׏IfDEf;u@H4$|$ HT$@lLD$@LD$Ll$HH\$PHILD$@EEf;u8H4$|$ HT$@pLD$@L|$HH\$P*HIIfDAEEIf;u@H4$|$ HT$@kLD$@LD$Ll$HH\$PHILD$@ETEf;u@H4$|$ HT$@nLD$@LD$Ll$HH\$PmHILD$@EEf;u@H4$|$ HT$@mLD$@LD$Ll$HH\$P HILD$'@ELuHE M9HD$ 1LL赈AHAuMǘMEŐH|$tHD$xy@HX[]A\A]A^A_LT$8lT$8` {HFE1jfDEOIvfAv MtIuLT$8謜xHIID$HT$8MeHAH{It$@ff tHGMeIt$@HH9tyH_fuyMeIt$@@HzLHH@HPHz DHLGHp8I9IHD$cLD|$8ۘxT$8H?ID$Xt>E11 fE1DH11D$?ID$XD$0HHPHAyHPxE1uAD9h6DLHTHAHyHu |$0u]H?ID$XIT$HBHBMd$8@,HD$ L9`8Ev_MI!A|$?tH_zH@@0k8MucMmMNA}"uIULHL-yMfDIULHMmMu1RHH=L!I(tf%=1AwUD)%FAMHH4#HHHLIH?z1LHH@HP]H@f84H@f81HxxHS@Ht!1LHPp 3 \\DD9t9L$LL$DHŠHt$HH|$@AغBHHڃHcȡIVHDD6InIDIDhLxHh1HDDHfDLL$8DL$0DD$(臘L$8DD$(DL$09DLL$8DL$0DD$(XL$8DL$0DD$(Q9FfA@~DLDD$(DD$(HD5)DD$(LHL$HH1HHcHt$H1H@AIDLxM{AFHI9H_HL$ f@HIcf @6AFEuAHUfAFAH 9DMvDEEEf2vt H1WHH8{tvHf.v1fAF D$ HChHD$HHH)H LCxHHL9%HHspIIHI)IL!IHI)M9v HHEHSxAumIII)M9LL9H%HI!LH)L9LL@H{hHL$HHHL$D@HHH{hLHL$LD$HHHspHLSxHL$IILD$II)fE È]H[]ftOHATUSHl$ D$(\$09Dgr <@9v1AN\ES(A9vE9sMMSM9v[1]A\FA9vJHLH;iv?A)DNAt$JDLHtfHHH9isD9A(s ty,uH9u[]A\@9sE1cAWAVAUATUSHLOMAAiMlHD$IEI9DbA9EDMUODN4M;ppDv\E;_IQI\$0L$T$Ht$ H|$I9DILBf.HJ4H9rD9HHHI9v 9r\$I9HD$ D9Dp@Hh(r}H$ML|$HD$( EODI@ HT$(Ht$`HDŽ$Hx@KoD$`MOD9vKTAHjH+$AD9sHD$MlMUH$H?;oHt$8HTH|$HtH$HtGE1HD[]A\A]A^A_L=vHD$ 
l$xLLpL09|$4xP|$(HxpH|$HHxhH|$8iD$E1LLHD$@LA܉hDur9l$H|$ptH$H9$t N(;L$(t;L$4wH~ L$4H|$He]1$DIL$L=$vV1HD$ DDp@Hh(b\$@DIHT$XHt$PH|$`HD$`H;D$Xt H;D$PH|$p@HD$H1҉HhD$ PD$@HPAVHD$@DD$HL$TLmH AHT$`H;T$XH;T$PKH|$ptH$Fo9D$\|$@tfoD$`~D$AE09foD$pAE@fo$AEPfo$AE`fo$MAAEpLl$HD$PH9D$`_IUDEDAD9sD$@E1D$ @E P|V+D$9$a|$9HD$HDx,9rHLy,tK9H$vH=aqssa!$\H=,qsNa#7HDŽ$y GSSHDŽ$D$@HFE1H^ H$HD$kSfH=ksHt ksd HyֹIL$HY#HIHѡH ksHt jks IL$HDIEE1Ld$E11D$EAGD$H=)ksHtJkst=LCH D1SHC H=js1HpH蓄H5js rSHD$HPHHBI;D$D D9|$DEH\fD9I>sCHTHHBH9r0UHHLfDH1HHHVH9w D9uI~H9{S(E^XAED$D9INH9KHsHHCCHHI9vIF~H)=,vHH9LKMFMLI@L)H9H9LD$HLT$@LL$8D\$0T$(HLqd{,T$(D\$0LL$8LT$@LD$HHOIA9VPЉACFPA9VLAFNLL MHt$L\$(HM9H~ATMBU1AQAMD[dH {,L\$(AC, = I HD$1HLL\$(HP}L\$(tTI HD$HLHP|t)HL֢H\$8L|$@DLt$PLHD$L` MA$AL$%98AAD$MtE HHIH gsHgsuShLXmL{(fIF`H&DPAH@HAF\9HAEAA@H\$Dd$0DˉD$(DT$8E܋T$(H|$ALơyAuH\$Dd$0KSuA|$(q`HH4HD$Hx HOT$0L$(IHD$T$0L$(L` A$AL$% A$AD$E{,A~UA;F\D$pH3A9VLKAFNLA9VPACVPH$LD\$0MH9Ht$xHCHt$$$HD$hHAкH~DH|$hATUPLT$HaH $LT$(D\$0JHD$`D$0Ht@D$0Ht$`AEL|$@LT$HD$tEƄ$H\$8AHDt$Lt$PH$$H$t$0Aǃ$|$9|$0E H$HD$`Lt$hHDHpL9! L$Dl$($l$\1PADL$pD;9щ|$(GH0I9LB9|$0IIHpL9P(DH,D9rAM9GD11AAL$8DB1T$0DDD$(H4:Ht$DD$(T$0L$8HF 0A@A DAAD@ IHPt(qH|HtI4$H2HIH9uHI@L8{AI\$ HHk9BUIL$PLt#IMuI\$ HHHQHɋpQ9 I\$H$A9@H\$A H=bsHt'1@蜛SID$ HxH$E11Hl$!ID$ HO9XFڿ3LtgvtH|$LD$tIF`HWpE1u\HH|$oA2I;nhtH>H<`IF`AHD9hDH|$L|Io Hq}uEuHEHqf87g@]Ld$AL&gLI"I\$ IT$H'A~U EHl$t H asHtΣSIL$HQDl$MDH(αID$E1HH$HD$QE1H$HHD$DAH@ F8S17SIL$HQqIt$E1H$Ht$1y{u~I|$ HH4H!oID$ h ډP_ID$ Eat$p9$B9B؋D$\9FЉT$\DH(AIF`H\$DT$8Dd$0Ht@D$(HD$HP HJ%9 ABHD$LtHPHHAuBt$t$(9LdHILDd$(Dd$HV1D9l$(vWD$HEARD$ATAVLL$h$H$XH D$D$PDD$(EB;l$\v)t%9݉FD9AGA9sD$tt$t,H\$8D|$0l$8Lt$DL|$@ELt$PLT$HHD$xH$H9vL)IFHL艔t$\D$8L$9FA9FXAFFX9L$(AFXHD$H@HtHFH;D$BH"HD$H@HtHNH;L$D$0;F(v8~,H9t)HII~HH9vH)INLT$8ܓT$8;T$(vD$(D$$A9A$$IL$E1H$IL$HE1H$1L\$(L\$(HHHDHHLLHt$(HL$0HH|$89HHt$(HL$0H|$8HHLLHHHDƃLKMMFS(H=dZsHt$(E1t$H=DZsHPAu%ZsE1t$t$(t$zFqxIH4HD$Hx HODD$L$HHD$DD$L$HP DƁJ% Bu1E1AAL$8EC1HT$0DDD$DL$H4Ht$DL$DD$HT$0L$8HF 0Dρ@ 0`EtnHLd$D$A1HD$DQH@ H1HD$hH\$8DD|$0EL|$@Lt$PLT$HHD$Dl$(D$\D$81$SXst$(E1t$1^ PgmV[ PSmVI1$ P5mVff.3rH9HD$  HEAHeH|$ HWHHRL| A'IGXt9AG_HI#GXHHt H9HgHAGPfAAG`/AGQ IG@HL|$@HD$0IG@HD$8dH=HD$DA)yHE(L|$(HD$HD$PH$f.HdH<$D$PHD$XH\$`cDIL$PIǍA@Ld$XD$PƍQLHAG A?AGA{LT$PHzAG ȃAAGH|$`cHD$XD$PHD$L$$LbH|$IwfAGIHLҙA(D$PHcHAGHvcHAGIjcHAGJ^cHAGKH_HHIG@ME1Dt$MMHH tADM$HHc L$ HI$x =H% HdAG(A<  H|$t LHH|$01LHbH=IG1ɉǺ1WIHbbH=MHD$DA)\6K&U@HbH=AG`AFL ӄHt f8uLxALńA< AO fufDL`H@JfvfQIM9DMDt$A?bAG@7HH3tIG`AFGA\AIQ1ITPHR f:uLzAWI9w%HHtIG(HHtHHIGHtHIGPHtIGXH|$``D$PHD$XAG A?AG@Ld$XLHAG кAGщ|$PIALd$lfHPf:Hx I$HPHJP2QsHP HRBaH@H;BSH;AII $@HaHcH9{Lz5H|$`_AWHD$XLd$X кAGAu7H|$`R_AWHD$X AGLd$XLT$PHփ AG AGLRH^H=Hu6Hx[]A\A]A^A_L|$(H^H=HtLt$ @H^IVHH|IHtD`LDd$REIvAD$E1HHD$@HH,tHHD$7^HH$+^H\$`HD$XIFIH|$Ht$D$PHP$HTHAHL9LI;t(HH9u11LsT%$ f.1LLL$AKTL$H}(HT$PHA@$IH@HIt`̙D9d$'f.D$@Ld$H XUHt$@H tE<($@H% fDHcu(H`Mt LH$H|$HmAH{x^IcH`HmHFH{1]Ht$H|$ |Ll$ MfDIU0f=wf#H{Eu,]IE0Hߋp^EttMeX1I4$HI0tIEH߃HHDHHcpTe^HCAT$Ht$@D$@HD$HHHD$PtA9uMmM>H{1\Hh[]A\A]A^A_@uHH]uIH]uJH]uKH]HU@HsHcEU $E1gHc DHTH H=]uL LL$ u MfLHAtD9e EP vE1Hu(Hqt@H}@HuHH߹FtHuPH߹/tHuXHt_1[BMmMY@H|$PZb@H|$PLY fH|$PYELd$HD$@<E1<DmAAyIAALusB@H% fDQHI ĸ@tIEH8[]A\A]fDXfDXfDXfDSH1H}u.f,ƒtPt1[[fH{tJfv΋[fH{wtH{jtЃ9fwHuHcҋ[DH[+(HD$HGH\$PH_ HL$HD$HGH\$XHHD$(H|$#Ht$H|$INqH$DŽ$DŽ$u1Hĸ[]A\A]A^A_Ht$H|$(INrH$utĄff$$AFtHD$HH@H@HL8@MAƒft fL9u% @HD$-%OtL`@MH@HHH@Ht I91Ll$%fDIEHHH@L9HL {A$f7ID$ f8tHLa"HLxHxfA?*f*t%f,HGf8*zLAMd$M,9IM~I^ H$IFL|$8H$8{H$1HHXH$Io@H\$pHXHH\$x$'HD$8H@HHH@H9u H E1HD$ 1D$`Lt$@5@$Ht$@H|$PHIAXAYI D$`H("Dc D9At$HD$HsJDHD$ H !D`AΉL$0AAt$E9HD$ MpN| 
EeEuAE9At$RAuKlHmHLl$0H|$0Lt$@VHD$0xFD$`t8HL$ LkLaHlfI4$I}IIkL9uL-'pLLoM)LH{HmHuT$`D$`HL$ HkLaL|fI<$IHo{H}f{M9uH$T{H|$pJ{H|$x@{L{6{Mt?LfHmH f} tu0yH} 1HGbE0u1H|$Ht$H|$INqH$DŽ$DŽ$^Ht$H|$4Df@fIG f8tHL$HL`H@HD$ HD$(HtDHH@H@HH( HmEƒft f3H9u% @In8/f.IF8H@H9EHLH{HHuLLE1"L:LZI]MfhfA<$*ZAD$<A$f~!f-%zHD$8H|$ 耽KHp;L|$Ld$0HHD$A~qInHIvXIF@HD$8IF`HD$ IFPtI~k A~r` MIn(AFq]KfIvH A~r A~rI^`DKIcEƒ@ t,I~8 A~r- A~rI A~r  o%IF`I^PE1MfX@f;6D$H] AFqD$pt#INHA~rA~roMH|$(1E1Ht$0fHD$ NdIuLI}贻IuL3utI}HT$XH u_IEf8*uxwxouAHt$LL3u0Ht$ L;3uHT$XHL7 MLL1E1=HH?HH1-{HD$0x3@HD$ Htx @Ht{CH=4pHhIHXx1PGID$@A|$wA|$oLd$8IF8H@H9Ef} Lm fA}@LHvHI)LrL1HP(0 vSH|$ PH|$0PLl$0Lt$@1,{MfIT$f:!DIT$f:!50 vSHL$0Iv LAHDP(${L$H$LHnHcoLl$8LLnHcoHELHx?eoLfoLH. t%kotH$H$ko t tIdžHD$0f80hHMu)H|$@uHtHHH!H9H^2+{A~q!INH I~P< t15I~X< t15){IF(HIv`LHHAvXMNPL@讬HZYEouIVPf:!*{{IvHt A~rI~`|%AFqaI^`CHcA@MF(҉D$@A(IvHt A~rvIvXHLD$H]ϦLD$Ht EAVq7AFqZIvHIVXH 9pf:!'H9IVPf:!HzIF0Ht({Iv`1ɉLKHQI~`H9tHѳL艮H4IV@Iv HL|$Ld$0R(!{IdžH=pHtIx1iPH\$8I9tD'{HL_H pL{HX{({HD$AW(HH@HH0+!{H|$( H|$(ucotH\$PL|$HL)nH`oLt$HIFHxaoL4coHLtgo tA~qt!I~ A~rA~rLYA~qI~t A~rMfPH59pIN(I9HA~ I;vXWXH;ptMkLHi WA\$:]HM;fXvEFsL-3L̰ %{L=pzHE1AjHLHZB|H_AXHE1E1jLGH‰xHY^I~`H1LHIV@Iv HL|$Ld$0R({Idž[I~`7#A~qI~I~`#Au2InPI^Xf}8- f;8 f}E f;EA~quI~_IVPf:!\IFXf8!NI^`f;*@L`IF(kLzHD$@M${HI9t LLSLAEHI9fzIvPHHD$HLD$HHL$@DHM苻HtH=pfI~`H9tH裯L[HIV@Iv HL|$Ld$0R(u{Idž+D v胺SH${H|$p{H|$x{L3{~${kD$`H$1~{H|$pt{H|$xj{1{>${WIF`IN(4MNPMFXpf84H>"{I~`H蛮LSHIV@Iv HL|$Ld$0R(m{IdžMF(DL$@A(H|$ HL|$Ld$0RHD$(HsH@HH;(uHUHHD$XHHHPt{1ҾHHt HH5H=֣pHIHLd$8&H=pL|$Ld$0HIHfA<$6HCf8!ID$f8!zIt$H{ȦdH{f?*-IH[Md$EHzI9NP1@I~`oI^PIv(HIF8I^(H$HIvPH{+vǦ0IvXH{aǦW$@fDIFPf86IvXHx%ǦtKI^0Hf;t8+H{1@)tIFPH@H;p H;pu A~qzI~dIFPH;իpIF0H FHpI9FX5Iv`I~PtƦ{IF`xzHLڧHz IV`HE1HBzjIHAYAZL I~`H9tH2LHHLHBoIF@Iv HL|$Ld$0P({IdžEDf;!fA<$!HCI\$1HHHD$@H)HH?H?@8 9Dd$HD{HD$`IF0HHt$`HNH]HB(@M@S@l$hD$H1){Mt'Iv`LĦtD\zIv`HIݩIv`L$H@L)HHHD$`HHMt%HILjHE1ɾ6D[A_HH|$@DI^` A{|$hIHHDHE1j7AZA[HeI~`H9tH+LHHELH;o1IF@Iv HL|$Ld$0P({IdžF>fDf;! 
Ht$PH|$^o~sLH=pHH|$Ht$PnH~Uo[H{H-æHAAIv(H:IF8In(H$HH}H¦Huf>*,f>0H95ΧpH;5ѧpCH;4@`:EMtfWt fZ!WfH0u A{IV`Ez+H;HEIN`t*E1ExH0E0HIN`H9t HHLˡH{IV@Iv HL|$Ld$0R({H$In(IF0IdžIF8%H}HAE1UH{H_HA3H}H=uI~`_HuI~PxHuI~P5HuI~X IvXI~PǨ VInXIDnPI~`HL|$Ld$0Idž-HpI9@f}ZLEEHo8+f}XH~%=28HH$Ht$@zHH&H$Ht$@H@H9B HHt$`HT$HСHHxHt$`HD$@4HT$HHHHL$@t*HpEH}HHuAHqv5SD$hD$HHt_H|$@D$HH|$`DH{HD$`H@l$hH\$@l$pHD$@Iv I~(\5#D$h\$HkH|$@I^`D9{HCDjE1IHH^_D$Hk(1E1fIH(1E1HHD$ HR{lIv I~(4 H9\$@<11HI^`Hc6{HE1IjHHFDAXAYL{I^`AVqCHcA@)Iv I~(#4A{=(1E1uHHH9 L9wr%{I~`HL豚HIV@Iv HL|$Ld$0R( {HuI~X艺sH$HL詨Ho{IFPH@H@HIN`HyyzٺHL觛HMF`IVX6AxHE1jH{HA[[tfI~`H9tHL躙HHLHoIF@Iv HL|$Ld$0P( {Idžs{z1ɺHLHtMF`IVX7D1IN`{IFHHIH9HpANQ{MF`HINXIVPEAxjpHZYtgI~`H9tHL谘HIV@Iv HL|$Ld$0R( {H$I^(IF0IdžIF8 {AMRE1LEE1K8IvPH{1zIvXH{e=2H$HL3H\Iv`I~X޷'1cjHsI~`EH{H=ՒpHA8H{1y{EHI~`HKHSE1PGPIFPpMNXH HE{&Iv I~(03I^0HI^(Iv H/IwHٶIGf8!{AH<͠L$HH#xHH9UfA?CtwUt}D|$@HDQ3{DHHڿBIM{LH肛L:HIV@Iv HL|$Ld$0R(T {uUbD|$@D2{DHHڿCDIff.@HRHHH4H|$`H@QkH$ OkH$NkH]@HHEHHH@H9u=Ht8 (1E1HD$ H@H|$0H#͠LD$PwHAzIjH|$0@LxLD$PHxpaSaSOSff.0CKR@f0C#RxHjOHspA\H$H$EDEH@HD$HH@D$ HD$1Ll$0E1Dt$(IHXfDHD$HhL} }t7AD;d$ tcHD$HHIŠD9` fL}MtHD$Hh0fDLH͒M@MuAD;d$ uDt$(Ll$0I]xHuIH[0H;H'uHDL} Mu=f.M M&A uM7Mt@LH5Mv Muf.L}MLHM MuH% HD$HxHpH4$GHI fDXOR@t H(D9xCTqkDt$lLl$0H$Ht GH$Htl GH$HP1۽u)@GHH;XL$(I<$Htڀx OI$H$뽐ھYH1OI61҃HI9^H$Ht9X~ ـ|uЄH yot/;OH \ofD{,u謫 I~HHHHoE11UDLd$Md$ MD$`D$`9D$ddHD$@HHHD$XL$`9HnfDH2a,OIwH=IoHH =ox%OT$`Ht$@H=o謐HD$EOdEH=oHHHD$P"HE1OIFUDT$kE1118LDD$L $HL$5oHL$H=AoH¾1OL $DD$HH$?H$E1LϹUL $DŽ$XNp$L $L$$H=oH$I@I6H$OHDŽ$OHDŽ$QHH$HH$@K1$DŽ$VHD$@H5rH(HM%OH AoH%oEPOLLD$LL$oH=oH¾1OLL$LD$$MH=ˇo1LL$ LD$L$OL$LD$LL$ 1ɄLLL$A?H5o LL$AOI61LL$ LL$LbÀu AXu'H=0ozLL $oH1LL$XLL$H$E11ɾhH$HDŽ$DŽ$L:JpLL$HnpA`DHHH19B(HD$9HH=moL$LLL$:oH=KoH¾P1$OL$LL$hoiAA^A:t'1_LE11ҾЫ`IpL $LLD$ LL$L$oH=…oH¾1蛾O<$LL$LD$ X L$$DŽ$Y꺋>CǗRC賗R1LL $.=pL $;IQHHuH rH@THRXHtS0L $ HzL $eIAHPXHt!  HxL $7Iy8IAt @@;AYuuA+IAxHufXHoHHϾ1OuQAnf.|$gHl$`HH!HD$8HH=|$HD$@A{HL$0HL1HuLH8H81Ȩtm1{HHH1H, HLH!H9*H!H9HL$8H9X1HL$@HD$HA8DFHfDH|$` H6AHH82$K7H82$V@H)H!G2I$@mFDD1DL$(AODL$(AHeL$ LǾ LlYH$HIH9G2M11LiHHFL0Dh@ 6eHH1HH8H81Dl$+H$Dl$H5 CsR(11H$HHPCxsRPCdsRo PsRf?8蜏OAC D9wH<HI1C@HtMtHtHpI}atXYHLL[]A\A]A^A_LL9@LL&@O0MJ\0M8J?HHH?@8@HH1@H)HHt6H?H1H)1HHHtHDDIL跫IIADىLIQID虫_@ēMIHD$H= iH0H iՓMIHD$H= iH0H i~MIfHD$0HHHD@<HGHD$HC@H@HD$CD$u HCHD$E1|$HD$H|$1HpYHHD$0yH4LhALI{IF1pIEI=xH|$1HIYH"Ht$HT$@LH|$@HLIHD$1HxR{ I5HkXLcP1L_H5WkHLqeLIH¾<L9nzIkLt$H|$@HIvIvI} AEHuzHCLEozIH|$0f?UG6< tK< tGG6t:G6t-G6t G6tG6HHt$HT$@LRHD$0H|$@HHptIHD$0HT$4HpbIwHxI zu0I}1PL~HIWnzHHMzzLLリ5nzLt$HIL2 HHL4~LnzIHt$LE1 FHIf8D`6HD$0Hxf?G6A9u HD$0Lx1L ]LHD$ ]H|$1LIV1LIOLD$ LL<HIkz1LHHPH|$1LPV1HLHOHD$1Hx`OLL$ {HLlzI}HAxH H|$0f?H H|$0f?x辻HH|$0f?b蛻HH|$0f?LxHeH|$0f?6UHBH|$0f? 
another actual argumentMore actual than formal arguments in procedure call at %LMissing alternate return specifier in subroutine call at %LUnexpected alternate return specifier in subroutine call at %LUnexpected NULL() intrinsic at %L to dummy %qsFortran 2008: Null pointer at %L to non-pointer dummy %qsInvalid procedure argument at %LInterface mismatch in dummy procedure %qs at %L: %sActual argument to contiguous pointer dummy %qs at %L must be simply contiguousType mismatch between actual argument at %L and actual argument at %L (%s/%s).Type mismatch in argument %qs at %L; passed %s to %sAssumed-type actual argument at %L requires that dummy argument %qs is of assumed typeActual argument to %qs at %L must be polymorphicActual argument to %qs at %L must have the same declared typeActual argument to %qs at %L must be unlimited polymorphic since the formal argument is a pointer or allocatable unlimited polymorphic entity [F2008: 12.5.2.5]Actual argument to %qs at %L must be a coarrayCorank mismatch in argument %qs at %L (%d and %d)Actual argument to %qs at %L must be simply contiguous or an element of such an arrayActual argument to non-INTENT(INOUT) dummy %qs at %L, which is LOCK_TYPE or has a LOCK_TYPE componentActual argument to non-INTENT(INOUT) dummy %qs at %L, which is EVENT_TYPE or has a EVENT_TYPE componentDummy argument %qs has to be a pointer, assumed-shape or assumed-rank array without CONTIGUOUS attribute - as actual argument at %L is not simply contiguous and both are ASYNCHRONOUS or VOLATILEPassing coarray at %L to allocatable, noncoarray, INTENT(OUT) dummy argument %qsPassing coarray at %L to allocatable, noncoarray dummy argument %qs, which is invalid if the allocation status is modifiedPolymorphic scalar passed to array dummy argument %qs at %LElement of assumed-shape or pointer array as actual argument at %L cannot correspond to actual argument at %LElement of assumed-shape or pointer array passed to array dummy argument %qs at %LExtension: Scalar non-default-kind, non-C_CHAR-kind CHARACTER actual argument with array dummy argument %qs at %LFortran 2003: Scalar CHARACTER actual argument with array dummy argument %qs at %LActual argument at %L to assumed-type dummy is of derived type with type-bound or FINAL proceduresActual argument at %L to allocatable or pointer dummy argument %qs must have a deferred length type parameter if and only if the dummy has oneCharacter length of actual argument shorter than of dummy argument %qs (%lu/%lu) at %LActual argument contains too few elements for dummy argument %qs (%lu/%lu) at %LExpected a procedure pointer for argument %qs at %LExpected a procedure for argument %qs at %LActual argument for %qs cannot be an assumed-size array at %LActual argument for %qs must be a pointer at %LFortran 2008: Non-pointer actual argument at %L to pointer dummy %qsCoindexed actual argument at %L to pointer dummy %qsCoindexed actual argument at %L to allocatable dummy %qs requires INTENT(IN)Coindexed ASYNCHRONOUS or VOLATILE actual argument at %L requires that dummy %qs has neither ASYNCHRONOUS nor VOLATILECoindexed actual argument at %L with allocatable ultimate component to dummy %qs requires either VALUE or INTENT(IN)Actual CLASS array argument for %qs must be a full array at %LActual argument for %qs must be ALLOCATABLE at %LArray-section actual argument with vector subscripts at %L is incompatible with INTENT(OUT), INTENT(INOUT), VOLATILE or ASYNCHRONOUS attribute of the dummy argument %qsAssumed-shape actual argument at %L is incompatible with the 
non-assumed-shape dummy argument %qs due to VOLATILE attributeArray-section actual argument at %L is incompatible with the non-assumed-shape dummy argument %qs due to VOLATILE attributePointer-array actual argument at %L requires an assumed-shape or pointer-array dummy argument %qs due to VOLATILE attributeMissing alternate return spec in subroutine call at %LMissing actual argument for argument %qs at %LCharacter length mismatch (%ld/%ld) between actual argument and pointer or allocatable dummy argument %qs at %LCharacter length mismatch (%ld/%ld) between actual argument and assumed-shape dummy argument %qs at %LProcedure %qs called at %L is not explicitly declared; did you mean %qs?Procedure %qs called at %L is not explicitly declaredProcedure %qs called with an implicit interface at %LThe pointer object %qs at %L must have an explicit function interface or be declared as arrayThe allocatable object %qs at %L must have an explicit function interface or be declared as arrayAllocatable function %qs at %L must have an explicit function interfaceExplicit interface required for polymorphic argument at %LKeyword argument requires explicit interface for procedure %qs at %LAssumed-type argument %s at %L requires an explicit interfaceActual argument of LOCK_TYPE or with LOCK_TYPE component at %L requires an explicit interface for procedure %qsActual argument of EVENT_TYPE or with EVENT_TYPE component at %L requires an explicit interface for procedure %qsMOLD argument to NULL required at %LAssumed-rank argument requires an explicit interface at %LProcedure pointer component %qs called with an implicit interface at %LKeyword argument requires explicit interface for procedure pointer component %qs at %LMOLD= required in NULL() argument at %L: Ambiguity between specific functions %s and %sEntity %qs at %L is already present in the interfacegfc_add_interface(): Bad interface typeCannot overwrite GENERIC %qs at %L%qs at %L overrides a procedure binding declared NON_OVERRIDABLE%qs at %L must not be DEFERRED as it overrides a non-DEFERRED binding%qs at %L overrides a PURE procedure and must also be PURE%qs at %L overrides an ELEMENTAL procedure and must also be ELEMENTAL%qs at %L overrides a non-ELEMENTAL procedure and must not be ELEMENTAL, either%qs at %L overrides a SUBROUTINE and must also be a SUBROUTINE%qs at %L overrides a FUNCTION and must also be a FUNCTIONResult mismatch for the overriding procedure %qs at %L: %s%qs at %L overrides a PUBLIC procedure and must not be PRIVATEDummy argument %qs of %qs at %L should be named %qs as to match the corresponding argument of the overridden procedureArgument mismatch for the overriding procedure %qs at %L: %s%qs at %L must have the same number of formal arguments as the overridden procedure%qs at %L overrides a NOPASS binding and must also be NOPASS%qs at %L overrides a binding with PASS and must also be PASSPassed-object dummy argument of %qs at %L must be at the same position as the passed-object dummy argument of the overridden procedure$opunformatted../../gcc/fortran/interface.c assignment ( = ) operator ( %o ) operator ( read ( %n ) write ( %n )ABSTRACT INTERFACE at %C.eq./=.ne..le..lt..ge..gt.'%s' is not a function'%s' is not a subroutineMismatch in PURE attributeoperator interface '%s'generic interface '%s'intrinsic '%s' operatorUnable to find symbol %qs_formal_%de8Hp8H8H05H8Hp8H8HiGHiGHHHHHHHHHHHHH2IH2IH2IH2IHHHHHIHIHIHIHHHHHIHIHIHIHH؜HHXHHXHH؜HHXHHXHHpH8HH HHHpH8HH HHӥHhHءHH@HhHءH`HHHHHФH`HHHHHФH 
gfc_find_specific_dtio_proccheck_dtio_interface1gfc_check_typebound_overridegfc_set_current_interface_headgfc_current_interface_headgfc_extend_assignmatching_typebound_opgfc_extend_exprargument_rank_mismatchcheck_sym_interfacesgfc_compare_interfacesgfc_check_operator_interfaceintrinsic assignadd_sym(): Bad sizing modeScalar entity required at %LArray entity required at %Ldo_check(): too many args../../gcc/fortran/intrinsic.c__convert_%c%d_%c%datomatomic_addatomic_andatomic_oratomic_xorcpu_timeeventitimeltimegmtimechdirchmoddate_and_timedtimeexecute_command_linecmdmsgcmdstatexitstatgetenvgetargget_commandget_command_argumentget_environment_variabletrim_namemove_allocbmvbitsimvbitsjmvbitskmvbitsimage_distinctrepeatablerandom_numberharvestrandom_seedfptrcptrc_f_procpointerfe_runtime_errorerrmsgsource_imageresult_image_F.caf_sendalarmsecondsfseekwhencepath2path1perrorsleepfstatlstatsignalsymlnksystem_clockcount_maxcount_rateumaskbabsiiabsjiabskiabszabscdabsdacosdacoshaimagimagpartdimaganintdasindasinhdatandatanhdatan2bessel_j0dbesj0bessel_j1dbesj1bessel_jndbesjnbessel_y0dbesy0bessel_y1dbesy1bessel_yndbesynbgebgtbltbbtestbitestbjtestbktestceilingcommand_argument_countdcmplxdconjgzcoscdcosdcoshcshiftdbleidimddimvector_bvector_adproddrealdshiftldshiftreoshiftderfcerfc_scaledzexpcdexpextends_type_ofmoldfnumfractiondgammaiacharbiandiiandjiandkiandiallianybbclriibclrjibclrkibclrbbitsiibitsjibitskibitsbbsetiibsetjibsetkibsetbieoriieorjieorkieorierrnoimage_indexbacksubstringifixidintint2bioriiorjiorkioriparityirandisattyis_iostat_endis_iostat_eorrshiftlshiftbshftiishftjishftkishftbshftciishftcjishftckishftcleadzlnblnklog_gammaalgamadlgamalgestring_bstring_algtllelltalogzlogcdlogalog10dlog10masklmaskrmatrix_bmatrix_aamax0amax1dmax1fsourcetsourcemerge_bitsamin0amin1dmin1bmodimodjmodkmodamoddmodmodulonearestidnintbnotinotjnotknotnorm2distancepopcntpopparrealpartfloatifloatjfloatkdfloatsnglncopiesrrspacingsame_type_asset_exponentshiftaisigndsignzsincdsindsinh_F.stridec_associatedc_ptr_2c_ptr_1c_funlocc_sizeofcompiler_optionscompiler_versionzsqrtcdsqrtdtandtanhtime8trailzmatrixdacosddasinddatanddatan2ddcosddcotanccotanzcotandcotanddsinddtand_F.caf_getavailable since Fortran 77obsolescent in Fortran 95deleted in Fortran 95new in Fortran 95new in Fortran 2003new in Fortran 2008new in Fortran 2018for backward compatibilitya GNU Fortran extensionCannot convert %s to %s at %LVariable with NO_ARG_CHECK attribute at %L is only permitted as argument to the intrinsic functions C_LOC and PRESENTAssumed-type argument at %L is not permitted as actual argument to the intrinsic %sAssumed-type argument at %L is only permitted as first actual argument to the intrinsic %sAssumed-rank argument at %L is only permitted as actual argument to intrinsic inquiry functionsAssumed-rank argument at %L is only permitted as first actual argument to the intrinsic inquiry function %sALLOCATED intrinsic at %L requires an array or scalar allocatable entityInvalid keyword %qs in %qs intrinsic function at %LToo many arguments in call to %qs at %LThe argument list functions %%VAL, %%LOC or %%REF are not allowed in this context at %LCannot find keyword named %qs in call to %qs at %LArgument %qs appears twice in call to %qs at %LMissing actual argument %qs in call to %qs at %LALTERNATE RETURN not permitted at %Ldo_simplify(): Too many args for intrinsicresolve_intrinsic(): Too many args for intrinsicIn call to %qs at %L, type mismatch in argument %qs; pass %qs to %qsinit_arglist(): too many argumentsmake_generic(): Cannot find generic symbol 
%qsInvalid standard code on intrinsic %qs (%d)Intrinsic %qs (is %s) is used at %LThe intrinsic %qs at %L is not included in the selected standard but %s and %qs will be treated as if declared EXTERNAL. Use an appropriate %<-std=%>* option or define %<-fall-intrinsics%> to allow this intrinsic.Function %qs as initialization expression at %LTransformational function %qs at %L is invalid in an initialization expressionElemental function as initialization expression with non-integer/non-character arguments at %LSubroutine call to intrinsic %qs in DO CONCURRENT block at %L is not PURESubroutine call to intrinsic %qs at %L is not PUREExtension: Conversion from %s to %s at %LNonstandard conversion from %s to %s at %LPossible change of value in conversion from %s to %s at %LConversion from %s to %s at %LCannot convert %qs to %qs at %L%qs declared at %L may shadow the intrinsic of the same name. In order to call the intrinsic, explicit INTRINSIC declarations may be required.%qs declared at %L is also the name of an intrinsic. It can only be called via an explicit interface or if declared EXTERNAL.HPH`HpHHH0HHH@Hilrcusuuhgfc_convert_chartypegfc_convert_type_warncheck_specificmake_genericset_attr_valuegfc_intrinsic_function_by_idgfc_intrinsic_subroutine_by_idgfc_isym_id_by_intmodFortran 2003: %s specifier in %s statement at %L has value %qs%s specifier in %s statement at %L has value %qsExtension: %s specifier in %s statement at %L has value %qs%s specifier in %s statement at %L has invalid value %qsSyntax error in I/O iterator at %CExpecting variable in READ statement at %CExpecting variable or io-implied-do in READ statement at %LExpecting variable or io-implied-do at %LExpected expression in %s statement at %CBOZ literal constant at %L cannot appear in an output IO listSymbol %qs in namelist %qs is INTENT(IN) at %CExtension: backslash character at %CNonconforming tab character in format at %CDP format specifier not allowed at %CDC format specifier not allowed at %CFortran 2003: DT format specifier not allowed at %CMissing leading left parenthesis in format string at %LLeft parenthesis required after %<*%> in format string at %LExpected P edit descriptor in format string at %LP descriptor requires leading scale factor in format string at %LUnexpected end of format string in format string at %LMissing item in format string at %LPositive width required in format string at %LComma required after P descriptor in format string at %LPositive exponent width required in format string at %LPositive width required with T descriptor in format string at %LZero width in format descriptor in format string at %LPeriod required in format specifier in format string at %LRight parenthesis expected at %C in format string at %LE specifier not allowed with g0 descriptor in format string at %LNonnegative width required in format string at %LUnexpected element %qc in format string at %LX descriptor requires leading space count at %L$ should be the last specifier in format at %LExtension: Zero width after L descriptor at %LMissing positive width after L descriptor at %LPositive width required in format specifier %s at %LPeriod required in format specifier %s at %LPeriod required in format specifier at %LThe H format specifier at %L is a Fortran 95 deleted featureConstant expression in FORMAT tag at %L must be of type default CHARACTERFORMAT tag at %L cannot be a zero-sized arrayFORMAT tag at %L must be of type default-kind CHARACTER or of INTEGERASSIGNED variable in FORMAT tag at %LVariable %qs at %L has not been assigned 
a format labelScalar %qs in FORMAT tag at %L is not an ASSIGNED variableNon-character non-Hollerith in FORMAT tag at %LNon-character in FORMAT tag at %LNon-character assumed shape array element in FORMAT tag at %LNon-character assumed size array element in FORMAT tag at %LNon-character pointer array element in FORMAT tag at %L%s tag at %L must be of type %s%s tag at %L must be a character string of default kindFortran 95 requires default INTEGER in %s tag at %LNon-default LOGICAL kind in %s tag at %LInvalid value for %s specification at %CDuplicate %s label specification at %CDuplicate %s specification at %C%s at %C is a DEC extension, enable with %<-fdec%>Variable %s cannot be INTENT(IN) at %CVariable %s cannot be assigned in PURE procedure at %CMissing format with default unit at %CDuplicate UNIT specification at %CDuplicate format specification at %CDuplicate NML specification at %CSymbol %qs at %C must be a NAMELIST group nameEND tag at %C not allowed in output statementPRINT namelist at %C is an extensionComma after * at %C not allowed without I/O listExpected comma in I/O list at %CExtraneous characters in format at %LFormat statement in module main block at %CFORMAT statement at %C cannot appear within an INTERFACEACTION type conflicts with READONLY specifier at %LASYNCHRONOUS= at %L not allowed in Fortran 95BLANK= at %L not allowed in Fortran 95DECIMAL= at %L not allowed in Fortran 95ENCODING= at %L not allowed in Fortran 95ROUND= at %L not allowed in Fortran 95SIGN= at %L not allowed in Fortran 95RECL in OPEN statement at %L must be positiveThe STATUS specified in OPEN statement at %L is %qs and no FILE specifier is presentThe STATUS specified in OPEN statement at %L cannot have the value SCRATCH if a FILE specifier is presentUNIT specifier not allowed with NEWUNIT at %LNEWUNIT specifier must have FILE= or STATUS='scratch' at %LOPEN statement at %L must have UNIT or NEWUNIT specified%s specifier at %L not allowed in OPEN statement for unformatted I/ORECL specifier not allowed in OPEN statement at %L for stream I/OPOSITION specifier in OPEN statement at %L only allowed for stream or sequential ACCESSOPEN statement not allowed in PURE procedure at %CCLOSE statement not allowed in PURE procedure at %CCLOSE statement at %L requires a UNIT numberUNIT number in CLOSE statement at %L must be non-negative%s statement not allowed in PURE procedure at %CUNIT number missing in statement at %LUNIT number in statement at %L must be non-negativeInternal file at %L must not be INTENT(IN)Internal file incompatible with vector subscript at %LREC tag at %L is incompatible with internal filePOS tag at %L is incompatible with internal fileUnformatted I/O not allowed with internal unit at %LASYNCHRONOUS tag at %L not allowed with internal fileInternal file at %L with namelistADVANCE tag at %L is incompatible with internal fileIO UNIT in %s statement at %L must be an internal file in a PURE procedureID= specifier at %L must be with ASYNCHRONOUS='yes' specifierthe DECIMAL= specifier at %L must be with an explicit format expressionthe BLANK= specifier at %L must be with an explicit format expressionPAD= at %L not allowed in Fortran 95the PAD= specifier at %L must be with an explicit format expressionSIGN= specifier at %L must be with an explicit format expressionSIGN= specifier at %L not allowed in a READ statementDELIM= at %L not allowed in Fortran 95DELIM= specifier at %L not allowed in a READ statementDELIM= specifier at %L must have FMT=*DELIM= specifier at %L must be with FMT=* or NML= 
specifierNAMELIST cannot be followed by IO-list at %LIO spec-list cannot contain both NAMELIST group name and format specification at %LIO spec-list cannot contain both NAMELIST group name and format label at %LNAMELIST IO is not allowed with a REC= specifier at %LNAMELIST IO is not allowed with a ADVANCE= specifier at %LAn END tag is not allowed with a REC= specifier at %LFMT=* is not allowed with a REC= specifier at %LPOS= is not allowed with REC= specifier at %LList directed format(*) is not allowed with a ADVANCE= specifier at %L.the ADVANCE= specifier at %L must appear with an explicit format expressionADVANCE= specifier at %L must have value = YES or NO.SIZE tag at %L requires an ADVANCE = %EOR tag at %L requires an ADVANCE = %END tag not allowed with output at %LEOR tag not allowed with output at %LBLANK= specifier not allowed with output at %LPAD= specifier not allowed with output at %LSIZE= specifier not allowed with output at %LSIZE tag at %L requires an ADVANCE tagEOR tag at %L requires an ADVANCE tagUNIT specification at %L must not be a character PARAMETERUNIT specification at %L must be an INTEGER expression or a CHARACTER variableInvalid form of WRITE statement at %L, UNIT requiredInternal unit with vector subscript at %LExternal IO UNIT cannot be an array at %LNAMELIST %qs in READ statement at %L contains the symbol %qs which may not appear in a variable definition contextNAMELIST object %qs in namelist %qs at %L is polymorphic and requires a defined input/output procedureNAMELIST object %qs in namelist %qs at %L with ALLOCATABLE or POINTER componentsNAMELIST object %qs in namelist %qs at %L has ALLOCATABLE or POINTER components and thus requires a defined input/output procedureComma before i/o item list at %LERR tag label %d at %L not definedEND tag label %d at %L not definedEOR tag label %d at %L not definedFORMAT label %d at %L not definedPRINT statement at %C not allowed within PURE procedureNULL() near %L cannot appear in INQUIRE statementINQUIRE statement not allowed in PURE procedure at %CIOLENGTH tag invalid in INQUIRE statement at %CINQUIRE statement at %L cannot contain both FILE and UNIT specifiersINQUIRE statement at %L requires either FILE or UNIT specifierUNIT number in INQUIRE statement at %L cannot be %dINQUIRE statement at %L requires a PENDING= specifier with the ID= specifierWAIT at %C not allowed in Fortran 95WAIT statement not allowed in PURE procedure at %Cio_kind_name(): bad I/O-kind$ descriptor at %L../../gcc/fortran/io.c% in format at %Lpositive width required at %LMissing comma at %L%s tag at %L must be scalarIOMSG tag at %LNEWUNIT specifier at %L%s tagCONVERT tag at %LExpecting a variable at %L unit = fmt = nml = %n readonlyREADONLY shareddenynone noshareddenyrwMissing format label at %CDELIM PAD BLANK CARRIAGECONTROLDECIMALDELIMENCODINGPADPOSITIONSTATUSscratchsequentialFLUSH statement at %CUNIT not specified at %Linternal unit in WRITEIOMSG%s tag with INQUIREIOSTATEXISTOPENEDNUMBERNAMEDSEQUENTIALDIRECTUNFORMATTEDRECLNEXTRECREADWRITEIOLENGTHPENDINGSTREAMAPOSTROPHEQUOTESUPPRESSPROCESSOR_DEFINEDNEARESTCOMPATIBLEYESCOMMAKEEPDELETEOLDSCRATCHREPLACEDENYNONEDENYRWASISAPPENDDEFAULT stream =NEWUNIT newunit = pending = id = eor = %l end =ERR err = pos = convert = iolength = sign = round = asynchronous = encoding = decimal = pad = delim = readwrite = write = read = action = position = blank = nextrec = recl = unformatted = formatted = form = direct = sequential = access = number = name = named = opened = exist = size = iostat = iomsg =FORMAT rec =ADVANCE 
advance = status = file = carriagecontrol = share =PIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIЂIIII IIII`IpIIЀI I I I I I I I I I IIIIIIIIIpIIIpIIIIIIIIIIIIII@I@IIII0III΃IIIIIIIIIIIăIIIIIIIIIIII؃IIIII0IIIIIIIIpIIIxIIfIfIfIfIfIfIfIfIfIfI IIIIfIfIfI IIIIIIIfIIII`I`II`I`I`I`I`I`I`I`I`I`I`I`I`I`I`I IIIIIډI0I؍IIIII I΋IIPIIIIIIIIIIIIIIIIIII$I$II$I$I$I$I$I$I$I$I$I$I$I$I$I$I$I I- r 3ddagfc_resolve_dtcheck_format\1 1 5U 1 2 5U T1  2 5U 2 5U Y2 %2 g)2 %2 02 42 %2 #;2 5U #B2 FX K1 M2 5U !Y2 5U a2 5U 

_gfortran_bessel_jn_r%d_gfortran_bessel_yn_r%d__btest_%d_%d__ceiling_%d_%c%d_gfortran_chdir_i%d_gfortran_chdir_i%d_sub_gfortran_chmod_func_gfortran_chmod_i%d_sub__cmplx0_%d_%c%d__cmplx1_%d_%c%d_%c%d__conjg_%d__cos_%c%d__cosh_%c%d_gfortran_count_%d_%c_gfortran_cshift%d_%d_char_gfortran_cshift%d_%d_char%d_gfortran_cshift%d_%d_gfortran_ctime__dble_%c%d__dim_%c%d_gfortran_dot_product_%c%d__dprod_r%ddshiftl_i%ddshiftr_i%d../../gcc/fortran/iresolve.c_gfortran_eoshift%d_%d_char_gfortran_eoshift%d_%d_char%d_gfortran_eoshift%d_%d__exp_%c%d__exponent_%d_gfortran_is_extension_of_gfortran_fdate__floor%d_%c%d_gfortran_fnum_i%d__fraction_%d__tgamma_%d_gfortran_getcwd_gfortran_getgid_gfortran_getpid_gfortran_getuid_gfortran_hostnm__hypot_r%d__iand_%d__ibclr_%d__ibits_%d__ibset_%d__ichar_%d__nint_%d_%d_gfortran_ierrno_i%d__ieor_%d__ior_%d__index_%d_i%d__int_%d_%c%d_gfortran_isatty_l%d__is_contiguous__ishft_%d_%d__rshift_%d_%d__lshift_%d_%d__ishftc_%d_%d_%d__lbound__lcobound__len_%d_i%d__len_trim%d__lgamma_%d_gfortran_link_i%d__loc_%d__log_%c%d__log10_%c%d__logical_%d_%c%d_gfortran_matmul_%c%d__max_%c%dsmaxlocmmaxloc_gfortran_%s%d_%d_%c%dmfindlocsfindloc_gfortran_%s%d_%c%dsmaxvalmmaxval_gfortran_%s_%c%d_gfortran_mclock_gfortran_mclock8__maskl_i%d__maskr_i%d__merge_%c%d__merge_bits_i%d__min_%c%dsminlocmminlocsminvalmminval__mod_%c%d__modulo_%c%d__nearest_%c%d__not_%d__or_%c%d_gfortran_pack_char_gfortran_pack_s_char_gfortran_pack_char%d_gfortran_pack_gfortran_pack_s_char%d_gfortran_pack_s__rank__real_%d_%c%d_gfortran_rename_i%d__repeat_%d_gfortran_reshape_char_gfortran_reshape_gfortran_reshape_%c%d_gfortran_reshape_char%d_gfortran_reshape_%d__rrspacing_%d_gfortran_runtime_error%VAL__scale_%d__scan_%d_gfortran_secnds__set_exponent_%d_gfortran_shape_%dshifta_i%d__sign_%c%d_gfortran_signal_func_int_gfortran_signal_func__sin_%c%d__sinh_%c%d__spacing_%d_gfortran_spread_char_scalar_gfortran_spread_char_gfortran_spread_scalar_gfortran_spread_char%d_gfortran_spread__sqrt_%c%d_gfortran_stat_i%d_gfortran_lstat_i%d_gfortran_fstat_i%d_gfortran_fgetc_gfortran_fget_gfortran_fputc_gfortran_fput_gfortran_ftell_gfortran_symlnk_i%d_gfortran_system__tan_%c%d__tanh_%c%d__this_image_gfortran_time_func_gfortran_time8_func_gfortran_transpose_char_gfortran_transpose_gfortran_transpose_%c%d_gfortran_transpose_i%d_gfortran_transpose_char4__trim_%d_gfortran_%s_%d__ubound__ucobound_gfortran_umask_i%d_gfortran_unlink_gfortran_ttynam_gfortran_unpack%d_char_gfortran_unpack%d_char%d_gfortran_unpack%d__verify_%d__xor_%c%d_gfortran_alarm_sub_int_i%d_gfortran_alarm_sub_i%d_gfortran_cpu_time_%d_gfortran_mvbits_i%ddummyarg_gfortran_random_init_gfortran_random_r%d_gfortran_arandom_r%d_gfortran_random_seed_i%d_gfortran_rename_i%d_sub_gfortran_link_i%d_sub_gfortran_symlnk_i%d_sub_gfortran_dtime_sub_gfortran_etime_sub_gfortran_itime_i%d_gfortran_idate_i%d_gfortran_ltime_i%d_gfortran_gmtime_i%d_gfortran_second_sub_gfortran_sleep_i%d_sub_gfortran_srand_gfortran_getarg_i%d_gfortran_getcwd_i%d_sub_gfortran_get_command_i%d_gfortran_signal_sub_int_gfortran_signal_sub_gfortran_system_sub_gfortran_system_clock_%d_gfortran_exit_i%d_gfortran_flush_i%d_gfortran_ctime_sub_gfortran_fdate_sub_gfortran_gerror_gfortran_getlog_gfortran_hostnm_i%d_sub_gfortran_perror_sub_gfortran_stat_i%d_sub_gfortran_lstat_i%d_sub_gfortran_fstat_i%d_sub_gfortran_fgetc_i%d_sub_gfortran_fget_i%d_sub_gfortran_fputc_i%d_sub_gfortran_fput_i%d_sub_gfortran_fseek_sub_gfortran_ftell_i%d_sub_gfortran_ttynam_sub_gfortran_umask_i%d_sub_gfortran_unlink_i%d_sub_gfortran_spread_char%d_scalar_gfor
tran_get_command_argument_i%d_gfortran_get_environment_variable_i%d_gfortran_execute_command_line_i%dgfc_resolve_shiftgfc_resolve_dshift__tmp_%s_%d__tmp_%s_%ld_%d__tmp_class_%s__tmp_type_%s**.and..or..eqv..neqv..not.gfc_op2string(): Bad codeInteger too large at %C../../gcc/fortran/match.cStatement label at %C is zeroName at %C is too long.XOR. operator at %C %n :% %n%t stat = %v errmsg = %v until_count = %eUNLOCK acquired_lock = %vSYNC statement at %C% %s%t %v = %v => %e then CRITICAL statement at %CNested CRITICAL block at %C %n =>Expected association at %CExpected %<)%> or %<,%> at %CInvalid type-spec at %C concurrentDO CONCURRENT construct at %C while ( %e )%t do PAUSE statement at %CERROR STOP statement at %CEVENT POST statement at %CEVENT WAIT statement at %CFAIL IMAGE statement at %CFORM TEAM statement at %CCHANGE TEAM statement at %CEND TEAM statement at %CSYNC TEAM statement at %CUNLOCK statement at %C to %v%tASSIGN statement at %CAssigned GOTO statement at %CComputed GOTO at %C%n :: Error in type-spec at %Ltypespec in ALLOCATE at %LUnexpected constant at %CERRMSG tag at %L source = %eSOURCE tag at %L mold = %eMOLD tag at %LERRMSG at %LJunk after CALL at %C_result_%sArithmetic IF statement at %C then%t if ( %e ) change teamcycleend fileend teamerror stopevent postevent waitfail imageform teamgo to ( %e ) %l , %l , %l%tnullifypauseprintsync allsync imagessync memorysync teamBLOCK DATA construct at %L / %s /EQUIVALENCE statement at %C = %e%tStatement function at %C select case ( %e )%t select type ( %n => %e select rank ( SELECT RANK statement at %C% default% is__tmp_%s_rank_%d__tmp_%s_rank_m%d where ( %e )cJpcJ`cJpcJ`cJPcJ@cJ0cJ cJcJcJbJbJbJbJbJbJbJbJpbJ`bJPbJ@bJ0bJ bJbJcJbJaJcJaJaJdJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJ8fJeJ8fJ8fJ8fJ8fJ8fJ8fJeJ8fJ8fJ8fJ8fJeJeJ8fJ8fJ8fJeJ8fJ8fJ8fJ8fJ8fJ8fJ8fJeJ8fJ8fJ8fJfJ8fJ fJeJdJ8fJeJhpJpJ-pJpJpJpJ-pJ-pJ-pJ-pJ-pJ-pJ-pJ-pJ-pJ-pJ-pJ-pJpJ pJHpJqJ-pJ-pJ-pJqJ-pJhrJ-pJ-pJ-pJ-pJ0rJ-pJrJqJ-pJ-pJ-pJ-pJ-pJ-pJ-pJ-pJ0qJ at %L. Use %<-fdollar-ok%> to allow it as an extensionExpected block name %qs of SELECT construct at %CExpected structure component or operator name after '.' 
at %C%qs is neither a defined operator nor a structure component in dotted string at %Cgfc_match(): Bad match code %cLabel name %qs at %C is ambiguousDuplicate construct label %qs at %CLoop variable at %C cannot be an arrayLoop variable at %C cannot be a coarrayLoop variable at %C cannot be a sub-componentExpected a step value in iterator at %CSyntax error in iterator at %C%s statement at %C leaving OpenACC structured block%s statement at %C leaving OpenMP structured blockCYCLE statement at %C to non-innermost tiled !$ACC LOOP loopCYCLE statement at %C to non-innermost collapsed !$ACC LOOP loopName %qs in %s statement at %C is unknownName %qs in %s statement at %C is not a construct name%s statement at %C leaves CRITICAL construct%s statement at %C leaves DO CONCURRENT construct%s statement at %C is not within a construct%s statement at %C is not within construct %qsCYCLE statement at %C is not applicable to non-loop construct %qsEXIT statement with no do-construct-name at %C%s statement at %C is not applicable to construct %qsEXIT statement at %C terminating !$ACC LOOP loopEXIT statement at %C terminating !$OMP DO loopCYCLE statement at %C to non-innermost collapsed !$OMP DO loopImage control statement EVENT %s at %C in PURE procedureImage control statement EVENT %s at %C in CRITICAL blockImage control statement EVENT %s at %C in DO CONCURRENT blockRedundant STAT tag found at %LRedundant ERRMSG tag found at %LRedundant UNTIL_COUNT tag found at %LImage control statement %s at %C in PURE procedureImage control statement %s at %C in CRITICAL blockImage control statement %s at %C in DO CONCURRENT blockRedundant ACQUIRED_LOCK tag found at %LImage control statement SYNC at %C in PURE procedureImage control statement SYNC at %C in CRITICAL blockImage control statement SYNC at %C in DO CONCURRENT blockInvalid form of PROGRAM statement at %CAssignment to a constant expression at %CBOZ literal constant at %L is neither a DATA statement value nor an actual argument of INT/REAL/DBLE/CMPLX intrinsic subprogramInvalid character(s) in ELSE statement after %CLabel %qs at %C doesn't match IF label %qsMissing %<(%> in ELSE IF expression at %CMissing %<)%> in ELSE IF expression at %CMissing THEN in ELSE IF statement after %LSyntax error in ELSE IF statement after %LLabel %qs after %L doesn't match IF label %qsImage control statement CRITICAL at %C in PURE procedureImage control statement CRITICAL at %C in DO CONCURRENT blockExpected association list at %CInvalid association target at %CDuplicate name %qs in association at %CAssociation target at %C must not be coindexedAssociation target at %L cannot be a BOZ literal constantJunk after ASSOCIATE statement at %CDerived type %qs at %L may not be ABSTRACTSyntax error in FORALL iterator at %CStatement label list in GOTO at %C cannot be emptyThe type parameter spec list in the type-spec at %L cannot contain DEFERRED parametersBad allocate-object at %C for a PURE procedureIncompatible allocate-object at %C for CHARACTER type-spec at %LALLOCATE of coarray at %C in DO CONCURRENT blockALLOCATE of coarray at %C in CRITICAL blockAllocate-object at %L is neither a data pointer nor an allocatable variableType of entity at %L is type incompatible with typespecKind type parameter for entity at %L differs from the kind type parameter of the typespecShape specification for allocatable scalar at %CSTAT tag at %L cannot be a constantRedundant SOURCE tag found at %LSOURCE tag at %L conflicts with the typespec at %LSOURCE tag at %L with more than a single allocate 
objectRedundant MOLD tag found at %LMOLD tag at %L conflicts with the typespec at %LMOLD tag at %L conflicts with SOURCE tag at %LAllocate-object at %L with a deferred type parameter requires either a type-spec or SOURCE tag or a MOLD tagUnlimited polymorphic allocate-object at %L requires either a type-spec or SOURCE tag or a MOLD tagPointer object at %C shall not be coindexedNULLIFY does not allow bounds remapping for pointer object at %CIllegal allocate-object at %C for a PURE procedureDEALLOCATE of coarray at %C in DO CONCURRENT blockDEALLOCATE of coarray at %C in CRITICAL blockAllocate-object at %C is not a nonprocedure pointer nor an allocatable variableImage control statement RETURN at %C in CRITICAL blockImage control statement RETURN at %C in DO CONCURRENT blockAlternate RETURN statement at %C is only allowed within a SUBROUTINERETURN statement in main program at %CExpected component reference at %CExpected type-bound procedure or procedure pointer component at %CA BOZ literal constant at %L cannot appear as an actual argument in a subroutine referenceMissing %<(%> in IF-expression at %CSyntax error in IF-expression at %CBlock label not appropriate for arithmetic IF statement at %CBlock label is not appropriate for IF statement at %CSyntax error in IF-clause after %CSyntax error in common block name at %CVariable %qs at %L in common block %qs at %C must be declared with a C interoperable kind since common block %qs is bind(c)Variable %qs in common block %qs at %C cannot be bind(c) since it is not globalSymbol %qs at %C is already in a COMMON blockInitialized symbol %qs at %C can only be COMMON in BLOCK DATAArray specification for symbol %qs in COMMON at %C must be explicitSymbol %qs in COMMON at %C cannot be a POINTER arraySymbol %qs, in COMMON block %qs at %C is being indirectly equivalenced to another COMMON block %qsNamelist group name %qs at %C already has a basic type of %sNamelist group name %qs at %C already is USE associated and cannot be respecified.Assumed size array %qs in namelist %qs at %C is not allowedDerived type component %C is not a permitted EQUIVALENCE memberArray reference in EQUIVALENCE at %C cannot be an array sectionEQUIVALENCE at %C requires two or more objectsAttempt to indirectly overlap COMMON blocks %s and %s by EQUIVALENCE at %CExpecting a comma in EQUIVALENCE at %CStatement function at %L is recursiveStatement function %qs at %L conflicts with function nameStatement function at %L cannot appear within an INTERFACESELECT TYPE at %C cannot appear in this scopeparse error in SELECT TYPE statement at %CSelector in SELECT TYPE at %C is not a named variable; use associate-name=>The SELECT RANK selector at %C must be an assumed rank variableparse error in SELECT RANK statement at %CUnexpected CASE statement at %CExpression in CASE selector at %L cannot be %sExpected initialization expression in CASE at %CSyntax error in CASE specification at %CUnexpected TYPE IS statement at %CThe type-spec shall not specify a sequence derived type or a type with the BIND attribute in SELECT TYPE at %C [F2003:C815]All the LEN type parameters in the TYPE IS statement at %C must be ASSUMEDSyntax error in TYPE IS specification at %CSyntax error in CLASS IS specification at %CUnexpected RANK statement at %CThe SELECT RANK CASE expression at %C must be a scalar, integer constantThe value of the SELECT RANK CASE expression at %C must not be less than zero or greater than %dSyntax error in RANK specification at %CELSEWHERE statement at %C not enclosed in WHERE blockLabel %qs at %C 
doesn't match WHERE label %qs . %n .match_level_4(): Bad operatorneqvBad character %qc in OPERATOR name at %CThe name %qs cannot be used as a defined operator at %CExpected a right parenthesis in expression at %CExpected exponent in expression at %CExtension: Unary operator following arithmetic operator (use parentheses) at %CKKKKK KKKKKK K%%  /8@<DHSyntax error in expression at %CLOGICALHOLLERITHDERIVEDVOIDBOZTYPE(*)INTEGER(%d)REAL(%d)COMPLEX(%d)LOGICAL(%d)CHARACTER(%d)CHARACTER(%ld)CHARACTER(%ld,%d)UNION(%s)TYPE(%s)CLASS(%s)CHARACTER(:)CHARACTER(*)CHARACTER(:,%d)CHARACTER(*,%d)CHARACTER(KIND=%d)gfc_code2string(): Bad codegfc_basic_typename(): Undefined typegfc_typename(): Undefined typeKKK K K K0 K@ KP K` Kp K KKK K K K0 KH K` K K K@ K K K` Kx K Kfree_pi_tree(): Unresolved fixupReading module %qs at line %d column %d: %sWriting module %qs at line %d column %d: %sModule %qs at line %d column %d: %srequire_atom(): bad atom type requiredSymbol %qs at %C already declaredassociate_integer_pointer(): Already associatedError writing modules file: %sUnsupported: multiple OpenACC 'routine' levels of parallelismunquote_string(): got bad stringwrite_symtree(): Symbol not writtenUse of intrinsic module %qs at %C conflicts with non-intrinsic module name used previouslyThe symbol %qs, referenced at %L, is not in the selected standardUse of the NUMERIC_STORAGE_SIZE named constant from intrinsic module ISO_FORTRAN_ENV at %L is incompatible with option %qsUse of the NUMERIC_STORAGE_SIZE named constant from intrinsic module ISO_FORTRAN_ENV at %C is incompatible with option %sSymbol %qs referenced at %L not found in intrinsic module ISO_FORTRAN_ENVmodule nature in USE statement at %CModule nature in USE statement at %C shall be either INTRINSIC or NON_INTRINSIC"::" was expected after module nature at %C but was not foundMissing generic specification in USE statement at %CRenaming operators in USE statements at %CSymbol %qs at %L conflicts with the rename symbol at %LSymbol %qs at %L conflicts with the symbol at %LThe name %qs at %C has already been used as an external module nameSUBMODULE declaration at %C cannot appear within another scoping unitSyntax error in SUBMODULE statement at %Cmio_array_ref(): Unknown array refBad type in constant expressionAmbiguous !$OMP DECLARE REDUCTION from module %s at %LPrevious !$OMP DECLARE REDUCTION from module %s at %Lwrite_symbol(): bad module symbol %qsMismatch in components of derived type %qs from %qs at %C: expecting %qs, but got %qs%qs of module %qs, imported at %C, is also the name of the current program unitSymbol %qs referenced at %L not found in module %qsUser operator %qs referenced at %L not found in module %qsIntrinsic operator %qs referenced at %L not found in module %qsUSE statement at %C has no ONLY qualifierISO_FORTRAN_ENV intrinsic module at %Cimport_iso_c_binding_module(): Unable to create symbol for %sSymbol %qs referenced at %L not found in intrinsic module ISO_C_BINDINGCannot find an intrinsic module named %qs at %CCannot open module file %qs for reading at %C: %sModule file %qs has not been generated, either because the module does not contain a MODULE PROCEDURE or there is an error in the module.Use of non-intrinsic module %qs at %C conflicts with intrinsic module name used previouslyFile %qs opened at %C is not a GNU Fortran module fileCannot read module file %qs opened at %C, because it was created by a different version of GNU FortranCannot USE a submodule that is currently builtCannot USE a module that is currently builtCannot open module 
file %qs for writing at %C: %sGFORTRAN module version '%s' created from %s Error writing module file %qs for writing: %sCannot delete module file %qs: %sCannot rename module file %qs to %qs: %sCannot delete temporary module file %qs: %sUnexpected EOFName too longBad nameExpected nameExpected left parenthesisExpected right parenthesisExpected integerExpected stringfind_enum(): Enum not foundrb../../gcc/fortran/module.cExpected real string0.%s@%ldExpected attribute bit nameExpected integer stringError converting integer\U%08lx%01xSymbol %qs already declared%c%s8atomic_int_kindatomic_logical_kindcharacter_storage_sizeerror_unitfile_storage_sizeinput_unitiostat_inquire_internal_unitnumeric_storage_sizeoutput_unitreal32real64stat_lockedstat_locked_other_imagestat_stopped_imagestat_failed_imagestat_unlockedcharacter_kindsinteger_kindslogical_kindsreal_kindsevent_typeteam_type %n ::non_intrinsic"USE :: module" at %C only :SUBMODULE declaration at %C%s@%sExpected expression typeDEFERRED_CLomp_inomp_origomp_privomp_outoperator OMP_DECLARE_SIMD__vtab___vtype_c_shortc_longc_long_longc_intmax_tc_intptr_tc_ptrdiff_tc_size_tc_signed_charc_int8_tc_int16_tc_int32_tc_int64_tc_int128_tc_int_least8_tc_int_least16_tc_int_least32_tc_int_least64_tc_int_least128_tc_int_fast8_tc_int_fast16_tc_int_fast32_tc_int_fast64_tc_int_fast128_tc_floatc_doublec_long_doublec_float128c_float_complexc_double_complexc_long_double_complexc_float128_complexiso_fortran_envISO_C_BINDING module at %C__iso_c_bindingieee_featuresIEEE_FEATURES module at %Cieee_exceptionsIEEE_EXCEPTIONS module at %Cieee_arithmeticIEEE_ARITHMETIC module at %CUnexpected end of moduleGFORTRAN__BLNK__%s.eq.%dSIMDLENLINEAR_REFLINEAR_VALLINEAR_UVALUPLUSUMINUSTIMESDIVIDEPOWERCONCATNEQVEQ_SIGNNE_SIGNGT_SIGNGTGE_SIGNLT_SIGNLE_SIGNNOTPARENTHESESCONSTANTVARIABLESUBSTRINGCOMPCALLINQUIRYELEMENTASSUMEDNO_PPCSPECIFICIN_NAMELISTIN_COMMONALWAYS_EXPLICITCRAY_POINTERCRAY_POINTEEIS_C_INTEROPIS_ISO_CALLOC_COMPCOARRAY_COMPLOCK_COMPEVENT_COMPPROC_POINTER_COMPPRIVATE_COMPZERO_COMPIS_CLASSPROC_POINTERVTYPEVTABCLASS_POINTERIMPLICIT_PUREUNLIMITED_POLYOMP_DECLARE_TARGETARRAY_OUTER_DEPENDENCYMODULE_PROCEDUREOACC_DECLARE_CREATEOACC_DECLARE_COPYINOACC_DECLARE_DEVICEPTROACC_DECLARE_DEVICE_RESIDENTOACC_DECLARE_LINKOMP_DECLARE_TARGET_LINKPDT_KINDPDT_LENPDT_TYPEPDT_TEMPLATEPDT_ARRAYPDT_STRINGOACC_ROUTINE_LOP_GANGOACC_ROUTINE_LOP_WORKEROACC_ROUTINE_LOP_VECTOROACC_ROUTINE_LOP_SEQKKK6K6K6K6K6K6KhKhKhKhKhKhKhKhKhKhK6K6K6K6K6K6K6K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K6K6K6K6K6K6K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@K@KKKK^K^K^K^K^K^KKKKKKKKKKK^K^K^K^K^K^K^KpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpK^K^K^K^K^K^KpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpKpK@KPKK K0K.K.K.K.K.K.K.Kp.K`.KP.K@.K0.K .K.K.K-K-K-K-K-K-K-K-Kp-K`-KP-K@-K0-K -K-K-K,K,K,K,K,K,K,K,Kp,K`,KP,K@,K0,K ,K,K,K+K+K+K+K+K+K+K+Kp+K`+KP+K@+K0+K +K+K+K*K*K*K*K*K.Ky*K1K1Kj*K1K & : Q \ n Implicitly declared function %s used in !$OMP DECLARE REDUCTION at %LVariable other than OMP_PRIV or OMP_ORIG used in INITIALIZER clause of !$OMP DECLARE REDUCTION at %LVariable other than OMP_OUT or OMP_IN used in combiner of !$OMP DECLARE REDUCTION at %LCOMMON block /%s/ not found at %CSyntax error in OpenMP variable list at %CList item shall not be coindexed at %CSyntax error in OpenACC expression list at %C%s clause at %L requires a scalar INTEGER expression!$ACC LOOP cannot be a DO WHILE or DO without loop control at %L!$ACC LOOP cannot be a DO CONCURRENT loop at %L!$ACC LOOP iteration variable must be of 
type integer at %L!$ACC LOOP %s loops don't form rectangular iteration space at %L%s !$ACC LOOP loops not perfectly nested at %Lnot enough DO loops for %s !$ACC LOOP at %LAssumed size array %qs in %s clause at %LAssumed rank array %qs in %s clause at %LUnexpected junk after $OMP ATOMIC statement at %CINTEGER expression of %s clause at %L must be positiveINTEGER expression of %s clause at %L must be non-negativeImplicitly declared subroutine %s used in !$OMP DECLARE REDUCTION at %LORDERED clause parameter is less than COLLAPSE at %LIF clause at %L requires a scalar LOGICAL expressionIF clause without modifier at %L used together with IF clauses with modifiersIF clause modifier %s at %L not appropriate for the current OpenMP constructFINAL clause at %L requires a scalar LOGICAL expressionSCHEDULE clause's chunk_size at %L requires a scalar INTEGER expressionINTEGER expression of SCHEDULE clause's chunk_size at %L must be positiveNONMONOTONIC modifier specified for %s schedule kind at %LBoth MONOTONIC and NONMONOTONIC schedule modifiers specified at %LNONMONOTONIC schedule modifier specified with ORDERED clause at %LVariable %qs is not a dummy argument at %LObject %qs is not a variable at %L; parameters cannot be and need not be copiedObject %qs is not a variable at %L; parameters cannot be and need not be mappedObject %qs is not a variable at %LSymbol %qs has mixed component and non-component accesses at %LSymbol %qs present on multiple clauses at %LArray %qs is not permitted in reduction at %LSymbol %qs present on both FROM and TO clauses at %LNon-THREADPRIVATE object %qs in COPYIN clause at %LAssumed size array %qs in COPYPRIVATE clause at %LINTENT(IN) POINTER %qs in COPYPRIVATE clause at %LTHREADPRIVATE object %qs in SHARED clause at %LCray pointee %qs in SHARED clause at %LASSOCIATE name %qs in SHARED clause at %L%qs in ALIGNED clause must be POINTER, ALLOCATABLE, Cray pointer or C_PTR at %L%qs in ALIGNED clause at %L requires a scalar positive constant integer alignment expressionSINK dependence type only allowed on ORDERED directive at %LDEPEND SINK used together with DEPEND SOURCE on the same construct at %LSINK addend not a constant integer at %LOnly SOURCE or SINK dependence types are allowed on ORDERED directive at %L%qs in %s clause at %L is not a proper array sectionStride should not be specified for array section in %s clause at %L%qs in DEPEND clause at %L is a zero size array sectionPOINTER object %qs in %s clause at %LCray pointer object %qs in %s clause at %LCray pointee object %qs in %s clause at %LALLOCATABLE object %qs in %s clause at %LVALUE object %qs in %s clause at %LTARGET%s with map-type other than TO, FROM, TOFROM, or ALLOC on MAP clause at %LTARGET ENTER DATA with map-type other than TO, or ALLOC on MAP clause at %LTARGET EXIT DATA with map-type other than FROM, RELEASE, or DELETE on MAP clause at %LTHREADPRIVATE object %qs in %s clause at %LCray pointee %qs in %s clause at %LNon-dummy object %qs in %s clause at %LASSOCIATE name %qs in %s clause at %LProcedure pointer %qs in %s clause at %LCray pointer %qs in %s clause at %LVariable %qs in %s clause is used in NAMELIST statement at %LINTENT(IN) POINTER %qs in %s clause at %L!$OMP DECLARE REDUCTION %s not found for type %s at %LLINEAR clause modifier used on DO or SIMD construct at %LLINEAR clause specified together with ORDERED clause with argument at %LLINEAR variable %qs must be INTEGER at %LLINEAR dummy argument %qs with VALUE attribute with %s modifier at %L%qs in LINEAR clause at %L requires a scalar integer 
linear-step expression%qs in LINEAR clause at %L requires a constant integer linear-step expression or dummy argument specified in UNIFORM clausePOINTER object %qs of polymorphic type in %s clause at %L%s clause variable %qs at %L is neither a POINTER nor an arrayCray pointer object %qs of derived type in %s clause at %LCray pointee object %qs of derived type in %s clause at %LCray pointer object %qs of polymorphic type in %s clause at %LCray pointee object %qs of polymorphic type in %s clause at %LDIST_SCHEDULE clause's chunk_size at %L requires a scalar INTEGER expressionIncompatible use of TILE and COLLAPSE at %LSOURCE dependence type only allowed on ORDERED directive at %L%s must contain at least one MAP clause at %Lunexpected !$OMP ATOMIC expression at %L!$OMP ATOMIC statement must set a scalar variable of intrinsic type at %L!$OMP ATOMIC READ statement must read from a scalar variable of intrinsic type at %Lexpr in !$OMP ATOMIC WRITE assignment var = expr must be scalar and cannot reference var at %L!$OMP ATOMIC CAPTURE capture statement must read from a scalar variable of intrinsic type at %L!$OMP ATOMIC CAPTURE update statement must set a scalar variable of intrinsic type at %L!$OMP ATOMIC CAPTURE capture statement reads from different variable than update statement writes into at %L!$OMP ATOMIC with ALLOCATABLE variable at %L!$OMP ATOMIC assignment operator must be binary +, *, -, /, .AND., .OR., .EQV. or .NEQV. at %L!$OMP ATOMIC var = var op expr not mathematically equivalent to var = var op (expr) at %Lexpr in !$OMP ATOMIC assignment var = var op expr must be scalar and cannot reference var at %L!$OMP ATOMIC assignment intrinsic IAND, IOR or IEOR must have two arguments at %L!$OMP ATOMIC assignment intrinsic must be MIN, MAX, IAND, IOR or IEOR at %L!$OMP ATOMIC intrinsic arguments except one must not reference %qs at %L!$OMP ATOMIC intrinsic arguments must be scalar at %LFirst or last !$OMP ATOMIC intrinsic argument must be %qs at %L!$OMP ATOMIC assignment must have an operator or intrinsic on right hand side at %L!$OMP ATOMIC CAPTURE capture statement must set a scalar variable of intrinsic type at %L!$OMP ATOMIC assignment must be var = var op expr or var = expr op var at %LCOLLAPSE clause argument not constant positive integer at %Cdefaultmap ( tofrom : scalar )Syntax error in OpenMP DEPEND SINK list at %CVariable at %C is an element of a COMMON blockUnexpected junk after !$ACC DECLARE at %CSyntax error in !$ACC DECLARE list at %CORDERED clause argument not constant positive integer at %C!$OMP DECLARE REDUCTION %s not found at %LInvalid clause in module with !$ACC DECLARE at %LVariable is USE-associated with !$ACC DECLARE at %LVariable %qs shall be declared in the same scoping unit as !$ACC DECLARE at %LAssumed-size dummy array with !$ACC DECLARE at %L% must contain at least one % or % or % clause at %LInvalid argument to !$ACC WAIT at %CWAIT clause at %L requires a scalar INTEGER expressionACC CACHE directive must be inside of loop %COnly the !$ACC ROUTINE form without list is allowed in interface block at %CSyntax error in !$ACC ROUTINE ( NAME ) at %CSyntax error in !$ACC ROUTINE ( NAME ) at %C, expecting ')' after NAMEMultiple loop axes specified for routine at %CIntrinsic symbol specified in !$ACC ROUTINE ( NAME ) at %C marked with incompatible GANG, WORKER, or VECTOR clause!$ACC ROUTINE already applied at %CInvalid NAME %qs in !$ACC ROUTINE ( NAME ) at %CUnexpected junk after $OMP CRITICAL statement at %CUnexpected junk after $OMP FLUSH statement at %CRedefinition of 
predefined %s !$OMP DECLARE REDUCTION at %LRedefinition of predefined !$OMP DECLARE REDUCTION at %LRedefinition of !$OMP DECLARE REDUCTION at %LPrevious !$OMP DECLARE REDUCTION at %LUnexpected junk after !$OMP DECLARE REDUCTION at %COnly the !$OMP DECLARE TARGET form without clauses is allowed in interface block at %CUnexpected junk after !$OMP DECLARE TARGET at %COMP DECLARE TARGET variable at %L is an element of a COMMON blockOMP DECLARE TARGET variable at %L previously mentioned in LINK clause and later in TO clauseOMP DECLARE TARGET variable at %L previously mentioned in TO clause and later in LINK clauseVariable at %L mentioned multiple times in clauses of the same OMP DECLARE TARGET directiveOMP DECLARE TARGET COMMON at %L previously mentioned in LINK clause and later in TO clauseOMP DECLARE TARGET COMMON at %L previously mentioned in TO clause and later in LINK clauseCOMMON at %L mentioned multiple times in clauses of the same OMP DECLARE TARGET directiveSyntax error in !$OMP DECLARE TARGET list at %CThreadprivate variable at %C is an element of a COMMON blockUnexpected junk after OMP THREADPRIVATE at %CSyntax error in !$OMP THREADPRIVATE list at %CUnexpected junk after TASKWAIT clause at %CUnexpected junk after TASKYIELD clause at %CUnexpected junk after $OMP WORKSHARE statement at %CUnexpected junk after $OMP MASTER statement at %CUnexpected junk after $OMP BARRIER statement at %CUnexpected junk after $OMP TASKGROUP statement at %CUnexpected junk after $OMP CANCELLATION POINT statement at %CUnexpected junk after NOWAIT clause at %CArray sections: %qs not allowed in !$ACC DECLARE at %LNAME %qs does not refer to a subroutine or function in !$ACC ROUTINE ( NAME ) at %LNAME %qs invalid in !$ACC ROUTINE ( NAME ) at %LThe %s directive cannot be specified within a %s region at %L!$OMP DISTRIBUTE PARALLEL DO SIMD!$OMP TARGET TEAMS DISTRIBUTE PARALLEL DO!$OMP TARGET TEAMS DISTRIBUTE PARALLEL DO SIMD!$OMP TARGET TEAMS DISTRIBUTE SIMD!$OMP TEAMS DISTRIBUTE PARALLEL DO!$OMP TEAMS DISTRIBUTE PARALLEL DO SIMD%s cannot be a DO WHILE or DO without loop control at %L%s cannot be a DO CONCURRENT loop at %L%s iteration variable must be of type integer at %L%s iteration variable must not be THREADPRIVATE at %L%s iteration variable present on clause other than PRIVATE or LASTPRIVATE at %L%s iteration variable present on clause other than LASTPRIVATE at %L%s iteration variable present on clause other than LINEAR at %L%s collapsed loops don't form rectangular iteration space at %Lcollapsed %s loops not perfectly nested at %Lnot enough DO loops for collapsed %s at %LOMP TARGET UPDATE at %L requires at least one TO or FROM clause!$OMP DECLARE SIMD should refer to containing procedure %qs at %LInvalid operator for !$OMP DECLARE REDUCTION %s at %LCHARACTER length in !$OMP DECLARE REDUCTION %s not constant at %LMissing INITIALIZER clause for !$OMP DECLARE REDUCTION of derived type without default initializer at %LSubroutine call with alternate returns in combiner of !$OMP DECLARE REDUCTION at %LSubroutine call with alternate returns in INITIALIZER clause of !$OMP DECLARE REDUCTION at %LOne of actual subroutine arguments in INITIALIZER clause of !$OMP DECLARE REDUCTION must be OMP_PRIV at %LTiled loop cannot be parallelized across gangs, workers and vectors at the same time at %LTILE requires constant expression at %L../../gcc/fortran/openmp.c% seq_cst, seq_cstArray is not contiguous at %LSAFELENHINTPRIORITYTHREAD_LIMITGRAINSIZENUM_TASKSNUM_GANGSNUM_WORKERSVECTOR_LENGTHaligned (attach ( collapse ( %e )copyin 
(copyprivate (default ( none )default ( present )default ( firstprivate )default ( private )default ( shared )delete ( depend ( sink : source )detach ( device ( %e )device ( deviceptr ( device_resident (dist_schedule ( static , %e )final ( %e )firstprivate (from ( static : num :grainsize ( %e )hint ( %e )host ( if ( if_presentindependentis_device_ptr (lastprivate (linear ( ref ( val ( uval (link (map ( always , alloc : tofrom : to : release : delete : mergeableno_create ( nogroupnotinbranchnum_gangs ( %e )num_tasks ( %e )num_teams ( %e )num_threads ( %e )num_workers ( %e )pcopy ( pcopyin ( pcopyout ( pcreate ( present ( present_or_copy ( present_or_copyin ( present_or_copyout ( present_or_create ( priority ( %e )proc_bind ( master )proc_bind ( spread )proc_bind ( close )reduction ( operator %ssafelen ( %e )schedule ( nonmonotonicself ( shared (simdlen ( %e )thread_limit ( %e )tile (uniform (untieduse_device (use_device_ptr (use_device_addr (vector_length ( %e ) length :Failed to match clause at %CUnexpected junk at %C ( %s ) %o : %n : initializer ( % nowaittiledcollapsed!$OMP DISTRIBUTE!$OMP DISTRIBUTE PARALLEL DO!$OMP DISTRIBUTE SIMD!$OMP DO!$OMP DO SIMD!$OMP PARALLEL DO!$OMP PARALLEL DO SIMD!$OMP SIMD!$OMP TARGET PARALLEL DO!$OMP TARGET PARALLEL DO SIMD!$OMP TARGET SIMD!$OMP TARGET TEAMS DISTRIBUTE!$OMP TASKLOOP!$OMP TASKLOOP SIMD!$OMP TEAMS DISTRIBUTE!$OMP TEAMS DISTRIBUTE SIMD parallel : %e ) task : %e ) taskloop : %e ) target : %e ) target data : %e ) target update : %e ) target enter data : %e ) target exit data : %e )KpK0KKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKK@K0K KKKKKKKKKKKpKFKFK`KPK@K0K KKKKKKKKKKKpK`KPK@K0K KKKKKKFKKKKKKpK`KXKKKKKKKKKK K0K@KPK`KpKxKK(K_K KKKKKKKKKKKKKXKKKKKKKKKKKKKKKKKKKKKK@K@K@KKKKKKKKKKKKKKKKKKKKKKKKKK(K(K(K6KzKzKkKkKkKzKzKzKkKkKkKkKzKzKzKzKzKzKzKzKzKzKzKzKzKzKzKzKKKKxKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKKrKKKKKKKKKKKKKKKKKKKIK`LLLLLp L LLLLLLpL L LLLP LL@LLLL ;L:L:L;L:L:L:L:L ;L;L:L:L:L:LeLeLeLeLeLeLeLeLeLeLeLdLeLeLeLeLeLeLeLeLeLeLeLeLeLeLeLeLeLeLeLeLeLeLeLeLeLeLeLeLWeLeLeLeLeLeLeLeLeLeLhLBkLBkLBkLBkLhLBkLBkLBkLBkLBkLBkLBkLBkLBkLBkLBkLBkLBkLBkLBkLhLiLiLBkLBkLBkL.iLAiLTiLgiLBkLziLiLiLiLiLiL%jL8jLBkLBkLBkLBkLBkLKjLeLiLjLiLresolve_omp_doexpr_references_symresolve_omp_atomicresolve_oacc_nested_loopsresolve_omp_clausesoacc_code_to_statementomp_code_to_statementalloc_entries~47877999Ix u u u u  eu u d 3 u Q66edu 5u u u       $ > .f90Reading file %qs as free formGNU Fortran2008GNU Fortran2003array-temps.f95.f03.f08.f.for.ftnArgument to %<-ffpe-trap%> is not valid: %sArgument to %<-ffpe-summary%> is not valid: %s%<-fexcess-precision=standard%> for Fortran%<-fd-lines-as-comments%> has no effect in free form%<-fd-lines-as-code%> has no effect in free formFlag %<-fno-automatic%> overwrites %<-fmax-stack-var-size=%d%>Flag %<-fno-automatic%> overwrites %<-frecursive%>Flag %<-fno-automatic%> overwrites %<-frecursive%> implied by %<-fopenmp%>Flag %<-frecursive%> overwrites %<-fmax-stack-var-size=%d%>Flag %<-fmax-stack-var-size=%d%> overwrites %<-frecursive%> implied by %<-fopenmp%>Fixed line length must be at least sevenFree line length must be at least threeMaximum subrecord length cannot exceed %d%<-static-libgfortran%>Maximum supported identifier length is %dUnrecognized option to %<-finit-logical%>: %sThe value of n in %<-finit-character=n%> must be between 0 and 127gfortran: Only one %<-J%> option allowedArgument to %<-fcheck%> is not valid: %sIY = .T  N T %-[ e @dComponent %s at %L of type LOCK_TYPE must have a codimension or be a subcomponent of a coarray, which 
is not possible as the component has the pointer attributePointer component %s at %L has a noncoarray subcomponent of type LOCK_TYPE, which must have a codimension or be a subcomponent of a coarrayAllocatable component %s at %L of type LOCK_TYPE must have a codimensionAllocatable component %s at %L must have a codimension as it has a noncoarray subcomponent of type LOCK_TYPENoncoarray component %s at %L of type LOCK_TYPE or with subcomponent of type LOCK_TYPE must have a codimension or be a subcomponent of a coarray. (Variables of type %s may not have a codimension as already a coarray subcomponent exists)Noncoarray component %s at %L of type LOCK_TYPE or with subcomponent of type LOCK_TYPE must have a codimension or be a subcomponent of a coarray. (Variables of type %s may not have a codimension as %s at %L has a codimension or a coarray subcomponent)Component %s at %L of type EVENT_TYPE must have a codimension or be a subcomponent of a coarray, which is not possible as the component has the pointer attributePointer component %s at %L has a noncoarray subcomponent of type EVENT_TYPE, which must have a codimension or be a subcomponent of a coarrayAllocatable component %s at %L of type EVENT_TYPE must have a codimensionAllocatable component %s at %L must have a codimension as it has a noncoarray subcomponent of type EVENT_TYPENoncoarray component %s at %L of type EVENT_TYPE or with subcomponent of type EVENT_TYPE must have a codimension or be a subcomponent of a coarray. (Variables of type %s may not have a codimension as already a coarray subcomponent exists)Noncoarray component %s at %L of type EVENT_TYPE or with subcomponent of type EVENT_TYPE must have a codimension or be a subcomponent of a coarray. (Variables of type %s may not have a codimension as %s at %L has a codimension or a coarray subcomponent)End of nonblock DO statement at %C is within another blockEnd of nonblock DO statement at %C is interwoven with another DO loopUnclassifiable GCC directive at %CUnclassifiable GCC directive at %C, ignoredOpenMP directives other than SIMD or DECLARE TARGET at %C may not appear in PURE or ELEMENTAL proceduresend distribute parallel do simdend target teams distribute parallel do simdend target teams distribute parallel doend target teams distribute simdend teams distribute parallel do simdend teams distribute parallel dotarget teams distribute parallel do simdtarget teams distribute parallel doteams distribute parallel do simdUnclassifiable OpenMP directive at %COpenACC directives at %C may not appear in PURE proceduresUnclassifiable OpenACC directive at %CUnclassifiable statement at %CNon-numeric character in statement label at %CZero is not a valid statement label at %CStatement label without statement at %LSemicolon at %C needs to be preceded by statementFortran 2008: Semicolon at %C without preceding statementFORMAT statement at %L does not have a statement label!$OMP END DISTRIBUTE PARALLEL DO!$OMP END DISTRIBUTE PARALLEL DO SIMD!$OMP END TARGET PARALLEL DO SIMD!$OMP END TARGET TEAMS DISTRIBUTE!$OMP END TARGET TEAMS DISTRIBUTE PARALLEL DO!$OMP END TARGET TEAMS DISTRIBUTE PARALLEL DO SIMD!$OMP END TARGET TEAMS DISTRIBUTE SIMD!$OMP END TEAMS DISTRIBUTE PARALLEL DO!$OMP END TEAMS DISTRIBUTE PARALLEL DO SIMD!$OMP END TEAMS DISTRIBUTE SIMDgfc_ascii_statement(): Bad statement code%s statement at %C cannot follow %s statement at %LSyntax error in nested structure declaration at %Cfailed to create map component '%s'failed to create union component '%s'Missing !$OMP END ATOMIC after !$OMP 
ATOMIC CAPTURE at %C%s statement is not allowed inside of BLOCK at %C%s statement is not allowed inside of BLOCK DATA at %CUnexpected %s statement in INTERFACE block at %CName %qs of ABSTRACT INTERFACE at %C cannot be the same as an intrinsic typeUnexpected %s statement at %C in INTERFACE bodyINTERFACE procedure %qs at %L has the same name as the enclosing procedureFINAL declaration at %C must be inside CONTAINSDerived type definition at %C without componentsPRIVATE statement at %C must precede structure componentsDuplicate PRIVATE statement at %CSEQUENCE statement at %C must precede structure componentsSEQUENCE attribute at %C already specified in TYPE statementDuplicate SEQUENCE statement at %CCONTAINS block in derived type definition at %CDerived-type %qs with SEQUENCE must not have a CONTAINS section at %CDerived-type %qs with BIND(C) must not have a CONTAINS section at %CComponents in TYPE at %C must precede CONTAINSFINAL procedure declaration at %CDerived type definition at %C with empty CONTAINS sectionPRIVATE statement at %C must precede procedure bindingsSEQUENCE statement at %C must precede CONTAINSAlready inside a CONTAINS block at %C%s statement must appear in a MODULE%s statement at %C follows another accessibility specificationENUM declaration at %C has no ENUMERATORSBad kind expression for function %qs at %LThe type for function %qs at %L is not accessiblePRIVATE statement in TYPE at %C must be inside a MODULEELSEWHERE statement at %C follows previous unmasked ELSEWHEREUnexpected %s statement in WHERE block at %CUnexpected %s statement in FORALL block at %CVariable %qs at %C cannot be redefined inside loop beginning at %LCRITICAL block inside of OpenACC region at %CCRITICAL block inside of OpenMP region at %C%s statement at %C cannot terminate a non-block DO loopDATA statement at %C after the first executable statementELSE IF statement at %C cannot follow ELSE statement at %LDuplicate ELSE statements at %L and %CExpected a CASE or END SELECT statement following SELECT CASE at %CExpected TYPE IS, CLASS IS or END SELECT statement following SELECT TYPE at %CExpected RANK or RANK DEFAULT following SELECT RANK at %CStatement label in END CRITICAL at %C does not match CRITICAL labelOpenACC directive inside of CRITICAL block at %CRedundant !$ACC END LOOP at %C% directive not at the start of a loop at %C% directive not at the start of a loop at %C% directive not at the start of a loop at %C% directive not at the start of a loop at %CCONTAINS statement at %C is already in a contained program unitContained procedure %qs at %C is already ambiguousUnexpected %s statement in CONTAINS section at %CCONTAINS statement without FUNCTION or SUBROUTINE statement at %LStatement label in ENDDO at %C doesn't match DO labelNamed block DO at %L requires matching ENDDO nameName after !$omp critical and !$omp end critical does not match at %CGlobal binding name %qs at %L is already being used as a %s at %LGlobal name %qs at %L is already being used as a %s at %LGlobal binding name %qs at %L is already being used at %LGlobal name %qs at %L is already being used at %LUnexpected %s statement in MODULE at %CBlank BLOCK DATA at %C conflicts with prior BLOCK DATA at %LUnexpected %s statement in BLOCK DATA at %C #include #ifdef __cplusplus #include #define __GFORTRAN_FLOAT_COMPLEX std::complex #define __GFORTRAN_DOUBLE_COMPLEX std::complex #define __GFORTRAN_LONG_DOUBLE_COMPLEX std::complex extern "C" { #else #define __GFORTRAN_FLOAT_COMPLEX float _Complex #define __GFORTRAN_DOUBLE_COMPLEX double _Complex 
#define __GFORTRAN_LONG_DOUBLE_COMPLEX long double _Complex #endif Two main PROGRAMs at %L and %CUnexpected end of file in %qsnovectorcancellation% pointdeclare reductiondistribute parallel do simdend atomicend criticalend distribute parallel doend distribute simdend distributeend do simdend doend simdend masterend orderedend parallel do simdend parallel doend parallel sectionsend parallel workshareend parallelend sectionsend singleend target dataend target parallel do simdend target parallel doend target parallelend target simdend target teams distributeend target teamsend targetend taskgroupend taskloop simdend taskloopend taskend teams distribute simdend teams distributeend teamsend workshareordered depend (target teams distribute simdteams distribute parallel dothreadprivateend dataend host_dataend kernels loopend kernelsend loopend parallel loopend serial loopend serialabstract% interfaceasynchronouscodimensionenum , bind ( c )entry% equivalenceimplicit% noneintentcontainselse whereelse ifgenericmodule% proceduretype is../../gcc/fortran/parse.cBad continuation line at %Cgcc$arithmetic IFattribute declarationCONTAINSdata declarationDEALLOCATEderived type declarationELSE IFELSEWHEREEVENT POSTEVENT WAITFAIL IMAGEEND ASSOCIATEEND BLOCKEND BLOCK DATAEND FILEEND FUNCTIONEND IFEND INTERFACEEND MODULEEND SUBMODULEEND PROGRAMEND SUBROUTINEEND STRUCTUREEND UNIONEND MAPEND TYPEEQUIVALENCEERROR STOPblock IFIMPLICITIMPLICIT NONEimplied END DOIMPORTPAUSEMODULE PROCEDURENULLIFYSYNC ALLSYNC IMAGESSYNC MEMORYSELECT CASESELECT TYPESELECT RANKTYPE ISCLASS ISsimple IFSTATEMENT FUNCTIONLABEL ASSIGNMENTENUM DEFINITIONENUMERATOR DEFINITIONEND ENUM!$ACC PARALLEL LOOP!$ACC END PARALLEL LOOP!$ACC PARALLEL!$ACC END PARALLEL!$ACC KERNELS!$ACC END KERNELS!$ACC KERNELS LOOP!$ACC END KERNELS LOOP!$ACC SERIAL LOOP!$ACC END SERIAL LOOP!$ACC SERIAL!$ACC END SERIAL!$ACC DATA!$ACC END DATA!$ACC HOST_DATA!$ACC END HOST_DATA!$ACC LOOP!$ACC END LOOP!$ACC UPDATE!$ACC WAIT!$ACC CACHE!$ACC ENTER DATA!$ACC EXIT DATA!$ACC ROUTINE!$ACC ATOMIC!$ACC END ATOMIC!$OMP ATOMIC!$OMP BARRIER!$OMP CANCEL!$OMP CANCELLATION POINT!$OMP CRITICAL!$OMP DECLARE REDUCTION!$OMP DECLARE SIMD!$OMP DECLARE TARGET!$OMP END ATOMIC!$OMP END CRITICAL!$OMP END DISTRIBUTE!$OMP END DISTRIBUTE SIMD!$OMP END DO!$OMP END DO SIMD!$OMP END SIMD!$OMP END MASTER!$OMP END ORDERED!$OMP END PARALLEL!$OMP END PARALLEL DO!$OMP END PARALLEL DO SIMD!$OMP END PARALLEL SECTIONS!$OMP END PARALLEL WORKSHARE!$OMP END SECTIONS!$OMP END SINGLE!$OMP END TASK!$OMP END TARGET!$OMP END TARGET DATA!$OMP END TARGET PARALLEL!$OMP END TARGET PARALLEL DO!$OMP END TARGET SIMD!$OMP END TARGET TEAMS!$OMP END TASKGROUP!$OMP END TASKLOOP!$OMP END TASKLOOP SIMD!$OMP END TEAMS!$OMP END TEAMS DISTRIBUTE!$OMP END WORKSHARE!$OMP FLUSH!$OMP MASTER!$OMP ORDERED!$OMP PARALLEL!$OMP PARALLEL SECTIONS!$OMP PARALLEL WORKSHARE!$OMP SECTIONS!$OMP SECTION!$OMP SINGLE!$OMP TARGET!$OMP TARGET DATA!$OMP TARGET ENTER DATA!$OMP TARGET EXIT DATA!$OMP TARGET PARALLEL!$OMP TARGET TEAMS!$OMP TARGET UPDATE!$OMP TASK!$OMP TASKGROUP!$OMP TASKWAIT!$OMP TASKYIELD!$OMP TEAMS!$OMP THREADPRIVATE!$OMP WORKSHAREUnexpected %s statement at %CType-bound procedure at %CGENERIC binding at %Cblock@%dBLOCK construct at %CASSOCIATE construct at %CExpected DO loop at %CExpecting %s at %CMAIN__ #ifdef __cplusplus } #endif 
in real-literal-constant at %CMissing exponent in
real number at %CReal number at %C has a % exponent and an explicit kindReal number at %C has a % exponent and an explicit kindInvalid exponent-letter % in real-literal-constant at %CReal constant overflows its kind at %CReal constant underflows its kind at %Cgfc_range_check() returned bad valueNon-significant digits in %qs number at %C, maybe incorrect KINDStructure constructor with missing optional arguments at %CNo initializer for allocatable component %qs given in the structure constructor at %CNo initializer for component %qs given in the structure constructor at %CNamelist %qs cannot be an argument at %LKeyword %qs at %C has already appeared in the current argument listHexadecimal constant at %L uses nonstandard X instead of ZEmpty set of digits in BOZ constant at %CIllegal character in BOZ constant at %CBOZ constant at %C uses nonstandard postfix syntaxBOZ used outside a DATA statement at %LInteger kind %d at %C not availableInteger too big for its kind at %C. This check can be disabled with the option %<-fno-range-check%>Expected PARAMETER symbol in complex constant at %CNumeric PARAMETER required in complex constant at %CScalar PARAMETER required in complex constant at %CPARAMETER symbol in complex constant at %Cgfc_match_sym_complex_part(): Bad typeError converting PARAMETER constant in complex constant at %CSyntax error in COMPLEX constant at %CInvalid kind %d for CHARACTER constant at %CUnterminated character constant beginning at %CCharacter %qs in string at %C is not representable in character kind %dmatch_string_constant(): Delimiter not foundSubstring start index (%ld) at %L below 1Substring end index (%ld) at %L exceeds string lengthInvalid Hollerith constant: %L must contain at least one characterInvalid Hollerith constant: Integer kind at %L should be defaultInvalid Hollerith constant at %L contains a wide characterBad kind for logical constant at %CExpected alternate return label at %CMissing keyword name in actual argument list at %CSyntax error in argument list at %CArray section designator, e.g. '(:)', is required besides the coarray designator '[...]' at %CCoarray designator at %C but %qs is not a coarraySymbol %qs at %C has no IMPLICIT typeUnexpected %<%c%> for nonderived-type variable %qs at %CExpected structure component name at %CThe RE or IM part_ref at %C must be applied to a COMPLEX expressionThe LEN part_ref at %C must be applied to a CHARACTER expression%qs at %C is not an inquiry reference to an intrinsic type component %qs%qs at %C is not an inquiry reference to an intrinsic type componentThe KIND part_ref at %C must be applied to an expression of intrinsic typeProcedure pointer component %qs requires an argument list at %CNonpolymorphic reference to abstract type at %CCoindexed procedure-pointer component at %CDerived type %qs cannot be used as a variable at %CNamed constant at %C in an EQUIVALENCEgfc_variable_attr(): Expression isn't a variableCannot construct ABSTRACT type %qs at %LStructure constructor with named arguments at %CComponent initializer without name after component named %s at %LToo many components in structure constructor at %LCHARACTER expression will be truncated in constructor (%ld/%ld) at %LComponent %qs is initialized twice in the structure constructor at %LCoindexed expression to pointer component %qs in structure constructor at %Lcomponent %qs at %L has already been set by a parent derived type constructor%qs at %C is the name of a recursive function and so refers to the result variable. 
Use an explicit RESULT variable for direct recursion (12.5.2.1)Symbol at %C is not appropriate for an expressionUnexpected use of subroutine name %qs at %CStatement function %qs requires argument list at %CFunction %qs requires an argument list at %CThe leftmost part-ref in a data-ref cannot be a function reference at %CMissing argument list in function %qs at %CMissing kind-parameter at %Cextend_ref(): Bad tailInvalid real kind %d at %Cgfc_check_digit(): bad radixHollerith constant at %C%n (%LOC%REFargument list function at %CRE or IM part_ref at %C../../gcc/fortran/primary.cExpected argument list at %C%qs at %C is not a variable%%loc%%LOC() as an rvalue at %CMissing argument to %qs at %C ( %n =MkMkMkMڦMmMkM6M M(MkM Mgfc_match_structure_constructorgfc_convert_to_structure_constructorgfc_match_varspecAllocatable array %qs at %L must have a deferred shape or assumed rankScalar object %qs at %L may not be ALLOCATABLEArray pointer %qs at %L must have a deferred shape or assumed rankArray %qs at %L cannot have a deferred shapeType %qs of CLASS variable %qs at %L is not extensibleCLASS variable %qs at %L must be dummy, allocatable or pointerInterface %qs at %L is declared in a later PROCEDURE statementInterface %qs at %L may not be a statement functionIntrinsic procedure %qs not allowed in PROCEDURE statement at %LInterface %qs at %L must be explicitInterface %qs at %L may not be genericCharacter-valued module procedure %qs at %L must not be assumed lengthCharacter-valued internal function %qs at %L must not be assumed lengthContained procedure %qs at %L has the same name as its encompassing procedureContained function %qs at %L has no IMPLICIT typeResult %qs of contained function %qs at %L has no IMPLICIT typeGENERIC procedure %qs is not allowed as an actual argument at %LDerived-type %qs declared at %L must be ABSTRACT because %qs is DEFERRED and not overriddenThe KIND parameter %qs in the PDT constructor at %C has no valueVariable %qs at %L is in COMMON but only in BLOCK DATA initialization is allowedInitialized variable %qs at %L is in a blank COMMON but initialization is only allowed in named common blocks%qs in cannot appear in COMMON at %L [F2008:C5100]Derived type variable %qs in COMMON at %L has neither the SEQUENCE nor the BIND(C) attributeDerived type variable %qs in COMMON at %L has an ultimate component that is allocatableDerived type variable %qs in COMMON at %L may not have default initializerDerived type variable %qs at %L must have SEQUENCE attribute to be an EQUIVALENCE objectDerived type variable %qs at %L cannot have ALLOCATABLE components to be an EQUIVALENCE objectDerived type variable %qs at %L with default initialization cannot be in EQUIVALENCE with a variable in COMMONDerived type variable %qs at %L with pointer component(s) cannot be an EQUIVALENCE objectIn Fortran 2003 COMMON %qs block at %L is a global identifier and must thus have the same binding name as the same-named COMMON block at %L: %s vs %sCOMMON block %qs at %L uses the same global identifier as entity at %LFortran 2008: COMMON block %qs with binding label at %L sharing the identifier with global non-COMMON-block entity at %LCOMMON block at %L with binding label %qs uses the same global identifier as entity at %LCOMMON block %qs at %L is used as PARAMETER at %LCOMMON block %qs at %L cannot have the EXTERNAL attributeCOMMON block %qs at %L is also an intrinsic procedureCOMMON block %qs at %L that is also a function resultCOMMON block %qs at %L that is also a global procedureLabel %d at %L defined but not 
usedLabel %d at %L defined but cannot be usedUser operator procedure %qs at %L must be a FUNCTIONUser operator procedure %qs at %L cannot be assumed character lengthUser operator procedure %qs at %L must have at least one argumentFirst argument of operator interface at %L cannot be optionalSecond argument of operator interface at %L cannot be optionalOperator interface at %L must have, at most, two argumentsFINAL procedure %qs at %L is not a SUBROUTINEFINAL procedure at %L must have exactly one argumentArgument of FINAL procedure at %L must be of type %qsArgument of FINAL procedure at %L must not be a POINTERArgument of FINAL procedure at %L must not be ALLOCATABLEArgument of FINAL procedure at %L must not be OPTIONALArgument of FINAL procedure at %L must not be INTENT(OUT)Non-scalar FINAL procedure at %L should have assumed shape argumentFINAL procedure %qs declared at %L has the same rank (%d) as %qsOnly array FINAL procedures declared for derived type %qs defined at %L, suggest also scalar onelogical_to_bitwise(): Bad intrinsicEntity %qs at %L has a deferred type parameter and requires either the POINTER or ALLOCATABLE attribute%qs at %L is of the ABSTRACT type %qsThe upper bound in the last dimension must appear in the reference to the assumed size array %qs at %L%qs at %L is an array and OPTIONAL; IF IT IS MISSING, it cannot be the actual argument of an ELEMENTAL procedure unless there is a non-optional argument with the same rank (12.4.1.5)Actual argument at %L for INTENT(%s) dummy %qs of ELEMENTAL subroutine %qs is a scalar, but another actual argument is an arrayType-bound operator at %L cannot be NOPASSVariable %qs with binding label %qs at %L uses the same global identifier as entity at %LVariable %qs from module %qs with binding label %qs at %L uses the same global identifier as entity at %L from module %qsProcedure %qs with binding label %qs at %L uses the same global identifier as entity at %LFunction %s at %L has entries with mismatched array specificationsFunction %s at %L with entries returning variables of different string lengthsFUNCTION result %s cannot be an array in FUNCTION %s at %LENTRY result %s cannot be an array in FUNCTION %s at %LFUNCTION result %s cannot be a POINTER in FUNCTION %s at %LENTRY result %s cannot be a POINTER in FUNCTION %s at %LFUNCTION result %s cannot be of type %s in FUNCTION %s at %LENTRY result %s cannot be of type %s in FUNCTION %s at %LSelf reference in character length expression for %qs at %L%qs and %qs cannot be mixed FUNCTION/SUBROUTINE for GENERIC %qs at %L%qs and %qs for GENERIC %qs at %L are ambiguousUndefined specific binding %qs as target of GENERIC %qs at %LGENERIC %qs at %L must target a specific binding, %qs is GENERIC, tooGENERIC %qs at %L cannot overwrite specific binding with the same name%qs must be a module procedure or an external procedure with an explicit interface at %LProcedure %qs with PASS at %L must have at least one argumentNon-polymorphic passed-object dummy argument of %qs at %LArgument %qs of %qs with PASS(%s) at %L must be of the derived-type %qsPassed-object dummy argument of %qs at %L must be scalarPassed-object dummy argument of %qs at %L must not be ALLOCATABLEPassed-object dummy argument of %qs at %L must not be POINTERProcedure %qs at %L has the same name as a component of %qsProcedure %qs at %L has the same name as an inherited component of %qsProcedure %qs with PASS(%s) at %L has no argument %qsExpression in CASE statement at %L must be of type %sExpression in CASE statement at %L must be of kind 
%dExpression in CASE statement at %L must be scalarcompare_bound_int(): Bad expressionArray reference at %L is out of bounds (%ld < %ld) in dimension %dArray reference at %L is out of bounds (%ld < %ld) in codimension %dArray reference at %L is out of bounds (%ld > %ld) in dimension %dArray reference at %L is out of bounds (%ld > %ld) in codimension %dLower array reference at %L is out of bounds (%ld < %ld) in dimension %dLower array reference at %L is out of bounds (%ld > %ld) in dimension %dUpper array reference at %L is out of bounds (%ld < %ld) in dimension %dUpper array reference at %L is out of bounds (%ld > %ld) in dimension %dcheck_dimension(): Bad array referenceLabel %d referenced at %L is never definedStatement at %L is not a valid branch target statement for the branch statement at %LBranch at %L may result in an infinite loopGOTO statement at %L leaves CRITICAL construct for label at %LGOTO statement at %L leaves DO CONCURRENT construct for label at %LLabel at %L is not in the same block as the GOTO statement at %LAssignment to a FORALL index variable at %LThe FORALL with index %qs is not used on the left side of the assignment at %L and so might cause multiple assignment to this objectSelection expression in computed GOTO statement at %L must be a scalar integer expressionArgument of SELECT statement at %L cannot be %sArgument of SELECT statement at %L must be a scalar expressionExpression in CASE statement at %L is not in the range of %sThe DEFAULT CASE at %L cannot be followed by a second DEFAULT CASE at %LLogical range in CASE statement at %L is not allowedConstant logical value in CASE statement is repeated at %LRange specification at %L can never be matchedCASE label at %L overlaps with CASE label at %LLogical SELECT CASE block at %L has more that two casesType specified for intrinsic function %qs at %L is ignored%qs declared INTRINSIC at %L does not existIntrinsic subroutine %qs at %L shall not have a type specifierThe intrinsic %qs declared INTRINSIC at %L is not available in the current standard settings but %s. 
Use an appropriate %<-std=*%> option or enable %<-fall-intrinsics%> in order to use it.result with non-constant character lengthImpure function %qs at %L might not be evaluatedImpure function at %L might not be evaluatedReference to impure function %qs at %L inside a FORALL %sReference to impure function %qs at %L inside a DO CONCURRENT %sReference to impure function %qs at %L within a PURE procedureCHARACTER expression will be truncated in assignment (%ld/%ld) at %LCoindexed expression at %L is assigned to a derived type variable with a POINTER component in a PURE procedureIn a pure subprogram an INTENT(IN) dummy argument shall not be used as the expr at %L of an intrinsic assignment statement in which the variable is of a derived type if the derived type has a pointer component at any level of component selection.Assignment to coindexed variable at %L in a PURE procedureAssignment to polymorphic coarray at %L is not permittedAssignment to an allocatable polymorphic variable at %LAssignment to an allocatable polymorphic variable at %L requires %<-frealloc-lhs%>Nonallocatable variable must not be polymorphic in intrinsic assignment at %L - check that there is a matching specific subroutine for '=' operatorCoindexed variable must not have an allocatable ultimate component in assignment at %LSubroutine call to %qs in FORALL block at %L is not PURESubroutine call to %qs in DO CONCURRENT block at %L is not PURESubroutine call to %qs at %L is not PUREUnexpected junk after %qs at %LABSTRACT INTERFACE %qs must not be referenced at %LABSTRACT INTERFACE %qs at %L must not have an assumed character length result (F2008: C418)Function %qs is declared CHARACTER(*) and cannot be used at %L since it is not a dummy argumentThere is no specific function for the generic %qs at %LGeneric function %qs at %L is not consistent with a specific intrinsic interfaceFunction %qs at %L is INTRINSIC but is not compatible with an intrinsicUnable to resolve the specific function %qs at %LFunction %qs at %L has no IMPLICIT type; did you mean %qs?Function %qs at %L has no IMPLICIT typeresolve_function(): bad function typeA BOZ literal constant at %L cannot appear as an actual argument in a function referenceUser defined non-ELEMENTAL function %qs at %L not allowed in WORKSHARE constructENTRY %qs at %L cannot be called recursively, as function %qs is not RECURSIVEFunction %qs at %L cannot be called recursively, as it is not RECURSIVEVariable %s at %L with NO_ARG_CHECK attribute may only be used as actual argumentAssumed-type variable %s at %L may only be used as actual argumentAssumed-type variable %s at %L as actual argument to an inquiry function shall be the first argumentAssumed-rank variable %s at %L may only be used as actual argumentAssumed-rank variable %s at %L as actual argument to an inquiry function shall be the first argumentVariable %s at %L with NO_ARG_CHECK attribute shall not have a subobject referenceAssumed-type variable %s at %L shall not have a subobject referenceAssumed-rank variable %s at %L shall not have a subobject referenceVariable %qs, used in a specification expression, is referenced at %L before the ENTRY statement in which it is a parameterVariable %qs is used at %L before the ENTRY statement in which it is a parameterNon-RECURSIVE procedure %qs at %L is possibly calling itself recursively. 
Declare it RECURSIVE or use %<-frecursive%>Polymorphic subobject of coindexed object at %LCoindexed object with polymorphic allocatable subcomponent at %Lgfc_resolve_expr(): Bad expression typeString length at %L is too largeExpecting definable entity near %LBLOCK DATA element %qs at %L must be in COMMONDATA array %qs at %L must be specified in a previous declarationDATA element %qs at %L cannot have a coindexDATA element %qs at %L is a pointer and so must be a full arrayDATA object near %L has the pointer attribute and the corresponding DATA value is not a valid initial-data-targetNonconstant array section at %L in DATA statementDATA statement at %L has more variables than valuesstart of implied-do loop at %L could not be simplified to a constant valueend of implied-do loop at %L could not be simplified to a constant valuestep of implied-do loop at %L could not be simplified to a constant valuestep of implied-do loop at %L shall not be zeroArray index at %L must be scalarArray index at %L must be of INTEGER type, found %sArgument dim at %L must be scalarArgument dim at %L must be of INTEGER typeSubstring start index at %L must be of type INTEGERSubstring start index at %L must be scalarSubstring start index at %L is less than oneSubstring end index at %L must be of type INTEGERSubstring end index at %L must be scalarSubstring end index at %L exceeds the string lengthSubstring end index at %L is too largefind_array_spec(): Missing specfind_array_spec(): unused as(1)find_array_spec(): unused as(2)Array index at %L is an array of rank %dRightmost upper bound of assumed size array section not specified at %LRank mismatch in array reference at %L (%d/%d)Coindex rank mismatch in array reference at %L (%d/%d)Coindex of codimension %d must be a scalar at %Lresolve_ref(): Bad array referenceComponent to the right of a part reference with nonzero rank must not have the POINTER attribute at %LComponent to the right of a part reference with nonzero rank must not have the ALLOCATABLE attribute at %LTwo or more part references with nonzero rank must not be specified at %LEquality comparison for %s at %LInequality comparison for %s at %LBOZ literal constant at %L cannot be an operand of unary operator %qsOperands at %L and %L cannot appear as operands of binary operator %qsInvalid context for NULL() pointer at %%LOperand of unary numeric operator %%<%s%%> at %%L is %sUnexpected derived-type entities in binary intrinsic numeric operator %%<%s%%> at %%LOperands of binary numeric operator %%<%s%%> at %%L are %s/%sOperands of string concatenation operator at %%L are %s/%sOperands of logical operator %%<%s%%> at %%L are %s/%sOperand of .not. 
operator at %%L is %sBOZ literal constant near %L cannot appear as an operand of a relational operatorLogicals at %%L must be compared with %s instead of %sOperands of comparison operator %%<%s%%> at %%L are %s/%sUnknown operator %%<%s%%> at %%L; did you mean '%s'?Unknown operator %%<%s%%> at %%LOperand of user operator %%<%s%%> at %%L is %sOperands of user operator %%<%s%%> at %%L are %s/%sresolve_operator(): Bad intrinsicShapes for operands at %L and %L are not conformableInconsistent ranks for operator at %%L and %%LStatement function %qs at %L is not allowed as an actual argumentIntrinsic %qs at %L is not allowed as an actual argumentInternal procedure %qs is used as actual argument at %LELEMENTAL non-INTRINSIC procedure %qs is not allowed as an actual argument at %LUnable to find a specific INTRINSIC procedure for the reference %qs at %LBy-value argument at %L is not of numeric typeBy-value argument at %L cannot be an array or an array sectionBy-value argument at %L is not allowed in this contextPassing internal procedure at %L by location not allowedELEMENTAL procedure pointer component %qs is not allowed as an actual argument at %LCoindexed actual argument at %L with ultimate pointer componentDerived type %qs is used as an actual argument at %LBase object for type-bound procedure call at %L is of ABSTRACT type %qsBase object for NOPASS type-bound procedure call at %L must be scalarFound no matching specific binding for the call to the GENERIC %qs at %L%qs at %L should be a FUNCTIONPassed-object at %L must be scalarBase object for procedure-pointer component call at %L is of ABSTRACT type %qsStep expression in DO loop at %L cannot be zeroDO loop at %L will be executed zero timesDO loop at %L is undefined as it overflowsDO loop at %L is undefined as it underflowsStat-variable at %L must be a scalar INTEGER variableERRMSG at %L is useless without a STAT tagERRMSG variable at %L shall be a scalar default CHARACTER variableAllocate-object at %L is subobject of object at %LAllocate-object at %L also appears at %LCoindexed allocatable object at %LAllocate-object at %L must be ALLOCATABLE or a POINTERType of entity at %L is type incompatible with source-expr at %LSource-expr at %L must be scalar or have the same rank as the allocate-object at %LSource-expr at %L and allocate-object at %L must have the same shapeThe allocate-object at %L and the source-expr at %L shall have the same kind type parameterThe source-expr at %L shall neither be of type LOCK_TYPE nor have a LOCK_TYPE component if allocate-object at %L is a coarrayThe source-expr at %L shall neither be of type EVENT_TYPE nor have a EVENT_TYPE component if allocate-object at %L is a coarrayAllocating %s of ABSTRACT base type at %L requires a type-spec or source-exprAllocating %s at %L with type-spec requires the same character-length parameter as in the declarationArray specification required in ALLOCATE statement at %LArray specification or array-valued SOURCE= expression required in ALLOCATE statement at %LCoarray specification required in ALLOCATE statement at %LBad array specification in ALLOCATE statement at %LUpper cobound is less than lower cobound at %LUpper cobound is less than lower cobound of 1 at %L%qs must not appear in the array specification at %L in the same ALLOCATE statement where it is itself allocatedExpected '*' in coindex specification in ALLOCATE statement at %LBad coarray specification in ALLOCATE statement at %LStat-variable at %L shall not be %sd within the same %s statementErrmsg-variable at %L shall not 
be %sd within the same %s statementWHERE/ELSEWHERE clause at %L requires a LOGICAL arraygfc_resolve_blocks(): Bad block typeFORALL index-name at %L must be a scalar variable of type integerAn outer FORALL construct already has an index with this name %LImageset argument at %L must be a scalar or rank-1 INTEGER expressionImageset argument at %L must between 1 and num_images()STAT= argument at %L must be a scalar INTEGER variableERRMSG= argument at %L must be a scalar CHARACTER variableLock variable at %L must be a scalar of type LOCK_TYPEEvent variable at %L must be a scalar of type EVENT_TYPEEvent variable argument at %L must be a coarray or coindexedEvent variable argument at %L must be a coarray but not coindexedACQUIRED_LOCK= argument at %L must be a scalar LOGICAL variableUNTIL_COUNT= argument at %L must be a scalar INTEGER expressionASSIGNED GOTO statement at %L requires a scalar INTEGER variableVariable %qs has not been assigned a target label at %LAlternate RETURN statement at %L requires a SCALAR-INTEGER return specifierPointer procedure assignment at %LThe function result on the lhs of the assignment at %L must have the pointer attribute.ASSIGN statement at %L requires a scalar default INTEGER variableArithmetic IF statement at %L requires a scalar REAL or INTEGER expressionRANK DEFAULT at %L is repeated at %LRANK (*) at %L is repeated at %LRANK (%i) at %L is repeated at %LRANK (*) at %L cannot be used with the pointer or allocatable selector at %Lgfc_resolve_code(): No expression on DO WHILEExit condition of DO WHILE loop at %L must be a scalar LOGICAL expressionInvalid context for NULL () intrinsic at %LDTIO %s procedure at %L must be recursiveData transfer element at %L cannot be polymorphic unless it is processed by a defined input/output procedureData transfer element at %L cannot have POINTER components unless it is processed by a defined input/output procedureData transfer element at %L cannot have procedure pointer componentsData transfer element at %L cannot have ALLOCATABLE components unless it is processed by a defined input/output procedureData transfer element at %L cannot have PRIVATE componentsData transfer element at %L cannot have PRIVATE components unless it is processed by a defined input/output procedureData transfer element at %L cannot be a full reference to an assumed-size arrayFORALL index-name at %L must be a scalar INTEGERFORALL start expression at %L must be a scalar INTEGERFORALL end expression at %L must be a scalar INTEGERFORALL stride expression at %L must be a scalar %sFORALL stride expression at %L cannot be zeroFORALL index %qs may not appear in triplet specification at %LFORALL mask clause at %L requires a scalar LOGICAL expressiongfc_resolve_code(): Bad statement codeTODO: type-bound defined assignment(s) at %L not done because multiple part array references would occur in intermediate expressions.Return type mismatch of function %qs at %L (%s/%s)Interface mismatch in global procedure %qs at %L: %s%qs at %L has a type, which is not consistent with the CALL at %LENTRY %qs at %L cannot be called recursively, as subroutine %qs is not RECURSIVESUBROUTINE %qs at %L cannot be called recursively, as it is not RECURSIVEThere is no specific subroutine for the generic %qs at %LGeneric subroutine %qs at %L is not consistent with an intrinsic subroutine interfaceSubroutine %qs at %L is INTRINSIC but is not compatible with an intrinsicUnable to resolve the specific subroutine %qs at %Lresolve_subroutine(): bad function typeNon-ELEMENTAL user-defined 
assignment in WHERE at %LUnsupported statement inside WHERE at %LWHERE mask at %L has inconsistent shapeWHERE assignment target at %L has inconsistent shape%qs at %L should be a SUBROUTINESorry, allocatable/pointer components in polymorphic (CLASS) type coarrays at %L are unsupportedPROTECTED attribute conflicts with EXTERNAL attribute at %LPROCEDURE attribute conflicts with PROTECTED attribute at %L%qs at %L has the CONTIGUOUS attribute but is not an array pointer or an assumed-shape or assumed-rank arrayBad specification for assumed size array at %LAssumed size array at %L must be a dummy argumentAssumed shape array at %L must be a dummy argumentAssumed-rank array at %L must be a dummy argumentAssumed-rank array at %L may not have the VALUE or CODIMENSION attributeSymbol at %L is not a DUMMY variable%qs at %L cannot have the VALUE attribute because it is not a dummy argumentCharacter dummy variable %qs at %L with VALUE attribute must have constant lengthC interoperable character dummy variable %qs at %L with VALUE attribute must have length oneVariable %s at %L with NO_ARG_CHECK attribute shall be a dummy argumentVariable %s at %L with NO_ARG_CHECK attribute shall be of type TYPE(*) or of an numeric intrinsic typeVariable %s at %L with NO_ARG_CHECK attribute may not have the ALLOCATABLE, CODIMENSION, POINTER or VALUE attributeVariable %s at %L with NO_ARG_CHECK attribute may not have the INTENT(OUT) attributeVariable %s at %L with NO_ARG_CHECK attribute shall either be a scalar or an assumed-size arrayAssumed type of variable %s at %L is only permitted for dummy variablesAssumed-type variable %s at %L may not have the ALLOCATABLE, CODIMENSION, POINTER or VALUE attributeAssumed-type variable %s at %L may not have the INTENT(OUT) attributeAssumed-type variable %s at %L shall not be an explicit-shape arrayVariable %qs at %L cannot be BIND(C) because it is neither a COMMON block nor declared at the module level scopeBIND(C) Variable %qs at %L must have length oneThe derived type %qs at %L is of type %qs, which has not been definedPUBLIC %s %qs at %L of PRIVATE derived type %qsVariable %s at %L of type LOCK_TYPE or with subcomponent of type LOCK_TYPE must be a coarrayVariable %s at %L of type EVENT_TYPE or with subcomponent of type EVENT_TYPE must be a coarrayThe INTENT(OUT) dummy argument %qs at %L is ASSUMED SIZE and so cannot have a default initializerDummy argument %qs at %L of LOCK_TYPE shall not be INTENT(OUT)Dummy argument %qs at %L of EVENT_TYPE shall not be INTENT(OUT)Function result %qs at %L shall not be a coarray or have a coarray componentVariable %qs at %L of TYPE(C_PTR) or TYPE(C_FUNPTR) shall not be a coarrayVariable %qs at %L with coarray component shall be a nonpointer, nonallocatable scalar, which is not a coarrayVariable %qs at %L is a coarray and is not ALLOCATABLE, SAVE nor a dummy argumentCoarray variable %qs at %L shall not have codimensions with deferred shapeAllocatable coarray variable %qs at %L must have deferred shapeVariable %qs at %L is INTENT(OUT) and can thus not be an allocatable coarray or have coarray componentsCoarray dummy variable %qs at %L not allowed in BIND(C) procedure %qsLOGICAL dummy argument %qs at %L with non-C_Bool kind in BIND(C) procedure %qsLOGICAL result variable %qs at %L with non-C_Bool kind in BIND(C) procedure %qsThe module or main program array %qs at %L must have constant shapeEntity with assumed character length at %L must be a dummy argument or a PARAMETERAutomatic object %qs at %L cannot have the SAVE attribute%qs at %L must have 
constant character length in this contextCOMMON variable %qs at %L must have constant character lengthAllocatable %qs at %L cannot have an initializerExternal %qs at %L cannot have an initializerDummy %qs at %L cannot have an initializerIntrinsic %qs at %L cannot have an initializerFunction result %qs at %L cannot have an initializerAutomatic array %qs at %L cannot have an initializerThe type %qs cannot be host associated at %L because it is blocked by an incompatible object of the same name declared at %LImplied SAVE for module variable %qs at %L, needed due to the default initializationNamelist %qs cannot be an argument to subroutine or function at %LCharacter-valued statement function %qs at %L must have constant length%qs is of a PRIVATE type and cannot be a dummy argument of %qs, which is PUBLIC at %LProcedure %qs in PUBLIC interface %qs at %L takes dummy arguments of %qs which is PRIVATEFunction %qs at %L cannot have an initializerExternal object %qs at %L may not have an initializerELEMENTAL function %qs at %L must have a scalar resultStatement function %qs at %L may not have pointer or allocatable attributeCHARACTER(*) function %qs at %L cannot be pointer-valuedCHARACTER(*) function %qs at %L cannot be pureCHARACTER(*) function %qs at %L cannot be recursiveCHARACTER(*) function %qs at %LProcedure pointer %qs at %L shall not be elementalDummy procedure %qs at %L shall not be elementalFunction result variable %qs at %L of elemental function %qs shall not have an ALLOCATABLE or POINTER attributePROCEDURE attribute conflicts with SAVE attribute in %qs at %LPROCEDURE attribute conflicts with INTENT attribute in %qs at %LPROCEDURE attribute conflicts with RESULT attribute in %qs at %LEXTERNAL attribute conflicts with FUNCTION attribute in %qs at %LProcedure pointer result %qs at %L is missing the pointer attributeMismatch in ELEMENTAL attribute between MODULE PROCEDURE at %L and its interface in %sMismatch in PURE attribute between MODULE PROCEDURE at %L and its interface in %sMismatch in RECURSIVE attribute between MODULE PROCEDURE at %L and its interface in %s%s between the MODULE PROCEDURE declaration in MODULE %qs and the declaration at %L in (SUB)MODULE %qsAssumed size array %qs in namelist %qs at %L is not allowedNAMELIST array object %qs with assumed shape in namelist %qs at %LNAMELIST array object %qs with nonconstant shape in namelist %qs at %LNAMELIST object %qs with nonconstant character length in namelist %qs at %LNAMELIST object %qs was declared PRIVATE and cannot be member of PUBLIC namelist %qs at %LNAMELIST object %qs has use-associated PRIVATE components and cannot be member of namelist %qs at %LNAMELIST object %qs has PRIVATE components and cannot be a member of PUBLIC namelist %qs at %LPROCEDURE attribute conflicts with NAMELIST attribute in %qs at %LParameter array %qs at %L cannot be automatic or of deferred shapeImplicitly typed PARAMETER %qs at %L doesn't match a later IMPLICIT typeIncompatible derived type in PARAMETER at %LCLASS variable %qs at %L cannot have the PARAMETER attributeThreadprivate at %L isn't SAVEd!$OMP DECLARE TARGET variable %qs at %L isn't SAVEdThe object %qs at %L has a deferred LEN parameter %qs and is neither allocatable nor a pointerThe AUTOMATIC object %qs at %L must not have the SAVE attribute or be a variable declared in the main program, a module or a submodule(F08/C513)The object %qs at %L with ASSUMED type parameters must be a dummy or a SELECT TYPE selector(F08/4.2)CHARACTER(*) function %qs at %L cannot be array-valuedPROCEDURE %qs at 
%L may not be used as its own interfaceAlternate return specifier in elemental subroutine %qs at %L is not allowedAlternate return specifier in function %qs at %L is not allowedSelf-referential argument %qs at %L is not allowedDummy procedure %qs of PURE procedure at %L must also be PUREArgument %qs of pure function %qs at %L with VALUE attribute but without INTENT(IN)Argument %qs of pure function %qs at %L must be INTENT(IN) or VALUEArgument %qs of pure subroutine %qs at %L with VALUE attribute but without INTENTArgument %qs of pure subroutine %qs at %L must have its INTENT specified or have the VALUE attributeINTENT(OUT) argument %qs of pure procedure %qs at %L may not be polymorphicCoarray dummy argument %qs at %L to elemental procedureArgument %qs of elemental procedure at %L must be scalarArgument %qs of elemental procedure at %L cannot have the ALLOCATABLE attributeArgument %qs of elemental procedure at %L cannot have the POINTER attributeDummy procedure %qs not allowed in elemental procedure %qs at %LArgument %qs of elemental procedure %qs at %L must have its INTENT specified or have the VALUE attributeArgument '%s' of statement function '%s' at %L must be scalarCharacter-valued argument %qs of statement function at %L must have constant lengthPUBLIC function %qs at %L of PRIVATE type %qsENTRY %qs at %L has no IMPLICIT typeContained procedure %qs at %L of a PURE procedure must also be PUREDATA statement at %L has more values than variablesSyntax error in EQUIVALENCE statement at %LEither all or none of the objects in the EQUIVALENCE set at %L shall have the PROTECTED attributeCOMMON block member %qs at %L cannot be an EQUIVALENCE object in the pure procedure %qsNamed constant %qs at %L cannot be an EQUIVALENCE objectSequence %s with mixed components in EQUIVALENCE statement at %L with different type objectsNon-default type object or sequence %s in EQUIVALENCE statement at %L with objects of different typeNon-CHARACTER object %qs in default CHARACTER EQUIVALENCE statement at %LNon-NUMERIC object %qs in default NUMERIC EQUIVALENCE statement at %LArray %qs at %L with non-constant bounds cannot be an EQUIVALENCE objectStructure component %qs at %L cannot be an EQUIVALENCE objectSubstring at %L has length zeroAssociating entity %qs at %L is a procedure pointerDerived type %qs cannot be used as a variable at %LAssociating entity %qs at %L is a procedure nameAssociating selector-expression at %L yields a procedureSelector at %L cannot be NULL()Associate-name %qs at %L is used as arrayCLASS selector at %L needs a temporary which is not yet implementedCoarray component %qs at %L must be allocatable with deferred shapeComponent %qs at %L of TYPE(C_PTR) or TYPE(C_FUNPTR) shall not be a coarrayComponent %qs at %L with coarray component shall be a nonpointer, nonallocatable scalarComponent %qs at %L has the CONTIGUOUS attribute but is not an array pointerComponent %qs of BIND(C) type at %L must have length oneProcedure pointer component %qs with PASS at %L must have at least one argumentArgument %qs of %qs with PASS(%s) at %L must be of the derived type %qsArgument %qs of %qs with PASS(%s) at %L must be scalarArgument %qs of %qs with PASS(%s) at %L may not have the POINTER attributeArgument %qs of %qs with PASS(%s) at %L may not be ALLOCATABLEComponent %qs of %qs at %L has the same name as an inherited type-bound procedureCharacter length of component %qs needs to be a constant specification expression at %LCharacter component %qs of %qs at %L with deferred length must be a POINTER or 
ALLOCATABLEthe component %qs is a PRIVATE type and cannot be a component of %qs, which is PUBLIC at %LPolymorphic component %s at %L in SEQUENCE or BIND(C) type %sComponent %s of SEQUENCE type declared at %L does not have the SEQUENCE attributeProcedure pointer component %qs with PASS(%s) at %L has no argument %qsConflicting initializers in union at %L and %LAs extending type %qs at %L has a coarray component, parent type %qs shall also have oneNon-extensible derived-type %qs at %L must not be ABSTRACTParameterized type %qs does not have a component corresponding to parameter %qs at %LGeneric name %qs of function %qs at %L being the same name as derived type at %LDerived type %qs at %L has not been declaredThe rank of the element in the structure constructor at %L does not match that of the component (%d/%d)The element in the structure constructor at %L, for pointer component %qs, is %s but should be %sThe NULL in the structure constructor at %L is being applied to component %qs, which is neither a POINTER nor ALLOCATABLEInterface mismatch for procedure-pointer component %qs in structure constructor at %L: %sThe element in the structure constructor at %L, for pointer component %qs should be a POINTER or a TARGETInvalid expression in the structure constructor for pointer component %qs at %L in PURE procedureSelector shall be polymorphic in SELECT TYPE statement at %LSelector at %L must not be coindexedTYPE IS at %L overlaps with TYPE IS at %LDerived type %qs at %L must be extensibleDerived type %qs at %L must be an extension of %qsUnexpected intrinsic type %qs at %LThe type-spec at %L shall specify that each length type parameter is assumedDouble CLASS IS block in SELECT TYPE statement at %L../../gcc/fortran/resolve.cCOMMON block at %L(blank)_F.DA%dABSTRACT type %qs used at %LINOUTelemental proceduremaster.%d.%sIllegal stride of zero at %Lallocatable argumentasynchronous argumentoptional argumentpointer argumenttarget argumentvalue argumentvolatile argumentassumed-shape argumentassumed-rank argumentcoarray argumentpolymorphic argumentNO_ARG_CHECK attributeassumed-type argumentarray resultpointer or allocatable resultbind(c) procedure%qs at %L is not a functionREAL array index at %LSymbol %qs at %L is ambiguousError in typebound call at %L%s at %L must be a scalar%s at %L must be integer%s at %L must be INTEGERLoop variableiterator variableStart expression in DO loopEnd expression in DO loopStep expression in DO loopSTAT variableERRMSG variableDEALLOCATE objectFORALL construct at %L_F.lock_type_F.lock_var%dACQUIRED_LOCK variableInvalid NULL at %Litem in READSelector at %L has no type_%s_length_caf_%spMMMM5M5M5M5M5M5M5M5M5M5M5M5MMMM(M@M`MxMM)MM MM+M:+Nx$NG-N'NG-NH'N'N$N'N%N`&NiNiNjNjNjNjNjNHlNlNlNlNlNpkNpkNmNmNmNmNpkNpkNmNmNmNmNpmN0jNmN8kNNҡNNҡNҡNҡNҡNҡNҡNҡNҡNҡN@NҡNNNNNpNNNҡNҡNҡNҡNҡNҡNNNҡNҡNҡNҡNҡNNNNNҡNҡNҡNҡNҡNҡNҡNҡNҡNҡNҡNҡNҡNҡNҡNҡNNNNNNNNNNNNNNNNNҡNNNҡNNNNNNNNNNNҡNҡNҡNNNNҡNҡNNNNNNNNNNNNNNNNNNNNNNҡNNNNNNNNNNҡNNҡNҡNҡNҡNҡNҡNҡNҡNҡNNҡNNNNNpNNNҡNҡNҡNҡNҡNҡNNNҡNҡNҡNҡNҡNNNNNҡNҡNҡNҡNҡNҡNҡNҡNҡNҡNҡNҡNҡNҡNҡNҡNNNNNNNNNNNNNNNNNҡNNNҡNNNNNNNNNNNҡNҡNҡNNNNҡNҡNNNNNNNNNNNNNNNNNNNNNNҡNNNNNNNNNȣN@N@N@N@N@N@N@N@N@N@N@N0N@N@N@N@N@N@N@N@N@N@N@N@N@N@N@N@N@N@N@N@N@N@N@N@N@N@NNNN@NNNNNNN@N@N@N@N@N@N@N@NN@N@N@NNNNN@N@NN@N@N@N@NN@N@N@N@N@NNNN@N@NNNNNNNNNNNNNNN@N@N@N@NNNNNNNzNNNN`NKNpNNN@NNвNNNQNNNNNNNNNNN@NNNNN*NNNNجNN/N/N/NNN˱NyNyN8NNNԦNԦNNԦNԦNNNNNؤNؤNؤNؤNN N N NzN N N N N N N N N N N N N 
l44mHTm\tmpnnoop@pTqqsجs$t$4t8DupTudvtvЭwwxTxhyyyȮyܮzz$z4z,Dz@TzTdzhtz|zzzz̯z{D{d{{0{D{X{l|$|D|d||ذ|||,}H$}dD}d}}}Ա}} ~,D~Lt~l~~̲4d ,Ll$T̳44`|ā$ܴdă$XtDH\Dp4$8Ld|ܷ4d,TT|Ԍ̸4(pԏԐ(<P$d4x40D4lԓL`ĕtԕļ$ؼDTdt(ԖX4̽4Tt0DTXlttĜо$4 D T4dHt\pĝȿĞԞ@\$p4Tdt$TġTdt(d|DdPd(HhT4T,TLlTD8Tp4$t4dPlTd HDt4htd$TTD0TDXt <P4d4Tdt0DdptX$t,$H"$&$$/49DC0$DlE4GPHP\PpZZZ(Z<ccddm<tmPmdwww4H$Dd$@ThԇDt8TPdLtt$,DLddĖ$D$Tpė4DT4LԘ`DTd(<P$ě tD|4$ D8dTԞTt0hĠ4 ԡ$@X$D44lTtd(DĥXtTd,Ĩhd tHԪt8t|ĭ$D0ĮD\p4DTtįԯ(DXl$4Dd $$@DԺ0d|$4Tdt $<\$pDd$tdx$$XDTt<DlTd@4Th|T$T(<PT4$\p4<d4,P4t dH`HQTXY[Td\T]`$g@hhi $l\nTnndp,pTpqDqtqqrr@rTrhDs|ssttttt(uPtuhuDvdvv$www0xT}T,4\dDtċX(\Dĝ4tt(Th|$DdĬ0L$`D|ԭ4Tt0LԮ|4T8ԯTttd$ t8 T l  4 T t     IJ$ $T T   D 4 d $ t ĵ Ե  L  Ļ dX  d d$|DtD0LT $d T<lPd,@t,tht4 D4hD \(<Pdx4TtD,t@Th|$DTdt0DXl4DT0Lh$|4D4DtHpTdTddx4DtDXD$D4Xdt0h$4 D T4dHt\pD,$@dTh 8 L Dh t    D!t@!l!!D!!$$"D8"dT"h""$"D"d"#D#`#Dt#t###T0$dD$X$$$4$d$$,%P%d%t%T%T&d0&X&&&&&&t4'l'4''''$,(l(( ( ) d) )T ) ) ) * p* ***$+d@+T+h+++4+T+t+,4,\,$,4,D,T,d,t,--,-H-Tx--$-4-T-d .(.D.X.l.....$(/DD/d`/|////$0D0P0d04x0D0d0t0001D112t,2D2$ \2)24*20<383<34>4>L4t?4D@4B4TB5dB(5B<5BP5Bd5Bx5B5C54C5DC5TC5dD6tD06Gl6$I6J6dL 7L 7ML74M`7N7N7O 84QD8Q\8Qt8dR8dT9UT9V9tX9_8:4`P:d:e:e:e:e;f4;fH;fd;g;g;$h;Dj<jd< ABA Hr F H s] PH U (A BBBA w0Z=8w,Z=BBG A(F0(A BBB (A ABBD C (A ABBG .PHBCMHBV^BZA]BZB^B_BhLJGGRBE B(D0A8JX 8D0A(B BBBA DGDTN 8G0A(B BBBJ  8F0A(B BBBM (`N:GDD _AB(ؿtNy(A BBBK8NAA M ABD `O\OXOTO8h+D$\OOQAG sAA$8OGAAO sAAk8DpxFdxApn 8A0A(B BBBA WVA 8D0A(B BBBH 0$UAAI@] AAF nAAA8JV 8A0A(B BBBF GGGGGUTGGGGQKCCCCIc (D BBBA A(D0 (D ABBG J (A ABBH ) 8C0A(B BBBE {PQAdzj[B8F0A(B BBBDHX]5aX]T8A0A(B BBBHĢ]pZ 8D0A(B BBBB H h] 8A0A(B BBBE \`|] p\|]0A8L2 8A0A(B BBBK $8]84] (A ABBC A (A ABBI (^XGBD A(K0T (H GBBH pP0Z (A DBBE R (A DBBE h (A DBBE W (D ABBE U (A DBBE h (A DBBI Z (A DBBE LH^8BBE B(D0A8H7 8A0A(B BBBG d8^x4^ HAXBKdBhPE 8A0A(B BBBA 8L 8A0A(B BBBH JAD I`s AABAB` (A BBBA D (F ABBA p|0qrBBD A(J0n (A ABBI H (F ABBE Y (F ABBD d (C ABBD $2q7AAL cDA$2q7AAL cDAN 8A0A(B BBBE HiA+HdBHfBO_BHCp 8A0A(B BBBE xBaxDppVBxEBBE B(D0A8DBaE 8A0A(B BBBH *BaDl@WIx BBE B(D0A8M 8A0A(B BBBE JBjDBhEBBG B(A0A8D 8C0A(B BBBJ rBaDB^E 8A0A(B BBBH GFfDWPAwpWB\LZAG 8D0A(B BBBB m]fBkVVBXvA\RXA YA YP 8A0A(B BBBE `x CA^ A D8 DW\8 BBB B(D0A8GA 8A0A(B BBBE BeAH FH p`I 6HeAQHfAKA^Ay 8D0A(B BBBI wDUA FH  GM !HRA 8C0A(B BBBF pi 8A0A(B BBBF L 0F(A BBBG AYAAYA (A ABBK I FfDzBoCdV,. 8A0A(B BBBG  a  a  a a$ a8 aL a ` a t a $ a8AAL dDA$ a8AAL dDA0 aSBAD D@  ACBF I F O 8D0A(B BBBC 1 nJ2 Dn2 Pn02 \nD2 hnX2 tnl2 n2 n82 nC F  B L(A BBBC I \ D I F $ D0x D I G d D  8A0A(B BBBA d 8F0A(B BBBE m 8A0A(B BBBA ^ OcB\R`AOdA\e[B\RaB\e[B\R`AnOdA0e[Be[B7 ^E  ^E  ^E  ^E ~ ^E ,e[BeZB ^E eZBeZAeZAe[A ^E  ^E 1 (A BBBR K (C BBBD  KdA8D0A(B BBBH d#EAA K CBI S FBE IEDR&VAANEOB]^$ ^  ,^Aw H F J a O E K TIAv[EGMT8h4|0,($̮ 0 DXlЯ  4H\pD԰$8L`tıر(<P|dxxtplhȲdܲ`\XT,P@LTHhD|@<84̳0,($ 0DXl ԴA[A] B 4hL`dX |P;D_ M A A lDABCBAD  ̞ %HX4%HXH2IzPLRx# ,$TcAAG  AAB ,TOAAG  DAH *PT b AG H h-gDA K Ъ%$!uHT A DȪ\ pAQ̡ܫ$AQDKXKl fAm B p A $HDy C O I O I ̢DQDL,/AAG I DAE @@ȬcNPD}p(oDP A YJ$8L`]tH]pT%4D ̈́22I32I@32Il32I "3RAAL "3RoAAL #4RWAAL ,#L4R?AAL P#4R'AAL t#47AC C #5RAAL #6RAAL #X6RϸAAL $6RAAL <6P6P$69wHlp$63[Hf77AL7ALd%hQx%tQ%Q%Q%Q%Q%Q HS%Q HS&Q HS0&Q HSL&Q HSh&Q HS&Q HS&QAL&Q HS&Q HS&Q HS'Q HS,'Q#AG VDL'Q#AG VDl'R#AG VD'R#AG VD' R#AG VD'0R#AG VD'@R#AG VD (PR#AG VD,(`R#AG VDL(pR#AG VDl(R#AG VDxR% cHXR-cH`R%bHX؉R-bH`(RXbAAG EDA8)SAL(@SIlbAAG vDA),SAL)0S)\AW,`;~AAK X AAA L>pAWh>tAW>xAW>x >tkAG k AD >AL,ħE~IAA U ABA ((?{AAG0x AAE T?8h?42`N?X*D`?p?|*D`??? 
finclude/omp_lib.f90 (Jakub Jelinek):
module omp_lib_kinds implicit none integer, parameter :: omp_lock_kind = 4 integer, parameter :: omp_nest_lock_kind = 8 integer, parameter :: omp_sched_kind = 4 integer, parameter :: omp_proc_bind_kind = 4 integer, parameter :: omp_lock_hint_kind = 4 integer, parameter :: omp_pause_resource_kind = 4
integer (omp_sched_kind), parameter :: omp_sched_static = 1 integer (omp_sched_kind), parameter :: omp_sched_dynamic = 2 integer (omp_sched_kind), parameter :: omp_sched_guided = 3 integer (omp_sched_kind), parameter :: omp_sched_auto = 4
integer (omp_proc_bind_kind), & parameter :: omp_proc_bind_false = 0 integer (omp_proc_bind_kind), & parameter :: omp_proc_bind_true = 1 integer (omp_proc_bind_kind), & parameter :: omp_proc_bind_master = 2 integer (omp_proc_bind_kind), & parameter :: omp_proc_bind_close = 3 integer (omp_proc_bind_kind), & parameter :: omp_proc_bind_spread = 4
integer (omp_lock_hint_kind), & parameter :: omp_lock_hint_none = 0 integer (omp_lock_hint_kind), & parameter :: omp_lock_hint_uncontended = 1 integer (omp_lock_hint_kind), & parameter :: omp_lock_hint_contended = 2 integer (omp_lock_hint_kind), & parameter :: omp_lock_hint_nonspeculative = 4 integer (omp_lock_hint_kind), & parameter :: omp_lock_hint_speculative = 8
integer (kind=omp_pause_resource_kind), & parameter :: omp_pause_soft = 1 integer (kind=omp_pause_resource_kind), & parameter :: omp_pause_hard = 2 end module
module omp_lib use omp_lib_kinds implicit none integer, parameter :: openmp_version = 201511
interface subroutine omp_init_lock (svar) use omp_lib_kinds integer (omp_lock_kind), intent (out) :: svar end subroutine omp_init_lock end interface
interface subroutine omp_init_lock_with_hint (svar, hint) use omp_lib_kinds integer (omp_lock_kind), intent (out) :: svar integer (omp_lock_hint_kind), intent (in) :: hint end subroutine omp_init_lock_with_hint end interface
interface subroutine omp_init_nest_lock (nvar) use omp_lib_kinds integer (omp_nest_lock_kind), intent (out) :: nvar end subroutine omp_init_nest_lock end interface
interface subroutine omp_init_nest_lock_with_hint (nvar, hint) use omp_lib_kinds integer (omp_nest_lock_kind), intent (out) :: nvar integer (omp_lock_hint_kind), intent (in) :: hint end subroutine omp_init_nest_lock_with_hint end interface
interface subroutine omp_destroy_lock (svar) use omp_lib_kinds integer (omp_lock_kind), intent (inout) :: svar end subroutine omp_destroy_lock end interface
interface subroutine omp_destroy_nest_lock (nvar) use omp_lib_kinds integer (omp_nest_lock_kind), intent (inout) :: nvar end subroutine omp_destroy_nest_lock end interface
interface subroutine omp_set_lock (svar) use omp_lib_kinds integer (omp_lock_kind), intent (inout) :: svar end subroutine omp_set_lock end interface
interface subroutine omp_set_nest_lock (nvar) use omp_lib_kinds integer (omp_nest_lock_kind), intent (inout) :: nvar end subroutine omp_set_nest_lock end interface
interface subroutine omp_unset_lock (svar) use omp_lib_kinds integer (omp_lock_kind), intent (inout) :: svar end subroutine omp_unset_lock end interface
interface subroutine omp_unset_nest_lock (nvar) use omp_lib_kinds integer
(omp_nest_lock_kind), intent (inout) :: nvar end subroutine omp_unset_nest_lock end interface interface omp_set_dynamic subroutine omp_set_dynamic (dynamic_threads) logical (4), intent (in) :: dynamic_threads end subroutine omp_set_dynamic subroutine omp_set_dynamic_8 (dynamic_threads) logical (8), intent (in) :: dynamic_threads end subroutine omp_set_dynamic_8 end interface interface omp_set_nested subroutine omp_set_nested (nested) logical (4), intent (in) :: nested end subroutine omp_set_nested subroutine omp_set_nested_8 (nested) logical (8), intent (in) :: nested end subroutine omp_set_nested_8 end interface interface omp_set_num_threads subroutine omp_set_num_threads (num_threads) integer (4), intent (in) :: num_threads end subroutine omp_set_num_threads subroutine omp_set_num_threads_8 (num_threads) integer (8), intent (in) :: num_threads end subroutine omp_set_num_threads_8 end interface interface function omp_get_dynamic () logical (4) :: omp_get_dynamic end function omp_get_dynamic end interface interface function omp_get_nested () logical (4) :: omp_get_nested end function omp_get_nested end interface interface function omp_in_parallel () logical (4) :: omp_in_parallel end function omp_in_parallel end interface interface function omp_test_lock (svar) use omp_lib_kinds logical (4) :: omp_test_lock integer (omp_lock_kind), intent (inout) :: svar end function omp_test_lock end interface interface function omp_get_max_threads () integer (4) :: omp_get_max_threads end function omp_get_max_threads end interface interface function omp_get_num_procs () integer (4) :: omp_get_num_procs end function omp_get_num_procs end interface interface function omp_get_num_threads () integer (4) :: omp_get_num_threads end function omp_get_num_threads end interface interface function omp_get_thread_num () integer (4) :: omp_get_thread_num end function omp_get_thread_num end interface interface function omp_test_nest_lock (nvar) use omp_lib_kinds integer (4) :: omp_test_nest_lock integer (omp_nest_lock_kind), intent (inout) :: nvar end function omp_test_nest_lock end interface interface function omp_get_wtick () double precision :: omp_get_wtick end function omp_get_wtick end interface interface function omp_get_wtime () double precision :: omp_get_wtime end function omp_get_wtime end interface interface omp_set_schedule subroutine omp_set_schedule (kind, chunk_size) use omp_lib_kinds integer (omp_sched_kind), intent (in) :: kind integer (4), intent (in) :: chunk_size end subroutine omp_set_schedule subroutine omp_set_schedule_8 (kind, chunk_size) use omp_lib_kinds integer (omp_sched_kind), intent (in) :: kind integer (8), intent (in) :: chunk_size end subroutine omp_set_schedule_8 end interface interface omp_get_schedule subroutine omp_get_schedule (kind, chunk_size) use omp_lib_kinds integer (omp_sched_kind), intent (out) :: kind integer (4), intent (out) :: chunk_size end subroutine omp_get_schedule subroutine omp_get_schedule_8 (kind, chunk_size) use omp_lib_kinds integer (omp_sched_kind), intent (out) :: kind integer (8), intent (out) :: chunk_size end subroutine omp_get_schedule_8 end interface interface function omp_get_thread_limit () integer (4) :: omp_get_thread_limit end function omp_get_thread_limit end interface interface omp_set_max_active_levels subroutine omp_set_max_active_levels (max_levels) integer (4), intent (in) :: max_levels end subroutine omp_set_max_active_levels subroutine omp_set_max_active_levels_8 (max_levels) integer (8), intent (in) :: max_levels end subroutine 
omp_set_max_active_levels_8 end interface interface function omp_get_max_active_levels () integer (4) :: omp_get_max_active_levels end function omp_get_max_active_levels end interface interface function omp_get_level () integer (4) :: omp_get_level end function omp_get_level end interface interface omp_get_ancestor_thread_num function omp_get_ancestor_thread_num (level) integer (4), intent (in) :: level integer (4) :: omp_get_ancestor_thread_num end function omp_get_ancestor_thread_num function omp_get_ancestor_thread_num_8 (level) integer (8), intent (in) :: level integer (4) :: omp_get_ancestor_thread_num_8 end function omp_get_ancestor_thread_num_8 end interface interface omp_get_team_size function omp_get_team_size (level) integer (4), intent (in) :: level integer (4) :: omp_get_team_size end function omp_get_team_size function omp_get_team_size_8 (level) integer (8), intent (in) :: level integer (4) :: omp_get_team_size_8 end function omp_get_team_size_8 end interface interface function omp_get_active_level () integer (4) :: omp_get_active_level end function omp_get_active_level end interface interface function omp_in_final () logical (4) :: omp_in_final end function omp_in_final end interface interface function omp_get_cancellation () logical (4) :: omp_get_cancellation end function omp_get_cancellation end interface interface function omp_get_proc_bind () use omp_lib_kinds integer (omp_proc_bind_kind) :: omp_get_proc_bind end function omp_get_proc_bind end interface interface function omp_get_num_places () integer (4) :: omp_get_num_places end function omp_get_num_places end interface interface omp_get_place_num_procs function omp_get_place_num_procs (place_num) integer (4), intent(in) :: place_num integer (4) :: omp_get_place_num_procs end function omp_get_place_num_procs function omp_get_place_num_procs_8 (place_num) integer (8), intent(in) :: place_num integer (4) :: omp_get_place_num_procs_8 end function omp_get_place_num_procs_8 end interface interface omp_get_place_proc_ids subroutine omp_get_place_proc_ids (place_num, ids) integer (4), intent(in) :: place_num integer (4), intent(out) :: ids(*) end subroutine omp_get_place_proc_ids subroutine omp_get_place_proc_ids_8 (place_num, ids) integer (8), intent(in) :: place_num integer (8), intent(out) :: ids(*) end subroutine omp_get_place_proc_ids_8 end interface interface function omp_get_place_num () integer (4) :: omp_get_place_num end function omp_get_place_num end interface interface function omp_get_partition_num_places () integer (4) :: omp_get_partition_num_places end function omp_get_partition_num_places end interface interface omp_get_partition_place_nums subroutine omp_get_partition_place_nums (place_nums) integer (4), intent(out) :: place_nums(*) end subroutine omp_get_partition_place_nums subroutine omp_get_partition_place_nums_8 (place_nums) integer (8), intent(out) :: place_nums(*) end subroutine omp_get_partition_place_nums_8 end interface interface omp_set_default_device subroutine omp_set_default_device (device_num) integer (4), intent (in) :: device_num end subroutine omp_set_default_device subroutine omp_set_default_device_8 (device_num) integer (8), intent (in) :: device_num end subroutine omp_set_default_device_8 end interface interface function omp_get_default_device () integer (4) :: omp_get_default_device end function omp_get_default_device end interface interface function omp_get_num_devices () integer (4) :: omp_get_num_devices end function omp_get_num_devices end interface interface function 
omp_get_num_teams () integer (4) :: omp_get_num_teams end function omp_get_num_teams end interface interface function omp_get_team_num () integer (4) :: omp_get_team_num end function omp_get_team_num end interface interface function omp_is_initial_device () logical (4) :: omp_is_initial_device end function omp_is_initial_device end interface interface function omp_get_initial_device () integer (4) :: omp_get_initial_device end function omp_get_initial_device end interface interface function omp_get_max_task_priority () integer (4) :: omp_get_max_task_priority end function omp_get_max_task_priority end interface interface subroutine omp_set_affinity_format (format) character(len=*), intent(in) :: format end subroutine omp_set_affinity_format end interface interface function omp_get_affinity_format (buffer) integer (4) :: omp_get_affinity_format character(len=*), intent(out) :: buffer end function omp_get_affinity_format end interface interface subroutine omp_display_affinity (format) character(len=*), intent(in) :: format end subroutine omp_display_affinity end interface interface function omp_capture_affinity (buffer, format) integer (4) :: omp_capture_affinity character(len=*), intent(out) :: buffer character(len=*), intent(in) :: format end function omp_capture_affinity end interface interface function omp_pause_resource (kind, device_num) use omp_lib_kinds integer (4) :: omp_pause_resource integer (kind=omp_pause_resource_kind), & intent(in) :: kind integer (4) :: device_num end function end interface interface function omp_pause_resource_all (kind) use omp_lib_kinds integer (4) :: omp_pause_resource_all integer (kind=omp_pause_resource_kind), & intent(in) :: kind end function end interface end module omp_lib mgomp/finclude/omp_lib.hJakub Jelinek integer omp_lock_kind, omp_nest_lock_kind, openmp_version parameter (omp_lock_kind = 4) parameter (omp_nest_lock_kind = 8) integer omp_sched_kind parameter (omp_sched_kind = 4) integer (omp_sched_kind) omp_sched_static, omp_sched_dynamic integer (omp_sched_kind) omp_sched_guided, omp_sched_auto parameter (omp_sched_static = 1) parameter (omp_sched_dynamic = 2) parameter (omp_sched_guided = 3) parameter (omp_sched_auto = 4) integer omp_proc_bind_kind parameter (omp_proc_bind_kind = 4) integer (omp_proc_bind_kind) omp_proc_bind_false integer (omp_proc_bind_kind) omp_proc_bind_true integer (omp_proc_bind_kind) omp_proc_bind_master integer (omp_proc_bind_kind) omp_proc_bind_close integer (omp_proc_bind_kind) omp_proc_bind_spread parameter (omp_proc_bind_false = 0) parameter (omp_proc_bind_true = 1) parameter (omp_proc_bind_master = 2) parameter (omp_proc_bind_close = 3) parameter (omp_proc_bind_spread = 4) integer omp_lock_hint_kind parameter (omp_lock_hint_kind = 4) integer (omp_lock_hint_kind) omp_lock_hint_none integer (omp_lock_hint_kind) omp_lock_hint_uncontended integer (omp_lock_hint_kind) omp_lock_hint_contended integer (omp_lock_hint_kind) omp_lock_hint_nonspeculative integer (omp_lock_hint_kind) omp_lock_hint_speculative parameter (omp_lock_hint_none = 0) parameter (omp_lock_hint_uncontended = 1) parameter (omp_lock_hint_contended = 2) parameter (omp_lock_hint_nonspeculative = 4) parameter (omp_lock_hint_speculative = 8) parameter (openmp_version = 201511) integer omp_pause_resource_kind parameter (omp_pause_resource_kind = 4) integer (omp_pause_resource_kind) omp_pause_soft integer (omp_pause_resource_kind) omp_pause_hard parameter (omp_pause_soft = 1) parameter (omp_pause_hard = 2) external omp_init_lock, omp_init_nest_lock external 
omp_init_lock_with_hint external omp_init_nest_lock_with_hint external omp_destroy_lock, omp_destroy_nest_lock external omp_set_lock, omp_set_nest_lock external omp_unset_lock, omp_unset_nest_lock external omp_set_dynamic, omp_set_nested external omp_set_num_threads
external omp_get_dynamic, omp_get_nested logical(4) omp_get_dynamic, omp_get_nested external omp_test_lock, omp_in_parallel logical(4) omp_test_lock, omp_in_parallel external omp_get_max_threads, omp_get_num_procs integer(4) omp_get_max_threads, omp_get_num_procs external omp_get_num_threads, omp_get_thread_num integer(4) omp_get_num_threads, omp_get_thread_num external omp_test_nest_lock integer(4) omp_test_nest_lock external omp_get_wtick, omp_get_wtime double precision omp_get_wtick, omp_get_wtime
external omp_set_schedule, omp_get_schedule external omp_get_thread_limit, omp_set_max_active_levels external omp_get_max_active_levels, omp_get_level external omp_get_ancestor_thread_num, omp_get_team_size external omp_get_active_level integer(4) omp_get_thread_limit, omp_get_max_active_levels integer(4) omp_get_level, omp_get_ancestor_thread_num integer(4) omp_get_team_size, omp_get_active_level
external omp_in_final logical(4) omp_in_final external omp_get_cancellation logical(4) omp_get_cancellation external omp_get_proc_bind integer(omp_proc_bind_kind) omp_get_proc_bind integer(4) omp_get_num_places external omp_get_num_places integer(4) omp_get_place_num_procs external omp_get_place_num_procs external omp_get_place_proc_ids integer(4) omp_get_place_num external omp_get_place_num integer(4) omp_get_partition_num_places external omp_get_partition_num_places external omp_get_partition_place_nums
external omp_set_default_device, omp_get_default_device external omp_get_num_devices, omp_get_num_teams external omp_get_team_num integer(4) omp_get_default_device, omp_get_num_devices integer(4) omp_get_num_teams, omp_get_team_num external omp_is_initial_device logical(4) omp_is_initial_device external omp_get_initial_device integer(4) omp_get_initial_device external omp_get_max_task_priority integer(4) omp_get_max_task_priority
external omp_set_affinity_format, omp_get_affinity_format external omp_display_affinity, omp_capture_affinity integer(4) omp_get_affinity_format integer(4) omp_capture_affinity external omp_pause_resource, omp_pause_resource_all integer(4) omp_pause_resource integer(4) omp_pause_resource_all
finclude/omp_lib.mod (binary compiled Fortran module; no readable text)
finclude/openacc.f90:
! OpenACC Runtime Library Definitions. ! ! Contributed by Tobias Burnus ! and Mentor Embedded ! Keep in sync with config/accel/openacc.f90 and openacc_lib.h.
module openacc_kinds use iso_fortran_env, only: int32 implicit none public private :: int32 ! When adding items, also update 'public' setting in 'module openacc' below. integer, parameter :: acc_device_kind = int32 !
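! Usage sketch, for illustration only (hypothetical caller; "ndev" is a local
! variable of the example and is not part of this header): the acc_device_*
! constants declared just below are the values a program passes to the generic
! runtime entry points defined later in this file, along the lines of
!
!     use openacc
!     integer :: ndev
!     ndev = acc_get_num_devices (acc_device_nvidia)
!     if (ndev > 0) call acc_set_device_type (acc_device_nvidia)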
integer (acc_device_kind), parameter :: acc_device_current = -1 integer (acc_device_kind), parameter :: acc_device_none = 0 integer (acc_device_kind), parameter :: acc_device_default = 1 integer (acc_device_kind), parameter :: acc_device_host = 2 ! integer (acc_device_kind), parameter :: integer (acc_device_kind), parameter :: acc_device_not_host = 4 integer (acc_device_kind), parameter :: acc_device_nvidia = 5 integer (acc_device_kind), parameter :: acc_device_radeon = 8 integer, parameter :: acc_device_property_kind = int32 ! OpenACC 2.6/2.7/3.0 used acc_device_property; in a spec update the ! missing '_kind' was added for consistency. For backward compatibility, keep: integer, parameter :: acc_device_property = acc_device_property_kind ! integer (acc_device_property_kind), parameter :: acc_property_memory = 1 integer (acc_device_property_kind), parameter :: acc_property_free_memory = 2 integer (acc_device_property_kind), parameter :: acc_property_name = int(Z'10001') integer (acc_device_property_kind), parameter :: acc_property_vendor = int(Z'10002') integer (acc_device_property_kind), parameter :: acc_property_driver = int(Z'10003') integer, parameter :: acc_handle_kind = int32 ! integer (acc_handle_kind), parameter :: acc_async_noval = -1 integer (acc_handle_kind), parameter :: acc_async_sync = -2 end module openacc_kinds module openacc_internal use openacc_kinds implicit none interface function acc_get_num_devices_h (devicetype) import integer acc_get_num_devices_h integer (acc_device_kind) devicetype end function subroutine acc_set_device_type_h (devicetype) import integer (acc_device_kind) devicetype end subroutine function acc_get_device_type_h () import integer (acc_device_kind) acc_get_device_type_h end function subroutine acc_set_device_num_h (devicenum, devicetype) import integer devicenum integer (acc_device_kind) devicetype end subroutine function acc_get_device_num_h (devicetype) import integer acc_get_device_num_h integer (acc_device_kind) devicetype end function function acc_get_property_h (devicenum, devicetype, property) use iso_c_binding, only: c_size_t import implicit none (type, external) integer (c_size_t) :: acc_get_property_h integer, value :: devicenum integer (acc_device_kind), value :: devicetype integer (acc_device_property_kind), value :: property end function subroutine acc_get_property_string_h (devicenum, devicetype, property, string) import implicit none (type, external) integer, value :: devicenum integer (acc_device_kind), value :: devicetype integer (acc_device_property_kind), value :: property character (*) :: string end subroutine function acc_async_test_h (arg) logical acc_async_test_h integer arg end function function acc_async_test_all_h () logical acc_async_test_all_h end function subroutine acc_wait_h (arg) integer arg end subroutine subroutine acc_wait_async_h (arg, async) integer arg, async end subroutine subroutine acc_wait_all_h () end subroutine subroutine acc_wait_all_async_h (async) integer async end subroutine subroutine acc_init_h (devicetype) import integer (acc_device_kind) devicetype end subroutine subroutine acc_shutdown_h (devicetype) import integer (acc_device_kind) devicetype end subroutine function acc_on_device_h (devicetype) import integer (acc_device_kind) devicetype logical acc_on_device_h end function subroutine acc_copyin_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_copyin_64_h (a, len) use 
iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_copyin_array_h (a) type (*), dimension (..), contiguous :: a end subroutine subroutine acc_present_or_copyin_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_present_or_copyin_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_present_or_copyin_array_h (a) type (*), dimension (..), contiguous :: a end subroutine subroutine acc_create_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_create_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_create_array_h (a) type (*), dimension (..), contiguous :: a end subroutine subroutine acc_present_or_create_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_present_or_create_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_present_or_create_array_h (a) type (*), dimension (..), contiguous :: a end subroutine subroutine acc_copyout_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_copyout_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_copyout_array_h (a) type (*), dimension (..), contiguous :: a end subroutine subroutine acc_copyout_finalize_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_copyout_finalize_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_copyout_finalize_array_h (a) type (*), dimension (..), contiguous :: a end subroutine subroutine acc_delete_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_delete_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_delete_array_h (a) type (*), dimension (..), contiguous :: a end subroutine subroutine acc_delete_finalize_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_delete_finalize_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_delete_finalize_array_h (a) type (*), dimension (..), contiguous :: a end subroutine subroutine acc_update_device_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), 
dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_update_device_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_update_device_array_h (a) type (*), dimension (..), contiguous :: a end subroutine subroutine acc_update_self_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_update_self_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_update_self_array_h (a) type (*), dimension (..), contiguous :: a end subroutine function acc_is_present_32_h (a, len) use iso_c_binding, only: c_int32_t logical acc_is_present_32_h !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end function function acc_is_present_64_h (a, len) use iso_c_binding, only: c_int64_t logical acc_is_present_64_h !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end function function acc_is_present_array_h (a) logical acc_is_present_array_h type (*), dimension (..), contiguous :: a end function subroutine acc_copyin_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_copyin_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_copyin_async_array_h (a, async) use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async end subroutine subroutine acc_create_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_create_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_create_async_array_h (a, async) use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async end subroutine subroutine acc_copyout_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_copyout_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_copyout_async_array_h (a, async) use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async end subroutine subroutine acc_delete_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES 
NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_delete_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_delete_async_array_h (a, async) use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async end subroutine subroutine acc_update_device_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_update_device_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_update_device_async_array_h (a, async) use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async end subroutine subroutine acc_update_self_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_update_self_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_update_self_async_array_h (a, async) use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async end subroutine end interface interface function acc_get_num_devices_l (devicetype) & bind (C, name = "acc_get_num_devices") use iso_c_binding, only: c_int integer (c_int) :: acc_get_num_devices_l integer (c_int), value :: devicetype end function subroutine acc_set_device_type_l (devicetype) & bind (C, name = "acc_set_device_type") use iso_c_binding, only: c_int integer (c_int), value :: devicetype end subroutine function acc_get_device_type_l () & bind (C, name = "acc_get_device_type") use iso_c_binding, only: c_int integer (c_int) :: acc_get_device_type_l end function subroutine acc_set_device_num_l (devicenum, devicetype) & bind (C, name = "acc_set_device_num") use iso_c_binding, only: c_int integer (c_int), value :: devicenum, devicetype end subroutine function acc_get_device_num_l (devicetype) & bind (C, name = "acc_get_device_num") use iso_c_binding, only: c_int integer (c_int) :: acc_get_device_num_l integer (c_int), value :: devicetype end function function acc_get_property_l (devicenum, devicetype, property) & bind (C, name = "acc_get_property") use iso_c_binding, only: c_int, c_size_t implicit none (type, external) integer (c_size_t) :: acc_get_property_l integer (c_int), value :: devicenum integer (c_int), value :: devicetype integer (c_int), value :: property end function function acc_get_property_string_l (devicenum, devicetype, property) & bind (C, name = "acc_get_property_string") use iso_c_binding, only: c_int, c_ptr implicit none (type, external) type (c_ptr) :: acc_get_property_string_l integer (c_int), value :: devicenum integer (c_int), 
value :: devicetype integer (c_int), value :: property end function function acc_async_test_l (a) & bind (C, name = "acc_async_test") use iso_c_binding, only: c_int integer (c_int) :: acc_async_test_l integer (c_int), value :: a end function function acc_async_test_all_l () & bind (C, name = "acc_async_test_all") use iso_c_binding, only: c_int integer (c_int) :: acc_async_test_all_l end function subroutine acc_wait_l (a) & bind (C, name = "acc_wait") use iso_c_binding, only: c_int integer (c_int), value :: a end subroutine subroutine acc_wait_async_l (arg, async) & bind (C, name = "acc_wait_async") use iso_c_binding, only: c_int integer (c_int), value :: arg, async end subroutine subroutine acc_wait_all_l () & bind (C, name = "acc_wait_all") use iso_c_binding, only: c_int end subroutine subroutine acc_wait_all_async_l (async) & bind (C, name = "acc_wait_all_async") use iso_c_binding, only: c_int integer (c_int), value :: async end subroutine subroutine acc_init_l (devicetype) & bind (C, name = "acc_init") use iso_c_binding, only: c_int integer (c_int), value :: devicetype end subroutine subroutine acc_shutdown_l (devicetype) & bind (C, name = "acc_shutdown") use iso_c_binding, only: c_int integer (c_int), value :: devicetype end subroutine function acc_on_device_l (devicetype) & bind (C, name = "acc_on_device") use iso_c_binding, only: c_int integer (c_int) :: acc_on_device_l integer (c_int), value :: devicetype end function subroutine acc_copyin_l (a, len) & bind (C, name = "acc_copyin") use iso_c_binding, only: c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine subroutine acc_present_or_copyin_l (a, len) & bind (C, name = "acc_present_or_copyin") use iso_c_binding, only: c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine subroutine acc_create_l (a, len) & bind (C, name = "acc_create") use iso_c_binding, only: c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine subroutine acc_present_or_create_l (a, len) & bind (C, name = "acc_present_or_create") use iso_c_binding, only: c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine subroutine acc_copyout_l (a, len) & bind (C, name = "acc_copyout") use iso_c_binding, only: c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine subroutine acc_copyout_finalize_l (a, len) & bind (C, name = "acc_copyout_finalize") use iso_c_binding, only: c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine subroutine acc_delete_l (a, len) & bind (C, name = "acc_delete") use iso_c_binding, only: c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine subroutine acc_delete_finalize_l (a, len) & bind (C, name = "acc_delete_finalize") use iso_c_binding, only: c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine subroutine acc_update_device_l (a, len) & bind (C, name = "acc_update_device") use iso_c_binding, only: c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine subroutine acc_update_self_l (a, len) & bind (C, name = "acc_update_self") use iso_c_binding, only: c_size_t 
!GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine function acc_is_present_l (a, len) & bind (C, name = "acc_is_present") use iso_c_binding, only: c_int32_t, c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a integer (c_int32_t) :: acc_is_present_l type (*), dimension (*) :: a integer (c_size_t), value :: len end function subroutine acc_copyin_async_l (a, len, async) & bind (C, name = "acc_copyin_async") use iso_c_binding, only: c_size_t, c_int !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len integer (c_int), value :: async end subroutine subroutine acc_create_async_l (a, len, async) & bind (C, name = "acc_create_async") use iso_c_binding, only: c_size_t, c_int !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len integer (c_int), value :: async end subroutine subroutine acc_copyout_async_l (a, len, async) & bind (C, name = "acc_copyout_async") use iso_c_binding, only: c_size_t, c_int !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len integer (c_int), value :: async end subroutine subroutine acc_delete_async_l (a, len, async) & bind (C, name = "acc_delete_async") use iso_c_binding, only: c_size_t, c_int !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len integer (c_int), value :: async end subroutine subroutine acc_update_device_async_l (a, len, async) & bind (C, name = "acc_update_device_async") use iso_c_binding, only: c_size_t, c_int !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len integer (c_int), value :: async end subroutine subroutine acc_update_self_async_l (a, len, async) & bind (C, name = "acc_update_self_async") use iso_c_binding, only: c_size_t, c_int !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len integer (c_int), value :: async end subroutine end interface end module openacc_internal module openacc use openacc_kinds use openacc_internal implicit none private ! 
From openacc_kinds public :: acc_device_kind public :: acc_device_none, acc_device_default, acc_device_host public :: acc_device_not_host, acc_device_nvidia, acc_device_radeon public :: acc_device_property_kind, acc_device_property public :: acc_property_memory, acc_property_free_memory public :: acc_property_name, acc_property_vendor, acc_property_driver public :: acc_handle_kind public :: acc_async_noval, acc_async_sync public :: openacc_version public :: acc_get_num_devices, acc_set_device_type, acc_get_device_type public :: acc_set_device_num, acc_get_device_num public :: acc_get_property, acc_get_property_string public :: acc_async_test, acc_async_test_all public :: acc_wait, acc_async_wait, acc_wait_async public :: acc_wait_all, acc_async_wait_all, acc_wait_all_async public :: acc_init, acc_shutdown, acc_on_device public :: acc_copyin, acc_present_or_copyin, acc_pcopyin, acc_create public :: acc_present_or_create, acc_pcreate, acc_copyout, acc_delete public :: acc_update_device, acc_update_self, acc_is_present public :: acc_copyin_async, acc_create_async, acc_copyout_async public :: acc_delete_async, acc_update_device_async, acc_update_self_async public :: acc_copyout_finalize, acc_delete_finalize integer, parameter :: openacc_version = 201711 interface acc_get_num_devices procedure :: acc_get_num_devices_h end interface interface acc_set_device_type procedure :: acc_set_device_type_h end interface interface acc_get_device_type procedure :: acc_get_device_type_h end interface interface acc_set_device_num procedure :: acc_set_device_num_h end interface interface acc_get_device_num procedure :: acc_get_device_num_h end interface interface acc_get_property procedure :: acc_get_property_h end interface interface acc_get_property_string procedure :: acc_get_property_string_h end interface interface acc_async_test procedure :: acc_async_test_h end interface interface acc_async_test_all procedure :: acc_async_test_all_h end interface interface acc_wait procedure :: acc_wait_h end interface ! acc_async_wait is an OpenACC 1.0 compatibility name for acc_wait. interface acc_async_wait procedure :: acc_wait_h end interface interface acc_wait_async procedure :: acc_wait_async_h end interface interface acc_wait_all procedure :: acc_wait_all_h end interface ! acc_async_wait_all is an OpenACC 1.0 compatibility name for acc_wait_all. interface acc_async_wait_all procedure :: acc_wait_all_h end interface interface acc_wait_all_async procedure :: acc_wait_all_async_h end interface interface acc_init procedure :: acc_init_h end interface interface acc_shutdown procedure :: acc_shutdown_h end interface interface acc_on_device procedure :: acc_on_device_h end interface ! acc_malloc: Only available in C/C++ ! acc_free: Only available in C/C++ ! As vendor extension, the following code supports both 32bit and 64bit ! arguments for "size"; the OpenACC standard only permits default-kind ! integers, which are of kind 4 (i.e. 32 bits). ! Additionally, the two-argument version also takes arrays as argument. ! and the one argument version also scalars. Note that the code assumes ! that the arrays are contiguous. 
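! Usage sketch, for illustration only ("buf" and the byte counts below are
! hypothetical, not part of this header). With the generic interfaces declared
! next, the same name resolves to the 32-bit-length, 64-bit-length and
! array-descriptor variants:
!
!     use openacc
!     real :: buf(1024)
!     call acc_copyin (buf)                      ! array version; size taken from the argument itself
!     call acc_copyin (buf, int (4096, kind=4))  ! default-kind (32-bit) byte count, standard form
!     call acc_copyin (buf, int (4096, kind=8))  ! 64-bit byte count, vendor extension
!     ...
!     call acc_copyout (buf)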
interface acc_copyin procedure :: acc_copyin_32_h procedure :: acc_copyin_64_h procedure :: acc_copyin_array_h end interface interface acc_present_or_copyin procedure :: acc_present_or_copyin_32_h procedure :: acc_present_or_copyin_64_h procedure :: acc_present_or_copyin_array_h end interface interface acc_pcopyin procedure :: acc_present_or_copyin_32_h procedure :: acc_present_or_copyin_64_h procedure :: acc_present_or_copyin_array_h end interface interface acc_create procedure :: acc_create_32_h procedure :: acc_create_64_h procedure :: acc_create_array_h end interface interface acc_present_or_create procedure :: acc_present_or_create_32_h procedure :: acc_present_or_create_64_h procedure :: acc_present_or_create_array_h end interface interface acc_pcreate procedure :: acc_present_or_create_32_h procedure :: acc_present_or_create_64_h procedure :: acc_present_or_create_array_h end interface interface acc_copyout procedure :: acc_copyout_32_h procedure :: acc_copyout_64_h procedure :: acc_copyout_array_h end interface interface acc_copyout_finalize procedure :: acc_copyout_finalize_32_h procedure :: acc_copyout_finalize_64_h procedure :: acc_copyout_finalize_array_h end interface interface acc_delete procedure :: acc_delete_32_h procedure :: acc_delete_64_h procedure :: acc_delete_array_h end interface interface acc_delete_finalize procedure :: acc_delete_finalize_32_h procedure :: acc_delete_finalize_64_h procedure :: acc_delete_finalize_array_h end interface interface acc_update_device procedure :: acc_update_device_32_h procedure :: acc_update_device_64_h procedure :: acc_update_device_array_h end interface interface acc_update_self procedure :: acc_update_self_32_h procedure :: acc_update_self_64_h procedure :: acc_update_self_array_h end interface ! acc_map_data: Only available in C/C++ ! acc_unmap_data: Only available in C/C++ ! acc_deviceptr: Only available in C/C++ ! acc_hostptr: Only available in C/C++ interface acc_is_present procedure :: acc_is_present_32_h procedure :: acc_is_present_64_h procedure :: acc_is_present_array_h end interface ! acc_memcpy_to_device: Only available in C/C++ ! 
acc_memcpy_from_device: Only available in C/C++ interface acc_copyin_async procedure :: acc_copyin_async_32_h procedure :: acc_copyin_async_64_h procedure :: acc_copyin_async_array_h end interface interface acc_create_async procedure :: acc_create_async_32_h procedure :: acc_create_async_64_h procedure :: acc_create_async_array_h end interface interface acc_copyout_async procedure :: acc_copyout_async_32_h procedure :: acc_copyout_async_64_h procedure :: acc_copyout_async_array_h end interface interface acc_delete_async procedure :: acc_delete_async_32_h procedure :: acc_delete_async_64_h procedure :: acc_delete_async_array_h end interface interface acc_update_device_async procedure :: acc_update_device_async_32_h procedure :: acc_update_device_async_64_h procedure :: acc_update_device_async_array_h end interface interface acc_update_self_async procedure :: acc_update_self_async_32_h procedure :: acc_update_self_async_64_h procedure :: acc_update_self_async_array_h end interface end module openacc function acc_get_num_devices_h (devicetype) use openacc_internal, only: acc_get_num_devices_l use openacc_kinds integer acc_get_num_devices_h integer (acc_device_kind) devicetype acc_get_num_devices_h = acc_get_num_devices_l (devicetype) end function subroutine acc_set_device_type_h (devicetype) use openacc_internal, only: acc_set_device_type_l use openacc_kinds integer (acc_device_kind) devicetype call acc_set_device_type_l (devicetype) end subroutine function acc_get_device_type_h () use openacc_internal, only: acc_get_device_type_l use openacc_kinds integer (acc_device_kind) acc_get_device_type_h acc_get_device_type_h = acc_get_device_type_l () end function subroutine acc_set_device_num_h (devicenum, devicetype) use openacc_internal, only: acc_set_device_num_l use openacc_kinds integer devicenum integer (acc_device_kind) devicetype call acc_set_device_num_l (devicenum, devicetype) end subroutine function acc_get_device_num_h (devicetype) use openacc_internal, only: acc_get_device_num_l use openacc_kinds integer acc_get_device_num_h integer (acc_device_kind) devicetype acc_get_device_num_h = acc_get_device_num_l (devicetype) end function function acc_get_property_h (devicenum, devicetype, property) use iso_c_binding, only: c_size_t use openacc_internal, only: acc_get_property_l use openacc_kinds implicit none (type, external) integer (c_size_t) :: acc_get_property_h integer, value :: devicenum integer (acc_device_kind), value :: devicetype integer (acc_device_property_kind), value :: property acc_get_property_h = acc_get_property_l (devicenum, devicetype, property) end function subroutine acc_get_property_string_h (devicenum, devicetype, property, string) use iso_c_binding, only: c_char, c_size_t, c_ptr, c_f_pointer, c_associated use openacc_internal, only: acc_get_property_string_l use openacc_kinds implicit none (type, external) integer, value :: devicenum integer (acc_device_kind), value :: devicetype integer (acc_device_property_kind), value :: property character (*) :: string type (c_ptr) :: cptr integer(c_size_t) :: clen, slen, i character (kind=c_char, len=1), pointer, contiguous :: sptr (:) interface function strlen (s) bind (C, name = "strlen") use iso_c_binding, only: c_ptr, c_size_t type (c_ptr), intent(in), value :: s integer (c_size_t) :: strlen end function strlen end interface cptr = acc_get_property_string_l (devicenum, devicetype, property) string = "" if (.not. 
c_associated (cptr)) then return end if clen = strlen (cptr) call c_f_pointer (cptr, sptr, [clen]) slen = min (clen, len (string, kind=c_size_t)) do i = 1, slen string (i:i) = sptr (i) end do end subroutine function acc_async_test_h (arg) use openacc_internal, only: acc_async_test_l logical acc_async_test_h integer arg acc_async_test_h = acc_async_test_l (arg) /= 0 end function function acc_async_test_all_h () use openacc_internal, only: acc_async_test_all_l logical acc_async_test_all_h acc_async_test_all_h = acc_async_test_all_l () /= 0 end function subroutine acc_wait_h (arg) use openacc_internal, only: acc_wait_l integer arg call acc_wait_l (arg) end subroutine subroutine acc_wait_async_h (arg, async) use openacc_internal, only: acc_wait_async_l integer arg, async call acc_wait_async_l (arg, async) end subroutine subroutine acc_wait_all_h () use openacc_internal, only: acc_wait_all_l call acc_wait_all_l () end subroutine subroutine acc_wait_all_async_h (async) use openacc_internal, only: acc_wait_all_async_l integer async call acc_wait_all_async_l (async) end subroutine subroutine acc_init_h (devicetype) use openacc_internal, only: acc_init_l use openacc_kinds integer (acc_device_kind) devicetype call acc_init_l (devicetype) end subroutine subroutine acc_shutdown_h (devicetype) use openacc_internal, only: acc_shutdown_l use openacc_kinds integer (acc_device_kind) devicetype call acc_shutdown_l (devicetype) end subroutine function acc_on_device_h (devicetype) use openacc_internal, only: acc_on_device_l use openacc_kinds integer (acc_device_kind) devicetype logical acc_on_device_h acc_on_device_h = acc_on_device_l (devicetype) /= 0 end function subroutine acc_copyin_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_copyin_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_copyin_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_copyin_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_copyin_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_copyin_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_copyin_array_h (a) use openacc_internal, only: acc_copyin_l type (*), dimension (..), contiguous :: a call acc_copyin_l (a, sizeof (a)) end subroutine subroutine acc_present_or_copyin_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_present_or_copyin_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_present_or_copyin_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_present_or_copyin_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_present_or_copyin_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_present_or_copyin_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_present_or_copyin_array_h (a) use openacc_internal, only: acc_present_or_copyin_l type (*), dimension (..), contiguous :: a call acc_present_or_copyin_l (a, sizeof (a)) end subroutine subroutine acc_create_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_create_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_create_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_create_64_h (a, len) use iso_c_binding, only: c_int64_t, 
c_size_t use openacc_internal, only: acc_create_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_create_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_create_array_h (a) use openacc_internal, only: acc_create_l type (*), dimension (..), contiguous :: a call acc_create_l (a, sizeof (a)) end subroutine subroutine acc_present_or_create_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_present_or_create_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_present_or_create_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_present_or_create_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_present_or_create_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_present_or_create_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_present_or_create_array_h (a) use openacc_internal, only: acc_present_or_create_l type (*), dimension (..), contiguous :: a call acc_present_or_create_l (a, sizeof (a)) end subroutine subroutine acc_copyout_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_copyout_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_copyout_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_copyout_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_copyout_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_copyout_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_copyout_array_h (a) use openacc_internal, only: acc_copyout_l type (*), dimension (..), contiguous :: a call acc_copyout_l (a, sizeof (a)) end subroutine subroutine acc_copyout_finalize_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_copyout_finalize_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_copyout_finalize_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_copyout_finalize_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_copyout_finalize_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_copyout_finalize_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_copyout_finalize_array_h (a) use openacc_internal, only: acc_copyout_finalize_l type (*), dimension (..), contiguous :: a call acc_copyout_finalize_l (a, sizeof (a)) end subroutine subroutine acc_delete_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_delete_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_delete_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_delete_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_delete_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_delete_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_delete_array_h (a) use openacc_internal, only: acc_delete_l type (*), dimension (..), contiguous :: a call acc_delete_l (a, sizeof (a)) end subroutine subroutine acc_delete_finalize_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: 
acc_delete_finalize_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_delete_finalize_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_delete_finalize_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_delete_finalize_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_delete_finalize_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_delete_finalize_array_h (a) use openacc_internal, only: acc_delete_finalize_l type (*), dimension (..), contiguous :: a call acc_delete_finalize_l (a, sizeof (a)) end subroutine subroutine acc_update_device_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_update_device_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_update_device_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_update_device_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_update_device_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_update_device_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_update_device_array_h (a) use openacc_internal, only: acc_update_device_l type (*), dimension (..), contiguous :: a call acc_update_device_l (a, sizeof (a)) end subroutine subroutine acc_update_self_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_update_self_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_update_self_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_update_self_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_update_self_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_update_self_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_update_self_array_h (a) use openacc_internal, only: acc_update_self_l type (*), dimension (..), contiguous :: a call acc_update_self_l (a, sizeof (a)) end subroutine function acc_is_present_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_is_present_l logical acc_is_present_32_h !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len acc_is_present_32_h = acc_is_present_l (a, int (len, kind = c_size_t)) /= 0 end function function acc_is_present_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_is_present_l logical acc_is_present_64_h !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len acc_is_present_64_h = acc_is_present_l (a, int (len, kind = c_size_t)) /= 0 end function function acc_is_present_array_h (a) use openacc_internal, only: acc_is_present_l logical acc_is_present_array_h type (*), dimension (..), contiguous :: a acc_is_present_array_h = acc_is_present_l (a, sizeof (a)) /= 0 end function subroutine acc_copyin_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t, c_size_t, c_int use openacc_internal, only: acc_copyin_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async call acc_copyin_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_copyin_async_64_h 
(a, len, async) use iso_c_binding, only: c_int64_t, c_size_t, c_int use openacc_internal, only: acc_copyin_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async call acc_copyin_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_copyin_async_array_h (a, async) use iso_c_binding, only: c_int use openacc_internal, only: acc_copyin_async_l use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async call acc_copyin_async_l (a, sizeof (a), int (async, kind = c_int)) end subroutine subroutine acc_create_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t, c_size_t, c_int use openacc_internal, only: acc_create_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async call acc_create_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_create_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t, c_size_t, c_int use openacc_internal, only: acc_create_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async call acc_create_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_create_async_array_h (a, async) use iso_c_binding, only: c_int use openacc_internal, only: acc_create_async_l use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async call acc_create_async_l (a, sizeof (a), int (async, kind = c_int)) end subroutine subroutine acc_copyout_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t, c_size_t, c_int use openacc_internal, only: acc_copyout_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async call acc_copyout_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_copyout_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t, c_size_t, c_int use openacc_internal, only: acc_copyout_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async call acc_copyout_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_copyout_async_array_h (a, async) use iso_c_binding, only: c_int use openacc_internal, only: acc_copyout_async_l use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async call acc_copyout_async_l (a, sizeof (a), int (async, kind = c_int)) end subroutine subroutine acc_delete_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t, c_size_t, c_int use openacc_internal, only: acc_delete_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async call acc_delete_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_delete_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t, c_size_t, c_int use openacc_internal, only: acc_delete_async_l use openacc_kinds, 
only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async call acc_delete_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_delete_async_array_h (a, async) use iso_c_binding, only: c_int use openacc_internal, only: acc_delete_async_l use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async call acc_delete_async_l (a, sizeof (a), int (async, kind = c_int)) end subroutine subroutine acc_update_device_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t, c_size_t, c_int use openacc_internal, only: acc_update_device_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async call acc_update_device_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_update_device_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t, c_size_t, c_int use openacc_internal, only: acc_update_device_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async call acc_update_device_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_update_device_async_array_h (a, async) use iso_c_binding, only: c_int use openacc_internal, only: acc_update_device_async_l use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async call acc_update_device_async_l (a, sizeof (a), int (async, kind = c_int)) end subroutine subroutine acc_update_self_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t, c_size_t, c_int use openacc_internal, only: acc_update_self_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async call acc_update_self_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_update_self_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t, c_size_t, c_int use openacc_internal, only: acc_update_self_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async call acc_update_self_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_update_self_async_array_h (a, async) use iso_c_binding, only: c_int use openacc_internal, only: acc_update_self_async_l use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async call acc_update_self_async_l (a, sizeof (a), int (async, kind = c_int)) end subroutine
[gomp/finclude/openacc.mod and gomp/finclude/openacc_kinds.mod: compiled Fortran module files; their binary contents are omitted. The Fortran include-file source follows; its copyright header is truncated in this dump.]
! and Mentor Embedded ! NOTE: Due to the use of dimension (..), the code only works when compiled ! with -std=f2008ts/gnu/legacy but not with other standard settings. ! Alternatively, the user can use the module version, which permits ! compilation with -std=f95. !
Keep in sync with openacc.f90 and config/accel/openacc.f90. integer, parameter :: acc_device_kind = 4 ! integer (acc_device_kind), parameter :: acc_device_current = -1 integer (acc_device_kind), parameter :: acc_device_none = 0 integer (acc_device_kind), parameter :: acc_device_default = 1 integer (acc_device_kind), parameter :: acc_device_host = 2 ! integer (acc_device_kind), parameter :: acc_device_host_nonshm = 3 ! removed. integer (acc_device_kind), parameter :: acc_device_not_host = 4 integer (acc_device_kind), parameter :: acc_device_nvidia = 5 integer (acc_device_kind), parameter :: acc_device_radeon = 8 integer, parameter :: acc_device_property_kind = 4 ! OpenACC 2.6/2.7/3.0 used acc_device_property; in a spec update the ! missing '_kind' was added for consistency. For backward compatibility, keep: integer, parameter :: acc_device_property & & = acc_device_property_kind integer (acc_device_property_kind), parameter :: & & acc_property_memory = 1 integer (acc_device_property_kind), parameter :: & & acc_property_free_memory = 2 integer (acc_device_property_kind), parameter :: & & acc_property_name = int(Z'10001') integer (acc_device_property_kind), parameter :: & & acc_property_vendor = int(Z'10002') integer (acc_device_property_kind), parameter :: & & acc_property_driver = int(Z'10003') integer, parameter :: acc_handle_kind = 4 ! integer (acc_handle_kind), parameter :: acc_async_noval = -1 integer (acc_handle_kind), parameter :: acc_async_sync = -2 integer, parameter :: openacc_version = 201711 interface acc_get_num_devices function acc_get_num_devices_h (devicetype) import acc_device_kind integer acc_get_num_devices_h integer (acc_device_kind) devicetype end function end interface interface acc_set_device_type subroutine acc_set_device_type_h (devicetype) import acc_device_kind integer (acc_device_kind) devicetype end subroutine end interface interface acc_get_device_type function acc_get_device_type_h () import acc_device_kind integer (acc_device_kind) acc_get_device_type_h end function end interface interface acc_set_device_num subroutine acc_set_device_num_h (devicenum, devicetype) import acc_device_kind integer devicenum integer (acc_device_kind) devicetype end subroutine end interface interface acc_get_device_num function acc_get_device_num_h (devicetype) import acc_device_kind integer acc_get_device_num_h integer (acc_device_kind) devicetype end function end interface interface acc_get_property function acc_get_property_h (devicenum, devicetype, & & property) use iso_c_binding, only: c_size_t import acc_device_kind, acc_device_property_kind implicit none (type, external) integer (c_size_t) :: acc_get_property_h integer, value :: devicenum integer (acc_device_kind), value :: devicetype integer (acc_device_property_kind), value :: property end function end interface interface acc_get_property_string subroutine acc_get_property_string_h (devicenum, devicetype, & & property, string) import acc_device_kind, acc_device_property_kind implicit none (type, external) integer, value :: devicenum integer (acc_device_kind), value :: devicetype integer (acc_device_property_kind), value :: property character (*) :: string end subroutine end interface interface acc_async_test function acc_async_test_h (arg) logical acc_async_test_h integer arg end function end interface interface acc_async_test_all function acc_async_test_all_h () logical acc_async_test_all_h end function end interface interface acc_wait subroutine acc_wait_h (arg) integer arg end subroutine end interface ! 
acc_async_wait is an OpenACC 1.0 compatibility name for acc_wait. interface acc_async_wait procedure :: acc_wait_h end interface interface acc_wait_async subroutine acc_wait_async_h (arg, async) integer arg, async end subroutine end interface interface acc_wait_all subroutine acc_wait_all_h () end subroutine end interface ! acc_async_wait_all is an OpenACC 1.0 compatibility name for ! acc_wait_all. interface acc_async_wait_all procedure :: acc_wait_all_h end interface interface acc_wait_all_async subroutine acc_wait_all_async_h (async) integer async end subroutine end interface interface acc_init subroutine acc_init_h (devicetype) import acc_device_kind integer (acc_device_kind) devicetype end subroutine end interface interface acc_shutdown subroutine acc_shutdown_h (devicetype) import acc_device_kind integer (acc_device_kind) devicetype end subroutine end interface interface acc_on_device function acc_on_device_h (devicetype) import acc_device_kind logical acc_on_device_h integer (acc_device_kind) devicetype end function end interface ! acc_malloc: Only available in C/C++ ! acc_free: Only available in C/C++ interface acc_copyin subroutine acc_copyin_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_copyin_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_copyin_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface interface acc_present_or_copyin subroutine acc_present_or_copyin_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_present_or_copyin_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_present_or_copyin_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface interface acc_pcopyin procedure :: acc_present_or_copyin_32_h procedure :: acc_present_or_copyin_64_h procedure :: acc_present_or_copyin_array_h end interface interface acc_create subroutine acc_create_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_create_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_create_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface interface acc_present_or_create subroutine acc_present_or_create_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_present_or_create_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_present_or_create_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface interface acc_pcreate procedure :: acc_present_or_create_32_h procedure :: acc_present_or_create_64_h procedure :: acc_present_or_create_array_h end interface interface acc_copyout subroutine acc_copyout_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: 
a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_copyout_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_copyout_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface interface acc_copyout_finalize subroutine acc_copyout_finalize_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_copyout_finalize_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_copyout_finalize_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface interface acc_delete subroutine acc_delete_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_delete_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_delete_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface interface acc_delete_finalize subroutine acc_delete_finalize_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_delete_finalize_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_delete_finalize_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface interface acc_update_device subroutine acc_update_device_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_update_device_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_update_device_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface interface acc_update_self subroutine acc_update_self_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_update_self_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_update_self_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface ! acc_map_data: Only available in C/C++ ! acc_unmap_data: Only available in C/C++ ! acc_deviceptr: Only available in C/C++ ! 
acc_hostptr: Only available in C/C++ interface acc_is_present function acc_is_present_32_h (a, len) use iso_c_binding, only: c_int32_t logical acc_is_present_32_h !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end function function acc_is_present_64_h (a, len) use iso_c_binding, only: c_int64_t logical acc_is_present_64_h !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end function function acc_is_present_array_h (a) logical acc_is_present_array_h type (*), dimension (..), contiguous :: a end function end interface ! acc_memcpy_to_device: Only available in C/C++ ! acc_memcpy_from_device: Only available in C/C++ interface acc_copyin_async subroutine acc_copyin_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_copyin_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_copyin_async_array_h (a, async_) import acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async_ end subroutine end interface interface acc_create_async subroutine acc_create_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_create_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_create_async_array_h (a, async_) import acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async_ end subroutine end interface interface acc_copyout_async subroutine acc_copyout_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_copyout_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_copyout_async_array_h (a, async_) import acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async_ end subroutine end interface interface acc_delete_async subroutine acc_delete_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_delete_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_delete_async_array_h (a, async_) import acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async_ end subroutine end interface interface acc_update_device_async subroutine acc_update_device_async_32_h (a, len, async) use 
iso_c_binding, only: c_int32_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_update_device_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_update_device_async_array_h (a, async_) import acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async_ end subroutine end interface interface acc_update_self_async subroutine acc_update_self_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_update_self_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_update_self_async_array_h (a, async_) import acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async_ end subroutine end interface
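As a minimal usage sketch, and not part of the installed header or module sources themselves, the generic interfaces declared above can be driven from a host program along the following lines. It assumes a gfortran built with OpenACC support and a program compiled with -fopenacc; the program name, array, and queue number are illustrative only.

! Illustrative sketch only, assuming gfortran ... -fopenacc
program acc_usage_sketch
  use openacc
  implicit none
  real :: a(1024)
  integer :: ndev
  a = 1.0
  ! Query devices of a type declared in openacc_kinds above.
  ndev = acc_get_num_devices (acc_device_nvidia)
  if (ndev > 0) then
    call acc_init (acc_device_nvidia)
    ! The generic acc_copyin resolves to acc_copyin_array_h for a
    ! contiguous array argument (no byte count needed).
    call acc_copyin (a)
    ! Asynchronous copy-out on queue 1, then wait for that queue;
    ! the literal 1 matches acc_handle_kind = 4.
    call acc_copyout_async (a, 1)
    call acc_wait (1)
    call acc_shutdown (acc_device_nvidia)
  end if
end program acc_usage_sketch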
[caf_single.a (member single.o): nvptx PTX object code for the single-image coarray runtime. It provides the entry points _gfortran_caf_init, _gfortran_caf_finalize, _gfortran_caf_this_image, _gfortran_caf_register, _gfortran_caf_deregister, _gfortran_caf_sync_all, _gfortran_caf_sync_memory, _gfortran_caf_sync_images, the stop/error-stop and image-status routines, _gfortran_caf_co_broadcast/_co_sum/_co_min/_co_max/_co_reduce, _gfortran_caf_get/_send/_sendget and their *_by_ref variants, the atomic, event and lock operations, and _gfortran_caf_is_present. The PTX assembly and its string-constant data are omitted here.]
%r724,%r317,%r725; .loc 1 1365 21 mul.lo.u64 %r150,%r724,%r134; .loc 1 1368 15 mul.lo.u64 %r135,%r122,%r134; .loc 1 1370 23 mov.u64 %r726,0; st.u64 [%r159],%r726; .loc 1 1332 8 add.u64 %r318,%r333,1; .loc 1 1376 41 setp.ne.u64 %r727,%r125,1; @ ! %r727 bra $L229; mul.lo.u64 %r1221,%r332,3; bra $L147; $L229: .loc 1 1375 12 add.u64 %r1224,%r332,%r332; add.u64 %r1221,%r1224,%r332; shl.b64 %r731,%r1221,3; add.u64 %r732,%r326,%r731; ld.u64 %r735,[%r732+56]; ld.u64 %r743,[%r732+48]; .loc 1 1375 8 setp.ne.u64 %r744,%r735,%r743; @ %r744 bra $L148; $L147: .loc 1 1376 41 add.u64 %r140,%r332,1; .loc 1 1377 20 mov.u64 %r152,0; .loc 1 1384 8 shl.b64 %r764,%r1221,3; add.u64 %r765,%r326,%r764; add.u64 %r766,%r765,40; $L149: .loc 1 1380 33 ld.u64 %r759,[%r323+16]; .loc 1 1379 8 mad.lo.u64 %r760,%r150,%r7593324333760; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r330; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r331; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r140; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r318; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r334; .param .u32 %out_arg13; st.param.u32 [%out_arg13],%r335; call get_for_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13); } .loc 1 1384 5 ld.u64 %r768,[%r159]; ld.u64 %r769,[%r766]; add.u64 %r767,%r768,%r769; st.u64 [%r159],%r767; .loc 1 1385 25 add.u64 %r150,%r150,%r135; .loc 1 1377 47 add.u64 %r152,%r152,1; .loc 1 1377 4 setp.ne.u64 %r770,%r125,%r152; @ %r770 bra $L149; bra $L105; $L125: .loc 1 1389 47 add.u64 %r772,%r333,%r333; add.u64 %r773,%r772,%r333; shl.b64 %r774,%r773,3; add.u64 %r775,%r323,%r774; .loc 1 1390 29 add.u64 %r781,%r327,%r774; .loc 1 1390 10 ld.u64 %r784,[%r775+48]; ld.u64 %r785,[%r781+48]; sub.u64 %r783,%r784,%r785; .loc 1 1389 21 ld.u64 %r792,[%r781+40]; mul.lo.u64 %r157,%r783,%r792; .loc 1 1392 23 st.u64 [%r159],%r1247; .loc 1 1393 4 add.u64 %r333,%r333,1; .loc 1 1394 32 ld.u64 %r795,[%r323+16]; .loc 1 1393 4 mad.lo.u64 %r329,%r157,%r795,%r329; .loc 1 1397 4 bra $L107; $L130: .loc 1 1399 4 add.u64 %r1225,%r333,%r333; add.u64 %r1222,%r1225,%r333; shl.b64 %r799,%r1222,3; add.u64 %r800,%r323,%r799; ld.u64 %r163,[%r800+64]; abs.s64 %r164,%r163; .loc 1 1341 4 add.u64 %r806,%r327,%r799; ld.u64 %r293,[%r806+56]; .loc 1 1361 4 ld.u64 %r316,[%r800+48]; .loc 1 1399 4 setp.le.s64 %r814,%r163,0; @ %r814 bra $L150; add.u64 %r815,%r293,1; sub.u64 %r166,%r815,%r316; bra $L151; $L150: add.u64 %r816,%r316,1; sub.u64 %r166,%r816,%r293; $L151: set.u32.le.s64 %r818,%r166,0; neg.s32 %r819,%r818; set.u32.le.s64 %r821,%r164,0; neg.s32 %r822,%r821; cvt.u16.u32 %r824,%r819; cvt.u16.u32 %r825,%r822; or.b16 %r823,%r824,%r825; cvt.u32.u16 %r826,%r823; cvt.u16.u8 %r827,%r826; setp.ne.u16 %r828,%r827,0; @ %r828 bra $L105; setp.eq.u64 %r829,%r164,1; @ %r829 bra $L152; add.u64 %r830,%r166,-1; div.s64 %r834,%r830,%r164; add.u64 %r166,%r834,1; $L152: .loc 1 1403 17 shl.b64 %r838,%r1222,3; add.u64 %r839,%r327,%r838; ld.u64 %r173,[%r839+40]; .loc 1 1403 15 mul.lo.u64 %r174,%r163,%r173; .loc 1 1406 10 ld.u64 %r848,[%r839+48]; sub.u64 %r847,%r316,%r848; .loc 1 1405 21 mul.lo.u64 %r185,%r847,%r173; .loc 1 1408 23 mov.u64 %r849,0; st.u64 [%r159],%r849; .loc 1 1332 8 add.u64 %r306,%r332,1; add.u64 %r311,%r333,1; .loc 1 1409 20 mov.u64 %r187,%r849; mul.lo.u64 %r1221,%r332,3; .loc 1 1416 8 shl.b64 %r869,%r1221,3; add.u64 %r870,%r326,%r869; add.u64 %r871,%r870,40; $L153: .loc 1 1412 33 ld.u64 %r864,[%r323+16]; .loc 1 1411 8 mad.lo.u64 %r865,%r185,%r8643324333865; .param .u32 %out_arg8; st.param.u32 
[%out_arg8],%r330; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r331; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r306; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r311; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r334; .param .u32 %out_arg13; st.param.u32 [%out_arg13],%r335; call get_for_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13); } .loc 1 1416 5 ld.u64 %r873,[%r159]; ld.u64 %r874,[%r871]; add.u64 %r872,%r873,%r874; st.u64 [%r159],%r872; .loc 1 1417 25 add.u64 %r185,%r185,%r174; .loc 1 1409 47 add.u64 %r187,%r187,1; .loc 1 1409 4 setp.ne.u64 %r875,%r187,%r166; @ %r875 bra $L153; bra $L105; $L131: .loc 1 1421 4 add.u64 %r1225,%r333,%r333; add.u64 %r1222,%r1225,%r333; shl.b64 %r879,%r1222,3; add.u64 %r880,%r323,%r879; ld.u64 %r188,[%r880+64]; abs.s64 %r189,%r188; .loc 1 1341 4 add.u64 %r886,%r327,%r879; ld.u64 %r292,[%r886+48]; .loc 1 1361 4 ld.u64 %r315,[%r880+56]; .loc 1 1421 4 setp.le.s64 %r894,%r188,0; @ %r894 bra $L154; add.u64 %r895,%r315,1; sub.u64 %r191,%r895,%r292; bra $L155; $L154: add.u64 %r896,%r292,1; sub.u64 %r191,%r896,%r315; $L155: set.u32.le.s64 %r898,%r191,0; neg.s32 %r899,%r898; set.u32.le.s64 %r901,%r189,0; neg.s32 %r902,%r901; cvt.u16.u32 %r904,%r899; cvt.u16.u32 %r905,%r902; or.b16 %r903,%r904,%r905; cvt.u32.u16 %r906,%r903; cvt.u16.u8 %r907,%r906; setp.ne.u16 %r908,%r907,0; @ %r908 bra $L105; setp.eq.u64 %r909,%r189,1; @ %r909 bra $L156; add.u64 %r910,%r191,-1; div.s64 %r914,%r910,%r189; add.u64 %r191,%r914,1; $L156: .loc 1 1425 17 shl.b64 %r918,%r1222,3; add.u64 %r919,%r327,%r918; .loc 1 1425 15 ld.u64 %r921,[%r919+40]; mul.lo.u64 %r199,%r188,%r921; .loc 1 1428 23 mov.u64 %r922,0; st.u64 [%r159],%r922; .loc 1 1332 8 add.u64 %r322,%r332,1; add.u64 %r307,%r333,1; .loc 1 1429 20 mov.u64 %r210,%r922; .loc 1 1427 21 mov.u64 %r208,%r210; mul.lo.u64 %r1221,%r332,3; .loc 1 1436 8 shl.b64 %r942,%r1221,3; add.u64 %r943,%r326,%r942; add.u64 %r944,%r943,40; $L157: .loc 1 1432 33 ld.u64 %r937,[%r323+16]; .loc 1 1431 8 mad.lo.u64 %r938,%r208,3324333938; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r330; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r331; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r322; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r307; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r334; .param .u32 %out_arg13; st.param.u32 [%out_arg13],%r335; call get_for_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13); } .loc 1 1436 5 ld.u64 %r946,[%r159]; ld.u64 %r947,[%r944]; add.u64 %r945,%r946,%r947; st.u64 [%r159],%r945; .loc 1 1437 25 add.u64 %r208,%r208,%r199; .loc 1 1429 47 add.u64 %r210,%r210,1; .loc 1 1429 4 setp.ne.u64 %r948,%r210,%r191; @ %r948 bra $L157; bra $L105; $L119: .loc 1 1239 21 add.u64 %r949,%r323,%r333; ld.u8 %r314,[%r949+24]; .loc 1 1445 10 setp.eq.u32 %r951,%r314,0; @ %r951 bra $L182; $L180: .loc 1 1452 7 cvt.u16.u32 %r952,%r314; setp.eq.u16 %r953,%r952,3; @ %r953 bra $L159; setp.gt.u16 %r955,%r952,3; @ %r955 bra $L160; setp.eq.u16 %r957,%r952,1; @ %r957 bra $L161; setp.eq.u16 %r959,%r952,2; @ %r959 bra $L162; bra $L110; $L160: setp.eq.u16 %r961,%r952,4; @ %r961 bra $L163; bra $L110; $L161: .loc 1 1456 23 mov.u64 %r962,0; st.u64 [%r159],%r962; .loc 1 1457 54 add.u64 %r1225,%r333,%r333; add.u64 %r1222,%r1225,%r333; shl.b64 %r966,%r1222,3; add.u64 %r967,%r323,%r966; add.u64 %r968,%r967,56; .loc 1 1457 4 ld.u64 %r969,[%r968]; setp.eq.u64 %r970,%r969,0; @ %r970 bra $L105; .loc 1 1457 16 mov.u64 %r237,%r962; .loc 1 1464 39 
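// Annotation (hedged reading of the dump, not part of the original text): the $L171 loop that
// follows appears to walk a vector of array indices. It switches on the element kind read from
// [%r976] (1, 2, 4, 8 or 16 bytes), loads each index with the matching signed or unsigned load,
// folds it into the source offset with mad.lo.u64, recurses into get_for_ref$constprop$0$isra$0,
// and accumulates the transferred size into [%r159]. The "vector subscript" interpretation is an
// assumption inferred from the surrounding instructions only.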
add.u64 %r976,%r967,64; .loc 1 1468 3 add.u64 %r1226,%r967,48; .loc 1 1481 8 add.u64 %r1227,%r333,1; add.u64 %r1228,%r332,1; .loc 1 1486 8 add.u64 %r1229,%r332,%r332; add.u64 %r1230,%r1229,%r332; shl.b64 %r1231,%r1230,3; add.u64 %r1232,%r326,%r1231; add.u64 %r1233,%r1232,40; $L171: .loc 1 1464 39 ld.u32 %r211,[%r976]; setp.eq.u32 %r977,%r211,4; @ %r977 bra $L164; setp.gt.s32 %r978,%r211,4; @ %r978 bra $L165; setp.eq.u32 %r979,%r211,1; @ %r979 bra $L166; setp.eq.u32 %r980,%r211,2; @ %r980 bra $L167; bra $L110; $L165: setp.eq.u32 %r981,%r211,8; @ %r981 bra $L168; setp.eq.u32 %r982,%r211,16; @ %r982 bra $L169; bra $L110; $L166: .loc 1 1466 3 ld.u64 %r989,[%r1226]; add.u64 %r990,%r989,%r237; ld.s8 %r215,[%r990]; bra $L170; $L167: .loc 1 1467 3 ld.u64 %r997,[%r1226]; add.u64 %r998,%r237,%r237; add.u64 %r999,%r997,%r998; ld.s16 %r215,[%r999]; bra $L170; $L164: .loc 1 1468 3 ld.u64 %r1006,[%r1226]; shl.b64 %r1007,%r237,2; add.u64 %r1008,%r1006,%r1007; ld.s32 %r215,[%r1008]; bra $L170; $L168: .loc 1 1470 3 ld.u64 %r1015,[%r1226]; shl.b64 %r1016,%r237,3; add.u64 %r1017,%r1015,%r1016; ld.u64 %r215,[%r1017]; bra $L170; $L169: .loc 1 1473 3 ld.u64 %r1024,[%r1226]; shl.b64 %r1025,%r237,4; add.u64 %r1026,%r1024,%r1025; ld.u64 %r215,[%r1026]; $L170: .loc 1 1482 33 ld.u64 %r1043,[%r323+16]; .loc 1 1481 8 mad.lo.u64 %r1044,%r215,%r1043,%r329; mov.u64 %r1031,332433103144; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r330; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r331; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r1228; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r1227; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r334; .param .u32 %out_arg13; st.param.u32 [%out_arg13],%r335; call get_for_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13); } .loc 1 1486 5 ld.u64 %r1052,[%r159]; ld.u64 %r1053,[%r1233]; add.u64 %r1051,%r1052,%r1053; st.u64 [%r159],%r1051; .loc 1 1458 9 add.u64 %r237,%r237,1; .loc 1 1457 4 ld.u64 %r1060,[%r968]; setp.lt.u64 %r1061,%r237,%r1060; @ %r1061 bra $L171; bra $L105; $L162: .loc 1 1490 23 mov.u64 %r1062,0; st.u64 [%r159],%r1062; .loc 1 1492 52 add.u64 %r1225,%r333,%r333; add.u64 %r1222,%r1225,%r333; shl.b64 %r1066,%r1222,3; add.u64 %r1067,%r323,%r1066; add.u64 %r1068,%r1067,56; .loc 1 1491 4 ld.u64 %r1069,[%r1068]; setp.lt.s64 %r1070,%r1069,0; @ %r1070 bra $L105; .loc 1 1332 8 add.u64 %r309,%r332,1; add.u64 %r308,%r333,1; .loc 1 1491 26 mov.u64 %r44,%r1062; mul.lo.u64 %r1221,%r332,3; .loc 1 1500 8 shl.b64 %r1090,%r1221,3; add.u64 %r1091,%r326,%r1090; add.u64 %r1092,%r1091,40; .loc 1 1493 52 add.u64 %r1101,%r1067,64; $L172: .loc 1 1496 33 ld.u64 %r1085,[%r323+16]; .loc 1 1495 8 mad.lo.u64 %r1086,%r44,%r1085,%r329; mov.u64 %r1075,332433107586; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r330; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r331; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r309; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r308; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r334; .param .u32 %out_arg13; st.param.u32 [%out_arg13],%r335; call get_for_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13); } .loc 1 1500 5 ld.u64 %r1094,[%r159]; ld.u64 %r1095,[%r1092]; add.u64 %r1093,%r1094,%r1095; st.u64 [%r159],%r1093; .loc 1 1493 26 ld.u64 %r1102,[%r1101]; add.u64 %r44,%r44,%r1102; .loc 1 1491 4 ld.u64 %r1109,[%r1068]; setp.le.s64 %r1110,%r44,%r1109; @ %r1110 bra $L172; bra $L105; $L159: .loc 1 1504 4 add.u64 %r1225,%r333,%r333; add.u64 
%r1222,%r1225,%r333; shl.b64 %r1114,%r1222,3; add.u64 %r1115,%r323,%r1114; ld.u64 %r249,[%r1115+64]; abs.s64 %r250,%r249; .loc 1 1361 4 ld.u64 %r291,[%r1115+56]; ld.u64 %r269,[%r1115+48]; .loc 1 1504 4 setp.le.s64 %r1129,%r249,0; @ %r1129 bra $L173; add.u64 %r1130,%r291,1; sub.u64 %r252,%r1130,%r269; bra $L174; $L173: add.u64 %r1131,%r269,1; sub.u64 %r252,%r1131,%r291; $L174: set.u32.le.s64 %r1133,%r252,0; neg.s32 %r1134,%r1133; set.u32.le.s64 %r1136,%r250,0; neg.s32 %r1137,%r1136; cvt.u16.u32 %r1139,%r1134; cvt.u16.u32 %r1140,%r1137; or.b16 %r1138,%r1139,%r1140; cvt.u32.u16 %r1141,%r1138; cvt.u16.u8 %r1142,%r1141; setp.ne.u16 %r1143,%r1142,0; @ %r1143 bra $L105; setp.eq.u64 %r1144,%r250,1; @ %r1144 bra $L175; add.u64 %r1145,%r252,-1; div.s64 %r1149,%r1145,%r250; add.u64 %r252,%r1149,1; $L175: .loc 1 1509 23 mov.u64 %r1150,0; st.u64 [%r159],%r1150; .loc 1 1332 8 add.u64 %r313,%r332,1; add.u64 %r312,%r333,1; .loc 1 1510 20 mov.u64 %r271,%r1150; mul.lo.u64 %r1221,%r332,3; .loc 1 1517 8 shl.b64 %r1170,%r1221,3; add.u64 %r1171,%r326,%r1170; add.u64 %r1172,%r1171,40; .loc 1 1518 51 shl.b64 %r1179,%r1222,3; add.u64 %r1180,%r323,%r1179; add.u64 %r1181,%r1180,64; $L176: .loc 1 1513 33 ld.u64 %r1165,[%r323+16]; .loc 1 1512 8 mad.lo.u64 %r1166,%r269,%r1165,%r329; mov.u64 %r1155,3324331155166; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r330; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r331; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r313; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r312; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r334; .param .u32 %out_arg13; st.param.u32 [%out_arg13],%r335; call get_for_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13); } .loc 1 1517 5 ld.u64 %r1174,[%r159]; ld.u64 %r1175,[%r1172]; add.u64 %r1173,%r1174,%r1175; st.u64 [%r159],%r1173; .loc 1 1518 25 ld.u64 %r1182,[%r1181]; add.u64 %r269,%r269,%r1182; .loc 1 1510 47 add.u64 %r271,%r271,1; .loc 1 1510 4 setp.ne.u64 %r1183,%r271,%r252; @ %r1183 bra $L176; bra $L105; $L163: .loc 1 1522 21 add.u64 %r1185,%r333,%r333; add.u64 %r1186,%r1185,%r333; shl.b64 %r1187,%r1186,3; add.u64 %r1188,%r323,%r1187; ld.u64 %r272,[%r1188+48]; .loc 1 1523 4 add.u64 %r333,%r333,1; .loc 1 1524 32 ld.u64 %r1191,[%r323+16]; .loc 1 1523 4 mad.lo.u64 %r329,%r272,%r1191,%r329; mov.u64 %r327,%r1247; .loc 1 1527 4 bra $L107; $L121: .loc 1 1268 18 setp.eq.u32 %r1192,%r1223,0; @ %r1192 bra $L177; .loc 1 1266 10 mov.u64 %r51,%r1247; bra $L122; $L148: .loc 1 1380 33 ld.u64 %r1207,[%r323+16]; .loc 1 1379 8 mad.lo.u64 %r1208,%r150,%r1203324333208; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r330; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r331; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r332; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r318; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r334; .param .u32 %out_arg13; st.param.u32 [%out_arg13],%r335; call get_for_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13); } .loc 1 1384 5 ld.u64 %r1216,[%r159]; ld.u64 %r1217,[%r732+40]; add.u64 %r1215,%r1216,%r1217; st.u64 [%r159],%r1215; bra $L105; $L228: .loc 1 1256 3 setp.eq.u32 %r1218,%r302,1; @ %r1218 bra $L179; setp.eq.u32 %r1219,%r302,2; @ ! 
%r1219 bra $L110; bra $L180; $L181: .loc 1 1282 2 mov.u64 %r329,%r327; ld.u32 %r1223,[%r303+8];_gfortran_caf_init .visible .func _gfortran_caf_in_gfortran_caf_finalize .visible .func _gfortran_caf_finalize {u64 %r27; .reg .u64 %r29; .reg .pred %r30; .loc 1 110 26 ld.global.u64 %r24,[caf_static_list]; .loc 1 110 9 setp.eq.u64 %r26,%r24,0; @ %r26 bra $L231; $L233: mov.u64 %r25,%r24; .loc 1 112 21 ld.u64 %r24,[%r24+8]; .loc 1 113 7 ld.u64 %r214 7 ld.global.u64 %r29,[caf_static_list];115 23 st.global.u64 [caf_static_list],%r24; .loc 1 110 9 setp.ne.u64 %r30,%r24,0; @ %r30 bra $L233; $L23_gfortran_caf_this_imag_gfortran_caf_this_imageloc 1 123 10 mov.u32 %value,1_gfortran_caf_gfortran_caf_num_imagesloc 1 131 10 mov.u32 %value,1_gfortran_caf_register .visible .func _gfortran_caf_registerpredpredpred %r93; .reg .predpredpred %r110; mov.u64 %r36,%ar0; mov.u32 %r37,%ar1; mov.u64 %r38,%ar2; mov.u64 %r39,%ar3; mov.u64 %r40,%ar4; mov.u64 %r41,%ar5; mov.u64 %r42,%ar6; .loc 1 140 14 cvta.const.u64 %r44,$LC2; mov.u64 %r45,2-2; .loc 1 144 6 setp.gt.u32 %r52,%r51,2; @ %r52 bra $L241; .loc 1 146 13 mov.u64 654; call (%value_in),cmov.u64 %r34,%r55; bra $L242; $L241: .loc 1 147 45 add.u32 %r57,%r37,-5; .loc 1 147 11 setp.gt.u32 %r58,%r57,1; @ %r58 bra $L243; .loc 1 150 13 mov.u64 %r60cmov.u64 %r34,%r61; bra $L242; $L243: .loc 1 151 11 setp.eq.u32 %r63,%r37,7; @ %r63 bra $L244; .loc 1 154 13mov.u64 %r34,%r65; .loc 1 156 6 setp.ne.u32 %r67,%r37,8; @ %r67 bra $L242; .loc 1 159 7 ld.u64 %r33,[%r38]; bra $L245; $L242: .loc 1 157 14 mov.u64 %r68mov.u64 %r33,%r69; .loc 1 157 12 st.u64 [%r38],%r33; $L245: .loc 1 159 6 setp.eq.u64 %r71,%r33,0; @ %r71 bra $L246; .loc 1 159 7 set.u32.ne.u32 %r73,%r37,7; neg.s32 %r74,%r73; cvt.u32.u32 %r72,%r74; cvt.u32.u8 %r25,%r72; set.u32.eq.u64 %r76,%r34,0; neg.s32 %r77,%r76;@ ! 
%r83 bra $L248; bra $L247; $L254:$L255: .loc 1 170 7 {stack; call caf_internal_error); } bra $L240; $L248: .loc 1 175 24 st.u64 [%r33],%r34; .loc 1 176 31 cvt.u32.u32 %r90,%r25; st.u8 [%r33+16],%r90; .loc 1 177 62 ld.s8 %r92,[%r39+28]; cvt.u16.u32 %r91,%r92; setp.gt.s16 %r93,%r91,0; selp.u64 %r31,%r39,0,%r93; .loc 1 177 22 st.u64 [%r33+8],%r31; .loc 1 180 6 setp.eq.u64 %r94,%r40,0; @ %r94 bra $L251; .loc 1 181 11 mov.u32 %r95,0; st.u32 [%r40],%r95; $L251: .loc 1 183 12 and.b32 %r96,%r37,-3; .loc 1 183 6 setp.eq.u32 %r97,%r96,0; @ %r97 bra $L252; .loc 1 184 39 add.u32 %r98,%r37,-4; .loc 1 185 7 setp.gt.u32 %r99,%r98,2; @ %r99 bra $L253; $L252: .loc 1 187 27 mov.u64 %r100,16101,[%value_in]; } .loc 1 188 18 ld.global.u64 %r103,[caf_static_list]; st.u64 [%r101+8],%r103; .loc 1 189 18 ld.u64 %r104,[%r38]; st.u64 [%r101],%r104; .loc 1 190 23 st.global.u64 [caf_static_list],%r101; $L253: .loc 1 192 30 st.u64 [%r39],%r34; bra $L240; $L246: .loc 1 166 10 setp.ne.u64 %r105,%r34,0; @ %r105 bra $L254; bra $L255; $L247:bra $L255; $L244: .loc 1 157 1407108,[%value_in]; } mov.u64 %r33,%r108; .loc 1 157 12 st.u64 [%r38],%r33; .loc 1 159 6 setp.eq.u64 %r110,%r33,0; @ %r110 bra $L255; .loc 1 152 11 mov.u64 %r34,0; .loc 1 159 7 cvt.u32.u64 %r25,%r34; bra $L248;FUNCTION DEF: send_by_ref$constprop$0$isra$0 .func send_by_ref$constprop$0$isra$0, .param .u64 %in_ar11, .param .u64 %in_ar12, .param .u32 %in_ar13) {ar11; ld.param.u64 %ar11,[%in_ar11]; .reg .u64 %ar12; ld.param.u64 %ar12,[%in_ar12]; .reg .u32 %ar13; ld.param.u32 %ar13,[%in_ar184u64u64 %r304; .reg .u64 %r305; .reg .u32 %r306; .reg .u64u64 %r313; .reg .u64u64 %r327; .reg .u64u64u64 %r357; .reg .u64 %r358; .reg .u32 %r359; .reg .u64u64 %r365; .reg .u64 %r366; .reg .u32 %r367; .reg .u32u64 %r375; .reg .u64 %r376; .reg .u64 %r380; .reg .predpred %r386; .reg .u64predu64 %r431; .reg .u32predpredpred %r498; .reg .u32 %r499; .reg .pred %r500; .reg .u32 %r501; .reg .pred %r502; .reg .u64 %r512; .reg .u64 %r517; .reg .u64 %r527; .reg .u64 %r532; .reg .pred %r534; .reg .u64 %r544; .reg .predpred %r557; .reg .pred %r559; .reg .pred %r561; .reg .pred %r563; .reg .u64 %r564; .reg .u64 %r565; .reg .u64 %r569; .reg .u64 %r570; .reg .u64 %r571; .reg .u64 %r572; .reg .pred %r573; .reg .u64 %r579; .reg .pred %r580; .reg .pred %r581; .reg .pred %r582; .reg .pred %r583; .reg .pred %r584; .reg .pred %r585; .reg .u64 %r592; .reg .u64 %r593; .reg .u64 %r594; .reg .u64 %r601; .reg .u64 %r602; .reg .u64 %r609; .reg .u64 %r616; .reg .u64 %r617; .reg .u64 %r618; .reg .u64 %r619; .reg .u64 %r626; .reg .u64 %r627; .reg .u64 %r634; .reg .u64 %r641; .reg .u64 %r642; .reg .u64 %r643; .reg .u64 %r644; .reg .u64 %r651; .reg .u64 %r652; .reg .u64u64 %r676; .reg .u64 %r677; .reg .u64 %r684; .reg .u64 %r691; .reg .u64 %r692; .reg .u64 %r693; .reg .u64 %r700; .reg .u64 %r701; .reg .u64 %r702; .reg .u64 %r709; .reg .u64 %r713; .reg .u64 %r732; .reg .u64 %r733; .reg .u64 %r741; .reg .u64 %r742; .reg .u64 %r743; .reg .u64 %r750; .reg .pred %r751; .reg .u64 %r755; .reg .u64 %r756; .reg .u64 %r762; .reg .pred %r770; .reg .u64 %r771; .reg .u64 %r772; .reg .u32 %r774; .reg .u32 %r775; .reg .u32 %r777; .reg .u32 %r778; .reg .u16 %r779; .reg .u16 %r780; .reg .u16 %r781; .reg .u32 %r782; .reg .u16 %r783; .reg .pred %r784; .reg .pred %r785; .reg .u64 %r786; .reg .u64 %r790; .reg .u64 %r794; .reg .u64 %r795; .reg .u64 %r797; .reg .u64 %r798; .reg .u64 %r799; .reg .u64 %r815; .reg .u64 %r816; .reg .u64 %r824; .reg .u64 %r825; .reg .u64 %r826; .reg .pred %r827; .reg .u64 %r831; .reg .u64 %r832; .reg 
.pred %r846; .reg .u64 %r847; .reg .u64 %r848; .reg .u32 %r850; .reg .u32 %r851; .reg .u32 %r853; .reg .u32 %r854; .reg .u16 %r855; .reg .u16 %r856; .reg .u16 %r857; .reg .u32 %r858; .reg .u16 %r859; .reg .pred %r860; .reg .pred %r861; .reg .u64 %r862; .reg .u64 %r866; .reg .u64 %r870; .reg .u64 %r871; .reg .u64 %r873; .reg .u64 %r880; .reg .u64 %r881; .reg .u64 %r882; .reg .u64 %r898; .reg .u64 %r899; .reg .u64 %r907; .reg .u64 %r908; .reg .u64 %r909; .reg .pred %r910; .reg .u64 %r925; .reg .u64 %r927; .reg .u64 %r928; .reg .u64 %r929; .reg .u64 %r930; .reg .u64 %r936; .reg .u64 %r938; .reg .u64 %r939; .reg .u64 %r940; .reg .u64 %r947; .reg .u64 %r948; .reg .u64 %r950; .reg .u64 %r951; .reg .u64 %r955; .reg .u64 %r956; .reg .u64 %r962; .reg .pred %r970; .reg .u64 %r971; .reg .u64 %r972; .reg .u32 %r974; .reg .u32 %r975; .reg .u32 %r977; .reg .u32 %r978; .reg .u16 %r979; .reg .u16 %r980; .reg .u16 %r981; .reg .u32 %r982; .reg .u16 %r983; .reg .pred %r984; .reg .pred %r985; .reg .u64 %r986; .reg .u64 %r990; .reg .u64 %r994; .reg .u64 %r995; .reg .u64 %r997; .reg .u64 %r1004; .reg .u64 %r1005; .reg .u64 %r1006; .reg .u64 %r1022; .reg .u64 %r1023; .reg .u64 %r1031; .reg .u64 %r1032; .reg .u64 %r1033; .reg .pred %r1034; .reg .u64 %r1038; .reg .u64 %r1039; .reg .u64 %r1045; .reg .pred %r1053; .reg .u64 %r1054; .reg .u64 %r1055; .reg .u32 %r1057; .reg .u32 %r1058; .reg .u32 %r1060; .reg .u32 %r1061; .reg .u16 %r1062; .reg .u16 %r1063; .reg .u16 %r1064; .reg .u32 %r1065; .reg .u16 %r1066; .reg .pred %r1067; .reg .pred %r1068; .reg .u64 %r1069; .reg .u64 %r1073; .reg .u64 %r1077; .reg .u64 %r1078; .reg .u64 %r1080; .reg .u64 %r1081; .reg .u64 %r1082; .reg .u64 %r1098; .reg .u64 %r1099; .reg .u64 %r1107; .reg .u64 %r1108; .reg .u64 %r1109; .reg .pred %r1110; .reg .u64 %r1111; .reg .pred %r1113; .reg .u64 %r1123; .reg .u16 %r1128; .reg .pred %r1129; .reg .pred %r1131; .reg .pred %r1133; .reg .pred %r1135; .reg .pred %r1137; .reg .u64 %r1138; .reg .u64 %r1139; .reg .u64 %r1143; .reg .u64 %r1144; .reg .u64 %r1145; .reg .u64 %r1146; .reg .pred %r1147; .reg .u64 %r1153; .reg .pred %r1154; .reg .pred %r1155; .reg .pred %r1156; .reg .pred %r1157; .reg .pred %r1158; .reg .pred %r1159; .reg .u64 %r1166; .reg .u64 %r1167; .reg .u64 %r1174; .reg .u64 %r1175; .reg .u64 %r1176; .reg .u64 %r1183; .reg .u64 %r1184; .reg .u64 %r1185; .reg .u64 %r1192; .reg .u64 %r1193; .reg .u64 %r1194; .reg .u64 %r1201; .reg .u64 %r1202; .reg .u64 %r1203; .reg .u64 %r1207; .reg .u64 %r1221; .reg .u64 %r1222; .reg .u64 %r1229; .reg .u64 %r1230; .reg .u64 %r1231; .reg .u64 %r1238; .reg .pred %r1239; .reg .u64 %r1240; .reg .u64 %r1241; .reg .u64 %r1245; .reg .u64 %r1246; .reg .u64 %r1247; .reg .u64 %r1248; .reg .pred %r1249; .reg .u64 %r1253; .reg .u64 %r1265; .reg .u64 %r1266; .reg .u64 %r1274; .reg .u64 %r1275; .reg .u64 %r1276; .reg .u64 %r1282; .reg .u64 %r1283; .reg .u64 %r1290; .reg .pred %r1291; .reg .u64 %r1295; .reg .u64 %r1296; .reg .pred %r1310; .reg .u64 %r1311; .reg .u64 %r1312; .reg .u32 %r1314; .reg .u32 %r1315; .reg .u32 %r1317; .reg .u32 %r1318; .reg .u16 %r1319; .reg .u16 %r1320; .reg .u16 %r1321; .reg .u32 %r1322; .reg .u16 %r1323; .reg .pred %r1324; .reg .pred %r1325; .reg .u64 %r1326; .reg .u64 %r1330; .reg .u64 %r1331; .reg .u64 %r1332; .reg .u64 %r1336; .reg .u64 %r1348; .reg .u64 %r1349; .reg .u64 %r1357; .reg .u64 %r1358; .reg .u64 %r1359; .reg .u64 %r1363; .reg .u64 %r1364; .reg .u64 %r1365; .reg .u64 %r1366; .reg .pred %r1367; .reg .u64 %r1371; .reg .u64 %r1382; .reg .u64 %r1384; .reg .u64 %r1385; .reg 
.u64 %r1386; .reg .u64 %r1387; .reg .u64 %r1390; .reg .u64 %r1391; .reg .u64 %r1392; .reg .pred %r1393; .reg .pred %r1394; .reg .u64 %r1395; .reg .u64 %r1396; .reg .u64 %r1397; .reg .u64 %r1398; .reg .pred %r1399; .reg .u32 %r1400; .reg .u64 %r1401; .reg .u64 %r1402; .reg .u64 %r1403; .reg .u64 %r1404; .reg .u64 %r1405; .reg .u64 %r1406; .reg .u64 %r1407; .reg .u64 %r1408; .reg .u64 %r1409; .reg .u64 %r1410; .reg .u64 %r1411; .reg .u64 %r1412; .reg .u64 %r1413; .reg .u64 %r1415; .reg .u64 %r1416; .reg .u64 %r1417; .reg .u64 %r1418; .reg .u64 %r1419; .reg .u64 %r1420; .reg .u64 %r1421; .reg .u64 %r1422; .reg .u64 %r1423; .reg .u64 %r1424; .reg .u64 %r1425; .reg .pred %r1426; .reg .u64 %r1427; .reg .u64 %r1428; .reg .u64 %r1429; .reg .u64 %r1430; .reg .u64 %r1431; .reg .u64 %r1433; .reg .u64 %r1434; .reg .u64 %r1435; .reg .u64 %r1436; .reg .u64 %r1437; .reg .u64 %r1438; .reg .u64 %r1439; .reg .u64 %r1440; .reg .u64 %r1441; .reg .u64 %r1442; .reg .u64 %r1443; .reg .u64 %r1444; .reg .u64 %r1445; .reg .u64 %r1446; .reg .u64 %r1447; .reg .u64 %r1448; .reg .u64 %r1449; .reg .u64 %r1450; .reg .u64 %r1451; .reg .u64 %r1452; mov.u64 %r360,%ar0; mov.u64 %r361,%ar1; mov.u64 %r362,%ar2; mov.u64 %r363,%ar3; mov.u64 %r364,%ar4; mov.u64 %r365,%ar5; mov.u64 %r366,%ar6; mov.u32 %r367,%ar7; mov.u32 %r368,%ar8; mov.u64 %r369,%ar9; mov.u64 %r370,%ar10; mov.u64 %r371,%ar11; mov.u64 %r372,%ar12; mov.u32 %r373,%ar13; .loc 1 2030 14 cvta.const.u64 %r375,$LC3; mov.u64 %r376,63753760,[%value_in]; } .loc 1 2033 16 ld.s8 %r23,[%r364+28]; .loc 1 2035 6 setp.eq.u64 %r382,%r360,0; @ %r382 bra $L266; .loc 1 2040 10 ld.u64 %r346,[%r360]; .loc 1 2045 18 ld.u32 %r359,[%r360+8]; .loc 1 2040 6 setp.ne.u64 %r383,%r346,0; @ %r383 bra $L268; .loc 1 2042 14 ld.u64 %r24,[%r364+16]; .loc 1 2045 7 setp.eq.u32 %r384,%r359,0; @ %r384 bra $L269; add.u32 %r385,%r359,-1; setp.gt.u32 %r386,%r385,1; @ %r386 bra $L270; .loc 1 2101 21 add.u64 %r387,%r360,%r369; ld.u8 %r238,[%r387+24]; .loc 1 2101 7 setp.eq.u32 %r389,%r238,0; @ ! %r389 bra $L409; bra $L271; $L269: .loc 1 2048 16 ld.u64 %r25,[%r360+32]; .loc 1 2050 25 ld.u64 %r390,[%r360+24]; add.u64 %r304,%r365,%r390; .loc 1 2059 5 ld.u64 %r1397,[%r360+16]; .loc 1 2048 7 setp.le.s64 %r391,%r25,0; @ %r391 bra $L273; .loc 1 2050 12 ld.u64 %r33,[%r304]; .loc 1 2061 11 add.u64 %r308,%r365,%r25; .loc 1 2050 11 setp.ne.u64 %r392,%r33,0; @ %r392 bra $L274; .loc 1 2054 39 st.u64 [%frame+64],%r33; .loc 1 2056 9 st.u64 [%frame+80],%r24; ld.u64 %r395,[%r364+24]; st.u64 [%frame+88],%r395; .loc 1 2059 5 add.u64 %r403,%frame,64; mov.u32 %r397,9708403372067 8 setp.eq.u64 %r404,%r372,0; @ %r404 bra $L275; .loc 1 2067 22 ld.u32 %r405,[%r372]; setp.eq.u32 %r406,%r405,0; @ ! %r406 bra $L266; $L275: .loc 1 2071 11 ld.u64 %r33,[%frame+64]; .loc 1 2071 9 ld.u64 %r407,[%r360+24]; add.u64 %r408,%r365,%r407; st.u64 [%r408],%r33; .loc 1 2074 14 ld.s8 %r1400,[%r364+29]; mov.u32 %r373,%r1400; ld.u64 %r1397,[%r360+16]; bra $L277; $L274: .loc 1 2080 9 ld.u64 %r409,[%r308]; ld.u64 %r32,[%r409+8]; .loc 1 2081 8 setp.eq.u64 %r410,%r32,0; @ ! 
%r410 bra $L410; ld.s8 %r1400,[%r364+29]; bra $L277; $L410: .loc 1 2083 12 ld.u64 %r33,[%r32]; .loc 1 2084 18 ld.s8 %r373,[%r32+29]; ld.s8 %r1400,[%r364+29]; $L277: .loc 1 2089 8 mov.u64 %r419,1; {37314001397; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r24; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r419; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r372; call copy_data,%out_arg8,%out_arg9,%out_arg10); } bra $L278; $L273: .loc 1 2093 6 ld.s8 %r433,[%r364+29]; mov.u64 %r431,r33731397; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r24; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r431; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r372; call copy_data,%out_arg8,%out_arg9,%out_arg10); } $L278: .loc 1 2096 4 ld.u64 %r435,[%r361]; add.u64 %r434,%r435,1; st.u64 [%r361],%r434; .loc 1 2097 4 bra $L266; $L271: .loc 1 2093 6 ld.s8 %r306,[%r364+29]; .loc 1 2059 5 ld.u64 %r316,[%r360+16]; .loc 1 2103 11 setp.eq.u64 %r436,%r23,0; @ %r436 bra $L279; mov.u64 %r353,%r362; shl.b64 %r437,%r23,3; add.u64 %r350,%r353,%r437; .loc 1 2043 17 mov.u64 %r47,%r346; $L280: .loc 1 2106 24 ld.u64 %r438,[%r353]; add.u64 %r47,%r47,%r438; .loc 1 2105 5 add.u64 %r353,%r353,8; setp.ne.u64 %r439,%r350,%r353; @ %r439 bra $L280; .loc 1 2107 5 mad.lo.u64 %r451,%r24,%r47,%r366; mov.u64 %r448,r451373.param .u64 %out_arg8; st.param.u64 [%out_arg8],%r24; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r448; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r372; call copy_data,%out_arg8,%out_arg9,%out_arg10); } bra $L281; $L279: .loc 1 2112 3 mov.u64 %r460,r3373.param .u64 %out_arg8; st.param.u64 [%out_arg8],%r24; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r460; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r372; call copy_data,%out_arg8,%out_arg9,%out_arg10); } $L281: .loc 1 2115 11 ld.u64 %r463,[%r361]; add.u64 %r462,%r463,1; st.u64 [%r361],%r462; .loc 1 2116 8 bra $L266; $L270: .loc 1 2120 4 cvta.const.u64 %r464,unreachable464caf_runtime_error268: .loc 1 2124 3 setp.eq.u32 %r466,%r359,1; @ %r466 bra $L282; setp.eq.u32 %r467,%r359,2; @ %r467 bra $L283; setp.ne.u32 %r468,%r359,0; @ %r468 bra $L270; .loc 1 2127 19 ld.u64 %r54,[%r360+32]; .loc 1 2050 25 ld.u64 %r469,[%r360+24]; add.u64 %r247,%r365,%r469; .loc 1 2127 10 setp.le.s64 %r470,%r54,0; @ %r470 bra $L284; .loc 1 2129 7 ld.u64 %r471,[%r247]; setp.ne.u64 %r472,%r471,0; @ %r472 bra $L285; .loc 1 2136 35 ld.u64 %r473,[%r364+16]; st.u64 [%r247+16],%r473; ld.u64 %r474,[%r364+24]; st.u64 [%r247+24],%r474; .loc 1 2137 20 st.u64 [%r247+8],%r471; .loc 1 2139 8 setp.ne.u64 %r476,%r23,0; @ %r476 bra $L352; $L289: .loc 1 2149 34 mov.u64 %r477,0; st.u64 [%r247],%r477; .loc 1 2154 14 add.u64 %r485,%r365,%r54; .loc 1 2152 8 ld.u64 %r487,[%r360+16]; mul.lo.u64 %r486,%r371,%r487; mov.u32 %r479,r47984737_gfortran_caf_register2158 11 setp.ne.u64 %r488,%r372,0; @ %r488 bra $L287; bra $L288; $L352: .loc 1 2139 8 mov.u64 %r349,%r471; .loc 1 2138 19 mov.u64 %r61,1; .loc 1 2139 20 mov.u64 %r62,%r349; .loc 1 2142 40 mov.u64 %r492,%r349; $L286: add.u64 %r348,%r364,%r349; .loc 1 2141 18 ld.u64 %r490,[%r348+56]; add.u64 %r489,%r490,1; .loc 1 2141 16 ld.u64 %r491,[%r348+48]; sub.u64 %r59,%r489,%r491; add.u64 %r347,%r247,%r349; .loc 1 2142 40 st.u64 [%r347+48],%r492; .loc 1 2143 53 add.u64 %r493,%r59,-1; .loc 1 2143 40 st.u64 [%r347+56],%r493; .loc 1 2144 40 st.u64 [%r347+40],%r61; .loc 1 2145 16 mul.lo.u64 %r61,%r61,%r59; .loc 1 2139 41 add.u64 %r62,%r62,1; .loc 1 2139 8 add.u64 %r349,%r349,24; setp.ne.u64 %r494,%r23,%r62; @ %r494 bra $L286; bra $L289; $L288: 
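// Annotation (hedged reading of the dump, not part of the original text): the $L288 block below
// appears to reload the fields at [%r360], [%r360+24] and [%r360+32], adjust the working pointer
// %r247 from them, and then re-dispatch the remaining reference chain through a nested
// send_by_ref$constprop$0$isra$0 call. Field meanings are inferred from the instruction stream,
// not taken from the original source.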
.loc 1 2165 11 ld.u64 %r346,[%r360]; .loc 1 2166 25 ld.u64 %r495,[%r360+24]; add.u64 %r247,%r365,%r495; .loc 1 2161 17 ld.u64 %r496,[%r360+32]; add.u64 %r497,%r365,%r496; ld.u64 %r84,[%r497]; .loc 1 2165 7 setp.ne.u64 %r498,%r346,0; @ ! %r498 bra $L291; bra $L290; $L287: .loc 1 2158 25 ld.u32 %r499,[%r372]; setp.ne.u32 %r500,%r499,0; @ ! %r500 bra $L288; bra $L266; $L290: .loc 1 2165 18 ld.u32 %r501,[%r346+8]; setp.ne.u32 %r502,%r501,0; @ %r502 bra $L291; .loc 1 2166 9 ld.u64 %r247,[%r247]; $L291: .loc 1 2170 4 ld.u64 %r517,[%r84+8]; mov.u64 %r512,362517364366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r512; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r370; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } bra $L266; $L284: .loc 1 2175 2 mov.u64 %r5236247364366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r527; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r370; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } bra $L266; $L282: .loc 1 2181 24 add.u64 %r532,%r360,%r369; ld.u8 %r238,[%r532+24]; .loc 1 2181 10 setp.ne.u32 %r534,%r238,0; @ %r534 bra $L292; .loc 1 2183 4 mov.u64 %r544,3625364366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r544; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r370; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2186 4 bra $L266; $L292: .loc 1 2190 10 setp.ne.u64 %r549,%r369,0; @ %r549 bra $L293; .loc 1 2191 5 ld.u64 %r365,[%r363]; $L293: .loc 1 2192 7 cvt.u16.u32 %r550,%r238; setp.eq.u16 %r551,%r550,4; @ %r551 bra $L294; setp.gt.u16 %r553,%r550,4; @ %r553 bra $L295; setp.eq.u16 %r555,%r550,2; @ %r555 bra $L296; setp.eq.u16 %r557,%r550,3; @ %r557 bra $L297; setp.eq.u16 %r559,%r550,1; @ %r559 bra $L298; bra $L270; $L295: setp.eq.u16 %r561,%r550,5; @ %r561 bra $L299; setp.eq.u16 %r563,%r550,6; @ %r563 bra $L300; bra $L270; $L298: .loc 1 2196 13 shl.b64 %r564,%r370,3; add.u64 %r73,%r362,%r564; .loc 1 2196 23 mov.u64 %r565,0; st.u64 [%r73],%r565; .loc 1 2197 54 add.u64 %r1398,%r369,%r369; add.u64 %r1396,%r1398,%r369; shl.b64 %r569,%r1396,3; add.u64 %r570,%r360,%r569; add.u64 %r571,%r570,56; .loc 1 2197 4 ld.u64 %r572,[%r571]; setp.eq.u64 %r573,%r572,0; @ %r573 bra $L266; .loc 1 2197 16 mov.u64 %r122,%r565; .loc 1 2207 39 add.u64 %r579,%r570,64; .loc 1 2211 3 add.u64 %r1420,%r570,48; add.u64 %r1421,%r363,%r569; add.u64 %r1422,%r1421,48; add.u64 %r1423,%r1421,40; .loc 1 2224 8 add.u64 %r1424,%r370,1; add.u64 %r1425,%r369,1; .loc 1 2228 11 setp.eq.u64 %r1426,%r23,0; .loc 1 2230 10 add.u64 
%r1427,%r370,%r370; add.u64 %r1428,%r1427,%r370; shl.b64 %r1429,%r1428,3; add.u64 %r1430,%r364,%r1429; add.u64 %r1431,%r1430,40; $L310: .loc 1 2207 39 ld.u32 %r74,[%r579]; setp.eq.u32 %r580,%r74,4; @ %r580 bra $L301; setp.gt.s32 %r581,%r74,4; @ %r581 bra $L302; setp.eq.u32 %r582,%r74,1; @ %r582 bra $L303; setp.eq.u32 %r583,%r74,2; @ %r583 bra $L304; bra $L305; $L302: setp.eq.u32 %r584,%r74,8; @ %r584 bra $L306; setp.eq.u32 %r585,%r74,16; @ %r585 bra $L307; bra $L305; $L303: .loc 1 2209 3 ld.u64 %r592,[%r1420]; add.u64 %r593,%r592,%r122; ld.s8 %r594,[%r593]; ld.u64 %r602,[%r1422]; sub.u64 %r601,%r594,%r602; ld.u64 %r609,[%r1423]; mul.lo.u64 %r82,%r601,%r609; bra $L308; $L304: .loc 1 2210 3 ld.u64 %r616,[%r1420]; add.u64 %r617,%r122,%r122; add.u64 %r618,%r616,%r617; ld.s16 %r619,[%r618]; ld.u64 %r627,[%r1422]; sub.u64 %r626,%r619,%r627; ld.u64 %r634,[%r1423]; mul.lo.u64 %r82,%r626,%r634; bra $L308; $L301: .loc 1 2211 3 ld.u64 %r641,[%r1420]; shl.b64 %r642,%r122,2; add.u64 %r643,%r641,%r642; ld.s32 %r644,[%r643]; ld.u64 %r652,[%r1422]; sub.u64 %r651,%r644,%r652; ld.u64 %r659,[%r1423]; mul.lo.u64 %r82,%r651,%r659; bra $L308; $L306: .loc 1 2213 3 ld.u64 %r666,[%r1420]; shl.b64 %r667,%r122,3; add.u64 %r668,%r666,%r667; ld.u64 %r676,[%r668]; ld.u64 %r677,[%r1422]; sub.u64 %r675,%r676,%r677; ld.u64 %r684,[%r1423]; mul.lo.u64 %r82,%r675,%r684; bra $L308; $L307: .loc 1 2216 3 ld.u64 %r691,[%r1420]; shl.b64 %r692,%r122,4; add.u64 %r693,%r691,%r692; ld.u64 %r701,[%r693]; ld.u64 %r702,[%r1422]; sub.u64 %r700,%r701,%r702; ld.u64 %r709,[%r1423]; mul.lo.u64 %r82,%r700,%r709; bra $L308; $L305: .loc 1 2219 5 mov.u64 %r713,0; {2713713stack; call caf_internal_error); } .loc 1 2220 5 bra $L266; $L308: .loc 1 2225 29 ld.u64 %r732,[%r360+16]; .loc 1 2224 8 mad.lo.u64 %r733,%r82,%r732,%r365363613623364366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r1425; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r1424; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2228 11 @ %r1426 bra $L309; .loc 1 2230 7 ld.u64 %r742,[%r73]; ld.u64 %r743,[%r1431]; add.u64 %r741,%r742,%r743; st.u64 [%r73],%r741; $L309: .loc 1 2198 9 add.u64 %r122,%r122,1; .loc 1 2197 4 ld.u64 %r750,[%r571]; setp.lt.u64 %r751,%r122,%r750; @ %r751 bra $L310; bra $L266; $L296: .loc 1 2234 4 add.u64 %r1398,%r369,%r369; add.u64 %r1396,%r1398,%r369; shl.b64 %r755,%r1396,3; add.u64 %r756,%r360,%r755; ld.u64 %r124,[%r756+64]; abs.s64 %r125,%r124; add.u64 %r762,%r363,%r755; ld.u64 %r313,[%r762+56]; ld.u64 %r312,[%r762+48]; setp.le.s64 %r770,%r124,0; @ %r770 bra $L311; add.u64 %r771,%r313,1; sub.u64 %r127,%r771,%r312; bra $L312; $L311: add.u64 %r772,%r312,1; sub.u64 %r127,%r772,%r313; $L312: set.u32.le.s64 %r774,%r127,0; neg.s32 %r775,%r774; set.u32.le.s64 %r777,%r125,0; neg.s32 %r778,%r777; cvt.u16.u32 %r780,%r775; cvt.u16.u32 %r781,%r778; or.b16 %r779,%r780,%r781; cvt.u32.u16 %r782,%r779; cvt.u16.u8 %r783,%r782; setp.ne.u16 %r784,%r783,0; @ %r784 bra $L266; setp.eq.u64 %r785,%r125,1; @ %r785 bra $L313; add.u64 %r786,%r127,-1; div.s64 %r790,%r786,%r125; add.u64 %r127,%r790,1; $L313: .loc 1 2239 17 shl.b64 %r794,%r1396,3; add.u64 %r795,%r363,%r794; .loc 1 2239 15 ld.u64 %r797,[%r795+40]; mul.lo.u64 
%r135,%r124,%r797; .loc 1 2241 13 shl.b64 %r798,%r370,3; add.u64 %r137,%r362,%r798; .loc 1 2241 23 mov.u64 %r799,0; st.u64 [%r137],%r799; .loc 1 2224 8 add.u64 %r309,%r370,1; add.u64 %r324,%r369,1; .loc 1 2242 20 mov.u64 %r145,%r799; .loc 1 2238 21 mov.u64 %r146,%r145; setp.eq.u64 %r1399,%r23,0; .loc 1 2251 10 add.u64 %r1438,%r370,%r370; add.u64 %r1439,%r1438,%r370; shl.b64 %r1440,%r1439,3; add.u64 %r1441,%r364,%r1440; add.u64 %r1442,%r1441,40; $L315: .loc 1 2246 29 ld.u64 %r815,[%r360+16]; .loc 1 2245 8 mad.lo.u64 %r816,%r146,%r815,%r365363613623364366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r324; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r309; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2249 11 @ %r1399 bra $L314; .loc 1 2251 7 ld.u64 %r825,[%r137]; ld.u64 %r826,[%r1442]; add.u64 %r824,%r825,%r826; st.u64 [%r137],%r824; $L314: .loc 1 2243 9 add.u64 %r145,%r145,1; .loc 1 2243 33 add.u64 %r146,%r146,%r135; .loc 1 2242 4 setp.ne.u64 %r827,%r145,%r127; @ %r827 bra $L315; bra $L266; $L297: .loc 1 2255 4 add.u64 %r1398,%r369,%r369; add.u64 %r1396,%r1398,%r369; shl.b64 %r831,%r1396,3; add.u64 %r832,%r360,%r831; ld.u64 %r147,[%r832+64]; abs.s64 %r148,%r147; ld.u64 %r311,[%r832+56]; ld.u64 %r305,[%r832+48]; setp.le.s64 %r846,%r147,0; @ %r846 bra $L316; add.u64 %r847,%r311,1; sub.u64 %r150,%r847,%r305; bra $L317; $L316: add.u64 %r848,%r305,1; sub.u64 %r150,%r848,%r311; $L317: set.u32.le.s64 %r850,%r150,0; neg.s32 %r851,%r850; set.u32.le.s64 %r853,%r148,0; neg.s32 %r854,%r853; cvt.u16.u32 %r856,%r851; cvt.u16.u32 %r857,%r854; or.b16 %r855,%r856,%r857; cvt.u32.u16 %r858,%r855; cvt.u16.u8 %r859,%r858; setp.ne.u16 %r860,%r859,0; @ %r860 bra $L266; setp.eq.u64 %r861,%r148,1; @ %r861 bra $L318; add.u64 %r862,%r150,-1; div.s64 %r866,%r862,%r148; add.u64 %r150,%r866,1; $L318: .loc 1 2260 10 shl.b64 %r870,%r1396,3; add.u64 %r871,%r363,%r870; .loc 1 2259 21 ld.u64 %r873,[%r871+48]; sub.u64 %r170,%r305,%r873; .loc 1 2261 15 ld.u64 %r880,[%r871+40]; mul.lo.u64 %r159,%r147,%r880; .loc 1 2263 13 shl.b64 %r881,%r370,3; add.u64 %r161,%r362,%r881; .loc 1 2263 23 mov.u64 %r882,0; st.u64 [%r161],%r882; .loc 1 2224 8 add.u64 %r325,%r370,1; add.u64 %r310,%r369,1; .loc 1 2264 20 mov.u64 %r171,%r882; setp.eq.u64 %r1399,%r23,0; .loc 1 2272 10 add.u64 %r1433,%r370,%r370; add.u64 %r1434,%r1433,%r370; shl.b64 %r1435,%r1434,3; add.u64 %r1436,%r364,%r1435; add.u64 %r1437,%r1436,40; $L320: .loc 1 2267 29 ld.u64 %r898,[%r360+16]; .loc 1 2266 8 mad.lo.u64 %r899,%r170,%r898,%r365363613623364366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r310; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r325; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2270 11 @ %r1399 bra $L319; .loc 1 2272 7 ld.u64 %r908,[%r161]; ld.u64 %r909,[%r1437]; add.u64 %r907,%r908,%r909; st.u64 [%r161],%r907; $L319: .loc 1 2273 25 
add.u64 %r170,%r170,%r159; .loc 1 2264 47 add.u64 %r171,%r171,1; .loc 1 2264 4 setp.ne.u64 %r910,%r171,%r150; @ %r910 bra $L320; bra $L266; $L294: .loc 1 2280 4 add.u64 %r925,%r369,1; .loc 1 2277 47 add.u64 %r927,%r369,%r369; add.u64 %r928,%r927,%r369; shl.b64 %r929,%r928,3; add.u64 %r930,%r360,%r929; .loc 1 2278 13 add.u64 %r936,%r363,%r929; .loc 1 2278 11 ld.u64 %r939,[%r930+48]; ld.u64 %r940,[%r936+48]; sub.u64 %r938,%r939,%r940; .loc 1 2277 21 ld.u64 %r948,[%r936+40]; mul.lo.u64 %r947,%r938,%r948; .loc 1 2281 29 ld.u64 %r950,[%r360+16]; .loc 1 2280 4 mad.lo.u64 %r951,%r947,%r950,%r365363613623364366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r925; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r370; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2284 4 bra $L266; $L299: .loc 1 2286 4 add.u64 %r1398,%r369,%r369; add.u64 %r1396,%r1398,%r369; shl.b64 %r955,%r1396,3; add.u64 %r956,%r360,%r955; ld.u64 %r182,[%r956+64]; abs.s64 %r183,%r182; .loc 1 2234 4 add.u64 %r962,%r363,%r955; ld.u64 %r49,[%r962+56]; .loc 1 2255 4 ld.u64 %r355,[%r956+48]; .loc 1 2286 4 setp.le.s64 %r970,%r182,0; @ %r970 bra $L321; add.u64 %r971,%r49,1; sub.u64 %r185,%r971,%r355; bra $L322; $L321: add.u64 %r972,%r355,1; sub.u64 %r185,%r972,%r49; $L322: set.u32.le.s64 %r974,%r185,0; neg.s32 %r975,%r974; set.u32.le.s64 %r977,%r183,0; neg.s32 %r978,%r977; cvt.u16.u32 %r980,%r975; cvt.u16.u32 %r981,%r978; or.b16 %r979,%r980,%r981; cvt.u32.u16 %r982,%r979; cvt.u16.u8 %r983,%r982; setp.ne.u16 %r984,%r983,0; @ %r984 bra $L266; setp.eq.u64 %r985,%r183,1; @ %r985 bra $L323; add.u64 %r986,%r185,-1; div.s64 %r990,%r986,%r183; add.u64 %r185,%r990,1; $L323: .loc 1 2291 10 shl.b64 %r994,%r1396,3; add.u64 %r995,%r363,%r994; .loc 1 2290 21 ld.u64 %r997,[%r995+48]; sub.u64 %r204,%r355,%r997; .loc 1 2292 15 ld.u64 %r1004,[%r995+40]; mul.lo.u64 %r194,%r182,%r1004; .loc 1 2294 13 shl.b64 %r1005,%r370,3; add.u64 %r196,%r362,%r1005; .loc 1 2294 23 mov.u64 %r1006,0; st.u64 [%r196],%r1006; .loc 1 2224 8 add.u64 %r326,%r370,1; add.u64 %r328,%r369,1; .loc 1 2295 20 mov.u64 %r205,%r1006; setp.eq.u64 %r1399,%r23,0; .loc 1 2303 10 add.u64 %r1448,%r370,%r370; add.u64 %r1449,%r1448,%r370; shl.b64 %r1450,%r1449,3; add.u64 %r1451,%r364,%r1450; add.u64 %r1452,%r1451,40; $L325: .loc 1 2298 29 ld.u64 %r1022,[%r360+16]; .loc 1 2297 8 mad.lo.u64 %r1023,%r204,%r1022,%r3653636136233641023366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r328; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r326; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2301 11 @ %r1399 bra $L324; .loc 1 2303 7 ld.u64 %r1032,[%r196]; ld.u64 %r1033,[%r1452]; add.u64 %r1031,%r1032,%r1033; st.u64 [%r196],%r1031; $L324: .loc 1 2304 25 add.u64 %r204,%r204,%r194; .loc 1 2295 47 add.u64 %r205,%r205,1; .loc 1 2295 4 setp.ne.u64 %r1034,%r205,%r185; @ %r1034 bra $L325; bra $L266; 
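// Annotation (hedged reading of the dump, not part of the original text): the $L300 case below
// appears to mirror the earlier strided-range cases. It derives a trip count from a
// start/end/stride triple (abs.s64, a sign-dependent sub.u64, div.s64 when |stride| != 1, then +1),
// bails out early through the set/neg/or predicate when the count or stride is non-positive, and
// then loops, calling send_by_ref$constprop$0$isra$0 once per iteration; unless the %r1399
// predicate (set from %r23 == 0) is taken, it also adds the per-element size from [%r1447] into
// the running total at [%r219].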
$L300: .loc 1 2308 4 add.u64 %r1398,%r369,%r369; add.u64 %r1396,%r1398,%r369; shl.b64 %r1038,%r1396,3; add.u64 %r1039,%r360,%r1038; ld.u64 %r206,[%r1039+64]; abs.s64 %r207,%r206; .loc 1 2234 4 add.u64 %r1045,%r363,%r1038; ld.u64 %r356,[%r1045+48]; .loc 1 2255 4 ld.u64 %r357,[%r1039+56]; .loc 1 2308 4 setp.le.s64 %r1053,%r206,0; @ %r1053 bra $L326; add.u64 %r1054,%r357,1; sub.u64 %r209,%r1054,%r356; bra $L327; $L326: add.u64 %r1055,%r356,1; sub.u64 %r209,%r1055,%r357; $L327: set.u32.le.s64 %r1057,%r209,0; neg.s32 %r1058,%r1057; set.u32.le.s64 %r1060,%r207,0; neg.s32 %r1061,%r1060; cvt.u16.u32 %r1063,%r1058; cvt.u16.u32 %r1064,%r1061; or.b16 %r1062,%r1063,%r1064; cvt.u32.u16 %r1065,%r1062; cvt.u16.u8 %r1066,%r1065; setp.ne.u16 %r1067,%r1066,0; @ %r1067 bra $L266; setp.eq.u64 %r1068,%r207,1; @ %r1068 bra $L328; add.u64 %r1069,%r209,-1; div.s64 %r1073,%r1069,%r207; add.u64 %r209,%r1073,1; $L328: .loc 1 2313 17 shl.b64 %r1077,%r1396,3; add.u64 %r1078,%r363,%r1077; .loc 1 2313 15 ld.u64 %r1080,[%r1078+40]; mul.lo.u64 %r217,%r206,%r1080; .loc 1 2315 13 shl.b64 %r1081,%r370,3; add.u64 %r219,%r362,%r1081; .loc 1 2315 23 mov.u64 %r1082,0; st.u64 [%r219],%r1082; .loc 1 2224 8 add.u64 %r329,%r370,1; add.u64 %r327,%r369,1; .loc 1 2316 20 mov.u64 %r229,%r1082; .loc 1 2312 21 mov.u64 %r228,%r229; setp.eq.u64 %r1399,%r23,0; .loc 1 2324 10 add.u64 %r1443,%r370,%r370; add.u64 %r1444,%r1443,%r370; shl.b64 %r1445,%r1444,3; add.u64 %r1446,%r364,%r1445; add.u64 %r1447,%r1446,40; $L330: .loc 1 2319 29 ld.u64 %r1098,[%r360+16]; .loc 1 2318 8 mad.lo.u64 %r1099,%r228,%r1098,%r3653636136233641099366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r327; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r329; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2322 11 @ %r1399 bra $L329; .loc 1 2324 7 ld.u64 %r1108,[%r219]; ld.u64 %r1109,[%r1447]; add.u64 %r1107,%r1108,%r1109; st.u64 [%r219],%r1107; $L329: .loc 1 2325 25 add.u64 %r228,%r228,%r217; .loc 1 2316 47 add.u64 %r229,%r229,1; .loc 1 2316 4 setp.ne.u64 %r1110,%r229,%r209; @ %r1110 bra $L330; bra $L266; $L283: .loc 1 2101 21 add.u64 %r1111,%r360,%r369; ld.u8 %r238,[%r1111+24]; .loc 1 2333 10 setp.ne.u32 %r1113,%r238,0; @ %r1113 bra $L331; .loc 1 2335 4 mov.u64 %r1123,3621123364366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r1123; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r370; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2338 4 bra $L266; $L331: .loc 1 2340 7 cvt.u16.u32 %r1128,%r238; setp.eq.u16 %r1129,%r1128,3; @ %r1129 bra $L332; setp.gt.u16 %r1131,%r1128,3; @ %r1131 bra $L333; setp.eq.u16 %r1133,%r1128,1; @ %r1133 bra $L334; setp.eq.u16 %r1135,%r1128,2; @ %r1135 bra $L335; bra $L270; $L333: setp.eq.u16 %r1137,%r1128,4; @ %r1137 bra $L336; bra $L270; $L334: .loc 1 2344 13 shl.b64 %r1138,%r370,3; add.u64 %r231,%r362,%r1138; .loc 1 2344 23 
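// Annotation (hedged reading of the dump, not part of the original text): below, the size slot at
// [%r231] appears to be zeroed and the $L344 loop then switches on the index-element kind
// (1, 2, 4, 8 or 16 bytes) read from [%r1153], loads each index with the matching signed or
// unsigned load, and recurses into send_by_ref$constprop$0$isra$0 per element, accumulating the
// per-element size from [%r1413] back into [%r231].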
[Corrupted binary payload: nvptx PTX assembly extracted from the package, too damaged to reconstruct faithfully. Recoverable symbols belong to the GFortran single-image coarray runtime (libcaf_single) compiled for the nvptx target: _gfortran_caf_deregister, _gfortran_caf_sync_all, _gfortran_caf_sync_memory, _gfortran_caf_sync_images, _gfortran_caf_stop_numeric, _gfortran_caf_stop_str, _gfortran_caf_error_stop_str, _gfortran_caf_fail_image, _gfortran_caf_image_status, _gfortran_caf_failed_images, _gfortran_caf_stopped_images, _gfortran_caf_error_stop, _gfortran_caf_co_broadcast, _gfortran_caf_co_sum, _gfortran_caf_co_min, _gfortran_caf_co_max, _gfortran_caf_co_reduce, _gfortran_caf_get, _gfortran_caf_send, _gfortran_caf_sendget, _gfortran_caf_get_by_ref; internal helpers referenced include send_by_ref$constprop$0$isra$0, assign_char1_from_char4, convert_type, caf_internal_error, memmove, fputc, and __nvptx_free.]
%r758,[%r304+16]; mad.lo.u64 %r161,%r757,%r758,%r161; bra $L745; $L744: .loc 1 1836 7 ld.u64 %r759,[%r292]; ld.u64 %r761,[%r759]; ld.u64 %r762,[%r304+16]; mad.lo.u64 %r161,%r761,%r762,%r161; bra $L745; $L737: .loc 1 1845 37 ld.u64 %r765,[%r292+8]; ld.u64 %r766,[%r292+16]; div.s64 %r764,%r765,%r766; .loc 1 1845 11 add.u64 %r315,%r764,1; .loc 1 1849 5 bra $L745; $L734: .loc 1 1851 5 ld.u64 %r207,[%r292+16]; abs.s64 %r278,%r207; .loc 1 1821 32 ld.u64 %r312,[%r292+8]; .loc 1 1829 7 ld.u64 %r265,[%r292]; .loc 1 1851 5 setp.le.s64 %r767,%r207,0; @ %r767 bra $L746; add.u64 %r768,%r312,1; sub.u64 %r315,%r768,%r265; bra $L747; $L746: add.u64 %r769,%r265,1; sub.u64 %r315,%r769,%r312; $L747: set.u32.le.s64 %r771,%r315,0; neg.s32 %r772,%r771; set.u32.le.s64 %r774,%r278,0; neg.s32 %r775,%r774; cvt.u16.u32 %r777,%r772; cvt.u16.u32 %r778,%r775; or.b16 %r776,%r777,%r778; cvt.u32.u16 %r779,%r776; cvt.u16.u8 %r780,%r779; setp.ne.u16 %r781,%r780,0; @ %r781 bra $L678; setp.eq.u64 %r782,%r278,1; @ %r782 bra $L748; add.u64 %r783,%r315,-1; div.s64 %r787,%r783,%r278; add.u64 %r315,%r787,1; $L748: .loc 1 1856 9 mul.lo.u64 %r788,%r207,%r265; .loc 1 1857 9 ld.u64 %r790,[%r304+16]; .loc 1 1855 12 mad.lo.u64 %r161,%r788,%r790,%r161; bra $L749; $L738: .loc 1 1862 9 ld.u64 %r792,[%r292]; ld.u64 %r793,[%r292+16]; mul.lo.u64 %r791,%r792,%r793; .loc 1 1863 9 ld.u64 %r795,[%r304+16]; .loc 1 1861 12 mad.lo.u64 %r161,%r791,%r795,%r161; .loc 1 1860 11 mov.u64 %r315,%r996; bra $L750; $L745: .loc 1 1873 11 setp.le.s64 %r796,%r315,0; @ %r796 bra $L678; $L749: .loc 1 1877 18 set.u32.gt.s64 %r798,%r315,1; neg.s32 %r799,%r798; .loc 1 1877 34 set.u32.eq.u32 %r801,%r242,0; neg.s32 %r802,%r801; .loc 1 1877 22 cvt.u16.u32 %r804,%r799; cvt.u16.u32 %r805,%r802; and.b16 %r803,%r804,%r805; .loc 1 1877 11 cvt.u32.u16 %r806,%r803; cvt.u16.u8 %r807,%r806; setp.ne.u16 %r808,%r807,0; @ %r808 bra $L751; $L750: .loc 1 1884 11 setp.ne.u32 %r809,%r242,-1; @ %r809 bra $L752; .loc 1 1884 27 @ %r988 bra $L753; .loc 1 1886 14 add.u32 %r242,%r212,1; .loc 1 1887 31 st.u8 [%r318+28],%r242; .loc 1 1888 31 cvt.s64.s32 %r811,%r320; st.u64 [%r318+16],%r811; bra $L754; $L752: .loc 1 1891 11 setp.le.s32 %r812,%r242,0; @ %r812 bra $L753; .loc 1 1895 21 set.u32.ge.s32 %r814,%r212,%r242; neg.s32 %r815,%r814; .loc 1 1895 42 set.u32.ne.u64 %r817,%r315,1; neg.s32 %r818,%r817; .loc 1 1895 33 cvt.u16.u32 %r820,%r815; cvt.u16.u32 %r821,%r818; and.b16 %r819,%r820,%r821; .loc 1 1895 8 cvt.u32.u16 %r822,%r819; cvt.u16.u8 %r823,%r822; setp.ne.u16 %r824,%r823,0; @ %r824 bra $L755; $L754: .loc 1 1901 13 setp.eq.u64 %r825,%r315,1; @ %r825 bra $L756; .loc 1 1905 12 setp.ne.u32 %r826,%r271,0; @ %r826 bra $L757; .loc 1 1908 9 @ %r987 bra $L725; .loc 1 1914 22 cvt.u32.u64 %r212,%r251; $L757: .loc 1 1926 6 @ ! %r997 bra $L842; cvt.s64.s32 %r974,%r212; bra $L758; $L842: .loc 1 1926 9 cvt.s64.s32 %r974,%r212; add.u64 %r831,%r974,%r974; add.u64 %r832,%r831,%r974; shl.b64 %r833,%r832,3; add.u64 %r834,%r318,%r833; ld.u64 %r837,[%r834+56]; add.u64 %r836,%r837,1; ld.u64 %r845,[%r834+48]; sub.u64 %r240,%r836,%r845; .loc 1 1926 6 setp.eq.u64 %r846,%r240,%r315; @ %r846 bra $L759; .loc 1 1933 9 @ ! 
%r1000 bra $L758; bra $L760; $L759: .loc 1 1929 12 setp.eq.u32 %r848,%r277,0; @ %r848 bra $L776; .loc 1 1933 9 @ %r999 bra $L777; $L760: .loc 1 1935 10 st.u64 [%stack+8],%r240; st.u64 [%stack],%r315; mov.u64 %r853,3853stack; call caf_internal_error); } .loc 1 1939 10 bra $L678; $L758: .loc 1 1954 4 add.u64 %r858,%r974,%r974; add.u64 %r859,%r858,%r974; shl.b64 %r860,%r859,3; add.u64 %r861,%r318,%r860; st.u64 [%r861+48],%r996; st.u64 [%r861+56],%r315; st.u64 [%r861+40],%r86; mov.u32 %r271,%r998; .loc 1 1949 21 mov.u32 %r277,%r998; bra $L756; $L776: mov.u32 %r271,1; bra $L756; $L777: .loc 1 1933 9 mov.u32 %r271,%r277; $L756: .loc 1 1958 37 set.u32.lt.s32 %r879,%r212,%r242; neg.s32 %r880,%r879; .loc 1 1958 22 cvt.u16.u32 %r882,%r880; cvt.u16.u32 %r883,%r271; and.b16 %r881,%r882,%r883; cvt.u32.u16 %r884,%r881; cvt.u32.u8 %r249,%r884; .loc 1 1958 8 setp.eq.u32 %r885,%r249,0; @ %r885 bra $L753; .loc 1 1959 7 add.u32 %r212,%r212,1; mov.u32 %r271,%r249; $L753: .loc 1 1961 13 mul.lo.u64 %r86,%r86,%r315; .loc 1 1816 56 add.u64 %r251,%r251,1; st.u64 [%frame+648],%r251; .loc 1 1816 31 add.u64 %r886,%r304,%r251; ld.u8 %r252,[%r886+24]; .loc 1 1816 4 add.u64 %r292,%r292,24; setp.ne.u32 %r887,%r252,0; @ %r887 bra $L761; .loc 1 1963 7 setp.ne.u32 %r888,%r271,0; @ %r888 bra $L762; .loc 1 1613 17 ld.u64 %r241,[%r304]; bra $L686; $L762: .loc 1 1969 8 ld.s8 %r889,[%r318+28]; setp.ne.u32 %r890,%r889,%r212; @ %r890 bra $L763; .loc 1 1613 17 ld.u64 %r241,[%r304]; .loc 1 1965 27 mov.u32 %r266,%r271; bra $L686; $L763: .loc 1 1969 8 cvta.const.u64 %r894,$LC19; cvta.const.u64 %r893,__func__$5; mov.u32 %r892,1969; cvta.const.u64 %r891,$LC18892893891973 4 mov.u64 %r898,97r324898898stack; call caf_internal_error); } .loc 1 1974 4 bra $L678; $L686: .loc 1 1595 9 setp.eq.u64 %r901,%r241,0; @ %r901 bra $L687; $L689: .loc 1 1613 38 ld.u32 %r160,[%r241+8]; mov.u64 %r304,%r241; setp.eq.u32 %r977,%r160,1; bra $L764; $L774: mov.u64 %r304,%r241; .loc 1 1614 7 mov.u64 %r264,%r161; bra $L764; $L687: .loc 1 1979 29 ld.u64 %r903,[%r304+16]; set.u32.eq.u64 %r904,%r903,0; neg.s32 %r905,%r904; .loc 1 1979 12 set.u32.eq.u64 %r907,%r86,0; neg.s32 %r908,%r907; .loc 1 1979 17 cvt.u16.u32 %r910,%r905; cvt.u16.u32 %r911,%r908; or.b16 %r909,%r910,%r911; .loc 1 1979 6 cvt.u32.u16 %r912,%r909; cvt.u16.u8 %r913,%r912; setp.ne.u16 %r914,%r913,0; @ %r914 bra $L678; .loc 1 1986 6 setp.eq.u32 %r915,%r277,0; @ %r915 bra $L765; .loc 1 1988 10 setp.eq.u32 %r916,%r266,0; @ %r916 bra $L766; $L770: .loc 1 2003 35 ld.u64 %r919,[%r318+16]; mul.lo.u64 %r918,%r86,%r919918920,[%value_in]; } .loc 1 2003 33 st.u64 [%r318],%r920; .loc 1 2004 10 setp.eq.u64 %r922,%r920,0; @ ! 
%r922 bra $L765; bra $L767; $L766: .loc 1 1990 4 setp.eq.u64 %r923,%r86,1; @ %r923 bra $L768; cvta.const.u64 %r927,$LC20; cvta.const.u64 %r926,__func__$5; mov.u32 %r925,1990; cvta.const.u64 %r924,$LC1899271992 7 setp.ne.u32 %r928,%r242,-1; @ %r928 bra $L769; .loc 1 1992 23 setp.ne.u64 %r929,%r23,0; @ %r929 bra $L770; .loc 1 1994 17 add.u32 %r242,%r212,1; .loc 1 1995 34 st.u8 [%r318+28],%r242; .loc 1 1996 34 cvt.s64.s32 %r930,%r320; st.u64 [%r318+16],%r930; bra $L771; $L769: .loc 1 1999 4 setp.le.s32 %r931,%r242,0; @ %r931 bra $L770; $L771: add.u64 %r206,%r318,40; add.u32 %r932,%r242,-1; cvt.u64.u32 %r933,%r932; add.u64 %r935,%r933,%r933; add.u64 %r936,%r935,%r933; shl.b64 %r937,%r936,3; add.u64 %r938,%r318,64; add.u64 %r280,%r937,%r938; .loc 1 2000 6 mov.u64 %r939,1; $L772: st.u64 [%r206+8],%r939; st.u64 [%r206+16],%r939; st.u64 [%r206],%r939; .loc 1 1999 4 add.u64 %r206,%r206,24; setp.ne.u64 %r942,%r206,%r280; @ %r942 bra $L772; bra $L770; $L767: .loc 1 200697324920stack; call caf_internal_error); } .loc 1 2007 4 bra $L678; $L765: .loc 1 2015 3 mov.u64 %r950,120; mov.u32 %r95frame950954,[%value_in]; } .loc 1 2016 5 mov.u64 %r956,0; st.u64 [%frame+648],%r956; .loc 1 2017 3 add.u64 %r970,%frame,648; ld.u64 %r962,[%r31frame3185274; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r320; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r321; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r956; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r956; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r324; .param .u32 %out_arg13; st.param.u32 [%out_arg13],%r325; call get_for_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13); }_gfortran_caf_send_by_ref .visible .func _gfortran_caf_send_by_ref608u64 %r265; .reg .u64 %r266; .reg .u64 %r267; .reg .u64u64 %r280; .reg .u64u64u64 %r320; .reg .u64 %r321; .reg .u64 %r325; .reg .u64u64 %r349; .reg .pred %r351; .reg .u32 %r352; .reg .pred %r353; .reg .pred %r354; .reg .pred %r355; .reg .predpred %r373; .reg .pred %r375; .reg .pred %r377; .reg .pred %r379; .reg .pred %r381; .reg .pred %r383; .reg .predu64u64u64 %r410; .reg .u64u32 %r450; .reg .u16 %r451; .reg .pred %r452; .reg .pred %r453; .reg .u64 %r454; .reg .u64 %r458; .reg .pred %r459; .reg .u64 %r460; .reg .u64 %r461; .reg .u32 %r463; .reg .u32 %r464; .reg .u32 %r466; .reg .u32 %r467; .reg .u16 %r468; .reg .u16 %r469; .reg .u16 %r470; .reg .u32 %r471; .reg .u16 %r472; .reg .pred %r473; .reg .pred %r474; .reg .u64 %r475; .reg .u64 %r479; .reg .u64pred498; .reg .u32 %r500; .reg .u32 %r501; .reg .u16 %r502; .reg .u16 %r503; .reg .u16 %r504; .reg .u32 %r505; .reg .u16 %r506; .reg .pred %r507; .reg .predu64 %r514; .reg .u64 %r515; .reg .u64 %r516; .reg .u64 %r517; .reg .u64 %r519; .reg .pred %r520; .reg .u64 %r521; .reg .u64 %r522; .reg .u32 %r524; .reg .u32 %r525; .reg .u32 %r527; .reg .u32 %r528; .reg .u16 %r529; .reg .u16 %r530; .reg .u16 %r531; .reg .u32pred %r560; .reg .u64 %r564; .reg .pred %r567; .reg .u64 %r570; .reg .u64 %r571; .reg .u64 %r572; .reg .u64 %r573; .reg .u64 %r575; .reg .u64 %r576; .reg .u64 %r584; .reg .u64 %r585; .reg .pred %r586; .reg .u64 %r590; .reg .u64 %r592; .reg .u64 %r594; .reg .u64 %r595; .reg .u64 %r596; .reg .u64 %r597; .reg .u64 %r599; .reg .u64 %r600; .reg .u64 %r608; .reg .u64 %r609; .reg .u64 %r611; .reg .pred %r612; .reg .u64 %r616; .reg .u64 %r621; .reg .u64 %r622; .reg .u64 %r623; .reg .u64 %r624; .reg .u64 %r643; .reg .u64 %r644; .reg .u64 %r645; .reg .u64 %r646; .reg .u64 %r648; .reg .u64 %r649; .reg .u64 %r657; .reg 
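The address arithmetic in this walk is still recognizable in the damaged dump: each array reference contributes (index - lower_bound) * stride_multiplier * item_size to the running offset (the repeated sub / mul.lo.u64 / mad.lo.u64 sequences above). A minimal C sketch of one such step, using a hypothetical dimension record since the real descriptor layout is only partially visible here:

#include <stddef.h>

/* Hypothetical dimension record; only the two fields the PTX actually
   reads (a stride multiplier and a lower bound) are modelled.  */
struct dim_sketch { ptrdiff_t stride_mult, lbound; };

/* One step of the running-offset computation seen in the get/send paths:
   ptr += (index - lbound) * stride_mult * item_size.  */
static char *
apply_index (char *ptr, ptrdiff_t index,
             const struct dim_sketch *d, size_t item_size)
{
  return ptr + (index - d->lbound) * d->stride_mult * (ptrdiff_t) item_size;
}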
[PTX assembly: remaining register declarations and body of _gfortran_caf_send_by_ref, source lines ~2439-2792. The structure mirrors the get path: the format-string constants are materialized with cvta.const.u64, the reference chain is walked with the same per-dimension extent and offset arithmetic, and size mismatches or unsupported reference types are reported through caf_internal_error.]
[PTX assembly: tail of _gfortran_caf_send_by_ref (call to send_by_ref$constprop$0$isra$0 and a final transferred-size check at source line 2796), followed by:
- _gfortran_caf_sendget_by_ref (lines ~2809-2826): sets up a temporary descriptor on the frame, calls _gfortran_caf_get_by_ref and then _gfortran_caf_send_by_ref;
- _gfortran_caf_atomic_define and _gfortran_caf_atomic_ref (lines ~2836-2860): a plain 32-bit store or load at token base plus offset, after checking one of the type/kind arguments against 4;
- _gfortran_caf_atomic_cas (lines ~2869-2879): atom.cas.b32 issued under the __nvptx_uni predicate, with shfl.idx.b32 to broadcast the result across the warp;
- _gfortran_caf_atomic_op (lines ~2888-2916): atom.add, atom.and, atom.or or atom.xor selected by the operation code, with the old value stored back when requested;
- _gfortran_caf_event_post, _wait and _query (lines ~2927-2959): atomic add, atomic add of the negated count, and a plain load of the event counter;
- _gfortran_caf_lock and _gfortran_caf_unlock (lines ~2967-3036): a byte flag per lock, with acquired_lock/stat/errmsg handling and _gfortran_caf_error_stop_str when the lock is misused.]
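The atomic entry points are the most directly readable part of the dump. For the compare-and-swap, the PTX first copies *compare into *old, then issues atom.cas.b32 and writes the observed value back into *old when the exchange fails. A rough C11 rendering of that sequence follows; the prototype and names are illustrative, not taken from the library headers:

#include <stddef.h>
#include <stdint.h>
#include <stdatomic.h>

/* Sketch of the single-image compare-and-swap visible above
   (4-byte atoms at token base + offset).  */
static void
atomic_cas_sketch (char *base, size_t offset,
                   uint32_t *old, const uint32_t *compare,
                   const uint32_t *new_val, int *stat)
{
  _Atomic uint32_t *atom = (_Atomic uint32_t *) (base + offset);

  /* The PTX copies *compare into *old first ...  */
  *old = *compare;
  /* ... then issues atom.cas.b32; on failure the observed value is
     written back into *old, which is exactly what the C11 call does.  */
  atomic_compare_exchange_strong (atom, old, *new_val);

  /* stat, when present, is cleared.  */
  if (stat)
    *stat = 0;
}

The _gfortran_caf_atomic_op variants differ only in the atomic instruction used (atom.add, atom.and, atom.or, atom.xor) and in storing the previous value when an output pointer is supplied.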
[PTX assembly: _gfortran_caf_is_present (source lines ~3042-3136), which walks the reference chain and returns whether the addressed data pointer is non-null. It is followed by libtool metadata for libcaf_single.la (linking against -lm) and the member index of the libgfortran static archive:
compile_options.o/ maxloc0_16_i16.o/ maxloc0_16_r10.o/ maxloc0_16_r16.o/ maxloc1_16_i16.o/ maxloc1_16_r10.o/ maxloc1_16_r16.o/ minloc0_16_i16.o/ minloc0_16_r10.o/ minloc0_16_r16.o/ minloc1_16_i16.o/ minloc1_16_r10.o/ minloc1_16_r16.o/ matmulavx128_i1.o/ matmulavx128_i2.o/ matmulavx128_i4.o/ matmulavx128_i8.o/ matmulavx128_i16.o/ matmulavx128_r4.o/ matmulavx128_r8.o/ matmulavx128_r10.o/ matmulavx128_r16.o/ matmulavx128_c4.o/ matmulavx128_c8.o/ matmulavx128_c10.o/ matmulavx128_c16.o/ cshift1_16_i16.o/ cshift1_16_r10.o/ cshift1_16_r16.o/ cshift1_16_c10.o/ cshift1_16_c16.o/ ISO_Fortran_binding.o/ size_from_kind.o/ extends_type_of.o/ selected_char_kind.o/ spread_generic.o/ string_intrinsics.o/ reshape_generic.o/ reshape_packed.o/ selected_int_kind.o/ selected_real_kind.o/ unpack_generic.o/ in_pack_generic.o/ in_unpack_generic.o/ misc_specifics.o/ bounds.o/
Then bounds.o (libgfortran/runtime/bounds.c) in PTX: declarations of _gfortrani_bounds_iforeach_return, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents, _gfortrani_bounds_reduced_extents and _gfortrani_count_0, extern declarations of _gfortran_runtime_error, _gfortran_size0 and _gfortrani_internal_error, and error-message string constants stored as decimal byte arrays ("Incorrect size in %s of %s intrinsic: should not be zero-sized", "Incorrect size of %s of %s intrinsic: should be zero-sized", "Funny sized logical array in count_", plus the __func__ strings "bounds_reduced_extents" and "bounds_equal_extents").]
[More bounds.c constants, decoded from the byte arrays: "Incorrect extent in return value of %s intrinsic: is %ld, should be %ld", "Incorrect extent in %s of %s intrinsic in dimension %ld: is %ld, should be %ld", the assertion texts "GFC_DESCRIPTOR_RANK(a) == GFC_DESCRIPTOR_RANK(b)" and "GFC_DESCRIPTOR_RANK(a) == GFC_DESCRIPTOR_RANK(b) - 1", and the source-path fragment ...tran/runtime/bounds.c. Then the PTX bodies of _gfortrani_bounds_iforeach_return (compares the return array's extent against the source rank), _gfortrani_bounds_ifunction_return (checks whether the return array is allocated, calls _gfortran_size0, and verifies the per-dimension extents), _gfortrani_bounds_equal_extents (rank assertion, zero-size checks, per-dimension extent comparison) and the start of _gfortrani_bounds_reduced_extents.]
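All of these bounds helpers reduce to the same per-dimension comparison: the extent of a dimension is computed as ubound + 1 - lbound (the add.u64 ...,1 / sub.u64 pairs above) and a mismatch is reported through _gfortran_runtime_error. A simplified sketch of that check, using a hypothetical descriptor type rather than the real GFC descriptor (whose rank the PTX reads at byte offset 28 and whose bounds it reads at offsets 48/56):

#include <stddef.h>
#include <stdio.h>
#include <stdlib.h>

struct dim  { ptrdiff_t lbound, ubound; };
struct desc { signed char rank; struct dim dim[7]; };

/* Verify that every extent of a matches the corresponding extent of b,
   reporting the first mismatch with the message text seen in the dump.  */
static void
check_equal_extents (const struct desc *a, const struct desc *b,
                     const char *arg, const char *intrinsic)
{
  for (signed char n = 0; n < b->rank; n++)
    {
      ptrdiff_t ea = a->dim[n].ubound + 1 - a->dim[n].lbound;
      ptrdiff_t eb = b->dim[n].ubound + 1 - b->dim[n].lbound;
      if (ea != eb)
        {
          fprintf (stderr,
                   "Incorrect extent in %s of %s intrinsic in dimension %ld:"
                   " is %ld, should be %ld\n",
                   arg, intrinsic, (long) (n + 1), (long) ea, (long) eb);
          abort ();
        }
    }
}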
[PTX bodies of _gfortrani_bounds_reduced_extents (extent comparison with one dimension reduced away) and _gfortrani_count_0 (counts true elements of a logical array, with an internal error for unsupported logical kinds: "Funny sized logical array in count_"). Then compile_options.o (libgfortran/runtime/compile_options.c): declarations of _gfortran_set_options, _gfortrani_init_compile_options, _gfortran_set_convert, _gfortran_set_record_marker and _gfortran_set_max_subrecord_length.]
_gfortran_set_max_subrecord_length_gfortran_runtime_error .extern .func _gfortran_runtime_97,108,105,100,32,118,97,108,117,101,32,102,111,114,32,114,101,99,111,114,100,32,109,97,114,107,10132,118,97,108,117,101,32,102,111,114,32,109,97,120,105,109,117,109,32,115,117,98,114,101,99,111,114,100,32,108,101,110,103,116,104,0 }; // BEGIN GLOBAL VAR DEF: _gfortrani_compile_options .visible .global .align 8 .u64 _gfortrani_compile_options[ortran_set_options .visible .func _gfortran_set_optionspredmov.u32 %r29,%ar0; mov.u64 %r30,%ar1; .loc 1 148 6 setp.gt.s32 %r31,%r29,0; @ ! %r31 bra $L1; .loc 1 149 39 ld.u32 %r22,[%r30]; .loc 1 149 30 cvta.global.u64 %r32,_gfortrani_compile_options; st.u32 [%r32],%r22; .loc 1 150 6 setp.eq.u32 %r33,%r29,1; @ %r33 bra $L1; .loc 1 151 31 ld.u32 %r23,[%r30+4]; st.u32 [%r32+4],%r23; .loc 1 152 6 setp.eq.u32 %r35,%r29,2; @ %r35 bra $L1; .loc 1 153 30 ld.u32 %r24,[%r30+8]; st.u32 [%r32+8],%r24; .loc 1 154 6 setp.eq.u32 %r37,%r29,3; @ %r37 bra $L1; .loc 1 155 31 ld.u32 %r25,[%r30+12]; st.u32 [%r32+16],%r25; .loc 1 156 6 setp.eq.u32 %r39,%r29,4; @ %r39 bra $L1; .loc 1 157 31 ld.u32 %r26,[%r30+16]; st.u32 [%r32+20],%r26; .loc 1 158 6 setp.eq.u32 %r41,%r29,5; @ %r41 bra $L1; .loc 1 159 34 ld.u32 %r27,[%r30+20]; st.u32 [%r32+36],%r27; .loc 1 160 6 setp.eq.u32 %r43,%r29,6; @ %r43 bra $L1; .loc 1 161 33 ld.u32 %r28,[%r30+24]; st.u32 [%r32+40],%r28; $L1: .loc 1 1_gfortrani_init_compile_options .visible .func _gfortrani_init_compile_options {.reg .u64 %r27; .loc 1 207 28 cvta.global.u64 %r22,_gfortrani_compile_options; mov.u64 %r23,2194728288324; st.u64 [%r22],%r23; .loc 1 211 28 mov.u32 %r25,0; st.u32 [%r22+8],%r25; .loc 1 212 29 mov.u64 %r27,4294967296; st.u64 [%r22+16],%r27; .loc 1 214 31 st.u32 [%r22+40],%r25;_gfortran_set_convert .visible .func _gfortran_set_conmov.u32 %r22,%ar0; .loc 1 226 27 st.global.u32 [_gfortrani_compile_options+12],%r22;_gfortran_set_record_marker .visible .func _gfortran_set_record_markerpred.reg .u64 %r28; .reg .u64 %r29; mov.u32 %r22,%ar0; .loc 1 237 3 setp.eq.u32 %r23,%r22,4; @ %r23 bra $L26; setp.eq.u32 %r24,%r22,8; @ %r24 bra $L27; bra $L30; $L26: .loc 1 240 37 mov.u64 %r26,4; st.global.u64 [_gfortrani_compile_options+24],%r26; .loc 1 241 7 bra $L25; $L27: .loc 1 244 37 mov.u64 %r28,8; st.global.u64 [_gfortrani_compile_options+24],%r28; .loc 1 245 7 bra $L25; $L30: .loc 1 248 7_gfortran_runtime_error25: .loc 1 251_gfortran_set_max_subrecord_length .visible .func _gfortran_set_max_subrecord_length258 38 add.u32 %r25,%r24,-1; .loc 1 258 6 setp.le.u32 %r26,%r25,2147483638; @ %r26 bra $L32; .loc 1 260 7 cvta.const.u64 %r27_gfortran_runtime_error32: .loc 1 264 40 st.global.u32 [_gfortrani_compile_options+32],%r24; .loc 1 265memory.o/_gfortrani_xm_gfortrani_xmalloc../libgfortran/runtime/memory.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarraygfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xc_gfortrani_xc_gfortrani_x_gfortrani_x_gfortran_os_error .extern .func _gfortran_os_erro__errno__errno7102,97,105,108,10116,101,103,101,114,32,97,108,108,111,99,97,114,114,97,12177102,97,105,108,101,100,32,105,110,32,120,109,97,108,108,111,99,97,114,114,9765,108,108,111,99,97,116,105,110,103,32,99,108,101,97,114,101,100,32,109,101,109,111,114,121,32,102,97,105,108,101,1077102,97,105,108,117,114,101,32,105,110,32,120,114,101,97,108,108,111,99_gfortrani_xm_gfortrani_xmallocu64 %r27; .reg .pred 7 max.u64 %r26,40 6 setp.ne.u64 %r29,%r27,0; @ %r29 bra $L1; .loc 1 41 5call _gfortran_os_error,(_gfortrani_xmallocarraygfortrani_xmallocarrayu16 
%r40; .reg .u16 %r41; .reg .u32 %r42; .reg .u16 %r43; .reg .pred32u64predpredpred53 7 set.u32.eq.u64 %r34,%r31,0; neg.s32 %r35,%r34; .loc 1 53 17 set.u32.eq.u64 %r37,%r32,0; neg.s32 %r38,%r37; .loc 1 53 14 cvt.u16.u32 %r40,%r35; cvt.u16.u32 %r41,%r38; or.b16 %r39,%r40,%r41; .loc 1 53 6 cvt.u32.u16 %r42,%r39; cvt.u16.u8 %r43,%r42; setp.ne.u16 %r44,%r43,0; @ %r44 bra $L13; .loc 1 55 12 mov.u64 %r29,0; shr.u64 %r45,%r31,32; shr.u64 %r46,%r32,32; cvt.u32.u64 %r48,%r8; cvt.u32.u64 %r53,%r31; cvt.u64.u32 %r52,%r53; cvt.u32.u64 %r55,%r32; cvt.u64.u32 %r54,%r55; mul.lo.u64 %r47,%r52,%r54; bra $L5; $L8: cvt.u32.u64 %r58,%r46; cvt.u32.u64 %r59,%r31; bra $L9; $L7: cvt.u32.u64 %r60,%r46; setp.ne.u32 %r61,%r60,0; @ %r61 bra $L11; cvt.u32.u64 %r58,%r45; cvt.u32.u64 %r59,%r32; $L9: cvt.u32.u64 %r63,%r31; cvt.u64.u32 %r62,%r63; cvt.u32.u64 %r65,%r32; cvt.u64.u32 %r64,%r65; mul.lo.u64 %r66,%r62,%r64; cvt.u64.u32 %r68,%r59; cvt.u64.u32 %r69,%r58; shr.u64 %r71,%r66,32; mad.lo.u64 %r72,%r68,%r69,%r71; shr.s64 %r73,%r72,32; cvt.u32.u64 %r74,%r73; setp.ne.u32 %r75,%r74,0; @ %r75 bra $L11; shl.b64 %r76,%r72,32; cvt.u32.u64 %r78,%r66; cvt.u64.u32 %r77,%r78; or.b64 %r47,%r76,%r77; bra $L5; $L11: mul.lo.u64 %r47,%r31,%r32; mov.u64 %r29,1; $L5: mov.u64 %r25,%r47; .loc 1 55 11 setp.eq.u64 %r81,%r29,0; @ %r81 bra $L4; .loc 1 57 7__errnomov.u32 %r83,12; st.u32 [%r82],%r83; .loc 1 58 7 cvta.const.u64 %r884; call _gfortran_os_error$L13: .loc 1 54 10 mov.u64 %r25,1; $L4: .loc 1 61 763 6 setp.ne.u64 %r88,%r86,0; @ %r88 bra $L3; .loc 1 64 5call _gfortran_os_error$L3: .loc 1 67gfortrani_xc_gfortrani_xcallocpred75 7 set.u32.eq.u64 %r29,%r26,0; neg.s32 %r30,%r29; .loc 1 75 17 set.u32.eq.u64 %r32,%r27,0; neg.s32 %r33,%r32; .loc 1 75 14 cvt.u16.u32 %r35,%r30; cvt.u16.u32 %r36,%r33; or.b16 %r34,%r35,%r36; .loc 1 75 6 cvt.u32.u16 %r37,%r34; cvt.u16.u8 %r38,%r37; setp.eq.u16 %r39,%r38,0; @ %r39 bra $L18; .loc 1 76 18 mov.u64 %r27,1; .loc 1 76 11 mov.u64 %r26,%r27; $L18: .loc 1 78 13call (%value_in),c17; .loc 1 80 5 cvta.const.u64 %r4545; call _gfortran_os_errorgfortrani_x_gfortrani_xrealloc92 16 max.u64 %r28,23; .loc 1 94 5call _gfortran_os_error$L23: .loc 1 97string.o/ gfortrani_fstrlengfortrani_fstrlenfortran/runtime/string.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_fstrcpygfortrani_fstrcpygfortrani_cf_strcpygfortrani_cf_str_gfortrani_fc_strd_gfortrani_fc_strd_gfortrani_fc_strdup_notrgfortrani_fc_strdup_notr; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_find_op_gfortrani_find_option_gfortrani_gfc_itoagfortrani_gfc_itoand_gfortran_os_error .extern .func _gfortran_os_errostrncasecmp_gfortran_generate_error .extern .func _gfortran_generate_error_gfortrani_sys_abort .extern .func _gfortrani_sys___umodti3 .extern// BEGIN GLOBAL FUNCTION DECL: __udivti3 .extern077102,97,105,108,101,100,32,105,110,32,102,99,95,115,116,114,100,11_gfortrani_fstrlengfortrani_fstrlenmov.u64 %r26,%ar0; mov.u64 %r25,%ar1; .loc 1 36 3 bra $L2; $L4: .loc 1 37 15 add.u64 %r22,%r25,-1; add.u64 %r28,%r26,%r22; .loc 1 37 8 ld.s8 %r30,[%r28]; cvt.u16.u32 %r29,%r30; setp.ne.u16 %r31,%r29,32; @ %r31 bra $L1; .loc 1 36 22 mov.u64 %r25,%r22; $L2: .loc 1 36 3 setp.ne.u64 %r32,%r25,0; @ %r32 bra $L4; $L1: .loc 1 41gfortrani_fstrcpygfortrani_fstrcpypred %r29; .reg .u64 %r36; .reg .u64 %r44; .reg .u32 %r49; .reg .u64 %r51; .reg .u64 %r52; .reg .u6452 6 setp.lt.u64 %r29,%r28,%r26; @ %r29 bra $L6; .loc 1 55 7mov.u64 %r24,%r26; bra $L5; $L6: .loc 1 60 72 7 sub.u64 %r51,%r26,%r28; .loc 1 62 15 add.u64 %r52,%r25,%r28; .loc 1 62 7 mov.u32 %r49,3r5,[%value_in]; } .loc 1 63 14 
mov.u64 %r24,%r28; $L5: .loc 1 65gfortrani_cf_strcpypred76 1327mov.u64 %r24,%r29; .loc 1 78 6 setp.lt.u64 %r31,%r24,%r26; @ %r31 bra $L9; .loc 1 81 7mov.u64 %r24,%r26; bra $L8; $L9: .loc 1 86 788 7 sub.u64 %r53,%r26,%r24; .loc 1 88 15 add.u64 %r54,%r25,%r24; .loc 1 88 7 mov.u32 %r515455,[%value_in]; } $L8: .loc 1 91gfortrani_fc_strd_gfortrani_fc_strdup6 3 bra $L12; $L14: .loc 1 37 15 add.u64 %r22,%r27,-1; add.u64 %r28,%r26,%r22; .loc 1 37 8 ld.s8 %r30,[%r28]; cvt.u16.u32 %r29,%r30; setp.ne.u16 %r31,%r29,32; @ %r31 bra $L13; .loc 1 36 22 mov.u64 %r27,%r22; $L12: .loc 1 36 3 setp.ne.u64 %r32,%r27,0; @ %r32 bra $L14; $L13: .loc 1 130 13call (%value_in),strndup.loc 1 131 61;call _gfortran_os_error$L1gfortrani_fc_strdup_notrgfortrani_fc_strdup_notr44strndup145 6 setp.ne.u64 %r29,%r27,0; @ %r29 bra $L16; .loc 1 146 5call _gfortran_os_error$L16: .loc 1 148_gfortrani_find_op_gfortrani_find_optionpredpredmov.u64 %r31,%ar2; mov.u64 %r32,%ar3; mov.u64 %r33,%ar4; .loc 1 36 3 bra $L19; $L21: .loc 1 37 15 add.u64 %r25,%r31,-1; add.u64 %r34,%r30,%r25; .loc 1 37 8 ld.s8 %r36,[%r34]; cvt.u16.u32 %r35,%r36; setp.ne.u16 %r37,%r35,32; @ %r37 bra $L20; .loc 1 36 22 mov.u64 %r31,%r25; $L19: .loc 1 36 3 setp.ne.u64 %r38,%r31,0; @ %r38 bra $L21; bra $L20; $L24: .loc 1 163 1624.loc 1 163 8 setp.ne.u64 %r42,%r31,%r40; @ %r42 bra $L22strncasecmp.loc 1 163 35 setp.ne.u32 %r47,%r46,0; @ %r47 bra $L22; .loc 1 164 18 ld.u32 %r28,[%r32+8]; bra $L18; $L22: .loc 1 162 26 add.u64 %r32,%r32,16; .loc 1 162 14 ld.u64 %r24,[%r32]; .loc 1 162 3 setp.ne.u64 %r48,%r24,0; @ %r48 bra $L24; bra $L25; $L20: .loc 1 162 14 ld.u64 %r24,[%r32]; .loc 1 162 3 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L24; $L25: .loc 1 166 3 mov.u32 %r51,5002gfortran_generate_error.loc 1 168 10 mov.u32 %r28,-1; $L18gfortrani_gfc_itoagfortrani_gfc_itoapredmov.u64 %r33,%ar0; ld.u64 %r131,[%r33]; ld.u64 %r132,[%r33+8]188 6 setp.gt.u64 %r37,%r36,49; @ %r37 bra $L29; .loc 1 189 5 { call _gfortrani_sys_9: .loc 1 191 6 or.b64 %r38,%r131,%r132; setp.eq.u64 %r40,%r38,0; @ %r40 bra $L37; .loc 1 196 6 setp.lt.s64 %r42,%r132,0; @ %r42 bra $L32; .loc 1 195 5 mov.u64 %r127,%r131; mov.u64 %r128,%r132; .loc 1 194 12 mov.u32 %r27,0; bra $L34; $L32: .loc 1 199 11 set.u32.ne.u64 %r50,%r131,0; cvt.s64.s32 %r48,%r50; neg.s64 %r127,%r131; sub.u64 %r128,%r48,%r132; .loc 1 198 16 mov.u32 %r27,1; $L34: .loc 1 202 5 add.u64 %r32,%r35,49; .loc 1 203 6 mov.u32 %r55,0; st.u8 [%r35+49],%r55; add.u64 %r142,%frame,16; add.u64 %r141,%frame,32; $L41: mov.u64 %r31,%r32; .loc 1 207 12 add.u64 %r32,%r32,-1; .loc 1 207 23 mov.u64 %r62,%r127; mov.u64 %r63,%r128; mov.u64 %r66,10; mov.u64 %r67,0; st.u64 [%frame+16],%r127; st.u64 [%frame+24],%r128; st.u64 [%frame+32],%r66; st.u64 [%frame+40],%r67; {142141; call __umod.loc 1 207 18 ld.u16 %r89,[%frame]; add.u16 %r88,%r89,48; .loc 1 207 12 cvt.u32.u16 %r90,%r88; st.u8 [%r32],%r90; .loc 1 208 9 st.u64 [%frame+16],%r127; st.u64 [%frame+24],%r128; st.u64 [%frame+32],%r66; st.u64 [%frame+40],%r67; {142141; call __udiv]; ld.u64 %r128,[%frame+8]; .loc 1 205 9 setp.ne.u64 %r119,%r63,0; @ %r119 bra $L41; setp.gt.u64 %r123,%r62,9; @ %r123 bra $L41; .loc 1 211 6 setp.eq.u32 %r124,%r27,0; @ %r124 bra $L28; .loc 1 212 10 mov.u32 %r125,45; st.u8 [%r32+-1],%r125; add.u64 %r32,%r31,-2; bra $L28; $L37: .loc 1 192 12 cvta.const.u64 %r32,$LC1; $L28: .loc 1 214 select.o/ gfortran_selec_gfortran_select_stringfortran/runtime/select_in_gfortran_select_string_char4gfortran_select_string_char4_gfortran_compare_gfortran_compare); // BEGIN GLOBAL FUNCTION DECL: 
_gfortran_compare_string_char4_gfortran_compare_string_char4F: _gfortran_selec_gfortran_select_stringu64u64 %r83; .reg .u64 %r86; .reg .u32u32 %r104; .reg .pred %r105; .reg .pred %r106; .reg .u64 %r107pred %r119; mov.u64 %r54,%ar0; mov.u32 %r55,%ar1; mov.u64 %r56,%ar2; mov.u64 %r57,%ar3; .loc 1 56 6 setp.eq.u32 %r58,%r55,0; @ %r58 bra $L12; .loc 1 61 6 ld.u64 %r59,[%r54]; setp.ne.u64 %r60,%r59,0; @ %r60 bra $L13; .loc 1 61 34 ld.u64 %r23,[%r54+16]; .loc 1 61 26 setp.ne.u64 %r61,%r23,0; @ %r61 bra $L14; .loc 1 63 20 ld.u32 %r53,[%r54+32]; .loc 1 65 12 add.u64 %r45,%r54,40; .loc 1 66 16 add.u32 %r55,%r55,-1; .loc 1 67 10 setp.eq.u32 %r62,%r55,0; @ %r62 bra $L1; .loc 1 73 6 ld.u64 %r63,[%r54+40]; setp.ne.u64 %r64,%r63,0; @ %r64 bra $L15; .loc 1 75 49 ld.u64 %r23,[%r54+56]; .loc 1 65 12 mov.u64 %r54,%r45; bra $L4; $L14: .loc 1 54 7 mov.u32 %r53,-1; $L4: .loc 1 75call (%value_in),_gfortran_compare_string75 10 setp.lt.s32 %r71,%r70,0; @ %r71 bra $L5; .loc 1 77 21 ld.u32 %r53,[%r54+32]; bra $L1; $L5: .loc 1 80 16 add.u32 %r55,%r55,-1; .loc 1 81 10 setp.eq.u32 %r72,%r55,0; @ %r72 bra $L1; .loc 1 79 12 add.u64 %r54,%r54,40; bra $L3; $L13: .loc 1 54 7 mov.u32 %r53,-1; bra $L3; $L15: .loc 1 65 12 mov.u64 %r54,%r45; $L3: .loc 1 85 13 cvt.s64.s32 %r73,%r55; shl.b64 %r75,%r73,2; add.u64 %r76,%r75,%r73; shl.b64 %r77,%r76,3; .loc 1 85 25 add.u64 %r78,%r77,-40; .loc 1 85 5 add.u64 %r47,%r54,%r78; .loc 1 87 6 ld.u64 %r79,[%r47+16]; setp.eq.u64 %r80,%r79,0; @ %r80 bra $L6; .loc 1 103 9 setp.gt.s32 %r81,%r55,0; @ %r81 bra $L7; bra $L1; $L6: .loc 1 89 11 ld.u64 %r86,[%r47+8]; ld.u64 %r83,[%r47]call (%value_in),_gfortran_compare_stringld.param.u32 %r87,[%value_in]; } .loc 1 89 10 setp.gt.s32 %r88,%r87,0; @ %r88 bra $L8; .loc 1 90 17 ld.u32 %r53,[%r47+32]; bra $L1; $L8: .loc 1 92 16 add.u32 %r48,%r55,-1; .loc 1 103 9 setp.le.s32 %r89,%r55,1; @ %r89 bra $L1; mov.u32 %r55,%r48; $L7: mov.u32 %r46,-1; $L11: .loc 1 105 18 add.u32 %r90,%r55,%r46; .loc 1 105 11 shr.s32 %r50,%r90,1; .loc 1 107 17 cvt.s64.s32 %r91,%r50; shl.b64 %r93,%r91,2; add.u64 %r94,%r93,%r91; shl.b64 %r95,%r94,3; .loc 1 107 9 add.u64 %r51,%r54,%r95; .loc 1 108 11 ld.u64 %r100,[%r51+8]; ld.u64 %r97,[%r5110call (%value_in),_gfortran_compare_string110 10 setp.ne.u32 %r102,%r101,0; @ %r102 bra $L9; .loc 1 111 17 ld.u32 %r53,[%r51+32]; bra $L1; $L9: .loc 1 113 10 setp.lt.s32 %r103,%r101,0; .loc 1 116 14 selp.u32 %r55,%r55,%r50,%r103; selp.u32 %r46,%r50,%r46,%r103; .loc 1 103 14 add.u32 %r104,%r46,1; .loc 1 103 9 setp.lt.s32 %r105,%r104,%r55; @ %r105 bra $L11; .loc 1 125 6 setp.eq.u32 %r106,%r46,-1; @ %r106 bra $L1; .loc 1 128 13 cvt.s64.s32 %r107,%r46; shl.b64 %r109,%r107,2; add.u64 %r110,%r109,%r107; shl.b64 %r111,%r110,3; .loc 1 128 5 add.u64 %r49,%r54,%r111; .loc 1 129 7 ld.u64 %r116,[%r49+16]; ld.u64 %r1177116; call (%value_in),_gfortran_compare_stringld.param.u32 %r118,[%value_in]; } .loc 1 129 6 setp.gt.s32 %r119,%r118,0; @ %r119 bra $L1; .loc 1 130 13 ld.u32 %r53,[%r49+32]; bra $L1; $L12: .loc 1 57 12 mov.u32 %r53,-1;gfortran_select_string_char4gfortran_select_string_char4u64u64 %r83; .reg .u64 %r86; .reg .u32u32 %r104; .reg .pred %r105; .reg .pred %r106; .reg .u64 %r107pred %r119; mov.u64 %r54,%ar0; mov.u32 %r55,%ar1; mov.u64 %r56,%ar2; mov.u64 %r57,%ar3; .loc 1 56 6 setp.eq.u32 %r58,%r55,0; @ %r58 bra $L38; .loc 1 61 6 ld.u64 %r59,[%r54]; setp.ne.u64 %r60,%r59,0; @ %r60 bra $L39; .loc 1 61 34 ld.u64 %r23,[%r54+16]; .loc 1 61 26 setp.ne.u64 %r61,%r23,0; @ %r61 bra $L40; .loc 1 63 20 ld.u32 %r53,[%r54+32]; .loc 1 65 12 add.u64 %r45,%r54,40; .loc 1 
66 16 add.u32 %r55,%r55,-1; .loc 1 67 10 setp.eq.u32 %r62,%r55,0; @ %r62 bra $L27; .loc 1 73 6 ld.u64 %r63,[%r54+40]; setp.ne.u64 %r64,%r63,0; @ %r64 bra $L41; .loc 1 75 49 ld.u64 %r23,[%r54+56]; .loc 1 65 12 mov.u64 %r54,%r45; bra $L30; $L40: .loc 1 54 7 mov.u32 %r53,-1; $L30: .loc 1 75call (%value_in),_gfortran_compare_string_char475 10 setp.lt.s32 %r71,%r70,0; @ %r71 bra $L31; .loc 1 77 21 ld.u32 %r53,[%r54+32]; bra $L27; $L31: .loc 1 80 16 add.u32 %r55,%r55,-1; .loc 1 81 10 setp.eq.u32 %r72,%r55,0; @ %r72 bra $L27; .loc 1 79 12 add.u64 %r54,%r54,40; bra $L29; $L39: .loc 1 54 7 mov.u32 %r53,-1; bra $L29; $L41: .loc 1 65 12 mov.u64 %r54,%r45; $L29: .loc 1 85 13 cvt.s64.s32 %r73,%r55; shl.b64 %r75,%r73,2; add.u64 %r76,%r75,%r73; shl.b64 %r77,%r76,3; .loc 1 85 25 add.u64 %r78,%r77,-40; .loc 1 85 5 add.u64 %r47,%r54,%r78; .loc 1 87 6 ld.u64 %r79,[%r47+16]; setp.eq.u64 %r80,%r79,0; @ %r80 bra $L32; .loc 1 103 9 setp.gt.s32 %r81,%r55,0; @ %r81 bra $L33; bra $L27; $L32: .loc 1 89 11 ld.u64 %r86,[%r47+8]; ld.u64 %r83,[%r47]call (%value_in),_gfortran_compare_string_char4ld.param.u32 %r87,[%value_in]; } .loc 1 89 10 setp.gt.s32 %r88,%r87,0; @ %r88 bra $L34; .loc 1 90 17 ld.u32 %r53,[%r47+32]; bra $L27; $L34: .loc 1 92 16 add.u32 %r48,%r55,-1; .loc 1 103 9 setp.le.s32 %r89,%r55,1; @ %r89 bra $L27; mov.u32 %r55,%r48; $L33: mov.u32 %r46,-1; $L37: .loc 1 105 18 add.u32 %r90,%r55,%r46; .loc 1 105 11 shr.s32 %r50,%r90,1; .loc 1 107 17 cvt.s64.s32 %r91,%r50; shl.b64 %r93,%r91,2; add.u64 %r94,%r93,%r91; shl.b64 %r95,%r94,3; .loc 1 107 9 add.u64 %r51,%r54,%r95; .loc 1 108 11 ld.u64 %r100,[%r51+8]; ld.u64 %r97,[%r5110call (%value_in),_gfortran_compare_string_char4110 10 setp.ne.u32 %r102,%r101,0; @ %r102 bra $L35; .loc 1 111 17 ld.u32 %r53,[%r51+32]; bra $L27; $L35: .loc 1 113 10 setp.lt.s32 %r103,%r101,0; .loc 1 116 14 selp.u32 %r55,%r55,%r50,%r103; selp.u32 %r46,%r50,%r46,%r103; .loc 1 103 14 add.u32 %r104,%r46,1; .loc 1 103 9 setp.lt.s32 %r105,%r104,%r55; @ %r105 bra $L37; .loc 1 125 6 setp.eq.u32 %r106,%r46,-1; @ %r106 bra $L27; .loc 1 128 13 cvt.s64.s32 %r107,%r46; shl.b64 %r109,%r107,2; add.u64 %r110,%r109,%r107; shl.b64 %r111,%r110,3; .loc 1 128 5 add.u64 %r49,%r54,%r111; .loc 1 129 7 ld.u64 %r116,[%r49+16]; ld.u64 %r1177116; call (%value_in),_gfortran_compare_string_char4ld.param.u32 %r118,[%value_in]; } .loc 1 129 6 setp.gt.s32 %r119,%r118,0; @ %r119 bra $L27; .loc 1 130 13 ld.u32 %r53,[%r49+32]; bra $L27; $L38: .loc 1 57 12 mov.u32 %r53,-1GLOBAL FUNCTION DECL: _gfortrani_stupid_function_name_for_static_linking .visible .func _gfortrani_stupid_function_name_for_static_linking; .file 1 "../../../../libgfortran/runtime/minimal.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_set_args .visible .func _gfortran_set_args_gfortrani_get_args .visible .func _gfortrani_get_arg); // BEGIN GLOBAL FUNCTION DECL: _gfortrani_estr_writegfortrani_estr_write_gfortrani_st_printfgfortrani_st__gfortrani_sys_abort .visible .func _gfortrani_sys_abort; // BEGIN FUNCTION DECL: recursion_check .func recursion_check; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_exit_error .visible .func _gfortrani_exit_error_gfortrani_show_locus .visible .func _gfortrani_show_gfortran_os_error .visible .func _gfortran_os_erro_gfortran_os_error_at .visible .func _gfortran_os_error_at_gfortran_runtime_error .visible .func _gfortran_runtime__gfortran_runtime_error_at .visible .func _gfortran_runtime_error_at_gfortran_runtime_warning_at .visible .func _gfortran_runtime_warning_at_gfortrani_internal_error .visible .func 
_gfortrani_internal__gfortran_stop_numeric .visible .func _gfortran_stop_numer_gfortran_stop_string .visible .func _gfortran_stop_string_gfortran_error_stop_string .visible .func _gfortran_error_stop_string_gfortran_error_stop_numeric .visible .func _gfortran_error_stop_numerVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_optionwrite$LC9 .const .align 1 .u8 $LC9[27] = {10,70,111,114,116,114,97,110,32,114,117,110,116,105,109,101,32,119,97,114,110,105,110,103,58,3282,79,82,32,83,84,79,80,32,0 }; // BEGIN VAR DEF: argc_save .global .align 4 .u32 argc_save[110,80,114,111,103,114,97,109,32,97,98,111,114,116,101,100,46,10,0 }; // BEGIN VAR DEF: magic$0 .global .align 4 .u32 magic$0[1]73,110,116,101,114,110,97,108,32,69,114,114,111,114,58,32,0 }; // BEGIN VAR DEF: $LC11 .const .align 1 .u8 $LC11[9] = {83,84,79,80,32,37,100,10,0 }; // BEGIN VAR DEF: $LC12 .const .align 1 .u8 $LC12[6] = {83,84,79,80,32,0 }; // BEGIN VAR DEF: argv_save .global .align 8 .u64 argv_save[1]82,79,82,32,83,84,79,80,32,37,100,10,0 }; // BEGIN GLOBAL VAR DEF: _gfortrani_options .visible .global .align 8 .u64 _gfortrani_options[7]69,114,114,111,114,32,116,101,114,109,105,110,97,116,105,11165,116,32,108,105,110,101,32,37,100,32,111,102,32,102,105,108,101,32,37,115,32,40,117,110,105,116,32,61,32,37,100,4165,116,32,108,105,110,101,32,37,100,32,111,102,32,102,105,108,101,32,379,112,101,114,97,116,105,110,103,32,115,121,115,116,101,109,32,101,114,114,111,114,58,322] = {1058,3270,111,114,116,114,97,110,32,114,117,110,116,105,109,101,32,101,114,114,111,114,58,32,0 }97,110,32,114,117,110,116,105,109,101,32,101_gfortrani_stupid_function_name_for_static_linking .visible .func _gfortrani_stupid_function_name_for_static_linking { .loc 1 64_gfortran_set_args .visible .func _gfortran_set_args76 13 st.global.u32 [argc_save],%r22; .loc 1 77 13 st.global.u64 [argv_save],%r23; .loc 1 7_gfortrani_get_args .visible .func _gfortrani_get_args87 9 ld.global.u32 %r26,[argc_save]; st.u32 [%r24],%r26; .loc 1 88 9 ld.global.u64 %r27,[argv_save]; st.u64 [%r25_gfortrani_estr_writegfortrani_estr_writeu64 %r27; .reg .u32 %r29; .reg .u32 %r32; mov.u64 %r25,%ar0; .loc 1 10025mov.u32 %r29,229value,%r32; .loc 1 101_gfortrani_st_printfgfortrani_st_printfpred %r31; .reg .u64 %r114 13args; @ %r31 shfl.idx.b32 %r28,%r28,%r30,31;gfortrani_sys_abort .visible .func _gfortrani_sys_abort { .reg .u32 %r22; .reg .u64 %r24; .reg .pred %r25; .reg .pred %r26; .reg .u64 %r27; .reg .u32 %r28; .reg .predloc 1 128 14 cvta.global.u64 %r24,_gfortrani_options; ld.u32 %r22,[%r24+44]; .loc 1 128 6 setp.eq.u32 %r25,%r22,1; @ %r25 bra $L7; .loc 1 129 7 setp.ne.u32 %r26,%r22,-1; @ %r26 bra $L8; .loc 1 129 53 cvta.global.u64 %r27,_gfortrani_compile_options; .loc 1 129 35 ld.u32 %r28,[%r27+16]; setp.ne.u32 %r29,%r28,1; @ %r29 bra $L8; $L7: .loc 1 131 7 cvta.const.u64 %r30,$LCcall (%value_in),_gfortrani_estr_write,(%out_arg1); ld.param.u64 %r31,[%value_in]; } $L8} // BEGIN FUNCTION DEF: recursion_check .func recursion_check {loc 1 198 6 ld.global.u32 %r23,[magic$0]; setp.ne.u32 %r24,%r23,551452929; @ %r24 bra $L13; .loc 1 199 5 { call _gfortrani_sys_13: .loc 1 201 9 mov.u32 %r25,551452929; st.global.u32 [magic$0],%r25; .loc 1 2_gfortrani_exit_error .visible .func _gfortrani_exit_errorloc 1 144 14 cvta.global.u64 %r25,_gfortrani_options; ld.u32 %r22,[%r25+44]; .loc 1 144 6 setp.eq.u32 %r26,%r22,1; @ %r26 bra $L15; .loc 1 145 7 setp.ne.u32 %r27,%r22,-1; @ %r27 bra $L16; .loc 1 145 53 cvta.global.u64 %r28,_gfortrani_compile_options; .loc 1 145 35 ld.u32 
%r29,[%r28+16]; setp.ne.u32 %r30,%r29,1; @ %r30 bra $L16; $L15: .loc 1 147 7 cvta.const.u64 %r31,$LC131; call (%value_in),_gfortrani_estr_write,(%out_arg1); ld.param.u64 %r32,[%value_in]; } $L16} // BEGIN GLOBAL FUNCTION DEF: _gfortrani_show_locus .visible .func _gfortrani_show_locusmov.u64 %r29,%ar0; .loc 1 161 15 cvta.global.u64 %r30,_gfortrani_options; .loc 1 161 6 ld.u32 %r32,[%r30+16]; set.u32.eq.u32 %r33,%r32,0; neg.s32 %r34,%r33; .loc 1 161 22 set.u32.eq.u64 %r36,%r29,0; neg.s32 %r37,%r36; cvt.u16.u32 %r39,%r34; cvt.u16.u32 %r40,%r37; or.b16 %r38,%r39,%r40; cvt.u32.u16 %r41,%r38; cvt.u16.u8 %r42,%r41; setp.ne.u16 %r43,%r42,0; @ %r43 bra $L20; .loc 1 161 43 ld.u64 %r23,[%r29+8]; .loc 1 161 37 setp.eq.u64 %r44,%r23,0; @ %r44 bra $L20; .loc 1 164 10 ld.u32 %r24,[%r29+4]; .loc 1 177 15 ld.u32 %r28,[%r29+16]; .loc 1 164 6 setp.le.s32 %r45,%r24,0; @ %r45 bra $L22; .loc 1 176 4 st.u32 [%stack+16],%r24; st.u64 [%stack+8],%r2346,$LC2stack; call (%value_in),_gfortrani_st_179 7 bra $L20; $L22: .loc 1 182 3 st.u64 [%stack+8],%r2349,$LC3;stack; call (%value_in),_gfortrani_st_$L_gfortran_os_error .visible .func _gfortran_os_errormov.u64 %r22,%ar0; .loc 1 212 3 { call recursion_check; } .loc 1 213 3 cvta.const.u64 %r23,$LC4;call (%value_in),_gfortrani_estr_write,(%out_arg1); ld.param.u64 %r24,[%value_in]; } .loc 1 214 3call (%value_in),_gfortrani_estr_write,(215 3 cvta.const.u64 %r27; call (%value_in),_gfortrani_estr_write,(216 3 mov.u32 %r29,1_gfortrani_exit_error_gfortran_os_error_at .visible .func _gfortran_os_error_atpred %r37; .reg .u64 %rmov.u64 %r24,%ar1; .loc 1 230 3 { call recursion_check; }call (%value_in),_gfortrani_estr_write,(232 3 cvta.const.u64 %r27,$LC6; call (%value_in),_gfortrani_estr_write,(4 3args; @ %r37 shfl.idx.b32 %r32,%r32,%r36,31; .loc 1 236 3 cvta.const.u64 %r333; call (%value_in),_gfortrani_estr_write237 3 mov.u32 %r35,1_gfortrani_exit_error_gfortran_runtime_error .visible .func _gfortran_runti.loc 1 250 3 { call recursion_check; } .loc 1 251 3 cvta.const.u64 %r24,$LCgfortrani_estr_write253 3args; @ %r34 shfl.idx.b32 %r29,%r29,%r33,31; .loc 1 255 3 cvta.const.u64 %r30; call (%value_in),_gfortrani_estr_write,(256 3 mov.u32 %r322; call _gfortrani_exit_error_gfortran_runtime_error_at .visible .func _gfortran_runtime_error_atpred %r37; .reg .u64 %rmov.u64 %r24,%ar1; .loc 1 268 3 { call recursion_checkcall (%value_in),_gfortrani_estr_write,(270 3 cvta.const.u64 %r27,$LC8; call (%value_in),_gfortrani_estr_write,(272 3args; @ %r37 shfl.idx.b32 %r32,%r32,%r36,31; .loc 1 274 3 cvta.const.u64 %r333; call (%value_in),_gfortrani_estr_write275 3 mov.u32 %r35; call _gfortrani_exit_error_gfortran_runtime_warning_at .visible .func _gfortran_runtime_warning_atpred %r46; .reg .u64 %r4747,__nvptx_uni; add.u64 %r47,%r47,%ustmp1; ld.shared.u32 %r45,[%r475,%r45,%ustmp0; setp.eq.u32 %r46,%r45,%ustmp0; }mov.u32 %r29,229writemov.u64 %r35,26; cvta.const.u64 %r34,$LC9288.loc 1 100 10 mov.u64 %r43,1; cvta.const.u64 %r42,$LC5_gfortrani_internal_error .visible .func _gfortrani_internal_r22mov.u64 %r23,%ar1; .loc 1 301 3 { call recursion_check; } .loc 1 302call _gfortrani_show_locus,(%out_arg1); } .loc 1 303 3 cvta.const.u64 %r25,$LC1call (%value_in),_gfortrani_estr_write,(call (%value_in),_gfortrani_estr_write,(305 3 cvta.const.u64 %r29; call (%value_in),_gfortrani_estr_write313 3 mov.u32 %r31,3_gfortrani_exit_error_gfortran_stop_numeric .visible .func _gfortran_stop_numeric331 6 setp.ne.u32 %r24,%r23,0; @ %r24 bra $L33; .loc 1 334 7 st.u32 [%stack],%r22; cvta.const.u64 %r25,$LC11_gfortrani_st_$L33} 
// BEGIN GLOBAL FUNCTION DEF: _gfortran_stop_string .visible .func _gfortran_stop_stringu16 %r35; .reg .u32 %r37; .reg .u16 %r38; .reg .pred10 set.u32.ne.u64 %r29,%r25,0; neg.s32 %r30,%r29; .loc 1 345 6 cvt.u16.u32 %r32,%r27; xor.b16 %r31,%r32,1; .loc 1 348 10 cvt.u16.u32 %r35,%r30; and.b16 %r34,%r35,%r31; cvt.u32.u16 %r37,%r34; cvt.u16.u8 %r38,%r37; setp.eq.u16 %r39,%r38,0; @ %r39 bra $L35; .loc 1 350 4 cvta.const.u64 %r40,$LC12gfortrani_estr_write351 11 mov.u32 %r42,2.loc 1 352 4 cvta.const.u64 %r4646; call (%value_in),_gfortrani_estr_write$L35} // BEGIN GLOBAL FUNCTION DEF: _gfortran_error_stop_string .visible .func _gfortran_error_stop_stringpred.loc 1 3740; .loc 1 373 7 cvta.const.u64 %r26,$LC1326; call (%value_in),_gfortrani_estr_write,(4 14 mov.u32 %r28,2cvta.const.u64 %r32; call (%value_in),_gfortrani_estr_write,(%out_arg1); ld.param.u64 %r33,[%value_in]; } $L40: .loc 1 377 3 mov.u32 %r34; call _gfortrani_exit_error_gfortran_error_stop_numeric .visible .func _gfortran_error_stop_numeric389 6 setp.ne.u32 %r24,%r23,0; @ %r24 bra $L42; .loc 1 392 7 st.u32 [%stack],%r22; cvta.const.u64 %r25,$LC14_gfortrani_st_$L42: .loc 1 394 3_gfortrani_exit_error all_l1.o/_gfortran_all_l1 .visible .func _gfortran_all_l1fortran/generated/all_l1.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,65,76,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,10810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,65,76,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,32,105,110,32,65,76,76,32,105,110,116,114,105,110,115,105,99_gfortran_all_l1 .visible .func _gfortran_all_l148pred %r217; .reg .predu64 %r228; .reg .pred %r229; .reg .pred %r231; .reg .u64 %r232; .reg .predu32u64 %r250; .reg .u64 %r251; .reg .u64 %r252; .reg .u64 %r253; .reg .predu32u16u64pred %r293; .reg .u32 %r294; .reg .u32predu64 %r318; .reg .u6464u64 %r341; .reg .u32predpred %r355; .reg .u64 %r356; .reg .u64 %r358; .reg .u64u64 %r375; .reg .u64 %r376; mov.u64 %r196,%ar0; mov.u64 %r197,%ar1; mov.u64 %r198,%ar2; .loc 1 56 10 ld.u64 %r22,[%r198]; .loc 1 56 7 add.u64 %r121,%r22,-1; .loc 1 57 10 ld.s8 %r23,[%r197+28]; .loc 1 57 38 add.u32 %r25,%r23,-1; .loc 1 57 8 cvt.s64.s32 %r122,%r25; .loc 1 59 14 ld.u64 %r26,[%r197+16]; .loc 1 61 9 add.u64 %r200,%r121,%r121; add.u64 %r201,%r200,%r121; shl.b64 %r202,%r201,3; add.u64 %r203,%r197,%r202; ld.u64 %r27,[%r203+56]; ld.u64 %r29,[%r203+48]; .loc 1 65 11 ld.u64 %r30,[%r203+40]; .loc 1 67 3 setp.gt.s64 %r217,%r121,0; @ %r217 bra $L2; $L8: .loc 1 75 3 setp.lt.s64 %r218,%r121,%r122; @ %r218 bra $L3; bra $L4; $L2: add.u64 %r52,%r197,40; add.u64 %r148,%frame,120; add.u64 %r67,%frame,240; add.u64 %r220,%r22,%r22; add.u64 %r221,%r220,%r22; shl.b64 %r222,%r221,3; add.u64 %r223,%r197,16; add.u64 %r193,%r222,%r223; .loc 1 73 12 mov.u64 %r376,0; $L7: .loc 1 69 20 ld.u64 %r225,[%r52]; 
mul.lo.u64 %r224,%r225,%r26; .loc 1 69 18 st.u64 [%r148],%r224; .loc 1 70 19 ld.u64 %r227,[%r52+16]; add.u64 %r226,%r227,1; ld.u64 %r228,[%r52+8]; sub.u64 %r41,%r226,%r228; .loc 1 72 10 setp.lt.s64 %r229,%r41,0; @ %r229 bra $L5; .loc 1 70 17 st.u64 [%r67],%r41; bra $L6; $L5: .loc 1 73 12 st.u64 [%r67],%r376; $L6: .loc 1 67 3 add.u64 %r52,%r52,24; add.u64 %r148,%r148,8; add.u64 %r67,%r67,8; setp.ne.u64 %r231,%r52,%r193; @ %r231 bra $L7; bra $L8; $L4: .loc 1 84 6 ld.u64 %r232,[%r196]; setp.eq.u64 %r233,%r232,0; @ ! %r233 bra $L53; bra $L9; $L3: add.u64 %r235,%r22,%r22; add.u64 %r236,%r235,%r22; shl.b64 %r237,%r236,3; add.u64 %r238,%r237,40; add.u64 %r117,%r197,%r238; shl.b64 %r239,%r22,3; add.u64 %r107,%r239,-8; add.u64 %r363,%frame,120; add.u64 %r109,%r363,%r107; add.u64 %r366,%frame,240; add.u64 %r69,%r366,%r107; add.u64 %r242,%r197,40; cvt.u32.u32 %r244,%r23; cvt.s64.s8 %r243,%r244; add.u64 %r246,%r243,%r243; add.u64 %r247,%r246,%r243; shl.b64 %r248,%r247,3; add.u64 %r147,%r242,%r248; .loc 1 81 12 mov.u64 %r375,0; $L13: .loc 1 77 20 ld.u64 %r250,[%r117]; mul.lo.u64 %r249,%r250,%r26; .loc 1 77 18 st.u64 [%r109],%r249; .loc 1 78 19 ld.u64 %r252,[%r117+16]; add.u64 %r251,%r252,1; ld.u64 %r253,[%r117+8]; sub.u64 %r49,%r251,%r253; .loc 1 80 10 setp.lt.s64 %r254,%r49,0; @ %r254 bra $L11; .loc 1 78 17 st.u64 [%r69],%r49; bra $L12; $L11: .loc 1 81 12 st.u64 [%r69],%r375; $L12: .loc 1 75 3 add.u64 %r117,%r117,24; add.u64 %r109,%r109,8; add.u64 %r69,%r69,8; setp.ne.u64 %r256,%r117,%r147; @ %r256 bra $L13; bra $L4; $L9: .loc 1 88 7 setp.le.s32 %r257,%r25,0; @ %r257 bra $L14; add.u64 %r172,%r196,40; add.u64 %r170,%frame,240; add.u64 %r258,%r196,16; cvt.u32.u32 %r260,%r23; cvt.s64.s8 %r259,%r260; add.u64 %r262,%r259,%r259; add.u64 %r263,%r262,%r259; shl.b64 %r264,%r263,3; add.u64 %r164,%r258,%r264; .loc 1 91 17 mov.u64 %r112,1; bra $L15; $L14: .loc 1 99 24 mov.u64 %r265,0; st.u64 [%r196+8],%r265; .loc 1 100 28 cvt.u16.u32 %r268,%r23; add.u16 %r267,%r268,-1; cvt.u32.u16 %r269,%r267; st.u8 [%r196+28],%r269; .loc 1 102 20 add.u32 %r270,%r23,-2; cvt.s64.s32 %r58,%r270; add.u64 %r272,%r58,%r58; add.u64 %r273,%r272,%r58; shl.b64 %r274,%r273,3; add.u64 %r275,%r196,%r274; .loc 1 102 67 shl.b64 %r277,%r58,3; add.u64 %r278,%frame,%r277; .loc 1 102 59 ld.u64 %r280,[%r275+40]; ld.u64 %r281,[%r278+240]; mul.lo.u64 %r130,%r280,%r281; .loc 1 104 10 setp.eq.u64 %r282,%r130,0; @ %r282 bra $L16; bra $L54; $L18: .loc 1 93 55 mul.lo.u64 %r112,%r54,%r112; $L15: .loc 1 95 4 st.u64 [%r172+8],%r232; ld.u64 %r54,[%r170]; add.u64 %r284,%r54,-1; st.u64 [%r172+16],%r284; st.u64 [%r172],%r112; .loc 1 88 7 add.u64 %r172,%r172,24; add.u64 %r170,%r170,8; setp.ne.u64 %r285,%r164,%r172; @ %r285 bra $L18; bra $L14; $L16: .loc 1 107 4 st.u64 [%r196+48],%r130; mov.u64 %r287,-1; st.u64 [%r196+56],%r287; mov.u64 %r288,1; st.u64 [%r196+40],%r288; .loc 1 108 4 bra $L1; $L54: .loc 1 111 24 mov.u64 %r2901290; call (%value_in),_gfortrani_xmallocarray,(%out_arg1,%out_arg2); ld.param.u64 %r291,[%value_in]; } .loc 1 111 22 st.u64 [%r196],%r291; $L25: .loc 1 137 3 setp.gt.s32 %r293,%r25,0; @ ! 
%r293 bra $L26; add.u64 %r366,%frame,240; bra $L20; $L26: .loc 1 145 8 ld.u64 %r137,[%r197]; .loc 1 147 21 cvt.u32.u64 %r79,%r26; .loc 1 147 50 add.u32 %r294,%r79,-4; and.b32 %r295,%r294,-5; set.u32.eq.u32 %r297,%r295,0; neg.s32 %r298,%r297; .loc 1 147 21 add.u32 %r299,%r79,-1; set.u32.le.u32 %r301,%r299,1; neg.s32 %r302,%r301; .loc 1 147 6 cvt.u16.u32 %r304,%r298; cvt.u16.u32 %r305,%r302; orbra $L55; $L20: add.u64 %r182,%frame,360; add.u64 %r181,%r196,40; mov.u64 %r179,%frame; .loc 1 123 10 mov.u64 %r111,0; .loc 1 139 16 mov.u64 %r327,%r111; bra $L23; $L53: .loc 1 115 19 ld.s8 %r65,[%r196+28]; .loc 1 115 10 setp.eq.u64 %r309,%r65,%r122; @ %r309 bra $L24; .loc 1 116 2 st.u64 [%stack+8],%r122; st.u64 [%stack],%r65; cvta.const.u64 %r310310_gfortran_runtime_error24: .loc 1 121 11 cvta.global.u64 %r312,_gfortrani_compile_options; .loc 1 121 10 ld.u32 %r313,[%r312+36]; setp.eq.u32 %r314,%r313,0; @ %r314 bra $L25; .loc 1 123 4 setp.le.s64 %r315,%r122,0; @ %r315 bra $L26; add.u64 %r163,%r196,48; add.u64 %r366,%frame,240; mov.u64 %r155,%r366; add.u64 %r150,%r196,56; cvt.u32.u32 %r317,%r23; cvt.s64.s8 %r316,%r317; add.u64 %r133,%r316,-1; .loc 1 123 10 mov.u64 %r129,0; $L28: .loc 1 127 21 ld.u64 %r319,[%r150]; add.u64 %r318,%r319,1; .loc 1 127 19 ld.u64 %r320,[%r163]; sub.u64 %r128,%r318,%r320; .loc 1 128 18 ld.u64 %r73,[%r155]; .loc 1 128 11 setp.eq.u64 %r321,%r73,%r128; @ %r321 bra $L27; .loc 1 129 3 st.u64 [%stack+16],%r73; st.u64 [%stack+8],%r128; cvt.u32.u64 %r324,%r129; add.u32 %r323,%r324,1; st.u32 [%stack],%r323; cvta.const.u64 %r322,$LC1; {stack; call _gfortran_runtime_error27: .loc 1 123 25 add.u64 %r129,%r129,1; .loc 1 123 4 add.u64 %r163,%r163,24; add.u64 %r155,%r155,8; add.u64 %r150,%r150,24; setp.ne.u64 %r326,%r129,%r133; @ %r326 bra $L28; bra $L20; $L23: .loc 1 139 16 st.u64 [%r182],%r327; .loc 1 140 18 ld.u64 %r328,[%r181]; st.u64 [%r179],%r328; .loc 1 141 17 shl.b64 %r330,%r111,3; add.u64 %r331,%r366,%r330; .loc 1 141 10 ld.u64 %r332,[%r331]; setp.le.s64 %r333,%r332,0; @ %r333 bra $L1; .loc 1 137 26 add.u64 %r111,%r111,1; .loc 1 137 3 add.u64 %r182,%r182,8; add.u64 %r181,%r181,24; add.u64 %r179,%r179,8; setp.gt.s64 %r334,%r122,%r111; @ %r334 bra $L23; bra $L26; $L55: .loc 1 149 7 setp.eq.u32 %r336,%r79,16; @ %r336 bra $L21; .loc 1 157 5 cvta.const.u64 %r338,$LC2; mov.u64 %r333338; call _gfortrani_internal_error61 9 add.u64 %r339,%r27,1; .loc 1 61 7 sub.u64 %r124,%r339,%r29; max.s64 %r126,%r124,0; .loc 1 65 11 mul.lo.u64 %r33,%r30,%r26; .loc 1 159 8 ld.u64 %r138,[%r196]; ld.u64 %r161,[%frame+360]; .loc 1 189 22 ld.u64 %r86,[%frame+120]; .loc 1 190 22 ld.u64 %r88,[%frame]; .loc 1 192 32 ld.u64 %r159,[%frame+240]; .loc 1 199 30 mul.lo.u64 %r340,%r159,%r86; .loc 1 199 16 neg.s64 %r146,%r340; .loc 1 200 30 mul.lo.u64 %r341,%r159,%r88; .loc 1 200 16 neg.s64 %r157,%r341; setp.gt.s64 %r364,%r124,0; setp.le.s32 %r365,%r25,1; cvt.u32.u32 %r368,%r23; cvt.s64.s8 %r369,%r368; add.u64 %r370,%r369,-1; add.u64 %r371,%frame,120; add.u64 %r372,%frame,240; .loc 1 196 20 mov.u64 %r373,0; $L36: .loc 1 171 12 @ %r364 bra $L37; .loc 1 172 10 mov.u32 %r343,1; st.u8 [%r138],%r343; bra $L30; $L37: mov.u64 %r132,%r137; .loc 1 175 13 mov.u64 %r131,0; $L29: .loc 1 178 9 ld.s8 %r84,[%r132]; .loc 1 178 6 setp.eq.u32 %r344,%r84,0; @ %r344 bra $L31; .loc 1 175 28 add.u64 %r131,%r131,1; .loc 1 175 36 add.u64 %r132,%r132,%r33; .loc 1 175 6 setp.gt.s64 %r345,%r126,%r131; @ %r345 bra $L29; .loc 1 170 10 mov.u32 %r84,1; $L31: .loc 1 184 12 cvt.u32.u32 %r346,%r84; st.u8 [%r138],%r346; $L30: .loc 1 188 15 add.u64 
%r161,%r161,1; .loc 1 189 12 add.u64 %r137,%r137,%r86; .loc 1 190 12 add.u64 %r138,%r138,%r88; .loc 1 192 13 setp.ne.u64 %r347,%r161,%r159; @ %r347 bra $L36; .loc 1 199 16 add.u64 %r134,%r137,%r146; .loc 1 200 16 add.u64 %r135,%r138,%r157; .loc 1 202 14 @ %r365 bra $L1; add.u64 %r32,%frame,368; mov.u64 %r160,8; .loc 1 201 12 mov.u64 %r136,1; bra $L34; $L35: .loc 1 196 20 st.u64 [%r32],%r373; .loc 1 199 30 mul.lo.u64 %r353,%r98,%r97; .loc 1 199 16 sub.u64 %r134,%r137,%r353; .loc 1 200 30 mul.lo.u64 %r354,%r100,%r97; .loc 1 200 16 sub.u64 %r135,%r138,%r354; .loc 1 201 12 add.u64 %r136,%r136,1; .loc 1 202 14 add.u64 %r32,%r32,8; add.u64 %r160,%r160,8; setp.eq.u64 %r355,%r136,%r370; @ %r355 bra $L1; $L34: .loc 1 210 23 ld.u64 %r356,[%r32]; add.u64 %r97,%r356,1; st.u64 [%r32],%r97; .loc 1 211 30 add.u64 %r358,%r371,%r160; ld.u64 %r98,[%r358]; .loc 1 211 20 add.u64 %r137,%r134,%r98; .loc 1 212 30 add.u64 %r359,%frame,%r160; ld.u64 %r100,[%r359]; .loc 1 212 20 add.u64 %r138,%r135,%r100; .loc 1 192 32 add.u64 %r361,%r372,%r160; ld.u64 %r103,[%r361]; .loc 1 192 13 setp.eq.u64 %r362,%r97,%r103; @ %r362 bra $L35; .loc 1 196 20 mov.u64 %r161,0; bra $L36; $L1: .loc 1 216all_l2.o/_gfortran_all_l2 .visible .func _gfortran_all_l2fortran/generated/all_l2.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,65,76,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,10810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,65,76,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,32,105,110,32,65,76,76,32,105,110,116,114,105,110,115,105,99_gfortran_all_l2 .visible .func _gfortran_all_l248u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .u64 %r232; .reg .u64 %r233; .reg .pred %r234; .reg .pred %r236; .reg .u64u64u64 %r257; .reg .u64 %r258; .reg .predu64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64pred %r320; .reg .u64u64 %r332; .reg .u64 %r333; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .pred %r338; .reg .pred %r339; .reg .pred %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64 %r346; .reg .u64 %r347; .reg .u64 %r348; .reg .u16 %r350; .reg .u1664u64 %r370; .reg .u64 %r372; .reg .predmov.u64 %r201,%ar0; mov.u64 %r202,%ar1; mov.u64 %r203,%ar2; .loc 1 56 10 ld.u64 %r22,[%r203]; .loc 1 56 7 add.u64 %r126,%r22,-1; .loc 1 57 10 ld.s8 %r23,[%r202+28]; .loc 1 57 38 add.u32 %r25,%r23,-1; .loc 1 57 8 cvt.s64.s32 %r127,%r25; .loc 1 59 14 ld.u64 %r26,[%r202+16]; .loc 1 61 9 add.u64 %r205,%r126,%r126; add.u64 %r206,%r205,%r126; shl.b64 %r207,%r206,3; add.u64 %r208,%r202,%r207; ld.u64 %r27,[%r208+56]; ld.u64 %r29,[%r208+48]; .loc 1 65 11 ld.u64 %r30,[%r208+40]; .loc 1 67 3 setp.gt.s64 %r222,%r126,0; @ %r222 bra $L2; $L8: .loc 1 75 3 setp.lt.s64 %r223,%r126,%r127; @ %r223 bra $L3; bra $L4; $L2: add.u64 %r151,%r202,40; add.u64 %r52,%frame,120; add.u64 
%r109,%frame,240; add.u64 %r225,%r22,%r22; add.u64 %r226,%r225,%r22; shl.b64 %r227,%r226,3; add.u64 %r228,%r202,16; add.u64 %r160,%r227,%r228; .loc 1 73 12 mov.u64 %r387,0; $L7: .loc 1 69 20 ld.u64 %r230,[%r151]; mul.lo.u64 %r229,%r230,%r26; .loc 1 69 18 st.u64 [%r52],%r229; .loc 1 70 19 ld.u64 %r232,[%r151+16]; add.u64 %r231,%r232,1; ld.u64 %r233,[%r151+8]; sub.u64 %r41,%r231,%r233; .loc 1 72 10 setp.lt.s64 %r234,%r41,0; @ %r234 bra $L5; .loc 1 70 17 st.u64 [%r109],%r41; bra $L6; $L5: .loc 1 73 12 st.u64 [%r109],%r387; $L6: .loc 1 67 3 add.u64 %r151,%r151,24; add.u64 %r52,%r52,8; add.u64 %r109,%r109,8; setp.ne.u64 %r236,%r151,%r160; @ %r236 bra $L7; bra $L8; $L4: .loc 1 84 6 ld.u64 %r237,[%r201]; setp.eq.u64 %r238,%r237,0; @ ! %r238 bra $L51; bra $L9; $L3: add.u64 %r240,%r22,%r22; add.u64 %r241,%r240,%r22; shl.b64 %r242,%r241,3; add.u64 %r243,%r242,40; add.u64 %r145,%r202,%r243; shl.b64 %r244,%r22,3; add.u64 %r112,%r244,-8; add.u64 %r376,%frame,120; add.u64 %r117,%r376,%r112; add.u64 %r375,%frame,240; add.u64 %r110,%r375,%r112; add.u64 %r247,%r202,40; cvt.u32.u32 %r249,%r23; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r198,%r247,%r253; .loc 1 81 12 mov.u64 %r386,0; $L13: .loc 1 77 20 ld.u64 %r255,[%r145]; mul.lo.u64 %r254,%r255,%r26; .loc 1 77 18 st.u64 [%r117],%r254; .loc 1 78 19 ld.u64 %r257,[%r145+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r145+8]; sub.u64 %r49,%r256,%r258; .loc 1 80 10 setp.lt.s64 %r259,%r49,0; @ %r259 bra $L11; .loc 1 78 17 st.u64 [%r110],%r49; bra $L12; $L11: .loc 1 81 12 st.u64 [%r110],%r386; $L12: .loc 1 75 3 add.u64 %r145,%r145,24; add.u64 %r117,%r117,8; add.u64 %r110,%r110,8; setp.ne.u64 %r261,%r145,%r198; @ %r261 bra $L13; bra $L4; $L9: .loc 1 88 7 setp.le.s32 %r262,%r25,0; @ %r262 bra $L14; add.u64 %r179,%r201,40; add.u64 %r177,%frame,240; add.u64 %r263,%r201,16; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r171,%r263,%r269; .loc 1 91 17 mov.u64 %r115,1; bra $L15; $L14: .loc 1 99 24 mov.u64 %r270,0; st.u64 [%r201+8],%r270; .loc 1 100 28 cvt.u16.u32 %r273,%r23; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r201+28],%r274; .loc 1 102 20 add.u32 %r275,%r23,-2; cvt.s64.s32 %r58,%r275; add.u64 %r277,%r58,%r58; add.u64 %r278,%r277,%r58; shl.b64 %r279,%r278,3; add.u64 %r280,%r201,%r279; .loc 1 102 67 shl.b64 %r282,%r58,3; add.u64 %r283,%frame,%r282; .loc 1 102 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r134,%r285,%r286; .loc 1 104 10 setp.eq.u64 %r287,%r134,0; @ %r287 bra $L16; bra $L52; $L18: .loc 1 93 55 mul.lo.u64 %r115,%r54,%r115; $L15: .loc 1 95 4 st.u64 [%r179+8],%r237; ld.u64 %r54,[%r177]; add.u64 %r289,%r54,-1; st.u64 [%r179+16],%r289; st.u64 [%r179],%r115; .loc 1 88 7 add.u64 %r179,%r179,24; add.u64 %r177,%r177,8; setp.ne.u64 %r290,%r171,%r179; @ %r290 bra $L18; bra $L14; $L16: .loc 1 107 4 st.u64 [%r201+48],%r134; mov.u64 %r292,-1; st.u64 [%r201+56],%r292; mov.u64 %r293,1; st.u64 [%r201+40],%r293; .loc 1 108 4 bra $L1; $L52: .loc 1 111 24 mov.u64 %r295,1295; call (%value_in),_gfortrani_xmallocarray296,[%value_in]; } .loc 1 111 22 st.u64 [%r201],%r296; $L25: .loc 1 137 3 setp.gt.s32 %r298,%r25,0; @ ! 
%r298 bra $L26; add.u64 %r375,%frame,240; bra $L20; $L26: .loc 1 145 8 ld.u64 %r140,[%r202]; .loc 1 147 21 cvt.u32.u64 %r76,%r26; .loc 1 147 50 add.u32 %r299,%r76,-4; and.b32 %r300,%r299,-5; set.u32.eq.u32 %r302,%r300,0; neg.s32 %r303,%r302; .loc 1 147 21 add.u32 %r304,%r76,-1; set.u32.le.u32 %r306,%r304,1; neg.s32 %r307,%r306; .loc 1 147 6 cvt.u16.u32 %r309,%r303; cvt.u16.u32 %r310,%r307; orbra $L53; $L20: add.u64 %r189,%frame,360; add.u64 %r188,%r201,40; mov.u64 %r186,%frame; .loc 1 123 10 mov.u64 %r142,0; .loc 1 139 16 mov.u64 %r332,%r142; bra $L23; $L51: .loc 1 115 19 ld.s8 %r65,[%r201+28]; .loc 1 115 10 setp.eq.u64 %r314,%r65,%r127; @ %r314 bra $L24; .loc 1 116 2 st.u64 [%stack+8],%r127; st.u64 [%stack],%r65; cvta.const.u64 %r315315_gfortran_runtime_error24: .loc 1 121 11 cvta.global.u64 %r317,_gfortrani_compile_options; .loc 1 121 10 ld.u32 %r318,[%r317+36]; setp.eq.u32 %r319,%r318,0; @ %r319 bra $L25; .loc 1 123 4 setp.le.s64 %r320,%r127,0; @ %r320 bra $L26; add.u64 %r170,%r201,48; add.u64 %r375,%frame,240; mov.u64 %r168,%r375; add.u64 %r162,%r201,56; cvt.u32.u32 %r322,%r23; cvt.s64.s8 %r321,%r322; add.u64 %r149,%r321,-1; .loc 1 123 10 mov.u64 %r114,0; $L28: .loc 1 127 21 ld.u64 %r324,[%r162]; add.u64 %r323,%r324,1; .loc 1 127 19 ld.u64 %r325,[%r170]; sub.u64 %r133,%r323,%r325; .loc 1 128 18 ld.u64 %r71,[%r168]; .loc 1 128 11 setp.eq.u64 %r326,%r71,%r133; @ %r326 bra $L27; .loc 1 129 3 st.u64 [%stack+16],%r71; st.u64 [%stack+8],%r133; cvt.u32.u64 %r329,%r114; add.u32 %r328,%r329,1; st.u32 [%stack],%r328; cvta.const.u64 %r327,$LC1stack; call _gfortran_runtime_error27: .loc 1 123 25 add.u64 %r114,%r114,1; .loc 1 123 4 add.u64 %r170,%r170,24; add.u64 %r168,%r168,8; add.u64 %r162,%r162,24; setp.ne.u64 %r331,%r114,%r149; @ %r331 bra $L28; bra $L20; $L23: .loc 1 139 16 st.u64 [%r189],%r332; .loc 1 140 18 ld.u64 %r333,[%r188]; st.u64 [%r186],%r333; .loc 1 141 17 shl.b64 %r335,%r142,3; add.u64 %r336,%r375,%r335; .loc 1 141 10 ld.u64 %r337,[%r336]; setp.le.s64 %r338,%r337,0; @ %r338 bra $L1; .loc 1 137 26 add.u64 %r142,%r142,1; .loc 1 137 3 add.u64 %r189,%r189,8; add.u64 %r188,%r188,24; add.u64 %r186,%r186,8; setp.gt.s64 %r339,%r127,%r142; @ %r339 bra $L23; bra $L26; $L53: .loc 1 149 7 setp.eq.u32 %r341,%r76,16; @ %r341 bra $L21; .loc 1 157 5 cvta.const.u64 %r343,$LC2; mov.u64 %r342,3343; call _gfortrani_internal_error61 9 add.u64 %r344,%r27,1; .loc 1 61 7 sub.u64 %r129,%r344,%r29; max.s64 %r131,%r129,0; .loc 1 65 11 mul.lo.u64 %r33,%r30,%r26; .loc 1 159 8 ld.u64 %r141,[%r201]; ld.u64 %r166,[%frame+360]; .loc 1 189 22 ld.u64 %r86,[%frame+120]; .loc 1 190 22 ld.u64 %r88,[%frame]; .loc 1 190 12 add.u64 %r89,%r88,%r88; .loc 1 192 32 ld.u64 %r164,[%frame+240]; .loc 1 199 30 mul.lo.u64 %r346,%r164,%r86; .loc 1 199 16 neg.s64 %r150,%r346; .loc 1 200 30 mul.lo.u64 %r347,%r164,%r88; .loc 1 200 16 add.u64 %r348,%r347,%r347; neg.s64 %r148,%r348; setp.gt.s64 %r377,%r129,0; setp.le.s32 %r374,%r25,1; cvt.u32.u32 %r379,%r23; cvt.s64.s8 %r380,%r379; add.u64 %r381,%r380,-1; add.u64 %r382,%frame,120; add.u64 %r383,%frame,240; .loc 1 196 20 mov.u64 %r384,0; $L36: .loc 1 171 12 @ %r377 bra $L37; .loc 1 172 10 mov.u16 %r350,1; st.u16 [%r141],%r350; bra $L30; $L37: mov.u64 %r136,%r140; .loc 1 175 13 mov.u64 %r135,0; $L29: .loc 1 178 6 ld.s8 %r352,[%r136]; cvt.u16.u32 %r351,%r352; setp.eq.u16 %r353,%r351,0; @ %r353 bra $L38; .loc 1 175 28 add.u64 %r135,%r135,1; .loc 1 175 36 add.u64 %r136,%r136,%r33; .loc 1 175 6 setp.gt.s64 %r354,%r131,%r135; @ %r354 bra $L29; .loc 1 170 10 mov.u32 %r116,1; bra $L31; $L38: .loc 
1 180 14 mov.u32 %r116,0; $L31: .loc 1 184 12 cvt.u16.u32 %r355,%r116; st.u16 [%r141],%r355; $L30: .loc 1 188 15 add.u64 %r166,%r166,1; .loc 1 189 12 add.u64 %r140,%r140,%r86; .loc 1 190 12 add.u64 %r141,%r141,%r89; .loc 1 192 13 setp.ne.u64 %r356,%r166,%r164; @ %r356 bra $L36; .loc 1 199 16 add.u64 %r137,%r140,%r150; .loc 1 200 16 add.u64 %r138,%r141,%r148; .loc 1 202 14 @ %r374 bra $L1; add.u64 %r32,%frame,368; mov.u64 %r165,8; .loc 1 201 12 mov.u64 %r139,1; bra $L34; $L35: .loc 1 196 20 st.u64 [%r32],%r384; .loc 1 199 30 mul.lo.u64 %r362,%r101,%r100; .loc 1 199 16 sub.u64 %r137,%r140,%r362; .loc 1 200 30 mul.lo.u64 %r363,%r103,%r100; .loc 1 200 16 add.u64 %r364,%r363,%r363; sub.u64 %r138,%r141,%r364; .loc 1 201 12 add.u64 %r139,%r139,1; .loc 1 202 14 add.u64 %r32,%r32,8; add.u64 %r165,%r165,8; setp.eq.u64 %r365,%r139,%r381; @ %r365 bra $L1; $L34: .loc 1 210 23 ld.u64 %r366,[%r32]; add.u64 %r100,%r366,1; st.u64 [%r32],%r100; .loc 1 211 30 add.u64 %r368,%r382,%r165; ld.u64 %r101,[%r368]; .loc 1 211 20 add.u64 %r140,%r137,%r101; .loc 1 212 30 add.u64 %r369,%frame,%r165; ld.u64 %r103,[%r369]; .loc 1 212 20 add.u64 %r370,%r103,%r103; add.u64 %r141,%r138,%r370; .loc 1 192 32 add.u64 %r372,%r383,%r165; ld.u64 %r107,[%r372]; .loc 1 192 13 setp.eq.u64 %r373,%r100,%r107; @ %r373 bra $L35; .loc 1 196 20 mov.u64 %r166,0; bra $L36; $L1: .loc 1 216 all_l4.o/_gfortran_all_l4 .visible .func _gfortran_all_l4fortran/generated/all_l4.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,65,76,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,10810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,65,76,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,32,105,110,32,65,76,76,32,105,110,116,114,105,110,115,105,99_gfortran_all_l4 .visible .func _gfortran_all_l448u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .u64 %r232; .reg .u64 %r233; .reg .pred %r234; .reg .pred %r236; .reg .u64u64u64 %r257; .reg .u64 %r258; .reg .predu64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64pred %r320; .reg .u64u64 %r332; .reg .u64 %r333; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .pred %r338; .reg .pred %r339; .reg .pred %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64 %r345; .reg .u64 %r346; .reg .u64 %r347; .reg .u32 %r349; .reg .u16 %r350; .reg .u32 %r351; .reg .pred %r352; .reg .pred %r353; .reg .pred %r354; .reg .u64pred %r363; .reg .u64u64 %r385; mov.u64 %r201,%ar0; mov.u64 %r202,%ar1; mov.u64 %r203,%ar2; .loc 1 56 10 ld.u64 %r22,[%r203]; .loc 1 56 7 add.u64 %r126,%r22,-1; .loc 1 57 10 ld.s8 %r23,[%r202+28]; .loc 1 57 38 add.u32 %r25,%r23,-1; .loc 1 57 8 cvt.s64.s32 %r127,%r25; .loc 1 59 14 ld.u64 %r26,[%r202+16]; .loc 1 61 9 add.u64 %r205,%r126,%r126; add.u64 %r206,%r205,%r126; shl.b64 %r207,%r206,3; add.u64 %r208,%r202,%r207; ld.u64 %r27,[%r208+56]; 
ld.u64 %r29,[%r208+48]; .loc 1 65 11 ld.u64 %r30,[%r208+40]; .loc 1 67 3 setp.gt.s64 %r222,%r126,0; @ %r222 bra $L2; $L8: .loc 1 75 3 setp.lt.s64 %r223,%r126,%r127; @ %r223 bra $L3; bra $L4; $L2: add.u64 %r151,%r202,40; add.u64 %r52,%frame,120; add.u64 %r109,%frame,240; add.u64 %r225,%r22,%r22; add.u64 %r226,%r225,%r22; shl.b64 %r227,%r226,3; add.u64 %r228,%r202,16; add.u64 %r160,%r227,%r228; .loc 1 73 12 mov.u64 %r385,0; $L7: .loc 1 69 20 ld.u64 %r230,[%r151]; mul.lo.u64 %r229,%r230,%r26; .loc 1 69 18 st.u64 [%r52],%r229; .loc 1 70 19 ld.u64 %r232,[%r151+16]; add.u64 %r231,%r232,1; ld.u64 %r233,[%r151+8]; sub.u64 %r41,%r231,%r233; .loc 1 72 10 setp.lt.s64 %r234,%r41,0; @ %r234 bra $L5; .loc 1 70 17 st.u64 [%r109],%r41; bra $L6; $L5: .loc 1 73 12 st.u64 [%r109],%r385; $L6: .loc 1 67 3 add.u64 %r151,%r151,24; add.u64 %r52,%r52,8; add.u64 %r109,%r109,8; setp.ne.u64 %r236,%r151,%r160; @ %r236 bra $L7; bra $L8; $L4: .loc 1 84 6 ld.u64 %r237,[%r201]; setp.eq.u64 %r238,%r237,0; @ ! %r238 bra $L51; bra $L9; $L3: add.u64 %r240,%r22,%r22; add.u64 %r241,%r240,%r22; shl.b64 %r242,%r241,3; add.u64 %r243,%r242,40; add.u64 %r145,%r202,%r243; shl.b64 %r244,%r22,3; add.u64 %r112,%r244,-8; add.u64 %r375,%frame,120; add.u64 %r117,%r375,%r112; add.u64 %r374,%frame,240; add.u64 %r110,%r374,%r112; add.u64 %r247,%r202,40; cvt.u32.u32 %r249,%r23; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r198,%r247,%r253; .loc 1 81 12 mov.u64 %r384,0; $L13: .loc 1 77 20 ld.u64 %r255,[%r145]; mul.lo.u64 %r254,%r255,%r26; .loc 1 77 18 st.u64 [%r117],%r254; .loc 1 78 19 ld.u64 %r257,[%r145+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r145+8]; sub.u64 %r49,%r256,%r258; .loc 1 80 10 setp.lt.s64 %r259,%r49,0; @ %r259 bra $L11; .loc 1 78 17 st.u64 [%r110],%r49; bra $L12; $L11: .loc 1 81 12 st.u64 [%r110],%r384; $L12: .loc 1 75 3 add.u64 %r145,%r145,24; add.u64 %r117,%r117,8; add.u64 %r110,%r110,8; setp.ne.u64 %r261,%r145,%r198; @ %r261 bra $L13; bra $L4; $L9: .loc 1 88 7 setp.le.s32 %r262,%r25,0; @ %r262 bra $L14; add.u64 %r179,%r201,40; add.u64 %r177,%frame,240; add.u64 %r263,%r201,16; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r171,%r263,%r269; .loc 1 91 17 mov.u64 %r115,1; bra $L15; $L14: .loc 1 99 24 mov.u64 %r270,0; st.u64 [%r201+8],%r270; .loc 1 100 28 cvt.u16.u32 %r273,%r23; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r201+28],%r274; .loc 1 102 20 add.u32 %r275,%r23,-2; cvt.s64.s32 %r58,%r275; add.u64 %r277,%r58,%r58; add.u64 %r278,%r277,%r58; shl.b64 %r279,%r278,3; add.u64 %r280,%r201,%r279; .loc 1 102 67 shl.b64 %r282,%r58,3; add.u64 %r283,%frame,%r282; .loc 1 102 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r134,%r285,%r286; .loc 1 104 10 setp.eq.u64 %r287,%r134,0; @ %r287 bra $L16; bra $L52; $L18: .loc 1 93 55 mul.lo.u64 %r115,%r54,%r115; $L15: .loc 1 95 4 st.u64 [%r179+8],%r237; ld.u64 %r54,[%r177]; add.u64 %r289,%r54,-1; st.u64 [%r179+16],%r289; st.u64 [%r179],%r115; .loc 1 88 7 add.u64 %r179,%r179,24; add.u64 %r177,%r177,8; setp.ne.u64 %r290,%r171,%r179; @ %r290 bra $L18; bra $L14; $L16: .loc 1 107 4 st.u64 [%r201+48],%r134; mov.u64 %r292,-1; st.u64 [%r201+56],%r292; mov.u64 %r293,1; st.u64 [%r201+40],%r293; .loc 1 108 4 bra $L1; $L52: .loc 1 111 24295; call (%value_in),_gfortrani_xmallocarray296,[%value_in]; } .loc 1 111 22 st.u64 [%r201],%r296; $L25: .loc 1 137 3 setp.gt.s32 %r298,%r25,0; @ ! 
%r298 bra $L26; add.u64 %r374,%frame,240; bra $L20; $L26: .loc 1 145 8 ld.u64 %r140,[%r202]; .loc 1 147 21 cvt.u32.u64 %r76,%r26; .loc 1 147 50 add.u32 %r299,%r76,-4; and.b32 %r300,%r299,-5; set.u32.eq.u32 %r302,%r300,0; neg.s32 %r303,%r302; .loc 1 147 21 add.u32 %r304,%r76,-1; set.u32.le.u32 %r306,%r304,1; neg.s32 %r307,%r306; .loc 1 147 6 cvt.u16.u32 %r309,%r303; cvt.u16.u32 %r310,%r307; orbra $L53; $L20: add.u64 %r189,%frame,360; add.u64 %r188,%r201,40; mov.u64 %r186,%frame; .loc 1 123 10 mov.u64 %r142,0; .loc 1 139 16 mov.u64 %r332,%r142; bra $L23; $L51: .loc 1 115 19 ld.s8 %r65,[%r201+28]; .loc 1 115 10 setp.eq.u64 %r314,%r65,%r127; @ %r314 bra $L24; .loc 1 116 2 st.u64 [%stack+8],%r127; st.u64 [%stack],%r65; cvta.const.u64 %r315315_gfortran_runtime_error24: .loc 1 121 11 cvta.global.u64 %r317,_gfortrani_compile_options; .loc 1 121 10 ld.u32 %r318,[%r317+36]; setp.eq.u32 %r319,%r318,0; @ %r319 bra $L25; .loc 1 123 4 setp.le.s64 %r320,%r127,0; @ %r320 bra $L26; add.u64 %r170,%r201,48; add.u64 %r374,%frame,240; mov.u64 %r168,%r374; add.u64 %r162,%r201,56; cvt.u32.u32 %r322,%r23; cvt.s64.s8 %r321,%r322; add.u64 %r149,%r321,-1; .loc 1 123 10 mov.u64 %r114,0; $L28: .loc 1 127 21 ld.u64 %r324,[%r162]; add.u64 %r323,%r324,1; .loc 1 127 19 ld.u64 %r325,[%r170]; sub.u64 %r133,%r323,%r325; .loc 1 128 18 ld.u64 %r71,[%r168]; .loc 1 128 11 setp.eq.u64 %r326,%r71,%r133; @ %r326 bra $L27; .loc 1 129 3 st.u64 [%stack+16],%r71; st.u64 [%stack+8],%r133; cvt.u32.u64 %r329,%r114; add.u32 %r328,%r329,1; st.u32 [%stack],%r328; cvta.const.u64 %r327,$LC1stack; call _gfortran_runtime_error27: .loc 1 123 25 add.u64 %r114,%r114,1; .loc 1 123 4 add.u64 %r170,%r170,24; add.u64 %r168,%r168,8; add.u64 %r162,%r162,24; setp.ne.u64 %r331,%r114,%r149; @ %r331 bra $L28; bra $L20; $L23: .loc 1 139 16 st.u64 [%r189],%r332; .loc 1 140 18 ld.u64 %r333,[%r188]; st.u64 [%r186],%r333; .loc 1 141 17 shl.b64 %r335,%r142,3; add.u64 %r336,%r374,%r335; .loc 1 141 10 ld.u64 %r337,[%r336]; setp.le.s64 %r338,%r337,0; @ %r338 bra $L1; .loc 1 137 26 add.u64 %r142,%r142,1; .loc 1 137 3 add.u64 %r189,%r189,8; add.u64 %r188,%r188,24; add.u64 %r186,%r186,8; setp.gt.s64 %r339,%r127,%r142; @ %r339 bra $L23; bra $L26; $L53: .loc 1 149 7 setp.eq.u32 %r341,%r76,16; @ %r341 bra $L21; .loc 1 157 5 cvta.const.u64 %r343,$LC2; mov.u64 %r342,3343; call _gfortrani_internal_error61 9 add.u64 %r344,%r27,1; .loc 1 61 7 sub.u64 %r129,%r344,%r29; max.s64 %r131,%r129,0; .loc 1 65 11 mul.lo.u64 %r33,%r30,%r26; .loc 1 159 8 ld.u64 %r141,[%r201]; ld.u64 %r166,[%frame+360]; .loc 1 189 22 ld.u64 %r86,[%frame+120]; .loc 1 190 22 ld.u64 %r88,[%frame]; .loc 1 190 12 shl.b64 %r89,%r88,2; .loc 1 192 32 ld.u64 %r164,[%frame+240]; .loc 1 199 30 mul.lo.u64 %r345,%r164,%r86; .loc 1 199 16 neg.s64 %r150,%r345; .loc 1 200 30 mul.lo.u64 %r346,%r164,%r88; .loc 1 200 16 shl.b64 %r347,%r346,2; neg.s64 %r148,%r347; setp.gt.s64 %r372,%r129,0; setp.le.s32 %r373,%r25,1; cvt.u32.u32 %r377,%r23; cvt.s64.s8 %r378,%r377; add.u64 %r379,%r378,-1; add.u64 %r380,%frame,120; add.u64 %r381,%frame,240; .loc 1 196 20 mov.u64 %r382,0; $L36: .loc 1 171 12 @ %r372 bra $L37; .loc 1 172 10 mov.u32 %r349,1; st.u32 [%r141],%r349; bra $L30; $L37: mov.u64 %r136,%r140; .loc 1 175 13 mov.u64 %r135,0; $L29: .loc 1 178 6 ld.s8 %r351,[%r136]; cvt.u16.u32 %r350,%r351; setp.eq.u16 %r352,%r350,0; @ %r352 bra $L38; .loc 1 175 28 add.u64 %r135,%r135,1; .loc 1 175 36 add.u64 %r136,%r136,%r33; .loc 1 175 6 setp.gt.s64 %r353,%r131,%r135; @ %r353 bra $L29; .loc 1 170 10 mov.u32 %r116,1; bra $L31; $L38: .loc 1 180 
14 mov.u32 %r116,0; $L31: .loc 1 184 12 st.u32 [%r141],%r116; $L30: .loc 1 188 15 add.u64 %r166,%r166,1; .loc 1 189 12 add.u64 %r140,%r140,%r86; .loc 1 190 12 add.u64 %r141,%r141,%r89; .loc 1 192 13 setp.ne.u64 %r354,%r166,%r164; @ %r354 bra $L36; .loc 1 199 16 add.u64 %r137,%r140,%r150; .loc 1 200 16 add.u64 %r138,%r141,%r148; .loc 1 202 14 @ %r373 bra $L1; add.u64 %r32,%frame,368; mov.u64 %r165,8; .loc 1 201 12 mov.u64 %r139,1; bra $L34; $L35: .loc 1 196 20 st.u64 [%r32],%r382; .loc 1 199 30 mul.lo.u64 %r360,%r101,%r100; .loc 1 199 16 sub.u64 %r137,%r140,%r360; .loc 1 200 30 mul.lo.u64 %r361,%r103,%r100; .loc 1 200 16 shl.b64 %r362,%r361,2; sub.u64 %r138,%r141,%r362; .loc 1 201 12 add.u64 %r139,%r139,1; .loc 1 202 14 add.u64 %r32,%r32,8; add.u64 %r165,%r165,8; setp.eq.u64 %r363,%r139,%r379; @ %r363 bra $L1; $L34: .loc 1 210 23 ld.u64 %r364,[%r32]; add.u64 %r100,%r364,1; st.u64 [%r32],%r100; .loc 1 211 30 add.u64 %r366,%r380,%r165; ld.u64 %r101,[%r366]; .loc 1 211 20 add.u64 %r140,%r137,%r101; .loc 1 212 30 add.u64 %r367,%frame,%r165; ld.u64 %r103,[%r367]; .loc 1 212 20 shl.b64 %r368,%r103,2; add.u64 %r141,%r138,%r368; .loc 1 192 32 add.u64 %r370,%r381,%r165; ld.u64 %r107,[%r370]; .loc 1 192 13 setp.eq.u64 %r371,%r100,%r107; @ %r371 bra $L35; .loc 1 196 20 mov.u64 %r166,0; bra $L36; $L1: .loc 1 216all_l8.o/_gfortran_all_l8 .visible .func _gfortran_all_l8fortran/generated/all_l8.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,65,76,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,10810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,65,76,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,32,105,110,32,65,76,76,32,105,110,116,114,105,110,115,105,99_gfortran_all_l8 .visible .func _gfortran_all_l848u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .u64 %r232; .reg .u64 %r233; .reg .pred %r234; .reg .pred %r236; .reg .u64u64u64 %r257; .reg .u64 %r258; .reg .predu64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64pred %r320; .reg .u64u64 %r332; .reg .u64 %r333; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .pred %r338; .reg .pred %r339; .reg .pred %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64 %r345; .reg .u64 %r346; .reg .u64 %r347; .reg .u64 %r349; .reg .u16 %r350; .reg .u32 %r351; .reg .pred %r352; .reg .pred %r353; .reg .pred %r354; .reg .u64pred %r363; .reg .u64u64 %r385; mov.u64 %r201,%ar0; mov.u64 %r202,%ar1; mov.u64 %r203,%ar2; .loc 1 56 10 ld.u64 %r22,[%r203]; .loc 1 56 7 add.u64 %r126,%r22,-1; .loc 1 57 10 ld.s8 %r23,[%r202+28]; .loc 1 57 38 add.u32 %r25,%r23,-1; .loc 1 57 8 cvt.s64.s32 %r127,%r25; .loc 1 59 14 ld.u64 %r26,[%r202+16]; .loc 1 61 9 add.u64 %r205,%r126,%r126; add.u64 %r206,%r205,%r126; shl.b64 %r207,%r206,3; add.u64 %r208,%r202,%r207; ld.u64 %r27,[%r208+56]; ld.u64 %r29,[%r208+48]; .loc 1 65 11 
ld.u64 %r30,[%r208+40]; .loc 1 67 3 setp.gt.s64 %r222,%r126,0; @ %r222 bra $L2; $L8: .loc 1 75 3 setp.lt.s64 %r223,%r126,%r127; @ %r223 bra $L3; bra $L4; $L2: add.u64 %r151,%r202,40; add.u64 %r52,%frame,120; add.u64 %r109,%frame,240; add.u64 %r225,%r22,%r22; add.u64 %r226,%r225,%r22; shl.b64 %r227,%r226,3; add.u64 %r228,%r202,16; add.u64 %r160,%r227,%r228; .loc 1 73 12 mov.u64 %r385,0; $L7: .loc 1 69 20 ld.u64 %r230,[%r151]; mul.lo.u64 %r229,%r230,%r26; .loc 1 69 18 st.u64 [%r52],%r229; .loc 1 70 19 ld.u64 %r232,[%r151+16]; add.u64 %r231,%r232,1; ld.u64 %r233,[%r151+8]; sub.u64 %r41,%r231,%r233; .loc 1 72 10 setp.lt.s64 %r234,%r41,0; @ %r234 bra $L5; .loc 1 70 17 st.u64 [%r109],%r41; bra $L6; $L5: .loc 1 73 12 st.u64 [%r109],%r385; $L6: .loc 1 67 3 add.u64 %r151,%r151,24; add.u64 %r52,%r52,8; add.u64 %r109,%r109,8; setp.ne.u64 %r236,%r151,%r160; @ %r236 bra $L7; bra $L8; $L4: .loc 1 84 6 ld.u64 %r237,[%r201]; setp.eq.u64 %r238,%r237,0; @ ! %r238 bra $L51; bra $L9; $L3: add.u64 %r240,%r22,%r22; add.u64 %r241,%r240,%r22; shl.b64 %r242,%r241,3; add.u64 %r243,%r242,40; add.u64 %r145,%r202,%r243; shl.b64 %r244,%r22,3; add.u64 %r112,%r244,-8; add.u64 %r375,%frame,120; add.u64 %r117,%r375,%r112; add.u64 %r374,%frame,240; add.u64 %r110,%r374,%r112; add.u64 %r247,%r202,40; cvt.u32.u32 %r249,%r23; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r198,%r247,%r253; .loc 1 81 12 mov.u64 %r384,0; $L13: .loc 1 77 20 ld.u64 %r255,[%r145]; mul.lo.u64 %r254,%r255,%r26; .loc 1 77 18 st.u64 [%r117],%r254; .loc 1 78 19 ld.u64 %r257,[%r145+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r145+8]; sub.u64 %r49,%r256,%r258; .loc 1 80 10 setp.lt.s64 %r259,%r49,0; @ %r259 bra $L11; .loc 1 78 17 st.u64 [%r110],%r49; bra $L12; $L11: .loc 1 81 12 st.u64 [%r110],%r384; $L12: .loc 1 75 3 add.u64 %r145,%r145,24; add.u64 %r117,%r117,8; add.u64 %r110,%r110,8; setp.ne.u64 %r261,%r145,%r198; @ %r261 bra $L13; bra $L4; $L9: .loc 1 88 7 setp.le.s32 %r262,%r25,0; @ %r262 bra $L14; add.u64 %r179,%r201,40; add.u64 %r177,%frame,240; add.u64 %r263,%r201,16; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r171,%r263,%r269; .loc 1 91 17 mov.u64 %r115,1; bra $L15; $L14: .loc 1 99 24 mov.u64 %r270,0; st.u64 [%r201+8],%r270; .loc 1 100 28 cvt.u16.u32 %r273,%r23; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r201+28],%r274; .loc 1 102 20 add.u32 %r275,%r23,-2; cvt.s64.s32 %r58,%r275; add.u64 %r277,%r58,%r58; add.u64 %r278,%r277,%r58; shl.b64 %r279,%r278,3; add.u64 %r280,%r201,%r279; .loc 1 102 67 shl.b64 %r282,%r58,3; add.u64 %r283,%frame,%r282; .loc 1 102 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r134,%r285,%r286; .loc 1 104 10 setp.eq.u64 %r287,%r134,0; @ %r287 bra $L16; bra $L52; $L18: .loc 1 93 55 mul.lo.u64 %r115,%r54,%r115; $L15: .loc 1 95 4 st.u64 [%r179+8],%r237; ld.u64 %r54,[%r177]; add.u64 %r289,%r54,-1; st.u64 [%r179+16],%r289; st.u64 [%r179],%r115; .loc 1 88 7 add.u64 %r179,%r179,24; add.u64 %r177,%r177,8; setp.ne.u64 %r290,%r171,%r179; @ %r290 bra $L18; bra $L14; $L16: .loc 1 107 4 st.u64 [%r201+48],%r134; mov.u64 %r292,-1; st.u64 [%r201+56],%r292; mov.u64 %r293,1; st.u64 [%r201+40],%r293; .loc 1 108 4 bra $L1; $L52: .loc 1 111 24 mov.u64 %r2951295; call (%value_in),_gfortrani_xmallocarray296,[%value_in]; } .loc 1 111 22 st.u64 [%r201],%r296; $L25: .loc 1 137 3 setp.gt.s32 %r298,%r25,0; @ ! 
%r298 bra $L26; add.u64 %r374,%frame,240; bra $L20; $L26: .loc 1 145 8 ld.u64 %r140,[%r202]; .loc 1 147 21 cvt.u32.u64 %r76,%r26; .loc 1 147 50 add.u32 %r299,%r76,-4; and.b32 %r300,%r299,-5; set.u32.eq.u32 %r302,%r300,0; neg.s32 %r303,%r302; .loc 1 147 21 add.u32 %r304,%r76,-1; set.u32.le.u32 %r306,%r304,1; neg.s32 %r307,%r306; .loc 1 147 6 cvt.u16.u32 %r309,%r303; cvt.u16.u32 %r310,%r307; orbra $L53; $L20: add.u64 %r189,%frame,360; add.u64 %r188,%r201,40; mov.u64 %r186,%frame; .loc 1 123 10 mov.u64 %r142,0; .loc 1 139 16 mov.u64 %r332,%r142; bra $L23; $L51: .loc 1 115 19 ld.s8 %r65,[%r201+28]; .loc 1 115 10 setp.eq.u64 %r314,%r65,%r127; @ %r314 bra $L24; .loc 1 116 2 st.u64 [%stack+8],%r127; st.u64 [%stack],%r65; cvta.const.u64 %r315315_gfortran_runtime_error24: .loc 1 121 11 cvta.global.u64 %r317,_gfortrani_compile_options; .loc 1 121 10 ld.u32 %r318,[%r317+36]; setp.eq.u32 %r319,%r318,0; @ %r319 bra $L25; .loc 1 123 4 setp.le.s64 %r320,%r127,0; @ %r320 bra $L26; add.u64 %r170,%r201,48; add.u64 %r374,%frame,240; mov.u64 %r168,%r374; add.u64 %r162,%r201,56; cvt.u32.u32 %r322,%r23; cvt.s64.s8 %r321,%r322; add.u64 %r149,%r321,-1; .loc 1 123 10 mov.u64 %r114,0; $L28: .loc 1 127 21 ld.u64 %r324,[%r162]; add.u64 %r323,%r324,1; .loc 1 127 19 ld.u64 %r325,[%r170]; sub.u64 %r133,%r323,%r325; .loc 1 128 18 ld.u64 %r71,[%r168]; .loc 1 128 11 setp.eq.u64 %r326,%r71,%r133; @ %r326 bra $L27; .loc 1 129 3 st.u64 [%stack+16],%r71; st.u64 [%stack+8],%r133; cvt.u32.u64 %r329,%r114; add.u32 %r328,%r329,1; st.u32 [%stack],%r328; cvta.const.u64 %r327,$LC1stack; call _gfortran_runtime_error27: .loc 1 123 25 add.u64 %r114,%r114,1; .loc 1 123 4 add.u64 %r170,%r170,24; add.u64 %r168,%r168,8; add.u64 %r162,%r162,24; setp.ne.u64 %r331,%r114,%r149; @ %r331 bra $L28; bra $L20; $L23: .loc 1 139 16 st.u64 [%r189],%r332; .loc 1 140 18 ld.u64 %r333,[%r188]; st.u64 [%r186],%r333; .loc 1 141 17 shl.b64 %r335,%r142,3; add.u64 %r336,%r374,%r335; .loc 1 141 10 ld.u64 %r337,[%r336]; setp.le.s64 %r338,%r337,0; @ %r338 bra $L1; .loc 1 137 26 add.u64 %r142,%r142,1; .loc 1 137 3 add.u64 %r189,%r189,8; add.u64 %r188,%r188,24; add.u64 %r186,%r186,8; setp.gt.s64 %r339,%r127,%r142; @ %r339 bra $L23; bra $L26; $L53: .loc 1 149 7 setp.eq.u32 %r341,%r76,16; @ %r341 bra $L21; .loc 1 157 5 cvta.const.u64 %r343,$LC2; mov.u64 %r342,3343; call _gfortrani_internal_error61 9 add.u64 %r344,%r27,1; .loc 1 61 7 sub.u64 %r129,%r344,%r29; max.s64 %r131,%r129,0; .loc 1 65 11 mul.lo.u64 %r33,%r30,%r26; .loc 1 159 8 ld.u64 %r141,[%r201]; ld.u64 %r166,[%frame+360]; .loc 1 189 22 ld.u64 %r86,[%frame+120]; .loc 1 190 22 ld.u64 %r88,[%frame]; .loc 1 190 12 shl.b64 %r89,%r88,3; .loc 1 192 32 ld.u64 %r164,[%frame+240]; .loc 1 199 30 mul.lo.u64 %r345,%r164,%r86; .loc 1 199 16 neg.s64 %r150,%r345; .loc 1 200 30 mul.lo.u64 %r346,%r164,%r88; .loc 1 200 16 shl.b64 %r347,%r346,3; neg.s64 %r148,%r347; setp.gt.s64 %r372,%r129,0; setp.le.s32 %r373,%r25,1; .loc 1 180 14 mov.u64 %r377,0; cvt.u32.u32 %r378,%r23; cvt.s64.s8 %r379,%r378; add.u64 %r380,%r379,-1; add.u64 %r381,%frame,120; add.u64 %r382,%frame,240; $L36: .loc 1 171 12 @ %r372 bra $L37; .loc 1 172 10 mov.u64 %r349,1; st.u64 [%r141],%r349; bra $L30; $L37: mov.u64 %r136,%r140; .loc 1 175 13 mov.u64 %r135,0; $L29: .loc 1 178 6 ld.s8 %r351,[%r136]; cvt.u16.u32 %r350,%r351; setp.eq.u16 %r352,%r350,0; @ %r352 bra $L38; .loc 1 175 28 add.u64 %r135,%r135,1; .loc 1 175 36 add.u64 %r136,%r136,%r33; .loc 1 175 6 setp.gt.s64 %r353,%r131,%r135; @ %r353 bra $L29; .loc 1 170 10 mov.u64 %r116,1; bra $L31; $L38: .loc 1 180 
14 mov.u64 %r116,%r377; $L31: .loc 1 184 12 st.u64 [%r141],%r116; $L30: .loc 1 188 15 add.u64 %r166,%r166,1; .loc 1 189 12 add.u64 %r140,%r140,%r86; .loc 1 190 12 add.u64 %r141,%r141,%r89; .loc 1 192 13 setp.ne.u64 %r354,%r166,%r164; @ %r354 bra $L36; .loc 1 199 16 add.u64 %r137,%r140,%r150; .loc 1 200 16 add.u64 %r138,%r141,%r148; .loc 1 202 14 @ %r373 bra $L1; add.u64 %r32,%frame,368; mov.u64 %r165,8; .loc 1 201 12 mov.u64 %r139,1; bra $L34; $L35: .loc 1 196 20 st.u64 [%r32],%r377; .loc 1 199 30 mul.lo.u64 %r360,%r101,%r100; .loc 1 199 16 sub.u64 %r137,%r140,%r360; .loc 1 200 30 mul.lo.u64 %r361,%r103,%r100; .loc 1 200 16 shl.b64 %r362,%r361,3; sub.u64 %r138,%r141,%r362; .loc 1 201 12 add.u64 %r139,%r139,1; .loc 1 202 14 add.u64 %r32,%r32,8; add.u64 %r165,%r165,8; setp.eq.u64 %r363,%r139,%r380; @ %r363 bra $L1; $L34: .loc 1 210 23 ld.u64 %r364,[%r32]; add.u64 %r100,%r364,1; st.u64 [%r32],%r100; .loc 1 211 30 add.u64 %r366,%r381,%r165; ld.u64 %r101,[%r366]; .loc 1 211 20 add.u64 %r140,%r137,%r101; .loc 1 212 30 add.u64 %r367,%frame,%r165; ld.u64 %r103,[%r367]; .loc 1 212 20 shl.b64 %r368,%r103,3; add.u64 %r141,%r138,%r368; .loc 1 192 32 add.u64 %r370,%r382,%r165; ld.u64 %r107,[%r370]; .loc 1 192 13 setp.eq.u64 %r371,%r100,%r107; @ %r371 bra $L35; .loc 1 196 20 mov.u64 %r166,0; bra $L36; $L1: .loc 1 216all_l16.o/_gfortran_all_l16 .visible .func _gfortran_all_l16fortran/generated/all_l16.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,65,76,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,10810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,65,76,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,32,105,110,32,65,76,76,32,105,110,116,114,105,110,115,105,99_gfortran_all_l1648u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .u64 %r232; .reg .u64 %r233; .reg .pred %r234; .reg .pred %r236; .reg .u64u64u64 %r257; .reg .u64 %r258; .reg .predu64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64pred %r320; .reg .u64u64 %r332; .reg .u64 %r333; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .pred %r338; .reg .pred %r339; .reg .pred %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64 %r345; .reg .u64 %r346; .reg .u64 %r347; .reg .u64 %r349; .reg .u16 %r351; .reg .u32 %r352; .reg .predpredmov.u64 %r201,%ar0; mov.u64 %r202,%ar1; mov.u64 %r203,%ar2; .loc 1 56 10 ld.u64 %r22,[%r203]; .loc 1 56 7 add.u64 %r126,%r22,-1; .loc 1 57 10 ld.s8 %r23,[%r202+28]; .loc 1 57 38 add.u32 %r25,%r23,-1; .loc 1 57 8 cvt.s64.s32 %r127,%r25; .loc 1 59 14 ld.u64 %r26,[%r202+16]; .loc 1 61 9 add.u64 %r205,%r126,%r126; add.u64 %r206,%r205,%r126; shl.b64 %r207,%r206,3; add.u64 %r208,%r202,%r207; ld.u64 %r27,[%r208+56]; ld.u64 %r29,[%r208+48]; .loc 1 65 11 ld.u64 %r30,[%r208+40]; .loc 1 67 3 setp.gt.s64 %r222,%r126,0; @ %r222 bra $L2; $L8: .loc 1 75 3 
setp.lt.s64 %r223,%r126,%r127; @ %r223 bra $L3; bra $L4; $L2: add.u64 %r151,%r202,40; add.u64 %r52,%frame,120; add.u64 %r109,%frame,240; add.u64 %r225,%r22,%r22; add.u64 %r226,%r225,%r22; shl.b64 %r227,%r226,3; add.u64 %r228,%r202,16; add.u64 %r160,%r227,%r228; .loc 1 73 12 mov.u64 %r390,0; $L7: .loc 1 69 20 ld.u64 %r230,[%r151]; mul.lo.u64 %r229,%r230,%r26; .loc 1 69 18 st.u64 [%r52],%r229; .loc 1 70 19 ld.u64 %r232,[%r151+16]; add.u64 %r231,%r232,1; ld.u64 %r233,[%r151+8]; sub.u64 %r41,%r231,%r233; .loc 1 72 10 setp.lt.s64 %r234,%r41,0; @ %r234 bra $L5; .loc 1 70 17 st.u64 [%r109],%r41; bra $L6; $L5: .loc 1 73 12 st.u64 [%r109],%r390; $L6: .loc 1 67 3 add.u64 %r151,%r151,24; add.u64 %r52,%r52,8; add.u64 %r109,%r109,8; setp.ne.u64 %r236,%r151,%r160; @ %r236 bra $L7; bra $L8; $L4: .loc 1 84 6 ld.u64 %r237,[%r201]; setp.eq.u64 %r238,%r237,0; @ ! %r238 bra $L51; bra $L9; $L3: add.u64 %r240,%r22,%r22; add.u64 %r241,%r240,%r22; shl.b64 %r242,%r241,3; add.u64 %r243,%r242,40; add.u64 %r145,%r202,%r243; shl.b64 %r244,%r22,3; add.u64 %r112,%r244,-8; add.u64 %r379,%frame,120; add.u64 %r117,%r379,%r112; add.u64 %r378,%frame,240; add.u64 %r110,%r378,%r112; add.u64 %r247,%r202,40; cvt.u32.u32 %r249,%r23; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r198,%r247,%r253; .loc 1 81 12 mov.u64 %r389,0; $L13: .loc 1 77 20 ld.u64 %r255,[%r145]; mul.lo.u64 %r254,%r255,%r26; .loc 1 77 18 st.u64 [%r117],%r254; .loc 1 78 19 ld.u64 %r257,[%r145+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r145+8]; sub.u64 %r49,%r256,%r258; .loc 1 80 10 setp.lt.s64 %r259,%r49,0; @ %r259 bra $L11; .loc 1 78 17 st.u64 [%r110],%r49; bra $L12; $L11: .loc 1 81 12 st.u64 [%r110],%r389; $L12: .loc 1 75 3 add.u64 %r145,%r145,24; add.u64 %r117,%r117,8; add.u64 %r110,%r110,8; setp.ne.u64 %r261,%r145,%r198; @ %r261 bra $L13; bra $L4; $L9: .loc 1 88 7 setp.le.s32 %r262,%r25,0; @ %r262 bra $L14; add.u64 %r179,%r201,40; add.u64 %r177,%frame,240; add.u64 %r263,%r201,16; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r171,%r263,%r269; .loc 1 91 17 mov.u64 %r115,1; bra $L15; $L14: .loc 1 99 24 mov.u64 %r270,0; st.u64 [%r201+8],%r270; .loc 1 100 28 cvt.u16.u32 %r273,%r23; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r201+28],%r274; .loc 1 102 20 add.u32 %r275,%r23,-2; cvt.s64.s32 %r58,%r275; add.u64 %r277,%r58,%r58; add.u64 %r278,%r277,%r58; shl.b64 %r279,%r278,3; add.u64 %r280,%r201,%r279; .loc 1 102 67 shl.b64 %r282,%r58,3; add.u64 %r283,%frame,%r282; .loc 1 102 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r134,%r285,%r286; .loc 1 104 10 setp.eq.u64 %r287,%r134,0; @ %r287 bra $L16; bra $L52; $L18: .loc 1 93 55 mul.lo.u64 %r115,%r54,%r115; $L15: .loc 1 95 4 st.u64 [%r179+8],%r237; ld.u64 %r54,[%r177]; add.u64 %r289,%r54,-1; st.u64 [%r179+16],%r289; st.u64 [%r179],%r115; .loc 1 88 7 add.u64 %r179,%r179,24; add.u64 %r177,%r177,8; setp.ne.u64 %r290,%r171,%r179; @ %r290 bra $L18; bra $L14; $L16: .loc 1 107 4 st.u64 [%r201+48],%r134; mov.u64 %r292,-1; st.u64 [%r201+56],%r292; mov.u64 %r293,1; st.u64 [%r201+40],%r293; .loc 1 108 4 bra $L1; $L52: .loc 1 111 24 mov.u64 %r295,16295; call (%value_in),_gfortrani_xmallocarray296,[%value_in]; } .loc 1 111 22 st.u64 [%r201],%r296; $L25: .loc 1 137 3 setp.gt.s32 %r298,%r25,0; @ ! 
%r298 bra $L26; add.u64 %r378,%frame,240; bra $L20; $L26: .loc 1 145 8 ld.u64 %r140,[%r202]; .loc 1 147 21 cvt.u32.u64 %r76,%r26; .loc 1 147 50 add.u32 %r299,%r76,-4; and.b32 %r300,%r299,-5; set.u32.eq.u32 %r302,%r300,0; neg.s32 %r303,%r302; .loc 1 147 21 add.u32 %r304,%r76,-1; set.u32.le.u32 %r306,%r304,1; neg.s32 %r307,%r306; .loc 1 147 6 cvt.u16.u32 %r309,%r303; cvt.u16.u32 %r310,%r307; orbra $L53; $L20: add.u64 %r189,%frame,360; add.u64 %r188,%r201,40; mov.u64 %r186,%frame; .loc 1 123 10 mov.u64 %r142,0; .loc 1 139 16 mov.u64 %r332,%r142; bra $L23; $L51: .loc 1 115 19 ld.s8 %r65,[%r201+28]; .loc 1 115 10 setp.eq.u64 %r314,%r65,%r127; @ %r314 bra $L24; .loc 1 116 2 st.u64 [%stack+8],%r127; st.u64 [%stack],%r65; cvta.const.u64 %r315315_gfortran_runtime_error24: .loc 1 121 11 cvta.global.u64 %r317,_gfortrani_compile_options; .loc 1 121 10 ld.u32 %r318,[%r317+36]; setp.eq.u32 %r319,%r318,0; @ %r319 bra $L25; .loc 1 123 4 setp.le.s64 %r320,%r127,0; @ %r320 bra $L26; add.u64 %r170,%r201,48; add.u64 %r378,%frame,240; mov.u64 %r168,%r378; add.u64 %r162,%r201,56; cvt.u32.u32 %r322,%r23; cvt.s64.s8 %r321,%r322; add.u64 %r149,%r321,-1; .loc 1 123 10 mov.u64 %r114,0; $L28: .loc 1 127 21 ld.u64 %r324,[%r162]; add.u64 %r323,%r324,1; .loc 1 127 19 ld.u64 %r325,[%r170]; sub.u64 %r133,%r323,%r325; .loc 1 128 18 ld.u64 %r71,[%r168]; .loc 1 128 11 setp.eq.u64 %r326,%r71,%r133; @ %r326 bra $L27; .loc 1 129 3 st.u64 [%stack+16],%r71; st.u64 [%stack+8],%r133; cvt.u32.u64 %r329,%r114; add.u32 %r328,%r329,1; st.u32 [%stack],%r328; cvta.const.u64 %r327,$LC1stack; call _gfortran_runtime_error27: .loc 1 123 25 add.u64 %r114,%r114,1; .loc 1 123 4 add.u64 %r170,%r170,24; add.u64 %r168,%r168,8; add.u64 %r162,%r162,24; setp.ne.u64 %r331,%r114,%r149; @ %r331 bra $L28; bra $L20; $L23: .loc 1 139 16 st.u64 [%r189],%r332; .loc 1 140 18 ld.u64 %r333,[%r188]; st.u64 [%r186],%r333; .loc 1 141 17 shl.b64 %r335,%r142,3; add.u64 %r336,%r378,%r335; .loc 1 141 10 ld.u64 %r337,[%r336]; setp.le.s64 %r338,%r337,0; @ %r338 bra $L1; .loc 1 137 26 add.u64 %r142,%r142,1; .loc 1 137 3 add.u64 %r189,%r189,8; add.u64 %r188,%r188,24; add.u64 %r186,%r186,8; setp.gt.s64 %r339,%r127,%r142; @ %r339 bra $L23; bra $L26; $L53: .loc 1 149 7 setp.eq.u32 %r341,%r76,16; @ %r341 bra $L21; .loc 1 157 5 cvta.const.u64 %r343,$LC2; mov.u64 %r342,3343; call _gfortrani_internal_error61 9 add.u64 %r344,%r27,1; .loc 1 61 7 sub.u64 %r129,%r344,%r29; max.s64 %r131,%r129,0; .loc 1 65 11 mul.lo.u64 %r33,%r30,%r26; .loc 1 159 8 ld.u64 %r141,[%r201]; ld.u64 %r166,[%frame+360]; .loc 1 189 22 ld.u64 %r86,[%frame+120]; .loc 1 190 22 ld.u64 %r88,[%frame]; .loc 1 190 12 shl.b64 %r89,%r88,4; .loc 1 192 32 ld.u64 %r164,[%frame+240]; .loc 1 199 30 mul.lo.u64 %r345,%r164,%r86; .loc 1 199 16 neg.s64 %r150,%r345; .loc 1 200 30 mul.lo.u64 %r346,%r164,%r88; .loc 1 200 16 shl.b64 %r347,%r346,4; neg.s64 %r148,%r347; setp.gt.s64 %r380,%r129,0; setp.le.s32 %r377,%r25,1; .loc 1 180 14 mov.u64 %r382,0; cvt.u32.u32 %r384,%r23; cvt.s64.s8 %r385,%r384; add.u64 %r386,%r385,-1; add.u64 %r387,%frame,120; add.u64 %r388,%frame,240; $L36: .loc 1 171 12 @ %r380 bra $L37; .loc 1 172 10 mov.u64 %r349,1; st.u64 [%r141],%r349; st.u64 [%r141+8],%r382; bra $L30; $L37: mov.u64 %r136,%r140; .loc 1 175 13 mov.u64 %r135,0; $L29: .loc 1 178 6 ld.s8 %r352,[%r136]; cvt.u16.u32 %r351,%r352; setp.eq.u16 %r353,%r351,0; @ %r353 bra $L38; .loc 1 175 28 add.u64 %r135,%r135,1; .loc 1 175 36 add.u64 %r136,%r136,%r33; .loc 1 175 6 setp.gt.s64 %r354,%r131,%r135; @ %r354 bra $L29; .loc 1 170 10 mov.u64 %r375,1; 
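// _gfortran_all_l16 result selection (descriptive note, hedged): %r375/%r376 appear
// to hold the low/high 64-bit halves of the 16-byte logical result; this fall-through
// path sets 1/0 when no zero element was seen, $L38 sets 0/0 on the first zero, and
// $L31 stores both halves via st.u64 [%r141] and st.u64 [%r141+8].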
mov.u64 %r376,%r382; bra $L31; $L38: .loc 1 180 14 mov.u64 %r375,%r382; mov.u64 %r376,%r382; $L31: .loc 1 184 12 st.u64 [%r141],%r375; st.u64 [%r141+8],%r376; $L30: .loc 1 188 15 add.u64 %r166,%r166,1; .loc 1 189 12 add.u64 %r140,%r140,%r86; .loc 1 190 12 add.u64 %r141,%r141,%r89; .loc 1 192 13 setp.ne.u64 %r357,%r166,%r164; @ %r357 bra $L36; .loc 1 199 16 add.u64 %r137,%r140,%r150; .loc 1 200 16 add.u64 %r138,%r141,%r148; .loc 1 202 14 @ %r377 bra $L1; add.u64 %r32,%frame,368; mov.u64 %r165,8; .loc 1 201 12 mov.u64 %r139,1; bra $L34; $L35: .loc 1 196 20 st.u64 [%r32],%r382; .loc 1 199 30 mul.lo.u64 %r363,%r101,%r100; .loc 1 199 16 sub.u64 %r137,%r140,%r363; .loc 1 200 30 mul.lo.u64 %r364,%r103,%r100; .loc 1 200 16 shl.b64 %r365,%r364,4; sub.u64 %r138,%r141,%r365; .loc 1 201 12 add.u64 %r139,%r139,1; .loc 1 202 14 add.u64 %r32,%r32,8; add.u64 %r165,%r165,8; setp.eq.u64 %r366,%r139,%r386; @ %r366 bra $L1; $L34: .loc 1 210 23 ld.u64 %r367,[%r32]; add.u64 %r100,%r367,1; st.u64 [%r32],%r100; .loc 1 211 30 add.u64 %r369,%r387,%r165; ld.u64 %r101,[%r369]; .loc 1 211 20 add.u64 %r140,%r137,%r101; .loc 1 212 30 add.u64 %r370,%frame,%r165; ld.u64 %r103,[%r370]; .loc 1 212 20 shl.b64 %r371,%r103,4; add.u64 %r141,%r138,%r371; .loc 1 192 32 add.u64 %r373,%r388,%r165; ld.u64 %r107,[%r373]; .loc 1 192 13 setp.eq.u64 %r374,%r100,%r107; @ %r374 bra $L35; .loc 1 196 20 mov.u64 %r166,0; bra $L36; $L1: .loc 1 216 any_l1.o/_gfortran_any_l1 .visible .func _gfortran_any_l1fortran/generated/any_l1.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,65,78,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,10810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,65,78,89,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,32,105,110,32,65,78,89,32,105,110,116,114,105,110,115,105,99_gfortran_any_l1 .visible .func _gfortran_any_l148pred %r217; .reg .predu64 %r228; .reg .pred %r229; .reg .pred %r231; .reg .u64 %r232; .reg .predu32u64 %r250; .reg .u64 %r251; .reg .u64 %r252; .reg .u64 %r253; .reg .predu32u16u64pred %r293; .reg .u32 %r294; .reg .u32predu64 %r318; .reg .u6464u64 %r341; .reg .u32predpred %r362; .reg .predu64 %r376; mov.u64 %r196,%ar0; mov.u64 %r197,%ar1; mov.u64 %r198,%ar2; .loc 1 56 10 ld.u64 %r22,[%r198]; .loc 1 56 7 add.u64 %r121,%r22,-1; .loc 1 57 10 ld.s8 %r23,[%r197+28]; .loc 1 57 38 add.u32 %r25,%r23,-1; .loc 1 57 8 cvt.s64.s32 %r122,%r25; .loc 1 59 14 ld.u64 %r26,[%r197+16]; .loc 1 61 9 add.u64 %r200,%r121,%r121; add.u64 %r201,%r200,%r121; shl.b64 %r202,%r201,3; add.u64 %r203,%r197,%r202; ld.u64 %r27,[%r203+56]; ld.u64 %r29,[%r203+48]; .loc 1 65 11 ld.u64 %r30,[%r203+40]; .loc 1 67 3 setp.gt.s64 %r217,%r121,0; @ %r217 bra $L2; $L8: .loc 1 75 3 setp.lt.s64 %r218,%r121,%r122; @ %r218 bra $L3; bra $L4; $L2: add.u64 %r52,%r197,40; add.u64 %r148,%frame,120; add.u64 
%r67,%frame,240; add.u64 %r220,%r22,%r22; add.u64 %r221,%r220,%r22; shl.b64 %r222,%r221,3; add.u64 %r223,%r197,16; add.u64 %r193,%r222,%r223; .loc 1 73 12 mov.u64 %r376,0; $L7: .loc 1 69 20 ld.u64 %r225,[%r52]; mul.lo.u64 %r224,%r225,%r26; .loc 1 69 18 st.u64 [%r148],%r224; .loc 1 70 19 ld.u64 %r227,[%r52+16]; add.u64 %r226,%r227,1; ld.u64 %r228,[%r52+8]; sub.u64 %r41,%r226,%r228; .loc 1 72 10 setp.lt.s64 %r229,%r41,0; @ %r229 bra $L5; .loc 1 70 17 st.u64 [%r67],%r41; bra $L6; $L5: .loc 1 73 12 st.u64 [%r67],%r376; $L6: .loc 1 67 3 add.u64 %r52,%r52,24; add.u64 %r148,%r148,8; add.u64 %r67,%r67,8; setp.ne.u64 %r231,%r52,%r193; @ %r231 bra $L7; bra $L8; $L4: .loc 1 84 6 ld.u64 %r232,[%r196]; setp.eq.u64 %r233,%r232,0; @ ! %r233 bra $L51; bra $L9; $L3: add.u64 %r235,%r22,%r22; add.u64 %r236,%r235,%r22; shl.b64 %r237,%r236,3; add.u64 %r238,%r237,40; add.u64 %r117,%r197,%r238; shl.b64 %r239,%r22,3; add.u64 %r107,%r239,-8; add.u64 %r366,%frame,120; add.u64 %r109,%r366,%r107; add.u64 %r365,%frame,240; add.u64 %r69,%r365,%r107; add.u64 %r242,%r197,40; cvt.u32.u32 %r244,%r23; cvt.s64.s8 %r243,%r244; add.u64 %r246,%r243,%r243; add.u64 %r247,%r246,%r243; shl.b64 %r248,%r247,3; add.u64 %r147,%r242,%r248; .loc 1 81 12 mov.u64 %r375,0; $L13: .loc 1 77 20 ld.u64 %r250,[%r117]; mul.lo.u64 %r249,%r250,%r26; .loc 1 77 18 st.u64 [%r109],%r249; .loc 1 78 19 ld.u64 %r252,[%r117+16]; add.u64 %r251,%r252,1; ld.u64 %r253,[%r117+8]; sub.u64 %r49,%r251,%r253; .loc 1 80 10 setp.lt.s64 %r254,%r49,0; @ %r254 bra $L11; .loc 1 78 17 st.u64 [%r69],%r49; bra $L12; $L11: .loc 1 81 12 st.u64 [%r69],%r375; $L12: .loc 1 75 3 add.u64 %r117,%r117,24; add.u64 %r109,%r109,8; add.u64 %r69,%r69,8; setp.ne.u64 %r256,%r117,%r147; @ %r256 bra $L13; bra $L4; $L9: .loc 1 88 7 setp.le.s32 %r257,%r25,0; @ %r257 bra $L14; add.u64 %r172,%r196,40; add.u64 %r170,%frame,240; add.u64 %r258,%r196,16; cvt.u32.u32 %r260,%r23; cvt.s64.s8 %r259,%r260; add.u64 %r262,%r259,%r259; add.u64 %r263,%r262,%r259; shl.b64 %r264,%r263,3; add.u64 %r164,%r258,%r264; .loc 1 91 17 mov.u64 %r112,1; bra $L15; $L14: .loc 1 99 24 mov.u64 %r265,0; st.u64 [%r196+8],%r265; .loc 1 100 28 cvt.u16.u32 %r268,%r23; add.u16 %r267,%r268,-1; cvt.u32.u16 %r269,%r267; st.u8 [%r196+28],%r269; .loc 1 102 20 add.u32 %r270,%r23,-2; cvt.s64.s32 %r58,%r270; add.u64 %r272,%r58,%r58; add.u64 %r273,%r272,%r58; shl.b64 %r274,%r273,3; add.u64 %r275,%r196,%r274; .loc 1 102 67 shl.b64 %r277,%r58,3; add.u64 %r278,%frame,%r277; .loc 1 102 59 ld.u64 %r280,[%r275+40]; ld.u64 %r281,[%r278+240]; mul.lo.u64 %r130,%r280,%r281; .loc 1 104 10 setp.eq.u64 %r282,%r130,0; @ %r282 bra $L16; bra $L52; $L18: .loc 1 93 55 mul.lo.u64 %r112,%r54,%r112; $L15: .loc 1 95 4 st.u64 [%r172+8],%r232; ld.u64 %r54,[%r170]; add.u64 %r284,%r54,-1; st.u64 [%r172+16],%r284; st.u64 [%r172],%r112; .loc 1 88 7 add.u64 %r172,%r172,24; add.u64 %r170,%r170,8; setp.ne.u64 %r285,%r164,%r172; @ %r285 bra $L18; bra $L14; $L16: .loc 1 107 4 st.u64 [%r196+48],%r130; mov.u64 %r287,-1; st.u64 [%r196+56],%r287; mov.u64 %r288,1; st.u64 [%r196+40],%r288; .loc 1 108 4 bra $L1; $L52: .loc 1 111 24 mov.u64 %r2901290; call (%value_in),_gfortrani_xmallocarray,(%out_arg1,%out_arg2); ld.param.u64 %r291,[%value_in]; } .loc 1 111 22 st.u64 [%r196],%r291; $L25: .loc 1 137 3 setp.gt.s32 %r293,%r25,0; @ ! 
%r293 bra $L26; add.u64 %r365,%frame,240; bra $L20; $L26: .loc 1 145 8 ld.u64 %r137,[%r197]; .loc 1 147 21 cvt.u32.u64 %r79,%r26; .loc 1 147 50 add.u32 %r294,%r79,-4; and.b32 %r295,%r294,-5; set.u32.eq.u32 %r297,%r295,0; neg.s32 %r298,%r297; .loc 1 147 21 add.u32 %r299,%r79,-1; set.u32.le.u32 %r301,%r299,1; neg.s32 %r302,%r301; .loc 1 147 6 cvt.u16.u32 %r304,%r298; cvt.u16.u32 %r305,%r302; orbra $L53; $L20: add.u64 %r182,%frame,360; add.u64 %r181,%r196,40; mov.u64 %r179,%frame; .loc 1 123 10 mov.u64 %r111,0; .loc 1 139 16 mov.u64 %r327,%r111; bra $L23; $L51: .loc 1 115 19 ld.s8 %r65,[%r196+28]; .loc 1 115 10 setp.eq.u64 %r309,%r65,%r122; @ %r309 bra $L24; .loc 1 116 2 st.u64 [%stack+8],%r122; st.u64 [%stack],%r65; cvta.const.u64 %r310310_gfortran_runtime_error24: .loc 1 121 11 cvta.global.u64 %r312,_gfortrani_compile_options; .loc 1 121 10 ld.u32 %r313,[%r312+36]; setp.eq.u32 %r314,%r313,0; @ %r314 bra $L25; .loc 1 123 4 setp.le.s64 %r315,%r122,0; @ %r315 bra $L26; add.u64 %r163,%r196,48; add.u64 %r365,%frame,240; mov.u64 %r155,%r365; add.u64 %r150,%r196,56; cvt.u32.u32 %r317,%r23; cvt.s64.s8 %r316,%r317; add.u64 %r133,%r316,-1; .loc 1 123 10 mov.u64 %r129,0; $L28: .loc 1 127 21 ld.u64 %r319,[%r150]; add.u64 %r318,%r319,1; .loc 1 127 19 ld.u64 %r320,[%r163]; sub.u64 %r128,%r318,%r320; .loc 1 128 18 ld.u64 %r73,[%r155]; .loc 1 128 11 setp.eq.u64 %r321,%r73,%r128; @ %r321 bra $L27; .loc 1 129 3 st.u64 [%stack+16],%r73; st.u64 [%stack+8],%r128; cvt.u32.u64 %r324,%r129; add.u32 %r323,%r324,1; st.u32 [%stack],%r323; cvta.const.u64 %r322,$LC1; {stack; call _gfortran_runtime_error27: .loc 1 123 25 add.u64 %r129,%r129,1; .loc 1 123 4 add.u64 %r163,%r163,24; add.u64 %r155,%r155,8; add.u64 %r150,%r150,24; setp.ne.u64 %r326,%r129,%r133; @ %r326 bra $L28; bra $L20; $L23: .loc 1 139 16 st.u64 [%r182],%r327; .loc 1 140 18 ld.u64 %r328,[%r181]; st.u64 [%r179],%r328; .loc 1 141 17 shl.b64 %r330,%r111,3; add.u64 %r331,%r365,%r330; .loc 1 141 10 ld.u64 %r332,[%r331]; setp.le.s64 %r333,%r332,0; @ %r333 bra $L1; .loc 1 137 26 add.u64 %r111,%r111,1; .loc 1 137 3 add.u64 %r182,%r182,8; add.u64 %r181,%r181,24; add.u64 %r179,%r179,8; setp.gt.s64 %r334,%r122,%r111; @ %r334 bra $L23; bra $L26; $L53: .loc 1 149 7 setp.eq.u32 %r336,%r79,16; @ %r336 bra $L21; .loc 1 157 5 cvta.const.u64 %r338,$LC2; mov.u64 %r333338; call _gfortrani_internal_error61 9 add.u64 %r339,%r27,1; .loc 1 61 7 sub.u64 %r124,%r339,%r29; max.s64 %r126,%r124,0; .loc 1 65 11 mul.lo.u64 %r33,%r30,%r26; .loc 1 159 8 ld.u64 %r138,[%r196]; ld.u64 %r161,[%frame+360]; .loc 1 189 22 ld.u64 %r86,[%frame+120]; .loc 1 190 22 ld.u64 %r88,[%frame]; .loc 1 192 32 ld.u64 %r159,[%frame+240]; .loc 1 199 30 mul.lo.u64 %r340,%r159,%r86; .loc 1 199 16 neg.s64 %r146,%r340; .loc 1 200 30 mul.lo.u64 %r341,%r159,%r88; .loc 1 200 16 neg.s64 %r157,%r341; setp.gt.s64 %r363,%r124,0; setp.le.s32 %r364,%r25,1; cvt.u32.u32 %r368,%r23; cvt.s64.s8 %r369,%r368; add.u64 %r370,%r369,-1; add.u64 %r371,%frame,120; add.u64 %r372,%frame,240; .loc 1 196 20 mov.u64 %r373,0; $L36: .loc 1 170 12 @ %r363 bra $L37; .loc 1 171 10 mov.u32 %r343,0; st.u8 [%r138],%r343; bra $L30; $L37: mov.u64 %r132,%r137; .loc 1 174 13 mov.u64 %r131,0; $L29: .loc 1 178 7 ld.s8 %r84,[%r132]; .loc 1 178 6 setp.ne.u32 %r344,%r84,0; @ %r344 bra $L38; .loc 1 174 28 add.u64 %r131,%r131,1; .loc 1 174 36 add.u64 %r132,%r132,%r33; .loc 1 174 6 setp.gt.s64 %r345,%r126,%r131; @ %r345 bra $L29; bra $L31; $L38: .loc 1 180 14 mov.u32 %r84,1; $L31: .loc 1 184 12 cvt.u32.u32 %r346,%r84; st.u8 [%r138],%r346; $L30: .loc 1 188 15 
add.u64 %r161,%r161,1; .loc 1 189 12 add.u64 %r137,%r137,%r86; .loc 1 190 12 add.u64 %r138,%r138,%r88; .loc 1 192 13 setp.ne.u64 %r347,%r161,%r159; @ %r347 bra $L36; .loc 1 199 16 add.u64 %r134,%r137,%r146; .loc 1 200 16 add.u64 %r135,%r138,%r157; .loc 1 202 14 @ %r364 bra $L1; add.u64 %r32,%frame,368; mov.u64 %r160,8; .loc 1 201 12 mov.u64 %r136,1; bra $L34; $L35: .loc 1 196 20 st.u64 [%r32],%r373; .loc 1 199 30 mul.lo.u64 %r353,%r98,%r97; .loc 1 199 16 sub.u64 %r134,%r137,%r353; .loc 1 200 30 mul.lo.u64 %r354,%r100,%r97; .loc 1 200 16 sub.u64 %r135,%r138,%r354; .loc 1 201 12 add.u64 %r136,%r136,1; .loc 1 202 14 add.u64 %r32,%r32,8; add.u64 %r160,%r160,8; setp.eq.u64 %r355,%r136,%r370; @ %r355 bra $L1; $L34: .loc 1 210 23 ld.u64 %r356,[%r32]; add.u64 %r97,%r356,1; st.u64 [%r32],%r97; .loc 1 211 30 add.u64 %r358,%r371,%r160; ld.u64 %r98,[%r358]; .loc 1 211 20 add.u64 %r137,%r134,%r98; .loc 1 212 30 add.u64 %r359,%frame,%r160; ld.u64 %r100,[%r359]; .loc 1 212 20 add.u64 %r138,%r135,%r100; .loc 1 192 32 add.u64 %r361,%r372,%r160; ld.u64 %r103,[%r361]; .loc 1 192 13 setp.eq.u64 %r362,%r97,%r103; @ %r362 bra $L35; .loc 1 196 20 mov.u64 %r161,0; bra $L36; $L1: .loc 1 216any_l2.o/_gfortran_any_l2 .visible .func _gfortran_any_l2fortran/generated/any_l2.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,65,78,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,10810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,65,78,89,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,32,105,110,32,65,78,89,32,105,110,116,114,105,110,115,105,99_gfortran_any_l2 .visible .func _gfortran_any_l248u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .u64 %r232; .reg .u64 %r233; .reg .pred %r234; .reg .pred %r236; .reg .u64u64u64 %r257; .reg .u64 %r258; .reg .predu64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64pred %r320; .reg .u64u64 %r332; .reg .u64 %r333; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .pred %r338; .reg .pred %r339; .reg .pred %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64 %r346; .reg .u64 %r347; .reg .u64 %r348; .reg .u16 %r350; .reg .u1664u64 %r370; .reg .u64 %r372; .reg .predmov.u64 %r201,%ar0; mov.u64 %r202,%ar1; mov.u64 %r203,%ar2; .loc 1 56 10 ld.u64 %r22,[%r203]; .loc 1 56 7 add.u64 %r126,%r22,-1; .loc 1 57 10 ld.s8 %r23,[%r202+28]; .loc 1 57 38 add.u32 %r25,%r23,-1; .loc 1 57 8 cvt.s64.s32 %r127,%r25; .loc 1 59 14 ld.u64 %r26,[%r202+16]; .loc 1 61 9 add.u64 %r205,%r126,%r126; add.u64 %r206,%r205,%r126; shl.b64 %r207,%r206,3; add.u64 %r208,%r202,%r207; ld.u64 %r27,[%r208+56]; ld.u64 %r29,[%r208+48]; .loc 1 65 11 ld.u64 %r30,[%r208+40]; .loc 1 67 3 setp.gt.s64 %r222,%r126,0; @ %r222 bra $L2; $L8: .loc 1 75 3 setp.lt.s64 %r223,%r126,%r127; @ %r223 bra $L3; bra $L4; $L2: add.u64 %r151,%r202,40; add.u64 %r52,%frame,120; add.u64 
%r109,%frame,240; add.u64 %r225,%r22,%r22; add.u64 %r226,%r225,%r22; shl.b64 %r227,%r226,3; add.u64 %r228,%r202,16; add.u64 %r160,%r227,%r228; .loc 1 73 12 mov.u64 %r387,0; $L7: .loc 1 69 20 ld.u64 %r230,[%r151]; mul.lo.u64 %r229,%r230,%r26; .loc 1 69 18 st.u64 [%r52],%r229; .loc 1 70 19 ld.u64 %r232,[%r151+16]; add.u64 %r231,%r232,1; ld.u64 %r233,[%r151+8]; sub.u64 %r41,%r231,%r233; .loc 1 72 10 setp.lt.s64 %r234,%r41,0; @ %r234 bra $L5; .loc 1 70 17 st.u64 [%r109],%r41; bra $L6; $L5: .loc 1 73 12 st.u64 [%r109],%r387; $L6: .loc 1 67 3 add.u64 %r151,%r151,24; add.u64 %r52,%r52,8; add.u64 %r109,%r109,8; setp.ne.u64 %r236,%r151,%r160; @ %r236 bra $L7; bra $L8; $L4: .loc 1 84 6 ld.u64 %r237,[%r201]; setp.eq.u64 %r238,%r237,0; @ ! %r238 bra $L51; bra $L9; $L3: add.u64 %r240,%r22,%r22; add.u64 %r241,%r240,%r22; shl.b64 %r242,%r241,3; add.u64 %r243,%r242,40; add.u64 %r145,%r202,%r243; shl.b64 %r244,%r22,3; add.u64 %r112,%r244,-8; add.u64 %r376,%frame,120; add.u64 %r117,%r376,%r112; add.u64 %r375,%frame,240; add.u64 %r110,%r375,%r112; add.u64 %r247,%r202,40; cvt.u32.u32 %r249,%r23; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r198,%r247,%r253; .loc 1 81 12 mov.u64 %r386,0; $L13: .loc 1 77 20 ld.u64 %r255,[%r145]; mul.lo.u64 %r254,%r255,%r26; .loc 1 77 18 st.u64 [%r117],%r254; .loc 1 78 19 ld.u64 %r257,[%r145+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r145+8]; sub.u64 %r49,%r256,%r258; .loc 1 80 10 setp.lt.s64 %r259,%r49,0; @ %r259 bra $L11; .loc 1 78 17 st.u64 [%r110],%r49; bra $L12; $L11: .loc 1 81 12 st.u64 [%r110],%r386; $L12: .loc 1 75 3 add.u64 %r145,%r145,24; add.u64 %r117,%r117,8; add.u64 %r110,%r110,8; setp.ne.u64 %r261,%r145,%r198; @ %r261 bra $L13; bra $L4; $L9: .loc 1 88 7 setp.le.s32 %r262,%r25,0; @ %r262 bra $L14; add.u64 %r179,%r201,40; add.u64 %r177,%frame,240; add.u64 %r263,%r201,16; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r171,%r263,%r269; .loc 1 91 17 mov.u64 %r115,1; bra $L15; $L14: .loc 1 99 24 mov.u64 %r270,0; st.u64 [%r201+8],%r270; .loc 1 100 28 cvt.u16.u32 %r273,%r23; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r201+28],%r274; .loc 1 102 20 add.u32 %r275,%r23,-2; cvt.s64.s32 %r58,%r275; add.u64 %r277,%r58,%r58; add.u64 %r278,%r277,%r58; shl.b64 %r279,%r278,3; add.u64 %r280,%r201,%r279; .loc 1 102 67 shl.b64 %r282,%r58,3; add.u64 %r283,%frame,%r282; .loc 1 102 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r134,%r285,%r286; .loc 1 104 10 setp.eq.u64 %r287,%r134,0; @ %r287 bra $L16; bra $L52; $L18: .loc 1 93 55 mul.lo.u64 %r115,%r54,%r115; $L15: .loc 1 95 4 st.u64 [%r179+8],%r237; ld.u64 %r54,[%r177]; add.u64 %r289,%r54,-1; st.u64 [%r179+16],%r289; st.u64 [%r179],%r115; .loc 1 88 7 add.u64 %r179,%r179,24; add.u64 %r177,%r177,8; setp.ne.u64 %r290,%r171,%r179; @ %r290 bra $L18; bra $L14; $L16: .loc 1 107 4 st.u64 [%r201+48],%r134; mov.u64 %r292,-1; st.u64 [%r201+56],%r292; mov.u64 %r293,1; st.u64 [%r201+40],%r293; .loc 1 108 4 bra $L1; $L52: .loc 1 111 24 mov.u64 %r295,1295; call (%value_in),_gfortrani_xmallocarray296,[%value_in]; } .loc 1 111 22 st.u64 [%r201],%r296; $L25: .loc 1 137 3 setp.gt.s32 %r298,%r25,0; @ ! 
%r298 bra $L26; add.u64 %r375,%frame,240; bra $L20; $L26: .loc 1 145 8 ld.u64 %r140,[%r202]; .loc 1 147 21 cvt.u32.u64 %r76,%r26; .loc 1 147 50 add.u32 %r299,%r76,-4; and.b32 %r300,%r299,-5; set.u32.eq.u32 %r302,%r300,0; neg.s32 %r303,%r302; .loc 1 147 21 add.u32 %r304,%r76,-1; set.u32.le.u32 %r306,%r304,1; neg.s32 %r307,%r306; .loc 1 147 6 cvt.u16.u32 %r309,%r303; cvt.u16.u32 %r310,%r307; orbra $L53; $L20: add.u64 %r189,%frame,360; add.u64 %r188,%r201,40; mov.u64 %r186,%frame; .loc 1 123 10 mov.u64 %r142,0; .loc 1 139 16 mov.u64 %r332,%r142; bra $L23; $L51: .loc 1 115 19 ld.s8 %r65,[%r201+28]; .loc 1 115 10 setp.eq.u64 %r314,%r65,%r127; @ %r314 bra $L24; .loc 1 116 2 st.u64 [%stack+8],%r127; st.u64 [%stack],%r65; cvta.const.u64 %r315315_gfortran_runtime_error24: .loc 1 121 11 cvta.global.u64 %r317,_gfortrani_compile_options; .loc 1 121 10 ld.u32 %r318,[%r317+36]; setp.eq.u32 %r319,%r318,0; @ %r319 bra $L25; .loc 1 123 4 setp.le.s64 %r320,%r127,0; @ %r320 bra $L26; add.u64 %r170,%r201,48; add.u64 %r375,%frame,240; mov.u64 %r168,%r375; add.u64 %r162,%r201,56; cvt.u32.u32 %r322,%r23; cvt.s64.s8 %r321,%r322; add.u64 %r149,%r321,-1; .loc 1 123 10 mov.u64 %r114,0; $L28: .loc 1 127 21 ld.u64 %r324,[%r162]; add.u64 %r323,%r324,1; .loc 1 127 19 ld.u64 %r325,[%r170]; sub.u64 %r133,%r323,%r325; .loc 1 128 18 ld.u64 %r71,[%r168]; .loc 1 128 11 setp.eq.u64 %r326,%r71,%r133; @ %r326 bra $L27; .loc 1 129 3 st.u64 [%stack+16],%r71; st.u64 [%stack+8],%r133; cvt.u32.u64 %r329,%r114; add.u32 %r328,%r329,1; st.u32 [%stack],%r328; cvta.const.u64 %r327,$LC1stack; call _gfortran_runtime_error27: .loc 1 123 25 add.u64 %r114,%r114,1; .loc 1 123 4 add.u64 %r170,%r170,24; add.u64 %r168,%r168,8; add.u64 %r162,%r162,24; setp.ne.u64 %r331,%r114,%r149; @ %r331 bra $L28; bra $L20; $L23: .loc 1 139 16 st.u64 [%r189],%r332; .loc 1 140 18 ld.u64 %r333,[%r188]; st.u64 [%r186],%r333; .loc 1 141 17 shl.b64 %r335,%r142,3; add.u64 %r336,%r375,%r335; .loc 1 141 10 ld.u64 %r337,[%r336]; setp.le.s64 %r338,%r337,0; @ %r338 bra $L1; .loc 1 137 26 add.u64 %r142,%r142,1; .loc 1 137 3 add.u64 %r189,%r189,8; add.u64 %r188,%r188,24; add.u64 %r186,%r186,8; setp.gt.s64 %r339,%r127,%r142; @ %r339 bra $L23; bra $L26; $L53: .loc 1 149 7 setp.eq.u32 %r341,%r76,16; @ %r341 bra $L21; .loc 1 157 5 cvta.const.u64 %r343,$LC2; mov.u64 %r342,3343; call _gfortrani_internal_error61 9 add.u64 %r344,%r27,1; .loc 1 61 7 sub.u64 %r129,%r344,%r29; max.s64 %r131,%r129,0; .loc 1 65 11 mul.lo.u64 %r33,%r30,%r26; .loc 1 159 8 ld.u64 %r141,[%r201]; ld.u64 %r166,[%frame+360]; .loc 1 189 22 ld.u64 %r86,[%frame+120]; .loc 1 190 22 ld.u64 %r88,[%frame]; .loc 1 190 12 add.u64 %r89,%r88,%r88; .loc 1 192 32 ld.u64 %r164,[%frame+240]; .loc 1 199 30 mul.lo.u64 %r346,%r164,%r86; .loc 1 199 16 neg.s64 %r150,%r346; .loc 1 200 30 mul.lo.u64 %r347,%r164,%r88; .loc 1 200 16 add.u64 %r348,%r347,%r347; neg.s64 %r148,%r348; setp.gt.s64 %r377,%r129,0; setp.le.s32 %r374,%r25,1; cvt.u32.u32 %r379,%r23; cvt.s64.s8 %r380,%r379; add.u64 %r381,%r380,-1; add.u64 %r382,%frame,120; add.u64 %r383,%frame,240; .loc 1 196 20 mov.u64 %r384,0; $L36: .loc 1 170 12 @ %r377 bra $L37; .loc 1 171 10 mov.u16 %r350,0; st.u16 [%r141],%r350; bra $L30; $L37: mov.u64 %r136,%r140; .loc 1 174 13 mov.u64 %r135,0; $L29: .loc 1 178 6 ld.s8 %r352,[%r136]; cvt.u16.u32 %r351,%r352; setp.ne.u16 %r353,%r351,0; @ %r353 bra $L38; .loc 1 174 28 add.u64 %r135,%r135,1; .loc 1 174 36 add.u64 %r136,%r136,%r33; .loc 1 174 6 setp.gt.s64 %r354,%r131,%r135; @ %r354 bra $L29; .loc 1 169 10 mov.u32 %r116,0; bra $L31; $L38: .loc 
1 180 14 mov.u32 %r116,1; $L31: .loc 1 184 12 cvt.u16.u32 %r355,%r116; st.u16 [%r141],%r355; $L30: .loc 1 188 15 add.u64 %r166,%r166,1; .loc 1 189 12 add.u64 %r140,%r140,%r86; .loc 1 190 12 add.u64 %r141,%r141,%r89; .loc 1 192 13 setp.ne.u64 %r356,%r166,%r164; @ %r356 bra $L36; .loc 1 199 16 add.u64 %r137,%r140,%r150; .loc 1 200 16 add.u64 %r138,%r141,%r148; .loc 1 202 14 @ %r374 bra $L1; add.u64 %r32,%frame,368; mov.u64 %r165,8; .loc 1 201 12 mov.u64 %r139,1; bra $L34; $L35: .loc 1 196 20 st.u64 [%r32],%r384; .loc 1 199 30 mul.lo.u64 %r362,%r101,%r100; .loc 1 199 16 sub.u64 %r137,%r140,%r362; .loc 1 200 30 mul.lo.u64 %r363,%r103,%r100; .loc 1 200 16 add.u64 %r364,%r363,%r363; sub.u64 %r138,%r141,%r364; .loc 1 201 12 add.u64 %r139,%r139,1; .loc 1 202 14 add.u64 %r32,%r32,8; add.u64 %r165,%r165,8; setp.eq.u64 %r365,%r139,%r381; @ %r365 bra $L1; $L34: .loc 1 210 23 ld.u64 %r366,[%r32]; add.u64 %r100,%r366,1; st.u64 [%r32],%r100; .loc 1 211 30 add.u64 %r368,%r382,%r165; ld.u64 %r101,[%r368]; .loc 1 211 20 add.u64 %r140,%r137,%r101; .loc 1 212 30 add.u64 %r369,%frame,%r165; ld.u64 %r103,[%r369]; .loc 1 212 20 add.u64 %r370,%r103,%r103; add.u64 %r141,%r138,%r370; .loc 1 192 32 add.u64 %r372,%r383,%r165; ld.u64 %r107,[%r372]; .loc 1 192 13 setp.eq.u64 %r373,%r100,%r107; @ %r373 bra $L35; .loc 1 196 20 mov.u64 %r166,0; bra $L36; $L1: .loc 1 216 any_l4.o/_gfortran_any_l4 .visible .func _gfortran_any_l4fortran/generated/any_l4.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,65,78,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,10810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,65,78,89,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,32,105,110,32,65,78,89,32,105,110,116,114,105,110,115,105,99_gfortran_any_l4 .visible .func _gfortran_any_l448u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .u64 %r232; .reg .u64 %r233; .reg .pred %r234; .reg .pred %r236; .reg .u64u64u64 %r257; .reg .u64 %r258; .reg .predu64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64pred %r320; .reg .u64u64 %r332; .reg .u64 %r333; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .pred %r338; .reg .pred %r339; .reg .pred %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64 %r345; .reg .u64 %r346; .reg .u64 %r347; .reg .u32 %r349; .reg .u16 %r350; .reg .u32 %r351; .reg .pred %r352; .reg .pred %r353; .reg .pred %r354; .reg .u64pred %r363; .reg .u64u64 %r385; mov.u64 %r201,%ar0; mov.u64 %r202,%ar1; mov.u64 %r203,%ar2; .loc 1 56 10 ld.u64 %r22,[%r203]; .loc 1 56 7 add.u64 %r126,%r22,-1; .loc 1 57 10 ld.s8 %r23,[%r202+28]; .loc 1 57 38 add.u32 %r25,%r23,-1; .loc 1 57 8 cvt.s64.s32 %r127,%r25; .loc 1 59 14 ld.u64 %r26,[%r202+16]; .loc 1 61 9 add.u64 %r205,%r126,%r126; add.u64 %r206,%r205,%r126; shl.b64 %r207,%r206,3; add.u64 %r208,%r202,%r207; ld.u64 %r27,[%r208+56]; 
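// _gfortran_any_l4 prologue (descriptive note, hedged): %r27 = [%r208+56] and
// %r29 = [%r208+48] look like the bounds of the dimension selected through %ar2;
// they are combined near .loc 1 61 as max(%r27 + 1 - %r29, 0), the usual
// ubound - lbound + 1 extent of the reduced dimension.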
ld.u64 %r29,[%r208+48]; .loc 1 65 11 ld.u64 %r30,[%r208+40]; .loc 1 67 3 setp.gt.s64 %r222,%r126,0; @ %r222 bra $L2; $L8: .loc 1 75 3 setp.lt.s64 %r223,%r126,%r127; @ %r223 bra $L3; bra $L4; $L2: add.u64 %r151,%r202,40; add.u64 %r52,%frame,120; add.u64 %r109,%frame,240; add.u64 %r225,%r22,%r22; add.u64 %r226,%r225,%r22; shl.b64 %r227,%r226,3; add.u64 %r228,%r202,16; add.u64 %r160,%r227,%r228; .loc 1 73 12 mov.u64 %r385,0; $L7: .loc 1 69 20 ld.u64 %r230,[%r151]; mul.lo.u64 %r229,%r230,%r26; .loc 1 69 18 st.u64 [%r52],%r229; .loc 1 70 19 ld.u64 %r232,[%r151+16]; add.u64 %r231,%r232,1; ld.u64 %r233,[%r151+8]; sub.u64 %r41,%r231,%r233; .loc 1 72 10 setp.lt.s64 %r234,%r41,0; @ %r234 bra $L5; .loc 1 70 17 st.u64 [%r109],%r41; bra $L6; $L5: .loc 1 73 12 st.u64 [%r109],%r385; $L6: .loc 1 67 3 add.u64 %r151,%r151,24; add.u64 %r52,%r52,8; add.u64 %r109,%r109,8; setp.ne.u64 %r236,%r151,%r160; @ %r236 bra $L7; bra $L8; $L4: .loc 1 84 6 ld.u64 %r237,[%r201]; setp.eq.u64 %r238,%r237,0; @ ! %r238 bra $L51; bra $L9; $L3: add.u64 %r240,%r22,%r22; add.u64 %r241,%r240,%r22; shl.b64 %r242,%r241,3; add.u64 %r243,%r242,40; add.u64 %r145,%r202,%r243; shl.b64 %r244,%r22,3; add.u64 %r112,%r244,-8; add.u64 %r375,%frame,120; add.u64 %r117,%r375,%r112; add.u64 %r374,%frame,240; add.u64 %r110,%r374,%r112; add.u64 %r247,%r202,40; cvt.u32.u32 %r249,%r23; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r198,%r247,%r253; .loc 1 81 12 mov.u64 %r384,0; $L13: .loc 1 77 20 ld.u64 %r255,[%r145]; mul.lo.u64 %r254,%r255,%r26; .loc 1 77 18 st.u64 [%r117],%r254; .loc 1 78 19 ld.u64 %r257,[%r145+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r145+8]; sub.u64 %r49,%r256,%r258; .loc 1 80 10 setp.lt.s64 %r259,%r49,0; @ %r259 bra $L11; .loc 1 78 17 st.u64 [%r110],%r49; bra $L12; $L11: .loc 1 81 12 st.u64 [%r110],%r384; $L12: .loc 1 75 3 add.u64 %r145,%r145,24; add.u64 %r117,%r117,8; add.u64 %r110,%r110,8; setp.ne.u64 %r261,%r145,%r198; @ %r261 bra $L13; bra $L4; $L9: .loc 1 88 7 setp.le.s32 %r262,%r25,0; @ %r262 bra $L14; add.u64 %r179,%r201,40; add.u64 %r177,%frame,240; add.u64 %r263,%r201,16; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r171,%r263,%r269; .loc 1 91 17 mov.u64 %r115,1; bra $L15; $L14: .loc 1 99 24 mov.u64 %r270,0; st.u64 [%r201+8],%r270; .loc 1 100 28 cvt.u16.u32 %r273,%r23; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r201+28],%r274; .loc 1 102 20 add.u32 %r275,%r23,-2; cvt.s64.s32 %r58,%r275; add.u64 %r277,%r58,%r58; add.u64 %r278,%r277,%r58; shl.b64 %r279,%r278,3; add.u64 %r280,%r201,%r279; .loc 1 102 67 shl.b64 %r282,%r58,3; add.u64 %r283,%frame,%r282; .loc 1 102 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r134,%r285,%r286; .loc 1 104 10 setp.eq.u64 %r287,%r134,0; @ %r287 bra $L16; bra $L52; $L18: .loc 1 93 55 mul.lo.u64 %r115,%r54,%r115; $L15: .loc 1 95 4 st.u64 [%r179+8],%r237; ld.u64 %r54,[%r177]; add.u64 %r289,%r54,-1; st.u64 [%r179+16],%r289; st.u64 [%r179],%r115; .loc 1 88 7 add.u64 %r179,%r179,24; add.u64 %r177,%r177,8; setp.ne.u64 %r290,%r171,%r179; @ %r290 bra $L18; bra $L14; $L16: .loc 1 107 4 st.u64 [%r201+48],%r134; mov.u64 %r292,-1; st.u64 [%r201+56],%r292; mov.u64 %r293,1; st.u64 [%r201+40],%r293; .loc 1 108 4 bra $L1; $L52: .loc 1 111 24295; call (%value_in),_gfortrani_xmallocarray296,[%value_in]; } .loc 1 111 22 st.u64 [%r201],%r296; $L25: .loc 1 137 3 setp.gt.s32 %r298,%r25,0; @ ! 
%r298 bra $L26; add.u64 %r374,%frame,240; bra $L20; $L26: .loc 1 145 8 ld.u64 %r140,[%r202]; .loc 1 147 21 cvt.u32.u64 %r76,%r26; .loc 1 147 50 add.u32 %r299,%r76,-4; and.b32 %r300,%r299,-5; set.u32.eq.u32 %r302,%r300,0; neg.s32 %r303,%r302; .loc 1 147 21 add.u32 %r304,%r76,-1; set.u32.le.u32 %r306,%r304,1; neg.s32 %r307,%r306; .loc 1 147 6 cvt.u16.u32 %r309,%r303; cvt.u16.u32 %r310,%r307; orbra $L53; $L20: add.u64 %r189,%frame,360; add.u64 %r188,%r201,40; mov.u64 %r186,%frame; .loc 1 123 10 mov.u64 %r142,0; .loc 1 139 16 mov.u64 %r332,%r142; bra $L23; $L51: .loc 1 115 19 ld.s8 %r65,[%r201+28]; .loc 1 115 10 setp.eq.u64 %r314,%r65,%r127; @ %r314 bra $L24; .loc 1 116 2 st.u64 [%stack+8],%r127; st.u64 [%stack],%r65; cvta.const.u64 %r315315_gfortran_runtime_error24: .loc 1 121 11 cvta.global.u64 %r317,_gfortrani_compile_options; .loc 1 121 10 ld.u32 %r318,[%r317+36]; setp.eq.u32 %r319,%r318,0; @ %r319 bra $L25; .loc 1 123 4 setp.le.s64 %r320,%r127,0; @ %r320 bra $L26; add.u64 %r170,%r201,48; add.u64 %r374,%frame,240; mov.u64 %r168,%r374; add.u64 %r162,%r201,56; cvt.u32.u32 %r322,%r23; cvt.s64.s8 %r321,%r322; add.u64 %r149,%r321,-1; .loc 1 123 10 mov.u64 %r114,0; $L28: .loc 1 127 21 ld.u64 %r324,[%r162]; add.u64 %r323,%r324,1; .loc 1 127 19 ld.u64 %r325,[%r170]; sub.u64 %r133,%r323,%r325; .loc 1 128 18 ld.u64 %r71,[%r168]; .loc 1 128 11 setp.eq.u64 %r326,%r71,%r133; @ %r326 bra $L27; .loc 1 129 3 st.u64 [%stack+16],%r71; st.u64 [%stack+8],%r133; cvt.u32.u64 %r329,%r114; add.u32 %r328,%r329,1; st.u32 [%stack],%r328; cvta.const.u64 %r327,$LC1stack; call _gfortran_runtime_error27: .loc 1 123 25 add.u64 %r114,%r114,1; .loc 1 123 4 add.u64 %r170,%r170,24; add.u64 %r168,%r168,8; add.u64 %r162,%r162,24; setp.ne.u64 %r331,%r114,%r149; @ %r331 bra $L28; bra $L20; $L23: .loc 1 139 16 st.u64 [%r189],%r332; .loc 1 140 18 ld.u64 %r333,[%r188]; st.u64 [%r186],%r333; .loc 1 141 17 shl.b64 %r335,%r142,3; add.u64 %r336,%r374,%r335; .loc 1 141 10 ld.u64 %r337,[%r336]; setp.le.s64 %r338,%r337,0; @ %r338 bra $L1; .loc 1 137 26 add.u64 %r142,%r142,1; .loc 1 137 3 add.u64 %r189,%r189,8; add.u64 %r188,%r188,24; add.u64 %r186,%r186,8; setp.gt.s64 %r339,%r127,%r142; @ %r339 bra $L23; bra $L26; $L53: .loc 1 149 7 setp.eq.u32 %r341,%r76,16; @ %r341 bra $L21; .loc 1 157 5 cvta.const.u64 %r343,$LC2; mov.u64 %r342,3343; call _gfortrani_internal_error61 9 add.u64 %r344,%r27,1; .loc 1 61 7 sub.u64 %r129,%r344,%r29; max.s64 %r131,%r129,0; .loc 1 65 11 mul.lo.u64 %r33,%r30,%r26; .loc 1 159 8 ld.u64 %r141,[%r201]; ld.u64 %r166,[%frame+360]; .loc 1 189 22 ld.u64 %r86,[%frame+120]; .loc 1 190 22 ld.u64 %r88,[%frame]; .loc 1 190 12 shl.b64 %r89,%r88,2; .loc 1 192 32 ld.u64 %r164,[%frame+240]; .loc 1 199 30 mul.lo.u64 %r345,%r164,%r86; .loc 1 199 16 neg.s64 %r150,%r345; .loc 1 200 30 mul.lo.u64 %r346,%r164,%r88; .loc 1 200 16 shl.b64 %r347,%r346,2; neg.s64 %r148,%r347; setp.gt.s64 %r372,%r129,0; setp.le.s32 %r373,%r25,1; cvt.u32.u32 %r377,%r23; cvt.s64.s8 %r378,%r377; add.u64 %r379,%r378,-1; add.u64 %r380,%frame,120; add.u64 %r381,%frame,240; .loc 1 196 20 mov.u64 %r382,0; $L36: .loc 1 170 12 @ %r372 bra $L37; .loc 1 171 10 mov.u32 %r349,0; st.u32 [%r141],%r349; bra $L30; $L37: mov.u64 %r136,%r140; .loc 1 174 13 mov.u64 %r135,0; $L29: .loc 1 178 6 ld.s8 %r351,[%r136]; cvt.u16.u32 %r350,%r351; setp.ne.u16 %r352,%r350,0; @ %r352 bra $L38; .loc 1 174 28 add.u64 %r135,%r135,1; .loc 1 174 36 add.u64 %r136,%r136,%r33; .loc 1 174 6 setp.gt.s64 %r353,%r131,%r135; @ %r353 bra $L29; .loc 1 169 10 mov.u32 %r116,0; bra $L31; $L38: .loc 1 180 
14 mov.u32 %r116,1; $L31: .loc 1 184 12 st.u32 [%r141],%r116; $L30: .loc 1 188 15 add.u64 %r166,%r166,1; .loc 1 189 12 add.u64 %r140,%r140,%r86; .loc 1 190 12 add.u64 %r141,%r141,%r89; .loc 1 192 13 setp.ne.u64 %r354,%r166,%r164; @ %r354 bra $L36; .loc 1 199 16 add.u64 %r137,%r140,%r150; .loc 1 200 16 add.u64 %r138,%r141,%r148; .loc 1 202 14 @ %r373 bra $L1; add.u64 %r32,%frame,368; mov.u64 %r165,8; .loc 1 201 12 mov.u64 %r139,1; bra $L34; $L35: .loc 1 196 20 st.u64 [%r32],%r382; .loc 1 199 30 mul.lo.u64 %r360,%r101,%r100; .loc 1 199 16 sub.u64 %r137,%r140,%r360; .loc 1 200 30 mul.lo.u64 %r361,%r103,%r100; .loc 1 200 16 shl.b64 %r362,%r361,2; sub.u64 %r138,%r141,%r362; .loc 1 201 12 add.u64 %r139,%r139,1; .loc 1 202 14 add.u64 %r32,%r32,8; add.u64 %r165,%r165,8; setp.eq.u64 %r363,%r139,%r379; @ %r363 bra $L1; $L34: .loc 1 210 23 ld.u64 %r364,[%r32]; add.u64 %r100,%r364,1; st.u64 [%r32],%r100; .loc 1 211 30 add.u64 %r366,%r380,%r165; ld.u64 %r101,[%r366]; .loc 1 211 20 add.u64 %r140,%r137,%r101; .loc 1 212 30 add.u64 %r367,%frame,%r165; ld.u64 %r103,[%r367]; .loc 1 212 20 shl.b64 %r368,%r103,2; add.u64 %r141,%r138,%r368; .loc 1 192 32 add.u64 %r370,%r381,%r165; ld.u64 %r107,[%r370]; .loc 1 192 13 setp.eq.u64 %r371,%r100,%r107; @ %r371 bra $L35; .loc 1 196 20 mov.u64 %r166,0; bra $L36; $L1: .loc 1 216any_l8.o/_gfortran_any_l8 .visible .func _gfortran_any_l8fortran/generated/any_l8.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,65,78,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,10810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,65,78,89,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,32,105,110,32,65,78,89,32,105,110,116,114,105,110,115,105,99_gfortran_any_l8 .visible .func _gfortran_any_l848u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .u64 %r232; .reg .u64 %r233; .reg .pred %r234; .reg .pred %r236; .reg .u64u64u64 %r257; .reg .u64 %r258; .reg .predu64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64pred %r320; .reg .u64u64 %r332; .reg .u64 %r333; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .pred %r338; .reg .pred %r339; .reg .pred %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64 %r345; .reg .u64 %r346; .reg .u64 %r347; .reg .u16 %r350; .reg .u32 %r351; .reg .pred %r352; .reg .pred %r353; .reg .pred %r354; .reg .u64pred %r363; .reg .u64u64 %r385; mov.u64 %r201,%ar0; mov.u64 %r202,%ar1; mov.u64 %r203,%ar2; .loc 1 56 10 ld.u64 %r22,[%r203]; .loc 1 56 7 add.u64 %r126,%r22,-1; .loc 1 57 10 ld.s8 %r23,[%r202+28]; .loc 1 57 38 add.u32 %r25,%r23,-1; .loc 1 57 8 cvt.s64.s32 %r127,%r25; .loc 1 59 14 ld.u64 %r26,[%r202+16]; .loc 1 61 9 add.u64 %r205,%r126,%r126; add.u64 %r206,%r205,%r126; shl.b64 %r207,%r206,3; add.u64 %r208,%r202,%r207; ld.u64 %r27,[%r208+56]; ld.u64 %r29,[%r208+48]; .loc 1 65 11 ld.u64 
[Tail of the preceding archive member: the remainder of an ANY-family reduction (by its 8-byte result stride, apparently _gfortran_any_l8), heavily garbled in extraction. Its inner loop loads one logical element per iteration (ld.s8), stores 1 into the current result element as soon as a nonzero value is found and 0 otherwise, and the outer loops then step the source and result pointers through the remaining dimensions.]

[Archive member any_l16.o: PTX for .visible .func _gfortran_any_l16, compiled from fortran/generated/any_l16.c. Extern declarations: _gfortrani_compile_options, _gfortrani_xmallocarray, _gfortran_runtime_error, _gfortrani_internal_error. Constant strings: "rank of return array incorrect in ANY intrinsic: is %ld, should be %ld", "Incorrect extent in return value of ANY intrinsic in dimension %d: is %ld, should be %ld", "Funny sized logical array in ANY intrinsic". The register declarations and instruction stream are binary payload and are not recoverable verbatim from this dump.]
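The loop shape noted above is shared by all the ANY members in this archive. As a hedged illustration only (the function and parameter names below are invented for the sketch and are not libgfortran's actual interface), the per-result-element work amounts to:

#include <stddef.h>
#include <stdint.h>

/* Illustrative sketch, not libgfortran source: `src` points at the first
 * logical element of the reduced dimension, `delta` is the byte stride
 * between elements, `len` is the extent.  Returns 1 if any element is
 * nonzero, mirroring the ld.s8 / setp.ne loop in the PTX. */
static int any_along_dim(const char *src, ptrdiff_t delta, ptrdiff_t len)
{
    for (ptrdiff_t n = 0; n < len; n++)
        if (*(const int8_t *)(src + n * delta))
            return 1;
    return 0;
}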
[Remainder of _gfortran_any_l16's instruction stream: the same reduction as above, with the result stored as 16-byte elements (two st.u64 per element, 16-byte result stride).]

[Archive member count_1_l.o: PTX for .visible .func _gfortran_count_1_l, compiled from fortran/generated/count_1_l.c. Same extern declarations as the ANY members. Constant strings: "rank of return array incorrect in COUNT intrinsic: is %ld, should be %ld", "Incorrect extent in return value of COUNT intrinsic in dimension %d: is %ld, should be %ld", "Funny sized logical array in COUNT intrinsic".]
[Remainder of _gfortran_count_1_l's instruction stream: descriptor setup, bounds checks and result allocation as in the ANY members, then the counting loop proper, which tests one logical element per iteration (ld.s8 / setp.eq), increments a counter for each nonzero element, and stores the counter into the 1-byte result element (st.u8).]

[Archive member count_2_l.o: PTX for .visible .func _gfortran_count_2_l, compiled from fortran/generated/count_2_l.c; identical in structure to count_1_l with a 2-byte result kind (st.u16, 2-byte result stride) and the corresponding COUNT error strings.]
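The COUNT members differ from ANY only in the body of that loop: instead of stopping at the first nonzero element, they keep a running count and store it into a result element whose width matches the suffix. As a hedged sketch, again with illustrative names rather than libgfortran's own:

#include <stddef.h>
#include <stdint.h>

/* Illustrative sketch of the COUNT_* inner loop: the same strided walk
 * as ANY, but every nonzero logical element bumps a counter that the
 * caller then stores into the result array at the appropriate width. */
static int64_t count_along_dim(const char *src, ptrdiff_t delta, ptrdiff_t len)
{
    int64_t result = 0;
    for (ptrdiff_t n = 0; n < len; n++)
        if (*(const int8_t *)(src + n * delta))
            result++;
    return result;
}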
[Remainder of _gfortran_count_2_l's instruction stream, followed by archive member count_4_l.o: PTX for .visible .func _gfortran_count_4_l, compiled from fortran/generated/count_4_l.c; same structure with a 4-byte result kind (st.u32, 4-byte result stride) and the corresponding COUNT error strings.]
[Remainder of _gfortran_count_4_l's instruction stream, followed by archive member count_8_l.o: PTX for .visible .func _gfortran_count_8_l, compiled from fortran/generated/count_8_l.c; same structure with an 8-byte result kind (st.u64, 8-byte result stride).]
[Remainder of _gfortran_count_8_l's instruction stream, followed by archive member count_16_l.o: PTX for .visible .func _gfortran_count_16_l, compiled from fortran/generated/count_16_l.c; same structure with a 16-byte result kind. Because the PTX here works in 64-bit registers, the 128-bit count is kept as a low/high pair of u64 values with an explicit carry, as sketched below.]
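The carry handling visible in count_16_l's loop is an add.u64 on the low word, a set.u32.lt.u64 that yields all-ones when the sum wrapped (which sign-extends to -1), and a sub.u64 that folds that value into the high word. A hedged C rendering of just that increment, with invented names:

#include <stdint.h>

/* Illustrative 128-bit increment as a pair of 64-bit words, mirroring
 * the add.u64 / set.u32.lt.u64 / sub.u64 sequence in the PTX. */
static void inc128(uint64_t *lo, uint64_t *hi)
{
    uint64_t sum = *lo + 1;
    int64_t carry = (sum < *lo) ? -1 : 0;  /* set.lt result, sign-extended */
    *lo = sum;
    *hi -= (uint64_t)carry;                /* hi - (uint64_t)-1 wraps to hi + 1 */
}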
[Remainder of _gfortran_count_16_l's instruction stream: the carry-propagating counting loop and the outer dimension-stepping loops, with a 16-byte result stride.]

[Archive member maxloc0_4_i1.o: PTX for .visible .func _gfortran_maxloc0_4_i1, together with the mask variants _gfortran_mmaxloc0_4_i1 and _gfortran_smaxloc0_4_i1, compiled from fortran/generated/maxloc0_4_i1.c. Extern declarations add _gfortrani_bounds_iforeach_return and _gfortrani_bounds_equal_extents to the set used above. Recoverable constant strings include "Rank of array needs to be > 0", "MAXLOC", a "MASK argument" message, and a "Funny sized logical array" message. The visible part of _gfortran_maxloc0_4_i1 rejects a zero-rank argument via _gfortran_runtime_error, allocates the return array through _gfortrani_xmallocarray when its base address is null, and otherwise checks its bounds via _gfortrani_bounds_iforeach_return when compile_options bounds checking is enabled; the member's instruction stream continues below.]
%r43,%frame; add.u64 %r34,%frame,120; .loc 1 70 10 mov.u64 %r96,0; add.u64 %r189,%frame,240; .loc 1 74 16 mov.u64 %r146,%r96; bra $L7; $L22: mov.u64 %r96,%r75; $L7: .loc 1 72 18 ld.u64 %r139,[%r38]; st.u64 [%r43],%r139; .loc 1 73 19 ld.u64 %r141,[%r38+16]; add.u64 %r140,%r141,1; ld.u64 %r142,[%r38+8]; sub.u64 %r31,%r140,%r142; .loc 1 73 17 st.u64 [%r34],%r31; .loc 1 74 16 shl.b64 %r144,%r96,3; add.u64 %r145,%r189,%r144; st.u64 [%r145],%r146; .loc 1 75 10 setp.gt.s64 %r147,%r31,0; @ %r147 bra $L5; shl.b64 %r81,%r70,2; mov.u64 %r83,%r23; .loc 1 78 11 mov.u64 %r76,0; .loc 1 79 24 cvt.u32.u64 %r148,%r76; $L6: st.u32 [%r83],%r148; .loc 1 78 27 add.u64 %r76,%r76,1; .loc 1 78 4 add.u64 %r83,%r83,%r81; setp.ne.u64 %r149,%r69,%r76; @ %r149 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r75,%r96,1; .loc 1 70 3 add.u64 %r38,%r38,24; add.u64 %r43,%r43,8; add.u64 %r34,%r34,8; setp.ne.u64 %r150,%r69,%r75; @ %r150 bra $L22; .loc 1 84 8 ld.u64 %r61,[%r115]; shl.b64 %r78,%r70,2; mov.u64 %r89,%r23; .loc 1 87 10 mov.u64 %r74,0; .loc 1 88 23 mov.u32 %r151,1; $L8: st.u32 [%r89],%r151; mov.u64 %r97,%r74; .loc 1 87 26 add.u64 %r74,%r74,1; .loc 1 87 3 add.u64 %r89,%r89,%r78; setp.ne.u64 %r152,%r96,%r97; @ %r152 bra $L8; .loc 1 101 9 setp.ne.u64 %r153,%r61,0; @ ! %r153 bra $L1; .loc 1 149 19 ld.u64 %r47,[%frame]; .loc 1 151 34 ld.u64 %r50,[%frame+120]; cvt.u32.u32 %r155,%r22; cvt.s64.s8 %r154,%r155; shl.b64 %r156,%r154,3; add.u64 %r100,%r189,%r156; .loc 1 99 12 mov.u32 %r64,-128; setp.eq.u32 %r188,%r116,0; add.u64 %r190,%frame,248; .loc 1 157 13 mov.u64 %r193,0; .loc 1 174 32 add.u64 %r194,%frame,120; $L21: .loc 1 151 21 ld.u64 %r108,[%frame+240]; .loc 1 126 12 @ %r188 bra $L23; mov.u64 %r39,%r108; mov.u64 %r59,%r61; $L14: .loc 1 129 12 ld.s8 %r32,[%r59]; .loc 1 129 11 cvt.u16.u32 %r159,%r32; cvt.u16.u32 %r160,%r64; setp.lt.s16 %r161,%r159,%r160; @ %r161 bra $L12; mov.u64 %r41,%r190; mov.u64 %r62,%r23; mov.u64 %r103,%r39; $L13: .loc 1 133 35 cvt.u32.u64 %r164,%r103; add.u32 %r163,%r164,1; .loc 1 133 24 st.u32 [%r62],%r163; .loc 1 132 10 add.u64 %r62,%r62,%r78; setp.eq.u64 %r165,%r41,%r100; @ %r165 bra $L24; .loc 1 133 31 ld.u64 %r103,[%r41]; add.u64 %r41,%r41,8; bra $L13; $L24: mov.u32 %r64,%r32; $L12: .loc 1 135 12 add.u64 %r59,%r59,%r47; .loc 1 137 17 add.u64 %r39,%r39,1; .loc 1 137 10 st.u64 [%frame+240],%r39; setp.ne.u64 %r166,%r39,%r50; @ %r166 bra $L14; bra $L34; $L23: mov.u64 %r49,%r108; mov.u64 %r60,%r61; $L11: .loc 1 141 11 ld.s8 %r40,[%r60]; .loc 1 141 10 cvt.u16.u32 %r167,%r40; cvt.u16.u32 %r168,%r64; setp.le.s16 %r169,%r167,%r168; @ %r169 bra $L16; mov.u64 %r106,%r190; mov.u64 %r107,%r23; mov.u64 %r42,%r49; $L17: .loc 1 145 35 cvt.u32.u64 %r172,%r42; add.u32 %r171,%r172,1; .loc 1 145 24 st.u32 [%r107],%r171; .loc 1 144 4 add.u64 %r107,%r107,%r78; setp.eq.u64 %r173,%r100,%r106; @ %r173 bra $L25; .loc 1 145 31 ld.u64 %r42,[%r106]; add.u64 %r106,%r106,8; bra $L17; $L25: mov.u32 %r64,%r40; $L16: .loc 1 149 9 add.u64 %r60,%r60,%r47; .loc 1 151 14 add.u64 %r49,%r49,1; .loc 1 151 7 st.u64 [%frame+240],%r49; setp.ne.u64 %r174,%r49,%r50; @ %r174 bra $L11; .loc 1 149 9 sub.u64 %r175,%r50,%r108; mad.lo.u64 %r61,%r175,%r47,%r61; bra $L18; $L34: .loc 1 135 12 sub.u64 %r177,%r50,%r108; mad.lo.u64 %r61,%r177,%r47,%r61; $L18: mov.u64 %r77,%r190; .loc 1 151 34 mov.u64 %r58,%r50; mov.u64 %r56,%r47; .loc 1 152 9 mov.u64 %r63,0; $L20: .loc 1 157 13 st.u64 [%r77+-8],%r193; .loc 1 160 23 mul.lo.u64 %r53,%r56,%r58; mov.u64 %r98,%r63; .loc 1 161 5 add.u64 %r63,%r63,1; .loc 1 162 7 setp.eq.u64 %r181,%r98,%r96; @ %r181 bra $L1; 
.loc 1 170 16 ld.u64 %r182,[%r77]; add.u64 %r55,%r182,1; st.u64 [%r77],%r55; shl.b64 %r67,%r63,3; .loc 1 171 23 add.u64 %r183,%frame,%r67; ld.u64 %r56,[%r183]; .loc 1 171 13 sub.u64 %r184,%r56,%r53; add.u64 %r61,%r61,%r184; .loc 1 174 32 add.u64 %r186,%r194,%r67; ld.u64 %r58,[%r186]; .loc 1 174 7 add.u64 %r77,%r77,8; setp.eq.u64 %r187,%r55,%r58; @ %r187 bra $L20; bra $L21; $L1: .loc 1 1_gfortran_mmaxloc0_4_i1 .visible .func _gfortran_mmaxloc0_4_i197; .reg .u32u64u64pred %r168; .reg .predpredpredpredpred %r239; .reg .pred %r240; .reg .u32predpred %r295; mov.u64 %r164,%ar0; mov.u64 %r165,%ar1; mov.u64 %r166,%ar2; mov.u32 %r167,%ar3; .loc 1 202 6 setp.ne.u64 %r168,%r166,0; @ %r168 bra $L361617; call _gfortran_maxloc0_4_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r165+28]; .loc 1 209 6 setp.gt.s32 %r172,%r22,0; @ %r172 bra $L38; .loc 1 210 5 cvta.const.u64 %r173173_gfortran_runtime_error%r175,[%r164]; setp.ne.u64 %r176,%r175,0; @ %r176 bra $L39; .loc 1 214 7 st.u64 [%r164+48],%r175; add.u32 %r178,%r22,-1; cvt.s64.s32 %r179,%r178; st.u64 [%r164+56],%r179; mov.u64 %r180,1; st.u64 [%r164+40],%r180; .loc 1 215 28 cvt.u32.u64 %r181,%r180; st.u8 [%r164+28],%r181; .loc 1 216 24 st.u64 [%r164+8],%r175; .loc 1 217 29 cvt.u32.u32 %r186,%r22; cvt.s64.s8 %r185,%r18call (%value_in),_gfortrani_xmallocarray187,[%value_in]; } .loc 1 217 27 st.u64 [%r164],%r187; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r189,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r190,[%r189+36]; setp.eq.u32 %r191,%r190,0164r194; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19716r197194; call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r166+16]; .loc 1 233 9 ld.u64 %r93,[%r166]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r199,%r32,-4; and.b32 %r200,%r199,-5; set.u32.eq.u32 %r202,%r200,0; neg.s32 %r203,%r202; .loc 1 235 22 add.u32 %r204,%r32,-1; set.u32.le.u32 %r206,%r204,1; neg.s32 %r207,%r206; .loc 1 235 6 cvt.u16.u32 %r209,%r203; cvt.u16.u32 %r210,%r207; or.b16 %r208,%r209,%r210; cvt.u32.u16 %r211,%r208; cvt.u16.u8 %r212,%r211; setp.ne.u16 %r213,%r212,0; @ %r213 bra $L41; .loc 1 237 7 setp.ne.u32 %r215,%r32,16; @ %r215 bra $L42; $L41: .loc 1 244 11 ld.u64 %r106,[%r164+40]; .loc 1 245 8 ld.u64 %r107,[%r164]; .loc 1 246 17 cvt.u32.u32 %r216,%r22; cvt.s64.s8 %r65,%r216; add.u64 %r52,%r165,40; add.u64 %r285,%frame,120; mov.u64 %r115,%r285; add.u64 %r136,%r166,40; mov.u64 %r163,%frame; add.u64 %r160,%frame,240; add.u64 %r157,%frame,360; mov.u64 %r158,%r157; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r225,%r29; bra $L43; $L42: .loc 1 242 5 cvta.const.u64 %r217,$LC3stack; call _gfortran_runtime_error64: mov.u64 %r29,%r112; $L43: .loc 1 248 18 ld.u64 %r219,[%r52]; st.u64 [%r115],%r219; .loc 1 249 20 ld.u64 %r221,[%r136]; mul.lo.u64 %r220,%r221,%r31; .loc 1 249 18 st.u64 [%r163],%r220; .loc 1 250 19 ld.u64 %r223,[%r52+16]; add.u64 %r222,%r223,1; ld.u64 %r224,[%r52+8]; sub.u64 %r44,%r222,%r224; .loc 1 250 17 st.u64 [%r160],%r44; .loc 1 251 16 st.u64 [%r158],%r225; .loc 1 252 10 setp.gt.s64 %r226,%r44,0; @ %r226 bra $L44; shl.b64 %r147,%r106,2; mov.u64 %r137,%r107; .loc 1 255 11 mov.u64 %r113,0; .loc 1 256 24 cvt.u32.u64 %r227,%r113; $L45: st.u32 [%r137],%r227; .loc 1 255 27 add.u64 %r113,%r113,1; .loc 1 255 4 add.u64 %r137,%r137,%r147; setp.ne.u64 %r228,%r65,%r113; @ %r228 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r112,%r29,1; .loc 1 246 3 add.u64 %r52,%r52,24; add.u64 %r115,%r115,8; add.u64 %r136,%r136,24; add.u64 %r163,%r163,8; add.u64 
%r160,%r160,8; add.u64 %r158,%r158,8; setp.ne.u64 %r229,%r65,%r112; @ %r229 bra $L64; .loc 1 261 8 ld.u64 %r89,[%r165]; shl.b64 %r62,%r106,2; mov.u64 %r91,%r107; .loc 1 264 10 mov.u64 %r111,0; .loc 1 265 23 cvt.u32.u64 %r230,%r111; $L47: st.u32 [%r91],%r230; mov.u64 %r140,%r111; .loc 1 264 26 add.u64 %r111,%r111,1; .loc 1 264 3 add.u64 %r91,%r91,%r62; setp.ne.u64 %r231,%r29,%r140; @ %r231 bra $L47; .loc 1 276 9 setp.eq.u64 %r232,%r89,0; @ %r232 bra $L35; .loc 1 331 19 ld.u64 %r69,[%frame+120]; .loc 1 332 20 ld.u64 %r71,[%frame]; .loc 1 334 34 ld.u64 %r74,[%frame+240]; ld.u64 %r161,[%frame+360]; cvt.u32.u32 %r234,%r22; cvt.s64.s8 %r233,%r234; shl.b64 %r235,%r233,3; add.u64 %r125,%r235,%r157; .loc 1 274 12 mov.u32 %r98,-128; setp.eq.u32 %r287,1,0; add.u64 %r286,%frame,368; .loc 1 340 13 mov.u64 %r292,0; .loc 1 359 32 add.u64 %r293,%frame,240; .loc 1 308 12 setp.eq.u32 %r295,%r167,0; $L63: .loc 1 280 10 @ %r287 bra $L48; mov.u64 %r53,%r161; mov.u32 %r90,0; mov.u32 %r289,1; $L53: mov.u64 %r45,%r89; .loc 1 300 13 add.u64 %r89,%r45,%r69; .loc 1 284 11 ld.s8 %r238,[%r93]; cvt.u16.u32 %r237,%r238; setp.eq.u16 %r239,%r237,0; @ %r239 bra $L49; setp.eq.u32 %r240,%r90,0; selp.u64 %r161,%r161,%r53,%r240; .loc 1 294 16 ld.s8 %r98,[%r45]; mov.u64 %r135,%r286; mov.u64 %r148,%r107; mov.u64 %r159,%r161; $L52: .loc 1 296 33 cvt.u32.u64 %r243,%r159; add.u32 %r242,%r243,1; .loc 1 296 22 st.u32 [%r148],%r242; .loc 1 295 9 add.u64 %r148,%r148,%r62; setp.eq.u64 %r244,%r125,%r135; @ %r244 bra $L65; .loc 1 296 29 ld.u64 %r159,[%r135]; add.u64 %r135,%r135,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r93,%r93,%r71; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r90,%r289; .loc 1 303 4 setp.ne.u64 %r245,%r53,%r74; @ %r245 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r295 bra $L66; mov.u64 %r61,%r161; mov.u64 %r87,%r89; $L58: .loc 1 311 11 ld.s8 %r248,[%r93]; cvt.u16.u32 %r247,%r248; setp.eq.u16 %r249,%r247,0; @ %r249 bra $L56; .loc 1 311 22 ld.s8 %r57,[%r87]; .loc 1 311 19 cvt.u16.u32 %r250,%r57; cvt.u16.u32 %r251,%r98; setp.lt.s16 %r252,%r250,%r251; @ %r252 bra $L56; mov.u64 %r116,%r286; mov.u64 %r117,%r107; mov.u64 %r154,%r61; .loc 1 314 19 mov.u64 %r141,%r292; $L57: .loc 1 315 36 cvt.u32.u64 %r255,%r154; add.u32 %r254,%r255,1; .loc 1 315 25 st.u32 [%r117],%r254; .loc 1 314 35 add.u64 %r109,%r141,1; .loc 1 314 12 add.u64 %r117,%r117,%r62; setp.le.s64 %r256,%r29,%r141; @ %r256 bra $L67; .loc 1 315 32 ld.u64 %r154,[%r116]; add.u64 %r116,%r116,8; mov.u64 %r141,%r109; bra $L57; $L67: mov.u32 %r98,%r57; $L56: .loc 1 317 13 add.u64 %r87,%r87,%r69; .loc 1 319 11 add.u64 %r61,%r61,1; .loc 1 319 4 st.u64 [%frame+360],%r61; setp.ne.u64 %r257,%r61,%r74; @ %r257 bra $L58; .loc 1 317 13 sub.u64 %r258,%r74,%r161; mad.lo.u64 %r89,%r258,%r69,%r89; bra $L59; $L66: mov.u64 %r73,%r161; mov.u64 %r92,%r93; mov.u64 %r88,%r89; $L55: .loc 1 323 11 ld.s8 %r261,[%r92]; cvt.u16.u32 %r260,%r261; setp.eq.u16 %r262,%r260,0; @ %r262 bra $L60; .loc 1 323 22 ld.s8 %r64,[%r88]; .loc 1 323 19 cvt.u16.u32 %r263,%r64; cvt.u16.u32 %r264,%r98; setp.le.s16 %r265,%r263,%r264; @ %r265 bra $L60; mov.u64 %r97,%r286; mov.u64 %r100,%r107; mov.u64 %r149,%r73; .loc 1 326 12 mov.u64 %r142,%r292; $L61: .loc 1 327 36 cvt.u32.u64 %r268,%r149; add.u32 %r267,%r268,1; .loc 1 327 25 st.u32 [%r100],%r267; .loc 1 326 28 add.u64 %r108,%r142,1; .loc 1 326 5 add.u64 %r100,%r100,%r62; setp.le.s64 %r269,%r29,%r142; @ %r269 bra $L68; .loc 1 327 32 ld.u64 %r149,[%r97]; add.u64 %r97,%r97,8; mov.u64 %r142,%r108; bra $L61; $L68: mov.u32 %r98,%r64; $L60: .loc 1 331 9 add.u64 %r88,%r88,%r69; 
.loc 1 332 10 add.u64 %r92,%r92,%r71; .loc 1 334 14 add.u64 %r73,%r73,1; .loc 1 334 7 st.u64 [%frame+360],%r73; setp.ne.u64 %r270,%r73,%r74; @ %r270 bra $L55; sub.u64 %r132,%r74,%r161; .loc 1 331 9 mad.lo.u64 %r89,%r69,%r132,%r89; .loc 1 332 10 mad.lo.u64 %r93,%r71,%r132,%r93; bra $L59; $L89: st.u64 [%frame+360],%r74; $L59: mov.u64 %r123,%r286; mov.u64 %r83,%r71; .loc 1 334 34 mov.u64 %r85,%r74; mov.u64 %r81,%r69; mov.u64 %r122,8; .loc 1 335 9 mov.u64 %r96,0; $L62: .loc 1 340 13 st.u64 [%r123+-8],%r292; .loc 1 343 23 mul.lo.u64 %r76,%r81,%r85; .loc 1 344 24 mul.lo.u64 %r78,%r85,%r83; mov.u64 %r99,%r96; .loc 1 345 5 add.u64 %r96,%r96,1; .loc 1 346 7 setp.eq.u64 %r275,%r29,%r99; @ %r275 bra $L35; .loc 1 354 16 ld.u64 %r276,[%r123]; add.u64 %r80,%r276,1; st.u64 [%r123],%r80; .loc 1 355 23 add.u64 %r278,%r285,%r122; ld.u64 %r81,[%r278]; .loc 1 355 13 sub.u64 %r279,%r81,%r76; add.u64 %r89,%r89,%r279; .loc 1 356 24 add.u64 %r280,%frame,%r122; ld.u64 %r83,[%r280]; .loc 1 356 14 sub.u64 %r281,%r83,%r78; add.u64 %r93,%r93,%r281; .loc 1 359 32 add.u64 %r283,%r293,%r122; ld.u64 %r85,[%r283]; .loc 1 359 7 add.u64 %r123,%r123,8; add.u64 %r122,%r122,8; setp.eq.u64 %r284,%r80,%r85; @ %r284 bra $L62; ld.u64 %r161,[%frame+360]; bra $L63; $L65: mov.u64 %r89,%r45; setp.eq.u32 %r287,1,1; bra $L48; $L35: .loc 1 362_gfortran_smaxloc0_4_i1 .visible .func _gfortran_smaxloc0_4_i14predpred.reg .u64u32 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 379 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 379 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 381 7 {_gfortran_maxloc0_4_.loc 1 382 7 bra $L90; $L92: .loc 1 385 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 390 15 ld.u64 %r24,[%r35]; .loc 1 390 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 392 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 393 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 394 24 st.u64 [%r35+8],%r24; .loc 1 395 29 mov.u64 %r56call (%value_in),_gfortrani_xmallocarray95 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 397 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 397 11 ld.u32 %r62,[%r61+36; .loc 1 399 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 404 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,2; mov.u64 %r29,%r24; .loc 1 405 10 mov.u64 %r32,0; .loc 1 406 23 cvt.u32.u64 %r68,%r32; $L97: st.u32 [%r29],%r68; .loc 1 405 24 add.u64 %r32,%r32,1; .loc 1 405 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L97; $L90: .loc 1 407 maxloc0_8_i1.o/ 1622802227_gfortran_maxloc0_8_i1 .visible .func _gfortran_maxloc0_8_i1fortran/generated/maxloc0_8_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_8_i1 .visible .func _gfortran_mmaxloc0_8_i1_gfortran_smaxloc0_8_i1 .visible .func _gfortran_smaxloc0_8_i1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,65,88,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_8_i1 .visible .func 
_gfortran_maxloc0_8_i1pred %r145; .reg .pred %r146; .reg .u64 %r147; .reg .pred %r148; .reg .predu64 %r152; .reg .u16 %r155; .reg .u16predpredu64 %r188; mov.u64 %r110,%ar0; mov.u64 %r111,%ar1; mov.u32 %r112,%ar2; .loc 1 50 10 ld.s8 %r22,[%r111+28]; .loc 1 50 8 cvt.u32.u32 %r113,%r22; cvt.s64.s8 %r67,%r113; .loc 1 51 6 setp.gt.s64 %r114,%r67,0; @ %r114 bra $L2; .loc 1 52 5 cvta.const.u64 %r115115_gfortran_runtime_errorr23,[%r110]; .loc 1 54 6 setp.ne.u64 %r117,%r23,0; @ %r117 bra $L3; .loc 1 56 7 st.u64 [%r110+48],%r23; add.u64 %r119,%r67,-1; st.u64 [%r110+56],%r119; mov.u64 %r120,1; st.u64 [%r110+40],%r120; .loc 1 57 28 cvt.u32.u64 %r121,%r120; st.u8 [%r110+28],%r121; .loc 1 58 24 st.u64 [%r110+8],%r23; .loc 1 59 29 mov.u64 %r16call (%value_in),_gfortrani_xmallocarray127,[%value_in]; } mov.u64 %r23,%r127; .loc 1 59 27 st.u64 [%r110],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r129,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r130,[%r129+36]; setp.eq.u32 %r131,%r130,0; @ %r131 bra $L4; .loc 1 64110134; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r110]; $L4: .loc 1 68 11 ld.u64 %r68,[%r110+40]; add.u64 %r38,%r111,40; mov.u64 %r102,%frame; add.u64 %r103,%frame,120; .loc 1 70 10 mov.u64 %r94,0; add.u64 %r183,%frame,240; .loc 1 74 16 mov.u64 %r142,%r94; bra $L7; $L22: mov.u64 %r94,%r74; $L7: .loc 1 72 18 ld.u64 %r135,[%r38]; st.u64 [%r102],%r135; .loc 1 73 19 ld.u64 %r137,[%r38+16]; add.u64 %r136,%r137,1; ld.u64 %r138,[%r38+8]; sub.u64 %r31,%r136,%r138; .loc 1 73 17 st.u64 [%r103],%r31; .loc 1 74 16 shl.b64 %r140,%r94,3; add.u64 %r141,%r183,%r140; st.u64 [%r141],%r142; .loc 1 75 10 setp.gt.s64 %r143,%r31,0; @ %r143 bra $L5; shl.b64 %r76,%r68,3; mov.u64 %r82,%r23; .loc 1 78 11 mov.u64 %r75,0; .loc 1 79 24 mov.u64 %r144,%r75; $L6: st.u64 [%r82],%r144; .loc 1 78 27 add.u64 %r75,%r75,1; .loc 1 78 4 add.u64 %r82,%r82,%r76; setp.ne.u64 %r145,%r67,%r75; @ %r145 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r74,%r94,1; .loc 1 70 3 add.u64 %r38,%r38,24; add.u64 %r102,%r102,8; add.u64 %r103,%r103,8; setp.ne.u64 %r146,%r67,%r74; @ %r146 bra $L22; .loc 1 84 8 ld.u64 %r58,[%r111]; shl.b64 %r41,%r68,3; mov.u64 %r37,%r23; .loc 1 87 10 mov.u64 %r73,0; .loc 1 88 23 mov.u64 %r147,1; $L8: st.u64 [%r37],%r147; mov.u64 %r95,%r73; .loc 1 87 26 add.u64 %r73,%r73,1; .loc 1 87 3 add.u64 %r37,%r37,%r41; setp.ne.u64 %r148,%r94,%r95; @ %r148 bra $L8; .loc 1 101 9 setp.ne.u64 %r149,%r58,0; @ ! 
%r149 bra $L1; .loc 1 149 19 ld.u64 %r43,[%frame]; .loc 1 151 34 ld.u64 %r48,[%frame+120]; cvt.u32.u32 %r151,%r22; cvt.s64.s8 %r150,%r151; shl.b64 %r152,%r150,3; add.u64 %r69,%r183,%r152; .loc 1 99 12 mov.u32 %r63,-128; setp.eq.u32 %r182,%r112,0; add.u64 %r184,%frame,248; .loc 1 157 13 mov.u64 %r187,0; .loc 1 174 32 add.u64 %r188,%frame,120; $L21: .loc 1 151 21 ld.u64 %r100,[%frame+240]; .loc 1 126 12 @ %r182 bra $L23; mov.u64 %r39,%r100; mov.u64 %r56,%r58; $L14: .loc 1 129 12 ld.s8 %r32,[%r56]; .loc 1 129 11 cvt.u16.u32 %r155,%r32; cvt.u16.u32 %r156,%r63; setp.lt.s16 %r157,%r155,%r156; @ %r157 bra $L12; mov.u64 %r71,%r184; mov.u64 %r33,%r23; mov.u64 %r85,%r39; $L13: .loc 1 133 35 add.u64 %r159,%r85,1; .loc 1 133 24 st.u64 [%r33],%r159; .loc 1 132 10 add.u64 %r33,%r33,%r41; setp.eq.u64 %r160,%r69,%r71; @ %r160 bra $L24; .loc 1 133 31 ld.u64 %r85,[%r71]; add.u64 %r71,%r71,8; bra $L13; $L24: mov.u32 %r63,%r32; $L12: .loc 1 135 12 add.u64 %r56,%r56,%r43; .loc 1 137 17 add.u64 %r39,%r39,1; .loc 1 137 10 st.u64 [%frame+240],%r39; setp.ne.u64 %r161,%r39,%r48; @ %r161 bra $L14; bra $L34; $L23: mov.u64 %r46,%r100; mov.u64 %r57,%r58; $L11: .loc 1 141 11 ld.s8 %r40,[%r57]; .loc 1 141 10 cvt.u16.u32 %r162,%r40; cvt.u16.u32 %r163,%r63; setp.le.s16 %r164,%r162,%r163; @ %r164 bra $L16; mov.u64 %r80,%r184; mov.u64 %r98,%r23; mov.u64 %r81,%r46; $L17: .loc 1 145 35 add.u64 %r166,%r81,1; .loc 1 145 24 st.u64 [%r98],%r166; .loc 1 144 4 add.u64 %r98,%r98,%r41; setp.eq.u64 %r167,%r69,%r80; @ %r167 bra $L25; .loc 1 145 31 ld.u64 %r81,[%r80]; add.u64 %r80,%r80,8; bra $L17; $L25: mov.u32 %r63,%r40; $L16: .loc 1 149 9 add.u64 %r57,%r57,%r43; .loc 1 151 14 add.u64 %r46,%r46,1; .loc 1 151 7 st.u64 [%frame+240],%r46; setp.ne.u64 %r168,%r46,%r48; @ %r168 bra $L11; .loc 1 149 9 sub.u64 %r169,%r48,%r100; mad.lo.u64 %r58,%r169,%r43,%r58; bra $L18; $L34: .loc 1 135 12 sub.u64 %r171,%r48,%r100; mad.lo.u64 %r58,%r171,%r43,%r58; $L18: mov.u64 %r66,%r184; .loc 1 151 34 mov.u64 %r55,%r48; mov.u64 %r53,%r43; .loc 1 152 9 mov.u64 %r62,0; $L20: .loc 1 157 13 st.u64 [%r66+-8],%r187; .loc 1 160 23 mul.lo.u64 %r50,%r53,%r55; mov.u64 %r96,%r62; .loc 1 161 5 add.u64 %r62,%r62,1; .loc 1 162 7 setp.eq.u64 %r175,%r96,%r94; @ %r175 bra $L1; .loc 1 170 16 ld.u64 %r176,[%r66]; add.u64 %r52,%r176,1; st.u64 [%r66],%r52; shl.b64 %r61,%r62,3; .loc 1 171 23 add.u64 %r177,%frame,%r61; ld.u64 %r53,[%r177]; .loc 1 171 13 sub.u64 %r178,%r53,%r50; add.u64 %r58,%r58,%r178; .loc 1 174 32 add.u64 %r180,%r188,%r61; ld.u64 %r55,[%r180]; .loc 1 174 7 add.u64 %r66,%r66,8; setp.eq.u64 %r181,%r52,%r55; @ %r181 bra $L20; bra $L21; $L1: .loc 1 1_gfortran_mmaxloc0_8_i1 .visible .func _gfortran_mmaxloc0_8_i1u64pred %r185; .reg .u64 %r188; .reg .u64u32 %r196; .reg .u32 %r197; .reg .u32u16pred %r223; .reg .u64 %r224; .reg .predu32 %r232; .reg .pred %r233; .reg .pred %r234; .reg .u64 %r236; .reg .predpred %r248; .reg .pred %r249; .reg .u64 %r250; .reg .u16 %r252; .reg .u32 %r253; .reg .predu64 %r274; .reg .pred %r275; .reg .u64mov.u64 %r158,%ar0; mov.u64 %r159,%ar1; mov.u64 %r160,%ar2; mov.u32 %r161,%ar3; .loc 1 202 6 setp.ne.u64 %r162,%r160,0; @ %r162 bra $L3615r159161; call _gfortran_maxloc0_8_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r159+28]; .loc 1 209 6 setp.gt.s32 %r166,%r22,0; @ %r166 bra $L38; .loc 1 210 5 cvta.const.u64 %r16167_gfortran_runtime_error%r169,[%r1539; .loc 1 214 7 st.u64 [%r158+48],%r169; add.u32 %r172,%r22,-1; cvt.s64.s32 %r173,%r172; st.u64 [%r158+56],%r173; mov.u64 %r174,1; st.u64 [%r158+40],%r174; .loc 1 215 28 cvt.u32.u64 
%r175,%r174; st.u8 [%r158+28],%r175; .loc 1 216 24 st.u64 [%r158+8],%r169; .loc 1 217 29 cvt.u32.u32 %r180,%r22; cvt.s64.s8 %r179,%r180; mov.u64 %r178call (%value_in),_gfortrani_xmallocarray181,[%value_in]; } .loc 1 217 27 st.u64 [%r158],%r181; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r183,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r184,[%r183+36]; setp.eq.u32 %r185,%r184,0r159188; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r191,$LC2; {15919188; call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r160+16]; .loc 1 233 9 ld.u64 %r91,[%r160]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r193,%r32,-4; and.b32 %r194,%r193,-5; set.u32.eq.u32 %r196,%r194,0; neg.s32 %r197,%r196; .loc 1 235 22 add.u32 %r198,%r32,-1; set.u32.le.u32 %r200,%r198,1; neg.s32 %r201,%r200; .loc 1 235 6 cvt.u16.u32 %r203,%r197; cvt.u16.u32 %r204,%r201; or.b16 %r202,%r203,%r204; cvt.u32.u16 %r205,%r202; cvt.u16.u8 %r206,%r205; setp.ne.u16 %r207,%r206,0; @ %r207 bra $L41; .loc 1 237 7 setp.ne.u32 %r209,%r32,16; @ %r209 bra $L42; $L41: .loc 1 244 11 ld.u64 %r103,[%r158+40]; .loc 1 245 8 ld.u64 %r104,[%r158]; .loc 1 246 17 cvt.u32.u32 %r210,%r22; cvt.s64.s8 %r67,%r210; add.u64 %r151,%r159,40; add.u64 %r278,%frame,120; mov.u64 %r147,%r278; add.u64 %r145,%r160,40; mov.u64 %r144,%frame; add.u64 %r140,%frame,240; add.u64 %r61,%frame,360; mov.u64 %r131,%r61; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r219,%r29; bra $L43; $L42: .loc 1 242 5_gfortran_runtime_error64: mov.u64 %r29,%r109; $L43: .loc 1 248 18 ld.u64 %r213,[%r151]; st.u64 [%r147],%r213; .loc 1 249 20 ld.u64 %r215,[%r145]; mul.lo.u64 %r214,%r215,%r31; .loc 1 249 18 st.u64 [%r144],%r214; .loc 1 250 19 ld.u64 %r217,[%r151+16]; add.u64 %r216,%r217,1; ld.u64 %r218,[%r151+8]; sub.u64 %r44,%r216,%r218; .loc 1 250 17 st.u64 [%r140],%r44; .loc 1 251 16 st.u64 [%r131],%r219; .loc 1 252 10 setp.gt.s64 %r220,%r44,0; @ %r220 bra $L44; shl.b64 %r134,%r103,3; mov.u64 %r125,%r104; .loc 1 255 11 mov.u64 %r110,0; .loc 1 256 24 mov.u64 %r221,%r110; $L45: st.u64 [%r125],%r221; .loc 1 255 27 add.u64 %r110,%r110,1; .loc 1 255 4 add.u64 %r125,%r125,%r134; setp.ne.u64 %r222,%r67,%r110; @ %r222 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r109,%r29,1; .loc 1 246 3 add.u64 %r151,%r151,24; add.u64 %r147,%r147,8; add.u64 %r145,%r145,24; add.u64 %r144,%r144,8; add.u64 %r140,%r140,8; add.u64 %r131,%r131,8; setp.ne.u64 %r223,%r67,%r109; @ %r223 bra $L64; .loc 1 261 8 ld.u64 %r88,[%r159]; shl.b64 %r156,%r103,3; mov.u64 %r154,%r104; .loc 1 264 10 mov.u64 %r108,0; .loc 1 265 23 mov.u64 %r224,%r108; $L47: st.u64 [%r154],%r224; mov.u64 %r137,%r108; .loc 1 264 26 add.u64 %r108,%r108,1; .loc 1 264 3 add.u64 %r154,%r154,%r156; setp.ne.u64 %r225,%r29,%r137; @ %r225 bra $L47; .loc 1 276 9 setp.eq.u64 %r226,%r88,0; @ %r226 bra $L35; .loc 1 331 19 ld.u64 %r69,[%frame+120]; .loc 1 332 20 ld.u64 %r71,[%frame]; .loc 1 334 34 ld.u64 %r74,[%frame+240]; ld.u64 %r153,[%frame+360]; cvt.u32.u32 %r228,%r22; cvt.s64.s8 %r227,%r228; shl.b64 %r229,%r227,3; add.u64 %r115,%r229,%r61; .loc 1 274 12 mov.u32 %r95,-128; setp.eq.u32 %r277,1,0; add.u64 %r276,%frame,368; .loc 1 340 13 mov.u64 %r283,0; .loc 1 359 32 add.u64 %r284,%frame,240; .loc 1 308 12 setp.eq.u32 %r286,%r161,0; $L63: .loc 1 280 10 @ %r277 bra $L48; mov.u64 %r53,%r153; mov.u32 %r89,0; mov.u32 %r280,1; $L53: mov.u64 %r45,%r88; .loc 1 300 13 add.u64 %r88,%r45,%r69; .loc 1 284 11 ld.s8 %r232,[%r910; @ %r233 bra $L49; setp.eq.u32 %r234,%r89,0; selp.u64 %r153,%r153,%r53,%r234; .loc 1 294 
16 ld.s8 %r95,[%r45]; mov.u64 %r124,%r276; mov.u64 %r136,%r104; mov.u64 %r149,%r153; $L52: .loc 1 296 33 add.u64 %r236,%r149,1; .loc 1 296 22 st.u64 [%r136],%r236; .loc 1 295 9 add.u64 %r136,%r136,%r156; setp.eq.u64 %r237,%r115,%r124; @ %r237 bra $L65; .loc 1 296 29 ld.u64 %r149,[%r124]; add.u64 %r124,%r124,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r91,%r91,%r71; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r89,%r280; .loc 1 303 4 setp.ne.u64 %r238,%r53,%r74; @ %r238 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r286 bra $L66; mov.u64 %r63,%r153; mov.u64 %r86,%r88; $L58: .loc 1 311 11 ld.s8 %r241,[%r91]; cvt.u16.u32 %r240,%r241; setp.eq.u16 %r242,%r240,0; @ %r242 bra $L56; .loc 1 311 22 ld.s8 %r59,[%r86]; .loc 1 311 19 cvt.u16.u32 %r243,%r59; cvt.u16.u32 %r244,%r95; setp.lt.s16 %r245,%r243,%r244; @ %r245 bra $L56; mov.u64 %r92,%r276; mov.u64 %r94,%r104; mov.u64 %r146,%r63; .loc 1 314 19 mov.u64 %r138,%r283; $L57: .loc 1 315 36 add.u64 %r247,%r146,1; .loc 1 315 25 st.u64 [%r94],%r247; .loc 1 314 35 add.u64 %r106,%r138,1; .loc 1 314 12 add.u64 %r94,%r94,%r156; setp.le.s64 %r248,%r29,%r138; @ %r248 bra $L67; .loc 1 315 32 ld.u64 %r146,[%r92]; add.u64 %r92,%r92,8; mov.u64 %r138,%r106; bra $L57; $L67: mov.u32 %r95,%r59; $L56: .loc 1 317 13 add.u64 %r86,%r86,%r69; .loc 1 319 11 add.u64 %r63,%r63,1; .loc 1 319 4 st.u64 [%frame+360],%r63; setp.ne.u64 %r249,%r63,%r74; @ %r249 bra $L58; .loc 1 317 13 sub.u64 %r250,%r74,%r153; mad.lo.u64 %r88,%r250,%r69,%r88; bra $L59; $L66: mov.u64 %r73,%r153; mov.u64 %r90,%r91; mov.u64 %r87,%r88; $L55: .loc 1 323 11 ld.s8 %r253,[%r90]; cvt.u16.u32 %r252,%r253; setp.eq.u16 %r254,%r252,0; @ %r254 bra $L60; .loc 1 323 22 ld.s8 %r66,[%r87]; .loc 1 323 19 cvt.u16.u32 %r255,%r66; cvt.u16.u32 %r256,%r95; setp.le.s16 %r257,%r255,%r256; @ %r257 bra $L60; mov.u64 %r111,%r276; mov.u64 %r56,%r104; mov.u64 %r142,%r73; .loc 1 326 12 mov.u64 %r139,%r283; $L61: .loc 1 327 36 add.u64 %r259,%r142,1; .loc 1 327 25 st.u64 [%r56],%r259; .loc 1 326 28 add.u64 %r105,%r139,1; .loc 1 326 5 add.u64 %r56,%r56,%r156; setp.le.s64 %r260,%r29,%r139; @ %r260 bra $L68; .loc 1 327 32 ld.u64 %r142,[%r111]; add.u64 %r111,%r111,8; mov.u64 %r139,%r105; bra $L61; $L68: mov.u32 %r95,%r66; $L60: .loc 1 331 9 add.u64 %r87,%r87,%r69; .loc 1 332 10 add.u64 %r90,%r90,%r71; .loc 1 334 14 add.u64 %r73,%r73,1; .loc 1 334 7 st.u64 [%frame+360],%r73; setp.ne.u64 %r261,%r73,%r74; @ %r261 bra $L55; sub.u64 %r121,%r74,%r153; .loc 1 331 9 mad.lo.u64 %r88,%r69,%r121,%r88; .loc 1 332 10 mad.lo.u64 %r91,%r71,%r121,%r91; bra $L59; $L89: st.u64 [%frame+360],%r74; $L59: mov.u64 %r113,%r276; mov.u64 %r83,%r71; .loc 1 334 34 mov.u64 %r85,%r74; mov.u64 %r81,%r69; mov.u64 %r112,8; .loc 1 335 9 mov.u64 %r93,0; $L62: .loc 1 340 13 st.u64 [%r113+-8],%r283; .loc 1 343 23 mul.lo.u64 %r76,%r81,%r85; .loc 1 344 24 mul.lo.u64 %r78,%r85,%r83; mov.u64 %r54,%r93; .loc 1 345 5 add.u64 %r93,%r93,1; .loc 1 346 7 setp.eq.u64 %r266,%r29,%r54; @ %r266 bra $L35; .loc 1 354 16 ld.u64 %r267,[%r113]; add.u64 %r80,%r267,1; st.u64 [%r113],%r80; .loc 1 355 23 add.u64 %r269,%r278,%r112; ld.u64 %r81,[%r269]; .loc 1 355 13 sub.u64 %r270,%r81,%r76; add.u64 %r88,%r88,%r270; .loc 1 356 24 add.u64 %r271,%frame,%r112; ld.u64 %r83,[%r271]; .loc 1 356 14 sub.u64 %r272,%r83,%r78; add.u64 %r91,%r91,%r272; .loc 1 359 32 add.u64 %r274,%r284,%r112; ld.u64 %r85,[%r274]; .loc 1 359 7 add.u64 %r113,%r113,8; add.u64 %r112,%r112,8; setp.eq.u64 %r275,%r80,%r85; @ %r275 bra $L62; ld.u64 %r153,[%frame+360]; bra $L63; $L65: mov.u64 %r88,%r45; setp.eq.u32 %r277,1,1; bra $L48; 
$L35: .loc 1 362_gfortran_smaxloc0_8_i1 .visible .func _gfortran_smaxloc0_8_i14predpred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 379 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 379 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 381 7 {_gfortran_maxloc0_8_.loc 1 382 7 bra $L90; $L92: .loc 1 385 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 390 15 ld.u64 %r24,[%r35]; .loc 1 390 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 392 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 393 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 394 24 st.u64 [%r35+8],%r24; .loc 1 395 29 mov.u64 call (%value_in),_gfortrani_xmallocarray95 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 397 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 397 11 ld.u32 %r62,[%r61+36; .loc 1 399 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 404 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,3; mov.u64 %r29,%r24; .loc 1 405 10 mov.u64 %r32,0; .loc 1 406 23 mov.u64 %r68,%r32; $L97: st.u64 [%r29],%r68; .loc 1 405 24 add.u64 %r32,%r32,1; .loc 1 405 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L97; $L90: .loc 1 407maxloc0_16_i1.o/1622802227_gfortran_maxloc0_16_i1 .visible .func _gfortran_maxloc0_16_i1fortran/generated/maxloc0_16_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_16_i1 .visible .func _gfortran_mmaxloc0_16_i1_gfortran_smaxloc0_16_i1 .visible .func _gfortran_smaxloc0_16_i1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,65,88,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_16_i1 .visible .func _gfortran_maxloc0_16_i132pred %r166; .reg .u16 %r167; .reg .u16 %r168; .reg .predmov.u64 %r112,%ar0; mov.u64 %r113,%ar1; mov.u32 %r114,%ar2; .loc 1 50 10 ld.s8 %r22,[%r113+28]; .loc 1 50 8 cvt.u32.u32 %r115,%r22; cvt.s64.s8 %r69,%r115; .loc 1 51 6 setp.gt.s64 %r116,%r69,0; @ %r116 bra $L2; .loc 1 52 5 cvta.const.u64 %r11117_gfortran_runtime_errorr23,[%r112]; .loc 1 54 6 setp.ne.u64 %r119,%r23,0; @ %r119 bra $L3; .loc 1 56 7 st.u64 [%r112+48],%r23; add.u64 %r121,%r69,-1; st.u64 [%r112+56],%r121; mov.u64 %r122,1; st.u64 [%r112+40],%r122; .loc 1 57 28 cvt.u32.u64 %r123,%r122; st.u8 [%r112+28],%r123; .loc 1 58 24 st.u64 [%r112+8],%r23; .loc 1 59 29 mov.u64 %r126,16126; call (%value_in),_gfortrani_xmallocarray129,[%value_in]; } mov.u64 %r23,%r129; .loc 1 59 27 st.u64 [%r112],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r131,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r132,[%r131+36]; setp.eq.u32 %r133,%r132,0; @ %r133 bra $L4; .loc 1 64r113136; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r112]; $L4: .loc 1 68 11 ld.u64 %r70,[%r112+40]; add.u64 %r44,%r113,40; mov.u64 %r39,%frame; add.u64 %r105,%frame,120; .loc 1 70 10 mov.u64 %r97,0; add.u64 %r189,%frame,240; .loc 1 74 16 mov.u64 %r144,%r97; bra $L7; $L22: mov.u64 %r97,%r76; $L7: .loc 1 72 18 ld.u64 %r137,[%r44]; 
st.u64 [%r39],%r137; .loc 1 73 19 ld.u64 %r139,[%r44+16]; add.u64 %r138,%r139,1; ld.u64 %r140,[%r44+8]; sub.u64 %r32,%r138,%r140; .loc 1 73 17 st.u64 [%r105],%r32; .loc 1 74 16 shl.b64 %r142,%r97,3; add.u64 %r143,%r189,%r142; st.u64 [%r143],%r144; .loc 1 75 10 setp.gt.s64 %r145,%r32,0; @ %r145 bra $L5; shl.b64 %r83,%r70,4; mov.u64 %r85,%r23; .loc 1 78 11 mov.u64 %r77,0; .loc 1 79 24 mov.u64 %r146,%r77; $L6: st.u64 [%r85],%r146; st.u64 [%r85+8],%r146; .loc 1 78 27 add.u64 %r77,%r77,1; .loc 1 78 4 add.u64 %r85,%r85,%r83; setp.ne.u64 %r148,%r69,%r77; @ %r148 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r76,%r97,1; .loc 1 70 3 add.u64 %r44,%r44,24; add.u64 %r39,%r39,8; add.u64 %r105,%r105,8; setp.ne.u64 %r149,%r69,%r76; @ %r149 bra $L22; .loc 1 84 8 ld.u64 %r60,[%r113]; shl.b64 %r37,%r70,4; mov.u64 %r82,%r23; .loc 1 87 10 mov.u64 %r75,0; .loc 1 88 23 mov.u64 %r150,1; mov.u64 %r151,%r75; $L8: st.u64 [%r82],%r150; st.u64 [%r82+8],%r151; mov.u64 %r98,%r75; .loc 1 87 26 add.u64 %r75,%r75,1; .loc 1 87 3 add.u64 %r82,%r82,%r37; setp.ne.u64 %r152,%r97,%r98; @ %r152 bra $L8; .loc 1 101 9 setp.ne.u64 %r153,%r60,0; @ ! %r153 bra $L1; .loc 1 149 19 ld.u64 %r47,[%frame]; .loc 1 151 34 ld.u64 %r50,[%frame+120]; cvt.u32.u32 %r155,%r22; cvt.s64.s8 %r154,%r155; shl.b64 %r156,%r154,3; add.u64 %r86,%r189,%r156; .loc 1 99 12 mov.u32 %r64,-128; setp.eq.u32 %r188,%r114,0; add.u64 %r190,%frame,248; .loc 1 157 13 mov.u64 %r193,0; .loc 1 174 32 add.u64 %r194,%frame,120; $L21: .loc 1 151 21 ld.u64 %r104,[%frame+240]; .loc 1 126 12 @ %r188 bra $L23; mov.u64 %r40,%r104; mov.u64 %r58,%r60; $L14: .loc 1 129 12 ld.s8 %r33,[%r58]; .loc 1 129 11 cvt.u16.u32 %r159,%r33; cvt.u16.u32 %r160,%r64; setp.lt.s16 %r161,%r159,%r160; @ %r161 bra $L12; mov.u64 %r27,%r190; mov.u64 %r61,%r23; mov.u64 %r100,%r40; $L13: .loc 1 133 35 add.u64 %r163,%r100,1; st.u64 [%r61],%r163; shr.s64 %r164,%r163,63; st.u64 [%r61+8],%r164; .loc 1 132 10 add.u64 %r61,%r61,%r37; setp.eq.u64 %r165,%r27,%r86; @ %r165 bra $L24; .loc 1 133 31 ld.u64 %r100,[%r27]; add.u64 %r27,%r27,8; bra $L13; $L24: mov.u32 %r64,%r33; $L12: .loc 1 135 12 add.u64 %r58,%r58,%r47; .loc 1 137 17 add.u64 %r40,%r40,1; .loc 1 137 10 st.u64 [%frame+240],%r40; setp.ne.u64 %r166,%r40,%r50; @ %r166 bra $L14; bra $L34; $L23: mov.u64 %r49,%r104; mov.u64 %r59,%r60; $L11: .loc 1 141 11 ld.s8 %r42,[%r59]; .loc 1 141 10 cvt.u16.u32 %r167,%r42; cvt.u16.u32 %r168,%r64; setp.le.s16 %r169,%r167,%r168; @ %r169 bra $L16; mov.u64 %r101,%r190; mov.u64 %r102,%r23; mov.u64 %r81,%r49; $L17: .loc 1 145 35 add.u64 %r171,%r81,1; st.u64 [%r102],%r171; shr.s64 %r172,%r171,63; st.u64 [%r102+8],%r172; .loc 1 144 4 add.u64 %r102,%r102,%r37; setp.eq.u64 %r173,%r86,%r101; @ %r173 bra $L25; .loc 1 145 31 ld.u64 %r81,[%r101]; add.u64 %r101,%r101,8; bra $L17; $L25: mov.u32 %r64,%r42; $L16: .loc 1 149 9 add.u64 %r59,%r59,%r47; .loc 1 151 14 add.u64 %r49,%r49,1; .loc 1 151 7 st.u64 [%frame+240],%r49; setp.ne.u64 %r174,%r49,%r50; @ %r174 bra $L11; .loc 1 149 9 sub.u64 %r175,%r50,%r104; mad.lo.u64 %r60,%r175,%r47,%r60; bra $L18; $L34: .loc 1 135 12 sub.u64 %r177,%r50,%r104; mad.lo.u64 %r60,%r177,%r47,%r60; $L18: mov.u64 %r74,%r190; .loc 1 151 34 mov.u64 %r57,%r50; mov.u64 %r55,%r47; .loc 1 152 9 mov.u64 %r63,0; $L20: .loc 1 157 13 st.u64 [%r74+-8],%r193; .loc 1 160 23 mul.lo.u64 %r52,%r55,%r57; mov.u64 %r99,%r63; .loc 1 161 5 add.u64 %r63,%r63,1; .loc 1 162 7 setp.eq.u64 %r181,%r99,%r97; @ %r181 bra $L1; .loc 1 170 16 ld.u64 %r182,[%r74]; add.u64 %r54,%r182,1; st.u64 [%r74],%r54; shl.b64 %r66,%r63,3; .loc 1 171 23 add.u64 
%r183,%frame,%r66; ld.u64 %r55,[%r183]; .loc 1 171 13 sub.u64 %r184,%r55,%r52; add.u64 %r60,%r60,%r184; .loc 1 174 32 add.u64 %r186,%r194,%r66; ld.u64 %r57,[%r186]; .loc 1 174 7 add.u64 %r74,%r74,8; setp.eq.u64 %r187,%r54,%r57; @ %r187 bra $L20; bra $L21; $L1: .loc 1 1_gfortran_mmaxloc0_16_i1 .visible .func _gfortran_mmaxloc0_16_i16u64 %r170; .reg .u64 %r172; .reg .predpred %r188; .reg .u64 %r191; .reg .u64 %r194; .reg .u32 %r196; .reg .u32 %r197; .reg .u32u16 %r206; .reg .u16u32 %r213; .reg .u64pred %r223; .reg .u64pred %r239; .reg .u64 %r241; .reg .u64 %r242; .reg .pred %r243; .reg .predpred %r269; .reg .pred %r274; .reg .u64u64 %r280; .reg .u64 %r282; .reg .predu32 %r288; .reg .u64 %r291; .reg .u64 %r292; .reg .pred %r294; mov.u64 %r161,%ar0; mov.u64 %r162,%ar1; mov.u64 %r163,%ar2; mov.u32 %r164,%ar3; .loc 1 202 6 setp.ne.u64 %r165,%r163,0; @ %r165 bra $L36gfortran_maxloc0_16_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r162+28]; .loc 1 209 6 setp.gt.s32 %r169,%r22,0; @ %r169 bra $L38; .loc 1 210 5 cvta.const.u64 %r170170_gfortran_runtime_error%r172,[%r161]; setp.ne.u64 %r173,%r172,0; @ %r173 bra $L39; .loc 1 214 7 st.u64 [%r161+48],%r172; add.u32 %r175,%r22,-1; cvt.s64.s32 %r176,%r175; st.u64 [%r161+56],%r176; mov.u64 %r177,1; st.u64 [%r161+40],%r177; .loc 1 215 28 cvt.u32.u64 %r178,%r177; st.u8 [%r161+28],%r178; .loc 1 216 24 st.u64 [%r161+8],%r172; .loc 1 217 29 cvt.u32.u32 %r183,%r22; cvt.s64.s8 %r182,%r183; mov.u64 %r181,16call (%value_in),_gfortrani_xmallocarray184,[%value_in]; } .loc 1 217 27 st.u64 [%r161],%r184; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r186,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r187,[%r186+36]; setp.eq.u32 %r188,%r187,0161_gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r194163194191; call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r163+16]; .loc 1 233 9 ld.u64 %r89,[%r163]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r196,%r32,-4; and.b32 %r197,%r196,-5; set.u32.eq.u32 %r199,%r197,0; neg.s32 %r200,%r199; .loc 1 235 22 add.u32 %r201,%r32,-1; set.u32.le.u32 %r203,%r201,1; neg.s32 %r204,%r203; .loc 1 235 6 cvt.u16.u32 %r206,%r200; cvt.u16.u32 %r207,%r204; or.b16 %r205,%r206,%r207; cvt.u32.u16 %r208,%r205; cvt.u16.u8 %r209,%r20837 7 setp.ne.u32 %r212,%r32,16; @ %r212 bra $L42; $L41: .loc 1 244 11 ld.u64 %r101,[%r161+40]; .loc 1 245 8 ld.u64 %r102,[%r161]; .loc 1 246 17 cvt.u32.u32 %r213,%r22; cvt.s64.s8 %r64,%r213; add.u64 %r159,%r162,40; add.u64 %r285,%frame,120; mov.u64 %r157,%r285; add.u64 %r156,%r163,40; mov.u64 %r152,%frame; add.u64 %r153,%frame,240; add.u64 %r149,%frame,360; mov.u64 %r148,%r149; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r222,%r29; bra $L43; $L42: .loc 1 242 5_gfortran_runtime_error64: mov.u64 %r29,%r107; $L43: .loc 1 248 18 ld.u64 %r216,[%r159]; st.u64 [%r157],%r216; .loc 1 249 20 ld.u64 %r218,[%r156]; mul.lo.u64 %r217,%r218,%r31; .loc 1 249 18 st.u64 [%r152],%r217; .loc 1 250 19 ld.u64 %r220,[%r159+16]; add.u64 %r219,%r220,1; ld.u64 %r221,[%r159+8]; sub.u64 %r44,%r219,%r221; .loc 1 250 17 st.u64 [%r153],%r44; .loc 1 251 16 st.u64 [%r148],%r222; .loc 1 252 10 setp.gt.s64 %r223,%r44,0; @ %r223 bra $L44; shl.b64 %r142,%r101,4; mov.u64 %r132,%r102; .loc 1 255 11 mov.u64 %r108,0; .loc 1 256 24 mov.u64 %r224,%r108; $L45: st.u64 [%r132],%r224; st.u64 [%r132+8],%r224; .loc 1 255 27 add.u64 %r108,%r108,1; .loc 1 255 4 add.u64 %r132,%r132,%r142; setp.ne.u64 %r226,%r64,%r108; @ %r226 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r107,%r29,1; .loc 1 246 3 
add.u64 %r159,%r159,24; add.u64 %r157,%r157,8; add.u64 %r156,%r156,24; add.u64 %r152,%r152,8; add.u64 %r153,%r153,8; add.u64 %r148,%r148,8; setp.ne.u64 %r227,%r64,%r107; @ %r227 bra $L64; .loc 1 261 8 ld.u64 %r86,[%r162]; shl.b64 %r110,%r101,4; mov.u64 %r51,%r102; .loc 1 264 10 mov.u64 %r106,0; .loc 1 265 23 mov.u64 %r228,%r106; $L47: st.u64 [%r51],%r228; st.u64 [%r51+8],%r228; mov.u64 %r138,%r106; .loc 1 264 26 add.u64 %r106,%r106,1; .loc 1 264 3 add.u64 %r51,%r51,%r110; setp.ne.u64 %r230,%r29,%r138; @ %r230 bra $L47; .loc 1 276 9 setp.eq.u64 %r231,%r86,0; @ %r231 bra $L35; .loc 1 331 19 ld.u64 %r67,[%frame+120]; .loc 1 332 20 ld.u64 %r69,[%frame]; .loc 1 334 34 ld.u64 %r72,[%frame+240]; ld.u64 %r158,[%frame+360]; cvt.u32.u32 %r233,%r22; cvt.s64.s8 %r232,%r233; shl.b64 %r234,%r232,3; add.u64 %r120,%r234,%r149; .loc 1 274 12 mov.u32 %r95,-128; setp.eq.u32 %r284,1,0; add.u64 %r286,%frame,368; .loc 1 340 13 mov.u64 %r291,0; .loc 1 359 32 add.u64 %r292,%frame,240; .loc 1 308 12 setp.eq.u32 %r294,%r164,0; $L63: .loc 1 280 10 @ %r284 bra $L48; mov.u64 %r52,%r158; mov.u32 %r87,0; mov.u32 %r288,1; $L53: mov.u64 %r45,%r86; .loc 1 300 13 add.u64 %r86,%r45,%r67; .loc 1 284 11 ld.s8 %r237,[%r89]; cvt.u16.u32 %r236,%r237; setp.eq.u16 %r238,%r236,0; @ %r238 bra $L49; setp.eq.u32 %r239,%r87,0; selp.u64 %r158,%r158,%r52,%r239; .loc 1 294 16 ld.s8 %r95,[%r45]; mov.u64 %r129,%r286; mov.u64 %r143,%r102; mov.u64 %r155,%r158; $L52: .loc 1 296 33 add.u64 %r241,%r155,1; st.u64 [%r143],%r241; shr.s64 %r242,%r241,63; st.u64 [%r143+8],%r242; .loc 1 295 9 add.u64 %r143,%r143,%r110; setp.eq.u64 %r243,%r120,%r129; @ %r243 bra $L65; .loc 1 296 29 ld.u64 %r155,[%r129]; add.u64 %r129,%r129,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r89,%r89,%r69; .loc 1 303 11 add.u64 %r52,%r52,1; mov.u32 %r87,%r288; .loc 1 303 4 setp.ne.u64 %r244,%r52,%r72; @ %r244 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r294 bra $L66; mov.u64 %r60,%r158; mov.u64 %r84,%r86; $L58: .loc 1 311 11 ld.s8 %r247,[%r89]; cvt.u16.u32 %r246,%r247; setp.eq.u16 %r248,%r246,0; @ %r248 bra $L56; .loc 1 311 22 ld.s8 %r56,[%r84]; .loc 1 311 19 cvt.u16.u32 %r249,%r56; cvt.u16.u32 %r250,%r95; setp.lt.s16 %r251,%r249,%r250; @ %r251 bra $L56; mov.u64 %r96,%r286; mov.u64 %r105,%r102; mov.u64 %r151,%r60; .loc 1 314 19 mov.u64 %r139,%r291; $L57: .loc 1 315 36 add.u64 %r253,%r151,1; st.u64 [%r105],%r253; shr.s64 %r254,%r253,63; st.u64 [%r105+8],%r254; .loc 1 314 35 add.u64 %r104,%r139,1; .loc 1 314 12 add.u64 %r105,%r105,%r110; setp.le.s64 %r255,%r29,%r139; @ %r255 bra $L67; .loc 1 315 32 ld.u64 %r151,[%r96]; add.u64 %r96,%r96,8; mov.u64 %r139,%r104; bra $L57; $L67: mov.u32 %r95,%r56; $L56: .loc 1 317 13 add.u64 %r84,%r84,%r67; .loc 1 319 11 add.u64 %r60,%r60,1; .loc 1 319 4 st.u64 [%frame+360],%r60; setp.ne.u64 %r256,%r60,%r72; @ %r256 bra $L58; .loc 1 317 13 sub.u64 %r257,%r72,%r158; mad.lo.u64 %r86,%r257,%r67,%r86; bra $L59; $L66: mov.u64 %r71,%r158; mov.u64 %r88,%r89; mov.u64 %r85,%r86; $L55: .loc 1 323 11 ld.s8 %r260,[%r88]; cvt.u16.u32 %r259,%r260; setp.eq.u16 %r261,%r259,0; @ %r261 bra $L60; .loc 1 323 22 ld.s8 %r63,[%r85]; .loc 1 323 19 cvt.u16.u32 %r262,%r63; cvt.u16.u32 %r263,%r95; setp.le.s16 %r264,%r262,%r263; @ %r264 bra $L60; mov.u64 %r61,%r286; mov.u64 %r91,%r102; mov.u64 %r147,%r71; .loc 1 326 12 mov.u64 %r140,%r291; $L61: .loc 1 327 36 add.u64 %r266,%r147,1; st.u64 [%r91],%r266; shr.s64 %r267,%r266,63; st.u64 [%r91+8],%r267; .loc 1 326 28 add.u64 %r103,%r140,1; .loc 1 326 5 add.u64 %r91,%r91,%r110; setp.le.s64 %r268,%r29,%r140; @ %r268 bra $L68; .loc 1 327 
32 ld.u64 %r147,[%r61]; add.u64 %r61,%r61,8; mov.u64 %r140,%r103; bra $L61; $L68: mov.u32 %r95,%r63; $L60: .loc 1 331 9 add.u64 %r85,%r85,%r67; .loc 1 332 10 add.u64 %r88,%r88,%r69; .loc 1 334 14 add.u64 %r71,%r71,1; .loc 1 334 7 st.u64 [%frame+360],%r71; setp.ne.u64 %r269,%r71,%r72; @ %r269 bra $L55; sub.u64 %r126,%r72,%r158; .loc 1 331 9 mad.lo.u64 %r86,%r67,%r126,%r86; .loc 1 332 10 mad.lo.u64 %r89,%r69,%r126,%r89; bra $L59; $L89: st.u64 [%frame+360],%r72; $L59: mov.u64 %r118,%r286; mov.u64 %r81,%r69; .loc 1 334 34 mov.u64 %r83,%r72; mov.u64 %r79,%r67; mov.u64 %r117,8; .loc 1 335 9 mov.u64 %r93,0; $L62: .loc 1 340 13 st.u64 [%r118+-8],%r291; .loc 1 343 23 mul.lo.u64 %r74,%r79,%r83; .loc 1 344 24 mul.lo.u64 %r76,%r83,%r81; mov.u64 %r90,%r93; .loc 1 345 5 add.u64 %r93,%r93,1; .loc 1 346 7 setp.eq.u64 %r274,%r29,%r90; @ %r274 bra $L35; .loc 1 354 16 ld.u64 %r275,[%r118]; add.u64 %r78,%r275,1; st.u64 [%r118],%r78; .loc 1 355 23 add.u64 %r277,%r285,%r117; ld.u64 %r79,[%r277]; .loc 1 355 13 sub.u64 %r278,%r79,%r74; add.u64 %r86,%r86,%r278; .loc 1 356 24 add.u64 %r279,%frame,%r117; ld.u64 %r81,[%r279]; .loc 1 356 14 sub.u64 %r280,%r81,%r76; add.u64 %r89,%r89,%r280; .loc 1 359 32 add.u64 %r282,%r292,%r117; ld.u64 %r83,[%r282]; .loc 1 359 7 add.u64 %r118,%r118,8; add.u64 %r117,%r117,8; setp.eq.u64 %r283,%r78,%r83; @ %r283 bra $L62; ld.u64 %r158,[%frame+360]; bra $L63; $L65: mov.u64 %r86,%r45; setp.eq.u32 %r284,1,1; bra $L48; $L35: .loc 1 362_gfortran_smaxloc0_16_i1 .visible .func _gfortran_smaxloc0_16_i14predpred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 379 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 379 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 381 7 {_gfortran_maxloc0_16_.loc 1 382 7 bra $L90; $L92: .loc 1 385 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 390 15 ld.u64 %r24,[%r35]; .loc 1 390 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 392 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 393 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 394 24 st.u64 [%r35+8],%r24; .loc 1 395 29call (%value_in),_gfortrani_xmallocarray95 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 397 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 397 11 ld.u32 %r62,[%r61+36; .loc 1 399 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 404 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,4; mov.u64 %r29,%r24; .loc 1 405 10 mov.u64 %r32,0; .loc 1 406 23 mov.u64 %r68,%r32; $L97: st.u64 [%r29],%r68; st.u64 [%r29+8],%r68; .loc 1 405 24 add.u64 %r32,%r32,1; .loc 1 405 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r70,%r30,%r32; @ %r70 bra $L97; $L90: .loc 1 407maxloc0_4_i2.o/ 1622802227_gfortran_maxloc0_4_i2 .visible .func _gfortran_maxloc0_4_i2fortran/generated/maxloc0_4_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_4_i2 .visible .func _gfortran_mmaxloc0_4_i2_gfortran_smaxloc0_4_i2 .visible .func _gfortran_smaxloc0_4_i2VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = 
{77,65,88,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_4_i2 .visible .func _gfortran_maxloc0_4_i2u64 %r87; .reg .u64predpredpred %r168; .reg .pred %r169; .reg .u16 %r170; .reg .u16 %r171; .reg .pred %r172; .reg .u32 %r174; .reg .u32predu64 %r198; mov.u64 %r116,%ar0; mov.u64 %r117,%ar1; mov.u32 %r118,%ar2; .loc 1 50 10 ld.s8 %r22,[%r117+28]; .loc 1 50 8 cvt.u32.u32 %r119,%r22; cvt.s64.s8 %r72,%r119; .loc 1 51 6 setp.gt.s64 %r120,%r72,0; @ %r120 bra $L2; .loc 1 52 5 cvta.const.u64 %r121121_gfortran_runtime_errorr23,[%r116]; .loc 1 54 6 setp.ne.u64 %r123,%r23,0; @ %r123 bra $L3; .loc 1 56 7 st.u64 [%r116+48],%r23; add.u64 %r125,%r72,-1; st.u64 [%r116+56],%r125; mov.u64 %r126,1; st.u64 [%r116+40],%r126; .loc 1 57 28 cvt.u32.u64 %r127,%r126; st.u8 [%r116+28],%r127; .loc 1 58 24 st.u64 [%r116+8],%r23; .loc 1 59 29 mov.u64 %r130,4;r130; call (%value_in),_gfortrani_xmallocarray133,[%value_in]; } mov.u64 %r23,%r133; .loc 1 59 27 st.u64 [%r116],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r135,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r136,[%r135+36]; setp.eq.u32 %r137,%r136,0; @ %r137 bra $L4; .loc 1 64r140; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r116]; $L4: .loc 1 68 11 ld.u64 %r73,[%r116+40]; add.u64 %r39,%r117,40; mov.u64 %r45,%frame; add.u64 %r35,%frame,120; .loc 1 70 10 mov.u64 %r99,0; add.u64 %r193,%frame,240; .loc 1 74 16 mov.u64 %r148,%r99; bra $L7; $L22: mov.u64 %r99,%r79; $L7: .loc 1 72 18 ld.u64 %r141,[%r39]; st.u64 [%r45],%r141; .loc 1 73 19 ld.u64 %r143,[%r39+16]; add.u64 %r142,%r143,1; ld.u64 %r144,[%r39+8]; sub.u64 %r31,%r142,%r144; .loc 1 73 17 st.u64 [%r35],%r31; .loc 1 74 16 shl.b64 %r146,%r99,3; add.u64 %r147,%r193,%r146; st.u64 [%r147],%r148; .loc 1 75 10 setp.gt.s64 %r149,%r31,0; @ %r149 bra $L5; shl.b64 %r85,%r73,2; mov.u64 %r87,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 cvt.u32.u64 %r150,%r80; $L6: st.u32 [%r87],%r150; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r87,%r87,%r85; setp.ne.u64 %r151,%r72,%r80; @ %r151 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r99,1; .loc 1 70 3 add.u64 %r39,%r39,24; add.u64 %r45,%r45,8; add.u64 %r35,%r35,8; setp.ne.u64 %r152,%r72,%r79; @ %r152 bra $L22; .loc 1 84 8 ld.u64 %r64,[%r117]; shl.b64 %r82,%r73,2; mov.u64 %r92,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u32 %r153,1; $L8: st.u32 [%r92],%r153; mov.u64 %r32,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r92,%r92,%r82; setp.ne.u64 %r154,%r99,%r32; @ %r154 bra $L8; .loc 1 101 9 setp.ne.u64 %r155,%r64,0; @ ! 
%r155 bra $L1; .loc 1 149 19 ld.u64 %r49,[%frame]; .loc 1 149 9 add.u64 %r51,%r49,%r49; .loc 1 151 34 ld.u64 %r53,[%frame+120]; cvt.u32.u32 %r158,%r22; cvt.s64.s8 %r157,%r158; shl.b64 %r159,%r157,3; add.u64 %r102,%r193,%r159; .loc 1 99 12 mov.u32 %r67,-32768; setp.eq.u32 %r192,%r118,0; add.u64 %r194,%frame,248; .loc 1 157 13 mov.u64 %r197,0; .loc 1 174 32 add.u64 %r198,%frame,120; $L21: .loc 1 151 21 ld.u64 %r110,[%frame+240]; .loc 1 126 12 @ %r192 bra $L23; mov.u64 %r40,%r110; mov.u64 %r62,%r64; $L14: .loc 1 129 12 ld.s16 %r33,[%r62]; .loc 1 129 11 cvt.u16.u32 %r162,%r33; cvt.u16.u32 %r163,%r67; setp.lt.s16 %r164,%r162,%r163; @ %r164 bra $L12; mov.u64 %r43,%r194; mov.u64 %r65,%r23; mov.u64 %r105,%r40; $L13: .loc 1 133 35 cvt.u32.u64 %r167,%r105; add.u32 %r166,%r167,1; .loc 1 133 24 st.u32 [%r65],%r166; .loc 1 132 10 add.u64 %r65,%r65,%r82; setp.eq.u64 %r168,%r43,%r102; @ %r168 bra $L24; .loc 1 133 31 ld.u64 %r105,[%r43]; add.u64 %r43,%r43,8; bra $L13; $L24: mov.u32 %r67,%r33; $L12: .loc 1 135 12 add.u64 %r62,%r62,%r51; .loc 1 137 17 add.u64 %r40,%r40,1; .loc 1 137 10 st.u64 [%frame+240],%r40; setp.ne.u64 %r169,%r40,%r53; @ %r169 bra $L14; bra $L34; $L23: mov.u64 %r52,%r110; mov.u64 %r63,%r64; $L11: .loc 1 141 11 ld.s16 %r42,[%r63]; .loc 1 141 10 cvt.u16.u32 %r170,%r42; cvt.u16.u32 %r171,%r67; setp.le.s16 %r172,%r170,%r171; @ %r172 bra $L16; mov.u64 %r108,%r194; mov.u64 %r109,%r23; mov.u64 %r44,%r52; $L17: .loc 1 145 35 cvt.u32.u64 %r175,%r44; add.u32 %r174,%r175,1; .loc 1 145 24 st.u32 [%r109],%r174; .loc 1 144 4 add.u64 %r109,%r109,%r82; setp.eq.u64 %r176,%r102,%r108; @ %r176 bra $L25; .loc 1 145 31 ld.u64 %r44,[%r108]; add.u64 %r108,%r108,8; bra $L17; $L25: mov.u32 %r67,%r42; $L16: .loc 1 149 9 add.u64 %r63,%r63,%r51; .loc 1 151 14 add.u64 %r52,%r52,1; .loc 1 151 7 st.u64 [%frame+240],%r52; setp.ne.u64 %r177,%r52,%r53; @ %r177 bra $L11; .loc 1 149 9 sub.u64 %r178,%r53,%r110; mad.lo.u64 %r64,%r178,%r51,%r64; bra $L18; $L34: .loc 1 135 12 sub.u64 %r180,%r53,%r110; mad.lo.u64 %r64,%r180,%r51,%r64; $L18: mov.u64 %r81,%r194; .loc 1 151 34 mov.u64 %r61,%r53; mov.u64 %r59,%r49; .loc 1 152 9 mov.u64 %r66,0; $L20: .loc 1 157 13 st.u64 [%r81+-8],%r197; .loc 1 160 23 mul.lo.u64 %r56,%r59,%r61; mov.u64 %r100,%r66; .loc 1 161 5 add.u64 %r66,%r66,1; .loc 1 162 7 setp.eq.u64 %r184,%r100,%r99; @ %r184 bra $L1; .loc 1 170 16 ld.u64 %r185,[%r81]; add.u64 %r58,%r185,1; st.u64 [%r81],%r58; shl.b64 %r70,%r66,3; .loc 1 171 23 add.u64 %r186,%frame,%r70; ld.u64 %r59,[%r186]; .loc 1 171 13 sub.u64 %r187,%r59,%r56; add.u64 %r188,%r187,%r187; add.u64 %r64,%r64,%r188; .loc 1 174 32 add.u64 %r190,%r198,%r70; ld.u64 %r61,[%r190]; .loc 1 174 7 add.u64 %r81,%r81,8; setp.eq.u64 %r191,%r58,%r61; @ %r191 bra $L20; bra $L21; $L1: .loc 1 1_gfortran_mmaxloc0_4_i2 .visible .func _gfortran_mmaxloc0_4_i27predpred %r193; .reg .u64 %r196; .reg .u6464pred %r228; .reg .u32 %r229; .reg .predpredu64predu64 %r279; .reg .u64 %r281; .reg .u64 %r282; .reg .u64pred %r299; mov.u64 %r166,%ar0; mov.u64 %r167,%ar1; mov.u64 %r168,%ar2; mov.u32 %r169,%ar3; .loc 1 202 6 setp.ne.u64 %r170,%r168,0; @ %r170 bra $L36gfortran_maxloc0_4_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r167+28]; .loc 1 209 6 setp.gt.s32 %r174,%r22,0; @ %r174 bra $L38; .loc 1 210 5 cvta.const.u64 %r175175_gfortran_runtime_error%r177,[%r166]; setp.ne.u64 %r178,%r177,0; @ %r178 bra $L39; .loc 1 214 7 st.u64 [%r166+48],%r177; add.u32 %r180,%r22,-1; cvt.s64.s32 %r181,%r180; st.u64 [%r166+56],%r181; mov.u64 %r182,1; st.u64 [%r166+40],%r182; .loc 1 215 28 cvt.u32.u64 
%r183,%r182; st.u8 [%r166+28],%r183; .loc 1 216 24 st.u64 [%r166+8],%r177; .loc 1 217 29 cvt.u32.u32 %r188,%r22; cvt.s64.s8 %r187,%r188call (%value_in),_gfortrani_xmallocarray189,[%value_in]; } .loc 1 217 27 st.u64 [%r166],%r189; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r191,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r192,[%r191+36];r167196; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r168+16]; .loc 1 233 9 ld.u64 %r93,[%r168]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r201,%r32,-4; and.b32 %r202,%r201,-5; set.u32.eq.u32 %r204,%r202,0; neg.s32 %r205,%r204; .loc 1 235 22 add.u32 %r206,%r32,-1; set.u32.le.u32 %r208,%r206,1; neg.s32 %r209,%r208; .loc 1 235 6 cvt.u16.u32 %r211,%r205; cvt.u16.u32 %r212,%r209; or.b16 %r210,%r211,%r212; cvt.u32.u16 %r213,%r210; cvt.u16.u8 %r214,%r213; setp.ne.u16 %r215,%r214,0; @ %r215 bra $L41; .loc 1 237 7 setp.ne.u32 %r217,%r32,16; @ %r217 bra $L42; $L41: .loc 1 244 11 ld.u64 %r106,[%r166+40]; .loc 1 245 8 ld.u64 %r107,[%r166]; .loc 1 246 17 cvt.u32.u32 %r218,%r22; cvt.s64.s8 %r65,%r218; add.u64 %r52,%r167,40; add.u64 %r289,%frame,120; mov.u64 %r116,%r289; add.u64 %r102,%r168,40; mov.u64 %r165,%frame; add.u64 %r162,%frame,240; add.u64 %r159,%frame,360; mov.u64 %r160,%r159; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r227,%r29; bra $L43; $L42: .loc 1 242 5_gfortran_runtime_error64: mov.u64 %r29,%r113; $L43: .loc 1 248 18 ld.u64 %r221,[%r52]; st.u64 [%r116],%r221; .loc 1 249 20 ld.u64 %r223,[%r102]; mul.lo.u64 %r222,%r223,%r31; .loc 1 249 18 st.u64 [%r165],%r222; .loc 1 250 19 ld.u64 %r225,[%r52+16]; add.u64 %r224,%r225,1; ld.u64 %r226,[%r52+8]; sub.u64 %r44,%r224,%r226; .loc 1 250 17 st.u64 [%r162],%r44; .loc 1 251 16 st.u64 [%r160],%r227; .loc 1 252 10 setp.gt.s64 %r228,%r44,0; @ %r228 bra $L44; shl.b64 %r149,%r106,2; mov.u64 %r139,%r107; .loc 1 255 11 mov.u64 %r114,0; .loc 1 256 24 cvt.u32.u64 %r229,%r114; $L45: st.u32 [%r139],%r229; .loc 1 255 27 add.u64 %r114,%r114,1; .loc 1 255 4 add.u64 %r139,%r139,%r149; setp.ne.u64 %r230,%r65,%r114; @ %r230 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r113,%r29,1; .loc 1 246 3 add.u64 %r52,%r52,24; add.u64 %r116,%r116,8; add.u64 %r102,%r102,24; add.u64 %r165,%r165,8; add.u64 %r162,%r162,8; add.u64 %r160,%r160,8; setp.ne.u64 %r231,%r65,%r113; @ %r231 bra $L64; .loc 1 261 8 ld.u64 %r90,[%r167]; shl.b64 %r62,%r106,2; mov.u64 %r91,%r107; .loc 1 264 10 mov.u64 %r112,0; .loc 1 265 23 cvt.u32.u64 %r232,%r112; $L47: st.u32 [%r91],%r232; mov.u64 %r142,%r112; .loc 1 264 26 add.u64 %r112,%r112,1; .loc 1 264 3 add.u64 %r91,%r91,%r62; setp.ne.u64 %r233,%r29,%r142; @ %r233 bra $L47; .loc 1 276 9 setp.eq.u64 %r234,%r90,0; @ %r234 bra $L35; .loc 1 331 19 ld.u64 %r69,[%frame+120]; .loc 1 331 9 add.u64 %r71,%r69,%r69; .loc 1 332 20 ld.u64 %r72,[%frame]; .loc 1 334 34 ld.u64 %r75,[%frame+240]; ld.u64 %r163,[%frame+360]; cvt.u32.u32 %r237,%r22; cvt.s64.s8 %r236,%r237; shl.b64 %r238,%r236,3; add.u64 %r127,%r238,%r159; .loc 1 274 12 mov.u32 %r98,-32768; setp.eq.u32 %r291,1,0; add.u64 %r290,%frame,368; .loc 1 340 13 mov.u64 %r296,0; .loc 1 359 32 add.u64 %r297,%frame,240; .loc 1 308 12 setp.eq.u32 %r299,%r169,0; $L63: .loc 1 280 10 @ %r291 bra $L48; mov.u64 %r53,%r163; mov.u32 %r46,0; mov.u32 %r293,1; $L53: mov.u64 %r45,%r90; .loc 1 300 13 add.u64 %r90,%r45,%r71; .loc 1 284 11 ld.s8 %r241,[%r93]; cvt.u16.u32 %r240,%r241; setp.eq.u16 %r242,%r240,0; @ %r242 bra $L49; setp.eq.u32 %r243,%r46,0; selp.u64 
%r163,%r163,%r53,%r243; .loc 1 294 16 ld.s16 %r98,[%r45]; mov.u64 %r137,%r290; mov.u64 %r150,%r107; mov.u64 %r161,%r163; $L52: .loc 1 296 33 cvt.u32.u64 %r246,%r161; add.u32 %r245,%r246,1; .loc 1 296 22 st.u32 [%r150],%r245; .loc 1 295 9 add.u64 %r150,%r150,%r62; setp.eq.u64 %r247,%r127,%r137; @ %r247 bra $L65; .loc 1 296 29 ld.u64 %r161,[%r137]; add.u64 %r137,%r137,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r93,%r93,%r72; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r46,%r293; .loc 1 303 4 setp.ne.u64 %r248,%r53,%r75; @ %r248 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r299 bra $L66; mov.u64 %r61,%r163; mov.u64 %r88,%r90; $L58: .loc 1 311 11 ld.s8 %r251,[%r93]; cvt.u16.u32 %r250,%r251; setp.eq.u16 %r252,%r250,0; @ %r252 bra $L56; .loc 1 311 22 ld.s16 %r57,[%r88]; .loc 1 311 19 cvt.u16.u32 %r253,%r57; cvt.u16.u32 %r254,%r98; setp.lt.s16 %r255,%r253,%r254; @ %r255 bra $L56; mov.u64 %r117,%r290; mov.u64 %r119,%r107; mov.u64 %r156,%r61; .loc 1 314 19 mov.u64 %r143,%r296; $L57: .loc 1 315 36 cvt.u32.u64 %r258,%r156; add.u32 %r257,%r258,1; .loc 1 315 25 st.u32 [%r119],%r257; .loc 1 314 35 add.u64 %r110,%r143,1; .loc 1 314 12 add.u64 %r119,%r119,%r62; setp.le.s64 %r259,%r29,%r143; @ %r259 bra $L67; .loc 1 315 32 ld.u64 %r156,[%r117]; add.u64 %r117,%r117,8; mov.u64 %r143,%r110; bra $L57; $L67: mov.u32 %r98,%r57; $L56: .loc 1 317 13 add.u64 %r88,%r88,%r71; .loc 1 319 11 add.u64 %r61,%r61,1; .loc 1 319 4 st.u64 [%frame+360],%r61; setp.ne.u64 %r260,%r61,%r75; @ %r260 bra $L58; .loc 1 317 13 sub.u64 %r261,%r75,%r163; mad.lo.u64 %r90,%r261,%r71,%r90; bra $L59; $L66: mov.u64 %r74,%r163; mov.u64 %r92,%r93; mov.u64 %r89,%r90; $L55: .loc 1 323 11 ld.s8 %r264,[%r92]; cvt.u16.u32 %r263,%r264; setp.eq.u16 %r265,%r263,0; @ %r265 bra $L60; .loc 1 323 22 ld.s16 %r64,[%r89]; .loc 1 323 19 cvt.u16.u32 %r266,%r64; cvt.u16.u32 %r267,%r98; setp.le.s16 %r268,%r266,%r267; @ %r268 bra $L60; mov.u64 %r97,%r290; mov.u64 %r100,%r107; mov.u64 %r151,%r74; .loc 1 326 12 mov.u64 %r144,%r296; $L61: .loc 1 327 36 cvt.u32.u64 %r271,%r151; add.u32 %r270,%r271,1; .loc 1 327 25 st.u32 [%r100],%r270; .loc 1 326 28 add.u64 %r109,%r144,1; .loc 1 326 5 add.u64 %r100,%r100,%r62; setp.le.s64 %r272,%r29,%r144; @ %r272 bra $L68; .loc 1 327 32 ld.u64 %r151,[%r97]; add.u64 %r97,%r97,8; mov.u64 %r144,%r109; bra $L61; $L68: mov.u32 %r98,%r64; $L60: .loc 1 331 9 add.u64 %r89,%r89,%r71; .loc 1 332 10 add.u64 %r92,%r92,%r72; .loc 1 334 14 add.u64 %r74,%r74,1; .loc 1 334 7 st.u64 [%frame+360],%r74; setp.ne.u64 %r273,%r74,%r75; @ %r273 bra $L55; sub.u64 %r134,%r75,%r163; .loc 1 331 9 mad.lo.u64 %r90,%r71,%r134,%r90; .loc 1 332 10 mad.lo.u64 %r93,%r72,%r134,%r93; bra $L59; $L89: st.u64 [%frame+360],%r75; $L59: mov.u64 %r125,%r290; mov.u64 %r84,%r72; .loc 1 334 34 mov.u64 %r86,%r75; mov.u64 %r82,%r69; mov.u64 %r124,8; .loc 1 335 9 mov.u64 %r96,0; $L62: .loc 1 340 13 st.u64 [%r125+-8],%r296; .loc 1 343 23 mul.lo.u64 %r77,%r82,%r86; .loc 1 344 24 mul.lo.u64 %r79,%r86,%r84; mov.u64 %r118,%r96; .loc 1 345 5 add.u64 %r96,%r96,1; .loc 1 346 7 setp.eq.u64 %r278,%r29,%r118; @ %r278 bra $L35; .loc 1 354 16 ld.u64 %r279,[%r125]; add.u64 %r81,%r279,1; st.u64 [%r125],%r81; .loc 1 355 23 add.u64 %r281,%r289,%r124; ld.u64 %r82,[%r281]; .loc 1 355 13 sub.u64 %r282,%r82,%r77; add.u64 %r283,%r282,%r282; add.u64 %r90,%r90,%r283; .loc 1 356 24 add.u64 %r284,%frame,%r124; ld.u64 %r84,[%r284]; .loc 1 356 14 sub.u64 %r285,%r84,%r79; add.u64 %r93,%r93,%r285; .loc 1 359 32 add.u64 %r287,%r297,%r124; ld.u64 %r86,[%r287]; .loc 1 359 7 add.u64 %r125,%r125,8; add.u64 
%r124,%r124,8; setp.eq.u64 %r288,%r81,%r86; @ %r288 bra $L62; ld.u64 %r163,[%frame+360]; bra $L63; $L65: mov.u64 %r90,%r45; setp.eq.u32 %r291,1,1; bra $L48; $L35: .loc 1 362_gfortran_smaxloc0_4_i2 .visible .func _gfortran_smaxloc0_4_i24predpred.reg .u64u32 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 379 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 379 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 381 7 {_gfortran_maxloc0_4_.loc 1 382 7 bra $L90; $L92: .loc 1 385 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 390 15 ld.u64 %r24,[%r35]; .loc 1 390 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 392 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 393 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 394 24 st.u64 [%r35+8],%r24; .loc 1 395 29 mov.u64 %r56call (%value_in),_gfortrani_xmallocarray95 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 397 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 397 11 ld.u32 %r62,[%r61+36; .loc 1 399 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 404 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,2; mov.u64 %r29,%r24; .loc 1 405 10 mov.u64 %r32,0; .loc 1 406 23 cvt.u32.u64 %r68,%r32; $L97: st.u32 [%r29],%r68; .loc 1 405 24 add.u64 %r32,%r32,1; .loc 1 405 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L97; $L90: .loc 1 407maxloc0_8_i2.o/ 1622802227_gfortran_maxloc0_8_i2 .visible .func _gfortran_maxloc0_8_i2fortran/generated/maxloc0_8_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_8_i2 .visible .func _gfortran_mmaxloc0_8_i2_gfortran_smaxloc0_8_i2 .visible .func _gfortran_smaxloc0_8_i2VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,65,88,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_8_i2 .visible .func _gfortran_maxloc0_8_i232predpredpredpredmov.u64 %r112,%ar0; mov.u64 %r113,%ar1; mov.u32 %r114,%ar2; .loc 1 50 10 ld.s8 %r22,[%r113+28]; .loc 1 50 8 cvt.u32.u32 %r115,%r22; cvt.s64.s8 %r71,%r115; .loc 1 51 6 setp.gt.s64 %r116,%r71,0; @ %r116 bra $L2; .loc 1 52 5 cvta.const.u64 %r11117_gfortran_runtime_errorr23,[%r112]; .loc 1 54 6 setp.ne.u64 %r119,%r23,0; @ %r119 bra $L3; .loc 1 56 7 st.u64 [%r112+48],%r23; add.u64 %r121,%r71,-1; st.u64 [%r112+56],%r121; mov.u64 %r122,1; st.u64 [%r112+40],%r122; .loc 1 57 28 cvt.u32.u64 %r123,%r122; st.u8 [%r112+28],%r123; .loc 1 58 24 st.u64 [%r112+8],%r23; .loc 1 59 29 mov.u64 %r171call (%value_in),_gfortrani_xmallocarray129,[%value_in]; } mov.u64 %r23,%r129; .loc 1 59 27 st.u64 [%r112],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r131,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r132,[%r131+36]; setp.eq.u32 %r133,%r132,0; @ %r133 bra $L4; .loc 1 64r113136; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r112]; $L4: .loc 1 68 11 ld.u64 %r72,[%r112+40]; add.u64 %r36,%r113,40; mov.u64 %r104,%frame; add.u64 %r105,%frame,120; .loc 1 70 10 mov.u64 %r98,0; add.u64 %r187,%frame,240; 
.loc 1 74 16 mov.u64 %r144,%r98; bra $L7; $L22: mov.u64 %r98,%r79; $L7: .loc 1 72 18 ld.u64 %r137,[%r36]; st.u64 [%r104],%r137; .loc 1 73 19 ld.u64 %r139,[%r36+16]; add.u64 %r138,%r139,1; ld.u64 %r140,[%r36+8]; sub.u64 %r31,%r138,%r140; .loc 1 73 17 st.u64 [%r105],%r31; .loc 1 74 16 shl.b64 %r142,%r98,3; add.u64 %r143,%r187,%r142; st.u64 [%r143],%r144; .loc 1 75 10 setp.gt.s64 %r145,%r31,0; @ %r145 bra $L5; shl.b64 %r81,%r72,3; mov.u64 %r87,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 mov.u64 %r146,%r80; $L6: st.u64 [%r87],%r146; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r87,%r87,%r81; setp.ne.u64 %r147,%r71,%r80; @ %r147 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r98,1; .loc 1 70 3 add.u64 %r36,%r36,24; add.u64 %r104,%r104,8; add.u64 %r105,%r105,8; setp.ne.u64 %r148,%r71,%r79; @ %r148 bra $L22; .loc 1 84 8 ld.u64 %r62,[%r113]; shl.b64 %r84,%r72,3; mov.u64 %r39,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u64 %r149,1; $L8: st.u64 [%r39],%r149; mov.u64 %r32,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r39,%r39,%r84; setp.ne.u64 %r150,%r98,%r32; @ %r150 bra $L8; .loc 1 101 9 setp.ne.u64 %r151,%r62,0; @ ! %r151 bra $L1; .loc 1 149 19 ld.u64 %r45,[%frame]; .loc 1 149 9 add.u64 %r47,%r45,%r45; .loc 1 151 34 ld.u64 %r51,[%frame+120]; cvt.u32.u32 %r154,%r22; cvt.s64.s8 %r153,%r154; shl.b64 %r155,%r153,3; add.u64 %r73,%r187,%r155; .loc 1 99 12 mov.u32 %r67,-32768; setp.eq.u32 %r186,%r114,0; add.u64 %r188,%frame,248; .loc 1 157 13 mov.u64 %r191,0; .loc 1 174 32 add.u64 %r192,%frame,120; $L21: .loc 1 151 21 ld.u64 %r103,[%frame+240]; .loc 1 126 12 @ %r186 bra $L23; mov.u64 %r40,%r103; mov.u64 %r60,%r62; $L14: .loc 1 129 12 ld.s16 %r33,[%r60]; .loc 1 129 11 cvt.u16.u32 %r158,%r33; cvt.u16.u32 %r159,%r67; setp.lt.s16 %r160,%r158,%r159; @ %r160 bra $L12; mov.u64 %r76,%r188; mov.u64 %r34,%r23; mov.u64 %r90,%r40; $L13: .loc 1 133 35 add.u64 %r162,%r90,1; .loc 1 133 24 st.u64 [%r34],%r162; .loc 1 132 10 add.u64 %r34,%r34,%r84; setp.eq.u64 %r163,%r73,%r76; @ %r163 bra $L24; .loc 1 133 31 ld.u64 %r90,[%r76]; add.u64 %r76,%r76,8; bra $L13; $L24: mov.u32 %r67,%r33; $L12: .loc 1 135 12 add.u64 %r60,%r60,%r47; .loc 1 137 17 add.u64 %r40,%r40,1; .loc 1 137 10 st.u64 [%frame+240],%r40; setp.ne.u64 %r164,%r40,%r51; @ %r164 bra $L14; bra $L34; $L23: mov.u64 %r49,%r103; mov.u64 %r61,%r62; $L11: .loc 1 141 11 ld.s16 %r42,[%r61]; .loc 1 141 10 cvt.u16.u32 %r165,%r42; cvt.u16.u32 %r166,%r67; setp.le.s16 %r167,%r165,%r166; @ %r167 bra $L16; mov.u64 %r85,%r188; mov.u64 %r101,%r23; mov.u64 %r86,%r49; $L17: .loc 1 145 35 add.u64 %r169,%r86,1; .loc 1 145 24 st.u64 [%r101],%r169; .loc 1 144 4 add.u64 %r101,%r101,%r84; setp.eq.u64 %r170,%r73,%r85; @ %r170 bra $L25; .loc 1 145 31 ld.u64 %r86,[%r85]; add.u64 %r85,%r85,8; bra $L17; $L25: mov.u32 %r67,%r42; $L16: .loc 1 149 9 add.u64 %r61,%r61,%r47; .loc 1 151 14 add.u64 %r49,%r49,1; .loc 1 151 7 st.u64 [%frame+240],%r49; setp.ne.u64 %r171,%r49,%r51; @ %r171 bra $L11; .loc 1 149 9 sub.u64 %r172,%r51,%r103; mad.lo.u64 %r62,%r172,%r47,%r62; bra $L18; $L34: .loc 1 135 12 sub.u64 %r174,%r51,%r103; mad.lo.u64 %r62,%r174,%r47,%r62; $L18: mov.u64 %r70,%r188; .loc 1 151 34 mov.u64 %r59,%r51; mov.u64 %r57,%r45; .loc 1 152 9 mov.u64 %r66,0; $L20: .loc 1 157 13 st.u64 [%r70+-8],%r191; .loc 1 160 23 mul.lo.u64 %r54,%r57,%r59; mov.u64 %r99,%r66; .loc 1 161 5 add.u64 %r66,%r66,1; .loc 1 162 7 setp.eq.u64 %r178,%r99,%r98; @ %r178 bra $L1; .loc 1 170 16 ld.u64 %r179,[%r70]; add.u64 %r56,%r179,1; st.u64 [%r70],%r56; shl.b64 %r65,%r66,3; .loc 1 171 23 
add.u64 %r180,%frame,%r65; ld.u64 %r57,[%r180]; .loc 1 171 13 sub.u64 %r181,%r57,%r54; add.u64 %r182,%r181,%r181; add.u64 %r62,%r62,%r182; .loc 1 174 32 add.u64 %r184,%r192,%r65; ld.u64 %r59,[%r184]; .loc 1 174 7 add.u64 %r70,%r70,8; setp.eq.u64 %r185,%r56,%r59; @ %r185 bra $L20; bra $L21; $L1: .loc 1 1_gfortran_mmaxloc0_8_i2 .visible .func _gfortran_mmaxloc0_8_i27pred %r187; .reg .u64 %r190; .reg .u64u64pred %r222; .reg .u64 %r223; .reg .predu32 %r231; .reg .u64 %r232; .reg .u16 %r234; .reg .u32 %r235; .reg .pred %r236; .reg .predpred %r251; .reg .pred %r252; .reg .u64 %r253; .reg .u16 %r255; .reg .u32predpred %r281; .reg .u64 %r282; .reg .u32 %r284; .reg .u64 %r287; .reg .u64 %r288; .reg .pred %r290; mov.u64 %r160,%ar0; mov.u64 %r161,%ar1; mov.u64 %r162,%ar2; mov.u32 %r163,%ar3; .loc 1 202 6 setp.ne.u64 %r164,%r162,0; @ %r164 bra $L36; .loc 1 204 7 {1163; call _gfortran_maxloc0_8_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r161+28]; .loc 1 209 6 setp.gt.s32 %r168,%r22,0; @ %r168 bra $L38; .loc 1 210 5 cvta.const.u64 %r16stack; call _gfortran_runtime_error%r171,[%r160]; setp.ne.u64 %r172,%r171,0; @ %r172 bra $L39; .loc 1 214 7 st.u64 [%r160+48],%r171; add.u32 %r174,%r22,-1; cvt.s64.s32 %r175,%r174; st.u64 [%r160+56],%r175; mov.u64 %r176,1; st.u64 [%r160+40],%r176; .loc 1 215 28 cvt.u32.u64 %r177,%r176; st.u8 [%r160+28],%r177; .loc 1 216 24 st.u64 [%r160+8],%r171; .loc 1 217 29 cvt.u32.u32 %r182,%r22; cvt.s64.s8 %r181,%r182; mov.u64 %r180call (%value_in),_gfortrani_xmallocarray183,[%value_in]; } .loc 1 217 27 st.u64 [%r160],%r183; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r185,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r186,[%r185+36]; setp.eq.u32 %r187,%r186,0; @ %r187 bra $L40; .loc 1 224 4 cvta.const.u64 %r190,$LC1; {161190; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19316161193190; call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r162+16]; .loc 1 233 9 ld.u64 %r92,[%r162]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r195,%r32,-4; and.b32 %r196,%r195,-5; set.u32.eq.u32 %r198,%r196,0; neg.s32 %r199,%r198; .loc 1 235 22 add.u32 %r200,%r32,-1; set.u32.le.u32 %r202,%r200,1; neg.s32 %r203,%r202; .loc 1 235 6 cvt.u16.u32 %r205,%r199; cvt.u16.u32 %r206,%r203; or.b16 %r204,%r205,%r206; cvt.u32.u16 %r207,%r204; cvt.u16.u8 %r208,%r20741; .loc 1 237 7 setp.ne.u32 %r211,%r32,16; @ %r211 bra $L42; $L41: .loc 1 244 11 ld.u64 %r106,[%r160+40]; .loc 1 245 8 ld.u64 %r107,[%r160]; .loc 1 246 17 cvt.u32.u32 %r212,%r22; cvt.s64.s8 %r68,%r212; add.u64 %r153,%r161,40; add.u64 %r282,%frame,120; mov.u64 %r149,%r282; add.u64 %r147,%r162,40; mov.u64 %r146,%frame; add.u64 %r143,%frame,240; add.u64 %r62,%frame,360; mov.u64 %r47,%r62; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r221,%r29; bra $L43; $L42: .loc 1 242 5_gfortran_runtime_error64: mov.u64 %r29,%r113; $L43: .loc 1 248 18 ld.u64 %r215,[%r153]; st.u64 [%r149],%r215; .loc 1 249 20 ld.u64 %r217,[%r147]; mul.lo.u64 %r216,%r217,%r31; .loc 1 249 18 st.u64 [%r146],%r216; .loc 1 250 19 ld.u64 %r219,[%r153+16]; add.u64 %r218,%r219,1; ld.u64 %r220,[%r153+8]; sub.u64 %r44,%r218,%r220; .loc 1 250 17 st.u64 [%r143],%r44; .loc 1 251 16 st.u64 [%r47],%r221; .loc 1 252 10 setp.gt.s64 %r222,%r44,0; @ %r222 bra $L44; shl.b64 %r137,%r106,3; mov.u64 %r129,%r107; .loc 1 255 11 mov.u64 %r114,0; .loc 1 256 24 mov.u64 %r223,%r114; $L45: st.u64 [%r129],%r223; .loc 1 255 27 add.u64 %r114,%r114,1; .loc 1 255 4 add.u64 %r129,%r129,%r137; setp.ne.u64 %r224,%r68,%r114; @ %r224 bra $L45; bra 
$L35; $L44: .loc 1 246 26 add.u64 %r113,%r29,1; .loc 1 246 3 add.u64 %r153,%r153,24; add.u64 %r149,%r149,8; add.u64 %r147,%r147,24; add.u64 %r146,%r146,8; add.u64 %r143,%r143,8; add.u64 %r47,%r47,8; setp.ne.u64 %r225,%r68,%r113; @ %r225 bra $L64; .loc 1 261 8 ld.u64 %r90,[%r161]; shl.b64 %r158,%r106,3; mov.u64 %r156,%r107; .loc 1 264 10 mov.u64 %r112,0; .loc 1 265 23 mov.u64 %r226,%r112; $L47: st.u64 [%r156],%r226; mov.u64 %r140,%r112; .loc 1 264 26 add.u64 %r112,%r112,1; .loc 1 264 3 add.u64 %r156,%r156,%r158; setp.ne.u64 %r227,%r29,%r140; @ %r227 bra $L47; .loc 1 276 9 setp.eq.u64 %r228,%r90,0; @ %r228 bra $L35; .loc 1 331 19 ld.u64 %r70,[%frame+120]; .loc 1 331 9 add.u64 %r72,%r70,%r70; .loc 1 332 20 ld.u64 %r73,[%frame]; .loc 1 334 34 ld.u64 %r76,[%frame+240]; ld.u64 %r155,[%frame+360]; cvt.u32.u32 %r231,%r22; cvt.s64.s8 %r230,%r231; shl.b64 %r232,%r230,3; add.u64 %r119,%r232,%r62; .loc 1 274 12 mov.u32 %r97,-32768; setp.eq.u32 %r281,1,0; add.u64 %r280,%frame,368; .loc 1 340 13 mov.u64 %r287,0; .loc 1 359 32 add.u64 %r288,%frame,240; .loc 1 308 12 setp.eq.u32 %r290,%r163,0; $L63: .loc 1 280 10 @ %r281 bra $L48; mov.u64 %r56,%r155; mov.u32 %r46,0; mov.u32 %r284,1; $L53: mov.u64 %r45,%r90; .loc 1 300 13 add.u64 %r90,%r45,%r72; .loc 1 284 11 ld.s8 %r235,[%r92]; cvt.u16.u32 %r234,%r235; setp.eq.u16 %r236,%r234,0; @ %r236 bra $L49; setp.eq.u32 %r237,%r46,0; selp.u64 %r155,%r155,%r56,%r237; .loc 1 294 16 ld.s16 %r97,[%r45]; mov.u64 %r128,%r280; mov.u64 %r139,%r107; mov.u64 %r151,%r155; $L52: .loc 1 296 33 add.u64 %r239,%r151,1; .loc 1 296 22 st.u64 [%r139],%r239; .loc 1 295 9 add.u64 %r139,%r139,%r158; setp.eq.u64 %r240,%r119,%r128; @ %r240 bra $L65; .loc 1 296 29 ld.u64 %r151,[%r128]; add.u64 %r128,%r128,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r92,%r92,%r73; .loc 1 303 11 add.u64 %r56,%r56,1; mov.u32 %r46,%r284; .loc 1 303 4 setp.ne.u64 %r241,%r56,%r76; @ %r241 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r290 bra $L66; mov.u64 %r64,%r155; mov.u64 %r88,%r90; $L58: .loc 1 311 11 ld.s8 %r244,[%r92]; cvt.u16.u32 %r243,%r244; setp.eq.u16 %r245,%r243,0; @ %r245 bra $L56; .loc 1 311 22 ld.s16 %r60,[%r88]; .loc 1 311 19 cvt.u16.u32 %r246,%r60; cvt.u16.u32 %r247,%r97; setp.lt.s16 %r248,%r246,%r247; @ %r248 bra $L56; mov.u64 %r93,%r280; mov.u64 %r96,%r107; mov.u64 %r148,%r64; .loc 1 314 19 mov.u64 %r141,%r287; $L57: .loc 1 315 36 add.u64 %r250,%r148,1; .loc 1 315 25 st.u64 [%r96],%r250; .loc 1 314 35 add.u64 %r110,%r141,1; .loc 1 314 12 add.u64 %r96,%r96,%r158; setp.le.s64 %r251,%r29,%r141; @ %r251 bra $L67; .loc 1 315 32 ld.u64 %r148,[%r93]; add.u64 %r93,%r93,8; mov.u64 %r141,%r110; bra $L57; $L67: mov.u32 %r97,%r60; $L56: .loc 1 317 13 add.u64 %r88,%r88,%r72; .loc 1 319 11 add.u64 %r64,%r64,1; .loc 1 319 4 st.u64 [%frame+360],%r64; setp.ne.u64 %r252,%r64,%r76; @ %r252 bra $L58; .loc 1 317 13 sub.u64 %r253,%r76,%r155; mad.lo.u64 %r90,%r253,%r72,%r90; bra $L59; $L66: mov.u64 %r75,%r155; mov.u64 %r91,%r92; mov.u64 %r89,%r90; $L55: .loc 1 323 11 ld.s8 %r256,[%r91]; cvt.u16.u32 %r255,%r256; setp.eq.u16 %r257,%r255,0; @ %r257 bra $L60; .loc 1 323 22 ld.s16 %r67,[%r89]; .loc 1 323 19 cvt.u16.u32 %r258,%r67; cvt.u16.u32 %r259,%r97; setp.le.s16 %r260,%r258,%r259; @ %r260 bra $L60; mov.u64 %r115,%r280; mov.u64 %r57,%r107; mov.u64 %r144,%r75; .loc 1 326 12 mov.u64 %r142,%r287; $L61: .loc 1 327 36 add.u64 %r262,%r144,1; .loc 1 327 25 st.u64 [%r57],%r262; .loc 1 326 28 add.u64 %r109,%r142,1; .loc 1 326 5 add.u64 %r57,%r57,%r158; setp.le.s64 %r263,%r29,%r142; @ %r263 bra $L68; .loc 1 327 32 ld.u64 
%r144,[%r115]; add.u64 %r115,%r115,8; mov.u64 %r142,%r109; bra $L61; $L68: mov.u32 %r97,%r67; $L60: .loc 1 331 9 add.u64 %r89,%r89,%r72; .loc 1 332 10 add.u64 %r91,%r91,%r73; .loc 1 334 14 add.u64 %r75,%r75,1; .loc 1 334 7 st.u64 [%frame+360],%r75; setp.ne.u64 %r264,%r75,%r76; @ %r264 bra $L55; sub.u64 %r125,%r76,%r155; .loc 1 331 9 mad.lo.u64 %r90,%r72,%r125,%r90; .loc 1 332 10 mad.lo.u64 %r92,%r73,%r125,%r92; bra $L59; $L89: st.u64 [%frame+360],%r76; $L59: mov.u64 %r117,%r280; mov.u64 %r85,%r73; .loc 1 334 34 mov.u64 %r87,%r76; mov.u64 %r83,%r70; mov.u64 %r116,8; .loc 1 335 9 mov.u64 %r95,0; $L62: .loc 1 340 13 st.u64 [%r117+-8],%r287; .loc 1 343 23 mul.lo.u64 %r78,%r83,%r87; .loc 1 344 24 mul.lo.u64 %r80,%r87,%r85; mov.u64 %r94,%r95; .loc 1 345 5 add.u64 %r95,%r95,1; .loc 1 346 7 setp.eq.u64 %r269,%r29,%r94; @ %r269 bra $L35; .loc 1 354 16 ld.u64 %r270,[%r117]; add.u64 %r82,%r270,1; st.u64 [%r117],%r82; .loc 1 355 23 add.u64 %r272,%r282,%r116; ld.u64 %r83,[%r272]; .loc 1 355 13 sub.u64 %r273,%r83,%r78; add.u64 %r274,%r273,%r273; add.u64 %r90,%r90,%r274; .loc 1 356 24 add.u64 %r275,%frame,%r116; ld.u64 %r85,[%r275]; .loc 1 356 14 sub.u64 %r276,%r85,%r80; add.u64 %r92,%r92,%r276; .loc 1 359 32 add.u64 %r278,%r288,%r116; ld.u64 %r87,[%r278]; .loc 1 359 7 add.u64 %r117,%r117,8; add.u64 %r116,%r116,8; setp.eq.u64 %r279,%r82,%r87; @ %r279 bra $L62; ld.u64 %r155,[%frame+360]; bra $L63; $L65: mov.u64 %r90,%r45; setp.eq.u32 %r281,1,1; bra $L48; $L35: .loc 1 362_gfortran_smaxloc0_8_i2 .visible .func _gfortran_smaxloc0_8_i24predpred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 379 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 379 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 381 7 {_gfortran_maxloc0_8_.loc 1 382 7 bra $L90; $L92: .loc 1 385 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 390 15 ld.u64 %r24,[%r35]; .loc 1 390 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 392 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 393 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 394 24 st.u64 [%r35+8],%r24; .loc 1 395 29 mov.u64 call (%value_in),_gfortrani_xmallocarray95 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 397 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 397 11 ld.u32 %r62,[%r61+36; .loc 1 399 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 404 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,3; mov.u64 %r29,%r24; .loc 1 405 10 mov.u64 %r32,0; .loc 1 406 23 mov.u64 %r68,%r32; $L97: st.u64 [%r29],%r68; .loc 1 405 24 add.u64 %r32,%r32,1; .loc 1 405 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L97; $L90: .loc 1 407 maxloc0_16_i2.o/1622802227_gfortran_maxloc0_16_i2 .visible .func _gfortran_maxloc0_16_i2fortran/generated/maxloc0_16_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_16_i2 .visible .func _gfortran_mmaxloc0_16_i2_gfortran_smaxloc0_16_i2 .visible .func _gfortran_smaxloc0_16_i2VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = 
{77,65,88,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_16_i2 .visible .func _gfortran_maxloc0_16_i23264pred %r168; .reg .pred %r169; .reg .u16 %r170; .reg .u16 %r171; .reg .pred.reg .u64 %r180; .reg .predmov.u32 %r116,%ar2; .loc 1 50 10 ld.s8 %r22,[%r115+28]; .loc 1 50 8 cvt.u32.u32 %r117,%r22; cvt.s64.s8 %r71,%r117; .loc 1 51 6 setp.gt.s64 %r118,%r71,0; @ %r118 bra $L2; .loc 1 52 5 cvta.const.u64 %r1111_gfortran_runtime_errorr23,[%r114]; .loc 1 54 6 setp.ne.u64 %r121,%r23,0; @ %r121 bra $L3; .loc 1 56 7 st.u64 [%r114+48],%r23; add.u64 %r123,%r71,-1; st.u64 [%r114+56],%r123; mov.u64 %r124,1; st.u64 [%r114+40],%r124; .loc 1 57 28 cvt.u32.u64 %r125,%r124; st.u8 [%r114+28],%r125; .loc 1 58 24 st.u64 [%r114+8],%r23; .loc 1 59 29 mov.u64 %r128,1671call (%value_in),_gfortrani_xmallocarray131,[%value_in]; } mov.u64 %r23,%r131; .loc 1 59 27 st.u64 [%r114],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r133,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r134,[%r133+36]; setp.eq.u32 %r135,%r134,0; @ %r135 bra $L4; .loc 1 64r115138; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r114]; $L4: .loc 1 68 11 ld.u64 %r72,[%r114+40]; add.u64 %r44,%r115,40; mov.u64 %r35,%frame; add.u64 %r52,%frame,120; .loc 1 70 10 mov.u64 %r100,0; add.u64 %r193,%frame,240; .loc 1 74 16 mov.u64 %r146,%r100; bra $L7; $L22: mov.u64 %r100,%r79; $L7: .loc 1 72 18 ld.u64 %r139,[%r44]; st.u64 [%r35],%r139; .loc 1 73 19 ld.u64 %r141,[%r44+16]; add.u64 %r140,%r141,1; ld.u64 %r142,[%r44+8]; sub.u64 %r32,%r140,%r142; .loc 1 73 17 st.u64 [%r52],%r32; .loc 1 74 16 shl.b64 %r144,%r100,3; add.u64 %r145,%r193,%r144; st.u64 [%r145],%r146; .loc 1 75 10 setp.gt.s64 %r147,%r32,0; @ %r147 bra $L5; shl.b64 %r86,%r72,4; mov.u64 %r88,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 mov.u64 %r148,%r80; $L6: st.u64 [%r88],%r148; st.u64 [%r88+8],%r148; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r88,%r88,%r86; setp.ne.u64 %r150,%r71,%r80; @ %r150 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r100,1; .loc 1 70 3 add.u64 %r44,%r44,24; add.u64 %r35,%r35,8; add.u64 %r52,%r52,8; setp.ne.u64 %r151,%r71,%r79; @ %r151 bra $L22; .loc 1 84 8 ld.u64 %r62,[%r115]; shl.b64 %r38,%r72,4; mov.u64 %r85,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u64 %r152,1; mov.u64 %r153,%r78; $L8: st.u64 [%r85],%r152; st.u64 [%r85+8],%r153; mov.u64 %r33,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r85,%r85,%r38; setp.ne.u64 %r154,%r100,%r33; @ %r154 bra $L8; .loc 1 101 9 setp.ne.u64 %r155,%r62,0; @ ! 
%r155 bra $L1; .loc 1 149 19 ld.u64 %r47,[%frame]; .loc 1 149 9 add.u64 %r49,%r47,%r47; .loc 1 151 34 ld.u64 %r51,[%frame+120]; cvt.u32.u32 %r158,%r22; cvt.s64.s8 %r157,%r158; shl.b64 %r159,%r157,3; add.u64 %r89,%r193,%r159; .loc 1 99 12 mov.u32 %r66,-32768; setp.eq.u32 %r192,%r116,0; add.u64 %r194,%frame,248; .loc 1 157 13 mov.u64 %r197,0; .loc 1 174 32 add.u64 %r198,%frame,120; $L21: .loc 1 151 21 ld.u64 %r107,[%frame+240]; .loc 1 126 12 @ %r192 bra $L23; mov.u64 %r41,%r107; mov.u64 %r60,%r62; $L14: .loc 1 129 12 ld.s16 %r34,[%r60]; .loc 1 129 11 cvt.u16.u32 %r162,%r34; cvt.u16.u32 %r163,%r66; setp.lt.s16 %r164,%r162,%r163; @ %r164 bra $L12; mov.u64 %r27,%r194; mov.u64 %r63,%r23; mov.u64 %r103,%r41; $L13: .loc 1 133 35 add.u64 %r166,%r103,1; st.u64 [%r63],%r166; shr.s64 %r167,%r166,63; st.u64 [%r63+8],%r167; .loc 1 132 10 add.u64 %r63,%r63,%r38; setp.eq.u64 %r168,%r27,%r89; @ %r168 bra $L24; .loc 1 133 31 ld.u64 %r103,[%r27]; add.u64 %r27,%r27,8; bra $L13; $L24: mov.u32 %r66,%r34; $L12: .loc 1 135 12 add.u64 %r60,%r60,%r49; .loc 1 137 17 add.u64 %r41,%r41,1; .loc 1 137 10 st.u64 [%frame+240],%r41; setp.ne.u64 %r169,%r41,%r51; @ %r169 bra $L14; bra $L34; $L23: mov.u64 %r50,%r107; mov.u64 %r61,%r62; $L11: .loc 1 141 11 ld.s16 %r42,[%r61]; .loc 1 141 10 cvt.u16.u32 %r170,%r42; cvt.u16.u32 %r171,%r66; setp.le.s16 %r172,%r170,%r171; @ %r172 bra $L16; mov.u64 %r104,%r194; mov.u64 %r105,%r23; mov.u64 %r84,%r50; $L17: .loc 1 145 35 add.u64 %r174,%r84,1; st.u64 [%r105],%r174; shr.s64 %r175,%r174,63; st.u64 [%r105+8],%r175; .loc 1 144 4 add.u64 %r105,%r105,%r38; setp.eq.u64 %r176,%r89,%r104; @ %r176 bra $L25; .loc 1 145 31 ld.u64 %r84,[%r104]; add.u64 %r104,%r104,8; bra $L17; $L25: mov.u32 %r66,%r42; $L16: .loc 1 149 9 add.u64 %r61,%r61,%r49; .loc 1 151 14 add.u64 %r50,%r50,1; .loc 1 151 7 st.u64 [%frame+240],%r50; setp.ne.u64 %r177,%r50,%r51; @ %r177 bra $L11; .loc 1 149 9 sub.u64 %r178,%r51,%r107; mad.lo.u64 %r62,%r178,%r49,%r62; bra $L18; $L34: .loc 1 135 12 sub.u64 %r180,%r51,%r107; mad.lo.u64 %r62,%r180,%r49,%r62; $L18: mov.u64 %r77,%r194; .loc 1 151 34 mov.u64 %r59,%r51; mov.u64 %r57,%r47; .loc 1 152 9 mov.u64 %r65,0; $L20: .loc 1 157 13 st.u64 [%r77+-8],%r197; .loc 1 160 23 mul.lo.u64 %r54,%r57,%r59; mov.u64 %r101,%r65; .loc 1 161 5 add.u64 %r65,%r65,1; .loc 1 162 7 setp.eq.u64 %r184,%r101,%r100; @ %r184 bra $L1; .loc 1 170 16 ld.u64 %r185,[%r77]; add.u64 %r56,%r185,1; st.u64 [%r77],%r56; shl.b64 %r68,%r65,3; .loc 1 171 23 add.u64 %r186,%frame,%r68; ld.u64 %r57,[%r186]; .loc 1 171 13 sub.u64 %r187,%r57,%r54; add.u64 %r188,%r187,%r187; add.u64 %r62,%r62,%r188; .loc 1 174 32 add.u64 %r190,%r198,%r68; ld.u64 %r59,[%r190]; .loc 1 174 7 add.u64 %r77,%r77,8; setp.eq.u64 %r191,%r56,%r59; @ %r191 bra $L20; bra $L21; $L1: .loc 1 1_gfortran_mmaxloc0_16_i2 .visible .func _gfortran_mmaxloc0_16_i28predpredpredu16predpred %r246; .reg .pred %r247; .reg .u16 %r249; .reg .u32 %r250; .reg .predpredpredu64 %r280; .reg .u64 %r281; .reg .u64predpred %r298; mov.u64 %r163,%ar0; mov.u64 %r164,%ar1; mov.u64 %r165,%ar2; mov.u32 %r166,%ar3; .loc 1 202 6 setp.ne.u64 %r167,%r165,0; @ %r167 bra $L3616r1166; call _gfortran_maxloc0_16_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r164+28]; .loc 1 209 6 setp.gt.s32 %r171,%r22,0; @ %r171 bra $L38; .loc 1 210 5 cvta.const.u64 %r172172_gfortran_runtime_error%r174,[%r163]; setp.ne.u64 %r175,%r174,0; @ %r175 bra $L39; .loc 1 214 7 st.u64 [%r163+48],%r174; add.u32 %r177,%r22,-1; cvt.s64.s32 %r178,%r177; st.u64 [%r163+56],%r178; mov.u64 %r179,1; st.u64 [%r163+40],%r179; 
.loc 1 215 28 cvt.u32.u64 %r180,%r179; st.u8 [%r163+28],%r180; .loc 1 216 24 st.u64 [%r163+8],%r174; .loc 1 217 29 cvt.u32.u32 %r185,%r22; cvt.s64.s8 %r184,%r185; mov.u64 %r183,16call (%value_in),_gfortrani_xmallocarray186,[%value_in]; } .loc 1 217 27 st.u64 [%r163],%r186; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r188,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r189,[%r188+36]; setp.eq.u32 %r190,%r189,0r164193; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r196193; call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r165+16]; .loc 1 233 9 ld.u64 %r90,[%r165]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r198,%r32,-4; and.b32 %r199,%r198,-5; set.u32.eq.u32 %r201,%r199,0; neg.s32 %r202,%r201; .loc 1 235 22 add.u32 %r203,%r32,-1; set.u32.le.u32 %r205,%r203,1; neg.s32 %r206,%r205; .loc 1 235 6 cvt.u16.u32 %r208,%r202; cvt.u16.u32 %r209,%r206; or.b16 %r207,%r208,%r209;41; .loc 1 237 7 setp.ne.u32 %r214,%r32,16; @ %r214 bra $L42; $L41: .loc 1 244 11 ld.u64 %r102,[%r163+40]; .loc 1 245 8 ld.u64 %r103,[%r163]; .loc 1 246 17 cvt.u32.u32 %r215,%r22; cvt.s64.s8 %r65,%r215; add.u64 %r161,%r164,40; add.u64 %r289,%frame,120; mov.u64 %r159,%r289; add.u64 %r158,%r165,40; mov.u64 %r154,%frame; add.u64 %r155,%frame,240; add.u64 %r151,%frame,360; mov.u64 %r150,%r151; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r224,%r29; bra $L43; $L42: .loc 1 242 5_gfortran_runtime_error64: mov.u64 %r29,%r109; $L43: .loc 1 248 18 ld.u64 %r218,[%r161]; st.u64 [%r159],%r218; .loc 1 249 20 ld.u64 %r220,[%r158]; mul.lo.u64 %r219,%r220,%r31; .loc 1 249 18 st.u64 [%r154],%r219; .loc 1 250 19 ld.u64 %r222,[%r161+16]; add.u64 %r221,%r222,1; ld.u64 %r223,[%r161+8]; sub.u64 %r44,%r221,%r223; .loc 1 250 17 st.u64 [%r155],%r44; .loc 1 251 16 st.u64 [%r150],%r224; .loc 1 252 10 setp.gt.s64 %r225,%r44,0; @ %r225 bra $L44; shl.b64 %r144,%r102,4; mov.u64 %r133,%r103; .loc 1 255 11 mov.u64 %r110,0; .loc 1 256 24 mov.u64 %r226,%r110; $L45: st.u64 [%r133],%r226; st.u64 [%r133+8],%r226; .loc 1 255 27 add.u64 %r110,%r110,1; .loc 1 255 4 add.u64 %r133,%r133,%r144; setp.ne.u64 %r228,%r65,%r110; @ %r228 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r109,%r29,1; .loc 1 246 3 add.u64 %r161,%r161,24; add.u64 %r159,%r159,8; add.u64 %r158,%r158,24; add.u64 %r154,%r154,8; add.u64 %r155,%r155,8; add.u64 %r150,%r150,8; setp.ne.u64 %r229,%r65,%r109; @ %r229 bra $L64; .loc 1 261 8 ld.u64 %r88,[%r164]; shl.b64 %r112,%r102,4; mov.u64 %r52,%r103; .loc 1 264 10 mov.u64 %r108,0; .loc 1 265 23 mov.u64 %r230,%r108; $L47: st.u64 [%r52],%r230; st.u64 [%r52+8],%r230; mov.u64 %r140,%r108; .loc 1 264 26 add.u64 %r108,%r108,1; .loc 1 264 3 add.u64 %r52,%r52,%r112; setp.ne.u64 %r232,%r29,%r140; @ %r232 bra $L47; .loc 1 276 9 setp.eq.u64 %r233,%r88,0; @ %r233 bra $L35; .loc 1 331 19 ld.u64 %r68,[%frame+120]; .loc 1 331 9 add.u64 %r70,%r68,%r68; .loc 1 332 20 ld.u64 %r71,[%frame]; .loc 1 334 34 ld.u64 %r74,[%frame+240]; ld.u64 %r160,[%frame+360]; cvt.u32.u32 %r236,%r22; cvt.s64.s8 %r235,%r236; shl.b64 %r237,%r235,3; add.u64 %r122,%r237,%r151; .loc 1 274 12 mov.u32 %r95,-32768; setp.eq.u32 %r288,1,0; add.u64 %r290,%frame,368; .loc 1 340 13 mov.u64 %r295,0; .loc 1 359 32 add.u64 %r296,%frame,240; .loc 1 308 12 setp.eq.u32 %r298,%r166,0; $L63: .loc 1 280 10 @ %r288 bra $L48; mov.u64 %r53,%r160; mov.u32 %r46,0; mov.u32 %r292,1; $L53: mov.u64 %r45,%r88; .loc 1 300 13 add.u64 %r88,%r45,%r70; .loc 1 284 11 ld.s8 %r240,[%r90]; cvt.u16.u32 %r239,%r240; setp.eq.u16 %r241,%r239,0; @ %r241 bra $L49; 
setp.eq.u32 %r242,%r46,0; selp.u64 %r160,%r160,%r53,%r242; .loc 1 294 16 ld.s16 %r95,[%r45]; mov.u64 %r131,%r290; mov.u64 %r145,%r103; mov.u64 %r157,%r160; $L52: .loc 1 296 33 add.u64 %r244,%r157,1; st.u64 [%r145],%r244; shr.s64 %r245,%r244,63; st.u64 [%r145+8],%r245; .loc 1 295 9 add.u64 %r145,%r145,%r112; setp.eq.u64 %r246,%r122,%r131; @ %r246 bra $L65; .loc 1 296 29 ld.u64 %r157,[%r131]; add.u64 %r131,%r131,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r90,%r90,%r71; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r46,%r292; .loc 1 303 4 setp.ne.u64 %r247,%r53,%r74; @ %r247 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r298 bra $L66; mov.u64 %r61,%r160; mov.u64 %r86,%r88; $L58: .loc 1 311 11 ld.s8 %r250,[%r90]; cvt.u16.u32 %r249,%r250; setp.eq.u16 %r251,%r249,0; @ %r251 bra $L56; .loc 1 311 22 ld.s16 %r57,[%r86]; .loc 1 311 19 cvt.u16.u32 %r252,%r57; cvt.u16.u32 %r253,%r95; setp.lt.s16 %r254,%r252,%r253; @ %r254 bra $L56; mov.u64 %r97,%r290; mov.u64 %r107,%r103; mov.u64 %r153,%r61; .loc 1 314 19 mov.u64 %r141,%r295; $L57: .loc 1 315 36 add.u64 %r256,%r153,1; st.u64 [%r107],%r256; shr.s64 %r257,%r256,63; st.u64 [%r107+8],%r257; .loc 1 314 35 add.u64 %r106,%r141,1; .loc 1 314 12 add.u64 %r107,%r107,%r112; setp.le.s64 %r258,%r29,%r141; @ %r258 bra $L67; .loc 1 315 32 ld.u64 %r153,[%r97]; add.u64 %r97,%r97,8; mov.u64 %r141,%r106; bra $L57; $L67: mov.u32 %r95,%r57; $L56: .loc 1 317 13 add.u64 %r86,%r86,%r70; .loc 1 319 11 add.u64 %r61,%r61,1; .loc 1 319 4 st.u64 [%frame+360],%r61; setp.ne.u64 %r259,%r61,%r74; @ %r259 bra $L58; .loc 1 317 13 sub.u64 %r260,%r74,%r160; mad.lo.u64 %r88,%r260,%r70,%r88; bra $L59; $L66: mov.u64 %r73,%r160; mov.u64 %r89,%r90; mov.u64 %r87,%r88; $L55: .loc 1 323 11 ld.s8 %r263,[%r89]; cvt.u16.u32 %r262,%r263; setp.eq.u16 %r264,%r262,0; @ %r264 bra $L60; .loc 1 323 22 ld.s16 %r64,[%r87]; .loc 1 323 19 cvt.u16.u32 %r265,%r64; cvt.u16.u32 %r266,%r95; setp.le.s16 %r267,%r265,%r266; @ %r267 bra $L60; mov.u64 %r62,%r290; mov.u64 %r91,%r103; mov.u64 %r149,%r73; .loc 1 326 12 mov.u64 %r142,%r295; $L61: .loc 1 327 36 add.u64 %r269,%r149,1; st.u64 [%r91],%r269; shr.s64 %r270,%r269,63; st.u64 [%r91+8],%r270; .loc 1 326 28 add.u64 %r105,%r142,1; .loc 1 326 5 add.u64 %r91,%r91,%r112; setp.le.s64 %r271,%r29,%r142; @ %r271 bra $L68; .loc 1 327 32 ld.u64 %r149,[%r62]; add.u64 %r62,%r62,8; mov.u64 %r142,%r105; bra $L61; $L68: mov.u32 %r95,%r64; $L60: .loc 1 331 9 add.u64 %r87,%r87,%r70; .loc 1 332 10 add.u64 %r89,%r89,%r71; .loc 1 334 14 add.u64 %r73,%r73,1; .loc 1 334 7 st.u64 [%frame+360],%r73; setp.ne.u64 %r272,%r73,%r74; @ %r272 bra $L55; sub.u64 %r128,%r74,%r160; .loc 1 331 9 mad.lo.u64 %r88,%r70,%r128,%r88; .loc 1 332 10 mad.lo.u64 %r90,%r71,%r128,%r90; bra $L59; $L89: st.u64 [%frame+360],%r74; $L59: mov.u64 %r120,%r290; mov.u64 %r83,%r71; .loc 1 334 34 mov.u64 %r85,%r74; mov.u64 %r81,%r68; mov.u64 %r119,8; .loc 1 335 9 mov.u64 %r93,0; $L62: .loc 1 340 13 st.u64 [%r120+-8],%r295; .loc 1 343 23 mul.lo.u64 %r76,%r81,%r85; .loc 1 344 24 mul.lo.u64 %r78,%r85,%r83; mov.u64 %r98,%r93; .loc 1 345 5 add.u64 %r93,%r93,1; .loc 1 346 7 setp.eq.u64 %r277,%r29,%r98; @ %r277 bra $L35; .loc 1 354 16 ld.u64 %r278,[%r120]; add.u64 %r80,%r278,1; st.u64 [%r120],%r80; .loc 1 355 23 add.u64 %r280,%r289,%r119; ld.u64 %r81,[%r280]; .loc 1 355 13 sub.u64 %r281,%r81,%r76; add.u64 %r282,%r281,%r281; add.u64 %r88,%r88,%r282; .loc 1 356 24 add.u64 %r283,%frame,%r119; ld.u64 %r83,[%r283]; .loc 1 356 14 sub.u64 %r284,%r83,%r78; add.u64 %r90,%r90,%r284; .loc 1 359 32 add.u64 %r286,%r296,%r119; ld.u64 %r85,[%r286]; 
.loc 1 359 7 add.u64 %r120,%r120,8; add.u64 %r119,%r119,8; setp.eq.u64 %r287,%r80,%r85; @ %r287 bra $L62; ld.u64 %r160,[%frame+360]; bra $L63; $L65: mov.u64 %r88,%r45; setp.eq.u32 %r288,1,1; bra $L48; $L35: .loc 1 362_gfortran_smaxloc0_16_i2 .visible .func _gfortran_smaxloc0_16_i24predpred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 379 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 379 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 381 7 {_gfortran_maxloc0_16_.loc 1 382 7 bra $L90; $L92: .loc 1 385 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 390 15 ld.u64 %r24,[%r35]; .loc 1 390 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 392 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 393 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 394 24 st.u64 [%r35+8],%r24; .loc 1 395 29call (%value_in),_gfortrani_xmallocarray95 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 397 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 397 11 ld.u32 %r62,[%r61+36; .loc 1 399 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 404 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,4; mov.u64 %r29,%r24; .loc 1 405 10 mov.u64 %r32,0; .loc 1 406 23 mov.u64 %r68,%r32; $L97: st.u64 [%r29],%r68; st.u64 [%r29+8],%r68; .loc 1 405 24 add.u64 %r32,%r32,1; .loc 1 405 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r70,%r30,%r32; @ %r70 bra $L97; $L90: .loc 1 407 maxloc0_4_i4.o/ 1622802227_gfortran_maxloc0_4_i4 .visible .func _gfortran_maxloc0_4_i4fortran/generated/maxloc0_4_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_4_i4 .visible .func _gfortran_mmaxloc0_4_i4_gfortran_smaxloc0_4_i4 .visible .func _gfortran_smaxloc0_4_i4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,65,88,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_4_i4 .visible .func _gfortran_maxloc0_4_i4u64 %r87; .reg .u64predpred %r154; .reg .predpredpredmov.u64 %r116,%ar0; mov.u64 %r117,%ar1; mov.u32 %r118,%ar2; .loc 1 50 10 ld.s8 %r22,[%r117+28]; .loc 1 50 8 cvt.u32.u32 %r119,%r22; cvt.s64.s8 %r72,%r119; .loc 1 51 6 setp.gt.s64 %r120,%r72,0; @ %r120 bra $L2; .loc 1 52 5 cvta.const.u64 %r121121_gfortran_runtime_errorr23,[%r116]; .loc 1 54 6 setp.ne.u64 %r123,%r23,0; @ %r123 bra $L3; .loc 1 56 7 st.u64 [%r116+48],%r23; add.u64 %r125,%r72,-1; st.u64 [%r116+56],%r125; mov.u64 %r126,1; st.u64 [%r116+40],%r126; .loc 1 57 28 cvt.u32.u64 %r127,%r126; st.u8 [%r116+28],%r127; .loc 1 58 24 st.u64 [%r116+8],%r23; .loc 1 59 29 mov.u64 %r130,4;r130; call (%value_in),_gfortrani_xmallocarray133,[%value_in]; } mov.u64 %r23,%r133; .loc 1 59 27 st.u64 [%r116],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r135,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r136,[%r135+36]; setp.eq.u32 %r137,%r136,0; @ %r137 bra $L4; .loc 1 64r140; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r116]; $L4: .loc 1 68 11 ld.u64 %r73,[%r116+40]; add.u64 %r39,%r117,40; mov.u64 
%r45,%frame; add.u64 %r35,%frame,120; .loc 1 70 10 mov.u64 %r99,0; add.u64 %r188,%frame,240; .loc 1 74 16 mov.u64 %r148,%r99; bra $L7; $L22: mov.u64 %r99,%r79; $L7: .loc 1 72 18 ld.u64 %r141,[%r39]; st.u64 [%r45],%r141; .loc 1 73 19 ld.u64 %r143,[%r39+16]; add.u64 %r142,%r143,1; ld.u64 %r144,[%r39+8]; sub.u64 %r31,%r142,%r144; .loc 1 73 17 st.u64 [%r35],%r31; .loc 1 74 16 shl.b64 %r146,%r99,3; add.u64 %r147,%r188,%r146; st.u64 [%r147],%r148; .loc 1 75 10 setp.gt.s64 %r149,%r31,0; @ %r149 bra $L5; shl.b64 %r85,%r73,2; mov.u64 %r87,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 cvt.u32.u64 %r150,%r80; $L6: st.u32 [%r87],%r150; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r87,%r87,%r85; setp.ne.u64 %r151,%r72,%r80; @ %r151 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r99,1; .loc 1 70 3 add.u64 %r39,%r39,24; add.u64 %r45,%r45,8; add.u64 %r35,%r35,8; setp.ne.u64 %r152,%r72,%r79; @ %r152 bra $L22; .loc 1 84 8 ld.u64 %r64,[%r117]; shl.b64 %r82,%r73,2; mov.u64 %r92,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u32 %r153,1; $L8: st.u32 [%r92],%r153; mov.u64 %r32,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r92,%r92,%r82; setp.ne.u64 %r154,%r99,%r32; @ %r154 bra $L8; .loc 1 101 9 setp.ne.u64 %r155,%r64,0; @ ! %r155 bra $L1; .loc 1 149 19 ld.u64 %r49,[%frame]; .loc 1 149 9 shl.b64 %r51,%r49,2; .loc 1 151 34 ld.u64 %r53,[%frame+120]; cvt.u32.u32 %r157,%r22; cvt.s64.s8 %r156,%r157; shl.b64 %r158,%r156,3; add.u64 %r102,%r188,%r158; .loc 1 99 12 mov.u32 %r67,-2147483648; setp.eq.u32 %r187,%r118,0; add.u64 %r189,%frame,248; .loc 1 157 13 mov.u64 %r192,0; .loc 1 174 32 add.u64 %r193,%frame,120; $L21: .loc 1 151 21 ld.u64 %r110,[%frame+240]; .loc 1 126 12 @ %r187 bra $L23; mov.u64 %r40,%r110; mov.u64 %r62,%r64; $L14: .loc 1 129 12 ld.u32 %r33,[%r62]; .loc 1 129 11 setp.lt.s32 %r161,%r33,%r67; @ %r161 bra $L12; mov.u64 %r43,%r189; mov.u64 %r65,%r23; mov.u64 %r105,%r40; $L13: .loc 1 133 35 cvt.u32.u64 %r164,%r105; add.u32 %r163,%r164,1; .loc 1 133 24 st.u32 [%r65],%r163; .loc 1 132 10 add.u64 %r65,%r65,%r82; setp.eq.u64 %r165,%r43,%r102; @ %r165 bra $L24; .loc 1 133 31 ld.u64 %r105,[%r43]; add.u64 %r43,%r43,8; bra $L13; $L24: mov.u32 %r67,%r33; $L12: .loc 1 135 12 add.u64 %r62,%r62,%r51; .loc 1 137 17 add.u64 %r40,%r40,1; .loc 1 137 10 st.u64 [%frame+240],%r40; setp.ne.u64 %r166,%r40,%r53; @ %r166 bra $L14; bra $L34; $L23: mov.u64 %r52,%r110; mov.u64 %r63,%r64; $L11: .loc 1 141 11 ld.u32 %r42,[%r63]; .loc 1 141 10 setp.le.s32 %r167,%r42,%r67; @ %r167 bra $L16; mov.u64 %r108,%r189; mov.u64 %r109,%r23; mov.u64 %r44,%r52; $L17: .loc 1 145 35 cvt.u32.u64 %r170,%r44; add.u32 %r169,%r170,1; .loc 1 145 24 st.u32 [%r109],%r169; .loc 1 144 4 add.u64 %r109,%r109,%r82; setp.eq.u64 %r171,%r102,%r108; @ %r171 bra $L25; .loc 1 145 31 ld.u64 %r44,[%r108]; add.u64 %r108,%r108,8; bra $L17; $L25: mov.u32 %r67,%r42; $L16: .loc 1 149 9 add.u64 %r63,%r63,%r51; .loc 1 151 14 add.u64 %r52,%r52,1; .loc 1 151 7 st.u64 [%frame+240],%r52; setp.ne.u64 %r172,%r52,%r53; @ %r172 bra $L11; .loc 1 149 9 sub.u64 %r173,%r53,%r110; mad.lo.u64 %r64,%r173,%r51,%r64; bra $L18; $L34: .loc 1 135 12 sub.u64 %r175,%r53,%r110; mad.lo.u64 %r64,%r175,%r51,%r64; $L18: mov.u64 %r81,%r189; .loc 1 151 34 mov.u64 %r61,%r53; mov.u64 %r59,%r49; .loc 1 152 9 mov.u64 %r66,0; $L20: .loc 1 157 13 st.u64 [%r81+-8],%r192; .loc 1 160 23 mul.lo.u64 %r56,%r59,%r61; mov.u64 %r100,%r66; .loc 1 161 5 add.u64 %r66,%r66,1; .loc 1 162 7 setp.eq.u64 %r179,%r100,%r99; @ %r179 bra $L1; .loc 1 170 16 ld.u64 %r180,[%r81]; add.u64 %r58,%r180,1; 
st.u64 [%r81],%r58; shl.b64 %r70,%r66,3; .loc 1 171 23 add.u64 %r181,%frame,%r70; ld.u64 %r59,[%r181]; .loc 1 171 13 sub.u64 %r182,%r59,%r56; shl.b64 %r183,%r182,2; add.u64 %r64,%r64,%r183; .loc 1 174 32 add.u64 %r185,%r193,%r70; ld.u64 %r61,[%r185]; .loc 1 174 7 add.u64 %r81,%r81,8; setp.eq.u64 %r186,%r58,%r61; @ %r186 bra $L20; bra $L21; $L1: .loc 1 1_gfortran_mmaxloc0_4_i4 .visible .func _gfortran_mmaxloc0_4_i47predpred %r193; .reg .u64 %r196; .reg .u6464pred %r228; .reg .u32 %r229; .reg .predpred %r233; .reg .pred %r234; .reg .u64u16u64u64 %r292; .reg .pred %r294; mov.u64 %r166,%ar0; mov.u64 %r167,%ar1; mov.u64 %r168,%ar2; mov.u32 %r169,%ar3; .loc 1 202 6 setp.ne.u64 %r170,%r168,0; @ %r170 bra $L36gfortran_maxloc0_4_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r167+28]; .loc 1 209 6 setp.gt.s32 %r174,%r22,0; @ %r174 bra $L38; .loc 1 210 5 cvta.const.u64 %r175175_gfortran_runtime_error%r177,[%r166]; setp.ne.u64 %r178,%r177,0; @ %r178 bra $L39; .loc 1 214 7 st.u64 [%r166+48],%r177; add.u32 %r180,%r22,-1; cvt.s64.s32 %r181,%r180; st.u64 [%r166+56],%r181; mov.u64 %r182,1; st.u64 [%r166+40],%r182; .loc 1 215 28 cvt.u32.u64 %r183,%r182; st.u8 [%r166+28],%r183; .loc 1 216 24 st.u64 [%r166+8],%r177; .loc 1 217 29 cvt.u32.u32 %r188,%r22; cvt.s64.s8 %r187,%r188call (%value_in),_gfortrani_xmallocarray189,[%value_in]; } .loc 1 217 27 st.u64 [%r166],%r189; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r191,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r192,[%r191+36];r167196; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r168+16]; .loc 1 233 9 ld.u64 %r93,[%r168]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r201,%r32,-4; and.b32 %r202,%r201,-5; set.u32.eq.u32 %r204,%r202,0; neg.s32 %r205,%r204; .loc 1 235 22 add.u32 %r206,%r32,-1; set.u32.le.u32 %r208,%r206,1; neg.s32 %r209,%r208; .loc 1 235 6 cvt.u16.u32 %r211,%r205; cvt.u16.u32 %r212,%r209; or.b16 %r210,%r211,%r212; cvt.u32.u16 %r213,%r210; cvt.u16.u8 %r214,%r213; setp.ne.u16 %r215,%r214,0; @ %r215 bra $L41; .loc 1 237 7 setp.ne.u32 %r217,%r32,16; @ %r217 bra $L42; $L41: .loc 1 244 11 ld.u64 %r106,[%r166+40]; .loc 1 245 8 ld.u64 %r107,[%r166]; .loc 1 246 17 cvt.u32.u32 %r218,%r22; cvt.s64.s8 %r65,%r218; add.u64 %r52,%r167,40; add.u64 %r286,%frame,120; mov.u64 %r116,%r286; add.u64 %r102,%r168,40; mov.u64 %r165,%frame; add.u64 %r162,%frame,240; add.u64 %r159,%frame,360; mov.u64 %r160,%r159; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r227,%r29; bra $L43; $L42: .loc 1 242 5_gfortran_runtime_error64: mov.u64 %r29,%r113; $L43: .loc 1 248 18 ld.u64 %r221,[%r52]; st.u64 [%r116],%r221; .loc 1 249 20 ld.u64 %r223,[%r102]; mul.lo.u64 %r222,%r223,%r31; .loc 1 249 18 st.u64 [%r165],%r222; .loc 1 250 19 ld.u64 %r225,[%r52+16]; add.u64 %r224,%r225,1; ld.u64 %r226,[%r52+8]; sub.u64 %r44,%r224,%r226; .loc 1 250 17 st.u64 [%r162],%r44; .loc 1 251 16 st.u64 [%r160],%r227; .loc 1 252 10 setp.gt.s64 %r228,%r44,0; @ %r228 bra $L44; shl.b64 %r149,%r106,2; mov.u64 %r139,%r107; .loc 1 255 11 mov.u64 %r114,0; .loc 1 256 24 cvt.u32.u64 %r229,%r114; $L45: st.u32 [%r139],%r229; .loc 1 255 27 add.u64 %r114,%r114,1; .loc 1 255 4 add.u64 %r139,%r139,%r149; setp.ne.u64 %r230,%r65,%r114; @ %r230 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r113,%r29,1; .loc 1 246 3 add.u64 %r52,%r52,24; add.u64 %r116,%r116,8; add.u64 %r102,%r102,24; add.u64 %r165,%r165,8; add.u64 %r162,%r162,8; add.u64 %r160,%r160,8; setp.ne.u64 %r231,%r65,%r113; @ %r231 bra 
$L64; .loc 1 261 8 ld.u64 %r90,[%r167]; shl.b64 %r62,%r106,2; mov.u64 %r91,%r107; .loc 1 264 10 mov.u64 %r112,0; .loc 1 265 23 cvt.u32.u64 %r232,%r112; $L47: st.u32 [%r91],%r232; mov.u64 %r142,%r112; .loc 1 264 26 add.u64 %r112,%r112,1; .loc 1 264 3 add.u64 %r91,%r91,%r62; setp.ne.u64 %r233,%r29,%r142; @ %r233 bra $L47; .loc 1 276 9 setp.eq.u64 %r234,%r90,0; @ %r234 bra $L35; .loc 1 331 19 ld.u64 %r69,[%frame+120]; .loc 1 331 9 shl.b64 %r71,%r69,2; .loc 1 332 20 ld.u64 %r72,[%frame]; .loc 1 334 34 ld.u64 %r75,[%frame+240]; ld.u64 %r163,[%frame+360]; cvt.u32.u32 %r236,%r22; cvt.s64.s8 %r235,%r236; shl.b64 %r237,%r235,3; add.u64 %r127,%r237,%r159; .loc 1 274 12 mov.u32 %r98,-2147483648; setp.eq.u32 %r285,1,0; add.u64 %r284,%frame,368; .loc 1 340 13 mov.u64 %r291,0; .loc 1 359 32 add.u64 %r292,%frame,240; .loc 1 308 12 setp.eq.u32 %r294,%r169,0; $L63: .loc 1 280 10 @ %r285 bra $L48; mov.u64 %r53,%r163; mov.u32 %r46,0; mov.u32 %r288,1; $L53: mov.u64 %r45,%r90; .loc 1 300 13 add.u64 %r90,%r45,%r71; .loc 1 284 11 ld.s8 %r240,[%r93]; cvt.u16.u32 %r239,%r240; setp.eq.u16 %r241,%r239,0; @ %r241 bra $L49; setp.eq.u32 %r242,%r46,0; selp.u64 %r163,%r163,%r53,%r242; .loc 1 294 16 ld.u32 %r98,[%r45]; mov.u64 %r137,%r284; mov.u64 %r150,%r107; mov.u64 %r161,%r163; $L52: .loc 1 296 33 cvt.u32.u64 %r245,%r161; add.u32 %r244,%r245,1; .loc 1 296 22 st.u32 [%r150],%r244; .loc 1 295 9 add.u64 %r150,%r150,%r62; setp.eq.u64 %r246,%r127,%r137; @ %r246 bra $L65; .loc 1 296 29 ld.u64 %r161,[%r137]; add.u64 %r137,%r137,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r93,%r93,%r72; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r46,%r288; .loc 1 303 4 setp.ne.u64 %r247,%r53,%r75; @ %r247 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r294 bra $L66; mov.u64 %r61,%r163; mov.u64 %r88,%r90; $L58: .loc 1 311 11 ld.s8 %r250,[%r93]; cvt.u16.u32 %r249,%r250; setp.eq.u16 %r251,%r249,0; @ %r251 bra $L56; .loc 1 311 22 ld.u32 %r57,[%r88]; .loc 1 311 19 setp.lt.s32 %r252,%r57,%r98; @ %r252 bra $L56; mov.u64 %r117,%r284; mov.u64 %r119,%r107; mov.u64 %r156,%r61; .loc 1 314 19 mov.u64 %r143,%r291; $L57: .loc 1 315 36 cvt.u32.u64 %r255,%r156; add.u32 %r254,%r255,1; .loc 1 315 25 st.u32 [%r119],%r254; .loc 1 314 35 add.u64 %r110,%r143,1; .loc 1 314 12 add.u64 %r119,%r119,%r62; setp.le.s64 %r256,%r29,%r143; @ %r256 bra $L67; .loc 1 315 32 ld.u64 %r156,[%r117]; add.u64 %r117,%r117,8; mov.u64 %r143,%r110; bra $L57; $L67: mov.u32 %r98,%r57; $L56: .loc 1 317 13 add.u64 %r88,%r88,%r71; .loc 1 319 11 add.u64 %r61,%r61,1; .loc 1 319 4 st.u64 [%frame+360],%r61; setp.ne.u64 %r257,%r61,%r75; @ %r257 bra $L58; .loc 1 317 13 sub.u64 %r258,%r75,%r163; mad.lo.u64 %r90,%r258,%r71,%r90; bra $L59; $L66: mov.u64 %r74,%r163; mov.u64 %r92,%r93; mov.u64 %r89,%r90; $L55: .loc 1 323 11 ld.s8 %r261,[%r92]; cvt.u16.u32 %r260,%r261; setp.eq.u16 %r262,%r260,0; @ %r262 bra $L60; .loc 1 323 22 ld.u32 %r64,[%r89]; .loc 1 323 19 setp.le.s32 %r263,%r64,%r98; @ %r263 bra $L60; mov.u64 %r97,%r284; mov.u64 %r100,%r107; mov.u64 %r151,%r74; .loc 1 326 12 mov.u64 %r144,%r291; $L61: .loc 1 327 36 cvt.u32.u64 %r266,%r151; add.u32 %r265,%r266,1; .loc 1 327 25 st.u32 [%r100],%r265; .loc 1 326 28 add.u64 %r109,%r144,1; .loc 1 326 5 add.u64 %r100,%r100,%r62; setp.le.s64 %r267,%r29,%r144; @ %r267 bra $L68; .loc 1 327 32 ld.u64 %r151,[%r97]; add.u64 %r97,%r97,8; mov.u64 %r144,%r109; bra $L61; $L68: mov.u32 %r98,%r64; $L60: .loc 1 331 9 add.u64 %r89,%r89,%r71; .loc 1 332 10 add.u64 %r92,%r92,%r72; .loc 1 334 14 add.u64 %r74,%r74,1; .loc 1 334 7 st.u64 [%frame+360],%r74; setp.ne.u64 
%r268,%r74,%r75; @ %r268 bra $L55; sub.u64 %r134,%r75,%r163; .loc 1 331 9 mad.lo.u64 %r90,%r71,%r134,%r90; .loc 1 332 10 mad.lo.u64 %r93,%r72,%r134,%r93; bra $L59; $L89: st.u64 [%frame+360],%r75; $L59: mov.u64 %r125,%r284; mov.u64 %r84,%r72; .loc 1 334 34 mov.u64 %r86,%r75; mov.u64 %r82,%r69; mov.u64 %r124,8; .loc 1 335 9 mov.u64 %r96,0; $L62: .loc 1 340 13 st.u64 [%r125+-8],%r291; .loc 1 343 23 mul.lo.u64 %r77,%r82,%r86; .loc 1 344 24 mul.lo.u64 %r79,%r86,%r84; mov.u64 %r118,%r96; .loc 1 345 5 add.u64 %r96,%r96,1; .loc 1 346 7 setp.eq.u64 %r273,%r29,%r118; @ %r273 bra $L35; .loc 1 354 16 ld.u64 %r274,[%r125]; add.u64 %r81,%r274,1; st.u64 [%r125],%r81; .loc 1 355 23 add.u64 %r276,%r286,%r124; ld.u64 %r82,[%r276]; .loc 1 355 13 sub.u64 %r277,%r82,%r77; shl.b64 %r278,%r277,2; add.u64 %r90,%r90,%r278; .loc 1 356 24 add.u64 %r279,%frame,%r124; ld.u64 %r84,[%r279]; .loc 1 356 14 sub.u64 %r280,%r84,%r79; add.u64 %r93,%r93,%r280; .loc 1 359 32 add.u64 %r282,%r292,%r124; ld.u64 %r86,[%r282]; .loc 1 359 7 add.u64 %r125,%r125,8; add.u64 %r124,%r124,8; setp.eq.u64 %r283,%r81,%r86; @ %r283 bra $L62; ld.u64 %r163,[%frame+360]; bra $L63; $L65: mov.u64 %r90,%r45; setp.eq.u32 %r285,1,1; bra $L48; $L35: .loc 1 362_gfortran_smaxloc0_4_i4 .visible .func _gfortran_smaxloc0_4_i44predpred.reg .u64u32 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 379 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 379 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 381 7 {_gfortran_maxloc0_4_.loc 1 382 7 bra $L90; $L92: .loc 1 385 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 390 15 ld.u64 %r24,[%r35]; .loc 1 390 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 392 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 393 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 394 24 st.u64 [%r35+8],%r24; .loc 1 395 29 mov.u64 %r56call (%value_in),_gfortrani_xmallocarray95 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 397 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 397 11 ld.u32 %r62,[%r61+36; .loc 1 399 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 404 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,2; mov.u64 %r29,%r24; .loc 1 405 10 mov.u64 %r32,0; .loc 1 406 23 cvt.u32.u64 %r68,%r32; $L97: st.u32 [%r29],%r68; .loc 1 405 24 add.u64 %r32,%r32,1; .loc 1 405 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L97; $L90: .loc 1 407maxloc0_8_i4.o/ 1622802227_gfortran_maxloc0_8_i4 .visible .func _gfortran_maxloc0_8_i4fortran/generated/maxloc0_8_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_8_i4 .visible .func _gfortran_mmaxloc0_8_i4_gfortran_smaxloc0_8_i4 .visible .func _gfortran_smaxloc0_8_i4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,65,88,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_8_i4 .visible .func _gfortran_maxloc0_8_i432predpred %r180; .reg .predmov.u64 %r112,%ar0; mov.u64 %r113,%ar1; mov.u32 %r114,%ar2; 
[Body of _gfortran_maxloc0_8_i4: result allocation via _gfortrani_xmallocarray, the _gfortrani_compile_options bounds check with _gfortrani_bounds_iforeach_return, zero-fill of the result when any extent is empty, then the 32-bit maximum search seeded with -2147483648 and the stride/offset bookkeeping loops that advance through the source descriptor.]

[Body of _gfortran_mmaxloc0_8_i4: the same search restricted by a LOGICAL mask array, with an added _gfortrani_bounds_equal_extents check, a mask element-size test that can call _gfortran_runtime_error, and per-element branches on the mask byte before each comparison.]
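The masked entry points (_gfortran_mmaxloc0_*) run the same search but only over elements whose corresponding LOGICAL mask element is true; if nothing is selected the result stays zero. A rank-1 sketch under the same caveats (illustrative names, no descriptor handling):

#include <stdint.h>
#include <stdio.h>

/* 'mask' plays the role of the LOGICAL array; nonzero means selected. */
static void mmaxloc0_sketch(const int32_t *a, const unsigned char *mask,
                            int64_t n, int64_t *loc)
{
    int32_t best = 0;
    int found = 0;
    *loc = 0;                          /* stays 0 if no mask element is true */
    for (int64_t i = 0; i < n; i++) {
        if (!mask[i])
            continue;                  /* skip unselected elements */
        if (!found || a[i] > best) {   /* first selected element always wins */
            best = a[i];
            *loc = i + 1;
            found = 1;
        }
    }
}

int main(void)
{
    int32_t a[5]       = { 3, 9, 5, 9, 1 };
    unsigned char m[5] = { 1, 0, 1, 1, 1 };
    int64_t loc;
    mmaxloc0_sketch(a, m, 5, &loc);
    printf("masked maxloc = %lld\n", (long long)loc);  /* prints 4 */
    return 0;
}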
[_gfortran_smaxloc0_8_i4: tests the scalar mask pointer and value, then either calls _gfortran_maxloc0_8_i4 or allocates the result and fills it with zeros.]

[Archive member maxloc0_16_i4.o (timestamp 1622802227), PTX generated from fortran/generated/maxloc0_16_i4.c. Entry points: _gfortran_maxloc0_16_i4, _gfortran_mmaxloc0_16_i4, _gfortran_smaxloc0_16_i4; same external symbols and diagnostic strings as the previous member. The body of _gfortran_maxloc0_16_i4 begins here.]
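The s-prefixed entry points take a scalar LOGICAL mask: the code tests the mask pointer and value and either calls the full unmasked routine or zero-fills the result. A sketch of that dispatch, with stand-in helper names of my own:

#include <stdint.h>
#include <stdio.h>

/* Hypothetical stand-in for the descriptor-based unmasked routine. */
static void maxloc0_full(const int32_t *a, int64_t n, int64_t *loc)
{
    int32_t best = 0;
    *loc = 0;
    for (int64_t i = 0; i < n; i++)
        if (*loc == 0 || a[i] > best) { best = a[i]; *loc = i + 1; }
}

/* smaxloc0-style dispatch: an absent or .true. scalar mask enables the
 * search, a .false. mask forces an all-zero result. */
static void smaxloc0_sketch(const int32_t *a, int64_t n,
                            const int32_t *mask, int64_t *loc)
{
    if (mask == NULL || *mask)
        maxloc0_full(a, n, loc);
    else
        *loc = 0;
}

int main(void)
{
    int32_t a[4] = { 2, 8, 8, 1 };
    int32_t f = 0, t = 1;
    int64_t loc;
    smaxloc0_sketch(a, 4, &t, &loc);
    printf("mask=.true.  -> %lld\n", (long long)loc);   /* 2 */
    smaxloc0_sketch(a, 4, &f, &loc);
    printf("mask=.false. -> %lld\n", (long long)loc);   /* 0 */
    return 0;
}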
[Bodies of _gfortran_maxloc0_16_i4 and _gfortran_mmaxloc0_16_i4: the same control flow as the _8_i4 pair, except that every result index is written as a pair of 64-bit stores (a low word followed by a sign word produced with shr.s64 …,63) and the result stride scaling uses shl.b64 …,4, i.e. 16-byte (kind=16) index elements.]
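One detail still legible in the _16_ members: each index is stored as two 64-bit words, the second being the first shifted right arithmetically by 63 bits, so the kind=16 result holds 128-bit integers written as low/sign-word pairs. A small sketch of that widening, using a type name of my own:

#include <stdint.h>
#include <stdio.h>

/* A 128-bit index kept as two 64-bit words, mirroring the paired st.u64
 * stores in the dump (second store = value >> 63). */
typedef struct { int64_t lo; int64_t hi; } i128_words;

static i128_words widen_index(int64_t v)
{
    i128_words w;
    w.lo = v;
    /* Arithmetic shift gives 0 for non-negative values, -1 for negative;
     * indices here are never negative, so the high word is always 0. */
    w.hi = v >> 63;
    return w;
}

int main(void)
{
    i128_words w = widen_index(42);
    printf("lo=%lld hi=%lld\n", (long long)w.lo, (long long)w.hi);
    return 0;
}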
[Remainder of _gfortran_mmaxloc0_16_i4 and the scalar-mask wrapper _gfortran_smaxloc0_16_i4.]

[Archive member maxloc0_4_i8.o (timestamp 1622802228), PTX generated from fortran/generated/maxloc0_4_i8.c. Entry points: _gfortran_maxloc0_4_i8, _gfortran_mmaxloc0_4_i8, _gfortran_smaxloc0_4_i8; same external symbols and diagnostic strings. The body of _gfortran_maxloc0_4_i8 begins here: 64-bit array elements with 32-bit (kind=4) index results stored via st.u32.]
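The remaining members differ only in element type and index kind: the _i8 bodies compare 64-bit elements and seed the running maximum with -9223372036854775808 (INT64_MIN), while the leading _4/_8/_16 selects the width of the stored indices. The generated/ paths suggest these routines come from a single template upstream; a hedged sketch of stamping one search body out per kind pair (the macro below is mine, not libgfortran's):

#include <stdint.h>
#include <stdio.h>

/* One rank-1 search body per (index type, element type, seed) triple. */
#define DEFINE_MAXLOC0(NAME, ITYPE, ATYPE, AMIN)                  \
    static ITYPE NAME(const ATYPE *a, ITYPE n)                    \
    {                                                             \
        ATYPE best = (AMIN);   /* seed visible in the dump */     \
        ITYPE loc = 0;                                            \
        for (ITYPE i = 0; i < n; i++)                             \
            if (a[i] > best) { best = a[i]; loc = i + 1; }        \
        return loc;                                               \
    }

DEFINE_MAXLOC0(maxloc0_8_i4_like, int64_t, int32_t, INT32_MIN)
DEFINE_MAXLOC0(maxloc0_8_i8_like, int64_t, int64_t, INT64_MIN)

int main(void)
{
    int32_t a4[3] = { 5, -1, 7 };
    int64_t a8[3] = { 5,  9, 7 };
    printf("%lld %lld\n",
           (long long)maxloc0_8_i4_like(a4, 3),
           (long long)maxloc0_8_i8_like(a8, 3));   /* prints: 3 2 */
    return 0;
}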
[Bodies of _gfortran_maxloc0_4_i8 and _gfortran_mmaxloc0_4_i8: the familiar allocate/bounds-check prologue and empty-extent zero-fill, then search loops that compare 64-bit elements (setp.lt.s64 / setp.le.s64) against a running maximum seeded to -9223372036854775808 and store 32-bit indices into a result whose stride is scaled by shl.b64 …,2 (4-byte elements). The masked variant adds the mask element-size check and per-element branches on the mask byte, and ends at the declaration of _gfortran_smaxloc0_4_i8.]
[_gfortran_smaxloc0_4_i8: scalar-mask wrapper that calls _gfortran_maxloc0_4_i8 or zero-fills the result.]

[Archive member maxloc0_8_i8.o (timestamp 1622802228), PTX generated from fortran/generated/maxloc0_8_i8.c. Entry points: _gfortran_maxloc0_8_i8, _gfortran_mmaxloc0_8_i8, _gfortran_smaxloc0_8_i8; same external symbols and diagnostic strings. The body of _gfortran_maxloc0_8_i8 begins here.]
[Bodies of _gfortran_maxloc0_8_i8 and _gfortran_mmaxloc0_8_i8: 64-bit element comparisons with the -9223372036854775808 seed and 64-bit (kind=8) index stores; otherwise the same structure, bounds checks, and mask handling as the earlier members.]
[Remainder of _gfortran_mmaxloc0_8_i8 and the wrapper _gfortran_smaxloc0_8_i8.]

[Archive member maxloc0_16_i8.o (timestamp 1622802228), PTX generated from fortran/generated/maxloc0_16_i8.c. Entry points: _gfortran_maxloc0_16_i8, _gfortran_mmaxloc0_16_i8, _gfortran_smaxloc0_16_i8; same external symbols and diagnostic strings. The body of _gfortran_maxloc0_16_i8 begins here.]
bra $L2; .loc 1 52 5 cvta.const.u64 %r1111_gfortran_runtime_errorr23,[%r114]; .loc 1 54 6 setp.ne.u64 %r121,%r23,0; @ %r121 bra $L3; .loc 1 56 7 st.u64 [%r114+48],%r23; add.u64 %r123,%r71,-1; st.u64 [%r114+56],%r123; mov.u64 %r124,1; st.u64 [%r114+40],%r124; .loc 1 57 28 cvt.u32.u64 %r125,%r124; st.u8 [%r114+28],%r125; .loc 1 58 24 st.u64 [%r114+8],%r23; .loc 1 59 29 mov.u64 %r128,1671call (%value_in),_gfortrani_xmallocarray131,[%value_in]; } mov.u64 %r23,%r131; .loc 1 59 27 st.u64 [%r114],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r133,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r134,[%r133+36]; setp.eq.u32 %r135,%r134,0; @ %r135 bra $L4; .loc 1 64r115138; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r114]; $L4: .loc 1 68 11 ld.u64 %r72,[%r114+40]; add.u64 %r44,%r115,40; mov.u64 %r35,%frame; add.u64 %r52,%frame,120; .loc 1 70 10 mov.u64 %r100,0; add.u64 %r188,%frame,240; .loc 1 74 16 mov.u64 %r146,%r100; bra $L7; $L22: mov.u64 %r100,%r79; $L7: .loc 1 72 18 ld.u64 %r139,[%r44]; st.u64 [%r35],%r139; .loc 1 73 19 ld.u64 %r141,[%r44+16]; add.u64 %r140,%r141,1; ld.u64 %r142,[%r44+8]; sub.u64 %r32,%r140,%r142; .loc 1 73 17 st.u64 [%r52],%r32; .loc 1 74 16 shl.b64 %r144,%r100,3; add.u64 %r145,%r188,%r144; st.u64 [%r145],%r146; .loc 1 75 10 setp.gt.s64 %r147,%r32,0; @ %r147 bra $L5; shl.b64 %r86,%r72,4; mov.u64 %r88,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 mov.u64 %r148,%r80; $L6: st.u64 [%r88],%r148; st.u64 [%r88+8],%r148; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r88,%r88,%r86; setp.ne.u64 %r150,%r71,%r80; @ %r150 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r100,1; .loc 1 70 3 add.u64 %r44,%r44,24; add.u64 %r35,%r35,8; add.u64 %r52,%r52,8; setp.ne.u64 %r151,%r71,%r79; @ %r151 bra $L22; .loc 1 84 8 ld.u64 %r62,[%r115]; shl.b64 %r38,%r72,4; mov.u64 %r85,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u64 %r152,1; mov.u64 %r153,%r78; $L8: st.u64 [%r85],%r152; st.u64 [%r85+8],%r153; mov.u64 %r33,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r85,%r85,%r38; setp.ne.u64 %r154,%r100,%r33; @ %r154 bra $L8; .loc 1 101 9 setp.ne.u64 %r155,%r62,0; @ ! 
%r155 bra $L1; .loc 1 149 19 ld.u64 %r47,[%frame]; .loc 1 149 9 shl.b64 %r49,%r47,3; .loc 1 151 34 ld.u64 %r51,[%frame+120]; cvt.u32.u32 %r157,%r22; cvt.s64.s8 %r156,%r157; shl.b64 %r158,%r156,3; add.u64 %r89,%r188,%r158; .loc 1 99 12 mov.u64 %r66,-9223372036854775808; setp.eq.u32 %r187,%r116,0; add.u64 %r189,%frame,248; .loc 1 157 13 mov.u64 %r192,0; .loc 1 174 32 add.u64 %r193,%frame,120; $L21: .loc 1 151 21 ld.u64 %r107,[%frame+240]; .loc 1 126 12 @ %r187 bra $L23; mov.u64 %r41,%r107; mov.u64 %r60,%r62; $L14: .loc 1 129 12 ld.u64 %r34,[%r60]; .loc 1 129 11 setp.lt.s64 %r161,%r34,%r66; @ %r161 bra $L12; mov.u64 %r27,%r189; mov.u64 %r63,%r23; mov.u64 %r103,%r41; $L13: .loc 1 133 35 add.u64 %r163,%r103,1; st.u64 [%r63],%r163; shr.s64 %r164,%r163,63; st.u64 [%r63+8],%r164; .loc 1 132 10 add.u64 %r63,%r63,%r38; setp.eq.u64 %r165,%r27,%r89; @ %r165 bra $L24; .loc 1 133 31 ld.u64 %r103,[%r27]; add.u64 %r27,%r27,8; bra $L13; $L24: mov.u64 %r66,%r34; $L12: .loc 1 135 12 add.u64 %r60,%r60,%r49; .loc 1 137 17 add.u64 %r41,%r41,1; .loc 1 137 10 st.u64 [%frame+240],%r41; setp.ne.u64 %r166,%r41,%r51; @ %r166 bra $L14; bra $L34; $L23: mov.u64 %r50,%r107; mov.u64 %r61,%r62; $L11: .loc 1 141 11 ld.u64 %r42,[%r61]; .loc 1 141 10 setp.le.s64 %r167,%r42,%r66; @ %r167 bra $L16; mov.u64 %r104,%r189; mov.u64 %r105,%r23; mov.u64 %r84,%r50; $L17: .loc 1 145 35 add.u64 %r169,%r84,1; st.u64 [%r105],%r169; shr.s64 %r170,%r169,63; st.u64 [%r105+8],%r170; .loc 1 144 4 add.u64 %r105,%r105,%r38; setp.eq.u64 %r171,%r89,%r104; @ %r171 bra $L25; .loc 1 145 31 ld.u64 %r84,[%r104]; add.u64 %r104,%r104,8; bra $L17; $L25: mov.u64 %r66,%r42; $L16: .loc 1 149 9 add.u64 %r61,%r61,%r49; .loc 1 151 14 add.u64 %r50,%r50,1; .loc 1 151 7 st.u64 [%frame+240],%r50; setp.ne.u64 %r172,%r50,%r51; @ %r172 bra $L11; .loc 1 149 9 sub.u64 %r173,%r51,%r107; mad.lo.u64 %r62,%r173,%r49,%r62; bra $L18; $L34: .loc 1 135 12 sub.u64 %r175,%r51,%r107; mad.lo.u64 %r62,%r175,%r49,%r62; $L18: mov.u64 %r77,%r189; .loc 1 151 34 mov.u64 %r59,%r51; mov.u64 %r57,%r47; .loc 1 152 9 mov.u64 %r65,0; $L20: .loc 1 157 13 st.u64 [%r77+-8],%r192; .loc 1 160 23 mul.lo.u64 %r54,%r57,%r59; mov.u64 %r101,%r65; .loc 1 161 5 add.u64 %r65,%r65,1; .loc 1 162 7 setp.eq.u64 %r179,%r101,%r100; @ %r179 bra $L1; .loc 1 170 16 ld.u64 %r180,[%r77]; add.u64 %r56,%r180,1; st.u64 [%r77],%r56; shl.b64 %r68,%r65,3; .loc 1 171 23 add.u64 %r181,%frame,%r68; ld.u64 %r57,[%r181]; .loc 1 171 13 sub.u64 %r182,%r57,%r54; shl.b64 %r183,%r182,3; add.u64 %r62,%r62,%r183; .loc 1 174 32 add.u64 %r185,%r193,%r68; ld.u64 %r59,[%r185]; .loc 1 174 7 add.u64 %r77,%r77,8; setp.eq.u64 %r186,%r56,%r59; @ %r186 bra $L20; bra $L21; $L1: .loc 1 1_gfortran_mmaxloc0_16_i8 .visible .func _gfortran_mmaxloc0_16_i88predpredpredu64pred %r293; mov.u64 %r163,%ar0; mov.u64 %r164,%ar1; mov.u64 %r165,%ar2; mov.u32 %r166,%ar3; .loc 1 202 6 setp.ne.u64 %r167,%r165,0; @ %r167 bra $L3616r1166; call _gfortran_maxloc0_16_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r164+28]; .loc 1 209 6 setp.gt.s32 %r171,%r22,0; @ %r171 bra $L38; .loc 1 210 5 cvta.const.u64 %r172172_gfortran_runtime_error%r174,[%r163]; setp.ne.u64 %r175,%r174,0; @ %r175 bra $L39; .loc 1 214 7 st.u64 [%r163+48],%r174; add.u32 %r177,%r22,-1; cvt.s64.s32 %r178,%r177; st.u64 [%r163+56],%r178; mov.u64 %r179,1; st.u64 [%r163+40],%r179; .loc 1 215 28 cvt.u32.u64 %r180,%r179; st.u8 [%r163+28],%r180; .loc 1 216 24 st.u64 [%r163+8],%r174; .loc 1 217 29 cvt.u32.u32 %r185,%r22; cvt.s64.s8 %r184,%r185; mov.u64 %r183,16call 
(%value_in),_gfortrani_xmallocarray186,[%value_in]; } .loc 1 217 27 st.u64 [%r163],%r186; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r188,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r189,[%r188+36]; setp.eq.u32 %r190,%r189,0r164193; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r196193; call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r165+16]; .loc 1 233 9 ld.u64 %r90,[%r165]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r198,%r32,-4; and.b32 %r199,%r198,-5; set.u32.eq.u32 %r201,%r199,0; neg.s32 %r202,%r201; .loc 1 235 22 add.u32 %r203,%r32,-1; set.u32.le.u32 %r205,%r203,1; neg.s32 %r206,%r205; .loc 1 235 6 cvt.u16.u32 %r208,%r202; cvt.u16.u32 %r209,%r206; or.b16 %r207,%r208,%r209;41; .loc 1 237 7 setp.ne.u32 %r214,%r32,16; @ %r214 bra $L42; $L41: .loc 1 244 11 ld.u64 %r102,[%r163+40]; .loc 1 245 8 ld.u64 %r103,[%r163]; .loc 1 246 17 cvt.u32.u32 %r215,%r22; cvt.s64.s8 %r65,%r215; add.u64 %r161,%r164,40; add.u64 %r284,%frame,120; mov.u64 %r159,%r284; add.u64 %r158,%r165,40; mov.u64 %r154,%frame; add.u64 %r155,%frame,240; add.u64 %r151,%frame,360; mov.u64 %r150,%r151; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r224,%r29; bra $L43; $L42: .loc 1 242 5_gfortran_runtime_error64: mov.u64 %r29,%r109; $L43: .loc 1 248 18 ld.u64 %r218,[%r161]; st.u64 [%r159],%r218; .loc 1 249 20 ld.u64 %r220,[%r158]; mul.lo.u64 %r219,%r220,%r31; .loc 1 249 18 st.u64 [%r154],%r219; .loc 1 250 19 ld.u64 %r222,[%r161+16]; add.u64 %r221,%r222,1; ld.u64 %r223,[%r161+8]; sub.u64 %r44,%r221,%r223; .loc 1 250 17 st.u64 [%r155],%r44; .loc 1 251 16 st.u64 [%r150],%r224; .loc 1 252 10 setp.gt.s64 %r225,%r44,0; @ %r225 bra $L44; shl.b64 %r144,%r102,4; mov.u64 %r133,%r103; .loc 1 255 11 mov.u64 %r110,0; .loc 1 256 24 mov.u64 %r226,%r110; $L45: st.u64 [%r133],%r226; st.u64 [%r133+8],%r226; .loc 1 255 27 add.u64 %r110,%r110,1; .loc 1 255 4 add.u64 %r133,%r133,%r144; setp.ne.u64 %r228,%r65,%r110; @ %r228 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r109,%r29,1; .loc 1 246 3 add.u64 %r161,%r161,24; add.u64 %r159,%r159,8; add.u64 %r158,%r158,24; add.u64 %r154,%r154,8; add.u64 %r155,%r155,8; add.u64 %r150,%r150,8; setp.ne.u64 %r229,%r65,%r109; @ %r229 bra $L64; .loc 1 261 8 ld.u64 %r88,[%r164]; shl.b64 %r112,%r102,4; mov.u64 %r52,%r103; .loc 1 264 10 mov.u64 %r108,0; .loc 1 265 23 mov.u64 %r230,%r108; $L47: st.u64 [%r52],%r230; st.u64 [%r52+8],%r230; mov.u64 %r140,%r108; .loc 1 264 26 add.u64 %r108,%r108,1; .loc 1 264 3 add.u64 %r52,%r52,%r112; setp.ne.u64 %r232,%r29,%r140; @ %r232 bra $L47; .loc 1 276 9 setp.eq.u64 %r233,%r88,0; @ %r233 bra $L35; .loc 1 331 19 ld.u64 %r68,[%frame+120]; .loc 1 331 9 shl.b64 %r70,%r68,3; .loc 1 332 20 ld.u64 %r71,[%frame]; .loc 1 334 34 ld.u64 %r74,[%frame+240]; ld.u64 %r160,[%frame+360]; cvt.u32.u32 %r235,%r22; cvt.s64.s8 %r234,%r235; shl.b64 %r236,%r234,3; add.u64 %r122,%r236,%r151; .loc 1 274 12 mov.u64 %r95,-9223372036854775808; setp.eq.u32 %r283,1,0; add.u64 %r285,%frame,368; .loc 1 340 13 mov.u64 %r290,0; .loc 1 359 32 add.u64 %r291,%frame,240; .loc 1 308 12 setp.eq.u32 %r293,%r166,0; $L63: .loc 1 280 10 @ %r283 bra $L48; mov.u64 %r53,%r160; mov.u32 %r46,0; mov.u32 %r287,1; $L53: mov.u64 %r45,%r88; .loc 1 300 13 add.u64 %r88,%r45,%r70; .loc 1 284 11 ld.s8 %r239,[%r90]; cvt.u16.u32 %r238,%r239; setp.eq.u16 %r240,%r238,0; @ %r240 bra $L49; setp.eq.u32 %r241,%r46,0; selp.u64 %r160,%r160,%r53,%r241; .loc 1 294 16 ld.u64 %r95,[%r45]; mov.u64 %r131,%r285; mov.u64 %r145,%r103; mov.u64 %r157,%r160; $L52: .loc 1 296 33 
add.u64 %r243,%r157,1; st.u64 [%r145],%r243; shr.s64 %r244,%r243,63; st.u64 [%r145+8],%r244; .loc 1 295 9 add.u64 %r145,%r145,%r112; setp.eq.u64 %r245,%r122,%r131; @ %r245 bra $L65; .loc 1 296 29 ld.u64 %r157,[%r131]; add.u64 %r131,%r131,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r90,%r90,%r71; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r46,%r287; .loc 1 303 4 setp.ne.u64 %r246,%r53,%r74; @ %r246 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r293 bra $L66; mov.u64 %r61,%r160; mov.u64 %r86,%r88; $L58: .loc 1 311 11 ld.s8 %r249,[%r90]; cvt.u16.u32 %r248,%r249; setp.eq.u16 %r250,%r248,0; @ %r250 bra $L56; .loc 1 311 22 ld.u64 %r57,[%r86]; .loc 1 311 19 setp.lt.s64 %r251,%r57,%r95; @ %r251 bra $L56; mov.u64 %r97,%r285; mov.u64 %r107,%r103; mov.u64 %r153,%r61; .loc 1 314 19 mov.u64 %r141,%r290; $L57: .loc 1 315 36 add.u64 %r253,%r153,1; st.u64 [%r107],%r253; shr.s64 %r254,%r253,63; st.u64 [%r107+8],%r254; .loc 1 314 35 add.u64 %r106,%r141,1; .loc 1 314 12 add.u64 %r107,%r107,%r112; setp.le.s64 %r255,%r29,%r141; @ %r255 bra $L67; .loc 1 315 32 ld.u64 %r153,[%r97]; add.u64 %r97,%r97,8; mov.u64 %r141,%r106; bra $L57; $L67: mov.u64 %r95,%r57; $L56: .loc 1 317 13 add.u64 %r86,%r86,%r70; .loc 1 319 11 add.u64 %r61,%r61,1; .loc 1 319 4 st.u64 [%frame+360],%r61; setp.ne.u64 %r256,%r61,%r74; @ %r256 bra $L58; .loc 1 317 13 sub.u64 %r257,%r74,%r160; mad.lo.u64 %r88,%r257,%r70,%r88; bra $L59; $L66: mov.u64 %r73,%r160; mov.u64 %r89,%r90; mov.u64 %r87,%r88; $L55: .loc 1 323 11 ld.s8 %r260,[%r89]; cvt.u16.u32 %r259,%r260; setp.eq.u16 %r261,%r259,0; @ %r261 bra $L60; .loc 1 323 22 ld.u64 %r64,[%r87]; .loc 1 323 19 setp.le.s64 %r262,%r64,%r95; @ %r262 bra $L60; mov.u64 %r62,%r285; mov.u64 %r91,%r103; mov.u64 %r149,%r73; .loc 1 326 12 mov.u64 %r142,%r290; $L61: .loc 1 327 36 add.u64 %r264,%r149,1; st.u64 [%r91],%r264; shr.s64 %r265,%r264,63; st.u64 [%r91+8],%r265; .loc 1 326 28 add.u64 %r105,%r142,1; .loc 1 326 5 add.u64 %r91,%r91,%r112; setp.le.s64 %r266,%r29,%r142; @ %r266 bra $L68; .loc 1 327 32 ld.u64 %r149,[%r62]; add.u64 %r62,%r62,8; mov.u64 %r142,%r105; bra $L61; $L68: mov.u64 %r95,%r64; $L60: .loc 1 331 9 add.u64 %r87,%r87,%r70; .loc 1 332 10 add.u64 %r89,%r89,%r71; .loc 1 334 14 add.u64 %r73,%r73,1; .loc 1 334 7 st.u64 [%frame+360],%r73; setp.ne.u64 %r267,%r73,%r74; @ %r267 bra $L55; sub.u64 %r128,%r74,%r160; .loc 1 331 9 mad.lo.u64 %r88,%r70,%r128,%r88; .loc 1 332 10 mad.lo.u64 %r90,%r71,%r128,%r90; bra $L59; $L89: st.u64 [%frame+360],%r74; $L59: mov.u64 %r120,%r285; mov.u64 %r83,%r71; .loc 1 334 34 mov.u64 %r85,%r74; mov.u64 %r81,%r68; mov.u64 %r119,8; .loc 1 335 9 mov.u64 %r93,0; $L62: .loc 1 340 13 st.u64 [%r120+-8],%r290; .loc 1 343 23 mul.lo.u64 %r76,%r81,%r85; .loc 1 344 24 mul.lo.u64 %r78,%r85,%r83; mov.u64 %r98,%r93; .loc 1 345 5 add.u64 %r93,%r93,1; .loc 1 346 7 setp.eq.u64 %r272,%r29,%r98; @ %r272 bra $L35; .loc 1 354 16 ld.u64 %r273,[%r120]; add.u64 %r80,%r273,1; st.u64 [%r120],%r80; .loc 1 355 23 add.u64 %r275,%r284,%r119; ld.u64 %r81,[%r275]; .loc 1 355 13 sub.u64 %r276,%r81,%r76; shl.b64 %r277,%r276,3; add.u64 %r88,%r88,%r277; .loc 1 356 24 add.u64 %r278,%frame,%r119; ld.u64 %r83,[%r278]; .loc 1 356 14 sub.u64 %r279,%r83,%r78; add.u64 %r90,%r90,%r279; .loc 1 359 32 add.u64 %r281,%r291,%r119; ld.u64 %r85,[%r281]; .loc 1 359 7 add.u64 %r120,%r120,8; add.u64 %r119,%r119,8; setp.eq.u64 %r282,%r80,%r85; @ %r282 bra $L62; ld.u64 %r160,[%frame+360]; bra $L63; $L65: mov.u64 %r88,%r45; setp.eq.u32 %r283,1,1; bra $L48; $L35: .loc 1 362_gfortran_smaxloc0_16_i8 .visible .func 
_gfortran_smaxloc0_16_i84predpred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 379 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 379 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 381 7 {_gfortran_maxloc0_16_.loc 1 382 7 bra $L90; $L92: .loc 1 385 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 390 15 ld.u64 %r24,[%r35]; .loc 1 390 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 392 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 393 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 394 24 st.u64 [%r35+8],%r24; .loc 1 395 29call (%value_in),_gfortrani_xmallocarray95 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 397 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 397 11 ld.u32 %r62,[%r61+36; .loc 1 399 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 404 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,4; mov.u64 %r29,%r24; .loc 1 405 10 mov.u64 %r32,0; .loc 1 406 23 mov.u64 %r68,%r32; $L97: st.u64 [%r29],%r68; st.u64 [%r29+8],%r68; .loc 1 405 24 add.u64 %r32,%r32,1; .loc 1 405 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r70,%r30,%r32; @ %r70 bra $L97; $L90: .loc 1 407 maxloc0_4_i16.o/1622802228_gfortran_maxloc0_4_i16 .visible .func _gfortran_maxloc0_4_i16fortran/generated/maxloc0_4_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_4_i16 .visible .func _gfortran_mmaxloc0_4_i16_gfortran_smaxloc0_4_i16 .visible .func _gfortran_smaxloc0_4_i16VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,65,88,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_4_i16 .visible .func _gfortran_maxloc0_4_i16predpred %r154; .reg .predpred %r163; .reg .pred %r166; .reg .predpredpredpredu64 %r191; .reg .predpred64u64 %r211; .reg .u64 %r214; .reg .u64 %r215; mov.u64 %r116,%ar0; mov.u64 %r117,%ar1; mov.u32 %r118,%ar2; .loc 1 50 10 ld.s8 %r22,[%r117+28]; .loc 1 50 8 cvt.u32.u32 %r119,%r22; cvt.s64.s8 %r72,%r119; .loc 1 51 6 setp.gt.s64 %r120,%r72,0; @ %r120 bra $L2; .loc 1 52 5 cvta.const.u64 %r121121_gfortran_runtime_errorr23,[%r116]; .loc 1 54 6 setp.ne.u64 %r123,%r23,0; @ %r123 bra $L3; .loc 1 56 7 st.u64 [%r116+48],%r23; add.u64 %r125,%r72,-1; st.u64 [%r116+56],%r125; mov.u64 %r126,1; st.u64 [%r116+40],%r126; .loc 1 57 28 cvt.u32.u64 %r127,%r126; st.u8 [%r116+28],%r127; .loc 1 58 24 st.u64 [%r116+8],%r23; .loc 1 59 29 mov.u64 %r130,4;r130; call (%value_in),_gfortrani_xmallocarray133,[%value_in]; } mov.u64 %r23,%r133; .loc 1 59 27 st.u64 [%r116],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r135,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r136,[%r135+36]; setp.eq.u32 %r137,%r136,0; @ %r137 bra $L4; .loc 1 64r140; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r116]; $L4: .loc 1 68 11 ld.u64 %r73,[%r116+40]; add.u64 %r39,%r117,40; mov.u64 %r45,%frame; add.u64 %r35,%frame,120; .loc 1 70 10 mov.u64 %r99,0; add.u64 %r210,%frame,240; .loc 1 74 16 mov.u64 %r148,%r99; bra $L7; $L24: mov.u64 
%r99,%r79; $L7: .loc 1 72 18 ld.u64 %r141,[%r39]; st.u64 [%r45],%r141; .loc 1 73 19 ld.u64 %r143,[%r39+16]; add.u64 %r142,%r143,1; ld.u64 %r144,[%r39+8]; sub.u64 %r31,%r142,%r144; .loc 1 73 17 st.u64 [%r35],%r31; .loc 1 74 16 shl.b64 %r146,%r99,3; add.u64 %r147,%r210,%r146; st.u64 [%r147],%r148; .loc 1 75 10 setp.gt.s64 %r149,%r31,0; @ %r149 bra $L5; shl.b64 %r85,%r73,2; mov.u64 %r87,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 cvt.u32.u64 %r150,%r80; $L6: st.u32 [%r87],%r150; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r87,%r87,%r85; setp.ne.u64 %r151,%r72,%r80; @ %r151 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r99,1; .loc 1 70 3 add.u64 %r39,%r39,24; add.u64 %r45,%r45,8; add.u64 %r35,%r35,8; setp.ne.u64 %r152,%r72,%r79; @ %r152 bra $L24; .loc 1 84 8 ld.u64 %r64,[%r117]; shl.b64 %r82,%r73,2; mov.u64 %r92,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u32 %r153,1; $L8: st.u32 [%r92],%r153; mov.u64 %r32,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r92,%r92,%r82; setp.ne.u64 %r154,%r99,%r32; @ %r154 bra $L8; .loc 1 101 9 setp.ne.u64 %r155,%r64,0; @ ! %r155 bra $L1; .loc 1 149 19 ld.u64 %r49,[%frame]; .loc 1 149 9 shl.b64 %r51,%r49,4; .loc 1 151 34 ld.u64 %r53,[%frame+120]; cvt.u32.u32 %r157,%r22; cvt.s64.s8 %r156,%r157; shl.b64 %r158,%r156,3; add.u64 %r102,%r210,%r158; .loc 1 99 12 mov.u64 %r207,0; mov.u64 %r208,-9223372036854775808; setp.eq.u32 %r209,%r118,0; add.u64 %r211,%frame,248; .loc 1 157 13 mov.u64 %r214,%r207; .loc 1 174 32 add.u64 %r215,%frame,120; $L23: .loc 1 151 21 ld.u64 %r110,[%frame+240]; .loc 1 126 12 @ %r209 bra $L25; mov.u64 %r40,%r110; mov.u64 %r62,%r64; $L15: .loc 1 129 12 ld.u64 %r203,[%r62]; ld.u64 %r204,[%r62+8]; .loc 1 129 11 setp.gt.s64 %r163,%r208,%r204; @ %r163 bra $L12; setp.ne.u64 %r166,%r208,%r204; @ %r166 bra $L28; setp.gt.u64 %r169,%r207,%r203; @ %r169 bra $L12; $L28: mov.u64 %r43,%r211; mov.u64 %r65,%r23; mov.u64 %r105,%r40; $L14: .loc 1 133 35 cvt.u32.u64 %r172,%r105; add.u32 %r171,%r172,1; .loc 1 133 24 st.u32 [%r65],%r171; .loc 1 132 10 add.u64 %r65,%r65,%r82; setp.eq.u64 %r173,%r43,%r102; @ %r173 bra $L26; .loc 1 133 31 ld.u64 %r105,[%r43]; add.u64 %r43,%r43,8; bra $L14; $L26: mov.u64 %r207,%r203; mov.u64 %r208,%r204; $L12: .loc 1 135 12 add.u64 %r62,%r62,%r51; .loc 1 137 17 add.u64 %r40,%r40,1; .loc 1 137 10 st.u64 [%frame+240],%r40; setp.ne.u64 %r174,%r40,%r53; @ %r174 bra $L15; bra $L38; $L25: mov.u64 %r52,%r110; mov.u64 %r63,%r64; $L11: .loc 1 141 11 ld.u64 %r205,[%r63]; ld.u64 %r206,[%r63+8]; .loc 1 141 10 setp.gt.s64 %r177,%r206,%r208; @ %r177 bra $L29; setp.ne.u64 %r180,%r206,%r208; @ %r180 bra $L17; setp.gt.u64 %r183,%r205,%r207; @ ! 
%r183 bra $L17; $L29: mov.u64 %r108,%r211; mov.u64 %r109,%r23; mov.u64 %r44,%r52; $L19: .loc 1 145 35 cvt.u32.u64 %r186,%r44; add.u32 %r185,%r186,1; .loc 1 145 24 st.u32 [%r109],%r185; .loc 1 144 4 add.u64 %r109,%r109,%r82; setp.eq.u64 %r187,%r102,%r108; @ %r187 bra $L27; .loc 1 145 31 ld.u64 %r44,[%r108]; add.u64 %r108,%r108,8; bra $L19; $L27: mov.u64 %r207,%r205; mov.u64 %r208,%r206; $L17: .loc 1 149 9 add.u64 %r63,%r63,%r51; .loc 1 151 14 add.u64 %r52,%r52,1; .loc 1 151 7 st.u64 [%frame+240],%r52; setp.ne.u64 %r188,%r52,%r53; @ %r188 bra $L11; .loc 1 149 9 sub.u64 %r189,%r53,%r110; mad.lo.u64 %r64,%r189,%r51,%r64; bra $L20; $L38: .loc 1 135 12 sub.u64 %r191,%r53,%r110; mad.lo.u64 %r64,%r191,%r51,%r64; $L20: mov.u64 %r81,%r211; .loc 1 151 34 mov.u64 %r61,%r53; mov.u64 %r59,%r49; .loc 1 152 9 mov.u64 %r66,0; $L22: .loc 1 157 13 st.u64 [%r81+-8],%r214; .loc 1 160 23 mul.lo.u64 %r56,%r59,%r61; mov.u64 %r100,%r66; .loc 1 161 5 add.u64 %r66,%r66,1; .loc 1 162 7 setp.eq.u64 %r195,%r100,%r99; @ %r195 bra $L1; .loc 1 170 16 ld.u64 %r196,[%r81]; add.u64 %r58,%r196,1; st.u64 [%r81],%r58; shl.b64 %r70,%r66,3; .loc 1 171 23 add.u64 %r197,%frame,%r70; ld.u64 %r59,[%r197]; .loc 1 171 13 sub.u64 %r198,%r59,%r56; shl.b64 %r199,%r198,4; add.u64 %r64,%r64,%r199; .loc 1 174 32 add.u64 %r201,%r215,%r70; ld.u64 %r61,[%r201]; .loc 1 174 7 add.u64 %r81,%r81,8; setp.eq.u64 %r202,%r58,%r61; @ %r202 bra $L22; bra $L23; $L1: .loc 1 1_gfortran_mmaxloc0_4_i16 .visible .func _gfortran_mmaxloc0_4_i167predpred %r193; .reg .u64 %r196; .reg .u6464pred %r228; .reg .u32 %r229; .reg .predpred %r233; .reg .pred %r234; .reg .u64u16predpred %r265; .reg .u64 %r266; .reg .u16 %r268; .reg .u32 %r269; .reg .pred %r270; .reg .predpredu64 %r313; .reg .u64 %r314; .reg .pred %r316; mov.u64 %r166,%ar0; mov.u64 %r167,%ar1; mov.u64 %r168,%ar2; mov.u32 %r169,%ar3; .loc 1 202 6 setp.ne.u64 %r170,%r168,0; @ %r170 bra $L40gfortran_maxloc0_4_i16.loc 1 205 7 bra $L39; $L40: .loc 1 208 10 ld.s8 %r22,[%r167+28]; .loc 1 209 6 setp.gt.s32 %r174,%r22,0; @ %r174 bra $L42; .loc 1 210 5 cvta.const.u64 %r175175_gfortran_runtime_error42: .loc 1 212 6 ld.u64 %r177,[%r166]; setp.ne.u64 %r178,%r177,0; @ %r178 bra $L43; .loc 1 214 7 st.u64 [%r166+48],%r177; add.u32 %r180,%r22,-1; cvt.s64.s32 %r181,%r180; st.u64 [%r166+56],%r181; mov.u64 %r182,1; st.u64 [%r166+40],%r182; .loc 1 215 28 cvt.u32.u64 %r183,%r182; st.u8 [%r166+28],%r183; .loc 1 216 24 st.u64 [%r166+8],%r177; .loc 1 217 29 cvt.u32.u32 %r188,%r22; cvt.s64.s8 %r187,%r188call (%value_in),_gfortrani_xmallocarray189,[%value_in]; } .loc 1 217 27 st.u64 [%r166],%r189; bra $L44; $L43: .loc 1 221 11 cvta.global.u64 %r191,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r192,[%r191+36];r167196; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19call _gfortrani_bounds_equal_extents$L44: .loc 1 231 15 ld.u64 %r31,[%r168+16]; .loc 1 233 9 ld.u64 %r93,[%r168]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r201,%r32,-4; and.b32 %r202,%r201,-5; set.u32.eq.u32 %r204,%r202,0; neg.s32 %r205,%r204; .loc 1 235 22 add.u32 %r206,%r32,-1; set.u32.le.u32 %r208,%r206,1; neg.s32 %r209,%r208; .loc 1 235 6 cvt.u16.u32 %r211,%r205; cvt.u16.u32 %r212,%r209; or.b16 %r210,%r211,%r212; cvt.u32.u16 %r213,%r210; cvt.u16.u8 %r214,%r213; setp.ne.u16 %r215,%r214,0; @ %r215 bra $L45; .loc 1 237 7 setp.ne.u32 %r217,%r32,16; @ %r217 bra $L46; $L45: .loc 1 244 11 ld.u64 %r106,[%r166+40]; .loc 1 245 8 ld.u64 %r107,[%r166]; .loc 1 246 17 cvt.u32.u32 %r218,%r22; cvt.s64.s8 %r65,%r218; add.u64 %r52,%r167,40; 
add.u64 %r307,%frame,120; mov.u64 %r116,%r307; add.u64 %r102,%r168,40; mov.u64 %r165,%frame; add.u64 %r162,%frame,240; add.u64 %r159,%frame,360; mov.u64 %r160,%r159; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r227,%r29; bra $L47; $L46: .loc 1 242 5_gfortran_runtime_error70: mov.u64 %r29,%r113; $L47: .loc 1 248 18 ld.u64 %r221,[%r52]; st.u64 [%r116],%r221; .loc 1 249 20 ld.u64 %r223,[%r102]; mul.lo.u64 %r222,%r223,%r31; .loc 1 249 18 st.u64 [%r165],%r222; .loc 1 250 19 ld.u64 %r225,[%r52+16]; add.u64 %r224,%r225,1; ld.u64 %r226,[%r52+8]; sub.u64 %r44,%r224,%r226; .loc 1 250 17 st.u64 [%r162],%r44; .loc 1 251 16 st.u64 [%r160],%r227; .loc 1 252 10 setp.gt.s64 %r228,%r44,0; @ %r228 bra $L48; shl.b64 %r149,%r106,2; mov.u64 %r139,%r107; .loc 1 255 11 mov.u64 %r114,0; .loc 1 256 24 cvt.u32.u64 %r229,%r114; $L49: st.u32 [%r139],%r229; .loc 1 255 27 add.u64 %r114,%r114,1; .loc 1 255 4 add.u64 %r139,%r139,%r149; setp.ne.u64 %r230,%r65,%r114; @ %r230 bra $L49; bra $L39; $L48: .loc 1 246 26 add.u64 %r113,%r29,1; .loc 1 246 3 add.u64 %r52,%r52,24; add.u64 %r116,%r116,8; add.u64 %r102,%r102,24; add.u64 %r165,%r165,8; add.u64 %r162,%r162,8; add.u64 %r160,%r160,8; setp.ne.u64 %r231,%r65,%r113; @ %r231 bra $L70; .loc 1 261 8 ld.u64 %r90,[%r167]; shl.b64 %r62,%r106,2; mov.u64 %r91,%r107; .loc 1 264 10 mov.u64 %r112,0; .loc 1 265 23 cvt.u32.u64 %r232,%r112; $L51: st.u32 [%r91],%r232; mov.u64 %r142,%r112; .loc 1 264 26 add.u64 %r112,%r112,1; .loc 1 264 3 add.u64 %r91,%r91,%r62; setp.ne.u64 %r233,%r29,%r142; @ %r233 bra $L51; .loc 1 276 9 setp.eq.u64 %r234,%r90,0; @ %r234 bra $L39; .loc 1 331 19 ld.u64 %r69,[%frame+120]; .loc 1 331 9 shl.b64 %r71,%r69,4; .loc 1 332 20 ld.u64 %r72,[%frame]; .loc 1 334 34 ld.u64 %r75,[%frame+240]; ld.u64 %r163,[%frame+360]; cvt.u32.u32 %r236,%r22; cvt.s64.s8 %r235,%r236; shl.b64 %r237,%r235,3; add.u64 %r127,%r237,%r159; .loc 1 274 12 mov.u64 %r304,0; mov.u64 %r305,-9223372036854775808; setp.eq.u32 %r306,1,0; add.u64 %r308,%frame,368; .loc 1 340 13 mov.u64 %r313,%r304; .loc 1 359 32 add.u64 %r314,%frame,240; .loc 1 308 12 setp.eq.u32 %r316,%r169,0; $L69: .loc 1 280 10 @ %r306 bra $L52; mov.u64 %r53,%r163; mov.u32 %r46,0; mov.u32 %r310,1; $L57: mov.u64 %r45,%r90; .loc 1 300 13 add.u64 %r90,%r45,%r71; .loc 1 284 11 ld.s8 %r240,[%r93]; cvt.u16.u32 %r239,%r240; setp.eq.u16 %r241,%r239,0; @ %r241 bra $L53; setp.eq.u32 %r242,%r46,0; selp.u64 %r163,%r163,%r53,%r242; .loc 1 294 16 ld.u64 %r304,[%r45]; ld.u64 %r305,[%r45+8]; mov.u64 %r137,%r308; mov.u64 %r150,%r107; mov.u64 %r161,%r163; $L56: .loc 1 296 33 cvt.u32.u64 %r245,%r161; add.u32 %r244,%r245,1; .loc 1 296 22 st.u32 [%r150],%r244; .loc 1 295 9 add.u64 %r150,%r150,%r62; setp.eq.u64 %r246,%r127,%r137; @ %r246 bra $L71; .loc 1 296 29 ld.u64 %r161,[%r137]; add.u64 %r137,%r137,8; bra $L56; $L53: .loc 1 301 14 add.u64 %r93,%r93,%r72; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r46,%r310; .loc 1 303 4 setp.ne.u64 %r247,%r53,%r75; @ %r247 bra $L57; bra $L97; $L52: .loc 1 308 12 @ %r316 bra $L72; mov.u64 %r61,%r163; mov.u64 %r88,%r90; $L63: .loc 1 311 11 ld.s8 %r250,[%r93]; cvt.u16.u32 %r249,%r250; setp.eq.u16 %r251,%r249,0; @ %r251 bra $L60; .loc 1 311 22 ld.u64 %r300,[%r88]; ld.u64 %r301,[%r88+8]; .loc 1 311 19 setp.gt.s64 %r254,%r305,%r301; @ %r254 bra $L60; setp.ne.u64 %r257,%r305,%r301; @ %r257 bra $L75; setp.gt.u64 %r260,%r304,%r300; @ %r260 bra $L60; $L75: mov.u64 %r117,%r308; mov.u64 %r119,%r107; mov.u64 %r156,%r61; .loc 1 314 19 mov.u64 %r143,%r313; $L62: .loc 1 315 36 cvt.u32.u64 %r263,%r156; add.u32 %r262,%r263,1; 
.loc 1 315 25 st.u32 [%r119],%r262; .loc 1 314 35 add.u64 %r110,%r143,1; .loc 1 314 12 add.u64 %r119,%r119,%r62; setp.le.s64 %r264,%r29,%r143; @ %r264 bra $L73; .loc 1 315 32 ld.u64 %r156,[%r117]; add.u64 %r117,%r117,8; mov.u64 %r143,%r110; bra $L62; $L73: mov.u64 %r304,%r300; mov.u64 %r305,%r301; $L60: .loc 1 317 13 add.u64 %r88,%r88,%r71; .loc 1 319 11 add.u64 %r61,%r61,1; .loc 1 319 4 st.u64 [%frame+360],%r61; setp.ne.u64 %r265,%r61,%r75; @ %r265 bra $L63; .loc 1 317 13 sub.u64 %r266,%r75,%r163; mad.lo.u64 %r90,%r266,%r71,%r90; bra $L64; $L72: mov.u64 %r74,%r163; mov.u64 %r92,%r93; mov.u64 %r89,%r90; $L59: .loc 1 323 11 ld.s8 %r269,[%r92]; cvt.u16.u32 %r268,%r269; setp.eq.u16 %r270,%r268,0; @ %r270 bra $L65; .loc 1 323 22 ld.u64 %r302,[%r89]; ld.u64 %r303,[%r89+8]; .loc 1 323 19 setp.gt.s64 %r273,%r303,%r305; @ %r273 bra $L76; setp.ne.u64 %r276,%r303,%r305; @ %r276 bra $L65; setp.gt.u64 %r279,%r302,%r304; @ ! %r279 bra $L65; $L76: mov.u64 %r97,%r308; mov.u64 %r100,%r107; mov.u64 %r151,%r74; .loc 1 326 12 mov.u64 %r144,%r313; $L67: .loc 1 327 36 cvt.u32.u64 %r282,%r151; add.u32 %r281,%r282,1; .loc 1 327 25 st.u32 [%r100],%r281; .loc 1 326 28 add.u64 %r109,%r144,1; .loc 1 326 5 add.u64 %r100,%r100,%r62; setp.le.s64 %r283,%r29,%r144; @ %r283 bra $L74; .loc 1 327 32 ld.u64 %r151,[%r97]; add.u64 %r97,%r97,8; mov.u64 %r144,%r109; bra $L67; $L74: mov.u64 %r304,%r302; mov.u64 %r305,%r303; $L65: .loc 1 331 9 add.u64 %r89,%r89,%r71; .loc 1 332 10 add.u64 %r92,%r92,%r72; .loc 1 334 14 add.u64 %r74,%r74,1; .loc 1 334 7 st.u64 [%frame+360],%r74; setp.ne.u64 %r284,%r74,%r75; @ %r284 bra $L59; sub.u64 %r134,%r75,%r163; .loc 1 331 9 mad.lo.u64 %r90,%r71,%r134,%r90; .loc 1 332 10 mad.lo.u64 %r93,%r72,%r134,%r93; bra $L64; $L97: st.u64 [%frame+360],%r75; $L64: mov.u64 %r125,%r308; mov.u64 %r84,%r72; .loc 1 334 34 mov.u64 %r86,%r75; mov.u64 %r82,%r69; mov.u64 %r124,8; .loc 1 335 9 mov.u64 %r96,0; $L68: .loc 1 340 13 st.u64 [%r125+-8],%r313; .loc 1 343 23 mul.lo.u64 %r77,%r82,%r86; .loc 1 344 24 mul.lo.u64 %r79,%r86,%r84; mov.u64 %r118,%r96; .loc 1 345 5 add.u64 %r96,%r96,1; .loc 1 346 7 setp.eq.u64 %r289,%r29,%r118; @ %r289 bra $L39; .loc 1 354 16 ld.u64 %r290,[%r125]; add.u64 %r81,%r290,1; st.u64 [%r125],%r81; .loc 1 355 23 add.u64 %r292,%r307,%r124; ld.u64 %r82,[%r292]; .loc 1 355 13 sub.u64 %r293,%r82,%r77; shl.b64 %r294,%r293,4; add.u64 %r90,%r90,%r294; .loc 1 356 24 add.u64 %r295,%frame,%r124; ld.u64 %r84,[%r295]; .loc 1 356 14 sub.u64 %r296,%r84,%r79; add.u64 %r93,%r93,%r296; .loc 1 359 32 add.u64 %r298,%r314,%r124; ld.u64 %r86,[%r298]; .loc 1 359 7 add.u64 %r125,%r125,8; add.u64 %r124,%r124,8; setp.eq.u64 %r299,%r81,%r86; @ %r299 bra $L68; ld.u64 %r163,[%frame+360]; bra $L69; $L71: mov.u64 %r90,%r45; setp.eq.u32 %r306,1,1; bra $L52; $L39: .loc 1 362_gfortran_smaxloc0_4_i16 .visible .func _gfortran_smaxloc0_4_i164predpred.reg .u64u32 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 379 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L99; .loc 1 379 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L100; $L99: .loc 1 381 7 {_gfortran_maxloc0_4_i16.loc 1 382 7 bra $L98; $L100: .loc 1 385 8 ld.s8 %r30,[%r36+28]; .loc 1 387 6 setp.gt.s64 %r46,%r30,0; @ %r46 bra $L102_gfortran_runtime_error102: .loc 1 390 15 ld.u64 %r24,[%r35]; .loc 1 390 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L103; .loc 1 392 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 393 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 394 24 st.u64 [%r35+8],%r24; .loc 1 395 
29 mov.u64 %r56call (%value_in),_gfortrani_xmallocarray95 27 st.u64 [%r35],%r24; bra $L104; $L103: .loc 1 397 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 397 11 ld.u32 %r62,[%r61+36.loc 1 399 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 404 8 ld.u64 %r24,[%r35]; $L104: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,2; mov.u64 %r29,%r24; .loc 1 405 10 mov.u64 %r32,0; .loc 1 406 23 cvt.u32.u64 %r68,%r32; $L105: st.u32 [%r29],%r68; .loc 1 405 24 add.u64 %r32,%r32,1; .loc 1 405 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L105; $L98: .loc 1 407maxloc0_8_i16.o/1622802228_gfortran_maxloc0_8_i16 .visible .func _gfortran_maxloc0_8_i16fortran/generated/maxloc0_8_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_8_i16 .visible .func _gfortran_mmaxloc0_8_i16_gfortran_smaxloc0_8_i16 .visible .func _gfortran_smaxloc0_8_i16VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,65,88,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_8_i16 .visible .func _gfortran_maxloc0_8_i1632predpredpred %r169; .reg .pred %r172; .reg .predpredpredpredpredmov.u64 %r112,%ar0; mov.u64 %r113,%ar1; mov.u32 %r114,%ar2; .loc 1 50 10 ld.s8 %r22,[%r113+28]; .loc 1 50 8 cvt.u32.u32 %r115,%r22; cvt.s64.s8 %r71,%r115; .loc 1 51 6 setp.gt.s64 %r116,%r71,0; @ %r116 bra $L2; .loc 1 52 5 cvta.const.u64 %r11117_gfortran_runtime_errorr23,[%r112]; .loc 1 54 6 setp.ne.u64 %r119,%r23,0; @ %r119 bra $L3; .loc 1 56 7 st.u64 [%r112+48],%r23; add.u64 %r121,%r71,-1; st.u64 [%r112+56],%r121; mov.u64 %r122,1; st.u64 [%r112+40],%r122; .loc 1 57 28 cvt.u32.u64 %r123,%r122; st.u8 [%r112+28],%r123; .loc 1 58 24 st.u64 [%r112+8],%r23; .loc 1 59 29 mov.u64 %r171call (%value_in),_gfortrani_xmallocarray129,[%value_in]; } mov.u64 %r23,%r129; .loc 1 59 27 st.u64 [%r112],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r131,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r132,[%r131+36]; setp.eq.u32 %r133,%r132,0; @ %r133 bra $L4; .loc 1 64r113136; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r112]; $L4: .loc 1 68 11 ld.u64 %r72,[%r112+40]; add.u64 %r36,%r113,40; mov.u64 %r104,%frame; add.u64 %r105,%frame,120; .loc 1 70 10 mov.u64 %r98,0; add.u64 %r204,%frame,240; .loc 1 74 16 mov.u64 %r144,%r98; bra $L7; $L24: mov.u64 %r98,%r79; $L7: .loc 1 72 18 ld.u64 %r137,[%r36]; st.u64 [%r104],%r137; .loc 1 73 19 ld.u64 %r139,[%r36+16]; add.u64 %r138,%r139,1; ld.u64 %r140,[%r36+8]; sub.u64 %r31,%r138,%r140; .loc 1 73 17 st.u64 [%r105],%r31; .loc 1 74 16 shl.b64 %r142,%r98,3; add.u64 %r143,%r204,%r142; st.u64 [%r143],%r144; .loc 1 75 10 setp.gt.s64 %r145,%r31,0; @ %r145 bra $L5; shl.b64 %r81,%r72,3; mov.u64 %r87,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 mov.u64 %r146,%r80; $L6: st.u64 [%r87],%r146; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r87,%r87,%r81; setp.ne.u64 %r147,%r71,%r80; @ %r147 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r98,1; .loc 1 70 3 add.u64 %r36,%r36,24; add.u64 %r104,%r104,8; add.u64 %r105,%r105,8; setp.ne.u64 %r148,%r71,%r79; 
@ %r148 bra $L24; .loc 1 84 8 ld.u64 %r62,[%r113]; shl.b64 %r84,%r72,3; mov.u64 %r39,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u64 %r149,1; $L8: st.u64 [%r39],%r149; mov.u64 %r32,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r39,%r39,%r84; setp.ne.u64 %r150,%r98,%r32; @ %r150 bra $L8; .loc 1 101 9 setp.ne.u64 %r151,%r62,0; @ ! %r151 bra $L1; .loc 1 149 19 ld.u64 %r45,[%frame]; .loc 1 149 9 shl.b64 %r47,%r45,4; .loc 1 151 34 ld.u64 %r51,[%frame+120]; cvt.u32.u32 %r153,%r22; cvt.s64.s8 %r152,%r153; shl.b64 %r154,%r152,3; add.u64 %r73,%r204,%r154; .loc 1 99 12 mov.u64 %r201,0; mov.u64 %r202,-9223372036854775808; setp.eq.u32 %r203,%r114,0; add.u64 %r205,%frame,248; .loc 1 157 13 mov.u64 %r208,%r201; .loc 1 174 32 add.u64 %r209,%frame,120; $L23: .loc 1 151 21 ld.u64 %r103,[%frame+240]; .loc 1 126 12 @ %r203 bra $L25; mov.u64 %r40,%r103; mov.u64 %r60,%r62; $L15: .loc 1 129 12 ld.u64 %r197,[%r60]; ld.u64 %r198,[%r60+8]; .loc 1 129 11 setp.gt.s64 %r159,%r202,%r198; @ %r159 bra $L12; setp.ne.u64 %r162,%r202,%r198; @ %r162 bra $L28; setp.gt.u64 %r165,%r201,%r197; @ %r165 bra $L12; $L28: mov.u64 %r76,%r205; mov.u64 %r34,%r23; mov.u64 %r90,%r40; $L14: .loc 1 133 35 add.u64 %r167,%r90,1; .loc 1 133 24 st.u64 [%r34],%r167; .loc 1 132 10 add.u64 %r34,%r34,%r84; setp.eq.u64 %r168,%r73,%r76; @ %r168 bra $L26; .loc 1 133 31 ld.u64 %r90,[%r76]; add.u64 %r76,%r76,8; bra $L14; $L26: mov.u64 %r201,%r197; mov.u64 %r202,%r198; $L12: .loc 1 135 12 add.u64 %r60,%r60,%r47; .loc 1 137 17 add.u64 %r40,%r40,1; .loc 1 137 10 st.u64 [%frame+240],%r40; setp.ne.u64 %r169,%r40,%r51; @ %r169 bra $L15; bra $L38; $L25: mov.u64 %r49,%r103; mov.u64 %r61,%r62; $L11: .loc 1 141 11 ld.u64 %r199,[%r61]; ld.u64 %r200,[%r61+8]; .loc 1 141 10 setp.gt.s64 %r172,%r200,%r202; @ %r172 bra $L29; setp.ne.u64 %r175,%r200,%r202; @ %r175 bra $L17; setp.gt.u64 %r178,%r199,%r201; @ ! 
%r178 bra $L17; $L29: mov.u64 %r85,%r205; mov.u64 %r101,%r23; mov.u64 %r86,%r49; $L19: .loc 1 145 35 add.u64 %r180,%r86,1; .loc 1 145 24 st.u64 [%r101],%r180; .loc 1 144 4 add.u64 %r101,%r101,%r84; setp.eq.u64 %r181,%r73,%r85; @ %r181 bra $L27; .loc 1 145 31 ld.u64 %r86,[%r85]; add.u64 %r85,%r85,8; bra $L19; $L27: mov.u64 %r201,%r199; mov.u64 %r202,%r200; $L17: .loc 1 149 9 add.u64 %r61,%r61,%r47; .loc 1 151 14 add.u64 %r49,%r49,1; .loc 1 151 7 st.u64 [%frame+240],%r49; setp.ne.u64 %r182,%r49,%r51; @ %r182 bra $L11; .loc 1 149 9 sub.u64 %r183,%r51,%r103; mad.lo.u64 %r62,%r183,%r47,%r62; bra $L20; $L38: .loc 1 135 12 sub.u64 %r185,%r51,%r103; mad.lo.u64 %r62,%r185,%r47,%r62; $L20: mov.u64 %r70,%r205; .loc 1 151 34 mov.u64 %r59,%r51; mov.u64 %r57,%r45; .loc 1 152 9 mov.u64 %r66,0; $L22: .loc 1 157 13 st.u64 [%r70+-8],%r208; .loc 1 160 23 mul.lo.u64 %r54,%r57,%r59; mov.u64 %r99,%r66; .loc 1 161 5 add.u64 %r66,%r66,1; .loc 1 162 7 setp.eq.u64 %r189,%r99,%r98; @ %r189 bra $L1; .loc 1 170 16 ld.u64 %r190,[%r70]; add.u64 %r56,%r190,1; st.u64 [%r70],%r56; shl.b64 %r65,%r66,3; .loc 1 171 23 add.u64 %r191,%frame,%r65; ld.u64 %r57,[%r191]; .loc 1 171 13 sub.u64 %r192,%r57,%r54; shl.b64 %r193,%r192,4; add.u64 %r62,%r62,%r193; .loc 1 174 32 add.u64 %r195,%r209,%r65; ld.u64 %r59,[%r195]; .loc 1 174 7 add.u64 %r70,%r70,8; setp.eq.u64 %r196,%r56,%r59; @ %r196 bra $L22; bra $L23; $L1: .loc 1 1_gfortran_mmaxloc0_8_i16 .visible .func _gfortran_mmaxloc0_8_i16pred %r187; .reg .u64 %r190; .reg .u64u64pred %r222; .reg .u64 %r223; .reg .predu32 %r230; .reg .u64 %r231; .reg .u16 %r233; .reg .u32 %r234; .reg .pred %r235; .reg .pred %r236; .reg .u64 %r238; .reg .pred %r239; .reg .pred %r240; .reg .u16 %r242; .reg .u32 %r243; .reg .pred %r244; .reg .pred %r247; .reg .pred %r250; .reg .predu32 %r261; .reg .pred %r262; .reg .pred %r265; .reg .pred %r268; .reg .pred %r271; .reg .u64 %r273; .reg .pred %r274; .reg .predu64 %r283; .reg .u64u64 %r305; .reg .pred %r307; mov.u64 %r160,%ar0; mov.u64 %r161,%ar1; mov.u64 %r162,%ar2; mov.u32 %r163,%ar3; .loc 1 202 6 setp.ne.u64 %r164,%r162,0; @ %r164 bra $L40; .loc 1 204 7 {1163; call _gfortran_maxloc0_8_i16.loc 1 205 7 bra $L39; $L40: .loc 1 208 10 ld.s8 %r22,[%r161+28]; .loc 1 209 6 setp.gt.s32 %r168,%r22,0; @ %r168 bra $L42; .loc 1 210 5 cvta.const.u64 %r16stack; call _gfortran_runtime_error42: .loc 1 212 6 ld.u64 %r171,[%r160]; setp.ne.u64 %r172,%r171,0; @ %r172 bra $L43; .loc 1 214 7 st.u64 [%r160+48],%r171; add.u32 %r174,%r22,-1; cvt.s64.s32 %r175,%r174; st.u64 [%r160+56],%r175; mov.u64 %r176,1; st.u64 [%r160+40],%r176; .loc 1 215 28 cvt.u32.u64 %r177,%r176; st.u8 [%r160+28],%r177; .loc 1 216 24 st.u64 [%r160+8],%r171; .loc 1 217 29 cvt.u32.u32 %r182,%r22; cvt.s64.s8 %r181,%r182; mov.u64 %r180call (%value_in),_gfortrani_xmallocarray183,[%value_in]; } .loc 1 217 27 st.u64 [%r160],%r183; bra $L44; $L43: .loc 1 221 11 cvta.global.u64 %r185,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r186,[%r185+36]; setp.eq.u32 %r187,%r186,0; @ %r187 bra $L44; .loc 1 224 4 cvta.const.u64 %r190,$LC1; {161190; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19316161193190; call _gfortrani_bounds_equal_extents$L44: .loc 1 231 15 ld.u64 %r31,[%r162+16]; .loc 1 233 9 ld.u64 %r92,[%r162]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r195,%r32,-4; and.b32 %r196,%r195,-5; set.u32.eq.u32 %r198,%r196,0; neg.s32 %r199,%r198; .loc 1 235 22 add.u32 %r200,%r32,-1; set.u32.le.u32 %r202,%r200,1; neg.s32 %r203,%r202; .loc 1 235 6 cvt.u16.u32 %r205,%r199; cvt.u16.u32 
%r206,%r203; or.b16 %r204,%r205,%r206; cvt.u32.u16 %r207,%r204; cvt.u16.u8 %r208,%r207; .loc 1 237 7 setp.ne.u32 %r211,%r32,16; @ %r211 bra $L46; $L45: .loc 1 244 11 ld.u64 %r106,[%r160+40]; .loc 1 245 8 ld.u64 %r107,[%r160]; .loc 1 246 17 cvt.u32.u32 %r212,%r22; cvt.s64.s8 %r68,%r212; add.u64 %r153,%r161,40; add.u64 %r298,%frame,120; mov.u64 %r149,%r298; add.u64 %r147,%r162,40; mov.u64 %r146,%frame; add.u64 %r143,%frame,240; add.u64 %r62,%frame,360; mov.u64 %r47,%r62; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r221,%r29; bra $L47; $L46: .loc 1 242 5_gfortran_runtime_error70: mov.u64 %r29,%r113; $L47: .loc 1 248 18 ld.u64 %r215,[%r153]; st.u64 [%r149],%r215; .loc 1 249 20 ld.u64 %r217,[%r147]; mul.lo.u64 %r216,%r217,%r31; .loc 1 249 18 st.u64 [%r146],%r216; .loc 1 250 19 ld.u64 %r219,[%r153+16]; add.u64 %r218,%r219,1; ld.u64 %r220,[%r153+8]; sub.u64 %r44,%r218,%r220; .loc 1 250 17 st.u64 [%r143],%r44; .loc 1 251 16 st.u64 [%r47],%r221; .loc 1 252 10 setp.gt.s64 %r222,%r44,0; @ %r222 bra $L48; shl.b64 %r137,%r106,3; mov.u64 %r129,%r107; .loc 1 255 11 mov.u64 %r114,0; .loc 1 256 24 mov.u64 %r223,%r114; $L49: st.u64 [%r129],%r223; .loc 1 255 27 add.u64 %r114,%r114,1; .loc 1 255 4 add.u64 %r129,%r129,%r137; setp.ne.u64 %r224,%r68,%r114; @ %r224 bra $L49; bra $L39; $L48: .loc 1 246 26 add.u64 %r113,%r29,1; .loc 1 246 3 add.u64 %r153,%r153,24; add.u64 %r149,%r149,8; add.u64 %r147,%r147,24; add.u64 %r146,%r146,8; add.u64 %r143,%r143,8; add.u64 %r47,%r47,8; setp.ne.u64 %r225,%r68,%r113; @ %r225 bra $L70; .loc 1 261 8 ld.u64 %r90,[%r161]; shl.b64 %r158,%r106,3; mov.u64 %r156,%r107; .loc 1 264 10 mov.u64 %r112,0; .loc 1 265 23 mov.u64 %r226,%r112; $L51: st.u64 [%r156],%r226; mov.u64 %r140,%r112; .loc 1 264 26 add.u64 %r112,%r112,1; .loc 1 264 3 add.u64 %r156,%r156,%r158; setp.ne.u64 %r227,%r29,%r140; @ %r227 bra $L51; .loc 1 276 9 setp.eq.u64 %r228,%r90,0; @ %r228 bra $L39; .loc 1 331 19 ld.u64 %r70,[%frame+120]; .loc 1 331 9 shl.b64 %r72,%r70,4; .loc 1 332 20 ld.u64 %r73,[%frame]; .loc 1 334 34 ld.u64 %r76,[%frame+240]; ld.u64 %r155,[%frame+360]; cvt.u32.u32 %r230,%r22; cvt.s64.s8 %r229,%r230; shl.b64 %r231,%r229,3; add.u64 %r119,%r231,%r62; .loc 1 274 12 mov.u64 %r295,0; mov.u64 %r296,-9223372036854775808; setp.eq.u32 %r297,1,0; add.u64 %r299,%frame,368; .loc 1 340 13 mov.u64 %r304,%r295; .loc 1 359 32 add.u64 %r305,%frame,240; .loc 1 308 12 setp.eq.u32 %r307,%r163,0; $L69: .loc 1 280 10 @ %r297 bra $L52; mov.u64 %r56,%r155; mov.u32 %r46,0; mov.u32 %r301,1; $L57: mov.u64 %r45,%r90; .loc 1 300 13 add.u64 %r90,%r45,%r72; .loc 1 284 11 ld.s8 %r234,[%r92]; cvt.u16.u32 %r233,%r234; setp.eq.u16 %r235,%r233,0; @ %r235 bra $L53; setp.eq.u32 %r236,%r46,0; selp.u64 %r155,%r155,%r56,%r236; .loc 1 294 16 ld.u64 %r295,[%r45]; ld.u64 %r296,[%r45+8]; mov.u64 %r128,%r299; mov.u64 %r139,%r107; mov.u64 %r151,%r155; $L56: .loc 1 296 33 add.u64 %r238,%r151,1; .loc 1 296 22 st.u64 [%r139],%r238; .loc 1 295 9 add.u64 %r139,%r139,%r158; setp.eq.u64 %r239,%r119,%r128; @ %r239 bra $L71; .loc 1 296 29 ld.u64 %r151,[%r128]; add.u64 %r128,%r128,8; bra $L56; $L53: .loc 1 301 14 add.u64 %r92,%r92,%r73; .loc 1 303 11 add.u64 %r56,%r56,1; mov.u32 %r46,%r301; .loc 1 303 4 setp.ne.u64 %r240,%r56,%r76; @ %r240 bra $L57; bra $L97; $L52: .loc 1 308 12 @ %r307 bra $L72; mov.u64 %r64,%r155; mov.u64 %r88,%r90; $L63: .loc 1 311 11 ld.s8 %r243,[%r92]; cvt.u16.u32 %r242,%r243; setp.eq.u16 %r244,%r242,0; @ %r244 bra $L60; .loc 1 311 22 ld.u64 %r291,[%r88]; ld.u64 %r292,[%r88+8]; .loc 1 311 19 setp.gt.s64 %r247,%r296,%r292; @ %r247 
bra $L60; setp.ne.u64 %r250,%r296,%r292; @ %r250 bra $L75; setp.gt.u64 %r253,%r295,%r291; @ %r253 bra $L60; $L75: mov.u64 %r93,%r299; mov.u64 %r96,%r107; mov.u64 %r148,%r64; .loc 1 314 19 mov.u64 %r141,%r304; $L62: .loc 1 315 36 add.u64 %r255,%r148,1; .loc 1 315 25 st.u64 [%r96],%r255; .loc 1 314 35 add.u64 %r110,%r141,1; .loc 1 314 12 add.u64 %r96,%r96,%r158; setp.le.s64 %r256,%r29,%r141; @ %r256 bra $L73; .loc 1 315 32 ld.u64 %r148,[%r93]; add.u64 %r93,%r93,8; mov.u64 %r141,%r110; bra $L62; $L73: mov.u64 %r295,%r291; mov.u64 %r296,%r292; $L60: .loc 1 317 13 add.u64 %r88,%r88,%r72; .loc 1 319 11 add.u64 %r64,%r64,1; .loc 1 319 4 st.u64 [%frame+360],%r64; setp.ne.u64 %r257,%r64,%r76; @ %r257 bra $L63; .loc 1 317 13 sub.u64 %r258,%r76,%r155; mad.lo.u64 %r90,%r258,%r72,%r90; bra $L64; $L72: mov.u64 %r75,%r155; mov.u64 %r91,%r92; mov.u64 %r89,%r90; $L59: .loc 1 323 11 ld.s8 %r261,[%r91]; cvt.u16.u32 %r260,%r261; setp.eq.u16 %r262,%r260,0; @ %r262 bra $L65; .loc 1 323 22 ld.u64 %r293,[%r89]; ld.u64 %r294,[%r89+8]; .loc 1 323 19 setp.gt.s64 %r265,%r294,%r296; @ %r265 bra $L76; setp.ne.u64 %r268,%r294,%r296; @ %r268 bra $L65; setp.gt.u64 %r271,%r293,%r295; @ ! %r271 bra $L65; $L76: mov.u64 %r115,%r299; mov.u64 %r57,%r107; mov.u64 %r144,%r75; .loc 1 326 12 mov.u64 %r142,%r304; $L67: .loc 1 327 36 add.u64 %r273,%r144,1; .loc 1 327 25 st.u64 [%r57],%r273; .loc 1 326 28 add.u64 %r109,%r142,1; .loc 1 326 5 add.u64 %r57,%r57,%r158; setp.le.s64 %r274,%r29,%r142; @ %r274 bra $L74; .loc 1 327 32 ld.u64 %r144,[%r115]; add.u64 %r115,%r115,8; mov.u64 %r142,%r109; bra $L67; $L74: mov.u64 %r295,%r293; mov.u64 %r296,%r294; $L65: .loc 1 331 9 add.u64 %r89,%r89,%r72; .loc 1 332 10 add.u64 %r91,%r91,%r73; .loc 1 334 14 add.u64 %r75,%r75,1; .loc 1 334 7 st.u64 [%frame+360],%r75; setp.ne.u64 %r275,%r75,%r76; @ %r275 bra $L59; sub.u64 %r125,%r76,%r155; .loc 1 331 9 mad.lo.u64 %r90,%r72,%r125,%r90; .loc 1 332 10 mad.lo.u64 %r92,%r73,%r125,%r92; bra $L64; $L97: st.u64 [%frame+360],%r76; $L64: mov.u64 %r117,%r299; mov.u64 %r85,%r73; .loc 1 334 34 mov.u64 %r87,%r76; mov.u64 %r83,%r70; mov.u64 %r116,8; .loc 1 335 9 mov.u64 %r95,0; $L68: .loc 1 340 13 st.u64 [%r117+-8],%r304; .loc 1 343 23 mul.lo.u64 %r78,%r83,%r87; .loc 1 344 24 mul.lo.u64 %r80,%r87,%r85; mov.u64 %r94,%r95; .loc 1 345 5 add.u64 %r95,%r95,1; .loc 1 346 7 setp.eq.u64 %r280,%r29,%r94; @ %r280 bra $L39; .loc 1 354 16 ld.u64 %r281,[%r117]; add.u64 %r82,%r281,1; st.u64 [%r117],%r82; .loc 1 355 23 add.u64 %r283,%r298,%r116; ld.u64 %r83,[%r283]; .loc 1 355 13 sub.u64 %r284,%r83,%r78; shl.b64 %r285,%r284,4; add.u64 %r90,%r90,%r285; .loc 1 356 24 add.u64 %r286,%frame,%r116; ld.u64 %r85,[%r286]; .loc 1 356 14 sub.u64 %r287,%r85,%r80; add.u64 %r92,%r92,%r287; .loc 1 359 32 add.u64 %r289,%r305,%r116; ld.u64 %r87,[%r289]; .loc 1 359 7 add.u64 %r117,%r117,8; add.u64 %r116,%r116,8; setp.eq.u64 %r290,%r82,%r87; @ %r290 bra $L68; ld.u64 %r155,[%frame+360]; bra $L69; $L71: mov.u64 %r90,%r45; setp.eq.u32 %r297,1,1; bra $L52; $L39: .loc 1 362_gfortran_smaxloc0_8_i16 .visible .func _gfortran_smaxloc0_8_i164predpred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 379 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L99; .loc 1 379 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L100; $L99: .loc 1 381 7 {_gfortran_maxloc0_8_i16.loc 1 382 7 bra $L98; $L100: .loc 1 385 8 ld.s8 %r30,[%r36+28]; .loc 1 387 6 setp.gt.s64 %r46,%r30,0; @ %r46 bra $L102_gfortran_runtime_error102: .loc 1 390 15 ld.u64 %r24,[%r35]; .loc 1 390 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L103; .loc 
1 392 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 393 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 394 24 st.u64 [%r35+8],%r24; .loc 1 395 29 mov.u64 call (%value_in),_gfortrani_xmallocarray95 27 st.u64 [%r35],%r24; bra $L104; $L103: .loc 1 397 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 397 11 ld.u32 %r62,[%r61+36.loc 1 399 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 404 8 ld.u64 %r24,[%r35]; $L104: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,3; mov.u64 %r29,%r24; .loc 1 405 10 mov.u64 %r32,0; .loc 1 406 23 mov.u64 %r68,%r32; $L105: st.u64 [%r29],%r68; .loc 1 405 24 add.u64 %r32,%r32,1; .loc 1 405 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L105; $L98: .loc 1 40719 _gfortran_maxloc0_16_i16 .visible .func _gfortran_maxloc0_16_i16fortran/generated/maxloc0_16_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_16_i16 .visible .func _gfortran_mmaxloc0_16_i16_gfortran_smaxloc0_16_i16 .visible .func _gfortran_smaxloc0_16_i16VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,65,88,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_16_i16 .visible .func _gfortran_maxloc0_16_i163264u32 %r157; .reg .u64 %r158; .reg .pred %r163; .reg .pred %r166; .reg .predpredpredpredu64 %r191; .reg .predpred64mov.u32 %r116,%ar2; .loc 1 50 10 ld.s8 %r22,[%r115+28]; .loc 1 50 8 cvt.u32.u32 %r117,%r22; cvt.s64.s8 %r71,%r117; .loc 1 51 6 setp.gt.s64 %r118,%r71,0; @ %r118 bra $L2; .loc 1 52 5 cvta.const.u64 %r1111_gfortran_runtime_errorr23,[%r114]; .loc 1 54 6 setp.ne.u64 %r121,%r23,0; @ %r121 bra $L3; .loc 1 56 7 st.u64 [%r114+48],%r23; add.u64 %r123,%r71,-1; st.u64 [%r114+56],%r123; mov.u64 %r124,1; st.u64 [%r114+40],%r124; .loc 1 57 28 cvt.u32.u64 %r125,%r124; st.u8 [%r114+28],%r125; .loc 1 58 24 st.u64 [%r114+8],%r23; .loc 1 59 29 mov.u64 %r128,1671call (%value_in),_gfortrani_xmallocarray131,[%value_in]; } mov.u64 %r23,%r131; .loc 1 59 27 st.u64 [%r114],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r133,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r134,[%r133+36]; setp.eq.u32 %r135,%r134,0; @ %r135 bra $L4; .loc 1 64r115138; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r114]; $L4: .loc 1 68 11 ld.u64 %r72,[%r114+40]; add.u64 %r44,%r115,40; mov.u64 %r35,%frame; add.u64 %r52,%frame,120; .loc 1 70 10 mov.u64 %r100,0; add.u64 %r210,%frame,240; .loc 1 74 16 mov.u64 %r146,%r100; bra $L7; $L24: mov.u64 %r100,%r79; $L7: .loc 1 72 18 ld.u64 %r139,[%r44]; st.u64 [%r35],%r139; .loc 1 73 19 ld.u64 %r141,[%r44+16]; add.u64 %r140,%r141,1; ld.u64 %r142,[%r44+8]; sub.u64 %r32,%r140,%r142; .loc 1 73 17 st.u64 [%r52],%r32; .loc 1 74 16 shl.b64 %r144,%r100,3; add.u64 %r145,%r210,%r144; st.u64 [%r145],%r146; .loc 1 75 10 setp.gt.s64 %r147,%r32,0; @ %r147 bra $L5; shl.b64 %r86,%r72,4; mov.u64 %r88,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 mov.u64 %r148,%r80; $L6: st.u64 [%r88],%r148; st.u64 [%r88+8],%r148; .loc 1 78 27 add.u64 %r80,%r80,1; 
.loc 1 78 4 add.u64 %r88,%r88,%r86; setp.ne.u64 %r150,%r71,%r80; @ %r150 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r100,1; .loc 1 70 3 add.u64 %r44,%r44,24; add.u64 %r35,%r35,8; add.u64 %r52,%r52,8; setp.ne.u64 %r151,%r71,%r79; @ %r151 bra $L24; .loc 1 84 8 ld.u64 %r62,[%r115]; shl.b64 %r38,%r72,4; mov.u64 %r85,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u64 %r152,1; mov.u64 %r153,%r78; $L8: st.u64 [%r85],%r152; st.u64 [%r85+8],%r153; mov.u64 %r33,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r85,%r85,%r38; setp.ne.u64 %r154,%r100,%r33; @ %r154 bra $L8; .loc 1 101 9 setp.ne.u64 %r155,%r62,0; @ ! %r155 bra $L1; .loc 1 149 19 ld.u64 %r47,[%frame]; .loc 1 149 9 shl.b64 %r49,%r47,4; .loc 1 151 34 ld.u64 %r51,[%frame+120]; cvt.u32.u32 %r157,%r22; cvt.s64.s8 %r156,%r157; shl.b64 %r158,%r156,3; add.u64 %r89,%r210,%r158; .loc 1 99 12 mov.u64 %r207,0; mov.u64 %r208,-9223372036854775808; setp.eq.u32 %r209,%r116,0; add.u64 %r211,%frame,248; .loc 1 157 13 mov.u64 %r214,%r207; .loc 1 174 32 add.u64 %r215,%frame,120; $L23: .loc 1 151 21 ld.u64 %r107,[%frame+240]; .loc 1 126 12 @ %r209 bra $L25; mov.u64 %r41,%r107; mov.u64 %r60,%r62; $L15: .loc 1 129 12 ld.u64 %r203,[%r60]; ld.u64 %r204,[%r60+8]; .loc 1 129 11 setp.gt.s64 %r163,%r208,%r204; @ %r163 bra $L12; setp.ne.u64 %r166,%r208,%r204; @ %r166 bra $L28; setp.gt.u64 %r169,%r207,%r203; @ %r169 bra $L12; $L28: mov.u64 %r27,%r211; mov.u64 %r63,%r23; mov.u64 %r103,%r41; $L14: .loc 1 133 35 add.u64 %r171,%r103,1; st.u64 [%r63],%r171; shr.s64 %r172,%r171,63; st.u64 [%r63+8],%r172; .loc 1 132 10 add.u64 %r63,%r63,%r38; setp.eq.u64 %r173,%r27,%r89; @ %r173 bra $L26; .loc 1 133 31 ld.u64 %r103,[%r27]; add.u64 %r27,%r27,8; bra $L14; $L26: mov.u64 %r207,%r203; mov.u64 %r208,%r204; $L12: .loc 1 135 12 add.u64 %r60,%r60,%r49; .loc 1 137 17 add.u64 %r41,%r41,1; .loc 1 137 10 st.u64 [%frame+240],%r41; setp.ne.u64 %r174,%r41,%r51; @ %r174 bra $L15; bra $L38; $L25: mov.u64 %r50,%r107; mov.u64 %r61,%r62; $L11: .loc 1 141 11 ld.u64 %r205,[%r61]; ld.u64 %r206,[%r61+8]; .loc 1 141 10 setp.gt.s64 %r177,%r206,%r208; @ %r177 bra $L29; setp.ne.u64 %r180,%r206,%r208; @ %r180 bra $L17; setp.gt.u64 %r183,%r205,%r207; @ ! 
libgfortran nvptx offload objects (PTX text, static-archive members dated 1622802228), continuing through the MAXLOC rank-0 family for further element kinds. Recoverable contents of this span:

  - tail of the maxloc0_16_i16 object: _gfortran_maxloc0_16_i16, _gfortran_mmaxloc0_16_i16, _gfortran_smaxloc0_16_i16
  - maxloc0_4_r4.o (fortran/generated/maxloc0_4_r4.c): _gfortran_maxloc0_4_r4, _gfortran_mmaxloc0_4_r4, _gfortran_smaxloc0_4_r4
  - maxloc0_8_r4.o (fortran/generated/maxloc0_8_r4.c): _gfortran_maxloc0_8_r4, _gfortran_mmaxloc0_8_r4, _gfortran_smaxloc0_8_r4
  - maxloc0_16_r4.o (fortran/generated/maxloc0_16_r4.c): _gfortran_maxloc0_16_r4, _gfortran_mmaxloc0_16_r4, _gfortran_smaxloc0_16_r4
  - maxloc0_4_r8.o (fortran/generated/maxloc0_4_r8.c): _gfortran_maxloc0_4_r8, _gfortran_mmaxloc0_4_r8, _gfortran_smaxloc0_4_r8 (declarations and the start of the first body; the object continues past this span)

Each object declares the same external symbols (_gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return, _gfortrani_bounds_equal_extents) and carries the runtime error strings "Rank of array needs to be > 0", the MAXLOC / MASK argument messages, and "Funny sized logical array" as byte-array constants.
%r71,[%frame+120]; .loc 1 331 9 shl.b64 %r73,%r71,3; .loc 1 332 20 ld.u64 %r74,[%frame]; .loc 1 334 34 ld.u64 %r77,[%frame+240]; .loc 1 334 21 ld.u64 %r164,[%frame+360]; cvt.u32.u32 %r238,%r22; cvt.s64.s8 %r237,%r238; shl.b64 %r239,%r237,3; add.u64 %r96,%r239,%r152; .loc 1 272 12 mov.f64 %r102,0dfff0000000000000; setp.eq.u32 %r294,1,0; add.u64 %r293,%frame,368; .loc 1 340 13 mov.u64 %r296,0; .loc 1 359 32 add.u64 %r297,%frame,240; .loc 1 308 12 setp.eq.u32 %r299,%r171,0; $L83: .loc 1 280 10 @ %r294 bra $L65; $L72: .loc 1 284 11 ld.s8 %r242,[%r94]; cvt.u16.u32 %r241,%r242; setp.eq.u16 %r243,%r241,0; @ %r243 bra $L66; .loc 1 287 8 ld.u32 %r244,[%r113]; setp.ne.u32 %r245,%r244,0; @ %r245 bra $L67; mov.u64 %r101,%r293; mov.u64 %r104,%r113; mov.u64 %r157,%r164; $L68: .loc 1 289 38 cvt.u32.u64 %r248,%r157; add.u32 %r247,%r248,1; .loc 1 289 27 st.u32 [%r104],%r247; .loc 1 288 7 add.u64 %r104,%r104,%r124; setp.eq.u64 %r249,%r96,%r101; @ %r249 bra $L67; .loc 1 289 34 ld.u64 %r157,[%r101]; add.u64 %r101,%r101,8; bra $L68; $L67: .loc 1 290 9 ld.f64 %r52,[%r91]; .loc 1 290 8 setp.ge.f64 %r250,%r52,%r102; @ ! %r250 bra $L66; mov.u64 %r128,%r293; mov.u64 %r129,%r113; mov.u64 %r161,%r164; $L71: .loc 1 296 33 cvt.u32.u64 %r253,%r161; add.u32 %r252,%r253,1; .loc 1 296 22 st.u32 [%r129],%r252; .loc 1 295 9 add.u64 %r129,%r129,%r124; setp.eq.u64 %r254,%r96,%r128; @ %r254 bra $L84; .loc 1 296 29 ld.u64 %r161,[%r128]; add.u64 %r128,%r128,8; bra $L71; $L66: .loc 1 300 13 add.u64 %r91,%r91,%r73; .loc 1 301 14 add.u64 %r94,%r94,%r74; .loc 1 303 11 add.u64 %r164,%r164,1; .loc 1 303 4 st.u64 [%frame+360],%r164; setp.ne.u64 %r255,%r164,%r77; @ %r255 bra $L72; bra $L73; $L65: .loc 1 308 12 @ %r299 bra $L85; mov.u64 %r64,%r164; mov.u64 %r89,%r91; $L78: .loc 1 311 11 ld.s8 %r258,[%r94]; cvt.u16.u32 %r257,%r258; setp.eq.u16 %r259,%r257,0; @ %r259 bra $L75; .loc 1 311 22 ld.f64 %r60,[%r89]; .loc 1 311 19 setp.ge.f64 %r260,%r60,%r102; @ ! %r260 bra $L75; mov.u64 %r67,%r293; mov.u64 %r92,%r113; mov.u64 %r155,%r64; .loc 1 314 19 mov.u64 %r117,0; $L77: .loc 1 315 36 cvt.u32.u64 %r263,%r155; add.u32 %r262,%r263,1; .loc 1 315 25 st.u32 [%r92],%r262; .loc 1 314 35 add.u64 %r117,%r117,1; .loc 1 314 12 add.u64 %r92,%r92,%r124; setp.ge.s64 %r264,%r117,%r135; @ %r264 bra $L86; .loc 1 315 32 ld.u64 %r155,[%r67]; add.u64 %r67,%r67,8; bra $L77; $L86: .loc 1 313 19 mov.f64 %r102,%r60; $L75: .loc 1 317 13 add.u64 %r89,%r89,%r73; .loc 1 319 11 add.u64 %r64,%r64,1; .loc 1 319 4 st.u64 [%frame+360],%r64; setp.ne.u64 %r265,%r64,%r77; @ %r265 bra $L78; .loc 1 317 13 sub.u64 %r266,%r77,%r164; mad.lo.u64 %r91,%r266,%r73,%r91; bra $L73; $L85: mov.u64 %r76,%r164; mov.u64 %r93,%r94; mov.u64 %r90,%r91; $L74: .loc 1 323 11 ld.s8 %r269,[%r93]; cvt.u16.u32 %r268,%r269; setp.eq.u16 %r270,%r268,0; @ %r270 bra $L79; .loc 1 323 22 ld.f64 %r66,[%r90]; .loc 1 323 19 setp.gt.f64 %r271,%r66,%r102; @ ! 
%r271 bra $L79; mov.u64 %r46,%r293; mov.u64 %r56,%r113; mov.u64 %r151,%r76; .loc 1 326 12 mov.u64 %r115,0; $L81: .loc 1 327 36 cvt.u32.u64 %r274,%r151; add.u32 %r273,%r274,1; .loc 1 327 25 st.u32 [%r56],%r273; .loc 1 326 28 add.u64 %r115,%r115,1; .loc 1 326 5 add.u64 %r56,%r56,%r124; setp.ge.s64 %r275,%r115,%r135; @ %r275 bra $L87; .loc 1 327 32 ld.u64 %r151,[%r46]; add.u64 %r46,%r46,8; bra $L81; $L87: .loc 1 325 12 mov.f64 %r102,%r66; $L79: .loc 1 331 9 add.u64 %r90,%r90,%r73; .loc 1 332 10 add.u64 %r93,%r93,%r74; .loc 1 334 14 add.u64 %r76,%r76,1; .loc 1 334 7 st.u64 [%frame+360],%r76; setp.ne.u64 %r276,%r76,%r77; @ %r276 bra $L74; sub.u64 %r138,%r77,%r164; .loc 1 331 9 mad.lo.u64 %r91,%r73,%r138,%r91; .loc 1 332 10 mad.lo.u64 %r94,%r74,%r138,%r94; $L73: mov.u64 %r125,%r293; .loc 1 325 12 mov.u64 %r86,%r74; .loc 1 334 34 mov.u64 %r88,%r77; mov.u64 %r84,%r71; mov.u64 %r118,8; .loc 1 335 9 mov.u64 %r100,0; $L82: .loc 1 340 13 st.u64 [%r125+-8],%r296; .loc 1 343 23 mul.lo.u64 %r79,%r84,%r88; .loc 1 344 24 mul.lo.u64 %r81,%r88,%r86; .loc 1 345 5 add.u64 %r100,%r100,1; .loc 1 346 7 setp.eq.u64 %r281,%r100,%r135; @ %r281 bra $L52; .loc 1 354 16 ld.u64 %r282,[%r125]; add.u64 %r83,%r282,1; st.u64 [%r125],%r83; .loc 1 355 23 add.u64 %r284,%r292,%r118; ld.u64 %r84,[%r284]; .loc 1 355 13 sub.u64 %r285,%r84,%r79; shl.b64 %r286,%r285,3; add.u64 %r91,%r91,%r286; .loc 1 356 24 add.u64 %r287,%frame,%r118; ld.u64 %r86,[%r287]; .loc 1 356 14 sub.u64 %r288,%r86,%r81; add.u64 %r94,%r94,%r288; .loc 1 359 32 add.u64 %r290,%r297,%r118; ld.u64 %r88,[%r290]; .loc 1 359 7 add.u64 %r125,%r125,8; add.u64 %r118,%r118,8; setp.eq.u64 %r291,%r83,%r88; @ %r291 bra $L82; .loc 1 334 21 ld.u64 %r164,[%frame+360]; bra $L83; $L84: .loc 1 290 9 mov.f64 %r102,%r52; setp.eq.u32 %r294,1,1; bra $L65; $L52: .loc 1 362_gfortran_smaxloc0_4_r8 .visible .func _gfortran_smaxloc0_4_r84predpred.reg .u64u32 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 379 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L114; .loc 1 379 20 ld.u32 %r40,[%r37381 7 {_gfortran_maxloc0_4_r8.loc 1 382 7 bra $L113; $L115: .loc 1 385 8 ld.s8 %r30,[%r36+28]; .loc 1 387 6 setp.gt.s64 %r46,%r30,0; @ %r46 bra $L1_gfortran_runtime_error117: .loc 1 390 15 ld.u64 %r24,[%r35]; .loc 1 390 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L118; .loc 1 392 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 393 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 394 24 st.u64 [%r35+8],%r24; .loc 1 395 29 mov.u64 %r56call (%value_in),_gfortrani_xmallocarray95 27 st.u64 [%r35],%r24; bra $L119; $L118: .loc 1 397 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 397 11 ld.u32 %r62,[%r61+36.loc 1 399 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 404 8 ld.u64 %r24,[%r35]; $L119: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,2; mov.u64 %r29,%r24; .loc 1 405 10 mov.u64 %r32,0; .loc 1 406 23 cvt.u32.u64 %r68,%r32; $L120: st.u32 [%r29],%r68; .loc 1 405 24 add.u64 %r32,%r32,1; .loc 1 405 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L120; $L113: .loc 1 407maxloc0_8_r8.o/ 1622802229_gfortran_maxloc0_8_r8 .visible .func _gfortran_maxloc0_8_fortran/generated/maxloc0_8_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_8_r8 .visible .func _gfortran_mmaxloc0_8_r8_gfortran_smaxloc0_8_r8 .visible .func _gfortran_smaxloc0_8_r8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func 
_gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,65,88,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_8_r8 .visible .func _gfortran_maxloc0_8_r8pred.reg .u64 %r140; .reg .u32 %r141; .reg .predpred %r157; .reg .u64pred %r170; .reg .pred %r171; .reg .pred %r173; .reg .u64pred %r177; .reg .u64 %r178; .reg .predpredpred %r196; .reg .predpred %r205; mov.u64 %r121,%ar0; mov.u64 %r122,%ar1; mov.u32 %r123,%ar2; .loc 1 50 10 ld.s8 %r22,[%r122+28]; .loc 1 50 8 cvt.u32.u32 %r124,%r22; cvt.s64.s8 %r84,%r124; .loc 1 51 6 setp.gt.s64 %r125,%r84,0; @ %r125 bra $L2; .loc 1 52 5 cvta.const.u64 %r126,$LCstack; call _gfortran_runtime_errorr23,[%r121]; .loc 1 54 6 setp.ne.u64 %r128,%r23,0; @ %r128 bra $L3; .loc 1 56 7 st.u64 [%r121+48],%r23; add.u64 %r130,%r84,-1; st.u64 [%r121+56],%r130; mov.u64 %r131,1; st.u64 [%r121+40],%r131; .loc 1 57 28 cvt.u32.u64 %r132,%r131; st.u8 [%r121+28],%r132; .loc 1 58 24 st.u64 [%r121+8],%r23; .loc 1 59 29 mov.u64 %r135135; call (%value_in),_gfortrani_xmallocarray138,[%value_in]; } mov.u64 %r23,%r138; .loc 1 59 27 st.u64 [%r121],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r140,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r141,[%r140+36]; setp.eq.u32 %r142,%r141,0; @ %r142 bra $L4; .loc 1 64r122145; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r121]; $L4: .loc 1 68 11 ld.u64 %r85,[%r121+40]; add.u64 %r33,%r122,40; mov.u64 %r53,%frame; add.u64 %r45,%frame,120; .loc 1 70 10 mov.u64 %r106,0; add.u64 %r198,%frame,240; .loc 1 74 16 mov.u64 %r153,%r106; bra $L7; $L32: mov.u64 %r106,%r91; $L7: .loc 1 72 18 ld.u64 %r146,[%r33]; st.u64 [%r53],%r146; .loc 1 73 19 ld.u64 %r148,[%r33+16]; add.u64 %r147,%r148,1; ld.u64 %r149,[%r33+8]; sub.u64 %r32,%r147,%r149; .loc 1 73 17 st.u64 [%r45],%r32; .loc 1 74 16 shl.b64 %r151,%r106,3; add.u64 %r152,%r198,%r151; st.u64 [%r152],%r153; .loc 1 75 10 setp.gt.s64 %r154,%r32,0; @ %r154 bra $L5; shl.b64 %r114,%r85,3; mov.u64 %r104,%r23; .loc 1 78 11 mov.u64 %r92,0; .loc 1 79 24 mov.u64 %r155,%r92; $L6: st.u64 [%r104],%r155; .loc 1 78 27 add.u64 %r92,%r92,1; .loc 1 78 4 add.u64 %r104,%r104,%r114; setp.ne.u64 %r156,%r84,%r92; @ %r156 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r91,%r106,1; .loc 1 70 3 add.u64 %r33,%r33,24; add.u64 %r53,%r53,8; add.u64 %r45,%r45,8; setp.ne.u64 %r157,%r84,%r91; @ %r157 bra $L32; .loc 1 84 8 ld.u64 %r73,[%r122]; shl.b64 %r44,%r85,3; mov.u64 %r55,%r23; .loc 1 87 10 mov.u64 %r90,0; .loc 1 88 23 mov.u64 %r158,1; $L8: st.u64 [%r55],%r158; mov.u64 %r107,%r90; .loc 1 87 26 add.u64 %r90,%r90,1; .loc 1 87 3 add.u64 %r55,%r55,%r44; setp.ne.u64 %r159,%r106,%r107; @ %r159 bra $L8; .loc 1 101 9 setp.eq.u64 %r160,%r73,0; @ %r160 bra $L1; .loc 1 149 19 ld.u64 %r57,[%frame]; .loc 1 149 9 shl.b64 %r59,%r57,3; .loc 1 151 34 ld.u64 %r61,[%frame+120]; ld.u64 %r115,[%frame+240]; .loc 1 141 11 ld.f64 %r119,[%r73]; cvt.u32.u32 %r162,%r22; cvt.s64.s8 %r161,%r162; shl.b64 %r163,%r161,3; add.u64 %r95,%r198,%r163; .loc 1 97 12 mov.f64 %r78,0dfff0000000000000; setp.eq.u32 %r197,1,0; add.u64 %r199,%frame,248; .loc 1 157 13 mov.u64 %r202,0; .loc 1 174 32 add.u64 %r203,%frame,120; .loc 1 126 12 setp.eq.u32 %r205,%r123,0; $L30: 
.loc 1 106 10 @ %r197 bra $L10; mov.u64 %r40,%r115; mov.u32 %r89,0; .loc 1 110 12 mov.u32 %r201,1; $L17: .loc 1 118 13 add.u64 %r120,%r73,%r59; .loc 1 110 11 setp.le.f64 %r166,%r78,%r119; @ ! %r166 bra $L51; setp.eq.u32 %r167,%r89,0; selp.u64 %r115,%r115,%r40,%r167; mov.u64 %r77,%r199; mov.u64 %r97,%r23; mov.u64 %r111,%r115; $L15: .loc 1 115 36 add.u64 %r169,%r111,1; .loc 1 115 25 st.u64 [%r97],%r169; .loc 1 114 5 add.u64 %r97,%r97,%r44; setp.eq.u64 %r170,%r77,%r95; @ %r170 bra $L33; .loc 1 115 32 ld.u64 %r111,[%r77]; add.u64 %r77,%r77,8; bra $L15; $L51: .loc 1 120 11 add.u64 %r40,%r40,1; .loc 1 120 4 setp.eq.u64 %r171,%r40,%r61; @ %r171 bra $L16; .loc 1 110 12 ld.f64 %r119,[%r120]; mov.u64 %r73,%r120; mov.u32 %r89,%r201; bra $L17; $L10: .loc 1 126 12 @ %r205 bra $L34; mov.u64 %r52,%r115; mov.u64 %r71,%r73; $L24: .loc 1 129 11 setp.le.f64 %r173,%r78,%r119; @ ! %r173 bra $L19; mov.u64 %r110,%r199; mov.u64 %r112,%r23; mov.u64 %r109,%r52; $L21: .loc 1 133 35 add.u64 %r175,%r109,1; .loc 1 133 24 st.u64 [%r112],%r175; .loc 1 132 10 add.u64 %r112,%r112,%r44; setp.eq.u64 %r176,%r95,%r110; @ %r176 bra $L35; .loc 1 133 31 ld.u64 %r109,[%r110]; add.u64 %r110,%r110,8; bra $L21; $L35: .loc 1 131 18 mov.f64 %r78,%r119; $L19: .loc 1 135 12 add.u64 %r71,%r71,%r59; .loc 1 137 17 add.u64 %r52,%r52,1; .loc 1 137 10 st.u64 [%frame+240],%r52; setp.ne.u64 %r177,%r52,%r61; @ %r177 bra $L22; .loc 1 135 12 sub.u64 %r178,%r61,%r115; mad.lo.u64 %r73,%r178,%r59,%r73; bra $L23; $L22: .loc 1 129 12 ld.f64 %r119,[%r71]; bra $L24; $L34: mov.u64 %r60,%r115; mov.u64 %r72,%r73; $L18: .loc 1 141 10 setp.lt.f64 %r180,%r78,%r119; @ ! %r180 bra $L25; mov.u64 %r50,%r199; mov.u64 %r47,%r23; mov.u64 %r93,%r60; $L27: .loc 1 145 35 add.u64 %r182,%r93,1; .loc 1 145 24 st.u64 [%r47],%r182; .loc 1 144 4 add.u64 %r47,%r47,%r44; setp.eq.u64 %r183,%r50,%r95; @ %r183 bra $L36; .loc 1 145 31 ld.u64 %r93,[%r50]; add.u64 %r50,%r50,8; bra $L27; $L36: .loc 1 143 18 mov.f64 %r78,%r119; $L25: .loc 1 149 9 add.u64 %r72,%r72,%r59; .loc 1 151 14 add.u64 %r60,%r60,1; .loc 1 151 7 st.u64 [%frame+240],%r60; setp.ne.u64 %r184,%r60,%r61; @ %r184 bra $L28; .loc 1 149 9 sub.u64 %r185,%r61,%r115; mad.lo.u64 %r73,%r185,%r59,%r73; bra $L23; $L28: .loc 1 141 11 ld.f64 %r119,[%r72]; bra $L18; $L16: st.u64 [%frame+240],%r61; .loc 1 118 13 mov.u64 %r73,%r120; $L23: mov.u64 %r42,%r199; .loc 1 151 34 mov.u64 %r70,%r61; mov.u64 %r68,%r57; .loc 1 152 9 mov.u64 %r76,0; $L29: .loc 1 157 13 st.u64 [%r42+-8],%r202; .loc 1 160 23 mul.lo.u64 %r65,%r68,%r70; mov.u64 %r108,%r76; .loc 1 161 5 add.u64 %r76,%r76,1; .loc 1 162 7 setp.eq.u64 %r189,%r108,%r106; @ %r189 bra $L1; .loc 1 170 16 ld.u64 %r190,[%r42]; add.u64 %r67,%r190,1; st.u64 [%r42],%r67; shl.b64 %r118,%r76,3; .loc 1 171 23 add.u64 %r191,%frame,%r118; ld.u64 %r68,[%r191]; .loc 1 171 13 sub.u64 %r192,%r68,%r65; shl.b64 %r193,%r192,3; add.u64 %r73,%r73,%r193; .loc 1 174 32 add.u64 %r195,%r203,%r118; ld.u64 %r70,[%r195]; .loc 1 174 7 add.u64 %r42,%r42,8; setp.eq.u64 %r196,%r67,%r70; @ %r196 bra $L29; ld.u64 %r115,[%frame+240]; .loc 1 141 11 ld.f64 %r119,[%r73]; bra $L30; $L33: mov.f64 %r78,%r119; setp.eq.u32 %r197,1,1; bra $L10; $L1: .loc 1 1_gfortran_mmaxloc0_8_r8 .visible .func _gfortran_mmaxloc0_8_r8pred %r187; .reg .u64 %r190; .reg .u64u64pred %r222; .reg .u64 %r223; .reg .predu32 %r230; .reg .u64 %r231; .reg .u16 %r233; .reg .u32 %r234; .reg .pred %r235; .reg .u64 %r236; .reg .predpred %r250; .reg .u64 %r252; .reg .predu16 %r257; .reg .u32 %r258; .reg .pred %r259; .reg .pred %r260; .reg .u64 %r262; .reg .pred 
%r263; .reg .predpred %r282; .reg .u64 %r284; .reg .u64 %r285; .reg .pred %r287; mov.u64 %r160,%ar0; mov.u64 %r161,%ar1; mov.u64 %r162,%ar2; mov.u32 %r163,%ar3; .loc 1 202 6 setp.ne.u64 %r164,%r162,0; @ %r164 bra $L53; .loc 1 204 7 {1163; call _gfortran_maxloc0_8_r8.loc 1 205 7 bra $L52; $L53: .loc 1 208 10 ld.s8 %r22,[%r161+28]; .loc 1 209 6 setp.gt.s32 %r168,%r22,0; @ %r168 bra $L55; .loc 1 210 5 cvta.const.u64 %r16stack; call _gfortran_runtime_error5: .loc 1 212 6 ld.u64 %r171,[%r160]; setp.ne.u64 %r172,%r171,0; @ %r172 bra $L56; .loc 1 214 7 st.u64 [%r160+48],%r171; add.u32 %r174,%r22,-1; cvt.s64.s32 %r175,%r174; st.u64 [%r160+56],%r175; mov.u64 %r176,1; st.u64 [%r160+40],%r176; .loc 1 215 28 cvt.u32.u64 %r177,%r176; st.u8 [%r160+28],%r177; .loc 1 216 24 st.u64 [%r160+8],%r171; .loc 1 217 29 cvt.u32.u32 %r182,%r22; cvt.s64.s8 %r181,%r182; mov.u64 %r180call (%value_in),_gfortrani_xmallocarray183,[%value_in]; } .loc 1 217 27 st.u64 [%r160],%r183; bra $L57; $L56: .loc 1 221 11 cvta.global.u64 %r185,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r186,[%r185+36]; setp.eq.u32 %r187,%r186,0; @ %r187 bra $L57; .loc 1 224 4 cvta.const.u64 %r190,$LC1; {161190; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19316161193190; call _gfortrani_bounds_equal_extents$L57: .loc 1 231 15 ld.u64 %r31,[%r162+16]; .loc 1 233 9 ld.u64 %r97,[%r162]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r195,%r32,-4; and.b32 %r196,%r195,-5; set.u32.eq.u32 %r198,%r196,0; neg.s32 %r199,%r198; .loc 1 235 22 add.u32 %r200,%r32,-1; set.u32.le.u32 %r202,%r200,1; neg.s32 %r203,%r202; .loc 1 235 6 cvt.u16.u32 %r205,%r199; cvt.u16.u32 %r206,%r203; or.b16 %r204,%r205,%r206; cvt.u32.u16 %r207,%r204; cvt.u16.u8 %r208,%r20758; .loc 1 237 7 setp.ne.u32 %r211,%r32,16; @ %r211 bra $L59; $L58: .loc 1 244 11 ld.u64 %r112,[%r160+40]; .loc 1 245 8 ld.u64 %r113,[%r160]; .loc 1 246 17 cvt.u32.u32 %r212,%r22; cvt.s64.s8 %r134,%r212; add.u64 %r122,%r161,40; add.u64 %r280,%frame,120; mov.u64 %r132,%r280; add.u64 %r124,%r162,40; mov.u64 %r66,%frame; add.u64 %r57,%frame,240; add.u64 %r59,%frame,360; mov.u64 %r58,%r59; .loc 1 246 10 mov.u64 %r119,0; .loc 1 251 16 mov.u64 %r221,%r119; bra $L60; $L59: .loc 1 242 5_gfortran_runtime_error248 18 ld.u64 %r215,[%r122]; st.u64 [%r132],%r215; .loc 1 249 20 ld.u64 %r217,[%r124]; mul.lo.u64 %r216,%r217,%r31; .loc 1 249 18 st.u64 [%r66],%r216; .loc 1 250 19 ld.u64 %r219,[%r122+16]; add.u64 %r218,%r219,1; ld.u64 %r220,[%r122+8]; sub.u64 %r45,%r218,%r220; .loc 1 250 17 st.u64 [%r57],%r45; .loc 1 251 16 st.u64 [%r58],%r221; .loc 1 252 10 setp.gt.s64 %r222,%r45,0; @ %r222 bra $L61; shl.b64 %r127,%r112,3; mov.u64 %r129,%r113; .loc 1 255 11 mov.u64 %r120,0; .loc 1 256 24 mov.u64 %r223,%r120; $L62: st.u64 [%r129],%r223; .loc 1 255 27 add.u64 %r120,%r120,1; .loc 1 255 4 add.u64 %r129,%r129,%r127; setp.ne.u64 %r224,%r120,%r134; @ %r224 bra $L62; bra $L52; $L61: .loc 1 246 26 add.u64 %r119,%r119,1; .loc 1 246 3 add.u64 %r122,%r122,24; add.u64 %r132,%r132,8; add.u64 %r124,%r124,24; add.u64 %r66,%r66,8; add.u64 %r57,%r57,8; add.u64 %r58,%r58,8; setp.ne.u64 %r225,%r119,%r134; @ %r225 bra $L60; .loc 1 261 8 ld.u64 %r94,[%r161]; shl.b64 %r145,%r112,3; mov.u64 %r153,%r113; .loc 1 264 10 mov.u64 %r118,0; .loc 1 265 23 mov.u64 %r226,%r118; $L64: st.u64 [%r153],%r226; .loc 1 264 26 add.u64 %r118,%r118,1; .loc 1 264 3 add.u64 %r153,%r153,%r145; setp.ne.u64 %r227,%r118,%r134; @ %r227 bra $L64; .loc 1 276 9 setp.eq.u64 %r228,%r94,0; @ %r228 bra $L52; .loc 1 331 19 ld.u64 %r73,[%frame+120]; 
.loc 1 331 9 shl.b64 %r75,%r73,3; .loc 1 332 20 ld.u64 %r76,[%frame]; .loc 1 334 34 ld.u64 %r80,[%frame+240]; .loc 1 334 21 ld.u64 %r156,[%frame+360]; cvt.u32.u32 %r230,%r22; cvt.s64.s8 %r229,%r230; shl.b64 %r231,%r229,3; add.u64 %r48,%r231,%r59; .loc 1 272 12 mov.f64 %r104,0dfff0000000000000; setp.eq.u32 %r282,1,0; add.u64 %r281,%frame,368; .loc 1 340 13 mov.u64 %r284,0; .loc 1 359 32 add.u64 %r285,%frame,240; .loc 1 308 12 setp.eq.u32 %r287,%r163,0; $L83: .loc 1 280 10 @ %r282 bra $L65; $L72: .loc 1 284 11 ld.s8 %r234,[%r97]; cvt.u16.u32 %r233,%r234; setp.eq.u16 %r235,%r233,0; @ %r235 bra $L66; .loc 1 287 8 ld.u64 %r236,[%r113]; setp.ne.u64 %r237,%r236,0; @ %r237 bra $L67; mov.u64 %r61,%r281; mov.u64 %r72,%r113; mov.u64 %r150,%r156; $L68: .loc 1 289 38 add.u64 %r239,%r150,1; .loc 1 289 27 st.u64 [%r72],%r239; .loc 1 288 7 add.u64 %r72,%r72,%r145; setp.eq.u64 %r240,%r48,%r61; @ %r240 bra $L67; .loc 1 289 34 ld.u64 %r150,[%r61]; add.u64 %r61,%r61,8; bra $L68; $L67: .loc 1 290 9 ld.f64 %r54,[%r94]; .loc 1 290 8 setp.ge.f64 %r241,%r54,%r104; @ ! %r241 bra $L66; mov.u64 %r108,%r281; mov.u64 %r123,%r113; mov.u64 %r154,%r156; $L71: .loc 1 296 33 add.u64 %r243,%r154,1; .loc 1 296 22 st.u64 [%r123],%r243; .loc 1 295 9 add.u64 %r123,%r123,%r145; setp.eq.u64 %r244,%r48,%r108; @ %r244 bra $L84; .loc 1 296 29 ld.u64 %r154,[%r108]; add.u64 %r108,%r108,8; bra $L71; $L66: .loc 1 300 13 add.u64 %r94,%r94,%r75; .loc 1 301 14 add.u64 %r97,%r97,%r76; .loc 1 303 11 add.u64 %r156,%r156,1; .loc 1 303 4 st.u64 [%frame+360],%r156; setp.ne.u64 %r245,%r156,%r80; @ %r245 bra $L72; bra $L73; $L65: .loc 1 308 12 @ %r287 bra $L85; mov.u64 %r68,%r156; mov.u64 %r92,%r94; $L78: .loc 1 311 11 ld.s8 %r248,[%r97]; cvt.u16.u32 %r247,%r248; setp.eq.u16 %r249,%r247,0; @ %r249 bra $L75; .loc 1 311 22 ld.f64 %r63,[%r92]; .loc 1 311 19 setp.ge.f64 %r250,%r63,%r104; @ ! %r250 bra $L75; mov.u64 %r121,%r281; mov.u64 %r36,%r113; mov.u64 %r131,%r68; .loc 1 314 19 mov.u64 %r116,0; $L77: .loc 1 315 36 add.u64 %r252,%r131,1; .loc 1 315 25 st.u64 [%r36],%r252; .loc 1 314 35 add.u64 %r116,%r116,1; .loc 1 314 12 add.u64 %r36,%r36,%r145; setp.ge.s64 %r253,%r116,%r134; @ %r253 bra $L86; .loc 1 315 32 ld.u64 %r131,[%r121]; add.u64 %r121,%r121,8; bra $L77; $L86: .loc 1 313 19 mov.f64 %r104,%r63; $L75: .loc 1 317 13 add.u64 %r92,%r92,%r75; .loc 1 319 11 add.u64 %r68,%r68,1; .loc 1 319 4 st.u64 [%frame+360],%r68; setp.ne.u64 %r254,%r68,%r80; @ %r254 bra $L78; .loc 1 317 13 sub.u64 %r255,%r80,%r156; mad.lo.u64 %r94,%r255,%r75,%r94; bra $L73; $L85: mov.u64 %r79,%r156; mov.u64 %r96,%r97; mov.u64 %r93,%r94; $L74: .loc 1 323 11 ld.s8 %r258,[%r96]; cvt.u16.u32 %r257,%r258; setp.eq.u16 %r259,%r257,0; @ %r259 bra $L79; .loc 1 323 22 ld.f64 %r70,[%r93]; .loc 1 323 19 setp.gt.f64 %r260,%r70,%r104; @ ! 
%r260 bra $L79; mov.u64 %r155,%r281; mov.u64 %r159,%r113; mov.u64 %r126,%r79; .loc 1 326 12 mov.u64 %r115,0; $L81: .loc 1 327 36 add.u64 %r262,%r126,1; .loc 1 327 25 st.u64 [%r159],%r262; .loc 1 326 28 add.u64 %r115,%r115,1; .loc 1 326 5 add.u64 %r159,%r159,%r145; setp.ge.s64 %r263,%r115,%r134; @ %r263 bra $L87; .loc 1 327 32 ld.u64 %r126,[%r155]; add.u64 %r155,%r155,8; bra $L81; $L87: .loc 1 325 12 mov.f64 %r104,%r70; $L79: .loc 1 331 9 add.u64 %r93,%r93,%r75; .loc 1 332 10 add.u64 %r96,%r96,%r76; .loc 1 334 14 add.u64 %r79,%r79,1; .loc 1 334 7 st.u64 [%frame+360],%r79; setp.ne.u64 %r264,%r79,%r80; @ %r264 bra $L74; sub.u64 %r137,%r80,%r156; .loc 1 331 9 mad.lo.u64 %r94,%r75,%r137,%r94; .loc 1 332 10 mad.lo.u64 %r97,%r76,%r137,%r97; $L73: mov.u64 %r103,%r281; .loc 1 325 12 mov.u64 %r89,%r76; .loc 1 334 34 mov.u64 %r91,%r80; mov.u64 %r87,%r73; mov.u64 %r101,8; .loc 1 335 9 mov.u64 %r102,0; $L82: .loc 1 340 13 st.u64 [%r103+-8],%r284; .loc 1 343 23 mul.lo.u64 %r82,%r87,%r91; .loc 1 344 24 mul.lo.u64 %r84,%r91,%r89; .loc 1 345 5 add.u64 %r102,%r102,1; .loc 1 346 7 setp.eq.u64 %r269,%r102,%r134; @ %r269 bra $L52; .loc 1 354 16 ld.u64 %r270,[%r103]; add.u64 %r86,%r270,1; st.u64 [%r103],%r86; .loc 1 355 23 add.u64 %r272,%r280,%r101; ld.u64 %r87,[%r272]; .loc 1 355 13 sub.u64 %r273,%r87,%r82; shl.b64 %r274,%r273,3; add.u64 %r94,%r94,%r274; .loc 1 356 24 add.u64 %r275,%frame,%r101; ld.u64 %r89,[%r275]; .loc 1 356 14 sub.u64 %r276,%r89,%r84; add.u64 %r97,%r97,%r276; .loc 1 359 32 add.u64 %r278,%r285,%r101; ld.u64 %r91,[%r278]; .loc 1 359 7 add.u64 %r103,%r103,8; add.u64 %r101,%r101,8; setp.eq.u64 %r279,%r86,%r91; @ %r279 bra $L82; .loc 1 334 21 ld.u64 %r156,[%frame+360]; bra $L83; $L84: .loc 1 290 9 mov.f64 %r104,%r54; setp.eq.u32 %r282,1,1; bra $L65; $L52: .loc 1 362_gfortran_smaxloc0_8_r8 .visible .func _gfortran_smaxloc0_8_r84predpred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 379 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L114; .loc 1 379 20 ld.u32 %r40,[%r37381 7 {_gfortran_maxloc0_8_r8.loc 1 382 7 bra $L113; $L115: .loc 1 385 8 ld.s8 %r30,[%r36+28]; .loc 1 387 6 setp.gt.s64 %r46,%r30,0; @ %r46 bra $L1_gfortran_runtime_error117: .loc 1 390 15 ld.u64 %r24,[%r35]; .loc 1 390 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L118; .loc 1 392 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 393 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 394 24 st.u64 [%r35+8],%r24; .loc 1 395 29 mov.u64 call (%value_in),_gfortrani_xmallocarray95 27 st.u64 [%r35],%r24; bra $L119; $L118: .loc 1 397 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 397 11 ld.u32 %r62,[%r61+36.loc 1 399 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 404 8 ld.u64 %r24,[%r35]; $L119: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,3; mov.u64 %r29,%r24; .loc 1 405 10 mov.u64 %r32,0; .loc 1 406 23 mov.u64 %r68,%r32; $L120: st.u64 [%r29],%r68; .loc 1 405 24 add.u64 %r32,%r32,1; .loc 1 405 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L120; $L113: .loc 1 407 maxloc0_16_r8.o/1622802229_gfortran_maxloc0_16_r8 .visible .func _gfortran_maxloc0_16_fortran/generated/maxloc0_16_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_16_r8 .visible .func _gfortran_mmaxloc0_16_r8_gfortran_smaxloc0_16_r8 .visible .func _gfortran_smaxloc0_16_r8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func 
_gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,65,88,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_16_r8 .visible .func _gfortran_maxloc0_16_r8u64 %r94; .reg .u64 %r105; .reg .u64 %r108predpred %r160; .reg .predpredu32 %r167; .reg .u64 %r168; .reg .pred %r171; .reg .pred %r172; .reg .u64 %r174; .reg .u64 %r175; .reg .pred %r176; .reg .pred %r177; .reg .predu64 %r193; .reg .predmov.u64 %r124,%ar0; mov.u64 %r125,%ar1; mov.u32 %r126,%ar2; .loc 1 50 10 ld.s8 %r22,[%r125+28]; .loc 1 50 8 cvt.u32.u32 %r127,%r22; cvt.s64.s8 %r79,%r127; .loc 1 51 6 setp.gt.s64 %r128,%r79,0; @ %r128 bra $L2; .loc 1 52 5 cvta.const.u64 %r112_gfortran_runtime_errorr23,[%r124]; .loc 1 54 6 setp.ne.u64 %r131,%r23,0; @ %r131 bra $L3; .loc 1 56 7 st.u64 [%r124+48],%r23; add.u64 %r133,%r79,-1; st.u64 [%r124+56],%r133; mov.u64 %r134,1; st.u64 [%r124+40],%r134; .loc 1 57 28 cvt.u32.u64 %r135,%r134; st.u8 [%r124+28],%r135; .loc 1 58 24 st.u64 [%r124+8],%r23; .loc 1 59 29 mov.u64 %r138,1679call (%value_in),_gfortrani_xmallocarray1,[%value_in]; } mov.u64 %r23,%r141; .loc 1 59 27 st.u64 [%r124],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r143,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r144,[%r143+36]; setp.eq.u32 %r145,%r144,0; @ %r145 bra $L4; .loc 1 64r148; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r124]; $L4: .loc 1 68 11 ld.u64 %r80,[%r124+40]; add.u64 %r51,%r125,40; mov.u64 %r42,%frame; add.u64 %r76,%frame,120; .loc 1 70 10 mov.u64 %r108,0; add.u64 %r205,%frame,240; .loc 1 74 16 mov.u64 %r156,%r108; bra $L7; $L32: mov.u64 %r108,%r88; $L7: .loc 1 72 18 ld.u64 %r149,[%r51]; st.u64 [%r42],%r149; .loc 1 73 19 ld.u64 %r151,[%r51+16]; add.u64 %r150,%r151,1; ld.u64 %r152,[%r51+8]; sub.u64 %r31,%r150,%r152; .loc 1 73 17 st.u64 [%r76],%r31; .loc 1 74 16 shl.b64 %r154,%r108,3; add.u64 %r155,%r205,%r154; st.u64 [%r155],%r156; .loc 1 75 10 setp.gt.s64 %r157,%r31,0; @ %r157 bra $L5; shl.b64 %r120,%r80,4; mov.u64 %r105,%r23; .loc 1 78 11 mov.u64 %r89,0; .loc 1 79 24 mov.u64 %r158,%r89; $L6: st.u64 [%r105],%r158; st.u64 [%r105+8],%r158; .loc 1 78 27 add.u64 %r89,%r89,1; .loc 1 78 4 add.u64 %r105,%r105,%r120; setp.ne.u64 %r160,%r79,%r89; @ %r160 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r88,%r108,1; .loc 1 70 3 add.u64 %r51,%r51,24; add.u64 %r42,%r42,8; add.u64 %r76,%r76,8; setp.ne.u64 %r161,%r79,%r88; @ %r161 bra $L32; .loc 1 84 8 ld.u64 %r69,[%r125]; shl.b64 %r35,%r80,4; mov.u64 %r48,%r23; .loc 1 87 10 mov.u64 %r87,0; .loc 1 88 23 mov.u64 %r162,1; mov.u64 %r163,%r87; $L8: st.u64 [%r48],%r162; st.u64 [%r48+8],%r163; mov.u64 %r109,%r87; .loc 1 87 26 add.u64 %r87,%r87,1; .loc 1 87 3 add.u64 %r48,%r48,%r35; setp.ne.u64 %r164,%r108,%r109; @ %r164 bra $L8; .loc 1 101 9 setp.eq.u64 %r165,%r69,0; @ %r165 bra $L1; .loc 1 149 19 ld.u64 %r54,[%frame]; .loc 1 149 9 shl.b64 %r56,%r54,3; .loc 1 151 34 ld.u64 %r58,[%frame+120]; ld.u64 %r121,[%frame+240]; .loc 1 141 11 ld.f64 %r122,[%r69]; cvt.u32.u32 %r167,%r22; cvt.s64.s8 %r166,%r167; shl.b64 %r168,%r166,3; add.u64 %r46,%r205,%r168; .loc 1 97 12 mov.f64 %r74,0dfff0000000000000; setp.eq.u32 %r207,1,0; add.u64 %r206,%frame,248; .loc 1 157 13 mov.u64 %r210,0; .loc 1 
174 32 add.u64 %r211,%frame,120; .loc 1 126 12 setp.eq.u32 %r213,%r126,0; $L30: .loc 1 106 10 @ %r207 bra $L10; mov.u64 %r37,%r121; mov.u32 %r85,0; .loc 1 110 12 mov.u32 %r209,1; $L17: .loc 1 118 13 add.u64 %r123,%r69,%r56; .loc 1 110 11 setp.le.f64 %r171,%r74,%r122; @ ! %r171 bra $L51; setp.eq.u32 %r172,%r85,0; selp.u64 %r121,%r121,%r37,%r172; mov.u64 %r78,%r206; mov.u64 %r94,%r23; mov.u64 %r117,%r121; $L15: .loc 1 115 36 add.u64 %r174,%r117,1; st.u64 [%r94],%r174; shr.s64 %r175,%r174,63; st.u64 [%r94+8],%r175; .loc 1 114 5 add.u64 %r94,%r94,%r35; setp.eq.u64 %r176,%r46,%r78; @ %r176 bra $L33; .loc 1 115 32 ld.u64 %r117,[%r78]; add.u64 %r78,%r78,8; bra $L15; $L51: .loc 1 120 11 add.u64 %r37,%r37,1; .loc 1 120 4 setp.eq.u64 %r177,%r37,%r58; @ %r177 bra $L16; .loc 1 110 12 ld.f64 %r122,[%r123]; mov.u64 %r69,%r123; mov.u32 %r85,%r209; bra $L17; $L10: .loc 1 126 12 @ %r213 bra $L34; mov.u64 %r49,%r121; mov.u64 %r67,%r69; $L24: .loc 1 129 11 setp.le.f64 %r179,%r74,%r122; @ ! %r179 bra $L19; mov.u64 %r119,%r206; mov.u64 %r70,%r23; mov.u64 %r114,%r49; $L21: .loc 1 133 35 add.u64 %r181,%r114,1; st.u64 [%r70],%r181; shr.s64 %r182,%r181,63; st.u64 [%r70+8],%r182; .loc 1 132 10 add.u64 %r70,%r70,%r35; setp.eq.u64 %r183,%r46,%r119; @ %r183 bra $L35; .loc 1 133 31 ld.u64 %r114,[%r119]; add.u64 %r119,%r119,8; bra $L21; $L35: .loc 1 131 18 mov.f64 %r74,%r122; $L19: .loc 1 135 12 add.u64 %r67,%r67,%r56; .loc 1 137 17 add.u64 %r49,%r49,1; .loc 1 137 10 st.u64 [%frame+240],%r49; setp.ne.u64 %r184,%r49,%r58; @ %r184 bra $L22; .loc 1 135 12 sub.u64 %r185,%r58,%r121; mad.lo.u64 %r69,%r185,%r56,%r69; bra $L23; $L22: .loc 1 129 12 ld.f64 %r122,[%r67]; bra $L24; $L34: mov.u64 %r57,%r121; mov.u64 %r68,%r69; $L18: .loc 1 141 10 setp.lt.f64 %r187,%r74,%r122; @ ! %r187 bra $L25; mov.u64 %r112,%r206; mov.u64 %r116,%r23; mov.u64 %r111,%r57; $L27: .loc 1 145 35 add.u64 %r189,%r111,1; st.u64 [%r116],%r189; shr.s64 %r190,%r189,63; st.u64 [%r116+8],%r190; .loc 1 144 4 add.u64 %r116,%r116,%r35; setp.eq.u64 %r191,%r46,%r112; @ %r191 bra $L36; .loc 1 145 31 ld.u64 %r111,[%r112]; add.u64 %r112,%r112,8; bra $L27; $L36: .loc 1 143 18 mov.f64 %r74,%r122; $L25: .loc 1 149 9 add.u64 %r68,%r68,%r56; .loc 1 151 14 add.u64 %r57,%r57,1; .loc 1 151 7 st.u64 [%frame+240],%r57; setp.ne.u64 %r192,%r57,%r58; @ %r192 bra $L28; .loc 1 149 9 sub.u64 %r193,%r58,%r121; mad.lo.u64 %r69,%r193,%r56,%r69; bra $L23; $L28: .loc 1 141 11 ld.f64 %r122,[%r68]; bra $L18; $L16: st.u64 [%frame+240],%r58; .loc 1 118 13 mov.u64 %r69,%r123; $L23: mov.u64 %r71,%r206; .loc 1 151 34 mov.u64 %r65,%r58; mov.u64 %r63,%r54; .loc 1 152 9 mov.u64 %r73,0; $L29: .loc 1 157 13 st.u64 [%r71+-8],%r210; .loc 1 160 23 mul.lo.u64 %r60,%r63,%r65; mov.u64 %r110,%r73; .loc 1 161 5 add.u64 %r73,%r73,1; .loc 1 162 7 setp.eq.u64 %r197,%r110,%r108; @ %r197 bra $L1; .loc 1 170 16 ld.u64 %r198,[%r71]; add.u64 %r62,%r198,1; st.u64 [%r71],%r62; shl.b64 %r40,%r73,3; .loc 1 171 23 add.u64 %r199,%frame,%r40; ld.u64 %r63,[%r199]; .loc 1 171 13 sub.u64 %r200,%r63,%r60; shl.b64 %r201,%r200,3; add.u64 %r69,%r69,%r201; .loc 1 174 32 add.u64 %r203,%r211,%r40; ld.u64 %r65,[%r203]; .loc 1 174 7 add.u64 %r71,%r71,8; setp.eq.u64 %r204,%r62,%r65; @ %r204 bra $L29; ld.u64 %r121,[%frame+240]; .loc 1 141 11 ld.f64 %r122,[%r69]; bra $L30; $L33: mov.f64 %r74,%r122; setp.eq.u32 %r207,1,1; bra $L10; $L1: .loc 1 1_gfortran_mmaxloc0_16_r8 .visible .func _gfortran_mmaxloc0_16_r8pred %r168; .reg .predpredpred %r226; .reg .u64 %r227; .reg .pred %r229; .reg .predpred %r234; .reg .u64u16pred %r248; .reg .pred 
%r249; .reg .u64 %r251; .reg .u64 %r252; .reg .pred %r253; .reg .pred %r254; .reg .u16 %r256; .reg .u32predpred %r274; .reg .predu64 %r283; .reg .u64u64 %r287; .reg .u64 %r289; .reg .pred %r290; .reg .u64 %r291; .reg .u64 %r292; .reg .predpred %r298; mov.u64 %r164,%ar0; mov.u64 %r165,%ar1; mov.u64 %r166,%ar2; mov.u32 %r167,%ar3; .loc 1 202 6 setp.ne.u64 %r168,%r166,0; @ %r168 bra $L1617; call _gfortran_maxloc0_16_r8.loc 1 205 7 bra $L52; $L53: .loc 1 208 10 ld.s8 %r22,[%r165+28]; .loc 1 209 6 setp.gt.s32 %r172,%r22,0; @ %r172 bra $L55; .loc 1 210 5 cvta.const.u64 %r173173_gfortran_runtime_error5: .loc 1 212 6 ld.u64 %r175,[%r164]; setp.ne.u64 %r176,%r175,0; @ %r176 bra $L56; .loc 1 214 7 st.u64 [%r164+48],%r175; add.u32 %r178,%r22,-1; cvt.s64.s32 %r179,%r178; st.u64 [%r164+56],%r179; mov.u64 %r180,1; st.u64 [%r164+40],%r180; .loc 1 215 28 cvt.u32.u64 %r181,%r180; st.u8 [%r164+28],%r181; .loc 1 216 24 st.u64 [%r164+8],%r175; .loc 1 217 29 cvt.u32.u32 %r186,%r22; cvt.s64.s8 %r185,%r186; mov.u64 %r184,16call (%value_in),_gfortrani_xmallocarray187,[%value_in]; } .loc 1 217 27 st.u64 [%r164],%r187; bra $L57; $L56: .loc 1 221 11 cvta.global.u64 %r189,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r190,[%r189+36]; setp.eq.u32 %r191,%r190,0164r194; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19716r197194; call _gfortrani_bounds_equal_extents$L57: .loc 1 231 15 ld.u64 %r31,[%r166+16]; .loc 1 233 9 ld.u64 %r101,[%r166]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r199,%r32,-4; and.b32 %r200,%r199,-5; set.u32.eq.u32 %r202,%r200,0; neg.s32 %r203,%r202; .loc 1 235 22 add.u32 %r204,%r32,-1; set.u32.le.u32 %r206,%r204,1; neg.s32 %r207,%r206; .loc 1 235 6 cvt.u16.u32 %r209,%r203; cvt.u16.u32 %r210,%r207; or.b16 %r208,%r209,%r210; cvt.u32.u16 %r211,%r208; cvt.u16.u8 %r212,%r211; setp.ne.u16 %r213,%r212,0; @ %r213 bra $L58; .loc 1 237 7 setp.ne.u32 %r215,%r32,16; @ %r215 bra $L59; $L58: .loc 1 244 11 ld.u64 %r116,[%r164+40]; .loc 1 245 8 ld.u64 %r117,[%r164]; .loc 1 246 17 cvt.u32.u32 %r216,%r22; cvt.s64.s8 %r136,%r216; add.u64 %r153,%r165,40; add.u64 %r291,%frame,120; mov.u64 %r150,%r291; add.u64 %r151,%r166,40; mov.u64 %r148,%frame; add.u64 %r125,%frame,240; add.u64 %r48,%frame,360; mov.u64 %r135,%r48; .loc 1 246 10 mov.u64 %r122,0; .loc 1 251 16 mov.u64 %r225,%r122; bra $L60; $L59: .loc 1 242 5 cvta.const.u64 %r217,$LC3stack; call _gfortran_runtime_error248 18 ld.u64 %r219,[%r153]; st.u64 [%r150],%r219; .loc 1 249 20 ld.u64 %r221,[%r151]; mul.lo.u64 %r220,%r221,%r31; .loc 1 249 18 st.u64 [%r148],%r220; .loc 1 250 19 ld.u64 %r223,[%r153+16]; add.u64 %r222,%r223,1; ld.u64 %r224,[%r153+8]; sub.u64 %r47,%r222,%r224; .loc 1 250 17 st.u64 [%r125],%r47; .loc 1 251 16 st.u64 [%r135],%r225; .loc 1 252 10 setp.gt.s64 %r226,%r47,0; @ %r226 bra $L61; shl.b64 %r130,%r116,4; mov.u64 %r132,%r117; .loc 1 255 11 mov.u64 %r123,0; .loc 1 256 24 mov.u64 %r227,%r123; $L62: st.u64 [%r132],%r227; st.u64 [%r132+8],%r227; .loc 1 255 27 add.u64 %r123,%r123,1; .loc 1 255 4 add.u64 %r132,%r132,%r130; setp.ne.u64 %r229,%r123,%r136; @ %r229 bra $L62; bra $L52; $L61: .loc 1 246 26 add.u64 %r122,%r122,1; .loc 1 246 3 add.u64 %r153,%r153,24; add.u64 %r150,%r150,8; add.u64 %r151,%r151,24; add.u64 %r148,%r148,8; add.u64 %r125,%r125,8; add.u64 %r135,%r135,8; setp.ne.u64 %r230,%r122,%r136; @ %r230 bra $L60; .loc 1 261 8 ld.u64 %r98,[%r165]; shl.b64 %r159,%r116,4; mov.u64 %r157,%r117; .loc 1 264 10 mov.u64 %r121,0; .loc 1 265 23 mov.u64 %r231,%r121; $L64: st.u64 [%r157],%r231; st.u64 
[%r157+8],%r231; .loc 1 264 26 add.u64 %r121,%r121,1; .loc 1 264 3 add.u64 %r157,%r157,%r159; setp.ne.u64 %r233,%r121,%r136; @ %r233 bra $L64; .loc 1 276 9 setp.eq.u64 %r234,%r98,0; @ %r234 bra $L52; .loc 1 331 19 ld.u64 %r77,[%frame+120]; .loc 1 331 9 shl.b64 %r79,%r77,3; .loc 1 332 20 ld.u64 %r80,[%frame]; .loc 1 334 34 ld.u64 %r84,[%frame+240]; .loc 1 334 21 ld.u64 %r161,[%frame+360]; cvt.u32.u32 %r236,%r22; cvt.s64.s8 %r235,%r236; shl.b64 %r237,%r235,3; add.u64 %r72,%r237,%r48; .loc 1 272 12 mov.f64 %r105,0dfff0000000000000; setp.eq.u32 %r293,1,0; add.u64 %r292,%frame,368; .loc 1 340 13 mov.u64 %r295,0; .loc 1 359 32 add.u64 %r296,%frame,240; .loc 1 308 12 setp.eq.u32 %r298,%r167,0; $L83: .loc 1 280 10 @ %r293 bra $L65; $L72: .loc 1 284 11 ld.s8 %r240,[%r101]; cvt.u16.u32 %r239,%r240; setp.eq.u16 %r241,%r239,0; @ %r241 bra $L66; .loc 1 287 8 ld.u64 %r242,[%r117]; ld.u64 %r243,[%r117+8]; or.b64 %r242,%r242,%r243; setp.ne.u64 %r244,%r242,0; @ %r244 bra $L67; mov.u64 %r99,%r292; mov.u64 %r102,%r117; mov.u64 %r152,%r161; $L68: .loc 1 289 38 add.u64 %r246,%r152,1; st.u64 [%r102],%r246; shr.s64 %r247,%r246,63; st.u64 [%r102+8],%r247; .loc 1 288 7 add.u64 %r102,%r102,%r159; setp.eq.u64 %r248,%r72,%r99; @ %r248 bra $L67; .loc 1 289 34 ld.u64 %r152,[%r99]; add.u64 %r99,%r99,8; bra $L68; $L67: .loc 1 290 9 ld.f64 %r54,[%r98]; .loc 1 290 8 setp.ge.f64 %r249,%r54,%r105; @ ! %r249 bra $L66; mov.u64 %r127,%r292; mov.u64 %r128,%r117; mov.u64 %r156,%r161; $L71: .loc 1 296 33 add.u64 %r251,%r156,1; st.u64 [%r128],%r251; shr.s64 %r252,%r251,63; st.u64 [%r128+8],%r252; .loc 1 295 9 add.u64 %r128,%r128,%r159; setp.eq.u64 %r253,%r72,%r127; @ %r253 bra $L84; .loc 1 296 29 ld.u64 %r156,[%r127]; add.u64 %r127,%r127,8; bra $L71; $L66: .loc 1 300 13 add.u64 %r98,%r98,%r79; .loc 1 301 14 add.u64 %r101,%r101,%r80; .loc 1 303 11 add.u64 %r161,%r161,1; .loc 1 303 4 st.u64 [%frame+360],%r161; setp.ne.u64 %r254,%r161,%r84; @ %r254 bra $L72; bra $L73; $L65: .loc 1 308 12 @ %r298 bra $L85; mov.u64 %r67,%r161; mov.u64 %r96,%r98; $L78: .loc 1 311 11 ld.s8 %r257,[%r101]; cvt.u16.u32 %r256,%r257; setp.eq.u16 %r258,%r256,0; @ %r258 bra $L75; .loc 1 311 22 ld.f64 %r63,[%r96]; .loc 1 311 19 setp.ge.f64 %r259,%r63,%r105; @ ! %r259 bra $L75; mov.u64 %r49,%r292; mov.u64 %r61,%r117; mov.u64 %r149,%r67; .loc 1 314 19 mov.u64 %r120,0; $L77: .loc 1 315 36 add.u64 %r261,%r149,1; st.u64 [%r61],%r261; shr.s64 %r262,%r261,63; st.u64 [%r61+8],%r262; .loc 1 314 35 add.u64 %r120,%r120,1; .loc 1 314 12 add.u64 %r61,%r61,%r159; setp.ge.s64 %r263,%r120,%r136; @ %r263 bra $L86; .loc 1 315 32 ld.u64 %r149,[%r49]; add.u64 %r49,%r49,8; bra $L77; $L86: .loc 1 313 19 mov.f64 %r105,%r63; $L75: .loc 1 317 13 add.u64 %r96,%r96,%r79; .loc 1 319 11 add.u64 %r67,%r67,1; .loc 1 319 4 st.u64 [%frame+360],%r67; setp.ne.u64 %r264,%r67,%r84; @ %r264 bra $L78; .loc 1 317 13 sub.u64 %r265,%r84,%r161; mad.lo.u64 %r98,%r265,%r79,%r98; bra $L73; $L85: mov.u64 %r82,%r161; mov.u64 %r100,%r101; mov.u64 %r97,%r98; $L74: .loc 1 323 11 ld.s8 %r268,[%r100]; cvt.u16.u32 %r267,%r268; setp.eq.u16 %r269,%r267,0; @ %r269 bra $L79; .loc 1 323 22 ld.f64 %r71,[%r97]; .loc 1 323 19 setp.gt.f64 %r270,%r71,%r105; @ ! 
%r270 bra $L79; mov.u64 %r124,%r292; mov.u64 %r36,%r117; mov.u64 %r134,%r82; .loc 1 326 12 mov.u64 %r119,0; $L81: .loc 1 327 36 add.u64 %r272,%r134,1; st.u64 [%r36],%r272; shr.s64 %r273,%r272,63; st.u64 [%r36+8],%r273; .loc 1 326 28 add.u64 %r119,%r119,1; .loc 1 326 5 add.u64 %r36,%r36,%r159; setp.ge.s64 %r274,%r119,%r136; @ %r274 bra $L87; .loc 1 327 32 ld.u64 %r134,[%r124]; add.u64 %r124,%r124,8; bra $L81; $L87: .loc 1 325 12 mov.f64 %r105,%r71; $L79: .loc 1 331 9 add.u64 %r97,%r97,%r79; .loc 1 332 10 add.u64 %r100,%r100,%r80; .loc 1 334 14 add.u64 %r82,%r82,1; .loc 1 334 7 st.u64 [%frame+360],%r82; setp.ne.u64 %r275,%r82,%r84; @ %r275 bra $L74; sub.u64 %r139,%r84,%r161; .loc 1 331 9 mad.lo.u64 %r98,%r79,%r139,%r98; .loc 1 332 10 mad.lo.u64 %r101,%r80,%r139,%r101; $L73: mov.u64 %r111,%r292; .loc 1 325 12 mov.u64 %r93,%r80; .loc 1 334 34 mov.u64 %r95,%r84; mov.u64 %r91,%r77; mov.u64 %r110,8; .loc 1 335 9 mov.u64 %r103,0; $L82: .loc 1 340 13 st.u64 [%r111+-8],%r295; .loc 1 343 23 mul.lo.u64 %r86,%r91,%r95; .loc 1 344 24 mul.lo.u64 %r88,%r95,%r93; .loc 1 345 5 add.u64 %r103,%r103,1; .loc 1 346 7 setp.eq.u64 %r280,%r103,%r136; @ %r280 bra $L52; .loc 1 354 16 ld.u64 %r281,[%r111]; add.u64 %r90,%r281,1; st.u64 [%r111],%r90; .loc 1 355 23 add.u64 %r283,%r291,%r110; ld.u64 %r91,[%r283]; .loc 1 355 13 sub.u64 %r284,%r91,%r86; shl.b64 %r285,%r284,3; add.u64 %r98,%r98,%r285; .loc 1 356 24 add.u64 %r286,%frame,%r110; ld.u64 %r93,[%r286]; .loc 1 356 14 sub.u64 %r287,%r93,%r88; add.u64 %r101,%r101,%r287; .loc 1 359 32 add.u64 %r289,%r296,%r110; ld.u64 %r95,[%r289]; .loc 1 359 7 add.u64 %r111,%r111,8; add.u64 %r110,%r110,8; setp.eq.u64 %r290,%r90,%r95; @ %r290 bra $L82; .loc 1 334 21 ld.u64 %r161,[%frame+360]; bra $L83; $L84: .loc 1 290 9 mov.f64 %r105,%r54; setp.eq.u32 %r293,1,1; bra $L65; $L52: .loc 1 362_gfortran_smaxloc0_16_r8 .visible .func _gfortran_smaxloc0_16_r84predpred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 379 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L114; .loc 1 379 20 ld.u32 %r40,[%r37381 7 {_gfortran_maxloc0_16_r8.loc 1 382 7 bra $L113; $L115: .loc 1 385 8 ld.s8 %r30,[%r36+28]; .loc 1 387 6 setp.gt.s64 %r46,%r30,0; @ %r46 bra $L1_gfortran_runtime_error117: .loc 1 390 15 ld.u64 %r24,[%r35]; .loc 1 390 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L118; .loc 1 392 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 393 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 394 24 st.u64 [%r35+8],%r24; .loc 1 395 29call (%value_in),_gfortrani_xmallocarray95 27 st.u64 [%r35],%r24; bra $L119; $L118: .loc 1 397 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 397 11 ld.u32 %r62,[%r61+36.loc 1 399 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 404 8 ld.u64 %r24,[%r35]; $L119: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,4; mov.u64 %r29,%r24; .loc 1 405 10 mov.u64 %r32,0; .loc 1 406 23 mov.u64 %r68,%r32; $L120: st.u64 [%r29],%r68; st.u64 [%r29+8],%r68; .loc 1 405 24 add.u64 %r32,%r32,1; .loc 1 405 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r70,%r30,%r32; @ %r70 bra $L120; $L113: .loc 1 407 maxloc0_4_r10.o/162280222837 maxloc0_8_r16.o/1622802229maxloc1_4_i1.o/ 1622802229_gfortran_maxloc1_4_i1 .visible .func _gfortran_maxloc1_4_i1.file 1 "../../../../libgfortran/generated/maxloc1_4_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_4_i1 .visible .func _gfortran_mmaxloc1_4__gfortran_smaxloc1_4_i1 .visible .func _gfortran_smaxloc1_4_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 
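The maxloc0 routines summarised above are machine-generated, so their intent is hard to read back out of the PTX. The snippet below is only an illustrative C sketch of what a contiguous, unmasked, rank-1 specialisation of that reduction loop computes (the 1-based position of the first maximum); the names `maxloc0_sketch`, `array`, and `n` are hypothetical, and the real routines additionally walk an array descriptor, honour an optional mask, and write one index per dimension into the result array.

```c
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Illustrative sketch only: a contiguous, unmasked, rank-1 analogue of the
 * maxloc0 reduction loops seen in the PTX above.  The result is the 1-based
 * position of the first maximum, or 0 for a zero-sized array, which matches
 * Fortran MAXLOC semantics. */
static int32_t maxloc0_sketch(const double *array, size_t n)
{
    if (n == 0)
        return 0;                    /* empty array: no location found */

    double maxval = array[0];
    size_t maxpos = 0;

    for (size_t i = 1; i < n; i++) {
        if (array[i] > maxval) {     /* strictly greater: keep the first maximum */
            maxval = array[i];
            maxpos = i;
        }
    }
    return (int32_t)(maxpos + 1);    /* Fortran indices are 1-based */
}

int main(void)
{
    const double a[] = { 1.5, 4.0, 4.0, 2.0 };
    printf("%d\n", maxloc0_sketch(a, 4));   /* prints 2: first occurrence of the max */
    return 0;
}
```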
// maxloc1_4_i1.o (continued)
//   extern decls: _gfortrani_compile_options, _gfortran_runtime_error,
//     _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return,
//     _gfortrani_bounds_equal_extents
//   string constants (decoded from the embedded byte arrays, partly truncated):
//     "rank of return array incorrect in MAXLOC intrinsic: is %ld, should be %ld",
//     "MAXLOC", "... incorrect in MAXLOC intrinsic: is %ld, should be between 1 and %ld",
//     "Funny sized logical array ...",
//     "Incorrect extent in return value of MAXLOC intrinsic in dimension %ld: is %ld,
//      should be %ld"
//   bodies: extraction-damaged PTX for _gfortran_maxloc1_4_i1 and the start of
//     _gfortran_mmaxloc1_4_i1.  The surviving .loc references point into
//     maxloc1_4_i1.c (rank/dimension and extent checks roughly at lines 58-132, the
//     per-dimension reduction loop at lines 155-217 for the unmasked routine, and the
//     corresponding checks from line 251 onward for the masked variant); calls to
//     _gfortran_runtime_error, _gfortrani_xmallocarray,
//     _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents are
//     visible, but their argument setup did not survive extraction.
%r370,%r71,-1; st.u64 [%r145+16],%r370; st.u64 [%r145],%r135; .loc 1 316 7 add.u64 %r145,%r145,24; add.u64 %r142,%r142,8; setp.ne.u64 %r371,%r133,%r145; @ %r371 bra $L75; bra $L71; $L73: .loc 1 335 4 st.u64 [%r220+48],%r160; mov.u64 %r373,-1; st.u64 [%r220+56],%r373; mov.u64 %r374,1; st.u64 [%r220+40],%r374; .loc 1 336 4 bra $L51; $L118: .loc 1 339 24call (%value_in),_gfortrani_xmallocarray77,[%value_in]; } .loc 1 339 22 st.u64 [%r220],%r377; bra $L76; $L66: .loc 1 344 19 ld.s8 %r379,[%r220+28]; .loc 1 344 10 setp.eq.u64 %r380,%r379,%r153; @ %r380 bra $L77; .loc 1 345 2 cvta.const.u64 %r381381_gfortran_runtime_error77: .loc 1 347 11 cvta.global.u64 %r383,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r384,[%r383+36]; setp.eq.u32 %r385,%r384,0; @ %r385 bra $L76; .loc 1 349 4 add.u64 %r455,%frame,360; cvta.const.u64 %r389,$LC2r455388389; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r393,$LC393389; call _gfortrani_bounds_equal_extents$L76: .loc 1 356 3 setp.ne.u64 %r395,%r153,0; @ %r395 bra $L78; $L81: .loc 1 364 8 ld.u64 %r170,[%r220]; .loc 1 365 8 ld.u64 %r168,[%r221]; .loc 1 367 9 setp.ne.u64 %r396,%r168,0; @ %r396 bra $L79; bra $L51; $L78: add.u64 %r197,%frame,480; add.u64 %r196,%r220,40; add.u64 %r192,%frame,120; cvt.u32.u32 %r398,%r23; cvt.s64.s8 %r397,%r398; add.u64 %r146,%r397,-1; .loc 1 356 3 mov.u64 %r171,0; add.u64 %r455,%frame,360; .loc 1 358 16 mov.u64 %r399,%r171; $L80: st.u64 [%r197],%r399; .loc 1 359 18 ld.u64 %r400,[%r196]; st.u64 [%r192],%r400; .loc 1 360 17 shl.b64 %r402,%r171,3; add.u64 %r403,%r455,%r402; .loc 1 360 10 ld.u64 %r404,[%r403]; setp.le.s64 %r405,%r404,0; @ %r405 bra $L51; .loc 1 356 26 add.u64 %r171,%r171,1; .loc 1 356 3 add.u64 %r197,%r197,8; add.u64 %r196,%r196,24; add.u64 %r192,%r192,8; setp.ne.u64 %r406,%r146,%r171; @ %r406 bra $L80; bra $L81; $L79: .loc 1 290 12 mul.lo.u64 %r45,%r42,%r34; ld.u64 %r189,[%frame+480]; .loc 1 430 22 ld.u64 %r100,[%frame+240]; .loc 1 431 23 ld.u64 %r102,[%frame]; .loc 1 432 22 ld.u64 %r104,[%frame+120]; .loc 1 432 12 shl.b64 %r105,%r104,2; .loc 1 434 32 ld.u64 %r29,[%frame+360]; .loc 1 441 23 mul.lo.u64 %r38,%r100,%r29; .loc 1 442 24 mul.lo.u64 %r407,%r102,%r29; .loc 1 442 10 neg.s64 %r136,%r407; .loc 1 443 23 mul.lo.u64 %r408,%r104,%r29; .loc 1 443 9 shl.b64 %r409,%r408,2; neg.s64 %r190,%r409; setp.le.s64 %r454,%r153,1; .loc 1 408 9 setp.ne.u32 %r459,%r224,0; cvt.u32.u32 %r460,%r23; cvt.s64.s8 %r461,%r460; add.u64 %r462,%r461,-1; add.u64 %r463,%frame,240; add.u64 %r464,%frame,360; add.u64 %r465,%frame,120; $L97: .loc 1 356 3 mov.u64 %r163,%r169; mov.u64 %r162,%r168; .loc 1 386 9 mov.u64 %r161,0; $L85: .loc 1 389 7 ld.s8 %r219,[%r163]; .loc 1 389 6 setp.eq.u32 %r457,%r219,0; @ %r457 bra $L82; .loc 1 397 11 ld.s8 %r143,[%r162]; .loc 1 398 11 cvt.u32.u64 %r411,%r161; add.u32 %r141,%r411,1; .loc 1 408 9 @ %r459 bra $L83; bra $L119; $L82: .loc 1 386 24 add.u64 %r161,%r161,1; .loc 1 386 32 add.u64 %r162,%r162,%r157; .loc 1 386 47 add.u64 %r163,%r163,%r45; .loc 1 386 2 setp.ne.u64 %r413,%r154,%r161; @ %r413 bra $L85; .loc 1 385 9 mov.u32 %r141,%r219; bra $L86; $L119: .loc 1 418 8 setp.gt.s64 %r414,%r154,%r161; @ %r414 bra $L87; bra $L86; $L83: .loc 1 409 8 setp.le.s64 %r415,%r154,%r161; @ %r415 bra $L86; add.u64 %r204,%r161,1; $L89: .loc 1 411 8 @ %r457 bra $L88; .loc 1 411 18 ld.s8 %r89,[%r162]; .loc 1 411 15 cvt.u16.u32 %r417,%r89; cvt.u16.u32 %r418,%r143; setp.lt.s16 %r419,%r417,%r418; @ %r419 bra $L88; .loc 1 414 16 cvt.u32.u64 %r141,%r204; mov.u32 %r143,%r89; $L88: .loc 1 409 33 add.u64 %r162,%r162,%r157; .loc 1 
409 48 add.u64 %r163,%r163,%r45; .loc 1 409 8 add.u64 %r205,%r204,1; setp.eq.u64 %r420,%r154,%r204; @ %r420 bra $L86; .loc 1 411 9 ld.s8 %r219,[%r163]; mov.u64 %r204,%r205; setp.eq.u32 %r457,%r219,0; bra $L89; $L87: add.u64 %r200,%r161,1; bra $L91; $L98: mov.u64 %r200,%r201; $L91: .loc 1 420 8 ld.s8 %r422,[%r163]; cvt.u16.u32 %r421,%r422; setp.eq.u16 %r423,%r421,0; @ %r423 bra $L90; .loc 1 420 18 ld.s8 %r94,[%r162]; .loc 1 420 15 cvt.u16.u32 %r424,%r94; cvt.u16.u32 %r425,%r143; setp.le.s16 %r426,%r424,%r425; @ %r426 bra $L90; .loc 1 423 16 cvt.u32.u64 %r141,%r200; mov.u32 %r143,%r94; $L90: .loc 1 418 33 add.u64 %r162,%r162,%r157; .loc 1 418 48 add.u64 %r163,%r163,%r45; .loc 1 418 8 add.u64 %r201,%r200,1; setp.ne.u64 %r427,%r154,%r200; @ %r427 bra $L98; $L86: .loc 1 426 8 st.u32 [%r170],%r141; .loc 1 429 15 add.u64 %r189,%r189,1; .loc 1 430 12 add.u64 %r168,%r168,%r100; .loc 1 431 13 add.u64 %r169,%r169,%r102; .loc 1 432 12 add.u64 %r170,%r170,%r105; .loc 1 434 13 setp.ne.u64 %r428,%r29,%r189; @ %r428 bra $L97; .loc 1 442 10 add.u64 %r165,%r169,%r136; .loc 1 443 9 add.u64 %r166,%r170,%r190; .loc 1 445 7 @ %r454 bra $L51; add.u64 %r151,%frame,488; .loc 1 441 23 mov.u64 %r108,%r38; .loc 1 445 7 mov.u64 %r214,8; .loc 1 444 5 mov.u64 %r167,1; .loc 1 438 13 mov.u64 %r458,0; bra $L94; $L95: st.u64 [%r151],%r458; .loc 1 441 23 mul.lo.u64 %r108,%r122,%r121; .loc 1 442 24 mul.lo.u64 %r434,%r124,%r121; .loc 1 442 10 sub.u64 %r165,%r169,%r434; .loc 1 443 23 mul.lo.u64 %r435,%r126,%r121; .loc 1 443 9 shl.b64 %r436,%r435,2; sub.u64 %r166,%r170,%r436; .loc 1 444 5 add.u64 %r167,%r167,1; .loc 1 445 7 add.u64 %r151,%r151,8; add.u64 %r214,%r214,8; setp.eq.u64 %r437,%r167,%r462; @ %r437 bra $L51; $L94: .loc 1 453 16 ld.u64 %r438,[%r151]; add.u64 %r121,%r438,1; st.u64 [%r151],%r121; .loc 1 454 23 add.u64 %r440,%r463,%r214; ld.u64 %r122,[%r440]; .loc 1 454 13 sub.u64 %r441,%r122,%r108; add.u64 %r168,%r168,%r441; .loc 1 455 24 add.u64 %r442,%frame,%r214; ld.u64 %r124,[%r442]; .loc 1 455 14 add.u64 %r169,%r165,%r124; .loc 1 456 23 add.u64 %r444,%r465,%r214; ld.u64 %r126,[%r444]; .loc 1 456 13 shl.b64 %r445,%r126,2; add.u64 %r170,%r166,%r445; .loc 1 434 32 add.u64 %r447,%r464,%r214; ld.u64 %r130,[%r447]; .loc 1 434 13 setp.eq.u64 %r448,%r121,%r130; @ %r448 bra $L95; .loc 1 438 13 mov.u64 %r189,0; bra $L97; $L117: .loc 1 312 6 ld.u64 %r449,[%r220]; setp.eq.u64 %r450,%r449,0; @ ! 
%r450 bra $L66; bra $L96; $L51: .loc 1 460_gfortran_smaxloc1_4_i1 .visible .func _gfortran_smaxloc1_4_, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u32 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L121; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L122; $L121:r152154; call _gfortran_maxloc1_4_490 7 bra $L120; $L122: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r296,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L124; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L125; $L131: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L126; bra $L179; $L124: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error125: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r302,0; $L130: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L128; .loc 1 505 17 st.u64 [%r139],%r36; bra $L129; $L128: .loc 1 508 12 st.u64 [%r139],%r302; $L129: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L130; bra $L131; $L179: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L133; bra $L132; $L126: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r301,0; $L136: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L134; .loc 1 513 17 st.u64 [%r43],%r41; bra $L135; $L134: .loc 1 517 12 st.u64 [%r43],%r301; $L135: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L136; bra $L180; $L132: .loc 1 524 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L138; add.u64 %r295,%frame,120; $L158: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L139; $L138: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L140; bra $L181; $L142: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L139: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L142; bra $L138; $L140: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L120; $L181: .loc 1 547 2496246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L146: .loc 1 573 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L143; bra $L144; $L133: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L145; .loc 1 552 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error145: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L146; .loc 1 559 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L144; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L148: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L147; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error147: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L148; $L143: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ %r268 bra 
$L149; shl.b64 %r266,%r296,3; bra $L150; $L149: mov.u64 %r266,8; $L150: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L151: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L151; $L144: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,2; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 583 13 mov.u32 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L157: st.u32 [%r98],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L157; .loc 1 596 7 @ ! %r294 bra $L182; bra $L120; $L156: .loc 1 591 13 st.u64 [%r99],%r297; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L155; bra $L120; $L182: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; .loc 1 591 13 mov.u64 %r297,0; $L155: .loc 1 600 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,2; add.u64 %r98,%r98,%r291; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L156; .loc 1 591 13 mov.u64 %r80,0; bra $L157; $L180: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r293,%r98,0; @ ! 
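The *_4_i1 entry points above implement Fortran's MAXLOC reduced along one dimension for integer(1) arrays, storing a 4-byte integer result. As a rough illustration of the per-column work this kind of generated function performs (walk one extent with a stride, track the largest element and its 1-based position), here is a minimal C sketch; the helper name, signature, and first-occurrence tie-breaking are illustrative assumptions, not libgfortran's actual interface, which operates on full array descriptors.

#include <stdint.h>
#include <stddef.h>

/* Hypothetical helper: MAXLOC of one strided column of int8_t values.
 * Returns the 1-based position of the first maximum, or 0 for an empty
 * column -- the same shape of result the *_4_i1 functions store into
 * their 4-byte integer destination. */
int32_t maxloc1_column_i1(const int8_t *base, size_t n, ptrdiff_t stride)
{
    if (n == 0)
        return 0;                       /* empty extent: MAXLOC is 0 */

    int8_t best = base[0];
    int32_t pos = 1;                    /* Fortran positions are 1-based */
    for (size_t i = 1; i < n; i++) {
        int8_t v = base[(ptrdiff_t)i * stride];
        if (v > best) {                 /* keep the first occurrence of the maximum */
            best = v;
            pos = (int32_t)(i + 1);
        }
    }
    return pos;
}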
maxloc1_8_i1.o/         1622802229
// PTX object compiled from "../../../../libgfortran/generated/maxloc1_8_i1.c".
// Recoverable declarations (the instruction stream itself is extraction-damaged
// and not reproduced):
// BEGIN GLOBAL FUNCTION DECL: _gfortran_maxloc1_8_i1    .visible .func _gfortran_maxloc1_8_i1
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_8_i1   .visible .func _gfortran_mmaxloc1_8_i1
// BEGIN GLOBAL FUNCTION DECL: _gfortran_smaxloc1_8_i1   .visible .func _gfortran_smaxloc1_8_i1
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options     .extern .global .align 8 .u64 _gfortrani_compile_options
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error             (.extern .func)
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarray             (.extern .func)
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return  (.extern .func)
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents     (.extern .func)
// Constant strings, decoded from their byte-array initializers:
//   "rank of return array incorrect in MAXLOC intrinsic: is %ld, should be %ld"
//   "MAXLOC"
//   "... incorrect in MAXLOC intrinsic: is %ld, should be between 1 and %ld"   (prefix truncated)
//   "Funny sized logical array"
//   "MASK argument ..."                                                        (truncated)
//   "Incorrect extent in return value of MAXLOC intrinsic in dimension %ld: is %ld, should be %ld"
//   "return value ..."                                                         (truncated)
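This member additionally provides the masked entry point _gfortran_mmaxloc1_8_i1, which performs the reduction only over elements whose logical mask entry is true, and _gfortran_smaxloc1_8_i1 for a scalar mask. Below is a minimal C sketch of the masked per-column reduction; the helper name and flat-pointer signature are assumptions for illustration, since the real routines take array descriptors and handle several mask kinds.

#include <stdint.h>
#include <stddef.h>

/* Hypothetical helper: masked MAXLOC of one strided column.  Elements whose
 * mask entry is zero are skipped; the result is 0 when no element is
 * selected, mirroring the 8-byte integer result the *_8_i1 variants store. */
int64_t mmaxloc1_column_i1(const int8_t *base, ptrdiff_t stride,
                           const int8_t *mask, ptrdiff_t mstride, size_t n)
{
    int64_t pos = 0;                    /* 0: nothing selected by the mask */
    int8_t best = 0;
    for (size_t i = 0; i < n; i++) {
        if (!mask[(ptrdiff_t)i * mstride])
            continue;                   /* masked-out element: ignore */
        int8_t v = base[(ptrdiff_t)i * stride];
        if (pos == 0 || v > best) {     /* first selected element, or a new maximum */
            best = v;
            pos = (int64_t)(i + 1);     /* 1-based position */
        }
    }
    return pos;
}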
maxloc1_16_i1.o/        1622802229
// PTX object compiled from "../../../../libgfortran/generated/maxloc1_16_i1.c".
// Recoverable declarations (the instruction stream is extraction-damaged and
// not reproduced; this fragment of the archive breaks off inside the body of
// _gfortran_mmaxloc1_16_i1):
// BEGIN GLOBAL FUNCTION DECL: _gfortran_maxloc1_16_i1   .visible .func _gfortran_maxloc1_16_i1
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_16_i1  .visible .func _gfortran_mmaxloc1_16_i1
// BEGIN GLOBAL FUNCTION DECL: _gfortran_smaxloc1_16_i1  .visible .func _gfortran_smaxloc1_16_i1
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options     .extern .global .align 8 .u64 _gfortrani_compile_options
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error             (.extern .func)
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarray             (.extern .func)
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return  (.extern .func)
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents     (.extern .func)
// Constant strings: the same decoded MAXLOC bounds-checking messages as in
// maxloc1_8_i1.o above.
@ %r380 bra $L73; bra $L120; $L75: .loc 1 321 47 mul.lo.u64 %r133,%r71,%r133; $L72: .loc 1 323 4 st.u64 [%r132+8],%r381; ld.u64 %r71,[%r130]; add.u64 %r382,%r71,-1; st.u64 [%r132+16],%r382; st.u64 [%r132],%r133; .loc 1 316 7 add.u64 %r132,%r132,24; add.u64 %r130,%r130,8; setp.ne.u64 %r383,%r59,%r132; @ %r383 bra $L75; bra $L71; $L73: .loc 1 335 4 st.u64 [%r232+48],%r158; mov.u64 %r385,-1; st.u64 [%r232+56],%r385; mov.u64 %r386,1; st.u64 [%r232+40],%r386; .loc 1 336 4 bra $L51; $L120: .loc 1 339 24 mov.u64 %r388,16388; call (%value_in),_gfortrani_xmallocarray89,[%value_in]; } .loc 1 339 22 st.u64 [%r232],%r389; bra $L76; $L66: .loc 1 344 19 ld.s8 %r391,[%r232+28]; .loc 1 344 10 setp.eq.u64 %r392,%r391,%r151; @ %r392 bra $L77; .loc 1 345 2 cvta.const.u64 %r3933_gfortran_runtime_error77: .loc 1 347 11 cvta.global.u64 %r395,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r396,[%r395+36]; setp.eq.u32 %r397,%r396,0; @ %r397 bra $L76; .loc 1 349 4 add.u64 %r603,%frame,360; cvta.const.u64 %r401,$LC2r60300401; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r405,$LC40401; call _gfortrani_bounds_equal_extents$L76: .loc 1 356 3 setp.ne.u64 %r407,%r151,0; @ %r407 bra $L78; $L81: .loc 1 364 8 ld.u64 %r171,[%r232]; .loc 1 365 8 ld.u64 %r169,[%r233]; .loc 1 367 9 setp.ne.u64 %r408,%r169,0; @ %r408 bra $L79; bra $L51; $L78: add.u64 %r186,%frame,480; add.u64 %r148,%r232,40; add.u64 %r145,%frame,120; cvt.u32.u32 %r410,%r23; cvt.s64.s8 %r409,%r410; add.u64 %r134,%r409,-1; .loc 1 356 3 mov.u64 %r172,0; add.u64 %r603,%frame,360; .loc 1 358 16 mov.u64 %r411,%r172; $L80: st.u64 [%r186],%r411; .loc 1 359 18 ld.u64 %r412,[%r148]; st.u64 [%r145],%r412; .loc 1 360 17 shl.b64 %r414,%r172,3; add.u64 %r415,%r603,%r414; .loc 1 360 10 ld.u64 %r416,[%r415]; setp.le.s64 %r417,%r416,0; @ %r417 bra $L51; .loc 1 356 26 add.u64 %r172,%r172,1; .loc 1 356 3 add.u64 %r186,%r186,8; add.u64 %r148,%r148,24; add.u64 %r145,%r145,8; setp.ne.u64 %r418,%r134,%r172; @ %r418 bra $L80; bra $L81; $L79: .loc 1 290 12 mul.lo.u64 %r44,%r41,%r35; ld.u64 %r188,[%frame+480]; .loc 1 430 22 ld.u64 %r101,[%frame+240]; .loc 1 431 23 ld.u64 %r103,[%frame]; .loc 1 432 22 ld.u64 %r105,[%frame+120]; .loc 1 432 12 shl.b64 %r106,%r105,4; .loc 1 434 32 ld.u64 %r29,[%frame+360]; .loc 1 441 23 mul.lo.u64 %r39,%r101,%r29; .loc 1 442 24 mul.lo.u64 %r419,%r103,%r29; .loc 1 442 10 neg.s64 %r92,%r419; .loc 1 443 23 mul.lo.u64 %r420,%r105,%r29; .loc 1 443 9 shl.b64 %r421,%r420,4; neg.s64 %r182,%r421; setp.le.s64 %r597,%r151,1; .loc 1 408 9 setp.ne.u32 %r605,%r236,0; add.u64 %r606,%r152,-1; cvt.u32.u32 %r607,%r23; cvt.s64.s8 %r608,%r607; add.u64 %r609,%r608,-1; add.u64 %r610,%frame,240; add.u64 %r611,%frame,360; $L98: .loc 1 356 3 mov.u64 %r165,%r170; mov.u64 %r164,%r169; .loc 1 386 9 mov.u64 %r199,0; $L85: .loc 1 389 7 ld.s8 %r227,[%r165]; mov.u64 %r229,%r199; .loc 1 386 24 add.u64 %r199,%r199,1; .loc 1 389 6 setp.eq.u32 %r600,%r227,0; @ %r600 bra $L82; .loc 1 397 11 ld.s8 %r143,[%r164]; .loc 1 398 11 mov.u64 %r559,%r199; shr.s64 %r560,%r559,63; .loc 1 408 9 @ %r605 bra $L83; bra $L121; $L82: .loc 1 386 32 add.u64 %r164,%r164,%r155; .loc 1 386 47 add.u64 %r165,%r165,%r44; .loc 1 386 2 setp.ne.u64 %r425,%r152,%r199; @ %r425 bra $L85; .loc 1 385 9 mov.u64 %r559,0; mov.u64 %r560,%r559; bra $L86; $L121: .loc 1 418 8 setp.gt.s64 %r426,%r152,%r229; @ %r426 bra $L87; bra $L86; $L83: .loc 1 409 8 setp.le.s64 %r427,%r152,%r229; @ %r427 bra $L86; add.u64 %r567,%r229,1; shr.s64 %r568,%r567,63; sub.u64 %r431,%r606,%r229; shr.s64 %r434,%r229,63; add.u64 %r437,%r229,2; 
set.u32.lt.u64 %r442,%r437,%r229; cvt.s64.s32 %r440,%r442; sub.u64 %r446,%r434,%r440; add.u64 %r448,%r431,%r437; set.u32.lt.u64 %r453,%r448,%r431; cvt.s64.s32 %r451,%r453; sub.u64 %r457,%r446,%r451; mov.u64 %r579,1; $L90: .loc 1 411 8 @ %r600 bra $L88; .loc 1 411 18 ld.s8 %r90,[%r164]; .loc 1 411 15 cvt.u16.u32 %r459,%r90; cvt.u16.u32 %r460,%r143; setp.lt.s16 %r461,%r459,%r460; .loc 1 414 16 selp.u64 %r559,%r559,%r567,%r461; selp.u64 %r560,%r560,%r568,%r461; selp.u32 %r143,%r143,%r90,%r461; $L88: .loc 1 409 33 add.u64 %r164,%r164,%r155; .loc 1 409 48 add.u64 %r165,%r165,%r44; .loc 1 409 8 add.u64 %r464,%r567,%r579; set.u32.lt.u64 %r469,%r464,%r567; cvt.s64.s32 %r467,%r469; mov.u64 %r567,%r464; sub.u64 %r568,%r568,%r467; setp.ne.u64 %r476,%r448,%r567; @ %r476 bra $L99; setp.ne.u64 %r479,%r457,%r568; @ ! %r479 bra $L86; $L99: .loc 1 411 9 ld.s8 %r227,[%r165]; setp.eq.u32 %r600,%r227,0; bra $L90; $L87: add.u64 %r563,%r229,1; shr.s64 %r564,%r563,63; sub.u64 %r483,%r606,%r229; shr.s64 %r486,%r229,63; add.u64 %r489,%r229,2; set.u32.lt.u64 %r494,%r489,%r229; cvt.s64.s32 %r492,%r494; sub.u64 %r498,%r486,%r492; add.u64 %r500,%r483,%r489; set.u32.lt.u64 %r505,%r500,%r483; cvt.s64.s32 %r503,%r505; sub.u64 %r509,%r498,%r503; .loc 1 418 8 mov.u64 %r593,1; $L113: .loc 1 420 8 ld.s8 %r511,[%r165]; cvt.u16.u32 %r510,%r511; setp.eq.u16 %r512,%r510,0; @ %r512 bra $L91; .loc 1 420 18 ld.s8 %r95,[%r164]; .loc 1 420 15 cvt.u16.u32 %r513,%r95; cvt.u16.u32 %r514,%r143; setp.le.s16 %r515,%r513,%r514; .loc 1 423 16 selp.u64 %r559,%r559,%r563,%r515; selp.u64 %r560,%r560,%r564,%r515; selp.u32 %r143,%r143,%r95,%r515; $L91: .loc 1 418 33 add.u64 %r164,%r164,%r155; .loc 1 418 48 add.u64 %r165,%r165,%r44; .loc 1 418 8 add.u64 %r518,%r563,%r593; set.u32.lt.u64 %r523,%r518,%r563; cvt.s64.s32 %r521,%r523; mov.u64 %r563,%r518; sub.u64 %r564,%r564,%r521; setp.ne.u64 %r530,%r500,%r563; @ %r530 bra $L113; setp.ne.u64 %r533,%r509,%r564; @ %r533 bra $L113; $L86: .loc 1 426 8 st.u64 [%r171],%r559; st.u64 [%r171+8],%r560; .loc 1 429 15 add.u64 %r188,%r188,1; .loc 1 430 12 add.u64 %r169,%r169,%r101; .loc 1 431 13 add.u64 %r170,%r170,%r103; .loc 1 432 12 add.u64 %r171,%r171,%r106; .loc 1 434 13 setp.ne.u64 %r536,%r29,%r188; @ %r536 bra $L98; .loc 1 442 10 add.u64 %r166,%r170,%r92; .loc 1 443 9 add.u64 %r167,%r171,%r182; .loc 1 445 7 @ %r597 bra $L51; add.u64 %r220,%frame,488; .loc 1 441 23 mov.u64 %r110,%r39; .loc 1 445 7 mov.u64 %r219,8; .loc 1 444 5 mov.u64 %r168,1; add.u64 %r601,%frame,120; .loc 1 438 13 mov.u64 %r604,0; bra $L95; $L96: st.u64 [%r220],%r604; .loc 1 441 23 mul.lo.u64 %r110,%r122,%r121; .loc 1 442 24 mul.lo.u64 %r542,%r124,%r121; .loc 1 442 10 sub.u64 %r166,%r170,%r542; .loc 1 443 23 mul.lo.u64 %r543,%r126,%r121; .loc 1 443 9 shl.b64 %r544,%r543,4; sub.u64 %r167,%r171,%r544; .loc 1 444 5 add.u64 %r168,%r168,1; .loc 1 445 7 add.u64 %r220,%r220,8; add.u64 %r219,%r219,8; setp.eq.u64 %r545,%r168,%r609; @ %r545 bra $L51; $L95: .loc 1 453 16 ld.u64 %r546,[%r220]; add.u64 %r121,%r546,1; st.u64 [%r220],%r121; .loc 1 454 23 add.u64 %r548,%r610,%r219; ld.u64 %r122,[%r548]; .loc 1 454 13 sub.u64 %r549,%r122,%r110; add.u64 %r169,%r169,%r549; .loc 1 455 24 add.u64 %r550,%frame,%r219; ld.u64 %r124,[%r550]; .loc 1 455 14 add.u64 %r170,%r166,%r124; .loc 1 456 23 add.u64 %r552,%r601,%r219; ld.u64 %r126,[%r552]; .loc 1 456 13 shl.b64 %r553,%r126,4; add.u64 %r171,%r167,%r553; .loc 1 434 32 add.u64 %r555,%r611,%r219; ld.u64 %r129,[%r555]; .loc 1 434 13 setp.eq.u64 %r556,%r121,%r129; @ %r556 bra $L96; .loc 1 438 13 mov.u64 %r188,0; bra 
$L98; $L119: .loc 1 312 6 ld.u64 %r557,[%r232]; setp.eq.u64 %r558,%r557,0; @ ! %r558 bra $L66; bra $L97; $L51: .loc 1 460_gfortran_smaxloc1_16_i1 .visible .func _gfortran_smaxloc1_16_, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L123; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L124; r152154; call _gfortran_maxloc1_16_490 7 bra $L122; $L124: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r297,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L126; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L127; $L133: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L128; bra $L181; $L126: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error127: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r303,0; $L132: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L130; .loc 1 505 17 st.u64 [%r139],%r36; bra $L131; $L130: .loc 1 508 12 st.u64 [%r139],%r303; $L131: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L132; bra $L133; $L181: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L135; bra $L134; $L128: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r302,0; $L138: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L136; .loc 1 513 17 st.u64 [%r43],%r41; bra $L137; $L136: .loc 1 517 12 st.u64 [%r43],%r302; $L137: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L138; bra $L182; $L134: .loc 1 524 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L140; add.u64 %r296,%frame,120; $L160: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L141; $L140: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L142; bra $L183; $L144: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L141: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L144; bra $L140; $L142: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L122; $L183: .loc 1 547 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L148: .loc 1 573 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L145; bra $L146; $L135: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L147; .loc 1 552 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error147: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L148; .loc 1 559 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L146; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L150: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L149; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error149: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L150; $L145: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L151; shl.b64 %r266,%r297,3; bra $L152; $L151: mov.u64 %r266,8; $L152: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L153: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L153; $L146: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,4; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 583 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L159: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L159; .loc 1 596 7 @ ! %r295 bra $L184; bra $L122; $L158: .loc 1 591 13 st.u64 [%r99],%r280; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L157; bra $L122; $L184: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; $L157: .loc 1 600 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L158; .loc 1 591 13 mov.u64 %r80,0; bra $L159; $L182: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r294,%r98,0; @ ! 
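For orientation (this note is not part of the package payload): every maxloc1_*_i* member of this archive instantiates the same libgfortran template, reducing an integer array along one dimension into the 1-based index of its largest element, as used by the Fortran intrinsic MAXLOC(a, DIM=d). The C sketch below is a deliberately simplified illustration of that inner reduction for a single strided vector, assuming a 1-byte integer element type as in maxloc1_16_i1; the name maxloc_vector is invented for the example, and the real routines additionally handle array descriptors, result allocation and bounds checking.

#include <stddef.h>
#include <stdint.h>

/* Illustrative sketch only: 1-based position of the largest element of a
   strided vector, mirroring the unmasked maxloc1_* inner loop.  Returns 0
   for an empty extent, as MAXLOC does for a zero-sized dimension. */
static ptrdiff_t maxloc_vector (const int8_t *base, ptrdiff_t extent,
                                ptrdiff_t stride)
{
  ptrdiff_t result = 0;
  int8_t maxval = INT8_MIN;     /* matches the -128 seed visible in the PTX */

  for (ptrdiff_t i = 0; i < extent; i++)
    {
      int8_t v = base[i * stride];
      if (result == 0 || v > maxval)
        {
          maxval = v;
          result = i + 1;       /* Fortran MAXLOC indices are 1-based */
        }
    }
  return result;
}

In the library itself this loop runs once per vector of the source array, with the results written through the descriptor allocated by _gfortrani_xmallocarray.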
[Archive member maxloc1_4_i2.o (ar timestamp 1622802229), compiled from ../../../../libgfortran/generated/maxloc1_4_i2.c. The recoverable skeleton of the member:
- global function definitions _gfortran_maxloc1_4_i2, _gfortran_mmaxloc1_4_i2 and _gfortran_smaxloc1_4_i2: the unmasked, array-mask and scalar-mask forms of MAXLOC over a dimension for 2-byte integer sources with a 4-byte index result (the running maximum is seeded with -32768);
- external declarations for _gfortrani_compile_options (.global .align 8 .u64), _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents;
- string constants stored as decimal byte arrays, which decode to the usual MAXLOC diagnostics: "rank of return array incorrect in MAXLOC intrinsic: is %ld, should be %ld", "MAXLOC", "... incorrect in MAXLOC intrinsic: is %ld, should be between 1 and %ld", "Funny sized logical array", "MASK argument", "Incorrect extent in return value of MAXLOC intrinsic in dimension %ld: is %ld, should be %ld" and "return value".
The PTX bodies that follow these declarations survive only as damaged fragments, garbled in the same way as the previous member.]
[Archive member maxloc1_8_i2.o (ar timestamp 1622802229), compiled from ../../../../libgfortran/generated/maxloc1_8_i2.c. It carries the same runtime externs and MAXLOC diagnostic strings as the previous member and defines _gfortran_maxloc1_8_i2, _gfortran_mmaxloc1_8_i2 and _gfortran_smaxloc1_8_i2 (2-byte integer sources, 8-byte index result, running maximum again seeded with -32768). The section breaks off inside the damaged PTX body of _gfortran_maxloc1_8_i2.]
mov.u64 %r109,8; .loc 1 206 5 mov.u64 %r120,1; bra $L34; $L35: .loc 1 201 13 st.u64 [%r150],%r362; .loc 1 204 23 mul.lo.u64 %r339,%r89,%r88; .loc 1 204 9 add.u64 %r340,%r339,%r339; sub.u64 %r118,%r121,%r340; .loc 1 205 23 mul.lo.u64 %r341,%r92,%r88; .loc 1 205 9 shl.b64 %r342,%r341,3; sub.u64 %r119,%r122,%r342; .loc 1 206 5 add.u64 %r120,%r120,1; .loc 1 207 7 add.u64 %r150,%r150,8; add.u64 %r109,%r109,8; setp.eq.u64 %r343,%r120,%r365; @ %r343 bra $L1; $L34: .loc 1 215 16 ld.u64 %r344,[%r150]; add.u64 %r88,%r344,1; st.u64 [%r150],%r88; .loc 1 216 23 add.u64 %r346,%r366,%r109; ld.u64 %r89,[%r346]; .loc 1 216 13 add.u64 %r347,%r89,%r89; add.u64 %r121,%r118,%r347; .loc 1 217 23 add.u64 %r348,%frame,%r109; ld.u64 %r92,[%r348]; .loc 1 217 13 shl.b64 %r349,%r92,3; add.u64 %r122,%r119,%r349; .loc 1 197 32 add.u64 %r351,%r367,%r109; ld.u64 %r95,[%r351]; .loc 1 197 13 setp.eq.u64 %r352,%r88,%r95; @ %r352 bra $L35; .loc 1 201 13 mov.u64 %r48,0; bra $L25; $L51: .loc 1 90 6 ld.u64 %r353,[%r181]; setp.eq.u64 %r354,%r353,0; @ ! %r354 bra $L11; bra $L36; $L1: .loc 1 221_gfortran_mmaxloc1_8_i2 .visible .func _gfortran_mmaxloc1_8_, .param .u32 %in_ar4) {608pred %r218; .reg .u32 %r223; .reg .u6464u64pred %r373; .reg .u64 %r374; .reg .u64 %r376; .reg .u32 %r377; .reg .pred %r378; .reg .u64 %r381; .reg .u64 %r382; .reg .u64 %r386; .reg .pred %r388; .reg .pred %r389; .reg .u64 %r390; .reg .u32 %r391; .reg .u64pred %r398; .reg .pred %r399; .reg .pred %r404; .reg .pred16u32 %r413; .reg .predpred %r417; .reg .pred %r418; .reg .predpredpredu64 %r460; .reg .u64 %r461; .reg .u64 %r462; mov.u64 %r213,%ar0; mov.u64 %r214,%ar1; mov.u64 %r215,%ar2; mov.u64 %r216,%ar3; mov.u32 %r217,%ar4; .loc 1 251 6 setp.ne.u64 %r218,%r216,0; @ %r218 bra $L1r214215217; call _gfortran_maxloc1_8_7 bra $L52; $L53: .loc 1 261 10 ld.u64 %r22,[%r215]; .loc 1 261 7 add.u64 %r146,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r214+28]; .loc 1 262 38 add.u32 %r223,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r147,%r223; .loc 1 265 7 shr.u64 %r225,%r146,63; set.u32.gt.s64 %r227,%r146,%r147; neg.s32 %r228,%r227; cvt.u16.u64 %r231,%r225; cvt.u16.u32 %r232,%r228; or.b16 %r230,%r231,%r232; .loc 1 265 6 cvt.u32.u16 %r233,%r230; cvt.u16.u8 %r234,%r233; setp.eq.u16 %r235,%r234,0; @ %r235 bra $L55; .loc 1 267 7 cvt.u32.u32 %r238,%r23; cvt.s64.s8 %r237,%r238; st.u64 [%stack+8],%r237; st.u64 [%stack],%r22; cvta.const.u64 %r236236_gfortran_runtime_error5: .loc 1 272 9 add.u64 %r452,%r146,%r146; add.u64 %r451,%r452,%r146; shl.b64 %r243,%r451,3; add.u64 %r244,%r214,%r243; ld.u64 %r247,[%r244+56]; add.u64 %r246,%r247,1; .loc 1 272 7 ld.u64 %r254,[%r244+48]; sub.u64 %r148,%r246,%r254; .loc 1 273 6 setp.le.s64 %r255,%r148,0; @ %r255 bra $L52; .loc 1 276 9 ld.u64 %r166,[%r216]; .loc 1 278 15 ld.u64 %r34,[%r216+16]; .loc 1 280 22 cvt.u32.u64 %r35,%r34; .loc 1 280 53 add.u32 %r256,%r35,-4; and.b32 %r257,%r256,-5; set.u32.eq.u32 %r259,%r257,0; neg.s32 %r260,%r259; .loc 1 280 22 add.u32 %r261,%r35,-1; set.u32.le.u32 %r263,%r261,1; neg.s32 %r264,%r263; .loc 1 280 6 cvt.u16.u32 %r266,%r260; cvt.u16.u32 %r267,%r264; or.b16 %r265,%r266,%r267; cvt.u32.u16 %r268,%r265; cvt.u16.u8 %r269,%r268; setp.ne.u16 %r270,%r269,0; @ %r270 bra $L57; .loc 1 282 7 setp.ne.u32 %r272,%r35,16; @ %r272 bra $L58; $L57: .loc 1 289 9 shl.b64 %r276,%r451,3; add.u64 %r277,%r214,%r276; ld.u64 %r151,[%r277+40]; .loc 1 290 12 add.u64 %r283,%r216,%r276; ld.u64 %r39,[%r283+40]; .loc 1 292 3 setp.ne.u64 %r285,%r146,0; @ %r285 bra $L59; $L65: .loc 1 302 3 setp.lt.s64 %r286,%r146,%r147; @ %r286 bra $L60; bra $L119; $L58: 
.loc 1 287 5_gfortran_runtime_error9: add.u64 %r205,%r214,40; add.u64 %r67,%r216,40; add.u64 %r289,%r214,16; add.u64 %r291,%r22,%r22; add.u64 %r292,%r291,%r22; shl.b64 %r293,%r292,3; add.u64 %r210,%r289,%r293; .loc 1 292 3 mov.u64 %r91,0; add.u64 %r449,%frame,240; add.u64 %r447,%frame,360; .loc 1 299 12 mov.u64 %r462,%r91; $L64: .loc 1 294 18 add.u64 %r295,%r449,%r91; ld.u64 %r296,[%r205]; st.u64 [%r295],%r296; .loc 1 295 18 add.u64 %r297,%frame,%r91; .loc 1 295 20 ld.u64 %r299,[%r67]; mul.lo.u64 %r298,%r299,%r34; .loc 1 295 18 st.u64 [%r297],%r298; .loc 1 296 19 ld.u64 %r301,[%r205+16]; add.u64 %r300,%r301,1; ld.u64 %r302,[%r205+8]; sub.u64 %r51,%r300,%r302; .loc 1 298 10 setp.lt.s64 %r303,%r51,0; @ %r303 bra $L62; .loc 1 296 17 add.u64 %r305,%r447,%r91; st.u64 [%r305],%r51; bra $L63; $L62: .loc 1 299 12 add.u64 %r307,%r447,%r91; st.u64 [%r307],%r462; $L63: .loc 1 292 3 add.u64 %r205,%r205,24; add.u64 %r67,%r67,24; add.u64 %r91,%r91,8; setp.ne.u64 %r309,%r205,%r210; @ %r309 bra $L64; bra $L65; $L119: .loc 1 312 6 ld.u64 %r310,[%r213]; setp.eq.u64 %r311,%r310,0; @ ! %r311 bra $L67; bra $L66; $L60: add.u64 %r313,%r22,%r22; add.u64 %r314,%r313,%r22; shl.b64 %r315,%r314,3; add.u64 %r128,%r315,40; add.u64 %r133,%r214,%r128; add.u64 %r124,%r216,%r128; shl.b64 %r316,%r22,3; add.u64 %r83,%r316,-8; cvt.u32.u32 %r318,%r23; cvt.s64.s8 %r317,%r318; shl.b64 %r319,%r317,3; add.u64 %r209,%r319,-8; add.u64 %r449,%frame,240; add.u64 %r447,%frame,360; .loc 1 309 12 mov.u64 %r461,0; $L70: .loc 1 304 18 add.u64 %r321,%r449,%r83; ld.u64 %r322,[%r133]; st.u64 [%r321],%r322; .loc 1 305 18 add.u64 %r323,%frame,%r83; .loc 1 305 20 ld.u64 %r325,[%r124]; mul.lo.u64 %r324,%r325,%r34; .loc 1 305 18 st.u64 [%r323],%r324; .loc 1 306 19 ld.u64 %r327,[%r133+16]; add.u64 %r326,%r327,1; ld.u64 %r328,[%r133+8]; sub.u64 %r64,%r326,%r328; .loc 1 308 10 setp.lt.s64 %r329,%r64,0; @ %r329 bra $L68; .loc 1 306 17 add.u64 %r331,%r447,%r83; st.u64 [%r331],%r64; bra $L69; $L68: .loc 1 309 12 add.u64 %r333,%r447,%r83; st.u64 [%r333],%r461; $L69: .loc 1 302 3 add.u64 %r133,%r133,24; add.u64 %r124,%r124,24; add.u64 %r83,%r83,8; setp.ne.u64 %r335,%r83,%r209; @ %r335 bra $L70; bra $L120; $L66: .loc 1 316 7 setp.eq.u64 %r336,%r147,0; @ %r336 bra $L72; add.u64 %r447,%frame,360; $L96: add.u64 %r161,%r213,40; mov.u64 %r143,%r447; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; add.u64 %r340,%r337,%r337; add.u64 %r341,%r340,%r337; shl.b64 %r342,%r341,3; add.u64 %r343,%r213,16; add.u64 %r135,%r342,%r343; .loc 1 319 10 mov.u64 %r131,1; .loc 1 323 4 mov.u64 %r362,0; bra $L73; $L72: .loc 1 327 20 add.u32 %r344,%r23,-2; cvt.s64.s32 %r73,%r344; add.u64 %r346,%r73,%r73; add.u64 %r347,%r346,%r73; shl.b64 %r348,%r347,3; add.u64 %r349,%r213,%r348; .loc 1 327 67 shl.b64 %r351,%r73,3; add.u64 %r352,%frame,%r351; .loc 1 327 59 ld.u64 %r354,[%r349+40]; ld.u64 %r355,[%r352+360]; mul.lo.u64 %r155,%r354,%r355; .loc 1 329 24 mov.u64 %r356,0; st.u64 [%r213+8],%r356; .loc 1 330 28 cvt.u16.u32 %r359,%r23; add.u16 %r358,%r359,-1; cvt.u32.u16 %r360,%r358; st.u8 [%r213+28],%r360; .loc 1 332 10 setp.eq.u64 %r361,%r155,0; @ %r361 bra $L74; bra $L121; $L76: .loc 1 321 47 mul.lo.u64 %r131,%r70,%r131; $L73: .loc 1 323 4 st.u64 [%r161+8],%r362; ld.u64 %r70,[%r143]; add.u64 %r363,%r70,-1; st.u64 [%r161+16],%r363; st.u64 [%r161],%r131; .loc 1 316 7 add.u64 %r161,%r161,24; add.u64 %r143,%r143,8; setp.ne.u64 %r364,%r135,%r161; @ %r364 bra $L76; bra $L72; $L74: .loc 1 335 4 st.u64 [%r213+48],%r155; mov.u64 %r366,-1; st.u64 [%r213+56],%r366; mov.u64 %r367,1; st.u64 
[%r213+40],%r367; .loc 1 336 4 bra $L52; $L121: .loc 1 339 24 mov.u64 %r369369; call (%value_in),_gfortrani_xmallocarray70,[%value_in]; } .loc 1 339 22 st.u64 [%r213],%r370; bra $L77; $L67: .loc 1 344 19 ld.s8 %r372,[%r213+28]; .loc 1 344 10 setp.eq.u64 %r373,%r372,%r147; @ %r373 bra $L78; .loc 1 345 2 cvta.const.u64 %r374374_gfortran_runtime_error78: .loc 1 347 11 cvta.global.u64 %r376,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r377,[%r376+36]; setp.eq.u32 %r378,%r377,0; @ %r378 bra $L77; .loc 1 349 4 add.u64 %r447,%frame,360; cvta.const.u64 %r382,$LC2r4473382; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r386,$LC386382; call _gfortrani_bounds_equal_extents$L77: .loc 1 356 3 setp.ne.u64 %r388,%r147,0; @ %r388 bra $L79; $L82: .loc 1 364 8 ld.u64 %r167,[%r213]; .loc 1 365 8 ld.u64 %r165,[%r214]; .loc 1 367 9 setp.ne.u64 %r389,%r165,0; @ %r389 bra $L80; bra $L52; $L79: add.u64 %r194,%frame,480; add.u64 %r193,%r213,40; add.u64 %r188,%frame,120; cvt.u32.u32 %r391,%r23; cvt.s64.s8 %r390,%r391; add.u64 %r174,%r390,-1; .loc 1 356 3 mov.u64 %r168,0; add.u64 %r447,%frame,360; .loc 1 358 16 mov.u64 %r392,%r168; $L81: st.u64 [%r194],%r392; .loc 1 359 18 ld.u64 %r393,[%r193]; st.u64 [%r188],%r393; .loc 1 360 17 shl.b64 %r395,%r168,3; add.u64 %r396,%r447,%r395; .loc 1 360 10 ld.u64 %r397,[%r396]; setp.le.s64 %r398,%r397,0; @ %r398 bra $L52; .loc 1 356 26 add.u64 %r168,%r168,1; .loc 1 356 3 add.u64 %r194,%r194,8; add.u64 %r193,%r193,24; add.u64 %r188,%r188,8; setp.ne.u64 %r399,%r168,%r174; @ %r399 bra $L81; bra $L82; $L80: .loc 1 290 12 mul.lo.u64 %r42,%r39,%r34; .loc 1 386 32 add.u64 %r87,%r151,%r151; ld.u64 %r68,[%frame+480]; .loc 1 430 22 ld.u64 %r97,[%frame+240]; .loc 1 430 12 add.u64 %r99,%r97,%r97; .loc 1 431 23 ld.u64 %r100,[%frame]; .loc 1 432 22 ld.u64 %r102,[%frame+120]; .loc 1 432 12 shl.b64 %r104,%r102,3; setp.le.s64 %r446,%r147,1; .loc 1 408 9 setp.ne.u32 %r454,%r217,0; cvt.u32.u32 %r455,%r23; cvt.s64.s8 %r456,%r455; add.u64 %r457,%r456,-1; add.u64 %r458,%frame,240; add.u64 %r459,%frame,360; add.u64 %r460,%frame,120; $L97: .loc 1 356 3 mov.u64 %r160,%r166; mov.u64 %r159,%r165; .loc 1 386 9 mov.u64 %r137,0; $L86: .loc 1 389 7 ld.s8 %r208,[%r160]; mov.u64 %r189,%r137; .loc 1 386 24 add.u64 %r137,%r137,1; .loc 1 389 6 setp.eq.u32 %r448,%r208,0; @ %r448 bra $L83; .loc 1 397 11 ld.s16 %r141,[%r159]; .loc 1 408 9 @ %r454 bra $L84; bra $L122; $L83: .loc 1 386 32 add.u64 %r159,%r159,%r87; .loc 1 386 47 add.u64 %r160,%r160,%r42; .loc 1 386 2 setp.ne.u64 %r404,%r148,%r137; @ %r404 bra $L86; .loc 1 385 9 mov.u64 %r137,0; bra $L87; $L122: .loc 1 418 8 setp.gt.s64 %r405,%r148,%r189; @ %r405 bra $L88; bra $L87; $L84: .loc 1 409 8 setp.le.s64 %r406,%r148,%r189; @ %r406 bra $L87; $L90: .loc 1 414 16 add.u64 %r189,%r189,1; .loc 1 411 8 @ %r448 bra $L89; .loc 1 411 18 ld.s16 %r88,[%r159]; .loc 1 411 15 cvt.u16.u32 %r408,%r88; cvt.u16.u32 %r409,%r141; setp.ge.s16 %r410,%r408,%r409; selp.u32 %r141,%r88,%r141,%r410; selp.u64 %r137,%r189,%r137,%r410; $L89: .loc 1 409 33 add.u64 %r159,%r159,%r87; .loc 1 409 48 add.u64 %r160,%r160,%r42; .loc 1 409 8 setp.eq.u64 %r411,%r148,%r189; @ %r411 bra $L87; .loc 1 411 9 ld.s8 %r208,[%r160]; setp.eq.u32 %r448,%r208,0; bra $L90; $L88: .loc 1 423 16 add.u64 %r189,%r189,1; .loc 1 420 8 ld.s8 %r413,[%r160]; cvt.u16.u32 %r412,%r413; setp.eq.u16 %r414,%r412,0; @ %r414 bra $L91; .loc 1 420 18 ld.s16 %r93,[%r159]; .loc 1 420 15 cvt.u16.u32 %r415,%r93; cvt.u16.u32 %r416,%r141; setp.gt.s16 %r417,%r415,%r416; selp.u32 %r141,%r93,%r141,%r417; selp.u64 
%r137,%r189,%r137,%r417; $L91: .loc 1 418 33 add.u64 %r159,%r159,%r87; .loc 1 418 48 add.u64 %r160,%r160,%r42; .loc 1 418 8 setp.ne.u64 %r418,%r148,%r189; @ %r418 bra $L88; $L87: .loc 1 426 8 st.u64 [%r167],%r137; .loc 1 429 15 add.u64 %r68,%r68,1; .loc 1 430 12 add.u64 %r165,%r165,%r99; .loc 1 431 13 add.u64 %r166,%r166,%r100; .loc 1 432 12 add.u64 %r167,%r167,%r104; .loc 1 434 32 ld.u64 %r29,[%frame+360]; .loc 1 434 13 setp.ne.u64 %r419,%r29,%r68; @ %r419 bra $L97; .loc 1 441 23 mul.lo.u64 %r106,%r68,%r97; .loc 1 442 24 mul.lo.u64 %r420,%r68,%r100; .loc 1 442 10 sub.u64 %r162,%r166,%r420; .loc 1 443 23 mul.lo.u64 %r421,%r68,%r102; .loc 1 443 9 shl.b64 %r422,%r421,3; sub.u64 %r163,%r167,%r422; .loc 1 445 7 @ %r446 bra $L52; add.u64 %r204,%frame,488; mov.u64 %r203,8; .loc 1 444 5 mov.u64 %r164,1; .loc 1 438 13 mov.u64 %r453,0; bra $L94; $L95: st.u64 [%r204],%r453; .loc 1 441 23 mul.lo.u64 %r106,%r117,%r116; .loc 1 442 24 mul.lo.u64 %r428,%r119,%r116; .loc 1 442 10 sub.u64 %r162,%r166,%r428; .loc 1 443 23 mul.lo.u64 %r429,%r121,%r116; .loc 1 443 9 shl.b64 %r430,%r429,3; sub.u64 %r163,%r167,%r430; .loc 1 444 5 add.u64 %r164,%r164,1; .loc 1 445 7 add.u64 %r204,%r204,8; add.u64 %r203,%r203,8; setp.eq.u64 %r431,%r164,%r457; @ %r431 bra $L52; $L94: .loc 1 453 16 ld.u64 %r432,[%r204]; add.u64 %r116,%r432,1; st.u64 [%r204],%r116; .loc 1 454 23 add.u64 %r434,%r458,%r203; ld.u64 %r117,[%r434]; .loc 1 454 13 sub.u64 %r435,%r117,%r106; add.u64 %r436,%r435,%r435; add.u64 %r165,%r165,%r436; .loc 1 455 24 add.u64 %r437,%frame,%r203; ld.u64 %r119,[%r437]; .loc 1 455 14 add.u64 %r166,%r162,%r119; .loc 1 456 23 add.u64 %r439,%r460,%r203; ld.u64 %r121,[%r439]; .loc 1 456 13 shl.b64 %r440,%r121,3; add.u64 %r167,%r163,%r440; .loc 1 434 32 add.u64 %r442,%r459,%r203; ld.u64 %r125,[%r442]; .loc 1 434 13 setp.eq.u64 %r443,%r116,%r125; @ %r443 bra $L95; .loc 1 438 13 mov.u64 %r68,0; bra $L97; $L120: .loc 1 312 6 ld.u64 %r444,[%r213]; setp.eq.u64 %r445,%r444,0; @ ! 
%r445 bra $L67; bra $L96; $L52: .loc 1 460_gfortran_smaxloc1_8_i2 .visible .func _gfortran_smaxloc1_8_, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L124; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L125; $L124:r152154; call _gfortran_maxloc1_8_490 7 bra $L123; $L125: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r296,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L127; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L128; $L134: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L129; bra $L182; $L127: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error128: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r302,0; $L133: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L131; .loc 1 505 17 st.u64 [%r139],%r36; bra $L132; $L131: .loc 1 508 12 st.u64 [%r139],%r302; $L132: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L133; bra $L134; $L182: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L136; bra $L135; $L129: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r301,0; $L139: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L137; .loc 1 513 17 st.u64 [%r43],%r41; bra $L138; $L137: .loc 1 517 12 st.u64 [%r43],%r301; $L138: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L139; bra $L183; $L135: .loc 1 524 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L141; add.u64 %r295,%frame,120; $L161: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L142; $L141: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L143; bra $L184; $L145: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L142: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L145; bra $L141; $L143: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L123; $L184: .loc 1 547 24 mov.u64 %r24696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L149: .loc 1 573 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L146; bra $L147; $L136: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L148; .loc 1 552 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error148: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L149; .loc 1 559 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L147; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L151: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L150; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error150: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L151; $L146: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ 
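As a rough illustration of what these reductions compute, the sketch below is plain C, not the libgfortran generated source: it assumes a simplified contiguous 2-D layout instead of the real array descriptors and omits the bounds checks and xmallocarray allocation, but it mirrors the visible control flow of the member: scan integer(kind=2) elements along one dimension, honour an optional logical mask, and record 1-based integer(kind=8) positions, with 0 when nothing is selected.

/* Illustrative sketch only; NOT the libgfortran generated source.
 * Shows the shape of a MAXLOC reduction along the second dimension
 * of an nrows x ncols int16 matrix, as in the "m" (masked) variant. */
#include <stdint.h>
#include <stdio.h>
#include <stdbool.h>

static void maxloc_dim2_i2(int64_t *restrict ret,
                           const int16_t *restrict a,
                           const bool *restrict mask, /* may be NULL */
                           int64_t nrows, int64_t ncols)
{
    for (int64_t r = 0; r < nrows; r++) {
        int64_t pos = 0;            /* 0 means "nothing selected yet" */
        int16_t best = INT16_MIN;
        for (int64_t c = 0; c < ncols; c++) {
            if (mask && !mask[r * ncols + c])
                continue;
            int16_t v = a[r * ncols + c];
            if (pos == 0 || v > best) {
                best = v;
                pos = c + 1;        /* Fortran positions are 1-based */
            }
        }
        ret[r] = pos;               /* stays 0 for an empty selection */
    }
}

int main(void)
{
    const int16_t a[2][3] = { { 3, 9, 9 }, { -5, -2, -7 } };
    int64_t loc[2];
    maxloc_dim2_i2(loc, &a[0][0], NULL, 2, 3);
    printf("%lld %lld\n", (long long)loc[0], (long long)loc[1]); /* prints: 2 2 */
    return 0;
}

The real plain (unmasked) variant seeds the running maximum with the type minimum, -32768, exactly as the mov.u32 in the summarized PTX shows; the mask-aware sketch instead uses pos == 0 to detect that no element has been accepted yet.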
[Archive member maxloc1_16_i2.o (timestamp 1622802231): the same three entry points generated for an integer(kind=16) result, _gfortran_maxloc1_16_i2, _gfortran_mmaxloc1_16_i2 and _gfortran_smaxloc1_16_i2, compiled from ../../../../libgfortran/generated/maxloc1_16_i2.c. The preamble repeats the external declarations and the MAXLOC error-message string constants of the previous member verbatim. The function bodies follow the same pattern as the kind=8 member except that result elements are 16 bytes wide: the result stride is scaled with shl.b64 by 4, positions are carried in a pair of 64-bit registers with explicit carry propagation (add.u64 followed by set.u32.lt.u64 and sub.u64), and each result element is written back with two st.u64 stores.]
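The kind=16 member keeps its position counter in two 64-bit halves, and the add.u64 / set.u32.lt.u64 / sub.u64 sequence noted above is an add-with-carry. The minimal C sketch below, with hypothetical names, is only meant to make that register pattern readable; it does not reproduce the generated code.

/* Illustrative sketch: 128-bit counter held as two 64-bit words, as in
 * the kind=16 PTX.  set.u32.lt.u64 yields all-ones on wrap-around;
 * subtracting that value from the high word adds the carry. */
#include <stdint.h>
#include <stdio.h>

typedef struct { uint64_t lo; int64_t hi; } i128_pair;

static i128_pair i128_add_small(i128_pair x, uint64_t n)
{
    i128_pair r;
    r.lo = x.lo + n;
    int64_t borrow = (r.lo < x.lo) ? -1 : 0;  /* -1 exactly when the low word wrapped */
    r.hi = x.hi - borrow;                     /* hi - (-1) == hi + carry */
    return r;
}

int main(void)
{
    i128_pair p = { UINT64_MAX, 0 };
    p = i128_add_small(p, 1);
    printf("hi=%lld lo=%llu\n", (long long)p.hi, (unsigned long long)p.lo); /* hi=1 lo=0 */
    return 0;
}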
[Archive member maxloc1_4_i4.o (timestamp 1622802229): the corresponding functions for integer(kind=4) data with an integer(kind=4) result, _gfortran_maxloc1_4_i4, _gfortran_mmaxloc1_4_i4 and _gfortran_smaxloc1_4_i4, compiled from libgfortran/generated/maxloc1_4_i4.c (the path is truncated in the dump), with the same preamble declarations and MAXLOC error-message constants. This member's dump begins with the closing lines of the previous member and is cut off partway through the body of _gfortran_maxloc1_4_i4, whose loop loads 32-bit elements, seeds the running maximum with -2147483648 and stores 32-bit positions with st.u32.]
206 5 mov.u64 %r121,1; .loc 1 201 13 mov.u64 %r356,0; bra $L35; $L36: st.u64 [%r133],%r356; .loc 1 204 23 mul.lo.u64 %r335,%r87,%r86; .loc 1 204 9 shl.b64 %r336,%r335,2; sub.u64 %r119,%r122,%r336; .loc 1 205 23 mul.lo.u64 %r337,%r90,%r86; .loc 1 205 9 shl.b64 %r338,%r337,2; sub.u64 %r120,%r123,%r338; .loc 1 206 5 add.u64 %r121,%r121,1; .loc 1 207 7 add.u64 %r133,%r133,8; add.u64 %r149,%r149,8; setp.eq.u64 %r339,%r121,%r361; @ %r339 bra $L1; $L35: .loc 1 215 16 ld.u64 %r340,[%r133]; add.u64 %r86,%r340,1; st.u64 [%r133],%r86; .loc 1 216 23 add.u64 %r342,%r362,%r149; ld.u64 %r87,[%r342]; .loc 1 216 13 shl.b64 %r343,%r87,2; add.u64 %r122,%r119,%r343; .loc 1 217 23 add.u64 %r344,%frame,%r149; ld.u64 %r90,[%r344]; .loc 1 217 13 shl.b64 %r345,%r90,2; add.u64 %r123,%r120,%r345; .loc 1 197 32 add.u64 %r347,%r363,%r149; ld.u64 %r93,[%r347]; .loc 1 197 13 setp.eq.u64 %r348,%r86,%r93; @ %r348 bra $L36; .loc 1 201 13 mov.u64 %r75,0; bra $L25; $L50: .loc 1 90 6 ld.u64 %r349,[%r182]; setp.eq.u64 %r350,%r349,0; @ ! %r350 bra $L11; bra $L37; $L1: .loc 1 221_gfortran_mmaxloc1_4_i4 .visible .func _gfortran_mmaxloc1_4_, .param .u32 %in_ar4) {60898u16 %r238; .reg .u16u64 %r303; .reg .u64 %r304; .reg .u64 %r305; .reg .u64 %r306; .reg .u64pred %r317; .reg .u64 %r318; .reg .pred %r319; .reg .u64u32u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64u64predu32 %r385; .reg .predu64 %r394; .reg .predu64predu64 %r410; .reg .u32 %r412; .reg .pred %r414; .reg .pred %r415; .reg .pred %r416; .reg .predpred %r423; .reg .predpredpred %r446; .reg .u64 %r447; .reg .predmov.u64 %r221,%ar0; mov.u64 %r222,%ar1; mov.u64 %r223,%ar2; mov.u64 %r224,%ar3; mov.u32 %r225,%ar4; .loc 1 251 6 setp.ne.u64 %r226,%r224,0; @ %r226 bra $L223225; call _gfortran_maxloc1_4_7 bra $L51; $L52: .loc 1 261 10 ld.u64 %r22,[%r223]; .loc 1 261 7 add.u64 %r151,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r222+28]; .loc 1 262 38 add.u32 %r231,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r152,%r231; .loc 1 265 7 shr.u64 %r233,%r151,63; set.u32.gt.s64 %r235,%r151,%r152; neg.s32 %r236,%r235; cvt.u16.u64 %r239,%r233; cvt.u16.u32 %r240,%r236; or.b16 %r238,%r239,%r240; .loc 1 265 6 cvt.u32.u16 %r241,%r238; cvt.u16.u8 %r242,%r241; setp.eq.u16 %r243,%r242,0; @ %r243 bra $L54; .loc 1 267 7 cvt.u32.u32 %r246,%r23; cvt.s64.s8 %r245,%r246; st.u64 [%stack+8],%r245; st.u64 [%stack],%r22; cvta.const.u64 %r244244_gfortran_runtime_error4: .loc 1 272 9 add.u64 %r449,%r151,%r151; add.u64 %r455,%r449,%r151; shl.b64 %r251,%r455,3; add.u64 %r252,%r222,%r251; ld.u64 %r255,[%r252+56]; add.u64 %r254,%r255,1; .loc 1 272 7 ld.u64 %r262,[%r252+48]; sub.u64 %r153,%r254,%r262; .loc 1 273 6 setp.le.s64 %r263,%r153,0; @ %r263 bra $L51; .loc 1 276 9 ld.u64 %r169,[%r224]; .loc 1 278 15 ld.u64 %r34,[%r224+16]; .loc 1 280 22 cvt.u32.u64 %r35,%r34; .loc 1 280 53 add.u32 %r264,%r35,-4; and.b32 %r265,%r264,-5; set.u32.eq.u32 %r267,%r265,0; neg.s32 %r268,%r267; .loc 1 280 22 add.u32 %r269,%r35,-1; set.u32.le.u32 %r271,%r269,1; neg.s32 %r272,%r271; .loc 1 280 6 cvt.u16.u32 %r274,%r268; cvt.u16.u32 %r275,%r272; or.loc 1 282 7 setp.ne.u32 %r280,%r35,16; @ %r280 bra $L57; $L56: .loc 1 289 9 shl.b64 %r284,%r455,3; add.u64 %r285,%r222,%r284; ld.u64 %r156,[%r285+40]; .loc 1 290 12 add.u64 %r291,%r224,%r284; ld.u64 %r41,[%r291+40]; .loc 1 292 3 setp.ne.u64 %r293,%r151,0; @ %r293 bra $L58; $L64: .loc 1 302 3 setp.lt.s64 %r294,%r151,%r152; @ %r294 bra $L59; bra $L116; $L57: .loc 1 287 5 cvta.const.u64 %r2955_gfortran_runtime_error8: add.u64 %r96,%r222,40; add.u64 %r68,%r224,40; add.u64 %r298,%r22,%r22; add.u64 
%r299,%r298,%r22; shl.b64 %r300,%r299,3; add.u64 %r301,%r222,16; add.u64 %r39,%r300,%r301; .loc 1 292 3 mov.u64 %r139,0; add.u64 %r452,%frame,240; add.u64 %r451,%frame,360; .loc 1 299 12 mov.u64 %r465,%r139; $L63: .loc 1 294 18 add.u64 %r303,%r452,%r139; ld.u64 %r304,[%r96]; st.u64 [%r303],%r304; .loc 1 295 18 add.u64 %r305,%frame,%r139; .loc 1 295 20 ld.u64 %r307,[%r68]; mul.lo.u64 %r306,%r307,%r34; .loc 1 295 18 st.u64 [%r305],%r306; .loc 1 296 19 ld.u64 %r309,[%r96+16]; add.u64 %r308,%r309,1; ld.u64 %r310,[%r96+8]; sub.u64 %r53,%r308,%r310; .loc 1 298 10 setp.lt.s64 %r311,%r53,0; @ %r311 bra $L61; .loc 1 296 17 add.u64 %r313,%r451,%r139; st.u64 [%r313],%r53; bra $L62; $L61: .loc 1 299 12 add.u64 %r315,%r451,%r139; st.u64 [%r315],%r465; $L62: .loc 1 292 3 add.u64 %r96,%r96,24; add.u64 %r68,%r68,24; add.u64 %r139,%r139,8; setp.ne.u64 %r317,%r39,%r96; @ %r317 bra $L63; bra $L64; $L116: .loc 1 312 6 ld.u64 %r318,[%r221]; setp.eq.u64 %r319,%r318,0; @ ! %r319 bra $L66; bra $L65; $L59: add.u64 %r321,%r22,%r22; add.u64 %r322,%r321,%r22; shl.b64 %r323,%r322,3; add.u64 %r130,%r323,40; add.u64 %r134,%r222,%r130; add.u64 %r111,%r224,%r130; shl.b64 %r324,%r22,3; add.u64 %r58,%r324,-8; cvt.u32.u32 %r326,%r23; cvt.s64.s8 %r325,%r326; shl.b64 %r327,%r325,3; add.u64 %r69,%r327,-8; add.u64 %r452,%frame,240; add.u64 %r451,%frame,360; .loc 1 309 12 mov.u64 %r464,0; $L69: .loc 1 304 18 add.u64 %r329,%r452,%r58; ld.u64 %r330,[%r134]; st.u64 [%r329],%r330; .loc 1 305 18 add.u64 %r331,%frame,%r58; .loc 1 305 20 ld.u64 %r333,[%r111]; mul.lo.u64 %r332,%r333,%r34; .loc 1 305 18 st.u64 [%r331],%r332; .loc 1 306 19 ld.u64 %r335,[%r134+16]; add.u64 %r334,%r335,1; ld.u64 %r336,[%r134+8]; sub.u64 %r65,%r334,%r336; .loc 1 308 10 setp.lt.s64 %r337,%r65,0; @ %r337 bra $L67; .loc 1 306 17 add.u64 %r339,%r451,%r58; st.u64 [%r339],%r65; bra $L68; $L67: .loc 1 309 12 add.u64 %r341,%r451,%r58; st.u64 [%r341],%r464; $L68: .loc 1 302 3 add.u64 %r134,%r134,24; add.u64 %r111,%r111,24; add.u64 %r58,%r58,8; setp.ne.u64 %r343,%r69,%r58; @ %r343 bra $L69; bra $L117; $L65: .loc 1 316 7 setp.eq.u64 %r344,%r152,0; @ %r344 bra $L71; add.u64 %r451,%frame,360; $L96: add.u64 %r148,%r221,40; mov.u64 %r146,%r451; cvt.u32.u32 %r346,%r23; cvt.s64.s8 %r345,%r346; add.u64 %r348,%r345,%r345; add.u64 %r349,%r348,%r345; shl.b64 %r350,%r349,3; add.u64 %r351,%r221,16; add.u64 %r135,%r350,%r351; .loc 1 319 10 mov.u64 %r136,1; .loc 1 323 4 mov.u64 %r370,0; bra $L72; $L71: .loc 1 327 20 add.u32 %r352,%r23,-2; cvt.s64.s32 %r76,%r352; add.u64 %r354,%r76,%r76; add.u64 %r355,%r354,%r76; shl.b64 %r356,%r355,3; add.u64 %r357,%r221,%r356; .loc 1 327 67 shl.b64 %r359,%r76,3; add.u64 %r360,%frame,%r359; .loc 1 327 59 ld.u64 %r362,[%r357+40]; ld.u64 %r363,[%r360+360]; mul.lo.u64 %r160,%r362,%r363; .loc 1 329 24 mov.u64 %r364,0; st.u64 [%r221+8],%r364; .loc 1 330 28 cvt.u16.u32 %r367,%r23; add.u16 %r366,%r367,-1; cvt.u32.u16 %r368,%r366; st.u8 [%r221+28],%r368; .loc 1 332 10 setp.eq.u64 %r369,%r160,0; @ %r369 bra $L73; bra $L118; $L75: .loc 1 321 47 mul.lo.u64 %r136,%r73,%r136; $L72: .loc 1 323 4 st.u64 [%r148+8],%r370; ld.u64 %r73,[%r146]; add.u64 %r371,%r73,-1; st.u64 [%r148+16],%r371; st.u64 [%r148],%r136; .loc 1 316 7 add.u64 %r148,%r148,24; add.u64 %r146,%r146,8; setp.ne.u64 %r372,%r135,%r148; @ %r372 bra $L75; bra $L71; $L73: .loc 1 335 4 st.u64 [%r221+48],%r160; mov.u64 %r374,-1; st.u64 [%r221+56],%r374; mov.u64 %r375,1; st.u64 [%r221+40],%r375; .loc 1 336 4 bra $L51; $L118: .loc 1 339 24call (%value_in),_gfortrani_xmallocarray78,[%value_in]; } .loc 1 339 22 
st.u64 [%r221],%r378; bra $L76; $L66: .loc 1 344 19 ld.s8 %r380,[%r221+28]; .loc 1 344 10 setp.eq.u64 %r381,%r380,%r152; @ %r381 bra $L77; .loc 1 345 2 cvta.const.u64 %r382,$LC5; {stack; call _gfortran_runtime_error77: .loc 1 347 11 cvta.global.u64 %r384,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r385,[%r384+36]; setp.eq.u32 %r386,%r385,0; @ %r386 bra $L76; .loc 1 349 4 add.u64 %r451,%frame,360; cvta.const.u64 %r390,$LC2390; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r394,$LC222394390; call _gfortrani_bounds_equal_extents$L76: .loc 1 356 3 setp.ne.u64 %r396,%r152,0; @ %r396 bra $L78; $L81: .loc 1 364 8 ld.u64 %r170,[%r221]; .loc 1 365 8 ld.u64 %r168,[%r222]; .loc 1 367 9 setp.ne.u64 %r397,%r168,0; @ %r397 bra $L79; bra $L51; $L78: add.u64 %r198,%frame,480; add.u64 %r197,%r221,40; add.u64 %r195,%frame,120; cvt.u32.u32 %r399,%r23; cvt.s64.s8 %r398,%r399; add.u64 %r164,%r398,-1; .loc 1 356 3 mov.u64 %r171,0; add.u64 %r451,%frame,360; .loc 1 358 16 mov.u64 %r400,%r171; $L80: st.u64 [%r198],%r400; .loc 1 359 18 ld.u64 %r401,[%r197]; st.u64 [%r195],%r401; .loc 1 360 17 shl.b64 %r403,%r171,3; add.u64 %r404,%r451,%r403; .loc 1 360 10 ld.u64 %r405,[%r404]; setp.le.s64 %r406,%r405,0; @ %r406 bra $L51; .loc 1 356 26 add.u64 %r171,%r171,1; .loc 1 356 3 add.u64 %r198,%r198,8; add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; setp.ne.u64 %r407,%r164,%r171; @ %r407 bra $L80; bra $L81; $L79: .loc 1 290 12 mul.lo.u64 %r44,%r41,%r34; .loc 1 386 32 shl.b64 %r91,%r156,2; ld.u64 %r187,[%frame+480]; .loc 1 430 22 ld.u64 %r101,[%frame+240]; .loc 1 430 12 shl.b64 %r102,%r101,2; .loc 1 431 23 ld.u64 %r104,[%frame]; .loc 1 432 22 ld.u64 %r106,[%frame+120]; .loc 1 432 12 shl.b64 %r107,%r106,2; .loc 1 434 32 ld.u64 %r29,[%frame+360]; .loc 1 441 23 mul.lo.u64 %r38,%r29,%r101; .loc 1 442 24 mul.lo.u64 %r408,%r29,%r104; .loc 1 442 10 neg.s64 %r137,%r408; .loc 1 443 23 mul.lo.u64 %r409,%r29,%r106; .loc 1 443 9 shl.b64 %r410,%r409,2; neg.s64 %r188,%r410; setp.le.s64 %r450,%r152,1; .loc 1 408 9 setp.ne.u32 %r457,%r225,0; cvt.u32.u32 %r458,%r23; cvt.s64.s8 %r459,%r458; add.u64 %r460,%r459,-1; add.u64 %r461,%frame,240; add.u64 %r462,%frame,360; add.u64 %r463,%frame,120; $L97: .loc 1 356 3 mov.u64 %r163,%r169; mov.u64 %r162,%r168; .loc 1 386 9 mov.u64 %r161,0; $L85: .loc 1 389 7 ld.s8 %r220,[%r163]; .loc 1 389 6 setp.eq.u32 %r453,%r220,0; @ %r453 bra $L82; .loc 1 397 11 ld.u32 %r145,[%r162]; .loc 1 398 11 cvt.u32.u64 %r412,%r161; add.u32 %r143,%r412,1; .loc 1 408 9 @ %r457 bra $L83; bra $L119; $L82: .loc 1 386 24 add.u64 %r161,%r161,1; .loc 1 386 32 add.u64 %r162,%r162,%r91; .loc 1 386 47 add.u64 %r163,%r163,%r44; .loc 1 386 2 setp.ne.u64 %r414,%r153,%r161; @ %r414 bra $L85; .loc 1 385 9 mov.u32 %r143,%r220; bra $L86; $L119: .loc 1 418 8 setp.gt.s64 %r415,%r153,%r161; @ %r415 bra $L87; bra $L86; $L83: .loc 1 409 8 setp.le.s64 %r416,%r153,%r161; @ %r416 bra $L86; add.u64 %r205,%r161,1; $L89: .loc 1 411 8 @ %r453 bra $L88; .loc 1 411 18 ld.u32 %r93,[%r162]; .loc 1 411 15 setp.lt.s32 %r418,%r93,%r145; @ %r418 bra $L88; .loc 1 414 16 cvt.u32.u64 %r143,%r205; mov.u32 %r145,%r93; $L88: .loc 1 409 33 add.u64 %r162,%r162,%r91; .loc 1 409 48 add.u64 %r163,%r163,%r44; .loc 1 409 8 add.u64 %r206,%r205,1; setp.eq.u64 %r419,%r153,%r205; @ %r419 bra $L86; .loc 1 411 9 ld.s8 %r220,[%r163]; mov.u64 %r205,%r206; setp.eq.u32 %r453,%r220,0; bra $L89; $L87: add.u64 %r201,%r161,1; bra $L91; $L98: mov.u64 %r201,%r202; $L91: .loc 1 420 8 ld.s8 %r421,[%r163]; cvt.u16.u32 %r420,%r421; setp.eq.u16 %r422,%r420,0; @ %r422 bra $L90; 
.loc 1 420 18 ld.u32 %r98,[%r162]; .loc 1 420 15 setp.le.s32 %r423,%r98,%r145; @ %r423 bra $L90; .loc 1 423 16 cvt.u32.u64 %r143,%r201; mov.u32 %r145,%r98; $L90: .loc 1 418 33 add.u64 %r162,%r162,%r91; .loc 1 418 48 add.u64 %r163,%r163,%r44; .loc 1 418 8 add.u64 %r202,%r201,1; setp.ne.u64 %r424,%r153,%r201; @ %r424 bra $L98; $L86: .loc 1 426 8 st.u32 [%r170],%r143; .loc 1 429 15 add.u64 %r187,%r187,1; .loc 1 430 12 add.u64 %r168,%r168,%r102; .loc 1 431 13 add.u64 %r169,%r169,%r104; .loc 1 432 12 add.u64 %r170,%r170,%r107; .loc 1 434 13 setp.ne.u64 %r425,%r29,%r187; @ %r425 bra $L97; .loc 1 442 10 add.u64 %r165,%r169,%r137; .loc 1 443 9 add.u64 %r166,%r170,%r188; .loc 1 445 7 @ %r450 bra $L51; add.u64 %r150,%frame,488; .loc 1 441 23 mov.u64 %r110,%r38; .loc 1 445 7 mov.u64 %r215,8; .loc 1 444 5 mov.u64 %r167,1; .loc 1 438 13 mov.u64 %r456,0; bra $L94; $L95: st.u64 [%r150],%r456; .loc 1 441 23 mul.lo.u64 %r110,%r123,%r122; .loc 1 442 24 mul.lo.u64 %r431,%r125,%r122; .loc 1 442 10 sub.u64 %r165,%r169,%r431; .loc 1 443 23 mul.lo.u64 %r432,%r127,%r122; .loc 1 443 9 shl.b64 %r433,%r432,2; sub.u64 %r166,%r170,%r433; .loc 1 444 5 add.u64 %r167,%r167,1; .loc 1 445 7 add.u64 %r150,%r150,8; add.u64 %r215,%r215,8; setp.eq.u64 %r434,%r167,%r460; @ %r434 bra $L51; $L94: .loc 1 453 16 ld.u64 %r435,[%r150]; add.u64 %r122,%r435,1; st.u64 [%r150],%r122; .loc 1 454 23 add.u64 %r437,%r461,%r215; ld.u64 %r123,[%r437]; .loc 1 454 13 sub.u64 %r438,%r123,%r110; shl.b64 %r439,%r438,2; add.u64 %r168,%r168,%r439; .loc 1 455 24 add.u64 %r440,%frame,%r215; ld.u64 %r125,[%r440]; .loc 1 455 14 add.u64 %r169,%r165,%r125; .loc 1 456 23 add.u64 %r442,%r463,%r215; ld.u64 %r127,[%r442]; .loc 1 456 13 shl.b64 %r443,%r127,2; add.u64 %r170,%r166,%r443; .loc 1 434 32 add.u64 %r445,%r462,%r215; ld.u64 %r131,[%r445]; .loc 1 434 13 setp.eq.u64 %r446,%r122,%r131; @ %r446 bra $L95; .loc 1 438 13 mov.u64 %r187,0; bra $L97; $L117: .loc 1 312 6 ld.u64 %r447,[%r221]; setp.eq.u64 %r448,%r447,0; @ ! 
%r448 bra $L66; bra $L96; $L51: .loc 1 460_gfortran_smaxloc1_4_i4 .visible .func _gfortran_smaxloc1_4_, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u32 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L121; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L122; $L121:r152154; call _gfortran_maxloc1_4_490 7 bra $L120; $L122: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r296,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L124; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L125; $L131: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L126; bra $L179; $L124: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error125: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r302,0; $L130: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L128; .loc 1 505 17 st.u64 [%r139],%r36; bra $L129; $L128: .loc 1 508 12 st.u64 [%r139],%r302; $L129: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L130; bra $L131; $L179: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L133; bra $L132; $L126: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r301,0; $L136: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L134; .loc 1 513 17 st.u64 [%r43],%r41; bra $L135; $L134: .loc 1 517 12 st.u64 [%r43],%r301; $L135: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L136; bra $L180; $L132: .loc 1 524 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L138; add.u64 %r295,%frame,120; $L158: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L139; $L138: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L140; bra $L181; $L142: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L139: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L142; bra $L138; $L140: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L120; $L181: .loc 1 547 2496246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L146: .loc 1 573 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L143; bra $L144; $L133: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L145; .loc 1 552 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error145: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L146; .loc 1 559 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L144; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L148: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L147; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error147: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L148; $L143: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ %r268 bra 
$L149; shl.b64 %r266,%r296,3; bra $L150; $L149: mov.u64 %r266,8; $L150: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L151: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L151; $L144: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,2; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 583 13 mov.u32 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L157: st.u32 [%r98],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L157; .loc 1 596 7 @ ! %r294 bra $L182; bra $L120; $L156: .loc 1 591 13 st.u64 [%r99],%r297; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L155; bra $L120; $L182: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; .loc 1 591 13 mov.u64 %r297,0; $L155: .loc 1 600 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,2; add.u64 %r98,%r98,%r291; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L156; .loc 1 591 13 mov.u64 %r80,0; bra $L157; $L180: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r293,%r98,0; @ ! 
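For orientation, what these three entry points compute is the Fortran MAXLOC intrinsic reduced over one dimension of a GFC_INTEGER_4 array. Below is a minimal C sketch of the core loop, assuming a flat contiguous vector instead of the descriptor-driven loops in maxloc1_4_i4.c; maxloc_vec, n and back are illustrative names, not the library's, and the _4_/_8_/_16_ members differ only in the integer kind used for the returned index, which the sketch simply widens to 64 bits.

/* Minimal sketch of the unmasked MAXLOC reduction visible in the PTX above:
   returns the 1-based index of the maximum element, or 0 for an empty
   vector.  back == 0 keeps the first maximum (strict >); back != 0 keeps
   the last one (>=), matching the two compare branches in the loop.  */
#include <stddef.h>
#include <stdint.h>

static int64_t
maxloc_vec (const int32_t *src, ptrdiff_t n, int back)
{
  int32_t maxval = INT32_MIN;     /* the -2147483648 seed in the PTX */
  int64_t result = n > 0 ? 1 : 0;

  for (ptrdiff_t i = 0; i < n; i++)
    {
      int32_t v = src[i];
      if (back ? v >= maxval : v > maxval)
        {
          maxval = v;
          result = i + 1;         /* Fortran indices are 1-based */
        }
    }
  return result;
}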
maxloc1_8_i4.o/ 1622802230   archive member, compiled from ../libgfortran/generated/maxloc1_8_i4.c (PTX for the nvptx offload target)
// BEGIN GLOBAL FUNCTION DECL: _gfortran_maxloc1_8_i4, _gfortran_mmaxloc1_8_i4, _gfortran_smaxloc1_8_i4 (all .visible .func)
// External declarations and decoded string constants are identical to the maxloc1_4_i4.o member above.
// Function bodies: PTX with the same structure as the _4_ variants; the returned index is a GFC_INTEGER_8, so the result strides are 8 bytes (shl.b64 ...,3 where the _4_ code shifts by 2) and the result stores are st.u64. The remainder of the member is the raw PTX instruction stream for these three functions.
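The _gfortran_mmaxloc1_*_i4 entry points perform the same reduction under a logical MASK array: only elements whose mask entry is true are candidates, and the stored index is 0 when nothing is selected. Here is a hedged sketch of that loop, again flattened to a contiguous vector; mmaxloc_vec and the unsigned char mask element are stand-ins for the descriptor and GFC_LOGICAL handling in the real code.

/* Sketch of the masked reduction: like maxloc_vec, but an element is a
   candidate only where mask[i] is nonzero.  result stays 0 if the mask
   never fires, which is what the PTX stores for an all-false mask.  */
#include <stddef.h>
#include <stdint.h>

static int64_t
mmaxloc_vec (const int32_t *src, const unsigned char *mask,
             ptrdiff_t n, int back)
{
  int32_t maxval = 0;
  int64_t result = 0;
  ptrdiff_t i = 0;

  /* Skip leading masked-out elements; the first unmasked one seeds the
     running maximum (the PTX splits the loop the same way).  */
  for (; i < n; i++)
    if (mask[i])
      {
        maxval = src[i];
        result = i + 1;
        break;
      }

  for (i++; i < n; i++)
    if (mask[i] && (back ? src[i] >= maxval : src[i] > maxval))
      {
        maxval = src[i];
        result = i + 1;
      }
  return result;
}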
maxloc1_16_i4.o/ 1622802230   archive member, compiled from ../libgfortran/generated/maxloc1_16_i4.c (PTX for the nvptx offload target)
// BEGIN GLOBAL FUNCTION DECL: _gfortran_maxloc1_16_i4, _gfortran_mmaxloc1_16_i4, _gfortran_smaxloc1_16_i4 (all .visible .func)
// External declarations and decoded string constants are identical to the _4_ and _8_ members above.
// Function bodies: PTX with the same structure again; the returned index is a GFC_INTEGER_16, so result elements are 16 bytes apart (shl.b64 ...,4) and each index is written as a pair of 64-bit stores. This fragment ends partway through the body of _gfortran_maxloc1_16_i4.
%r346 bra $L25; .loc 1 204 9 add.u64 %r121,%r124,%r135; .loc 1 205 9 add.u64 %r122,%r125,%r139; .loc 1 207 7 @ %r380 bra $L1; add.u64 %r130,%frame,368; mov.u64 %r149,8; .loc 1 206 5 mov.u64 %r123,1; .loc 1 201 13 mov.u64 %r381,0; bra $L35; $L36: st.u64 [%r130],%r381; .loc 1 204 23 mul.lo.u64 %r352,%r89,%r88; .loc 1 204 9 shl.b64 %r353,%r352,2; sub.u64 %r121,%r124,%r353; .loc 1 205 23 mul.lo.u64 %r354,%r92,%r88; .loc 1 205 9 shl.b64 %r355,%r354,4; sub.u64 %r122,%r125,%r355; .loc 1 206 5 add.u64 %r123,%r123,1; .loc 1 207 7 add.u64 %r130,%r130,8; add.u64 %r149,%r149,8; setp.eq.u64 %r356,%r123,%r386; @ %r356 bra $L1; $L35: .loc 1 215 16 ld.u64 %r357,[%r130]; add.u64 %r88,%r357,1; st.u64 [%r130],%r88; .loc 1 216 23 add.u64 %r359,%r387,%r149; ld.u64 %r89,[%r359]; .loc 1 216 13 shl.b64 %r360,%r89,2; add.u64 %r124,%r121,%r360; .loc 1 217 23 add.u64 %r361,%frame,%r149; ld.u64 %r92,[%r361]; .loc 1 217 13 shl.b64 %r362,%r92,4; add.u64 %r125,%r122,%r362; .loc 1 197 32 add.u64 %r364,%r388,%r149; ld.u64 %r95,[%r364]; .loc 1 197 13 setp.eq.u64 %r365,%r88,%r95; @ %r365 bra $L36; .loc 1 201 13 mov.u64 %r77,0; bra $L25; $L50: .loc 1 90 6 ld.u64 %r366,[%r183]; setp.eq.u64 %r367,%r366,0; @ ! %r367 bra $L11; bra $L37; $L1: .loc 1 221_gfortran_mmaxloc1_16_i4 .visible .func _gfortran_mmaxloc1_16_, .param .u32 %in_ar4) {608u32 %r94; .reg .u64 %r97u64u64 %r267; .reg .u64 %r274; .reg .pred %r275; .reg .u32 %r276; .reg .u32 %r277; .reg .u32u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .pred %r329; .reg .u64 %r330; .reg .pred %r331; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32 %r338; .reg .u64 %r339; .reg .u64 %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64pred %r355; .reg .pred64predpred %r408; .reg .pred %r409; .reg .u64 %r410; .reg .u32.reg .u64 %r417; .reg .pred %r418; .reg .predpred %r428; .reg .u64 %r432; .reg .u64pred %r460; .reg .u64 %r463; .reg .u64 %r466; .reg .u32 %r468; .reg .pred %r475; .reg .pred.reg .u32 %r504; .reg .u64 %r508; .reg .u16 %r509; .reg .u32 %r510; .reg .pred %r511; .reg .pred %r512; .reg .u64 %r515; .reg .u64 %r518; .reg .u32 %r520; .reg .pred %r527; .reg .pred %r530; .reg .pred %r533; .reg .u64pred %r542; .reg .u64 %r543; .reg .u64pred.reg .u64 %r610; .reg .u64 %r611; mov.u64 %r233,%ar0; mov.u64 %r234,%ar1; mov.u64 %r235,%ar2; mov.u64 %r236,%ar3; mov.u32 %r237,%ar4; .loc 1 251 6 setp.ne.u64 %r238,%r236,0; @ %r238 bra $L3r2235237; call _gfortran_maxloc1_16_7 bra $L51; $L52: .loc 1 261 10 ld.u64 %r22,[%r235]; .loc 1 261 7 add.u64 %r153,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r234+28]; .loc 1 262 38 add.u32 %r243,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r154,%r243; .loc 1 265 7 shr.u64 %r245,%r153,63; set.u32.gt.s64 %r247,%r153,%r154; neg.s32 %r248,%r247; cvt.u16.u64 %r251,%r245; cvt.u16.u32 %r252,%r248; or.b16 %r250,%r251,%r252; .loc 1 265 6 cvt.u32.u16 %r253,%r250; cvt.u16.u8 %r254,%r253; setp.eq.u16 %r255,%r254,0; @ %r255 bra $L54; .loc 1 267 7 cvt.u32.u32 %r258,%r23; cvt.s64.s8 %r257,%r258; st.u64 [%stack+8],%r257; st.u64 [%stack],%r22; cvta.const.u64 %r22stack; call _gfortran_runtime_error4: .loc 1 272 9 add.u64 %r596,%r153,%r153; add.u64 %r601,%r596,%r153; shl.b64 %r263,%r601,3; add.u64 %r264,%r234,%r263; ld.u64 %r267,[%r264+56]; add.u64 %r266,%r267,1; .loc 1 272 7 ld.u64 %r274,[%r264+48]; sub.u64 %r155,%r266,%r274; .loc 1 273 6 setp.le.s64 %r275,%r155,0; @ %r275 bra $L51; .loc 1 276 9 ld.u64 %r173,[%r236]; .loc 1 278 15 ld.u64 %r35,[%r236+16]; .loc 1 280 22 cvt.u32.u64 %r36,%r35; .loc 1 280 53 
add.u32 %r276,%r36,-4; and.b32 %r277,%r276,-5; set.u32.eq.u32 %r279,%r277,0; neg.s32 %r280,%r279; .loc 1 280 22 add.u32 %r281,%r36,-1; set.u32.le.u32 %r283,%r281,1; neg.s32 %r284,%r283; .loc 1 280 6 cvt.u16.u32 %r286,%r280; cvt.u16.u32 %r287,%r284; or.b16 %r285,%r286,%r287;56; .loc 1 282 7 setp.ne.u32 %r292,%r36,16; @ %r292 bra $L57; $L56: .loc 1 289 9 shl.b64 %r296,%r601,3; add.u64 %r297,%r234,%r296; ld.u64 %r158,[%r297+40]; .loc 1 290 12 add.u64 %r303,%r236,%r296; ld.u64 %r41,[%r303+40]; .loc 1 292 3 setp.ne.u64 %r305,%r153,0; @ %r305 bra $L58; $L64: .loc 1 302 3 setp.lt.s64 %r306,%r153,%r154; @ %r306 bra $L59; bra $L118; $L57: .loc 1 287 5 cvta.const.u64 %r307307_gfortran_runtime_error8: add.u64 %r164,%r234,40; add.u64 %r150,%r236,40; add.u64 %r309,%r234,16; add.u64 %r311,%r22,%r22; add.u64 %r312,%r311,%r22; shl.b64 %r313,%r312,3; add.u64 %r223,%r309,%r313; .loc 1 292 3 mov.u64 %r53,0; add.u64 %r597,%frame,240; add.u64 %r600,%frame,360; .loc 1 299 12 mov.u64 %r611,%r53; $L63: .loc 1 294 18 add.u64 %r315,%r597,%r53; ld.u64 %r316,[%r164]; st.u64 [%r315],%r316; .loc 1 295 18 add.u64 %r317,%frame,%r53; .loc 1 295 20 ld.u64 %r319,[%r150]; mul.lo.u64 %r318,%r319,%r35; .loc 1 295 18 st.u64 [%r317],%r318; .loc 1 296 19 ld.u64 %r321,[%r164+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r164+8]; sub.u64 %r52,%r320,%r322; .loc 1 298 10 setp.lt.s64 %r323,%r52,0; @ %r323 bra $L61; .loc 1 296 17 add.u64 %r325,%r600,%r53; st.u64 [%r325],%r52; bra $L62; $L61: .loc 1 299 12 add.u64 %r327,%r600,%r53; st.u64 [%r327],%r611; $L62: .loc 1 292 3 add.u64 %r164,%r164,24; add.u64 %r150,%r150,24; add.u64 %r53,%r53,8; setp.ne.u64 %r329,%r164,%r223; @ %r329 bra $L63; bra $L64; $L118: .loc 1 312 6 ld.u64 %r330,[%r233]; setp.eq.u64 %r331,%r330,0; @ ! %r331 bra $L66; bra $L65; $L59: add.u64 %r333,%r22,%r22; add.u64 %r334,%r333,%r22; shl.b64 %r335,%r334,3; add.u64 %r178,%r335,40; add.u64 %r59,%r234,%r178; add.u64 %r180,%r236,%r178; shl.b64 %r336,%r22,3; add.u64 %r91,%r336,-8; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; shl.b64 %r339,%r337,3; add.u64 %r97,%r339,-8; add.u64 %r597,%frame,240; add.u64 %r600,%frame,360; .loc 1 309 12 mov.u64 %r610,0; $L69: .loc 1 304 18 add.u64 %r341,%r597,%r91; ld.u64 %r342,[%r59]; st.u64 [%r341],%r342; .loc 1 305 18 add.u64 %r343,%frame,%r91; .loc 1 305 20 ld.u64 %r345,[%r180]; mul.lo.u64 %r344,%r345,%r35; .loc 1 305 18 st.u64 [%r343],%r344; .loc 1 306 19 ld.u64 %r347,[%r59+16]; add.u64 %r346,%r347,1; ld.u64 %r348,[%r59+8]; sub.u64 %r66,%r346,%r348; .loc 1 308 10 setp.lt.s64 %r349,%r66,0; @ %r349 bra $L67; .loc 1 306 17 add.u64 %r351,%r600,%r91; st.u64 [%r351],%r66; bra $L68; $L67: .loc 1 309 12 add.u64 %r353,%r600,%r91; st.u64 [%r353],%r610; $L68: .loc 1 302 3 add.u64 %r59,%r59,24; add.u64 %r180,%r180,24; add.u64 %r91,%r91,8; setp.ne.u64 %r355,%r91,%r97; @ %r355 bra $L69; bra $L119; $L65: .loc 1 316 7 setp.eq.u64 %r356,%r154,0; @ %r356 bra $L71; add.u64 %r600,%frame,360; $L97: add.u64 %r137,%r233,40; mov.u64 %r134,%r600; cvt.u32.u32 %r358,%r23; cvt.s64.s8 %r357,%r358; add.u64 %r360,%r357,%r357; add.u64 %r361,%r360,%r357; shl.b64 %r362,%r361,3; add.u64 %r363,%r233,16; add.u64 %r64,%r362,%r363; .loc 1 319 10 mov.u64 %r136,1; .loc 1 323 4 mov.u64 %r382,0; bra $L72; $L71: .loc 1 327 20 add.u32 %r364,%r23,-2; cvt.s64.s32 %r75,%r364; add.u64 %r366,%r75,%r75; add.u64 %r367,%r366,%r75; shl.b64 %r368,%r367,3; add.u64 %r369,%r233,%r368; .loc 1 327 67 shl.b64 %r371,%r75,3; add.u64 %r372,%frame,%r371; .loc 1 327 59 ld.u64 %r374,[%r369+40]; ld.u64 %r375,[%r372+360]; mul.lo.u64 %r162,%r374,%r375; .loc 
1 329 24 mov.u64 %r376,0; st.u64 [%r233+8],%r376; .loc 1 330 28 cvt.u16.u32 %r379,%r23; add.u16 %r378,%r379,-1; cvt.u32.u16 %r380,%r378; st.u8 [%r233+28],%r380; .loc 1 332 10 setp.eq.u64 %r381,%r162,0; @ %r381 bra $L73; bra $L120; $L75: .loc 1 321 47 mul.lo.u64 %r136,%r72,%r136; $L72: .loc 1 323 4 st.u64 [%r137+8],%r382; ld.u64 %r72,[%r134]; add.u64 %r383,%r72,-1; st.u64 [%r137+16],%r383; st.u64 [%r137],%r136; .loc 1 316 7 add.u64 %r137,%r137,24; add.u64 %r134,%r134,8; setp.ne.u64 %r384,%r64,%r137; @ %r384 bra $L75; bra $L71; $L73: .loc 1 335 4 st.u64 [%r233+48],%r162; mov.u64 %r386,-1; st.u64 [%r233+56],%r386; mov.u64 %r387,1; st.u64 [%r233+40],%r387; .loc 1 336 4 bra $L51; $L120: .loc 1 339 24 mov.u64 %r389,16389; call (%value_in),_gfortrani_xmallocarray90,[%value_in]; } .loc 1 339 22 st.u64 [%r233],%r390; bra $L76; $L66: .loc 1 344 19 ld.s8 %r392,[%r233+28]; .loc 1 344 10 setp.eq.u64 %r393,%r392,%r154; @ %r393 bra $L77; .loc 1 345 2 cvta.const.u64 %r3944_gfortran_runtime_error77: .loc 1 347 11 cvta.global.u64 %r396,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r397,[%r396+36]; setp.eq.u32 %r398,%r397,0; @ %r398 bra $L76; .loc 1 349 4 add.u64 %r600,%frame,360; cvta.const.u64 %r402,$LC2r600002; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r406,$LC406402; call _gfortrani_bounds_equal_extents$L76: .loc 1 356 3 setp.ne.u64 %r408,%r154,0; @ %r408 bra $L78; $L81: .loc 1 364 8 ld.u64 %r174,[%r233]; .loc 1 365 8 ld.u64 %r172,[%r234]; .loc 1 367 9 setp.ne.u64 %r409,%r172,0; @ %r409 bra $L79; bra $L51; $L78: add.u64 %r188,%frame,480; add.u64 %r181,%r233,40; add.u64 %r149,%frame,120; cvt.u32.u32 %r411,%r23; cvt.s64.s8 %r410,%r411; add.u64 %r138,%r410,-1; .loc 1 356 3 mov.u64 %r175,0; add.u64 %r600,%frame,360; .loc 1 358 16 mov.u64 %r412,%r175; $L80: st.u64 [%r188],%r412; .loc 1 359 18 ld.u64 %r413,[%r181]; st.u64 [%r149],%r413; .loc 1 360 17 shl.b64 %r415,%r175,3; add.u64 %r416,%r600,%r415; .loc 1 360 10 ld.u64 %r417,[%r416]; setp.le.s64 %r418,%r417,0; @ %r418 bra $L51; .loc 1 356 26 add.u64 %r175,%r175,1; .loc 1 356 3 add.u64 %r188,%r188,8; add.u64 %r181,%r181,24; add.u64 %r149,%r149,8; setp.ne.u64 %r419,%r138,%r175; @ %r419 bra $L80; bra $L81; $L79: .loc 1 290 12 mul.lo.u64 %r43,%r41,%r35; .loc 1 386 32 shl.b64 %r88,%r158,2; ld.u64 %r189,[%frame+480]; .loc 1 430 22 ld.u64 %r102,[%frame+240]; .loc 1 430 12 shl.b64 %r103,%r102,2; .loc 1 431 23 ld.u64 %r105,[%frame]; .loc 1 432 22 ld.u64 %r107,[%frame+120]; .loc 1 432 12 shl.b64 %r108,%r107,4; .loc 1 434 32 ld.u64 %r29,[%frame+360]; .loc 1 441 23 mul.lo.u64 %r39,%r29,%r102; .loc 1 442 24 mul.lo.u64 %r420,%r29,%r105; .loc 1 442 10 neg.s64 %r92,%r420; .loc 1 443 23 mul.lo.u64 %r421,%r29,%r107; .loc 1 443 9 shl.b64 %r422,%r421,4; neg.s64 %r185,%r422; setp.le.s64 %r595,%r154,1; .loc 1 408 9 setp.ne.u32 %r603,%r237,0; add.u64 %r604,%r155,-1; cvt.u32.u32 %r605,%r23; cvt.s64.s8 %r606,%r605; add.u64 %r607,%r606,-1; add.u64 %r608,%frame,240; add.u64 %r609,%frame,360; $L98: .loc 1 356 3 mov.u64 %r168,%r173; mov.u64 %r167,%r172; .loc 1 386 9 mov.u64 %r197,0; $L85: .loc 1 389 7 ld.s8 %r224,[%r168]; mov.u64 %r230,%r197; .loc 1 386 24 add.u64 %r197,%r197,1; .loc 1 389 6 setp.eq.u32 %r598,%r224,0; @ %r598 bra $L82; .loc 1 397 11 ld.u32 %r145,[%r167]; .loc 1 398 11 mov.u64 %r557,%r197; shr.s64 %r558,%r557,63; .loc 1 408 9 @ %r603 bra $L83; bra $L121; $L82: .loc 1 386 32 add.u64 %r167,%r167,%r88; .loc 1 386 47 add.u64 %r168,%r168,%r43; .loc 1 386 2 setp.ne.u64 %r426,%r155,%r197; @ %r426 bra $L85; .loc 1 385 9 mov.u64 %r557,0; mov.u64 %r558,%r557; bra 
$L86; $L121: .loc 1 418 8 setp.gt.s64 %r427,%r155,%r230; @ %r427 bra $L87; bra $L86; $L83: .loc 1 409 8 setp.le.s64 %r428,%r155,%r230; @ %r428 bra $L86; add.u64 %r565,%r230,1; shr.s64 %r566,%r565,63; sub.u64 %r432,%r604,%r230; shr.s64 %r435,%r230,63; add.u64 %r438,%r230,2; set.u32.lt.u64 %r443,%r438,%r230; cvt.s64.s32 %r441,%r443; sub.u64 %r447,%r435,%r441; add.u64 %r449,%r432,%r438; set.u32.lt.u64 %r454,%r449,%r432; cvt.s64.s32 %r452,%r454; sub.u64 %r458,%r447,%r452; mov.u64 %r577,1; $L90: .loc 1 411 8 @ %r598 bra $L88; .loc 1 411 18 ld.u32 %r89,[%r167]; .loc 1 411 15 setp.lt.s32 %r460,%r89,%r145; .loc 1 414 16 selp.u64 %r557,%r557,%r565,%r460; selp.u64 %r558,%r558,%r566,%r460; selp.u32 %r145,%r145,%r89,%r460; $L88: .loc 1 409 33 add.u64 %r167,%r167,%r88; .loc 1 409 48 add.u64 %r168,%r168,%r43; .loc 1 409 8 add.u64 %r463,%r565,%r577; set.u32.lt.u64 %r468,%r463,%r565; cvt.s64.s32 %r466,%r468; mov.u64 %r565,%r463; sub.u64 %r566,%r566,%r466; setp.ne.u64 %r475,%r449,%r565; @ %r475 bra $L99; setp.ne.u64 %r478,%r458,%r566; @ ! %r478 bra $L86; $L99: .loc 1 411 9 ld.s8 %r224,[%r168]; setp.eq.u32 %r598,%r224,0; bra $L90; $L87: add.u64 %r561,%r230,1; shr.s64 %r562,%r561,63; sub.u64 %r482,%r604,%r230; shr.s64 %r485,%r230,63; add.u64 %r488,%r230,2; set.u32.lt.u64 %r493,%r488,%r230; cvt.s64.s32 %r491,%r493; sub.u64 %r497,%r485,%r491; add.u64 %r499,%r482,%r488; set.u32.lt.u64 %r504,%r499,%r482; cvt.s64.s32 %r502,%r504; sub.u64 %r508,%r497,%r502; .loc 1 418 8 mov.u64 %r591,1; $L113: .loc 1 420 8 ld.s8 %r510,[%r168]; cvt.u16.u32 %r509,%r510; setp.eq.u16 %r511,%r509,0; @ %r511 bra $L91; .loc 1 420 18 ld.u32 %r94,[%r167]; .loc 1 420 15 setp.le.s32 %r512,%r94,%r145; .loc 1 423 16 selp.u64 %r557,%r557,%r561,%r512; selp.u64 %r558,%r558,%r562,%r512; selp.u32 %r145,%r145,%r94,%r512; $L91: .loc 1 418 33 add.u64 %r167,%r167,%r88; .loc 1 418 48 add.u64 %r168,%r168,%r43; .loc 1 418 8 add.u64 %r515,%r561,%r591; set.u32.lt.u64 %r520,%r515,%r561; cvt.s64.s32 %r518,%r520; mov.u64 %r561,%r515; sub.u64 %r562,%r562,%r518; setp.ne.u64 %r527,%r499,%r561; @ %r527 bra $L113; setp.ne.u64 %r530,%r508,%r562; @ %r530 bra $L113; $L86: .loc 1 426 8 st.u64 [%r174],%r557; st.u64 [%r174+8],%r558; .loc 1 429 15 add.u64 %r189,%r189,1; .loc 1 430 12 add.u64 %r172,%r172,%r103; .loc 1 431 13 add.u64 %r173,%r173,%r105; .loc 1 432 12 add.u64 %r174,%r174,%r108; .loc 1 434 13 setp.ne.u64 %r533,%r29,%r189; @ %r533 bra $L98; .loc 1 442 10 add.u64 %r169,%r173,%r92; .loc 1 443 9 add.u64 %r170,%r174,%r185; .loc 1 445 7 @ %r595 bra $L51; add.u64 %r219,%frame,488; .loc 1 441 23 mov.u64 %r112,%r39; .loc 1 445 7 mov.u64 %r218,8; .loc 1 444 5 mov.u64 %r171,1; add.u64 %r599,%frame,120; .loc 1 438 13 mov.u64 %r602,0; bra $L95; $L96: st.u64 [%r219],%r602; .loc 1 441 23 mul.lo.u64 %r112,%r125,%r124; .loc 1 442 24 mul.lo.u64 %r539,%r127,%r124; .loc 1 442 10 sub.u64 %r169,%r173,%r539; .loc 1 443 23 mul.lo.u64 %r540,%r129,%r124; .loc 1 443 9 shl.b64 %r541,%r540,4; sub.u64 %r170,%r174,%r541; .loc 1 444 5 add.u64 %r171,%r171,1; .loc 1 445 7 add.u64 %r219,%r219,8; add.u64 %r218,%r218,8; setp.eq.u64 %r542,%r171,%r607; @ %r542 bra $L51; $L95: .loc 1 453 16 ld.u64 %r543,[%r219]; add.u64 %r124,%r543,1; st.u64 [%r219],%r124; .loc 1 454 23 add.u64 %r545,%r608,%r218; ld.u64 %r125,[%r545]; .loc 1 454 13 sub.u64 %r546,%r125,%r112; shl.b64 %r547,%r546,2; add.u64 %r172,%r172,%r547; .loc 1 455 24 add.u64 %r548,%frame,%r218; ld.u64 %r127,[%r548]; .loc 1 455 14 add.u64 %r173,%r169,%r127; .loc 1 456 23 add.u64 %r550,%r599,%r218; ld.u64 %r129,[%r550]; .loc 1 456 13 shl.b64 
%r551,%r129,4; add.u64 %r174,%r170,%r551; .loc 1 434 32 add.u64 %r553,%r609,%r218; ld.u64 %r133,[%r553]; .loc 1 434 13 setp.eq.u64 %r554,%r124,%r133; @ %r554 bra $L96; .loc 1 438 13 mov.u64 %r189,0; bra $L98; $L119: .loc 1 312 6 ld.u64 %r555,[%r233]; setp.eq.u64 %r556,%r555,0; @ ! %r556 bra $L66; bra $L97; $L51: .loc 1 460_gfortran_smaxloc1_16_i4 .visible .func _gfortran_smaxloc1_16_, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L123; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L124; r152154; call _gfortran_maxloc1_16_490 7 bra $L122; $L124: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r297,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L126; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L127; $L133: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L128; bra $L181; $L126: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error127: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r303,0; $L132: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L130; .loc 1 505 17 st.u64 [%r139],%r36; bra $L131; $L130: .loc 1 508 12 st.u64 [%r139],%r303; $L131: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L132; bra $L133; $L181: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L135; bra $L134; $L128: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r302,0; $L138: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L136; .loc 1 513 17 st.u64 [%r43],%r41; bra $L137; $L136: .loc 1 517 12 st.u64 [%r43],%r302; $L137: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L138; bra $L182; $L134: .loc 1 524 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L140; add.u64 %r296,%frame,120; $L160: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L141; $L140: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L142; bra $L183; $L144: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L141: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L144; bra $L140; $L142: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L122; $L183: .loc 1 547 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L148: .loc 1 573 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L145; bra $L146; $L135: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L147; .loc 1 552 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error147: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L148; .loc 1 559 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L146; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L150: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L149; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error149: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L150; $L145: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L151; shl.b64 %r266,%r297,3; bra $L152; $L151: mov.u64 %r266,8; $L152: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L153: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L153; $L146: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,4; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 583 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L159: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L159; .loc 1 596 7 @ ! %r295 bra $L184; bra $L122; $L158: .loc 1 591 13 st.u64 [%r99],%r280; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L157; bra $L122; $L184: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; $L157: .loc 1 600 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L158; .loc 1 591 13 mov.u64 %r80,0; bra $L159; $L182: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r294,%r98,0; @ ! 
maxloc1_4_i8.o/1622802230
.file 1 "../../../../libgfortran/generated/maxloc1_4_i8.c"
// BEGIN GLOBAL FUNCTION DECL: _gfortran_maxloc1_4_i8, _gfortran_mmaxloc1_4_i8, _gfortran_smaxloc1_4_i8 (.visible .func)
// extern decls and MAXLOC error-message constants as in maxloc1_16_i4.o above
// PTX bodies of _gfortran_maxloc1_4_i8, _gfortran_mmaxloc1_4_i8 and _gfortran_smaxloc1_4_i8: same structure as the 16_i4 variants, with 8-byte source elements and 4-byte result indices (.loc 1 58 – .loc 1 605)
%r293 bra $L133; bra $L158; $L120: .loc 1 605 maxloc1_8_i8.o/ 1622802230_gfortran_maxloc1_8_i8 .visible .func _gfortran_maxloc1_8_i8.file 1 "../../../../libgfortran/generated/maxloc1_8_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_8_i8 .visible .func _gfortran_mmaxloc1_8__gfortran_smaxloc1_8_i8 .visible .func _gfortran_smaxloc1_8_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_maxloc1_8_i8 .visible .func _gfortran_maxloc1_8_i895u16 %r196; .reg .predu64u64 %r229; .reg .u64 %r230; .reg .predpred %r255; .reg .predu64 %r266; .reg .u16u64predpred %r315; .reg .predu64 %r327; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .pred %r337; .reg .u64 %r338; .reg .u64 %r340; .reg .u64 %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r345; .reg .pred %r346; .reg .u64 %r347; .reg .pred %r348; .reg .u64 %r349; .reg .predu64 %r353; .reg .pred %r355; .reg .u64u64 %r359; .reg .u64mov.u64 %r181,%ar0; mov.u64 %r182,%ar1; mov.u64 %r183,%ar2; mov.u32 %r184,%ar3; .loc 1 58 10 ld.s8 %r22,[%r182+28]; .loc 1 58 38 add.u32 %r185,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r110,%r185; .loc 1 59 10 ld.u64 %r25,[%r183]; .loc 1 59 7 add.u64 %r111,%r25,-1; .loc 1 61 7 shr.u64 %r187,%r111,63; set.u32.lt.s64 %r189,%r110,%r111; neg.s32 %r190,%r189; cvt.u16.u64 %r193,%r187; cvt.u16.u32 %r194,%r190; or.b16 %r192,%r193,%r194; .loc 1 61 6 cvt.u32.u16 %r195,%r192; cvt.u16.u8 %r196,%r195; setp.eq.u16 %r197,%r196,0; @ %r197 bra $L2; .loc 1 63 7 cvt.u32.u32 %r200,%r22; cvt.s64.s8 %r199,%r200; st.u64 [%stack+8],%r199_gfortran_runtime_errorr203,%r111,%r111; add.u64 %r204,%r203,%r111; shl.b64 %r205,%r204,3; add.u64 %r206,%r182,%r205; ld.u64 %r31,[%r206+56]; ld.u64 %r33,[%r206+48]; .loc 1 71 9 ld.u64 %r114,[%r206+40]; .loc 1 73 3 setp.ne.u64 %r220,%r111,0; @ %r220 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r221,%r110,%r111; @ %r221 bra $L4; bra $L50; $L3: add.u64 %r63,%r182,40; add.u64 %r29,%frame,120; add.u64 %r124,%frame,240; add.u64 %r222,%r182,16; add.u64 %r224,%r25,%r25; add.u64 
%r225,%r224,%r25; shl.b64 %r226,%r225,3; add.u64 %r174,%r222,%r226; .loc 1 79 12 mov.u64 %r363,0; $L8: .loc 1 75 18 ld.u64 %r227,[%r63]; st.u64 [%r29],%r227; .loc 1 76 19 ld.u64 %r229,[%r63+16]; add.u64 %r228,%r229,1; ld.u64 %r230,[%r63+8]; sub.u64 %r38,%r228,%r230; .loc 1 78 10 setp.lt.s64 %r231,%r38,0; @ %r231 bra $L6; .loc 1 76 17 st.u64 [%r124],%r38; bra $L7; $L6: .loc 1 79 12 st.u64 [%r124],%r363; $L7: .loc 1 73 3 add.u64 %r63,%r63,24; add.u64 %r29,%r29,8; add.u64 %r124,%r124,8; setp.ne.u64 %r233,%r63,%r174; @ %r233 bra $L8; bra $L9; $L50: .loc 1 90 6 ld.u64 %r234,[%r181]; setp.eq.u64 %r235,%r234,0; @ ! %r235 bra $L11; bra $L10; $L4: add.u64 %r237,%r25,%r25; add.u64 %r238,%r237,%r25; shl.b64 %r239,%r238,3; add.u64 %r240,%r239,40; add.u64 %r132,%r182,%r240; shl.b64 %r241,%r25,3; add.u64 %r107,%r241,-8; add.u64 %r349,%frame,120; add.u64 %r126,%r349,%r107; add.u64 %r353,%frame,240; add.u64 %r105,%r353,%r107; cvt.u32.u32 %r245,%r22; cvt.s64.s8 %r244,%r245; add.u64 %r247,%r244,%r244; add.u64 %r248,%r247,%r244; shl.b64 %r249,%r248,3; add.u64 %r250,%r182,40; add.u64 %r81,%r249,%r250; .loc 1 87 12 mov.u64 %r362,0; $L14: .loc 1 83 18 ld.u64 %r251,[%r132]; st.u64 [%r126],%r251; .loc 1 84 19 ld.u64 %r253,[%r132+16]; add.u64 %r252,%r253,1; ld.u64 %r254,[%r132+8]; sub.u64 %r45,%r252,%r254; .loc 1 86 10 setp.lt.s64 %r255,%r45,0; @ %r255 bra $L12; .loc 1 84 17 st.u64 [%r105],%r45; bra $L13; $L12: .loc 1 87 12 st.u64 [%r105],%r362; $L13: .loc 1 81 3 add.u64 %r132,%r132,24; add.u64 %r126,%r126,8; add.u64 %r105,%r105,8; setp.ne.u64 %r257,%r81,%r132; @ %r257 bra $L14; bra $L51; $L10: .loc 1 94 7 setp.le.s64 %r258,%r110,0; @ %r258 bra $L16; add.u64 %r353,%frame,240; $L36: add.u64 %r153,%r181,40; mov.u64 %r151,%r353; cvt.u32.u32 %r260,%r22; cvt.s64.s8 %r259,%r260; add.u64 %r262,%r259,%r259; add.u64 %r263,%r262,%r259; shl.b64 %r264,%r263,3; add.u64 %r265,%r181,16; add.u64 %r138,%r264,%r265; .loc 1 97 10 mov.u64 %r102,1; .loc 1 101 4 mov.u64 %r288,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r266,0; st.u64 [%r181+8],%r266; .loc 1 106 28 cvt.u16.u32 %r269,%r22; add.u16 %r268,%r269,-1; cvt.u32.u16 %r270,%r268; st.u8 [%r181+28],%r270; .loc 1 108 20 add.u32 %r271,%r22,-2; cvt.s64.s32 %r56,%r271; add.u64 %r273,%r56,%r56; add.u64 %r274,%r273,%r56; shl.b64 %r275,%r274,3; add.u64 %r276,%r181,%r275; .loc 1 108 67 shl.b64 %r278,%r56,3; add.u64 %r279,%frame,%r278; .loc 1 108 59 ld.u64 %r281,[%r276+40]; ld.u64 %r282,[%r279+240]; mul.lo.u64 %r116,%r281,%r282; .loc 1 110 29 mov.u64 %r284284; call (%value_in),_gfortrani_xmallocarray285,[%value_in]; } .loc 1 110 27 st.u64 [%r181],%r285; .loc 1 111 10 setp.eq.u64 %r287,%r116,0; @ ! 
%r287 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r102,%r52,%r102; $L17: .loc 1 101 4 st.u64 [%r153+8],%r288; ld.u64 %r52,[%r151]; add.u64 %r289,%r52,-1; st.u64 [%r153+16],%r289; st.u64 [%r153],%r102; .loc 1 94 7 add.u64 %r153,%r153,24; add.u64 %r151,%r151,8; setp.ne.u64 %r290,%r138,%r153; @ %r290 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r181+48],%r116; mov.u64 %r292,-1; st.u64 [%r181+56],%r292; mov.u64 %r293,1; st.u64 [%r181+40],%r293; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r62,[%r181+28]; .loc 1 121 10 setp.eq.u64 %r294,%r62,%r110; @ %r294 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r110; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r297,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r298,[%r297+36]; setp.eq.u32 %r299,%r298,0; @ %r299 bra $L21; .loc 1 128 2 add.u64 %r353,%frame,240; cvta.const.u64 %r303,$LC2r353302303; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r305,%r110,0; @ %r305 bra $L24; $L27: .loc 1 68 9 add.u64 %r306,%r31,1; .loc 1 68 7 sub.u64 %r112,%r306,%r33; max.s64 %r46,%r112,0; .loc 1 140 8 ld.u64 %r121,[%r182]; .loc 1 141 8 ld.u64 %r122,[%r181]; .loc 1 180 31 shl.b64 %r69,%r114,3; ld.u64 %r48,[%frame+360]; .loc 1 194 22 ld.u64 %r70,[%frame+120]; .loc 1 194 12 shl.b64 %r72,%r70,3; .loc 1 195 22 ld.u64 %r73,[%frame]; .loc 1 195 12 shl.b64 %r75,%r73,3; setp.gt.s64 %r350,%r112,0; setp.le.s64 %r351,%r110,1; setp.eq.u32 %r355,%r184,0; .loc 1 159 10 mov.u64 %r356,0; cvt.u32.u32 %r357,%r22; cvt.s64.s8 %r358,%r357; add.u64 %r359,%r358,-1; add.u64 %r360,%frame,120; add.u64 %r361,%frame,240; bra $L25; $L24: add.u64 %r166,%frame,360; add.u64 %r165,%r181,40; mov.u64 %r163,%frame; cvt.u32.u32 %r308,%r22; cvt.s64.s8 %r307,%r308; add.u64 %r154,%r307,-1; .loc 1 132 3 mov.u64 %r123,0; add.u64 %r353,%frame,240; .loc 1 134 16 mov.u64 %r309,%r123; $L26: st.u64 [%r166],%r309; .loc 1 135 18 ld.u64 %r310,[%r165]; st.u64 [%r163],%r310; .loc 1 136 17 shl.b64 %r312,%r123,3; add.u64 %r313,%r353,%r312; .loc 1 136 10 ld.u64 %r314,[%r313]; setp.le.s64 %r315,%r314,0; @ %r315 bra $L1; .loc 1 132 26 add.u64 %r123,%r123,1; .loc 1 132 3 add.u64 %r166,%r166,8; add.u64 %r165,%r165,24; add.u64 %r163,%r163,8; setp.ne.u64 %r316,%r123,%r154; @ %r316 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r350 bra $L37; .loc 1 159 10 st.u64 [%r122],%r356; bra $L29; $L37: mov.u64 %r117,%r121; .loc 1 155 9 mov.u64 %r104,-9223372036854775808; .loc 1 157 9 mov.u64 %r103,1; .loc 1 178 8 mov.u64 %r149,0; $L28: .loc 1 182 14 ld.u64 %r180,[%r117]; .loc 1 185 14 add.u64 %r149,%r149,1; .loc 1 182 7 @ %r355 bra $L30; .loc 1 182 6 setp.le.s64 %r320,%r104,%r180; selp.u64 %r104,%r180,%r104,%r320; selp.u64 %r103,%r149,%r103,%r320; bra $L31; $L30: setp.lt.s64 %r321,%r104,%r180; selp.u64 %r104,%r180,%r104,%r321; selp.u64 %r103,%r149,%r103,%r321; $L31: .loc 1 180 31 add.u64 %r117,%r117,%r69; .loc 1 180 6 setp.gt.s64 %r322,%r46,%r149; @ %r322 bra $L28; .loc 1 189 12 st.u64 [%r122],%r103; $L29: .loc 1 193 15 add.u64 %r48,%r48,1; .loc 1 194 12 add.u64 %r121,%r121,%r72; .loc 1 195 12 add.u64 %r122,%r122,%r75; .loc 1 197 32 ld.u64 %r145,[%frame+240]; .loc 1 197 13 setp.ne.u64 %r323,%r48,%r145; @ %r323 bra $L25; .loc 1 204 23 mul.lo.u64 %r324,%r70,%r48; .loc 1 204 9 shl.b64 %r325,%r324,3; sub.u64 %r118,%r121,%r325; .loc 1 205 23 mul.lo.u64 %r326,%r73,%r48; .loc 1 205 9 shl.b64 %r327,%r326,3; sub.u64 %r119,%r122,%r327; .loc 1 207 7 @ %r351 bra $L1; add.u64 %r150,%frame,368; mov.u64 %r109,8; .loc 1 206 5 mov.u64 %r120,1; bra $L34; $L35: .loc 1 201 13 st.u64 
[%r150],%r356; .loc 1 204 23 mul.lo.u64 %r333,%r89,%r88; .loc 1 204 9 shl.b64 %r334,%r333,3; sub.u64 %r118,%r121,%r334; .loc 1 205 23 mul.lo.u64 %r335,%r92,%r88; .loc 1 205 9 shl.b64 %r336,%r335,3; sub.u64 %r119,%r122,%r336; .loc 1 206 5 add.u64 %r120,%r120,1; .loc 1 207 7 add.u64 %r150,%r150,8; add.u64 %r109,%r109,8; setp.eq.u64 %r337,%r120,%r359; @ %r337 bra $L1; $L34: .loc 1 215 16 ld.u64 %r338,[%r150]; add.u64 %r88,%r338,1; st.u64 [%r150],%r88; .loc 1 216 23 add.u64 %r340,%r360,%r109; ld.u64 %r89,[%r340]; .loc 1 216 13 shl.b64 %r341,%r89,3; add.u64 %r121,%r118,%r341; .loc 1 217 23 add.u64 %r342,%frame,%r109; ld.u64 %r92,[%r342]; .loc 1 217 13 shl.b64 %r343,%r92,3; add.u64 %r122,%r119,%r343; .loc 1 197 32 add.u64 %r345,%r361,%r109; ld.u64 %r95,[%r345]; .loc 1 197 13 setp.eq.u64 %r346,%r88,%r95; @ %r346 bra $L35; .loc 1 201 13 mov.u64 %r48,0; bra $L25; $L51: .loc 1 90 6 ld.u64 %r347,[%r181]; setp.eq.u64 %r348,%r347,0; @ ! %r348 bra $L11; bra $L36; $L1: .loc 1 221_gfortran_mmaxloc1_8_i8 .visible .func _gfortran_mmaxloc1_8_, .param .u32 %in_ar4) {608pred %r218; .reg .u32 %r223; .reg .u6464u64pred %r373; .reg .u64 %r374; .reg .u64 %r376; .reg .u32 %r377; .reg .pred %r378; .reg .u64 %r381; .reg .u64 %r382; .reg .u64 %r386; .reg .pred %r388; .reg .pred %r389; .reg .u64 %r390; .reg .u32 %r391; .reg .u64pred %r398; .reg .pred %r399; .reg .pred %r402; .reg .pred %r403; .reg .pred %r404; .reg .pred %r406; .reg .pred %r407; .reg .u16 %r408; .reg .u32 %r409; .reg .pred %r410; .reg .pred %r411; .reg .pred %r412; .reg .pred.reg .u64 %r416; .reg .u64predpredmov.u64 %r213,%ar0; mov.u64 %r214,%ar1; mov.u64 %r215,%ar2; mov.u64 %r216,%ar3; mov.u32 %r217,%ar4; .loc 1 251 6 setp.ne.u64 %r218,%r216,0; @ %r218 bra $L1r214215217; call _gfortran_maxloc1_8_7 bra $L52; $L53: .loc 1 261 10 ld.u64 %r22,[%r215]; .loc 1 261 7 add.u64 %r146,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r214+28]; .loc 1 262 38 add.u32 %r223,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r147,%r223; .loc 1 265 7 shr.u64 %r225,%r146,63; set.u32.gt.s64 %r227,%r146,%r147; neg.s32 %r228,%r227; cvt.u16.u64 %r231,%r225; cvt.u16.u32 %r232,%r228; or.b16 %r230,%r231,%r232; .loc 1 265 6 cvt.u32.u16 %r233,%r230; cvt.u16.u8 %r234,%r233; setp.eq.u16 %r235,%r234,0; @ %r235 bra $L55; .loc 1 267 7 cvt.u32.u32 %r238,%r23; cvt.s64.s8 %r237,%r238; st.u64 [%stack+8],%r237; st.u64 [%stack],%r22; cvta.const.u64 %r236236_gfortran_runtime_error5: .loc 1 272 9 add.u64 %r445,%r146,%r146; add.u64 %r444,%r445,%r146; shl.b64 %r243,%r444,3; add.u64 %r244,%r214,%r243; ld.u64 %r247,[%r244+56]; add.u64 %r246,%r247,1; .loc 1 272 7 ld.u64 %r254,[%r244+48]; sub.u64 %r148,%r246,%r254; .loc 1 273 6 setp.le.s64 %r255,%r148,0; @ %r255 bra $L52; .loc 1 276 9 ld.u64 %r166,[%r216]; .loc 1 278 15 ld.u64 %r34,[%r216+16]; .loc 1 280 22 cvt.u32.u64 %r35,%r34; .loc 1 280 53 add.u32 %r256,%r35,-4; and.b32 %r257,%r256,-5; set.u32.eq.u32 %r259,%r257,0; neg.s32 %r260,%r259; .loc 1 280 22 add.u32 %r261,%r35,-1; set.u32.le.u32 %r263,%r261,1; neg.s32 %r264,%r263; .loc 1 280 6 cvt.u16.u32 %r266,%r260; cvt.u16.u32 %r267,%r264; or.b16 %r265,%r266,%r267; cvt.u32.u16 %r268,%r265; cvt.u16.u8 %r269,%r268; setp.ne.u16 %r270,%r269,0; @ %r270 bra $L57; .loc 1 282 7 setp.ne.u32 %r272,%r35,16; @ %r272 bra $L58; $L57: .loc 1 289 9 shl.b64 %r276,%r444,3; add.u64 %r277,%r214,%r276; ld.u64 %r151,[%r277+40]; .loc 1 290 12 add.u64 %r283,%r216,%r276; ld.u64 %r39,[%r283+40]; .loc 1 292 3 setp.ne.u64 %r285,%r146,0; @ %r285 bra $L59; $L65: .loc 1 302 3 setp.lt.s64 %r286,%r146,%r147; @ %r286 bra $L60; bra $L119; $L58: .loc 1 287 
5_gfortran_runtime_error9: add.u64 %r205,%r214,40; add.u64 %r67,%r216,40; add.u64 %r289,%r214,16; add.u64 %r291,%r22,%r22; add.u64 %r292,%r291,%r22; shl.b64 %r293,%r292,3; add.u64 %r210,%r289,%r293; .loc 1 292 3 mov.u64 %r91,0; add.u64 %r442,%frame,240; add.u64 %r440,%frame,360; .loc 1 299 12 mov.u64 %r456,%r91; $L64: .loc 1 294 18 add.u64 %r295,%r442,%r91; ld.u64 %r296,[%r205]; st.u64 [%r295],%r296; .loc 1 295 18 add.u64 %r297,%frame,%r91; .loc 1 295 20 ld.u64 %r299,[%r67]; mul.lo.u64 %r298,%r299,%r34; .loc 1 295 18 st.u64 [%r297],%r298; .loc 1 296 19 ld.u64 %r301,[%r205+16]; add.u64 %r300,%r301,1; ld.u64 %r302,[%r205+8]; sub.u64 %r51,%r300,%r302; .loc 1 298 10 setp.lt.s64 %r303,%r51,0; @ %r303 bra $L62; .loc 1 296 17 add.u64 %r305,%r440,%r91; st.u64 [%r305],%r51; bra $L63; $L62: .loc 1 299 12 add.u64 %r307,%r440,%r91; st.u64 [%r307],%r456; $L63: .loc 1 292 3 add.u64 %r205,%r205,24; add.u64 %r67,%r67,24; add.u64 %r91,%r91,8; setp.ne.u64 %r309,%r205,%r210; @ %r309 bra $L64; bra $L65; $L119: .loc 1 312 6 ld.u64 %r310,[%r213]; setp.eq.u64 %r311,%r310,0; @ ! %r311 bra $L67; bra $L66; $L60: add.u64 %r313,%r22,%r22; add.u64 %r314,%r313,%r22; shl.b64 %r315,%r314,3; add.u64 %r128,%r315,40; add.u64 %r133,%r214,%r128; add.u64 %r124,%r216,%r128; shl.b64 %r316,%r22,3; add.u64 %r83,%r316,-8; cvt.u32.u32 %r318,%r23; cvt.s64.s8 %r317,%r318; shl.b64 %r319,%r317,3; add.u64 %r209,%r319,-8; add.u64 %r442,%frame,240; add.u64 %r440,%frame,360; .loc 1 309 12 mov.u64 %r455,0; $L70: .loc 1 304 18 add.u64 %r321,%r442,%r83; ld.u64 %r322,[%r133]; st.u64 [%r321],%r322; .loc 1 305 18 add.u64 %r323,%frame,%r83; .loc 1 305 20 ld.u64 %r325,[%r124]; mul.lo.u64 %r324,%r325,%r34; .loc 1 305 18 st.u64 [%r323],%r324; .loc 1 306 19 ld.u64 %r327,[%r133+16]; add.u64 %r326,%r327,1; ld.u64 %r328,[%r133+8]; sub.u64 %r64,%r326,%r328; .loc 1 308 10 setp.lt.s64 %r329,%r64,0; @ %r329 bra $L68; .loc 1 306 17 add.u64 %r331,%r440,%r83; st.u64 [%r331],%r64; bra $L69; $L68: .loc 1 309 12 add.u64 %r333,%r440,%r83; st.u64 [%r333],%r455; $L69: .loc 1 302 3 add.u64 %r133,%r133,24; add.u64 %r124,%r124,24; add.u64 %r83,%r83,8; setp.ne.u64 %r335,%r83,%r209; @ %r335 bra $L70; bra $L120; $L66: .loc 1 316 7 setp.eq.u64 %r336,%r147,0; @ %r336 bra $L72; add.u64 %r440,%frame,360; $L96: add.u64 %r161,%r213,40; mov.u64 %r143,%r440; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; add.u64 %r340,%r337,%r337; add.u64 %r341,%r340,%r337; shl.b64 %r342,%r341,3; add.u64 %r343,%r213,16; add.u64 %r135,%r342,%r343; .loc 1 319 10 mov.u64 %r131,1; .loc 1 323 4 mov.u64 %r362,0; bra $L73; $L72: .loc 1 327 20 add.u32 %r344,%r23,-2; cvt.s64.s32 %r73,%r344; add.u64 %r346,%r73,%r73; add.u64 %r347,%r346,%r73; shl.b64 %r348,%r347,3; add.u64 %r349,%r213,%r348; .loc 1 327 67 shl.b64 %r351,%r73,3; add.u64 %r352,%frame,%r351; .loc 1 327 59 ld.u64 %r354,[%r349+40]; ld.u64 %r355,[%r352+360]; mul.lo.u64 %r155,%r354,%r355; .loc 1 329 24 mov.u64 %r356,0; st.u64 [%r213+8],%r356; .loc 1 330 28 cvt.u16.u32 %r359,%r23; add.u16 %r358,%r359,-1; cvt.u32.u16 %r360,%r358; st.u8 [%r213+28],%r360; .loc 1 332 10 setp.eq.u64 %r361,%r155,0; @ %r361 bra $L74; bra $L121; $L76: .loc 1 321 47 mul.lo.u64 %r131,%r70,%r131; $L73: .loc 1 323 4 st.u64 [%r161+8],%r362; ld.u64 %r70,[%r143]; add.u64 %r363,%r70,-1; st.u64 [%r161+16],%r363; st.u64 [%r161],%r131; .loc 1 316 7 add.u64 %r161,%r161,24; add.u64 %r143,%r143,8; setp.ne.u64 %r364,%r135,%r161; @ %r364 bra $L76; bra $L72; $L74: .loc 1 335 4 st.u64 [%r213+48],%r155; mov.u64 %r366,-1; st.u64 [%r213+56],%r366; mov.u64 %r367,1; st.u64 [%r213+40],%r367; .loc 
1 336 4 bra $L52; $L121: .loc 1 339 24 mov.u64 %r369369; call (%value_in),_gfortrani_xmallocarray70,[%value_in]; } .loc 1 339 22 st.u64 [%r213],%r370; bra $L77; $L67: .loc 1 344 19 ld.s8 %r372,[%r213+28]; .loc 1 344 10 setp.eq.u64 %r373,%r372,%r147; @ %r373 bra $L78; .loc 1 345 2 cvta.const.u64 %r374374_gfortran_runtime_error78: .loc 1 347 11 cvta.global.u64 %r376,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r377,[%r376+36]; setp.eq.u32 %r378,%r377,0; @ %r378 bra $L77; .loc 1 349 4 add.u64 %r440,%frame,360; cvta.const.u64 %r382,$LC2r43382; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r386,$LC386382; call _gfortrani_bounds_equal_extents$L77: .loc 1 356 3 setp.ne.u64 %r388,%r147,0; @ %r388 bra $L79; $L82: .loc 1 364 8 ld.u64 %r167,[%r213]; .loc 1 365 8 ld.u64 %r165,[%r214]; .loc 1 367 9 setp.ne.u64 %r389,%r165,0; @ %r389 bra $L80; bra $L52; $L79: add.u64 %r194,%frame,480; add.u64 %r193,%r213,40; add.u64 %r188,%frame,120; cvt.u32.u32 %r391,%r23; cvt.s64.s8 %r390,%r391; add.u64 %r174,%r390,-1; .loc 1 356 3 mov.u64 %r168,0; add.u64 %r440,%frame,360; .loc 1 358 16 mov.u64 %r392,%r168; $L81: st.u64 [%r194],%r392; .loc 1 359 18 ld.u64 %r393,[%r193]; st.u64 [%r188],%r393; .loc 1 360 17 shl.b64 %r395,%r168,3; add.u64 %r396,%r440,%r395; .loc 1 360 10 ld.u64 %r397,[%r396]; setp.le.s64 %r398,%r397,0; @ %r398 bra $L52; .loc 1 356 26 add.u64 %r168,%r168,1; .loc 1 356 3 add.u64 %r194,%r194,8; add.u64 %r193,%r193,24; add.u64 %r188,%r188,8; setp.ne.u64 %r399,%r168,%r174; @ %r399 bra $L81; bra $L82; $L80: .loc 1 290 12 mul.lo.u64 %r42,%r39,%r34; .loc 1 386 32 shl.b64 %r87,%r151,3; ld.u64 %r68,[%frame+480]; .loc 1 430 22 ld.u64 %r97,[%frame+240]; .loc 1 430 12 shl.b64 %r99,%r97,3; .loc 1 431 23 ld.u64 %r100,[%frame]; .loc 1 432 22 ld.u64 %r102,[%frame+120]; .loc 1 432 12 shl.b64 %r104,%r102,3; setp.le.s64 %r446,%r147,1; .loc 1 408 9 setp.ne.u32 %r448,%r217,0; cvt.u32.u32 %r449,%r23; cvt.s64.s8 %r450,%r449; add.u64 %r451,%r450,-1; add.u64 %r452,%frame,240; add.u64 %r453,%frame,360; add.u64 %r454,%frame,120; $L97: .loc 1 356 3 mov.u64 %r160,%r166; mov.u64 %r159,%r165; .loc 1 386 9 mov.u64 %r137,0; $L86: .loc 1 389 7 ld.s8 %r208,[%r160]; mov.u64 %r189,%r137; .loc 1 386 24 add.u64 %r137,%r137,1; .loc 1 389 6 setp.eq.u32 %r441,%r208,0; @ %r441 bra $L83; .loc 1 397 11 ld.u64 %r141,[%r159]; .loc 1 408 9 @ %r448 bra $L84; bra $L122; $L83: .loc 1 386 32 add.u64 %r159,%r159,%r87; .loc 1 386 47 add.u64 %r160,%r160,%r42; .loc 1 386 2 setp.ne.u64 %r402,%r148,%r137; @ %r402 bra $L86; .loc 1 385 9 mov.u64 %r137,0; bra $L87; $L122: .loc 1 418 8 setp.gt.s64 %r403,%r148,%r189; @ %r403 bra $L88; bra $L87; $L84: .loc 1 409 8 setp.le.s64 %r404,%r148,%r189; @ %r404 bra $L87; $L90: .loc 1 414 16 add.u64 %r189,%r189,1; .loc 1 411 8 @ %r441 bra $L89; .loc 1 411 18 ld.u64 %r88,[%r159]; .loc 1 411 15 setp.ge.s64 %r406,%r88,%r141; selp.u64 %r141,%r88,%r141,%r406; selp.u64 %r137,%r189,%r137,%r406; $L89: .loc 1 409 33 add.u64 %r159,%r159,%r87; .loc 1 409 48 add.u64 %r160,%r160,%r42; .loc 1 409 8 setp.eq.u64 %r407,%r148,%r189; @ %r407 bra $L87; .loc 1 411 9 ld.s8 %r208,[%r160]; setp.eq.u32 %r441,%r208,0; bra $L90; $L88: .loc 1 423 16 add.u64 %r189,%r189,1; .loc 1 420 8 ld.s8 %r409,[%r160]; cvt.u16.u32 %r408,%r409; setp.eq.u16 %r410,%r408,0; @ %r410 bra $L91; .loc 1 420 18 ld.u64 %r93,[%r159]; .loc 1 420 15 setp.gt.s64 %r411,%r93,%r141; selp.u64 %r141,%r93,%r141,%r411; selp.u64 %r137,%r189,%r137,%r411; $L91: .loc 1 418 33 add.u64 %r159,%r159,%r87; .loc 1 418 48 add.u64 %r160,%r160,%r42; .loc 1 418 8 setp.ne.u64 
%r412,%r148,%r189; @ %r412 bra $L88; $L87: .loc 1 426 8 st.u64 [%r167],%r137; .loc 1 429 15 add.u64 %r68,%r68,1; .loc 1 430 12 add.u64 %r165,%r165,%r99; .loc 1 431 13 add.u64 %r166,%r166,%r100; .loc 1 432 12 add.u64 %r167,%r167,%r104; .loc 1 434 32 ld.u64 %r29,[%frame+360]; .loc 1 434 13 setp.ne.u64 %r413,%r29,%r68; @ %r413 bra $L97; .loc 1 441 23 mul.lo.u64 %r106,%r68,%r97; .loc 1 442 24 mul.lo.u64 %r414,%r68,%r100; .loc 1 442 10 sub.u64 %r162,%r166,%r414; .loc 1 443 23 mul.lo.u64 %r415,%r68,%r102; .loc 1 443 9 shl.b64 %r416,%r415,3; sub.u64 %r163,%r167,%r416; .loc 1 445 7 @ %r446 bra $L52; add.u64 %r204,%frame,488; mov.u64 %r203,8; .loc 1 444 5 mov.u64 %r164,1; .loc 1 438 13 mov.u64 %r447,0; bra $L94; $L95: st.u64 [%r204],%r447; .loc 1 441 23 mul.lo.u64 %r106,%r117,%r116; .loc 1 442 24 mul.lo.u64 %r422,%r119,%r116; .loc 1 442 10 sub.u64 %r162,%r166,%r422; .loc 1 443 23 mul.lo.u64 %r423,%r121,%r116; .loc 1 443 9 shl.b64 %r424,%r423,3; sub.u64 %r163,%r167,%r424; .loc 1 444 5 add.u64 %r164,%r164,1; .loc 1 445 7 add.u64 %r204,%r204,8; add.u64 %r203,%r203,8; setp.eq.u64 %r425,%r164,%r451; @ %r425 bra $L52; $L94: .loc 1 453 16 ld.u64 %r426,[%r204]; add.u64 %r116,%r426,1; st.u64 [%r204],%r116; .loc 1 454 23 add.u64 %r428,%r452,%r203; ld.u64 %r117,[%r428]; .loc 1 454 13 sub.u64 %r429,%r117,%r106; shl.b64 %r430,%r429,3; add.u64 %r165,%r165,%r430; .loc 1 455 24 add.u64 %r431,%frame,%r203; ld.u64 %r119,[%r431]; .loc 1 455 14 add.u64 %r166,%r162,%r119; .loc 1 456 23 add.u64 %r433,%r454,%r203; ld.u64 %r121,[%r433]; .loc 1 456 13 shl.b64 %r434,%r121,3; add.u64 %r167,%r163,%r434; .loc 1 434 32 add.u64 %r436,%r453,%r203; ld.u64 %r125,[%r436]; .loc 1 434 13 setp.eq.u64 %r437,%r116,%r125; @ %r437 bra $L95; .loc 1 438 13 mov.u64 %r68,0; bra $L97; $L120: .loc 1 312 6 ld.u64 %r438,[%r213]; setp.eq.u64 %r439,%r438,0; @ ! 
%r439 bra $L67; bra $L96; $L52: .loc 1 460_gfortran_smaxloc1_8_i8 .visible .func _gfortran_smaxloc1_8_, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L124; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L125; $L124:r152154; call _gfortran_maxloc1_8_490 7 bra $L123; $L125: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r296,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L127; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L128; $L134: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L129; bra $L182; $L127: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error128: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r302,0; $L133: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L131; .loc 1 505 17 st.u64 [%r139],%r36; bra $L132; $L131: .loc 1 508 12 st.u64 [%r139],%r302; $L132: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L133; bra $L134; $L182: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L136; bra $L135; $L129: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r301,0; $L139: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L137; .loc 1 513 17 st.u64 [%r43],%r41; bra $L138; $L137: .loc 1 517 12 st.u64 [%r43],%r301; $L138: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L139; bra $L183; $L135: .loc 1 524 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L141; add.u64 %r295,%frame,120; $L161: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L142; $L141: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L143; bra $L184; $L145: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L142: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L145; bra $L141; $L143: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L123; $L184: .loc 1 547 24 mov.u64 %r24696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L149: .loc 1 573 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L146; bra $L147; $L136: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L148; .loc 1 552 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error148: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L149; .loc 1 559 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L147; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L151: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L150; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error150: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L151; $L146: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ 
%r268 bra $L152; shl.b64 %r266,%r296,3; bra $L153; $L152: mov.u64 %r266,8; $L153: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L154: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L154; $L147: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,3; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 583 13 mov.u64 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L160: st.u64 [%r98],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L160; .loc 1 596 7 @ ! %r294 bra $L185; bra $L123; $L159: .loc 1 591 13 st.u64 [%r99],%r280; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L158; bra $L123; $L185: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; $L158: .loc 1 600 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,3; add.u64 %r98,%r98,%r291; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L159; .loc 1 591 13 mov.u64 %r80,0; bra $L160; $L183: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r293,%r98,0; @ ! %r293 bra $L136; bra $L161; $L123: .loc 1 605maxloc1_16_i8.o/1622802230_gfortran_maxloc1_16_i8 .visible .func _gfortran_maxloc1_16_i8.file 1 "../../../../libgfortran/generated/maxloc1_16_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_16_i8 .visible .func _gfortran_mmaxloc1_16__gfortran_smaxloc1_16_i8 .visible .func _gfortran_smaxloc1_16_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,76,79,67,0 
},105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_maxloc1_16_i8 .visible .func _gfortran_maxloc1_16_i8predpredu64 %r231; .reg .u64 %r232; .reg .pred %r233; .reg .pred %r235; .reg .u64 %r236; .reg .pred %r237; .reg .u64u64predu64 %r264; .reg .u64 %r265; .reg .u64 %r266; .reg .u64 %r267; .reg .u64 %r268; .reg .u16 %r270; .reg .u16 %r271; .reg .u32 %r272; .reg .u32 %r273; .reg .u64 %r275; .reg .u64u64pred64pred64predpredpred %r383; .reg .u32u64 %r389; .reg .u64 %r390; mov.u64 %r183,%ar0; mov.u64 %r184,%ar1; mov.u64 %r185,%ar2; mov.u32 %r186,%ar3; .loc 1 58 10 ld.s8 %r22,[%r184+28]; .loc 1 58 38 add.u32 %r187,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r110,%r187; .loc 1 59 10 ld.u64 %r25,[%r185]; .loc 1 59 7 add.u64 %r111,%r25,-1; .loc 1 61 7 shr.u64 %r189,%r111,63; set.u32.lt.s64 %r191,%r110,%r111; neg.s32 %r192,%r191; cvt.u16.u64 %r195,%r189; cvt.u16.u32 %r196,%r192; or.b16 %r194,%r195,%r196; .loc 1 61 6 cvt.u32.u16 %r197,%r194; cvt.u16.u8 %r198,%r197; setp.eq.u16 %r199,%r198,0; @ %r199 bra $L2; .loc 1 63 7 cvt.u32.u32 %r202,%r22; cvt.s64.s8 %r201,%r202; st.u64 [%stack+8],%r2000200_gfortran_runtime_errorr205,%r111,%r111; add.u64 %r206,%r205,%r111; shl.b64 %r207,%r206,3; add.u64 %r208,%r184,%r207; ld.u64 %r31,[%r208+56]; ld.u64 %r33,[%r208+48]; .loc 1 71 9 ld.u64 %r114,[%r208+40]; .loc 1 73 3 setp.ne.u64 %r222,%r111,0; @ %r222 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r223,%r110,%r111; @ %r223 bra $L4; bra $L49; $L3: add.u64 %r109,%r184,40; add.u64 %r128,%frame,120; add.u64 %r181,%frame,240; add.u64 %r224,%r184,16; add.u64 %r226,%r25,%r25; add.u64 %r227,%r226,%r25; shl.b64 %r228,%r227,3; add.u64 %r137,%r224,%r228; .loc 1 79 12 mov.u64 %r390,0; $L8: .loc 1 75 18 ld.u64 %r229,[%r109]; st.u64 [%r128],%r229; .loc 1 76 19 ld.u64 %r231,[%r109+16]; add.u64 %r230,%r231,1; ld.u64 %r232,[%r109+8]; sub.u64 %r38,%r230,%r232; .loc 1 78 10 setp.lt.s64 %r233,%r38,0; @ %r233 bra $L6; .loc 1 76 17 st.u64 [%r181],%r38; bra $L7; $L6: .loc 1 79 12 st.u64 [%r181],%r390; $L7: .loc 1 73 3 add.u64 %r109,%r109,24; add.u64 %r128,%r128,8; add.u64 %r181,%r181,8; setp.ne.u64 %r235,%r109,%r137; @ %r235 bra $L8; bra $L9; $L49: .loc 1 90 6 ld.u64 %r236,[%r183]; setp.eq.u64 %r237,%r236,0; @ ! 
%r237 bra $L11; bra $L10; $L4: add.u64 %r239,%r25,%r25; add.u64 %r240,%r239,%r25; shl.b64 %r241,%r240,3; add.u64 %r242,%r241,40; add.u64 %r134,%r184,%r242; shl.b64 %r243,%r25,3; add.u64 %r107,%r243,-8; add.u64 %r378,%frame,120; add.u64 %r120,%r378,%r107; add.u64 %r377,%frame,240; add.u64 %r104,%r377,%r107; add.u64 %r246,%r184,40; cvt.u32.u32 %r248,%r22; cvt.s64.s8 %r247,%r248; add.u64 %r250,%r247,%r247; add.u64 %r251,%r250,%r247; shl.b64 %r252,%r251,3; add.u64 %r63,%r246,%r252; .loc 1 87 12 mov.u64 %r389,0; $L14: .loc 1 83 18 ld.u64 %r253,[%r134]; st.u64 [%r120],%r253; .loc 1 84 19 ld.u64 %r255,[%r134+16]; add.u64 %r254,%r255,1; ld.u64 %r256,[%r134+8]; sub.u64 %r45,%r254,%r256; .loc 1 86 10 setp.lt.s64 %r257,%r45,0; @ %r257 bra $L12; .loc 1 84 17 st.u64 [%r104],%r45; bra $L13; $L12: .loc 1 87 12 st.u64 [%r104],%r389; $L13: .loc 1 81 3 add.u64 %r134,%r134,24; add.u64 %r120,%r120,8; add.u64 %r104,%r104,8; setp.ne.u64 %r259,%r63,%r134; @ %r259 bra $L14; bra $L50; $L10: .loc 1 94 7 setp.le.s64 %r260,%r110,0; @ %r260 bra $L16; add.u64 %r377,%frame,240; $L37: add.u64 %r154,%r183,40; mov.u64 %r152,%r377; cvt.u32.u32 %r262,%r22; cvt.s64.s8 %r261,%r262; add.u64 %r264,%r261,%r261; add.u64 %r265,%r264,%r261; shl.b64 %r266,%r265,3; add.u64 %r267,%r183,16; add.u64 %r138,%r266,%r267; .loc 1 97 10 mov.u64 %r101,1; .loc 1 101 4 mov.u64 %r290,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r268,0; st.u64 [%r183+8],%r268; .loc 1 106 28 cvt.u16.u32 %r271,%r22; add.u16 %r270,%r271,-1; cvt.u32.u16 %r272,%r270; st.u8 [%r183+28],%r272; .loc 1 108 20 add.u32 %r273,%r22,-2; cvt.s64.s32 %r56,%r273; add.u64 %r275,%r56,%r56; add.u64 %r276,%r275,%r56; shl.b64 %r277,%r276,3; add.u64 %r278,%r183,%r277; .loc 1 108 67 shl.b64 %r280,%r56,3; add.u64 %r281,%frame,%r280; .loc 1 108 59 ld.u64 %r283,[%r278+40]; ld.u64 %r284,[%r281+240]; mul.lo.u64 %r116,%r283,%r284; .loc 1 110 29 mov.u64 %r286,16286; call (%value_in),_gfortrani_xmallocarray287,[%value_in]; } .loc 1 110 27 st.u64 [%r183],%r287; .loc 1 111 10 setp.eq.u64 %r289,%r116,0; @ ! 
%r289 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r101,%r52,%r101; $L17: .loc 1 101 4 st.u64 [%r154+8],%r290; ld.u64 %r52,[%r152]; add.u64 %r291,%r52,-1; st.u64 [%r154+16],%r291; st.u64 [%r154],%r101; .loc 1 94 7 add.u64 %r154,%r154,24; add.u64 %r152,%r152,8; setp.ne.u64 %r292,%r138,%r154; @ %r292 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r183+48],%r116; mov.u64 %r294,-1; st.u64 [%r183+56],%r294; mov.u64 %r295,1; st.u64 [%r183+40],%r295; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r62,[%r183+28]; .loc 1 121 10 setp.eq.u64 %r296,%r62,%r110; @ %r296 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r110; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r299,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r300,[%r299+36]; setp.eq.u32 %r301,%r300,0; @ %r301 bra $L21; .loc 1 128 2 add.u64 %r377,%frame,240; cvta.const.u64 %r305,$LC2call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r307,%r110,0; @ %r307 bra $L24; $L27: .loc 1 68 9 add.u64 %r308,%r31,1; .loc 1 68 7 sub.u64 %r112,%r308,%r33; max.s64 %r46,%r112,0; .loc 1 140 8 ld.u64 %r124,[%r184]; .loc 1 141 8 ld.u64 %r125,[%r183]; .loc 1 180 31 shl.b64 %r70,%r114,3; ld.u64 %r77,[%frame+360]; .loc 1 194 22 ld.u64 %r72,[%frame+120]; .loc 1 194 12 shl.b64 %r73,%r72,3; .loc 1 195 22 ld.u64 %r75,[%frame]; .loc 1 195 12 shl.b64 %r76,%r75,4; .loc 1 197 32 ld.u64 %r105,[%frame+240]; .loc 1 204 23 mul.lo.u64 %r309,%r72,%r105; .loc 1 204 9 shl.b64 %r310,%r309,3; neg.s64 %r135,%r310; .loc 1 205 23 mul.lo.u64 %r311,%r75,%r105; .loc 1 205 9 shl.b64 %r312,%r311,4; neg.s64 %r139,%r312; setp.gt.s64 %r379,%r112,0; setp.le.s64 %r380,%r110,1; mov.u64 %r382,1; setp.eq.u32 %r383,%r186,0; cvt.u32.u32 %r384,%r22; cvt.s64.s8 %r385,%r384; add.u64 %r386,%r385,-1; add.u64 %r387,%frame,120; add.u64 %r388,%frame,240; bra $L25; $L24: add.u64 %r167,%frame,360; add.u64 %r166,%r183,40; mov.u64 %r164,%frame; cvt.u32.u32 %r314,%r22; cvt.s64.s8 %r313,%r314; add.u64 %r155,%r313,-1; .loc 1 132 3 mov.u64 %r126,0; add.u64 %r377,%frame,240; .loc 1 134 16 mov.u64 %r315,%r126; $L26: st.u64 [%r167],%r315; .loc 1 135 18 ld.u64 %r316,[%r166]; st.u64 [%r164],%r316; .loc 1 136 17 shl.b64 %r318,%r126,3; add.u64 %r319,%r377,%r318; .loc 1 136 10 ld.u64 %r320,[%r319]; setp.le.s64 %r321,%r320,0; @ %r321 bra $L1; .loc 1 132 26 add.u64 %r126,%r126,1; .loc 1 132 3 add.u64 %r167,%r167,8; add.u64 %r166,%r166,24; add.u64 %r164,%r164,8; setp.ne.u64 %r322,%r126,%r155; @ %r322 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r379 bra $L38; .loc 1 159 10 mov.u64 %r324,0; st.u64 [%r125],%r324; st.u64 [%r125+8],%r324; bra $L29; $L38: mov.u64 %r119,%r124; mov.u64 %r370,%r382; mov.u64 %r371,0; .loc 1 155 9 mov.u64 %r103,-9223372036854775808; .loc 1 157 9 mov.u64 %r368,1; mov.u64 %r369,%r371; $L28: .loc 1 182 14 ld.u64 %r182,[%r119]; .loc 1 182 7 @ %r383 bra $L30; .loc 1 182 6 setp.le.s64 %r327,%r103,%r182; @ %r327 bra $L31; bra $L32; $L30: setp.ge.s64 %r328,%r103,%r182; @ %r328 bra $L32; $L31: .loc 1 185 14 mov.u64 %r368,%r370; mov.u64 %r369,%r371; .loc 1 182 14 mov.u64 %r103,%r182; $L32: .loc 1 180 31 add.u64 %r119,%r119,%r70; .loc 1 180 6 add.u64 %r331,%r370,%r382; set.u32.lt.u64 %r336,%r331,%r370; cvt.s64.s32 %r334,%r336; mov.u64 %r370,%r331; sub.u64 %r371,%r371,%r334; add.u64 %r341,%r370,-1; setp.gt.s64 %r343,%r46,%r341; @ %r343 bra $L28; .loc 1 189 12 st.u64 [%r125],%r368; st.u64 [%r125+8],%r369; $L29: .loc 1 193 15 add.u64 %r77,%r77,1; .loc 1 194 12 add.u64 %r124,%r124,%r73; .loc 1 195 12 add.u64 %r125,%r125,%r76; .loc 1 197 13 setp.ne.u64 
%r346,%r77,%r105; @ %r346 bra $L25; .loc 1 204 9 add.u64 %r121,%r124,%r135; .loc 1 205 9 add.u64 %r122,%r125,%r139; .loc 1 207 7 @ %r380 bra $L1; add.u64 %r130,%frame,368; mov.u64 %r149,8; .loc 1 206 5 mov.u64 %r123,1; .loc 1 201 13 mov.u64 %r381,0; bra $L35; $L36: st.u64 [%r130],%r381; .loc 1 204 23 mul.lo.u64 %r352,%r89,%r88; .loc 1 204 9 shl.b64 %r353,%r352,3; sub.u64 %r121,%r124,%r353; .loc 1 205 23 mul.lo.u64 %r354,%r92,%r88; .loc 1 205 9 shl.b64 %r355,%r354,4; sub.u64 %r122,%r125,%r355; .loc 1 206 5 add.u64 %r123,%r123,1; .loc 1 207 7 add.u64 %r130,%r130,8; add.u64 %r149,%r149,8; setp.eq.u64 %r356,%r123,%r386; @ %r356 bra $L1; $L35: .loc 1 215 16 ld.u64 %r357,[%r130]; add.u64 %r88,%r357,1; st.u64 [%r130],%r88; .loc 1 216 23 add.u64 %r359,%r387,%r149; ld.u64 %r89,[%r359]; .loc 1 216 13 shl.b64 %r360,%r89,3; add.u64 %r124,%r121,%r360; .loc 1 217 23 add.u64 %r361,%frame,%r149; ld.u64 %r92,[%r361]; .loc 1 217 13 shl.b64 %r362,%r92,4; add.u64 %r125,%r122,%r362; .loc 1 197 32 add.u64 %r364,%r388,%r149; ld.u64 %r95,[%r364]; .loc 1 197 13 setp.eq.u64 %r365,%r88,%r95; @ %r365 bra $L36; .loc 1 201 13 mov.u64 %r77,0; bra $L25; $L50: .loc 1 90 6 ld.u64 %r366,[%r183]; setp.eq.u64 %r367,%r366,0; @ ! %r367 bra $L11; bra $L37; $L1: .loc 1 221_gfortran_mmaxloc1_16_i8 .visible .func _gfortran_mmaxloc1_16_, .param .u32 %in_ar4) {608102u64u64 %r267; .reg .u64 %r274; .reg .pred %r275; .reg .u32 %r276; .reg .u32 %r277; .reg .u32u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .pred %r329; .reg .u64 %r330; .reg .pred %r331; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32 %r338; .reg .u64 %r339; .reg .u64 %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64pred %r355; .reg .pred64predpred %r408; .reg .pred %r409; .reg .u64 %r410; .reg .u32.reg .u64 %r417; .reg .pred %r418; .reg .predpred %r428; .reg .u64 %r432; .reg .u64pred %r460; .reg .u64 %r463; .reg .u64 %r466; .reg .u32 %r468; .reg .pred %r475; .reg .pred.reg .u32 %r504; .reg .u64 %r508; .reg .u16 %r509; .reg .u32 %r510; .reg .pred %r511; .reg .pred %r512; .reg .u64 %r515; .reg .u64 %r518; .reg .u32 %r520; .reg .pred %r527; .reg .pred %r530; .reg .pred %r533; .reg .u64pred %r542; .reg .u64 %r543; .reg .u64pred.reg .u64 %r610; .reg .u64 %r611; mov.u64 %r233,%ar0; mov.u64 %r234,%ar1; mov.u64 %r235,%ar2; mov.u64 %r236,%ar3; mov.u32 %r237,%ar4; .loc 1 251 6 setp.ne.u64 %r238,%r236,0; @ %r238 bra $L3r2235237; call _gfortran_maxloc1_16_7 bra $L51; $L52: .loc 1 261 10 ld.u64 %r22,[%r235]; .loc 1 261 7 add.u64 %r153,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r234+28]; .loc 1 262 38 add.u32 %r243,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r154,%r243; .loc 1 265 7 shr.u64 %r245,%r153,63; set.u32.gt.s64 %r247,%r153,%r154; neg.s32 %r248,%r247; cvt.u16.u64 %r251,%r245; cvt.u16.u32 %r252,%r248; or.b16 %r250,%r251,%r252; .loc 1 265 6 cvt.u32.u16 %r253,%r250; cvt.u16.u8 %r254,%r253; setp.eq.u16 %r255,%r254,0; @ %r255 bra $L54; .loc 1 267 7 cvt.u32.u32 %r258,%r23; cvt.s64.s8 %r257,%r258; st.u64 [%stack+8],%r257; st.u64 [%stack],%r22; cvta.const.u64 %r22stack; call _gfortran_runtime_error4: .loc 1 272 9 add.u64 %r596,%r153,%r153; add.u64 %r601,%r596,%r153; shl.b64 %r263,%r601,3; add.u64 %r264,%r234,%r263; ld.u64 %r267,[%r264+56]; add.u64 %r266,%r267,1; .loc 1 272 7 ld.u64 %r274,[%r264+48]; sub.u64 %r155,%r266,%r274; .loc 1 273 6 setp.le.s64 %r275,%r155,0; @ %r275 bra $L51; .loc 1 276 9 ld.u64 %r173,[%r236]; .loc 1 278 15 ld.u64 %r35,[%r236+16]; .loc 1 280 22 cvt.u32.u64 %r36,%r35; .loc 1 280 53 
add.u32 %r276,%r36,-4; and.b32 %r277,%r276,-5; set.u32.eq.u32 %r279,%r277,0; neg.s32 %r280,%r279; .loc 1 280 22 add.u32 %r281,%r36,-1; set.u32.le.u32 %r283,%r281,1; neg.s32 %r284,%r283; .loc 1 280 6 cvt.u16.u32 %r286,%r280; cvt.u16.u32 %r287,%r284; or.b16 %r285,%r286,%r287;56; .loc 1 282 7 setp.ne.u32 %r292,%r36,16; @ %r292 bra $L57; $L56: .loc 1 289 9 shl.b64 %r296,%r601,3; add.u64 %r297,%r234,%r296; ld.u64 %r158,[%r297+40]; .loc 1 290 12 add.u64 %r303,%r236,%r296; ld.u64 %r41,[%r303+40]; .loc 1 292 3 setp.ne.u64 %r305,%r153,0; @ %r305 bra $L58; $L64: .loc 1 302 3 setp.lt.s64 %r306,%r153,%r154; @ %r306 bra $L59; bra $L118; $L57: .loc 1 287 5 cvta.const.u64 %r307307_gfortran_runtime_error8: add.u64 %r164,%r234,40; add.u64 %r150,%r236,40; add.u64 %r309,%r234,16; add.u64 %r311,%r22,%r22; add.u64 %r312,%r311,%r22; shl.b64 %r313,%r312,3; add.u64 %r223,%r309,%r313; .loc 1 292 3 mov.u64 %r53,0; add.u64 %r597,%frame,240; add.u64 %r600,%frame,360; .loc 1 299 12 mov.u64 %r611,%r53; $L63: .loc 1 294 18 add.u64 %r315,%r597,%r53; ld.u64 %r316,[%r164]; st.u64 [%r315],%r316; .loc 1 295 18 add.u64 %r317,%frame,%r53; .loc 1 295 20 ld.u64 %r319,[%r150]; mul.lo.u64 %r318,%r319,%r35; .loc 1 295 18 st.u64 [%r317],%r318; .loc 1 296 19 ld.u64 %r321,[%r164+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r164+8]; sub.u64 %r52,%r320,%r322; .loc 1 298 10 setp.lt.s64 %r323,%r52,0; @ %r323 bra $L61; .loc 1 296 17 add.u64 %r325,%r600,%r53; st.u64 [%r325],%r52; bra $L62; $L61: .loc 1 299 12 add.u64 %r327,%r600,%r53; st.u64 [%r327],%r611; $L62: .loc 1 292 3 add.u64 %r164,%r164,24; add.u64 %r150,%r150,24; add.u64 %r53,%r53,8; setp.ne.u64 %r329,%r164,%r223; @ %r329 bra $L63; bra $L64; $L118: .loc 1 312 6 ld.u64 %r330,[%r233]; setp.eq.u64 %r331,%r330,0; @ ! %r331 bra $L66; bra $L65; $L59: add.u64 %r333,%r22,%r22; add.u64 %r334,%r333,%r22; shl.b64 %r335,%r334,3; add.u64 %r178,%r335,40; add.u64 %r59,%r234,%r178; add.u64 %r180,%r236,%r178; shl.b64 %r336,%r22,3; add.u64 %r91,%r336,-8; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; shl.b64 %r339,%r337,3; add.u64 %r97,%r339,-8; add.u64 %r597,%frame,240; add.u64 %r600,%frame,360; .loc 1 309 12 mov.u64 %r610,0; $L69: .loc 1 304 18 add.u64 %r341,%r597,%r91; ld.u64 %r342,[%r59]; st.u64 [%r341],%r342; .loc 1 305 18 add.u64 %r343,%frame,%r91; .loc 1 305 20 ld.u64 %r345,[%r180]; mul.lo.u64 %r344,%r345,%r35; .loc 1 305 18 st.u64 [%r343],%r344; .loc 1 306 19 ld.u64 %r347,[%r59+16]; add.u64 %r346,%r347,1; ld.u64 %r348,[%r59+8]; sub.u64 %r66,%r346,%r348; .loc 1 308 10 setp.lt.s64 %r349,%r66,0; @ %r349 bra $L67; .loc 1 306 17 add.u64 %r351,%r600,%r91; st.u64 [%r351],%r66; bra $L68; $L67: .loc 1 309 12 add.u64 %r353,%r600,%r91; st.u64 [%r353],%r610; $L68: .loc 1 302 3 add.u64 %r59,%r59,24; add.u64 %r180,%r180,24; add.u64 %r91,%r91,8; setp.ne.u64 %r355,%r91,%r97; @ %r355 bra $L69; bra $L119; $L65: .loc 1 316 7 setp.eq.u64 %r356,%r154,0; @ %r356 bra $L71; add.u64 %r600,%frame,360; $L97: add.u64 %r137,%r233,40; mov.u64 %r134,%r600; cvt.u32.u32 %r358,%r23; cvt.s64.s8 %r357,%r358; add.u64 %r360,%r357,%r357; add.u64 %r361,%r360,%r357; shl.b64 %r362,%r361,3; add.u64 %r363,%r233,16; add.u64 %r64,%r362,%r363; .loc 1 319 10 mov.u64 %r136,1; .loc 1 323 4 mov.u64 %r382,0; bra $L72; $L71: .loc 1 327 20 add.u32 %r364,%r23,-2; cvt.s64.s32 %r75,%r364; add.u64 %r366,%r75,%r75; add.u64 %r367,%r366,%r75; shl.b64 %r368,%r367,3; add.u64 %r369,%r233,%r368; .loc 1 327 67 shl.b64 %r371,%r75,3; add.u64 %r372,%frame,%r371; .loc 1 327 59 ld.u64 %r374,[%r369+40]; ld.u64 %r375,[%r372+360]; mul.lo.u64 %r162,%r374,%r375; .loc 
1 329 24 mov.u64 %r376,0; st.u64 [%r233+8],%r376; .loc 1 330 28 cvt.u16.u32 %r379,%r23; add.u16 %r378,%r379,-1; cvt.u32.u16 %r380,%r378; st.u8 [%r233+28],%r380; .loc 1 332 10 setp.eq.u64 %r381,%r162,0; @ %r381 bra $L73; bra $L120; $L75: .loc 1 321 47 mul.lo.u64 %r136,%r72,%r136; $L72: .loc 1 323 4 st.u64 [%r137+8],%r382; ld.u64 %r72,[%r134]; add.u64 %r383,%r72,-1; st.u64 [%r137+16],%r383; st.u64 [%r137],%r136; .loc 1 316 7 add.u64 %r137,%r137,24; add.u64 %r134,%r134,8; setp.ne.u64 %r384,%r64,%r137; @ %r384 bra $L75; bra $L71; $L73: .loc 1 335 4 st.u64 [%r233+48],%r162; mov.u64 %r386,-1; st.u64 [%r233+56],%r386; mov.u64 %r387,1; st.u64 [%r233+40],%r387; .loc 1 336 4 bra $L51; $L120: .loc 1 339 24 mov.u64 %r389,16389; call (%value_in),_gfortrani_xmallocarray90,[%value_in]; } .loc 1 339 22 st.u64 [%r233],%r390; bra $L76; $L66: .loc 1 344 19 ld.s8 %r392,[%r233+28]; .loc 1 344 10 setp.eq.u64 %r393,%r392,%r154; @ %r393 bra $L77; .loc 1 345 2 cvta.const.u64 %r3944_gfortran_runtime_error77: .loc 1 347 11 cvta.global.u64 %r396,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r397,[%r396+36]; setp.eq.u32 %r398,%r397,0; @ %r398 bra $L76; .loc 1 349 4 add.u64 %r600,%frame,360; cvta.const.u64 %r402,$LC2r600002; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r406,$LC406402; call _gfortrani_bounds_equal_extents$L76: .loc 1 356 3 setp.ne.u64 %r408,%r154,0; @ %r408 bra $L78; $L81: .loc 1 364 8 ld.u64 %r174,[%r233]; .loc 1 365 8 ld.u64 %r172,[%r234]; .loc 1 367 9 setp.ne.u64 %r409,%r172,0; @ %r409 bra $L79; bra $L51; $L78: add.u64 %r188,%frame,480; add.u64 %r181,%r233,40; add.u64 %r149,%frame,120; cvt.u32.u32 %r411,%r23; cvt.s64.s8 %r410,%r411; add.u64 %r138,%r410,-1; .loc 1 356 3 mov.u64 %r175,0; add.u64 %r600,%frame,360; .loc 1 358 16 mov.u64 %r412,%r175; $L80: st.u64 [%r188],%r412; .loc 1 359 18 ld.u64 %r413,[%r181]; st.u64 [%r149],%r413; .loc 1 360 17 shl.b64 %r415,%r175,3; add.u64 %r416,%r600,%r415; .loc 1 360 10 ld.u64 %r417,[%r416]; setp.le.s64 %r418,%r417,0; @ %r418 bra $L51; .loc 1 356 26 add.u64 %r175,%r175,1; .loc 1 356 3 add.u64 %r188,%r188,8; add.u64 %r181,%r181,24; add.u64 %r149,%r149,8; setp.ne.u64 %r419,%r138,%r175; @ %r419 bra $L80; bra $L81; $L79: .loc 1 290 12 mul.lo.u64 %r43,%r41,%r35; .loc 1 386 32 shl.b64 %r88,%r158,3; ld.u64 %r189,[%frame+480]; .loc 1 430 22 ld.u64 %r102,[%frame+240]; .loc 1 430 12 shl.b64 %r103,%r102,3; .loc 1 431 23 ld.u64 %r105,[%frame]; .loc 1 432 22 ld.u64 %r107,[%frame+120]; .loc 1 432 12 shl.b64 %r108,%r107,4; .loc 1 434 32 ld.u64 %r29,[%frame+360]; .loc 1 441 23 mul.lo.u64 %r39,%r29,%r102; .loc 1 442 24 mul.lo.u64 %r420,%r29,%r105; .loc 1 442 10 neg.s64 %r92,%r420; .loc 1 443 23 mul.lo.u64 %r421,%r29,%r107; .loc 1 443 9 shl.b64 %r422,%r421,4; neg.s64 %r185,%r422; setp.le.s64 %r595,%r154,1; .loc 1 408 9 setp.ne.u32 %r603,%r237,0; add.u64 %r604,%r155,-1; cvt.u32.u32 %r605,%r23; cvt.s64.s8 %r606,%r605; add.u64 %r607,%r606,-1; add.u64 %r608,%frame,240; add.u64 %r609,%frame,360; $L98: .loc 1 356 3 mov.u64 %r168,%r173; mov.u64 %r167,%r172; .loc 1 386 9 mov.u64 %r197,0; $L85: .loc 1 389 7 ld.s8 %r224,[%r168]; mov.u64 %r230,%r197; .loc 1 386 24 add.u64 %r197,%r197,1; .loc 1 389 6 setp.eq.u32 %r598,%r224,0; @ %r598 bra $L82; .loc 1 397 11 ld.u64 %r145,[%r167]; .loc 1 398 11 mov.u64 %r557,%r197; shr.s64 %r558,%r557,63; .loc 1 408 9 @ %r603 bra $L83; bra $L121; $L82: .loc 1 386 32 add.u64 %r167,%r167,%r88; .loc 1 386 47 add.u64 %r168,%r168,%r43; .loc 1 386 2 setp.ne.u64 %r426,%r155,%r197; @ %r426 bra $L85; .loc 1 385 9 mov.u64 %r557,0; mov.u64 %r558,%r557; bra 
$L86; $L121: .loc 1 418 8 setp.gt.s64 %r427,%r155,%r230; @ %r427 bra $L87; bra $L86; $L83: .loc 1 409 8 setp.le.s64 %r428,%r155,%r230; @ %r428 bra $L86; add.u64 %r565,%r230,1; shr.s64 %r566,%r565,63; sub.u64 %r432,%r604,%r230; shr.s64 %r435,%r230,63; add.u64 %r438,%r230,2; set.u32.lt.u64 %r443,%r438,%r230; cvt.s64.s32 %r441,%r443; sub.u64 %r447,%r435,%r441; add.u64 %r449,%r432,%r438; set.u32.lt.u64 %r454,%r449,%r432; cvt.s64.s32 %r452,%r454; sub.u64 %r458,%r447,%r452; mov.u64 %r577,1; $L90: .loc 1 411 8 @ %r598 bra $L88; .loc 1 411 18 ld.u64 %r89,[%r167]; .loc 1 411 15 setp.lt.s64 %r460,%r89,%r145; .loc 1 414 16 selp.u64 %r557,%r557,%r565,%r460; selp.u64 %r558,%r558,%r566,%r460; selp.u64 %r145,%r145,%r89,%r460; $L88: .loc 1 409 33 add.u64 %r167,%r167,%r88; .loc 1 409 48 add.u64 %r168,%r168,%r43; .loc 1 409 8 add.u64 %r463,%r565,%r577; set.u32.lt.u64 %r468,%r463,%r565; cvt.s64.s32 %r466,%r468; mov.u64 %r565,%r463; sub.u64 %r566,%r566,%r466; setp.ne.u64 %r475,%r449,%r565; @ %r475 bra $L99; setp.ne.u64 %r478,%r458,%r566; @ ! %r478 bra $L86; $L99: .loc 1 411 9 ld.s8 %r224,[%r168]; setp.eq.u32 %r598,%r224,0; bra $L90; $L87: add.u64 %r561,%r230,1; shr.s64 %r562,%r561,63; sub.u64 %r482,%r604,%r230; shr.s64 %r485,%r230,63; add.u64 %r488,%r230,2; set.u32.lt.u64 %r493,%r488,%r230; cvt.s64.s32 %r491,%r493; sub.u64 %r497,%r485,%r491; add.u64 %r499,%r482,%r488; set.u32.lt.u64 %r504,%r499,%r482; cvt.s64.s32 %r502,%r504; sub.u64 %r508,%r497,%r502; .loc 1 418 8 mov.u64 %r591,1; $L113: .loc 1 420 8 ld.s8 %r510,[%r168]; cvt.u16.u32 %r509,%r510; setp.eq.u16 %r511,%r509,0; @ %r511 bra $L91; .loc 1 420 18 ld.u64 %r94,[%r167]; .loc 1 420 15 setp.le.s64 %r512,%r94,%r145; .loc 1 423 16 selp.u64 %r557,%r557,%r561,%r512; selp.u64 %r558,%r558,%r562,%r512; selp.u64 %r145,%r145,%r94,%r512; $L91: .loc 1 418 33 add.u64 %r167,%r167,%r88; .loc 1 418 48 add.u64 %r168,%r168,%r43; .loc 1 418 8 add.u64 %r515,%r561,%r591; set.u32.lt.u64 %r520,%r515,%r561; cvt.s64.s32 %r518,%r520; mov.u64 %r561,%r515; sub.u64 %r562,%r562,%r518; setp.ne.u64 %r527,%r499,%r561; @ %r527 bra $L113; setp.ne.u64 %r530,%r508,%r562; @ %r530 bra $L113; $L86: .loc 1 426 8 st.u64 [%r174],%r557; st.u64 [%r174+8],%r558; .loc 1 429 15 add.u64 %r189,%r189,1; .loc 1 430 12 add.u64 %r172,%r172,%r103; .loc 1 431 13 add.u64 %r173,%r173,%r105; .loc 1 432 12 add.u64 %r174,%r174,%r108; .loc 1 434 13 setp.ne.u64 %r533,%r29,%r189; @ %r533 bra $L98; .loc 1 442 10 add.u64 %r169,%r173,%r92; .loc 1 443 9 add.u64 %r170,%r174,%r185; .loc 1 445 7 @ %r595 bra $L51; add.u64 %r219,%frame,488; .loc 1 441 23 mov.u64 %r112,%r39; .loc 1 445 7 mov.u64 %r218,8; .loc 1 444 5 mov.u64 %r171,1; add.u64 %r599,%frame,120; .loc 1 438 13 mov.u64 %r602,0; bra $L95; $L96: st.u64 [%r219],%r602; .loc 1 441 23 mul.lo.u64 %r112,%r125,%r124; .loc 1 442 24 mul.lo.u64 %r539,%r127,%r124; .loc 1 442 10 sub.u64 %r169,%r173,%r539; .loc 1 443 23 mul.lo.u64 %r540,%r129,%r124; .loc 1 443 9 shl.b64 %r541,%r540,4; sub.u64 %r170,%r174,%r541; .loc 1 444 5 add.u64 %r171,%r171,1; .loc 1 445 7 add.u64 %r219,%r219,8; add.u64 %r218,%r218,8; setp.eq.u64 %r542,%r171,%r607; @ %r542 bra $L51; $L95: .loc 1 453 16 ld.u64 %r543,[%r219]; add.u64 %r124,%r543,1; st.u64 [%r219],%r124; .loc 1 454 23 add.u64 %r545,%r608,%r218; ld.u64 %r125,[%r545]; .loc 1 454 13 sub.u64 %r546,%r125,%r112; shl.b64 %r547,%r546,3; add.u64 %r172,%r172,%r547; .loc 1 455 24 add.u64 %r548,%frame,%r218; ld.u64 %r127,[%r548]; .loc 1 455 14 add.u64 %r173,%r169,%r127; .loc 1 456 23 add.u64 %r550,%r599,%r218; ld.u64 %r129,[%r550]; .loc 1 456 13 shl.b64 
[Binary archive residue: tail of the preceding object-file member, containing the remainder of the PTX bodies for the _gfortran_*maxloc1_16_* entry points (including _gfortran_smaxloc1_16_i8), with calls to _gfortran_runtime_error and _gfortrani_xmallocarray. The PTX instruction stream is corrupted in extraction and is not reproducible as text.]
[Binary archive residue: archive member maxloc1_4_i16.o/1622802230, compiled from ../../../../libgfortran/generated/maxloc1_4_i16.c. It declares and defines the PTX functions _gfortran_maxloc1_4_i16, _gfortran_mmaxloc1_4_i16 and _gfortran_smaxloc1_4_i16, with extern references to _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents, plus string constants carrying the MAXLOC bounds-check messages ("MAXLOC", "rank of return array incorrect in MAXLOC intrinsic: is %ld, should be ...", "Funny sized logical array", "MASK argument ...", "Incorrect extent in return value of MAXLOC intrinsic in dimension %ld: is %ld, should be ..."). The PTX instruction stream itself is corrupted in extraction and is not reproducible as text.]
[Binary archive residue: archive member maxloc1_8_i16.o/1622802230, compiled from ../../../../libgfortran/generated/maxloc1_8_i16.c. It declares and defines the PTX functions _gfortran_maxloc1_8_i16, _gfortran_mmaxloc1_8_i16 and _gfortran_smaxloc1_8_i16, with the same extern references (_gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents) and the same MAXLOC error-message string constants as the preceding member. The PTX instruction stream is corrupted in extraction and is not reproducible as text.]
%r268 bra $L156; shl.b64 %r266,%r296,3; bra $L157; $L156: mov.u64 %r266,8; $L157: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L158: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L158; $L151: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,3; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 583 13 mov.u64 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L164: st.u64 [%r98],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L164; .loc 1 596 7 @ ! %r294 bra $L189; bra $L127; $L163: .loc 1 591 13 st.u64 [%r99],%r280; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L162; bra $L127; $L189: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; $L162: .loc 1 600 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,3; add.u64 %r98,%r98,%r291; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L163; .loc 1 591 13 mov.u64 %r80,0; bra $L164; $L187: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r293,%r98,0; @ ! %r293 bra $L140; bra $L165; $L127: .loc 1 60573 _gfortran_maxloc1_16_i16 .visible .func _gfortran_maxloc1_16_i16.file 1 "../../../../libgfortran/generated/maxloc1_16_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_16_i16 .visible .func _gfortran_mmaxloc1_16_i16_gfortran_smaxloc1_16_i16 .visible .func _gfortran_smaxloc1_16_i16VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,76,79,67,0 
},105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_maxloc1_16_i16 .visible .func _gfortran_maxloc1_16_i1695u64 %r107predpredu64 %r231; .reg .u64 %r232; .reg .pred %r233; .reg .pred %r235; .reg .u64 %r236; .reg .pred %r237; .reg .u64u64predu64 %r264; .reg .u64 %r265; .reg .u64 %r266; .reg .u64 %r267; .reg .u64 %r268; .reg .u16 %r270; .reg .u16 %r271; .reg .u32 %r272; .reg .u32 %r273; .reg .u64 %r275; .reg .u64u64pred64pred %r329; .reg .pred %r332; .reg .pred %r335; .reg .predpred %r359; .reg .predpredu64predpred %r403; .reg .u32 %r404; .reg .u64 %r405; .reg .u64 %r406; .reg .u64 %r407; .reg .u64 %r408; .reg .u64 %r409; .reg .u64 %r410; mov.u64 %r183,%ar0; mov.u64 %r184,%ar1; mov.u64 %r185,%ar2; mov.u32 %r186,%ar3; .loc 1 58 10 ld.s8 %r22,[%r184+28]; .loc 1 58 38 add.u32 %r187,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r110,%r187; .loc 1 59 10 ld.u64 %r25,[%r185]; .loc 1 59 7 add.u64 %r111,%r25,-1; .loc 1 61 7 shr.u64 %r189,%r111,63; set.u32.lt.s64 %r191,%r110,%r111; neg.s32 %r192,%r191; cvt.u16.u64 %r195,%r189; cvt.u16.u32 %r196,%r192; or.b16 %r194,%r195,%r196; .loc 1 61 6 cvt.u32.u16 %r197,%r194; cvt.u16.u8 %r198,%r197; setp.eq.u16 %r199,%r198,0; @ %r199 bra $L2; .loc 1 63 7 cvt.u32.u32 %r202,%r22; cvt.s64.s8 %r201,%r202; st.u64 [%stack+8],%r2000200_gfortran_runtime_errorr205,%r111,%r111; add.u64 %r206,%r205,%r111; shl.b64 %r207,%r206,3; add.u64 %r208,%r184,%r207; ld.u64 %r31,[%r208+56]; ld.u64 %r33,[%r208+48]; .loc 1 71 9 ld.u64 %r114,[%r208+40]; .loc 1 73 3 setp.ne.u64 %r222,%r111,0; @ %r222 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r223,%r110,%r111; @ %r223 bra $L4; bra $L51; $L3: add.u64 %r109,%r184,40; add.u64 %r128,%frame,120; add.u64 %r181,%frame,240; add.u64 %r224,%r184,16; add.u64 %r226,%r25,%r25; add.u64 %r227,%r226,%r25; shl.b64 %r228,%r227,3; add.u64 %r137,%r224,%r228; .loc 1 79 12 mov.u64 %r410,0; $L8: .loc 1 75 18 ld.u64 %r229,[%r109]; st.u64 [%r128],%r229; .loc 1 76 19 ld.u64 %r231,[%r109+16]; add.u64 %r230,%r231,1; ld.u64 %r232,[%r109+8]; sub.u64 %r38,%r230,%r232; .loc 1 78 10 setp.lt.s64 %r233,%r38,0; @ %r233 bra $L6; .loc 1 76 17 st.u64 [%r181],%r38; bra $L7; $L6: .loc 1 79 12 st.u64 [%r181],%r410; $L7: .loc 1 73 3 add.u64 %r109,%r109,24; add.u64 %r128,%r128,8; add.u64 %r181,%r181,8; setp.ne.u64 %r235,%r109,%r137; @ %r235 bra $L8; bra $L9; $L51: .loc 1 90 6 ld.u64 %r236,[%r183]; setp.eq.u64 %r237,%r236,0; @ ! 
%r237 bra $L11; bra $L10; $L4: add.u64 %r239,%r25,%r25; add.u64 %r240,%r239,%r25; shl.b64 %r241,%r240,3; add.u64 %r242,%r241,40; add.u64 %r134,%r184,%r242; shl.b64 %r243,%r25,3; add.u64 %r107,%r243,-8; add.u64 %r398,%frame,120; add.u64 %r120,%r398,%r107; add.u64 %r397,%frame,240; add.u64 %r104,%r397,%r107; add.u64 %r246,%r184,40; cvt.u32.u32 %r248,%r22; cvt.s64.s8 %r247,%r248; add.u64 %r250,%r247,%r247; add.u64 %r251,%r250,%r247; shl.b64 %r252,%r251,3; add.u64 %r63,%r246,%r252; .loc 1 87 12 mov.u64 %r409,0; $L14: .loc 1 83 18 ld.u64 %r253,[%r134]; st.u64 [%r120],%r253; .loc 1 84 19 ld.u64 %r255,[%r134+16]; add.u64 %r254,%r255,1; ld.u64 %r256,[%r134+8]; sub.u64 %r45,%r254,%r256; .loc 1 86 10 setp.lt.s64 %r257,%r45,0; @ %r257 bra $L12; .loc 1 84 17 st.u64 [%r104],%r45; bra $L13; $L12: .loc 1 87 12 st.u64 [%r104],%r409; $L13: .loc 1 81 3 add.u64 %r134,%r134,24; add.u64 %r120,%r120,8; add.u64 %r104,%r104,8; setp.ne.u64 %r259,%r63,%r134; @ %r259 bra $L14; bra $L52; $L10: .loc 1 94 7 setp.le.s64 %r260,%r110,0; @ %r260 bra $L16; add.u64 %r397,%frame,240; $L39: add.u64 %r154,%r183,40; mov.u64 %r152,%r397; cvt.u32.u32 %r262,%r22; cvt.s64.s8 %r261,%r262; add.u64 %r264,%r261,%r261; add.u64 %r265,%r264,%r261; shl.b64 %r266,%r265,3; add.u64 %r267,%r183,16; add.u64 %r138,%r266,%r267; .loc 1 97 10 mov.u64 %r101,1; .loc 1 101 4 mov.u64 %r290,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r268,0; st.u64 [%r183+8],%r268; .loc 1 106 28 cvt.u16.u32 %r271,%r22; add.u16 %r270,%r271,-1; cvt.u32.u16 %r272,%r270; st.u8 [%r183+28],%r272; .loc 1 108 20 add.u32 %r273,%r22,-2; cvt.s64.s32 %r56,%r273; add.u64 %r275,%r56,%r56; add.u64 %r276,%r275,%r56; shl.b64 %r277,%r276,3; add.u64 %r278,%r183,%r277; .loc 1 108 67 shl.b64 %r280,%r56,3; add.u64 %r281,%frame,%r280; .loc 1 108 59 ld.u64 %r283,[%r278+40]; ld.u64 %r284,[%r281+240]; mul.lo.u64 %r116,%r283,%r284; .loc 1 110 29 mov.u64 %r286,16286; call (%value_in),_gfortrani_xmallocarray287,[%value_in]; } .loc 1 110 27 st.u64 [%r183],%r287; .loc 1 111 10 setp.eq.u64 %r289,%r116,0; @ ! 
%r289 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r101,%r52,%r101; $L17: .loc 1 101 4 st.u64 [%r154+8],%r290; ld.u64 %r52,[%r152]; add.u64 %r291,%r52,-1; st.u64 [%r154+16],%r291; st.u64 [%r154],%r101; .loc 1 94 7 add.u64 %r154,%r154,24; add.u64 %r152,%r152,8; setp.ne.u64 %r292,%r138,%r154; @ %r292 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r183+48],%r116; mov.u64 %r294,-1; st.u64 [%r183+56],%r294; mov.u64 %r295,1; st.u64 [%r183+40],%r295; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r62,[%r183+28]; .loc 1 121 10 setp.eq.u64 %r296,%r62,%r110; @ %r296 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r110; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r299,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r300,[%r299+36]; setp.eq.u32 %r301,%r300,0; @ %r301 bra $L21; .loc 1 128 2 add.u64 %r397,%frame,240; cvta.const.u64 %r305,$LC2304305; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r307,%r110,0; @ %r307 bra $L24; $L27: .loc 1 68 9 add.u64 %r308,%r31,1; .loc 1 68 7 sub.u64 %r112,%r308,%r33; max.s64 %r46,%r112,0; .loc 1 140 8 ld.u64 %r124,[%r184]; .loc 1 141 8 ld.u64 %r125,[%r183]; .loc 1 180 31 shl.b64 %r70,%r114,4; ld.u64 %r77,[%frame+360]; .loc 1 194 22 ld.u64 %r72,[%frame+120]; .loc 1 194 12 shl.b64 %r73,%r72,4; .loc 1 195 22 ld.u64 %r75,[%frame]; .loc 1 195 12 shl.b64 %r76,%r75,4; .loc 1 197 32 ld.u64 %r105,[%frame+240]; .loc 1 204 23 mul.lo.u64 %r309,%r72,%r105; .loc 1 204 9 shl.b64 %r310,%r309,4; neg.s64 %r135,%r310; .loc 1 205 23 mul.lo.u64 %r311,%r75,%r105; .loc 1 205 9 shl.b64 %r312,%r311,4; neg.s64 %r139,%r312; setp.gt.s64 %r399,%r112,0; setp.le.s64 %r400,%r110,1; mov.u64 %r402,1; setp.eq.u32 %r403,%r186,0; cvt.u32.u32 %r404,%r22; cvt.s64.s8 %r405,%r404; add.u64 %r406,%r405,-1; add.u64 %r407,%frame,120; add.u64 %r408,%frame,240; bra $L25; $L24: add.u64 %r167,%frame,360; add.u64 %r166,%r183,40; mov.u64 %r164,%frame; cvt.u32.u32 %r314,%r22; cvt.s64.s8 %r313,%r314; add.u64 %r155,%r313,-1; .loc 1 132 3 mov.u64 %r126,0; add.u64 %r397,%frame,240; .loc 1 134 16 mov.u64 %r315,%r126; $L26: st.u64 [%r167],%r315; .loc 1 135 18 ld.u64 %r316,[%r166]; st.u64 [%r164],%r316; .loc 1 136 17 shl.b64 %r318,%r126,3; add.u64 %r319,%r397,%r318; .loc 1 136 10 ld.u64 %r320,[%r319]; setp.le.s64 %r321,%r320,0; @ %r321 bra $L1; .loc 1 132 26 add.u64 %r126,%r126,1; .loc 1 132 3 add.u64 %r167,%r167,8; add.u64 %r166,%r166,24; add.u64 %r164,%r164,8; setp.ne.u64 %r322,%r126,%r155; @ %r322 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r399 bra $L40; .loc 1 159 10 mov.u64 %r324,0; st.u64 [%r125],%r324; st.u64 [%r125+8],%r324; bra $L29; $L40: mov.u64 %r119,%r124; mov.u64 %r388,%r402; mov.u64 %r389,0; .loc 1 155 9 mov.u64 %r386,%r389; mov.u64 %r387,-9223372036854775808; .loc 1 157 9 mov.u64 %r384,1; mov.u64 %r385,%r389; $L28: .loc 1 182 14 ld.u64 %r390,[%r119]; ld.u64 %r391,[%r119+8]; .loc 1 182 7 @ %r403 bra $L30; .loc 1 182 6 setp.gt.s64 %r329,%r387,%r391; @ %r329 bra $L33; setp.ne.u64 %r332,%r387,%r391; @ %r332 bra $L31; setp.gt.u64 %r335,%r386,%r390; @ %r335 bra $L33; bra $L31; $L30: setp.gt.s64 %r338,%r391,%r387; @ %r338 bra $L31; setp.ne.u64 %r341,%r391,%r387; @ %r341 bra $L33; setp.gt.u64 %r344,%r390,%r386; @ ! 
%r344 bra $L33; $L31: .loc 1 185 14 mov.u64 %r384,%r388; mov.u64 %r385,%r389; .loc 1 182 14 mov.u64 %r386,%r390; mov.u64 %r387,%r391; $L33: .loc 1 180 31 add.u64 %r119,%r119,%r70; .loc 1 180 6 add.u64 %r347,%r388,%r402; set.u32.lt.u64 %r352,%r347,%r388; cvt.s64.s32 %r350,%r352; mov.u64 %r388,%r347; sub.u64 %r389,%r389,%r350; add.u64 %r357,%r388,-1; setp.gt.s64 %r359,%r46,%r357; @ %r359 bra $L28; .loc 1 189 12 st.u64 [%r125],%r384; st.u64 [%r125+8],%r385; $L29: .loc 1 193 15 add.u64 %r77,%r77,1; .loc 1 194 12 add.u64 %r124,%r124,%r73; .loc 1 195 12 add.u64 %r125,%r125,%r76; .loc 1 197 13 setp.ne.u64 %r362,%r77,%r105; @ %r362 bra $L25; .loc 1 204 9 add.u64 %r121,%r124,%r135; .loc 1 205 9 add.u64 %r122,%r125,%r139; .loc 1 207 7 @ %r400 bra $L1; add.u64 %r130,%frame,368; mov.u64 %r149,8; .loc 1 206 5 mov.u64 %r123,1; .loc 1 201 13 mov.u64 %r401,0; bra $L37; $L38: st.u64 [%r130],%r401; .loc 1 204 23 mul.lo.u64 %r368,%r89,%r88; .loc 1 204 9 shl.b64 %r369,%r368,4; sub.u64 %r121,%r124,%r369; .loc 1 205 23 mul.lo.u64 %r370,%r92,%r88; .loc 1 205 9 shl.b64 %r371,%r370,4; sub.u64 %r122,%r125,%r371; .loc 1 206 5 add.u64 %r123,%r123,1; .loc 1 207 7 add.u64 %r130,%r130,8; add.u64 %r149,%r149,8; setp.eq.u64 %r372,%r123,%r406; @ %r372 bra $L1; $L37: .loc 1 215 16 ld.u64 %r373,[%r130]; add.u64 %r88,%r373,1; st.u64 [%r130],%r88; .loc 1 216 23 add.u64 %r375,%r407,%r149; ld.u64 %r89,[%r375]; .loc 1 216 13 shl.b64 %r376,%r89,4; add.u64 %r124,%r121,%r376; .loc 1 217 23 add.u64 %r377,%frame,%r149; ld.u64 %r92,[%r377]; .loc 1 217 13 shl.b64 %r378,%r92,4; add.u64 %r125,%r122,%r378; .loc 1 197 32 add.u64 %r380,%r408,%r149; ld.u64 %r95,[%r380]; .loc 1 197 13 setp.eq.u64 %r381,%r88,%r95; @ %r381 bra $L38; .loc 1 201 13 mov.u64 %r77,0; bra $L25; $L52: .loc 1 90 6 ld.u64 %r382,[%r183]; setp.eq.u64 %r383,%r382,0; @ ! 
%r383 bra $L11; bra $L39; $L1: .loc 1 221_gfortran_mmaxloc1_16_i16 .visible .func _gfortran_mmaxloc1_16_i16, .param .u32 %in_ar4) {6087u64u64 %r267; .reg .u64 %r274; .reg .pred %r275; .reg .u32 %r276; .reg .u32 %r277; .reg .u32u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .pred %r329; .reg .u64 %r330; .reg .pred %r331; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32 %r338; .reg .u64 %r339; .reg .u64 %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64pred %r355; .reg .pred64predpred %r408; .reg .pred %r409; .reg .u64 %r410; .reg .u32.reg .u64 %r417; .reg .pred %r418; .reg .predpred %r428; .reg .u64 %r432; .reg .u64pred %r462; .reg .pred %r465; .reg .predpred %r483; .reg .pred.reg .u32 %r512; .reg .u64 %r516; .reg .u16 %r517; .reg .u32 %r518; .reg .pred %r519; .reg .pred %r522; .reg .pred %r525; .reg .predu32 %r536; .reg .pred %r543; .reg .pred %r546; .reg .pred %r549; .reg .u64 %r555; .reg .u64 %r556; .reg .u64 %r557; .reg .predu64u64 %r577; .reg .u64u64 %r587; .reg .u64 %r588; .reg .u64 %r599; .reg .u64 %r613; .reg .u64 %r617; .reg .u64 %r618; .reg .u64 %r619; .reg .u64 %r620; .reg .pred %r621; .reg .u64 %r622; .reg .pred %r623; .reg .u64 %r624; .reg .pred %r625; .reg .u64 %r626; .reg .u32 %r627; .reg .u64 %r628; .reg .u64 %r629; .reg .u64 %r630; .reg .u64 %r631; .reg .u64 %r632; .reg .u64 %r633; mov.u64 %r233,%ar0; mov.u64 %r234,%ar1; mov.u64 %r235,%ar2; mov.u64 %r236,%ar3; mov.u32 %r237,%ar4; .loc 1 251 6 setp.ne.u64 %r238,%r236,0; @ %r238 bra $L3r2235237; call _gfortran_maxloc1_16_i167 bra $L53; $L54: .loc 1 261 10 ld.u64 %r22,[%r235]; .loc 1 261 7 add.u64 %r153,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r234+28]; .loc 1 262 38 add.u32 %r243,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r154,%r243; .loc 1 265 7 shr.u64 %r245,%r153,63; set.u32.gt.s64 %r247,%r153,%r154; neg.s32 %r248,%r247; cvt.u16.u64 %r251,%r245; cvt.u16.u32 %r252,%r248; or.b16 %r250,%r251,%r252; .loc 1 265 6 cvt.u32.u16 %r253,%r250; cvt.u16.u8 %r254,%r253; setp.eq.u16 %r255,%r254,0; @ %r255 bra $L56; .loc 1 267 7 cvt.u32.u32 %r258,%r23; cvt.s64.s8 %r257,%r258; st.u64 [%stack+8],%r257; st.u64 [%stack],%r22; cvta.const.u64 %r22stack; call _gfortran_runtime_error6: .loc 1 272 9 add.u64 %r619,%r153,%r153; add.u64 %r620,%r619,%r153; shl.b64 %r263,%r620,3; add.u64 %r264,%r234,%r263; ld.u64 %r267,[%r264+56]; add.u64 %r266,%r267,1; .loc 1 272 7 ld.u64 %r274,[%r264+48]; sub.u64 %r155,%r266,%r274; .loc 1 273 6 setp.le.s64 %r275,%r155,0; @ %r275 bra $L53; .loc 1 276 9 ld.u64 %r173,[%r236]; .loc 1 278 15 ld.u64 %r35,[%r236+16]; .loc 1 280 22 cvt.u32.u64 %r36,%r35; .loc 1 280 53 add.u32 %r276,%r36,-4; and.b32 %r277,%r276,-5; set.u32.eq.u32 %r279,%r277,0; neg.s32 %r280,%r279; .loc 1 280 22 add.u32 %r281,%r36,-1; set.u32.le.u32 %r283,%r281,1; neg.s32 %r284,%r283; .loc 1 280 6 cvt.u16.u32 %r286,%r280; cvt.u16.u32 %r287,%r284; or.b16 %r285,%r286,%r287;58; .loc 1 282 7 setp.ne.u32 %r292,%r36,16; @ %r292 bra $L59; $L58: .loc 1 289 9 shl.b64 %r296,%r620,3; add.u64 %r297,%r234,%r296; ld.u64 %r158,[%r297+40]; .loc 1 290 12 add.u64 %r303,%r236,%r296; ld.u64 %r41,[%r303+40]; .loc 1 292 3 setp.ne.u64 %r305,%r153,0; @ %r305 bra $L60; $L66: .loc 1 302 3 setp.lt.s64 %r306,%r153,%r154; @ %r306 bra $L61; bra $L124; $L59: .loc 1 287 5 cvta.const.u64 %r307307_gfortran_runtime_error60: add.u64 %r164,%r234,40; add.u64 %r150,%r236,40; add.u64 %r309,%r234,16; add.u64 %r311,%r22,%r22; add.u64 %r312,%r311,%r22; shl.b64 %r313,%r312,3; add.u64 %r223,%r309,%r313; .loc 1 292 3 mov.u64 
%r53,0; add.u64 %r622,%frame,240; add.u64 %r618,%frame,360; .loc 1 299 12 mov.u64 %r633,%r53; $L65: .loc 1 294 18 add.u64 %r315,%r622,%r53; ld.u64 %r316,[%r164]; st.u64 [%r315],%r316; .loc 1 295 18 add.u64 %r317,%frame,%r53; .loc 1 295 20 ld.u64 %r319,[%r150]; mul.lo.u64 %r318,%r319,%r35; .loc 1 295 18 st.u64 [%r317],%r318; .loc 1 296 19 ld.u64 %r321,[%r164+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r164+8]; sub.u64 %r52,%r320,%r322; .loc 1 298 10 setp.lt.s64 %r323,%r52,0; @ %r323 bra $L63; .loc 1 296 17 add.u64 %r325,%r618,%r53; st.u64 [%r325],%r52; bra $L64; $L63: .loc 1 299 12 add.u64 %r327,%r618,%r53; st.u64 [%r327],%r633; $L64: .loc 1 292 3 add.u64 %r164,%r164,24; add.u64 %r150,%r150,24; add.u64 %r53,%r53,8; setp.ne.u64 %r329,%r164,%r223; @ %r329 bra $L65; bra $L66; $L124: .loc 1 312 6 ld.u64 %r330,[%r233]; setp.eq.u64 %r331,%r330,0; @ ! %r331 bra $L68; bra $L67; $L61: add.u64 %r333,%r22,%r22; add.u64 %r334,%r333,%r22; shl.b64 %r335,%r334,3; add.u64 %r178,%r335,40; add.u64 %r59,%r234,%r178; add.u64 %r180,%r236,%r178; shl.b64 %r336,%r22,3; add.u64 %r91,%r336,-8; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; shl.b64 %r339,%r337,3; add.u64 %r97,%r339,-8; add.u64 %r622,%frame,240; add.u64 %r618,%frame,360; .loc 1 309 12 mov.u64 %r632,0; $L71: .loc 1 304 18 add.u64 %r341,%r622,%r91; ld.u64 %r342,[%r59]; st.u64 [%r341],%r342; .loc 1 305 18 add.u64 %r343,%frame,%r91; .loc 1 305 20 ld.u64 %r345,[%r180]; mul.lo.u64 %r344,%r345,%r35; .loc 1 305 18 st.u64 [%r343],%r344; .loc 1 306 19 ld.u64 %r347,[%r59+16]; add.u64 %r346,%r347,1; ld.u64 %r348,[%r59+8]; sub.u64 %r66,%r346,%r348; .loc 1 308 10 setp.lt.s64 %r349,%r66,0; @ %r349 bra $L69; .loc 1 306 17 add.u64 %r351,%r618,%r91; st.u64 [%r351],%r66; bra $L70; $L69: .loc 1 309 12 add.u64 %r353,%r618,%r91; st.u64 [%r353],%r632; $L70: .loc 1 302 3 add.u64 %r59,%r59,24; add.u64 %r180,%r180,24; add.u64 %r91,%r91,8; setp.ne.u64 %r355,%r91,%r97; @ %r355 bra $L71; bra $L125; $L67: .loc 1 316 7 setp.eq.u64 %r356,%r154,0; @ %r356 bra $L73; add.u64 %r618,%frame,360; $L101: add.u64 %r137,%r233,40; mov.u64 %r134,%r618; cvt.u32.u32 %r358,%r23; cvt.s64.s8 %r357,%r358; add.u64 %r360,%r357,%r357; add.u64 %r361,%r360,%r357; shl.b64 %r362,%r361,3; add.u64 %r363,%r233,16; add.u64 %r64,%r362,%r363; .loc 1 319 10 mov.u64 %r136,1; .loc 1 323 4 mov.u64 %r382,0; bra $L74; $L73: .loc 1 327 20 add.u32 %r364,%r23,-2; cvt.s64.s32 %r75,%r364; add.u64 %r366,%r75,%r75; add.u64 %r367,%r366,%r75; shl.b64 %r368,%r367,3; add.u64 %r369,%r233,%r368; .loc 1 327 67 shl.b64 %r371,%r75,3; add.u64 %r372,%frame,%r371; .loc 1 327 59 ld.u64 %r374,[%r369+40]; ld.u64 %r375,[%r372+360]; mul.lo.u64 %r162,%r374,%r375; .loc 1 329 24 mov.u64 %r376,0; st.u64 [%r233+8],%r376; .loc 1 330 28 cvt.u16.u32 %r379,%r23; add.u16 %r378,%r379,-1; cvt.u32.u16 %r380,%r378; st.u8 [%r233+28],%r380; .loc 1 332 10 setp.eq.u64 %r381,%r162,0; @ %r381 bra $L75; bra $L126; $L77: .loc 1 321 47 mul.lo.u64 %r136,%r72,%r136; $L74: .loc 1 323 4 st.u64 [%r137+8],%r382; ld.u64 %r72,[%r134]; add.u64 %r383,%r72,-1; st.u64 [%r137+16],%r383; st.u64 [%r137],%r136; .loc 1 316 7 add.u64 %r137,%r137,24; add.u64 %r134,%r134,8; setp.ne.u64 %r384,%r64,%r137; @ %r384 bra $L77; bra $L73; $L75: .loc 1 335 4 st.u64 [%r233+48],%r162; mov.u64 %r386,-1; st.u64 [%r233+56],%r386; mov.u64 %r387,1; st.u64 [%r233+40],%r387; .loc 1 336 4 bra $L53; $L126: .loc 1 339 24 mov.u64 %r389,16389; call (%value_in),_gfortrani_xmallocarray90,[%value_in]; } .loc 1 339 22 st.u64 [%r233],%r390; bra $L78; $L68: .loc 1 344 19 ld.s8 %r392,[%r233+28]; .loc 1 344 10 
setp.eq.u64 %r393,%r392,%r154; @ %r393 bra $L79; .loc 1 345 2 cvta.const.u64 %r3944_gfortran_runtime_error79: .loc 1 347 11 cvta.global.u64 %r396,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r397,[%r396+36]; setp.eq.u32 %r398,%r397,0; @ %r398 bra $L78; .loc 1 349 4 add.u64 %r618,%frame,360; cvta.const.u64 %r402,$LC2r61002; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r406,$LC406402; call _gfortrani_bounds_equal_extents$L78: .loc 1 356 3 setp.ne.u64 %r408,%r154,0; @ %r408 bra $L80; $L83: .loc 1 364 8 ld.u64 %r174,[%r233]; .loc 1 365 8 ld.u64 %r172,[%r234]; .loc 1 367 9 setp.ne.u64 %r409,%r172,0; @ %r409 bra $L81; bra $L53; $L80: add.u64 %r188,%frame,480; add.u64 %r181,%r233,40; add.u64 %r149,%frame,120; cvt.u32.u32 %r411,%r23; cvt.s64.s8 %r410,%r411; add.u64 %r138,%r410,-1; .loc 1 356 3 mov.u64 %r175,0; add.u64 %r618,%frame,360; .loc 1 358 16 mov.u64 %r412,%r175; $L82: st.u64 [%r188],%r412; .loc 1 359 18 ld.u64 %r413,[%r181]; st.u64 [%r149],%r413; .loc 1 360 17 shl.b64 %r415,%r175,3; add.u64 %r416,%r618,%r415; .loc 1 360 10 ld.u64 %r417,[%r416]; setp.le.s64 %r418,%r417,0; @ %r418 bra $L53; .loc 1 356 26 add.u64 %r175,%r175,1; .loc 1 356 3 add.u64 %r188,%r188,8; add.u64 %r181,%r181,24; add.u64 %r149,%r149,8; setp.ne.u64 %r419,%r138,%r175; @ %r419 bra $L82; bra $L83; $L81: .loc 1 290 12 mul.lo.u64 %r43,%r41,%r35; .loc 1 386 32 shl.b64 %r88,%r158,4; ld.u64 %r189,[%frame+480]; .loc 1 430 22 ld.u64 %r102,[%frame+240]; .loc 1 430 12 shl.b64 %r103,%r102,4; .loc 1 431 23 ld.u64 %r105,[%frame]; .loc 1 432 22 ld.u64 %r107,[%frame+120]; .loc 1 432 12 shl.b64 %r108,%r107,4; .loc 1 434 32 ld.u64 %r29,[%frame+360]; .loc 1 441 23 mul.lo.u64 %r39,%r29,%r102; .loc 1 442 24 mul.lo.u64 %r420,%r29,%r105; .loc 1 442 10 neg.s64 %r92,%r420; .loc 1 443 23 mul.lo.u64 %r421,%r29,%r107; .loc 1 443 9 shl.b64 %r422,%r421,4; neg.s64 %r185,%r422; setp.le.s64 %r621,%r154,1; .loc 1 408 9 setp.ne.u32 %r625,%r237,0; add.u64 %r626,%r155,-1; cvt.u32.u32 %r627,%r23; cvt.s64.s8 %r628,%r627; add.u64 %r629,%r628,-1; add.u64 %r630,%frame,240; add.u64 %r631,%frame,360; $L102: .loc 1 356 3 mov.u64 %r168,%r173; mov.u64 %r167,%r172; .loc 1 386 9 mov.u64 %r197,0; $L87: .loc 1 389 7 ld.s8 %r224,[%r168]; mov.u64 %r230,%r197; .loc 1 386 24 add.u64 %r197,%r197,1; .loc 1 389 6 setp.eq.u32 %r623,%r224,0; @ %r623 bra $L84; .loc 1 397 11 ld.u64 %r579,[%r167]; ld.u64 %r580,[%r167+8]; .loc 1 398 11 mov.u64 %r577,%r197; shr.s64 %r578,%r577,63; .loc 1 408 9 @ %r625 bra $L85; bra $L127; $L84: .loc 1 386 32 add.u64 %r167,%r167,%r88; .loc 1 386 47 add.u64 %r168,%r168,%r43; .loc 1 386 2 setp.ne.u64 %r426,%r155,%r197; @ %r426 bra $L87; .loc 1 385 9 mov.u64 %r577,0; mov.u64 %r578,%r577; bra $L88; $L127: .loc 1 418 8 setp.gt.s64 %r427,%r155,%r230; @ %r427 bra $L89; bra $L88; $L85: .loc 1 409 8 setp.le.s64 %r428,%r155,%r230; @ %r428 bra $L88; add.u64 %r587,%r230,1; shr.s64 %r588,%r587,63; sub.u64 %r432,%r626,%r230; shr.s64 %r435,%r230,63; add.u64 %r438,%r230,2; set.u32.lt.u64 %r443,%r438,%r230; cvt.s64.s32 %r441,%r443; sub.u64 %r447,%r435,%r441; add.u64 %r449,%r432,%r438; set.u32.lt.u64 %r454,%r449,%r432; cvt.s64.s32 %r452,%r454; sub.u64 %r458,%r447,%r452; mov.u64 %r599,1; $L93: .loc 1 411 8 @ %r623 bra $L90; .loc 1 411 18 ld.u64 %r573,[%r167]; ld.u64 %r574,[%r167+8]; .loc 1 411 15 setp.gt.s64 %r462,%r580,%r574; @ %r462 bra $L90; setp.ne.u64 %r465,%r580,%r574; @ %r465 bra $L103; setp.gt.u64 %r468,%r579,%r573; @ %r468 bra $L90; $L103: .loc 1 414 16 mov.u64 %r577,%r587; mov.u64 %r578,%r588; mov.u64 %r579,%r573; mov.u64 %r580,%r574; $L90: 
.loc 1 409 33 add.u64 %r167,%r167,%r88; .loc 1 409 48 add.u64 %r168,%r168,%r43; .loc 1 409 8 add.u64 %r471,%r587,%r599; set.u32.lt.u64 %r476,%r471,%r587; cvt.s64.s32 %r474,%r476; mov.u64 %r587,%r471; sub.u64 %r588,%r588,%r474; setp.ne.u64 %r483,%r449,%r587; @ %r483 bra $L104; setp.ne.u64 %r486,%r458,%r588; @ ! %r486 bra $L88; $L104: .loc 1 411 9 ld.s8 %r224,[%r168]; setp.eq.u32 %r623,%r224,0; bra $L93; $L89: add.u64 %r583,%r230,1; shr.s64 %r584,%r583,63; sub.u64 %r490,%r626,%r230; shr.s64 %r493,%r230,63; add.u64 %r496,%r230,2; set.u32.lt.u64 %r501,%r496,%r230; cvt.s64.s32 %r499,%r501; sub.u64 %r505,%r493,%r499; add.u64 %r507,%r490,%r496; set.u32.lt.u64 %r512,%r507,%r490; cvt.s64.s32 %r510,%r512; sub.u64 %r516,%r505,%r510; .loc 1 418 8 mov.u64 %r613,1; $L119: .loc 1 420 8 ld.s8 %r518,[%r168]; cvt.u16.u32 %r517,%r518; setp.eq.u16 %r519,%r517,0; @ %r519 bra $L94; .loc 1 420 18 ld.u64 %r575,[%r167]; ld.u64 %r576,[%r167+8]; .loc 1 420 15 setp.gt.s64 %r522,%r576,%r580; @ %r522 bra $L105; setp.ne.u64 %r525,%r576,%r580; @ %r525 bra $L94; setp.gt.u64 %r528,%r575,%r579; @ ! %r528 bra $L94; $L105: .loc 1 423 16 mov.u64 %r577,%r583; mov.u64 %r578,%r584; mov.u64 %r579,%r575; mov.u64 %r580,%r576; $L94: .loc 1 418 33 add.u64 %r167,%r167,%r88; .loc 1 418 48 add.u64 %r168,%r168,%r43; .loc 1 418 8 add.u64 %r531,%r583,%r613; set.u32.lt.u64 %r536,%r531,%r583; cvt.s64.s32 %r534,%r536; mov.u64 %r583,%r531; sub.u64 %r584,%r584,%r534; setp.ne.u64 %r543,%r507,%r583; @ %r543 bra $L119; setp.ne.u64 %r546,%r516,%r584; @ %r546 bra $L119; $L88: .loc 1 426 8 st.u64 [%r174],%r577; st.u64 [%r174+8],%r578; .loc 1 429 15 add.u64 %r189,%r189,1; .loc 1 430 12 add.u64 %r172,%r172,%r103; .loc 1 431 13 add.u64 %r173,%r173,%r105; .loc 1 432 12 add.u64 %r174,%r174,%r108; .loc 1 434 13 setp.ne.u64 %r549,%r29,%r189; @ %r549 bra $L102; .loc 1 442 10 add.u64 %r169,%r173,%r92; .loc 1 443 9 add.u64 %r170,%r174,%r185; .loc 1 445 7 @ %r621 bra $L53; add.u64 %r219,%frame,488; .loc 1 441 23 mov.u64 %r112,%r39; .loc 1 445 7 mov.u64 %r218,8; .loc 1 444 5 mov.u64 %r171,1; add.u64 %r617,%frame,120; .loc 1 438 13 mov.u64 %r624,0; bra $L99; $L100: st.u64 [%r219],%r624; .loc 1 441 23 mul.lo.u64 %r112,%r125,%r124; .loc 1 442 24 mul.lo.u64 %r555,%r127,%r124; .loc 1 442 10 sub.u64 %r169,%r173,%r555; .loc 1 443 23 mul.lo.u64 %r556,%r129,%r124; .loc 1 443 9 shl.b64 %r557,%r556,4; sub.u64 %r170,%r174,%r557; .loc 1 444 5 add.u64 %r171,%r171,1; .loc 1 445 7 add.u64 %r219,%r219,8; add.u64 %r218,%r218,8; setp.eq.u64 %r558,%r171,%r629; @ %r558 bra $L53; $L99: .loc 1 453 16 ld.u64 %r559,[%r219]; add.u64 %r124,%r559,1; st.u64 [%r219],%r124; .loc 1 454 23 add.u64 %r561,%r630,%r218; ld.u64 %r125,[%r561]; .loc 1 454 13 sub.u64 %r562,%r125,%r112; shl.b64 %r563,%r562,4; add.u64 %r172,%r172,%r563; .loc 1 455 24 add.u64 %r564,%frame,%r218; ld.u64 %r127,[%r564]; .loc 1 455 14 add.u64 %r173,%r169,%r127; .loc 1 456 23 add.u64 %r566,%r617,%r218; ld.u64 %r129,[%r566]; .loc 1 456 13 shl.b64 %r567,%r129,4; add.u64 %r174,%r170,%r567; .loc 1 434 32 add.u64 %r569,%r631,%r218; ld.u64 %r133,[%r569]; .loc 1 434 13 setp.eq.u64 %r570,%r124,%r133; @ %r570 bra $L100; .loc 1 438 13 mov.u64 %r189,0; bra $L102; $L125: .loc 1 312 6 ld.u64 %r571,[%r233]; setp.eq.u64 %r572,%r571,0; @ ! 
%r572 bra $L68; bra $L101; $L53: .loc 1 460_gfortran_smaxloc1_16_i16 .visible .func _gfortran_smaxloc1_16_i16, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L129; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L130; $L129:r152154; call _gfortran_maxloc1_16_i16490 7 bra $L128; $L130: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r297,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L132; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L133; $L139: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L134; bra $L187; $L132: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error133: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r303,0; $L138: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L136; .loc 1 505 17 st.u64 [%r139],%r36; bra $L137; $L136: .loc 1 508 12 st.u64 [%r139],%r303; $L137: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L138; bra $L139; $L187: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L141; bra $L140; $L134: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r302,0; $L144: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L142; .loc 1 513 17 st.u64 [%r43],%r41; bra $L143; $L142: .loc 1 517 12 st.u64 [%r43],%r302; $L143: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L144; bra $L188; $L140: .loc 1 524 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L146; add.u64 %r296,%frame,120; $L166: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L147; $L146: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L148; bra $L189; $L150: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L147: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L150; bra $L146; $L148: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L128; $L189: .loc 1 547 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L154: .loc 1 573 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L151; bra $L152; $L141: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L153; .loc 1 552 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error153: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L154; .loc 1 559 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L152; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L156: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L155; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error155: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L156; $L151: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L157; shl.b64 %r266,%r297,3; bra $L158; $L157: mov.u64 %r266,8; $L158: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L159: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L159; $L152: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,4; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 583 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L165: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L165; .loc 1 596 7 @ ! %r295 bra $L190; bra $L128; $L164: .loc 1 591 13 st.u64 [%r99],%r280; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L163; bra $L128; $L190: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; $L163: .loc 1 600 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L164; .loc 1 591 13 mov.u64 %r80,0; bra $L165; $L188: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r294,%r98,0; @ ! 
%r294 bra $L141; bra $L166; $L128: .loc 1 605maxloc1_4_r4.o/ 1622802230_gfortran_maxloc1_4_r4 .visible .func _gfortran_maxloc1_4_r../libgfortran/generated/maxloc1_4_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_4_r4 .visible .func _gfortran_mmaxloc1_4_r_gfortran_smaxloc1_4_r4 .visible .func _gfortran_smaxloc1_4_rVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_maxloc1_4_r4 .visible .func _gfortran_maxloc1_4_r4100; .reg .u64 %r102; .reg .u32predpredpred %r235; .reg .predpredu64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64 %r283; .reg .u64predu64u32 %r316; .reg .u64 %r317; .reg .u64pred %r323; .reg .pred %r324; .reg .u32 %r326; .reg .predpred %r330; .reg .predpred %r345; .reg .u64mov.u64 %r185,%ar0; mov.u64 %r186,%ar1; mov.u64 %r187,%ar2; mov.u32 %r188,%ar3; .loc 1 58 10 ld.s8 %r22,[%r186+28]; .loc 1 58 38 add.u32 %r189,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r112,%r189; .loc 1 59 10 ld.u64 %r25,[%r187]; .loc 1 59 7 add.u64 %r113,%r25,-1; .loc 1 61 7 shr.u64 %r191,%r113,63; set.u32.lt.s64 %r193,%r112,%r113; neg.s32 %r194,%r193; cvt.u16.u64 %r197,%r191; cvt.u16.u32 %r198,%r194; or.b16 %r196,%r197,%r198; .loc 1 61 6 cvt.u32.u16 %r199,%r196; cvt.u16.u8 %r200,%r199; setp.eq.u16 %r201,%r200,0; @ %r201 bra $L2; .loc 1 63 7 cvt.u32.u32 %r204,%r22; cvt.s64.s8 %r203,%r204; st.u64 [%stack+8],%r20302202_gfortran_runtime_errorr207,%r113,%r113; add.u64 %r208,%r207,%r113; shl.b64 %r209,%r208,3; add.u64 %r210,%r186,%r209; ld.u64 %r30,[%r210+56]; ld.u64 %r32,[%r210+48]; .loc 1 71 9 ld.u64 %r116,[%r210+40]; .loc 1 73 3 setp.ne.u64 %r224,%r113,0; @ %r224 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r225,%r112,%r113; @ %r225 bra $L4; bra $L55; $L3: add.u64 %r184,%r186,40; add.u64 %r181,%frame,120; add.u64 %r47,%frame,240; add.u64 %r226,%r186,16; add.u64 %r228,%r25,%r25; add.u64 %r229,%r228,%r25; shl.b64 %r230,%r229,3; add.u64 %r148,%r226,%r230; .loc 1 79 12 mov.u64 %r371,0; $L8: .loc 1 75 18 ld.u64 %r231,[%r184]; st.u64 [%r181],%r231; .loc 1 76 19 ld.u64 %r233,[%r184+16]; add.u64 
%r232,%r233,1; ld.u64 %r234,[%r184+8]; sub.u64 %r37,%r232,%r234; .loc 1 78 10 setp.lt.s64 %r235,%r37,0; @ %r235 bra $L6; .loc 1 76 17 st.u64 [%r47],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 [%r47],%r371; $L7: .loc 1 73 3 add.u64 %r184,%r184,24; add.u64 %r181,%r181,8; add.u64 %r47,%r47,8; setp.ne.u64 %r237,%r148,%r184; @ %r237 bra $L8; bra $L9; $L55: .loc 1 90 6 ld.u64 %r238,[%r185]; setp.eq.u64 %r239,%r238,0; @ ! %r239 bra $L11; bra $L10; $L4: add.u64 %r241,%r25,%r25; add.u64 %r242,%r241,%r25; shl.b64 %r243,%r242,3; add.u64 %r244,%r243,40; add.u64 %r130,%r186,%r244; shl.b64 %r245,%r25,3; add.u64 %r102,%r245,-8; add.u64 %r359,%frame,120; add.u64 %r106,%r359,%r102; add.u64 %r358,%frame,240; add.u64 %r98,%r358,%r102; cvt.u32.u32 %r249,%r22; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r254,%r186,40; add.u64 %r127,%r253,%r254; .loc 1 87 12 mov.u64 %r370,0; $L14: .loc 1 83 18 ld.u64 %r255,[%r130]; st.u64 [%r106],%r255; .loc 1 84 19 ld.u64 %r257,[%r130+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r130+8]; sub.u64 %r44,%r256,%r258; .loc 1 86 10 setp.lt.s64 %r259,%r44,0; @ %r259 bra $L12; .loc 1 84 17 st.u64 [%r98],%r44; bra $L13; $L12: .loc 1 87 12 st.u64 [%r98],%r370; $L13: .loc 1 81 3 add.u64 %r130,%r130,24; add.u64 %r106,%r106,8; add.u64 %r98,%r98,8; setp.ne.u64 %r261,%r127,%r130; @ %r261 bra $L14; bra $L56; $L10: .loc 1 94 7 setp.le.s64 %r262,%r112,0; @ %r262 bra $L16; add.u64 %r358,%frame,240; $L42: add.u64 %r155,%r185,40; mov.u64 %r146,%r358; add.u64 %r263,%r185,16; cvt.u32.u32 %r265,%r22; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r134,%r263,%r269; .loc 1 97 10 mov.u64 %r100,1; .loc 1 101 4 mov.u64 %r292,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r270,0; st.u64 [%r185+8],%r270; .loc 1 106 28 cvt.u16.u32 %r273,%r22; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r185+28],%r274; .loc 1 108 20 add.u32 %r275,%r22,-2; cvt.s64.s32 %r53,%r275; add.u64 %r277,%r53,%r53; add.u64 %r278,%r277,%r53; shl.b64 %r279,%r278,3; add.u64 %r280,%r185,%r279; .loc 1 108 67 shl.b64 %r282,%r53,3; add.u64 %r283,%frame,%r282; .loc 1 108 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r118,%r285,%r286; .loc 1 110 29288; call (%value_in),_gfortrani_xmallocarray289,[%value_in]; } .loc 1 110 27 st.u64 [%r185],%r289; .loc 1 111 10 setp.eq.u64 %r291,%r118,0; @ ! 
%r291 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r100,%r49,%r100; $L17: .loc 1 101 4 st.u64 [%r155+8],%r292; ld.u64 %r49,[%r146]; add.u64 %r293,%r49,-1; st.u64 [%r155+16],%r293; st.u64 [%r155],%r100; .loc 1 94 7 add.u64 %r155,%r155,24; add.u64 %r146,%r146,8; setp.ne.u64 %r294,%r134,%r155; @ %r294 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r185+48],%r118; mov.u64 %r296,-1; st.u64 [%r185+56],%r296; mov.u64 %r297,1; st.u64 [%r185+40],%r297; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r59,[%r185+28]; .loc 1 121 10 setp.eq.u64 %r298,%r59,%r112; @ %r298 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r112; st.u64 [%stack],%r59;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r301,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r302,[%r301+36]; setp.eq.u32 %r303,%r302,0; @ %r303 bra $L21; .loc 1 128 2 add.u64 %r358,%frame,240; cvta.const.u64 %r307,$LC2r35306307; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r309,%r112,0; @ %r309 bra $L24; $L27: .loc 1 68 9 add.u64 %r310,%r30,1; .loc 1 68 7 sub.u64 %r114,%r310,%r32; max.s64 %r131,%r114,0; .loc 1 140 8 ld.u64 %r124,[%r186]; .loc 1 141 8 ld.u64 %r125,[%r185]; .loc 1 168 42 shl.b64 %r66,%r116,2; ld.u64 %r135,[%frame+360]; .loc 1 194 22 ld.u64 %r70,[%frame+120]; .loc 1 194 12 shl.b64 %r71,%r70,2; .loc 1 195 22 ld.u64 %r73,[%frame]; .loc 1 195 12 shl.b64 %r74,%r73,2; .loc 1 197 32 ld.u64 %r61,[%frame+240]; .loc 1 204 23 mul.lo.u64 %r311,%r61,%r70; .loc 1 204 9 shl.b64 %r312,%r311,2; neg.s64 %r149,%r312; .loc 1 205 23 mul.lo.u64 %r313,%r61,%r73; .loc 1 205 9 shl.b64 %r314,%r313,2; neg.s64 %r132,%r314; setp.gt.s64 %r360,%r114,0; setp.le.s64 %r361,%r112,1; .loc 1 157 9 mov.u32 %r363,1; setp.eq.u32 %r364,%r188,0; cvt.u32.u32 %r365,%r22; cvt.s64.s8 %r366,%r365; add.u64 %r367,%r366,-1; add.u64 %r368,%frame,120; add.u64 %r369,%frame,240; bra $L25; $L24: add.u64 %r168,%frame,360; add.u64 %r167,%r185,40; mov.u64 %r165,%frame; cvt.u32.u32 %r316,%r22; cvt.s64.s8 %r315,%r316; add.u64 %r156,%r315,-1; .loc 1 132 3 mov.u64 %r126,0; add.u64 %r358,%frame,240; .loc 1 134 16 mov.u64 %r317,%r126; $L26: st.u64 [%r168],%r317; .loc 1 135 18 ld.u64 %r318,[%r167]; st.u64 [%r165],%r318; .loc 1 136 17 shl.b64 %r320,%r126,3; add.u64 %r321,%r358,%r320; .loc 1 136 10 ld.u64 %r322,[%r321]; setp.le.s64 %r323,%r322,0; @ %r323 bra $L1; .loc 1 132 26 add.u64 %r126,%r126,1; .loc 1 132 3 add.u64 %r168,%r168,8; add.u64 %r167,%r167,24; add.u64 %r165,%r165,8; setp.ne.u64 %r324,%r126,%r156; @ %r324 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r360 bra $L43; .loc 1 159 10 mov.u32 %r326,0; st.u32 [%r125],%r326; bra $L29; $L43: mov.u64 %r120,%r124; .loc 1 168 19 mov.u64 %r119,0; $L28: .loc 1 170 7 ld.f32 %r105,[%r120]; .loc 1 170 6 setp.ge.f32 %r327,%r105,0fff800000; @ ! %r327 bra $L57; .loc 1 173 14 cvt.u32.u64 %r328,%r119; add.u32 %r103,%r328,1; .loc 1 180 6 setp.le.s64 %r329,%r131,%r119; @ %r329 bra $L32; add.u64 %r171,%r119,1; .loc 1 170 7 mov.f32 %r183,%r105; bra $L33; $L57: .loc 1 168 34 add.u64 %r119,%r119,1; .loc 1 168 42 add.u64 %r120,%r120,%r66; .loc 1 168 12 setp.lt.s64 %r330,%r119,%r131; @ %r330 bra $L28; .loc 1 157 9 mov.u32 %r103,%r363; bra $L32; $L33: .loc 1 182 7 @ %r364 bra $L34; .loc 1 182 6 setp.le.f32 %r332,%r105,%r183; @ %r332 bra $L35; bra $L36; $L34: setp.lt.f32 %r333,%r105,%r183; @ ! 
%r333 bra $L36; $L35: .loc 1 185 14 cvt.u32.u64 %r103,%r171; .loc 1 184 14 mov.f32 %r105,%r183; $L36: .loc 1 180 31 add.u64 %r120,%r120,%r66; .loc 1 180 6 add.u64 %r172,%r171,1; setp.eq.u64 %r334,%r131,%r171; @ %r334 bra $L32; .loc 1 182 14 ld.f32 %r183,[%r120]; mov.u64 %r171,%r172; bra $L33; $L32: .loc 1 189 12 st.u32 [%r125],%r103; $L29: .loc 1 193 15 add.u64 %r135,%r135,1; .loc 1 194 12 add.u64 %r124,%r124,%r71; .loc 1 195 12 add.u64 %r125,%r125,%r74; .loc 1 197 13 setp.ne.u64 %r335,%r61,%r135; @ %r335 bra $L25; .loc 1 204 9 add.u64 %r121,%r124,%r149; .loc 1 205 9 add.u64 %r122,%r125,%r132; .loc 1 207 7 @ %r361 bra $L1; add.u64 %r137,%frame,368; mov.u64 %r180,8; .loc 1 206 5 mov.u64 %r123,1; .loc 1 201 13 mov.u64 %r362,0; bra $L40; $L41: st.u64 [%r137],%r362; .loc 1 204 23 mul.lo.u64 %r341,%r87,%r86; .loc 1 204 9 shl.b64 %r342,%r341,2; sub.u64 %r121,%r124,%r342; .loc 1 205 23 mul.lo.u64 %r343,%r90,%r86; .loc 1 205 9 shl.b64 %r344,%r343,2; sub.u64 %r122,%r125,%r344; .loc 1 206 5 add.u64 %r123,%r123,1; .loc 1 207 7 add.u64 %r137,%r137,8; add.u64 %r180,%r180,8; setp.eq.u64 %r345,%r123,%r367; @ %r345 bra $L1; $L40: .loc 1 215 16 ld.u64 %r346,[%r137]; add.u64 %r86,%r346,1; st.u64 [%r137],%r86; .loc 1 216 23 add.u64 %r348,%r368,%r180; ld.u64 %r87,[%r348]; .loc 1 216 13 shl.b64 %r349,%r87,2; add.u64 %r124,%r121,%r349; .loc 1 217 23 add.u64 %r350,%frame,%r180; ld.u64 %r90,[%r350]; .loc 1 217 13 shl.b64 %r351,%r90,2; add.u64 %r125,%r122,%r351; .loc 1 197 32 add.u64 %r353,%r369,%r180; ld.u64 %r94,[%r353]; .loc 1 197 13 setp.eq.u64 %r354,%r86,%r94; @ %r354 bra $L41; .loc 1 201 13 mov.u64 %r135,0; bra $L25; $L56: .loc 1 90 6 ld.u64 %r355,[%r185]; setp.eq.u64 %r356,%r355,0; @ ! %r356 bra $L11; bra $L42; $L1: .loc 1 221_gfortran_mmaxloc1_4_r4 .visible .func _gfortran_mmaxloc1_4_r, .param .u32 %in_ar4) {608u32 %r267; .reg .u32 %r269; .reg .u32 %r270; .reg .u32 %r271; .reg .u32 %r273; .reg .u32predu64 %r293; .reg .predu64 %r302; .reg .u64 %r303; .reg .u64 %r305; .reg .u64 %r306; .reg .u64u64 %r327; .reg .u32u64 %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64pred %r339; .reg .u64 %r341; .reg .u64 %r343; .reg .pred %r345; .reg .predpred %r383; .reg .u64 %r384; .reg .u64 %r386; .reg .u32 %r387; .reg .pred %r388; .reg .u64 %r391; .reg .u64pred %r409; .reg .u64 %r410; .reg .u64 %r411; .reg .u64 %r412; .reg .pred %r414; .reg .u32pred %r418; .reg .pred %r419; .reg .pred %r422; .reg .pred %r423; .reg .pred %r425; .reg .pred %r426; .reg .predu64predmov.u64 %r223,%ar0; mov.u64 %r224,%ar1; mov.u64 %r225,%ar2; mov.u64 %r226,%ar3; mov.u32 %r227,%ar4; .loc 1 251 6 setp.ne.u64 %r228,%r226,0; @ %r228 bra $L227; call _gfortran_maxloc1_4_r47 bra $L58; $L59: .loc 1 261 10 ld.u64 %r22,[%r225]; .loc 1 261 7 add.u64 %r159,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r224+28]; .loc 1 262 38 add.u32 %r233,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r160,%r233; .loc 1 265 7 shr.u64 %r235,%r159,63; set.u32.gt.s64 %r237,%r159,%r160; neg.s32 %r238,%r237; cvt.u16.u64 %r241,%r235; cvt.u16.u32 %r242,%r238; or.b16 %r240,%r241,%r242; .loc 1 265 6 cvt.u32.u16 %r243,%r240; cvt.u16.u8 %r244,%r243; setp.eq.u16 %r245,%r244,0; @ %r245 bra $L61; .loc 1 267 7 cvt.u32.u32 %r248,%r23; cvt.s64.s8 %r247,%r248; st.u64 [%stack+8],%r247; st.u64 [%stack],%r22; cvta.const.u64 %r246246_gfortran_runtime_error61: .loc 1 272 9 add.u64 %r456,%r159,%r159; add.u64 %r457,%r456,%r159; shl.b64 %r253,%r457,3; add.u64 %r254,%r224,%r253; ld.u64 %r257,[%r254+56]; add.u64 %r256,%r257,1; .loc 1 272 7 ld.u64 %r264,[%r254+48]; sub.u64 %r161,%r256,%r264; .loc 1 273 6 
setp.le.s64 %r265,%r161,0; @ %r265 bra $L58; .loc 1 276 9 ld.u64 %r180,[%r226]; .loc 1 278 15 ld.u64 %r34,[%r226+16]; .loc 1 280 22 cvt.u32.u64 %r35,%r34; .loc 1 280 53 add.u32 %r266,%r35,-4; and.b32 %r267,%r266,-5; set.u32.eq.u32 %r269,%r267,0; neg.s32 %r270,%r269; .loc 1 280 22 add.u32 %r271,%r35,-1; set.u32.le.u32 %r273,%r271,1; neg.s32 %r274,%r273; .loc 1 280 6 cvt.u16.u32 %r276,%r270; cvt.u16.u32 %r277,%r274; or.b16 %r275,%r276,%r277; cvt.u32.u16 %r278,%r275; cvt.u16.u8 %r279,%r278; setp.ne.u16 %r280,%r279,0; @ %r280 bra $L63; .loc 1 282 7 setp.ne.u32 %r282,%r35,16; @ %r282 bra $L64; $L63: .loc 1 289 9 shl.b64 %r286,%r457,3; add.u64 %r287,%r224,%r286; ld.u64 %r164,[%r287+40]; .loc 1 290 12 add.u64 %r293,%r226,%r286; ld.u64 %r41,[%r293+40]; .loc 1 292 3 setp.ne.u64 %r295,%r159,0; @ %r295 bra $L65; $L71: .loc 1 302 3 setp.lt.s64 %r296,%r159,%r160; @ %r296 bra $L66; bra $L129; $L64: .loc 1 287 5 cvta.const.u64 %r2977_gfortran_runtime_error65: add.u64 %r91,%r224,40; add.u64 %r40,%r226,40; add.u64 %r300,%r22,%r22; add.u64 %r301,%r300,%r22; shl.b64 %r302,%r301,3; add.u64 %r303,%r224,16; add.u64 %r186,%r302,%r303; .loc 1 292 3 mov.u64 %r148,0; add.u64 %r453,%frame,240; add.u64 %r452,%frame,360; .loc 1 299 12 mov.u64 %r467,%r148; $L70: .loc 1 294 18 add.u64 %r305,%r453,%r148; ld.u64 %r306,[%r91]; st.u64 [%r305],%r306; .loc 1 295 18 add.u64 %r307,%frame,%r148; .loc 1 295 20 ld.u64 %r309,[%r40]; mul.lo.u64 %r308,%r309,%r34; .loc 1 295 18 st.u64 [%r307],%r308; .loc 1 296 19 ld.u64 %r311,[%r91+16]; add.u64 %r310,%r311,1; ld.u64 %r312,[%r91+8]; sub.u64 %r52,%r310,%r312; .loc 1 298 10 setp.lt.s64 %r313,%r52,0; @ %r313 bra $L68; .loc 1 296 17 add.u64 %r315,%r452,%r148; st.u64 [%r315],%r52; bra $L69; $L68: .loc 1 299 12 add.u64 %r317,%r452,%r148; st.u64 [%r317],%r467; $L69: .loc 1 292 3 add.u64 %r91,%r91,24; add.u64 %r40,%r40,24; add.u64 %r148,%r148,8; setp.ne.u64 %r319,%r186,%r91; @ %r319 bra $L70; bra $L71; $L129: .loc 1 312 6 ld.u64 %r320,[%r223]; setp.eq.u64 %r321,%r320,0; @ ! 
nvptx PTX from libgfortran's maxloc1_4_r4.o, continued: body of _gfortran_mmaxloc1_4_r4, the masked MAXLOC reduction over REAL(4) data with INTEGER(4) result positions. The surviving assembly covers the stride and extent setup for the reduced dimension, allocation of an unallocated result through _gfortrani_xmallocarray, the rank and extent checks that call _gfortran_runtime_error, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents when bounds checking is enabled in _gfortrani_compile_options, the scan loop that keeps the running maximum and its 1-based position (the 0fFF800000 compare is a test against -Inf, so NaNs are skipped), the separate >= and > comparison paths selected by the BACK argument, and the outer loops that advance the array, mask and result pointers across the remaining dimensions.
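As a reading aid, here is a minimal C sketch of what one pass of that masked scan computes for a single vector along the reduced dimension. The names are hypothetical, the NaN ordering is simplified relative to the -Inf compare in the assembly, and this is not the libgfortran generated source.

#include <stddef.h>
#include <stdint.h>

/* Hypothetical model of the masked scan: return the 1-based position of the
   largest masked element of a[0..n-1], or 0 when no element is selected.
   'back' chooses the last occurrence of the maximum instead of the first,
   which is the >= versus > split visible in the PTX. */
static int32_t maxloc_masked_vec(const float *a, const int8_t *mask,
                                 size_t n, int back)
{
    int32_t pos = 0;
    float maxval = 0.0f;        /* meaningful only once pos != 0 */
    for (size_t i = 0; i < n; i++) {
        if (!mask[i])
            continue;           /* unmasked elements never participate */
        int better = (pos == 0)                  /* first masked element */
                  || (back ? a[i] >= maxval : a[i] > maxval);
        if (better) {
            maxval = a[i];
            pos = (int32_t)(i + 1);              /* Fortran positions are 1-based */
        }
    }
    return pos;
}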
End of _gfortran_mmaxloc1_4_r4, followed by _gfortran_smaxloc1_4_r4, the scalar-mask entry point: when the MASK pointer is null or points at a true value it simply calls _gfortran_maxloc1_4_r4 and returns; otherwise it checks the rank and extents of the return array (raising _gfortran_runtime_error on mismatch when bounds checking is on), allocates the result through _gfortrani_xmallocarray if it arrives unallocated, and stores zero at every position.
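The dispatch the wrapper performs is simple enough to state as a short C sketch (hypothetical names, assuming a flat result buffer rather than an array descriptor):

#include <stddef.h>
#include <stdint.h>

/* Model of the smaxloc1_* entry points: report whether the caller should run
   the full unmasked reduction (scalar MASK absent or true), or whether the
   result has already been zero-filled here (scalar MASK present and false). */
static int smaxloc_dispatch(int32_t *ret, size_t ret_len, const int32_t *mask)
{
    if (mask == NULL || *mask != 0)
        return 1;               /* caller forwards to the maxloc1_* routine */
    for (size_t i = 0; i < ret_len; i++)
        ret[i] = 0;             /* MASK=.false.: no position is ever selected */
    return 0;
}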
End of _gfortran_smaxloc1_4_r4. Next archive member: maxloc1_8_r4.o (timestamp 1622802230), compiled from ../libgfortran/generated/maxloc1_8_r4.c. It declares _gfortran_maxloc1_8_r4, _gfortran_mmaxloc1_8_r4 and _gfortran_smaxloc1_8_r4, references the externals _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents, and carries the MAXLOC diagnostic strings as byte arrays: "rank of return array incorrect in MAXLOC intrinsic: is %ld, should be %l...", "MAXLOC", "...incorrect in MAXLOC intrinsic: is %ld, should be between 1 and %l...", "Funny sized logical array", a "rank of return array..." fragment, a "MASK argume..." fragment, "Incorrect extent in return value of MAXLOC intrinsic in dimension %ld: is %ld, should be %l...", and a "return val..." fragment. The body of _gfortran_maxloc1_8_r4, the unmasked reduction with INTEGER(8) result positions, then begins with argument unpacking, the DIM range check that calls _gfortran_runtime_error, and construction of the stride and extent tables on the stack frame.
Body of _gfortran_maxloc1_8_r4, continued: allocation of the result via _gfortrani_xmallocarray when needed, the optional return-array bounds check through _gfortrani_bounds_ifunction_return, and the reduction proper. For each vector along the reduced dimension the code first skips leading NaNs with the >= -Inf compare, then tracks the largest value and its 1-based position, using >= when BACK is true and > otherwise, and stores the position as a 64-bit integer while the outer loops step the source and destination pointers through the remaining extents.
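A hedged C model of that scan for one vector (hypothetical names; the real routine works on strided descriptors, but the selection logic is the same):

#include <stddef.h>
#include <stdint.h>
#include <math.h>

/* Model of the unmasked scan in the INTEGER(8) variant: skip leading NaNs,
   then track the maximum and its 1-based position.  x >= -INFINITY is false
   only for NaN, which is the 0fFF800000 compare in the assembly. */
static int64_t maxloc_vec_i8(const float *a, size_t n, int back)
{
    if (n == 0)
        return 0;
    size_t i = 0;
    while (i < n && !(a[i] >= -INFINITY))
        i++;                    /* phase 1: find the first non-NaN element */
    if (i == n)
        return 1;               /* every element is NaN: the first position is reported */
    float maxval = a[i];
    int64_t pos = (int64_t)i + 1;
    for (i++; i < n; i++)
        if (back ? a[i] >= maxval : a[i] > maxval) {
            maxval = a[i];
            pos = (int64_t)i + 1;
        }
    return pos;
}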
End of _gfortran_maxloc1_8_r4, followed by the body of _gfortran_mmaxloc1_8_r4, the masked form of the same reduction. It first screens the element size of the LOGICAL mask (1, 2, 4, 8 or 16 bytes; anything else raises the "Funny sized logical array" error), builds mask strides alongside the array strides, checks the return array against the mask shape with _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents when bounds checking is enabled, and then runs the scan over masked elements only, with the same NaN skip, BACK-dependent comparison and 64-bit result positions.
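The mask-kind screen at the top of the masked entry points reduces to two small bit tests plus an equality; a C sketch that mirrors the compares in the assembly (hypothetical function name):

#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* Model of the mask-kind screen: the element size of the LOGICAL mask must be
   1, 2, 4, 8 or 16 bytes; anything else aborts with the diagnostic carried in
   this object's string table. */
static void check_mask_kind(uint32_t k)
{
    int ok = (((k - 4u) & ~4u) == 0)   /* k == 4 or k == 8  */
          || ((k - 1u) <= 1u)          /* k == 1 or k == 2  */
          || (k == 16u);
    if (!ok) {
        fprintf(stderr, "Funny sized logical array\n");
        exit(2);
    }
}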
End of _gfortran_mmaxloc1_8_r4, then _gfortran_smaxloc1_8_r4, the scalar-mask wrapper with the same shape as the INTEGER(4) variant: forward to _gfortran_maxloc1_8_r4 when the scalar MASK is absent or true, otherwise validate, allocate through _gfortrani_xmallocarray and zero-fill an INTEGER(8) result. Next archive member: maxloc1_16_r4.o (timestamp 1622802230), compiled from ../libgfortran/generated/maxloc1_16_r4.c, declaring _gfortran_maxloc1_16_r4, _gfortran_mmaxloc1_16_r4 and _gfortran_smaxloc1_16_r4 with the same external references and the same MAXLOC diagnostic strings.
Body of _gfortran_maxloc1_16_r4, the unmasked reduction with INTEGER(16) result positions. The control flow mirrors the INTEGER(8) variant; the differences are that each stored position occupies 16 bytes (the destination stride is scaled by shl 4 rather than shl 3) and that the running position is kept as a low/high pair of 64-bit registers, with the carry of every increment propagated through a set.u32.lt.u64 / cvt.s64.s32 / sub.u64 sequence.
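The only structurally new element in the INTEGER(16) variant is that position arithmetic is done on 64-bit halves; a minimal C model of one increment (hypothetical struct and function names):

#include <stdint.h>

/* The INTEGER(16) position is a low/high pair of 64-bit words.  The PTX
   materialises the carry of the low-word add as an all-ones mask
   (set.u32.lt.u64 / cvt.s64.s32) and subtracts it from the high word;
   adding 1 on wrap-around below is the same operation. */
typedef struct { uint64_t lo; int64_t hi; } pos128;

static pos128 pos128_add(pos128 x, uint64_t step)
{
    pos128 r;
    r.lo = x.lo + step;
    r.hi = x.hi + (r.lo < x.lo ? 1 : 0);   /* carry out of the low word */
    return r;
}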
End of _gfortran_maxloc1_16_r4, followed by the body of _gfortran_mmaxloc1_16_r4, the masked INTEGER(16) form: the same mask-kind screen and masked scan as in mmaxloc1_8_r4, with positions maintained as 64-bit pairs and both the >= (BACK) and > comparison loops updating value and position through selp instructions. The extracted text ends inside this function.
%r544 bra $L94; $L116: .loc 1 420 9 ld.s8 %r225,[%r169]; setp.eq.u32 %r620,%r225,0; bra $L98; $L94: .loc 1 426 8 st.u64 [%r175],%r571; st.u64 [%r175+8],%r572; .loc 1 429 15 add.u64 %r38,%r38,1; .loc 1 430 12 add.u64 %r173,%r173,%r101; .loc 1 431 13 add.u64 %r174,%r174,%r103; .loc 1 432 12 add.u64 %r175,%r175,%r106; .loc 1 434 13 setp.ne.u64 %r547,%r38,%r166; @ %r547 bra $L111; .loc 1 442 10 add.u64 %r170,%r174,%r85; .loc 1 443 9 add.u64 %r171,%r175,%r93; .loc 1 445 7 @ %r618 bra $L60; add.u64 %r151,%frame,488; .loc 1 441 23 mov.u64 %r110,%r37; .loc 1 445 7 mov.u64 %r219,8; .loc 1 444 5 mov.u64 %r172,1; add.u64 %r623,%frame,360; add.u64 %r622,%frame,120; .loc 1 438 13 mov.u64 %r624,0; bra $L108; $L109: st.u64 [%r151],%r624; .loc 1 441 23 mul.lo.u64 %r110,%r123,%r122; .loc 1 442 24 mul.lo.u64 %r553,%r125,%r122; .loc 1 442 10 sub.u64 %r170,%r174,%r553; .loc 1 443 23 mul.lo.u64 %r554,%r127,%r122; .loc 1 443 9 shl.b64 %r555,%r554,4; sub.u64 %r171,%r175,%r555; .loc 1 444 5 add.u64 %r172,%r172,1; .loc 1 445 7 add.u64 %r151,%r151,8; add.u64 %r219,%r219,8; setp.eq.u64 %r556,%r172,%r631; @ %r556 bra $L60; $L108: .loc 1 453 16 ld.u64 %r557,[%r151]; add.u64 %r122,%r557,1; st.u64 [%r151],%r122; .loc 1 454 23 add.u64 %r559,%r632,%r219; ld.u64 %r123,[%r559]; .loc 1 454 13 sub.u64 %r560,%r123,%r110; shl.b64 %r561,%r560,2; add.u64 %r173,%r173,%r561; .loc 1 455 24 add.u64 %r562,%frame,%r219; ld.u64 %r125,[%r562]; .loc 1 455 14 add.u64 %r174,%r170,%r125; .loc 1 456 23 add.u64 %r564,%r622,%r219; ld.u64 %r127,[%r564]; .loc 1 456 13 shl.b64 %r565,%r127,4; add.u64 %r175,%r171,%r565; .loc 1 434 32 add.u64 %r567,%r623,%r219; ld.u64 %r130,[%r567]; .loc 1 434 13 setp.eq.u64 %r568,%r122,%r130; @ %r568 bra $L109; .loc 1 438 13 mov.u64 %r38,0; bra $L111; $L136: .loc 1 312 6 ld.u64 %r569,[%r234]; setp.eq.u64 %r570,%r569,0; @ ! 
%r570 bra $L75; bra $L110; $L60: .loc 1 460_gfortran_smaxloc1_16_r4 .visible .func _gfortran_smaxloc1_16_r, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L140; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L141; $L140:r152154; call _gfortran_maxloc1_16_r4490 7 bra $L139; $L141: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r297,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L143; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L144; $L150: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L145; bra $L198; $L143: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error144: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r303,0; $L149: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L147; .loc 1 505 17 st.u64 [%r139],%r36; bra $L148; $L147: .loc 1 508 12 st.u64 [%r139],%r303; $L148: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L149; bra $L150; $L198: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L152; bra $L151; $L145: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r302,0; $L155: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L153; .loc 1 513 17 st.u64 [%r43],%r41; bra $L154; $L153: .loc 1 517 12 st.u64 [%r43],%r302; $L154: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L155; bra $L199; $L151: .loc 1 524 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L157; add.u64 %r296,%frame,120; $L177: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L158; $L157: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L159; bra $L200; $L161: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L158: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L161; bra $L157; $L159: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L139; $L200: .loc 1 547 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L165: .loc 1 573 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L162; bra $L163; $L152: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L164; .loc 1 552 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error164: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L165; .loc 1 559 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L163; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L167: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L166; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error166: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L167; $L162: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L168; shl.b64 %r266,%r297,3; bra $L169; $L168: mov.u64 %r266,8; $L169: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L170: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L170; $L163: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,4; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 583 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L176: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L176; .loc 1 596 7 @ ! %r295 bra $L201; bra $L139; $L175: .loc 1 591 13 st.u64 [%r99],%r280; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L174; bra $L139; $L201: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; $L174: .loc 1 600 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L175; .loc 1 591 13 mov.u64 %r80,0; bra $L176; $L199: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r294,%r98,0; @ ! 
maxloc1_4_r8.o/ 1622802230
[PTX assembly for ../../../../libgfortran/generated/maxloc1_4_r8.c: declarations of _gfortran_maxloc1_4_r8, _gfortran_mmaxloc1_4_r8 and _gfortran_smaxloc1_4_r8; extern declarations of _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents; constant strings holding the MAXLOC runtime messages ("rank of return array incorrect in MAXLOC intrinsic: is %ld, should be %ld", "MAXLOC", "Funny sized logical array", "MASK argument", "Incorrect extent in return value of MAXLOC intrinsic in dimension %ld: is %ld, should be %ld", "return value"); and the bodies of the three functions. The instruction text is damaged by extraction.]
maxloc1_8_r8.o/ 1622802230
[PTX assembly for ../../../../libgfortran/generated/maxloc1_8_r8.c: declarations of _gfortran_maxloc1_8_r8, _gfortran_mmaxloc1_8_r8 and _gfortran_smaxloc1_8_r8; the same extern declarations and MAXLOC message strings as above; and the function bodies, with the section breaking off inside _gfortran_mmaxloc1_8_r8. The instruction text is damaged by extraction and truncated.]
[%r203],%r446; .loc 1 441 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 442 24 mul.lo.u64 %r421,%r116,%r113; .loc 1 442 10 sub.u64 %r168,%r172,%r421; .loc 1 443 23 mul.lo.u64 %r422,%r118,%r113; .loc 1 443 9 shl.b64 %r423,%r422,3; sub.u64 %r169,%r173,%r423; .loc 1 444 5 add.u64 %r170,%r170,1; .loc 1 445 7 add.u64 %r203,%r203,8; add.u64 %r202,%r202,8; setp.eq.u64 %r424,%r170,%r450; @ %r424 bra $L60; $L105: .loc 1 453 16 ld.u64 %r425,[%r203]; add.u64 %r113,%r425,1; st.u64 [%r203],%r113; .loc 1 454 23 add.u64 %r427,%r451,%r202; ld.u64 %r114,[%r427]; .loc 1 454 13 sub.u64 %r428,%r114,%r102; shl.b64 %r429,%r428,3; add.u64 %r171,%r171,%r429; .loc 1 455 24 add.u64 %r430,%frame,%r202; ld.u64 %r116,[%r430]; .loc 1 455 14 add.u64 %r172,%r168,%r116; .loc 1 456 23 add.u64 %r432,%r453,%r202; ld.u64 %r118,[%r432]; .loc 1 456 13 shl.b64 %r433,%r118,3; add.u64 %r173,%r169,%r433; .loc 1 434 32 add.u64 %r435,%r452,%r202; ld.u64 %r122,[%r435]; .loc 1 434 13 setp.eq.u64 %r436,%r113,%r122; @ %r436 bra $L106; .loc 1 438 13 mov.u64 %r185,0; bra $L108; $L135: .loc 1 312 6 ld.u64 %r437,[%r213]; setp.eq.u64 %r438,%r437,0; @ ! %r438 bra $L75; bra $L107; $L60: .loc 1 460_gfortran_smaxloc1_8_r8 .visible .func _gfortran_smaxloc1_8_r8, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L139; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L140; $L139:r152154; call _gfortran_maxloc1_8_r8490 7 bra $L138; $L140: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r296,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L142; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L143; $L149: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L144; bra $L197; $L142: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error143: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r302,0; $L148: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L146; .loc 1 505 17 st.u64 [%r139],%r36; bra $L147; $L146: .loc 1 508 12 st.u64 [%r139],%r302; $L147: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L148; bra $L149; $L197: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L151; bra $L150; $L144: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r301,0; $L154: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L152; .loc 1 513 17 st.u64 [%r43],%r41; bra $L153; $L152: .loc 1 517 12 st.u64 [%r43],%r301; $L153: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L154; bra $L198; $L150: .loc 1 524 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L156; add.u64 %r295,%frame,120; $L176: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L157; $L156: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L158; bra $L199; $L160: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L157: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L160; bra $L156; $L158: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L138; $L199: .loc 1 547 24 mov.u64 %r24696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L164: .loc 1 573 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L161; bra $L162; $L151: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L163; .loc 1 552 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error163: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L164; .loc 1 559 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L162; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L166: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L165; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error165: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L166; $L161: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ 
%r268 bra $L167; shl.b64 %r266,%r296,3; bra $L168; $L167: mov.u64 %r266,8; $L168: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L169: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L169; $L162: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,3; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 583 13 mov.u64 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L175: st.u64 [%r98],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L175; .loc 1 596 7 @ ! %r294 bra $L200; bra $L138; $L174: .loc 1 591 13 st.u64 [%r99],%r280; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L173; bra $L138; $L200: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; $L173: .loc 1 600 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,3; add.u64 %r98,%r98,%r291; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L174; .loc 1 591 13 mov.u64 %r80,0; bra $L175; $L198: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r293,%r98,0; @ ! %r293 bra $L151; bra $L176; $L138: .loc 1 605 maxloc1_16_r8.o/1622802230_gfortran_maxloc1_16_r8 .visible .func _gfortran_maxloc1_16_r8.file 1 "../../../../libgfortran/generated/maxloc1_16_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_16_r8 .visible .func _gfortran_mmaxloc1_16_r8_gfortran_smaxloc1_16_r8 .visible .func _gfortran_smaxloc1_16_r8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,76,79,67,0 
},105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_maxloc1_16_r8 .visible .func _gfortran_maxloc1_16_r8predu64 %r258; .reg .u64pred %r265; .reg .pred64pred %r304; .reg .u64 %r305; .reg .u64u64 %r318; .reg .u64u64 %r324; .reg .u64predpred %r334; .reg .pred %r336; .reg .u64 %r340; .reg .u64 %r343; .reg .u64 %r346; .reg .u64 %r349; .reg .u32 %r351; .reg .u64 %r355; .reg .u64 %r357; .reg .u64 %r360; .reg .u32 %r362; .reg .u64 %r366; .reg .pred %r367; .reg .pred %r369; .reg .pred %r370; .reg .u64 %r373; .reg .u64 %r376; .reg .u32pred %r391; .reg .u64 %r397; .reg .u6464mov.u64 %r191,%ar0; mov.u64 %r192,%ar1; mov.u64 %r193,%ar2; mov.u32 %r194,%ar3; .loc 1 58 10 ld.s8 %r22,[%r192+28]; .loc 1 58 38 add.u32 %r195,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r118,%r195; .loc 1 59 10 ld.u64 %r25,[%r193]; .loc 1 59 7 add.u64 %r119,%r25,-1; .loc 1 61 7 shr.u64 %r197,%r119,63; set.u32.lt.s64 %r199,%r118,%r119; neg.s32 %r200,%r199; cvt.u16.u64 %r203,%r197; cvt.u16.u32 %r204,%r200; or.b16 %r202,%r203,%r204; .loc 1 61 6 cvt.u32.u16 %r205,%r202; cvt.u16.u8 %r206,%r205; setp.eq.u16 %r207,%r206,0; @ %r207 bra $L2; .loc 1 63 7 cvt.u32.u32 %r210,%r22; cvt.s64.s8 %r209,%r210; st.u64 [%stack+8],%r209_gfortran_runtime_errorr213,%r119,%r119; add.u64 %r214,%r213,%r119; shl.b64 %r215,%r214,3; add.u64 %r216,%r192,%r215; ld.u64 %r30,[%r216+56]; ld.u64 %r32,[%r216+48]; .loc 1 71 9 ld.u64 %r122,[%r216+40]; .loc 1 73 3 setp.ne.u64 %r230,%r119,0; @ %r230 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r231,%r118,%r119; @ %r231 bra $L4; bra $L57; $L3: add.u64 %r181,%r192,40; add.u64 %r69,%frame,120; add.u64 %r97,%frame,240; add.u64 %r232,%r192,16; add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r67,%r232,%r236; .loc 1 79 12 mov.u64 %r447,0; $L8: .loc 1 75 18 ld.u64 %r237,[%r181]; st.u64 [%r69],%r237; .loc 1 76 19 ld.u64 %r239,[%r181+16]; add.u64 %r238,%r239,1; ld.u64 %r240,[%r181+8]; sub.u64 %r37,%r238,%r240; .loc 1 78 10 setp.lt.s64 %r241,%r37,0; @ %r241 bra $L6; .loc 1 76 17 st.u64 [%r97],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 [%r97],%r447; $L7: .loc 1 73 3 add.u64 %r181,%r181,24; add.u64 %r69,%r69,8; add.u64 %r97,%r97,8; setp.ne.u64 %r243,%r67,%r181; @ %r243 bra $L8; bra $L9; $L57: .loc 1 90 6 ld.u64 %r244,[%r191]; setp.eq.u64 %r245,%r244,0; @ ! 
%r245 bra $L11; bra $L10; $L4: add.u64 %r247,%r25,%r25; add.u64 %r248,%r247,%r25; shl.b64 %r249,%r248,3; add.u64 %r250,%r249,40; add.u64 %r110,%r192,%r250; shl.b64 %r251,%r25,3; add.u64 %r99,%r251,-8; add.u64 %r435,%frame,120; add.u64 %r101,%r435,%r99; add.u64 %r434,%frame,240; add.u64 %r96,%r434,%r99; add.u64 %r254,%r192,40; cvt.u32.u32 %r256,%r22; cvt.s64.s8 %r255,%r256; add.u64 %r258,%r255,%r255; add.u64 %r259,%r258,%r255; shl.b64 %r260,%r259,3; add.u64 %r184,%r254,%r260; .loc 1 87 12 mov.u64 %r446,0; $L14: .loc 1 83 18 ld.u64 %r261,[%r110]; st.u64 [%r101],%r261; .loc 1 84 19 ld.u64 %r263,[%r110+16]; add.u64 %r262,%r263,1; ld.u64 %r264,[%r110+8]; sub.u64 %r44,%r262,%r264; .loc 1 86 10 setp.lt.s64 %r265,%r44,0; @ %r265 bra $L12; .loc 1 84 17 st.u64 [%r96],%r44; bra $L13; $L12: .loc 1 87 12 st.u64 [%r96],%r446; $L13: .loc 1 81 3 add.u64 %r110,%r110,24; add.u64 %r101,%r101,8; add.u64 %r96,%r96,8; setp.ne.u64 %r267,%r110,%r184; @ %r267 bra $L14; bra $L58; $L10: .loc 1 94 7 setp.le.s64 %r268,%r118,0; @ %r268 bra $L16; add.u64 %r434,%frame,240; $L43: add.u64 %r146,%r191,40; mov.u64 %r143,%r434; cvt.u32.u32 %r270,%r22; cvt.s64.s8 %r269,%r270; add.u64 %r272,%r269,%r269; add.u64 %r273,%r272,%r269; shl.b64 %r274,%r273,3; add.u64 %r275,%r191,16; add.u64 %r111,%r274,%r275; .loc 1 97 10 mov.u64 %r103,1; .loc 1 101 4 mov.u64 %r298,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r276,0; st.u64 [%r191+8],%r276; .loc 1 106 28 cvt.u16.u32 %r279,%r22; add.u16 %r278,%r279,-1; cvt.u32.u16 %r280,%r278; st.u8 [%r191+28],%r280; .loc 1 108 20 add.u32 %r281,%r22,-2; cvt.s64.s32 %r53,%r281; add.u64 %r283,%r53,%r53; add.u64 %r284,%r283,%r53; shl.b64 %r285,%r284,3; add.u64 %r286,%r191,%r285; .loc 1 108 67 shl.b64 %r288,%r53,3; add.u64 %r289,%frame,%r288; .loc 1 108 59 ld.u64 %r291,[%r286+40]; ld.u64 %r292,[%r289+240]; mul.lo.u64 %r124,%r291,%r292; .loc 1 110 29 mov.u64 %r294,16call (%value_in),_gfortrani_xmallocarray295,[%value_in]; } .loc 1 110 27 st.u64 [%r191],%r295; .loc 1 111 10 setp.eq.u64 %r297,%r124,0; @ ! 
%r297 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r103,%r49,%r103; $L17: .loc 1 101 4 st.u64 [%r146+8],%r298; ld.u64 %r49,[%r143]; add.u64 %r299,%r49,-1; st.u64 [%r146+16],%r299; st.u64 [%r146],%r103; .loc 1 94 7 add.u64 %r146,%r146,24; add.u64 %r143,%r143,8; setp.ne.u64 %r300,%r111,%r146; @ %r300 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r191+48],%r124; mov.u64 %r302,-1; st.u64 [%r191+56],%r302; mov.u64 %r303,1; st.u64 [%r191+40],%r303; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r59,[%r191+28]; .loc 1 121 10 setp.eq.u64 %r304,%r59,%r118; @ %r304 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r59;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r307,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r308,[%r307+36]; setp.eq.u32 %r309,%r308,0; @ %r309 bra $L21; .loc 1 128 2 add.u64 %r434,%frame,240; cvta.const.u64 %r313,$LC2r4312313; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r315,%r118,0; @ %r315 bra $L24; $L27: .loc 1 68 9 add.u64 %r316,%r30,1; .loc 1 68 7 sub.u64 %r120,%r316,%r32; max.s64 %r64,%r120,0; .loc 1 140 8 ld.u64 %r133,[%r192]; .loc 1 141 8 ld.u64 %r134,[%r191]; .loc 1 168 42 shl.b64 %r66,%r122,3; ld.u64 %r142,[%frame+360]; .loc 1 194 22 ld.u64 %r71,[%frame+120]; .loc 1 194 12 shl.b64 %r72,%r71,3; .loc 1 195 22 ld.u64 %r74,[%frame]; .loc 1 195 12 shl.b64 %r75,%r74,4; .loc 1 197 32 ld.u64 %r61,[%frame+240]; .loc 1 204 23 mul.lo.u64 %r317,%r61,%r71; .loc 1 204 9 shl.b64 %r318,%r317,3; neg.s64 %r154,%r318; .loc 1 205 23 mul.lo.u64 %r319,%r61,%r74; .loc 1 205 9 shl.b64 %r320,%r319,4; neg.s64 %r140,%r320; setp.gt.s64 %r436,%r120,0; setp.le.s64 %r437,%r118,1; add.u64 %r439,%r64,-1; setp.eq.u32 %r440,%r194,0; cvt.u32.u32 %r441,%r22; cvt.s64.s8 %r442,%r441; add.u64 %r443,%r442,-1; add.u64 %r444,%frame,120; add.u64 %r445,%frame,240; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r191,40; mov.u64 %r162,%frame; cvt.u32.u32 %r322,%r22; cvt.s64.s8 %r321,%r322; add.u64 %r148,%r321,-1; .loc 1 132 3 mov.u64 %r135,0; add.u64 %r434,%frame,240; .loc 1 134 16 mov.u64 %r323,%r135; $L26: st.u64 [%r165],%r323; .loc 1 135 18 ld.u64 %r324,[%r164]; st.u64 [%r162],%r324; .loc 1 136 17 shl.b64 %r326,%r135,3; add.u64 %r327,%r434,%r326; .loc 1 136 10 ld.u64 %r328,[%r327]; setp.le.s64 %r329,%r328,0; @ %r329 bra $L1; .loc 1 132 26 add.u64 %r135,%r135,1; .loc 1 132 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r330,%r135,%r148; @ %r330 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r436 bra $L44; .loc 1 159 10 mov.u64 %r332,0; st.u64 [%r134],%r332; st.u64 [%r134+8],%r332; bra $L29; $L44: mov.u64 %r126,%r133; .loc 1 168 19 mov.u64 %r145,0; $L28: .loc 1 170 7 ld.f64 %r108,[%r126]; mov.u64 %r127,%r145; .loc 1 168 34 add.u64 %r145,%r145,1; .loc 1 170 6 setp.ge.f64 %r334,%r108,0dfff0000000000000; @ ! 
%r334 bra $L59; .loc 1 173 14 mov.u64 %r413,%r145; shr.s64 %r414,%r413,63; .loc 1 180 6 setp.le.s64 %r336,%r64,%r127; @ %r336 bra $L32; add.u64 %r417,%r127,1; shr.s64 %r418,%r417,63; sub.u64 %r340,%r439,%r127; shr.s64 %r343,%r127,63; add.u64 %r346,%r127,2; set.u32.lt.u64 %r351,%r346,%r127; cvt.s64.s32 %r349,%r351; sub.u64 %r355,%r343,%r349; add.u64 %r357,%r340,%r346; set.u32.lt.u64 %r362,%r357,%r340; cvt.s64.s32 %r360,%r362; sub.u64 %r366,%r355,%r360; .loc 1 170 7 mov.f64 %r185,%r108; .loc 1 180 6 mov.u64 %r429,1; bra $L33; $L59: .loc 1 168 42 add.u64 %r126,%r126,%r66; .loc 1 168 12 setp.gt.s64 %r367,%r64,%r145; @ %r367 bra $L28; .loc 1 157 9 mov.u64 %r413,1; mov.u64 %r414,0; bra $L32; $L33: .loc 1 182 7 @ %r440 bra $L34; .loc 1 182 6 setp.le.f64 %r369,%r108,%r185; @ %r369 bra $L35; bra $L36; $L34: setp.lt.f64 %r370,%r108,%r185; @ ! %r370 bra $L36; $L35: .loc 1 185 14 mov.u64 %r413,%r417; mov.u64 %r414,%r418; .loc 1 184 14 mov.f64 %r108,%r185; $L36: .loc 1 180 31 add.u64 %r126,%r126,%r66; .loc 1 180 6 add.u64 %r373,%r417,%r429; set.u32.lt.u64 %r378,%r373,%r417; cvt.s64.s32 %r376,%r378; mov.u64 %r417,%r373; sub.u64 %r418,%r418,%r376; setp.ne.u64 %r385,%r357,%r417; @ %r385 bra $L46; setp.ne.u64 %r388,%r366,%r418; @ ! %r388 bra $L32; $L46: .loc 1 182 14 ld.f64 %r185,[%r126]; bra $L33; $L32: .loc 1 189 12 st.u64 [%r134],%r413; st.u64 [%r134+8],%r414; $L29: .loc 1 193 15 add.u64 %r142,%r142,1; .loc 1 194 12 add.u64 %r133,%r133,%r72; .loc 1 195 12 add.u64 %r134,%r134,%r75; .loc 1 197 13 setp.ne.u64 %r391,%r61,%r142; @ %r391 bra $L25; .loc 1 204 9 add.u64 %r130,%r133,%r154; .loc 1 205 9 add.u64 %r131,%r134,%r140; .loc 1 207 7 @ %r437 bra $L1; add.u64 %r117,%frame,368; mov.u64 %r180,8; .loc 1 206 5 mov.u64 %r132,1; .loc 1 201 13 mov.u64 %r438,0; bra $L41; $L42: st.u64 [%r117],%r438; .loc 1 204 23 mul.lo.u64 %r397,%r89,%r88; .loc 1 204 9 shl.b64 %r398,%r397,3; sub.u64 %r130,%r133,%r398; .loc 1 205 23 mul.lo.u64 %r399,%r92,%r88; .loc 1 205 9 shl.b64 %r400,%r399,4; sub.u64 %r131,%r134,%r400; .loc 1 206 5 add.u64 %r132,%r132,1; .loc 1 207 7 add.u64 %r117,%r117,8; add.u64 %r180,%r180,8; setp.eq.u64 %r401,%r132,%r443; @ %r401 bra $L1; $L41: .loc 1 215 16 ld.u64 %r402,[%r117]; add.u64 %r88,%r402,1; st.u64 [%r117],%r88; .loc 1 216 23 add.u64 %r404,%r444,%r180; ld.u64 %r89,[%r404]; .loc 1 216 13 shl.b64 %r405,%r89,3; add.u64 %r133,%r130,%r405; .loc 1 217 23 add.u64 %r406,%frame,%r180; ld.u64 %r92,[%r406]; .loc 1 217 13 shl.b64 %r407,%r92,4; add.u64 %r134,%r131,%r407; .loc 1 197 32 add.u64 %r409,%r445,%r180; ld.u64 %r95,[%r409]; .loc 1 197 13 setp.eq.u64 %r410,%r88,%r95; @ %r410 bra $L42; .loc 1 201 13 mov.u64 %r142,0; bra $L25; $L58: .loc 1 90 6 ld.u64 %r411,[%r191]; setp.eq.u64 %r412,%r411,0; @ ! 
%r412 bra $L11; bra $L43; $L1: .loc 1 221_gfortran_mmaxloc1_16_r8 .visible .func _gfortran_mmaxloc1_16_r8, .param .u32 %in_ar4) {608pred %r239; .reg .u32 %r244; .reg .u64 %r246; .reg .u32 %r248; .reg .u32 %r249; .reg .u16 %r251; .reg .u16 %r252; .reg .u16 %r253; .reg .u32u64 %r268; .reg .u64 %r275; .reg .pred %r276; .reg .u32 %r277; .reg .u32 %r278; .reg .u32 %r280; .reg .u32pred %r291; .reg .pred %r293; .reg .u64predu64 %r312; .reg .u64u64 %r317; .reg .u64 %r318; .reg .u64u64pred %r356; .reg .pred %r357; .reg .u64 %r358; .reg .u32pred %r410; .reg .u64pred %r419; .reg .predpred %r427; .reg .predu64 %r454; .reg .u64 %r457; .reg .u32 %r459; .reg .u64 %r463; .reg .u64 %r465; .reg .u64pred %r541; .reg .predu64u64 %r565; .reg .u64 %r567; .reg .pred %r568; .reg .u64 %r569; .reg .pred %r570; .reg .u64 %r571; .reg .u64 %r572; .reg .u64 %r573; .reg .u64.reg .u64 %r633; .reg .u64 %r634; mov.u64 %r234,%ar0; mov.u64 %r235,%ar1; mov.u64 %r236,%ar2; mov.u64 %r237,%ar3; mov.u32 %r238,%ar4; .loc 1 251 6 setp.ne.u64 %r239,%r237,0; @ %r239 bra $L61; .loc 1 254 7r2236238; call _gfortran_maxloc1_16_r87 bra $L60; $L61: .loc 1 261 10 ld.u64 %r22,[%r236]; .loc 1 261 7 add.u64 %r152,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r235+28]; .loc 1 262 38 add.u32 %r244,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r153,%r244; .loc 1 265 7 shr.u64 %r246,%r152,63; set.u32.gt.s64 %r248,%r152,%r153; neg.s32 %r249,%r248; cvt.u16.u64 %r252,%r246; cvt.u16.u32 %r253,%r249; or.b16 %r251,%r252,%r253; .loc 1 265 6 cvt.u32.u16 %r254,%r251; cvt.u16.u8 %r255,%r254; setp.eq.u16 %r256,%r255,0; @ %r256 bra $L63; .loc 1 267 7 cvt.u32.u32 %r259,%r23; cvt.s64.s8 %r258,%r259; st.u64 [%stack+8],%r258; st.u64 [%stack],%r22; cvta.const.u64 %r25_gfortran_runtime_error63: .loc 1 272 9 add.u64 %r621,%r152,%r152; add.u64 %r617,%r621,%r152; shl.b64 %r264,%r617,3; add.u64 %r265,%r235,%r264; ld.u64 %r268,[%r265+56]; add.u64 %r267,%r268,1; .loc 1 272 7 ld.u64 %r275,[%r265+48]; sub.u64 %r154,%r267,%r275; .loc 1 273 6 setp.le.s64 %r276,%r154,0; @ %r276 bra $L60; .loc 1 276 9 ld.u64 %r174,[%r237]; .loc 1 278 15 ld.u64 %r33,[%r237+16]; .loc 1 280 22 cvt.u32.u64 %r34,%r33; .loc 1 280 53 add.u32 %r277,%r34,-4; and.b32 %r278,%r277,-5; set.u32.eq.u32 %r280,%r278,0; neg.s32 %r281,%r280; .loc 1 280 22 add.u32 %r282,%r34,-1; set.u32.le.u32 %r284,%r282,1; neg.s32 %r285,%r284; .loc 1 280 6 cvt.u16.u32 %r287,%r281; cvt.u16.u32 %r288,%r285; or.b16 %r286,%r287,%r288; cvt.u32.u16 %r289,%r286; cvt.u16.u8 %r290,%r289; setp.ne.u16 %r291,%r290,0; @ %r291 bra $L65; .loc 1 282 7 setp.ne.u32 %r293,%r34,16; @ %r293 bra $L66; $L65: .loc 1 289 9 shl.b64 %r297,%r617,3; add.u64 %r298,%r235,%r297; ld.u64 %r157,[%r298+40]; .loc 1 290 12 add.u64 %r304,%r237,%r297; ld.u64 %r40,[%r304+40]; .loc 1 292 3 setp.ne.u64 %r306,%r152,0; @ %r306 bra $L67; $L73: .loc 1 302 3 setp.lt.s64 %r307,%r152,%r153; @ %r307 bra $L68; bra $L135; $L66: .loc 1 287 5 cvta.const.u64 %r30830_gfortran_runtime_error67: add.u64 %r91,%r235,40; add.u64 %r196,%r237,40; add.u64 %r311,%r22,%r22; add.u64 %r312,%r311,%r22; shl.b64 %r313,%r312,3; add.u64 %r314,%r235,16; add.u64 %r226,%r313,%r314; .loc 1 292 3 mov.u64 %r107,0; add.u64 %r619,%frame,240; add.u64 %r623,%frame,360; .loc 1 299 12 mov.u64 %r634,%r107; $L72: .loc 1 294 18 add.u64 %r316,%r619,%r107; ld.u64 %r317,[%r91]; st.u64 [%r316],%r317; .loc 1 295 18 add.u64 %r318,%frame,%r107; .loc 1 295 20 ld.u64 %r320,[%r196]; mul.lo.u64 %r319,%r320,%r33; .loc 1 295 18 st.u64 [%r318],%r319; .loc 1 296 19 ld.u64 %r322,[%r91+16]; add.u64 %r321,%r322,1; ld.u64 %r323,[%r91+8]; sub.u64 
%r52,%r321,%r323; .loc 1 298 10 setp.lt.s64 %r324,%r52,0; @ %r324 bra $L70; .loc 1 296 17 add.u64 %r326,%r623,%r107; st.u64 [%r326],%r52; bra $L71; $L70: .loc 1 299 12 add.u64 %r328,%r623,%r107; st.u64 [%r328],%r634; $L71: .loc 1 292 3 add.u64 %r91,%r91,24; add.u64 %r196,%r196,24; add.u64 %r107,%r107,8; setp.ne.u64 %r330,%r91,%r226; @ %r330 bra $L72; bra $L73; $L135: .loc 1 312 6 ld.u64 %r331,[%r234]; setp.eq.u64 %r332,%r331,0; @ ! %r332 bra $L75; bra $L74; $L68: add.u64 %r334,%r22,%r22; add.u64 %r335,%r334,%r22; shl.b64 %r336,%r335,3; add.u64 %r180,%r336,40; add.u64 %r57,%r235,%r180; add.u64 %r182,%r237,%r180; shl.b64 %r337,%r22,3; add.u64 %r74,%r337,-8; cvt.u32.u32 %r339,%r23; cvt.s64.s8 %r338,%r339; shl.b64 %r340,%r338,3; add.u64 %r88,%r340,-8; add.u64 %r619,%frame,240; add.u64 %r623,%frame,360; .loc 1 309 12 mov.u64 %r633,0; $L78: .loc 1 304 18 add.u64 %r342,%r619,%r74; ld.u64 %r343,[%r57]; st.u64 [%r342],%r343; .loc 1 305 18 add.u64 %r344,%frame,%r74; .loc 1 305 20 ld.u64 %r346,[%r182]; mul.lo.u64 %r345,%r346,%r33; .loc 1 305 18 st.u64 [%r344],%r345; .loc 1 306 19 ld.u64 %r348,[%r57+16]; add.u64 %r347,%r348,1; ld.u64 %r349,[%r57+8]; sub.u64 %r64,%r347,%r349; .loc 1 308 10 setp.lt.s64 %r350,%r64,0; @ %r350 bra $L76; .loc 1 306 17 add.u64 %r352,%r623,%r74; st.u64 [%r352],%r64; bra $L77; $L76: .loc 1 309 12 add.u64 %r354,%r623,%r74; st.u64 [%r354],%r633; $L77: .loc 1 302 3 add.u64 %r57,%r57,24; add.u64 %r182,%r182,24; add.u64 %r74,%r74,8; setp.ne.u64 %r356,%r74,%r88; @ %r356 bra $L78; bra $L136; $L74: .loc 1 316 7 setp.eq.u64 %r357,%r153,0; @ %r357 bra $L80; add.u64 %r623,%frame,360; $L110: add.u64 %r133,%r234,40; mov.u64 %r131,%r623; cvt.u32.u32 %r359,%r23; cvt.s64.s8 %r358,%r359; add.u64 %r361,%r358,%r358; add.u64 %r362,%r361,%r358; shl.b64 %r363,%r362,3; add.u64 %r364,%r234,16; add.u64 %r60,%r363,%r364; .loc 1 319 10 mov.u64 %r136,1; .loc 1 323 4 mov.u64 %r383,0; bra $L81; $L80: .loc 1 327 20 add.u32 %r365,%r23,-2; cvt.s64.s32 %r72,%r365; add.u64 %r367,%r72,%r72; add.u64 %r368,%r367,%r72; shl.b64 %r369,%r368,3; add.u64 %r370,%r234,%r369; .loc 1 327 67 shl.b64 %r372,%r72,3; add.u64 %r373,%frame,%r372; .loc 1 327 59 ld.u64 %r375,[%r370+40]; ld.u64 %r376,[%r373+360]; mul.lo.u64 %r161,%r375,%r376; .loc 1 329 24 mov.u64 %r377,0; st.u64 [%r234+8],%r377; .loc 1 330 28 cvt.u16.u32 %r380,%r23; add.u16 %r379,%r380,-1; cvt.u32.u16 %r381,%r379; st.u8 [%r234+28],%r381; .loc 1 332 10 setp.eq.u64 %r382,%r161,0; @ %r382 bra $L82; bra $L137; $L84: .loc 1 321 47 mul.lo.u64 %r136,%r69,%r136; $L81: .loc 1 323 4 st.u64 [%r133+8],%r383; ld.u64 %r69,[%r131]; add.u64 %r384,%r69,-1; st.u64 [%r133+16],%r384; st.u64 [%r133],%r136; .loc 1 316 7 add.u64 %r133,%r133,24; add.u64 %r131,%r131,8; setp.ne.u64 %r385,%r60,%r133; @ %r385 bra $L84; bra $L80; $L82: .loc 1 335 4 st.u64 [%r234+48],%r161; mov.u64 %r387,-1; st.u64 [%r234+56],%r387; mov.u64 %r388,1; st.u64 [%r234+40],%r388; .loc 1 336 4 bra $L60; $L137: .loc 1 339 24 mov.u64 %r390,16390; call (%value_in),_gfortrani_xmallocarray91,[%value_in]; } .loc 1 339 22 st.u64 [%r234],%r391; bra $L85; $L75: .loc 1 344 19 ld.s8 %r393,[%r234+28]; .loc 1 344 10 setp.eq.u64 %r394,%r393,%r153; @ %r394 bra $L86; .loc 1 345 2 cvta.const.u64 %r3955_gfortran_runtime_error86: .loc 1 347 11 cvta.global.u64 %r397,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r398,[%r397+36]; setp.eq.u32 %r399,%r398,0; @ %r399 bra $L85; .loc 1 349 4 add.u64 %r623,%frame,360; cvta.const.u64 %r403,$LC2;r62302403; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r407,$LC207403; call 
_gfortrani_bounds_equal_extents$L85: .loc 1 356 3 setp.ne.u64 %r409,%r153,0; @ %r409 bra $L87; $L90: .loc 1 364 8 ld.u64 %r175,[%r234]; .loc 1 365 8 ld.u64 %r173,[%r235]; .loc 1 367 9 setp.ne.u64 %r410,%r173,0; @ %r410 bra $L88; bra $L60; $L87: add.u64 %r149,%frame,480; add.u64 %r148,%r234,40; add.u64 %r144,%frame,120; cvt.u32.u32 %r412,%r23; cvt.s64.s8 %r411,%r412; add.u64 %r134,%r411,-1; .loc 1 356 3 mov.u64 %r176,0; add.u64 %r623,%frame,360; .loc 1 358 16 mov.u64 %r413,%r176; $L89: st.u64 [%r149],%r413; .loc 1 359 18 ld.u64 %r414,[%r148]; st.u64 [%r144],%r414; .loc 1 360 17 shl.b64 %r416,%r176,3; add.u64 %r417,%r623,%r416; .loc 1 360 10 ld.u64 %r418,[%r417]; setp.le.s64 %r419,%r418,0; @ %r419 bra $L60; .loc 1 356 26 add.u64 %r176,%r176,1; .loc 1 356 3 add.u64 %r149,%r149,8; add.u64 %r148,%r148,24; add.u64 %r144,%r144,8; setp.ne.u64 %r420,%r134,%r176; @ %r420 bra $L89; bra $L90; $L88: .loc 1 290 12 mul.lo.u64 %r43,%r40,%r33; .loc 1 386 32 shl.b64 %r87,%r157,3; ld.u64 %r38,[%frame+480]; .loc 1 430 22 ld.u64 %r100,[%frame+240]; .loc 1 430 12 shl.b64 %r101,%r100,3; .loc 1 431 23 ld.u64 %r103,[%frame]; .loc 1 432 22 ld.u64 %r105,[%frame+120]; .loc 1 432 12 shl.b64 %r106,%r105,4; .loc 1 434 32 ld.u64 %r166,[%frame+360]; .loc 1 441 23 mul.lo.u64 %r37,%r166,%r100; .loc 1 442 24 mul.lo.u64 %r421,%r166,%r103; .loc 1 442 10 neg.s64 %r85,%r421; .loc 1 443 23 mul.lo.u64 %r422,%r166,%r105; .loc 1 443 9 shl.b64 %r423,%r422,4; neg.s64 %r93,%r423; setp.le.s64 %r618,%r153,1; .loc 1 356 3 mov.u64 %r626,1; .loc 1 408 9 setp.ne.u32 %r627,%r238,0; add.u64 %r628,%r154,-1; cvt.u32.u32 %r629,%r23; cvt.s64.s8 %r630,%r629; add.u64 %r631,%r630,-1; add.u64 %r632,%frame,240; $L111: .loc 1 356 3 mov.u64 %r169,%r174; mov.u64 %r168,%r173; mov.u64 %r573,%r626; mov.u64 %r574,0; .loc 1 383 17 mov.u64 %r571,%r574; mov.u64 %r572,%r574; .loc 1 386 9 mov.u64 %r200,%r574; $L96: .loc 1 389 7 ld.s8 %r225,[%r169]; mov.u64 %r231,%r200; add.u64 %r200,%r200,1; .loc 1 389 6 setp.eq.u32 %r620,%r225,0; @ %r620 bra $L91; .loc 1 392 10 or.b64 %r425,%r571,%r572; setp.ne.u64 %r427,%r425,0; .loc 1 393 17 selp.u64 %r571,%r571,%r573,%r427; selp.u64 %r572,%r572,%r574,%r427; .loc 1 394 11 ld.f64 %r145,[%r168]; .loc 1 394 10 setp.ge.f64 %r428,%r145,0dfff0000000000000; @ ! %r428 bra $L91; .loc 1 398 11 shr.s64 %r429,%r200,63; .loc 1 404 9 setp.le.s64 %r430,%r154,%r231; @ ! 
%r430 bra $L138; bra $L94; $L91: .loc 1 386 32 add.u64 %r168,%r168,%r87; .loc 1 386 47 add.u64 %r169,%r169,%r43; .loc 1 386 2 add.u64 %r433,%r573,%r626; set.u32.lt.u64 %r438,%r433,%r573; cvt.s64.s32 %r436,%r438; mov.u64 %r573,%r433; sub.u64 %r574,%r574,%r436; setp.ne.u64 %r443,%r154,%r200; @ %r443 bra $L96; bra $L94; $L138: .loc 1 408 9 @ %r627 bra $L97; add.u64 %r579,%r231,1; shr.s64 %r580,%r579,63; sub.u64 %r448,%r628,%r231; shr.s64 %r451,%r231,63; add.u64 %r454,%r231,2; set.u32.lt.u64 %r459,%r454,%r231; cvt.s64.s32 %r457,%r459; sub.u64 %r463,%r451,%r457; add.u64 %r465,%r448,%r454; set.u32.lt.u64 %r470,%r465,%r448; cvt.s64.s32 %r468,%r470; sub.u64 %r474,%r463,%r468; .loc 1 398 11 mov.u64 %r571,%r200; mov.u64 %r572,%r429; bra $L98; $L97: add.u64 %r583,%r231,1; shr.s64 %r584,%r583,63; sub.u64 %r478,%r628,%r231; shr.s64 %r481,%r231,63; add.u64 %r484,%r231,2; set.u32.lt.u64 %r489,%r484,%r231; cvt.s64.s32 %r487,%r489; sub.u64 %r493,%r481,%r487; add.u64 %r495,%r478,%r484; set.u32.lt.u64 %r500,%r495,%r478; cvt.s64.s32 %r498,%r500; sub.u64 %r504,%r493,%r498; mov.u64 %r571,%r200; mov.u64 %r572,%r429; $L102: .loc 1 411 8 @ %r620 bra $L99; .loc 1 411 18 ld.f64 %r90,[%r168]; .loc 1 411 15 setp.ge.f64 %r506,%r90,%r145; .loc 1 414 16 selp.u64 %r571,%r583,%r571,%r506; selp.u64 %r572,%r584,%r572,%r506; selp.f64 %r145,%r90,%r145,%r506; $L99: .loc 1 409 33 add.u64 %r168,%r168,%r87; .loc 1 409 48 add.u64 %r169,%r169,%r43; .loc 1 409 8 add.u64 %r509,%r583,%r626; set.u32.lt.u64 %r514,%r509,%r583; cvt.s64.s32 %r512,%r514; mov.u64 %r583,%r509; sub.u64 %r584,%r584,%r512; setp.ne.u64 %r521,%r495,%r583; @ %r521 bra $L114; setp.ne.u64 %r524,%r504,%r584; @ ! %r524 bra $L94; $L114: .loc 1 411 9 ld.s8 %r225,[%r169]; setp.eq.u32 %r620,%r225,0; bra $L102; $L98: .loc 1 420 8 @ %r620 bra $L103; .loc 1 420 18 ld.f64 %r94,[%r168]; .loc 1 420 15 setp.gt.f64 %r526,%r94,%r145; .loc 1 423 16 selp.u64 %r571,%r579,%r571,%r526; selp.u64 %r572,%r580,%r572,%r526; selp.f64 %r145,%r94,%r145,%r526; $L103: .loc 1 418 33 add.u64 %r168,%r168,%r87; .loc 1 418 48 add.u64 %r169,%r169,%r43; .loc 1 418 8 add.u64 %r529,%r579,%r626; set.u32.lt.u64 %r534,%r529,%r579; cvt.s64.s32 %r532,%r534; mov.u64 %r579,%r529; sub.u64 %r580,%r580,%r532; setp.ne.u64 %r541,%r465,%r579; @ %r541 bra $L116; setp.ne.u64 %r544,%r474,%r580; @ ! 
%r544 bra $L94; $L116: .loc 1 420 9 ld.s8 %r225,[%r169]; setp.eq.u32 %r620,%r225,0; bra $L98; $L94: .loc 1 426 8 st.u64 [%r175],%r571; st.u64 [%r175+8],%r572; .loc 1 429 15 add.u64 %r38,%r38,1; .loc 1 430 12 add.u64 %r173,%r173,%r101; .loc 1 431 13 add.u64 %r174,%r174,%r103; .loc 1 432 12 add.u64 %r175,%r175,%r106; .loc 1 434 13 setp.ne.u64 %r547,%r38,%r166; @ %r547 bra $L111; .loc 1 442 10 add.u64 %r170,%r174,%r85; .loc 1 443 9 add.u64 %r171,%r175,%r93; .loc 1 445 7 @ %r618 bra $L60; add.u64 %r151,%frame,488; .loc 1 441 23 mov.u64 %r110,%r37; .loc 1 445 7 mov.u64 %r219,8; .loc 1 444 5 mov.u64 %r172,1; add.u64 %r623,%frame,360; add.u64 %r622,%frame,120; .loc 1 438 13 mov.u64 %r624,0; bra $L108; $L109: st.u64 [%r151],%r624; .loc 1 441 23 mul.lo.u64 %r110,%r123,%r122; .loc 1 442 24 mul.lo.u64 %r553,%r125,%r122; .loc 1 442 10 sub.u64 %r170,%r174,%r553; .loc 1 443 23 mul.lo.u64 %r554,%r127,%r122; .loc 1 443 9 shl.b64 %r555,%r554,4; sub.u64 %r171,%r175,%r555; .loc 1 444 5 add.u64 %r172,%r172,1; .loc 1 445 7 add.u64 %r151,%r151,8; add.u64 %r219,%r219,8; setp.eq.u64 %r556,%r172,%r631; @ %r556 bra $L60; $L108: .loc 1 453 16 ld.u64 %r557,[%r151]; add.u64 %r122,%r557,1; st.u64 [%r151],%r122; .loc 1 454 23 add.u64 %r559,%r632,%r219; ld.u64 %r123,[%r559]; .loc 1 454 13 sub.u64 %r560,%r123,%r110; shl.b64 %r561,%r560,3; add.u64 %r173,%r173,%r561; .loc 1 455 24 add.u64 %r562,%frame,%r219; ld.u64 %r125,[%r562]; .loc 1 455 14 add.u64 %r174,%r170,%r125; .loc 1 456 23 add.u64 %r564,%r622,%r219; ld.u64 %r127,[%r564]; .loc 1 456 13 shl.b64 %r565,%r127,4; add.u64 %r175,%r171,%r565; .loc 1 434 32 add.u64 %r567,%r623,%r219; ld.u64 %r130,[%r567]; .loc 1 434 13 setp.eq.u64 %r568,%r122,%r130; @ %r568 bra $L109; .loc 1 438 13 mov.u64 %r38,0; bra $L111; $L136: .loc 1 312 6 ld.u64 %r569,[%r234]; setp.eq.u64 %r570,%r569,0; @ ! 
%r570 bra $L75; bra $L110; $L60: .loc 1 460_gfortran_smaxloc1_16_r8 .visible .func _gfortran_smaxloc1_16_r8, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L140; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L141; $L140:r152154; call _gfortran_maxloc1_16_r8490 7 bra $L139; $L141: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r297,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L143; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L144; $L150: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L145; bra $L198; $L143: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error144: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r303,0; $L149: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L147; .loc 1 505 17 st.u64 [%r139],%r36; bra $L148; $L147: .loc 1 508 12 st.u64 [%r139],%r303; $L148: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L149; bra $L150; $L198: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L152; bra $L151; $L145: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r302,0; $L155: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L153; .loc 1 513 17 st.u64 [%r43],%r41; bra $L154; $L153: .loc 1 517 12 st.u64 [%r43],%r302; $L154: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L155; bra $L199; $L151: .loc 1 524 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L157; add.u64 %r296,%frame,120; $L177: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L158; $L157: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L159; bra $L200; $L161: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L158: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L161; bra $L157; $L159: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L139; $L200: .loc 1 547 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L165: .loc 1 573 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L162; bra $L163; $L152: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L164; .loc 1 552 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error164: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L165; .loc 1 559 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L163; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L167: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L166; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error166: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L167; $L162: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L168; shl.b64 %r266,%r297,3; bra $L169; $L168: mov.u64 %r266,8; $L169: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L170: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L170; $L163: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,4; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 583 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L176: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L176; .loc 1 596 7 @ ! %r295 bra $L201; bra $L139; $L175: .loc 1 591 13 st.u64 [%r99],%r280; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L174; bra $L139; $L201: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; $L174: .loc 1 600 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L175; .loc 1 591 13 mov.u64 %r80,0; bra $L176; $L199: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r294,%r98,0; @ ! 
%r294 bra $L152; bra $L177; $L139: .loc 1 605maxloc1_4_r10.o/162280223091 maxloc1_8_r16.o/1622802230maxval_i1.o/_gfortran_maxval_i1 .visible .func _gfortran_maxval_i1fortran/generated/maxval_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxval_i1 .visible .func _gfortran_mmaxval_i1gfortran_smaxval_i1 .visible .func _gfortran_smaxval_i1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,86,65,76114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,86,65,76,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,86,65,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_maxval_i1 .visible .func _gfortran_maxval_i148predpred %r209; .reg .pred %r210; .reg .u64 %r211; .reg .u64 %r213; .reg .u64pred %r220; .reg .pred %r222; .reg .u64pred %r247; .reg .u64 %r248; .reg .u32 %r249; .reg .u64 %r251; .reg .u64 %r252; .reg .u64 %r253; .reg .u64 %r254; .reg .u64 %r255; .reg .u16 %r257; .reg .u16predpred %r288; .reg .u64 %r291; .reg .u64 %r292; .reg .predpred %r304; .reg .pred64pred %r335; .reg .u64 %r336; .reg .u64mov.u64 %r171,%ar0; mov.u64 %r172,%ar1; mov.u64 %r173,%ar2; .loc 1 55 10 ld.s8 %r22,[%r172+28]; .loc 1 55 38 add.u32 %r174,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r174; .loc 1 56 10 ld.u64 %r25,[%r173]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r176,%r104,63; set.u32.lt.s64 %r178,%r103,%r104; neg.s32 %r179,%r178; cvt.u16.u64 %r182,%r176; cvt.u16.u32 %r183,%r179; or.b16 %r181,%r182,%r183; .loc 1 58 6 cvt.u32.u16 %r184,%r181; cvt.u16.u8 %r185,%r184; setp.eq.u16 %r186,%r185,0; @ %r186 bra $L2; .loc 1 60 7 cvt.u32.u32 %r189,%r22; cvt.s64.s8 %r188,%r189; st.u64 [%stack+8],%r18818187_gfortran_runtime_errorr192,%r104,%r104; add.u64 %r193,%r192,%r104; shl.b64 %r194,%r193,3; add.u64 %r195,%r172,%r194; ld.u64 %r31,[%r195+56]; ld.u64 %r33,[%r195+48]; .loc 1 68 9 ld.u64 %r107,[%r195+40]; .loc 1 70 3 setp.ne.u64 %r209,%r104,0; @ %r209 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r210,%r103,%r104; @ %r210 bra $L4; bra $L46; $L3: add.u64 %r168,%r172,40; add.u64 %r165,%frame,120; add.u64 %r49,%frame,240; add.u64 %r211,%r172,16; add.u64 %r213,%r25,%r25; add.u64 %r214,%r213,%r25; shl.b64 %r215,%r214,3; add.u64 %r139,%r211,%r215; .loc 1 76 12 mov.u64 %r349,0; $L8: .loc 1 72 18 ld.u64 
[nvptx PTX payload: text extraction of the embedded objects maxval_i1.o, maxval_i2.o and maxval_i4.o, generated from fortran/generated/maxval_i1.c, maxval_i2.c and maxval_i4.c. Each object defines the .visible .func entry points _gfortran_maxval_iN, _gfortran_mmaxval_iN (masked) and _gfortran_smaxval_iN (scalar mask), and declares the external symbols _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents. The embedded byte-array string constants are runtime error messages, including "... incorrect in MAXVAL intrinsic: is %ld, should be between 1 and %...", "rank of return array incorrect in MAXVAL intrinsic: is %ld, should be %l...", "Funny sized logical array", "MASK argumen...", "return valu..." and "Incorrect extent in return value of MAXVAL intrinsic in dimension %ld: is %ld, should be %...". The reduction loops initialise their accumulator to -128, -32768 and -2147483648 for KIND=1, 2 and 4 respectively before folding max over the reduced dimension.]
%r431 bra $L108; bra $L48; $L85: .loc 1 409 13 st.u64 [%r41],%r439; .loc 1 412 23 mul.lo.u64 %r101,%r113,%r112; .loc 1 413 24 mul.lo.u64 %r405,%r115,%r112; .loc 1 413 10 sub.u64 %r147,%r151,%r405; .loc 1 414 23 mul.lo.u64 %r406,%r117,%r112; .loc 1 414 9 shl.b64 %r407,%r406,2; sub.u64 %r148,%r152,%r407; .loc 1 415 5 add.u64 %r149,%r149,1; .loc 1 416 7 add.u64 %r41,%r41,8; add.u64 %r166,%r166,8; setp.ne.u64 %r408,%r149,%r435; @ %r408 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 412 23 mov.u64 %r101,%r38; .loc 1 416 7 mov.u64 %r166,8; .loc 1 415 5 mov.u64 %r149,1; $L84: .loc 1 424 16 ld.u64 %r412,[%r41]; add.u64 %r112,%r412,1; st.u64 [%r41],%r112; .loc 1 425 23 add.u64 %r414,%r436,%r166; ld.u64 %r113,[%r414]; .loc 1 425 13 sub.u64 %r415,%r113,%r101; shl.b64 %r416,%r415,2; add.u64 %r150,%r150,%r416; .loc 1 426 24 add.u64 %r417,%frame,%r166; ld.u64 %r115,[%r417]; .loc 1 426 14 add.u64 %r151,%r147,%r115; .loc 1 427 23 add.u64 %r419,%r438,%r166; ld.u64 %r117,[%r419]; .loc 1 427 13 shl.b64 %r420,%r117,2; add.u64 %r152,%r148,%r420; .loc 1 405 32 add.u64 %r422,%r437,%r166; ld.u64 %r121,[%r422]; .loc 1 405 13 setp.eq.u64 %r423,%r112,%r121; @ %r423 bra $L85; .loc 1 409 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 298 6 ld.u64 %r424,[%r209]; setp.eq.u64 %r425,%r424,0; @ ! %r425 bra $L63; bra $L86; $L48: .loc 1 431_gfortran_smaxval_i4 .visible .func _gfortran_smaxval_8u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64predu64mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 454 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 454 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_maxva.loc 1 461 7 bra $L109; $L111: .loc 1 464 10 ld.u64 %r23,[%r152]; .loc 1 464 7 add.u64 %r88,%r23,-1; .loc 1 465 10 ld.s8 %r24,[%r151+28]; .loc 1 465 38 add.u32 %r26,%r24,-1; .loc 1 465 8 cvt.s64.s32 %r294,%r26; .loc 1 467 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r294; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 467 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 474 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 482 3 setp.lt.s64 %r173,%r88,%r294; @ %r173 bra $L115; bra $L168; $L113: .loc 1 469 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 479 12 mov.u64 %r300,0; $L119: .loc 1 476 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 478 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 476 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 479 12 st.u64 [%r139],%r300; $L118: .loc 1 474 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r293,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r293,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 488 12 mov.u64 %r299,0; $L125: .loc 1 485 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 487 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 484 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 488 12 st.u64 [%r43],%r299; $L124: .loc 1 482 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 495 7 setp.eq.u64 %r211,%r294,0; @ %r211 bra $L127; add.u64 %r293,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r293; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 498 10 mov.u64 %r81,1; .loc 1 502 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 506 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 507 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 509 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 509 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 509 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 511 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 500 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 502 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 495 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 514 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 515 4 bra $L109; $L170: .loc 1 518 2496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 518 22 st.u64 [%r150],%r98; $L135: .loc 1 544 3 setp.ne.u64 %r247,%r294,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 522 19 ld.s8 %r55,[%r150+28]; .loc 1 522 10 setp.eq.u64 %r248,%r55,%r294; @ %r248 bra $L134; .loc 1 523 2 st.u64 [%stack+8],%r294; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 528 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 528 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 530 4 setp.eq.u64 %r254,%r294,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 530 10 mov.u64 %r104,0; $L137: .loc 1 534 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 534 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 535 18 ld.u64 %r63,[%r31]; .loc 1 530 25 add.u64 %r104,%r104,1; .loc 1 535 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 536 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 530 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: .loc 1 546 16 
cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r294,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 547 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 544 26 add.u64 %r100,%r100,1; .loc 1 544 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r294,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 556 22 ld.u64 %r68,[%frame]; .loc 1 556 12 shl.b64 %r69,%r68,2; .loc 1 558 32 ld.u64 %r59,[%frame+120]; .loc 1 565 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r292,%r294,1; .loc 1 554 13 mov.u32 %r278,-2147483648; cvt.u32.u32 %r296,%r24; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; $L146: st.u32 [%r98],%r278; .loc 1 555 15 add.u64 %r80,%r80,1; .loc 1 556 12 add.u64 %r98,%r98,%r69; .loc 1 558 13 setp.ne.u64 %r279,%r59,%r80; @ %r279 bra $L146; .loc 1 567 7 @ ! %r292 bra $L171; bra $L109; $L145: .loc 1 562 13 st.u64 [%r99],%r295; .loc 1 565 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 566 5 add.u64 %r97,%r97,1; .loc 1 567 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r282,%r97,%r298; @ %r282 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 565 23 mov.u64 %r72,%r44; .loc 1 566 5 mov.u64 %r97,1; .loc 1 562 13 mov.u64 %r295,0; $L144: .loc 1 571 16 ld.u64 %r287,[%r99]; add.u64 %r74,%r287,1; st.u64 [%r99],%r74; .loc 1 572 23 ld.u64 %r75,[%r124]; .loc 1 572 13 sub.u64 %r288,%r75,%r72; shl.b64 %r289,%r288,2; add.u64 %r98,%r98,%r289; .loc 1 558 32 ld.u64 %r77,[%r123]; .loc 1 558 13 setp.eq.u64 %r290,%r74,%r77; @ %r290 bra $L145; .loc 1 562 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r291,%r98,0; @ ! 
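For orientation, a minimal C sketch of the masked reduction pattern the mmaxval body above compiles to; the function and parameter names are illustrative, not the real libgfortran prototypes, and strides are expressed in elements rather than bytes.

#include <stdint.h>
#include <stddef.h>

/* Sketch only: reduce one slice of length n with element stride src_stride,
   honouring a logical mask stored as bytes with stride mask_stride.  */
static int32_t
masked_max_i4_sketch (const int32_t *src, ptrdiff_t src_stride,
                      const unsigned char *mask, ptrdiff_t mask_stride,
                      ptrdiff_t n)
{
  int32_t result = INT32_MIN;          /* mov.u32 %r129,-2147483648 in the dump */
  for (ptrdiff_t i = 0; i < n; i++)
    if (mask[i * mask_stride]          /* ld.s8 + setp guard on the mask byte */
        && src[i * src_stride] > result)
      result = src[i * src_stride];    /* max.s32 %r129,%r129,%r400 */
  return result;
}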
[nvptx PTX payload, damaged in extraction: end of _gfortran_smaxval_i4, then the maxval_i8.o translation unit compiled from libgfortran/generated/maxval_i8.c. Its header declares the visible functions _gfortran_maxval_i8, _gfortran_mmaxval_i8 and _gfortran_smaxval_i8, the extern variable _gfortrani_compile_options, and the extern functions _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents. The string constants embedded as byte arrays decode to the usual MAXVAL diagnostics, e.g. "... incorrect in MAXVAL intrinsic: is %ld, should be between 1 and %...", "rank of return array incorrect in MAXVAL intrinsic: is %ld, should be %...", "Funny sized logical array", "MASK argument" and "Incorrect extent in return value of MAXVAL intrinsic in dimension %ld: is %ld, should be %...". The function bodies follow the same shape as the i4 unit but reduce 64-bit elements with max.s64 and use -9223372036854775808 (INT64_MIN) as the identity value.]
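The smaxval entry points above follow a simple dispatch that a short C sketch makes explicit; everything below is illustrative (assumed names and a flattened result array), not the actual libgfortran interface.

#include <stdint.h>
#include <stddef.h>

/* Sketch only: when the scalar MASK is absent or true, defer to the full
   reduction (the PTX tail-calls _gfortran_maxval_i8); otherwise fill the
   result with the identity element without reading the source array.  */
static void
smaxval_i8_sketch (int64_t *ret, size_t ret_len, const int32_t *mask,
                   void (*full_reduction) (int64_t *, size_t))
{
  if (mask == NULL || *mask != 0)
    {
      full_reduction (ret, ret_len);   /* the unmasked reduction path */
      return;
    }
  for (size_t i = 0; i < ret_len; i++)
    ret[i] = INT64_MIN;                /* mov.u64 ...,-9223372036854775808 */
}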
[nvptx PTX payload, damaged in extraction and truncated: end of _gfortran_smaxval_i8, then the maxval_i16.o translation unit compiled from libgfortran/generated/maxval_i16.c, declaring _gfortran_maxval_i16, _gfortran_mmaxval_i16 and _gfortran_smaxval_i16 together with the same extern symbols and MAXVAL diagnostic strings as the i8 unit. Because PTX has no native 128-bit integer maximum, each element is handled as a low/high pair of 64-bit registers: the loop compares the signed high words (setp.gt.s64), falls back to an unsigned compare of the low words when the high words are equal (setp.gt.u64), and starts the accumulator at the most negative value (low word 0, high word INT64_MIN). The dump breaks off inside _gfortran_mmaxval_i16.]
%r436 bra $L65; bra $L90; $L50: .loc 1 431_gfortran_smaxval_i16 .visible .func _gfortran_smaxval_i168u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64u64 %r279; .reg .pred %r280; .reg .predpred %r292; .reg .predu32u64 %r301; mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 454 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L114; .loc 1 454 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L115; $L114:r152; call _gfortran_maxval_i16.loc 1 461 7 bra $L113; $L115: .loc 1 464 10 ld.u64 %r23,[%r152]; .loc 1 464 7 add.u64 %r88,%r23,-1; .loc 1 465 10 ld.s8 %r24,[%r151+28]; .loc 1 465 38 add.u32 %r26,%r24,-1; .loc 1 465 8 cvt.s64.s32 %r295,%r26; .loc 1 467 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r295; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 467 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L117; .loc 1 474 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L118; $L124: .loc 1 482 3 setp.lt.s64 %r173,%r88,%r295; @ %r173 bra $L119; bra $L172; $L117: .loc 1 469 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error118: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 479 12 mov.u64 %r301,0; $L123: .loc 1 476 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 478 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L121; .loc 1 476 17 st.u64 [%r139],%r36; bra $L122; $L121: .loc 1 479 12 st.u64 [%r139],%r301; $L122: .loc 1 474 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L123; bra $L124; $L172: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L126; bra $L125; $L119: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r294,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r294,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 488 12 mov.u64 %r300,0; $L129: .loc 1 485 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 487 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L127; .loc 1 484 17 st.u64 [%r43],%r41; bra $L128; $L127: .loc 1 488 12 st.u64 [%r43],%r300; $L128: .loc 1 482 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L129; bra $L173; $L125: .loc 1 495 7 setp.eq.u64 %r211,%r295,0; @ %r211 bra $L131; add.u64 %r294,%frame,120; $L151: add.u64 %r111,%r150,40; mov.u64 %r107,%r294; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 498 10 mov.u64 %r81,1; .loc 1 502 4 mov.u64 %r237,0; bra $L132; $L131: .loc 1 506 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 507 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 509 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 509 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 509 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 511 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L133; bra $L174; $L135: .loc 1 500 48 mul.lo.u64 %r81,%r46,%r81; $L132: .loc 1 502 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 495 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L135; bra $L131; $L133: .loc 1 514 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 515 4 bra $L113; $L174: .loc 1 518 24 mov.u64 %r244,1696244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 518 22 st.u64 [%r150],%r98; $L139: .loc 1 544 3 setp.ne.u64 %r247,%r295,0; @ %r247 bra $L136; bra $L137; $L126: .loc 1 522 19 ld.s8 %r55,[%r150+28]; .loc 1 522 10 setp.eq.u64 %r248,%r55,%r295; @ %r248 bra $L138; .loc 1 523 2 st.u64 [%stack+8],%r295; st.u64 [%stack],%r55;_gfortran_runtime_error138: .loc 1 528 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 528 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L139; .loc 1 530 4 setp.eq.u64 %r254,%r295,0; @ %r254 bra $L137; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 530 10 mov.u64 %r104,0; $L141: .loc 1 534 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 534 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 535 18 ld.u64 %r63,[%r31]; .loc 1 530 25 add.u64 %r104,%r104,1; .loc 1 535 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L140; .loc 1 536 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error140: .loc 1 530 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L141; 
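The _gfortran_smaxval_i16 body in this part of the dump first examines its scalar MASK argument; the PTX calls _gfortran_maxval_i16 when the mask pointer is null or the mask is true, and only the all-false case falls through to the code that builds a filled result. Below is a rough sketch of that dispatch, with stand-in parameter types, since the real libgfortran descriptor structs are not visible in this dump.

/* Hedged sketch of the scalar-mask dispatch seen in the PTX above.  The
   void* descriptor parameters and the long* dim pointer are stand-ins for
   libgfortran's internal types, which this dump does not spell out. */
extern void _gfortran_maxval_i16(void *retarray, void *array, const long *pdim);

static void smaxval_i16_sketch(void *retarray, void *array,
                               const long *pdim, const int *mask)
{
    if (mask == 0 || *mask) {
        /* No mask, or mask is .TRUE.: delegate to the unmasked reduction,
           as the `call _gfortran_maxval_i16` in the dump does. */
        _gfortran_maxval_i16(retarray, array, pdim);
        return;
    }
    /* Mask is .FALSE.: the remaining PTX shapes the result array and fills
       it with the type's minimum value instead of scanning the input. */
}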
$L136: .loc 1 546 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L142; shl.b64 %r264,%r295,3; bra $L143; $L142: mov.u64 %r264,8; $L143: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L144: .loc 1 547 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 544 26 add.u64 %r100,%r100,1; .loc 1 544 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r295,%r100; @ %r277 bra $L144; $L137: ld.u64 %r80,[%frame+240]; .loc 1 556 22 ld.u64 %r68,[%frame]; .loc 1 556 12 shl.b64 %r69,%r68,4; .loc 1 558 32 ld.u64 %r59,[%frame+120]; .loc 1 565 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r293,%r295,1; .loc 1 554 13 mov.u64 %r278,0; cvt.u32.u32 %r297,%r24; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; $L150: st.u64 [%r98],%r278; mov.u64 %r279,-9223372036854775808; st.u64 [%r98+8],%r279; .loc 1 555 15 add.u64 %r80,%r80,1; .loc 1 556 12 add.u64 %r98,%r98,%r69; .loc 1 558 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L150; .loc 1 567 7 @ ! %r293 bra $L175; bra $L113; $L149: .loc 1 562 13 st.u64 [%r99],%r278; .loc 1 565 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 566 5 add.u64 %r97,%r97,1; .loc 1 567 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r299; @ %r283 bra $L148; bra $L113; $L175: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 565 23 mov.u64 %r72,%r44; .loc 1 566 5 mov.u64 %r97,1; $L148: .loc 1 571 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 572 23 ld.u64 %r75,[%r124]; .loc 1 572 13 sub.u64 %r289,%r75,%r72; shl.b64 %r290,%r289,4; add.u64 %r98,%r98,%r290; .loc 1 558 32 ld.u64 %r77,[%r123]; .loc 1 558 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L149; .loc 1 562 13 mov.u64 %r80,0; bra $L150; $L173: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r292,%r98,0; @ ! 
%r292 bra $L126; bra $L151; $L113: .loc 1 maxval_r4.o/_gfortran_maxval_r4 .visible .func _gfortran_maxval_r4fortran/generated/maxval_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxval_r4 .visible .func _gfortran_mmaxval_r4gfortran_smaxval_r4 .visible .func _gfortran_smaxval_r4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,86,65,76114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,86,65,76,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,86,65,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_maxval_r4 .visible .func _gfortran_maxval_r448u64u32 %r194; .reg .u16 %r195; .reg .predu64 %r227; .reg .u64 %r228; .reg .u64 %r229; .reg .predpredu32u64 %r250; .reg .u64 %r251; .reg .u64 %r252; .reg .u64 %r253; .reg .predu32u16u64pred %r319; .reg .f32 %r321; .reg .pred %r322; .reg .pred %r323; .reg .pred %r324; .reg .pred %r325; .reg .pred %r326; .reg .u64 %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64u64 %r357; .reg .u64 %r358; .reg .u64 %r360; .reg .u64 %r361; mov.u64 %r181,%ar0; mov.u64 %r182,%ar1; mov.u64 %r183,%ar2; .loc 1 55 10 ld.s8 %r22,[%r182+28]; .loc 1 55 38 add.u32 %r184,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r109,%r184; .loc 1 56 10 ld.u64 %r25,[%r183]; .loc 1 56 7 add.u64 %r110,%r25,-1; .loc 1 58 7 shr.u64 %r186,%r110,63; set.u32.lt.s64 %r188,%r109,%r110; neg.s32 %r189,%r188; cvt.u16.u64 %r192,%r186; cvt.u16.u32 %r193,%r189; or.b16 %r191,%r192,%r193; .loc 1 58 6 cvt.u32.u16 %r194,%r191; cvt.u16.u8 %r195,%r194; setp.eq.u16 %r196,%r195,0; @ %r196 bra $L2; .loc 1 60 7 cvt.u32.u32 %r199,%r22; cvt.s64.s8 %r198,%r199; st.u64 [%stack+8],%r19819197_gfortran_runtime_errorr202,%r110,%r110; add.u64 %r203,%r202,%r110; shl.b64 %r204,%r203,3; add.u64 %r205,%r182,%r204; ld.u64 %r31,[%r205+56]; ld.u64 %r33,[%r205+48]; .loc 1 68 9 ld.u64 %r113,[%r205+40]; .loc 1 70 3 setp.ne.u64 %r219,%r110,0; @ %r219 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r220,%r109,%r110; @ %r220 bra $L4; bra $L52; $L3: add.u64 %r29,%r182,40; add.u64 %r126,%frame,120; add.u64 %r180,%frame,240; add.u64 %r221,%r182,16; add.u64 %r223,%r25,%r25; add.u64 %r224,%r223,%r25; shl.b64 %r225,%r224,3; add.u64 %r71,%r221,%r225; .loc 1 76 12 mov.u64 %r361,0; $L8: .loc 1 72 18 ld.u64 %r226,[%r29]; st.u64 
[%r126],%r226; .loc 1 73 19 ld.u64 %r228,[%r29+16]; add.u64 %r227,%r228,1; ld.u64 %r229,[%r29+8]; sub.u64 %r38,%r227,%r229; .loc 1 75 10 setp.lt.s64 %r230,%r38,0; @ %r230 bra $L6; .loc 1 73 17 st.u64 [%r180],%r38; bra $L7; $L6: .loc 1 76 12 st.u64 [%r180],%r361; $L7: .loc 1 70 3 add.u64 %r29,%r29,24; add.u64 %r126,%r126,8; add.u64 %r180,%r180,8; setp.ne.u64 %r232,%r71,%r29; @ %r232 bra $L8; bra $L9; $L52: .loc 1 87 6 ld.u64 %r233,[%r181]; setp.eq.u64 %r234,%r233,0; @ ! %r234 bra $L11; bra $L10; $L4: add.u64 %r236,%r25,%r25; add.u64 %r237,%r236,%r25238,40; add.u64 %r144,%r182,%r239; shl.b64 %r240,%r25,3; add.u64 %r107,%r240,-8; add.u64 %r349,%frame,120; add.u64 %r128,%r349,%r107; add.u64 %r348,%frame,240; add.u64 %r105,%r348,%r107; cvt.u32.u32 %r244,%r22; cvt.s64.s8 %r243,%r244; add.u64 %r246,%r243,%r243; add.u64 %r247,%r246,%r243; shl.b64 %r248,%r247,3; add.u64 %r249,%r182,40; add.u64 %r65,%r248,%r249; .loc 1 84 12 mov.u64 %r360,0; $L14: .loc 1 80 18 ld.u64 %r250,[%r144]; st.u64 [%r128],%r250; .loc 1 81 19 ld.u64 %r252,[%r144+16]; add.u64 %r251,%r252,1; ld.u64 %r253,[%r144+8]; sub.u64 %r45,%r251,%r253; .loc 1 83 10 setp.lt.s64 %r254,%r45,0; @ %r254 bra $L12; .loc 1 81 17 st.u64 [%r105],%r45; bra $L13; $L12: .loc 1 84 12 st.u64 [%r105],%r360; $L13: .loc 1 78 3 add.u64 %r144,%r144,24; add.u64 %r128,%r128,8; add.u64 %r105,%r105,8; setp.ne.u64 %r256,%r65,%r144; @ %r256 bra $L14; bra $L53; $L10: .loc 1 91 7 setp.le.s64 %r257,%r109,0; @ %r257 bra $L16; add.u64 %r348,%frame,240; $L38: add.u64 %r153,%r181,40; mov.u64 %r151,%r348; add.u64 %r258,%r181,16; cvt.u32.u32 %r260,%r22; cvt.s64.s8 %r259,%r260; add.u64 %r262,%r259,%r259; add.u64 %r263,%r262,%r259; shl.b64 %r264,%r263,3; add.u64 %r145,%r258,%r264; .loc 1 94 10 mov.u64 %r103,1; .loc 1 98 4 mov.u64 %r287,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r265,0; st.u64 [%r181+8],%r265; .loc 1 103 28 cvt.u16.u32 %r268,%r22; add.u16 %r267,%r268,-1; cvt.u32.u16 %r269,%r267; st.u8 [%r181+28],%r269; .loc 1 105 20 add.u32 %r270,%r22,-2; cvt.s64.s32 %r56,%r270; add.u64 %r272,%r56,%r56; add.u64 %r273,%r272,%r56; shl.b64 %r274,%r273,3; add.u64 %r275,%r181,%r274; .loc 1 105 67 shl.b64 %r277,%r56,3; add.u64 %r278,%frame,%r277; .loc 1 105 59 ld.u64 %r280,[%r275+40]; ld.u64 %r281,[%r278+240]; mul.lo.u64 %r115,%r280,%r281; .loc 1 107 29283; call (%value_in),_gfortrani_xmallocarray284,[%value_in]; } .loc 1 107 27 st.u64 [%r181],%r284; .loc 1 108 10 setp.eq.u64 %r286,%r115,0; @ ! 
%r286 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r103,%r52,%r103; $L17: .loc 1 98 4 st.u64 [%r153+8],%r287; ld.u64 %r52,[%r151]; add.u64 %r288,%r52,-1; st.u64 [%r153+16],%r288; st.u64 [%r153],%r103; .loc 1 91 7 add.u64 %r153,%r153,24; add.u64 %r151,%r151,8; setp.ne.u64 %r289,%r145,%r153; @ %r289 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r181+48],%r115; mov.u64 %r291,-1; st.u64 [%r181+56],%r291; mov.u64 %r292,1; st.u64 [%r181+40],%r292; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r62,[%r181+28]; .loc 1 118 10 setp.eq.u64 %r293,%r62,%r109; @ %r293 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r109; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r296,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r297,[%r296+36]; setp.eq.u32 %r298,%r297,0; @ %r298 bra $L21; .loc 1 125 2 add.u64 %r348,%frame,240; cvta.const.u64 %r302,$LC2r3430302; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r304,%r109,0; @ %r304 bra $L24; $L27: .loc 1 65 9 add.u64 %r305,%r31,1; .loc 1 65 7 sub.u64 %r111,%r305,%r33; max.s64 %r46,%r111,0; .loc 1 137 8 ld.u64 %r122,[%r182]; .loc 1 138 8 ld.u64 %r123,[%r181]; .loc 1 158 36 shl.b64 %r69,%r113,2; ld.u64 %r137,[%frame+360]; .loc 1 180 22 ld.u64 %r73,[%frame+120]; .loc 1 180 12 shl.b64 %r74,%r73,2; .loc 1 181 22 ld.u64 %r76,[%frame]; .loc 1 181 12 shl.b64 %r77,%r76,2; .loc 1 183 32 ld.u64 %r142,[%frame+240]; .loc 1 190 23 mul.lo.u64 %r306,%r73,%r142; .loc 1 190 9 shl.b64 %r307,%r306,2; neg.s64 %r143,%r307; .loc 1 191 23 mul.lo.u64 %r308,%r76,%r142; .loc 1 191 9 shl.b64 %r309,%r308,2; neg.s64 %r70,%r309; setp.gt.s64 %r350,%r111,0; setp.le.s64 %r351,%r109,1; cvt.u32.u32 %r353,%r22; cvt.s64.s8 %r354,%r353; add.u64 %r355,%r354,-1; add.u64 %r356,%frame,120; add.u64 %r357,%frame,240; .loc 1 187 13 mov.u64 %r358,0; bra $L25; $L24: add.u64 %r166,%frame,360; add.u64 %r165,%r181,40; mov.u64 %r163,%frame; cvt.u32.u32 %r311,%r22; cvt.s64.s8 %r310,%r311; add.u64 %r154,%r310,-1; .loc 1 129 3 mov.u64 %r124,0; add.u64 %r348,%frame,240; .loc 1 131 16 mov.u64 %r312,%r124; $L26: st.u64 [%r166],%r312; .loc 1 132 18 ld.u64 %r313,[%r165]; st.u64 [%r163],%r313; .loc 1 133 17 shl.b64 %r315,%r124,3; add.u64 %r316,%r348,%r315; .loc 1 133 10 ld.u64 %r317,[%r316]; setp.le.s64 %r318,%r317,0; @ %r318 bra $L1; .loc 1 129 26 add.u64 %r124,%r124,1; .loc 1 129 3 add.u64 %r166,%r166,8; add.u64 %r165,%r165,24; add.u64 %r163,%r163,8; setp.ne.u64 %r319,%r124,%r154; @ %r319 bra $L26; bra $L27; $L25: .loc 1 153 5 @ %r350 bra $L39; .loc 1 154 10 mov.f32 %r321,0fff7fffff; st.f32 [%r123],%r321; bra $L29; $L39: mov.u64 %r139,%r122; .loc 1 158 13 mov.u64 %r129,0; $L28: .loc 1 163 7 ld.f32 %r179,[%r139]; mov.u64 %r117,%r129; .loc 1 158 28 add.u64 %r129,%r129,1; mov.u64 %r118,%r139; .loc 1 158 36 add.u64 %r139,%r139,%r69; .loc 1 163 6 setp.ge.f32 %r322,%r179,0fff800000; @ %r322 bra $L40; .loc 1 158 6 setp.gt.s64 %r323,%r46,%r129; @ %r323 bra $L28; .loc 1 167 15 mov.f32 %r104,0f7fc00000; bra $L31; $L40: .loc 1 149 9 mov.f32 %r104,0fff800000; $L30: .loc 1 171 6 setp.lt.f32 %r324,%r104,%r179; .loc 1 172 12 selp.f32 %r104,%r179,%r104,%r324; .loc 1 168 28 add.u64 %r117,%r117,1; .loc 1 168 36 add.u64 %r118,%r118,%r69; .loc 1 168 11 setp.le.s64 %r325,%r46,%r117; @ %r325 bra $L31; .loc 1 171 7 ld.f32 %r179,[%r118]; bra $L30; $L31: .loc 1 175 12 st.f32 [%r123],%r104; $L29: .loc 1 179 15 add.u64 %r137,%r137,1; .loc 1 180 12 add.u64 %r122,%r122,%r74; .loc 1 181 12 add.u64 %r123,%r123,%r77; .loc 1 183 13 setp.ne.u64 %r326,%r137,%r142; @ %r326 bra $L25; .loc 1 190 9 
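The long comma-separated number runs in the maxval_r4.o declarations a little earlier in this dump (and again in the maxval_r8.o block further down) are ASCII byte values of libgfortran's MAXVAL run-time error messages, with parts of several strings lost to the extraction. Decoding them is mechanical; the results below are offered as a reading aid, with "..." marking the truncated ends.

/* ASCII decodes of the garbled $LC* byte arrays; "..." marks bytes that the
   dump cut off at either end of a string. */
static const char *const maxval_messages[] = {
    "...incorrect in MAXVAL intrinsic: is %ld, should be between 1 and %l...",
    "rank of return array incorrect in MAXVAL intrinsic: is %ld, should be %l...",
    "MAXVAL",
    "return valu...",
    "Funny sized logical array",
    "MASK argume...",
    "Incorrect extent in return value of MAXVAL intrinsic in dimension %ld:"
    " is %ld, should be %..."
};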
add.u64 %r119,%r122,%r143; .loc 1 191 9 add.u64 %r120,%r123,%r70; .loc 1 193 7 @ %r351 bra $L1; add.u64 %r132,%frame,368; mov.u64 %r174,8; .loc 1 192 5 mov.u64 %r121,1; bra $L36; $L37: .loc 1 187 13 st.u64 [%r132],%r358; .loc 1 190 23 mul.lo.u64 %r332,%r90,%r89; .loc 1 190 9 shl.b64 %r333,%r332,2; sub.u64 %r119,%r122,%r333; .loc 1 191 23 mul.lo.u64 %r334,%r93,%r89; .loc 1 191 9 shl.b64 %r335,%r334,2; sub.u64 %r120,%r123,%r335; .loc 1 192 5 add.u64 %r121,%r121,1; .loc 1 193 7 add.u64 %r132,%r132,8; add.u64 %r174,%r174,8; setp.eq.u64 %r336,%r121,%r355; @ %r336 bra $L1; $L36: .loc 1 201 16 ld.u64 %r337,[%r132]; add.u64 %r89,%r337,1; st.u64 [%r132],%r89; .loc 1 202 23 add.u64 %r339,%r356,%r174; ld.u64 %r90,[%r339]; .loc 1 202 13 shl.b64 %r340,%r90,2; add.u64 %r122,%r119,%r340; .loc 1 203 23 add.u64 %r341,%frame,%r174; ld.u64 %r93,[%r341]; .loc 1 203 13 shl.b64 %r342,%r93,2; add.u64 %r123,%r120,%r342; .loc 1 183 32 add.u64 %r344,%r357,%r174; ld.u64 %r96,[%r344]; .loc 1 183 13 setp.eq.u64 %r345,%r89,%r96; @ %r345 bra $L37; .loc 1 187 13 mov.u64 %r137,0; bra $L25; $L53: .loc 1 87 6 ld.u64 %r346,[%r181]; setp.eq.u64 %r347,%r346,0; @ ! %r347 bra $L11; bra $L38; $L1: .loc 1 207_gfortran_mmaxval_r4 .visible .func _gfortran_mmaxval_r608.reg .u64 %r89; .reg .f32u32 %r219; .reg .u64 %r221; .reg .u32 %r223; .reg .u32 %r224; .reg .u16 %r226; .reg .u16 %r227; .reg .u16 %r228; .reg .u32u16 %r263; .reg .u32pred %r282; .reg .u64 %r283; .reg .u64u64u6464pred %r325; .reg .u64 %r327; .reg .u64 %r329; .reg .pred %r331; .reg .pred %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u32u64pred %r357; .reg .u64 %r358; .reg .u64 %r359; .reg .predu64 %r392; .reg .u64 %r393; .reg .pred %r394; .reg .predpred %r403; .reg .pred %r405; .reg .predpredu64 %r422; .reg .u64 %r424; .reg .u64 %r425; .reg .u64 %r427; .reg .pred %r428; .reg .pred %r429; .reg .u64 %r430; .reg .pred %r431; .reg .u64 %r433; .reg .predu64 %r449; mov.u64 %r211,%ar0; mov.u64 %r212,%ar1; mov.u64 %r213,%ar2; mov.u64 %r214,%ar3; .loc 1 237 6 setp.ne.u64 %r215,%r214,0; @ %r215 bra $Lgfortran_maxval_r4.loc 1 244 7 bra $L54; $L55: .loc 1 247 10 ld.u64 %r22,[%r213]; .loc 1 247 7 add.u64 %r142,%r22,-1; .loc 1 248 10 ld.s8 %r23,[%r212+28]; .loc 1 248 38 add.u32 %r219,%r23,-1; .loc 1 248 8 cvt.s64.s32 %r143,%r219; .loc 1 251 7 shr.u64 %r221,%r142,63; set.u32.gt.s64 %r223,%r142,%r143; neg.s32 %r224,%r223; cvt.u16.u64 %r227,%r221; cvt.u16.u32 %r228,%r224; or.b16 %r226,%r227,%r228; .loc 1 251 6 cvt.u32.u16 %r229,%r226; cvt.u16.u8 %r230,%r229; setp.eq.u16 %r231,%r230,0; @ %r231 bra $L57; .loc 1 253 7 cvt.u32.u32 %r234,%r23; cvt.s64.s8 %r233,%r234; st.u64 [%stack+8],%r233; st.u64 [%stack],%r22; cvta.const.u64 %r2322stack; call _gfortran_runtime_error7: .loc 1 258 9 add.u64 %r435,%r142,%r142; add.u64 %r433,%r435,%r142; shl.b64 %r239,%r433,3; add.u64 %r240,%r212,%r239; ld.u64 %r243,[%r240+56]; add.u64 %r242,%r243,1; .loc 1 258 7 ld.u64 %r250,[%r240+48]; sub.u64 %r144,%r242,%r250; .loc 1 259 6 setp.le.s64 %r251,%r144,0; @ %r251 bra $L54; .loc 1 262 9 ld.u64 %r159,[%r214]; .loc 1 264 15 ld.u64 %r33,[%r214+16]; .loc 1 266 22 cvt.u32.u64 %r34,%r33; .loc 1 266 53 add.u32 %r252,%r34,-4; and.b32 %r253,%r252,-5; set.u32.eq.u32 %r255,%r253,0; neg.s32 %r256,%r255; .loc 1 266 22 add.u32 %r257,%r34,-1; set.u32.le.u32 %r259,%r257,1; neg.s32 %r260,%r259; .loc 1 266 6 cvt.u16.u32 %r262,%r256; cvt.u16.u32 %r263,%r260; or.b16 %r261,%r262,%r263; cvt.u32.u16 %r264,%r261; cvt.u16.u8 %r265,%r264; setp.ne.u16 %r266,%r265,0; @ %r266 bra $L59; .loc 1 268 7 setp.ne.u32 %r268,%r34,16; @ %r268 bra 
$L60; $L59: .loc 1 275 9 shl.b64 %r272,%r433,3; add.u64 %r273,%r212,%r272; ld.u64 %r147,[%r273+40]; .loc 1 276 12 add.u64 %r279,%r214,%r272; ld.u64 %r40,[%r279+40]; .loc 1 278 3 setp.ne.u64 %r281,%r142,0; @ %r281 bra $L61; $L67: .loc 1 288 3 setp.lt.s64 %r282,%r142,%r143; @ %r282 bra $L62; bra $L118; $L60: .loc 1 273 5 cvta.const.u64 %r283283_gfortran_runtime_error61: add.u64 %r53,%r212,40; add.u64 %r163,%r214,40; add.u64 %r286,%r22,%r22; add.u64 %r287,%r286,%r22; shl.b64 %r288,%r287,3; add.u64 %r289,%r212,16; add.u64 %r39,%r288,%r289; .loc 1 278 3 mov.u64 %r209,0; add.u64 %r438,%frame,240; add.u64 %r436,%frame,360; .loc 1 285 12 mov.u64 %r449,%r209; $L66: .loc 1 280 18 add.u64 %r291,%r438,%r209; ld.u64 %r292,[%r53]; st.u64 [%r291],%r292; .loc 1 281 18 add.u64 %r293,%frame,%r209; .loc 1 281 20 ld.u64 %r295,[%r163]; mul.lo.u64 %r294,%r295,%r33; .loc 1 281 18 st.u64 [%r293],%r294; .loc 1 282 19 ld.u64 %r297,[%r53+16]; add.u64 %r296,%r297,1; ld.u64 %r298,[%r53+8]; sub.u64 %r51,%r296,%r298; .loc 1 284 10 setp.lt.s64 %r299,%r51,0; @ %r299 bra $L64; .loc 1 282 17 add.u64 %r301,%r436,%r209; st.u64 [%r301],%r51; bra $L65; $L64: .loc 1 285 12 add.u64 %r303,%r436,%r209; st.u64 [%r303],%r449; $L65: .loc 1 278 3 add.u64 %r53,%r53,24; add.u64 %r163,%r163,24; add.u64 %r209,%r209,8; setp.ne.u64 %r305,%r39,%r53; @ %r305 bra $L66; bra $L67; $L118: .loc 1 298 6 ld.u64 %r306,[%r211]; setp.eq.u64 %r307,%r306,0; @ ! %r307 bra $L69; bra $L68; $L62: add.u64 %r309,%r22,%r22; add.u64 %r310,%r309,%r22; shl.b64 %r311,%r310,3; add.u64 %r167,%r311,40; add.u64 %r171,%r212,%r167; add.u64 %r165,%r214,%r167; shl.b64 %r312,%r22,3; add.u64 %r139,%r312,-8; cvt.u32.u32 %r314,%r23; cvt.s64.s8 %r313,%r314; shl.b64 %r315,%r313,3; add.u64 %r124,%r315,-8; add.u64 %r438,%frame,240; add.u64 %r436,%frame,360; .loc 1 295 12 mov.u64 %r448,0; $L72: .loc 1 290 18 add.u64 %r317,%r438,%r139; ld.u64 %r318,[%r171]; st.u64 [%r317],%r318; .loc 1 291 18 add.u64 %r319,%frame,%r139; .loc 1 291 20 ld.u64 %r321,[%r165]; mul.lo.u64 %r320,%r321,%r33; .loc 1 291 18 st.u64 [%r319],%r320; .loc 1 292 19 ld.u64 %r323,[%r171+16]; add.u64 %r322,%r323,1; ld.u64 %r324,[%r171+8]; sub.u64 %r63,%r322,%r324; .loc 1 294 10 setp.lt.s64 %r325,%r63,0; @ %r325 bra $L70; .loc 1 292 17 add.u64 %r327,%r436,%r139; st.u64 [%r327],%r63; bra $L71; $L70: .loc 1 295 12 add.u64 %r329,%r436,%r139; st.u64 [%r329],%r448; $L71: .loc 1 288 3 add.u64 %r171,%r171,24; add.u64 %r165,%r165,24; add.u64 %r139,%r139,8; setp.ne.u64 %r331,%r124,%r139; @ %r331 bra $L72; bra $L119; $L68: .loc 1 302 7 setp.eq.u64 %r332,%r143,0; @ %r332 bra $L74; add.u64 %r436,%frame,360; $L96: add.u64 %r185,%r211,40; mov.u64 %r183,%r436; add.u64 %r333,%r211,16; cvt.u32.u32 %r335,%r23; cvt.s64.s8 %r334,%r335; add.u64 %r337,%r334,%r334; add.u64 %r338,%r337,%r334; shl.b64 %r339,%r338,3; add.u64 %r172,%r333,%r339; .loc 1 305 10 mov.u64 %r131,1; .loc 1 309 4 mov.u64 %r358,0; bra $L75; $L74: .loc 1 313 20 add.u32 %r340,%r23,-2; cvt.s64.s32 %r73,%r340; add.u64 %r342,%r73,%r73; add.u64 %r343,%r342,%r73; shl.b64 %r344,%r343,3; add.u64 %r345,%r211,%r344; .loc 1 313 67 shl.b64 %r347,%r73,3; add.u64 %r348,%frame,%r347; .loc 1 313 59 ld.u64 %r350,[%r345+40]; ld.u64 %r351,[%r348+360]; mul.lo.u64 %r151,%r350,%r351; .loc 1 315 24 mov.u64 %r352,0; st.u64 [%r211+8],%r352; .loc 1 316 28 cvt.u16.u32 %r355,%r23; add.u16 %r354,%r355,-1; cvt.u32.u16 %r356,%r354; st.u8 [%r211+28],%r356; .loc 1 318 10 setp.eq.u64 %r357,%r151,0; @ %r357 bra $L76; bra $L120; $L78: .loc 1 307 47 mul.lo.u64 %r131,%r70,%r131; $L75: .loc 1 309 4 st.u64 
[%r185+8],%r358; ld.u64 %r70,[%r183]; add.u64 %r359,%r70,-1; st.u64 [%r185+16],%r359; st.u64 [%r185],%r131; .loc 1 302 7 add.u64 %r185,%r185,24; add.u64 %r183,%r183,8; setp.ne.u64 %r360,%r172,%r185; @ %r360 bra $L78; bra $L74; $L76: .loc 1 321 4 st.u64 [%r211+48],%r151; mov.u64 %r362,-1; st.u64 [%r211+56],%r362; mov.u64 %r363,1; st.u64 [%r211+40],%r363; .loc 1 322 4 bra $L54; $L120: .loc 1 325 24365; call (%value_in),_gfortrani_xmallocarray66,[%value_in]; } .loc 1 325 22 st.u64 [%r211],%r366; bra $L79; $L69: .loc 1 330 19 ld.s8 %r368,[%r211+28]; .loc 1 330 10 setp.eq.u64 %r369,%r368,%r143; @ %r369 bra $L80; .loc 1 331 2 cvta.const.u64 %r370370_gfortran_runtime_error80: .loc 1 333 11 cvta.global.u64 %r372,_gfortrani_compile_options; .loc 1 333 10 ld.u32 %r373,[%r372+36]; setp.eq.u32 %r374,%r373,0; @ %r374 bra $L79; .loc 1 335 4 add.u64 %r436,%frame,360; cvta.const.u64 %r378,$LC2r4r377378; call _gfortrani_bounds_ifunction_return337 4 cvta.const.u64 %r382,$LC1212382378; call _gfortrani_bounds_equal_extents$L79: .loc 1 342 3 setp.ne.u64 %r384,%r143,0; @ %r384 bra $L81; $L84: .loc 1 350 8 ld.u64 %r160,[%r211]; .loc 1 351 8 ld.u64 %r158,[%r212]; .loc 1 353 9 setp.ne.u64 %r385,%r158,0; @ %r385 bra $L82; bra $L54; $L81: add.u64 %r198,%frame,480; add.u64 %r197,%r211,40; add.u64 %r195,%frame,120; cvt.u32.u32 %r387,%r23; cvt.s64.s8 %r386,%r387; add.u64 %r186,%r386,-1; .loc 1 342 3 mov.u64 %r161,0; add.u64 %r436,%frame,360; .loc 1 344 16 mov.u64 %r388,%r161; $L83: st.u64 [%r198],%r388; .loc 1 345 18 ld.u64 %r389,[%r197]; st.u64 [%r195],%r389; .loc 1 346 17 shl.b64 %r391,%r161,3; add.u64 %r392,%r436,%r391; .loc 1 346 10 ld.u64 %r393,[%r392]; setp.le.s64 %r394,%r393,0; @ %r394 bra $L54; .loc 1 342 26 add.u64 %r161,%r161,1; .loc 1 342 3 add.u64 %r198,%r198,8; add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; setp.ne.u64 %r395,%r161,%r186; @ %r395 bra $L83; bra $L84; $L82: .loc 1 276 12 mul.lo.u64 %r42,%r40,%r33; .loc 1 370 32 shl.b64 %r87,%r147,2; ld.u64 %r38,[%frame+480]; .loc 1 401 22 ld.u64 %r95,[%frame+240]; .loc 1 401 12 shl.b64 %r96,%r95,2; .loc 1 402 23 ld.u64 %r98,[%frame]; .loc 1 403 22 ld.u64 %r100,[%frame+120]; .loc 1 403 12 shl.b64 %r101,%r100,2; .loc 1 405 32 ld.u64 %r59,[%frame+360]; .loc 1 412 23 mul.lo.u64 %r37,%r95,%r59; .loc 1 413 24 mul.lo.u64 %r396,%r98,%r59; .loc 1 413 10 neg.s64 %r89,%r396; .loc 1 414 23 mul.lo.u64 %r397,%r100,%r59; .loc 1 414 9 shl.b64 %r398,%r397,2; neg.s64 %r132,%r398; setp.le.s64 %r434,%r143,1; .loc 1 377 19 mov.u32 %r440,1; cvt.u32.u32 %r442,%r23; cvt.s64.s8 %r443,%r442; add.u64 %r444,%r443,-1; add.u64 %r445,%frame,240; add.u64 %r446,%frame,360; add.u64 %r447,%frame,120; $L97: .loc 1 342 3 mov.u64 %r154,%r159; mov.u64 %r153,%r158; .loc 1 368 6 mov.u32 %r137,0; .loc 1 370 9 mov.u64 %r152,0; $L87: .loc 1 374 7 ld.s8 %r210,[%r154]; .loc 1 374 6 setp.eq.u32 %r437,%r210,0; @ %r437 bra $L85; .loc 1 378 10 ld.f32 %r400,[%r153]; setp.ge.f32 %r401,%r400,0fff800000; @ %r401 bra $L86; .loc 1 377 19 mov.u32 %r137,%r440; $L85: .loc 1 370 24 add.u64 %r152,%r152,1; .loc 1 370 32 add.u64 %r153,%r153,%r87; .loc 1 370 47 add.u64 %r154,%r154,%r42; .loc 1 370 2 setp.ne.u64 %r402,%r144,%r152; @ %r402 bra $L87; .loc 1 386 47 setp.ne.u32 %r403,%r137,0; selp.f32 %r136,0f7fc00000,0fff7fffff,%r403; bra $L88; $L91: .loc 1 394 6 @ %r437 bra $L89; .loc 1 394 16 ld.f32 %r91,[%r153]; .loc 1 394 13 setp.gt.f32 %r405,%r91,%r136; .loc 1 395 12 selp.f32 %r136,%r91,%r136,%r405; $L89: .loc 1 391 28 add.u64 %r152,%r152,1; .loc 1 391 36 add.u64 %r153,%r153,%r87; .loc 1 391 51 add.u64 %r154,%r154,%r42; 
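The loops around this point ($L87 and $L91 in _gfortran_mmaxval_r4) implement the masked single-precision reduction: a first pass skips masked-out elements and leading NaNs while remembering whether any unmasked NaN was seen, and a second pass performs the ordinary maximum. The sketch below illustrates that per-column logic under assumed parameter names, with a signed char mask as the ld.s8 mask loads suggest; it is an illustration, not the generated libgfortran code.

#include <math.h>
#include <float.h>

/* Hedged sketch of the inner masked reduction: scan one column of n
   elements with stride `stride`, honouring a logical mask with stride
   `mstride`.  The -FLT_MAX / NaN fill values are read off the f32
   immediates in the dump; everything else is an illustrative assumption. */
static float masked_maxval_column(const float *restrict src, long stride,
                                  const signed char *restrict mask,
                                  long mstride, long n)
{
    int saw_nan = 0;
    long i = 0;

    /* Phase 1 (mirrors the first PTX loop): skip masked-out elements and
       leading NaNs, remembering whether an unmasked NaN was seen. */
    for (; i < n; i++) {
        if (!mask[i * mstride])
            continue;
        float v = src[i * stride];
        if (v >= -INFINITY)      /* false only for NaN */
            break;
        saw_nan = 1;
    }
    if (i == n)                  /* nothing usable in this column */
        return saw_nan ? NAN : -FLT_MAX;

    /* Phase 2 (mirrors the second PTX loop): ordinary masked maximum. */
    float result = -INFINITY;
    for (; i < n; i++) {
        if (!mask[i * mstride])
            continue;
        float v = src[i * stride];
        if (v > result)
            result = v;
    }
    return result;
}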
.loc 1 391 11 setp.eq.u64 %r406,%r144,%r152; @ %r406 bra $L88; .loc 1 394 7 ld.s8 %r210,[%r154]; setp.eq.u32 %r437,%r210,0; bra $L91; $L99: .loc 1 386 47 mov.f32 %r136,0f7fc00000; $L88: .loc 1 397 8 st.f32 [%r160],%r136; .loc 1 400 15 add.u64 %r38,%r38,1; .loc 1 401 12 add.u64 %r158,%r158,%r96; .loc 1 402 13 add.u64 %r159,%r159,%r98; .loc 1 403 12 add.u64 %r160,%r160,%r101; .loc 1 405 13 setp.ne.u64 %r407,%r59,%r38; @ %r407 bra $L97; .loc 1 413 10 add.u64 %r155,%r159,%r89; .loc 1 414 9 add.u64 %r156,%r160,%r132; .loc 1 416 7 @ %r434 bra $L54; add.u64 %r169,%frame,488; .loc 1 412 23 mov.u64 %r103,%r37; .loc 1 416 7 mov.u64 %r141,8; .loc 1 415 5 mov.u64 %r157,1; .loc 1 409 13 mov.u64 %r439,0; bra $L94; $L95: st.u64 [%r169],%r439; .loc 1 412 23 mul.lo.u64 %r103,%r114,%r113; .loc 1 413 24 mul.lo.u64 %r413,%r116,%r113; .loc 1 413 10 sub.u64 %r155,%r159,%r413; .loc 1 414 23 mul.lo.u64 %r414,%r118,%r113; .loc 1 414 9 shl.b64 %r415,%r414,2; sub.u64 %r156,%r160,%r415; .loc 1 415 5 add.u64 %r157,%r157,1; .loc 1 416 7 add.u64 %r169,%r169,8; add.u64 %r141,%r141,8; setp.eq.u64 %r416,%r157,%r444; @ %r416 bra $L54; $L94: .loc 1 424 16 ld.u64 %r417,[%r169]; add.u64 %r113,%r417,1; st.u64 [%r169],%r113; .loc 1 425 23 add.u64 %r419,%r445,%r141; ld.u64 %r114,[%r419]; .loc 1 425 13 sub.u64 %r420,%r114,%r103; shl.b64 %r421,%r420,2; add.u64 %r158,%r158,%r421; .loc 1 426 24 add.u64 %r422,%frame,%r141; ld.u64 %r116,[%r422]; .loc 1 426 14 add.u64 %r159,%r155,%r116; .loc 1 427 23 add.u64 %r424,%r447,%r141; ld.u64 %r118,[%r424]; .loc 1 427 13 shl.b64 %r425,%r118,2; add.u64 %r160,%r156,%r425; .loc 1 405 32 add.u64 %r427,%r446,%r141; ld.u64 %r121,[%r427]; .loc 1 405 13 setp.eq.u64 %r428,%r113,%r121; @ %r428 bra $L95; .loc 1 409 13 mov.u64 %r38,0; bra $L97; $L86: .loc 1 383 9 setp.le.s64 %r429,%r144,%r152; @ %r429 bra $L99; mov.f32 %r136,0fff800000; bra $L91; $L119: .loc 1 298 6 ld.u64 %r430,[%r211]; setp.eq.u64 %r431,%r430,0; @ ! 
%r431 bra $L69; bra $L96; $L54: .loc 1 431_gfortran_smaxval_r4 .visible .func _gfortran_smaxval_r8u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64 %r276; .reg .predpredu64mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 454 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L122; .loc 1 454 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L123; r152; call _gfortran_maxval_r4.loc 1 461 7 bra $L121; $L123: .loc 1 464 10 ld.u64 %r23,[%r152]; .loc 1 464 7 add.u64 %r88,%r23,-1; .loc 1 465 10 ld.s8 %r24,[%r151+28]; .loc 1 465 38 add.u32 %r26,%r24,-1; .loc 1 465 8 cvt.s64.s32 %r294,%r26; .loc 1 467 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r294; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 467 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L125; .loc 1 474 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L126; $L132: .loc 1 482 3 setp.lt.s64 %r173,%r88,%r294; @ %r173 bra $L127; bra $L180; $L125: .loc 1 469 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error126: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 479 12 mov.u64 %r300,0; $L131: .loc 1 476 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 478 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L129; .loc 1 476 17 st.u64 [%r139],%r36; bra $L130; $L129: .loc 1 479 12 st.u64 [%r139],%r300; $L130: .loc 1 474 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L131; bra $L132; $L180: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L134; bra $L133; $L127: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r293,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r293,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 488 12 mov.u64 %r299,0; $L137: .loc 1 485 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 487 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L135; .loc 1 484 17 st.u64 [%r43],%r41; bra $L136; $L135: .loc 1 488 12 st.u64 [%r43],%r299; $L136: .loc 1 482 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L137; bra $L181; $L133: .loc 1 495 7 setp.eq.u64 %r211,%r294,0; @ %r211 bra $L139; add.u64 %r293,%frame,120; $L159: add.u64 %r111,%r150,40; mov.u64 %r107,%r293; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 498 10 mov.u64 %r81,1; .loc 1 502 4 mov.u64 %r237,0; bra $L140; $L139: .loc 1 506 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 507 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 509 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 509 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 509 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 511 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L141; bra $L182; $L143: .loc 1 500 48 mul.lo.u64 %r81,%r46,%r81; $L140: .loc 1 502 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 495 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L143; bra $L139; $L141: .loc 1 514 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 515 4 bra $L121; $L182: .loc 1 518 2496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 518 22 st.u64 [%r150],%r98; $L147: .loc 1 544 3 setp.ne.u64 %r247,%r294,0; @ %r247 bra $L144; bra $L145; $L134: .loc 1 522 19 ld.s8 %r55,[%r150+28]; .loc 1 522 10 setp.eq.u64 %r248,%r55,%r294; @ %r248 bra $L146; .loc 1 523 2 st.u64 [%stack+8],%r294; st.u64 [%stack],%r55;_gfortran_runtime_error146: .loc 1 528 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 528 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L147; .loc 1 530 4 setp.eq.u64 %r254,%r294,0; @ %r254 bra $L145; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 530 10 mov.u64 %r104,0; $L149: .loc 1 534 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 534 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 535 18 ld.u64 %r63,[%r31]; .loc 1 530 25 add.u64 %r104,%r104,1; .loc 1 535 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L148; .loc 1 536 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error148: .loc 1 530 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L149; $L144: .loc 1 546 16 
cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L150; shl.b64 %r264,%r294,3; bra $L151; $L150: mov.u64 %r264,8; $L151: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L152: .loc 1 547 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 544 26 add.u64 %r100,%r100,1; .loc 1 544 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r294,%r100; @ %r277 bra $L152; $L145: ld.u64 %r80,[%frame+240]; .loc 1 556 22 ld.u64 %r68,[%frame]; .loc 1 556 12 shl.b64 %r69,%r68,2; .loc 1 558 32 ld.u64 %r59,[%frame+120]; .loc 1 565 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r292,%r294,1; .loc 1 554 13 mov.f32 %r278,0fff7fffff; cvt.u32.u32 %r296,%r24; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; $L158: st.f32 [%r98],%r278; .loc 1 555 15 add.u64 %r80,%r80,1; .loc 1 556 12 add.u64 %r98,%r98,%r69; .loc 1 558 13 setp.ne.u64 %r279,%r59,%r80; @ %r279 bra $L158; .loc 1 567 7 @ ! %r292 bra $L183; bra $L121; $L157: .loc 1 562 13 st.u64 [%r99],%r295; .loc 1 565 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 566 5 add.u64 %r97,%r97,1; .loc 1 567 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r282,%r97,%r298; @ %r282 bra $L156; bra $L121; $L183: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 565 23 mov.u64 %r72,%r44; .loc 1 566 5 mov.u64 %r97,1; .loc 1 562 13 mov.u64 %r295,0; $L156: .loc 1 571 16 ld.u64 %r287,[%r99]; add.u64 %r74,%r287,1; st.u64 [%r99],%r74; .loc 1 572 23 ld.u64 %r75,[%r124]; .loc 1 572 13 sub.u64 %r288,%r75,%r72; shl.b64 %r289,%r288,2; add.u64 %r98,%r98,%r289; .loc 1 558 32 ld.u64 %r77,[%r123]; .loc 1 558 13 setp.eq.u64 %r290,%r74,%r77; @ %r290 bra $L157; .loc 1 562 13 mov.u64 %r80,0; bra $L158; $L181: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r291,%r98,0; @ ! 
%r291 bra $L134; bra $L159; $L121: .loc 1 maxval_r8.o/_gfortran_maxval_r8 .visible .func _gfortran_maxval_r8fortran/generated/maxval_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxval_r8 .visible .func _gfortran_mmaxval_r8gfortran_smaxval_r8 .visible .func _gfortran_smaxval_r8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,86,65,76114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,86,65,76,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,86,65,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_maxval_r8 .visible .func _gfortran_maxval_r848u32 %r194; .reg .u16 %r195; .reg .predu64 %r227; .reg .u64 %r228; .reg .u64 %r229; .reg .predpredu32u64 %r250; .reg .u64 %r251; .reg .u64 %r252; .reg .u64 %r253; .reg .predu32u16u64pred %r319; .reg .f64 %r321; .reg .pred %r322; .reg .pred %r323; .reg .pred %r324; .reg .pred %r325; .reg .pred %r326; .reg .u64 %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64u64 %r357; .reg .u64 %r358; .reg .u64 %r360; .reg .u64 %r361; mov.u64 %r181,%ar0; mov.u64 %r182,%ar1; mov.u64 %r183,%ar2; .loc 1 55 10 ld.s8 %r22,[%r182+28]; .loc 1 55 38 add.u32 %r184,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r109,%r184; .loc 1 56 10 ld.u64 %r25,[%r183]; .loc 1 56 7 add.u64 %r110,%r25,-1; .loc 1 58 7 shr.u64 %r186,%r110,63; set.u32.lt.s64 %r188,%r109,%r110; neg.s32 %r189,%r188; cvt.u16.u64 %r192,%r186; cvt.u16.u32 %r193,%r189; or.b16 %r191,%r192,%r193; .loc 1 58 6 cvt.u32.u16 %r194,%r191; cvt.u16.u8 %r195,%r194; setp.eq.u16 %r196,%r195,0; @ %r196 bra $L2; .loc 1 60 7 cvt.u32.u32 %r199,%r22; cvt.s64.s8 %r198,%r199; st.u64 [%stack+8],%r19819197_gfortran_runtime_errorr202,%r110,%r110; add.u64 %r203,%r202,%r110; shl.b64 %r204,%r203,3; add.u64 %r205,%r182,%r204; ld.u64 %r31,[%r205+56]; ld.u64 %r33,[%r205+48]; .loc 1 68 9 ld.u64 %r113,[%r205+40]; .loc 1 70 3 setp.ne.u64 %r219,%r110,0; @ %r219 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r220,%r109,%r110; @ %r220 bra $L4; bra $L52; $L3: add.u64 %r29,%r182,40; add.u64 %r126,%frame,120; add.u64 %r180,%frame,240; add.u64 %r221,%r182,16; add.u64 %r223,%r25,%r25; add.u64 %r224,%r223,%r25; shl.b64 %r225,%r224,3; add.u64 %r71,%r221,%r225; .loc 1 76 12 mov.u64 %r361,0; $L8: .loc 1 72 18 ld.u64 %r226,[%r29]; st.u64 
[%r126],%r226; .loc 1 73 19 ld.u64 %r228,[%r29+16]; add.u64 %r227,%r228,1; ld.u64 %r229,[%r29+8]; sub.u64 %r38,%r227,%r229; .loc 1 75 10 setp.lt.s64 %r230,%r38,0; @ %r230 bra $L6; .loc 1 73 17 st.u64 [%r180],%r38; bra $L7; $L6: .loc 1 76 12 st.u64 [%r180],%r361; $L7: .loc 1 70 3 add.u64 %r29,%r29,24; add.u64 %r126,%r126,8; add.u64 %r180,%r180,8; setp.ne.u64 %r232,%r71,%r29; @ %r232 bra $L8; bra $L9; $L52: .loc 1 87 6 ld.u64 %r233,[%r181]; setp.eq.u64 %r234,%r233,0; @ ! %r234 bra $L11; bra $L10; $L4: add.u64 %r236,%r25,%r25; add.u64 %r237,%r236,%r25238,40; add.u64 %r144,%r182,%r239; shl.b64 %r240,%r25,3; add.u64 %r107,%r240,-8; add.u64 %r349,%frame,120; add.u64 %r128,%r349,%r107; add.u64 %r348,%frame,240; add.u64 %r105,%r348,%r107; cvt.u32.u32 %r244,%r22; cvt.s64.s8 %r243,%r244; add.u64 %r246,%r243,%r243; add.u64 %r247,%r246,%r243; shl.b64 %r248,%r247,3; add.u64 %r249,%r182,40; add.u64 %r65,%r248,%r249; .loc 1 84 12 mov.u64 %r360,0; $L14: .loc 1 80 18 ld.u64 %r250,[%r144]; st.u64 [%r128],%r250; .loc 1 81 19 ld.u64 %r252,[%r144+16]; add.u64 %r251,%r252,1; ld.u64 %r253,[%r144+8]; sub.u64 %r45,%r251,%r253; .loc 1 83 10 setp.lt.s64 %r254,%r45,0; @ %r254 bra $L12; .loc 1 81 17 st.u64 [%r105],%r45; bra $L13; $L12: .loc 1 84 12 st.u64 [%r105],%r360; $L13: .loc 1 78 3 add.u64 %r144,%r144,24; add.u64 %r128,%r128,8; add.u64 %r105,%r105,8; setp.ne.u64 %r256,%r65,%r144; @ %r256 bra $L14; bra $L53; $L10: .loc 1 91 7 setp.le.s64 %r257,%r109,0; @ %r257 bra $L16; add.u64 %r348,%frame,240; $L38: add.u64 %r153,%r181,40; mov.u64 %r151,%r348; add.u64 %r258,%r181,16; cvt.u32.u32 %r260,%r22; cvt.s64.s8 %r259,%r260; add.u64 %r262,%r259,%r259; add.u64 %r263,%r262,%r259; shl.b64 %r264,%r263,3; add.u64 %r145,%r258,%r264; .loc 1 94 10 mov.u64 %r103,1; .loc 1 98 4 mov.u64 %r287,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r265,0; st.u64 [%r181+8],%r265; .loc 1 103 28 cvt.u16.u32 %r268,%r22; add.u16 %r267,%r268,-1; cvt.u32.u16 %r269,%r267; st.u8 [%r181+28],%r269; .loc 1 105 20 add.u32 %r270,%r22,-2; cvt.s64.s32 %r56,%r270; add.u64 %r272,%r56,%r56; add.u64 %r273,%r272,%r56; shl.b64 %r274,%r273,3; add.u64 %r275,%r181,%r274; .loc 1 105 67 shl.b64 %r277,%r56,3; add.u64 %r278,%frame,%r277; .loc 1 105 59 ld.u64 %r280,[%r275+40]; ld.u64 %r281,[%r278+240]; mul.lo.u64 %r115,%r280,%r281; .loc 1 107 29 mov.u64 %r283283; call (%value_in),_gfortrani_xmallocarray284,[%value_in]; } .loc 1 107 27 st.u64 [%r181],%r284; .loc 1 108 10 setp.eq.u64 %r286,%r115,0; @ ! 
%r286 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r103,%r52,%r103; $L17: .loc 1 98 4 st.u64 [%r153+8],%r287; ld.u64 %r52,[%r151]; add.u64 %r288,%r52,-1; st.u64 [%r153+16],%r288; st.u64 [%r153],%r103; .loc 1 91 7 add.u64 %r153,%r153,24; add.u64 %r151,%r151,8; setp.ne.u64 %r289,%r145,%r153; @ %r289 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r181+48],%r115; mov.u64 %r291,-1; st.u64 [%r181+56],%r291; mov.u64 %r292,1; st.u64 [%r181+40],%r292; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r62,[%r181+28]; .loc 1 118 10 setp.eq.u64 %r293,%r62,%r109; @ %r293 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r109; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r296,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r297,[%r296+36]; setp.eq.u32 %r298,%r297,0; @ %r298 bra $L21; .loc 1 125 2 add.u64 %r348,%frame,240; cvta.const.u64 %r302,$LC2r3430302; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r304,%r109,0; @ %r304 bra $L24; $L27: .loc 1 65 9 add.u64 %r305,%r31,1; .loc 1 65 7 sub.u64 %r111,%r305,%r33; max.s64 %r46,%r111,0; .loc 1 137 8 ld.u64 %r122,[%r182]; .loc 1 138 8 ld.u64 %r123,[%r181]; .loc 1 158 36 shl.b64 %r69,%r113,3; ld.u64 %r137,[%frame+360]; .loc 1 180 22 ld.u64 %r73,[%frame+120]; .loc 1 180 12 shl.b64 %r74,%r73,3; .loc 1 181 22 ld.u64 %r76,[%frame]; .loc 1 181 12 shl.b64 %r77,%r76,3; .loc 1 183 32 ld.u64 %r142,[%frame+240]; .loc 1 190 23 mul.lo.u64 %r306,%r73,%r142; .loc 1 190 9 shl.b64 %r307,%r306,3; neg.s64 %r143,%r307; .loc 1 191 23 mul.lo.u64 %r308,%r76,%r142; .loc 1 191 9 shl.b64 %r309,%r308,3; neg.s64 %r70,%r309; setp.gt.s64 %r350,%r111,0; setp.le.s64 %r351,%r109,1; cvt.u32.u32 %r353,%r22; cvt.s64.s8 %r354,%r353; add.u64 %r355,%r354,-1; add.u64 %r356,%frame,120; add.u64 %r357,%frame,240; .loc 1 187 13 mov.u64 %r358,0; bra $L25; $L24: add.u64 %r166,%frame,360; add.u64 %r165,%r181,40; mov.u64 %r163,%frame; cvt.u32.u32 %r311,%r22; cvt.s64.s8 %r310,%r311; add.u64 %r154,%r310,-1; .loc 1 129 3 mov.u64 %r124,0; add.u64 %r348,%frame,240; .loc 1 131 16 mov.u64 %r312,%r124; $L26: st.u64 [%r166],%r312; .loc 1 132 18 ld.u64 %r313,[%r165]; st.u64 [%r163],%r313; .loc 1 133 17 shl.b64 %r315,%r124,3; add.u64 %r316,%r348,%r315; .loc 1 133 10 ld.u64 %r317,[%r316]; setp.le.s64 %r318,%r317,0; @ %r318 bra $L1; .loc 1 129 26 add.u64 %r124,%r124,1; .loc 1 129 3 add.u64 %r166,%r166,8; add.u64 %r165,%r165,24; add.u64 %r163,%r163,8; setp.ne.u64 %r319,%r124,%r154; @ %r319 bra $L26; bra $L27; $L25: .loc 1 153 5 @ %r350 bra $L39; .loc 1 154 10 mov.f64 %r321,0dffefffffffffffff; st.f64 [%r123],%r321; bra $L29; $L39: mov.u64 %r139,%r122; .loc 1 158 13 mov.u64 %r129,0; $L28: .loc 1 163 7 ld.f64 %r179,[%r139]; mov.u64 %r117,%r129; .loc 1 158 28 add.u64 %r129,%r129,1; mov.u64 %r118,%r139; .loc 1 158 36 add.u64 %r139,%r139,%r69; .loc 1 163 6 setp.ge.f64 %r322,%r179,0dfff0000000000000; @ %r322 bra $L40; .loc 1 158 6 setp.gt.s64 %r323,%r46,%r129; @ %r323 bra $L28; .loc 1 167 15 mov.f64 %r104,0d7ff8000000000000; bra $L31; $L40: .loc 1 149 9 mov.f64 %r104,0dfff0000000000000; $L30: .loc 1 171 6 setp.lt.f64 %r324,%r104,%r179; .loc 1 172 12 selp.f64 %r104,%r179,%r104,%r324; .loc 1 168 28 add.u64 %r117,%r117,1; .loc 1 168 36 add.u64 %r118,%r118,%r69; .loc 1 168 11 setp.le.s64 %r325,%r46,%r117; @ %r325 bra $L31; .loc 1 171 7 ld.f64 %r179,[%r118]; bra $L30; $L31: .loc 1 175 12 st.f64 [%r123],%r104; $L29: .loc 1 179 15 add.u64 %r137,%r137,1; .loc 1 180 12 add.u64 %r122,%r122,%r74; .loc 1 181 12 add.u64 %r123,%r123,%r77; .loc 1 183 13 setp.ne.u64 %r326,%r137,%r142; 
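The bare hex immediates on the mov.f32/mov.f64 and setp instructions in the maxval_r4 and maxval_r8 kernels are raw IEEE-754 bit patterns. The short program below is a reading aid rather than anything stated by the dump; it decodes the patterns that appear here, namely the fill value for an empty reduction, the initial accumulator, and the all-NaN result.

#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Decode the IEEE-754 bit patterns that appear as PTX immediates in the
   maxval_r4 / maxval_r8 kernels above.  The hex values are copied from the
   dump; the decoding itself is mechanical. */
int main(void)
{
    uint32_t f32_bits[] = { 0xff7fffffu,    /* 0fff7fffff: -FLT_MAX fill   */
                            0xff800000u,    /* 0fff800000: -inf accumulator */
                            0x7fc00000u };  /* 0f7fc00000: quiet NaN result */
    uint64_t f64_bits[] = { 0xffefffffffffffffull,   /* -DBL_MAX  */
                            0xfff0000000000000ull,   /* -inf      */
                            0x7ff8000000000000ull }; /* quiet NaN */

    for (int i = 0; i < 3; i++) {
        float f; double d;
        memcpy(&f, &f32_bits[i], sizeof f);
        memcpy(&d, &f64_bits[i], sizeof d);
        printf("0f%08x = %g    0d%016llx = %g\n",
               (unsigned)f32_bits[i], (double)f,
               (unsigned long long)f64_bits[i], d);
    }
    return 0;
}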
// ==== libgfortran MAXVAL/REAL(8) entry points (nvptx PTX; instruction streams not recoverable from this extract) ====
// BEGIN GLOBAL FUNCTION DEF: _gfortran_maxval_r8   (.visible .func)  MAXVAL of a REAL(8) array along one dimension
// BEGIN GLOBAL FUNCTION DEF: _gfortran_mmaxval_r8  (.visible .func)  masked variant, takes an additional LOGICAL mask array
// BEGIN GLOBAL FUNCTION DEF: _gfortran_smaxval_r8  (.visible .func)  scalar-mask variant: calls _gfortran_maxval_r8 when the
//     mask is absent or true, otherwise fills the result with -HUGE(0d0)
// External symbols referenced: _gfortran_runtime_error, _gfortrani_xmallocarray,
//     _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents,
//     _gfortrani_compile_options (bounds-checking flag read at offset +36)
// Floating-point literals in the reduction loops: 0dffefffffffffffff (-HUGE(0d0)),
//     0dfff0000000000000 (-Inf), 0d7ff8000000000000 (NaN)
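The entry points above implement the Fortran MAXVAL intrinsic for REAL(8) data on the nvptx offload target. As a rough, hedged illustration only (the helper name maxval_dim1, the fixed two-dimensional shape, and the row-major layout are assumptions of this sketch; the real routine walks arbitrary-rank array descriptors and also handles NaN and empty extents), the reduction amounts to:

#include <stdio.h>
#include <float.h>

/* Reduce an m x n row-major matrix along its first dimension:
 * result[j] = max over i of a[i*n + j].  -DBL_MAX is the same
 * starting value as the 0dffefffffffffffff literal in the PTX. */
static void maxval_dim1(const double *a, int m, int n, double *result)
{
    for (int j = 0; j < n; ++j) {
        double acc = -DBL_MAX;
        for (int i = 0; i < m; ++i) {
            double v = a[i * n + j];
            if (v > acc)
                acc = v;
        }
        result[j] = acc;
    }
}

int main(void)
{
    const double a[2 * 3] = { 1.0, 5.0, -2.0,
                              4.0, 0.5,  7.0 };
    double r[3];
    maxval_dim1(a, 2, 3, r);
    for (int j = 0; j < 3; ++j)
        printf("%g ", r[j]);   /* prints: 4 5 7 */
    putchar('\n');
    return 0;
}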
// ==== archive members: maxval_r10.o, minloc0_4_i1.o  (1622802231) ====
// .file "fortran/generated/minloc0_4_i1.c"
// BEGIN GLOBAL FUNCTION DECL: _gfortran_minloc0_4_i1   (.visible .func)  MINLOC, no DIM, INTEGER(1) array, INTEGER(4) result indices
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_4_i1  (.visible .func)  masked variant
// BEGIN GLOBAL FUNCTION DECL: _gfortran_sminloc0_4_i1  (.visible .func)  scalar-mask variant
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options   (.extern .global .align 8 .u64)
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error, _gfortrani_xmallocarray,
//     _gfortrani_bounds_iforeach_return, _gfortrani_bounds_equal_extents   (.extern .func)
// String constants (decoded from the byte tables): "Rank of array needs to be > 0",
//     "MINLOC", "MASK argument", "Funny sized logical array"
// BEGIN GLOBAL FUNCTION DEF: _gfortran_minloc0_4_i1    (PTX body not recoverable)
// BEGIN GLOBAL FUNCTION DEF: _gfortran_mminloc0_4_i1   (PTX body not recoverable)
// BEGIN GLOBAL FUNCTION DEF: _gfortran_sminloc0_4_i1   (scalar-mask variant; PTX body not recoverable)
// ==== archive member: minloc0_8_i1.o  (1622802231) ====
// .file "fortran/generated/minloc0_8_i1.c"
// BEGIN GLOBAL FUNCTION DECL: _gfortran_minloc0_8_i1   (.visible .func)  MINLOC, no DIM, INTEGER(1) array, INTEGER(8) result indices
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_8_i1  (.visible .func)  masked variant
// BEGIN GLOBAL FUNCTION DECL: _gfortran_sminloc0_8_i1  (.visible .func)  scalar-mask variant
// External symbols: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray,
//     _gfortrani_bounds_iforeach_return, _gfortrani_bounds_equal_extents
// String constants: "Rank of array needs to be > 0", "MINLOC", "MASK argument", "Funny sized logical array"
// BEGIN GLOBAL FUNCTION DEF: _gfortran_minloc0_8_i1    (PTX body not recoverable)
// BEGIN GLOBAL FUNCTION DEF: _gfortran_mminloc0_8_i1   (PTX body not recoverable)
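The minloc0_* entry points compute MINLOC without a DIM argument: the position of the first minimum, one Fortran (1-based) index per array dimension. A minimal sketch of the one-dimensional INTEGER(1) case (the helper name minloc0_i1 and the flat-array interface are assumptions; the library versions take array descriptors and write the indices into a separately allocated result array):

#include <stdio.h>
#include <stdint.h>

/* Return the 1-based position of the first minimum, or 0 for an empty array. */
static int32_t minloc0_i1(const int8_t *a, int n)
{
    int32_t pos = 0;
    int8_t best = INT8_MAX;    /* start high; the pos == 0 test lets the first element win anyway */
    for (int i = 0; i < n; ++i) {
        if (pos == 0 || a[i] < best) {
            best = a[i];
            pos = i + 1;       /* Fortran indices are 1-based */
        }
    }
    return pos;
}

int main(void)
{
    const int8_t a[] = { 4, -7, 3, -7, 9 };
    printf("%d\n", minloc0_i1(a, 5));   /* prints 2: the first minimum wins */
    return 0;
}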
// BEGIN GLOBAL FUNCTION DEF: _gfortran_sminloc0_8_i1   (scalar-mask variant; PTX body not recoverable)
// ==== archive member: minloc0_16_i1.o  (1622802231) ====
// .file "fortran/generated/minloc0_16_i1.c"
// BEGIN GLOBAL FUNCTION DECL: _gfortran_minloc0_16_i1   (.visible .func)  MINLOC, no DIM, INTEGER(1) array, INTEGER(16) result indices
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_16_i1  (.visible .func)  masked variant
// BEGIN GLOBAL FUNCTION DECL: _gfortran_sminloc0_16_i1  (.visible .func)  scalar-mask variant
// External symbols: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray,
//     _gfortrani_bounds_iforeach_return, _gfortrani_bounds_equal_extents
// String constants: "Rank of array needs to be > 0", "MINLOC", "MASK argument", "Funny sized logical array"
// BEGIN GLOBAL FUNCTION DEF: _gfortran_minloc0_16_i1   (PTX body not recoverable; each index is stored as two 64-bit words)
// BEGIN GLOBAL FUNCTION DEF: _gfortran_mminloc0_16_i1  (PTX body not recoverable)
%r280,%r84,%r79; add.u64 %r93,%r93,%r280; .loc 1 359 32 add.u64 %r282,%r292,%r117; ld.u64 %r86,[%r282]; .loc 1 359 7 add.u64 %r118,%r118,8; add.u64 %r117,%r117,8; setp.eq.u64 %r283,%r81,%r86; @ %r283 bra $L62; ld.u64 %r158,[%frame+360]; bra $L63; $L65: mov.u64 %r90,%r45; setp.eq.u32 %r284,1,1; bra $L48; $L35: .loc 1 362_gfortran_sminloc0_16_i1 .visible .func _gfortran_sminloc0_16_i14predpred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 378 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 380 7 {_gfortran_minloc0_16_.loc 1 381 7 bra $L90; $L92: .loc 1 384 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36; .loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,4; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 mov.u64 %r68,%r32; $L97: st.u64 [%r29],%r68; st.u64 [%r29+8],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r70,%r30,%r32; @ %r70 bra $L97; $L90: .loc 1 406 minloc0_4_i2.o/ 1622802231_gfortran_minloc0_4_i2 .visible .func _gfortran_minloc0_4_i2fortran/generated/minloc0_4_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_4_i2 .visible .func _gfortran_mminloc0_4_i2_gfortran_sminloc0_4_i2 .visible .func _gfortran_sminloc0_4_i2VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_4_i2 .visible .func _gfortran_minloc0_4_i2u64 %r87; .reg .u64predpredpred %r168; .reg .pred %r169; .reg .u16 %r170; .reg .u16 %r171; .reg .pred %r172; .reg .u32 %r174; .reg .u32predu64 %r198; mov.u64 %r116,%ar0; mov.u64 %r117,%ar1; mov.u32 %r118,%ar2; .loc 1 50 10 ld.s8 %r22,[%r117+28]; .loc 1 50 8 cvt.u32.u32 %r119,%r22; cvt.s64.s8 %r72,%r119; .loc 1 51 6 setp.gt.s64 %r120,%r72,0; @ %r120 bra $L2; .loc 1 52 5 cvta.const.u64 %r121121_gfortran_runtime_errorr23,[%r116]; .loc 1 54 6 setp.ne.u64 %r123,%r23,0; @ %r123 bra $L3; .loc 1 56 7 st.u64 [%r116+48],%r23; add.u64 %r125,%r72,-1; st.u64 [%r116+56],%r125; mov.u64 %r126,1; st.u64 [%r116+40],%r126; .loc 1 57 28 cvt.u32.u64 %r127,%r126; st.u8 [%r116+28],%r127; .loc 1 58 24 st.u64 [%r116+8],%r23; .loc 1 59 29 mov.u64 %r130,4;r130; call (%value_in),_gfortrani_xmallocarray133,[%value_in]; } mov.u64 %r23,%r133; .loc 1 59 27 st.u64 [%r116],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r135,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r136,[%r135+36]; 
setp.eq.u32 %r137,%r136,0; @ %r137 bra $L4; .loc 1 64r140; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r116]; $L4: .loc 1 68 11 ld.u64 %r73,[%r116+40]; add.u64 %r39,%r117,40; mov.u64 %r45,%frame; add.u64 %r35,%frame,120; .loc 1 70 10 mov.u64 %r99,0; add.u64 %r193,%frame,240; .loc 1 74 16 mov.u64 %r148,%r99; bra $L7; $L22: mov.u64 %r99,%r79; $L7: .loc 1 72 18 ld.u64 %r141,[%r39]; st.u64 [%r45],%r141; .loc 1 73 19 ld.u64 %r143,[%r39+16]; add.u64 %r142,%r143,1; ld.u64 %r144,[%r39+8]; sub.u64 %r31,%r142,%r144; .loc 1 73 17 st.u64 [%r35],%r31; .loc 1 74 16 shl.b64 %r146,%r99,3; add.u64 %r147,%r193,%r146; st.u64 [%r147],%r148; .loc 1 75 10 setp.gt.s64 %r149,%r31,0; @ %r149 bra $L5; shl.b64 %r85,%r73,2; mov.u64 %r87,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 cvt.u32.u64 %r150,%r80; $L6: st.u32 [%r87],%r150; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r87,%r87,%r85; setp.ne.u64 %r151,%r72,%r80; @ %r151 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r99,1; .loc 1 70 3 add.u64 %r39,%r39,24; add.u64 %r45,%r45,8; add.u64 %r35,%r35,8; setp.ne.u64 %r152,%r72,%r79; @ %r152 bra $L22; .loc 1 84 8 ld.u64 %r64,[%r117]; shl.b64 %r82,%r73,2; mov.u64 %r92,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u32 %r153,1; $L8: st.u32 [%r92],%r153; mov.u64 %r32,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r92,%r92,%r82; setp.ne.u64 %r154,%r99,%r32; @ %r154 bra $L8; .loc 1 101 9 setp.ne.u64 %r155,%r64,0; @ ! %r155 bra $L1; .loc 1 149 19 ld.u64 %r49,[%frame]; .loc 1 149 9 add.u64 %r51,%r49,%r49; .loc 1 151 34 ld.u64 %r53,[%frame+120]; cvt.u32.u32 %r158,%r22; cvt.s64.s8 %r157,%r158; shl.b64 %r159,%r157,3; add.u64 %r102,%r193,%r159; .loc 1 99 12 mov.u32 %r67,32767; setp.eq.u32 %r192,%r118,0; add.u64 %r194,%frame,248; .loc 1 157 13 mov.u64 %r197,0; .loc 1 174 32 add.u64 %r198,%frame,120; $L21: .loc 1 151 21 ld.u64 %r110,[%frame+240]; .loc 1 126 10 @ %r192 bra $L23; mov.u64 %r40,%r110; mov.u64 %r62,%r64; $L14: .loc 1 129 10 ld.s16 %r33,[%r62]; .loc 1 129 9 cvt.u16.u32 %r162,%r33; cvt.u16.u32 %r163,%r67; setp.gt.s16 %r164,%r162,%r163; @ %r164 bra $L12; mov.u64 %r43,%r194; mov.u64 %r65,%r23; mov.u64 %r105,%r40; $L13: .loc 1 133 34 cvt.u32.u64 %r167,%r105; add.u32 %r166,%r167,1; .loc 1 133 23 st.u32 [%r65],%r166; .loc 1 132 3 add.u64 %r65,%r65,%r82; setp.eq.u64 %r168,%r43,%r102; @ %r168 bra $L24; .loc 1 133 30 ld.u64 %r105,[%r43]; add.u64 %r43,%r43,8; bra $L13; $L24: mov.u32 %r67,%r33; $L12: .loc 1 135 11 add.u64 %r62,%r62,%r51; .loc 1 137 9 add.u64 %r40,%r40,1; .loc 1 137 2 st.u64 [%frame+240],%r40; setp.ne.u64 %r169,%r40,%r53; @ %r169 bra $L14; bra $L34; $L23: mov.u64 %r52,%r110; mov.u64 %r63,%r64; $L11: .loc 1 141 10 ld.s16 %r42,[%r63]; .loc 1 141 9 cvt.u16.u32 %r170,%r42; cvt.u16.u32 %r171,%r67; setp.ge.s16 %r172,%r170,%r171; @ %r172 bra $L16; mov.u64 %r108,%r194; mov.u64 %r109,%r23; mov.u64 %r44,%r52; $L17: .loc 1 145 34 cvt.u32.u64 %r175,%r44; add.u32 %r174,%r175,1; .loc 1 145 23 st.u32 [%r109],%r174; .loc 1 144 3 add.u64 %r109,%r109,%r82; setp.eq.u64 %r176,%r102,%r108; @ %r176 bra $L25; .loc 1 145 30 ld.u64 %r44,[%r108]; add.u64 %r108,%r108,8; bra $L17; $L25: mov.u32 %r67,%r42; $L16: .loc 1 149 9 add.u64 %r63,%r63,%r51; .loc 1 151 14 add.u64 %r52,%r52,1; .loc 1 151 7 st.u64 [%frame+240],%r52; setp.ne.u64 %r177,%r52,%r53; @ %r177 bra $L11; .loc 1 149 9 sub.u64 %r178,%r53,%r110; mad.lo.u64 %r64,%r178,%r51,%r64; bra $L18; $L34: .loc 1 135 11 sub.u64 %r180,%r53,%r110; mad.lo.u64 %r64,%r180,%r51,%r64; $L18: mov.u64 %r81,%r194; .loc 1 151 34 mov.u64 %r61,%r53; mov.u64 
%r59,%r49; .loc 1 152 9 mov.u64 %r66,0; $L20: .loc 1 157 13 st.u64 [%r81+-8],%r197; .loc 1 160 23 mul.lo.u64 %r56,%r59,%r61; mov.u64 %r100,%r66; .loc 1 161 5 add.u64 %r66,%r66,1; .loc 1 162 7 setp.eq.u64 %r184,%r100,%r99; @ %r184 bra $L1; .loc 1 170 16 ld.u64 %r185,[%r81]; add.u64 %r58,%r185,1; st.u64 [%r81],%r58; shl.b64 %r70,%r66,3; .loc 1 171 23 add.u64 %r186,%frame,%r70; ld.u64 %r59,[%r186]; .loc 1 171 13 sub.u64 %r187,%r59,%r56; add.u64 %r188,%r187,%r187; add.u64 %r64,%r64,%r188; .loc 1 174 32 add.u64 %r190,%r198,%r70; ld.u64 %r61,[%r190]; .loc 1 174 7 add.u64 %r81,%r81,8; setp.eq.u64 %r191,%r58,%r61; @ %r191 bra $L20; bra $L21; $L1: .loc 1 1_gfortran_mminloc0_4_i2 .visible .func _gfortran_mminloc0_4_i2predpred %r193; .reg .u64 %r196; .reg .u6464pred %r228; .reg .u32 %r229; .reg .predpredu64predu64 %r279; .reg .u64 %r281; .reg .u64 %r282; .reg .u64pred %r299; mov.u64 %r166,%ar0; mov.u64 %r167,%ar1; mov.u64 %r168,%ar2; mov.u32 %r169,%ar3; .loc 1 202 6 setp.ne.u64 %r170,%r168,0; @ %r170 bra $L36gfortran_minloc0_4_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r167+28]; .loc 1 209 6 setp.gt.s32 %r174,%r22,0; @ %r174 bra $L38; .loc 1 210 5 cvta.const.u64 %r175175_gfortran_runtime_error%r177,[%r166]; setp.ne.u64 %r178,%r177,0; @ %r178 bra $L39; .loc 1 214 7 st.u64 [%r166+48],%r177; add.u32 %r180,%r22,-1; cvt.s64.s32 %r181,%r180; st.u64 [%r166+56],%r181; mov.u64 %r182,1; st.u64 [%r166+40],%r182; .loc 1 215 28 cvt.u32.u64 %r183,%r182; st.u8 [%r166+28],%r183; .loc 1 216 24 st.u64 [%r166+8],%r177; .loc 1 217 29 cvt.u32.u32 %r188,%r22; cvt.s64.s8 %r187,%r188call (%value_in),_gfortrani_xmallocarray189,[%value_in]; } .loc 1 217 27 st.u64 [%r166],%r189; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r191,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r192,[%r191+36];r167196; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r168+16]; .loc 1 233 9 ld.u64 %r96,[%r168]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r201,%r32,-4; and.b32 %r202,%r201,-5; set.u32.eq.u32 %r204,%r202,0; neg.s32 %r205,%r204; .loc 1 235 22 add.u32 %r206,%r32,-1; set.u32.le.u32 %r208,%r206,1; neg.s32 %r209,%r208; .loc 1 235 6 cvt.u16.u32 %r211,%r205; cvt.u16.u32 %r212,%r209; or.b16 %r210,%r211,%r212; cvt.u32.u16 %r213,%r210; cvt.u16.u8 %r214,%r213; setp.ne.u16 %r215,%r214,0; @ %r215 bra $L41; .loc 1 237 7 setp.ne.u32 %r217,%r32,16; @ %r217 bra $L42; $L41: .loc 1 244 11 ld.u64 %r109,[%r166+40]; .loc 1 245 8 ld.u64 %r110,[%r166]; .loc 1 246 17 cvt.u32.u32 %r218,%r22; cvt.s64.s8 %r67,%r218; add.u64 %r52,%r167,40; add.u64 %r289,%frame,120; mov.u64 %r118,%r289; add.u64 %r105,%r168,40; mov.u64 %r165,%frame; add.u64 %r162,%frame,240; add.u64 %r159,%frame,360; mov.u64 %r160,%r159; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r227,%r29; bra $L43; $L42: .loc 1 242 5_gfortran_runtime_error64: mov.u64 %r29,%r115; $L43: .loc 1 248 18 ld.u64 %r221,[%r52]; st.u64 [%r118],%r221; .loc 1 249 20 ld.u64 %r223,[%r105]; mul.lo.u64 %r222,%r223,%r31; .loc 1 249 18 st.u64 [%r165],%r222; .loc 1 250 19 ld.u64 %r225,[%r52+16]; add.u64 %r224,%r225,1; ld.u64 %r226,[%r52+8]; sub.u64 %r44,%r224,%r226; .loc 1 250 17 st.u64 [%r162],%r44; .loc 1 251 16 st.u64 [%r160],%r227; .loc 1 252 10 setp.gt.s64 %r228,%r44,0; @ %r228 bra $L44; shl.b64 %r149,%r109,2; mov.u64 %r138,%r110; .loc 1 255 11 mov.u64 %r116,0; .loc 1 256 24 cvt.u32.u64 %r229,%r116; $L45: st.u32 [%r138],%r229; .loc 1 255 27 add.u64 %r116,%r116,1; .loc 1 255 4 add.u64 
%r138,%r138,%r149; setp.ne.u64 %r230,%r67,%r116; @ %r230 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r115,%r29,1; .loc 1 246 3 add.u64 %r52,%r52,24; add.u64 %r118,%r118,8; add.u64 %r105,%r105,24; add.u64 %r165,%r165,8; add.u64 %r162,%r162,8; add.u64 %r160,%r160,8; setp.ne.u64 %r231,%r67,%r115; @ %r231 bra $L64; .loc 1 261 8 ld.u64 %r94,[%r167]; shl.b64 %r57,%r109,2; mov.u64 %r60,%r110; .loc 1 264 10 mov.u64 %r114,0; .loc 1 265 23 cvt.u32.u64 %r232,%r114; $L47: st.u32 [%r60],%r232; mov.u64 %r145,%r114; .loc 1 264 26 add.u64 %r114,%r114,1; .loc 1 264 3 add.u64 %r60,%r60,%r57; setp.ne.u64 %r233,%r29,%r145; @ %r233 bra $L47; .loc 1 276 9 setp.eq.u64 %r234,%r94,0; @ %r234 bra $L35; .loc 1 331 19 ld.u64 %r73,[%frame+120]; .loc 1 331 9 add.u64 %r75,%r73,%r73; .loc 1 332 20 ld.u64 %r76,[%frame]; .loc 1 334 34 ld.u64 %r79,[%frame+240]; ld.u64 %r163,[%frame+360]; cvt.u32.u32 %r237,%r22; cvt.s64.s8 %r236,%r237; shl.b64 %r238,%r236,3; add.u64 %r127,%r238,%r159; .loc 1 274 12 mov.u32 %r100,32767; setp.eq.u32 %r291,1,0; add.u64 %r290,%frame,368; .loc 1 340 13 mov.u64 %r296,0; .loc 1 359 32 add.u64 %r297,%frame,240; .loc 1 308 12 setp.eq.u32 %r299,%r169,0; $L63: .loc 1 280 10 @ %r291 bra $L48; mov.u64 %r53,%r163; mov.u32 %r46,0; mov.u32 %r293,1; $L53: mov.u64 %r45,%r94; .loc 1 300 13 add.u64 %r94,%r45,%r75; .loc 1 284 11 ld.s8 %r241,[%r96]; cvt.u16.u32 %r240,%r241; setp.eq.u16 %r242,%r240,0; @ %r242 bra $L49; setp.eq.u32 %r243,%r46,0; selp.u64 %r163,%r163,%r53,%r243; .loc 1 294 16 ld.s16 %r100,[%r45]; mov.u64 %r137,%r290; mov.u64 %r150,%r110; mov.u64 %r161,%r163; $L52: .loc 1 296 33 cvt.u32.u64 %r246,%r161; add.u32 %r245,%r246,1; .loc 1 296 22 st.u32 [%r150],%r245; .loc 1 295 9 add.u64 %r150,%r150,%r57; setp.eq.u64 %r247,%r127,%r137; @ %r247 bra $L65; .loc 1 296 29 ld.u64 %r161,[%r137]; add.u64 %r137,%r137,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r96,%r96,%r76; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r46,%r293; .loc 1 303 4 setp.ne.u64 %r248,%r53,%r79; @ %r248 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r299 bra $L66; mov.u64 %r64,%r163; mov.u64 %r92,%r94; $L58: .loc 1 311 11 ld.s8 %r251,[%r96]; cvt.u16.u32 %r250,%r251; setp.eq.u16 %r252,%r250,0; @ %r252 bra $L56; .loc 1 311 12 ld.s16 %r58,[%r92]; .loc 1 311 11 cvt.u16.u32 %r253,%r58; cvt.u16.u32 %r254,%r100; setp.gt.s16 %r255,%r253,%r254; @ %r255 bra $L56; mov.u64 %r103,%r290; mov.u64 %r119,%r110; mov.u64 %r156,%r64; .loc 1 314 18 mov.u64 %r146,%r296; $L57: .loc 1 315 36 cvt.u32.u64 %r258,%r156; add.u32 %r257,%r258,1; .loc 1 315 25 st.u32 [%r119],%r257; .loc 1 314 34 add.u64 %r113,%r146,1; .loc 1 314 11 add.u64 %r119,%r119,%r57; setp.le.s64 %r259,%r29,%r146; @ %r259 bra $L67; .loc 1 315 32 ld.u64 %r156,[%r103]; add.u64 %r103,%r103,8; mov.u64 %r146,%r113; bra $L57; $L67: mov.u32 %r100,%r58; $L56: .loc 1 317 8 add.u64 %r92,%r92,%r75; .loc 1 319 13 add.u64 %r64,%r64,1; .loc 1 319 6 st.u64 [%frame+360],%r64; setp.ne.u64 %r260,%r64,%r79; @ %r260 bra $L58; .loc 1 317 8 sub.u64 %r261,%r79,%r163; mad.lo.u64 %r94,%r261,%r75,%r94; bra $L59; $L66: mov.u64 %r78,%r163; mov.u64 %r95,%r96; mov.u64 %r93,%r94; $L55: .loc 1 323 11 ld.s8 %r264,[%r95]; cvt.u16.u32 %r263,%r264; setp.eq.u16 %r265,%r263,0; @ %r265 bra $L60; .loc 1 323 12 ld.s16 %r68,[%r93]; .loc 1 323 11 cvt.u16.u32 %r266,%r68; cvt.u16.u32 %r267,%r100; setp.ge.s16 %r268,%r266,%r267; @ %r268 bra $L60; mov.u64 %r91,%r290; mov.u64 %r97,%r110; mov.u64 %r151,%r78; .loc 1 326 12 mov.u64 %r147,%r296; $L61: .loc 1 327 36 cvt.u32.u64 %r271,%r151; add.u32 %r270,%r271,1; .loc 1 327 25 st.u32 [%r97],%r270; .loc 
1 326 28 add.u64 %r112,%r147,1; .loc 1 326 5 add.u64 %r97,%r97,%r57; setp.le.s64 %r272,%r29,%r147; @ %r272 bra $L68; .loc 1 327 32 ld.u64 %r151,[%r91]; add.u64 %r91,%r91,8; mov.u64 %r147,%r112; bra $L61; $L68: mov.u32 %r100,%r68; $L60: .loc 1 331 9 add.u64 %r93,%r93,%r75; .loc 1 332 10 add.u64 %r95,%r95,%r76; .loc 1 334 14 add.u64 %r78,%r78,1; .loc 1 334 7 st.u64 [%frame+360],%r78; setp.ne.u64 %r273,%r78,%r79; @ %r273 bra $L55; sub.u64 %r133,%r79,%r163; .loc 1 331 9 mad.lo.u64 %r94,%r75,%r133,%r94; .loc 1 332 10 mad.lo.u64 %r96,%r76,%r133,%r96; bra $L59; $L89: st.u64 [%frame+360],%r79; $L59: mov.u64 %r125,%r290; mov.u64 %r88,%r76; .loc 1 334 34 mov.u64 %r90,%r79; mov.u64 %r86,%r73; mov.u64 %r124,8; .loc 1 335 9 mov.u64 %r98,0; $L62: .loc 1 340 13 st.u64 [%r125+-8],%r296; .loc 1 343 23 mul.lo.u64 %r81,%r86,%r90; .loc 1 344 24 mul.lo.u64 %r83,%r90,%r88; mov.u64 %r104,%r98; .loc 1 345 5 add.u64 %r98,%r98,1; .loc 1 346 7 setp.eq.u64 %r278,%r29,%r104; @ %r278 bra $L35; .loc 1 354 16 ld.u64 %r279,[%r125]; add.u64 %r85,%r279,1; st.u64 [%r125],%r85; .loc 1 355 23 add.u64 %r281,%r289,%r124; ld.u64 %r86,[%r281]; .loc 1 355 13 sub.u64 %r282,%r86,%r81; add.u64 %r283,%r282,%r282; add.u64 %r94,%r94,%r283; .loc 1 356 24 add.u64 %r284,%frame,%r124; ld.u64 %r88,[%r284]; .loc 1 356 14 sub.u64 %r285,%r88,%r83; add.u64 %r96,%r96,%r285; .loc 1 359 32 add.u64 %r287,%r297,%r124; ld.u64 %r90,[%r287]; .loc 1 359 7 add.u64 %r125,%r125,8; add.u64 %r124,%r124,8; setp.eq.u64 %r288,%r85,%r90; @ %r288 bra $L62; ld.u64 %r163,[%frame+360]; bra $L63; $L65: mov.u64 %r94,%r45; setp.eq.u32 %r291,1,1; bra $L48; $L35: .loc 1 362_gfortran_sminloc0_4_i2 .visible .func _gfortran_sminloc0_4_i24predpred.reg .u64u32 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 378 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 380 7 {_gfortran_minloc0_4_.loc 1 381 7 bra $L90; $L92: .loc 1 384 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29 mov.u64 %r56call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36; .loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,2; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 cvt.u32.u64 %r68,%r32; $L97: st.u32 [%r29],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L97; $L90: .loc 1 406 minloc0_8_i2.o/ 1622802231_gfortran_minloc0_8_i2 .visible .func _gfortran_minloc0_8_i2fortran/generated/minloc0_8_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_8_i2 .visible .func _gfortran_mminloc0_8_i2_gfortran_sminloc0_8_i2 .visible .func _gfortran_sminloc0_8_i2VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func 
_gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_8_i2 .visible .func _gfortran_minloc0_8_i232predpredpredpredmov.u64 %r112,%ar0; mov.u64 %r113,%ar1; mov.u32 %r114,%ar2; .loc 1 50 10 ld.s8 %r22,[%r113+28]; .loc 1 50 8 cvt.u32.u32 %r115,%r22; cvt.s64.s8 %r71,%r115; .loc 1 51 6 setp.gt.s64 %r116,%r71,0; @ %r116 bra $L2; .loc 1 52 5 cvta.const.u64 %r11117_gfortran_runtime_errorr23,[%r112]; .loc 1 54 6 setp.ne.u64 %r119,%r23,0; @ %r119 bra $L3; .loc 1 56 7 st.u64 [%r112+48],%r23; add.u64 %r121,%r71,-1; st.u64 [%r112+56],%r121; mov.u64 %r122,1; st.u64 [%r112+40],%r122; .loc 1 57 28 cvt.u32.u64 %r123,%r122; st.u8 [%r112+28],%r123; .loc 1 58 24 st.u64 [%r112+8],%r23; .loc 1 59 29 mov.u64 %r171call (%value_in),_gfortrani_xmallocarray129,[%value_in]; } mov.u64 %r23,%r129; .loc 1 59 27 st.u64 [%r112],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r131,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r132,[%r131+36]; setp.eq.u32 %r133,%r132,0; @ %r133 bra $L4; .loc 1 64r113136; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r112]; $L4: .loc 1 68 11 ld.u64 %r72,[%r112+40]; add.u64 %r36,%r113,40; mov.u64 %r104,%frame; add.u64 %r105,%frame,120; .loc 1 70 10 mov.u64 %r98,0; add.u64 %r187,%frame,240; .loc 1 74 16 mov.u64 %r144,%r98; bra $L7; $L22: mov.u64 %r98,%r79; $L7: .loc 1 72 18 ld.u64 %r137,[%r36]; st.u64 [%r104],%r137; .loc 1 73 19 ld.u64 %r139,[%r36+16]; add.u64 %r138,%r139,1; ld.u64 %r140,[%r36+8]; sub.u64 %r31,%r138,%r140; .loc 1 73 17 st.u64 [%r105],%r31; .loc 1 74 16 shl.b64 %r142,%r98,3; add.u64 %r143,%r187,%r142; st.u64 [%r143],%r144; .loc 1 75 10 setp.gt.s64 %r145,%r31,0; @ %r145 bra $L5; shl.b64 %r81,%r72,3; mov.u64 %r87,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 mov.u64 %r146,%r80; $L6: st.u64 [%r87],%r146; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r87,%r87,%r81; setp.ne.u64 %r147,%r71,%r80; @ %r147 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r98,1; .loc 1 70 3 add.u64 %r36,%r36,24; add.u64 %r104,%r104,8; add.u64 %r105,%r105,8; setp.ne.u64 %r148,%r71,%r79; @ %r148 bra $L22; .loc 1 84 8 ld.u64 %r62,[%r113]; shl.b64 %r84,%r72,3; mov.u64 %r39,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u64 %r149,1; $L8: st.u64 [%r39],%r149; mov.u64 %r32,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r39,%r39,%r84; setp.ne.u64 %r150,%r98,%r32; @ %r150 bra $L8; .loc 1 101 9 setp.ne.u64 %r151,%r62,0; @ ! 
%r151 bra $L1; .loc 1 149 19 ld.u64 %r45,[%frame]; .loc 1 149 9 add.u64 %r47,%r45,%r45; .loc 1 151 34 ld.u64 %r51,[%frame+120]; cvt.u32.u32 %r154,%r22; cvt.s64.s8 %r153,%r154; shl.b64 %r155,%r153,3; add.u64 %r73,%r187,%r155; .loc 1 99 12 mov.u32 %r67,32767; setp.eq.u32 %r186,%r114,0; add.u64 %r188,%frame,248; .loc 1 157 13 mov.u64 %r191,0; .loc 1 174 32 add.u64 %r192,%frame,120; $L21: .loc 1 151 21 ld.u64 %r103,[%frame+240]; .loc 1 126 10 @ %r186 bra $L23; mov.u64 %r40,%r103; mov.u64 %r60,%r62; $L14: .loc 1 129 10 ld.s16 %r33,[%r60]; .loc 1 129 9 cvt.u16.u32 %r158,%r33; cvt.u16.u32 %r159,%r67; setp.gt.s16 %r160,%r158,%r159; @ %r160 bra $L12; mov.u64 %r76,%r188; mov.u64 %r34,%r23; mov.u64 %r90,%r40; $L13: .loc 1 133 34 add.u64 %r162,%r90,1; .loc 1 133 23 st.u64 [%r34],%r162; .loc 1 132 3 add.u64 %r34,%r34,%r84; setp.eq.u64 %r163,%r73,%r76; @ %r163 bra $L24; .loc 1 133 30 ld.u64 %r90,[%r76]; add.u64 %r76,%r76,8; bra $L13; $L24: mov.u32 %r67,%r33; $L12: .loc 1 135 11 add.u64 %r60,%r60,%r47; .loc 1 137 9 add.u64 %r40,%r40,1; .loc 1 137 2 st.u64 [%frame+240],%r40; setp.ne.u64 %r164,%r40,%r51; @ %r164 bra $L14; bra $L34; $L23: mov.u64 %r49,%r103; mov.u64 %r61,%r62; $L11: .loc 1 141 10 ld.s16 %r42,[%r61]; .loc 1 141 9 cvt.u16.u32 %r165,%r42; cvt.u16.u32 %r166,%r67; setp.ge.s16 %r167,%r165,%r166; @ %r167 bra $L16; mov.u64 %r85,%r188; mov.u64 %r101,%r23; mov.u64 %r86,%r49; $L17: .loc 1 145 34 add.u64 %r169,%r86,1; .loc 1 145 23 st.u64 [%r101],%r169; .loc 1 144 3 add.u64 %r101,%r101,%r84; setp.eq.u64 %r170,%r73,%r85; @ %r170 bra $L25; .loc 1 145 30 ld.u64 %r86,[%r85]; add.u64 %r85,%r85,8; bra $L17; $L25: mov.u32 %r67,%r42; $L16: .loc 1 149 9 add.u64 %r61,%r61,%r47; .loc 1 151 14 add.u64 %r49,%r49,1; .loc 1 151 7 st.u64 [%frame+240],%r49; setp.ne.u64 %r171,%r49,%r51; @ %r171 bra $L11; .loc 1 149 9 sub.u64 %r172,%r51,%r103; mad.lo.u64 %r62,%r172,%r47,%r62; bra $L18; $L34: .loc 1 135 11 sub.u64 %r174,%r51,%r103; mad.lo.u64 %r62,%r174,%r47,%r62; $L18: mov.u64 %r70,%r188; .loc 1 151 34 mov.u64 %r59,%r51; mov.u64 %r57,%r45; .loc 1 152 9 mov.u64 %r66,0; $L20: .loc 1 157 13 st.u64 [%r70+-8],%r191; .loc 1 160 23 mul.lo.u64 %r54,%r57,%r59; mov.u64 %r99,%r66; .loc 1 161 5 add.u64 %r66,%r66,1; .loc 1 162 7 setp.eq.u64 %r178,%r99,%r98; @ %r178 bra $L1; .loc 1 170 16 ld.u64 %r179,[%r70]; add.u64 %r56,%r179,1; st.u64 [%r70],%r56; shl.b64 %r65,%r66,3; .loc 1 171 23 add.u64 %r180,%frame,%r65; ld.u64 %r57,[%r180]; .loc 1 171 13 sub.u64 %r181,%r57,%r54; add.u64 %r182,%r181,%r181; add.u64 %r62,%r62,%r182; .loc 1 174 32 add.u64 %r184,%r192,%r65; ld.u64 %r59,[%r184]; .loc 1 174 7 add.u64 %r70,%r70,8; setp.eq.u64 %r185,%r56,%r59; @ %r185 bra $L20; bra $L21; $L1: .loc 1 1_gfortran_mminloc0_8_i2 .visible .func _gfortran_mminloc0_8_i29pred %r187; .reg .u64 %r190; .reg .u64u64pred %r222; .reg .u64 %r223; .reg .predu32 %r231; .reg .u64 %r232; .reg .u16 %r234; .reg .u32 %r235; .reg .pred %r236; .reg .predpred %r251; .reg .pred %r252; .reg .u64 %r253; .reg .u16 %r255; .reg .u32predpred %r281; .reg .u64 %r282; .reg .u32 %r284; .reg .u64 %r287; .reg .u64 %r288; .reg .pred %r290; mov.u64 %r160,%ar0; mov.u64 %r161,%ar1; mov.u64 %r162,%ar2; mov.u32 %r163,%ar3; .loc 1 202 6 setp.ne.u64 %r164,%r162,0; @ %r164 bra $L36; .loc 1 204 7 {1163; call _gfortran_minloc0_8_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r161+28]; .loc 1 209 6 setp.gt.s32 %r168,%r22,0; @ %r168 bra $L38; .loc 1 210 5 cvta.const.u64 %r16stack; call _gfortran_runtime_error%r171,[%r160]; setp.ne.u64 %r172,%r171,0; @ %r172 bra $L39; .loc 1 214 7 st.u64 
[%r160+48],%r171; add.u32 %r174,%r22,-1; cvt.s64.s32 %r175,%r174; st.u64 [%r160+56],%r175; mov.u64 %r176,1; st.u64 [%r160+40],%r176; .loc 1 215 28 cvt.u32.u64 %r177,%r176; st.u8 [%r160+28],%r177; .loc 1 216 24 st.u64 [%r160+8],%r171; .loc 1 217 29 cvt.u32.u32 %r182,%r22; cvt.s64.s8 %r181,%r182; mov.u64 %r180call (%value_in),_gfortrani_xmallocarray183,[%value_in]; } .loc 1 217 27 st.u64 [%r160],%r183; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r185,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r186,[%r185+36]; setp.eq.u32 %r187,%r186,0; @ %r187 bra $L40; .loc 1 224 4 cvta.const.u64 %r190,$LC1; {161190; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19316161193190; call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r162+16]; .loc 1 233 9 ld.u64 %r96,[%r162]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r195,%r32,-4; and.b32 %r196,%r195,-5; set.u32.eq.u32 %r198,%r196,0; neg.s32 %r199,%r198; .loc 1 235 22 add.u32 %r200,%r32,-1; set.u32.le.u32 %r202,%r200,1; neg.s32 %r203,%r202; .loc 1 235 6 cvt.u16.u32 %r205,%r199; cvt.u16.u32 %r206,%r203; or.b16 %r204,%r205,%r206; cvt.u32.u16 %r207,%r204; cvt.u16.u8 %r208,%r20741; .loc 1 237 7 setp.ne.u32 %r211,%r32,16; @ %r211 bra $L42; $L41: .loc 1 244 11 ld.u64 %r110,[%r160+40]; .loc 1 245 8 ld.u64 %r111,[%r160]; .loc 1 246 17 cvt.u32.u32 %r212,%r22; cvt.s64.s8 %r68,%r212; add.u64 %r153,%r161,40; add.u64 %r282,%frame,120; mov.u64 %r149,%r282; add.u64 %r147,%r162,40; mov.u64 %r146,%frame; add.u64 %r71,%frame,240; add.u64 %r64,%frame,360; mov.u64 %r47,%r64; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r221,%r29; bra $L43; $L42: .loc 1 242 5_gfortran_runtime_error64: mov.u64 %r29,%r116; $L43: .loc 1 248 18 ld.u64 %r215,[%r153]; st.u64 [%r149],%r215; .loc 1 249 20 ld.u64 %r217,[%r147]; mul.lo.u64 %r216,%r217,%r31; .loc 1 249 18 st.u64 [%r146],%r216; .loc 1 250 19 ld.u64 %r219,[%r153+16]; add.u64 %r218,%r219,1; ld.u64 %r220,[%r153+8]; sub.u64 %r44,%r218,%r220; .loc 1 250 17 st.u64 [%r71],%r44; .loc 1 251 16 st.u64 [%r47],%r221; .loc 1 252 10 setp.gt.s64 %r222,%r44,0; @ %r222 bra $L44; shl.b64 %r138,%r110,3; mov.u64 %r130,%r111; .loc 1 255 11 mov.u64 %r117,0; .loc 1 256 24 mov.u64 %r223,%r117; $L45: st.u64 [%r130],%r223; .loc 1 255 27 add.u64 %r117,%r117,1; .loc 1 255 4 add.u64 %r130,%r130,%r138; setp.ne.u64 %r224,%r68,%r117; @ %r224 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r116,%r29,1; .loc 1 246 3 add.u64 %r153,%r153,24; add.u64 %r149,%r149,8; add.u64 %r147,%r147,24; add.u64 %r146,%r146,8; add.u64 %r71,%r71,8; add.u64 %r47,%r47,8; setp.ne.u64 %r225,%r68,%r116; @ %r225 bra $L64; .loc 1 261 8 ld.u64 %r93,[%r161]; shl.b64 %r158,%r110,3; mov.u64 %r156,%r111; .loc 1 264 10 mov.u64 %r115,0; .loc 1 265 23 mov.u64 %r226,%r115; $L47: st.u64 [%r156],%r226; mov.u64 %r141,%r115; .loc 1 264 26 add.u64 %r115,%r115,1; .loc 1 264 3 add.u64 %r156,%r156,%r158; setp.ne.u64 %r227,%r29,%r141; @ %r227 bra $L47; .loc 1 276 9 setp.eq.u64 %r228,%r93,0; @ %r228 bra $L35; .loc 1 331 19 ld.u64 %r73,[%frame+120]; .loc 1 331 9 add.u64 %r75,%r73,%r73; .loc 1 332 20 ld.u64 %r76,[%frame]; .loc 1 334 34 ld.u64 %r79,[%frame+240]; ld.u64 %r155,[%frame+360]; cvt.u32.u32 %r231,%r22; cvt.s64.s8 %r230,%r231; shl.b64 %r232,%r230,3; add.u64 %r120,%r232,%r64; .loc 1 274 12 mov.u32 %r101,32767; setp.eq.u32 %r281,1,0; add.u64 %r280,%frame,368; .loc 1 340 13 mov.u64 %r287,0; .loc 1 359 32 add.u64 %r288,%frame,240; .loc 1 308 12 setp.eq.u32 %r290,%r163,0; $L63: .loc 1 280 10 @ %r281 bra $L48; mov.u64 %r56,%r155; mov.u32 %r46,0; 
mov.u32 %r284,1; $L53: mov.u64 %r45,%r93; .loc 1 300 13 add.u64 %r93,%r45,%r75; .loc 1 284 11 ld.s8 %r235,[%r96]; cvt.u16.u32 %r234,%r235; setp.eq.u16 %r236,%r234,0; @ %r236 bra $L49; setp.eq.u32 %r237,%r46,0; selp.u64 %r155,%r155,%r56,%r237; .loc 1 294 16 ld.s16 %r101,[%r45]; mov.u64 %r129,%r280; mov.u64 %r139,%r111; mov.u64 %r151,%r155; $L52: .loc 1 296 33 add.u64 %r239,%r151,1; .loc 1 296 22 st.u64 [%r139],%r239; .loc 1 295 9 add.u64 %r139,%r139,%r158; setp.eq.u64 %r240,%r120,%r129; @ %r240 bra $L65; .loc 1 296 29 ld.u64 %r151,[%r129]; add.u64 %r129,%r129,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r96,%r96,%r76; .loc 1 303 11 add.u64 %r56,%r56,1; mov.u32 %r46,%r284; .loc 1 303 4 setp.ne.u64 %r241,%r56,%r79; @ %r241 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r290 bra $L66; mov.u64 %r66,%r155; mov.u64 %r91,%r93; $L58: .loc 1 311 11 ld.s8 %r244,[%r96]; cvt.u16.u32 %r243,%r244; setp.eq.u16 %r245,%r243,0; @ %r245 bra $L56; .loc 1 311 12 ld.s16 %r61,[%r91]; .loc 1 311 11 cvt.u16.u32 %r246,%r61; cvt.u16.u32 %r247,%r101; setp.gt.s16 %r248,%r246,%r247; @ %r248 bra $L56; mov.u64 %r62,%r280; mov.u64 %r94,%r111; mov.u64 %r148,%r66; .loc 1 314 18 mov.u64 %r142,%r287; $L57: .loc 1 315 36 add.u64 %r250,%r148,1; .loc 1 315 25 st.u64 [%r94],%r250; .loc 1 314 34 add.u64 %r114,%r142,1; .loc 1 314 11 add.u64 %r94,%r94,%r158; setp.le.s64 %r251,%r29,%r142; @ %r251 bra $L67; .loc 1 315 32 ld.u64 %r148,[%r62]; add.u64 %r62,%r62,8; mov.u64 %r142,%r114; bra $L57; $L67: mov.u32 %r101,%r61; $L56: .loc 1 317 8 add.u64 %r91,%r91,%r75; .loc 1 319 13 add.u64 %r66,%r66,1; .loc 1 319 6 st.u64 [%frame+360],%r66; setp.ne.u64 %r252,%r66,%r79; @ %r252 bra $L58; .loc 1 317 8 sub.u64 %r253,%r79,%r155; mad.lo.u64 %r93,%r253,%r75,%r93; bra $L59; $L66: mov.u64 %r78,%r155; mov.u64 %r95,%r96; mov.u64 %r92,%r93; $L55: .loc 1 323 11 ld.s8 %r256,[%r95]; cvt.u16.u32 %r255,%r256; setp.eq.u16 %r257,%r255,0; @ %r257 bra $L60; .loc 1 323 12 ld.s16 %r69,[%r92]; .loc 1 323 11 cvt.u16.u32 %r258,%r69; cvt.u16.u32 %r259,%r101; setp.ge.s16 %r260,%r258,%r259; @ %r260 bra $L60; mov.u64 %r118,%r280; mov.u64 %r57,%r111; mov.u64 %r144,%r78; .loc 1 326 12 mov.u64 %r143,%r287; $L61: .loc 1 327 36 add.u64 %r262,%r144,1; .loc 1 327 25 st.u64 [%r57],%r262; .loc 1 326 28 add.u64 %r113,%r143,1; .loc 1 326 5 add.u64 %r57,%r57,%r158; setp.le.s64 %r263,%r29,%r143; @ %r263 bra $L68; .loc 1 327 32 ld.u64 %r144,[%r118]; add.u64 %r118,%r118,8; mov.u64 %r143,%r113; bra $L61; $L68: mov.u32 %r101,%r69; $L60: .loc 1 331 9 add.u64 %r92,%r92,%r75; .loc 1 332 10 add.u64 %r95,%r95,%r76; .loc 1 334 14 add.u64 %r78,%r78,1; .loc 1 334 7 st.u64 [%frame+360],%r78; setp.ne.u64 %r264,%r78,%r79; @ %r264 bra $L55; sub.u64 %r126,%r79,%r155; .loc 1 331 9 mad.lo.u64 %r93,%r75,%r126,%r93; .loc 1 332 10 mad.lo.u64 %r96,%r76,%r126,%r96; bra $L59; $L89: st.u64 [%frame+360],%r79; $L59: mov.u64 %r105,%r280; mov.u64 %r88,%r76; .loc 1 334 34 mov.u64 %r90,%r79; mov.u64 %r86,%r73; mov.u64 %r104,8; .loc 1 335 9 mov.u64 %r99,0; $L62: .loc 1 340 13 st.u64 [%r105+-8],%r287; .loc 1 343 23 mul.lo.u64 %r81,%r86,%r90; .loc 1 344 24 mul.lo.u64 %r83,%r90,%r88; mov.u64 %r70,%r99; .loc 1 345 5 add.u64 %r99,%r99,1; .loc 1 346 7 setp.eq.u64 %r269,%r29,%r70; @ %r269 bra $L35; .loc 1 354 16 ld.u64 %r270,[%r105]; add.u64 %r85,%r270,1; st.u64 [%r105],%r85; .loc 1 355 23 add.u64 %r272,%r282,%r104; ld.u64 %r86,[%r272]; .loc 1 355 13 sub.u64 %r273,%r86,%r81; add.u64 %r274,%r273,%r273; add.u64 %r93,%r93,%r274; .loc 1 356 24 add.u64 %r275,%frame,%r104; ld.u64 %r88,[%r275]; .loc 1 356 14 sub.u64 %r276,%r88,%r83; 
add.u64 %r96,%r96,%r276; .loc 1 359 32 add.u64 %r278,%r288,%r104; ld.u64 %r90,[%r278]; .loc 1 359 7 add.u64 %r105,%r105,8; add.u64 %r104,%r104,8; setp.eq.u64 %r279,%r85,%r90; @ %r279 bra $L62; ld.u64 %r155,[%frame+360]; bra $L63; $L65: mov.u64 %r93,%r45; setp.eq.u32 %r281,1,1; bra $L48; $L35: .loc 1 362_gfortran_sminloc0_8_i2 .visible .func _gfortran_sminloc0_8_i24predpred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 378 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 380 7 {_gfortran_minloc0_8_.loc 1 381 7 bra $L90; $L92: .loc 1 384 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29 mov.u64 call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36; .loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,3; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 mov.u64 %r68,%r32; $L97: st.u64 [%r29],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L97; $L90: .loc 1 406minloc0_16_i2.o/1622802231_gfortran_minloc0_16_i2 .visible .func _gfortran_minloc0_16_i2fortran/generated/minloc0_16_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_16_i2 .visible .func _gfortran_mminloc0_16_i2_gfortran_sminloc0_16_i2 .visible .func _gfortran_sminloc0_16_i2VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_16_i2 .visible .func _gfortran_minloc0_16_i23264pred %r168; .reg .pred %r169; .reg .u16 %r170; .reg .u16 %r171; .reg .pred.reg .u64 %r180; .reg .predmov.u32 %r116,%ar2; .loc 1 50 10 ld.s8 %r22,[%r115+28]; .loc 1 50 8 cvt.u32.u32 %r117,%r22; cvt.s64.s8 %r71,%r117; .loc 1 51 6 setp.gt.s64 %r118,%r71,0; @ %r118 bra $L2; .loc 1 52 5 cvta.const.u64 %r1111_gfortran_runtime_errorr23,[%r114]; .loc 1 54 6 setp.ne.u64 %r121,%r23,0; @ %r121 bra $L3; .loc 1 56 7 st.u64 [%r114+48],%r23; add.u64 %r123,%r71,-1; st.u64 [%r114+56],%r123; mov.u64 %r124,1; st.u64 [%r114+40],%r124; .loc 1 57 28 cvt.u32.u64 %r125,%r124; st.u8 [%r114+28],%r125; .loc 1 58 24 st.u64 [%r114+8],%r23; .loc 1 59 29 mov.u64 %r128,1671call (%value_in),_gfortrani_xmallocarray131,[%value_in]; } mov.u64 %r23,%r131; .loc 1 59 27 st.u64 [%r114],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r133,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r134,[%r133+36]; setp.eq.u32 %r135,%r134,0; @ %r135 bra $L4; .loc 1 64r115138; call _gfortrani_bounds_iforeach_return.loc 1 69 8 
ld.u64 %r23,[%r114]; $L4: .loc 1 68 11 ld.u64 %r72,[%r114+40]; add.u64 %r44,%r115,40; mov.u64 %r35,%frame; add.u64 %r52,%frame,120; .loc 1 70 10 mov.u64 %r100,0; add.u64 %r193,%frame,240; .loc 1 74 16 mov.u64 %r146,%r100; bra $L7; $L22: mov.u64 %r100,%r79; $L7: .loc 1 72 18 ld.u64 %r139,[%r44]; st.u64 [%r35],%r139; .loc 1 73 19 ld.u64 %r141,[%r44+16]; add.u64 %r140,%r141,1; ld.u64 %r142,[%r44+8]; sub.u64 %r32,%r140,%r142; .loc 1 73 17 st.u64 [%r52],%r32; .loc 1 74 16 shl.b64 %r144,%r100,3; add.u64 %r145,%r193,%r144; st.u64 [%r145],%r146; .loc 1 75 10 setp.gt.s64 %r147,%r32,0; @ %r147 bra $L5; shl.b64 %r86,%r72,4; mov.u64 %r88,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 mov.u64 %r148,%r80; $L6: st.u64 [%r88],%r148; st.u64 [%r88+8],%r148; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r88,%r88,%r86; setp.ne.u64 %r150,%r71,%r80; @ %r150 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r100,1; .loc 1 70 3 add.u64 %r44,%r44,24; add.u64 %r35,%r35,8; add.u64 %r52,%r52,8; setp.ne.u64 %r151,%r71,%r79; @ %r151 bra $L22; .loc 1 84 8 ld.u64 %r62,[%r115]; shl.b64 %r38,%r72,4; mov.u64 %r85,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u64 %r152,1; mov.u64 %r153,%r78; $L8: st.u64 [%r85],%r152; st.u64 [%r85+8],%r153; mov.u64 %r33,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r85,%r85,%r38; setp.ne.u64 %r154,%r100,%r33; @ %r154 bra $L8; .loc 1 101 9 setp.ne.u64 %r155,%r62,0; @ ! %r155 bra $L1; .loc 1 149 19 ld.u64 %r47,[%frame]; .loc 1 149 9 add.u64 %r49,%r47,%r47; .loc 1 151 34 ld.u64 %r51,[%frame+120]; cvt.u32.u32 %r158,%r22; cvt.s64.s8 %r157,%r158; shl.b64 %r159,%r157,3; add.u64 %r89,%r193,%r159; .loc 1 99 12 mov.u32 %r66,32767; setp.eq.u32 %r192,%r116,0; add.u64 %r194,%frame,248; .loc 1 157 13 mov.u64 %r197,0; .loc 1 174 32 add.u64 %r198,%frame,120; $L21: .loc 1 151 21 ld.u64 %r107,[%frame+240]; .loc 1 126 10 @ %r192 bra $L23; mov.u64 %r41,%r107; mov.u64 %r60,%r62; $L14: .loc 1 129 10 ld.s16 %r34,[%r60]; .loc 1 129 9 cvt.u16.u32 %r162,%r34; cvt.u16.u32 %r163,%r66; setp.gt.s16 %r164,%r162,%r163; @ %r164 bra $L12; mov.u64 %r27,%r194; mov.u64 %r63,%r23; mov.u64 %r103,%r41; $L13: .loc 1 133 34 add.u64 %r166,%r103,1; st.u64 [%r63],%r166; shr.s64 %r167,%r166,63; st.u64 [%r63+8],%r167; .loc 1 132 3 add.u64 %r63,%r63,%r38; setp.eq.u64 %r168,%r27,%r89; @ %r168 bra $L24; .loc 1 133 30 ld.u64 %r103,[%r27]; add.u64 %r27,%r27,8; bra $L13; $L24: mov.u32 %r66,%r34; $L12: .loc 1 135 11 add.u64 %r60,%r60,%r49; .loc 1 137 9 add.u64 %r41,%r41,1; .loc 1 137 2 st.u64 [%frame+240],%r41; setp.ne.u64 %r169,%r41,%r51; @ %r169 bra $L14; bra $L34; $L23: mov.u64 %r50,%r107; mov.u64 %r61,%r62; $L11: .loc 1 141 10 ld.s16 %r42,[%r61]; .loc 1 141 9 cvt.u16.u32 %r170,%r42; cvt.u16.u32 %r171,%r66; setp.ge.s16 %r172,%r170,%r171; @ %r172 bra $L16; mov.u64 %r104,%r194; mov.u64 %r105,%r23; mov.u64 %r84,%r50; $L17: .loc 1 145 34 add.u64 %r174,%r84,1; st.u64 [%r105],%r174; shr.s64 %r175,%r174,63; st.u64 [%r105+8],%r175; .loc 1 144 3 add.u64 %r105,%r105,%r38; setp.eq.u64 %r176,%r89,%r104; @ %r176 bra $L25; .loc 1 145 30 ld.u64 %r84,[%r104]; add.u64 %r104,%r104,8; bra $L17; $L25: mov.u32 %r66,%r42; $L16: .loc 1 149 9 add.u64 %r61,%r61,%r49; .loc 1 151 14 add.u64 %r50,%r50,1; .loc 1 151 7 st.u64 [%frame+240],%r50; setp.ne.u64 %r177,%r50,%r51; @ %r177 bra $L11; .loc 1 149 9 sub.u64 %r178,%r51,%r107; mad.lo.u64 %r62,%r178,%r49,%r62; bra $L18; $L34: .loc 1 135 11 sub.u64 %r180,%r51,%r107; mad.lo.u64 %r62,%r180,%r49,%r62; $L18: mov.u64 %r77,%r194; .loc 1 151 34 mov.u64 %r59,%r51; mov.u64 %r57,%r47; .loc 1 152 9 
mov.u64 %r65,0; $L20: .loc 1 157 13 st.u64 [%r77+-8],%r197; .loc 1 160 23 mul.lo.u64 %r54,%r57,%r59; mov.u64 %r101,%r65; .loc 1 161 5 add.u64 %r65,%r65,1; .loc 1 162 7 setp.eq.u64 %r184,%r101,%r100; @ %r184 bra $L1; .loc 1 170 16 ld.u64 %r185,[%r77]; add.u64 %r56,%r185,1; st.u64 [%r77],%r56; shl.b64 %r68,%r65,3; .loc 1 171 23 add.u64 %r186,%frame,%r68; ld.u64 %r57,[%r186]; .loc 1 171 13 sub.u64 %r187,%r57,%r54; add.u64 %r188,%r187,%r187; add.u64 %r62,%r62,%r188; .loc 1 174 32 add.u64 %r190,%r198,%r68; ld.u64 %r59,[%r190]; .loc 1 174 7 add.u64 %r77,%r77,8; setp.eq.u64 %r191,%r56,%r59; @ %r191 bra $L20; bra $L21; $L1: .loc 1 1_gfortran_mminloc0_16_i2 .visible .func _gfortran_mminloc0_16_i2u64u32 %r198; .reg .u32u16predpred %r246; .reg .pred %r247; .reg .u16 %r249; .reg .u32 %r250; .reg .predpredpredu64 %r280; .reg .u64 %r281; .reg .u64predpred %r298; mov.u64 %r163,%ar0; mov.u64 %r164,%ar1; mov.u64 %r165,%ar2; mov.u32 %r166,%ar3; .loc 1 202 6 setp.ne.u64 %r167,%r165,0; @ %r167 bra $L3616r1166; call _gfortran_minloc0_16_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r164+28]; .loc 1 209 6 setp.gt.s32 %r171,%r22,0; @ %r171 bra $L38; .loc 1 210 5 cvta.const.u64 %r172172_gfortran_runtime_error%r174,[%r163]; setp.ne.u64 %r175,%r174,0; @ %r175 bra $L39; .loc 1 214 7 st.u64 [%r163+48],%r174; add.u32 %r177,%r22,-1; cvt.s64.s32 %r178,%r177; st.u64 [%r163+56],%r178; mov.u64 %r179,1; st.u64 [%r163+40],%r179; .loc 1 215 28 cvt.u32.u64 %r180,%r179; st.u8 [%r163+28],%r180; .loc 1 216 24 st.u64 [%r163+8],%r174; .loc 1 217 29 cvt.u32.u32 %r185,%r22; cvt.s64.s8 %r184,%r185; mov.u64 %r183,16call (%value_in),_gfortrani_xmallocarray186,[%value_in]; } .loc 1 217 27 st.u64 [%r163],%r186; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r188,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r189,[%r188+36]; setp.eq.u32 %r190,%r189,0r164193; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r196193; call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r165+16]; .loc 1 233 9 ld.u64 %r93,[%r165]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r198,%r32,-4; and.b32 %r199,%r198,-5; set.u32.eq.u32 %r201,%r199,0; neg.s32 %r202,%r201; .loc 1 235 22 add.u32 %r203,%r32,-1; set.u32.le.u32 %r205,%r203,1; neg.s32 %r206,%r205; .loc 1 235 6 cvt.u16.u32 %r208,%r202; cvt.u16.u32 %r209,%r206; or.b16 %r207,%r208,%r209;41; .loc 1 237 7 setp.ne.u32 %r214,%r32,16; @ %r214 bra $L42; $L41: .loc 1 244 11 ld.u64 %r106,[%r163+40]; .loc 1 245 8 ld.u64 %r107,[%r163]; .loc 1 246 17 cvt.u32.u32 %r215,%r22; cvt.s64.s8 %r64,%r215; add.u64 %r161,%r164,40; add.u64 %r289,%frame,120; mov.u64 %r159,%r289; add.u64 %r158,%r165,40; mov.u64 %r154,%frame; add.u64 %r155,%frame,240; add.u64 %r151,%frame,360; mov.u64 %r150,%r151; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r224,%r29; bra $L43; $L42: .loc 1 242 5_gfortran_runtime_error64: mov.u64 %r29,%r112; $L43: .loc 1 248 18 ld.u64 %r218,[%r161]; st.u64 [%r159],%r218; .loc 1 249 20 ld.u64 %r220,[%r158]; mul.lo.u64 %r219,%r220,%r31; .loc 1 249 18 st.u64 [%r154],%r219; .loc 1 250 19 ld.u64 %r222,[%r161+16]; add.u64 %r221,%r222,1; ld.u64 %r223,[%r161+8]; sub.u64 %r44,%r221,%r223; .loc 1 250 17 st.u64 [%r155],%r44; .loc 1 251 16 st.u64 [%r150],%r224; .loc 1 252 10 setp.gt.s64 %r225,%r44,0; @ %r225 bra $L44; shl.b64 %r141,%r106,4; mov.u64 %r132,%r107; .loc 1 255 11 mov.u64 %r113,0; .loc 1 256 24 mov.u64 %r226,%r113; $L45: st.u64 [%r132],%r226; st.u64 [%r132+8],%r226; .loc 1 255 27 add.u64 %r113,%r113,1; .loc 1 255 4 add.u64 %r132,%r132,%r141; 
setp.ne.u64 %r228,%r64,%r113; @ %r228 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r112,%r29,1; .loc 1 246 3 add.u64 %r161,%r161,24; add.u64 %r159,%r159,8; add.u64 %r158,%r158,24; add.u64 %r154,%r154,8; add.u64 %r155,%r155,8; add.u64 %r150,%r150,8; setp.ne.u64 %r229,%r64,%r112; @ %r229 bra $L64; .loc 1 261 8 ld.u64 %r91,[%r164]; shl.b64 %r115,%r106,4; mov.u64 %r52,%r107; .loc 1 264 10 mov.u64 %r111,0; .loc 1 265 23 mov.u64 %r230,%r111; $L47: st.u64 [%r52],%r230; st.u64 [%r52+8],%r230; mov.u64 %r143,%r111; .loc 1 264 26 add.u64 %r111,%r111,1; .loc 1 264 3 add.u64 %r52,%r52,%r115; setp.ne.u64 %r232,%r29,%r143; @ %r232 bra $L47; .loc 1 276 9 setp.eq.u64 %r233,%r91,0; @ %r233 bra $L35; .loc 1 331 19 ld.u64 %r70,[%frame+120]; .loc 1 331 9 add.u64 %r72,%r70,%r70; .loc 1 332 20 ld.u64 %r73,[%frame]; .loc 1 334 34 ld.u64 %r76,[%frame+240]; ld.u64 %r160,[%frame+360]; cvt.u32.u32 %r236,%r22; cvt.s64.s8 %r235,%r236; shl.b64 %r237,%r235,3; add.u64 %r122,%r237,%r151; .loc 1 274 12 mov.u32 %r97,32767; setp.eq.u32 %r288,1,0; add.u64 %r290,%frame,368; .loc 1 340 13 mov.u64 %r295,0; .loc 1 359 32 add.u64 %r296,%frame,240; .loc 1 308 12 setp.eq.u32 %r298,%r166,0; $L63: .loc 1 280 10 @ %r288 bra $L48; mov.u64 %r53,%r160; mov.u32 %r46,0; mov.u32 %r292,1; $L53: mov.u64 %r45,%r91; .loc 1 300 13 add.u64 %r91,%r45,%r72; .loc 1 284 11 ld.s8 %r240,[%r93]; cvt.u16.u32 %r239,%r240; setp.eq.u16 %r241,%r239,0; @ %r241 bra $L49; setp.eq.u32 %r242,%r46,0; selp.u64 %r160,%r160,%r53,%r242; .loc 1 294 16 ld.s16 %r97,[%r45]; mov.u64 %r131,%r290; mov.u64 %r142,%r107; mov.u64 %r157,%r160; $L52: .loc 1 296 33 add.u64 %r244,%r157,1; st.u64 [%r142],%r244; shr.s64 %r245,%r244,63; st.u64 [%r142+8],%r245; .loc 1 295 9 add.u64 %r142,%r142,%r115; setp.eq.u64 %r246,%r122,%r131; @ %r246 bra $L65; .loc 1 296 29 ld.u64 %r157,[%r131]; add.u64 %r131,%r131,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r93,%r93,%r73; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r46,%r292; .loc 1 303 4 setp.ne.u64 %r247,%r53,%r76; @ %r247 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r298 bra $L66; mov.u64 %r62,%r160; mov.u64 %r89,%r91; $L58: .loc 1 311 11 ld.s8 %r250,[%r93]; cvt.u16.u32 %r249,%r250; setp.eq.u16 %r251,%r249,0; @ %r251 bra $L56; .loc 1 311 12 ld.s16 %r58,[%r89]; .loc 1 311 11 cvt.u16.u32 %r252,%r58; cvt.u16.u32 %r253,%r97; setp.gt.s16 %r254,%r252,%r253; @ %r254 bra $L56; mov.u64 %r94,%r290; mov.u64 %r98,%r107; mov.u64 %r153,%r62; .loc 1 314 18 mov.u64 %r144,%r295; $L57: .loc 1 315 36 add.u64 %r256,%r153,1; st.u64 [%r98],%r256; shr.s64 %r257,%r256,63; st.u64 [%r98+8],%r257; .loc 1 314 34 add.u64 %r110,%r144,1; .loc 1 314 11 add.u64 %r98,%r98,%r115; setp.le.s64 %r258,%r29,%r144; @ %r258 bra $L67; .loc 1 315 32 ld.u64 %r153,[%r94]; add.u64 %r94,%r94,8; mov.u64 %r144,%r110; bra $L57; $L67: mov.u32 %r97,%r58; $L56: .loc 1 317 8 add.u64 %r89,%r89,%r72; .loc 1 319 13 add.u64 %r62,%r62,1; .loc 1 319 6 st.u64 [%frame+360],%r62; setp.ne.u64 %r259,%r62,%r76; @ %r259 bra $L58; .loc 1 317 8 sub.u64 %r260,%r76,%r160; mad.lo.u64 %r91,%r260,%r72,%r91; bra $L59; $L66: mov.u64 %r75,%r160; mov.u64 %r92,%r93; mov.u64 %r90,%r91; $L55: .loc 1 323 11 ld.s8 %r263,[%r92]; cvt.u16.u32 %r262,%r263; setp.eq.u16 %r264,%r262,0; @ %r264 bra $L60; .loc 1 323 12 ld.s16 %r65,[%r90]; .loc 1 323 11 cvt.u16.u32 %r265,%r65; cvt.u16.u32 %r266,%r97; setp.ge.s16 %r267,%r265,%r266; @ %r267 bra $L60; mov.u64 %r57,%r290; mov.u64 %r66,%r107; mov.u64 %r149,%r75; .loc 1 326 12 mov.u64 %r145,%r295; $L61: .loc 1 327 36 add.u64 %r269,%r149,1; st.u64 [%r66],%r269; shr.s64 %r270,%r269,63; st.u64 
[%r66+8],%r270; .loc 1 326 28 add.u64 %r109,%r145,1; .loc 1 326 5 add.u64 %r66,%r66,%r115; setp.le.s64 %r271,%r29,%r145; @ %r271 bra $L68; .loc 1 327 32 ld.u64 %r149,[%r57]; add.u64 %r57,%r57,8; mov.u64 %r145,%r109; bra $L61; $L68: mov.u32 %r97,%r65; $L60: .loc 1 331 9 add.u64 %r90,%r90,%r72; .loc 1 332 10 add.u64 %r92,%r92,%r73; .loc 1 334 14 add.u64 %r75,%r75,1; .loc 1 334 7 st.u64 [%frame+360],%r75; setp.ne.u64 %r272,%r75,%r76; @ %r272 bra $L55; sub.u64 %r128,%r76,%r160; .loc 1 331 9 mad.lo.u64 %r91,%r72,%r128,%r91; .loc 1 332 10 mad.lo.u64 %r93,%r73,%r128,%r93; bra $L59; $L89: st.u64 [%frame+360],%r76; $L59: mov.u64 %r120,%r290; mov.u64 %r85,%r73; .loc 1 334 34 mov.u64 %r87,%r76; mov.u64 %r83,%r70; mov.u64 %r119,8; .loc 1 335 9 mov.u64 %r96,0; $L62: .loc 1 340 13 st.u64 [%r120+-8],%r295; .loc 1 343 23 mul.lo.u64 %r78,%r83,%r87; .loc 1 344 24 mul.lo.u64 %r80,%r87,%r85; mov.u64 %r95,%r96; .loc 1 345 5 add.u64 %r96,%r96,1; .loc 1 346 7 setp.eq.u64 %r277,%r29,%r95; @ %r277 bra $L35; .loc 1 354 16 ld.u64 %r278,[%r120]; add.u64 %r82,%r278,1; st.u64 [%r120],%r82; .loc 1 355 23 add.u64 %r280,%r289,%r119; ld.u64 %r83,[%r280]; .loc 1 355 13 sub.u64 %r281,%r83,%r78; add.u64 %r282,%r281,%r281; add.u64 %r91,%r91,%r282; .loc 1 356 24 add.u64 %r283,%frame,%r119; ld.u64 %r85,[%r283]; .loc 1 356 14 sub.u64 %r284,%r85,%r80; add.u64 %r93,%r93,%r284; .loc 1 359 32 add.u64 %r286,%r296,%r119; ld.u64 %r87,[%r286]; .loc 1 359 7 add.u64 %r120,%r120,8; add.u64 %r119,%r119,8; setp.eq.u64 %r287,%r82,%r87; @ %r287 bra $L62; ld.u64 %r160,[%frame+360]; bra $L63; $L65: mov.u64 %r91,%r45; setp.eq.u32 %r288,1,1; bra $L48; $L35: .loc 1 362_gfortran_sminloc0_16_i2 .visible .func _gfortran_sminloc0_16_i24predpred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 378 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 380 7 {_gfortran_minloc0_16_.loc 1 381 7 bra $L90; $L92: .loc 1 384 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36; .loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,4; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 mov.u64 %r68,%r32; $L97: st.u64 [%r29],%r68; st.u64 [%r29+8],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r70,%r30,%r32; @ %r70 bra $L97; $L90: .loc 1 406minloc0_4_i4.o/ 1622802232_gfortran_minloc0_4_i4 .visible .func _gfortran_minloc0_4_i4fortran/generated/minloc0_4_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_4_i4 .visible .func _gfortran_mminloc0_4_i4_gfortran_sminloc0_4_i4 .visible .func _gfortran_sminloc0_4_i4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents 
.extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_4_i4 .visible .func _gfortran_minloc0_4_i4u64 %r87; .reg .u64predpred %r154; .reg .predpredpredmov.u64 %r116,%ar0; mov.u64 %r117,%ar1; mov.u32 %r118,%ar2; .loc 1 50 10 ld.s8 %r22,[%r117+28]; .loc 1 50 8 cvt.u32.u32 %r119,%r22; cvt.s64.s8 %r72,%r119; .loc 1 51 6 setp.gt.s64 %r120,%r72,0; @ %r120 bra $L2; .loc 1 52 5 cvta.const.u64 %r121121_gfortran_runtime_errorr23,[%r116]; .loc 1 54 6 setp.ne.u64 %r123,%r23,0; @ %r123 bra $L3; .loc 1 56 7 st.u64 [%r116+48],%r23; add.u64 %r125,%r72,-1; st.u64 [%r116+56],%r125; mov.u64 %r126,1; st.u64 [%r116+40],%r126; .loc 1 57 28 cvt.u32.u64 %r127,%r126; st.u8 [%r116+28],%r127; .loc 1 58 24 st.u64 [%r116+8],%r23; .loc 1 59 29 mov.u64 %r130,4;r130; call (%value_in),_gfortrani_xmallocarray133,[%value_in]; } mov.u64 %r23,%r133; .loc 1 59 27 st.u64 [%r116],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r135,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r136,[%r135+36]; setp.eq.u32 %r137,%r136,0; @ %r137 bra $L4; .loc 1 64r140; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r116]; $L4: .loc 1 68 11 ld.u64 %r73,[%r116+40]; add.u64 %r39,%r117,40; mov.u64 %r45,%frame; add.u64 %r35,%frame,120; .loc 1 70 10 mov.u64 %r99,0; add.u64 %r188,%frame,240; .loc 1 74 16 mov.u64 %r148,%r99; bra $L7; $L22: mov.u64 %r99,%r79; $L7: .loc 1 72 18 ld.u64 %r141,[%r39]; st.u64 [%r45],%r141; .loc 1 73 19 ld.u64 %r143,[%r39+16]; add.u64 %r142,%r143,1; ld.u64 %r144,[%r39+8]; sub.u64 %r31,%r142,%r144; .loc 1 73 17 st.u64 [%r35],%r31; .loc 1 74 16 shl.b64 %r146,%r99,3; add.u64 %r147,%r188,%r146; st.u64 [%r147],%r148; .loc 1 75 10 setp.gt.s64 %r149,%r31,0; @ %r149 bra $L5; shl.b64 %r85,%r73,2; mov.u64 %r87,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 cvt.u32.u64 %r150,%r80; $L6: st.u32 [%r87],%r150; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r87,%r87,%r85; setp.ne.u64 %r151,%r72,%r80; @ %r151 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r99,1; .loc 1 70 3 add.u64 %r39,%r39,24; add.u64 %r45,%r45,8; add.u64 %r35,%r35,8; setp.ne.u64 %r152,%r72,%r79; @ %r152 bra $L22; .loc 1 84 8 ld.u64 %r64,[%r117]; shl.b64 %r82,%r73,2; mov.u64 %r92,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u32 %r153,1; $L8: st.u32 [%r92],%r153; mov.u64 %r32,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r92,%r92,%r82; setp.ne.u64 %r154,%r99,%r32; @ %r154 bra $L8; .loc 1 101 9 setp.ne.u64 %r155,%r64,0; @ ! 
[nvptx object-archive residue: the remainder of this section is PTX assembly from libgfortran's MINLOC support, carried as ar members inside the package payload and dumped here as text. The instruction streams themselves are legible, but the member headers, the .reg declaration blocks and the string constants between functions are corrupted, so only the recoverable information is kept below.

Members present (header form "minloc0_8_i4.o/ 1622802232"): the tail of minloc0_4_i4.o, followed by minloc0_8_i4.o, minloc0_16_i4.o, minloc0_4_i8.o and minloc0_8_i8.o, each compiled from fortran/generated/minloc0_<n>_i<m>.c for the suffixes 4_i4, 8_i4, 16_i4, 4_i8 and 8_i8.

Each member defines three .visible .func entry points and the matching external declarations:
  _gfortran_minloc0_<n>_i<m>    unmasked MINLOC, rank-0 result
  _gfortran_mminloc0_<n>_i<m>   MASK-array variant
  _gfortran_sminloc0_<n>_i<m>   scalar-MASK variant
  extern: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray,
          _gfortrani_bounds_iforeach_return, _gfortrani_bounds_equal_extents

The ASCII byte lists following each declaration block decode to runtime-error strings; the legible fragments read "Rank of array needs to be > 0", "MIN…ASK argumen…" (the MINLOC/MASK bounds message) and "…unny sized logical a…" ("Funny sized logical array").]
$L35: .loc 1 362_gfortran_sminloc0_8_i8 .visible .func _gfortran_sminloc0_8_i84predpred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 378 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 380 7 {_gfortran_minloc0_8_.loc 1 381 7 bra $L90; $L92: .loc 1 384 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29 mov.u64 call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36; .loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,3; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 mov.u64 %r68,%r32; $L97: st.u64 [%r29],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L97; $L90: .loc 1 406minloc0_16_i8.o/1622802232_gfortran_minloc0_16_i8 .visible .func _gfortran_minloc0_16_i8fortran/generated/minloc0_16_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_16_i8 .visible .func _gfortran_mminloc0_16_i8_gfortran_sminloc0_16_i8 .visible .func _gfortran_sminloc0_16_i8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_16_i8 .visible .func _gfortran_minloc0_16_i83264u32 %r157; .reg .u64 %r158; .reg .predpred %r166; .reg .predpred %r171; .reg .predpredmov.u64 %r114,%ar0; mov.u64 %r115,%ar1; mov.u32 %r116,%ar2; .loc 1 50 10 ld.s8 %r22,[%r115+28]; .loc 1 50 8 cvt.u32.u32 %r117,%r22; cvt.s64.s8 %r71,%r117; .loc 1 51 6 setp.gt.s64 %r118,%r71,0; @ %r118 bra $L2; .loc 1 52 5 cvta.const.u64 %r1111_gfortran_runtime_errorr23,[%r114]; .loc 1 54 6 setp.ne.u64 %r121,%r23,0; @ %r121 bra $L3; .loc 1 56 7 st.u64 [%r114+48],%r23; add.u64 %r123,%r71,-1; st.u64 [%r114+56],%r123; mov.u64 %r124,1; st.u64 [%r114+40],%r124; .loc 1 57 28 cvt.u32.u64 %r125,%r124; st.u8 [%r114+28],%r125; .loc 1 58 24 st.u64 [%r114+8],%r23; .loc 1 59 29 mov.u64 %r128,1671call (%value_in),_gfortrani_xmallocarray131,[%value_in]; } mov.u64 %r23,%r131; .loc 1 59 27 st.u64 [%r114],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r133,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r134,[%r133+36]; setp.eq.u32 %r135,%r134,0; @ %r135 bra $L4; .loc 1 64r115138; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r114]; $L4: .loc 1 68 11 ld.u64 %r72,[%r114+40]; add.u64 %r44,%r115,40; mov.u64 %r35,%frame; add.u64 %r52,%frame,120; .loc 1 70 10 mov.u64 %r100,0; add.u64 %r188,%frame,240; .loc 1 74 16 mov.u64 %r146,%r100; bra $L7; $L22: mov.u64 %r100,%r79; $L7: .loc 1 
72 18 ld.u64 %r139,[%r44]; st.u64 [%r35],%r139; .loc 1 73 19 ld.u64 %r141,[%r44+16]; add.u64 %r140,%r141,1; ld.u64 %r142,[%r44+8]; sub.u64 %r32,%r140,%r142; .loc 1 73 17 st.u64 [%r52],%r32; .loc 1 74 16 shl.b64 %r144,%r100,3; add.u64 %r145,%r188,%r144; st.u64 [%r145],%r146; .loc 1 75 10 setp.gt.s64 %r147,%r32,0; @ %r147 bra $L5; shl.b64 %r86,%r72,4; mov.u64 %r88,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 mov.u64 %r148,%r80; $L6: st.u64 [%r88],%r148; st.u64 [%r88+8],%r148; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r88,%r88,%r86; setp.ne.u64 %r150,%r71,%r80; @ %r150 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r100,1; .loc 1 70 3 add.u64 %r44,%r44,24; add.u64 %r35,%r35,8; add.u64 %r52,%r52,8; setp.ne.u64 %r151,%r71,%r79; @ %r151 bra $L22; .loc 1 84 8 ld.u64 %r62,[%r115]; shl.b64 %r38,%r72,4; mov.u64 %r85,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u64 %r152,1; mov.u64 %r153,%r78; $L8: st.u64 [%r85],%r152; st.u64 [%r85+8],%r153; mov.u64 %r33,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r85,%r85,%r38; setp.ne.u64 %r154,%r100,%r33; @ %r154 bra $L8; .loc 1 101 9 setp.ne.u64 %r155,%r62,0; @ ! %r155 bra $L1; .loc 1 149 19 ld.u64 %r47,[%frame]; .loc 1 149 9 shl.b64 %r49,%r47,3; .loc 1 151 34 ld.u64 %r51,[%frame+120]; cvt.u32.u32 %r157,%r22; cvt.s64.s8 %r156,%r157; shl.b64 %r158,%r156,3; add.u64 %r89,%r188,%r158; .loc 1 99 12 mov.u64 %r66,9223372036854775807; setp.eq.u32 %r187,%r116,0; add.u64 %r189,%frame,248; .loc 1 157 13 mov.u64 %r192,0; .loc 1 174 32 add.u64 %r193,%frame,120; $L21: .loc 1 151 21 ld.u64 %r107,[%frame+240]; .loc 1 126 10 @ %r187 bra $L23; mov.u64 %r41,%r107; mov.u64 %r60,%r62; $L14: .loc 1 129 10 ld.u64 %r34,[%r60]; .loc 1 129 9 setp.gt.s64 %r161,%r34,%r66; @ %r161 bra $L12; mov.u64 %r27,%r189; mov.u64 %r63,%r23; mov.u64 %r103,%r41; $L13: .loc 1 133 34 add.u64 %r163,%r103,1; st.u64 [%r63],%r163; shr.s64 %r164,%r163,63; st.u64 [%r63+8],%r164; .loc 1 132 3 add.u64 %r63,%r63,%r38; setp.eq.u64 %r165,%r27,%r89; @ %r165 bra $L24; .loc 1 133 30 ld.u64 %r103,[%r27]; add.u64 %r27,%r27,8; bra $L13; $L24: mov.u64 %r66,%r34; $L12: .loc 1 135 11 add.u64 %r60,%r60,%r49; .loc 1 137 9 add.u64 %r41,%r41,1; .loc 1 137 2 st.u64 [%frame+240],%r41; setp.ne.u64 %r166,%r41,%r51; @ %r166 bra $L14; bra $L34; $L23: mov.u64 %r50,%r107; mov.u64 %r61,%r62; $L11: .loc 1 141 10 ld.u64 %r42,[%r61]; .loc 1 141 9 setp.ge.s64 %r167,%r42,%r66; @ %r167 bra $L16; mov.u64 %r104,%r189; mov.u64 %r105,%r23; mov.u64 %r84,%r50; $L17: .loc 1 145 34 add.u64 %r169,%r84,1; st.u64 [%r105],%r169; shr.s64 %r170,%r169,63; st.u64 [%r105+8],%r170; .loc 1 144 3 add.u64 %r105,%r105,%r38; setp.eq.u64 %r171,%r89,%r104; @ %r171 bra $L25; .loc 1 145 30 ld.u64 %r84,[%r104]; add.u64 %r104,%r104,8; bra $L17; $L25: mov.u64 %r66,%r42; $L16: .loc 1 149 9 add.u64 %r61,%r61,%r49; .loc 1 151 14 add.u64 %r50,%r50,1; .loc 1 151 7 st.u64 [%frame+240],%r50; setp.ne.u64 %r172,%r50,%r51; @ %r172 bra $L11; .loc 1 149 9 sub.u64 %r173,%r51,%r107; mad.lo.u64 %r62,%r173,%r49,%r62; bra $L18; $L34: .loc 1 135 11 sub.u64 %r175,%r51,%r107; mad.lo.u64 %r62,%r175,%r49,%r62; $L18: mov.u64 %r77,%r189; .loc 1 151 34 mov.u64 %r59,%r51; mov.u64 %r57,%r47; .loc 1 152 9 mov.u64 %r65,0; $L20: .loc 1 157 13 st.u64 [%r77+-8],%r192; .loc 1 160 23 mul.lo.u64 %r54,%r57,%r59; mov.u64 %r101,%r65; .loc 1 161 5 add.u64 %r65,%r65,1; .loc 1 162 7 setp.eq.u64 %r179,%r101,%r100; @ %r179 bra $L1; .loc 1 170 16 ld.u64 %r180,[%r77]; add.u64 %r56,%r180,1; st.u64 [%r77],%r56; shl.b64 %r68,%r65,3; .loc 1 171 23 add.u64 %r181,%frame,%r68; ld.u64 
%r57,[%r181]; .loc 1 171 13 sub.u64 %r182,%r57,%r54; shl.b64 %r183,%r182,3; add.u64 %r62,%r62,%r183; .loc 1 174 32 add.u64 %r185,%r193,%r68; ld.u64 %r59,[%r185]; .loc 1 174 7 add.u64 %r77,%r77,8; setp.eq.u64 %r186,%r56,%r59; @ %r186 bra $L20; bra $L21; $L1: .loc 1 1_gfortran_mminloc0_16_i8 .visible .func _gfortran_mminloc0_16_i8u64u32 %r198; .reg .u32u64pred %r293; mov.u64 %r163,%ar0; mov.u64 %r164,%ar1; mov.u64 %r165,%ar2; mov.u32 %r166,%ar3; .loc 1 202 6 setp.ne.u64 %r167,%r165,0; @ %r167 bra $L3616r1166; call _gfortran_minloc0_16_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r164+28]; .loc 1 209 6 setp.gt.s32 %r171,%r22,0; @ %r171 bra $L38; .loc 1 210 5 cvta.const.u64 %r172172_gfortran_runtime_error%r174,[%r163]; setp.ne.u64 %r175,%r174,0; @ %r175 bra $L39; .loc 1 214 7 st.u64 [%r163+48],%r174; add.u32 %r177,%r22,-1; cvt.s64.s32 %r178,%r177; st.u64 [%r163+56],%r178; mov.u64 %r179,1; st.u64 [%r163+40],%r179; .loc 1 215 28 cvt.u32.u64 %r180,%r179; st.u8 [%r163+28],%r180; .loc 1 216 24 st.u64 [%r163+8],%r174; .loc 1 217 29 cvt.u32.u32 %r185,%r22; cvt.s64.s8 %r184,%r185; mov.u64 %r183,16call (%value_in),_gfortrani_xmallocarray186,[%value_in]; } .loc 1 217 27 st.u64 [%r163],%r186; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r188,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r189,[%r188+36]; setp.eq.u32 %r190,%r189,0r164193; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r196193; call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r165+16]; .loc 1 233 9 ld.u64 %r93,[%r165]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r198,%r32,-4; and.b32 %r199,%r198,-5; set.u32.eq.u32 %r201,%r199,0; neg.s32 %r202,%r201; .loc 1 235 22 add.u32 %r203,%r32,-1; set.u32.le.u32 %r205,%r203,1; neg.s32 %r206,%r205; .loc 1 235 6 cvt.u16.u32 %r208,%r202; cvt.u16.u32 %r209,%r206; or.b16 %r207,%r208,%r209;41; .loc 1 237 7 setp.ne.u32 %r214,%r32,16; @ %r214 bra $L42; $L41: .loc 1 244 11 ld.u64 %r106,[%r163+40]; .loc 1 245 8 ld.u64 %r107,[%r163]; .loc 1 246 17 cvt.u32.u32 %r215,%r22; cvt.s64.s8 %r64,%r215; add.u64 %r161,%r164,40; add.u64 %r284,%frame,120; mov.u64 %r159,%r284; add.u64 %r158,%r165,40; mov.u64 %r154,%frame; add.u64 %r155,%frame,240; add.u64 %r151,%frame,360; mov.u64 %r150,%r151; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r224,%r29; bra $L43; $L42: .loc 1 242 5_gfortran_runtime_error64: mov.u64 %r29,%r112; $L43: .loc 1 248 18 ld.u64 %r218,[%r161]; st.u64 [%r159],%r218; .loc 1 249 20 ld.u64 %r220,[%r158]; mul.lo.u64 %r219,%r220,%r31; .loc 1 249 18 st.u64 [%r154],%r219; .loc 1 250 19 ld.u64 %r222,[%r161+16]; add.u64 %r221,%r222,1; ld.u64 %r223,[%r161+8]; sub.u64 %r44,%r221,%r223; .loc 1 250 17 st.u64 [%r155],%r44; .loc 1 251 16 st.u64 [%r150],%r224; .loc 1 252 10 setp.gt.s64 %r225,%r44,0; @ %r225 bra $L44; shl.b64 %r141,%r106,4; mov.u64 %r132,%r107; .loc 1 255 11 mov.u64 %r113,0; .loc 1 256 24 mov.u64 %r226,%r113; $L45: st.u64 [%r132],%r226; st.u64 [%r132+8],%r226; .loc 1 255 27 add.u64 %r113,%r113,1; .loc 1 255 4 add.u64 %r132,%r132,%r141; setp.ne.u64 %r228,%r64,%r113; @ %r228 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r112,%r29,1; .loc 1 246 3 add.u64 %r161,%r161,24; add.u64 %r159,%r159,8; add.u64 %r158,%r158,24; add.u64 %r154,%r154,8; add.u64 %r155,%r155,8; add.u64 %r150,%r150,8; setp.ne.u64 %r229,%r64,%r112; @ %r229 bra $L64; .loc 1 261 8 ld.u64 %r91,[%r164]; shl.b64 %r115,%r106,4; mov.u64 %r52,%r107; .loc 1 264 10 mov.u64 %r111,0; .loc 1 265 23 mov.u64 %r230,%r111; $L47: st.u64 [%r52],%r230; st.u64 [%r52+8],%r230; mov.u64 
%r143,%r111; .loc 1 264 26 add.u64 %r111,%r111,1; .loc 1 264 3 add.u64 %r52,%r52,%r115; setp.ne.u64 %r232,%r29,%r143; @ %r232 bra $L47; .loc 1 276 9 setp.eq.u64 %r233,%r91,0; @ %r233 bra $L35; .loc 1 331 19 ld.u64 %r70,[%frame+120]; .loc 1 331 9 shl.b64 %r72,%r70,3; .loc 1 332 20 ld.u64 %r73,[%frame]; .loc 1 334 34 ld.u64 %r76,[%frame+240]; ld.u64 %r160,[%frame+360]; cvt.u32.u32 %r235,%r22; cvt.s64.s8 %r234,%r235; shl.b64 %r236,%r234,3; add.u64 %r122,%r236,%r151; .loc 1 274 12 mov.u64 %r97,9223372036854775807; setp.eq.u32 %r283,1,0; add.u64 %r285,%frame,368; .loc 1 340 13 mov.u64 %r290,0; .loc 1 359 32 add.u64 %r291,%frame,240; .loc 1 308 12 setp.eq.u32 %r293,%r166,0; $L63: .loc 1 280 10 @ %r283 bra $L48; mov.u64 %r53,%r160; mov.u32 %r46,0; mov.u32 %r287,1; $L53: mov.u64 %r45,%r91; .loc 1 300 13 add.u64 %r91,%r45,%r72; .loc 1 284 11 ld.s8 %r239,[%r93]; cvt.u16.u32 %r238,%r239; setp.eq.u16 %r240,%r238,0; @ %r240 bra $L49; setp.eq.u32 %r241,%r46,0; selp.u64 %r160,%r160,%r53,%r241; .loc 1 294 16 ld.u64 %r97,[%r45]; mov.u64 %r131,%r285; mov.u64 %r142,%r107; mov.u64 %r157,%r160; $L52: .loc 1 296 33 add.u64 %r243,%r157,1; st.u64 [%r142],%r243; shr.s64 %r244,%r243,63; st.u64 [%r142+8],%r244; .loc 1 295 9 add.u64 %r142,%r142,%r115; setp.eq.u64 %r245,%r122,%r131; @ %r245 bra $L65; .loc 1 296 29 ld.u64 %r157,[%r131]; add.u64 %r131,%r131,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r93,%r93,%r73; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r46,%r287; .loc 1 303 4 setp.ne.u64 %r246,%r53,%r76; @ %r246 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r293 bra $L66; mov.u64 %r62,%r160; mov.u64 %r89,%r91; $L58: .loc 1 311 11 ld.s8 %r249,[%r93]; cvt.u16.u32 %r248,%r249; setp.eq.u16 %r250,%r248,0; @ %r250 bra $L56; .loc 1 311 12 ld.u64 %r58,[%r89]; .loc 1 311 11 setp.gt.s64 %r251,%r58,%r97; @ %r251 bra $L56; mov.u64 %r94,%r285; mov.u64 %r98,%r107; mov.u64 %r153,%r62; .loc 1 314 18 mov.u64 %r144,%r290; $L57: .loc 1 315 36 add.u64 %r253,%r153,1; st.u64 [%r98],%r253; shr.s64 %r254,%r253,63; st.u64 [%r98+8],%r254; .loc 1 314 34 add.u64 %r110,%r144,1; .loc 1 314 11 add.u64 %r98,%r98,%r115; setp.le.s64 %r255,%r29,%r144; @ %r255 bra $L67; .loc 1 315 32 ld.u64 %r153,[%r94]; add.u64 %r94,%r94,8; mov.u64 %r144,%r110; bra $L57; $L67: mov.u64 %r97,%r58; $L56: .loc 1 317 8 add.u64 %r89,%r89,%r72; .loc 1 319 13 add.u64 %r62,%r62,1; .loc 1 319 6 st.u64 [%frame+360],%r62; setp.ne.u64 %r256,%r62,%r76; @ %r256 bra $L58; .loc 1 317 8 sub.u64 %r257,%r76,%r160; mad.lo.u64 %r91,%r257,%r72,%r91; bra $L59; $L66: mov.u64 %r75,%r160; mov.u64 %r92,%r93; mov.u64 %r90,%r91; $L55: .loc 1 323 11 ld.s8 %r260,[%r92]; cvt.u16.u32 %r259,%r260; setp.eq.u16 %r261,%r259,0; @ %r261 bra $L60; .loc 1 323 12 ld.u64 %r65,[%r90]; .loc 1 323 11 setp.ge.s64 %r262,%r65,%r97; @ %r262 bra $L60; mov.u64 %r57,%r285; mov.u64 %r66,%r107; mov.u64 %r149,%r75; .loc 1 326 12 mov.u64 %r145,%r290; $L61: .loc 1 327 36 add.u64 %r264,%r149,1; st.u64 [%r66],%r264; shr.s64 %r265,%r264,63; st.u64 [%r66+8],%r265; .loc 1 326 28 add.u64 %r109,%r145,1; .loc 1 326 5 add.u64 %r66,%r66,%r115; setp.le.s64 %r266,%r29,%r145; @ %r266 bra $L68; .loc 1 327 32 ld.u64 %r149,[%r57]; add.u64 %r57,%r57,8; mov.u64 %r145,%r109; bra $L61; $L68: mov.u64 %r97,%r65; $L60: .loc 1 331 9 add.u64 %r90,%r90,%r72; .loc 1 332 10 add.u64 %r92,%r92,%r73; .loc 1 334 14 add.u64 %r75,%r75,1; .loc 1 334 7 st.u64 [%frame+360],%r75; setp.ne.u64 %r267,%r75,%r76; @ %r267 bra $L55; sub.u64 %r128,%r76,%r160; .loc 1 331 9 mad.lo.u64 %r91,%r72,%r128,%r91; .loc 1 332 10 mad.lo.u64 %r93,%r73,%r128,%r93; bra $L59; $L89: st.u64 
[%frame+360],%r76; $L59: mov.u64 %r120,%r285; mov.u64 %r85,%r73; .loc 1 334 34 mov.u64 %r87,%r76; mov.u64 %r83,%r70; mov.u64 %r119,8; .loc 1 335 9 mov.u64 %r96,0; $L62: .loc 1 340 13 st.u64 [%r120+-8],%r290; .loc 1 343 23 mul.lo.u64 %r78,%r83,%r87; .loc 1 344 24 mul.lo.u64 %r80,%r87,%r85; mov.u64 %r95,%r96; .loc 1 345 5 add.u64 %r96,%r96,1; .loc 1 346 7 setp.eq.u64 %r272,%r29,%r95; @ %r272 bra $L35; .loc 1 354 16 ld.u64 %r273,[%r120]; add.u64 %r82,%r273,1; st.u64 [%r120],%r82; .loc 1 355 23 add.u64 %r275,%r284,%r119; ld.u64 %r83,[%r275]; .loc 1 355 13 sub.u64 %r276,%r83,%r78; shl.b64 %r277,%r276,3; add.u64 %r91,%r91,%r277; .loc 1 356 24 add.u64 %r278,%frame,%r119; ld.u64 %r85,[%r278]; .loc 1 356 14 sub.u64 %r279,%r85,%r80; add.u64 %r93,%r93,%r279; .loc 1 359 32 add.u64 %r281,%r291,%r119; ld.u64 %r87,[%r281]; .loc 1 359 7 add.u64 %r120,%r120,8; add.u64 %r119,%r119,8; setp.eq.u64 %r282,%r82,%r87; @ %r282 bra $L62; ld.u64 %r160,[%frame+360]; bra $L63; $L65: mov.u64 %r91,%r45; setp.eq.u32 %r283,1,1; bra $L48; $L35: .loc 1 362_gfortran_sminloc0_16_i8 .visible .func _gfortran_sminloc0_16_i84predpred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 378 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 380 7 {_gfortran_minloc0_16_.loc 1 381 7 bra $L90; $L92: .loc 1 384 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36; .loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,4; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 mov.u64 %r68,%r32; $L97: st.u64 [%r29],%r68; st.u64 [%r29+8],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r70,%r30,%r32; @ %r70 bra $L97; $L90: .loc 1 406minloc0_4_i16.o/1622802232_gfortran_minloc0_4_i16 .visible .func _gfortran_minloc0_4_i16fortran/generated/minloc0_4_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_4_i16 .visible .func _gfortran_mminloc0_4_i16_gfortran_sminloc0_4_i16 .visible .func _gfortran_sminloc0_4_i16VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_4_i16 .visible .func _gfortran_minloc0_4_i16predpred %r154; .reg .predpred %r163; .reg .pred %r166; .reg .predpredpredpredu64 %r191; .reg .predpred64u64 %r211; .reg .u64 %r214; .reg .u64 %r215; mov.u64 %r116,%ar0; mov.u64 %r117,%ar1; mov.u32 %r118,%ar2; .loc 1 50 10 ld.s8 %r22,[%r117+28]; .loc 
1 50 8 cvt.u32.u32 %r119,%r22; cvt.s64.s8 %r72,%r119; .loc 1 51 6 setp.gt.s64 %r120,%r72,0; @ %r120 bra $L2; .loc 1 52 5 cvta.const.u64 %r121121_gfortran_runtime_errorr23,[%r116]; .loc 1 54 6 setp.ne.u64 %r123,%r23,0; @ %r123 bra $L3; .loc 1 56 7 st.u64 [%r116+48],%r23; add.u64 %r125,%r72,-1; st.u64 [%r116+56],%r125; mov.u64 %r126,1; st.u64 [%r116+40],%r126; .loc 1 57 28 cvt.u32.u64 %r127,%r126; st.u8 [%r116+28],%r127; .loc 1 58 24 st.u64 [%r116+8],%r23; .loc 1 59 29 mov.u64 %r130,4;r130; call (%value_in),_gfortrani_xmallocarray133,[%value_in]; } mov.u64 %r23,%r133; .loc 1 59 27 st.u64 [%r116],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r135,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r136,[%r135+36]; setp.eq.u32 %r137,%r136,0; @ %r137 bra $L4; .loc 1 64r140; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r116]; $L4: .loc 1 68 11 ld.u64 %r73,[%r116+40]; add.u64 %r39,%r117,40; mov.u64 %r45,%frame; add.u64 %r35,%frame,120; .loc 1 70 10 mov.u64 %r99,0; add.u64 %r210,%frame,240; .loc 1 74 16 mov.u64 %r148,%r99; bra $L7; $L24: mov.u64 %r99,%r79; $L7: .loc 1 72 18 ld.u64 %r141,[%r39]; st.u64 [%r45],%r141; .loc 1 73 19 ld.u64 %r143,[%r39+16]; add.u64 %r142,%r143,1; ld.u64 %r144,[%r39+8]; sub.u64 %r31,%r142,%r144; .loc 1 73 17 st.u64 [%r35],%r31; .loc 1 74 16 shl.b64 %r146,%r99,3; add.u64 %r147,%r210,%r146; st.u64 [%r147],%r148; .loc 1 75 10 setp.gt.s64 %r149,%r31,0; @ %r149 bra $L5; shl.b64 %r85,%r73,2; mov.u64 %r87,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 cvt.u32.u64 %r150,%r80; $L6: st.u32 [%r87],%r150; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r87,%r87,%r85; setp.ne.u64 %r151,%r72,%r80; @ %r151 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r99,1; .loc 1 70 3 add.u64 %r39,%r39,24; add.u64 %r45,%r45,8; add.u64 %r35,%r35,8; setp.ne.u64 %r152,%r72,%r79; @ %r152 bra $L24; .loc 1 84 8 ld.u64 %r64,[%r117]; shl.b64 %r82,%r73,2; mov.u64 %r92,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u32 %r153,1; $L8: st.u32 [%r92],%r153; mov.u64 %r32,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r92,%r92,%r82; setp.ne.u64 %r154,%r99,%r32; @ %r154 bra $L8; .loc 1 101 9 setp.ne.u64 %r155,%r64,0; @ ! 
%r155 bra $L1; .loc 1 149 19 ld.u64 %r49,[%frame]; .loc 1 149 9 shl.b64 %r51,%r49,4; .loc 1 151 34 ld.u64 %r53,[%frame+120]; cvt.u32.u32 %r157,%r22; cvt.s64.s8 %r156,%r157; shl.b64 %r158,%r156,3; add.u64 %r102,%r210,%r158; .loc 1 99 12 mov.u64 %r207,-1; mov.u64 %r208,9223372036854775807; setp.eq.u32 %r209,%r118,0; add.u64 %r211,%frame,248; .loc 1 157 13 mov.u64 %r214,0; .loc 1 174 32 add.u64 %r215,%frame,120; $L23: .loc 1 151 21 ld.u64 %r110,[%frame+240]; .loc 1 126 10 @ %r209 bra $L25; mov.u64 %r40,%r110; mov.u64 %r62,%r64; $L15: .loc 1 129 10 ld.u64 %r203,[%r62]; ld.u64 %r204,[%r62+8]; .loc 1 129 9 setp.gt.s64 %r163,%r204,%r208; @ %r163 bra $L12; setp.ne.u64 %r166,%r204,%r208; @ %r166 bra $L28; setp.gt.u64 %r169,%r203,%r207; @ %r169 bra $L12; $L28: mov.u64 %r43,%r211; mov.u64 %r65,%r23; mov.u64 %r105,%r40; $L14: .loc 1 133 34 cvt.u32.u64 %r172,%r105; add.u32 %r171,%r172,1; .loc 1 133 23 st.u32 [%r65],%r171; .loc 1 132 3 add.u64 %r65,%r65,%r82; setp.eq.u64 %r173,%r43,%r102; @ %r173 bra $L26; .loc 1 133 30 ld.u64 %r105,[%r43]; add.u64 %r43,%r43,8; bra $L14; $L26: mov.u64 %r207,%r203; mov.u64 %r208,%r204; $L12: .loc 1 135 11 add.u64 %r62,%r62,%r51; .loc 1 137 9 add.u64 %r40,%r40,1; .loc 1 137 2 st.u64 [%frame+240],%r40; setp.ne.u64 %r174,%r40,%r53; @ %r174 bra $L15; bra $L38; $L25: mov.u64 %r52,%r110; mov.u64 %r63,%r64; $L11: .loc 1 141 10 ld.u64 %r205,[%r63]; ld.u64 %r206,[%r63+8]; .loc 1 141 9 setp.gt.s64 %r177,%r208,%r206; @ %r177 bra $L29; setp.ne.u64 %r180,%r208,%r206; @ %r180 bra $L17; setp.gt.u64 %r183,%r207,%r205; @ ! %r183 bra $L17; $L29: mov.u64 %r108,%r211; mov.u64 %r109,%r23; mov.u64 %r44,%r52; $L19: .loc 1 145 34 cvt.u32.u64 %r186,%r44; add.u32 %r185,%r186,1; .loc 1 145 23 st.u32 [%r109],%r185; .loc 1 144 3 add.u64 %r109,%r109,%r82; setp.eq.u64 %r187,%r102,%r108; @ %r187 bra $L27; .loc 1 145 30 ld.u64 %r44,[%r108]; add.u64 %r108,%r108,8; bra $L19; $L27: mov.u64 %r207,%r205; mov.u64 %r208,%r206; $L17: .loc 1 149 9 add.u64 %r63,%r63,%r51; .loc 1 151 14 add.u64 %r52,%r52,1; .loc 1 151 7 st.u64 [%frame+240],%r52; setp.ne.u64 %r188,%r52,%r53; @ %r188 bra $L11; .loc 1 149 9 sub.u64 %r189,%r53,%r110; mad.lo.u64 %r64,%r189,%r51,%r64; bra $L20; $L38: .loc 1 135 11 sub.u64 %r191,%r53,%r110; mad.lo.u64 %r64,%r191,%r51,%r64; $L20: mov.u64 %r81,%r211; .loc 1 151 34 mov.u64 %r61,%r53; mov.u64 %r59,%r49; .loc 1 152 9 mov.u64 %r66,0; $L22: .loc 1 157 13 st.u64 [%r81+-8],%r214; .loc 1 160 23 mul.lo.u64 %r56,%r59,%r61; mov.u64 %r100,%r66; .loc 1 161 5 add.u64 %r66,%r66,1; .loc 1 162 7 setp.eq.u64 %r195,%r100,%r99; @ %r195 bra $L1; .loc 1 170 16 ld.u64 %r196,[%r81]; add.u64 %r58,%r196,1; st.u64 [%r81],%r58; shl.b64 %r70,%r66,3; .loc 1 171 23 add.u64 %r197,%frame,%r70; ld.u64 %r59,[%r197]; .loc 1 171 13 sub.u64 %r198,%r59,%r56; shl.b64 %r199,%r198,4; add.u64 %r64,%r64,%r199; .loc 1 174 32 add.u64 %r201,%r215,%r70; ld.u64 %r61,[%r201]; .loc 1 174 7 add.u64 %r81,%r81,8; setp.eq.u64 %r202,%r58,%r61; @ %r202 bra $L22; bra $L23; $L1: .loc 1 1_gfortran_mminloc0_4_i16 .visible .func _gfortran_mminloc0_4_i16predpred %r193; .reg .u64 %r196; .reg .u6464pred %r228; .reg .u32 %r229; .reg .predpred %r233; .reg .pred %r234; .reg .u64u16predpred %r265; .reg .u64 %r266; .reg .u16 %r268; .reg .u32 %r269; .reg .pred %r270; .reg .predpredu64 %r313; .reg .u64 %r314; .reg .pred %r316; mov.u64 %r166,%ar0; mov.u64 %r167,%ar1; mov.u64 %r168,%ar2; mov.u32 %r169,%ar3; .loc 1 202 6 setp.ne.u64 %r170,%r168,0; @ %r170 bra $L40gfortran_minloc0_4_i16.loc 1 205 7 bra $L39; $L40: .loc 1 208 10 ld.s8 %r22,[%r167+28]; .loc 1 209 6 
setp.gt.s32 %r174,%r22,0; @ %r174 bra $L42; .loc 1 210 5 cvta.const.u64 %r175175_gfortran_runtime_error42: .loc 1 212 6 ld.u64 %r177,[%r166]; setp.ne.u64 %r178,%r177,0; @ %r178 bra $L43; .loc 1 214 7 st.u64 [%r166+48],%r177; add.u32 %r180,%r22,-1; cvt.s64.s32 %r181,%r180; st.u64 [%r166+56],%r181; mov.u64 %r182,1; st.u64 [%r166+40],%r182; .loc 1 215 28 cvt.u32.u64 %r183,%r182; st.u8 [%r166+28],%r183; .loc 1 216 24 st.u64 [%r166+8],%r177; .loc 1 217 29 cvt.u32.u32 %r188,%r22; cvt.s64.s8 %r187,%r188call (%value_in),_gfortrani_xmallocarray189,[%value_in]; } .loc 1 217 27 st.u64 [%r166],%r189; bra $L44; $L43: .loc 1 221 11 cvta.global.u64 %r191,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r192,[%r191+36];r167196; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19call _gfortrani_bounds_equal_extents$L44: .loc 1 231 15 ld.u64 %r31,[%r168+16]; .loc 1 233 9 ld.u64 %r96,[%r168]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r201,%r32,-4; and.b32 %r202,%r201,-5; set.u32.eq.u32 %r204,%r202,0; neg.s32 %r205,%r204; .loc 1 235 22 add.u32 %r206,%r32,-1; set.u32.le.u32 %r208,%r206,1; neg.s32 %r209,%r208; .loc 1 235 6 cvt.u16.u32 %r211,%r205; cvt.u16.u32 %r212,%r209; or.b16 %r210,%r211,%r212; cvt.u32.u16 %r213,%r210; cvt.u16.u8 %r214,%r213; setp.ne.u16 %r215,%r214,0; @ %r215 bra $L45; .loc 1 237 7 setp.ne.u32 %r217,%r32,16; @ %r217 bra $L46; $L45: .loc 1 244 11 ld.u64 %r109,[%r166+40]; .loc 1 245 8 ld.u64 %r110,[%r166]; .loc 1 246 17 cvt.u32.u32 %r218,%r22; cvt.s64.s8 %r67,%r218; add.u64 %r52,%r167,40; add.u64 %r307,%frame,120; mov.u64 %r118,%r307; add.u64 %r105,%r168,40; mov.u64 %r165,%frame; add.u64 %r162,%frame,240; add.u64 %r159,%frame,360; mov.u64 %r160,%r159; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r227,%r29; bra $L47; $L46: .loc 1 242 5_gfortran_runtime_error70: mov.u64 %r29,%r115; $L47: .loc 1 248 18 ld.u64 %r221,[%r52]; st.u64 [%r118],%r221; .loc 1 249 20 ld.u64 %r223,[%r105]; mul.lo.u64 %r222,%r223,%r31; .loc 1 249 18 st.u64 [%r165],%r222; .loc 1 250 19 ld.u64 %r225,[%r52+16]; add.u64 %r224,%r225,1; ld.u64 %r226,[%r52+8]; sub.u64 %r44,%r224,%r226; .loc 1 250 17 st.u64 [%r162],%r44; .loc 1 251 16 st.u64 [%r160],%r227; .loc 1 252 10 setp.gt.s64 %r228,%r44,0; @ %r228 bra $L48; shl.b64 %r149,%r109,2; mov.u64 %r138,%r110; .loc 1 255 11 mov.u64 %r116,0; .loc 1 256 24 cvt.u32.u64 %r229,%r116; $L49: st.u32 [%r138],%r229; .loc 1 255 27 add.u64 %r116,%r116,1; .loc 1 255 4 add.u64 %r138,%r138,%r149; setp.ne.u64 %r230,%r67,%r116; @ %r230 bra $L49; bra $L39; $L48: .loc 1 246 26 add.u64 %r115,%r29,1; .loc 1 246 3 add.u64 %r52,%r52,24; add.u64 %r118,%r118,8; add.u64 %r105,%r105,24; add.u64 %r165,%r165,8; add.u64 %r162,%r162,8; add.u64 %r160,%r160,8; setp.ne.u64 %r231,%r67,%r115; @ %r231 bra $L70; .loc 1 261 8 ld.u64 %r94,[%r167]; shl.b64 %r57,%r109,2; mov.u64 %r60,%r110; .loc 1 264 10 mov.u64 %r114,0; .loc 1 265 23 cvt.u32.u64 %r232,%r114; $L51: st.u32 [%r60],%r232; mov.u64 %r145,%r114; .loc 1 264 26 add.u64 %r114,%r114,1; .loc 1 264 3 add.u64 %r60,%r60,%r57; setp.ne.u64 %r233,%r29,%r145; @ %r233 bra $L51; .loc 1 276 9 setp.eq.u64 %r234,%r94,0; @ %r234 bra $L39; .loc 1 331 19 ld.u64 %r73,[%frame+120]; .loc 1 331 9 shl.b64 %r75,%r73,4; .loc 1 332 20 ld.u64 %r76,[%frame]; .loc 1 334 34 ld.u64 %r79,[%frame+240]; ld.u64 %r163,[%frame+360]; cvt.u32.u32 %r236,%r22; cvt.s64.s8 %r235,%r236; shl.b64 %r237,%r235,3; add.u64 %r127,%r237,%r159; .loc 1 274 12 mov.u64 %r304,-1; mov.u64 %r305,9223372036854775807; setp.eq.u32 %r306,1,0; add.u64 %r308,%frame,368; .loc 1 340 13 
mov.u64 %r313,0; .loc 1 359 32 add.u64 %r314,%frame,240; .loc 1 308 12 setp.eq.u32 %r316,%r169,0; $L69: .loc 1 280 10 @ %r306 bra $L52; mov.u64 %r53,%r163; mov.u32 %r46,0; mov.u32 %r310,1; $L57: mov.u64 %r45,%r94; .loc 1 300 13 add.u64 %r94,%r45,%r75; .loc 1 284 11 ld.s8 %r240,[%r96]; cvt.u16.u32 %r239,%r240; setp.eq.u16 %r241,%r239,0; @ %r241 bra $L53; setp.eq.u32 %r242,%r46,0; selp.u64 %r163,%r163,%r53,%r242; .loc 1 294 16 ld.u64 %r304,[%r45]; ld.u64 %r305,[%r45+8]; mov.u64 %r137,%r308; mov.u64 %r150,%r110; mov.u64 %r161,%r163; $L56: .loc 1 296 33 cvt.u32.u64 %r245,%r161; add.u32 %r244,%r245,1; .loc 1 296 22 st.u32 [%r150],%r244; .loc 1 295 9 add.u64 %r150,%r150,%r57; setp.eq.u64 %r246,%r127,%r137; @ %r246 bra $L71; .loc 1 296 29 ld.u64 %r161,[%r137]; add.u64 %r137,%r137,8; bra $L56; $L53: .loc 1 301 14 add.u64 %r96,%r96,%r76; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r46,%r310; .loc 1 303 4 setp.ne.u64 %r247,%r53,%r79; @ %r247 bra $L57; bra $L97; $L52: .loc 1 308 12 @ %r316 bra $L72; mov.u64 %r64,%r163; mov.u64 %r92,%r94; $L63: .loc 1 311 11 ld.s8 %r250,[%r96]; cvt.u16.u32 %r249,%r250; setp.eq.u16 %r251,%r249,0; @ %r251 bra $L60; .loc 1 311 12 ld.u64 %r300,[%r92]; ld.u64 %r301,[%r92+8]; .loc 1 311 11 setp.gt.s64 %r254,%r301,%r305; @ %r254 bra $L60; setp.ne.u64 %r257,%r301,%r305; @ %r257 bra $L75; setp.gt.u64 %r260,%r300,%r304; @ %r260 bra $L60; $L75: mov.u64 %r103,%r308; mov.u64 %r119,%r110; mov.u64 %r156,%r64; .loc 1 314 18 mov.u64 %r146,%r313; $L62: .loc 1 315 36 cvt.u32.u64 %r263,%r156; add.u32 %r262,%r263,1; .loc 1 315 25 st.u32 [%r119],%r262; .loc 1 314 34 add.u64 %r113,%r146,1; .loc 1 314 11 add.u64 %r119,%r119,%r57; setp.le.s64 %r264,%r29,%r146; @ %r264 bra $L73; .loc 1 315 32 ld.u64 %r156,[%r103]; add.u64 %r103,%r103,8; mov.u64 %r146,%r113; bra $L62; $L73: mov.u64 %r304,%r300; mov.u64 %r305,%r301; $L60: .loc 1 317 8 add.u64 %r92,%r92,%r75; .loc 1 319 13 add.u64 %r64,%r64,1; .loc 1 319 6 st.u64 [%frame+360],%r64; setp.ne.u64 %r265,%r64,%r79; @ %r265 bra $L63; .loc 1 317 8 sub.u64 %r266,%r79,%r163; mad.lo.u64 %r94,%r266,%r75,%r94; bra $L64; $L72: mov.u64 %r78,%r163; mov.u64 %r95,%r96; mov.u64 %r93,%r94; $L59: .loc 1 323 11 ld.s8 %r269,[%r95]; cvt.u16.u32 %r268,%r269; setp.eq.u16 %r270,%r268,0; @ %r270 bra $L65; .loc 1 323 12 ld.u64 %r302,[%r93]; ld.u64 %r303,[%r93+8]; .loc 1 323 11 setp.gt.s64 %r273,%r305,%r303; @ %r273 bra $L76; setp.ne.u64 %r276,%r305,%r303; @ %r276 bra $L65; setp.gt.u64 %r279,%r304,%r302; @ ! 
%r279 bra $L65; $L76: mov.u64 %r91,%r308; mov.u64 %r97,%r110; mov.u64 %r151,%r78; .loc 1 326 12 mov.u64 %r147,%r313; $L67: .loc 1 327 36 cvt.u32.u64 %r282,%r151; add.u32 %r281,%r282,1; .loc 1 327 25 st.u32 [%r97],%r281; .loc 1 326 28 add.u64 %r112,%r147,1; .loc 1 326 5 add.u64 %r97,%r97,%r57; setp.le.s64 %r283,%r29,%r147; @ %r283 bra $L74; .loc 1 327 32 ld.u64 %r151,[%r91]; add.u64 %r91,%r91,8; mov.u64 %r147,%r112; bra $L67; $L74: mov.u64 %r304,%r302; mov.u64 %r305,%r303; $L65: .loc 1 331 9 add.u64 %r93,%r93,%r75; .loc 1 332 10 add.u64 %r95,%r95,%r76; .loc 1 334 14 add.u64 %r78,%r78,1; .loc 1 334 7 st.u64 [%frame+360],%r78; setp.ne.u64 %r284,%r78,%r79; @ %r284 bra $L59; sub.u64 %r133,%r79,%r163; .loc 1 331 9 mad.lo.u64 %r94,%r75,%r133,%r94; .loc 1 332 10 mad.lo.u64 %r96,%r76,%r133,%r96; bra $L64; $L97: st.u64 [%frame+360],%r79; $L64: mov.u64 %r125,%r308; mov.u64 %r88,%r76; .loc 1 334 34 mov.u64 %r90,%r79; mov.u64 %r86,%r73; mov.u64 %r124,8; .loc 1 335 9 mov.u64 %r98,0; $L68: .loc 1 340 13 st.u64 [%r125+-8],%r313; .loc 1 343 23 mul.lo.u64 %r81,%r86,%r90; .loc 1 344 24 mul.lo.u64 %r83,%r90,%r88; mov.u64 %r104,%r98; .loc 1 345 5 add.u64 %r98,%r98,1; .loc 1 346 7 setp.eq.u64 %r289,%r29,%r104; @ %r289 bra $L39; .loc 1 354 16 ld.u64 %r290,[%r125]; add.u64 %r85,%r290,1; st.u64 [%r125],%r85; .loc 1 355 23 add.u64 %r292,%r307,%r124; ld.u64 %r86,[%r292]; .loc 1 355 13 sub.u64 %r293,%r86,%r81; shl.b64 %r294,%r293,4; add.u64 %r94,%r94,%r294; .loc 1 356 24 add.u64 %r295,%frame,%r124; ld.u64 %r88,[%r295]; .loc 1 356 14 sub.u64 %r296,%r88,%r83; add.u64 %r96,%r96,%r296; .loc 1 359 32 add.u64 %r298,%r314,%r124; ld.u64 %r90,[%r298]; .loc 1 359 7 add.u64 %r125,%r125,8; add.u64 %r124,%r124,8; setp.eq.u64 %r299,%r85,%r90; @ %r299 bra $L68; ld.u64 %r163,[%frame+360]; bra $L69; $L71: mov.u64 %r94,%r45; setp.eq.u32 %r306,1,1; bra $L52; $L39: .loc 1 362_gfortran_sminloc0_4_i16 .visible .func _gfortran_sminloc0_4_i164predpred.reg .u64u32 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L99; .loc 1 378 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L100; $L99: .loc 1 380 7 {_gfortran_minloc0_4_i16.loc 1 381 7 bra $L98; $L100: .loc 1 384 8 ld.s8 %r30,[%r36+28]; .loc 1 386 6 setp.gt.s64 %r46,%r30,0; @ %r46 bra $L102_gfortran_runtime_error102: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L103; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29 mov.u64 %r56call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L104; $L103: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36.loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L104: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,2; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 cvt.u32.u64 %r68,%r32; $L105: st.u32 [%r29],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L105; $L98: .loc 1 406minloc0_8_i16.o/1622802232_gfortran_minloc0_8_i16 .visible .func _gfortran_minloc0_8_i16fortran/generated/minloc0_8_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_8_i16 .visible .func _gfortran_mminloc0_8_i16_gfortran_sminloc0_8_i16 .visible .func _gfortran_sminloc0_8_i16VAR DECL: _gfortrani_compile_options .extern 
.global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_8_i16 .visible .func _gfortran_minloc0_8_i1632predpredpred %r169; .reg .pred %r172; .reg .predpredpredpredpredmov.u64 %r112,%ar0; mov.u64 %r113,%ar1; mov.u32 %r114,%ar2; .loc 1 50 10 ld.s8 %r22,[%r113+28]; .loc 1 50 8 cvt.u32.u32 %r115,%r22; cvt.s64.s8 %r71,%r115; .loc 1 51 6 setp.gt.s64 %r116,%r71,0; @ %r116 bra $L2; .loc 1 52 5 cvta.const.u64 %r11117_gfortran_runtime_errorr23,[%r112]; .loc 1 54 6 setp.ne.u64 %r119,%r23,0; @ %r119 bra $L3; .loc 1 56 7 st.u64 [%r112+48],%r23; add.u64 %r121,%r71,-1; st.u64 [%r112+56],%r121; mov.u64 %r122,1; st.u64 [%r112+40],%r122; .loc 1 57 28 cvt.u32.u64 %r123,%r122; st.u8 [%r112+28],%r123; .loc 1 58 24 st.u64 [%r112+8],%r23; .loc 1 59 29 mov.u64 %r171call (%value_in),_gfortrani_xmallocarray129,[%value_in]; } mov.u64 %r23,%r129; .loc 1 59 27 st.u64 [%r112],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r131,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r132,[%r131+36]; setp.eq.u32 %r133,%r132,0; @ %r133 bra $L4; .loc 1 64r113136; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r112]; $L4: .loc 1 68 11 ld.u64 %r72,[%r112+40]; add.u64 %r36,%r113,40; mov.u64 %r104,%frame; add.u64 %r105,%frame,120; .loc 1 70 10 mov.u64 %r98,0; add.u64 %r204,%frame,240; .loc 1 74 16 mov.u64 %r144,%r98; bra $L7; $L24: mov.u64 %r98,%r79; $L7: .loc 1 72 18 ld.u64 %r137,[%r36]; st.u64 [%r104],%r137; .loc 1 73 19 ld.u64 %r139,[%r36+16]; add.u64 %r138,%r139,1; ld.u64 %r140,[%r36+8]; sub.u64 %r31,%r138,%r140; .loc 1 73 17 st.u64 [%r105],%r31; .loc 1 74 16 shl.b64 %r142,%r98,3; add.u64 %r143,%r204,%r142; st.u64 [%r143],%r144; .loc 1 75 10 setp.gt.s64 %r145,%r31,0; @ %r145 bra $L5; shl.b64 %r81,%r72,3; mov.u64 %r87,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 mov.u64 %r146,%r80; $L6: st.u64 [%r87],%r146; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r87,%r87,%r81; setp.ne.u64 %r147,%r71,%r80; @ %r147 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r98,1; .loc 1 70 3 add.u64 %r36,%r36,24; add.u64 %r104,%r104,8; add.u64 %r105,%r105,8; setp.ne.u64 %r148,%r71,%r79; @ %r148 bra $L24; .loc 1 84 8 ld.u64 %r62,[%r113]; shl.b64 %r84,%r72,3; mov.u64 %r39,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u64 %r149,1; $L8: st.u64 [%r39],%r149; mov.u64 %r32,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r39,%r39,%r84; setp.ne.u64 %r150,%r98,%r32; @ %r150 bra $L8; .loc 1 101 9 setp.ne.u64 %r151,%r62,0; @ ! 
%r151 bra $L1; .loc 1 149 19 ld.u64 %r45,[%frame]; .loc 1 149 9 shl.b64 %r47,%r45,4; .loc 1 151 34 ld.u64 %r51,[%frame+120]; cvt.u32.u32 %r153,%r22; cvt.s64.s8 %r152,%r153; shl.b64 %r154,%r152,3; add.u64 %r73,%r204,%r154; .loc 1 99 12 mov.u64 %r201,-1; mov.u64 %r202,9223372036854775807; setp.eq.u32 %r203,%r114,0; add.u64 %r205,%frame,248; .loc 1 157 13 mov.u64 %r208,0; .loc 1 174 32 add.u64 %r209,%frame,120; $L23: .loc 1 151 21 ld.u64 %r103,[%frame+240]; .loc 1 126 10 @ %r203 bra $L25; mov.u64 %r40,%r103; mov.u64 %r60,%r62; $L15: .loc 1 129 10 ld.u64 %r197,[%r60]; ld.u64 %r198,[%r60+8]; .loc 1 129 9 setp.gt.s64 %r159,%r198,%r202; @ %r159 bra $L12; setp.ne.u64 %r162,%r198,%r202; @ %r162 bra $L28; setp.gt.u64 %r165,%r197,%r201; @ %r165 bra $L12; $L28: mov.u64 %r76,%r205; mov.u64 %r34,%r23; mov.u64 %r90,%r40; $L14: .loc 1 133 34 add.u64 %r167,%r90,1; .loc 1 133 23 st.u64 [%r34],%r167; .loc 1 132 3 add.u64 %r34,%r34,%r84; setp.eq.u64 %r168,%r73,%r76; @ %r168 bra $L26; .loc 1 133 30 ld.u64 %r90,[%r76]; add.u64 %r76,%r76,8; bra $L14; $L26: mov.u64 %r201,%r197; mov.u64 %r202,%r198; $L12: .loc 1 135 11 add.u64 %r60,%r60,%r47; .loc 1 137 9 add.u64 %r40,%r40,1; .loc 1 137 2 st.u64 [%frame+240],%r40; setp.ne.u64 %r169,%r40,%r51; @ %r169 bra $L15; bra $L38; $L25: mov.u64 %r49,%r103; mov.u64 %r61,%r62; $L11: .loc 1 141 10 ld.u64 %r199,[%r61]; ld.u64 %r200,[%r61+8]; .loc 1 141 9 setp.gt.s64 %r172,%r202,%r200; @ %r172 bra $L29; setp.ne.u64 %r175,%r202,%r200; @ %r175 bra $L17; setp.gt.u64 %r178,%r201,%r199; @ ! %r178 bra $L17; $L29: mov.u64 %r85,%r205; mov.u64 %r101,%r23; mov.u64 %r86,%r49; $L19: .loc 1 145 34 add.u64 %r180,%r86,1; .loc 1 145 23 st.u64 [%r101],%r180; .loc 1 144 3 add.u64 %r101,%r101,%r84; setp.eq.u64 %r181,%r73,%r85; @ %r181 bra $L27; .loc 1 145 30 ld.u64 %r86,[%r85]; add.u64 %r85,%r85,8; bra $L19; $L27: mov.u64 %r201,%r199; mov.u64 %r202,%r200; $L17: .loc 1 149 9 add.u64 %r61,%r61,%r47; .loc 1 151 14 add.u64 %r49,%r49,1; .loc 1 151 7 st.u64 [%frame+240],%r49; setp.ne.u64 %r182,%r49,%r51; @ %r182 bra $L11; .loc 1 149 9 sub.u64 %r183,%r51,%r103; mad.lo.u64 %r62,%r183,%r47,%r62; bra $L20; $L38: .loc 1 135 11 sub.u64 %r185,%r51,%r103; mad.lo.u64 %r62,%r185,%r47,%r62; $L20: mov.u64 %r70,%r205; .loc 1 151 34 mov.u64 %r59,%r51; mov.u64 %r57,%r45; .loc 1 152 9 mov.u64 %r66,0; $L22: .loc 1 157 13 st.u64 [%r70+-8],%r208; .loc 1 160 23 mul.lo.u64 %r54,%r57,%r59; mov.u64 %r99,%r66; .loc 1 161 5 add.u64 %r66,%r66,1; .loc 1 162 7 setp.eq.u64 %r189,%r99,%r98; @ %r189 bra $L1; .loc 1 170 16 ld.u64 %r190,[%r70]; add.u64 %r56,%r190,1; st.u64 [%r70],%r56; shl.b64 %r65,%r66,3; .loc 1 171 23 add.u64 %r191,%frame,%r65; ld.u64 %r57,[%r191]; .loc 1 171 13 sub.u64 %r192,%r57,%r54; shl.b64 %r193,%r192,4; add.u64 %r62,%r62,%r193; .loc 1 174 32 add.u64 %r195,%r209,%r65; ld.u64 %r59,[%r195]; .loc 1 174 7 add.u64 %r70,%r70,8; setp.eq.u64 %r196,%r56,%r59; @ %r196 bra $L22; bra $L23; $L1: .loc 1 1_gfortran_mminloc0_8_i16 .visible .func _gfortran_mminloc0_8_i169pred %r187; .reg .u64 %r190; .reg .u64u64pred %r222; .reg .u64 %r223; .reg .predu32 %r230; .reg .u64 %r231; .reg .u16 %r233; .reg .u32 %r234; .reg .pred %r235; .reg .pred %r236; .reg .u64 %r238; .reg .pred %r239; .reg .pred %r240; .reg .u16 %r242; .reg .u32 %r243; .reg .pred %r244; .reg .pred %r247; .reg .pred %r250; .reg .predu32 %r261; .reg .pred %r262; .reg .pred %r265; .reg .pred %r268; .reg .pred %r271; .reg .u64 %r273; .reg .pred %r274; .reg .predu64 %r283; .reg .u64u64 %r305; .reg .pred %r307; mov.u64 %r160,%ar0; mov.u64 %r161,%ar1; mov.u64 %r162,%ar2; 
mov.u32 %r163,%ar3; .loc 1 202 6 setp.ne.u64 %r164,%r162,0; @ %r164 bra $L40; .loc 1 204 7 {1163; call _gfortran_minloc0_8_i16.loc 1 205 7 bra $L39; $L40: .loc 1 208 10 ld.s8 %r22,[%r161+28]; .loc 1 209 6 setp.gt.s32 %r168,%r22,0; @ %r168 bra $L42; .loc 1 210 5 cvta.const.u64 %r16stack; call _gfortran_runtime_error42: .loc 1 212 6 ld.u64 %r171,[%r160]; setp.ne.u64 %r172,%r171,0; @ %r172 bra $L43; .loc 1 214 7 st.u64 [%r160+48],%r171; add.u32 %r174,%r22,-1; cvt.s64.s32 %r175,%r174; st.u64 [%r160+56],%r175; mov.u64 %r176,1; st.u64 [%r160+40],%r176; .loc 1 215 28 cvt.u32.u64 %r177,%r176; st.u8 [%r160+28],%r177; .loc 1 216 24 st.u64 [%r160+8],%r171; .loc 1 217 29 cvt.u32.u32 %r182,%r22; cvt.s64.s8 %r181,%r182; mov.u64 %r180call (%value_in),_gfortrani_xmallocarray183,[%value_in]; } .loc 1 217 27 st.u64 [%r160],%r183; bra $L44; $L43: .loc 1 221 11 cvta.global.u64 %r185,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r186,[%r185+36]; setp.eq.u32 %r187,%r186,0; @ %r187 bra $L44; .loc 1 224 4 cvta.const.u64 %r190,$LC1; {161190; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19316161193190; call _gfortrani_bounds_equal_extents$L44: .loc 1 231 15 ld.u64 %r31,[%r162+16]; .loc 1 233 9 ld.u64 %r96,[%r162]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r195,%r32,-4; and.b32 %r196,%r195,-5; set.u32.eq.u32 %r198,%r196,0; neg.s32 %r199,%r198; .loc 1 235 22 add.u32 %r200,%r32,-1; set.u32.le.u32 %r202,%r200,1; neg.s32 %r203,%r202; .loc 1 235 6 cvt.u16.u32 %r205,%r199; cvt.u16.u32 %r206,%r203; or.b16 %r204,%r205,%r206; cvt.u32.u16 %r207,%r204; cvt.u16.u8 %r208,%r207; .loc 1 237 7 setp.ne.u32 %r211,%r32,16; @ %r211 bra $L46; $L45: .loc 1 244 11 ld.u64 %r110,[%r160+40]; .loc 1 245 8 ld.u64 %r111,[%r160]; .loc 1 246 17 cvt.u32.u32 %r212,%r22; cvt.s64.s8 %r68,%r212; add.u64 %r153,%r161,40; add.u64 %r298,%frame,120; mov.u64 %r149,%r298; add.u64 %r147,%r162,40; mov.u64 %r146,%frame; add.u64 %r71,%frame,240; add.u64 %r64,%frame,360; mov.u64 %r47,%r64; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r221,%r29; bra $L47; $L46: .loc 1 242 5_gfortran_runtime_error70: mov.u64 %r29,%r116; $L47: .loc 1 248 18 ld.u64 %r215,[%r153]; st.u64 [%r149],%r215; .loc 1 249 20 ld.u64 %r217,[%r147]; mul.lo.u64 %r216,%r217,%r31; .loc 1 249 18 st.u64 [%r146],%r216; .loc 1 250 19 ld.u64 %r219,[%r153+16]; add.u64 %r218,%r219,1; ld.u64 %r220,[%r153+8]; sub.u64 %r44,%r218,%r220; .loc 1 250 17 st.u64 [%r71],%r44; .loc 1 251 16 st.u64 [%r47],%r221; .loc 1 252 10 setp.gt.s64 %r222,%r44,0; @ %r222 bra $L48; shl.b64 %r138,%r110,3; mov.u64 %r130,%r111; .loc 1 255 11 mov.u64 %r117,0; .loc 1 256 24 mov.u64 %r223,%r117; $L49: st.u64 [%r130],%r223; .loc 1 255 27 add.u64 %r117,%r117,1; .loc 1 255 4 add.u64 %r130,%r130,%r138; setp.ne.u64 %r224,%r68,%r117; @ %r224 bra $L49; bra $L39; $L48: .loc 1 246 26 add.u64 %r116,%r29,1; .loc 1 246 3 add.u64 %r153,%r153,24; add.u64 %r149,%r149,8; add.u64 %r147,%r147,24; add.u64 %r146,%r146,8; add.u64 %r71,%r71,8; add.u64 %r47,%r47,8; setp.ne.u64 %r225,%r68,%r116; @ %r225 bra $L70; .loc 1 261 8 ld.u64 %r93,[%r161]; shl.b64 %r158,%r110,3; mov.u64 %r156,%r111; .loc 1 264 10 mov.u64 %r115,0; .loc 1 265 23 mov.u64 %r226,%r115; $L51: st.u64 [%r156],%r226; mov.u64 %r141,%r115; .loc 1 264 26 add.u64 %r115,%r115,1; .loc 1 264 3 add.u64 %r156,%r156,%r158; setp.ne.u64 %r227,%r29,%r141; @ %r227 bra $L51; .loc 1 276 9 setp.eq.u64 %r228,%r93,0; @ %r228 bra $L39; .loc 1 331 19 ld.u64 %r73,[%frame+120]; .loc 1 331 9 shl.b64 %r75,%r73,4; .loc 1 332 20 ld.u64 %r76,[%frame]; .loc 1 334 34 
ld.u64 %r79,[%frame+240]; ld.u64 %r155,[%frame+360]; cvt.u32.u32 %r230,%r22; cvt.s64.s8 %r229,%r230; shl.b64 %r231,%r229,3; add.u64 %r120,%r231,%r64; .loc 1 274 12 mov.u64 %r295,-1; mov.u64 %r296,9223372036854775807; setp.eq.u32 %r297,1,0; add.u64 %r299,%frame,368; .loc 1 340 13 mov.u64 %r304,0; .loc 1 359 32 add.u64 %r305,%frame,240; .loc 1 308 12 setp.eq.u32 %r307,%r163,0; $L69: .loc 1 280 10 @ %r297 bra $L52; mov.u64 %r56,%r155; mov.u32 %r46,0; mov.u32 %r301,1; $L57: mov.u64 %r45,%r93; .loc 1 300 13 add.u64 %r93,%r45,%r75; .loc 1 284 11 ld.s8 %r234,[%r96]; cvt.u16.u32 %r233,%r234; setp.eq.u16 %r235,%r233,0; @ %r235 bra $L53; setp.eq.u32 %r236,%r46,0; selp.u64 %r155,%r155,%r56,%r236; .loc 1 294 16 ld.u64 %r295,[%r45]; ld.u64 %r296,[%r45+8]; mov.u64 %r129,%r299; mov.u64 %r139,%r111; mov.u64 %r151,%r155; $L56: .loc 1 296 33 add.u64 %r238,%r151,1; .loc 1 296 22 st.u64 [%r139],%r238; .loc 1 295 9 add.u64 %r139,%r139,%r158; setp.eq.u64 %r239,%r120,%r129; @ %r239 bra $L71; .loc 1 296 29 ld.u64 %r151,[%r129]; add.u64 %r129,%r129,8; bra $L56; $L53: .loc 1 301 14 add.u64 %r96,%r96,%r76; .loc 1 303 11 add.u64 %r56,%r56,1; mov.u32 %r46,%r301; .loc 1 303 4 setp.ne.u64 %r240,%r56,%r79; @ %r240 bra $L57; bra $L97; $L52: .loc 1 308 12 @ %r307 bra $L72; mov.u64 %r66,%r155; mov.u64 %r91,%r93; $L63: .loc 1 311 11 ld.s8 %r243,[%r96]; cvt.u16.u32 %r242,%r243; setp.eq.u16 %r244,%r242,0; @ %r244 bra $L60; .loc 1 311 12 ld.u64 %r291,[%r91]; ld.u64 %r292,[%r91+8]; .loc 1 311 11 setp.gt.s64 %r247,%r292,%r296; @ %r247 bra $L60; setp.ne.u64 %r250,%r292,%r296; @ %r250 bra $L75; setp.gt.u64 %r253,%r291,%r295; @ %r253 bra $L60; $L75: mov.u64 %r62,%r299; mov.u64 %r94,%r111; mov.u64 %r148,%r66; .loc 1 314 18 mov.u64 %r142,%r304; $L62: .loc 1 315 36 add.u64 %r255,%r148,1; .loc 1 315 25 st.u64 [%r94],%r255; .loc 1 314 34 add.u64 %r114,%r142,1; .loc 1 314 11 add.u64 %r94,%r94,%r158; setp.le.s64 %r256,%r29,%r142; @ %r256 bra $L73; .loc 1 315 32 ld.u64 %r148,[%r62]; add.u64 %r62,%r62,8; mov.u64 %r142,%r114; bra $L62; $L73: mov.u64 %r295,%r291; mov.u64 %r296,%r292; $L60: .loc 1 317 8 add.u64 %r91,%r91,%r75; .loc 1 319 13 add.u64 %r66,%r66,1; .loc 1 319 6 st.u64 [%frame+360],%r66; setp.ne.u64 %r257,%r66,%r79; @ %r257 bra $L63; .loc 1 317 8 sub.u64 %r258,%r79,%r155; mad.lo.u64 %r93,%r258,%r75,%r93; bra $L64; $L72: mov.u64 %r78,%r155; mov.u64 %r95,%r96; mov.u64 %r92,%r93; $L59: .loc 1 323 11 ld.s8 %r261,[%r95]; cvt.u16.u32 %r260,%r261; setp.eq.u16 %r262,%r260,0; @ %r262 bra $L65; .loc 1 323 12 ld.u64 %r293,[%r92]; ld.u64 %r294,[%r92+8]; .loc 1 323 11 setp.gt.s64 %r265,%r296,%r294; @ %r265 bra $L76; setp.ne.u64 %r268,%r296,%r294; @ %r268 bra $L65; setp.gt.u64 %r271,%r295,%r293; @ ! 
%r271 bra $L65; $L76: mov.u64 %r118,%r299; mov.u64 %r57,%r111; mov.u64 %r144,%r78; .loc 1 326 12 mov.u64 %r143,%r304; $L67: .loc 1 327 36 add.u64 %r273,%r144,1; .loc 1 327 25 st.u64 [%r57],%r273; .loc 1 326 28 add.u64 %r113,%r143,1; .loc 1 326 5 add.u64 %r57,%r57,%r158; setp.le.s64 %r274,%r29,%r143; @ %r274 bra $L74; .loc 1 327 32 ld.u64 %r144,[%r118]; add.u64 %r118,%r118,8; mov.u64 %r143,%r113; bra $L67; $L74: mov.u64 %r295,%r293; mov.u64 %r296,%r294; $L65: .loc 1 331 9 add.u64 %r92,%r92,%r75; .loc 1 332 10 add.u64 %r95,%r95,%r76; .loc 1 334 14 add.u64 %r78,%r78,1; .loc 1 334 7 st.u64 [%frame+360],%r78; setp.ne.u64 %r275,%r78,%r79; @ %r275 bra $L59; sub.u64 %r126,%r79,%r155; .loc 1 331 9 mad.lo.u64 %r93,%r75,%r126,%r93; .loc 1 332 10 mad.lo.u64 %r96,%r76,%r126,%r96; bra $L64; $L97: st.u64 [%frame+360],%r79; $L64: mov.u64 %r105,%r299; mov.u64 %r88,%r76; .loc 1 334 34 mov.u64 %r90,%r79; mov.u64 %r86,%r73; mov.u64 %r104,8; .loc 1 335 9 mov.u64 %r99,0; $L68: .loc 1 340 13 st.u64 [%r105+-8],%r304; .loc 1 343 23 mul.lo.u64 %r81,%r86,%r90; .loc 1 344 24 mul.lo.u64 %r83,%r90,%r88; mov.u64 %r70,%r99; .loc 1 345 5 add.u64 %r99,%r99,1; .loc 1 346 7 setp.eq.u64 %r280,%r29,%r70; @ %r280 bra $L39; .loc 1 354 16 ld.u64 %r281,[%r105]; add.u64 %r85,%r281,1; st.u64 [%r105],%r85; .loc 1 355 23 add.u64 %r283,%r298,%r104; ld.u64 %r86,[%r283]; .loc 1 355 13 sub.u64 %r284,%r86,%r81; shl.b64 %r285,%r284,4; add.u64 %r93,%r93,%r285; .loc 1 356 24 add.u64 %r286,%frame,%r104; ld.u64 %r88,[%r286]; .loc 1 356 14 sub.u64 %r287,%r88,%r83; add.u64 %r96,%r96,%r287; .loc 1 359 32 add.u64 %r289,%r305,%r104; ld.u64 %r90,[%r289]; .loc 1 359 7 add.u64 %r105,%r105,8; add.u64 %r104,%r104,8; setp.eq.u64 %r290,%r85,%r90; @ %r290 bra $L68; ld.u64 %r155,[%frame+360]; bra $L69; $L71: mov.u64 %r93,%r45; setp.eq.u32 %r297,1,1; bra $L52; $L39: .loc 1 362_gfortran_sminloc0_8_i16 .visible .func _gfortran_sminloc0_8_i164predpred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L99; .loc 1 378 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L100; $L99: .loc 1 380 7 {_gfortran_minloc0_8_i16.loc 1 381 7 bra $L98; $L100: .loc 1 384 8 ld.s8 %r30,[%r36+28]; .loc 1 386 6 setp.gt.s64 %r46,%r30,0; @ %r46 bra $L102_gfortran_runtime_error102: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L103; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29 mov.u64 call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L104; $L103: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36.loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L104: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,3; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 mov.u64 %r68,%r32; $L105: st.u64 [%r29],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L105; $L98: .loc 1 406127 _gfortran_minloc0_16_i16 .visible .func _gfortran_minloc0_16_i16fortran/generated/minloc0_16_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_16_i16 .visible .func _gfortran_mminloc0_16_i16_gfortran_sminloc0_16_i16 .visible .func _gfortran_sminloc0_16_i16VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 
_gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_16_i16 .visible .func _gfortran_minloc0_16_i163264u32 %r157; .reg .u64 %r158; .reg .pred %r163; .reg .pred %r166; .reg .predpredpredpredu64 %r191; .reg .predpred64mov.u32 %r116,%ar2; .loc 1 50 10 ld.s8 %r22,[%r115+28]; .loc 1 50 8 cvt.u32.u32 %r117,%r22; cvt.s64.s8 %r71,%r117; .loc 1 51 6 setp.gt.s64 %r118,%r71,0; @ %r118 bra $L2; .loc 1 52 5 cvta.const.u64 %r1111_gfortran_runtime_errorr23,[%r114]; .loc 1 54 6 setp.ne.u64 %r121,%r23,0; @ %r121 bra $L3; .loc 1 56 7 st.u64 [%r114+48],%r23; add.u64 %r123,%r71,-1; st.u64 [%r114+56],%r123; mov.u64 %r124,1; st.u64 [%r114+40],%r124; .loc 1 57 28 cvt.u32.u64 %r125,%r124; st.u8 [%r114+28],%r125; .loc 1 58 24 st.u64 [%r114+8],%r23; .loc 1 59 29 mov.u64 %r128,1671call (%value_in),_gfortrani_xmallocarray131,[%value_in]; } mov.u64 %r23,%r131; .loc 1 59 27 st.u64 [%r114],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r133,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r134,[%r133+36]; setp.eq.u32 %r135,%r134,0; @ %r135 bra $L4; .loc 1 64r115138; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r114]; $L4: .loc 1 68 11 ld.u64 %r72,[%r114+40]; add.u64 %r44,%r115,40; mov.u64 %r35,%frame; add.u64 %r52,%frame,120; .loc 1 70 10 mov.u64 %r100,0; add.u64 %r210,%frame,240; .loc 1 74 16 mov.u64 %r146,%r100; bra $L7; $L24: mov.u64 %r100,%r79; $L7: .loc 1 72 18 ld.u64 %r139,[%r44]; st.u64 [%r35],%r139; .loc 1 73 19 ld.u64 %r141,[%r44+16]; add.u64 %r140,%r141,1; ld.u64 %r142,[%r44+8]; sub.u64 %r32,%r140,%r142; .loc 1 73 17 st.u64 [%r52],%r32; .loc 1 74 16 shl.b64 %r144,%r100,3; add.u64 %r145,%r210,%r144; st.u64 [%r145],%r146; .loc 1 75 10 setp.gt.s64 %r147,%r32,0; @ %r147 bra $L5; shl.b64 %r86,%r72,4; mov.u64 %r88,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 mov.u64 %r148,%r80; $L6: st.u64 [%r88],%r148; st.u64 [%r88+8],%r148; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r88,%r88,%r86; setp.ne.u64 %r150,%r71,%r80; @ %r150 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r100,1; .loc 1 70 3 add.u64 %r44,%r44,24; add.u64 %r35,%r35,8; add.u64 %r52,%r52,8; setp.ne.u64 %r151,%r71,%r79; @ %r151 bra $L24; .loc 1 84 8 ld.u64 %r62,[%r115]; shl.b64 %r38,%r72,4; mov.u64 %r85,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u64 %r152,1; mov.u64 %r153,%r78; $L8: st.u64 [%r85],%r152; st.u64 [%r85+8],%r153; mov.u64 %r33,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r85,%r85,%r38; setp.ne.u64 %r154,%r100,%r33; @ %r154 bra $L8; .loc 1 101 9 setp.ne.u64 %r155,%r62,0; @ ! 
[PTX bodies of _gfortran_minloc0_16_i16 (apparently the unmasked MINLOC kernel over an integer(16) array, returning integer(16) indices) and _gfortran_mminloc0_16_i16 (the variant taking a logical MASK array). Both walk the array descriptor dimension by dimension, keep a running 128-bit minimum seeded with the largest representable signed value (low word -1, high word 9223372036854775807 in the PTX), and store 1-based indices into the result array as 64-bit low/high pairs. The surrounding register declarations and call framing are extraction-damaged.]
[Tail of the _gfortran_mminloc0_16_i16 body, then the _gfortran_sminloc0_16_i16 wrapper, which handles a scalar MASK: the PTX tests whether the mask pointer is null or points to a true value and forwards to _gfortran_minloc0_16_i16 in that case, otherwise it zero-fills the result array. This closes the minloc0_16_i16.o member.]
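For readers who do not want to trace the PTX, the following is a minimal C sketch of what an unmasked minloc0-style kernel computes: the 1-based position of the first minimum element, or 0 for a zero-sized array. The function name, the rank-1/contiguous simplification, and the use of int64_t in place of the 128-bit integer kind are illustrative assumptions, not the libgfortran sources this object was compiled from.

/* Minimal sketch, not the shipped libgfortran code: a rank-1, contiguous
   minloc0-style kernel.  It returns the 1-based index of the first minimum
   element, or 0 for a zero-sized array. */
#include <stdio.h>
#include <stddef.h>
#include <stdint.h>

static ptrdiff_t minloc0_sketch (const int64_t *base, ptrdiff_t n)
{
  int64_t minval = INT64_MAX;      /* seed with the largest representable value */
  ptrdiff_t result = 0;            /* 0 = "no element", as for size-0 arrays */

  for (ptrdiff_t i = 0; i < n; i++)
    if (base[i] < minval)
      {
        minval = base[i];
        result = i + 1;            /* Fortran result indices are 1-based */
      }
  return result;
}

int main (void)
{
  int64_t a[] = { 7, 3, 9, 3, 8 };
  printf ("%td\n", minloc0_sketch (a, 5));  /* prints 2: first occurrence of the minimum */
  return 0;
}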
[Archive member minloc0_4_r4.o (timestamp 1622802232; source libgfortran/generated/minloc0_4_r4.c), with the same external declarations and error strings as above, then the PTX bodies of _gfortran_minloc0_4_r4 and _gfortran_mminloc0_4_r4: MINLOC over a real(4) array returning integer(4) indices, unmasked and array-masked respectively. The real-valued kernels seed the running minimum with +infinity (the 0f7f800000 literal in the PTX). Register declarations and call framing are extraction-damaged.]
[Tail of _gfortran_mminloc0_4_r4, then the _gfortran_sminloc0_4_r4 scalar-mask wrapper (dispatch to _gfortran_minloc0_4_r4, or a zero-filled result), closing minloc0_4_r4.o.]
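The real-valued kernels summarised above seed their running minimum with +infinity and run a first loop that only accepts elements comparing <= that seed. A plausible reading of that structure, shown below as an assumption rather than a transcription of the generated C, is that leading NaNs fail the comparison and are skipped, while an all-NaN or empty input leaves the result at 0.

/* Sketch under the assumptions stated above: NaN-aware single-precision
   minloc with a +Inf seed (bit pattern 0x7f800000) and a two-phase scan. */
#include <stdio.h>
#include <stddef.h>
#include <math.h>

static ptrdiff_t minloc0_r4_sketch (const float *base, ptrdiff_t n)
{
  float minval = INFINITY;         /* the 0f7f800000 seed seen in the PTX */
  ptrdiff_t result = 0;
  ptrdiff_t i = 0;

  /* Phase 1: find the first element that is <= +Inf, i.e. the first non-NaN. */
  for (; i < n; i++)
    if (base[i] <= minval)
      {
        minval = base[i];
        result = i + 1;
        i++;
        break;
      }

  /* Phase 2: ordinary strict-minimum scan over the rest. */
  for (; i < n; i++)
    if (base[i] < minval)
      {
        minval = base[i];
        result = i + 1;
      }
  return result;
}

int main (void)
{
  float a[] = { NAN, 5.0f, 2.0f, 2.0f };
  printf ("%td\n", minloc0_r4_sketch (a, 4));  /* prints 3 */
  return 0;
}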
[Archive member minloc0_8_r4.o (timestamp 1622802232; source libgfortran/generated/minloc0_8_r4.c), again with the _gfortrani_* external declarations and error strings, followed by the PTX bodies of _gfortran_minloc0_8_r4 and _gfortran_mminloc0_8_r4: the same real(4) MINLOC kernels but returning integer(8) indices. The structure mirrors the 4_r4 member, including the +infinity seed and the mask-controlled inner loops.]
[Tail of _gfortran_mminloc0_8_r4 and the _gfortran_sminloc0_8_r4 scalar-mask wrapper, closing minloc0_8_r4.o.]
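The mminloc0_* entry points take an additional logical MASK array and, as the inner loops above suggest, only let masked-in elements update the running minimum. The sketch below illustrates that behaviour under the same rank-1 simplification; the parameter layout and names are assumptions, not the libgfortran ABI.

/* Sketch, not the real ABI: a rank-1 masked minloc.  Elements whose mask
   entry is false never participate; if nothing is masked in, the result
   stays 0. */
#include <stdio.h>
#include <stddef.h>
#include <stdbool.h>

static ptrdiff_t mminloc0_sketch (const float *base, const bool *mask, ptrdiff_t n)
{
  ptrdiff_t result = 0;
  bool seen = false;               /* set once a masked-in element has been found */
  float minval = 0.0f;

  for (ptrdiff_t i = 0; i < n; i++)
    {
      if (!mask[i])
        continue;                  /* masked-out: skip entirely */
      if (!seen || base[i] < minval)
        {
          seen = true;
          minval = base[i];
          result = i + 1;
        }
    }
  return result;
}

int main (void)
{
  float a[] = { 4.0f, 1.0f, 6.0f, 0.5f };
  bool  m[] = { true, false, true, false };
  printf ("%td\n", mminloc0_sketch (a, m, 4));  /* prints 1 */
  return 0;
}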
[Archive member minloc0_16_r4.o (timestamp 1622802232; source libgfortran/generated/minloc0_16_r4.c) with the usual external declarations and error strings, then the PTX bodies of _gfortran_minloc0_16_r4 and _gfortran_mminloc0_16_r4: real(4) MINLOC kernels returning integer(16) indices, so each result element is written as a 64-bit low/high pair (the paired st.u64 stores with a shr.s64 …,63 sign extension into the high word).]
%r270 bra $L79; mov.u64 %r128,%r292; mov.u64 %r36,%r120; mov.u64 %r135,%r73; .loc 1 326 12 mov.u64 %r122,0; $L81: .loc 1 327 36 add.u64 %r272,%r135,1; st.u64 [%r36],%r272; shr.s64 %r273,%r272,63; st.u64 [%r36+8],%r273; .loc 1 326 28 add.u64 %r122,%r122,1; .loc 1 326 5 add.u64 %r36,%r36,%r159; setp.ge.s64 %r274,%r122,%r137; @ %r274 bra $L87; .loc 1 327 32 ld.u64 %r135,[%r128]; add.u64 %r128,%r128,8; bra $L81; $L87: .loc 1 325 12 mov.f32 %r110,%r77; $L79: .loc 1 331 9 add.u64 %r101,%r101,%r84; .loc 1 332 10 add.u64 %r103,%r103,%r85; .loc 1 334 14 add.u64 %r73,%r73,1; .loc 1 334 7 st.u64 [%frame+360],%r73; setp.ne.u64 %r275,%r73,%r87; @ %r275 bra $L74; sub.u64 %r141,%r87,%r161; .loc 1 331 9 mad.lo.u64 %r102,%r84,%r141,%r102; .loc 1 332 10 mad.lo.u64 %r104,%r85,%r141,%r104; $L73: mov.u64 %r114,%r292; .loc 1 325 12 mov.u64 %r96,%r85; .loc 1 334 34 mov.u64 %r98,%r87; mov.u64 %r94,%r82; mov.u64 %r113,8; .loc 1 335 9 mov.u64 %r108,0; $L82: .loc 1 340 13 st.u64 [%r114+-8],%r295; .loc 1 343 23 mul.lo.u64 %r89,%r94,%r98; .loc 1 344 24 mul.lo.u64 %r91,%r98,%r96; .loc 1 345 5 add.u64 %r108,%r108,1; .loc 1 346 7 setp.eq.u64 %r280,%r108,%r137; @ %r280 bra $L52; .loc 1 354 16 ld.u64 %r281,[%r114]; add.u64 %r93,%r281,1; st.u64 [%r114],%r93; .loc 1 355 23 add.u64 %r283,%r291,%r113; ld.u64 %r94,[%r283]; .loc 1 355 13 sub.u64 %r284,%r94,%r89; shl.b64 %r285,%r284,2; add.u64 %r102,%r102,%r285; .loc 1 356 24 add.u64 %r286,%frame,%r113; ld.u64 %r96,[%r286]; .loc 1 356 14 sub.u64 %r287,%r96,%r91; add.u64 %r104,%r104,%r287; .loc 1 359 32 add.u64 %r289,%r296,%r113; ld.u64 %r98,[%r289]; .loc 1 359 7 add.u64 %r114,%r114,8; add.u64 %r113,%r113,8; setp.eq.u64 %r290,%r93,%r98; @ %r290 bra $L82; .loc 1 334 21 ld.u64 %r161,[%frame+360]; bra $L83; $L84: .loc 1 290 9 mov.f32 %r110,%r55; setp.eq.u32 %r293,1,1; bra $L65; $L52: .loc 1 362_gfortran_sminloc0_16_r4 .visible .func _gfortran_sminloc0_16_r44predpred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L114; .loc 1 378 20 ld.u32 %r40,[%r37380 7 {_gfortran_minloc0_16_r4.loc 1 381 7 bra $L113; $L115: .loc 1 384 8 ld.s8 %r30,[%r36+28]; .loc 1 386 6 setp.gt.s64 %r46,%r30,0; @ %r46 bra $L1_gfortran_runtime_error117: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L118; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L119; $L118: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36.loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L119: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,4; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 mov.u64 %r68,%r32; $L120: st.u64 [%r29],%r68; st.u64 [%r29+8],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r70,%r30,%r32; @ %r70 bra $L120; $L113: .loc 1 406 minloc0_4_r8.o/ 1622802232_gfortran_minloc0_4_r8 .visible .func _gfortran_minloc0_4_fortran/generated/minloc0_4_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_4_r8 .visible .func _gfortran_mminloc0_4_r8_gfortran_sminloc0_4_r8 .visible .func _gfortran_sminloc0_4_r8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func 
_gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_4_r8 .visible .func _gfortran_minloc0_4_r8predpredpredu32 %r176; .reg .pred %r177; .reg .pred %r178; .reg .pred %r180; .reg .u32 %r182; .reg .u32 %r183; .reg .pred64pred %r208; .reg .u32 %r210; .reg .u64 %r211; .reg .u64 %r212; .reg .pred50 10 ld.s8 %r22,[%r128+28]; .loc 1 50 8 cvt.u32.u32 %r130,%r22; cvt.s64.s8 %r79,%r130; .loc 1 51 6 setp.gt.s64 %r131,%r79,0; @ %r131 bra $L2; .loc 1 52 5 cvta.const.u64 %r1321stack; call _gfortran_runtime_errorr23,[%r127]; .loc 1 54 6 setp.ne.u64 %r134,%r23,0; @ %r134 bra $L3; .loc 1 56 7 st.u64 [%r127+48],%r23; add.u64 %r136,%r79,-1; st.u64 [%r127+56],%r136; mov.u64 %r137,1; st.u64 [%r127+40],%r137; .loc 1 57 28 cvt.u32.u64 %r138,%r137; st.u8 [%r127+28],%r138; .loc 1 58 24 st.u64 [%r127+8],%r23; .loc 1 59 2979call (%value_in),_gfortrani_xmallocarray4,[%value_in]; } mov.u64 %r23,%r144; .loc 1 59 27 st.u64 [%r127],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r146,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r147,[%r146+36]; setp.eq.u32 %r148,%r147,0; @ %r148 bra $L4; .loc 1 64 2r12151; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r127]; $L4: .loc 1 68 11 ld.u64 %r80,[%r127+40]; add.u64 %r44,%r128,40; mov.u64 %r37,%frame; add.u64 %r47,%frame,120; .loc 1 70 10 mov.u64 %r103,0; add.u64 %r206,%frame,240; .loc 1 74 16 mov.u64 %r159,%r103; bra $L7; $L32: mov.u64 %r103,%r88; $L7: .loc 1 72 18 ld.u64 %r152,[%r44]; st.u64 [%r37],%r152; .loc 1 73 19 ld.u64 %r154,[%r44+16]; add.u64 %r153,%r154,1; ld.u64 %r155,[%r44+8]; sub.u64 %r31,%r153,%r155; .loc 1 73 17 st.u64 [%r47],%r31; .loc 1 74 16 shl.b64 %r157,%r103,3; add.u64 %r158,%r206,%r157; st.u64 [%r158],%r159; .loc 1 75 10 setp.gt.s64 %r160,%r31,0; @ %r160 bra $L5; shl.b64 %r122,%r80,2; mov.u64 %r101,%r23; .loc 1 78 11 mov.u64 %r89,0; .loc 1 79 24 cvt.u32.u64 %r161,%r89; $L6: st.u32 [%r101],%r161; .loc 1 78 27 add.u64 %r89,%r89,1; .loc 1 78 4 add.u64 %r101,%r101,%r122; setp.ne.u64 %r162,%r79,%r89; @ %r162 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r88,%r103,1; .loc 1 70 3 add.u64 %r44,%r44,24; add.u64 %r37,%r37,8; add.u64 %r47,%r47,8; setp.ne.u64 %r163,%r79,%r88; @ %r163 bra $L32; .loc 1 84 8 ld.u64 %r67,[%r128]; shl.b64 %r108,%r80,2; mov.u64 %r106,%r23; .loc 1 87 10 mov.u64 %r87,0; .loc 1 88 23 mov.u32 %r164,1; $L8: st.u32 [%r106],%r164; mov.u64 %r104,%r87; .loc 1 87 26 add.u64 %r87,%r87,1; .loc 1 87 3 add.u64 %r106,%r106,%r108; setp.ne.u64 %r165,%r103,%r104; @ %r165 bra $L8; .loc 1 101 9 setp.eq.u64 %r166,%r67,0; @ %r166 bra $L1; .loc 1 149 19 ld.u64 %r51,[%frame]; .loc 1 149 9 shl.b64 %r53,%r51,3; .loc 1 151 34 ld.u64 %r56,[%frame+120]; ld.u64 %r123,[%frame+240]; .loc 1 141 10 ld.f64 %r125,[%r67]; cvt.u32.u32 %r168,%r22; cvt.s64.s8 %r167,%r168; shl.b64 %r169,%r167,3; add.u64 %r113,%r206,%r169; .loc 1 97 12 mov.f64 %r73,0d7ff0000000000000; setp.eq.u32 %r208,1,0; add.u64 %r207,%frame,248; .loc 1 157 13 mov.u64 %r211,0; .loc 1 174 32 add.u64 %r212,%frame,120; .loc 1 126 10 setp.eq.u32 %r214,%r129,0; $L30: .loc 1 106 10 @ %r208 bra $L10; mov.u64 %r38,%r123; mov.u32 %r85,0; .loc 1 110 12 
mov.u32 %r210,1; $L17: .loc 1 118 13 add.u64 %r126,%r67,%r53; .loc 1 110 11 setp.ge.f64 %r172,%r73,%r125; @ ! %r172 bra $L51; setp.eq.u32 %r173,%r85,0; selp.u64 %r123,%r123,%r38,%r173; mov.u64 %r90,%r207; mov.u64 %r95,%r23; mov.u64 %r119,%r123; $L15: .loc 1 115 36 cvt.u32.u64 %r176,%r119; add.u32 %r175,%r176,1; .loc 1 115 25 st.u32 [%r95],%r175; .loc 1 114 5 add.u64 %r95,%r95,%r108; setp.eq.u64 %r177,%r90,%r113; @ %r177 bra $L33; .loc 1 115 32 ld.u64 %r119,[%r90]; add.u64 %r90,%r90,8; bra $L15; $L51: .loc 1 120 11 add.u64 %r38,%r38,1; .loc 1 120 4 setp.eq.u64 %r178,%r38,%r56; @ %r178 bra $L16; .loc 1 110 12 ld.f64 %r125,[%r126]; mov.u64 %r67,%r126; mov.u32 %r85,%r210; bra $L17; $L10: .loc 1 126 10 @ %r214 bra $L34; mov.u64 %r46,%r123; mov.u64 %r65,%r67; $L24: .loc 1 129 9 setp.ge.f64 %r180,%r73,%r125; @ ! %r180 bra $L19; mov.u64 %r84,%r207; mov.u64 %r39,%r23; mov.u64 %r116,%r46; $L21: .loc 1 133 34 cvt.u32.u64 %r183,%r116; add.u32 %r182,%r183,1; .loc 1 133 23 st.u32 [%r39],%r182; .loc 1 132 3 add.u64 %r39,%r39,%r108; setp.eq.u64 %r184,%r84,%r113; @ %r184 bra $L35; .loc 1 133 30 ld.u64 %r116,[%r84]; add.u64 %r84,%r84,8; bra $L21; $L35: .loc 1 131 10 mov.f64 %r73,%r125; $L19: .loc 1 135 11 add.u64 %r65,%r65,%r53; .loc 1 137 9 add.u64 %r46,%r46,1; .loc 1 137 2 st.u64 [%frame+240],%r46; setp.ne.u64 %r185,%r46,%r56; @ %r185 bra $L22; .loc 1 135 11 sub.u64 %r186,%r56,%r123; mad.lo.u64 %r67,%r186,%r53,%r67; bra $L23; $L22: .loc 1 129 10 ld.f64 %r125,[%r65]; bra $L24; $L34: mov.u64 %r55,%r123; mov.u64 %r66,%r67; $L18: .loc 1 141 9 setp.gt.f64 %r188,%r73,%r125; @ ! %r188 bra $L25; mov.u64 %r117,%r207; mov.u64 %r120,%r23; mov.u64 %r110,%r55; $L27: .loc 1 145 34 cvt.u32.u64 %r191,%r110; add.u32 %r190,%r191,1; .loc 1 145 23 st.u32 [%r120],%r190; .loc 1 144 3 add.u64 %r120,%r120,%r108; setp.eq.u64 %r192,%r113,%r117; @ %r192 bra $L36; .loc 1 145 30 ld.u64 %r110,[%r117]; add.u64 %r117,%r117,8; bra $L27; $L36: .loc 1 143 10 mov.f64 %r73,%r125; $L25: .loc 1 149 9 add.u64 %r66,%r66,%r53; .loc 1 151 14 add.u64 %r55,%r55,1; .loc 1 151 7 st.u64 [%frame+240],%r55; setp.ne.u64 %r193,%r55,%r56; @ %r193 bra $L28; .loc 1 149 9 sub.u64 %r194,%r56,%r123; mad.lo.u64 %r67,%r194,%r53,%r67; bra $L23; $L28: .loc 1 141 10 ld.f64 %r125,[%r66]; bra $L18; $L16: st.u64 [%frame+240],%r56; .loc 1 118 13 mov.u64 %r67,%r126; $L23: mov.u64 %r76,%r207; .loc 1 151 34 mov.u64 %r63,%r56; mov.u64 %r61,%r51; .loc 1 152 9 mov.u64 %r71,0; $L29: .loc 1 157 13 st.u64 [%r76+-8],%r211; .loc 1 160 23 mul.lo.u64 %r58,%r61,%r63; mov.u64 %r105,%r71; .loc 1 161 5 add.u64 %r71,%r71,1; .loc 1 162 7 setp.eq.u64 %r198,%r105,%r103; @ %r198 bra $L1; .loc 1 170 16 ld.u64 %r199,[%r76]; add.u64 %r60,%r199,1; st.u64 [%r76],%r60; shl.b64 %r69,%r71,3; .loc 1 171 23 add.u64 %r200,%frame,%r69; ld.u64 %r61,[%r200]; .loc 1 171 13 sub.u64 %r201,%r61,%r58; shl.b64 %r202,%r201,3; add.u64 %r67,%r67,%r202; .loc 1 174 32 add.u64 %r204,%r212,%r69; ld.u64 %r63,[%r204]; .loc 1 174 7 add.u64 %r76,%r76,8; setp.eq.u64 %r205,%r60,%r63; @ %r205 bra $L29; ld.u64 %r123,[%frame+240]; .loc 1 141 10 ld.f64 %r125,[%r67]; bra $L30; $L33: mov.f64 %r73,%r125; setp.eq.u32 %r208,1,1; bra $L10; $L1: .loc 1 1_gfortran_mminloc0_4_r8 .visible .func _gfortran_mminloc0_4_r8pred %r172; .reg .predpredpred %r195; .reg .u64 %r198; .reg .u64u32 %r211; .reg .u16pred %r217; .reg .predpredu16 %r257; .reg .u32 %r258; .reg .pred %r259; .reg .pred %r260; .reg .u32 %r262; .reg .u32 %r263; .reg .pred %r264; .reg .pred %r265; .reg .u64 %r266; .reg .u16 %r268; .reg .u32 %r269; .reg .pred %r270; .reg .pred 
%r271; .reg .u32 %r273; .reg .u32 %r274; .reg .pred %r275; .reg .pred %r276; .reg .pred %r281; .reg .u64 %r282; .reg .u64pred %r294; .reg .u64 %r296; .reg .u64 %r297; .reg .pred %r299; mov.u64 %r168,%ar0; mov.u64 %r169,%ar1; mov.u64 %r170,%ar2; mov.u32 %r171,%ar3; .loc 1 202 6 setp.ne.u64 %r172,%r170,0; @ %r172 bra $Lcall _gfortran_minloc0_4_r8.loc 1 205 7 bra $L52; $L53: .loc 1 208 10 ld.s8 %r22,[%r169+28]; .loc 1 209 6 setp.gt.s32 %r176,%r22,0; @ %r176 bra $L55; .loc 1 210 5 cvta.const.u64 %r177_gfortran_runtime_error5: .loc 1 212 6 ld.u64 %r179,[%r168]; setp.ne.u64 %r180,%r179,0; @ %r180 bra $L56; .loc 1 214 7 st.u64 [%r168+48],%r179; add.u32 %r182,%r22,-1; cvt.s64.s32 %r183,%r182; st.u64 [%r168+56],%r183; mov.u64 %r184,1; st.u64 [%r168+40],%r184; .loc 1 215 28 cvt.u32.u64 %r185,%r184; st.u8 [%r168+28],%r185; .loc 1 216 24 st.u64 [%r168+8],%r179; .loc 1 217 29 cvt.u32.u32 %r190,%r22; cvt.s64.s8 %r189,%r1call (%value_in),_gfortrani_xmallocarray191,[%value_in]; } .loc 1 217 27 st.u64 [%r168],%r191; bra $L57; $L56: .loc 1 221 11 cvta.global.u64 %r193,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r194,[%r193+36]; setp.eq.u32 %r195,%r194,0r169198; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r2011716920198; call _gfortrani_bounds_equal_extents$L57: .loc 1 231 15 ld.u64 %r31,[%r170+16]; .loc 1 233 9 ld.u64 %r99,[%r170]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r203,%r32,-4; and.b32 %r204,%r203,-5;.loc 1 235 22 add.u32 %r208,%r32,-1; set.u32.le.u32 %r210,%r208,1; neg.s32 %r211,%r210; .loc 1 235 6 cvt.u16.u32 %r213,%r207; cvt.u16.u32 %r214,%r211; or.b16 %r212,%r213,%r214; cvt.u32.u16 %r215,%r212; cvt.u16.u8 %r216,%r215; setp.ne.u16 %r217,%r216,0; @ %r217 bra $L58; .loc 1 237 7 setp.ne.u32 %r219,%r32,16; @ %r219 bra $L59; $L58: .loc 1 244 11 ld.u64 %r115,[%r168+40]; .loc 1 245 8 ld.u64 %r116,[%r168]; .loc 1 246 17 cvt.u32.u32 %r220,%r22; cvt.s64.s8 %r139,%r220; add.u64 %r166,%r169,40; add.u64 %r292,%frame,120; mov.u64 %r163,%r292; add.u64 %r160,%r170,40; mov.u64 %r159,%frame; add.u64 %r156,%frame,240; add.u64 %r152,%frame,360; mov.u64 %r154,%r152; .loc 1 246 10 mov.u64 %r122,0; .loc 1 251 16 mov.u64 %r229,%r122; bra $L60; $L59: .loc 1 242 5_gfortran_runtime_error248 18 ld.u64 %r223,[%r166]; st.u64 [%r163],%r223; .loc 1 249 20 ld.u64 %r225,[%r160]; mul.lo.u64 %r224,%r225,%r31; .loc 1 249 18 st.u64 [%r159],%r224; .loc 1 250 19 ld.u64 %r227,[%r166+16]; add.u64 %r226,%r227,1; ld.u64 %r228,[%r166+8]; sub.u64 %r45,%r226,%r228; .loc 1 250 17 st.u64 [%r156],%r45; .loc 1 251 16 st.u64 [%r154],%r229; .loc 1 252 10 setp.gt.s64 %r230,%r45,0; @ %r230 bra $L61; shl.b64 %r136,%r115,2; mov.u64 %r138,%r116; .loc 1 255 11 mov.u64 %r123,0; .loc 1 256 24 cvt.u32.u64 %r231,%r123; $L62: st.u32 [%r138],%r231; .loc 1 255 27 add.u64 %r123,%r123,1; .loc 1 255 4 add.u64 %r138,%r138,%r136; setp.ne.u64 %r232,%r123,%r139; @ %r232 bra $L62; bra $L52; $L61: .loc 1 246 26 add.u64 %r122,%r122,1; .loc 1 246 3 add.u64 %r166,%r166,24; add.u64 %r163,%r163,8; add.u64 %r160,%r160,24; add.u64 %r159,%r159,8; add.u64 %r156,%r156,8; add.u64 %r154,%r154,8; setp.ne.u64 %r233,%r122,%r139; @ %r233 bra $L60; .loc 1 261 8 ld.u64 %r96,[%r169]; shl.b64 %r126,%r115,2; mov.u64 %r124,%r116; .loc 1 264 10 mov.u64 %r121,0; .loc 1 265 23 cvt.u32.u64 %r234,%r121; $L64: st.u32 [%r124],%r234; .loc 1 264 26 add.u64 %r121,%r121,1; .loc 1 264 3 add.u64 %r124,%r124,%r126; setp.ne.u64 %r235,%r121,%r139; @ %r235 bra $L64; .loc 1 276 9 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L52; .loc 1 331 19 ld.u64 %r74,[%frame+120]; 
.loc 1 331 9 shl.b64 %r76,%r74,3; .loc 1 332 20 ld.u64 %r77,[%frame]; .loc 1 334 34 ld.u64 %r81,[%frame+240]; .loc 1 334 21 ld.u64 %r164,[%frame+360]; cvt.u32.u32 %r238,%r22; cvt.s64.s8 %r237,%r238; shl.b64 %r239,%r237,3; add.u64 %r80,%r239,%r152; .loc 1 272 12 mov.f64 %r104,0d7ff0000000000000; setp.eq.u32 %r294,1,0; add.u64 %r293,%frame,368; .loc 1 340 13 mov.u64 %r296,0; .loc 1 359 32 add.u64 %r297,%frame,240; .loc 1 308 12 setp.eq.u32 %r299,%r171,0; $L83: .loc 1 280 10 @ %r294 bra $L65; $L72: .loc 1 284 11 ld.s8 %r242,[%r99]; cvt.u16.u32 %r241,%r242; setp.eq.u16 %r243,%r241,0; @ %r243 bra $L66; .loc 1 287 8 ld.u32 %r244,[%r116]; setp.ne.u32 %r245,%r244,0; @ %r245 bra $L67; mov.u64 %r101,%r293; mov.u64 %r103,%r116; mov.u64 %r157,%r164; $L68: .loc 1 289 38 cvt.u32.u64 %r248,%r157; add.u32 %r247,%r248,1; .loc 1 289 27 st.u32 [%r103],%r247; .loc 1 288 7 add.u64 %r103,%r103,%r126; setp.eq.u64 %r249,%r80,%r101; @ %r249 bra $L67; .loc 1 289 34 ld.u64 %r157,[%r101]; add.u64 %r101,%r101,8; bra $L68; $L67: .loc 1 290 9 ld.f64 %r52,[%r96]; .loc 1 290 8 setp.le.f64 %r250,%r52,%r104; @ ! %r250 bra $L66; mov.u64 %r129,%r293; mov.u64 %r130,%r116; mov.u64 %r161,%r164; $L71: .loc 1 296 33 cvt.u32.u64 %r253,%r161; add.u32 %r252,%r253,1; .loc 1 296 22 st.u32 [%r130],%r252; .loc 1 295 9 add.u64 %r130,%r130,%r126; setp.eq.u64 %r254,%r80,%r129; @ %r254 bra $L84; .loc 1 296 29 ld.u64 %r161,[%r129]; add.u64 %r129,%r129,8; bra $L71; $L66: .loc 1 300 13 add.u64 %r96,%r96,%r76; .loc 1 301 14 add.u64 %r99,%r99,%r77; .loc 1 303 11 add.u64 %r164,%r164,1; .loc 1 303 4 st.u64 [%frame+360],%r164; setp.ne.u64 %r255,%r164,%r81; @ %r255 bra $L72; bra $L73; $L65: .loc 1 308 12 @ %r299 bra $L85; mov.u64 %r66,%r164; mov.u64 %r94,%r96; $L78: .loc 1 311 11 ld.s8 %r258,[%r99]; cvt.u16.u32 %r257,%r258; setp.eq.u16 %r259,%r257,0; @ %r259 bra $L75; .loc 1 311 12 ld.f64 %r61,[%r94]; .loc 1 311 11 setp.le.f64 %r260,%r61,%r104; @ ! %r260 bra $L75; mov.u64 %r62,%r293; mov.u64 %r69,%r116; mov.u64 %r155,%r66; .loc 1 314 18 mov.u64 %r120,0; $L77: .loc 1 315 36 cvt.u32.u64 %r263,%r155; add.u32 %r262,%r263,1; .loc 1 315 25 st.u32 [%r69],%r262; .loc 1 314 34 add.u64 %r120,%r120,1; .loc 1 314 11 add.u64 %r69,%r69,%r126; setp.ge.s64 %r264,%r120,%r139; @ %r264 bra $L86; .loc 1 315 32 ld.u64 %r155,[%r62]; add.u64 %r62,%r62,8; bra $L77; $L86: .loc 1 313 18 mov.f64 %r104,%r61; $L75: .loc 1 317 8 add.u64 %r94,%r94,%r76; .loc 1 319 13 add.u64 %r66,%r66,1; .loc 1 319 6 st.u64 [%frame+360],%r66; setp.ne.u64 %r265,%r66,%r81; @ %r265 bra $L78; .loc 1 317 8 sub.u64 %r266,%r81,%r164; mad.lo.u64 %r96,%r266,%r76,%r96; bra $L73; $L85: mov.u64 %r79,%r164; mov.u64 %r98,%r99; mov.u64 %r95,%r96; $L74: .loc 1 323 11 ld.s8 %r269,[%r98]; cvt.u16.u32 %r268,%r269; setp.eq.u16 %r270,%r268,0; @ %r270 bra $L79; .loc 1 323 12 ld.f64 %r68,[%r95]; .loc 1 323 11 setp.lt.f64 %r271,%r68,%r104; @ ! 
%r271 bra $L79; mov.u64 %r46,%r293; mov.u64 %r56,%r116; mov.u64 %r151,%r79; .loc 1 326 12 mov.u64 %r118,0; $L81: .loc 1 327 36 cvt.u32.u64 %r274,%r151; add.u32 %r273,%r274,1; .loc 1 327 25 st.u32 [%r56],%r273; .loc 1 326 28 add.u64 %r118,%r118,1; .loc 1 326 5 add.u64 %r56,%r56,%r126; setp.ge.s64 %r275,%r118,%r139; @ %r275 bra $L87; .loc 1 327 32 ld.u64 %r151,[%r46]; add.u64 %r46,%r46,8; bra $L81; $L87: .loc 1 325 12 mov.f64 %r104,%r68; $L79: .loc 1 331 9 add.u64 %r95,%r95,%r76; .loc 1 332 10 add.u64 %r98,%r98,%r77; .loc 1 334 14 add.u64 %r79,%r79,1; .loc 1 334 7 st.u64 [%frame+360],%r79; setp.ne.u64 %r276,%r79,%r81; @ %r276 bra $L74; sub.u64 %r143,%r81,%r164; .loc 1 331 9 mad.lo.u64 %r96,%r76,%r143,%r96; .loc 1 332 10 mad.lo.u64 %r99,%r77,%r143,%r99; $L73: mov.u64 %r119,%r293; .loc 1 325 12 mov.u64 %r90,%r77; .loc 1 334 34 mov.u64 %r92,%r81; mov.u64 %r88,%r74; mov.u64 %r110,8; .loc 1 335 9 mov.u64 %r102,0; $L82: .loc 1 340 13 st.u64 [%r119+-8],%r296; .loc 1 343 23 mul.lo.u64 %r83,%r88,%r92; .loc 1 344 24 mul.lo.u64 %r85,%r92,%r90; .loc 1 345 5 add.u64 %r102,%r102,1; .loc 1 346 7 setp.eq.u64 %r281,%r102,%r139; @ %r281 bra $L52; .loc 1 354 16 ld.u64 %r282,[%r119]; add.u64 %r87,%r282,1; st.u64 [%r119],%r87; .loc 1 355 23 add.u64 %r284,%r292,%r110; ld.u64 %r88,[%r284]; .loc 1 355 13 sub.u64 %r285,%r88,%r83; shl.b64 %r286,%r285,3; add.u64 %r96,%r96,%r286; .loc 1 356 24 add.u64 %r287,%frame,%r110; ld.u64 %r90,[%r287]; .loc 1 356 14 sub.u64 %r288,%r90,%r85; add.u64 %r99,%r99,%r288; .loc 1 359 32 add.u64 %r290,%r297,%r110; ld.u64 %r92,[%r290]; .loc 1 359 7 add.u64 %r119,%r119,8; add.u64 %r110,%r110,8; setp.eq.u64 %r291,%r87,%r92; @ %r291 bra $L82; .loc 1 334 21 ld.u64 %r164,[%frame+360]; bra $L83; $L84: .loc 1 290 9 mov.f64 %r104,%r52; setp.eq.u32 %r294,1,1; bra $L65; $L52: .loc 1 362_gfortran_sminloc0_4_r8 .visible .func _gfortran_sminloc0_4_r84predpred.reg .u64u32 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L114; .loc 1 378 20 ld.u32 %r40,[%r37380 7 {_gfortran_minloc0_4_r8.loc 1 381 7 bra $L113; $L115: .loc 1 384 8 ld.s8 %r30,[%r36+28]; .loc 1 386 6 setp.gt.s64 %r46,%r30,0; @ %r46 bra $L1_gfortran_runtime_error117: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L118; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29 mov.u64 %r56call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L119; $L118: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36.loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L119: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,2; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 cvt.u32.u64 %r68,%r32; $L120: st.u32 [%r29],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L120; $L113: .loc 1 406 minloc0_8_r8.o/ 1622802232_gfortran_minloc0_8_r8 .visible .func _gfortran_minloc0_8_fortran/generated/minloc0_8_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_8_r8 .visible .func _gfortran_mminloc0_8_r8_gfortran_sminloc0_8_r8 .visible .func _gfortran_sminloc0_8_r8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func 
_gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_8_r8 .visible .func _gfortran_minloc0_8_r8pred.reg .u64 %r140; .reg .u32 %r141; .reg .predpred %r157; .reg .u64pred %r170; .reg .pred %r171; .reg .pred %r173; .reg .u64pred %r177; .reg .u64 %r178; .reg .predpredpred %r196; .reg .predpred %r205; mov.u64 %r121,%ar0; mov.u64 %r122,%ar1; mov.u32 %r123,%ar2; .loc 1 50 10 ld.s8 %r22,[%r122+28]; .loc 1 50 8 cvt.u32.u32 %r124,%r22; cvt.s64.s8 %r84,%r124; .loc 1 51 6 setp.gt.s64 %r125,%r84,0; @ %r125 bra $L2; .loc 1 52 5 cvta.const.u64 %r126,$LCstack; call _gfortran_runtime_errorr23,[%r121]; .loc 1 54 6 setp.ne.u64 %r128,%r23,0; @ %r128 bra $L3; .loc 1 56 7 st.u64 [%r121+48],%r23; add.u64 %r130,%r84,-1; st.u64 [%r121+56],%r130; mov.u64 %r131,1; st.u64 [%r121+40],%r131; .loc 1 57 28 cvt.u32.u64 %r132,%r131; st.u8 [%r121+28],%r132; .loc 1 58 24 st.u64 [%r121+8],%r23; .loc 1 59 29 mov.u64 %r135135; call (%value_in),_gfortrani_xmallocarray138,[%value_in]; } mov.u64 %r23,%r138; .loc 1 59 27 st.u64 [%r121],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r140,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r141,[%r140+36]; setp.eq.u32 %r142,%r141,0; @ %r142 bra $L4; .loc 1 64r122145; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r121]; $L4: .loc 1 68 11 ld.u64 %r85,[%r121+40]; add.u64 %r33,%r122,40; mov.u64 %r53,%frame; add.u64 %r45,%frame,120; .loc 1 70 10 mov.u64 %r106,0; add.u64 %r198,%frame,240; .loc 1 74 16 mov.u64 %r153,%r106; bra $L7; $L32: mov.u64 %r106,%r91; $L7: .loc 1 72 18 ld.u64 %r146,[%r33]; st.u64 [%r53],%r146; .loc 1 73 19 ld.u64 %r148,[%r33+16]; add.u64 %r147,%r148,1; ld.u64 %r149,[%r33+8]; sub.u64 %r32,%r147,%r149; .loc 1 73 17 st.u64 [%r45],%r32; .loc 1 74 16 shl.b64 %r151,%r106,3; add.u64 %r152,%r198,%r151; st.u64 [%r152],%r153; .loc 1 75 10 setp.gt.s64 %r154,%r32,0; @ %r154 bra $L5; shl.b64 %r114,%r85,3; mov.u64 %r104,%r23; .loc 1 78 11 mov.u64 %r92,0; .loc 1 79 24 mov.u64 %r155,%r92; $L6: st.u64 [%r104],%r155; .loc 1 78 27 add.u64 %r92,%r92,1; .loc 1 78 4 add.u64 %r104,%r104,%r114; setp.ne.u64 %r156,%r84,%r92; @ %r156 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r91,%r106,1; .loc 1 70 3 add.u64 %r33,%r33,24; add.u64 %r53,%r53,8; add.u64 %r45,%r45,8; setp.ne.u64 %r157,%r84,%r91; @ %r157 bra $L32; .loc 1 84 8 ld.u64 %r73,[%r122]; shl.b64 %r44,%r85,3; mov.u64 %r55,%r23; .loc 1 87 10 mov.u64 %r90,0; .loc 1 88 23 mov.u64 %r158,1; $L8: st.u64 [%r55],%r158; mov.u64 %r107,%r90; .loc 1 87 26 add.u64 %r90,%r90,1; .loc 1 87 3 add.u64 %r55,%r55,%r44; setp.ne.u64 %r159,%r106,%r107; @ %r159 bra $L8; .loc 1 101 9 setp.eq.u64 %r160,%r73,0; @ %r160 bra $L1; .loc 1 149 19 ld.u64 %r57,[%frame]; .loc 1 149 9 shl.b64 %r59,%r57,3; .loc 1 151 34 ld.u64 %r61,[%frame+120]; ld.u64 %r115,[%frame+240]; .loc 1 141 10 ld.f64 %r119,[%r73]; cvt.u32.u32 %r162,%r22; cvt.s64.s8 %r161,%r162; shl.b64 %r163,%r161,3; add.u64 %r95,%r198,%r163; .loc 1 97 12 mov.f64 %r78,0d7ff0000000000000; setp.eq.u32 %r197,1,0; add.u64 %r199,%frame,248; .loc 1 157 13 mov.u64 %r202,0; .loc 1 174 32 add.u64 %r203,%frame,120; .loc 1 126 10 setp.eq.u32 %r205,%r123,0; $L30: 
.loc 1 106 10 @ %r197 bra $L10; mov.u64 %r40,%r115; mov.u32 %r89,0; .loc 1 110 12 mov.u32 %r201,1; $L17: .loc 1 118 13 add.u64 %r120,%r73,%r59; .loc 1 110 11 setp.ge.f64 %r166,%r78,%r119; @ ! %r166 bra $L51; setp.eq.u32 %r167,%r89,0; selp.u64 %r115,%r115,%r40,%r167; mov.u64 %r77,%r199; mov.u64 %r97,%r23; mov.u64 %r111,%r115; $L15: .loc 1 115 36 add.u64 %r169,%r111,1; .loc 1 115 25 st.u64 [%r97],%r169; .loc 1 114 5 add.u64 %r97,%r97,%r44; setp.eq.u64 %r170,%r77,%r95; @ %r170 bra $L33; .loc 1 115 32 ld.u64 %r111,[%r77]; add.u64 %r77,%r77,8; bra $L15; $L51: .loc 1 120 11 add.u64 %r40,%r40,1; .loc 1 120 4 setp.eq.u64 %r171,%r40,%r61; @ %r171 bra $L16; .loc 1 110 12 ld.f64 %r119,[%r120]; mov.u64 %r73,%r120; mov.u32 %r89,%r201; bra $L17; $L10: .loc 1 126 10 @ %r205 bra $L34; mov.u64 %r52,%r115; mov.u64 %r71,%r73; $L24: .loc 1 129 9 setp.ge.f64 %r173,%r78,%r119; @ ! %r173 bra $L19; mov.u64 %r110,%r199; mov.u64 %r112,%r23; mov.u64 %r109,%r52; $L21: .loc 1 133 34 add.u64 %r175,%r109,1; .loc 1 133 23 st.u64 [%r112],%r175; .loc 1 132 3 add.u64 %r112,%r112,%r44; setp.eq.u64 %r176,%r95,%r110; @ %r176 bra $L35; .loc 1 133 30 ld.u64 %r109,[%r110]; add.u64 %r110,%r110,8; bra $L21; $L35: .loc 1 131 10 mov.f64 %r78,%r119; $L19: .loc 1 135 11 add.u64 %r71,%r71,%r59; .loc 1 137 9 add.u64 %r52,%r52,1; .loc 1 137 2 st.u64 [%frame+240],%r52; setp.ne.u64 %r177,%r52,%r61; @ %r177 bra $L22; .loc 1 135 11 sub.u64 %r178,%r61,%r115; mad.lo.u64 %r73,%r178,%r59,%r73; bra $L23; $L22: .loc 1 129 10 ld.f64 %r119,[%r71]; bra $L24; $L34: mov.u64 %r60,%r115; mov.u64 %r72,%r73; $L18: .loc 1 141 9 setp.gt.f64 %r180,%r78,%r119; @ ! %r180 bra $L25; mov.u64 %r50,%r199; mov.u64 %r47,%r23; mov.u64 %r93,%r60; $L27: .loc 1 145 34 add.u64 %r182,%r93,1; .loc 1 145 23 st.u64 [%r47],%r182; .loc 1 144 3 add.u64 %r47,%r47,%r44; setp.eq.u64 %r183,%r50,%r95; @ %r183 bra $L36; .loc 1 145 30 ld.u64 %r93,[%r50]; add.u64 %r50,%r50,8; bra $L27; $L36: .loc 1 143 10 mov.f64 %r78,%r119; $L25: .loc 1 149 9 add.u64 %r72,%r72,%r59; .loc 1 151 14 add.u64 %r60,%r60,1; .loc 1 151 7 st.u64 [%frame+240],%r60; setp.ne.u64 %r184,%r60,%r61; @ %r184 bra $L28; .loc 1 149 9 sub.u64 %r185,%r61,%r115; mad.lo.u64 %r73,%r185,%r59,%r73; bra $L23; $L28: .loc 1 141 10 ld.f64 %r119,[%r72]; bra $L18; $L16: st.u64 [%frame+240],%r61; .loc 1 118 13 mov.u64 %r73,%r120; $L23: mov.u64 %r42,%r199; .loc 1 151 34 mov.u64 %r70,%r61; mov.u64 %r68,%r57; .loc 1 152 9 mov.u64 %r76,0; $L29: .loc 1 157 13 st.u64 [%r42+-8],%r202; .loc 1 160 23 mul.lo.u64 %r65,%r68,%r70; mov.u64 %r108,%r76; .loc 1 161 5 add.u64 %r76,%r76,1; .loc 1 162 7 setp.eq.u64 %r189,%r108,%r106; @ %r189 bra $L1; .loc 1 170 16 ld.u64 %r190,[%r42]; add.u64 %r67,%r190,1; st.u64 [%r42],%r67; shl.b64 %r118,%r76,3; .loc 1 171 23 add.u64 %r191,%frame,%r118; ld.u64 %r68,[%r191]; .loc 1 171 13 sub.u64 %r192,%r68,%r65; shl.b64 %r193,%r192,3; add.u64 %r73,%r73,%r193; .loc 1 174 32 add.u64 %r195,%r203,%r118; ld.u64 %r70,[%r195]; .loc 1 174 7 add.u64 %r42,%r42,8; setp.eq.u64 %r196,%r67,%r70; @ %r196 bra $L29; ld.u64 %r115,[%frame+240]; .loc 1 141 10 ld.f64 %r119,[%r73]; bra $L30; $L33: mov.f64 %r78,%r119; setp.eq.u32 %r197,1,1; bra $L10; $L1: .loc 1 1_gfortran_mminloc0_8_r8 .visible .func _gfortran_mminloc0_8_r8pred %r187; .reg .u64 %r190; .reg .u64u64pred %r222; .reg .u64 %r223; .reg .predu32 %r230; .reg .u64 %r231; .reg .u16 %r233; .reg .u32 %r234; .reg .pred %r235; .reg .u64 %r236; .reg .predpred %r250; .reg .u64 %r252; .reg .predu16 %r257; .reg .u32 %r258; .reg .pred %r259; .reg .pred %r260; .reg .u64 %r262; .reg .pred %r263; 
.reg .predpred %r282; .reg .u64 %r284; .reg .u64 %r285; .reg .pred %r287; mov.u64 %r160,%ar0; mov.u64 %r161,%ar1; mov.u64 %r162,%ar2; mov.u32 %r163,%ar3; .loc 1 202 6 setp.ne.u64 %r164,%r162,0; @ %r164 bra $L53; .loc 1 204 7 {1163; call _gfortran_minloc0_8_r8.loc 1 205 7 bra $L52; $L53: .loc 1 208 10 ld.s8 %r22,[%r161+28]; .loc 1 209 6 setp.gt.s32 %r168,%r22,0; @ %r168 bra $L55; .loc 1 210 5 cvta.const.u64 %r16stack; call _gfortran_runtime_error5: .loc 1 212 6 ld.u64 %r171,[%r160]; setp.ne.u64 %r172,%r171,0; @ %r172 bra $L56; .loc 1 214 7 st.u64 [%r160+48],%r171; add.u32 %r174,%r22,-1; cvt.s64.s32 %r175,%r174; st.u64 [%r160+56],%r175; mov.u64 %r176,1; st.u64 [%r160+40],%r176; .loc 1 215 28 cvt.u32.u64 %r177,%r176; st.u8 [%r160+28],%r177; .loc 1 216 24 st.u64 [%r160+8],%r171; .loc 1 217 29 cvt.u32.u32 %r182,%r22; cvt.s64.s8 %r181,%r182; mov.u64 %r180call (%value_in),_gfortrani_xmallocarray183,[%value_in]; } .loc 1 217 27 st.u64 [%r160],%r183; bra $L57; $L56: .loc 1 221 11 cvta.global.u64 %r185,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r186,[%r185+36]; setp.eq.u32 %r187,%r186,0; @ %r187 bra $L57; .loc 1 224 4 cvta.const.u64 %r190,$LC1; {161190; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19316161193190; call _gfortrani_bounds_equal_extents$L57: .loc 1 231 15 ld.u64 %r31,[%r162+16]; .loc 1 233 9 ld.u64 %r101,[%r162]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r195,%r32,-4; and.b32 %r196,%r195,-5; set.u32.eq.u32 %r198,%r196,0; neg.s32 %r199,%r198; .loc 1 235 22 add.u32 %r200,%r32,-1; set.u32.le.u32 %r202,%r200,1; neg.s32 %r203,%r202; .loc 1 235 6 cvt.u16.u32 %r205,%r199; cvt.u16.u32 %r206,%r203; or.b16 %r204,%r205,%r206; cvt.u32.u16 %r207,%r204; cvt.u16.u8 %r208,%r20758; .loc 1 237 7 setp.ne.u32 %r211,%r32,16; @ %r211 bra $L59; $L58: .loc 1 244 11 ld.u64 %r116,[%r160+40]; .loc 1 245 8 ld.u64 %r117,[%r160]; .loc 1 246 17 cvt.u32.u32 %r212,%r22; cvt.s64.s8 %r137,%r212; add.u64 %r52,%r161,40; add.u64 %r280,%frame,120; mov.u64 %r62,%r280; add.u64 %r67,%r162,40; mov.u64 %r70,%frame; add.u64 %r57,%frame,240; add.u64 %r59,%frame,360; mov.u64 %r58,%r59; .loc 1 246 10 mov.u64 %r122,0; .loc 1 251 16 mov.u64 %r221,%r122; bra $L60; $L59: .loc 1 242 5_gfortran_runtime_error248 18 ld.u64 %r215,[%r52]; st.u64 [%r62],%r215; .loc 1 249 20 ld.u64 %r217,[%r67]; mul.lo.u64 %r216,%r217,%r31; .loc 1 249 18 st.u64 [%r70],%r216; .loc 1 250 19 ld.u64 %r219,[%r52+16]; add.u64 %r218,%r219,1; ld.u64 %r220,[%r52+8]; sub.u64 %r45,%r218,%r220; .loc 1 250 17 st.u64 [%r57],%r45; .loc 1 251 16 st.u64 [%r58],%r221; .loc 1 252 10 setp.gt.s64 %r222,%r45,0; @ %r222 bra $L61; shl.b64 %r130,%r116,3; mov.u64 %r132,%r117; .loc 1 255 11 mov.u64 %r123,0; .loc 1 256 24 mov.u64 %r223,%r123; $L62: st.u64 [%r132],%r223; .loc 1 255 27 add.u64 %r123,%r123,1; .loc 1 255 4 add.u64 %r132,%r132,%r130; setp.ne.u64 %r224,%r123,%r137; @ %r224 bra $L62; bra $L52; $L61: .loc 1 246 26 add.u64 %r122,%r122,1; .loc 1 246 3 add.u64 %r52,%r52,24; add.u64 %r62,%r62,8; add.u64 %r67,%r67,24; add.u64 %r70,%r70,8; add.u64 %r57,%r57,8; add.u64 %r58,%r58,8; setp.ne.u64 %r225,%r122,%r137; @ %r225 bra $L60; .loc 1 261 8 ld.u64 %r98,[%r161]; shl.b64 %r134,%r116,3; mov.u64 %r153,%r117; .loc 1 264 10 mov.u64 %r121,0; .loc 1 265 23 mov.u64 %r226,%r121; $L64: st.u64 [%r153],%r226; .loc 1 264 26 add.u64 %r121,%r121,1; .loc 1 264 3 add.u64 %r153,%r153,%r134; setp.ne.u64 %r227,%r121,%r137; @ %r227 bra $L64; .loc 1 276 9 setp.eq.u64 %r228,%r98,0; @ %r228 bra $L52; .loc 1 331 19 ld.u64 %r77,[%frame+120]; .loc 1 331 9 shl.b64 
%r79,%r77,3; .loc 1 332 20 ld.u64 %r80,[%frame]; .loc 1 334 34 ld.u64 %r84,[%frame+240]; .loc 1 334 21 ld.u64 %r156,[%frame+360]; cvt.u32.u32 %r230,%r22; cvt.s64.s8 %r229,%r230; shl.b64 %r231,%r229,3; add.u64 %r48,%r231,%r59; .loc 1 272 12 mov.f64 %r108,0d7ff0000000000000; setp.eq.u32 %r282,1,0; add.u64 %r281,%frame,368; .loc 1 340 13 mov.u64 %r284,0; .loc 1 359 32 add.u64 %r285,%frame,240; .loc 1 308 12 setp.eq.u32 %r287,%r163,0; $L83: .loc 1 280 10 @ %r282 bra $L65; $L72: .loc 1 284 11 ld.s8 %r234,[%r101]; cvt.u16.u32 %r233,%r234; setp.eq.u16 %r235,%r233,0; @ %r235 bra $L66; .loc 1 287 8 ld.u64 %r236,[%r117]; setp.ne.u64 %r237,%r236,0; @ %r237 bra $L67; mov.u64 %r66,%r281; mov.u64 %r76,%r117; mov.u64 %r150,%r156; $L68: .loc 1 289 38 add.u64 %r239,%r150,1; .loc 1 289 27 st.u64 [%r76],%r239; .loc 1 288 7 add.u64 %r76,%r76,%r134; setp.eq.u64 %r240,%r48,%r66; @ %r240 bra $L67; .loc 1 289 34 ld.u64 %r150,[%r66]; add.u64 %r66,%r66,8; bra $L68; $L67: .loc 1 290 9 ld.f64 %r54,[%r98]; .loc 1 290 8 setp.le.f64 %r241,%r54,%r108; @ ! %r241 bra $L66; mov.u64 %r112,%r281; mov.u64 %r125,%r117; mov.u64 %r154,%r156; $L71: .loc 1 296 33 add.u64 %r243,%r154,1; .loc 1 296 22 st.u64 [%r125],%r243; .loc 1 295 9 add.u64 %r125,%r125,%r134; setp.eq.u64 %r244,%r48,%r112; @ %r244 bra $L84; .loc 1 296 29 ld.u64 %r154,[%r112]; add.u64 %r112,%r112,8; bra $L71; $L66: .loc 1 300 13 add.u64 %r98,%r98,%r79; .loc 1 301 14 add.u64 %r101,%r101,%r80; .loc 1 303 11 add.u64 %r156,%r156,1; .loc 1 303 4 st.u64 [%frame+360],%r156; setp.ne.u64 %r245,%r156,%r84; @ %r245 bra $L72; bra $L73; $L65: .loc 1 308 12 @ %r287 bra $L85; mov.u64 %r72,%r156; mov.u64 %r96,%r98; $L78: .loc 1 311 11 ld.s8 %r248,[%r101]; cvt.u16.u32 %r247,%r248; setp.eq.u16 %r249,%r247,0; @ %r249 bra $L75; .loc 1 311 12 ld.f64 %r65,[%r96]; .loc 1 311 11 setp.le.f64 %r250,%r65,%r108; @ ! %r250 bra $L75; mov.u64 %r124,%r281; mov.u64 %r36,%r117; mov.u64 %r136,%r72; .loc 1 314 18 mov.u64 %r120,0; $L77: .loc 1 315 36 add.u64 %r252,%r136,1; .loc 1 315 25 st.u64 [%r36],%r252; .loc 1 314 34 add.u64 %r120,%r120,1; .loc 1 314 11 add.u64 %r36,%r36,%r134; setp.ge.s64 %r253,%r120,%r137; @ %r253 bra $L86; .loc 1 315 32 ld.u64 %r136,[%r124]; add.u64 %r124,%r124,8; bra $L77; $L86: .loc 1 313 18 mov.f64 %r108,%r65; $L75: .loc 1 317 8 add.u64 %r96,%r96,%r79; .loc 1 319 13 add.u64 %r72,%r72,1; .loc 1 319 6 st.u64 [%frame+360],%r72; setp.ne.u64 %r254,%r72,%r84; @ %r254 bra $L78; .loc 1 317 8 sub.u64 %r255,%r84,%r156; mad.lo.u64 %r98,%r255,%r79,%r98; bra $L73; $L85: mov.u64 %r83,%r156; mov.u64 %r100,%r101; mov.u64 %r97,%r98; $L74: .loc 1 323 11 ld.s8 %r258,[%r100]; cvt.u16.u32 %r257,%r258; setp.eq.u16 %r259,%r257,0; @ %r259 bra $L79; .loc 1 323 12 ld.f64 %r74,[%r97]; .loc 1 323 11 setp.lt.f64 %r260,%r74,%r108; @ ! 
%r260 bra $L79; mov.u64 %r155,%r281; mov.u64 %r159,%r117; mov.u64 %r127,%r83; .loc 1 326 12 mov.u64 %r119,0; $L81: .loc 1 327 36 add.u64 %r262,%r127,1; .loc 1 327 25 st.u64 [%r159],%r262; .loc 1 326 28 add.u64 %r119,%r119,1; .loc 1 326 5 add.u64 %r159,%r159,%r134; setp.ge.s64 %r263,%r119,%r137; @ %r263 bra $L87; .loc 1 327 32 ld.u64 %r127,[%r155]; add.u64 %r155,%r155,8; bra $L81; $L87: .loc 1 325 12 mov.f64 %r108,%r74; $L79: .loc 1 331 9 add.u64 %r97,%r97,%r79; .loc 1 332 10 add.u64 %r100,%r100,%r80; .loc 1 334 14 add.u64 %r83,%r83,1; .loc 1 334 7 st.u64 [%frame+360],%r83; setp.ne.u64 %r264,%r83,%r84; @ %r264 bra $L74; sub.u64 %r141,%r84,%r156; .loc 1 331 9 mad.lo.u64 %r98,%r79,%r141,%r98; .loc 1 332 10 mad.lo.u64 %r101,%r80,%r141,%r101; $L73: mov.u64 %r107,%r281; .loc 1 325 12 mov.u64 %r93,%r80; .loc 1 334 34 mov.u64 %r95,%r84; mov.u64 %r91,%r77; mov.u64 %r105,8; .loc 1 335 9 mov.u64 %r106,0; $L82: .loc 1 340 13 st.u64 [%r107+-8],%r284; .loc 1 343 23 mul.lo.u64 %r86,%r91,%r95; .loc 1 344 24 mul.lo.u64 %r88,%r95,%r93; .loc 1 345 5 add.u64 %r106,%r106,1; .loc 1 346 7 setp.eq.u64 %r269,%r106,%r137; @ %r269 bra $L52; .loc 1 354 16 ld.u64 %r270,[%r107]; add.u64 %r90,%r270,1; st.u64 [%r107],%r90; .loc 1 355 23 add.u64 %r272,%r280,%r105; ld.u64 %r91,[%r272]; .loc 1 355 13 sub.u64 %r273,%r91,%r86; shl.b64 %r274,%r273,3; add.u64 %r98,%r98,%r274; .loc 1 356 24 add.u64 %r275,%frame,%r105; ld.u64 %r93,[%r275]; .loc 1 356 14 sub.u64 %r276,%r93,%r88; add.u64 %r101,%r101,%r276; .loc 1 359 32 add.u64 %r278,%r285,%r105; ld.u64 %r95,[%r278]; .loc 1 359 7 add.u64 %r107,%r107,8; add.u64 %r105,%r105,8; setp.eq.u64 %r279,%r90,%r95; @ %r279 bra $L82; .loc 1 334 21 ld.u64 %r156,[%frame+360]; bra $L83; $L84: .loc 1 290 9 mov.f64 %r108,%r54; setp.eq.u32 %r282,1,1; bra $L65; $L52: .loc 1 362_gfortran_sminloc0_8_r8 .visible .func _gfortran_sminloc0_8_r84predpred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L114; .loc 1 378 20 ld.u32 %r40,[%r37380 7 {_gfortran_minloc0_8_r8.loc 1 381 7 bra $L113; $L115: .loc 1 384 8 ld.s8 %r30,[%r36+28]; .loc 1 386 6 setp.gt.s64 %r46,%r30,0; @ %r46 bra $L1_gfortran_runtime_error117: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L118; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29 mov.u64 call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L119; $L118: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36.loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L119: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,3; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 mov.u64 %r68,%r32; $L120: st.u64 [%r29],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L120; $L113: .loc 1 406 minloc0_16_r8.o/1622802232_gfortran_minloc0_16_r8 .visible .func _gfortran_minloc0_16_fortran/generated/minloc0_16_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_16_r8 .visible .func _gfortran_mminloc0_16_r8_gfortran_sminloc0_16_r8 .visible .func _gfortran_sminloc0_16_r8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func 
_gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_16_r8 .visible .func _gfortran_minloc0_16_r8u64 %r94; .reg .u64 %r105; .reg .u64 %r108predpred %r160; .reg .predpredu32 %r167; .reg .u64 %r168; .reg .pred %r171; .reg .pred %r172; .reg .u64 %r174; .reg .u64 %r175; .reg .pred %r176; .reg .pred %r177; .reg .predu64 %r193; .reg .predmov.u64 %r124,%ar0; mov.u64 %r125,%ar1; mov.u32 %r126,%ar2; .loc 1 50 10 ld.s8 %r22,[%r125+28]; .loc 1 50 8 cvt.u32.u32 %r127,%r22; cvt.s64.s8 %r79,%r127; .loc 1 51 6 setp.gt.s64 %r128,%r79,0; @ %r128 bra $L2; .loc 1 52 5 cvta.const.u64 %r112_gfortran_runtime_errorr23,[%r124]; .loc 1 54 6 setp.ne.u64 %r131,%r23,0; @ %r131 bra $L3; .loc 1 56 7 st.u64 [%r124+48],%r23; add.u64 %r133,%r79,-1; st.u64 [%r124+56],%r133; mov.u64 %r134,1; st.u64 [%r124+40],%r134; .loc 1 57 28 cvt.u32.u64 %r135,%r134; st.u8 [%r124+28],%r135; .loc 1 58 24 st.u64 [%r124+8],%r23; .loc 1 59 29 mov.u64 %r138,1679call (%value_in),_gfortrani_xmallocarray1,[%value_in]; } mov.u64 %r23,%r141; .loc 1 59 27 st.u64 [%r124],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r143,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r144,[%r143+36]; setp.eq.u32 %r145,%r144,0; @ %r145 bra $L4; .loc 1 64r148; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r124]; $L4: .loc 1 68 11 ld.u64 %r80,[%r124+40]; add.u64 %r51,%r125,40; mov.u64 %r42,%frame; add.u64 %r76,%frame,120; .loc 1 70 10 mov.u64 %r108,0; add.u64 %r205,%frame,240; .loc 1 74 16 mov.u64 %r156,%r108; bra $L7; $L32: mov.u64 %r108,%r88; $L7: .loc 1 72 18 ld.u64 %r149,[%r51]; st.u64 [%r42],%r149; .loc 1 73 19 ld.u64 %r151,[%r51+16]; add.u64 %r150,%r151,1; ld.u64 %r152,[%r51+8]; sub.u64 %r31,%r150,%r152; .loc 1 73 17 st.u64 [%r76],%r31; .loc 1 74 16 shl.b64 %r154,%r108,3; add.u64 %r155,%r205,%r154; st.u64 [%r155],%r156; .loc 1 75 10 setp.gt.s64 %r157,%r31,0; @ %r157 bra $L5; shl.b64 %r120,%r80,4; mov.u64 %r105,%r23; .loc 1 78 11 mov.u64 %r89,0; .loc 1 79 24 mov.u64 %r158,%r89; $L6: st.u64 [%r105],%r158; st.u64 [%r105+8],%r158; .loc 1 78 27 add.u64 %r89,%r89,1; .loc 1 78 4 add.u64 %r105,%r105,%r120; setp.ne.u64 %r160,%r79,%r89; @ %r160 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r88,%r108,1; .loc 1 70 3 add.u64 %r51,%r51,24; add.u64 %r42,%r42,8; add.u64 %r76,%r76,8; setp.ne.u64 %r161,%r79,%r88; @ %r161 bra $L32; .loc 1 84 8 ld.u64 %r69,[%r125]; shl.b64 %r35,%r80,4; mov.u64 %r48,%r23; .loc 1 87 10 mov.u64 %r87,0; .loc 1 88 23 mov.u64 %r162,1; mov.u64 %r163,%r87; $L8: st.u64 [%r48],%r162; st.u64 [%r48+8],%r163; mov.u64 %r109,%r87; .loc 1 87 26 add.u64 %r87,%r87,1; .loc 1 87 3 add.u64 %r48,%r48,%r35; setp.ne.u64 %r164,%r108,%r109; @ %r164 bra $L8; .loc 1 101 9 setp.eq.u64 %r165,%r69,0; @ %r165 bra $L1; .loc 1 149 19 ld.u64 %r54,[%frame]; .loc 1 149 9 shl.b64 %r56,%r54,3; .loc 1 151 34 ld.u64 %r58,[%frame+120]; ld.u64 %r121,[%frame+240]; .loc 1 141 10 ld.f64 %r122,[%r69]; cvt.u32.u32 %r167,%r22; cvt.s64.s8 %r166,%r167; shl.b64 %r168,%r166,3; add.u64 %r46,%r205,%r168; .loc 1 97 12 mov.f64 %r74,0d7ff0000000000000; setp.eq.u32 %r207,1,0; add.u64 %r206,%frame,248; .loc 1 157 13 mov.u64 %r210,0; .loc 1 
174 32 add.u64 %r211,%frame,120; .loc 1 126 10 setp.eq.u32 %r213,%r126,0; $L30: .loc 1 106 10 @ %r207 bra $L10; mov.u64 %r37,%r121; mov.u32 %r85,0; .loc 1 110 12 mov.u32 %r209,1; $L17: .loc 1 118 13 add.u64 %r123,%r69,%r56; .loc 1 110 11 setp.ge.f64 %r171,%r74,%r122; @ ! %r171 bra $L51; setp.eq.u32 %r172,%r85,0; selp.u64 %r121,%r121,%r37,%r172; mov.u64 %r78,%r206; mov.u64 %r94,%r23; mov.u64 %r117,%r121; $L15: .loc 1 115 36 add.u64 %r174,%r117,1; st.u64 [%r94],%r174; shr.s64 %r175,%r174,63; st.u64 [%r94+8],%r175; .loc 1 114 5 add.u64 %r94,%r94,%r35; setp.eq.u64 %r176,%r46,%r78; @ %r176 bra $L33; .loc 1 115 32 ld.u64 %r117,[%r78]; add.u64 %r78,%r78,8; bra $L15; $L51: .loc 1 120 11 add.u64 %r37,%r37,1; .loc 1 120 4 setp.eq.u64 %r177,%r37,%r58; @ %r177 bra $L16; .loc 1 110 12 ld.f64 %r122,[%r123]; mov.u64 %r69,%r123; mov.u32 %r85,%r209; bra $L17; $L10: .loc 1 126 10 @ %r213 bra $L34; mov.u64 %r49,%r121; mov.u64 %r67,%r69; $L24: .loc 1 129 9 setp.ge.f64 %r179,%r74,%r122; @ ! %r179 bra $L19; mov.u64 %r119,%r206; mov.u64 %r70,%r23; mov.u64 %r114,%r49; $L21: .loc 1 133 34 add.u64 %r181,%r114,1; st.u64 [%r70],%r181; shr.s64 %r182,%r181,63; st.u64 [%r70+8],%r182; .loc 1 132 3 add.u64 %r70,%r70,%r35; setp.eq.u64 %r183,%r46,%r119; @ %r183 bra $L35; .loc 1 133 30 ld.u64 %r114,[%r119]; add.u64 %r119,%r119,8; bra $L21; $L35: .loc 1 131 10 mov.f64 %r74,%r122; $L19: .loc 1 135 11 add.u64 %r67,%r67,%r56; .loc 1 137 9 add.u64 %r49,%r49,1; .loc 1 137 2 st.u64 [%frame+240],%r49; setp.ne.u64 %r184,%r49,%r58; @ %r184 bra $L22; .loc 1 135 11 sub.u64 %r185,%r58,%r121; mad.lo.u64 %r69,%r185,%r56,%r69; bra $L23; $L22: .loc 1 129 10 ld.f64 %r122,[%r67]; bra $L24; $L34: mov.u64 %r57,%r121; mov.u64 %r68,%r69; $L18: .loc 1 141 9 setp.gt.f64 %r187,%r74,%r122; @ ! %r187 bra $L25; mov.u64 %r112,%r206; mov.u64 %r116,%r23; mov.u64 %r111,%r57; $L27: .loc 1 145 34 add.u64 %r189,%r111,1; st.u64 [%r116],%r189; shr.s64 %r190,%r189,63; st.u64 [%r116+8],%r190; .loc 1 144 3 add.u64 %r116,%r116,%r35; setp.eq.u64 %r191,%r46,%r112; @ %r191 bra $L36; .loc 1 145 30 ld.u64 %r111,[%r112]; add.u64 %r112,%r112,8; bra $L27; $L36: .loc 1 143 10 mov.f64 %r74,%r122; $L25: .loc 1 149 9 add.u64 %r68,%r68,%r56; .loc 1 151 14 add.u64 %r57,%r57,1; .loc 1 151 7 st.u64 [%frame+240],%r57; setp.ne.u64 %r192,%r57,%r58; @ %r192 bra $L28; .loc 1 149 9 sub.u64 %r193,%r58,%r121; mad.lo.u64 %r69,%r193,%r56,%r69; bra $L23; $L28: .loc 1 141 10 ld.f64 %r122,[%r68]; bra $L18; $L16: st.u64 [%frame+240],%r58; .loc 1 118 13 mov.u64 %r69,%r123; $L23: mov.u64 %r71,%r206; .loc 1 151 34 mov.u64 %r65,%r58; mov.u64 %r63,%r54; .loc 1 152 9 mov.u64 %r73,0; $L29: .loc 1 157 13 st.u64 [%r71+-8],%r210; .loc 1 160 23 mul.lo.u64 %r60,%r63,%r65; mov.u64 %r110,%r73; .loc 1 161 5 add.u64 %r73,%r73,1; .loc 1 162 7 setp.eq.u64 %r197,%r110,%r108; @ %r197 bra $L1; .loc 1 170 16 ld.u64 %r198,[%r71]; add.u64 %r62,%r198,1; st.u64 [%r71],%r62; shl.b64 %r40,%r73,3; .loc 1 171 23 add.u64 %r199,%frame,%r40; ld.u64 %r63,[%r199]; .loc 1 171 13 sub.u64 %r200,%r63,%r60; shl.b64 %r201,%r200,3; add.u64 %r69,%r69,%r201; .loc 1 174 32 add.u64 %r203,%r211,%r40; ld.u64 %r65,[%r203]; .loc 1 174 7 add.u64 %r71,%r71,8; setp.eq.u64 %r204,%r62,%r65; @ %r204 bra $L29; ld.u64 %r121,[%frame+240]; .loc 1 141 10 ld.f64 %r122,[%r69]; bra $L30; $L33: mov.f64 %r74,%r122; setp.eq.u32 %r207,1,1; bra $L10; $L1: .loc 1 1_gfortran_mminloc0_16_r8 .visible .func _gfortran_mminloc0_16_r8pred %r168; .reg .predpredpred %r226; .reg .u64 %r227; .reg .pred %r229; .reg .predpred %r234; .reg .u64u16pred %r248; .reg .pred %r249; 
.reg .u64 %r251; .reg .u64 %r252; .reg .pred %r253; .reg .pred %r254; .reg .u16 %r256; .reg .u32predpred %r274; .reg .predu64 %r283; .reg .u64u64 %r287; .reg .u64 %r289; .reg .pred %r290; .reg .u64 %r291; .reg .u64 %r292; .reg .predpred %r298; mov.u64 %r164,%ar0; mov.u64 %r165,%ar1; mov.u64 %r166,%ar2; mov.u32 %r167,%ar3; .loc 1 202 6 setp.ne.u64 %r168,%r166,0; @ %r168 bra $L1617; call _gfortran_minloc0_16_r8.loc 1 205 7 bra $L52; $L53: .loc 1 208 10 ld.s8 %r22,[%r165+28]; .loc 1 209 6 setp.gt.s32 %r172,%r22,0; @ %r172 bra $L55; .loc 1 210 5 cvta.const.u64 %r173173_gfortran_runtime_error5: .loc 1 212 6 ld.u64 %r175,[%r164]; setp.ne.u64 %r176,%r175,0; @ %r176 bra $L56; .loc 1 214 7 st.u64 [%r164+48],%r175; add.u32 %r178,%r22,-1; cvt.s64.s32 %r179,%r178; st.u64 [%r164+56],%r179; mov.u64 %r180,1; st.u64 [%r164+40],%r180; .loc 1 215 28 cvt.u32.u64 %r181,%r180; st.u8 [%r164+28],%r181; .loc 1 216 24 st.u64 [%r164+8],%r175; .loc 1 217 29 cvt.u32.u32 %r186,%r22; cvt.s64.s8 %r185,%r186; mov.u64 %r184,16call (%value_in),_gfortrani_xmallocarray187,[%value_in]; } .loc 1 217 27 st.u64 [%r164],%r187; bra $L57; $L56: .loc 1 221 11 cvta.global.u64 %r189,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r190,[%r189+36]; setp.eq.u32 %r191,%r190,0164r194; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19716r197194; call _gfortrani_bounds_equal_extents$L57: .loc 1 231 15 ld.u64 %r31,[%r166+16]; .loc 1 233 9 ld.u64 %r104,[%r166]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r199,%r32,-4; and.b32 %r200,%r199,-5; set.u32.eq.u32 %r202,%r200,0; neg.s32 %r203,%r202; .loc 1 235 22 add.u32 %r204,%r32,-1; set.u32.le.u32 %r206,%r204,1; neg.s32 %r207,%r206; .loc 1 235 6 cvt.u16.u32 %r209,%r203; cvt.u16.u32 %r210,%r207; or.b16 %r208,%r209,%r210; cvt.u32.u16 %r211,%r208; cvt.u16.u8 %r212,%r211; setp.ne.u16 %r213,%r212,0; @ %r213 bra $L58; .loc 1 237 7 setp.ne.u32 %r215,%r32,16; @ %r215 bra $L59; $L58: .loc 1 244 11 ld.u64 %r119,[%r164+40]; .loc 1 245 8 ld.u64 %r120,[%r164]; .loc 1 246 17 cvt.u32.u32 %r216,%r22; cvt.s64.s8 %r137,%r216; add.u64 %r153,%r165,40; add.u64 %r291,%frame,120; mov.u64 %r150,%r291; add.u64 %r151,%r166,40; mov.u64 %r65,%frame; add.u64 %r62,%frame,240; add.u64 %r48,%frame,360; mov.u64 %r54,%r48; .loc 1 246 10 mov.u64 %r126,0; .loc 1 251 16 mov.u64 %r225,%r126; bra $L60; $L59: .loc 1 242 5 cvta.const.u64 %r217,$LC3stack; call _gfortran_runtime_error248 18 ld.u64 %r219,[%r153]; st.u64 [%r150],%r219; .loc 1 249 20 ld.u64 %r221,[%r151]; mul.lo.u64 %r220,%r221,%r31; .loc 1 249 18 st.u64 [%r65],%r220; .loc 1 250 19 ld.u64 %r223,[%r153+16]; add.u64 %r222,%r223,1; ld.u64 %r224,[%r153+8]; sub.u64 %r47,%r222,%r224; .loc 1 250 17 st.u64 [%r62],%r47; .loc 1 251 16 st.u64 [%r54],%r225; .loc 1 252 10 setp.gt.s64 %r226,%r47,0; @ %r226 bra $L61; shl.b64 %r132,%r119,4; mov.u64 %r136,%r120; .loc 1 255 11 mov.u64 %r127,0; .loc 1 256 24 mov.u64 %r227,%r127; $L62: st.u64 [%r136],%r227; st.u64 [%r136+8],%r227; .loc 1 255 27 add.u64 %r127,%r127,1; .loc 1 255 4 add.u64 %r136,%r136,%r132; setp.ne.u64 %r229,%r127,%r137; @ %r229 bra $L62; bra $L52; $L61: .loc 1 246 26 add.u64 %r126,%r126,1; .loc 1 246 3 add.u64 %r153,%r153,24; add.u64 %r150,%r150,8; add.u64 %r151,%r151,24; add.u64 %r65,%r65,8; add.u64 %r62,%r62,8; add.u64 %r54,%r54,8; setp.ne.u64 %r230,%r126,%r137; @ %r230 bra $L60; .loc 1 261 8 ld.u64 %r102,[%r165]; shl.b64 %r159,%r119,4; mov.u64 %r157,%r120; .loc 1 264 10 mov.u64 %r125,0; .loc 1 265 23 mov.u64 %r231,%r125; $L64: st.u64 [%r157],%r231; st.u64 [%r157+8],%r231; .loc 1 264 26 
add.u64 %r125,%r125,1; .loc 1 264 3 add.u64 %r157,%r157,%r159; setp.ne.u64 %r233,%r125,%r137; @ %r233 bra $L64; .loc 1 276 9 setp.eq.u64 %r234,%r102,0; @ %r234 bra $L52; .loc 1 331 19 ld.u64 %r82,[%frame+120]; .loc 1 331 9 shl.b64 %r84,%r82,3; .loc 1 332 20 ld.u64 %r85,[%frame]; .loc 1 334 34 ld.u64 %r87,[%frame+240]; .loc 1 334 21 ld.u64 %r161,[%frame+360]; cvt.u32.u32 %r236,%r22; cvt.s64.s8 %r235,%r236; shl.b64 %r237,%r235,3; add.u64 %r68,%r237,%r48; .loc 1 272 12 mov.f64 %r110,0d7ff0000000000000; setp.eq.u32 %r293,1,0; add.u64 %r292,%frame,368; .loc 1 340 13 mov.u64 %r295,0; .loc 1 359 32 add.u64 %r296,%frame,240; .loc 1 308 12 setp.eq.u32 %r298,%r167,0; $L83: .loc 1 280 10 @ %r293 bra $L65; $L72: .loc 1 284 11 ld.s8 %r240,[%r104]; cvt.u16.u32 %r239,%r240; setp.eq.u16 %r241,%r239,0; @ %r241 bra $L66; .loc 1 287 8 ld.u64 %r242,[%r120]; ld.u64 %r243,[%r120+8]; or.b64 %r242,%r242,%r243; setp.ne.u64 %r244,%r242,0; @ %r244 bra $L67; mov.u64 %r99,%r292; mov.u64 %r105,%r120; mov.u64 %r152,%r161; $L68: .loc 1 289 38 add.u64 %r246,%r152,1; st.u64 [%r105],%r246; shr.s64 %r247,%r246,63; st.u64 [%r105+8],%r247; .loc 1 288 7 add.u64 %r105,%r105,%r159; setp.eq.u64 %r248,%r68,%r99; @ %r248 bra $L67; .loc 1 289 34 ld.u64 %r152,[%r99]; add.u64 %r99,%r99,8; bra $L68; $L67: .loc 1 290 9 ld.f64 %r55,[%r102]; .loc 1 290 8 setp.le.f64 %r249,%r55,%r110; @ ! %r249 bra $L66; mov.u64 %r129,%r292; mov.u64 %r130,%r120; mov.u64 %r156,%r161; $L71: .loc 1 296 33 add.u64 %r251,%r156,1; st.u64 [%r130],%r251; shr.s64 %r252,%r251,63; st.u64 [%r130+8],%r252; .loc 1 295 9 add.u64 %r130,%r130,%r159; setp.eq.u64 %r253,%r68,%r129; @ %r253 bra $L84; .loc 1 296 29 ld.u64 %r156,[%r129]; add.u64 %r129,%r129,8; bra $L71; $L66: .loc 1 300 13 add.u64 %r102,%r102,%r84; .loc 1 301 14 add.u64 %r104,%r104,%r85; .loc 1 303 11 add.u64 %r161,%r161,1; .loc 1 303 4 st.u64 [%frame+360],%r161; setp.ne.u64 %r254,%r161,%r87; @ %r254 bra $L72; bra $L73; $L65: .loc 1 308 12 @ %r298 bra $L85; mov.u64 %r72,%r161; mov.u64 %r100,%r102; $L78: .loc 1 311 11 ld.s8 %r257,[%r104]; cvt.u16.u32 %r256,%r257; setp.eq.u16 %r258,%r256,0; @ %r258 bra $L75; .loc 1 311 12 ld.f64 %r66,[%r100]; .loc 1 311 11 setp.le.f64 %r259,%r66,%r110; @ ! %r259 bra $L75; mov.u64 %r49,%r292; mov.u64 %r64,%r120; mov.u64 %r149,%r72; .loc 1 314 18 mov.u64 %r123,0; $L77: .loc 1 315 36 add.u64 %r261,%r149,1; st.u64 [%r64],%r261; shr.s64 %r262,%r261,63; st.u64 [%r64+8],%r262; .loc 1 314 34 add.u64 %r123,%r123,1; .loc 1 314 11 add.u64 %r64,%r64,%r159; setp.ge.s64 %r263,%r123,%r137; @ %r263 bra $L86; .loc 1 315 32 ld.u64 %r149,[%r49]; add.u64 %r49,%r49,8; bra $L77; $L86: .loc 1 313 18 mov.f64 %r110,%r66; $L75: .loc 1 317 8 add.u64 %r100,%r100,%r84; .loc 1 319 13 add.u64 %r72,%r72,1; .loc 1 319 6 st.u64 [%frame+360],%r72; setp.ne.u64 %r264,%r72,%r87; @ %r264 bra $L78; .loc 1 317 8 sub.u64 %r265,%r87,%r161; mad.lo.u64 %r102,%r265,%r84,%r102; bra $L73; $L85: mov.u64 %r73,%r161; mov.u64 %r103,%r104; mov.u64 %r101,%r102; $L74: .loc 1 323 11 ld.s8 %r268,[%r103]; cvt.u16.u32 %r267,%r268; setp.eq.u16 %r269,%r267,0; @ %r269 bra $L79; .loc 1 323 12 ld.f64 %r77,[%r101]; .loc 1 323 11 setp.lt.f64 %r270,%r77,%r110; @ ! 
[Binary archive payload: raw nvptx PTX object code, heavily corrupted in extraction and not reproduced here. Recoverable information from this span:

- Archive members: minloc0_4_r10.o, minloc0_8_r16.o, minloc1_4_i1.o, minloc1_8_i1.o, minloc1_16_i1.o (ar timestamps 1622802232/1622802233), preceded by the tail of the _gfortran_minloc0_16_r8 / _gfortran_sminloc0_16_r8 PTX bodies from the earlier members.
- Each minloc1_<N>_i1.o member carries PTX for _gfortran_minloc1_<N>_i1, _gfortran_mminloc1_<N>_i1 and _gfortran_sminloc1_<N>_i1, compiled from ../../../../libgfortran/generated/minloc1_<N>_i1.c.
- External symbols declared and referenced: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents.
- Embedded diagnostic strings, decoded from the byte-array constants (several are truncated in the dump):
  - "rank of return array incorrect in MINLOC intrinsic: is %ld, should be %l..." (truncated)
  - "MINLOC"
  - "... incorrect in MINLOC intrinsic: is %ld, should be between 1 and %l..." (leading words and tail truncated)
  - "Funny sized logical array"
  - "MASK argument"
  - "Incorrect extent in return value of MINLOC intrinsic in dimension %ld: is %ld, should be %l..." (truncated)
  - "return val..." (truncated)]
%r240 bra $L11; bra $L10; $L4: add.u64 %r242,%r25,%r25; add.u64 %r243,%r242,%r25; shl.b64 %r244,%r243,3; add.u64 %r245,%r244,40; add.u64 %r99,%r187,%r245; shl.b64 %r246,%r25,3; add.u64 %r68,%r246,-8; add.u64 %r402,%frame,120; add.u64 %r92,%r402,%r68; add.u64 %r401,%frame,240; add.u64 %r61,%r401,%r68; cvt.u32.u32 %r250,%r22; cvt.s64.s8 %r249,%r250; add.u64 %r252,%r249,%r249; add.u64 %r253,%r252,%r249; shl.b64 %r254,%r253,3; add.u64 %r255,%r187,40; add.u64 %r184,%r254,%r255; .loc 1 87 12 mov.u64 %r413,0; $L14: .loc 1 83 18 ld.u64 %r256,[%r99]; st.u64 [%r92],%r256; .loc 1 84 19 ld.u64 %r258,[%r99+16]; add.u64 %r257,%r258,1; ld.u64 %r259,[%r99+8]; sub.u64 %r44,%r257,%r259; .loc 1 86 10 setp.lt.s64 %r260,%r44,0; @ %r260 bra $L12; .loc 1 84 17 st.u64 [%r61],%r44; bra $L13; $L12: .loc 1 87 12 st.u64 [%r61],%r413; $L13: .loc 1 81 3 add.u64 %r99,%r99,24; add.u64 %r92,%r92,8; add.u64 %r61,%r61,8; setp.ne.u64 %r262,%r99,%r184; @ %r262 bra $L14; bra $L53; $L10: .loc 1 94 7 setp.le.s64 %r263,%r109,0; @ %r263 bra $L16; add.u64 %r401,%frame,240; $L40: add.u64 %r136,%r186,40; mov.u64 %r122,%r401; cvt.u32.u32 %r265,%r22; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r270,%r186,16; add.u64 %r100,%r269,%r270; .loc 1 97 10 mov.u64 %r98,1; .loc 1 101 4 mov.u64 %r293,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r271,0; st.u64 [%r186+8],%r271; .loc 1 106 28 cvt.u16.u32 %r274,%r22; add.u16 %r273,%r274,-1; cvt.u32.u16 %r275,%r273; st.u8 [%r186+28],%r275; .loc 1 108 20 add.u32 %r276,%r22,-2; cvt.s64.s32 %r53,%r276; add.u64 %r278,%r53,%r53; add.u64 %r279,%r278,%r53; shl.b64 %r280,%r279,3; add.u64 %r281,%r186,%r280; .loc 1 108 67 shl.b64 %r283,%r53,3; add.u64 %r284,%frame,%r283; .loc 1 108 59 ld.u64 %r286,[%r281+40]; ld.u64 %r287,[%r284+240]; mul.lo.u64 %r115,%r286,%r287; .loc 1 110 29 mov.u64 %r289,16289; call (%value_in),_gfortrani_xmallocarray290,[%value_in]; } .loc 1 110 27 st.u64 [%r186],%r290; .loc 1 111 10 setp.eq.u64 %r292,%r115,0; @ ! 
%r292 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r98,%r49,%r98; $L17: .loc 1 101 4 st.u64 [%r136+8],%r293; ld.u64 %r49,[%r122]; add.u64 %r294,%r49,-1; st.u64 [%r136+16],%r294; st.u64 [%r136],%r98; .loc 1 94 7 add.u64 %r136,%r136,24; add.u64 %r122,%r122,8; setp.ne.u64 %r295,%r100,%r136; @ %r295 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r186+48],%r115; mov.u64 %r297,-1; st.u64 [%r186+56],%r297; mov.u64 %r298,1; st.u64 [%r186+40],%r298; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r59,[%r186+28]; .loc 1 121 10 setp.eq.u64 %r299,%r59,%r109; @ %r299 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r109; st.u64 [%stack],%r59;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r302,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r303,[%r302+36]; setp.eq.u32 %r304,%r303,0; @ %r304 bra $L21; .loc 1 128 2 add.u64 %r401,%frame,240; cvta.const.u64 %r308,$LC2r401307308; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r310,%r109,0; @ %r310 bra $L24; $L27: .loc 1 68 9 add.u64 %r311,%r31,1; .loc 1 68 7 sub.u64 %r111,%r311,%r33; max.s64 %r78,%r111,0; .loc 1 140 8 ld.u64 %r126,[%r187]; .loc 1 141 8 ld.u64 %r127,[%r186]; ld.u64 %r148,[%frame+360]; .loc 1 204 22 ld.u64 %r71,[%frame+120]; .loc 1 205 22 ld.u64 %r73,[%frame]; .loc 1 205 12 shl.b64 %r74,%r73,4; .loc 1 207 32 ld.u64 %r65,[%frame+240]; .loc 1 214 23 mul.lo.u64 %r312,%r65,%r71; .loc 1 214 9 neg.s64 %r134,%r312; .loc 1 215 23 mul.lo.u64 %r313,%r65,%r73; .loc 1 215 9 shl.b64 %r314,%r313,4; neg.s64 %r139,%r314; setp.gt.s64 %r403,%r111,0; setp.le.s64 %r404,%r109,1; .loc 1 180 9 setp.eq.u32 %r406,%r189,0; .loc 1 190 33 mov.u64 %r407,1; cvt.u32.u32 %r408,%r22; cvt.s64.s8 %r409,%r408; add.u64 %r410,%r409,-1; add.u64 %r411,%frame,120; add.u64 %r412,%frame,240; bra $L25; $L24: add.u64 %r160,%frame,360; add.u64 %r159,%r186,40; mov.u64 %r157,%frame; cvt.u32.u32 %r316,%r22; cvt.s64.s8 %r315,%r316; add.u64 %r137,%r315,-1; .loc 1 132 3 mov.u64 %r128,0; add.u64 %r401,%frame,240; .loc 1 134 16 mov.u64 %r317,%r128; $L26: st.u64 [%r160],%r317; .loc 1 135 18 ld.u64 %r318,[%r159]; st.u64 [%r157],%r318; .loc 1 136 17 shl.b64 %r320,%r128,3; add.u64 %r321,%r401,%r320; .loc 1 136 10 ld.u64 %r322,[%r321]; setp.le.s64 %r323,%r322,0; @ %r323 bra $L1; .loc 1 132 26 add.u64 %r128,%r128,1; .loc 1 132 3 add.u64 %r160,%r160,8; add.u64 %r159,%r159,24; add.u64 %r157,%r157,8; setp.ne.u64 %r324,%r128,%r137; @ %r324 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r403 bra $L28; .loc 1 159 10 mov.u64 %r326,0; st.u64 [%r127],%r326; st.u64 [%r127+8],%r326; bra $L29; $L28: .loc 1 180 9 @ %r406 bra $L30; .loc 1 190 33 mov.u64 %r121,%r126; mov.u64 %r391,%r407; mov.u64 %r392,0; .loc 1 155 9 mov.u32 %r101,127; .loc 1 157 9 mov.u64 %r387,1; mov.u64 %r388,%r392; $L32: .loc 1 183 9 ld.s8 %r64,[%r121]; .loc 1 183 8 cvt.u16.u32 %r329,%r64; cvt.u16.u32 %r330,%r101; setp.gt.s16 %r331,%r329,%r330; .loc 1 186 16 selp.u64 %r387,%r387,%r391,%r331; selp.u64 %r388,%r388,%r392,%r331; selp.u32 %r101,%r101,%r64,%r331; .loc 1 181 33 add.u64 %r121,%r121,%r113; .loc 1 181 8 add.u64 %r334,%r391,%r407; set.u32.lt.u64 %r339,%r334,%r391; cvt.s64.s32 %r337,%r339; mov.u64 %r391,%r334; sub.u64 %r392,%r392,%r337; add.u64 %r344,%r391,-1; setp.gt.s64 %r346,%r78,%r344; @ %r346 bra $L32; bra $L33; $L30: .loc 1 190 33 mov.u64 %r118,%r126; mov.u64 %r389,%r407; mov.u64 %r390,0; .loc 1 155 9 mov.u32 %r102,127; .loc 1 157 9 mov.u64 %r387,1; mov.u64 %r388,%r390; $L35: .loc 1 192 9 ld.s8 %r67,[%r118]; .loc 1 192 8 cvt.u16.u32 %r347,%r67; cvt.u16.u32 %r348,%r102; setp.ge.s16 %r349,%r347,%r348; 
.loc 1 195 16 selp.u64 %r387,%r387,%r389,%r349; selp.u64 %r388,%r388,%r390,%r349; selp.u32 %r102,%r102,%r67,%r349; .loc 1 190 33 add.u64 %r118,%r118,%r113; .loc 1 190 8 add.u64 %r352,%r389,%r407; set.u32.lt.u64 %r357,%r352,%r389; cvt.s64.s32 %r355,%r357; mov.u64 %r389,%r352; sub.u64 %r390,%r390,%r355; add.u64 %r362,%r389,-1; setp.gt.s64 %r364,%r78,%r362; @ %r364 bra $L35; $L33: .loc 1 199 12 st.u64 [%r127],%r387; st.u64 [%r127+8],%r388; $L29: .loc 1 203 15 add.u64 %r148,%r148,1; .loc 1 204 12 add.u64 %r126,%r126,%r71; .loc 1 205 12 add.u64 %r127,%r127,%r74; .loc 1 207 13 setp.ne.u64 %r367,%r65,%r148; @ %r367 bra $L25; .loc 1 214 9 add.u64 %r123,%r126,%r134; .loc 1 215 9 add.u64 %r124,%r127,%r139; .loc 1 217 7 @ %r404 bra $L1; add.u64 %r154,%frame,368; mov.u64 %r108,8; .loc 1 216 5 mov.u64 %r125,1; .loc 1 211 13 mov.u64 %r405,0; bra $L38; $L39: st.u64 [%r154],%r405; .loc 1 214 23 mul.lo.u64 %r373,%r85,%r84; .loc 1 214 9 sub.u64 %r123,%r126,%r373; .loc 1 215 23 mul.lo.u64 %r374,%r87,%r84; .loc 1 215 9 shl.b64 %r375,%r374,4; sub.u64 %r124,%r127,%r375; .loc 1 216 5 add.u64 %r125,%r125,1; .loc 1 217 7 add.u64 %r154,%r154,8; add.u64 %r108,%r108,8; setp.eq.u64 %r376,%r125,%r410; @ %r376 bra $L1; $L38: .loc 1 225 16 ld.u64 %r377,[%r154]; add.u64 %r84,%r377,1; st.u64 [%r154],%r84; .loc 1 226 23 add.u64 %r379,%r411,%r108; ld.u64 %r85,[%r379]; .loc 1 226 13 add.u64 %r126,%r123,%r85; .loc 1 227 23 add.u64 %r380,%frame,%r108; ld.u64 %r87,[%r380]; .loc 1 227 13 shl.b64 %r381,%r87,4; add.u64 %r127,%r124,%r381; .loc 1 207 32 add.u64 %r383,%r412,%r108; ld.u64 %r90,[%r383]; .loc 1 207 13 setp.eq.u64 %r384,%r84,%r90; @ %r384 bra $L39; .loc 1 211 13 mov.u64 %r148,0; bra $L25; $L53: .loc 1 90 6 ld.u64 %r385,[%r186]; setp.eq.u64 %r386,%r385,0; @ ! %r386 bra $L11; bra $L40; $L1: .loc 1 231_gfortran_mminloc1_16_i1 .visible .func _gfortran_mminloc1_16_, .param .u32 %in_ar4) {608pred %r289; .reg .predpred %r304; .reg .pred %r305; .reg .u64u64u64 %r318; .reg .u64pred %r322; .reg .u64 %r324; .reg .u64 %r326; .reg .pred %r328; .reg .u64 %r329; .reg .predu64pred %r354; .reg .predpredpred %r397; .reg .u64 %r400; .reg .u64 %r401; .reg .u64 %r405; .reg .pred.reg .u64 %r416; .reg .predpred %r425; .reg .pred %r426; .reg .pred %r427; .reg .u64 %r431; .reg .u64 %r434; .reg .u64pred %r476; .reg .pred %r479; .reg .u64 %r483; .reg .u64 %r486; .reg .u64 %r489; .reg .u64 %r492; .reg .u32 %r494; .reg .u64 %r498; .reg .u64 %r500; .reg .u64 %r503; .reg .u32 %r505; .reg .u64 %r509; .reg .u16 %r510; .reg .u32 %r511; .reg .pred %r512; .reg .u16 %r513; .reg .u16 %r514; .reg .pred %r515; .reg .u64 %r518; .reg .u64 %r521; .reg .u32 %r523; .reg .predpred %r558; .reg .u64 %r559; .reg .u64 %r560; .reg .u64u64 %r593; .reg .pred.reg .u64 %r612; .reg .u64 %r613; mov.u64 %r232,%ar0; mov.u64 %r233,%ar1; mov.u64 %r234,%ar2; mov.u64 %r235,%ar3; mov.u32 %r236,%ar4; .loc 1 261 6 setp.ne.u64 %r237,%r235,0; @ %r237 bra $L23236; call _gfortran_minloc1_16_268 7 bra $L54; $L55: .loc 1 271 10 ld.u64 %r22,[%r234]; .loc 1 271 7 add.u64 %r150,%r22,-1; .loc 1 272 10 ld.s8 %r23,[%r233+28]; .loc 1 272 38 add.u32 %r242,%r23,-1; .loc 1 272 8 cvt.s64.s32 %r151,%r242; .loc 1 275 7 shr.u64 %r244,%r150,63; set.u32.gt.s64 %r246,%r150,%r151; neg.s32 %r247,%r246; cvt.u16.u64 %r250,%r244; cvt.u16.u32 %r251,%r247; or.b16 %r249,%r250,%r251; .loc 1 275 6.loc 1 277 7 cvt.u32.u32 %r257,%r23; cvt.s64.s8 %r256,%r257; st.u64 [%stack+8],%r256; st.u64 [%stack],%r22; cvta.const.u64 %r22_gfortran_runtime_error7: .loc 1 282 9 add.u64 %r598,%r150,%r150; add.u64 %r602,%r598,%r150; 
shl.b64 %r262,%r602,3; add.u64 %r263,%r233,%r262; ld.u64 %r266,[%r263+56]; add.u64 %r265,%r266,1; .loc 1 282 7 ld.u64 %r273,[%r263+48]; sub.u64 %r152,%r265,%r273; .loc 1 283 6 setp.le.s64 %r274,%r152,0; @ %r274 bra $L54; .loc 1 286 9 ld.u64 %r170,[%r235]; .loc 1 288 15 ld.u64 %r35,[%r235+16]; .loc 1 290 22 cvt.u32.u64 %r36,%r35; .loc 1 290 53 add.u32 %r275,%r36,-4; and.b32 %r276,%r275,-5; set.u32.eq.u32 %r278,%r276,0; neg.s32 %r279,%r278; .loc 1 290 22 add.u32 %r280,%r36,-1; set.u32.le.u32 %r282,%r280,1; neg.s32 %r283,%r282; .loc 1 290 6 cvt.u16.u32 %r285,%r279; cvt.u16.u32 %r286,%r283; or.b16 %r284,%r285,%r286; cvt.u32.u16 %r287,%r284; cvt.u16.u8 %r288,%r287; setp.ne.u16 %r289,%r288,0; @ %r289 bra $L59; .loc 1 292 7 setp.ne.u32 %r291,%r36,16; @ %r291 bra $L60; $L59: .loc 1 299 9 shl.b64 %r295,%r602,3; add.u64 %r296,%r233,%r295; ld.u64 %r155,[%r296+40]; .loc 1 300 12 add.u64 %r302,%r235,%r295; ld.u64 %r41,[%r302+40]; .loc 1 302 3 setp.ne.u64 %r304,%r150,0; @ %r304 bra $L61; $L67: .loc 1 312 3 setp.lt.s64 %r305,%r150,%r151; @ %r305 bra $L62; bra $L121; $L60: .loc 1 297 5 cvta.const.u64 %r306306_gfortran_runtime_error61: add.u64 %r85,%r233,40; add.u64 %r228,%r235,40; add.u64 %r309,%r22,%r22; add.u64 %r310,%r309,%r22; shl.b64 %r311,%r310,3; add.u64 %r312,%r233,16; add.u64 %r76,%r311,%r312; .loc 1 302 3 mov.u64 %r163,0; add.u64 %r599,%frame,240; add.u64 %r603,%frame,360; .loc 1 309 12 mov.u64 %r613,%r163; $L66: .loc 1 304 18 add.u64 %r314,%r599,%r163; ld.u64 %r315,[%r85]; st.u64 [%r314],%r315; .loc 1 305 18 add.u64 %r316,%frame,%r163; .loc 1 305 20 ld.u64 %r318,[%r228]; mul.lo.u64 %r317,%r318,%r35; .loc 1 305 18 st.u64 [%r316],%r317; .loc 1 306 19 ld.u64 %r320,[%r85+16]; add.u64 %r319,%r320,1; ld.u64 %r321,[%r85+8]; sub.u64 %r53,%r319,%r321; .loc 1 308 10 setp.lt.s64 %r322,%r53,0; @ %r322 bra $L64; .loc 1 306 17 add.u64 %r324,%r603,%r163; st.u64 [%r324],%r53; bra $L65; $L64: .loc 1 309 12 add.u64 %r326,%r603,%r163; st.u64 [%r326],%r613; $L65: .loc 1 302 3 add.u64 %r85,%r85,24; add.u64 %r228,%r228,24; add.u64 %r163,%r163,8; setp.ne.u64 %r328,%r76,%r85; @ %r328 bra $L66; bra $L67; $L121: .loc 1 322 6 ld.u64 %r329,[%r232]; setp.eq.u64 %r330,%r329,0; @ ! 
%r330 bra $L69; bra $L68; $L62: add.u64 %r332,%r22,%r22; add.u64 %r333,%r332,%r22; shl.b64 %r334,%r333,3; add.u64 %r176,%r334,40; add.u64 %r149,%r233,%r176; add.u64 %r221,%r235,%r176; shl.b64 %r335,%r22,3; add.u64 %r68,%r335,-8; cvt.u32.u32 %r337,%r23; cvt.s64.s8 %r336,%r337; shl.b64 %r338,%r336,3; add.u64 %r181,%r338,-8; add.u64 %r599,%frame,240; add.u64 %r603,%frame,360; .loc 1 319 12 mov.u64 %r612,0; $L72: .loc 1 314 18 add.u64 %r340,%r599,%r68; ld.u64 %r341,[%r149]; st.u64 [%r340],%r341; .loc 1 315 18 add.u64 %r342,%frame,%r68; .loc 1 315 20 ld.u64 %r344,[%r221]; mul.lo.u64 %r343,%r344,%r35; .loc 1 315 18 st.u64 [%r342],%r343; .loc 1 316 19 ld.u64 %r346,[%r149+16]; add.u64 %r345,%r346,1; ld.u64 %r347,[%r149+8]; sub.u64 %r66,%r345,%r347; .loc 1 318 10 setp.lt.s64 %r348,%r66,0; @ %r348 bra $L70; .loc 1 316 17 add.u64 %r350,%r603,%r68; st.u64 [%r350],%r66; bra $L71; $L70: .loc 1 319 12 add.u64 %r352,%r603,%r68; st.u64 [%r352],%r612; $L71: .loc 1 312 3 add.u64 %r149,%r149,24; add.u64 %r221,%r221,24; add.u64 %r68,%r68,8; setp.ne.u64 %r354,%r68,%r181; @ %r354 bra $L72; bra $L122; $L68: .loc 1 326 7 setp.eq.u64 %r355,%r151,0; @ %r355 bra $L74; add.u64 %r603,%frame,360; $L100: add.u64 %r132,%r232,40; mov.u64 %r130,%r603; cvt.u32.u32 %r357,%r23; cvt.s64.s8 %r356,%r357; add.u64 %r359,%r356,%r356; add.u64 %r360,%r359,%r356; shl.b64 %r361,%r360,3; add.u64 %r362,%r232,16; add.u64 %r59,%r361,%r362; .loc 1 329 10 mov.u64 %r133,1; .loc 1 333 4 mov.u64 %r381,0; bra $L75; $L74: .loc 1 337 20 add.u32 %r363,%r23,-2; cvt.s64.s32 %r74,%r363; add.u64 %r365,%r74,%r74; add.u64 %r366,%r365,%r74; shl.b64 %r367,%r366,3; add.u64 %r368,%r232,%r367; .loc 1 337 67 shl.b64 %r370,%r74,3; add.u64 %r371,%frame,%r370; .loc 1 337 59 ld.u64 %r373,[%r368+40]; ld.u64 %r374,[%r371+360]; mul.lo.u64 %r158,%r373,%r374; .loc 1 339 24 mov.u64 %r375,0; st.u64 [%r232+8],%r375; .loc 1 340 28 cvt.u16.u32 %r378,%r23; add.u16 %r377,%r378,-1; cvt.u32.u16 %r379,%r377; st.u8 [%r232+28],%r379; .loc 1 342 10 setp.eq.u64 %r380,%r158,0; @ %r380 bra $L76; bra $L123; $L78: .loc 1 331 47 mul.lo.u64 %r133,%r71,%r133; $L75: .loc 1 333 4 st.u64 [%r132+8],%r381; ld.u64 %r71,[%r130]; add.u64 %r382,%r71,-1; st.u64 [%r132+16],%r382; st.u64 [%r132],%r133; .loc 1 326 7 add.u64 %r132,%r132,24; add.u64 %r130,%r130,8; setp.ne.u64 %r383,%r59,%r132; @ %r383 bra $L78; bra $L74; $L76: .loc 1 345 4 st.u64 [%r232+48],%r158; mov.u64 %r385,-1; st.u64 [%r232+56],%r385; mov.u64 %r386,1; st.u64 [%r232+40],%r386; .loc 1 346 4 bra $L54; $L123: .loc 1 349 24 mov.u64 %r388,16388; call (%value_in),_gfortrani_xmallocarray89,[%value_in]; } .loc 1 349 22 st.u64 [%r232],%r389; bra $L79; $L69: .loc 1 354 19 ld.s8 %r391,[%r232+28]; .loc 1 354 10 setp.eq.u64 %r392,%r391,%r151; @ %r392 bra $L80; .loc 1 355 2 cvta.const.u64 %r3933_gfortran_runtime_error80: .loc 1 357 11 cvta.global.u64 %r395,_gfortrani_compile_options; .loc 1 357 10 ld.u32 %r396,[%r395+36]; setp.eq.u32 %r397,%r396,0; @ %r397 bra $L79; .loc 1 359 4 add.u64 %r603,%frame,360; cvta.const.u64 %r401,$LC2r60300401; call _gfortrani_bounds_ifunction_return361 4 cvta.const.u64 %r405,$LC40401; call _gfortrani_bounds_equal_extents$L79: .loc 1 366 3 setp.ne.u64 %r407,%r151,0; @ %r407 bra $L81; $L84: .loc 1 374 8 ld.u64 %r171,[%r232]; .loc 1 375 8 ld.u64 %r169,[%r233]; .loc 1 377 9 setp.ne.u64 %r408,%r169,0; @ %r408 bra $L82; bra $L54; $L81: add.u64 %r186,%frame,480; add.u64 %r148,%r232,40; add.u64 %r145,%frame,120; cvt.u32.u32 %r410,%r23; cvt.s64.s8 %r409,%r410; add.u64 %r134,%r409,-1; .loc 1 366 3 mov.u64 %r172,0; add.u64 
%r603,%frame,360; .loc 1 368 16 mov.u64 %r411,%r172; $L83: st.u64 [%r186],%r411; .loc 1 369 18 ld.u64 %r412,[%r148]; st.u64 [%r145],%r412; .loc 1 370 17 shl.b64 %r414,%r172,3; add.u64 %r415,%r603,%r414; .loc 1 370 10 ld.u64 %r416,[%r415]; setp.le.s64 %r417,%r416,0; @ %r417 bra $L54; .loc 1 366 26 add.u64 %r172,%r172,1; .loc 1 366 3 add.u64 %r186,%r186,8; add.u64 %r148,%r148,24; add.u64 %r145,%r145,8; setp.ne.u64 %r418,%r134,%r172; @ %r418 bra $L83; bra $L84; $L82: .loc 1 300 12 mul.lo.u64 %r44,%r41,%r35; ld.u64 %r188,[%frame+480]; .loc 1 440 22 ld.u64 %r101,[%frame+240]; .loc 1 441 23 ld.u64 %r103,[%frame]; .loc 1 442 22 ld.u64 %r105,[%frame+120]; .loc 1 442 12 shl.b64 %r106,%r105,4; .loc 1 444 32 ld.u64 %r29,[%frame+360]; .loc 1 451 23 mul.lo.u64 %r39,%r101,%r29; .loc 1 452 24 mul.lo.u64 %r419,%r103,%r29; .loc 1 452 10 neg.s64 %r92,%r419; .loc 1 453 23 mul.lo.u64 %r420,%r105,%r29; .loc 1 453 9 shl.b64 %r421,%r420,4; neg.s64 %r182,%r421; setp.le.s64 %r597,%r151,1; .loc 1 418 9 setp.ne.u32 %r605,%r236,0; add.u64 %r606,%r152,-1; cvt.u32.u32 %r607,%r23; cvt.s64.s8 %r608,%r607; add.u64 %r609,%r608,-1; add.u64 %r610,%frame,240; add.u64 %r611,%frame,360; $L101: .loc 1 366 3 mov.u64 %r165,%r170; mov.u64 %r164,%r169; .loc 1 396 9 mov.u64 %r199,0; $L88: .loc 1 399 7 ld.s8 %r227,[%r165]; mov.u64 %r229,%r199; .loc 1 396 24 add.u64 %r199,%r199,1; .loc 1 399 6 setp.eq.u32 %r600,%r227,0; @ %r600 bra $L85; .loc 1 407 11 ld.s8 %r143,[%r164]; .loc 1 408 11 mov.u64 %r559,%r199; shr.s64 %r560,%r559,63; .loc 1 418 9 @ %r605 bra $L86; bra $L124; $L85: .loc 1 396 32 add.u64 %r164,%r164,%r155; .loc 1 396 47 add.u64 %r165,%r165,%r44; .loc 1 396 2 setp.ne.u64 %r425,%r152,%r199; @ %r425 bra $L88; .loc 1 395 9 mov.u64 %r559,0; mov.u64 %r560,%r559; bra $L89; $L124: .loc 1 428 10 setp.gt.s64 %r426,%r152,%r229; @ %r426 bra $L90; bra $L89; $L86: .loc 1 419 8 setp.le.s64 %r427,%r152,%r229; @ %r427 bra $L89; add.u64 %r567,%r229,1; shr.s64 %r568,%r567,63; sub.u64 %r431,%r606,%r229; shr.s64 %r434,%r229,63; add.u64 %r437,%r229,2; set.u32.lt.u64 %r442,%r437,%r229; cvt.s64.s32 %r440,%r442; sub.u64 %r446,%r434,%r440; add.u64 %r448,%r431,%r437; set.u32.lt.u64 %r453,%r448,%r431; cvt.s64.s32 %r451,%r453; sub.u64 %r457,%r446,%r451; mov.u64 %r579,1; $L93: .loc 1 421 8 @ %r600 bra $L91; .loc 1 421 18 ld.s8 %r90,[%r164]; .loc 1 421 15 cvt.u16.u32 %r459,%r90; cvt.u16.u32 %r460,%r143; setp.gt.s16 %r461,%r459,%r460; .loc 1 424 16 selp.u64 %r559,%r559,%r567,%r461; selp.u64 %r560,%r560,%r568,%r461; selp.u32 %r143,%r143,%r90,%r461; $L91: .loc 1 419 33 add.u64 %r164,%r164,%r155; .loc 1 419 48 add.u64 %r165,%r165,%r44; .loc 1 419 8 add.u64 %r464,%r567,%r579; set.u32.lt.u64 %r469,%r464,%r567; cvt.s64.s32 %r467,%r469; mov.u64 %r567,%r464; sub.u64 %r568,%r568,%r467; setp.ne.u64 %r476,%r448,%r567; @ %r476 bra $L102; setp.ne.u64 %r479,%r457,%r568; @ ! 
%r479 bra $L89; $L102: .loc 1 421 9 ld.s8 %r227,[%r165]; setp.eq.u32 %r600,%r227,0; bra $L93; $L90: add.u64 %r563,%r229,1; shr.s64 %r564,%r563,63; sub.u64 %r483,%r606,%r229; shr.s64 %r486,%r229,63; add.u64 %r489,%r229,2; set.u32.lt.u64 %r494,%r489,%r229; cvt.s64.s32 %r492,%r494; sub.u64 %r498,%r486,%r492; add.u64 %r500,%r483,%r489; set.u32.lt.u64 %r505,%r500,%r483; cvt.s64.s32 %r503,%r505; sub.u64 %r509,%r498,%r503; .loc 1 428 10 mov.u64 %r593,1; $L116: .loc 1 430 10 ld.s8 %r511,[%r165]; cvt.u16.u32 %r510,%r511; setp.eq.u16 %r512,%r510,0; @ %r512 bra $L94; .loc 1 430 20 ld.s8 %r95,[%r164]; .loc 1 430 17 cvt.u16.u32 %r513,%r95; cvt.u16.u32 %r514,%r143; setp.ge.s16 %r515,%r513,%r514; .loc 1 433 11 selp.u64 %r559,%r559,%r563,%r515; selp.u64 %r560,%r560,%r564,%r515; selp.u32 %r143,%r143,%r95,%r515; $L94: .loc 1 428 35 add.u64 %r164,%r164,%r155; .loc 1 428 50 add.u64 %r165,%r165,%r44; .loc 1 428 10 add.u64 %r518,%r563,%r593; set.u32.lt.u64 %r523,%r518,%r563; cvt.s64.s32 %r521,%r523; mov.u64 %r563,%r518; sub.u64 %r564,%r564,%r521; setp.ne.u64 %r530,%r500,%r563; @ %r530 bra $L116; setp.ne.u64 %r533,%r509,%r564; @ %r533 bra $L116; $L89: .loc 1 436 8 st.u64 [%r171],%r559; st.u64 [%r171+8],%r560; .loc 1 439 15 add.u64 %r188,%r188,1; .loc 1 440 12 add.u64 %r169,%r169,%r101; .loc 1 441 13 add.u64 %r170,%r170,%r103; .loc 1 442 12 add.u64 %r171,%r171,%r106; .loc 1 444 13 setp.ne.u64 %r536,%r29,%r188; @ %r536 bra $L101; .loc 1 452 10 add.u64 %r166,%r170,%r92; .loc 1 453 9 add.u64 %r167,%r171,%r182; .loc 1 455 7 @ %r597 bra $L54; add.u64 %r220,%frame,488; .loc 1 451 23 mov.u64 %r110,%r39; .loc 1 455 7 mov.u64 %r219,8; .loc 1 454 5 mov.u64 %r168,1; add.u64 %r601,%frame,120; .loc 1 448 13 mov.u64 %r604,0; bra $L98; $L99: st.u64 [%r220],%r604; .loc 1 451 23 mul.lo.u64 %r110,%r122,%r121; .loc 1 452 24 mul.lo.u64 %r542,%r124,%r121; .loc 1 452 10 sub.u64 %r166,%r170,%r542; .loc 1 453 23 mul.lo.u64 %r543,%r126,%r121; .loc 1 453 9 shl.b64 %r544,%r543,4; sub.u64 %r167,%r171,%r544; .loc 1 454 5 add.u64 %r168,%r168,1; .loc 1 455 7 add.u64 %r220,%r220,8; add.u64 %r219,%r219,8; setp.eq.u64 %r545,%r168,%r609; @ %r545 bra $L54; $L98: .loc 1 463 16 ld.u64 %r546,[%r220]; add.u64 %r121,%r546,1; st.u64 [%r220],%r121; .loc 1 464 23 add.u64 %r548,%r610,%r219; ld.u64 %r122,[%r548]; .loc 1 464 13 sub.u64 %r549,%r122,%r110; add.u64 %r169,%r169,%r549; .loc 1 465 24 add.u64 %r550,%frame,%r219; ld.u64 %r124,[%r550]; .loc 1 465 14 add.u64 %r170,%r166,%r124; .loc 1 466 23 add.u64 %r552,%r601,%r219; ld.u64 %r126,[%r552]; .loc 1 466 13 shl.b64 %r553,%r126,4; add.u64 %r171,%r167,%r553; .loc 1 444 32 add.u64 %r555,%r611,%r219; ld.u64 %r129,[%r555]; .loc 1 444 13 setp.eq.u64 %r556,%r121,%r129; @ %r556 bra $L99; .loc 1 448 13 mov.u64 %r188,0; bra $L101; $L122: .loc 1 322 6 ld.u64 %r557,[%r232]; setp.eq.u64 %r558,%r557,0; @ ! 
%r558 bra $L69; bra $L100; $L54: .loc 1 4_gfortran_sminloc1_16_i1 .visible .func _gfortran_sminloc1_16_, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L126; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L127; $L126:r152154; call _gfortran_minloc1_16_500 7 bra $L125; $L127: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r297,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L129; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L130; $L136: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L131; bra $L184; $L129: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error130: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r303,0; $L135: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L133; .loc 1 515 17 st.u64 [%r139],%r36; bra $L134; $L133: .loc 1 518 12 st.u64 [%r139],%r303; $L134: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L135; bra $L136; $L184: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L138; bra $L137; $L131: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r302,0; $L141: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L139; .loc 1 523 17 st.u64 [%r43],%r41; bra $L140; $L139: .loc 1 527 12 st.u64 [%r43],%r302; $L140: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L141; bra $L185; $L137: .loc 1 534 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L143; add.u64 %r296,%frame,120; $L163: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L144; $L143: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L145; bra $L186; $L147: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L144: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L147; bra $L143; $L145: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L125; $L186: .loc 1 557 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L151: .loc 1 583 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L148; bra $L149; $L138: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L150; .loc 1 562 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error150: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L151; .loc 1 569 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L149; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L153: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L152; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error152: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L153; $L148: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L154; shl.b64 %r266,%r297,3; bra $L155; $L154: mov.u64 %r266,8; $L155: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L156: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L156; $L149: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,4; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 593 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L162: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L162; .loc 1 606 7 @ ! %r295 bra $L187; bra $L125; $L161: .loc 1 601 13 st.u64 [%r99],%r280; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L160; bra $L125; $L187: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; $L160: .loc 1 610 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L161; .loc 1 601 13 mov.u64 %r80,0; bra $L162; $L185: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r294,%r98,0; @ ! 
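For reference, a minimal sketch (plain Python, not part of the package) of how the decimal byte arrays quoted above can be turned back into text, assuming they are NUL-terminated ASCII initializers as emitted for these string constants:

# Decode one of the PTX string-constant initializers listed above.
# A trailing 0, when present, is the NUL terminator and is dropped.
def decode_ptx_bytes(values):
    return bytes(v for v in values if v != 0).decode("ascii")

minloc_name = decode_ptx_bytes([77, 73, 78, 76, 79, 67, 0])
print(minloc_name)  # -> MINLOC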
[Continuation of the same extraction-damaged stream.

 - The span opens with the closing instructions of _gfortran_sminloc1_16_i1 (ending at source line 615 of minloc1_16_i1.c).
 - Archive member minloc1_4_i2.o/1622802233, compiled from "../../../../libgfortran/generated/minloc1_4_i2.c". It declares the visible functions _gfortran_minloc1_4_i2, _gfortran_mminloc1_4_i2 and _gfortran_sminloc1_4_i2, references the same external symbols as the previous member (_gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents), and carries the same set of MINLOC error-message string constants.
 - The pattern of each generated member appears to be: _gfortran_minloc1_* computes MINLOC reduced along one dimension, _gfortran_mminloc1_* is the variant taking an array MASK, and _gfortran_sminloc1_* handles a scalar mask, calling the unmasked routine when the mask is true and producing a zero-filled result otherwise (the call to _gfortran_minloc1_4_... after the mask test is still visible in the corrupted text).
 - The PTX bodies of _gfortran_minloc1_4_i2, _gfortran_mminloc1_4_i2 and the beginning of _gfortran_sminloc1_4_i2 follow; the excerpt is cut off inside _gfortran_sminloc1_4_i2 and the damaged stream continues below.]
$L152; shl.b64 %r266,%r296,3; bra $L153; $L152: mov.u64 %r266,8; $L153: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L154: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L154; $L147: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,2; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 593 13 mov.u32 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L160: st.u32 [%r98],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L160; .loc 1 606 7 @ ! %r294 bra $L185; bra $L123; $L159: .loc 1 601 13 st.u64 [%r99],%r297; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L158; bra $L123; $L185: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; .loc 1 601 13 mov.u64 %r297,0; $L158: .loc 1 610 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,2; add.u64 %r98,%r98,%r291; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L159; .loc 1 601 13 mov.u64 %r80,0; bra $L160; $L183: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r293,%r98,0; @ ! 
%r293 bra $L136; bra $L161; $L123: .loc 1 615 minloc1_8_i2.o/ 1622802233_gfortran_minloc1_8_i2 .visible .func _gfortran_minloc1_8_i2.file 1 "../../../../libgfortran/generated/minloc1_8_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_8_i2 .visible .func _gfortran_mminloc1_8__gfortran_sminloc1_8_i2 .visible .func _gfortran_sminloc1_8_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_8_i2 .visible .func _gfortran_minloc1_8_i2predpredpred %r235; .reg .predu64 %r266; .reg .u64 %r267; .reg .u64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64 %r283; .reg .u64pred32u64pred %r328; .reg .pred %r329; .reg .u16 %r330; .reg .u16 %r331; .reg .predu64 %r335; .reg .u64mov.u64 %r185,%ar0; mov.u64 %r186,%ar1; mov.u64 %r187,%ar2; mov.u32 %r188,%ar3; .loc 1 58 10 ld.s8 %r22,[%r186+28]; .loc 1 58 38 add.u32 %r189,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r120,%r189; .loc 1 59 10 ld.u64 %r25,[%r187]; .loc 1 59 7 add.u64 %r121,%r25,-1; .loc 1 61 7 shr.u64 %r191,%r121,63; set.u32.lt.s64 %r193,%r120,%r121; neg.s32 %r194,%r193; cvt.u16.u64 %r197,%r191; cvt.u16.u32 %r198,%r194; or.b16 %r196,%r197,%r198; .loc 1 61 6 cvt.u32.u16 %r199,%r196; cvt.u16.u8 %r200,%r199; setp.eq.u16 %r201,%r200,0; @ %r201 bra $L2; .loc 1 63 7 cvt.u32.u32 %r204,%r22; cvt.s64.s8 %r203,%r204; st.u64 [%stack+8],%r20302202_gfortran_runtime_errorr207,%r121,%r121; add.u64 %r208,%r207,%r121; shl.b64 %r209,%r208,3; add.u64 %r210,%r186,%r209; ld.u64 %r30,[%r210+56]; ld.u64 %r32,[%r210+48]; .loc 1 71 9 ld.u64 %r124,[%r210+40]; .loc 1 73 3 setp.ne.u64 %r224,%r121,0; @ %r224 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r225,%r120,%r121; @ %r225 bra $L4; bra $L54; $L3: add.u64 %r47,%r186,40; add.u64 %r119,%frame,120; add.u64 %r137,%frame,240; add.u64 %r226,%r186,16; add.u64 %r228,%r25,%r25; add.u64 %r229,%r228,%r25; shl.b64 %r230,%r229,3; add.u64 %r46,%r226,%r230; .loc 1 79 12 mov.u64 %r373,0; $L8: .loc 1 75 18 ld.u64 %r231,[%r47]; st.u64 [%r119],%r231; .loc 1 76 19 ld.u64 %r233,[%r47+16]; add.u64 %r232,%r233,1; ld.u64 
%r234,[%r47+8]; sub.u64 %r37,%r232,%r234; .loc 1 78 10 setp.lt.s64 %r235,%r37,0; @ %r235 bra $L6; .loc 1 76 17 st.u64 [%r137],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 [%r137],%r373; $L7: .loc 1 73 3 add.u64 %r47,%r47,24; add.u64 %r119,%r119,8; add.u64 %r137,%r137,8; setp.ne.u64 %r237,%r46,%r47; @ %r237 bra $L8; bra $L9; $L54: .loc 1 90 6 ld.u64 %r238,[%r185]; setp.eq.u64 %r239,%r238,0; @ ! %r239 bra $L11; bra $L10; $L4: add.u64 %r241,%r25,%r25; add.u64 %r242,%r241,%r25; shl.b64 %r243,%r242,3; add.u64 %r244,%r243,40; add.u64 %r118,%r186,%r244; shl.b64 %r245,%r25,3; add.u64 %r108,%r245,-8; add.u64 %r361,%frame,120; add.u64 %r112,%r361,%r108; add.u64 %r360,%frame,240; add.u64 %r104,%r360,%r108; cvt.u32.u32 %r249,%r22; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r254,%r186,40; add.u64 %r64,%r253,%r254; .loc 1 87 12 mov.u64 %r372,0; $L14: .loc 1 83 18 ld.u64 %r255,[%r118]; st.u64 [%r112],%r255; .loc 1 84 19 ld.u64 %r257,[%r118+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r118+8]; sub.u64 %r43,%r256,%r258; .loc 1 86 10 setp.lt.s64 %r259,%r43,0; @ %r259 bra $L12; .loc 1 84 17 st.u64 [%r104],%r43; bra $L13; $L12: .loc 1 87 12 st.u64 [%r104],%r372; $L13: .loc 1 81 3 add.u64 %r118,%r118,24; add.u64 %r112,%r112,8; add.u64 %r104,%r104,8; setp.ne.u64 %r261,%r64,%r118; @ %r261 bra $L14; bra $L55; $L10: .loc 1 94 7 setp.le.s64 %r262,%r120,0; @ %r262 bra $L16; add.u64 %r360,%frame,240; $L39: add.u64 %r156,%r185,40; mov.u64 %r153,%r360; cvt.u32.u32 %r264,%r22; cvt.s64.s8 %r263,%r264; add.u64 %r266,%r263,%r263; add.u64 %r267,%r266,%r263; shl.b64 %r268,%r267,3; add.u64 %r269,%r185,16; add.u64 %r130,%r268,%r269; .loc 1 97 10 mov.u64 %r107,1; .loc 1 101 4 mov.u64 %r292,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r270,0; st.u64 [%r185+8],%r270; .loc 1 106 28 cvt.u16.u32 %r273,%r22; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r185+28],%r274; .loc 1 108 20 add.u32 %r275,%r22,-2; cvt.s64.s32 %r54,%r275; add.u64 %r277,%r54,%r54; add.u64 %r278,%r277,%r54; shl.b64 %r279,%r278,3; add.u64 %r280,%r185,%r279; .loc 1 108 67 shl.b64 %r282,%r54,3; add.u64 %r283,%frame,%r282; .loc 1 108 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r126,%r285,%r286; .loc 1 110 29 mov.u64 %r288288; call (%value_in),_gfortrani_xmallocarray289,[%value_in]; } .loc 1 110 27 st.u64 [%r185],%r289; .loc 1 111 10 setp.eq.u64 %r291,%r126,0; @ ! 
%r291 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r107,%r50,%r107; $L17: .loc 1 101 4 st.u64 [%r156+8],%r292; ld.u64 %r50,[%r153]; add.u64 %r293,%r50,-1; st.u64 [%r156+16],%r293; st.u64 [%r156],%r107; .loc 1 94 7 add.u64 %r156,%r156,24; add.u64 %r153,%r153,8; setp.ne.u64 %r294,%r130,%r156; @ %r294 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r185+48],%r126; mov.u64 %r296,-1; st.u64 [%r185+56],%r296; mov.u64 %r297,1; st.u64 [%r185+40],%r297; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r60,[%r185+28]; .loc 1 121 10 setp.eq.u64 %r298,%r60,%r120; @ %r298 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r120; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r301,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r302,[%r301+36]; setp.eq.u32 %r303,%r302,0; @ %r303 bra $L21; .loc 1 128 2 add.u64 %r360,%frame,240; cvta.const.u64 %r307,$LC2r360306307; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r309,%r120,0; @ %r309 bra $L24; $L27: .loc 1 68 9 add.u64 %r310,%r30,1; .loc 1 68 7 sub.u64 %r122,%r310,%r32; max.s64 %r87,%r122,0; .loc 1 140 8 ld.u64 %r134,[%r186]; .loc 1 141 8 ld.u64 %r135,[%r185]; .loc 1 190 33 add.u64 %r75,%r124,%r124; ld.u64 %r45,[%frame+360]; .loc 1 204 22 ld.u64 %r76,[%frame+120]; .loc 1 204 12 add.u64 %r78,%r76,%r76; .loc 1 205 22 ld.u64 %r79,[%frame]; .loc 1 205 12 shl.b64 %r81,%r79,3; setp.gt.s64 %r362,%r122,0; setp.le.s64 %r363,%r120,1; .loc 1 180 9 setp.eq.u32 %r365,%r188,0; .loc 1 159 10 mov.u64 %r366,0; cvt.u32.u32 %r367,%r22; cvt.s64.s8 %r368,%r367; add.u64 %r369,%r368,-1; add.u64 %r370,%frame,120; add.u64 %r371,%frame,240; bra $L25; $L24: add.u64 %r170,%frame,360; add.u64 %r169,%r185,40; mov.u64 %r167,%frame; cvt.u32.u32 %r314,%r22; cvt.s64.s8 %r313,%r314; add.u64 %r158,%r313,-1; .loc 1 132 3 mov.u64 %r136,0; add.u64 %r360,%frame,240; .loc 1 134 16 mov.u64 %r315,%r136; $L26: st.u64 [%r170],%r315; .loc 1 135 18 ld.u64 %r316,[%r169]; st.u64 [%r167],%r316; .loc 1 136 17 shl.b64 %r318,%r136,3; add.u64 %r319,%r360,%r318; .loc 1 136 10 ld.u64 %r320,[%r319]; setp.le.s64 %r321,%r320,0; @ %r321 bra $L1; .loc 1 132 26 add.u64 %r136,%r136,1; .loc 1 132 3 add.u64 %r170,%r170,8; add.u64 %r169,%r169,24; add.u64 %r167,%r167,8; setp.ne.u64 %r322,%r136,%r158; @ %r322 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r362 bra $L28; .loc 1 159 10 st.u64 [%r135],%r366; bra $L29; $L28: .loc 1 180 9 @ %r365 bra $L40; mov.u64 %r129,%r134; .loc 1 155 9 mov.u32 %r111,32767; .loc 1 157 9 mov.u64 %r110,1; .loc 1 178 8 mov.u64 %r152,0; $L32: .loc 1 183 9 ld.s16 %r67,[%r129]; .loc 1 186 16 add.u64 %r152,%r152,1; .loc 1 183 8 cvt.u16.u32 %r326,%r67; cvt.u16.u32 %r327,%r111; setp.le.s16 %r328,%r326,%r327; selp.u32 %r111,%r67,%r111,%r328; selp.u64 %r110,%r152,%r110,%r328; .loc 1 181 33 add.u64 %r129,%r129,%r75; .loc 1 181 8 setp.gt.s64 %r329,%r87,%r152; @ %r329 bra $L32; bra $L33; $L40: mov.u64 %r127,%r134; .loc 1 155 9 mov.u32 %r113,32767; .loc 1 157 9 mov.u64 %r110,1; .loc 1 178 8 mov.u64 %r106,0; $L30: .loc 1 192 9 ld.s16 %r72,[%r127]; .loc 1 195 16 add.u64 %r106,%r106,1; .loc 1 192 8 cvt.u16.u32 %r330,%r72; cvt.u16.u32 %r331,%r113; setp.lt.s16 %r332,%r330,%r331; selp.u32 %r113,%r72,%r113,%r332; selp.u64 %r110,%r106,%r110,%r332; .loc 1 190 33 add.u64 %r127,%r127,%r75; .loc 1 190 8 setp.gt.s64 %r333,%r87,%r106; @ %r333 bra $L30; $L33: .loc 1 199 12 st.u64 [%r135],%r110; $L29: .loc 1 203 15 add.u64 %r45,%r45,1; .loc 1 204 12 add.u64 %r134,%r134,%r78; .loc 1 205 12 add.u64 %r135,%r135,%r81; .loc 1 207 32 ld.u64 %r68,[%frame+240]; .loc 1 207 13 
setp.ne.u64 %r334,%r68,%r45; @ %r334 bra $L25; .loc 1 214 23 mul.lo.u64 %r335,%r45,%r76; .loc 1 214 9 add.u64 %r336,%r335,%r335; sub.u64 %r131,%r134,%r336; .loc 1 215 23 mul.lo.u64 %r337,%r45,%r79; .loc 1 215 9 shl.b64 %r338,%r337,3; sub.u64 %r132,%r135,%r338; .loc 1 217 7 @ %r363 bra $L1; add.u64 %r157,%frame,368; mov.u64 %r178,8; .loc 1 216 5 mov.u64 %r133,1; bra $L37; $L38: .loc 1 211 13 st.u64 [%r157],%r366; .loc 1 214 23 mul.lo.u64 %r344,%r94,%r93; .loc 1 214 9 add.u64 %r345,%r344,%r344; sub.u64 %r131,%r134,%r345; .loc 1 215 23 mul.lo.u64 %r346,%r97,%r93; .loc 1 215 9 shl.b64 %r347,%r346,3; sub.u64 %r132,%r135,%r347; .loc 1 216 5 add.u64 %r133,%r133,1; .loc 1 217 7 add.u64 %r157,%r157,8; add.u64 %r178,%r178,8; setp.eq.u64 %r348,%r133,%r369; @ %r348 bra $L1; $L37: .loc 1 225 16 ld.u64 %r349,[%r157]; add.u64 %r93,%r349,1; st.u64 [%r157],%r93; .loc 1 226 23 add.u64 %r351,%r370,%r178; ld.u64 %r94,[%r351]; .loc 1 226 13 add.u64 %r352,%r94,%r94; add.u64 %r134,%r131,%r352; .loc 1 227 23 add.u64 %r353,%frame,%r178; ld.u64 %r97,[%r353]; .loc 1 227 13 shl.b64 %r354,%r97,3; add.u64 %r135,%r132,%r354; .loc 1 207 32 add.u64 %r356,%r371,%r178; ld.u64 %r100,[%r356]; .loc 1 207 13 setp.eq.u64 %r357,%r93,%r100; @ %r357 bra $L38; .loc 1 211 13 mov.u64 %r45,0; bra $L25; $L55: .loc 1 90 6 ld.u64 %r358,[%r185]; setp.eq.u64 %r359,%r358,0; @ ! %r359 bra $L11; bra $L39; $L1: .loc 1 231_gfortran_mminloc1_8_i2 .visible .func _gfortran_mminloc1_8_, .param .u32 %in_ar4) {608pred %r218; .reg .u32 %r223; .reg .u6464u64pred %r373; .reg .u64 %r374; .reg .u64 %r376; .reg .u32 %r377; .reg .pred %r378; .reg .u64 %r381; .reg .u64 %r382; .reg .u64 %r386; .reg .pred %r388; .reg .pred %r389; .reg .u64 %r390; .reg .u32 %r391; .reg .u64pred %r398; .reg .pred %r399; .reg .pred %r404; .reg .pred16u32 %r413; .reg .predpred %r417; .reg .pred %r418; .reg .predpredpredu64 %r460; .reg .u64 %r461; .reg .u64 %r462; mov.u64 %r213,%ar0; mov.u64 %r214,%ar1; mov.u64 %r215,%ar2; mov.u64 %r216,%ar3; mov.u32 %r217,%ar4; .loc 1 261 6 setp.ne.u64 %r218,%r216,0; @ %r218 bra $L1r214215217; call _gfortran_minloc1_8_268 7 bra $L56; $L57: .loc 1 271 10 ld.u64 %r22,[%r215]; .loc 1 271 7 add.u64 %r146,%r22,-1; .loc 1 272 10 ld.s8 %r23,[%r214+28]; .loc 1 272 38 add.u32 %r223,%r23,-1; .loc 1 272 8 cvt.s64.s32 %r147,%r223; .loc 1 275 7 shr.u64 %r225,%r146,63; set.u32.gt.s64 %r227,%r146,%r147; neg.s32 %r228,%r227; cvt.u16.u64 %r231,%r225; cvt.u16.u32 %r232,%r228; or.b16 %r230,%r231,%r232; .loc 1 275 6 cvt.u32.u16 %r233,%r230; cvt.u16.u8 %r234,%r233; setp.eq.u16 %r235,%r234,0; @ %r235 bra $L59; .loc 1 277 7 cvt.u32.u32 %r238,%r23; cvt.s64.s8 %r237,%r238; st.u64 [%stack+8],%r237; st.u64 [%stack],%r22; cvta.const.u64 %r236236_gfortran_runtime_error9: .loc 1 282 9 add.u64 %r452,%r146,%r146; add.u64 %r451,%r452,%r146; shl.b64 %r243,%r451,3; add.u64 %r244,%r214,%r243; ld.u64 %r247,[%r244+56]; add.u64 %r246,%r247,1; .loc 1 282 7 ld.u64 %r254,[%r244+48]; sub.u64 %r148,%r246,%r254; .loc 1 283 6 setp.le.s64 %r255,%r148,0; @ %r255 bra $L56; .loc 1 286 9 ld.u64 %r166,[%r216]; .loc 1 288 15 ld.u64 %r34,[%r216+16]; .loc 1 290 22 cvt.u32.u64 %r35,%r34; .loc 1 290 53 add.u32 %r256,%r35,-4; and.b32 %r257,%r256,-5; set.u32.eq.u32 %r259,%r257,0; neg.s32 %r260,%r259; .loc 1 290 22 add.u32 %r261,%r35,-1; set.u32.le.u32 %r263,%r261,1; neg.s32 %r264,%r263; .loc 1 290 6 cvt.u16.u32 %r266,%r260; cvt.u16.u32 %r267,%r264; or.b16 %r265,%r266,%r267; cvt.u32.u16 %r268,%r265; cvt.u16.u8 %r269,%r268; setp.ne.u16 %r270,%r269,0; @ %r270 bra $L61; .loc 1 292 7 setp.ne.u32 %r272,%r35,16; 
@ %r272 bra $L62; $L61: .loc 1 299 9 shl.b64 %r276,%r451,3; add.u64 %r277,%r214,%r276; ld.u64 %r151,[%r277+40]; .loc 1 300 12 add.u64 %r283,%r216,%r276; ld.u64 %r39,[%r283+40]; .loc 1 302 3 setp.ne.u64 %r285,%r146,0; @ %r285 bra $L63; $L69: .loc 1 312 3 setp.lt.s64 %r286,%r146,%r147; @ %r286 bra $L64; bra $L123; $L62: .loc 1 297 5_gfortran_runtime_error63: add.u64 %r205,%r214,40; add.u64 %r67,%r216,40; add.u64 %r289,%r214,16; add.u64 %r291,%r22,%r22; add.u64 %r292,%r291,%r22; shl.b64 %r293,%r292,3; add.u64 %r210,%r289,%r293; .loc 1 302 3 mov.u64 %r91,0; add.u64 %r449,%frame,240; add.u64 %r447,%frame,360; .loc 1 309 12 mov.u64 %r462,%r91; $L68: .loc 1 304 18 add.u64 %r295,%r449,%r91; ld.u64 %r296,[%r205]; st.u64 [%r295],%r296; .loc 1 305 18 add.u64 %r297,%frame,%r91; .loc 1 305 20 ld.u64 %r299,[%r67]; mul.lo.u64 %r298,%r299,%r34; .loc 1 305 18 st.u64 [%r297],%r298; .loc 1 306 19 ld.u64 %r301,[%r205+16]; add.u64 %r300,%r301,1; ld.u64 %r302,[%r205+8]; sub.u64 %r51,%r300,%r302; .loc 1 308 10 setp.lt.s64 %r303,%r51,0; @ %r303 bra $L66; .loc 1 306 17 add.u64 %r305,%r447,%r91; st.u64 [%r305],%r51; bra $L67; $L66: .loc 1 309 12 add.u64 %r307,%r447,%r91; st.u64 [%r307],%r462; $L67: .loc 1 302 3 add.u64 %r205,%r205,24; add.u64 %r67,%r67,24; add.u64 %r91,%r91,8; setp.ne.u64 %r309,%r205,%r210; @ %r309 bra $L68; bra $L69; $L123: .loc 1 322 6 ld.u64 %r310,[%r213]; setp.eq.u64 %r311,%r310,0; @ ! %r311 bra $L71; bra $L70; $L64: add.u64 %r313,%r22,%r22; add.u64 %r314,%r313,%r22; shl.b64 %r315,%r314,3; add.u64 %r128,%r315,40; add.u64 %r133,%r214,%r128; add.u64 %r124,%r216,%r128; shl.b64 %r316,%r22,3; add.u64 %r83,%r316,-8; cvt.u32.u32 %r318,%r23; cvt.s64.s8 %r317,%r318; shl.b64 %r319,%r317,3; add.u64 %r209,%r319,-8; add.u64 %r449,%frame,240; add.u64 %r447,%frame,360; .loc 1 319 12 mov.u64 %r461,0; $L74: .loc 1 314 18 add.u64 %r321,%r449,%r83; ld.u64 %r322,[%r133]; st.u64 [%r321],%r322; .loc 1 315 18 add.u64 %r323,%frame,%r83; .loc 1 315 20 ld.u64 %r325,[%r124]; mul.lo.u64 %r324,%r325,%r34; .loc 1 315 18 st.u64 [%r323],%r324; .loc 1 316 19 ld.u64 %r327,[%r133+16]; add.u64 %r326,%r327,1; ld.u64 %r328,[%r133+8]; sub.u64 %r64,%r326,%r328; .loc 1 318 10 setp.lt.s64 %r329,%r64,0; @ %r329 bra $L72; .loc 1 316 17 add.u64 %r331,%r447,%r83; st.u64 [%r331],%r64; bra $L73; $L72: .loc 1 319 12 add.u64 %r333,%r447,%r83; st.u64 [%r333],%r461; $L73: .loc 1 312 3 add.u64 %r133,%r133,24; add.u64 %r124,%r124,24; add.u64 %r83,%r83,8; setp.ne.u64 %r335,%r83,%r209; @ %r335 bra $L74; bra $L124; $L70: .loc 1 326 7 setp.eq.u64 %r336,%r147,0; @ %r336 bra $L76; add.u64 %r447,%frame,360; $L100: add.u64 %r161,%r213,40; mov.u64 %r143,%r447; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; add.u64 %r340,%r337,%r337; add.u64 %r341,%r340,%r337; shl.b64 %r342,%r341,3; add.u64 %r343,%r213,16; add.u64 %r135,%r342,%r343; .loc 1 329 10 mov.u64 %r131,1; .loc 1 333 4 mov.u64 %r362,0; bra $L77; $L76: .loc 1 337 20 add.u32 %r344,%r23,-2; cvt.s64.s32 %r73,%r344; add.u64 %r346,%r73,%r73; add.u64 %r347,%r346,%r73; shl.b64 %r348,%r347,3; add.u64 %r349,%r213,%r348; .loc 1 337 67 shl.b64 %r351,%r73,3; add.u64 %r352,%frame,%r351; .loc 1 337 59 ld.u64 %r354,[%r349+40]; ld.u64 %r355,[%r352+360]; mul.lo.u64 %r155,%r354,%r355; .loc 1 339 24 mov.u64 %r356,0; st.u64 [%r213+8],%r356; .loc 1 340 28 cvt.u16.u32 %r359,%r23; add.u16 %r358,%r359,-1; cvt.u32.u16 %r360,%r358; st.u8 [%r213+28],%r360; .loc 1 342 10 setp.eq.u64 %r361,%r155,0; @ %r361 bra $L78; bra $L125; $L80: .loc 1 331 47 mul.lo.u64 %r131,%r70,%r131; $L77: .loc 1 333 4 st.u64 [%r161+8],%r362; ld.u64 
%r70,[%r143]; add.u64 %r363,%r70,-1; st.u64 [%r161+16],%r363; st.u64 [%r161],%r131; .loc 1 326 7 add.u64 %r161,%r161,24; add.u64 %r143,%r143,8; setp.ne.u64 %r364,%r135,%r161; @ %r364 bra $L80; bra $L76; $L78: .loc 1 345 4 st.u64 [%r213+48],%r155; mov.u64 %r366,-1; st.u64 [%r213+56],%r366; mov.u64 %r367,1; st.u64 [%r213+40],%r367; .loc 1 346 4 bra $L56; $L125: .loc 1 349 24 mov.u64 %r369369; call (%value_in),_gfortrani_xmallocarray70,[%value_in]; } .loc 1 349 22 st.u64 [%r213],%r370; bra $L81; $L71: .loc 1 354 19 ld.s8 %r372,[%r213+28]; .loc 1 354 10 setp.eq.u64 %r373,%r372,%r147; @ %r373 bra $L82; .loc 1 355 2 cvta.const.u64 %r374374_gfortran_runtime_error82: .loc 1 357 11 cvta.global.u64 %r376,_gfortrani_compile_options; .loc 1 357 10 ld.u32 %r377,[%r376+36]; setp.eq.u32 %r378,%r377,0; @ %r378 bra $L81; .loc 1 359 4 add.u64 %r447,%frame,360; cvta.const.u64 %r382,$LC2r4473382; call _gfortrani_bounds_ifunction_return361 4 cvta.const.u64 %r386,$LC386382; call _gfortrani_bounds_equal_extents$L81: .loc 1 366 3 setp.ne.u64 %r388,%r147,0; @ %r388 bra $L83; $L86: .loc 1 374 8 ld.u64 %r167,[%r213]; .loc 1 375 8 ld.u64 %r165,[%r214]; .loc 1 377 9 setp.ne.u64 %r389,%r165,0; @ %r389 bra $L84; bra $L56; $L83: add.u64 %r194,%frame,480; add.u64 %r193,%r213,40; add.u64 %r188,%frame,120; cvt.u32.u32 %r391,%r23; cvt.s64.s8 %r390,%r391; add.u64 %r174,%r390,-1; .loc 1 366 3 mov.u64 %r168,0; add.u64 %r447,%frame,360; .loc 1 368 16 mov.u64 %r392,%r168; $L85: st.u64 [%r194],%r392; .loc 1 369 18 ld.u64 %r393,[%r193]; st.u64 [%r188],%r393; .loc 1 370 17 shl.b64 %r395,%r168,3; add.u64 %r396,%r447,%r395; .loc 1 370 10 ld.u64 %r397,[%r396]; setp.le.s64 %r398,%r397,0; @ %r398 bra $L56; .loc 1 366 26 add.u64 %r168,%r168,1; .loc 1 366 3 add.u64 %r194,%r194,8; add.u64 %r193,%r193,24; add.u64 %r188,%r188,8; setp.ne.u64 %r399,%r168,%r174; @ %r399 bra $L85; bra $L86; $L84: .loc 1 300 12 mul.lo.u64 %r42,%r39,%r34; .loc 1 396 32 add.u64 %r87,%r151,%r151; ld.u64 %r68,[%frame+480]; .loc 1 440 22 ld.u64 %r97,[%frame+240]; .loc 1 440 12 add.u64 %r99,%r97,%r97; .loc 1 441 23 ld.u64 %r100,[%frame]; .loc 1 442 22 ld.u64 %r102,[%frame+120]; .loc 1 442 12 shl.b64 %r104,%r102,3; setp.le.s64 %r446,%r147,1; .loc 1 418 9 setp.ne.u32 %r454,%r217,0; cvt.u32.u32 %r455,%r23; cvt.s64.s8 %r456,%r455; add.u64 %r457,%r456,-1; add.u64 %r458,%frame,240; add.u64 %r459,%frame,360; add.u64 %r460,%frame,120; $L101: .loc 1 366 3 mov.u64 %r160,%r166; mov.u64 %r159,%r165; .loc 1 396 9 mov.u64 %r137,0; $L90: .loc 1 399 7 ld.s8 %r208,[%r160]; mov.u64 %r189,%r137; .loc 1 396 24 add.u64 %r137,%r137,1; .loc 1 399 6 setp.eq.u32 %r448,%r208,0; @ %r448 bra $L87; .loc 1 407 11 ld.s16 %r141,[%r159]; .loc 1 418 9 @ %r454 bra $L88; bra $L126; $L87: .loc 1 396 32 add.u64 %r159,%r159,%r87; .loc 1 396 47 add.u64 %r160,%r160,%r42; .loc 1 396 2 setp.ne.u64 %r404,%r148,%r137; @ %r404 bra $L90; .loc 1 395 9 mov.u64 %r137,0; bra $L91; $L126: .loc 1 428 10 setp.gt.s64 %r405,%r148,%r189; @ %r405 bra $L92; bra $L91; $L88: .loc 1 419 8 setp.le.s64 %r406,%r148,%r189; @ %r406 bra $L91; $L94: .loc 1 424 16 add.u64 %r189,%r189,1; .loc 1 421 8 @ %r448 bra $L93; .loc 1 421 18 ld.s16 %r88,[%r159]; .loc 1 421 15 cvt.u16.u32 %r408,%r88; cvt.u16.u32 %r409,%r141; setp.le.s16 %r410,%r408,%r409; selp.u32 %r141,%r88,%r141,%r410; selp.u64 %r137,%r189,%r137,%r410; $L93: .loc 1 419 33 add.u64 %r159,%r159,%r87; .loc 1 419 48 add.u64 %r160,%r160,%r42; .loc 1 419 8 setp.eq.u64 %r411,%r148,%r189; @ %r411 bra $L91; .loc 1 421 9 ld.s8 %r208,[%r160]; setp.eq.u32 %r448,%r208,0; bra $L94; $L92: .loc 1 
433 11 add.u64 %r189,%r189,1; .loc 1 430 10 ld.s8 %r413,[%r160]; cvt.u16.u32 %r412,%r413; setp.eq.u16 %r414,%r412,0; @ %r414 bra $L95; .loc 1 430 20 ld.s16 %r93,[%r159]; .loc 1 430 17 cvt.u16.u32 %r415,%r93; cvt.u16.u32 %r416,%r141; setp.lt.s16 %r417,%r415,%r416; selp.u32 %r141,%r93,%r141,%r417; selp.u64 %r137,%r189,%r137,%r417; $L95: .loc 1 428 35 add.u64 %r159,%r159,%r87; .loc 1 428 50 add.u64 %r160,%r160,%r42; .loc 1 428 10 setp.ne.u64 %r418,%r148,%r189; @ %r418 bra $L92; $L91: .loc 1 436 8 st.u64 [%r167],%r137; .loc 1 439 15 add.u64 %r68,%r68,1; .loc 1 440 12 add.u64 %r165,%r165,%r99; .loc 1 441 13 add.u64 %r166,%r166,%r100; .loc 1 442 12 add.u64 %r167,%r167,%r104; .loc 1 444 32 ld.u64 %r29,[%frame+360]; .loc 1 444 13 setp.ne.u64 %r419,%r29,%r68; @ %r419 bra $L101; .loc 1 451 23 mul.lo.u64 %r106,%r68,%r97; .loc 1 452 24 mul.lo.u64 %r420,%r68,%r100; .loc 1 452 10 sub.u64 %r162,%r166,%r420; .loc 1 453 23 mul.lo.u64 %r421,%r68,%r102; .loc 1 453 9 shl.b64 %r422,%r421,3; sub.u64 %r163,%r167,%r422; .loc 1 455 7 @ %r446 bra $L56; add.u64 %r204,%frame,488; mov.u64 %r203,8; .loc 1 454 5 mov.u64 %r164,1; .loc 1 448 13 mov.u64 %r453,0; bra $L98; $L99: st.u64 [%r204],%r453; .loc 1 451 23 mul.lo.u64 %r106,%r117,%r116; .loc 1 452 24 mul.lo.u64 %r428,%r119,%r116; .loc 1 452 10 sub.u64 %r162,%r166,%r428; .loc 1 453 23 mul.lo.u64 %r429,%r121,%r116; .loc 1 453 9 shl.b64 %r430,%r429,3; sub.u64 %r163,%r167,%r430; .loc 1 454 5 add.u64 %r164,%r164,1; .loc 1 455 7 add.u64 %r204,%r204,8; add.u64 %r203,%r203,8; setp.eq.u64 %r431,%r164,%r457; @ %r431 bra $L56; $L98: .loc 1 463 16 ld.u64 %r432,[%r204]; add.u64 %r116,%r432,1; st.u64 [%r204],%r116; .loc 1 464 23 add.u64 %r434,%r458,%r203; ld.u64 %r117,[%r434]; .loc 1 464 13 sub.u64 %r435,%r117,%r106; add.u64 %r436,%r435,%r435; add.u64 %r165,%r165,%r436; .loc 1 465 24 add.u64 %r437,%frame,%r203; ld.u64 %r119,[%r437]; .loc 1 465 14 add.u64 %r166,%r162,%r119; .loc 1 466 23 add.u64 %r439,%r460,%r203; ld.u64 %r121,[%r439]; .loc 1 466 13 shl.b64 %r440,%r121,3; add.u64 %r167,%r163,%r440; .loc 1 444 32 add.u64 %r442,%r459,%r203; ld.u64 %r125,[%r442]; .loc 1 444 13 setp.eq.u64 %r443,%r116,%r125; @ %r443 bra $L99; .loc 1 448 13 mov.u64 %r68,0; bra $L101; $L124: .loc 1 322 6 ld.u64 %r444,[%r213]; setp.eq.u64 %r445,%r444,0; @ ! 
%r445 bra $L71; bra $L100; $L56: .loc 1 4_gfortran_sminloc1_8_i2 .visible .func _gfortran_sminloc1_8_, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L128; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L129; $L128:r152154; call _gfortran_minloc1_8_500 7 bra $L127; $L129: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r296,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L131; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L132; $L138: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L133; bra $L186; $L131: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error132: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r302,0; $L137: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L135; .loc 1 515 17 st.u64 [%r139],%r36; bra $L136; $L135: .loc 1 518 12 st.u64 [%r139],%r302; $L136: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L137; bra $L138; $L186: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L140; bra $L139; $L133: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r301,0; $L143: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L141; .loc 1 523 17 st.u64 [%r43],%r41; bra $L142; $L141: .loc 1 527 12 st.u64 [%r43],%r301; $L142: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L143; bra $L187; $L139: .loc 1 534 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L145; add.u64 %r295,%frame,120; $L165: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L146; $L145: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L147; bra $L188; $L149: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L146: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L149; bra $L145; $L147: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L127; $L188: .loc 1 557 24 mov.u64 %r24696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L153: .loc 1 583 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L150; bra $L151; $L140: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L152; .loc 1 562 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error152: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L153; .loc 1 569 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L151; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L155: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L154; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error154: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L155; $L150: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ 
%r268 bra $L156; shl.b64 %r266,%r296,3; bra $L157; $L156: mov.u64 %r266,8; $L157: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L158: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L158; $L151: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,3; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 593 13 mov.u64 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L164: st.u64 [%r98],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L164; .loc 1 606 7 @ ! %r294 bra $L189; bra $L127; $L163: .loc 1 601 13 st.u64 [%r99],%r280; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L162; bra $L127; $L189: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; $L162: .loc 1 610 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,3; add.u64 %r98,%r98,%r291; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L163; .loc 1 601 13 mov.u64 %r80,0; bra $L164; $L187: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r293,%r98,0; @ ! %r293 bra $L140; bra $L165; $L127: .loc 1 615 minloc1_16_i2.o/1622802233_gfortran_minloc1_16_i2 .visible .func _gfortran_minloc1_16_i2.file 1 "../../../../libgfortran/generated/minloc1_16_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_16_i2 .visible .func _gfortran_mminloc1_16__gfortran_sminloc1_16_i2 .visible .func _gfortran_sminloc1_16_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 
},105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_16_i2 .visible .func _gfortran_minloc1_16_i2pred %r229; .reg .predpred %r240; .reg .predu16 %r278; .reg .u32 %r279; .reg .u32 %r280; .reg .u64 %r282; .reg .u64predpredu6432u64 %r328; .reg .u64 %r329; .reg .pred %r330; .reg .pred %r331; .reg .u64 %r333; .reg .u16 %r336; .reg .u16 %r337; .reg .predpredu64 %r399; .reg .u64 %r400; .reg .u64.loc 1 58 10 ld.s8 %r22,[%r191+28]; .loc 1 58 38 add.u32 %r194,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r118,%r194; .loc 1 59 10 ld.u64 %r25,[%r192]; .loc 1 59 7 add.u64 %r119,%r25,-1; .loc 1 61 7 shr.u64 %r196,%r119,63; set.u32.lt.s64 %r198,%r118,%r119; neg.s32 %r199,%r198; cvt.u16.u64 %r202,%r196;61%r206 bra $L2; .loc 1 63 7 cvt.u32.u32 %r209,%r22; cvt.s64.s8 %r208,%r209; st.u64 [%stack+8],%r208007_gfortran_runtime_errorr212,%r119,%r119; add.u64 %r213,%r212,%r119; shl.b64 %r214,%r213,3; add.u64 %r215,%r191,%r214; ld.u64 %r31,[%r215+56]; ld.u64 %r33,[%r215+48]; .loc 1 71 9 ld.u64 %r122,[%r215+40]; .loc 1 73 3 setp.ne.u64 %r229,%r119,0; @ %r229 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r230,%r118,%r119; @ %r230 bra $L4; bra $L52; $L3: add.u64 %r188,%r191,40; add.u64 %r184,%frame,120; add.u64 %r47,%frame,240; add.u64 %r231,%r191,16; add.u64 %r233,%r25,%r25; add.u64 %r234,%r233,%r25; shl.b64 %r235,%r234,3; add.u64 %r186,%r231,%r235; .loc 1 79 12 mov.u64 %r423,0; $L8: .loc 1 75 18 ld.u64 %r236,[%r188]; st.u64 [%r184],%r236; .loc 1 76 19 ld.u64 %r238,[%r188+16]; add.u64 %r237,%r238,1; ld.u64 %r239,[%r188+8]; sub.u64 %r38,%r237,%r239; .loc 1 78 10 setp.lt.s64 %r240,%r38,0; @ %r240 bra $L6; .loc 1 76 17 st.u64 [%r47],%r38; bra $L7; $L6: .loc 1 79 12 st.u64 [%r47],%r423; $L7: .loc 1 73 3 add.u64 %r188,%r188,24; add.u64 %r184,%r184,8; add.u64 %r47,%r47,8; setp.ne.u64 %r242,%r186,%r188; @ %r242 bra $L8; bra $L9; $L52: .loc 1 90 6 ld.u64 %r243,[%r190]; setp.eq.u64 %r244,%r243,0; @ ! 
%r244 bra $L11; bra $L10; $L4: add.u64 %r246,%r25,%r25; add.u64 %r247,%r246,%r25; shl.b64 %r248,%r247,3; add.u64 %r249,%r248,40; add.u64 %r110,%r191,%r249; shl.b64 %r250,%r25,3; add.u64 %r98,%r250,-8; add.u64 %r411,%frame,120; add.u64 %r100,%r411,%r98; add.u64 %r410,%frame,240; add.u64 %r96,%r410,%r98; cvt.u32.u32 %r254,%r22; cvt.s64.s8 %r253,%r254; add.u64 %r256,%r253,%r253; add.u64 %r257,%r256,%r253; shl.b64 %r258,%r257,3; add.u64 %r259,%r191,40; add.u64 %r135,%r258,%r259; .loc 1 87 12 mov.u64 %r422,0; $L14: .loc 1 83 18 ld.u64 %r260,[%r110]; st.u64 [%r100],%r260; .loc 1 84 19 ld.u64 %r262,[%r110+16]; add.u64 %r261,%r262,1; ld.u64 %r263,[%r110+8]; sub.u64 %r44,%r261,%r263; .loc 1 86 10 setp.lt.s64 %r264,%r44,0; @ %r264 bra $L12; .loc 1 84 17 st.u64 [%r96],%r44; bra $L13; $L12: .loc 1 87 12 st.u64 [%r96],%r422; $L13: .loc 1 81 3 add.u64 %r110,%r110,24; add.u64 %r100,%r100,8; add.u64 %r96,%r96,8; setp.ne.u64 %r266,%r110,%r135; @ %r266 bra $L14; bra $L53; $L10: .loc 1 94 7 setp.le.s64 %r267,%r118,0; @ %r267 bra $L16; add.u64 %r410,%frame,240; $L39: add.u64 %r146,%r190,40; mov.u64 %r143,%r410; cvt.u32.u32 %r269,%r22; cvt.s64.s8 %r268,%r269; add.u64 %r271,%r268,%r268; add.u64 %r272,%r271,%r268; shl.b64 %r273,%r272,3; add.u64 %r274,%r190,16; add.u64 %r111,%r273,%r274; .loc 1 97 10 mov.u64 %r102,1; .loc 1 101 4 mov.u64 %r297,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r275,0; st.u64 [%r190+8],%r275; .loc 1 106 28 cvt.u16.u32 %r278,%r22; add.u16 %r277,%r278,-1; cvt.u32.u16 %r279,%r277; st.u8 [%r190+28],%r279; .loc 1 108 20 add.u32 %r280,%r22,-2; cvt.s64.s32 %r53,%r280; add.u64 %r282,%r53,%r53; add.u64 %r283,%r282,%r53; shl.b64 %r284,%r283,3; add.u64 %r285,%r190,%r284; .loc 1 108 67 shl.b64 %r287,%r53,3; add.u64 %r288,%frame,%r287; .loc 1 108 59 ld.u64 %r290,[%r285+40]; ld.u64 %r291,[%r288+240]; mul.lo.u64 %r124,%r290,%r291; .loc 1 110 29 mov.u64 %r293,16293; call (%value_in),_gfortrani_xmallocarray294,[%value_in]; } .loc 1 110 27 st.u64 [%r190],%r294; .loc 1 111 10 setp.eq.u64 %r296,%r124,0; @ ! 
%r296 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r102,%r49,%r102; $L17: .loc 1 101 4 st.u64 [%r146+8],%r297; ld.u64 %r49,[%r143]; add.u64 %r298,%r49,-1; st.u64 [%r146+16],%r298; st.u64 [%r146],%r102; .loc 1 94 7 add.u64 %r146,%r146,24; add.u64 %r143,%r143,8; setp.ne.u64 %r299,%r111,%r146; @ %r299 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r190+48],%r124; mov.u64 %r301,-1; st.u64 [%r190+56],%r301; mov.u64 %r302,1; st.u64 [%r190+40],%r302; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r59,[%r190+28]; .loc 1 121 10 setp.eq.u64 %r303,%r59,%r118; @ %r303 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r59;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r306,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r307,[%r306+36]; setp.eq.u32 %r308,%r307,0; @ %r308 bra $L21; .loc 1 128 2 add.u64 %r410,%frame,240; cvta.const.u64 %r312,$LC2; cvta.const.u64 %r311,$LC3;_gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r314,%r118,0; @ %r314 bra $L24; $L27: .loc 1 68 9 add.u64 %r315,%r31,1; .loc 1 68 7 sub.u64 %r120,%r315,%r33; max.s64 %r82,%r120,0; .loc 1 140 8 ld.u64 %r131,[%r191]; .loc 1 141 8 ld.u64 %r132,[%r190]; .loc 1 190 33 add.u64 %r71,%r122,%r122; ld.u64 %r141,[%frame+360]; .loc 1 204 22 ld.u64 %r73,[%frame+120]; .loc 1 204 12 add.u64 %r74,%r73,%r73; .loc 1 205 22 ld.u64 %r76,[%frame]; .loc 1 205 12 shl.b64 %r77,%r76,4; .loc 1 207 32 ld.u64 %r66,[%frame+240]; .loc 1 214 23 mul.lo.u64 %r318,%r66,%r73; .loc 1 214 9 add.u64 %r319,%r318,%r318; neg.s64 %r140,%r319; .loc 1 215 23 mul.lo.u64 %r320,%r66,%r76; .loc 1 215 9 shl.b64 %r321,%r320,4; neg.s64 %r145,%r321; setp.gt.s64 %r412,%r120,0; setp.le.s64 %r413,%r118,1; .loc 1 180 9 setp.eq.u32 %r415,%r193,0; mov.u64 %r416,1; cvt.u32.u32 %r417,%r22; cvt.s64.s8 %r418,%r417; add.u64 %r419,%r418,-1; add.u64 %r420,%frame,120; add.u64 %r421,%frame,240; bra $L25; $L24: add.u64 %r168,%frame,360; add.u64 %r167,%r190,40; mov.u64 %r165,%frame; cvt.u32.u32 %r323,%r22; cvt.s64.s8 %r322,%r323; add.u64 %r148,%r322,-1; .loc 1 132 3 mov.u64 %r133,0; add.u64 %r410,%frame,240; .loc 1 134 16 mov.u64 %r324,%r133; $L26: st.u64 [%r168],%r324; .loc 1 135 18 ld.u64 %r325,[%r167]; st.u64 [%r165],%r325; .loc 1 136 17 shl.b64 %r327,%r133,3; add.u64 %r328,%r410,%r327; .loc 1 136 10 ld.u64 %r329,[%r328]; setp.le.s64 %r330,%r329,0; @ %r330 bra $L1; .loc 1 132 26 add.u64 %r133,%r133,1; .loc 1 132 3 add.u64 %r168,%r168,8; add.u64 %r167,%r167,24; add.u64 %r165,%r165,8; setp.ne.u64 %r331,%r133,%r148; @ %r331 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r412 bra $L28; .loc 1 159 10 mov.u64 %r333,0; st.u64 [%r132],%r333; st.u64 [%r132+8],%r333; bra $L29; $L28: .loc 1 180 9 @ %r415 bra $L40; mov.u64 %r127,%r131; mov.u64 %r400,%r416; mov.u64 %r401,0; .loc 1 155 9 mov.u32 %r108,32767; .loc 1 157 9 mov.u64 %r396,1; mov.u64 %r397,%r401; $L32: .loc 1 183 9 ld.s16 %r65,[%r127]; .loc 1 183 8 cvt.u16.u32 %r336,%r65; cvt.u16.u32 %r337,%r108; setp.gt.s16 %r338,%r336,%r337; .loc 1 186 16 selp.u64 %r396,%r396,%r400,%r338; selp.u64 %r397,%r397,%r401,%r338; selp.u32 %r108,%r108,%r65,%r338; .loc 1 181 33 add.u64 %r127,%r127,%r71; .loc 1 181 8 add.u64 %r341,%r400,%r416; set.u32.lt.u64 %r346,%r341,%r400; cvt.s64.s32 %r344,%r346; mov.u64 %r400,%r341; sub.u64 %r401,%r401,%r344; add.u64 %r351,%r400,-1; setp.gt.s64 %r353,%r82,%r351; @ %r353 bra $L32; bra $L33; $L40: mov.u64 %r101,%r131; mov.u64 %r398,%r416; mov.u64 %r399,0; .loc 1 155 9 mov.u32 %r109,32767; .loc 1 157 9 mov.u64 %r396,1; mov.u64 %r397,%r399; $L30: .loc 1 192 9 ld.s16 %r69,[%r101]; .loc 1 192 8 
cvt.u16.u32 %r354,%r69; cvt.u16.u32 %r355,%r109; setp.ge.s16 %r356,%r354,%r355; .loc 1 195 16 selp.u64 %r396,%r396,%r398,%r356; selp.u64 %r397,%r397,%r399,%r356; selp.u32 %r109,%r109,%r69,%r356; .loc 1 190 33 add.u64 %r101,%r101,%r71; .loc 1 190 8 add.u64 %r359,%r398,%r416; set.u32.lt.u64 %r364,%r359,%r398; cvt.s64.s32 %r362,%r364; mov.u64 %r398,%r359; sub.u64 %r399,%r399,%r362; add.u64 %r369,%r398,-1; setp.gt.s64 %r371,%r82,%r369; @ %r371 bra $L30; $L33: .loc 1 199 12 st.u64 [%r132],%r396; st.u64 [%r132+8],%r397; $L29: .loc 1 203 15 add.u64 %r141,%r141,1; .loc 1 204 12 add.u64 %r131,%r131,%r74; .loc 1 205 12 add.u64 %r132,%r132,%r77; .loc 1 207 13 setp.ne.u64 %r374,%r66,%r141; @ %r374 bra $L25; .loc 1 214 9 add.u64 %r128,%r131,%r140; .loc 1 215 9 add.u64 %r129,%r132,%r145; .loc 1 217 7 @ %r413 bra $L1; add.u64 %r157,%frame,368; mov.u64 %r117,8; .loc 1 216 5 mov.u64 %r130,1; .loc 1 211 13 mov.u64 %r414,0; bra $L37; $L38: st.u64 [%r157],%r414; .loc 1 214 23 mul.lo.u64 %r380,%r89,%r88; .loc 1 214 9 add.u64 %r381,%r380,%r380; sub.u64 %r128,%r131,%r381; .loc 1 215 23 mul.lo.u64 %r382,%r92,%r88; .loc 1 215 9 shl.b64 %r383,%r382,4; sub.u64 %r129,%r132,%r383; .loc 1 216 5 add.u64 %r130,%r130,1; .loc 1 217 7 add.u64 %r157,%r157,8; add.u64 %r117,%r117,8; setp.eq.u64 %r384,%r130,%r419; @ %r384 bra $L1; $L37: .loc 1 225 16 ld.u64 %r385,[%r157]; add.u64 %r88,%r385,1; st.u64 [%r157],%r88; .loc 1 226 23 add.u64 %r387,%r420,%r117; ld.u64 %r89,[%r387]; .loc 1 226 13 add.u64 %r388,%r89,%r89; add.u64 %r131,%r128,%r388; .loc 1 227 23 add.u64 %r389,%frame,%r117; ld.u64 %r92,[%r389]; .loc 1 227 13 shl.b64 %r390,%r92,4; add.u64 %r132,%r129,%r390; .loc 1 207 32 add.u64 %r392,%r421,%r117; ld.u64 %r95,[%r392]; .loc 1 207 13 setp.eq.u64 %r393,%r88,%r95; @ %r393 bra $L38; .loc 1 211 13 mov.u64 %r141,0; bra $L25; $L53: .loc 1 90 6 ld.u64 %r394,[%r190]; setp.eq.u64 %r395,%r394,0; @ ! 
%r395 bra $L11; bra $L39; $L1: .loc 1 231_gfortran_mminloc1_16_i2 .visible .func _gfortran_mminloc1_16_, .param .u32 %in_ar4) {608u32 %r94; .reg .u64 %r97u64u64 %r267; .reg .u64 %r274; .reg .pred %r275; .reg .u32 %r276; .reg .u32 %r277; .reg .u32u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .pred %r329; .reg .u64 %r330; .reg .pred %r331; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32 %r338; .reg .u64 %r339; .reg .u64 %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64pred %r355; .reg .pred64predpred %r408; .reg .pred %r409; .reg .u64 %r410; .reg .u32.reg .u64 %r417; .reg .pred %r418; .reg .predpred %r428; .reg .pred %r429; .reg .pred %r430; .reg .u64.reg .u32 %r456; .reg .u64 %r460; .reg .u16 %r462; .reg .u16 %r463; .reg .predpred %r479; .reg .pred %r482; .reg .u64 %r486; .reg .u64 %r489; .reg .u64 %r492; .reg .u64 %r495; .reg .u32u64 %r547; .reg .pred %r548; .reg .u64 %r549; .reg .u64 %r551; .reg .u64u64 %r557; .reg .u64u64 %r572; .reg .u64 %r583; .reg .u64 %r597; .reg .predpred %r609; .reg .u64 %r610; .reg .u32 %r611; .reg .u64 %r612; .reg .u64 %r613; .reg .u64 %r614; .reg .u64 %r615; .reg .u64 %r616; .reg .u64 %r617; mov.u64 %r233,%ar0; mov.u64 %r234,%ar1; mov.u64 %r235,%ar2; mov.u64 %r236,%ar3; mov.u32 %r237,%ar4; .loc 1 261 6 setp.ne.u64 %r238,%r236,0; @ %r238 bra $L3r2235237; call _gfortran_minloc1_16_268 7 bra $L54; $L55: .loc 1 271 10 ld.u64 %r22,[%r235]; .loc 1 271 7 add.u64 %r153,%r22,-1; .loc 1 272 10 ld.s8 %r23,[%r234+28]; .loc 1 272 38 add.u32 %r243,%r23,-1; .loc 1 272 8 cvt.s64.s32 %r154,%r243; .loc 1 275 7 shr.u64 %r245,%r153,63; set.u32.gt.s64 %r247,%r153,%r154; neg.s32 %r248,%r247; cvt.u16.u64 %r251,%r245; cvt.u16.u32 %r252,%r248; or.b16 %r250,%r251,%r252; .loc 1 275 6 cvt.u32.u16 %r253,%r250; cvt.u16.u8 %r254,%r253; setp.eq.u16 %r255,%r254,0; @ %r255 bra $L57; .loc 1 277 7 cvt.u32.u32 %r258,%r23; cvt.s64.s8 %r257,%r258; st.u64 [%stack+8],%r257; st.u64 [%stack],%r22; cvta.const.u64 %r22stack; call _gfortran_runtime_error7: .loc 1 282 9 add.u64 %r602,%r153,%r153; add.u64 %r607,%r602,%r153; shl.b64 %r263,%r607,3; add.u64 %r264,%r234,%r263; ld.u64 %r267,[%r264+56]; add.u64 %r266,%r267,1; .loc 1 282 7 ld.u64 %r274,[%r264+48]; sub.u64 %r155,%r266,%r274; .loc 1 283 6 setp.le.s64 %r275,%r155,0; @ %r275 bra $L54; .loc 1 286 9 ld.u64 %r173,[%r236]; .loc 1 288 15 ld.u64 %r35,[%r236+16]; .loc 1 290 22 cvt.u32.u64 %r36,%r35; .loc 1 290 53 add.u32 %r276,%r36,-4; and.b32 %r277,%r276,-5; set.u32.eq.u32 %r279,%r277,0; neg.s32 %r280,%r279; .loc 1 290 22 add.u32 %r281,%r36,-1; set.u32.le.u32 %r283,%r281,1; neg.s32 %r284,%r283; .loc 1 290 6 cvt.u16.u32 %r286,%r280; cvt.u16.u32 %r287,%r284; or.b16 %r285,%r286,%r287;59; .loc 1 292 7 setp.ne.u32 %r292,%r36,16; @ %r292 bra $L60; $L59: .loc 1 299 9 shl.b64 %r296,%r607,3; add.u64 %r297,%r234,%r296; ld.u64 %r158,[%r297+40]; .loc 1 300 12 add.u64 %r303,%r236,%r296; ld.u64 %r41,[%r303+40]; .loc 1 302 3 setp.ne.u64 %r305,%r153,0; @ %r305 bra $L61; $L67: .loc 1 312 3 setp.lt.s64 %r306,%r153,%r154; @ %r306 bra $L62; bra $L121; $L60: .loc 1 297 5 cvta.const.u64 %r307307_gfortran_runtime_error61: add.u64 %r164,%r234,40; add.u64 %r150,%r236,40; add.u64 %r309,%r234,16; add.u64 %r311,%r22,%r22; add.u64 %r312,%r311,%r22; shl.b64 %r313,%r312,3; add.u64 %r223,%r309,%r313; .loc 1 302 3 mov.u64 %r53,0; add.u64 %r603,%frame,240; add.u64 %r606,%frame,360; .loc 1 309 12 mov.u64 %r617,%r53; $L66: .loc 1 304 18 add.u64 %r315,%r603,%r53; ld.u64 %r316,[%r164]; st.u64 
[%r315],%r316; .loc 1 305 18 add.u64 %r317,%frame,%r53; .loc 1 305 20 ld.u64 %r319,[%r150]; mul.lo.u64 %r318,%r319,%r35; .loc 1 305 18 st.u64 [%r317],%r318; .loc 1 306 19 ld.u64 %r321,[%r164+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r164+8]; sub.u64 %r52,%r320,%r322; .loc 1 308 10 setp.lt.s64 %r323,%r52,0; @ %r323 bra $L64; .loc 1 306 17 add.u64 %r325,%r606,%r53; st.u64 [%r325],%r52; bra $L65; $L64: .loc 1 309 12 add.u64 %r327,%r606,%r53; st.u64 [%r327],%r617; $L65: .loc 1 302 3 add.u64 %r164,%r164,24; add.u64 %r150,%r150,24; add.u64 %r53,%r53,8; setp.ne.u64 %r329,%r164,%r223; @ %r329 bra $L66; bra $L67; $L121: .loc 1 322 6 ld.u64 %r330,[%r233]; setp.eq.u64 %r331,%r330,0; @ ! %r331 bra $L69; bra $L68; $L62: add.u64 %r333,%r22,%r22; add.u64 %r334,%r333,%r22; shl.b64 %r335,%r334,3; add.u64 %r178,%r335,40; add.u64 %r59,%r234,%r178; add.u64 %r180,%r236,%r178; shl.b64 %r336,%r22,3; add.u64 %r91,%r336,-8; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; shl.b64 %r339,%r337,3; add.u64 %r97,%r339,-8; add.u64 %r603,%frame,240; add.u64 %r606,%frame,360; .loc 1 319 12 mov.u64 %r616,0; $L72: .loc 1 314 18 add.u64 %r341,%r603,%r91; ld.u64 %r342,[%r59]; st.u64 [%r341],%r342; .loc 1 315 18 add.u64 %r343,%frame,%r91; .loc 1 315 20 ld.u64 %r345,[%r180]; mul.lo.u64 %r344,%r345,%r35; .loc 1 315 18 st.u64 [%r343],%r344; .loc 1 316 19 ld.u64 %r347,[%r59+16]; add.u64 %r346,%r347,1; ld.u64 %r348,[%r59+8]; sub.u64 %r66,%r346,%r348; .loc 1 318 10 setp.lt.s64 %r349,%r66,0; @ %r349 bra $L70; .loc 1 316 17 add.u64 %r351,%r606,%r91; st.u64 [%r351],%r66; bra $L71; $L70: .loc 1 319 12 add.u64 %r353,%r606,%r91; st.u64 [%r353],%r616; $L71: .loc 1 312 3 add.u64 %r59,%r59,24; add.u64 %r180,%r180,24; add.u64 %r91,%r91,8; setp.ne.u64 %r355,%r91,%r97; @ %r355 bra $L72; bra $L122; $L68: .loc 1 326 7 setp.eq.u64 %r356,%r154,0; @ %r356 bra $L74; add.u64 %r606,%frame,360; $L100: add.u64 %r137,%r233,40; mov.u64 %r134,%r606; cvt.u32.u32 %r358,%r23; cvt.s64.s8 %r357,%r358; add.u64 %r360,%r357,%r357; add.u64 %r361,%r360,%r357; shl.b64 %r362,%r361,3; add.u64 %r363,%r233,16; add.u64 %r64,%r362,%r363; .loc 1 329 10 mov.u64 %r136,1; .loc 1 333 4 mov.u64 %r382,0; bra $L75; $L74: .loc 1 337 20 add.u32 %r364,%r23,-2; cvt.s64.s32 %r75,%r364; add.u64 %r366,%r75,%r75; add.u64 %r367,%r366,%r75; shl.b64 %r368,%r367,3; add.u64 %r369,%r233,%r368; .loc 1 337 67 shl.b64 %r371,%r75,3; add.u64 %r372,%frame,%r371; .loc 1 337 59 ld.u64 %r374,[%r369+40]; ld.u64 %r375,[%r372+360]; mul.lo.u64 %r162,%r374,%r375; .loc 1 339 24 mov.u64 %r376,0; st.u64 [%r233+8],%r376; .loc 1 340 28 cvt.u16.u32 %r379,%r23; add.u16 %r378,%r379,-1; cvt.u32.u16 %r380,%r378; st.u8 [%r233+28],%r380; .loc 1 342 10 setp.eq.u64 %r381,%r162,0; @ %r381 bra $L76; bra $L123; $L78: .loc 1 331 47 mul.lo.u64 %r136,%r72,%r136; $L75: .loc 1 333 4 st.u64 [%r137+8],%r382; ld.u64 %r72,[%r134]; add.u64 %r383,%r72,-1; st.u64 [%r137+16],%r383; st.u64 [%r137],%r136; .loc 1 326 7 add.u64 %r137,%r137,24; add.u64 %r134,%r134,8; setp.ne.u64 %r384,%r64,%r137; @ %r384 bra $L78; bra $L74; $L76: .loc 1 345 4 st.u64 [%r233+48],%r162; mov.u64 %r386,-1; st.u64 [%r233+56],%r386; mov.u64 %r387,1; st.u64 [%r233+40],%r387; .loc 1 346 4 bra $L54; $L123: .loc 1 349 24 mov.u64 %r389,16389; call (%value_in),_gfortrani_xmallocarray90,[%value_in]; } .loc 1 349 22 st.u64 [%r233],%r390; bra $L79; $L69: .loc 1 354 19 ld.s8 %r392,[%r233+28]; .loc 1 354 10 setp.eq.u64 %r393,%r392,%r154; @ %r393 bra $L80; .loc 1 355 2 cvta.const.u64 %r3944_gfortran_runtime_error80: .loc 1 357 11 cvta.global.u64 %r396,_gfortrani_compile_options; 
.loc 1 357 10 ld.u32 %r397,[%r396+36]; setp.eq.u32 %r398,%r397,0; @ %r398 bra $L79; .loc 1 359 4 add.u64 %r606,%frame,360; cvta.const.u64 %r402,$LC24002; call _gfortrani_bounds_ifunction_return361 4 cvta.const.u64 %r406,$LC406402; call _gfortrani_bounds_equal_extents$L79: .loc 1 366 3 setp.ne.u64 %r408,%r154,0; @ %r408 bra $L81; $L84: .loc 1 374 8 ld.u64 %r174,[%r233]; .loc 1 375 8 ld.u64 %r172,[%r234]; .loc 1 377 9 setp.ne.u64 %r409,%r172,0; @ %r409 bra $L82; bra $L54; $L81: add.u64 %r188,%frame,480; add.u64 %r181,%r233,40; add.u64 %r149,%frame,120; cvt.u32.u32 %r411,%r23; cvt.s64.s8 %r410,%r411; add.u64 %r138,%r410,-1; .loc 1 366 3 mov.u64 %r175,0; add.u64 %r606,%frame,360; .loc 1 368 16 mov.u64 %r412,%r175; $L83: st.u64 [%r188],%r412; .loc 1 369 18 ld.u64 %r413,[%r181]; st.u64 [%r149],%r413; .loc 1 370 17 shl.b64 %r415,%r175,3; add.u64 %r416,%r606,%r415; .loc 1 370 10 ld.u64 %r417,[%r416]; setp.le.s64 %r418,%r417,0; @ %r418 bra $L54; .loc 1 366 26 add.u64 %r175,%r175,1; .loc 1 366 3 add.u64 %r188,%r188,8; add.u64 %r181,%r181,24; add.u64 %r149,%r149,8; setp.ne.u64 %r419,%r138,%r175; @ %r419 bra $L83; bra $L84; $L82: .loc 1 300 12 mul.lo.u64 %r43,%r41,%r35; .loc 1 396 32 add.u64 %r88,%r158,%r158; ld.u64 %r189,[%frame+480]; .loc 1 440 22 ld.u64 %r102,[%frame+240]; .loc 1 440 12 add.u64 %r103,%r102,%r102; .loc 1 441 23 ld.u64 %r105,[%frame]; .loc 1 442 22 ld.u64 %r107,[%frame+120]; .loc 1 442 12 shl.b64 %r108,%r107,4; .loc 1 444 32 ld.u64 %r29,[%frame+360]; .loc 1 451 23 mul.lo.u64 %r39,%r29,%r102; .loc 1 452 24 mul.lo.u64 %r422,%r29,%r105; .loc 1 452 10 neg.s64 %r92,%r422; .loc 1 453 23 mul.lo.u64 %r423,%r29,%r107; .loc 1 453 9 shl.b64 %r424,%r423,4; neg.s64 %r185,%r424; setp.le.s64 %r601,%r154,1; .loc 1 418 9 setp.ne.u32 %r609,%r237,0; add.u64 %r610,%r155,-1; cvt.u32.u32 %r611,%r23; cvt.s64.s8 %r612,%r611; add.u64 %r613,%r612,-1; add.u64 %r614,%frame,240; add.u64 %r615,%frame,360; $L101: .loc 1 366 3 mov.u64 %r168,%r173; mov.u64 %r167,%r172; .loc 1 396 9 mov.u64 %r197,0; $L88: .loc 1 399 7 ld.s8 %r224,[%r168]; mov.u64 %r230,%r197; .loc 1 396 24 add.u64 %r197,%r197,1; .loc 1 399 6 setp.eq.u32 %r604,%r224,0; @ %r604 bra $L85; .loc 1 407 11 ld.s16 %r145,[%r167]; .loc 1 408 11 mov.u64 %r563,%r197; shr.s64 %r564,%r563,63; .loc 1 418 9 @ %r609 bra $L86; bra $L124; $L85: .loc 1 396 32 add.u64 %r167,%r167,%r88; .loc 1 396 47 add.u64 %r168,%r168,%r43; .loc 1 396 2 setp.ne.u64 %r428,%r155,%r197; @ %r428 bra $L88; .loc 1 395 9 mov.u64 %r563,0; mov.u64 %r564,%r563; bra $L89; $L124: .loc 1 428 10 setp.gt.s64 %r429,%r155,%r230; @ %r429 bra $L90; bra $L89; $L86: .loc 1 419 8 setp.le.s64 %r430,%r155,%r230; @ %r430 bra $L89; add.u64 %r571,%r230,1; shr.s64 %r572,%r571,63; sub.u64 %r434,%r610,%r230; shr.s64 %r437,%r230,63; add.u64 %r440,%r230,2; set.u32.lt.u64 %r445,%r440,%r230; cvt.s64.s32 %r443,%r445; sub.u64 %r449,%r437,%r443; add.u64 %r451,%r434,%r440; set.u32.lt.u64 %r456,%r451,%r434; cvt.s64.s32 %r454,%r456; sub.u64 %r460,%r449,%r454; mov.u64 %r583,1; $L93: .loc 1 421 8 @ %r604 bra $L91; .loc 1 421 18 ld.s16 %r89,[%r167]; .loc 1 421 15 cvt.u16.u32 %r462,%r89; cvt.u16.u32 %r463,%r145; setp.gt.s16 %r464,%r462,%r463; .loc 1 424 16 selp.u64 %r563,%r563,%r571,%r464; selp.u64 %r564,%r564,%r572,%r464; selp.u32 %r145,%r145,%r89,%r464; $L91: .loc 1 419 33 add.u64 %r167,%r167,%r88; .loc 1 419 48 add.u64 %r168,%r168,%r43; .loc 1 419 8 add.u64 %r467,%r571,%r583; set.u32.lt.u64 %r472,%r467,%r571; cvt.s64.s32 %r470,%r472; mov.u64 %r571,%r467; sub.u64 %r572,%r572,%r470; setp.ne.u64 %r479,%r451,%r571; @ %r479 bra 
$L102; setp.ne.u64 %r482,%r460,%r572; @ ! %r482 bra $L89; $L102: .loc 1 421 9 ld.s8 %r224,[%r168]; setp.eq.u32 %r604,%r224,0; bra $L93; $L90: add.u64 %r567,%r230,1; shr.s64 %r568,%r567,63; sub.u64 %r486,%r610,%r230; shr.s64 %r489,%r230,63; add.u64 %r492,%r230,2; set.u32.lt.u64 %r497,%r492,%r230; cvt.s64.s32 %r495,%r497; sub.u64 %r501,%r489,%r495; add.u64 %r503,%r486,%r492; set.u32.lt.u64 %r508,%r503,%r486; cvt.s64.s32 %r506,%r508; sub.u64 %r512,%r501,%r506; .loc 1 428 10 mov.u64 %r597,1; $L116: .loc 1 430 10 ld.s8 %r514,[%r168]; cvt.u16.u32 %r513,%r514; setp.eq.u16 %r515,%r513,0; @ %r515 bra $L94; .loc 1 430 20 ld.s16 %r94,[%r167]; .loc 1 430 17 cvt.u16.u32 %r516,%r94; cvt.u16.u32 %r517,%r145; setp.ge.s16 %r518,%r516,%r517; .loc 1 433 11 selp.u64 %r563,%r563,%r567,%r518; selp.u64 %r564,%r564,%r568,%r518; selp.u32 %r145,%r145,%r94,%r518; $L94: .loc 1 428 35 add.u64 %r167,%r167,%r88; .loc 1 428 50 add.u64 %r168,%r168,%r43; .loc 1 428 10 add.u64 %r521,%r567,%r597; set.u32.lt.u64 %r526,%r521,%r567; cvt.s64.s32 %r524,%r526; mov.u64 %r567,%r521; sub.u64 %r568,%r568,%r524; setp.ne.u64 %r533,%r503,%r567; @ %r533 bra $L116; setp.ne.u64 %r536,%r512,%r568; @ %r536 bra $L116; $L89: .loc 1 436 8 st.u64 [%r174],%r563; st.u64 [%r174+8],%r564; .loc 1 439 15 add.u64 %r189,%r189,1; .loc 1 440 12 add.u64 %r172,%r172,%r103; .loc 1 441 13 add.u64 %r173,%r173,%r105; .loc 1 442 12 add.u64 %r174,%r174,%r108; .loc 1 444 13 setp.ne.u64 %r539,%r29,%r189; @ %r539 bra $L101; .loc 1 452 10 add.u64 %r169,%r173,%r92; .loc 1 453 9 add.u64 %r170,%r174,%r185; .loc 1 455 7 @ %r601 bra $L54; add.u64 %r219,%frame,488; .loc 1 451 23 mov.u64 %r112,%r39; .loc 1 455 7 mov.u64 %r218,8; .loc 1 454 5 mov.u64 %r171,1; add.u64 %r605,%frame,120; .loc 1 448 13 mov.u64 %r608,0; bra $L98; $L99: st.u64 [%r219],%r608; .loc 1 451 23 mul.lo.u64 %r112,%r125,%r124; .loc 1 452 24 mul.lo.u64 %r545,%r127,%r124; .loc 1 452 10 sub.u64 %r169,%r173,%r545; .loc 1 453 23 mul.lo.u64 %r546,%r129,%r124; .loc 1 453 9 shl.b64 %r547,%r546,4; sub.u64 %r170,%r174,%r547; .loc 1 454 5 add.u64 %r171,%r171,1; .loc 1 455 7 add.u64 %r219,%r219,8; add.u64 %r218,%r218,8; setp.eq.u64 %r548,%r171,%r613; @ %r548 bra $L54; $L98: .loc 1 463 16 ld.u64 %r549,[%r219]; add.u64 %r124,%r549,1; st.u64 [%r219],%r124; .loc 1 464 23 add.u64 %r551,%r614,%r218; ld.u64 %r125,[%r551]; .loc 1 464 13 sub.u64 %r552,%r125,%r112; add.u64 %r553,%r552,%r552; add.u64 %r172,%r172,%r553; .loc 1 465 24 add.u64 %r554,%frame,%r218; ld.u64 %r127,[%r554]; .loc 1 465 14 add.u64 %r173,%r169,%r127; .loc 1 466 23 add.u64 %r556,%r605,%r218; ld.u64 %r129,[%r556]; .loc 1 466 13 shl.b64 %r557,%r129,4; add.u64 %r174,%r170,%r557; .loc 1 444 32 add.u64 %r559,%r615,%r218; ld.u64 %r133,[%r559]; .loc 1 444 13 setp.eq.u64 %r560,%r124,%r133; @ %r560 bra $L99; .loc 1 448 13 mov.u64 %r189,0; bra $L101; $L122: .loc 1 322 6 ld.u64 %r561,[%r233]; setp.eq.u64 %r562,%r561,0; @ ! 
%r562 bra $L69; bra $L100; $L54: .loc 1 4_gfortran_sminloc1_16_i2 .visible .func _gfortran_sminloc1_16_, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L126; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L127; $L126:r152154; call _gfortran_minloc1_16_500 7 bra $L125; $L127: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r297,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L129; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L130; $L136: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L131; bra $L184; $L129: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error130: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r303,0; $L135: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L133; .loc 1 515 17 st.u64 [%r139],%r36; bra $L134; $L133: .loc 1 518 12 st.u64 [%r139],%r303; $L134: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L135; bra $L136; $L184: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L138; bra $L137; $L131: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r302,0; $L141: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L139; .loc 1 523 17 st.u64 [%r43],%r41; bra $L140; $L139: .loc 1 527 12 st.u64 [%r43],%r302; $L140: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L141; bra $L185; $L137: .loc 1 534 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L143; add.u64 %r296,%frame,120; $L163: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L144; $L143: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L145; bra $L186; $L147: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L144: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L147; bra $L143; $L145: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L125; $L186: .loc 1 557 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L151: .loc 1 583 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L148; bra $L149; $L138: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L150; .loc 1 562 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error150: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L151; .loc 1 569 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L149; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L153: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L152; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error152: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L153; $L148: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L154; shl.b64 %r266,%r297,3; bra $L155; $L154: mov.u64 %r266,8; $L155: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L156: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L156; $L149: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,4; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 593 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L162: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L162; .loc 1 606 7 @ ! %r295 bra $L187; bra $L125; $L161: .loc 1 601 13 st.u64 [%r99],%r280; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L160; bra $L125; $L187: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; $L160: .loc 1 610 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L161; .loc 1 601 13 mov.u64 %r80,0; bra $L162; $L185: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r294,%r98,0; @ ! 
%r294 bra $L138; bra $L163; $L125: .loc 1 615minloc1_4_i4.o/ 1622802233_gfortran_minloc1_4_i4 .visible .func _gfortran_minloc1_4_../libgfortran/generated/minloc1_4_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_4_i4 .visible .func _gfortran_mminloc1_4__gfortran_sminloc1_4_i4 .visible .func _gfortran_sminloc1_4_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_4_i4 .visible .func _gfortran_minloc1_4_i4predpred %r238; .reg .predpred %r262; .reg .pred %r264; .reg .pred %r265; .reg .u64u64predu64 %r318; .reg .u32u64 %r325; .reg .pred %r326; .reg .pred %r327; .reg .u32 %r329; .reg .predu32 %r335; .reg .pred %r336; .reg .predu64u64predmov.u64 %r188,%ar0; mov.u64 %r189,%ar1; mov.u64 %r190,%ar2; mov.u32 %r191,%ar3; .loc 1 58 10 ld.s8 %r22,[%r189+28]; .loc 1 58 38 add.u32 %r192,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r118,%r192; .loc 1 59 10 ld.u64 %r25,[%r190]; .loc 1 59 7 add.u64 %r119,%r25,-1; .loc 1 61 7 shr.u64 %r194,%r119,63; set.u32.lt.s64 %r196,%r118,%r119; neg.s32 %r197,%r196; cvt.u16.u64 %r200,%r194; cvt.u16.u32 %r201,%r197; or.b16 %r199,%r200,%r201; .loc 1 61 6 cvt.u32.u16 %r202,%r199; cvt.u16.u8 %r203,%r202; setp.eq.u16 %r204,%r203,0; @ %r204 bra $L2; .loc 1 63 7 cvt.u32.u32 %r207,%r22; cvt.s64.s8 %r206,%r207; st.u64 [%stack+8],%r20605205_gfortran_runtime_errorr210,%r119,%r119; add.u64 %r211,%r210,%r119; shl.b64 %r212,%r211,3; add.u64 %r213,%r189,%r212; ld.u64 %r30,[%r213+56]; ld.u64 %r32,[%r213+48]; .loc 1 71 9 ld.u64 %r122,[%r213+40]; .loc 1 73 3 setp.ne.u64 %r227,%r119,0; @ %r227 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r228,%r118,%r119; @ %r228 bra $L4; bra $L52; $L3: add.u64 %r138,%r189,40; add.u64 %r186,%frame,120; add.u64 %r185,%frame,240; add.u64 %r229,%r189,16; add.u64 %r231,%r25,%r25; add.u64 %r232,%r231,%r25; shl.b64 %r233,%r232,3; add.u64 %r145,%r229,%r233; .loc 1 79 12 mov.u64 %r372,0; $L8: .loc 1 75 18 ld.u64 %r234,[%r138]; st.u64 [%r186],%r234; .loc 1 76 19 ld.u64 %r236,[%r138+16]; add.u64 %r235,%r236,1; ld.u64 %r237,[%r138+8]; sub.u64 
%r37,%r235,%r237; .loc 1 78 10 setp.lt.s64 %r238,%r37,0; @ %r238 bra $L6; .loc 1 76 17 st.u64 [%r185],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 [%r185],%r372; $L7: .loc 1 73 3 add.u64 %r138,%r138,24; add.u64 %r186,%r186,8; add.u64 %r185,%r185,8; setp.ne.u64 %r240,%r138,%r145; @ %r240 bra $L8; bra $L9; $L52: .loc 1 90 6 ld.u64 %r241,[%r188]; setp.eq.u64 %r242,%r241,0; @ ! %r242 bra $L11; bra $L10; $L4: add.u64 %r244,%r25,%r25; add.u64 %r245,%r244,%r25; shl.b64 %r246,%r245,3; add.u64 %r247,%r246,40; add.u64 %r114,%r189,%r247; shl.b64 %r248,%r25,3; add.u64 %r102,%r248,-8; add.u64 %r362,%frame,120; add.u64 %r107,%r362,%r102; add.u64 %r361,%frame,240; add.u64 %r100,%r361,%r102; cvt.u32.u32 %r252,%r22; cvt.s64.s8 %r251,%r252; add.u64 %r254,%r251,%r251; add.u64 %r255,%r254,%r251; shl.b64 %r256,%r255,3; add.u64 %r257,%r189,40; add.u64 %r143,%r256,%r257; .loc 1 87 12 mov.u64 %r371,0; $L14: .loc 1 83 18 ld.u64 %r258,[%r114]; st.u64 [%r107],%r258; .loc 1 84 19 ld.u64 %r260,[%r114+16]; add.u64 %r259,%r260,1; ld.u64 %r261,[%r114+8]; sub.u64 %r43,%r259,%r261; .loc 1 86 10 setp.lt.s64 %r262,%r43,0; @ %r262 bra $L12; .loc 1 84 17 st.u64 [%r100],%r43; bra $L13; $L12: .loc 1 87 12 st.u64 [%r100],%r371; $L13: .loc 1 81 3 add.u64 %r114,%r114,24; add.u64 %r107,%r107,8; add.u64 %r100,%r100,8; setp.ne.u64 %r264,%r114,%r143; @ %r264 bra $L14; bra $L53; $L10: .loc 1 94 7 setp.le.s64 %r265,%r118,0; @ %r265 bra $L16; add.u64 %r361,%frame,240; $L39: add.u64 %r155,%r188,40; mov.u64 %r151,%r361; cvt.u32.u32 %r267,%r22; cvt.s64.s8 %r266,%r267; add.u64 %r269,%r266,%r266; add.u64 %r270,%r269,%r266; shl.b64 %r271,%r270,3; add.u64 %r272,%r188,16; add.u64 %r115,%r271,%r272; .loc 1 97 10 mov.u64 %r105,1; .loc 1 101 4 mov.u64 %r295,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r273,0; st.u64 [%r188+8],%r273; .loc 1 106 28 cvt.u16.u32 %r276,%r22; add.u16 %r275,%r276,-1; cvt.u32.u16 %r277,%r275; st.u8 [%r188+28],%r277; .loc 1 108 20 add.u32 %r278,%r22,-2; cvt.s64.s32 %r52,%r278; add.u64 %r280,%r52,%r52; add.u64 %r281,%r280,%r52; shl.b64 %r282,%r281,3; add.u64 %r283,%r188,%r282; .loc 1 108 67 shl.b64 %r285,%r52,3; add.u64 %r286,%frame,%r285; .loc 1 108 59 ld.u64 %r288,[%r283+40]; ld.u64 %r289,[%r286+240]; mul.lo.u64 %r124,%r288,%r289; .loc 1 110 29291; call (%value_in),_gfortrani_xmallocarray292,[%value_in]; } .loc 1 110 27 st.u64 [%r188],%r292; .loc 1 111 10 setp.eq.u64 %r294,%r124,0; @ ! 
%r294 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r105,%r48,%r105; $L17: .loc 1 101 4 st.u64 [%r155+8],%r295; ld.u64 %r48,[%r151]; add.u64 %r296,%r48,-1; st.u64 [%r155+16],%r296; st.u64 [%r155],%r105; .loc 1 94 7 add.u64 %r155,%r155,24; add.u64 %r151,%r151,8; setp.ne.u64 %r297,%r115,%r155; @ %r297 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r188+48],%r124; mov.u64 %r299,-1; st.u64 [%r188+56],%r299; mov.u64 %r300,1; st.u64 [%r188+40],%r300; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r59,[%r188+28]; .loc 1 121 10 setp.eq.u64 %r301,%r59,%r118; @ %r301 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r59;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r304,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r305,[%r304+36]; setp.eq.u32 %r306,%r305,0; @ %r306 bra $L21; .loc 1 128 2 add.u64 %r361,%frame,240; cvta.const.u64 %r310,$LC2r361309310; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r312,%r118,0; @ %r312 bra $L24; $L27: .loc 1 68 9 add.u64 %r313,%r30,1; .loc 1 68 7 sub.u64 %r120,%r313,%r32; max.s64 %r86,%r120,0; .loc 1 140 8 ld.u64 %r134,[%r189]; .loc 1 141 8 ld.u64 %r135,[%r188]; .loc 1 190 33 shl.b64 %r75,%r122,2; ld.u64 %r153,[%frame+360]; .loc 1 204 22 ld.u64 %r77,[%frame+120]; .loc 1 204 12 shl.b64 %r78,%r77,2; .loc 1 205 22 ld.u64 %r80,[%frame]; .loc 1 205 12 shl.b64 %r81,%r80,2; .loc 1 207 32 ld.u64 %r68,[%frame+240]; .loc 1 214 23 mul.lo.u64 %r314,%r68,%r77; .loc 1 214 9 shl.b64 %r315,%r314,2; neg.s64 %r142,%r315; .loc 1 215 23 mul.lo.u64 %r316,%r68,%r80; .loc 1 215 9 shl.b64 %r317,%r316,2; neg.s64 %r140,%r317; setp.gt.s64 %r359,%r120,0; setp.le.s64 %r360,%r118,1; .loc 1 180 9 setp.eq.u32 %r364,%r191,0; .loc 1 155 9 mov.u32 %r365,2147483647; cvt.u32.u32 %r366,%r22; cvt.s64.s8 %r367,%r366; add.u64 %r368,%r367,-1; add.u64 %r369,%frame,120; add.u64 %r370,%frame,240; bra $L25; $L24: add.u64 %r171,%frame,360; add.u64 %r170,%r188,40; mov.u64 %r168,%frame; cvt.u32.u32 %r319,%r22; cvt.s64.s8 %r318,%r319; add.u64 %r159,%r318,-1; .loc 1 132 3 mov.u64 %r136,0; add.u64 %r361,%frame,240; .loc 1 134 16 mov.u64 %r320,%r136; $L26: st.u64 [%r171],%r320; .loc 1 135 18 ld.u64 %r321,[%r170]; st.u64 [%r168],%r321; .loc 1 136 17 shl.b64 %r323,%r136,3; add.u64 %r324,%r361,%r323; .loc 1 136 10 ld.u64 %r325,[%r324]; setp.le.s64 %r326,%r325,0; @ %r326 bra $L1; .loc 1 132 26 add.u64 %r136,%r136,1; .loc 1 132 3 add.u64 %r171,%r171,8; add.u64 %r170,%r170,24; add.u64 %r168,%r168,8; setp.ne.u64 %r327,%r136,%r159; @ %r327 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r359 bra $L28; .loc 1 159 10 mov.u32 %r329,0; st.u32 [%r135],%r329; bra $L29; $L28: .loc 1 180 9 @ %r364 bra $L40; mov.u64 %r129,%r134; .loc 1 155 9 mov.u32 %r108,%r365; .loc 1 157 9 mov.u32 %r106,1; .loc 1 178 8 mov.u64 %r128,0; $L32: .loc 1 183 9 ld.u32 %r67,[%r129]; .loc 1 183 8 setp.gt.s32 %r331,%r67,%r108; @ %r331 bra $L31; cvt.u32.u64 %r332,%r128; add.u32 %r106,%r332,1; mov.u32 %r108,%r67; $L31: .loc 1 181 25 add.u64 %r128,%r128,1; .loc 1 181 33 add.u64 %r129,%r129,%r75; .loc 1 181 8 setp.gt.s64 %r333,%r86,%r128; @ %r333 bra $L32; bra $L33; $L40: mov.u64 %r126,%r134; .loc 1 155 9 mov.u32 %r110,%r365; .loc 1 157 9 mov.u32 %r106,1; .loc 1 178 8 mov.u64 %r104,0; $L30: .loc 1 192 9 ld.u32 %r72,[%r126]; .loc 1 192 8 setp.ge.s32 %r334,%r72,%r110; @ %r334 bra $L34; cvt.u32.u64 %r335,%r104; add.u32 %r106,%r335,1; mov.u32 %r110,%r72; $L34: .loc 1 190 25 add.u64 %r104,%r104,1; .loc 1 190 33 add.u64 %r126,%r126,%r75; .loc 1 190 8 setp.gt.s64 %r336,%r86,%r104; @ %r336 bra $L30; $L33: .loc 1 199 
12 st.u32 [%r135],%r106; $L29: .loc 1 203 15 add.u64 %r153,%r153,1; .loc 1 204 12 add.u64 %r134,%r134,%r78; .loc 1 205 12 add.u64 %r135,%r135,%r81; .loc 1 207 13 setp.ne.u64 %r337,%r68,%r153; @ %r337 bra $L25; .loc 1 214 9 add.u64 %r131,%r134,%r142; .loc 1 215 9 add.u64 %r132,%r135,%r140; .loc 1 217 7 @ %r360 bra $L1; add.u64 %r156,%frame,368; mov.u64 %r117,8; .loc 1 216 5 mov.u64 %r133,1; .loc 1 211 13 mov.u64 %r363,0; bra $L37; $L38: st.u64 [%r156],%r363; .loc 1 214 23 mul.lo.u64 %r343,%r93,%r92; .loc 1 214 9 shl.b64 %r344,%r343,2; sub.u64 %r131,%r134,%r344; .loc 1 215 23 mul.lo.u64 %r345,%r96,%r92; .loc 1 215 9 shl.b64 %r346,%r345,2; sub.u64 %r132,%r135,%r346; .loc 1 216 5 add.u64 %r133,%r133,1; .loc 1 217 7 add.u64 %r156,%r156,8; add.u64 %r117,%r117,8; setp.eq.u64 %r347,%r133,%r368; @ %r347 bra $L1; $L37: .loc 1 225 16 ld.u64 %r348,[%r156]; add.u64 %r92,%r348,1; st.u64 [%r156],%r92; .loc 1 226 23 add.u64 %r350,%r369,%r117; ld.u64 %r93,[%r350]; .loc 1 226 13 shl.b64 %r351,%r93,2; add.u64 %r134,%r131,%r351; .loc 1 227 23 add.u64 %r352,%frame,%r117; ld.u64 %r96,[%r352]; .loc 1 227 13 shl.b64 %r353,%r96,2; add.u64 %r135,%r132,%r353; .loc 1 207 32 add.u64 %r355,%r370,%r117; ld.u64 %r99,[%r355]; .loc 1 207 13 setp.eq.u64 %r356,%r92,%r99; @ %r356 bra $L38; .loc 1 211 13 mov.u64 %r153,0; bra $L25; $L53: .loc 1 90 6 ld.u64 %r357,[%r188]; setp.eq.u64 %r358,%r357,0; @ ! %r358 bra $L11; bra $L39; $L1: .loc 1 231_gfortran_mminloc1_4_i4 .visible .func _gfortran_mminloc1_4_, .param .u32 %in_ar4) {60898u16 %r238; .reg .u16u64 %r303; .reg .u64 %r304; .reg .u64 %r305; .reg .u64 %r306; .reg .u64pred %r317; .reg .u64 %r318; .reg .pred %r319; .reg .u64u32u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64u64predu32 %r385; .reg .predu64 %r394; .reg .predu64predu64 %r410; .reg .u32 %r412; .reg .pred %r414; .reg .pred %r415; .reg .pred %r416; .reg .predpred %r423; .reg .predpredpred %r446; .reg .u64 %r447; .reg .predmov.u64 %r221,%ar0; mov.u64 %r222,%ar1; mov.u64 %r223,%ar2; mov.u64 %r224,%ar3; mov.u32 %r225,%ar4; .loc 1 261 6 setp.ne.u64 %r226,%r224,0; @ %r226 bra $L223225; call _gfortran_minloc1_4_268 7 bra $L54; $L55: .loc 1 271 10 ld.u64 %r22,[%r223]; .loc 1 271 7 add.u64 %r151,%r22,-1; .loc 1 272 10 ld.s8 %r23,[%r222+28]; .loc 1 272 38 add.u32 %r231,%r23,-1; .loc 1 272 8 cvt.s64.s32 %r152,%r231; .loc 1 275 7 shr.u64 %r233,%r151,63; set.u32.gt.s64 %r235,%r151,%r152; neg.s32 %r236,%r235; cvt.u16.u64 %r239,%r233; cvt.u16.u32 %r240,%r236; or.b16 %r238,%r239,%r240; .loc 1 275 6 cvt.u32.u16 %r241,%r238; cvt.u16.u8 %r242,%r241; setp.eq.u16 %r243,%r242,0; @ %r243 bra $L57; .loc 1 277 7 cvt.u32.u32 %r246,%r23; cvt.s64.s8 %r245,%r246; st.u64 [%stack+8],%r245; st.u64 [%stack],%r22; cvta.const.u64 %r244244_gfortran_runtime_error7: .loc 1 282 9 add.u64 %r449,%r151,%r151; add.u64 %r455,%r449,%r151; shl.b64 %r251,%r455,3; add.u64 %r252,%r222,%r251; ld.u64 %r255,[%r252+56]; add.u64 %r254,%r255,1; .loc 1 282 7 ld.u64 %r262,[%r252+48]; sub.u64 %r153,%r254,%r262; .loc 1 283 6 setp.le.s64 %r263,%r153,0; @ %r263 bra $L54; .loc 1 286 9 ld.u64 %r169,[%r224]; .loc 1 288 15 ld.u64 %r34,[%r224+16]; .loc 1 290 22 cvt.u32.u64 %r35,%r34; .loc 1 290 53 add.u32 %r264,%r35,-4; and.b32 %r265,%r264,-5; set.u32.eq.u32 %r267,%r265,0; neg.s32 %r268,%r267; .loc 1 290 22 add.u32 %r269,%r35,-1; set.u32.le.u32 %r271,%r269,1; neg.s32 %r272,%r271; .loc 1 290 6 cvt.u16.u32 %r274,%r268; cvt.u16.u32 %r275,%r272; or.loc 1 292 7 setp.ne.u32 %r280,%r35,16; @ %r280 bra $L60; $L59: .loc 1 299 9 shl.b64 %r284,%r455,3; add.u64 %r285,%r222,%r284; ld.u64 
%r156,[%r285+40]; .loc 1 300 12 add.u64 %r291,%r224,%r284; ld.u64 %r41,[%r291+40]; .loc 1 302 3 setp.ne.u64 %r293,%r151,0; @ %r293 bra $L61; $L67: .loc 1 312 3 setp.lt.s64 %r294,%r151,%r152; @ %r294 bra $L62; bra $L119; $L60: .loc 1 297 5 cvta.const.u64 %r2955_gfortran_runtime_error61: add.u64 %r96,%r222,40; add.u64 %r68,%r224,40; add.u64 %r298,%r22,%r22; add.u64 %r299,%r298,%r22; shl.b64 %r300,%r299,3; add.u64 %r301,%r222,16; add.u64 %r39,%r300,%r301; .loc 1 302 3 mov.u64 %r139,0; add.u64 %r452,%frame,240; add.u64 %r451,%frame,360; .loc 1 309 12 mov.u64 %r465,%r139; $L66: .loc 1 304 18 add.u64 %r303,%r452,%r139; ld.u64 %r304,[%r96]; st.u64 [%r303],%r304; .loc 1 305 18 add.u64 %r305,%frame,%r139; .loc 1 305 20 ld.u64 %r307,[%r68]; mul.lo.u64 %r306,%r307,%r34; .loc 1 305 18 st.u64 [%r305],%r306; .loc 1 306 19 ld.u64 %r309,[%r96+16]; add.u64 %r308,%r309,1; ld.u64 %r310,[%r96+8]; sub.u64 %r53,%r308,%r310; .loc 1 308 10 setp.lt.s64 %r311,%r53,0; @ %r311 bra $L64; .loc 1 306 17 add.u64 %r313,%r451,%r139; st.u64 [%r313],%r53; bra $L65; $L64: .loc 1 309 12 add.u64 %r315,%r451,%r139; st.u64 [%r315],%r465; $L65: .loc 1 302 3 add.u64 %r96,%r96,24; add.u64 %r68,%r68,24; add.u64 %r139,%r139,8; setp.ne.u64 %r317,%r39,%r96; @ %r317 bra $L66; bra $L67; $L119: .loc 1 322 6 ld.u64 %r318,[%r221]; setp.eq.u64 %r319,%r318,0; @ ! %r319 bra $L69; bra $L68; $L62: add.u64 %r321,%r22,%r22; add.u64 %r322,%r321,%r22; shl.b64 %r323,%r322,3; add.u64 %r130,%r323,40; add.u64 %r134,%r222,%r130; add.u64 %r111,%r224,%r130; shl.b64 %r324,%r22,3; add.u64 %r58,%r324,-8; cvt.u32.u32 %r326,%r23; cvt.s64.s8 %r325,%r326; shl.b64 %r327,%r325,3; add.u64 %r69,%r327,-8; add.u64 %r452,%frame,240; add.u64 %r451,%frame,360; .loc 1 319 12 mov.u64 %r464,0; $L72: .loc 1 314 18 add.u64 %r329,%r452,%r58; ld.u64 %r330,[%r134]; st.u64 [%r329],%r330; .loc 1 315 18 add.u64 %r331,%frame,%r58; .loc 1 315 20 ld.u64 %r333,[%r111]; mul.lo.u64 %r332,%r333,%r34; .loc 1 315 18 st.u64 [%r331],%r332; .loc 1 316 19 ld.u64 %r335,[%r134+16]; add.u64 %r334,%r335,1; ld.u64 %r336,[%r134+8]; sub.u64 %r65,%r334,%r336; .loc 1 318 10 setp.lt.s64 %r337,%r65,0; @ %r337 bra $L70; .loc 1 316 17 add.u64 %r339,%r451,%r58; st.u64 [%r339],%r65; bra $L71; $L70: .loc 1 319 12 add.u64 %r341,%r451,%r58; st.u64 [%r341],%r464; $L71: .loc 1 312 3 add.u64 %r134,%r134,24; add.u64 %r111,%r111,24; add.u64 %r58,%r58,8; setp.ne.u64 %r343,%r69,%r58; @ %r343 bra $L72; bra $L120; $L68: .loc 1 326 7 setp.eq.u64 %r344,%r152,0; @ %r344 bra $L74; add.u64 %r451,%frame,360; $L99: add.u64 %r148,%r221,40; mov.u64 %r146,%r451; cvt.u32.u32 %r346,%r23; cvt.s64.s8 %r345,%r346; add.u64 %r348,%r345,%r345; add.u64 %r349,%r348,%r345; shl.b64 %r350,%r349,3; add.u64 %r351,%r221,16; add.u64 %r135,%r350,%r351; .loc 1 329 10 mov.u64 %r136,1; .loc 1 333 4 mov.u64 %r370,0; bra $L75; $L74: .loc 1 337 20 add.u32 %r352,%r23,-2; cvt.s64.s32 %r76,%r352; add.u64 %r354,%r76,%r76; add.u64 %r355,%r354,%r76; shl.b64 %r356,%r355,3; add.u64 %r357,%r221,%r356; .loc 1 337 67 shl.b64 %r359,%r76,3; add.u64 %r360,%frame,%r359; .loc 1 337 59 ld.u64 %r362,[%r357+40]; ld.u64 %r363,[%r360+360]; mul.lo.u64 %r160,%r362,%r363; .loc 1 339 24 mov.u64 %r364,0; st.u64 [%r221+8],%r364; .loc 1 340 28 cvt.u16.u32 %r367,%r23; add.u16 %r366,%r367,-1; cvt.u32.u16 %r368,%r366; st.u8 [%r221+28],%r368; .loc 1 342 10 setp.eq.u64 %r369,%r160,0; @ %r369 bra $L76; bra $L121; $L78: .loc 1 331 47 mul.lo.u64 %r136,%r73,%r136; $L75: .loc 1 333 4 st.u64 [%r148+8],%r370; ld.u64 %r73,[%r146]; add.u64 %r371,%r73,-1; st.u64 [%r148+16],%r371; st.u64 
[%r148],%r136; .loc 1 326 7 add.u64 %r148,%r148,24; add.u64 %r146,%r146,8; setp.ne.u64 %r372,%r135,%r148; @ %r372 bra $L78; bra $L74; $L76: .loc 1 345 4 st.u64 [%r221+48],%r160; mov.u64 %r374,-1; st.u64 [%r221+56],%r374; mov.u64 %r375,1; st.u64 [%r221+40],%r375; .loc 1 346 4 bra $L54; $L121: .loc 1 349 24call (%value_in),_gfortrani_xmallocarray78,[%value_in]; } .loc 1 349 22 st.u64 [%r221],%r378; bra $L79; $L69: .loc 1 354 19 ld.s8 %r380,[%r221+28]; .loc 1 354 10 setp.eq.u64 %r381,%r380,%r152; @ %r381 bra $L80; .loc 1 355 2 cvta.const.u64 %r382,$LC5; {stack; call _gfortran_runtime_error80: .loc 1 357 11 cvta.global.u64 %r384,_gfortrani_compile_options; .loc 1 357 10 ld.u32 %r385,[%r384+36]; setp.eq.u32 %r386,%r385,0; @ %r386 bra $L79; .loc 1 359 4 add.u64 %r451,%frame,360; cvta.const.u64 %r390,$LC2390; call _gfortrani_bounds_ifunction_return361 4 cvta.const.u64 %r394,$LC222394390; call _gfortrani_bounds_equal_extents$L79: .loc 1 366 3 setp.ne.u64 %r396,%r152,0; @ %r396 bra $L81; $L84: .loc 1 374 8 ld.u64 %r170,[%r221]; .loc 1 375 8 ld.u64 %r168,[%r222]; .loc 1 377 9 setp.ne.u64 %r397,%r168,0; @ %r397 bra $L82; bra $L54; $L81: add.u64 %r198,%frame,480; add.u64 %r197,%r221,40; add.u64 %r195,%frame,120; cvt.u32.u32 %r399,%r23; cvt.s64.s8 %r398,%r399; add.u64 %r164,%r398,-1; .loc 1 366 3 mov.u64 %r171,0; add.u64 %r451,%frame,360; .loc 1 368 16 mov.u64 %r400,%r171; $L83: st.u64 [%r198],%r400; .loc 1 369 18 ld.u64 %r401,[%r197]; st.u64 [%r195],%r401; .loc 1 370 17 shl.b64 %r403,%r171,3; add.u64 %r404,%r451,%r403; .loc 1 370 10 ld.u64 %r405,[%r404]; setp.le.s64 %r406,%r405,0; @ %r406 bra $L54; .loc 1 366 26 add.u64 %r171,%r171,1; .loc 1 366 3 add.u64 %r198,%r198,8; add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; setp.ne.u64 %r407,%r164,%r171; @ %r407 bra $L83; bra $L84; $L82: .loc 1 300 12 mul.lo.u64 %r44,%r41,%r34; .loc 1 396 32 shl.b64 %r91,%r156,2; ld.u64 %r187,[%frame+480]; .loc 1 440 22 ld.u64 %r101,[%frame+240]; .loc 1 440 12 shl.b64 %r102,%r101,2; .loc 1 441 23 ld.u64 %r104,[%frame]; .loc 1 442 22 ld.u64 %r106,[%frame+120]; .loc 1 442 12 shl.b64 %r107,%r106,2; .loc 1 444 32 ld.u64 %r29,[%frame+360]; .loc 1 451 23 mul.lo.u64 %r38,%r29,%r101; .loc 1 452 24 mul.lo.u64 %r408,%r29,%r104; .loc 1 452 10 neg.s64 %r137,%r408; .loc 1 453 23 mul.lo.u64 %r409,%r29,%r106; .loc 1 453 9 shl.b64 %r410,%r409,2; neg.s64 %r188,%r410; setp.le.s64 %r450,%r152,1; .loc 1 418 9 setp.ne.u32 %r457,%r225,0; cvt.u32.u32 %r458,%r23; cvt.s64.s8 %r459,%r458; add.u64 %r460,%r459,-1; add.u64 %r461,%frame,240; add.u64 %r462,%frame,360; add.u64 %r463,%frame,120; $L100: .loc 1 366 3 mov.u64 %r163,%r169; mov.u64 %r162,%r168; .loc 1 396 9 mov.u64 %r161,0; $L88: .loc 1 399 7 ld.s8 %r220,[%r163]; .loc 1 399 6 setp.eq.u32 %r453,%r220,0; @ %r453 bra $L85; .loc 1 407 11 ld.u32 %r145,[%r162]; .loc 1 408 11 cvt.u32.u64 %r412,%r161; add.u32 %r143,%r412,1; .loc 1 418 9 @ %r457 bra $L86; bra $L122; $L85: .loc 1 396 24 add.u64 %r161,%r161,1; .loc 1 396 32 add.u64 %r162,%r162,%r91; .loc 1 396 47 add.u64 %r163,%r163,%r44; .loc 1 396 2 setp.ne.u64 %r414,%r153,%r161; @ %r414 bra $L88; .loc 1 395 9 mov.u32 %r143,%r220; bra $L89; $L122: .loc 1 428 10 setp.gt.s64 %r415,%r153,%r161; @ %r415 bra $L90; bra $L89; $L86: .loc 1 419 8 setp.le.s64 %r416,%r153,%r161; @ %r416 bra $L89; add.u64 %r205,%r161,1; $L92: .loc 1 421 8 @ %r453 bra $L91; .loc 1 421 18 ld.u32 %r93,[%r162]; .loc 1 421 15 setp.gt.s32 %r418,%r93,%r145; @ %r418 bra $L91; .loc 1 424 16 cvt.u32.u64 %r143,%r205; mov.u32 %r145,%r93; $L91: .loc 1 419 33 add.u64 %r162,%r162,%r91; .loc 1 419 
48 add.u64 %r163,%r163,%r44; .loc 1 419 8 add.u64 %r206,%r205,1; setp.eq.u64 %r419,%r153,%r205; @ %r419 bra $L89; .loc 1 421 9 ld.s8 %r220,[%r163]; mov.u64 %r205,%r206; setp.eq.u32 %r453,%r220,0; bra $L92; $L90: add.u64 %r201,%r161,1; bra $L94; $L101: mov.u64 %r201,%r202; $L94: .loc 1 430 10 ld.s8 %r421,[%r163]; cvt.u16.u32 %r420,%r421; setp.eq.u16 %r422,%r420,0; @ %r422 bra $L93; .loc 1 430 20 ld.u32 %r98,[%r162]; .loc 1 430 17 setp.ge.s32 %r423,%r98,%r145; @ %r423 bra $L93; .loc 1 433 11 cvt.u32.u64 %r143,%r201; mov.u32 %r145,%r98; $L93: .loc 1 428 35 add.u64 %r162,%r162,%r91; .loc 1 428 50 add.u64 %r163,%r163,%r44; .loc 1 428 10 add.u64 %r202,%r201,1; setp.ne.u64 %r424,%r153,%r201; @ %r424 bra $L101; $L89: .loc 1 436 8 st.u32 [%r170],%r143; .loc 1 439 15 add.u64 %r187,%r187,1; .loc 1 440 12 add.u64 %r168,%r168,%r102; .loc 1 441 13 add.u64 %r169,%r169,%r104; .loc 1 442 12 add.u64 %r170,%r170,%r107; .loc 1 444 13 setp.ne.u64 %r425,%r29,%r187; @ %r425 bra $L100; .loc 1 452 10 add.u64 %r165,%r169,%r137; .loc 1 453 9 add.u64 %r166,%r170,%r188; .loc 1 455 7 @ %r450 bra $L54; add.u64 %r150,%frame,488; .loc 1 451 23 mov.u64 %r110,%r38; .loc 1 455 7 mov.u64 %r215,8; .loc 1 454 5 mov.u64 %r167,1; .loc 1 448 13 mov.u64 %r456,0; bra $L97; $L98: st.u64 [%r150],%r456; .loc 1 451 23 mul.lo.u64 %r110,%r123,%r122; .loc 1 452 24 mul.lo.u64 %r431,%r125,%r122; .loc 1 452 10 sub.u64 %r165,%r169,%r431; .loc 1 453 23 mul.lo.u64 %r432,%r127,%r122; .loc 1 453 9 shl.b64 %r433,%r432,2; sub.u64 %r166,%r170,%r433; .loc 1 454 5 add.u64 %r167,%r167,1; .loc 1 455 7 add.u64 %r150,%r150,8; add.u64 %r215,%r215,8; setp.eq.u64 %r434,%r167,%r460; @ %r434 bra $L54; $L97: .loc 1 463 16 ld.u64 %r435,[%r150]; add.u64 %r122,%r435,1; st.u64 [%r150],%r122; .loc 1 464 23 add.u64 %r437,%r461,%r215; ld.u64 %r123,[%r437]; .loc 1 464 13 sub.u64 %r438,%r123,%r110; shl.b64 %r439,%r438,2; add.u64 %r168,%r168,%r439; .loc 1 465 24 add.u64 %r440,%frame,%r215; ld.u64 %r125,[%r440]; .loc 1 465 14 add.u64 %r169,%r165,%r125; .loc 1 466 23 add.u64 %r442,%r463,%r215; ld.u64 %r127,[%r442]; .loc 1 466 13 shl.b64 %r443,%r127,2; add.u64 %r170,%r166,%r443; .loc 1 444 32 add.u64 %r445,%r462,%r215; ld.u64 %r131,[%r445]; .loc 1 444 13 setp.eq.u64 %r446,%r122,%r131; @ %r446 bra $L98; .loc 1 448 13 mov.u64 %r187,0; bra $L100; $L120: .loc 1 322 6 ld.u64 %r447,[%r221]; setp.eq.u64 %r448,%r447,0; @ ! 
%r448 bra $L69; bra $L99; $L54: .loc 1 4_gfortran_sminloc1_4_i4 .visible .func _gfortran_sminloc1_4_, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u32 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L124; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L125; $L124:r152154; call _gfortran_minloc1_4_500 7 bra $L123; $L125: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r296,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L127; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L128; $L134: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L129; bra $L182; $L127: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error128: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r302,0; $L133: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L131; .loc 1 515 17 st.u64 [%r139],%r36; bra $L132; $L131: .loc 1 518 12 st.u64 [%r139],%r302; $L132: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L133; bra $L134; $L182: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L136; bra $L135; $L129: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r301,0; $L139: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L137; .loc 1 523 17 st.u64 [%r43],%r41; bra $L138; $L137: .loc 1 527 12 st.u64 [%r43],%r301; $L138: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L139; bra $L183; $L135: .loc 1 534 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L141; add.u64 %r295,%frame,120; $L161: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L142; $L141: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L143; bra $L184; $L145: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L142: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L145; bra $L141; $L143: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L123; $L184: .loc 1 557 2496246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L149: .loc 1 583 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L146; bra $L147; $L136: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L148; .loc 1 562 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error148: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L149; .loc 1 569 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L147; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L151: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L150; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error150: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L151; $L146: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ %r268 bra 
$L152; shl.b64 %r266,%r296,3; bra $L153; $L152: mov.u64 %r266,8; $L153: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L154: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L154; $L147: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,2; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 593 13 mov.u32 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L160: st.u32 [%r98],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L160; .loc 1 606 7 @ ! %r294 bra $L185; bra $L123; $L159: .loc 1 601 13 st.u64 [%r99],%r297; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L158; bra $L123; $L185: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; .loc 1 601 13 mov.u64 %r297,0; $L158: .loc 1 610 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,2; add.u64 %r98,%r98,%r291; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L159; .loc 1 601 13 mov.u64 %r80,0; bra $L160; $L183: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r293,%r98,0; @ ! 
%r293 bra $L136; bra $L161; $L123: .loc 1 615minloc1_8_i4.o/ 1622802233_gfortran_minloc1_8_i4 .visible .func _gfortran_minloc1_8_../libgfortran/generated/minloc1_8_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_8_i4 .visible .func _gfortran_mminloc1_8__gfortran_sminloc1_8_i4 .visible .func _gfortran_sminloc1_8_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_8_i4 .visible .func _gfortran_minloc1_8_i4predpredpred %r235; .reg .predu64 %r266; .reg .u64 %r267; .reg .u64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64 %r283; .reg .u64predu64u64predpred %r326; .reg .predu64 %r338; .reg .u64pred %r353; .reg .pred %r354; .reg .pred64mov.u64 %r185,%ar0; mov.u64 %r186,%ar1; mov.u64 %r187,%ar2; mov.u32 %r188,%ar3; .loc 1 58 10 ld.s8 %r22,[%r186+28]; .loc 1 58 38 add.u32 %r189,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r120,%r189; .loc 1 59 10 ld.u64 %r25,[%r187]; .loc 1 59 7 add.u64 %r121,%r25,-1; .loc 1 61 7 shr.u64 %r191,%r121,63; set.u32.lt.s64 %r193,%r120,%r121; neg.s32 %r194,%r193; cvt.u16.u64 %r197,%r191; cvt.u16.u32 %r198,%r194; or.b16 %r196,%r197,%r198; .loc 1 61 6 cvt.u32.u16 %r199,%r196; cvt.u16.u8 %r200,%r199; setp.eq.u16 %r201,%r200,0; @ %r201 bra $L2; .loc 1 63 7 cvt.u32.u32 %r204,%r22; cvt.s64.s8 %r203,%r204; st.u64 [%stack+8],%r20302202_gfortran_runtime_errorr207,%r121,%r121; add.u64 %r208,%r207,%r121; shl.b64 %r209,%r208,3; add.u64 %r210,%r186,%r209; ld.u64 %r30,[%r210+56]; ld.u64 %r32,[%r210+48]; .loc 1 71 9 ld.u64 %r124,[%r210+40]; .loc 1 73 3 setp.ne.u64 %r224,%r121,0; @ %r224 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r225,%r120,%r121; @ %r225 bra $L4; bra $L54; $L3: add.u64 %r47,%r186,40; add.u64 %r119,%frame,120; add.u64 %r137,%frame,240; add.u64 %r226,%r186,16; add.u64 %r228,%r25,%r25; add.u64 %r229,%r228,%r25; shl.b64 %r230,%r229,3; add.u64 %r46,%r226,%r230; .loc 1 79 12 mov.u64 %r367,0; $L8: .loc 1 75 18 ld.u64 %r231,[%r47]; st.u64 [%r119],%r231; .loc 1 76 19 ld.u64 %r233,[%r47+16]; add.u64 %r232,%r233,1; ld.u64 %r234,[%r47+8]; sub.u64 
%r37,%r232,%r234; .loc 1 78 10 setp.lt.s64 %r235,%r37,0; @ %r235 bra $L6; .loc 1 76 17 st.u64 [%r137],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 [%r137],%r367; $L7: .loc 1 73 3 add.u64 %r47,%r47,24; add.u64 %r119,%r119,8; add.u64 %r137,%r137,8; setp.ne.u64 %r237,%r46,%r47; @ %r237 bra $L8; bra $L9; $L54: .loc 1 90 6 ld.u64 %r238,[%r185]; setp.eq.u64 %r239,%r238,0; @ ! %r239 bra $L11; bra $L10; $L4: add.u64 %r241,%r25,%r25; add.u64 %r242,%r241,%r25; shl.b64 %r243,%r242,3; add.u64 %r244,%r243,40; add.u64 %r118,%r186,%r244; shl.b64 %r245,%r25,3; add.u64 %r108,%r245,-8; add.u64 %r357,%frame,120; add.u64 %r112,%r357,%r108; add.u64 %r356,%frame,240; add.u64 %r104,%r356,%r108; cvt.u32.u32 %r249,%r22; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r254,%r186,40; add.u64 %r64,%r253,%r254; .loc 1 87 12 mov.u64 %r366,0; $L14: .loc 1 83 18 ld.u64 %r255,[%r118]; st.u64 [%r112],%r255; .loc 1 84 19 ld.u64 %r257,[%r118+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r118+8]; sub.u64 %r43,%r256,%r258; .loc 1 86 10 setp.lt.s64 %r259,%r43,0; @ %r259 bra $L12; .loc 1 84 17 st.u64 [%r104],%r43; bra $L13; $L12: .loc 1 87 12 st.u64 [%r104],%r366; $L13: .loc 1 81 3 add.u64 %r118,%r118,24; add.u64 %r112,%r112,8; add.u64 %r104,%r104,8; setp.ne.u64 %r261,%r64,%r118; @ %r261 bra $L14; bra $L55; $L10: .loc 1 94 7 setp.le.s64 %r262,%r120,0; @ %r262 bra $L16; add.u64 %r356,%frame,240; $L39: add.u64 %r156,%r185,40; mov.u64 %r153,%r356; cvt.u32.u32 %r264,%r22; cvt.s64.s8 %r263,%r264; add.u64 %r266,%r263,%r263; add.u64 %r267,%r266,%r263; shl.b64 %r268,%r267,3; add.u64 %r269,%r185,16; add.u64 %r130,%r268,%r269; .loc 1 97 10 mov.u64 %r107,1; .loc 1 101 4 mov.u64 %r292,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r270,0; st.u64 [%r185+8],%r270; .loc 1 106 28 cvt.u16.u32 %r273,%r22; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r185+28],%r274; .loc 1 108 20 add.u32 %r275,%r22,-2; cvt.s64.s32 %r54,%r275; add.u64 %r277,%r54,%r54; add.u64 %r278,%r277,%r54; shl.b64 %r279,%r278,3; add.u64 %r280,%r185,%r279; .loc 1 108 67 shl.b64 %r282,%r54,3; add.u64 %r283,%frame,%r282; .loc 1 108 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r126,%r285,%r286; .loc 1 110 29 mov.u64 %r288288; call (%value_in),_gfortrani_xmallocarray289,[%value_in]; } .loc 1 110 27 st.u64 [%r185],%r289; .loc 1 111 10 setp.eq.u64 %r291,%r126,0; @ ! 
%r291 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r107,%r50,%r107; $L17: .loc 1 101 4 st.u64 [%r156+8],%r292; ld.u64 %r50,[%r153]; add.u64 %r293,%r50,-1; st.u64 [%r156+16],%r293; st.u64 [%r156],%r107; .loc 1 94 7 add.u64 %r156,%r156,24; add.u64 %r153,%r153,8; setp.ne.u64 %r294,%r130,%r156; @ %r294 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r185+48],%r126; mov.u64 %r296,-1; st.u64 [%r185+56],%r296; mov.u64 %r297,1; st.u64 [%r185+40],%r297; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r60,[%r185+28]; .loc 1 121 10 setp.eq.u64 %r298,%r60,%r120; @ %r298 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r120; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r301,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r302,[%r301+36]; setp.eq.u32 %r303,%r302,0; @ %r303 bra $L21; .loc 1 128 2 add.u64 %r356,%frame,240; cvta.const.u64 %r307,$LC2r356306307; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r309,%r120,0; @ %r309 bra $L24; $L27: .loc 1 68 9 add.u64 %r310,%r30,1; .loc 1 68 7 sub.u64 %r122,%r310,%r32; max.s64 %r87,%r122,0; .loc 1 140 8 ld.u64 %r134,[%r186]; .loc 1 141 8 ld.u64 %r135,[%r185]; .loc 1 190 33 shl.b64 %r75,%r124,2; ld.u64 %r45,[%frame+360]; .loc 1 204 22 ld.u64 %r76,[%frame+120]; .loc 1 204 12 shl.b64 %r78,%r76,2; .loc 1 205 22 ld.u64 %r79,[%frame]; .loc 1 205 12 shl.b64 %r81,%r79,3; setp.gt.s64 %r354,%r122,0; setp.le.s64 %r355,%r120,1; .loc 1 180 9 setp.eq.u32 %r359,%r188,0; .loc 1 159 10 mov.u64 %r360,0; cvt.u32.u32 %r361,%r22; cvt.s64.s8 %r362,%r361; add.u64 %r363,%r362,-1; add.u64 %r364,%frame,120; add.u64 %r365,%frame,240; bra $L25; $L24: add.u64 %r170,%frame,360; add.u64 %r169,%r185,40; mov.u64 %r167,%frame; cvt.u32.u32 %r312,%r22; cvt.s64.s8 %r311,%r312; add.u64 %r158,%r311,-1; .loc 1 132 3 mov.u64 %r136,0; add.u64 %r356,%frame,240; .loc 1 134 16 mov.u64 %r313,%r136; $L26: st.u64 [%r170],%r313; .loc 1 135 18 ld.u64 %r314,[%r169]; st.u64 [%r167],%r314; .loc 1 136 17 shl.b64 %r316,%r136,3; add.u64 %r317,%r356,%r316; .loc 1 136 10 ld.u64 %r318,[%r317]; setp.le.s64 %r319,%r318,0; @ %r319 bra $L1; .loc 1 132 26 add.u64 %r136,%r136,1; .loc 1 132 3 add.u64 %r170,%r170,8; add.u64 %r169,%r169,24; add.u64 %r167,%r167,8; setp.ne.u64 %r320,%r136,%r158; @ %r320 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r354 bra $L28; .loc 1 159 10 st.u64 [%r135],%r360; bra $L29; $L28: .loc 1 180 9 @ %r359 bra $L40; mov.u64 %r129,%r134; .loc 1 155 9 mov.u32 %r111,2147483647; .loc 1 157 9 mov.u64 %r110,1; .loc 1 178 8 mov.u64 %r152,0; $L32: .loc 1 183 9 ld.u32 %r67,[%r129]; .loc 1 186 16 add.u64 %r152,%r152,1; .loc 1 183 8 setp.le.s32 %r324,%r67,%r111; selp.u32 %r111,%r67,%r111,%r324; selp.u64 %r110,%r152,%r110,%r324; .loc 1 181 33 add.u64 %r129,%r129,%r75; .loc 1 181 8 setp.gt.s64 %r325,%r87,%r152; @ %r325 bra $L32; bra $L33; $L40: mov.u64 %r127,%r134; .loc 1 155 9 mov.u32 %r113,2147483647; .loc 1 157 9 mov.u64 %r110,1; .loc 1 178 8 mov.u64 %r106,0; $L30: .loc 1 192 9 ld.u32 %r72,[%r127]; .loc 1 195 16 add.u64 %r106,%r106,1; .loc 1 192 8 setp.lt.s32 %r326,%r72,%r113; selp.u32 %r113,%r72,%r113,%r326; selp.u64 %r110,%r106,%r110,%r326; .loc 1 190 33 add.u64 %r127,%r127,%r75; .loc 1 190 8 setp.gt.s64 %r327,%r87,%r106; @ %r327 bra $L30; $L33: .loc 1 199 12 st.u64 [%r135],%r110; $L29: .loc 1 203 15 add.u64 %r45,%r45,1; .loc 1 204 12 add.u64 %r134,%r134,%r78; .loc 1 205 12 add.u64 %r135,%r135,%r81; .loc 1 207 32 ld.u64 %r68,[%frame+240]; .loc 1 207 13 setp.ne.u64 %r328,%r68,%r45; @ %r328 bra $L25; .loc 1 214 23 mul.lo.u64 %r329,%r45,%r76; .loc 1 214 9 
shl.b64 %r330,%r329,2; sub.u64 %r131,%r134,%r330; .loc 1 215 23 mul.lo.u64 %r331,%r45,%r79; .loc 1 215 9 shl.b64 %r332,%r331,3; sub.u64 %r132,%r135,%r332; .loc 1 217 7 @ %r355 bra $L1; add.u64 %r157,%frame,368; mov.u64 %r178,8; .loc 1 216 5 mov.u64 %r133,1; bra $L37; $L38: .loc 1 211 13 st.u64 [%r157],%r360; .loc 1 214 23 mul.lo.u64 %r338,%r94,%r93; .loc 1 214 9 shl.b64 %r339,%r338,2; sub.u64 %r131,%r134,%r339; .loc 1 215 23 mul.lo.u64 %r340,%r97,%r93; .loc 1 215 9 shl.b64 %r341,%r340,3; sub.u64 %r132,%r135,%r341; .loc 1 216 5 add.u64 %r133,%r133,1; .loc 1 217 7 add.u64 %r157,%r157,8; add.u64 %r178,%r178,8; setp.eq.u64 %r342,%r133,%r363; @ %r342 bra $L1; $L37: .loc 1 225 16 ld.u64 %r343,[%r157]; add.u64 %r93,%r343,1; st.u64 [%r157],%r93; .loc 1 226 23 add.u64 %r345,%r364,%r178; ld.u64 %r94,[%r345]; .loc 1 226 13 shl.b64 %r346,%r94,2; add.u64 %r134,%r131,%r346; .loc 1 227 23 add.u64 %r347,%frame,%r178; ld.u64 %r97,[%r347]; .loc 1 227 13 shl.b64 %r348,%r97,3; add.u64 %r135,%r132,%r348; .loc 1 207 32 add.u64 %r350,%r365,%r178; ld.u64 %r100,[%r350]; .loc 1 207 13 setp.eq.u64 %r351,%r93,%r100; @ %r351 bra $L38; .loc 1 211 13 mov.u64 %r45,0; bra $L25; $L55: .loc 1 90 6 ld.u64 %r352,[%r185]; setp.eq.u64 %r353,%r352,0; @ ! %r353 bra $L11; bra $L39; $L1: .loc 1 231_gfortran_mminloc1_8_i4 .visible .func _gfortran_mminloc1_8_, .param .u32 %in_ar4) {608pred %r218; .reg .u32 %r223; .reg .u6464u64pred %r373; .reg .u64 %r374; .reg .u64 %r376; .reg .u32 %r377; .reg .pred %r378; .reg .u64 %r381; .reg .u64 %r382; .reg .u64 %r386; .reg .pred %r388; .reg .pred %r389; .reg .u64 %r390; .reg .u32 %r391; .reg .u64pred %r398; .reg .pred %r399; .reg .pred %r402; .reg .pred %r403; .reg .pred %r404; .reg .pred %r406; .reg .pred %r407; .reg .u16 %r408; .reg .u32 %r409; .reg .pred %r410; .reg .pred %r411; .reg .pred %r412; .reg .pred.reg .u64 %r416; .reg .u64predpredmov.u64 %r213,%ar0; mov.u64 %r214,%ar1; mov.u64 %r215,%ar2; mov.u64 %r216,%ar3; mov.u32 %r217,%ar4; .loc 1 261 6 setp.ne.u64 %r218,%r216,0; @ %r218 bra $L1r214215217; call _gfortran_minloc1_8_268 7 bra $L56; $L57: .loc 1 271 10 ld.u64 %r22,[%r215]; .loc 1 271 7 add.u64 %r146,%r22,-1; .loc 1 272 10 ld.s8 %r23,[%r214+28]; .loc 1 272 38 add.u32 %r223,%r23,-1; .loc 1 272 8 cvt.s64.s32 %r147,%r223; .loc 1 275 7 shr.u64 %r225,%r146,63; set.u32.gt.s64 %r227,%r146,%r147; neg.s32 %r228,%r227; cvt.u16.u64 %r231,%r225; cvt.u16.u32 %r232,%r228; or.b16 %r230,%r231,%r232; .loc 1 275 6 cvt.u32.u16 %r233,%r230; cvt.u16.u8 %r234,%r233; setp.eq.u16 %r235,%r234,0; @ %r235 bra $L59; .loc 1 277 7 cvt.u32.u32 %r238,%r23; cvt.s64.s8 %r237,%r238; st.u64 [%stack+8],%r237; st.u64 [%stack],%r22; cvta.const.u64 %r236236_gfortran_runtime_error9: .loc 1 282 9 add.u64 %r445,%r146,%r146; add.u64 %r444,%r445,%r146; shl.b64 %r243,%r444,3; add.u64 %r244,%r214,%r243; ld.u64 %r247,[%r244+56]; add.u64 %r246,%r247,1; .loc 1 282 7 ld.u64 %r254,[%r244+48]; sub.u64 %r148,%r246,%r254; .loc 1 283 6 setp.le.s64 %r255,%r148,0; @ %r255 bra $L56; .loc 1 286 9 ld.u64 %r166,[%r216]; .loc 1 288 15 ld.u64 %r34,[%r216+16]; .loc 1 290 22 cvt.u32.u64 %r35,%r34; .loc 1 290 53 add.u32 %r256,%r35,-4; and.b32 %r257,%r256,-5; set.u32.eq.u32 %r259,%r257,0; neg.s32 %r260,%r259; .loc 1 290 22 add.u32 %r261,%r35,-1; set.u32.le.u32 %r263,%r261,1; neg.s32 %r264,%r263; .loc 1 290 6 cvt.u16.u32 %r266,%r260; cvt.u16.u32 %r267,%r264; or.b16 %r265,%r266,%r267; cvt.u32.u16 %r268,%r265; cvt.u16.u8 %r269,%r268; setp.ne.u16 %r270,%r269,0; @ %r270 bra $L61; .loc 1 292 7 setp.ne.u32 %r272,%r35,16; @ %r272 bra $L62; $L61: .loc 1 299 
9 shl.b64 %r276,%r444,3; add.u64 %r277,%r214,%r276; ld.u64 %r151,[%r277+40]; .loc 1 300 12 add.u64 %r283,%r216,%r276; ld.u64 %r39,[%r283+40]; .loc 1 302 3 setp.ne.u64 %r285,%r146,0; @ %r285 bra $L63; $L69: .loc 1 312 3 setp.lt.s64 %r286,%r146,%r147; @ %r286 bra $L64; bra $L123; $L62: .loc 1 297 5_gfortran_runtime_error63: add.u64 %r205,%r214,40; add.u64 %r67,%r216,40; add.u64 %r289,%r214,16; add.u64 %r291,%r22,%r22; add.u64 %r292,%r291,%r22; shl.b64 %r293,%r292,3; add.u64 %r210,%r289,%r293; .loc 1 302 3 mov.u64 %r91,0; add.u64 %r442,%frame,240; add.u64 %r440,%frame,360; .loc 1 309 12 mov.u64 %r456,%r91; $L68: .loc 1 304 18 add.u64 %r295,%r442,%r91; ld.u64 %r296,[%r205]; st.u64 [%r295],%r296; .loc 1 305 18 add.u64 %r297,%frame,%r91; .loc 1 305 20 ld.u64 %r299,[%r67]; mul.lo.u64 %r298,%r299,%r34; .loc 1 305 18 st.u64 [%r297],%r298; .loc 1 306 19 ld.u64 %r301,[%r205+16]; add.u64 %r300,%r301,1; ld.u64 %r302,[%r205+8]; sub.u64 %r51,%r300,%r302; .loc 1 308 10 setp.lt.s64 %r303,%r51,0; @ %r303 bra $L66; .loc 1 306 17 add.u64 %r305,%r440,%r91; st.u64 [%r305],%r51; bra $L67; $L66: .loc 1 309 12 add.u64 %r307,%r440,%r91; st.u64 [%r307],%r456; $L67: .loc 1 302 3 add.u64 %r205,%r205,24; add.u64 %r67,%r67,24; add.u64 %r91,%r91,8; setp.ne.u64 %r309,%r205,%r210; @ %r309 bra $L68; bra $L69; $L123: .loc 1 322 6 ld.u64 %r310,[%r213]; setp.eq.u64 %r311,%r310,0; @ ! %r311 bra $L71; bra $L70; $L64: add.u64 %r313,%r22,%r22; add.u64 %r314,%r313,%r22; shl.b64 %r315,%r314,3; add.u64 %r128,%r315,40; add.u64 %r133,%r214,%r128; add.u64 %r124,%r216,%r128; shl.b64 %r316,%r22,3; add.u64 %r83,%r316,-8; cvt.u32.u32 %r318,%r23; cvt.s64.s8 %r317,%r318; shl.b64 %r319,%r317,3; add.u64 %r209,%r319,-8; add.u64 %r442,%frame,240; add.u64 %r440,%frame,360; .loc 1 319 12 mov.u64 %r455,0; $L74: .loc 1 314 18 add.u64 %r321,%r442,%r83; ld.u64 %r322,[%r133]; st.u64 [%r321],%r322; .loc 1 315 18 add.u64 %r323,%frame,%r83; .loc 1 315 20 ld.u64 %r325,[%r124]; mul.lo.u64 %r324,%r325,%r34; .loc 1 315 18 st.u64 [%r323],%r324; .loc 1 316 19 ld.u64 %r327,[%r133+16]; add.u64 %r326,%r327,1; ld.u64 %r328,[%r133+8]; sub.u64 %r64,%r326,%r328; .loc 1 318 10 setp.lt.s64 %r329,%r64,0; @ %r329 bra $L72; .loc 1 316 17 add.u64 %r331,%r440,%r83; st.u64 [%r331],%r64; bra $L73; $L72: .loc 1 319 12 add.u64 %r333,%r440,%r83; st.u64 [%r333],%r455; $L73: .loc 1 312 3 add.u64 %r133,%r133,24; add.u64 %r124,%r124,24; add.u64 %r83,%r83,8; setp.ne.u64 %r335,%r83,%r209; @ %r335 bra $L74; bra $L124; $L70: .loc 1 326 7 setp.eq.u64 %r336,%r147,0; @ %r336 bra $L76; add.u64 %r440,%frame,360; $L100: add.u64 %r161,%r213,40; mov.u64 %r143,%r440; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; add.u64 %r340,%r337,%r337; add.u64 %r341,%r340,%r337; shl.b64 %r342,%r341,3; add.u64 %r343,%r213,16; add.u64 %r135,%r342,%r343; .loc 1 329 10 mov.u64 %r131,1; .loc 1 333 4 mov.u64 %r362,0; bra $L77; $L76: .loc 1 337 20 add.u32 %r344,%r23,-2; cvt.s64.s32 %r73,%r344; add.u64 %r346,%r73,%r73; add.u64 %r347,%r346,%r73; shl.b64 %r348,%r347,3; add.u64 %r349,%r213,%r348; .loc 1 337 67 shl.b64 %r351,%r73,3; add.u64 %r352,%frame,%r351; .loc 1 337 59 ld.u64 %r354,[%r349+40]; ld.u64 %r355,[%r352+360]; mul.lo.u64 %r155,%r354,%r355; .loc 1 339 24 mov.u64 %r356,0; st.u64 [%r213+8],%r356; .loc 1 340 28 cvt.u16.u32 %r359,%r23; add.u16 %r358,%r359,-1; cvt.u32.u16 %r360,%r358; st.u8 [%r213+28],%r360; .loc 1 342 10 setp.eq.u64 %r361,%r155,0; @ %r361 bra $L78; bra $L125; $L80: .loc 1 331 47 mul.lo.u64 %r131,%r70,%r131; $L77: .loc 1 333 4 st.u64 [%r161+8],%r362; ld.u64 %r70,[%r143]; add.u64 %r363,%r70,-1; 
st.u64 [%r161+16],%r363; st.u64 [%r161],%r131; .loc 1 326 7 add.u64 %r161,%r161,24; add.u64 %r143,%r143,8; setp.ne.u64 %r364,%r135,%r161; @ %r364 bra $L80; bra $L76; $L78: .loc 1 345 4 st.u64 [%r213+48],%r155; mov.u64 %r366,-1; st.u64 [%r213+56],%r366; mov.u64 %r367,1; st.u64 [%r213+40],%r367; .loc 1 346 4 bra $L56; $L125: .loc 1 349 24 mov.u64 %r369369; call (%value_in),_gfortrani_xmallocarray70,[%value_in]; } .loc 1 349 22 st.u64 [%r213],%r370; bra $L81; $L71: .loc 1 354 19 ld.s8 %r372,[%r213+28]; .loc 1 354 10 setp.eq.u64 %r373,%r372,%r147; @ %r373 bra $L82; .loc 1 355 2 cvta.const.u64 %r374374_gfortran_runtime_error82: .loc 1 357 11 cvta.global.u64 %r376,_gfortrani_compile_options; .loc 1 357 10 ld.u32 %r377,[%r376+36]; setp.eq.u32 %r378,%r377,0; @ %r378 bra $L81; .loc 1 359 4 add.u64 %r440,%frame,360; cvta.const.u64 %r382,$LC2r43382; call _gfortrani_bounds_ifunction_return361 4 cvta.const.u64 %r386,$LC386382; call _gfortrani_bounds_equal_extents$L81: .loc 1 366 3 setp.ne.u64 %r388,%r147,0; @ %r388 bra $L83; $L86: .loc 1 374 8 ld.u64 %r167,[%r213]; .loc 1 375 8 ld.u64 %r165,[%r214]; .loc 1 377 9 setp.ne.u64 %r389,%r165,0; @ %r389 bra $L84; bra $L56; $L83: add.u64 %r194,%frame,480; add.u64 %r193,%r213,40; add.u64 %r188,%frame,120; cvt.u32.u32 %r391,%r23; cvt.s64.s8 %r390,%r391; add.u64 %r174,%r390,-1; .loc 1 366 3 mov.u64 %r168,0; add.u64 %r440,%frame,360; .loc 1 368 16 mov.u64 %r392,%r168; $L85: st.u64 [%r194],%r392; .loc 1 369 18 ld.u64 %r393,[%r193]; st.u64 [%r188],%r393; .loc 1 370 17 shl.b64 %r395,%r168,3; add.u64 %r396,%r440,%r395; .loc 1 370 10 ld.u64 %r397,[%r396]; setp.le.s64 %r398,%r397,0; @ %r398 bra $L56; .loc 1 366 26 add.u64 %r168,%r168,1; .loc 1 366 3 add.u64 %r194,%r194,8; add.u64 %r193,%r193,24; add.u64 %r188,%r188,8; setp.ne.u64 %r399,%r168,%r174; @ %r399 bra $L85; bra $L86; $L84: .loc 1 300 12 mul.lo.u64 %r42,%r39,%r34; .loc 1 396 32 shl.b64 %r87,%r151,2; ld.u64 %r68,[%frame+480]; .loc 1 440 22 ld.u64 %r97,[%frame+240]; .loc 1 440 12 shl.b64 %r99,%r97,2; .loc 1 441 23 ld.u64 %r100,[%frame]; .loc 1 442 22 ld.u64 %r102,[%frame+120]; .loc 1 442 12 shl.b64 %r104,%r102,3; setp.le.s64 %r446,%r147,1; .loc 1 418 9 setp.ne.u32 %r448,%r217,0; cvt.u32.u32 %r449,%r23; cvt.s64.s8 %r450,%r449; add.u64 %r451,%r450,-1; add.u64 %r452,%frame,240; add.u64 %r453,%frame,360; add.u64 %r454,%frame,120; $L101: .loc 1 366 3 mov.u64 %r160,%r166; mov.u64 %r159,%r165; .loc 1 396 9 mov.u64 %r137,0; $L90: .loc 1 399 7 ld.s8 %r208,[%r160]; mov.u64 %r189,%r137; .loc 1 396 24 add.u64 %r137,%r137,1; .loc 1 399 6 setp.eq.u32 %r441,%r208,0; @ %r441 bra $L87; .loc 1 407 11 ld.u32 %r141,[%r159]; .loc 1 418 9 @ %r448 bra $L88; bra $L126; $L87: .loc 1 396 32 add.u64 %r159,%r159,%r87; .loc 1 396 47 add.u64 %r160,%r160,%r42; .loc 1 396 2 setp.ne.u64 %r402,%r148,%r137; @ %r402 bra $L90; .loc 1 395 9 mov.u64 %r137,0; bra $L91; $L126: .loc 1 428 10 setp.gt.s64 %r403,%r148,%r189; @ %r403 bra $L92; bra $L91; $L88: .loc 1 419 8 setp.le.s64 %r404,%r148,%r189; @ %r404 bra $L91; $L94: .loc 1 424 16 add.u64 %r189,%r189,1; .loc 1 421 8 @ %r441 bra $L93; .loc 1 421 18 ld.u32 %r88,[%r159]; .loc 1 421 15 setp.le.s32 %r406,%r88,%r141; selp.u32 %r141,%r88,%r141,%r406; selp.u64 %r137,%r189,%r137,%r406; $L93: .loc 1 419 33 add.u64 %r159,%r159,%r87; .loc 1 419 48 add.u64 %r160,%r160,%r42; .loc 1 419 8 setp.eq.u64 %r407,%r148,%r189; @ %r407 bra $L91; .loc 1 421 9 ld.s8 %r208,[%r160]; setp.eq.u32 %r441,%r208,0; bra $L94; $L92: .loc 1 433 11 add.u64 %r189,%r189,1; .loc 1 430 10 ld.s8 %r409,[%r160]; cvt.u16.u32 %r408,%r409; 
setp.eq.u16 %r410,%r408,0; @ %r410 bra $L95; .loc 1 430 20 ld.u32 %r93,[%r159]; .loc 1 430 17 setp.lt.s32 %r411,%r93,%r141; selp.u32 %r141,%r93,%r141,%r411; selp.u64 %r137,%r189,%r137,%r411; $L95: .loc 1 428 35 add.u64 %r159,%r159,%r87; .loc 1 428 50 add.u64 %r160,%r160,%r42; .loc 1 428 10 setp.ne.u64 %r412,%r148,%r189; @ %r412 bra $L92; $L91: .loc 1 436 8 st.u64 [%r167],%r137; .loc 1 439 15 add.u64 %r68,%r68,1; .loc 1 440 12 add.u64 %r165,%r165,%r99; .loc 1 441 13 add.u64 %r166,%r166,%r100; .loc 1 442 12 add.u64 %r167,%r167,%r104; .loc 1 444 32 ld.u64 %r29,[%frame+360]; .loc 1 444 13 setp.ne.u64 %r413,%r29,%r68; @ %r413 bra $L101; .loc 1 451 23 mul.lo.u64 %r106,%r68,%r97; .loc 1 452 24 mul.lo.u64 %r414,%r68,%r100; .loc 1 452 10 sub.u64 %r162,%r166,%r414; .loc 1 453 23 mul.lo.u64 %r415,%r68,%r102; .loc 1 453 9 shl.b64 %r416,%r415,3; sub.u64 %r163,%r167,%r416; .loc 1 455 7 @ %r446 bra $L56; add.u64 %r204,%frame,488; mov.u64 %r203,8; .loc 1 454 5 mov.u64 %r164,1; .loc 1 448 13 mov.u64 %r447,0; bra $L98; $L99: st.u64 [%r204],%r447; .loc 1 451 23 mul.lo.u64 %r106,%r117,%r116; .loc 1 452 24 mul.lo.u64 %r422,%r119,%r116; .loc 1 452 10 sub.u64 %r162,%r166,%r422; .loc 1 453 23 mul.lo.u64 %r423,%r121,%r116; .loc 1 453 9 shl.b64 %r424,%r423,3; sub.u64 %r163,%r167,%r424; .loc 1 454 5 add.u64 %r164,%r164,1; .loc 1 455 7 add.u64 %r204,%r204,8; add.u64 %r203,%r203,8; setp.eq.u64 %r425,%r164,%r451; @ %r425 bra $L56; $L98: .loc 1 463 16 ld.u64 %r426,[%r204]; add.u64 %r116,%r426,1; st.u64 [%r204],%r116; .loc 1 464 23 add.u64 %r428,%r452,%r203; ld.u64 %r117,[%r428]; .loc 1 464 13 sub.u64 %r429,%r117,%r106; shl.b64 %r430,%r429,2; add.u64 %r165,%r165,%r430; .loc 1 465 24 add.u64 %r431,%frame,%r203; ld.u64 %r119,[%r431]; .loc 1 465 14 add.u64 %r166,%r162,%r119; .loc 1 466 23 add.u64 %r433,%r454,%r203; ld.u64 %r121,[%r433]; .loc 1 466 13 shl.b64 %r434,%r121,3; add.u64 %r167,%r163,%r434; .loc 1 444 32 add.u64 %r436,%r453,%r203; ld.u64 %r125,[%r436]; .loc 1 444 13 setp.eq.u64 %r437,%r116,%r125; @ %r437 bra $L99; .loc 1 448 13 mov.u64 %r68,0; bra $L101; $L124: .loc 1 322 6 ld.u64 %r438,[%r213]; setp.eq.u64 %r439,%r438,0; @ ! 
%r439 bra $L71; bra $L100; $L56: .loc 1 4_gfortran_sminloc1_8_i4 .visible .func _gfortran_sminloc1_8_, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L128; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L129; $L128:r152154; call _gfortran_minloc1_8_500 7 bra $L127; $L129: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r296,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L131; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L132; $L138: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L133; bra $L186; $L131: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error132: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r302,0; $L137: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L135; .loc 1 515 17 st.u64 [%r139],%r36; bra $L136; $L135: .loc 1 518 12 st.u64 [%r139],%r302; $L136: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L137; bra $L138; $L186: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L140; bra $L139; $L133: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r301,0; $L143: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L141; .loc 1 523 17 st.u64 [%r43],%r41; bra $L142; $L141: .loc 1 527 12 st.u64 [%r43],%r301; $L142: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L143; bra $L187; $L139: .loc 1 534 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L145; add.u64 %r295,%frame,120; $L165: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L146; $L145: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L147; bra $L188; $L149: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L146: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L149; bra $L145; $L147: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L127; $L188: .loc 1 557 24 mov.u64 %r24696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L153: .loc 1 583 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L150; bra $L151; $L140: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L152; .loc 1 562 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error152: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L153; .loc 1 569 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L151; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L155: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L154; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error154: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L155; $L150: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ 
%r268 bra $L156; shl.b64 %r266,%r296,3; bra $L157; $L156: mov.u64 %r266,8; $L157: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L158: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L158; $L151: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,3; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 593 13 mov.u64 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L164: st.u64 [%r98],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L164; .loc 1 606 7 @ ! %r294 bra $L189; bra $L127; $L163: .loc 1 601 13 st.u64 [%r99],%r280; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L162; bra $L127; $L189: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; $L162: .loc 1 610 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,3; add.u64 %r98,%r98,%r291; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L163; .loc 1 601 13 mov.u64 %r80,0; bra $L164; $L187: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r293,%r98,0; @ ! %r293 bra $L140; bra $L165; $L127: .loc 1 615minloc1_16_i4.o/1622802233_gfortran_minloc1_16_i4 .visible .func _gfortran_minloc1_16_../libgfortran/generated/minloc1_16_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_16_i4 .visible .func _gfortran_mminloc1_16__gfortran_sminloc1_16_i4 .visible .func _gfortran_sminloc1_16_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 
},105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_16_i4 .visible .func _gfortran_minloc1_16_i4pred %r229; .reg .predpred %r240; .reg .predu16 %r278; .reg .u32 %r279; .reg .u32 %r280; .reg .u64 %r282; .reg .u64predu64 %r318; .reg .u64pred %r328; .reg .pred %r329; .reg .u64 %r331; .reg .pred %r334; .reg .u64 %r337; .reg .u64 %r340; .reg .u32 %r342; .reg .u64 %r347; .reg .pred %r349; .reg .predpred %r365; .reg .pred %r368; .reg .u64 %r374; .reg .u64 %r375; .reg .u64u64pred %r389; .reg .u64pred %r407; .reg .u64 %r408; .reg .pred %r409; .reg .u64 %r410; .reg .u32.loc 1 58 10 ld.s8 %r22,[%r191+28]; .loc 1 58 38 add.u32 %r194,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r118,%r194; .loc 1 59 10 ld.u64 %r25,[%r192]; .loc 1 59 7 add.u64 %r119,%r25,-1; .loc 1 61 7 shr.u64 %r196,%r119,63; set.u32.lt.s64 %r198,%r118,%r119; neg.s32 %r199,%r198; cvt.u16.u64 %r202,%r196;61%r206 bra $L2; .loc 1 63 7 cvt.u32.u32 %r209,%r22; cvt.s64.s8 %r208,%r209; st.u64 [%stack+8],%r208007_gfortran_runtime_errorr212,%r119,%r119; add.u64 %r213,%r212,%r119; shl.b64 %r214,%r213,3; add.u64 %r215,%r191,%r214; ld.u64 %r31,[%r215+56]; ld.u64 %r33,[%r215+48]; .loc 1 71 9 ld.u64 %r122,[%r215+40]; .loc 1 73 3 setp.ne.u64 %r229,%r119,0; @ %r229 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r230,%r118,%r119; @ %r230 bra $L4; bra $L52; $L3: add.u64 %r188,%r191,40; add.u64 %r184,%frame,120; add.u64 %r47,%frame,240; add.u64 %r231,%r191,16; add.u64 %r233,%r25,%r25; add.u64 %r234,%r233,%r25; shl.b64 %r235,%r234,3; add.u64 %r186,%r231,%r235; .loc 1 79 12 mov.u64 %r417,0; $L8: .loc 1 75 18 ld.u64 %r236,[%r188]; st.u64 [%r184],%r236; .loc 1 76 19 ld.u64 %r238,[%r188+16]; add.u64 %r237,%r238,1; ld.u64 %r239,[%r188+8]; sub.u64 %r38,%r237,%r239; .loc 1 78 10 setp.lt.s64 %r240,%r38,0; @ %r240 bra $L6; .loc 1 76 17 st.u64 [%r47],%r38; bra $L7; $L6: .loc 1 79 12 st.u64 [%r47],%r417; $L7: .loc 1 73 3 add.u64 %r188,%r188,24; add.u64 %r184,%r184,8; add.u64 %r47,%r47,8; setp.ne.u64 %r242,%r186,%r188; @ %r242 bra $L8; bra $L9; $L52: .loc 1 90 6 ld.u64 %r243,[%r190]; setp.eq.u64 %r244,%r243,0; @ ! 
%r244 bra $L11; bra $L10; $L4: add.u64 %r246,%r25,%r25; add.u64 %r247,%r246,%r25; shl.b64 %r248,%r247,3; add.u64 %r249,%r248,40; add.u64 %r110,%r191,%r249; shl.b64 %r250,%r25,3; add.u64 %r98,%r250,-8; add.u64 %r405,%frame,120; add.u64 %r100,%r405,%r98; add.u64 %r404,%frame,240; add.u64 %r96,%r404,%r98; cvt.u32.u32 %r254,%r22; cvt.s64.s8 %r253,%r254; add.u64 %r256,%r253,%r253; add.u64 %r257,%r256,%r253; shl.b64 %r258,%r257,3; add.u64 %r259,%r191,40; add.u64 %r135,%r258,%r259; .loc 1 87 12 mov.u64 %r416,0; $L14: .loc 1 83 18 ld.u64 %r260,[%r110]; st.u64 [%r100],%r260; .loc 1 84 19 ld.u64 %r262,[%r110+16]; add.u64 %r261,%r262,1; ld.u64 %r263,[%r110+8]; sub.u64 %r44,%r261,%r263; .loc 1 86 10 setp.lt.s64 %r264,%r44,0; @ %r264 bra $L12; .loc 1 84 17 st.u64 [%r96],%r44; bra $L13; $L12: .loc 1 87 12 st.u64 [%r96],%r416; $L13: .loc 1 81 3 add.u64 %r110,%r110,24; add.u64 %r100,%r100,8; add.u64 %r96,%r96,8; setp.ne.u64 %r266,%r110,%r135; @ %r266 bra $L14; bra $L53; $L10: .loc 1 94 7 setp.le.s64 %r267,%r118,0; @ %r267 bra $L16; add.u64 %r404,%frame,240; $L39: add.u64 %r146,%r190,40; mov.u64 %r143,%r404; cvt.u32.u32 %r269,%r22; cvt.s64.s8 %r268,%r269; add.u64 %r271,%r268,%r268; add.u64 %r272,%r271,%r268; shl.b64 %r273,%r272,3; add.u64 %r274,%r190,16; add.u64 %r111,%r273,%r274; .loc 1 97 10 mov.u64 %r102,1; .loc 1 101 4 mov.u64 %r297,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r275,0; st.u64 [%r190+8],%r275; .loc 1 106 28 cvt.u16.u32 %r278,%r22; add.u16 %r277,%r278,-1; cvt.u32.u16 %r279,%r277; st.u8 [%r190+28],%r279; .loc 1 108 20 add.u32 %r280,%r22,-2; cvt.s64.s32 %r53,%r280; add.u64 %r282,%r53,%r53; add.u64 %r283,%r282,%r53; shl.b64 %r284,%r283,3; add.u64 %r285,%r190,%r284; .loc 1 108 67 shl.b64 %r287,%r53,3; add.u64 %r288,%frame,%r287; .loc 1 108 59 ld.u64 %r290,[%r285+40]; ld.u64 %r291,[%r288+240]; mul.lo.u64 %r124,%r290,%r291; .loc 1 110 29 mov.u64 %r293,16293; call (%value_in),_gfortrani_xmallocarray294,[%value_in]; } .loc 1 110 27 st.u64 [%r190],%r294; .loc 1 111 10 setp.eq.u64 %r296,%r124,0; @ ! 
%r296 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r102,%r49,%r102; $L17: .loc 1 101 4 st.u64 [%r146+8],%r297; ld.u64 %r49,[%r143]; add.u64 %r298,%r49,-1; st.u64 [%r146+16],%r298; st.u64 [%r146],%r102; .loc 1 94 7 add.u64 %r146,%r146,24; add.u64 %r143,%r143,8; setp.ne.u64 %r299,%r111,%r146; @ %r299 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r190+48],%r124; mov.u64 %r301,-1; st.u64 [%r190+56],%r301; mov.u64 %r302,1; st.u64 [%r190+40],%r302; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r59,[%r190+28]; .loc 1 121 10 setp.eq.u64 %r303,%r59,%r118; @ %r303 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r59;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r306,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r307,[%r306+36]; setp.eq.u32 %r308,%r307,0; @ %r308 bra $L21; .loc 1 128 2 add.u64 %r404,%frame,240; cvta.const.u64 %r312,$LC2; cvta.const.u64 %r311,$LC3;_gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r314,%r118,0; @ %r314 bra $L24; $L27: .loc 1 68 9 add.u64 %r315,%r31,1; .loc 1 68 7 sub.u64 %r120,%r315,%r33; max.s64 %r82,%r120,0; .loc 1 140 8 ld.u64 %r131,[%r191]; .loc 1 141 8 ld.u64 %r132,[%r190]; .loc 1 190 33 shl.b64 %r71,%r122,2; ld.u64 %r141,[%frame+360]; .loc 1 204 22 ld.u64 %r73,[%frame+120]; .loc 1 204 12 shl.b64 %r74,%r73,2; .loc 1 205 22 ld.u64 %r76,[%frame]; .loc 1 205 12 shl.b64 %r77,%r76,4; .loc 1 207 32 ld.u64 %r66,[%frame+240]; .loc 1 214 23 mul.lo.u64 %r316,%r66,%r73; .loc 1 214 9 shl.b64 %r317,%r316,2; neg.s64 %r140,%r317; .loc 1 215 23 mul.lo.u64 %r318,%r66,%r76; .loc 1 215 9 shl.b64 %r319,%r318,4; neg.s64 %r145,%r319; setp.gt.s64 %r406,%r120,0; setp.le.s64 %r407,%r118,1; .loc 1 180 9 setp.eq.u32 %r409,%r193,0; mov.u64 %r410,1; cvt.u32.u32 %r411,%r22; cvt.s64.s8 %r412,%r411; add.u64 %r413,%r412,-1; add.u64 %r414,%frame,120; add.u64 %r415,%frame,240; bra $L25; $L24: add.u64 %r168,%frame,360; add.u64 %r167,%r190,40; mov.u64 %r165,%frame; cvt.u32.u32 %r321,%r22; cvt.s64.s8 %r320,%r321; add.u64 %r148,%r320,-1; .loc 1 132 3 mov.u64 %r133,0; add.u64 %r404,%frame,240; .loc 1 134 16 mov.u64 %r322,%r133; $L26: st.u64 [%r168],%r322; .loc 1 135 18 ld.u64 %r323,[%r167]; st.u64 [%r165],%r323; .loc 1 136 17 shl.b64 %r325,%r133,3; add.u64 %r326,%r404,%r325; .loc 1 136 10 ld.u64 %r327,[%r326]; setp.le.s64 %r328,%r327,0; @ %r328 bra $L1; .loc 1 132 26 add.u64 %r133,%r133,1; .loc 1 132 3 add.u64 %r168,%r168,8; add.u64 %r167,%r167,24; add.u64 %r165,%r165,8; setp.ne.u64 %r329,%r133,%r148; @ %r329 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r406 bra $L28; .loc 1 159 10 mov.u64 %r331,0; st.u64 [%r132],%r331; st.u64 [%r132+8],%r331; bra $L29; $L28: .loc 1 180 9 @ %r409 bra $L40; mov.u64 %r127,%r131; mov.u64 %r394,%r410; mov.u64 %r395,0; .loc 1 155 9 mov.u32 %r108,2147483647; .loc 1 157 9 mov.u64 %r390,1; mov.u64 %r391,%r395; $L32: .loc 1 183 9 ld.u32 %r65,[%r127]; .loc 1 183 8 setp.gt.s32 %r334,%r65,%r108; .loc 1 186 16 selp.u64 %r390,%r390,%r394,%r334; selp.u64 %r391,%r391,%r395,%r334; selp.u32 %r108,%r108,%r65,%r334; .loc 1 181 33 add.u64 %r127,%r127,%r71; .loc 1 181 8 add.u64 %r337,%r394,%r410; set.u32.lt.u64 %r342,%r337,%r394; cvt.s64.s32 %r340,%r342; mov.u64 %r394,%r337; sub.u64 %r395,%r395,%r340; add.u64 %r347,%r394,-1; setp.gt.s64 %r349,%r82,%r347; @ %r349 bra $L32; bra $L33; $L40: mov.u64 %r101,%r131; mov.u64 %r392,%r410; mov.u64 %r393,0; .loc 1 155 9 mov.u32 %r109,2147483647; .loc 1 157 9 mov.u64 %r390,1; mov.u64 %r391,%r393; $L30: .loc 1 192 9 ld.u32 %r69,[%r101]; .loc 1 192 8 setp.ge.s32 %r350,%r69,%r109; .loc 1 195 16 
selp.u64 %r390,%r390,%r392,%r350; selp.u64 %r391,%r391,%r393,%r350; selp.u32 %r109,%r109,%r69,%r350; .loc 1 190 33 add.u64 %r101,%r101,%r71; .loc 1 190 8 add.u64 %r353,%r392,%r410; set.u32.lt.u64 %r358,%r353,%r392; cvt.s64.s32 %r356,%r358; mov.u64 %r392,%r353; sub.u64 %r393,%r393,%r356; add.u64 %r363,%r392,-1; setp.gt.s64 %r365,%r82,%r363; @ %r365 bra $L30; $L33: .loc 1 199 12 st.u64 [%r132],%r390; st.u64 [%r132+8],%r391; $L29: .loc 1 203 15 add.u64 %r141,%r141,1; .loc 1 204 12 add.u64 %r131,%r131,%r74; .loc 1 205 12 add.u64 %r132,%r132,%r77; .loc 1 207 13 setp.ne.u64 %r368,%r66,%r141; @ %r368 bra $L25; .loc 1 214 9 add.u64 %r128,%r131,%r140; .loc 1 215 9 add.u64 %r129,%r132,%r145; .loc 1 217 7 @ %r407 bra $L1; add.u64 %r157,%frame,368; mov.u64 %r117,8; .loc 1 216 5 mov.u64 %r130,1; .loc 1 211 13 mov.u64 %r408,0; bra $L37; $L38: st.u64 [%r157],%r408; .loc 1 214 23 mul.lo.u64 %r374,%r89,%r88; .loc 1 214 9 shl.b64 %r375,%r374,2; sub.u64 %r128,%r131,%r375; .loc 1 215 23 mul.lo.u64 %r376,%r92,%r88; .loc 1 215 9 shl.b64 %r377,%r376,4; sub.u64 %r129,%r132,%r377; .loc 1 216 5 add.u64 %r130,%r130,1; .loc 1 217 7 add.u64 %r157,%r157,8; add.u64 %r117,%r117,8; setp.eq.u64 %r378,%r130,%r413; @ %r378 bra $L1; $L37: .loc 1 225 16 ld.u64 %r379,[%r157]; add.u64 %r88,%r379,1; st.u64 [%r157],%r88; .loc 1 226 23 add.u64 %r381,%r414,%r117; ld.u64 %r89,[%r381]; .loc 1 226 13 shl.b64 %r382,%r89,2; add.u64 %r131,%r128,%r382; .loc 1 227 23 add.u64 %r383,%frame,%r117; ld.u64 %r92,[%r383]; .loc 1 227 13 shl.b64 %r384,%r92,4; add.u64 %r132,%r129,%r384; .loc 1 207 32 add.u64 %r386,%r415,%r117; ld.u64 %r95,[%r386]; .loc 1 207 13 setp.eq.u64 %r387,%r88,%r95; @ %r387 bra $L38; .loc 1 211 13 mov.u64 %r141,0; bra $L25; $L53: .loc 1 90 6 ld.u64 %r388,[%r190]; setp.eq.u64 %r389,%r388,0; @ ! 
%r389 bra $L11; bra $L39; $L1: .loc 1 231_gfortran_mminloc1_16_i4 .visible .func _gfortran_mminloc1_16_, .param .u32 %in_ar4) {608u32 %r94; .reg .u64 %r97u64u64 %r267; .reg .u64 %r274; .reg .pred %r275; .reg .u32 %r276; .reg .u32 %r277; .reg .u32u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .pred %r329; .reg .u64 %r330; .reg .pred %r331; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32 %r338; .reg .u64 %r339; .reg .u64 %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64pred %r355; .reg .pred64predpred %r408; .reg .pred %r409; .reg .u64 %r410; .reg .u32.reg .u64 %r417; .reg .pred %r418; .reg .predpred %r428; .reg .u64 %r432; .reg .u64pred %r460; .reg .u64 %r463; .reg .u64 %r466; .reg .u32 %r468; .reg .pred %r475; .reg .pred.reg .u32 %r504; .reg .u64 %r508; .reg .u16 %r509; .reg .u32 %r510; .reg .pred %r511; .reg .pred %r512; .reg .u64 %r515; .reg .u64 %r518; .reg .u32 %r520; .reg .pred %r527; .reg .pred %r530; .reg .pred %r533; .reg .u64pred %r542; .reg .u64 %r543; .reg .u64pred.reg .u64 %r610; .reg .u64 %r611; mov.u64 %r233,%ar0; mov.u64 %r234,%ar1; mov.u64 %r235,%ar2; mov.u64 %r236,%ar3; mov.u32 %r237,%ar4; .loc 1 261 6 setp.ne.u64 %r238,%r236,0; @ %r238 bra $L3r2235237; call _gfortran_minloc1_16_268 7 bra $L54; $L55: .loc 1 271 10 ld.u64 %r22,[%r235]; .loc 1 271 7 add.u64 %r153,%r22,-1; .loc 1 272 10 ld.s8 %r23,[%r234+28]; .loc 1 272 38 add.u32 %r243,%r23,-1; .loc 1 272 8 cvt.s64.s32 %r154,%r243; .loc 1 275 7 shr.u64 %r245,%r153,63; set.u32.gt.s64 %r247,%r153,%r154; neg.s32 %r248,%r247; cvt.u16.u64 %r251,%r245; cvt.u16.u32 %r252,%r248; or.b16 %r250,%r251,%r252; .loc 1 275 6 cvt.u32.u16 %r253,%r250; cvt.u16.u8 %r254,%r253; setp.eq.u16 %r255,%r254,0; @ %r255 bra $L57; .loc 1 277 7 cvt.u32.u32 %r258,%r23; cvt.s64.s8 %r257,%r258; st.u64 [%stack+8],%r257; st.u64 [%stack],%r22; cvta.const.u64 %r22stack; call _gfortran_runtime_error7: .loc 1 282 9 add.u64 %r596,%r153,%r153; add.u64 %r601,%r596,%r153; shl.b64 %r263,%r601,3; add.u64 %r264,%r234,%r263; ld.u64 %r267,[%r264+56]; add.u64 %r266,%r267,1; .loc 1 282 7 ld.u64 %r274,[%r264+48]; sub.u64 %r155,%r266,%r274; .loc 1 283 6 setp.le.s64 %r275,%r155,0; @ %r275 bra $L54; .loc 1 286 9 ld.u64 %r173,[%r236]; .loc 1 288 15 ld.u64 %r35,[%r236+16]; .loc 1 290 22 cvt.u32.u64 %r36,%r35; .loc 1 290 53 add.u32 %r276,%r36,-4; and.b32 %r277,%r276,-5; set.u32.eq.u32 %r279,%r277,0; neg.s32 %r280,%r279; .loc 1 290 22 add.u32 %r281,%r36,-1; set.u32.le.u32 %r283,%r281,1; neg.s32 %r284,%r283; .loc 1 290 6 cvt.u16.u32 %r286,%r280; cvt.u16.u32 %r287,%r284; or.b16 %r285,%r286,%r287;59; .loc 1 292 7 setp.ne.u32 %r292,%r36,16; @ %r292 bra $L60; $L59: .loc 1 299 9 shl.b64 %r296,%r601,3; add.u64 %r297,%r234,%r296; ld.u64 %r158,[%r297+40]; .loc 1 300 12 add.u64 %r303,%r236,%r296; ld.u64 %r41,[%r303+40]; .loc 1 302 3 setp.ne.u64 %r305,%r153,0; @ %r305 bra $L61; $L67: .loc 1 312 3 setp.lt.s64 %r306,%r153,%r154; @ %r306 bra $L62; bra $L121; $L60: .loc 1 297 5 cvta.const.u64 %r307307_gfortran_runtime_error61: add.u64 %r164,%r234,40; add.u64 %r150,%r236,40; add.u64 %r309,%r234,16; add.u64 %r311,%r22,%r22; add.u64 %r312,%r311,%r22; shl.b64 %r313,%r312,3; add.u64 %r223,%r309,%r313; .loc 1 302 3 mov.u64 %r53,0; add.u64 %r597,%frame,240; add.u64 %r600,%frame,360; .loc 1 309 12 mov.u64 %r611,%r53; $L66: .loc 1 304 18 add.u64 %r315,%r597,%r53; ld.u64 %r316,[%r164]; st.u64 [%r315],%r316; .loc 1 305 18 add.u64 %r317,%frame,%r53; .loc 1 305 20 ld.u64 %r319,[%r150]; mul.lo.u64 %r318,%r319,%r35; .loc 
1 305 18 st.u64 [%r317],%r318; .loc 1 306 19 ld.u64 %r321,[%r164+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r164+8]; sub.u64 %r52,%r320,%r322; .loc 1 308 10 setp.lt.s64 %r323,%r52,0; @ %r323 bra $L64; .loc 1 306 17 add.u64 %r325,%r600,%r53; st.u64 [%r325],%r52; bra $L65; $L64: .loc 1 309 12 add.u64 %r327,%r600,%r53; st.u64 [%r327],%r611; $L65: .loc 1 302 3 add.u64 %r164,%r164,24; add.u64 %r150,%r150,24; add.u64 %r53,%r53,8; setp.ne.u64 %r329,%r164,%r223; @ %r329 bra $L66; bra $L67; $L121: .loc 1 322 6 ld.u64 %r330,[%r233]; setp.eq.u64 %r331,%r330,0; @ ! %r331 bra $L69; bra $L68; $L62: add.u64 %r333,%r22,%r22; add.u64 %r334,%r333,%r22; shl.b64 %r335,%r334,3; add.u64 %r178,%r335,40; add.u64 %r59,%r234,%r178; add.u64 %r180,%r236,%r178; shl.b64 %r336,%r22,3; add.u64 %r91,%r336,-8; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; shl.b64 %r339,%r337,3; add.u64 %r97,%r339,-8; add.u64 %r597,%frame,240; add.u64 %r600,%frame,360; .loc 1 319 12 mov.u64 %r610,0; $L72: .loc 1 314 18 add.u64 %r341,%r597,%r91; ld.u64 %r342,[%r59]; st.u64 [%r341],%r342; .loc 1 315 18 add.u64 %r343,%frame,%r91; .loc 1 315 20 ld.u64 %r345,[%r180]; mul.lo.u64 %r344,%r345,%r35; .loc 1 315 18 st.u64 [%r343],%r344; .loc 1 316 19 ld.u64 %r347,[%r59+16]; add.u64 %r346,%r347,1; ld.u64 %r348,[%r59+8]; sub.u64 %r66,%r346,%r348; .loc 1 318 10 setp.lt.s64 %r349,%r66,0; @ %r349 bra $L70; .loc 1 316 17 add.u64 %r351,%r600,%r91; st.u64 [%r351],%r66; bra $L71; $L70: .loc 1 319 12 add.u64 %r353,%r600,%r91; st.u64 [%r353],%r610; $L71: .loc 1 312 3 add.u64 %r59,%r59,24; add.u64 %r180,%r180,24; add.u64 %r91,%r91,8; setp.ne.u64 %r355,%r91,%r97; @ %r355 bra $L72; bra $L122; $L68: .loc 1 326 7 setp.eq.u64 %r356,%r154,0; @ %r356 bra $L74; add.u64 %r600,%frame,360; $L100: add.u64 %r137,%r233,40; mov.u64 %r134,%r600; cvt.u32.u32 %r358,%r23; cvt.s64.s8 %r357,%r358; add.u64 %r360,%r357,%r357; add.u64 %r361,%r360,%r357; shl.b64 %r362,%r361,3; add.u64 %r363,%r233,16; add.u64 %r64,%r362,%r363; .loc 1 329 10 mov.u64 %r136,1; .loc 1 333 4 mov.u64 %r382,0; bra $L75; $L74: .loc 1 337 20 add.u32 %r364,%r23,-2; cvt.s64.s32 %r75,%r364; add.u64 %r366,%r75,%r75; add.u64 %r367,%r366,%r75; shl.b64 %r368,%r367,3; add.u64 %r369,%r233,%r368; .loc 1 337 67 shl.b64 %r371,%r75,3; add.u64 %r372,%frame,%r371; .loc 1 337 59 ld.u64 %r374,[%r369+40]; ld.u64 %r375,[%r372+360]; mul.lo.u64 %r162,%r374,%r375; .loc 1 339 24 mov.u64 %r376,0; st.u64 [%r233+8],%r376; .loc 1 340 28 cvt.u16.u32 %r379,%r23; add.u16 %r378,%r379,-1; cvt.u32.u16 %r380,%r378; st.u8 [%r233+28],%r380; .loc 1 342 10 setp.eq.u64 %r381,%r162,0; @ %r381 bra $L76; bra $L123; $L78: .loc 1 331 47 mul.lo.u64 %r136,%r72,%r136; $L75: .loc 1 333 4 st.u64 [%r137+8],%r382; ld.u64 %r72,[%r134]; add.u64 %r383,%r72,-1; st.u64 [%r137+16],%r383; st.u64 [%r137],%r136; .loc 1 326 7 add.u64 %r137,%r137,24; add.u64 %r134,%r134,8; setp.ne.u64 %r384,%r64,%r137; @ %r384 bra $L78; bra $L74; $L76: .loc 1 345 4 st.u64 [%r233+48],%r162; mov.u64 %r386,-1; st.u64 [%r233+56],%r386; mov.u64 %r387,1; st.u64 [%r233+40],%r387; .loc 1 346 4 bra $L54; $L123: .loc 1 349 24 mov.u64 %r389,16389; call (%value_in),_gfortrani_xmallocarray90,[%value_in]; } .loc 1 349 22 st.u64 [%r233],%r390; bra $L79; $L69: .loc 1 354 19 ld.s8 %r392,[%r233+28]; .loc 1 354 10 setp.eq.u64 %r393,%r392,%r154; @ %r393 bra $L80; .loc 1 355 2 cvta.const.u64 %r3944_gfortran_runtime_error80: .loc 1 357 11 cvta.global.u64 %r396,_gfortrani_compile_options; .loc 1 357 10 ld.u32 %r397,[%r396+36]; setp.eq.u32 %r398,%r397,0; @ %r398 bra $L79; .loc 1 359 4 add.u64 %r600,%frame,360; 
cvta.const.u64 %r402,$LC2r600002; call _gfortrani_bounds_ifunction_return361 4 cvta.const.u64 %r406,$LC406402; call _gfortrani_bounds_equal_extents$L79: .loc 1 366 3 setp.ne.u64 %r408,%r154,0; @ %r408 bra $L81; $L84: .loc 1 374 8 ld.u64 %r174,[%r233]; .loc 1 375 8 ld.u64 %r172,[%r234]; .loc 1 377 9 setp.ne.u64 %r409,%r172,0; @ %r409 bra $L82; bra $L54; $L81: add.u64 %r188,%frame,480; add.u64 %r181,%r233,40; add.u64 %r149,%frame,120; cvt.u32.u32 %r411,%r23; cvt.s64.s8 %r410,%r411; add.u64 %r138,%r410,-1; .loc 1 366 3 mov.u64 %r175,0; add.u64 %r600,%frame,360; .loc 1 368 16 mov.u64 %r412,%r175; $L83: st.u64 [%r188],%r412; .loc 1 369 18 ld.u64 %r413,[%r181]; st.u64 [%r149],%r413; .loc 1 370 17 shl.b64 %r415,%r175,3; add.u64 %r416,%r600,%r415; .loc 1 370 10 ld.u64 %r417,[%r416]; setp.le.s64 %r418,%r417,0; @ %r418 bra $L54; .loc 1 366 26 add.u64 %r175,%r175,1; .loc 1 366 3 add.u64 %r188,%r188,8; add.u64 %r181,%r181,24; add.u64 %r149,%r149,8; setp.ne.u64 %r419,%r138,%r175; @ %r419 bra $L83; bra $L84; $L82: .loc 1 300 12 mul.lo.u64 %r43,%r41,%r35; .loc 1 396 32 shl.b64 %r88,%r158,2; ld.u64 %r189,[%frame+480]; .loc 1 440 22 ld.u64 %r102,[%frame+240]; .loc 1 440 12 shl.b64 %r103,%r102,2; .loc 1 441 23 ld.u64 %r105,[%frame]; .loc 1 442 22 ld.u64 %r107,[%frame+120]; .loc 1 442 12 shl.b64 %r108,%r107,4; .loc 1 444 32 ld.u64 %r29,[%frame+360]; .loc 1 451 23 mul.lo.u64 %r39,%r29,%r102; .loc 1 452 24 mul.lo.u64 %r420,%r29,%r105; .loc 1 452 10 neg.s64 %r92,%r420; .loc 1 453 23 mul.lo.u64 %r421,%r29,%r107; .loc 1 453 9 shl.b64 %r422,%r421,4; neg.s64 %r185,%r422; setp.le.s64 %r595,%r154,1; .loc 1 418 9 setp.ne.u32 %r603,%r237,0; add.u64 %r604,%r155,-1; cvt.u32.u32 %r605,%r23; cvt.s64.s8 %r606,%r605; add.u64 %r607,%r606,-1; add.u64 %r608,%frame,240; add.u64 %r609,%frame,360; $L101: .loc 1 366 3 mov.u64 %r168,%r173; mov.u64 %r167,%r172; .loc 1 396 9 mov.u64 %r197,0; $L88: .loc 1 399 7 ld.s8 %r224,[%r168]; mov.u64 %r230,%r197; .loc 1 396 24 add.u64 %r197,%r197,1; .loc 1 399 6 setp.eq.u32 %r598,%r224,0; @ %r598 bra $L85; .loc 1 407 11 ld.u32 %r145,[%r167]; .loc 1 408 11 mov.u64 %r557,%r197; shr.s64 %r558,%r557,63; .loc 1 418 9 @ %r603 bra $L86; bra $L124; $L85: .loc 1 396 32 add.u64 %r167,%r167,%r88; .loc 1 396 47 add.u64 %r168,%r168,%r43; .loc 1 396 2 setp.ne.u64 %r426,%r155,%r197; @ %r426 bra $L88; .loc 1 395 9 mov.u64 %r557,0; mov.u64 %r558,%r557; bra $L89; $L124: .loc 1 428 10 setp.gt.s64 %r427,%r155,%r230; @ %r427 bra $L90; bra $L89; $L86: .loc 1 419 8 setp.le.s64 %r428,%r155,%r230; @ %r428 bra $L89; add.u64 %r565,%r230,1; shr.s64 %r566,%r565,63; sub.u64 %r432,%r604,%r230; shr.s64 %r435,%r230,63; add.u64 %r438,%r230,2; set.u32.lt.u64 %r443,%r438,%r230; cvt.s64.s32 %r441,%r443; sub.u64 %r447,%r435,%r441; add.u64 %r449,%r432,%r438; set.u32.lt.u64 %r454,%r449,%r432; cvt.s64.s32 %r452,%r454; sub.u64 %r458,%r447,%r452; mov.u64 %r577,1; $L93: .loc 1 421 8 @ %r598 bra $L91; .loc 1 421 18 ld.u32 %r89,[%r167]; .loc 1 421 15 setp.gt.s32 %r460,%r89,%r145; .loc 1 424 16 selp.u64 %r557,%r557,%r565,%r460; selp.u64 %r558,%r558,%r566,%r460; selp.u32 %r145,%r145,%r89,%r460; $L91: .loc 1 419 33 add.u64 %r167,%r167,%r88; .loc 1 419 48 add.u64 %r168,%r168,%r43; .loc 1 419 8 add.u64 %r463,%r565,%r577; set.u32.lt.u64 %r468,%r463,%r565; cvt.s64.s32 %r466,%r468; mov.u64 %r565,%r463; sub.u64 %r566,%r566,%r466; setp.ne.u64 %r475,%r449,%r565; @ %r475 bra $L102; setp.ne.u64 %r478,%r458,%r566; @ ! 
%r478 bra $L89; $L102: .loc 1 421 9 ld.s8 %r224,[%r168]; setp.eq.u32 %r598,%r224,0; bra $L93; $L90: add.u64 %r561,%r230,1; shr.s64 %r562,%r561,63; sub.u64 %r482,%r604,%r230; shr.s64 %r485,%r230,63; add.u64 %r488,%r230,2; set.u32.lt.u64 %r493,%r488,%r230; cvt.s64.s32 %r491,%r493; sub.u64 %r497,%r485,%r491; add.u64 %r499,%r482,%r488; set.u32.lt.u64 %r504,%r499,%r482; cvt.s64.s32 %r502,%r504; sub.u64 %r508,%r497,%r502; .loc 1 428 10 mov.u64 %r591,1; $L116: .loc 1 430 10 ld.s8 %r510,[%r168]; cvt.u16.u32 %r509,%r510; setp.eq.u16 %r511,%r509,0; @ %r511 bra $L94; .loc 1 430 20 ld.u32 %r94,[%r167]; .loc 1 430 17 setp.ge.s32 %r512,%r94,%r145; .loc 1 433 11 selp.u64 %r557,%r557,%r561,%r512; selp.u64 %r558,%r558,%r562,%r512; selp.u32 %r145,%r145,%r94,%r512; $L94: .loc 1 428 35 add.u64 %r167,%r167,%r88; .loc 1 428 50 add.u64 %r168,%r168,%r43; .loc 1 428 10 add.u64 %r515,%r561,%r591; set.u32.lt.u64 %r520,%r515,%r561; cvt.s64.s32 %r518,%r520; mov.u64 %r561,%r515; sub.u64 %r562,%r562,%r518; setp.ne.u64 %r527,%r499,%r561; @ %r527 bra $L116; setp.ne.u64 %r530,%r508,%r562; @ %r530 bra $L116; $L89: .loc 1 436 8 st.u64 [%r174],%r557; st.u64 [%r174+8],%r558; .loc 1 439 15 add.u64 %r189,%r189,1; .loc 1 440 12 add.u64 %r172,%r172,%r103; .loc 1 441 13 add.u64 %r173,%r173,%r105; .loc 1 442 12 add.u64 %r174,%r174,%r108; .loc 1 444 13 setp.ne.u64 %r533,%r29,%r189; @ %r533 bra $L101; .loc 1 452 10 add.u64 %r169,%r173,%r92; .loc 1 453 9 add.u64 %r170,%r174,%r185; .loc 1 455 7 @ %r595 bra $L54; add.u64 %r219,%frame,488; .loc 1 451 23 mov.u64 %r112,%r39; .loc 1 455 7 mov.u64 %r218,8; .loc 1 454 5 mov.u64 %r171,1; add.u64 %r599,%frame,120; .loc 1 448 13 mov.u64 %r602,0; bra $L98; $L99: st.u64 [%r219],%r602; .loc 1 451 23 mul.lo.u64 %r112,%r125,%r124; .loc 1 452 24 mul.lo.u64 %r539,%r127,%r124; .loc 1 452 10 sub.u64 %r169,%r173,%r539; .loc 1 453 23 mul.lo.u64 %r540,%r129,%r124; .loc 1 453 9 shl.b64 %r541,%r540,4; sub.u64 %r170,%r174,%r541; .loc 1 454 5 add.u64 %r171,%r171,1; .loc 1 455 7 add.u64 %r219,%r219,8; add.u64 %r218,%r218,8; setp.eq.u64 %r542,%r171,%r607; @ %r542 bra $L54; $L98: .loc 1 463 16 ld.u64 %r543,[%r219]; add.u64 %r124,%r543,1; st.u64 [%r219],%r124; .loc 1 464 23 add.u64 %r545,%r608,%r218; ld.u64 %r125,[%r545]; .loc 1 464 13 sub.u64 %r546,%r125,%r112; shl.b64 %r547,%r546,2; add.u64 %r172,%r172,%r547; .loc 1 465 24 add.u64 %r548,%frame,%r218; ld.u64 %r127,[%r548]; .loc 1 465 14 add.u64 %r173,%r169,%r127; .loc 1 466 23 add.u64 %r550,%r599,%r218; ld.u64 %r129,[%r550]; .loc 1 466 13 shl.b64 %r551,%r129,4; add.u64 %r174,%r170,%r551; .loc 1 444 32 add.u64 %r553,%r609,%r218; ld.u64 %r133,[%r553]; .loc 1 444 13 setp.eq.u64 %r554,%r124,%r133; @ %r554 bra $L99; .loc 1 448 13 mov.u64 %r189,0; bra $L101; $L122: .loc 1 322 6 ld.u64 %r555,[%r233]; setp.eq.u64 %r556,%r555,0; @ ! 
%r556 bra $L69; bra $L100; $L54: .loc 1 4_gfortran_sminloc1_16_i4 .visible .func _gfortran_sminloc1_16_, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L126; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L127; $L126:r152154; call _gfortran_minloc1_16_500 7 bra $L125; $L127: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r297,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L129; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L130; $L136: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L131; bra $L184; $L129: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error130: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r303,0; $L135: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L133; .loc 1 515 17 st.u64 [%r139],%r36; bra $L134; $L133: .loc 1 518 12 st.u64 [%r139],%r303; $L134: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L135; bra $L136; $L184: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L138; bra $L137; $L131: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r302,0; $L141: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L139; .loc 1 523 17 st.u64 [%r43],%r41; bra $L140; $L139: .loc 1 527 12 st.u64 [%r43],%r302; $L140: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L141; bra $L185; $L137: .loc 1 534 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L143; add.u64 %r296,%frame,120; $L163: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L144; $L143: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L145; bra $L186; $L147: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L144: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L147; bra $L143; $L145: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L125; $L186: .loc 1 557 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L151: .loc 1 583 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L148; bra $L149; $L138: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L150; .loc 1 562 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error150: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L151; .loc 1 569 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L149; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L153: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L152; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error152: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L153; $L148: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L154; shl.b64 %r266,%r297,3; bra $L155; $L154: mov.u64 %r266,8; $L155: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L156: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L156; $L149: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,4; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 593 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L162: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L162; .loc 1 606 7 @ ! %r295 bra $L187; bra $L125; $L161: .loc 1 601 13 st.u64 [%r99],%r280; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L160; bra $L125; $L187: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; $L160: .loc 1 610 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L161; .loc 1 601 13 mov.u64 %r80,0; bra $L162; $L185: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r294,%r98,0; @ ! 
%r294 bra $L138; bra $L163; $L125: .loc 1 615minloc1_4_i8.o/ 1622802233_gfortran_minloc1_4_i8 .visible .func _gfortran_minloc1_4_i8.file 1 "../../../../libgfortran/generated/minloc1_4_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_4_i8 .visible .func _gfortran_mminloc1_4__gfortran_sminloc1_4_i8 .visible .func _gfortran_sminloc1_4_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_4_i8 .visible .func _gfortran_minloc1_4_i8predu64 %r258; .reg .u64pred %r262; .reg .pred %r264; .reg .pred %r265; .reg .u64u64predu64 %r318; .reg .u32u64 %r325; .reg .pred %r326; .reg .pred %r327; .reg .u32 %r329; .reg .predu32 %r335; .reg .pred %r336; .reg .predu64u64predmov.u64 %r188,%ar0; mov.u64 %r189,%ar1; mov.u64 %r190,%ar2; mov.u32 %r191,%ar3; .loc 1 58 10 ld.s8 %r22,[%r189+28]; .loc 1 58 38 add.u32 %r192,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r118,%r192; .loc 1 59 10 ld.u64 %r25,[%r190]; .loc 1 59 7 add.u64 %r119,%r25,-1; .loc 1 61 7 shr.u64 %r194,%r119,63; set.u32.lt.s64 %r196,%r118,%r119; neg.s32 %r197,%r196; cvt.u16.u64 %r200,%r194; cvt.u16.u32 %r201,%r197; or.b16 %r199,%r200,%r201; .loc 1 61 6 cvt.u32.u16 %r202,%r199; cvt.u16.u8 %r203,%r202; setp.eq.u16 %r204,%r203,0; @ %r204 bra $L2; .loc 1 63 7 cvt.u32.u32 %r207,%r22; cvt.s64.s8 %r206,%r207; st.u64 [%stack+8],%r20605205_gfortran_runtime_errorr210,%r119,%r119; add.u64 %r211,%r210,%r119; shl.b64 %r212,%r211,3; add.u64 %r213,%r189,%r212; ld.u64 %r30,[%r213+56]; ld.u64 %r32,[%r213+48]; .loc 1 71 9 ld.u64 %r122,[%r213+40]; .loc 1 73 3 setp.ne.u64 %r227,%r119,0; @ %r227 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r228,%r118,%r119; @ %r228 bra $L4; bra $L52; $L3: add.u64 %r138,%r189,40; add.u64 %r186,%frame,120; add.u64 %r185,%frame,240; add.u64 %r229,%r189,16; add.u64 %r231,%r25,%r25; add.u64 %r232,%r231,%r25; shl.b64 %r233,%r232,3; add.u64 %r145,%r229,%r233; .loc 1 79 12 mov.u64 %r372,0; $L8: .loc 1 75 18 ld.u64 %r234,[%r138]; st.u64 [%r186],%r234; .loc 1 76 19 ld.u64 %r236,[%r138+16]; add.u64 %r235,%r236,1; ld.u64 %r237,[%r138+8]; 
sub.u64 %r37,%r235,%r237; .loc 1 78 10 setp.lt.s64 %r238,%r37,0; @ %r238 bra $L6; .loc 1 76 17 st.u64 [%r185],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 [%r185],%r372; $L7: .loc 1 73 3 add.u64 %r138,%r138,24; add.u64 %r186,%r186,8; add.u64 %r185,%r185,8; setp.ne.u64 %r240,%r138,%r145; @ %r240 bra $L8; bra $L9; $L52: .loc 1 90 6 ld.u64 %r241,[%r188]; setp.eq.u64 %r242,%r241,0; @ ! %r242 bra $L11; bra $L10; $L4: add.u64 %r244,%r25,%r25; add.u64 %r245,%r244,%r25; shl.b64 %r246,%r245,3; add.u64 %r247,%r246,40; add.u64 %r114,%r189,%r247; shl.b64 %r248,%r25,3; add.u64 %r102,%r248,-8; add.u64 %r362,%frame,120; add.u64 %r107,%r362,%r102; add.u64 %r361,%frame,240; add.u64 %r100,%r361,%r102; cvt.u32.u32 %r252,%r22; cvt.s64.s8 %r251,%r252; add.u64 %r254,%r251,%r251; add.u64 %r255,%r254,%r251; shl.b64 %r256,%r255,3; add.u64 %r257,%r189,40; add.u64 %r143,%r256,%r257; .loc 1 87 12 mov.u64 %r371,0; $L14: .loc 1 83 18 ld.u64 %r258,[%r114]; st.u64 [%r107],%r258; .loc 1 84 19 ld.u64 %r260,[%r114+16]; add.u64 %r259,%r260,1; ld.u64 %r261,[%r114+8]; sub.u64 %r43,%r259,%r261; .loc 1 86 10 setp.lt.s64 %r262,%r43,0; @ %r262 bra $L12; .loc 1 84 17 st.u64 [%r100],%r43; bra $L13; $L12: .loc 1 87 12 st.u64 [%r100],%r371; $L13: .loc 1 81 3 add.u64 %r114,%r114,24; add.u64 %r107,%r107,8; add.u64 %r100,%r100,8; setp.ne.u64 %r264,%r114,%r143; @ %r264 bra $L14; bra $L53; $L10: .loc 1 94 7 setp.le.s64 %r265,%r118,0; @ %r265 bra $L16; add.u64 %r361,%frame,240; $L39: add.u64 %r155,%r188,40; mov.u64 %r151,%r361; cvt.u32.u32 %r267,%r22; cvt.s64.s8 %r266,%r267; add.u64 %r269,%r266,%r266; add.u64 %r270,%r269,%r266; shl.b64 %r271,%r270,3; add.u64 %r272,%r188,16; add.u64 %r115,%r271,%r272; .loc 1 97 10 mov.u64 %r105,1; .loc 1 101 4 mov.u64 %r295,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r273,0; st.u64 [%r188+8],%r273; .loc 1 106 28 cvt.u16.u32 %r276,%r22; add.u16 %r275,%r276,-1; cvt.u32.u16 %r277,%r275; st.u8 [%r188+28],%r277; .loc 1 108 20 add.u32 %r278,%r22,-2; cvt.s64.s32 %r52,%r278; add.u64 %r280,%r52,%r52; add.u64 %r281,%r280,%r52; shl.b64 %r282,%r281,3; add.u64 %r283,%r188,%r282; .loc 1 108 67 shl.b64 %r285,%r52,3; add.u64 %r286,%frame,%r285; .loc 1 108 59 ld.u64 %r288,[%r283+40]; ld.u64 %r289,[%r286+240]; mul.lo.u64 %r124,%r288,%r289; .loc 1 110 29291; call (%value_in),_gfortrani_xmallocarray292,[%value_in]; } .loc 1 110 27 st.u64 [%r188],%r292; .loc 1 111 10 setp.eq.u64 %r294,%r124,0; @ ! 
[Binary payload omitted: these lines are a text rendering of NVPTX object files carried inside the package archive (libgfortran's generated MINLOC reduction routines). The PTX is too heavily garbled by the extraction to reproduce faithfully; only the following information is recoverable.

Continuation of archive member minloc1_4_i8.o: PTX bodies of _gfortran_minloc1_4_i8, _gfortran_mminloc1_4_i8 (masked variant), and _gfortran_sminloc1_4_i8 (scalar-mask variant).

Archive member minloc1_8_i8.o/ 1622802233, built from ../../../../libgfortran/generated/minloc1_8_i8.c: PTX bodies of _gfortran_minloc1_8_i8, _gfortran_mminloc1_8_i8, and _gfortran_sminloc1_8_i8.

Archive member minloc1_16_i8.o/ 1622802233, built from ../../../../libgfortran/generated/minloc1_16_i8.c: PTX bodies of _gfortran_minloc1_16_i8, _gfortran_mminloc1_16_i8, and _gfortran_sminloc1_16_i8 (truncated in this dump).

Each member declares the external symbols _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, and _gfortrani_bounds_equal_extents, and embeds runtime-error format strings whose byte arrays decode (partially, some prefixes missing) to:
  "rank of return array incorrect in MINLOC intrinsic: is %ld, should be %ld"
  "MINLOC"
  "... incorrect in MINLOC intrinsic: is %ld, should be between 1 and %ld"
  "Funny sized logical array"
  "MASK argument"
  "Incorrect extent in return value of MINLOC intrinsic in dimension %ld: is %ld, should be %ld"
  "return value"]
%r402,$LC2r600002; call _gfortrani_bounds_ifunction_return361 4 cvta.const.u64 %r406,$LC406402; call _gfortrani_bounds_equal_extents$L79: .loc 1 366 3 setp.ne.u64 %r408,%r154,0; @ %r408 bra $L81; $L84: .loc 1 374 8 ld.u64 %r174,[%r233]; .loc 1 375 8 ld.u64 %r172,[%r234]; .loc 1 377 9 setp.ne.u64 %r409,%r172,0; @ %r409 bra $L82; bra $L54; $L81: add.u64 %r188,%frame,480; add.u64 %r181,%r233,40; add.u64 %r149,%frame,120; cvt.u32.u32 %r411,%r23; cvt.s64.s8 %r410,%r411; add.u64 %r138,%r410,-1; .loc 1 366 3 mov.u64 %r175,0; add.u64 %r600,%frame,360; .loc 1 368 16 mov.u64 %r412,%r175; $L83: st.u64 [%r188],%r412; .loc 1 369 18 ld.u64 %r413,[%r181]; st.u64 [%r149],%r413; .loc 1 370 17 shl.b64 %r415,%r175,3; add.u64 %r416,%r600,%r415; .loc 1 370 10 ld.u64 %r417,[%r416]; setp.le.s64 %r418,%r417,0; @ %r418 bra $L54; .loc 1 366 26 add.u64 %r175,%r175,1; .loc 1 366 3 add.u64 %r188,%r188,8; add.u64 %r181,%r181,24; add.u64 %r149,%r149,8; setp.ne.u64 %r419,%r138,%r175; @ %r419 bra $L83; bra $L84; $L82: .loc 1 300 12 mul.lo.u64 %r43,%r41,%r35; .loc 1 396 32 shl.b64 %r88,%r158,3; ld.u64 %r189,[%frame+480]; .loc 1 440 22 ld.u64 %r102,[%frame+240]; .loc 1 440 12 shl.b64 %r103,%r102,3; .loc 1 441 23 ld.u64 %r105,[%frame]; .loc 1 442 22 ld.u64 %r107,[%frame+120]; .loc 1 442 12 shl.b64 %r108,%r107,4; .loc 1 444 32 ld.u64 %r29,[%frame+360]; .loc 1 451 23 mul.lo.u64 %r39,%r29,%r102; .loc 1 452 24 mul.lo.u64 %r420,%r29,%r105; .loc 1 452 10 neg.s64 %r92,%r420; .loc 1 453 23 mul.lo.u64 %r421,%r29,%r107; .loc 1 453 9 shl.b64 %r422,%r421,4; neg.s64 %r185,%r422; setp.le.s64 %r595,%r154,1; .loc 1 418 9 setp.ne.u32 %r603,%r237,0; add.u64 %r604,%r155,-1; cvt.u32.u32 %r605,%r23; cvt.s64.s8 %r606,%r605; add.u64 %r607,%r606,-1; add.u64 %r608,%frame,240; add.u64 %r609,%frame,360; $L101: .loc 1 366 3 mov.u64 %r168,%r173; mov.u64 %r167,%r172; .loc 1 396 9 mov.u64 %r197,0; $L88: .loc 1 399 7 ld.s8 %r224,[%r168]; mov.u64 %r230,%r197; .loc 1 396 24 add.u64 %r197,%r197,1; .loc 1 399 6 setp.eq.u32 %r598,%r224,0; @ %r598 bra $L85; .loc 1 407 11 ld.u64 %r145,[%r167]; .loc 1 408 11 mov.u64 %r557,%r197; shr.s64 %r558,%r557,63; .loc 1 418 9 @ %r603 bra $L86; bra $L124; $L85: .loc 1 396 32 add.u64 %r167,%r167,%r88; .loc 1 396 47 add.u64 %r168,%r168,%r43; .loc 1 396 2 setp.ne.u64 %r426,%r155,%r197; @ %r426 bra $L88; .loc 1 395 9 mov.u64 %r557,0; mov.u64 %r558,%r557; bra $L89; $L124: .loc 1 428 10 setp.gt.s64 %r427,%r155,%r230; @ %r427 bra $L90; bra $L89; $L86: .loc 1 419 8 setp.le.s64 %r428,%r155,%r230; @ %r428 bra $L89; add.u64 %r565,%r230,1; shr.s64 %r566,%r565,63; sub.u64 %r432,%r604,%r230; shr.s64 %r435,%r230,63; add.u64 %r438,%r230,2; set.u32.lt.u64 %r443,%r438,%r230; cvt.s64.s32 %r441,%r443; sub.u64 %r447,%r435,%r441; add.u64 %r449,%r432,%r438; set.u32.lt.u64 %r454,%r449,%r432; cvt.s64.s32 %r452,%r454; sub.u64 %r458,%r447,%r452; mov.u64 %r577,1; $L93: .loc 1 421 8 @ %r598 bra $L91; .loc 1 421 18 ld.u64 %r89,[%r167]; .loc 1 421 15 setp.gt.s64 %r460,%r89,%r145; .loc 1 424 16 selp.u64 %r557,%r557,%r565,%r460; selp.u64 %r558,%r558,%r566,%r460; selp.u64 %r145,%r145,%r89,%r460; $L91: .loc 1 419 33 add.u64 %r167,%r167,%r88; .loc 1 419 48 add.u64 %r168,%r168,%r43; .loc 1 419 8 add.u64 %r463,%r565,%r577; set.u32.lt.u64 %r468,%r463,%r565; cvt.s64.s32 %r466,%r468; mov.u64 %r565,%r463; sub.u64 %r566,%r566,%r466; setp.ne.u64 %r475,%r449,%r565; @ %r475 bra $L102; setp.ne.u64 %r478,%r458,%r566; @ ! 
%r478 bra $L89; $L102: .loc 1 421 9 ld.s8 %r224,[%r168]; setp.eq.u32 %r598,%r224,0; bra $L93; $L90: add.u64 %r561,%r230,1; shr.s64 %r562,%r561,63; sub.u64 %r482,%r604,%r230; shr.s64 %r485,%r230,63; add.u64 %r488,%r230,2; set.u32.lt.u64 %r493,%r488,%r230; cvt.s64.s32 %r491,%r493; sub.u64 %r497,%r485,%r491; add.u64 %r499,%r482,%r488; set.u32.lt.u64 %r504,%r499,%r482; cvt.s64.s32 %r502,%r504; sub.u64 %r508,%r497,%r502; .loc 1 428 10 mov.u64 %r591,1; $L116: .loc 1 430 10 ld.s8 %r510,[%r168]; cvt.u16.u32 %r509,%r510; setp.eq.u16 %r511,%r509,0; @ %r511 bra $L94; .loc 1 430 20 ld.u64 %r94,[%r167]; .loc 1 430 17 setp.ge.s64 %r512,%r94,%r145; .loc 1 433 11 selp.u64 %r557,%r557,%r561,%r512; selp.u64 %r558,%r558,%r562,%r512; selp.u64 %r145,%r145,%r94,%r512; $L94: .loc 1 428 35 add.u64 %r167,%r167,%r88; .loc 1 428 50 add.u64 %r168,%r168,%r43; .loc 1 428 10 add.u64 %r515,%r561,%r591; set.u32.lt.u64 %r520,%r515,%r561; cvt.s64.s32 %r518,%r520; mov.u64 %r561,%r515; sub.u64 %r562,%r562,%r518; setp.ne.u64 %r527,%r499,%r561; @ %r527 bra $L116; setp.ne.u64 %r530,%r508,%r562; @ %r530 bra $L116; $L89: .loc 1 436 8 st.u64 [%r174],%r557; st.u64 [%r174+8],%r558; .loc 1 439 15 add.u64 %r189,%r189,1; .loc 1 440 12 add.u64 %r172,%r172,%r103; .loc 1 441 13 add.u64 %r173,%r173,%r105; .loc 1 442 12 add.u64 %r174,%r174,%r108; .loc 1 444 13 setp.ne.u64 %r533,%r29,%r189; @ %r533 bra $L101; .loc 1 452 10 add.u64 %r169,%r173,%r92; .loc 1 453 9 add.u64 %r170,%r174,%r185; .loc 1 455 7 @ %r595 bra $L54; add.u64 %r219,%frame,488; .loc 1 451 23 mov.u64 %r112,%r39; .loc 1 455 7 mov.u64 %r218,8; .loc 1 454 5 mov.u64 %r171,1; add.u64 %r599,%frame,120; .loc 1 448 13 mov.u64 %r602,0; bra $L98; $L99: st.u64 [%r219],%r602; .loc 1 451 23 mul.lo.u64 %r112,%r125,%r124; .loc 1 452 24 mul.lo.u64 %r539,%r127,%r124; .loc 1 452 10 sub.u64 %r169,%r173,%r539; .loc 1 453 23 mul.lo.u64 %r540,%r129,%r124; .loc 1 453 9 shl.b64 %r541,%r540,4; sub.u64 %r170,%r174,%r541; .loc 1 454 5 add.u64 %r171,%r171,1; .loc 1 455 7 add.u64 %r219,%r219,8; add.u64 %r218,%r218,8; setp.eq.u64 %r542,%r171,%r607; @ %r542 bra $L54; $L98: .loc 1 463 16 ld.u64 %r543,[%r219]; add.u64 %r124,%r543,1; st.u64 [%r219],%r124; .loc 1 464 23 add.u64 %r545,%r608,%r218; ld.u64 %r125,[%r545]; .loc 1 464 13 sub.u64 %r546,%r125,%r112; shl.b64 %r547,%r546,3; add.u64 %r172,%r172,%r547; .loc 1 465 24 add.u64 %r548,%frame,%r218; ld.u64 %r127,[%r548]; .loc 1 465 14 add.u64 %r173,%r169,%r127; .loc 1 466 23 add.u64 %r550,%r599,%r218; ld.u64 %r129,[%r550]; .loc 1 466 13 shl.b64 %r551,%r129,4; add.u64 %r174,%r170,%r551; .loc 1 444 32 add.u64 %r553,%r609,%r218; ld.u64 %r133,[%r553]; .loc 1 444 13 setp.eq.u64 %r554,%r124,%r133; @ %r554 bra $L99; .loc 1 448 13 mov.u64 %r189,0; bra $L101; $L122: .loc 1 322 6 ld.u64 %r555,[%r233]; setp.eq.u64 %r556,%r555,0; @ ! 
%r556 bra $L69; bra $L100; $L54: .loc 1 4_gfortran_sminloc1_16_i8 .visible .func _gfortran_sminloc1_16_, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L126; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L127; $L126:r152154; call _gfortran_minloc1_16_500 7 bra $L125; $L127: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r297,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L129; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L130; $L136: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L131; bra $L184; $L129: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error130: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r303,0; $L135: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L133; .loc 1 515 17 st.u64 [%r139],%r36; bra $L134; $L133: .loc 1 518 12 st.u64 [%r139],%r303; $L134: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L135; bra $L136; $L184: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L138; bra $L137; $L131: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r302,0; $L141: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L139; .loc 1 523 17 st.u64 [%r43],%r41; bra $L140; $L139: .loc 1 527 12 st.u64 [%r43],%r302; $L140: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L141; bra $L185; $L137: .loc 1 534 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L143; add.u64 %r296,%frame,120; $L163: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L144; $L143: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L145; bra $L186; $L147: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L144: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L147; bra $L143; $L145: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L125; $L186: .loc 1 557 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L151: .loc 1 583 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L148; bra $L149; $L138: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L150; .loc 1 562 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error150: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L151; .loc 1 569 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L149; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L153: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L152; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error152: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L153; $L148: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L154; shl.b64 %r266,%r297,3; bra $L155; $L154: mov.u64 %r266,8; $L155: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L156: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L156; $L149: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,4; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 593 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L162: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L162; .loc 1 606 7 @ ! %r295 bra $L187; bra $L125; $L161: .loc 1 601 13 st.u64 [%r99],%r280; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L160; bra $L125; $L187: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; $L160: .loc 1 610 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L161; .loc 1 601 13 mov.u64 %r80,0; bra $L162; $L185: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r294,%r98,0; @ ! 
%r294 bra $L138; bra $L163; $L125: .loc 1 615minloc1_4_i16.o/1622802233_gfortran_minloc1_4_i16 .visible .func _gfortran_minloc1_4_i16.file 1 "../../../../libgfortran/generated/minloc1_4_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_4_i16 .visible .func _gfortran_mminloc1_4_i16_gfortran_sminloc1_4_i16 .visible .func _gfortran_sminloc1_4_i16VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_4_i16 .visible .func _gfortran_minloc1_4_i16predu64 %r258; .reg .u64pred %r262; .reg .pred %r264; .reg .pred %r265; .reg .u64u64predu64 %r318; .reg .u32u64 %r325; .reg .pred %r326; .reg .pred %r327; .reg .u32predpredpred %r352; .reg .predu64 %r371; .reg .predpredu32u64 %r396; mov.u64 %r188,%ar0; mov.u64 %r189,%ar1; mov.u64 %r190,%ar2; mov.u32 %r191,%ar3; .loc 1 58 10 ld.s8 %r22,[%r189+28]; .loc 1 58 38 add.u32 %r192,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r118,%r192; .loc 1 59 10 ld.u64 %r25,[%r190]; .loc 1 59 7 add.u64 %r119,%r25,-1; .loc 1 61 7 shr.u64 %r194,%r119,63; set.u32.lt.s64 %r196,%r118,%r119; neg.s32 %r197,%r196; cvt.u16.u64 %r200,%r194; cvt.u16.u32 %r201,%r197; or.b16 %r199,%r200,%r201; .loc 1 61 6 cvt.u32.u16 %r202,%r199; cvt.u16.u8 %r203,%r202; setp.eq.u16 %r204,%r203,0; @ %r204 bra $L2; .loc 1 63 7 cvt.u32.u32 %r207,%r22; cvt.s64.s8 %r206,%r207; st.u64 [%stack+8],%r20605205_gfortran_runtime_errorr210,%r119,%r119; add.u64 %r211,%r210,%r119; shl.b64 %r212,%r211,3; add.u64 %r213,%r189,%r212; ld.u64 %r30,[%r213+56]; ld.u64 %r32,[%r213+48]; .loc 1 71 9 ld.u64 %r122,[%r213+40]; .loc 1 73 3 setp.ne.u64 %r227,%r119,0; @ %r227 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r228,%r118,%r119; @ %r228 bra $L4; bra $L56; $L3: add.u64 %r138,%r189,40; add.u64 %r186,%frame,120; add.u64 %r185,%frame,240; add.u64 %r229,%r189,16; add.u64 %r231,%r25,%r25; add.u64 %r232,%r231,%r25; shl.b64 %r233,%r232,3; add.u64 %r145,%r229,%r233; .loc 1 79 12 mov.u64 %r396,0; $L8: .loc 1 75 18 ld.u64 %r234,[%r138]; st.u64 [%r186],%r234; .loc 1 76 19 ld.u64 %r236,[%r138+16]; add.u64 %r235,%r236,1; ld.u64 
%r237,[%r138+8]; sub.u64 %r37,%r235,%r237; .loc 1 78 10 setp.lt.s64 %r238,%r37,0; @ %r238 bra $L6; .loc 1 76 17 st.u64 [%r185],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 [%r185],%r396; $L7: .loc 1 73 3 add.u64 %r138,%r138,24; add.u64 %r186,%r186,8; add.u64 %r185,%r185,8; setp.ne.u64 %r240,%r138,%r145; @ %r240 bra $L8; bra $L9; $L56: .loc 1 90 6 ld.u64 %r241,[%r188]; setp.eq.u64 %r242,%r241,0; @ ! %r242 bra $L11; bra $L10; $L4: add.u64 %r244,%r25,%r25; add.u64 %r245,%r244,%r25; shl.b64 %r246,%r245,3; add.u64 %r247,%r246,40; add.u64 %r114,%r189,%r247; shl.b64 %r248,%r25,3; add.u64 %r102,%r248,-8; add.u64 %r384,%frame,120; add.u64 %r107,%r384,%r102; add.u64 %r383,%frame,240; add.u64 %r100,%r383,%r102; cvt.u32.u32 %r252,%r22; cvt.s64.s8 %r251,%r252; add.u64 %r254,%r251,%r251; add.u64 %r255,%r254,%r251; shl.b64 %r256,%r255,3; add.u64 %r257,%r189,40; add.u64 %r143,%r256,%r257; .loc 1 87 12 mov.u64 %r395,0; $L14: .loc 1 83 18 ld.u64 %r258,[%r114]; st.u64 [%r107],%r258; .loc 1 84 19 ld.u64 %r260,[%r114+16]; add.u64 %r259,%r260,1; ld.u64 %r261,[%r114+8]; sub.u64 %r43,%r259,%r261; .loc 1 86 10 setp.lt.s64 %r262,%r43,0; @ %r262 bra $L12; .loc 1 84 17 st.u64 [%r100],%r43; bra $L13; $L12: .loc 1 87 12 st.u64 [%r100],%r395; $L13: .loc 1 81 3 add.u64 %r114,%r114,24; add.u64 %r107,%r107,8; add.u64 %r100,%r100,8; setp.ne.u64 %r264,%r114,%r143; @ %r264 bra $L14; bra $L57; $L10: .loc 1 94 7 setp.le.s64 %r265,%r118,0; @ %r265 bra $L16; add.u64 %r383,%frame,240; $L41: add.u64 %r155,%r188,40; mov.u64 %r151,%r383; cvt.u32.u32 %r267,%r22; cvt.s64.s8 %r266,%r267; add.u64 %r269,%r266,%r266; add.u64 %r270,%r269,%r266; shl.b64 %r271,%r270,3; add.u64 %r272,%r188,16; add.u64 %r115,%r271,%r272; .loc 1 97 10 mov.u64 %r105,1; .loc 1 101 4 mov.u64 %r295,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r273,0; st.u64 [%r188+8],%r273; .loc 1 106 28 cvt.u16.u32 %r276,%r22; add.u16 %r275,%r276,-1; cvt.u32.u16 %r277,%r275; st.u8 [%r188+28],%r277; .loc 1 108 20 add.u32 %r278,%r22,-2; cvt.s64.s32 %r52,%r278; add.u64 %r280,%r52,%r52; add.u64 %r281,%r280,%r52; shl.b64 %r282,%r281,3; add.u64 %r283,%r188,%r282; .loc 1 108 67 shl.b64 %r285,%r52,3; add.u64 %r286,%frame,%r285; .loc 1 108 59 ld.u64 %r288,[%r283+40]; ld.u64 %r289,[%r286+240]; mul.lo.u64 %r124,%r288,%r289; .loc 1 110 29291; call (%value_in),_gfortrani_xmallocarray292,[%value_in]; } .loc 1 110 27 st.u64 [%r188],%r292; .loc 1 111 10 setp.eq.u64 %r294,%r124,0; @ ! 
%r294 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r105,%r48,%r105; $L17: .loc 1 101 4 st.u64 [%r155+8],%r295; ld.u64 %r48,[%r151]; add.u64 %r296,%r48,-1; st.u64 [%r155+16],%r296; st.u64 [%r155],%r105; .loc 1 94 7 add.u64 %r155,%r155,24; add.u64 %r151,%r151,8; setp.ne.u64 %r297,%r115,%r155; @ %r297 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r188+48],%r124; mov.u64 %r299,-1; st.u64 [%r188+56],%r299; mov.u64 %r300,1; st.u64 [%r188+40],%r300; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r59,[%r188+28]; .loc 1 121 10 setp.eq.u64 %r301,%r59,%r118; @ %r301 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r59;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r304,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r305,[%r304+36]; setp.eq.u32 %r306,%r305,0; @ %r306 bra $L21; .loc 1 128 2 add.u64 %r383,%frame,240; cvta.const.u64 %r310,$LC209310; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r312,%r118,0; @ %r312 bra $L24; $L27: .loc 1 68 9 add.u64 %r313,%r30,1; .loc 1 68 7 sub.u64 %r120,%r313,%r32; max.s64 %r86,%r120,0; .loc 1 140 8 ld.u64 %r134,[%r189]; .loc 1 141 8 ld.u64 %r135,[%r188]; .loc 1 190 33 shl.b64 %r75,%r122,4; ld.u64 %r153,[%frame+360]; .loc 1 204 22 ld.u64 %r77,[%frame+120]; .loc 1 204 12 shl.b64 %r78,%r77,4; .loc 1 205 22 ld.u64 %r80,[%frame]; .loc 1 205 12 shl.b64 %r81,%r80,2; .loc 1 207 32 ld.u64 %r68,[%frame+240]; .loc 1 214 23 mul.lo.u64 %r314,%r68,%r77; .loc 1 214 9 shl.b64 %r315,%r314,4; neg.s64 %r142,%r315; .loc 1 215 23 mul.lo.u64 %r316,%r68,%r80; .loc 1 215 9 shl.b64 %r317,%r316,2; neg.s64 %r140,%r317; setp.gt.s64 %r385,%r120,0; setp.le.s64 %r386,%r118,1; .loc 1 180 9 setp.eq.u32 %r388,%r191,0; .loc 1 155 9 mov.u64 %r389,-1; cvt.u32.u32 %r390,%r22; cvt.s64.s8 %r391,%r390; add.u64 %r392,%r391,-1; add.u64 %r393,%frame,120; add.u64 %r394,%frame,240; bra $L25; $L24: add.u64 %r171,%frame,360; add.u64 %r170,%r188,40; mov.u64 %r168,%frame; cvt.u32.u32 %r319,%r22; cvt.s64.s8 %r318,%r319; add.u64 %r159,%r318,-1; .loc 1 132 3 mov.u64 %r136,0; add.u64 %r383,%frame,240; .loc 1 134 16 mov.u64 %r320,%r136; $L26: st.u64 [%r171],%r320; .loc 1 135 18 ld.u64 %r321,[%r170]; st.u64 [%r168],%r321; .loc 1 136 17 shl.b64 %r323,%r136,3; add.u64 %r324,%r383,%r323; .loc 1 136 10 ld.u64 %r325,[%r324]; setp.le.s64 %r326,%r325,0; @ %r326 bra $L1; .loc 1 132 26 add.u64 %r136,%r136,1; .loc 1 132 3 add.u64 %r171,%r171,8; add.u64 %r170,%r170,24; add.u64 %r168,%r168,8; setp.ne.u64 %r327,%r136,%r159; @ %r327 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r385 bra $L28; .loc 1 159 10 mov.u32 %r329,0; st.u32 [%r135],%r329; bra $L29; $L28: .loc 1 180 9 @ %r388 bra $L42; mov.u64 %r129,%r134; .loc 1 155 9 mov.u64 %r379,%r389; mov.u64 %r380,9223372036854775807; .loc 1 157 9 mov.u32 %r106,1; .loc 1 178 8 mov.u64 %r128,0; $L33: .loc 1 183 9 ld.u64 %r375,[%r129]; ld.u64 %r376,[%r129+8]; .loc 1 183 8 setp.gt.s64 %r333,%r376,%r380; @ %r333 bra $L31; setp.ne.u64 %r336,%r376,%r380; @ %r336 bra $L43; setp.gt.u64 %r339,%r375,%r379; @ %r339 bra $L31; $L43: cvt.u32.u64 %r340,%r128; add.u32 %r106,%r340,1; mov.u64 %r379,%r375; mov.u64 %r380,%r376; $L31: .loc 1 181 25 add.u64 %r128,%r128,1; .loc 1 181 33 add.u64 %r129,%r129,%r75; .loc 1 181 8 setp.gt.s64 %r341,%r86,%r128; @ %r341 bra $L33; bra $L34; $L42: mov.u64 %r126,%r134; .loc 1 155 9 mov.u64 %r381,%r389; mov.u64 %r382,9223372036854775807; .loc 1 157 9 mov.u32 %r106,1; .loc 1 178 8 mov.u64 %r104,0; $L30: .loc 1 192 9 ld.u64 %r377,[%r126]; ld.u64 %r378,[%r126+8]; .loc 1 192 8 setp.gt.s64 %r344,%r382,%r378; @ %r344 bra 
$L44; setp.ne.u64 %r347,%r382,%r378; @ %r347 bra $L35; setp.gt.u64 %r350,%r381,%r377; @ ! %r350 bra $L35; $L44: cvt.u32.u64 %r351,%r104; add.u32 %r106,%r351,1; mov.u64 %r381,%r377; mov.u64 %r382,%r378; $L35: .loc 1 190 25 add.u64 %r104,%r104,1; .loc 1 190 33 add.u64 %r126,%r126,%r75; .loc 1 190 8 setp.gt.s64 %r352,%r86,%r104; @ %r352 bra $L30; $L34: .loc 1 199 12 st.u32 [%r135],%r106; $L29: .loc 1 203 15 add.u64 %r153,%r153,1; .loc 1 204 12 add.u64 %r134,%r134,%r78; .loc 1 205 12 add.u64 %r135,%r135,%r81; .loc 1 207 13 setp.ne.u64 %r353,%r68,%r153; @ %r353 bra $L25; .loc 1 214 9 add.u64 %r131,%r134,%r142; .loc 1 215 9 add.u64 %r132,%r135,%r140; .loc 1 217 7 @ %r386 bra $L1; add.u64 %r156,%frame,368; mov.u64 %r117,8; .loc 1 216 5 mov.u64 %r133,1; .loc 1 211 13 mov.u64 %r387,0; bra $L39; $L40: st.u64 [%r156],%r387; .loc 1 214 23 mul.lo.u64 %r359,%r93,%r92; .loc 1 214 9 shl.b64 %r360,%r359,4; sub.u64 %r131,%r134,%r360; .loc 1 215 23 mul.lo.u64 %r361,%r96,%r92; .loc 1 215 9 shl.b64 %r362,%r361,2; sub.u64 %r132,%r135,%r362; .loc 1 216 5 add.u64 %r133,%r133,1; .loc 1 217 7 add.u64 %r156,%r156,8; add.u64 %r117,%r117,8; setp.eq.u64 %r363,%r133,%r392; @ %r363 bra $L1; $L39: .loc 1 225 16 ld.u64 %r364,[%r156]; add.u64 %r92,%r364,1; st.u64 [%r156],%r92; .loc 1 226 23 add.u64 %r366,%r393,%r117; ld.u64 %r93,[%r366]; .loc 1 226 13 shl.b64 %r367,%r93,4; add.u64 %r134,%r131,%r367; .loc 1 227 23 add.u64 %r368,%frame,%r117; ld.u64 %r96,[%r368]; .loc 1 227 13 shl.b64 %r369,%r96,2; add.u64 %r135,%r132,%r369; .loc 1 207 32 add.u64 %r371,%r394,%r117; ld.u64 %r99,[%r371]; .loc 1 207 13 setp.eq.u64 %r372,%r92,%r99; @ %r372 bra $L40; .loc 1 211 13 mov.u64 %r153,0; bra $L25; $L57: .loc 1 90 6 ld.u64 %r373,[%r188]; setp.eq.u64 %r374,%r373,0; @ ! %r374 bra $L11; bra $L41; $L1: .loc 1 231_gfortran_mminloc1_4_i16 .visible .func _gfortran_mminloc1_4_i16, .param .u32 %in_ar4) {60896u16 %r238; .reg .u16u64 %r303; .reg .u64 %r304; .reg .u64 %r305; .reg .u64 %r306; .reg .u64pred %r317; .reg .u64 %r318; .reg .pred %r319; .reg .u64u32u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64u64predu32 %r385; .reg .predu64 %r394; .reg .predu64predu64 %r410; .reg .u32 %r412; .reg .pred %r414; .reg .pred %r415; .reg .pred %r416; .reg .pred %r420; .reg .pred %r423; .reg .pred %r426; .reg .predpred %r430; .reg .pred %r433; .reg .pred %r436; .reg .pred %r439; .reg .predpred %r462; .reg .u64 %r463; .reg .predpred %r473; .reg .u64 %r475; .reg .pred %r476; .reg .u64 %r477; .reg .u64 %r478; .reg .pred %r479; .reg .u32u64 %r484; .reg .u64 %r485; .reg .u64 %r486; .reg .u64 %r487; mov.u64 %r221,%ar0; mov.u64 %r222,%ar1; mov.u64 %r223,%ar2; mov.u64 %r224,%ar3; mov.u32 %r225,%ar4; .loc 1 261 6 setp.ne.u64 %r226,%r224,0; @ %r226 bra $L223225; call _gfortran_minloc1_4_i16268 7 bra $L58; $L59: .loc 1 271 10 ld.u64 %r22,[%r223]; .loc 1 271 7 add.u64 %r151,%r22,-1; .loc 1 272 10 ld.s8 %r23,[%r222+28]; .loc 1 272 38 add.u32 %r231,%r23,-1; .loc 1 272 8 cvt.s64.s32 %r152,%r231; .loc 1 275 7 shr.u64 %r233,%r151,63; set.u32.gt.s64 %r235,%r151,%r152; neg.s32 %r236,%r235; cvt.u16.u64 %r239,%r233; cvt.u16.u32 %r240,%r236; or.b16 %r238,%r239,%r240; .loc 1 275 6 cvt.u32.u16 %r241,%r238; cvt.u16.u8 %r242,%r241; setp.eq.u16 %r243,%r242,0; @ %r243 bra $L61; .loc 1 277 7 cvt.u32.u32 %r246,%r23; cvt.s64.s8 %r245,%r246; st.u64 [%stack+8],%r245; st.u64 [%stack],%r22; cvta.const.u64 %r244244_gfortran_runtime_error61: .loc 1 282 9 add.u64 %r472,%r151,%r151; add.u64 %r475,%r472,%r151; shl.b64 %r251,%r475,3; add.u64 %r252,%r222,%r251; ld.u64 %r255,[%r252+56]; add.u64 
%r254,%r255,1; .loc 1 282 7 ld.u64 %r262,[%r252+48]; sub.u64 %r153,%r254,%r262; .loc 1 283 6 setp.le.s64 %r263,%r153,0; @ %r263 bra $L58; .loc 1 286 9 ld.u64 %r169,[%r224]; .loc 1 288 15 ld.u64 %r34,[%r224+16]; .loc 1 290 22 cvt.u32.u64 %r35,%r34; .loc 1 290 53 add.u32 %r264,%r35,-4; and.b32 %r265,%r264,-5; set.u32.eq.u32 %r267,%r265,0; neg.s32 %r268,%r267; .loc 1 290 22 add.u32 %r269,%r35,-1; set.u32.le.u32 %r271,%r269,1; neg.s32 %r272,%r271; .loc 1 290 6 cvt.u16.u32 %r274,%r268; cvt.u16.u32 %r275,%r272; or.loc 1 292 7 setp.ne.u32 %r280,%r35,16; @ %r280 bra $L64; $L63: .loc 1 299 9 shl.b64 %r284,%r475,3; add.u64 %r285,%r222,%r284; ld.u64 %r156,[%r285+40]; .loc 1 300 12 add.u64 %r291,%r224,%r284; ld.u64 %r41,[%r291+40]; .loc 1 302 3 setp.ne.u64 %r293,%r151,0; @ %r293 bra $L65; $L71: .loc 1 312 3 setp.lt.s64 %r294,%r151,%r152; @ %r294 bra $L66; bra $L127; $L64: .loc 1 297 5 cvta.const.u64 %r2955_gfortran_runtime_error65: add.u64 %r96,%r222,40; add.u64 %r68,%r224,40; add.u64 %r298,%r22,%r22; add.u64 %r299,%r298,%r22; shl.b64 %r300,%r299,3; add.u64 %r301,%r222,16; add.u64 %r39,%r300,%r301; .loc 1 302 3 mov.u64 %r139,0; add.u64 %r471,%frame,240; add.u64 %r477,%frame,360; .loc 1 309 12 mov.u64 %r487,%r139; $L70: .loc 1 304 18 add.u64 %r303,%r471,%r139; ld.u64 %r304,[%r96]; st.u64 [%r303],%r304; .loc 1 305 18 add.u64 %r305,%frame,%r139; .loc 1 305 20 ld.u64 %r307,[%r68]; mul.lo.u64 %r306,%r307,%r34; .loc 1 305 18 st.u64 [%r305],%r306; .loc 1 306 19 ld.u64 %r309,[%r96+16]; add.u64 %r308,%r309,1; ld.u64 %r310,[%r96+8]; sub.u64 %r53,%r308,%r310; .loc 1 308 10 setp.lt.s64 %r311,%r53,0; @ %r311 bra $L68; .loc 1 306 17 add.u64 %r313,%r477,%r139; st.u64 [%r313],%r53; bra $L69; $L68: .loc 1 309 12 add.u64 %r315,%r477,%r139; st.u64 [%r315],%r487; $L69: .loc 1 302 3 add.u64 %r96,%r96,24; add.u64 %r68,%r68,24; add.u64 %r139,%r139,8; setp.ne.u64 %r317,%r39,%r96; @ %r317 bra $L70; bra $L71; $L127: .loc 1 322 6 ld.u64 %r318,[%r221]; setp.eq.u64 %r319,%r318,0; @ ! 
%r319 bra $L73; bra $L72; $L66: add.u64 %r321,%r22,%r22; add.u64 %r322,%r321,%r22; shl.b64 %r323,%r322,3; add.u64 %r130,%r323,40; add.u64 %r134,%r222,%r130; add.u64 %r111,%r224,%r130; shl.b64 %r324,%r22,3; add.u64 %r58,%r324,-8; cvt.u32.u32 %r326,%r23; cvt.s64.s8 %r325,%r326; shl.b64 %r327,%r325,3; add.u64 %r69,%r327,-8; add.u64 %r471,%frame,240; add.u64 %r477,%frame,360; .loc 1 319 12 mov.u64 %r486,0; $L76: .loc 1 314 18 add.u64 %r329,%r471,%r58; ld.u64 %r330,[%r134]; st.u64 [%r329],%r330; .loc 1 315 18 add.u64 %r331,%frame,%r58; .loc 1 315 20 ld.u64 %r333,[%r111]; mul.lo.u64 %r332,%r333,%r34; .loc 1 315 18 st.u64 [%r331],%r332; .loc 1 316 19 ld.u64 %r335,[%r134+16]; add.u64 %r334,%r335,1; ld.u64 %r336,[%r134+8]; sub.u64 %r65,%r334,%r336; .loc 1 318 10 setp.lt.s64 %r337,%r65,0; @ %r337 bra $L74; .loc 1 316 17 add.u64 %r339,%r477,%r58; st.u64 [%r339],%r65; bra $L75; $L74: .loc 1 319 12 add.u64 %r341,%r477,%r58; st.u64 [%r341],%r486; $L75: .loc 1 312 3 add.u64 %r134,%r134,24; add.u64 %r111,%r111,24; add.u64 %r58,%r58,8; setp.ne.u64 %r343,%r69,%r58; @ %r343 bra $L76; bra $L128; $L72: .loc 1 326 7 setp.eq.u64 %r344,%r152,0; @ %r344 bra $L78; add.u64 %r477,%frame,360; $L105: add.u64 %r148,%r221,40; mov.u64 %r146,%r477; cvt.u32.u32 %r346,%r23; cvt.s64.s8 %r345,%r346; add.u64 %r348,%r345,%r345; add.u64 %r349,%r348,%r345; shl.b64 %r350,%r349,3; add.u64 %r351,%r221,16; add.u64 %r135,%r350,%r351; .loc 1 329 10 mov.u64 %r136,1; .loc 1 333 4 mov.u64 %r370,0; bra $L79; $L78: .loc 1 337 20 add.u32 %r352,%r23,-2; cvt.s64.s32 %r76,%r352; add.u64 %r354,%r76,%r76; add.u64 %r355,%r354,%r76; shl.b64 %r356,%r355,3; add.u64 %r357,%r221,%r356; .loc 1 337 67 shl.b64 %r359,%r76,3; add.u64 %r360,%frame,%r359; .loc 1 337 59 ld.u64 %r362,[%r357+40]; ld.u64 %r363,[%r360+360]; mul.lo.u64 %r160,%r362,%r363; .loc 1 339 24 mov.u64 %r364,0; st.u64 [%r221+8],%r364; .loc 1 340 28 cvt.u16.u32 %r367,%r23; add.u16 %r366,%r367,-1; cvt.u32.u16 %r368,%r366; st.u8 [%r221+28],%r368; .loc 1 342 10 setp.eq.u64 %r369,%r160,0; @ %r369 bra $L80; bra $L129; $L82: .loc 1 331 47 mul.lo.u64 %r136,%r73,%r136; $L79: .loc 1 333 4 st.u64 [%r148+8],%r370; ld.u64 %r73,[%r146]; add.u64 %r371,%r73,-1; st.u64 [%r148+16],%r371; st.u64 [%r148],%r136; .loc 1 326 7 add.u64 %r148,%r148,24; add.u64 %r146,%r146,8; setp.ne.u64 %r372,%r135,%r148; @ %r372 bra $L82; bra $L78; $L80: .loc 1 345 4 st.u64 [%r221+48],%r160; mov.u64 %r374,-1; st.u64 [%r221+56],%r374; mov.u64 %r375,1; st.u64 [%r221+40],%r375; .loc 1 346 4 bra $L58; $L129: .loc 1 349 24call (%value_in),_gfortrani_xmallocarray78,[%value_in]; } .loc 1 349 22 st.u64 [%r221],%r378; bra $L83; $L73: .loc 1 354 19 ld.s8 %r380,[%r221+28]; .loc 1 354 10 setp.eq.u64 %r381,%r380,%r152; @ %r381 bra $L84; .loc 1 355 2 cvta.const.u64 %r382,$LC5; {stack; call _gfortran_runtime_error84: .loc 1 357 11 cvta.global.u64 %r384,_gfortrani_compile_options; .loc 1 357 10 ld.u32 %r385,[%r384+36]; setp.eq.u32 %r386,%r385,0; @ %r386 bra $L83; .loc 1 359 4 add.u64 %r477,%frame,360; cvta.const.u64 %r390,$LC2r477389390; call _gfortrani_bounds_ifunction_return361 4 cvta.const.u64 %r394,$LC222394390; call _gfortrani_bounds_equal_extents$L83: .loc 1 366 3 setp.ne.u64 %r396,%r152,0; @ %r396 bra $L85; $L88: .loc 1 374 8 ld.u64 %r170,[%r221]; .loc 1 375 8 ld.u64 %r168,[%r222]; .loc 1 377 9 setp.ne.u64 %r397,%r168,0; @ %r397 bra $L86; bra $L58; $L85: add.u64 %r198,%frame,480; add.u64 %r197,%r221,40; add.u64 %r195,%frame,120; cvt.u32.u32 %r399,%r23; cvt.s64.s8 %r398,%r399; add.u64 %r164,%r398,-1; .loc 1 366 3 mov.u64 %r171,0; add.u64 
%r477,%frame,360; .loc 1 368 16 mov.u64 %r400,%r171; $L87: st.u64 [%r198],%r400; .loc 1 369 18 ld.u64 %r401,[%r197]; st.u64 [%r195],%r401; .loc 1 370 17 shl.b64 %r403,%r171,3; add.u64 %r404,%r477,%r403; .loc 1 370 10 ld.u64 %r405,[%r404]; setp.le.s64 %r406,%r405,0; @ %r406 bra $L58; .loc 1 366 26 add.u64 %r171,%r171,1; .loc 1 366 3 add.u64 %r198,%r198,8; add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; setp.ne.u64 %r407,%r164,%r171; @ %r407 bra $L87; bra $L88; $L86: .loc 1 300 12 mul.lo.u64 %r44,%r41,%r34; .loc 1 396 32 shl.b64 %r91,%r156,4; ld.u64 %r187,[%frame+480]; .loc 1 440 22 ld.u64 %r101,[%frame+240]; .loc 1 440 12 shl.b64 %r102,%r101,4; .loc 1 441 23 ld.u64 %r104,[%frame]; .loc 1 442 22 ld.u64 %r106,[%frame+120]; .loc 1 442 12 shl.b64 %r107,%r106,2; .loc 1 444 32 ld.u64 %r29,[%frame+360]; .loc 1 451 23 mul.lo.u64 %r38,%r29,%r101; .loc 1 452 24 mul.lo.u64 %r408,%r29,%r104; .loc 1 452 10 neg.s64 %r137,%r408; .loc 1 453 23 mul.lo.u64 %r409,%r29,%r106; .loc 1 453 9 shl.b64 %r410,%r409,2; neg.s64 %r188,%r410; setp.le.s64 %r476,%r152,1; .loc 1 418 9 setp.ne.u32 %r479,%r225,0; cvt.u32.u32 %r480,%r23; cvt.s64.s8 %r481,%r480; add.u64 %r482,%r481,-1; add.u64 %r483,%frame,240; add.u64 %r484,%frame,360; add.u64 %r485,%frame,120; $L106: .loc 1 366 3 mov.u64 %r163,%r169; mov.u64 %r162,%r168; .loc 1 396 9 mov.u64 %r161,0; $L92: .loc 1 399 7 ld.s8 %r220,[%r163]; .loc 1 399 6 setp.eq.u32 %r473,%r220,0; @ %r473 bra $L89; .loc 1 407 11 ld.u64 %r469,[%r162]; ld.u64 %r470,[%r162+8]; .loc 1 408 11 cvt.u32.u64 %r412,%r161; add.u32 %r143,%r412,1; .loc 1 418 9 @ %r479 bra $L90; bra $L130; $L89: .loc 1 396 24 add.u64 %r161,%r161,1; .loc 1 396 32 add.u64 %r162,%r162,%r91; .loc 1 396 47 add.u64 %r163,%r163,%r44; .loc 1 396 2 setp.ne.u64 %r414,%r153,%r161; @ %r414 bra $L92; .loc 1 395 9 mov.u32 %r143,%r220; bra $L93; $L130: .loc 1 428 10 setp.gt.s64 %r415,%r153,%r161; @ %r415 bra $L94; bra $L93; $L90: .loc 1 419 8 setp.le.s64 %r416,%r153,%r161; @ %r416 bra $L93; add.u64 %r205,%r161,1; $L97: .loc 1 421 8 @ %r473 bra $L95; .loc 1 421 18 ld.u64 %r465,[%r162]; ld.u64 %r466,[%r162+8]; .loc 1 421 15 setp.gt.s64 %r420,%r466,%r470; @ %r420 bra $L95; setp.ne.u64 %r423,%r466,%r470; @ %r423 bra $L108; setp.gt.u64 %r426,%r465,%r469; @ %r426 bra $L95; $L108: .loc 1 424 16 cvt.u32.u64 %r143,%r205; mov.u64 %r469,%r465; mov.u64 %r470,%r466; $L95: .loc 1 419 33 add.u64 %r162,%r162,%r91; .loc 1 419 48 add.u64 %r163,%r163,%r44; .loc 1 419 8 add.u64 %r206,%r205,1; setp.eq.u64 %r427,%r153,%r205; @ %r427 bra $L93; .loc 1 421 9 ld.s8 %r220,[%r163]; mov.u64 %r205,%r206; setp.eq.u32 %r473,%r220,0; bra $L97; $L94: add.u64 %r201,%r161,1; bra $L100; $L107: mov.u64 %r201,%r202; $L100: .loc 1 430 10 ld.s8 %r429,[%r163]; cvt.u16.u32 %r428,%r429; setp.eq.u16 %r430,%r428,0; @ %r430 bra $L98; .loc 1 430 20 ld.u64 %r467,[%r162]; ld.u64 %r468,[%r162+8]; .loc 1 430 17 setp.gt.s64 %r433,%r470,%r468; @ %r433 bra $L109; setp.ne.u64 %r436,%r470,%r468; @ %r436 bra $L98; setp.gt.u64 %r439,%r469,%r467; @ ! 
%r439 bra $L98; $L109: .loc 1 433 11 cvt.u32.u64 %r143,%r201; mov.u64 %r469,%r467; mov.u64 %r470,%r468; $L98: .loc 1 428 35 add.u64 %r162,%r162,%r91; .loc 1 428 50 add.u64 %r163,%r163,%r44; .loc 1 428 10 add.u64 %r202,%r201,1; setp.ne.u64 %r440,%r153,%r201; @ %r440 bra $L107; $L93: .loc 1 436 8 st.u32 [%r170],%r143; .loc 1 439 15 add.u64 %r187,%r187,1; .loc 1 440 12 add.u64 %r168,%r168,%r102; .loc 1 441 13 add.u64 %r169,%r169,%r104; .loc 1 442 12 add.u64 %r170,%r170,%r107; .loc 1 444 13 setp.ne.u64 %r441,%r29,%r187; @ %r441 bra $L106; .loc 1 452 10 add.u64 %r165,%r169,%r137; .loc 1 453 9 add.u64 %r166,%r170,%r188; .loc 1 455 7 @ %r476 bra $L58; add.u64 %r150,%frame,488; .loc 1 451 23 mov.u64 %r110,%r38; .loc 1 455 7 mov.u64 %r215,8; .loc 1 454 5 mov.u64 %r167,1; .loc 1 448 13 mov.u64 %r478,0; bra $L103; $L104: st.u64 [%r150],%r478; .loc 1 451 23 mul.lo.u64 %r110,%r123,%r122; .loc 1 452 24 mul.lo.u64 %r447,%r125,%r122; .loc 1 452 10 sub.u64 %r165,%r169,%r447; .loc 1 453 23 mul.lo.u64 %r448,%r127,%r122; .loc 1 453 9 shl.b64 %r449,%r448,2; sub.u64 %r166,%r170,%r449; .loc 1 454 5 add.u64 %r167,%r167,1; .loc 1 455 7 add.u64 %r150,%r150,8; add.u64 %r215,%r215,8; setp.eq.u64 %r450,%r167,%r482; @ %r450 bra $L58; $L103: .loc 1 463 16 ld.u64 %r451,[%r150]; add.u64 %r122,%r451,1; st.u64 [%r150],%r122; .loc 1 464 23 add.u64 %r453,%r483,%r215; ld.u64 %r123,[%r453]; .loc 1 464 13 sub.u64 %r454,%r123,%r110; shl.b64 %r455,%r454,4; add.u64 %r168,%r168,%r455; .loc 1 465 24 add.u64 %r456,%frame,%r215; ld.u64 %r125,[%r456]; .loc 1 465 14 add.u64 %r169,%r165,%r125; .loc 1 466 23 add.u64 %r458,%r485,%r215; ld.u64 %r127,[%r458]; .loc 1 466 13 shl.b64 %r459,%r127,2; add.u64 %r170,%r166,%r459; .loc 1 444 32 add.u64 %r461,%r484,%r215; ld.u64 %r131,[%r461]; .loc 1 444 13 setp.eq.u64 %r462,%r122,%r131; @ %r462 bra $L104; .loc 1 448 13 mov.u64 %r187,0; bra $L106; $L128: .loc 1 322 6 ld.u64 %r463,[%r221]; setp.eq.u64 %r464,%r463,0; @ ! 
%r464 bra $L73; bra $L105; $L58: .loc 1 4_gfortran_sminloc1_4_i16 .visible .func _gfortran_sminloc1_4_i16, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u32 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L132; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L133; $L132:r152154; call _gfortran_minloc1_4_i16500 7 bra $L131; $L133: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r296,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L135; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L136; $L142: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L137; bra $L190; $L135: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error136: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r302,0; $L141: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L139; .loc 1 515 17 st.u64 [%r139],%r36; bra $L140; $L139: .loc 1 518 12 st.u64 [%r139],%r302; $L140: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L141; bra $L142; $L190: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L144; bra $L143; $L137: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r301,0; $L147: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L145; .loc 1 523 17 st.u64 [%r43],%r41; bra $L146; $L145: .loc 1 527 12 st.u64 [%r43],%r301; $L146: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L147; bra $L191; $L143: .loc 1 534 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L149; add.u64 %r295,%frame,120; $L169: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L150; $L149: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L151; bra $L192; $L153: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L150: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L153; bra $L149; $L151: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L131; $L192: .loc 1 557 2496246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L157: .loc 1 583 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L154; bra $L155; $L144: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L156; .loc 1 562 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error156: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L157; .loc 1 569 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L155; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L159: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L158; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error158: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L159; $L154: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ %r268 bra 
$L160; shl.b64 %r266,%r296,3; bra $L161; $L160: mov.u64 %r266,8; $L161: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L162: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L162; $L155: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,2; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 593 13 mov.u32 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L168: st.u32 [%r98],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L168; .loc 1 606 7 @ ! %r294 bra $L193; bra $L131; $L167: .loc 1 601 13 st.u64 [%r99],%r297; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L166; bra $L131; $L193: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; .loc 1 601 13 mov.u64 %r297,0; $L166: .loc 1 610 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,2; add.u64 %r98,%r98,%r291; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L167; .loc 1 601 13 mov.u64 %r80,0; bra $L168; $L191: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r293,%r98,0; @ ! 
[Tail of the preceding archive member's compiled PTX body.]

Archive member: minloc1_8_i16.o (mtime 1622802233)
  .file 1 "../../../../libgfortran/generated/minloc1_8_i16.c"
  Visible functions: _gfortran_minloc1_8_i16, _gfortran_mminloc1_8_i16, _gfortran_sminloc1_8_i16
  Extern declarations: _gfortrani_compile_options (.global .align 8 .u64), _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents
  Runtime-error format strings, decoded from the byte arrays in the dump (several are cut off mid-string):
    "rank of return array incorrect in MINLOC intrinsic: is %ld, should be %l…"
    "MINLOC"
    "…incorrect in MINLOC intrinsic: is %ld, should be between 1 and %l…"
    "Funny sized logical array"
    "rank of return array incorrect in MINLOC intrin…", "MASK argumen…"
    "Incorrect extent in return value of MINLOC intrinsic in dimension %ld: is %ld, should be %l…"
    "return val…"

[The compiled PTX bodies of the three functions follow in the payload. _gfortran_minloc1_8_i16 walks the source descriptor, allocates the return array when needed, and for each vector along the reduced dimension stores the 1-based position of its minimum (0 for an empty extent); _gfortran_mminloc1_8_i16 is the MASK=array variant; _gfortran_sminloc1_8_i16 handles a scalar mask, delegating to _gfortran_minloc1_8_i16 when the mask is true and zero-filling the result otherwise.]
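For orientation, the reduction those PTX bodies implement can be summarised with a small C sketch. This is not the libgfortran source (which is m4-generated and walks GFC array descriptors with arbitrary strides); it assumes a contiguous layout and invented parameter names, and only mirrors the unmasked _gfortran_minloc1_8_i16 semantics visible in the dump: the 1-based position of the first minimum, or 0 for an empty extent.

#include <stdint.h>

/* Simplified, illustrative sketch only -- contiguous data, no descriptors.
   `src` holds `nvec` vectors of `len` 128-bit integers each (the reduced
   dimension); `dest` receives one kind-8 (int64_t) position per vector.
   __int128 is a GCC extension, matching the _i16 element kind. */
static void
minloc_dim_sketch (int64_t *dest, const __int128 *src,
                   int64_t nvec, int64_t len)
{
  for (int64_t v = 0; v < nvec; v++)
    {
      const __int128 *p = src + v * len;
      int64_t pos = 0;                /* 0 means "no element" (empty extent) */
      __int128 best = 0;

      for (int64_t i = 0; i < len; i++)
        if (pos == 0 || p[i] < best)
          {
            best = p[i];
            pos = i + 1;              /* Fortran MINLOC positions are 1-based */
          }

      dest[v] = pos;
    }
}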
[Remainder of the minloc1_8_i16.o member, then the next archive member.]

Archive member for "../../../../libgfortran/generated/minloc1_16_i16.c"
  Visible functions: _gfortran_minloc1_16_i16, _gfortran_mminloc1_16_i16, _gfortran_sminloc1_16_i16
  Extern declarations and runtime-error format strings are repeated verbatim from the minloc1_8_i16.o member.

[The compiled PTX bodies of the three functions follow. They implement the same MINLOC reduction with a kind-16 (128-bit) result instead of kind-8, so positions are stored and compared as pairs of 64-bit registers in the PTX.]
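The masked variant in these bodies (_gfortran_mminloc1_16_i16) considers only elements whose MASK entry is true and leaves 0 for a vector whose mask is entirely false; the dump also shows it validating the logical kind and calling _gfortran_runtime_error with "Funny sized logical array" otherwise. A hedged, simplified C sketch of that inner loop, again assuming contiguous data and a plain bool mask rather than descriptor strides and logical kinds 1/2/4/8/16:

#include <stdbool.h>
#include <stdint.h>

/* Illustrative sketch only: contiguous data, bool mask, the kind-16 result
   approximated by GCC's __int128.  The generated runtime code additionally
   handles non-unit strides and all supported logical kinds. */
static void
mminloc_dim_sketch (__int128 *dest, const __int128 *src, const bool *mask,
                    int64_t nvec, int64_t len)
{
  for (int64_t v = 0; v < nvec; v++)
    {
      const __int128 *p = src + v * len;
      const bool *m = mask + v * len;
      __int128 pos = 0;               /* stays 0 when no mask entry is true */
      __int128 best = 0;

      for (int64_t i = 0; i < len; i++)
        if (m[i] && (pos == 0 || p[i] < best))
          {
            best = p[i];
            pos = i + 1;
          }

      dest[v] = pos;
    }
}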
%r294 bra $L146; bra $L171; $L133: .loc 1 615minloc1_4_r4.o/_gfortran_minloc1_4_r4 .visible .func _gfortran_minloc1_4_r../libgfortran/generated/minloc1_4_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_4_r4 .visible .func _gfortran_mminloc1_4_r_gfortran_sminloc1_4_r4 .visible .func _gfortran_sminloc1_4_rVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_4_r4 .visible .func _gfortran_minloc1_4_r4pred %r229; .reg .predpred %r240; .reg .predu16 %r278; .reg .u32 %r279; .reg .u32 %r280; .reg .u64 %r282; .reg .u64predu64 %r318; .reg .u64predpred %r337; .reg .pred %r338; .reg .pred %r339; .reg .predu64 %r357; .reg .u64 %r358; .reg .u64 %r360; .reg .pred %r361; .reg .u64 %r362; .reg .predpred %r367; .reg .u64.loc 1 58 10 ld.s8 %r22,[%r191+28]; .loc 1 58 38 add.u32 %r194,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r122,%r194; .loc 1 59 10 ld.u64 %r25,[%r192]; .loc 1 59 7 add.u64 %r123,%r25,-1; .loc 1 61 7 shr.u64 %r196,%r123,63; set.u32.lt.s64 %r198,%r122,%r123; neg.s32 %r199,%r198; cvt.u16.u64 %r202,%r196;61%r206 bra $L2; .loc 1 63 7 cvt.u32.u32 %r209,%r22; cvt.s64.s8 %r208,%r209; st.u64 [%stack+8],%r208007_gfortran_runtime_errorr212,%r123,%r123; add.u64 %r213,%r212,%r123; shl.b64 %r214,%r213,3; add.u64 %r215,%r191,%r214; ld.u64 %r30,[%r215+56]; ld.u64 %r32,[%r215+48]; .loc 1 71 9 ld.u64 %r126,[%r215+40]; .loc 1 73 3 setp.ne.u64 %r229,%r123,0; @ %r229 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r230,%r122,%r123; @ %r230 bra $L4; bra $L61; $L3: add.u64 %r45,%r191,40; add.u64 %r69,%frame,120; add.u64 %r44,%frame,240; add.u64 %r232,%r25,%r25; add.u64 %r233,%r232,%r25; shl.b64 %r234,%r233,3; add.u64 %r235,%r191,16; add.u64 %r114,%r234,%r235; .loc 1 79 12 mov.u64 %r377,0; $L8: .loc 1 75 18 ld.u64 %r236,[%r45]; st.u64 [%r69],%r236; .loc 1 76 19 ld.u64 %r238,[%r45+16]; add.u64 %r237,%r238,1; ld.u64 %r239,[%r45+8]; sub.u64 %r37,%r237,%r239; .loc 1 78 10 setp.lt.s64 %r240,%r37,0; @ %r240 bra $L6; .loc 1 76 17 st.u64 [%r44],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 [%r44],%r377; $L7: .loc 1 
73 3 add.u64 %r45,%r45,24; add.u64 %r69,%r69,8; add.u64 %r44,%r44,8; setp.ne.u64 %r242,%r45,%r114; @ %r242 bra $L8; bra $L9; $L61: .loc 1 90 6 ld.u64 %r243,[%r190]; setp.eq.u64 %r244,%r243,0; @ ! %r244 bra $L11; bra $L10; $L4: add.u64 %r246,%r25,%r25; add.u64 %r247,%r246,%r25; shl.b64 %r248,%r247,3; add.u64 %r249,%r248,40; add.u64 %r106,%r191,%r249; shl.b64 %r250,%r25,3; add.u64 %r81,%r250,-8; add.u64 %r365,%frame,120; add.u64 %r101,%r365,%r81; add.u64 %r364,%frame,240; add.u64 %r58,%r364,%r81; add.u64 %r253,%r191,40; cvt.u32.u32 %r255,%r22; cvt.s64.s8 %r254,%r255; add.u64 %r257,%r254,%r254; add.u64 %r258,%r257,%r254; shl.b64 %r259,%r258,3; add.u64 %r185,%r253,%r259; .loc 1 87 12 mov.u64 %r376,0; $L14: .loc 1 83 18 ld.u64 %r260,[%r106]; st.u64 [%r101],%r260; .loc 1 84 19 ld.u64 %r262,[%r106+16]; add.u64 %r261,%r262,1; ld.u64 %r263,[%r106+8]; sub.u64 %r42,%r261,%r263; .loc 1 86 10 setp.lt.s64 %r264,%r42,0; @ %r264 bra $L12; .loc 1 84 17 st.u64 [%r58],%r42; bra $L13; $L12: .loc 1 87 12 st.u64 [%r58],%r376; $L13: .loc 1 81 3 add.u64 %r106,%r106,24; add.u64 %r101,%r101,8; add.u64 %r58,%r58,8; setp.ne.u64 %r266,%r106,%r185; @ %r266 bra $L14; bra $L62; $L10: .loc 1 94 7 setp.le.s64 %r267,%r122,0; @ %r267 bra $L16; add.u64 %r364,%frame,240; $L45: add.u64 %r144,%r190,40; mov.u64 %r119,%r364; cvt.u32.u32 %r269,%r22; cvt.s64.s8 %r268,%r269; add.u64 %r271,%r268,%r268; add.u64 %r272,%r271,%r268; shl.b64 %r273,%r272,3; add.u64 %r274,%r190,16; add.u64 %r109,%r273,%r274; .loc 1 97 10 mov.u64 %r108,1; .loc 1 101 4 mov.u64 %r297,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r275,0; st.u64 [%r190+8],%r275; .loc 1 106 28 cvt.u16.u32 %r278,%r22; add.u16 %r277,%r278,-1; cvt.u32.u16 %r279,%r277; st.u8 [%r190+28],%r279; .loc 1 108 20 add.u32 %r280,%r22,-2; cvt.s64.s32 %r51,%r280; add.u64 %r282,%r51,%r51; add.u64 %r283,%r282,%r51; shl.b64 %r284,%r283,3; add.u64 %r285,%r190,%r284; .loc 1 108 67 shl.b64 %r287,%r51,3; add.u64 %r288,%frame,%r287; .loc 1 108 59 ld.u64 %r290,[%r285+40]; ld.u64 %r291,[%r288+240]; mul.lo.u64 %r128,%r290,%r291; .loc 1 110 29293; call (%value_in),_gfortrani_xmallocarray294,[%value_in]; } .loc 1 110 27 st.u64 [%r190],%r294; .loc 1 111 10 setp.eq.u64 %r296,%r128,0; @ ! 
%r296 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r108,%r47,%r108; $L17: .loc 1 101 4 st.u64 [%r144+8],%r297; ld.u64 %r47,[%r119]; add.u64 %r298,%r47,-1; st.u64 [%r144+16],%r298; st.u64 [%r144],%r108; .loc 1 94 7 add.u64 %r144,%r144,24; add.u64 %r119,%r119,8; setp.ne.u64 %r299,%r109,%r144; @ %r299 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r190+48],%r128; mov.u64 %r301,-1; st.u64 [%r190+56],%r301; mov.u64 %r302,1; st.u64 [%r190+40],%r302; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r57,[%r190+28]; .loc 1 121 10 setp.eq.u64 %r303,%r57,%r122; @ %r303 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r122; st.u64 [%stack],%r57;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r306,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r307,[%r306+36]; setp.eq.u32 %r308,%r307,0; @ %r308 bra $L21; .loc 1 128 2 add.u64 %r364,%frame,240; cvta.const.u64 %r312,$LC2; cvta.const.u64 %r311,$LC3;_gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r314,%r122,0; @ %r314 bra $L24; $L27: .loc 1 68 9 add.u64 %r315,%r30,1; .loc 1 68 7 sub.u64 %r124,%r315,%r32; max.s64 %r142,%r124,0; .loc 1 140 8 ld.u64 %r137,[%r191]; .loc 1 141 8 ld.u64 %r138,[%r190]; .loc 1 168 40 shl.b64 %r65,%r126,2; ld.u64 %r145,[%frame+360]; .loc 1 204 22 ld.u64 %r75,[%frame+120]; .loc 1 204 12 shl.b64 %r76,%r75,2; .loc 1 205 22 ld.u64 %r78,[%frame]; .loc 1 205 12 shl.b64 %r79,%r78,2; .loc 1 207 32 ld.u64 %r117,[%frame+240]; .loc 1 214 23 mul.lo.u64 %r316,%r75,%r117; .loc 1 214 9 shl.b64 %r317,%r316,2; neg.s64 %r160,%r317; .loc 1 215 23 mul.lo.u64 %r318,%r78,%r117; .loc 1 215 9 shl.b64 %r319,%r318,2; neg.s64 %r68,%r319; setp.gt.s64 %r366,%r124,0; setp.le.s64 %r367,%r122,1; .loc 1 157 9 mov.u32 %r369,1; .loc 1 180 9 setp.ne.u32 %r370,%r193,0; cvt.u32.u32 %r371,%r22; cvt.s64.s8 %r372,%r371; add.u64 %r373,%r372,-1; add.u64 %r374,%frame,120; add.u64 %r375,%frame,240; bra $L25; $L24: add.u64 %r166,%frame,360; add.u64 %r165,%r190,40; mov.u64 %r161,%frame; cvt.u32.u32 %r321,%r22; cvt.s64.s8 %r320,%r321; add.u64 %r146,%r320,-1; .loc 1 132 3 mov.u64 %r139,0; add.u64 %r364,%frame,240; .loc 1 134 16 mov.u64 %r322,%r139; $L26: st.u64 [%r166],%r322; .loc 1 135 18 ld.u64 %r323,[%r165]; st.u64 [%r161],%r323; .loc 1 136 17 shl.b64 %r325,%r139,3; add.u64 %r326,%r364,%r325; .loc 1 136 10 ld.u64 %r327,[%r326]; setp.le.s64 %r328,%r327,0; @ %r328 bra $L1; .loc 1 132 26 add.u64 %r139,%r139,1; .loc 1 132 3 add.u64 %r166,%r166,8; add.u64 %r165,%r165,24; add.u64 %r161,%r161,8; setp.ne.u64 %r329,%r139,%r146; @ %r329 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r366 bra $L46; .loc 1 159 10 mov.u32 %r331,0; st.u32 [%r138],%r331; bra $L29; $L46: mov.u64 %r130,%r137; .loc 1 168 17 mov.u64 %r129,0; $L28: .loc 1 170 7 ld.f32 %r107,[%r130]; .loc 1 170 6 setp.le.f32 %r332,%r107,0f7f800000; @ ! %r332 bra $L63; .loc 1 173 14 cvt.u32.u64 %r333,%r129; add.u32 %r112,%r333,1; .loc 1 180 9 @ %r370 bra $L32; .loc 1 190 8 setp.gt.s64 %r335,%r142,%r129; @ %r335 bra $L33; bra $L34; $L32: .loc 1 181 8 setp.le.s64 %r336,%r142,%r129; @ %r336 bra $L34; add.u64 %r173,%r129,1; .loc 1 170 7 mov.f32 %r186,%r107; bra $L35; $L63: .loc 1 168 32 add.u64 %r129,%r129,1; .loc 1 168 40 add.u64 %r130,%r130,%r65; .loc 1 168 10 setp.lt.s64 %r337,%r129,%r142; @ %r337 bra $L28; .loc 1 157 9 mov.u32 %r112,%r369; bra $L34; $L35: .loc 1 183 8 setp.ge.f32 %r338,%r107,%r186; @ ! 
%r338 bra $L36; .loc 1 186 16 cvt.u32.u64 %r112,%r173; .loc 1 185 16 mov.f32 %r107,%r186; $L36: .loc 1 181 33 add.u64 %r130,%r130,%r65; .loc 1 181 8 add.u64 %r174,%r173,1; setp.eq.u64 %r339,%r142,%r173; @ %r339 bra $L34; .loc 1 183 9 ld.f32 %r186,[%r130]; mov.u64 %r173,%r174; bra $L35; $L33: add.u64 %r169,%r129,1; bra $L40; $L47: mov.u64 %r169,%r170; $L40: .loc 1 192 9 ld.f32 %r70,[%r130]; .loc 1 192 8 setp.lt.f32 %r340,%r70,%r107; @ ! %r340 bra $L38; .loc 1 195 16 cvt.u32.u64 %r112,%r169; .loc 1 194 16 mov.f32 %r107,%r70; $L38: .loc 1 190 33 add.u64 %r130,%r130,%r65; .loc 1 190 8 add.u64 %r170,%r169,1; setp.ne.u64 %r341,%r142,%r169; @ %r341 bra $L47; $L34: .loc 1 199 12 st.u32 [%r138],%r112; $L29: .loc 1 203 15 add.u64 %r145,%r145,1; .loc 1 204 12 add.u64 %r137,%r137,%r76; .loc 1 205 12 add.u64 %r138,%r138,%r79; .loc 1 207 13 setp.ne.u64 %r342,%r145,%r117; @ %r342 bra $L25; .loc 1 214 9 add.u64 %r134,%r137,%r160; .loc 1 215 9 add.u64 %r135,%r138,%r68; .loc 1 217 7 @ %r367 bra $L1; add.u64 %r121,%frame,368; mov.u64 %r182,8; .loc 1 216 5 mov.u64 %r136,1; .loc 1 211 13 mov.u64 %r368,0; bra $L43; $L44: st.u64 [%r121],%r368; .loc 1 214 23 mul.lo.u64 %r348,%r94,%r93; .loc 1 214 9 shl.b64 %r349,%r348,2; sub.u64 %r134,%r137,%r349; .loc 1 215 23 mul.lo.u64 %r350,%r97,%r93; .loc 1 215 9 shl.b64 %r351,%r350,2; sub.u64 %r135,%r138,%r351; .loc 1 216 5 add.u64 %r136,%r13182,%r182,8; setp.eq.u64 %r352,%r136,%r373; @ %r352 bra $L1; $L43: .loc 1 225 16 ld.u64 %r353,[%r121]; add.u64 %r93,%r353,1; st.u64 [%r121],%r93; .loc 1 226 23 add.u64 %r355,%r374,%r182; ld.u64 %r94,[%r355]; .loc 1 226 13 shl.b64 %r356,%r94,2; add.u64 %r137,%r134,%r356; .loc 1 227 23 add.u64 %r357,%frame,%r182; ld.u64 %r97,[%r357]; .loc 1 227 13 shl.b64 %r358,%r97,2; add.u64 %r138,%r135,%r358; .loc 1 207 32 add.u64 %r360,%r375,%r182; ld.u64 %r100,[%r360]; .loc 1 207 13 setp.eq.u64 %r361,%r93,%r100; @ %r361 bra $L44; .loc 1 211 13 mov.u64 %r145,0; bra $L25; $L62: .loc 1 90 6 ld.u64 %r362,[%r190]; setp.eq.u64 %r363,%r362,0; @ ! 
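For orientation, here is a minimal C sketch of the computation _gfortran_minloc1_4_r4 performs, shown only for the 2-D, DIM=1 case. It is not the libgfortran source (the real routine is generated, works on array descriptors of any rank, and also special-cases NaN via the +Inf comparison visible above); the name minloc_dim1 and the flat column-major interface are illustrative assumptions.

/* Simplified illustration, not the libgfortran source: what a minloc1-style
 * routine computes for a 2-D REAL(KIND=4) array held in Fortran
 * (column-major) order. */
#include <stdio.h>

/* For each column j of an m-by-n matrix, store in ret[j] the 1-based row
 * index of its smallest element, i.e. Fortran's MINLOC(a, DIM=1). */
void minloc_dim1(const float *a, int m, int n, int *ret)
{
    for (int j = 0; j < n; j++) {
        int best = 1;                      /* Fortran indices are 1-based */
        float min = a[(size_t)j * m];
        for (int i = 1; i < m; i++) {
            float v = a[(size_t)j * m + i];
            if (v < min) {                 /* first occurrence of the minimum wins */
                min = v;
                best = i + 1;
            }
        }
        ret[j] = best;
    }
}

int main(void)
{
    /* 3x2 matrix in column-major order: columns (4,1,3) and (2,5,0). */
    const float a[] = { 4.0f, 1.0f, 3.0f, 2.0f, 5.0f, 0.0f };
    int ret[2];
    minloc_dim1(a, 3, 2, ret);
    printf("%d %d\n", ret[0], ret[1]);     /* prints "2 3" */
    return 0;
}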
// _gfortran_minloc1_4_r4 returns (source line 231).
//
// _gfortran_mminloc1_4_r4: the masked variant, MINLOC(array, DIM, MASK) with a
// LOGICAL mask array (five parameters, %ar0-%ar4).  When its mask argument is
// null it simply calls _gfortran_minloc1_4_r4; otherwise it checks the mask
// element size (reporting "Funny sized logical array" through
// _gfortran_runtime_error for unsupported sizes), validates rank and extents
// via _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents
// when bounds checking is enabled in _gfortrani_compile_options, allocates the
// result with _gfortrani_xmallocarray when needed, and scans each slice
// skipping elements whose mask entry is false (source lines 261-466).
// [register declarations and instruction stream corrupted in this dump]
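The masked scan skips elements whose MASK entry is false and yields 0 for a slice that has no true entry. Again a simplified sketch for the 2-D, DIM=1 case; the name mminloc_dim1 and the bool mask layout are illustrative, not the libgfortran ABI.

/* Simplified illustration of MINLOC(a, DIM=1, MASK=msk) for a 2-D
 * column-major REAL(KIND=4) array. */
#include <stdio.h>
#include <stdbool.h>

void mminloc_dim1(const float *a, const bool *msk, int m, int n, int *ret)
{
    for (int j = 0; j < n; j++) {
        int best = 0;                      /* 0 means "no unmasked element" */
        float min = 0.0f;
        for (int i = 0; i < m; i++) {
            if (!msk[(size_t)j * m + i])
                continue;                  /* masked-out element: ignore */
            float v = a[(size_t)j * m + i];
            if (best == 0 || v < min) {
                min = v;
                best = i + 1;
            }
        }
        ret[j] = best;
    }
}

int main(void)
{
    const float a[]  = { 4.0f, 1.0f, 3.0f, 2.0f, 5.0f, 0.0f };
    const bool msk[] = { true, false, true, false, false, false };
    int ret[2];
    mminloc_dim1(a, msk, 3, 2, ret);
    printf("%d %d\n", ret[0], ret[1]);     /* prints "3 0" */
    return 0;
}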
// _gfortran_mminloc1_4_r4 returns.
//
// _gfortran_sminloc1_4_r4: the scalar-mask entry point (five parameters).
// When the MASK argument is absent (null) or .TRUE. it simply calls
// _gfortran_minloc1_4_r4 and returns (source lines 493-500).  When MASK is
// .FALSE. it only shapes the result: rank and extent mismatches are reported
// through _gfortran_runtime_error when bounds checking is enabled in
// _gfortrani_compile_options, the result array is allocated with
// _gfortrani_xmallocarray if necessary, and every element is set to zero
// (source lines 503-615).  [register declarations and instruction stream
// corrupted in this dump]
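The scalar-mask dispatch visible in the PTX above can be summarised in a few lines of C. sminloc_dim1 is a hypothetical name and the flat interface is the same illustrative one as before; minloc_dim1 is reused from the first sketch rather than redefined here.

/* Sketch of the scalar-mask entry point's dispatch: a null or .TRUE. scalar
 * mask forwards to the unmasked routine, a .FALSE. mask zero-fills the
 * result.  Not the libgfortran ABI. */
#include <string.h>
#include <stdbool.h>

/* unmasked reduction from the earlier sketch, assumed to be linked in */
void minloc_dim1(const float *a, int m, int n, int *ret);

void sminloc_dim1(const float *a, int m, int n, int *ret, const bool *mask)
{
    if (mask == NULL || *mask) {
        minloc_dim1(a, m, n, ret);              /* MASK absent or .TRUE. */
        return;
    }
    memset(ret, 0, (size_t)n * sizeof *ret);    /* MASK .FALSE.: every index is 0 */
}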
// _gfortran_sminloc1_4_r4 returns (source line 615).
//
// Archive member minloc1_8_r4.o — compiled from "../libgfortran/generated/minloc1_8_r4.c"
// BEGIN GLOBAL FUNCTION DECL: _gfortran_minloc1_8_r4   .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_8_r4  .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_sminloc1_8_r4  .visible .func
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options  .extern .global .align 8 .u64
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error             .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarray             .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return  .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents     .extern .func
// The string-constant byte arrays repeat the MINLOC runtime diagnostics
// listed for minloc1_4_r4.o above.
//
// _gfortran_minloc1_8_r4: the same reduction as _gfortran_minloc1_4_r4 but
// with an INTEGER(KIND=8) result array (the index stores are st.u64 rather
// than st.u32).  [register declarations and instruction stream corrupted in
// this dump]
// _gfortran_minloc1_8_r4 returns (source line 231).
//
// _gfortran_mminloc1_8_r4: masked variant with INTEGER(KIND=8) result; same
// structure as _gfortran_mminloc1_4_r4 (null-mask forwarding to
// _gfortran_minloc1_8_r4, mask element-size check, bounds checks, result
// allocation via _gfortrani_xmallocarray, per-slice masked scan; source lines
// 261-466).  [instruction stream corrupted in this dump]
//
// _gfortran_sminloc1_8_r4: scalar-mask entry point; same dispatch as
// _gfortran_sminloc1_4_r4 (an absent or .TRUE. mask calls
// _gfortran_minloc1_8_r4, a .FALSE. mask shapes the result array and prepares
// the zero fill; source lines 493 onward).  [instruction stream corrupted in
// this dump]
%r268 bra $L172; shl.b64 %r266,%r296,3; bra $L173; $L172: mov.u64 %r266,8; $L173: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L174: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L174; $L167: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,3; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 593 13 mov.u64 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L180: st.u64 [%r98],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L180; .loc 1 606 7 @ ! %r294 bra $L205; bra $L143; $L179: .loc 1 601 13 st.u64 [%r99],%r280; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L178; bra $L143; $L205: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; $L178: .loc 1 610 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,3; add.u64 %r98,%r98,%r291; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L179; .loc 1 601 13 mov.u64 %r80,0; bra $L180; $L203: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r293,%r98,0; @ ! %r293 bra $L156; bra $L181; $L143: .loc 1 615 minloc1_16_r4.o/162280223_gfortran_minloc1_16_r4 .visible .func _gfortran_minloc1_16_r../libgfortran/generated/minloc1_16_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_16_r4 .visible .func _gfortran_mminloc1_16_r_gfortran_sminloc1_16_r4 .visible .func _gfortran_sminloc1_16_rVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 
},105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_16_r4 .visible .func _gfortran_minloc1_16_r4u64 %r116; .reg .f32u16 %r213; .reg .u16 %r214; .reg .u16 %r215; .reg .u32 %r216; .reg .u16 %r217; .reg .predpredu64 %r265; .reg .u32predu64 %r283; .reg .u64pred64u32 %r333; .reg .u64 %r334; .reg .u64pred %r340; .reg .pred %r341; .reg .u64 %r343; .reg .pred %r345; .reg .pred %r348; .reg .predpredu32 %r438; .reg .pred %r445; .reg .pred %r448; .reg .predu64 %r460; .reg .predpred %r470; .reg .u64 %r471; .reg .predu64 %r507; .reg .pred %r511; .reg .pred %r512; .reg .u64 %r513; .reg .u64 %r514; .reg .u64 %r515; .reg .pred %r516; .reg .u64 %r517; .reg .u32 %r518; .reg .u64 %r519; .reg .u64 %r520; .reg .u64 %r521; .reg .u64 %r522; .reg .u64 %r523; .reg .u64 %r524; mov.u64 %r202,%ar0; mov.u64 %r203,%ar1; mov.u64 %r204,%ar2; mov.u32 %r205,%ar3; .loc 1 58 10 ld.s8 %r22,[%r203+28]; .loc 1 58 38 add.u32 %r206,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r130,%r206; .loc 1 59 10 ld.u64 %r25,[%r204]; .loc 1 59 7 add.u64 %r131,%r25,-1; .loc 1 61 7 shr.u64 %r208,%r131,63; set.u32.lt.s64 %r210,%r130,%r131; neg.s32 %r211,%r210; cvt.u16.u64 %r214,%r208; cvt.u16.u32 %r215,%r211; or.b16 %r213,%r214,%r215; .loc 1 61 6 cvt.u32.u16 %r216,%r213; cvt.u16.u8 %r217,%r216; setp.eq.u16 %r218,%r217,0; @ %r218 bra $L2; .loc 1 63 7 cvt.u32.u32 %r221,%r22; cvt.s64.s8 %r220,%r221; st.u64 [%stack+8],%r22011_gfortran_runtime_errorr224,%r131,%r131; add.u64 %r225,%r224,%r131; shl.b64 %r226,%r225,3; add.u64 %r227,%r203,%r226; ld.u64 %r30,[%r227+56]; ld.u64 %r32,[%r227+48]; .loc 1 71 9 ld.u64 %r134,[%r227+40]; .loc 1 73 3 setp.ne.u64 %r241,%r131,0; @ %r241 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r242,%r130,%r131; @ %r242 bra $L4; bra $L63; $L3: add.u64 %r76,%r203,40; add.u64 %r38,%frame,120; add.u64 %r164,%frame,240; add.u64 %r244,%r25,%r25; add.u64 %r245,%r244,%r25; shl.b64 %r246,%r245,3; add.u64 %r247,%r203,16; add.u64 %r43,%r246,%r247; .loc 1 79 12 mov.u64 %r524,0; $L8: .loc 1 75 18 ld.u64 %r248,[%r76]; st.u64 [%r38],%r248; .loc 1 76 19 ld.u64 %r250,[%r76+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r76+8]; sub.u64 %r37,%r249,%r251; .loc 1 78 10 setp.lt.s64 %r252,%r37,0; @ %r252 bra $L6; .loc 1 76 17 st.u64 [%r164],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 [%r164],%r524; $L7: .loc 1 73 3 add.u64 %r76,%r76,24; add.u64 %r38,%r38,8; add.u64 %r164,%r164,8; setp.ne.u64 %r254,%r43,%r76; @ %r254 bra $L8; bra $L9; $L63: .loc 1 90 6 ld.u64 %r255,[%r202]; setp.eq.u64 %r256,%r255,0; @ ! 
%r256 bra $L11; bra $L10; $L4: add.u64 %r258,%r25,%r25; add.u64 %r259,%r258,%r25; shl.b64 %r260,%r259,3; add.u64 %r261,%r260,40; add.u64 %r72,%r203,%r261; shl.b64 %r262,%r25,3; add.u64 %r197,%r262,-8; add.u64 %r514,%frame,120; add.u64 %r128,%r514,%r197; add.u64 %r513,%frame,240; add.u64 %r195,%r513,%r197; cvt.u32.u32 %r266,%r22; cvt.s64.s8 %r265,%r266; add.u64 %r268,%r265,%r265; add.u64 %r269,%r268,%r265; shl.b64 %r270,%r269,3; add.u64 %r271,%r203,40; add.u64 %r64,%r270,%r271; .loc 1 87 12 mov.u64 %r523,0; $L14: .loc 1 83 18 ld.u64 %r272,[%r72]; st.u64 [%r128],%r272; .loc 1 84 19 ld.u64 %r274,[%r72+16]; add.u64 %r273,%r274,1; ld.u64 %r275,[%r72+8]; sub.u64 %r45,%r273,%r275; .loc 1 86 10 setp.lt.s64 %r276,%r45,0; @ %r276 bra $L12; .loc 1 84 17 st.u64 [%r195],%r45; bra $L13; $L12: .loc 1 87 12 st.u64 [%r195],%r523; $L13: .loc 1 81 3 add.u64 %r72,%r72,24; add.u64 %r128,%r128,8; add.u64 %r195,%r195,8; setp.ne.u64 %r278,%r64,%r72; @ %r278 bra $L14; bra $L64; $L10: .loc 1 94 7 setp.le.s64 %r279,%r130,0; @ %r279 bra $L16; add.u64 %r513,%frame,240; $L46: add.u64 %r113,%r202,40; mov.u64 %r110,%r513; cvt.u32.u32 %r281,%r22; cvt.s64.s8 %r280,%r281; add.u64 %r283,%r280,%r280; add.u64 %r284,%r283,%r280; shl.b64 %r285,%r284,3; add.u64 %r286,%r202,16; add.u64 %r86,%r285,%r286; .loc 1 97 10 mov.u64 %r111,1; .loc 1 101 4 mov.u64 %r309,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r287,0; st.u64 [%r202+8],%r287; .loc 1 106 28 cvt.u16.u32 %r290,%r22; add.u16 %r289,%r290,-1; cvt.u32.u16 %r291,%r289; st.u8 [%r202+28],%r291; .loc 1 108 20 add.u32 %r292,%r22,-2; cvt.s64.s32 %r53,%r292; add.u64 %r294,%r53,%r53; add.u64 %r295,%r294,%r53; shl.b64 %r296,%r295,3; add.u64 %r297,%r202,%r296; .loc 1 108 67 shl.b64 %r299,%r53,3; add.u64 %r300,%frame,%r299; .loc 1 108 59 ld.u64 %r302,[%r297+40]; ld.u64 %r303,[%r300+240]; mul.lo.u64 %r136,%r302,%r303; .loc 1 110 29 mov.u64 %r305,16305; call (%value_in),_gfortrani_xmallocarray06,[%value_in]; } .loc 1 110 27 st.u64 [%r202],%r306; .loc 1 111 10 setp.eq.u64 %r308,%r136,0; @ ! 
%r308 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r111,%r49,%r111; $L17: .loc 1 101 4 st.u64 [%r113+8],%r309; ld.u64 %r49,[%r110]; add.u64 %r310,%r49,-1; st.u64 [%r113+16],%r310; st.u64 [%r113],%r111; .loc 1 94 7 add.u64 %r113,%r113,24; add.u64 %r110,%r110,8; setp.ne.u64 %r311,%r86,%r113; @ %r311 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r202+48],%r136; mov.u64 %r313,-1; st.u64 [%r202+56],%r313; mov.u64 %r314,1; st.u64 [%r202+40],%r314; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r60,[%r202+28]; .loc 1 121 10 setp.eq.u64 %r315,%r60,%r130; @ %r315 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r130; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r318,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r319,[%r318+36]; setp.eq.u32 %r320,%r319,0; @ %r320 bra $L21; .loc 1 128 2 add.u64 %r513,%frame,240; cvta.const.u64 %r324,$LC2r513324; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r326,%r130,0; @ %r326 bra $L24; $L27: .loc 1 68 9 add.u64 %r327,%r30,1; .loc 1 68 7 sub.u64 %r132,%r327,%r32; max.s64 %r68,%r132,0; .loc 1 140 8 ld.u64 %r147,[%r203]; .loc 1 141 8 ld.u64 %r148,[%r202]; .loc 1 168 40 shl.b64 %r70,%r134,2; ld.u64 %r161,[%frame+360]; .loc 1 204 22 ld.u64 %r81,[%frame+120]; .loc 1 204 12 shl.b64 %r82,%r81,2; .loc 1 205 22 ld.u64 %r84,[%frame]; .loc 1 205 12 shl.b64 %r85,%r84,4; .loc 1 207 32 ld.u64 %r122,[%frame+240]; .loc 1 214 23 mul.lo.u64 %r328,%r81,%r122; .loc 1 214 9 shl.b64 %r329,%r328,2; neg.s64 %r166,%r329; .loc 1 215 23 mul.lo.u64 %r330,%r84,%r122; .loc 1 215 9 shl.b64 %r331,%r330,4; neg.s64 %r71,%r331; setp.gt.s64 %r511,%r132,0; setp.le.s64 %r512,%r130,1; .loc 1 180 9 setp.ne.u32 %r516,%r205,0; add.u64 %r517,%r68,-1; cvt.u32.u32 %r518,%r22; cvt.s64.s8 %r519,%r518; add.u64 %r520,%r519,-1; add.u64 %r521,%frame,120; add.u64 %r522,%frame,240; bra $L25; $L24: add.u64 %r158,%frame,360; add.u64 %r157,%r202,40; mov.u64 %r153,%frame; cvt.u32.u32 %r333,%r22; cvt.s64.s8 %r332,%r333; add.u64 %r116,%r332,-1; .loc 1 132 3 mov.u64 %r149,0; add.u64 %r513,%frame,240; .loc 1 134 16 mov.u64 %r334,%r149; $L26: st.u64 [%r158],%r334; .loc 1 135 18 ld.u64 %r335,[%r157]; st.u64 [%r153],%r335; .loc 1 136 17 shl.b64 %r337,%r149,3; add.u64 %r338,%r513,%r337; .loc 1 136 10 ld.u64 %r339,[%r338]; setp.le.s64 %r340,%r339,0; @ %r340 bra $L1; .loc 1 132 26 add.u64 %r149,%r149,1; .loc 1 132 3 add.u64 %r158,%r158,8; add.u64 %r157,%r157,24; add.u64 %r153,%r153,8; setp.ne.u64 %r341,%r116,%r149; @ %r341 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r511 bra $L47; .loc 1 159 10 mov.u64 %r343,0; st.u64 [%r148],%r343; st.u64 [%r148+8],%r343; bra $L29; $L47: mov.u64 %r138,%r147; .loc 1 168 17 mov.u64 %r156,0; $L28: .loc 1 170 7 ld.f32 %r119,[%r138]; mov.u64 %r199,%r156; .loc 1 168 32 add.u64 %r156,%r156,1; .loc 1 170 6 setp.le.f32 %r345,%r119,0f7f800000; @ ! 
%r345 bra $L65; .loc 1 173 14 mov.u64 %r473,%r156; shr.s64 %r474,%r473,63; .loc 1 180 9 @ %r516 bra $L32; .loc 1 190 8 setp.gt.s64 %r348,%r68,%r199; @ %r348 bra $L33; bra $L34; $L32: .loc 1 181 8 setp.le.s64 %r349,%r68,%r199; @ %r349 bra $L34; add.u64 %r481,%r199,1; shr.s64 %r482,%r481,63; sub.u64 %r353,%r517,%r199; shr.s64 %r356,%r199,63; add.u64 %r359,%r199,2; set.u32.lt.u64 %r364,%r359,%r199; cvt.s64.s32 %r362,%r364; sub.u64 %r368,%r356,%r362; add.u64 %r370,%r353,%r359; set.u32.lt.u64 %r375,%r370,%r353; cvt.s64.s32 %r373,%r375; sub.u64 %r379,%r368,%r373; .loc 1 170 7 mov.f32 %r194,%r119; .loc 1 181 8 mov.u64 %r493,1; bra $L35; $L65: .loc 1 168 40 add.u64 %r138,%r138,%r70; .loc 1 168 10 setp.gt.s64 %r380,%r68,%r156; @ %r380 bra $L28; .loc 1 157 9 mov.u64 %r473,1; mov.u64 %r474,0; bra $L34; $L35: .loc 1 183 8 setp.ge.f32 %r381,%r119,%r194; .loc 1 186 16 selp.u64 %r473,%r481,%r473,%r381; selp.u64 %r474,%r482,%r474,%r381; selp.f32 %r119,%r194,%r119,%r381; .loc 1 181 33 add.u64 %r138,%r138,%r70; .loc 1 181 8 add.u64 %r384,%r481,%r493; set.u32.lt.u64 %r389,%r384,%r481; cvt.s64.s32 %r387,%r389; mov.u64 %r481,%r384; sub.u64 %r482,%r482,%r387; setp.ne.u64 %r396,%r370,%r481; @ %r396 bra $L50; setp.ne.u64 %r399,%r379,%r482; @ ! %r399 bra $L34; $L50: .loc 1 183 9 ld.f32 %r194,[%r138]; bra $L35; $L33: add.u64 %r477,%r199,1; shr.s64 %r478,%r477,63; sub.u64 %r403,%r517,%r199; shr.s64 %r406,%r199,63; add.u64 %r409,%r199,2; set.u32.lt.u64 %r414,%r409,%r199; cvt.s64.s32 %r412,%r414; sub.u64 %r418,%r406,%r412; add.u64 %r420,%r403,%r409; set.u32.lt.u64 %r425,%r420,%r403; cvt.s64.s32 %r423,%r425; sub.u64 %r429,%r418,%r423; .loc 1 190 8 mov.u64 %r507,1; $L60: .loc 1 192 9 ld.f32 %r75,[%r138]; .loc 1 192 8 setp.lt.f32 %r430,%r75,%r119; .loc 1 195 16 selp.u64 %r473,%r477,%r473,%r430; selp.u64 %r474,%r478,%r474,%r430; selp.f32 %r119,%r75,%r119,%r430; .loc 1 190 33 add.u64 %r138,%r138,%r70; .loc 1 190 8 add.u64 %r433,%r477,%r507; set.u32.lt.u64 %r438,%r433,%r477; cvt.s64.s32 %r436,%r438; mov.u64 %r477,%r433; sub.u64 %r478,%r478,%r436; setp.ne.u64 %r445,%r420,%r477; @ %r445 bra $L60; setp.ne.u64 %r448,%r429,%r478; @ %r448 bra $L60; $L34: .loc 1 199 12 st.u64 [%r148],%r473; st.u64 [%r148+8],%r474; $L29: .loc 1 203 15 add.u64 %r161,%r161,1; .loc 1 204 12 add.u64 %r147,%r147,%r82; .loc 1 205 12 add.u64 %r148,%r148,%r85; .loc 1 207 13 setp.ne.u64 %r451,%r161,%r122; @ %r451 bra $L25; .loc 1 214 9 add.u64 %r144,%r147,%r166; .loc 1 215 9 add.u64 %r145,%r148,%r71; .loc 1 217 7 @ %r512 bra $L1; add.u64 %r129,%frame,368; mov.u64 %r190,8; .loc 1 216 5 mov.u64 %r146,1; .loc 1 211 13 mov.u64 %r515,0; bra $L44; $L45: st.u64 [%r129],%r515; .loc 1 214 23 mul.lo.u64 %r457,%r97,%r96; .loc 1 214 9 shl.b64 %r458,%r457,2; sub.u64 %r144,%r147,%r458; .loc 1 215 23 mul.lo.u64 %r459,%r100,%r96; .loc 1 215 9 shl.b64 %r460,%r459,4; sub.u64 %r145,%r148,%r460; .loc 1 216 5 add.u64 %r146,%r146,1; .loc 1 217 7 add.u64 %r129,%r129,8; add.u64 %r190,%r190,8; setp.eq.u64 %r461,%r146,%r520; @ %r461 bra $L1; $L44: .loc 1 225 16 ld.u64 %r462,[%r129]; add.u64 %r96,%r462,1; st.u64 [%r129],%r96; .loc 1 226 23 add.u64 %r464,%r521,%r190; ld.u64 %r97,[%r464]; .loc 1 226 13 shl.b64 %r465,%r97,2; add.u64 %r147,%r144,%r465; .loc 1 227 23 add.u64 %r466,%frame,%r190; ld.u64 %r100,[%r466]; .loc 1 227 13 shl.b64 %r467,%r100,4; add.u64 %r148,%r145,%r467; .loc 1 207 32 add.u64 %r469,%r522,%r190; ld.u64 %r103,[%r469]; .loc 1 207 13 setp.eq.u64 %r470,%r96,%r103; @ %r470 bra $L45; .loc 1 211 13 mov.u64 %r161,0; bra $L25; $L64: .loc 1 90 6 ld.u64 %r471,[%r202]; 
setp.eq.u64 %r472,%r471,0; @ ! %r472 bra $L11; bra $L46; $L1: .loc 1 231_gfortran_mminloc1_16_r4 .visible .func _gfortran_mminloc1_16_r, .param .u32 %in_ar4) {608pred %r239; .reg .u32 %r244; .reg .u64 %r246; .reg .u32 %r248; .reg .u32 %r249; .reg .u16 %r251; .reg .u16 %r252; .reg .u16 %r253; .reg .u32u64 %r268; .reg .u64 %r275; .reg .pred %r276; .reg .u32 %r277; .reg .u32 %r278; .reg .u32 %r280; .reg .u32pred %r291; .reg .pred %r293; .reg .u64predu64 %r312; .reg .u64u64 %r317; .reg .u64 %r318; .reg .u64u64pred %r356; .reg .pred %r357; .reg .u64 %r358; .reg .u32pred %r410; .reg .u64pred %r419; .reg .predpred %r427; .reg .predu64 %r454; .reg .u64 %r457; .reg .u32 %r459; .reg .u64 %r463; .reg .u64 %r465; .reg .u64pred %r541; .reg .predu64u64 %r565; .reg .u64 %r567; .reg .pred %r568; .reg .u64 %r569; .reg .pred %r570; .reg .u64 %r571; .reg .u64 %r572; .reg .u64 %r573; .reg .u64.reg .u64 %r633; .reg .u64 %r634; mov.u64 %r234,%ar0; mov.u64 %r235,%ar1; mov.u64 %r236,%ar2; mov.u64 %r237,%ar3; mov.u32 %r238,%ar4; .loc 1 261 6 setp.ne.u64 %r239,%r237,0; @ %r239 bra $L67; .loc 1 264 7r2236238; call _gfortran_minloc1_16_r4268 7 bra $L66; $L67: .loc 1 271 10 ld.u64 %r22,[%r236]; .loc 1 271 7 add.u64 %r152,%r22,-1; .loc 1 272 10 ld.s8 %r23,[%r235+28]; .loc 1 272 38 add.u32 %r244,%r23,-1; .loc 1 272 8 cvt.s64.s32 %r153,%r244; .loc 1 275 7 shr.u64 %r246,%r152,63; set.u32.gt.s64 %r248,%r152,%r153; neg.s32 %r249,%r248; cvt.u16.u64 %r252,%r246; cvt.u16.u32 %r253,%r249; or.b16 %r251,%r252,%r253; .loc 1 275 6 cvt.u32.u16 %r254,%r251; cvt.u16.u8 %r255,%r254; setp.eq.u16 %r256,%r255,0; @ %r256 bra $L69; .loc 1 277 7 cvt.u32.u32 %r259,%r23; cvt.s64.s8 %r258,%r259; st.u64 [%stack+8],%r258; st.u64 [%stack],%r22; cvta.const.u64 %r25_gfortran_runtime_error69: .loc 1 282 9 add.u64 %r621,%r152,%r152; add.u64 %r617,%r621,%r152; shl.b64 %r264,%r617,3; add.u64 %r265,%r235,%r264; ld.u64 %r268,[%r265+56]; add.u64 %r267,%r268,1; .loc 1 282 7 ld.u64 %r275,[%r265+48]; sub.u64 %r154,%r267,%r275; .loc 1 283 6 setp.le.s64 %r276,%r154,0; @ %r276 bra $L66; .loc 1 286 9 ld.u64 %r174,[%r237]; .loc 1 288 15 ld.u64 %r33,[%r237+16]; .loc 1 290 22 cvt.u32.u64 %r34,%r33; .loc 1 290 53 add.u32 %r277,%r34,-4; and.b32 %r278,%r277,-5; set.u32.eq.u32 %r280,%r278,0; neg.s32 %r281,%r280; .loc 1 290 22 add.u32 %r282,%r34,-1; set.u32.le.u32 %r284,%r282,1; neg.s32 %r285,%r284; .loc 1 290 6 cvt.u16.u32 %r287,%r281; cvt.u16.u32 %r288,%r285; or.b16 %r286,%r287,%r288; cvt.u32.u16 %r289,%r286; cvt.u16.u8 %r290,%r289; setp.ne.u16 %r291,%r290,0; @ %r291 bra $L71; .loc 1 292 7 setp.ne.u32 %r293,%r34,16; @ %r293 bra $L72; $L71: .loc 1 299 9 shl.b64 %r297,%r617,3; add.u64 %r298,%r235,%r297; ld.u64 %r157,[%r298+40]; .loc 1 300 12 add.u64 %r304,%r237,%r297; ld.u64 %r40,[%r304+40]; .loc 1 302 3 setp.ne.u64 %r306,%r152,0; @ %r306 bra $L73; $L79: .loc 1 312 3 setp.lt.s64 %r307,%r152,%r153; @ %r307 bra $L74; bra $L141; $L72: .loc 1 297 5 cvta.const.u64 %r30830_gfortran_runtime_error73: add.u64 %r91,%r235,40; add.u64 %r196,%r237,40; add.u64 %r311,%r22,%r22; add.u64 %r312,%r311,%r22; shl.b64 %r313,%r312,3; add.u64 %r314,%r235,16; add.u64 %r226,%r313,%r314; .loc 1 302 3 mov.u64 %r107,0; add.u64 %r619,%frame,240; add.u64 %r623,%frame,360; .loc 1 309 12 mov.u64 %r634,%r107; $L78: .loc 1 304 18 add.u64 %r316,%r619,%r107; ld.u64 %r317,[%r91]; st.u64 [%r316],%r317; .loc 1 305 18 add.u64 %r318,%frame,%r107; .loc 1 305 20 ld.u64 %r320,[%r196]; mul.lo.u64 %r319,%r320,%r33; .loc 1 305 18 st.u64 [%r318],%r319; .loc 1 306 19 ld.u64 %r322,[%r91+16]; add.u64 %r321,%r322,1; ld.u64 
%r323,[%r91+8]; sub.u64 %r52,%r321,%r323; .loc 1 308 10 setp.lt.s64 %r324,%r52,0; @ %r324 bra $L76; .loc 1 306 17 add.u64 %r326,%r623,%r107; st.u64 [%r326],%r52; bra $L77; $L76: .loc 1 309 12 add.u64 %r328,%r623,%r107; st.u64 [%r328],%r634; $L77: .loc 1 302 3 add.u64 %r91,%r91,24; add.u64 %r196,%r196,24; add.u64 %r107,%r107,8; setp.ne.u64 %r330,%r91,%r226; @ %r330 bra $L78; bra $L79; $L141: .loc 1 322 6 ld.u64 %r331,[%r234]; setp.eq.u64 %r332,%r331,0; @ ! %r332 bra $L81; bra $L80; $L74: add.u64 %r334,%r22,%r22; add.u64 %r335,%r334,%r22; shl.b64 %r336,%r335,3; add.u64 %r180,%r336,40; add.u64 %r57,%r235,%r180; add.u64 %r182,%r237,%r180; shl.b64 %r337,%r22,3; add.u64 %r74,%r337,-8; cvt.u32.u32 %r339,%r23; cvt.s64.s8 %r338,%r339; shl.b64 %r340,%r338,3; add.u64 %r88,%r340,-8; add.u64 %r619,%frame,240; add.u64 %r623,%frame,360; .loc 1 319 12 mov.u64 %r633,0; $L84: .loc 1 314 18 add.u64 %r342,%r619,%r74; ld.u64 %r343,[%r57]; st.u64 [%r342],%r343; .loc 1 315 18 add.u64 %r344,%frame,%r74; .loc 1 315 20 ld.u64 %r346,[%r182]; mul.lo.u64 %r345,%r346,%r33; .loc 1 315 18 st.u64 [%r344],%r345; .loc 1 316 19 ld.u64 %r348,[%r57+16]; add.u64 %r347,%r348,1; ld.u64 %r349,[%r57+8]; sub.u64 %r64,%r347,%r349; .loc 1 318 10 setp.lt.s64 %r350,%r64,0; @ %r350 bra $L82; .loc 1 316 17 add.u64 %r352,%r623,%r74; st.u64 [%r352],%r64; bra $L83; $L82: .loc 1 319 12 add.u64 %r354,%r623,%r74; st.u64 [%r354],%r633; $L83: .loc 1 312 3 add.u64 %r57,%r57,24; add.u64 %r182,%r182,24; add.u64 %r74,%r74,8; setp.ne.u64 %r356,%r74,%r88; @ %r356 bra $L84; bra $L142; $L80: .loc 1 326 7 setp.eq.u64 %r357,%r153,0; @ %r357 bra $L86; add.u64 %r623,%frame,360; $L116: add.u64 %r133,%r234,40; mov.u64 %r131,%r623; cvt.u32.u32 %r359,%r23; cvt.s64.s8 %r358,%r359; add.u64 %r361,%r358,%r358; add.u64 %r362,%r361,%r358; shl.b64 %r363,%r362,3; add.u64 %r364,%r234,16; add.u64 %r60,%r363,%r364; .loc 1 329 10 mov.u64 %r136,1; .loc 1 333 4 mov.u64 %r383,0; bra $L87; $L86: .loc 1 337 20 add.u32 %r365,%r23,-2; cvt.s64.s32 %r72,%r365; add.u64 %r367,%r72,%r72; add.u64 %r368,%r367,%r72; shl.b64 %r369,%r368,3; add.u64 %r370,%r234,%r369; .loc 1 337 67 shl.b64 %r372,%r72,3; add.u64 %r373,%frame,%r372; .loc 1 337 59 ld.u64 %r375,[%r370+40]; ld.u64 %r376,[%r373+360]; mul.lo.u64 %r161,%r375,%r376; .loc 1 339 24 mov.u64 %r377,0; st.u64 [%r234+8],%r377; .loc 1 340 28 cvt.u16.u32 %r380,%r23; add.u16 %r379,%r380,-1; cvt.u32.u16 %r381,%r379; st.u8 [%r234+28],%r381; .loc 1 342 10 setp.eq.u64 %r382,%r161,0; @ %r382 bra $L88; bra $L143; $L90: .loc 1 331 47 mul.lo.u64 %r136,%r69,%r136; $L87: .loc 1 333 4 st.u64 [%r133+8],%r383; ld.u64 %r69,[%r131]; add.u64 %r384,%r69,-1; st.u64 [%r133+16],%r384; st.u64 [%r133],%r136; .loc 1 326 7 add.u64 %r133,%r133,24; add.u64 %r131,%r131,8; setp.ne.u64 %r385,%r60,%r133; @ %r385 bra $L90; bra $L86; $L88: .loc 1 345 4 st.u64 [%r234+48],%r161; mov.u64 %r387,-1; st.u64 [%r234+56],%r387; mov.u64 %r388,1; st.u64 [%r234+40],%r388; .loc 1 346 4 bra $L66; $L143: .loc 1 349 24 mov.u64 %r390,16390; call (%value_in),_gfortrani_xmallocarray91,[%value_in]; } .loc 1 349 22 st.u64 [%r234],%r391; bra $L91; $L81: .loc 1 354 19 ld.s8 %r393,[%r234+28]; .loc 1 354 10 setp.eq.u64 %r394,%r393,%r153; @ %r394 bra $L92; .loc 1 355 2 cvta.const.u64 %r3955_gfortran_runtime_error92: .loc 1 357 11 cvta.global.u64 %r397,_gfortrani_compile_options; .loc 1 357 10 ld.u32 %r398,[%r397+36]; setp.eq.u32 %r399,%r398,0; @ %r399 bra $L91; .loc 1 359 4 add.u64 %r623,%frame,360; cvta.const.u64 %r403,$LC2;r62302403; call _gfortrani_bounds_ifunction_return361 4 cvta.const.u64 
%r407,$LC207403; call _gfortrani_bounds_equal_extents$L91: .loc 1 366 3 setp.ne.u64 %r409,%r153,0; @ %r409 bra $L93; $L96: .loc 1 374 8 ld.u64 %r175,[%r234]; .loc 1 375 8 ld.u64 %r173,[%r235]; .loc 1 377 9 setp.ne.u64 %r410,%r173,0; @ %r410 bra $L94; bra $L66; $L93: add.u64 %r149,%frame,480; add.u64 %r148,%r234,40; add.u64 %r144,%frame,120; cvt.u32.u32 %r412,%r23; cvt.s64.s8 %r411,%r412; add.u64 %r134,%r411,-1; .loc 1 366 3 mov.u64 %r176,0; add.u64 %r623,%frame,360; .loc 1 368 16 mov.u64 %r413,%r176; $L95: st.u64 [%r149],%r413; .loc 1 369 18 ld.u64 %r414,[%r148]; st.u64 [%r144],%r414; .loc 1 370 17 shl.b64 %r416,%r176,3; add.u64 %r417,%r623,%r416; .loc 1 370 10 ld.u64 %r418,[%r417]; setp.le.s64 %r419,%r418,0; @ %r419 bra $L66; .loc 1 366 26 add.u64 %r176,%r176,1; .loc 1 366 3 add.u64 %r149,%r149,8; add.u64 %r148,%r148,24; add.u64 %r144,%r144,8; setp.ne.u64 %r420,%r134,%r176; @ %r420 bra $L95; bra $L96; $L94: .loc 1 300 12 mul.lo.u64 %r43,%r40,%r33; .loc 1 396 32 shl.b64 %r87,%r157,2; ld.u64 %r38,[%frame+480]; .loc 1 440 22 ld.u64 %r100,[%frame+240]; .loc 1 440 12 shl.b64 %r101,%r100,2; .loc 1 441 23 ld.u64 %r103,[%frame]; .loc 1 442 22 ld.u64 %r105,[%frame+120]; .loc 1 442 12 shl.b64 %r106,%r105,4; .loc 1 444 32 ld.u64 %r166,[%frame+360]; .loc 1 451 23 mul.lo.u64 %r37,%r166,%r100; .loc 1 452 24 mul.lo.u64 %r421,%r166,%r103; .loc 1 452 10 neg.s64 %r85,%r421; .loc 1 453 23 mul.lo.u64 %r422,%r166,%r105; .loc 1 453 9 shl.b64 %r423,%r422,4; neg.s64 %r93,%r423; setp.le.s64 %r618,%r153,1; .loc 1 366 3 mov.u64 %r626,1; .loc 1 418 9 setp.ne.u32 %r627,%r238,0; add.u64 %r628,%r154,-1; cvt.u32.u32 %r629,%r23; cvt.s64.s8 %r630,%r629; add.u64 %r631,%r630,-1; add.u64 %r632,%frame,240; $L117: .loc 1 366 3 mov.u64 %r169,%r174; mov.u64 %r168,%r173; mov.u64 %r573,%r626; mov.u64 %r574,0; .loc 1 393 17 mov.u64 %r571,%r574; mov.u64 %r572,%r574; .loc 1 396 9 mov.u64 %r200,%r574; $L102: .loc 1 399 7 ld.s8 %r225,[%r169]; mov.u64 %r231,%r200; add.u64 %r200,%r200,1; .loc 1 399 6 setp.eq.u32 %r620,%r225,0; @ %r620 bra $L97; .loc 1 402 10 or.b64 %r425,%r571,%r572; setp.ne.u64 %r427,%r425,0; .loc 1 403 17 selp.u64 %r571,%r571,%r573,%r427; selp.u64 %r572,%r572,%r574,%r427; .loc 1 404 11 ld.f32 %r145,[%r168]; .loc 1 404 10 setp.le.f32 %r428,%r145,0f7f800000; @ ! %r428 bra $L97; .loc 1 408 11 shr.s64 %r429,%r200,63; .loc 1 414 9 setp.le.s64 %r430,%r154,%r231; @ ! 
%r430 bra $L144; bra $L100; $L97: .loc 1 396 32 add.u64 %r168,%r168,%r87; .loc 1 396 47 add.u64 %r169,%r169,%r43; .loc 1 396 2 add.u64 %r433,%r573,%r626; set.u32.lt.u64 %r438,%r433,%r573; cvt.s64.s32 %r436,%r438; mov.u64 %r573,%r433; sub.u64 %r574,%r574,%r436; setp.ne.u64 %r443,%r154,%r200; @ %r443 bra $L102; bra $L100; $L144: .loc 1 418 9 @ %r627 bra $L103; add.u64 %r579,%r231,1; shr.s64 %r580,%r579,63; sub.u64 %r448,%r628,%r231; shr.s64 %r451,%r231,63; add.u64 %r454,%r231,2; set.u32.lt.u64 %r459,%r454,%r231; cvt.s64.s32 %r457,%r459; sub.u64 %r463,%r451,%r457; add.u64 %r465,%r448,%r454; set.u32.lt.u64 %r470,%r465,%r448; cvt.s64.s32 %r468,%r470; sub.u64 %r474,%r463,%r468; .loc 1 408 11 mov.u64 %r571,%r200; mov.u64 %r572,%r429; bra $L104; $L103: add.u64 %r583,%r231,1; shr.s64 %r584,%r583,63; sub.u64 %r478,%r628,%r231; shr.s64 %r481,%r231,63; add.u64 %r484,%r231,2; set.u32.lt.u64 %r489,%r484,%r231; cvt.s64.s32 %r487,%r489; sub.u64 %r493,%r481,%r487; add.u64 %r495,%r478,%r484; set.u32.lt.u64 %r500,%r495,%r478; cvt.s64.s32 %r498,%r500; sub.u64 %r504,%r493,%r498; mov.u64 %r571,%r200; mov.u64 %r572,%r429; $L108: .loc 1 421 8 @ %r620 bra $L105; .loc 1 421 18 ld.f32 %r90,[%r168]; .loc 1 421 15 setp.le.f32 %r506,%r90,%r145; .loc 1 424 16 selp.u64 %r571,%r583,%r571,%r506; selp.u64 %r572,%r584,%r572,%r506; selp.f32 %r145,%r90,%r145,%r506; $L105: .loc 1 419 33 add.u64 %r168,%r168,%r87; .loc 1 419 48 add.u64 %r169,%r169,%r43; .loc 1 419 8 add.u64 %r509,%r583,%r626; set.u32.lt.u64 %r514,%r509,%r583; cvt.s64.s32 %r512,%r514; mov.u64 %r583,%r509; sub.u64 %r584,%r584,%r512; setp.ne.u64 %r521,%r495,%r583; @ %r521 bra $L120; setp.ne.u64 %r524,%r504,%r584; @ ! %r524 bra $L100; $L120: .loc 1 421 9 ld.s8 %r225,[%r169]; setp.eq.u32 %r620,%r225,0; bra $L108; $L104: .loc 1 430 10 @ %r620 bra $L109; .loc 1 430 20 ld.f32 %r94,[%r168]; .loc 1 430 17 setp.lt.f32 %r526,%r94,%r145; .loc 1 433 11 selp.u64 %r571,%r579,%r571,%r526; selp.u64 %r572,%r580,%r572,%r526; selp.f32 %r145,%r94,%r145,%r526; $L109: .loc 1 428 35 add.u64 %r168,%r168,%r87; .loc 1 428 50 add.u64 %r169,%r169,%r43; .loc 1 428 10 add.u64 %r529,%r579,%r626; set.u32.lt.u64 %r534,%r529,%r579; cvt.s64.s32 %r532,%r534; mov.u64 %r579,%r529; sub.u64 %r580,%r580,%r532; setp.ne.u64 %r541,%r465,%r579; @ %r541 bra $L122; setp.ne.u64 %r544,%r474,%r580; @ ! 
%r544 bra $L100; $L122: .loc 1 430 11 ld.s8 %r225,[%r169]; setp.eq.u32 %r620,%r225,0; bra $L104; $L100: .loc 1 436 8 st.u64 [%r175],%r571; st.u64 [%r175+8],%r572; .loc 1 439 15 add.u64 %r38,%r38,1; .loc 1 440 12 add.u64 %r173,%r173,%r101; .loc 1 441 13 add.u64 %r174,%r174,%r103; .loc 1 442 12 add.u64 %r175,%r175,%r106; .loc 1 444 13 setp.ne.u64 %r547,%r38,%r166; @ %r547 bra $L117; .loc 1 452 10 add.u64 %r170,%r174,%r85; .loc 1 453 9 add.u64 %r171,%r175,%r93; .loc 1 455 7 @ %r618 bra $L66; add.u64 %r151,%frame,488; .loc 1 451 23 mov.u64 %r110,%r37; .loc 1 455 7 mov.u64 %r219,8; .loc 1 454 5 mov.u64 %r172,1; add.u64 %r623,%frame,360; add.u64 %r622,%frame,120; .loc 1 448 13 mov.u64 %r624,0; bra $L114; $L115: st.u64 [%r151],%r624; .loc 1 451 23 mul.lo.u64 %r110,%r123,%r122; .loc 1 452 24 mul.lo.u64 %r553,%r125,%r122; .loc 1 452 10 sub.u64 %r170,%r174,%r553; .loc 1 453 23 mul.lo.u64 %r554,%r127,%r122; .loc 1 453 9 shl.b64 %r555,%r554,4; sub.u64 %r171,%r175,%r555; .loc 1 454 5 add.u64 %r172,%r172,1; .loc 1 455 7 add.u64 %r151,%r151,8; add.u64 %r219,%r219,8; setp.eq.u64 %r556,%r172,%r631; @ %r556 bra $L66; $L114: .loc 1 463 16 ld.u64 %r557,[%r151]; add.u64 %r122,%r557,1; st.u64 [%r151],%r122; .loc 1 464 23 add.u64 %r559,%r632,%r219; ld.u64 %r123,[%r559]; .loc 1 464 13 sub.u64 %r560,%r123,%r110; shl.b64 %r561,%r560,2; add.u64 %r173,%r173,%r561; .loc 1 465 24 add.u64 %r562,%frame,%r219; ld.u64 %r125,[%r562]; .loc 1 465 14 add.u64 %r174,%r170,%r125; .loc 1 466 23 add.u64 %r564,%r622,%r219; ld.u64 %r127,[%r564]; .loc 1 466 13 shl.b64 %r565,%r127,4; add.u64 %r175,%r171,%r565; .loc 1 444 32 add.u64 %r567,%r623,%r219; ld.u64 %r130,[%r567]; .loc 1 444 13 setp.eq.u64 %r568,%r122,%r130; @ %r568 bra $L115; .loc 1 448 13 mov.u64 %r38,0; bra $L117; $L142: .loc 1 322 6 ld.u64 %r569,[%r234]; setp.eq.u64 %r570,%r569,0; @ ! 
%r570 bra $L81; bra $L116; $L66: .loc 1 4_gfortran_sminloc1_16_r4 .visible .func _gfortran_sminloc1_16_r, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L146; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L147; $L146:r152154; call _gfortran_minloc1_16_r4500 7 bra $L145; $L147: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r297,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L149; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L150; $L156: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L151; bra $L204; $L149: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error150: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r303,0; $L155: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L153; .loc 1 515 17 st.u64 [%r139],%r36; bra $L154; $L153: .loc 1 518 12 st.u64 [%r139],%r303; $L154: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L155; bra $L156; $L204: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L158; bra $L157; $L151: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r302,0; $L161: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L159; .loc 1 523 17 st.u64 [%r43],%r41; bra $L160; $L159: .loc 1 527 12 st.u64 [%r43],%r302; $L160: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L161; bra $L205; $L157: .loc 1 534 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L163; add.u64 %r296,%frame,120; $L183: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L164; $L163: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L165; bra $L206; $L167: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L164: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L167; bra $L163; $L165: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L145; $L206: .loc 1 557 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L171: .loc 1 583 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L168; bra $L169; $L158: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L170; .loc 1 562 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error170: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L171; .loc 1 569 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L169; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L173: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L172; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error172: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L173; $L168: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L174; shl.b64 %r266,%r297,3; bra $L175; $L174: mov.u64 %r266,8; $L175: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L176: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L176; $L169: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,4; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 593 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L182: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L182; .loc 1 606 7 @ ! %r295 bra $L207; bra $L145; $L181: .loc 1 601 13 st.u64 [%r99],%r280; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L180; bra $L145; $L207: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; $L180: .loc 1 610 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L181; .loc 1 601 13 mov.u64 %r80,0; bra $L182; $L205: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r294,%r98,0; @ ! 
%r294 bra $L158; bra $L183; $L145: .loc 1 615minloc1_4_r8.o/_gfortran_minloc1_4_r8 .visible .func _gfortran_minloc1_4_r8.file 1 "../../../../libgfortran/generated/minloc1_4_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_4_r8 .visible .func _gfortran_mminloc1_4_r8_gfortran_sminloc1_4_r8 .visible .func _gfortran_sminloc1_4_r8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_4_r8 .visible .func _gfortran_minloc1_4_r8pred %r229; .reg .predpred %r240; .reg .predu16 %r278; .reg .u32 %r279; .reg .u32 %r280; .reg .u64 %r282; .reg .u64predu64 %r318; .reg .u64predpred %r337; .reg .pred %r338; .reg .pred %r339; .reg .predu64 %r357; .reg .u64 %r358; .reg .u64 %r360; .reg .pred %r361; .reg .u64 %r362; .reg .predpred %r367; .reg .u64.loc 1 58 10 ld.s8 %r22,[%r191+28]; .loc 1 58 38 add.u32 %r194,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r122,%r194; .loc 1 59 10 ld.u64 %r25,[%r192]; .loc 1 59 7 add.u64 %r123,%r25,-1; .loc 1 61 7 shr.u64 %r196,%r123,63; set.u32.lt.s64 %r198,%r122,%r123; neg.s32 %r199,%r198; cvt.u16.u64 %r202,%r196;61%r206 bra $L2; .loc 1 63 7 cvt.u32.u32 %r209,%r22; cvt.s64.s8 %r208,%r209; st.u64 [%stack+8],%r208007_gfortran_runtime_errorr212,%r123,%r123; add.u64 %r213,%r212,%r123; shl.b64 %r214,%r213,3; add.u64 %r215,%r191,%r214; ld.u64 %r30,[%r215+56]; ld.u64 %r32,[%r215+48]; .loc 1 71 9 ld.u64 %r126,[%r215+40]; .loc 1 73 3 setp.ne.u64 %r229,%r123,0; @ %r229 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r230,%r122,%r123; @ %r230 bra $L4; bra $L61; $L3: add.u64 %r45,%r191,40; add.u64 %r69,%frame,120; add.u64 %r44,%frame,240; add.u64 %r232,%r25,%r25; add.u64 %r233,%r232,%r25; shl.b64 %r234,%r233,3; add.u64 %r235,%r191,16; add.u64 %r114,%r234,%r235; .loc 1 79 12 mov.u64 %r377,0; $L8: .loc 1 75 18 ld.u64 %r236,[%r45]; st.u64 [%r69],%r236; .loc 1 76 19 ld.u64 %r238,[%r45+16]; add.u64 %r237,%r238,1; ld.u64 %r239,[%r45+8]; sub.u64 %r37,%r237,%r239; .loc 1 78 10 setp.lt.s64 %r240,%r37,0; @ %r240 bra $L6; .loc 1 76 17 st.u64 [%r44],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 
[%r44],%r377; $L7: .loc 1 73 3 add.u64 %r45,%r45,24; add.u64 %r69,%r69,8; add.u64 %r44,%r44,8; setp.ne.u64 %r242,%r45,%r114; @ %r242 bra $L8; bra $L9; $L61: .loc 1 90 6 ld.u64 %r243,[%r190]; setp.eq.u64 %r244,%r243,0; @ ! %r244 bra $L11; bra $L10; $L4: add.u64 %r246,%r25,%r25; add.u64 %r247,%r246,%r25; shl.b64 %r248,%r247,3; add.u64 %r249,%r248,40; add.u64 %r106,%r191,%r249; shl.b64 %r250,%r25,3; add.u64 %r81,%r250,-8; add.u64 %r365,%frame,120; add.u64 %r101,%r365,%r81; add.u64 %r364,%frame,240; add.u64 %r58,%r364,%r81; add.u64 %r253,%r191,40; cvt.u32.u32 %r255,%r22; cvt.s64.s8 %r254,%r255; add.u64 %r257,%r254,%r254; add.u64 %r258,%r257,%r254; shl.b64 %r259,%r258,3; add.u64 %r185,%r253,%r259; .loc 1 87 12 mov.u64 %r376,0; $L14: .loc 1 83 18 ld.u64 %r260,[%r106]; st.u64 [%r101],%r260; .loc 1 84 19 ld.u64 %r262,[%r106+16]; add.u64 %r261,%r262,1; ld.u64 %r263,[%r106+8]; sub.u64 %r42,%r261,%r263; .loc 1 86 10 setp.lt.s64 %r264,%r42,0; @ %r264 bra $L12; .loc 1 84 17 st.u64 [%r58],%r42; bra $L13; $L12: .loc 1 87 12 st.u64 [%r58],%r376; $L13: .loc 1 81 3 add.u64 %r106,%r106,24; add.u64 %r101,%r101,8; add.u64 %r58,%r58,8; setp.ne.u64 %r266,%r106,%r185; @ %r266 bra $L14; bra $L62; $L10: .loc 1 94 7 setp.le.s64 %r267,%r122,0; @ %r267 bra $L16; add.u64 %r364,%frame,240; $L45: add.u64 %r144,%r190,40; mov.u64 %r119,%r364; cvt.u32.u32 %r269,%r22; cvt.s64.s8 %r268,%r269; add.u64 %r271,%r268,%r268; add.u64 %r272,%r271,%r268; shl.b64 %r273,%r272,3; add.u64 %r274,%r190,16; add.u64 %r109,%r273,%r274; .loc 1 97 10 mov.u64 %r108,1; .loc 1 101 4 mov.u64 %r297,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r275,0; st.u64 [%r190+8],%r275; .loc 1 106 28 cvt.u16.u32 %r278,%r22; add.u16 %r277,%r278,-1; cvt.u32.u16 %r279,%r277; st.u8 [%r190+28],%r279; .loc 1 108 20 add.u32 %r280,%r22,-2; cvt.s64.s32 %r51,%r280; add.u64 %r282,%r51,%r51; add.u64 %r283,%r282,%r51; shl.b64 %r284,%r283,3; add.u64 %r285,%r190,%r284; .loc 1 108 67 shl.b64 %r287,%r51,3; add.u64 %r288,%frame,%r287; .loc 1 108 59 ld.u64 %r290,[%r285+40]; ld.u64 %r291,[%r288+240]; mul.lo.u64 %r128,%r290,%r291; .loc 1 110 29293; call (%value_in),_gfortrani_xmallocarray294,[%value_in]; } .loc 1 110 27 st.u64 [%r190],%r294; .loc 1 111 10 setp.eq.u64 %r296,%r128,0; @ ! 
%r296 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r108,%r47,%r108; $L17: .loc 1 101 4 st.u64 [%r144+8],%r297; ld.u64 %r47,[%r119]; add.u64 %r298,%r47,-1; st.u64 [%r144+16],%r298; st.u64 [%r144],%r108; .loc 1 94 7 add.u64 %r144,%r144,24; add.u64 %r119,%r119,8; setp.ne.u64 %r299,%r109,%r144; @ %r299 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r190+48],%r128; mov.u64 %r301,-1; st.u64 [%r190+56],%r301; mov.u64 %r302,1; st.u64 [%r190+40],%r302; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r57,[%r190+28]; .loc 1 121 10 setp.eq.u64 %r303,%r57,%r122; @ %r303 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r122; st.u64 [%stack],%r57;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r306,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r307,[%r306+36]; setp.eq.u32 %r308,%r307,0; @ %r308 bra $L21; .loc 1 128 2 add.u64 %r364,%frame,240; cvta.const.u64 %r312,$LC2; cvta.const.u64 %r311,$LC3;_gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r314,%r122,0; @ %r314 bra $L24; $L27: .loc 1 68 9 add.u64 %r315,%r30,1; .loc 1 68 7 sub.u64 %r124,%r315,%r32; max.s64 %r142,%r124,0; .loc 1 140 8 ld.u64 %r137,[%r191]; .loc 1 141 8 ld.u64 %r138,[%r190]; .loc 1 168 40 shl.b64 %r65,%r126,3; ld.u64 %r145,[%frame+360]; .loc 1 204 22 ld.u64 %r75,[%frame+120]; .loc 1 204 12 shl.b64 %r76,%r75,3; .loc 1 205 22 ld.u64 %r78,[%frame]; .loc 1 205 12 shl.b64 %r79,%r78,2; .loc 1 207 32 ld.u64 %r117,[%frame+240]; .loc 1 214 23 mul.lo.u64 %r316,%r75,%r117; .loc 1 214 9 shl.b64 %r317,%r316,3; neg.s64 %r160,%r317; .loc 1 215 23 mul.lo.u64 %r318,%r78,%r117; .loc 1 215 9 shl.b64 %r319,%r318,2; neg.s64 %r68,%r319; setp.gt.s64 %r366,%r124,0; setp.le.s64 %r367,%r122,1; .loc 1 157 9 mov.u32 %r369,1; .loc 1 180 9 setp.ne.u32 %r370,%r193,0; cvt.u32.u32 %r371,%r22; cvt.s64.s8 %r372,%r371; add.u64 %r373,%r372,-1; add.u64 %r374,%frame,120; add.u64 %r375,%frame,240; bra $L25; $L24: add.u64 %r166,%frame,360; add.u64 %r165,%r190,40; mov.u64 %r161,%frame; cvt.u32.u32 %r321,%r22; cvt.s64.s8 %r320,%r321; add.u64 %r146,%r320,-1; .loc 1 132 3 mov.u64 %r139,0; add.u64 %r364,%frame,240; .loc 1 134 16 mov.u64 %r322,%r139; $L26: st.u64 [%r166],%r322; .loc 1 135 18 ld.u64 %r323,[%r165]; st.u64 [%r161],%r323; .loc 1 136 17 shl.b64 %r325,%r139,3; add.u64 %r326,%r364,%r325; .loc 1 136 10 ld.u64 %r327,[%r326]; setp.le.s64 %r328,%r327,0; @ %r328 bra $L1; .loc 1 132 26 add.u64 %r139,%r139,1; .loc 1 132 3 add.u64 %r166,%r166,8; add.u64 %r165,%r165,24; add.u64 %r161,%r161,8; setp.ne.u64 %r329,%r139,%r146; @ %r329 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r366 bra $L46; .loc 1 159 10 mov.u32 %r331,0; st.u32 [%r138],%r331; bra $L29; $L46: mov.u64 %r130,%r137; .loc 1 168 17 mov.u64 %r129,0; $L28: .loc 1 170 7 ld.f64 %r107,[%r130]; .loc 1 170 6 setp.le.f64 %r332,%r107,0d7ff0000000000000; @ ! %r332 bra $L63; .loc 1 173 14 cvt.u32.u64 %r333,%r129; add.u32 %r112,%r333,1; .loc 1 180 9 @ %r370 bra $L32; .loc 1 190 8 setp.gt.s64 %r335,%r142,%r129; @ %r335 bra $L33; bra $L34; $L32: .loc 1 181 8 setp.le.s64 %r336,%r142,%r129; @ %r336 bra $L34; add.u64 %r173,%r129,1; .loc 1 170 7 mov.f64 %r186,%r107; bra $L35; $L63: .loc 1 168 32 add.u64 %r129,%r129,1; .loc 1 168 40 add.u64 %r130,%r130,%r65; .loc 1 168 10 setp.lt.s64 %r337,%r129,%r142; @ %r337 bra $L28; .loc 1 157 9 mov.u32 %r112,%r369; bra $L34; $L35: .loc 1 183 8 setp.ge.f64 %r338,%r107,%r186; @ ! 
%r338 bra $L36; .loc 1 186 16 cvt.u32.u64 %r112,%r173; .loc 1 185 16 mov.f64 %r107,%r186; $L36: .loc 1 181 33 add.u64 %r130,%r130,%r65; .loc 1 181 8 add.u64 %r174,%r173,1; setp.eq.u64 %r339,%r142,%r173; @ %r339 bra $L34; .loc 1 183 9 ld.f64 %r186,[%r130]; mov.u64 %r173,%r174; bra $L35; $L33: add.u64 %r169,%r129,1; bra $L40; $L47: mov.u64 %r169,%r170; $L40: .loc 1 192 9 ld.f64 %r70,[%r130]; .loc 1 192 8 setp.lt.f64 %r340,%r70,%r107; @ ! %r340 bra $L38; .loc 1 195 16 cvt.u32.u64 %r112,%r169; .loc 1 194 16 mov.f64 %r107,%r70; $L38: .loc 1 190 33 add.u64 %r130,%r130,%r65; .loc 1 190 8 add.u64 %r170,%r169,1; setp.ne.u64 %r341,%r142,%r169; @ %r341 bra $L47; $L34: .loc 1 199 12 st.u32 [%r138],%r112; $L29: .loc 1 203 15 add.u64 %r145,%r145,1; .loc 1 204 12 add.u64 %r137,%r137,%r76; .loc 1 205 12 add.u64 %r138,%r138,%r79; .loc 1 207 13 setp.ne.u64 %r342,%r145,%r117; @ %r342 bra $L25; .loc 1 214 9 add.u64 %r134,%r137,%r160; .loc 1 215 9 add.u64 %r135,%r138,%r68; .loc 1 217 7 @ %r367 bra $L1; add.u64 %r121,%frame,368; mov.u64 %r182,8; .loc 1 216 5 mov.u64 %r136,1; .loc 1 211 13 mov.u64 %r368,0; bra $L43; $L44: st.u64 [%r121],%r368; .loc 1 214 23 mul.lo.u64 %r348,%r94,%r93; .loc 1 214 9 shl.b64 %r349,%r348,3; sub.u64 %r134,%r137,%r349; .loc 1 215 23 mul.lo.u64 %r350,%r97,%r93; .loc 1 215 9 shl.b64 %r351,%r350,2; sub.u64 %r135,%r138,%r351; .loc 1 216 5 add.u64 %r136,%r13182,%r182,8; setp.eq.u64 %r352,%r136,%r373; @ %r352 bra $L1; $L43: .loc 1 225 16 ld.u64 %r353,[%r121]; add.u64 %r93,%r353,1; st.u64 [%r121],%r93; .loc 1 226 23 add.u64 %r355,%r374,%r182; ld.u64 %r94,[%r355]; .loc 1 226 13 shl.b64 %r356,%r94,3; add.u64 %r137,%r134,%r356; .loc 1 227 23 add.u64 %r357,%frame,%r182; ld.u64 %r97,[%r357]; .loc 1 227 13 shl.b64 %r358,%r97,2; add.u64 %r138,%r135,%r358; .loc 1 207 32 add.u64 %r360,%r375,%r182; ld.u64 %r100,[%r360]; .loc 1 207 13 setp.eq.u64 %r361,%r93,%r100; @ %r361 bra $L44; .loc 1 211 13 mov.u64 %r145,0; bra $L25; $L62: .loc 1 90 6 ld.u64 %r362,[%r190]; setp.eq.u64 %r363,%r362,0; @ ! 
%r363 bra $L11; bra $L45; $L1: .loc 1 231_gfortran_mminloc1_4_r8 .visible .func _gfortran_mminloc1_4_r8, .param .u32 %in_ar4) {608u32 %r267; .reg .u32 %r269; .reg .u32 %r270; .reg .u32 %r271; .reg .u32 %r273; .reg .u32predu64 %r293; .reg .predu64 %r302; .reg .u64 %r303; .reg .u64 %r305; .reg .u64 %r306; .reg .u64u64 %r327; .reg .u32u64 %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64pred %r339; .reg .u64 %r341; .reg .u64 %r343; .reg .pred %r345; .reg .predpred %r383; .reg .u64 %r384; .reg .u64 %r386; .reg .u32 %r387; .reg .pred %r388; .reg .u64 %r391; .reg .u64pred %r409; .reg .u64 %r410; .reg .u64 %r411; .reg .u64 %r412; .reg .pred %r414; .reg .u32pred %r418; .reg .pred %r419; .reg .pred %r422; .reg .pred %r423; .reg .pred %r425; .reg .pred %r426; .reg .predu64predmov.u64 %r223,%ar0; mov.u64 %r224,%ar1; mov.u64 %r225,%ar2; mov.u64 %r226,%ar3; mov.u32 %r227,%ar4; .loc 1 261 6 setp.ne.u64 %r228,%r226,0; @ %r228 bra $L65227; call _gfortran_minloc1_4_r8268 7 bra $L64; $L65: .loc 1 271 10 ld.u64 %r22,[%r225]; .loc 1 271 7 add.u64 %r159,%r22,-1; .loc 1 272 10 ld.s8 %r23,[%r224+28]; .loc 1 272 38 add.u32 %r233,%r23,-1; .loc 1 272 8 cvt.s64.s32 %r160,%r233; .loc 1 275 7 shr.u64 %r235,%r159,63; set.u32.gt.s64 %r237,%r159,%r160; neg.s32 %r238,%r237; cvt.u16.u64 %r241,%r235; cvt.u16.u32 %r242,%r238; or.b16 %r240,%r241,%r242; .loc 1 275 6 cvt.u32.u16 %r243,%r240; cvt.u16.u8 %r244,%r243; setp.eq.u16 %r245,%r244,0; @ %r245 bra $L67; .loc 1 277 7 cvt.u32.u32 %r248,%r23; cvt.s64.s8 %r247,%r248; st.u64 [%stack+8],%r247; st.u64 [%stack],%r22; cvta.const.u64 %r246246_gfortran_runtime_error67: .loc 1 282 9 add.u64 %r456,%r159,%r159; add.u64 %r457,%r456,%r159; shl.b64 %r253,%r457,3; add.u64 %r254,%r224,%r253; ld.u64 %r257,[%r254+56]; add.u64 %r256,%r257,1; .loc 1 282 7 ld.u64 %r264,[%r254+48]; sub.u64 %r161,%r256,%r264; .loc 1 283 6 setp.le.s64 %r265,%r161,0; @ %r265 bra $L64; .loc 1 286 9 ld.u64 %r180,[%r226]; .loc 1 288 15 ld.u64 %r34,[%r226+16]; .loc 1 290 22 cvt.u32.u64 %r35,%r34; .loc 1 290 53 add.u32 %r266,%r35,-4; and.b32 %r267,%r266,-5; set.u32.eq.u32 %r269,%r267,0; neg.s32 %r270,%r269; .loc 1 290 22 add.u32 %r271,%r35,-1; set.u32.le.u32 %r273,%r271,1; neg.s32 %r274,%r273; .loc 1 290 6 cvt.u16.u32 %r276,%r270; cvt.u16.u32 %r277,%r274; or.b16 %r275,%r276,%r277; cvt.u32.u16 %r278,%r275; cvt.u16.u8 %r279,%r278; setp.ne.u16 %r280,%r279,0; @ %r280 bra $L69; .loc 1 292 7 setp.ne.u32 %r282,%r35,16; @ %r282 bra $L70; $L69: .loc 1 299 9 shl.b64 %r286,%r457,3; add.u64 %r287,%r224,%r286; ld.u64 %r164,[%r287+40]; .loc 1 300 12 add.u64 %r293,%r226,%r286; ld.u64 %r41,[%r293+40]; .loc 1 302 3 setp.ne.u64 %r295,%r159,0; @ %r295 bra $L71; $L77: .loc 1 312 3 setp.lt.s64 %r296,%r159,%r160; @ %r296 bra $L72; bra $L135; $L70: .loc 1 297 5 cvta.const.u64 %r2977_gfortran_runtime_error71: add.u64 %r91,%r224,40; add.u64 %r40,%r226,40; add.u64 %r300,%r22,%r22; add.u64 %r301,%r300,%r22; shl.b64 %r302,%r301,3; add.u64 %r303,%r224,16; add.u64 %r186,%r302,%r303; .loc 1 302 3 mov.u64 %r148,0; add.u64 %r453,%frame,240; add.u64 %r452,%frame,360; .loc 1 309 12 mov.u64 %r467,%r148; $L76: .loc 1 304 18 add.u64 %r305,%r453,%r148; ld.u64 %r306,[%r91]; st.u64 [%r305],%r306; .loc 1 305 18 add.u64 %r307,%frame,%r148; .loc 1 305 20 ld.u64 %r309,[%r40]; mul.lo.u64 %r308,%r309,%r34; .loc 1 305 18 st.u64 [%r307],%r308; .loc 1 306 19 ld.u64 %r311,[%r91+16]; add.u64 %r310,%r311,1; ld.u64 %r312,[%r91+8]; sub.u64 %r52,%r310,%r312; .loc 1 308 10 setp.lt.s64 %r313,%r52,0; @ %r313 bra $L74; .loc 1 306 17 add.u64 %r315,%r452,%r148; 
st.u64 [%r315],%r52; bra $L75; $L74: .loc 1 309 12 add.u64 %r317,%r452,%r148; st.u64 [%r317],%r467; $L75: .loc 1 302 3 add.u64 %r91,%r91,24; add.u64 %r40,%r40,24; add.u64 %r148,%r148,8; setp.ne.u64 %r319,%r186,%r91; @ %r319 bra $L76; bra $L77; $L135: .loc 1 322 6 ld.u64 %r320,[%r223]; setp.eq.u64 %r321,%r320,0; @ ! %r321 bra $L79; bra $L78; $L72: add.u64 %r323,%r22,%r22; add.u64 %r324,%r323,%r22; shl.b64 %r325,%r324,3; add.u64 %r116,%r325,40; add.u64 %r133,%r224,%r116; add.u64 %r87,%r226,%r116; shl.b64 %r326,%r22,3; add.u64 %r157,%r326,-8; cvt.u32.u32 %r328,%r23; cvt.s64.s8 %r327,%r328; shl.b64 %r329,%r327,3; add.u64 %r96,%r329,-8; add.u64 %r453,%frame,240; add.u64 %r452,%frame,360; .loc 1 319 12 mov.u64 %r466,0; $L82: .loc 1 314 18 add.u64 %r331,%r453,%r157; ld.u64 %r332,[%r133]; st.u64 [%r331],%r332; .loc 1 315 18 add.u64 %r333,%frame,%r157; .loc 1 315 20 ld.u64 %r335,[%r87]; mul.lo.u64 %r334,%r335,%r34; .loc 1 315 18 st.u64 [%r333],%r334; .loc 1 316 19 ld.u64 %r337,[%r133+16]; add.u64 %r336,%r337,1; ld.u64 %r338,[%r133+8]; sub.u64 %r61,%r336,%r338; .loc 1 318 10 setp.lt.s64 %r339,%r61,0; @ %r339 bra $L80; .loc 1 316 17 add.u64 %r341,%r452,%r157; st.u64 [%r341],%r61; bra $L81; $L80: .loc 1 319 12 add.u64 %r343,%r452,%r157; st.u64 [%r343],%r466; $L81: .loc 1 312 3 add.u64 %r133,%r133,24; add.u64 %r87,%r87,24; add.u64 %r157,%r157,8; setp.ne.u64 %r345,%r96,%r157; @ %r345 bra $L82; bra $L136; $L78: .loc 1 326 7 setp.eq.u64 %r346,%r160,0; @ %r346 bra $L84; add.u64 %r452,%frame,360; $L112: add.u64 %r146,%r223,40; mov.u64 %r143,%r452; cvt.u32.u32 %r348,%r23; cvt.s64.s8 %r347,%r348; add.u64 %r350,%r347,%r347; add.u64 %r351,%r350,%r347; shl.b64 %r352,%r351,3; add.u64 %r353,%r223,16; add.u64 %r134,%r352,%r353; .loc 1 329 10 mov.u64 %r139,1; .loc 1 333 4 mov.u64 %r372,0; bra $L85; $L84: .loc 1 337 20 add.u32 %r354,%r23,-2; cvt.s64.s32 %r71,%r354; add.u64 %r356,%r71,%r71; add.u64 %r357,%r356,%r71; shl.b64 %r358,%r357,3; add.u64 %r359,%r223,%r358; .loc 1 337 67 shl.b64 %r361,%r71,3; add.u64 %r362,%frame,%r361; .loc 1 337 59 ld.u64 %r364,[%r359+40]; ld.u64 %r365,[%r362+360]; mul.lo.u64 %r168,%r364,%r365; .loc 1 339 24 mov.u64 %r366,0; st.u64 [%r223+8],%r366; .loc 1 340 28 cvt.u16.u32 %r369,%r23; add.u16 %r368,%r369,-1; cvt.u32.u16 %r370,%r368; st.u8 [%r223+28],%r370; .loc 1 342 10 setp.eq.u64 %r371,%r168,0; @ %r371 bra $L86; bra $L137; $L88: .loc 1 331 47 mul.lo.u64 %r139,%r68,%r139; $L85: .loc 1 333 4 st.u64 [%r146+8],%r372; ld.u64 %r68,[%r143]; add.u64 %r373,%r68,-1; st.u64 [%r146+16],%r373; st.u64 [%r146],%r139; .loc 1 326 7 add.u64 %r146,%r146,24; add.u64 %r143,%r143,8; setp.ne.u64 %r374,%r134,%r146; @ %r374 bra $L88; bra $L84; $L86: .loc 1 345 4 st.u64 [%r223+48],%r168; mov.u64 %r376,-1; st.u64 [%r223+56],%r376; mov.u64 %r377,1; st.u64 [%r223+40],%r377; .loc 1 346 4 bra $L64; $L137: .loc 1 349 24call (%value_in),_gfortrani_xmallocarray80,[%value_in]; } .loc 1 349 22 st.u64 [%r223],%r380; bra $L89; $L79: .loc 1 354 19 ld.s8 %r382,[%r223+28]; .loc 1 354 10 setp.eq.u64 %r383,%r382,%r160; @ %r383 bra $L90; .loc 1 355 2 cvta.const.u64 %r3843stack; call _gfortran_runtime_error90: .loc 1 357 11 cvta.global.u64 %r386,_gfortrani_compile_options; .loc 1 357 10 ld.u32 %r387,[%r386+36]; setp.eq.u32 %r388,%r387,0; @ %r388 bra $L89; .loc 1 359 4 add.u64 %r452,%frame,360; cvta.const.u64 %r392,$LC239392; call _gfortrani_bounds_ifunction_return361 4 cvta.const.u64 %r396,$LC396392; call _gfortrani_bounds_equal_extents$L89: .loc 1 366 3 setp.ne.u64 %r398,%r160,0; @ %r398 bra $L91; $L94: .loc 1 374 8 ld.u64 
%r181,[%r223]; .loc 1 375 8 ld.u64 %r179,[%r224]; .loc 1 377 9 setp.ne.u64 %r399,%r179,0; @ %r399 bra $L92; bra $L64; $L91: add.u64 %r199,%frame,480; add.u64 %r195,%r223,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r401,%r23; cvt.s64.s8 %r400,%r401; add.u64 %r149,%r400,-1; .loc 1 366 3 mov.u64 %r182,0; add.u64 %r452,%frame,360; .loc 1 368 16 mov.u64 %r402,%r182; $L93: st.u64 [%r199],%r402; .loc 1 369 18 ld.u64 %r403,[%r195]; st.u64 [%r187],%r403; .loc 1 370 17 shl.b64 %r405,%r182,3; add.u64 %r406,%r452,%r405; .loc 1 370 10 ld.u64 %r407,[%r406]; setp.le.s64 %r408,%r407,0; @ %r408 bra $L64; .loc 1 366 26 add.u64 %r182,%r182,1; .loc 1 366 3 add.u64 %r199,%r199,8; add.u64 %r195,%r195,24; add.u64 %r187,%r187,8; setp.ne.u64 %r409,%r149,%r182; @ %r409 bra $L93; bra $L94; $L92: .loc 1 300 12 mul.lo.u64 %r43,%r41,%r34; .loc 1 396 32 shl.b64 %r86,%r164,3; ld.u64 %r39,[%frame+480]; .loc 1 440 22 ld.u64 %r100,[%frame+240]; .loc 1 440 12 shl.b64 %r101,%r100,3; .loc 1 441 23 ld.u64 %r103,[%frame]; .loc 1 442 22 ld.u64 %r105,[%frame+120]; .loc 1 442 12 shl.b64 %r106,%r105,2; .loc 1 444 32 ld.u64 %r172,[%frame+360]; .loc 1 451 23 mul.lo.u64 %r38,%r172,%r100; .loc 1 452 24 mul.lo.u64 %r410,%r172,%r103; .loc 1 452 10 neg.s64 %r94,%r410; .loc 1 453 23 mul.lo.u64 %r411,%r172,%r105; .loc 1 453 9 shl.b64 %r412,%r411,2; neg.s64 %r140,%r412; setp.le.s64 %r451,%r160,1; .loc 1 418 9 setp.ne.u32 %r459,%r227,0; cvt.u32.u32 %r460,%r23; cvt.s64.s8 %r461,%r460; add.u64 %r462,%r461,-1; add.u64 %r463,%frame,240; add.u64 %r464,%frame,360; add.u64 %r465,%frame,120; $L113: .loc 1 366 3 mov.u64 %r174,%r180; mov.u64 %r173,%r179; .loc 1 393 16 mov.u32 %r147,0; .loc 1 396 9 mov.u64 %r171,0; $L100: .loc 1 399 7 ld.s8 %r222,[%r174]; .loc 1 399 6 setp.eq.u32 %r454,%r222,0; @ %r454 bra $L95; .loc 1 402 10 setp.ne.u32 %r414,%r147,0; @ %r414 bra $L96; cvt.u32.u64 %r415,%r171; add.u32 %r147,%r415,1; $L96: .loc 1 404 11 ld.f64 %r145,[%r173]; .loc 1 404 10 setp.le.f64 %r416,%r145,0d7ff0000000000000; @ ! %r416 bra $L95; .loc 1 408 11 cvt.u32.u64 %r417,%r171; add.u32 %r170,%r417,1; .loc 1 414 9 setp.le.s64 %r418,%r161,%r171; @ ! %r418 bra $L138; bra $L98; $L95: .loc 1 396 24 add.u64 %r171,%r171,1; .loc 1 396 32 add.u64 %r173,%r173,%r86; .loc 1 396 47 add.u64 %r174,%r174,%r43; .loc 1 396 2 setp.ne.u64 %r419,%r161,%r171; @ %r419 bra $L100; bra $L98; $L138: .loc 1 418 9 @ %r459 bra $L101; add.u64 %r205,%r171,1; .loc 1 408 11 mov.u32 %r147,%r170; bra $L102; $L101: add.u64 %r209,%r171,1; mov.u32 %r147,%r170; $L105: .loc 1 421 8 @ %r454 bra $L103; .loc 1 421 18 ld.f64 %r89,[%r173]; .loc 1 421 15 setp.le.f64 %r422,%r89,%r145; @ ! %r422 bra $L103; .loc 1 424 16 cvt.u32.u64 %r147,%r209; .loc 1 423 16 mov.f64 %r145,%r89; $L103: .loc 1 419 33 add.u64 %r173,%r173,%r86; .loc 1 419 48 add.u64 %r174,%r174,%r43; .loc 1 419 8 add.u64 %r210,%r209,1; setp.eq.u64 %r423,%r161,%r209; @ %r423 bra $L98; .loc 1 421 9 ld.s8 %r222,[%r174]; mov.u64 %r209,%r210; setp.eq.u32 %r454,%r222,0; bra $L105; $L102: .loc 1 430 10 @ %r454 bra $L106; .loc 1 430 20 ld.f64 %r93,[%r173]; .loc 1 430 17 setp.lt.f64 %r425,%r93,%r145; @ ! 
%r425 bra $L106; .loc 1 433 11 cvt.u32.u64 %r147,%r205; .loc 1 432 18 mov.f64 %r145,%r93; $L106: .loc 1 428 35 add.u64 %r173,%r173,%r86; .loc 1 428 50 add.u64 %r174,%r174,%r43; .loc 1 428 10 add.u64 %r206,%r205,1; setp.eq.u64 %r426,%r161,%r205; @ %r426 bra $L98; .loc 1 430 11 ld.s8 %r222,[%r174]; mov.u64 %r205,%r206; setp.eq.u32 %r454,%r222,0; bra $L102; $L98: .loc 1 436 8 st.u32 [%r181],%r147; .loc 1 439 15 add.u64 %r39,%r39,1; .loc 1 440 12 add.u64 %r179,%r179,%r101; .loc 1 441 13 add.u64 %r180,%r180,%r103; .loc 1 442 12 add.u64 %r181,%r181,%r106; .loc 1 444 13 setp.ne.u64 %r427,%r39,%r172; @ %r427 bra $L113; .loc 1 452 10 add.u64 %r176,%r180,%r94; .loc 1 453 9 add.u64 %r177,%r181,%r140; .loc 1 455 7 @ %r451 bra $L64; add.u64 %r158,%frame,488; .loc 1 451 23 mov.u64 %r110,%r38; .loc 1 455 7 mov.u64 %r219,8; .loc 1 454 5 mov.u64 %r178,1; .loc 1 448 13 mov.u64 %r458,0; bra $L110; $L111: st.u64 [%r158],%r458; .loc 1 451 23 mul.lo.u64 %r110,%r124,%r123; .loc 1 452 24 mul.lo.u64 %r433,%r126,%r123; .loc 1 452 10 sub.u64 %r176,%r180,%r433; .loc 1 453 23 mul.lo.u64 %r434,%r128,%r123; .loc 1 453 9 shl.b64 %r435,%r434,2; sub.u64 %r177,%r181,%r435; .loc 1 454 5 add.u64 %r178,%r178,1; .loc 1 455 7 add.u64 %r158,%r158,8; add.u64 %r219,%r219,8; setp.eq.u64 %r436,%r178,%r462; @ %r436 bra $L64; $L110: .loc 1 463 16 ld.u64 %r437,[%r158]; add.u64 %r123,%r437,1; st.u64 [%r158],%r123; .loc 1 464 23 add.u64 %r439,%r463,%r219; ld.u64 %r124,[%r439]; .loc 1 464 13 sub.u64 %r440,%r124,%r110; shl.b64 %r441,%r440,3; add.u64 %r179,%r179,%r441; .loc 1 465 24 add.u64 %r442,%frame,%r219; ld.u64 %r126,[%r442]; .loc 1 465 14 add.u64 %r180,%r176,%r126; .loc 1 466 23 add.u64 %r444,%r465,%r219; ld.u64 %r128,[%r444]; .loc 1 466 13 shl.b64 %r445,%r128,2; add.u64 %r181,%r177,%r445; .loc 1 444 32 add.u64 %r447,%r464,%r219; ld.u64 %r131,[%r447]; .loc 1 444 13 setp.eq.u64 %r448,%r123,%r131; @ %r448 bra $L111; .loc 1 448 13 mov.u64 %r39,0; bra $L113; $L136: .loc 1 322 6 ld.u64 %r449,[%r223]; setp.eq.u64 %r450,%r449,0; @ ! 
%r450 bra $L79; bra $L112; $L64: .loc 1 4_gfortran_sminloc1_4_r8 .visible .func _gfortran_sminloc1_4_r8, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u32 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L140; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L141; $L140:r152154; call _gfortran_minloc1_4_r8500 7 bra $L139; $L141: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r296,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L143; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L144; $L150: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L145; bra $L198; $L143: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error144: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r302,0; $L149: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L147; .loc 1 515 17 st.u64 [%r139],%r36; bra $L148; $L147: .loc 1 518 12 st.u64 [%r139],%r302; $L148: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L149; bra $L150; $L198: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L152; bra $L151; $L145: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r301,0; $L155: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L153; .loc 1 523 17 st.u64 [%r43],%r41; bra $L154; $L153: .loc 1 527 12 st.u64 [%r43],%r301; $L154: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L155; bra $L199; $L151: .loc 1 534 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L157; add.u64 %r295,%frame,120; $L177: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L158; $L157: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L159; bra $L200; $L161: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L158: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L161; bra $L157; $L159: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L139; $L200: .loc 1 557 2496246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L165: .loc 1 583 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L162; bra $L163; $L152: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L164; .loc 1 562 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error164: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L165; .loc 1 569 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L163; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L167: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L166; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error166: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L167; $L162: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ %r268 bra 
$L168; shl.b64 %r266,%r296,3; bra $L169; $L168: mov.u64 %r266,8; $L169: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L170: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L170; $L163: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,2; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 593 13 mov.u32 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L176: st.u32 [%r98],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L176; .loc 1 606 7 @ ! %r294 bra $L201; bra $L139; $L175: .loc 1 601 13 st.u64 [%r99],%r297; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L174; bra $L139; $L201: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; .loc 1 601 13 mov.u64 %r297,0; $L174: .loc 1 610 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,2; add.u64 %r98,%r98,%r291; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L175; .loc 1 601 13 mov.u64 %r80,0; bra $L176; $L199: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r293,%r98,0; @ ! 
%r293 bra $L152; bra $L177; $L139: .loc 1 615minloc1_8_r8.o/_gfortran_minloc1_8_r8 .visible .func _gfortran_minloc1_8_r8.file 1 "../../../../libgfortran/generated/minloc1_8_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_8_r8 .visible .func _gfortran_mminloc1_8_r8_gfortran_sminloc1_8_r8 .visible .func _gfortran_sminloc1_8_r8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_8_r8 .visible .func _gfortran_minloc1_8_r8pred %r221; .reg .pred %r222; .reg .u64u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .pred %r232; .reg .predu64 %r280; .reg .u64 %r282; .reg .u64pred64pred %r317; .reg .predpred %r323; .reg .predpred %r326; .reg .pred %r327; .reg .predu64 %r333; .reg .u64pred %r352; .reg .u64 %r353; .reg .predmov.u64 %r182,%ar0; mov.u64 %r183,%ar1; mov.u64 %r184,%ar2; mov.u32 %r185,%ar3; .loc 1 58 10 ld.s8 %r22,[%r183+28]; .loc 1 58 38 add.u32 %r186,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r126,%r186; .loc 1 59 10 ld.u64 %r25,[%r184]; .loc 1 59 7 add.u64 %r127,%r25,-1; .loc 1 61 7 shr.u64 %r188,%r127,63; set.u32.lt.s64 %r190,%r126,%r127; neg.s32 %r191,%r190; cvt.u16.u64 %r194,%r188; cvt.u16.u32 %r195,%r191; or.b16 %r193,%r194,%r195; .loc 1 61 6 cvt.u32.u16 %r196,%r193; cvt.u16.u8 %r197,%r196; setp.eq.u16 %r198,%r197,0; @ %r198 bra $L2; .loc 1 63 7 cvt.u32.u32 %r201,%r22; cvt.s64.s8 %r200,%r201; st.u64 [%stack+8],%r20019_gfortran_runtime_errorr204,%r127,%r127; add.u64 %r205,%r204,%r127; shl.b64 %r206,%r205,3; add.u64 %r207,%r183,%r206; ld.u64 %r31,[%r207+56]; ld.u64 %r33,[%r207+48]; .loc 1 71 9 ld.u64 %r130,[%r207+40]; .loc 1 73 3 setp.ne.u64 %r221,%r127,0; @ %r221 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r222,%r126,%r127; @ %r222 bra $L4; bra $L62; $L3: add.u64 %r125,%r183,40; add.u64 %r144,%frame,120; add.u64 %r181,%frame,240; add.u64 %r223,%r183,16; add.u64 %r225,%r25,%r25; add.u64 %r226,%r225,%r25; shl.b64 %r227,%r226,3; add.u64 %r75,%r223,%r227; .loc 1 79 12 mov.u64 %r368,0; $L8: .loc 1 75 18 ld.u64 %r228,[%r125]; st.u64 [%r144],%r228; .loc 1 76 19 ld.u64 
%r230,[%r125+16]; add.u64 %r229,%r230,1; ld.u64 %r231,[%r125+8]; sub.u64 %r38,%r229,%r231; .loc 1 78 10 setp.lt.s64 %r232,%r38,0; @ %r232 bra $L6; .loc 1 76 17 st.u64 [%r181],%r38; bra $L7; $L6: .loc 1 79 12 st.u64 [%r181],%r368; $L7: .loc 1 73 3 add.u64 %r125,%r125,24; add.u64 %r144,%r144,8; add.u64 %r181,%r181,8; setp.ne.u64 %r234,%r75,%r125; @ %r234 bra $L8; bra $L9; $L62: .loc 1 90 6 ld.u64 %r235,[%r182]; setp.eq.u64 %r236,%r235,0; @ ! %r236 bra $L11; bra $L10; $L4: add.u64 %r238,%r25,%r25; add.u64 %r239,%r238,%r25; shl.b64 %r240,%r239,3; add.u64 %r241,%r240,40; add.u64 %r121,%r183,%r241; shl.b64 %r242,%r25,3; add.u64 %r111,%r242,-8; add.u64 %r356,%frame,120; add.u64 %r115,%r356,%r111; add.u64 %r355,%frame,240; add.u64 %r109,%r355,%r111; cvt.u32.u32 %r246,%r22; cvt.s64.s8 %r245,%r246; add.u64 %r248,%r245,%r245; add.u64 %r249,%r248,%r245; shl.b64 %r250,%r249,3; add.u64 %r251,%r183,40; add.u64 %r49,%r250,%r251; .loc 1 87 12 mov.u64 %r367,0; $L14: .loc 1 83 18 ld.u64 %r252,[%r121]; st.u64 [%r115],%r252; .loc 1 84 19 ld.u64 %r254,[%r121+16]; add.u64 %r253,%r254,1; ld.u64 %r255,[%r121+8]; sub.u64 %r45,%r253,%r255; .loc 1 86 10 setp.lt.s64 %r256,%r45,0; @ %r256 bra $L12; .loc 1 84 17 st.u64 [%r109],%r45; bra $L13; $L12: .loc 1 87 12 st.u64 [%r109],%r367; $L13: .loc 1 81 3 add.u64 %r121,%r121,24; add.u64 %r115,%r115,8; add.u64 %r109,%r109,8; setp.ne.u64 %r258,%r49,%r121; @ %r258 bra $L14; bra $L63; $L10: .loc 1 94 7 setp.le.s64 %r259,%r126,0; @ %r259 bra $L16; add.u64 %r355,%frame,240; $L44: add.u64 %r150,%r182,40; mov.u64 %r148,%r355; cvt.u32.u32 %r261,%r22; cvt.s64.s8 %r260,%r261; add.u64 %r263,%r260,%r260; add.u64 %r264,%r263,%r260; shl.b64 %r265,%r264,3; add.u64 %r266,%r182,16; add.u64 %r123,%r265,%r266; .loc 1 97 10 mov.u64 %r114,1; .loc 1 101 4 mov.u64 %r289,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r267,0; st.u64 [%r182+8],%r267; .loc 1 106 28 cvt.u16.u32 %r270,%r22; add.u16 %r269,%r270,-1; cvt.u32.u16 %r271,%r269; st.u8 [%r182+28],%r271; .loc 1 108 20 add.u32 %r272,%r22,-2; cvt.s64.s32 %r56,%r272; add.u64 %r274,%r56,%r56; add.u64 %r275,%r274,%r56; shl.b64 %r276,%r275,3; add.u64 %r277,%r182,%r276; .loc 1 108 67 shl.b64 %r279,%r56,3; add.u64 %r280,%frame,%r279; .loc 1 108 59 ld.u64 %r282,[%r277+40]; ld.u64 %r283,[%r280+240]; mul.lo.u64 %r132,%r282,%r283; .loc 1 110 29 mov.u64 %r2851285; call (%value_in),_gfortrani_xmallocarray286,[%value_in]; } .loc 1 110 27 st.u64 [%r182],%r286; .loc 1 111 10 setp.eq.u64 %r288,%r132,0; @ ! 
%r288 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r114,%r52,%r114; $L17: .loc 1 101 4 st.u64 [%r150+8],%r289; ld.u64 %r52,[%r148]; add.u64 %r290,%r52,-1; st.u64 [%r150+16],%r290; st.u64 [%r150],%r114; .loc 1 94 7 add.u64 %r150,%r150,24; add.u64 %r148,%r148,8; setp.ne.u64 %r291,%r123,%r150; @ %r291 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r182+48],%r132; mov.u64 %r293,-1; st.u64 [%r182+56],%r293; mov.u64 %r294,1; st.u64 [%r182+40],%r294; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r62,[%r182+28]; .loc 1 121 10 setp.eq.u64 %r295,%r62,%r126; @ %r295 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r126; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r298,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r299,[%r298+36]; setp.eq.u32 %r300,%r299,0; @ %r300 bra $L21; .loc 1 128 2 add.u64 %r355,%frame,240; cvta.const.u64 %r304,$LC2call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r306,%r126,0; @ %r306 bra $L24; $L27: .loc 1 68 9 add.u64 %r307,%r31,1; .loc 1 68 7 sub.u64 %r128,%r307,%r33; max.s64 %r145,%r128,0; .loc 1 140 8 ld.u64 %r140,[%r183]; .loc 1 141 8 ld.u64 %r141,[%r182]; .loc 1 168 40 shl.b64 %r72,%r130,3; ld.u64 %r47,[%frame+360]; .loc 1 204 22 ld.u64 %r82,[%frame+120]; .loc 1 204 12 shl.b64 %r84,%r82,3; .loc 1 205 22 ld.u64 %r85,[%frame]; .loc 1 205 12 shl.b64 %r87,%r85,3; setp.gt.s64 %r357,%r128,0; setp.le.s64 %r358,%r126,1; .loc 1 180 9 setp.ne.u32 %r360,%r185,0; .loc 1 159 10 mov.u64 %r361,0; cvt.u32.u32 %r362,%r22; cvt.s64.s8 %r363,%r362; add.u64 %r364,%r363,-1; add.u64 %r365,%frame,120; add.u64 %r366,%frame,240; bra $L25; $L24: add.u64 %r168,%frame,360; add.u64 %r167,%r182,40; mov.u64 %r165,%frame; cvt.u32.u32 %r309,%r22; cvt.s64.s8 %r308,%r309; add.u64 %r154,%r308,-1; .loc 1 132 3 mov.u64 %r142,0; add.u64 %r355,%frame,240; .loc 1 134 16 mov.u64 %r310,%r142; $L26: st.u64 [%r168],%r310; .loc 1 135 18 ld.u64 %r311,[%r167]; st.u64 [%r165],%r311; .loc 1 136 17 shl.b64 %r313,%r142,3; add.u64 %r314,%r355,%r313; .loc 1 136 10 ld.u64 %r315,[%r314]; setp.le.s64 %r316,%r315,0; @ %r316 bra $L1; .loc 1 132 26 add.u64 %r142,%r142,1; .loc 1 132 3 add.u64 %r168,%r168,8; add.u64 %r167,%r167,24; add.u64 %r165,%r165,8; setp.ne.u64 %r317,%r142,%r154; @ %r317 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r357 bra $L45; .loc 1 159 10 st.u64 [%r141],%r361; bra $L29; $L45: mov.u64 %r133,%r140; .loc 1 168 17 mov.u64 %r118,0; $L28: .loc 1 170 7 ld.f64 %r113,[%r133]; mov.u64 %r151,%r118; .loc 1 168 32 add.u64 %r118,%r118,1; .loc 1 170 6 setp.le.f64 %r320,%r113,0d7ff0000000000000; @ ! 
%r320 bra $L64; .loc 1 180 9 @ %r360 bra $L32; .loc 1 190 8 setp.gt.s64 %r322,%r145,%r151; @ %r322 bra $L33; bra $L34; $L32: .loc 1 181 8 setp.gt.s64 %r323,%r145,%r151; @ %r323 bra $L46; bra $L34; $L64: .loc 1 168 40 add.u64 %r133,%r133,%r72; .loc 1 168 10 setp.gt.s64 %r324,%r145,%r118; @ %r324 bra $L28; .loc 1 157 9 mov.u64 %r118,1; bra $L34; $L46: .loc 1 170 7 mov.f64 %r180,%r113; $L35: .loc 1 186 16 add.u64 %r151,%r151,1; .loc 1 183 8 setp.ge.f64 %r325,%r113,%r180; .loc 1 185 16 selp.f64 %r113,%r180,%r113,%r325; selp.u64 %r118,%r151,%r118,%r325; .loc 1 181 33 add.u64 %r133,%r133,%r72; .loc 1 181 8 setp.eq.u64 %r326,%r145,%r151; @ %r326 bra $L34; .loc 1 183 9 ld.f64 %r180,[%r133]; bra $L35; $L33: .loc 1 192 9 ld.f64 %r77,[%r133]; .loc 1 195 16 add.u64 %r151,%r151,1; .loc 1 192 8 setp.lt.f64 %r327,%r77,%r113; .loc 1 194 16 selp.f64 %r113,%r77,%r113,%r327; selp.u64 %r118,%r151,%r118,%r327; .loc 1 190 33 add.u64 %r133,%r133,%r72; .loc 1 190 8 setp.ne.u64 %r328,%r145,%r151; @ %r328 bra $L33; $L34: .loc 1 199 12 st.u64 [%r141],%r118; $L29: .loc 1 203 15 add.u64 %r47,%r47,1; .loc 1 204 12 add.u64 %r140,%r140,%r84; .loc 1 205 12 add.u64 %r141,%r141,%r87; .loc 1 207 32 ld.u64 %r122,[%frame+240]; .loc 1 207 13 setp.ne.u64 %r329,%r47,%r122; @ %r329 bra $L25; .loc 1 214 23 mul.lo.u64 %r330,%r82,%r47; .loc 1 214 9 shl.b64 %r331,%r330,3; sub.u64 %r137,%r140,%r331; .loc 1 215 23 mul.lo.u64 %r332,%r85,%r47; .loc 1 215 9 shl.b64 %r333,%r332,3; sub.u64 %r138,%r141,%r333; .loc 1 217 7 @ %r358 bra $L1; add.u64 %r177,%frame,368; mov.u64 %r176,8; .loc 1 216 5 mov.u64 %r139,1; bra $L42; $L43: .loc 1 211 13 st.u64 [%r177],%r361; .loc 1 214 23 mul.lo.u64 %r339,%r100,%r99; .loc 1 214 9 shl.b64 %r340,%r339,3; sub.u64 %r137,%r140,%r340; .loc 1 215 23 mul.lo.u64 %r341,%r103,%r99; .loc 1 215 9 shl.b64 %r342,%r341,3; sub.u64 %r138,%r141,%r342; .loc 1 216 5 add.u64 %r139,%r139,1; .loc 1 217 7 add.u64 %r177,%r177,8; add.u64 %r176,%r176,8; setp.eq.u64 %r343,%r139,%r364; @ %r343 bra $L1; $L42: .loc 1 225 16 ld.u64 %r344,[%r177]; add.u64 %r99,%r344,1; st.u64 [%r177],%r99; .loc 1 226 23 add.u64 %r346,%r365,%r176; ld.u64 %r100,[%r346]; .loc 1 226 13 shl.b64 %r347,%r100,3; add.u64 %r140,%r137,%r347; .loc 1 227 23 add.u64 %r348,%frame,%r176; ld.u64 %r103,[%r348]; .loc 1 227 13 shl.b64 %r349,%r103,3; add.u64 %r141,%r138,%r349; .loc 1 207 32 add.u64 %r351,%r366,%r176; ld.u64 %r107,[%r351]; .loc 1 207 13 setp.eq.u64 %r352,%r99,%r107; @ %r352 bra $L43; .loc 1 211 13 mov.u64 %r47,0; bra $L25; $L63: .loc 1 90 6 ld.u64 %r353,[%r182]; setp.eq.u64 %r354,%r353,0; @ ! 
%r354 bra $L11; bra $L44; $L1: .loc 1 231_gfortran_mminloc1_8_r8 .visible .func _gfortran_mminloc1_8_r8, .param .u32 %in_ar4) {608pred %r218; .reg .u32 %r223; .reg .u6464u64pred %r373; .reg .u64 %r374; .reg .u64 %r376; .reg .u32 %r377; .reg .pred %r378; .reg .u64 %r381; .reg .u64 %r382; .reg .u64 %r386; .reg .pred %r388; .reg .pred %r389; .reg .u64 %r390; .reg .u32 %r391; .reg .u64pred %r398; .reg .pred %r399; .reg .pred %r401; .reg .pred %r402; .reg .pred %r403; .reg .pred %r404; .reg .pred %r407; .reg .pred %r408; .reg .pred %r410; .reg .pred %r411; .reg .pred.reg .u64 %r421; .reg .u64u64pred %r445; .reg .u64 %r446; .reg .predmov.u64 %r213,%ar0; mov.u64 %r214,%ar1; mov.u64 %r215,%ar2; mov.u64 %r216,%ar3; mov.u32 %r217,%ar4; .loc 1 261 6 setp.ne.u64 %r218,%r216,0; @ %r218 bra $L661r214215217; call _gfortran_minloc1_8_r8268 7 bra $L65; $L66: .loc 1 271 10 ld.u64 %r22,[%r215]; .loc 1 271 7 add.u64 %r152,%r22,-1; .loc 1 272 10 ld.s8 %r23,[%r214+28]; .loc 1 272 38 add.u32 %r223,%r23,-1; .loc 1 272 8 cvt.s64.s32 %r153,%r223; .loc 1 275 7 shr.u64 %r225,%r152,63; set.u32.gt.s64 %r227,%r152,%r153; neg.s32 %r228,%r227; cvt.u16.u64 %r231,%r225; cvt.u16.u32 %r232,%r228; or.b16 %r230,%r231,%r232; .loc 1 275 6 cvt.u32.u16 %r233,%r230; cvt.u16.u8 %r234,%r233; setp.eq.u16 %r235,%r234,0; @ %r235 bra $L68; .loc 1 277 7 cvt.u32.u32 %r238,%r23; cvt.s64.s8 %r237,%r238; st.u64 [%stack+8],%r237; st.u64 [%stack],%r22; cvta.const.u64 %r236236_gfortran_runtime_error68: .loc 1 282 9 add.u64 %r442,%r152,%r152; add.u64 %r444,%r442,%r152; shl.b64 %r243,%r444,3; add.u64 %r244,%r214,%r243; ld.u64 %r247,[%r244+56]; add.u64 %r246,%r247,1; .loc 1 282 7 ld.u64 %r254,[%r244+48]; sub.u64 %r154,%r246,%r254; .loc 1 283 6 setp.le.s64 %r255,%r154,0; @ %r255 bra $L65; .loc 1 286 9 ld.u64 %r172,[%r216]; .loc 1 288 15 ld.u64 %r34,[%r216+16]; .loc 1 290 22 cvt.u32.u64 %r35,%r34; .loc 1 290 53 add.u32 %r256,%r35,-4; and.b32 %r257,%r256,-5; set.u32.eq.u32 %r259,%r257,0; neg.s32 %r260,%r259; .loc 1 290 22 add.u32 %r261,%r35,-1; set.u32.le.u32 %r263,%r261,1; neg.s32 %r264,%r263; .loc 1 290 6 cvt.u16.u32 %r266,%r260; cvt.u16.u32 %r267,%r264; or.b16 %r265,%r266,%r267; cvt.u32.u16 %r268,%r265; cvt.u16.u8 %r269,%r268; setp.ne.u16 %r270,%r269,0; @ %r270 bra $L70; .loc 1 292 7 setp.ne.u32 %r272,%r35,16; @ %r272 bra $L71; $L70: .loc 1 299 9 shl.b64 %r276,%r444,3; add.u64 %r277,%r214,%r276; ld.u64 %r157,[%r277+40]; .loc 1 300 12 add.u64 %r283,%r216,%r276; ld.u64 %r38,[%r283+40]; .loc 1 302 3 setp.ne.u64 %r285,%r152,0; @ %r285 bra $L72; $L78: .loc 1 312 3 setp.lt.s64 %r286,%r152,%r153; @ %r286 bra $L73; bra $L139; $L71: .loc 1 297 5_gfortran_runtime_error72: add.u64 %r205,%r214,40; add.u64 %r70,%r216,40; add.u64 %r289,%r214,16; add.u64 %r291,%r22,%r22; add.u64 %r292,%r291,%r22; shl.b64 %r293,%r292,3; add.u64 %r188,%r289,%r293; .loc 1 302 3 mov.u64 %r92,0; add.u64 %r441,%frame,240; add.u64 %r439,%frame,360; .loc 1 309 12 mov.u64 %r455,%r92; $L77: .loc 1 304 18 add.u64 %r295,%r441,%r92; ld.u64 %r296,[%r205]; st.u64 [%r295],%r296; .loc 1 305 18 add.u64 %r297,%frame,%r92; .loc 1 305 20 ld.u64 %r299,[%r70]; mul.lo.u64 %r298,%r299,%r34; .loc 1 305 18 st.u64 [%r297],%r298; .loc 1 306 19 ld.u64 %r301,[%r205+16]; add.u64 %r300,%r301,1; ld.u64 %r302,[%r205+8]; sub.u64 %r49,%r300,%r302; .loc 1 308 10 setp.lt.s64 %r303,%r49,0; @ %r303 bra $L75; .loc 1 306 17 add.u64 %r305,%r439,%r92; st.u64 [%r305],%r49; bra $L76; $L75: .loc 1 309 12 add.u64 %r307,%r439,%r92; st.u64 [%r307],%r455; $L76: .loc 1 302 3 add.u64 %r205,%r205,24; add.u64 %r70,%r70,24; add.u64 
%r92,%r92,8; setp.ne.u64 %r309,%r188,%r205; @ %r309 bra $L77; bra $L78; $L139: .loc 1 322 6 ld.u64 %r310,[%r213]; setp.eq.u64 %r311,%r310,0; @ ! %r311 bra $L80; bra $L79; $L73: add.u64 %r313,%r22,%r22; add.u64 %r314,%r313,%r22; shl.b64 %r315,%r314,3; add.u64 %r123,%r315,40; add.u64 %r128,%r214,%r123; add.u64 %r107,%r216,%r123; shl.b64 %r316,%r22,3; add.u64 %r78,%r316,-8; cvt.u32.u32 %r318,%r23; cvt.s64.s8 %r317,%r318; shl.b64 %r319,%r317,3; add.u64 %r180,%r319,-8; add.u64 %r441,%frame,240; add.u64 %r439,%frame,360; .loc 1 319 12 mov.u64 %r454,0; $L83: .loc 1 314 18 add.u64 %r321,%r441,%r78; ld.u64 %r322,[%r128]; st.u64 [%r321],%r322; .loc 1 315 18 add.u64 %r323,%frame,%r78; .loc 1 315 20 ld.u64 %r325,[%r107]; mul.lo.u64 %r324,%r325,%r34; .loc 1 315 18 st.u64 [%r323],%r324; .loc 1 316 19 ld.u64 %r327,[%r128+16]; add.u64 %r326,%r327,1; ld.u64 %r328,[%r128+8]; sub.u64 %r61,%r326,%r328; .loc 1 318 10 setp.lt.s64 %r329,%r61,0; @ %r329 bra $L81; .loc 1 316 17 add.u64 %r331,%r439,%r78; st.u64 [%r331],%r61; bra $L82; $L81: .loc 1 319 12 add.u64 %r333,%r439,%r78; st.u64 [%r333],%r454; $L82: .loc 1 312 3 add.u64 %r128,%r128,24; add.u64 %r107,%r107,24; add.u64 %r78,%r78,8; setp.ne.u64 %r335,%r78,%r180; @ %r335 bra $L83; bra $L140; $L79: .loc 1 326 7 setp.eq.u64 %r336,%r153,0; @ %r336 bra $L85; add.u64 %r439,%frame,360; $L112: add.u64 %r144,%r213,40; mov.u64 %r139,%r439; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; add.u64 %r340,%r337,%r337; add.u64 %r341,%r340,%r337; shl.b64 %r342,%r341,3; add.u64 %r343,%r213,16; add.u64 %r129,%r342,%r343; .loc 1 329 10 mov.u64 %r132,1; .loc 1 333 4 mov.u64 %r362,0; bra $L86; $L85: .loc 1 337 20 add.u32 %r344,%r23,-2; cvt.s64.s32 %r68,%r344; add.u64 %r346,%r68,%r68; add.u64 %r347,%r346,%r68; shl.b64 %r348,%r347,3; add.u64 %r349,%r213,%r348; .loc 1 337 67 shl.b64 %r351,%r68,3; add.u64 %r352,%frame,%r351; .loc 1 337 59 ld.u64 %r354,[%r349+40]; ld.u64 %r355,[%r352+360]; mul.lo.u64 %r161,%r354,%r355; .loc 1 339 24 mov.u64 %r356,0; st.u64 [%r213+8],%r356; .loc 1 340 28 cvt.u16.u32 %r359,%r23; add.u16 %r358,%r359,-1; cvt.u32.u16 %r360,%r358; st.u8 [%r213+28],%r360; .loc 1 342 10 setp.eq.u64 %r361,%r161,0; @ %r361 bra $L87; bra $L141; $L89: .loc 1 331 47 mul.lo.u64 %r132,%r65,%r132; $L86: .loc 1 333 4 st.u64 [%r144+8],%r362; ld.u64 %r65,[%r139]; add.u64 %r363,%r65,-1; st.u64 [%r144+16],%r363; st.u64 [%r144],%r132; .loc 1 326 7 add.u64 %r144,%r144,24; add.u64 %r139,%r139,8; setp.ne.u64 %r364,%r129,%r144; @ %r364 bra $L89; bra $L85; $L87: .loc 1 345 4 st.u64 [%r213+48],%r161; mov.u64 %r366,-1; st.u64 [%r213+56],%r366; mov.u64 %r367,1; st.u64 [%r213+40],%r367; .loc 1 346 4 bra $L65; $L141: .loc 1 349 24 mov.u64 %r369369; call (%value_in),_gfortrani_xmallocarray70,[%value_in]; } .loc 1 349 22 st.u64 [%r213],%r370; bra $L90; $L80: .loc 1 354 19 ld.s8 %r372,[%r213+28]; .loc 1 354 10 setp.eq.u64 %r373,%r372,%r153; @ %r373 bra $L91; .loc 1 355 2 cvta.const.u64 %r374374_gfortran_runtime_error91: .loc 1 357 11 cvta.global.u64 %r376,_gfortrani_compile_options; .loc 1 357 10 ld.u32 %r377,[%r376+36]; setp.eq.u32 %r378,%r377,0; @ %r378 bra $L90; .loc 1 359 4 add.u64 %r439,%frame,360; cvta.const.u64 %r382,$LC2382; call _gfortrani_bounds_ifunction_return361 4 cvta.const.u64 %r386,$LC386382; call _gfortrani_bounds_equal_extents$L90: .loc 1 366 3 setp.ne.u64 %r388,%r153,0; @ %r388 bra $L92; $L95: .loc 1 374 8 ld.u64 %r173,[%r213]; .loc 1 375 8 ld.u64 %r171,[%r214]; .loc 1 377 9 setp.ne.u64 %r389,%r171,0; @ %r389 bra $L93; bra $L65; $L92: add.u64 %r193,%frame,480; add.u64 %r187,%r213,40; 
add.u64 %r181,%frame,120; cvt.u32.u32 %r391,%r23; cvt.s64.s8 %r390,%r391; add.u64 %r145,%r390,-1; .loc 1 366 3 mov.u64 %r174,0; add.u64 %r439,%frame,360; .loc 1 368 16 mov.u64 %r392,%r174; $L94: st.u64 [%r193],%r392; .loc 1 369 18 ld.u64 %r393,[%r187]; st.u64 [%r181],%r393; .loc 1 370 17 shl.b64 %r395,%r174,3; add.u64 %r396,%r439,%r395; .loc 1 370 10 ld.u64 %r397,[%r396]; setp.le.s64 %r398,%r397,0; @ %r398 bra $L65; .loc 1 366 26 add.u64 %r174,%r174,1; .loc 1 366 3 add.u64 %r193,%r193,8; add.u64 %r187,%r187,24; add.u64 %r181,%r181,8; setp.ne.u64 %r399,%r145,%r174; @ %r399 bra $L94; bra $L95; $L93: .loc 1 300 12 mul.lo.u64 %r40,%r38,%r34; .loc 1 396 32 shl.b64 %r82,%r157,3; ld.u64 %r185,[%frame+480]; .loc 1 440 22 ld.u64 %r93,[%frame+240]; .loc 1 440 12 shl.b64 %r95,%r93,3; .loc 1 441 23 ld.u64 %r96,[%frame]; .loc 1 442 22 ld.u64 %r98,[%frame+120]; .loc 1 442 12 shl.b64 %r100,%r98,3; setp.le.s64 %r445,%r153,1; .loc 1 418 9 setp.eq.u32 %r447,%r217,0; cvt.u32.u32 %r448,%r23; cvt.s64.s8 %r449,%r448; add.u64 %r450,%r449,-1; add.u64 %r451,%frame,240; add.u64 %r452,%frame,360; add.u64 %r453,%frame,120; $L113: .loc 1 366 3 mov.u64 %r166,%r172; mov.u64 %r165,%r171; .loc 1 393 16 mov.u64 %r142,0; .loc 1 396 9 mov.u64 %r138,%r142; $L101: .loc 1 399 7 ld.s8 %r210,[%r166]; mov.u64 %r182,%r138; .loc 1 403 17 add.u64 %r138,%r138,1; .loc 1 399 6 setp.eq.u32 %r440,%r210,0; @ %r440 bra $L96; .loc 1 402 10 setp.eq.u64 %r401,%r142,0; .loc 1 403 17 selp.u64 %r142,%r138,%r142,%r401; .loc 1 404 11 ld.f64 %r140,[%r165]; .loc 1 404 10 setp.le.f64 %r402,%r140,0d7ff0000000000000; @ ! %r402 bra $L96; .loc 1 414 9 setp.le.s64 %r403,%r154,%r182; @ ! %r403 bra $L142; bra $L99; $L96: .loc 1 396 32 add.u64 %r165,%r165,%r82; .loc 1 396 47 add.u64 %r166,%r166,%r40; .loc 1 396 2 setp.ne.u64 %r404,%r154,%r138; @ %r404 bra $L101; bra $L99; $L142: .loc 1 418 9 @ %r447 bra $L102; $L105: .loc 1 424 16 add.u64 %r182,%r182,1; .loc 1 421 8 @ %r440 bra $L103; .loc 1 421 18 ld.f64 %r85,[%r165]; .loc 1 421 15 setp.le.f64 %r407,%r85,%r140; .loc 1 423 16 selp.f64 %r140,%r85,%r140,%r407; selp.u64 %r138,%r182,%r138,%r407; $L103: .loc 1 419 33 add.u64 %r165,%r165,%r82; .loc 1 419 48 add.u64 %r166,%r166,%r40; .loc 1 419 8 setp.eq.u64 %r408,%r154,%r182; @ %r408 bra $L115; .loc 1 421 9 ld.s8 %r210,[%r166]; setp.eq.u32 %r440,%r210,0; bra $L105; $L102: .loc 1 433 11 add.u64 %r182,%r182,1; .loc 1 430 10 @ %r440 bra $L106; .loc 1 430 20 ld.f64 %r90,[%r165]; .loc 1 430 17 setp.lt.f64 %r410,%r90,%r140; .loc 1 432 18 selp.f64 %r140,%r90,%r140,%r410; selp.u64 %r138,%r182,%r138,%r410; $L106: .loc 1 428 35 add.u64 %r165,%r165,%r82; .loc 1 428 50 add.u64 %r166,%r166,%r40; .loc 1 428 10 setp.eq.u64 %r411,%r154,%r182; @ %r411 bra $L116; .loc 1 430 11 ld.s8 %r210,[%r166]; setp.eq.u32 %r440,%r210,0; bra $L102; $L115: mov.u64 %r142,%r138; bra $L99; $L116: mov.u64 %r142,%r138; $L99: .loc 1 436 8 st.u64 [%r173],%r142; .loc 1 439 15 add.u64 %r185,%r185,1; .loc 1 440 12 add.u64 %r171,%r171,%r95; .loc 1 441 13 add.u64 %r172,%r172,%r96; .loc 1 442 12 add.u64 %r173,%r173,%r100; .loc 1 444 32 ld.u64 %r164,[%frame+360]; .loc 1 444 13 setp.ne.u64 %r412,%r185,%r164; @ %r412 bra $L113; .loc 1 451 23 mul.lo.u64 %r102,%r93,%r185; .loc 1 452 24 mul.lo.u64 %r413,%r96,%r185; .loc 1 452 10 sub.u64 %r168,%r172,%r413; .loc 1 453 23 mul.lo.u64 %r414,%r98,%r185; .loc 1 453 9 shl.b64 %r415,%r414,3; sub.u64 %r169,%r173,%r415; .loc 1 455 7 @ %r445 bra $L65; add.u64 %r203,%frame,488; mov.u64 %r202,8; .loc 1 454 5 mov.u64 %r170,1; .loc 1 448 13 mov.u64 %r446,0; bra $L110; $L111: st.u64 
[%r203],%r446; .loc 1 451 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 452 24 mul.lo.u64 %r421,%r116,%r113; .loc 1 452 10 sub.u64 %r168,%r172,%r421; .loc 1 453 23 mul.lo.u64 %r422,%r118,%r113; .loc 1 453 9 shl.b64 %r423,%r422,3; sub.u64 %r169,%r173,%r423; .loc 1 454 5 add.u64 %r170,%r170,1; .loc 1 455 7 add.u64 %r203,%r203,8; add.u64 %r202,%r202,8; setp.eq.u64 %r424,%r170,%r450; @ %r424 bra $L65; $L110: .loc 1 463 16 ld.u64 %r425,[%r203]; add.u64 %r113,%r425,1; st.u64 [%r203],%r113; .loc 1 464 23 add.u64 %r427,%r451,%r202; ld.u64 %r114,[%r427]; .loc 1 464 13 sub.u64 %r428,%r114,%r102; shl.b64 %r429,%r428,3; add.u64 %r171,%r171,%r429; .loc 1 465 24 add.u64 %r430,%frame,%r202; ld.u64 %r116,[%r430]; .loc 1 465 14 add.u64 %r172,%r168,%r116; .loc 1 466 23 add.u64 %r432,%r453,%r202; ld.u64 %r118,[%r432]; .loc 1 466 13 shl.b64 %r433,%r118,3; add.u64 %r173,%r169,%r433; .loc 1 444 32 add.u64 %r435,%r452,%r202; ld.u64 %r122,[%r435]; .loc 1 444 13 setp.eq.u64 %r436,%r113,%r122; @ %r436 bra $L111; .loc 1 448 13 mov.u64 %r185,0; bra $L113; $L140: .loc 1 322 6 ld.u64 %r437,[%r213]; setp.eq.u64 %r438,%r437,0; @ ! %r438 bra $L80; bra $L112; $L65: .loc 1 4_gfortran_sminloc1_8_r8 .visible .func _gfortran_sminloc1_8_r8, .param .u32 %in_ar4) {8u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L144; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L145; $L144:r152154; call _gfortran_minloc1_8_r8500 7 bra $L143; $L145: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r296,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L147; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L148; $L154: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L149; bra $L202; $L147: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error148: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r302,0; $L153: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L151; .loc 1 515 17 st.u64 [%r139],%r36; bra $L152; $L151: .loc 1 518 12 st.u64 [%r139],%r302; $L152: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L153; bra $L154; $L202: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L156; bra $L155; $L149: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r301,0; $L159: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L157; .loc 1 523 17 st.u64 [%r43],%r41; bra $L158; $L157: .loc 1 527 12 st.u64 [%r43],%r301; $L158: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L159; bra $L203; $L155: .loc 1 534 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L161; add.u64 %r295,%frame,120; $L181: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L162; $L161: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L163; bra $L204; $L165: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L162: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L165; bra $L161; $L163: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L143; $L204: .loc 1 557 24 mov.u64 %r24696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L169: .loc 1 583 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L166; bra $L167; $L156: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L168; .loc 1 562 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error168: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L169; .loc 1 569 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L167; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L171: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L170; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error170: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L171; $L166: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ 
%r268 bra $L172; shl.b64 %r266,%r296,3; bra $L173; $L172: mov.u64 %r266,8; $L173: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L174: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L174; $L167: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,3; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 593 13 mov.u64 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L180: st.u64 [%r98],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L180; .loc 1 606 7 @ ! %r294 bra $L205; bra $L143; $L179: .loc 1 601 13 st.u64 [%r99],%r280; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L178; bra $L143; $L205: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; $L178: .loc 1 610 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,3; add.u64 %r98,%r98,%r291; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L179; .loc 1 601 13 mov.u64 %r80,0; bra $L180; $L203: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r293,%r98,0; @ ! %r293 bra $L156; bra $L181; $L143: .loc 1 615 minloc1_16_r8.o/162280223_gfortran_minloc1_16_r8 .visible .func _gfortran_minloc1_16_r8.file 1 "../../../../libgfortran/generated/minloc1_16_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_16_r8 .visible .func _gfortran_mminloc1_16_r8_gfortran_sminloc1_16_r8 .visible .func _gfortran_sminloc1_16_r8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 
},105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_16_r8 .visible .func _gfortran_minloc1_16_r8u64 %r116; .reg .f64u16 %r213; .reg .u16 %r214; .reg .u16 %r215; .reg .u32 %r216; .reg .u16 %r217; .reg .predpredu64 %r265; .reg .u32predu64 %r283; .reg .u64pred64u32 %r333; .reg .u64 %r334; .reg .u64pred %r340; .reg .pred %r341; .reg .u64 %r343; .reg .pred %r345; .reg .pred %r348; .reg .predpredu32 %r438; .reg .pred %r445; .reg .pred %r448; .reg .predu64 %r460; .reg .predpred %r470; .reg .u64 %r471; .reg .predu64 %r507; .reg .pred %r511; .reg .pred %r512; .reg .u64 %r513; .reg .u64 %r514; .reg .u64 %r515; .reg .pred %r516; .reg .u64 %r517; .reg .u32 %r518; .reg .u64 %r519; .reg .u64 %r520; .reg .u64 %r521; .reg .u64 %r522; .reg .u64 %r523; .reg .u64 %r524; mov.u64 %r202,%ar0; mov.u64 %r203,%ar1; mov.u64 %r204,%ar2; mov.u32 %r205,%ar3; .loc 1 58 10 ld.s8 %r22,[%r203+28]; .loc 1 58 38 add.u32 %r206,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r130,%r206; .loc 1 59 10 ld.u64 %r25,[%r204]; .loc 1 59 7 add.u64 %r131,%r25,-1; .loc 1 61 7 shr.u64 %r208,%r131,63; set.u32.lt.s64 %r210,%r130,%r131; neg.s32 %r211,%r210; cvt.u16.u64 %r214,%r208; cvt.u16.u32 %r215,%r211; or.b16 %r213,%r214,%r215; .loc 1 61 6 cvt.u32.u16 %r216,%r213; cvt.u16.u8 %r217,%r216; setp.eq.u16 %r218,%r217,0; @ %r218 bra $L2; .loc 1 63 7 cvt.u32.u32 %r221,%r22; cvt.s64.s8 %r220,%r221; st.u64 [%stack+8],%r22011_gfortran_runtime_errorr224,%r131,%r131; add.u64 %r225,%r224,%r131; shl.b64 %r226,%r225,3; add.u64 %r227,%r203,%r226; ld.u64 %r30,[%r227+56]; ld.u64 %r32,[%r227+48]; .loc 1 71 9 ld.u64 %r134,[%r227+40]; .loc 1 73 3 setp.ne.u64 %r241,%r131,0; @ %r241 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r242,%r130,%r131; @ %r242 bra $L4; bra $L63; $L3: add.u64 %r76,%r203,40; add.u64 %r38,%frame,120; add.u64 %r164,%frame,240; add.u64 %r244,%r25,%r25; add.u64 %r245,%r244,%r25; shl.b64 %r246,%r245,3; add.u64 %r247,%r203,16; add.u64 %r43,%r246,%r247; .loc 1 79 12 mov.u64 %r524,0; $L8: .loc 1 75 18 ld.u64 %r248,[%r76]; st.u64 [%r38],%r248; .loc 1 76 19 ld.u64 %r250,[%r76+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r76+8]; sub.u64 %r37,%r249,%r251; .loc 1 78 10 setp.lt.s64 %r252,%r37,0; @ %r252 bra $L6; .loc 1 76 17 st.u64 [%r164],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 [%r164],%r524; $L7: .loc 1 73 3 add.u64 %r76,%r76,24; add.u64 %r38,%r38,8; add.u64 %r164,%r164,8; setp.ne.u64 %r254,%r43,%r76; @ %r254 bra $L8; bra $L9; $L63: .loc 1 90 6 ld.u64 %r255,[%r202]; setp.eq.u64 %r256,%r255,0; @ ! 
%r256 bra $L11; bra $L10; $L4: add.u64 %r258,%r25,%r25; add.u64 %r259,%r258,%r25; shl.b64 %r260,%r259,3; add.u64 %r261,%r260,40; add.u64 %r72,%r203,%r261; shl.b64 %r262,%r25,3; add.u64 %r197,%r262,-8; add.u64 %r514,%frame,120; add.u64 %r128,%r514,%r197; add.u64 %r513,%frame,240; add.u64 %r195,%r513,%r197; cvt.u32.u32 %r266,%r22; cvt.s64.s8 %r265,%r266; add.u64 %r268,%r265,%r265; add.u64 %r269,%r268,%r265; shl.b64 %r270,%r269,3; add.u64 %r271,%r203,40; add.u64 %r64,%r270,%r271; .loc 1 87 12 mov.u64 %r523,0; $L14: .loc 1 83 18 ld.u64 %r272,[%r72]; st.u64 [%r128],%r272; .loc 1 84 19 ld.u64 %r274,[%r72+16]; add.u64 %r273,%r274,1; ld.u64 %r275,[%r72+8]; sub.u64 %r45,%r273,%r275; .loc 1 86 10 setp.lt.s64 %r276,%r45,0; @ %r276 bra $L12; .loc 1 84 17 st.u64 [%r195],%r45; bra $L13; $L12: .loc 1 87 12 st.u64 [%r195],%r523; $L13: .loc 1 81 3 add.u64 %r72,%r72,24; add.u64 %r128,%r128,8; add.u64 %r195,%r195,8; setp.ne.u64 %r278,%r64,%r72; @ %r278 bra $L14; bra $L64; $L10: .loc 1 94 7 setp.le.s64 %r279,%r130,0; @ %r279 bra $L16; add.u64 %r513,%frame,240; $L46: add.u64 %r113,%r202,40; mov.u64 %r110,%r513; cvt.u32.u32 %r281,%r22; cvt.s64.s8 %r280,%r281; add.u64 %r283,%r280,%r280; add.u64 %r284,%r283,%r280; shl.b64 %r285,%r284,3; add.u64 %r286,%r202,16; add.u64 %r86,%r285,%r286; .loc 1 97 10 mov.u64 %r111,1; .loc 1 101 4 mov.u64 %r309,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r287,0; st.u64 [%r202+8],%r287; .loc 1 106 28 cvt.u16.u32 %r290,%r22; add.u16 %r289,%r290,-1; cvt.u32.u16 %r291,%r289; st.u8 [%r202+28],%r291; .loc 1 108 20 add.u32 %r292,%r22,-2; cvt.s64.s32 %r53,%r292; add.u64 %r294,%r53,%r53; add.u64 %r295,%r294,%r53; shl.b64 %r296,%r295,3; add.u64 %r297,%r202,%r296; .loc 1 108 67 shl.b64 %r299,%r53,3; add.u64 %r300,%frame,%r299; .loc 1 108 59 ld.u64 %r302,[%r297+40]; ld.u64 %r303,[%r300+240]; mul.lo.u64 %r136,%r302,%r303; .loc 1 110 29 mov.u64 %r305,16305; call (%value_in),_gfortrani_xmallocarray06,[%value_in]; } .loc 1 110 27 st.u64 [%r202],%r306; .loc 1 111 10 setp.eq.u64 %r308,%r136,0; @ ! 
%r308 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r111,%r49,%r111; $L17: .loc 1 101 4 st.u64 [%r113+8],%r309; ld.u64 %r49,[%r110]; add.u64 %r310,%r49,-1; st.u64 [%r113+16],%r310; st.u64 [%r113],%r111; .loc 1 94 7 add.u64 %r113,%r113,24; add.u64 %r110,%r110,8; setp.ne.u64 %r311,%r86,%r113; @ %r311 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r202+48],%r136; mov.u64 %r313,-1; st.u64 [%r202+56],%r313; mov.u64 %r314,1; st.u64 [%r202+40],%r314; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r60,[%r202+28]; .loc 1 121 10 setp.eq.u64 %r315,%r60,%r130; @ %r315 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r130; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r318,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r319,[%r318+36]; setp.eq.u32 %r320,%r319,0; @ %r320 bra $L21; .loc 1 128 2 add.u64 %r513,%frame,240; cvta.const.u64 %r324,$LC2r513324; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r326,%r130,0; @ %r326 bra $L24; $L27: .loc 1 68 9 add.u64 %r327,%r30,1; .loc 1 68 7 sub.u64 %r132,%r327,%r32; max.s64 %r68,%r132,0; .loc 1 140 8 ld.u64 %r147,[%r203]; .loc 1 141 8 ld.u64 %r148,[%r202]; .loc 1 168 40 shl.b64 %r70,%r134,3; ld.u64 %r161,[%frame+360]; .loc 1 204 22 ld.u64 %r81,[%frame+120]; .loc 1 204 12 shl.b64 %r82,%r81,3; .loc 1 205 22 ld.u64 %r84,[%frame]; .loc 1 205 12 shl.b64 %r85,%r84,4; .loc 1 207 32 ld.u64 %r122,[%frame+240]; .loc 1 214 23 mul.lo.u64 %r328,%r81,%r122; .loc 1 214 9 shl.b64 %r329,%r328,3; neg.s64 %r166,%r329; .loc 1 215 23 mul.lo.u64 %r330,%r84,%r122; .loc 1 215 9 shl.b64 %r331,%r330,4; neg.s64 %r71,%r331; setp.gt.s64 %r511,%r132,0; setp.le.s64 %r512,%r130,1; .loc 1 180 9 setp.ne.u32 %r516,%r205,0; add.u64 %r517,%r68,-1; cvt.u32.u32 %r518,%r22; cvt.s64.s8 %r519,%r518; add.u64 %r520,%r519,-1; add.u64 %r521,%frame,120; add.u64 %r522,%frame,240; bra $L25; $L24: add.u64 %r158,%frame,360; add.u64 %r157,%r202,40; mov.u64 %r153,%frame; cvt.u32.u32 %r333,%r22; cvt.s64.s8 %r332,%r333; add.u64 %r116,%r332,-1; .loc 1 132 3 mov.u64 %r149,0; add.u64 %r513,%frame,240; .loc 1 134 16 mov.u64 %r334,%r149; $L26: st.u64 [%r158],%r334; .loc 1 135 18 ld.u64 %r335,[%r157]; st.u64 [%r153],%r335; .loc 1 136 17 shl.b64 %r337,%r149,3; add.u64 %r338,%r513,%r337; .loc 1 136 10 ld.u64 %r339,[%r338]; setp.le.s64 %r340,%r339,0; @ %r340 bra $L1; .loc 1 132 26 add.u64 %r149,%r149,1; .loc 1 132 3 add.u64 %r158,%r158,8; add.u64 %r157,%r157,24; add.u64 %r153,%r153,8; setp.ne.u64 %r341,%r116,%r149; @ %r341 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r511 bra $L47; .loc 1 159 10 mov.u64 %r343,0; st.u64 [%r148],%r343; st.u64 [%r148+8],%r343; bra $L29; $L47: mov.u64 %r138,%r147; .loc 1 168 17 mov.u64 %r156,0; $L28: .loc 1 170 7 ld.f64 %r119,[%r138]; mov.u64 %r199,%r156; .loc 1 168 32 add.u64 %r156,%r156,1; .loc 1 170 6 setp.le.f64 %r345,%r119,0d7ff0000000000000; @ ! 
%r345 bra $L65; .loc 1 173 14 mov.u64 %r473,%r156; shr.s64 %r474,%r473,63; .loc 1 180 9 @ %r516 bra $L32; .loc 1 190 8 setp.gt.s64 %r348,%r68,%r199; @ %r348 bra $L33; bra $L34; $L32: .loc 1 181 8 setp.le.s64 %r349,%r68,%r199; @ %r349 bra $L34; add.u64 %r481,%r199,1; shr.s64 %r482,%r481,63; sub.u64 %r353,%r517,%r199; shr.s64 %r356,%r199,63; add.u64 %r359,%r199,2; set.u32.lt.u64 %r364,%r359,%r199; cvt.s64.s32 %r362,%r364; sub.u64 %r368,%r356,%r362; add.u64 %r370,%r353,%r359; set.u32.lt.u64 %r375,%r370,%r353; cvt.s64.s32 %r373,%r375; sub.u64 %r379,%r368,%r373; .loc 1 170 7 mov.f64 %r194,%r119; .loc 1 181 8 mov.u64 %r493,1; bra $L35; $L65: .loc 1 168 40 add.u64 %r138,%r138,%r70; .loc 1 168 10 setp.gt.s64 %r380,%r68,%r156; @ %r380 bra $L28; .loc 1 157 9 mov.u64 %r473,1; mov.u64 %r474,0; bra $L34; $L35: .loc 1 183 8 setp.ge.f64 %r381,%r119,%r194; .loc 1 186 16 selp.u64 %r473,%r481,%r473,%r381; selp.u64 %r474,%r482,%r474,%r381; selp.f64 %r119,%r194,%r119,%r381; .loc 1 181 33 add.u64 %r138,%r138,%r70; .loc 1 181 8 add.u64 %r384,%r481,%r493; set.u32.lt.u64 %r389,%r384,%r481; cvt.s64.s32 %r387,%r389; mov.u64 %r481,%r384; sub.u64 %r482,%r482,%r387; setp.ne.u64 %r396,%r370,%r481; @ %r396 bra $L50; setp.ne.u64 %r399,%r379,%r482; @ ! %r399 bra $L34; $L50: .loc 1 183 9 ld.f64 %r194,[%r138]; bra $L35; $L33: add.u64 %r477,%r199,1; shr.s64 %r478,%r477,63; sub.u64 %r403,%r517,%r199; shr.s64 %r406,%r199,63; add.u64 %r409,%r199,2; set.u32.lt.u64 %r414,%r409,%r199; cvt.s64.s32 %r412,%r414; sub.u64 %r418,%r406,%r412; add.u64 %r420,%r403,%r409; set.u32.lt.u64 %r425,%r420,%r403; cvt.s64.s32 %r423,%r425; sub.u64 %r429,%r418,%r423; .loc 1 190 8 mov.u64 %r507,1; $L60: .loc 1 192 9 ld.f64 %r75,[%r138]; .loc 1 192 8 setp.lt.f64 %r430,%r75,%r119; .loc 1 195 16 selp.u64 %r473,%r477,%r473,%r430; selp.u64 %r474,%r478,%r474,%r430; selp.f64 %r119,%r75,%r119,%r430; .loc 1 190 33 add.u64 %r138,%r138,%r70; .loc 1 190 8 add.u64 %r433,%r477,%r507; set.u32.lt.u64 %r438,%r433,%r477; cvt.s64.s32 %r436,%r438; mov.u64 %r477,%r433; sub.u64 %r478,%r478,%r436; setp.ne.u64 %r445,%r420,%r477; @ %r445 bra $L60; setp.ne.u64 %r448,%r429,%r478; @ %r448 bra $L60; $L34: .loc 1 199 12 st.u64 [%r148],%r473; st.u64 [%r148+8],%r474; $L29: .loc 1 203 15 add.u64 %r161,%r161,1; .loc 1 204 12 add.u64 %r147,%r147,%r82; .loc 1 205 12 add.u64 %r148,%r148,%r85; .loc 1 207 13 setp.ne.u64 %r451,%r161,%r122; @ %r451 bra $L25; .loc 1 214 9 add.u64 %r144,%r147,%r166; .loc 1 215 9 add.u64 %r145,%r148,%r71; .loc 1 217 7 @ %r512 bra $L1; add.u64 %r129,%frame,368; mov.u64 %r190,8; .loc 1 216 5 mov.u64 %r146,1; .loc 1 211 13 mov.u64 %r515,0; bra $L44; $L45: st.u64 [%r129],%r515; .loc 1 214 23 mul.lo.u64 %r457,%r97,%r96; .loc 1 214 9 shl.b64 %r458,%r457,3; sub.u64 %r144,%r147,%r458; .loc 1 215 23 mul.lo.u64 %r459,%r100,%r96; .loc 1 215 9 shl.b64 %r460,%r459,4; sub.u64 %r145,%r148,%r460; .loc 1 216 5 add.u64 %r146,%r146,1; .loc 1 217 7 add.u64 %r129,%r129,8; add.u64 %r190,%r190,8; setp.eq.u64 %r461,%r146,%r520; @ %r461 bra $L1; $L44: .loc 1 225 16 ld.u64 %r462,[%r129]; add.u64 %r96,%r462,1; st.u64 [%r129],%r96; .loc 1 226 23 add.u64 %r464,%r521,%r190; ld.u64 %r97,[%r464]; .loc 1 226 13 shl.b64 %r465,%r97,3; add.u64 %r147,%r144,%r465; .loc 1 227 23 add.u64 %r466,%frame,%r190; ld.u64 %r100,[%r466]; .loc 1 227 13 shl.b64 %r467,%r100,4; add.u64 %r148,%r145,%r467; .loc 1 207 32 add.u64 %r469,%r522,%r190; ld.u64 %r103,[%r469]; .loc 1 207 13 setp.eq.u64 %r470,%r96,%r103; @ %r470 bra $L45; .loc 1 211 13 mov.u64 %r161,0; bra $L25; $L64: .loc 1 90 6 ld.u64 %r471,[%r202]; 
// minloc1_16_r8.o (nvptx PTX from libgfortran/generated/minloc1_16_r8.c; partially corrupted in this dump):
// ...end of _gfortran_minloc1_16_r8 — MINLOC along one dimension of a REAL(8) array, INTEGER(16) result.
// _gfortran_mminloc1_16_r8 (.visible .func): ARRAY-valued MASK variant. A null MASK descriptor defers to
//   _gfortran_minloc1_16_r8; otherwise DIM is validated against the array rank (_gfortran_runtime_error on
//   failure), the result descriptor is set up (allocated through _gfortrani_xmallocarray when unallocated,
//   with optional checks via _gfortrani_bounds_ifunction_return / _gfortrani_bounds_equal_extents), and each
//   column is scanned, honouring the BACK= flag, for the 1-based position of its smallest masked element
//   (0 when no element is selected).
// _gfortran_sminloc1_16_r8 (.visible .func): scalar-MASK variant. An absent or .true. mask defers to
//   _gfortran_minloc1_16_r8; a .false. mask yields a result of the proper shape filled with zeros.
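The masked per-column scan that the code above performs can be summarised in C. This is a minimal sketch only, under the assumption of flat pointers and explicit strides; the name minloc1_column_r8 and its parameters are illustrative, not the library's descriptor-based interface, and NaN handling as well as the BACK= argument are omitted.

#include <stdint.h>
#include <stddef.h>

typedef __int128 gfc_int16;   /* the INTEGER(16) result kind of _gfortran_mminloc1_16_r8 (GCC extension) */

/* Sketch: walk one column of a REAL(8) array and its logical mask with their
 * own strides; return the 1-based position of the smallest masked element,
 * or 0 when no element is selected. */
static gfc_int16
minloc1_column_r8(const double *src, ptrdiff_t sstride,
                  const int8_t *mask, ptrdiff_t mstride,
                  ptrdiff_t extent)
{
    gfc_int16 pos = 0;                    /* 0 = no element selected yet */
    double best = 0.0;
    for (ptrdiff_t i = 0; i < extent; i++) {
        if (mask[i * mstride]) {
            double v = src[i * sstride];
            if (pos == 0 || v < best) {   /* first masked element, or a new minimum */
                best = v;
                pos = (gfc_int16)(i + 1); /* Fortran positions are 1-based */
            }
        }
    }
    return pos;
}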
// Archive members: minloc1_4_r10.o, minloc1_8_r16.o, minval_i1.o.
// minval_i1.o (nvptx PTX from libgfortran/generated/minval_i1.c; partially corrupted in this dump):
// BEGIN GLOBAL FUNCTION DECL: _gfortran_minval_i1   .visible .func _gfortran_minval_i1
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mminval_i1  .visible .func _gfortran_mminval_i1
// BEGIN GLOBAL FUNCTION DECL: _gfortran_sminval_i1  .visible .func _gfortran_sminval_i1
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options   .extern .global .align 8 .u64
// Extern functions: _gfortran_runtime_error, _gfortrani_xmallocarray,
//   _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents.
// String constants (decoded from the ASCII byte arrays; "..." marks bytes missing from the dump):
//   "...incorrect in MINVAL intrinsic: is %ld, should be between 1 and %l..."
//   "...ank of return array incorrect in MINVAL intrinsic: is %ld, should be %l..."
//   "MINVAL", "...return valu...", "Funny sized logical array...",
//   "rank of return array incorrect in MINVAL intri...", "MASK argume...",
//   "Incorrect extent in return value of MINVAL intrinsic in dimension %ld: is %ld, should be %..."
// _gfortran_minval_i1 (.visible .func): MINVAL along one dimension of an INTEGER(1) array. The prologue
//   validates DIM against the array rank, reporting out-of-range values through _gfortran_runtime_error,
//   then walks the remaining dimensions of the source descriptor to build the result shape.
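The prologue check just described reduces to a small guard. The following C sketch is illustrative only: check_dim and runtime_error are stand-in names (the real reporter is the variadic _gfortran_runtime_error declared above), and the message wording approximates the truncated string constant listed above.

#include <stdio.h>
#include <stdlib.h>

/* Illustrative stand-in for the library's variadic error reporter. */
static void runtime_error(const char *fmt, long a, long b)
{
    fprintf(stderr, fmt, a, b);
    fputc('\n', stderr);
    exit(2);
}

/* Sketch of the DIM validation performed before the reduction starts:
 * DIM must lie in 1..rank of the source array.  The message text is an
 * approximation of the truncated constant recovered from the dump. */
static void check_dim(long dim, long rank)
{
    if (dim < 1 || dim > rank)
        runtime_error("DIM argument incorrect in MINVAL intrinsic: "
                      "is %ld, should be between 1 and %ld", dim, rank);
}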
// _gfortran_minval_i1, continued: each column of the source is then reduced with a strided loop whose
//   accumulator starts at 127 (HUGE(0_1), the identity of MIN for INTEGER(1)); the 8-bit elements are
//   widened and folded with min.s16, so an empty column yields 127, and the result pointer advances by
//   the output strides between columns.
// _gfortran_mminval_i1 (.visible .func): ARRAY-valued MASK variant. A null MASK descriptor defers to
//   _gfortran_minval_i1; otherwise the mask element size (1, 2, 4, 8 or 16 bytes) is checked — anything
//   else raises the "Funny sized logical array" error — and the same column reduction is performed,
//   folding in only those elements whose mask entry is non-zero.
// _gfortran_sminval_i1 (.visible .func): scalar-MASK variant. An absent or .true. mask defers to
//   _gfortran_minval_i1; a .false. mask fills every result element with 127.
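The unmasked per-column loop summarised above, with its starting value of 127, behaves like this minimal C sketch; minval_column_i1 and its flat pointer/stride parameters are illustrative names, not the exported descriptor-based entry point.

#include <stdint.h>
#include <stddef.h>

/* Sketch of the per-column MINVAL reduction for INTEGER(1): start from the
 * type's maximum (127, as in the PTX above) so an empty column yields
 * HUGE(0_1), then take the running minimum along the column. */
static int8_t
minval_column_i1(const int8_t *src, ptrdiff_t stride, ptrdiff_t extent)
{
    int8_t result = INT8_MAX;             /* 127 */
    for (ptrdiff_t i = 0; i < extent; i++) {
        int8_t v = src[i * stride];
        if (v < result)
            result = v;
    }
    return result;
}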
// Archive member: minval_i2.o.
// minval_i2.o (nvptx PTX from libgfortran/generated/minval_i2.c; partially corrupted in this dump):
// BEGIN GLOBAL FUNCTION DECL: _gfortran_minval_i2   .visible .func _gfortran_minval_i2
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mminval_i2  .visible .func _gfortran_mminval_i2
// BEGIN GLOBAL FUNCTION DECL: _gfortran_sminval_i2  .visible .func _gfortran_sminval_i2
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options   .extern .global .align 8 .u64
// Extern functions and string constants: the same set as in minval_i1.o above.
// _gfortran_minval_i2 (.visible .func): MINVAL along one dimension of an INTEGER(2) array; the DIM/rank
//   validation and result-shape setup mirror the INTEGER(1) routine.
// _gfortran_minval_i2, continued: each column is reduced with a strided loop whose accumulator starts at
//   32767 (HUGE(0_2)); element and result strides are scaled by the two-byte element size, and an empty
//   column stores 32767 directly.
// _gfortran_mminval_i2 (.visible .func): ARRAY-valued MASK variant; a null MASK descriptor defers to
//   _gfortran_minval_i2, otherwise the mask element size is checked and only masked elements are folded
//   into the minimum, exactly as in the INTEGER(1) case.
// _gfortran_sminval_i2 (.visible .func): scalar-MASK variant; an absent or .true. mask defers to
//   _gfortran_minval_i2, a .false. mask fills the result with the reduction identity, as in the
//   INTEGER(1) routine. ...
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r294,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r294,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 488 12 mov.u64 %r300,0; $L125: .loc 1 485 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 487 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 484 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 488 12 st.u64 [%r43],%r300; $L124: .loc 1 482 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 495 7 setp.eq.u64 %r211,%r295,0; @ %r211 bra $L127; add.u64 %r294,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r294; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 498 10 mov.u64 %r81,1; .loc 1 502 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 506 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 507 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 509 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 509 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 509 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 511 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 500 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 502 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 495 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 514 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 515 4 bra $L109; $L170: .loc 1 518 24 mov.u64 %r244,4; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 518 22 st.u64 [%r150],%r98; $L135: .loc 1 544 3 setp.ne.u64 %r247,%r295,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 522 19 ld.s8 %r55,[%r150+28]; .loc 1 522 10 setp.eq.u64 %r248,%r55,%r295; @ %r248 bra $L134; .loc 1 523 2 st.u64 [%stack+8],%r295; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 528 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 528 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 530 4 setp.eq.u64 %r254,%r295,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 530 10 mov.u64 %r104,0; $L137: .loc 1 534 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 534 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 535 18 ld.u64 %r63,[%r31]; .loc 1 530 25 add.u64 %r104,%r104,1; .loc 1 535 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 536 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 530 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: .loc 
1 546 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r295,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 547 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 544 26 add.u64 %r100,%r100,1; .loc 1 544 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r295,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 556 22 ld.u64 %r68,[%frame]; .loc 1 556 12 add.u64 %r69,%r68,%r68; .loc 1 558 32 ld.u64 %r59,[%frame+120]; .loc 1 565 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r293,%r295,1; .loc 1 554 13 mov.u16 %r279,32767; cvt.u32.u32 %r297,%r24; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; $L146: st.u16 [%r98],%r279; .loc 1 555 15 add.u64 %r80,%r80,1; .loc 1 556 12 add.u64 %r98,%r98,%r69; .loc 1 558 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L146; .loc 1 567 7 @ ! %r293 bra $L171; bra $L109; $L145: .loc 1 562 13 st.u64 [%r99],%r296; .loc 1 565 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 566 5 add.u64 %r97,%r97,1; .loc 1 567 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r299; @ %r283 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 565 23 mov.u64 %r72,%r44; .loc 1 566 5 mov.u64 %r97,1; .loc 1 562 13 mov.u64 %r296,0; $L144: .loc 1 571 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 572 23 ld.u64 %r75,[%r124]; .loc 1 572 13 sub.u64 %r289,%r75,%r72; add.u64 %r290,%r289,%r289; add.u64 %r98,%r98,%r290; .loc 1 558 32 ld.u64 %r77,[%r123]; .loc 1 558 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L145; .loc 1 562 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r292,%r98,0; @ ! 
[Return of _gfortran_sminval_i2, followed by the next object in the archive, minval_i4.o, generated from libgfortran's generated/minval_i4.c. It declares _gfortran_minval_i4, _gfortran_mminval_i4 and _gfortran_sminval_i4 and imports _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents. Its constant byte arrays decode to the usual MINVAL diagnostics: "… incorrect in MINVAL intrinsic: is %ld, should be between 1 and %l…", "rank of return array incorrect in MINVAL intrinsic: is %ld, should be %l…", "MINVAL", "return value", "Funny sized logical array", "MASK argument" and "Incorrect extent in return value of MINVAL intrinsic in dimension %ld: is %ld, should be %…". The garbled register declarations and prologue of _gfortran_minval_i4 follow: argument unpacking, DIM validation against the array rank (with a _gfortran_runtime_error call on failure) and the first extent-gathering loops over the source descriptor.]
[Body of _gfortran_minval_i4, still garbled: loops that copy extents and strides from the source descriptor into the frame, the result-allocation path (descriptor offset zeroed, rank byte stored at offset 28, element count accumulated and passed to _gfortrani_xmallocarray), the rank check on a caller-supplied return array (raising _gfortran_runtime_error, then _gfortrani_bounds_ifunction_return when bounds checking is enabled), and the core reduction: each destination element is seeded with 2147483647 (HUGE for INTEGER(4)) and reduced with min.s32 while the source pointer advances by the stride of the collapsed dimension, scaled for 4-byte elements (shl.b64 …,2). A rough C sketch of this pattern follows.]
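The reduction this kernel performs is easier to read in C than in the garbled PTX. The following is a rough, hypothetical sketch of the pattern only — it is not the generated libgfortran source the PTX was compiled from, the function name is invented, and flat pointer/stride arguments stand in for the real array descriptors.

  #include <stdint.h>

  /* Hypothetical sketch: reduce `len` elements spaced `stride` apart,
     seeding the accumulator with INT32_MAX just as the PTX above seeds
     its register with 2147483647. */
  static int32_t minval_i4_along_dim (const int32_t *src, int64_t len,
                                      int64_t stride)
  {
    int32_t result = INT32_MAX;   /* HUGE(0_4); also the value for an empty slice */
    for (int64_t i = 0; i < len; i++)
      {
        int32_t v = src[i * stride];
        if (v < result)
          result = v;
      }
    return result;
  }

The real routine runs one such inner loop per element of the result array, advancing the source and destination pointers by descriptor strides, which is what the surrounding outer loops in the PTX do.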
[End of _gfortran_minval_i4, then the masked variant _gfortran_mminval_i4, equally garbled. Its prologue falls back to the unmasked kernel when no MASK descriptor is passed; otherwise it validates DIM and rank, checks the MASK element size (1, 2, 4, 8 or 16 bytes, raising the "Funny sized logical array" error otherwise), gathers extents and strides for both ARRAY and MASK, allocates or validates the result (with _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents under bounds checking), and runs the same min.s32 reduction from a seed of 2147483647, skipping elements whose mask byte is zero. The span ends with the garbled register declarations and prologue of the scalar-mask variant _gfortran_sminval_i4; a sketch of the masked reduction pattern follows first.]
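For the masked kernel the only change visible in the PTX is the per-element mask test guarding the min update. Below is a rough, hypothetical C sketch of that pattern — again not the generated libgfortran source; the mask is shown as a plain byte array rather than a LOGICAL descriptor of kind 1/2/4/8/16, and the name is invented.

  #include <stdint.h>

  /* Hypothetical sketch of the masked reduction: elements whose mask byte
     is zero are skipped, mirroring the ld.s8/setp guard ahead of min.s32
     in the PTX above. */
  static int32_t mminval_i4_along_dim (const int32_t *src, const int8_t *mask,
                                       int64_t len, int64_t stride,
                                       int64_t mstride)
  {
    int32_t result = INT32_MAX;
    for (int64_t i = 0; i < len; i++)
      if (mask[i * mstride])
        {
          int32_t v = src[i * stride];
          if (v < result)
            result = v;
        }
    return result;
  }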
[Body of _gfortran_sminval_i4: when the scalar MASK is false it builds or checks the result descriptor (extent loops, _gfortrani_xmallocarray for an unallocated return array, rank and per-dimension extent checks raising _gfortran_runtime_error, _gfortrani_bounds_ifunction_return under bounds checking) and then fills every element with 2147483647; the absent-or-true MASK case was already dispatched to _gfortran_minval_i4 in the prologue. A minimal sketch of this dispatch follows.]
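As the prologue of _gfortran_sminval_i4 shows, the scalar-mask entry point does no reduction of its own: a missing or true MASK behaves like plain MINVAL, and only a false MASK takes the fill path. Here is a minimal, hypothetical sketch of that dispatch, with all descriptor plumbing omitted and every name invented.

  #include <stdbool.h>
  #include <stddef.h>
  #include <stdint.h>

  /* Hypothetical sketch of the scalar-mask wrapper: a false MASK only fills
     the result with HUGE(0_4), matching the stores of 2147483647 in the fill
     loop above; otherwise an ordinary minval reduction runs (one result
     element shown for brevity). */
  static void sminval_i4_sketch (int32_t *ret, int64_t ret_len,
                                 const int32_t *array, int64_t len,
                                 int64_t stride, const bool *mask)
  {
    if (mask != NULL && !*mask)
      {
        for (int64_t i = 0; i < ret_len; i++)
          ret[i] = INT32_MAX;
        return;
      }
    int32_t result = INT32_MAX;
    for (int64_t i = 0; i < len; i++)
      if (array[i * stride] < result)
        result = array[i * stride];
    ret[0] = result;
  }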
[Return of _gfortran_sminval_i4, followed by the next object, minval_i8.o, generated from libgfortran's generated/minval_i8.c. It declares _gfortran_minval_i8, _gfortran_mminval_i8 and _gfortran_sminval_i8, imports the same five externals as minval_i4.o, and carries the same decoded MINVAL diagnostic strings. The garbled register declarations and prologue of _gfortran_minval_i8 follow: argument unpacking, DIM validation with _gfortran_runtime_error, and the extent-gathering loops over the source descriptor.]
[Body and epilogue of _gfortran_minval_i8: the same structure as the 32-bit kernel, but with 8-byte elements (shl.b64 …,3 stride scaling), a seed of 9223372036854775807 (HUGE for INTEGER(8)) and a min.s64 reduction; result allocation again goes through _gfortrani_xmallocarray and bounds checking through _gfortrani_bounds_ifunction_return. The span ends with the garbled declarations and prologue of the masked variant _gfortran_mminval_i8, which likewise falls back to _gfortran_minval_i8 when no MASK descriptor is supplied and checks the MASK element size before proceeding.]
[Body of _gfortran_mminval_i8: extent and stride gathering for ARRAY and MASK, result allocation or validation (with _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents under bounds checking), and the masked min.s64 reduction seeded with 9223372036854775807, skipping elements whose mask byte is zero. The span ends with the garbled declarations and prologue of _gfortran_sminval_i8, which dispatches to _gfortran_minval_i8 for an absent or true scalar MASK.]
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r293,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r293,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 488 12 mov.u64 %r299,0; $L125: .loc 1 485 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 487 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 484 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 488 12 st.u64 [%r43],%r299; $L124: .loc 1 482 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 495 7 setp.eq.u64 %r211,%r294,0; @ %r211 bra $L127; add.u64 %r293,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r293; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 498 10 mov.u64 %r81,1; .loc 1 502 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 506 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 507 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 509 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 509 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 509 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 511 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 500 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 502 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 495 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 514 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 515 4 bra $L109; $L170: .loc 1 518 24 mov.u64 %r24496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 518 22 st.u64 [%r150],%r98; $L135: .loc 1 544 3 setp.ne.u64 %r247,%r294,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 522 19 ld.s8 %r55,[%r150+28]; .loc 1 522 10 setp.eq.u64 %r248,%r55,%r294; @ %r248 bra $L134; .loc 1 523 2 st.u64 [%stack+8],%r294; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 528 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 528 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 530 4 setp.eq.u64 %r254,%r294,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 530 10 mov.u64 %r104,0; $L137: .loc 1 534 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 534 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 535 18 ld.u64 %r63,[%r31]; .loc 1 530 25 add.u64 %r104,%r104,1; .loc 1 535 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 536 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 530 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: 
.loc 1 546 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r294,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 547 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 544 26 add.u64 %r100,%r100,1; .loc 1 544 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r294,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 556 22 ld.u64 %r68,[%frame]; .loc 1 556 12 shl.b64 %r69,%r68,3; .loc 1 558 32 ld.u64 %r59,[%frame+120]; .loc 1 565 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r292,%r294,1; .loc 1 554 13 mov.u64 %r278,9223372036854775807; cvt.u32.u32 %r296,%r24; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; $L146: st.u64 [%r98],%r278; .loc 1 555 15 add.u64 %r80,%r80,1; .loc 1 556 12 add.u64 %r98,%r98,%r69; .loc 1 558 13 setp.ne.u64 %r279,%r59,%r80; @ %r279 bra $L146; .loc 1 567 7 @ ! %r292 bra $L171; bra $L109; $L145: .loc 1 562 13 st.u64 [%r99],%r295; .loc 1 565 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 566 5 add.u64 %r97,%r97,1; .loc 1 567 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r282,%r97,%r298; @ %r282 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 565 23 mov.u64 %r72,%r44; .loc 1 566 5 mov.u64 %r97,1; .loc 1 562 13 mov.u64 %r295,0; $L144: .loc 1 571 16 ld.u64 %r287,[%r99]; add.u64 %r74,%r287,1; st.u64 [%r99],%r74; .loc 1 572 23 ld.u64 %r75,[%r124]; .loc 1 572 13 sub.u64 %r288,%r75,%r72; shl.b64 %r289,%r288,3; add.u64 %r98,%r98,%r289; .loc 1 558 32 ld.u64 %r77,[%r123]; .loc 1 558 13 setp.eq.u64 %r290,%r74,%r77; @ %r290 bra $L145; .loc 1 562 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r291,%r98,0; @ ! 
[Unrecoverable binary payload: nvptx PTX assembly extracted from libgfortran's minval_i16.o (source reference "fortran/generated/minval_i16.c"). Recoverable from the residue: the tail of _gfortran_sminval_i8; global function declarations for _gfortran_minval_i16, _gfortran_mminval_i16 and _gfortran_sminval_i16; external declarations for _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents; and error-message string constants stored as byte arrays, which decode to fragments such as "... incorrect in MINVAL intrinsic: is %ld, should be between 1 and %...", "rank of return array incorrect in MINVAL intrinsic: is %ld, should be %l...", "MINVAL", "...return valu...", "Funny sized logical array", "MASK argumen..." and "Incorrect extent in return value of MINVAL intrinsic in dimension %ld: is %ld, should be %...". The function bodies themselves (a 128-bit integer reduction carried as pairs of 64-bit registers, seeded with the pair -1 / 9223372036854775807) are too damaged to reproduce verbatim.]
[Unrecoverable binary payload: nvptx PTX assembly extracted from libgfortran's minval_r4.o (source reference "fortran/generated/minval_r4.c"). Recoverable from the residue: the tail of _gfortran_sminval_i16; declarations of _gfortran_minval_r4, _gfortran_mminval_r4 and _gfortran_sminval_r4, together with the same external declarations and MINVAL error strings listed above; and the bodies of _gfortran_minval_r4 and _gfortran_mminval_r4, which implement the single-precision reduction with explicit empty-slice and NaN handling, visible in the float literals 0f7f7fffff (FLT_MAX), 0f7f800000 (+Inf) and 0f7fc00000 (quiet NaN). The span ends partway through _gfortran_sminval_r4; the instruction stream is too damaged by extraction to reproduce verbatim.]
%r189 bra $L134; bra $L133; $L127: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r293,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r293,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 488 12 mov.u64 %r299,0; $L137: .loc 1 485 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 487 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L135; .loc 1 484 17 st.u64 [%r43],%r41; bra $L136; $L135: .loc 1 488 12 st.u64 [%r43],%r299; $L136: .loc 1 482 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L137; bra $L181; $L133: .loc 1 495 7 setp.eq.u64 %r211,%r294,0; @ %r211 bra $L139; add.u64 %r293,%frame,120; $L159: add.u64 %r111,%r150,40; mov.u64 %r107,%r293; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 498 10 mov.u64 %r81,1; .loc 1 502 4 mov.u64 %r237,0; bra $L140; $L139: .loc 1 506 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 507 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 509 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 509 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 509 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 511 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L141; bra $L182; $L143: .loc 1 500 48 mul.lo.u64 %r81,%r46,%r81; $L140: .loc 1 502 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 495 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L143; bra $L139; $L141: .loc 1 514 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 515 4 bra $L121; $L182: .loc 1 518 2496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 518 22 st.u64 [%r150],%r98; $L147: .loc 1 544 3 setp.ne.u64 %r247,%r294,0; @ %r247 bra $L144; bra $L145; $L134: .loc 1 522 19 ld.s8 %r55,[%r150+28]; .loc 1 522 10 setp.eq.u64 %r248,%r55,%r294; @ %r248 bra $L146; .loc 1 523 2 st.u64 [%stack+8],%r294; st.u64 [%stack],%r55;_gfortran_runtime_error146: .loc 1 528 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 528 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L147; .loc 1 530 4 setp.eq.u64 %r254,%r294,0; @ %r254 bra $L145; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 530 10 mov.u64 %r104,0; $L149: .loc 1 534 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 534 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 535 18 ld.u64 %r63,[%r31]; .loc 1 530 25 add.u64 %r104,%r104,1; .loc 1 535 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L148; .loc 1 536 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error148: .loc 1 530 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L149; $L144: .loc 1 546 16 
cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L150; shl.b64 %r264,%r294,3; bra $L151; $L150: mov.u64 %r264,8; $L151: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L152: .loc 1 547 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 544 26 add.u64 %r100,%r100,1; .loc 1 544 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r294,%r100; @ %r277 bra $L152; $L145: ld.u64 %r80,[%frame+240]; .loc 1 556 22 ld.u64 %r68,[%frame]; .loc 1 556 12 shl.b64 %r69,%r68,2; .loc 1 558 32 ld.u64 %r59,[%frame+120]; .loc 1 565 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r292,%r294,1; .loc 1 554 13 mov.f32 %r278,0f7f7fffff; cvt.u32.u32 %r296,%r24; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; $L158: st.f32 [%r98],%r278; .loc 1 555 15 add.u64 %r80,%r80,1; .loc 1 556 12 add.u64 %r98,%r98,%r69; .loc 1 558 13 setp.ne.u64 %r279,%r59,%r80; @ %r279 bra $L158; .loc 1 567 7 @ ! %r292 bra $L183; bra $L121; $L157: .loc 1 562 13 st.u64 [%r99],%r295; .loc 1 565 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 566 5 add.u64 %r97,%r97,1; .loc 1 567 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r282,%r97,%r298; @ %r282 bra $L156; bra $L121; $L183: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 565 23 mov.u64 %r72,%r44; .loc 1 566 5 mov.u64 %r97,1; .loc 1 562 13 mov.u64 %r295,0; $L156: .loc 1 571 16 ld.u64 %r287,[%r99]; add.u64 %r74,%r287,1; st.u64 [%r99],%r74; .loc 1 572 23 ld.u64 %r75,[%r124]; .loc 1 572 13 sub.u64 %r288,%r75,%r72; shl.b64 %r289,%r288,2; add.u64 %r98,%r98,%r289; .loc 1 558 32 ld.u64 %r77,[%r123]; .loc 1 558 13 setp.eq.u64 %r290,%r74,%r77; @ %r290 bra $L157; .loc 1 562 13 mov.u64 %r80,0; bra $L158; $L181: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r291,%r98,0; @ ! 
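The scalar-mask wrapper summarized above follows a simple shape: no mask or a scalar .true. mask defers to the plain reduction, while a scalar .false. mask only shapes the result and fills it with HUGE of the type. Below is a minimal, hedged C sketch of that control flow; minval_r4_unmasked and sminval_r4_sketch are invented names standing in for the real entry points, and descriptor/allocation handling is omitted.

    #include <float.h>
    #include <stddef.h>
    #include <stdio.h>

    /* Stand-in for the unmasked reduction the wrapper would normally call. */
    static float minval_r4_unmasked(const float *a, size_t n)
    {
        float result = FLT_MAX;            /* HUGE(0.0_4), i.e. 0f7f7fffff */
        for (size_t i = 0; i < n; i++)
            if (a[i] < result)
                result = a[i];
        return result;
    }

    /* Sketch of the scalar-mask wrapper: NULL or .true. mask -> real work,
     * scalar .false. mask -> the result is simply HUGE of the element type. */
    static float sminval_r4_sketch(const float *a, size_t n, const int *mask)
    {
        if (mask == NULL || *mask)
            return minval_r4_unmasked(a, n);
        return FLT_MAX;
    }

    int main(void)
    {
        float a[4] = { 2.5f, -1.0f, 4.0f, 0.5f };
        int mask_false = 0;
        printf("%g %g\n", sminval_r4_sketch(a, 4, NULL),          /* -1 */
                          sminval_r4_sketch(a, 4, &mask_false));  /* 3.40282e+38 */
        return 0;
    }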
[minval_r8.o — PTX object for the REAL(8) MINVAL intrinsic, compiled from fortran/generated/minval_r8.c. Visible functions: _gfortran_minval_r8, _gfortran_mminval_r8, _gfortran_sminval_r8. External declarations: _gfortrani_compile_options (.global .align 8 .u64), _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents. Constant strings, decoded from their byte-array form (leading words truncated in the dump): "... incorrect in MINVAL intrinsic: is %ld, should be between 1 and %...", "rank of return array incorrect in MINVAL intrinsic: is %ld, should be %l...", "Incorrect extent in return value of MINVAL intrinsic in dimension %ld: is %ld, should be %...", "Funny sized logical array", plus the fragments "MINVAL", "return valu", "MASK argume". The three bodies mirror the REAL(4) versions above with 64-bit strides and the REAL(8) constants 0d7fefffffffffffff (HUGE), 0d7ff0000000000000 (+Inf) and 0d7ff8000000000000 (quiet NaN): _gfortran_minval_r8 reduces along the DIM dimension, _gfortran_mminval_r8 applies a LOGICAL mask array element-wise, and _gfortran_sminval_r8 handles a scalar mask by either calling _gfortran_minval_r8 or filling the result with HUGE(0.d0).]
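The NaN/HUGE constants visible in the masked REAL(8) kernel suggest the usual MINVAL mask semantics: masked-out elements are skipped, an all-NaN selection yields a quiet NaN, and an empty selection yields HUGE of the type. The following standalone C sketch illustrates that inner loop under those assumptions; masked_minval_1d is an invented name and this is not the libgfortran source.

    #include <float.h>
    #include <math.h>
    #include <stddef.h>
    #include <stdio.h>

    /* Masked MINVAL over one dimension: skip mask-false elements, return a
     * quiet NaN (0d7ff8000000000000) if every selected element is NaN, and
     * HUGE(0.d0) (0d7fefffffffffffff) if nothing is selected at all. */
    static double masked_minval_1d(const double *a, const signed char *mask, size_t n)
    {
        double result = DBL_MAX;   /* value for an empty selection */
        int selected = 0;          /* did any element pass the mask? */
        int non_nan  = 0;          /* was any selected element not NaN? */

        for (size_t i = 0; i < n; i++) {
            if (!mask[i])
                continue;
            selected = 1;
            if (isnan(a[i]))
                continue;
            if (!non_nan || a[i] < result)
                result = a[i];
            non_nan = 1;
        }
        if (selected && !non_nan)
            return NAN;            /* every selected element was NaN */
        return result;
    }

    int main(void)
    {
        double a[5]      = { 3.0, -1.0, NAN, -4.25, 2.0 };
        signed char m[5] = { 1, 0, 1, 1, 1 };   /* -1.0 is masked out */
        printf("masked minval = %g\n", masked_minval_1d(a, m, 5));  /* -4.25 */
        return 0;
    }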
[Tail of _gfortran_sminval_r8, the archive member name minval_r10.o (whose contents do not survive in this dump), then product_i1.o — the PTX object for the INTEGER(1) PRODUCT intrinsic, compiled from fortran/generated/product_i1.c. Visible functions: _gfortran_product_i1, _gfortran_mproduct_i1, _gfortran_sproduct_i1. External declarations: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents. Constant strings, decoded from their byte-array form (leading words truncated): "... incorrect in PRODUCT intrinsic: is %ld, should be between 1 and %...", "rank of return array incorrect in PRODUCT intrinsic: is %ld, should be %l...", "Incorrect extent in return value of PRODUCT intrinsic in dimension %ld: is %ld, should be %...", "Funny sized logical array", plus the fragments "PRODUCT", "return valu", "MASK argume". The bodies follow the same descriptor-walking pattern as the MINVAL objects, accumulating the 8-bit product with mul.lo.u16 and truncating it with cvt.s32.s8 after each step; _gfortran_product_i1 reduces along DIM, _gfortran_mproduct_i1 applies a LOGICAL mask array, and _gfortran_sproduct_i1 handles a scalar mask (its listing is cut off mid-body).]
%r262,%r291,3; bra $L139; $L138: mov.u64 %r262,8; $L139: add.u64 %r266,%frame,240; mov.u32 %r27026272,[%value_in]; } add.u64 %r111,%r148,40; mov.u64 %r109,%frame; mov.u64 %r98,0; $L140: .loc 1 505 18 ld.u64 %r274,[%r111]; st.u64 [%r109],%r274; .loc 1 502 26 add.u64 %r98,%r98,1; .loc 1 502 3 add.u64 %r111,%r111,24; add.u64 %r109,%r109,8; setp.gt.s64 %r275,%r291,%r98; @ %r275 bra $L140; $L133: ld.u64 %r103,[%frame+240]; .loc 1 514 22 ld.u64 %r69,[%frame]; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r81,%r59,%r69; setp.le.s64 %r289,%r291,1; .loc 1 512 13 mov.u32 %r276,1; cvt.u32.u32 %r293,%r24; cvt.s64.s8 %r294,%r293; add.u64 %r295,%r294,-1; $L146: st.u8 [%r96],%r276; .loc 1 513 15 add.u64 %r103,%r103,1; .loc 1 514 12 add.u64 %r96,%r96,%r69; .loc 1 516 13 setp.ne.u64 %r277,%r59,%r103; @ %r277 bra $L146; .loc 1 525 7 @ ! %r289 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r97],%r292; .loc 1 523 23 mul.lo.u64 %r71,%r74,%r73; .loc 1 524 5 add.u64 %r95,%r95,1; .loc 1 525 7 add.u64 %r97,%r97,8; add.u64 %r119,%r119,8; add.u64 %r118,%r118,8; setp.ne.u64 %r280,%r95,%r295; @ %r280 bra $L144; bra $L109; $L171: add.u64 %r97,%frame,248; add.u64 %r119,%frame,8; add.u64 %r118,%frame,128; .loc 1 523 23 mov.u64 %r71,%r81; .loc 1 524 5 mov.u64 %r95,1; .loc 1 520 13 mov.u64 %r292,0; $L144: .loc 1 529 16 ld.u64 %r285,[%r97]; add.u64 %r73,%r285,1; st.u64 [%r97],%r73; .loc 1 530 23 ld.u64 %r74,[%r119]; .loc 1 530 13 sub.u64 %r286,%r74,%r71; add.u64 %r96,%r96,%r286; .loc 1 516 32 ld.u64 %r77,[%r118]; .loc 1 516 13 setp.eq.u64 %r287,%r73,%r77; @ %r287 bra $L145; .loc 1 520 13 mov.u64 %r103,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r96,[%r148]; .loc 1 449 6 setp.eq.u64 %r288,%r96,0; @ ! %r288 bra $L122; bra $L147; $L109: .loc 1 534 product_i2.o/_gfortran_product_i2 .visible .func _gfortran_product_i2fortran/generated/product_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mproduct_i2 .visible .func _gfortran_mproduct_i2gfortran_sproduct_i2 .visible .func _gfortran_sproduct_i2VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,108] = 
{80,82,79,68,85,67,84114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_product_i2 .visible .func _gfortran_product_i248predu64u64 %r229; .reg .u64 %r230; .reg .predpred %r255; .reg .predu32u16u64predu64pred %r322; .reg .predpred %r330; .reg .u16 %r331; .reg .predpred %r357; .reg .u32 %r359; .reg .u64u64 %r366; .reg .u64 %r367; mov.u64 %r182,%ar0; mov.u64 %r183,%ar1; mov.u64 %r184,%ar2; .loc 1 55 10 ld.s8 %r22,[%r183+28]; .loc 1 55 38 add.u32 %r185,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r107,%r185; .loc 1 56 10 ld.u64 %r25,[%r184]; .loc 1 56 7 add.u64 %r108,%r25,-1; .loc 1 58 7 shr.u64 %r187,%r108,63; set.u32.lt.s64 %r189,%r107,%r108; neg.s32 %r190,%r189; cvt.u16.u64 %r193,%r187; cvt.u16.u32 %r194,%r190; or.b16 %r192,%r193,%r194; .loc 1 58 6 cvt.u32.u16 %r195,%r192; cvt.u16.u8 %r196,%r195; setp.eq.u16 %r197,%r196,0; @ %r197 bra $L2; .loc 1 60 7 cvt.u32.u32 %r200,%r22; cvt.s64.s8 %r199,%r200; st.u64 [%stack+8],%r199_gfortran_runtime_errorr203,%r108,%r108; add.u64 %r204,%r203,%r108; shl.b64 %r205,%r204,3; add.u64 %r206,%r183,%r205; ld.u64 %r31,[%r206+56]; ld.u64 %r33,[%r206+48]; .loc 1 68 9 ld.u64 %r111,[%r206+40]; .loc 1 70 3 setp.ne.u64 %r220,%r108,0; @ %r220 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r221,%r107,%r108; @ %r221 bra $L4; bra $L46; $L3: add.u64 %r97,%r183,40; add.u64 %r61,%frame,120; add.u64 %r59,%frame,240; add.u64 %r222,%r183,16; add.u64 %r224,%r25,%r25; add.u64 %r225,%r224,%r25; shl.b64 %r226,%r225,3; add.u64 %r180,%r222,%r226; .loc 1 76 12 mov.u64 %r367,0; $L8: .loc 1 72 18 ld.u64 %r227,[%r97]; st.u64 [%r61],%r227; .loc 1 73 19 ld.u64 %r229,[%r97+16]; add.u64 %r228,%r229,1; ld.u64 %r230,[%r97+8]; sub.u64 %r38,%r228,%r230; .loc 1 75 10 setp.lt.s64 %r231,%r38,0; @ %r231 bra $L6; .loc 1 73 17 st.u64 [%r59],%r38; bra $L7; $L6: .loc 1 76 12 st.u64 [%r59],%r367; $L7: .loc 1 70 3 add.u64 %r97,%r97,24; add.u64 %r61,%r61,8; add.u64 %r59,%r59,8; setp.ne.u64 %r233,%r97,%r180; @ %r233 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r234,[%r182]; setp.eq.u64 %r235,%r234,0; @ ! 
%r235 bra $L11; bra $L10; $L4: add.u64 %r237,%r25,%r25; add.u64 %r238,%r237,%r25; shl.b64 %r239,%r238,3; add.u64 %r240,%r239,40; add.u64 %r150,%r183,%r240; shl.b64 %r241,%r25,3; add.u64 %r133,%r241,-8; add.u64 %r355,%frame,120; add.u64 %r138,%r355,%r133; add.u64 %r354,%frame,240; add.u64 %r130,%r354,%r133; cvt.u32.u32 %r245,%r22; cvt.s64.s8 %r244,%r245; add.u64 %r247,%r244,%r244; add.u64 %r248,%r247,%r244; shl.b64 %r249,%r248,3; add.u64 %r250,%r183,40; add.u64 %r100,%r249,%r250; .loc 1 84 12 mov.u64 %r366,0; $L14: .loc 1 80 18 ld.u64 %r251,[%r150]; st.u64 [%r138],%r251; .loc 1 81 19 ld.u64 %r253,[%r150+16]; add.u64 %r252,%r253,1; ld.u64 %r254,[%r150+8]; sub.u64 %r44,%r252,%r254; .loc 1 83 10 setp.lt.s64 %r255,%r44,0; @ %r255 bra $L12; .loc 1 81 17 st.u64 [%r130],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r130],%r366; $L13: .loc 1 78 3 add.u64 %r150,%r150,24; add.u64 %r138,%r138,8; add.u64 %r130,%r130,8; setp.ne.u64 %r257,%r100,%r150; @ %r257 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r258,%r107,0; @ %r258 bra $L16; add.u64 %r354,%frame,240; $L34: add.u64 %r159,%r182,40; mov.u64 %r157,%r354; add.u64 %r259,%r182,16; cvt.u32.u32 %r261,%r22; cvt.s64.s8 %r260,%r261; add.u64 %r263,%r260,%r260; add.u64 %r264,%r263,%r260; shl.b64 %r265,%r264,3; add.u64 %r151,%r259,%r265; .loc 1 94 10 mov.u64 %r101,1; .loc 1 98 4 mov.u64 %r288,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r266,0; st.u64 [%r182+8],%r266; .loc 1 103 28 cvt.u16.u32 %r269,%r22; add.u16 %r268,%r269,-1; cvt.u32.u16 %r270,%r268; st.u8 [%r182+28],%r270; .loc 1 105 20 add.u32 %r271,%r22,-2; cvt.s64.s32 %r52,%r271; add.u64 %r273,%r52,%r52; add.u64 %r274,%r273,%r52; shl.b64 %r275,%r274,3; add.u64 %r276,%r182,%r275; .loc 1 105 67 shl.b64 %r278,%r52,3; add.u64 %r279,%frame,%r278; .loc 1 105 59 ld.u64 %r281,[%r276+40]; ld.u64 %r282,[%r279+240]; mul.lo.u64 %r113,%r281,%r282; .loc 1 107 29 mov.u64 %r284,11(%value_in),_gfortrani_xmallocarray285,[%value_in]; } .loc 1 107 27 st.u64 [%r182],%r285; .loc 1 108 10 setp.eq.u64 %r287,%r113,0; @ ! 
%r287 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r101,%r48,%r101; $L17: .loc 1 98 4 st.u64 [%r159+8],%r288; ld.u64 %r48,[%r157]; add.u64 %r289,%r48,-1; st.u64 [%r159+16],%r289; st.u64 [%r159],%r101; .loc 1 91 7 add.u64 %r159,%r159,24; add.u64 %r157,%r157,8; setp.ne.u64 %r290,%r151,%r159; @ %r290 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r182+48],%r113; mov.u64 %r292,-1; st.u64 [%r182+56],%r292; mov.u64 %r293,1; st.u64 [%r182+40],%r293; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r182+28]; .loc 1 118 10 setp.eq.u64 %r294,%r58,%r107; @ %r294 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r107; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r297,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r298,[%r297+36]; setp.eq.u32 %r299,%r298,0; @ %r299 bra $L21; .loc 1 125 2 add.u64 %r354,%frame,240; cvta.const.u64 %r303,$LC2r354302303; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r305,%r107,0; @ %r305 bra $L24; $L27: .loc 1 65 9 add.u64 %r306,%r31,1; .loc 1 65 7 sub.u64 %r109,%r306,%r33; max.s64 %r45,%r109,0; .loc 1 137 8 ld.u64 %r121,[%r183]; .loc 1 138 8 ld.u64 %r122,[%r182]; .loc 1 154 36 add.u64 %r69,%r111,%r111; ld.u64 %r142,[%frame+360]; .loc 1 166 22 ld.u64 %r71,[%frame+120]; .loc 1 166 12 add.u64 %r72,%r71,%r71; .loc 1 167 22 ld.u64 %r74,[%frame]; .loc 1 167 12 add.u64 %r75,%r74,%r74; .loc 1 169 32 ld.u64 %r129,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r310,%r71,%r129; .loc 1 176 9 add.u64 %r311,%r310,%r310; neg.s64 %r132,%r311; .loc 1 177 23 mul.lo.u64 %r312,%r74,%r129; .loc 1 177 9 add.u64 %r313,%r312,%r312; neg.s64 %r135,%r313; setp.gt.s64 %r356,%r109,0; setp.le.s64 %r357,%r107,1; cvt.u32.u32 %r359,%r22; cvt.s64.s8 %r360,%r359; add.u64 %r361,%r360,-1; add.u64 %r362,%frame,120; add.u64 %r363,%frame,240; .loc 1 173 13 mov.u64 %r364,0; bra $L25; $L24: add.u64 %r172,%frame,360; add.u64 %r171,%r182,40; mov.u64 %r169,%frame; cvt.u32.u32 %r315,%r22; cvt.s64.s8 %r314,%r315; add.u64 %r160,%r314,-1; .loc 1 129 3 mov.u64 %r123,0; add.u64 %r354,%frame,240; .loc 1 131 16 mov.u64 %r316,%r123; $L26: st.u64 [%r172],%r316; .loc 1 132 18 ld.u64 %r317,[%r171]; st.u64 [%r169],%r317; .loc 1 133 17 shl.b64 %r319,%r123,3; add.u64 %r320,%r354,%r319; .loc 1 133 10 ld.u64 %r321,[%r320]; setp.le.s64 %r322,%r321,0; @ %r322 bra $L1; .loc 1 129 26 add.u64 %r123,%r123,1; .loc 1 129 3 add.u64 %r172,%r172,8; add.u64 %r171,%r171,24; add.u64 %r169,%r169,8; setp.ne.u64 %r323,%r123,%r160; @ %r323 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r356 bra $L35; .loc 1 150 10 mov.u16 %r325,1; st.u16 [%r122],%r325; bra $L29; $L35: mov.u64 %r116,%r121; .loc 1 148 10 mov.u32 %r114,1; .loc 1 154 13 mov.u64 %r115,0; $L28: .loc 1 158 10 ld.u16 %r327,[%r116]; cvt.u16.u32 %r328,%r114; mul.lo.u16 %r326,%r327,%r328; cvt.s32.s16 %r114,%r326; .loc 1 154 28 add.u64 %r115,%r115,1; .loc 1 154 36 add.u64 %r116,%r116,%r69; .loc 1 154 6 setp.gt.s64 %r330,%r45,%r115; @ %r330 bra $L28; .loc 1 161 12 cvt.u16.u32 %r331,%r114; st.u16 [%r122],%r331; $L29: .loc 1 165 15 add.u64 %r142,%r142,1; .loc 1 166 12 add.u64 %r121,%r121,%r72; .loc 1 167 12 add.u64 %r122,%r122,%r75; .loc 1 169 13 setp.ne.u64 %r332,%r142,%r129; @ %r332 bra $L25; .loc 1 176 9 add.u64 %r118,%r121,%r132; .loc 1 177 9 add.u64 %r119,%r122,%r135; .loc 1 179 7 @ %r357 bra $L1; add.u64 %r128,%frame,368; mov.u64 %r144,8; .loc 1 178 5 mov.u64 %r120,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r128],%r364; .loc 1 176 23 mul.lo.u64 %r338,%r88,%r87; .loc 1 176 9 add.u64 %r339,%r338,%r338; sub.u64 %r118,%r121,%r339; .loc 1 177 23 
mul.lo.u64 %r340,%r91,%r87; .loc 1 177 9 add.u64 %r341,%r340,%r340; sub.u64 %r119,%r122,%r341; .loc 1 178 5 add.u64 %r120,%r120,1; .loc 1 179 7 add.u64 %r128,%r128,8; add.u64 %r144,%r144,8; setp.eq.u64 %r342,%r120,%r361; @ %r342 bra $L1; $L32: .loc 1 187 16 ld.u64 %r343,[%r128]; add.u64 %r87,%r343,1; st.u64 [%r128],%r87; .loc 1 188 23 add.u64 %r345,%r362,%r144; ld.u64 %r88,[%r345]; .loc 1 188 13 add.u64 %r346,%r88,%r88; add.u64 %r121,%r118,%r346; .loc 1 189 23 add.u64 %r347,%frame,%r144; ld.u64 %r91,[%r347]; .loc 1 189 13 add.u64 %r348,%r91,%r91; add.u64 %r122,%r119,%r348; .loc 1 169 32 add.u64 %r350,%r363,%r144; ld.u64 %r95,[%r350]; .loc 1 169 13 setp.eq.u64 %r351,%r87,%r95; @ %r351 bra $L33; .loc 1 173 13 mov.u64 %r142,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r352,[%r182]; setp.eq.u64 %r353,%r352,0; @ ! %r353 bra $L11; bra $L34; $L1_gfortran_mproduct_i2 .visible .func _gfortran_mproduct_608pred %r216; .reg .u32 %r220; .reg .u64 %r222; .reg .u32 %r224; .reg .u32 %r225; .reg .u16 %r227; .reg .u16u16 %r264; .reg .u32 %r265; .reg .u16u64 %r274; .reg .u64 %r280; .reg .pred %r282; .reg .pred %r283; .reg .u64u64u64predu64 %r313; .reg .u64u64pred %r332; .reg .pred %r333; .reg .u64 %r334; .reg .u32u16 %r355; .reg .u16 %r356; .reg .u32 %r357; .reg .predpred %r370; .reg .u64 %r371; .reg .u64 %r373; .reg .u32 %r374; .reg .predpredpredu64 %r427; .reg .u64 %r429; .reg .u64u64 %r451; mov.u64 %r212,%ar0; mov.u64 %r213,%ar1; mov.u64 %r214,%ar2; mov.u64 %r215,%ar3; .loc 1 223 6 setp.ne.u64 %r216,%r215,0; @ %r216 bra $L491213214; call _gfortran_product_.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r214]; .loc 1 233 7 add.u64 %r140,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r213+28]; .loc 1 234 38 add.u32 %r220,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r141,%r220; .loc 1 237 7 shr.u64 %r222,%r140,63; set.u32.gt.s64 %r224,%r140,%r141; neg.s32 %r225,%r224; cvt.u16.u64 %r228,%r222; cvt.u16.u32 %r229,%r225; or.b16 %r227,%r228,%r229; .loc 1 237 6 cvt.u32.u16 %r230,%r227; cvt.u16.u8 %r231,%r230; setp.eq.u16 %r232,%r231,0; @ %r232 bra $L51; .loc 1 239 7 cvt.u32.u32 %r235,%r23; cvt.s64.s8 %r234,%r235; st.u64 [%stack+8],%r234; st.u64 [%stack],%r22; cvta.const.u64 %r233233_gfortran_runtime_error1: .loc 1 244 9 add.u64 %r439,%r140,%r140; add.u64 %r440,%r439,%r140; shl.b64 %r240,%r440,3; add.u64 %r241,%r213,%r240; ld.u64 %r244,[%r241+56]; add.u64 %r243,%r244,1; .loc 1 244 7 ld.u64 %r251,[%r241+48]; sub.u64 %r142,%r243,%r251; .loc 1 245 6 setp.le.s64 %r252,%r142,0; @ %r252 bra $L48; .loc 1 248 9 ld.u64 %r155,[%r215]; .loc 1 250 15 ld.u64 %r34,[%r215+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r253,%r35,-4; and.b32 %r254,%r253,-5; set.u32.eq.u32 %r256,%r254,0; neg.s32 %r257,%r256; .loc 1 252 22 add.u32 %r258,%r35,-1; set.u32.le.u32 %r260,%r258,1; neg.s32 %r261,%r260; .loc 1 252 6 cvt.u16.u32 %r263,%r257; cvt.u16.u32 %r264,%r261; or.b16 %r262,%r263,%r264; cvt.u32.u16 %r265,%r262; cvt.u16.u8 %r266,%r265; setp.ne.u16 %r267,%r266,0; @ %r267 bra $L53; .loc 1 254 7 setp.ne.u32 %r269,%r35,16; @ %r269 bra $L54; $L53: .loc 1 261 9 shl.b64 %r273,%r440,3; add.u64 %r274,%r213,%r273; ld.u64 %r145,[%r274+40]; .loc 1 262 12 add.u64 %r280,%r215,%r273; ld.u64 %r42,[%r280+40]; .loc 1 264 3 setp.ne.u64 %r282,%r140,0; @ %r282 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r283,%r140,%r141; @ %r283 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r2842stack; call _gfortran_runtime_error5: add.u64 %r40,%r213,40; add.u64 %r182,%r215,40; add.u64 %r286,%r213,16; add.u64 %r288,%r22,%r22; add.u64 %r289,%r288,%r22; shl.b64 
%r290,%r289,3; add.u64 %r209,%r286,%r290; .loc 1 264 3 mov.u64 %r95,0; add.u64 %r437,%frame,240; add.u64 %r436,%frame,360; .loc 1 271 12 mov.u64 %r451,%r95; $L60: .loc 1 266 18 add.u64 %r292,%r437,%r95; ld.u64 %r293,[%r40]; st.u64 [%r292],%r293; .loc 1 267 18 add.u64 %r294,%frame,%r95; .loc 1 267 20 ld.u64 %r296,[%r182]; mul.lo.u64 %r295,%r296,%r34; .loc 1 267 18 st.u64 [%r294],%r295; .loc 1 268 19 ld.u64 %r298,[%r40+16]; add.u64 %r297,%r298,1; ld.u64 %r299,[%r40+8]; sub.u64 %r54,%r297,%r299; .loc 1 270 10 setp.lt.s64 %r300,%r54,0; @ %r300 bra $L58; .loc 1 268 17 add.u64 %r302,%r436,%r95; st.u64 [%r302],%r54; bra $L59; $L58: .loc 1 271 12 add.u64 %r304,%r436,%r95; st.u64 [%r304],%r451; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r182,%r182,24; add.u64 %r95,%r95,8; setp.ne.u64 %r306,%r40,%r209; @ %r306 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r307,[%r212]; setp.eq.u64 %r308,%r307,0; @ ! %r308 bra $L63; bra $L62; $L56: add.u64 %r310,%r22,%r22; add.u64 %r311,%r310,%r22; shl.b64 %r312,%r311,3; add.u64 %r130,%r312,40; add.u64 %r136,%r213,%r130; add.u64 %r128,%r215,%r130; shl.b64 %r313,%r22,3; add.u64 %r106,%r313,-8; cvt.u32.u32 %r315,%r23; cvt.s64.s8 %r314,%r315; shl.b64 %r316,%r314,3; add.u64 %r69,%r316,-8; add.u64 %r437,%frame,240; add.u64 %r436,%frame,360; .loc 1 281 12 mov.u64 %r450,0; $L66: .loc 1 276 18 add.u64 %r318,%r437,%r106; ld.u64 %r319,[%r136]; st.u64 [%r318],%r319; .loc 1 277 18 add.u64 %r320,%frame,%r106; .loc 1 277 20 ld.u64 %r322,[%r128]; mul.lo.u64 %r321,%r322,%r34; .loc 1 277 18 st.u64 [%r320],%r321; .loc 1 278 19 ld.u64 %r324,[%r136+16]; add.u64 %r323,%r324,1; ld.u64 %r325,[%r136+8]; sub.u64 %r66,%r323,%r325; .loc 1 280 10 setp.lt.s64 %r326,%r66,0; @ %r326 bra $L64; .loc 1 278 17 add.u64 %r328,%r436,%r106; st.u64 [%r328],%r66; bra $L65; $L64: .loc 1 281 12 add.u64 %r330,%r436,%r106; st.u64 [%r330],%r450; $L65: .loc 1 274 3 add.u64 %r136,%r136,24; add.u64 %r128,%r128,24; add.u64 %r106,%r106,8; setp.ne.u64 %r332,%r69,%r106; @ %r332 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r333,%r141,0; @ %r333 bra $L68; add.u64 %r436,%frame,360; $L86: add.u64 %r175,%r212,40; mov.u64 %r167,%r436; cvt.u32.u32 %r335,%r23; cvt.s64.s8 %r334,%r335; add.u64 %r337,%r334,%r334; add.u64 %r338,%r337,%r334; shl.b64 %r339,%r338,3; add.u64 %r340,%r212,16; add.u64 %r137,%r339,%r340; .loc 1 291 10 mov.u64 %r132,1; .loc 1 295 4 mov.u64 %r359,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r341,%r23,-2; cvt.s64.s32 %r75,%r341; add.u64 %r343,%r75,%r75; add.u64 %r344,%r343,%r75; shl.b64 %r345,%r344,3; add.u64 %r346,%r212,%r345; .loc 1 299 67 shl.b64 %r348,%r75,3; add.u64 %r349,%frame,%r348; .loc 1 299 59 ld.u64 %r351,[%r346+40]; ld.u64 %r352,[%r349+360]; mul.lo.u64 %r149,%r351,%r352; .loc 1 301 24 mov.u64 %r353,0; st.u64 [%r212+8],%r353; .loc 1 302 28 cvt.u16.u32 %r356,%r23; add.u16 %r355,%r356,-1; cvt.u32.u16 %r357,%r355; st.u8 [%r212+28],%r357; .loc 1 304 10 setp.eq.u64 %r358,%r149,0; @ %r358 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r132,%r72,%r132; $L69: .loc 1 295 4 st.u64 [%r175+8],%r359; ld.u64 %r72,[%r167]; add.u64 %r360,%r72,-1; st.u64 [%r175+16],%r360; st.u64 [%r175],%r132; .loc 1 288 7 add.u64 %r175,%r175,24; add.u64 %r167,%r167,8; setp.ne.u64 %r361,%r137,%r175; @ %r361 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r212+48],%r149; mov.u64 %r363,-1; st.u64 [%r212+56],%r363; mov.u64 %r364,1; st.u64 [%r212+40],%r364; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r366,366; call (%value_in),_gfortrani_xmallocarray67,[%value_in]; } .loc 1 311 22 st.u64 [%r212],%r367; 
bra $L73; $L63: .loc 1 316 19 ld.s8 %r369,[%r212+28]; .loc 1 316 10 setp.eq.u64 %r370,%r369,%r141; @ %r370 bra $L74; .loc 1 317 2 cvta.const.u64 %r371371_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r373,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r374,[%r373+36]; setp.eq.u32 %r375,%r374,0; @ %r375 bra $L73; .loc 1 321 4 add.u64 %r436,%frame,360; cvta.const.u64 %r379,$LC2r4r378379; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r383,$LC383379; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r385,%r141,0; @ %r385 bra $L75; $L78: .loc 1 336 8 ld.u64 %r156,[%r212]; .loc 1 337 8 ld.u64 %r154,[%r213]; .loc 1 339 9 setp.ne.u64 %r386,%r154,0; @ %r386 bra $L76; bra $L48; $L75: add.u64 %r193,%frame,480; add.u64 %r192,%r212,40; add.u64 %r190,%frame,120; cvt.u32.u32 %r388,%r23; cvt.s64.s8 %r387,%r388; add.u64 %r178,%r387,-1; .loc 1 328 3 mov.u64 %r160,0; add.u64 %r436,%frame,360; .loc 1 330 16 mov.u64 %r389,%r160; $L77: st.u64 [%r193],%r389; .loc 1 331 18 ld.u64 %r390,[%r192]; st.u64 [%r190],%r390; .loc 1 332 17 shl.b64 %r392,%r160,3; add.u64 %r393,%r436,%r392; .loc 1 332 10 ld.u64 %r394,[%r393]; setp.le.s64 %r395,%r394,0; @ %r395 bra $L48; .loc 1 328 26 add.u64 %r160,%r160,1; .loc 1 328 3 add.u64 %r193,%r193,8; add.u64 %r192,%r192,24; add.u64 %r190,%r190,8; setp.ne.u64 %r396,%r160,%r178; @ %r396 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 add.u64 %r94,%r145,%r145; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r97,[%frame+240]; .loc 1 359 12 add.u64 %r98,%r97,%r97; .loc 1 360 23 ld.u64 %r100,[%frame]; .loc 1 361 22 ld.u64 %r102,[%frame+120]; .loc 1 361 12 add.u64 %r103,%r102,%r102; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r97; .loc 1 371 24 mul.lo.u64 %r400,%r85,%r100; .loc 1 371 10 neg.s64 %r166,%r400; .loc 1 372 23 mul.lo.u64 %r401,%r85,%r102; .loc 1 372 9 add.u64 %r402,%r401,%r401; neg.s64 %r168,%r402; setp.le.s64 %r441,%r141,1; .loc 1 348 10 mov.u32 %r443,1; cvt.u32.u32 %r444,%r23; cvt.s64.s8 %r445,%r444; add.u64 %r446,%r445,-1; add.u64 %r447,%frame,240; add.u64 %r448,%frame,360; add.u64 %r449,%frame,120; $L87: .loc 1 328 3 mov.u64 %r159,%r155; mov.u64 %r158,%r154; .loc 1 348 10 mov.u32 %r133,%r443; .loc 1 349 9 mov.u64 %r157,0; $L80: .loc 1 352 6 ld.s8 %r404,[%r159]; cvt.u16.u32 %r403,%r404; setp.eq.u16 %r405,%r403,0; @ %r405 bra $L79; .loc 1 353 12 ld.u16 %r407,[%r158]; cvt.u16.u32 %r408,%r133; mul.lo.u16 %r406,%r407,%r408; cvt.s32.s16 %r133,%r406; $L79: .loc 1 349 24 add.u64 %r157,%r157,1; .loc 1 349 32 add.u64 %r158,%r158,%r94; .loc 1 349 47 add.u64 %r159,%r159,%r45; .loc 1 349 2 setp.ne.u64 %r410,%r142,%r157; @ %r410 bra $L80; .loc 1 355 8 cvt.u16.u32 %r411,%r133; st.u16 [%r156],%r411; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r154,%r154,%r98; .loc 1 360 13 add.u64 %r155,%r155,%r100; .loc 1 361 12 add.u64 %r156,%r156,%r103; .loc 1 363 13 setp.ne.u64 %r412,%r85,%r39; @ %r412 bra $L87; .loc 1 371 10 add.u64 %r151,%r155,%r166; .loc 1 372 9 add.u64 %r152,%r156,%r168; .loc 1 374 7 @ ! 
%r441 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r442; .loc 1 370 23 mul.lo.u64 %r105,%r117,%r116; .loc 1 371 24 mul.lo.u64 %r415,%r119,%r116; .loc 1 371 10 sub.u64 %r151,%r155,%r415; .loc 1 372 23 mul.lo.u64 %r416,%r121,%r116; .loc 1 372 9 add.u64 %r417,%r416,%r416; sub.u64 %r152,%r156,%r417; .loc 1 373 5 add.u64 %r153,%r153,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r169,%r169,8; setp.ne.u64 %r418,%r153,%r446; @ %r418 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 370 23 mov.u64 %r105,%r38; .loc 1 374 7 mov.u64 %r169,8; .loc 1 373 5 mov.u64 %r153,1; .loc 1 367 13 mov.u64 %r442,0; $L84: .loc 1 382 16 ld.u64 %r422,[%r41]; add.u64 %r116,%r422,1; st.u64 [%r41],%r116; .loc 1 383 23 add.u64 %r424,%r447,%r169; ld.u64 %r117,[%r424]; .loc 1 383 13 sub.u64 %r425,%r117,%r105; add.u64 %r426,%r425,%r425; add.u64 %r154,%r154,%r426; .loc 1 384 24 add.u64 %r427,%frame,%r169; ld.u64 %r119,[%r427]; .loc 1 384 14 add.u64 %r155,%r151,%r119; .loc 1 385 23 add.u64 %r429,%r449,%r169; ld.u64 %r121,[%r429]; .loc 1 385 13 add.u64 %r430,%r121,%r121; add.u64 %r156,%r152,%r430; .loc 1 363 32 add.u64 %r432,%r448,%r169; ld.u64 %r125,[%r432]; .loc 1 363 13 setp.eq.u64 %r433,%r116,%r125; @ %r433 bra $L85; .loc 1 367 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r434,[%r212]; setp.eq.u64 %r435,%r434,0; @ ! %r435 bra $L63; bra $L86; $L48: .loc 1 389_gfortran_sproduct_i2 .visible .func _gfortran_sproduct_8u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64predpred %r292; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_product_.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r295,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r295; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r295; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r301,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r301; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r294,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r294,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r300,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r300; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r295,0; @ %r211 bra $L127; add.u64 %r294,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r294; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r244,4; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r295,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r295; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r295; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r295,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: .loc 
1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r295,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r295,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 add.u64 %r69,%r68,%r68; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r293,%r295,1; .loc 1 512 13 mov.u16 %r279,1; cvt.u32.u32 %r297,%r24; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; $L146: st.u16 [%r98],%r279; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L146; .loc 1 525 7 @ ! %r293 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r296; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r299; @ %r283 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; .loc 1 520 13 mov.u64 %r296,0; $L144: .loc 1 529 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r289,%r75,%r72; add.u64 %r290,%r289,%r289; add.u64 %r98,%r98,%r290; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r292,%r98,0; @ ! 
%r292 bra $L122; bra $L147; $L109: .loc 1 534product_i4.o/_gfortran_product_i4 .visible .func _gfortran_product_i4fortran/generated/product_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mproduct_i4 .visible .func _gfortran_mproduct_i4gfortran_sproduct_i4 .visible .func _gfortran_sproduct_i4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,108] = {80,82,79,68,85,67,84114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_product_i4 .visible .func _gfortran_product_i448pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64u64predpredu64u64predmov.u64 %r179,%ar0; mov.u64 %r180,%ar1; mov.u64 %r181,%ar2; .loc 1 55 10 ld.s8 %r22,[%r180+28]; .loc 1 55 38 add.u32 %r182,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r182; .loc 1 56 10 ld.u64 %r25,[%r181]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r184,%r104,63; set.u32.lt.s64 %r186,%r103,%r104; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 58 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 60 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r104,%r104; add.u64 %r201,%r200,%r104; shl.b64 %r202,%r201,3; add.u64 %r203,%r180,%r202; ld.u64 %r30,[%r203+56]; ld.u64 %r32,[%r203+48]; .loc 1 68 9 ld.u64 %r107,[%r203+40]; .loc 1 70 3 setp.ne.u64 %r217,%r104,0; @ %r217 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r218,%r103,%r104; @ %r218 bra $L4; bra $L46; $L3: add.u64 %r61,%r180,40; add.u64 %r40,%frame,120; add.u64 %r91,%frame,240; add.u64 %r219,%r180,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r174,%r219,%r223; .loc 1 76 12 mov.u64 %r357,0; $L8: .loc 1 72 18 ld.u64 %r224,[%r61]; st.u64 [%r40],%r224; .loc 1 73 19 ld.u64 %r226,[%r61+16]; add.u64 %r225,%r226,1; ld.u64 %r227,[%r61+8]; sub.u64 %r37,%r225,%r227; .loc 1 75 10 setp.lt.s64 %r228,%r37,0; @ %r228 bra $L6; .loc 1 73 17 st.u64 [%r91],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r91],%r357; $L7: .loc 1 70 3 add.u64 
%r61,%r61,24; add.u64 %r40,%r40,8; add.u64 %r91,%r91,8; setp.ne.u64 %r230,%r61,%r174; @ %r230 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r231,[%r179]; setp.eq.u64 %r232,%r231,0; @ ! %r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r143,%r180,%r237; shl.b64 %r238,%r25,3; add.u64 %r125,%r238,-8; add.u64 %r344,%frame,120; add.u64 %r128,%r344,%r125; add.u64 %r345,%frame,240; add.u64 %r121,%r345,%r125; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r180,40; add.u64 %r95,%r246,%r247; .loc 1 84 12 mov.u64 %r356,0; $L14: .loc 1 80 18 ld.u64 %r248,[%r143]; st.u64 [%r128],%r248; .loc 1 81 19 ld.u64 %r250,[%r143+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r143+8]; sub.u64 %r44,%r249,%r251; .loc 1 83 10 setp.lt.s64 %r252,%r44,0; @ %r252 bra $L12; .loc 1 81 17 st.u64 [%r121],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r121],%r356; $L13: .loc 1 78 3 add.u64 %r143,%r143,24; add.u64 %r128,%r128,8; add.u64 %r121,%r121,8; setp.ne.u64 %r254,%r95,%r143; @ %r254 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r255,%r103,0; @ %r255 bra $L16; add.u64 %r345,%frame,240; $L34: add.u64 %r152,%r179,40; mov.u64 %r150,%r345; add.u64 %r256,%r179,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r144,%r256,%r262; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r263,0; st.u64 [%r179+8],%r263; .loc 1 103 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r179+28],%r267; .loc 1 105 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r52,%r268; add.u64 %r270,%r52,%r52; add.u64 %r271,%r270,%r52; shl.b64 %r272,%r271,3; add.u64 %r273,%r179,%r272; .loc 1 105 67 shl.b64 %r275,%r52,3; add.u64 %r276,%frame,%r275; .loc 1 105 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r109,%r278,%r279; .loc 1 107 29281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 107 27 st.u64 [%r179],%r282; .loc 1 108 10 setp.eq.u64 %r284,%r109,0; @ ! 
%r284 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r48,%r99; $L17: .loc 1 98 4 st.u64 [%r152+8],%r285; ld.u64 %r48,[%r150]; add.u64 %r286,%r48,-1; st.u64 [%r152+16],%r286; st.u64 [%r152],%r99; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r287,%r144,%r152; @ %r287 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r179+48],%r109; mov.u64 %r289,-1; st.u64 [%r179+56],%r289; mov.u64 %r290,1; st.u64 [%r179+40],%r290; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r179+28]; .loc 1 118 10 setp.eq.u64 %r291,%r58,%r103; @ %r291 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r294,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r295,[%r294+36]; setp.eq.u32 %r296,%r295,0; @ %r296 bra $L21; .loc 1 125 2 add.u64 %r345,%frame,240; cvta.const.u64 %r300,$LC2r345299300; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r302,%r103,0; @ %r302 bra $L24; $L27: .loc 1 65 9 add.u64 %r303,%r30,1; .loc 1 65 7 sub.u64 %r105,%r303,%r32; max.s64 %r45,%r105,0; .loc 1 137 8 ld.u64 %r116,[%r180]; .loc 1 138 8 ld.u64 %r117,[%r179]; .loc 1 154 36 shl.b64 %r66,%r107,2; ld.u64 %r138,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 shl.b64 %r69,%r68,2; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 shl.b64 %r72,%r71,2; .loc 1 169 32 ld.u64 %r123,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r304,%r68,%r123; .loc 1 176 9 shl.b64 %r305,%r304,2; neg.s64 %r127,%r305; .loc 1 177 23 mul.lo.u64 %r306,%r71,%r123; .loc 1 177 9 shl.b64 %r307,%r306,2; neg.s64 %r130,%r307; setp.gt.s64 %r346,%r105,0; setp.le.s64 %r347,%r103,1; cvt.u32.u32 %r349,%r22; cvt.s64.s8 %r350,%r349; add.u64 %r351,%r350,-1; add.u64 %r352,%frame,120; add.u64 %r353,%frame,240; .loc 1 173 13 mov.u64 %r354,0; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r179,40; mov.u64 %r162,%frame; cvt.u32.u32 %r309,%r22; cvt.s64.s8 %r308,%r309; add.u64 %r153,%r308,-1; .loc 1 129 3 mov.u64 %r118,0; add.u64 %r345,%frame,240; .loc 1 131 16 mov.u64 %r310,%r118; $L26: st.u64 [%r165],%r310; .loc 1 132 18 ld.u64 %r311,[%r164]; st.u64 [%r162],%r311; .loc 1 133 17 shl.b64 %r313,%r118,3; add.u64 %r314,%r345,%r313; .loc 1 133 10 ld.u64 %r315,[%r314]; setp.le.s64 %r316,%r315,0; @ %r316 bra $L1; .loc 1 129 26 add.u64 %r118,%r118,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r317,%r118,%r153; @ %r317 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r346 bra $L35; .loc 1 150 10 mov.u32 %r319,1; st.u32 [%r117],%r319; bra $L29; $L35: mov.u64 %r112,%r116; .loc 1 148 10 mov.u32 %r110,1; .loc 1 154 13 mov.u64 %r111,0; $L28: .loc 1 158 10 ld.u32 %r320,[%r112]; mul.lo.u32 %r110,%r110,%r320; .loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r66; .loc 1 154 6 setp.gt.s64 %r321,%r45,%r111; @ %r321 bra $L28; .loc 1 161 12 st.u32 [%r117],%r110; $L29: .loc 1 165 15 add.u64 %r138,%r138,1; .loc 1 166 12 add.u64 %r116,%r116,%r69; .loc 1 167 12 add.u64 %r117,%r117,%r72; .loc 1 169 13 setp.ne.u64 %r322,%r138,%r123; @ %r322 bra $L25; .loc 1 176 9 add.u64 %r113,%r116,%r127; .loc 1 177 9 add.u64 %r114,%r117,%r130; .loc 1 179 7 @ %r347 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r115,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r122],%r354; .loc 1 176 23 mul.lo.u64 %r328,%r85,%r84; .loc 1 176 9 shl.b64 %r329,%r328,2; sub.u64 %r113,%r116,%r329; .loc 1 177 23 mul.lo.u64 %r330,%r88,%r84; .loc 1 177 9 shl.b64 %r331,%r330,2; sub.u64 %r114,%r117,%r331; .loc 1 
178 5 add.u64 %r115,%r115,1; .loc 1 179 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r332,%r115,%r351; @ %r332 bra $L1; $L32: .loc 1 187 16 ld.u64 %r333,[%r122]; add.u64 %r84,%r333,1; st.u64 [%r122],%r84; .loc 1 188 23 add.u64 %r335,%r352,%r140; ld.u64 %r85,[%r335]; .loc 1 188 13 shl.b64 %r336,%r85,2; add.u64 %r116,%r113,%r336; .loc 1 189 23 add.u64 %r337,%frame,%r140; ld.u64 %r88,[%r337]; .loc 1 189 13 shl.b64 %r338,%r88,2; add.u64 %r117,%r114,%r338; .loc 1 169 32 add.u64 %r340,%r353,%r140; ld.u64 %r92,[%r340]; .loc 1 169 13 setp.eq.u64 %r341,%r84,%r92; @ %r341 bra $L33; .loc 1 173 13 mov.u64 %r138,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r342,[%r179]; setp.eq.u64 %r343,%r342,0; @ ! %r343 bra $L11; bra $L34; $L1_gfortran_mproduct_i4 .visible .func _gfortran_mproduct_608u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .predpred %r399; .reg .u32 %r400; .reg .pred %r401; .reg .predmov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 223 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $L49gfortran_product_.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r211]; .loc 1 233 7 add.u64 %r137,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r210+28]; .loc 1 234 38 add.u32 %r217,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r138,%r217; .loc 1 237 7 shr.u64 %r219,%r137,63; set.u32.gt.s64 %r221,%r137,%r138; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 237 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L51; .loc 1 239 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r429,%r137,%r137; add.u64 %r430,%r429,%r137; shl.b64 %r237,%r430,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 244 7 ld.u64 %r248,[%r238+48]; sub.u64 %r139,%r240,%r248; .loc 1 245 6 setp.le.s64 %r249,%r139,0; @ %r249 bra $L48; .loc 1 248 9 ld.u64 %r152,[%r212]; .loc 1 250 15 ld.u64 %r34,[%r212+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 252 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 252 6 cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L53; .loc 1 254 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L54; $L53: .loc 1 261 9 shl.b64 %r270,%r430,3; add.u64 %r271,%r210,%r270; ld.u64 %r142,[%r271+40]; .loc 1 262 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 264 3 setp.ne.u64 %r279,%r137,0; @ %r279 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r280,%r137,%r138; @ %r280 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r281281_gfortran_runtime_error5: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; add.u64 %r285,%r22,%r22; add.u64 %r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 264 3 mov.u64 %r92,0; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 271 12 mov.u64 %r441,%r92; $L60: .loc 1 266 18 add.u64 %r289,%r427,%r92; 
ld.u64 %r290,[%r40]; st.u64 [%r289],%r290; .loc 1 267 18 add.u64 %r291,%frame,%r92; .loc 1 267 20 ld.u64 %r293,[%r179]; mul.lo.u64 %r292,%r293,%r34; .loc 1 267 18 st.u64 [%r291],%r292; .loc 1 268 19 ld.u64 %r295,[%r40+16]; add.u64 %r294,%r295,1; ld.u64 %r296,[%r40+8]; sub.u64 %r54,%r294,%r296; .loc 1 270 10 setp.lt.s64 %r297,%r54,0; @ %r297 bra $L58; .loc 1 268 17 add.u64 %r299,%r426,%r92; st.u64 [%r299],%r54; bra $L59; $L58: .loc 1 271 12 add.u64 %r301,%r426,%r92; st.u64 [%r301],%r441; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r179,%r179,24; add.u64 %r92,%r92,8; setp.ne.u64 %r303,%r40,%r206; @ %r303 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r304,[%r209]; setp.eq.u64 %r305,%r304,0; @ ! %r305 bra $L63; bra $L62; $L56: add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r127,%r309,40; add.u64 %r133,%r210,%r127; add.u64 %r125,%r212,%r127; shl.b64 %r310,%r22,3; add.u64 %r103,%r310,-8; cvt.u32.u32 %r312,%r23; cvt.s64.s8 %r311,%r312; shl.b64 %r313,%r311,3; add.u64 %r69,%r313,-8; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 281 12 mov.u64 %r440,0; $L66: .loc 1 276 18 add.u64 %r315,%r427,%r103; ld.u64 %r316,[%r133]; st.u64 [%r315],%r316; .loc 1 277 18 add.u64 %r317,%frame,%r103; .loc 1 277 20 ld.u64 %r319,[%r125]; mul.lo.u64 %r318,%r319,%r34; .loc 1 277 18 st.u64 [%r317],%r318; .loc 1 278 19 ld.u64 %r321,[%r133+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r133+8]; sub.u64 %r66,%r320,%r322; .loc 1 280 10 setp.lt.s64 %r323,%r66,0; @ %r323 bra $L64; .loc 1 278 17 add.u64 %r325,%r426,%r103; st.u64 [%r325],%r66; bra $L65; $L64: .loc 1 281 12 add.u64 %r327,%r426,%r103; st.u64 [%r327],%r440; $L65: .loc 1 274 3 add.u64 %r133,%r133,24; add.u64 %r125,%r125,24; add.u64 %r103,%r103,8; setp.ne.u64 %r329,%r69,%r103; @ %r329 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r330,%r138,0; @ %r330 bra $L68; add.u64 %r426,%frame,360; $L86: add.u64 %r172,%r209,40; mov.u64 %r164,%r426; cvt.u32.u32 %r332,%r23; cvt.s64.s8 %r331,%r332; add.u64 %r334,%r331,%r331; add.u64 %r335,%r334,%r331; shl.b64 %r336,%r335,3; add.u64 %r337,%r209,16; add.u64 %r134,%r336,%r337; .loc 1 291 10 mov.u64 %r129,1; .loc 1 295 4 mov.u64 %r356,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r338,%r23,-2; cvt.s64.s32 %r75,%r338; add.u64 %r340,%r75,%r75; add.u64 %r341,%r340,%r75; shl.b64 %r342,%r341,3; add.u64 %r343,%r209,%r342; .loc 1 299 67 shl.b64 %r345,%r75,3; add.u64 %r346,%frame,%r345; .loc 1 299 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r146,%r348,%r349; .loc 1 301 24 mov.u64 %r350,0; st.u64 [%r209+8],%r350; .loc 1 302 28 cvt.u16.u32 %r353,%r23; add.u16 %r352,%r353,-1; cvt.u32.u16 %r354,%r352; st.u8 [%r209+28],%r354; .loc 1 304 10 setp.eq.u64 %r355,%r146,0; @ %r355 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r129,%r72,%r129; $L69: .loc 1 295 4 st.u64 [%r172+8],%r356; ld.u64 %r72,[%r164]; add.u64 %r357,%r72,-1; st.u64 [%r172+16],%r357; st.u64 [%r172],%r129; .loc 1 288 7 add.u64 %r172,%r172,24; add.u64 %r164,%r164,8; setp.ne.u64 %r358,%r134,%r172; @ %r358 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r209+48],%r146; mov.u64 %r360,-1; st.u64 [%r209+56],%r360; mov.u64 %r361,1; st.u64 [%r209+40],%r361; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24363; call (%value_in),_gfortrani_xmallocarray64,[%value_in]; } .loc 1 311 22 st.u64 [%r209],%r364; bra $L73; $L63: .loc 1 316 19 ld.s8 %r366,[%r209+28]; .loc 1 316 10 setp.eq.u64 %r367,%r366,%r138; @ %r367 bra $L74; .loc 1 317 2 cvta.const.u64 %r36836_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 
%r370,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L73; .loc 1 321 4 add.u64 %r426,%frame,360; cvta.const.u64 %r376,$LC2r42637376; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r382,%r138,0; @ %r382 bra $L75; $L78: .loc 1 336 8 ld.u64 %r153,[%r209]; .loc 1 337 8 ld.u64 %r151,[%r210]; .loc 1 339 9 setp.ne.u64 %r383,%r151,0; @ %r383 bra $L76; bra $L48; $L75: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r426,%frame,360; .loc 1 330 16 mov.u64 %r386,%r157; $L77: st.u64 [%r190],%r386; .loc 1 331 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 332 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r426,%r389; .loc 1 332 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 shl.b64 %r91,%r142,2; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r94,[%frame+240]; .loc 1 359 12 shl.b64 %r95,%r94,2; .loc 1 360 23 ld.u64 %r97,[%frame]; .loc 1 361 22 ld.u64 %r99,[%frame+120]; .loc 1 361 12 shl.b64 %r100,%r99,2; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r94; .loc 1 371 24 mul.lo.u64 %r394,%r85,%r97; .loc 1 371 10 neg.s64 %r163,%r394; .loc 1 372 23 mul.lo.u64 %r395,%r85,%r99; .loc 1 372 9 shl.b64 %r396,%r395,2; neg.s64 %r165,%r396; setp.le.s64 %r431,%r138,1; cvt.u32.u32 %r433,%r23; cvt.s64.s8 %r434,%r433; add.u64 %r435,%r434,-1; add.u64 %r436,%frame,240; add.u64 %r437,%frame,360; add.u64 %r438,%frame,120; .loc 1 367 13 mov.u64 %r439,0; $L87: .loc 1 328 3 mov.u64 %r156,%r152; mov.u64 %r155,%r151; .loc 1 348 10 mov.u32 %r130,1; .loc 1 349 9 mov.u64 %r154,0; $L80: .loc 1 352 6 ld.s8 %r398,[%r156]; cvt.u16.u32 %r397,%r398; setp.eq.u16 %r399,%r397,0; @ %r399 bra $L79; .loc 1 353 12 ld.u32 %r400,[%r155]; mul.lo.u32 %r130,%r130,%r400; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r91; .loc 1 349 47 add.u64 %r156,%r156,%r45; .loc 1 349 2 setp.ne.u64 %r401,%r139,%r154; @ %r401 bra $L80; .loc 1 355 8 st.u32 [%r153],%r130; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r151,%r151,%r95; .loc 1 360 13 add.u64 %r152,%r152,%r97; .loc 1 361 12 add.u64 %r153,%r153,%r100; .loc 1 363 13 setp.ne.u64 %r402,%r85,%r39; @ %r402 bra $L87; .loc 1 371 10 add.u64 %r148,%r152,%r163; .loc 1 372 9 add.u64 %r149,%r153,%r165; .loc 1 374 7 @ ! 
%r431 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r439; .loc 1 370 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 371 24 mul.lo.u64 %r405,%r116,%r113; .loc 1 371 10 sub.u64 %r148,%r152,%r405; .loc 1 372 23 mul.lo.u64 %r406,%r118,%r113; .loc 1 372 9 shl.b64 %r407,%r406,2; sub.u64 %r149,%r153,%r407; .loc 1 373 5 add.u64 %r150,%r150,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r166,%r166,8; setp.ne.u64 %r408,%r150,%r435; @ %r408 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 370 23 mov.u64 %r102,%r38; .loc 1 374 7 mov.u64 %r166,8; .loc 1 373 5 mov.u64 %r150,1; $L84: .loc 1 382 16 ld.u64 %r412,[%r41]; add.u64 %r113,%r412,1; st.u64 [%r41],%r113; .loc 1 383 23 add.u64 %r414,%r436,%r166; ld.u64 %r114,[%r414]; .loc 1 383 13 sub.u64 %r415,%r114,%r102; shl.b64 %r416,%r415,2; add.u64 %r151,%r151,%r416; .loc 1 384 24 add.u64 %r417,%frame,%r166; ld.u64 %r116,[%r417]; .loc 1 384 14 add.u64 %r152,%r148,%r116; .loc 1 385 23 add.u64 %r419,%r438,%r166; ld.u64 %r118,[%r419]; .loc 1 385 13 shl.b64 %r420,%r118,2; add.u64 %r153,%r149,%r420; .loc 1 363 32 add.u64 %r422,%r437,%r166; ld.u64 %r122,[%r422]; .loc 1 363 13 setp.eq.u64 %r423,%r113,%r122; @ %r423 bra $L85; .loc 1 367 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r424,[%r209]; setp.eq.u64 %r425,%r424,0; @ ! %r425 bra $L63; bra $L86; $L48: .loc 1 389_gfortran_sproduct_i4 .visible .func _gfortran_sproduct_8u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64predu64mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_product_.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r294,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r294; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r294; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r300,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r300; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r293,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r293,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r299,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r299; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r294,0; @ %r211 bra $L127; add.u64 %r293,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r293; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 2496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r294,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r294; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r294; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r294,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: .loc 1 504 16 
cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r294,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r294,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,2; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r292,%r294,1; .loc 1 512 13 mov.u32 %r278,1; cvt.u32.u32 %r296,%r24; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; $L146: st.u32 [%r98],%r278; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r279,%r59,%r80; @ %r279 bra $L146; .loc 1 525 7 @ ! %r292 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r295; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r282,%r97,%r298; @ %r282 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; .loc 1 520 13 mov.u64 %r295,0; $L144: .loc 1 529 16 ld.u64 %r287,[%r99]; add.u64 %r74,%r287,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r288,%r75,%r72; shl.b64 %r289,%r288,2; add.u64 %r98,%r98,%r289; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r290,%r74,%r77; @ %r290 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r291,%r98,0; @ ! 
%r291 bra $L122; bra $L147; $L109: .loc 1 534product_i8.o/_gfortran_product_i8 .visible .func _gfortran_product_i8fortran/generated/product_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mproduct_i8 .visible .func _gfortran_mproduct_i8gfortran_sproduct_i8 .visible .func _gfortran_sproduct_i8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,108] = {80,82,79,68,85,67,84114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_product_i8 .visible .func _gfortran_product_i84896pred %r218; .reg .predpred %r231; .reg .u64 %r232; .reg .predu64 %r250; .reg .u64 %r251; .reg .u64 %r252; .reg .pred %r253; .reg .predu64u16u64 %r274; .reg .u64pred %r303; .reg .u64 %r304; .reg .u64 %r305; .reg .u32 %r306; .reg .u64u64 %r311; .reg .u64 %r312; .reg .pred %r313; .reg .predu64predmov.u64 %r180,%ar0; mov.u64 %r181,%ar1; mov.u64 %r182,%ar2; .loc 1 55 10 ld.s8 %r22,[%r181+28]; .loc 1 55 38 add.u32 %r183,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r107,%r183; .loc 1 56 10 ld.u64 %r25,[%r182]; .loc 1 56 7 add.u64 %r108,%r25,-1; .loc 1 58 7 shr.u64 %r185,%r108,63; set.u32.lt.s64 %r187,%r107,%r108; neg.s32 %r188,%r187; cvt.u16.u64 %r191,%r185; cvt.u16.u32 %r192,%r188; or.b16 %r190,%r191,%r192; .loc 1 58 6 cvt.u32.u16 %r193,%r190; cvt.u16.u8 %r194,%r193; setp.eq.u16 %r195,%r194,0; @ %r195 bra $L2; .loc 1 60 7 cvt.u32.u32 %r198,%r22; cvt.s64.s8 %r197,%r198; st.u64 [%stack+8],%r1971966_gfortran_runtime_errorr201,%r108,%r108; add.u64 %r202,%r201,%r108; shl.b64 %r203,%r202,3; add.u64 %r204,%r181,%r203; ld.u64 %r30,[%r204+56]; ld.u64 %r32,[%r204+48]; .loc 1 68 9 ld.u64 %r111,[%r204+40]; .loc 1 70 3 setp.ne.u64 %r218,%r108,0; @ %r218 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r219,%r107,%r108; @ %r219 bra $L4; bra $L46; $L3: add.u64 %r61,%r181,40; add.u64 %r40,%frame,120; add.u64 %r96,%frame,240; add.u64 %r220,%r181,16; add.u64 %r222,%r25,%r25; add.u64 %r223,%r222,%r25; shl.b64 %r224,%r223,3; add.u64 %r175,%r220,%r224; .loc 1 76 12 mov.u64 %r358,0; $L8: .loc 1 72 18 ld.u64 %r225,[%r61]; st.u64 [%r40],%r225; .loc 1 73 19 ld.u64 %r227,[%r61+16]; add.u64 %r226,%r227,1; ld.u64 %r228,[%r61+8]; sub.u64 
%r37,%r226,%r228; .loc 1 75 10 setp.lt.s64 %r229,%r37,0; @ %r229 bra $L6; .loc 1 73 17 st.u64 [%r96],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r96],%r358; $L7: .loc 1 70 3 add.u64 %r61,%r61,24; add.u64 %r40,%r40,8; add.u64 %r96,%r96,8; setp.ne.u64 %r231,%r61,%r175; @ %r231 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r232,[%r180]; setp.eq.u64 %r233,%r232,0; @ ! %r233 bra $L11; bra $L10; $L4: add.u64 %r235,%r25,%r25; add.u64 %r236,%r235,%r25; shl.b64 %r237,%r236,3; add.u64 %r238,%r237,40; add.u64 %r136,%r181,%r238; shl.b64 %r239,%r25,3; add.u64 %r105,%r239,-8; add.u64 %r345,%frame,120; add.u64 %r129,%r345,%r105; add.u64 %r346,%frame,240; add.u64 %r103,%r346,%r105; cvt.u32.u32 %r243,%r22; cvt.s64.s8 %r242,%r243; add.u64 %r245,%r242,%r242; add.u64 %r246,%r245,%r242; shl.b64 %r247,%r246,3; add.u64 %r248,%r181,40; add.u64 %r80,%r247,%r248; .loc 1 84 12 mov.u64 %r357,0; $L14: .loc 1 80 18 ld.u64 %r249,[%r136]; st.u64 [%r129],%r249; .loc 1 81 19 ld.u64 %r251,[%r136+16]; add.u64 %r250,%r251,1; ld.u64 %r252,[%r136+8]; sub.u64 %r44,%r250,%r252; .loc 1 83 10 setp.lt.s64 %r253,%r44,0; @ %r253 bra $L12; .loc 1 81 17 st.u64 [%r103],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r103],%r357; $L13: .loc 1 78 3 add.u64 %r136,%r136,24; add.u64 %r129,%r129,8; add.u64 %r103,%r103,8; setp.ne.u64 %r255,%r80,%r136; @ %r255 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r256,%r107,0; @ %r256 bra $L16; add.u64 %r346,%frame,240; $L34: add.u64 %r152,%r180,40; mov.u64 %r150,%r346; add.u64 %r257,%r180,16; cvt.u32.u32 %r259,%r22; cvt.s64.s8 %r258,%r259; add.u64 %r261,%r258,%r258; add.u64 %r262,%r261,%r258; shl.b64 %r263,%r262,3; add.u64 %r139,%r257,%r263; .loc 1 94 10 mov.u64 %r102,1; .loc 1 98 4 mov.u64 %r286,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r264,0; st.u64 [%r180+8],%r264; .loc 1 103 28 cvt.u16.u32 %r267,%r22; add.u16 %r266,%r267,-1; cvt.u32.u16 %r268,%r266; st.u8 [%r180+28],%r268; .loc 1 105 20 add.u32 %r269,%r22,-2; cvt.s64.s32 %r53,%r269; add.u64 %r271,%r53,%r53; add.u64 %r272,%r271,%r53; shl.b64 %r273,%r272,3; add.u64 %r274,%r180,%r273; .loc 1 105 67 shl.b64 %r276,%r53,3; add.u64 %r277,%frame,%r276; .loc 1 105 59 ld.u64 %r279,[%r274+40]; ld.u64 %r280,[%r277+240]; mul.lo.u64 %r113,%r279,%r280; .loc 1 107 29 mov.u64 %r2811(%value_in),_gfortrani_xmallocarray283,[%value_in]; } .loc 1 107 27 st.u64 [%r180],%r283; .loc 1 108 10 setp.eq.u64 %r285,%r113,0; @ ! 
%r285 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r102,%r49,%r102; $L17: .loc 1 98 4 st.u64 [%r152+8],%r286; ld.u64 %r49,[%r150]; add.u64 %r287,%r49,-1; st.u64 [%r152+16],%r287; st.u64 [%r152],%r102; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r288,%r139,%r152; @ %r288 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r180+48],%r113; mov.u64 %r290,-1; st.u64 [%r180+56],%r290; mov.u64 %r291,1; st.u64 [%r180+40],%r291; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r60,[%r180+28]; .loc 1 118 10 setp.eq.u64 %r292,%r60,%r107; @ %r292 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r107; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r295,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r296,[%r295+36]; setp.eq.u32 %r297,%r296,0; @ %r297 bra $L21; .loc 1 125 2 add.u64 %r346,%frame,240; cvta.const.u64 %r301,$LC2r346300301; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r303,%r107,0; @ %r303 bra $L24; $L27: .loc 1 65 9 add.u64 %r304,%r30,1; .loc 1 65 7 sub.u64 %r109,%r304,%r32; max.s64 %r45,%r109,0; .loc 1 137 8 ld.u64 %r121,[%r181]; .loc 1 138 8 ld.u64 %r122,[%r180]; .loc 1 154 36 shl.b64 %r68,%r111,3; ld.u64 %r47,[%frame+360]; .loc 1 166 22 ld.u64 %r69,[%frame+120]; .loc 1 166 12 shl.b64 %r71,%r69,3; .loc 1 167 22 ld.u64 %r72,[%frame]; .loc 1 167 12 shl.b64 %r74,%r72,3; setp.gt.s64 %r347,%r109,0; setp.le.s64 %r348,%r107,1; cvt.u32.u32 %r350,%r22; cvt.s64.s8 %r351,%r350; add.u64 %r352,%r351,-1; add.u64 %r353,%frame,120; add.u64 %r354,%frame,240; .loc 1 173 13 mov.u64 %r355,0; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r180,40; mov.u64 %r162,%frame; cvt.u32.u32 %r306,%r22; cvt.s64.s8 %r305,%r306; add.u64 %r153,%r305,-1; .loc 1 129 3 mov.u64 %r123,0; add.u64 %r346,%frame,240; .loc 1 131 16 mov.u64 %r307,%r123; $L26: st.u64 [%r165],%r307; .loc 1 132 18 ld.u64 %r308,[%r164]; st.u64 [%r162],%r308; .loc 1 133 17 shl.b64 %r310,%r123,3; add.u64 %r311,%r346,%r310; .loc 1 133 10 ld.u64 %r312,[%r311]; setp.le.s64 %r313,%r312,0; @ %r313 bra $L1; .loc 1 129 26 add.u64 %r123,%r123,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r314,%r123,%r153; @ %r314 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r347 bra $L35; .loc 1 150 10 mov.u64 %r316,1; st.u64 [%r122],%r316; bra $L29; $L35: mov.u64 %r116,%r121; .loc 1 148 10 mov.u64 %r114,1; .loc 1 154 13 mov.u64 %r115,0; $L28: .loc 1 158 10 ld.u64 %r317,[%r116]; mul.lo.u64 %r114,%r114,%r317; .loc 1 154 28 add.u64 %r115,%r115,1; .loc 1 154 36 add.u64 %r116,%r116,%r68; .loc 1 154 6 setp.gt.s64 %r318,%r45,%r115; @ %r318 bra $L28; .loc 1 161 12 st.u64 [%r122],%r114; $L29: .loc 1 165 15 add.u64 %r47,%r47,1; .loc 1 166 12 add.u64 %r121,%r121,%r71; .loc 1 167 12 add.u64 %r122,%r122,%r74; .loc 1 169 32 ld.u64 %r127,[%frame+240]; .loc 1 169 13 setp.ne.u64 %r319,%r47,%r127; @ %r319 bra $L25; .loc 1 176 23 mul.lo.u64 %r320,%r69,%r47; .loc 1 176 9 shl.b64 %r321,%r320,3; sub.u64 %r118,%r121,%r321; .loc 1 177 23 mul.lo.u64 %r322,%r72,%r47; .loc 1 177 9 shl.b64 %r323,%r322,3; sub.u64 %r119,%r122,%r323; .loc 1 179 7 @ %r348 bra $L1; add.u64 %r144,%frame,368; mov.u64 %r106,8; .loc 1 178 5 mov.u64 %r120,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r144],%r355; .loc 1 176 23 mul.lo.u64 %r329,%r88,%r87; .loc 1 176 9 shl.b64 %r330,%r329,3; sub.u64 %r118,%r121,%r330; .loc 1 177 23 mul.lo.u64 %r331,%r91,%r87; .loc 1 177 9 shl.b64 %r332,%r331,3; sub.u64 %r119,%r122,%r332; .loc 1 178 5 add.u64 %r120,%r120,1; .loc 1 179 7 add.u64 %r144,%r144,8; add.u64 
%r106,%r106,8; setp.eq.u64 %r333,%r120,%r352; @ %r333 bra $L1; $L32: .loc 1 187 16 ld.u64 %r334,[%r144]; add.u64 %r87,%r334,1; st.u64 [%r144],%r87; .loc 1 188 23 add.u64 %r336,%r353,%r106; ld.u64 %r88,[%r336]; .loc 1 188 13 shl.b64 %r337,%r88,3; add.u64 %r121,%r118,%r337; .loc 1 189 23 add.u64 %r338,%frame,%r106; ld.u64 %r91,[%r338]; .loc 1 189 13 shl.b64 %r339,%r91,3; add.u64 %r122,%r119,%r339; .loc 1 169 32 add.u64 %r341,%r354,%r106; ld.u64 %r94,[%r341]; .loc 1 169 13 setp.eq.u64 %r342,%r87,%r94; @ %r342 bra $L33; .loc 1 173 13 mov.u64 %r47,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r343,[%r180]; setp.eq.u64 %r344,%r343,0; @ ! %r344 bra $L11; bra $L34; $L1_gfortran_mproduct_i8 .visible .func _gfortran_mproduct_608predpred %r280; .reg .pred %r281; .reg .u64 %r282; .reg .u64 %r284; .reg .u64predu64 %r318; .reg .u64u64u64 %r364; .reg .u64 %r365; .reg .u64 %r367; .reg .predu64pred %r393; .reg .predpredmov.u64 %r210,%ar0; mov.u64 %r211,%ar1; mov.u64 %r212,%ar2; mov.u64 %r213,%ar3; .loc 1 223 6911212; call _gfortran_product_.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r212]; .loc 1 233 7 add.u64 %r141,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r211+28]; .loc 1 234 38 add.u32 %r218,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r142,%r218; .loc 1 237 7 shr.u64 %r220,%r141,63; set.u32.gt.s64 %r222,%r141,%r142; neg.s32 %r223,%r222; cvt.u16.u64 %r226,%r220; cvt.u16.u32 %r227,%r223; or.b16 %r225,%r226,%r227; .loc 1 237 6 cvt.u32.u16 %r228,%r225; cvt.u16.u8 %r229,%r228; setp.eq.u16 %r230,%r229,0; @ %r230 bra $L51; .loc 1 239 7 cvt.u32.u32 %r233,%r23; cvt.s64.s8 %r232,%r233; st.u64 [%stack+8],%r232; st.u64 [%stack],%r22; cvta.const.u64 %r2312stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r431,%r141,%r141; add.u64 %r430,%r431,%r141; shl.b64 %r238,%r430,3; add.u64 %r239,%r211,%r238; ld.u64 %r242,[%r239+56]; add.u64 %r241,%r242,1; .loc 1 244 7 ld.u64 %r249,[%r239+48]; sub.u64 %r143,%r241,%r249; .loc 1 245 6 setp.le.s64 %r250,%r143,0; @ %r250 bra $L48; .loc 1 248 9 ld.u64 %r156,[%r213]; .loc 1 250 15 ld.u64 %r35,[%r213+16]; .loc 1 252 22 cvt.u32.u64 %r36,%r35; .loc 1 252 53 add.u32 %r251,%r36,-4; and.b32 %r252,%r251,-5; set.u32.eq.u32 %r254,%r252,0; neg.s32 %r255,%r254; .loc 1 252 22 add.u32 %r256,%r36,-1; set.u32.le.u32 %r258,%r256,1; neg.s32 %r259,%r258; .loc 1 252 6 cvt.u16.u32 %r261,%r255; cvt.u16.u32 %r262,%r259; or.b16 %r260,%r261,%r262; cvt.u32.u16 %r263,%r260; cvt.u16.u8 %r264,%r263; setp.ne.u16 %r265,%r264,0; @ %r265 bra $L53; .loc 1 254 7 setp.ne.u32 %r267,%r36,16; @ %r267 bra $L54; $L53: .loc 1 261 9 shl.b64 %r271,%r430,3; add.u64 %r272,%r211,%r271; ld.u64 %r146,[%r272+40]; .loc 1 262 12 add.u64 %r278,%r213,%r271; ld.u64 %r43,[%r278+40]; .loc 1 264 3 setp.ne.u64 %r280,%r141,0; @ %r280 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r281,%r141,%r142; @ %r281 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r282282_gfortran_runtime_error5: add.u64 %r194,%r211,40; add.u64 %r196,%r213,40; add.u64 %r284,%r211,16; add.u64 %r286,%r22,%r22; add.u64 %r287,%r286,%r22; shl.b64 %r288,%r287,3; add.u64 %r207,%r284,%r288; .loc 1 264 3 mov.u64 %r198,0; add.u64 %r428,%frame,240; add.u64 %r427,%frame,360; .loc 1 271 12 mov.u64 %r442,%r198; $L60: .loc 1 266 18 add.u64 %r290,%r428,%r198; ld.u64 %r291,[%r194]; st.u64 [%r290],%r291; .loc 1 267 18 add.u64 %r292,%frame,%r198; .loc 1 267 20 ld.u64 %r294,[%r196]; mul.lo.u64 %r293,%r294,%r35; .loc 1 267 18 st.u64 [%r292],%r293; .loc 1 268 19 ld.u64 %r296,[%r194+16]; add.u64 %r295,%r296,1; ld.u64 %r297,[%r194+8]; sub.u64 %r55,%r295,%r297; .loc 1 270 10 setp.lt.s64 
%r298,%r55,0; @ %r298 bra $L58; .loc 1 268 17 add.u64 %r300,%r427,%r198; st.u64 [%r300],%r55; bra $L59; $L58: .loc 1 271 12 add.u64 %r302,%r427,%r198; st.u64 [%r302],%r442; $L59: .loc 1 264 3 add.u64 %r194,%r194,24; add.u64 %r196,%r196,24; add.u64 %r198,%r198,8; setp.ne.u64 %r304,%r194,%r207; @ %r304 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r305,[%r210]; setp.eq.u64 %r306,%r305,0; @ ! %r306 bra $L63; bra $L62; $L56: add.u64 %r308,%r22,%r22; add.u64 %r309,%r308,%r22; shl.b64 %r310,%r309,3; add.u64 %r61,%r310,40; add.u64 %r100,%r211,%r61; add.u64 %r139,%r213,%r61; shl.b64 %r311,%r22,3; add.u64 %r192,%r311,-8; cvt.u32.u32 %r313,%r23; cvt.s64.s8 %r312,%r313; shl.b64 %r314,%r312,3; add.u64 %r39,%r314,-8; add.u64 %r428,%frame,240; add.u64 %r427,%frame,360; .loc 1 281 12 mov.u64 %r441,0; $L66: .loc 1 276 18 add.u64 %r316,%r428,%r192; ld.u64 %r317,[%r100]; st.u64 [%r316],%r317; .loc 1 277 18 add.u64 %r318,%frame,%r192; .loc 1 277 20 ld.u64 %r320,[%r139]; mul.lo.u64 %r319,%r320,%r35; .loc 1 277 18 st.u64 [%r318],%r319; .loc 1 278 19 ld.u64 %r322,[%r100+16]; add.u64 %r321,%r322,1; ld.u64 %r323,[%r100+8]; sub.u64 %r67,%r321,%r323; .loc 1 280 10 setp.lt.s64 %r324,%r67,0; @ %r324 bra $L64; .loc 1 278 17 add.u64 %r326,%r427,%r192; st.u64 [%r326],%r67; bra $L65; $L64: .loc 1 281 12 add.u64 %r328,%r427,%r192; st.u64 [%r328],%r441; $L65: .loc 1 274 3 add.u64 %r100,%r100,24; add.u64 %r139,%r139,24; add.u64 %r192,%r192,8; setp.ne.u64 %r330,%r39,%r192; @ %r330 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r331,%r142,0; @ %r331 bra $L68; add.u64 %r427,%frame,360; $L86: add.u64 %r137,%r210,40; mov.u64 %r135,%r427; cvt.u32.u32 %r333,%r23; cvt.s64.s8 %r332,%r333; add.u64 %r335,%r332,%r332; add.u64 %r336,%r335,%r332; shl.b64 %r337,%r336,3; add.u64 %r338,%r210,16; add.u64 %r106,%r337,%r338; .loc 1 291 10 mov.u64 %r131,1; .loc 1 295 4 mov.u64 %r357,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r339,%r23,-2; cvt.s64.s32 %r74,%r339; add.u64 %r341,%r74,%r74; add.u64 %r342,%r341,%r74; shl.b64 %r343,%r342,3; add.u64 %r344,%r210,%r343; .loc 1 299 67 shl.b64 %r346,%r74,3; add.u64 %r347,%frame,%r346; .loc 1 299 59 ld.u64 %r349,[%r344+40]; ld.u64 %r350,[%r347+360]; mul.lo.u64 %r150,%r349,%r350; .loc 1 301 24 mov.u64 %r351,0; st.u64 [%r210+8],%r351; .loc 1 302 28 cvt.u16.u32 %r354,%r23; add.u16 %r353,%r354,-1; cvt.u32.u16 %r355,%r353; st.u8 [%r210+28],%r355; .loc 1 304 10 setp.eq.u64 %r356,%r150,0; @ %r356 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r131,%r71,%r131; $L69: .loc 1 295 4 st.u64 [%r137+8],%r357; ld.u64 %r71,[%r135]; add.u64 %r358,%r71,-1; st.u64 [%r137+16],%r358; st.u64 [%r137],%r131; .loc 1 288 7 add.u64 %r137,%r137,24; add.u64 %r135,%r135,8; setp.ne.u64 %r359,%r106,%r137; @ %r359 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r210+48],%r150; mov.u64 %r361,-1; st.u64 [%r210+56],%r361; mov.u64 %r362,1; st.u64 [%r210+40],%r362; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r364364; call (%value_in),_gfortrani_xmallocarray65,[%value_in]; } .loc 1 311 22 st.u64 [%r210],%r365; bra $L73; $L63: .loc 1 316 19 ld.s8 %r367,[%r210+28]; .loc 1 316 10 setp.eq.u64 %r368,%r367,%r142; @ %r368 bra $L74; .loc 1 317 2 cvta.const.u64 %r36stack; call _gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r371,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r372,[%r371+36]; setp.eq.u32 %r373,%r372,0; @ %r373 bra $L73; .loc 1 321 4 add.u64 %r427,%frame,360; cvta.const.u64 %r377,$LC2r427376377; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r381,$LC1r2113377; call 
_gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r383,%r142,0; @ %r383 bra $L75; $L78: .loc 1 336 8 ld.u64 %r157,[%r210]; .loc 1 337 8 ld.u64 %r155,[%r211]; .loc 1 339 9 setp.ne.u64 %r384,%r155,0; @ %r384 bra $L76; bra $L48; $L75: add.u64 %r183,%frame,480; add.u64 %r180,%r210,40; add.u64 %r177,%frame,120; cvt.u32.u32 %r386,%r23; cvt.s64.s8 %r385,%r386; add.u64 %r138,%r385,-1; .loc 1 328 3 mov.u64 %r161,0; add.u64 %r427,%frame,360; .loc 1 330 16 mov.u64 %r387,%r161; $L77: st.u64 [%r183],%r387; .loc 1 331 18 ld.u64 %r388,[%r180]; st.u64 [%r177],%r388; .loc 1 332 17 shl.b64 %r390,%r161,3; add.u64 %r391,%r427,%r390; .loc 1 332 10 ld.u64 %r392,[%r391]; setp.le.s64 %r393,%r392,0; @ %r393 bra $L48; .loc 1 328 26 add.u64 %r161,%r161,1; .loc 1 328 3 add.u64 %r183,%r183,8; add.u64 %r180,%r180,24; add.u64 %r177,%r177,8; setp.ne.u64 %r394,%r138,%r161; @ %r394 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r46,%r43,%r35; .loc 1 349 32 shl.b64 %r90,%r146,3; ld.u64 %r29,[%frame+480]; .loc 1 359 22 ld.u64 %r92,[%frame+240]; .loc 1 359 12 shl.b64 %r94,%r92,3; .loc 1 360 23 ld.u64 %r95,[%frame]; .loc 1 361 22 ld.u64 %r97,[%frame+120]; .loc 1 361 12 shl.b64 %r99,%r97,3; setp.le.s64 %r432,%r142,1; cvt.u32.u32 %r434,%r23; cvt.s64.s8 %r435,%r434; add.u64 %r436,%r435,-1; add.u64 %r437,%frame,240; add.u64 %r438,%frame,360; add.u64 %r439,%frame,120; .loc 1 367 13 mov.u64 %r440,0; $L87: .loc 1 328 3 mov.u64 %r160,%r156; mov.u64 %r159,%r155; .loc 1 348 10 mov.u64 %r133,1; .loc 1 349 9 mov.u64 %r158,0; $L80: .loc 1 352 6 ld.s8 %r396,[%r160]; cvt.u16.u32 %r395,%r396; setp.eq.u16 %r397,%r395,0; @ %r397 bra $L79; .loc 1 353 12 ld.u64 %r398,[%r159]; mul.lo.u64 %r133,%r133,%r398; $L79: .loc 1 349 24 add.u64 %r158,%r158,1; .loc 1 349 32 add.u64 %r159,%r159,%r90; .loc 1 349 47 add.u64 %r160,%r160,%r46; .loc 1 349 2 setp.ne.u64 %r399,%r143,%r158; @ %r399 bra $L80; .loc 1 355 8 st.u64 [%r157],%r133; .loc 1 358 15 add.u64 %r29,%r29,1; .loc 1 359 12 add.u64 %r155,%r155,%r94; .loc 1 360 13 add.u64 %r156,%r156,%r95; .loc 1 361 12 add.u64 %r157,%r157,%r99; .loc 1 363 32 ld.u64 %r84,[%frame+360]; .loc 1 363 13 setp.ne.u64 %r400,%r84,%r29; @ %r400 bra $L87; .loc 1 370 23 mul.lo.u64 %r102,%r29,%r92; .loc 1 371 24 mul.lo.u64 %r401,%r29,%r95; .loc 1 371 10 sub.u64 %r152,%r156,%r401; .loc 1 372 23 mul.lo.u64 %r402,%r29,%r97; .loc 1 372 9 shl.b64 %r403,%r402,3; sub.u64 %r153,%r157,%r403; .loc 1 374 7 @ ! 
%r432 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r440; .loc 1 370 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 371 24 mul.lo.u64 %r406,%r116,%r113; .loc 1 371 10 sub.u64 %r152,%r156,%r406; .loc 1 372 23 mul.lo.u64 %r407,%r118,%r113; .loc 1 372 9 shl.b64 %r408,%r407,3; sub.u64 %r153,%r157,%r408; .loc 1 373 5 add.u64 %r154,%r154,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r30,%r30,8; setp.ne.u64 %r409,%r154,%r436; @ %r409 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; mov.u64 %r30,8; .loc 1 373 5 mov.u64 %r154,1; $L84: .loc 1 382 16 ld.u64 %r413,[%r41]; add.u64 %r113,%r413,1; st.u64 [%r41],%r113; .loc 1 383 23 add.u64 %r415,%r437,%r30; ld.u64 %r114,[%r415]; .loc 1 383 13 sub.u64 %r416,%r114,%r102; shl.b64 %r417,%r416,3; add.u64 %r155,%r155,%r417; .loc 1 384 24 add.u64 %r418,%frame,%r30; ld.u64 %r116,[%r418]; .loc 1 384 14 add.u64 %r156,%r152,%r116; .loc 1 385 23 add.u64 %r420,%r439,%r30; ld.u64 %r118,[%r420]; .loc 1 385 13 shl.b64 %r421,%r118,3; add.u64 %r157,%r153,%r421; .loc 1 363 32 add.u64 %r423,%r438,%r30; ld.u64 %r121,[%r423]; .loc 1 363 13 setp.eq.u64 %r424,%r113,%r121; @ %r424 bra $L85; .loc 1 367 13 mov.u64 %r29,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r425,[%r210]; setp.eq.u64 %r426,%r425,0; @ ! %r426 bra $L63; bra $L86; $L48: .loc 1 389_gfortran_sproduct_i8 .visible .func _gfortran_sproduct_8u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64predu64mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_product_.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r294,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r294; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r294; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r300,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r300; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r293,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r293,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r299,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r299; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r294,0; @ %r211 bra $L127; add.u64 %r293,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r293; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r24496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r294,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r294; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r294; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r294,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: 
.loc 1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r294,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r294,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,3; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r292,%r294,1; .loc 1 512 13 mov.u64 %r278,1; cvt.u32.u32 %r296,%r24; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; $L146: st.u64 [%r98],%r278; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r279,%r59,%r80; @ %r279 bra $L146; .loc 1 525 7 @ ! %r292 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r295; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r282,%r97,%r298; @ %r282 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; .loc 1 520 13 mov.u64 %r295,0; $L144: .loc 1 529 16 ld.u64 %r287,[%r99]; add.u64 %r74,%r287,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r288,%r75,%r72; shl.b64 %r289,%r288,3; add.u64 %r98,%r98,%r289; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r290,%r74,%r77; @ %r290 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r291,%r98,0; @ ! 
%r291 bra $L122; bra $L147; $L109: .loc 1 534product_i16.o/_gfortran_product_i16 .visible .func _gfortran_product_i16fortran/generated/product_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mproduct_i16 .visible .func _gfortran_mproduct_i16gfortran_sproduct_i16 .visible .func _gfortran_sproduct_i16VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_retur_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,108] = {80,82,79,68,85,67,84114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_product_i16 .visible .func _gfortran_product_i16528pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64u64pred %r317; .reg .u64pred %r332; .reg .predmov.u64 %r179,%ar0; mov.u64 %r180,%ar1; mov.u64 %r181,%ar2; .loc 1 55 10 ld.s8 %r22,[%r180+28]; .loc 1 55 38 add.u32 %r182,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r182; .loc 1 56 10 ld.u64 %r25,[%r181]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r184,%r104,63; set.u32.lt.s64 %r186,%r103,%r104; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 58 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 60 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r104,%r104; add.u64 %r201,%r200,%r104; shl.b64 %r202,%r201,3; add.u64 %r203,%r180,%r202; ld.u64 %r30,[%r203+56]; ld.u64 %r32,[%r203+48]; .loc 1 68 9 ld.u64 %r107,[%r203+40]; .loc 1 70 3 setp.ne.u64 %r217,%r104,0; @ %r217 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r218,%r103,%r104; @ %r218 bra $L4; bra $L46; $L3: add.u64 %r61,%r180,40; add.u64 %r40,%frame,120; add.u64 %r91,%frame,240; add.u64 %r219,%r180,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r174,%r219,%r223; .loc 1 76 12 mov.u64 %r378,0; $L8: .loc 1 72 18 ld.u64 %r224,[%r61]; st.u64 [%r40],%r224; .loc 1 73 19 ld.u64 %r226,[%r61+16]; add.u64 %r225,%r226,1; ld.u64 %r227,[%r61+8]; sub.u64 %r37,%r225,%r227; .loc 1 75 10 setp.lt.s64 %r228,%r37,0; @ %r228 bra $L6; .loc 1 73 17 st.u64 [%r91],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 
[%r91],%r378; $L7: .loc 1 70 3 add.u64 %r61,%r61,24; add.u64 %r40,%r40,8; add.u64 %r91,%r91,8; setp.ne.u64 %r230,%r61,%r174; @ %r230 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r231,[%r179]; setp.eq.u64 %r232,%r231,0; @ ! %r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r143,%r180,%r237; shl.b64 %r238,%r25,3; add.u64 %r125,%r238,-8; add.u64 %r361,%frame,120; add.u64 %r128,%r361,%r125; add.u64 %r366,%frame,240; add.u64 %r121,%r366,%r125; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r180,40; add.u64 %r95,%r246,%r247; .loc 1 84 12 mov.u64 %r377,0; $L14: .loc 1 80 18 ld.u64 %r248,[%r143]; st.u64 [%r128],%r248; .loc 1 81 19 ld.u64 %r250,[%r143+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r143+8]; sub.u64 %r44,%r249,%r251; .loc 1 83 10 setp.lt.s64 %r252,%r44,0; @ %r252 bra $L12; .loc 1 81 17 st.u64 [%r121],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r121],%r377; $L13: .loc 1 78 3 add.u64 %r143,%r143,24; add.u64 %r128,%r128,8; add.u64 %r121,%r121,8; setp.ne.u64 %r254,%r95,%r143; @ %r254 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r255,%r103,0; @ %r255 bra $L16; add.u64 %r366,%frame,240; $L34: add.u64 %r152,%r179,40; mov.u64 %r150,%r366; add.u64 %r256,%r179,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r144,%r256,%r262; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r263,0; st.u64 [%r179+8],%r263; .loc 1 103 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r179+28],%r267; .loc 1 105 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r52,%r268; add.u64 %r270,%r52,%r52; add.u64 %r271,%r270,%r52; shl.b64 %r272,%r271,3; add.u64 %r273,%r179,%r272; .loc 1 105 67 shl.b64 %r275,%r52,3; add.u64 %r276,%frame,%r275; .loc 1 105 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r109,%r278,%r279; .loc 1 107 29 mov.u64 %r281,16281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 107 27 st.u64 [%r179],%r282; .loc 1 108 10 setp.eq.u64 %r284,%r109,0; @ ! 
%r284 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r48,%r99; $L17: .loc 1 98 4 st.u64 [%r152+8],%r285; ld.u64 %r48,[%r150]; add.u64 %r286,%r48,-1; st.u64 [%r152+16],%r286; st.u64 [%r152],%r99; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r287,%r144,%r152; @ %r287 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r179+48],%r109; mov.u64 %r289,-1; st.u64 [%r179+56],%r289; mov.u64 %r290,1; st.u64 [%r179+40],%r290; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r179+28]; .loc 1 118 10 setp.eq.u64 %r291,%r58,%r103; @ %r291 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r294,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r295,[%r294+36]; setp.eq.u32 %r296,%r295,0; @ %r296 bra $L21; .loc 1 125 2 add.u64 %r366,%frame,240; cvta.const.u64 %r300,$LC2r366299300; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r302,%r103,0; @ %r302 bra $L24; $L27: .loc 1 65 9 add.u64 %r303,%r30,1; .loc 1 65 7 sub.u64 %r105,%r303,%r32; max.s64 %r45,%r105,0; .loc 1 137 8 ld.u64 %r116,[%r180]; .loc 1 138 8 ld.u64 %r117,[%r179]; .loc 1 154 36 shl.b64 %r66,%r107,4; ld.u64 %r138,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 shl.b64 %r69,%r68,4; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 shl.b64 %r72,%r71,4; .loc 1 169 32 ld.u64 %r123,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r304,%r68,%r123; .loc 1 176 9 shl.b64 %r305,%r304,4; neg.s64 %r127,%r305; .loc 1 177 23 mul.lo.u64 %r306,%r71,%r123; .loc 1 177 9 shl.b64 %r307,%r306,4; neg.s64 %r130,%r307; setp.gt.s64 %r363,%r105,0; setp.le.s64 %r364,%r103,1; add.u64 %r369,%frame,480; add.u64 %r370,%frame,496; add.u64 %r371,%frame,512; cvt.u32.u32 %r372,%r22; cvt.s64.s8 %r373,%r372; add.u64 %r374,%r373,-1; add.u64 %r375,%frame,120; add.u64 %r376,%frame,240; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r179,40; mov.u64 %r162,%frame; cvt.u32.u32 %r309,%r22; cvt.s64.s8 %r308,%r309; add.u64 %r153,%r308,-1; .loc 1 129 3 mov.u64 %r118,0; add.u64 %r366,%frame,240; .loc 1 131 16 mov.u64 %r310,%r118; $L26: st.u64 [%r165],%r310; .loc 1 132 18 ld.u64 %r311,[%r164]; st.u64 [%r162],%r311; .loc 1 133 17 shl.b64 %r313,%r118,3; add.u64 %r314,%r366,%r313; .loc 1 133 10 ld.u64 %r315,[%r314]; setp.le.s64 %r316,%r315,0; @ %r316 bra $L1; .loc 1 129 26 add.u64 %r118,%r118,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r317,%r118,%r153; @ %r317 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r363 bra $L35; .loc 1 150 10 mov.u64 %r319,1; st.u64 [%r117],%r319; mov.u64 %r320,0; st.u64 [%r117+8],%r320; bra $L29; $L35: mov.u64 %r112,%r116; .loc 1 148 10 mov.u64 %r357,1; mov.u64 %r358,0; .loc 1 154 13 mov.u64 %r111,%r358; $L28: .loc 1 158 10 st.u64 [%frame+496],%r357; st.u64 [%frame+504],%r358; ld.u64 %r327,[%r112]; st.u64 [%frame+512],%r327; ld.u64 %r328,[%r112+8]; st.u64 [%frame+520],%r328480]; ld.u64 %r358,[%frame+488]; .loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r66; .loc 1 154 6 setp.gt.s64 %r332,%r45,%r111; @ %r332 bra $L28; .loc 1 161 12 st.u64 [%r117],%r357; st.u64 [%r117+8],%r358; $L29: .loc 1 165 15 add.u64 %r138,%r138,1; .loc 1 166 12 add.u64 %r116,%r116,%r69; .loc 1 167 12 add.u64 %r117,%r117,%r72; .loc 1 169 13 setp.ne.u64 %r335,%r138,%r123; @ %r335 bra $L25; .loc 1 176 9 add.u64 %r113,%r116,%r127; .loc 1 177 9 add.u64 %r114,%r117,%r130; .loc 1 179 7 @ %r364 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r115,1; .loc 
1 173 13 mov.u64 %r368,0; bra $L32; $L33: st.u64 [%r122],%r368; .loc 1 176 23 mul.lo.u64 %r341,%r85,%r84; .loc 1 176 9 shl.b64 %r342,%r341,4; sub.u64 %r113,%r116,%r342; .loc 1 177 23 mul.lo.u64 %r343,%r88,%r84; .loc 1 177 9 shl.b64 %r344,%r343,4; sub.u64 %r114,%r117,%r344; .loc 1 178 5 add.u64 %r115,%r115,1; .loc 1 179 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r345,%r115,%r374; @ %r345 bra $L1; $L32: .loc 1 187 16 ld.u64 %r346,[%r122]; add.u64 %r84,%r346,1; st.u64 [%r122],%r84; .loc 1 188 23 add.u64 %r348,%r375,%r140; ld.u64 %r85,[%r348]; .loc 1 188 13 shl.b64 %r349,%r85,4; add.u64 %r116,%r113,%r349; .loc 1 189 23 add.u64 %r350,%frame,%r140; ld.u64 %r88,[%r350]; .loc 1 189 13 shl.b64 %r351,%r88,4; add.u64 %r117,%r114,%r351; .loc 1 169 32 add.u64 %r353,%r376,%r140; ld.u64 %r92,[%r353]; .loc 1 169 13 setp.eq.u64 %r354,%r84,%r92; @ %r354 bra $L33; .loc 1 173 13 mov.u64 %r138,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r355,[%r179]; setp.eq.u64 %r356,%r355,0; @ ! %r356 bra $L11; bra $L34; $L1_gfortran_mproduct_i16 .visible .func _gfortran_mproduct_i16656u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .predpredpred %r411; .reg .predpred %r420; .reg .u64u64 %r432; .reg .u64 %r434; .reg .pred %r435; .reg .u64 %r436; .reg .predpred.reg .u32 %r452; .reg .u64 %r453; .reg .u64 %r454; .reg .u64mov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 223 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $L49gfortran_product_i16.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r211]; .loc 1 233 7 add.u64 %r137,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r210+28]; .loc 1 234 38 add.u32 %r217,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r138,%r217; .loc 1 237 7 shr.u64 %r219,%r137,63; set.u32.gt.s64 %r221,%r137,%r138; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 237 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L51; .loc 1 239 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r443,%r137,%r137; add.u64 %r446,%r443,%r137; shl.b64 %r237,%r446,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 244 7 ld.u64 %r248,[%r238+48]; sub.u64 %r139,%r240,%r248; .loc 1 245 6 setp.le.s64 %r249,%r139,0; @ %r249 bra $L48; .loc 1 248 9 ld.u64 %r152,[%r212]; .loc 1 250 15 ld.u64 %r34,[%r212+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 252 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 252 6 cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L53; .loc 1 254 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L54; $L53: .loc 1 261 9 shl.b64 %r270,%r446,3; add.u64 %r271,%r210,%r270; ld.u64 %r142,[%r271+40]; .loc 1 262 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 264 3 setp.ne.u64 %r279,%r137,0; @ %r279 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r280,%r137,%r138; @ %r280 bra $L56; 
bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r281281_gfortran_runtime_error5: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; add.u64 %r285,%r22,%r22; add.u64 %r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 264 3 mov.u64 %r92,0; add.u64 %r444,%frame,240; add.u64 %r442,%frame,360; .loc 1 271 12 mov.u64 %r459,%r92; $L60: .loc 1 266 18 add.u64 %r289,%r444,%r92; ld.u64 %r290,[%r40]; st.u64 [%r289],%r290; .loc 1 267 18 add.u64 %r291,%frame,%r92; .loc 1 267 20 ld.u64 %r293,[%r179]; mul.lo.u64 %r292,%r293,%r34; .loc 1 267 18 st.u64 [%r291],%r292; .loc 1 268 19 ld.u64 %r295,[%r40+16]; add.u64 %r294,%r295,1; ld.u64 %r296,[%r40+8]; sub.u64 %r54,%r294,%r296; .loc 1 270 10 setp.lt.s64 %r297,%r54,0; @ %r297 bra $L58; .loc 1 268 17 add.u64 %r299,%r442,%r92; st.u64 [%r299],%r54; bra $L59; $L58: .loc 1 271 12 add.u64 %r301,%r442,%r92; st.u64 [%r301],%r459; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r179,%r179,24; add.u64 %r92,%r92,8; setp.ne.u64 %r303,%r40,%r206; @ %r303 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r304,[%r209]; setp.eq.u64 %r305,%r304,0; @ ! %r305 bra $L63; bra $L62; $L56: add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r127,%r309,40; add.u64 %r133,%r210,%r127; add.u64 %r125,%r212,%r127; shl.b64 %r310,%r22,3; add.u64 %r103,%r310,-8; cvt.u32.u32 %r312,%r23; cvt.s64.s8 %r311,%r312; shl.b64 %r313,%r311,3; add.u64 %r69,%r313,-8; add.u64 %r444,%frame,240; add.u64 %r442,%frame,360; .loc 1 281 12 mov.u64 %r458,0; $L66: .loc 1 276 18 add.u64 %r315,%r444,%r103; ld.u64 %r316,[%r133]; st.u64 [%r315],%r316; .loc 1 277 18 add.u64 %r317,%frame,%r103; .loc 1 277 20 ld.u64 %r319,[%r125]; mul.lo.u64 %r318,%r319,%r34; .loc 1 277 18 st.u64 [%r317],%r318; .loc 1 278 19 ld.u64 %r321,[%r133+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r133+8]; sub.u64 %r66,%r320,%r322; .loc 1 280 10 setp.lt.s64 %r323,%r66,0; @ %r323 bra $L64; .loc 1 278 17 add.u64 %r325,%r442,%r103; st.u64 [%r325],%r66; bra $L65; $L64: .loc 1 281 12 add.u64 %r327,%r442,%r103; st.u64 [%r327],%r458; $L65: .loc 1 274 3 add.u64 %r133,%r133,24; add.u64 %r125,%r125,24; add.u64 %r103,%r103,8; setp.ne.u64 %r329,%r69,%r103; @ %r329 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r330,%r138,0; @ %r330 bra $L68; add.u64 %r442,%frame,360; $L86: add.u64 %r172,%r209,40; mov.u64 %r164,%r442; cvt.u32.u32 %r332,%r23; cvt.s64.s8 %r331,%r332; add.u64 %r334,%r331,%r331; add.u64 %r335,%r334,%r331; shl.b64 %r336,%r335,3; add.u64 %r337,%r209,16; add.u64 %r134,%r336,%r337; .loc 1 291 10 mov.u64 %r129,1; .loc 1 295 4 mov.u64 %r356,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r338,%r23,-2; cvt.s64.s32 %r75,%r338; add.u64 %r340,%r75,%r75; add.u64 %r341,%r340,%r75; shl.b64 %r342,%r341,3; add.u64 %r343,%r209,%r342; .loc 1 299 67 shl.b64 %r345,%r75,3; add.u64 %r346,%frame,%r345; .loc 1 299 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r146,%r348,%r349; .loc 1 301 24 mov.u64 %r350,0; st.u64 [%r209+8],%r350; .loc 1 302 28 cvt.u16.u32 %r353,%r23; add.u16 %r352,%r353,-1; cvt.u32.u16 %r354,%r352; st.u8 [%r209+28],%r354; .loc 1 304 10 setp.eq.u64 %r355,%r146,0; @ %r355 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r129,%r72,%r129; $L69: .loc 1 295 4 st.u64 [%r172+8],%r356; ld.u64 %r72,[%r164]; add.u64 %r357,%r72,-1; st.u64 [%r172+16],%r357; st.u64 [%r172],%r129; .loc 1 288 7 add.u64 %r172,%r172,24; add.u64 %r164,%r164,8; setp.ne.u64 %r358,%r134,%r172; @ %r358 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r209+48],%r146; mov.u64 %r360,-1; st.u64 
[%r209+56],%r360; mov.u64 %r361,1; st.u64 [%r209+40],%r361; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r363,16363; call (%value_in),_gfortrani_xmallocarray64,[%value_in]; } .loc 1 311 22 st.u64 [%r209],%r364; bra $L73; $L63: .loc 1 316 19 ld.s8 %r366,[%r209+28]; .loc 1 316 10 setp.eq.u64 %r367,%r366,%r138; @ %r367 bra $L74; .loc 1 317 2 cvta.const.u64 %r36836_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r370,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L73; .loc 1 321 4 add.u64 %r442,%frame,360; cvta.const.u64 %r376,$LC2r44237376; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r382,%r138,0; @ %r382 bra $L75; $L78: .loc 1 336 8 ld.u64 %r153,[%r209]; .loc 1 337 8 ld.u64 %r151,[%r210]; .loc 1 339 9 setp.ne.u64 %r383,%r151,0; @ %r383 bra $L76; bra $L48; $L75: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r442,%frame,360; .loc 1 330 16 mov.u64 %r386,%r157; $L77: st.u64 [%r190],%r386; .loc 1 331 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 332 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r442,%r389; .loc 1 332 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 shl.b64 %r91,%r142,4; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r94,[%frame+240]; .loc 1 359 12 shl.b64 %r95,%r94,4; .loc 1 360 23 ld.u64 %r97,[%frame]; .loc 1 361 22 ld.u64 %r99,[%frame+120]; .loc 1 361 12 shl.b64 %r100,%r99,4; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r94; .loc 1 371 24 mul.lo.u64 %r394,%r85,%r97; .loc 1 371 10 neg.s64 %r163,%r394; .loc 1 372 23 mul.lo.u64 %r395,%r85,%r99; .loc 1 372 9 shl.b64 %r396,%r395,4; neg.s64 %r165,%r396; setp.le.s64 %r447,%r138,1; .loc 1 353 12 add.u64 %r449,%frame,608; add.u64 %r450,%frame,624; add.u64 %r451,%frame,640; cvt.u32.u32 %r452,%r23; cvt.s64.s8 %r453,%r452; add.u64 %r454,%r453,-1; add.u64 %r455,%frame,240; add.u64 %r456,%frame,360; add.u64 %r457,%frame,120; $L87: .loc 1 328 3 mov.u64 %r156,%r152; mov.u64 %r155,%r151; .loc 1 348 10 mov.u64 %r438,1; mov.u64 %r439,0; .loc 1 349 9 mov.u64 %r154,%r439; $L80: .loc 1 352 6 ld.s8 %r398,[%r156]; cvt.u16.u32 %r397,%r398; setp.eq.u16 %r399,%r397,0; @ %r399 bra $L79; .loc 1 353 12 st.u64 [%frame+624],%r438; st.u64 [%frame+632],%r439; ld.u64 %r406,[%r155]; st.u64 [%frame+640],%r406; ld.u64 %r407,[%r155+8]; st.u64 [%frame+648],%r40739,[%frame+616]; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r91; .loc 1 349 47 add.u64 %r156,%r156,%r45; .loc 1 349 2 setp.ne.u64 %r411,%r139,%r154; @ %r411 bra $L80; .loc 1 355 8 st.u64 [%r153],%r438; st.u64 [%r153+8],%r439; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r151,%r151,%r95; .loc 1 360 13 add.u64 %r152,%r152,%r97; .loc 1 361 12 add.u64 %r153,%r153,%r100; .loc 1 363 13 setp.ne.u64 %r414,%r85,%r39; @ %r414 bra $L87; .loc 1 371 10 add.u64 %r148,%r152,%r163; .loc 1 372 9 add.u64 %r149,%r153,%r165; .loc 1 374 7 @ ! 
// product_i16.o (continued)
// _gfortran_mproduct_i16, remaining PTX: the outer-dimension pointer bookkeeping ($L84, $L85, $L108) that advances the source, mask and destination pointers between reduced columns, and the function epilogue at product_i16.c:389.
// _gfortran_sproduct_i16, PTX body: tests the scalar MASK argument (product_i16.c:412); when MASK is absent or .true. it calls _gfortran_product_i16 and returns; otherwise it computes the extents and strides of the reduced return array (lines 422-446), allocates it with _gfortrani_xmallocarray when no result descriptor was supplied (line 476), verifies the supplied extents when _gfortrani_compile_options enables bounds checking (lines 486-494), and fills every 128-bit result element with the identity value 1 (lines 512-530).
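The sproduct_* entry points in this dump (integer(16), real(4), real(8)) all share that scalar-MASK wrapper shape. The C sketch below mirrors the control flow under simplified flat-array assumptions; every name and signature here is an illustrative stand-in, not the libgfortran interface, which passes full array descriptors.

/* Sketch of the sproduct_<kind> scalar-MASK wrapper pattern seen above.
 * GCC's __int128 stands in for integer(16); the real code operates on
 * array descriptors, not flat buffers. */
#include <stddef.h>

/* Stand-in for the unmasked kernel (_gfortran_product_i16 in the dump).
 * Simplified: it reduces the whole source instead of one dimension. */
static void product_kernel(__int128 *ret, const __int128 *src,
                           size_t n, size_t ret_n)
{
    __int128 acc = 1;
    for (size_t i = 0; i < n; i++)
        acc *= src[i];
    for (size_t i = 0; i < ret_n; i++)
        ret[i] = acc;
}

/* Scalar-MASK wrapper: MASK absent or .true. delegates to the plain
 * kernel; a scalar .false. MASK yields a result filled with the
 * multiplicative identity, 1 (the fill loop at product_i16.c:512-530). */
static void sproduct_kernel(__int128 *ret, const __int128 *src,
                            size_t n, size_t ret_n, const int *mask)
{
    if (mask == NULL || *mask) {
        product_kernel(ret, src, n, ret_n);
        return;
    }
    for (size_t i = 0; i < ret_n; i++)
        ret[i] = 1;
}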
// _gfortran_sproduct_i16: function epilogue (product_i16.c:534).
//
// product_r4.o: PTX for the real(4) PRODUCT kernels generated from product_r4.c (libgfortran).
// BEGIN GLOBAL FUNCTION DECL: _gfortran_product_r4, _gfortran_mproduct_r4, _gfortran_sproduct_r4 (.visible .func)
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options (.extern .global .align 8 .u64)
// Extern functions: _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents.
// String constants, decoded from the byte arrays: "...incorrect in PRODUCT intrinsic: is %ld, should be between 1 and %l...", "...ank of return array incorrect in PRODUCT intrinsic: is %ld, should be %l...", "Funny sized logical array", "MASK argume...", "Incorrect extent in return value of PRODUCT intrinsic in dimension %ld: is %ld, should be %...".
// _gfortran_product_r4, PTX body: validates DIM against the array rank and calls _gfortran_runtime_error on failure (product_r4.c:55-60); computes the stride along DIM and the extents and strides of the remaining dimensions (lines 65-84); either allocates the return array with _gfortrani_xmallocarray when its base address is null (lines 91-112) or checks its rank and, when _gfortrani_compile_options enables bounds checking, calls _gfortrani_bounds_ifunction_return (lines 118-125); the reduction starts each destination element at 1.0f (0f3f800000), multiplies in source elements walked with the DIM stride (mul.f32, lines 148-161), and advances the source and destination pointers across the remaining dimensions with an odometer loop (lines 165-189).
// _gfortran_mproduct_r4, PTX body: the same skeleton with a MASK descriptor (product_r4.c:223-385); it additionally checks the mask element size (1, 2, 4, 8 or 16 bytes, otherwise the "Funny sized logical array" error), walks the mask with its own stride, and multiplies a source element into the accumulator only when the corresponding mask byte is nonzero.
// _gfortran_sproduct_r4, PTX body: the scalar-MASK wrapper again (product_r4.c:412-530); MASK absent or .true. calls _gfortran_product_r4, otherwise the return array is set up, optionally bounds-checked, and filled with 1.0f.
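For reference, the reduction that _gfortran_product_r4 performs can be modelled in C as a strided inner loop per output element. This is a simplified rank-2 model of the loops described above (one reduced dimension, one remaining dimension, element strides), not the generated libgfortran code; the odometer loop at product_r4.c:165-189 generalises the single output stride used here to any number of remaining dimensions.

/* Simplified model of the PRODUCT reduction along one dimension, as in the
 * _gfortran_product_r4 loops summarised above.  Strides are in elements;
 * the real code works on array descriptors and byte offsets. */
#include <stddef.h>

static void product_r4_model(float *dest, const float *src,
                             size_t n_out,      /* reduced (output) elements   */
                             size_t len,        /* extent of the reduced DIM   */
                             ptrdiff_t delta,   /* source stride along DIM     */
                             ptrdiff_t sstride) /* source step between outputs */
{
    for (size_t i = 0; i < n_out; i++) {
        float acc = 1.0f;                 /* identity, 0f3f800000 in the PTX */
        const float *p = src + (ptrdiff_t)i * sstride;
        for (size_t j = 0; j < len; j++) {
            acc *= *p;                    /* mul.f32 in the inner loop */
            p += delta;                   /* walk along DIM            */
        }
        dest[i] = acc;
    }
}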
// _gfortran_sproduct_r4: function epilogue (product_r4.c:534).
//
// product_r8.o: PTX for the real(8) PRODUCT kernels generated from product_r8.c (libgfortran).
// BEGIN GLOBAL FUNCTION DECL: _gfortran_product_r8, _gfortran_mproduct_r8, _gfortran_sproduct_r8 (.visible .func)
// The extern declarations (_gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents) and the PRODUCT diagnostic strings repeat the product_r4.o set.
// _gfortran_product_r8, _gfortran_mproduct_r8 and _gfortran_sproduct_r8, PTX bodies: structurally identical to the real(4) kernels above, with 8-byte element offsets (shl.b64 ...,3), f64 arithmetic (ld.f64, mul.f64, st.f64) and the identity constant 0d3ff0000000000000 (1.0) in place of the f32 operations; the same DIM validation, extent and stride set-up, _gfortrani_xmallocarray allocation, optional bounds checks and odometer loops appear at the corresponding product_r8.c source lines.
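The masked variants (_gfortran_mproduct_r4, _gfortran_mproduct_r8) differ from the plain kernels only in the inner loop, which walks a logical mask with its own stride and skips unmasked elements. A minimal C sketch of that inner loop, again assuming flat arrays and element strides rather than descriptors:

/* Simplified model of the masked inner loop in _gfortran_mproduct_r8: a
 * source element contributes to the product only when the corresponding
 * mask element is nonzero.  The generated code also validates the mask
 * kind (1, 2, 4, 8 or 16 bytes) before deciding how to step through it. */
#include <stddef.h>

static double mproduct_r8_model(const double *src, const signed char *mask,
                                size_t len,        /* extent along DIM */
                                ptrdiff_t delta,   /* source stride    */
                                ptrdiff_t mdelta)  /* mask stride      */
{
    double acc = 1.0;                     /* identity, 0d3ff0000000000000 */
    for (size_t j = 0; j < len; j++) {
        if (*mask)                        /* ld.s8 + branch in the PTX */
            acc *= *src;                  /* mul.f64 */
        src  += delta;
        mask += mdelta;
    }
    return acc;
}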
%r291 bra $L122; bra $L147; $L109: .loc 1 534product_r10product_c4.o/_gfortran_product_c4 .visible .func _gfortran_product_c4fortran/generated/product_c4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mproduct_c4 .visible .func _gfortran_mproduct_c4gfortran_sproduct_c4 .visible .func _gfortran_sproduct_c4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,108] = {80,82,79,68,85,67,84114,101,116,117,114,110,32,118,97,108,117,10132 $LC4[2] = {1065353226] = {70,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,9977,65,83,75,32,97,114,103,117,109,101,110,116,0 }99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_product_c4 .visible .func _gfortran_product_c448predpredu64 %r231; .reg .u64 %r232; .reg .pred %r233; .reg .pred %r235; .reg .u64 %r236; .reg .pred %r237; .reg .u64u32 %r263; .reg .u64 %r265; .reg .u64 %r266; .reg .u64 %r267; .reg .u64 %r268; .reg .u16 %r270; .reg .u16 %r271; .reg .u32 %r272; .reg .u32 %r273; .reg .u64 %r275; .reg .u64u64pred64pred %r328; .reg .pred %r329; .reg .u64 %r335; .reg .u64u64 %r359; .reg .u64mov.u64 %r184,%ar0; mov.u64 %r185,%ar1; mov.u64 %r186,%ar2; .loc 1 55 10 ld.s8 %r22,[%r185+28]; .loc 1 55 38 add.u32 %r187,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r104,%r187; .loc 1 56 10 ld.u64 %r25,[%r186]; .loc 1 56 7 add.u64 %r105,%r25,-1; .loc 1 58 7 shr.u64 %r189,%r105,63; set.u32.lt.s64 %r191,%r104,%r105; neg.s32 %r192,%r191; cvt.u16.u64 %r195,%r189; cvt.u16.u32 %r196,%r192; or.b16 %r194,%r195,%r196; .loc 1 58 6 cvt.u32.u16 %r197,%r194; cvt.u16.u8 %r198,%r197; setp.eq.u16 %r199,%r198,0; @ %r199 bra $L2; .loc 1 60 7 cvt.u32.u32 %r202,%r22; cvt.s64.s8 %r201,%r202; st.u64 [%stack+8],%r2000200_gfortran_runtime_errorr205,%r105,%r105; add.u64 %r206,%r205,%r105; shl.b64 %r207,%r206,3; add.u64 %r208,%r185,%r207; ld.u64 %r30,[%r208+56]; ld.u64 %r32,[%r208+48]; .loc 1 68 9 ld.u64 %r108,[%r208+40]; .loc 1 70 3 setp.ne.u64 %r222,%r105,0; @ %r222 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r223,%r104,%r105; @ %r223 bra $L4; bra $L46; $L3: add.u64 %r102,%r185,40; add.u64 %r98,%frame,120; add.u64 %r97,%frame,240; add.u64 %r224,%r185,16; add.u64 %r226,%r25,%r25; add.u64 %r227,%r226,%r25; shl.b64 %r228,%r227,3; add.u64 %r103,%r224,%r228; .loc 1 76 12 mov.u64 %r364,0; $L8: .loc 1 72 18 ld.u64 %r229,[%r102]; st.u64 
[%r98],%r229; .loc 1 73 19 ld.u64 %r231,[%r102+16]; add.u64 %r230,%r231,1; ld.u64 %r232,[%r102+8]; sub.u64 %r37,%r230,%r232; .loc 1 75 10 setp.lt.s64 %r233,%r37,0; @ %r233 bra $L6; .loc 1 73 17 st.u64 [%r97],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r97],%r364; $L7: .loc 1 70 3 add.u64 %r102,%r102,24; add.u64 %r98,%r98,8; add.u64 %r97,%r97,8; setp.ne.u64 %r235,%r102,%r103; @ %r235 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r236,[%r184]; setp.eq.u64 %r237,%r236,0; @ ! %r237 bra $L11; bra $L10; $L4: add.u64 %r239,%r25,%r25; add.u64 %r240,%r239,%r25; shl.b64 %r241,%r240,3; add.u64 %r242,%r241,40; add.u64 %r154,%r185,%r242; shl.b64 %r243,%r25,3; add.u64 %r147,%r243,-8; add.u64 %r352,%frame,120; add.u64 %r149,%r352,%r147; add.u64 %r351,%frame,240; add.u64 %r145,%r351,%r147; cvt.u32.u32 %r247,%r22; cvt.s64.s8 %r246,%r247; add.u64 %r249,%r246,%r246; add.u64 %r250,%r249,%r246; shl.b64 %r251,%r250,3; add.u64 %r252,%r185,40; add.u64 %r128,%r251,%r252; .loc 1 84 12 mov.u64 %r363,0; $L14: .loc 1 80 18 ld.u64 %r253,[%r154]; st.u64 [%r149],%r253; .loc 1 81 19 ld.u64 %r255,[%r154+16]; add.u64 %r254,%r255,1; ld.u64 %r256,[%r154+8]; sub.u64 %r44,%r254,%r256; .loc 1 83 10 setp.lt.s64 %r257,%r44,0; @ %r257 bra $L12; .loc 1 81 17 st.u64 [%r145],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r145],%r363; $L13: .loc 1 78 3 add.u64 %r154,%r154,24; add.u64 %r149,%r149,8; add.u64 %r145,%r145,8; setp.ne.u64 %r259,%r128,%r154; @ %r259 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r260,%r104,0; @ %r260 bra $L16; add.u64 %r351,%frame,240; $L34: add.u64 %r163,%r184,40; mov.u64 %r161,%r351; add.u64 %r261,%r184,16; cvt.u32.u32 %r263,%r22; cvt.s64.s8 %r262,%r263; add.u64 %r265,%r262,%r262; add.u64 %r266,%r265,%r262; shl.b64 %r267,%r266,3; add.u64 %r155,%r261,%r267; .loc 1 94 10 mov.u64 %r100,1; .loc 1 98 4 mov.u64 %r290,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r268,0; st.u64 [%r184+8],%r268; .loc 1 103 28 cvt.u16.u32 %r271,%r22; add.u16 %r270,%r271,-1; cvt.u32.u16 %r272,%r270; st.u8 [%r184+28],%r272; .loc 1 105 20 add.u32 %r273,%r22,-2; cvt.s64.s32 %r53,%r273; add.u64 %r275,%r53,%r53; add.u64 %r276,%r275,%r53; shl.b64 %r277,%r276,3; add.u64 %r278,%r184,%r277; .loc 1 105 67 shl.b64 %r280,%r53,3; add.u64 %r281,%frame,%r280; .loc 1 105 59 ld.u64 %r283,[%r278+40]; ld.u64 %r284,[%r281+240]; mul.lo.u64 %r110,%r283,%r284; .loc 1 107 29 mov.u64 %r28611286; call (%value_in),_gfortrani_xmallocarray287,[%value_in]; } .loc 1 107 27 st.u64 [%r184],%r287; .loc 1 108 10 setp.eq.u64 %r289,%r110,0; @ ! 
%r289 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r100,%r49,%r100; $L17: .loc 1 98 4 st.u64 [%r163+8],%r290; ld.u64 %r49,[%r161]; add.u64 %r291,%r49,-1; st.u64 [%r163+16],%r291; st.u64 [%r163],%r100; .loc 1 91 7 add.u64 %r163,%r163,24; add.u64 %r161,%r161,8; setp.ne.u64 %r292,%r155,%r163; @ %r292 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r184+48],%r110; mov.u64 %r294,-1; st.u64 [%r184+56],%r294; mov.u64 %r295,1; st.u64 [%r184+40],%r295; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r60,[%r184+28]; .loc 1 118 10 setp.eq.u64 %r296,%r60,%r104; @ %r296 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r104; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r299,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r300,[%r299+36]; setp.eq.u32 %r301,%r300,0; @ %r301 bra $L21; .loc 1 125 2 add.u64 %r351,%frame,240; cvta.const.u64 %r305,$LC2r351304305; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r307,%r104,0; @ %r307 bra $L24; $L27: .loc 1 65 9 add.u64 %r308,%r30,1; .loc 1 65 7 sub.u64 %r106,%r308,%r32; max.s64 %r45,%r106,0; .loc 1 137 8 ld.u64 %r117,[%r185]; .loc 1 138 8 ld.u64 %r118,[%r184]; .loc 1 154 36 shl.b64 %r68,%r108,3; ld.u64 %r111,[%frame+360]; .loc 1 166 22 ld.u64 %r70,[%frame+120]; .loc 1 166 12 shl.b64 %r71,%r70,3; .loc 1 167 22 ld.u64 %r73,[%frame]; .loc 1 167 12 shl.b64 %r74,%r73,3; .loc 1 169 32 ld.u64 %r124,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r309,%r70,%r124; .loc 1 176 9 shl.b64 %r310,%r309,3; neg.s64 %r129,%r310; .loc 1 177 23 mul.lo.u64 %r311,%r73,%r124; .loc 1 177 9 shl.b64 %r312,%r311,3; neg.s64 %r133,%r312; setp.gt.s64 %r353,%r106,0; setp.le.s64 %r354,%r104,1; .loc 1 150 10 ld.const.f32 %r356,[$LC4]; ld.const.f32 %r357,[$LC4+4]; cvt.u32.u32 %r358,%r22; cvt.s64.s8 %r359,%r358; add.u64 %r360,%r359,-1; add.u64 %r361,%frame,120; add.u64 %r362,%frame,240; bra $L25; $L24: add.u64 %r176,%frame,360; add.u64 %r175,%r184,40; mov.u64 %r173,%frame; cvt.u32.u32 %r314,%r22; cvt.s64.s8 %r313,%r314; add.u64 %r164,%r313,-1; .loc 1 129 3 mov.u64 %r119,0; add.u64 %r351,%frame,240; .loc 1 131 16 mov.u64 %r315,%r119; $L26: st.u64 [%r176],%r315; .loc 1 132 18 ld.u64 %r316,[%r175]; st.u64 [%r173],%r316; .loc 1 133 17 shl.b64 %r318,%r119,3; add.u64 %r319,%r351,%r318; .loc 1 133 10 ld.u64 %r320,[%r319]; setp.le.s64 %r321,%r320,0; @ %r321 bra $L1; .loc 1 129 26 add.u64 %r119,%r119,1; .loc 1 129 3 add.u64 %r176,%r176,8; add.u64 %r175,%r175,24; add.u64 %r173,%r173,8; setp.ne.u64 %r322,%r119,%r164; @ %r322 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r353 bra $L35; .loc 1 150 10 st.f32 [%r118],%r356; st.f32 [%r118+4],%r357; bra $L29; $L35: mov.u64 %r113,%r117; mov.f32 %r95,0f00000000; mov.f32 %r96,0f3f800000; .loc 1 154 13 mov.u64 %r112,0; $L28: ld.f32 %r126,[%r113]; ld.f32 %r122,[%r113+4]; .loc 1 158 10 mul.f32 %r99,%r96,%r122; mul.f32 %r326,%r95,%r122; neg.f32 %r327,%r326; fma.rn.f32 %r96,%r96,%r126,%r327; fma.rn.f32 %r95,%r95,%r126,%r99; .loc 1 154 28 add.u64 %r112,%r112,1; .loc 1 154 36 add.u64 %r113,%r113,%r68; .loc 1 154 6 setp.gt.s64 %r328,%r45,%r112; @ %r328 bra $L28; .loc 1 161 12 st.f32 [%r118],%r96; st.f32 [%r118+4],%r95; $L29: .loc 1 165 15 add.u64 %r111,%r111,1; .loc 1 166 12 add.u64 %r117,%r117,%r71; .loc 1 167 12 add.u64 %r118,%r118,%r74; .loc 1 169 13 setp.ne.u64 %r329,%r111,%r124; @ %r329 bra $L25; .loc 1 176 9 add.u64 %r114,%r117,%r129; .loc 1 177 9 add.u64 %r115,%r118,%r133; .loc 1 179 7 @ %r354 bra $L1; add.u64 %r123,%frame,368; mov.u64 %r141,8; .loc 1 178 5 mov.u64 %r116,1; .loc 1 173 13 mov.u64 %r355,0; bra $L32; $L33: 
st.u64 [%r123],%r355; .loc 1 176 23 mul.lo.u64 %r335,%r88,%r87; .loc 1 176 9 shl.b64 %r336,%r335,3; sub.u64 %r114,%r117,%r336; .loc 1 177 23 mul.lo.u64 %r337,%r91,%r87; .loc 1 177 9 shl.b64 %r338,%r337,3; sub.u64 %r115,%r118,%r338; .loc 1 178 5 add.u64 %r116,%r116,1; .loc 1 179 7 add.u64 %r123,%r123,8; add.u64 %r141,%r141,8; setp.eq.u64 %r339,%r116,%r360; @ %r339 bra $L1; $L32: .loc 1 187 16 ld.u64 %r340,[%r123]; add.u64 %r87,%r340,1; st.u64 [%r123],%r87; .loc 1 188 23 add.u64 %r342,%r361,%r141; ld.u64 %r88,[%r342]; .loc 1 188 13 shl.b64 %r343,%r88,3; add.u64 %r117,%r114,%r343; .loc 1 189 23 add.u64 %r344,%frame,%r141; ld.u64 %r91,[%r344]; .loc 1 189 13 shl.b64 %r345,%r91,3; add.u64 %r118,%r115,%r345; .loc 1 169 32 add.u64 %r347,%r362,%r141; ld.u64 %r94,[%r347]; .loc 1 169 13 setp.eq.u64 %r348,%r87,%r94; @ %r348 bra $L33; .loc 1 173 13 mov.u64 %r111,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r349,[%r184]; setp.eq.u64 %r350,%r349,0; @ ! %r350 bra $L11; bra $L34; $L1_gfortran_mproduct_c4 .visible .func _gfortran_mproduct_c608pred %r218; .reg .u32 %r222; .reg .u64 %r224; .reg .u32 %r226; .reg .u32 %r227; .reg .u16 %r229; .reg .u16u16 %r233; .reg .pred %r234; .reg .u64u64predu6464pred %r335; .reg .u64 %r336; .reg .u32 %r337; .reg .u64u64 %r342; .reg .u32 %r343; .reg .u64pred %r372; .reg .u64 %r373; .reg .u64 %r375; .reg .u32 %r376; .reg .predu32pred %r404; .reg .f32 %r405; .reg .f32 %r406; .reg .predu64 %r413; .reg .predu64predmov.u64 %r216,%ar2; mov.u64 %r217,%ar3; .loc 1 223 6 setp.ne.u64 %r218,%r217,0; @ %r218 bra $L491215216; call _gfortran_product_c4.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r216]; .loc 1 233 7 add.u64 %r134,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r215+28]; .loc 1 234 38 add.u32 %r222,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r135,%r222; .loc 1 237 7 shr.u64 %r224,%r134,63; set.u32.gt.s64 %r226,%r134,%r135; neg.s32 %r227,%r226; cvt.u16.u64 %r230,%r224; cvt.u16.u32 %r231,%r227; or.b16 %r229,%r230,%r231; .loc 1 237 6 cvt.u32.u16 %r232,%r229; cvt.u16.u8 %r233,%r232; setp.eq.u16 %r234,%r233,0; @ %r234 bra $L51; .loc 1 239 7 cvt.u32.u32 %r237,%r23; cvt.s64.s8 %r236,%r237; st.u64 [%stack+8],%r236; st.u64 [%stack],%r22; cvta.const.u64 %r235235_gfortran_runtime_error1: .loc 1 244 9 add.u64 %r434,%r134,%r134; add.u64 %r436,%r434,%r134; shl.b64 %r242,%r436,3; add.u64 %r243,%r215,%r242; ld.u64 %r246,[%r243+56]; add.u64 %r245,%r246,1; .loc 1 244 7 ld.u64 %r253,[%r243+48]; sub.u64 %r136,%r245,%r253; .loc 1 245 6 setp.le.s64 %r254,%r136,0; @ %r254 bra $L48; .loc 1 248 9 ld.u64 %r149,[%r217]; .loc 1 250 15 ld.u64 %r34,[%r217+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r255,%r35,-4; and.b32 %r256,%r255,-5; set.u32.eq.u32 %r258,%r256,0; neg.s32 %r259,%r258; .loc 1 252 22 add.u32 %r260,%r35,-1; set.u32.le.u32 %r262,%r260,1; neg.s32 %r263,%r262; .loc 1 252 6 cvt.u16.u32 %r265,%r259; cvt.u16.u32 %r266,%r263; or.loc 1 254 7 setp.ne.u32 %r271,%r35,16; @ %r271 bra $L54; $L53: .loc 1 261 9 shl.b64 %r275,%r436,3; add.u64 %r276,%r215,%r275; ld.u64 %r139,[%r276+40]; .loc 1 262 12 add.u64 %r282,%r217,%r275; ld.u64 %r41,[%r282+40]; .loc 1 264 3 setp.ne.u64 %r284,%r134,0; @ %r284 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r285,%r134,%r135; @ %r285 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r286286_gfortran_runtime_error5: add.u64 %r40,%r215,40; add.u64 %r128,%r217,40; add.u64 %r288,%r215,16; add.u64 %r290,%r22,%r22; add.u64 %r291,%r290,%r22; shl.b64 %r292,%r291,3; add.u64 %r211,%r288,%r292; .loc 1 264 3 mov.u64 %r88,0; add.u64 %r433,%frame,240; add.u64 %r432,%frame,360; 
.loc 1 271 12 mov.u64 %r447,%r88; $L60: .loc 1 266 18 add.u64 %r294,%r433,%r88; ld.u64 %r295,[%r40]; st.u64 [%r294],%r295; .loc 1 267 18 add.u64 %r296,%frame,%r88; .loc 1 267 20 ld.u64 %r298,[%r128]; mul.lo.u64 %r297,%r298,%r34; .loc 1 267 18 st.u64 [%r296],%r297; .loc 1 268 19 ld.u64 %r300,[%r40+16]; add.u64 %r299,%r300,1; ld.u64 %r301,[%r40+8]; sub.u64 %r52,%r299,%r301; .loc 1 270 10 setp.lt.s64 %r302,%r52,0; @ %r302 bra $L58; .loc 1 268 17 add.u64 %r304,%r432,%r88; st.u64 [%r304],%r52; bra $L59; $L58: .loc 1 271 12 add.u64 %r306,%r432,%r88; st.u64 [%r306],%r447; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r128,%r128,24; add.u64 %r88,%r88,8; setp.ne.u64 %r308,%r40,%r211; @ %r308 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r309,[%r214]; setp.eq.u64 %r310,%r309,0; @ ! %r310 bra $L63; bra $L62; $L56: add.u64 %r312,%r22,%r22; add.u64 %r313,%r312,%r22; shl.b64 %r314,%r313,3; add.u64 %r123,%r314,40; add.u64 %r132,%r215,%r123; add.u64 %r121,%r217,%r123; shl.b64 %r315,%r22,3; add.u64 %r133,%r315,-8; cvt.u32.u32 %r317,%r23; cvt.s64.s8 %r316,%r317; shl.b64 %r318,%r316,3; add.u64 %r67,%r318,-8; add.u64 %r433,%frame,240; add.u64 %r432,%frame,360; .loc 1 281 12 mov.u64 %r446,0; $L66: .loc 1 276 18 add.u64 %r320,%r433,%r133; ld.u64 %r321,[%r132]; st.u64 [%r320],%r321; .loc 1 277 18 add.u64 %r322,%frame,%r133; .loc 1 277 20 ld.u64 %r324,[%r121]; mul.lo.u64 %r323,%r324,%r34; .loc 1 277 18 st.u64 [%r322],%r323; .loc 1 278 19 ld.u64 %r326,[%r132+16]; add.u64 %r325,%r326,1; ld.u64 %r327,[%r132+8]; sub.u64 %r63,%r325,%r327; .loc 1 280 10 setp.lt.s64 %r328,%r63,0; @ %r328 bra $L64; .loc 1 278 17 add.u64 %r330,%r432,%r133; st.u64 [%r330],%r63; bra $L65; $L64: .loc 1 281 12 add.u64 %r332,%r432,%r133; st.u64 [%r332],%r446; $L65: .loc 1 274 3 add.u64 %r132,%r132,24; add.u64 %r121,%r121,24; add.u64 %r133,%r133,8; setp.ne.u64 %r334,%r67,%r133; @ %r334 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r335,%r135,0; @ %r335 bra $L68; add.u64 %r432,%frame,360; $L86: add.u64 %r181,%r214,40; mov.u64 %r179,%r432; cvt.u32.u32 %r337,%r23; cvt.s64.s8 %r336,%r337; add.u64 %r339,%r336,%r336; add.u64 %r340,%r339,%r336; shl.b64 %r341,%r340,3; add.u64 %r342,%r214,16; add.u64 %r144,%r341,%r342; .loc 1 291 10 mov.u64 %r125,1; .loc 1 295 4 mov.u64 %r361,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r343,%r23,-2; cvt.s64.s32 %r72,%r343; add.u64 %r345,%r72,%r72; add.u64 %r346,%r345,%r72; shl.b64 %r347,%r346,3; add.u64 %r348,%r214,%r347; .loc 1 299 67 shl.b64 %r350,%r72,3; add.u64 %r351,%frame,%r350; .loc 1 299 59 ld.u64 %r353,[%r348+40]; ld.u64 %r354,[%r351+360]; mul.lo.u64 %r143,%r353,%r354; .loc 1 301 24 mov.u64 %r355,0; st.u64 [%r214+8],%r355; .loc 1 302 28 cvt.u16.u32 %r358,%r23; add.u16 %r357,%r358,-1; cvt.u32.u16 %r359,%r357; st.u8 [%r214+28],%r359; .loc 1 304 10 setp.eq.u64 %r360,%r143,0; @ %r360 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r125,%r69,%r125; $L69: .loc 1 295 4 st.u64 [%r181+8],%r361; ld.u64 %r69,[%r179]; add.u64 %r362,%r69,-1; st.u64 [%r181+16],%r362; st.u64 [%r181],%r125; .loc 1 288 7 add.u64 %r181,%r181,24; add.u64 %r179,%r179,8; setp.ne.u64 %r363,%r144,%r181; @ %r363 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r214+48],%r143; mov.u64 %r365,-1; st.u64 [%r214+56],%r365; mov.u64 %r366,1; st.u64 [%r214+40],%r366; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r368368; call (%value_in),_gfortrani_xmallocarray69,[%value_in]; } .loc 1 311 22 st.u64 [%r214],%r369; bra $L73; $L63: .loc 1 316 19 ld.s8 %r371,[%r214+28]; .loc 1 316 10 setp.eq.u64 %r372,%r371,%r135; @ %r372 bra $L74; .loc 1 
317 2 cvta.const.u64 %r373,$LC373_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r375,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r376,[%r375+36]; setp.eq.u32 %r377,%r376,0; @ %r377 bra $L73; .loc 1 321 4 add.u64 %r432,%frame,360; cvta.const.u64 %r381,$LC2r4380381; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r385,$LC738381; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r387,%r135,0; @ %r387 bra $L75; $L78: .loc 1 336 8 ld.u64 %r150,[%r214]; .loc 1 337 8 ld.u64 %r148,[%r215]; .loc 1 339 9 setp.ne.u64 %r388,%r148,0; @ %r388 bra $L76; bra $L48; $L75: add.u64 %r194,%frame,480; add.u64 %r193,%r214,40; add.u64 %r191,%frame,120; cvt.u32.u32 %r390,%r23; cvt.s64.s8 %r389,%r390; add.u64 %r182,%r389,-1; .loc 1 328 3 mov.u64 %r155,0; add.u64 %r432,%frame,360; .loc 1 330 16 mov.u64 %r391,%r155; $L77: st.u64 [%r194],%r391; .loc 1 331 18 ld.u64 %r392,[%r193]; st.u64 [%r191],%r392; .loc 1 332 17 shl.b64 %r394,%r155,3; add.u64 %r395,%r432,%r394; .loc 1 332 10 ld.u64 %r396,[%r395]; setp.le.s64 %r397,%r396,0; @ %r397 bra $L48; .loc 1 328 26 add.u64 %r155,%r155,1; .loc 1 328 3 add.u64 %r194,%r194,8; add.u64 %r193,%r193,24; add.u64 %r191,%r191,8; setp.ne.u64 %r398,%r155,%r182; @ %r398 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r43,%r41,%r34; .loc 1 349 32 shl.b64 %r87,%r139,3; ld.u64 %r151,[%frame+480]; .loc 1 359 22 ld.u64 %r90,[%frame+240]; .loc 1 359 12 shl.b64 %r91,%r90,3; .loc 1 360 23 ld.u64 %r93,[%frame]; .loc 1 361 22 ld.u64 %r95,[%frame+120]; .loc 1 361 12 shl.b64 %r96,%r95,3; .loc 1 363 32 ld.u64 %r81,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r81,%r90; .loc 1 371 24 mul.lo.u64 %r399,%r81,%r93; .loc 1 371 10 neg.s64 %r161,%r399; .loc 1 372 23 mul.lo.u64 %r400,%r81,%r95; .loc 1 372 9 shl.b64 %r401,%r400,3; neg.s64 %r165,%r401; setp.le.s64 %r437,%r135,1; cvt.u32.u32 %r439,%r23; cvt.s64.s8 %r440,%r439; add.u64 %r441,%r440,-1; add.u64 %r442,%frame,240; add.u64 %r443,%frame,360; add.u64 %r444,%frame,120; .loc 1 367 13 mov.u64 %r445,0; $L87: .loc 1 328 3 mov.u64 %r154,%r149; mov.u64 %r153,%r148; mov.f32 %r119,0f00000000; mov.f32 %r120,0f3f800000; .loc 1 349 9 mov.u64 %r152,0; $L80: .loc 1 352 6 ld.s8 %r403,[%r154]; cvt.u16.u32 %r402,%r403; setp.eq.u16 %r404,%r402,0; @ %r404 bra $L79; ld.f32 %r163,[%r153]; ld.f32 %r170,[%r153+4]; .loc 1 353 12 mul.f32 %r174,%r120,%r170; mul.f32 %r405,%r119,%r170; neg.f32 %r406,%r405; fma.rn.f32 %r120,%r120,%r163,%r406; fma.rn.f32 %r119,%r119,%r163,%r174; $L79: .loc 1 349 24 add.u64 %r152,%r152,1; .loc 1 349 32 add.u64 %r153,%r153,%r87; .loc 1 349 47 add.u64 %r154,%r154,%r43; .loc 1 349 2 setp.ne.u64 %r407,%r136,%r152; @ %r407 bra $L80; .loc 1 355 8 st.f32 [%r150],%r120; st.f32 [%r150+4],%r119; .loc 1 358 15 add.u64 %r151,%r151,1; .loc 1 359 12 add.u64 %r148,%r148,%r91; .loc 1 360 13 add.u64 %r149,%r149,%r93; .loc 1 361 12 add.u64 %r150,%r150,%r96; .loc 1 363 13 setp.ne.u64 %r408,%r81,%r151; @ %r408 bra $L87; .loc 1 371 10 add.u64 %r145,%r149,%r161; .loc 1 372 9 add.u64 %r146,%r150,%r165; .loc 1 374 7 @ ! 
%r437 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r127],%r445; .loc 1 370 23 mul.lo.u64 %r99,%r110,%r109; .loc 1 371 24 mul.lo.u64 %r411,%r112,%r109; .loc 1 371 10 sub.u64 %r145,%r149,%r411; .loc 1 372 23 mul.lo.u64 %r412,%r114,%r109; .loc 1 372 9 shl.b64 %r413,%r412,3; sub.u64 %r146,%r150,%r413; .loc 1 373 5 add.u64 %r147,%r147,1; .loc 1 374 7 add.u64 %r127,%r127,8; add.u64 %r178,%r178,8; setp.ne.u64 %r414,%r147,%r441; @ %r414 bra $L84; bra $L48; $L108: add.u64 %r127,%frame,488; .loc 1 370 23 mov.u64 %r99,%r38; .loc 1 374 7 mov.u64 %r178,8; .loc 1 373 5 mov.u64 %r147,1; $L84: .loc 1 382 16 ld.u64 %r418,[%r127]; add.u64 %r109,%r418,1; st.u64 [%r127],%r109; .loc 1 383 23 add.u64 %r420,%r442,%r178; ld.u64 %r110,[%r420]; .loc 1 383 13 sub.u64 %r421,%r110,%r99; shl.b64 %r422,%r421,3; add.u64 %r148,%r148,%r422; .loc 1 384 24 add.u64 %r423,%frame,%r178; ld.u64 %r112,[%r423]; .loc 1 384 14 add.u64 %r149,%r145,%r112; .loc 1 385 23 add.u64 %r425,%r444,%r178; ld.u64 %r114,[%r425]; .loc 1 385 13 shl.b64 %r426,%r114,3; add.u64 %r150,%r146,%r426; .loc 1 363 32 add.u64 %r428,%r443,%r178; ld.u64 %r117,[%r428]; .loc 1 363 13 setp.eq.u64 %r429,%r109,%r117; @ %r429 bra $L85; .loc 1 367 13 mov.u64 %r151,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r430,[%r214]; setp.eq.u64 %r431,%r430,0; @ ! %r431 bra $L63; bra $L86; $L48: .loc 1 389_gfortran_sproduct_c4 .visible .func _gfortran_sproduct_c8u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64 %r276; .reg .pred %r277; .reg .pred %r280; .reg .predpred %r292; .reg .f32 %r293; .reg .f32 %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_product_c4.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r297,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r297; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r297; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r304,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r304; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r296,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r296,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r303,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r303; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r297,0; @ %r211 bra $L127; add.u64 %r296,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r24496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r297,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r297; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r297,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: 
.loc 1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r297,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r297,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,3; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; ld.const.f32 %r293,[$LC4]; ld.const.f32 %r294,[$LC4+4]; setp.le.s64 %r295,%r297,1; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; .loc 1 520 13 mov.u64 %r302,0; $L146: .loc 1 512 13 st.f32 [%r98],%r293; st.f32 [%r98+4],%r294; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L146; .loc 1 525 7 @ ! %r295 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r302; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r301; @ %r283 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; $L144: .loc 1 529 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r289,%r75,%r72; shl.b64 %r290,%r289,3; add.u64 %r98,%r98,%r290; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r292,%r98,0; @ ! 
%r292 bra $L122; bra $L147; $L109: .loc 1 534 product_c8.o/_gfortran_product_c8 .visible .func _gfortran_product_c8fortran/generated/product_c8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mproduct_c8 .visible .func _gfortran_mproduct_c8gfortran_sproduct_c8 .visible .func _gfortran_sproduct_c8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,108] = {80,82,79,68,85,67,84114,101,116,117,114,110,32,118,97,108,117,10164 $LC4[2] = {460718241880001740870,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,9977,65,83,75,32,97,114,103,117,109,101,110,116,0 }99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_product_c8 .visible .func _gfortran_product_c848predpredu64 %r231; .reg .u64 %r232; .reg .pred %r233; .reg .pred %r235; .reg .u64 %r236; .reg .pred %r237; .reg .u64u32 %r263; .reg .u64 %r265; .reg .u64 %r266; .reg .u64 %r267; .reg .u64 %r268; .reg .u16 %r270; .reg .u16 %r271; .reg .u32 %r272; .reg .u32 %r273; .reg .u64 %r275; .reg .u64u64pred64pred %r328; .reg .pred %r329; .reg .u64 %r335; .reg .u64u64 %r359; .reg .u64mov.u64 %r184,%ar0; mov.u64 %r185,%ar1; mov.u64 %r186,%ar2; .loc 1 55 10 ld.s8 %r22,[%r185+28]; .loc 1 55 38 add.u32 %r187,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r104,%r187; .loc 1 56 10 ld.u64 %r25,[%r186]; .loc 1 56 7 add.u64 %r105,%r25,-1; .loc 1 58 7 shr.u64 %r189,%r105,63; set.u32.lt.s64 %r191,%r104,%r105; neg.s32 %r192,%r191; cvt.u16.u64 %r195,%r189; cvt.u16.u32 %r196,%r192; or.b16 %r194,%r195,%r196; .loc 1 58 6 cvt.u32.u16 %r197,%r194; cvt.u16.u8 %r198,%r197; setp.eq.u16 %r199,%r198,0; @ %r199 bra $L2; .loc 1 60 7 cvt.u32.u32 %r202,%r22; cvt.s64.s8 %r201,%r202; st.u64 [%stack+8],%r2000200_gfortran_runtime_errorr205,%r105,%r105; add.u64 %r206,%r205,%r105; shl.b64 %r207,%r206,3; add.u64 %r208,%r185,%r207; ld.u64 %r30,[%r208+56]; ld.u64 %r32,[%r208+48]; .loc 1 68 9 ld.u64 %r108,[%r208+40]; .loc 1 70 3 setp.ne.u64 %r222,%r105,0; @ %r222 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r223,%r104,%r105; @ %r223 bra $L4; bra $L46; $L3: add.u64 %r102,%r185,40; add.u64 %r98,%frame,120; add.u64 %r97,%frame,240; add.u64 %r224,%r185,16; add.u64 %r226,%r25,%r25; add.u64 %r227,%r226,%r25; shl.b64 %r228,%r227,3; add.u64 %r103,%r224,%r228; .loc 1 76 12 mov.u64 %r364,0; $L8: .loc 1 72 18 ld.u64 %r229,[%r102]; st.u64 
[%r98],%r229; .loc 1 73 19 ld.u64 %r231,[%r102+16]; add.u64 %r230,%r231,1; ld.u64 %r232,[%r102+8]; sub.u64 %r37,%r230,%r232; .loc 1 75 10 setp.lt.s64 %r233,%r37,0; @ %r233 bra $L6; .loc 1 73 17 st.u64 [%r97],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r97],%r364; $L7: .loc 1 70 3 add.u64 %r102,%r102,24; add.u64 %r98,%r98,8; add.u64 %r97,%r97,8; setp.ne.u64 %r235,%r102,%r103; @ %r235 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r236,[%r184]; setp.eq.u64 %r237,%r236,0; @ ! %r237 bra $L11; bra $L10; $L4: add.u64 %r239,%r25,%r25; add.u64 %r240,%r239,%r25; shl.b64 %r241,%r240,3; add.u64 %r242,%r241,40; add.u64 %r154,%r185,%r242; shl.b64 %r243,%r25,3; add.u64 %r147,%r243,-8; add.u64 %r352,%frame,120; add.u64 %r149,%r352,%r147; add.u64 %r351,%frame,240; add.u64 %r145,%r351,%r147; cvt.u32.u32 %r247,%r22; cvt.s64.s8 %r246,%r247; add.u64 %r249,%r246,%r246; add.u64 %r250,%r249,%r246; shl.b64 %r251,%r250,3; add.u64 %r252,%r185,40; add.u64 %r128,%r251,%r252; .loc 1 84 12 mov.u64 %r363,0; $L14: .loc 1 80 18 ld.u64 %r253,[%r154]; st.u64 [%r149],%r253; .loc 1 81 19 ld.u64 %r255,[%r154+16]; add.u64 %r254,%r255,1; ld.u64 %r256,[%r154+8]; sub.u64 %r44,%r254,%r256; .loc 1 83 10 setp.lt.s64 %r257,%r44,0; @ %r257 bra $L12; .loc 1 81 17 st.u64 [%r145],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r145],%r363; $L13: .loc 1 78 3 add.u64 %r154,%r154,24; add.u64 %r149,%r149,8; add.u64 %r145,%r145,8; setp.ne.u64 %r259,%r128,%r154; @ %r259 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r260,%r104,0; @ %r260 bra $L16; add.u64 %r351,%frame,240; $L34: add.u64 %r163,%r184,40; mov.u64 %r161,%r351; add.u64 %r261,%r184,16; cvt.u32.u32 %r263,%r22; cvt.s64.s8 %r262,%r263; add.u64 %r265,%r262,%r262; add.u64 %r266,%r265,%r262; shl.b64 %r267,%r266,3; add.u64 %r155,%r261,%r267; .loc 1 94 10 mov.u64 %r100,1; .loc 1 98 4 mov.u64 %r290,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r268,0; st.u64 [%r184+8],%r268; .loc 1 103 28 cvt.u16.u32 %r271,%r22; add.u16 %r270,%r271,-1; cvt.u32.u16 %r272,%r270; st.u8 [%r184+28],%r272; .loc 1 105 20 add.u32 %r273,%r22,-2; cvt.s64.s32 %r53,%r273; add.u64 %r275,%r53,%r53; add.u64 %r276,%r275,%r53; shl.b64 %r277,%r276,3; add.u64 %r278,%r184,%r277; .loc 1 105 67 shl.b64 %r280,%r53,3; add.u64 %r281,%frame,%r280; .loc 1 105 59 ld.u64 %r283,[%r278+40]; ld.u64 %r284,[%r281+240]; mul.lo.u64 %r110,%r283,%r284; .loc 1 107 29 mov.u64 %r286,1607 27 st.u64 [%r184],%r287; .loc 1 108 10 setp.eq.u64 %r289,%r110,0; @ ! 
%r289 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r100,%r49,%r100; $L17: .loc 1 98 4 st.u64 [%r163+8],%r290; ld.u64 %r49,[%r161]; add.u64 %r291,%r49,-1; st.u64 [%r163+16],%r291; st.u64 [%r163],%r100; .loc 1 91 7 add.u64 %r163,%r163,24; add.u64 %r161,%r161,8; setp.ne.u64 %r292,%r155,%r163; @ %r292 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r184+48],%r110; mov.u64 %r294,-1; st.u64 [%r184+56],%r294; mov.u64 %r295,1; st.u64 [%r184+40],%r295; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r60,[%r184+28]; .loc 1 118 10 setp.eq.u64 %r296,%r60,%r104; @ %r296 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r104; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r299,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r300,[%r299+36]; setp.eq.u32 %r301,%r300,0; @ %r301 bra $L21; .loc 1 125 2 add.u64 %r351,%frame,240; cvta.const.u64 %r305,$LC2r351304305; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r307,%r104,0; @ %r307 bra $L24; $L27: .loc 1 65 9 add.u64 %r308,%r30,1; .loc 1 65 7 sub.u64 %r106,%r308,%r32; max.s64 %r45,%r106,0; .loc 1 137 8 ld.u64 %r117,[%r185]; .loc 1 138 8 ld.u64 %r118,[%r184]; .loc 1 154 36 shl.b64 %r68,%r108,4; ld.u64 %r111,[%frame+360]; .loc 1 166 22 ld.u64 %r70,[%frame+120]; .loc 1 166 12 shl.b64 %r71,%r70,4; .loc 1 167 22 ld.u64 %r73,[%frame]; .loc 1 167 12 shl.b64 %r74,%r73,4; .loc 1 169 32 ld.u64 %r124,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r309,%r70,%r124; .loc 1 176 9 shl.b64 %r310,%r309,4; neg.s64 %r129,%r310; .loc 1 177 23 mul.lo.u64 %r311,%r73,%r124; .loc 1 177 9 shl.b64 %r312,%r311,4; neg.s64 %r133,%r312; setp.gt.s64 %r353,%r106,0; setp.le.s64 %r354,%r104,1; .loc 1 150 10 ld.const.f64 %r356,[$LC4]; ld.const.f64 %r357,[$LC4+8]; cvt.u32.u32 %r358,%r22; cvt.s64.s8 %r359,%r358; add.u64 %r360,%r359,-1; add.u64 %r361,%frame,120; add.u64 %r362,%frame,240; bra $L25; $L24: add.u64 %r176,%frame,360; add.u64 %r175,%r184,40; mov.u64 %r173,%frame; cvt.u32.u32 %r314,%r22; cvt.s64.s8 %r313,%r314; add.u64 %r164,%r313,-1; .loc 1 129 3 mov.u64 %r119,0; add.u64 %r351,%frame,240; .loc 1 131 16 mov.u64 %r315,%r119; $L26: st.u64 [%r176],%r315; .loc 1 132 18 ld.u64 %r316,[%r175]; st.u64 [%r173],%r316; .loc 1 133 17 shl.b64 %r318,%r119,3; add.u64 %r319,%r351,%r318; .loc 1 133 10 ld.u64 %r320,[%r319]; setp.le.s64 %r321,%r320,0; @ %r321 bra $L1; .loc 1 129 26 add.u64 %r119,%r119,1; .loc 1 129 3 add.u64 %r176,%r176,8; add.u64 %r175,%r175,24; add.u64 %r173,%r173,8; setp.ne.u64 %r322,%r119,%r164; @ %r322 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r353 bra $L35; .loc 1 150 10 st.f64 [%r118],%r356; st.f64 [%r118+8],%r357; bra $L29; $L35: mov.u64 %r113,%r117; mov.f64 %r95,0d0000000000000000; mov.f64 %r96,0d3ff0000000000000; .loc 1 154 13 mov.u64 %r112,0; $L28: ld.f64 %r126,[%r113]; ld.f64 %r122,[%r113+8]; .loc 1 158 10 mul.f64 %r99,%r96,%r122; mul.f64 %r326,%r95,%r122; neg.f64 %r327,%r326; fma.rn.f64 %r96,%r96,%r126,%r327; fma.rn.f64 %r95,%r95,%r126,%r99; .loc 1 154 28 add.u64 %r112,%r112,1; .loc 1 154 36 add.u64 %r113,%r113,%r68; .loc 1 154 6 setp.gt.s64 %r328,%r45,%r112; @ %r328 bra $L28; .loc 1 161 12 st.f64 [%r118],%r96; st.f64 [%r118+8],%r95; $L29: .loc 1 165 15 add.u64 %r111,%r111,1; .loc 1 166 12 add.u64 %r117,%r117,%r71; .loc 1 167 12 add.u64 %r118,%r118,%r74; .loc 1 169 13 setp.ne.u64 %r329,%r111,%r124; @ %r329 bra $L25; .loc 1 176 9 add.u64 %r114,%r117,%r129; .loc 1 177 9 add.u64 %r115,%r118,%r133; .loc 1 179 7 @ %r354 bra $L1; add.u64 %r123,%frame,368; mov.u64 %r141,8; .loc 1 178 5 mov.u64 %r116,1; .loc 1 173 13 mov.u64 %r355,0; 
bra $L32; $L33: st.u64 [%r123],%r355; .loc 1 176 23 mul.lo.u64 %r335,%r88,%r87; .loc 1 176 9 shl.b64 %r336,%r335,4; sub.u64 %r114,%r117,%r336; .loc 1 177 23 mul.lo.u64 %r337,%r91,%r87; .loc 1 177 9 shl.b64 %r338,%r337,4; sub.u64 %r115,%r118,%r338; .loc 1 178 5 add.u64 %r116,%r116,1; .loc 1 179 7 add.u64 %r123,%r123,8; add.u64 %r141,%r141,8; setp.eq.u64 %r339,%r116,%r360; @ %r339 bra $L1; $L32: .loc 1 187 16 ld.u64 %r340,[%r123]; add.u64 %r87,%r340,1; st.u64 [%r123],%r87; .loc 1 188 23 add.u64 %r342,%r361,%r141; ld.u64 %r88,[%r342]; .loc 1 188 13 shl.b64 %r343,%r88,4; add.u64 %r117,%r114,%r343; .loc 1 189 23 add.u64 %r344,%frame,%r141; ld.u64 %r91,[%r344]; .loc 1 189 13 shl.b64 %r345,%r91,4; add.u64 %r118,%r115,%r345; .loc 1 169 32 add.u64 %r347,%r362,%r141; ld.u64 %r94,[%r347]; .loc 1 169 13 setp.eq.u64 %r348,%r87,%r94; @ %r348 bra $L33; .loc 1 173 13 mov.u64 %r111,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r349,[%r184]; setp.eq.u64 %r350,%r349,0; @ ! %r350 bra $L11; bra $L34; $L1_gfortran_mproduct_c8 .visible .func _gfortran_mproduct_c8608pred %r218; .reg .u32 %r222; .reg .u64 %r224; .reg .u32 %r226; .reg .u32 %r227; .reg .u16 %r229; .reg .u16u16 %r233; .reg .pred %r234; .reg .u64u64predu6464pred %r335; .reg .u64 %r336; .reg .u32 %r337; .reg .u64u64 %r342; .reg .u32 %r343; .reg .u64pred %r372; .reg .u64 %r373; .reg .u64 %r375; .reg .u32 %r376; .reg .predu32pred %r404; .reg .f64 %r405; .reg .f64 %r406; .reg .predu64 %r413; .reg .predu64predmov.u64 %r216,%ar2; mov.u64 %r217,%ar3; .loc 1 223 6 setp.ne.u64 %r218,%r217,0; @ %r218 bra $L491215216; call _gfortran_product_c8.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r216]; .loc 1 233 7 add.u64 %r134,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r215+28]; .loc 1 234 38 add.u32 %r222,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r135,%r222; .loc 1 237 7 shr.u64 %r224,%r134,63; set.u32.gt.s64 %r226,%r134,%r135; neg.s32 %r227,%r226; cvt.u16.u64 %r230,%r224; cvt.u16.u32 %r231,%r227; or.b16 %r229,%r230,%r231; .loc 1 237 6 cvt.u32.u16 %r232,%r229; cvt.u16.u8 %r233,%r232; setp.eq.u16 %r234,%r233,0; @ %r234 bra $L51; .loc 1 239 7 cvt.u32.u32 %r237,%r23; cvt.s64.s8 %r236,%r237; st.u64 [%stack+8],%r236; st.u64 [%stack],%r22; cvta.const.u64 %r235235_gfortran_runtime_error1: .loc 1 244 9 add.u64 %r434,%r134,%r134; add.u64 %r436,%r434,%r134; shl.b64 %r242,%r436,3; add.u64 %r243,%r215,%r242; ld.u64 %r246,[%r243+56]; add.u64 %r245,%r246,1; .loc 1 244 7 ld.u64 %r253,[%r243+48]; sub.u64 %r136,%r245,%r253; .loc 1 245 6 setp.le.s64 %r254,%r136,0; @ %r254 bra $L48; .loc 1 248 9 ld.u64 %r149,[%r217]; .loc 1 250 15 ld.u64 %r34,[%r217+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r255,%r35,-4; and.b32 %r256,%r255,-5; set.u32.eq.u32 %r258,%r256,0; neg.s32 %r259,%r258; .loc 1 252 22 add.u32 %r260,%r35,-1; set.u32.le.u32 %r262,%r260,1; neg.s32 %r263,%r262; .loc 1 252 6 cvt.u16.u32 %r265,%r259; cvt.u16.u32 %r266,%r263; or.loc 1 254 7 setp.ne.u32 %r271,%r35,16; @ %r271 bra $L54; $L53: .loc 1 261 9 shl.b64 %r275,%r436,3; add.u64 %r276,%r215,%r275; ld.u64 %r139,[%r276+40]; .loc 1 262 12 add.u64 %r282,%r217,%r275; ld.u64 %r41,[%r282+40]; .loc 1 264 3 setp.ne.u64 %r284,%r134,0; @ %r284 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r285,%r134,%r135; @ %r285 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r286286_gfortran_runtime_error5: add.u64 %r40,%r215,40; add.u64 %r128,%r217,40; add.u64 %r288,%r215,16; add.u64 %r290,%r22,%r22; add.u64 %r291,%r290,%r22; shl.b64 %r292,%r291,3; add.u64 %r211,%r288,%r292; .loc 1 264 3 mov.u64 %r88,0; add.u64 %r433,%frame,240; add.u64 
%r432,%frame,360; .loc 1 271 12 mov.u64 %r447,%r88; $L60: .loc 1 266 18 add.u64 %r294,%r433,%r88; ld.u64 %r295,[%r40]; st.u64 [%r294],%r295; .loc 1 267 18 add.u64 %r296,%frame,%r88; .loc 1 267 20 ld.u64 %r298,[%r128]; mul.lo.u64 %r297,%r298,%r34; .loc 1 267 18 st.u64 [%r296],%r297; .loc 1 268 19 ld.u64 %r300,[%r40+16]; add.u64 %r299,%r300,1; ld.u64 %r301,[%r40+8]; sub.u64 %r52,%r299,%r301; .loc 1 270 10 setp.lt.s64 %r302,%r52,0; @ %r302 bra $L58; .loc 1 268 17 add.u64 %r304,%r432,%r88; st.u64 [%r304],%r52; bra $L59; $L58: .loc 1 271 12 add.u64 %r306,%r432,%r88; st.u64 [%r306],%r447; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r128,%r128,24; add.u64 %r88,%r88,8; setp.ne.u64 %r308,%r40,%r211; @ %r308 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r309,[%r214]; setp.eq.u64 %r310,%r309,0; @ ! %r310 bra $L63; bra $L62; $L56: add.u64 %r312,%r22,%r22; add.u64 %r313,%r312,%r22; shl.b64 %r314,%r313,3; add.u64 %r123,%r314,40; add.u64 %r132,%r215,%r123; add.u64 %r121,%r217,%r123; shl.b64 %r315,%r22,3; add.u64 %r133,%r315,-8; cvt.u32.u32 %r317,%r23; cvt.s64.s8 %r316,%r317; shl.b64 %r318,%r316,3; add.u64 %r67,%r318,-8; add.u64 %r433,%frame,240; add.u64 %r432,%frame,360; .loc 1 281 12 mov.u64 %r446,0; $L66: .loc 1 276 18 add.u64 %r320,%r433,%r133; ld.u64 %r321,[%r132]; st.u64 [%r320],%r321; .loc 1 277 18 add.u64 %r322,%frame,%r133; .loc 1 277 20 ld.u64 %r324,[%r121]; mul.lo.u64 %r323,%r324,%r34; .loc 1 277 18 st.u64 [%r322],%r323; .loc 1 278 19 ld.u64 %r326,[%r132+16]; add.u64 %r325,%r326,1; ld.u64 %r327,[%r132+8]; sub.u64 %r63,%r325,%r327; .loc 1 280 10 setp.lt.s64 %r328,%r63,0; @ %r328 bra $L64; .loc 1 278 17 add.u64 %r330,%r432,%r133; st.u64 [%r330],%r63; bra $L65; $L64: .loc 1 281 12 add.u64 %r332,%r432,%r133; st.u64 [%r332],%r446; $L65: .loc 1 274 3 add.u64 %r132,%r132,24; add.u64 %r121,%r121,24; add.u64 %r133,%r133,8; setp.ne.u64 %r334,%r67,%r133; @ %r334 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r335,%r135,0; @ %r335 bra $L68; add.u64 %r432,%frame,360; $L86: add.u64 %r181,%r214,40; mov.u64 %r179,%r432; cvt.u32.u32 %r337,%r23; cvt.s64.s8 %r336,%r337; add.u64 %r339,%r336,%r336; add.u64 %r340,%r339,%r336; shl.b64 %r341,%r340,3; add.u64 %r342,%r214,16; add.u64 %r144,%r341,%r342; .loc 1 291 10 mov.u64 %r125,1; .loc 1 295 4 mov.u64 %r361,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r343,%r23,-2; cvt.s64.s32 %r72,%r343; add.u64 %r345,%r72,%r72; add.u64 %r346,%r345,%r72; shl.b64 %r347,%r346,3; add.u64 %r348,%r214,%r347; .loc 1 299 67 shl.b64 %r350,%r72,3; add.u64 %r351,%frame,%r350; .loc 1 299 59 ld.u64 %r353,[%r348+40]; ld.u64 %r354,[%r351+360]; mul.lo.u64 %r143,%r353,%r354; .loc 1 301 24 mov.u64 %r355,0; st.u64 [%r214+8],%r355; .loc 1 302 28 cvt.u16.u32 %r358,%r23; add.u16 %r357,%r358,-1; cvt.u32.u16 %r359,%r357; st.u8 [%r214+28],%r359; .loc 1 304 10 setp.eq.u64 %r360,%r143,0; @ %r360 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r125,%r69,%r125; $L69: .loc 1 295 4 st.u64 [%r181+8],%r361; ld.u64 %r69,[%r179]; add.u64 %r362,%r69,-1; st.u64 [%r181+16],%r362; st.u64 [%r181],%r125; .loc 1 288 7 add.u64 %r181,%r181,24; add.u64 %r179,%r179,8; setp.ne.u64 %r363,%r144,%r181; @ %r363 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r214+48],%r143; mov.u64 %r365,-1; st.u64 [%r214+56],%r365; mov.u64 %r366,1; st.u64 [%r214+40],%r366; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r368,16368; call (%value_in),_gfortrani_xmallocarray69,[%value_in]; } .loc 1 311 22 st.u64 [%r214],%r369; bra $L73; $L63: .loc 1 316 19 ld.s8 %r371,[%r214+28]; .loc 1 316 10 setp.eq.u64 %r372,%r371,%r135; @ 
%r372 bra $L74; .loc 1 317 2 cvta.const.u64 %r373,$LC373_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r375,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r376,[%r375+36]; setp.eq.u32 %r377,%r376,0; @ %r377 bra $L73; .loc 1 321 4 add.u64 %r432,%frame,360; cvta.const.u64 %r381,$LC2r4380381; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r385,$LC738381; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r387,%r135,0; @ %r387 bra $L75; $L78: .loc 1 336 8 ld.u64 %r150,[%r214]; .loc 1 337 8 ld.u64 %r148,[%r215]; .loc 1 339 9 setp.ne.u64 %r388,%r148,0; @ %r388 bra $L76; bra $L48; $L75: add.u64 %r194,%frame,480; add.u64 %r193,%r214,40; add.u64 %r191,%frame,120; cvt.u32.u32 %r390,%r23; cvt.s64.s8 %r389,%r390; add.u64 %r182,%r389,-1; .loc 1 328 3 mov.u64 %r155,0; add.u64 %r432,%frame,360; .loc 1 330 16 mov.u64 %r391,%r155; $L77: st.u64 [%r194],%r391; .loc 1 331 18 ld.u64 %r392,[%r193]; st.u64 [%r191],%r392; .loc 1 332 17 shl.b64 %r394,%r155,3; add.u64 %r395,%r432,%r394; .loc 1 332 10 ld.u64 %r396,[%r395]; setp.le.s64 %r397,%r396,0; @ %r397 bra $L48; .loc 1 328 26 add.u64 %r155,%r155,1; .loc 1 328 3 add.u64 %r194,%r194,8; add.u64 %r193,%r193,24; add.u64 %r191,%r191,8; setp.ne.u64 %r398,%r155,%r182; @ %r398 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r43,%r41,%r34; .loc 1 349 32 shl.b64 %r87,%r139,4; ld.u64 %r151,[%frame+480]; .loc 1 359 22 ld.u64 %r90,[%frame+240]; .loc 1 359 12 shl.b64 %r91,%r90,4; .loc 1 360 23 ld.u64 %r93,[%frame]; .loc 1 361 22 ld.u64 %r95,[%frame+120]; .loc 1 361 12 shl.b64 %r96,%r95,4; .loc 1 363 32 ld.u64 %r81,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r81,%r90; .loc 1 371 24 mul.lo.u64 %r399,%r81,%r93; .loc 1 371 10 neg.s64 %r161,%r399; .loc 1 372 23 mul.lo.u64 %r400,%r81,%r95; .loc 1 372 9 shl.b64 %r401,%r400,4; neg.s64 %r165,%r401; setp.le.s64 %r437,%r135,1; cvt.u32.u32 %r439,%r23; cvt.s64.s8 %r440,%r439; add.u64 %r441,%r440,-1; add.u64 %r442,%frame,240; add.u64 %r443,%frame,360; add.u64 %r444,%frame,120; .loc 1 367 13 mov.u64 %r445,0; $L87: .loc 1 328 3 mov.u64 %r154,%r149; mov.u64 %r153,%r148; mov.f64 %r119,0d0000000000000000; mov.f64 %r120,0d3ff0000000000000; .loc 1 349 9 mov.u64 %r152,0; $L80: .loc 1 352 6 ld.s8 %r403,[%r154]; cvt.u16.u32 %r402,%r403; setp.eq.u16 %r404,%r402,0; @ %r404 bra $L79; ld.f64 %r163,[%r153]; ld.f64 %r170,[%r153+8]; .loc 1 353 12 mul.f64 %r174,%r120,%r170; mul.f64 %r405,%r119,%r170; neg.f64 %r406,%r405; fma.rn.f64 %r120,%r120,%r163,%r406; fma.rn.f64 %r119,%r119,%r163,%r174; $L79: .loc 1 349 24 add.u64 %r152,%r152,1; .loc 1 349 32 add.u64 %r153,%r153,%r87; .loc 1 349 47 add.u64 %r154,%r154,%r43; .loc 1 349 2 setp.ne.u64 %r407,%r136,%r152; @ %r407 bra $L80; .loc 1 355 8 st.f64 [%r150],%r120; st.f64 [%r150+8],%r119; .loc 1 358 15 add.u64 %r151,%r151,1; .loc 1 359 12 add.u64 %r148,%r148,%r91; .loc 1 360 13 add.u64 %r149,%r149,%r93; .loc 1 361 12 add.u64 %r150,%r150,%r96; .loc 1 363 13 setp.ne.u64 %r408,%r81,%r151; @ %r408 bra $L87; .loc 1 371 10 add.u64 %r145,%r149,%r161; .loc 1 372 9 add.u64 %r146,%r150,%r165; .loc 1 374 7 @ ! 
%r437 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r127],%r445; .loc 1 370 23 mul.lo.u64 %r99,%r110,%r109; .loc 1 371 24 mul.lo.u64 %r411,%r112,%r109; .loc 1 371 10 sub.u64 %r145,%r149,%r411; .loc 1 372 23 mul.lo.u64 %r412,%r114,%r109; .loc 1 372 9 shl.b64 %r413,%r412,4; sub.u64 %r146,%r150,%r413; .loc 1 373 5 add.u64 %r147,%r147,1; .loc 1 374 7 add.u64 %r127,%r127,8; add.u64 %r178,%r178,8; setp.ne.u64 %r414,%r147,%r441; @ %r414 bra $L84; bra $L48; $L108: add.u64 %r127,%frame,488; .loc 1 370 23 mov.u64 %r99,%r38; .loc 1 374 7 mov.u64 %r178,8; .loc 1 373 5 mov.u64 %r147,1; $L84: .loc 1 382 16 ld.u64 %r418,[%r127]; add.u64 %r109,%r418,1; st.u64 [%r127],%r109; .loc 1 383 23 add.u64 %r420,%r442,%r178; ld.u64 %r110,[%r420]; .loc 1 383 13 sub.u64 %r421,%r110,%r99; shl.b64 %r422,%r421,4; add.u64 %r148,%r148,%r422; .loc 1 384 24 add.u64 %r423,%frame,%r178; ld.u64 %r112,[%r423]; .loc 1 384 14 add.u64 %r149,%r145,%r112; .loc 1 385 23 add.u64 %r425,%r444,%r178; ld.u64 %r114,[%r425]; .loc 1 385 13 shl.b64 %r426,%r114,4; add.u64 %r150,%r146,%r426; .loc 1 363 32 add.u64 %r428,%r443,%r178; ld.u64 %r117,[%r428]; .loc 1 363 13 setp.eq.u64 %r429,%r109,%r117; @ %r429 bra $L85; .loc 1 367 13 mov.u64 %r151,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r430,[%r214]; setp.eq.u64 %r431,%r430,0; @ ! %r431 bra $L63; bra $L86; $L48: .loc 1 389_gfortran_sproduct_c8 .visible .func _gfortran_sproduct_c88u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64 %r276; .reg .pred %r277; .reg .pred %r280; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_product_c8.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r297,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r297; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r297; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r304,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r304; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r296,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r296,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r303,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r303; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r297,0; @ %r211 bra $L127; add.u64 %r296,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r244,1696244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r297,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r297; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r297,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; 
$L132: .loc 1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r297,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r297,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,4; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; ld.const.f64 %r293,[$LC4]; ld.const.f64 %r294,[$LC4+8]; setp.le.s64 %r295,%r297,1; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; .loc 1 520 13 mov.u64 %r302,0; $L146: .loc 1 512 13 st.f64 [%r98],%r293; st.f64 [%r98+8],%r294; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L146; .loc 1 525 7 @ ! %r295 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r302; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r301; @ %r283 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; $L144: .loc 1 529 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r289,%r75,%r72; shl.b64 %r290,%r289,4; add.u64 %r98,%r98,%r290; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r292,%r98,0; @ ! 
%r292 bra $L122; bra $L147; $L109: .loc 1 534 product_c10sum_i1.o/_gfortran_sum_i1 .visible .func _gfortran_sum_i1fortran/generated/sum_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_msum_i1 .visible .func _gfortran_msum_i1gfortran_ssum_i1 .visible .func _gfortran_ssum_i1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10] = {83,85,77114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,85,77,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_sum_i1 .visible .func _gfortran_sum_i148u64 %r111; .reg .u64predu64 %r229; .reg .u64u64pred %r247; .reg .pred %r249; .reg .predu64u64 %r334; .reg .u64 %r336; .reg .predpred %r343; .reg .u32u64 %r353; .reg .u16 %r354; mov.u64 %r174,%ar0; mov.u64 %r175,%ar1; mov.u64 %r176,%ar2; .loc 1 55 10 ld.s8 %r22,[%r175+28]; .loc 1 55 38 add.u32 %r177,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r107,%r177; .loc 1 56 10 ld.u64 %r25,[%r176]; .loc 1 56 7 add.u64 %r108,%r25,-1; .loc 1 58 7 shr.u64 %r179,%r108,63; set.u32.lt.s64 %r181,%r107,%r108; neg.s32 %r182,%r181; cvt.u16.u64 %r185,%r179; cvt.u16.u32 %r186,%r182; or.b16 %r184,%r185,%r186; .loc 1 58 6 cvt.u32.u16 %r187,%r184; cvt.u16.u8 %r188,%r187; setp.eq.u16 %r189,%r188,0; @ %r189 bra $L2; .loc 1 60 7 cvt.u32.u32 %r192,%r22; cvt.s64.s8 %r191,%r192; st.u64 [%stack+8],%r19190,$LC0;stack; call _gfortran_runtime_errorr195,%r108,%r108; add.u64 %r196,%r195,%r108; shl.b64 %r197,%r196,3; add.u64 %r198,%r175,%r197; ld.u64 %r30,[%r198+56]; ld.u64 %r32,[%r198+48]; .loc 1 68 9 ld.u64 %r111,[%r198+40]; .loc 1 70 3 setp.ne.u64 %r212,%r108,0; @ %r212 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r213,%r107,%r108; @ %r213 bra $L4; bra $L46; $L3: add.u64 %r125,%r175,40; add.u64 %r172,%frame,120; add.u64 %r170,%frame,240; add.u64 %r214,%r175,16; add.u64 %r216,%r25,%r25; add.u64 %r217,%r216,%r25; shl.b64 %r218,%r217,3; add.u64 %r79,%r214,%r218; .loc 1 76 12 mov.u64 %r353,0; $L8: .loc 1 72 18 ld.u64 %r219,[%r125]; st.u64 [%r172],%r219; .loc 1 73 19 ld.u64 %r221,[%r125+16]; add.u64 %r220,%r221,1; ld.u64 %r222,[%r125+8]; sub.u64 %r37,%r220,%r222; .loc 1 75 10 setp.lt.s64 %r223,%r37,0; @ %r223 bra $L6; .loc 1 73 17 st.u64 [%r170],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r170],%r353; $L7: .loc 1 70 3 add.u64 
%r125,%r125,24; add.u64 %r172,%r172,8; add.u64 %r170,%r170,8; setp.ne.u64 %r225,%r79,%r125; @ %r225 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r226,[%r174]; setp.eq.u64 %r227,%r226,0; @ ! %r227 bra $L11; bra $L10; $L4: add.u64 %r229,%r25,%r25; add.u64 %r230,%r229,%r25; shl.b64 %r231,%r230,3; add.u64 %r232,%r231,40; add.u64 %r117,%r175,%r232; shl.b64 %r233,%r25,3; add.u64 %r97,%r233,-8; add.u64 %r340,%frame,120; add.u64 %r100,%r340,%r97; add.u64 %r341,%frame,240; add.u64 %r95,%r341,%r97; cvt.u32.u32 %r237,%r22; cvt.s64.s8 %r236,%r237; add.u64 %r239,%r236,%r236; add.u64 %r240,%r239,%r236; shl.b64 %r241,%r240,3; add.u64 %r242,%r175,40; add.u64 %r94,%r241,%r242; .loc 1 84 12 mov.u64 %r352,0; $L14: .loc 1 80 18 ld.u64 %r243,[%r117]; st.u64 [%r100],%r243; .loc 1 81 19 ld.u64 %r245,[%r117+16]; add.u64 %r244,%r245,1; ld.u64 %r246,[%r117+8]; sub.u64 %r44,%r244,%r246; .loc 1 83 10 setp.lt.s64 %r247,%r44,0; @ %r247 bra $L12; .loc 1 81 17 st.u64 [%r95],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r95],%r352; $L13: .loc 1 78 3 add.u64 %r117,%r117,24; add.u64 %r100,%r100,8; add.u64 %r95,%r95,8; setp.ne.u64 %r249,%r94,%r117; @ %r249 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r250,%r107,0; @ %r250 bra $L16; add.u64 %r341,%frame,240; $L35: add.u64 %r147,%r174,40; mov.u64 %r138,%r341; cvt.u32.u32 %r252,%r22; cvt.s64.s8 %r251,%r252; add.u64 %r254,%r251,%r251; add.u64 %r255,%r254,%r251; shl.b64 %r256,%r255,3; add.u64 %r257,%r174,16; add.u64 %r126,%r256,%r257; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r280,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r258,0; st.u64 [%r174+8],%r258; .loc 1 103 28 cvt.u16.u32 %r261,%r22; add.u16 %r260,%r261,-1; cvt.u32.u16 %r262,%r260; st.u8 [%r174+28],%r262; .loc 1 105 20 add.u32 %r263,%r22,-2; cvt.s64.s32 %r55,%r263; add.u64 %r265,%r55,%r55; add.u64 %r266,%r265,%r55; shl.b64 %r267,%r266,3; add.u64 %r268,%r174,%r267; .loc 1 105 67 shl.b64 %r270,%r55,3; add.u64 %r271,%frame,%r270; .loc 1 105 59 ld.u64 %r273,[%r268+40]; ld.u64 %r274,[%r271+240]; mul.lo.u64 %r113,%r273,%r274; .loc 1 107 29 mov.u64 %r27611(%value_in),_gfortrani_xmallocarray277,[%value_in]; } .loc 1 107 27 st.u64 [%r174],%r277; .loc 1 108 10 setp.eq.u64 %r279,%r113,0; @ ! 
%r279 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r51,%r99; $L17: .loc 1 98 4 st.u64 [%r147+8],%r280; ld.u64 %r51,[%r138]; add.u64 %r281,%r51,-1; st.u64 [%r147+16],%r281; st.u64 [%r147],%r99; .loc 1 91 7 add.u64 %r147,%r147,24; add.u64 %r138,%r138,8; setp.ne.u64 %r282,%r126,%r147; @ %r282 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r174+48],%r113; mov.u64 %r284,-1; st.u64 [%r174+56],%r284; st.u64 [%r174+40],%r276; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r62,[%r174+28]; .loc 1 118 10 setp.eq.u64 %r286,%r62,%r107; @ %r286 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r107;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r289,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r290,[%r289+36]; setp.eq.u32 %r291,%r290,0; @ %r291 bra $L21; .loc 1 125 2 add.u64 %r341,%frame,240; cvta.const.u64 %r295,$LC2r34129495; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r297,%r107,0; @ %r297 bra $L24; $L27: .loc 1 65 9 add.u64 %r298,%r30,1; .loc 1 65 7 sub.u64 %r109,%r298,%r32; max.s64 %r45,%r109,0; .loc 1 137 8 ld.u64 %r121,[%r175]; .loc 1 138 8 ld.u64 %r122,[%r174]; ld.u64 %r47,[%frame+360]; .loc 1 166 22 ld.u64 %r74,[%frame+120]; .loc 1 167 22 ld.u64 %r76,[%frame]; setp.gt.s64 %r342,%r109,0; setp.le.s64 %r343,%r107,1; cvt.u32.u32 %r345,%r22; cvt.s64.s8 %r346,%r345; add.u64 %r347,%r346,-1; add.u64 %r348,%frame,120; add.u64 %r349,%frame,240; .loc 1 173 13 mov.u64 %r350,0; bra $L25; $L24: add.u64 %r160,%frame,360; add.u64 %r159,%r174,40; mov.u64 %r157,%frame; cvt.u32.u32 %r300,%r22; cvt.s64.s8 %r299,%r300; add.u64 %r148,%r299,-1; .loc 1 129 3 mov.u64 %r123,0; add.u64 %r341,%frame,240; .loc 1 131 16 mov.u64 %r301,%r123; $L26: st.u64 [%r160],%r301; .loc 1 132 18 ld.u64 %r302,[%r159]; st.u64 [%r157],%r302; .loc 1 133 17 shl.b64 %r304,%r123,3; add.u64 %r305,%r341,%r304; .loc 1 133 10 ld.u64 %r306,[%r305]; setp.le.s64 %r307,%r306,0; @ %r307 bra $L1; .loc 1 129 26 add.u64 %r123,%r123,1; .loc 1 129 3 add.u64 %r160,%r160,8; add.u64 %r159,%r159,24; add.u64 %r157,%r157,8; setp.ne.u64 %r308,%r123,%r148; @ %r308 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r342 bra $L28; .loc 1 150 10 mov.u32 %r310,0; st.u8 [%r122],%r310; bra $L29; $L28: .loc 1 154 36 mov.u64 %r116,%r121; .loc 1 148 10 mov.u32 %r114,0; .loc 1 154 13 mov.u64 %r115,0; $L30: .loc 1 158 10 ld.u8 %r354,[%r116]; mov.u16 %r314,%r354; cvt.u16.u32 %r315,%r114; add.u16 %r313,%r314,%r315; cvt.u32.u16 %r316,%r313; cvt.s32.s8 %r114,%r316; .loc 1 154 28 add.u64 %r115,%r115,1; .loc 1 154 36 add.u64 %r116,%r116,%r111; .loc 1 154 6 setp.gt.s64 %r318,%r45,%r115; @ %r318 bra $L30; .loc 1 161 12 cvt.u32.u32 %r319,%r114; st.u8 [%r122],%r319; $L29: .loc 1 165 15 add.u64 %r47,%r47,1; .loc 1 166 12 add.u64 %r121,%r121,%r74; .loc 1 167 12 add.u64 %r122,%r122,%r76; .loc 1 169 32 ld.u64 %r128,[%frame+240]; .loc 1 169 13 setp.ne.u64 %r320,%r47,%r128; @ %r320 bra $L25; .loc 1 176 23 mul.lo.u64 %r321,%r74,%r47; .loc 1 176 9 sub.u64 %r118,%r121,%r321; .loc 1 177 23 mul.lo.u64 %r322,%r76,%r47; .loc 1 177 9 sub.u64 %r119,%r122,%r322; .loc 1 179 7 @ %r343 bra $L1; add.u64 %r143,%frame,368; mov.u64 %r106,8; .loc 1 178 5 mov.u64 %r120,1; bra $L33; $L34: .loc 1 173 13 st.u64 [%r143],%r350; .loc 1 176 23 mul.lo.u64 %r328,%r89,%r88; .loc 1 176 9 sub.u64 %r118,%r121,%r328; .loc 1 177 23 mul.lo.u64 %r329,%r91,%r88; .loc 1 177 9 sub.u64 %r119,%r122,%r329; .loc 1 178 5 add.u64 %r120,%r120,1; .loc 1 179 7 add.u64 %r143,%r143,8; add.u64 %r106,%r106,8; setp.eq.u64 %r330,%r120,%r347; @ %r330 bra $L1; $L33: .loc 1 187 16 ld.u64 %r331,[%r143]; 
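The loop at $L30 above (source lines 154-161) is the core reduction of _gfortran_sum_i1: it walks the elements along the summed dimension with a fixed stride (the element delta in %r111), accumulates them, and stores the single 8-bit result. The following is a minimal C sketch under those assumptions; the name sum_i1_inner and its parameters are illustrative, and the outer loops that step the source/destination pointers through the remaining dimensions are omitted.

#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Sum `len` int8 elements spaced `delta` elements apart into *dest,
   keeping the running value in 8 bits as the PTX does. */
static void sum_i1_inner(int8_t *dest, const int8_t *src,
                         ptrdiff_t delta, ptrdiff_t len)
{
    int8_t result = 0;
    for (ptrdiff_t n = 0; n < len; n++, src += delta)
        result = (int8_t)(result + *src);
    *dest = result;
}

int main(void)
{
    int8_t a[6] = { 1, 2, 3, 4, 5, 6 };  /* 2x3 array, column-major */
    int8_t out[3];
    for (int col = 0; col < 3; col++)    /* sum along dim 1: len 2, delta 1 */
        sum_i1_inner(&out[col], &a[2 * col], 1, 2);
    printf("%d %d %d\n", out[0], out[1], out[2]);  /* 3 7 11 */
    return 0;
}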
add.u64 %r88,%r331,1; st.u64 [%r143],%r88; .loc 1 188 23 add.u64 %r333,%r348,%r106; ld.u64 %r89,[%r333]; .loc 1 188 13 add.u64 %r121,%r118,%r89; .loc 1 189 23 add.u64 %r334,%frame,%r106; ld.u64 %r91,[%r334]; .loc 1 189 13 add.u64 %r122,%r119,%r91; .loc 1 169 32 add.u64 %r336,%r349,%r106; ld.u64 %r93,[%r336]; .loc 1 169 13 setp.eq.u64 %r337,%r88,%r93; @ %r337 bra $L34; .loc 1 173 13 mov.u64 %r47,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r338,[%r174]; setp.eq.u64 %r339,%r338,0; @ ! %r339 bra $L11; bra $L35; $L1_gfortran_msum_i1 .visible .func _gfortran_msum_608pred %r210; .reg .u32 %r214; .reg .u64 %r216; .reg .u32 %r218; .reg .u32 %r219; .reg .u16 %r221; .reg .u16 %r222; .reg .u16 %r223; .reg .u32 %r224; .reg .u16 %r225; .reg .predu32 %r259; .reg .u16u64 %r280; .reg .u64u64predu64u64 %r333; .reg .u64 %r334; .reg .u32predpredu64 %r377; .reg .pred %r379; .reg .pred %r380; .reg .u64 %r381; .reg .u32pred %r389; .reg .predu32 %r399; .reg .predpredmov.u64 %r206,%ar0; mov.u64 %r207,%ar1; mov.u64 %r208,%ar2; mov.u64 %r209,%ar3; .loc 1 223 6 setp.ne.u64 %r210,%r209,0; @ %r210 bra $L49; .loc 1 228 7 {_gfortran_sum_.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r208]; .loc 1 233 7 add.u64 %r135,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r207+28]; .loc 1 234 38 add.u32 %r214,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r136,%r214; .loc 1 237 7 shr.u64 %r216,%r135,63; set.u32.gt.s64 %r218,%r135,%r136; neg.s32 %r219,%r218; cvt.u16.u64 %r222,%r216; cvt.u16.u32 %r223,%r219; or.b16 %r221,%r222,%r223; .loc 1 237 6 cvt.u32.u16 %r224,%r221; cvt.u16.u8 %r225,%r224; setp.eq.u16 %r226,%r225,0; @ %r226 bra $L51; .loc 1 239 7 cvt.u32.u32 %r229,%r23; cvt.s64.s8 %r228,%r229; st.u64 [%stack+8],%r228; st.u64 [%stack],%r22; cvta.const.u64 %r227_gfortran_runtime_error1: .loc 1 244 9 add.u64 %r429,%r135,%r135; add.u64 %r430,%r429,%r135; shl.b64 %r234,%r430,3; add.u64 %r235,%r207,%r234; ld.u64 %r238,[%r235+56]; add.u64 %r237,%r238,1; .loc 1 244 7 ld.u64 %r245,[%r235+48]; sub.u64 %r137,%r237,%r245; .loc 1 245 6 setp.le.s64 %r246,%r137,0; @ %r246 bra $L48; .loc 1 248 9 ld.u64 %r149,[%r209]; .loc 1 250 15 ld.u64 %r34,[%r209+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r247,%r35,-4; and.b32 %r248,%r247,-5; set.u32.eq.u32 %r250,%r248,0; neg.s32 %r251,%r250; .loc 1 252 22 add.u32 %r252,%r35,-1; set.u32.le.u32 %r254,%r252,1; neg.s32 %r255,%r254; .loc 1 252 6 cvt.u16.u32 %r257,%r251; cvt.u16.u32 %r258,%r255; or.b16 %r256,%r257,%r258; cvt.u32.u16 %r259,%r256; cvt.u16.u8 %r260,%r259; setp.ne.u16 %r261,%r260,0; @ %r261 bra $L53; .loc 1 254 7 setp.ne.u32 %r263,%r35,16; @ %r263 bra $L54; $L53: .loc 1 261 9 shl.b64 %r267,%r430,3; add.u64 %r268,%r207,%r267; ld.u64 %r140,[%r268+40]; .loc 1 262 12 add.u64 %r274,%r209,%r267; ld.u64 %r42,[%r274+40]; .loc 1 264 3 setp.ne.u64 %r276,%r135,0; @ %r276 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r277,%r135,%r136; @ %r277 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r27827_gfortran_runtime_error5: add.u64 %r190,%r207,40; add.u64 %r192,%r209,40; add.u64 %r280,%r207,16; add.u64 %r282,%r22,%r22; add.u64 %r283,%r282,%r22; shl.b64 %r284,%r283,3; add.u64 %r203,%r280,%r284; .loc 1 264 3 mov.u64 %r194,0; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 271 12 mov.u64 %r441,%r194; $L60: .loc 1 266 18 add.u64 %r286,%r427,%r194; ld.u64 %r287,[%r190]; st.u64 [%r286],%r287; .loc 1 267 18 add.u64 %r288,%frame,%r194; .loc 1 267 20 ld.u64 %r290,[%r192]; mul.lo.u64 %r289,%r290,%r34; .loc 1 267 18 st.u64 [%r288],%r289; .loc 1 268 19 ld.u64 %r292,[%r190+16]; add.u64 %r291,%r292,1; ld.u64 
%r293,[%r190+8]; sub.u64 %r53,%r291,%r293; .loc 1 270 10 setp.lt.s64 %r294,%r53,0; @ %r294 bra $L58; .loc 1 268 17 add.u64 %r296,%r426,%r194; st.u64 [%r296],%r53; bra $L59; $L58: .loc 1 271 12 add.u64 %r298,%r426,%r194; st.u64 [%r298],%r441; $L59: .loc 1 264 3 add.u64 %r190,%r190,24; add.u64 %r192,%r192,24; add.u64 %r194,%r194,8; setp.ne.u64 %r300,%r190,%r203; @ %r300 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r301,[%r206]; setp.eq.u64 %r302,%r301,0; @ ! %r302 bra $L63; bra $L62; $L56: add.u64 %r304,%r22,%r22; add.u64 %r305,%r304,%r22; shl.b64 %r306,%r305,3; add.u64 %r155,%r306,40; add.u64 %r58,%r207,%r155; add.u64 %r157,%r209,%r155; shl.b64 %r307,%r22,3; add.u64 %r103,%r307,-8; cvt.u32.u32 %r309,%r23; cvt.s64.s8 %r308,%r309; shl.b64 %r310,%r308,3; add.u64 %r187,%r310,-8; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 281 12 mov.u64 %r440,0; $L66: .loc 1 276 18 add.u64 %r312,%r427,%r103; ld.u64 %r313,[%r58]; st.u64 [%r312],%r313; .loc 1 277 18 add.u64 %r314,%frame,%r103; .loc 1 277 20 ld.u64 %r316,[%r157]; mul.lo.u64 %r315,%r316,%r34; .loc 1 277 18 st.u64 [%r314],%r315; .loc 1 278 19 ld.u64 %r318,[%r58+16]; add.u64 %r317,%r318,1; ld.u64 %r319,[%r58+8]; sub.u64 %r64,%r317,%r319; .loc 1 280 10 setp.lt.s64 %r320,%r64,0; @ %r320 bra $L64; .loc 1 278 17 add.u64 %r322,%r426,%r103; st.u64 [%r322],%r64; bra $L65; $L64: .loc 1 281 12 add.u64 %r324,%r426,%r103; st.u64 [%r324],%r440; $L65: .loc 1 274 3 add.u64 %r58,%r58,24; add.u64 %r157,%r157,24; add.u64 %r103,%r103,8; setp.ne.u64 %r326,%r103,%r187; @ %r326 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r327,%r136,0; @ %r327 bra $L68; add.u64 %r426,%frame,360; $L86: add.u64 %r127,%r206,40; mov.u64 %r121,%r426; cvt.u32.u32 %r329,%r23; cvt.s64.s8 %r328,%r329; add.u64 %r331,%r328,%r328; add.u64 %r332,%r331,%r328; shl.b64 %r333,%r332,3; add.u64 %r334,%r206,16; add.u64 %r72,%r333,%r334; .loc 1 291 10 mov.u64 %r123,1; .loc 1 295 4 mov.u64 %r353,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r335,%r23,-2; cvt.s64.s32 %r70,%r335; add.u64 %r337,%r70,%r70; add.u64 %r338,%r337,%r70; shl.b64 %r339,%r338,3; add.u64 %r340,%r206,%r339; .loc 1 299 67 shl.b64 %r342,%r70,3; add.u64 %r343,%frame,%r342; .loc 1 299 59 ld.u64 %r345,[%r340+40]; ld.u64 %r346,[%r343+360]; mul.lo.u64 %r143,%r345,%r346; .loc 1 301 24 mov.u64 %r347,0; st.u64 [%r206+8],%r347; .loc 1 302 28 cvt.u16.u32 %r350,%r23; add.u16 %r349,%r350,-1; cvt.u32.u16 %r351,%r349; st.u8 [%r206+28],%r351; .loc 1 304 10 setp.eq.u64 %r352,%r143,0; @ %r352 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r123,%r67,%r123; $L69: .loc 1 295 4 st.u64 [%r127+8],%r353; ld.u64 %r67,[%r121]; add.u64 %r354,%r67,-1; st.u64 [%r127+16],%r354; st.u64 [%r127],%r123; .loc 1 288 7 add.u64 %r127,%r127,24; add.u64 %r121,%r121,8; setp.ne.u64 %r355,%r72,%r127; @ %r355 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r206+48],%r143; mov.u64 %r357,-1; st.u64 [%r206+56],%r357; mov.u64 %r358,1; st.u64 [%r206+40],%r358; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r360360; call (%value_in),_gfortrani_xmallocarray61,[%value_in]; } .loc 1 311 22 st.u64 [%r206],%r361; bra $L73; $L63: .loc 1 316 19 ld.s8 %r363,[%r206+28]; .loc 1 316 10 setp.eq.u64 %r364,%r363,%r136; @ %r364 bra $L74; .loc 1 317 2 cvta.const.u64 %r365365_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r367,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r368,[%r367+36]; setp.eq.u32 %r369,%r368,0; @ %r369 bra $L73; .loc 1 321 4 add.u64 %r426,%frame,360; cvta.const.u64 %r373,$LC2; cvta.const.u64 %r372,$LC3; {372373; call 
_gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r377,$LC377373; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r379,%r136,0; @ %r379 bra $L75; $L78: .loc 1 336 8 ld.u64 %r150,[%r206]; .loc 1 337 8 ld.u64 %r148,[%r207]; .loc 1 339 9 setp.ne.u64 %r380,%r148,0; @ %r380 bra $L76; bra $L48; $L75: add.u64 %r173,%frame,480; add.u64 %r170,%r206,40; add.u64 %r163,%frame,120; cvt.u32.u32 %r382,%r23; cvt.s64.s8 %r381,%r382; add.u64 %r128,%r381,-1; .loc 1 328 3 mov.u64 %r154,0; add.u64 %r426,%frame,360; .loc 1 330 16 mov.u64 %r383,%r154; $L77: st.u64 [%r173],%r383; .loc 1 331 18 ld.u64 %r384,[%r170]; st.u64 [%r163],%r384; .loc 1 332 17 shl.b64 %r386,%r154,3; add.u64 %r387,%r426,%r386; .loc 1 332 10 ld.u64 %r388,[%r387]; setp.le.s64 %r389,%r388,0; @ %r389 bra $L48; .loc 1 328 26 add.u64 %r154,%r154,1; .loc 1 328 3 add.u64 %r173,%r173,8; add.u64 %r170,%r170,24; add.u64 %r163,%r163,8; setp.ne.u64 %r390,%r128,%r154; @ %r390 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r44,%r42,%r34; ld.u64 %r176,[%frame+480]; .loc 1 359 22 ld.u64 %r89,[%frame+240]; .loc 1 360 23 ld.u64 %r91,[%frame]; .loc 1 361 22 ld.u64 %r93,[%frame+120]; setp.le.s64 %r431,%r136,1; .loc 1 348 10 mov.u32 %r433,0; cvt.u32.u32 %r434,%r23; cvt.s64.s8 %r435,%r434; add.u64 %r436,%r435,-1; add.u64 %r437,%frame,240; add.u64 %r438,%frame,360; add.u64 %r439,%frame,120; $L87: .loc 1 328 3 mov.u64 %r153,%r149; mov.u64 %r152,%r148; .loc 1 348 10 mov.u32 %r125,%r433; .loc 1 349 9 mov.u64 %r151,0; $L80: .loc 1 352 6 ld.s8 %r392,[%r153]; cvt.u16.u32 %r391,%r392; setp.eq.u16 %r393,%r391,0; @ %r393 bra $L79; .loc 1 353 12 ld.u8 %r442,[%r152]; mov.u16 %r397,%r442; cvt.u16.u32 %r398,%r125; add.u16 %r396,%r397,%r398; cvt.u32.u16 %r399,%r396; cvt.s32.s8 %r125,%r399; $L79: .loc 1 349 24 add.u64 %r151,%r151,1; .loc 1 349 32 add.u64 %r152,%r152,%r140; .loc 1 349 47 add.u64 %r153,%r153,%r44; .loc 1 349 2 setp.ne.u64 %r401,%r137,%r151; @ %r401 bra $L80; .loc 1 355 8 cvt.u32.u32 %r402,%r125; st.u8 [%r150],%r402; .loc 1 358 15 add.u64 %r176,%r176,1; .loc 1 359 12 add.u64 %r148,%r148,%r89; .loc 1 360 13 add.u64 %r149,%r149,%r91; .loc 1 361 12 add.u64 %r150,%r150,%r93; .loc 1 363 32 ld.u64 %r80,[%frame+360]; .loc 1 363 13 setp.ne.u64 %r403,%r80,%r176; @ %r403 bra $L87; .loc 1 370 23 mul.lo.u64 %r97,%r176,%r89; .loc 1 371 24 mul.lo.u64 %r404,%r176,%r91; .loc 1 371 10 sub.u64 %r145,%r149,%r404; .loc 1 372 23 mul.lo.u64 %r405,%r176,%r93; .loc 1 372 9 sub.u64 %r146,%r150,%r405; .loc 1 374 7 @ ! 
%r431 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r40],%r432; .loc 1 370 23 mul.lo.u64 %r97,%r109,%r108; .loc 1 371 24 mul.lo.u64 %r408,%r111,%r108; .loc 1 371 10 sub.u64 %r145,%r149,%r408; .loc 1 372 23 mul.lo.u64 %r409,%r113,%r108; .loc 1 372 9 sub.u64 %r146,%r150,%r409; .loc 1 373 5 add.u64 %r147,%r147,1; .loc 1 374 7 add.u64 %r40,%r40,8; add.u64 %r167,%r167,8; setp.ne.u64 %r410,%r147,%r436; @ %r410 bra $L84; bra $L48; $L108: add.u64 %r40,%frame,488; mov.u64 %r167,8; .loc 1 373 5 mov.u64 %r147,1; .loc 1 367 13 mov.u64 %r432,0; $L84: .loc 1 382 16 ld.u64 %r414,[%r40]; add.u64 %r108,%r414,1; st.u64 [%r40],%r108; .loc 1 383 23 add.u64 %r416,%r437,%r167; ld.u64 %r109,[%r416]; .loc 1 383 13 sub.u64 %r417,%r109,%r97; add.u64 %r148,%r148,%r417; .loc 1 384 24 add.u64 %r418,%frame,%r167; ld.u64 %r111,[%r418]; .loc 1 384 14 add.u64 %r149,%r145,%r111; .loc 1 385 23 add.u64 %r420,%r439,%r167; ld.u64 %r113,[%r420]; .loc 1 385 13 add.u64 %r150,%r146,%r113; .loc 1 363 32 add.u64 %r422,%r438,%r167; ld.u64 %r116,[%r422]; .loc 1 363 13 setp.eq.u64 %r423,%r108,%r116; @ %r423 bra $L85; .loc 1 367 13 mov.u64 %r176,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r424,[%r206]; setp.eq.u64 %r425,%r424,0; @ ! %r425 bra $L63; bra $L86; $L48: .loc 1 389_gfortran_ssum_i1 .visible .func _gfortran_ssum_88pred %r184; .reg .predpred %r206; .reg .predu32 %r211; .reg .u64 %r213; .reg .u64u32 %r222; .reg .u64pred %r264; .reg .u64 %r266; .reg .u32 %r270; .reg .u64 %r272; .reg .u64 %r274; .reg .predpredu64 %r297; mov.u64 %r148,%ar0; mov.u64 %r149,%ar1; mov.u64 %r150,%ar2; mov.u64 %r151,%ar3; .loc 1 412 6 setp.eq.u64 %r152,%r151,0; @ %r152 bra $L110; .loc 1 412 20 ld.u32 %r153,[%r151]; setp.eq.u32 %r154,%r153,0; @ %r154 bra $L111; $L110: .loc 1 417 7gfortran_sum_.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r150]; .loc 1 422 7 add.u64 %r86,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r149+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r291,%r26; .loc 1 425 7 shr.u64 %r159,%r86,63; set.u32.gt.s64 %r161,%r86,%r291; neg.s32 %r162,%r161; cvt.u16.u64 %r165,%r159; cvt.u16.u32 %r166,%r162; or.b16 %r164,%r165,%r166; .loc 1 425 6 cvt.u32.u16 %r167,%r164; cvt.u16.u8 %r168,%r167; setp.ne.u16 %r169,%r168,0; @ %r169 bra $L113; .loc 1 432 3 setp.ne.u64 %r170,%r86,0; @ %r170 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r171,%r86,%r291; @ %r171 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r174,%r24; cvt.s64.s8 %r173,%r174; st.u64 [%stack+8],%r173172172_gfortran_runtime_error114: add.u64 %r135,%r149,48; add.u64 %r137,%frame,120; add.u64 %r138,%r149,56; add.u64 %r176,%r149,24; add.u64 %r178,%r23,%r23; add.u64 %r179,%r178,%r23; shl.b64 %r180,%r179,3; add.u64 %r144,%r176,%r180; .loc 1 437 12 mov.u64 %r297,0; $L119: .loc 1 434 19 ld.u64 %r182,[%r138]; add.u64 %r181,%r182,1; ld.u64 %r183,[%r135]; sub.u64 %r36,%r181,%r183; .loc 1 436 10 setp.le.s64 %r184,%r36,0; @ %r184 bra $L117; .loc 1 434 17 st.u64 [%r137],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r137],%r297; $L118: .loc 1 432 3 add.u64 %r135,%r135,24; add.u64 %r137,%r137,8; add.u64 %r138,%r138,24; setp.ne.u64 %r186,%r135,%r144; @ %r186 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r96,[%r148]; .loc 1 449 6 setp.eq.u64 %r187,%r96,0; @ ! 
%r187 bra $L122; bra $L121; $L115: add.u64 %r189,%r23,%r23; add.u64 %r190,%r189,%r23; shl.b64 %r191,%r190,3; add.u64 %r192,%r191,48; add.u64 %r57,%r149,%r192; add.u64 %r290,%frame,120; shl.b64 %r194,%r23,3; add.u64 %r195,%r194,-8; add.u64 %r67,%r290,%r195; add.u64 %r196,%r149,48; cvt.u32.u32 %r198,%r24; cvt.s64.s8 %r197,%r198; add.u64 %r200,%r197,%r197; add.u64 %r201,%r200,%r197; shl.b64 %r202,%r201,3; add.u64 %r131,%r196,%r202; .loc 1 446 12 mov.u64 %r296,0; $L125: .loc 1 443 2 ld.u64 %r204,[%r57+8]; add.u64 %r203,%r204,1; ld.u64 %r205,[%r57]; sub.u64 %r41,%r203,%r205; .loc 1 445 10 setp.le.s64 %r206,%r41,0; @ %r206 bra $L123; .loc 1 442 17 st.u64 [%r67],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r67],%r296; $L124: .loc 1 440 3 add.u64 %r57,%r57,24; add.u64 %r67,%r67,8; setp.ne.u64 %r208,%r57,%r131; @ %r208 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r209,%r291,0; @ %r209 bra $L127; add.u64 %r290,%frame,120; $L147: add.u64 %r102,%r148,40; mov.u64 %r84,%r290; cvt.u32.u32 %r211,%r24; cvt.s64.s8 %r210,%r211; add.u64 %r213,%r210,%r210; add.u64 %r214,%r213,%r210; shl.b64 %r215,%r214,3; add.u64 %r216,%r148,16; add.u64 %r60,%r215,%r216; .loc 1 456 10 mov.u64 %r82,1; .loc 1 460 4 mov.u64 %r235,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r217,0; st.u64 [%r148+8],%r217; .loc 1 465 28 cvt.u16.u32 %r220,%r24; add.u16 %r219,%r220,-1; cvt.u32.u16 %r221,%r219; st.u8 [%r148+28],%r221; .loc 1 467 20 add.u32 %r222,%r24,-2; cvt.s64.s32 %r50,%r222; add.u64 %r224,%r50,%r50; add.u64 %r225,%r224,%r50; shl.b64 %r226,%r225,3; add.u64 %r227,%r148,%r226; .loc 1 467 67 shl.b64 %r229,%r50,3; add.u64 %r230,%frame,%r229; .loc 1 467 59 ld.u64 %r232,[%r227+40]; ld.u64 %r233,[%r230+120]; mul.lo.u64 %r93,%r232,%r233; .loc 1 469 10 setp.eq.u64 %r234,%r93,0; @ %r234 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r82,%r46,%r82; $L128: .loc 1 460 4 st.u64 [%r102+8],%r235; ld.u64 %r46,[%r84]; add.u64 %r236,%r46,-1; st.u64 [%r102+16],%r236; st.u64 [%r102],%r82; .loc 1 453 7 add.u64 %r102,%r102,24;131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r148+48],%r93; mov.u64 %r239,-1; st.u64 [%r148+56],%r239; mov.u64 %r240,1; st.u64 [%r148+40],%r240; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r242(%value_in),_gfortrani_xmallocarray3,[%value_in]; } mov.u64 %r96,%r243; .loc 1 476 22 st.u64 [%r148],%r96; $L135: .loc 1 502 3 setp.ne.u64 %r245,%r291,0; @ %r245 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r148+28]; .loc 1 480 10 setp.eq.u64 %r246,%r55,%r291; @ %r246 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r291; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r249,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r250,[%r249+36]; setp.eq.u32 %r251,%r250,0; @ %r251 bra $L135; .loc 1 488 4 setp.eq.u64 %r252,%r291,0; @ %r252 bra $L133; add.u64 %r31,%r148,48; add.u64 %r125,%frame,120; cvt.u32.u32 %r254,%r24; cvt.s64.s8 %r253,%r254; add.u64 %r92,%r253,-1; .loc 1 488 10 mov.u64 %r101,0; $L137: .loc 1 492 21 ld.u64 %r256,[%r31+8]; add.u64 %r255,%r256,1; .loc 1 492 19 ld.u64 %r257,[%r31]; sub.u64 %r91,%r255,%r257; .loc 1 493 18 ld.u64 %r64,[%r125]; .loc 1 488 25 add.u64 %r101,%r101,1; .loc 1 493 11 setp.eq.u64 %r258,%r64,%r91; @ %r258 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r64; st.u64 [%stack+8],%r91; st.u64 [%stack],%r101; cvta.const.u64 %r22_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r31,%r31,24; add.u64 %r125,%r125,8; setp.ne.u64 %r261,%r92,%r101; @ %r261 bra $L137; $L132: .loc 1 504 16 cvt.u16.u32 %r263,%r24; setp.le.s16 %r264,%r263,1; @ %r264 bra $L138; shl.b64 
%r262,%r291,3; bra $L139; $L138: mov.u64 %r262,8; $L139: add.u64 %r266,%frame,240; mov.u32 %r27026272,[%value_in]; } add.u64 %r111,%r148,40; mov.u64 %r109,%frame; mov.u64 %r98,0; $L140: .loc 1 505 18 ld.u64 %r274,[%r111]; st.u64 [%r109],%r274; .loc 1 502 26 add.u64 %r98,%r98,1; .loc 1 502 3 add.u64 %r111,%r111,24; add.u64 %r109,%r109,8; setp.gt.s64 %r275,%r291,%r98; @ %r275 bra $L140; $L133: ld.u64 %r103,[%frame+240]; .loc 1 514 22 ld.u64 %r69,[%frame]; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r81,%r59,%r69; setp.le.s64 %r289,%r291,1; .loc 1 512 13 mov.u32 %r276,0; cvt.u32.u32 %r293,%r24; cvt.s64.s8 %r294,%r293; add.u64 %r295,%r294,-1; $L146: st.u8 [%r96],%r276; .loc 1 513 15 add.u64 %r103,%r103,1; .loc 1 514 12 add.u64 %r96,%r96,%r69; .loc 1 516 13 setp.ne.u64 %r277,%r59,%r103; @ %r277 bra $L146; .loc 1 525 7 @ ! %r289 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r97],%r292; .loc 1 523 23 mul.lo.u64 %r71,%r74,%r73; .loc 1 524 5 add.u64 %r95,%r95,1; .loc 1 525 7 add.u64 %r97,%r97,8; add.u64 %r119,%r119,8; add.u64 %r118,%r118,8; setp.ne.u64 %r280,%r95,%r295; @ %r280 bra $L144; bra $L109; $L171: add.u64 %r97,%frame,248; add.u64 %r119,%frame,8; add.u64 %r118,%frame,128; .loc 1 523 23 mov.u64 %r71,%r81; .loc 1 524 5 mov.u64 %r95,1; .loc 1 520 13 mov.u64 %r292,0; $L144: .loc 1 529 16 ld.u64 %r285,[%r97]; add.u64 %r73,%r285,1; st.u64 [%r97],%r73; .loc 1 530 23 ld.u64 %r74,[%r119]; .loc 1 530 13 sub.u64 %r286,%r74,%r71; add.u64 %r96,%r96,%r286; .loc 1 516 32 ld.u64 %r77,[%r118]; .loc 1 516 13 setp.eq.u64 %r287,%r73,%r77; @ %r287 bra $L145; .loc 1 520 13 mov.u64 %r103,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r96,[%r148]; .loc 1 449 6 setp.eq.u64 %r288,%r96,0; @ ! %r288 bra $L122; bra $L147; $L109: .loc 1 534 sum_i2.o/_gfortran_sum_i2 .visible .func _gfortran_sum_i2fortran/generated/sum_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_msum_i2 .visible .func _gfortran_msum_i2gfortran_ssum_i2 .visible .func _gfortran_ssum_i2VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10] = {83,85,77114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,85,77,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_sum_i2 .visible .func 
_gfortran_sum_i248predu64u64 %r229; .reg .u64 %r230; .reg .predpred %r255; .reg .predu32u16u64predu64pred %r322; .reg .predpred %r330; .reg .u16 %r331; .reg .predpred %r357; .reg .u32 %r359; .reg .u64u64 %r366; .reg .u64 %r367; mov.u64 %r182,%ar0; mov.u64 %r183,%ar1; mov.u64 %r184,%ar2; .loc 1 55 10 ld.s8 %r22,[%r183+28]; .loc 1 55 38 add.u32 %r185,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r107,%r185; .loc 1 56 10 ld.u64 %r25,[%r184]; .loc 1 56 7 add.u64 %r108,%r25,-1; .loc 1 58 7 shr.u64 %r187,%r108,63; set.u32.lt.s64 %r189,%r107,%r108; neg.s32 %r190,%r189; cvt.u16.u64 %r193,%r187; cvt.u16.u32 %r194,%r190; or.b16 %r192,%r193,%r194; .loc 1 58 6 cvt.u32.u16 %r195,%r192; cvt.u16.u8 %r196,%r195; setp.eq.u16 %r197,%r196,0; @ %r197 bra $L2; .loc 1 60 7 cvt.u32.u32 %r200,%r22; cvt.s64.s8 %r199,%r200; st.u64 [%stack+8],%r199_gfortran_runtime_errorr203,%r108,%r108; add.u64 %r204,%r203,%r108; shl.b64 %r205,%r204,3; add.u64 %r206,%r183,%r205; ld.u64 %r31,[%r206+56]; ld.u64 %r33,[%r206+48]; .loc 1 68 9 ld.u64 %r111,[%r206+40]; .loc 1 70 3 setp.ne.u64 %r220,%r108,0; @ %r220 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r221,%r107,%r108; @ %r221 bra $L4; bra $L46; $L3: add.u64 %r97,%r183,40; add.u64 %r61,%frame,120; add.u64 %r59,%frame,240; add.u64 %r222,%r183,16; add.u64 %r224,%r25,%r25; add.u64 %r225,%r224,%r25; shl.b64 %r226,%r225,3; add.u64 %r180,%r222,%r226; .loc 1 76 12 mov.u64 %r367,0; $L8: .loc 1 72 18 ld.u64 %r227,[%r97]; st.u64 [%r61],%r227; .loc 1 73 19 ld.u64 %r229,[%r97+16]; add.u64 %r228,%r229,1; ld.u64 %r230,[%r97+8]; sub.u64 %r38,%r228,%r230; .loc 1 75 10 setp.lt.s64 %r231,%r38,0; @ %r231 bra $L6; .loc 1 73 17 st.u64 [%r59],%r38; bra $L7; $L6: .loc 1 76 12 st.u64 [%r59],%r367; $L7: .loc 1 70 3 add.u64 %r97,%r97,24; add.u64 %r61,%r61,8; add.u64 %r59,%r59,8; setp.ne.u64 %r233,%r97,%r180; @ %r233 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r234,[%r182]; setp.eq.u64 %r235,%r234,0; @ ! 
%r235 bra $L11; bra $L10; $L4: add.u64 %r237,%r25,%r25; add.u64 %r238,%r237,%r25; shl.b64 %r239,%r238,3; add.u64 %r240,%r239,40; add.u64 %r150,%r183,%r240; shl.b64 %r241,%r25,3; add.u64 %r133,%r241,-8; add.u64 %r355,%frame,120; add.u64 %r138,%r355,%r133; add.u64 %r354,%frame,240; add.u64 %r130,%r354,%r133; cvt.u32.u32 %r245,%r22; cvt.s64.s8 %r244,%r245; add.u64 %r247,%r244,%r244; add.u64 %r248,%r247,%r244; shl.b64 %r249,%r248,3; add.u64 %r250,%r183,40; add.u64 %r100,%r249,%r250; .loc 1 84 12 mov.u64 %r366,0; $L14: .loc 1 80 18 ld.u64 %r251,[%r150]; st.u64 [%r138],%r251; .loc 1 81 19 ld.u64 %r253,[%r150+16]; add.u64 %r252,%r253,1; ld.u64 %r254,[%r150+8]; sub.u64 %r44,%r252,%r254; .loc 1 83 10 setp.lt.s64 %r255,%r44,0; @ %r255 bra $L12; .loc 1 81 17 st.u64 [%r130],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r130],%r366; $L13: .loc 1 78 3 add.u64 %r150,%r150,24; add.u64 %r138,%r138,8; add.u64 %r130,%r130,8; setp.ne.u64 %r257,%r100,%r150; @ %r257 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r258,%r107,0; @ %r258 bra $L16; add.u64 %r354,%frame,240; $L34: add.u64 %r159,%r182,40; mov.u64 %r157,%r354; add.u64 %r259,%r182,16; cvt.u32.u32 %r261,%r22; cvt.s64.s8 %r260,%r261; add.u64 %r263,%r260,%r260; add.u64 %r264,%r263,%r260; shl.b64 %r265,%r264,3; add.u64 %r151,%r259,%r265; .loc 1 94 10 mov.u64 %r101,1; .loc 1 98 4 mov.u64 %r288,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r266,0; st.u64 [%r182+8],%r266; .loc 1 103 28 cvt.u16.u32 %r269,%r22; add.u16 %r268,%r269,-1; cvt.u32.u16 %r270,%r268; st.u8 [%r182+28],%r270; .loc 1 105 20 add.u32 %r271,%r22,-2; cvt.s64.s32 %r52,%r271; add.u64 %r273,%r52,%r52; add.u64 %r274,%r273,%r52; shl.b64 %r275,%r274,3; add.u64 %r276,%r182,%r275; .loc 1 105 67 shl.b64 %r278,%r52,3; add.u64 %r279,%frame,%r278; .loc 1 105 59 ld.u64 %r281,[%r276+40]; ld.u64 %r282,[%r279+240]; mul.lo.u64 %r113,%r281,%r282; .loc 1 107 29 mov.u64 %r284,11(%value_in),_gfortrani_xmallocarray285,[%value_in]; } .loc 1 107 27 st.u64 [%r182],%r285; .loc 1 108 10 setp.eq.u64 %r287,%r113,0; @ ! 
%r287 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r101,%r48,%r101; $L17: .loc 1 98 4 st.u64 [%r159+8],%r288; ld.u64 %r48,[%r157]; add.u64 %r289,%r48,-1; st.u64 [%r159+16],%r289; st.u64 [%r159],%r101; .loc 1 91 7 add.u64 %r159,%r159,24; add.u64 %r157,%r157,8; setp.ne.u64 %r290,%r151,%r159; @ %r290 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r182+48],%r113; mov.u64 %r292,-1; st.u64 [%r182+56],%r292; mov.u64 %r293,1; st.u64 [%r182+40],%r293; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r182+28]; .loc 1 118 10 setp.eq.u64 %r294,%r58,%r107; @ %r294 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r107; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r297,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r298,[%r297+36]; setp.eq.u32 %r299,%r298,0; @ %r299 bra $L21; .loc 1 125 2 add.u64 %r354,%frame,240; cvta.const.u64 %r303,$LC2r354302303; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r305,%r107,0; @ %r305 bra $L24; $L27: .loc 1 65 9 add.u64 %r306,%r31,1; .loc 1 65 7 sub.u64 %r109,%r306,%r33; max.s64 %r45,%r109,0; .loc 1 137 8 ld.u64 %r121,[%r183]; .loc 1 138 8 ld.u64 %r122,[%r182]; .loc 1 154 36 add.u64 %r69,%r111,%r111; ld.u64 %r142,[%frame+360]; .loc 1 166 22 ld.u64 %r71,[%frame+120]; .loc 1 166 12 add.u64 %r72,%r71,%r71; .loc 1 167 22 ld.u64 %r74,[%frame]; .loc 1 167 12 add.u64 %r75,%r74,%r74; .loc 1 169 32 ld.u64 %r129,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r310,%r71,%r129; .loc 1 176 9 add.u64 %r311,%r310,%r310; neg.s64 %r132,%r311; .loc 1 177 23 mul.lo.u64 %r312,%r74,%r129; .loc 1 177 9 add.u64 %r313,%r312,%r312; neg.s64 %r135,%r313; setp.gt.s64 %r356,%r109,0; setp.le.s64 %r357,%r107,1; cvt.u32.u32 %r359,%r22; cvt.s64.s8 %r360,%r359; add.u64 %r361,%r360,-1; add.u64 %r362,%frame,120; add.u64 %r363,%frame,240; .loc 1 173 13 mov.u64 %r364,0; bra $L25; $L24: add.u64 %r172,%frame,360; add.u64 %r171,%r182,40; mov.u64 %r169,%frame; cvt.u32.u32 %r315,%r22; cvt.s64.s8 %r314,%r315; add.u64 %r160,%r314,-1; .loc 1 129 3 mov.u64 %r123,0; add.u64 %r354,%frame,240; .loc 1 131 16 mov.u64 %r316,%r123; $L26: st.u64 [%r172],%r316; .loc 1 132 18 ld.u64 %r317,[%r171]; st.u64 [%r169],%r317; .loc 1 133 17 shl.b64 %r319,%r123,3; add.u64 %r320,%r354,%r319; .loc 1 133 10 ld.u64 %r321,[%r320]; setp.le.s64 %r322,%r321,0; @ %r322 bra $L1; .loc 1 129 26 add.u64 %r123,%r123,1; .loc 1 129 3 add.u64 %r172,%r172,8; add.u64 %r171,%r171,24; add.u64 %r169,%r169,8; setp.ne.u64 %r323,%r123,%r160; @ %r323 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r356 bra $L35; .loc 1 150 10 mov.u16 %r325,0; st.u16 [%r122],%r325; bra $L29; $L35: mov.u64 %r116,%r121; .loc 1 148 10 mov.u32 %r114,0; .loc 1 154 13 mov.u64 %r115,0; $L28: .loc 1 158 10 ld.u16 %r327,[%r116]; cvt.u16.u32 %r328,%r114; add.u16 %r326,%r327,%r328; cvt.s32.s16 %r114,%r326; .loc 1 154 28 add.u64 %r115,%r115,1; .loc 1 154 36 add.u64 %r116,%r116,%r69; .loc 1 154 6 setp.gt.s64 %r330,%r45,%r115; @ %r330 bra $L28; .loc 1 161 12 cvt.u16.u32 %r331,%r114; st.u16 [%r122],%r331; $L29: .loc 1 165 15 add.u64 %r142,%r142,1; .loc 1 166 12 add.u64 %r121,%r121,%r72; .loc 1 167 12 add.u64 %r122,%r122,%r75; .loc 1 169 13 setp.ne.u64 %r332,%r142,%r129; @ %r332 bra $L25; .loc 1 176 9 add.u64 %r118,%r121,%r132; .loc 1 177 9 add.u64 %r119,%r122,%r135; .loc 1 179 7 @ %r357 bra $L1; add.u64 %r128,%frame,368; mov.u64 %r144,8; .loc 1 178 5 mov.u64 %r120,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r128],%r364; .loc 1 176 23 mul.lo.u64 %r338,%r88,%r87; .loc 1 176 9 add.u64 %r339,%r338,%r338; sub.u64 %r118,%r121,%r339; .loc 1 177 23 
mul.lo.u64 %r340,%r91,%r87; .loc 1 177 9 add.u64 %r341,%r340,%r340; sub.u64 %r119,%r122,%r341; .loc 1 178 5 add.u64 %r120,%r120,1; .loc 1 179 7 add.u64 %r128,%r128,8; add.u64 %r144,%r144,8; setp.eq.u64 %r342,%r120,%r361; @ %r342 bra $L1; $L32: .loc 1 187 16 ld.u64 %r343,[%r128]; add.u64 %r87,%r343,1; st.u64 [%r128],%r87; .loc 1 188 23 add.u64 %r345,%r362,%r144; ld.u64 %r88,[%r345]; .loc 1 188 13 add.u64 %r346,%r88,%r88; add.u64 %r121,%r118,%r346; .loc 1 189 23 add.u64 %r347,%frame,%r144; ld.u64 %r91,[%r347]; .loc 1 189 13 add.u64 %r348,%r91,%r91; add.u64 %r122,%r119,%r348; .loc 1 169 32 add.u64 %r350,%r363,%r144; ld.u64 %r95,[%r350]; .loc 1 169 13 setp.eq.u64 %r351,%r87,%r95; @ %r351 bra $L33; .loc 1 173 13 mov.u64 %r142,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r352,[%r182]; setp.eq.u64 %r353,%r352,0; @ ! %r353 bra $L11; bra $L34; $L1_gfortran_msum_i2 .visible .func _gfortran_msum_608pred %r216; .reg .u32 %r220; .reg .u64 %r222; .reg .u32 %r224; .reg .u32 %r225; .reg .u16 %r227; .reg .u16u16 %r264; .reg .u32 %r265; .reg .u16u64 %r274; .reg .u64 %r280; .reg .pred %r282; .reg .pred %r283; .reg .u64u64u64predu64 %r313; .reg .u64u64pred %r332; .reg .pred %r333; .reg .u64 %r334; .reg .u32u16 %r355; .reg .u16 %r356; .reg .u32 %r357; .reg .predpred %r370; .reg .u64 %r371; .reg .u64 %r373; .reg .u32 %r374; .reg .predpredpredu64 %r427; .reg .u64 %r429; .reg .u64u64 %r451; mov.u64 %r212,%ar0; mov.u64 %r213,%ar1; mov.u64 %r214,%ar2; mov.u64 %r215,%ar3; .loc 1 223 6 setp.ne.u64 %r216,%r215,0; @ %r216 bra $L491213214; call _gfortran_sum_.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r214]; .loc 1 233 7 add.u64 %r140,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r213+28]; .loc 1 234 38 add.u32 %r220,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r141,%r220; .loc 1 237 7 shr.u64 %r222,%r140,63; set.u32.gt.s64 %r224,%r140,%r141; neg.s32 %r225,%r224; cvt.u16.u64 %r228,%r222; cvt.u16.u32 %r229,%r225; or.b16 %r227,%r228,%r229; .loc 1 237 6 cvt.u32.u16 %r230,%r227; cvt.u16.u8 %r231,%r230; setp.eq.u16 %r232,%r231,0; @ %r232 bra $L51; .loc 1 239 7 cvt.u32.u32 %r235,%r23; cvt.s64.s8 %r234,%r235; st.u64 [%stack+8],%r234; st.u64 [%stack],%r22; cvta.const.u64 %r233233_gfortran_runtime_error1: .loc 1 244 9 add.u64 %r439,%r140,%r140; add.u64 %r440,%r439,%r140; shl.b64 %r240,%r440,3; add.u64 %r241,%r213,%r240; ld.u64 %r244,[%r241+56]; add.u64 %r243,%r244,1; .loc 1 244 7 ld.u64 %r251,[%r241+48]; sub.u64 %r142,%r243,%r251; .loc 1 245 6 setp.le.s64 %r252,%r142,0; @ %r252 bra $L48; .loc 1 248 9 ld.u64 %r155,[%r215]; .loc 1 250 15 ld.u64 %r34,[%r215+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r253,%r35,-4; and.b32 %r254,%r253,-5; set.u32.eq.u32 %r256,%r254,0; neg.s32 %r257,%r256; .loc 1 252 22 add.u32 %r258,%r35,-1; set.u32.le.u32 %r260,%r258,1; neg.s32 %r261,%r260; .loc 1 252 6 cvt.u16.u32 %r263,%r257; cvt.u16.u32 %r264,%r261; or.b16 %r262,%r263,%r264; cvt.u32.u16 %r265,%r262; cvt.u16.u8 %r266,%r265; setp.ne.u16 %r267,%r266,0; @ %r267 bra $L53; .loc 1 254 7 setp.ne.u32 %r269,%r35,16; @ %r269 bra $L54; $L53: .loc 1 261 9 shl.b64 %r273,%r440,3; add.u64 %r274,%r213,%r273; ld.u64 %r145,[%r274+40]; .loc 1 262 12 add.u64 %r280,%r215,%r273; ld.u64 %r42,[%r280+40]; .loc 1 264 3 setp.ne.u64 %r282,%r140,0; @ %r282 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r283,%r140,%r141; @ %r283 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r2842stack; call _gfortran_runtime_error5: add.u64 %r40,%r213,40; add.u64 %r182,%r215,40; add.u64 %r286,%r213,16; add.u64 %r288,%r22,%r22; add.u64 %r289,%r288,%r22; shl.b64 %r290,%r289,3; 
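The masked entry points (_gfortran_msum_i1 above, _gfortran_msum_i2 starting here) differ from the plain sum only in the inner loop (source lines 349-355): an element is added only when the corresponding LOGICAL mask element is non-zero, and the mask pointer is advanced with its own stride. A hedged C sketch of that inner loop follows; msum_i2_inner and its parameter names are illustrative, not the real libgfortran interface.

#include <stddef.h>
#include <stdint.h>

/* Masked variant of the reduction: the mask is walked with its own
   stride, and only elements whose mask byte is non-zero contribute. */
static void msum_i2_inner(int16_t *dest, const int16_t *src, ptrdiff_t sdelta,
                          const int8_t *mask, ptrdiff_t mdelta, ptrdiff_t len)
{
    int16_t result = 0;
    for (ptrdiff_t n = 0; n < len; n++, src += sdelta, mask += mdelta)
        if (*mask)                 /* LOGICAL element: non-zero means true */
            result = (int16_t)(result + *src);
    *dest = result;
}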
add.u64 %r209,%r286,%r290; .loc 1 264 3 mov.u64 %r95,0; add.u64 %r437,%frame,240; add.u64 %r436,%frame,360; .loc 1 271 12 mov.u64 %r451,%r95; $L60: .loc 1 266 18 add.u64 %r292,%r437,%r95; ld.u64 %r293,[%r40]; st.u64 [%r292],%r293; .loc 1 267 18 add.u64 %r294,%frame,%r95; .loc 1 267 20 ld.u64 %r296,[%r182]; mul.lo.u64 %r295,%r296,%r34; .loc 1 267 18 st.u64 [%r294],%r295; .loc 1 268 19 ld.u64 %r298,[%r40+16]; add.u64 %r297,%r298,1; ld.u64 %r299,[%r40+8]; sub.u64 %r54,%r297,%r299; .loc 1 270 10 setp.lt.s64 %r300,%r54,0; @ %r300 bra $L58; .loc 1 268 17 add.u64 %r302,%r436,%r95; st.u64 [%r302],%r54; bra $L59; $L58: .loc 1 271 12 add.u64 %r304,%r436,%r95; st.u64 [%r304],%r451; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r182,%r182,24; add.u64 %r95,%r95,8; setp.ne.u64 %r306,%r40,%r209; @ %r306 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r307,[%r212]; setp.eq.u64 %r308,%r307,0; @ ! %r308 bra $L63; bra $L62; $L56: add.u64 %r310,%r22,%r22; add.u64 %r311,%r310,%r22; shl.b64 %r312,%r311,3; add.u64 %r130,%r312,40; add.u64 %r136,%r213,%r130; add.u64 %r128,%r215,%r130; shl.b64 %r313,%r22,3; add.u64 %r106,%r313,-8; cvt.u32.u32 %r315,%r23; cvt.s64.s8 %r314,%r315; shl.b64 %r316,%r314,3; add.u64 %r69,%r316,-8; add.u64 %r437,%frame,240; add.u64 %r436,%frame,360; .loc 1 281 12 mov.u64 %r450,0; $L66: .loc 1 276 18 add.u64 %r318,%r437,%r106; ld.u64 %r319,[%r136]; st.u64 [%r318],%r319; .loc 1 277 18 add.u64 %r320,%frame,%r106; .loc 1 277 20 ld.u64 %r322,[%r128]; mul.lo.u64 %r321,%r322,%r34; .loc 1 277 18 st.u64 [%r320],%r321; .loc 1 278 19 ld.u64 %r324,[%r136+16]; add.u64 %r323,%r324,1; ld.u64 %r325,[%r136+8]; sub.u64 %r66,%r323,%r325; .loc 1 280 10 setp.lt.s64 %r326,%r66,0; @ %r326 bra $L64; .loc 1 278 17 add.u64 %r328,%r436,%r106; st.u64 [%r328],%r66; bra $L65; $L64: .loc 1 281 12 add.u64 %r330,%r436,%r106; st.u64 [%r330],%r450; $L65: .loc 1 274 3 add.u64 %r136,%r136,24; add.u64 %r128,%r128,24; add.u64 %r106,%r106,8; setp.ne.u64 %r332,%r69,%r106; @ %r332 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r333,%r141,0; @ %r333 bra $L68; add.u64 %r436,%frame,360; $L86: add.u64 %r175,%r212,40; mov.u64 %r167,%r436; cvt.u32.u32 %r335,%r23; cvt.s64.s8 %r334,%r335; add.u64 %r337,%r334,%r334; add.u64 %r338,%r337,%r334; shl.b64 %r339,%r338,3; add.u64 %r340,%r212,16; add.u64 %r137,%r339,%r340; .loc 1 291 10 mov.u64 %r132,1; .loc 1 295 4 mov.u64 %r359,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r341,%r23,-2; cvt.s64.s32 %r75,%r341; add.u64 %r343,%r75,%r75; add.u64 %r344,%r343,%r75; shl.b64 %r345,%r344,3; add.u64 %r346,%r212,%r345; .loc 1 299 67 shl.b64 %r348,%r75,3; add.u64 %r349,%frame,%r348; .loc 1 299 59 ld.u64 %r351,[%r346+40]; ld.u64 %r352,[%r349+360]; mul.lo.u64 %r149,%r351,%r352; .loc 1 301 24 mov.u64 %r353,0; st.u64 [%r212+8],%r353; .loc 1 302 28 cvt.u16.u32 %r356,%r23; add.u16 %r355,%r356,-1; cvt.u32.u16 %r357,%r355; st.u8 [%r212+28],%r357; .loc 1 304 10 setp.eq.u64 %r358,%r149,0; @ %r358 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r132,%r72,%r132; $L69: .loc 1 295 4 st.u64 [%r175+8],%r359; ld.u64 %r72,[%r167]; add.u64 %r360,%r72,-1; st.u64 [%r175+16],%r360; st.u64 [%r175],%r132; .loc 1 288 7 add.u64 %r175,%r175,24; add.u64 %r167,%r167,8; setp.ne.u64 %r361,%r137,%r175; @ %r361 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r212+48],%r149; mov.u64 %r363,-1; st.u64 [%r212+56],%r363; mov.u64 %r364,1; st.u64 [%r212+40],%r364; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r366,366; call (%value_in),_gfortrani_xmallocarray67,[%value_in]; } .loc 1 311 22 st.u64 [%r212],%r367; bra $L73; $L63: 
.loc 1 316 19 ld.s8 %r369,[%r212+28]; .loc 1 316 10 setp.eq.u64 %r370,%r369,%r141; @ %r370 bra $L74; .loc 1 317 2 cvta.const.u64 %r371371_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r373,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r374,[%r373+36]; setp.eq.u32 %r375,%r374,0; @ %r375 bra $L73; .loc 1 321 4 add.u64 %r436,%frame,360; cvta.const.u64 %r379,$LC2r4r378379; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r383,$LC383379; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r385,%r141,0; @ %r385 bra $L75; $L78: .loc 1 336 8 ld.u64 %r156,[%r212]; .loc 1 337 8 ld.u64 %r154,[%r213]; .loc 1 339 9 setp.ne.u64 %r386,%r154,0; @ %r386 bra $L76; bra $L48; $L75: add.u64 %r193,%frame,480; add.u64 %r192,%r212,40; add.u64 %r190,%frame,120; cvt.u32.u32 %r388,%r23; cvt.s64.s8 %r387,%r388; add.u64 %r178,%r387,-1; .loc 1 328 3 mov.u64 %r160,0; add.u64 %r436,%frame,360; .loc 1 330 16 mov.u64 %r389,%r160; $L77: st.u64 [%r193],%r389; .loc 1 331 18 ld.u64 %r390,[%r192]; st.u64 [%r190],%r390; .loc 1 332 17 shl.b64 %r392,%r160,3; add.u64 %r393,%r436,%r392; .loc 1 332 10 ld.u64 %r394,[%r393]; setp.le.s64 %r395,%r394,0; @ %r395 bra $L48; .loc 1 328 26 add.u64 %r160,%r160,1; .loc 1 328 3 add.u64 %r193,%r193,8; add.u64 %r192,%r192,24; add.u64 %r190,%r190,8; setp.ne.u64 %r396,%r160,%r178; @ %r396 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 add.u64 %r94,%r145,%r145; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r97,[%frame+240]; .loc 1 359 12 add.u64 %r98,%r97,%r97; .loc 1 360 23 ld.u64 %r100,[%frame]; .loc 1 361 22 ld.u64 %r102,[%frame+120]; .loc 1 361 12 add.u64 %r103,%r102,%r102; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r97; .loc 1 371 24 mul.lo.u64 %r400,%r85,%r100; .loc 1 371 10 neg.s64 %r166,%r400; .loc 1 372 23 mul.lo.u64 %r401,%r85,%r102; .loc 1 372 9 add.u64 %r402,%r401,%r401; neg.s64 %r168,%r402; setp.le.s64 %r441,%r141,1; .loc 1 348 10 mov.u32 %r443,0; cvt.u32.u32 %r444,%r23; cvt.s64.s8 %r445,%r444; add.u64 %r446,%r445,-1; add.u64 %r447,%frame,240; add.u64 %r448,%frame,360; add.u64 %r449,%frame,120; $L87: .loc 1 328 3 mov.u64 %r159,%r155; mov.u64 %r158,%r154; .loc 1 348 10 mov.u32 %r133,%r443; .loc 1 349 9 mov.u64 %r157,0; $L80: .loc 1 352 6 ld.s8 %r404,[%r159]; cvt.u16.u32 %r403,%r404; setp.eq.u16 %r405,%r403,0; @ %r405 bra $L79; .loc 1 353 12 ld.u16 %r407,[%r158]; cvt.u16.u32 %r408,%r133; add.u16 %r406,%r407,%r408; cvt.s32.s16 %r133,%r406; $L79: .loc 1 349 24 add.u64 %r157,%r157,1; .loc 1 349 32 add.u64 %r158,%r158,%r94; .loc 1 349 47 add.u64 %r159,%r159,%r45; .loc 1 349 2 setp.ne.u64 %r410,%r142,%r157; @ %r410 bra $L80; .loc 1 355 8 cvt.u16.u32 %r411,%r133; st.u16 [%r156],%r411; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r154,%r154,%r98; .loc 1 360 13 add.u64 %r155,%r155,%r100; .loc 1 361 12 add.u64 %r156,%r156,%r103; .loc 1 363 13 setp.ne.u64 %r412,%r85,%r39; @ %r412 bra $L87; .loc 1 371 10 add.u64 %r151,%r155,%r166; .loc 1 372 9 add.u64 %r152,%r156,%r168; .loc 1 374 7 @ ! 
%r441 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r442; .loc 1 370 23 mul.lo.u64 %r105,%r117,%r116; .loc 1 371 24 mul.lo.u64 %r415,%r119,%r116; .loc 1 371 10 sub.u64 %r151,%r155,%r415; .loc 1 372 23 mul.lo.u64 %r416,%r121,%r116; .loc 1 372 9 add.u64 %r417,%r416,%r416; sub.u64 %r152,%r156,%r417; .loc 1 373 5 add.u64 %r153,%r153,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r169,%r169,8; setp.ne.u64 %r418,%r153,%r446; @ %r418 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 370 23 mov.u64 %r105,%r38; .loc 1 374 7 mov.u64 %r169,8; .loc 1 373 5 mov.u64 %r153,1; .loc 1 367 13 mov.u64 %r442,0; $L84: .loc 1 382 16 ld.u64 %r422,[%r41]; add.u64 %r116,%r422,1; st.u64 [%r41],%r116; .loc 1 383 23 add.u64 %r424,%r447,%r169; ld.u64 %r117,[%r424]; .loc 1 383 13 sub.u64 %r425,%r117,%r105; add.u64 %r426,%r425,%r425; add.u64 %r154,%r154,%r426; .loc 1 384 24 add.u64 %r427,%frame,%r169; ld.u64 %r119,[%r427]; .loc 1 384 14 add.u64 %r155,%r151,%r119; .loc 1 385 23 add.u64 %r429,%r449,%r169; ld.u64 %r121,[%r429]; .loc 1 385 13 add.u64 %r430,%r121,%r121; add.u64 %r156,%r152,%r430; .loc 1 363 32 add.u64 %r432,%r448,%r169; ld.u64 %r125,[%r432]; .loc 1 363 13 setp.eq.u64 %r433,%r116,%r125; @ %r433 bra $L85; .loc 1 367 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r434,[%r212]; setp.eq.u64 %r435,%r434,0; @ ! %r435 bra $L63; bra $L86; $L48: .loc 1 389_gfortran_ssum_i2 .visible .func _gfortran_ssum_8u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64predpred %r292; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_sum_.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r295,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r295; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r295; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r301,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r301; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r294,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r294,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r300,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r300; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r295,0; @ %r211 bra $L127; add.u64 %r294,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r294; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r244,4; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r295,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r295; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r295; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r295,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: .loc 
1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r295,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r295,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 add.u64 %r69,%r68,%r68; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r293,%r295,1; .loc 1 512 13 mov.u16 %r279,0; cvt.u32.u32 %r297,%r24; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; $L146: st.u16 [%r98],%r279; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L146; .loc 1 525 7 @ ! %r293 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r296; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r299; @ %r283 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; .loc 1 520 13 mov.u64 %r296,0; $L144: .loc 1 529 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r289,%r75,%r72; add.u64 %r290,%r289,%r289; add.u64 %r98,%r98,%r290; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r292,%r98,0; @ ! 
%r292 bra $L122; bra $L147; $L109: .loc 1 534sum_i4.o/_gfortran_sum_i4 .visible .func _gfortran_sum_i4fortran/generated/sum_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_msum_i4 .visible .func _gfortran_msum_i4gfortran_ssum_i4 .visible .func _gfortran_ssum_i4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10] = {83,85,77114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,85,77,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_sum_i4 .visible .func _gfortran_sum_i448pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64u64predpredu64u64predmov.u64 %r179,%ar0; mov.u64 %r180,%ar1; mov.u64 %r181,%ar2; .loc 1 55 10 ld.s8 %r22,[%r180+28]; .loc 1 55 38 add.u32 %r182,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r182; .loc 1 56 10 ld.u64 %r25,[%r181]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r184,%r104,63; set.u32.lt.s64 %r186,%r103,%r104; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 58 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 60 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r104,%r104; add.u64 %r201,%r200,%r104; shl.b64 %r202,%r201,3; add.u64 %r203,%r180,%r202; ld.u64 %r30,[%r203+56]; ld.u64 %r32,[%r203+48]; .loc 1 68 9 ld.u64 %r107,[%r203+40]; .loc 1 70 3 setp.ne.u64 %r217,%r104,0; @ %r217 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r218,%r103,%r104; @ %r218 bra $L4; bra $L46; $L3: add.u64 %r61,%r180,40; add.u64 %r40,%frame,120; add.u64 %r91,%frame,240; add.u64 %r219,%r180,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r174,%r219,%r223; .loc 1 76 12 mov.u64 %r357,0; $L8: .loc 1 72 18 ld.u64 %r224,[%r61]; st.u64 [%r40],%r224; .loc 1 73 19 ld.u64 %r226,[%r61+16]; add.u64 %r225,%r226,1; ld.u64 %r227,[%r61+8]; sub.u64 %r37,%r225,%r227; .loc 1 75 10 setp.lt.s64 %r228,%r37,0; @ %r228 bra $L6; .loc 1 73 17 st.u64 [%r91],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r91],%r357; $L7: .loc 1 70 3 add.u64 %r61,%r61,24; add.u64 %r40,%r40,8; add.u64 %r91,%r91,8; setp.ne.u64 %r230,%r61,%r174; @ %r230 bra 
sum_i8.o -- PTX module generated from fortran/generated/sum_i8.c (leading path components truncated in the dump)
  global function decls: .visible .func _gfortran_sum_i8, _gfortran_msum_i8, _gfortran_ssum_i8
  external symbols: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents (declared as in sum_i4.o)
  string constants: identical to those listed for sum_i4.o
  function bodies: PTX instruction streams for the three entry points, with .loc debug references into sum_i8.c and 64-bit accumulation; interleaved with stripped binary data and not reproduced.
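The masked variant (_gfortran_msum_i8 here) walks the data and a companion logical array in lockstep and adds an element only where the mask is true; the body in the dump also checks that the mask's element size is one of the supported logical kinds, which appears to be what the "Funny sized logical array" message guards against. A minimal sketch of the inner loop, with illustrative names (msum_i8_1d, a plain char mask, flat pointers and element strides) rather than the real descriptor-based interface:

#include <stdint.h>
#include <stdio.h>

/* Illustrative 1-D masked sum: adds a[i] only where mask[i] is nonzero. */
static int64_t msum_i8_1d(const int64_t *a, long stride,
                          const char *mask, long mstride, long n)
{
    int64_t acc = 0;
    for (long i = 0; i < n; i++)
        if (mask[i * mstride])
            acc += a[i * stride];
    return acc;
}

int main(void)
{
    int64_t v[4] = {10, 20, 30, 40};
    char m[4] = {1, 0, 1, 1};
    printf("%lld\n", (long long)msum_i8_1d(v, 1, m, 1, 4));  /* prints: 80 */
    return 0;
}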
sum_i16.o -- PTX module generated from fortran/generated/sum_i16.c (leading path components truncated in the dump)
  global function decls: .visible .func _gfortran_sum_i16, _gfortran_msum_i16, _gfortran_ssum_i16
  external symbols and string constants: same as in sum_i4.o and sum_i8.o
  function bodies: PTX instruction streams for the three entry points, with .loc debug references into sum_i16.c; the 128-bit running sum is kept in two 64-bit registers with an explicit carry (the add.u64 / set.u32.lt.u64 / sub.u64 sequence). The streams are interleaved with stripped binary data and are not reproduced.
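The one structural difference visible in the i16 bodies is how the accumulator is kept: the PTX holds the 128-bit running sum as two 64-bit limbs and recovers the carry out of the low limb with an unsigned compare, since the low-limb sum wraps below the addend exactly when a carry occurred. A minimal C model of that limb-wise addition, using a hypothetical u128 struct rather than the compiler's internal representation:

#include <stdint.h>
#include <stdio.h>

/* Two 64-bit limbs standing in for a 128-bit accumulator. */
struct u128 { uint64_t lo, hi; };

/* Add b into a: the carry out of the low limb is exactly the condition
   (a.lo + b.lo) < a.lo in 64-bit unsigned arithmetic. */
static struct u128 add_u128(struct u128 a, struct u128 b)
{
    struct u128 r;
    r.lo = a.lo + b.lo;
    r.hi = a.hi + b.hi + (r.lo < a.lo);
    return r;
}

int main(void)
{
    struct u128 acc = { UINT64_MAX, 0 };   /* 2^64 - 1 */
    struct u128 one = { 1, 0 };
    acc = add_u128(acc, one);              /* carries into the high limb */
    printf("hi=%llu lo=%llu\n", (unsigned long long)acc.hi,
           (unsigned long long)acc.lo);    /* prints: hi=1 lo=0 */
    return 0;
}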
176 23 mul.lo.u64 %r342,%r85,%r84; .loc 1 176 9 shl.b64 %r343,%r342,4; sub.u64 %r113,%r116,%r343; .loc 1 177 23 mul.lo.u64 %r344,%r88,%r84; .loc 1 177 9 shl.b64 %r345,%r344,4; sub.u64 %r114,%r117,%r345; .loc 1 178 5 add.u64 %r115,%r115,1; .loc 1 179 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r346,%r115,%r373; @ %r346 bra $L1; $L32: .loc 1 187 16 ld.u64 %r347,[%r122]; add.u64 %r84,%r347,1; st.u64 [%r122],%r84; .loc 1 188 23 add.u64 %r349,%r374,%r140; ld.u64 %r85,[%r349]; .loc 1 188 13 shl.b64 %r350,%r85,4; add.u64 %r116,%r113,%r350; .loc 1 189 23 add.u64 %r351,%frame,%r140; ld.u64 %r88,[%r351]; .loc 1 189 13 shl.b64 %r352,%r88,4; add.u64 %r117,%r114,%r352; .loc 1 169 32 add.u64 %r354,%r375,%r140; ld.u64 %r92,[%r354]; .loc 1 169 13 setp.eq.u64 %r355,%r84,%r92; @ %r355 bra $L33; .loc 1 173 13 mov.u64 %r138,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r356,[%r179]; setp.eq.u64 %r357,%r356,0; @ ! %r357 bra $L11; bra $L34; $L1_gfortran_msum_i16 .visible .func _gfortran_msum_i16608u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .predpred %r399; .reg .u64 %r402; .reg .u64 %r405; .reg .u32 %r407; .reg .u64 %r408; .reg .pred %r412; .reg .predu64u64mov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 223 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $L49gfortran_sum_i16.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r211]; .loc 1 233 7 add.u64 %r137,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r210+28]; .loc 1 234 38 add.u32 %r217,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r138,%r217; .loc 1 237 7 shr.u64 %r219,%r137,63; set.u32.gt.s64 %r221,%r137,%r138; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 237 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L51; .loc 1 239 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r446,%r137,%r137; add.u64 %r449,%r446,%r137; shl.b64 %r237,%r449,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 244 7 ld.u64 %r248,[%r238+48]; sub.u64 %r139,%r240,%r248; .loc 1 245 6 setp.le.s64 %r249,%r139,0; @ %r249 bra $L48; .loc 1 248 9 ld.u64 %r152,[%r212]; .loc 1 250 15 ld.u64 %r34,[%r212+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 252 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 252 6 cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L53; .loc 1 254 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L54; $L53: .loc 1 261 9 shl.b64 %r270,%r449,3; add.u64 %r271,%r210,%r270; ld.u64 %r142,[%r271+40]; .loc 1 262 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 264 3 setp.ne.u64 %r279,%r137,0; @ %r279 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r280,%r137,%r138; @ %r280 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r281281_gfortran_runtime_error5: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; 
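The $L28 loop of _gfortran_sum_i16 above accumulates a 128-bit integer out of two 64-bit halves: add.u64 produces the low word, set.u32.lt.u64 detects wrap-around of the low word, and the converted flag is folded into the high word. Below is a minimal C sketch of that carry-propagation pattern; the names are illustrative and it is a reconstruction from the PTX, not the libgfortran source.

#include <stdint.h>

/* 128-bit value kept as two 64-bit halves, as the PTX does with register pairs. */
typedef struct { uint64_t lo; uint64_t hi; } u128_sketch;

static u128_sketch add_u128_sketch(u128_sketch acc, u128_sketch x)
{
    u128_sketch r;
    r.lo = acc.lo + x.lo;                        /* add.u64 %r323,%r358,%r360 */
    uint64_t carry = (r.lo < acc.lo) ? 1u : 0u;  /* set.u32.lt.u64: low word wrapped */
    r.hi = acc.hi + x.hi + carry;                /* add.u64 plus folding the 0/-1 flag */
    return r;
}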
add.u64 %r285,%r22,%r22; add.u64 %r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 264 3 mov.u64 %r92,0; add.u64 %r447,%frame,240; add.u64 %r445,%frame,360; .loc 1 271 12 mov.u64 %r460,%r92; $L60: .loc 1 266 18 add.u64 %r289,%r447,%r92; ld.u64 %r290,[%r40]; st.u64 [%r289],%r290; .loc 1 267 18 add.u64 %r291,%frame,%r92; .loc 1 267 20 ld.u64 %r293,[%r179]; mul.lo.u64 %r292,%r293,%r34; .loc 1 267 18 st.u64 [%r291],%r292; .loc 1 268 19 ld.u64 %r295,[%r40+16]; add.u64 %r294,%r295,1; ld.u64 %r296,[%r40+8]; sub.u64 %r54,%r294,%r296; .loc 1 270 10 setp.lt.s64 %r297,%r54,0; @ %r297 bra $L58; .loc 1 268 17 add.u64 %r299,%r445,%r92; st.u64 [%r299],%r54; bra $L59; $L58: .loc 1 271 12 add.u64 %r301,%r445,%r92; st.u64 [%r301],%r460; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r179,%r179,24; add.u64 %r92,%r92,8; setp.ne.u64 %r303,%r40,%r206; @ %r303 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r304,[%r209]; setp.eq.u64 %r305,%r304,0; @ ! %r305 bra $L63; bra $L62; $L56: add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r127,%r309,40; add.u64 %r133,%r210,%r127; add.u64 %r125,%r212,%r127; shl.b64 %r310,%r22,3; add.u64 %r103,%r310,-8; cvt.u32.u32 %r312,%r23; cvt.s64.s8 %r311,%r312; shl.b64 %r313,%r311,3; add.u64 %r69,%r313,-8; add.u64 %r447,%frame,240; add.u64 %r445,%frame,360; .loc 1 281 12 mov.u64 %r459,0; $L66: .loc 1 276 18 add.u64 %r315,%r447,%r103; ld.u64 %r316,[%r133]; st.u64 [%r315],%r316; .loc 1 277 18 add.u64 %r317,%frame,%r103; .loc 1 277 20 ld.u64 %r319,[%r125]; mul.lo.u64 %r318,%r319,%r34; .loc 1 277 18 st.u64 [%r317],%r318; .loc 1 278 19 ld.u64 %r321,[%r133+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r133+8]; sub.u64 %r66,%r320,%r322; .loc 1 280 10 setp.lt.s64 %r323,%r66,0; @ %r323 bra $L64; .loc 1 278 17 add.u64 %r325,%r445,%r103; st.u64 [%r325],%r66; bra $L65; $L64: .loc 1 281 12 add.u64 %r327,%r445,%r103; st.u64 [%r327],%r459; $L65: .loc 1 274 3 add.u64 %r133,%r133,24; add.u64 %r125,%r125,24; add.u64 %r103,%r103,8; setp.ne.u64 %r329,%r69,%r103; @ %r329 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r330,%r138,0; @ %r330 bra $L68; add.u64 %r445,%frame,360; $L86: add.u64 %r172,%r209,40; mov.u64 %r164,%r445; cvt.u32.u32 %r332,%r23; cvt.s64.s8 %r331,%r332; add.u64 %r334,%r331,%r331; add.u64 %r335,%r334,%r331; shl.b64 %r336,%r335,3; add.u64 %r337,%r209,16; add.u64 %r134,%r336,%r337; .loc 1 291 10 mov.u64 %r129,1; .loc 1 295 4 mov.u64 %r356,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r338,%r23,-2; cvt.s64.s32 %r75,%r338; add.u64 %r340,%r75,%r75; add.u64 %r341,%r340,%r75; shl.b64 %r342,%r341,3; add.u64 %r343,%r209,%r342; .loc 1 299 67 shl.b64 %r345,%r75,3; add.u64 %r346,%frame,%r345; .loc 1 299 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r146,%r348,%r349; .loc 1 301 24 mov.u64 %r350,0; st.u64 [%r209+8],%r350; .loc 1 302 28 cvt.u16.u32 %r353,%r23; add.u16 %r352,%r353,-1; cvt.u32.u16 %r354,%r352; st.u8 [%r209+28],%r354; .loc 1 304 10 setp.eq.u64 %r355,%r146,0; @ %r355 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r129,%r72,%r129; $L69: .loc 1 295 4 st.u64 [%r172+8],%r356; ld.u64 %r72,[%r164]; add.u64 %r357,%r72,-1; st.u64 [%r172+16],%r357; st.u64 [%r172],%r129; .loc 1 288 7 add.u64 %r172,%r172,24; add.u64 %r164,%r164,8; setp.ne.u64 %r358,%r134,%r172; @ %r358 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r209+48],%r146; mov.u64 %r360,-1; st.u64 [%r209+56],%r360; mov.u64 %r361,1; st.u64 [%r209+40],%r361; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r363,16363; call 
(%value_in),_gfortrani_xmallocarray64,[%value_in]; } .loc 1 311 22 st.u64 [%r209],%r364; bra $L73; $L63: .loc 1 316 19 ld.s8 %r366,[%r209+28]; .loc 1 316 10 setp.eq.u64 %r367,%r366,%r138; @ %r367 bra $L74; .loc 1 317 2 cvta.const.u64 %r36836_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r370,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L73; .loc 1 321 4 add.u64 %r445,%frame,360; cvta.const.u64 %r376,$LC2r44537376; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r382,%r138,0; @ %r382 bra $L75; $L78: .loc 1 336 8 ld.u64 %r153,[%r209]; .loc 1 337 8 ld.u64 %r151,[%r210]; .loc 1 339 9 setp.ne.u64 %r383,%r151,0; @ %r383 bra $L76; bra $L48; $L75: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r445,%frame,360; .loc 1 330 16 mov.u64 %r386,%r157; $L77: st.u64 [%r190],%r386; .loc 1 331 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 332 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r445,%r389; .loc 1 332 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 shl.b64 %r91,%r142,4; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r94,[%frame+240]; .loc 1 359 12 shl.b64 %r95,%r94,4; .loc 1 360 23 ld.u64 %r97,[%frame]; .loc 1 361 22 ld.u64 %r99,[%frame+120]; .loc 1 361 12 shl.b64 %r100,%r99,4; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r94; .loc 1 371 24 mul.lo.u64 %r394,%r85,%r97; .loc 1 371 10 neg.s64 %r163,%r394; .loc 1 372 23 mul.lo.u64 %r395,%r85,%r99; .loc 1 372 9 shl.b64 %r396,%r395,4; neg.s64 %r165,%r396; setp.le.s64 %r450,%r138,1; .loc 1 348 10 mov.u64 %r452,0; cvt.u32.u32 %r453,%r23; cvt.s64.s8 %r454,%r453; add.u64 %r455,%r454,-1; add.u64 %r456,%frame,240; add.u64 %r457,%frame,360; add.u64 %r458,%frame,120; $L87: .loc 1 328 3 mov.u64 %r156,%r152; mov.u64 %r155,%r151; .loc 1 348 10 mov.u64 %r439,%r452; mov.u64 %r440,0; .loc 1 349 9 mov.u64 %r154,%r440; $L80: .loc 1 352 6 ld.s8 %r398,[%r156]; cvt.u16.u32 %r397,%r398; setp.eq.u16 %r399,%r397,0; @ %r399 bra $L79; .loc 1 353 12 ld.u64 %r441,[%r155]; ld.u64 %r442,[%r155+8]; add.u64 %r402,%r439,%r441; set.u32.lt.u64 %r407,%r402,%r439; cvt.s64.s32 %r405,%r407; add.u64 %r408,%r440,%r442; mov.u64 %r439,%r402; sub.u64 %r440,%r408,%r405; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r91; .loc 1 349 47 add.u64 %r156,%r156,%r45; .loc 1 349 2 setp.ne.u64 %r412,%r139,%r154; @ %r412 bra $L80; .loc 1 355 8 st.u64 [%r153],%r439; st.u64 [%r153+8],%r440; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r151,%r151,%r95; .loc 1 360 13 add.u64 %r152,%r152,%r97; .loc 1 361 12 add.u64 %r153,%r153,%r100; .loc 1 363 13 setp.ne.u64 %r415,%r85,%r39; @ %r415 bra $L87; .loc 1 371 10 add.u64 %r148,%r152,%r163; .loc 1 372 9 add.u64 %r149,%r153,%r165; .loc 1 374 7 @ ! 
%r450 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r452; .loc 1 370 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 371 24 mul.lo.u64 %r418,%r116,%r113; .loc 1 371 10 sub.u64 %r148,%r152,%r418; .loc 1 372 23 mul.lo.u64 %r419,%r118,%r113; .loc 1 372 9 shl.b64 %r420,%r419,4; sub.u64 %r149,%r153,%r420; .loc 1 373 5 add.u64 %r150,%r150,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r166,%r166,8; setp.ne.u64 %r421,%r150,%r455; @ %r421 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 370 23 mov.u64 %r102,%r38; .loc 1 374 7 mov.u64 %r166,8; .loc 1 373 5 mov.u64 %r150,1; $L84: .loc 1 382 16 ld.u64 %r425,[%r41]; add.u64 %r113,%r425,1; st.u64 [%r41],%r113; .loc 1 383 23 add.u64 %r427,%r456,%r166; ld.u64 %r114,[%r427]; .loc 1 383 13 sub.u64 %r428,%r114,%r102; shl.b64 %r429,%r428,4; add.u64 %r151,%r151,%r429; .loc 1 384 24 add.u64 %r430,%frame,%r166; ld.u64 %r116,[%r430]; .loc 1 384 14 add.u64 %r152,%r148,%r116; .loc 1 385 23 add.u64 %r432,%r458,%r166; ld.u64 %r118,[%r432]; .loc 1 385 13 shl.b64 %r433,%r118,4; add.u64 %r153,%r149,%r433; .loc 1 363 32 add.u64 %r435,%r457,%r166; ld.u64 %r122,[%r435]; .loc 1 363 13 setp.eq.u64 %r436,%r113,%r122; @ %r436 bra $L85; .loc 1 367 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r437,[%r209]; setp.eq.u64 %r438,%r437,0; @ ! %r438 bra $L63; bra $L86; $L48: .loc 1 389_gfortran_ssum_i16 .visible .func _gfortran_ssum_i168u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64predpred %r292; .reg .predu32u64 %r301; mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_sum_i16.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r295,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r295; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r295; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r301,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r301; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r294,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r294,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r300,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r300; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r295,0; @ %r211 bra $L127; add.u64 %r294,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r294; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r244,1696244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r295,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r295; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r295; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r295,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; 
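The decimal byte runs embedded in this dump decode to the runtime error strings used by these routines (for example "SUM", "Funny sized logical array", and "Incorrect extent in return value of SUM intrinsic in dimension %ld: is %ld, should be %ld"), and the $L137 loop just above compares each extent of a caller-supplied return array against the computed one, branching to _gfortran_runtime_error on a mismatch. The C fragment below is only a schematic of that check under those assumptions; the names are illustrative and a plain exit stands in for the runtime error call.

#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

/* Schematic extent check: dim, actual and expected loosely correspond to the
   three values the PTX stores at [%stack], [%stack+8] and [%stack+16]
   before calling _gfortran_runtime_error. */
static void check_extent_sketch(int64_t dim, int64_t actual, int64_t expected)
{
    if (actual != expected) {
        fprintf(stderr,
                "Incorrect extent in return value of SUM intrinsic "
                "in dimension %lld: is %lld, should be %lld\n",
                (long long)dim, (long long)actual, (long long)expected);
        exit(2); /* stands in for _gfortran_runtime_error */
    }
}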
$L132: .loc 1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r295,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r295,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,4; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r293,%r295,1; .loc 1 512 13 mov.u64 %r278,0; cvt.u32.u32 %r297,%r24; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; $L146: st.u64 [%r98],%r278; st.u64 [%r98+8],%r278; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L146; .loc 1 525 7 @ ! %r293 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r278; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r299; @ %r283 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; $L144: .loc 1 529 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r289,%r75,%r72; shl.b64 %r290,%r289,4; add.u64 %r98,%r98,%r290; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r292,%r98,0; @ ! 
%r292 bra $L122; bra $L147; $L109: .loc 1 534 sum_r4.o/_gfortran_sum_r4 .visible .func _gfortran_sum_r4fortran/generated/sum_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_msum_r4 .visible .func _gfortran_msum_r4gfortran_ssum_r4 .visible .func _gfortran_ssum_r4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10] = {83,85,77114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,85,77,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_sum_r4 .visible .func _gfortran_sum_r448pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64u64pred %r317; .reg .f32 %r319; .reg .f32pred %r332; .reg .u64 %r333; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u64 %r338; .reg .u64 %r340; .reg .predmov.u64 %r179,%ar0; mov.u64 %r180,%ar1; mov.u64 %r181,%ar2; .loc 1 55 10 ld.s8 %r22,[%r180+28]; .loc 1 55 38 add.u32 %r182,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r182; .loc 1 56 10 ld.u64 %r25,[%r181]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r184,%r104,63; set.u32.lt.s64 %r186,%r103,%r104; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 58 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 60 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r104,%r104; add.u64 %r201,%r200,%r104; shl.b64 %r202,%r201,3; add.u64 %r203,%r180,%r202; ld.u64 %r30,[%r203+56]; ld.u64 %r32,[%r203+48]; .loc 1 68 9 ld.u64 %r107,[%r203+40]; .loc 1 70 3 setp.ne.u64 %r217,%r104,0; @ %r217 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r218,%r103,%r104; @ %r218 bra $L4; bra $L46; $L3: add.u64 %r61,%r180,40; add.u64 %r40,%frame,120; add.u64 %r91,%frame,240; add.u64 %r219,%r180,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r174,%r219,%r223; .loc 1 76 12 mov.u64 %r357,0; $L8: .loc 1 72 18 ld.u64 %r224,[%r61]; st.u64 [%r40],%r224; .loc 1 73 19 ld.u64 %r226,[%r61+16]; add.u64 %r225,%r226,1; ld.u64 %r227,[%r61+8]; sub.u64 %r37,%r225,%r227; .loc 1 75 10 setp.lt.s64 %r228,%r37,0; @ %r228 bra $L6; .loc 1 73 17 st.u64 [%r91],%r37; bra $L7; $L6: .loc 1 76 12 
st.u64 [%r91],%r357; $L7: .loc 1 70 3 add.u64 %r61,%r61,24; add.u64 %r40,%r40,8; add.u64 %r91,%r91,8; setp.ne.u64 %r230,%r61,%r174; @ %r230 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r231,[%r179]; setp.eq.u64 %r232,%r231,0; @ ! %r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r143,%r180,%r237; shl.b64 %r238,%r25,3; add.u64 %r125,%r238,-8; add.u64 %r344,%frame,120; add.u64 %r128,%r344,%r125; add.u64 %r345,%frame,240; add.u64 %r121,%r345,%r125; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r180,40; add.u64 %r95,%r246,%r247; .loc 1 84 12 mov.u64 %r356,0; $L14: .loc 1 80 18 ld.u64 %r248,[%r143]; st.u64 [%r128],%r248; .loc 1 81 19 ld.u64 %r250,[%r143+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r143+8]; sub.u64 %r44,%r249,%r251; .loc 1 83 10 setp.lt.s64 %r252,%r44,0; @ %r252 bra $L12; .loc 1 81 17 st.u64 [%r121],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r121],%r356; $L13: .loc 1 78 3 add.u64 %r143,%r143,24; add.u64 %r128,%r128,8; add.u64 %r121,%r121,8; setp.ne.u64 %r254,%r95,%r143; @ %r254 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r255,%r103,0; @ %r255 bra $L16; add.u64 %r345,%frame,240; $L34: add.u64 %r152,%r179,40; mov.u64 %r150,%r345; add.u64 %r256,%r179,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r144,%r256,%r262; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r263,0; st.u64 [%r179+8],%r263; .loc 1 103 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r179+28],%r267; .loc 1 105 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r52,%r268; add.u64 %r270,%r52,%r52; add.u64 %r271,%r270,%r52; shl.b64 %r272,%r271,3; add.u64 %r273,%r179,%r272; .loc 1 105 67 shl.b64 %r275,%r52,3; add.u64 %r276,%frame,%r275; .loc 1 105 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r109,%r278,%r279; .loc 1 107 29281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 107 27 st.u64 [%r179],%r282; .loc 1 108 10 setp.eq.u64 %r284,%r109,0; @ ! 
%r284 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r48,%r99; $L17: .loc 1 98 4 st.u64 [%r152+8],%r285; ld.u64 %r48,[%r150]; add.u64 %r286,%r48,-1; st.u64 [%r152+16],%r286; st.u64 [%r152],%r99; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r287,%r144,%r152; @ %r287 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r179+48],%r109; mov.u64 %r289,-1; st.u64 [%r179+56],%r289; mov.u64 %r290,1; st.u64 [%r179+40],%r290; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r179+28]; .loc 1 118 10 setp.eq.u64 %r291,%r58,%r103; @ %r291 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r294,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r295,[%r294+36]; setp.eq.u32 %r296,%r295,0; @ %r296 bra $L21; .loc 1 125 2 add.u64 %r345,%frame,240; cvta.const.u64 %r300,$LC2r345299300; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r302,%r103,0; @ %r302 bra $L24; $L27: .loc 1 65 9 add.u64 %r303,%r30,1; .loc 1 65 7 sub.u64 %r105,%r303,%r32; max.s64 %r45,%r105,0; .loc 1 137 8 ld.u64 %r116,[%r180]; .loc 1 138 8 ld.u64 %r117,[%r179]; .loc 1 154 36 shl.b64 %r66,%r107,2; ld.u64 %r138,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 shl.b64 %r69,%r68,2; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 shl.b64 %r72,%r71,2; .loc 1 169 32 ld.u64 %r123,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r304,%r68,%r123; .loc 1 176 9 shl.b64 %r305,%r304,2; neg.s64 %r127,%r305; .loc 1 177 23 mul.lo.u64 %r306,%r71,%r123; .loc 1 177 9 shl.b64 %r307,%r306,2; neg.s64 %r130,%r307; setp.gt.s64 %r346,%r105,0; setp.le.s64 %r347,%r103,1; cvt.u32.u32 %r349,%r22; cvt.s64.s8 %r350,%r349; add.u64 %r351,%r350,-1; add.u64 %r352,%frame,120; add.u64 %r353,%frame,240; .loc 1 173 13 mov.u64 %r354,0; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r179,40; mov.u64 %r162,%frame; cvt.u32.u32 %r309,%r22; cvt.s64.s8 %r308,%r309; add.u64 %r153,%r308,-1; .loc 1 129 3 mov.u64 %r118,0; add.u64 %r345,%frame,240; .loc 1 131 16 mov.u64 %r310,%r118; $L26: st.u64 [%r165],%r310; .loc 1 132 18 ld.u64 %r311,[%r164]; st.u64 [%r162],%r311; .loc 1 133 17 shl.b64 %r313,%r118,3; add.u64 %r314,%r345,%r313; .loc 1 133 10 ld.u64 %r315,[%r314]; setp.le.s64 %r316,%r315,0; @ %r316 bra $L1; .loc 1 129 26 add.u64 %r118,%r118,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r317,%r118,%r153; @ %r317 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r346 bra $L35; .loc 1 150 10 mov.f32 %r319,0f00000000; st.f32 [%r117],%r319; bra $L29; $L35: mov.u64 %r112,%r116; .loc 1 148 10 mov.f32 %r110,0f00000000; .loc 1 154 13 mov.u64 %r111,0; $L28: .loc 1 158 10 ld.f32 %r320,[%r112]; add.f32 %r110,%r110,%r320; .loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r66; .loc 1 154 6 setp.gt.s64 %r321,%r45,%r111; @ %r321 bra $L28; .loc 1 161 12 st.f32 [%r117],%r110; $L29: .loc 1 165 15 add.u64 %r138,%r138,1; .loc 1 166 12 add.u64 %r116,%r116,%r69; .loc 1 167 12 add.u64 %r117,%r117,%r72; .loc 1 169 13 setp.ne.u64 %r322,%r138,%r123; @ %r322 bra $L25; .loc 1 176 9 add.u64 %r113,%r116,%r127; .loc 1 177 9 add.u64 %r114,%r117,%r130; .loc 1 179 7 @ %r347 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r115,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r122],%r354; .loc 1 176 23 mul.lo.u64 %r328,%r85,%r84; .loc 1 176 9 shl.b64 %r329,%r328,2; sub.u64 %r113,%r116,%r329; .loc 1 177 23 mul.lo.u64 %r330,%r88,%r84; .loc 1 177 9 shl.b64 %r331,%r330,2; sub.u64 
%r114,%r117,%r331; .loc 1 178 5 add.u64 %r115,%r115,1; .loc 1 179 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r332,%r115,%r351; @ %r332 bra $L1; $L32: .loc 1 187 16 ld.u64 %r333,[%r122]; add.u64 %r84,%r333,1; st.u64 [%r122],%r84; .loc 1 188 23 add.u64 %r335,%r352,%r140; ld.u64 %r85,[%r335]; .loc 1 188 13 shl.b64 %r336,%r85,2; add.u64 %r116,%r113,%r336; .loc 1 189 23 add.u64 %r337,%frame,%r140; ld.u64 %r88,[%r337]; .loc 1 189 13 shl.b64 %r338,%r88,2; add.u64 %r117,%r114,%r338; .loc 1 169 32 add.u64 %r340,%r353,%r140; ld.u64 %r92,[%r340]; .loc 1 169 13 setp.eq.u64 %r341,%r84,%r92; @ %r341 bra $L33; .loc 1 173 13 mov.u64 %r138,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r342,[%r179]; setp.eq.u64 %r343,%r342,0; @ ! %r343 bra $L11; bra $L34; $L1_gfortran_msum_r4 .visible .func _gfortran_msum_r608u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .predpred %r399; .reg .f32 %r400; .reg .pred %r401; .reg .predmov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 223 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $L49gfortran_sum_r4.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r211]; .loc 1 233 7 add.u64 %r137,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r210+28]; .loc 1 234 38 add.u32 %r217,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r138,%r217; .loc 1 237 7 shr.u64 %r219,%r137,63; set.u32.gt.s64 %r221,%r137,%r138; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 237 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L51; .loc 1 239 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r429,%r137,%r137; add.u64 %r430,%r429,%r137; shl.b64 %r237,%r430,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 244 7 ld.u64 %r248,[%r238+48]; sub.u64 %r139,%r240,%r248; .loc 1 245 6 setp.le.s64 %r249,%r139,0; @ %r249 bra $L48; .loc 1 248 9 ld.u64 %r152,[%r212]; .loc 1 250 15 ld.u64 %r34,[%r212+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 252 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 252 6 cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L53; .loc 1 254 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L54; $L53: .loc 1 261 9 shl.b64 %r270,%r430,3; add.u64 %r271,%r210,%r270; ld.u64 %r142,[%r271+40]; .loc 1 262 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 264 3 setp.ne.u64 %r279,%r137,0; @ %r279 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r280,%r137,%r138; @ %r280 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r281281_gfortran_runtime_error5: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; add.u64 %r285,%r22,%r22; add.u64 %r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 264 3 mov.u64 %r92,0; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 271 12 mov.u64 %r441,%r92; $L60: .loc 1 266 18 add.u64 
%r289,%r427,%r92; ld.u64 %r290,[%r40]; st.u64 [%r289],%r290; .loc 1 267 18 add.u64 %r291,%frame,%r92; .loc 1 267 20 ld.u64 %r293,[%r179]; mul.lo.u64 %r292,%r293,%r34; .loc 1 267 18 st.u64 [%r291],%r292; .loc 1 268 19 ld.u64 %r295,[%r40+16]; add.u64 %r294,%r295,1; ld.u64 %r296,[%r40+8]; sub.u64 %r54,%r294,%r296; .loc 1 270 10 setp.lt.s64 %r297,%r54,0; @ %r297 bra $L58; .loc 1 268 17 add.u64 %r299,%r426,%r92; st.u64 [%r299],%r54; bra $L59; $L58: .loc 1 271 12 add.u64 %r301,%r426,%r92; st.u64 [%r301],%r441; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r179,%r179,24; add.u64 %r92,%r92,8; setp.ne.u64 %r303,%r40,%r206; @ %r303 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r304,[%r209]; setp.eq.u64 %r305,%r304,0; @ ! %r305 bra $L63; bra $L62; $L56: add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r127,%r309,40; add.u64 %r133,%r210,%r127; add.u64 %r125,%r212,%r127; shl.b64 %r310,%r22,3; add.u64 %r103,%r310,-8; cvt.u32.u32 %r312,%r23; cvt.s64.s8 %r311,%r312; shl.b64 %r313,%r311,3; add.u64 %r69,%r313,-8; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 281 12 mov.u64 %r440,0; $L66: .loc 1 276 18 add.u64 %r315,%r427,%r103; ld.u64 %r316,[%r133]; st.u64 [%r315],%r316; .loc 1 277 18 add.u64 %r317,%frame,%r103; .loc 1 277 20 ld.u64 %r319,[%r125]; mul.lo.u64 %r318,%r319,%r34; .loc 1 277 18 st.u64 [%r317],%r318; .loc 1 278 19 ld.u64 %r321,[%r133+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r133+8]; sub.u64 %r66,%r320,%r322; .loc 1 280 10 setp.lt.s64 %r323,%r66,0; @ %r323 bra $L64; .loc 1 278 17 add.u64 %r325,%r426,%r103; st.u64 [%r325],%r66; bra $L65; $L64: .loc 1 281 12 add.u64 %r327,%r426,%r103; st.u64 [%r327],%r440; $L65: .loc 1 274 3 add.u64 %r133,%r133,24; add.u64 %r125,%r125,24; add.u64 %r103,%r103,8; setp.ne.u64 %r329,%r69,%r103; @ %r329 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r330,%r138,0; @ %r330 bra $L68; add.u64 %r426,%frame,360; $L86: add.u64 %r172,%r209,40; mov.u64 %r164,%r426; cvt.u32.u32 %r332,%r23; cvt.s64.s8 %r331,%r332; add.u64 %r334,%r331,%r331; add.u64 %r335,%r334,%r331; shl.b64 %r336,%r335,3; add.u64 %r337,%r209,16; add.u64 %r134,%r336,%r337; .loc 1 291 10 mov.u64 %r129,1; .loc 1 295 4 mov.u64 %r356,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r338,%r23,-2; cvt.s64.s32 %r75,%r338; add.u64 %r340,%r75,%r75; add.u64 %r341,%r340,%r75; shl.b64 %r342,%r341,3; add.u64 %r343,%r209,%r342; .loc 1 299 67 shl.b64 %r345,%r75,3; add.u64 %r346,%frame,%r345; .loc 1 299 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r146,%r348,%r349; .loc 1 301 24 mov.u64 %r350,0; st.u64 [%r209+8],%r350; .loc 1 302 28 cvt.u16.u32 %r353,%r23; add.u16 %r352,%r353,-1; cvt.u32.u16 %r354,%r352; st.u8 [%r209+28],%r354; .loc 1 304 10 setp.eq.u64 %r355,%r146,0; @ %r355 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r129,%r72,%r129; $L69: .loc 1 295 4 st.u64 [%r172+8],%r356; ld.u64 %r72,[%r164]; add.u64 %r357,%r72,-1; st.u64 [%r172+16],%r357; st.u64 [%r172],%r129; .loc 1 288 7 add.u64 %r172,%r172,24; add.u64 %r164,%r164,8; setp.ne.u64 %r358,%r134,%r172; @ %r358 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r209+48],%r146; mov.u64 %r360,-1; st.u64 [%r209+56],%r360; mov.u64 %r361,1; st.u64 [%r209+40],%r361; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24363; call (%value_in),_gfortrani_xmallocarray64,[%value_in]; } .loc 1 311 22 st.u64 [%r209],%r364; bra $L73; $L63: .loc 1 316 19 ld.s8 %r366,[%r209+28]; .loc 1 316 10 setp.eq.u64 %r367,%r366,%r138; @ %r367 bra $L74; .loc 1 317 2 cvta.const.u64 %r36836_gfortran_runtime_error74: .loc 1 319 11 
cvta.global.u64 %r370,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L73; .loc 1 321 4 add.u64 %r426,%frame,360; cvta.const.u64 %r376,$LC2r42637376; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r382,%r138,0; @ %r382 bra $L75; $L78: .loc 1 336 8 ld.u64 %r153,[%r209]; .loc 1 337 8 ld.u64 %r151,[%r210]; .loc 1 339 9 setp.ne.u64 %r383,%r151,0; @ %r383 bra $L76; bra $L48; $L75: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r426,%frame,360; .loc 1 330 16 mov.u64 %r386,%r157; $L77: st.u64 [%r190],%r386; .loc 1 331 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 332 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r426,%r389; .loc 1 332 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 shl.b64 %r91,%r142,2; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r94,[%frame+240]; .loc 1 359 12 shl.b64 %r95,%r94,2; .loc 1 360 23 ld.u64 %r97,[%frame]; .loc 1 361 22 ld.u64 %r99,[%frame+120]; .loc 1 361 12 shl.b64 %r100,%r99,2; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r94; .loc 1 371 24 mul.lo.u64 %r394,%r85,%r97; .loc 1 371 10 neg.s64 %r163,%r394; .loc 1 372 23 mul.lo.u64 %r395,%r85,%r99; .loc 1 372 9 shl.b64 %r396,%r395,2; neg.s64 %r165,%r396; setp.le.s64 %r431,%r138,1; cvt.u32.u32 %r433,%r23; cvt.s64.s8 %r434,%r433; add.u64 %r435,%r434,-1; add.u64 %r436,%frame,240; add.u64 %r437,%frame,360; add.u64 %r438,%frame,120; .loc 1 367 13 mov.u64 %r439,0; $L87: .loc 1 328 3 mov.u64 %r156,%r152; mov.u64 %r155,%r151; .loc 1 348 10 mov.f32 %r130,0f00000000; .loc 1 349 9 mov.u64 %r154,0; $L80: .loc 1 352 6 ld.s8 %r398,[%r156]; cvt.u16.u32 %r397,%r398; setp.eq.u16 %r399,%r397,0; @ %r399 bra $L79; .loc 1 353 12 ld.f32 %r400,[%r155]; add.f32 %r130,%r130,%r400; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r91; .loc 1 349 47 add.u64 %r156,%r156,%r45; .loc 1 349 2 setp.ne.u64 %r401,%r139,%r154; @ %r401 bra $L80; .loc 1 355 8 st.f32 [%r153],%r130; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r151,%r151,%r95; .loc 1 360 13 add.u64 %r152,%r152,%r97; .loc 1 361 12 add.u64 %r153,%r153,%r100; .loc 1 363 13 setp.ne.u64 %r402,%r85,%r39; @ %r402 bra $L87; .loc 1 371 10 add.u64 %r148,%r152,%r163; .loc 1 372 9 add.u64 %r149,%r153,%r165; .loc 1 374 7 @ ! 
%r431 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r439; .loc 1 370 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 371 24 mul.lo.u64 %r405,%r116,%r113; .loc 1 371 10 sub.u64 %r148,%r152,%r405; .loc 1 372 23 mul.lo.u64 %r406,%r118,%r113; .loc 1 372 9 shl.b64 %r407,%r406,2; sub.u64 %r149,%r153,%r407; .loc 1 373 5 add.u64 %r150,%r150,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r166,%r166,8; setp.ne.u64 %r408,%r150,%r435; @ %r408 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 370 23 mov.u64 %r102,%r38; .loc 1 374 7 mov.u64 %r166,8; .loc 1 373 5 mov.u64 %r150,1; $L84: .loc 1 382 16 ld.u64 %r412,[%r41]; add.u64 %r113,%r412,1; st.u64 [%r41],%r113; .loc 1 383 23 add.u64 %r414,%r436,%r166; ld.u64 %r114,[%r414]; .loc 1 383 13 sub.u64 %r415,%r114,%r102; shl.b64 %r416,%r415,2; add.u64 %r151,%r151,%r416; .loc 1 384 24 add.u64 %r417,%frame,%r166; ld.u64 %r116,[%r417]; .loc 1 384 14 add.u64 %r152,%r148,%r116; .loc 1 385 23 add.u64 %r419,%r438,%r166; ld.u64 %r118,[%r419]; .loc 1 385 13 shl.b64 %r420,%r118,2; add.u64 %r153,%r149,%r420; .loc 1 363 32 add.u64 %r422,%r437,%r166; ld.u64 %r122,[%r422]; .loc 1 363 13 setp.eq.u64 %r423,%r113,%r122; @ %r423 bra $L85; .loc 1 367 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r424,[%r209]; setp.eq.u64 %r425,%r424,0; @ ! %r425 bra $L63; bra $L86; $L48: .loc 1 389_gfortran_ssum_r4 .visible .func _gfortran_ssum_r8u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64 %r276; .reg .predpredu64mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_sum_r4.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r294,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r294; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r294; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r300,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r300; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r293,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r293,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r299,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r299; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r294,0; @ %r211 bra $L127; add.u64 %r293,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r293; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 2496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r294,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r294; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r294; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r294,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: .loc 1 504 16 
cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r294,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r294,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,2; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r292,%r294,1; .loc 1 512 13 mov.f32 %r278,0f00000000; cvt.u32.u32 %r296,%r24; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; $L146: st.f32 [%r98],%r278; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r279,%r59,%r80; @ %r279 bra $L146; .loc 1 525 7 @ ! %r292 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r295; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r282,%r97,%r298; @ %r282 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; .loc 1 520 13 mov.u64 %r295,0; $L144: .loc 1 529 16 ld.u64 %r287,[%r99]; add.u64 %r74,%r287,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r288,%r75,%r72; shl.b64 %r289,%r288,2; add.u64 %r98,%r98,%r289; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r290,%r74,%r77; @ %r290 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r291,%r98,0; @ ! 
%r291 bra $L122; bra $L147; $L109: .loc 1 534sum_r8.o/_gfortran_sum_r8 .visible .func _gfortran_sum_r8fortran/generated/sum_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_msum_r8 .visible .func _gfortran_msum_r8gfortran_ssum_r8 .visible .func _gfortran_ssum_r8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10] = {83,85,77114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,85,77,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_sum_r8 .visible .func _gfortran_sum_r848pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64u64pred %r317; .reg .f64 %r319; .reg .f64pred %r332; .reg .u64 %r333; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u64 %r338; .reg .u64 %r340; .reg .predmov.u64 %r179,%ar0; mov.u64 %r180,%ar1; mov.u64 %r181,%ar2; .loc 1 55 10 ld.s8 %r22,[%r180+28]; .loc 1 55 38 add.u32 %r182,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r182; .loc 1 56 10 ld.u64 %r25,[%r181]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r184,%r104,63; set.u32.lt.s64 %r186,%r103,%r104; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 58 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 60 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r104,%r104; add.u64 %r201,%r200,%r104; shl.b64 %r202,%r201,3; add.u64 %r203,%r180,%r202; ld.u64 %r30,[%r203+56]; ld.u64 %r32,[%r203+48]; .loc 1 68 9 ld.u64 %r107,[%r203+40]; .loc 1 70 3 setp.ne.u64 %r217,%r104,0; @ %r217 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r218,%r103,%r104; @ %r218 bra $L4; bra $L46; $L3: add.u64 %r61,%r180,40; add.u64 %r40,%frame,120; add.u64 %r91,%frame,240; add.u64 %r219,%r180,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r174,%r219,%r223; .loc 1 76 12 mov.u64 %r357,0; $L8: .loc 1 72 18 ld.u64 %r224,[%r61]; st.u64 [%r40],%r224; .loc 1 73 19 ld.u64 %r226,[%r61+16]; add.u64 %r225,%r226,1; ld.u64 %r227,[%r61+8]; sub.u64 %r37,%r225,%r227; .loc 1 75 10 setp.lt.s64 %r228,%r37,0; @ %r228 bra $L6; .loc 1 73 17 st.u64 [%r91],%r37; bra $L7; $L6: .loc 1 76 12 
st.u64 [%r91],%r357; $L7: .loc 1 70 3 add.u64 %r61,%r61,24; add.u64 %r40,%r40,8; add.u64 %r91,%r91,8; setp.ne.u64 %r230,%r61,%r174; @ %r230 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r231,[%r179]; setp.eq.u64 %r232,%r231,0; @ ! %r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r143,%r180,%r237; shl.b64 %r238,%r25,3; add.u64 %r125,%r238,-8; add.u64 %r344,%frame,120; add.u64 %r128,%r344,%r125; add.u64 %r345,%frame,240; add.u64 %r121,%r345,%r125; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r180,40; add.u64 %r95,%r246,%r247; .loc 1 84 12 mov.u64 %r356,0; $L14: .loc 1 80 18 ld.u64 %r248,[%r143]; st.u64 [%r128],%r248; .loc 1 81 19 ld.u64 %r250,[%r143+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r143+8]; sub.u64 %r44,%r249,%r251; .loc 1 83 10 setp.lt.s64 %r252,%r44,0; @ %r252 bra $L12; .loc 1 81 17 st.u64 [%r121],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r121],%r356; $L13: .loc 1 78 3 add.u64 %r143,%r143,24; add.u64 %r128,%r128,8; add.u64 %r121,%r121,8; setp.ne.u64 %r254,%r95,%r143; @ %r254 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r255,%r103,0; @ %r255 bra $L16; add.u64 %r345,%frame,240; $L34: add.u64 %r152,%r179,40; mov.u64 %r150,%r345; add.u64 %r256,%r179,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r144,%r256,%r262; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r263,0; st.u64 [%r179+8],%r263; .loc 1 103 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r179+28],%r267; .loc 1 105 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r52,%r268; add.u64 %r270,%r52,%r52; add.u64 %r271,%r270,%r52; shl.b64 %r272,%r271,3; add.u64 %r273,%r179,%r272; .loc 1 105 67 shl.b64 %r275,%r52,3; add.u64 %r276,%frame,%r275; .loc 1 105 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r109,%r278,%r279; .loc 1 107 29 mov.u64 %r281281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 107 27 st.u64 [%r179],%r282; .loc 1 108 10 setp.eq.u64 %r284,%r109,0; @ ! 
%r284 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r48,%r99; $L17: .loc 1 98 4 st.u64 [%r152+8],%r285; ld.u64 %r48,[%r150]; add.u64 %r286,%r48,-1; st.u64 [%r152+16],%r286; st.u64 [%r152],%r99; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r287,%r144,%r152; @ %r287 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r179+48],%r109; mov.u64 %r289,-1; st.u64 [%r179+56],%r289; mov.u64 %r290,1; st.u64 [%r179+40],%r290; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r179+28]; .loc 1 118 10 setp.eq.u64 %r291,%r58,%r103; @ %r291 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r294,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r295,[%r294+36]; setp.eq.u32 %r296,%r295,0; @ %r296 bra $L21; .loc 1 125 2 add.u64 %r345,%frame,240; cvta.const.u64 %r300,$LC2r345299300; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r302,%r103,0; @ %r302 bra $L24; $L27: .loc 1 65 9 add.u64 %r303,%r30,1; .loc 1 65 7 sub.u64 %r105,%r303,%r32; max.s64 %r45,%r105,0; .loc 1 137 8 ld.u64 %r116,[%r180]; .loc 1 138 8 ld.u64 %r117,[%r179]; .loc 1 154 36 shl.b64 %r66,%r107,3; ld.u64 %r138,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 shl.b64 %r69,%r68,3; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 shl.b64 %r72,%r71,3; .loc 1 169 32 ld.u64 %r123,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r304,%r68,%r123; .loc 1 176 9 shl.b64 %r305,%r304,3; neg.s64 %r127,%r305; .loc 1 177 23 mul.lo.u64 %r306,%r71,%r123; .loc 1 177 9 shl.b64 %r307,%r306,3; neg.s64 %r130,%r307; setp.gt.s64 %r346,%r105,0; setp.le.s64 %r347,%r103,1; cvt.u32.u32 %r349,%r22; cvt.s64.s8 %r350,%r349; add.u64 %r351,%r350,-1; add.u64 %r352,%frame,120; add.u64 %r353,%frame,240; .loc 1 173 13 mov.u64 %r354,0; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r179,40; mov.u64 %r162,%frame; cvt.u32.u32 %r309,%r22; cvt.s64.s8 %r308,%r309; add.u64 %r153,%r308,-1; .loc 1 129 3 mov.u64 %r118,0; add.u64 %r345,%frame,240; .loc 1 131 16 mov.u64 %r310,%r118; $L26: st.u64 [%r165],%r310; .loc 1 132 18 ld.u64 %r311,[%r164]; st.u64 [%r162],%r311; .loc 1 133 17 shl.b64 %r313,%r118,3; add.u64 %r314,%r345,%r313; .loc 1 133 10 ld.u64 %r315,[%r314]; setp.le.s64 %r316,%r315,0; @ %r316 bra $L1; .loc 1 129 26 add.u64 %r118,%r118,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r317,%r118,%r153; @ %r317 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r346 bra $L35; .loc 1 150 10 mov.f64 %r319,0d0000000000000000; st.f64 [%r117],%r319; bra $L29; $L35: mov.u64 %r112,%r116; .loc 1 148 10 mov.f64 %r110,0d0000000000000000; .loc 1 154 13 mov.u64 %r111,0; $L28: .loc 1 158 10 ld.f64 %r320,[%r112]; add.f64 %r110,%r110,%r320; .loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r66; .loc 1 154 6 setp.gt.s64 %r321,%r45,%r111; @ %r321 bra $L28; .loc 1 161 12 st.f64 [%r117],%r110; $L29: .loc 1 165 15 add.u64 %r138,%r138,1; .loc 1 166 12 add.u64 %r116,%r116,%r69; .loc 1 167 12 add.u64 %r117,%r117,%r72; .loc 1 169 13 setp.ne.u64 %r322,%r138,%r123; @ %r322 bra $L25; .loc 1 176 9 add.u64 %r113,%r116,%r127; .loc 1 177 9 add.u64 %r114,%r117,%r130; .loc 1 179 7 @ %r347 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r115,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r122],%r354; .loc 1 176 23 mul.lo.u64 %r328,%r85,%r84; .loc 1 176 9 shl.b64 %r329,%r328,3; sub.u64 %r113,%r116,%r329; .loc 1 177 23 mul.lo.u64 %r330,%r88,%r84; .loc 1 177 9 shl.b64 %r331,%r330,3; 
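The floating-point variants reduce to the same strided inner loop: in _gfortran_sum_r4 the $L28 body is ld.f32 / add.f32 with the source pointer advanced by a stride shifted left by 2, and in _gfortran_sum_r8 above it is ld.f64 / add.f64 with a shift of 3. A rough C equivalent of that inner reduction, with illustrative names and an element-count stride, is:

#include <stddef.h>

/* Strided reduction matching the ld.f64 / add.f64 loop; stride is given in
   elements here, while the PTX keeps it pre-scaled to bytes via shl.b64 ...,3. */
static double sum_strided_f64_sketch(const double *p, ptrdiff_t stride, ptrdiff_t n)
{
    double acc = 0.0;              /* mov.f64 %r110,0d0000000000000000 */
    for (ptrdiff_t i = 0; i < n; i++)
        acc += p[i * stride];      /* ld.f64 + add.f64 */
    return acc;
}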
sub.u64 %r114,%r117,%r331; .loc 1 178 5 add.u64 %r115,%r115,1; .loc 1 179 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r332,%r115,%r351; @ %r332 bra $L1; $L32: .loc 1 187 16 ld.u64 %r333,[%r122]; add.u64 %r84,%r333,1; st.u64 [%r122],%r84; .loc 1 188 23 add.u64 %r335,%r352,%r140; ld.u64 %r85,[%r335]; .loc 1 188 13 shl.b64 %r336,%r85,3; add.u64 %r116,%r113,%r336; .loc 1 189 23 add.u64 %r337,%frame,%r140; ld.u64 %r88,[%r337]; .loc 1 189 13 shl.b64 %r338,%r88,3; add.u64 %r117,%r114,%r338; .loc 1 169 32 add.u64 %r340,%r353,%r140; ld.u64 %r92,[%r340]; .loc 1 169 13 setp.eq.u64 %r341,%r84,%r92; @ %r341 bra $L33; .loc 1 173 13 mov.u64 %r138,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r342,[%r179]; setp.eq.u64 %r343,%r342,0; @ ! %r343 bra $L11; bra $L34; $L1_gfortran_msum_r8 .visible .func _gfortran_msum_r8608u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .predpred %r399; .reg .f64 %r400; .reg .pred %r401; .reg .predmov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 223 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $L49gfortran_sum_r8.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r211]; .loc 1 233 7 add.u64 %r137,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r210+28]; .loc 1 234 38 add.u32 %r217,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r138,%r217; .loc 1 237 7 shr.u64 %r219,%r137,63; set.u32.gt.s64 %r221,%r137,%r138; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 237 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L51; .loc 1 239 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r429,%r137,%r137; add.u64 %r430,%r429,%r137; shl.b64 %r237,%r430,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 244 7 ld.u64 %r248,[%r238+48]; sub.u64 %r139,%r240,%r248; .loc 1 245 6 setp.le.s64 %r249,%r139,0; @ %r249 bra $L48; .loc 1 248 9 ld.u64 %r152,[%r212]; .loc 1 250 15 ld.u64 %r34,[%r212+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 252 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 252 6 cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L53; .loc 1 254 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L54; $L53: .loc 1 261 9 shl.b64 %r270,%r430,3; add.u64 %r271,%r210,%r270; ld.u64 %r142,[%r271+40]; .loc 1 262 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 264 3 setp.ne.u64 %r279,%r137,0; @ %r279 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r280,%r137,%r138; @ %r280 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r281281_gfortran_runtime_error5: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; add.u64 %r285,%r22,%r22; add.u64 %r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 264 3 mov.u64 %r92,0; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 271 12 mov.u64 %r441,%r92; $L60: .loc 1 266 18 
add.u64 %r289,%r427,%r92; ld.u64 %r290,[%r40]; st.u64 [%r289],%r290; .loc 1 267 18 add.u64 %r291,%frame,%r92; .loc 1 267 20 ld.u64 %r293,[%r179]; mul.lo.u64 %r292,%r293,%r34; .loc 1 267 18 st.u64 [%r291],%r292; .loc 1 268 19 ld.u64 %r295,[%r40+16]; add.u64 %r294,%r295,1; ld.u64 %r296,[%r40+8]; sub.u64 %r54,%r294,%r296; .loc 1 270 10 setp.lt.s64 %r297,%r54,0; @ %r297 bra $L58; .loc 1 268 17 add.u64 %r299,%r426,%r92; st.u64 [%r299],%r54; bra $L59; $L58: .loc 1 271 12 add.u64 %r301,%r426,%r92; st.u64 [%r301],%r441; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r179,%r179,24; add.u64 %r92,%r92,8; setp.ne.u64 %r303,%r40,%r206; @ %r303 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r304,[%r209]; setp.eq.u64 %r305,%r304,0; @ ! %r305 bra $L63; bra $L62; $L56: add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r127,%r309,40; add.u64 %r133,%r210,%r127; add.u64 %r125,%r212,%r127; shl.b64 %r310,%r22,3; add.u64 %r103,%r310,-8; cvt.u32.u32 %r312,%r23; cvt.s64.s8 %r311,%r312; shl.b64 %r313,%r311,3; add.u64 %r69,%r313,-8; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 281 12 mov.u64 %r440,0; $L66: .loc 1 276 18 add.u64 %r315,%r427,%r103; ld.u64 %r316,[%r133]; st.u64 [%r315],%r316; .loc 1 277 18 add.u64 %r317,%frame,%r103; .loc 1 277 20 ld.u64 %r319,[%r125]; mul.lo.u64 %r318,%r319,%r34; .loc 1 277 18 st.u64 [%r317],%r318; .loc 1 278 19 ld.u64 %r321,[%r133+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r133+8]; sub.u64 %r66,%r320,%r322; .loc 1 280 10 setp.lt.s64 %r323,%r66,0; @ %r323 bra $L64; .loc 1 278 17 add.u64 %r325,%r426,%r103; st.u64 [%r325],%r66; bra $L65; $L64: .loc 1 281 12 add.u64 %r327,%r426,%r103; st.u64 [%r327],%r440; $L65: .loc 1 274 3 add.u64 %r133,%r133,24; add.u64 %r125,%r125,24; add.u64 %r103,%r103,8; setp.ne.u64 %r329,%r69,%r103; @ %r329 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r330,%r138,0; @ %r330 bra $L68; add.u64 %r426,%frame,360; $L86: add.u64 %r172,%r209,40; mov.u64 %r164,%r426; cvt.u32.u32 %r332,%r23; cvt.s64.s8 %r331,%r332; add.u64 %r334,%r331,%r331; add.u64 %r335,%r334,%r331; shl.b64 %r336,%r335,3; add.u64 %r337,%r209,16; add.u64 %r134,%r336,%r337; .loc 1 291 10 mov.u64 %r129,1; .loc 1 295 4 mov.u64 %r356,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r338,%r23,-2; cvt.s64.s32 %r75,%r338; add.u64 %r340,%r75,%r75; add.u64 %r341,%r340,%r75; shl.b64 %r342,%r341,3; add.u64 %r343,%r209,%r342; .loc 1 299 67 shl.b64 %r345,%r75,3; add.u64 %r346,%frame,%r345; .loc 1 299 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r146,%r348,%r349; .loc 1 301 24 mov.u64 %r350,0; st.u64 [%r209+8],%r350; .loc 1 302 28 cvt.u16.u32 %r353,%r23; add.u16 %r352,%r353,-1; cvt.u32.u16 %r354,%r352; st.u8 [%r209+28],%r354; .loc 1 304 10 setp.eq.u64 %r355,%r146,0; @ %r355 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r129,%r72,%r129; $L69: .loc 1 295 4 st.u64 [%r172+8],%r356; ld.u64 %r72,[%r164]; add.u64 %r357,%r72,-1; st.u64 [%r172+16],%r357; st.u64 [%r172],%r129; .loc 1 288 7 add.u64 %r172,%r172,24; add.u64 %r164,%r164,8; setp.ne.u64 %r358,%r134,%r172; @ %r358 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r209+48],%r146; mov.u64 %r360,-1; st.u64 [%r209+56],%r360; mov.u64 %r361,1; st.u64 [%r209+40],%r361; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r363363; call (%value_in),_gfortrani_xmallocarray64,[%value_in]; } .loc 1 311 22 st.u64 [%r209],%r364; bra $L73; $L63: .loc 1 316 19 ld.s8 %r366,[%r209+28]; .loc 1 316 10 setp.eq.u64 %r367,%r366,%r138; @ %r367 bra $L74; .loc 1 317 2 cvta.const.u64 %r36836_gfortran_runtime_error74: 
.loc 1 319 11 cvta.global.u64 %r370,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L73; .loc 1 321 4 add.u64 %r426,%frame,360; cvta.const.u64 %r376,$LC2r42637376; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r382,%r138,0; @ %r382 bra $L75; $L78: .loc 1 336 8 ld.u64 %r153,[%r209]; .loc 1 337 8 ld.u64 %r151,[%r210]; .loc 1 339 9 setp.ne.u64 %r383,%r151,0; @ %r383 bra $L76; bra $L48; $L75: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r426,%frame,360; .loc 1 330 16 mov.u64 %r386,%r157; $L77: st.u64 [%r190],%r386; .loc 1 331 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 332 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r426,%r389; .loc 1 332 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 shl.b64 %r91,%r142,3; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r94,[%frame+240]; .loc 1 359 12 shl.b64 %r95,%r94,3; .loc 1 360 23 ld.u64 %r97,[%frame]; .loc 1 361 22 ld.u64 %r99,[%frame+120]; .loc 1 361 12 shl.b64 %r100,%r99,3; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r94; .loc 1 371 24 mul.lo.u64 %r394,%r85,%r97; .loc 1 371 10 neg.s64 %r163,%r394; .loc 1 372 23 mul.lo.u64 %r395,%r85,%r99; .loc 1 372 9 shl.b64 %r396,%r395,3; neg.s64 %r165,%r396; setp.le.s64 %r431,%r138,1; cvt.u32.u32 %r433,%r23; cvt.s64.s8 %r434,%r433; add.u64 %r435,%r434,-1; add.u64 %r436,%frame,240; add.u64 %r437,%frame,360; add.u64 %r438,%frame,120; .loc 1 367 13 mov.u64 %r439,0; $L87: .loc 1 328 3 mov.u64 %r156,%r152; mov.u64 %r155,%r151; .loc 1 348 10 mov.f64 %r130,0d0000000000000000; .loc 1 349 9 mov.u64 %r154,0; $L80: .loc 1 352 6 ld.s8 %r398,[%r156]; cvt.u16.u32 %r397,%r398; setp.eq.u16 %r399,%r397,0; @ %r399 bra $L79; .loc 1 353 12 ld.f64 %r400,[%r155]; add.f64 %r130,%r130,%r400; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r91; .loc 1 349 47 add.u64 %r156,%r156,%r45; .loc 1 349 2 setp.ne.u64 %r401,%r139,%r154; @ %r401 bra $L80; .loc 1 355 8 st.f64 [%r153],%r130; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r151,%r151,%r95; .loc 1 360 13 add.u64 %r152,%r152,%r97; .loc 1 361 12 add.u64 %r153,%r153,%r100; .loc 1 363 13 setp.ne.u64 %r402,%r85,%r39; @ %r402 bra $L87; .loc 1 371 10 add.u64 %r148,%r152,%r163; .loc 1 372 9 add.u64 %r149,%r153,%r165; .loc 1 374 7 @ ! 
%r431 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r439; .loc 1 370 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 371 24 mul.lo.u64 %r405,%r116,%r113; .loc 1 371 10 sub.u64 %r148,%r152,%r405; .loc 1 372 23 mul.lo.u64 %r406,%r118,%r113; .loc 1 372 9 shl.b64 %r407,%r406,3; sub.u64 %r149,%r153,%r407; .loc 1 373 5 add.u64 %r150,%r150,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r166,%r166,8; setp.ne.u64 %r408,%r150,%r435; @ %r408 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 370 23 mov.u64 %r102,%r38; .loc 1 374 7 mov.u64 %r166,8; .loc 1 373 5 mov.u64 %r150,1; $L84: .loc 1 382 16 ld.u64 %r412,[%r41]; add.u64 %r113,%r412,1; st.u64 [%r41],%r113; .loc 1 383 23 add.u64 %r414,%r436,%r166; ld.u64 %r114,[%r414]; .loc 1 383 13 sub.u64 %r415,%r114,%r102; shl.b64 %r416,%r415,3; add.u64 %r151,%r151,%r416; .loc 1 384 24 add.u64 %r417,%frame,%r166; ld.u64 %r116,[%r417]; .loc 1 384 14 add.u64 %r152,%r148,%r116; .loc 1 385 23 add.u64 %r419,%r438,%r166; ld.u64 %r118,[%r419]; .loc 1 385 13 shl.b64 %r420,%r118,3; add.u64 %r153,%r149,%r420; .loc 1 363 32 add.u64 %r422,%r437,%r166; ld.u64 %r122,[%r422]; .loc 1 363 13 setp.eq.u64 %r423,%r113,%r122; @ %r423 bra $L85; .loc 1 367 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r424,[%r209]; setp.eq.u64 %r425,%r424,0; @ ! %r425 bra $L63; bra $L86; $L48: .loc 1 389_gfortran_ssum_r8 .visible .func _gfortran_ssum_r88u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64 %r276; .reg .predpredu64mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_sum_r8.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r294,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r294; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r294; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r300,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r300; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r293,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r293,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r299,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r299; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r294,0; @ %r211 bra $L127; add.u64 %r293,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r293; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r24496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r294,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r294; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r294; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r294,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: 
.loc 1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r294,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r294,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,3; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r292,%r294,1; .loc 1 512 13 mov.f64 %r278,0d0000000000000000; cvt.u32.u32 %r296,%r24; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; $L146: st.f64 [%r98],%r278; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r279,%r59,%r80; @ %r279 bra $L146; .loc 1 525 7 @ ! %r292 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r295; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r282,%r97,%r298; @ %r282 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; .loc 1 520 13 mov.u64 %r295,0; $L144: .loc 1 529 16 ld.u64 %r287,[%r99]; add.u64 %r74,%r287,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r288,%r75,%r72; shl.b64 %r289,%r288,3; add.u64 %r98,%r98,%r289; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r290,%r74,%r77; @ %r290 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r291,%r98,0; @ ! 
%r291 bra $L122; bra $L147; $L109: .loc 1 534sum_r10.o/ sum_c4.o/_gfortran_sum_c4 .visible .func _gfortran_sum_c4fortran/generated/sum_c4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_msum_c4 .visible .func _gfortran_msum_c4gfortran_ssum_c4 .visible .func _gfortran_ssum_c4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10] = {83,85,77114,101,116,117,114,110,32,118,97,108,117,10132 $LC4[2] = {70,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,9977,65,83,75,32,97,114,103,117,109,101,110,116,0 }99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,85,77,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_sum_c4 .visible .func _gfortran_sum_c448u64predu64 %r227; .reg .u64 %r228; .reg .u64 %r229; .reg .predpredu32u64 %r250; .reg .u64 %r251; .reg .u64 %r252; .reg .u64 %r253; .reg .predu32u16u64pred %r326; .reg .u64 %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64u64 %r352; .reg .f32 %r353; .reg .f32 %r354; .reg .u32 %r355; .reg .u64 %r356; .reg .u64 %r357; .reg .u64 %r358; .reg .u64 %r359; .reg .u64 %r360; .reg .u64 %r361; mov.u64 %r181,%ar0; mov.u64 %r182,%ar1; mov.u64 %r183,%ar2; .loc 1 55 10 ld.s8 %r22,[%r182+28]; .loc 1 55 38 add.u32 %r184,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r104,%r184; .loc 1 56 10 ld.u64 %r25,[%r183]; .loc 1 56 7 add.u64 %r105,%r25,-1; .loc 1 58 7 shr.u64 %r186,%r105,63; set.u32.lt.s64 %r188,%r104,%r105; neg.s32 %r189,%r188; cvt.u16.u64 %r192,%r186; cvt.u16.u32 %r193,%r189; or.b16 %r191,%r192,%r193; .loc 1 58 6 cvt.u32.u16 %r194,%r191; cvt.u16.u8 %r195,%r194; setp.eq.u16 %r196,%r195,0; @ %r196 bra $L2; .loc 1 60 7 cvt.u32.u32 %r199,%r22; cvt.s64.s8 %r198,%r199; st.u64 [%stack+8],%r19819197_gfortran_runtime_errorr202,%r105,%r105; add.u64 %r203,%r202,%r105; shl.b64 %r204,%r203,3; add.u64 %r205,%r182,%r204; ld.u64 %r30,[%r205+56]; ld.u64 %r32,[%r205+48]; .loc 1 68 9 ld.u64 %r108,[%r205+40]; .loc 1 70 3 setp.ne.u64 %r219,%r105,0; @ %r219 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r220,%r104,%r105; @ %r220 bra $L4; bra $L46; $L3: add.u64 %r91,%r182,40; add.u64 %r60,%frame,120; add.u64 %r47,%frame,240; add.u64 %r221,%r182,16; add.u64 %r223,%r25,%r25; add.u64 %r224,%r223,%r25; shl.b64 %r225,%r224,3; add.u64 %r178,%r221,%r225; .loc 1 76 12 mov.u64 %r361,0; $L8: .loc 1 72 18 ld.u64 %r226,[%r91]; st.u64 [%r60],%r226; .loc 1 73 19 ld.u64 %r228,[%r91+16]; add.u64 %r227,%r228,1; ld.u64 %r229,[%r91+8]; sub.u64 
%r37,%r227,%r229; .loc 1 75 10 setp.lt.s64 %r230,%r37,0; @ %r230 bra $L6; .loc 1 73 17 st.u64 [%r47],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r47],%r361; $L7: .loc 1 70 3 add.u64 %r91,%r91,24; add.u64 %r60,%r60,8; add.u64 %r47,%r47,8; setp.ne.u64 %r232,%r91,%r178; @ %r232 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r233,[%r181]; setp.eq.u64 %r234,%r233,0; @ ! %r234 bra $L11; bra $L10; $L4: add.u64 %r236,%r25,%r25; add.u64 %r237,%r236,%r25238,40; add.u64 %r147,%r182,%r239; shl.b64 %r240,%r25,3; add.u64 %r134,%r240,-8; add.u64 %r349,%frame,120; add.u64 %r142,%r349,%r134; add.u64 %r348,%frame,240; add.u64 %r130,%r348,%r134; cvt.u32.u32 %r244,%r22; cvt.s64.s8 %r243,%r244; add.u64 %r246,%r243,%r243; add.u64 %r247,%r246,%r243; shl.b64 %r248,%r247,3; add.u64 %r249,%r182,40; add.u64 %r95,%r248,%r249; .loc 1 84 12 mov.u64 %r360,0; $L14: .loc 1 80 18 ld.u64 %r250,[%r147]; st.u64 [%r142],%r250; .loc 1 81 19 ld.u64 %r252,[%r147+16]; add.u64 %r251,%r252,1; ld.u64 %r253,[%r147+8]; sub.u64 %r44,%r251,%r253; .loc 1 83 10 setp.lt.s64 %r254,%r44,0; @ %r254 bra $L12; .loc 1 81 17 st.u64 [%r130],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r130],%r360; $L13: .loc 1 78 3 add.u64 %r147,%r147,24; add.u64 %r142,%r142,8; add.u64 %r130,%r130,8; setp.ne.u64 %r256,%r95,%r147; @ %r256 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r257,%r104,0; @ %r257 bra $L16; add.u64 %r348,%frame,240; $L34: add.u64 %r156,%r181,40; mov.u64 %r154,%r348; add.u64 %r258,%r181,16; cvt.u32.u32 %r260,%r22; cvt.s64.s8 %r259,%r260; add.u64 %r262,%r259,%r259; add.u64 %r263,%r262,%r259; shl.b64 %r264,%r263,3; add.u64 %r148,%r258,%r264; .loc 1 94 10 mov.u64 %r96,1; .loc 1 98 4 mov.u64 %r287,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r265,0; st.u64 [%r181+8],%r265; .loc 1 103 28 cvt.u16.u32 %r268,%r22; add.u16 %r267,%r268,-1; cvt.u32.u16 %r269,%r267; st.u8 [%r181+28],%r269; .loc 1 105 20 add.u32 %r270,%r22,-2; cvt.s64.s32 %r53,%r270; add.u64 %r272,%r53,%r53; add.u64 %r273,%r272,%r53; shl.b64 %r274,%r273,3; add.u64 %r275,%r181,%r274; .loc 1 105 67 shl.b64 %r277,%r53,3; add.u64 %r278,%frame,%r277; .loc 1 105 59 ld.u64 %r280,[%r275+40]; ld.u64 %r281,[%r278+240]; mul.lo.u64 %r110,%r280,%r281; .loc 1 107 29 mov.u64 %r28311283; call (%value_in),_gfortrani_xmallocarray284,[%value_in]; } .loc 1 107 27 st.u64 [%r181],%r284; .loc 1 108 10 setp.eq.u64 %r286,%r110,0; @ ! 
%r286 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r96,%r49,%r96; $L17: .loc 1 98 4 st.u64 [%r156+8],%r287; ld.u64 %r49,[%r154]; add.u64 %r288,%r49,-1; st.u64 [%r156+16],%r288; st.u64 [%r156],%r96; .loc 1 91 7 add.u64 %r156,%r156,24; add.u64 %r154,%r154,8; setp.ne.u64 %r289,%r148,%r156; @ %r289 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r181+48],%r110; mov.u64 %r291,-1; st.u64 [%r181+56],%r291; mov.u64 %r292,1; st.u64 [%r181+40],%r292; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r59,[%r181+28]; .loc 1 118 10 setp.eq.u64 %r293,%r59,%r104; @ %r293 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r104; st.u64 [%stack],%r59;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r296,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r297,[%r296+36]; setp.eq.u32 %r298,%r297,0; @ %r298 bra $L21; .loc 1 125 2 add.u64 %r348,%frame,240; cvta.const.u64 %r302,$LC2r3430302; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r304,%r104,0; @ %r304 bra $L24; $L27: .loc 1 65 9 add.u64 %r305,%r30,1; .loc 1 65 7 sub.u64 %r106,%r305,%r32; max.s64 %r45,%r106,0; .loc 1 137 8 ld.u64 %r118,[%r182]; .loc 1 138 8 ld.u64 %r119,[%r181]; .loc 1 154 36 shl.b64 %r66,%r108,3; ld.u64 %r111,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 shl.b64 %r69,%r68,3; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 shl.b64 %r72,%r71,3; .loc 1 169 32 ld.u64 %r126,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r306,%r68,%r126; .loc 1 176 9 shl.b64 %r307,%r306,3; neg.s64 %r129,%r307; .loc 1 177 23 mul.lo.u64 %r308,%r71,%r126; .loc 1 177 9 shl.b64 %r309,%r308,3; neg.s64 %r131,%r309; setp.gt.s64 %r350,%r106,0; setp.le.s64 %r351,%r104,1; .loc 1 150 10 ld.const.f32 %r353,[$LC4]; ld.const.f32 %r354,[$LC4+4]; cvt.u32.u32 %r355,%r22; cvt.s64.s8 %r356,%r355; add.u64 %r357,%r356,-1; add.u64 %r358,%frame,120; add.u64 %r359,%frame,240; bra $L25; $L24: add.u64 %r169,%frame,360; add.u64 %r168,%r181,40; mov.u64 %r166,%frame; cvt.u32.u32 %r311,%r22; cvt.s64.s8 %r310,%r311; add.u64 %r157,%r310,-1; .loc 1 129 3 mov.u64 %r120,0; add.u64 %r348,%frame,240; .loc 1 131 16 mov.u64 %r312,%r120; $L26: st.u64 [%r169],%r312; .loc 1 132 18 ld.u64 %r313,[%r168]; st.u64 [%r166],%r313; .loc 1 133 17 shl.b64 %r315,%r120,3; add.u64 %r316,%r348,%r315; .loc 1 133 10 ld.u64 %r317,[%r316]; setp.le.s64 %r318,%r317,0; @ %r318 bra $L1; .loc 1 129 26 add.u64 %r120,%r120,1; .loc 1 129 3 add.u64 %r169,%r169,8; add.u64 %r168,%r168,24; add.u64 %r166,%r166,8; setp.ne.u64 %r319,%r120,%r157; @ %r319 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r350 bra $L35; .loc 1 150 10 st.f32 [%r119],%r353; st.f32 [%r119+4],%r354; bra $L29; $L35: mov.u64 %r113,%r118; mov.f32 %r40,0f00000000; mov.f32 %r102,%r40; .loc 1 154 13 mov.u64 %r112,0; $L28: .loc 1 158 10 ld.f32 %r323,[%r113]; add.f32 %r102,%r102,%r323; ld.f32 %r324,[%r113+4]; add.f32 %r40,%r40,%r324; .loc 1 154 28 add.u64 %r112,%r112,1; .loc 1 154 36 add.u64 %r113,%r113,%r66; .loc 1 154 6 setp.gt.s64 %r325,%r45,%r112; @ %r325 bra $L28; .loc 1 161 12 st.f32 [%r119],%r102; st.f32 [%r119+4],%r40; $L29: .loc 1 165 15 add.u64 %r111,%r111,1; .loc 1 166 12 add.u64 %r118,%r118,%r69; .loc 1 167 12 add.u64 %r119,%r119,%r72; .loc 1 169 13 setp.ne.u64 %r326,%r111,%r126; @ %r326 bra $L25; .loc 1 176 9 add.u64 %r115,%r118,%r129; .loc 1 177 9 add.u64 %r116,%r119,%r131; .loc 1 179 7 @ %r351 bra $L1; add.u64 %r125,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r117,1; .loc 1 173 13 mov.u64 %r352,0; bra $L32; $L33: st.u64 [%r125],%r352; .loc 1 176 23 mul.lo.u64 %r332,%r85,%r84; .loc 1 176 9 shl.b64 
%r333,%r332,3; sub.u64 %r115,%r118,%r333; .loc 1 177 23 mul.lo.u64 %r334,%r88,%r84; .loc 1 177 9 shl.b64 %r335,%r334,3; sub.u64 %r116,%r119,%r335; .loc 1 178 5 add.u64 %r117,%r117,1; .loc 1 179 7 add.u64 %r125,%r125,8; add.u64 %r140,%r140,8; setp.eq.u64 %r336,%r117,%r357; @ %r336 bra $L1; $L32: .loc 1 187 16 ld.u64 %r337,[%r125]; add.u64 %r84,%r337,1; st.u64 [%r125],%r84; .loc 1 188 23 add.u64 %r339,%r358,%r140; ld.u64 %r85,[%r339]; .loc 1 188 13 shl.b64 %r340,%r85,3; add.u64 %r118,%r115,%r340; .loc 1 189 23 add.u64 %r341,%frame,%r140; ld.u64 %r88,[%r341]; .loc 1 189 13 shl.b64 %r342,%r88,3; add.u64 %r119,%r116,%r342; .loc 1 169 32 add.u64 %r344,%r359,%r140; ld.u64 %r92,[%r344]; .loc 1 169 13 setp.eq.u64 %r345,%r84,%r92; @ %r345 bra $L33; .loc 1 173 13 mov.u64 %r111,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r346,[%r181]; setp.eq.u64 %r347,%r346,0; @ ! %r347 bra $L11; bra $L34; $L1_gfortran_msum_c4 .visible .func _gfortran_msum_c6088pred %r215; .reg .u32 %r219; .reg .u64 %r221; .reg .u32 %r223; .reg .u32 %r224; .reg .u16 %r226; .reg .u16 %r227; .reg .u16u16 %r263; .reg .u32pred %r282; .reg .u64 %r283; .reg .u64u64u6464pred %r325; .reg .u64 %r327; .reg .u64 %r329; .reg .pred %r331; .reg .pred %r332; .reg .u64 %r333; .reg .u32 %r334; .reg .u64u64pred %r357; .reg .u64 %r358; .reg .u64 %r359; .reg .predu64 %r392; .reg .u64 %r393; .reg .pred %r394; .reg .predpred %r404; .reg .pred %r405; .reg .u64pred %r426; .reg .u64 %r427; .reg .predpred.reg .u64mov.u64 %r211,%ar0; mov.u64 %r212,%ar1; mov.u64 %r213,%ar2; mov.u64 %r214,%ar3; .loc 1 223 6 setp.ne.u64 %r215,%r214,0; @ %r215 bra $L49gfortran_sum_c4.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r213]; .loc 1 233 7 add.u64 %r136,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r212+28]; .loc 1 234 38 add.u32 %r219,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r137,%r219; .loc 1 237 7 shr.u64 %r221,%r136,63; set.u32.gt.s64 %r223,%r136,%r137; neg.s32 %r224,%r223; cvt.u16.u64 %r227,%r221; cvt.u16.u32 %r228,%r224; or.b16 %r226,%r227,%r228; .loc 1 237 6 cvt.u32.u16 %r229,%r226; cvt.u16.u8 %r230,%r229; setp.eq.u16 %r231,%r230,0; @ %r231 bra $L51; .loc 1 239 7 cvt.u32.u32 %r234,%r23; cvt.s64.s8 %r233,%r234; st.u64 [%stack+8],%r233; st.u64 [%stack],%r22; cvta.const.u64 %r2322stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r432,%r136,%r136; add.u64 %r433,%r432,%r136; shl.b64 %r239,%r433,3; add.u64 %r240,%r212,%r239; ld.u64 %r243,[%r240+56]; add.u64 %r242,%r243,1; .loc 1 244 7 ld.u64 %r250,[%r240+48]; sub.u64 %r138,%r242,%r250; .loc 1 245 6 setp.le.s64 %r251,%r138,0; @ %r251 bra $L48; .loc 1 248 9 ld.u64 %r151,[%r214]; .loc 1 250 15 ld.u64 %r34,[%r214+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r252,%r35,-4; and.b32 %r253,%r252,-5; set.u32.eq.u32 %r255,%r253,0; neg.s32 %r256,%r255; .loc 1 252 22 add.u32 %r257,%r35,-1; set.u32.le.u32 %r259,%r257,1; neg.s32 %r260,%r259; .loc 1 252 6 cvt.u16.u32 %r262,%r256; cvt.u16.u32 %r263,%r260; or.b16 %r261,%r262,%r263; cvt.u32.u16 %r264,%r261; cvt.u16.u8 %r265,%r264; setp.ne.u16 %r266,%r265,0; @ %r266 bra $L53; .loc 1 254 7 setp.ne.u32 %r268,%r35,16; @ %r268 bra $L54; $L53: .loc 1 261 9 shl.b64 %r272,%r433,3; add.u64 %r273,%r212,%r272; ld.u64 %r141,[%r273+40]; .loc 1 262 12 add.u64 %r279,%r214,%r272; ld.u64 %r41,[%r279+40]; .loc 1 264 3 setp.ne.u64 %r281,%r136,0; @ %r281 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r282,%r136,%r137; @ %r282 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r283283_gfortran_runtime_error5: add.u64 %r40,%r212,40; add.u64 %r129,%r214,40; add.u64 %r285,%r212,16; add.u64 
%r287,%r22,%r22; add.u64 %r288,%r287,%r22; shl.b64 %r289,%r288,3; add.u64 %r208,%r285,%r289; .loc 1 264 3 mov.u64 %r87,0; add.u64 %r430,%frame,240; add.u64 %r429,%frame,360; .loc 1 271 12 mov.u64 %r444,%r87; $L60: .loc 1 266 18 add.u64 %r291,%r430,%r87; ld.u64 %r292,[%r40]; st.u64 [%r291],%r292; .loc 1 267 18 add.u64 %r293,%frame,%r87; .loc 1 267 20 ld.u64 %r295,[%r129]; mul.lo.u64 %r294,%r295,%r34; .loc 1 267 18 st.u64 [%r293],%r294; .loc 1 268 19 ld.u64 %r297,[%r40+16]; add.u64 %r296,%r297,1; ld.u64 %r298,[%r40+8]; sub.u64 %r52,%r296,%r298; .loc 1 270 10 setp.lt.s64 %r299,%r52,0; @ %r299 bra $L58; .loc 1 268 17 add.u64 %r301,%r429,%r87; st.u64 [%r301],%r52; bra $L59; $L58: .loc 1 271 12 add.u64 %r303,%r429,%r87; st.u64 [%r303],%r444; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r129,%r129,24; add.u64 %r87,%r87,8; setp.ne.u64 %r305,%r40,%r208; @ %r305 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r306,[%r211]; setp.eq.u64 %r307,%r306,0; @ ! %r307 bra $L63; bra $L62; $L56: add.u64 %r309,%r22,%r22; add.u64 %r310,%r309,%r22; shl.b64 %r311,%r310,3; add.u64 %r122,%r311,40; add.u64 %r131,%r212,%r122; add.u64 %r117,%r214,%r122; shl.b64 %r312,%r22,3; add.u64 %r135,%r312,-8; cvt.u32.u32 %r314,%r23; cvt.s64.s8 %r313,%r314; shl.b64 %r315,%r313,3; add.u64 %r66,%r315,-8; add.u64 %r430,%frame,240; add.u64 %r429,%frame,360; .loc 1 281 12 mov.u64 %r443,0; $L66: .loc 1 276 18 add.u64 %r317,%r430,%r135; ld.u64 %r318,[%r131]; st.u64 [%r317],%r318; .loc 1 277 18 add.u64 %r319,%frame,%r135; .loc 1 277 20 ld.u64 %r321,[%r117]; mul.lo.u64 %r320,%r321,%r34; .loc 1 277 18 st.u64 [%r319],%r320; .loc 1 278 19 ld.u64 %r323,[%r131+16]; add.u64 %r322,%r323,1; ld.u64 %r324,[%r131+8]; sub.u64 %r62,%r322,%r324; .loc 1 280 10 setp.lt.s64 %r325,%r62,0; @ %r325 bra $L64; .loc 1 278 17 add.u64 %r327,%r429,%r135; st.u64 [%r327],%r62; bra $L65; $L64: .loc 1 281 12 add.u64 %r329,%r429,%r135; st.u64 [%r329],%r443; $L65: .loc 1 274 3 add.u64 %r131,%r131,24; add.u64 %r117,%r117,24; add.u64 %r135,%r135,8; setp.ne.u64 %r331,%r66,%r135; @ %r331 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r332,%r137,0; @ %r332 bra $L68; add.u64 %r429,%frame,360; $L86: add.u64 %r179,%r211,40; mov.u64 %r171,%r429; cvt.u32.u32 %r334,%r23; cvt.s64.s8 %r333,%r334; add.u64 %r336,%r333,%r333; add.u64 %r337,%r336,%r333; shl.b64 %r338,%r337,3; add.u64 %r339,%r211,16; add.u64 %r132,%r338,%r339; .loc 1 291 10 mov.u64 %r126,1; .loc 1 295 4 mov.u64 %r358,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r340,%r23,-2; cvt.s64.s32 %r71,%r340; add.u64 %r342,%r71,%r71; add.u64 %r343,%r342,%r71; shl.b64 %r344,%r343,3; add.u64 %r345,%r211,%r344; .loc 1 299 67 shl.b64 %r347,%r71,3; add.u64 %r348,%frame,%r347; .loc 1 299 59 ld.u64 %r350,[%r345+40]; ld.u64 %r351,[%r348+360]; mul.lo.u64 %r145,%r350,%r351; .loc 1 301 24 mov.u64 %r352,0; st.u64 [%r211+8],%r352; .loc 1 302 28 cvt.u16.u32 %r355,%r23; add.u16 %r354,%r355,-1; cvt.u32.u16 %r356,%r354; st.u8 [%r211+28],%r356; .loc 1 304 10 setp.eq.u64 %r357,%r145,0; @ %r357 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r126,%r68,%r126; $L69: .loc 1 295 4 st.u64 [%r179+8],%r358; ld.u64 %r68,[%r171]; add.u64 %r359,%r68,-1; st.u64 [%r179+16],%r359; st.u64 [%r179],%r126; .loc 1 288 7 add.u64 %r179,%r179,24; add.u64 %r171,%r171,8; setp.ne.u64 %r360,%r132,%r179; @ %r360 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r211+48],%r145; mov.u64 %r362,-1; st.u64 [%r211+56],%r362; mov.u64 %r363,1; st.u64 [%r211+40],%r363; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r365365; call 
(%value_in),_gfortrani_xmallocarray66,[%value_in]; } .loc 1 311 22 st.u64 [%r211],%r366; bra $L73; $L63: .loc 1 316 19 ld.s8 %r368,[%r211+28]; .loc 1 316 10 setp.eq.u64 %r369,%r368,%r137; @ %r369 bra $L74; .loc 1 317 2 cvta.const.u64 %r370,$LC370_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r372,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r373,[%r372+36]; setp.eq.u32 %r374,%r373,0; @ %r374 bra $L73; .loc 1 321 4 add.u64 %r429,%frame,360; cvta.const.u64 %r378,$LC2377378; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r38221212382378; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r384,%r137,0; @ %r384 bra $L75; $L78: .loc 1 336 8 ld.u64 %r152,[%r211]; .loc 1 337 8 ld.u64 %r150,[%r212]; .loc 1 339 9 setp.ne.u64 %r385,%r150,0; @ %r385 bra $L76; bra $L48; $L75: add.u64 %r192,%frame,480; add.u64 %r191,%r211,40; add.u64 %r189,%frame,120; cvt.u32.u32 %r387,%r23; cvt.s64.s8 %r386,%r387; add.u64 %r180,%r386,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r429,%frame,360; .loc 1 330 16 mov.u64 %r388,%r157; $L77: st.u64 [%r192],%r388; .loc 1 331 18 ld.u64 %r389,[%r191]; st.u64 [%r189],%r389; .loc 1 332 17 shl.b64 %r391,%r157,3; add.u64 %r392,%r429,%r391; .loc 1 332 10 ld.u64 %r393,[%r392]; setp.le.s64 %r394,%r393,0; @ %r394 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r192,%r192,8; add.u64 %r191,%r191,24; add.u64 %r189,%r189,8; setp.ne.u64 %r395,%r157,%r180; @ %r395 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r43,%r41,%r34; .loc 1 349 32 shl.b64 %r86,%r141,3; ld.u64 %r153,[%frame+480]; .loc 1 359 22 ld.u64 %r89,[%frame+240]; .loc 1 359 12 shl.b64 %r90,%r89,3; .loc 1 360 23 ld.u64 %r92,[%frame]; .loc 1 361 22 ld.u64 %r94,[%frame+120]; .loc 1 361 12 shl.b64 %r95,%r94,3; .loc 1 363 32 ld.u64 %r80,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r80,%r89; .loc 1 371 24 mul.lo.u64 %r396,%r80,%r92; .loc 1 371 10 neg.s64 %r162,%r396; .loc 1 372 23 mul.lo.u64 %r397,%r80,%r94; .loc 1 372 9 shl.b64 %r398,%r397,3; neg.s64 %r165,%r398; setp.le.s64 %r434,%r137,1; cvt.u32.u32 %r436,%r23; cvt.s64.s8 %r437,%r436; add.u64 %r438,%r437,-1; add.u64 %r439,%frame,240; add.u64 %r440,%frame,360; add.u64 %r441,%frame,120; .loc 1 367 13 mov.u64 %r442,0; $L87: .loc 1 328 3 mov.u64 %r156,%r151; mov.u64 %r155,%r150; mov.f32 %r118,0f00000000; mov.f32 %r119,%r118; .loc 1 349 9 mov.u64 %r154,0; $L80: .loc 1 352 6 ld.s8 %r400,[%r156]; cvt.u16.u32 %r399,%r400; setp.eq.u16 %r401,%r399,0; @ %r401 bra $L79; .loc 1 353 12 ld.f32 %r402,[%r155]; add.f32 %r119,%r119,%r402; ld.f32 %r403,[%r155+4]; add.f32 %r118,%r118,%r403; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r86; .loc 1 349 47 add.u64 %r156,%r156,%r43; .loc 1 349 2 setp.ne.u64 %r404,%r138,%r154; @ %r404 bra $L80; .loc 1 355 8 st.f32 [%r152],%r119; st.f32 [%r152+4],%r118; .loc 1 358 15 add.u64 %r153,%r153,1; .loc 1 359 12 add.u64 %r150,%r150,%r90; .loc 1 360 13 add.u64 %r151,%r151,%r92; .loc 1 361 12 add.u64 %r152,%r152,%r95; .loc 1 363 13 setp.ne.u64 %r405,%r80,%r153; @ %r405 bra $L87; .loc 1 371 10 add.u64 %r147,%r151,%r162; .loc 1 372 9 add.u64 %r148,%r152,%r165; .loc 1 374 7 @ ! 
%r434 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r128],%r442; .loc 1 370 23 mul.lo.u64 %r98,%r109,%r108; .loc 1 371 24 mul.lo.u64 %r408,%r111,%r108; .loc 1 371 10 sub.u64 %r147,%r151,%r408; .loc 1 372 23 mul.lo.u64 %r409,%r113,%r108; .loc 1 372 9 shl.b64 %r410,%r409,3; sub.u64 %r148,%r152,%r410; .loc 1 373 5 add.u64 %r149,%r149,1; .loc 1 374 7 add.u64 %r128,%r128,8; add.u64 %r178,%r178,8; setp.ne.u64 %r411,%r149,%r438; @ %r411 bra $L84; bra $L48; $L108: add.u64 %r128,%frame,488; .loc 1 370 23 mov.u64 %r98,%r38; .loc 1 374 7 mov.u64 %r178,8; .loc 1 373 5 mov.u64 %r149,1; $L84: .loc 1 382 16 ld.u64 %r415,[%r128]; add.u64 %r108,%r415,1; st.u64 [%r128],%r108; .loc 1 383 23 add.u64 %r417,%r439,%r178; ld.u64 %r109,[%r417]; .loc 1 383 13 sub.u64 %r418,%r109,%r98; shl.b64 %r419,%r418,3; add.u64 %r150,%r150,%r419; .loc 1 384 24 add.u64 %r420,%frame,%r178; ld.u64 %r111,[%r420]; .loc 1 384 14 add.u64 %r151,%r147,%r111; .loc 1 385 23 add.u64 %r422,%r441,%r178; ld.u64 %r113,[%r422]; .loc 1 385 13 shl.b64 %r423,%r113,3; add.u64 %r152,%r148,%r423; .loc 1 363 32 add.u64 %r425,%r440,%r178; ld.u64 %r116,[%r425]; .loc 1 363 13 setp.eq.u64 %r426,%r108,%r116; @ %r426 bra $L85; .loc 1 367 13 mov.u64 %r153,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r427,[%r211]; setp.eq.u64 %r428,%r427,0; @ ! %r428 bra $L63; bra $L86; $L48: .loc 1 389_gfortran_ssum_c4 .visible .func _gfortran_ssum_c8u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64 %r276; .reg .pred %r277; .reg .pred %r280; .reg .predpred %r292; .reg .f32 %r293; .reg .f32 %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_sum_c4.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r297,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r297; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r297; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r304,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r304; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r296,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r296,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r303,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r303; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r297,0; @ %r211 bra $L127; add.u64 %r296,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r24496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r297,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r297; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r297,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: 
.loc 1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r297,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r297,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,3; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; ld.const.f32 %r293,[$LC4]; ld.const.f32 %r294,[$LC4+4]; setp.le.s64 %r295,%r297,1; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; .loc 1 520 13 mov.u64 %r302,0; $L146: .loc 1 512 13 st.f32 [%r98],%r293; st.f32 [%r98+4],%r294; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L146; .loc 1 525 7 @ ! %r295 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r302; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r301; @ %r283 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; $L144: .loc 1 529 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r289,%r75,%r72; shl.b64 %r290,%r289,3; add.u64 %r98,%r98,%r290; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r292,%r98,0; @ ! 
%r292 bra $L122; bra $L147; $L109: .loc 1 534 sum_c8.o/_gfortran_sum_c8 .visible .func _gfortran_sum_c8fortran/generated/sum_c8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_msum_c8 .visible .func _gfortran_msum_c8gfortran_ssum_c8 .visible .func _gfortran_ssum_c8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10] = {83,85,77114,101,116,117,114,110,32,118,97,108,117,10164 $LC4[2] = {70,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,9977,65,83,75,32,97,114,103,117,109,101,110,116,0 }99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,85,77,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_sum_c8 .visible .func _gfortran_sum_c848u64predu64 %r227; .reg .u64 %r228; .reg .u64 %r229; .reg .predpredu32u64 %r250; .reg .u64 %r251; .reg .u64 %r252; .reg .u64 %r253; .reg .predu32u16u64pred %r326; .reg .u64 %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64u64 %r352; .reg .f64 %r353; .reg .f64 %r354; .reg .u32 %r355; .reg .u64 %r356; .reg .u64 %r357; .reg .u64 %r358; .reg .u64 %r359; .reg .u64 %r360; .reg .u64 %r361; mov.u64 %r181,%ar0; mov.u64 %r182,%ar1; mov.u64 %r183,%ar2; .loc 1 55 10 ld.s8 %r22,[%r182+28]; .loc 1 55 38 add.u32 %r184,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r104,%r184; .loc 1 56 10 ld.u64 %r25,[%r183]; .loc 1 56 7 add.u64 %r105,%r25,-1; .loc 1 58 7 shr.u64 %r186,%r105,63; set.u32.lt.s64 %r188,%r104,%r105; neg.s32 %r189,%r188; cvt.u16.u64 %r192,%r186; cvt.u16.u32 %r193,%r189; or.b16 %r191,%r192,%r193; .loc 1 58 6 cvt.u32.u16 %r194,%r191; cvt.u16.u8 %r195,%r194; setp.eq.u16 %r196,%r195,0; @ %r196 bra $L2; .loc 1 60 7 cvt.u32.u32 %r199,%r22; cvt.s64.s8 %r198,%r199; st.u64 [%stack+8],%r19819197_gfortran_runtime_errorr202,%r105,%r105; add.u64 %r203,%r202,%r105; shl.b64 %r204,%r203,3; add.u64 %r205,%r182,%r204; ld.u64 %r30,[%r205+56]; ld.u64 %r32,[%r205+48]; .loc 1 68 9 ld.u64 %r108,[%r205+40]; .loc 1 70 3 setp.ne.u64 %r219,%r105,0; @ %r219 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r220,%r104,%r105; @ %r220 bra $L4; bra $L46; $L3: add.u64 %r91,%r182,40; add.u64 %r60,%frame,120; add.u64 %r47,%frame,240; add.u64 %r221,%r182,16; add.u64 %r223,%r25,%r25; add.u64 %r224,%r223,%r25; shl.b64 %r225,%r224,3; add.u64 %r178,%r221,%r225; .loc 1 76 12 mov.u64 %r361,0; $L8: .loc 1 72 18 ld.u64 %r226,[%r91]; st.u64 [%r60],%r226; .loc 1 73 19 ld.u64 %r228,[%r91+16]; add.u64 %r227,%r228,1; ld.u64 %r229,[%r91+8]; sub.u64 
%r37,%r227,%r229; .loc 1 75 10 setp.lt.s64 %r230,%r37,0; @ %r230 bra $L6; .loc 1 73 17 st.u64 [%r47],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r47],%r361; $L7: .loc 1 70 3 add.u64 %r91,%r91,24; add.u64 %r60,%r60,8; add.u64 %r47,%r47,8; setp.ne.u64 %r232,%r91,%r178; @ %r232 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r233,[%r181]; setp.eq.u64 %r234,%r233,0; @ ! %r234 bra $L11; bra $L10; $L4: add.u64 %r236,%r25,%r25; add.u64 %r237,%r236,%r25238,40; add.u64 %r147,%r182,%r239; shl.b64 %r240,%r25,3; add.u64 %r134,%r240,-8; add.u64 %r349,%frame,120; add.u64 %r142,%r349,%r134; add.u64 %r348,%frame,240; add.u64 %r130,%r348,%r134; cvt.u32.u32 %r244,%r22; cvt.s64.s8 %r243,%r244; add.u64 %r246,%r243,%r243; add.u64 %r247,%r246,%r243; shl.b64 %r248,%r247,3; add.u64 %r249,%r182,40; add.u64 %r95,%r248,%r249; .loc 1 84 12 mov.u64 %r360,0; $L14: .loc 1 80 18 ld.u64 %r250,[%r147]; st.u64 [%r142],%r250; .loc 1 81 19 ld.u64 %r252,[%r147+16]; add.u64 %r251,%r252,1; ld.u64 %r253,[%r147+8]; sub.u64 %r44,%r251,%r253; .loc 1 83 10 setp.lt.s64 %r254,%r44,0; @ %r254 bra $L12; .loc 1 81 17 st.u64 [%r130],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r130],%r360; $L13: .loc 1 78 3 add.u64 %r147,%r147,24; add.u64 %r142,%r142,8; add.u64 %r130,%r130,8; setp.ne.u64 %r256,%r95,%r147; @ %r256 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r257,%r104,0; @ %r257 bra $L16; add.u64 %r348,%frame,240; $L34: add.u64 %r156,%r181,40; mov.u64 %r154,%r348; add.u64 %r258,%r181,16; cvt.u32.u32 %r260,%r22; cvt.s64.s8 %r259,%r260; add.u64 %r262,%r259,%r259; add.u64 %r263,%r262,%r259; shl.b64 %r264,%r263,3; add.u64 %r148,%r258,%r264; .loc 1 94 10 mov.u64 %r96,1; .loc 1 98 4 mov.u64 %r287,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r265,0; st.u64 [%r181+8],%r265; .loc 1 103 28 cvt.u16.u32 %r268,%r22; add.u16 %r267,%r268,-1; cvt.u32.u16 %r269,%r267; st.u8 [%r181+28],%r269; .loc 1 105 20 add.u32 %r270,%r22,-2; cvt.s64.s32 %r53,%r270; add.u64 %r272,%r53,%r53; add.u64 %r273,%r272,%r53; shl.b64 %r274,%r273,3; add.u64 %r275,%r181,%r274; .loc 1 105 67 shl.b64 %r277,%r53,3; add.u64 %r278,%frame,%r277; .loc 1 105 59 ld.u64 %r280,[%r275+40]; ld.u64 %r281,[%r278+240]; mul.lo.u64 %r110,%r280,%r281; .loc 1 107 29 mov.u64 %r283,1607 27 st.u64 [%r181],%r284; .loc 1 108 10 setp.eq.u64 %r286,%r110,0; @ ! 
%r286 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r96,%r49,%r96; $L17: .loc 1 98 4 st.u64 [%r156+8],%r287; ld.u64 %r49,[%r154]; add.u64 %r288,%r49,-1; st.u64 [%r156+16],%r288; st.u64 [%r156],%r96; .loc 1 91 7 add.u64 %r156,%r156,24; add.u64 %r154,%r154,8; setp.ne.u64 %r289,%r148,%r156; @ %r289 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r181+48],%r110; mov.u64 %r291,-1; st.u64 [%r181+56],%r291; mov.u64 %r292,1; st.u64 [%r181+40],%r292; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r59,[%r181+28]; .loc 1 118 10 setp.eq.u64 %r293,%r59,%r104; @ %r293 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r104; st.u64 [%stack],%r59;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r296,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r297,[%r296+36]; setp.eq.u32 %r298,%r297,0; @ %r298 bra $L21; .loc 1 125 2 add.u64 %r348,%frame,240; cvta.const.u64 %r302,$LC2r3430302; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r304,%r104,0; @ %r304 bra $L24; $L27: .loc 1 65 9 add.u64 %r305,%r30,1; .loc 1 65 7 sub.u64 %r106,%r305,%r32; max.s64 %r45,%r106,0; .loc 1 137 8 ld.u64 %r118,[%r182]; .loc 1 138 8 ld.u64 %r119,[%r181]; .loc 1 154 36 shl.b64 %r66,%r108,4; ld.u64 %r111,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 shl.b64 %r69,%r68,4; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 shl.b64 %r72,%r71,4; .loc 1 169 32 ld.u64 %r126,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r306,%r68,%r126; .loc 1 176 9 shl.b64 %r307,%r306,4; neg.s64 %r129,%r307; .loc 1 177 23 mul.lo.u64 %r308,%r71,%r126; .loc 1 177 9 shl.b64 %r309,%r308,4; neg.s64 %r131,%r309; setp.gt.s64 %r350,%r106,0; setp.le.s64 %r351,%r104,1; .loc 1 150 10 ld.const.f64 %r353,[$LC4]; ld.const.f64 %r354,[$LC4+8]; cvt.u32.u32 %r355,%r22; cvt.s64.s8 %r356,%r355; add.u64 %r357,%r356,-1; add.u64 %r358,%frame,120; add.u64 %r359,%frame,240; bra $L25; $L24: add.u64 %r169,%frame,360; add.u64 %r168,%r181,40; mov.u64 %r166,%frame; cvt.u32.u32 %r311,%r22; cvt.s64.s8 %r310,%r311; add.u64 %r157,%r310,-1; .loc 1 129 3 mov.u64 %r120,0; add.u64 %r348,%frame,240; .loc 1 131 16 mov.u64 %r312,%r120; $L26: st.u64 [%r169],%r312; .loc 1 132 18 ld.u64 %r313,[%r168]; st.u64 [%r166],%r313; .loc 1 133 17 shl.b64 %r315,%r120,3; add.u64 %r316,%r348,%r315; .loc 1 133 10 ld.u64 %r317,[%r316]; setp.le.s64 %r318,%r317,0; @ %r318 bra $L1; .loc 1 129 26 add.u64 %r120,%r120,1; .loc 1 129 3 add.u64 %r169,%r169,8; add.u64 %r168,%r168,24; add.u64 %r166,%r166,8; setp.ne.u64 %r319,%r120,%r157; @ %r319 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r350 bra $L35; .loc 1 150 10 st.f64 [%r119],%r353; st.f64 [%r119+8],%r354; bra $L29; $L35: mov.u64 %r113,%r118; mov.f64 %r40,0d0000000000000000; mov.f64 %r102,%r40; .loc 1 154 13 mov.u64 %r112,0; $L28: .loc 1 158 10 ld.f64 %r323,[%r113]; add.f64 %r102,%r102,%r323; ld.f64 %r324,[%r113+8]; add.f64 %r40,%r40,%r324; .loc 1 154 28 add.u64 %r112,%r112,1; .loc 1 154 36 add.u64 %r113,%r113,%r66; .loc 1 154 6 setp.gt.s64 %r325,%r45,%r112; @ %r325 bra $L28; .loc 1 161 12 st.f64 [%r119],%r102; st.f64 [%r119+8],%r40; $L29: .loc 1 165 15 add.u64 %r111,%r111,1; .loc 1 166 12 add.u64 %r118,%r118,%r69; .loc 1 167 12 add.u64 %r119,%r119,%r72; .loc 1 169 13 setp.ne.u64 %r326,%r111,%r126; @ %r326 bra $L25; .loc 1 176 9 add.u64 %r115,%r118,%r129; .loc 1 177 9 add.u64 %r116,%r119,%r131; .loc 1 179 7 @ %r351 bra $L1; add.u64 %r125,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r117,1; .loc 1 173 13 mov.u64 %r352,0; bra $L32; $L33: st.u64 [%r125],%r352; .loc 1 176 23 mul.lo.u64 %r332,%r85,%r84; .loc 1 176 9 shl.b64 
%r333,%r332,4; sub.u64 %r115,%r118,%r333; .loc 1 177 23 mul.lo.u64 %r334,%r88,%r84; .loc 1 177 9 shl.b64 %r335,%r334,4; sub.u64 %r116,%r119,%r335; .loc 1 178 5 add.u64 %r117,%r117,1; .loc 1 179 7 add.u64 %r125,%r125,8; add.u64 %r140,%r140,8; setp.eq.u64 %r336,%r117,%r357; @ %r336 bra $L1; $L32: .loc 1 187 16 ld.u64 %r337,[%r125]; add.u64 %r84,%r337,1; st.u64 [%r125],%r84; .loc 1 188 23 add.u64 %r339,%r358,%r140; ld.u64 %r85,[%r339]; .loc 1 188 13 shl.b64 %r340,%r85,4; add.u64 %r118,%r115,%r340; .loc 1 189 23 add.u64 %r341,%frame,%r140; ld.u64 %r88,[%r341]; .loc 1 189 13 shl.b64 %r342,%r88,4; add.u64 %r119,%r116,%r342; .loc 1 169 32 add.u64 %r344,%r359,%r140; ld.u64 %r92,[%r344]; .loc 1 169 13 setp.eq.u64 %r345,%r84,%r92; @ %r345 bra $L33; .loc 1 173 13 mov.u64 %r111,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r346,[%r181]; setp.eq.u64 %r347,%r346,0; @ ! %r347 bra $L11; bra $L34; $L1_gfortran_msum_c8 .visible .func _gfortran_msum_c86088pred %r215; .reg .u32 %r219; .reg .u64 %r221; .reg .u32 %r223; .reg .u32 %r224; .reg .u16 %r226; .reg .u16 %r227; .reg .u16u16 %r263; .reg .u32pred %r282; .reg .u64 %r283; .reg .u64u64u6464pred %r325; .reg .u64 %r327; .reg .u64 %r329; .reg .pred %r331; .reg .pred %r332; .reg .u64 %r333; .reg .u32 %r334; .reg .u64u64pred %r357; .reg .u64 %r358; .reg .u64 %r359; .reg .predu64 %r392; .reg .u64 %r393; .reg .pred %r394; .reg .predpred %r404; .reg .pred %r405; .reg .u64pred %r426; .reg .u64 %r427; .reg .predpred.reg .u64mov.u64 %r211,%ar0; mov.u64 %r212,%ar1; mov.u64 %r213,%ar2; mov.u64 %r214,%ar3; .loc 1 223 6 setp.ne.u64 %r215,%r214,0; @ %r215 bra $L49gfortran_sum_c8.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r213]; .loc 1 233 7 add.u64 %r136,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r212+28]; .loc 1 234 38 add.u32 %r219,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r137,%r219; .loc 1 237 7 shr.u64 %r221,%r136,63; set.u32.gt.s64 %r223,%r136,%r137; neg.s32 %r224,%r223; cvt.u16.u64 %r227,%r221; cvt.u16.u32 %r228,%r224; or.b16 %r226,%r227,%r228; .loc 1 237 6 cvt.u32.u16 %r229,%r226; cvt.u16.u8 %r230,%r229; setp.eq.u16 %r231,%r230,0; @ %r231 bra $L51; .loc 1 239 7 cvt.u32.u32 %r234,%r23; cvt.s64.s8 %r233,%r234; st.u64 [%stack+8],%r233; st.u64 [%stack],%r22; cvta.const.u64 %r2322stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r432,%r136,%r136; add.u64 %r433,%r432,%r136; shl.b64 %r239,%r433,3; add.u64 %r240,%r212,%r239; ld.u64 %r243,[%r240+56]; add.u64 %r242,%r243,1; .loc 1 244 7 ld.u64 %r250,[%r240+48]; sub.u64 %r138,%r242,%r250; .loc 1 245 6 setp.le.s64 %r251,%r138,0; @ %r251 bra $L48; .loc 1 248 9 ld.u64 %r151,[%r214]; .loc 1 250 15 ld.u64 %r34,[%r214+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r252,%r35,-4; and.b32 %r253,%r252,-5; set.u32.eq.u32 %r255,%r253,0; neg.s32 %r256,%r255; .loc 1 252 22 add.u32 %r257,%r35,-1; set.u32.le.u32 %r259,%r257,1; neg.s32 %r260,%r259; .loc 1 252 6 cvt.u16.u32 %r262,%r256; cvt.u16.u32 %r263,%r260; or.b16 %r261,%r262,%r263; cvt.u32.u16 %r264,%r261; cvt.u16.u8 %r265,%r264; setp.ne.u16 %r266,%r265,0; @ %r266 bra $L53; .loc 1 254 7 setp.ne.u32 %r268,%r35,16; @ %r268 bra $L54; $L53: .loc 1 261 9 shl.b64 %r272,%r433,3; add.u64 %r273,%r212,%r272; ld.u64 %r141,[%r273+40]; .loc 1 262 12 add.u64 %r279,%r214,%r272; ld.u64 %r41,[%r279+40]; .loc 1 264 3 setp.ne.u64 %r281,%r136,0; @ %r281 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r282,%r136,%r137; @ %r282 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r283283_gfortran_runtime_error5: add.u64 %r40,%r212,40; add.u64 %r129,%r214,40; add.u64 %r285,%r212,16; add.u64 
%r287,%r22,%r22; add.u64 %r288,%r287,%r22; shl.b64 %r289,%r288,3; add.u64 %r208,%r285,%r289; .loc 1 264 3 mov.u64 %r87,0; add.u64 %r430,%frame,240; add.u64 %r429,%frame,360; .loc 1 271 12 mov.u64 %r444,%r87; $L60: .loc 1 266 18 add.u64 %r291,%r430,%r87; ld.u64 %r292,[%r40]; st.u64 [%r291],%r292; .loc 1 267 18 add.u64 %r293,%frame,%r87; .loc 1 267 20 ld.u64 %r295,[%r129]; mul.lo.u64 %r294,%r295,%r34; .loc 1 267 18 st.u64 [%r293],%r294; .loc 1 268 19 ld.u64 %r297,[%r40+16]; add.u64 %r296,%r297,1; ld.u64 %r298,[%r40+8]; sub.u64 %r52,%r296,%r298; .loc 1 270 10 setp.lt.s64 %r299,%r52,0; @ %r299 bra $L58; .loc 1 268 17 add.u64 %r301,%r429,%r87; st.u64 [%r301],%r52; bra $L59; $L58: .loc 1 271 12 add.u64 %r303,%r429,%r87; st.u64 [%r303],%r444; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r129,%r129,24; add.u64 %r87,%r87,8; setp.ne.u64 %r305,%r40,%r208; @ %r305 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r306,[%r211]; setp.eq.u64 %r307,%r306,0; @ ! %r307 bra $L63; bra $L62; $L56: add.u64 %r309,%r22,%r22; add.u64 %r310,%r309,%r22; shl.b64 %r311,%r310,3; add.u64 %r122,%r311,40; add.u64 %r131,%r212,%r122; add.u64 %r117,%r214,%r122; shl.b64 %r312,%r22,3; add.u64 %r135,%r312,-8; cvt.u32.u32 %r314,%r23; cvt.s64.s8 %r313,%r314; shl.b64 %r315,%r313,3; add.u64 %r66,%r315,-8; add.u64 %r430,%frame,240; add.u64 %r429,%frame,360; .loc 1 281 12 mov.u64 %r443,0; $L66: .loc 1 276 18 add.u64 %r317,%r430,%r135; ld.u64 %r318,[%r131]; st.u64 [%r317],%r318; .loc 1 277 18 add.u64 %r319,%frame,%r135; .loc 1 277 20 ld.u64 %r321,[%r117]; mul.lo.u64 %r320,%r321,%r34; .loc 1 277 18 st.u64 [%r319],%r320; .loc 1 278 19 ld.u64 %r323,[%r131+16]; add.u64 %r322,%r323,1; ld.u64 %r324,[%r131+8]; sub.u64 %r62,%r322,%r324; .loc 1 280 10 setp.lt.s64 %r325,%r62,0; @ %r325 bra $L64; .loc 1 278 17 add.u64 %r327,%r429,%r135; st.u64 [%r327],%r62; bra $L65; $L64: .loc 1 281 12 add.u64 %r329,%r429,%r135; st.u64 [%r329],%r443; $L65: .loc 1 274 3 add.u64 %r131,%r131,24; add.u64 %r117,%r117,24; add.u64 %r135,%r135,8; setp.ne.u64 %r331,%r66,%r135; @ %r331 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r332,%r137,0; @ %r332 bra $L68; add.u64 %r429,%frame,360; $L86: add.u64 %r179,%r211,40; mov.u64 %r171,%r429; cvt.u32.u32 %r334,%r23; cvt.s64.s8 %r333,%r334; add.u64 %r336,%r333,%r333; add.u64 %r337,%r336,%r333; shl.b64 %r338,%r337,3; add.u64 %r339,%r211,16; add.u64 %r132,%r338,%r339; .loc 1 291 10 mov.u64 %r126,1; .loc 1 295 4 mov.u64 %r358,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r340,%r23,-2; cvt.s64.s32 %r71,%r340; add.u64 %r342,%r71,%r71; add.u64 %r343,%r342,%r71; shl.b64 %r344,%r343,3; add.u64 %r345,%r211,%r344; .loc 1 299 67 shl.b64 %r347,%r71,3; add.u64 %r348,%frame,%r347; .loc 1 299 59 ld.u64 %r350,[%r345+40]; ld.u64 %r351,[%r348+360]; mul.lo.u64 %r145,%r350,%r351; .loc 1 301 24 mov.u64 %r352,0; st.u64 [%r211+8],%r352; .loc 1 302 28 cvt.u16.u32 %r355,%r23; add.u16 %r354,%r355,-1; cvt.u32.u16 %r356,%r354; st.u8 [%r211+28],%r356; .loc 1 304 10 setp.eq.u64 %r357,%r145,0; @ %r357 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r126,%r68,%r126; $L69: .loc 1 295 4 st.u64 [%r179+8],%r358; ld.u64 %r68,[%r171]; add.u64 %r359,%r68,-1; st.u64 [%r179+16],%r359; st.u64 [%r179],%r126; .loc 1 288 7 add.u64 %r179,%r179,24; add.u64 %r171,%r171,8; setp.ne.u64 %r360,%r132,%r179; @ %r360 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r211+48],%r145; mov.u64 %r362,-1; st.u64 [%r211+56],%r362; mov.u64 %r363,1; st.u64 [%r211+40],%r363; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r365,16365; call 
(%value_in),_gfortrani_xmallocarray66,[%value_in]; } .loc 1 311 22 st.u64 [%r211],%r366; bra $L73; $L63: .loc 1 316 19 ld.s8 %r368,[%r211+28]; .loc 1 316 10 setp.eq.u64 %r369,%r368,%r137; @ %r369 bra $L74; .loc 1 317 2 cvta.const.u64 %r370,$LC370_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r372,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r373,[%r372+36]; setp.eq.u32 %r374,%r373,0; @ %r374 bra $L73; .loc 1 321 4 add.u64 %r429,%frame,360; cvta.const.u64 %r378,$LC2377378; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r38221212382378; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r384,%r137,0; @ %r384 bra $L75; $L78: .loc 1 336 8 ld.u64 %r152,[%r211]; .loc 1 337 8 ld.u64 %r150,[%r212]; .loc 1 339 9 setp.ne.u64 %r385,%r150,0; @ %r385 bra $L76; bra $L48; $L75: add.u64 %r192,%frame,480; add.u64 %r191,%r211,40; add.u64 %r189,%frame,120; cvt.u32.u32 %r387,%r23; cvt.s64.s8 %r386,%r387; add.u64 %r180,%r386,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r429,%frame,360; .loc 1 330 16 mov.u64 %r388,%r157; $L77: st.u64 [%r192],%r388; .loc 1 331 18 ld.u64 %r389,[%r191]; st.u64 [%r189],%r389; .loc 1 332 17 shl.b64 %r391,%r157,3; add.u64 %r392,%r429,%r391; .loc 1 332 10 ld.u64 %r393,[%r392]; setp.le.s64 %r394,%r393,0; @ %r394 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r192,%r192,8; add.u64 %r191,%r191,24; add.u64 %r189,%r189,8; setp.ne.u64 %r395,%r157,%r180; @ %r395 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r43,%r41,%r34; .loc 1 349 32 shl.b64 %r86,%r141,4; ld.u64 %r153,[%frame+480]; .loc 1 359 22 ld.u64 %r89,[%frame+240]; .loc 1 359 12 shl.b64 %r90,%r89,4; .loc 1 360 23 ld.u64 %r92,[%frame]; .loc 1 361 22 ld.u64 %r94,[%frame+120]; .loc 1 361 12 shl.b64 %r95,%r94,4; .loc 1 363 32 ld.u64 %r80,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r80,%r89; .loc 1 371 24 mul.lo.u64 %r396,%r80,%r92; .loc 1 371 10 neg.s64 %r162,%r396; .loc 1 372 23 mul.lo.u64 %r397,%r80,%r94; .loc 1 372 9 shl.b64 %r398,%r397,4; neg.s64 %r165,%r398; setp.le.s64 %r434,%r137,1; cvt.u32.u32 %r436,%r23; cvt.s64.s8 %r437,%r436; add.u64 %r438,%r437,-1; add.u64 %r439,%frame,240; add.u64 %r440,%frame,360; add.u64 %r441,%frame,120; .loc 1 367 13 mov.u64 %r442,0; $L87: .loc 1 328 3 mov.u64 %r156,%r151; mov.u64 %r155,%r150; mov.f64 %r118,0d0000000000000000; mov.f64 %r119,%r118; .loc 1 349 9 mov.u64 %r154,0; $L80: .loc 1 352 6 ld.s8 %r400,[%r156]; cvt.u16.u32 %r399,%r400; setp.eq.u16 %r401,%r399,0; @ %r401 bra $L79; .loc 1 353 12 ld.f64 %r402,[%r155]; add.f64 %r119,%r119,%r402; ld.f64 %r403,[%r155+8]; add.f64 %r118,%r118,%r403; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r86; .loc 1 349 47 add.u64 %r156,%r156,%r43; .loc 1 349 2 setp.ne.u64 %r404,%r138,%r154; @ %r404 bra $L80; .loc 1 355 8 st.f64 [%r152],%r119; st.f64 [%r152+8],%r118; .loc 1 358 15 add.u64 %r153,%r153,1; .loc 1 359 12 add.u64 %r150,%r150,%r90; .loc 1 360 13 add.u64 %r151,%r151,%r92; .loc 1 361 12 add.u64 %r152,%r152,%r95; .loc 1 363 13 setp.ne.u64 %r405,%r80,%r153; @ %r405 bra $L87; .loc 1 371 10 add.u64 %r147,%r151,%r162; .loc 1 372 9 add.u64 %r148,%r152,%r165; .loc 1 374 7 @ ! 
%r434 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r128],%r442; .loc 1 370 23 mul.lo.u64 %r98,%r109,%r108; .loc 1 371 24 mul.lo.u64 %r408,%r111,%r108; .loc 1 371 10 sub.u64 %r147,%r151,%r408; .loc 1 372 23 mul.lo.u64 %r409,%r113,%r108; .loc 1 372 9 shl.b64 %r410,%r409,4; sub.u64 %r148,%r152,%r410; .loc 1 373 5 add.u64 %r149,%r149,1; .loc 1 374 7 add.u64 %r128,%r128,8; add.u64 %r178,%r178,8; setp.ne.u64 %r411,%r149,%r438; @ %r411 bra $L84; bra $L48; $L108: add.u64 %r128,%frame,488; .loc 1 370 23 mov.u64 %r98,%r38; .loc 1 374 7 mov.u64 %r178,8; .loc 1 373 5 mov.u64 %r149,1; $L84: .loc 1 382 16 ld.u64 %r415,[%r128]; add.u64 %r108,%r415,1; st.u64 [%r128],%r108; .loc 1 383 23 add.u64 %r417,%r439,%r178; ld.u64 %r109,[%r417]; .loc 1 383 13 sub.u64 %r418,%r109,%r98; shl.b64 %r419,%r418,4; add.u64 %r150,%r150,%r419; .loc 1 384 24 add.u64 %r420,%frame,%r178; ld.u64 %r111,[%r420]; .loc 1 384 14 add.u64 %r151,%r147,%r111; .loc 1 385 23 add.u64 %r422,%r441,%r178; ld.u64 %r113,[%r422]; .loc 1 385 13 shl.b64 %r423,%r113,4; add.u64 %r152,%r148,%r423; .loc 1 363 32 add.u64 %r425,%r440,%r178; ld.u64 %r116,[%r425]; .loc 1 363 13 setp.eq.u64 %r426,%r108,%r116; @ %r426 bra $L85; .loc 1 367 13 mov.u64 %r153,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r427,[%r211]; setp.eq.u64 %r428,%r427,0; @ ! %r428 bra $L63; bra $L86; $L48: .loc 1 389_gfortran_ssum_c8 .visible .func _gfortran_ssum_c88u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64 %r276; .reg .pred %r277; .reg .pred %r280; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_sum_c8.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r297,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r297; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r297; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r304,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r304; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r296,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r296,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r303,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r303; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r297,0; @ %r211 bra $L127; add.u64 %r296,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r244,1696244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r297,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r297; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r297,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; 
$L132: .loc 1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r297,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r297,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,4; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; ld.const.f64 %r293,[$LC4]; ld.const.f64 %r294,[$LC4+8]; setp.le.s64 %r295,%r297,1; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; .loc 1 520 13 mov.u64 %r302,0; $L146: .loc 1 512 13 st.f64 [%r98],%r293; st.f64 [%r98+8],%r294; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L146; .loc 1 525 7 @ ! %r295 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r302; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r301; @ %r283 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; $L144: .loc 1 529 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r289,%r75,%r72; shl.b64 %r290,%r289,4; add.u64 %r98,%r98,%r290; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r292,%r98,0; @ ! 
%r292 bra $L122; bra $L147; $L109: .loc 1 534sum_c10.o/ bessel_r4.o/_gfortran_bessel_jn_r4 .visible .func _gfortran_bessel_jn_r4, .param .f32 %in_ar3); .file 1 "../../../../libgfortran/generated/bessel_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_bessel_yn_r4 .visible .func _gfortran_bessel_yn_r4_gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime_jnf .extern .func (.param .f32 %value_out) jnfCL: ynf .extern .func (.param .f32 %value_out) ynf99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,66,69,83,83,69,76,95,74,78,32,40,37,108,100,32,118,115,46,32,37,108,100,41_gfortran_bessel_jn_r4 .visible .func _gfortran_bessel_jn_r4predpredpred %r115; .reg .f32 %r116; .reg .pred %r117; .reg .f32 %r118; .reg .predpredpred %r147; mov.u64 %r84,%ar0; mov.u32 %r85,%ar1; mov.u32 %r86,%ar2; mov.f32 %r87,%ar3; .loc 1 52 6 ld.u64 %r88,[%r84]; setp.ne.u64 %r89,%r88,0; @ %r89 bra $L2; .loc 1 54 33 setp.lt.s32 %r90,%r86,%r85; @ %r90 bra $L10; .loc 1 54 37 sub.u32 %r91,%r86,%r85; .loc 1 54 40 add.u32 %r92,%r91,1; .loc 1 54 33 cvt.s64.s32 %r63,%r92; .loc 1 55 7 add.u64 %r78,%r63,-1; bra $L3; $L10: mov.u64 %r78,-1; .loc 1 54 33 mov.u64 %r63,%r88; $L3: .loc 1 55 7 mov.u64 %r93,0; st.u64 [%r84+48],%r93; st.u64 [%r84+56],%r78; mov.u64 %r94,1; st.u64 [%r84+40],%r94; .loc 1 56 24 mov.u64 %r96r96; call (%value_in),_gfortrani_xmallocarray.loc 1 56 22 st.u64 [%r84],%r97; .loc 1 57 19 st.u64 [%r84+8],%r93; $L2: .loc 1 60 6 setp.ge.s32 %r100,%r86,%r85; @ ! %r100 bra $L1; .loc 1 54 37 sub.u32 %r83,%r86,%r85; .loc 1 63 7 cvta.global.u64 %r101,_gfortrani_compile_options; .loc 1 63 6 ld.u32 %r102,[%r101+36]; setp.eq.u32 %r103,%r102,0; @ %r103 bra $L5; .loc 1 64 10 ld.u64 %r105,[%r84+56]; add.u64 %r104,%r105,1; ld.u64 %r106,[%r84+48]; sub.u64 %r33,%r104,%r106; .loc 1 64 48 add.u32 %r107,%r83,133,%r108; @ %r109 bra $L5; .loc 1 65 5 st.u64 [%stack+8],%r33; .loc 1 66 22 cvt.s64.s32 %r111,%r86; .loc 1 65 5 cvt.s64.s32 %r112,%r85; sub.u64 %r113,%r111,%r112; st.u64 [%stack],%r113; cvta.const.u64 %r110110_gfortran_runtime_errorr66,[%r84+40]; .loc 1 71 6 setp.neu.f32 %r115,%r87,0f00000000; @ %r115 bra $L6; .loc 1 73 10 ld.u64 %r41,[%r84]; .loc 1 73 25 mov.f32 %r116,0f3f800000; st.f32 [%r41],%r116; .loc 1 74 7 setp.eq.u32 %r117,%r83,0; @ %r117 bra $L1; shl.b64 %r74,%r66,2; add.u64 %r65,%r41,%r74; add.u32 %r69,%r83,1; .loc 1 74 14 mov.u32 %r68,1; .loc 1 75 34 mov.f32 %r118,0f00000000; $L8: st.f32 [%r65],%r118; .loc 1 74 32 add.u32 %r68,%r68,1; .loc 1 74 7 add.u64 %r65,%r65,%r74; setp.ne.u32 %r119,%r68,%r69; @ %r119 bra $L8; bra $L1; $L6: .loc 1 79 jn122,[%value_in]; } mov.f32 %r40,%r122; .loc 1 80 21 cvt.s64.s32 %r123,%r83; .loc 1 80 25 mul.lo.u64 %r124,%r123,%r66; .loc 1 80 34 ld.u64 %r125,[%r84]; shl.b64 %r126,%r124,2; add.u64 %r127,%r125,%r126; st.f32 [%r127],%r40; .loc 1 82 6 setp.eq.u32 %r128,%r86,%r85; @ %r128 bra $L1; .loc 1 85 11 add.u32 %r131,%r86,-131(%value_in),jn132,[%value_in]; } mov.f32 %r29,%r132; .loc 1 86 6 ld.u64 %r50,[%r84]; .loc 1 86 24 add.u32 %r51,%r83,-1; cvt.s64.s32 %r133,%r51; .loc 1 86 27 mul.lo.u64 %r53,%r133,%r66; .loc 1 86 36 shl.b64 %r134,%r53,2; add.u64 %r135,%r50,%r134; st.f32 [%r135],%r29; .loc 1 88 10 add.u32 %r136,%r85,1; .loc 1 88 6 setp.eq.u32 %r137,%r136,%r86; @ %r137 bra $L1; .loc 1 91 9 mov.f32 %r138,0f40000000; div.rn.f32 %r67,%r138,%r87; .loc 1 93 3 setp.le.s32 
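The stream up to this point is the nvptx PTX that GCC emits for libgfortran's generated sum_c8.c: _gfortran_sum_c8 reduces one dimension of a COMPLEX(8) array, _gfortran_msum_c8 does the same under an elemental logical mask (the mask-kind check is visible around .loc 1 252), and _gfortran_ssum_c8 handles a scalar mask, either delegating to the unmasked routine or filling the result with zero. Because the dump is corrupted, the following is only an illustrative C sketch of the reduction the PTX encodes, with the GFC array-descriptor and stride bookkeeping flattened to a contiguous 2-D view; the function name and signature below are hypothetical, not the packaged source.

/* Illustrative sketch of the SUM reduction in the sum_c8 / msum_c8 /
   ssum_c8 PTX above: one dimension of a COMPLEX(8) array is summed,
   optionally under a logical mask.  The generated code walks array
   descriptors with per-dimension strides; here that is flattened to
   result_count rows of len contiguous elements. */
#include <complex.h>
#include <stdio.h>

/* mask may be NULL (plain SUM) or give one logical byte per element. */
static void sum_c8_sketch(double complex *ret, const double complex *src,
                          const char *mask, long result_count, long len)
{
    for (long r = 0; r < result_count; r++) {
        /* identity 0+0i: also what the PTX stores (the $LC4 constant
           pair) when the reduced extent is empty */
        double complex acc = 0.0;
        for (long i = 0; i < len; i++)
            if (!mask || mask[r * len + i])
                acc += src[r * len + i];   /* real and imag parts added,
                                              as the paired add.f64 does */
        ret[r] = acc;
    }
}

int main(void)
{
    double complex a[4] = { 1 + 2*I, 3 + 4*I, 5 - 1*I, -2 + 0*I };
    double complex out[2];
    sum_c8_sketch(out, a, NULL, 2, 2);
    printf("(%g,%g) (%g,%g)\n", creal(out[0]), cimag(out[0]),
                                creal(out[1]), cimag(out[1]));
    return 0;
}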
%r139,%r83,1; @ %r139 bra $L1; add.u32 %r64,%r51,%r85; neg.s64 %r141,%r66; shl.b64 %r142,%r141,2; sub.u64 %r143,%r53,%r66; shl.b64 %r144,%r143,2; add.u64 %r61,%r50,%r144; $L9: neg.f32 %r27,%r40; mov.f32 %r40,%r29; .loc 1 95 40 cvt.rn.f32.s32 %r145,%r64; mul.f32 %r146,%r145,%r67; .loc 1 95 59 fma.rn.f32 %r29,%r29,%r146,%r27; .loc 1 95 32 st.f32 [%r61],%r29; .loc 1 93 3 add.u32 %r64,%r64,-1; add.u64 %r61,%r61,%r142; setp.ne.u32 %r147,%r64,%r85; @ %r147 bra $L9; $L1: .loc 1 99_gfortran_bessel_yn_r4 .visible .func _gfortran_bessel_yn_r4u64 %r68; .reg .f32 %r69; .reg .u64 %r70; .reg .u3264pred %r109; .reg .f32predpred %r127; .reg .f32 %r128; mov.u64 %r74,%ar0; mov.u32 %r75,%ar1; mov.u32 %r76,%ar2; mov.f32 %r77,%ar3; .loc 1 119 6 ld.u64 %r78,[%r74]; setp.ne.u64 %r79,%r78,0; @ %r79 bra $L20; .loc 1 121 33 setp.lt.s32 %r80,%r76,%r75; @ %r80 bra $L30; .loc 1 121 37 sub.u32 %r81,%r76,%r75; .loc 1 121 40 add.u32 %r82,%r81,1; .loc 1 121 33 cvt.s64.s32 %r65,%r82; .loc 1 122 7 add.u64 %r70,%r65,-1; bra $L21; $L30: mov.u64 %r70,-1; .loc 1 121 33 mov.u64 %r65,%r78; $L21: .loc 1 122 7 mov.u64 %r83,0; st.u64 [%r74+48],%r83; st.u64 [%r74+56],%r70; mov.u64 %r84,1; st.u64 [%r74+40],%r84; .loc 1 123 24 mov.u64 %r866586; call (%value_in),_gfortrani_xmallocarray.loc 1 123 22 st.u64 [%r74],%r87; .loc 1 124 19 st.u64 [%r74+8],%r83; $L20: .loc 1 127 6 setp.ge.s32 %r90,%r76,%r75; @ ! %r90 bra $L19; .loc 1 130 7 cvta.global.u64 %r91,_gfortrani_compile_options; .loc 1 130 6 ld.u32 %r92,[%r91+36]; setp.eq.u32 %r93,%r92,0; @ %r93 bra $L23; .loc 1 131 10 ld.u64 %r95,[%r74+56]; add.u64 %r94,%r95,1; ld.u64 %r96,[%r74+48]; sub.u64 %r34,%r94,%r96; .loc 1 131 45 sub.u32 %r97,%r76,%r75; .loc 1 131 48 add.u32 %r98,%r97,1; cvt.s64.s32 %r99,%r98; .loc 1 131 7 setp.eq.u64 %r100,%r34,%r99; @ %r100 bra $L23; .loc 1 132 5 st.u64 [%stack+8],%r34; .loc 1 133 22 cvt.s64.s32 %r102,%r76; .loc 1 132 5 cvt.s64.s32 %r103,%r75; sub.u64 %r104,%r102,%r103; st.u64 [%stack],%r104; cvta.const.u64 %r101,$LC0; {_gfortran_runtime_error23: .loc 1 136 10 ld.u64 %r68,[%r74+40]; .loc 1 138 6 setp.neu.f32 %r106,%r77,0f00000000; @ %r106 bra $L24; shl.b64 %r64,%r68,2; ld.u64 %r67,[%r74]; .loc 1 140 26 sub.u32 %r107,%r76,%r75; add.u32 %r50,%r107,1; .loc 1 140 14 mov.u32 %r42,0; .loc 1 142 34 mov.f32 %r108,0fff800000; $L25: st.f32 [%r67],%r108; .loc 1 140 32 add.u32 %r42,%r42,1; .loc 1 140 7 add.u64 %r67,%r67,%r64; setp.ne.u32 %r109,%r50,%r42; @ %r109 bra $L25; bra $L19; $L24: .loc 1 149 yn112,[%value_in]; } mov.f32 %r62,%r112; .loc 1 150 21 ld.u64 %r113,[%r74]; st.f32 [%r113],%r62; .loc 1 152 6 setp.eq.u32 %r114,%r76,%r75; @ %r114 bra $L19; .loc 1 155 11 add.u32 %r45,%r75,yn117,[%value_in]; } mov.f32 %r63,%r117; .loc 1 156 17 shl.b64 %r48,%r68,2; ld.u64 %r118,[%r74]; add.u64 %r49,%r118,%r48; .loc 1 156 28 st.f32 [%r49],%r63; .loc 1 158 6 setp.eq.u32 %r119,%r45,%r76; @ %r119 bra $L19; .loc 1 161 9 mov.f32 %r120,0f40000000; div.rn.f32 %r69,%r120,%r77; .loc 1 163 23 sub.u32 %r51,%r76,%r75; .loc 1 163 3 setp.eq.u32 %r121,%r51,1; @ %r121 bra $L19; add.u64 %r41,%r49,%r48; mov.u32 %r72,%r45; add.u32 %r25,%r51,%r75; .loc 1 168 29 mov.f32 %r128,0fff800000; $L29: .loc 1 166 10 setp.neu.f32 %r122,%r63,0fff800000; @ %r122 bra $L27; .loc 1 168 29 st.f32 [%r41],%r128; bra $L28; $L27: .loc 1 173 37 cvt.rn.f32.s32 %r124,%r72; mul.f32 %r125,%r124,%r69; .loc 1 173 56 neg.f32 %r126,%r62; fma.rn.f32 %r58,%r125,%r63,%r126; .loc 1 173 29 st.f32 [%r41],%r58; mov.f32 %r62,%r63; .loc 1 175 10 mov.f32 %r63,%r58; $L28: .loc 1 163 3 add.u64 %r41,%r41,%r48; add.u32 %r72,%r72,1; 
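The _gfortran_bessel_jn_r4 and _gfortran_bessel_yn_r4 bodies around this point (from generated/bessel_r4.c; the REAL(8) twins from bessel_r8.c follow) compute a whole range of Bessel functions from two libm seed calls plus a three-term recurrence: jn_r4 seeds the two highest orders with jnf() and recurs downward, yn_r4 seeds the two lowest orders with ynf() and recurs upward, propagating -Inf once it appears. The 2/x factor (div.rn.f32 of 0f40000000 by x) and the fma with the negated previous term are visible in the PTX. Below is an illustrative C reconstruction of those recurrences, assuming contiguous result arrays; the real code works through an array descriptor and also carries the bounds checks seen around .loc 1 63-66. Backward recurrence is the numerically stable direction for J_n, while Y_n is stable going upward, which is why the two routines seed from opposite ends.

/* Illustrative reconstruction of the recurrences in the bessel_r4 PTX.
   Assumptions: ret holds n2-n1+1 contiguous floats; jnf()/ynf() are the
   libm seeds the PTX declares as extern .func. */
#define _GNU_SOURCE
#include <math.h>
#include <stdio.h>

/* J_{n1}(x)..J_{n2}(x): seed the two highest orders, recur downward:
   J_{n-1}(x) = (2n/x) * J_n(x) - J_{n+1}(x). */
static void bessel_jn_range(float *ret, int n1, int n2, float x)
{
    int count = n2 - n1 + 1;
    if (count <= 0)
        return;
    if (x == 0.0f) {                          /* x == 0 path at .loc 1 71-75 */
        ret[0] = (n1 == 0) ? 1.0f : 0.0f;     /* the PTX stores 1.0 outright */
        for (int i = 1; i < count; i++)
            ret[i] = 0.0f;
        return;
    }
    ret[count - 1] = jnf(n2, x);
    if (count == 1)
        return;
    ret[count - 2] = jnf(n2 - 1, x);
    float two_over_x = 2.0f / x;              /* the 0f40000000 / x division */
    for (int n = n2 - 1, i = count - 2; i >= 1; n--, i--)
        ret[i - 1] = (float)n * two_over_x * ret[i] - ret[i + 1];
}

/* Y_{n1}(x)..Y_{n2}(x): seed the two lowest orders, recur upward:
   Y_{n+1}(x) = (2n/x) * Y_n(x) - Y_{n-1}(x); once a term is -Inf,
   every later term stays -Inf (the 0fff800000 test at .loc 1 166). */
static void bessel_yn_range(float *ret, int n1, int n2, float x)
{
    int count = n2 - n1 + 1;
    if (count <= 0)
        return;
    if (x == 0.0f) {                          /* x == 0: whole result is -Inf */
        for (int i = 0; i < count; i++)
            ret[i] = -INFINITY;
        return;
    }
    ret[0] = ynf(n1, x);
    if (count == 1)
        return;
    ret[1] = ynf(n1 + 1, x);
    float two_over_x = 2.0f / x;
    for (int n = n1 + 1, i = 1; i < count - 1; n++, i++)
        ret[i + 1] = (ret[i] == -INFINITY)
                     ? -INFINITY
                     : (float)n * two_over_x * ret[i] - ret[i - 1];
}

int main(void)
{
    float j[4], y[4];
    bessel_jn_range(j, 0, 3, 1.0f);
    bessel_yn_range(y, 0, 3, 1.0f);
    for (int i = 0; i < 4; i++)
        printf("J_%d(1)=% .6f  Y_%d(1)=% .6f\n", i, j[i], i, y[i]);
    return 0;
}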
setp.ne.u32 %r127,%r25,%r72; @ %r127 bra $L29; $L19: .loc 1 1 bessel_r8.o/_gfortran_bessel_jn_r8 .visible .func _gfortran_bessel_jn_r8fortran/generated/bessel_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_bessel_yn_r8 .visible .func _gfortran_bessel_yn_r8gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime_jn .extern .func (.param .f64 %value_out) jnCL: yn .extern .func (.param .f64 %value_out) yn99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,66,69,83,83,69,76,95,74,78,32,40,37,108,100,32,118,115,46,32,37,108,100,41_gfortran_bessel_jn_r8 .visible .func _gfortran_bessel_jn_r8predpredpred %r115; .reg .f64 %r116; .reg .pred %r117; .reg .f64 %r118; .reg .predpredpred %r147; mov.u64 %r84,%ar0; mov.u32 %r85,%ar1; mov.u32 %r86,%ar2; mov.f64 %r87,%ar3; .loc 1 52 6 ld.u64 %r88,[%r84]; setp.ne.u64 %r89,%r88,0; @ %r89 bra $L2; .loc 1 54 33 setp.lt.s32 %r90,%r86,%r85; @ %r90 bra $L10; .loc 1 54 37 sub.u32 %r91,%r86,%r85; .loc 1 54 40 add.u32 %r92,%r91,1; .loc 1 54 33 cvt.s64.s32 %r63,%r92; .loc 1 55 7 add.u64 %r78,%r63,-1; bra $L3; $L10: mov.u64 %r78,-1; .loc 1 54 33 mov.u64 %r63,%r88; $L3: .loc 1 55 7 mov.u64 %r93,0; st.u64 [%r84+48],%r93; st.u64 [%r84+56],%r78; mov.u64 %r94,1; st.u64 [%r84+40],%r94; .loc 1 56 24 mov.u6r96; call (%value_in),_gfortrani_xmallocarray.loc 1 56 22 st.u64 [%r84],%r97; .loc 1 57 19 st.u64 [%r84+8],%r93; $L2: .loc 1 60 6 setp.ge.s32 %r100,%r86,%r85; @ ! %r100 bra $L1; .loc 1 54 37 sub.u32 %r83,%r86,%r85; .loc 1 63 7 cvta.global.u64 %r101,_gfortrani_compile_options; .loc 1 63 6 ld.u32 %r102,[%r101+36]; setp.eq.u32 %r103,%r102,0; @ %r103 bra $L5; .loc 1 64 10 ld.u64 %r105,[%r84+56]; add.u64 %r104,%r105,1; ld.u64 %r106,[%r84+48]; sub.u64 %r33,%r104,%r106; .loc 1 64 48 add.u32 %r107,%r83,133,%r108; @ %r109 bra $L5; .loc 1 65 5 st.u64 [%stack+8],%r33; .loc 1 66 22 cvt.s64.s32 %r111,%r86; .loc 1 65 5 cvt.s64.s32 %r112,%r85; sub.u64 %r113,%r111,%r112; st.u64 [%stack],%r113; cvta.const.u64 %r110110_gfortran_runtime_errorr66,[%r84+40]; .loc 1 71 6 setp.neu.f64 %r115,%r87,0d0000000000000000; @ %r115 bra $L6; .loc 1 73 10 ld.u64 %r41,[%r84]; .loc 1 73 25 mov.f64 %r116,0d3ff0000000000000; st.f64 [%r41],%r116; .loc 1 74 7 setp.eq.u32 %r117,%r83,0; @ %r117 bra $L1; shl.b64 %r74,%r66,3; add.u64 %r65,%r41,%r74; add.u32 %r69,%r83,1; .loc 1 74 14 mov.u32 %r68,1; .loc 1 75 34 mov.f64 %r118,0d0000000000000000; $L8: st.f64 [%r65],%r118; .loc 1 74 32 add.u32 %r68,%r68,1; .loc 1 74 7 add.u64 %r65,%r65,%r74; setp.ne.u32 %r119,%r68,%r69; @ %r119 bra $L8; bra $L1; $L6: .loc 1 79 1186(%value_in),jn,(%out_arg1,%out_arg2); ld.param.f64 %r122,[%value_in]; } mov.f64 %r40,%r122; .loc 1 80 21 cvt.s64.s32 %r123,%r83; .loc 1 80 25 mul.lo.u64 %r124,%r123,%r66; .loc 1 80 34 ld.u64 %r125,[%r84]; shl.b64 %r126,%r124,3; add.u64 %r127,%r125,%r126; st.f64 [%r127],%r40; .loc 1 82 6 setp.eq.u32 %r128,%r86,%r85; @ %r128 bra $L1; .loc 1 85 11 add.u32 %r131,%r86,-1131(%value_in),jn,(%out_arg1,%out_arg2); ld.param.f64 %r132,[%value_in]; } mov.f64 %r29,%r132; .loc 1 86 6 ld.u64 %r50,[%r84]; .loc 1 86 24 add.u32 %r51,%r83,-1; cvt.s64.s32 %r133,%r51; .loc 1 86 27 mul.lo.u64 %r53,%r133,%r66; .loc 1 86 36 shl.b64 %r134,%r53,3; add.u64 %r135,%r50,%r134; st.f64 [%r135],%r29; .loc 1 88 10 add.u32 %r136,%r85,1; .loc 1 88 6 setp.eq.u32 %r137,%r136,%r86; @ %r137 bra $L1; .loc 1 91 9 mov.f64 
%r138,0d4000000000000000; div.rn.f64 %r67,%r138,%r87; .loc 1 93 3 setp.le.s32 %r139,%r83,1; @ %r139 bra $L1; add.u32 %r64,%r51,%r85; neg.s64 %r141,%r66; shl.b64 %r142,%r141,3; sub.u64 %r143,%r53,%r66; shl.b64 %r144,%r143,3; add.u64 %r61,%r50,%r144; $L9: neg.f64 %r27,%r40; mov.f64 %r40,%r29; .loc 1 95 40 cvt.rn.f64.s32 %r145,%r64; mul.f64 %r146,%r145,%r67; .loc 1 95 59 fma.rn.f64 %r29,%r29,%r146,%r27; .loc 1 95 32 st.f64 [%r61],%r29; .loc 1 93 3 add.u32 %r64,%r64,-1; add.u64 %r61,%r61,%r142; setp.ne.u32 %r147,%r64,%r85; @ %r147 bra $L9; $L1: .loc 1 99_gfortran_bessel_yn_r8 .visible .func _gfortran_bessel_yn_r8.reg .f64 %r62; .reg .f64.reg .u6464pred %r109; .reg .f64predpred %r127; .reg .f64 %r128; mov.u64 %r74,%ar0; mov.u32 %r75,%ar1; mov.u32 %r76,%ar2; mov.f64 %r77,%ar3; .loc 1 119 6 ld.u64 %r78,[%r74]; setp.ne.u64 %r79,%r78,0; @ %r79 bra $L20; .loc 1 121 33 setp.lt.s32 %r80,%r76,%r75; @ %r80 bra $L30; .loc 1 121 37 sub.u32 %r81,%r76,%r75; .loc 1 121 40 add.u32 %r82,%r81,1; .loc 1 121 33 cvt.s64.s32 %r65,%r82; .loc 1 122 7 add.u64 %r70,%r65,-1; bra $L21; $L30: mov.u64 %r70,-1; .loc 1 121 33 mov.u64 %r65,%r78; $L21: .loc 1 122 7 mov.u64 %r83,0; st.u64 [%r74+48],%r83; st.u64 [%r74+56],%r70; mov.u64 %r84,1; st.u64 [%r74+40],%r84; .loc 1 123 24 mov.u64 6586; call (%value_in),_gfortrani_xmallocarray.loc 1 123 22 st.u64 [%r74],%r87; .loc 1 124 19 st.u64 [%r74+8],%r83; $L20: .loc 1 127 6 setp.ge.s32 %r90,%r76,%r75; @ ! %r90 bra $L19; .loc 1 130 7 cvta.global.u64 %r91,_gfortrani_compile_options; .loc 1 130 6 ld.u32 %r92,[%r91+36]; setp.eq.u32 %r93,%r92,0; @ %r93 bra $L23; .loc 1 131 10 ld.u64 %r95,[%r74+56]; add.u64 %r94,%r95,1; ld.u64 %r96,[%r74+48]; sub.u64 %r34,%r94,%r96; .loc 1 131 45 sub.u32 %r97,%r76,%r75; .loc 1 131 48 add.u32 %r98,%r97,1; cvt.s64.s32 %r99,%r98; .loc 1 131 7 setp.eq.u64 %r100,%r34,%r99; @ %r100 bra $L23; .loc 1 132 5 st.u64 [%stack+8],%r34; .loc 1 133 22 cvt.s64.s32 %r102,%r76; .loc 1 132 5 cvt.s64.s32 %r103,%r75; sub.u64 %r104,%r102,%r103; st.u64 [%stack],%r104; cvta.const.u64 %r101,$LC0; {_gfortran_runtime_error23: .loc 1 136 10 ld.u64 %r68,[%r74+40]; .loc 1 138 6 setp.neu.f64 %r106,%r77,0d0000000000000000; @ %r106 bra $L24; shl.b64 %r64,%r68,3; ld.u64 %r67,[%r74]; .loc 1 140 26 sub.u32 %r107,%r76,%r75; add.u32 %r50,%r107,1; .loc 1 140 14 mov.u32 %r42,0; .loc 1 142 34 mov.f64 %r108,0dfff0000000000000; $L25: st.f64 [%r67],%r108; .loc 1 140 32 add.u32 %r42,%r42,1; .loc 1 140 7 add.u64 %r67,%r67,%r64; setp.ne.u32 %r109,%r50,%r42; @ %r109 bra $L25; bra $L19; $L24: .loc 1 149 1175(%value_in),yn,(%out_arg1,%out_arg2); ld.param.f64 %r112,[%value_in]; } mov.f64 %r62,%r112; .loc 1 150 21 ld.u64 %r113,[%r74]; st.f64 [%r113],%r62; .loc 1 152 6 setp.eq.u32 %r114,%r76,%r75; @ %r114 bra $L19; .loc 1 155 11 add.u32 %r45,%r75,145(%value_in),yn,(%out_arg1,%out_arg2); ld.param.f64 %r117,[%value_in]; } mov.f64 %r63,%r117; .loc 1 156 17 shl.b64 %r48,%r68,3; ld.u64 %r118,[%r74]; add.u64 %r49,%r118,%r48; .loc 1 156 28 st.f64 [%r49],%r63; .loc 1 158 6 setp.eq.u32 %r119,%r45,%r76; @ %r119 bra $L19; .loc 1 161 9 mov.f64 %r120,0d4000000000000000; div.rn.f64 %r69,%r120,%r77; .loc 1 163 23 sub.u32 %r51,%r76,%r75; .loc 1 163 3 setp.eq.u32 %r121,%r51,1; @ %r121 bra $L19; add.u64 %r41,%r49,%r48; mov.u32 %r72,%r45; add.u32 %r25,%r51,%r75; .loc 1 168 29 mov.f64 %r128,0dfff0000000000000; $L29: .loc 1 166 10 setp.neu.f64 %r122,%r63,0dfff0000000000000; @ %r122 bra $L27; .loc 1 168 29 st.f64 [%r41],%r128; bra $L28; $L27: .loc 1 173 37 cvt.rn.f64.s32 %r124,%r72; mul.f64 %r125,%r124,%r69; .loc 1 173 56 neg.f64 
%r126,%r62; fma.rn.f64 %r58,%r125,%r63,%r126; .loc 1 173 29 st.f64 [%r41],%r58; mov.f64 %r62,%r63; .loc 1 175 10 mov.f64 %r63,%r58; $L28: .loc 1 163 3 add.u64 %r41,%r41,%r48; add.u32 %r72,%r72,1; setp.ne.u32 %r127,%r25,%r72; @ %r127 bra $L29; $L19: .loc 1 1 bessel_r10.o/ iall_i1.o/_gfortran_iall_i1 .visible .func _gfortran_iall_i1fortran/generated/iall_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_miall_i1 .visible .func _gfortran_miall_i1gfortran_siall_i1 .visible .func _gfortran_siall_i1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,76,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,76,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,105] = {73,65,76,76114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,76,76,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,73,65,76,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_iall_i1 .visible .func _gfortran_iall_i148predpred %r209; .reg .pred %r210; .reg .u64 %r211; .reg .u64 %r213; .reg .u64pred %r220; .reg .pred %r222; .reg .u64pred %r247; .reg .u64 %r248; .reg .u32 %r249; .reg .u64 %r251; .reg .u64 %r252; .reg .u64 %r253; .reg .u64 %r254; .reg .u64 %r255; .reg .u16 %r257; .reg .u16predpred %r288; .reg .u64 %r291; .reg .u64 %r292; .reg .predpred %r304; .reg .predpredpred %r332; .reg .u64 %r333; .reg .pred %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .pred %r337; .reg .pred %r338; .reg .u32 %r340; .reg .u64 %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64 %r345; .reg .u64 %r347; .reg .u64 %r348; .reg .u16 %r349; mov.u64 %r171,%ar0; mov.u64 %r172,%ar1; mov.u64 %r173,%ar2; .loc 1 55 10 ld.s8 %r22,[%r172+28]; .loc 1 55 38 add.u32 %r174,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r174; .loc 1 56 10 ld.u64 %r25,[%r173]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r176,%r104,63; set.u32.lt.s64 %r178,%r103,%r104; neg.s32 %r179,%r178; cvt.u16.u64 %r182,%r176; cvt.u16.u32 %r183,%r179; or.b16 %r181,%r182,%r183; .loc 1 58 6 cvt.u32.u16 %r184,%r181; cvt.u16.u8 %r185,%r184; setp.eq.u16 %r186,%r185,0; @ %r186 bra $L2; .loc 1 60 7 cvt.u32.u32 %r189,%r22; cvt.s64.s8 %r188,%r189; st.u64 [%stack+8],%r18818187_gfortran_runtime_errorr192,%r104,%r104; add.u64 %r193,%r192,%r104; shl.b64 %r194,%r193,3; add.u64 %r195,%r172,%r194; ld.u64 %r31,[%r195+56]; ld.u64 %r33,[%r195+48]; .loc 1 68 9 ld.u64 %r107,[%r195+40]; .loc 1 70 3 setp.ne.u64 
%r209,%r104,0; @ %r209 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r210,%r103,%r104; @ %r210 bra $L4; bra $L46; $L3: add.u64 %r168,%r172,40; add.u64 %r165,%frame,120; add.u64 %r48,%frame,240; add.u64 %r211,%r172,16; add.u64 %r213,%r25,%r25; add.u64 %r214,%r213,%r25; shl.b64 %r215,%r214,3; add.u64 %r139,%r211,%r215; .loc 1 76 12 mov.u64 %r348,0; $L8: .loc 1 72 18 ld.u64 %r216,[%r168]; st.u64 [%r165],%r216; .loc 1 73 19 ld.u64 %r218,[%r168+16]; add.u64 %r217,%r218,1; ld.u64 %r219,[%r168+8]; sub.u64 %r38,%r217,%r219; .loc 1 75 10 setp.lt.s64 %r220,%r38,0; @ %r220 bra $L6; .loc 1 73 17 st.u64 [%r48],%r38; bra $L7; $L6: .loc 1 76 12 st.u64 [%r48],%r348; $L7: .loc 1 70 3 add.u64 %r168,%r168,24; add.u64 %r165,%r165,8; add.u64 %r48,%r48,8; setp.ne.u64 %r222,%r139,%r168; @ %r222 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r223,[%r171]; setp.eq.u64 %r224,%r223,0; @ ! %r224 bra $L11; bra $L10; $L4: add.u64 %r226,%r25,%r25; add.u64 %r227,%r226,%r25; shl.b64 %r228,%r227,3; add.u64 %r229,%r228,40; add.u64 %r99,%r172,%r229; shl.b64 %r230,%r25,3; add.u64 %r91,%r230,-8; add.u64 %r335,%frame,120; add.u64 %r93,%r335,%r91; add.u64 %r336,%frame,240; add.u64 %r77,%r336,%r91; cvt.u32.u32 %r234,%r22; cvt.s64.s8 %r233,%r234; add.u64 %r236,%r233,%r233; add.u64 %r237,%r236,%r233172,40; add.u64 %r121,%r238,%r239; .loc 1 84 12 mov.u64 %r347,0; $L14: .loc 1 80 18 ld.u64 %r240,[%r99]; st.u64 [%r93],%r240; .loc 1 81 19 ld.u64 %r242,[%r99+16]; add.u64 %r241,%r242,1; ld.u64 %r243,[%r99+8]; sub.u64 %r44,%r241,%r243; .loc 1 83 10 setp.lt.s64 %r244,%r44,0; @ %r244 bra $L12; .loc 1 81 17 st.u64 [%r77],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r77],%r347; $L13: .loc 1 78 3 add.u64 %r99,%r99,24; add.u64 %r93,%r93,8; add.u64 %r77,%r77,8; setp.ne.u64 %r246,%r99,%r121; @ %r246 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r247,%r103,0; @ %r247 bra $L16; add.u64 %r336,%frame,240; $L35: add.u64 %r137,%r171,40; mov.u64 %r132,%r336; cvt.u32.u32 %r249,%r22; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r254,%r171,16; add.u64 %r100,%r253,%r254; .loc 1 94 10 mov.u64 %r95,1; .loc 1 98 4 mov.u64 %r277,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r255,0; st.u64 [%r171+8],%r255; .loc 1 103 28 cvt.u16.u32 %r258,%r22; add.u16 %r257,%r258,-1; cvt.u32.u16 %r259,%r257; st.u8 [%r171+28],%r259; .loc 1 105 20 add.u32 %r260,%r22,-2; cvt.s64.s32 %r54,%r260; add.u64 %r262,%r54,%r54; add.u64 %r263,%r262,%r54; shl.b64 %r264,%r263,3; add.u64 %r265,%r171,%r264; .loc 1 105 67 shl.b64 %r267,%r54,3; add.u64 %r268,%frame,%r267; .loc 1 105 59 ld.u64 %r270,[%r265+40]; ld.u64 %r271,[%r268+240]; mul.lo.u64 %r109,%r270,%r271; .loc 1 107 29 mov.u64 %r27273; call (%value_in),_gfortrani_xmallocarray274,[%value_in]; } .loc 1 107 27 st.u64 [%r171],%r274; .loc 1 108 10 setp.eq.u64 %r276,%r109,0; @ ! 
%r276 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r95,%r50,%r95; $L17: .loc 1 98 4 st.u64 [%r137+8],%r277; ld.u64 %r50,[%r132]; add.u64 %r278,%r50,-1; st.u64 [%r137+16],%r278; st.u64 [%r137],%r95; .loc 1 91 7 add.u64 %r137,%r137,24; add.u64 %r132,%r132,8; setp.ne.u64 %r279,%r100,%r137; @ %r279 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r171+48],%r109; mov.u64 %r281,-1; st.u64 [%r171+56],%r281; st.u64 [%r171+40],%r273; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r61,[%r171+28]; .loc 1 118 10 setp.eq.u64 %r283,%r61,%r103; @ %r283 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r61;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r286,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r287,[%r286+36]; setp.eq.u32 %r288,%r287,0; @ %r288 bra $L21; .loc 1 125 2 add.u64 %r336,%frame,240; cvta.const.u64 %r292,$LC2r3r29292; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r294,%r103,0; @ %r294 bra $L24; $L27: .loc 1 65 9 add.u64 %r295,%r31,1; .loc 1 65 7 sub.u64 %r105,%r295,%r33; max.s64 %r45,%r105,0; .loc 1 137 8 ld.u64 %r117,[%r172]; .loc 1 138 8 ld.u64 %r118,[%r171]; ld.u64 %r47,[%frame+360]; .loc 1 166 22 ld.u64 %r69,[%frame+120]; .loc 1 167 22 ld.u64 %r71,[%frame]; setp.gt.s64 %r337,%r105,0; setp.le.s64 %r338,%r103,1; cvt.u32.u32 %r340,%r22; cvt.s64.s8 %r341,%r340; add.u64 %r342,%r341,-1; add.u64 %r343,%frame,120; add.u64 %r344,%frame,240; .loc 1 173 13 mov.u64 %r345,0; bra $L25; $L24: add.u64 %r157,%frame,360; add.u64 %r156,%r171,40; mov.u64 %r154,%frame; cvt.u32.u32 %r297,%r22; cvt.s64.s8 %r296,%r297; add.u64 %r145,%r296,-1; .loc 1 129 3 mov.u64 %r119,0; add.u64 %r336,%frame,240; .loc 1 131 16 mov.u64 %r298,%r119; $L26: st.u64 [%r157],%r298; .loc 1 132 18 ld.u64 %r299,[%r156]; st.u64 [%r154],%r299; .loc 1 133 17 shl.b64 %r301,%r119,3; add.u64 %r302,%r336,%r301; .loc 1 133 10 ld.u64 %r303,[%r302]; setp.le.s64 %r304,%r303,0; @ %r304 bra $L1; .loc 1 129 26 add.u64 %r119,%r119,1; .loc 1 129 3 add.u64 %r157,%r157,8; add.u64 %r156,%r156,24; add.u64 %r154,%r154,8; setp.ne.u64 %r305,%r119,%r145; @ %r305 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r337 bra $L28; .loc 1 150 10 mov.u32 %r307,0; st.u8 [%r118],%r307; bra $L29; $L28: .loc 1 154 36 mov.u64 %r112,%r117; .loc 1 148 10 mov.u32 %r110,-1; .loc 1 154 13 mov.u64 %r111,0; $L30: .loc 1 158 10 ld.u8 %r349,[%r112]; mov.loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r107; .loc 1 154 6 setp.gt.s64 %r313,%r45,%r111; @ %r313 bra $L30; .loc 1 161 12 cvt.u32.u32 %r314,%r110; st.u8 [%r118],%r314; $L29: .loc 1 165 15 add.u64 %r47,%r47,1; .loc 1 166 12 add.u64 %r117,%r117,%r69; .loc 1 167 12 add.u64 %r118,%r118,%r71; .loc 1 169 32 ld.u64 %r126,[%frame+240]; .loc 1 169 13 setp.ne.u64 %r315,%r47,%r126; @ %r315 bra $L25; .loc 1 176 23 mul.lo.u64 %r316,%r69,%r47; .loc 1 176 9 sub.u64 %r114,%r117,%r316; .loc 1 177 23 mul.lo.u64 %r317,%r71,%r47; .loc 1 177 9 sub.u64 %r115,%r118,%r317; .loc 1 179 7 @ %r338 bra $L1; add.u64 %r143,%frame,368; mov.u64 %r102,8; .loc 1 178 5 mov.u64 %r116,1; bra $L33; $L34: .loc 1 173 13 st.u64 [%r143],%r345; .loc 1 176 23 mul.lo.u64 %r323,%r84,%r83; .loc 1 176 9 sub.u64 %r114,%r117,%r323; .loc 1 177 23 mul.lo.u64 %r324,%r86,%r83; .loc 1 177 9 sub.u64 %r115,%r118,%r324; .loc 1 178 5 add.u64 %r116,%r116,1; .loc 1 179 7 add.u64 %r143,%r143,8; add.u64 %r102,%r102,8; setp.eq.u64 %r325,%r116,%r342; @ %r325 bra $L1; $L33: .loc 1 187 16 ld.u64 %r326,[%r143]; add.u64 %r83,%r326,1; st.u64 [%r143],%r83; .loc 1 188 23 add.u64 %r328,%r343,%r102; ld.u64 
%r84,[%r328]; .loc 1 188 13 add.u64 %r117,%r114,%r84; .loc 1 189 23 add.u64 %r329,%frame,%r102; ld.u64 %r86,[%r329]; .loc 1 189 13 add.u64 %r118,%r115,%r86; .loc 1 169 32 add.u64 %r331,%r344,%r102; ld.u64 %r88,[%r331]; .loc 1 169 13 setp.eq.u64 %r332,%r83,%r88; @ %r332 bra $L34; .loc 1 173 13 mov.u64 %r47,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r333,[%r171]; setp.eq.u64 %r334,%r333,0; @ ! %r334 bra $L11; bra $L35; $L1_gfortran_miall_i1 .visible .func _gfortran_miall_608u32 %r215; .reg .u32 %r216; .reg .u16u16 %r255; .reg .u32 %r256; .reg .u16 %r257; .reg .pred %r258; .reg .predpred %r274; .reg .u64 %r275; .reg .u64 %r277; .reg .u64 %r279; .reg .u64 %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64pred %r297; .reg .u64 %r298; .reg .predu64 %r305; .reg .u32 %r306; .reg .u64u64 %r313; .reg .u64pred %r323; .reg .pred %r324; .reg .u64 %r325; .reg .u32 %r326; .reg .u64u64 %r331; .reg .u32 %r332; .reg .u64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u64u16 %r347; .reg .u32 %r348; .reg .pred %r349; .reg .u64 %r350; .reg .u64 %r351; .reg .predu16 %r388; .reg .u32 %r389; .reg .predu64 %r436; .reg .u16 %r437; mov.u64 %r203,%ar0; mov.u64 %r204,%ar1; mov.u64 %r205,%ar2; mov.u64 %r206,%ar3; .loc 1 223 6 setp.ne.u64 %r207,%r206,0; @ %r207 bra $L490r204205; call _gfortran_iall_i1,(%out_arg1,%out_arg2,%out_arg3); } .loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r205]; .loc 1 233 7 add.u64 %r132,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r204+28]; .loc 1 234 38 add.u32 %r211,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r133,%r211; .loc 1 237 7 shr.u64 %r213,%r132,63; set.u32.gt.s64 %r215,%r132,%r133; neg.s32 %r216,%r215; cvt.u16.u64 %r219,%r213; cvt.u16.u32 %r220,%r216; or.b16 %r218,%r219,%r220; .loc 1 237 6 cvt.u32.u16 %r221,%r218; cvt.u16.u8 %r222,%r221; setp.eq.u16 %r223,%r222,0; @ %r223 bra $L51; .loc 1 239 7 cvt.u32.u32 %r226,%r23; cvt.s64.s8 %r225,%r226; st.u64 [%stack+8],%r225; st.u64 [%stack],%r22; cvta.const.u64 %r2242stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r425,%r132,%r132; add.u64 %r424,%r425,%r132; shl.b64 %r231,%r424,3; add.u64 %r232,%r204,%r231; ld.u64 %r235,[%r232+56]; add.u64 %r234,%r235,1; .loc 1 244 7 ld.u64 %r242,[%r232+48]; sub.u64 %r134,%r234,%r242; .loc 1 245 6 setp.le.s64 %r243,%r134,0; @ %r243 bra $L48; .loc 1 248 9 ld.u64 %r146,[%r206]; .loc 1 250 15 ld.u64 %r34,[%r206+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r244,%r35,-4; and.b32 %r245,%r244,-5; set.u32.eq.u32 %r247,%r245,0; neg.s32 %r248,%r247; .loc 1 252 22 add.u32 %r249,%r35,-1; set.u32.le.u32 %r251,%r249,1; neg.s32 %r252,%r251; .loc 1 252 6 cvt.u16.u32 %r254,%r248; cvt.u16.u32 %r255,%r252; or.b16 %r253,%r254,%r255; cvt.u32.u16 %r256,%r253; cvt.u16.u8 %r257,%r256; setp.ne.u16 %r258,%r257,0; @ %r258 bra $L53; .loc 1 254 7 setp.ne.u32 %r260,%r35,16; @ %r260 bra $L54; $L53: .loc 1 261 9 shl.b64 %r264,%r424,3; add.u64 %r265,%r204,%r264; ld.u64 %r137,[%r265+40]; .loc 1 262 12 add.u64 %r271,%r206,%r264; ld.u64 %r42,[%r271+40]; .loc 1 264 3 setp.ne.u64 %r273,%r132,0; @ %r273 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r274,%r132,%r133; @ %r274 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r275275_gfortran_runtime_error5: add.u64 %r187,%r204,40; add.u64 %r189,%r206,40; add.u64 %r277,%r204,16; add.u64 %r279,%r22,%r22; add.u64 %r280,%r279,%r22; shl.b64 %r281,%r280,3; add.u64 %r200,%r277,%r281; .loc 1 264 3 mov.u64 %r191,0; add.u64 %r422,%frame,240; add.u64 %r421,%frame,360; .loc 1 271 12 mov.u64 %r436,%r191; $L60: .loc 1 266 18 add.u64 %r283,%r422,%r191; ld.u64 
%r284,[%r187]; st.u64 [%r283],%r284; .loc 1 267 18 add.u64 %r285,%frame,%r191; .loc 1 267 20 ld.u64 %r287,[%r189]; mul.lo.u64 %r286,%r287,%r34; .loc 1 267 18 st.u64 [%r285],%r286; .loc 1 268 19 ld.u64 %r289,[%r187+16]; add.u64 %r288,%r289,1; ld.u64 %r290,[%r187+8]; sub.u64 %r53,%r288,%r290; .loc 1 270 10 setp.lt.s64 %r291,%r53,0; @ %r291 bra $L58; .loc 1 268 17 add.u64 %r293,%r421,%r191; st.u64 [%r293],%r53; bra $L59; $L58: .loc 1 271 12 add.u64 %r295,%r421,%r191; st.u64 [%r295],%r436; $L59: .loc 1 264 3 add.u64 %r187,%r187,24; add.u64 %r189,%r189,24; add.u64 %r191,%r191,8; setp.ne.u64 %r297,%r187,%r200; @ %r297 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r298,[%r203]; setp.eq.u64 %r299,%r298,0; @ ! %r299 bra $L63; bra $L62; $L56: add.u64 %r301,%r22,%r22; add.u64 %r302,%r301,%r22; shl.b64 %r303,%r302,3; add.u64 %r152,%r303,40; add.u64 %r58,%r204,%r152; add.u64 %r154,%r206,%r152; shl.b64 %r304,%r22,3; add.u64 %r100,%r304,-8; cvt.u32.u32 %r306,%r23; cvt.s64.s8 %r305,%r306; shl.b64 %r307,%r305,3; add.u64 %r184,%r307,-8; add.u64 %r422,%frame,240; add.u64 %r421,%frame,360; .loc 1 281 12 mov.u64 %r435,0; $L66: .loc 1 276 18 add.u64 %r309,%r422,%r100; ld.u64 %r310,[%r58]; st.u64 [%r309],%r310; .loc 1 277 18 add.u64 %r311,%frame,%r100; .loc 1 277 20 ld.u64 %r313,[%r154]; mul.lo.u64 %r312,%r313,%r34; .loc 1 277 18 st.u64 [%r311],%r312; .loc 1 278 19 ld.u64 %r315,[%r58+16]; add.u64 %r314,%r315,1; ld.u64 %r316,[%r58+8]; sub.u64 %r64,%r314,%r316; .loc 1 280 10 setp.lt.s64 %r317,%r64,0; @ %r317 bra $L64; .loc 1 278 17 add.u64 %r319,%r421,%r100; st.u64 [%r319],%r64; bra $L65; $L64: .loc 1 281 12 add.u64 %r321,%r421,%r100; st.u64 [%r321],%r435; $L65: .loc 1 274 3 add.u64 %r58,%r58,24; add.u64 %r154,%r154,24; add.u64 %r100,%r100,8; setp.ne.u64 %r323,%r100,%r184; @ %r323 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r324,%r133,0; @ %r324 bra $L68; add.u64 %r421,%frame,360; $L86: add.u64 %r124,%r203,40; mov.u64 %r118,%r421; cvt.u32.u32 %r326,%r23; cvt.s64.s8 %r325,%r326; add.u64 %r328,%r325,%r325; add.u64 %r329,%r328,%r325; shl.b64 %r330,%r329,3; add.u64 %r331,%r203,16; add.u64 %r72,%r330,%r331; .loc 1 291 10 mov.u64 %r120,1; .loc 1 295 4 mov.u64 %r350,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r332,%r23,-2; cvt.s64.s32 %r70,%r332; add.u64 %r334,%r70,%r70; add.u64 %r335,%r334,%r70; shl.b64 %r336,%r335,3; add.u64 %r337,%r203,%r336; .loc 1 299 67 shl.b64 %r339,%r70,3; add.u64 %r340,%frame,%r339; .loc 1 299 59 ld.u64 %r342,[%r337+40]; ld.u64 %r343,[%r340+360]; mul.lo.u64 %r140,%r342,%r343; .loc 1 301 24 mov.u64 %r344,0; st.u64 [%r203+8],%r344; .loc 1 302 28 cvt.u16.u32 %r347,%r23; add.u16 %r346,%r347,-1; cvt.u32.u16 %r348,%r346; st.u8 [%r203+28],%r348; .loc 1 304 10 setp.eq.u64 %r349,%r140,0; @ %r349 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r120,%r67,%r120; $L69: .loc 1 295 4 st.u64 [%r124+8],%r350; ld.u64 %r67,[%r118]; add.u64 %r351,%r67,-1; st.u64 [%r124+16],%r351; st.u64 [%r124],%r120; .loc 1 288 7 add.u64 %r124,%r124,24; add.u64 %r118,%r118,8; setp.ne.u64 %r352,%r72,%r124; @ %r352 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r203+48],%r140; mov.u64 %r354,-1; st.u64 [%r203+56],%r354; mov.u64 %r355,1; st.u64 [%r203+40],%r355; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r35714357; call (%value_in),_gfortrani_xmallocarray58,[%value_in]; } .loc 1 311 22 st.u64 [%r203],%r358; bra $L73; $L63: .loc 1 316 19 ld.s8 %r360,[%r203+28]; .loc 1 316 10 setp.eq.u64 %r361,%r360,%r133; @ %r361 bra $L74; .loc 1 317 2 cvta.const.u64 %r362362_gfortran_runtime_error74: .loc 1 319 11 
cvta.global.u64 %r364,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r365,[%r364+36]; setp.eq.u32 %r366,%r365,0; @ %r366 bra $L73; .loc 1 321 4 add.u64 %r421,%frame,360; cvta.const.u64 %r370,$LC2369370; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r374,$LC6; {374370; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r376,%r133,0; @ %r376 bra $L75; $L78: .loc 1 336 8 ld.u64 %r147,[%r203]; .loc 1 337 8 ld.u64 %r145,[%r204]; .loc 1 339 9 setp.ne.u64 %r377,%r145,0; @ %r377 bra $L76; bra $L48; $L75: add.u64 %r170,%frame,480; add.u64 %r167,%r203,40; add.u64 %r160,%frame,120; cvt.u32.u32 %r379,%r23; cvt.s64.s8 %r378,%r379; add.u64 %r125,%r378,-1; .loc 1 328 3 mov.u64 %r151,0; add.u64 %r421,%frame,360; .loc 1 330 16 mov.u64 %r380,%r151; $L77: st.u64 [%r170],%r380; .loc 1 331 18 ld.u64 %r381,[%r167]; st.u64 [%r160],%r381; .loc 1 332 17 shl.b64 %r383,%r151,3; add.u64 %r384,%r421,%r383; .loc 1 332 10 ld.u64 %r385,[%r384]; setp.le.s64 %r386,%r385,0; @ %r386 bra $L48; .loc 1 328 26 add.u64 %r151,%r151,1; .loc 1 328 3 add.u64 %r170,%r170,8; add.u64 %r167,%r167,24; add.u64 %r160,%r160,8; setp.ne.u64 %r387,%r125,%r151; @ %r387 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r44,%r42,%r34; ld.u64 %r173,[%frame+480]; .loc 1 359 22 ld.u64 %r86,[%frame+240]; .loc 1 360 23 ld.u64 %r88,[%frame]; .loc 1 361 22 ld.u64 %r90,[%frame+120]; setp.le.s64 %r426,%r133,1; .loc 1 348 10 mov.u32 %r428,-1; cvt.u32.u32 %r429,%r23; cvt.s64.s8 %r430,%r429; add.u64 %r431,%r430,-1; add.u64 %r432,%frame,240; add.u64 %r433,%frame,360; add.u64 %r434,%frame,120; $L87: .loc 1 328 3 mov.u64 %r150,%r146; mov.u64 %r149,%r145; .loc 1 348 10 mov.u32 %r122,%r428; .loc 1 349 9 mov.u64 %r148,0; $L80: .loc 1 352 6 ld.s8 %r389,[%r150]; cvt.u16.u32 %r388,%r389; setp.eq.u16 %r390,%r388,0; @ %r390 bra $L79; .loc 1 353 12 ld.u8 %r437,[%r149]; mov$L79: .loc 1 349 24 add.u64 %r148,%r148,1; .loc 1 349 32 add.u64 %r149,%r149,%r137; .loc 1 349 47 add.u64 %r150,%r150,%r44; .loc 1 349 2 setp.ne.u64 %r396,%r134,%r148; @ %r396 bra $L80; .loc 1 355 8 cvt.u32.u32 %r397,%r122; st.u8 [%r147],%r397; .loc 1 358 15 add.u64 %r173,%r173,1; .loc 1 359 12 add.u64 %r145,%r145,%r86; .loc 1 360 13 add.u64 %r146,%r146,%r88; .loc 1 361 12 add.u64 %r147,%r147,%r90; .loc 1 363 32 ld.u64 %r80,[%frame+360]; .loc 1 363 13 setp.ne.u64 %r398,%r80,%r173; @ %r398 bra $L87; .loc 1 370 23 mul.lo.u64 %r94,%r173,%r86; .loc 1 371 24 mul.lo.u64 %r399,%r173,%r88; .loc 1 371 10 sub.u64 %r142,%r146,%r399; .loc 1 372 23 mul.lo.u64 %r400,%r173,%r90; .loc 1 372 9 sub.u64 %r143,%r147,%r400; .loc 1 374 7 @ ! 
%r426 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r40],%r427; .loc 1 370 23 mul.lo.u64 %r94,%r106,%r105; .loc 1 371 24 mul.lo.u64 %r403,%r108,%r105; .loc 1 371 10 sub.u64 %r142,%r146,%r403; .loc 1 372 23 mul.lo.u64 %r404,%r110,%r105; .loc 1 372 9 sub.u64 %r143,%r147,%r404; .loc 1 373 5 add.u64 %r144,%r144,1; .loc 1 374 7 add.u64 %r40,%r40,8; add.u64 %r164,%r164,8; setp.ne.u64 %r405,%r144,%r431; @ %r405 bra $L84; bra $L48; $L108: add.u64 %r40,%frame,488; mov.u64 %r164,8; .loc 1 373 5 mov.u64 %r144,1; .loc 1 367 13 mov.u64 %r427,0; $L84: .loc 1 382 16 ld.u64 %r409,[%r40]; add.u64 %r105,%r409,1; st.u64 [%r40],%r105; .loc 1 383 23 add.u64 %r411,%r432,%r164; ld.u64 %r106,[%r411]; .loc 1 383 13 sub.u64 %r412,%r106,%r94; add.u64 %r145,%r145,%r412; .loc 1 384 24 add.u64 %r413,%frame,%r164; ld.u64 %r108,[%r413]; .loc 1 384 14 add.u64 %r146,%r142,%r108; .loc 1 385 23 add.u64 %r415,%r434,%r164; ld.u64 %r110,[%r415]; .loc 1 385 13 add.u64 %r147,%r143,%r110; .loc 1 363 32 add.u64 %r417,%r433,%r164; ld.u64 %r113,[%r417]; .loc 1 363 13 setp.eq.u64 %r418,%r105,%r113; @ %r418 bra $L85; .loc 1 367 13 mov.u64 %r173,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r419,[%r203]; setp.eq.u64 %r420,%r419,0; @ ! %r420 bra $L63; bra $L86; $L48: .loc 1 389_gfortran_siall_i1 .visible .func _gfortran_siall_88pred %r184; .reg .predpred %r206; .reg .predu32 %r211; .reg .u64 %r213; .reg .u64u32 %r222; .reg .u64pred %r264; .reg .u64 %r266; .reg .u32 %r270; .reg .u64 %r272; .reg .u64 %r274; .reg .predpredu64 %r297; mov.u64 %r148,%ar0; mov.u64 %r149,%ar1; mov.u64 %r150,%ar2; mov.u64 %r151,%ar3; .loc 1 412 6 setp.eq.u64 %r152,%r151,0; @ %r152 bra $L110; .loc 1 412 20 ld.u32 %r153,[%r151]; setp.eq.u32 %r154,%r153,0; @ %r154 bra $L111; $L110: .loc 1 417 7gfortran_ial.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r150]; .loc 1 422 7 add.u64 %r86,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r149+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r291,%r26; .loc 1 425 7 shr.u64 %r159,%r86,63; set.u32.gt.s64 %r161,%r86,%r291; neg.s32 %r162,%r161; cvt.u16.u64 %r165,%r159; cvt.u16.u32 %r166,%r162; or.b16 %r164,%r165,%r166; .loc 1 425 6 cvt.u32.u16 %r167,%r164; cvt.u16.u8 %r168,%r167; setp.ne.u16 %r169,%r168,0; @ %r169 bra $L113; .loc 1 432 3 setp.ne.u64 %r170,%r86,0; @ %r170 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r171,%r86,%r291; @ %r171 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r174,%r24; cvt.s64.s8 %r173,%r174; st.u64 [%stack+8],%r173172172_gfortran_runtime_error114: add.u64 %r135,%r149,48; add.u64 %r137,%frame,120; add.u64 %r138,%r149,56; add.u64 %r176,%r149,24; add.u64 %r178,%r23,%r23; add.u64 %r179,%r178,%r23; shl.b64 %r180,%r179,3; add.u64 %r144,%r176,%r180; .loc 1 437 12 mov.u64 %r297,0; $L119: .loc 1 434 19 ld.u64 %r182,[%r138]; add.u64 %r181,%r182,1; ld.u64 %r183,[%r135]; sub.u64 %r36,%r181,%r183; .loc 1 436 10 setp.le.s64 %r184,%r36,0; @ %r184 bra $L117; .loc 1 434 17 st.u64 [%r137],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r137],%r297; $L118: .loc 1 432 3 add.u64 %r135,%r135,24; add.u64 %r137,%r137,8; add.u64 %r138,%r138,24; setp.ne.u64 %r186,%r135,%r144; @ %r186 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r96,[%r148]; .loc 1 449 6 setp.eq.u64 %r187,%r96,0; @ ! 
[damaged PTX: tail of a _gfortran_siall_* body from the preceding object file, covering the per-dimension bounds-check loop with its _gfortran_runtime_error calls, the _gfortrani_xmallocarray allocation path, and the final result-fill and stride-advance loops (source lines 440 through 534).]
[iall_i2.o, compiled from "fortran/generated/iall_i2.c": declares .visible .func _gfortran_iall_i2, _gfortran_miall_i2 and _gfortran_siall_i2, plus the externs _gfortrani_compile_options (.global .align 8 .u64), _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents. The embedded .const strings are the runtime error messages "... incorrect in IALL intrinsic: is %ld, should be between 1 and %l...", "rank of return array incorrect in IALL intrinsic: is %ld, should be %l...", "Incorrect extent in return value of IALL intrinsic in dimension %ld: is %ld, should be %...", and the fragments "IALL", "return value", "Funny sized logical array" and "MASK argument". The register declarations and descriptor setup at the top of _gfortran_iall_i2 open here.]
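The three entry points per element kind follow libgfortran's usual pattern: _gfortran_iall_iN is the unmasked reduction, _gfortran_miall_iN takes a logical MASK array, and _gfortran_siall_iN takes a scalar MASK. The reduction itself is a bitwise AND along one dimension, visible in the PTX as an and.b16/and.b32 loop seeded with -1. A minimal sketch of that inner loop in C, using flat pointer/stride arguments instead of the library's array descriptors (the function name and signature below are illustrative, not libgfortran's API):

#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Illustrative only: the real _gfortran_iall_i2 receives gfc array
   descriptors; this sketch flattens a single reduced dimension.  */
static int16_t iall_slice_i2 (const int16_t *src, ptrdiff_t stride, ptrdiff_t len)
{
  int16_t result = (int16_t) -1;        /* AND identity: all bits set (the mov -1 in the PTX) */
  for (ptrdiff_t i = 0; i < len; i++)
    result &= src[i * stride];          /* the and.b16 in the PTX body */
  return result;
}

int main (void)
{
  int16_t v[3] = { 0x7fff, 0x70ff, 0x7f0f };
  printf ("%#x\n", (unsigned) (uint16_t) iall_slice_i2 (v, 1, 3));  /* prints 0x700f */
  return 0;
}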
[damaged PTX bodies of _gfortran_iall_i2, _gfortran_miall_i2 and _gfortran_siall_i2: extent/stride setup from the array descriptors, a rank check that calls _gfortran_runtime_error, allocation of the return array through _gfortrani_xmallocarray, optional bounds checks gated on _gfortrani_compile_options (calling _gfortrani_bounds_ifunction_return, and _gfortrani_bounds_equal_extents in the masked variant), and the reduction loops that combine 16-bit elements with and.b16 seeded from -1. _gfortran_miall_i2 additionally loads the logical MASK with ld.s8 and skips masked-off elements; _gfortran_siall_i2 tests its scalar MASK and either calls _gfortran_iall_i2 or fills the result array.]
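For the masked variant the PTX reads each MASK element (ld.s8) and branches around the and.b16 when it is zero. A hedged sketch of that inner loop in C, under the same flat-argument simplification (names are illustrative, not the descriptor ABI):

#include <stddef.h>
#include <stdint.h>

/* Illustrative only: skip elements whose MASK entry is false; in the PTX
   this is the ld.s8 of the mask followed by a branch around the and.  */
static int16_t miall_slice_i2 (const int16_t *src, ptrdiff_t sstride,
                               const signed char *mask, ptrdiff_t mstride,
                               ptrdiff_t len)
{
  int16_t result = (int16_t) -1;        /* AND identity */
  for (ptrdiff_t i = 0; i < len; i++)
    if (mask[i * mstride])              /* only reduce where MASK is true */
      result &= src[i * sstride];
  return result;
}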
[end of _gfortran_siall_i2, then iall_i4.o, compiled from "fortran/generated/iall_i4.c": the same extern declarations and IALL error strings, and the damaged PTX bodies of _gfortran_iall_i4, _gfortran_miall_i4 and _gfortran_siall_i4. They mirror the i2 routines but work on 32-bit elements: ld.u32/st.u32 accesses, and.b32 for the reduction, and strides scaled by 4 (shl.b64 ...,2) instead of 2.]
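The scalar-mask entry point is mostly a dispatcher: the PTX first tests the MASK pointer and its value, calls the unmasked _gfortran_iall_* when the mask is absent or true, and otherwise only fills the return array (with zero, per the store in the dump) without reading the source. A simplified C sketch of that control flow, with a hypothetical flat signature:

#include <stddef.h>
#include <stdint.h>

/* Hypothetical flat-signature sketch of the scalar-MASK dispatch seen in
   the _gfortran_siall_i4 PTX; the real routine receives array descriptors.  */
static void siall_sketch_i4 (int32_t *dest, const int32_t *src,
                             ptrdiff_t stride, ptrdiff_t len, const int *mask)
{
  if (mask == NULL || *mask)
    {
      int32_t result = -1;              /* AND identity */
      for (ptrdiff_t i = 0; i < len; i++)
        result &= src[i * stride];      /* full unmasked reduction */
      *dest = result;
    }
  else
    *dest = 0;                          /* the PTX stores 0 when MASK is .FALSE. */
}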
iall_i8.o  (compiled from fortran/generated/iall_i8.c)
// BEGIN GLOBAL FUNCTION DECL: _gfortran_iall_i8   .visible .func _gfortran_iall_i8
// BEGIN GLOBAL FUNCTION DECL: _gfortran_miall_i8  .visible .func _gfortran_miall_i8
// BEGIN GLOBAL FUNCTION DECL: _gfortran_siall_i8  .visible .func _gfortran_siall_i8
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options  .extern .global .align 8 .u64
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error  .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarray  .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return  .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents  .extern .func
Embedded string constants (decoded from the byte arrays):
  "... incorrect in IALL intrinsic: is %ld, should be between 1 and %ld"
  "rank of return array incorrect in IALL intrinsic: is %ld, should be %ld"
  "Incorrect extent in return value of IALL intrinsic in dimension %ld: is %ld, should be %ld"
  fragments: "IALL", "return value", "Funny sized logical array", "MASK argument"
[PTX bodies of _gfortran_iall_i8 (3 arguments), _gfortran_miall_i8 (4 arguments, array MASK) and _gfortran_siall_i8 (4 arguments, scalar MASK) follow. Each validates DIM and the result rank via _gfortran_runtime_error, walks the 24-byte stride/lbound/ubound triplets of the descriptors to compute extents, calls _gfortrani_xmallocarray when the result is unallocated, performs the optional bounds checks against _gfortrani_compile_options, and reduces 8-byte elements with and.b64 loops. The bodies run to $L109 at .loc 1 534; the instruction stream is extraction-damaged in places.]
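The inner loop of the plain _gfortran_iall_i8 reduction summarized above starts its accumulator at -1 (all bits set) and folds one element per iteration with and.b64, stepping by the byte stride of the reduced dimension; a zero-length extent is handled separately by storing 0. Below is a minimal C sketch of one such line reduction, under the simplifying assumption that the stride is expressed in elements rather than bytes.

#include <stddef.h>
#include <stdint.h>

/* Reduce `len` 8-byte elements spaced `step` elements apart with
   bitwise AND, the operation behind the IALL intrinsic. */
static int64_t
iall_reduce_one_line (const int64_t *src, ptrdiff_t step, ptrdiff_t len)
{
    if (len <= 0)
        return 0;            /* empty extent: the generated code stores 0 */

    int64_t result = -1;     /* all bits set, like "mov.u64 ...,-1" */
    for (ptrdiff_t n = 0; n < len; n++)
    {
        result &= *src;      /* the and.b64 fold */
        src += step;         /* the PTX advances by a byte stride instead */
    }
    return result;
}

The full function runs this loop once per remaining index combination, advancing the source and destination pointers by their respective strides between lines.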
iall_i16.o  (compiled from fortran/generated/iall_i16.c)
// BEGIN GLOBAL FUNCTION DECL: _gfortran_iall_i16, _gfortran_miall_i16, _gfortran_siall_i16  .visible .func
// extern decls as above: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents
Embedded string constants: the same IALL messages as in iall_i8.o.
[PTX bodies of the three entry points follow. They are structurally identical to the _i8 variants but operate on 16-byte integers: element strides are scaled with shl.b64 ...,4 instead of ...,3, and the accumulator is a pair of 64-bit registers, so each element is folded with two and.b64 instructions. The bodies run to $L109 at .loc 1 534; the instruction stream is extraction-damaged in places.]
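The masked entry points (the miall PTX above) test one logical(kind=1) mask byte per element and only fold elements whose mask is non-zero; for the 16-byte integer kind each accepted element costs two 64-bit AND operations because the accumulator lives in a register pair. A minimal sketch of that masked inner loop, using GCC's unsigned __int128 as a stand-in for the 128-bit element type and, as above, assuming element rather than byte strides:

#include <stddef.h>
#include <stdint.h>

/* Masked IALL over one line: mask bytes of zero skip the element,
   anything else ANDs it into the accumulator. */
static unsigned __int128
miall_reduce_one_line (const unsigned __int128 *src, ptrdiff_t src_step,
                       const int8_t *mask, ptrdiff_t mask_step,
                       ptrdiff_t len)
{
    /* caller guarantees len > 0: the generated code returns early
       when the reduced extent is empty */
    unsigned __int128 result = ~(unsigned __int128) 0;  /* the -1 register pair */
    for (ptrdiff_t n = 0; n < len; n++)
    {
        if (*mask != 0)          /* ld.s8 of the mask byte, skip when zero */
            result &= *src;      /* two and.b64 on the 64-bit halves */
        src  += src_step;
        mask += mask_step;
    }
    return result;
}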
[iany_i1.o — nvptx PTX generated from libgfortran/generated/iany_i1.c. Declares the global functions _gfortran_iany_i1, _gfortran_miany_i1 and _gfortran_siany_i1, plus the extern symbols _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents. The embedded byte-array constants decode to the usual IANY runtime-error strings ("… incorrect in IANY intrinsic: is %ld, should be between 1 and %l…", "…ank of return array incorrect in IANY intrinsic: is %ld, should be %l…", "IANY", "return valu…", "Funny sized logical array", "MASK argume…", "Incorrect extent in return value of IANY intrinsic in dimension %ld: is %ld, should be %…"). The PTX bodies of _gfortran_iany_i1 (unmasked), _gfortran_miany_i1 (array MASK) and _gfortran_siany_i1 (scalar MASK) compute the descriptor strides and extents, allocate the result via _gfortrani_xmallocarray and run the bounds checks where needed, then perform the dimension-wise bitwise-OR reduction in the inner loops (or.b16 on the 8-bit elements).]
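For readability, here is a minimal C sketch of the reduction pattern those inner loops implement (the or.b16 accumulation around .loc 1 158 in _gfortran_iany_i1). The function name, the flat column-major layout and the two-dimension restriction are illustrative assumptions; the real libgfortran code works on full array descriptors with arbitrary rank and strides.

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative only: OR-reduce a column-major len x n block of 8-bit
   integers along its first dimension, the operation IANY(array, dim=1)
   performs and the inner reduction loops above implement. */
static void iany_reduce_i1(int8_t *restrict dest, const int8_t *restrict src,
                           size_t len, size_t n)
{
    for (size_t j = 0; j < n; j++) {
        int8_t result = 0;                  /* identity element of bitwise OR */
        for (size_t i = 0; i < len; i++)
            result |= src[j * len + i];     /* accumulate, as or.b16 does above */
        dest[j] = result;
    }
}
```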
[End of _gfortran_siany_i1, followed by iany_i2.o — nvptx PTX generated from libgfortran/generated/iany_i2.c, with the same extern declarations and IANY error-string constants. The bodies of _gfortran_iany_i2, _gfortran_miany_i2 and _gfortran_siany_i2 are structurally identical to the _i1 variants; only the element size changes (16-bit loads and stores, pointer strides scaled by 2).]
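The _gfortran_miany_* bodies differ from the unmasked ones only in the inner loop: an element is OR-ed in only where the corresponding logical MASK element is nonzero (the ld.s8 / setp.eq.u16 test that branches around the or near .loc 1 352–353). Below is a hedged sketch of that loop, again with illustrative flat indexing and element-counted strides rather than libgfortran's descriptor machinery.

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative only: masked OR-reduction of one run of 16-bit elements;
   an element contributes only where its mask entry is true. */
static int16_t miany_reduce_i2(const int16_t *restrict src,
                               const int8_t *restrict mask,
                               size_t len, size_t src_stride, size_t mask_stride)
{
    int16_t result = 0;
    for (size_t i = 0; i < len; i++) {
        if (mask[i * mask_stride])          /* skip elements where MASK is false */
            result |= src[i * src_stride];  /* bitwise-OR accumulation */
    }
    return result;
}
```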
[End of _gfortran_siany_i2, followed by iany_i4.o — nvptx PTX generated from libgfortran/generated/iany_i4.c, with the same declarations and IANY error-string constants and the opening of the _gfortran_iany_i4 body (32-bit elements, or.b32 in the reduction loop, strides scaled by 4).]
add.u64 %r115,%r115,1; .loc 1 179 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r332,%r115,%r351; @ %r332 bra $L1; $L32: .loc 1 187 16 ld.u64 %r333,[%r122]; add.u64 %r84,%r333,1; st.u64 [%r122],%r84; .loc 1 188 23 add.u64 %r335,%r352,%r140; ld.u64 %r85,[%r335]; .loc 1 188 13 shl.b64 %r336,%r85,2; add.u64 %r116,%r113,%r336; .loc 1 189 23 add.u64 %r337,%frame,%r140; ld.u64 %r88,[%r337]; .loc 1 189 13 shl.b64 %r338,%r88,2; add.u64 %r117,%r114,%r338; .loc 1 169 32 add.u64 %r340,%r353,%r140; ld.u64 %r92,[%r340]; .loc 1 169 13 setp.eq.u64 %r341,%r84,%r92; @ %r341 bra $L33; .loc 1 173 13 mov.u64 %r138,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r342,[%r179]; setp.eq.u64 %r343,%r342,0; @ ! %r343 bra $L11; bra $L34; $L1_gfortran_miany_i4 .visible .func _gfortran_miany_608u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .predpred %r399; .reg .u32 %r400; .reg .pred %r401; .reg .predmov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 223 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $L49gfortran_iany_.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r211]; .loc 1 233 7 add.u64 %r137,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r210+28]; .loc 1 234 38 add.u32 %r217,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r138,%r217; .loc 1 237 7 shr.u64 %r219,%r137,63; set.u32.gt.s64 %r221,%r137,%r138; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 237 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L51; .loc 1 239 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r429,%r137,%r137; add.u64 %r430,%r429,%r137; shl.b64 %r237,%r430,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 244 7 ld.u64 %r248,[%r238+48]; sub.u64 %r139,%r240,%r248; .loc 1 245 6 setp.le.s64 %r249,%r139,0; @ %r249 bra $L48; .loc 1 248 9 ld.u64 %r152,[%r212]; .loc 1 250 15 ld.u64 %r34,[%r212+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 252 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 252 6 cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L53; .loc 1 254 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L54; $L53: .loc 1 261 9 shl.b64 %r270,%r430,3; add.u64 %r271,%r210,%r270; ld.u64 %r142,[%r271+40]; .loc 1 262 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 264 3 setp.ne.u64 %r279,%r137,0; @ %r279 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r280,%r137,%r138; @ %r280 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r281281_gfortran_runtime_error5: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; add.u64 %r285,%r22,%r22; add.u64 %r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 264 3 mov.u64 %r92,0; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 271 12 mov.u64 %r441,%r92; $L60: .loc 1 266 18 add.u64 %r289,%r427,%r92; ld.u64 
%r290,[%r40]; st.u64 [%r289],%r290; .loc 1 267 18 add.u64 %r291,%frame,%r92; .loc 1 267 20 ld.u64 %r293,[%r179]; mul.lo.u64 %r292,%r293,%r34; .loc 1 267 18 st.u64 [%r291],%r292; .loc 1 268 19 ld.u64 %r295,[%r40+16]; add.u64 %r294,%r295,1; ld.u64 %r296,[%r40+8]; sub.u64 %r54,%r294,%r296; .loc 1 270 10 setp.lt.s64 %r297,%r54,0; @ %r297 bra $L58; .loc 1 268 17 add.u64 %r299,%r426,%r92; st.u64 [%r299],%r54; bra $L59; $L58: .loc 1 271 12 add.u64 %r301,%r426,%r92; st.u64 [%r301],%r441; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r179,%r179,24; add.u64 %r92,%r92,8; setp.ne.u64 %r303,%r40,%r206; @ %r303 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r304,[%r209]; setp.eq.u64 %r305,%r304,0; @ ! %r305 bra $L63; bra $L62; $L56: add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r127,%r309,40; add.u64 %r133,%r210,%r127; add.u64 %r125,%r212,%r127; shl.b64 %r310,%r22,3; add.u64 %r103,%r310,-8; cvt.u32.u32 %r312,%r23; cvt.s64.s8 %r311,%r312; shl.b64 %r313,%r311,3; add.u64 %r69,%r313,-8; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 281 12 mov.u64 %r440,0; $L66: .loc 1 276 18 add.u64 %r315,%r427,%r103; ld.u64 %r316,[%r133]; st.u64 [%r315],%r316; .loc 1 277 18 add.u64 %r317,%frame,%r103; .loc 1 277 20 ld.u64 %r319,[%r125]; mul.lo.u64 %r318,%r319,%r34; .loc 1 277 18 st.u64 [%r317],%r318; .loc 1 278 19 ld.u64 %r321,[%r133+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r133+8]; sub.u64 %r66,%r320,%r322; .loc 1 280 10 setp.lt.s64 %r323,%r66,0; @ %r323 bra $L64; .loc 1 278 17 add.u64 %r325,%r426,%r103; st.u64 [%r325],%r66; bra $L65; $L64: .loc 1 281 12 add.u64 %r327,%r426,%r103; st.u64 [%r327],%r440; $L65: .loc 1 274 3 add.u64 %r133,%r133,24; add.u64 %r125,%r125,24; add.u64 %r103,%r103,8; setp.ne.u64 %r329,%r69,%r103; @ %r329 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r330,%r138,0; @ %r330 bra $L68; add.u64 %r426,%frame,360; $L86: add.u64 %r172,%r209,40; mov.u64 %r164,%r426; cvt.u32.u32 %r332,%r23; cvt.s64.s8 %r331,%r332; add.u64 %r334,%r331,%r331; add.u64 %r335,%r334,%r331; shl.b64 %r336,%r335,3; add.u64 %r337,%r209,16; add.u64 %r134,%r336,%r337; .loc 1 291 10 mov.u64 %r129,1; .loc 1 295 4 mov.u64 %r356,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r338,%r23,-2; cvt.s64.s32 %r75,%r338; add.u64 %r340,%r75,%r75; add.u64 %r341,%r340,%r75; shl.b64 %r342,%r341,3; add.u64 %r343,%r209,%r342; .loc 1 299 67 shl.b64 %r345,%r75,3; add.u64 %r346,%frame,%r345; .loc 1 299 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r146,%r348,%r349; .loc 1 301 24 mov.u64 %r350,0; st.u64 [%r209+8],%r350; .loc 1 302 28 cvt.u16.u32 %r353,%r23; add.u16 %r352,%r353,-1; cvt.u32.u16 %r354,%r352; st.u8 [%r209+28],%r354; .loc 1 304 10 setp.eq.u64 %r355,%r146,0; @ %r355 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r129,%r72,%r129; $L69: .loc 1 295 4 st.u64 [%r172+8],%r356; ld.u64 %r72,[%r164]; add.u64 %r357,%r72,-1; st.u64 [%r172+16],%r357; st.u64 [%r172],%r129; .loc 1 288 7 add.u64 %r172,%r172,24; add.u64 %r164,%r164,8; setp.ne.u64 %r358,%r134,%r172; @ %r358 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r209+48],%r146; mov.u64 %r360,-1; st.u64 [%r209+56],%r360; mov.u64 %r361,1; st.u64 [%r209+40],%r361; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24363; call (%value_in),_gfortrani_xmallocarray64,[%value_in]; } .loc 1 311 22 st.u64 [%r209],%r364; bra $L73; $L63: .loc 1 316 19 ld.s8 %r366,[%r209+28]; .loc 1 316 10 setp.eq.u64 %r367,%r366,%r138; @ %r367 bra $L74; .loc 1 317 2 cvta.const.u64 %r36836_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 
%r370,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L73; .loc 1 321 4 add.u64 %r426,%frame,360; cvta.const.u64 %r376,$LC2r42637376; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r382,%r138,0; @ %r382 bra $L75; $L78: .loc 1 336 8 ld.u64 %r153,[%r209]; .loc 1 337 8 ld.u64 %r151,[%r210]; .loc 1 339 9 setp.ne.u64 %r383,%r151,0; @ %r383 bra $L76; bra $L48; $L75: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r426,%frame,360; .loc 1 330 16 mov.u64 %r386,%r157; $L77: st.u64 [%r190],%r386; .loc 1 331 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 332 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r426,%r389; .loc 1 332 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 shl.b64 %r91,%r142,2; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r94,[%frame+240]; .loc 1 359 12 shl.b64 %r95,%r94,2; .loc 1 360 23 ld.u64 %r97,[%frame]; .loc 1 361 22 ld.u64 %r99,[%frame+120]; .loc 1 361 12 shl.b64 %r100,%r99,2; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r94; .loc 1 371 24 mul.lo.u64 %r394,%r85,%r97; .loc 1 371 10 neg.s64 %r163,%r394; .loc 1 372 23 mul.lo.u64 %r395,%r85,%r99; .loc 1 372 9 shl.b64 %r396,%r395,2; neg.s64 %r165,%r396; setp.le.s64 %r431,%r138,1; cvt.u32.u32 %r433,%r23; cvt.s64.s8 %r434,%r433; add.u64 %r435,%r434,-1; add.u64 %r436,%frame,240; add.u64 %r437,%frame,360; add.u64 %r438,%frame,120; .loc 1 367 13 mov.u64 %r439,0; $L87: .loc 1 328 3 mov.u64 %r156,%r152; mov.u64 %r155,%r151; .loc 1 348 10 mov.u32 %r130,0; .loc 1 349 9 mov.u64 %r154,0; $L80: .loc 1 352 6 ld.s8 %r398,[%r156]; cvt.u16.u32 %r397,%r398; setp.eq.u16 %r399,%r397,0; @ %r399 bra $L79; .loc 1 353 12 ld.u32 %r400,[%r155]; or.b32 %r130,%r130,%r400; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r91; .loc 1 349 47 add.u64 %r156,%r156,%r45; .loc 1 349 2 setp.ne.u64 %r401,%r139,%r154; @ %r401 bra $L80; .loc 1 355 8 st.u32 [%r153],%r130; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r151,%r151,%r95; .loc 1 360 13 add.u64 %r152,%r152,%r97; .loc 1 361 12 add.u64 %r153,%r153,%r100; .loc 1 363 13 setp.ne.u64 %r402,%r85,%r39; @ %r402 bra $L87; .loc 1 371 10 add.u64 %r148,%r152,%r163; .loc 1 372 9 add.u64 %r149,%r153,%r165; .loc 1 374 7 @ ! 
%r431 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r439; .loc 1 370 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 371 24 mul.lo.u64 %r405,%r116,%r113; .loc 1 371 10 sub.u64 %r148,%r152,%r405; .loc 1 372 23 mul.lo.u64 %r406,%r118,%r113; .loc 1 372 9 shl.b64 %r407,%r406,2; sub.u64 %r149,%r153,%r407; .loc 1 373 5 add.u64 %r150,%r150,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r166,%r166,8; setp.ne.u64 %r408,%r150,%r435; @ %r408 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 370 23 mov.u64 %r102,%r38; .loc 1 374 7 mov.u64 %r166,8; .loc 1 373 5 mov.u64 %r150,1; $L84: .loc 1 382 16 ld.u64 %r412,[%r41]; add.u64 %r113,%r412,1; st.u64 [%r41],%r113; .loc 1 383 23 add.u64 %r414,%r436,%r166; ld.u64 %r114,[%r414]; .loc 1 383 13 sub.u64 %r415,%r114,%r102; shl.b64 %r416,%r415,2; add.u64 %r151,%r151,%r416; .loc 1 384 24 add.u64 %r417,%frame,%r166; ld.u64 %r116,[%r417]; .loc 1 384 14 add.u64 %r152,%r148,%r116; .loc 1 385 23 add.u64 %r419,%r438,%r166; ld.u64 %r118,[%r419]; .loc 1 385 13 shl.b64 %r420,%r118,2; add.u64 %r153,%r149,%r420; .loc 1 363 32 add.u64 %r422,%r437,%r166; ld.u64 %r122,[%r422]; .loc 1 363 13 setp.eq.u64 %r423,%r113,%r122; @ %r423 bra $L85; .loc 1 367 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r424,[%r209]; setp.eq.u64 %r425,%r424,0; @ ! %r425 bra $L63; bra $L86; $L48: .loc 1 389_gfortran_siany_i4 .visible .func _gfortran_siany_8u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64predu64mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_iany_.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r294,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r294; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r294; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r300,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r300; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r293,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r293,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r299,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r299; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r294,0; @ %r211 bra $L127; add.u64 %r293,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r293; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 2496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r294,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r294; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r294; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r294,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: .loc 1 504 16 
cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r294,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r294,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,2; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r292,%r294,1; .loc 1 512 13 mov.u32 %r278,0; cvt.u32.u32 %r296,%r24; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; $L146: st.u32 [%r98],%r278; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r279,%r59,%r80; @ %r279 bra $L146; .loc 1 525 7 @ ! %r292 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r295; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r282,%r97,%r298; @ %r282 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; .loc 1 520 13 mov.u64 %r295,0; $L144: .loc 1 529 16 ld.u64 %r287,[%r99]; add.u64 %r74,%r287,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r288,%r75,%r72; shl.b64 %r289,%r288,2; add.u64 %r98,%r98,%r289; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r290,%r74,%r77; @ %r290 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r291,%r98,0; @ ! 
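The three entry points in each of these listings follow the usual libgfortran naming convention: _gfortran_iany_i4 performs the unmasked reduction along the requested dimension, _gfortran_miany_i4 applies an array-valued MASK, and _gfortran_siany_i4 handles a scalar MASK (delegating to the unmasked routine when the mask is true). As a rough illustration of the operation the unmasked routine computes — the Fortran IANY intrinsic, a bitwise-OR reduction — here is a minimal C sketch; the function name, the flat row-major layout, and the raw-pointer interface are assumptions for clarity and do not reflect the real libgfortran array-descriptor ABI.

/* Illustrative sketch only: a plain-C model of the bitwise-OR reduction
 * (the Fortran IANY intrinsic) that _gfortran_iany_i4 implements.
 * Names and calling convention here are hypothetical. */
#include <stdint.h>
#include <stdio.h>

/* Reduce an nrows x ncols matrix with bitwise OR along one dimension:
 * dim == 1 collapses rows (result has ncols elements),
 * dim == 2 collapses columns (result has nrows elements). */
static void iany_i4_sketch(const int32_t *a, int nrows, int ncols,
                           int dim, int32_t *result)
{
    if (dim == 1) {
        for (int j = 0; j < ncols; j++) {
            int32_t acc = 0;                  /* OR identity */
            for (int i = 0; i < nrows; i++)
                acc |= a[i * ncols + j];
            result[j] = acc;
        }
    } else {
        for (int i = 0; i < nrows; i++) {
            int32_t acc = 0;
            for (int j = 0; j < ncols; j++)
                acc |= a[i * ncols + j];
            result[i] = acc;
        }
    }
}

int main(void)
{
    const int32_t a[2][3] = { { 0x1, 0x2, 0x4 },
                              { 0x8, 0x2, 0x10 } };
    int32_t r[3];

    iany_i4_sketch(&a[0][0], 2, 3, 1, r);     /* like IANY(a, DIM=1) */
    for (int j = 0; j < 3; j++)
        printf("%#x ", (unsigned) r[j]);      /* prints 0x9 0x2 0x14 */
    printf("\n");
    return 0;
}

Compiled with any C compiler, the example prints the column-wise OR of the two rows, which is conceptually what IANY(a, DIM=1) returns for the equivalent Fortran array; the i8 and i16 listings below do the same on wider elements.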
[iany_i8.o — PTX assembly for the nvptx target, generated from fortran/generated/iany_i8.c. Visible functions: _gfortran_iany_i8, _gfortran_miany_i8, _gfortran_siany_i8, with the same external declarations and IANY runtime-error strings as the iany_i4.o listing above; the reduction is carried out on 64-bit elements (8-byte strides, or.b64 accumulation) instead of 32-bit.]
[iany_i16.o — PTX assembly for the nvptx target, generated from fortran/generated/iany_i16.c. Visible functions: _gfortran_iany_i16, _gfortran_miany_i16, _gfortran_siany_i16, again with the same external declarations and IANY error strings; here the elements are 128-bit (16-byte strides, the OR accumulated in two 64-bit halves). The listing of this object continues below.]
bra $L73; $L63: .loc 1 316 19 ld.s8 %r366,[%r209+28]; .loc 1 316 10 setp.eq.u64 %r367,%r366,%r138; @ %r367 bra $L74; .loc 1 317 2 cvta.const.u64 %r36836_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r370,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L73; .loc 1 321 4 add.u64 %r438,%frame,360; cvta.const.u64 %r376,$LC2r437376; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r382,%r138,0; @ %r382 bra $L75; $L78: .loc 1 336 8 ld.u64 %r153,[%r209]; .loc 1 337 8 ld.u64 %r151,[%r210]; .loc 1 339 9 setp.ne.u64 %r383,%r151,0; @ %r383 bra $L76; bra $L48; $L75: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r438,%frame,360; .loc 1 330 16 mov.u64 %r386,%r157; $L77: st.u64 [%r190],%r386; .loc 1 331 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 332 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r438,%r389; .loc 1 332 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 shl.b64 %r91,%r142,4; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r94,[%frame+240]; .loc 1 359 12 shl.b64 %r95,%r94,4; .loc 1 360 23 ld.u64 %r97,[%frame]; .loc 1 361 22 ld.u64 %r99,[%frame+120]; .loc 1 361 12 shl.b64 %r100,%r99,4; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r94; .loc 1 371 24 mul.lo.u64 %r394,%r85,%r97; .loc 1 371 10 neg.s64 %r163,%r394; .loc 1 372 23 mul.lo.u64 %r395,%r85,%r99; .loc 1 372 9 shl.b64 %r396,%r395,4; neg.s64 %r165,%r396; setp.le.s64 %r443,%r138,1; .loc 1 348 10 mov.u64 %r445,0; cvt.u32.u32 %r446,%r23; cvt.s64.s8 %r447,%r446; add.u64 %r448,%r447,-1; add.u64 %r449,%frame,240; add.u64 %r450,%frame,360; add.u64 %r451,%frame,120; $L87: .loc 1 328 3 mov.u64 %r156,%r152; mov.u64 %r155,%r151; .loc 1 348 10 mov.u64 %r434,%r445; mov.u64 %r435,0; .loc 1 349 9 mov.u64 %r154,%r435; $L80: .loc 1 352 6 ld.s8 %r398,[%r156]; cvt.u16.u32 %r397,%r398; setp.eq.u16 %r399,%r397,0; @ %r399 bra $L79; .loc 1 353 12 ld.u64 %r402,[%r155]; ld.u64 %r405,[%r155+8]; or.b64 %r434,%r402,%r434; or.b64 %r435,%r405,%r435; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r91; .loc 1 349 47 add.u64 %r156,%r156,%r45; .loc 1 349 2 setp.ne.u64 %r407,%r139,%r154; @ %r407 bra $L80; .loc 1 355 8 st.u64 [%r153],%r434; st.u64 [%r153+8],%r435; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r151,%r151,%r95; .loc 1 360 13 add.u64 %r152,%r152,%r97; .loc 1 361 12 add.u64 %r153,%r153,%r100; .loc 1 363 13 setp.ne.u64 %r410,%r85,%r39; @ %r410 bra $L87; .loc 1 371 10 add.u64 %r148,%r152,%r163; .loc 1 372 9 add.u64 %r149,%r153,%r165; .loc 1 374 7 @ ! 
%r443 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r445; .loc 1 370 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 371 24 mul.lo.u64 %r413,%r116,%r113; .loc 1 371 10 sub.u64 %r148,%r152,%r413; .loc 1 372 23 mul.lo.u64 %r414,%r118,%r113; .loc 1 372 9 shl.b64 %r415,%r414,4; sub.u64 %r149,%r153,%r415; .loc 1 373 5 add.u64 %r150,%r150,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r166,%r166,8; setp.ne.u64 %r416,%r150,%r448; @ %r416 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 370 23 mov.u64 %r102,%r38; .loc 1 374 7 mov.u64 %r166,8; .loc 1 373 5 mov.u64 %r150,1; $L84: .loc 1 382 16 ld.u64 %r420,[%r41]; add.u64 %r113,%r420,1; st.u64 [%r41],%r113; .loc 1 383 23 add.u64 %r422,%r449,%r166; ld.u64 %r114,[%r422]; .loc 1 383 13 sub.u64 %r423,%r114,%r102; shl.b64 %r424,%r423,4; add.u64 %r151,%r151,%r424; .loc 1 384 24 add.u64 %r425,%frame,%r166; ld.u64 %r116,[%r425]; .loc 1 384 14 add.u64 %r152,%r148,%r116; .loc 1 385 23 add.u64 %r427,%r451,%r166; ld.u64 %r118,[%r427]; .loc 1 385 13 shl.b64 %r428,%r118,4; add.u64 %r153,%r149,%r428; .loc 1 363 32 add.u64 %r430,%r450,%r166; ld.u64 %r122,[%r430]; .loc 1 363 13 setp.eq.u64 %r431,%r113,%r122; @ %r431 bra $L85; .loc 1 367 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r432,[%r209]; setp.eq.u64 %r433,%r432,0; @ ! %r433 bra $L63; bra $L86; $L48: .loc 1 389_gfortran_siany_i16 .visible .func _gfortran_siany_i168u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64predpred %r292; .reg .predu32u64 %r301; mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_iany_i16.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r295,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r295; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r295; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r301,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r301; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r294,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r294,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r300,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r300; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r295,0; @ %r211 bra $L127; add.u64 %r294,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r294; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r244,1696244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r295,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r295; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r295; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r295,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; 
$L132: .loc 1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r295,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r295,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,4; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r293,%r295,1; .loc 1 512 13 mov.u64 %r278,0; cvt.u32.u32 %r297,%r24; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; $L146: st.u64 [%r98],%r278; st.u64 [%r98+8],%r278; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L146; .loc 1 525 7 @ ! %r293 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r278; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r299; @ %r283 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; $L144: .loc 1 529 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r289,%r75,%r72; shl.b64 %r290,%r289,4; add.u64 %r98,%r98,%r290; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r292,%r98,0; @ ! 
%r292 bra $L122; bra $L147; $L109: .loc 1 534 iparity_i1.o/_gfortran_iparity_i1 .visible .func _gfortran_iparity_i1fortran/generated/iparity_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_miparity_i1 .visible .func _gfortran_miparity_i1gfortran_siparity_i1 .visible .func _gfortran_siparity_i1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,73,80,65,82,73,84,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,80,65,82,73,84,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,108] = {73,80,65,82,73,84,89114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,80,65,82,73,84,89,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,73,80,65,82,73,84,89,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_iparity_i1 .visible .func _gfortran_iparity_i148predpred %r209; .reg .pred %r210; .reg .u64 %r211; .reg .u64 %r213; .reg .u64pred %r220; .reg .pred %r222; .reg .u64pred %r247; .reg .u64 %r248; .reg .u32 %r249; .reg .u64 %r251; .reg .u64 %r252; .reg .u64 %r253; .reg .u64 %r254; .reg .u64 %r255; .reg .u16 %r257; .reg .u16predpred %r288; .reg .u64 %r291; .reg .u64 %r292; .reg .predpred %r304; .reg .predpredpred %r332; .reg .u64 %r333; .reg .pred %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .pred %r337; .reg .pred %r338; .reg .u32 %r340; .reg .u64 %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64 %r345; .reg .u64 %r347; .reg .u64 %r348; .reg .u16 %r349; mov.u64 %r171,%ar0; mov.u64 %r172,%ar1; mov.u64 %r173,%ar2; .loc 1 55 10 ld.s8 %r22,[%r172+28]; .loc 1 55 38 add.u32 %r174,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r174; .loc 1 56 10 ld.u64 %r25,[%r173]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r176,%r104,63; set.u32.lt.s64 %r178,%r103,%r104; neg.s32 %r179,%r178; cvt.u16.u64 %r182,%r176; cvt.u16.u32 %r183,%r179; or.b16 %r181,%r182,%r183; .loc 1 58 6 cvt.u32.u16 %r184,%r181; cvt.u16.u8 %r185,%r184; setp.eq.u16 %r186,%r185,0; @ %r186 bra $L2; .loc 1 60 7 cvt.u32.u32 %r189,%r22; cvt.s64.s8 %r188,%r189; st.u64 [%stack+8],%r18818187_gfortran_runtime_errorr192,%r104,%r104; add.u64 %r193,%r192,%r104; shl.b64 %r194,%r193,3; add.u64 %r195,%r172,%r194; ld.u64 %r31,[%r195+56]; ld.u64 %r33,[%r195+48]; .loc 1 68 9 ld.u64 %r107,[%r195+40]; .loc 1 70 3 setp.ne.u64 %r209,%r104,0; @ %r209 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r210,%r103,%r104; @ %r210 bra $L4; bra $L46; $L3: add.u64 %r168,%r172,40; add.u64 %r165,%frame,120; 
add.u64 %r48,%frame,240; add.u64 %r211,%r172,16; add.u64 %r213,%r25,%r25; add.u64 %r214,%r213,%r25; shl.b64 %r215,%r214,3; add.u64 %r139,%r211,%r215; .loc 1 76 12 mov.u64 %r348,0; $L8: .loc 1 72 18 ld.u64 %r216,[%r168]; st.u64 [%r165],%r216; .loc 1 73 19 ld.u64 %r218,[%r168+16]; add.u64 %r217,%r218,1; ld.u64 %r219,[%r168+8]; sub.u64 %r38,%r217,%r219; .loc 1 75 10 setp.lt.s64 %r220,%r38,0; @ %r220 bra $L6; .loc 1 73 17 st.u64 [%r48],%r38; bra $L7; $L6: .loc 1 76 12 st.u64 [%r48],%r348; $L7: .loc 1 70 3 add.u64 %r168,%r168,24; add.u64 %r165,%r165,8; add.u64 %r48,%r48,8; setp.ne.u64 %r222,%r139,%r168; @ %r222 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r223,[%r171]; setp.eq.u64 %r224,%r223,0; @ ! %r224 bra $L11; bra $L10; $L4: add.u64 %r226,%r25,%r25; add.u64 %r227,%r226,%r25; shl.b64 %r228,%r227,3; add.u64 %r229,%r228,40; add.u64 %r99,%r172,%r229; shl.b64 %r230,%r25,3; add.u64 %r91,%r230,-8; add.u64 %r335,%frame,120; add.u64 %r93,%r335,%r91; add.u64 %r336,%frame,240; add.u64 %r77,%r336,%r91; cvt.u32.u32 %r234,%r22; cvt.s64.s8 %r233,%r234; add.u64 %r236,%r233,%r233; add.u64 %r237,%r236,%r233172,40; add.u64 %r121,%r238,%r239; .loc 1 84 12 mov.u64 %r347,0; $L14: .loc 1 80 18 ld.u64 %r240,[%r99]; st.u64 [%r93],%r240; .loc 1 81 19 ld.u64 %r242,[%r99+16]; add.u64 %r241,%r242,1; ld.u64 %r243,[%r99+8]; sub.u64 %r44,%r241,%r243; .loc 1 83 10 setp.lt.s64 %r244,%r44,0; @ %r244 bra $L12; .loc 1 81 17 st.u64 [%r77],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r77],%r347; $L13: .loc 1 78 3 add.u64 %r99,%r99,24; add.u64 %r93,%r93,8; add.u64 %r77,%r77,8; setp.ne.u64 %r246,%r99,%r121; @ %r246 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r247,%r103,0; @ %r247 bra $L16; add.u64 %r336,%frame,240; $L35: add.u64 %r137,%r171,40; mov.u64 %r132,%r336; cvt.u32.u32 %r249,%r22; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r254,%r171,16; add.u64 %r100,%r253,%r254; .loc 1 94 10 mov.u64 %r95,1; .loc 1 98 4 mov.u64 %r277,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r255,0; st.u64 [%r171+8],%r255; .loc 1 103 28 cvt.u16.u32 %r258,%r22; add.u16 %r257,%r258,-1; cvt.u32.u16 %r259,%r257; st.u8 [%r171+28],%r259; .loc 1 105 20 add.u32 %r260,%r22,-2; cvt.s64.s32 %r54,%r260; add.u64 %r262,%r54,%r54; add.u64 %r263,%r262,%r54; shl.b64 %r264,%r263,3; add.u64 %r265,%r171,%r264; .loc 1 105 67 shl.b64 %r267,%r54,3; add.u64 %r268,%frame,%r267; .loc 1 105 59 ld.u64 %r270,[%r265+40]; ld.u64 %r271,[%r268+240]; mul.lo.u64 %r109,%r270,%r271; .loc 1 107 29 mov.u64 %r27273; call (%value_in),_gfortrani_xmallocarray274,[%value_in]; } .loc 1 107 27 st.u64 [%r171],%r274; .loc 1 108 10 setp.eq.u64 %r276,%r109,0; @ ! 
%r276 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r95,%r50,%r95; $L17: .loc 1 98 4 st.u64 [%r137+8],%r277; ld.u64 %r50,[%r132]; add.u64 %r278,%r50,-1; st.u64 [%r137+16],%r278; st.u64 [%r137],%r95; .loc 1 91 7 add.u64 %r137,%r137,24; add.u64 %r132,%r132,8; setp.ne.u64 %r279,%r100,%r137; @ %r279 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r171+48],%r109; mov.u64 %r281,-1; st.u64 [%r171+56],%r281; st.u64 [%r171+40],%r273; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r61,[%r171+28]; .loc 1 118 10 setp.eq.u64 %r283,%r61,%r103; @ %r283 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r61;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r286,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r287,[%r286+36]; setp.eq.u32 %r288,%r287,0; @ %r288 bra $L21; .loc 1 125 2 add.u64 %r336,%frame,240; cvta.const.u64 %r292,$LC2r3r29292; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r294,%r103,0; @ %r294 bra $L24; $L27: .loc 1 65 9 add.u64 %r295,%r31,1; .loc 1 65 7 sub.u64 %r105,%r295,%r33; max.s64 %r45,%r105,0; .loc 1 137 8 ld.u64 %r117,[%r172]; .loc 1 138 8 ld.u64 %r118,[%r171]; ld.u64 %r47,[%frame+360]; .loc 1 166 22 ld.u64 %r69,[%frame+120]; .loc 1 167 22 ld.u64 %r71,[%frame]; setp.gt.s64 %r337,%r105,0; setp.le.s64 %r338,%r103,1; cvt.u32.u32 %r340,%r22; cvt.s64.s8 %r341,%r340; add.u64 %r342,%r341,-1; add.u64 %r343,%frame,120; add.u64 %r344,%frame,240; .loc 1 173 13 mov.u64 %r345,0; bra $L25; $L24: add.u64 %r157,%frame,360; add.u64 %r156,%r171,40; mov.u64 %r154,%frame; cvt.u32.u32 %r297,%r22; cvt.s64.s8 %r296,%r297; add.u64 %r145,%r296,-1; .loc 1 129 3 mov.u64 %r119,0; add.u64 %r336,%frame,240; .loc 1 131 16 mov.u64 %r298,%r119; $L26: st.u64 [%r157],%r298; .loc 1 132 18 ld.u64 %r299,[%r156]; st.u64 [%r154],%r299; .loc 1 133 17 shl.b64 %r301,%r119,3; add.u64 %r302,%r336,%r301; .loc 1 133 10 ld.u64 %r303,[%r302]; setp.le.s64 %r304,%r303,0; @ %r304 bra $L1; .loc 1 129 26 add.u64 %r119,%r119,1; .loc 1 129 3 add.u64 %r157,%r157,8; add.u64 %r156,%r156,24; add.u64 %r154,%r154,8; setp.ne.u64 %r305,%r119,%r145; @ %r305 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r337 bra $L28; .loc 1 150 10 mov.u32 %r307,0; st.u8 [%r118],%r307; bra $L29; $L28: .loc 1 154 36 mov.u64 %r112,%r117; .loc 1 148 10 mov.u32 %r110,0; .loc 1 154 13 mov.u64 %r111,0; $L30: .loc 1 158 10 ld.u8 %r349,[%r112]; mov.loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r107; .loc 1 154 6 setp.gt.s64 %r313,%r45,%r111; @ %r313 bra $L30; .loc 1 161 12 cvt.u32.u32 %r314,%r110; st.u8 [%r118],%r314; $L29: .loc 1 165 15 add.u64 %r47,%r47,1; .loc 1 166 12 add.u64 %r117,%r117,%r69; .loc 1 167 12 add.u64 %r118,%r118,%r71; .loc 1 169 32 ld.u64 %r126,[%frame+240]; .loc 1 169 13 setp.ne.u64 %r315,%r47,%r126; @ %r315 bra $L25; .loc 1 176 23 mul.lo.u64 %r316,%r69,%r47; .loc 1 176 9 sub.u64 %r114,%r117,%r316; .loc 1 177 23 mul.lo.u64 %r317,%r71,%r47; .loc 1 177 9 sub.u64 %r115,%r118,%r317; .loc 1 179 7 @ %r338 bra $L1; add.u64 %r143,%frame,368; mov.u64 %r102,8; .loc 1 178 5 mov.u64 %r116,1; bra $L33; $L34: .loc 1 173 13 st.u64 [%r143],%r345; .loc 1 176 23 mul.lo.u64 %r323,%r84,%r83; .loc 1 176 9 sub.u64 %r114,%r117,%r323; .loc 1 177 23 mul.lo.u64 %r324,%r86,%r83; .loc 1 177 9 sub.u64 %r115,%r118,%r324; .loc 1 178 5 add.u64 %r116,%r116,1; .loc 1 179 7 add.u64 %r143,%r143,8; add.u64 %r102,%r102,8; setp.eq.u64 %r325,%r116,%r342; @ %r325 bra $L1; $L33: .loc 1 187 16 ld.u64 %r326,[%r143]; add.u64 %r83,%r326,1; st.u64 [%r143],%r83; .loc 1 188 23 add.u64 %r328,%r343,%r102; ld.u64 %r84,[%r328]; 
.loc 1 188 13 add.u64 %r117,%r114,%r84; .loc 1 189 23 add.u64 %r329,%frame,%r102; ld.u64 %r86,[%r329]; .loc 1 189 13 add.u64 %r118,%r115,%r86; .loc 1 169 32 add.u64 %r331,%r344,%r102; ld.u64 %r88,[%r331]; .loc 1 169 13 setp.eq.u64 %r332,%r83,%r88; @ %r332 bra $L34; .loc 1 173 13 mov.u64 %r47,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r333,[%r171]; setp.eq.u64 %r334,%r333,0; @ ! %r334 bra $L11; bra $L35; $L1_gfortran_miparity_i1 .visible .func _gfortran_miparity_608u32 %r215; .reg .u32 %r216; .reg .u16u16 %r255; .reg .u32 %r256; .reg .u16 %r257; .reg .pred %r258; .reg .predpred %r274; .reg .u64 %r275; .reg .u64 %r277; .reg .u64 %r279; .reg .u64 %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64pred %r297; .reg .u64 %r298; .reg .predu64 %r305; .reg .u32 %r306; .reg .u64u64 %r313; .reg .u64pred %r323; .reg .pred %r324; .reg .u64 %r325; .reg .u32 %r326; .reg .u64u64 %r331; .reg .u32 %r332; .reg .u64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u64u16 %r347; .reg .u32 %r348; .reg .pred %r349; .reg .u64 %r350; .reg .u64 %r351; .reg .predu16 %r388; .reg .u32 %r389; .reg .predu64 %r436; .reg .u16 %r437; mov.u64 %r203,%ar0; mov.u64 %r204,%ar1; mov.u64 %r205,%ar2; mov.u64 %r206,%ar3; .loc 1 223 6 setp.ne.u64 %r207,%r206,0; @ %r207 bra $L490r204205; call _gfortran_iparity_.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r205]; .loc 1 233 7 add.u64 %r132,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r204+28]; .loc 1 234 38 add.u32 %r211,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r133,%r211; .loc 1 237 7 shr.u64 %r213,%r132,63; set.u32.gt.s64 %r215,%r132,%r133; neg.s32 %r216,%r215; cvt.u16.u64 %r219,%r213; cvt.u16.u32 %r220,%r216; or.b16 %r218,%r219,%r220; .loc 1 237 6 cvt.u32.u16 %r221,%r218; cvt.u16.u8 %r222,%r221; setp.eq.u16 %r223,%r222,0; @ %r223 bra $L51; .loc 1 239 7 cvt.u32.u32 %r226,%r23; cvt.s64.s8 %r225,%r226; st.u64 [%stack+8],%r225; st.u64 [%stack],%r22; cvta.const.u64 %r2242stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r425,%r132,%r132; add.u64 %r424,%r425,%r132; shl.b64 %r231,%r424,3; add.u64 %r232,%r204,%r231; ld.u64 %r235,[%r232+56]; add.u64 %r234,%r235,1; .loc 1 244 7 ld.u64 %r242,[%r232+48]; sub.u64 %r134,%r234,%r242; .loc 1 245 6 setp.le.s64 %r243,%r134,0; @ %r243 bra $L48; .loc 1 248 9 ld.u64 %r146,[%r206]; .loc 1 250 15 ld.u64 %r34,[%r206+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r244,%r35,-4; and.b32 %r245,%r244,-5; set.u32.eq.u32 %r247,%r245,0; neg.s32 %r248,%r247; .loc 1 252 22 add.u32 %r249,%r35,-1; set.u32.le.u32 %r251,%r249,1; neg.s32 %r252,%r251; .loc 1 252 6 cvt.u16.u32 %r254,%r248; cvt.u16.u32 %r255,%r252; or.b16 %r253,%r254,%r255; cvt.u32.u16 %r256,%r253; cvt.u16.u8 %r257,%r256; setp.ne.u16 %r258,%r257,0; @ %r258 bra $L53; .loc 1 254 7 setp.ne.u32 %r260,%r35,16; @ %r260 bra $L54; $L53: .loc 1 261 9 shl.b64 %r264,%r424,3; add.u64 %r265,%r204,%r264; ld.u64 %r137,[%r265+40]; .loc 1 262 12 add.u64 %r271,%r206,%r264; ld.u64 %r42,[%r271+40]; .loc 1 264 3 setp.ne.u64 %r273,%r132,0; @ %r273 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r274,%r132,%r133; @ %r274 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r275275_gfortran_runtime_error5: add.u64 %r187,%r204,40; add.u64 %r189,%r206,40; add.u64 %r277,%r204,16; add.u64 %r279,%r22,%r22; add.u64 %r280,%r279,%r22; shl.b64 %r281,%r280,3; add.u64 %r200,%r277,%r281; .loc 1 264 3 mov.u64 %r191,0; add.u64 %r422,%frame,240; add.u64 %r421,%frame,360; .loc 1 271 12 mov.u64 %r436,%r191; $L60: .loc 1 266 18 add.u64 %r283,%r422,%r191; ld.u64 %r284,[%r187]; st.u64 [%r283],%r284; .loc 1 267 18 
add.u64 %r285,%frame,%r191; .loc 1 267 20 ld.u64 %r287,[%r189]; mul.lo.u64 %r286,%r287,%r34; .loc 1 267 18 st.u64 [%r285],%r286; .loc 1 268 19 ld.u64 %r289,[%r187+16]; add.u64 %r288,%r289,1; ld.u64 %r290,[%r187+8]; sub.u64 %r53,%r288,%r290; .loc 1 270 10 setp.lt.s64 %r291,%r53,0; @ %r291 bra $L58; .loc 1 268 17 add.u64 %r293,%r421,%r191; st.u64 [%r293],%r53; bra $L59; $L58: .loc 1 271 12 add.u64 %r295,%r421,%r191; st.u64 [%r295],%r436; $L59: .loc 1 264 3 add.u64 %r187,%r187,24; add.u64 %r189,%r189,24; add.u64 %r191,%r191,8; setp.ne.u64 %r297,%r187,%r200; @ %r297 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r298,[%r203]; setp.eq.u64 %r299,%r298,0; @ ! %r299 bra $L63; bra $L62; $L56: add.u64 %r301,%r22,%r22; add.u64 %r302,%r301,%r22; shl.b64 %r303,%r302,3; add.u64 %r152,%r303,40; add.u64 %r58,%r204,%r152; add.u64 %r154,%r206,%r152; shl.b64 %r304,%r22,3; add.u64 %r100,%r304,-8; cvt.u32.u32 %r306,%r23; cvt.s64.s8 %r305,%r306; shl.b64 %r307,%r305,3; add.u64 %r184,%r307,-8; add.u64 %r422,%frame,240; add.u64 %r421,%frame,360; .loc 1 281 12 mov.u64 %r435,0; $L66: .loc 1 276 18 add.u64 %r309,%r422,%r100; ld.u64 %r310,[%r58]; st.u64 [%r309],%r310; .loc 1 277 18 add.u64 %r311,%frame,%r100; .loc 1 277 20 ld.u64 %r313,[%r154]; mul.lo.u64 %r312,%r313,%r34; .loc 1 277 18 st.u64 [%r311],%r312; .loc 1 278 19 ld.u64 %r315,[%r58+16]; add.u64 %r314,%r315,1; ld.u64 %r316,[%r58+8]; sub.u64 %r64,%r314,%r316; .loc 1 280 10 setp.lt.s64 %r317,%r64,0; @ %r317 bra $L64; .loc 1 278 17 add.u64 %r319,%r421,%r100; st.u64 [%r319],%r64; bra $L65; $L64: .loc 1 281 12 add.u64 %r321,%r421,%r100; st.u64 [%r321],%r435; $L65: .loc 1 274 3 add.u64 %r58,%r58,24; add.u64 %r154,%r154,24; add.u64 %r100,%r100,8; setp.ne.u64 %r323,%r100,%r184; @ %r323 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r324,%r133,0; @ %r324 bra $L68; add.u64 %r421,%frame,360; $L86: add.u64 %r124,%r203,40; mov.u64 %r118,%r421; cvt.u32.u32 %r326,%r23; cvt.s64.s8 %r325,%r326; add.u64 %r328,%r325,%r325; add.u64 %r329,%r328,%r325; shl.b64 %r330,%r329,3; add.u64 %r331,%r203,16; add.u64 %r72,%r330,%r331; .loc 1 291 10 mov.u64 %r120,1; .loc 1 295 4 mov.u64 %r350,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r332,%r23,-2; cvt.s64.s32 %r70,%r332; add.u64 %r334,%r70,%r70; add.u64 %r335,%r334,%r70; shl.b64 %r336,%r335,3; add.u64 %r337,%r203,%r336; .loc 1 299 67 shl.b64 %r339,%r70,3; add.u64 %r340,%frame,%r339; .loc 1 299 59 ld.u64 %r342,[%r337+40]; ld.u64 %r343,[%r340+360]; mul.lo.u64 %r140,%r342,%r343; .loc 1 301 24 mov.u64 %r344,0; st.u64 [%r203+8],%r344; .loc 1 302 28 cvt.u16.u32 %r347,%r23; add.u16 %r346,%r347,-1; cvt.u32.u16 %r348,%r346; st.u8 [%r203+28],%r348; .loc 1 304 10 setp.eq.u64 %r349,%r140,0; @ %r349 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r120,%r67,%r120; $L69: .loc 1 295 4 st.u64 [%r124+8],%r350; ld.u64 %r67,[%r118]; add.u64 %r351,%r67,-1; st.u64 [%r124+16],%r351; st.u64 [%r124],%r120; .loc 1 288 7 add.u64 %r124,%r124,24; add.u64 %r118,%r118,8; setp.ne.u64 %r352,%r72,%r124; @ %r352 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r203+48],%r140; mov.u64 %r354,-1; st.u64 [%r203+56],%r354; mov.u64 %r355,1; st.u64 [%r203+40],%r355; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r35714357; call (%value_in),_gfortrani_xmallocarray58,[%value_in]; } .loc 1 311 22 st.u64 [%r203],%r358; bra $L73; $L63: .loc 1 316 19 ld.s8 %r360,[%r203+28]; .loc 1 316 10 setp.eq.u64 %r361,%r360,%r133; @ %r361 bra $L74; .loc 1 317 2 cvta.const.u64 %r362362_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r364,_gfortrani_compile_options; .loc 1 319 10 
ld.u32 %r365,[%r364+36]; setp.eq.u32 %r366,%r365,0; @ %r366 bra $L73; .loc 1 321 4 add.u64 %r421,%frame,360; cvta.const.u64 %r370,$LC2369370; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r374,$LC6; {374370; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r376,%r133,0; @ %r376 bra $L75; $L78: .loc 1 336 8 ld.u64 %r147,[%r203]; .loc 1 337 8 ld.u64 %r145,[%r204]; .loc 1 339 9 setp.ne.u64 %r377,%r145,0; @ %r377 bra $L76; bra $L48; $L75: add.u64 %r170,%frame,480; add.u64 %r167,%r203,40; add.u64 %r160,%frame,120; cvt.u32.u32 %r379,%r23; cvt.s64.s8 %r378,%r379; add.u64 %r125,%r378,-1; .loc 1 328 3 mov.u64 %r151,0; add.u64 %r421,%frame,360; .loc 1 330 16 mov.u64 %r380,%r151; $L77: st.u64 [%r170],%r380; .loc 1 331 18 ld.u64 %r381,[%r167]; st.u64 [%r160],%r381; .loc 1 332 17 shl.b64 %r383,%r151,3; add.u64 %r384,%r421,%r383; .loc 1 332 10 ld.u64 %r385,[%r384]; setp.le.s64 %r386,%r385,0; @ %r386 bra $L48; .loc 1 328 26 add.u64 %r151,%r151,1; .loc 1 328 3 add.u64 %r170,%r170,8; add.u64 %r167,%r167,24; add.u64 %r160,%r160,8; setp.ne.u64 %r387,%r125,%r151; @ %r387 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r44,%r42,%r34; ld.u64 %r173,[%frame+480]; .loc 1 359 22 ld.u64 %r86,[%frame+240]; .loc 1 360 23 ld.u64 %r88,[%frame]; .loc 1 361 22 ld.u64 %r90,[%frame+120]; setp.le.s64 %r426,%r133,1; .loc 1 348 10 mov.u32 %r428,0; cvt.u32.u32 %r429,%r23; cvt.s64.s8 %r430,%r429; add.u64 %r431,%r430,-1; add.u64 %r432,%frame,240; add.u64 %r433,%frame,360; add.u64 %r434,%frame,120; $L87: .loc 1 328 3 mov.u64 %r150,%r146; mov.u64 %r149,%r145; .loc 1 348 10 mov.u32 %r122,%r428; .loc 1 349 9 mov.u64 %r148,0; $L80: .loc 1 352 6 ld.s8 %r389,[%r150]; cvt.u16.u32 %r388,%r389; setp.eq.u16 %r390,%r388,0; @ %r390 bra $L79; .loc 1 353 12 ld.u8 %r437,[%r149]; mov$L79: .loc 1 349 24 add.u64 %r148,%r148,1; .loc 1 349 32 add.u64 %r149,%r149,%r137; .loc 1 349 47 add.u64 %r150,%r150,%r44; .loc 1 349 2 setp.ne.u64 %r396,%r134,%r148; @ %r396 bra $L80; .loc 1 355 8 cvt.u32.u32 %r397,%r122; st.u8 [%r147],%r397; .loc 1 358 15 add.u64 %r173,%r173,1; .loc 1 359 12 add.u64 %r145,%r145,%r86; .loc 1 360 13 add.u64 %r146,%r146,%r88; .loc 1 361 12 add.u64 %r147,%r147,%r90; .loc 1 363 32 ld.u64 %r80,[%frame+360]; .loc 1 363 13 setp.ne.u64 %r398,%r80,%r173; @ %r398 bra $L87; .loc 1 370 23 mul.lo.u64 %r94,%r173,%r86; .loc 1 371 24 mul.lo.u64 %r399,%r173,%r88; .loc 1 371 10 sub.u64 %r142,%r146,%r399; .loc 1 372 23 mul.lo.u64 %r400,%r173,%r90; .loc 1 372 9 sub.u64 %r143,%r147,%r400; .loc 1 374 7 @ ! 
%r426 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r40],%r427; .loc 1 370 23 mul.lo.u64 %r94,%r106,%r105; .loc 1 371 24 mul.lo.u64 %r403,%r108,%r105; .loc 1 371 10 sub.u64 %r142,%r146,%r403; .loc 1 372 23 mul.lo.u64 %r404,%r110,%r105; .loc 1 372 9 sub.u64 %r143,%r147,%r404; .loc 1 373 5 add.u64 %r144,%r144,1; .loc 1 374 7 add.u64 %r40,%r40,8; add.u64 %r164,%r164,8; setp.ne.u64 %r405,%r144,%r431; @ %r405 bra $L84; bra $L48; $L108: add.u64 %r40,%frame,488; mov.u64 %r164,8; .loc 1 373 5 mov.u64 %r144,1; .loc 1 367 13 mov.u64 %r427,0; $L84: .loc 1 382 16 ld.u64 %r409,[%r40]; add.u64 %r105,%r409,1; st.u64 [%r40],%r105; .loc 1 383 23 add.u64 %r411,%r432,%r164; ld.u64 %r106,[%r411]; .loc 1 383 13 sub.u64 %r412,%r106,%r94; add.u64 %r145,%r145,%r412; .loc 1 384 24 add.u64 %r413,%frame,%r164; ld.u64 %r108,[%r413]; .loc 1 384 14 add.u64 %r146,%r142,%r108; .loc 1 385 23 add.u64 %r415,%r434,%r164; ld.u64 %r110,[%r415]; .loc 1 385 13 add.u64 %r147,%r143,%r110; .loc 1 363 32 add.u64 %r417,%r433,%r164; ld.u64 %r113,[%r417]; .loc 1 363 13 setp.eq.u64 %r418,%r105,%r113; @ %r418 bra $L85; .loc 1 367 13 mov.u64 %r173,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r419,[%r203]; setp.eq.u64 %r420,%r419,0; @ ! %r420 bra $L63; bra $L86; $L48: .loc 1 389_gfortran_siparity_i1 .visible .func _gfortran_siparity_88pred %r184; .reg .predpred %r206; .reg .predu32 %r211; .reg .u64 %r213; .reg .u64u32 %r222; .reg .u64pred %r264; .reg .u64 %r266; .reg .u32 %r270; .reg .u64 %r272; .reg .u64 %r274; .reg .predpredu64 %r297; mov.u64 %r148,%ar0; mov.u64 %r149,%ar1; mov.u64 %r150,%ar2; mov.u64 %r151,%ar3; .loc 1 412 6 setp.eq.u64 %r152,%r151,0; @ %r152 bra $L110; .loc 1 412 20 ld.u32 %r153,[%r151]; setp.eq.u32 %r154,%r153,0; @ %r154 bra $L111; $L110: .loc 1 417 7gfortran_iparity_.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r150]; .loc 1 422 7 add.u64 %r86,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r149+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r291,%r26; .loc 1 425 7 shr.u64 %r159,%r86,63; set.u32.gt.s64 %r161,%r86,%r291; neg.s32 %r162,%r161; cvt.u16.u64 %r165,%r159; cvt.u16.u32 %r166,%r162; or.b16 %r164,%r165,%r166; .loc 1 425 6 cvt.u32.u16 %r167,%r164; cvt.u16.u8 %r168,%r167; setp.ne.u16 %r169,%r168,0; @ %r169 bra $L113; .loc 1 432 3 setp.ne.u64 %r170,%r86,0; @ %r170 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r171,%r86,%r291; @ %r171 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r174,%r24; cvt.s64.s8 %r173,%r174; st.u64 [%stack+8],%r173172172_gfortran_runtime_error114: add.u64 %r135,%r149,48; add.u64 %r137,%frame,120; add.u64 %r138,%r149,56; add.u64 %r176,%r149,24; add.u64 %r178,%r23,%r23; add.u64 %r179,%r178,%r23; shl.b64 %r180,%r179,3; add.u64 %r144,%r176,%r180; .loc 1 437 12 mov.u64 %r297,0; $L119: .loc 1 434 19 ld.u64 %r182,[%r138]; add.u64 %r181,%r182,1; ld.u64 %r183,[%r135]; sub.u64 %r36,%r181,%r183; .loc 1 436 10 setp.le.s64 %r184,%r36,0; @ %r184 bra $L117; .loc 1 434 17 st.u64 [%r137],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r137],%r297; $L118: .loc 1 432 3 add.u64 %r135,%r135,24; add.u64 %r137,%r137,8; add.u64 %r138,%r138,24; setp.ne.u64 %r186,%r135,%r144; @ %r186 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r96,[%r148]; .loc 1 449 6 setp.eq.u64 %r187,%r96,0; @ ! 
%r187 bra $L122; bra $L121; $L115: add.u64 %r189,%r23,%r23; add.u64 %r190,%r189,%r23; shl.b64 %r191,%r190,3; add.u64 %r192,%r191,48; add.u64 %r57,%r149,%r192; add.u64 %r290,%frame,120; shl.b64 %r194,%r23,3; add.u64 %r195,%r194,-8; add.u64 %r67,%r290,%r195; add.u64 %r196,%r149,48; cvt.u32.u32 %r198,%r24; cvt.s64.s8 %r197,%r198; add.u64 %r200,%r197,%r197; add.u64 %r201,%r200,%r197; shl.b64 %r202,%r201,3; add.u64 %r131,%r196,%r202; .loc 1 446 12 mov.u64 %r296,0; $L125: .loc 1 443 2 ld.u64 %r204,[%r57+8]; add.u64 %r203,%r204,1; ld.u64 %r205,[%r57]; sub.u64 %r41,%r203,%r205; .loc 1 445 10 setp.le.s64 %r206,%r41,0; @ %r206 bra $L123; .loc 1 442 17 st.u64 [%r67],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r67],%r296; $L124: .loc 1 440 3 add.u64 %r57,%r57,24; add.u64 %r67,%r67,8; setp.ne.u64 %r208,%r57,%r131; @ %r208 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r209,%r291,0; @ %r209 bra $L127; add.u64 %r290,%frame,120; $L147: add.u64 %r102,%r148,40; mov.u64 %r84,%r290; cvt.u32.u32 %r211,%r24; cvt.s64.s8 %r210,%r211; add.u64 %r213,%r210,%r210; add.u64 %r214,%r213,%r210; shl.b64 %r215,%r214,3; add.u64 %r216,%r148,16; add.u64 %r60,%r215,%r216; .loc 1 456 10 mov.u64 %r82,1; .loc 1 460 4 mov.u64 %r235,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r217,0; st.u64 [%r148+8],%r217; .loc 1 465 28 cvt.u16.u32 %r220,%r24; add.u16 %r219,%r220,-1; cvt.u32.u16 %r221,%r219; st.u8 [%r148+28],%r221; .loc 1 467 20 add.u32 %r222,%r24,-2; cvt.s64.s32 %r50,%r222; add.u64 %r224,%r50,%r50; add.u64 %r225,%r224,%r50; shl.b64 %r226,%r225,3; add.u64 %r227,%r148,%r226; .loc 1 467 67 shl.b64 %r229,%r50,3; add.u64 %r230,%frame,%r229; .loc 1 467 59 ld.u64 %r232,[%r227+40]; ld.u64 %r233,[%r230+120]; mul.lo.u64 %r93,%r232,%r233; .loc 1 469 10 setp.eq.u64 %r234,%r93,0; @ %r234 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r82,%r46,%r82; $L128: .loc 1 460 4 st.u64 [%r102+8],%r235; ld.u64 %r46,[%r84]; add.u64 %r236,%r46,-1; st.u64 [%r102+16],%r236; st.u64 [%r102],%r82; .loc 1 453 7 add.u64 %r102,%r102,24;131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r148+48],%r93; mov.u64 %r239,-1; st.u64 [%r148+56],%r239; mov.u64 %r240,1; st.u64 [%r148+40],%r240; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r242(%value_in),_gfortrani_xmallocarray3,[%value_in]; } mov.u64 %r96,%r243; .loc 1 476 22 st.u64 [%r148],%r96; $L135: .loc 1 502 3 setp.ne.u64 %r245,%r291,0; @ %r245 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r148+28]; .loc 1 480 10 setp.eq.u64 %r246,%r55,%r291; @ %r246 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r291; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r249,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r250,[%r249+36]; setp.eq.u32 %r251,%r250,0; @ %r251 bra $L135; .loc 1 488 4 setp.eq.u64 %r252,%r291,0; @ %r252 bra $L133; add.u64 %r31,%r148,48; add.u64 %r125,%frame,120; cvt.u32.u32 %r254,%r24; cvt.s64.s8 %r253,%r254; add.u64 %r92,%r253,-1; .loc 1 488 10 mov.u64 %r101,0; $L137: .loc 1 492 21 ld.u64 %r256,[%r31+8]; add.u64 %r255,%r256,1; .loc 1 492 19 ld.u64 %r257,[%r31]; sub.u64 %r91,%r255,%r257; .loc 1 493 18 ld.u64 %r64,[%r125]; .loc 1 488 25 add.u64 %r101,%r101,1; .loc 1 493 11 setp.eq.u64 %r258,%r64,%r91; @ %r258 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r64; st.u64 [%stack+8],%r91; st.u64 [%stack],%r101; cvta.const.u64 %r22_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r31,%r31,24; add.u64 %r125,%r125,8; setp.ne.u64 %r261,%r92,%r101; @ %r261 bra $L137; $L132: .loc 1 504 16 cvt.u16.u32 %r263,%r24; setp.le.s16 %r264,%r263,1; @ %r264 bra $L138; shl.b64 
%r262,%r291,3; bra $L139; $L138: mov.u64 %r262,8; $L139: add.u64 %r266,%frame,240; mov.u32 %r27026272,[%value_in]; } add.u64 %r111,%r148,40; mov.u64 %r109,%frame; mov.u64 %r98,0; $L140: .loc 1 505 18 ld.u64 %r274,[%r111]; st.u64 [%r109],%r274; .loc 1 502 26 add.u64 %r98,%r98,1; .loc 1 502 3 add.u64 %r111,%r111,24; add.u64 %r109,%r109,8; setp.gt.s64 %r275,%r291,%r98; @ %r275 bra $L140; $L133: ld.u64 %r103,[%frame+240]; .loc 1 514 22 ld.u64 %r69,[%frame]; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r81,%r59,%r69; setp.le.s64 %r289,%r291,1; .loc 1 512 13 mov.u32 %r276,0; cvt.u32.u32 %r293,%r24; cvt.s64.s8 %r294,%r293; add.u64 %r295,%r294,-1; $L146: st.u8 [%r96],%r276; .loc 1 513 15 add.u64 %r103,%r103,1; .loc 1 514 12 add.u64 %r96,%r96,%r69; .loc 1 516 13 setp.ne.u64 %r277,%r59,%r103; @ %r277 bra $L146; .loc 1 525 7 @ ! %r289 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r97],%r292; .loc 1 523 23 mul.lo.u64 %r71,%r74,%r73; .loc 1 524 5 add.u64 %r95,%r95,1; .loc 1 525 7 add.u64 %r97,%r97,8; add.u64 %r119,%r119,8; add.u64 %r118,%r118,8; setp.ne.u64 %r280,%r95,%r295; @ %r280 bra $L144; bra $L109; $L171: add.u64 %r97,%frame,248; add.u64 %r119,%frame,8; add.u64 %r118,%frame,128; .loc 1 523 23 mov.u64 %r71,%r81; .loc 1 524 5 mov.u64 %r95,1; .loc 1 520 13 mov.u64 %r292,0; $L144: .loc 1 529 16 ld.u64 %r285,[%r97]; add.u64 %r73,%r285,1; st.u64 [%r97],%r73; .loc 1 530 23 ld.u64 %r74,[%r119]; .loc 1 530 13 sub.u64 %r286,%r74,%r71; add.u64 %r96,%r96,%r286; .loc 1 516 32 ld.u64 %r77,[%r118]; .loc 1 516 13 setp.eq.u64 %r287,%r73,%r77; @ %r287 bra $L145; .loc 1 520 13 mov.u64 %r103,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r96,[%r148]; .loc 1 449 6 setp.eq.u64 %r288,%r96,0; @ ! %r288 bra $L122; bra $L147; $L109: .loc 1 534iparity_i2.o/_gfortran_iparity_i2 .visible .func _gfortran_iparity_i2fortran/generated/iparity_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_miparity_i2 .visible .func _gfortran_miparity_i2gfortran_siparity_i2 .visible .func _gfortran_siparity_i2VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,73,80,65,82,73,84,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,80,65,82,73,84,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,108] = 
{73,80,65,82,73,84,89114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,80,65,82,73,84,89,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,73,80,65,82,73,84,89,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_iparity_i2 .visible .func _gfortran_iparity_i248pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64predmov.u64 %r179,%ar0; mov.u64 %r180,%ar1; mov.u64 %r181,%ar2; .loc 1 55 10 ld.s8 %r22,[%r180+28]; .loc 1 55 38 add.u32 %r182,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r182; .loc 1 56 10 ld.u64 %r25,[%r181]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r184,%r104,63; set.u32.lt.s64 %r186,%r103,%r104; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 58 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 60 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r104,%r104; add.u64 %r201,%r200,%r104; shl.b64 %r202,%r201,3; add.u64 %r203,%r180,%r202; ld.u64 %r30,[%r203+56]; ld.u64 %r32,[%r203+48]; .loc 1 68 9 ld.u64 %r107,[%r203+40]; .loc 1 70 3 setp.ne.u64 %r217,%r104,0; @ %r217 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r218,%r103,%r104; @ %r218 bra $L4; bra $L46; $L3: add.u64 %r61,%r180,40; add.u64 %r40,%frame,120; add.u64 %r91,%frame,240; add.u64 %r219,%r180,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r174,%r219,%r223; .loc 1 76 12 mov.u64 %r363,0; $L8: .loc 1 72 18 ld.u64 %r224,[%r61]; st.u64 [%r40],%r224; .loc 1 73 19 ld.u64 %r226,[%r61+16]; add.u64 %r225,%r226,1; ld.u64 %r227,[%r61+8]; sub.u64 %r37,%r225,%r227; .loc 1 75 10 setp.lt.s64 %r228,%r37,0; @ %r228 bra $L6; .loc 1 73 17 st.u64 [%r91],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r91],%r363; $L7: .loc 1 70 3 add.u64 %r61,%r61,24; add.u64 %r40,%r40,8; add.u64 %r91,%r91,8; setp.ne.u64 %r230,%r61,%r174; @ %r230 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r231,[%r179]; setp.eq.u64 %r232,%r231,0; @ ! 
%r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r143,%r180,%r237; shl.b64 %r238,%r25,3; add.u64 %r125,%r238,-8; add.u64 %r351,%frame,120; add.u64 %r128,%r351,%r125; add.u64 %r350,%frame,240; add.u64 %r121,%r350,%r125; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r180,40; add.u64 %r95,%r246,%r247; .loc 1 84 12 mov.u64 %r362,0; $L14: .loc 1 80 18 ld.u64 %r248,[%r143]; st.u64 [%r128],%r248; .loc 1 81 19 ld.u64 %r250,[%r143+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r143+8]; sub.u64 %r44,%r249,%r251; .loc 1 83 10 setp.lt.s64 %r252,%r44,0; @ %r252 bra $L12; .loc 1 81 17 st.u64 [%r121],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r121],%r362; $L13: .loc 1 78 3 add.u64 %r143,%r143,24; add.u64 %r128,%r128,8; add.u64 %r121,%r121,8; setp.ne.u64 %r254,%r95,%r143; @ %r254 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r255,%r103,0; @ %r255 bra $L16; add.u64 %r350,%frame,240; $L34: add.u64 %r152,%r179,40; mov.u64 %r150,%r350; add.u64 %r256,%r179,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r144,%r256,%r262; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r263,0; st.u64 [%r179+8],%r263; .loc 1 103 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r179+28],%r267; .loc 1 105 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r52,%r268; add.u64 %r270,%r52,%r52; add.u64 %r271,%r270,%r52; shl.b64 %r272,%r271,3; add.u64 %r273,%r179,%r272; .loc 1 105 67 shl.b64 %r275,%r52,3; add.u64 %r276,%frame,%r275; .loc 1 105 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r109,%r278,%r279; .loc 1 107 29 mov.u64 %r281,281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 107 27 st.u64 [%r179],%r282; .loc 1 108 10 setp.eq.u64 %r284,%r109,0; @ ! 
%r284 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r48,%r99; $L17: .loc 1 98 4 st.u64 [%r152+8],%r285; ld.u64 %r48,[%r150]; add.u64 %r286,%r48,-1; st.u64 [%r152+16],%r286; st.u64 [%r152],%r99; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r287,%r144,%r152; @ %r287 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r179+48],%r109; mov.u64 %r289,-1; st.u64 [%r179+56],%r289; mov.u64 %r290,1; st.u64 [%r179+40],%r290; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r179+28]; .loc 1 118 10 setp.eq.u64 %r291,%r58,%r103; @ %r291 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r294,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r295,[%r294+36]; setp.eq.u32 %r296,%r295,0; @ %r296 bra $L21; .loc 1 125 2 add.u64 %r350,%frame,240; cvta.const.u64 %r300,$LC2r350299300; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r302,%r103,0; @ %r302 bra $L24; $L27: .loc 1 65 9 add.u64 %r303,%r30,1; .loc 1 65 7 sub.u64 %r105,%r303,%r32; max.s64 %r45,%r105,0; .loc 1 137 8 ld.u64 %r116,[%r180]; .loc 1 138 8 ld.u64 %r117,[%r179]; .loc 1 154 36 add.u64 %r66,%r107,%r107; ld.u64 %r138,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 add.u64 %r69,%r68,%r68; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 add.u64 %r72,%r71,%r71; .loc 1 169 32 ld.u64 %r123,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r307,%r68,%r123; .loc 1 176 9 add.u64 %r308,%r307,%r307; neg.s64 %r127,%r308; .loc 1 177 23 mul.lo.u64 %r309,%r71,%r123; .loc 1 177 9 add.u64 %r310,%r309,%r309; neg.s64 %r130,%r310; setp.gt.s64 %r352,%r105,0; setp.le.s64 %r353,%r103,1; cvt.u32.u32 %r355,%r22; cvt.s64.s8 %r356,%r355; add.u64 %r357,%r356,-1; add.u64 %r358,%frame,120; add.u64 %r359,%frame,240; .loc 1 173 13 mov.u64 %r360,0; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r179,40; mov.u64 %r162,%frame; cvt.u32.u32 %r312,%r22; cvt.s64.s8 %r311,%r312; add.u64 %r153,%r311,-1; .loc 1 129 3 mov.u64 %r118,0; add.u64 %r350,%frame,240; .loc 1 131 16 mov.u64 %r313,%r118; $L26: st.u64 [%r165],%r313; .loc 1 132 18 ld.u64 %r314,[%r164]; st.u64 [%r162],%r314; .loc 1 133 17 shl.b64 %r316,%r118,3; add.u64 %r317,%r350,%r316; .loc 1 133 10 ld.u64 %r318,[%r317]; setp.le.s64 %r319,%r318,0; @ %r319 bra $L1; .loc 1 129 26 add.u64 %r118,%r118,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r320,%r118,%r153; @ %r320 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r352 bra $L35; .loc 1 150 10 mov.u16 %r322,0; st.u16 [%r117],%r322; bra $L29; $L35: mov.u64 %r112,%r116; .loc 1 148 10 mov.u32 %r110,0; .loc 1 154 13 mov.u64 %r111,0; $L28: .loc 1 158 10 ld.u16 %r324,[%r112]; cvt.u16.u32 %r325,%r110; xor.b16 %r323,%r324,%r325; cvt.s32.s16 %r110,%r323; .loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r66; .loc 1 154 6 setp.gt.s64 %r326,%r45,%r111; @ %r326 bra $L28; .loc 1 161 12 cvt.u16.u32 %r327,%r110; st.u16 [%r117],%r327; $L29: .loc 1 165 15 add.u64 %r138,%r138,1; .loc 1 166 12 add.u64 %r116,%r116,%r69; .loc 1 167 12 add.u64 %r117,%r117,%r72; .loc 1 169 13 setp.ne.u64 %r328,%r138,%r123; @ %r328 bra $L25; .loc 1 176 9 add.u64 %r113,%r116,%r127; .loc 1 177 9 add.u64 %r114,%r117,%r130; .loc 1 179 7 @ %r353 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r115,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r122],%r360; .loc 1 176 23 mul.lo.u64 %r334,%r85,%r84; .loc 1 176 9 add.u64 %r335,%r334,%r334; sub.u64 %r113,%r116,%r335; .loc 1 177 23 
[iparity_i2.o, continued: the tail of _gfortran_iparity_i2 (per-dimension counter and stride bookkeeping after its xor.b16 reduction loop), followed by _gfortran_miparity_i2, the MASK-array variant for 16-bit integers. Its PTX checks DIM and the return-array rank (calling _gfortran_runtime_error on failure), verifies the LOGICAL mask kind is 1, 2, 4, 8 or 16 bytes (otherwise reporting "Funny sized logical array"), allocates the result with _gfortrani_xmallocarray when needed, runs the optional bounds checks gated by _gfortrani_compile_options, and reduces each column by loading the mask element with ld.s8 and XORing the 16-bit data element (xor.b16) only when the mask is set. The .reg declarations and the boundary between the two functions are garbled in extraction.]
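For readability, here is a minimal C sketch of the masked reduction that the _gfortran_miparity_i2 loop implements. It is not libgfortran's source: the function name, parameter names, and the byte-sized mask access are illustrative (the PTX reads the mask with ld.s8).

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative sketch, not libgfortran code: XOR-reduce the 16-bit elements
   of one column, skipping elements whose LOGICAL mask entry is false.
   stride/mstride are the element strides of the data and mask columns. */
int16_t
miparity_column_i2 (const int16_t *base, const signed char *mask,
                    size_t len, ptrdiff_t stride, ptrdiff_t mstride)
{
  int16_t result = 0;                  /* 0 is the identity of XOR */
  for (size_t i = 0; i < len; i++)
    if (mask[i * mstride])             /* PTX: ld.s8 + predicate branch */
      result ^= base[i * stride];      /* PTX: xor.b16 */
  return result;
}
```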
[End of _gfortran_miparity_i2 (advancing the result pointer across the remaining dimensions once a column is reduced), followed by _gfortran_siparity_i2, the scalar-MASK entry point: when the MASK pointer is null or the scalar mask is true it branches into the unmasked _gfortran_iparity_ path; when the scalar mask is false it performs the same rank and extent checks (reporting mismatches through _gfortran_runtime_error), allocates the return array with _gfortrani_xmallocarray if necessary, and stores the XOR identity (0) into every result element.]
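A hedged C sketch of that scalar-MASK control flow follows; every name is made up for illustration, and the reduction is inlined rather than calling the separate libgfortran routine the PTX branches to.

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative sketch of _gfortran_siparity_i2's behaviour for one column:
   a null or .true. scalar MASK means a plain unmasked IPARITY reduction,
   a .false. scalar MASK makes every result element the XOR identity, 0. */
int16_t
siparity_column_i2 (const int16_t *base, size_t len, ptrdiff_t stride,
                    const int *mask)
{
  if (mask != NULL && *mask == 0)
    return 0;                          /* scalar .false.: result is zeroed */

  int16_t result = 0;                  /* otherwise: unmasked reduction */
  for (size_t i = 0; i < len; i++)
    result ^= base[i * stride];
  return result;
}
```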
[Return from _gfortran_siparity_i2, then the header of iparity_i4.o (PTX compiled from libgfortran's fortran/generated/iparity_i4.c): global function declarations for _gfortran_iparity_i4, _gfortran_miparity_i4 and _gfortran_siparity_i4; extern declarations for _gfortrani_compile_options (.global .align 8 .u64), _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents; and the module's string constants stored as decimal byte arrays. The recoverable fragments decode to: "... incorrect in IPARITY intrinsic: is %ld, should be between 1 and %ld", "rank of return array incorrect in IPARITY intrinsic: is %ld, should be %ld", "IPARITY", "return value", "Funny sized logical array", "MASK argument", and "Incorrect extent in return value of IPARITY intrinsic in dimension %ld: is %ld, should be %ld". The header is followed by the opening of _gfortran_iparity_i4: argument moves, the DIM/rank check, and a _gfortran_runtime_error call when that check fails.]
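The first decoded message belongs to the DIM check that opens each kernel. A hedged C sketch of it, with a stand-in for _gfortran_runtime_error and the format string abbreviated because its leading words were lost in extraction:

```c
#include <stdio.h>
#include <stdlib.h>

/* Stand-in for _gfortran_runtime_error: format the message and abort. */
static void
runtime_error (const char *msg, long is, long bound)
{
  fprintf (stderr, msg, is, bound);
  fputc ('\n', stderr);
  exit (2);
}

/* Illustrative sketch of the entry check in _gfortran_iparity_i4:
   DIM must lie in 1..rank of the source array. */
static void
check_dim (long dim, long rank)
{
  if (dim < 1 || dim > rank)
    runtime_error ("... incorrect in IPARITY intrinsic: is %ld, "
                   "should be between 1 and %ld", dim, rank);
}
```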
[Body of _gfortran_iparity_i4: the extents and strides of the source and return descriptors are copied into frame-local arrays, the return array is allocated with _gfortrani_xmallocarray when its base address is null, _gfortrani_bounds_ifunction_return runs when _gfortrani_compile_options enables bounds checking, and the reduction XORs 32-bit elements (ld.u32 / xor.b32) along the chosen dimension while the surrounding loops step the source and destination pointers by strides scaled to 4-byte elements (shl.b64 ...,2).]
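As a minimal C sketch (illustrative names, not libgfortran's), the inner loop of _gfortran_iparity_i4 reduces one column like this:

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative sketch of the inner reduction in _gfortran_iparity_i4:
   XOR together `len` 32-bit elements spaced `stride` elements apart.
   The real kernel also walks every non-reduced dimension of the array
   descriptor; this shows a single column only. */
int32_t
iparity_column_i4 (const int32_t *base, size_t len, ptrdiff_t stride)
{
  int32_t result = 0;                  /* identity element of XOR */
  for (size_t i = 0; i < len; i++)
    result ^= base[i * stride];        /* PTX: ld.u32 + xor.b32 */
  return result;
}
```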
[Epilogue of _gfortran_iparity_i4 (resetting the per-dimension counters once a column completes), then _gfortran_miparity_i4, the MASK-array variant for 32-bit integers. It has the same shape as the _i2 version: the DIM/rank check, the mask-kind check (1, 2, 4, 8 or 16 bytes, otherwise "Funny sized logical array"), result allocation via _gfortrani_xmallocarray, optional _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents calls, and a reduction loop that tests each mask byte (ld.s8) and combines the 32-bit element with xor.b32 when the mask is true.]
[End of _gfortran_miparity_i4, then _gfortran_siparity_i4, the scalar-MASK entry for 32-bit integers: a null or .true. scalar mask falls through to the unmasked _gfortran_iparity_ path, while a .false. mask allocates the return array if necessary and fills it with zero; rank and extent mismatches are reported through _gfortran_runtime_error.]
[Return from _gfortran_siparity_i4, then the header of iparity_i8.o (PTX compiled from libgfortran's fortran/generated/iparity_i8.c): global declarations for _gfortran_iparity_i8, _gfortran_miparity_i8 and _gfortran_siparity_i8, the same extern declarations (_gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents) and the same IPARITY error strings as the _i4 module, followed by the start of _gfortran_iparity_i8 with its DIM/rank check.]
[Body of _gfortran_iparity_i8: structurally identical to the _i4 kernel but operating on 64-bit elements, so the reduction uses ld.u64 / xor.b64 and the pointer arithmetic scales strides by 8 bytes (shl.b64 ...,3). The surrounding loops keep per-dimension counters, source strides and destination strides in frame-local arrays and carry from one dimension to the next when a counter reaches its extent.]
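That carry logic is what most of the PTX bookkeeping implements. A hedged C sketch of the outer walk, assuming at least one non-reduced dimension; count/extent/sstride/dstride mirror the frame-local arrays, and none of the names are copied from libgfortran:

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative sketch of the outer descriptor walk around the 64-bit
   reduction: reduce one column, store the result, then advance the
   source/destination pointers, carrying across dimensions whenever a
   counter reaches its extent.  `rank` counts non-reduced dimensions. */
void
iparity_walk_i8 (int64_t *dest, const int64_t *src, int rank,
                 const size_t extent[], const ptrdiff_t sstride[],
                 const ptrdiff_t dstride[], size_t len, ptrdiff_t delta)
{
  size_t count[16] = { 0 };            /* generously sized counter array */

  for (;;)
    {
      int64_t result = 0;              /* inner XOR reduction of one column */
      for (size_t i = 0; i < len; i++)
        result ^= src[i * delta];
      *dest = result;

      /* Step to the next column; carry into higher dimensions as needed. */
      int n = 0;
      count[0]++;
      src += sstride[0];
      dest += dstride[0];
      while (count[n] == extent[n])
        {
          count[n] = 0;
          src -= sstride[n] * (ptrdiff_t) extent[n];
          dest -= dstride[n] * (ptrdiff_t) extent[n];
          if (++n >= rank)
            return;                    /* every column has been reduced */
          count[n]++;
          src += sstride[n];
          dest += dstride[n];
        }
    }
}
```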
[End of _gfortran_iparity_i8, then _gfortran_miparity_i8, the MASK-array variant for 64-bit integers: the same mask-kind check (1, 2, 4, 8 or 16 bytes, otherwise "Funny sized logical array"), result allocation and optional bounds checking as the _i4 variant, with a reduction loop that tests each mask byte and combines elements with xor.b64.]
%r432 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r440; .loc 1 370 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 371 24 mul.lo.u64 %r406,%r116,%r113; .loc 1 371 10 sub.u64 %r152,%r156,%r406; .loc 1 372 23 mul.lo.u64 %r407,%r118,%r113; .loc 1 372 9 shl.b64 %r408,%r407,3; sub.u64 %r153,%r157,%r408; .loc 1 373 5 add.u64 %r154,%r154,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r30,%r30,8; setp.ne.u64 %r409,%r154,%r436; @ %r409 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; mov.u64 %r30,8; .loc 1 373 5 mov.u64 %r154,1; $L84: .loc 1 382 16 ld.u64 %r413,[%r41]; add.u64 %r113,%r413,1; st.u64 [%r41],%r113; .loc 1 383 23 add.u64 %r415,%r437,%r30; ld.u64 %r114,[%r415]; .loc 1 383 13 sub.u64 %r416,%r114,%r102; shl.b64 %r417,%r416,3; add.u64 %r155,%r155,%r417; .loc 1 384 24 add.u64 %r418,%frame,%r30; ld.u64 %r116,[%r418]; .loc 1 384 14 add.u64 %r156,%r152,%r116; .loc 1 385 23 add.u64 %r420,%r439,%r30; ld.u64 %r118,[%r420]; .loc 1 385 13 shl.b64 %r421,%r118,3; add.u64 %r157,%r153,%r421; .loc 1 363 32 add.u64 %r423,%r438,%r30; ld.u64 %r121,[%r423]; .loc 1 363 13 setp.eq.u64 %r424,%r113,%r121; @ %r424 bra $L85; .loc 1 367 13 mov.u64 %r29,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r425,[%r210]; setp.eq.u64 %r426,%r425,0; @ ! %r426 bra $L63; bra $L86; $L48: .loc 1 389_gfortran_siparity_i8 .visible .func _gfortran_siparity_8u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_iparity_.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r294,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r294; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r294; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r300,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r300; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r293,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r293,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r299,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r299; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r294,0; @ %r211 bra $L127; add.u64 %r293,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r293; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r24496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r294,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r294; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r294; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r294,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: 
.loc 1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r294,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r294,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,3; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r292,%r294,1; .loc 1 512 13 mov.u64 %r278,0; cvt.u32.u32 %r296,%r24; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; $L146: st.u64 [%r98],%r278; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r279,%r59,%r80; @ %r279 bra $L146; .loc 1 525 7 @ ! %r292 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r278; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r282,%r97,%r298; @ %r282 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; $L144: .loc 1 529 16 ld.u64 %r287,[%r99]; add.u64 %r74,%r287,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r288,%r75,%r72; shl.b64 %r289,%r288,3; add.u64 %r98,%r98,%r289; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r290,%r74,%r77; @ %r290 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r291,%r98,0; @ ! 
%r291 bra $L122; bra $L147; $L109: .loc 1 534iparity_i16.o/_gfortran_iparity_i16 .visible .func _gfortran_iparity_i16fortran/generated/iparity_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_miparity_i16 .visible .func _gfortran_miparity_i16gfortran_siparity_i16 .visible .func _gfortran_siparity_i16VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,73,80,65,82,73,84,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,80,65,82,73,84,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,108] = {73,80,65,82,73,84,89114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,80,65,82,73,84,89,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,73,80,65,82,73,84,89,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_iparity_i16 .visible .func _gfortran_iparity_i1648pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64u6464predu64 %r370; mov.u64 %r179,%ar0; mov.u64 %r180,%ar1; mov.u64 %r181,%ar2; .loc 1 55 10 ld.s8 %r22,[%r180+28]; .loc 1 55 38 add.u32 %r182,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r182; .loc 1 56 10 ld.u64 %r25,[%r181]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r184,%r104,63; set.u32.lt.s64 %r186,%r103,%r104; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 58 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 60 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r104,%r104; add.u64 %r201,%r200,%r104; shl.b64 %r202,%r201,3; add.u64 %r203,%r180,%r202; ld.u64 %r30,[%r203+56]; ld.u64 %r32,[%r203+48]; .loc 1 68 9 ld.u64 %r107,[%r203+40]; .loc 1 70 3 setp.ne.u64 %r217,%r104,0; @ %r217 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r218,%r103,%r104; @ %r218 bra $L4; bra $L46; $L3: add.u64 %r61,%r180,40; add.u64 %r40,%frame,120; add.u64 %r91,%frame,240; add.u64 %r219,%r180,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r174,%r219,%r223; .loc 1 76 12 mov.u64 %r370,0; $L8: .loc 1 72 18 ld.u64 %r224,[%r61]; st.u64 [%r40],%r224; .loc 1 73 19 ld.u64 %r226,[%r61+16]; add.u64 %r225,%r226,1; ld.u64 %r227,[%r61+8]; sub.u64 %r37,%r225,%r227; .loc 1 75 10 setp.lt.s64 %r228,%r37,0; @ %r228 bra $L6; .loc 1 73 17 st.u64 [%r91],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r91],%r370; $L7: .loc 1 70 3 
add.u64 %r61,%r61,24; add.u64 %r40,%r40,8; add.u64 %r91,%r91,8; setp.ne.u64 %r230,%r61,%r174; @ %r230 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r231,[%r179]; setp.eq.u64 %r232,%r231,0; @ ! %r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r143,%r180,%r237; shl.b64 %r238,%r25,3; add.u64 %r125,%r238,-8; add.u64 %r358,%frame,120; add.u64 %r128,%r358,%r125; add.u64 %r357,%frame,240; add.u64 %r121,%r357,%r125; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r180,40; add.u64 %r95,%r246,%r247; .loc 1 84 12 mov.u64 %r369,0; $L14: .loc 1 80 18 ld.u64 %r248,[%r143]; st.u64 [%r128],%r248; .loc 1 81 19 ld.u64 %r250,[%r143+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r143+8]; sub.u64 %r44,%r249,%r251; .loc 1 83 10 setp.lt.s64 %r252,%r44,0; @ %r252 bra $L12; .loc 1 81 17 st.u64 [%r121],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r121],%r369; $L13: .loc 1 78 3 add.u64 %r143,%r143,24; add.u64 %r128,%r128,8; add.u64 %r121,%r121,8; setp.ne.u64 %r254,%r95,%r143; @ %r254 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r255,%r103,0; @ %r255 bra $L16; add.u64 %r357,%frame,240; $L34: add.u64 %r152,%r179,40; mov.u64 %r150,%r357; add.u64 %r256,%r179,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r144,%r256,%r262; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r263,0; st.u64 [%r179+8],%r263; .loc 1 103 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r179+28],%r267; .loc 1 105 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r52,%r268; add.u64 %r270,%r52,%r52; add.u64 %r271,%r270,%r52; shl.b64 %r272,%r271,3; add.u64 %r273,%r179,%r272; .loc 1 105 67 shl.b64 %r275,%r52,3; add.u64 %r276,%frame,%r275; .loc 1 105 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r109,%r278,%r279; .loc 1 107 29 mov.u64 %r281,16281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 107 27 st.u64 [%r179],%r282; .loc 1 108 10 setp.eq.u64 %r284,%r109,0; @ ! 
%r284 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r48,%r99; $L17: .loc 1 98 4 st.u64 [%r152+8],%r285; ld.u64 %r48,[%r150]; add.u64 %r286,%r48,-1; st.u64 [%r152+16],%r286; st.u64 [%r152],%r99; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r287,%r144,%r152; @ %r287 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r179+48],%r109; mov.u64 %r289,-1; st.u64 [%r179+56],%r289; mov.u64 %r290,1; st.u64 [%r179+40],%r290; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r179+28]; .loc 1 118 10 setp.eq.u64 %r291,%r58,%r103; @ %r291 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r294,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r295,[%r294+36]; setp.eq.u32 %r296,%r295,0; @ %r296 bra $L21; .loc 1 125 2 add.u64 %r357,%frame,240; cvta.const.u64 %r300,$LC2r357299300; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r302,%r103,0; @ %r302 bra $L24; $L27: .loc 1 65 9 add.u64 %r303,%r30,1; .loc 1 65 7 sub.u64 %r105,%r303,%r32; max.s64 %r45,%r105,0; .loc 1 137 8 ld.u64 %r116,[%r180]; .loc 1 138 8 ld.u64 %r117,[%r179]; .loc 1 154 36 shl.b64 %r66,%r107,4; ld.u64 %r138,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 shl.b64 %r69,%r68,4; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 shl.b64 %r72,%r71,4; .loc 1 169 32 ld.u64 %r123,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r304,%r68,%r123; .loc 1 176 9 shl.b64 %r305,%r304,4; neg.s64 %r127,%r305; .loc 1 177 23 mul.lo.u64 %r306,%r71,%r123; .loc 1 177 9 shl.b64 %r307,%r306,4; neg.s64 %r130,%r307; setp.gt.s64 %r359,%r105,0; setp.le.s64 %r360,%r103,1; .loc 1 148 10 mov.u64 %r362,0; cvt.u32.u32 %r364,%r22; cvt.s64.s8 %r365,%r364; add.u64 %r366,%r365,-1; add.u64 %r367,%frame,120; add.u64 %r368,%frame,240; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r179,40; mov.u64 %r162,%frame; cvt.u32.u32 %r309,%r22; cvt.s64.s8 %r308,%r309; add.u64 %r153,%r308,-1; .loc 1 129 3 mov.u64 %r118,0; add.u64 %r357,%frame,240; .loc 1 131 16 mov.u64 %r310,%r118; $L26: st.u64 [%r165],%r310; .loc 1 132 18 ld.u64 %r311,[%r164]; st.u64 [%r162],%r311; .loc 1 133 17 shl.b64 %r313,%r118,3; add.u64 %r314,%r357,%r313; .loc 1 133 10 ld.u64 %r315,[%r314]; setp.le.s64 %r316,%r315,0; @ %r316 bra $L1; .loc 1 129 26 add.u64 %r118,%r118,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r317,%r118,%r153; @ %r317 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r359 bra $L35; .loc 1 150 10 st.u64 [%r117],%r362; st.u64 [%r117+8],%r362; bra $L29; $L35: mov.u64 %r112,%r116; .loc 1 148 10 mov.u64 %r353,%r362; mov.u64 %r354,%r362; .loc 1 154 13 mov.u64 %r111,0; $L28: .loc 1 158 10 ld.u64 %r323,[%r112]; ld.u64 %r326,[%r112+8]; xor.b64 %r353,%r323,%r353; xor.b64 %r354,%r326,%r354; .loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r66; .loc 1 154 6 setp.gt.s64 %r328,%r45,%r111; @ %r328 bra $L28; .loc 1 161 12 st.u64 [%r117],%r353; st.u64 [%r117+8],%r354; $L29: .loc 1 165 15 add.u64 %r138,%r138,1; .loc 1 166 12 add.u64 %r116,%r116,%r69; .loc 1 167 12 add.u64 %r117,%r117,%r72; .loc 1 169 13 setp.ne.u64 %r331,%r138,%r123; @ %r331 bra $L25; .loc 1 176 9 add.u64 %r113,%r116,%r127; .loc 1 177 9 add.u64 %r114,%r117,%r130; .loc 1 179 7 @ %r360 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r115,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r122],%r362; .loc 1 176 23 mul.lo.u64 %r337,%r85,%r84; .loc 1 176 9 shl.b64 %r338,%r337,4; sub.u64 %r113,%r116,%r338; .loc 1 
177 23 mul.lo.u64 %r339,%r88,%r84; .loc 1 177 9 shl.b64 %r340,%r339,4; sub.u64 %r114,%r117,%r340; .loc 1 178 5 add.u64 %r115,%r115,1; .loc 1 179 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r341,%r115,%r366; @ %r341 bra $L1; $L32: .loc 1 187 16 ld.u64 %r342,[%r122]; add.u64 %r84,%r342,1; st.u64 [%r122],%r84; .loc 1 188 23 add.u64 %r344,%r367,%r140; ld.u64 %r85,[%r344]; .loc 1 188 13 shl.b64 %r345,%r85,4; add.u64 %r116,%r113,%r345; .loc 1 189 23 add.u64 %r346,%frame,%r140; ld.u64 %r88,[%r346]; .loc 1 189 13 shl.b64 %r347,%r88,4; add.u64 %r117,%r114,%r347; .loc 1 169 32 add.u64 %r349,%r368,%r140; ld.u64 %r92,[%r349]; .loc 1 169 13 setp.eq.u64 %r350,%r84,%r92; @ %r350 bra $L33; .loc 1 173 13 mov.u64 %r138,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r351,[%r179]; setp.eq.u64 %r352,%r351,0; @ ! %r352 bra $L11; bra $L34; $L1_gfortran_miparity_i16 .visible .func _gfortran_miparity_i16608u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .predpred %r399; .reg .u64 %r402; .reg .u64 %r405; .reg .pred %r407; .reg .predu64 %r435; .reg .u64 %r438; .reg .u64 %r439; .reg .u64 %r440; .reg .u64 %r442; .reg .predmov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 223 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $L49gfortran_iparity_i16.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r211]; .loc 1 233 7 add.u64 %r137,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r210+28]; .loc 1 234 38 add.u32 %r217,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r138,%r217; .loc 1 237 7 shr.u64 %r219,%r137,63; set.u32.gt.s64 %r221,%r137,%r138; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 237 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L51; .loc 1 239 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r440,%r137,%r137; add.u64 %r442,%r440,%r137; shl.b64 %r237,%r442,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 244 7 ld.u64 %r248,[%r238+48]; sub.u64 %r139,%r240,%r248; .loc 1 245 6 setp.le.s64 %r249,%r139,0; @ %r249 bra $L48; .loc 1 248 9 ld.u64 %r152,[%r212]; .loc 1 250 15 ld.u64 %r34,[%r212+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 252 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 252 6 cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L53; .loc 1 254 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L54; $L53: .loc 1 261 9 shl.b64 %r270,%r442,3; add.u64 %r271,%r210,%r270; ld.u64 %r142,[%r271+40]; .loc 1 262 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 264 3 setp.ne.u64 %r279,%r137,0; @ %r279 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r280,%r137,%r138; @ %r280 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r281281_gfortran_runtime_error5: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; add.u64 %r285,%r22,%r22; add.u64 
%r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 264 3 mov.u64 %r92,0; add.u64 %r439,%frame,240; add.u64 %r438,%frame,360; .loc 1 271 12 mov.u64 %r453,%r92; $L60: .loc 1 266 18 add.u64 %r289,%r439,%r92; ld.u64 %r290,[%r40]; st.u64 [%r289],%r290; .loc 1 267 18 add.u64 %r291,%frame,%r92; .loc 1 267 20 ld.u64 %r293,[%r179]; mul.lo.u64 %r292,%r293,%r34; .loc 1 267 18 st.u64 [%r291],%r292; .loc 1 268 19 ld.u64 %r295,[%r40+16]; add.u64 %r294,%r295,1; ld.u64 %r296,[%r40+8]; sub.u64 %r54,%r294,%r296; .loc 1 270 10 setp.lt.s64 %r297,%r54,0; @ %r297 bra $L58; .loc 1 268 17 add.u64 %r299,%r438,%r92; st.u64 [%r299],%r54; bra $L59; $L58: .loc 1 271 12 add.u64 %r301,%r438,%r92; st.u64 [%r301],%r453; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r179,%r179,24; add.u64 %r92,%r92,8; setp.ne.u64 %r303,%r40,%r206; @ %r303 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r304,[%r209]; setp.eq.u64 %r305,%r304,0; @ ! %r305 bra $L63; bra $L62; $L56: add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r127,%r309,40; add.u64 %r133,%r210,%r127; add.u64 %r125,%r212,%r127; shl.b64 %r310,%r22,3; add.u64 %r103,%r310,-8; cvt.u32.u32 %r312,%r23; cvt.s64.s8 %r311,%r312; shl.b64 %r313,%r311,3; add.u64 %r69,%r313,-8; add.u64 %r439,%frame,240; add.u64 %r438,%frame,360; .loc 1 281 12 mov.u64 %r452,0; $L66: .loc 1 276 18 add.u64 %r315,%r439,%r103; ld.u64 %r316,[%r133]; st.u64 [%r315],%r316; .loc 1 277 18 add.u64 %r317,%frame,%r103; .loc 1 277 20 ld.u64 %r319,[%r125]; mul.lo.u64 %r318,%r319,%r34; .loc 1 277 18 st.u64 [%r317],%r318; .loc 1 278 19 ld.u64 %r321,[%r133+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r133+8]; sub.u64 %r66,%r320,%r322; .loc 1 280 10 setp.lt.s64 %r323,%r66,0; @ %r323 bra $L64; .loc 1 278 17 add.u64 %r325,%r438,%r103; st.u64 [%r325],%r66; bra $L65; $L64: .loc 1 281 12 add.u64 %r327,%r438,%r103; st.u64 [%r327],%r452; $L65: .loc 1 274 3 add.u64 %r133,%r133,24; add.u64 %r125,%r125,24; add.u64 %r103,%r103,8; setp.ne.u64 %r329,%r69,%r103; @ %r329 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r330,%r138,0; @ %r330 bra $L68; add.u64 %r438,%frame,360; $L86: add.u64 %r172,%r209,40; mov.u64 %r164,%r438; cvt.u32.u32 %r332,%r23; cvt.s64.s8 %r331,%r332; add.u64 %r334,%r331,%r331; add.u64 %r335,%r334,%r331; shl.b64 %r336,%r335,3; add.u64 %r337,%r209,16; add.u64 %r134,%r336,%r337; .loc 1 291 10 mov.u64 %r129,1; .loc 1 295 4 mov.u64 %r356,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r338,%r23,-2; cvt.s64.s32 %r75,%r338; add.u64 %r340,%r75,%r75; add.u64 %r341,%r340,%r75; shl.b64 %r342,%r341,3; add.u64 %r343,%r209,%r342; .loc 1 299 67 shl.b64 %r345,%r75,3; add.u64 %r346,%frame,%r345; .loc 1 299 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r146,%r348,%r349; .loc 1 301 24 mov.u64 %r350,0; st.u64 [%r209+8],%r350; .loc 1 302 28 cvt.u16.u32 %r353,%r23; add.u16 %r352,%r353,-1; cvt.u32.u16 %r354,%r352; st.u8 [%r209+28],%r354; .loc 1 304 10 setp.eq.u64 %r355,%r146,0; @ %r355 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r129,%r72,%r129; $L69: .loc 1 295 4 st.u64 [%r172+8],%r356; ld.u64 %r72,[%r164]; add.u64 %r357,%r72,-1; st.u64 [%r172+16],%r357; st.u64 [%r172],%r129; .loc 1 288 7 add.u64 %r172,%r172,24; add.u64 %r164,%r164,8; setp.ne.u64 %r358,%r134,%r172; @ %r358 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r209+48],%r146; mov.u64 %r360,-1; st.u64 [%r209+56],%r360; mov.u64 %r361,1; st.u64 [%r209+40],%r361; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r363,16363; call (%value_in),_gfortrani_xmallocarray64,[%value_in]; } .loc 1 311 
22 st.u64 [%r209],%r364; bra $L73; $L63: .loc 1 316 19 ld.s8 %r366,[%r209+28]; .loc 1 316 10 setp.eq.u64 %r367,%r366,%r138; @ %r367 bra $L74; .loc 1 317 2 cvta.const.u64 %r36836_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r370,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L73; .loc 1 321 4 add.u64 %r438,%frame,360; cvta.const.u64 %r376,$LC2r437376; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r382,%r138,0; @ %r382 bra $L75; $L78: .loc 1 336 8 ld.u64 %r153,[%r209]; .loc 1 337 8 ld.u64 %r151,[%r210]; .loc 1 339 9 setp.ne.u64 %r383,%r151,0; @ %r383 bra $L76; bra $L48; $L75: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r438,%frame,360; .loc 1 330 16 mov.u64 %r386,%r157; $L77: st.u64 [%r190],%r386; .loc 1 331 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 332 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r438,%r389; .loc 1 332 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 shl.b64 %r91,%r142,4; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r94,[%frame+240]; .loc 1 359 12 shl.b64 %r95,%r94,4; .loc 1 360 23 ld.u64 %r97,[%frame]; .loc 1 361 22 ld.u64 %r99,[%frame+120]; .loc 1 361 12 shl.b64 %r100,%r99,4; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r94; .loc 1 371 24 mul.lo.u64 %r394,%r85,%r97; .loc 1 371 10 neg.s64 %r163,%r394; .loc 1 372 23 mul.lo.u64 %r395,%r85,%r99; .loc 1 372 9 shl.b64 %r396,%r395,4; neg.s64 %r165,%r396; setp.le.s64 %r443,%r138,1; .loc 1 348 10 mov.u64 %r445,0; cvt.u32.u32 %r446,%r23; cvt.s64.s8 %r447,%r446; add.u64 %r448,%r447,-1; add.u64 %r449,%frame,240; add.u64 %r450,%frame,360; add.u64 %r451,%frame,120; $L87: .loc 1 328 3 mov.u64 %r156,%r152; mov.u64 %r155,%r151; .loc 1 348 10 mov.u64 %r434,%r445; mov.u64 %r435,0; .loc 1 349 9 mov.u64 %r154,%r435; $L80: .loc 1 352 6 ld.s8 %r398,[%r156]; cvt.u16.u32 %r397,%r398; setp.eq.u16 %r399,%r397,0; @ %r399 bra $L79; .loc 1 353 12 ld.u64 %r402,[%r155]; ld.u64 %r405,[%r155+8]; xor.b64 %r434,%r402,%r434; xor.b64 %r435,%r405,%r435; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r91; .loc 1 349 47 add.u64 %r156,%r156,%r45; .loc 1 349 2 setp.ne.u64 %r407,%r139,%r154; @ %r407 bra $L80; .loc 1 355 8 st.u64 [%r153],%r434; st.u64 [%r153+8],%r435; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r151,%r151,%r95; .loc 1 360 13 add.u64 %r152,%r152,%r97; .loc 1 361 12 add.u64 %r153,%r153,%r100; .loc 1 363 13 setp.ne.u64 %r410,%r85,%r39; @ %r410 bra $L87; .loc 1 371 10 add.u64 %r148,%r152,%r163; .loc 1 372 9 add.u64 %r149,%r153,%r165; .loc 1 374 7 @ ! 
%r443 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r445; .loc 1 370 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 371 24 mul.lo.u64 %r413,%r116,%r113; .loc 1 371 10 sub.u64 %r148,%r152,%r413; .loc 1 372 23 mul.lo.u64 %r414,%r118,%r113; .loc 1 372 9 shl.b64 %r415,%r414,4; sub.u64 %r149,%r153,%r415; .loc 1 373 5 add.u64 %r150,%r150,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r166,%r166,8; setp.ne.u64 %r416,%r150,%r448; @ %r416 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 370 23 mov.u64 %r102,%r38; .loc 1 374 7 mov.u64 %r166,8; .loc 1 373 5 mov.u64 %r150,1; $L84: .loc 1 382 16 ld.u64 %r420,[%r41]; add.u64 %r113,%r420,1; st.u64 [%r41],%r113; .loc 1 383 23 add.u64 %r422,%r449,%r166; ld.u64 %r114,[%r422]; .loc 1 383 13 sub.u64 %r423,%r114,%r102; shl.b64 %r424,%r423,4; add.u64 %r151,%r151,%r424; .loc 1 384 24 add.u64 %r425,%frame,%r166; ld.u64 %r116,[%r425]; .loc 1 384 14 add.u64 %r152,%r148,%r116; .loc 1 385 23 add.u64 %r427,%r451,%r166; ld.u64 %r118,[%r427]; .loc 1 385 13 shl.b64 %r428,%r118,4; add.u64 %r153,%r149,%r428; .loc 1 363 32 add.u64 %r430,%r450,%r166; ld.u64 %r122,[%r430]; .loc 1 363 13 setp.eq.u64 %r431,%r113,%r122; @ %r431 bra $L85; .loc 1 367 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r432,[%r209]; setp.eq.u64 %r433,%r432,0; @ ! %r433 bra $L63; bra $L86; $L48: .loc 1 389_gfortran_siparity_i16 .visible .func _gfortran_siparity_i168u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64predpred %r292; .reg .predu32u64 %r301; mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_iparity_i16,(%out_arg1,%out_arg2,%out_arg3); } .loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r295,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r295; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r295; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r301,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r301; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r294,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r294,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r300,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r300; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r295,0; @ %r211 bra $L127; add.u64 %r294,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r294; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r244,1696244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r295,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r295; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r295; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r295,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; 
$L132: .loc 1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r295,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r295,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,4; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r293,%r295,1; .loc 1 512 13 mov.u64 %r278,0; cvt.u32.u32 %r297,%r24; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; $L146: st.u64 [%r98],%r278; st.u64 [%r98+8],%r278; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L146; .loc 1 525 7 @ ! %r293 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r278; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r299; @ %r283 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; $L144: .loc 1 529 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r289,%r75,%r72; shl.b64 %r290,%r289,4; add.u64 %r98,%r98,%r290; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r292,%r98,0; @ ! 
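[Editorial note: for readers skimming the dump, the following is a minimal C sketch of the per-slice reduction the iparity_* bodies above appear to implement. The function name, the flat strided interface and the mask handling are illustrative assumptions for exposition only, not libgfortran's internal API.]

#include <stdint.h>
#include <stddef.h>

/* Bitwise-XOR reduction of one slice along the reduced dimension,
 * optionally gated by a logical mask.  The strided addressing mirrors
 * the pointer bumps visible in the PTX loops; the xor corresponds to
 * the xor.b64 instructions in the accumulation blocks. */
static int64_t xor_reduce_slice(const int64_t *src, ptrdiff_t stride,
                                const int8_t *mask, ptrdiff_t mstride,
                                ptrdiff_t n)
{
    int64_t acc = 0;                     /* IPARITY identity element */
    for (ptrdiff_t i = 0; i < n; i++) {
        if (mask == NULL || mask[i * mstride] != 0)
            acc ^= src[i * stride];
    }
    return acc;
}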
[Extraction-damaged PTX listing, summarized. This span closed _gfortran_siparity_i16 and then carried the norm2_r4.o and norm2_r8.o members: the compiled nvptx PTX for _gfortran_norm2_r4 and _gfortran_norm2_r8, built from fortran/generated/norm2_r4.c and norm2_r8.c. In addition to the runtime symbols listed above, these members declare sqrtf and sqrt as external functions; their partially recoverable string constants are the matching NORM2 diagnostics ("... incorrect in NORM intrinsic: is %ld, should be between 1 and %ld", "rank of return array incorrect in NORM intrinsic: is %ld, should be %ld"). The inner loops keep a running scale and a scaled sum of squares (the div.rn/fma.rn sequence around labels $L56/$L31) and finish each slice with scale * sqrt(ssq), using the inline sqrt.rn instruction where the operand is known non-negative and otherwise calling the sqrtf/sqrt library routine. As above, register declarations, call sequences and archive headers were corrupted in extraction and are not reproduced here.]
The function closes at $L1; the archive member name norm2_r10.o follows with no accompanying function text, then parity_l1.o.

parity_l1.o: .visible .func _gfortran_parity_l1 (.file 1 "...fortran/generated/parity_l1.c").
External declarations: .global .align 8 .u64 _gfortrani_compile_options; .func _gfortran_runtime_error; .func _gfortrani_xmallocarray; .func _gfortrani_bounds_ifunction_return.
String constants (byte initializers decode to): "... incorrect in PARITY intrinsic: is %ld, should be between 1 and %l...", "...ank of return array incorrect in PARITY intrinsic: is %ld, should be %l...", "PARITY", "return val...".
PTX body: .reg declarations, dim/rank validation reported through _gfortran_runtime_error, extent and stride setup from the descriptors, return-array allocation through _gfortrani_xmallocarray, the optional bounds check through _gfortrani_bounds_ifunction_return, and the PARITY reduction loop over the selected dimension (ld.s8/st.u8 element accesses with unit element size, accumulated with a not-equal test), followed by the outer-dimension pointer and counter bookkeeping.
The parity_l1 body closes at $L1, followed by the member parity_l2.o.

parity_l2.o: .visible .func _gfortran_parity_l2 (.file 1 "...fortran/generated/parity_l..."), with the same external declarations (_gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return) and the same decoded PARITY error strings as parity_l1.
PTX body: .reg declarations and the same structure as _gfortran_parity_l1, with strides scaled by 2 (add.u64 of the stride with itself) and ld.u16/st.u16 element accesses for the two-byte logical kind.
The parity_l2 body closes at $L1, followed by the member parity_l4.o.

parity_l4.o: .visible .func _gfortran_parity_l4 (.file 1 "...fortran/generated/parity_l4.c"), with the same external declarations and decoded PARITY error strings as above.
PTX body: .reg declarations and the same structure, with strides scaled by shl.b64 ...,2 and ld.u32/st.u32 element accesses for the four-byte logical kind.
The parity_l4 body closes at $L1, followed by the member parity_l8.o.

parity_l8.o: .visible .func _gfortran_parity_l8 (.file 1 "...fortran/generated/parity_l8.c"), with the same external declarations and decoded PARITY error strings as above.
PTX body: .reg declarations and the same structure, with strides scaled by shl.b64 ...,3 and ld.u64/st.u64 element accesses for the eight-byte logical kind.
The parity_l8 body closes at $L1, followed by the member parity_l16.o.

parity_l16.o: .visible .func _gfortran_parity_l16 (.file 1 "...fortran/generated/parity_l16.c"), with the same external declarations and decoded PARITY error strings as above.
PTX body: .reg declarations and the same structure, with strides scaled by shl.b64 ...,4 and paired ld.u64/st.u64 accesses combined through xor.b64/or.b64 for the sixteen-byte logical kind.
The parity_l16 body closes at $L1, followed by the member matmul_i1.o.

matmul_i1.o: .visible .func _gfortran_matmul_i1 (parameter list includes .param .u32 %in_ar4 and .param .u64 %in_ar5; .file 1 "../../../../libgfortran/generated/matmul_i1.c").
External declarations: .global .align 8 .u64 _gfortrani_compile_options[6]; .func _gfortran_size0; .func _gfortrani_xmallocarray; .func _gfortran_runtime_error.
Constants: .const .align 1 .u8 __func__$0[10] = "matmul_i1"; single-character string constants decoding to "N", "C", "T"; further byte initializers decode to "GFC_DESCRIPTOR_RANK (a) == 2 || GFC_DESCRIPTOR_RANK (b) == ...", "...tran/generated/matmul_i1....", "Array bound mismatch for dimension 1 of array (%ld/%ld)", "Array bound mismatch for dimension 2 of array (%ld/%ld)", "Incorrect extent in argument B in MATMUL intrinsic in dimension 1: is %ld, should be %ld", and "gemm != NUL...".
Function body: a long run of .reg declarations (.u64, .u32, .u16, .f32, and .pred registers ranging up to %r1732).
.reg .pred %r1733; .reg .pred %r1734; .reg .u64 %r1739; .reg .pred %r1740; .reg .pred %r1741; .reg .pred %r1742; .reg .u64 %r1747; .reg .pred %r1748; .reg .pred %r1749; .reg .pred %r1750; .reg .u64 %r1753; .reg .u64 %r1754; .reg .u64 %r1756; .reg .u64 %r1757; .reg .u32 %r1760; .reg .u64 %r1761; .reg .u64 %r1762; .reg .u32 %r1763; .reg .u32 %r1764; .reg .u64 %r1765; .reg .u64 %r1766; .reg .u32 %r1767; .reg .u64 %r1768; .reg .u64 %r1769; .reg .u32 %r1771; .reg .u64 %r1772; .reg .u64 %r1773; .reg .u32 %r1774; .reg .u32 %r1775; .reg .u64 %r1776; .reg .u64 %r1777; .reg .u32 %r1778; .reg .u64 %r1779; .reg .u64 %r1780; .reg .u32 %r1782; .reg .u64 %r1783; .reg .u64 %r1784; .reg .u32 %r1785; .reg .u32 %r1786; .reg .u64 %r1787; .reg .u64 %r1788; .reg .u32 %r1789; .reg .pred %r1790; .reg .u64 %r1791; .reg .u32 %r1794; .reg .u32 %r1798; .reg .u32 %r1802; .reg .pred %r1803; .reg .u64 %r1804; .reg .u32 %r1805; .reg .u32 %r1806; .reg .u32 %r1807; .reg .u32 %r1808; .reg .u16 %r1813; .reg .u16 %r1814; .reg .u16 %r1816; .reg .u16 %r1817; .reg .u32 %r1818; .reg .u64 %r1819; .reg .u64 %r1820; .reg .u16 %r1822; .reg .u16 %r1824; .reg .u16 %r1825; .reg .u32 %r1826; .reg .u64 %r1827; .reg .u64 %r1828; .reg .u16 %r1830; .reg .u16 %r1832; .reg .u16 %r1833; .reg .u32 %r1834; .reg .u64 %r1835; .reg .u64 %r1836; .reg .u16 %r1838; .reg .u16 %r1840; .reg .u16 %r1841; .reg .u32 %r1842; .reg .pred %r1843; .reg .pred %r1844; .reg .u32 %r1845; .reg .u32 %r1846; .reg .u32 %r1847; .reg .u32 %r1848; .reg .u64 %r1851; .reg .u64 %r1853; .reg .u16 %r1855; .reg .u16 %r1856; .reg .u16 %r1858; .reg .u16 %r1859; .reg .u32 %r1860; .reg .u16 %r1862; .reg .u16 %r1864; .reg .u16 %r1865; .reg .u32 %r1866; .reg .u64 %r1867; .reg .u16 %r1869; .reg .u16 %r1871; .reg .u16 %r1872; .reg .u32 %r1873; .reg .u64 %r1874; .reg .u16 %r1876; .reg .u16 %r1878; .reg .u16 %r1879; .reg .u32 %r1880; .reg .pred %r1881; .reg .u32 %r1882; .reg .u16 %r1887; .reg .u16 %r1888; .reg .u16 %r1890; .reg .u16 %r1891; .reg .u32 %r1892; .reg .u32 %r1893; .reg .u16 %r1899; .reg .u16 %r1900; .reg .u16 %r1902; .reg .u16 %r1903; .reg .u32 %r1904; .reg .u32 %r1905; .reg .u16 %r1911; .reg .u16 %r1912; .reg .u16 %r1914; .reg .u16 %r1915; .reg .u32 %r1916; .reg .pred %r1917; .reg .pred %r1918; .reg .u32 %r1919; .reg .u16 %r1924; .reg .u16 %r1925; .reg .u16 %r1927; .reg .u16 %r1928; .reg .u32 %r1929; .reg .u32 %r1930; .reg .u16 %r1936; .reg .u16 %r1937; .reg .u16 %r1939; .reg .u16 %r1940; .reg .u32 %r1941; .reg .u32 %r1942; .reg .u16 %r1948; .reg .u16 %r1949; .reg .u16 %r1951; .reg .u16 %r1952; .reg .u32 %r1953; .reg .pred %r1954; .reg .pred %r1955; .reg .u32 %r1956; .reg .u64 %r1957; .reg .u16 %r1961; .reg .u16 %r1962; .reg .u16 %r1964; .reg .u16 %r1965; .reg .u32 %r1966; .reg .u32 %r1967; .reg .u64 %r1968; .reg .u16 %r1973; .reg .u16 %r1974; .reg .u16 %r1976; .reg .u16 %r1977; .reg .u32 %r1978; .reg .u32 %r1979; .reg .u64 %r1980; .reg .u16 %r1985; .reg .u16 %r1986; .reg .u16 %r1988; .reg .u16 %r1989; .reg .u32 %r1990; .reg .pred %r1991; .reg .pred %r1992; .reg .u32 %r1993; .reg .u64 %r1995; .reg .u64 %r1996; .reg .u16 %r1999; .reg .u16 %r2000; .reg .u16 %r2002; .reg .u16 %r2003; .reg .u32 %r2004; .reg .u32 %r2005; .reg .u64 %r2007; .reg .u64 %r2008; .reg .u16 %r2012; .reg .u16 %r2013; .reg .u16 %r2015; .reg .u16 %r2016; .reg .u32 %r2017; .reg .u32 %r2018; .reg .u64 %r2020; .reg .u64 %r2021; .reg .u16 %r2025; .reg .u16 %r2026; .reg .u16 %r2028; .reg .u16 %r2029; .reg .u32 %r2030; .reg .pred %r2031; .reg .u64 %r2033; .reg .u64 %r2035; .reg .u64 %r2038; .reg .pred %r2040; .reg 
.pred %r2041; .reg .pred %r2042; .reg .u64 %r2044; .reg .u64 %r2045; .reg .u16 %r2048; .reg .u16 %r2049; .reg .u16 %r2052; .reg .u16 %r2053; .reg .u32 %r2054; .reg .u64 %r2056; .reg .u64 %r2057; .reg .u16 %r2061; .reg .u16 %r2062; .reg .u16 %r2065; .reg .u16 %r2066; .reg .u32 %r2067; .reg .u64 %r2069; .reg .u64 %r2070; .reg .u16 %r2074; .reg .u16 %r2075; .reg .u16 %r2078; .reg .u16 %r2079; .reg .u32 %r2080; .reg .pred %r2081; .reg .pred %r2082; .reg .u32 %r2084; .reg .u64 %r2086; .reg .u64 %r2087; .reg .u16 %r2090; .reg .u16 %r2091; .reg .u16 %r2093; .reg .u16 %r2094; .reg .u32 %r2095; .reg .u32 %r2096; .reg .u64 %r2098; .reg .u64 %r2099; .reg .u16 %r2103; .reg .u16 %r2104; .reg .u16 %r2106; .reg .u16 %r2107; .reg .u32 %r2108; .reg .u32 %r2109; .reg .u64 %r2111; .reg .u64 %r2112; .reg .u16 %r2116; .reg .u16 %r2117; .reg .u16 %r2119; .reg .u16 %r2120; .reg .u32 %r2121; .reg .pred %r2122; .reg .u16 %r2125; .reg .u16 %r2126; .reg .u16 %r2128; .reg .u16 %r2129; .reg .u32 %r2130; .reg .u16 %r2134; .reg .u16 %r2135; .reg .u16 %r2137; .reg .u16 %r2138; .reg .u32 %r2139; .reg .u16 %r2143; .reg .u16 %r2144; .reg .u16 %r2146; .reg .u16 %r2147; .reg .u32 %r2148; .reg .pred %r2149; .reg .u64 %r2150; .reg .u64 %r2152; .reg .u16 %r2154; .reg .u16 %r2155; .reg .u16 %r2157; .reg .u16 %r2158; .reg .u32 %r2159; .reg .u16 %r2161; .reg .u16 %r2163; .reg .u16 %r2164; .reg .u32 %r2165; .reg .u64 %r2166; .reg .u16 %r2168; .reg .u16 %r2170; .reg .u16 %r2171; .reg .u32 %r2172; .reg .u64 %r2173; .reg .u16 %r2175; .reg .u16 %r2177; .reg .u16 %r2178; .reg .u32 %r2179; .reg .pred %r2180; .reg .u16 %r2183; .reg .u16 %r2184; .reg .u16 %r2186; .reg .u16 %r2187; .reg .u32 %r2188; .reg .u64 %r2189; .reg .u64 %r2190; .reg .u16 %r2192; .reg .u16 %r2194; .reg .u16 %r2195; .reg .u32 %r2196; .reg .u64 %r2197; .reg .u64 %r2198; .reg .u16 %r2200; .reg .u16 %r2202; .reg .u16 %r2203; .reg .u32 %r2204; .reg .u64 %r2205; .reg .u64 %r2206; .reg .u16 %r2208; .reg .u16 %r2210; .reg .u16 %r2211; .reg .u32 %r2212; .reg .pred %r2213; .reg .u32 %r2214; .reg .u32 %r2216; .reg .u32 %r2218; .reg .pred %r2219; .reg .u32 %r2220; .reg .u64 %r2221; .reg .u64 %r2222; .reg .u32 %r2223; .reg .u32 %r2224; .reg .u64 %r2225; .reg .u64 %r2226; .reg .u32 %r2227; .reg .u32 %r2229; .reg .u64 %r2230; .reg .u64 %r2231; .reg .u32 %r2232; .reg .u32 %r2233; .reg .u64 %r2234; .reg .u64 %r2235; .reg .u32 %r2236; .reg .u32 %r2238; .reg .u64 %r2239; .reg .u64 %r2240; .reg .u32 %r2241; .reg .u32 %r2242; .reg .u64 %r2243; .reg .u64 %r2244; .reg .u32 %r2245; .reg .pred %r2246; .reg .u32 %r2247; .reg .u64 %r2248; .reg .u32 %r2250; .reg .u64 %r2251; .reg .u32 %r2253; .reg .u64 %r2254; .reg .pred %r2255; .reg .u32 %r2256; .reg .u64 %r2257; .reg .u64 %r2259; .reg .u32 %r2261; .reg .u64 %r2262; .reg .u64 %r2264; .reg .u32 %r2266; .reg .u64 %r2267; .reg .pred %r2268; .reg .u16 %r2271; .reg .u16 %r2272; .reg .u16 %r2273; .reg .u32 %r2276; .reg .u16 %r2280; .reg .u16 %r2281; .reg .u16 %r2283; .reg .u16 %r2284; .reg .u32 %r2285; .reg .u16 %r2289; .reg .u16 %r2290; .reg .u16 %r2292; .reg .u16 %r2293; .reg .u32 %r2294; .reg .pred %r2295; .reg .u16 %r2298; .reg .u16 %r2299; .reg .u16 %r2300; .reg .u32 %r2303; .reg .u16 %r2307; .reg .u16 %r2308; .reg .u16 %r2310; .reg .u16 %r2311; .reg .u32 %r2312; .reg .u16 %r2316; .reg .u16 %r2317; .reg .u16 %r2319; .reg .u16 %r2320; .reg .u32 %r2321; .reg .pred %r2322; .reg .u16 %r2325; .reg .u16 %r2326; .reg .u16 %r2327; .reg .u32 %r2330; .reg .u16 %r2334; .reg .u16 %r2335; .reg .u16 %r2337; .reg .u16 %r2338; .reg .u32 %r2339; .reg .u16 %r2343; 
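// Note (hedged reading of the generated code, not taken from any source comment):
// the long run of .reg declarations above and below defines the virtual registers of a
// single large kernel. The body that follows loads six incoming arguments (%ar0..%ar5)
// and, judging by the calls to _gfortran_size0, _gfortrani_xmallocarray and
// _gfortran_runtime_error, the descriptor bound checks, and the blocked mad.lo.u16
// loops over 8-bit loads further down, it appears to be libgfortran's matrix
// multiplication routine for one-byte integer data (likely matmul_i1) compiled for
// the nvptx target.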
.reg .u16 %r2344; .reg .u16 %r2346; .reg .u16 %r2347; .reg .u32 %r2348; .reg .pred %r2349; .reg .u16 %r2352; .reg .u16 %r2353; .reg .u16 %r2356; .reg .u16 %r2357; .reg .u32 %r2358; .reg .u16 %r2362; .reg .u16 %r2363; .reg .u16 %r2366; .reg .u16 %r2367; .reg .u32 %r2368; .reg .u16 %r2372; .reg .u16 %r2373; .reg .u16 %r2376; .reg .u16 %r2377; .reg .u32 %r2378; .reg .pred %r2379; .reg .pred %r2382; .reg .u16 %r2385; .reg .u16 %r2386; .reg .u16 %r2387; .reg .u32 %r2390; .reg .u16 %r2394; .reg .u16 %r2395; .reg .u16 %r2397; .reg .u16 %r2398; .reg .u32 %r2399; .reg .u16 %r2403; .reg .u16 %r2404; .reg .u16 %r2406; .reg .u16 %r2407; .reg .u32 %r2408; .reg .pred %r2409; .reg .u32 %r2419; .reg .u16 %r2420; mov.u64 %r724,%ar0; mov.u64 %r725,%ar1; mov.u64 %r726,%ar2; mov.u32 %r727,%ar3; mov.u32 %r728,%ar4; mov.u64 %r729,%ar5; .loc 1 2462 3 ld.s8 %r265,[%r725+28]; cvt.u16.u32 %r730,%r265; setp.eq.u16 %r731,%r730,2; @ %r731 bra $L2; ld.s8 %r1591,[%r726+28]; cvt.u16.u32 %r732,%r1591; setp.eq.u16 %r734,%r732,2; @ %r734 bra $L3; cvta.const.u64 %r738,$LC3; cvta.const.u64 %r737,__func__$0; mov.u32 %r736,2462; cvta.const.u64 %r7357377383: .loc 1 2476 6 ld.u64 %r739,[%r724]; setp.ne.u64 %r740,%r739,0; @ %r740 bra $L4; .loc 1 2478 10 setp.ne.u16 %r742,%r730,1; @ %r742 bra $L5; .loc 1 2480 4 st.u64 [%r724+48],%r739; ld.u64 %r745,[%r726+80]; ld.u64 %r747,[%r726+72]; sub.u64 %r748,%r745,%r747; st.u64 [%r724+56],%r748; mov.u64 %r749,1; st.u64 [%r724+40],%r749; bra $L6; $L123: .loc 1 2485 4 st.u64 [%r724+48],%r1557; add.u64 %r751,%r630,-1; st.u64 [%r724+56],%r751; mov.u64 %r752,1; st.u64 [%r724+40],%r752; bra $L6; $L124: .loc 1 2490 4 mov.u64 %r753,0; st.u64 [%r724+48],%r753; add.u64 %r754,%r630,-1; st.u64 [%r724+56],%r754; mov.u64 %r755,1; st.u64 [%r724+40],%r755; .loc 1 2493 11 st.u64 [%r724+72],%r753; ld.u64 %r758,[%r726+80]; ld.u64 %r760,[%r726+72]; sub.u64 %r761,%r758,%r760; st.u64 [%r724+80],%r761; st.u64 [%r724+64],%r630; $L6: .loc 1 2499 1724; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r763,[%value_in]; } .loc 1 2499 4 mov.u64 %r76576(%value_in),_gfortrani_xmallocarray766,[%value_in]; } .loc 1 2499 2 st.u64 [%r724],%r766; .loc 1 2500 24 mov.u64 %r768,0; st.u64 [%r724+8],%r768; .loc 1 2557 7 ld.s8 %r265,[%r725+28]; .loc 1 2564 15 ld.u64 %r770,[%r725+56]; add.u64 %r769,%r770,1; .loc 1 2564 13 ld.u64 %r771,[%r725+48]; sub.u64 %r266,%r769,%r771; bra $L7; $L4: .loc 1 2502 12 cvta.global.u64 %r772,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r773,[%r772+36]; setp.eq.u32 %r774,%r773,0; @ %r774 bra $L8; .loc 1 2509 17 ld.u64 %r776,[%r724+56]; add.u64 %r775,%r776,1; .loc 1 2509 15 ld.u64 %r777,[%r724+48]; sub.u64 %r266,%r775,%r777; .loc 1 2506 10 setp.ne.u16 %r779,%r730,1; @ %r779 bra $L9; .loc 1 2508 17 ld.u64 %r781,[%r726+80]; add.u64 %r780,%r781,1; .loc 1 2508 15 ld.u64 %r782,[%r726+72]; sub.u64 %r349,%r780,%r782; .loc 1 2510 7 setp.ne.u64 %r783,%r349,%r266; @ %r783 bra $L10; .loc 1 2564 15 ld.u64 %r785,[%r725+56]; add.u64 %r784,%r785,1; .loc 1 2564 13 ld.u64 %r786,[%r725+48]; sub.u64 %r266,%r784,%r786; .loc 1 2543 7 ld.s8 %r288,[%r724+28]; .loc 1 2548 27 ld.u64 %r336,[%r724+40]; .loc 1 2543 6 cvt.u16.u32 %r787,%r288; setp.eq.u16 %r788,%r787,1; @ %r788 bra $L11; .loc 1 2553 16 ld.u64 %r626,[%r724+64]; .loc 1 2560 16 ld.u64 %r377,[%r725+40]; .loc 1 2563 14 mov.u64 %r313,1; .loc 1 2561 16 mov.u64 %r307,%r313; bra $L12; $L10: .loc 1 2511 6 st.u64 [%stack+8],%r349; st.u64 [%stack],%r266; cvta.const.u64 %r7878_gfortran_runtime_error2485 4 ld.u64 %r792,[%r725+56]; add.u64 %r791,%r792,1; ld.u64 
%r793,[%r725+48]; sub.u64 %r635,%r791,%r793; .loc 1 2515 15 cvt.u16.u32 %r794,%r1591; setp.ne.u16 %r796,%r794,1; @ %r796 bra $L13; .loc 1 2519 7 setp.eq.u64 %r797,%r266,%r635; @ %r797 bra $L14; .loc 1 2520 6 st.u64 [%stack+8],%r635; st.u64 [%stack],%r266; cvta.const.u64 %r798_gfortran_runtime_error13: .loc 1 2528 7 setp.eq.u64 %r800,%r266,%r635; @ %r800 bra $L15; .loc 1 2529 6 st.u64 [%stack+8],%r635; st.u64 [%stack],%r266; cvta.const.u64 %r801801_gfortran_runtime_error15: .loc 1 2533 17 ld.u64 %r804,[%r726+80]; add.u64 %r803,%r804,1; .loc 1 2533 15 ld.u64 %r805,[%r726+72]; sub.u64 %r346,%r803,%r805; .loc 1 2534 17 ld.u64 %r807,[%r724+80]; add.u64 %r806,%r807,1; .loc 1 2534 15 ld.u64 %r808,[%r724+72]; sub.u64 %r347,%r806,%r808; .loc 1 2535 7 setp.eq.u64 %r809,%r346,%r347; @ %r809 bra $L14; .loc 1 2536 6 st.u64 [%stack+8],%r346; st.u64 [%stack],%r347; cvta.const.u64 %r810,$LC810_gfortran_runtime_error2564 15 ld.u64 %r813,[%r725+56]; add.u64 %r812,%r813,1; .loc 1 2564 13 ld.u64 %r814,[%r725+48]; sub.u64 %r266,%r812,%r814; $L7: .loc 1 2548 27 ld.u64 %r626,[%r724+40]; .loc 1 2543 6 ld.s8 %r816,[%r724+28]; cvt.u16.u32 %r815,%r816; setp.eq.u16 %r817,%r815,1; @ %r817 bra $L130; $L127: .loc 1 2553 16 mov.u64 %r336,%r626; ld.u64 %r626,[%r724+64]; bra $L16; $L130: .loc 1 2548 27 mov.u64 %r336,%r626; $L16: .loc 1 2560 16 ld.u64 %r377,[%r725+40]; .loc 1 2557 6 cvt.u16.u32 %r818,%r265; setp.eq.u16 %r819,%r818,1; @ %r819 bra $L131; $L125: .loc 1 2569 16 ld.u64 %r307,[%r725+64]; .loc 1 2571 15 ld.u64 %r821,[%r725+80]; add.u64 %r820,%r821,1; .loc 1 2571 13 ld.u64 %r822,[%r725+72]; mov.u64 %r313,%r266; sub.u64 %r266,%r820,%r822; bra $L12; $L131: .loc 1 2563 14 mov.u64 %r313,1; .loc 1 2561 16 mov.u64 %r307,%r313; $L12: .loc 1 2575 16 ld.u64 %r824,[%r726+56]; add.u64 %r823,%r824,1; ld.u64 %r825,[%r726+48]; sub.u64 %r74,%r823,%r825; .loc 1 2575 6 setp.eq.u64 %r826,%r74,%r266; @ %r826 bra $L17; .loc 1 2577 10 set.u32.gt.s64 %r828,%r266,0; neg.s32 %r829,%r828; .loc 1 2577 21 set.u32.gt.s64 %r831,%r74,0; neg.s32 %r832,%r831; cvt.u16.u32 %r834,%r829; cvt.u16.u32 %r835,%r832; or.b16 %r833,%r834,%r835; cvt.u32.u16 %r836,%r833; cvt.u16.u8 %r837,%r836; setp.eq.u16 %r838,%r837,0; @ %r838 bra $L17; .loc 1 2578 2 st.u64 [%stack+8],%r266; st.u64 [%stack],%r74; cvta.const.u64 %r838stack; call _gfortran_runtime_error17: .loc 1 2583 7 ld.s8 %r78,[%r726+28]; .loc 1 2586 16 ld.u64 %r625,[%r726+40]; .loc 1 2583 6 cvt.u16.u32 %r841,%r78; setp.eq.u16 %r842,%r841,1; @ %r842 bra $L132; .loc 1 2597 16 ld.u64 %r308,[%r726+64]; .loc 1 2598 16 ld.u64 %r844,[%r726+80]; add.u64 %r843,%r844,1; .loc 1 2598 14 ld.u64 %r845,[%r726+72]; sub.u64 %r314,%r843,%r845; bra $L18; $L132: .loc 1 2592 14 mov.u64 %r314,1; .loc 1 2591 16 mov.u64 %r308,256; $L18: .loc 1 2601 9 ld.u64 %r354,[%r725]; .loc 1 2602 9 ld.u64 %r355,[%r726]; .loc 1 2603 8 ld.u64 %r356,[%r724]; .loc 1 2612 28 set.u32.eq.u64 %r847,%r336,1; neg.s32 %r848,%r847; cvt.u32.u32 %r846,%r848; cvt.u32.u8 %r83,%r846; .loc 1 2612 46 set.u32.eq.u64 %r850,%r377,1; neg.s32 %r851,%r850; cvt.u32.u32 %r849,%r851; cvt.u32.u8 %r624,%r849; .loc 1 2612 7 set.u32.ne.u32 %r853,%r727,0; neg.s32 %r854,%r853; .loc 1 2612 16 cvt.u16.u32 %r856,%r854; cvt.u16.u32 %r857,%r83; and.b16 %r855,%r856,%r857; .loc 1 2612 6 cvt.u32.u16 %r858,%r855; cvt.u16.u8 %r859,%r858; setp.eq.u16 %r860,%r859,0; @ %r860 bra $L19; .loc 1 2612 63 set.u32.eq.u64 %r862,%r307,1; neg.s32 %r863,%r862; .loc 1 2612 51 cvt.u16.u32 %r865,%r863; cvt.u16.u32 %r866,%r624; or.b16 %r864,%r865,%r866; .loc 1 2612 33 cvt.u32.u16 %r867,%r864; 
cvt.u16.u8 %r868,%r867; setp.eq.u16 %r869,%r868,0; @ %r869 bra $L20; .loc 1 2613 20 set.u32.eq.u64 %r871,%r625,1; neg.s32 %r872,%r871; .loc 1 2613 37 set.u32.eq.u64 %r874,%r308,1; neg.s32 %r875,%r874; .loc 1 2613 25 cvt.u16.u32 %r877,%r872; cvt.u16.u32 %r878,%r875; or.b16 %r876,%r877,%r878; .loc 1 2613 7 cvt.u32.u16 %r879,%r876; cvt.u16.u8 %r880,%r879; setp.eq.u16 %r881,%r880,0; @ %r881 bra $L20; .loc 1 2615 13 cvt.rn.f32.s32 %r94,%r728; .loc 1 2614 12 cvt.rn.f32.s64 %r882,%r313; .loc 1 2614 50 cvt.rn.f32.s64 %r883,%r266; .loc 1 2614 47 mul.f32 %r884,%r882,%r883; .loc 1 2614 31 cvt.rn.f32.s64 %r885,%r314; .loc 1 2614 47 mul.f32 %r886,%r884,%r885; .loc 1 2615 13 mul.f32 %r887,%r94,%r94; mul.f32 %r888,%r887,%r94; .loc 1 2614 7 setp.gt.f32 %r889,%r886,%r888; @ ! %r889 bra $L19; .loc 1 2617 17 cvt.u32.u64 %r96,%r313; st.u32 [%frame+20],%r96; .loc 1 2617 29 cvt.u32.u64 %r97,%r314; st.u32 [%frame+16],%r97; .loc 1 2617 41 cvt.u32.u64 %r98,%r266; st.u32 [%frame+12],%r98; .loc 1 2617 52 cvt.u32.u64 %r99,%r626; st.u32 [%frame+8],%r99; .loc 1 2618 27 mov.u32 %r890,1; st.u8 [%frame+25],%r890; .loc 1 2618 36 mov.u32 %r891,0; st.u8 [%frame+24],%r891; .loc 1 2619 17 setp.ne.u64 %r892,%r377,1; @ %r892 bra $L22; cvt.u32.u64 %r337,%r307; bra $L23; $L22: cvt.u32.u64 %r337,%r377; $L23: st.u32 [%frame+4],%r337; .loc 1 2620 3 setp.ne.u64 %r893,%r625,1; @ %r893 bra $L24; cvt.u32.u64 %r338,%r308; bra $L25; $L24: cvt.u32.u64 %r338,%r625; $L25: st.u32 [%frame],%r338; .loc 1 2622 15 set.u32.gt.s32 %r895,%r337,0; neg.s32 %r896,%r895; .loc 1 2622 26 set.u32.gt.s32 %r898,%r338,0; neg.s32 %r899,%r898; .loc 1 2622 19 cvt.u16.u32 %r901,%r896; cvt.u16.u32 %r902,%r899; and.b16 %r900,%r901,%r902; .loc 1 2622 10 cvt.u32.u16 %r903,%r900; cvt.u16.u8 %r904,%r903; setp.eq.u16 %r905,%r904,0; @ %r905 bra $L19; .loc 1 2622 37 set.u32.gt.s32 %r907,%r99,0; neg.s32 %r908,%r907; .loc 1 2622 46 set.u32.gt.s32 %r910,%r96,1; neg.s32 %r911,%r910; .loc 1 2622 41 cvt.u16.u32 %r913,%r908; cvt.u16.u32 %r914,%r911; and.b16 %r912,%r913,%r914; cvt.u32.u16 %r915,%r912; cvt.u16.u8 %r916,%r915; setp.eq.u16 %r917,%r916,0; @ %r917 bra $L19; .loc 1 2622 55 set.u32.gt.s32 %r919,%r97,1; neg.s32 %r920,%r919; .loc 1 2622 64 set.u32.gt.s32 %r922,%r98,1; neg.s32 %r923,%r922; .loc 1 2622 59 cvt.u16.u32 %r925,%r920; cvt.u16.u32 %r926,%r923; and.b16 %r924,%r925,%r926; cvt.u32.u16 %r927,%r924; cvt.u16.u8 %r928,%r927; setp.eq.u16 %r929,%r928,0; @ %r929 bra $L19; .loc 1 2624 4 setp.ne.u64 %r930,%r729,0; @ %r930 bra $L27; cvta.const.u64 %r934,$LC8; cvta.const.u64 %r933,__func__$0; mov.u32 %r932,2624; cvta.const.u64 %r93199339327: .loc 1 2626 17 and.b32 %r935,%r727,2; .loc 1 2626 7 setp.ne.u32 %r936,%r935,0; @ %r936 bra $L133; .loc 1 2629 35 setp.eq.u64 %r937,%r377,1; @ %r937 bra $L134; cvta.const.u64 %r315,$LC2; bra $L28; $L133: .loc 1 2627 13 cvta.const.u64 %r315,$LC1; bra $L28; $L134: .loc 1 2629 35 cvta.const.u64 %r315,$LC0; $L28: .loc 1 2631 17 and.b32 %r938,%r727,4; .loc 1 2631 7 setp.ne.u32 %r939,%r938,0; @ %r939 bra $L135; .loc 1 2634 35 setp.eq.u64 %r940,%r625,1; @ %r940 bra $L136; cvta.const.u64 %r316,$LC2; bra $L29; $L135: .loc 1 2632 13 cvta.const.u64 %r316,$LC1; bra $L29; $L136: .loc 1 2634 35 cvta.const.u64 %r316,$LC0; $L29: .loc 1 2636 4 add.u64 %r956,%frame,8; add.u64 %r957,%frame,24; add.u64 %r958,%frame,4; add.u64 %r959,%frame,25; add.u64 %r960,%frame,12; add.u64 %r961,%frame,16; add.u64 %r962,%frame,20; mov.u32 %r955,1;,.param .u64 %in_ar3,.param .u64 %in_ar4,.param .u64 %in_ar5,.param .u64 %in_ar6,.param .u64 %in_ar7,.param .u64 %in_ar8,.param 
.u64 %in_ar9,.param .u64 %in_ar10,.param .u64 %in_ar11,.param .u64 %in_ar12,.param .u32 %in_ar13,.param .u32 %in_ar14962960354; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r958; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r355; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%frame; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r957; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r356; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r956; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r955; .param .u32 %out_arg15; st.param.u32 [%out_arg15],%r955; call %r729,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14,%out_arg15),$LCT0; } bra $L1; $L19: .loc 1 2643 21 cvt.u16.u32 %r964,%r83; cvt.u16.u32 %r965,%r624; and.b16 %r963,%r964,%r965; .loc 1 2643 6 cvt.u32.u16 %r966,%r963; cvt.u16.u8 %r967,%r966; setp.eq.u16 %r968,%r967,0; @ %r968 bra $L31; .loc 1 2643 38 setp.ne.u64 %r969,%r625,1; @ %r969 bra $L20; .loc 1 2644 7 cvt.u16.u32 %r970,%r78; setp.eq.u16 %r971,%r970,1; @ %r971 bra $L32; .loc 1 2677 16 add.u64 %r972,%r626,1; .loc 1 2678 9 sub.u64 %r358,%r356,%r972; .loc 1 2687 7 setp.le.s64 %r973,%r314,0; @ %r973 bra $L33; add.u64 %r141,%r358,%r972; add.u64 %r345,%r314,1; .loc 1 2687 13 mov.u64 %r407,%r625; setp.gt.s64 %r1590,%r313,0; sub.u64 %r1697,%r345,%r625; and.b64 %r1698,%r1697,3; setp.eq.u64 %r1701,%r1698,0; @ %r1701 bra $L34; setp.eq.u64 %r1700,%r1698,1; @ %r1700 bra $L294; setp.eq.u64 %r1699,%r1698,2; @ %r1699 bra $L295; bra $L356; $L35: .loc 1 2689 22 mov.u32 %r978973130,[%value_in]; } $L36: .loc 1 2687 24 add.u64 %r1702,%r407,1; .loc 1 2687 7 add.u64 %r1753,%r141,%r626; .loc 1 2688 2 @ %r1590 bra $L232; bra $L357; $L34: @ %r1590 bra $L35; bra $L36; $L33: .loc 1 2692 28 set.u32.eq.u64 %r985,%r266,0; neg.s32 %r986,%r985; .loc 1 2692 13 set.u32.eq.u64 %r988,%r313,0; neg.s32 %r989,%r988; .loc 1 2692 28 cvt.u16.u32 %r991,%r986; cvt.u16.u32 %r992,%r989; or.b16 %r990,%r991,%r992; .loc 1 2692 23 set.u32.eq.u64 %r994,%r314,0; neg.s32 %r995,%r994; .loc 1 2692 28 cvt.u16.u32 %r999,%r995; or.b16 %r997,%r990,%r999; cvt.u32.u16 %r1000,%r997; cvt.u16.u8 %r1001,%r1000; setp.eq.u16 %r1002,%r1001,0; @ ! 
%r1002 bra $L1; .loc 1 2697 10 setp.eq.u64 %r1003,%r307,1; selp.u64 %r333,%r626,%r307,%r1003; .loc 1 2702 21 shl.b64 %r1004,%r333,8; .loc 1 2702 14 add.u64 %r363,%r1004,%r308; .loc 1 2706 12 min.s64 %r1006,%r363,655361007,[%value_in]; } .loc 1 2710 7 setp.gt.s64 %r1009,%r314,0; @ %r1009 bra $L39; $L42: .loc 1 2910072918 7 bra $L1; $L39: .loc 1 2680 16 add.u64 %r1011,%r307,1; .loc 1 2681 9 sub.u64 %r360,%r354,%r1011; shl.b64 %r634,%r626,9; add.u64 %r636,%r626,%r626; neg.s64 %r1014,%r308; shl.b64 %r1015,%r1014,9; mov.u64 %r57,%r1014; shl.b64 %r348,%r308,9; add.u64 %r1578,%r308,%r308; add.u64 %r54,%r1578,%r308; add.u64 %r50,%r358,%r636; add.u64 %r1019,%r314,-1; and.b64 %r65,%r1019,-512; shl.b64 %r587,%r307,8; neg.s64 %r1592,%r307; shl.b64 %r1022,%r1592,8; add.u64 %r319,%r307,%r307; add.u64 %r1026,%r1592,%r1592; shl.b64 %r462,%r308,2; shl.b64 %r1029,%r57,2; .loc 1 2683 16 add.u64 %r1030,%r308,1; .loc 1 2684 9 sub.u64 %r129,%r355,%r1030; add.u64 %r1033,%r57,%r57; add.u64 %r131,%r1033,%r129; shl.b64 %r453,%r626,2; mov.u64 %r56,%r626; mov.u64 %r575,0; setp.gt.s64 %r1588,%r266,0; setp.gt.s64 %r1635,%r313,0; add.u64 %r1636,%r266,1; add.u64 %r1637,%r313,-1; and.b64 %r1638,%r1637,-256; add.u64 %r1639,%r360,1; bra $L41; $L138: mov.u64 %r575,%r576; $L41: add.u64 %r70,%r575,1; .loc 1 2715 9 sub.u64 %r365,%r314,%r575; min.s64 %r1754,%r365,512; .loc 1 2716 24 shr.s64 %r1035,%r1754,63; shr.u64 %r1036,%r1035,62; add.u64 %r1037,%r1754,%r1036; and.b64 %r1038,%r1037,3; sub.u64 %r1039,%r1038,%r1036; .loc 1 2716 10 sub.u64 %r366,%r1754,%r1039; .loc 1 2718 4 @ %r1588 bra $L40; $L45: .loc 1 2710 7 add.u64 %r576,%r575,512; add.u64 %r56,%r56,%r634; add.u64 %r57,%r57,%r1015; add.u64 %r54,%r54,%r348; add.u64 %r50,%r50,%r634; setp.ne.u64 %r1041,%r65,%r575; @ %r1041 bra $L138; bra $L42; $L40: mov.u64 %r584,%r1592; .loc 1 2768 13 add.u64 %r600,%r366,%r70; add.u64 %r455,%r54,%r131; add.u64 %r113,%r50,%r636; mad.lo.u64 %r577,%r308,%r600,%r129; mad.lo.u64 %r526,%r600,%r626,%r358; mov.u64 %r585,%r307; .loc 1 2718 12 mov.u64 %r369,%r625; .loc 1 2768 8 add.u64 %r1628,%r600,-1; setp.le.s64 %r1631,%r70,%r1628; .loc 1 2873 17 add.u64 %r1633,%r70,%r1754; .loc 1 2874 9 setp.le.s64 %r1634,%r1633,%r600; $L44: .loc 1 2723 13 sub.u64 %r367,%r1636,%r369; min.s64 %r1756,%r367,256; .loc 1 2724 28 shr.u64 %r1047,%r1756,63; add.u64 %r1048,%r1756,%r1047; and.b64 %r1049,%r1048,1; sub.u64 %r1050,%r1049,%r1047; .loc 1 2724 14 sub.u64 %r368,%r1756,%r1050; .loc 1 2727 8 @ %r1635 bra $L43; $L73: .loc 1 2718 30 add.u64 %r369,%r369,256; .loc 1 2718 4 add.u64 %r585,%r585,%r587; add.u64 %r584,%r584,%r1022; setp.ge.s64 %r1052,%r266,%r369; @ %r1052 bra $L44; bra $L45; $L43: .loc 1 2763 26 add.u64 %r1053,%r369,-1; add.u64 %r1054,%r1053,%r1756; .loc 1 2763 31 mul.lo.u64 %r430,%r1054,%r307; .loc 1 2734 13 add.u64 %r599,%r368,%r369; .loc 1 2734 8 add.u64 %r601,%r599,-1; add.u64 %r1056,%r455,%r369; add.u64 %r456,%r1056,%r1756; add.u64 %r578,%r577,%r369; add.u64 %r528,%r1756,%r578; .loc 1 2768 8 mov.u64 %r598,0; setp.le.s64 %r1598,%r369,%r601; setp.gt.s64 %r1585,%r1756,%r368; setp.gt.s64 %r1584,%r1754,%r366; not.b64 %r1621,%r369; add.u64 %r1622,%r1621,%r599; and.b64 %r1623,%r1622,-2; .loc 1 2904 18 add.u64 %r1624,%r1756,%r369; .loc 1 2904 13 add.u64 %r1625,%r1624,-1; setp.gt.s64 %r1626,%r369,%r1625; bra $L72; $L140: mov.u64 %r598,%r597; $L72: add.u64 %r591,%r598,1; .loc 1 2732 10 sub.u64 %r370,%r313,%r598; min.s64 %r1757,%r370,256; .loc 1 2733 25 shr.s64 %r1593,%r1757,63; shr.u64 %r1059,%r1757,63; add.u64 %r1060,%r1757,%r1059; and.b64 %r1061,%r1060,1; sub.u64 
%r1062,%r1061,%r1059; .loc 1 2733 11 sub.u64 %r371,%r1757,%r1062; .loc 1 2735 5 @ %r1598 bra $L46; $L55: .loc 1 2757 8 @ ! %r1585 bra $L48; bra $L47; $L46: .loc 1 2751 28 shl.b64 %r135,%r1757,8; .loc 1 2737 17 add.u64 %r603,%r371,%r591; add.u64 %r1066,%r1639,%r585; add.u64 %r320,%r1066,%r598; add.u64 %r112,%r307,%r585; mov.u64 %r77,%r584; mov.u64 %r317,%r585; mov.u64 %r322,0; setp.lt.s64 %r1579,%r591,%r603; setp.gt.s64 %r1586,%r1757,%r371; not.b64 %r1615,%r591; add.u64 %r1616,%r1615,%r603; and.b64 %r1617,%r1616,-2; add.u64 %r1618,%r1617,2; add.u64 %r1619,%r1757,%r591; add.u64 %r1620,%r360,%r1619; bra $L54; $L139: mov.u64 %r322,%r321; $L54: .loc 1 2738 9 @ %r1579 bra $L49; $L53: .loc 1 2749 12 @ %r1586 bra $L50; bra $L51; $L49: add.u64 %r340,%r1007,%r322; add.u64 %r323,%r1618,%r320; .loc 1 2738 9 mov.u64 %r341,%r320; add.u64 %r1687,%r1618,-2; shr.u64 %r1685,%r1687,1; add.u64 %r1688,%r1685,1; and.b64 %r1689,%r1688,3; setp.eq.u64 %r1692,%r1689,0; @ %r1692 bra $L52; setp.eq.u64 %r1691,%r1689,1; @ %r1691 bra $L296; setp.eq.u64 %r1690,%r1689,2; @ %r1690 bra $L297; bra $L358; $L52: .loc 1 2740 49 ld.u8 %r1075,[%r341]; st.u8 [%r340],%r1075; .loc 1 2743 7 add.u64 %r1076,%r112,%r341; add.u64 %r1077,%r1076,%r77; .loc 1 2742 49 ld.u8 %r1078,[%r1077]; st.u8 [%r340+1],%r1078; .loc 1 2744 49 ld.u8 %r1079,[%r341+1]; st.u8 [%r340+256],%r1079; .loc 1 2747 7 sub.u64 %r1080,%r341,%r317; add.u64 %r1081,%r1080,%r112; .loc 1 2746 49 ld.u8 %r1082,[%r1081+1]; st.u8 [%r340+257],%r1082; .loc 1 2738 9 add.u64 %r1693,%r341,2; add.u64 %r1694,%r340,512; .loc 1 2740 49 ld.u8 %r1760,[%r1693]; st.u8 [%r1694],%r1760; .loc 1 2743 7 add.u64 %r1761,%r112,%r1693; add.u64 %r1762,%r1761,%r77; .loc 1 2742 49 ld.u8 %r1763,[%r1762]; st.u8 [%r1694+1],%r1763; .loc 1 2744 49 ld.u8 %r1764,[%r1693+1]; st.u8 [%r1694+256],%r1764; .loc 1 2747 7 sub.u64 %r1765,%r1693,%r317; add.u64 %r1766,%r1765,%r112; .loc 1 2746 49 ld.u8 %r1767,[%r1766+1]; st.u8 [%r1694+257],%r1767; .loc 1 2738 9 add.u64 %r1768,%r341,4; add.u64 %r1769,%r340,1024; .loc 1 2740 49 ld.u8 %r1771,[%r1768]; st.u8 [%r1769],%r1771; .loc 1 2743 7 add.u64 %r1772,%r112,%r1768; add.u64 %r1773,%r1772,%r77; .loc 1 2742 49 ld.u8 %r1774,[%r1773]; st.u8 [%r1769+1],%r1774; .loc 1 2744 49 ld.u8 %r1775,[%r1768+1]; st.u8 [%r1769+256],%r1775; .loc 1 2747 7 sub.u64 %r1776,%r1768,%r317; add.u64 %r1777,%r1776,%r112; .loc 1 2746 49 ld.u8 %r1778,[%r1777+1]; st.u8 [%r1769+257],%r1778; .loc 1 2738 9 add.u64 %r1779,%r341,6; add.u64 %r1780,%r340,1536; .loc 1 2740 49 ld.u8 %r1782,[%r1779]; st.u8 [%r1780],%r1782; .loc 1 2743 7 add.u64 %r1783,%r112,%r1779; add.u64 %r1784,%r1783,%r77; .loc 1 2742 49 ld.u8 %r1785,[%r1784]; st.u8 [%r1780+1],%r1785; .loc 1 2744 49 ld.u8 %r1786,[%r1779+1]; st.u8 [%r1780+256],%r1786; .loc 1 2747 7 sub.u64 %r1787,%r1779,%r317; add.u64 %r1788,%r1787,%r112; .loc 1 2746 49 ld.u8 %r1789,[%r1788+1]; st.u8 [%r1780+257],%r1789; .loc 1 2738 9 add.u64 %r341,%r341,8; add.u64 %r340,%r340,2048; setp.ne.u64 %r1790,%r323,%r341; @ %r1790 bra $L52; bra $L53; $L50: .loc 1 2751 41 add.u64 %r1085,%r135,%r322; add.u64 %r1086,%r1007,%r1085; .loc 1 2752 10 add.u64 %r1088,%r1620,%r317; .loc 1 2754 10 add.u64 %r1092,%r1620,%r112; .loc 1 2751 41 ld.u8 %r1093,[%r1092+-1]; cvt.u32.u16 %r2419,%r1093; mov.u32 %r1095,%r2419; shl.b32 %r1094,%r1095,8; ld.u8 %r1096,[%r1088+-1]; cvt.u16.u32 %r1098,%r1094; or.b16 %r1097,%r1096,%r1098; st.u16 [%r1086+-256],%r1097; $L51: .loc 1 2735 5 add.u64 %r321,%r322,2; add.u64 %r320,%r320,%r319; add.u64 %r317,%r317,%r319; add.u64 %r112,%r112,%r319; add.u64 %r77,%r77,%r1026; 
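// Hedged annotation: the unrolled ld.u8/st.u8 sequences in the blocks above
// ($L49/$L52 and the $L50 tail) appear to pack strided byte elements of one operand
// into the work buffer obtained earlier (%r1007), using a fixed 256-byte pitch,
// before the blocked multiply loops below consume it.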
setp.ne.u64 %r1099,%r1623,%r322; @ %r1099 bra $L139; bra $L55; $L48: .loc 1 2767 25 shr.u64 %r1102,%r1593,62; add.u64 %r1103,%r1757,%r1102; and.b64 %r1104,%r1103,3; sub.u64 %r1105,%r1104,%r1102; .loc 1 2767 11 sub.u64 %r372,%r1757,%r1105; .loc 1 2769 5 @ %r1631 bra $L56; bra $L57; $L47: .loc 1 2759 17 add.u64 %r1791,%r1757,%r591; .loc 1 2760 9 setp.le.s64 %r1107,%r1791,%r591; @ %r1107 bra $L48; add.u64 %r1109,%r1639,%r430; add.u64 %r432,%r1109,%r598; add.u64 %r1110,%r1756,-1; add.u64 %r431,%r1007,%r1110; add.u64 %r1111,%r430,%r360; add.u64 %r342,%r1111,%r1791; sub.u64 %r1678,%r342,%r432; and.b64 %r1679,%r1678,3; setp.eq.u64 %r1682,%r1679,0; @ %r1682 bra $L58; setp.eq.u64 %r1681,%r1679,1; @ %r1681 bra $L298; setp.eq.u64 %r1680,%r1679,2; @ %r1680 bra $L299; bra $L359; $L58: .loc 1 2762 43 ld.u8 %r1112,[%r432]; st.u8 [%r431],%r1112; ld.u8 %r1794,[%r432+1]; st.u8 [%r431+256],%r1794; ld.u8 %r1798,[%r432+2]; st.u8 [%r431+512],%r1798; ld.u8 %r1802,[%r432+3]; st.u8 [%r431+768],%r1802; .loc 1 2760 9 add.u64 %r432,%r432,4; add.u64 %r431,%r431,1024; setp.ne.u64 %r1803,%r342,%r432; @ %r1803 bra $L58; bra $L48; $L57: .loc 1 2871 8 @ ! %r1584 bra $L60; bra $L59; $L56: .loc 1 2876 14 add.u64 %r608,%r372,%r591; .loc 1 2876 9 add.u64 %r609,%r608,-1; sub.u64 %r463,%r54,%r308; add.u64 %r458,%r1578,%r463; add.u64 %r454,%r358,%r56; sub.u64 %r450,%r113,%r626; shl.b64 %r502,%r372,8; mov.u64 %r448,%r113; mov.u64 %r452,%r50; mov.u64 %r457,%r456; mov.u64 %r459,%r54; mov.u64 %r461,%r57; mov.u64 %r387,%r70; setp.le.s64 %r1580,%r591,%r609; setp.gt.s64 %r1587,%r1757,%r372; add.u64 %r1611,%r129,%r369; $L68: .loc 1 2772 9 @ %r1580 bra $L61; $L67: .loc 1 2843 12 @ ! %r1587 bra $L63; bra $L62; $L61: add.u64 %r521,%r1611,%r1756; .loc 1 2904 13 mov.u64 %r389,%r591; mov.u64 %r474,1024; mov.u64 %r475,768; mov.u64 %r476,512; mov.u64 %r477,256; sub.u64 %r1610,%r521,%r457; $L66: add.u64 %r471,%r454,%r389; .loc 1 2774 10 ld.s8 %r390,[%r471]; .loc 1 2775 10 ld.s8 %r391,[%r471+1]; add.u64 %r469,%r452,%r389; .loc 1 2776 10 ld.s8 %r392,[%r469]; .loc 1 2777 10 ld.s8 %r393,[%r469+1]; add.u64 %r467,%r450,%r389; .loc 1 2778 10 ld.s8 %r394,[%r467]; .loc 1 2779 10 ld.s8 %r395,[%r467+1]; add.u64 %r465,%r448,%r389; .loc 1 2780 10 ld.s8 %r396,[%r465]; .loc 1 2781 10 ld.s8 %r397,[%r465+1]; .loc 1 2782 10 ld.s8 %r398,[%r471+2]; .loc 1 2783 10 ld.s8 %r399,[%r471+3]; .loc 1 2784 10 ld.s8 %r400,[%r469+2]; .loc 1 2785 10 ld.s8 %r401,[%r469+3]; .loc 1 2786 10 ld.s8 %r402,[%r467+2]; .loc 1 2787 10 ld.s8 %r403,[%r467+3]; .loc 1 2788 10 ld.s8 %r404,[%r465+2]; .loc 1 2789 10 ld.s8 %r405,[%r465+3]; .loc 1 2791 6 @ %r1626 bra $L64; sub.u64 %r493,%r457,%r1756; add.u64 %r491,%r1007,%r477; $L65: sub.u64 %r488,%r491,%r476; .loc 1 2793 19 add.u64 %r1121,%r488,%r477; .loc 1 2794 11 ld.u8 %r1123,[%r1121]; ld.u8 %r1124,[%r493]; .loc 1 2793 14 cvt.u16.u32 %r1129,%r390; mad.lo.u16 %r1127,%r1123,%r1124,%r1129; cvt.u32.u16 %r1130,%r1127; cvt.s32.s8 %r390,%r1130; .loc 1 2796 11 ld.u8 %r1134,[%r491]; .loc 1 2795 14 cvt.u16.u32 %r1139,%r391; mad.lo.u16 %r1137,%r1124,%r1134,%r1139; cvt.u32.u16 %r1140,%r1137; cvt.s32.s8 %r391,%r1140; add.u64 %r483,%r1610,%r493; .loc 1 2798 14 add.u64 %r1143,%r483,%r463; .loc 1 2798 11 ld.u8 %r1146,[%r1143]; .loc 1 2797 14 cvt.u16.u32 %r1151,%r392; mad.lo.u16 %r1149,%r1123,%r1146,%r1151; cvt.u32.u16 %r1152,%r1149; cvt.s32.s8 %r392,%r1152; .loc 1 2799 14 cvt.u16.u32 %r1161,%r393; mad.lo.u16 %r1159,%r1134,%r1146,%r1161; cvt.u32.u16 %r1162,%r1159; cvt.s32.s8 %r393,%r1162; .loc 1 2802 14 add.u64 %r1164,%r483,%r459; .loc 1 2802 11 ld.u8 
%r1167,[%r1164]; .loc 1 2801 14 cvt.u16.u32 %r1172,%r394; mad.lo.u16 %r1170,%r1123,%r1167,%r1172; cvt.u32.u16 %r1173,%r1170; cvt.s32.s8 %r394,%r1173; .loc 1 2803 14 cvt.u16.u32 %r1182,%r395; mad.lo.u16 %r1180,%r1134,%r1167,%r1182; cvt.u32.u16 %r1183,%r1180; cvt.s32.s8 %r395,%r1183; .loc 1 2806 14 add.u64 %r1185,%r483,%r458; .loc 1 2806 11 ld.u8 %r1188,[%r1185]; .loc 1 2805 14 cvt.u16.u32 %r1193,%r396; mad.lo.u16 %r1191,%r1123,%r1188,%r1193; cvt.u32.u16 %r1194,%r1191; cvt.s32.s8 %r396,%r1194; .loc 1 2807 14 cvt.u16.u32 %r1203,%r397; mad.lo.u16 %r1201,%r1134,%r1188,%r1203; cvt.u32.u16 %r1204,%r1201; cvt.s32.s8 %r397,%r1204; .loc 1 2809 19 add.u64 %r1206,%r488,%r475; .loc 1 2810 11 ld.u8 %r1209,[%r1206]; .loc 1 2809 14 cvt.u16.u32 %r1214,%r398; mad.lo.u16 %r1212,%r1124,%r1209,%r1214; cvt.u32.u16 %r1215,%r1212; cvt.s32.s8 %r398,%r1215; .loc 1 2811 19 add.u64 %r1217,%r488,%r474; .loc 1 2812 11 ld.u8 %r1220,[%r1217]; .loc 1 2811 14 cvt.u16.u32 %r1225,%r399; mad.lo.u16 %r1223,%r1124,%r1220,%r1225; cvt.u32.u16 %r1226,%r1223; cvt.s32.s8 %r399,%r1226; .loc 1 2813 14 cvt.u16.u32 %r1235,%r400; mad.lo.u16 %r1233,%r1146,%r1209,%r1235; cvt.u32.u16 %r1236,%r1233; cvt.s32.s8 %r400,%r1236; .loc 1 2815 14 cvt.u16.u32 %r1245,%r401; mad.lo.u16 %r1243,%r1146,%r1220,%r1245; cvt.u32.u16 %r1246,%r1243; cvt.s32.s8 %r401,%r1246; .loc 1 2817 14 cvt.u16.u32 %r1255,%r402; mad.lo.u16 %r1253,%r1167,%r1209,%r1255; cvt.u32.u16 %r1256,%r1253; cvt.s32.s8 %r402,%r1256; .loc 1 2819 14 cvt.u16.u32 %r1265,%r403; mad.lo.u16 %r1263,%r1167,%r1220,%r1265; cvt.u32.u16 %r1266,%r1263; cvt.s32.s8 %r403,%r1266; .loc 1 2821 14 cvt.u16.u32 %r1275,%r404; mad.lo.u16 %r1273,%r1188,%r1209,%r1275; cvt.u32.u16 %r1276,%r1273; cvt.s32.s8 %r404,%r1276; .loc 1 2823 14 cvt.u16.u32 %r1285,%r405; mad.lo.u16 %r1283,%r1188,%r1220,%r1285; cvt.u32.u16 %r1286,%r1283; cvt.s32.s8 %r405,%r1286; .loc 1 2791 6 add.u64 %r493,%r493,1; add.u64 %r491,%r491,1; setp.ne.u64 %r1288,%r457,%r493; @ %r1288 bra $L65; $L64: .loc 1 2826 24 cvt.u32.u32 %r1289,%r390; st.u8 [%r471],%r1289; .loc 1 2827 28 cvt.u32.u32 %r1290,%r391; st.u8 [%r471+1],%r1290; .loc 1 2828 30 cvt.u32.u32 %r1291,%r392; st.u8 [%r469],%r1291; .loc 1 2829 34 cvt.u32.u32 %r1292,%r393; st.u8 [%r469+1],%r1292; .loc 1 2830 30 cvt.u32.u32 %r1293,%r394; st.u8 [%r467],%r1293; .loc 1 2831 34 cvt.u32.u32 %r1294,%r395; st.u8 [%r467+1],%r1294; .loc 1 2832 30 cvt.u32.u32 %r1295,%r396; st.u8 [%r465],%r1295; .loc 1 2833 34 cvt.u32.u32 %r1296,%r397; st.u8 [%r465+1],%r1296; .loc 1 2834 28 cvt.u32.u32 %r1297,%r398; st.u8 [%r471+2],%r1297; .loc 1 2835 28 cvt.u32.u32 %r1298,%r399; st.u8 [%r471+3],%r1298; .loc 1 2836 34 cvt.u32.u32 %r1299,%r400; st.u8 [%r469+2],%r1299; .loc 1 2837 34 cvt.u32.u32 %r1300,%r401; st.u8 [%r469+3],%r1300; .loc 1 2838 34 cvt.u32.u32 %r1301,%r402; st.u8 [%r467+2],%r1301; .loc 1 2839 34 cvt.u32.u32 %r1302,%r403; st.u8 [%r467+3],%r1302; .loc 1 2840 34 cvt.u32.u32 %r1303,%r404; st.u8 [%r465+2],%r1303; .loc 1 2841 34 cvt.u32.u32 %r1304,%r405; st.u8 [%r465+3],%r1304; .loc 1 2772 33 add.u64 %r389,%r389,4; .loc 1 2772 9 add.u64 %r477,%r477,1024; add.u64 %r476,%r476,1024; add.u64 %r475,%r475,1024; add.u64 %r474,%r474,1024; setp.le.s64 %r1305,%r389,%r609; @ %r1305 bra $L66; bra $L67; $L63: .loc 1 2769 29 add.u64 %r387,%r387,4; .loc 1 2769 5 add.u64 %r463,%r463,%r462; add.u64 %r461,%r461,%r1029; add.u64 %r459,%r459,%r462; add.u64 %r458,%r458,%r462; add.u64 %r457,%r457,%r462; add.u64 %r454,%r454,%r453; add.u64 %r452,%r452,%r453; add.u64 %r450,%r450,%r453; add.u64 %r448,%r448,%r453; setp.le.s64 
%r1306,%r387,%r1628; @ %r1306 bra $L68; bra $L57; $L62: add.u64 %r508,%r454,%r608; add.u64 %r507,%r452,%r608; add.u64 %r505,%r450,%r608; add.u64 %r504,%r448,%r608; .loc 1 2904 13 mov.u64 %r503,%r502; add.u64 %r1804,%r1757,%r591; .loc 1 2846 6 add.u64 %r1366,%r1804,%r454; $L71: .loc 1 2848 14 ld.s8 %r383,[%r508]; .loc 1 2849 14 ld.s8 %r384,[%r507]; .loc 1 2850 14 ld.s8 %r385,[%r505]; .loc 1 2851 14 ld.s8 %r386,[%r504]; .loc 1 2853 10 @ %r1626 bra $L69; add.u64 %r522,%r1007,%r503; sub.u64 %r520,%r457,%r1756; and.b64 %r1672,%r1756,1; setp.eq.u64 %r1673,%r1672,0; @ ! %r1673 bra $L360; $L70: .loc 1 2856 13 ld.u8 %r2420,[%r520]; mov.u16 %r1312,%r2420; ld.u8 %r1313,[%r522]; .loc 1 2855 11 cvt.u16.u32 %r1318,%r383; mad.lo.u16 %r1316,%r1312,%r1313,%r1318; cvt.u32.u16 %r1319,%r1316; cvt.s32.s8 %r1805,%r1319; .loc 1 2858 16 add.u64 %r1321,%r463,%r520; add.u64 %r1322,%r1321,%r461; .loc 1 2858 13 ld.u8 %r2420,[%r1322]; mov.u16 %r1325,%r2420; .loc 1 2857 11 cvt.u16.u32 %r1331,%r384; mad.lo.u16 %r1329,%r1325,%r1313,%r1331; cvt.u32.u16 %r1332,%r1329; cvt.s32.s8 %r1806,%r1332; .loc 1 2860 16 add.u64 %r1334,%r459,%r520; add.u64 %r1335,%r1334,%r461; .loc 1 2860 13 ld.u8 %r2420,[%r1335]; mov.u16 %r1338,%r2420; .loc 1 2859 11 cvt.u16.u32 %r1344,%r385; mad.lo.u16 %r1342,%r1338,%r1313,%r1344; cvt.u32.u16 %r1345,%r1342; cvt.s32.s8 %r1807,%r1345; .loc 1 2862 16 add.u64 %r1347,%r458,%r520; add.u64 %r1348,%r1347,%r461; .loc 1 2862 13 ld.u8 %r2420,[%r1348]; mov.u16 %r1351,%r2420; .loc 1 2861 11 cvt.u16.u32 %r1357,%r386; mad.lo.u16 %r1355,%r1351,%r1313,%r1357; cvt.u32.u16 %r1358,%r1355; cvt.s32.s8 %r1808,%r1358; .loc 1 2853 10 add.u64 %r1675,%r520,1; .loc 1 2856 13 ld.u8 %r2420,[%r1675]; mov.u16 %r1813,%r2420; ld.u8 %r1814,[%r522+1]; .loc 1 2855 11 cvt.u16.u32 %r1816,%r1805; mad.lo.u16 %r1817,%r1813,%r1814,%r1816; cvt.u32.u16 %r1818,%r1817; cvt.s32.s8 %r383,%r1818; .loc 1 2858 16 add.u64 %r1819,%r463,%r1675; add.u64 %r1820,%r1819,%r461; .loc 1 2858 13 ld.u8 %r2420,[%r1820]; mov.u16 %r1822,%r2420; .loc 1 2857 11 cvt.u16.u32 %r1824,%r1806; mad.lo.u16 %r1825,%r1822,%r1814,%r1824; cvt.u32.u16 %r1826,%r1825; cvt.s32.s8 %r384,%r1826; .loc 1 2860 16 add.u64 %r1827,%r459,%r1675; add.u64 %r1828,%r1827,%r461; .loc 1 2860 13 ld.u8 %r2420,[%r1828]; mov.u16 %r1830,%r2420; .loc 1 2859 11 cvt.u16.u32 %r1832,%r1807; mad.lo.u16 %r1833,%r1830,%r1814,%r1832; cvt.u32.u16 %r1834,%r1833; cvt.s32.s8 %r385,%r1834; .loc 1 2862 16 add.u64 %r1835,%r458,%r1675; add.u64 %r1836,%r1835,%r461; .loc 1 2862 13 ld.u8 %r2420,[%r1836]; mov.u16 %r1838,%r2420; .loc 1 2861 11 cvt.u16.u32 %r1840,%r1808; mad.lo.u16 %r1841,%r1838,%r1814,%r1840; cvt.u32.u16 %r1842,%r1841; cvt.s32.s8 %r386,%r1842; .loc 1 2853 10 add.u64 %r522,%r522,2; add.u64 %r520,%r520,2; setp.ne.u64 %r1843,%r457,%r520; @ %r1843 bra $L70; $L69: .loc 1 2864 28 cvt.u32.u32 %r1361,%r383; st.u8 [%r508],%r1361; .loc 1 2865 34 cvt.u32.u32 %r1362,%r384; st.u8 [%r507],%r1362; .loc 1 2866 34 cvt.u32.u32 %r1363,%r385; st.u8 [%r505],%r1363; .loc 1 2867 34 cvt.u32.u32 %r1364,%r386; st.u8 [%r504],%r1364; .loc 1 2846 6 add.u64 %r508,%r508,1; add.u64 %r507,%r507,1; add.u64 %r505,%r505,1; add.u64 %r504,%r504,1; add.u64 %r503,%r503,256; setp.ne.u64 %r1367,%r1366,%r508; @ %r1367 bra $L71; bra $L63; $L60: .loc 1 2727 8 add.u64 %r597,%r598,256; setp.ne.u64 %r1368,%r1638,%r598; @ %r1368 bra $L140; bra $L73; $L59: .loc 1 2874 9 @ %r1634 bra $L60; .loc 1 2759 17 add.u64 %r614,%r1757,%r591; .loc 1 2876 14 add.u64 %r615,%r372,%r591; .loc 1 2876 9 add.u64 %r616,%r615,-1; shl.b64 %r554,%r372,8; mov.u64 %r527,%r526; 
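// Hedged annotation: the $L61..$L71 region above is the computational core; it loads
// small tiles of signed bytes (ld.s8), accumulates products through chains of
// mad.lo.u16 against the packed buffer, and stores the truncated results back with
// st.u8, consistent with a register-blocked byte matrix multiply.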
mov.u64 %r529,%r528; mov.u64 %r531,%r578; mov.u64 %r374,%r600; setp.gt.s64 %r1844,%r1757,%r372; setp.le.s64 %r1581,%r591,%r616; $L81: .loc 1 2877 6 @ %r1581 bra $L74; $L80: .loc 1 2901 6 @ %r1844 bra $L75; bra $L76; $L74: .loc 1 2904 13 mov.u64 %r378,%r591; mov.u64 %r536,1024; mov.u64 %r537,768; mov.u64 %r538,512; mov.u64 %r539,256; $L79: add.u64 %r533,%r527,%r378; .loc 1 2879 14 ld.s8 %r379,[%r533]; .loc 1 2880 14 ld.s8 %r380,[%r533+1]; .loc 1 2881 14 ld.s8 %r381,[%r533+2]; .loc 1 2882 14 ld.s8 %r382,[%r533+3]; .loc 1 2884 10 @ %r1626 bra $L77; add.u64 %r547,%r1007,%r539; mov.u64 %r548,%r531; sub.u64 %r1664,%r529,%r531; and.b64 %r1665,%r1664,1; setp.eq.u64 %r1666,%r1665,0; @ ! %r1666 bra $L361; $L78: sub.u64 %r545,%r547,%r538; .loc 1 2886 16 add.u64 %r1374,%r545,%r539; .loc 1 2887 13 ld.u8 %r2420,[%r1374]; mov.u16 %r1377,%r2420; ld.u8 %r1378,[%r548]; .loc 1 2886 11 cvt.u16.u32 %r1383,%r379; mad.lo.u16 %r1381,%r1377,%r1378,%r1383; cvt.u32.u16 %r1384,%r1381; cvt.s32.s8 %r1845,%r1384; .loc 1 2889 13 ld.u8 %r2420,[%r547]; mov.u16 %r1388,%r2420; .loc 1 2888 11 cvt.u16.u32 %r1394,%r380; mad.lo.u16 %r1392,%r1388,%r1378,%r1394; cvt.u32.u16 %r1395,%r1392; cvt.s32.s8 %r1846,%r1395; .loc 1 2890 16 add.u64 %r1397,%r545,%r537; .loc 1 2891 13 ld.u8 %r2420,[%r1397]; mov.u16 %r1400,%r2420; .loc 1 2890 11 cvt.u16.u32 %r1406,%r381; mad.lo.u16 %r1404,%r1400,%r1378,%r1406; cvt.u32.u16 %r1407,%r1404; cvt.s32.s8 %r1847,%r1407; .loc 1 2892 16 add.u64 %r1409,%r545,%r536; .loc 1 2893 13 ld.u8 %r2420,[%r1409]; mov.u16 %r1412,%r2420; .loc 1 2892 11 cvt.u16.u32 %r1418,%r382; mad.lo.u16 %r1416,%r1412,%r1378,%r1418; cvt.u32.u16 %r1419,%r1416; cvt.s32.s8 %r1848,%r1419; .loc 1 2884 10 add.u64 %r1668,%r547,1; sub.u64 %r1851,%r1668,%r538; .loc 1 2886 16 add.u64 %r1853,%r1851,%r539; .loc 1 2887 13 ld.u8 %r2420,[%r1853]; mov.u16 %r1855,%r2420; ld.u8 %r1856,[%r548+1]; .loc 1 2886 11 cvt.u16.u32 %r1858,%r1845; mad.lo.u16 %r1859,%r1855,%r1856,%r1858; cvt.u32.u16 %r1860,%r1859; cvt.s32.s8 %r379,%r1860; .loc 1 2889 13 ld.u8 %r2420,[%r1668]; mov.u16 %r1862,%r2420; .loc 1 2888 11 cvt.u16.u32 %r1864,%r1846; mad.lo.u16 %r1865,%r1862,%r1856,%r1864; cvt.u32.u16 %r1866,%r1865; cvt.s32.s8 %r380,%r1866; .loc 1 2890 16 add.u64 %r1867,%r1851,%r537; .loc 1 2891 13 ld.u8 %r2420,[%r1867]; mov.u16 %r1869,%r2420; .loc 1 2890 11 cvt.u16.u32 %r1871,%r1847; mad.lo.u16 %r1872,%r1869,%r1856,%r1871; cvt.u32.u16 %r1873,%r1872; cvt.s32.s8 %r381,%r1873; .loc 1 2892 16 add.u64 %r1874,%r1851,%r536; .loc 1 2893 13 ld.u8 %r2420,[%r1874]; mov.u16 %r1876,%r2420; .loc 1 2892 11 cvt.u16.u32 %r1878,%r1848; mad.lo.u16 %r1879,%r1876,%r1856,%r1878; cvt.u32.u16 %r1880,%r1879; cvt.s32.s8 %r382,%r1880; .loc 1 2884 10 add.u64 %r548,%r548,2; add.u64 %r547,%r547,2; setp.ne.u64 %r1881,%r529,%r548; @ %r1881 bra $L78; $L77: .loc 1 2895 28 cvt.u32.u32 %r1422,%r379; st.u8 [%r533],%r1422; .loc 1 2896 32 cvt.u32.u32 %r1423,%r380; st.u8 [%r533+1],%r1423; .loc 1 2897 32 cvt.u32.u32 %r1424,%r381; st.u8 [%r533+2],%r1424; .loc 1 2898 32 cvt.u32.u32 %r1425,%r382; st.u8 [%r533+3],%r1425; .loc 1 2877 30 add.u64 %r378,%r378,4; .loc 1 2877 6 add.u64 %r539,%r539,1024; add.u64 %r538,%r538,1024; add.u64 %r537,%r537,1024; add.u64 %r536,%r536,1024; setp.le.s64 %r1426,%r378,%r616; @ %r1426 bra $L79; bra $L80; $L76: .loc 1 2874 39 add.u64 %r374,%r374,1; .loc 1 2874 9 add.u64 %r531,%r531,%r308; add.u64 %r529,%r529,%r308; add.u64 %r527,%r527,%r626; setp.ne.u64 %r1427,%r1633,%r374; @ %r1427 bra $L81; bra $L60; $L75: add.u64 %r556,%r527,%r615; .loc 1 2904 13 mov.u64 %r555,%r554; add.u64 
%r1577,%r527,%r614; $L84: .loc 1 2903 14 ld.s8 %r376,[%r556]; .loc 1 2905 10 @ %r1626 bra $L82; add.u64 %r406,%r1007,%r555; mov.u64 %r559,%r531; sub.u64 %r1655,%r529,%r531; and.b64 %r1656,%r1655,3; setp.eq.u64 %r1659,%r1656,0; @ %r1659 bra $L83; setp.eq.u64 %r1658,%r1656,1; @ %r1658 bra $L300; setp.eq.u64 %r1657,%r1656,2; @ %r1657 bra $L301; bra $L362; $L83: .loc 1 2908 13 ld.u8 %r2420,[%r406]; mov.u16 %r1433,%r2420; ld.u8 %r2420,[%r559]; mov.u16 %r1434,%r2420; .loc 1 2907 11 cvt.u16.u32 %r1439,%r376; mad.lo.u16 %r1437,%r1433,%r1434,%r1439; cvt.u32.u16 %r1440,%r1437; cvt.s32.s8 %r1882,%r1440; .loc 1 2908 13 ld.u8 %r2420,[%r406+1]; mov.u16 %r1887,%r2420; ld.u8 %r2420,[%r559+1]; mov.u16 %r1888,%r2420; .loc 1 2907 11 cvt.u16.u32 %r1890,%r1882; mad.lo.u16 %r1891,%r1887,%r1888,%r1890; cvt.u32.u16 %r1892,%r1891; cvt.s32.s8 %r1893,%r1892; .loc 1 2908 13 ld.u8 %r2420,[%r406+2]; mov.u16 %r1899,%r2420; ld.u8 %r2420,[%r559+2]; mov.u16 %r1900,%r2420; .loc 1 2907 11 cvt.u16.u32 %r1902,%r1893; mad.lo.u16 %r1903,%r1899,%r1900,%r1902; cvt.u32.u16 %r1904,%r1903; cvt.s32.s8 %r1905,%r1904; .loc 1 2908 13 ld.u8 %r2420,[%r406+3]; mov.u16 %r1911,%r2420; ld.u8 %r2420,[%r559+3]; mov.u16 %r1912,%r2420; .loc 1 2907 11 cvt.u16.u32 %r1914,%r1905; mad.lo.u16 %r1915,%r1911,%r1912,%r1914; cvt.u32.u16 %r1916,%r1915; cvt.s32.s8 %r376,%r1916; .loc 1 2905 10 add.u64 %r406,%r406,4; add.u64 %r559,%r559,4; setp.ne.u64 %r1917,%r529,%r559; @ %r1917 bra $L83; $L82: .loc 1 2910 28 cvt.u32.u32 %r1443,%r376; st.u8 [%r556],%r1443; .loc 1 2901 6 add.u64 %r556,%r556,1; add.u64 %r555,%r555,256; setp.ne.u64 %r1445,%r1577,%r556; @ %r1445 bra $L84; bra $L76; $L20: .loc 1 2958 11 cvt.u16.u32 %r1446,%r265; setp.eq.u16 %r1447,%r1446,1; @ ! %r1447 bra $L363; bra $L85; $L31: .loc 1 2920 38 set.u32.eq.u64 %r1449,%r307,1; neg.s32 %r1450,%r1449; .loc 1 2920 26 cvt.u16.u32 %r1452,%r1450; and.b16 %r1451,%r1452,%r964; .loc 1 2920 11 cvt.u32.u16 %r1454,%r1451; cvt.u16.u8 %r1455,%r1454; setp.eq.u16 %r1456,%r1455,0; @ %r1456 bra $L20; .loc 1 2920 43 setp.ne.u64 %r1457,%r625,1; @ %r1457 bra $L20; $L121: .loc 1 2922 10 cvt.u16.u32 %r1458,%r265; setp.ne.u16 %r1459,%r1458,1; @ %r1459 bra $L87; .loc 1 2948 4 setp.gt.s64 %r1460,%r314,0; @ %r1460 bra $L88; bra $L1; $L87: .loc 1 2929 4 setp.le.s64 %r1461,%r314,0; @ %r1461 bra $L1; mov.u64 %r649,%r356; mov.u64 %r651,%r355; .loc 1 2929 11 mov.u64 %r410,0; setp.gt.s64 %r1918,%r313,0; setp.le.s64 %r1643,%r266,0; .loc 1 2936 7 cvt.u32.u64 %r1644,%r410; $L91: .loc 1 2933 8 @ %r1918 bra $L90; $L95: .loc 1 2929 29 add.u64 %r410,%r410,1; .loc 1 2929 4 add.u64 %r649,%r649,%r626; add.u64 %r651,%r651,%r308; setp.ne.u64 %r1463,%r314,%r410; @ %r1463 bra $L91; bra $L1; $L90: add.u64 %r644,%r266,%r354; .loc 1 2933 8 mov.u64 %r641,%r649; mov.u64 %r642,0; add.u64 %r1596,%r313,%r649; $L94: .loc 1 2937 5 @ %r1643 bra $L141; add.u64 %r310,%r354,%r642; mov.u64 %r231,%r651; .loc 1 2936 7 mov.u32 %r411,%r1644; sub.u64 %r1705,%r644,%r310; and.b64 %r1706,%r1705,3; setp.eq.u64 %r1709,%r1706,0; @ %r1709 bra $L93; setp.eq.u64 %r1708,%r1706,1; @ %r1708 bra $L302; setp.eq.u64 %r1707,%r1706,2; @ %r1707 bra $L303; bra $L364; $L93: .loc 1 2938 23 ld.u8 %r2420,[%r310]; mov.u16 %r1468,%r2420; ld.u8 %r2420,[%r231]; mov.u16 %r1469,%r2420; .loc 1 2938 9 cvt.u16.u32 %r1474,%r411; mad.lo.u16 %r1472,%r1468,%r1469,%r1474; cvt.u32.u16 %r1475,%r1472; cvt.s32.s8 %r1919,%r1475; .loc 1 2938 23 ld.u8 %r2420,[%r310+1]; mov.u16 %r1924,%r2420; ld.u8 %r2420,[%r231+1]; mov.u16 %r1925,%r2420; .loc 1 2938 9 cvt.u16.u32 %r1927,%r1919; mad.lo.u16 
%r1928,%r1924,%r1925,%r1927; cvt.u32.u16 %r1929,%r1928; cvt.s32.s8 %r1930,%r1929; .loc 1 2938 23 ld.u8 %r2420,[%r310+2]; mov.u16 %r1936,%r2420; ld.u8 %r2420,[%r231+2]; mov.u16 %r1937,%r2420; .loc 1 2938 9 cvt.u16.u32 %r1939,%r1930; mad.lo.u16 %r1940,%r1936,%r1937,%r1939; cvt.u32.u16 %r1941,%r1940; cvt.s32.s8 %r1942,%r1941; .loc 1 2938 23 ld.u8 %r2420,[%r310+3]; mov.u16 %r1948,%r2420; ld.u8 %r2420,[%r231+3]; mov.u16 %r1949,%r2420; .loc 1 2938 9 cvt.u16.u32 %r1951,%r1942; mad.lo.u16 %r1952,%r1948,%r1949,%r1951; cvt.u32.u16 %r1953,%r1952; cvt.s32.s8 %r411,%r1953; .loc 1 2937 5 add.u64 %r310,%r310,4; add.u64 %r231,%r231,4; setp.ne.u64 %r1954,%r310,%r644; @ %r1954 bra $L93; bra $L92; $L141: .loc 1 2936 7 mov.u32 %r411,%r1644; $L92: .loc 1 2939 15 cvt.u32.u32 %r1478,%r411; st.u8 [%r641],%r1478; .loc 1 2933 8 add.u64 %r641,%r641,1; add.u64 %r642,%r642,%r377; add.u64 %r644,%r644,%r377; setp.ne.u64 %r1480,%r641,%r1596; @ %r1480 bra $L94; bra $L95; $L88: mov.u64 %r661,%r356; mov.u64 %r663,%r355; .loc 1 2948 11 mov.u64 %r408,0; setp.le.s64 %r1955,%r266,0; .loc 1 2951 10 cvt.u32.u64 %r1646,%r408; $L98: .loc 1 2952 8 @ %r1955 bra $L142; mov.u64 %r654,%r354; mov.u64 %r656,%r663; .loc 1 2951 10 mov.u32 %r409,%r1646; add.u64 %r1594,%r266,%r663; and.b64 %r1715,%r266,3; setp.eq.u64 %r1718,%r1715,0; @ %r1718 bra $L97; setp.eq.u64 %r1717,%r1715,1; @ %r1717 bra $L304; setp.eq.u64 %r1716,%r1715,2; @ %r1716 bra $L305; bra $L365; $L97: .loc 1 2953 26 ld.u8 %r2420,[%r654]; mov.u16 %r1485,%r2420; ld.u8 %r2420,[%r656]; mov.u16 %r1486,%r2420; .loc 1 2953 5 cvt.u16.u32 %r1491,%r409; mad.lo.u16 %r1489,%r1485,%r1486,%r1491; cvt.u32.u16 %r1492,%r1489; cvt.s32.s8 %r1956,%r1492; .loc 1 2952 8 add.u64 %r1957,%r654,%r377; .loc 1 2953 26 ld.u8 %r2420,[%r1957]; mov.u16 %r1961,%r2420; ld.u8 %r2420,[%r656+1]; mov.u16 %r1962,%r2420; .loc 1 2953 5 cvt.u16.u32 %r1964,%r1956; mad.lo.u16 %r1965,%r1961,%r1962,%r1964; cvt.u32.u16 %r1966,%r1965; cvt.s32.s8 %r1967,%r1966; .loc 1 2952 8 add.u64 %r1968,%r1957,%r377; .loc 1 2953 26 ld.u8 %r2420,[%r1968]; mov.u16 %r1973,%r2420; ld.u8 %r2420,[%r656+2]; mov.u16 %r1974,%r2420; .loc 1 2953 5 cvt.u16.u32 %r1976,%r1967; mad.lo.u16 %r1977,%r1973,%r1974,%r1976; cvt.u32.u16 %r1978,%r1977; cvt.s32.s8 %r1979,%r1978; .loc 1 2952 8 add.u64 %r1980,%r1968,%r377; .loc 1 2953 26 ld.u8 %r2420,[%r1980]; mov.u16 %r1985,%r2420; ld.u8 %r2420,[%r656+3]; mov.u16 %r1986,%r2420; .loc 1 2953 5 cvt.u16.u32 %r1988,%r1979; mad.lo.u16 %r1989,%r1985,%r1986,%r1988; cvt.u32.u16 %r1990,%r1989; cvt.s32.s8 %r409,%r1990; .loc 1 2952 8 add.u64 %r654,%r1980,%r377; add.u64 %r656,%r656,4; setp.ne.u64 %r1991,%r656,%r1594; @ %r1991 bra $L97; bra $L96; $L142: .loc 1 2951 10 mov.u32 %r409,%r1646; $L96: .loc 1 2954 25 cvt.u32.u32 %r1496,%r409; st.u8 [%r661],%r1496; .loc 1 2948 29 add.u64 %r408,%r408,1; .loc 1 2948 4 add.u64 %r661,%r661,%r626; add.u64 %r663,%r663,%r308; setp.ne.u64 %r1497,%r314,%r408; @ %r1497 bra $L98; bra $L1; $L85: .loc 1 2963 7 setp.le.s64 %r1498,%r314,0; @ %r1498 bra $L1; mov.u64 %r675,%r356; mov.u64 %r677,0; .loc 1 2963 14 mov.u64 %r422,%r677; setp.le.s64 %r1992,%r266,0; .loc 1 2966 6 cvt.u32.u64 %r1647,%r677; $L101: .loc 1 2967 4 @ %r1992 bra $L143; mov.u64 %r668,%r354; add.u64 %r670,%r355,%r677; .loc 1 2966 6 mov.u32 %r423,%r1647; .loc 1 2967 11 mov.u64 %r424,0; and.b64 %r1723,%r266,3; setp.eq.u64 %r1726,%r1723,0; @ %r1726 bra $L100; setp.eq.u64 %r1725,%r1723,1; @ %r1725 bra $L306; setp.eq.u64 %r1724,%r1723,2; @ %r1724 bra $L307; bra $L366; $L100: .loc 1 2968 29 ld.u8 %r2420,[%r668]; mov.u16 %r1503,%r2420; ld.u8 
%r2420,[%r670]; mov.u16 %r1504,%r2420; .loc 1 2968 8 cvt.u16.u32 %r1509,%r423; mad.lo.u16 %r1507,%r1503,%r1504,%r1509; cvt.u32.u16 %r1510,%r1507; cvt.s32.s8 %r1993,%r1510; .loc 1 2967 4 add.u64 %r1995,%r668,%r377; add.u64 %r1996,%r670,%r625; .loc 1 2968 29 ld.u8 %r2420,[%r1995]; mov.u16 %r1999,%r2420; ld.u8 %r2420,[%r1996]; mov.u16 %r2000,%r2420; .loc 1 2968 8 cvt.u16.u32 %r2002,%r1993; mad.lo.u16 %r2003,%r1999,%r2000,%r2002; cvt.u32.u16 %r2004,%r2003; cvt.s32.s8 %r2005,%r2004; .loc 1 2967 4 add.u64 %r2007,%r1995,%r377; add.u64 %r2008,%r1996,%r625; .loc 1 2968 29 ld.u8 %r2420,[%r2007]; mov.u16 %r2012,%r2420; ld.u8 %r2420,[%r2008]; mov.u16 %r2013,%r2420; .loc 1 2968 8 cvt.u16.u32 %r2015,%r2005; mad.lo.u16 %r2016,%r2012,%r2013,%r2015; cvt.u32.u16 %r2017,%r2016; cvt.s32.s8 %r2018,%r2017; .loc 1 2967 4 add.u64 %r2020,%r2007,%r377; add.u64 %r2021,%r2008,%r625; .loc 1 2968 29 ld.u8 %r2420,[%r2020]; mov.u16 %r2025,%r2420; ld.u8 %r2420,[%r2021]; mov.u16 %r2026,%r2420; .loc 1 2968 8 cvt.u16.u32 %r2028,%r2018; mad.lo.u16 %r2029,%r2025,%r2026,%r2028; cvt.u32.u16 %r2030,%r2029; cvt.s32.s8 %r423,%r2030; .loc 1 2967 28 add.u64 %r424,%r424,4; .loc 1 2967 4 add.u64 %r668,%r2020,%r377; add.u64 %r670,%r2021,%r625; setp.ne.u64 %r2031,%r266,%r424; @ %r2031 bra $L100; bra $L99; $L143: .loc 1 2966 6 mov.u32 %r423,%r1647; $L99: .loc 1 2969 21 cvt.u32.u32 %r1513,%r423; st.u8 [%r675],%r1513; .loc 1 2963 32 add.u64 %r422,%r422,1; .loc 1 2963 7 add.u64 %r675,%r675,%r336; add.u64 %r677,%r677,%r308; setp.ne.u64 %r1514,%r314,%r422; @ %r1514 bra $L101; bra $L1; $L363: .loc 1 2972 11 setp.gt.s64 %r1515,%r307,%r377; @ %r1515 bra $L102; .loc 1 2993 7 setp.gt.s64 %r1516,%r314,0; @ %r1516 bra $L103; bra $L1; $L102: .loc 1 2974 7 setp.gt.s64 %r1517,%r314,0; @ %r1517 bra $L104; bra $L1; $L108: add.u64 %r700,%r356,%r703; .loc 1 2975 9 mov.u64 %r420,0; and.b64 %r1739,%r313,3; setp.eq.u64 %r1742,%r1739,0; @ %r1742 bra $L105; setp.eq.u64 %r1741,%r1739,1; @ %r1741 bra $L308; setp.eq.u64 %r1740,%r1739,2; @ %r1740 bra $L309; bra $L367; $L105: .loc 1 2976 34 st.u8 [%r700],%r1648; .loc 1 2975 2 add.u64 %r2033,%r700,%r336; .loc 1 2976 34 st.u8 [%r2033],%r1648; .loc 1 2975 2 add.u64 %r2035,%r2033,%r336; .loc 1 2976 34 st.u8 [%r2035],%r1648; .loc 1 2975 2 add.u64 %r2038,%r2035,%r336; .loc 1 2976 34 st.u8 [%r2038],%r1648; .loc 1 2975 27 add.u64 %r420,%r420,4; .loc 1 2975 2 add.u64 %r700,%r2038,%r336; setp.ne.u64 %r2040,%r313,%r420; @ %r2040 bra $L105; $L109: .loc 1 2974 32 add.u64 %r419,%r435,1; .loc 1 2974 7 add.u64 %r703,%r703,%r626; setp.ne.u64 %r1520,%r314,%r419; @ %r1520 bra $L144; mov.u64 %r696,%r356; mov.u64 %r697,0; mov.u64 %r436,%r697; setp.le.s64 %r2041,%r266,0; bra $L107; $L104: mov.u64 %r703,0; .loc 1 2974 14 mov.u64 %r435,%r703; setp.gt.s64 %r2042,%r313,0; .loc 1 2976 34 cvt.u32.u64 %r1648,%r703; bra $L106; $L144: mov.u64 %r435,%r419; $L106: .loc 1 2975 2 @ %r2042 bra $L108; bra $L109; $L112: add.u64 %r684,%r354,%r690; .loc 1 2980 4 mov.u64 %r681,%r696; .loc 1 2980 11 mov.u64 %r418,0; and.b64 %r1731,%r313,3; setp.eq.u64 %r1734,%r1731,0; @ %r1734 bra $L110; setp.eq.u64 %r1733,%r1731,1; @ %r1733 bra $L310; setp.eq.u64 %r1732,%r1731,2; @ %r1732 bra $L311; bra $L368; $L110: .loc 1 2983 37 ld.u8 %r2420,[%r684]; mov.u16 %r1525,%r2420; ld.u8 %r2420,[%r692]; mov.u16 %r1526,%r2420; .loc 1 2982 36 ld.u8 %r2420,[%r681]; mov.u16 %r1531,%r2420; mad.lo.u16 %r1529,%r1525,%r1526,%r1531; cvt.u32.u16 %r1532,%r1529; st.u8 [%r681],%r1532; .loc 1 2980 4 add.u64 %r2044,%r681,%r336; add.u64 %r2045,%r684,%r377; .loc 1 2983 37 ld.u8 %r2420,[%r2045]; 
mov.u16 %r2048,%r2420; ld.u8 %r2420,[%r692]; mov.u16 %r2049,%r2420; .loc 1 2982 36 ld.u8 %r2420,[%r2044]; mov.u16 %r2052,%r2420; mad.lo.u16 %r2053,%r2048,%r2049,%r2052; cvt.u32.u16 %r2054,%r2053; st.u8 [%r2044],%r2054; .loc 1 2980 4 add.u64 %r2056,%r2044,%r336; add.u64 %r2057,%r2045,%r377; .loc 1 2983 37 ld.u8 %r2420,[%r2057]; mov.u16 %r2061,%r2420; ld.u8 %r2420,[%r692]; mov.u16 %r2062,%r2420; .loc 1 2982 36 ld.u8 %r2420,[%r2056]; mov.u16 %r2065,%r2420; mad.lo.u16 %r2066,%r2061,%r2062,%r2065; cvt.u32.u16 %r2067,%r2066; st.u8 [%r2056],%r2067; .loc 1 2980 4 add.u64 %r2069,%r2056,%r336; add.u64 %r2070,%r2057,%r377; .loc 1 2983 37 ld.u8 %r2420,[%r2070]; mov.u16 %r2074,%r2420; ld.u8 %r2420,[%r692]; mov.u16 %r2075,%r2420; .loc 1 2982 36 ld.u8 %r2420,[%r2069]; mov.u16 %r2078,%r2420; mad.lo.u16 %r2079,%r2074,%r2075,%r2078; cvt.u32.u16 %r2080,%r2079; st.u8 [%r2069],%r2080; .loc 1 2980 29 add.u64 %r418,%r418,4; .loc 1 2980 4 add.u64 %r681,%r2069,%r336; add.u64 %r684,%r2070,%r377; setp.ne.u64 %r2081,%r313,%r418; @ %r2081 bra $L110; $L113: .loc 1 2979 26 add.u64 %r417,%r417,1; .loc 1 2979 2 add.u64 %r690,%r690,%r307; add.u64 %r692,%r692,%r625; setp.eq.u64 %r1534,%r266,%r417; @ %r1534 bra $L111; $L114: .loc 1 2980 4 @ %r2042 bra $L112; bra $L113; $L111: .loc 1 2978 32 add.u64 %r416,%r436,1; .loc 1 2978 7 add.u64 %r696,%r696,%r626; add.u64 %r697,%r697,%r308; setp.eq.u64 %r1536,%r435,%r436; @ %r1536 bra $L1; mov.u64 %r436,%r416; $L107: .loc 1 2979 2 @ %r2041 bra $L111; add.u64 %r692,%r355,%r697; mov.u64 %r690,0; .loc 1 2979 9 mov.u64 %r417,%r690; bra $L114; $L103: mov.u64 %r721,%r355; .loc 1 2993 7 mov.u64 %r719,0; .loc 1 2993 14 mov.u64 %r412,%r719; setp.gt.s64 %r2082,%r313,0; setp.le.s64 %r1650,%r266,0; .loc 1 3000 10 cvt.u32.u64 %r1651,%r719; $L116: .loc 1 2997 4 @ %r2082 bra $L115; $L120: .loc 1 2993 32 add.u64 %r412,%r412,1; .loc 1 2993 7 add.u64 %r719,%r719,%r626; add.u64 %r721,%r721,%r308; setp.ne.u64 %r1539,%r314,%r412; @ %r1539 bra $L116; bra $L1; $L115: add.u64 %r713,%r356,%r719; .loc 1 2997 4 mov.u64 %r715,0; .loc 1 2997 11 mov.u64 %r413,%r715; $L119: .loc 1 3001 8 @ %r1650 bra $L145; add.u64 %r706,%r354,%r715; mov.u64 %r709,%r721; .loc 1 3000 10 mov.u32 %r414,%r1651; .loc 1 3001 15 mov.u64 %r415,0; and.b64 %r1747,%r266,3; setp.eq.u64 %r1750,%r1747,0; @ %r1750 bra $L118; setp.eq.u64 %r1749,%r1747,1; @ %r1749 bra $L312; setp.eq.u64 %r1748,%r1747,2; @ %r1748 bra $L313; bra $L369; $L118: .loc 1 3002 28 ld.u8 %r2420,[%r706]; mov.u16 %r1544,%r2420; ld.u8 %r2420,[%r709]; mov.u16 %r1545,%r2420; .loc 1 3002 5 cvt.u16.u32 %r1550,%r414; mad.lo.u16 %r1548,%r1544,%r1545,%r1550; cvt.u32.u16 %r1551,%r1548; cvt.s32.s8 %r2084,%r1551; .loc 1 3001 8 add.u64 %r2086,%r706,%r307; add.u64 %r2087,%r709,%r625; .loc 1 3002 28 ld.u8 %r2420,[%r2086]; mov.u16 %r2090,%r2420; ld.u8 %r2420,[%r2087]; mov.u16 %r2091,%r2420; .loc 1 3002 5 cvt.u16.u32 %r2093,%r2084; mad.lo.u16 %r2094,%r2090,%r2091,%r2093; cvt.u32.u16 %r2095,%r2094; cvt.s32.s8 %r2096,%r2095; .loc 1 3001 8 add.u64 %r2098,%r2086,%r307; add.u64 %r2099,%r2087,%r625; .loc 1 3002 28 ld.u8 %r2420,[%r2098]; mov.u16 %r2103,%r2420; ld.u8 %r2420,[%r2099]; mov.u16 %r2104,%r2420; .loc 1 3002 5 cvt.u16.u32 %r2106,%r2096; mad.lo.u16 %r2107,%r2103,%r2104,%r2106; cvt.u32.u16 %r2108,%r2107; cvt.s32.s8 %r2109,%r2108; .loc 1 3001 8 add.u64 %r2111,%r2098,%r307; add.u64 %r2112,%r2099,%r625; .loc 1 3002 28 ld.u8 %r2420,[%r2111]; mov.u16 %r2116,%r2420; ld.u8 %r2420,[%r2112]; mov.u16 %r2117,%r2420; .loc 1 3002 5 cvt.u16.u32 %r2119,%r2109; mad.lo.u16 %r2120,%r2116,%r2117,%r2119; 
cvt.u32.u16 %r2121,%r2120; cvt.s32.s8 %r414,%r2121; .loc 1 3001 32 add.u64 %r415,%r415,4; .loc 1 3001 8 add.u64 %r706,%r2111,%r307; add.u64 %r709,%r2112,%r625; setp.ne.u64 %r2122,%r266,%r415; @ %r2122 bra $L118; bra $L117; $L145: .loc 1 3000 10 mov.u32 %r414,%r1651; $L117: .loc 1 3003 27 cvt.u32.u32 %r1554,%r414; st.u8 [%r713],%r1554; .loc 1 2997 29 add.u64 %r413,%r413,1; .loc 1 2997 4 add.u64 %r713,%r713,%r336; add.u64 %r715,%r715,%r377; setp.ne.u64 %r1555,%r313,%r413; @ %r1555 bra $L119; bra $L120; $L32: .loc 1 2920 11 setp.eq.u64 %r1556,%r307,1; @ %r1556 bra $L121; bra $L20; $L2: .loc 1 2476 6 ld.u64 %r1557,[%r724]; setp.ne.u64 %r1558,%r1557,0; @ %r1558 bra $L122; .loc 1 2485 4 ld.u64 %r1560,[%r725+56]; add.u64 %r1559,%r1560,1; ld.u64 %r1561,[%r725+48]; sub.u64 %r630,%r1559,%r1561; .loc 1 2483 15 ld.s8 %r1563,[%r726+28]; cvt.u16.u32 %r1562,%r1563; setp.eq.u16 %r1564,%r1562,1; @ %r1564 bra $L123; bra $L124; $L126: .loc 1 2560 16 ld.u64 %r377,[%r725+40]; .loc 1 2548 27 mov.u64 %r336,%r626; bra $L125; $L14: ld.u64 %r626,[%r724+40]; .loc 1 2543 6 ld.s8 %r1566,[%r724+28]; cvt.u16.u32 %r1565,%r1566; setp.eq.u16 %r1567,%r1565,1; @ ! %r1567 bra $L127; bra $L126; $L11: .loc 1 2560 16 ld.u64 %r377,[%r725+40]; mov.u32 %r265,%r288; .loc 1 2548 27 mov.u64 %r626,%r336; .loc 1 2563 14 mov.u64 %r313,1; .loc 1 2561 16 mov.u64 %r307,%r313; bra $L12; $L5: .loc 1 2485 4 ld.u64 %r1569,[%r725+56]; add.u64 %r1568,%r1569,1; ld.u64 %r1570,[%r725+48]; sub.u64 %r630,%r1568,%r1570; bra $L124; $L128: .loc 1 2509 17 ld.u64 %r1572,[%r724+56]; add.u64 %r1571,%r1572,1; .loc 1 2509 15 ld.u64 %r1573,[%r724+48]; sub.u64 %r266,%r1571,%r1573; ld.s8 %r1591,[%r726+28]; bra $L9; $L122: .loc 1 2502 12 cvta.global.u64 %r1574,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r1575,[%r1574+36]; setp.ne.u32 %r1576,%r1575,0; @ ! 
%r1576 bra $L8; bra $L128; $L1$L362: .loc 1 2908 13 ld.u8 %r2420,[%r406]; mov.u16 %r2125,%r2420; ld.u8 %r2420,[%r531]; mov.u16 %r2126,%r2420; .loc 1 2907 11 cvt.u16.u32 %r2128,%r376; mad.lo.u16 %r2129,%r2125,%r2126,%r2128; cvt.u32.u16 %r2130,%r2129; cvt.s32.s8 %r376,%r2130; .loc 1 2905 10 add.u64 %r406,%r406,1; add.u64 %r559,%r531,1; $L301: .loc 1 2908 13 ld.u8 %r2420,[%r406]; mov.u16 %r2134,%r2420; ld.u8 %r2420,[%r559]; mov.u16 %r2135,%r2420; .loc 1 2907 11 cvt.u16.u32 %r2137,%r376; mad.lo.u16 %r2138,%r2134,%r2135,%r2137; cvt.u32.u16 %r2139,%r2138; cvt.s32.s8 %r376,%r2139; .loc 1 2905 10 add.u64 %r406,%r406,1; add.u64 %r559,%r559,1; $L300: .loc 1 2908 13 ld.u8 %r2420,[%r406]; mov.u16 %r2143,%r2420; ld.u8 %r2420,[%r559]; mov.u16 %r2144,%r2420; .loc 1 2907 11 cvt.u16.u32 %r2146,%r376; mad.lo.u16 %r2147,%r2143,%r2144,%r2146; cvt.u32.u16 %r2148,%r2147; cvt.s32.s8 %r376,%r2148; .loc 1 2905 10 add.u64 %r406,%r406,1; add.u64 %r559,%r559,1; setp.ne.u64 %r2149,%r529,%r559; @ %r2149 bra $L83; bra $L82; $L361: sub.u64 %r2150,%r547,%r538; .loc 1 2886 16 add.u64 %r2152,%r2150,%r539; .loc 1 2887 13 ld.u8 %r2420,[%r2152]; mov.u16 %r2154,%r2420; ld.u8 %r2155,[%r531]; .loc 1 2886 11 cvt.u16.u32 %r2157,%r379; mad.lo.u16 %r2158,%r2154,%r2155,%r2157; cvt.u32.u16 %r2159,%r2158; cvt.s32.s8 %r379,%r2159; .loc 1 2889 13 ld.u8 %r2420,[%r547]; mov.u16 %r2161,%r2420; .loc 1 2888 11 cvt.u16.u32 %r2163,%r380; mad.lo.u16 %r2164,%r2161,%r2155,%r2163; cvt.u32.u16 %r2165,%r2164; cvt.s32.s8 %r380,%r2165; .loc 1 2890 16 add.u64 %r2166,%r2150,%r537; .loc 1 2891 13 ld.u8 %r2420,[%r2166]; mov.u16 %r2168,%r2420; .loc 1 2890 11 cvt.u16.u32 %r2170,%r381; mad.lo.u16 %r2171,%r2168,%r2155,%r2170; cvt.u32.u16 %r2172,%r2171; cvt.s32.s8 %r381,%r2172; .loc 1 2892 16 add.u64 %r2173,%r2150,%r536; .loc 1 2893 13 ld.u8 %r2420,[%r2173]; mov.u16 %r2175,%r2420; .loc 1 2892 11 cvt.u16.u32 %r2177,%r382; mad.lo.u16 %r2178,%r2175,%r2155,%r2177; cvt.u32.u16 %r2179,%r2178; cvt.s32.s8 %r382,%r2179; .loc 1 2884 10 add.u64 %r548,%r531,1; add.u64 %r547,%r547,1; setp.ne.u64 %r2180,%r529,%r548; @ %r2180 bra $L78; bra $L77; $L360: .loc 1 2856 13 ld.u8 %r2420,[%r520]; mov.u16 %r2183,%r2420; ld.u8 %r2184,[%r522]; .loc 1 2855 11 cvt.u16.u32 %r2186,%r383; mad.lo.u16 %r2187,%r2183,%r2184,%r2186; cvt.u32.u16 %r2188,%r2187; cvt.s32.s8 %r383,%r2188; .loc 1 2858 16 add.u64 %r2189,%r463,%r520; add.u64 %r2190,%r2189,%r461; .loc 1 2858 13 ld.u8 %r2420,[%r2190]; mov.u16 %r2192,%r2420; .loc 1 2857 11 cvt.u16.u32 %r2194,%r384; mad.lo.u16 %r2195,%r2192,%r2184,%r2194; cvt.u32.u16 %r2196,%r2195; cvt.s32.s8 %r384,%r2196; .loc 1 2860 16 add.u64 %r2197,%r459,%r520; add.u64 %r2198,%r2197,%r461; .loc 1 2860 13 ld.u8 %r2420,[%r2198]; mov.u16 %r2200,%r2420; .loc 1 2859 11 cvt.u16.u32 %r2202,%r385; mad.lo.u16 %r2203,%r2200,%r2184,%r2202; cvt.u32.u16 %r2204,%r2203; cvt.s32.s8 %r385,%r2204; .loc 1 2862 16 add.u64 %r2205,%r458,%r520; add.u64 %r2206,%r2205,%r461; .loc 1 2862 13 ld.u8 %r2420,[%r2206]; mov.u16 %r2208,%r2420; .loc 1 2861 11 cvt.u16.u32 %r2210,%r386; mad.lo.u16 %r2211,%r2208,%r2184,%r2210; cvt.u32.u16 %r2212,%r2211; cvt.s32.s8 %r386,%r2212; .loc 1 2853 10 add.u64 %r522,%r522,1; add.u64 %r520,%r520,1; setp.ne.u64 %r2213,%r457,%r520; @ %r2213 bra $L70; bra $L69; $L359: .loc 1 2762 43 ld.u8 %r2214,[%r432]; st.u8 [%r431],%r2214; .loc 1 2760 9 add.u64 %r432,%r432,1; add.u64 %r431,%r431,256; $L299: .loc 1 2762 43 ld.u8 %r2216,[%r432]; st.u8 [%r431],%r2216; .loc 1 2760 9 add.u64 %r432,%r432,1; add.u64 %r431,%r431,256; $L298: .loc 1 2762 43 ld.u8 %r2218,[%r432]; st.u8 
[%r431],%r2218; .loc 1 2760 9 add.u64 %r432,%r432,1; add.u64 %r431,%r431,256; setp.ne.u64 %r2219,%r342,%r432; @ %r2219 bra $L58; bra $L48; $L358: .loc 1 2740 49 ld.u8 %r2220,[%r320]; st.u8 [%r340],%r2220; .loc 1 2743 7 add.u64 %r2221,%r112,%r320; add.u64 %r2222,%r2221,%r77; .loc 1 2742 49 ld.u8 %r2223,[%r2222]; st.u8 [%r340+1],%r2223; .loc 1 2744 49 ld.u8 %r2224,[%r320+1]; st.u8 [%r340+256],%r2224; .loc 1 2747 7 sub.u64 %r2225,%r320,%r317; add.u64 %r2226,%r2225,%r112; .loc 1 2746 49 ld.u8 %r2227,[%r2226+1]; st.u8 [%r340+257],%r2227; .loc 1 2738 9 add.u64 %r341,%r320,2; add.u64 %r340,%r340,512; $L297: .loc 1 2740 49 ld.u8 %r2229,[%r341]; st.u8 [%r340],%r2229; .loc 1 2743 7 add.u64 %r2230,%r112,%r341; add.u64 %r2231,%r2230,%r77; .loc 1 2742 49 ld.u8 %r2232,[%r2231]; st.u8 [%r340+1],%r2232; .loc 1 2744 49 ld.u8 %r2233,[%r341+1]; st.u8 [%r340+256],%r2233; .loc 1 2747 7 sub.u64 %r2234,%r341,%r317; add.u64 %r2235,%r2234,%r112; .loc 1 2746 49 ld.u8 %r2236,[%r2235+1]; st.u8 [%r340+257],%r2236; .loc 1 2738 9 add.u64 %r341,%r341,2; add.u64 %r340,%r340,512; $L296: .loc 1 2740 49 ld.u8 %r2238,[%r341]; st.u8 [%r340],%r2238; .loc 1 2743 7 add.u64 %r2239,%r112,%r341; add.u64 %r2240,%r2239,%r77; .loc 1 2742 49 ld.u8 %r2241,[%r2240]; st.u8 [%r340+1],%r2241; .loc 1 2744 49 ld.u8 %r2242,[%r341+1]; st.u8 [%r340+256],%r2242; .loc 1 2747 7 sub.u64 %r2243,%r341,%r317; add.u64 %r2244,%r2243,%r112; .loc 1 2746 49 ld.u8 %r2245,[%r2244+1]; st.u8 [%r340+257],%r2245; .loc 1 2738 9 add.u64 %r341,%r341,2; add.u64 %r340,%r340,512; setp.ne.u64 %r2246,%r323,%r341; @ %r2246 bra $L52; bra $L53; $L356: .loc 1 2688 2 @ ! %r1590 bra $L370; .loc 1 2689 22 mov.u32 %r224747313248,[%value_in]; } $L370: .loc 1 2687 24 mov.u64 %r407,2; .loc 1 2687 7 add.u64 %r141,%r141,%r626; $L295: .loc 1 2688 2 @ ! %r1590 bra $L371; .loc 1 2689 22 mov.u32 %r225050313251,[%value_in]; } $L371: .loc 1 2687 24 add.u64 %r407,%r407,1; .loc 1 2687 7 add.u64 %r141,%r141,%r626; $L294: .loc 1 2688 2 @ ! %r1590 bra $L372; .loc 1 2689 22 mov.u32 %r225353313254,[%value_in]; } $L372: .loc 1 2687 24 add.u64 %r407,%r407,1; .loc 1 2687 7 add.u64 %r141,%r141,%r626; setp.eq.u64 %r2255,%r345,%r407; @ ! %r2255 bra $L34; bra $L33; $L232: .loc 1 2689 22 mov.u32 %r225652256313257,[%value_in]; } $L357: .loc 1 2687 7 add.u64 %r2259,%r1753,%r626; .loc 1 2688 2 @ ! %r1590 bra $L373; .loc 1 2689 22 mov.u32 %r226122592261313262,[%value_in]; } $L373: .loc 1 2687 7 add.u64 %r2264,%r2259,%r626; .loc 1 2688 2 @ ! %r1590 bra $L374; .loc 1 2689 22 mov.u32 %r226622642266313267,[%value_in]; } $L374: .loc 1 2687 24 add.u64 %r407,%r1702,3; .loc 1 2687 7 add.u64 %r141,%r2264,%r626; setp.eq.u64 %r2268,%r345,%r407; @ ! 
%r2268 bra $L34; bra $L33; $L364: .loc 1 2938 23 ld.u8 %r2420,[%r310]; mov.u16 %r2271,%r2420; ld.u8 %r2420,[%r651]; mov.u16 %r2272,%r2420; mul.lo.u16 %r2273,%r2271,%r2272; .loc 1 2938 9 cvt.u32.u16 %r2276,%r2273; cvt.s32.s8 %r411,%r2276; .loc 1 2937 5 add.u64 %r310,%r310,1; add.u64 %r231,%r651,1; $L303: .loc 1 2938 23 ld.u8 %r2420,[%r310]; mov.u16 %r2280,%r2420; ld.u8 %r2420,[%r231]; mov.u16 %r2281,%r2420; .loc 1 2938 9 cvt.u16.u32 %r2283,%r411; mad.lo.u16 %r2284,%r2280,%r2281,%r2283; cvt.u32.u16 %r2285,%r2284; cvt.s32.s8 %r411,%r2285; .loc 1 2937 5 add.u64 %r310,%r310,1; add.u64 %r231,%r231,1; $L302: .loc 1 2938 23 ld.u8 %r2420,[%r310]; mov.u16 %r2289,%r2420; ld.u8 %r2420,[%r231]; mov.u16 %r2290,%r2420; .loc 1 2938 9 cvt.u16.u32 %r2292,%r411; mad.lo.u16 %r2293,%r2289,%r2290,%r2292; cvt.u32.u16 %r2294,%r2293; cvt.s32.s8 %r411,%r2294; .loc 1 2937 5 add.u64 %r310,%r310,1; add.u64 %r231,%r231,1; setp.ne.u64 %r2295,%r310,%r644; @ %r2295 bra $L93; bra $L92; $L365: .loc 1 2953 26 ld.u8 %r2420,[%r354]; mov.u16 %r2298,%r2420; ld.u8 %r2420,[%r663]; mov.u16 %r2299,%r2420; mul.lo.u16 %r2300,%r2298,%r2299; .loc 1 2953 5 cvt.u32.u16 %r2303,%r2300; cvt.s32.s8 %r409,%r2303; .loc 1 2952 8 add.u64 %r654,%r354,%r377; add.u64 %r656,%r663,1; $L305: .loc 1 2953 26 ld.u8 %r2420,[%r654]; mov.u16 %r2307,%r2420; ld.u8 %r2420,[%r656]; mov.u16 %r2308,%r2420; .loc 1 2953 5 cvt.u16.u32 %r2310,%r409; mad.lo.u16 %r2311,%r2307,%r2308,%r2310; cvt.u32.u16 %r2312,%r2311; cvt.s32.s8 %r409,%r2312; .loc 1 2952 8 add.u64 %r654,%r654,%r377; add.u64 %r656,%r656,1; $L304: .loc 1 2953 26 ld.u8 %r2420,[%r654]; mov.u16 %r2316,%r2420; ld.u8 %r2420,[%r656]; mov.u16 %r2317,%r2420; .loc 1 2953 5 cvt.u16.u32 %r2319,%r409; mad.lo.u16 %r2320,%r2316,%r2317,%r2319; cvt.u32.u16 %r2321,%r2320; cvt.s32.s8 %r409,%r2321; .loc 1 2952 8 add.u64 %r654,%r654,%r377; add.u64 %r656,%r656,1; setp.ne.u64 %r2322,%r656,%r1594; @ %r2322 bra $L97; bra $L96; $L366: .loc 1 2968 29 ld.u8 %r2420,[%r354]; mov.u16 %r2325,%r2420; ld.u8 %r2420,[%r670]; mov.u16 %r2326,%r2420; mul.lo.u16 %r2327,%r2325,%r2326; .loc 1 2968 8 cvt.u32.u16 %r2330,%r2327; cvt.s32.s8 %r423,%r2330; .loc 1 2967 28 mov.u64 %r424,1; .loc 1 2967 4 add.u64 %r668,%r354,%r377; add.u64 %r670,%r670,%r625; $L307: .loc 1 2968 29 ld.u8 %r2420,[%r668]; mov.u16 %r2334,%r2420; ld.u8 %r2420,[%r670]; mov.u16 %r2335,%r2420; .loc 1 2968 8 cvt.u16.u32 %r2337,%r423; mad.lo.u16 %r2338,%r2334,%r2335,%r2337; cvt.u32.u16 %r2339,%r2338; cvt.s32.s8 %r423,%r2339; .loc 1 2967 28 add.u64 %r424,%r424,1; .loc 1 2967 4 add.u64 %r668,%r668,%r377; add.u64 %r670,%r670,%r625; $L306: .loc 1 2968 29 ld.u8 %r2420,[%r668]; mov.u16 %r2343,%r2420; ld.u8 %r2420,[%r670]; mov.u16 %r2344,%r2420; .loc 1 2968 8 cvt.u16.u32 %r2346,%r423; mad.lo.u16 %r2347,%r2343,%r2344,%r2346; cvt.u32.u16 %r2348,%r2347; cvt.s32.s8 %r423,%r2348; .loc 1 2967 28 add.u64 %r424,%r424,1; .loc 1 2967 4 add.u64 %r668,%r668,%r377; add.u64 %r670,%r670,%r625; setp.ne.u64 %r2349,%r266,%r424; @ %r2349 bra $L100; bra $L99; $L368: .loc 1 2983 37 ld.u8 %r2420,[%r684]; mov.u16 %r2352,%r2420; ld.u8 %r2420,[%r692]; mov.u16 %r2353,%r2420; .loc 1 2982 36 ld.u8 %r2420,[%r696]; mov.u16 %r2356,%r2420; mad.lo.u16 %r2357,%r2352,%r2353,%r2356; cvt.u32.u16 %r2358,%r2357; st.u8 [%r696],%r2358; .loc 1 2980 29 mov.u64 %r418,1; .loc 1 2980 4 add.u64 %r681,%r696,%r336; add.u64 %r684,%r684,%r377; $L311: .loc 1 2983 37 ld.u8 %r2420,[%r684]; mov.u16 %r2362,%r2420; ld.u8 %r2420,[%r692]; mov.u16 %r2363,%r2420; .loc 1 2982 36 ld.u8 %r2420,[%r681]; mov.u16 %r2366,%r2420; mad.lo.u16 
%r2367,%r2362,%r2363,%r2366; cvt.u32.u16 %r2368,%r2367; st.u8 [%r681],%r2368; .loc 1 2980 29 add.u64 %r418,%r418,1; .loc 1 2980 4 add.u64 %r681,%r681,%r336; add.u64 %r684,%r684,%r377; $L310: .loc 1 2983 37 ld.u8 %r2420,[%r684]; mov.u16 %r2372,%r2420; ld.u8 %r2420,[%r692]; mov.u16 %r2373,%r2420; .loc 1 2982 36 ld.u8 %r2420,[%r681]; mov.u16 %r2376,%r2420; mad.lo.u16 %r2377,%r2372,%r2373,%r2376; cvt.u32.u16 %r2378,%r2377; st.u8 [%r681],%r2378; .loc 1 2980 29 add.u64 %r418,%r418,1; .loc 1 2980 4 add.u64 %r681,%r681,%r336; add.u64 %r684,%r684,%r377; setp.ne.u64 %r2379,%r313,%r418; @ %r2379 bra $L110; bra $L113; $L367: .loc 1 2976 34 st.u8 [%r700],%r1648; .loc 1 2975 27 mov.u64 %r420,1; .loc 1 2975 2 add.u64 %r700,%r700,%r336; $L309: .loc 1 2976 34 st.u8 [%r700],%r1648; .loc 1 2975 27 add.u64 %r420,%r420,1; .loc 1 2975 2 add.u64 %r700,%r700,%r336; $L308: .loc 1 2976 34 st.u8 [%r700],%r1648; .loc 1 2975 27 add.u64 %r420,%r420,1; .loc 1 2975 2 add.u64 %r700,%r700,%r336; setp.ne.u64 %r2382,%r313,%r420; @ %r2382 bra $L105; bra $L109; $L369: .loc 1 3002 28 ld.u8 %r2420,[%r706]; mov.u16 %r2385,%r2420; ld.u8 %r2420,[%r721]; mov.u16 %r2386,%r2420; mul.lo.u16 %r2387,%r2385,%r2386; .loc 1 3002 5 cvt.u32.u16 %r2390,%r2387; cvt.s32.s8 %r414,%r2390; .loc 1 3001 32 mov.u64 %r415,1; .loc 1 3001 8 add.u64 %r706,%r706,%r307; add.u64 %r709,%r721,%r625; $L313: .loc 1 3002 28 ld.u8 %r2420,[%r706]; mov.u16 %r2394,%r2420; ld.u8 %r2420,[%r709]; mov.u16 %r2395,%r2420; .loc 1 3002 5 cvt.u16.u32 %r2397,%r414; mad.lo.u16 %r2398,%r2394,%r2395,%r2397; cvt.u32.u16 %r2399,%r2398; cvt.s32.s8 %r414,%r2399; .loc 1 3001 32 add.u64 %r415,%r415,1; .loc 1 3001 8 add.u64 %r706,%r706,%r307; add.u64 %r709,%r709,%r625; $L312: .loc 1 3002 28 ld.u8 %r2420,[%r706]; mov.u16 %r2403,%r2420; ld.u8 %r2420,[%r709]; mov.u16 %r2404,%r2420; .loc 1 3002 5 cvt.u16.u32 %r2406,%r414; mad.lo.u16 %r2407,%r2403,%r2404,%r2406; cvt.u32.u16 %r2408,%r2407; cvt.s32.s8 %r414,%r2408; .loc 1 3001 32 add.u64 %r415,%r415,1; .loc 1 3001 8 add.u64 %r706,%r706,%r307; add.u64 %r709,%r709,%r625; setp.ne.u64 %r2409,%r266,%r415; @ %r2409 bra $L118; bra $L117; }
matmul_i2.o/_gfortran_matmul_i2
// BEGIN GLOBAL FUNCTION DECL: _gfortran_matmul_i2
.visible .func _gfortran_matmul_i2 (.param .u64 %in_ar0, .param .u64 %in_ar1, .param .u64 %in_ar2, .param .u32 %in_ar3, .param .u32 %in_ar4, .param .u64 %in_ar5);
.file 1 "../../../../libgfortran/generated/matmul_i2.c"
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options
.extern .global .align 8 .u64 _gfortrani_compile_options[6];
// BEGIN GLOBAL FUNCTION DECL: _gfortran_size0
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarray
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error
.const .align 1 .u8 __func__$0[10];
// const string data (__func__$0, $LC*): "matmul_i2", "GFC_DESCRIPTOR_RANK (a) == 2 || GFC_DESCRIPTOR_RANK (b) == 2", "Array bound mismatch for dimension 1 of array (%ld/%ld)", "Array bound mismatch for dimension 2 of array (%ld/%ld)", "Incorrect extent in argument B in MATMUL intrinsic in dimension 1: is %ld, should be %ld", "gemm != NULL"
.visible .func _gfortran_matmul_i2 (.param .u64 %in_ar0, .param .u64 %in_ar1, .param .u64 %in_ar2, .param .u32 %in_ar3, .param .u32 %in_ar4, .param .u64 %in_ar5)
{
.reg .u64 %r68; .reg .u64 %r75; .reg .u64 %r82; .reg .u32 %r83; .reg .u32 %r88; .reg .f32 %r99; .reg .u64 %r148; .reg .u64 %r239; .reg .u64 %r272; .reg .u64 %r330; .reg .u32 %r331; .reg .u64 %r353; .reg .u64 %r355; .reg .u64 %r358; .reg .u64 %r365; .reg .u64 %r366; .reg .u64 %r369; .reg .u64 %r408; .reg .u32 %r409; .reg .u64 %r410; .reg .u64 %r488; .reg .u64 %r491; .reg .u64 %r496; .reg .u64 %r498; .reg .u64 %r511; .reg .u64 %r518; .reg .u64 %r520; .reg .u64 %r521; .reg .u64 %r523; .reg .u64 %r525; .reg .u64 %r528; .reg .u64 %r532; .reg .u64 %r535; .reg .u64 %r536; .reg .u64 %r550; .reg .u64 %r551; .reg .u64 %r552; .reg .u64 %r553; .reg .u64 %r554; .reg .u64 %r560; .reg .u64 %r563; .reg .u64 %r565; .reg .u64 %r570; .reg .u64 %r571; .reg .u64 %r572; .reg .u64 %r576; .reg .u64 %r578; .reg .u64 %r581; .reg .u32 %r583; .reg .u64 %r593; .reg .u64 %r594; .reg .u64 %r624; .reg .u64 %r625; .reg .u64 %r627; .reg .u64 %r628; .reg .u64 %r631; .reg .u64 %r632; .reg .u64 %r637; .reg .u64 %r638; .reg .u64 %r639; .reg .u32 %r644; .reg .u64 %r645; .reg .u64 %r646; .reg .u64 %r650; .reg .u64 %r655; .reg .u64 %r660; .reg .u64 %r661; .reg .u64 %r663; .reg .u64 %r664; .reg .u64 %r679; .reg .u64 %r680; .reg .u64 %r681; .reg .u64 %r682; .reg .u64 %r683; .reg .u64 %r684; .reg .u64 %r685; .reg .u64 %r687; .reg .u64 %r694; .reg .u64 %r695; .reg .u64 %r696; .reg .u64 %r699; .reg .u64 %r700; .reg .u64 %r701; .reg .u64 %r706; .reg .u64 %r708; .reg .u64 %r709; .reg .u64 %r710; .reg .u64 %r711; .reg .u64 %r712; .reg .u64 %r715; .reg .u64 %r716; .reg .u64 %r717; .reg .u64 %r720; .reg .u64 %r721; .reg .u64 %r722; .reg .u64 %r724; .reg .u64 %r725; .reg .u64 %r732; .reg .u64 %r734; .reg .u64 %r738; .reg .u64 %r743; .reg .u64 %r744; .reg .u64 %r746; .reg .u64 %r747; .reg .u64 %r752; .reg .u64 %r754; .reg .u64 %r755; .reg .u64 %r763; .reg .u64 %r764; .reg .u64 %r766; .reg .u64 %r767; .reg .u64 %r769; .reg .u64 %r772; .reg .u64 %r774; .reg .u64 %r775; .reg .u64 %r780; .reg .u64 %r781; .reg .u64 %r783; .reg .u64 %r788; .reg .u64 %r790; .reg .u64 %r792; .reg .u64 %r796; .reg .u64 %r797; .reg .u64 %r801; .reg .u64 %r804; .reg .u64 %r806; .reg .u64 %r808; .reg .u64 %r812; .reg .u64 %r814; .reg .u64 %r816; .reg .u64 %r818; .reg .u64 %r821; .reg .u64 %r823; .reg .u64 %r824; .reg .u64 %r829; .reg .u64 %r831; .reg .u64 %r837; .reg .u64 %r838; .reg .u64 %r839; .reg .u64 %r840; .reg .u32 %r841; .reg .u32 %r842; .reg .u64 %r843; .reg .u16
%r844; .reg .pred %r845; .reg .u16 %r846; .reg .pred %r848; .reg .u64 %r849; .reg .u32 %r850; .reg .u64 %r851; .reg .u64 %r852; .reg .u64 %r853; .reg .pred %r854; .reg .pred %r856; .reg .u64 %r859; .reg .u64 %r861; .reg .u64 %r862; .reg .u64 %r863; .reg .u64 %r865; .reg .u64 %r866; .reg .u64 %r867; .reg .u64 %r868; .reg .u64 %r869; .reg .u64 %r872; .reg .u64 %r874; .reg .u64 %r875; .reg .u64 %r877; .reg .u64 %r879; .reg .u64 %r880; .reg .u64 %r882; .reg .u64 %r883; .reg .u64 %r884; .reg .u64 %r885; .reg .u64 %r886; .reg .u32 %r887; .reg .pred %r888; .reg .u64 %r889; .reg .u64 %r890; .reg .u64 %r891; .reg .pred %r893; .reg .u64 %r894; .reg .u64 %r895; .reg .u64 %r896; .reg .pred %r897; .reg .u64 %r898; .reg .u64 %r899; .reg .u64 %r900; .reg .u16 %r901; .reg .pred %r902; .reg .u64 %r903; .reg .u64 %r905; .reg .u64 %r906; .reg .u64 %r907; .reg .u16 %r908; .reg .pred %r910; .reg .pred %r911; .reg .u64 %r912; .reg .pred %r914; .reg .u64 %r915; .reg .u64 %r917; .reg .u64 %r918; .reg .u64 %r919; .reg .u64 %r920; .reg .u64 %r921; .reg .u64 %r922; .reg .pred %r923; .reg .u64 %r924; .reg .u64 %r926; .reg .u64 %r927; .reg .u64 %r928; .reg .u16 %r929; .reg .u32 %r930; .reg .pred %r931; .reg .u16 %r932; .reg .pred %r933; .reg .u64 %r934; .reg .u64 %r935; .reg .u64 %r936; .reg .u64 %r937; .reg .u64 %r938; .reg .u64 %r939; .reg .pred %r940; .reg .u32 %r942; .reg .u32 %r943; .reg .u32 %r945; .reg .u32 %r946; .reg .u16 %r947; .reg .u16 %r948; .reg .u16 %r949; .reg .u32 %r950; .reg .u16 %r951; .reg .pred %r952; .reg .u64 %r953; .reg .u16 %r955; .reg .pred %r956; .reg .u64 %r957; .reg .u64 %r958; .reg .u64 %r959; .reg .u32 %r960; .reg .u32 %r961; .reg .u32 %r962; .reg .u32 %r963; .reg .u32 %r964; .reg .u32 %r965; .reg .u32 %r967; .reg .u32 %r968; .reg .u16 %r969; .reg .u16 %r970; .reg .u16 %r971; .reg .u32 %r972; .reg .u16 %r973; .reg .pred %r974; .reg .u32 %r976; .reg .u32 %r977; .reg .u16 %r978; .reg .u16 %r979; .reg .u16 %r980; .reg .u32 %r981; .reg .u16 %r982; .reg .pred %r983; .reg .u32 %r985; .reg .u32 %r986; .reg .u32 %r988; .reg .u32 %r989; .reg .u16 %r990; .reg .u16 %r991; .reg .u16 %r992; .reg .u32 %r993; .reg .u16 %r994; .reg .pred %r995; .reg .f32 %r996; .reg .f32 %r997; .reg .f32 %r998; .reg .f32 %r999; .reg .f32 %r1000; .reg .f32 %r1001; .reg .f32 %r1002; .reg .pred %r1003; .reg .u16 %r1004; .reg .u16 %r1005; .reg .pred %r1006; .reg .pred %r1007; .reg .u32 %r1009; .reg .u32 %r1010; .reg .u32 %r1012; .reg .u32 %r1013; .reg .u16 %r1014; .reg .u16 %r1015; .reg .u16 %r1016; .reg .u32 %r1017; .reg .u16 %r1018; .reg .pred %r1019; .reg .u32 %r1021; .reg .u32 %r1022; .reg .u32 %r1024; .reg .u32 %r1025; .reg .u16 %r1026; .reg .u16 %r1027; .reg .u16 %r1028; .reg .u32 %r1029; .reg .u16 %r1030; .reg .pred %r1031; .reg .u32 %r1033; .reg .u32 %r1034; .reg .u32 %r1036; .reg .u32 %r1037; .reg .u16 %r1038; .reg .u16 %r1039; .reg .u16 %r1040; .reg .u32 %r1041; .reg .u16 %r1042; .reg .pred %r1043; .reg .pred %r1044; .reg .u64 %r1045; .reg .u32 %r1046; .reg .u64 %r1047; .reg .u64 %r1048; .reg .u32 %r1049; .reg .pred %r1050; .reg .pred %r1051; .reg .u32 %r1052; .reg .pred %r1053; .reg .pred %r1054; .reg .u32 %r1069; .reg .u64 %r1070; .reg .u64 %r1071; .reg .u64 %r1072; .reg .u64 %r1073; .reg .u64 %r1074; .reg .u64 %r1075; .reg .u64 %r1076; .reg .u16 %r1077; .reg .u16 %r1078; .reg .u16 %r1079; .reg .u32 %r1080; .reg .u16 %r1081; .reg .pred %r1082; .reg .pred %r1083; .reg .u16 %r1084; .reg .pred %r1085; .reg .u64 %r1086; .reg .pred %r1088; .reg .u32 %r1093; .reg .u64 %r1095; .reg .u32 %r1100; .reg .u32 %r1101; .reg 
.u32 %r1103; .reg .u32 %r1104; .reg .u16 %r1105; .reg .u16 %r1106; .reg .u16 %r1107; .reg .u32 %r1109; .reg .u32 %r1110; .reg .u16 %r1112; .reg .u16 %r1114; .reg .u32 %r1115; .reg .u16 %r1116; .reg .pred %r1117; .reg .pred %r1118; .reg .u64 %r1119; .reg .u64 %r1121; .reg .u64 %r1122; .reg .u64 %r1123; .reg .pred %r1125; .reg .u64 %r1127; .reg .u64 %r1128; .reg .u64 %r1129; .reg .u64 %r1130; .reg .u64 %r1132; .reg .u64 %r1133; .reg .u64 %r1137; .reg .u64 %r1139; .reg .u64 %r1140; .reg .u64 %r1143; .reg .u64 %r1146; .reg .u64 %r1148; .reg .u64 %r1149; .reg .u64 %r1150; .reg .u64 %r1151; .reg .u64 %r1152; .reg .pred %r1154; .reg .u64 %r1160; .reg .u64 %r1161; .reg .u64 %r1162; .reg .u64 %r1163; .reg .pred %r1165; .reg .u64 %r1166; .reg .u64 %r1167; .reg .u64 %r1168; .reg .u64 %r1169; .reg .u64 %r1171; .reg .u64 %r1172; .reg .u64 %r1173; .reg .u64 %r1174; .reg .u64 %r1175; .reg .u64 %r1176; .reg .u64 %r1177; .reg .u64 %r1178; .reg .u64 %r1179; .reg .u64 %r1183; .reg .u64 %r1184; .reg .u64 %r1185; .reg .u64 %r1186; .reg .u64 %r1189; .reg .u64 %r1190; .reg .u64 %r1191; .reg .u64 %r1192; .reg .u64 %r1194; .reg .u64 %r1196; .reg .u64 %r1201; .reg .u64 %r1204; .reg .u64 %r1210; .reg .u64 %r1211; .reg .u64 %r1213; .reg .u64 %r1214; .reg .u16 %r1215; .reg .u16 %r1216; .reg .u64 %r1217; .reg .u64 %r1218; .reg .u16 %r1219; .reg .u16 %r1220; .reg .u64 %r1226; .reg .u64 %r1227; .reg .u32 %r1228; .reg .u32 %r1229; .reg .u32 %r1230; .reg .u32 %r1231; .reg .pred %r1232; .reg .u64 %r1235; .reg .u64 %r1236; .reg .u64 %r1237; .reg .u64 %r1238; .reg .u64 %r1240; .reg .pred %r1241; .reg .u64 %r1242; .reg .u64 %r1243; .reg .u16 %r1244; .reg .u64 %r1247; .reg .u64 %r1248; .reg .u64 %r1249; .reg .u64 %r1250; .reg .u64 %r1251; .reg .u64 %r1252; .reg .u64 %r1253; .reg .u64 %r1254; .reg .u64 %r1257; .reg .u64 %r1258; .reg .u64 %r1259; .reg .u64 %r1260; .reg .u64 %r1266; .reg .u16 %r1268; .reg .u16 %r1269; .reg .u16 %r1270; .reg .u16 %r1271; .reg .u64 %r1273; .reg .u16 %r1276; .reg .u16 %r1277; .reg .u16 %r1278; .reg .u64 %r1280; .reg .u16 %r1283; .reg .u16 %r1284; .reg .u16 %r1285; .reg .u16 %r1290; .reg .u16 %r1291; .reg .u64 %r1293; .reg .u16 %r1296; .reg .u16 %r1297; .reg .u16 %r1298; .reg .u16 %r1303; .reg .u16 %r1304; .reg .u64 %r1306; .reg .u16 %r1309; .reg .u16 %r1310; .reg .u16 %r1311; .reg .u16 %r1316; .reg .u16 %r1317; .reg .u64 %r1319; .reg .u16 %r1322; .reg .u16 %r1323; .reg .u16 %r1324; .reg .u64 %r1326; .reg .u16 %r1329; .reg .u16 %r1330; .reg .u16 %r1331; .reg .u16 %r1336; .reg .u16 %r1337; .reg .u16 %r1342; .reg .u16 %r1343; .reg .u16 %r1348; .reg .u16 %r1349; .reg .u16 %r1354; .reg .u16 %r1355; .reg .u16 %r1360; .reg .u16 %r1361; .reg .u16 %r1366; .reg .u16 %r1367; .reg .pred %r1369; .reg .u16 %r1370; .reg .u16 %r1371; .reg .u16 %r1372; .reg .u16 %r1373; .reg .u16 %r1374; .reg .u16 %r1375; .reg .u16 %r1376; .reg .u16 %r1377; .reg .u16 %r1378; .reg .u16 %r1379; .reg .u16 %r1380; .reg .u16 %r1381; .reg .u16 %r1382; .reg .u16 %r1383; .reg .u16 %r1384; .reg .u16 %r1385; .reg .pred %r1386; .reg .pred %r1387; .reg .u64 %r1391; .reg .u16 %r1393; .reg .u16 %r1394; .reg .u16 %r1396; .reg .u16 %r1399; .reg .u16 %r1402; .reg .u16 %r1405; .reg .u16 %r1408; .reg .u16 %r1411; .reg .u16 %r1414; .reg .u16 %r1417; .reg .u16 %r1418; .reg .u16 %r1419; .reg .u16 %r1420; .reg .pred %r1421; .reg .pred %r1422; .reg .pred %r1423; .reg .u64 %r1424; .reg .u64 %r1425; .reg .u64 %r1426; .reg .u64 %r1430; .reg .u64 %r1434; .reg .u64 %r1435; .reg .u64 %r1438; .reg .u16 %r1440; .reg .u16 %r1441; .reg .u16 %r1443; .reg .u16 %r1446; 
.reg .u16 %r1449; .reg .u64 %r1451; .reg .u16 %r1453; .reg .u16 %r1456; .reg .u64 %r1458; .reg .u16 %r1460; .reg .u16 %r1463; .reg .u16 %r1466; .reg .u16 %r1467; .reg .u16 %r1468; .reg .u16 %r1469; .reg .pred %r1470; .reg .pred %r1471; .reg .u64 %r1473; .reg .u64 %r1474; .reg .u64 %r1475; .reg .u64 %r1477; .reg .u16 %r1479; .reg .u16 %r1480; .reg .u16 %r1482; .reg .u16 %r1485; .reg .pred %r1486; .reg .u16 %r1487; .reg .pred %r1488; .reg .u32 %r1490; .reg .u32 %r1491; .reg .u16 %r1492; .reg .u16 %r1493; .reg .u32 %r1495; .reg .u16 %r1496; .reg .pred %r1497; .reg .pred %r1498; .reg .u16 %r1499; .reg .pred %r1500; .reg .pred %r1501; .reg .pred %r1502; .reg .u64 %r1504; .reg .pred %r1506; .reg .u64 %r1507; .reg .u64 %r1508; .reg .u64 %r1509; .reg .u64 %r1510; .reg .u64 %r1512; .reg .u16 %r1514; .reg .u16 %r1515; .reg .u16 %r1517; .reg .u16 %r1520; .reg .pred %r1521; .reg .u64 %r1525; .reg .u64 %r1526; .reg .u64 %r1527; .reg .u16 %r1529; .reg .u16 %r1530; .reg .u16 %r1532; .reg .u16 %r1535; .reg .pred %r1536; .reg .pred %r1537; .reg .u64 %r1542; .reg .u16 %r1544; .reg .u16 %r1545; .reg .u16 %r1547; .reg .u16 %r1550; .reg .pred %r1551; .reg .pred %r1552; .reg .pred %r1553; .reg .pred %r1554; .reg .u64 %r1555; .reg .pred %r1558; .reg .u64 %r1563; .reg .u16 %r1565; .reg .u16 %r1566; .reg .u16 %r1567; .reg .u16 %r1568; .reg .pred %r1570; .reg .pred %r1572; .reg .u64 %r1574; .reg .u64 %r1575; .reg .pred %r1580; .reg .u64 %r1581; .reg .u64 %r1582; .reg .u64 %r1584; .reg .u16 %r1586; .reg .u16 %r1587; .reg .u16 %r1589; .reg .u16 %r1592; .reg .pred %r1593; .reg .pred %r1594; .reg .u64 %r1595; .reg .pred %r1596; .reg .u64 %r1597; .reg .u64 %r1598; .reg .u64 %r1599; .reg .u16 %r1600; .reg .u32 %r1601; .reg .pred %r1602; .reg .u16 %r1603; .reg .u32 %r1604; .reg .pred %r1605; .reg .u64 %r1606; .reg .u64 %r1607; .reg .u64 %r1608; .reg .u64 %r1609; .reg .u64 %r1610; .reg .u64 %r1611; .reg .u64 %r1612; .reg .u32 %r1613; .reg .pred %r1614; .reg .pred %r1615; .reg .pred %r1619; .reg .pred %r1621; .reg .pred %r1622; .reg .u64 %r1623; .reg .pred %r1624; .reg .pred %r1625; .reg .pred %r1626; .reg .pred %r1627; .reg .pred %r1629; .reg .pred %r1631; .reg .pred %r1632; .reg .u64 %r1633; .reg .u32 %r1634; .reg .u64 %r1641; .reg .u64 %r1642; .reg .u64 %r1643; .reg .u64 %r1644; .reg .u64 %r1649; .reg .u64 %r1650; .reg .u64 %r1651; .reg .u64 %r1652; .reg .u64 %r1653; .reg .u64 %r1655; .reg .u64 %r1656; .reg .u64 %r1657; .reg .u64 %r1658; .reg .u64 %r1659; .reg .pred %r1660; .reg .u64 %r1662; .reg .u64 %r1666; .reg .u64 %r1668; .reg .u64 %r1669; .reg .pred %r1670; .reg .u64 %r1671; .reg .u64 %r1672; .reg .pred %r1677; .reg .u32 %r1678; .reg .u32 %r1680; .reg .u32 %r1681; .reg .u16 %r1682; .reg .pred %r1684; .reg .u32 %r1685; .reg .u64 %r1687; .reg .u64 %r1688; .reg .u64 %r1689; .reg .u64 %r1690; .reg .u64 %r1691; .reg .pred %r1692; .reg .pred %r1693; .reg .pred %r1694; .reg .u64 %r1697; .reg .u64 %r1698; .reg .u64 %r1699; .reg .u64 %r1701; .reg .pred %r1702; .reg .u64 %r1704; .reg .u64 %r1705; .reg .u64 %r1706; .reg .u64 %r1707; .reg .u64 %r1709; .reg .pred %r1710; .reg .u64 %r1716; .reg .u64 %r1717; .reg .u64 %r1718; .reg .u64 %r1719; .reg .u64 %r1720; .reg .pred %r1721; .reg .pred %r1722; .reg .pred %r1723; .reg .u64 %r1726; .reg .u64 %r1727; .reg .u64 %r1728; .reg .u64 %r1729; .reg .u64 %r1730; .reg .pred %r1731; .reg .pred %r1732; .reg .pred %r1733; .reg .u64 %r1734; .reg .u64 %r1735; .reg .u64 %r1739; .reg .pred %r1740; .reg .pred %r1741; .reg .pred %r1742; .reg .u64 %r1743; .reg .u64 %r1744; .reg .u64 %r1745; .reg 
.u64 %r1746; .reg .u64 %r1747; .reg .u64 %r1748; .reg .pred %r1749; .reg .pred %r1750; .reg .pred %r1751; .reg .u64 %r1754; .reg .u64 %r1755; .reg .u64 %r1756; .reg .u64 %r1757; .reg .u64 %r1758; .reg .pred %r1759; .reg .pred %r1760; .reg .pred %r1761; .reg .u64 %r1766; .reg .pred %r1767; .reg .pred %r1768; .reg .pred %r1769; .reg .u64 %r1774; .reg .pred %r1775; .reg .pred %r1776; .reg .pred %r1777; .reg .u64 %r1782; .reg .pred %r1783; .reg .pred %r1784; .reg .pred %r1785; .reg .u64 %r1790; .reg .pred %r1791; .reg .pred %r1792; .reg .pred %r1793; .reg .u64 %r1796; .reg .u64 %r1797; .reg .u64 %r1799; .reg .u64 %r1800; .reg .u64 %r1803; .reg .u64 %r1804; .reg .u16 %r1805; .reg .u16 %r1806; .reg .u64 %r1807; .reg .u64 %r1808; .reg .u16 %r1809; .reg .u16 %r1810; .reg .u64 %r1811; .reg .u64 %r1812; .reg .u64 %r1814; .reg .u64 %r1815; .reg .u16 %r1816; .reg .u16 %r1817; .reg .u64 %r1818; .reg .u64 %r1819; .reg .u16 %r1820; .reg .u16 %r1821; .reg .u64 %r1822; .reg .u64 %r1823; .reg .u64 %r1825; .reg .u64 %r1826; .reg .u16 %r1827; .reg .u16 %r1828; .reg .u64 %r1829; .reg .u64 %r1830; .reg .u16 %r1831; .reg .u16 %r1832; .reg .pred %r1833; .reg .u16 %r1836; .reg .u16 %r1840; .reg .u16 %r1844; .reg .pred %r1845; .reg .u16 %r1856; .reg .u16 %r1857; .reg .u16 %r1859; .reg .u16 %r1860; .reg .u16 %r1861; .reg .u16 %r1863; .reg .u16 %r1864; .reg .u16 %r1865; .reg .u16 %r1867; .reg .u16 %r1868; .reg .u16 %r1869; .reg .u16 %r1871; .reg .u16 %r1872; .reg .pred %r1873; .reg .pred %r1874; .reg .u64 %r1881; .reg .u64 %r1883; .reg .u16 %r1884; .reg .u16 %r1885; .reg .u16 %r1887; .reg .u16 %r1888; .reg .u16 %r1889; .reg .u16 %r1891; .reg .u16 %r1892; .reg .u64 %r1893; .reg .u16 %r1894; .reg .u16 %r1896; .reg .u16 %r1897; .reg .u64 %r1898; .reg .u16 %r1899; .reg .u16 %r1901; .reg .u16 %r1902; .reg .pred %r1903; .reg .u16 %r1907; .reg .u16 %r1908; .reg .u16 %r1910; .reg .u16 %r1916; .reg .u16 %r1917; .reg .u16 %r1919; .reg .u16 %r1925; .reg .u16 %r1926; .reg .u16 %r1928; .reg .u16 %r1929; .reg .pred %r1930; .reg .pred %r1931; .reg .u16 %r1935; .reg .u16 %r1936; .reg .u16 %r1938; .reg .u16 %r1944; .reg .u16 %r1945; .reg .u16 %r1947; .reg .u16 %r1953; .reg .u16 %r1954; .reg .u16 %r1956; .reg .u16 %r1957; .reg .pred %r1958; .reg .pred %r1959; .reg .u64 %r1961; .reg .u16 %r1963; .reg .u16 %r1964; .reg .u16 %r1966; .reg .u64 %r1969; .reg .u16 %r1972; .reg .u16 %r1973; .reg .u16 %r1975; .reg .u64 %r1978; .reg .u16 %r1981; .reg .u16 %r1982; .reg .u16 %r1984; .reg .u16 %r1985; .reg .pred %r1986; .reg .pred %r1987; .reg .u64 %r1990; .reg .u64 %r1991; .reg .u16 %r1992; .reg .u16 %r1993; .reg .u16 %r1995; .reg .u64 %r1999; .reg .u64 %r2000; .reg .u16 %r2002; .reg .u16 %r2003; .reg .u16 %r2005; .reg .u64 %r2009; .reg .u64 %r2010; .reg .u16 %r2012; .reg .u16 %r2013; .reg .u16 %r2015; .reg .u16 %r2016; .reg .pred %r2017; .reg .u64 %r2019; .reg .u64 %r2021; .reg .u64 %r2024; .reg .pred %r2026; .reg .pred %r2027; .reg .pred %r2028; .reg .u64 %r2030; .reg .u64 %r2031; .reg .u16 %r2032; .reg .u16 %r2033; .reg .u16 %r2035; .reg .u16 %r2036; .reg .u64 %r2038; .reg .u64 %r2039; .reg .u16 %r2041; .reg .u16 %r2042; .reg .u16 %r2044; .reg .u16 %r2045; .reg .u64 %r2047; .reg .u64 %r2048; .reg .u16 %r2050; .reg .u16 %r2051; .reg .u16 %r2053; .reg .u16 %r2054; .reg .pred %r2055; .reg .pred %r2056; .reg .u64 %r2060; .reg .u64 %r2061; .reg .u16 %r2062; .reg .u16 %r2063; .reg .u16 %r2065; .reg .u64 %r2069; .reg .u64 %r2070; .reg .u16 %r2072; .reg .u16 %r2073; .reg .u16 %r2075; .reg .u64 %r2079; .reg .u64 %r2080; .reg .u16 %r2082; .reg .u16 
%r2083; .reg .u16 %r2085; .reg .u16 %r2086; .reg .pred %r2087; .reg .u16 %r2088; .reg .u16 %r2089; .reg .u16 %r2091; .reg .u16 %r2092; .reg .u16 %r2094; .reg .u16 %r2095; .reg .u16 %r2097; .reg .u16 %r2098; .reg .u16 %r2100; .reg .u16 %r2101; .reg .u16 %r2103; .reg .u16 %r2104; .reg .pred %r2105; .reg .u64 %r2106; .reg .u64 %r2108; .reg .u16 %r2109; .reg .u16 %r2110; .reg .u16 %r2112; .reg .u16 %r2113; .reg .u16 %r2114; .reg .u16 %r2116; .reg .u16 %r2117; .reg .u64 %r2118; .reg .u16 %r2119; .reg .u16 %r2121; .reg .u16 %r2122; .reg .u64 %r2123; .reg .u16 %r2124; .reg .u16 %r2126; .reg .u16 %r2127; .reg .pred %r2128; .reg .u16 %r2130; .reg .u16 %r2131; .reg .u16 %r2133; .reg .u16 %r2134; .reg .u16 %r2135; .reg .u16 %r2137; .reg .u16 %r2138; .reg .u16 %r2139; .reg .u16 %r2141; .reg .u16 %r2142; .reg .u16 %r2143; .reg .u16 %r2145; .reg .u16 %r2146; .reg .pred %r2147; .reg .u16 %r2148; .reg .u16 %r2150; .reg .u16 %r2152; .reg .pred %r2153; .reg .u64 %r2154; .reg .u64 %r2155; .reg .u16 %r2156; .reg .u16 %r2157; .reg .u64 %r2158; .reg .u64 %r2159; .reg .u16 %r2160; .reg .u16 %r2161; .reg .u64 %r2163; .reg .u64 %r2164; .reg .u16 %r2165; .reg .u16 %r2166; .reg .u64 %r2167; .reg .u64 %r2168; .reg .u16 %r2169; .reg .u16 %r2170; .reg .u64 %r2172; .reg .u64 %r2173; .reg .u16 %r2174; .reg .u16 %r2175; .reg .u64 %r2176; .reg .u64 %r2177; .reg .u16 %r2178; .reg .u16 %r2179; .reg .pred %r2180; .reg .u32 %r2181; .reg .u64 %r2182; .reg .u32 %r2184; .reg .u64 %r2185; .reg .u32 %r2187; .reg .u64 %r2188; .reg .pred %r2189; .reg .u32 %r2190; .reg .u64 %r2191; .reg .u64 %r2193; .reg .u32 %r2195; .reg .u64 %r2196; .reg .u64 %r2198; .reg .u32 %r2200; .reg .u64 %r2201; .reg .pred %r2202; .reg .u16 %r2203; .reg .u16 %r2204; .reg .u16 %r2205; .reg .u16 %r2209; .reg .u16 %r2210; .reg .u16 %r2212; .reg .u16 %r2213; .reg .u16 %r2215; .reg .u16 %r2216; .reg .u16 %r2218; .reg .u16 %r2219; .reg .pred %r2220; .reg .u16 %r2221; .reg .u16 %r2222; .reg .u16 %r2223; .reg .u16 %r2227; .reg .u16 %r2228; .reg .u16 %r2230; .reg .u16 %r2231; .reg .u16 %r2233; .reg .u16 %r2234; .reg .u16 %r2236; .reg .u16 %r2237; .reg .pred %r2238; .reg .u16 %r2239; .reg .u16 %r2240; .reg .u16 %r2241; .reg .u16 %r2245; .reg .u16 %r2246; .reg .u16 %r2248; .reg .u16 %r2249; .reg .u16 %r2251; .reg .u16 %r2252; .reg .u16 %r2254; .reg .u16 %r2255; .reg .pred %r2256; .reg .u16 %r2257; .reg .u16 %r2258; .reg .u16 %r2260; .reg .u16 %r2261; .reg .u16 %r2263; .reg .u16 %r2264; .reg .u16 %r2266; .reg .u16 %r2267; .reg .u16 %r2269; .reg .u16 %r2270; .reg .u16 %r2272; .reg .u16 %r2273; .reg .pred %r2274; .reg .pred %r2277; .reg .u16 %r2278; .reg .u16 %r2279; .reg .u16 %r2280; .reg .u16 %r2284; .reg .u16 %r2285; .reg .u16 %r2287; .reg .u16 %r2288; .reg .u16 %r2290; .reg .u16 %r2291; .reg .u16 %r2293; .reg .u16 %r2294; .reg .pred %r2295; mov.u64 %r838,%ar0; mov.u64 %r839,%ar1; mov.u64 %r840,%ar2; mov.u32 %r841,%ar3; mov.u32 %r842,%ar4; mov.u64 %r843,%ar5; .loc 1 2462 3 ld.s8 %r583,[%r839+28]; cvt.u16.u32 %r844,%r583; setp.eq.u16 %r845,%r844,2; @ %r845 bra $L2; ld.s8 %r1634,[%r840+28]; cvt.u16.u32 %r846,%r1634; setp.eq.u16 %r848,%r846,2; @ %r848 bra $L3; cvta.const.u64 %r852,$LC3; cvta.const.u64 %r851,__func__$0; mov.u32 %r850,2462; cvta.const.u64 %r848583: .loc 1 2476 6 ld.u64 %r853,[%r838]; setp.ne.u64 %r854,%r853,0; @ %r854 bra $L4; .loc 1 2478 10 setp.ne.u16 %r856,%r844,1; @ %r856 bra $L5; .loc 1 2480 4 st.u64 [%r838+48],%r853; ld.u64 %r859,[%r840+80]; ld.u64 %r861,[%r840+72]; sub.u64 %r862,%r859,%r861; st.u64 [%r838+56],%r862; mov.u64 %r863,1; st.u64 
[%r838+40],%r863; bra $L6; $L123: .loc 1 2485 4 st.u64 [%r838+48],%r1595; add.u64 %r865,%r650,-1; st.u64 [%r838+56],%r865; mov.u64 %r866,1; st.u64 [%r838+40],%r866; bra $L6; $L124: .loc 1 2490 4 mov.u64 %r867,0; st.u64 [%r838+48],%r867; add.u64 %r868,%r650,-1; st.u64 [%r838+56],%r868; mov.u64 %r869,1; st.u64 [%r838+40],%r869; .loc 1 2493 11 st.u64 [%r838+72],%r867; ld.u64 %r872,[%r840+80]; ld.u64 %r874,[%r840+72]; sub.u64 %r875,%r872,%r874; st.u64 [%r838+80],%r875; st.u64 [%r838+64],%r650; $L6: .loc 1 2499 1838; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r877,[%value_in]; } .loc 1 2499 4 mov.u64 %r879,879; call (%value_in),_gfortrani_xmallocarray880,[%value_in]; } .loc 1 2499 2 st.u64 [%r838],%r880; .loc 1 2500 24 mov.u64 %r882,0; st.u64 [%r838+8],%r882; .loc 1 2557 7 ld.s8 %r583,[%r839+28]; .loc 1 2564 15 ld.u64 %r884,[%r839+56]; add.u64 %r883,%r884,1; .loc 1 2564 13 ld.u64 %r885,[%r839+48]; sub.u64 %r272,%r883,%r885; bra $L7; $L4: .loc 1 2502 12 cvta.global.u64 %r886,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r887,[%r886+36]; setp.eq.u32 %r888,%r887,0; @ %r888 bra $L8; .loc 1 2509 17 ld.u64 %r890,[%r838+56]; add.u64 %r889,%r890,1; .loc 1 2509 15 ld.u64 %r891,[%r838+48]; sub.u64 %r272,%r889,%r891; .loc 1 2506 10 setp.ne.u16 %r893,%r844,1; @ %r893 bra $L9; .loc 1 2508 17 ld.u64 %r895,[%r840+80]; add.u64 %r894,%r895,1; .loc 1 2508 15 ld.u64 %r896,[%r840+72]; sub.u64 %r346,%r894,%r896; .loc 1 2510 7 setp.ne.u64 %r897,%r346,%r272; @ %r897 bra $L10; .loc 1 2564 15 ld.u64 %r899,[%r839+56]; add.u64 %r898,%r899,1; .loc 1 2564 13 ld.u64 %r900,[%r839+48]; sub.u64 %r272,%r898,%r900; .loc 1 2543 7 ld.s8 %r443,[%r838+28]; .loc 1 2548 27 ld.u64 %r712,[%r838+40]; .loc 1 2543 6 cvt.u16.u32 %r901,%r443; setp.eq.u16 %r902,%r901,1; @ %r902 bra $L11; .loc 1 2553 16 ld.u64 %r581,[%r838+64]; .loc 1 2560 16 ld.u64 %r646,[%r839+40]; .loc 1 2563 14 mov.u64 %r320,1; .loc 1 2561 16 mov.u64 %r313,%r320; bra $L12; $L10: .loc 1 2511 6 st.u64 [%stack+8],%r346; st.u64 [%stack],%r272; cvta.const.u64 %r903903_gfortran_runtime_error2485 4 ld.u64 %r906,[%r839+56]; add.u64 %r905,%r906,1; ld.u64 %r907,[%r839+48]; sub.u64 %r655,%r905,%r907; .loc 1 2515 15 cvt.u16.u32 %r908,%r1634; setp.ne.u16 %r910,%r908,1; @ %r910 bra $L13; .loc 1 2519 7 setp.eq.u64 %r911,%r655,%r272; @ %r911 bra $L14; .loc 1 2520 6 st.u64 [%stack+8],%r655; st.u64 [%stack],%r272; cvta.const.u64 %r912912_gfortran_runtime_error13: .loc 1 2528 7 setp.eq.u64 %r914,%r655,%r272; @ %r914 bra $L15; .loc 1 2529 6 st.u64 [%stack+8],%r655; st.u64 [%stack],%r272; cvta.const.u64 %r915915_gfortran_runtime_error15: .loc 1 2533 17 ld.u64 %r918,[%r840+80]; add.u64 %r917,%r918,1; .loc 1 2533 15 ld.u64 %r919,[%r840+72]; sub.u64 %r343,%r917,%r919; .loc 1 2534 17 ld.u64 %r921,[%r838+80]; add.u64 %r920,%r921,1; .loc 1 2534 15 ld.u64 %r922,[%r838+72]; sub.u64 %r344,%r920,%r922; .loc 1 2535 7 setp.eq.u64 %r923,%r343,%r344; @ %r923 bra $L14; .loc 1 2536 6 st.u64 [%stack+8],%r343; st.u64 [%stack],%r344; cvta.const.u64 %r924,$LC9stack; call _gfortran_runtime_error2564 15 ld.u64 %r927,[%r839+56]; add.u64 %r926,%r927,1; .loc 1 2564 13 ld.u64 %r928,[%r839+48]; sub.u64 %r272,%r926,%r928; $L7: .loc 1 2548 27 ld.u64 %r581,[%r838+40]; .loc 1 2543 6 ld.s8 %r930,[%r838+28]; cvt.u16.u32 %r929,%r930; setp.eq.u16 %r931,%r929,1; @ %r931 bra $L130; $L127: .loc 1 2553 16 mov.u64 %r712,%r581; ld.u64 %r581,[%r838+64]; bra $L16; $L130: .loc 1 2548 27 mov.u64 %r712,%r581; $L16: .loc 1 2560 16 ld.u64 %r646,[%r839+40]; .loc 1 2557 6 cvt.u16.u32 %r932,%r583; setp.eq.u16 
%r933,%r932,1; @ %r933 bra $L131; $L125: .loc 1 2569 16 ld.u64 %r313,[%r839+64]; .loc 1 2571 15 ld.u64 %r935,[%r839+80]; add.u64 %r934,%r935,1; .loc 1 2571 13 ld.u64 %r936,[%r839+72]; mov.u64 %r320,%r272; sub.u64 %r272,%r934,%r936; bra $L12; $L131: .loc 1 2563 14 mov.u64 %r320,1; .loc 1 2561 16 mov.u64 %r313,%r320; $L12: .loc 1 2575 16 ld.u64 %r938,[%r840+56]; add.u64 %r937,%r938,1; ld.u64 %r939,[%r840+48]; sub.u64 %r75,%r937,%r939; .loc 1 2575 6 setp.eq.u64 %r940,%r75,%r272; @ %r940 bra $L17; .loc 1 2577 10 set.u32.gt.s64 %r942,%r272,0; neg.s32 %r943,%r942; .loc 1 2577 21 set.u32.gt.s64 %r945,%r75,0; neg.s32 %r946,%r945; cvt.u16.u32 %r948,%r943; cvt.u16.u32 %r949,%r946; or.b16 %r947,%r948,%r949; cvt.u32.u16 %r950,%r947; cvt.u16.u8 %r951,%r950; setp.eq.u16 %r952,%r951,0; @ %r952 bra $L17; .loc 1 2578 2 st.u64 [%stack+8],%r2729_gfortran_runtime_error17: .loc 1 2583 7 ld.s8 %r83,[%r840+28]; .loc 1 2586 16 ld.u64 %r645,[%r840+40]; .loc 1 2583 6 cvt.u16.u32 %r955,%r83; setp.eq.u16 %r956,%r955,1; @ %r956 bra $L132; .loc 1 2597 16 ld.u64 %r314,[%r840+64]; .loc 1 2598 16 ld.u64 %r958,[%r840+80]; add.u64 %r957,%r958,1; .loc 1 2598 14 ld.u64 %r959,[%r840+72]; sub.u64 %r321,%r957,%r959; bra $L18; $L132: .loc 1 2592 14 mov.u64 %r321,1; .loc 1 2591 16 mov.u64 %r314,256; $L18: .loc 1 2601 9 ld.u64 %r349,[%r839]; .loc 1 2602 9 ld.u64 %r350,[%r840]; .loc 1 2603 8 ld.u64 %r351,[%r838]; .loc 1 2612 28 set.u32.eq.u64 %r961,%r712,1; neg.s32 %r962,%r961; cvt.u32.u32 %r960,%r962; cvt.u32.u8 %r88,%r960; .loc 1 2612 46 set.u32.eq.u64 %r964,%r646,1; neg.s32 %r965,%r964; cvt.u32.u32 %r963,%r965; cvt.u32.u8 %r644,%r963; .loc 1 2612 7 set.u32.ne.u32 %r967,%r841,0; neg.s32 %r968,%r967; .loc 1 2612 16 cvt.u16.u32 %r970,%r968; cvt.u16.u32 %r971,%r88; and.b16 %r969,%r970,%r971; .loc 1 2612 6 cvt.u32.u16 %r972,%r969; cvt.u16.u8 %r973,%r972; setp.eq.u16 %r974,%r973,0; @ %r974 bra $L19; .loc 1 2612 63 set.u32.eq.u64 %r976,%r313,1; neg.s32 %r977,%r976; .loc 1 2612 51 cvt.u16.u32 %r979,%r977; cvt.u16.u32 %r980,%r644; or.b16 %r978,%r979,%r980; .loc 1 2612 33 cvt.u32.u16 %r981,%r978; cvt.u16.u8 %r982,%r981; setp.eq.u16 %r983,%r982,0; @ %r983 bra $L20; .loc 1 2613 20 set.u32.eq.u64 %r985,%r645,1; neg.s32 %r986,%r985; .loc 1 2613 37 set.u32.eq.u64 %r988,%r314,1; neg.s32 %r989,%r988; .loc 1 2613 25 cvt.u16.u32 %r991,%r986; cvt.u16.u32 %r992,%r989; or.b16 %r990,%r991,%r992; .loc 1 2613 7 cvt.u32.u16 %r993,%r990; cvt.u16.u8 %r994,%r993; setp.eq.u16 %r995,%r994,0; @ %r995 bra $L20; .loc 1 2615 13 cvt.rn.f32.s32 %r99,%r842; .loc 1 2614 12 cvt.rn.f32.s64 %r996,%r320; .loc 1 2614 50 cvt.rn.f32.s64 %r997,%r272; .loc 1 2614 47 mul.f32 %r998,%r996,%r997; .loc 1 2614 31 cvt.rn.f32.s64 %r999,%r321; .loc 1 2614 47 mul.f32 %r1000,%r998,%r999; .loc 1 2615 13 mul.f32 %r1001,%r99,%r99; mul.f32 %r1002,%r1001,%r99; .loc 1 2614 7 setp.gt.f32 %r1003,%r1000,%r1002; @ ! 
%r1003 bra $L19; .loc 1 2617 17 cvt.u32.u64 %r102,%r320; st.u32 [%frame+20],%r102; .loc 1 2617 29 cvt.u32.u64 %r103,%r321; st.u32 [%frame+16],%r103; .loc 1 2617 41 cvt.u32.u64 %r104,%r272; st.u32 [%frame+12],%r104; .loc 1 2617 52 cvt.u32.u64 %r105,%r581; st.u32 [%frame+8],%r105; .loc 1 2618 27 mov.u16 %r1004,1; st.u16 [%frame+26],%r1004; .loc 1 2618 36 mov.u16 %r1005,0; st.u16 [%frame+24],%r1005; .loc 1 2619 17 setp.ne.u64 %r1006,%r646,1; @ %r1006 bra $L22; cvt.u32.u64 %r331,%r313; bra $L23; $L22: cvt.u32.u64 %r331,%r646; $L23: st.u32 [%frame+4],%r331; .loc 1 2620 3 setp.ne.u64 %r1007,%r645,1; @ %r1007 bra $L24; cvt.u32.u64 %r332,%r314; bra $L25; $L24: cvt.u32.u64 %r332,%r645; $L25: st.u32 [%frame],%r332; .loc 1 2622 15 set.u32.gt.s32 %r1009,%r331,0; neg.s32 %r1010,%r1009; .loc 1 2622 26 set.u32.gt.s32 %r1012,%r332,0; neg.s32 %r1013,%r1012; .loc 1 2622 19 cvt.u16.u32 %r1015,%r1010; cvt.u16.u32 %r1016,%r1013; and.b16 %r1014,%r1015,%r1016; .loc 1 2622 10 cvt.u32.u16 %r1017,%r1014; cvt.u16.u8 %r1018,%r1017; setp.eq.u16 %r1019,%r1018,0; @ %r1019 bra $L19; .loc 1 2622 37 set.u32.gt.s32 %r1021,%r105,0; neg.s32 %r1022,%r1021; .loc 1 2622 46 set.u32.gt.s32 %r1024,%r102,1; neg.s32 %r1025,%r1024; .loc 1 2622 41 cvt.u16.u32 %r1027,%r1022; cvt.u16.u32 %r1028,%r1025; and.b16 %r1026,%r1027,%r1028; cvt.u32.u16 %r1029,%r1026; cvt.u16.u8 %r1030,%r1029; setp.eq.u16 %r1031,%r1030,0; @ %r1031 bra $L19; .loc 1 2622 55 set.u32.gt.s32 %r1033,%r103,1; neg.s32 %r1034,%r1033; .loc 1 2622 64 set.u32.gt.s32 %r1036,%r104,1; neg.s32 %r1037,%r1036; .loc 1 2622 59 cvt.u16.u32 %r1039,%r1034; cvt.u16.u32 %r1040,%r1037; and.b16 %r1038,%r1039,%r1040; cvt.u32.u16 %r1041,%r1038; cvt.u16.u8 %r1042,%r1041; setp.eq.u16 %r1043,%r1042,0; @ %r1043 bra $L19; .loc 1 2624 4 setp.ne.u64 %r1044,%r843,0; @ %r1044 bra $L27; cvta.const.u64 %r1048,$LC8; cvta.const.u64 %r1047,__func__$0; mov.u32 %r1046,2624; cvta.const.u64 %r1045104561004827: .loc 1 2626 17 and.b32 %r1049,%r841,2; .loc 1 2626 7 setp.ne.u32 %r1050,%r1049,0; @ %r1050 bra $L133; .loc 1 2629 35 setp.eq.u64 %r1051,%r646,1; @ %r1051 bra $L134; cvta.const.u64 %r322,$LC2; bra $L28; $L133: .loc 1 2627 13 cvta.const.u64 %r322,$LC1; bra $L28; $L134: .loc 1 2629 35 cvta.const.u64 %r322,$LC0; $L28: .loc 1 2631 17 and.b32 %r1052,%r841,4; .loc 1 2631 7 setp.ne.u32 %r1053,%r1052,0; @ %r1053 bra $L135; .loc 1 2634 35 setp.eq.u64 %r1054,%r645,1; @ %r1054 bra $L136; cvta.const.u64 %r323,$LC2; bra $L29; $L135: .loc 1 2632 13 cvta.const.u64 %r323,$LC1; bra $L29; $L136: .loc 1 2634 35 cvta.const.u64 %r323,$LC0; $L29: .loc 1 2636 4 add.u64 %r1070,%frame,8; add.u64 %r1071,%frame,24; add.u64 %r1072,%frame,4; add.u64 %r1073,%frame,26; add.u64 %r1074,%frame,12; add.u64 %r1075,%frame,16; add.u64 %r1076,%frame,20; mov.u32 %r1069,1;,.param .u64 %in_ar3,.param .u64 %in_ar4,.param .u64 %in_ar5,.param .u64 %in_ar6,.param .u64 %in_ar7,.param .u64 %in_ar8,.param .u64 %in_ar9,.param .u64 %in_ar10,.param .u64 %in_ar11,.param .u64 %in_ar12,.param .u32 %in_ar13,10761010741073.param .u64 %out_arg8; st.param.u64 [%out_arg8],%r1072; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r350; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%frame; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r1071; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r351; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r1070; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r1069; .param .u32 %out_arg15; st.param.u32 [%out_arg15],%r1069; call 
%r843,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14,%out_arg15),$LCT0; } bra $L1; $L19: .loc 1 2643 21 cvt.u16.u32 %r1078,%r88; cvt.u16.u32 %r1079,%r644; and.b16 %r1077,%r1078,%r1079; .loc 1 2643 6 cvt.u32.u16 %r1080,%r1077; cvt.u16.u8 %r1081,%r1080; setp.eq.u16 %r1082,%r1081,0; @ %r1082 bra $L31; .loc 1 2643 38 setp.ne.u64 %r1083,%r645,1; @ %r1083 bra $L20; .loc 1 2644 7 cvt.u16.u32 %r1084,%r83; setp.eq.u16 %r1085,%r1084,1; @ %r1085 bra $L32; .loc 1 2677 16 add.u64 %r1086,%r581,1; .loc 1 2678 9 add.u64 %r122,%r1086,%r1086; sub.u64 %r353,%r351,%r122; .loc 1 2687 7 setp.le.s64 %r1088,%r321,0; @ %r1088 bra $L33; add.u64 %r711,%r122,-2; mov.u64 %r710,%r351; add.u64 %r715,%r321,1; .loc 1 2689 22 add.u64 %r578,%r320,%r320; .loc 1 2687 13 mov.u64 %r402,%r645; setp.gt.s64 %r1631,%r320,0; and.b64 %r1739,%r321,3; setp.eq.u64 %r1742,%r1739,0; @ %r1742 bra $L34; setp.eq.u64 %r1741,%r1739,1; @ %r1741 bra $L294; setp.eq.u64 %r1740,%r1739,2; @ %r1740 bra $L295; bra $L356; $L35: .loc 1 2689 22 mov.u32 %r100935781095,[%value_in]; } $L36: .loc 1 2687 24 add.u64 %r1743,%r402,1; .loc 1 2687 7 add.u64 %r1796,%r710,%r711; .loc 1 2688 2 @ %r1631 bra $L232; bra $L357; $L34: @ %r1631 bra $L35; bra $L36; $L33: .loc 1 2692 28 set.u32.eq.u64 %r1100,%r272,0; neg.s32 %r1101,%r1100; .loc 1 2692 13 set.u32.eq.u64 %r1103,%r320,0; neg.s32 %r1104,%r1103; .loc 1 2692 28 cvt.u16.u32 %r1106,%r1101; cvt.u16.u32 %r1107,%r1104; or.b16 %r1105,%r1106,%r1107; .loc 1 2692 23 set.u32.eq.u64 %r1109,%r321,0; neg.s32 %r1110,%r1109; .loc 1 2692 28 cvt.u16.u32 %r1114,%r1110; or.b16 %r1112,%r1105,%r1114; cvt.u32.u16 %r1115,%r1112; cvt.u16.u8 %r1116,%r1115; setp.eq.u16 %r1117,%r1116,0; @ ! %r1117 bra $L1; .loc 1 2697 10 setp.eq.u64 %r1118,%r313,1; selp.u64 %r330,%r581,%r313,%r1118; .loc 1 2702 21 shl.b64 %r1119,%r330,8; .loc 1 2702 14 add.u64 %r358,%r1119,%r314; .loc 1 2706 12 min.s64 %r1121,%r358,65536; add.u64 %r1122,%r1121,%r11211221123,[%value_in]; } .loc 1 2710 7 setp.gt.s64 %r1125,%r321,0; @ %r1125 bra $L39; $L42: .loc 1 2911232918 7 bra $L1; $L39: .loc 1 2680 16 add.u64 %r1127,%r313,1; .loc 1 2681 9 add.u64 %r1128,%r1127,%r1127; sub.u64 %r355,%r349,%r1128; .loc 1 2683 16 add.u64 %r1129,%r314,1; .loc 1 2684 9 add.u64 %r1130,%r1129,%r1129; shl.b64 %r687,%r314,9; shl.b64 %r689,%r314,10; shl.b64 %r699,%r314,2; sub.u64 %r690,%r350,%r1130; add.u64 %r688,%r699,%r690; shl.b64 %r693,%r581,9; add.u64 %r694,%r122,-2; shl.b64 %r695,%r581,2; neg.s64 %r1132,%r314; shl.b64 %r1133,%r1132,10; shl.b64 %r701,%r581,10; mov.u64 %r700,%r351; add.u64 %r1137,%r321,-1; and.b64 %r706,%r1137,-512; shl.b64 %r663,%r313,8; shl.b64 %r665,%r313,9; add.u64 %r666,%r1128,-2; neg.s64 %r1139,%r313; shl.b64 %r1140,%r1139,9; shl.b64 %r1143,%r1139,2; shl.b64 %r606,%r313,2; add.u64 %r709,%r1130,-2; shl.b64 %r459,%r314,3; shl.b64 %r454,%r581,3; shl.b64 %r1146,%r1132,3; shl.b64 %r696,%r1132,1; mov.u64 %r691,%r694; mov.u64 %r685,%r314; mov.u64 %r683,0; setp.gt.s64 %r1626,%r272,0; add.u64 %r1669,%r696,2; setp.gt.s64 %r1670,%r320,0; add.u64 %r1671,%r320,1; add.u64 %r1672,%r355,4; bra $L41; $L138: mov.u64 %r683,%r684; $L41: add.u64 %r708,%r683,1; .loc 1 2715 9 sub.u64 %r360,%r321,%r683; min.s64 %r1797,%r360,512; .loc 1 2716 24 shr.s64 %r1148,%r1797,63; shr.u64 %r1149,%r1148,62; add.u64 %r1150,%r1797,%r1149; and.b64 %r1151,%r1150,3; sub.u64 %r1152,%r1151,%r1149; .loc 1 2716 10 sub.u64 %r361,%r1797,%r1152; .loc 1 2718 4 @ %r1626 bra $L40; $L45: .loc 1 2710 7 add.u64 %r684,%r683,512; add.u64 %r685,%r685,%r687; add.u64 %r688,%r688,%r689; add.u64 
%r691,%r691,%r693; add.u64 %r695,%r695,%r693; add.u64 %r696,%r696,%r1133; add.u64 %r699,%r699,%r689; add.u64 %r700,%r700,%r701; setp.ne.u64 %r1154,%r683,%r706; @ %r1154 bra $L138; bra $L42; $L40: add.u64 %r660,%r1669,%r688; .loc 1 2768 13 add.u64 %r624,%r361,%r708; add.u64 %r440,%r699,%r709; add.u64 %r433,%r440,%r709; add.u64 %r679,%r688,%r709; add.u64 %r680,%r679,%r709; mul.lo.u64 %r541,%r624,%r314; mad.lo.u64 %r676,%r624,%r709,%r690; mul.lo.u64 %r535,%r624,%r581; mov.u64 %r667,%r1143; mov.u64 %r664,%r666; mov.u64 %r661,%r313; .loc 1 2718 12 mov.u64 %r364,%r645; add.u64 %r1623,%r272,1; .loc 1 2768 8 add.u64 %r1662,%r624,-1; sub.u64 %r1666,%r691,%r581; add.u64 %r1668,%r581,%r691; $L44: .loc 1 2723 13 sub.u64 %r362,%r1623,%r364; min.s64 %r1799,%r362,256; .loc 1 2724 28 shr.u64 %r1160,%r1799,63; add.u64 %r1161,%r1799,%r1160; and.b64 %r1162,%r1161,1; sub.u64 %r1163,%r1162,%r1160; .loc 1 2724 14 sub.u64 %r363,%r1799,%r1163; .loc 1 2727 8 @ %r1670 bra $L43; $L73: .loc 1 2718 30 add.u64 %r364,%r364,256; .loc 1 2718 4 add.u64 %r660,%r660,512; add.u64 %r661,%r661,%r663; add.u64 %r664,%r664,%r665; add.u64 %r667,%r667,%r1140; setp.ge.s64 %r1165,%r272,%r364; @ %r1165 bra $L44; bra $L45; $L43: .loc 1 2763 26 add.u64 %r1166,%r364,-1; add.u64 %r1167,%r1166,%r1799; .loc 1 2763 31 mul.lo.u64 %r421,%r1167,%r313; .loc 1 2734 13 add.u64 %r623,%r363,%r364; .loc 1 2734 8 add.u64 %r625,%r623,-1; add.u64 %r1168,%r421,1; add.u64 %r1169,%r1168,%r1168; add.u64 %r442,%r355,%r1169; add.u64 %r682,%r664,%r666; add.u64 %r681,%r666,%r667; shl.b64 %r1171,%r1799,1; add.u64 %r1172,%r1171,-2; add.u64 %r82,%r1123,%r1172; add.u64 %r1173,%r364,%r685; add.u64 %r1174,%r1173,%r1799; add.u64 %r1175,%r1174,%r1174; add.u64 %r458,%r1175,%r690; add.u64 %r1176,%r364,%r364; add.u64 %r329,%r1176,%r688; add.u64 %r327,%r1176,%r679; add.u64 %r325,%r1176,%r680; add.u64 %r677,%r676,%r1176; add.u64 %r1177,%r541,%r364; add.u64 %r1178,%r1177,%r1799; add.u64 %r1179,%r1178,%r1178; add.u64 %r539,%r1179,%r690; .loc 1 2768 8 mov.u64 %r68,%r700; .loc 1 2727 16 mov.u64 %r378,%r645; setp.le.s64 %r1615,%r364,%r625; setp.gt.s64 %r1625,%r1799,%r363; setp.ge.s64 %r1632,%r1662,%r708; setp.gt.s64 %r1624,%r1797,%r361; not.b64 %r1655,%r364; add.u64 %r1656,%r1655,%r623; and.b64 %r1657,%r1656,-2; .loc 1 2904 18 add.u64 %r1658,%r1799,%r364; .loc 1 2904 13 add.u64 %r1659,%r1658,-1; setp.gt.s64 %r1660,%r364,%r1659; $L72: .loc 1 2732 10 sub.u64 %r365,%r1671,%r378; min.s64 %r1800,%r365,256; .loc 1 2733 25 shr.s64 %r1633,%r1800,63; shr.u64 %r1183,%r1800,63; add.u64 %r1184,%r1800,%r1183; and.b64 %r1185,%r1184,1; sub.u64 %r1186,%r1185,%r1183; .loc 1 2733 11 sub.u64 %r366,%r1800,%r1186; .loc 1 2735 5 @ %r1615 bra $L46; $L55: .loc 1 2757 8 @ ! 
%r1625 bra $L48; bra $L47; $L46: .loc 1 2737 17 add.u64 %r627,%r366,%r378; .loc 1 2737 12 add.u64 %r628,%r627,-1; add.u64 %r1189,%r661,-1; add.u64 %r1190,%r1189,%r378; add.u64 %r1191,%r1190,%r1800; add.u64 %r1192,%r1191,%r1191; add.u64 %r609,%r355,%r1192; shl.b64 %r1194,%r1800,9; add.u64 %r1196,%r1194,-512; add.u64 %r602,%r1123,%r1196; add.u64 %r593,%r661,%r313; shl.b64 %r1201,%r1800,1; sub.u64 %r67,%r666,%r1201; mov.u64 %r594,%r667; mov.u64 %r595,%r664; mov.u64 %r607,0; setp.le.s64 %r1619,%r378,%r628; setp.gt.s64 %r1627,%r1800,%r366; add.u64 %r1649,%r67,2; not.b64 %r1650,%r378; add.u64 %r1651,%r1650,%r627; and.b64 %r1652,%r1651,-2; add.u64 %r1653,%r1652,%r378; bra $L54; $L139: mov.u64 %r607,%r608; $L54: .loc 1 2738 9 @ %r1619 bra $L49; $L53: .loc 1 2749 12 @ %r1627 bra $L50; bra $L51; $L49: add.u64 %r1204,%r607,%r607; add.u64 %r38,%r1123,%r1204; add.u64 %r637,%r1649,%r609; add.u64 %r1210,%r1653,%r593; add.u64 %r1211,%r1210,%r1210; add.u64 %r610,%r1211,%r1672; sub.u64 %r1727,%r610,%r637; add.u64 %r1728,%r1727,-4; shr.u64 %r1726,%r1728,2; add.u64 %r1729,%r1726,1; and.b64 %r1730,%r1729,3; setp.eq.u64 %r1733,%r1730,0; @ %r1733 bra $L52; setp.eq.u64 %r1732,%r1730,1; @ %r1732 bra $L296; setp.eq.u64 %r1731,%r1730,2; @ %r1731 bra $L297; bra $L358; $L52: .loc 1 2741 7 add.u64 %r1213,%r595,%r637; add.u64 %r1214,%r1213,%r594; .loc 1 2740 49 ld.u16 %r1215,[%r1214]; st.u16 [%r38],%r1215; .loc 1 2742 49 ld.u16 %r1216,[%r637]; st.u16 [%r38+2],%r1216; .loc 1 2745 7 add.u64 %r1217,%r594,%r637; add.u64 %r1218,%r1217,%r595; .loc 1 2744 49 ld.u16 %r1219,[%r1218+2]; st.u16 [%r38+512],%r1219; .loc 1 2746 49 ld.u16 %r1220,[%r637+2]; st.u16 [%r38+514],%r1220; .loc 1 2738 9 add.u64 %r1734,%r38,1024; add.u64 %r1735,%r637,4; .loc 1 2741 7 add.u64 %r1803,%r595,%r1735; add.u64 %r1804,%r1803,%r594; .loc 1 2740 49 ld.u16 %r1805,[%r1804]; st.u16 [%r1734],%r1805; .loc 1 2742 49 ld.u16 %r1806,[%r1735]; st.u16 [%r1734+2],%r1806; .loc 1 2745 7 add.u64 %r1807,%r594,%r1735; add.u64 %r1808,%r1807,%r595; .loc 1 2744 49 ld.u16 %r1809,[%r1808+2]; st.u16 [%r1734+512],%r1809; .loc 1 2746 49 ld.u16 %r1810,[%r1735+2]; st.u16 [%r1734+514],%r1810; .loc 1 2738 9 add.u64 %r1811,%r38,2048; add.u64 %r1812,%r637,8; .loc 1 2741 7 add.u64 %r1814,%r595,%r1812; add.u64 %r1815,%r1814,%r594; .loc 1 2740 49 ld.u16 %r1816,[%r1815]; st.u16 [%r1811],%r1816; .loc 1 2742 49 ld.u16 %r1817,[%r1812]; st.u16 [%r1811+2],%r1817; .loc 1 2745 7 add.u64 %r1818,%r594,%r1812; add.u64 %r1819,%r1818,%r595; .loc 1 2744 49 ld.u16 %r1820,[%r1819+2]; st.u16 [%r1811+512],%r1820; .loc 1 2746 49 ld.u16 %r1821,[%r1812+2]; st.u16 [%r1811+514],%r1821; .loc 1 2738 9 add.u64 %r1822,%r38,3072; add.u64 %r1823,%r637,12; .loc 1 2741 7 add.u64 %r1825,%r595,%r1823; add.u64 %r1826,%r1825,%r594; .loc 1 2740 49 ld.u16 %r1827,[%r1826]; st.u16 [%r1822],%r1827; .loc 1 2742 49 ld.u16 %r1828,[%r1823]; st.u16 [%r1822+2],%r1828; .loc 1 2745 7 add.u64 %r1829,%r594,%r1823; add.u64 %r1830,%r1829,%r595; .loc 1 2744 49 ld.u16 %r1831,[%r1830+2]; st.u16 [%r1822+512],%r1831; .loc 1 2746 49 ld.u16 %r1832,[%r1823+2]; st.u16 [%r1822+514],%r1832; .loc 1 2738 9 add.u64 %r38,%r38,4096; add.u64 %r637,%r637,16; setp.ne.u64 %r1833,%r610,%r637; @ %r1833 bra $L52; bra $L53; $L50: .loc 1 2754 10 add.u64 %r1226,%r609,%r681; add.u64 %r1227,%r1226,%r682; .loc 1 2751 41 ld.u16 %r1228,[%r1227]; shl.b32 %r1229,%r1228,16; ld.u16 %r1230,[%r609]; or.b32 %r1231,%r1230,%r1229; st.u32 [%r602],%r1231; $L51: .loc 1 2735 5 add.u64 %r608,%r607,2; add.u64 %r609,%r609,%r606; add.u64 %r602,%r602,4; add.u64 %r595,%r595,%r606; 
add.u64 %r594,%r594,%r1143; add.u64 %r593,%r593,%r666; setp.ne.u64 %r1232,%r1657,%r607; @ %r1232 bra $L139; bra $L55; $L48: .loc 1 2767 25 shr.u64 %r1235,%r1633,62; add.u64 %r1236,%r1800,%r1235; and.b64 %r1237,%r1236,3; sub.u64 %r1238,%r1237,%r1235; .loc 1 2767 11 sub.u64 %r369,%r1800,%r1238; .loc 1 2769 5 @ %r1632 bra $L56; bra $L57; $L47: .loc 1 2759 17 add.u64 %r148,%r1800,%r378; .loc 1 2759 12 add.u64 %r1240,%r148,-1; .loc 1 2760 9 setp.lt.s64 %r1241,%r1240,%r378; @ %r1241 bra $L48; add.u64 %r1242,%r148,%r421; add.u64 %r1243,%r1242,%r1242; add.u64 %r39,%r355,%r1243; mov.u64 %r100,%r82; mov.u64 %r120,%r442; sub.u64 %r1717,%r39,%r442; add.u64 %r1718,%r1717,-2; shr.u64 %r1716,%r1718,1; add.u64 %r1719,%r1716,1; and.b64 %r1720,%r1719,3; setp.eq.u64 %r1723,%r1720,0; @ %r1723 bra $L58; setp.eq.u64 %r1722,%r1720,1; @ %r1722 bra $L298; setp.eq.u64 %r1721,%r1720,2; @ %r1721 bra $L299; bra $L359; $L58: .loc 1 2762 43 ld.u16 %r1244,[%r120]; st.u16 [%r100],%r1244; ld.u16 %r1836,[%r120+2]; st.u16 [%r100+512],%r1836; ld.u16 %r1840,[%r120+4]; st.u16 [%r100+1024],%r1840; ld.u16 %r1844,[%r120+6]; st.u16 [%r100+1536],%r1844; .loc 1 2760 9 add.u64 %r120,%r120,8; add.u64 %r100,%r100,2048; setp.ne.u64 %r1845,%r39,%r120; @ %r1845 bra $L58; bra $L48; $L57: .loc 1 2871 8 @ ! %r1624 bra $L60; bra $L59; $L56: .loc 1 2876 14 add.u64 %r631,%r369,%r378; .loc 1 2876 9 add.u64 %r632,%r631,-1; add.u64 %r452,%r68,%r694; add.u64 %r451,%r452,%r694; add.u64 %r449,%r451,%r694; not.b64 %r1247,%r378; add.u64 %r1248,%r1247,%r631; shr.u64 %r1249,%r1248,2; add.u64 %r1250,%r1249,1; shl.b64 %r465,%r1250,3; add.u64 %r1251,%r1666,%r631; add.u64 %r1252,%r1251,%r1251; add.u64 %r424,%r1252,%r353; add.u64 %r1253,%r631,%r691; add.u64 %r1254,%r1253,%r1253; add.u64 %r420,%r1254,%r353; add.u64 %r1257,%r1668,%r631; add.u64 %r1258,%r1257,%r1257; add.u64 %r336,%r1258,%r353; add.u64 %r1259,%r631,%r695; add.u64 %r1260,%r1259,%r1259; add.u64 %r333,%r1260,%r353; shl.b64 %r506,%r369,8; add.u64 %r429,%r1800,%r1800; mov.u64 %r434,%r433; mov.u64 %r444,%r440; mov.u64 %r445,%r699; mov.u64 %r447,%r696; mov.u64 %r455,%r68; mov.u64 %r460,%r458; mov.u64 %r383,%r708; mov.u64 %r431,0; mov.u64 %r432,%r431; setp.le.s64 %r1621,%r378,%r632; setp.gt.s64 %r1629,%r1800,%r369; $L68: .loc 1 2772 9 @ %r1621 bra $L61; $L67: .loc 1 2843 12 @ ! 
%r1629 bra $L63; bra $L62; $L61: .loc 1 2904 13 mov.u64 %r478,0; mov.u64 %r479,2048; mov.u64 %r480,1536; mov.u64 %r481,1024; mov.u64 %r482,-512; $L66: add.u64 %r476,%r455,%r478; .loc 1 2774 10 ld.s16 %r384,[%r476]; .loc 1 2775 10 ld.s16 %r385,[%r476+2]; add.u64 %r474,%r452,%r478; .loc 1 2776 10 ld.s16 %r386,[%r474]; .loc 1 2777 10 ld.s16 %r387,[%r474+2]; add.u64 %r472,%r451,%r478; .loc 1 2778 10 ld.s16 %r388,[%r472]; .loc 1 2779 10 ld.s16 %r389,[%r472+2]; add.u64 %r470,%r449,%r478; .loc 1 2780 10 ld.s16 %r390,[%r470]; .loc 1 2781 10 ld.s16 %r391,[%r470+2]; .loc 1 2782 10 ld.s16 %r392,[%r476+4]; .loc 1 2783 10 ld.s16 %r393,[%r476+6]; .loc 1 2784 10 ld.s16 %r394,[%r474+4]; .loc 1 2785 10 ld.s16 %r395,[%r474+6]; .loc 1 2786 10 ld.s16 %r396,[%r472+4]; .loc 1 2787 10 ld.s16 %r397,[%r472+6]; .loc 1 2788 10 ld.s16 %r398,[%r470+4]; .loc 1 2789 10 ld.s16 %r399,[%r470+6]; .loc 1 2791 6 @ %r1660 bra $L64; shl.b64 %r1266,%r478,8; add.u64 %r498,%r1123,%r1266; add.u64 %r496,%r431,%r660; $L65: .loc 1 2794 11 ld.u16 %r1268,[%r498]; ld.u16 %r1269,[%r496]; .loc 1 2793 14 cvt.u16.u32 %r1271,%r384; mad.lo.u16 %r1270,%r1268,%r1269,%r1271; cvt.s32.s16 %r384,%r1270; add.u64 %r491,%r482,%r498; .loc 1 2795 19 add.u64 %r1273,%r491,%r481; .loc 1 2796 11 ld.u16 %r1276,[%r1273]; .loc 1 2795 14 cvt.u16.u32 %r1278,%r385; mad.lo.u16 %r1277,%r1269,%r1276,%r1278; cvt.s32.s16 %r385,%r1277; add.u64 %r488,%r447,%r496; .loc 1 2798 14 add.u64 %r1280,%r488,%r445; .loc 1 2798 11 ld.u16 %r1283,[%r1280]; .loc 1 2797 14 cvt.u16.u32 %r1285,%r386; mad.lo.u16 %r1284,%r1268,%r1283,%r1285; cvt.s32.s16 %r386,%r1284; .loc 1 2799 14 cvt.u16.u32 %r1291,%r387; mad.lo.u16 %r1290,%r1276,%r1283,%r1291; cvt.s32.s16 %r387,%r1290; .loc 1 2802 14 add.u64 %r1293,%r488,%r444; .loc 1 2802 11 ld.u16 %r1296,[%r1293]; .loc 1 2801 14 cvt.u16.u32 %r1298,%r388; mad.lo.u16 %r1297,%r1268,%r1296,%r1298; cvt.s32.s16 %r388,%r1297; .loc 1 2803 14 cvt.u16.u32 %r1304,%r389; mad.lo.u16 %r1303,%r1276,%r1296,%r1304; cvt.s32.s16 %r389,%r1303; .loc 1 2806 14 add.u64 %r1306,%r488,%r434; .loc 1 2806 11 ld.u16 %r1309,[%r1306]; .loc 1 2805 14 cvt.u16.u32 %r1311,%r390; mad.lo.u16 %r1310,%r1268,%r1309,%r1311; cvt.s32.s16 %r390,%r1310; .loc 1 2807 14 cvt.u16.u32 %r1317,%r391; mad.lo.u16 %r1316,%r1276,%r1309,%r1317; cvt.s32.s16 %r391,%r1316; .loc 1 2809 19 add.u64 %r1319,%r491,%r480; .loc 1 2810 11 ld.u16 %r1322,[%r1319]; .loc 1 2809 14 cvt.u16.u32 %r1324,%r392; mad.lo.u16 %r1323,%r1269,%r1322,%r1324; cvt.s32.s16 %r392,%r1323; .loc 1 2811 19 add.u64 %r1326,%r491,%r479; .loc 1 2812 11 ld.u16 %r1329,[%r1326]; .loc 1 2811 14 cvt.u16.u32 %r1331,%r393; mad.lo.u16 %r1330,%r1269,%r1329,%r1331; cvt.s32.s16 %r393,%r1330; .loc 1 2813 14 cvt.u16.u32 %r1337,%r394; mad.lo.u16 %r1336,%r1283,%r1322,%r1337; cvt.s32.s16 %r394,%r1336; .loc 1 2815 14 cvt.u16.u32 %r1343,%r395; mad.lo.u16 %r1342,%r1283,%r1329,%r1343; cvt.s32.s16 %r395,%r1342; .loc 1 2817 14 cvt.u16.u32 %r1349,%r396; mad.lo.u16 %r1348,%r1296,%r1322,%r1349; cvt.s32.s16 %r396,%r1348; .loc 1 2819 14 cvt.u16.u32 %r1355,%r397; mad.lo.u16 %r1354,%r1296,%r1329,%r1355; cvt.s32.s16 %r397,%r1354; .loc 1 2821 14 cvt.u16.u32 %r1361,%r398; mad.lo.u16 %r1360,%r1309,%r1322,%r1361; cvt.s32.s16 %r398,%r1360; .loc 1 2823 14 cvt.u16.u32 %r1367,%r399; mad.lo.u16 %r1366,%r1309,%r1329,%r1367; cvt.s32.s16 %r399,%r1366; .loc 1 2791 6 add.u64 %r498,%r498,2; add.u64 %r496,%r496,2; setp.ne.u64 %r1369,%r460,%r496; @ %r1369 bra $L65; $L64: .loc 1 2826 24 cvt.u16.u32 %r1370,%r384; st.u16 [%r476],%r1370; .loc 1 2827 28 cvt.u16.u32 %r1371,%r385; st.u16 
[%r476+2],%r1371; .loc 1 2828 30 cvt.u16.u32 %r1372,%r386; st.u16 [%r474],%r1372; .loc 1 2829 34 cvt.u16.u32 %r1373,%r387; st.u16 [%r474+2],%r1373; .loc 1 2830 30 cvt.u16.u32 %r1374,%r388; st.u16 [%r472],%r1374; .loc 1 2831 34 cvt.u16.u32 %r1375,%r389; st.u16 [%r472+2],%r1375; .loc 1 2832 30 cvt.u16.u32 %r1376,%r390; st.u16 [%r470],%r1376; .loc 1 2833 34 cvt.u16.u32 %r1377,%r391; st.u16 [%r470+2],%r1377; .loc 1 2834 28 cvt.u16.u32 %r1378,%r392; st.u16 [%r476+4],%r1378; .loc 1 2835 28 cvt.u16.u32 %r1379,%r393; st.u16 [%r476+6],%r1379; .loc 1 2836 34 cvt.u16.u32 %r1380,%r394; st.u16 [%r474+4],%r1380; .loc 1 2837 34 cvt.u16.u32 %r1381,%r395; st.u16 [%r474+6],%r1381; .loc 1 2838 34 cvt.u16.u32 %r1382,%r396; st.u16 [%r472+4],%r1382; .loc 1 2839 34 cvt.u16.u32 %r1383,%r397; st.u16 [%r472+6],%r1383; .loc 1 2840 34 cvt.u16.u32 %r1384,%r398; st.u16 [%r470+4],%r1384; .loc 1 2841 34 cvt.u16.u32 %r1385,%r399; st.u16 [%r470+6],%r1385; .loc 1 2772 9 add.u64 %r482,%r482,-2048; add.u64 %r481,%r481,2048; add.u64 %r480,%r480,2048; add.u64 %r479,%r479,2048; add.u64 %r478,%r478,8; setp.ne.u64 %r1386,%r465,%r478; @ %r1386 bra $L66; bra $L67; $L63: .loc 1 2769 29 add.u64 %r383,%r383,4; .loc 1 2769 5 add.u64 %r460,%r460,%r459; add.u64 %r455,%r455,%r454; add.u64 %r452,%r452,%r454; add.u64 %r451,%r451,%r454; add.u64 %r449,%r449,%r454; add.u64 %r447,%r447,%r1146; add.u64 %r445,%r445,%r459; add.u64 %r444,%r444,%r459; add.u64 %r434,%r434,%r459; add.u64 %r432,%r432,%r454; add.u64 %r431,%r431,%r459; setp.le.s64 %r1387,%r383,%r1662; @ %r1387 bra $L68; bra $L57; $L62: add.u64 %r511,%r424,%r432; add.u64 %r510,%r420,%r432; add.u64 %r509,%r336,%r432; add.u64 %r508,%r333,%r432; add.u64 %r328,%r329,%r431; add.u64 %r326,%r327,%r431; add.u64 %r324,%r325,%r431; add.u64 %r428,%r429,%r455; .loc 1 2904 13 mov.u64 %r507,%r506; $L71: .loc 1 2848 14 ld.s16 %r379,[%r511]; .loc 1 2849 14 ld.s16 %r380,[%r510]; .loc 1 2850 14 ld.s16 %r381,[%r509]; .loc 1 2851 14 ld.s16 %r382,[%r508]; .loc 1 2853 10 @ %r1660 bra $L69; add.u64 %r1391,%r507,%r507; add.u64 %r525,%r1123,%r1391; add.u64 %r523,%r431,%r660; mov.u64 %r518,%r324; mov.u64 %r520,%r326; mov.u64 %r521,%r328; sub.u64 %r1706,%r460,%r523; add.u64 %r1707,%r1706,-2; shr.u64 %r1705,%r1707,1; and.b64 %r1709,%r1705,1; setp.ne.u64 %r1710,%r1709,0; @ ! 
%r1710 bra $L360; $L70: .loc 1 2856 13 ld.u16 %r1393,[%r523]; ld.u16 %r1394,[%r525]; .loc 1 2855 11 cvt.u16.u32 %r1396,%r379; .loc 1 2858 13 ld.u16 %r1399,[%r521]; .loc 1 2857 11 cvt.u16.u32 %r1402,%r380; .loc 1 2860 13 ld.u16 %r1405,[%r520]; .loc 1 2859 11 cvt.u16.u32 %r1408,%r381; .loc 1 2862 13 ld.u16 %r1411,[%r518]; .loc 1 2861 11 cvt.u16.u32 %r1414,%r382; .loc 1 2856 13 ld.u16 %r1856,[%r523+2]; ld.u16 %r1857,[%r525+2]; .loc 1 2855 11 mad.lo.u16 %r1859,%r1393,%r1394,%r1396; mad.lo.u16 %r1860,%r1856,%r1857,%r1859; cvt.s32.s16 %r379,%r1860; .loc 1 2858 13 ld.u16 %r1861,[%r521+2]; .loc 1 2857 11 mad.lo.u16 %r1863,%r1399,%r1394,%r1402; mad.lo.u16 %r1864,%r1861,%r1857,%r1863; cvt.s32.s16 %r380,%r1864; .loc 1 2860 13 ld.u16 %r1865,[%r520+2]; .loc 1 2859 11 mad.lo.u16 %r1867,%r1405,%r1394,%r1408; mad.lo.u16 %r1868,%r1865,%r1857,%r1867; cvt.s32.s16 %r381,%r1868; .loc 1 2862 13 ld.u16 %r1869,[%r518+2]; .loc 1 2861 11 mad.lo.u16 %r1871,%r1411,%r1394,%r1414; mad.lo.u16 %r1872,%r1869,%r1857,%r1871; cvt.s32.s16 %r382,%r1872; .loc 1 2853 10 add.u64 %r525,%r525,4; add.u64 %r523,%r523,4; add.u64 %r521,%r521,4; add.u64 %r520,%r520,4; add.u64 %r518,%r518,4; setp.ne.u64 %r1873,%r460,%r523; @ %r1873 bra $L70; $L69: .loc 1 2864 28 cvt.u16.u32 %r1417,%r379; st.u16 [%r511],%r1417; .loc 1 2865 34 cvt.u16.u32 %r1418,%r380; st.u16 [%r510],%r1418; .loc 1 2866 34 cvt.u16.u32 %r1419,%r381; st.u16 [%r509],%r1419; .loc 1 2867 34 cvt.u16.u32 %r1420,%r382; st.u16 [%r508],%r1420; .loc 1 2846 6 add.u64 %r511,%r511,2; add.u64 %r510,%r510,2; add.u64 %r509,%r509,2; add.u64 %r508,%r508,2; add.u64 %r507,%r507,256; setp.ne.u64 %r1421,%r428,%r511; @ %r1421 bra $L71; bra $L63; $L60: .loc 1 2727 34 add.u64 %r378,%r378,256; .loc 1 2727 8 add.u64 %r68,%r68,512; add.u64 %r442,%r442,512; setp.ge.s64 %r1422,%r320,%r378; @ %r1422 bra $L72; bra $L73; $L59: .loc 1 2873 17 add.u64 %r239,%r1797,%r708; .loc 1 2874 9 setp.le.s64 %r1423,%r239,%r624; @ %r1423 bra $L60; .loc 1 2876 14 add.u64 %r638,%r369,%r378; .loc 1 2876 9 add.u64 %r639,%r638,-1; add.u64 %r536,%r378,%r535; add.u64 %r1424,%r378,%r378; sub.u64 %r1425,%r353,%r1424; add.u64 %r1426,%r638,%r638; add.u64 %r528,%r1425,%r1426; shl.b64 %r570,%r369,8; mov.u64 %r540,%r539; mov.u64 %r542,%r677; mov.u64 %r372,%r624; setp.gt.s64 %r1874,%r1800,%r369; setp.le.s64 %r1622,%r378,%r639; not.b64 %r1641,%r378; add.u64 %r1642,%r1641,%r638; and.b64 %r1643,%r1642,-4; add.u64 %r1644,%r353,8; $L81: .loc 1 2877 6 @ %r1622 bra $L74; $L80: .loc 1 2901 6 @ %r1874 bra $L75; bra $L76; $L74: add.u64 %r1430,%r536,%r536; add.u64 %r554,%r353,%r1430; add.u64 %r1434,%r1643,%r536; add.u64 %r1435,%r1434,%r1434; add.u64 %r544,%r1435,%r1644; .loc 1 2904 13 mov.u64 %r550,2048; mov.u64 %r551,1536; mov.u64 %r552,512; mov.u64 %r553,-1024; $L79: .loc 1 2879 14 ld.s16 %r374,[%r554]; .loc 1 2880 14 ld.s16 %r375,[%r554+2]; .loc 1 2881 14 ld.s16 %r376,[%r554+4]; .loc 1 2882 14 ld.s16 %r377,[%r554+6]; .loc 1 2884 10 @ %r1660 bra $L77; add.u64 %r563,%r1123,%r552; mov.u64 %r565,%r542; sub.u64 %r1698,%r540,%r542; add.u64 %r1699,%r1698,-2; shr.u64 %r1697,%r1699,1; and.b64 %r1701,%r1697,1; setp.ne.u64 %r1702,%r1701,0; @ ! 
%r1702 bra $L361; $L78: add.u64 %r560,%r553,%r563; .loc 1 2886 16 add.u64 %r1438,%r560,%r552; .loc 1 2887 13 ld.u16 %r1440,[%r1438]; ld.u16 %r1441,[%r565]; .loc 1 2886 11 cvt.u16.u32 %r1443,%r374; .loc 1 2889 13 ld.u16 %r1446,[%r563]; .loc 1 2888 11 cvt.u16.u32 %r1449,%r375; .loc 1 2890 16 add.u64 %r1451,%r560,%r551; .loc 1 2891 13 ld.u16 %r1453,[%r1451]; .loc 1 2890 11 cvt.u16.u32 %r1456,%r376; .loc 1 2892 16 add.u64 %r1458,%r560,%r550; .loc 1 2893 13 ld.u16 %r1460,[%r1458]; .loc 1 2892 11 cvt.u16.u32 %r1463,%r377; .loc 1 2884 10 add.u64 %r1704,%r563,2; add.u64 %r1881,%r553,%r1704; .loc 1 2886 16 add.u64 %r1883,%r1881,%r552; .loc 1 2887 13 ld.u16 %r1884,[%r1883]; ld.u16 %r1885,[%r565+2]; .loc 1 2886 11 mad.lo.u16 %r1887,%r1440,%r1441,%r1443; mad.lo.u16 %r1888,%r1884,%r1885,%r1887; cvt.s32.s16 %r374,%r1888; .loc 1 2889 13 ld.u16 %r1889,[%r1704]; .loc 1 2888 11 mad.lo.u16 %r1891,%r1446,%r1441,%r1449; mad.lo.u16 %r1892,%r1889,%r1885,%r1891; cvt.s32.s16 %r375,%r1892; .loc 1 2890 16 add.u64 %r1893,%r1881,%r551; .loc 1 2891 13 ld.u16 %r1894,[%r1893]; .loc 1 2890 11 mad.lo.u16 %r1896,%r1453,%r1441,%r1456; mad.lo.u16 %r1897,%r1894,%r1885,%r1896; cvt.s32.s16 %r376,%r1897; .loc 1 2892 16 add.u64 %r1898,%r1881,%r550; .loc 1 2893 13 ld.u16 %r1899,[%r1898]; .loc 1 2892 11 mad.lo.u16 %r1901,%r1460,%r1441,%r1463; mad.lo.u16 %r1902,%r1899,%r1885,%r1901; cvt.s32.s16 %r377,%r1902; .loc 1 2884 10 add.u64 %r565,%r565,4; add.u64 %r563,%r563,4; setp.ne.u64 %r1903,%r540,%r565; @ %r1903 bra $L78; $L77: .loc 1 2895 28 cvt.u16.u32 %r1466,%r374; st.u16 [%r554],%r1466; .loc 1 2896 32 cvt.u16.u32 %r1467,%r375; st.u16 [%r554+2],%r1467; .loc 1 2897 32 cvt.u16.u32 %r1468,%r376; st.u16 [%r554+4],%r1468; .loc 1 2898 32 cvt.u16.u32 %r1469,%r377; st.u16 [%r554+6],%r1469; .loc 1 2877 6 add.u64 %r554,%r554,8; add.u64 %r553,%r553,-2048; add.u64 %r552,%r552,2048; add.u64 %r551,%r551,2048; add.u64 %r550,%r550,2048; setp.ne.u64 %r1470,%r544,%r554; @ %r1470 bra $L79; bra $L80; $L76: .loc 1 2874 39 add.u64 %r372,%r372,1; .loc 1 2874 9 add.u64 %r542,%r542,%r709; add.u64 %r540,%r540,%r709; add.u64 %r536,%r536,%r581; setp.ne.u64 %r1471,%r239,%r372; @ %r1471 bra $L81; bra $L60; $L75: add.u64 %r1473,%r536,%r536; add.u64 %r572,%r1473,%r528; add.u64 %r1474,%r1800,%r536; add.u64 %r1475,%r1474,%r1474; add.u64 %r532,%r353,%r1475; .loc 1 2904 13 mov.u64 %r571,%r570; $L84: .loc 1 2903 14 ld.s16 %r373,[%r572]; .loc 1 2905 10 @ %r1660 bra $L82; add.u64 %r1477,%r571,%r571; add.u64 %r401,%r1123,%r1477; mov.u64 %r576,%r542; sub.u64 %r1688,%r540,%r542; add.u64 %r1689,%r1688,-2; shr.u64 %r1687,%r1689,1; add.u64 %r1690,%r1687,1; and.b64 %r1691,%r1690,3; setp.eq.u64 %r1694,%r1691,0; @ %r1694 bra $L83; setp.eq.u64 %r1693,%r1691,1; @ %r1693 bra $L300; setp.eq.u64 %r1692,%r1691,2; @ %r1692 bra $L301; bra $L362; $L83: .loc 1 2908 13 ld.u16 %r1479,[%r401]; ld.u16 %r1480,[%r576]; .loc 1 2907 11 cvt.u16.u32 %r1482,%r373; .loc 1 2908 13 ld.u16 %r1907,[%r401+2]; ld.u16 %r1908,[%r576+2]; .loc 1 2907 11 mad.lo.u16 %r1910,%r1479,%r1480,%r1482; .loc 1 2908 13 ld.u16 %r1916,[%r401+4]; ld.u16 %r1917,[%r576+4]; .loc 1 2907 11 mad.lo.u16 %r1919,%r1907,%r1908,%r1910; .loc 1 2908 13 ld.u16 %r1925,[%r401+6]; ld.u16 %r1926,[%r576+6]; .loc 1 2907 11 mad.lo.u16 %r1928,%r1916,%r1917,%r1919; mad.lo.u16 %r1929,%r1925,%r1926,%r1928; cvt.s32.s16 %r373,%r1929; .loc 1 2905 10 add.u64 %r401,%r401,8; add.u64 %r576,%r576,8; setp.ne.u64 %r1930,%r540,%r576; @ %r1930 bra $L83; $L82: .loc 1 2910 28 cvt.u16.u32 %r1485,%r373; st.u16 [%r572],%r1485; .loc 1 2901 6 add.u64 %r572,%r572,2; 
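// $L83 above is the 4x-unrolled inner dot-product step: four ld.u16 pairs per iteration
// feed a chain of mad.lo.u16 instructions accumulating into %r373, with both source
// pointers advanced by 8 bytes; $L82 then narrows the accumulator (cvt.u16.u32) and
// stores it with st.u16 before stepping to the next output element.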
add.u64 %r571,%r571,256; setp.ne.u64 %r1486,%r532,%r572; @ %r1486 bra $L84; bra $L76; $L20: .loc 1 2958 11 cvt.u16.u32 %r1487,%r583; setp.eq.u16 %r1488,%r1487,1; @ ! %r1488 bra $L363; bra $L85; $L31: .loc 1 2920 38 set.u32.eq.u64 %r1490,%r313,1; neg.s32 %r1491,%r1490; .loc 1 2920 26 cvt.u16.u32 %r1493,%r1491; and.b16 %r1492,%r1493,%r1078; .loc 1 2920 11 cvt.u32.u16 %r1495,%r1492; cvt.u16.u8 %r1496,%r1495; setp.eq.u16 %r1497,%r1496,0; @ %r1497 bra $L20; .loc 1 2920 43 setp.ne.u64 %r1498,%r645,1; @ %r1498 bra $L20; $L121: .loc 1 2922 10 cvt.u16.u32 %r1499,%r583; setp.ne.u16 %r1500,%r1499,1; @ %r1500 bra $L87; .loc 1 2948 4 setp.gt.s64 %r1501,%r321,0; @ %r1501 bra $L88; bra $L1; $L87: .loc 1 2929 4 setp.le.s64 %r1502,%r321,0; @ %r1502 bra $L1; add.u64 %r724,%r646,%r646; add.u64 %r1504,%r272,%r272; add.u64 %r725,%r349,%r1504; mov.u64 %r734,0; mov.u64 %r732,%r734; .loc 1 2929 11 mov.u64 %r405,%r734; setp.gt.s64 %r1931,%r320,0; setp.le.s64 %r1677,%r272,0; .loc 1 2936 7 cvt.u32.u64 %r1678,%r734; $L91: .loc 1 2933 8 @ %r1931 bra $L90; $L95: .loc 1 2929 29 add.u64 %r405,%r405,1; .loc 1 2929 4 add.u64 %r732,%r732,%r581; add.u64 %r734,%r734,%r314; setp.ne.u64 %r1506,%r321,%r405; @ %r1506 bra $L91; bra $L1; $L90: add.u64 %r1507,%r732,%r732; add.u64 %r720,%r1507,%r351; add.u64 %r1508,%r734,%r734; add.u64 %r743,%r350,%r1508; add.u64 %r1509,%r320,%r732; add.u64 %r1510,%r1509,%r1509; add.u64 %r738,%r1510,%r351; .loc 1 2933 8 mov.u64 %r722,%r725; mov.u64 %r721,0; $L94: .loc 1 2937 5 @ %r1677 bra $L140; add.u64 %r1512,%r721,%r721; add.u64 %r716,%r349,%r1512; mov.u64 %r717,%r743; .loc 1 2936 7 mov.u32 %r406,%r1678; sub.u64 %r1745,%r722,%r716; add.u64 %r1746,%r1745,-2; shr.u64 %r1744,%r1746,1; add.u64 %r1747,%r1744,1; and.b64 %r1748,%r1747,3; setp.eq.u64 %r1751,%r1748,0; @ %r1751 bra $L93; setp.eq.u64 %r1750,%r1748,1; @ %r1750 bra $L302; setp.eq.u64 %r1749,%r1748,2; @ %r1749 bra $L303; bra $L364; $L93: .loc 1 2938 23 ld.u16 %r1514,[%r716]; ld.u16 %r1515,[%r717]; .loc 1 2938 9 cvt.u16.u32 %r1517,%r406; .loc 1 2938 23 ld.u16 %r1935,[%r716+2]; ld.u16 %r1936,[%r717+2]; .loc 1 2938 9 mad.lo.u16 %r1938,%r1514,%r1515,%r1517; .loc 1 2938 23 ld.u16 %r1944,[%r716+4]; ld.u16 %r1945,[%r717+4]; .loc 1 2938 9 mad.lo.u16 %r1947,%r1935,%r1936,%r1938; .loc 1 2938 23 ld.u16 %r1953,[%r716+6]; ld.u16 %r1954,[%r717+6]; .loc 1 2938 9 mad.lo.u16 %r1956,%r1944,%r1945,%r1947; mad.lo.u16 %r1957,%r1953,%r1954,%r1956; cvt.s32.s16 %r406,%r1957; .loc 1 2937 5 add.u64 %r716,%r716,8; add.u64 %r717,%r717,8; setp.ne.u64 %r1958,%r716,%r722; @ %r1958 bra $L93; bra $L92; $L140: .loc 1 2936 7 mov.u32 %r406,%r1678; $L92: .loc 1 2939 15 cvt.u16.u32 %r1520,%r406; st.u16 [%r720],%r1520; .loc 1 2933 8 add.u64 %r720,%r720,2; add.u64 %r721,%r721,%r646; add.u64 %r722,%r722,%r724; setp.ne.u64 %r1521,%r720,%r738; @ %r1521 bra $L94; bra $L95; $L88: add.u64 %r754,%r581,%r581; mov.u64 %r752,%r351; add.u64 %r746,%r646,%r646; .loc 1 2948 4 mov.u64 %r755,0; .loc 1 2948 11 mov.u64 %r403,%r755; setp.le.s64 %r1959,%r272,0; .loc 1 2951 10 cvt.u32.u64 %r1680,%r755; $L98: .loc 1 2952 8 @ %r1959 bra $L141; mov.u64 %r744,%r349; add.u64 %r1525,%r755,%r755; add.u64 %r747,%r1525,%r350; add.u64 %r1526,%r272,%r755; add.u64 %r1527,%r1526,%r1526; add.u64 %r763,%r1527,%r350; .loc 1 2951 10 mov.u32 %r404,%r1680; sub.u64 %r1755,%r763,%r747; add.u64 %r1756,%r1755,-2; shr.u64 %r1754,%r1756,1; add.u64 %r1757,%r1754,1; and.b64 %r1758,%r1757,3; setp.eq.u64 %r1761,%r1758,0; @ %r1761 bra $L97; setp.eq.u64 %r1760,%r1758,1; @ %r1760 bra $L304; setp.eq.u64 %r1759,%r1758,2; @ %r1759 bra 
$L305; bra $L365; $L97: .loc 1 2953 26 ld.u16 %r1529,[%r744]; ld.u16 %r1530,[%r747]; .loc 1 2953 5 cvt.u16.u32 %r1532,%r404; .loc 1 2952 8 add.u64 %r1961,%r744,%r746; .loc 1 2953 26 ld.u16 %r1963,[%r1961]; ld.u16 %r1964,[%r747+2]; .loc 1 2953 5 mad.lo.u16 %r1966,%r1529,%r1530,%r1532; .loc 1 2952 8 add.u64 %r1969,%r1961,%r746; .loc 1 2953 26 ld.u16 %r1972,[%r1969]; ld.u16 %r1973,[%r747+4]; .loc 1 2953 5 mad.lo.u16 %r1975,%r1963,%r1964,%r1966; .loc 1 2952 8 add.u64 %r1978,%r1969,%r746; .loc 1 2953 26 ld.u16 %r1981,[%r1978]; ld.u16 %r1982,[%r747+6]; .loc 1 2953 5 mad.lo.u16 %r1984,%r1972,%r1973,%r1975; mad.lo.u16 %r1985,%r1981,%r1982,%r1984; cvt.s32.s16 %r404,%r1985; .loc 1 2952 8 add.u64 %r744,%r1978,%r746; add.u64 %r747,%r747,8; setp.ne.u64 %r1986,%r747,%r763; @ %r1986 bra $L97; bra $L96; $L141: .loc 1 2951 10 mov.u32 %r404,%r1680; $L96: .loc 1 2954 25 cvt.u16.u32 %r1535,%r404; st.u16 [%r752],%r1535; .loc 1 2948 29 add.u64 %r403,%r403,1; .loc 1 2948 4 add.u64 %r752,%r752,%r754; add.u64 %r755,%r755,%r314; setp.ne.u64 %r1536,%r321,%r403; @ %r1536 bra $L98; bra $L1; $L85: .loc 1 2963 7 setp.le.s64 %r1537,%r321,0; @ %r1537 bra $L1; add.u64 %r774,%r712,%r712; mov.u64 %r772,%r351; add.u64 %r766,%r646,%r646; add.u64 %r769,%r645,%r645; mov.u64 %r775,0; .loc 1 2963 14 mov.u64 %r416,%r775; setp.le.s64 %r1987,%r272,0; .loc 1 2966 6 cvt.u32.u64 %r1681,%r775; $L101: .loc 1 2967 4 @ %r1987 bra $L142; mov.u64 %r764,%r349; add.u64 %r1542,%r775,%r775; add.u64 %r767,%r350,%r1542; .loc 1 2966 6 mov.u32 %r417,%r1681; .loc 1 2967 11 mov.u64 %r418,0; and.b64 %r1766,%r272,3; setp.eq.u64 %r1769,%r1766,0; @ %r1769 bra $L100; setp.eq.u64 %r1768,%r1766,1; @ %r1768 bra $L306; setp.eq.u64 %r1767,%r1766,2; @ %r1767 bra $L307; bra $L366; $L100: .loc 1 2968 29 ld.u16 %r1544,[%r764]; ld.u16 %r1545,[%r767]; .loc 1 2968 8 cvt.u16.u32 %r1547,%r417; .loc 1 2967 4 add.u64 %r1990,%r764,%r766; add.u64 %r1991,%r767,%r769; .loc 1 2968 29 ld.u16 %r1992,[%r1990]; ld.u16 %r1993,[%r1991]; .loc 1 2968 8 mad.lo.u16 %r1995,%r1544,%r1545,%r1547; .loc 1 2967 4 add.u64 %r1999,%r1990,%r766; add.u64 %r2000,%r1991,%r769; .loc 1 2968 29 ld.u16 %r2002,[%r1999]; ld.u16 %r2003,[%r2000]; .loc 1 2968 8 mad.lo.u16 %r2005,%r1992,%r1993,%r1995; .loc 1 2967 4 add.u64 %r2009,%r1999,%r766; add.u64 %r2010,%r2000,%r769; .loc 1 2968 29 ld.u16 %r2012,[%r2009]; ld.u16 %r2013,[%r2010]; .loc 1 2968 8 mad.lo.u16 %r2015,%r2002,%r2003,%r2005; mad.lo.u16 %r2016,%r2012,%r2013,%r2015; cvt.s32.s16 %r417,%r2016; .loc 1 2967 28 add.u64 %r418,%r418,4; .loc 1 2967 4 add.u64 %r764,%r2009,%r766; add.u64 %r767,%r2010,%r769; setp.ne.u64 %r2017,%r272,%r418; @ %r2017 bra $L100; bra $L99; $L142: .loc 1 2966 6 mov.u32 %r417,%r1681; $L99: .loc 1 2969 21 cvt.u16.u32 %r1550,%r417; st.u16 [%r772],%r1550; .loc 1 2963 32 add.u64 %r416,%r416,1; .loc 1 2963 7 add.u64 %r772,%r772,%r774; add.u64 %r775,%r775,%r314; setp.ne.u64 %r1551,%r321,%r416; @ %r1551 bra $L101; bra $L1; $L363: .loc 1 2972 11 setp.gt.s64 %r1552,%r313,%r646; @ %r1552 bra $L102; .loc 1 2993 7 setp.gt.s64 %r1553,%r321,0; @ %r1553 bra $L103; bra $L1; $L102: .loc 1 2974 7 setp.gt.s64 %r1554,%r321,0; @ %r1554 bra $L104; bra $L1; $L108: add.u64 %r1555,%r808,%r808; add.u64 %r804,%r351,%r1555; .loc 1 2975 9 mov.u64 %r415,0; and.b64 %r1782,%r320,3; setp.eq.u64 %r1785,%r1782,0; @ %r1785 bra $L105; setp.eq.u64 %r1784,%r1782,1; @ %r1784 bra $L308; setp.eq.u64 %r1783,%r1782,2; @ %r1783 bra $L309; bra $L367; $L105: .loc 1 2976 34 st.u16 [%r804],%r1682; .loc 1 2975 2 add.u64 %r2019,%r804,%r806; .loc 1 2976 34 st.u16 [%r2019],%r1682; .loc 
1 2975 2 add.u64 %r2021,%r2019,%r806; .loc 1 2976 34 st.u16 [%r2021],%r1682; .loc 1 2975 2 add.u64 %r2024,%r2021,%r806; .loc 1 2976 34 st.u16 [%r2024],%r1682; .loc 1 2975 27 add.u64 %r415,%r415,4; .loc 1 2975 2 add.u64 %r804,%r2024,%r806; setp.ne.u64 %r2026,%r320,%r415; @ %r2026 bra $L105; $L109: .loc 1 2974 32 add.u64 %r414,%r436,1; .loc 1 2974 7 add.u64 %r808,%r808,%r581; setp.ne.u64 %r1558,%r321,%r414; @ %r1558 bra $L143; add.u64 %r792,%r645,%r645; add.u64 %r783,%r646,%r646; mov.u64 %r797,0; mov.u64 %r796,%r797; mov.u64 %r437,%r797; setp.le.s64 %r2027,%r272,0; bra $L107; $L104: add.u64 %r806,%r712,%r712; mov.u64 %r808,0; .loc 1 2974 14 mov.u64 %r436,%r808; setp.gt.s64 %r2028,%r320,0; .loc 1 2976 34 cvt.u16.u64 %r1682,%r808; bra $L106; $L143: mov.u64 %r436,%r414; $L106: .loc 1 2975 2 @ %r2028 bra $L108; bra $L109; $L112: add.u64 %r1563,%r788,%r788; add.u64 %r781,%r349,%r1563; .loc 1 2980 4 mov.u64 %r780,%r801; .loc 1 2980 11 mov.u64 %r413,0; and.b64 %r1774,%r320,3; setp.eq.u64 %r1777,%r1774,0; @ %r1777 bra $L110; setp.eq.u64 %r1776,%r1774,1; @ %r1776 bra $L310; setp.eq.u64 %r1775,%r1774,2; @ %r1775 bra $L311; bra $L368; $L110: .loc 1 2983 37 ld.u16 %r1565,[%r781]; ld.u16 %r1566,[%r790]; .loc 1 2982 36 ld.u16 %r1568,[%r780]; mad.lo.u16 %r1567,%r1565,%r1566,%r1568; st.u16 [%r780],%r1567; .loc 1 2980 4 add.u64 %r2030,%r780,%r806; add.u64 %r2031,%r781,%r783; .loc 1 2983 37 ld.u16 %r2032,[%r2031]; ld.u16 %r2033,[%r790]; .loc 1 2982 36 ld.u16 %r2035,[%r2030]; mad.lo.u16 %r2036,%r2032,%r2033,%r2035; st.u16 [%r2030],%r2036; .loc 1 2980 4 add.u64 %r2038,%r2030,%r806; add.u64 %r2039,%r2031,%r783; .loc 1 2983 37 ld.u16 %r2041,[%r2039]; ld.u16 %r2042,[%r790]; .loc 1 2982 36 ld.u16 %r2044,[%r2038]; mad.lo.u16 %r2045,%r2041,%r2042,%r2044; st.u16 [%r2038],%r2045; .loc 1 2980 4 add.u64 %r2047,%r2038,%r806; add.u64 %r2048,%r2039,%r783; .loc 1 2983 37 ld.u16 %r2050,[%r2048]; ld.u16 %r2051,[%r790]; .loc 1 2982 36 ld.u16 %r2053,[%r2047]; mad.lo.u16 %r2054,%r2050,%r2051,%r2053; st.u16 [%r2047],%r2054; .loc 1 2980 29 add.u64 %r413,%r413,4; .loc 1 2980 4 add.u64 %r780,%r2047,%r806; add.u64 %r781,%r2048,%r783; setp.ne.u64 %r2055,%r320,%r413; @ %r2055 bra $L110; $L113: .loc 1 2979 26 add.u64 %r412,%r412,1; .loc 1 2979 2 add.u64 %r788,%r788,%r313; add.u64 %r790,%r790,%r792; setp.eq.u64 %r1570,%r272,%r412; @ %r1570 bra $L111; $L114: .loc 1 2980 4 @ %r2028 bra $L112; bra $L113; $L111: .loc 1 2978 32 add.u64 %r411,%r437,1; .loc 1 2978 7 add.u64 %r796,%r796,%r581; add.u64 %r797,%r797,%r314; setp.eq.u64 %r1572,%r436,%r437; @ %r1572 bra $L1; mov.u64 %r437,%r411; $L107: .loc 1 2979 2 @ %r2027 bra $L111; add.u64 %r1574,%r797,%r797; add.u64 %r790,%r350,%r1574; add.u64 %r1575,%r796,%r796; add.u64 %r801,%r351,%r1575; mov.u64 %r788,0; .loc 1 2979 9 mov.u64 %r412,%r788; bra $L114; $L103: add.u64 %r823,%r712,%r712; add.u64 %r814,%r313,%r313; add.u64 %r818,%r645,%r645; .loc 1 2993 7 mov.u64 %r831,0; mov.u64 %r829,%r831; .loc 1 2993 14 mov.u64 %r407,%r831; setp.gt.s64 %r2056,%r320,0; setp.le.s64 %r1684,%r272,0; .loc 1 3000 10 cvt.u32.u64 %r1685,%r831; $L116: .loc 1 2997 4 @ %r2056 bra $L115; $L120: .loc 1 2993 32 add.u64 %r407,%r407,1; .loc 1 2993 7 add.u64 %r829,%r829,%r581; add.u64 %r831,%r831,%r314; setp.ne.u64 %r1580,%r321,%r407; @ %r1580 bra $L116; bra $L1; $L115: add.u64 %r1581,%r829,%r829; add.u64 %r821,%r351,%r1581; add.u64 %r1582,%r831,%r831; add.u64 %r837,%r350,%r1582; .loc 1 2997 4 mov.u64 %r824,0; .loc 1 2997 11 mov.u64 %r408,%r824; $L119: .loc 1 3001 8 @ %r1684 bra $L144; add.u64 %r1584,%r824,%r824; add.u64 
%r812,%r349,%r1584; mov.u64 %r816,%r837; .loc 1 3000 10 mov.u32 %r409,%r1685; .loc 1 3001 15 mov.u64 %r410,0; and.b64 %r1790,%r272,3; setp.eq.u64 %r1793,%r1790,0; @ %r1793 bra $L118; setp.eq.u64 %r1792,%r1790,1; @ %r1792 bra $L312; setp.eq.u64 %r1791,%r1790,2; @ %r1791 bra $L313; bra $L369; $L118: .loc 1 3002 28 ld.u16 %r1586,[%r812]; ld.u16 %r1587,[%r816]; .loc 1 3002 5 cvt.u16.u32 %r1589,%r409; .loc 1 3001 8 add.u64 %r2060,%r812,%r814; add.u64 %r2061,%r816,%r818; .loc 1 3002 28 ld.u16 %r2062,[%r2060]; ld.u16 %r2063,[%r2061]; .loc 1 3002 5 mad.lo.u16 %r2065,%r1586,%r1587,%r1589; .loc 1 3001 8 add.u64 %r2069,%r2060,%r814; add.u64 %r2070,%r2061,%r818; .loc 1 3002 28 ld.u16 %r2072,[%r2069]; ld.u16 %r2073,[%r2070]; .loc 1 3002 5 mad.lo.u16 %r2075,%r2062,%r2063,%r2065; .loc 1 3001 8 add.u64 %r2079,%r2069,%r814; add.u64 %r2080,%r2070,%r818; .loc 1 3002 28 ld.u16 %r2082,[%r2079]; ld.u16 %r2083,[%r2080]; .loc 1 3002 5 mad.lo.u16 %r2085,%r2072,%r2073,%r2075; mad.lo.u16 %r2086,%r2082,%r2083,%r2085; cvt.s32.s16 %r409,%r2086; .loc 1 3001 32 add.u64 %r410,%r410,4; .loc 1 3001 8 add.u64 %r812,%r2079,%r814; add.u64 %r816,%r2080,%r818; setp.ne.u64 %r2087,%r272,%r410; @ %r2087 bra $L118; bra $L117; $L144: .loc 1 3000 10 mov.u32 %r409,%r1685; $L117: .loc 1 3003 27 cvt.u16.u32 %r1592,%r409; st.u16 [%r821],%r1592; .loc 1 2997 29 add.u64 %r408,%r408,1; .loc 1 2997 4 add.u64 %r821,%r821,%r823; add.u64 %r824,%r824,%r646; setp.ne.u64 %r1593,%r320,%r408; @ %r1593 bra $L119; bra $L120; $L32: .loc 1 2920 11 setp.eq.u64 %r1594,%r313,1; @ %r1594 bra $L121; bra $L20; $L2: .loc 1 2476 6 ld.u64 %r1595,[%r838]; setp.ne.u64 %r1596,%r1595,0; @ %r1596 bra $L122; .loc 1 2485 4 ld.u64 %r1598,[%r839+56]; add.u64 %r1597,%r1598,1; ld.u64 %r1599,[%r839+48]; sub.u64 %r650,%r1597,%r1599; .loc 1 2483 15 ld.s8 %r1601,[%r840+28]; cvt.u16.u32 %r1600,%r1601; setp.eq.u16 %r1602,%r1600,1; @ %r1602 bra $L123; bra $L124; $L126: .loc 1 2560 16 ld.u64 %r646,[%r839+40]; .loc 1 2548 27 mov.u64 %r712,%r581; bra $L125; $L14: ld.u64 %r581,[%r838+40]; .loc 1 2543 6 ld.s8 %r1604,[%r838+28]; cvt.u16.u32 %r1603,%r1604; setp.eq.u16 %r1605,%r1603,1; @ ! %r1605 bra $L127; bra $L126; $L11: .loc 1 2560 16 ld.u64 %r646,[%r839+40]; mov.u32 %r583,%r443; .loc 1 2548 27 mov.u64 %r581,%r712; .loc 1 2563 14 mov.u64 %r320,1; .loc 1 2561 16 mov.u64 %r313,%r320; bra $L12; $L5: .loc 1 2485 4 ld.u64 %r1607,[%r839+56]; add.u64 %r1606,%r1607,1; ld.u64 %r1608,[%r839+48]; sub.u64 %r650,%r1606,%r1608; bra $L124; $L128: .loc 1 2509 17 ld.u64 %r1610,[%r838+56]; add.u64 %r1609,%r1610,1; .loc 1 2509 15 ld.u64 %r1611,[%r838+48]; sub.u64 %r272,%r1609,%r1611; ld.s8 %r1634,[%r840+28]; bra $L9; $L122: .loc 1 2502 12 cvta.global.u64 %r1612,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r1613,[%r1612+36]; setp.ne.u32 %r1614,%r1613,0; @ ! 
%r1614 bra $L8; bra $L128; $L1$L362: .loc 1 2908 13 ld.u16 %r2088,[%r401]; ld.u16 %r2089,[%r542]; .loc 1 2907 11 cvt.u16.u32 %r2091,%r373; mad.lo.u16 %r2092,%r2088,%r2089,%r2091; cvt.s32.s16 %r373,%r2092; .loc 1 2905 10 add.u64 %r401,%r401,2; add.u64 %r576,%r542,2; $L301: .loc 1 2908 13 ld.u16 %r2094,[%r401]; ld.u16 %r2095,[%r576]; .loc 1 2907 11 cvt.u16.u32 %r2097,%r373; mad.lo.u16 %r2098,%r2094,%r2095,%r2097; cvt.s32.s16 %r373,%r2098; .loc 1 2905 10 add.u64 %r401,%r401,2; add.u64 %r576,%r576,2; $L300: .loc 1 2908 13 ld.u16 %r2100,[%r401]; ld.u16 %r2101,[%r576]; .loc 1 2907 11 cvt.u16.u32 %r2103,%r373; mad.lo.u16 %r2104,%r2100,%r2101,%r2103; cvt.s32.s16 %r373,%r2104; .loc 1 2905 10 add.u64 %r401,%r401,2; add.u64 %r576,%r576,2; setp.ne.u64 %r2105,%r540,%r576; @ %r2105 bra $L83; bra $L82; $L361: add.u64 %r2106,%r553,%r563; .loc 1 2886 16 add.u64 %r2108,%r2106,%r552; .loc 1 2887 13 ld.u16 %r2109,[%r2108]; ld.u16 %r2110,[%r542]; .loc 1 2886 11 cvt.u16.u32 %r2112,%r374; mad.lo.u16 %r2113,%r2109,%r2110,%r2112; cvt.s32.s16 %r374,%r2113; .loc 1 2889 13 ld.u16 %r2114,[%r563]; .loc 1 2888 11 cvt.u16.u32 %r2116,%r375; mad.lo.u16 %r2117,%r2114,%r2110,%r2116; cvt.s32.s16 %r375,%r2117; .loc 1 2890 16 add.u64 %r2118,%r2106,%r551; .loc 1 2891 13 ld.u16 %r2119,[%r2118]; .loc 1 2890 11 cvt.u16.u32 %r2121,%r376; mad.lo.u16 %r2122,%r2119,%r2110,%r2121; cvt.s32.s16 %r376,%r2122; .loc 1 2892 16 add.u64 %r2123,%r2106,%r550; .loc 1 2893 13 ld.u16 %r2124,[%r2123]; .loc 1 2892 11 cvt.u16.u32 %r2126,%r377; mad.lo.u16 %r2127,%r2124,%r2110,%r2126; cvt.s32.s16 %r377,%r2127; .loc 1 2884 10 add.u64 %r565,%r542,2; add.u64 %r563,%r563,2; setp.ne.u64 %r2128,%r540,%r565; @ %r2128 bra $L78; bra $L77; $L360: .loc 1 2856 13 ld.u16 %r2130,[%r523]; ld.u16 %r2131,[%r525]; .loc 1 2855 11 cvt.u16.u32 %r2133,%r379; mad.lo.u16 %r2134,%r2130,%r2131,%r2133; cvt.s32.s16 %r379,%r2134; .loc 1 2858 13 ld.u16 %r2135,[%r328]; .loc 1 2857 11 cvt.u16.u32 %r2137,%r380; mad.lo.u16 %r2138,%r2135,%r2131,%r2137; cvt.s32.s16 %r380,%r2138; .loc 1 2860 13 ld.u16 %r2139,[%r326]; .loc 1 2859 11 cvt.u16.u32 %r2141,%r381; mad.lo.u16 %r2142,%r2139,%r2131,%r2141; cvt.s32.s16 %r381,%r2142; .loc 1 2862 13 ld.u16 %r2143,[%r324]; .loc 1 2861 11 cvt.u16.u32 %r2145,%r382; mad.lo.u16 %r2146,%r2143,%r2131,%r2145; cvt.s32.s16 %r382,%r2146; .loc 1 2853 10 add.u64 %r525,%r525,2; add.u64 %r523,%r523,2; add.u64 %r521,%r328,2; add.u64 %r520,%r326,2; add.u64 %r518,%r324,2; setp.ne.u64 %r2147,%r460,%r523; @ %r2147 bra $L70; bra $L69; $L359: .loc 1 2762 43 ld.u16 %r2148,[%r442]; st.u16 [%r82],%r2148; .loc 1 2760 9 add.u64 %r120,%r442,2; add.u64 %r100,%r82,512; $L299: .loc 1 2762 43 ld.u16 %r2150,[%r120]; st.u16 [%r100],%r2150; .loc 1 2760 9 add.u64 %r120,%r120,2; add.u64 %r100,%r100,512; $L298: .loc 1 2762 43 ld.u16 %r2152,[%r120]; st.u16 [%r100],%r2152; .loc 1 2760 9 add.u64 %r120,%r120,2; add.u64 %r100,%r100,512; setp.ne.u64 %r2153,%r39,%r120; @ %r2153 bra $L58; bra $L48; $L358: .loc 1 2741 7 add.u64 %r2154,%r595,%r637; add.u64 %r2155,%r2154,%r594; .loc 1 2740 49 ld.u16 %r2156,[%r2155]; st.u16 [%r38],%r2156; .loc 1 2742 49 ld.u16 %r2157,[%r637]; st.u16 [%r38+2],%r2157; .loc 1 2745 7 add.u64 %r2158,%r594,%r637; add.u64 %r2159,%r2158,%r595; .loc 1 2744 49 ld.u16 %r2160,[%r2159+2]; st.u16 [%r38+512],%r2160; .loc 1 2746 49 ld.u16 %r2161,[%r637+2]; st.u16 [%r38+514],%r2161; .loc 1 2738 9 add.u64 %r38,%r38,1024; add.u64 %r637,%r637,4; $L297: .loc 1 2741 7 add.u64 %r2163,%r595,%r637; add.u64 %r2164,%r2163,%r594; .loc 1 2740 49 ld.u16 %r2165,[%r2164]; st.u16 [%r38],%r2165; .loc 1 
2742 49 ld.u16 %r2166,[%r637]; st.u16 [%r38+2],%r2166; .loc 1 2745 7 add.u64 %r2167,%r594,%r637; add.u64 %r2168,%r2167,%r595; .loc 1 2744 49 ld.u16 %r2169,[%r2168+2]; st.u16 [%r38+512],%r2169; .loc 1 2746 49 ld.u16 %r2170,[%r637+2]; st.u16 [%r38+514],%r2170; .loc 1 2738 9 add.u64 %r38,%r38,1024; add.u64 %r637,%r637,4; $L296: .loc 1 2741 7 add.u64 %r2172,%r595,%r637; add.u64 %r2173,%r2172,%r594; .loc 1 2740 49 ld.u16 %r2174,[%r2173]; st.u16 [%r38],%r2174; .loc 1 2742 49 ld.u16 %r2175,[%r637]; st.u16 [%r38+2],%r2175; .loc 1 2745 7 add.u64 %r2176,%r594,%r637; add.u64 %r2177,%r2176,%r595; .loc 1 2744 49 ld.u16 %r2178,[%r2177+2]; st.u16 [%r38+512],%r2178; .loc 1 2746 49 ld.u16 %r2179,[%r637+2]; st.u16 [%r38+514],%r2179; .loc 1 2738 9 add.u64 %r38,%r38,1024; add.u64 %r637,%r637,4; setp.ne.u64 %r2180,%r610,%r637; @ %r2180 bra $L52; bra $L53; $L356: .loc 1 2688 2 @ ! %r1631 bra $L370; .loc 1 2689 22 mov.u32 %r21812181578182,[%value_in]; } $L370: .loc 1 2687 24 mov.u64 %r402,2; .loc 1 2687 7 add.u64 %r710,%r351,%r711; $L295: .loc 1 2688 2 @ ! %r1631 bra $L371; .loc 1 2689 22 mov.u32 %r21842184578185,[%value_in]; } $L371: .loc 1 2687 24 add.u64 %r402,%r402,1; .loc 1 2687 7 add.u64 %r710,%r710,%r711; $L294: .loc 1 2688 2 @ ! %r1631 bra $L372; .loc 1 2689 22 mov.u32 %r21872187578188,[%value_in]; } $L372: .loc 1 2687 24 add.u64 %r402,%r402,1; .loc 1 2687 7 add.u64 %r710,%r710,%r711; setp.eq.u64 %r2189,%r402,%r715; @ ! %r2189 bra $L34; bra $L33; $L232: .loc 1 2689 22 mov.u32 %r2190962190578191,[%value_in]; } $L357: .loc 1 2687 7 add.u64 %r2193,%r1796,%r711; .loc 1 2688 2 @ ! %r1631 bra $L373; .loc 1 2689 22 mov.u32 %r212192195578196,[%value_in]; } $L373: .loc 1 2687 7 add.u64 %r2198,%r2193,%r711; .loc 1 2688 2 @ ! %r1631 bra $L374; .loc 1 2689 22 mov.u32 %r2200210578201,[%value_in]; } $L374: .loc 1 2687 24 add.u64 %r402,%r1743,3; .loc 1 2687 7 add.u64 %r710,%r2198,%r711; setp.eq.u64 %r2202,%r402,%r715; @ ! 
%r2202 bra $L34; bra $L33; $L364: .loc 1 2938 23 ld.u16 %r2203,[%r716]; ld.u16 %r2204,[%r743]; mul.lo.u16 %r2205,%r2203,%r2204; .loc 1 2938 9 cvt.s32.s16 %r406,%r2205; .loc 1 2937 5 add.u64 %r716,%r716,2; add.u64 %r717,%r743,2; $L303: .loc 1 2938 23 ld.u16 %r2209,[%r716]; ld.u16 %r2210,[%r717]; .loc 1 2938 9 cvt.u16.u32 %r2212,%r406; mad.lo.u16 %r2213,%r2209,%r2210,%r2212; cvt.s32.s16 %r406,%r2213; .loc 1 2937 5 add.u64 %r716,%r716,2; add.u64 %r717,%r717,2; $L302: .loc 1 2938 23 ld.u16 %r2215,[%r716]; ld.u16 %r2216,[%r717]; .loc 1 2938 9 cvt.u16.u32 %r2218,%r406; mad.lo.u16 %r2219,%r2215,%r2216,%r2218; cvt.s32.s16 %r406,%r2219; .loc 1 2937 5 add.u64 %r716,%r716,2; add.u64 %r717,%r717,2; setp.ne.u64 %r2220,%r716,%r722; @ %r2220 bra $L93; bra $L92; $L365: .loc 1 2953 26 ld.u16 %r2221,[%r349]; ld.u16 %r2222,[%r747]; mul.lo.u16 %r2223,%r2221,%r2222; .loc 1 2953 5 cvt.s32.s16 %r404,%r2223; .loc 1 2952 8 add.u64 %r744,%r349,%r746; add.u64 %r747,%r747,2; $L305: .loc 1 2953 26 ld.u16 %r2227,[%r744]; ld.u16 %r2228,[%r747]; .loc 1 2953 5 cvt.u16.u32 %r2230,%r404; mad.lo.u16 %r2231,%r2227,%r2228,%r2230; cvt.s32.s16 %r404,%r2231; .loc 1 2952 8 add.u64 %r744,%r744,%r746; add.u64 %r747,%r747,2; $L304: .loc 1 2953 26 ld.u16 %r2233,[%r744]; ld.u16 %r2234,[%r747]; .loc 1 2953 5 cvt.u16.u32 %r2236,%r404; mad.lo.u16 %r2237,%r2233,%r2234,%r2236; cvt.s32.s16 %r404,%r2237; .loc 1 2952 8 add.u64 %r744,%r744,%r746; add.u64 %r747,%r747,2; setp.ne.u64 %r2238,%r747,%r763; @ %r2238 bra $L97; bra $L96; $L366: .loc 1 2968 29 ld.u16 %r2239,[%r349]; ld.u16 %r2240,[%r767]; mul.lo.u16 %r2241,%r2239,%r2240; .loc 1 2968 8 cvt.s32.s16 %r417,%r2241; .loc 1 2967 28 mov.u64 %r418,1; .loc 1 2967 4 add.u64 %r764,%r349,%r766; add.u64 %r767,%r767,%r769; $L307: .loc 1 2968 29 ld.u16 %r2245,[%r764]; ld.u16 %r2246,[%r767]; .loc 1 2968 8 cvt.u16.u32 %r2248,%r417; mad.lo.u16 %r2249,%r2245,%r2246,%r2248; cvt.s32.s16 %r417,%r2249; .loc 1 2967 28 add.u64 %r418,%r418,1; .loc 1 2967 4 add.u64 %r764,%r764,%r766; add.u64 %r767,%r767,%r769; $L306: .loc 1 2968 29 ld.u16 %r2251,[%r764]; ld.u16 %r2252,[%r767]; .loc 1 2968 8 cvt.u16.u32 %r2254,%r417; mad.lo.u16 %r2255,%r2251,%r2252,%r2254; cvt.s32.s16 %r417,%r2255; .loc 1 2967 28 add.u64 %r418,%r418,1; .loc 1 2967 4 add.u64 %r764,%r764,%r766; add.u64 %r767,%r767,%r769; setp.ne.u64 %r2256,%r272,%r418; @ %r2256 bra $L100; bra $L99; $L368: .loc 1 2983 37 ld.u16 %r2257,[%r781]; ld.u16 %r2258,[%r790]; .loc 1 2982 36 ld.u16 %r2260,[%r801]; mad.lo.u16 %r2261,%r2257,%r2258,%r2260; st.u16 [%r801],%r2261; .loc 1 2980 29 mov.u64 %r413,1; .loc 1 2980 4 add.u64 %r780,%r801,%r806; add.u64 %r781,%r781,%r783; $L311: .loc 1 2983 37 ld.u16 %r2263,[%r781]; ld.u16 %r2264,[%r790]; .loc 1 2982 36 ld.u16 %r2266,[%r780]; mad.lo.u16 %r2267,%r2263,%r2264,%r2266; st.u16 [%r780],%r2267; .loc 1 2980 29 add.u64 %r413,%r413,1; .loc 1 2980 4 add.u64 %r780,%r780,%r806; add.u64 %r781,%r781,%r783; $L310: .loc 1 2983 37 ld.u16 %r2269,[%r781]; ld.u16 %r2270,[%r790]; .loc 1 2982 36 ld.u16 %r2272,[%r780]; mad.lo.u16 %r2273,%r2269,%r2270,%r2272; st.u16 [%r780],%r2273; .loc 1 2980 29 add.u64 %r413,%r413,1; .loc 1 2980 4 add.u64 %r780,%r780,%r806; add.u64 %r781,%r781,%r783; setp.ne.u64 %r2274,%r320,%r413; @ %r2274 bra $L110; bra $L113; $L367: .loc 1 2976 34 st.u16 [%r804],%r1682; .loc 1 2975 27 mov.u64 %r415,1; .loc 1 2975 2 add.u64 %r804,%r804,%r806; $L309: .loc 1 2976 34 st.u16 [%r804],%r1682; .loc 1 2975 27 add.u64 %r415,%r415,1; .loc 1 2975 2 add.u64 %r804,%r804,%r806; $L308: .loc 1 2976 34 st.u16 [%r804],%r1682; .loc 1 2975 27 add.u64 
%r415,%r415,1; .loc 1 2975 2 add.u64 %r804,%r804,%r806; setp.ne.u64 %r2277,%r320,%r415; @ %r2277 bra $L105; bra $L109; $L369: .loc 1 3002 28 ld.u16 %r2278,[%r812]; ld.u16 %r2279,[%r837]; mul.lo.u16 %r2280,%r2278,%r2279; .loc 1 3002 5 cvt.s32.s16 %r409,%r2280; .loc 1 3001 32 mov.u64 %r410,1; .loc 1 3001 8 add.u64 %r812,%r812,%r814; add.u64 %r816,%r837,%r818; $L313: .loc 1 3002 28 ld.u16 %r2284,[%r812]; ld.u16 %r2285,[%r816]; .loc 1 3002 5 cvt.u16.u32 %r2287,%r409; mad.lo.u16 %r2288,%r2284,%r2285,%r2287; cvt.s32.s16 %r409,%r2288; .loc 1 3001 32 add.u64 %r410,%r410,1; .loc 1 3001 8 add.u64 %r812,%r812,%r814; add.u64 %r816,%r816,%r818; $L312: .loc 1 3002 28 ld.u16 %r2290,[%r812]; ld.u16 %r2291,[%r816]; .loc 1 3002 5 cvt.u16.u32 %r2293,%r409; mad.lo.u16 %r2294,%r2290,%r2291,%r2293; cvt.s32.s16 %r409,%r2294; .loc 1 3001 32 add.u64 %r410,%r410,1; .loc 1 3001 8 add.u64 %r812,%r812,%r814; add.u64 %r816,%r816,%r818; setp.ne.u64 %r2295,%r272,%r410; @ %r2295 bra $L118; bra $L117; } matmul_i4.o/, .param .u32 %in_ar4, .param .u64 %in_ar5); .file 1 "../../../../libgfortran/generated/matmul_i4.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]FUNCTION DECL: _gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime___func__$0 .const .align 1 .u8 __func__$0[10] = {109,97,116,109,117,108,95,105,52,0 }2] = {782] = {672] = {8471,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,97,41,32,61,61,32,50,32,124,124,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,98,41,32,61,61,3246116,114,97,110,47,103,101,110,101,114,97,116,101,100,47,109,97,116,109,117,108,95,105,52,465,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,49,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3265,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,50,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3273,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,97,114,103,117,109,101,110,116,32,66,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,49,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }103,101,109,109,32,33,61,32,78,85,76,7_gfortran_matmul_i4 .visible .func _gfortran_matmul_u64 %r48; .reg .u64 %r56; .reg .u64u64u64 %r279; .reg .u32 %r280; .reg .u32u64 %r301; .reg .u32 %r302; .reg .u64u64u32 %r313; .reg .u64u64u64 %r374; .reg .u64u64 %r390; .reg .u64 %r392; .reg .u64u64u64 %r409; .reg .u64u64.reg .u64 %r530; .reg .u64 %r531; .reg .u64 %r532; .reg .u64 %r533; .reg .u64u64 %r544; .reg .u64 %r550; .reg .u64 %r553; .reg .u64 %r558; .reg .u64 %r559; .reg .u64 %r569; .reg .u64 %r571; .reg .u64 %r572; .reg .u64 %r573; .reg .u64 %r575; .reg .u64 %r576; .reg .u64 %r577; .reg .u64u64 %r590; .reg .u64 %r591; .reg .u64 %r592; .reg .u64 %r593; .reg .u64 %r594; .reg .u64u64.reg .u64 %r619; .reg .u64 %r621; .reg .u64 %r622; .reg .u64 %r623; .reg .u64 %r626; .reg .u64 %r627; .reg .u64 %r628; .reg .u64 %r631; .reg .u64 %r632; .reg .u64 %r633; .reg .u64 %r635; .reg .u64 %r636; .reg .u64 %r642; .reg .u64 %r644; .reg .u64 %r645; .reg .u64 %r649; .reg .u64 %r654; .reg .u64 %r655; .reg .u64 %r657; .reg .u64 %r658; .reg .u64 %r663; .reg .u64 %r665; 
.reg .u64 %r666; .reg .u64 %r674; .reg .u64 %r675; .reg .u64 %r677; .reg .u64 %r678; .reg .u64 %r680; .reg .u64 %r683; .reg .u64 %r685; .reg .u64 %r686; .reg .u64 %r691; .reg .u64 %r692; .reg .u64 %r694; .reg .u64 %r699; .reg .u64 %r701; .reg .u64 %r703; .reg .u64 %r707; .reg .u64 %r708; .reg .u64 %r712; .reg .u64 %r715; .reg .u64 %r717; .reg .u64 %r719; .reg .u64 %r723; .reg .u64 %r725; .reg .u64 %r727; .reg .u64 %r729; .reg .u64 %r732; .reg .u64 %r734; .reg .u64 %r736; .reg .u64 %r741; .reg .u64 %r743; .reg .u64 %r749; .reg .u64 %r750; .reg .u64 %r751; .reg .u64 %r752; .reg .u32 %r753; .reg .u32 %r754; .reg .u64 %r755; .reg .u16 %r756; .reg .pred %r757; .reg .u16 %r758; .reg .pred %r760; .reg .u64 %r761; .reg .u32 %r762; .reg .u64 %r763; .reg .u64 %r764; .reg .u64 %r765; .reg .pred %r766; .reg .pred %r768; .reg .u64 %r771; .reg .u64 %r773; .reg .u64 %r774; .reg .u64 %r775; .reg .u64 %r777; .reg .u64 %r778; .reg .u64 %r779; .reg .u64 %r780; .reg .u64 %r781; .reg .u64 %r784; .reg .u64 %r786; .reg .u64 %r787; .reg .u64 %r789; .reg .u64 %r791; .reg .u64 %r792; .reg .u64 %r794; .reg .u64 %r795; .reg .u64 %r796; .reg .u64 %r797; .reg .u64 %r798; .reg .u32 %r799; .reg .pred %r800; .reg .u64 %r801; .reg .u64 %r802; .reg .u64 %r803; .reg .pred %r805; .reg .u64 %r806; .reg .u64 %r807; .reg .u64 %r808; .reg .pred %r809; .reg .u64 %r810; .reg .u64 %r811; .reg .u64 %r812; .reg .u16 %r813; .reg .pred %r814; .reg .u64 %r815; .reg .u64 %r817; .reg .u64 %r818; .reg .u64 %r819; .reg .u16 %r820; .reg .pred %r822; .reg .pred %r823; .reg .u64 %r824; .reg .pred %r826; .reg .u64 %r827; .reg .u64 %r829; .reg .u64 %r830; .reg .u64 %r831; .reg .u64 %r832; .reg .u64 %r833; .reg .u64 %r834; .reg .pred %r835; .reg .u64 %r836; .reg .u64 %r838; .reg .u64 %r839; .reg .u64 %r840; .reg .u16 %r841; .reg .u32 %r842; .reg .pred %r843; .reg .u16 %r844; .reg .pred %r845; .reg .u64 %r846; .reg .u64 %r847; .reg .u64 %r848; .reg .u64 %r849; .reg .u64 %r850; .reg .u64 %r851; .reg .pred %r852; .reg .u32 %r854; .reg .u32 %r855; .reg .u32 %r857; .reg .u32 %r858; .reg .u16 %r859; .reg .u16 %r860; .reg .u16 %r861; .reg .u32 %r862; .reg .u16 %r863; .reg .pred %r864; .reg .u64 %r865; .reg .u16 %r867; .reg .pred %r868; .reg .u64 %r869; .reg .u64 %r870; .reg .u64 %r871; .reg .u32 %r872; .reg .u32 %r873; .reg .u32 %r874; .reg .u32 %r875; .reg .u32 %r876; .reg .u32 %r877; .reg .u32 %r879; .reg .u32 %r880; .reg .u16 %r881; .reg .u16 %r882; .reg .u16 %r883; .reg .u32 %r884; .reg .u16 %r885; .reg .pred %r886; .reg .u32 %r888; .reg .u32 %r889; .reg .u16 %r890; .reg .u16 %r891; .reg .u16 %r892; .reg .u32 %r893; .reg .u16 %r894; .reg .pred %r895; .reg .u32 %r897; .reg .u32 %r898; .reg .u32 %r900; .reg .u32 %r901; .reg .u16 %r902; .reg .u16 %r903; .reg .u16 %r904; .reg .u32 %r905; .reg .u16 %r906; .reg .pred %r907; .reg .f32 %r908; .reg .f32 %r909; .reg .f32 %r910; .reg .f32 %r911; .reg .f32 %r912; .reg .f32 %r913; .reg .f32 %r914; .reg .pred %r915; .reg .u32 %r916; .reg .u32 %r917; .reg .pred %r918; .reg .pred %r919; .reg .u32 %r921; .reg .u32 %r922; .reg .u32 %r924; .reg .u32 %r925; .reg .u16 %r926; .reg .u16 %r927; .reg .u16 %r928; .reg .u32 %r929; .reg .u16 %r930; .reg .pred %r931; .reg .u32 %r933; .reg .u32 %r934; .reg .u32 %r936; .reg .u32 %r937; .reg .u16 %r938; .reg .u16 %r939; .reg .u16 %r940; .reg .u32 %r941; .reg .u16 %r942; .reg .pred %r943; .reg .u32 %r945; .reg .u32 %r946; .reg .u32 %r948; .reg .u32 %r949; .reg .u16 %r950; .reg .u16 %r951; .reg .u16 %r952; .reg .u32 %r953; .reg .u16 %r954; .reg .pred %r955; .reg .pred %r956; .reg 
.u64 %r957; .reg .u32 %r958; .reg .u64 %r959; .reg .u64 %r960; .reg .u32 %r961; .reg .pred %r962; .reg .pred %r963; .reg .u32 %r964; .reg .pred %r965; .reg .pred %r966; .reg .u32 %r981; .reg .u64 %r982; .reg .u64 %r983; .reg .u64 %r984; .reg .u64 %r985; .reg .u64 %r986; .reg .u64 %r987; .reg .u64 %r988; .reg .u16 %r989; .reg .u16 %r990; .reg .u16 %r991; .reg .u32 %r992; .reg .u16 %r993; .reg .pred %r994; .reg .pred %r995; .reg .u16 %r996; .reg .pred %r997; .reg .u64 %r998; .reg .pred %r999; .reg .u32 %r1003; .reg .u64 %r1005; .reg .u32 %r1010; .reg .u32 %r1011; .reg .u32 %r1013; .reg .u32 %r1014; .reg .u16 %r1015; .reg .u16 %r1016; .reg .u16 %r1017; .reg .u32 %r1019; .reg .u32 %r1020; .reg .u16 %r1022; .reg .u16 %r1024; .reg .u32 %r1025; .reg .u16 %r1026; .reg .pred %r1027; .reg .pred %r1028; .reg .u64 %r1029; .reg .u64 %r1031; .reg .u64 %r1032; .reg .u64 %r1033; .reg .pred %r1035; .reg .u64 %r1037; .reg .u64 %r1038; .reg .u64 %r1041; .reg .u64 %r1042; .reg .u64 %r1046; .reg .u64 %r1047; .reg .u64 %r1048; .reg .u64 %r1050; .reg .u64 %r1051; .reg .u64 %r1054; .reg .u64 %r1057; .reg .u64 %r1058; .reg .u64 %r1061; .reg .u64 %r1063; .reg .u64 %r1064; .reg .u64 %r1065; .reg .u64 %r1066; .reg .u64 %r1067; .reg .pred %r1069; .reg .u64 %r1075; .reg .u64 %r1076; .reg .u64 %r1077; .reg .u64 %r1078; .reg .pred %r1080; .reg .u64 %r1081; .reg .u64 %r1082; .reg .u64 %r1083; .reg .u64 %r1084; .reg .u64 %r1085; .reg .u64 %r1087; .reg .u64 %r1088; .reg .u64 %r1089; .reg .u64 %r1090; .reg .u64 %r1091; .reg .u64 %r1092; .reg .u64 %r1093; .reg .u64 %r1094; .reg .u64 %r1098; .reg .u64 %r1099; .reg .u64 %r1100; .reg .u64 %r1101; .reg .u64 %r1105; .reg .u64 %r1107; .reg .u64 %r1110; .reg .u64 %r1113; .reg .u64 %r1118; .reg .u64 %r1119; .reg .u64 %r1121; .reg .u64 %r1122; .reg .u32 %r1123; .reg .u32 %r1124; .reg .u64 %r1125; .reg .u64 %r1126; .reg .u32 %r1127; .reg .u32 %r1128; .reg .v2 .u32 %r1130; .reg .u64 %r1131; .reg .u64 %r1132; .reg .u64 %r1137; .reg .u64 %r1138; .reg .u64 %r1139; .reg .u64 %r1141; .reg .u64 %r1142; .reg .pred %r1143; .reg .u64 %r1146; .reg .u64 %r1147; .reg .u64 %r1148; .reg .u64 %r1149; .reg .u64 %r1151; .reg .pred %r1152; .reg .u64 %r1153; .reg .u64 %r1154; .reg .u32 %r1155; .reg .u64 %r1158; .reg .u64 %r1159; .reg .u64 %r1160; .reg .u64 %r1161; .reg .u64 %r1162; .reg .u64 %r1163; .reg .u64 %r1164; .reg .u64 %r1165; .reg .u64 %r1166; .reg .u64 %r1168; .reg .u64 %r1169; .reg .u64 %r1170; .reg .u64 %r1171; .reg .u64 %r1176; .reg .u64 %r1179; .reg .u64 %r1182; .reg .u64 %r1185; .reg .u64 %r1188; .reg .u64 %r1190; .reg .pred %r1198; .reg .pred %r1199; .reg .pred %r1200; .reg .u64 %r1204; .reg .u32 %r1206; .reg .u32 %r1208; .reg .u32 %r1210; .reg .u32 %r1212; .reg .pred %r1214; .reg .pred %r1215; .reg .pred %r1216; .reg .u64 %r1217; .reg .u64 %r1218; .reg .u64 %r1220; .reg .u64 %r1221; .reg .u64 %r1222; .reg .u64 %r1223; .reg .u64 %r1231; .reg .u64 %r1232; .reg .u64 %r1233; .reg .u64 %r1235; .reg .u32 %r1237; .reg .u32 %r1239; .reg .u64 %r1240; .reg .u32 %r1242; .reg .u64 %r1243; .reg .u32 %r1245; .reg .pred %r1247; .reg .pred %r1248; .reg .u64 %r1251; .reg .u32 %r1253; .reg .u32 %r1254; .reg .pred %r1256; .reg .u16 %r1257; .reg .pred %r1258; .reg .u32 %r1260; .reg .u32 %r1261; .reg .u16 %r1262; .reg .u16 %r1263; .reg .u32 %r1265; .reg .u16 %r1266; .reg .pred %r1267; .reg .pred %r1268; .reg .u16 %r1269; .reg .pred %r1270; .reg .pred %r1271; .reg .pred %r1272; .reg .u64 %r1273; .reg .u64 %r1274; .reg .pred %r1276; .reg .u64 %r1277; .reg .u64 %r1278; .reg .u64 %r1280; .reg .u32 %r1282; .reg 
.u32 %r1283; .reg .pred %r1285; .reg .u64 %r1287; .reg .u64 %r1288; .reg .u64 %r1289; .reg .u32 %r1291; .reg .u32 %r1292; .reg .pred %r1294; .reg .pred %r1295; .reg .u64 %r1297; .reg .u32 %r1299; .reg .u32 %r1300; .reg .pred %r1302; .reg .pred %r1303; .reg .pred %r1304; .reg .pred %r1305; .reg .u64 %r1306; .reg .pred %r1309; .reg .u64 %r1311; .reg .u32 %r1313; .reg .u32 %r1314; .reg .u32 %r1315; .reg .u32 %r1316; .reg .pred %r1318; .reg .pred %r1320; .reg .u64 %r1322; .reg .u64 %r1323; .reg .pred %r1325; .reg .u64 %r1326; .reg .u64 %r1327; .reg .u64 %r1329; .reg .u32 %r1331; .reg .u32 %r1332; .reg .pred %r1334; .reg .pred %r1335; .reg .u64 %r1336; .reg .pred %r1337; .reg .u64 %r1338; .reg .u64 %r1339; .reg .u64 %r1340; .reg .u16 %r1341; .reg .u32 %r1342; .reg .pred %r1343; .reg .u16 %r1344; .reg .u32 %r1345; .reg .pred %r1346; .reg .u64 %r1347; .reg .u64 %r1348; .reg .u64 %r1349; .reg .u64 %r1350; .reg .u64 %r1351; .reg .u64 %r1352; .reg .u64 %r1353; .reg .u32 %r1354; .reg .pred %r1355; .reg .pred %r1357; .reg .pred %r1358; .reg .pred %r1359; .reg .pred %r1360; .reg .pred %r1361; .reg .u64 %r1363; .reg .u32 %r1364; .reg .u64 %r1365; .reg .pred %r1366; .reg .pred %r1370; .reg .pred %r1372; .reg .pred %r1373; .reg .pred %r1374; .reg .u64 %r1382; .reg .u64 %r1383; .reg .u64 %r1384; .reg .u64 %r1385; .reg .u64 %r1390; .reg .u64 %r1391; .reg .u64 %r1392; .reg .u64 %r1393; .reg .u64 %r1395; .reg .u64 %r1396; .reg .u64 %r1397; .reg .u64 %r1398; .reg .u64 %r1399; .reg .pred %r1400; .reg .u64 %r1402; .reg .pred %r1404; .reg .u64 %r1407; .reg .u64 %r1408; .reg .u64 %r1409; .reg .pred %r1410; .reg .u64 %r1411; .reg .u64 %r1412; .reg .pred %r1416; .reg .u32 %r1417; .reg .pred %r1418; .reg .u64 %r1419; .reg .u64 %r1420; .reg .u64 %r1421; .reg .u64 %r1422; .reg .u64 %r1423; .reg .pred %r1424; .reg .pred %r1425; .reg .pred %r1426; .reg .u64 %r1429; .reg .u64 %r1430; .reg .u64 %r1431; .reg .u64 %r1433; .reg .pred %r1434; .reg .u64 %r1436; .reg .u64 %r1437; .reg .u64 %r1438; .reg .u64 %r1439; .reg .u64 %r1440; .reg .u64 %r1441; .reg .pred %r1442; .reg .pred %r1443; .reg .pred %r1444; .reg .u64 %r1450; .reg .u64 %r1451; .reg .u64 %r1452; .reg .u64 %r1453; .reg .u64 %r1454; .reg .pred %r1455; .reg .pred %r1456; .reg .pred %r1457; .reg .u64 %r1460; .reg .u64 %r1461; .reg .u64 %r1462; .reg .u64 %r1463; .reg .u64 %r1464; .reg .pred %r1465; .reg .pred %r1466; .reg .pred %r1467; .reg .u64 %r1468; .reg .u64 %r1469; .reg .u64 %r1473; .reg .pred %r1474; .reg .pred %r1475; .reg .pred %r1476; .reg .u64 %r1477; .reg .u64 %r1478; .reg .u64 %r1479; .reg .u64 %r1480; .reg .u64 %r1481; .reg .u64 %r1482; .reg .pred %r1483; .reg .pred %r1484; .reg .pred %r1485; .reg .u64 %r1488; .reg .u64 %r1489; .reg .u64 %r1490; .reg .u64 %r1491; .reg .u64 %r1492; .reg .pred %r1493; .reg .pred %r1494; .reg .pred %r1495; .reg .u64 %r1500; .reg .pred %r1501; .reg .pred %r1502; .reg .pred %r1503; .reg .u64 %r1508; .reg .pred %r1509; .reg .pred %r1510; .reg .pred %r1511; .reg .u64 %r1516; .reg .pred %r1517; .reg .pred %r1518; .reg .pred %r1519; .reg .u64 %r1524; .reg .pred %r1525; .reg .pred %r1526; .reg .pred %r1527; .reg .u64 %r1530; .reg .u64 %r1531; .reg .u64 %r1533; .reg .u64 %r1534; .reg .u64 %r1537; .reg .u64 %r1538; .reg .u32 %r1539; .reg .u32 %r1540; .reg .u64 %r1541; .reg .u64 %r1542; .reg .u32 %r1543; .reg .u32 %r1544; .reg .u64 %r1545; .reg .u64 %r1546; .reg .u64 %r1548; .reg .u64 %r1549; .reg .u32 %r1550; .reg .u32 %r1551; .reg .u64 %r1552; .reg .u64 %r1553; .reg .u32 %r1554; .reg .u32 %r1555; .reg .u64 %r1556; .reg .u64 %r1557; 
.reg .u64 %r1559; .reg .u64 %r1560; .reg .u32 %r1561; .reg .u32 %r1562; .reg .u64 %r1563; .reg .u64 %r1564; .reg .u32 %r1565; .reg .u32 %r1566; .reg .pred %r1567; .reg .u32 %r1570; .reg .u32 %r1574; .reg .u32 %r1578; .reg .pred %r1579; .reg .u32 %r1580; .reg .u32 %r1581; .reg .u32 %r1582; .reg .u32 %r1583; .reg .u32 %r1589; .reg .u32 %r1590; .reg .u32 %r1592; .reg .u32 %r1593; .reg .u32 %r1595; .reg .u32 %r1596; .reg .u32 %r1598; .reg .u32 %r1599; .reg .u32 %r1601; .reg .u32 %r1608; .reg .u32 %r1609; .reg .u32 %r1611; .reg .u32 %r1612; .reg .u32 %r1614; .reg .u32 %r1615; .reg .u32 %r1617; .reg .u32 %r1618; .reg .u32 %r1620; .reg .u32 %r1627; .reg .u32 %r1628; .reg .u32 %r1630; .reg .u32 %r1632; .reg .u32 %r1634; .reg .pred %r1636; .reg .pred %r1637; .reg .u32 %r1638; .reg .u32 %r1639; .reg .u32 %r1640; .reg .u32 %r1641; .reg .u64 %r1644; .reg .u32 %r1645; .reg .u64 %r1646; .reg .u32 %r1647; .reg .u32 %r1649; .reg .u64 %r1651; .reg .u32 %r1652; .reg .u64 %r1654; .reg .u32 %r1655; .reg .pred %r1657; .reg .u32 %r1658; .reg .u32 %r1661; .reg .u32 %r1662; .reg .u32 %r1664; .reg .u32 %r1668; .reg .u32 %r1669; .reg .u32 %r1671; .reg .u32 %r1675; .reg .u32 %r1676; .reg .pred %r1678; .reg .pred %r1679; .reg .u32 %r1680; .reg .u32 %r1683; .reg .u32 %r1684; .reg .u32 %r1686; .reg .u32 %r1690; .reg .u32 %r1691; .reg .u32 %r1693; .reg .u32 %r1697; .reg .u32 %r1698; .reg .pred %r1700; .reg .pred %r1701; .reg .u32 %r1702; .reg .u64 %r1703; .reg .u32 %r1705; .reg .u32 %r1706; .reg .u32 %r1708; .reg .u64 %r1709; .reg .u32 %r1712; .reg .u32 %r1713; .reg .u32 %r1715; .reg .u64 %r1716; .reg .u32 %r1719; .reg .u32 %r1720; .reg .pred %r1722; .reg .pred %r1723; .reg .u32 %r1724; .reg .u64 %r1726; .reg .u64 %r1727; .reg .u32 %r1728; .reg .u32 %r1729; .reg .u32 %r1731; .reg .u64 %r1733; .reg .u64 %r1734; .reg .u32 %r1736; .reg .u32 %r1737; .reg .u32 %r1739; .reg .u64 %r1741; .reg .u64 %r1742; .reg .u32 %r1744; .reg .u32 %r1745; .reg .pred %r1747; .reg .u64 %r1749; .reg .u64 %r1751; .reg .u64 %r1754; .reg .pred %r1756; .reg .pred %r1757; .reg .pred %r1758; .reg .u64 %r1760; .reg .u64 %r1761; .reg .u32 %r1762; .reg .u32 %r1763; .reg .u32 %r1765; .reg .u32 %r1766; .reg .u64 %r1768; .reg .u64 %r1769; .reg .u32 %r1771; .reg .u32 %r1772; .reg .u32 %r1774; .reg .u32 %r1775; .reg .u64 %r1777; .reg .u64 %r1778; .reg .u32 %r1780; .reg .u32 %r1781; .reg .u32 %r1783; .reg .u32 %r1784; .reg .pred %r1785; .reg .pred %r1786; .reg .u32 %r1788; .reg .u64 %r1790; .reg .u64 %r1791; .reg .u32 %r1792; .reg .u32 %r1793; .reg .u32 %r1795; .reg .u64 %r1797; .reg .u64 %r1798; .reg .u32 %r1800; .reg .u32 %r1801; .reg .u32 %r1803; .reg .u64 %r1805; .reg .u64 %r1806; .reg .u32 %r1808; .reg .u32 %r1809; .reg .pred %r1811; .reg .u32 %r1812; .reg .u32 %r1813; .reg .u32 %r1816; .reg .u32 %r1817; .reg .u32 %r1820; .reg .u32 %r1821; .reg .pred %r1823; .reg .u64 %r1824; .reg .u32 %r1825; .reg .u64 %r1826; .reg .u32 %r1827; .reg .u32 %r1829; .reg .u64 %r1831; .reg .u32 %r1832; .reg .u64 %r1834; .reg .u32 %r1835; .reg .pred %r1837; .reg .u32 %r1838; .reg .u32 %r1839; .reg .u32 %r1841; .reg .u32 %r1843; .reg .u32 %r1845; .reg .u32 %r1848; .reg .u32 %r1849; .reg .u32 %r1851; .reg .u32 %r1853; .reg .u32 %r1855; .reg .u32 %r1858; .reg .u32 %r1859; .reg .u32 %r1861; .reg .u32 %r1863; .reg .u32 %r1865; .reg .pred %r1867; .reg .u32 %r1868; .reg .u32 %r1870; .reg .u32 %r1872; .reg .pred %r1873; .reg .u64 %r1874; .reg .u64 %r1875; .reg .u32 %r1876; .reg .u32 %r1877; .reg .u64 %r1878; .reg .u64 %r1879; .reg .u32 %r1880; .reg .u32 %r1881; .reg .u64 %r1883; .reg 
.u64 %r1884; .reg .u32 %r1885; .reg .u32 %r1886; .reg .u64 %r1887; .reg .u64 %r1888; .reg .u32 %r1889; .reg .u32 %r1890; .reg .u64 %r1892; .reg .u64 %r1893; .reg .u32 %r1894; .reg .u32 %r1895; .reg .u64 %r1896; .reg .u64 %r1897; .reg .u32 %r1898; .reg .u32 %r1899; .reg .pred %r1900; .reg .u32 %r1901; .reg .u64 %r1902; .reg .u32 %r1904; .reg .u64 %r1905; .reg .u32 %r1907; .reg .u64 %r1908; .reg .pred %r1909; .reg .u32 %r1910; .reg .u64 %r1911; .reg .u64 %r1913; .reg .u32 %r1915; .reg .u64 %r1916; .reg .u64 %r1918; .reg .u32 %r1920; .reg .u64 %r1921; .reg .pred %r1922; .reg .u32 %r1923; .reg .u32 %r1924; .reg .u32 %r1927; .reg .u32 %r1928; .reg .u32 %r1931; .reg .u32 %r1932; .reg .pred %r1934; .reg .u32 %r1935; .reg .u32 %r1936; .reg .u32 %r1939; .reg .u32 %r1940; .reg .u32 %r1943; .reg .u32 %r1944; .reg .pred %r1946; .reg .u32 %r1947; .reg .u32 %r1948; .reg .u32 %r1951; .reg .u32 %r1952; .reg .u32 %r1955; .reg .u32 %r1956; .reg .pred %r1958; .reg .u32 %r1959; .reg .u32 %r1960; .reg .u32 %r1962; .reg .u32 %r1963; .reg .u32 %r1965; .reg .u32 %r1966; .reg .u32 %r1968; .reg .u32 %r1969; .reg .u32 %r1971; .reg .u32 %r1972; .reg .u32 %r1974; .reg .u32 %r1975; .reg .pred %r1976; .reg .pred %r1979; .reg .u32 %r1980; .reg .u32 %r1981; .reg .u32 %r1984; .reg .u32 %r1985; .reg .u32 %r1988; .reg .u32 %r1989; .reg .pred %r1991; mov.u64 %r750,%ar0; mov.u64 %r751,%ar1; mov.u64 %r752,%ar2; mov.u32 %r753,%ar3; mov.u32 %r754,%ar4; mov.u64 %r755,%ar5; .loc 1 2462 3 ld.s8 %r447,[%r751+28]; cvt.u16.u32 %r756,%r447; setp.eq.u16 %r757,%r756,2; @ %r757 bra $L2; ld.s8 %r1364,[%r752+28]; cvt.u16.u32 %r758,%r1364; setp.eq.u16 %r760,%r758,2; @ %r760 bra $L3; cvta.const.u64 %r764,$LC3; cvta.const.u64 %r763,__func__$0; mov.u32 %r762,2462; cvta.const.u64 %r76177627633: .loc 1 2476 6 ld.u64 %r765,[%r750]; setp.ne.u64 %r766,%r765,0; @ %r766 bra $L4; .loc 1 2478 10 setp.ne.u16 %r768,%r756,1; @ %r768 bra $L5; .loc 1 2480 4 st.u64 [%r750+48],%r765; ld.u64 %r771,[%r752+80]; ld.u64 %r773,[%r752+72]; sub.u64 %r774,%r771,%r773; st.u64 [%r750+56],%r774; mov.u64 %r775,1; st.u64 [%r750+40],%r775; bra $L6; $L123: .loc 1 2485 4 st.u64 [%r750+48],%r1336; add.u64 %r777,%r509,-1; st.u64 [%r750+56],%r777; mov.u64 %r778,1; st.u64 [%r750+40],%r778; bra $L6; $L124: .loc 1 2490 4 mov.u64 %r779,0; st.u64 [%r750+48],%r779; add.u64 %r780,%r509,-1; st.u64 [%r750+56],%r780; mov.u64 %r781,1; st.u64 [%r750+40],%r781; .loc 1 2493 11 st.u64 [%r750+72],%r779; ld.u64 %r784,[%r752+80]; ld.u64 %r786,[%r752+72]; sub.u64 %r787,%r784,%r786; st.u64 [%r750+80],%r787; st.u64 [%r750+64],%r509; $L6: .loc 1 2499 1750; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r789,[%value_in]; } .loc 1 2499 4791; call (%value_in),_gfortrani_xmallocarray792,[%value_in]; } .loc 1 2499 2 st.u64 [%r750],%r792; .loc 1 2500 24 mov.u64 %r794,0; st.u64 [%r750+8],%r794; .loc 1 2557 7 ld.s8 %r447,[%r751+28]; .loc 1 2564 15 ld.u64 %r796,[%r751+56]; add.u64 %r795,%r796,1; .loc 1 2564 13 ld.u64 %r797,[%r751+48]; sub.u64 %r457,%r795,%r797; bra $L7; $L4: .loc 1 2502 12 cvta.global.u64 %r798,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r799,[%r798+36]; setp.eq.u32 %r800,%r799,0; @ %r800 bra $L8; .loc 1 2509 17 ld.u64 %r802,[%r750+56]; add.u64 %r801,%r802,1; .loc 1 2509 15 ld.u64 %r803,[%r750+48]; sub.u64 %r457,%r801,%r803; .loc 1 2506 10 setp.ne.u16 %r805,%r756,1; @ %r805 bra $L9; .loc 1 2508 17 ld.u64 %r807,[%r752+80]; add.u64 %r806,%r807,1; .loc 1 2508 15 ld.u64 %r808,[%r752+72]; sub.u64 %r240,%r806,%r808; .loc 1 2510 7 setp.ne.u64 %r809,%r240,%r457; @ %r809 bra $L10; 
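// The extent computations above (.loc 2508/2509 of matmul_i4.c) form
// ([desc+56] + 1 - [desc+48]) and ([desc+80] + 1 - [desc+72]) from two array
// descriptors; the check at .loc 2510 compares them and on mismatch branches to $L10,
// which spills both values to the %stack argument area and appears to call
// _gfortran_runtime_error with one of the "Array bound mismatch" $LC format strings.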
.loc 1 2564 15 ld.u64 %r811,[%r751+56]; add.u64 %r810,%r811,1; .loc 1 2564 13 ld.u64 %r812,[%r751+48]; sub.u64 %r457,%r810,%r812; .loc 1 2543 7 ld.s8 %r332,[%r750+28]; .loc 1 2548 27 ld.u64 %r445,[%r750+40]; .loc 1 2543 6 cvt.u16.u32 %r813,%r332; setp.eq.u16 %r814,%r813,1; @ %r814 bra $L11; .loc 1 2553 16 ld.u64 %r505,[%r750+64]; .loc 1 2560 16 ld.u64 %r504,[%r751+40]; .loc 1 2563 14 mov.u64 %r215,1; .loc 1 2561 16 mov.u64 %r208,%r215; bra $L12; $L10: .loc 1 2511 6 st.u64 [%stack+8],%r240; st.u64 [%stack],%r457; cvta.const.u64 %r815815_gfortran_runtime_error2485 4 ld.u64 %r818,[%r751+56]; add.u64 %r817,%r818,1; ld.u64 %r819,[%r751+48]; sub.u64 %r513,%r817,%r819; .loc 1 2515 15 cvt.u16.u32 %r820,%r1364; setp.ne.u16 %r822,%r820,1; @ %r822 bra $L13; .loc 1 2519 7 setp.eq.u64 %r823,%r513,%r457; @ %r823 bra $L14; .loc 1 2520 6 st.u64 [%stack+8],%r513; st.u64 [%stack],%r457; cvta.const.u64 %r8248stack; call _gfortran_runtime_error13: .loc 1 2528 7 setp.eq.u64 %r826,%r513,%r457; @ %r826 bra $L15; .loc 1 2529 6 st.u64 [%stack+8],%r513; st.u64 [%stack],%r457; cvta.const.u64 %r827827_gfortran_runtime_error15: .loc 1 2533 17 ld.u64 %r830,[%r752+80]; add.u64 %r829,%r830,1; .loc 1 2533 15 ld.u64 %r831,[%r752+72]; sub.u64 %r237,%r829,%r831; .loc 1 2534 17 ld.u64 %r833,[%r750+80]; add.u64 %r832,%r833,1; .loc 1 2534 15 ld.u64 %r834,[%r750+72]; sub.u64 %r238,%r832,%r834; .loc 1 2535 7 setp.eq.u64 %r835,%r237,%r238; @ %r835 bra $L14; .loc 1 2536 6 st.u64 [%stack+8],%r237; st.u64 [%stack],%r238; cvta.const.u64 %r836,$LC836_gfortran_runtime_error2564 15 ld.u64 %r839,[%r751+56]; add.u64 %r838,%r839,1; .loc 1 2564 13 ld.u64 %r840,[%r751+48]; sub.u64 %r457,%r838,%r840; $L7: .loc 1 2548 27 ld.u64 %r505,[%r750+40]; .loc 1 2543 6 ld.s8 %r842,[%r750+28]; cvt.u16.u32 %r841,%r842; setp.eq.u16 %r843,%r841,1; @ %r843 bra $L130; $L127: .loc 1 2553 16 mov.u64 %r445,%r505; ld.u64 %r505,[%r750+64]; bra $L16; $L130: .loc 1 2548 27 mov.u64 %r445,%r505; $L16: .loc 1 2560 16 ld.u64 %r504,[%r751+40]; .loc 1 2557 6 cvt.u16.u32 %r844,%r447; setp.eq.u16 %r845,%r844,1; @ %r845 bra $L131; $L125: .loc 1 2569 16 ld.u64 %r208,[%r751+64]; .loc 1 2571 15 ld.u64 %r847,[%r751+80]; add.u64 %r846,%r847,1; .loc 1 2571 13 ld.u64 %r848,[%r751+72]; mov.u64 %r215,%r457; sub.u64 %r457,%r846,%r848; bra $L12; $L131: .loc 1 2563 14 mov.u64 %r215,1; .loc 1 2561 16 mov.u64 %r208,%r215; $L12: .loc 1 2575 16 ld.u64 %r850,[%r752+56]; add.u64 %r849,%r850,1; ld.u64 %r851,[%r752+48]; sub.u64 %r63,%r849,%r851; .loc 1 2575 6 setp.eq.u64 %r852,%r63,%r457; @ %r852 bra $L17; .loc 1 2577 10 set.u32.gt.s64 %r854,%r457,0; neg.s32 %r855,%r854; .loc 1 2577 21 set.u32.gt.s64 %r857,%r63,0; neg.s32 %r858,%r857; cvt.u16.u32 %r860,%r855; cvt.u16.u32 %r861,%r858; or.b16 %r859,%r860,%r861; cvt.u32.u16 %r862,%r859; cvt.u16.u8 %r863,%r862; setp.eq.u16 %r864,%r863,0; @ %r864 bra $L17; .loc 1 2578 2 st.u64 [%stack+8],%r457; st.u64 [%stack],%r63; cvta.const.u64 %r865865_gfortran_runtime_error17: .loc 1 2583 7 ld.s8 %r65,[%r752+28]; .loc 1 2586 16 ld.u64 %r503,[%r752+40]; .loc 1 2583 6 cvt.u16.u32 %r867,%r65; setp.eq.u16 %r868,%r867,1; @ %r868 bra $L132; .loc 1 2597 16 ld.u64 %r209,[%r752+64]; .loc 1 2598 16 ld.u64 %r870,[%r752+80]; add.u64 %r869,%r870,1; .loc 1 2598 14 ld.u64 %r871,[%r752+72]; sub.u64 %r216,%r869,%r871; bra $L18; $L132: .loc 1 2592 14 mov.u64 %r216,1; .loc 1 2591 16 mov.u64 %r209,256; $L18: .loc 1 2601 9 ld.u64 %r244,[%r751]; .loc 1 2602 9 ld.u64 %r245,[%r752]; .loc 1 2603 8 ld.u64 %r246,[%r750]; .loc 1 2612 28 set.u32.eq.u64 %r873,%r445,1; neg.s32 %r874,%r873; 
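// At .loc 2612/2613 each sub-comparison of the source-level condition is materialized
// as a 0/-1 mask (set.u32.eq/ne + neg.s32), the masks are narrowed and combined with
// and.b16/or.b16, and a single setp.eq.u16 on the resulting byte drives the
// conditional branches to $L19/$L20 instead of one branch per sub-condition.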
cvt.u32.u32 %r872,%r874; cvt.u32.u8 %r70,%r872; .loc 1 2612 46 set.u32.eq.u64 %r876,%r504,1; neg.s32 %r877,%r876; cvt.u32.u32 %r875,%r877; cvt.u32.u8 %r502,%r875; .loc 1 2612 7 set.u32.ne.u32 %r879,%r753,0; neg.s32 %r880,%r879; .loc 1 2612 16 cvt.u16.u32 %r882,%r880; cvt.u16.u32 %r883,%r70; and.b16 %r881,%r882,%r883; .loc 1 2612 6 cvt.u32.u16 %r884,%r881; cvt.u16.u8 %r885,%r884; setp.eq.u16 %r886,%r885,0; @ %r886 bra $L19; .loc 1 2612 63 set.u32.eq.u64 %r888,%r208,1; neg.s32 %r889,%r888; .loc 1 2612 51 cvt.u16.u32 %r891,%r889; cvt.u16.u32 %r892,%r502; or.b16 %r890,%r891,%r892; .loc 1 2612 33 cvt.u32.u16 %r893,%r890; cvt.u16.u8 %r894,%r893; setp.eq.u16 %r895,%r894,0; @ %r895 bra $L20; .loc 1 2613 20 set.u32.eq.u64 %r897,%r503,1; neg.s32 %r898,%r897; .loc 1 2613 37 set.u32.eq.u64 %r900,%r209,1; neg.s32 %r901,%r900; .loc 1 2613 25 cvt.u16.u32 %r903,%r898; cvt.u16.u32 %r904,%r901; or.b16 %r902,%r903,%r904; .loc 1 2613 7 cvt.u32.u16 %r905,%r902; cvt.u16.u8 %r906,%r905; setp.eq.u16 %r907,%r906,0; @ %r907 bra $L20; .loc 1 2615 13 cvt.rn.f32.s32 %r81,%r754; .loc 1 2614 12 cvt.rn.f32.s64 %r908,%r215; .loc 1 2614 50 cvt.rn.f32.s64 %r909,%r457; .loc 1 2614 47 mul.f32 %r910,%r908,%r909; .loc 1 2614 31 cvt.rn.f32.s64 %r911,%r216; .loc 1 2614 47 mul.f32 %r912,%r910,%r911; .loc 1 2615 13 mul.f32 %r913,%r81,%r81; mul.f32 %r914,%r913,%r81; .loc 1 2614 7 setp.gt.f32 %r915,%r912,%r914; @ ! %r915 bra $L19; .loc 1 2617 17 cvt.u32.u64 %r84,%r215; st.u32 [%frame+28],%r84; .loc 1 2617 29 cvt.u32.u64 %r85,%r216; st.u32 [%frame+24],%r85; .loc 1 2617 41 cvt.u32.u64 %r86,%r457; st.u32 [%frame+20],%r86; .loc 1 2617 52 cvt.u32.u64 %r87,%r505; st.u32 [%frame+16],%r87; .loc 1 2618 27 mov.u32 %r916,1; st.u32 [%frame+12],%r916; .loc 1 2618 36 mov.u32 %r917,0; st.u32 [%frame+8],%r917; .loc 1 2619 17 setp.ne.u64 %r918,%r504,1; @ %r918 bra $L22; cvt.u32.u64 %r229,%r208; bra $L23; $L22: cvt.u32.u64 %r229,%r504; $L23: st.u32 [%frame+4],%r229; .loc 1 2620 3 setp.ne.u64 %r919,%r503,1; @ %r919 bra $L24; cvt.u32.u64 %r230,%r209; bra $L25; $L24: cvt.u32.u64 %r230,%r503; $L25: st.u32 [%frame],%r230; .loc 1 2622 15 set.u32.gt.s32 %r921,%r229,0; neg.s32 %r922,%r921; .loc 1 2622 26 set.u32.gt.s32 %r924,%r230,0; neg.s32 %r925,%r924; .loc 1 2622 19 cvt.u16.u32 %r927,%r922; cvt.u16.u32 %r928,%r925; and.b16 %r926,%r927,%r928; .loc 1 2622 10 cvt.u32.u16 %r929,%r926; cvt.u16.u8 %r930,%r929; setp.eq.u16 %r931,%r930,0; @ %r931 bra $L19; .loc 1 2622 37 set.u32.gt.s32 %r933,%r87,0; neg.s32 %r934,%r933; .loc 1 2622 46 set.u32.gt.s32 %r936,%r84,1; neg.s32 %r937,%r936; .loc 1 2622 41 cvt.u16.u32 %r939,%r934; cvt.u16.u32 %r940,%r937; and.b16 %r938,%r939,%r940; cvt.u32.u16 %r941,%r938; cvt.u16.u8 %r942,%r941; setp.eq.u16 %r943,%r942,0; @ %r943 bra $L19; .loc 1 2622 55 set.u32.gt.s32 %r945,%r85,1; neg.s32 %r946,%r945; .loc 1 2622 64 set.u32.gt.s32 %r948,%r86,1; neg.s32 %r949,%r948; .loc 1 2622 59 cvt.u16.u32 %r951,%r946; cvt.u16.u32 %r952,%r949; and.b16 %r950,%r951,%r952; cvt.u32.u16 %r953,%r950; cvt.u16.u8 %r954,%r953; setp.eq.u16 %r955,%r954,0; @ %r955 bra $L19; .loc 1 2624 4 setp.ne.u64 %r956,%r755,0; @ %r956 bra $L27; cvta.const.u64 %r960,$LC8; cvta.const.u64 %r959,__func__$0; mov.u32 %r958,2624; cvta.const.u64 %r9959027: .loc 1 2626 17 and.b32 %r961,%r753,2; .loc 1 2626 7 setp.ne.u32 %r962,%r961,0; @ %r962 bra $L133; .loc 1 2629 35 setp.eq.u64 %r963,%r504,1; @ %r963 bra $L134; cvta.const.u64 %r217,$LC2; bra $L28; $L133: .loc 1 2627 13 cvta.const.u64 %r217,$LC1; bra $L28; $L134: .loc 1 2629 35 cvta.const.u64 %r217,$LC0; $L28: .loc 1 2631 17 and.b32 
%r964,%r753,4; .loc 1 2631 7 setp.ne.u32 %r965,%r964,0; @ %r965 bra $L135; .loc 1 2634 35 setp.eq.u64 %r966,%r503,1; @ %r966 bra $L136; cvta.const.u64 %r218,$LC2; bra $L29; $L135: .loc 1 2632 13 cvta.const.u64 %r218,$LC1; bra $L29; $L136: .loc 1 2634 35 cvta.const.u64 %r218,$LC0; $L29: .loc 1 2636 4 add.u64 %r982,%frame,16; add.u64 %r983,%frame,8; add.u64 %r984,%frame,4; add.u64 %r985,%frame,12; add.u64 %r986,%frame,20; add.u64 %r987,%frame,24; add.u64 %r988,%frame,28; mov.u32 %r981,1;,.param .u64 %in_ar3,.param .u64 %in_ar4,.param .u64 %in_ar5,.param .u64 %in_ar6,.param .u64 %in_ar7,.param .u64 %in_ar8,.param .u64 %in_ar9,.param .u64 %in_ar10,.param .u64 %in_ar11,.param .u64 %in_ar12,.param .u32 %in_ar13,.param .u32 %in_ar14);988987986244; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r984; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r245; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%frame; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r983; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r246; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r982; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r981; .param .u32 %out_arg15; st.param.u32 [%out_arg15],%r981; call %r755,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14,%out_arg15),$LCT0; } bra $L1; $L19: .loc 1 2643 21 cvt.u16.u32 %r990,%r70; cvt.u16.u32 %r991,%r502; and.b16 %r989,%r990,%r991; .loc 1 2643 6 cvt.u32.u16 %r992,%r989; cvt.u16.u8 %r993,%r992; setp.eq.u16 %r994,%r993,0; @ %r994 bra $L31; .loc 1 2643 38 setp.ne.u64 %r995,%r503,1; @ %r995 bra $L20; .loc 1 2644 7 cvt.u16.u32 %r996,%r65; setp.eq.u16 %r997,%r996,1; @ %r997 bra $L32; .loc 1 2677 16 add.u64 %r998,%r505,1; .loc 1 2678 9 shl.b64 %r103,%r998,2; sub.u64 %r248,%r246,%r103; .loc 1 2687 7 setp.le.s64 %r999,%r216,0; @ %r999 bra $L33; add.u64 %r623,%r103,-4; mov.u64 %r622,%r246; add.u64 %r626,%r216,1; .loc 1 2689 22 shl.b64 %r442,%r215,2; .loc 1 2687 13 mov.u64 %r298,%r503; setp.gt.s64 %r1361,%r215,0; and.b64 %r1473,%r216,3; setp.eq.u64 %r1476,%r1473,0; @ %r1476 bra $L34; setp.eq.u64 %r1475,%r1473,1; @ %r1475 bra $L300; setp.eq.u64 %r1474,%r1473,2; @ %r1474 bra $L301; bra $L366; $L35: .loc 1 2689 22 mov.u32 %r100310031005,[%value_in]; } $L36: .loc 1 2687 24 add.u64 %r1477,%r298,1; .loc 1 2687 7 add.u64 %r1530,%r622,%r623; .loc 1 2688 2 @ %r1361 bra $L238; bra $L367; $L34: @ %r1361 bra $L35; bra $L36; $L33: .loc 1 2692 28 set.u32.eq.u64 %r1010,%r457,0; neg.s32 %r1011,%r1010; .loc 1 2692 13 set.u32.eq.u64 %r1013,%r215,0; neg.s32 %r1014,%r1013; .loc 1 2692 28 cvt.u16.u32 %r1016,%r1011; cvt.u16.u32 %r1017,%r1014; or.b16 %r1015,%r1016,%r1017; .loc 1 2692 23 set.u32.eq.u64 %r1019,%r216,0; neg.s32 %r1020,%r1019; .loc 1 2692 28 cvt.u16.u32 %r1024,%r1020; or.b16 %r1022,%r1015,%r1024; cvt.u32.u16 %r1025,%r1022; cvt.u16.u8 %r1026,%r1025; setp.eq.u16 %r1027,%r1026,0; @ ! 
%r1027 bra $L1; .loc 1 2697 10 setp.eq.u64 %r1028,%r208,1; selp.u64 %r227,%r505,%r208,%r1028; .loc 1 2702 21 shl.b64 %r1029,%r227,8; .loc 1 2702 14 add.u64 %r253,%r1029,%r209; .loc 1 2706 12 min.s64 %r1031,%r253,65536; shl.b64 %r1032,%r1031,10321033,[%value_in]; } .loc 1 2710 7 setp.gt.s64 %r1035,%r216,0; @ %r1035 bra $L39; $L42: .loc 1 2910332918 7 bra $L1; $L39: .loc 1 2680 16 add.u64 %r1037,%r208,1; .loc 1 2681 9 shl.b64 %r106,%r1037,2; sub.u64 %r250,%r244,%r106; .loc 1 2683 16 add.u64 %r1038,%r209,1; .loc 1 2684 9 shl.b64 %r109,%r1038,2; shl.b64 %r596,%r209,9; shl.b64 %r598,%r209,11; shl.b64 %r608,%r209,3; sub.u64 %r599,%r245,%r109; add.u64 %r597,%r608,%r599; shl.b64 %r602,%r505,9; add.u64 %r603,%r505,%r505; add.u64 %r605,%r103,-4; neg.s64 %r1041,%r209; shl.b64 %r1042,%r1041,11; shl.b64 %r606,%r1041,2; shl.b64 %r610,%r505,11; shl.b64 %r1046,%r505,3; add.u64 %r1047,%r1046,4; add.u64 %r609,%r248,%r1047; add.u64 %r1048,%r216,-1; and.b64 %r618,%r1048,-512; add.u64 %r619,%r599,%r606; shl.b64 %r575,%r208,8; shl.b64 %r577,%r208,10; add.u64 %r578,%r106,-4; neg.s64 %r1050,%r208; shl.b64 %r1051,%r1050,10; shl.b64 %r1054,%r1050,3; shl.b64 %r540,%r208,3; add.u64 %r541,%r208,%r208; add.u64 %r231,%r109,-4; shl.b64 %r225,%r209,4; shl.b64 %r220,%r505,4; neg.s64 %r1057,%r505; shl.b64 %r1058,%r1057,2; shl.b64 %r1061,%r1041,4; mov.u64 %r604,%r605; mov.u64 %r600,%r603; mov.u64 %r594,%r209; mov.u64 %r592,0; setp.gt.s64 %r1374,%r457,0; add.u64 %r1409,%r619,4; setp.gt.s64 %r1410,%r215,0; add.u64 %r1411,%r215,1; add.u64 %r1412,%r250,8; bra $L41; $L138: mov.u64 %r592,%r593; $L41: add.u64 %r621,%r592,1; .loc 1 2715 9 sub.u64 %r255,%r216,%r592; min.s64 %r1531,%r255,512; .loc 1 2716 24 shr.s64 %r1063,%r1531,63; shr.u64 %r1064,%r1063,62; add.u64 %r1065,%r1531,%r1064; and.b64 %r1066,%r1065,3; sub.u64 %r1067,%r1066,%r1064; .loc 1 2716 10 sub.u64 %r256,%r1531,%r1067; .loc 1 2718 4 @ %r1374 bra $L40; $L45: .loc 1 2710 7 add.u64 %r593,%r592,512; add.u64 %r594,%r594,%r596; add.u64 %r597,%r597,%r598; add.u64 %r600,%r600,%r602; add.u64 %r604,%r604,%r602; add.u64 %r606,%r606,%r1042; add.u64 %r608,%r608,%r598; add.u64 %r609,%r609,%r610; setp.ne.u64 %r1069,%r592,%r618; @ %r1069 bra $L138; bra $L42; $L40: add.u64 %r572,%r1409,%r608; .loc 1 2768 13 add.u64 %r482,%r256,%r621; add.u64 %r478,%r231,%r608; add.u64 %r476,%r231,%r478; add.u64 %r590,%r231,%r597; add.u64 %r591,%r231,%r590; mul.lo.u64 %r406,%r482,%r505; mul.lo.u64 %r399,%r482,%r209; mad.lo.u64 %r587,%r482,%r231,%r599; mov.u64 %r579,%r1054; mov.u64 %r576,%r578; mov.u64 %r573,%r208; .loc 1 2718 12 mov.u64 %r259,%r503; add.u64 %r1365,%r457,1; .loc 1 2768 8 add.u64 %r1402,%r482,-1; setp.ge.s64 %r1404,%r1402,%r621; sub.u64 %r1407,%r600,%r505; add.u64 %r1408,%r1407,%r603; $L44: .loc 1 2723 13 sub.u64 %r257,%r1365,%r259; min.s64 %r1533,%r257,256; .loc 1 2724 28 shr.u64 %r1075,%r1533,63; add.u64 %r1076,%r1533,%r1075; and.b64 %r1077,%r1076,1; sub.u64 %r1078,%r1077,%r1075; .loc 1 2724 14 sub.u64 %r258,%r1533,%r1078; .loc 1 2727 8 @ %r1410 bra $L43; $L73: .loc 1 2718 30 add.u64 %r259,%r259,256; .loc 1 2718 4 add.u64 %r572,%r572,1024; add.u64 %r573,%r573,%r575; add.u64 %r576,%r576,%r577; add.u64 %r579,%r579,%r1051; setp.ge.s64 %r1080,%r457,%r259; @ %r1080 bra $L44; bra $L45; $L43: .loc 1 2763 26 add.u64 %r1081,%r259,-1; add.u64 %r1082,%r1081,%r1533; .loc 1 2763 31 mul.lo.u64 %r316,%r1082,%r208; .loc 1 2734 13 add.u64 %r481,%r258,%r259; .loc 1 2734 8 add.u64 %r483,%r481,-1; add.u64 %r1083,%r316,1; shl.b64 %r1084,%r1083,2; add.u64 %r559,%r250,%r1084; add.u64 %r569,%r573,%r208; 
sub.u64 %r1085,%r569,%r316; shl.b64 %r571,%r1085,2; shl.b64 %r1087,%r1533,2; add.u64 %r1088,%r1087,-4; add.u64 %r100,%r1033,%r1088; add.u64 %r1089,%r259,%r594; add.u64 %r1090,%r1089,%r1533; shl.b64 %r1091,%r1090,2; add.u64 %r224,%r1091,%r599; shl.b64 %r589,%r259,2; add.u64 %r48,%r589,%r597; add.u64 %r30,%r589,%r590; add.u64 %r242,%r589,%r591; add.u64 %r588,%r587,%r589; add.u64 %r1092,%r399,%r259; add.u64 %r1093,%r1092,%r1533; shl.b64 %r1094,%r1093,2; add.u64 %r397,%r1094,%r599; .loc 1 2768 8 mov.u64 %r558,%r609; .loc 1 2727 16 mov.u64 %r273,%r503; setp.le.s64 %r1366,%r259,%r483; setp.gt.s64 %r1358,%r1533,%r258; setp.gt.s64 %r1357,%r1531,%r256; not.b64 %r1395,%r259; add.u64 %r1396,%r1395,%r481; and.b64 %r1397,%r1396,-2; .loc 1 2904 18 add.u64 %r1398,%r1533,%r259; .loc 1 2904 13 add.u64 %r1399,%r1398,-1; setp.gt.s64 %r1400,%r259,%r1399; $L72: .loc 1 2732 10 sub.u64 %r260,%r1411,%r273; min.s64 %r1534,%r260,256; .loc 1 2733 25 shr.s64 %r1363,%r1534,63; shr.u64 %r1098,%r1534,63; add.u64 %r1099,%r1534,%r1098; and.b64 %r1100,%r1099,1; sub.u64 %r1101,%r1100,%r1098; .loc 1 2733 11 sub.u64 %r261,%r1534,%r1101; .loc 1 2735 5 @ %r1366 bra $L46; $L55: .loc 1 2757 8 @ ! %r1358 bra $L48; bra $L47; $L46: .loc 1 2737 17 add.u64 %r485,%r261,%r273; .loc 1 2737 12 add.u64 %r486,%r485,-1; shl.b64 %r1105,%r1534,10; add.u64 %r1107,%r1105,-1024; add.u64 %r533,%r1033,%r1107; add.u64 %r539,%r559,%r571; add.u64 %r1110,%r1534,%r273; shl.b64 %r550,%r1110,2; shl.b64 %r553,%r1534,2; mov.u64 %r544,%r569; mov.u64 %r543,%r579; mov.u64 %r542,%r576; mov.u64 %r531,0; setp.le.s64 %r1370,%r273,%r486; setp.gt.s64 %r1359,%r1534,%r261; not.b64 %r1390,%r273; add.u64 %r1391,%r1390,%r485; and.b64 %r1392,%r1391,-2; add.u64 %r1393,%r1392,%r273; bra $L54; $L139: mov.u64 %r531,%r532; $L54: .loc 1 2738 9 @ %r1370 bra $L49; $L53: .loc 1 2749 12 @ %r1359 bra $L50; bra $L51; $L49: shl.b64 %r1113,%r531,2; add.u64 %r315,%r1033,%r1113; add.u64 %r1118,%r1393,%r544; shl.b64 %r1119,%r1118,2; add.u64 %r530,%r1119,%r1412; .loc 1 2738 9 mov.u64 %r160,%r539; sub.u64 %r1461,%r530,%r539; add.u64 %r1462,%r1461,-8; shr.u64 %r1460,%r1462,3; add.u64 %r1463,%r1460,1; and.b64 %r1464,%r1463,3; setp.eq.u64 %r1467,%r1464,0; @ %r1467 bra $L52; setp.eq.u64 %r1466,%r1464,1; @ %r1466 bra $L302; setp.eq.u64 %r1465,%r1464,2; @ %r1465 bra $L303; bra $L368; $L52: .loc 1 2741 7 add.u64 %r1121,%r160,%r542; add.u64 %r1122,%r1121,%r543; .loc 1 2740 49 ld.u32 %r1123,[%r1122]; st.u32 [%r315],%r1123; .loc 1 2742 49 ld.u32 %r1124,[%r160]; st.u32 [%r315+4],%r1124; .loc 1 2745 7 add.u64 %r1125,%r160,%r543; add.u64 %r1126,%r1125,%r542; .loc 1 2744 49 ld.u32 %r1127,[%r1126+4]; st.u32 [%r315+1024],%r1127; .loc 1 2746 49 ld.u32 %r1128,[%r160+4]; st.u32 [%r315+1028],%r1128; .loc 1 2738 9 add.u64 %r1468,%r315,2048; add.u64 %r1469,%r160,8; .loc 1 2741 7 add.u64 %r1537,%r1469,%r542; add.u64 %r1538,%r1537,%r543; .loc 1 2740 49 ld.u32 %r1539,[%r1538]; st.u32 [%r1468],%r1539; .loc 1 2742 49 ld.u32 %r1540,[%r1469]; st.u32 [%r1468+4],%r1540; .loc 1 2745 7 add.u64 %r1541,%r1469,%r543; add.u64 %r1542,%r1541,%r542; .loc 1 2744 49 ld.u32 %r1543,[%r1542+4]; st.u32 [%r1468+1024],%r1543; .loc 1 2746 49 ld.u32 %r1544,[%r1469+4]; st.u32 [%r1468+1028],%r1544; .loc 1 2738 9 add.u64 %r1545,%r315,4096; add.u64 %r1546,%r160,16; .loc 1 2741 7 add.u64 %r1548,%r1546,%r542; add.u64 %r1549,%r1548,%r543; .loc 1 2740 49 ld.u32 %r1550,[%r1549]; st.u32 [%r1545],%r1550; .loc 1 2742 49 ld.u32 %r1551,[%r1546]; st.u32 [%r1545+4],%r1551; .loc 1 2745 7 add.u64 %r1552,%r1546,%r543; add.u64 %r1553,%r1552,%r542; .loc 1 2744 
49 ld.u32 %r1554,[%r1553+4]; st.u32 [%r1545+1024],%r1554; .loc 1 2746 49 ld.u32 %r1555,[%r1546+4]; st.u32 [%r1545+1028],%r1555; .loc 1 2738 9 add.u64 %r1556,%r315,6144; add.u64 %r1557,%r160,24; .loc 1 2741 7 add.u64 %r1559,%r1557,%r542; add.u64 %r1560,%r1559,%r543; .loc 1 2740 49 ld.u32 %r1561,[%r1560]; st.u32 [%r1556],%r1561; .loc 1 2742 49 ld.u32 %r1562,[%r1557]; st.u32 [%r1556+4],%r1562; .loc 1 2745 7 add.u64 %r1563,%r1557,%r543; add.u64 %r1564,%r1563,%r542; .loc 1 2744 49 ld.u32 %r1565,[%r1564+4]; st.u32 [%r1556+1024],%r1565; .loc 1 2746 49 ld.u32 %r1566,[%r1557+4]; st.u32 [%r1556+1028],%r1566; .loc 1 2738 9 add.u64 %r315,%r315,8192; add.u64 %r160,%r160,32; setp.ne.u64 %r1567,%r160,%r530; @ %r1567 bra $L52; bra $L53; $L50: .loc 1 2752 10 add.u64 %r1131,%r542,%r550; add.u64 %r1132,%r250,%r1131; .loc 1 2754 10 add.u64 %r1137,%r539,%r553; .loc 1 2751 41 ld.u32 %r1138,[%r1137+-4]; shl.b64 %r1139,%r1138,32; ld.u32 %r1141,[%r1132+-4]; or.b64 %r1142,%r1141,%r1139; mov.b64 %r1130,%r1142; st.v2.u32 [%r533],%r1130; $L51: .loc 1 2735 5 add.u64 %r532,%r531,2; add.u64 %r533,%r533,8; add.u64 %r539,%r539,%r540; add.u64 %r542,%r542,%r540; add.u64 %r543,%r543,%r1054; add.u64 %r544,%r544,%r541; setp.ne.u64 %r1143,%r531,%r1397; @ %r1143 bra $L139; bra $L55; $L48: .loc 1 2767 25 shr.u64 %r1146,%r1363,62; add.u64 %r1147,%r1534,%r1146; and.b64 %r1148,%r1147,3; sub.u64 %r1149,%r1148,%r1146; .loc 1 2767 11 sub.u64 %r263,%r1534,%r1149; .loc 1 2769 5 @ %r1404 bra $L56; bra $L57; $L47: .loc 1 2759 17 add.u64 %r130,%r1534,%r273; .loc 1 2759 12 add.u64 %r1151,%r130,-1; .loc 1 2760 9 setp.lt.s64 %r1152,%r1151,%r273; @ %r1152 bra $L48; add.u64 %r1153,%r130,%r316; shl.b64 %r1154,%r1153,2; add.u64 %r228,%r250,%r1154; mov.u64 %r331,%r100; mov.u64 %r56,%r559; sub.u64 %r1451,%r228,%r559; add.u64 %r1452,%r1451,-4; shr.u64 %r1450,%r1452,2; add.u64 %r1453,%r1450,1; and.b64 %r1454,%r1453,3; setp.eq.u64 %r1457,%r1454,0; @ %r1457 bra $L58; setp.eq.u64 %r1456,%r1454,1; @ %r1456 bra $L304; setp.eq.u64 %r1455,%r1454,2; @ %r1455 bra $L305; bra $L369; $L58: .loc 1 2762 43 ld.u32 %r1155,[%r56]; st.u32 [%r331],%r1155; ld.u32 %r1570,[%r56+4]; st.u32 [%r331+1024],%r1570; ld.u32 %r1574,[%r56+8]; st.u32 [%r331+2048],%r1574; ld.u32 %r1578,[%r56+12]; st.u32 [%r331+3072],%r1578; .loc 1 2760 9 add.u64 %r56,%r56,16; add.u64 %r331,%r331,4096; setp.ne.u64 %r1579,%r228,%r56; @ %r1579 bra $L58; bra $L48; $L57: .loc 1 2871 8 @ ! %r1357 bra $L60; bra $L59; $L56: .loc 1 2876 14 add.u64 %r489,%r263,%r273; .loc 1 2876 9 add.u64 %r490,%r489,-1; add.u64 %r221,%r558,%r1058; add.u64 %r82,%r558,%r605; shl.b64 %r1158,%r505,2; add.u64 %r39,%r1158,%r82; not.b64 %r1159,%r273; add.u64 %r1160,%r1159,%r489; shr.u64 %r1161,%r1160,2; add.u64 %r1162,%r1161,1; shl.b64 %r320,%r1162,4; add.u64 %r1163,%r1407,%r489; shl.b64 %r1164,%r1163,2; add.u64 %r467,%r1164,%r248; add.u64 %r1165,%r489,%r600; shl.b64 %r1166,%r1165,2; add.u64 %r464,%r1166,%r248; add.u64 %r1168,%r1408,%r489; shl.b64 %r1169,%r1168,2; add.u64 %r462,%r1169,%r248; add.u64 %r1170,%r489,%r604; shl.b64 %r1171,%r1170,2; add.u64 %r458,%r1171,%r248; shl.b64 %r369,%r263,8; shl.b64 %r471,%r1534,2; mov.u64 %r477,%r476; mov.u64 %r479,%r478; mov.u64 %r480,%r608; mov.u64 %r488,%r606; mov.u64 %r219,%r558; mov.u64 %r226,%r224; mov.u64 %r279,%r621; mov.u64 %r474,0; mov.u64 %r475,%r474; setp.le.s64 %r1372,%r273,%r490; setp.gt.s64 %r1360,%r1534,%r263; $L68: .loc 1 2772 9 @ %r1372 bra $L61; $L67: .loc 1 2843 12 @ ! 
%r1360 bra $L63; bra $L62; $L61: .loc 1 2904 13 mov.u64 %r341,0; mov.u64 %r342,4096; mov.u64 %r343,3072; mov.u64 %r344,1024; mov.u64 %r345,-2048; $L66: add.u64 %r339,%r221,%r341; .loc 1 2774 10 ld.u32 %r280,[%r339]; .loc 1 2775 10 ld.u32 %r281,[%r339+4]; add.u64 %r337,%r219,%r341; .loc 1 2776 10 ld.u32 %r282,[%r337]; .loc 1 2777 10 ld.u32 %r283,[%r337+4]; add.u64 %r335,%r82,%r341; .loc 1 2778 10 ld.u32 %r284,[%r335]; .loc 1 2779 10 ld.u32 %r285,[%r335+4]; add.u64 %r333,%r39,%r341; .loc 1 2780 10 ld.u32 %r286,[%r333]; .loc 1 2781 10 ld.u32 %r287,[%r333+4]; .loc 1 2782 10 ld.u32 %r288,[%r339+8]; .loc 1 2783 10 ld.u32 %r289,[%r339+12]; .loc 1 2784 10 ld.u32 %r290,[%r337+8]; .loc 1 2785 10 ld.u32 %r291,[%r337+12]; .loc 1 2786 10 ld.u32 %r292,[%r335+8]; .loc 1 2787 10 ld.u32 %r293,[%r335+12]; .loc 1 2788 10 ld.u32 %r294,[%r333+8]; .loc 1 2789 10 ld.u32 %r295,[%r333+12]; .loc 1 2791 6 @ %r1400 bra $L64; add.u64 %r361,%r474,%r572; add.u64 %r359,%r1033,%r344; $L65: add.u64 %r356,%r345,%r359; .loc 1 2793 19 add.u64 %r1176,%r356,%r344; ld.u32 %r135,[%r1176]; .loc 1 2794 14 ld.u32 %r136,[%r361]; .loc 1 2793 14 mad.lo.u32 %r280,%r135,%r136,%r280; .loc 1 2795 19 ld.u32 %r138,[%r359]; .loc 1 2795 14 mad.lo.u32 %r281,%r136,%r138,%r281; add.u64 %r351,%r361,%r488; .loc 1 2798 14 add.u64 %r1179,%r351,%r480; ld.u32 %r140,[%r1179]; .loc 1 2797 14 mad.lo.u32 %r282,%r135,%r140,%r282; .loc 1 2799 14 mad.lo.u32 %r283,%r138,%r140,%r283; .loc 1 2802 14 add.u64 %r1182,%r351,%r479; ld.u32 %r143,[%r1182]; .loc 1 2801 14 mad.lo.u32 %r284,%r135,%r143,%r284; .loc 1 2803 14 mad.lo.u32 %r285,%r138,%r143,%r285; .loc 1 2806 14 add.u64 %r1185,%r351,%r477; ld.u32 %r146,[%r1185]; .loc 1 2805 14 mad.lo.u32 %r286,%r135,%r146,%r286; .loc 1 2807 14 mad.lo.u32 %r287,%r138,%r146,%r287; .loc 1 2809 19 add.u64 %r1188,%r356,%r343; ld.u32 %r149,[%r1188]; .loc 1 2809 14 mad.lo.u32 %r288,%r136,%r149,%r288; .loc 1 2811 19 add.u64 %r1190,%r356,%r342; ld.u32 %r151,[%r1190]; .loc 1 2811 14 mad.lo.u32 %r289,%r136,%r151,%r289; .loc 1 2813 14 mad.lo.u32 %r290,%r140,%r149,%r290; .loc 1 2815 14 mad.lo.u32 %r291,%r140,%r151,%r291; .loc 1 2817 14 mad.lo.u32 %r292,%r143,%r149,%r292; .loc 1 2819 14 mad.lo.u32 %r293,%r143,%r151,%r293; .loc 1 2821 14 mad.lo.u32 %r294,%r146,%r149,%r294; .loc 1 2823 14 mad.lo.u32 %r295,%r146,%r151,%r295; .loc 1 2791 6 add.u64 %r361,%r361,4; add.u64 %r359,%r359,4; setp.ne.u64 %r1198,%r226,%r361; @ %r1198 bra $L65; $L64: .loc 1 2826 24 st.u32 [%r339],%r280; .loc 1 2827 28 st.u32 [%r339+4],%r281; .loc 1 2828 30 st.u32 [%r337],%r282; .loc 1 2829 34 st.u32 [%r337+4],%r283; .loc 1 2830 30 st.u32 [%r335],%r284; .loc 1 2831 34 st.u32 [%r335+4],%r285; .loc 1 2832 30 st.u32 [%r333],%r286; .loc 1 2833 34 st.u32 [%r333+4],%r287; .loc 1 2834 28 st.u32 [%r339+8],%r288; .loc 1 2835 28 st.u32 [%r339+12],%r289; .loc 1 2836 34 st.u32 [%r337+8],%r290; .loc 1 2837 34 st.u32 [%r337+12],%r291; .loc 1 2838 34 st.u32 [%r335+8],%r292; .loc 1 2839 34 st.u32 [%r335+12],%r293; .loc 1 2840 34 st.u32 [%r333+8],%r294; .loc 1 2841 34 st.u32 [%r333+12],%r295; .loc 1 2772 9 add.u64 %r345,%r345,-4096; add.u64 %r344,%r344,4096; add.u64 %r343,%r343,4096; add.u64 %r342,%r342,4096; add.u64 %r341,%r341,16; setp.ne.u64 %r1199,%r320,%r341; @ %r1199 bra $L66; bra $L67; $L63: .loc 1 2769 29 add.u64 %r279,%r279,4; .loc 1 2769 5 add.u64 %r226,%r226,%r225; add.u64 %r221,%r221,%r220; add.u64 %r219,%r219,%r220; add.u64 %r82,%r82,%r220; add.u64 %r39,%r39,%r220; add.u64 %r488,%r488,%r1061; add.u64 %r480,%r480,%r225; add.u64 %r479,%r479,%r225; add.u64 %r477,%r477,%r225; 
add.u64 %r475,%r475,%r220; add.u64 %r474,%r474,%r225; setp.le.s64 %r1200,%r279,%r1402; @ %r1200 bra $L68; bra $L57; $L62: add.u64 %r374,%r467,%r475; add.u64 %r373,%r464,%r475; add.u64 %r372,%r462,%r475; add.u64 %r371,%r458,%r475; add.u64 %r239,%r48,%r474; add.u64 %r32,%r30,%r474; add.u64 %r207,%r242,%r474; add.u64 %r472,%r221,%r471; .loc 1 2904 13 mov.u64 %r370,%r369; $L71: .loc 1 2848 14 ld.u32 %r275,[%r374]; .loc 1 2849 14 ld.u32 %r276,[%r373]; .loc 1 2850 14 ld.u32 %r277,[%r372]; .loc 1 2851 14 ld.u32 %r278,[%r371]; .loc 1 2853 10 @ %r1400 bra $L69; shl.b64 %r1204,%r370,2; add.u64 %r389,%r1033,%r1204; add.u64 %r387,%r474,%r572; mov.u64 %r381,%r207; mov.u64 %r383,%r32; mov.u64 %r385,%r239; sub.u64 %r1438,%r226,%r387; add.u64 %r1439,%r1438,-4; shr.u64 %r1437,%r1439,2; add.u64 %r1440,%r1437,1; and.b64 %r1441,%r1440,3; setp.eq.u64 %r1444,%r1441,0; @ %r1444 bra $L70; setp.eq.u64 %r1443,%r1441,1; @ %r1443 bra $L306; setp.eq.u64 %r1442,%r1441,2; @ %r1442 bra $L307; bra $L370; $L70: .loc 1 2855 16 ld.u32 %r161,[%r389]; .loc 1 2856 13 ld.u32 %r1206,[%r387]; .loc 1 2855 11 mad.lo.u32 %r1580,%r161,%r1206,%r275; .loc 1 2858 13 ld.u32 %r1208,[%r385]; .loc 1 2857 11 mad.lo.u32 %r1581,%r161,%r1208,%r276; .loc 1 2860 13 ld.u32 %r1210,[%r383]; .loc 1 2859 11 mad.lo.u32 %r1582,%r161,%r1210,%r277; .loc 1 2862 13 ld.u32 %r1212,[%r381]; .loc 1 2861 11 mad.lo.u32 %r1583,%r161,%r1212,%r278; .loc 1 2855 16 ld.u32 %r1589,[%r389+4]; .loc 1 2856 13 ld.u32 %r1590,[%r387+4]; .loc 1 2855 11 mad.lo.u32 %r1592,%r1589,%r1590,%r1580; .loc 1 2858 13 ld.u32 %r1593,[%r385+4]; .loc 1 2857 11 mad.lo.u32 %r1595,%r1589,%r1593,%r1581; .loc 1 2860 13 ld.u32 %r1596,[%r383+4]; .loc 1 2859 11 mad.lo.u32 %r1598,%r1589,%r1596,%r1582; .loc 1 2862 13 ld.u32 %r1599,[%r381+4]; .loc 1 2861 11 mad.lo.u32 %r1601,%r1589,%r1599,%r1583; .loc 1 2855 16 ld.u32 %r1608,[%r389+8]; .loc 1 2856 13 ld.u32 %r1609,[%r387+8]; .loc 1 2855 11 mad.lo.u32 %r1611,%r1608,%r1609,%r1592; .loc 1 2858 13 ld.u32 %r1612,[%r385+8]; .loc 1 2857 11 mad.lo.u32 %r1614,%r1608,%r1612,%r1595; .loc 1 2860 13 ld.u32 %r1615,[%r383+8]; .loc 1 2859 11 mad.lo.u32 %r1617,%r1608,%r1615,%r1598; .loc 1 2862 13 ld.u32 %r1618,[%r381+8]; .loc 1 2861 11 mad.lo.u32 %r1620,%r1608,%r1618,%r1601; .loc 1 2855 16 ld.u32 %r1627,[%r389+12]; .loc 1 2856 13 ld.u32 %r1628,[%r387+12]; .loc 1 2855 11 mad.lo.u32 %r275,%r1627,%r1628,%r1611; .loc 1 2858 13 ld.u32 %r1630,[%r385+12]; .loc 1 2857 11 mad.lo.u32 %r276,%r1627,%r1630,%r1614; .loc 1 2860 13 ld.u32 %r1632,[%r383+12]; .loc 1 2859 11 mad.lo.u32 %r277,%r1627,%r1632,%r1617; .loc 1 2862 13 ld.u32 %r1634,[%r381+12]; .loc 1 2861 11 mad.lo.u32 %r278,%r1627,%r1634,%r1620; .loc 1 2853 10 add.u64 %r389,%r389,16; add.u64 %r387,%r387,16; add.u64 %r385,%r385,16; add.u64 %r383,%r383,16; add.u64 %r381,%r381,16; setp.ne.u64 %r1636,%r226,%r387; @ %r1636 bra $L70; $L69: .loc 1 2864 28 st.u32 [%r374],%r275; .loc 1 2865 34 st.u32 [%r373],%r276; .loc 1 2866 34 st.u32 [%r372],%r277; .loc 1 2867 34 st.u32 [%r371],%r278; .loc 1 2846 6 add.u64 %r374,%r374,4; add.u64 %r373,%r373,4; add.u64 %r372,%r372,4; add.u64 %r371,%r371,4; add.u64 %r370,%r370,256; setp.ne.u64 %r1214,%r374,%r472; @ %r1214 bra $L71; bra $L63; $L60: .loc 1 2727 34 add.u64 %r273,%r273,256; .loc 1 2727 8 add.u64 %r558,%r558,1024; add.u64 %r559,%r559,1024; setp.ge.s64 %r1215,%r215,%r273; @ %r1215 bra $L72; bra $L73; $L59: .loc 1 2873 17 add.u64 %r170,%r1531,%r621; .loc 1 2874 9 setp.le.s64 %r1216,%r170,%r482; @ %r1216 bra $L60; .loc 1 2876 14 add.u64 %r496,%r263,%r273; .loc 1 2876 9 add.u64 %r497,%r496,-1; 
.loc 1 2759 17 add.u64 %r405,%r1534,%r273; add.u64 %r1217,%r405,%r406; shl.b64 %r1218,%r1217,2; add.u64 %r407,%r1218,%r248; neg.s64 %r1220,%r405; shl.b64 %r1221,%r1220,2; shl.b64 %r1222,%r273,2; add.u64 %r390,%r1222,%r1221; shl.b64 %r1223,%r496,2; add.u64 %r392,%r1223,%r1221; shl.b64 %r434,%r263,8; .loc 1 2876 9 mov.u64 %r398,%r397; mov.u64 %r400,%r588; mov.u64 %r401,%r406; mov.u64 %r266,%r482; setp.gt.s64 %r1637,%r1534,%r263; setp.le.s64 %r1373,%r273,%r497; not.b64 %r1382,%r273; add.u64 %r1383,%r1382,%r496; and.b64 %r1384,%r1383,-4; add.u64 %r1385,%r1384,%r273; $L81: .loc 1 2877 6 @ %r1373 bra $L74; $L80: .loc 1 2901 6 @ %r1637 bra $L75; bra $L76; $L74: add.u64 %r419,%r390,%r407; add.u64 %r1231,%r1385,%r401; shl.b64 %r1232,%r1231,2; add.u64 %r1233,%r248,16; add.u64 %r409,%r1232,%r1233; .loc 1 2904 13 mov.u64 %r415,4096; mov.u64 %r416,3072; mov.u64 %r417,1024; mov.u64 %r418,-2048; $L79: .loc 1 2879 14 ld.u32 %r269,[%r419]; .loc 1 2880 14 ld.u32 %r270,[%r419+4]; .loc 1 2881 14 ld.u32 %r271,[%r419+8]; .loc 1 2882 14 ld.u32 %r272,[%r419+12]; .loc 1 2884 10 @ %r1400 bra $L77; add.u64 %r428,%r1033,%r417; mov.u64 %r429,%r400; sub.u64 %r1430,%r398,%r400; add.u64 %r1431,%r1430,-4; shr.u64 %r1429,%r1431,2; and.b64 %r1433,%r1429,1; setp.ne.u64 %r1434,%r1433,0; @ ! %r1434 bra $L371; $L78: add.u64 %r425,%r418,%r428; .loc 1 2887 16 ld.u32 %r172,[%r429]; .loc 1 2886 16 add.u64 %r1235,%r425,%r417; .loc 1 2887 13 ld.u32 %r1237,[%r1235]; .loc 1 2886 11 mad.lo.u32 %r1638,%r172,%r1237,%r269; .loc 1 2889 13 ld.u32 %r1239,[%r428]; .loc 1 2888 11 mad.lo.u32 %r1639,%r172,%r1239,%r270; .loc 1 2890 16 add.u64 %r1240,%r425,%r416; .loc 1 2891 13 ld.u32 %r1242,[%r1240]; .loc 1 2890 11 mad.lo.u32 %r1640,%r172,%r1242,%r271; .loc 1 2892 16 add.u64 %r1243,%r425,%r415; .loc 1 2893 13 ld.u32 %r1245,[%r1243]; .loc 1 2892 11 mad.lo.u32 %r1641,%r172,%r1245,%r272; .loc 1 2884 10 add.u64 %r1436,%r428,4; add.u64 %r1644,%r418,%r1436; .loc 1 2887 16 ld.u32 %r1645,[%r429+4]; .loc 1 2886 16 add.u64 %r1646,%r1644,%r417; .loc 1 2887 13 ld.u32 %r1647,[%r1646]; .loc 1 2886 11 mad.lo.u32 %r269,%r1645,%r1647,%r1638; .loc 1 2889 13 ld.u32 %r1649,[%r1436]; .loc 1 2888 11 mad.lo.u32 %r270,%r1645,%r1649,%r1639; .loc 1 2890 16 add.u64 %r1651,%r1644,%r416; .loc 1 2891 13 ld.u32 %r1652,[%r1651]; .loc 1 2890 11 mad.lo.u32 %r271,%r1645,%r1652,%r1640; .loc 1 2892 16 add.u64 %r1654,%r1644,%r415; .loc 1 2893 13 ld.u32 %r1655,[%r1654]; .loc 1 2892 11 mad.lo.u32 %r272,%r1645,%r1655,%r1641; .loc 1 2884 10 add.u64 %r429,%r429,8; add.u64 %r428,%r428,8; setp.ne.u64 %r1657,%r398,%r429; @ %r1657 bra $L78; $L77: .loc 1 2895 28 st.u32 [%r419],%r269; .loc 1 2896 32 st.u32 [%r419+4],%r270; .loc 1 2897 32 st.u32 [%r419+8],%r271; .loc 1 2898 32 st.u32 [%r419+12],%r272; .loc 1 2877 6 add.u64 %r419,%r419,16; add.u64 %r418,%r418,-4096; add.u64 %r417,%r417,4096; add.u64 %r416,%r416,4096; add.u64 %r415,%r415,4096; setp.ne.u64 %r1247,%r409,%r419; @ %r1247 bra $L79; bra $L80; $L76: .loc 1 2874 39 add.u64 %r266,%r266,1; .loc 1 2874 9 add.u64 %r407,%r407,%r605; add.u64 %r401,%r401,%r505; add.u64 %r400,%r400,%r231; add.u64 %r398,%r398,%r231; setp.ne.u64 %r1248,%r170,%r266; @ %r1248 bra $L81; bra $L60; $L75: add.u64 %r436,%r392,%r407; .loc 1 2904 13 mov.u64 %r435,%r434; $L84: .loc 1 2903 14 ld.u32 %r268,[%r436]; .loc 1 2905 10 @ %r1400 bra $L82; shl.b64 %r1251,%r435,2; add.u64 %r297,%r1033,%r1251; mov.u64 %r440,%r400; sub.u64 %r1420,%r398,%r400; add.u64 %r1421,%r1420,-4; shr.u64 %r1419,%r1421,2; add.u64 %r1422,%r1419,1; and.b64 %r1423,%r1422,3; setp.eq.u64 %r1426,%r1423,0; @ 
%r1426 bra $L83; setp.eq.u64 %r1425,%r1423,1; @ %r1425 bra $L308; setp.eq.u64 %r1424,%r1423,2; @ %r1424 bra $L309; bra $L372; $L83: .loc 1 2908 13 ld.u32 %r1253,[%r297]; ld.u32 %r1254,[%r440]; .loc 1 2907 11 mad.lo.u32 %r1658,%r1253,%r1254,%r268; .loc 1 2908 13 ld.u32 %r1661,[%r297+4]; ld.u32 %r1662,[%r440+4]; .loc 1 2907 11 mad.lo.u32 %r1664,%r1661,%r1662,%r1658; .loc 1 2908 13 ld.u32 %r1668,[%r297+8]; ld.u32 %r1669,[%r440+8]; .loc 1 2907 11 mad.lo.u32 %r1671,%r1668,%r1669,%r1664; .loc 1 2908 13 ld.u32 %r1675,[%r297+12]; ld.u32 %r1676,[%r440+12]; .loc 1 2907 11 mad.lo.u32 %r268,%r1675,%r1676,%r1671; .loc 1 2905 10 add.u64 %r297,%r297,16; add.u64 %r440,%r440,16; setp.ne.u64 %r1678,%r398,%r440; @ %r1678 bra $L83; $L82: .loc 1 2910 28 st.u32 [%r436],%r268; .loc 1 2901 6 add.u64 %r436,%r436,4; add.u64 %r435,%r435,256; setp.ne.u64 %r1256,%r407,%r436; @ %r1256 bra $L84; bra $L76; $L20: .loc 1 2958 11 cvt.u16.u32 %r1257,%r447; setp.eq.u16 %r1258,%r1257,1; @ ! %r1258 bra $L373; bra $L85; $L31: .loc 1 2920 38 set.u32.eq.u64 %r1260,%r208,1; neg.s32 %r1261,%r1260; .loc 1 2920 26 cvt.u16.u32 %r1263,%r1261; and.b16 %r1262,%r1263,%r990; .loc 1 2920 11 cvt.u32.u16 %r1265,%r1262; cvt.u16.u8 %r1266,%r1265; setp.eq.u16 %r1267,%r1266,0; @ %r1267 bra $L20; .loc 1 2920 43 setp.ne.u64 %r1268,%r503,1; @ %r1268 bra $L20; $L121: .loc 1 2922 10 cvt.u16.u32 %r1269,%r447; setp.ne.u16 %r1270,%r1269,1; @ %r1270 bra $L87; .loc 1 2948 4 setp.gt.s64 %r1271,%r216,0; @ %r1271 bra $L88; bra $L1; $L87: .loc 1 2929 4 setp.le.s64 %r1272,%r216,0; @ %r1272 bra $L1; shl.b64 %r645,%r505,2; shl.b64 %r1273,%r215,2; add.u64 %r644,%r246,%r1273; shl.b64 %r635,%r504,2; shl.b64 %r1274,%r457,2; add.u64 %r636,%r244,%r1274; mov.u64 %r649,0; mov.u64 %r642,%r649; .loc 1 2929 11 mov.u64 %r301,%r649; setp.gt.s64 %r1679,%r215,0; setp.le.s64 %r1416,%r457,0; $L91: .loc 1 2933 8 @ %r1679 bra $L90; $L95: .loc 1 2929 29 add.u64 %r301,%r301,1; .loc 1 2929 4 add.u64 %r642,%r642,%r505; add.u64 %r644,%r644,%r645; add.u64 %r649,%r649,%r209; setp.ne.u64 %r1276,%r216,%r301; @ %r1276 bra $L91; bra $L1; $L90: shl.b64 %r1277,%r642,2; add.u64 %r631,%r246,%r1277; shl.b64 %r1278,%r649,2; add.u64 %r654,%r245,%r1278; .loc 1 2933 8 mov.u64 %r633,%r636; mov.u64 %r632,0; $L94: .loc 1 2937 5 @ %r1416 bra $L140; shl.b64 %r1280,%r632,2; add.u64 %r627,%r244,%r1280; mov.u64 %r628,%r654; .loc 1 2936 7 mov.u32 %r302,0; sub.u64 %r1479,%r633,%r627; add.u64 %r1480,%r1479,-4; shr.u64 %r1478,%r1480,2; add.u64 %r1481,%r1478,1; and.b64 %r1482,%r1481,3; setp.eq.u64 %r1485,%r1482,0; @ %r1485 bra $L93; setp.eq.u64 %r1484,%r1482,1; @ %r1484 bra $L310; setp.eq.u64 %r1483,%r1482,2; @ %r1483 bra $L311; bra $L374; $L93: .loc 1 2938 23 ld.u32 %r1282,[%r627]; ld.u32 %r1283,[%r628]; .loc 1 2938 9 mad.lo.u32 %r1680,%r1282,%r1283,%r302; .loc 1 2938 23 ld.u32 %r1683,[%r627+4]; ld.u32 %r1684,[%r628+4]; .loc 1 2938 9 mad.lo.u32 %r1686,%r1683,%r1684,%r1680; .loc 1 2938 23 ld.u32 %r1690,[%r627+8]; ld.u32 %r1691,[%r628+8]; .loc 1 2938 9 mad.lo.u32 %r1693,%r1690,%r1691,%r1686; .loc 1 2938 23 ld.u32 %r1697,[%r627+12]; ld.u32 %r1698,[%r628+12]; .loc 1 2938 9 mad.lo.u32 %r302,%r1697,%r1698,%r1693; .loc 1 2937 5 add.u64 %r627,%r627,16; add.u64 %r628,%r628,16; setp.ne.u64 %r1700,%r627,%r633; @ %r1700 bra $L93; bra $L92; $L140: .loc 1 2936 7 mov.u32 %r302,0; $L92: .loc 1 2939 15 st.u32 [%r631],%r302; .loc 1 2933 8 add.u64 %r631,%r631,4; add.u64 %r632,%r632,%r504; add.u64 %r633,%r633,%r635; setp.ne.u64 %r1285,%r631,%r644; @ %r1285 bra $L94; bra $L95; $L88: shl.b64 %r665,%r505,2; mov.u64 %r663,%r246; shl.b64 
%r657,%r504,2; .loc 1 2948 4 mov.u64 %r666,0; .loc 1 2948 11 mov.u64 %r299,%r666; setp.le.s64 %r1701,%r457,0; $L98: .loc 1 2952 8 @ %r1701 bra $L141; mov.u64 %r655,%r244; shl.b64 %r1287,%r666,2; add.u64 %r658,%r1287,%r245; add.u64 %r1288,%r457,%r666; shl.b64 %r1289,%r1288,2; add.u64 %r674,%r1289,%r245; .loc 1 2951 10 mov.u32 %r300,0; sub.u64 %r1489,%r674,%r658; add.u64 %r1490,%r1489,-4; shr.u64 %r1488,%r1490,2; add.u64 %r1491,%r1488,1; and.b64 %r1492,%r1491,3; setp.eq.u64 %r1495,%r1492,0; @ %r1495 bra $L97; setp.eq.u64 %r1494,%r1492,1; @ %r1494 bra $L312; setp.eq.u64 %r1493,%r1492,2; @ %r1493 bra $L313; bra $L375; $L97: .loc 1 2953 26 ld.u32 %r1291,[%r655]; ld.u32 %r1292,[%r658]; .loc 1 2953 5 mad.lo.u32 %r1702,%r1291,%r1292,%r300; .loc 1 2952 8 add.u64 %r1703,%r655,%r657; .loc 1 2953 26 ld.u32 %r1705,[%r1703]; ld.u32 %r1706,[%r658+4]; .loc 1 2953 5 mad.lo.u32 %r1708,%r1705,%r1706,%r1702; .loc 1 2952 8 add.u64 %r1709,%r1703,%r657; .loc 1 2953 26 ld.u32 %r1712,[%r1709]; ld.u32 %r1713,[%r658+8]; .loc 1 2953 5 mad.lo.u32 %r1715,%r1712,%r1713,%r1708; .loc 1 2952 8 add.u64 %r1716,%r1709,%r657; .loc 1 2953 26 ld.u32 %r1719,[%r1716]; ld.u32 %r1720,[%r658+12]; .loc 1 2953 5 mad.lo.u32 %r300,%r1719,%r1720,%r1715; .loc 1 2952 8 add.u64 %r655,%r1716,%r657; add.u64 %r658,%r658,16; setp.ne.u64 %r1722,%r658,%r674; @ %r1722 bra $L97; bra $L96; $L141: .loc 1 2951 10 mov.u32 %r300,0; $L96: .loc 1 2954 25 st.u32 [%r663],%r300; .loc 1 2948 29 add.u64 %r299,%r299,1; .loc 1 2948 4 add.u64 %r663,%r663,%r665; add.u64 %r666,%r666,%r209; setp.ne.u64 %r1294,%r216,%r299; @ %r1294 bra $L98; bra $L1; $L85: .loc 1 2963 7 setp.le.s64 %r1295,%r216,0; @ %r1295 bra $L1; shl.b64 %r685,%r445,2; mov.u64 %r683,%r246; shl.b64 %r677,%r504,2; shl.b64 %r680,%r503,2; mov.u64 %r686,0; .loc 1 2963 14 mov.u64 %r312,%r686; setp.le.s64 %r1723,%r457,0; $L101: .loc 1 2967 4 @ %r1723 bra $L142; mov.u64 %r675,%r244; shl.b64 %r1297,%r686,2; add.u64 %r678,%r245,%r1297; .loc 1 2966 6 mov.u32 %r313,0; .loc 1 2967 11 mov.u64 %r314,0; and.b64 %r1500,%r457,3; setp.eq.u64 %r1503,%r1500,0; @ %r1503 bra $L100; setp.eq.u64 %r1502,%r1500,1; @ %r1502 bra $L314; setp.eq.u64 %r1501,%r1500,2; @ %r1501 bra $L315; bra $L376; $L100: .loc 1 2968 29 ld.u32 %r1299,[%r675]; ld.u32 %r1300,[%r678]; .loc 1 2968 8 mad.lo.u32 %r1724,%r1299,%r1300,%r313; .loc 1 2967 4 add.u64 %r1726,%r675,%r677; add.u64 %r1727,%r678,%r680; .loc 1 2968 29 ld.u32 %r1728,[%r1726]; ld.u32 %r1729,[%r1727]; .loc 1 2968 8 mad.lo.u32 %r1731,%r1728,%r1729,%r1724; .loc 1 2967 4 add.u64 %r1733,%r1726,%r677; add.u64 %r1734,%r1727,%r680; .loc 1 2968 29 ld.u32 %r1736,[%r1733]; ld.u32 %r1737,[%r1734]; .loc 1 2968 8 mad.lo.u32 %r1739,%r1736,%r1737,%r1731; .loc 1 2967 4 add.u64 %r1741,%r1733,%r677; add.u64 %r1742,%r1734,%r680; .loc 1 2968 29 ld.u32 %r1744,[%r1741]; ld.u32 %r1745,[%r1742]; .loc 1 2968 8 mad.lo.u32 %r313,%r1744,%r1745,%r1739; .loc 1 2967 28 add.u64 %r314,%r314,4; .loc 1 2967 4 add.u64 %r675,%r1741,%r677; add.u64 %r678,%r1742,%r680; setp.ne.u64 %r1747,%r457,%r314; @ %r1747 bra $L100; bra $L99; $L142: .loc 1 2966 6 mov.u32 %r313,0; $L99: .loc 1 2969 21 st.u32 [%r683],%r313; .loc 1 2963 32 add.u64 %r312,%r312,1; .loc 1 2963 7 add.u64 %r683,%r683,%r685; add.u64 %r686,%r686,%r209; setp.ne.u64 %r1302,%r216,%r312; @ %r1302 bra $L101; bra $L1; $L373: .loc 1 2972 11 setp.gt.s64 %r1303,%r208,%r504; @ %r1303 bra $L102; .loc 1 2993 7 setp.gt.s64 %r1304,%r216,0; @ %r1304 bra $L103; bra $L1; $L102: .loc 1 2974 7 setp.gt.s64 %r1305,%r216,0; @ %r1305 bra $L104; bra $L1; $L108: shl.b64 %r1306,%r719,2; 
add.u64 %r715,%r246,%r1306; .loc 1 2975 9 mov.u64 %r311,0; and.b64 %r1516,%r215,3; setp.eq.u64 %r1519,%r1516,0; @ %r1519 bra $L105; setp.eq.u64 %r1518,%r1516,1; @ %r1518 bra $L316; setp.eq.u64 %r1517,%r1516,2; @ %r1517 bra $L317; bra $L377; $L105: .loc 1 2976 34 st.u32 [%r715],%r1417; .loc 1 2975 2 add.u64 %r1749,%r715,%r717; .loc 1 2976 34 st.u32 [%r1749],%r1417; .loc 1 2975 2 add.u64 %r1751,%r1749,%r717; .loc 1 2976 34 st.u32 [%r1751],%r1417; .loc 1 2975 2 add.u64 %r1754,%r1751,%r717; .loc 1 2976 34 st.u32 [%r1754],%r1417; .loc 1 2975 27 add.u64 %r311,%r311,4; .loc 1 2975 2 add.u64 %r715,%r1754,%r717; setp.ne.u64 %r1756,%r215,%r311; @ %r1756 bra $L105; $L109: .loc 1 2974 32 add.u64 %r310,%r326,1; .loc 1 2974 7 add.u64 %r719,%r719,%r505; setp.ne.u64 %r1309,%r216,%r310; @ %r1309 bra $L143; shl.b64 %r703,%r503,2; shl.b64 %r694,%r504,2; mov.u64 %r708,0; mov.u64 %r707,%r708; mov.u64 %r327,%r708; setp.le.s64 %r1757,%r457,0; bra $L107; $L104: shl.b64 %r717,%r445,2; mov.u64 %r719,0; .loc 1 2974 14 mov.u64 %r326,%r719; setp.gt.s64 %r1758,%r215,0; .loc 1 2976 34 cvt.u32.u64 %r1417,%r719; bra $L106; $L143: mov.u64 %r326,%r310; $L106: .loc 1 2975 2 @ %r1758 bra $L108; bra $L109; $L112: shl.b64 %r1311,%r699,2; add.u64 %r692,%r244,%r1311; .loc 1 2980 4 mov.u64 %r691,%r712; .loc 1 2980 11 mov.u64 %r309,0; and.b64 %r1508,%r215,3; setp.eq.u64 %r1511,%r1508,0; @ %r1511 bra $L110; setp.eq.u64 %r1510,%r1508,1; @ %r1510 bra $L318; setp.eq.u64 %r1509,%r1508,2; @ %r1509 bra $L319; bra $L378; $L110: .loc 1 2983 37 ld.u32 %r1313,[%r692]; ld.u32 %r1314,[%r701]; .loc 1 2982 36 ld.u32 %r1316,[%r691]; mad.lo.u32 %r1315,%r1313,%r1314,%r1316; st.u32 [%r691],%r1315; .loc 1 2980 4 add.u64 %r1760,%r691,%r717; add.u64 %r1761,%r692,%r694; .loc 1 2983 37 ld.u32 %r1762,[%r1761]; ld.u32 %r1763,[%r701]; .loc 1 2982 36 ld.u32 %r1765,[%r1760]; mad.lo.u32 %r1766,%r1762,%r1763,%r1765; st.u32 [%r1760],%r1766; .loc 1 2980 4 add.u64 %r1768,%r1760,%r717; add.u64 %r1769,%r1761,%r694; .loc 1 2983 37 ld.u32 %r1771,[%r1769]; ld.u32 %r1772,[%r701]; .loc 1 2982 36 ld.u32 %r1774,[%r1768]; mad.lo.u32 %r1775,%r1771,%r1772,%r1774; st.u32 [%r1768],%r1775; .loc 1 2980 4 add.u64 %r1777,%r1768,%r717; add.u64 %r1778,%r1769,%r694; .loc 1 2983 37 ld.u32 %r1780,[%r1778]; ld.u32 %r1781,[%r701]; .loc 1 2982 36 ld.u32 %r1783,[%r1777]; mad.lo.u32 %r1784,%r1780,%r1781,%r1783; st.u32 [%r1777],%r1784; .loc 1 2980 29 add.u64 %r309,%r309,4; .loc 1 2980 4 add.u64 %r691,%r1777,%r717; add.u64 %r692,%r1778,%r694; setp.ne.u64 %r1785,%r215,%r309; @ %r1785 bra $L110; $L113: .loc 1 2979 26 add.u64 %r308,%r308,1; .loc 1 2979 2 add.u64 %r699,%r699,%r208; add.u64 %r701,%r701,%r703; setp.eq.u64 %r1318,%r457,%r308; @ %r1318 bra $L111; $L114: .loc 1 2980 4 @ %r1758 bra $L112; bra $L113; $L111: .loc 1 2978 32 add.u64 %r307,%r327,1; .loc 1 2978 7 add.u64 %r707,%r707,%r505; add.u64 %r708,%r708,%r209; setp.eq.u64 %r1320,%r326,%r327; @ %r1320 bra $L1; mov.u64 %r327,%r307; $L107: .loc 1 2979 2 @ %r1757 bra $L111; shl.b64 %r1322,%r708,2; add.u64 %r701,%r245,%r1322; shl.b64 %r1323,%r707,2; add.u64 %r712,%r246,%r1323; mov.u64 %r699,0; .loc 1 2979 9 mov.u64 %r308,%r699; bra $L114; $L103: shl.b64 %r734,%r445,2; shl.b64 %r725,%r208,2; shl.b64 %r729,%r503,2; .loc 1 2993 7 mov.u64 %r743,0; mov.u64 %r741,%r743; .loc 1 2993 14 mov.u64 %r303,%r743; setp.gt.s64 %r1786,%r215,0; setp.le.s64 %r1418,%r457,0; $L116: .loc 1 2997 4 @ %r1786 bra $L115; $L120: .loc 1 2993 32 add.u64 %r303,%r303,1; .loc 1 2993 7 add.u64 %r741,%r741,%r505; add.u64 %r743,%r743,%r209; setp.ne.u64 %r1325,%r216,%r303; @ 
%r1325 bra $L116; bra $L1; $L115: shl.b64 %r1326,%r741,2; add.u64 %r732,%r246,%r1326; shl.b64 %r1327,%r743,2; add.u64 %r749,%r245,%r1327; .loc 1 2997 4 mov.u64 %r736,0; .loc 1 2997 11 mov.u64 %r304,%r736; $L119: .loc 1 3001 8 @ %r1418 bra $L144; shl.b64 %r1329,%r736,2; add.u64 %r723,%r244,%r1329; mov.u64 %r727,%r749; .loc 1 3000 10 mov.u32 %r305,0; .loc 1 3001 15 mov.u64 %r306,0; and.b64 %r1524,%r457,3; setp.eq.u64 %r1527,%r1524,0; @ %r1527 bra $L118; setp.eq.u64 %r1526,%r1524,1; @ %r1526 bra $L320; setp.eq.u64 %r1525,%r1524,2; @ %r1525 bra $L321; bra $L379; $L118: .loc 1 3002 28 ld.u32 %r1331,[%r723]; ld.u32 %r1332,[%r727]; .loc 1 3002 5 mad.lo.u32 %r1788,%r1331,%r1332,%r305; .loc 1 3001 8 add.u64 %r1790,%r723,%r725; add.u64 %r1791,%r727,%r729; .loc 1 3002 28 ld.u32 %r1792,[%r1790]; ld.u32 %r1793,[%r1791]; .loc 1 3002 5 mad.lo.u32 %r1795,%r1792,%r1793,%r1788; .loc 1 3001 8 add.u64 %r1797,%r1790,%r725; add.u64 %r1798,%r1791,%r729; .loc 1 3002 28 ld.u32 %r1800,[%r1797]; ld.u32 %r1801,[%r1798]; .loc 1 3002 5 mad.lo.u32 %r1803,%r1800,%r1801,%r1795; .loc 1 3001 8 add.u64 %r1805,%r1797,%r725; add.u64 %r1806,%r1798,%r729; .loc 1 3002 28 ld.u32 %r1808,[%r1805]; ld.u32 %r1809,[%r1806]; .loc 1 3002 5 mad.lo.u32 %r305,%r1808,%r1809,%r1803; .loc 1 3001 32 add.u64 %r306,%r306,4; .loc 1 3001 8 add.u64 %r723,%r1805,%r725; add.u64 %r727,%r1806,%r729; setp.ne.u64 %r1811,%r457,%r306; @ %r1811 bra $L118; bra $L117; $L144: .loc 1 3000 10 mov.u32 %r305,0; $L117: .loc 1 3003 27 st.u32 [%r732],%r305; .loc 1 2997 29 add.u64 %r304,%r304,1; .loc 1 2997 4 add.u64 %r732,%r732,%r734; add.u64 %r736,%r736,%r504; setp.ne.u64 %r1334,%r215,%r304; @ %r1334 bra $L119; bra $L120; $L32: .loc 1 2920 11 setp.eq.u64 %r1335,%r208,1; @ %r1335 bra $L121; bra $L20; $L2: .loc 1 2476 6 ld.u64 %r1336,[%r750]; setp.ne.u64 %r1337,%r1336,0; @ %r1337 bra $L122; .loc 1 2485 4 ld.u64 %r1339,[%r751+56]; add.u64 %r1338,%r1339,1; ld.u64 %r1340,[%r751+48]; sub.u64 %r509,%r1338,%r1340; .loc 1 2483 15 ld.s8 %r1342,[%r752+28]; cvt.u16.u32 %r1341,%r1342; setp.eq.u16 %r1343,%r1341,1; @ %r1343 bra $L123; bra $L124; $L126: .loc 1 2560 16 ld.u64 %r504,[%r751+40]; .loc 1 2548 27 mov.u64 %r445,%r505; bra $L125; $L14: ld.u64 %r505,[%r750+40]; .loc 1 2543 6 ld.s8 %r1345,[%r750+28]; cvt.u16.u32 %r1344,%r1345; setp.eq.u16 %r1346,%r1344,1; @ ! %r1346 bra $L127; bra $L126; $L11: .loc 1 2560 16 ld.u64 %r504,[%r751+40]; mov.u32 %r447,%r332; .loc 1 2548 27 mov.u64 %r505,%r445; .loc 1 2563 14 mov.u64 %r215,1; .loc 1 2561 16 mov.u64 %r208,%r215; bra $L12; $L5: .loc 1 2485 4 ld.u64 %r1348,[%r751+56]; add.u64 %r1347,%r1348,1; ld.u64 %r1349,[%r751+48]; sub.u64 %r509,%r1347,%r1349; bra $L124; $L128: .loc 1 2509 17 ld.u64 %r1351,[%r750+56]; add.u64 %r1350,%r1351,1; .loc 1 2509 15 ld.u64 %r1352,[%r750+48]; sub.u64 %r457,%r1350,%r1352; ld.s8 %r1364,[%r752+28]; bra $L9; $L122: .loc 1 2502 12 cvta.global.u64 %r1353,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r1354,[%r1353+36]; setp.ne.u32 %r1355,%r1354,0; @ ! 
%r1355 bra $L8; bra $L128; $L1$L372: .loc 1 2908 13 ld.u32 %r1812,[%r297]; ld.u32 %r1813,[%r400]; .loc 1 2907 11 mad.lo.u32 %r268,%r1812,%r1813,%r268; .loc 1 2905 10 add.u64 %r297,%r297,4; add.u64 %r440,%r400,4; $L309: .loc 1 2908 13 ld.u32 %r1816,[%r297]; ld.u32 %r1817,[%r440]; .loc 1 2907 11 mad.lo.u32 %r268,%r1816,%r1817,%r268; .loc 1 2905 10 add.u64 %r297,%r297,4; add.u64 %r440,%r440,4; $L308: .loc 1 2908 13 ld.u32 %r1820,[%r297]; ld.u32 %r1821,[%r440]; .loc 1 2907 11 mad.lo.u32 %r268,%r1820,%r1821,%r268; .loc 1 2905 10 add.u64 %r297,%r297,4; add.u64 %r440,%r440,4; setp.ne.u64 %r1823,%r398,%r440; @ %r1823 bra $L83; bra $L82; $L371: add.u64 %r1824,%r418,%r428; .loc 1 2887 16 ld.u32 %r1825,[%r400]; .loc 1 2886 16 add.u64 %r1826,%r1824,%r417; .loc 1 2887 13 ld.u32 %r1827,[%r1826]; .loc 1 2886 11 mad.lo.u32 %r269,%r1825,%r1827,%r269; .loc 1 2889 13 ld.u32 %r1829,[%r428]; .loc 1 2888 11 mad.lo.u32 %r270,%r1825,%r1829,%r270; .loc 1 2890 16 add.u64 %r1831,%r1824,%r416; .loc 1 2891 13 ld.u32 %r1832,[%r1831]; .loc 1 2890 11 mad.lo.u32 %r271,%r1825,%r1832,%r271; .loc 1 2892 16 add.u64 %r1834,%r1824,%r415; .loc 1 2893 13 ld.u32 %r1835,[%r1834]; .loc 1 2892 11 mad.lo.u32 %r272,%r1825,%r1835,%r272; .loc 1 2884 10 add.u64 %r429,%r400,4; add.u64 %r428,%r428,4; setp.ne.u64 %r1837,%r398,%r429; @ %r1837 bra $L78; bra $L77; $L370: .loc 1 2855 16 ld.u32 %r1838,[%r389]; .loc 1 2856 13 ld.u32 %r1839,[%r387]; .loc 1 2855 11 mad.lo.u32 %r275,%r1838,%r1839,%r275; .loc 1 2858 13 ld.u32 %r1841,[%r239]; .loc 1 2857 11 mad.lo.u32 %r276,%r1838,%r1841,%r276; .loc 1 2860 13 ld.u32 %r1843,[%r32]; .loc 1 2859 11 mad.lo.u32 %r277,%r1838,%r1843,%r277; .loc 1 2862 13 ld.u32 %r1845,[%r207]; .loc 1 2861 11 mad.lo.u32 %r278,%r1838,%r1845,%r278; .loc 1 2853 10 add.u64 %r389,%r389,4; add.u64 %r387,%r387,4; add.u64 %r385,%r239,4; add.u64 %r383,%r32,4; add.u64 %r381,%r207,4; $L307: .loc 1 2855 16 ld.u32 %r1848,[%r389]; .loc 1 2856 13 ld.u32 %r1849,[%r387]; .loc 1 2855 11 mad.lo.u32 %r275,%r1848,%r1849,%r275; .loc 1 2858 13 ld.u32 %r1851,[%r385]; .loc 1 2857 11 mad.lo.u32 %r276,%r1848,%r1851,%r276; .loc 1 2860 13 ld.u32 %r1853,[%r383]; .loc 1 2859 11 mad.lo.u32 %r277,%r1848,%r1853,%r277; .loc 1 2862 13 ld.u32 %r1855,[%r381]; .loc 1 2861 11 mad.lo.u32 %r278,%r1848,%r1855,%r278; .loc 1 2853 10 add.u64 %r389,%r389,4; add.u64 %r387,%r387,4; add.u64 %r385,%r385,4; add.u64 %r383,%r383,4; add.u64 %r381,%r381,4; $L306: .loc 1 2855 16 ld.u32 %r1858,[%r389]; .loc 1 2856 13 ld.u32 %r1859,[%r387]; .loc 1 2855 11 mad.lo.u32 %r275,%r1858,%r1859,%r275; .loc 1 2858 13 ld.u32 %r1861,[%r385]; .loc 1 2857 11 mad.lo.u32 %r276,%r1858,%r1861,%r276; .loc 1 2860 13 ld.u32 %r1863,[%r383]; .loc 1 2859 11 mad.lo.u32 %r277,%r1858,%r1863,%r277; .loc 1 2862 13 ld.u32 %r1865,[%r381]; .loc 1 2861 11 mad.lo.u32 %r278,%r1858,%r1865,%r278; .loc 1 2853 10 add.u64 %r389,%r389,4; add.u64 %r387,%r387,4; add.u64 %r385,%r385,4; add.u64 %r383,%r383,4; add.u64 %r381,%r381,4; setp.ne.u64 %r1867,%r226,%r387; @ %r1867 bra $L70; bra $L69; $L369: .loc 1 2762 43 ld.u32 %r1868,[%r559]; st.u32 [%r100],%r1868; .loc 1 2760 9 add.u64 %r56,%r559,4; add.u64 %r331,%r100,1024; $L305: .loc 1 2762 43 ld.u32 %r1870,[%r56]; st.u32 [%r331],%r1870; .loc 1 2760 9 add.u64 %r56,%r56,4; add.u64 %r331,%r331,1024; $L304: .loc 1 2762 43 ld.u32 %r1872,[%r56]; st.u32 [%r331],%r1872; .loc 1 2760 9 add.u64 %r56,%r56,4; add.u64 %r331,%r331,1024; setp.ne.u64 %r1873,%r228,%r56; @ %r1873 bra $L58; bra $L48; $L368: .loc 1 2741 7 add.u64 %r1874,%r539,%r542; add.u64 %r1875,%r1874,%r543; .loc 1 2740 49 ld.u32 
%r1876,[%r1875]; st.u32 [%r315],%r1876; .loc 1 2742 49 ld.u32 %r1877,[%r539]; st.u32 [%r315+4],%r1877; .loc 1 2745 7 add.u64 %r1878,%r539,%r543; add.u64 %r1879,%r1878,%r542; .loc 1 2744 49 ld.u32 %r1880,[%r1879+4]; st.u32 [%r315+1024],%r1880; .loc 1 2746 49 ld.u32 %r1881,[%r539+4]; st.u32 [%r315+1028],%r1881; .loc 1 2738 9 add.u64 %r315,%r315,2048; add.u64 %r160,%r539,8; $L303: .loc 1 2741 7 add.u64 %r1883,%r160,%r542; add.u64 %r1884,%r1883,%r543; .loc 1 2740 49 ld.u32 %r1885,[%r1884]; st.u32 [%r315],%r1885; .loc 1 2742 49 ld.u32 %r1886,[%r160]; st.u32 [%r315+4],%r1886; .loc 1 2745 7 add.u64 %r1887,%r160,%r543; add.u64 %r1888,%r1887,%r542; .loc 1 2744 49 ld.u32 %r1889,[%r1888+4]; st.u32 [%r315+1024],%r1889; .loc 1 2746 49 ld.u32 %r1890,[%r160+4]; st.u32 [%r315+1028],%r1890; .loc 1 2738 9 add.u64 %r315,%r315,2048; add.u64 %r160,%r160,8; $L302: .loc 1 2741 7 add.u64 %r1892,%r160,%r542; add.u64 %r1893,%r1892,%r543; .loc 1 2740 49 ld.u32 %r1894,[%r1893]; st.u32 [%r315],%r1894; .loc 1 2742 49 ld.u32 %r1895,[%r160]; st.u32 [%r315+4],%r1895; .loc 1 2745 7 add.u64 %r1896,%r160,%r543; add.u64 %r1897,%r1896,%r542; .loc 1 2744 49 ld.u32 %r1898,[%r1897+4]; st.u32 [%r315+1024],%r1898; .loc 1 2746 49 ld.u32 %r1899,[%r160+4]; st.u32 [%r315+1028],%r1899; .loc 1 2738 9 add.u64 %r315,%r315,2048; add.u64 %r160,%r160,8; setp.ne.u64 %r1900,%r160,%r530; @ %r1900 bra $L52; bra $L53; $L366: .loc 1 2688 2 @ ! %r1361 bra $L380; .loc 1 2689 22 mov.u32 %r19019011902,[%value_in]; } $L380: .loc 1 2687 24 mov.u64 %r298,2; .loc 1 2687 7 add.u64 %r622,%r246,%r623; $L301: .loc 1 2688 2 @ ! %r1361 bra $L381; .loc 1 2689 22 mov.u32 %r190419041905,[%value_in]; } $L381: .loc 1 2687 24 add.u64 %r298,%r298,1; .loc 1 2687 7 add.u64 %r622,%r622,%r623; $L300: .loc 1 2688 2 @ ! %r1361 bra $L382; .loc 1 2689 22 mov.u32 %r190719071908,[%value_in]; } $L382: .loc 1 2687 24 add.u64 %r298,%r298,1; .loc 1 2687 7 add.u64 %r622,%r622,%r623; setp.eq.u64 %r1909,%r298,%r626; @ ! %r1909 bra $L34; bra $L33; $L238: .loc 1 2689 22 mov.u32 %r1910153911911,[%value_in]; } $L367: .loc 1 2687 7 add.u64 %r1913,%r1530,%r623; .loc 1 2688 2 @ ! %r1361 bra $L383; .loc 1 2689 22 mov.u32 %r191519119151916,[%value_in]; } $L383: .loc 1 2687 7 add.u64 %r1918,%r1913,%r623; .loc 1 2688 2 @ ! %r1361 bra $L384; .loc 1 2689 22 mov.u32 %r192019121921,[%value_in]; } $L384: .loc 1 2687 24 add.u64 %r298,%r1477,3; .loc 1 2687 7 add.u64 %r622,%r1918,%r623; setp.eq.u64 %r1922,%r298,%r626; @ ! 
%r1922 bra $L34; bra $L33; $L374: .loc 1 2938 23 ld.u32 %r1923,[%r627]; ld.u32 %r1924,[%r654]; mul.lo.u32 %r302,%r1923,%r1924; .loc 1 2937 5 add.u64 %r627,%r627,4; add.u64 %r628,%r654,4; $L311: .loc 1 2938 23 ld.u32 %r1927,[%r627]; ld.u32 %r1928,[%r628]; .loc 1 2938 9 mad.lo.u32 %r302,%r1927,%r1928,%r302; .loc 1 2937 5 add.u64 %r627,%r627,4; add.u64 %r628,%r628,4; $L310: .loc 1 2938 23 ld.u32 %r1931,[%r627]; ld.u32 %r1932,[%r628]; .loc 1 2938 9 mad.lo.u32 %r302,%r1931,%r1932,%r302; .loc 1 2937 5 add.u64 %r627,%r627,4; add.u64 %r628,%r628,4; setp.ne.u64 %r1934,%r627,%r633; @ %r1934 bra $L93; bra $L92; $L375: .loc 1 2953 26 ld.u32 %r1935,[%r244]; ld.u32 %r1936,[%r658]; mul.lo.u32 %r300,%r1935,%r1936; .loc 1 2952 8 add.u64 %r655,%r244,%r657; add.u64 %r658,%r658,4; $L313: .loc 1 2953 26 ld.u32 %r1939,[%r655]; ld.u32 %r1940,[%r658]; .loc 1 2953 5 mad.lo.u32 %r300,%r1939,%r1940,%r300; .loc 1 2952 8 add.u64 %r655,%r655,%r657; add.u64 %r658,%r658,4; $L312: .loc 1 2953 26 ld.u32 %r1943,[%r655]; ld.u32 %r1944,[%r658]; .loc 1 2953 5 mad.lo.u32 %r300,%r1943,%r1944,%r300; .loc 1 2952 8 add.u64 %r655,%r655,%r657; add.u64 %r658,%r658,4; setp.ne.u64 %r1946,%r658,%r674; @ %r1946 bra $L97; bra $L96; $L376: .loc 1 2968 29 ld.u32 %r1947,[%r244]; ld.u32 %r1948,[%r678]; mul.lo.u32 %r313,%r1947,%r1948; .loc 1 2967 28 mov.u64 %r314,1; .loc 1 2967 4 add.u64 %r675,%r244,%r677; add.u64 %r678,%r678,%r680; $L315: .loc 1 2968 29 ld.u32 %r1951,[%r675]; ld.u32 %r1952,[%r678]; .loc 1 2968 8 mad.lo.u32 %r313,%r1951,%r1952,%r313; .loc 1 2967 28 add.u64 %r314,%r314,1; .loc 1 2967 4 add.u64 %r675,%r675,%r677; add.u64 %r678,%r678,%r680; $L314: .loc 1 2968 29 ld.u32 %r1955,[%r675]; ld.u32 %r1956,[%r678]; .loc 1 2968 8 mad.lo.u32 %r313,%r1955,%r1956,%r313; .loc 1 2967 28 add.u64 %r314,%r314,1; .loc 1 2967 4 add.u64 %r675,%r675,%r677; add.u64 %r678,%r678,%r680; setp.ne.u64 %r1958,%r457,%r314; @ %r1958 bra $L100; bra $L99; $L378: .loc 1 2983 37 ld.u32 %r1959,[%r692]; ld.u32 %r1960,[%r701]; .loc 1 2982 36 ld.u32 %r1962,[%r712]; mad.lo.u32 %r1963,%r1959,%r1960,%r1962; st.u32 [%r712],%r1963; .loc 1 2980 29 mov.u64 %r309,1; .loc 1 2980 4 add.u64 %r691,%r712,%r717; add.u64 %r692,%r692,%r694; $L319: .loc 1 2983 37 ld.u32 %r1965,[%r692]; ld.u32 %r1966,[%r701]; .loc 1 2982 36 ld.u32 %r1968,[%r691]; mad.lo.u32 %r1969,%r1965,%r1966,%r1968; st.u32 [%r691],%r1969; .loc 1 2980 29 add.u64 %r309,%r309,1; .loc 1 2980 4 add.u64 %r691,%r691,%r717; add.u64 %r692,%r692,%r694; $L318: .loc 1 2983 37 ld.u32 %r1971,[%r692]; ld.u32 %r1972,[%r701]; .loc 1 2982 36 ld.u32 %r1974,[%r691]; mad.lo.u32 %r1975,%r1971,%r1972,%r1974; st.u32 [%r691],%r1975; .loc 1 2980 29 add.u64 %r309,%r309,1; .loc 1 2980 4 add.u64 %r691,%r691,%r717; add.u64 %r692,%r692,%r694; setp.ne.u64 %r1976,%r215,%r309; @ %r1976 bra $L110; bra $L113; $L377: .loc 1 2976 34 st.u32 [%r715],%r1417; .loc 1 2975 27 mov.u64 %r311,1; .loc 1 2975 2 add.u64 %r715,%r715,%r717; $L317: .loc 1 2976 34 st.u32 [%r715],%r1417; .loc 1 2975 27 add.u64 %r311,%r311,1; .loc 1 2975 2 add.u64 %r715,%r715,%r717; $L316: .loc 1 2976 34 st.u32 [%r715],%r1417; .loc 1 2975 27 add.u64 %r311,%r311,1; .loc 1 2975 2 add.u64 %r715,%r715,%r717; setp.ne.u64 %r1979,%r215,%r311; @ %r1979 bra $L105; bra $L109; $L379: .loc 1 3002 28 ld.u32 %r1980,[%r723]; ld.u32 %r1981,[%r749]; mul.lo.u32 %r305,%r1980,%r1981; .loc 1 3001 32 mov.u64 %r306,1; .loc 1 3001 8 add.u64 %r723,%r723,%r725; add.u64 %r727,%r749,%r729; $L321: .loc 1 3002 28 ld.u32 %r1984,[%r723]; ld.u32 %r1985,[%r727]; .loc 1 3002 5 mad.lo.u32 %r305,%r1984,%r1985,%r305; .loc 1 
3001 32 add.u64 %r306,%r306,1; .loc 1 3001 8 add.u64 %r723,%r723,%r725; add.u64 %r727,%r727,%r729; $L320: .loc 1 3002 28 ld.u32 %r1988,[%r723]; ld.u32 %r1989,[%r727]; .loc 1 3002 5 mad.lo.u32 %r305,%r1988,%r1989,%r305; .loc 1 3001 32 add.u64 %r306,%r306,1; .loc 1 3001 8 add.u64 %r723,%r723,%r725; add.u64 %r727,%r727,%r729; setp.ne.u64 %r1991,%r457,%r306; @ %r1991 bra $L118; bra $L117; } matmul_i8.o/_gfortran_matmul_i8 .visible .func _gfortran_matmul_, .param .u32 %in_ar4, .param .u64 %in_ar5); .file 1 "../../../../libgfortran/generated/matmul_i8.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]FUNCTION DECL: _gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime___func__$0 .const .align 1 .u8 __func__$0[10] = {109,97,116,109,117,108,95,105,56,0 }2] = {782] = {672] = {8471,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,97,41,32,61,61,32,50,32,124,124,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,98,41,32,61,61,3246116,114,97,110,47,103,101,110,101,114,97,116,101,100,47,109,97,116,109,117,108,95,105,56,465,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,49,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3265,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,50,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3273,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,97,114,103,117,109,101,110,116,32,66,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,49,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }103,101,109,109,32,33,61,32,78,85,76,7_gfortran_matmul_i8 .visible .func _gfortran_matmul_8964u64 %r280; .reg .u64 %r281; .reg .u64 %r282; .reg .u64 %r283; .reg .u64u64u64u64 %r305; .reg .u64 %r306; .reg .u64u64u32 %r331; .reg .u64 %r332; .reg .u64u64 %r374; .reg .u64u64 %r390; .reg .u64 %r392; .reg .u64u64u64 %r409; .reg .u64u64u64 %r530; .reg .u64 %r536; .reg .u64 %r537; .reg .u64 %r538; .reg .u64u64 %r555; .reg .u64u64 %r571; .reg .u64 %r573; .reg .u64 %r574; .reg .u64 %r575; .reg .u64 %r576; .reg .u64 %r577; .reg .u64 %r585; .reg .u64 %r586; .reg .u64 %r587; .reg .u64 %r588; .reg .u64 %r589; .reg .u64 %r590; .reg .u64 %r591; .reg .u64 %r592; .reg .u64 %r594; .reg .u64u64.reg .u64 %r619; .reg .u64 %r620; .reg .u64 %r621; .reg .u64 %r624; .reg .u64 %r625; .reg .u64 %r626; .reg .u64 %r629; .reg .u64 %r630; .reg .u64 %r631; .reg .u64 %r633; .reg .u64 %r634; .reg .u64 %r640; .reg .u64 %r642; .reg .u64 %r643; .reg .u64 %r647; .reg .u64 %r652; .reg .u64 %r653; .reg .u64 %r655; .reg .u64 %r656; .reg .u64 %r661; .reg .u64 %r663; .reg .u64 %r664; .reg .u64 %r672; .reg .u64 %r673; .reg .u64 %r675; .reg .u64 %r676; .reg .u64 %r678; .reg .u64 %r681; .reg .u64 %r683; .reg .u64u64 %r699; .reg .u64 %r701; .reg .u64 %r705; .reg .u64 %r706; .reg .u64 %r710; .reg .u64 %r713; .reg .u64 %r715; .reg .u64 %r717; .reg .u64 %r721; .reg .u64 %r723; .reg .u64 %r725; .reg .u64 %r727; .reg .u64 %r730; .reg .u64 %r732; .reg .u64 %r734; .reg .u64 %r739; .reg .u64 %r741; .reg .u64 %r747; .reg .u64 %r748; .reg .u64 %r749; .reg .u64 %r750; .reg .u32 %r751; .reg .u32 %r752; .reg .u64 %r753; .reg .u16 %r754; 
.reg .pred %r755; .reg .u16 %r756; .reg .pred %r758; .reg .u64 %r759; .reg .u32 %r760; .reg .u64 %r761; .reg .u64 %r762; .reg .u64 %r763; .reg .pred %r764; .reg .pred %r766; .reg .u64 %r769; .reg .u64 %r771; .reg .u64 %r772; .reg .u64 %r773; .reg .u64 %r775; .reg .u64 %r776; .reg .u64 %r777; .reg .u64 %r778; .reg .u64 %r779; .reg .u64 %r782; .reg .u64 %r784; .reg .u64 %r785; .reg .u64 %r787; .reg .u64 %r789; .reg .u64 %r790; .reg .u64 %r792; .reg .u64 %r793; .reg .u64 %r794; .reg .u64 %r795; .reg .u64 %r796; .reg .u32 %r797; .reg .pred %r798; .reg .u64 %r799; .reg .u64 %r800; .reg .u64 %r801; .reg .pred %r803; .reg .u64 %r804; .reg .u64 %r805; .reg .u64 %r806; .reg .pred %r807; .reg .u64 %r808; .reg .u64 %r809; .reg .u64 %r810; .reg .u16 %r811; .reg .pred %r812; .reg .u64 %r813; .reg .u64 %r815; .reg .u64 %r816; .reg .u64 %r817; .reg .u16 %r818; .reg .pred %r820; .reg .pred %r821; .reg .u64 %r822; .reg .pred %r824; .reg .u64 %r825; .reg .u64 %r827; .reg .u64 %r828; .reg .u64 %r829; .reg .u64 %r830; .reg .u64 %r831; .reg .u64 %r832; .reg .pred %r833; .reg .u64 %r834; .reg .u64 %r836; .reg .u64 %r837; .reg .u64 %r838; .reg .u16 %r839; .reg .u32 %r840; .reg .pred %r841; .reg .u16 %r842; .reg .pred %r843; .reg .u64 %r844; .reg .u64 %r845; .reg .u64 %r846; .reg .u64 %r847; .reg .u64 %r848; .reg .u64 %r849; .reg .pred %r850; .reg .u32 %r852; .reg .u32 %r853; .reg .u32 %r855; .reg .u32 %r856; .reg .u16 %r857; .reg .u16 %r858; .reg .u16 %r859; .reg .u32 %r860; .reg .u16 %r861; .reg .pred %r862; .reg .u64 %r863; .reg .u16 %r865; .reg .pred %r866; .reg .u64 %r867; .reg .u64 %r868; .reg .u64 %r869; .reg .u32 %r870; .reg .u32 %r871; .reg .u32 %r872; .reg .u32 %r873; .reg .u32 %r874; .reg .u32 %r875; .reg .u32 %r877; .reg .u32 %r878; .reg .u16 %r879; .reg .u16 %r880; .reg .u16 %r881; .reg .u32 %r882; .reg .u16 %r883; .reg .pred %r884; .reg .u32 %r886; .reg .u32 %r887; .reg .u16 %r888; .reg .u16 %r889; .reg .u16 %r890; .reg .u32 %r891; .reg .u16 %r892; .reg .pred %r893; .reg .u32 %r895; .reg .u32 %r896; .reg .u32 %r898; .reg .u32 %r899; .reg .u16 %r900; .reg .u16 %r901; .reg .u16 %r902; .reg .u32 %r903; .reg .u16 %r904; .reg .pred %r905; .reg .f32 %r906; .reg .f32 %r907; .reg .f32 %r908; .reg .f32 %r909; .reg .f32 %r910; .reg .f32 %r911; .reg .f32 %r912; .reg .pred %r913; .reg .u64 %r914; .reg .u64 %r915; .reg .pred %r916; .reg .pred %r917; .reg .u32 %r919; .reg .u32 %r920; .reg .u32 %r922; .reg .u32 %r923; .reg .u16 %r924; .reg .u16 %r925; .reg .u16 %r926; .reg .u32 %r927; .reg .u16 %r928; .reg .pred %r929; .reg .u32 %r931; .reg .u32 %r932; .reg .u32 %r934; .reg .u32 %r935; .reg .u16 %r936; .reg .u16 %r937; .reg .u16 %r938; .reg .u32 %r939; .reg .u16 %r940; .reg .pred %r941; .reg .u32 %r943; .reg .u32 %r944; .reg .u32 %r946; .reg .u32 %r947; .reg .u16 %r948; .reg .u16 %r949; .reg .u16 %r950; .reg .u32 %r951; .reg .u16 %r952; .reg .pred %r953; .reg .pred %r954; .reg .u64 %r955; .reg .u32 %r956; .reg .u64 %r957; .reg .u64 %r958; .reg .u32 %r959; .reg .pred %r960; .reg .pred %r961; .reg .u32 %r962; .reg .pred %r963; .reg .pred %r964; .reg .u32 %r979; .reg .u64 %r980; .reg .u64 %r981; .reg .u64 %r982; .reg .u64 %r983; .reg .u64 %r984; .reg .u64 %r985; .reg .u64 %r986; .reg .u16 %r987; .reg .u16 %r988; .reg .u16 %r989; .reg .u32 %r990; .reg .u16 %r991; .reg .pred %r992; .reg .pred %r993; .reg .u16 %r994; .reg .pred %r995; .reg .u64 %r996; .reg .pred %r997; .reg .u32 %r1001; .reg .u64 %r1003; .reg .u32 %r1008; .reg .u32 %r1009; .reg .u32 %r1011; .reg .u32 %r1012; .reg .u16 %r1013; .reg .u16 %r1014; .reg .u16 
%r1015; .reg .u32 %r1017; .reg .u32 %r1018; .reg .u16 %r1020; .reg .u16 %r1022; .reg .u32 %r1023; .reg .u16 %r1024; .reg .pred %r1025; .reg .pred %r1026; .reg .u64 %r1027; .reg .u64 %r1029; .reg .u64 %r1030; .reg .u64 %r1031; .reg .pred %r1033; .reg .u64 %r1035; .reg .u64 %r1036; .reg .u64 %r1037; .reg .u64 %r1040; .reg .u64 %r1041; .reg .u64 %r1045; .reg .u64 %r1046; .reg .u64 %r1047; .reg .u64 %r1049; .reg .u64 %r1050; .reg .u64 %r1053; .reg .u64 %r1056; .reg .u64 %r1057; .reg .u64 %r1060; .reg .u64 %r1062; .reg .u64 %r1063; .reg .u64 %r1064; .reg .u64 %r1065; .reg .u64 %r1066; .reg .pred %r1068; .reg .u64 %r1074; .reg .u64 %r1075; .reg .u64 %r1076; .reg .u64 %r1077; .reg .pred %r1079; .reg .u64 %r1080; .reg .u64 %r1081; .reg .u64 %r1082; .reg .u64 %r1083; .reg .u64 %r1084; .reg .u64 %r1086; .reg .u64 %r1087; .reg .u64 %r1088; .reg .u64 %r1089; .reg .u64 %r1090; .reg .u64 %r1091; .reg .u64 %r1092; .reg .u64 %r1093; .reg .u64 %r1097; .reg .u64 %r1098; .reg .u64 %r1099; .reg .u64 %r1100; .reg .u64 %r1104; .reg .u64 %r1106; .reg .u64 %r1109; .reg .u64 %r1112; .reg .u64 %r1117; .reg .u64 %r1118; .reg .u64 %r1120; .reg .u64 %r1121; .reg .u64 %r1122; .reg .u64 %r1123; .reg .u64 %r1124; .reg .u64 %r1125; .reg .u64 %r1126; .reg .u64 %r1127; .reg .u64 %r1129; .reg .u64 %r1130; .reg .u64 %r1131; .reg .u64 %r1132; .reg .u64 %r1133; .reg .pred %r1134; .reg .u64 %r1137; .reg .u64 %r1138; .reg .u64 %r1139; .reg .u64 %r1140; .reg .u64 %r1142; .reg .pred %r1143; .reg .u64 %r1144; .reg .u64 %r1145; .reg .u64 %r1146; .reg .u64 %r1149; .reg .u64 %r1150; .reg .u64 %r1151; .reg .u64 %r1152; .reg .u64 %r1153; .reg .u64 %r1154; .reg .u64 %r1155; .reg .u64 %r1156; .reg .u64 %r1158; .reg .u64 %r1159; .reg .u64 %r1160; .reg .u64 %r1161; .reg .u64 %r1166; .reg .u64 %r1169; .reg .u64 %r1172; .reg .u64 %r1175; .reg .u64 %r1178; .reg .u64 %r1180; .reg .pred %r1188; .reg .pred %r1189; .reg .pred %r1190; .reg .u64 %r1194; .reg .u64 %r1196; .reg .u64 %r1198; .reg .u64 %r1200; .reg .u64 %r1202; .reg .pred %r1204; .reg .pred %r1205; .reg .pred %r1206; .reg .u64 %r1207; .reg .u64 %r1208; .reg .u64 %r1210; .reg .u64 %r1211; .reg .u64 %r1212; .reg .u64 %r1213; .reg .u64 %r1221; .reg .u64 %r1222; .reg .u64 %r1223; .reg .u64 %r1225; .reg .u64 %r1227; .reg .u64 %r1229; .reg .u64 %r1230; .reg .u64 %r1232; .reg .u64 %r1233; .reg .u64 %r1235; .reg .pred %r1237; .reg .pred %r1238; .reg .u64 %r1241; .reg .u64 %r1243; .reg .u64 %r1244; .reg .pred %r1246; .reg .u16 %r1247; .reg .pred %r1248; .reg .u32 %r1250; .reg .u32 %r1251; .reg .u16 %r1252; .reg .u16 %r1253; .reg .u32 %r1255; .reg .u16 %r1256; .reg .pred %r1257; .reg .pred %r1258; .reg .u16 %r1259; .reg .pred %r1260; .reg .pred %r1261; .reg .pred %r1262; .reg .u64 %r1263; .reg .u64 %r1264; .reg .pred %r1266; .reg .u64 %r1267; .reg .u64 %r1268; .reg .u64 %r1270; .reg .u64 %r1272; .reg .u64 %r1273; .reg .pred %r1275; .reg .u64 %r1277; .reg .u64 %r1278; .reg .u64 %r1279; .reg .u64 %r1281; .reg .u64 %r1282; .reg .pred %r1284; .reg .pred %r1285; .reg .u64 %r1287; .reg .u64 %r1289; .reg .u64 %r1290; .reg .pred %r1292; .reg .pred %r1293; .reg .pred %r1294; .reg .pred %r1295; .reg .u64 %r1296; .reg .pred %r1299; .reg .u64 %r1301; .reg .u64 %r1303; .reg .u64 %r1304; .reg .u64 %r1305; .reg .u64 %r1306; .reg .pred %r1308; .reg .pred %r1310; .reg .u64 %r1312; .reg .u64 %r1313; .reg .pred %r1315; .reg .u64 %r1316; .reg .u64 %r1317; .reg .u64 %r1319; .reg .u64 %r1321; .reg .u64 %r1322; .reg .pred %r1324; .reg .pred %r1325; .reg .u64 %r1326; .reg .pred %r1327; .reg .u64 %r1328; .reg .u64 %r1329; 
.reg .u64 %r1330; .reg .u16 %r1331; .reg .u32 %r1332; .reg .pred %r1333; .reg .u16 %r1334; .reg .u32 %r1335; .reg .pred %r1336; .reg .u64 %r1337; .reg .u64 %r1338; .reg .u64 %r1339; .reg .u64 %r1340; .reg .u64 %r1341; .reg .u64 %r1342; .reg .u64 %r1343; .reg .u32 %r1344; .reg .pred %r1345; .reg .pred %r1347; .reg .pred %r1348; .reg .pred %r1349; .reg .pred %r1350; .reg .pred %r1351; .reg .u64 %r1353; .reg .u32 %r1354; .reg .u64 %r1355; .reg .pred %r1356; .reg .pred %r1361; .reg .pred %r1362; .reg .pred %r1363; .reg .pred %r1365; .reg .u64 %r1372; .reg .u64 %r1373; .reg .u64 %r1374; .reg .u64 %r1375; .reg .u64 %r1380; .reg .u64 %r1381; .reg .u64 %r1382; .reg .u64 %r1383; .reg .u64 %r1385; .reg .u64 %r1386; .reg .u64 %r1387; .reg .u64 %r1388; .reg .u64 %r1389; .reg .pred %r1390; .reg .u64 %r1392; .reg .pred %r1394; .reg .u64 %r1397; .reg .u64 %r1398; .reg .u64 %r1399; .reg .pred %r1400; .reg .u64 %r1401; .reg .u64 %r1402; .reg .pred %r1406; .reg .u64 %r1407; .reg .pred %r1408; .reg .u64 %r1409; .reg .u64 %r1410; .reg .u64 %r1411; .reg .u64 %r1412; .reg .u64 %r1413; .reg .pred %r1414; .reg .pred %r1415; .reg .pred %r1416; .reg .u64 %r1419; .reg .u64 %r1420; .reg .u64 %r1421; .reg .u64 %r1423; .reg .pred %r1424; .reg .u64 %r1426; .reg .u64 %r1427; .reg .u64 %r1428; .reg .u64 %r1429; .reg .u64 %r1430; .reg .u64 %r1431; .reg .pred %r1432; .reg .pred %r1433; .reg .pred %r1434; .reg .u64 %r1440; .reg .u64 %r1441; .reg .u64 %r1442; .reg .u64 %r1443; .reg .u64 %r1444; .reg .pred %r1445; .reg .pred %r1446; .reg .pred %r1447; .reg .u64 %r1450; .reg .u64 %r1451; .reg .u64 %r1452; .reg .u64 %r1453; .reg .u64 %r1454; .reg .pred %r1455; .reg .pred %r1456; .reg .pred %r1457; .reg .u64 %r1458; .reg .u64 %r1459; .reg .u64 %r1463; .reg .pred %r1464; .reg .pred %r1465; .reg .pred %r1466; .reg .u64 %r1467; .reg .u64 %r1468; .reg .u64 %r1469; .reg .u64 %r1470; .reg .u64 %r1471; .reg .u64 %r1472; .reg .pred %r1473; .reg .pred %r1474; .reg .pred %r1475; .reg .u64 %r1478; .reg .u64 %r1479; .reg .u64 %r1480; .reg .u64 %r1481; .reg .u64 %r1482; .reg .pred %r1483; .reg .pred %r1484; .reg .pred %r1485; .reg .u64 %r1490; .reg .pred %r1491; .reg .pred %r1492; .reg .pred %r1493; .reg .u64 %r1498; .reg .pred %r1499; .reg .pred %r1500; .reg .pred %r1501; .reg .u64 %r1506; .reg .pred %r1507; .reg .pred %r1508; .reg .pred %r1509; .reg .u64 %r1514; .reg .pred %r1515; .reg .pred %r1516; .reg .pred %r1517; .reg .u64 %r1520; .reg .u64 %r1521; .reg .u64 %r1523; .reg .u64 %r1524; .reg .u64 %r1527; .reg .u64 %r1528; .reg .u64 %r1529; .reg .u64 %r1530; .reg .u64 %r1531; .reg .u64 %r1532; .reg .u64 %r1533; .reg .u64 %r1534; .reg .u64 %r1535; .reg .u64 %r1536; .reg .u64 %r1538; .reg .u64 %r1539; .reg .u64 %r1540; .reg .u64 %r1541; .reg .u64 %r1542; .reg .u64 %r1543; .reg .u64 %r1544; .reg .u64 %r1545; .reg .u64 %r1546; .reg .u64 %r1547; .reg .u64 %r1549; .reg .u64 %r1550; .reg .u64 %r1551; .reg .u64 %r1552; .reg .u64 %r1553; .reg .u64 %r1554; .reg .u64 %r1555; .reg .u64 %r1556; .reg .pred %r1557; .reg .u64 %r1560; .reg .u64 %r1564; .reg .u64 %r1568; .reg .pred %r1569; .reg .u64 %r1570; .reg .u64 %r1571; .reg .u64 %r1572; .reg .u64 %r1573; .reg .u64 %r1579; .reg .u64 %r1580; .reg .u64 %r1582; .reg .u64 %r1583; .reg .u64 %r1585; .reg .u64 %r1586; .reg .u64 %r1588; .reg .u64 %r1589; .reg .u64 %r1591; .reg .u64 %r1598; .reg .u64 %r1599; .reg .u64 %r1601; .reg .u64 %r1602; .reg .u64 %r1604; .reg .u64 %r1605; .reg .u64 %r1607; .reg .u64 %r1608; .reg .u64 %r1610; .reg .u64 %r1617; .reg .u64 %r1618; .reg .u64 %r1620; .reg .u64 %r1622; .reg 
.u64 %r1624; .reg .pred %r1626; .reg .pred %r1627; .reg .u64 %r1628; .reg .u64 %r1629; .reg .u64 %r1630; .reg .u64 %r1631; .reg .u64 %r1634; .reg .u64 %r1635; .reg .u64 %r1636; .reg .u64 %r1637; .reg .u64 %r1639; .reg .u64 %r1641; .reg .u64 %r1642; .reg .u64 %r1644; .reg .u64 %r1645; .reg .pred %r1647; .reg .u64 %r1648; .reg .u64 %r1651; .reg .u64 %r1652; .reg .u64 %r1654; .reg .u64 %r1658; .reg .u64 %r1659; .reg .u64 %r1661; .reg .u64 %r1665; .reg .u64 %r1666; .reg .pred %r1668; .reg .pred %r1669; .reg .u64 %r1670; .reg .u64 %r1673; .reg .u64 %r1674; .reg .u64 %r1676; .reg .u64 %r1680; .reg .u64 %r1681; .reg .u64 %r1683; .reg .u64 %r1687; .reg .u64 %r1688; .reg .pred %r1690; .reg .pred %r1691; .reg .u64 %r1692; .reg .u64 %r1693; .reg .u64 %r1695; .reg .u64 %r1696; .reg .u64 %r1698; .reg .u64 %r1699; .reg .u64 %r1702; .reg .u64 %r1703; .reg .u64 %r1705; .reg .u64 %r1706; .reg .u64 %r1709; .reg .u64 %r1710; .reg .pred %r1712; .reg .pred %r1713; .reg .u64 %r1714; .reg .u64 %r1716; .reg .u64 %r1717; .reg .u64 %r1718; .reg .u64 %r1719; .reg .u64 %r1721; .reg .u64 %r1723; .reg .u64 %r1724; .reg .u64 %r1726; .reg .u64 %r1727; .reg .u64 %r1729; .reg .u64 %r1731; .reg .u64 %r1732; .reg .u64 %r1734; .reg .u64 %r1735; .reg .pred %r1737; .reg .u64 %r1739; .reg .u64 %r1741; .reg .u64 %r1744; .reg .pred %r1746; .reg .pred %r1747; .reg .pred %r1748; .reg .u64 %r1750; .reg .u64 %r1751; .reg .u64 %r1752; .reg .u64 %r1753; .reg .u64 %r1755; .reg .u64 %r1756; .reg .u64 %r1758; .reg .u64 %r1759; .reg .u64 %r1761; .reg .u64 %r1762; .reg .u64 %r1764; .reg .u64 %r1765; .reg .u64 %r1767; .reg .u64 %r1768; .reg .u64 %r1770; .reg .u64 %r1771; .reg .u64 %r1773; .reg .u64 %r1774; .reg .pred %r1775; .reg .pred %r1776; .reg .u64 %r1778; .reg .u64 %r1780; .reg .u64 %r1781; .reg .u64 %r1782; .reg .u64 %r1783; .reg .u64 %r1785; .reg .u64 %r1787; .reg .u64 %r1788; .reg .u64 %r1790; .reg .u64 %r1791; .reg .u64 %r1793; .reg .u64 %r1795; .reg .u64 %r1796; .reg .u64 %r1798; .reg .u64 %r1799; .reg .pred %r1801; .reg .u64 %r1802; .reg .u64 %r1803; .reg .u64 %r1806; .reg .u64 %r1807; .reg .u64 %r1810; .reg .u64 %r1811; .reg .pred %r1813; .reg .u64 %r1814; .reg .u64 %r1815; .reg .u64 %r1816; .reg .u64 %r1817; .reg .u64 %r1819; .reg .u64 %r1821; .reg .u64 %r1822; .reg .u64 %r1824; .reg .u64 %r1825; .reg .pred %r1827; .reg .u64 %r1828; .reg .u64 %r1829; .reg .u64 %r1831; .reg .u64 %r1833; .reg .u64 %r1835; .reg .u64 %r1838; .reg .u64 %r1839; .reg .u64 %r1841; .reg .u64 %r1843; .reg .u64 %r1845; .reg .u64 %r1848; .reg .u64 %r1849; .reg .u64 %r1851; .reg .u64 %r1853; .reg .u64 %r1855; .reg .pred %r1857; .reg .u64 %r1858; .reg .u64 %r1860; .reg .u64 %r1862; .reg .pred %r1863; .reg .u64 %r1864; .reg .u64 %r1865; .reg .u64 %r1866; .reg .u64 %r1867; .reg .u64 %r1868; .reg .u64 %r1869; .reg .u64 %r1870; .reg .u64 %r1871; .reg .u64 %r1873; .reg .u64 %r1874; .reg .u64 %r1875; .reg .u64 %r1876; .reg .u64 %r1877; .reg .u64 %r1878; .reg .u64 %r1879; .reg .u64 %r1880; .reg .u64 %r1882; .reg .u64 %r1883; .reg .u64 %r1884; .reg .u64 %r1885; .reg .u64 %r1886; .reg .u64 %r1887; .reg .u64 %r1888; .reg .u64 %r1889; .reg .pred %r1890; .reg .u32 %r1891; .reg .u64 %r1892; .reg .u32 %r1894; .reg .u64 %r1895; .reg .u32 %r1897; .reg .u64 %r1898; .reg .pred %r1899; .reg .u32 %r1900; .reg .u64 %r1901; .reg .u64 %r1903; .reg .u32 %r1905; .reg .u64 %r1906; .reg .u64 %r1908; .reg .u32 %r1910; .reg .u64 %r1911; .reg .pred %r1912; .reg .u64 %r1913; .reg .u64 %r1914; .reg .u64 %r1917; .reg .u64 %r1918; .reg .u64 %r1921; .reg .u64 %r1922; .reg .pred %r1924; .reg 
.u64 %r1925; .reg .u64 %r1926; .reg .u64 %r1929; .reg .u64 %r1930; .reg .u64 %r1933; .reg .u64 %r1934; .reg .pred %r1936; .reg .u64 %r1937; .reg .u64 %r1938; .reg .u64 %r1941; .reg .u64 %r1942; .reg .u64 %r1945; .reg .u64 %r1946; .reg .pred %r1948; .reg .u64 %r1949; .reg .u64 %r1950; .reg .u64 %r1952; .reg .u64 %r1953; .reg .u64 %r1955; .reg .u64 %r1956; .reg .u64 %r1958; .reg .u64 %r1959; .reg .u64 %r1961; .reg .u64 %r1962; .reg .u64 %r1964; .reg .u64 %r1965; .reg .pred %r1966; .reg .pred %r1969; .reg .u64 %r1970; .reg .u64 %r1971; .reg .u64 %r1974; .reg .u64 %r1975; .reg .u64 %r1978; .reg .u64 %r1979; .reg .pred %r1981; mov.u64 %r748,%ar0; mov.u64 %r749,%ar1; mov.u64 %r750,%ar2; mov.u32 %r751,%ar3; mov.u32 %r752,%ar4; mov.u64 %r753,%ar5; .loc 1 2462 3 ld.s8 %r446,[%r749+28]; cvt.u16.u32 %r754,%r446; setp.eq.u16 %r755,%r754,2; @ %r755 bra $L2; ld.s8 %r1354,[%r750+28]; cvt.u16.u32 %r756,%r1354; setp.eq.u16 %r758,%r756,2; @ %r758 bra $L3; cvta.const.u64 %r762,$LC3; cvta.const.u64 %r761,__func__$0; mov.u32 %r760,2462; cvta.const.u64 %r7767623: .loc 1 2476 6 ld.u64 %r763,[%r748]; setp.ne.u64 %r764,%r763,0; @ %r764 bra $L4; .loc 1 2478 10 setp.ne.u16 %r766,%r754,1; @ %r766 bra $L5; .loc 1 2480 4 st.u64 [%r748+48],%r763; ld.u64 %r769,[%r750+80]; ld.u64 %r771,[%r750+72]; sub.u64 %r772,%r769,%r771; st.u64 [%r748+56],%r772; mov.u64 %r773,1; st.u64 [%r748+40],%r773; bra $L6; $L123: .loc 1 2485 4 st.u64 [%r748+48],%r1326; add.u64 %r775,%r508,-1; st.u64 [%r748+56],%r775; mov.u64 %r776,1; st.u64 [%r748+40],%r776; bra $L6; $L124: .loc 1 2490 4 mov.u64 %r777,0; st.u64 [%r748+48],%r777; add.u64 %r778,%r508,-1; st.u64 [%r748+56],%r778; mov.u64 %r779,1; st.u64 [%r748+40],%r779; .loc 1 2493 11 st.u64 [%r748+72],%r777; ld.u64 %r782,[%r750+80]; ld.u64 %r784,[%r750+72]; sub.u64 %r785,%r782,%r784; st.u64 [%r748+80],%r785; st.u64 [%r748+64],%r508; $L6: .loc 1 2499 1748; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r787,[%value_in]; } .loc 1 2499 4 mov.u64 %r789789; call (%value_in),_gfortrani_xmallocarray790,[%value_in]; } .loc 1 2499 2 st.u64 [%r748],%r790; .loc 1 2500 24 mov.u64 %r792,0; st.u64 [%r748+8],%r792; .loc 1 2557 7 ld.s8 %r446,[%r749+28]; .loc 1 2564 15 ld.u64 %r794,[%r749+56]; add.u64 %r793,%r794,1; .loc 1 2564 13 ld.u64 %r795,[%r749+48]; sub.u64 %r456,%r793,%r795; bra $L7; $L4: .loc 1 2502 12 cvta.global.u64 %r796,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r797,[%r796+36]; setp.eq.u32 %r798,%r797,0; @ %r798 bra $L8; .loc 1 2509 17 ld.u64 %r800,[%r748+56]; add.u64 %r799,%r800,1; .loc 1 2509 15 ld.u64 %r801,[%r748+48]; sub.u64 %r456,%r799,%r801; .loc 1 2506 10 setp.ne.u16 %r803,%r754,1; @ %r803 bra $L9; .loc 1 2508 17 ld.u64 %r805,[%r750+80]; add.u64 %r804,%r805,1; .loc 1 2508 15 ld.u64 %r806,[%r750+72]; sub.u64 %r238,%r804,%r806; .loc 1 2510 7 setp.ne.u64 %r807,%r238,%r456; @ %r807 bra $L10; .loc 1 2564 15 ld.u64 %r809,[%r749+56]; add.u64 %r808,%r809,1; .loc 1 2564 13 ld.u64 %r810,[%r749+48]; sub.u64 %r456,%r808,%r810; .loc 1 2543 7 ld.s8 %r331,[%r748+28]; .loc 1 2548 27 ld.u64 %r444,[%r748+40]; .loc 1 2543 6 cvt.u16.u32 %r811,%r331; setp.eq.u16 %r812,%r811,1; @ %r812 bra $L11; .loc 1 2553 16 ld.u64 %r504,[%r748+64]; .loc 1 2560 16 ld.u64 %r503,[%r749+40]; .loc 1 2563 14 mov.u64 %r216,1; .loc 1 2561 16 mov.u64 %r208,%r216; bra $L12; $L10: .loc 1 2511 6 st.u64 [%stack+8],%r238; st.u64 [%stack],%r456; cvta.const.u64 %r813813_gfortran_runtime_error2485 4 ld.u64 %r816,[%r749+56]; add.u64 %r815,%r816,1; ld.u64 %r817,[%r749+48]; sub.u64 %r512,%r815,%r817; .loc 1 2515 15 cvt.u16.u32 
%r818,%r1354; setp.ne.u16 %r820,%r818,1; @ %r820 bra $L13; .loc 1 2519 7 setp.eq.u64 %r821,%r512,%r456; @ %r821 bra $L14; .loc 1 2520 6 st.u64 [%stack+8],%r512; st.u64 [%stack],%r456; cvta.const.u64 %r8228stack; call _gfortran_runtime_error13: .loc 1 2528 7 setp.eq.u64 %r824,%r512,%r456; @ %r824 bra $L15; .loc 1 2529 6 st.u64 [%stack+8],%r512; st.u64 [%stack],%r456; cvta.const.u64 %r825825_gfortran_runtime_error15: .loc 1 2533 17 ld.u64 %r828,[%r750+80]; add.u64 %r827,%r828,1; .loc 1 2533 15 ld.u64 %r829,[%r750+72]; sub.u64 %r236,%r827,%r829; .loc 1 2534 17 ld.u64 %r831,[%r748+80]; add.u64 %r830,%r831,1; .loc 1 2534 15 ld.u64 %r832,[%r748+72]; sub.u64 %r237,%r830,%r832; .loc 1 2535 7 setp.eq.u64 %r833,%r236,%r237; @ %r833 bra $L14; .loc 1 2536 6 st.u64 [%stack+8],%r236; st.u64 [%stack],%r237; cvta.const.u64 %r834,$LC8stack; call _gfortran_runtime_error2564 15 ld.u64 %r837,[%r749+56]; add.u64 %r836,%r837,1; .loc 1 2564 13 ld.u64 %r838,[%r749+48]; sub.u64 %r456,%r836,%r838; $L7: .loc 1 2548 27 ld.u64 %r504,[%r748+40]; .loc 1 2543 6 ld.s8 %r840,[%r748+28]; cvt.u16.u32 %r839,%r840; setp.eq.u16 %r841,%r839,1; @ %r841 bra $L130; $L127: .loc 1 2553 16 mov.u64 %r444,%r504; ld.u64 %r504,[%r748+64]; bra $L16; $L130: .loc 1 2548 27 mov.u64 %r444,%r504; $L16: .loc 1 2560 16 ld.u64 %r503,[%r749+40]; .loc 1 2557 6 cvt.u16.u32 %r842,%r446; setp.eq.u16 %r843,%r842,1; @ %r843 bra $L131; $L125: .loc 1 2569 16 ld.u64 %r208,[%r749+64]; .loc 1 2571 15 ld.u64 %r845,[%r749+80]; add.u64 %r844,%r845,1; .loc 1 2571 13 ld.u64 %r846,[%r749+72]; mov.u64 %r216,%r456; sub.u64 %r456,%r844,%r846; bra $L12; $L131: .loc 1 2563 14 mov.u64 %r216,1; .loc 1 2561 16 mov.u64 %r208,%r216; $L12: .loc 1 2575 16 ld.u64 %r848,[%r750+56]; add.u64 %r847,%r848,1; ld.u64 %r849,[%r750+48]; sub.u64 %r64,%r847,%r849; .loc 1 2575 6 setp.eq.u64 %r850,%r64,%r456; @ %r850 bra $L17; .loc 1 2577 10 set.u32.gt.s64 %r852,%r456,0; neg.s32 %r853,%r852; .loc 1 2577 21 set.u32.gt.s64 %r855,%r64,0; neg.s32 %r856,%r855; cvt.u16.u32 %r858,%r853; cvt.u16.u32 %r859,%r856; or.b16 %r857,%r858,%r859; cvt.u32.u16 %r860,%r857; cvt.u16.u8 %r861,%r860; setp.eq.u16 %r862,%r861,0; @ %r862 bra $L17; .loc 1 2578 2 st.u64 [%stack+8],%r456; st.u64 [%stack],%r64;_gfortran_runtime_error17: .loc 1 2583 7 ld.s8 %r66,[%r750+28]; .loc 1 2586 16 ld.u64 %r502,[%r750+40]; .loc 1 2583 6 cvt.u16.u32 %r865,%r66; setp.eq.u16 %r866,%r865,1; @ %r866 bra $L132; .loc 1 2597 16 ld.u64 %r209,[%r750+64]; .loc 1 2598 16 ld.u64 %r868,[%r750+80]; add.u64 %r867,%r868,1; .loc 1 2598 14 ld.u64 %r869,[%r750+72]; sub.u64 %r217,%r867,%r869; bra $L18; $L132: .loc 1 2592 14 mov.u64 %r217,1; .loc 1 2591 16 mov.u64 %r209,256; $L18: .loc 1 2601 9 ld.u64 %r243,[%r749]; .loc 1 2602 9 ld.u64 %r244,[%r750]; .loc 1 2603 8 ld.u64 %r245,[%r748]; .loc 1 2612 28 set.u32.eq.u64 %r871,%r444,1; neg.s32 %r872,%r871; cvt.u32.u32 %r870,%r872; cvt.u32.u8 %r71,%r870; .loc 1 2612 46 set.u32.eq.u64 %r874,%r503,1; neg.s32 %r875,%r874; cvt.u32.u32 %r873,%r875; cvt.u32.u8 %r501,%r873; .loc 1 2612 7 set.u32.ne.u32 %r877,%r751,0; neg.s32 %r878,%r877; .loc 1 2612 16 cvt.u16.u32 %r880,%r878; cvt.u16.u32 %r881,%r71; and.b16 %r879,%r880,%r881; .loc 1 2612 6 cvt.u32.u16 %r882,%r879; cvt.u16.u8 %r883,%r882; setp.eq.u16 %r884,%r883,0; @ %r884 bra $L19; .loc 1 2612 63 set.u32.eq.u64 %r886,%r208,1; neg.s32 %r887,%r886; .loc 1 2612 51 cvt.u16.u32 %r889,%r887; cvt.u16.u32 %r890,%r501; or.b16 %r888,%r889,%r890; .loc 1 2612 33 cvt.u32.u16 %r891,%r888; cvt.u16.u8 %r892,%r891; setp.eq.u16 %r893,%r892,0; @ %r893 bra $L20; .loc 1 2613 20 
set.u32.eq.u64 %r895,%r502,1; neg.s32 %r896,%r895; .loc 1 2613 37 set.u32.eq.u64 %r898,%r209,1; neg.s32 %r899,%r898; .loc 1 2613 25 cvt.u16.u32 %r901,%r896; cvt.u16.u32 %r902,%r899; or.b16 %r900,%r901,%r902; .loc 1 2613 7 cvt.u32.u16 %r903,%r900; cvt.u16.u8 %r904,%r903; setp.eq.u16 %r905,%r904,0; @ %r905 bra $L20; .loc 1 2615 13 cvt.rn.f32.s32 %r83,%r752; .loc 1 2614 12 cvt.rn.f32.s64 %r906,%r216; .loc 1 2614 50 cvt.rn.f32.s64 %r907,%r456; .loc 1 2614 47 mul.f32 %r908,%r906,%r907; .loc 1 2614 31 cvt.rn.f32.s64 %r909,%r217; .loc 1 2614 47 mul.f32 %r910,%r908,%r909; .loc 1 2615 13 mul.f32 %r911,%r83,%r83; mul.f32 %r912,%r911,%r83; .loc 1 2614 7 setp.gt.f32 %r913,%r910,%r912; @ ! %r913 bra $L19; .loc 1 2617 17 cvt.u32.u64 %r85,%r216; st.u32 [%frame+36],%r85; .loc 1 2617 29 cvt.u32.u64 %r86,%r217; st.u32 [%frame+32],%r86; .loc 1 2617 41 cvt.u32.u64 %r87,%r456; st.u32 [%frame+28],%r87; .loc 1 2617 52 cvt.u32.u64 %r88,%r504; st.u32 [%frame+24],%r88; .loc 1 2618 27 mov.u64 %r914,1; st.u64 [%frame+8],%r914; .loc 1 2618 36 mov.u64 %r915,0; st.u64 [%frame],%r915; .loc 1 2619 17 setp.ne.u64 %r916,%r503,1; @ %r916 bra $L22; cvt.u32.u64 %r230,%r208; bra $L23; $L22: cvt.u32.u64 %r230,%r503; $L23: st.u32 [%frame+20],%r230; .loc 1 2620 3 setp.ne.u64 %r917,%r502,1; @ %r917 bra $L24; cvt.u32.u64 %r231,%r209; bra $L25; $L24: cvt.u32.u64 %r231,%r502; $L25: st.u32 [%frame+16],%r231; .loc 1 2622 15 set.u32.gt.s32 %r919,%r230,0; neg.s32 %r920,%r919; .loc 1 2622 26 set.u32.gt.s32 %r922,%r231,0; neg.s32 %r923,%r922; .loc 1 2622 19 cvt.u16.u32 %r925,%r920; cvt.u16.u32 %r926,%r923; and.b16 %r924,%r925,%r926; .loc 1 2622 10 cvt.u32.u16 %r927,%r924; cvt.u16.u8 %r928,%r927; setp.eq.u16 %r929,%r928,0; @ %r929 bra $L19; .loc 1 2622 37 set.u32.gt.s32 %r931,%r88,0; neg.s32 %r932,%r931; .loc 1 2622 46 set.u32.gt.s32 %r934,%r85,1; neg.s32 %r935,%r934; .loc 1 2622 41 cvt.u16.u32 %r937,%r932; cvt.u16.u32 %r938,%r935; and.b16 %r936,%r937,%r938; cvt.u32.u16 %r939,%r936; cvt.u16.u8 %r940,%r939; setp.eq.u16 %r941,%r940,0; @ %r941 bra $L19; .loc 1 2622 55 set.u32.gt.s32 %r943,%r86,1; neg.s32 %r944,%r943; .loc 1 2622 64 set.u32.gt.s32 %r946,%r87,1; neg.s32 %r947,%r946; .loc 1 2622 59 cvt.u16.u32 %r949,%r944; cvt.u16.u32 %r950,%r947; and.b16 %r948,%r949,%r950; cvt.u32.u16 %r951,%r948; cvt.u16.u8 %r952,%r951; setp.eq.u16 %r953,%r952,0; @ %r953 bra $L19; .loc 1 2624 4 setp.ne.u64 %r954,%r753,0; @ %r954 bra $L27; cvta.const.u64 %r958,$LC8; cvta.const.u64 %r957,__func__$0; mov.u32 %r956,2624; cvta.const.u64 %r9957927: .loc 1 2626 17 and.b32 %r959,%r751,2; .loc 1 2626 7 setp.ne.u32 %r960,%r959,0; @ %r960 bra $L133; .loc 1 2629 35 setp.eq.u64 %r961,%r503,1; @ %r961 bra $L134; cvta.const.u64 %r218,$LC2; bra $L28; $L133: .loc 1 2627 13 cvta.const.u64 %r218,$LC1; bra $L28; $L134: .loc 1 2629 35 cvta.const.u64 %r218,$LC0; $L28: .loc 1 2631 17 and.b32 %r962,%r751,4; .loc 1 2631 7 setp.ne.u32 %r963,%r962,0; @ %r963 bra $L135; .loc 1 2634 35 setp.eq.u64 %r964,%r502,1; @ %r964 bra $L136; cvta.const.u64 %r219,$LC2; bra $L29; $L135: .loc 1 2632 13 cvta.const.u64 %r219,$LC1; bra $L29; $L136: .loc 1 2634 35 cvta.const.u64 %r219,$LC0; $L29: .loc 1 2636 4 add.u64 %r980,%frame,24; add.u64 %r981,%frame,16; add.u64 %r982,%frame,20; add.u64 %r983,%frame,8; add.u64 %r984,%frame,28; add.u64 %r985,%frame,32; add.u64 %r986,%frame,36; mov.u32 %r979,1;,.param .u64 %in_ar3,.param .u64 %in_ar4,.param .u64 %in_ar5,.param .u64 %in_ar6,.param .u64 %in_ar7,.param .u64 %in_ar8,.param .u64 %in_ar9,.param .u64 %in_ar10,.param .u64 %in_ar11,.param .u64 %in_ar12,.param .u32 
%in_ar13,.param .u32 %in_ar14);r219984243; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r982; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r244; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r981; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%frame; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r245; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r980; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r979; .param .u32 %out_arg15; st.param.u32 [%out_arg15],%r979; call %r753,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14,%out_arg15),$LCT0; } bra $L1; $L19: .loc 1 2643 21 cvt.u16.u32 %r988,%r71; cvt.u16.u32 %r989,%r501; and.b16 %r987,%r988,%r989; .loc 1 2643 6 cvt.u32.u16 %r990,%r987; cvt.u16.u8 %r991,%r990; setp.eq.u16 %r992,%r991,0; @ %r992 bra $L31; .loc 1 2643 38 setp.ne.u64 %r993,%r502,1; @ %r993 bra $L20; .loc 1 2644 7 cvt.u16.u32 %r994,%r66; setp.eq.u16 %r995,%r994,1; @ %r995 bra $L32; .loc 1 2677 16 add.u64 %r996,%r504,1; .loc 1 2678 9 shl.b64 %r104,%r996,3; sub.u64 %r247,%r245,%r104; .loc 1 2687 7 setp.le.s64 %r997,%r217,0; @ %r997 bra $L33; add.u64 %r621,%r104,-8; mov.u64 %r620,%r245; add.u64 %r624,%r217,1; .loc 1 2689 22 shl.b64 %r441,%r216,3; .loc 1 2687 13 mov.u64 %r296,%r502; setp.gt.s64 %r1351,%r216,0; and.b64 %r1463,%r217,3; setp.eq.u64 %r1466,%r1463,0; @ %r1466 bra $L34; setp.eq.u64 %r1465,%r1463,1; @ %r1465 bra $L300; setp.eq.u64 %r1464,%r1463,2; @ %r1464 bra $L301; bra $L366; $L35: .loc 1 2689 22 mov.u32 %r10010011003,[%value_in]; } $L36: .loc 1 2687 24 add.u64 %r1467,%r296,1; .loc 1 2687 7 add.u64 %r1520,%r620,%r621; .loc 1 2688 2 @ %r1351 bra $L238; bra $L367; $L34: @ %r1351 bra $L35; bra $L36; $L33: .loc 1 2692 28 set.u32.eq.u64 %r1008,%r456,0; neg.s32 %r1009,%r1008; .loc 1 2692 13 set.u32.eq.u64 %r1011,%r216,0; neg.s32 %r1012,%r1011; .loc 1 2692 28 cvt.u16.u32 %r1014,%r1009; cvt.u16.u32 %r1015,%r1012; or.b16 %r1013,%r1014,%r1015; .loc 1 2692 23 set.u32.eq.u64 %r1017,%r217,0; neg.s32 %r1018,%r1017; .loc 1 2692 28 cvt.u16.u32 %r1022,%r1018; or.b16 %r1020,%r1013,%r1022; cvt.u32.u16 %r1023,%r1020; cvt.u16.u8 %r1024,%r1023; setp.eq.u16 %r1025,%r1024,0; @ ! 
%r1025 bra $L1; .loc 1 2697 10 setp.eq.u64 %r1026,%r208,1; selp.u64 %r228,%r504,%r208,%r1026; .loc 1 2702 21 shl.b64 %r1027,%r228,8; .loc 1 2702 14 add.u64 %r252,%r1027,%r209; .loc 1 2706 12 min.s64 %r1029,%r252,65536; shl.b64 %r1030,%r1029031031,[%value_in]; } .loc 1 2710 7 setp.gt.s64 %r1033,%r217,0; @ %r1033 bra $L39; $L42: .loc 1 2910312918 7 bra $L1; $L39: .loc 1 2680 16 add.u64 %r1035,%r208,1; .loc 1 2681 9 shl.b64 %r107,%r1035,3; sub.u64 %r249,%r243,%r107; shl.b64 %r594,%r209,9; shl.b64 %r596,%r209,12; shl.b64 %r605,%r209,4; .loc 1 2683 16 add.u64 %r1036,%r209,1; .loc 1 2684 9 shl.b64 %r1037,%r1036,3; sub.u64 %r597,%r244,%r1037; add.u64 %r595,%r605,%r597; shl.b64 %r600,%r504,9; add.u64 %r601,%r504,%r504; shl.b64 %r602,%r504,2; neg.s64 %r1040,%r209; shl.b64 %r1041,%r1040,12; shl.b64 %r603,%r1040,3; shl.b64 %r607,%r504,12; shl.b64 %r1045,%r504,4; add.u64 %r1046,%r1045,8; add.u64 %r606,%r247,%r1046; add.u64 %r1047,%r217,-1; and.b64 %r615,%r1047,-512; add.u64 %r616,%r597,%r603; shl.b64 %r573,%r208,8; shl.b64 %r575,%r208,11; add.u64 %r576,%r107,-8; neg.s64 %r1049,%r208; shl.b64 %r1050,%r1049,11; shl.b64 %r1053,%r1049,4; shl.b64 %r537,%r208,4; add.u64 %r538,%r208,%r208; shl.b64 %r226,%r209,5; shl.b64 %r221,%r504,5; neg.s64 %r1056,%r504; shl.b64 %r1057,%r1056,3; shl.b64 %r566,%r504,3; shl.b64 %r1060,%r1040,5; shl.b64 %r619,%r209,3; mov.u64 %r598,%r601; mov.u64 %r592,%r209; mov.u64 %r590,0; setp.gt.s64 %r1363,%r456,0; add.u64 %r1399,%r616,8; setp.gt.s64 %r1400,%r216,0; add.u64 %r1401,%r216,1; add.u64 %r1402,%r249,16; bra $L41; $L138: mov.u64 %r590,%r591; $L41: add.u64 %r618,%r590,1; .loc 1 2715 9 sub.u64 %r254,%r217,%r590; min.s64 %r1521,%r254,512; .loc 1 2716 24 shr.s64 %r1062,%r1521,63; shr.u64 %r1063,%r1062,62; add.u64 %r1064,%r1521,%r1063; and.b64 %r1065,%r1064,3; sub.u64 %r1066,%r1065,%r1063; .loc 1 2716 10 sub.u64 %r255,%r1521,%r1066; .loc 1 2718 4 @ %r1363 bra $L40; $L45: .loc 1 2710 7 add.u64 %r591,%r590,512; add.u64 %r592,%r592,%r594; add.u64 %r595,%r595,%r596; add.u64 %r598,%r598,%r600; add.u64 %r602,%r602,%r600; add.u64 %r603,%r603,%r1041; add.u64 %r605,%r605,%r596; add.u64 %r606,%r606,%r607; setp.ne.u64 %r1068,%r590,%r615; @ %r1068 bra $L138; bra $L42; $L40: add.u64 %r570,%r1399,%r605; .loc 1 2768 13 add.u64 %r480,%r255,%r618; add.u64 %r477,%r605,%r619; add.u64 %r475,%r477,%r619; add.u64 %r588,%r595,%r619; add.u64 %r589,%r588,%r619; mul.lo.u64 %r406,%r480,%r504; mul.lo.u64 %r399,%r480,%r209; mad.lo.u64 %r585,%r480,%r619,%r597; mov.u64 %r577,%r1053; mov.u64 %r574,%r576; mov.u64 %r571,%r208; .loc 1 2718 12 mov.u64 %r258,%r502; add.u64 %r1355,%r456,1; .loc 1 2768 8 add.u64 %r1392,%r480,-1; setp.ge.s64 %r1394,%r1392,%r618; sub.u64 %r1397,%r598,%r504; add.u64 %r1398,%r1397,%r601; $L44: .loc 1 2723 13 sub.u64 %r256,%r1355,%r258; min.s64 %r1523,%r256,256; .loc 1 2724 28 shr.u64 %r1074,%r1523,63; add.u64 %r1075,%r1523,%r1074; and.b64 %r1076,%r1075,1; sub.u64 %r1077,%r1076,%r1074; .loc 1 2724 14 sub.u64 %r257,%r1523,%r1077; .loc 1 2727 8 @ %r1400 bra $L43; $L73: .loc 1 2718 30 add.u64 %r258,%r258,256; .loc 1 2718 4 add.u64 %r570,%r570,2048; add.u64 %r571,%r571,%r573; add.u64 %r574,%r574,%r575; add.u64 %r577,%r577,%r1050; setp.ge.s64 %r1079,%r456,%r258; @ %r1079 bra $L44; bra $L45; $L43: .loc 1 2763 26 add.u64 %r1080,%r258,-1; add.u64 %r1081,%r1080,%r1523; .loc 1 2763 31 mul.lo.u64 %r314,%r1081,%r208; .loc 1 2734 13 add.u64 %r479,%r257,%r258; .loc 1 2734 8 add.u64 %r481,%r479,-1; add.u64 %r1082,%r314,1; shl.b64 %r1083,%r1082,3; add.u64 %r556,%r249,%r1083; add.u64 %r567,%r571,%r208; 
sub.u64 %r1084,%r567,%r314; shl.b64 %r569,%r1084,3; shl.b64 %r1086,%r1523,3; add.u64 %r1087,%r1086,-8; add.u64 %r330,%r1031,%r1087; add.u64 %r1088,%r258,%r592; add.u64 %r1089,%r1088,%r1523; shl.b64 %r1090,%r1089,3; add.u64 %r225,%r1090,%r597; shl.b64 %r587,%r258,3; add.u64 %r48,%r587,%r595; add.u64 %r46,%r587,%r588; add.u64 %r30,%r587,%r589; add.u64 %r586,%r585,%r587; add.u64 %r1091,%r399,%r258; add.u64 %r1092,%r1091,%r1523; shl.b64 %r1093,%r1092,3; add.u64 %r397,%r1093,%r597; .loc 1 2768 8 mov.u64 %r555,%r606; .loc 1 2727 16 mov.u64 %r273,%r502; setp.le.s64 %r1356,%r258,%r481; setp.gt.s64 %r1348,%r1523,%r257; setp.gt.s64 %r1347,%r1521,%r255; not.b64 %r1385,%r258; add.u64 %r1386,%r1385,%r479; and.b64 %r1387,%r1386,-2; .loc 1 2904 18 add.u64 %r1388,%r1523,%r258; .loc 1 2904 13 add.u64 %r1389,%r1388,-1; setp.gt.s64 %r1390,%r258,%r1389; $L72: .loc 1 2732 10 sub.u64 %r259,%r1401,%r273; min.s64 %r1524,%r259,256; .loc 1 2733 25 shr.s64 %r1353,%r1524,63; shr.u64 %r1097,%r1524,63; add.u64 %r1098,%r1524,%r1097; and.b64 %r1099,%r1098,1; sub.u64 %r1100,%r1099,%r1097; .loc 1 2733 11 sub.u64 %r260,%r1524,%r1100; .loc 1 2735 5 @ %r1356 bra $L46; $L55: .loc 1 2757 8 @ ! %r1348 bra $L48; bra $L47; $L46: .loc 1 2737 17 add.u64 %r484,%r260,%r273; .loc 1 2737 12 add.u64 %r485,%r484,-1; shl.b64 %r1104,%r1524,11; add.u64 %r1106,%r1104,-2048; add.u64 %r530,%r1031,%r1106; add.u64 %r536,%r556,%r569; add.u64 %r1109,%r1524,%r273; shl.b64 %r547,%r1109,3; shl.b64 %r551,%r1524,3; mov.u64 %r541,%r567; mov.u64 %r540,%r577; mov.u64 %r539,%r574; mov.u64 %r528,0; setp.le.s64 %r1361,%r273,%r485; setp.gt.s64 %r1349,%r1524,%r260; not.b64 %r1380,%r273; add.u64 %r1381,%r1380,%r484; and.b64 %r1382,%r1381,-2; add.u64 %r1383,%r1382,%r273; bra $L54; $L139: mov.u64 %r528,%r529; $L54: .loc 1 2738 9 @ %r1361 bra $L49; $L53: .loc 1 2749 12 @ %r1349 bra $L50; bra $L51; $L49: shl.b64 %r1112,%r528,3; add.u64 %r229,%r1031,%r1112; add.u64 %r1117,%r1383,%r541; shl.b64 %r1118,%r1117,3; add.u64 %r527,%r1118,%r1402; .loc 1 2738 9 mov.u64 %r211,%r536; sub.u64 %r1451,%r527,%r536; add.u64 %r1452,%r1451,-16; shr.u64 %r1450,%r1452,4; add.u64 %r1453,%r1450,1; and.b64 %r1454,%r1453,3; setp.eq.u64 %r1457,%r1454,0; @ %r1457 bra $L52; setp.eq.u64 %r1456,%r1454,1; @ %r1456 bra $L302; setp.eq.u64 %r1455,%r1454,2; @ %r1455 bra $L303; bra $L368; $L52: .loc 1 2741 7 add.u64 %r1120,%r211,%r539; add.u64 %r1121,%r1120,%r540; .loc 1 2740 49 ld.u64 %r1122,[%r1121]; st.u64 [%r229],%r1122; .loc 1 2742 49 ld.u64 %r1123,[%r211]; st.u64 [%r229+8],%r1123; .loc 1 2745 7 add.u64 %r1124,%r211,%r540; add.u64 %r1125,%r1124,%r539; .loc 1 2744 49 ld.u64 %r1126,[%r1125+8]; st.u64 [%r229+2048],%r1126; .loc 1 2746 49 ld.u64 %r1127,[%r211+8]; st.u64 [%r229+2056],%r1127; .loc 1 2738 9 add.u64 %r1458,%r229,4096; add.u64 %r1459,%r211,16; .loc 1 2741 7 add.u64 %r1527,%r1459,%r539; add.u64 %r1528,%r1527,%r540; .loc 1 2740 49 ld.u64 %r1529,[%r1528]; st.u64 [%r1458],%r1529; .loc 1 2742 49 ld.u64 %r1530,[%r1459]; st.u64 [%r1458+8],%r1530; .loc 1 2745 7 add.u64 %r1531,%r1459,%r540; add.u64 %r1532,%r1531,%r539; .loc 1 2744 49 ld.u64 %r1533,[%r1532+8]; st.u64 [%r1458+2048],%r1533; .loc 1 2746 49 ld.u64 %r1534,[%r1459+8]; st.u64 [%r1458+2056],%r1534; .loc 1 2738 9 add.u64 %r1535,%r229,8192; add.u64 %r1536,%r211,32; .loc 1 2741 7 add.u64 %r1538,%r1536,%r539; add.u64 %r1539,%r1538,%r540; .loc 1 2740 49 ld.u64 %r1540,[%r1539]; st.u64 [%r1535],%r1540; .loc 1 2742 49 ld.u64 %r1541,[%r1536]; st.u64 [%r1535+8],%r1541; .loc 1 2745 7 add.u64 %r1542,%r1536,%r540; add.u64 %r1543,%r1542,%r539; .loc 1 
2744 49 ld.u64 %r1544,[%r1543+8]; st.u64 [%r1535+2048],%r1544; .loc 1 2746 49 ld.u64 %r1545,[%r1536+8]; st.u64 [%r1535+2056],%r1545; .loc 1 2738 9 add.u64 %r1546,%r229,12288; add.u64 %r1547,%r211,48; .loc 1 2741 7 add.u64 %r1549,%r1547,%r539; add.u64 %r1550,%r1549,%r540; .loc 1 2740 49 ld.u64 %r1551,[%r1550]; st.u64 [%r1546],%r1551; .loc 1 2742 49 ld.u64 %r1552,[%r1547]; st.u64 [%r1546+8],%r1552; .loc 1 2745 7 add.u64 %r1553,%r1547,%r540; add.u64 %r1554,%r1553,%r539; .loc 1 2744 49 ld.u64 %r1555,[%r1554+8]; st.u64 [%r1546+2048],%r1555; .loc 1 2746 49 ld.u64 %r1556,[%r1547+8]; st.u64 [%r1546+2056],%r1556; .loc 1 2738 9 add.u64 %r229,%r229,16384; add.u64 %r211,%r211,64; setp.ne.u64 %r1557,%r211,%r527; @ %r1557 bra $L52; bra $L53; $L50: .loc 1 2752 10 add.u64 %r1129,%r539,%r547; add.u64 %r1130,%r249,%r1129; .loc 1 2751 41 ld.u64 %r1131,[%r1130+-8]; st.u64 [%r530],%r1131; .loc 1 2754 10 add.u64 %r1132,%r536,%r551; .loc 1 2753 41 ld.u64 %r1133,[%r1132+-8]; st.u64 [%r530+8],%r1133; $L51: .loc 1 2735 5 add.u64 %r529,%r528,2; add.u64 %r530,%r530,16; add.u64 %r536,%r536,%r537; add.u64 %r539,%r539,%r537; add.u64 %r540,%r540,%r1053; add.u64 %r541,%r541,%r538; setp.ne.u64 %r1134,%r528,%r1387; @ %r1134 bra $L139; bra $L55; $L48: .loc 1 2767 25 shr.u64 %r1137,%r1353,62; add.u64 %r1138,%r1524,%r1137; and.b64 %r1139,%r1138,3; sub.u64 %r1140,%r1139,%r1137; .loc 1 2767 11 sub.u64 %r263,%r1524,%r1140; .loc 1 2769 5 @ %r1394 bra $L56; bra $L57; $L47: .loc 1 2759 17 add.u64 %r132,%r1524,%r273; .loc 1 2759 12 add.u64 %r1142,%r132,-1; .loc 1 2760 9 setp.lt.s64 %r1143,%r1142,%r273; @ %r1143 bra $L48; add.u64 %r1144,%r132,%r314; shl.b64 %r1145,%r1144,3; add.u64 %r136,%r249,%r1145; mov.u64 %r73,%r330; mov.u64 %r57,%r556; sub.u64 %r1441,%r136,%r556; add.u64 %r1442,%r1441,-8; shr.u64 %r1440,%r1442,3; add.u64 %r1443,%r1440,1; and.b64 %r1444,%r1443,3; setp.eq.u64 %r1447,%r1444,0; @ %r1447 bra $L58; setp.eq.u64 %r1446,%r1444,1; @ %r1446 bra $L304; setp.eq.u64 %r1445,%r1444,2; @ %r1445 bra $L305; bra $L369; $L58: .loc 1 2762 43 ld.u64 %r1146,[%r57]; st.u64 [%r73],%r1146; ld.u64 %r1560,[%r57+8]; st.u64 [%r73+2048],%r1560; ld.u64 %r1564,[%r57+16]; st.u64 [%r73+4096],%r1564; ld.u64 %r1568,[%r57+24]; st.u64 [%r73+6144],%r1568; .loc 1 2760 9 add.u64 %r57,%r57,32; add.u64 %r73,%r73,8192; setp.ne.u64 %r1569,%r57,%r136; @ %r1569 bra $L58; bra $L48; $L57: .loc 1 2871 8 @ ! %r1347 bra $L60; bra $L59; $L56: .loc 1 2876 14 add.u64 %r488,%r263,%r273; .loc 1 2876 9 add.u64 %r489,%r488,-1; add.u64 %r222,%r555,%r1057; add.u64 %r89,%r555,%r566; add.u64 %r42,%r566,%r89; not.b64 %r1149,%r273; add.u64 %r1150,%r1149,%r488; shr.u64 %r1151,%r1150,2; add.u64 %r1152,%r1151,1; shl.b64 %r318,%r1152,5; add.u64 %r1153,%r1397,%r488; shl.b64 %r1154,%r1153,3; add.u64 %r468,%r1154,%r247; add.u64 %r1155,%r488,%r598; shl.b64 %r1156,%r1155,3; add.u64 %r463,%r1156,%r247; add.u64 %r1158,%r1398,%r488; shl.b64 %r1159,%r1158,3; add.u64 %r460,%r1159,%r247; add.u64 %r1160,%r488,%r602; shl.b64 %r1161,%r1160,3; add.u64 %r457,%r1161,%r247; shl.b64 %r369,%r263,8; shl.b64 %r471,%r1524,3; mov.u64 %r476,%r475; mov.u64 %r478,%r477; mov.u64 %r483,%r605; mov.u64 %r486,%r603; mov.u64 %r220,%r555; mov.u64 %r227,%r225; mov.u64 %r278,%r618; mov.u64 %r473,0; mov.u64 %r474,%r473; setp.le.s64 %r1362,%r273,%r489; setp.gt.s64 %r1350,%r1524,%r263; $L68: .loc 1 2772 9 @ %r1362 bra $L61; $L67: .loc 1 2843 12 @ ! 
%r1350 bra $L63; bra $L62; $L61: .loc 1 2904 13 mov.u64 %r341,0; mov.u64 %r342,8192; mov.u64 %r343,6144; mov.u64 %r344,2048; mov.u64 %r345,-4096; $L66: add.u64 %r339,%r222,%r341; .loc 1 2774 10 ld.u64 %r279,[%r339]; .loc 1 2775 10 ld.u64 %r280,[%r339+8]; add.u64 %r337,%r220,%r341; .loc 1 2776 10 ld.u64 %r281,[%r337]; .loc 1 2777 10 ld.u64 %r282,[%r337+8]; add.u64 %r335,%r89,%r341; .loc 1 2778 10 ld.u64 %r283,[%r335]; .loc 1 2779 10 ld.u64 %r284,[%r335+8]; add.u64 %r333,%r42,%r341; .loc 1 2780 10 ld.u64 %r285,[%r333]; .loc 1 2781 10 ld.u64 %r286,[%r333+8]; .loc 1 2782 10 ld.u64 %r287,[%r339+16]; .loc 1 2783 10 ld.u64 %r288,[%r339+24]; .loc 1 2784 10 ld.u64 %r289,[%r337+16]; .loc 1 2785 10 ld.u64 %r290,[%r337+24]; .loc 1 2786 10 ld.u64 %r291,[%r335+16]; .loc 1 2787 10 ld.u64 %r292,[%r335+24]; .loc 1 2788 10 ld.u64 %r293,[%r333+16]; .loc 1 2789 10 ld.u64 %r294,[%r333+24]; .loc 1 2791 6 @ %r1390 bra $L64; add.u64 %r361,%r473,%r570; add.u64 %r359,%r1031,%r344; $L65: add.u64 %r356,%r345,%r359; .loc 1 2793 19 add.u64 %r1166,%r356,%r344; ld.u64 %r137,[%r1166]; .loc 1 2794 14 ld.u64 %r138,[%r361]; .loc 1 2793 14 mad.lo.u64 %r279,%r137,%r138,%r279; .loc 1 2795 19 ld.u64 %r140,[%r359]; .loc 1 2795 14 mad.lo.u64 %r280,%r138,%r140,%r280; add.u64 %r351,%r361,%r486; .loc 1 2798 14 add.u64 %r1169,%r351,%r483; ld.u64 %r142,[%r1169]; .loc 1 2797 14 mad.lo.u64 %r281,%r137,%r142,%r281; .loc 1 2799 14 mad.lo.u64 %r282,%r140,%r142,%r282; .loc 1 2802 14 add.u64 %r1172,%r351,%r478; ld.u64 %r145,[%r1172]; .loc 1 2801 14 mad.lo.u64 %r283,%r137,%r145,%r283; .loc 1 2803 14 mad.lo.u64 %r284,%r140,%r145,%r284; .loc 1 2806 14 add.u64 %r1175,%r351,%r476; ld.u64 %r148,[%r1175]; .loc 1 2805 14 mad.lo.u64 %r285,%r137,%r148,%r285; .loc 1 2807 14 mad.lo.u64 %r286,%r140,%r148,%r286; .loc 1 2809 19 add.u64 %r1178,%r356,%r343; ld.u64 %r151,[%r1178]; .loc 1 2809 14 mad.lo.u64 %r287,%r138,%r151,%r287; .loc 1 2811 19 add.u64 %r1180,%r356,%r342; ld.u64 %r153,[%r1180]; .loc 1 2811 14 mad.lo.u64 %r288,%r138,%r153,%r288; .loc 1 2813 14 mad.lo.u64 %r289,%r142,%r151,%r289; .loc 1 2815 14 mad.lo.u64 %r290,%r142,%r153,%r290; .loc 1 2817 14 mad.lo.u64 %r291,%r145,%r151,%r291; .loc 1 2819 14 mad.lo.u64 %r292,%r145,%r153,%r292; .loc 1 2821 14 mad.lo.u64 %r293,%r148,%r151,%r293; .loc 1 2823 14 mad.lo.u64 %r294,%r148,%r153,%r294; .loc 1 2791 6 add.u64 %r361,%r361,8; add.u64 %r359,%r359,8; setp.ne.u64 %r1188,%r227,%r361; @ %r1188 bra $L65; $L64: .loc 1 2826 24 st.u64 [%r339],%r279; .loc 1 2827 28 st.u64 [%r339+8],%r280; .loc 1 2828 30 st.u64 [%r337],%r281; .loc 1 2829 34 st.u64 [%r337+8],%r282; .loc 1 2830 30 st.u64 [%r335],%r283; .loc 1 2831 34 st.u64 [%r335+8],%r284; .loc 1 2832 30 st.u64 [%r333],%r285; .loc 1 2833 34 st.u64 [%r333+8],%r286; .loc 1 2834 28 st.u64 [%r339+16],%r287; .loc 1 2835 28 st.u64 [%r339+24],%r288; .loc 1 2836 34 st.u64 [%r337+16],%r289; .loc 1 2837 34 st.u64 [%r337+24],%r290; .loc 1 2838 34 st.u64 [%r335+16],%r291; .loc 1 2839 34 st.u64 [%r335+24],%r292; .loc 1 2840 34 st.u64 [%r333+16],%r293; .loc 1 2841 34 st.u64 [%r333+24],%r294; .loc 1 2772 9 add.u64 %r345,%r345,-8192; add.u64 %r344,%r344,8192; add.u64 %r343,%r343,8192; add.u64 %r342,%r342,8192; add.u64 %r341,%r341,32; setp.ne.u64 %r1189,%r318,%r341; @ %r1189 bra $L66; bra $L67; $L63: .loc 1 2769 29 add.u64 %r278,%r278,4; .loc 1 2769 5 add.u64 %r227,%r227,%r226; add.u64 %r222,%r222,%r221; add.u64 %r220,%r220,%r221; add.u64 %r89,%r89,%r221; add.u64 %r42,%r42,%r221; add.u64 %r486,%r486,%r1060; add.u64 %r483,%r483,%r226; add.u64 %r478,%r478,%r226; add.u64 
%r476,%r476,%r226; add.u64 %r474,%r474,%r221; add.u64 %r473,%r473,%r226; setp.le.s64 %r1190,%r278,%r1392; @ %r1190 bra $L68; bra $L57; $L62: add.u64 %r374,%r468,%r474; add.u64 %r373,%r463,%r474; add.u64 %r372,%r460,%r474; add.u64 %r371,%r457,%r474; add.u64 %r49,%r48,%r473; add.u64 %r239,%r46,%r473; add.u64 %r240,%r30,%r473; add.u64 %r472,%r222,%r471; .loc 1 2904 13 mov.u64 %r370,%r369; $L71: .loc 1 2848 14 ld.u64 %r274,[%r374]; .loc 1 2849 14 ld.u64 %r275,[%r373]; .loc 1 2850 14 ld.u64 %r276,[%r372]; .loc 1 2851 14 ld.u64 %r277,[%r371]; .loc 1 2853 10 @ %r1390 bra $L69; shl.b64 %r1194,%r370,3; add.u64 %r389,%r1031,%r1194; add.u64 %r387,%r473,%r570; mov.u64 %r381,%r240; mov.u64 %r383,%r239; mov.u64 %r385,%r49; sub.u64 %r1428,%r227,%r387; add.u64 %r1429,%r1428,-8; shr.u64 %r1427,%r1429,3; add.u64 %r1430,%r1427,1; and.b64 %r1431,%r1430,3; setp.eq.u64 %r1434,%r1431,0; @ %r1434 bra $L70; setp.eq.u64 %r1433,%r1431,1; @ %r1433 bra $L306; setp.eq.u64 %r1432,%r1431,2; @ %r1432 bra $L307; bra $L370; $L70: .loc 1 2855 16 ld.u64 %r162,[%r389]; .loc 1 2856 13 ld.u64 %r1196,[%r387]; .loc 1 2855 11 mad.lo.u64 %r1570,%r162,%r1196,%r274; .loc 1 2858 13 ld.u64 %r1198,[%r385]; .loc 1 2857 11 mad.lo.u64 %r1571,%r162,%r1198,%r275; .loc 1 2860 13 ld.u64 %r1200,[%r383]; .loc 1 2859 11 mad.lo.u64 %r1572,%r162,%r1200,%r276; .loc 1 2862 13 ld.u64 %r1202,[%r381]; .loc 1 2861 11 mad.lo.u64 %r1573,%r162,%r1202,%r277; .loc 1 2855 16 ld.u64 %r1579,[%r389+8]; .loc 1 2856 13 ld.u64 %r1580,[%r387+8]; .loc 1 2855 11 mad.lo.u64 %r1582,%r1579,%r1580,%r1570; .loc 1 2858 13 ld.u64 %r1583,[%r385+8]; .loc 1 2857 11 mad.lo.u64 %r1585,%r1579,%r1583,%r1571; .loc 1 2860 13 ld.u64 %r1586,[%r383+8]; .loc 1 2859 11 mad.lo.u64 %r1588,%r1579,%r1586,%r1572; .loc 1 2862 13 ld.u64 %r1589,[%r381+8]; .loc 1 2861 11 mad.lo.u64 %r1591,%r1579,%r1589,%r1573; .loc 1 2855 16 ld.u64 %r1598,[%r389+16]; .loc 1 2856 13 ld.u64 %r1599,[%r387+16]; .loc 1 2855 11 mad.lo.u64 %r1601,%r1598,%r1599,%r1582; .loc 1 2858 13 ld.u64 %r1602,[%r385+16]; .loc 1 2857 11 mad.lo.u64 %r1604,%r1598,%r1602,%r1585; .loc 1 2860 13 ld.u64 %r1605,[%r383+16]; .loc 1 2859 11 mad.lo.u64 %r1607,%r1598,%r1605,%r1588; .loc 1 2862 13 ld.u64 %r1608,[%r381+16]; .loc 1 2861 11 mad.lo.u64 %r1610,%r1598,%r1608,%r1591; .loc 1 2855 16 ld.u64 %r1617,[%r389+24]; .loc 1 2856 13 ld.u64 %r1618,[%r387+24]; .loc 1 2855 11 mad.lo.u64 %r274,%r1617,%r1618,%r1601; .loc 1 2858 13 ld.u64 %r1620,[%r385+24]; .loc 1 2857 11 mad.lo.u64 %r275,%r1617,%r1620,%r1604; .loc 1 2860 13 ld.u64 %r1622,[%r383+24]; .loc 1 2859 11 mad.lo.u64 %r276,%r1617,%r1622,%r1607; .loc 1 2862 13 ld.u64 %r1624,[%r381+24]; .loc 1 2861 11 mad.lo.u64 %r277,%r1617,%r1624,%r1610; .loc 1 2853 10 add.u64 %r389,%r389,32; add.u64 %r387,%r387,32; add.u64 %r385,%r385,32; add.u64 %r383,%r383,32; add.u64 %r381,%r381,32; setp.ne.u64 %r1626,%r227,%r387; @ %r1626 bra $L70; $L69: .loc 1 2864 28 st.u64 [%r374],%r274; .loc 1 2865 34 st.u64 [%r373],%r275; .loc 1 2866 34 st.u64 [%r372],%r276; .loc 1 2867 34 st.u64 [%r371],%r277; .loc 1 2846 6 add.u64 %r374,%r374,8; add.u64 %r373,%r373,8; add.u64 %r372,%r372,8; add.u64 %r371,%r371,8; add.u64 %r370,%r370,256; setp.ne.u64 %r1204,%r374,%r472; @ %r1204 bra $L71; bra $L63; $L60: .loc 1 2727 34 add.u64 %r273,%r273,256; .loc 1 2727 8 add.u64 %r555,%r555,2048; add.u64 %r556,%r556,2048; setp.ge.s64 %r1205,%r216,%r273; @ %r1205 bra $L72; bra $L73; $L59: .loc 1 2873 17 add.u64 %r171,%r1521,%r618; .loc 1 2874 9 setp.le.s64 %r1206,%r171,%r480; @ %r1206 bra $L60; .loc 1 2876 14 add.u64 %r495,%r263,%r273; .loc 1 2876 9 
add.u64 %r496,%r495,-1; .loc 1 2759 17 add.u64 %r405,%r1524,%r273; add.u64 %r1207,%r405,%r406; shl.b64 %r1208,%r1207,3; add.u64 %r407,%r1208,%r247; neg.s64 %r1210,%r405; shl.b64 %r1211,%r1210,3; shl.b64 %r1212,%r273,3; add.u64 %r390,%r1212,%r1211; shl.b64 %r1213,%r495,3; add.u64 %r392,%r1213,%r1211; shl.b64 %r434,%r263,8; .loc 1 2876 9 mov.u64 %r398,%r397; mov.u64 %r400,%r586; mov.u64 %r401,%r406; mov.u64 %r267,%r480; setp.gt.s64 %r1627,%r1524,%r263; setp.le.s64 %r1365,%r273,%r496; not.b64 %r1372,%r273; add.u64 %r1373,%r1372,%r495; and.b64 %r1374,%r1373,-4; add.u64 %r1375,%r1374,%r273; $L81: .loc 1 2877 6 @ %r1365 bra $L74; $L80: .loc 1 2901 6 @ %r1627 bra $L75; bra $L76; $L74: add.u64 %r419,%r390,%r407; add.u64 %r1221,%r1375,%r401; shl.b64 %r1222,%r1221,3; add.u64 %r1223,%r247,32; add.u64 %r409,%r1222,%r1223; .loc 1 2904 13 mov.u64 %r415,8192; mov.u64 %r416,6144; mov.u64 %r417,2048; mov.u64 %r418,-4096; $L79: .loc 1 2879 14 ld.u64 %r269,[%r419]; .loc 1 2880 14 ld.u64 %r270,[%r419+8]; .loc 1 2881 14 ld.u64 %r271,[%r419+16]; .loc 1 2882 14 ld.u64 %r272,[%r419+24]; .loc 1 2884 10 @ %r1390 bra $L77; add.u64 %r428,%r1031,%r417; mov.u64 %r429,%r400; sub.u64 %r1420,%r398,%r400; add.u64 %r1421,%r1420,-8; shr.u64 %r1419,%r1421,3; and.b64 %r1423,%r1419,1; setp.ne.u64 %r1424,%r1423,0; @ ! %r1424 bra $L371; $L78: add.u64 %r425,%r418,%r428; .loc 1 2887 16 ld.u64 %r173,[%r429]; .loc 1 2886 16 add.u64 %r1225,%r425,%r417; .loc 1 2887 13 ld.u64 %r1227,[%r1225]; .loc 1 2886 11 mad.lo.u64 %r1628,%r173,%r1227,%r269; .loc 1 2889 13 ld.u64 %r1229,[%r428]; .loc 1 2888 11 mad.lo.u64 %r1629,%r173,%r1229,%r270; .loc 1 2890 16 add.u64 %r1230,%r425,%r416; .loc 1 2891 13 ld.u64 %r1232,[%r1230]; .loc 1 2890 11 mad.lo.u64 %r1630,%r173,%r1232,%r271; .loc 1 2892 16 add.u64 %r1233,%r425,%r415; .loc 1 2893 13 ld.u64 %r1235,[%r1233]; .loc 1 2892 11 mad.lo.u64 %r1631,%r173,%r1235,%r272; .loc 1 2884 10 add.u64 %r1426,%r428,8; add.u64 %r1634,%r418,%r1426; .loc 1 2887 16 ld.u64 %r1635,[%r429+8]; .loc 1 2886 16 add.u64 %r1636,%r1634,%r417; .loc 1 2887 13 ld.u64 %r1637,[%r1636]; .loc 1 2886 11 mad.lo.u64 %r269,%r1635,%r1637,%r1628; .loc 1 2889 13 ld.u64 %r1639,[%r1426]; .loc 1 2888 11 mad.lo.u64 %r270,%r1635,%r1639,%r1629; .loc 1 2890 16 add.u64 %r1641,%r1634,%r416; .loc 1 2891 13 ld.u64 %r1642,[%r1641]; .loc 1 2890 11 mad.lo.u64 %r271,%r1635,%r1642,%r1630; .loc 1 2892 16 add.u64 %r1644,%r1634,%r415; .loc 1 2893 13 ld.u64 %r1645,[%r1644]; .loc 1 2892 11 mad.lo.u64 %r272,%r1635,%r1645,%r1631; .loc 1 2884 10 add.u64 %r429,%r429,16; add.u64 %r428,%r428,16; setp.ne.u64 %r1647,%r398,%r429; @ %r1647 bra $L78; $L77: .loc 1 2895 28 st.u64 [%r419],%r269; .loc 1 2896 32 st.u64 [%r419+8],%r270; .loc 1 2897 32 st.u64 [%r419+16],%r271; .loc 1 2898 32 st.u64 [%r419+24],%r272; .loc 1 2877 6 add.u64 %r419,%r419,32; add.u64 %r418,%r418,-8192; add.u64 %r417,%r417,8192; add.u64 %r416,%r416,8192; add.u64 %r415,%r415,8192; setp.ne.u64 %r1237,%r409,%r419; @ %r1237 bra $L79; bra $L80; $L76: .loc 1 2874 39 add.u64 %r267,%r267,1; .loc 1 2874 9 add.u64 %r407,%r407,%r566; add.u64 %r401,%r401,%r504; add.u64 %r400,%r400,%r619; add.u64 %r398,%r398,%r619; setp.ne.u64 %r1238,%r171,%r267; @ %r1238 bra $L81; bra $L60; $L75: add.u64 %r436,%r392,%r407; .loc 1 2904 13 mov.u64 %r435,%r434; $L84: .loc 1 2903 14 ld.u64 %r268,[%r436]; .loc 1 2905 10 @ %r1390 bra $L82; shl.b64 %r1241,%r435,3; add.u64 %r332,%r1031,%r1241; mov.u64 %r297,%r400; sub.u64 %r1410,%r398,%r400; add.u64 %r1411,%r1410,-8; shr.u64 %r1409,%r1411,3; add.u64 %r1412,%r1409,1; and.b64 %r1413,%r1412,3; 
setp.eq.u64 %r1416,%r1413,0; @ %r1416 bra $L83; setp.eq.u64 %r1415,%r1413,1; @ %r1415 bra $L308; setp.eq.u64 %r1414,%r1413,2; @ %r1414 bra $L309; bra $L372; $L83: .loc 1 2908 13 ld.u64 %r1243,[%r332]; ld.u64 %r1244,[%r297]; .loc 1 2907 11 mad.lo.u64 %r1648,%r1243,%r1244,%r268; .loc 1 2908 13 ld.u64 %r1651,[%r332+8]; ld.u64 %r1652,[%r297+8]; .loc 1 2907 11 mad.lo.u64 %r1654,%r1651,%r1652,%r1648; .loc 1 2908 13 ld.u64 %r1658,[%r332+16]; ld.u64 %r1659,[%r297+16]; .loc 1 2907 11 mad.lo.u64 %r1661,%r1658,%r1659,%r1654; .loc 1 2908 13 ld.u64 %r1665,[%r332+24]; ld.u64 %r1666,[%r297+24]; .loc 1 2907 11 mad.lo.u64 %r268,%r1665,%r1666,%r1661; .loc 1 2905 10 add.u64 %r332,%r332,32; add.u64 %r297,%r297,32; setp.ne.u64 %r1668,%r398,%r297; @ %r1668 bra $L83; $L82: .loc 1 2910 28 st.u64 [%r436],%r268; .loc 1 2901 6 add.u64 %r436,%r436,8; add.u64 %r435,%r435,256; setp.ne.u64 %r1246,%r407,%r436; @ %r1246 bra $L84; bra $L76; $L20: .loc 1 2958 11 cvt.u16.u32 %r1247,%r446; setp.eq.u16 %r1248,%r1247,1; @ ! %r1248 bra $L373; bra $L85; $L31: .loc 1 2920 38 set.u32.eq.u64 %r1250,%r208,1; neg.s32 %r1251,%r1250; .loc 1 2920 26 cvt.u16.u32 %r1253,%r1251; and.b16 %r1252,%r1253,%r988; .loc 1 2920 11 cvt.u32.u16 %r1255,%r1252; cvt.u16.u8 %r1256,%r1255; setp.eq.u16 %r1257,%r1256,0; @ %r1257 bra $L20; .loc 1 2920 43 setp.ne.u64 %r1258,%r502,1; @ %r1258 bra $L20; $L121: .loc 1 2922 10 cvt.u16.u32 %r1259,%r446; setp.ne.u16 %r1260,%r1259,1; @ %r1260 bra $L87; .loc 1 2948 4 setp.gt.s64 %r1261,%r217,0; @ %r1261 bra $L88; bra $L1; $L87: .loc 1 2929 4 setp.le.s64 %r1262,%r217,0; @ %r1262 bra $L1; shl.b64 %r643,%r504,3; shl.b64 %r1263,%r216,3; add.u64 %r642,%r245,%r1263; shl.b64 %r633,%r503,3; shl.b64 %r1264,%r456,3; add.u64 %r634,%r243,%r1264; mov.u64 %r647,0; mov.u64 %r640,%r647; .loc 1 2929 11 mov.u64 %r300,%r647; setp.gt.s64 %r1669,%r216,0; setp.le.s64 %r1406,%r456,0; $L91: .loc 1 2933 8 @ %r1669 bra $L90; $L95: .loc 1 2929 29 add.u64 %r300,%r300,1; .loc 1 2929 4 add.u64 %r640,%r640,%r504; add.u64 %r642,%r642,%r643; add.u64 %r647,%r647,%r209; setp.ne.u64 %r1266,%r217,%r300; @ %r1266 bra $L91; bra $L1; $L90: shl.b64 %r1267,%r640,3; add.u64 %r629,%r245,%r1267; shl.b64 %r1268,%r647,3; add.u64 %r652,%r244,%r1268; .loc 1 2933 8 mov.u64 %r631,%r634; mov.u64 %r630,0; $L94: .loc 1 2937 5 @ %r1406 bra $L140; shl.b64 %r1270,%r630,3; add.u64 %r625,%r243,%r1270; mov.u64 %r626,%r652; .loc 1 2936 7 mov.u64 %r301,0; sub.u64 %r1469,%r631,%r625; add.u64 %r1470,%r1469,-8; shr.u64 %r1468,%r1470,3; add.u64 %r1471,%r1468,1; and.b64 %r1472,%r1471,3; setp.eq.u64 %r1475,%r1472,0; @ %r1475 bra $L93; setp.eq.u64 %r1474,%r1472,1; @ %r1474 bra $L310; setp.eq.u64 %r1473,%r1472,2; @ %r1473 bra $L311; bra $L374; $L93: .loc 1 2938 23 ld.u64 %r1272,[%r625]; ld.u64 %r1273,[%r626]; .loc 1 2938 9 mad.lo.u64 %r1670,%r1272,%r1273,%r301; .loc 1 2938 23 ld.u64 %r1673,[%r625+8]; ld.u64 %r1674,[%r626+8]; .loc 1 2938 9 mad.lo.u64 %r1676,%r1673,%r1674,%r1670; .loc 1 2938 23 ld.u64 %r1680,[%r625+16]; ld.u64 %r1681,[%r626+16]; .loc 1 2938 9 mad.lo.u64 %r1683,%r1680,%r1681,%r1676; .loc 1 2938 23 ld.u64 %r1687,[%r625+24]; ld.u64 %r1688,[%r626+24]; .loc 1 2938 9 mad.lo.u64 %r301,%r1687,%r1688,%r1683; .loc 1 2937 5 add.u64 %r625,%r625,32; add.u64 %r626,%r626,32; setp.ne.u64 %r1690,%r625,%r631; @ %r1690 bra $L93; bra $L92; $L140: .loc 1 2936 7 mov.u64 %r301,0; $L92: .loc 1 2939 15 st.u64 [%r629],%r301; .loc 1 2933 8 add.u64 %r629,%r629,8; add.u64 %r630,%r630,%r503; add.u64 %r631,%r631,%r633; setp.ne.u64 %r1275,%r629,%r642; @ %r1275 bra $L94; bra $L95; $L88: shl.b64 
%r663,%r504,3; mov.u64 %r661,%r245; shl.b64 %r655,%r503,3; .loc 1 2948 4 mov.u64 %r664,0; .loc 1 2948 11 mov.u64 %r298,%r664; setp.le.s64 %r1691,%r456,0; $L98: .loc 1 2952 8 @ %r1691 bra $L141; mov.u64 %r653,%r243; shl.b64 %r1277,%r664,3; add.u64 %r656,%r1277,%r244; add.u64 %r1278,%r456,%r664; shl.b64 %r1279,%r1278,3; add.u64 %r672,%r1279,%r244; .loc 1 2951 10 mov.u64 %r299,0; sub.u64 %r1479,%r672,%r656; add.u64 %r1480,%r1479,-8; shr.u64 %r1478,%r1480,3; add.u64 %r1481,%r1478,1; and.b64 %r1482,%r1481,3; setp.eq.u64 %r1485,%r1482,0; @ %r1485 bra $L97; setp.eq.u64 %r1484,%r1482,1; @ %r1484 bra $L312; setp.eq.u64 %r1483,%r1482,2; @ %r1483 bra $L313; bra $L375; $L97: .loc 1 2953 26 ld.u64 %r1281,[%r653]; ld.u64 %r1282,[%r656]; .loc 1 2953 5 mad.lo.u64 %r1692,%r1281,%r1282,%r299; .loc 1 2952 8 add.u64 %r1693,%r653,%r655; .loc 1 2953 26 ld.u64 %r1695,[%r1693]; ld.u64 %r1696,[%r656+8]; .loc 1 2953 5 mad.lo.u64 %r1698,%r1695,%r1696,%r1692; .loc 1 2952 8 add.u64 %r1699,%r1693,%r655; .loc 1 2953 26 ld.u64 %r1702,[%r1699]; ld.u64 %r1703,[%r656+16]; .loc 1 2953 5 mad.lo.u64 %r1705,%r1702,%r1703,%r1698; .loc 1 2952 8 add.u64 %r1706,%r1699,%r655; .loc 1 2953 26 ld.u64 %r1709,[%r1706]; ld.u64 %r1710,[%r656+24]; .loc 1 2953 5 mad.lo.u64 %r299,%r1709,%r1710,%r1705; .loc 1 2952 8 add.u64 %r653,%r1706,%r655; add.u64 %r656,%r656,32; setp.ne.u64 %r1712,%r656,%r672; @ %r1712 bra $L97; bra $L96; $L141: .loc 1 2951 10 mov.u64 %r299,0; $L96: .loc 1 2954 25 st.u64 [%r661],%r299; .loc 1 2948 29 add.u64 %r298,%r298,1; .loc 1 2948 4 add.u64 %r661,%r661,%r663; add.u64 %r664,%r664,%r209; setp.ne.u64 %r1284,%r217,%r298; @ %r1284 bra $L98; bra $L1; $L85: .loc 1 2963 7 setp.le.s64 %r1285,%r217,0; @ %r1285 bra $L1; shl.b64 %r683,%r444,3; mov.u64 %r681,%r245; shl.b64 %r675,%r503,3; shl.b64 %r678,%r502,3; mov.u64 %r684,0; .loc 1 2963 14 mov.u64 %r311,%r684; setp.le.s64 %r1713,%r456,0; $L101: .loc 1 2967 4 @ %r1713 bra $L142; mov.u64 %r673,%r243; shl.b64 %r1287,%r684,3; add.u64 %r676,%r244,%r1287; .loc 1 2966 6 mov.u64 %r312,0; .loc 1 2967 11 mov.u64 %r313,%r312; and.b64 %r1490,%r456,3; setp.eq.u64 %r1493,%r1490,0; @ %r1493 bra $L100; setp.eq.u64 %r1492,%r1490,1; @ %r1492 bra $L314; setp.eq.u64 %r1491,%r1490,2; @ %r1491 bra $L315; bra $L376; $L100: .loc 1 2968 29 ld.u64 %r1289,[%r673]; ld.u64 %r1290,[%r676]; .loc 1 2968 8 mad.lo.u64 %r1714,%r1289,%r1290,%r312; .loc 1 2967 4 add.u64 %r1716,%r673,%r675; add.u64 %r1717,%r676,%r678; .loc 1 2968 29 ld.u64 %r1718,[%r1716]; ld.u64 %r1719,[%r1717]; .loc 1 2968 8 mad.lo.u64 %r1721,%r1718,%r1719,%r1714; .loc 1 2967 4 add.u64 %r1723,%r1716,%r675; add.u64 %r1724,%r1717,%r678; .loc 1 2968 29 ld.u64 %r1726,[%r1723]; ld.u64 %r1727,[%r1724]; .loc 1 2968 8 mad.lo.u64 %r1729,%r1726,%r1727,%r1721; .loc 1 2967 4 add.u64 %r1731,%r1723,%r675; add.u64 %r1732,%r1724,%r678; .loc 1 2968 29 ld.u64 %r1734,[%r1731]; ld.u64 %r1735,[%r1732]; .loc 1 2968 8 mad.lo.u64 %r312,%r1734,%r1735,%r1729; .loc 1 2967 28 add.u64 %r313,%r313,4; .loc 1 2967 4 add.u64 %r673,%r1731,%r675; add.u64 %r676,%r1732,%r678; setp.ne.u64 %r1737,%r456,%r313; @ %r1737 bra $L100; bra $L99; $L142: .loc 1 2966 6 mov.u64 %r312,0; $L99: .loc 1 2969 21 st.u64 [%r681],%r312; .loc 1 2963 32 add.u64 %r311,%r311,1; .loc 1 2963 7 add.u64 %r681,%r681,%r683; add.u64 %r684,%r684,%r209; setp.ne.u64 %r1292,%r217,%r311; @ %r1292 bra $L101; bra $L1; $L373: .loc 1 2972 11 setp.gt.s64 %r1293,%r208,%r503; @ %r1293 bra $L102; .loc 1 2993 7 setp.gt.s64 %r1294,%r217,0; @ %r1294 bra $L103; bra $L1; $L102: .loc 1 2974 7 setp.gt.s64 %r1295,%r217,0; @ %r1295 bra 
$L104; bra $L1; $L108: shl.b64 %r1296,%r717,3; add.u64 %r713,%r245,%r1296; .loc 1 2975 9 mov.u64 %r310,0; and.b64 %r1506,%r216,3; setp.eq.u64 %r1509,%r1506,0; @ %r1509 bra $L105; setp.eq.u64 %r1508,%r1506,1; @ %r1508 bra $L316; setp.eq.u64 %r1507,%r1506,2; @ %r1507 bra $L317; bra $L377; $L105: .loc 1 2976 34 st.u64 [%r713],%r1407; .loc 1 2975 2 add.u64 %r1739,%r713,%r715; .loc 1 2976 34 st.u64 [%r1739],%r1407; .loc 1 2975 2 add.u64 %r1741,%r1739,%r715; .loc 1 2976 34 st.u64 [%r1741],%r1407; .loc 1 2975 2 add.u64 %r1744,%r1741,%r715; .loc 1 2976 34 st.u64 [%r1744],%r1407; .loc 1 2975 27 add.u64 %r310,%r310,4; .loc 1 2975 2 add.u64 %r713,%r1744,%r715; setp.ne.u64 %r1746,%r216,%r310; @ %r1746 bra $L105; $L109: .loc 1 2974 32 add.u64 %r309,%r324,1; .loc 1 2974 7 add.u64 %r717,%r717,%r504; setp.ne.u64 %r1299,%r217,%r309; @ %r1299 bra $L143; shl.b64 %r701,%r502,3; shl.b64 %r692,%r503,3; mov.u64 %r706,0; mov.u64 %r705,%r706; mov.u64 %r325,%r706; setp.le.s64 %r1747,%r456,0; bra $L107; $L104: shl.b64 %r715,%r444,3; mov.u64 %r717,0; .loc 1 2974 14 mov.u64 %r324,%r717; setp.gt.s64 %r1748,%r216,0; .loc 1 2976 34 mov.u64 %r1407,%r717; bra $L106; $L143: mov.u64 %r324,%r309; $L106: .loc 1 2975 2 @ %r1748 bra $L108; bra $L109; $L112: shl.b64 %r1301,%r697,3; add.u64 %r690,%r243,%r1301; .loc 1 2980 4 mov.u64 %r689,%r710; .loc 1 2980 11 mov.u64 %r308,0; and.b64 %r1498,%r216,3; setp.eq.u64 %r1501,%r1498,0; @ %r1501 bra $L110; setp.eq.u64 %r1500,%r1498,1; @ %r1500 bra $L318; setp.eq.u64 %r1499,%r1498,2; @ %r1499 bra $L319; bra $L378; $L110: .loc 1 2983 37 ld.u64 %r1303,[%r690]; ld.u64 %r1304,[%r699]; .loc 1 2982 36 ld.u64 %r1306,[%r689]; mad.lo.u64 %r1305,%r1303,%r1304,%r1306; st.u64 [%r689],%r1305; .loc 1 2980 4 add.u64 %r1750,%r689,%r715; add.u64 %r1751,%r690,%r692; .loc 1 2983 37 ld.u64 %r1752,[%r1751]; ld.u64 %r1753,[%r699]; .loc 1 2982 36 ld.u64 %r1755,[%r1750]; mad.lo.u64 %r1756,%r1752,%r1753,%r1755; st.u64 [%r1750],%r1756; .loc 1 2980 4 add.u64 %r1758,%r1750,%r715; add.u64 %r1759,%r1751,%r692; .loc 1 2983 37 ld.u64 %r1761,[%r1759]; ld.u64 %r1762,[%r699]; .loc 1 2982 36 ld.u64 %r1764,[%r1758]; mad.lo.u64 %r1765,%r1761,%r1762,%r1764; st.u64 [%r1758],%r1765; .loc 1 2980 4 add.u64 %r1767,%r1758,%r715; add.u64 %r1768,%r1759,%r692; .loc 1 2983 37 ld.u64 %r1770,[%r1768]; ld.u64 %r1771,[%r699]; .loc 1 2982 36 ld.u64 %r1773,[%r1767]; mad.lo.u64 %r1774,%r1770,%r1771,%r1773; st.u64 [%r1767],%r1774; .loc 1 2980 29 add.u64 %r308,%r308,4; .loc 1 2980 4 add.u64 %r689,%r1767,%r715; add.u64 %r690,%r1768,%r692; setp.ne.u64 %r1775,%r216,%r308; @ %r1775 bra $L110; $L113: .loc 1 2979 26 add.u64 %r307,%r307,1; .loc 1 2979 2 add.u64 %r697,%r697,%r208; add.u64 %r699,%r699,%r701; setp.eq.u64 %r1308,%r456,%r307; @ %r1308 bra $L111; $L114: .loc 1 2980 4 @ %r1748 bra $L112; bra $L113; $L111: .loc 1 2978 32 add.u64 %r306,%r325,1; .loc 1 2978 7 add.u64 %r705,%r705,%r504; add.u64 %r706,%r706,%r209; setp.eq.u64 %r1310,%r324,%r325; @ %r1310 bra $L1; mov.u64 %r325,%r306; $L107: .loc 1 2979 2 @ %r1747 bra $L111; shl.b64 %r1312,%r706,3; add.u64 %r699,%r244,%r1312; shl.b64 %r1313,%r705,3; add.u64 %r710,%r245,%r1313; mov.u64 %r697,0; .loc 1 2979 9 mov.u64 %r307,%r697; bra $L114; $L103: shl.b64 %r732,%r444,3; shl.b64 %r723,%r208,3; shl.b64 %r727,%r502,3; .loc 1 2993 7 mov.u64 %r741,0; mov.u64 %r739,%r741; .loc 1 2993 14 mov.u64 %r302,%r741; setp.gt.s64 %r1776,%r216,0; setp.le.s64 %r1408,%r456,0; $L116: .loc 1 2997 4 @ %r1776 bra $L115; $L120: .loc 1 2993 32 add.u64 %r302,%r302,1; .loc 1 2993 7 add.u64 %r739,%r739,%r504; add.u64 
%r741,%r741,%r209; setp.ne.u64 %r1315,%r217,%r302; @ %r1315 bra $L116; bra $L1; $L115: shl.b64 %r1316,%r739,3; add.u64 %r730,%r245,%r1316; shl.b64 %r1317,%r741,3; add.u64 %r747,%r244,%r1317; .loc 1 2997 4 mov.u64 %r734,0; .loc 1 2997 11 mov.u64 %r303,%r734; $L119: .loc 1 3001 8 @ %r1408 bra $L144; shl.b64 %r1319,%r734,3; add.u64 %r721,%r243,%r1319; mov.u64 %r725,%r747; .loc 1 3000 10 mov.u64 %r304,0; .loc 1 3001 15 mov.u64 %r305,%r304; and.b64 %r1514,%r456,3; setp.eq.u64 %r1517,%r1514,0; @ %r1517 bra $L118; setp.eq.u64 %r1516,%r1514,1; @ %r1516 bra $L320; setp.eq.u64 %r1515,%r1514,2; @ %r1515 bra $L321; bra $L379; $L118: .loc 1 3002 28 ld.u64 %r1321,[%r721]; ld.u64 %r1322,[%r725]; .loc 1 3002 5 mad.lo.u64 %r1778,%r1321,%r1322,%r304; .loc 1 3001 8 add.u64 %r1780,%r721,%r723; add.u64 %r1781,%r725,%r727; .loc 1 3002 28 ld.u64 %r1782,[%r1780]; ld.u64 %r1783,[%r1781]; .loc 1 3002 5 mad.lo.u64 %r1785,%r1782,%r1783,%r1778; .loc 1 3001 8 add.u64 %r1787,%r1780,%r723; add.u64 %r1788,%r1781,%r727; .loc 1 3002 28 ld.u64 %r1790,[%r1787]; ld.u64 %r1791,[%r1788]; .loc 1 3002 5 mad.lo.u64 %r1793,%r1790,%r1791,%r1785; .loc 1 3001 8 add.u64 %r1795,%r1787,%r723; add.u64 %r1796,%r1788,%r727; .loc 1 3002 28 ld.u64 %r1798,[%r1795]; ld.u64 %r1799,[%r1796]; .loc 1 3002 5 mad.lo.u64 %r304,%r1798,%r1799,%r1793; .loc 1 3001 32 add.u64 %r305,%r305,4; .loc 1 3001 8 add.u64 %r721,%r1795,%r723; add.u64 %r725,%r1796,%r727; setp.ne.u64 %r1801,%r456,%r305; @ %r1801 bra $L118; bra $L117; $L144: .loc 1 3000 10 mov.u64 %r304,0; $L117: .loc 1 3003 27 st.u64 [%r730],%r304; .loc 1 2997 29 add.u64 %r303,%r303,1; .loc 1 2997 4 add.u64 %r730,%r730,%r732; add.u64 %r734,%r734,%r503; setp.ne.u64 %r1324,%r216,%r303; @ %r1324 bra $L119; bra $L120; $L32: .loc 1 2920 11 setp.eq.u64 %r1325,%r208,1; @ %r1325 bra $L121; bra $L20; $L2: .loc 1 2476 6 ld.u64 %r1326,[%r748]; setp.ne.u64 %r1327,%r1326,0; @ %r1327 bra $L122; .loc 1 2485 4 ld.u64 %r1329,[%r749+56]; add.u64 %r1328,%r1329,1; ld.u64 %r1330,[%r749+48]; sub.u64 %r508,%r1328,%r1330; .loc 1 2483 15 ld.s8 %r1332,[%r750+28]; cvt.u16.u32 %r1331,%r1332; setp.eq.u16 %r1333,%r1331,1; @ %r1333 bra $L123; bra $L124; $L126: .loc 1 2560 16 ld.u64 %r503,[%r749+40]; .loc 1 2548 27 mov.u64 %r444,%r504; bra $L125; $L14: ld.u64 %r504,[%r748+40]; .loc 1 2543 6 ld.s8 %r1335,[%r748+28]; cvt.u16.u32 %r1334,%r1335; setp.eq.u16 %r1336,%r1334,1; @ ! %r1336 bra $L127; bra $L126; $L11: .loc 1 2560 16 ld.u64 %r503,[%r749+40]; mov.u32 %r446,%r331; .loc 1 2548 27 mov.u64 %r504,%r444; .loc 1 2563 14 mov.u64 %r216,1; .loc 1 2561 16 mov.u64 %r208,%r216; bra $L12; $L5: .loc 1 2485 4 ld.u64 %r1338,[%r749+56]; add.u64 %r1337,%r1338,1; ld.u64 %r1339,[%r749+48]; sub.u64 %r508,%r1337,%r1339; bra $L124; $L128: .loc 1 2509 17 ld.u64 %r1341,[%r748+56]; add.u64 %r1340,%r1341,1; .loc 1 2509 15 ld.u64 %r1342,[%r748+48]; sub.u64 %r456,%r1340,%r1342; ld.s8 %r1354,[%r750+28]; bra $L9; $L122: .loc 1 2502 12 cvta.global.u64 %r1343,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r1344,[%r1343+36]; setp.ne.u32 %r1345,%r1344,0; @ ! 
%r1345 bra $L8; bra $L128; $L1$L372: .loc 1 2908 13 ld.u64 %r1802,[%r332]; ld.u64 %r1803,[%r400]; .loc 1 2907 11 mad.lo.u64 %r268,%r1802,%r1803,%r268; .loc 1 2905 10 add.u64 %r332,%r332,8; add.u64 %r297,%r400,8; $L309: .loc 1 2908 13 ld.u64 %r1806,[%r332]; ld.u64 %r1807,[%r297]; .loc 1 2907 11 mad.lo.u64 %r268,%r1806,%r1807,%r268; .loc 1 2905 10 add.u64 %r332,%r332,8; add.u64 %r297,%r297,8; $L308: .loc 1 2908 13 ld.u64 %r1810,[%r332]; ld.u64 %r1811,[%r297]; .loc 1 2907 11 mad.lo.u64 %r268,%r1810,%r1811,%r268; .loc 1 2905 10 add.u64 %r332,%r332,8; add.u64 %r297,%r297,8; setp.ne.u64 %r1813,%r398,%r297; @ %r1813 bra $L83; bra $L82; $L371: add.u64 %r1814,%r418,%r428; .loc 1 2887 16 ld.u64 %r1815,[%r400]; .loc 1 2886 16 add.u64 %r1816,%r1814,%r417; .loc 1 2887 13 ld.u64 %r1817,[%r1816]; .loc 1 2886 11 mad.lo.u64 %r269,%r1815,%r1817,%r269; .loc 1 2889 13 ld.u64 %r1819,[%r428]; .loc 1 2888 11 mad.lo.u64 %r270,%r1815,%r1819,%r270; .loc 1 2890 16 add.u64 %r1821,%r1814,%r416; .loc 1 2891 13 ld.u64 %r1822,[%r1821]; .loc 1 2890 11 mad.lo.u64 %r271,%r1815,%r1822,%r271; .loc 1 2892 16 add.u64 %r1824,%r1814,%r415; .loc 1 2893 13 ld.u64 %r1825,[%r1824]; .loc 1 2892 11 mad.lo.u64 %r272,%r1815,%r1825,%r272; .loc 1 2884 10 add.u64 %r429,%r400,8; add.u64 %r428,%r428,8; setp.ne.u64 %r1827,%r398,%r429; @ %r1827 bra $L78; bra $L77; $L370: .loc 1 2855 16 ld.u64 %r1828,[%r389]; .loc 1 2856 13 ld.u64 %r1829,[%r387]; .loc 1 2855 11 mad.lo.u64 %r274,%r1828,%r1829,%r274; .loc 1 2858 13 ld.u64 %r1831,[%r49]; .loc 1 2857 11 mad.lo.u64 %r275,%r1828,%r1831,%r275; .loc 1 2860 13 ld.u64 %r1833,[%r239]; .loc 1 2859 11 mad.lo.u64 %r276,%r1828,%r1833,%r276; .loc 1 2862 13 ld.u64 %r1835,[%r240]; .loc 1 2861 11 mad.lo.u64 %r277,%r1828,%r1835,%r277; .loc 1 2853 10 add.u64 %r389,%r389,8; add.u64 %r387,%r387,8; add.u64 %r385,%r49,8; add.u64 %r383,%r239,8; add.u64 %r381,%r240,8; $L307: .loc 1 2855 16 ld.u64 %r1838,[%r389]; .loc 1 2856 13 ld.u64 %r1839,[%r387]; .loc 1 2855 11 mad.lo.u64 %r274,%r1838,%r1839,%r274; .loc 1 2858 13 ld.u64 %r1841,[%r385]; .loc 1 2857 11 mad.lo.u64 %r275,%r1838,%r1841,%r275; .loc 1 2860 13 ld.u64 %r1843,[%r383]; .loc 1 2859 11 mad.lo.u64 %r276,%r1838,%r1843,%r276; .loc 1 2862 13 ld.u64 %r1845,[%r381]; .loc 1 2861 11 mad.lo.u64 %r277,%r1838,%r1845,%r277; .loc 1 2853 10 add.u64 %r389,%r389,8; add.u64 %r387,%r387,8; add.u64 %r385,%r385,8; add.u64 %r383,%r383,8; add.u64 %r381,%r381,8; $L306: .loc 1 2855 16 ld.u64 %r1848,[%r389]; .loc 1 2856 13 ld.u64 %r1849,[%r387]; .loc 1 2855 11 mad.lo.u64 %r274,%r1848,%r1849,%r274; .loc 1 2858 13 ld.u64 %r1851,[%r385]; .loc 1 2857 11 mad.lo.u64 %r275,%r1848,%r1851,%r275; .loc 1 2860 13 ld.u64 %r1853,[%r383]; .loc 1 2859 11 mad.lo.u64 %r276,%r1848,%r1853,%r276; .loc 1 2862 13 ld.u64 %r1855,[%r381]; .loc 1 2861 11 mad.lo.u64 %r277,%r1848,%r1855,%r277; .loc 1 2853 10 add.u64 %r389,%r389,8; add.u64 %r387,%r387,8; add.u64 %r385,%r385,8; add.u64 %r383,%r383,8; add.u64 %r381,%r381,8; setp.ne.u64 %r1857,%r227,%r387; @ %r1857 bra $L70; bra $L69; $L369: .loc 1 2762 43 ld.u64 %r1858,[%r556]; st.u64 [%r330],%r1858; .loc 1 2760 9 add.u64 %r57,%r556,8; add.u64 %r73,%r330,2048; $L305: .loc 1 2762 43 ld.u64 %r1860,[%r57]; st.u64 [%r73],%r1860; .loc 1 2760 9 add.u64 %r57,%r57,8; add.u64 %r73,%r73,2048; $L304: .loc 1 2762 43 ld.u64 %r1862,[%r57]; st.u64 [%r73],%r1862; .loc 1 2760 9 add.u64 %r57,%r57,8; add.u64 %r73,%r73,2048; setp.ne.u64 %r1863,%r57,%r136; @ %r1863 bra $L58; bra $L48; $L368: .loc 1 2741 7 add.u64 %r1864,%r536,%r539; add.u64 %r1865,%r1864,%r540; .loc 1 2740 49 ld.u64 
%r1866,[%r1865]; st.u64 [%r229],%r1866; .loc 1 2742 49 ld.u64 %r1867,[%r536]; st.u64 [%r229+8],%r1867; .loc 1 2745 7 add.u64 %r1868,%r536,%r540; add.u64 %r1869,%r1868,%r539; .loc 1 2744 49 ld.u64 %r1870,[%r1869+8]; st.u64 [%r229+2048],%r1870; .loc 1 2746 49 ld.u64 %r1871,[%r536+8]; st.u64 [%r229+2056],%r1871; .loc 1 2738 9 add.u64 %r229,%r229,4096; add.u64 %r211,%r536,16; $L303: .loc 1 2741 7 add.u64 %r1873,%r211,%r539; add.u64 %r1874,%r1873,%r540; .loc 1 2740 49 ld.u64 %r1875,[%r1874]; st.u64 [%r229],%r1875; .loc 1 2742 49 ld.u64 %r1876,[%r211]; st.u64 [%r229+8],%r1876; .loc 1 2745 7 add.u64 %r1877,%r211,%r540; add.u64 %r1878,%r1877,%r539; .loc 1 2744 49 ld.u64 %r1879,[%r1878+8]; st.u64 [%r229+2048],%r1879; .loc 1 2746 49 ld.u64 %r1880,[%r211+8]; st.u64 [%r229+2056],%r1880; .loc 1 2738 9 add.u64 %r229,%r229,4096; add.u64 %r211,%r211,16; $L302: .loc 1 2741 7 add.u64 %r1882,%r211,%r539; add.u64 %r1883,%r1882,%r540; .loc 1 2740 49 ld.u64 %r1884,[%r1883]; st.u64 [%r229],%r1884; .loc 1 2742 49 ld.u64 %r1885,[%r211]; st.u64 [%r229+8],%r1885; .loc 1 2745 7 add.u64 %r1886,%r211,%r540; add.u64 %r1887,%r1886,%r539; .loc 1 2744 49 ld.u64 %r1888,[%r1887+8]; st.u64 [%r229+2048],%r1888; .loc 1 2746 49 ld.u64 %r1889,[%r211+8]; st.u64 [%r229+2056],%r1889; .loc 1 2738 9 add.u64 %r229,%r229,4096; add.u64 %r211,%r211,16; setp.ne.u64 %r1890,%r211,%r527; @ %r1890 bra $L52; bra $L53; $L366: .loc 1 2688 2 @ ! %r1351 bra $L380; .loc 1 2689 22 mov.u32 %r1881892,[%value_in]; } $L380: .loc 1 2687 24 mov.u64 %r296,2; .loc 1 2687 7 add.u64 %r620,%r245,%r621; $L301: .loc 1 2688 2 @ ! %r1351 bra $L381; .loc 1 2689 22 mov.u32 %r188941895,[%value_in]; } $L381: .loc 1 2687 24 add.u64 %r296,%r296,1; .loc 1 2687 7 add.u64 %r620,%r620,%r621; $L300: .loc 1 2688 2 @ ! %r1351 bra $L382; .loc 1 2689 22 mov.u32 %r1881898,[%value_in]; } $L382: .loc 1 2687 24 add.u64 %r296,%r296,1; .loc 1 2687 7 add.u64 %r620,%r620,%r621; setp.eq.u64 %r1899,%r296,%r624; @ ! %r1899 bra $L34; bra $L33; $L238: .loc 1 2689 22 mov.u32 %r19001529001901,[%value_in]; } $L367: .loc 1 2687 7 add.u64 %r1903,%r1520,%r621; .loc 1 2688 2 @ ! %r1351 bra $L383; .loc 1 2689 22 mov.u32 %r190519019051906,[%value_in]; } $L383: .loc 1 2687 7 add.u64 %r1908,%r1903,%r621; .loc 1 2688 2 @ ! %r1351 bra $L384; .loc 1 2689 22 mov.u32 %r19101919101911,[%value_in]; } $L384: .loc 1 2687 24 add.u64 %r296,%r1467,3; .loc 1 2687 7 add.u64 %r620,%r1908,%r621; setp.eq.u64 %r1912,%r296,%r624; @ ! 
%r1912 bra $L34; bra $L33; $L374: .loc 1 2938 23 ld.u64 %r1913,[%r625]; ld.u64 %r1914,[%r652]; mul.lo.u64 %r301,%r1913,%r1914; .loc 1 2937 5 add.u64 %r625,%r625,8; add.u64 %r626,%r652,8; $L311: .loc 1 2938 23 ld.u64 %r1917,[%r625]; ld.u64 %r1918,[%r626]; .loc 1 2938 9 mad.lo.u64 %r301,%r1917,%r1918,%r301; .loc 1 2937 5 add.u64 %r625,%r625,8; add.u64 %r626,%r626,8; $L310: .loc 1 2938 23 ld.u64 %r1921,[%r625]; ld.u64 %r1922,[%r626]; .loc 1 2938 9 mad.lo.u64 %r301,%r1921,%r1922,%r301; .loc 1 2937 5 add.u64 %r625,%r625,8; add.u64 %r626,%r626,8; setp.ne.u64 %r1924,%r625,%r631; @ %r1924 bra $L93; bra $L92; $L375: .loc 1 2953 26 ld.u64 %r1925,[%r243]; ld.u64 %r1926,[%r656]; mul.lo.u64 %r299,%r1925,%r1926; .loc 1 2952 8 add.u64 %r653,%r243,%r655; add.u64 %r656,%r656,8; $L313: .loc 1 2953 26 ld.u64 %r1929,[%r653]; ld.u64 %r1930,[%r656]; .loc 1 2953 5 mad.lo.u64 %r299,%r1929,%r1930,%r299; .loc 1 2952 8 add.u64 %r653,%r653,%r655; add.u64 %r656,%r656,8; $L312: .loc 1 2953 26 ld.u64 %r1933,[%r653]; ld.u64 %r1934,[%r656]; .loc 1 2953 5 mad.lo.u64 %r299,%r1933,%r1934,%r299; .loc 1 2952 8 add.u64 %r653,%r653,%r655; add.u64 %r656,%r656,8; setp.ne.u64 %r1936,%r656,%r672; @ %r1936 bra $L97; bra $L96; $L376: .loc 1 2968 29 ld.u64 %r1937,[%r243]; ld.u64 %r1938,[%r676]; mul.lo.u64 %r312,%r1937,%r1938; .loc 1 2967 28 mov.u64 %r313,1; .loc 1 2967 4 add.u64 %r673,%r243,%r675; add.u64 %r676,%r676,%r678; $L315: .loc 1 2968 29 ld.u64 %r1941,[%r673]; ld.u64 %r1942,[%r676]; .loc 1 2968 8 mad.lo.u64 %r312,%r1941,%r1942,%r312; .loc 1 2967 28 add.u64 %r313,%r313,1; .loc 1 2967 4 add.u64 %r673,%r673,%r675; add.u64 %r676,%r676,%r678; $L314: .loc 1 2968 29 ld.u64 %r1945,[%r673]; ld.u64 %r1946,[%r676]; .loc 1 2968 8 mad.lo.u64 %r312,%r1945,%r1946,%r312; .loc 1 2967 28 add.u64 %r313,%r313,1; .loc 1 2967 4 add.u64 %r673,%r673,%r675; add.u64 %r676,%r676,%r678; setp.ne.u64 %r1948,%r456,%r313; @ %r1948 bra $L100; bra $L99; $L378: .loc 1 2983 37 ld.u64 %r1949,[%r690]; ld.u64 %r1950,[%r699]; .loc 1 2982 36 ld.u64 %r1952,[%r710]; mad.lo.u64 %r1953,%r1949,%r1950,%r1952; st.u64 [%r710],%r1953; .loc 1 2980 29 mov.u64 %r308,1; .loc 1 2980 4 add.u64 %r689,%r710,%r715; add.u64 %r690,%r690,%r692; $L319: .loc 1 2983 37 ld.u64 %r1955,[%r690]; ld.u64 %r1956,[%r699]; .loc 1 2982 36 ld.u64 %r1958,[%r689]; mad.lo.u64 %r1959,%r1955,%r1956,%r1958; st.u64 [%r689],%r1959; .loc 1 2980 29 add.u64 %r308,%r308,1; .loc 1 2980 4 add.u64 %r689,%r689,%r715; add.u64 %r690,%r690,%r692; $L318: .loc 1 2983 37 ld.u64 %r1961,[%r690]; ld.u64 %r1962,[%r699]; .loc 1 2982 36 ld.u64 %r1964,[%r689]; mad.lo.u64 %r1965,%r1961,%r1962,%r1964; st.u64 [%r689],%r1965; .loc 1 2980 29 add.u64 %r308,%r308,1; .loc 1 2980 4 add.u64 %r689,%r689,%r715; add.u64 %r690,%r690,%r692; setp.ne.u64 %r1966,%r216,%r308; @ %r1966 bra $L110; bra $L113; $L377: .loc 1 2976 34 st.u64 [%r713],%r1407; .loc 1 2975 27 mov.u64 %r310,1; .loc 1 2975 2 add.u64 %r713,%r713,%r715; $L317: .loc 1 2976 34 st.u64 [%r713],%r1407; .loc 1 2975 27 add.u64 %r310,%r310,1; .loc 1 2975 2 add.u64 %r713,%r713,%r715; $L316: .loc 1 2976 34 st.u64 [%r713],%r1407; .loc 1 2975 27 add.u64 %r310,%r310,1; .loc 1 2975 2 add.u64 %r713,%r713,%r715; setp.ne.u64 %r1969,%r216,%r310; @ %r1969 bra $L105; bra $L109; $L379: .loc 1 3002 28 ld.u64 %r1970,[%r721]; ld.u64 %r1971,[%r747]; mul.lo.u64 %r304,%r1970,%r1971; .loc 1 3001 32 mov.u64 %r305,1; .loc 1 3001 8 add.u64 %r721,%r721,%r723; add.u64 %r725,%r747,%r727; $L321: .loc 1 3002 28 ld.u64 %r1974,[%r721]; ld.u64 %r1975,[%r725]; .loc 1 3002 5 mad.lo.u64 %r304,%r1974,%r1975,%r304; .loc 1 
3001 32 add.u64 %r305,%r305,1; .loc 1 3001 8 add.u64 %r721,%r721,%r723; add.u64 %r725,%r725,%r727; $L320: .loc 1 3002 28 ld.u64 %r1978,[%r721]; ld.u64 %r1979,[%r725]; .loc 1 3002 5 mad.lo.u64 %r304,%r1978,%r1979,%r304; .loc 1 3001 32 add.u64 %r305,%r305,1; .loc 1 3001 8 add.u64 %r721,%r721,%r723; add.u64 %r725,%r725,%r727; setp.ne.u64 %r1981,%r456,%r305; @ %r1981 bra $L118; bra $L117; } matmul_i16.o/_gfortran_matmul_i16 .visible .func _gfortran_matmul_i16, .param .u32 %in_ar4, .param .u64 %in_ar5); .file 1 "../../../../libgfortran/generated/matmul_i16.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]FUNCTION DECL: _gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime_97,116,109,117,108,95,105,49,54,0 }2] = {782] = {672] = {8471,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,97,41,32,61,61,32,50,32,124,124,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,98,41,32,61,61,32116,114,97,110,47,103,101,110,101,114,97,116,101,100,47,109,97,116,109,117,108,95,105,49,54,465,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,49,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3265,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,50,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3273,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,97,114,103,117,109,101,110,116,32,66,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,49,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }103,101,109,109,32,33,61,32,78,85,76,7_gfortran_matmul_i16 .visible .func _gfortran_matmul_i16112u64 %r89u64 %r314; .reg .u64 %r316; .reg .u64u32 %r331; .reg .u64 %r332; .reg .u64u64 %r374; .reg .u64u64 %r390; .reg .u64 %r392; .reg .u64u64u64 %r409; .reg .u64u64u64 %r530; .reg .u64 %r536; .reg .u64 %r537; .reg .u64 %r538; .reg .u64u64 %r555; .reg .u64u64 %r571; .reg .u64 %r573; .reg .u64 %r574; .reg .u64 %r575; .reg .u64 %r576; .reg .u64 %r577; .reg .u64 %r585; .reg .u64 %r586; .reg .u64 %r587; .reg .u64 %r588; .reg .u64 %r589; .reg .u64 %r590; .reg .u64 %r591; .reg .u64 %r592; .reg .u64 %r594; .reg .u64u64.reg .u64 %r619; .reg .u64 %r620; .reg .u64 %r621; .reg .u64 %r624; .reg .u64 %r625; .reg .u64 %r626; .reg .u64 %r629; .reg .u64 %r630; .reg .u64 %r631; .reg .u64 %r633; .reg .u64 %r634; .reg .u64 %r640; .reg .u64 %r642; .reg .u64 %r643; .reg .u64 %r647; .reg .u64 %r652; .reg .u64 %r653; .reg .u64 %r655; .reg .u64 %r656; .reg .u64 %r661; .reg .u64 %r663; .reg .u64 %r664; .reg .u64 %r672; .reg .u64 %r673; .reg .u64 %r675; .reg .u64 %r676; .reg .u64 %r678; .reg .u64 %r681; .reg .u64 %r683; .reg .u64u64 %r699; .reg .u64 %r701; .reg .u64 %r705; .reg .u64 %r706; .reg .u64 %r710; .reg .u64 %r713; .reg .u64 %r715; .reg .u64 %r717; .reg .u64 %r721; .reg .u64 %r723; .reg .u64 %r725; .reg .u64 %r727; .reg .u64 %r730; .reg .u64 %r732; .reg .u64 %r734; .reg .u64 %r739; .reg .u64 %r741; .reg .u64 %r747; .reg .u64 %r748; .reg .u64 %r749; .reg .u64 %r750; .reg .u32 %r751; .reg .u32 %r752; .reg .u64 %r753; .reg .u16 %r754; .reg .pred %r755; .reg .u16 %r756; .reg .pred %r758; .reg .u64 %r759; .reg .u32 %r760; .reg .u64 %r761; .reg .u64 
%r762; .reg .u64 %r763; .reg .pred %r764; .reg .pred %r766; .reg .u64 %r769; .reg .u64 %r771; .reg .u64 %r772; .reg .u64 %r773; .reg .u64 %r775; .reg .u64 %r776; .reg .u64 %r777; .reg .u64 %r778; .reg .u64 %r779; .reg .u64 %r782; .reg .u64 %r784; .reg .u64 %r785; .reg .u64 %r787; .reg .u64 %r789; .reg .u64 %r790; .reg .u64 %r792; .reg .u64 %r793; .reg .u64 %r794; .reg .u64 %r795; .reg .u64 %r796; .reg .u32 %r797; .reg .pred %r798; .reg .u64 %r799; .reg .u64 %r800; .reg .u64 %r801; .reg .pred %r803; .reg .u64 %r804; .reg .u64 %r805; .reg .u64 %r806; .reg .pred %r807; .reg .u64 %r808; .reg .u64 %r809; .reg .u64 %r810; .reg .u16 %r811; .reg .pred %r812; .reg .u64 %r813; .reg .u64 %r815; .reg .u64 %r816; .reg .u64 %r817; .reg .u16 %r818; .reg .pred %r820; .reg .pred %r821; .reg .u64 %r822; .reg .pred %r824; .reg .u64 %r825; .reg .u64 %r827; .reg .u64 %r828; .reg .u64 %r829; .reg .u64 %r830; .reg .u64 %r831; .reg .u64 %r832; .reg .pred %r833; .reg .u64 %r834; .reg .u64 %r836; .reg .u64 %r837; .reg .u64 %r838; .reg .u16 %r839; .reg .u32 %r840; .reg .pred %r841; .reg .u16 %r842; .reg .pred %r843; .reg .u64 %r844; .reg .u64 %r845; .reg .u64 %r846; .reg .u64 %r847; .reg .u64 %r848; .reg .u64 %r849; .reg .pred %r850; .reg .u32 %r852; .reg .u32 %r853; .reg .u32 %r855; .reg .u32 %r856; .reg .u16 %r857; .reg .u16 %r858; .reg .u16 %r859; .reg .u32 %r860; .reg .u16 %r861; .reg .pred %r862; .reg .u64 %r863; .reg .u16 %r865; .reg .pred %r866; .reg .u64 %r867; .reg .u64 %r868; .reg .u64 %r869; .reg .u32 %r870; .reg .u32 %r871; .reg .u32 %r872; .reg .u32 %r873; .reg .u32 %r874; .reg .u32 %r875; .reg .u32 %r877; .reg .u32 %r878; .reg .u16 %r879; .reg .u16 %r880; .reg .u16 %r881; .reg .u32 %r882; .reg .u16 %r883; .reg .pred %r884; .reg .u32 %r886; .reg .u32 %r887; .reg .u16 %r888; .reg .u16 %r889; .reg .u16 %r890; .reg .u32 %r891; .reg .u16 %r892; .reg .pred %r893; .reg .u32 %r895; .reg .u32 %r896; .reg .u32 %r898; .reg .u32 %r899; .reg .u16 %r900; .reg .u16 %r901; .reg .u16 %r902; .reg .u32 %r903; .reg .u16 %r904; .reg .pred %r905; .reg .f32 %r906; .reg .f32 %r907; .reg .f32 %r908; .reg .f32 %r909; .reg .f32 %r910; .reg .f32 %r911; .reg .f32 %r912; .reg .pred %r913; .reg .u64 %r914; .reg .u64 %r915; .reg .pred %r918; .reg .pred %r919; .reg .u32 %r921; .reg .u32 %r922; .reg .u32 %r924; .reg .u32 %r925; .reg .u16 %r926; .reg .u16 %r927; .reg .u16 %r928; .reg .u32 %r929; .reg .u16 %r930; .reg .pred %r931; .reg .u32 %r933; .reg .u32 %r934; .reg .u32 %r936; .reg .u32 %r937; .reg .u16 %r938; .reg .u16 %r939; .reg .u16 %r940; .reg .u32 %r941; .reg .u16 %r942; .reg .pred %r943; .reg .u32 %r945; .reg .u32 %r946; .reg .u32 %r948; .reg .u32 %r949; .reg .u16 %r950; .reg .u16 %r951; .reg .u16 %r952; .reg .u32 %r953; .reg .u16 %r954; .reg .pred %r955; .reg .pred %r956; .reg .u64 %r957; .reg .u32 %r958; .reg .u64 %r959; .reg .u64 %r960; .reg .u32 %r961; .reg .pred %r962; .reg .pred %r963; .reg .u32 %r964; .reg .pred %r965; .reg .pred %r966; .reg .u32 %r981; .reg .u64 %r982; .reg .u64 %r983; .reg .u64 %r984; .reg .u64 %r985; .reg .u64 %r986; .reg .u64 %r987; .reg .u64 %r988; .reg .u16 %r989; .reg .u16 %r990; .reg .u16 %r991; .reg .u32 %r992; .reg .u16 %r993; .reg .pred %r994; .reg .pred %r995; .reg .u16 %r996; .reg .pred %r997; .reg .u64 %r998; .reg .pred %r999; .reg .u32 %r1003; .reg .u64 %r1005; .reg .u32 %r1010; .reg .u32 %r1011; .reg .u32 %r1013; .reg .u32 %r1014; .reg .u16 %r1015; .reg .u16 %r1016; .reg .u16 %r1017; .reg .u32 %r1019; .reg .u32 %r1020; .reg .u16 %r1022; .reg .u16 %r1024; .reg .u32 %r1025; .reg .u16 
%r1026; .reg .pred %r1027; .reg .pred %r1028; .reg .u64 %r1029; .reg .u64 %r1031; .reg .u64 %r1032; .reg .u64 %r1033; .reg .pred %r1035; .reg .u64 %r1037; .reg .u64 %r1038; .reg .u64 %r1039; .reg .u64 %r1042; .reg .u64 %r1043; .reg .u64 %r1047; .reg .u64 %r1048; .reg .u64 %r1049; .reg .u64 %r1051; .reg .u64 %r1052; .reg .u64 %r1055; .reg .u64 %r1058; .reg .u64 %r1059; .reg .u64 %r1062; .reg .u64 %r1064; .reg .u64 %r1065; .reg .u64 %r1066; .reg .u64 %r1067; .reg .u64 %r1068; .reg .pred %r1070; .reg .u64 %r1076; .reg .u64 %r1077; .reg .u64 %r1078; .reg .u64 %r1079; .reg .pred %r1081; .reg .u64 %r1082; .reg .u64 %r1083; .reg .u64 %r1084; .reg .u64 %r1085; .reg .u64 %r1086; .reg .u64 %r1088; .reg .u64 %r1089; .reg .u64 %r1090; .reg .u64 %r1091; .reg .u64 %r1092; .reg .u64 %r1093; .reg .u64 %r1094; .reg .u64 %r1095; .reg .u64 %r1099; .reg .u64 %r1100; .reg .u64 %r1101; .reg .u64 %r1102; .reg .u64 %r1106; .reg .u64 %r1108; .reg .u64 %r1111; .reg .u64 %r1114; .reg .u64 %r1119; .reg .u64 %r1120; .reg .u64 %r1122; .reg .u64 %r1123; .reg .u64 %r1124; .reg .u64 %r1125; .reg .u64 %r1126; .reg .u64 %r1127; .reg .u64 %r1128; .reg .u64 %r1129; .reg .u64 %r1130; .reg .u64 %r1131; .reg .u64 %r1132; .reg .u64 %r1133; .reg .u64 %r1135; .reg .u64 %r1136; .reg .u64 %r1137; .reg .u64 %r1138; .reg .u64 %r1139; .reg .u64 %r1140; .reg .u64 %r1141; .reg .pred %r1142; .reg .u64 %r1145; .reg .u64 %r1146; .reg .u64 %r1147; .reg .u64 %r1148; .reg .u64 %r1150; .reg .pred %r1151; .reg .u64 %r1152; .reg .u64 %r1153; .reg .u64 %r1154; .reg .u64 %r1155; .reg .u64 %r1158; .reg .u64 %r1159; .reg .u64 %r1160; .reg .u64 %r1161; .reg .u64 %r1162; .reg .u64 %r1163; .reg .u64 %r1164; .reg .u64 %r1165; .reg .u64 %r1166; .reg .u64 %r1167; .reg .u64 %r1168; .reg .u64 %r1169; .reg .u64 %r1170; .reg .u64 %r1175; .reg .u64 %r1188; .reg .u64 %r1191; .reg .u32 %r1193; .reg .u64 %r1194; .reg .u64 %r1210; .reg .u64 %r1213; .reg .u32 %r1215; .reg .u64 %r1216; .reg .u64 %r1220; .reg .u64 %r1233; .reg .u64 %r1236; .reg .u32 %r1238; .reg .u64 %r1239; .reg .u64 %r1255; .reg .u64 %r1258; .reg .u32 %r1260; .reg .u64 %r1261; .reg .u64 %r1265; .reg .u64 %r1278; .reg .u64 %r1281; .reg .u32 %r1283; .reg .u64 %r1284; .reg .u64 %r1300; .reg .u64 %r1303; .reg .u32 %r1305; .reg .u64 %r1306; .reg .u64 %r1310; .reg .u64 %r1323; .reg .u64 %r1326; .reg .u32 %r1328; .reg .u64 %r1329; .reg .u64 %r1345; .reg .u64 %r1348; .reg .u32 %r1350; .reg .u64 %r1351; .reg .u64 %r1355; .reg .u64 %r1368; .reg .u64 %r1371; .reg .u32 %r1373; .reg .u64 %r1374; .reg .u64 %r1378; .reg .u64 %r1391; .reg .u64 %r1394; .reg .u32 %r1396; .reg .u64 %r1397; .reg .u64 %r1413; .reg .u64 %r1416; .reg .u32 %r1418; .reg .u64 %r1419; .reg .u64 %r1435; .reg .u64 %r1438; .reg .u32 %r1440; .reg .u64 %r1441; .reg .u64 %r1457; .reg .u64 %r1460; .reg .u32 %r1462; .reg .u64 %r1463; .reg .u64 %r1479; .reg .u64 %r1482; .reg .u32 %r1484; .reg .u64 %r1485; .reg .u64 %r1501; .reg .u64 %r1504; .reg .u32 %r1506; .reg .u64 %r1507; .reg .u64 %r1523; .reg .u64 %r1526; .reg .u32 %r1528; .reg .u64 %r1529; .reg .pred %r1533; .reg .pred %r1566; .reg .pred %r1567; .reg .u64 %r1571; .reg .u64 %r1578; .reg .u64 %r1579; .reg .u64 %r1584; .reg .u64 %r1587; .reg .u32 %r1589; .reg .u64 %r1590; .reg .u64 %r1600; .reg .u64 %r1601; .reg .u64 %r1606; .reg .u64 %r1609; .reg .u32 %r1611; .reg .u64 %r1612; .reg .u64 %r1622; .reg .u64 %r1623; .reg .u64 %r1628; .reg .u64 %r1631; .reg .u32 %r1633; .reg .u64 %r1634; .reg .u64 %r1644; .reg .u64 %r1645; .reg .u64 %r1650; .reg .u64 %r1653; .reg .u32 %r1655; .reg .u64 %r1656; .reg 
.pred %r1660; .reg .pred %r1669; .reg .pred %r1670; .reg .pred %r1671; .reg .u64 %r1672; .reg .u64 %r1673; .reg .u64 %r1675; .reg .u64 %r1676; .reg .u64 %r1677; .reg .u64 %r1678; .reg .u64 %r1685; .reg .u64 %r1686; .reg .u64 %r1687; .reg .u64 %r1690; .reg .u64 %r1697; .reg .u64 %r1698; .reg .u64 %r1703; .reg .u64 %r1706; .reg .u32 %r1708; .reg .u64 %r1709; .reg .u64 %r1719; .reg .u64 %r1720; .reg .u64 %r1725; .reg .u64 %r1728; .reg .u32 %r1730; .reg .u64 %r1731; .reg .u64 %r1735; .reg .u64 %r1742; .reg .u64 %r1743; .reg .u64 %r1748; .reg .u64 %r1751; .reg .u32 %r1753; .reg .u64 %r1754; .reg .u64 %r1758; .reg .u64 %r1765; .reg .u64 %r1766; .reg .u64 %r1771; .reg .u64 %r1774; .reg .u32 %r1776; .reg .u64 %r1777; .reg .pred %r1781; .reg .pred %r1790; .reg .pred %r1791; .reg .u64 %r1794; .reg .u64 %r1797; .reg .u64 %r1798; .reg .u64 %r1801; .reg .u64 %r1802; .reg .u64 %r1807; .reg .u64 %r1810; .reg .u32 %r1812; .reg .u64 %r1813; .reg .u64 %r1816; .reg .pred %r1820; .reg .u16 %r1821; .reg .pred %r1822; .reg .u32 %r1824; .reg .u32 %r1825; .reg .u16 %r1826; .reg .u16 %r1827; .reg .u32 %r1829; .reg .u16 %r1830; .reg .pred %r1831; .reg .pred %r1832; .reg .u16 %r1833; .reg .pred %r1834; .reg .pred %r1835; .reg .pred %r1836; .reg .u64 %r1837; .reg .u64 %r1838; .reg .pred %r1840; .reg .u64 %r1841; .reg .u64 %r1842; .reg .u64 %r1844; .reg .u64 %r1847; .reg .u64 %r1848; .reg .u64 %r1851; .reg .u64 %r1852; .reg .u64 %r1857; .reg .u64 %r1860; .reg .u32 %r1862; .reg .u64 %r1863; .reg .u64 %r1866; .reg .pred %r1870; .reg .u64 %r1872; .reg .u64 %r1873; .reg .u64 %r1874; .reg .u64 %r1877; .reg .u64 %r1878; .reg .u64 %r1881; .reg .u64 %r1882; .reg .u64 %r1887; .reg .u64 %r1890; .reg .u32 %r1892; .reg .u64 %r1893; .reg .u64 %r1896; .reg .pred %r1900; .reg .pred %r1901; .reg .u64 %r1903; .reg .u64 %r1906; .reg .u64 %r1907; .reg .u64 %r1910; .reg .u64 %r1911; .reg .u64 %r1916; .reg .u64 %r1919; .reg .u32 %r1921; .reg .u64 %r1922; .reg .u64 %r1925; .reg .pred %r1929; .reg .pred %r1930; .reg .pred %r1931; .reg .pred %r1932; .reg .u64 %r1933; .reg .pred %r1937; .reg .u64 %r1939; .reg .u64 %r1942; .reg .u64 %r1943; .reg .u64 %r1946; .reg .u64 %r1947; .reg .u64 %r1953; .reg .u64 %r1956; .reg .u32 %r1958; .reg .u64 %r1959; .reg .u64 %r1962; .reg .pred %r1966; .reg .pred %r1968; .reg .u64 %r1970; .reg .u64 %r1971; .reg .pred %r1973; .reg .u64 %r1974; .reg .u64 %r1975; .reg .u64 %r1977; .reg .u64 %r1980; .reg .u64 %r1981; .reg .u64 %r1984; .reg .u64 %r1985; .reg .u64 %r1990; .reg .u64 %r1993; .reg .u32 %r1995; .reg .u64 %r1996; .reg .u64 %r1999; .reg .pred %r2003; .reg .pred %r2004; .reg .u64 %r2005; .reg .pred %r2006; .reg .u64 %r2007; .reg .u64 %r2008; .reg .u64 %r2009; .reg .u16 %r2010; .reg .u32 %r2011; .reg .pred %r2012; .reg .u16 %r2013; .reg .u32 %r2014; .reg .pred %r2015; .reg .u64 %r2016; .reg .u64 %r2017; .reg .u64 %r2018; .reg .u64 %r2019; .reg .u64 %r2020; .reg .u64 %r2021; .reg .u64 %r2022; .reg .u32 %r2023; .reg .pred %r2024; .reg .u64 %r2025; .reg .u64 %r2026; .reg .u64 %r2027; .reg .u64 %r2028; .reg .u64 %r2029; .reg .u64 %r2030; .reg .u64 %r2031; .reg .u64 %r2032; .reg .u64 %r2033; .reg .u64 %r2034; .reg .u64 %r2035; .reg .u64 %r2036; .reg .u64 %r2037; .reg .u64 %r2038; .reg .u64 %r2039; .reg .u64 %r2040; .reg .u64 %r2041; .reg .u64 %r2042; .reg .u64 %r2043; .reg .u64 %r2044; .reg .u64 %r2045; .reg .u64 %r2046; .reg .u64 %r2047; .reg .u64 %r2048; .reg .u64 %r2049; .reg .u64 %r2050; .reg .u64 %r2051; .reg .u64 %r2052; .reg .u64 %r2053; .reg .u64 %r2054; .reg .u64 %r2055; .reg .u64 %r2056; .reg .u64 %r2057; 
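// _gfortran_matmul_i16: machine-generated PTX for libgfortran's matmul over the
// 16-byte integer kind.  The .file directive above points at
// ../../../../libgfortran/generated/matmul_i16.c, and the .loc references in the
// body (roughly source lines 2462-3003) cover the descriptor and bound checks
// that feed _gfortran_runtime_error, an optional dispatch through what appears
// to be a caller-supplied gemm-style routine passed in %ar5, and the blocked
// triple-loop accumulation dest(i,j) += a(i,k)*b(k,j), whose 128-bit products
// are staged through %frame+80..%frame+104 and read back from [%frame+64] and
// [%frame+72] before being added with explicit carry propagation.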
.reg .u64 %r2058; .reg .u64 %r2059; .reg .u64 %r2060; .reg .u64 %r2061; .reg .u64 %r2062; .reg .u64 %r2063; .reg .u64 %r2064; .reg .u64 %r2065; .reg .u64 %r2066; .reg .u64 %r2067; .reg .u64 %r2068; .reg .u64 %r2069; .reg .u64 %r2070; .reg .u64 %r2071; .reg .u64 %r2072; .reg .u64 %r2073; .reg .u64 %r2074; .reg .u64 %r2075; .reg .u64 %r2076; .reg .u64 %r2077; .reg .u64 %r2078; .reg .u64 %r2079; .reg .u64 %r2080; .reg .u64 %r2081; .reg .u64 %r2082; .reg .u64 %r2083; .reg .u64 %r2084; .reg .u64 %r2085; .reg .u64 %r2086; .reg .u64 %r2087; .reg .u64 %r2088; .reg .u64 %r2089; .reg .u64 %r2090; .reg .u64 %r2091; .reg .u64 %r2092; .reg .u64 %r2093; .reg .u64 %r2094; .reg .u64 %r2095; .reg .u64 %r2096; .reg .u64 %r2097; .reg .u64 %r2098; .reg .u64 %r2099; .reg .u64 %r2100; .reg .u64 %r2101; .reg .u64 %r2102; .reg .u64 %r2103; .reg .u64 %r2104; .reg .u64 %r2107; .reg .u64 %r2108; .reg .u64 %r2111; .reg .u64 %r2112; .reg .u64 %r2115; .reg .u64 %r2116; .reg .u64 %r2119; .reg .u64 %r2120; .reg .u64 %r2123; .reg .u64 %r2124; .reg .u64 %r2127; .reg .u64 %r2128; .reg .u64 %r2131; .reg .u64 %r2132; .reg .u64 %r2135; .reg .u64 %r2136; .reg .u64 %r2139; .reg .u64 %r2140; .reg .u64 %r2143; .reg .u64 %r2144; .reg .u64 %r2147; .reg .u64 %r2148; .reg .u64 %r2151; .reg .u64 %r2152; .reg .u64 %r2155; .reg .u64 %r2156; .reg .u64 %r2159; .reg .u64 %r2160; .reg .u64 %r2163; .reg .u64 %r2164; .reg .u64 %r2167; .reg .u64 %r2168; .reg .u64 %r2171; .reg .u64 %r2172; .reg .u64 %r2175; .reg .u64 %r2176; .reg .u64 %r2179; .reg .u64 %r2180; .reg .u64 %r2183; .reg .u64 %r2184; .reg .u64 %r2187; .reg .u64 %r2188; .reg .u64 %r2191; .reg .u64 %r2192; .reg .u64 %r2195; .reg .u64 %r2196; .reg .u64 %r2199; .reg .u64 %r2200; .reg .u64 %r2203; .reg .u64 %r2204; .reg .u64 %r2207; .reg .u64 %r2208; .reg .u64 %r2211; .reg .u64 %r2212; .reg .u64 %r2215; .reg .u64 %r2216; .reg .u64 %r2217; .reg .u64 %r2218; .reg .u64 %r2221; .reg .u64 %r2222; .reg .pred %r2226; .reg .pred %r2228; .reg .pred %r2229; .reg .pred %r2230; .reg .u32 %r2232; .reg .pred %r2234; .reg .pred %r2235; .reg .pred %r2236; .reg .pred %r2237; .reg .pred %r2238; .reg .pred %r2239; .reg .pred %r2240; .reg .pred %r2241; .reg .u64 %r2242; .reg .pred %r2243; .reg .pred %r2245; .reg .pred %r2247; .reg .u64 %r2260; .reg .u64 %r2261; .reg .u64 %r2262; .reg .u64 %r2278; .reg .u64 %r2279; .reg .u64 %r2280; .reg .u64 %r2281; .reg .u64 %r2283; .reg .u64 %r2284; .reg .u64 %r2285; .reg .u64 %r2287; .reg .u64 %r2288; .reg .u64 %r2293; .reg .u64 %r2300; .reg .pred %r2301; .reg .u64 %r2302; .reg .u64 %r2303; .reg .u64 %r2304; .reg .u64 %r2305; .reg .u64 %r2306; .reg .u64 %r2307; .reg .u64 %r2308; .reg .pred %r2312; .reg .u64 %r2313; .reg .u64 %r2314; .reg .u64 %r2315; .reg .u64 %r2316; .reg .u64 %r2317; .reg .u64 %r2318; .reg .u64 %r2319; .reg .u64 %r2320; .reg .u64 %r2321; .reg .u64 %r2325; .reg .u64 %r2326; .reg .u64 %r2327; .reg .u64 %r2328; .reg .pred %r2332; .reg .u64 %r2333; .reg .u64 %r2334; .reg .u64 %r2335; .reg .u64 %r2336; .reg .u64 %r2337; .reg .u64 %r2338; .reg .u64 %r2340; .reg .pred %r2341; .reg .u64 %r2342; .reg .u64 %r2343; .reg .u64 %r2344; .reg .u64 %r2345; .reg .u64 %r2346; .reg .u64 %r2347; .reg .u64 %r2348; .reg .pred %r2349; .reg .pred %r2350; .reg .pred %r2351; .reg .u64 %r2352; .reg .u64 %r2353; .reg .u64 %r2354; .reg .u64 %r2355; .reg .u64 %r2356; .reg .u64 %r2358; .reg .pred %r2359; .reg .u64 %r2360; .reg .u64 %r2361; .reg .u64 %r2365; .reg .pred %r2366; .reg .pred %r2367; .reg .pred %r2368; .reg .u64 %r2369; .reg .u64 %r2370; .reg .u64 %r2371; .reg .u64 %r2372; 
.reg .u64 %r2374; .reg .pred %r2375; .reg .u64 %r2376; .reg .u64 %r2377; .reg .u64 %r2378; .reg .u64 %r2379; .reg .u64 %r2380; .reg .u64 %r2382; .reg .pred %r2383; .reg .u64 %r2384; .reg .u64 %r2388; .reg .pred %r2389; .reg .u64 %r2394; .reg .pred %r2395; .reg .u64 %r2400; .reg .pred %r2401; .reg .pred %r2402; .reg .pred %r2403; .reg .u64 %r2408; .reg .pred %r2409; .reg .u64 %r2412; .reg .u64 %r2413; .reg .u64 %r2415; .reg .u64 %r2416; .reg .u64 %r2419; .reg .u64 %r2420; .reg .u64 %r2421; .reg .u64 %r2422; .reg .u64 %r2423; .reg .u64 %r2424; .reg .u64 %r2425; .reg .u64 %r2426; .reg .u64 %r2427; .reg .u64 %r2428; .reg .u64 %r2429; .reg .u64 %r2430; .reg .pred %r2431; .reg .u64 %r2434; .reg .u64 %r2435; .reg .u64 %r2436; .reg .u64 %r2437; .reg .u64 %r2439; .reg .u64 %r2440; .reg .u64 %r2441; .reg .u64 %r2442; .reg .u64 %r2444; .reg .u64 %r2445; .reg .pred %r2446; .reg .pred %r2450; .reg .u64 %r2461; .reg .u64 %r2462; .reg .u64 %r2463; .reg .u64 %r2464; .reg .u64 %r2465; .reg .u64 %r2466; .reg .u64 %r2467; .reg .u32 %r2468; .reg .u64 %r2469; .reg .u64 %r2470; .reg .pred %r2472; .reg .pred %r2473; .reg .u64 %r2481; .reg .u64 %r2482; .reg .u64 %r2483; .reg .u64 %r2484; .reg .u64 %r2485; .reg .u64 %r2486; .reg .u64 %r2487; .reg .u32 %r2488; .reg .u64 %r2489; .reg .u64 %r2490; .reg .pred %r2492; .reg .pred %r2493; .reg .u64 %r2499; .reg .u64 %r2501; .reg .u64 %r2502; .reg .u64 %r2503; .reg .u64 %r2504; .reg .u64 %r2505; .reg .u64 %r2506; .reg .u64 %r2507; .reg .u32 %r2508; .reg .u64 %r2509; .reg .u64 %r2510; .reg .pred %r2512; .reg .pred %r2513; .reg .u64 %r2520; .reg .u64 %r2521; .reg .u64 %r2522; .reg .u64 %r2523; .reg .u64 %r2524; .reg .u64 %r2525; .reg .u64 %r2526; .reg .u64 %r2527; .reg .u64 %r2528; .reg .u32 %r2529; .reg .u64 %r2530; .reg .u64 %r2531; .reg .pred %r2533; .reg .u64 %r2535; .reg .u64 %r2537; .reg .u64 %r2540; .reg .pred %r2542; .reg .pred %r2543; .reg .pred %r2544; .reg .u64 %r2549; .reg .u64 %r2550; .reg .u64 %r2551; .reg .u64 %r2552; .reg .u64 %r2553; .reg .u64 %r2554; .reg .u64 %r2555; .reg .u64 %r2556; .reg .u64 %r2557; .reg .u64 %r2558; .reg .u64 %r2559; .reg .u32 %r2560; .reg .u64 %r2561; .reg .u64 %r2562; .reg .u64 %r2563; .reg .pred %r2564; .reg .pred %r2565; .reg .u64 %r2573; .reg .u64 %r2574; .reg .u64 %r2575; .reg .u64 %r2576; .reg .u64 %r2577; .reg .u64 %r2578; .reg .u64 %r2579; .reg .u64 %r2580; .reg .u64 %r2581; .reg .u32 %r2582; .reg .u64 %r2583; .reg .u64 %r2584; .reg .pred %r2586; .reg .u64 %r2587; .reg .u64 %r2588; .reg .u64 %r2589; .reg .u64 %r2590; .reg .u64 %r2591; .reg .u64 %r2592; .reg .u64 %r2593; .reg .u32 %r2594; .reg .u64 %r2595; .reg .u64 %r2596; .reg .pred %r2598; .reg .u64 %r2599; .reg .u64 %r2600; .reg .u64 %r2602; .reg .u64 %r2603; .reg .u64 %r2605; .reg .u64 %r2606; .reg .pred %r2607; .reg .u64 %r2608; .reg .u64 %r2609; .reg .u64 %r2610; .reg .u64 %r2611; .reg .u64 %r2612; .reg .u64 %r2613; .reg .u64 %r2614; .reg .u64 %r2615; .reg .u64 %r2616; .reg .u64 %r2617; .reg .u64 %r2618; .reg .u64 %r2619; .reg .pred %r2620; .reg .u32 %r2621; .reg .u64 %r2622; .reg .u32 %r2624; .reg .u64 %r2625; .reg .u32 %r2627; .reg .u64 %r2628; .reg .pred %r2629; .reg .u32 %r2630; .reg .u64 %r2631; .reg .u64 %r2633; .reg .u32 %r2635; .reg .u64 %r2636; .reg .u64 %r2638; .reg .u32 %r2640; .reg .u64 %r2641; .reg .pred %r2642; .reg .u64 %r2643; .reg .u64 %r2644; .reg .u64 %r2645; .reg .u64 %r2646; .reg .pred %r2654; .reg .u64 %r2655; .reg .u64 %r2656; .reg .u64 %r2657; .reg .u64 %r2658; .reg .pred %r2666; .reg .u64 %r2667; .reg .u64 %r2668; .reg .u64 %r2669; .reg .u64 
%r2670; .reg .pred %r2678; .reg .u64 %r2679; .reg .u64 %r2680; .reg .u64 %r2681; .reg .u64 %r2682; .reg .u64 %r2683; .reg .u64 %r2684; .reg .u64 %r2685; .reg .u64 %r2686; .reg .u64 %r2687; .reg .u32 %r2688; .reg .u64 %r2689; .reg .u64 %r2690; .reg .u64 %r2691; .reg .pred %r2692; .reg .pred %r2695; .reg .u64 %r2696; .reg .u64 %r2697; .reg .u64 %r2698; .reg .u64 %r2699; .reg .pred %r2707; mov.u64 %r748,%ar0; mov.u64 %r749,%ar1; mov.u64 %r750,%ar2; mov.u32 %r751,%ar3; mov.u32 %r752,%ar4; mov.u64 %r753,%ar5; .loc 1 2462 3 ld.s8 %r446,[%r749+28]; cvt.u16.u32 %r754,%r446; setp.eq.u16 %r755,%r754,2; @ %r755 bra $L2; ld.s8 %r2232,[%r750+28]; cvt.u16.u32 %r756,%r2232; setp.eq.u16 %r758,%r756,2; @ %r758 bra $L3; cvta.const.u64 %r762,$LC3; cvta.const.u64 %r761,__func__$0; mov.u32 %r760,2462; cvta.const.u64 %r7767623: .loc 1 2476 6 ld.u64 %r763,[%r748]; setp.ne.u64 %r764,%r763,0; @ %r764 bra $L4; .loc 1 2478 10 setp.ne.u16 %r766,%r754,1; @ %r766 bra $L5; .loc 1 2480 4 st.u64 [%r748+48],%r763; ld.u64 %r769,[%r750+80]; ld.u64 %r771,[%r750+72]; sub.u64 %r772,%r769,%r771; st.u64 [%r748+56],%r772; mov.u64 %r773,1; st.u64 [%r748+40],%r773; bra $L6; $L123: .loc 1 2485 4 st.u64 [%r748+48],%r2005; add.u64 %r775,%r508,-1; st.u64 [%r748+56],%r775; mov.u64 %r776,1; st.u64 [%r748+40],%r776; bra $L6; $L124: .loc 1 2490 4 mov.u64 %r777,0; st.u64 [%r748+48],%r777; add.u64 %r778,%r508,-1; st.u64 [%r748+56],%r778; mov.u64 %r779,1; st.u64 [%r748+40],%r779; .loc 1 2493 11 st.u64 [%r748+72],%r777; ld.u64 %r782,[%r750+80]; ld.u64 %r784,[%r750+72]; sub.u64 %r785,%r782,%r784; st.u64 [%r748+80],%r785; st.u64 [%r748+64],%r508; $L6: .loc 1 2499 1748; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r787,[%value_in]; } .loc 1 2499 4 mov.u64 %r789,16789; call (%value_in),_gfortrani_xmallocarray790,[%value_in]; } .loc 1 2499 2 st.u64 [%r748],%r790; .loc 1 2500 24 mov.u64 %r792,0; st.u64 [%r748+8],%r792; .loc 1 2557 7 ld.s8 %r446,[%r749+28]; .loc 1 2564 15 ld.u64 %r794,[%r749+56]; add.u64 %r793,%r794,1; .loc 1 2564 13 ld.u64 %r795,[%r749+48]; sub.u64 %r456,%r793,%r795; bra $L7; $L4: .loc 1 2502 12 cvta.global.u64 %r796,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r797,[%r796+36]; setp.eq.u32 %r798,%r797,0; @ %r798 bra $L8; .loc 1 2509 17 ld.u64 %r800,[%r748+56]; add.u64 %r799,%r800,1; .loc 1 2509 15 ld.u64 %r801,[%r748+48]; sub.u64 %r456,%r799,%r801; .loc 1 2506 10 setp.ne.u16 %r803,%r754,1; @ %r803 bra $L9; .loc 1 2508 17 ld.u64 %r805,[%r750+80]; add.u64 %r804,%r805,1; .loc 1 2508 15 ld.u64 %r806,[%r750+72]; sub.u64 %r238,%r804,%r806; .loc 1 2510 7 setp.ne.u64 %r807,%r238,%r456; @ %r807 bra $L10; .loc 1 2564 15 ld.u64 %r809,[%r749+56]; add.u64 %r808,%r809,1; .loc 1 2564 13 ld.u64 %r810,[%r749+48]; sub.u64 %r456,%r808,%r810; .loc 1 2543 7 ld.s8 %r331,[%r748+28]; .loc 1 2548 27 ld.u64 %r444,[%r748+40]; .loc 1 2543 6 cvt.u16.u32 %r811,%r331; setp.eq.u16 %r812,%r811,1; @ %r812 bra $L11; .loc 1 2553 16 ld.u64 %r504,[%r748+64]; .loc 1 2560 16 ld.u64 %r503,[%r749+40]; .loc 1 2563 14 mov.u64 %r216,1; .loc 1 2561 16 mov.u64 %r208,%r216; bra $L12; $L10: .loc 1 2511 6 st.u64 [%stack+8],%r238; st.u64 [%stack],%r456; cvta.const.u64 %r813813_gfortran_runtime_error2485 4 ld.u64 %r816,[%r749+56]; add.u64 %r815,%r816,1; ld.u64 %r817,[%r749+48]; sub.u64 %r512,%r815,%r817; .loc 1 2515 15 cvt.u16.u32 %r818,%r2232; setp.ne.u16 %r820,%r818,1; @ %r820 bra $L13; .loc 1 2519 7 setp.eq.u64 %r821,%r512,%r456; @ %r821 bra $L14; .loc 1 2520 6 st.u64 [%stack+8],%r512; st.u64 [%stack],%r456; cvta.const.u64 %r8228stack; call 
_gfortran_runtime_error13: .loc 1 2528 7 setp.eq.u64 %r824,%r512,%r456; @ %r824 bra $L15; .loc 1 2529 6 st.u64 [%stack+8],%r512; st.u64 [%stack],%r456; cvta.const.u64 %r825825_gfortran_runtime_error15: .loc 1 2533 17 ld.u64 %r828,[%r750+80]; add.u64 %r827,%r828,1; .loc 1 2533 15 ld.u64 %r829,[%r750+72]; sub.u64 %r236,%r827,%r829; .loc 1 2534 17 ld.u64 %r831,[%r748+80]; add.u64 %r830,%r831,1; .loc 1 2534 15 ld.u64 %r832,[%r748+72]; sub.u64 %r237,%r830,%r832; .loc 1 2535 7 setp.eq.u64 %r833,%r236,%r237; @ %r833 bra $L14; .loc 1 2536 6 st.u64 [%stack+8],%r236; st.u64 [%stack],%r237; cvta.const.u64 %r834,$LC8stack; call _gfortran_runtime_error2564 15 ld.u64 %r837,[%r749+56]; add.u64 %r836,%r837,1; .loc 1 2564 13 ld.u64 %r838,[%r749+48]; sub.u64 %r456,%r836,%r838; $L7: .loc 1 2548 27 ld.u64 %r504,[%r748+40]; .loc 1 2543 6 ld.s8 %r840,[%r748+28]; cvt.u16.u32 %r839,%r840; setp.eq.u16 %r841,%r839,1; @ %r841 bra $L130; $L127: .loc 1 2553 16 mov.u64 %r444,%r504; ld.u64 %r504,[%r748+64]; bra $L16; $L130: .loc 1 2548 27 mov.u64 %r444,%r504; $L16: .loc 1 2560 16 ld.u64 %r503,[%r749+40]; .loc 1 2557 6 cvt.u16.u32 %r842,%r446; setp.eq.u16 %r843,%r842,1; @ %r843 bra $L131; $L125: .loc 1 2569 16 ld.u64 %r208,[%r749+64]; .loc 1 2571 15 ld.u64 %r845,[%r749+80]; add.u64 %r844,%r845,1; .loc 1 2571 13 ld.u64 %r846,[%r749+72]; mov.u64 %r216,%r456; sub.u64 %r456,%r844,%r846; bra $L12; $L131: .loc 1 2563 14 mov.u64 %r216,1; .loc 1 2561 16 mov.u64 %r208,%r216; $L12: .loc 1 2575 16 ld.u64 %r848,[%r750+56]; add.u64 %r847,%r848,1; ld.u64 %r849,[%r750+48]; sub.u64 %r64,%r847,%r849; .loc 1 2575 6 setp.eq.u64 %r850,%r64,%r456; @ %r850 bra $L17; .loc 1 2577 10 set.u32.gt.s64 %r852,%r456,0; neg.s32 %r853,%r852; .loc 1 2577 21 set.u32.gt.s64 %r855,%r64,0; neg.s32 %r856,%r855; cvt.u16.u32 %r858,%r853; cvt.u16.u32 %r859,%r856; or.b16 %r857,%r858,%r859; cvt.u32.u16 %r860,%r857; cvt.u16.u8 %r861,%r860; setp.eq.u16 %r862,%r861,0; @ %r862 bra $L17; .loc 1 2578 2 st.u64 [%stack+8],%r456; st.u64 [%stack],%r64;_gfortran_runtime_error17: .loc 1 2583 7 ld.s8 %r66,[%r750+28]; .loc 1 2586 16 ld.u64 %r502,[%r750+40]; .loc 1 2583 6 cvt.u16.u32 %r865,%r66; setp.eq.u16 %r866,%r865,1; @ %r866 bra $L132; .loc 1 2597 16 ld.u64 %r209,[%r750+64]; .loc 1 2598 16 ld.u64 %r868,[%r750+80]; add.u64 %r867,%r868,1; .loc 1 2598 14 ld.u64 %r869,[%r750+72]; sub.u64 %r217,%r867,%r869; bra $L18; $L132: .loc 1 2592 14 mov.u64 %r217,1; .loc 1 2591 16 mov.u64 %r209,256; $L18: .loc 1 2601 9 ld.u64 %r243,[%r749]; .loc 1 2602 9 ld.u64 %r244,[%r750]; .loc 1 2603 8 ld.u64 %r245,[%r748]; .loc 1 2612 28 set.u32.eq.u64 %r871,%r444,1; neg.s32 %r872,%r871; cvt.u32.u32 %r870,%r872; cvt.u32.u8 %r71,%r870; .loc 1 2612 46 set.u32.eq.u64 %r874,%r503,1; neg.s32 %r875,%r874; cvt.u32.u32 %r873,%r875; cvt.u32.u8 %r501,%r873; .loc 1 2612 7 set.u32.ne.u32 %r877,%r751,0; neg.s32 %r878,%r877; .loc 1 2612 16 cvt.u16.u32 %r880,%r878; cvt.u16.u32 %r881,%r71; and.b16 %r879,%r880,%r881; .loc 1 2612 6 cvt.u32.u16 %r882,%r879; cvt.u16.u8 %r883,%r882; setp.eq.u16 %r884,%r883,0; @ %r884 bra $L19; .loc 1 2612 63 set.u32.eq.u64 %r886,%r208,1; neg.s32 %r887,%r886; .loc 1 2612 51 cvt.u16.u32 %r889,%r887; cvt.u16.u32 %r890,%r501; or.b16 %r888,%r889,%r890; .loc 1 2612 33 cvt.u32.u16 %r891,%r888; cvt.u16.u8 %r892,%r891; setp.eq.u16 %r893,%r892,0; @ %r893 bra $L20; .loc 1 2613 20 set.u32.eq.u64 %r895,%r502,1; neg.s32 %r896,%r895; .loc 1 2613 37 set.u32.eq.u64 %r898,%r209,1; neg.s32 %r899,%r898; .loc 1 2613 25 cvt.u16.u32 %r901,%r896; cvt.u16.u32 %r902,%r899; or.b16 %r900,%r901,%r902; .loc 1 2613 7 
cvt.u32.u16 %r903,%r900; cvt.u16.u8 %r904,%r903; setp.eq.u16 %r905,%r904,0; @ %r905 bra $L20; .loc 1 2615 13 cvt.rn.f32.s32 %r83,%r752; .loc 1 2614 12 cvt.rn.f32.s64 %r906,%r216; .loc 1 2614 50 cvt.rn.f32.s64 %r907,%r456; .loc 1 2614 47 mul.f32 %r908,%r906,%r907; .loc 1 2614 31 cvt.rn.f32.s64 %r909,%r217; .loc 1 2614 47 mul.f32 %r910,%r908,%r909; .loc 1 2615 13 mul.f32 %r911,%r83,%r83; mul.f32 %r912,%r911,%r83; .loc 1 2614 7 setp.gt.f32 %r913,%r910,%r912; @ ! %r913 bra $L19; .loc 1 2617 17 cvt.u32.u64 %r85,%r216; st.u32 [%frame+52],%r85; .loc 1 2617 29 cvt.u32.u64 %r86,%r217; st.u32 [%frame+48],%r86; .loc 1 2617 41 cvt.u32.u64 %r87,%r456; st.u32 [%frame+44],%r87; .loc 1 2617 52 cvt.u32.u64 %r88,%r504; st.u32 [%frame+40],%r88; .loc 1 2618 28 mov.u64 %r914,1; st.u64 [%frame+16],%r914; mov.u64 %r915,0; st.u64 [%frame+24],%r915; .loc 1 2618 37 st.u64 [%frame],%r915; st.u64 [%frame+8],%r915; .loc 1 2619 17 setp.ne.u64 %r918,%r503,1; @ %r918 bra $L22; cvt.u32.u64 %r230,%r208; bra $L23; $L22: cvt.u32.u64 %r230,%r503; $L23: st.u32 [%frame+36],%r230; .loc 1 2620 3 setp.ne.u64 %r919,%r502,1; @ %r919 bra $L24; cvt.u32.u64 %r231,%r209; bra $L25; $L24: cvt.u32.u64 %r231,%r502; $L25: st.u32 [%frame+32],%r231; .loc 1 2622 15 set.u32.gt.s32 %r921,%r230,0; neg.s32 %r922,%r921; .loc 1 2622 26 set.u32.gt.s32 %r924,%r231,0; neg.s32 %r925,%r924; .loc 1 2622 19 cvt.u16.u32 %r927,%r922; cvt.u16.u32 %r928,%r925; and.b16 %r926,%r927,%r928; .loc 1 2622 10 cvt.u32.u16 %r929,%r926; cvt.u16.u8 %r930,%r929; setp.eq.u16 %r931,%r930,0; @ %r931 bra $L19; .loc 1 2622 37 set.u32.gt.s32 %r933,%r88,0; neg.s32 %r934,%r933; .loc 1 2622 46 set.u32.gt.s32 %r936,%r85,1; neg.s32 %r937,%r936; .loc 1 2622 41 cvt.u16.u32 %r939,%r934; cvt.u16.u32 %r940,%r937; and.b16 %r938,%r939,%r940; cvt.u32.u16 %r941,%r938; cvt.u16.u8 %r942,%r941; setp.eq.u16 %r943,%r942,0; @ %r943 bra $L19; .loc 1 2622 55 set.u32.gt.s32 %r945,%r86,1; neg.s32 %r946,%r945; .loc 1 2622 64 set.u32.gt.s32 %r948,%r87,1; neg.s32 %r949,%r948; .loc 1 2622 59 cvt.u16.u32 %r951,%r946; cvt.u16.u32 %r952,%r949; and.b16 %r950,%r951,%r952; cvt.u32.u16 %r953,%r950; cvt.u16.u8 %r954,%r953; setp.eq.u16 %r955,%r954,0; @ %r955 bra $L19; .loc 1 2624 4 setp.ne.u64 %r956,%r753,0; @ %r956 bra $L27; cvta.const.u64 %r960,$LC8; cvta.const.u64 %r959,__func__$0; mov.u32 %r958,2624; cvta.const.u64 %r9959027: .loc 1 2626 17 and.b32 %r961,%r751,2; .loc 1 2626 7 setp.ne.u32 %r962,%r961,0; @ %r962 bra $L133; .loc 1 2629 35 setp.eq.u64 %r963,%r503,1; @ %r963 bra $L134; cvta.const.u64 %r218,$LC2; bra $L28; $L133: .loc 1 2627 13 cvta.const.u64 %r218,$LC1; bra $L28; $L134: .loc 1 2629 35 cvta.const.u64 %r218,$LC0; $L28: .loc 1 2631 17 and.b32 %r964,%r751,4; .loc 1 2631 7 setp.ne.u32 %r965,%r964,0; @ %r965 bra $L135; .loc 1 2634 35 setp.eq.u64 %r966,%r502,1; @ %r966 bra $L136; cvta.const.u64 %r219,$LC2; bra $L29; $L135: .loc 1 2632 13 cvta.const.u64 %r219,$LC1; bra $L29; $L136: .loc 1 2634 35 cvta.const.u64 %r219,$LC0; $L29: .loc 1 2636 4 add.u64 %r982,%frame,40; add.u64 %r983,%frame,32; add.u64 %r984,%frame,36; add.u64 %r985,%frame,16; add.u64 %r986,%frame,44; add.u64 %r987,%frame,48; add.u64 %r988,%frame,52; mov.u32 %r981,1;,.param .u64 %in_ar3,.param .u64 %in_ar4,.param .u64 %in_ar5,.param .u64 %in_ar6,.param .u64 %in_ar7,.param .u64 %in_ar8,.param .u64 %in_ar9,.param .u64 %in_ar10,.param .u64 %in_ar11,.param .u64 %in_ar12,.param .u32 %in_ar13,.param .u32 %in_ar14);r219988987986243; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r984; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r244; .param 
.u64 %out_arg10; st.param.u64 [%out_arg10],%r983; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%frame; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r245; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r982; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r981; .param .u32 %out_arg15; st.param.u32 [%out_arg15],%r981; call %r753,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14,%out_arg15),$LCT0; } bra $L1; $L19: .loc 1 2643 21 cvt.u16.u32 %r990,%r71; cvt.u16.u32 %r991,%r501; and.b16 %r989,%r990,%r991; .loc 1 2643 6 cvt.u32.u16 %r992,%r989; cvt.u16.u8 %r993,%r992; setp.eq.u16 %r994,%r993,0; @ %r994 bra $L31; .loc 1 2643 38 setp.ne.u64 %r995,%r502,1; @ %r995 bra $L20; .loc 1 2644 7 cvt.u16.u32 %r996,%r66; setp.eq.u16 %r997,%r996,1; @ %r997 bra $L32; .loc 1 2677 16 add.u64 %r998,%r504,1; .loc 1 2678 9 shl.b64 %r104,%r998,4; sub.u64 %r247,%r245,%r104; .loc 1 2687 7 setp.le.s64 %r999,%r217,0; @ %r999 bra $L33; add.u64 %r621,%r104,-16; mov.u64 %r620,%r245; add.u64 %r624,%r217,1; .loc 1 2689 22 shl.b64 %r441,%r216,4; .loc 1 2687 13 mov.u64 %r296,%r502; setp.gt.s64 %r2234,%r216,0; and.b64 %r2365,%r217,3; setp.eq.u64 %r2368,%r2365,0; @ %r2368 bra $L34; setp.eq.u64 %r2367,%r2365,1; @ %r2367 bra $L246; setp.eq.u64 %r2366,%r2365,2; @ %r2366 bra $L247; bra $L278; $L35: .loc 1 2689 22 mov.u32 %r10030031005,[%value_in]; } $L36: .loc 1 2687 24 add.u64 %r2369,%r296,1; .loc 1 2687 7 add.u64 %r2412,%r620,%r621; .loc 1 2688 2 @ %r2234 bra $L214; bra $L279; $L34: @ %r2234 bra $L35; bra $L36; $L33: .loc 1 2692 28 set.u32.eq.u64 %r1010,%r456,0; neg.s32 %r1011,%r1010; .loc 1 2692 13 set.u32.eq.u64 %r1013,%r216,0; neg.s32 %r1014,%r1013; .loc 1 2692 28 cvt.u16.u32 %r1016,%r1011; cvt.u16.u32 %r1017,%r1014; or.b16 %r1015,%r1016,%r1017; .loc 1 2692 23 set.u32.eq.u64 %r1019,%r217,0; neg.s32 %r1020,%r1019; .loc 1 2692 28 cvt.u16.u32 %r1024,%r1020; or.b16 %r1022,%r1015,%r1024; cvt.u32.u16 %r1025,%r1022; cvt.u16.u8 %r1026,%r1025; setp.eq.u16 %r1027,%r1026,0; @ ! 
%r1027 bra $L1; .loc 1 2697 10 setp.eq.u64 %r1028,%r208,1; selp.u64 %r228,%r504,%r208,%r1028; .loc 1 2702 21 shl.b64 %r1029,%r228,8; .loc 1 2702 14 add.u64 %r252,%r1029,%r209; .loc 1 2706 12 min.s64 %r1031,%r252,65536; shl.b64 %r1032,%r10310321033,[%value_in]; } .loc 1 2710 7 setp.gt.s64 %r1035,%r217,0; @ %r1035 bra $L39; $L42: .loc 1 2910332918 7 bra $L1; $L39: .loc 1 2680 16 add.u64 %r1037,%r208,1; .loc 1 2681 9 shl.b64 %r107,%r1037,4; sub.u64 %r249,%r243,%r107; shl.b64 %r594,%r209,9; shl.b64 %r596,%r209,13; shl.b64 %r605,%r209,5; .loc 1 2683 16 add.u64 %r1038,%r209,1; .loc 1 2684 9 shl.b64 %r1039,%r1038,4; sub.u64 %r597,%r244,%r1039; add.u64 %r595,%r605,%r597; shl.b64 %r600,%r504,9; add.u64 %r601,%r504,%r504; shl.b64 %r602,%r504,2; neg.s64 %r1042,%r209; shl.b64 %r1043,%r1042,13; shl.b64 %r603,%r1042,4; shl.b64 %r607,%r504,13; shl.b64 %r1047,%r504,5; add.u64 %r1048,%r1047,16; add.u64 %r606,%r247,%r1048; add.u64 %r1049,%r217,-1; and.b64 %r615,%r1049,-512; add.u64 %r616,%r597,%r603; shl.b64 %r573,%r208,8; shl.b64 %r575,%r208,12; add.u64 %r576,%r107,-16; neg.s64 %r1051,%r208; shl.b64 %r1052,%r1051,12; shl.b64 %r1055,%r1051,5; shl.b64 %r537,%r208,5; add.u64 %r538,%r208,%r208; shl.b64 %r226,%r209,6; shl.b64 %r221,%r504,6; neg.s64 %r1058,%r504; shl.b64 %r1059,%r1058,4; shl.b64 %r566,%r504,4; shl.b64 %r1062,%r1042,6; shl.b64 %r619,%r209,4; mov.u64 %r598,%r601; mov.u64 %r592,%r209; mov.u64 %r590,0; setp.gt.s64 %r2245,%r456,0; add.u64 %r2300,%r616,16; setp.gt.s64 %r2301,%r216,0; add.u64 %r2302,%r456,1; add.u64 %r2303,%r216,1; add.u64 %r2304,%r249,32; add.u64 %r2305,%frame,64; add.u64 %r2306,%frame,80; add.u64 %r2307,%frame,96; add.u64 %r2308,%r247,64; bra $L41; $L138: mov.u64 %r590,%r591; $L41: add.u64 %r618,%r590,1; .loc 1 2715 9 sub.u64 %r254,%r217,%r590; min.s64 %r2413,%r254,512; .loc 1 2716 24 shr.s64 %r1064,%r2413,63; shr.u64 %r1065,%r1064,62; add.u64 %r1066,%r2413,%r1065; and.b64 %r1067,%r1066,3; sub.u64 %r1068,%r1067,%r1065; .loc 1 2716 10 sub.u64 %r255,%r2413,%r1068; .loc 1 2718 4 @ %r2245 bra $L40; $L45: .loc 1 2710 7 add.u64 %r591,%r590,512; add.u64 %r592,%r592,%r594; add.u64 %r595,%r595,%r596; add.u64 %r598,%r598,%r600; add.u64 %r602,%r602,%r600; add.u64 %r603,%r603,%r1043; add.u64 %r605,%r605,%r596; add.u64 %r606,%r606,%r607; setp.ne.u64 %r1070,%r590,%r615; @ %r1070 bra $L138; bra $L42; $L40: add.u64 %r570,%r2300,%r605; .loc 1 2768 13 add.u64 %r480,%r255,%r618; add.u64 %r477,%r605,%r619; add.u64 %r475,%r477,%r619; add.u64 %r588,%r595,%r619; add.u64 %r589,%r588,%r619; mul.lo.u64 %r406,%r480,%r504; mul.lo.u64 %r399,%r480,%r209; mad.lo.u64 %r585,%r480,%r619,%r597; mov.u64 %r577,%r1055; mov.u64 %r574,%r576; mov.u64 %r571,%r208; .loc 1 2718 12 mov.u64 %r258,%r502; .loc 1 2768 8 add.u64 %r2293,%r480,-1; $L44: .loc 1 2723 13 sub.u64 %r256,%r2302,%r258; min.s64 %r2415,%r256,256; .loc 1 2724 28 shr.u64 %r1076,%r2415,63; add.u64 %r1077,%r2415,%r1076; and.b64 %r1078,%r1077,1; sub.u64 %r1079,%r1078,%r1076; .loc 1 2724 14 sub.u64 %r257,%r2415,%r1079; .loc 1 2727 8 @ %r2301 bra $L43; $L73: .loc 1 2718 30 add.u64 %r258,%r258,256; .loc 1 2718 4 add.u64 %r570,%r570,4096; add.u64 %r571,%r571,%r573; add.u64 %r574,%r574,%r575; add.u64 %r577,%r577,%r1052; setp.ge.s64 %r1081,%r456,%r258; @ %r1081 bra $L44; bra $L45; $L43: .loc 1 2763 26 add.u64 %r1082,%r258,-1; add.u64 %r1083,%r1082,%r2415; .loc 1 2763 31 mul.lo.u64 %r314,%r1083,%r208; .loc 1 2734 13 add.u64 %r479,%r257,%r258; .loc 1 2734 8 add.u64 %r481,%r479,-1; add.u64 %r1084,%r314,1; shl.b64 %r1085,%r1084,4; add.u64 %r556,%r249,%r1085; add.u64 
%r567,%r571,%r208; sub.u64 %r1086,%r567,%r314; shl.b64 %r569,%r1086,4; shl.b64 %r1088,%r2415,4; add.u64 %r1089,%r1088,-16; add.u64 %r330,%r1033,%r1089; add.u64 %r1090,%r258,%r592; add.u64 %r1091,%r1090,%r2415; shl.b64 %r1092,%r1091,4; add.u64 %r225,%r1092,%r597; shl.b64 %r587,%r258,4; add.u64 %r48,%r587,%r595; add.u64 %r46,%r587,%r588; add.u64 %r30,%r587,%r589; add.u64 %r586,%r585,%r587; add.u64 %r1093,%r399,%r258; add.u64 %r1094,%r1093,%r2415; shl.b64 %r1095,%r1094,4; add.u64 %r397,%r1095,%r597; .loc 1 2768 8 mov.u64 %r555,%r606; .loc 1 2727 16 mov.u64 %r273,%r502; setp.le.s64 %r2235,%r258,%r481; setp.gt.s64 %r2228,%r2415,%r257; setp.ge.s64 %r2247,%r2293,%r618; setp.gt.s64 %r2226,%r2413,%r255; not.b64 %r2283,%r258; add.u64 %r2284,%r2283,%r479; and.b64 %r2285,%r2284,-2; .loc 1 2904 18 add.u64 %r2287,%r2415,%r258; .loc 1 2904 13 add.u64 %r2288,%r2287,-1; $L72: .loc 1 2732 10 sub.u64 %r259,%r2303,%r273; min.s64 %r2416,%r259,256; .loc 1 2733 25 shr.s64 %r2242,%r2416,63; shr.u64 %r1099,%r2416,63; add.u64 %r1100,%r2416,%r1099; and.b64 %r1101,%r1100,1; sub.u64 %r1102,%r1101,%r1099; .loc 1 2733 11 sub.u64 %r260,%r2416,%r1102; .loc 1 2735 5 @ %r2235 bra $L46; $L55: .loc 1 2757 8 @ ! %r2228 bra $L48; bra $L47; $L46: .loc 1 2737 17 add.u64 %r484,%r260,%r273; .loc 1 2737 12 add.u64 %r485,%r484,-1; shl.b64 %r1106,%r2416,12; add.u64 %r1108,%r1106,-4096; add.u64 %r530,%r1033,%r1108; add.u64 %r536,%r556,%r569; add.u64 %r1111,%r2416,%r273; shl.b64 %r547,%r1111,4; shl.b64 %r551,%r2416,4; mov.u64 %r541,%r567; mov.u64 %r540,%r577; mov.u64 %r539,%r574; mov.u64 %r528,0; setp.le.s64 %r2240,%r273,%r485; setp.gt.s64 %r2229,%r2416,%r260; not.b64 %r2278,%r273; add.u64 %r2279,%r2278,%r484; and.b64 %r2280,%r2279,-2; add.u64 %r2281,%r2280,%r273; bra $L54; $L139: mov.u64 %r528,%r529; $L54: .loc 1 2738 9 @ %r2240 bra $L49; $L53: .loc 1 2749 12 @ %r2229 bra $L50; bra $L51; $L49: shl.b64 %r1114,%r528,4; add.u64 %r229,%r1033,%r1114; add.u64 %r1119,%r2281,%r541; shl.b64 %r1120,%r1119,4; add.u64 %r527,%r1120,%r2304; .loc 1 2738 9 mov.u64 %r211,%r536; sub.u64 %r2355,%r527,%r536; add.u64 %r2356,%r2355,-32; shr.u64 %r2354,%r2356,5; and.b64 %r2358,%r2354,1; setp.ne.u64 %r2359,%r2358,0; @ ! 
%r2359 bra $L280; $L52: .loc 1 2741 7 add.u64 %r1122,%r211,%r539; add.u64 %r1123,%r1122,%r540; .loc 1 2740 49 ld.u64 %r1124,[%r1123]; st.u64 [%r229],%r1124; ld.u64 %r1125,[%r1123+8]; st.u64 [%r229+8],%r1125; .loc 1 2742 49 ld.u64 %r1126,[%r211]; st.u64 [%r229+16],%r1126; ld.u64 %r1127,[%r211+8]; st.u64 [%r229+24],%r1127; .loc 1 2745 7 add.u64 %r1128,%r211,%r540; add.u64 %r1129,%r1128,%r539; .loc 1 2744 49 ld.u64 %r1130,[%r1129+16]; st.u64 [%r229+4096],%r1130; ld.u64 %r1131,[%r1129+24]; st.u64 [%r229+4104],%r1131; .loc 1 2746 49 ld.u64 %r1132,[%r211+16]; st.u64 [%r229+4112],%r1132; ld.u64 %r1133,[%r211+24]; st.u64 [%r229+4120],%r1133; .loc 1 2738 9 add.u64 %r2360,%r229,8192; add.u64 %r2361,%r211,32; .loc 1 2741 7 add.u64 %r2419,%r2361,%r539; add.u64 %r2420,%r2419,%r540; .loc 1 2740 49 ld.u64 %r2421,[%r2420]; st.u64 [%r2360],%r2421; ld.u64 %r2422,[%r2420+8]; st.u64 [%r2360+8],%r2422; .loc 1 2742 49 ld.u64 %r2423,[%r2361]; st.u64 [%r2360+16],%r2423; ld.u64 %r2424,[%r2361+8]; st.u64 [%r2360+24],%r2424; .loc 1 2745 7 add.u64 %r2425,%r2361,%r540; add.u64 %r2426,%r2425,%r539; .loc 1 2744 49 ld.u64 %r2427,[%r2426+16]; st.u64 [%r2360+4096],%r2427; ld.u64 %r2428,[%r2426+24]; st.u64 [%r2360+4104],%r2428; .loc 1 2746 49 ld.u64 %r2429,[%r2361+16]; st.u64 [%r2360+4112],%r2429; ld.u64 %r2430,[%r2361+24]; st.u64 [%r2360+4120],%r2430; .loc 1 2738 9 add.u64 %r229,%r229,16384; add.u64 %r211,%r211,64; setp.ne.u64 %r2431,%r211,%r527; @ %r2431 bra $L52; bra $L53; $L50: .loc 1 2752 10 add.u64 %r1135,%r539,%r547; add.u64 %r1136,%r249,%r1135; .loc 1 2751 41 ld.u64 %r1137,[%r1136+-16]; st.u64 [%r530],%r1137; ld.u64 %r1138,[%r1136+-8]; st.u64 [%r530+8],%r1138; .loc 1 2754 10 add.u64 %r1139,%r536,%r551; .loc 1 2753 41 ld.u64 %r1140,[%r1139+-16]; st.u64 [%r530+16],%r1140; ld.u64 %r1141,[%r1139+-8]; st.u64 [%r530+24],%r1141; $L51: .loc 1 2735 5 add.u64 %r529,%r528,2; add.u64 %r530,%r530,32; add.u64 %r536,%r536,%r537; add.u64 %r539,%r539,%r537; add.u64 %r540,%r540,%r1055; add.u64 %r541,%r541,%r538; setp.ne.u64 %r1142,%r528,%r2285; @ %r1142 bra $L139; bra $L55; $L48: .loc 1 2767 25 shr.u64 %r1145,%r2242,62; add.u64 %r1146,%r2416,%r1145; and.b64 %r1147,%r1146,3; sub.u64 %r1148,%r1147,%r1145; .loc 1 2767 11 sub.u64 %r263,%r2416,%r1148; .loc 1 2769 5 @ %r2247 bra $L56; bra $L57; $L47: .loc 1 2759 17 add.u64 %r132,%r2416,%r273; .loc 1 2759 12 add.u64 %r1150,%r132,-1; .loc 1 2760 9 setp.lt.s64 %r1151,%r1150,%r273; @ %r1151 bra $L48; add.u64 %r1152,%r132,%r314; shl.b64 %r1153,%r1152,4; add.u64 %r136,%r249,%r1153; mov.u64 %r73,%r330; mov.u64 %r57,%r556; sub.u64 %r2345,%r136,%r556; add.u64 %r2346,%r2345,-16; shr.u64 %r2344,%r2346,4; add.u64 %r2347,%r2344,1; and.b64 %r2348,%r2347,3; setp.eq.u64 %r2351,%r2348,0; @ %r2351 bra $L58; setp.eq.u64 %r2350,%r2348,1; @ %r2350 bra $L248; setp.eq.u64 %r2349,%r2348,2; @ %r2349 bra $L249; bra $L281; $L58: .loc 1 2762 43 ld.u64 %r1154,[%r57]; st.u64 [%r73],%r1154; ld.u64 %r1155,[%r57+8]; st.u64 [%r73+8],%r1155; .loc 1 2760 9 add.u64 %r2352,%r57,16; add.u64 %r2353,%r73,4096; .loc 1 2762 43 ld.u64 %r2434,[%r2352]; st.u64 [%r2353],%r2434; ld.u64 %r2435,[%r2352+8]; st.u64 [%r2353+8],%r2435; .loc 1 2760 9 add.u64 %r2436,%r57,32; add.u64 %r2437,%r73,8192; .loc 1 2762 43 ld.u64 %r2439,[%r2436]; st.u64 [%r2437],%r2439; ld.u64 %r2440,[%r2436+8]; st.u64 [%r2437+8],%r2440; .loc 1 2760 9 add.u64 %r2441,%r57,48; add.u64 %r2442,%r73,12288; .loc 1 2762 43 ld.u64 %r2444,[%r2441]; st.u64 [%r2442],%r2444; ld.u64 %r2445,[%r2441+8]; st.u64 [%r2442+8],%r2445; .loc 1 2760 9 add.u64 %r57,%r57,64; add.u64 
%r73,%r73,16384; setp.ne.u64 %r2446,%r57,%r136; @ %r2446 bra $L58; bra $L48; $L57: .loc 1 2871 8 @ ! %r2226 bra $L60; bra $L59; $L56: .loc 1 2876 14 add.u64 %r488,%r263,%r273; .loc 1 2876 9 add.u64 %r489,%r488,-1; sub.u64 %r316,%r598,%r504; add.u64 %r222,%r555,%r1059; add.u64 %r89,%r555,%r566; add.u64 %r42,%r566,%r89; not.b64 %r1158,%r273; add.u64 %r1159,%r1158,%r488; shr.u64 %r1160,%r1159,2; add.u64 %r1161,%r1160,1; shl.b64 %r318,%r1161,6; add.u64 %r1162,%r316,%r488; shl.b64 %r1163,%r1162,4; add.u64 %r468,%r1163,%r247; add.u64 %r1164,%r488,%r598; shl.b64 %r1165,%r1164,4; add.u64 %r463,%r1165,%r247; add.u64 %r1166,%r316,%r601; add.u64 %r1167,%r1166,%r488; shl.b64 %r1168,%r1167,4; add.u64 %r460,%r1168,%r247; add.u64 %r1169,%r488,%r602; shl.b64 %r1170,%r1169,4; add.u64 %r457,%r1170,%r247; shl.b64 %r369,%r263,8; shl.b64 %r471,%r2416,4; mov.u64 %r476,%r475; mov.u64 %r478,%r477; mov.u64 %r483,%r605; mov.u64 %r486,%r603; mov.u64 %r220,%r555; mov.u64 %r227,%r225; mov.u64 %r278,%r618; mov.u64 %r473,0; mov.u64 %r474,%r473; setp.le.s64 %r2241,%r273,%r489; setp.gt.s64 %r2230,%r2416,%r263; $L68: .loc 1 2772 9 @ %r2241 bra $L61; $L67: .loc 1 2843 12 @ ! %r2230 bra $L63; bra $L62; $L61: .loc 1 2904 13 mov.u64 %r341,0; mov.u64 %r342,16384; mov.u64 %r343,12288; mov.u64 %r344,4096; mov.u64 %r345,-8192; setp.gt.s64 %r2236,%r258,%r2288; $L66: add.u64 %r339,%r222,%r341; .loc 1 2774 10 ld.u64 %r2063,[%r339]; ld.u64 %r2064,[%r339+8]; .loc 1 2775 10 ld.u64 %r2065,[%r339+16]; ld.u64 %r2066,[%r339+24]; add.u64 %r337,%r220,%r341; .loc 1 2776 10 ld.u64 %r2067,[%r337]; ld.u64 %r2068,[%r337+8]; .loc 1 2777 10 ld.u64 %r2069,[%r337+16]; ld.u64 %r2070,[%r337+24]; add.u64 %r335,%r89,%r341; .loc 1 2778 10 ld.u64 %r2071,[%r335]; ld.u64 %r2072,[%r335+8]; .loc 1 2779 10 ld.u64 %r2073,[%r335+16]; ld.u64 %r2074,[%r335+24]; add.u64 %r333,%r42,%r341; .loc 1 2780 10 ld.u64 %r2075,[%r333]; ld.u64 %r2076,[%r333+8]; .loc 1 2781 10 ld.u64 %r2077,[%r333+16]; ld.u64 %r2078,[%r333+24]; .loc 1 2782 10 ld.u64 %r2079,[%r339+32]; ld.u64 %r2080,[%r339+40]; .loc 1 2783 10 ld.u64 %r2081,[%r339+48]; ld.u64 %r2082,[%r339+56]; .loc 1 2784 10 ld.u64 %r2083,[%r337+32]; ld.u64 %r2084,[%r337+40]; .loc 1 2785 10 ld.u64 %r2085,[%r337+48]; ld.u64 %r2086,[%r337+56]; .loc 1 2786 10 ld.u64 %r2087,[%r335+32]; ld.u64 %r2088,[%r335+40]; .loc 1 2787 10 ld.u64 %r2089,[%r335+48]; ld.u64 %r2090,[%r335+56]; .loc 1 2788 10 ld.u64 %r2091,[%r333+32]; ld.u64 %r2092,[%r333+40]; .loc 1 2789 10 ld.u64 %r2093,[%r333+48]; ld.u64 %r2094,[%r333+56]; .loc 1 2791 6 @ %r2236 bra $L64; add.u64 %r361,%r473,%r570; add.u64 %r359,%r1033,%r344; $L65: add.u64 %r356,%r345,%r359; .loc 1 2793 19 add.u64 %r1175,%r356,%r344; ld.u64 %r2025,[%r1175]; ld.u64 %r2026,[%r1175+8]; .loc 1 2794 14 ld.u64 %r2027,[%r361]; ld.u64 %r2028,[%r361+8]; .loc 1 2794 11 st.u64 [%frame+80],%r2025; st.u64 [%frame+88],%r2026; st.u64 [%frame+96],%r2027; st.u64 [%frame+104],%r20282305230623072103,[%frame+64]; ld.u64 %r2104,[%frame+72]; .loc 1 2793 14 add.u64 %r1188,%r2063,%r2103; set.u32.lt.u64 %r1193,%r1188,%r2063; cvt.s64.s32 %r1191,%r1193; add.u64 %r1194,%r2064,%r2104; mov.u64 %r2063,%r1188; sub.u64 %r2064,%r1194,%r1191; .loc 1 2795 19 ld.u64 %r2029,[%r359]; ld.u64 %r2030,[%r359+8]; .loc 1 2796 11 st.u64 [%frame+80],%r2027; st.u64 [%frame+88],%r2028; st.u64 [%frame+96],%r2029; st.u64 [%frame+104],%r2032305230623072107,[%frame+64]; ld.u64 %r2108,[%frame+72]; .loc 1 2795 14 add.u64 %r1210,%r2065,%r2107; set.u32.lt.u64 %r1215,%r1210,%r2065; cvt.s64.s32 %r1213,%r1215; add.u64 %r1216,%r2066,%r2108; mov.u64 
%r2065,%r1210; sub.u64 %r2066,%r1216,%r1213; add.u64 %r351,%r361,%r486; .loc 1 2798 14 add.u64 %r1220,%r351,%r483; ld.u64 %r2031,[%r1220]; ld.u64 %r2032,[%r1220+8]; .loc 1 2798 11 st.u64 [%frame+80],%r2025; st.u64 [%frame+88],%r2026; st.u64 [%frame+96],%r2031; st.u64 [%frame+104],%r20322305230623072111,[%frame+64]; ld.u64 %r2112,[%frame+72]; .loc 1 2797 14 add.u64 %r1233,%r2067,%r2111; set.u32.lt.u64 %r1238,%r1233,%r2067; cvt.s64.s32 %r1236,%r1238; add.u64 %r1239,%r2068,%r2112; mov.u64 %r2067,%r1233; sub.u64 %r2068,%r1239,%r1236; .loc 1 2800 11 st.u64 [%frame+80],%r2029; st.u64 [%frame+88],%r2030; st.u64 [%frame+96],%r2031; st.u64 [%frame+104],%r20322305230623072115,[%frame+64]; ld.u64 %r2116,[%frame+72]; .loc 1 2799 14 add.u64 %r1255,%r2069,%r2115; set.u32.lt.u64 %r1260,%r1255,%r2069; cvt.s64.s32 %r1258,%r1260; add.u64 %r1261,%r2070,%r2116; mov.u64 %r2069,%r1255; sub.u64 %r2070,%r1261,%r1258; .loc 1 2802 14 add.u64 %r1265,%r351,%r478; ld.u64 %r2033,[%r1265]; ld.u64 %r2034,[%r1265+8]; .loc 1 2802 11 st.u64 [%frame+80],%r2025; st.u64 [%frame+88],%r2026; st.u64 [%frame+96],%r2033; st.u64 [%frame+104],%r20342305230623072119,[%frame+64]; ld.u64 %r2120,[%frame+72]; .loc 1 2801 14 add.u64 %r1278,%r2071,%r2119; set.u32.lt.u64 %r1283,%r1278,%r2071; cvt.s64.s32 %r1281,%r1283; add.u64 %r1284,%r2072,%r2120; mov.u64 %r2071,%r1278; sub.u64 %r2072,%r1284,%r1281; .loc 1 2804 11 st.u64 [%frame+80],%r2029; st.u64 [%frame+88],%r2030; st.u64 [%frame+96],%r2033; st.u64 [%frame+104],%r20342305230623072123,[%frame+64]; ld.u64 %r2124,[%frame+72]; .loc 1 2803 14 add.u64 %r1300,%r2073,%r2123; set.u32.lt.u64 %r1305,%r1300,%r2073; cvt.s64.s32 %r1303,%r1305; add.u64 %r1306,%r2074,%r2124; mov.u64 %r2073,%r1300; sub.u64 %r2074,%r1306,%r1303; .loc 1 2806 14 add.u64 %r1310,%r351,%r476; ld.u64 %r2035,[%r1310]; ld.u64 %r2036,[%r1310+8]; .loc 1 2806 11 st.u64 [%frame+80],%r2025; st.u64 [%frame+88],%r2026; st.u64 [%frame+96],%r2035; st.u64 [%frame+104],%r203305230623072127,[%frame+64]; ld.u64 %r2128,[%frame+72]; .loc 1 2805 14 add.u64 %r1323,%r2075,%r2127; set.u32.lt.u64 %r1328,%r1323,%r2075; cvt.s64.s32 %r1326,%r1328; add.u64 %r1329,%r2076,%r2128; mov.u64 %r2075,%r1323; sub.u64 %r2076,%r1329,%r1326; .loc 1 2808 11 st.u64 [%frame+80],%r2029; st.u64 [%frame+88],%r2030; st.u64 [%frame+96],%r2035; st.u64 [%frame+104],%r203305230623072131,[%frame+64]; ld.u64 %r2132,[%frame+72]; .loc 1 2807 14 add.u64 %r1345,%r2077,%r2131; set.u32.lt.u64 %r1350,%r1345,%r2077; cvt.s64.s32 %r1348,%r1350; add.u64 %r1351,%r2078,%r2132; mov.u64 %r2077,%r1345; sub.u64 %r2078,%r1351,%r1348; .loc 1 2809 19 add.u64 %r1355,%r356,%r343; ld.u64 %r2037,[%r1355]; ld.u64 %r2038,[%r1355+8]; .loc 1 2810 11 st.u64 [%frame+80],%r2027; st.u64 [%frame+88],%r2028; st.u64 [%frame+96],%r2037; st.u64 [%frame+104],%r20382305230623072135,[%frame+64]; ld.u64 %r2136,[%frame+72]; .loc 1 2809 14 add.u64 %r1368,%r2079,%r2135; set.u32.lt.u64 %r1373,%r1368,%r2079; cvt.s64.s32 %r1371,%r1373; add.u64 %r1374,%r2080,%r2136; mov.u64 %r2079,%r1368; sub.u64 %r2080,%r1374,%r1371; .loc 1 2811 19 add.u64 %r1378,%r356,%r342; ld.u64 %r2039,[%r1378]; ld.u64 %r2040,[%r1378+8]; .loc 1 2812 11 st.u64 [%frame+80],%r2027; st.u64 [%frame+88],%r2028; st.u64 [%frame+96],%r2039; st.u64 [%frame+104],%r2042305230623072139,[%frame+64]; ld.u64 %r2140,[%frame+72]; .loc 1 2811 14 add.u64 %r1391,%r2081,%r2139; set.u32.lt.u64 %r1396,%r1391,%r2081; cvt.s64.s32 %r1394,%r1396; add.u64 %r1397,%r2082,%r2140; mov.u64 %r2081,%r1391; sub.u64 %r2082,%r1397,%r1394; .loc 1 2814 11 st.u64 [%frame+80],%r2031; st.u64 
[%frame+88],%r2032; st.u64 [%frame+96],%r2037; st.u64 [%frame+104],%r20382305230623072143,[%frame+64]; ld.u64 %r2144,[%frame+72]; .loc 1 2813 14 add.u64 %r1413,%r2083,%r2143; set.u32.lt.u64 %r1418,%r1413,%r2083; cvt.s64.s32 %r1416,%r1418; add.u64 %r1419,%r2084,%r2144; mov.u64 %r2083,%r1413; sub.u64 %r2084,%r1419,%r1416; .loc 1 2816 11 st.u64 [%frame+80],%r2031; st.u64 [%frame+88],%r2032; st.u64 [%frame+96],%r2039; st.u64 [%frame+104],%r2042305230623072147,[%frame+64]; ld.u64 %r2148,[%frame+72]; .loc 1 2815 14 add.u64 %r1435,%r2085,%r2147; set.u32.lt.u64 %r1440,%r1435,%r2085; cvt.s64.s32 %r1438,%r1440; add.u64 %r1441,%r2086,%r2148; mov.u64 %r2085,%r1435; sub.u64 %r2086,%r1441,%r1438; .loc 1 2818 11 st.u64 [%frame+80],%r2033; st.u64 [%frame+88],%r2034; st.u64 [%frame+96],%r2037; st.u64 [%frame+104],%r20382305230623072151,[%frame+64]; ld.u64 %r2152,[%frame+72]; .loc 1 2817 14 add.u64 %r1457,%r2087,%r2151; set.u32.lt.u64 %r1462,%r1457,%r2087; cvt.s64.s32 %r1460,%r1462; add.u64 %r1463,%r2088,%r2152; mov.u64 %r2087,%r1457; sub.u64 %r2088,%r1463,%r1460; .loc 1 2820 11 st.u64 [%frame+80],%r2033; st.u64 [%frame+88],%r2034; st.u64 [%frame+96],%r2039; st.u64 [%frame+104],%r2042305230623072155,[%frame+64]; ld.u64 %r2156,[%frame+72]; .loc 1 2819 14 add.u64 %r1479,%r2089,%r2155; set.u32.lt.u64 %r1484,%r1479,%r2089; cvt.s64.s32 %r1482,%r1484; add.u64 %r1485,%r2090,%r2156; mov.u64 %r2089,%r1479; sub.u64 %r2090,%r1485,%r1482; .loc 1 2822 11 st.u64 [%frame+80],%r2035; st.u64 [%frame+88],%r2036; st.u64 [%frame+96],%r2037; st.u64 [%frame+104],%r20382305230623072159,[%frame+64]; ld.u64 %r2160,[%frame+72]; .loc 1 2821 14 add.u64 %r1501,%r2091,%r2159; set.u32.lt.u64 %r1506,%r1501,%r2091; cvt.s64.s32 %r1504,%r1506; add.u64 %r1507,%r2092,%r2160; mov.u64 %r2091,%r1501; sub.u64 %r2092,%r1507,%r1504; .loc 1 2824 11 st.u64 [%frame+80],%r2035; st.u64 [%frame+88],%r2036; st.u64 [%frame+96],%r2039; st.u64 [%frame+104],%r2042305230623072163,[%frame+64]; ld.u64 %r2164,[%frame+72]; .loc 1 2823 14 add.u64 %r1523,%r2093,%r2163; set.u32.lt.u64 %r1528,%r1523,%r2093; cvt.s64.s32 %r1526,%r1528; add.u64 %r1529,%r2094,%r2164; mov.u64 %r2093,%r1523; sub.u64 %r2094,%r1529,%r1526; .loc 1 2791 6 add.u64 %r361,%r361,16; add.u64 %r359,%r359,16; setp.ne.u64 %r1533,%r227,%r361; @ %r1533 bra $L65; $L64: .loc 1 2826 24 st.u64 [%r339],%r2063; st.u64 [%r339+8],%r2064; .loc 1 2827 28 st.u64 [%r339+16],%r2065; st.u64 [%r339+24],%r2066; .loc 1 2828 30 st.u64 [%r337],%r2067; st.u64 [%r337+8],%r2068; .loc 1 2829 34 st.u64 [%r337+16],%r2069; st.u64 [%r337+24],%r2070; .loc 1 2830 30 st.u64 [%r335],%r2071; st.u64 [%r335+8],%r2072; .loc 1 2831 34 st.u64 [%r335+16],%r2073; st.u64 [%r335+24],%r2074; .loc 1 2832 30 st.u64 [%r333],%r2075; st.u64 [%r333+8],%r2076; .loc 1 2833 34 st.u64 [%r333+16],%r2077; st.u64 [%r333+24],%r2078; .loc 1 2834 28 st.u64 [%r339+32],%r2079; st.u64 [%r339+40],%r2080; .loc 1 2835 28 st.u64 [%r339+48],%r2081; st.u64 [%r339+56],%r2082; .loc 1 2836 34 st.u64 [%r337+32],%r2083; st.u64 [%r337+40],%r2084; .loc 1 2837 34 st.u64 [%r337+48],%r2085; st.u64 [%r337+56],%r2086; .loc 1 2838 34 st.u64 [%r335+32],%r2087; st.u64 [%r335+40],%r2088; .loc 1 2839 34 st.u64 [%r335+48],%r2089; st.u64 [%r335+56],%r2090; .loc 1 2840 34 st.u64 [%r333+32],%r2091; st.u64 [%r333+40],%r2092; .loc 1 2841 34 st.u64 [%r333+48],%r2093; st.u64 [%r333+56],%r2094; .loc 1 2772 9 add.u64 %r345,%r345,-16384; add.u64 %r344,%r344,16384; add.u64 %r343,%r343,16384; add.u64 %r342,%r342,16384; add.u64 %r341,%r341,64; setp.ne.u64 %r1566,%r318,%r341; @ %r1566 bra $L66; bra 
$L67; $L63: .loc 1 2769 29 add.u64 %r278,%r278,4; .loc 1 2769 5 add.u64 %r227,%r227,%r226; add.u64 %r222,%r222,%r221; add.u64 %r220,%r220,%r221; add.u64 %r89,%r89,%r221; add.u64 %r42,%r42,%r221; add.u64 %r486,%r486,%r1062; add.u64 %r483,%r483,%r226; add.u64 %r478,%r478,%r226; add.u64 %r476,%r476,%r226; add.u64 %r474,%r474,%r221; add.u64 %r473,%r473,%r226; setp.le.s64 %r1567,%r278,%r2293; @ %r1567 bra $L68; bra $L57; $L62: add.u64 %r374,%r468,%r474; add.u64 %r373,%r463,%r474; add.u64 %r372,%r460,%r474; add.u64 %r371,%r457,%r474; add.u64 %r49,%r48,%r473; add.u64 %r239,%r46,%r473; add.u64 %r240,%r30,%r473; add.u64 %r472,%r222,%r471; .loc 1 2904 13 mov.u64 %r370,%r369; setp.gt.s64 %r2237,%r258,%r2288; $L71: .loc 1 2848 14 ld.u64 %r2055,[%r374]; ld.u64 %r2056,[%r374+8]; .loc 1 2849 14 ld.u64 %r2057,[%r373]; ld.u64 %r2058,[%r373+8]; .loc 1 2850 14 ld.u64 %r2059,[%r372]; ld.u64 %r2060,[%r372+8]; .loc 1 2851 14 ld.u64 %r2061,[%r371]; ld.u64 %r2062,[%r371+8]; .loc 1 2853 10 @ %r2237 bra $L69; shl.b64 %r1571,%r370,4; add.u64 %r389,%r1033,%r1571; add.u64 %r387,%r473,%r570; mov.u64 %r381,%r240; mov.u64 %r383,%r239; mov.u64 %r385,%r49; $L70: .loc 1 2855 16 ld.u64 %r2041,[%r389]; ld.u64 %r2042,[%r389+8]; .loc 1 2856 13 st.u64 [%frame+80],%r2041; st.u64 [%frame+88],%r2042; ld.u64 %r1578,[%r387]; st.u64 [%frame+96],%r1578; ld.u64 %r1579,[%r387+8]; st.u64 [%frame+104],%r1572305230623072167,[%frame+64]; ld.u64 %r2168,[%frame+72]; .loc 1 2855 11 add.u64 %r1584,%r2055,%r2167; set.u32.lt.u64 %r1589,%r1584,%r2055; cvt.s64.s32 %r1587,%r1589; add.u64 %r1590,%r2056,%r2168; mov.u64 %r2055,%r1584; sub.u64 %r2056,%r1590,%r1587; .loc 1 2858 13 st.u64 [%frame+80],%r2041; st.u64 [%frame+88],%r2042; ld.u64 %r1600,[%r385]; st.u64 [%frame+96],%r1600; ld.u64 %r1601,[%r385+8]; st.u64 [%frame+104],%r16005230623072171,[%frame+64]; ld.u64 %r2172,[%frame+72]; .loc 1 2857 11 add.u64 %r1606,%r2057,%r2171; set.u32.lt.u64 %r1611,%r1606,%r2057; cvt.s64.s32 %r1609,%r1611; add.u64 %r1612,%r2058,%r2172; mov.u64 %r2057,%r1606; sub.u64 %r2058,%r1612,%r1609; .loc 1 2860 13 st.u64 [%frame+80],%r2041; st.u64 [%frame+88],%r2042; ld.u64 %r1622,[%r383]; st.u64 [%frame+96],%r1622; ld.u64 %r1623,[%r383+8]; st.u64 [%frame+104],%r162305230623072175,[%frame+64]; ld.u64 %r2176,[%frame+72]; .loc 1 2859 11 add.u64 %r1628,%r2059,%r2175; set.u32.lt.u64 %r1633,%r1628,%r2059; cvt.s64.s32 %r1631,%r1633; add.u64 %r1634,%r2060,%r2176; mov.u64 %r2059,%r1628; sub.u64 %r2060,%r1634,%r1631; .loc 1 2862 13 st.u64 [%frame+80],%r2041; st.u64 [%frame+88],%r2042; ld.u64 %r1644,[%r381]; st.u64 [%frame+96],%r1644; ld.u64 %r1645,[%r381+8]; st.u64 [%frame+104],%r16452305230623072179,[%frame+64]; ld.u64 %r2180,[%frame+72]; .loc 1 2861 11 add.u64 %r1650,%r2061,%r2179; set.u32.lt.u64 %r1655,%r1650,%r2061; cvt.s64.s32 %r1653,%r1655; add.u64 %r1656,%r2062,%r2180; mov.u64 %r2061,%r1650; sub.u64 %r2062,%r1656,%r1653; .loc 1 2853 10 add.u64 %r389,%r389,16; add.u64 %r387,%r387,16; add.u64 %r385,%r385,16; add.u64 %r383,%r383,16; add.u64 %r381,%r381,16; setp.ne.u64 %r1660,%r227,%r387; @ %r1660 bra $L70; $L69: .loc 1 2864 28 st.u64 [%r374],%r2055; st.u64 [%r374+8],%r2056; .loc 1 2865 34 st.u64 [%r373],%r2057; st.u64 [%r373+8],%r2058; .loc 1 2866 34 st.u64 [%r372],%r2059; st.u64 [%r372+8],%r2060; .loc 1 2867 34 st.u64 [%r371],%r2061; st.u64 [%r371+8],%r2062; .loc 1 2846 6 add.u64 %r374,%r374,16; add.u64 %r373,%r373,16; add.u64 %r372,%r372,16; add.u64 %r371,%r371,16; add.u64 %r370,%r370,256; setp.ne.u64 %r1669,%r374,%r472; @ %r1669 bra $L71; bra $L63; $L60: .loc 1 2727 34 add.u64 
%r273,%r273,256; .loc 1 2727 8 add.u64 %r555,%r555,4096; add.u64 %r556,%r556,4096; setp.ge.s64 %r1670,%r216,%r273; @ %r1670 bra $L72; bra $L73; $L59: .loc 1 2873 17 add.u64 %r171,%r2413,%r618; .loc 1 2874 9 setp.le.s64 %r1671,%r171,%r480; @ %r1671 bra $L60; .loc 1 2876 14 add.u64 %r495,%r263,%r273; .loc 1 2876 9 add.u64 %r496,%r495,-1; .loc 1 2759 17 add.u64 %r405,%r2416,%r273; add.u64 %r1672,%r405,%r406; shl.b64 %r1673,%r1672,4; add.u64 %r407,%r1673,%r247; neg.s64 %r1675,%r405; shl.b64 %r1676,%r1675,4; shl.b64 %r1677,%r273,4; add.u64 %r390,%r1677,%r1676; shl.b64 %r1678,%r495,4; add.u64 %r392,%r1678,%r1676; shl.b64 %r434,%r263,8; .loc 1 2876 9 mov.u64 %r398,%r397; mov.u64 %r400,%r586; mov.u64 %r401,%r406; mov.u64 %r267,%r480; setp.gt.s64 %r2450,%r2416,%r263; setp.le.s64 %r2243,%r273,%r496; not.b64 %r2260,%r273; add.u64 %r2261,%r2260,%r495; and.b64 %r2262,%r2261,-4; $L81: .loc 1 2877 6 @ %r2243 bra $L74; $L80: .loc 1 2901 6 @ %r2450 bra $L75; bra $L76; $L74: add.u64 %r419,%r390,%r407; add.u64 %r1685,%r2262,%r273; add.u64 %r1686,%r1685,%r401; shl.b64 %r1687,%r1686,4; add.u64 %r409,%r1687,%r2308; .loc 1 2904 13 mov.u64 %r415,16384; mov.u64 %r416,12288; mov.u64 %r417,4096; mov.u64 %r418,-8192; setp.gt.s64 %r2238,%r258,%r2288; $L79: .loc 1 2879 14 ld.u64 %r2047,[%r419]; ld.u64 %r2048,[%r419+8]; .loc 1 2880 14 ld.u64 %r2049,[%r419+16]; ld.u64 %r2050,[%r419+24]; .loc 1 2881 14 ld.u64 %r2051,[%r419+32]; ld.u64 %r2052,[%r419+40]; .loc 1 2882 14 ld.u64 %r2053,[%r419+48]; ld.u64 %r2054,[%r419+56]; .loc 1 2884 10 @ %r2238 bra $L77; add.u64 %r428,%r1033,%r417; mov.u64 %r429,%r400; $L78: add.u64 %r425,%r418,%r428; .loc 1 2887 16 ld.u64 %r2043,[%r429]; ld.u64 %r2044,[%r429+8]; .loc 1 2886 16 add.u64 %r1690,%r425,%r417; .loc 1 2887 13 st.u64 [%frame+80],%r2043; st.u64 [%frame+88],%r2044; ld.u64 %r1697,[%r1690]; st.u64 [%frame+96],%r1697; ld.u64 %r1698,[%r1690+8]; st.u64 [%frame+104],%r16982305230623072183,[%frame+64]; ld.u64 %r2184,[%frame+72]; .loc 1 2886 11 add.u64 %r1703,%r2047,%r2183; set.u32.lt.u64 %r1708,%r1703,%r2047; cvt.s64.s32 %r1706,%r1708; add.u64 %r1709,%r2048,%r2184; mov.u64 %r2047,%r1703; sub.u64 %r2048,%r1709,%r1706; .loc 1 2889 13 st.u64 [%frame+80],%r2043; st.u64 [%frame+88],%r2044; ld.u64 %r1719,[%r428]; st.u64 [%frame+96],%r1719; ld.u64 %r1720,[%r428+8]; st.u64 [%frame+104],%r172305230623072187,[%frame+64]; ld.u64 %r2188,[%frame+72]; .loc 1 2888 11 add.u64 %r1725,%r2049,%r2187; set.u32.lt.u64 %r1730,%r1725,%r2049; cvt.s64.s32 %r1728,%r1730; add.u64 %r1731,%r2050,%r2188; mov.u64 %r2049,%r1725; sub.u64 %r2050,%r1731,%r1728; .loc 1 2890 16 add.u64 %r1735,%r425,%r416; .loc 1 2891 13 st.u64 [%frame+80],%r2043; st.u64 [%frame+88],%r2044; ld.u64 %r1742,[%r1735]; st.u64 [%frame+96],%r1742; ld.u64 %r1743,[%r1735+8]; st.u64 [%frame+104],%r174305230623072191,[%frame+64]; ld.u64 %r2192,[%frame+72]; .loc 1 2890 11 add.u64 %r1748,%r2051,%r2191; set.u32.lt.u64 %r1753,%r1748,%r2051; cvt.s64.s32 %r1751,%r1753; add.u64 %r1754,%r2052,%r2192; mov.u64 %r2051,%r1748; sub.u64 %r2052,%r1754,%r1751; .loc 1 2892 16 add.u64 %r1758,%r425,%r415; .loc 1 2893 13 st.u64 [%frame+80],%r2043; st.u64 [%frame+88],%r2044; ld.u64 %r1765,[%r1758]; st.u64 [%frame+96],%r1765; ld.u64 %r1766,[%r1758+8]; st.u64 [%frame+104],%r176305230623072195,[%frame+64]; ld.u64 %r2196,[%frame+72]; .loc 1 2892 11 add.u64 %r1771,%r2053,%r2195; set.u32.lt.u64 %r1776,%r1771,%r2053; cvt.s64.s32 %r1774,%r1776; add.u64 %r1777,%r2054,%r2196; mov.u64 %r2053,%r1771; sub.u64 %r2054,%r1777,%r1774; .loc 1 2884 10 add.u64 %r429,%r429,16; add.u64 
%r428,%r428,16; setp.ne.u64 %r1781,%r398,%r429; @ %r1781 bra $L78; $L77: .loc 1 2895 28 st.u64 [%r419],%r2047; st.u64 [%r419+8],%r2048; .loc 1 2896 32 st.u64 [%r419+16],%r2049; st.u64 [%r419+24],%r2050; .loc 1 2897 32 st.u64 [%r419+32],%r2051; st.u64 [%r419+40],%r2052; .loc 1 2898 32 st.u64 [%r419+48],%r2053; st.u64 [%r419+56],%r2054; .loc 1 2877 6 add.u64 %r419,%r419,64; add.u64 %r418,%r418,-16384; add.u64 %r417,%r417,16384; add.u64 %r416,%r416,16384; add.u64 %r415,%r415,16384; setp.ne.u64 %r1790,%r409,%r419; @ %r1790 bra $L79; bra $L80; $L76: .loc 1 2874 39 add.u64 %r267,%r267,1; .loc 1 2874 9 add.u64 %r407,%r407,%r566; add.u64 %r401,%r401,%r504; add.u64 %r400,%r400,%r619; add.u64 %r398,%r398,%r619; setp.ne.u64 %r1791,%r171,%r267; @ %r1791 bra $L81; bra $L60; $L75: add.u64 %r436,%r392,%r407; .loc 1 2904 13 mov.u64 %r435,%r434; setp.gt.s64 %r2239,%r258,%r2288; $L84: .loc 1 2903 14 ld.u64 %r2045,[%r436]; ld.u64 %r2046,[%r436+8]; .loc 1 2905 10 @ %r2239 bra $L82; shl.b64 %r1794,%r435,4; add.u64 %r332,%r1033,%r1794; mov.u64 %r297,%r400; sub.u64 %r2337,%r398,%r400; add.u64 %r2338,%r2337,-16; shr.u64 %r2336,%r2338,4; and.b64 %r2340,%r2336,1; setp.ne.u64 %r2341,%r2340,0; @ ! %r2341 bra $L282; $L83: .loc 1 2908 13 ld.u64 %r1797,[%r332]; st.u64 [%frame+80],%r1797; ld.u64 %r1798,[%r332+8]; st.u64 [%frame+88],%r1798; ld.u64 %r1801,[%r297]; st.u64 [%frame+96],%r1801; ld.u64 %r1802,[%r297+8]; st.u64 [%frame+104],%r18022305230623072199,[%frame+64]; ld.u64 %r2200,[%frame+72]; .loc 1 2907 11 add.u64 %r1807,%r2045,%r2199; set.u32.lt.u64 %r1812,%r1807,%r2045; cvt.s64.s32 %r1810,%r1812; add.u64 %r1813,%r2046,%r2200; sub.u64 %r1816,%r1813,%r1810; .loc 1 2905 10 add.u64 %r2342,%r332,16; add.u64 %r2343,%r297,16; .loc 1 2908 13 ld.u64 %r2461,[%r2342]; st.u64 [%frame+80],%r2461; ld.u64 %r2462,[%r2342+8]; st.u64 [%frame+88],%r2462; ld.u64 %r2463,[%r2343]; st.u64 [%frame+96],%r2463; ld.u64 %r2464,[%r2343+8]; st.u64 [%frame+104],%r24642305230623072465,[%frame+64]; ld.u64 %r2466,[%frame+72]; .loc 1 2907 11 add.u64 %r2467,%r1807,%r2465; set.u32.lt.u64 %r2468,%r2467,%r1807; cvt.s64.s32 %r2469,%r2468; add.u64 %r2470,%r1816,%r2466; mov.u64 %r2045,%r2467; sub.u64 %r2046,%r2470,%r2469; .loc 1 2905 10 add.u64 %r332,%r332,32; add.u64 %r297,%r297,32; setp.ne.u64 %r2472,%r398,%r297; @ %r2472 bra $L83; $L82: .loc 1 2910 28 st.u64 [%r436],%r2045; st.u64 [%r436+8],%r2046; .loc 1 2901 6 add.u64 %r436,%r436,16; add.u64 %r435,%r435,256; setp.ne.u64 %r1820,%r407,%r436; @ %r1820 bra $L84; bra $L76; $L20: .loc 1 2958 11 cvt.u16.u32 %r1821,%r446; setp.eq.u16 %r1822,%r1821,1; @ ! 
%r1822 bra $L283; bra $L85; $L31: .loc 1 2920 38 set.u32.eq.u64 %r1824,%r208,1; neg.s32 %r1825,%r1824; .loc 1 2920 26 cvt.u16.u32 %r1827,%r1825; and.b16 %r1826,%r1827,%r990; .loc 1 2920 11 cvt.u32.u16 %r1829,%r1826; cvt.u16.u8 %r1830,%r1829; setp.eq.u16 %r1831,%r1830,0; @ %r1831 bra $L20; .loc 1 2920 43 setp.ne.u64 %r1832,%r502,1; @ %r1832 bra $L20; $L121: .loc 1 2922 10 cvt.u16.u32 %r1833,%r446; setp.ne.u16 %r1834,%r1833,1; @ %r1834 bra $L87; .loc 1 2948 4 setp.gt.s64 %r1835,%r217,0; @ %r1835 bra $L88; bra $L1; $L87: .loc 1 2929 4 setp.le.s64 %r1836,%r217,0; @ %r1836 bra $L1; shl.b64 %r643,%r504,4; shl.b64 %r1837,%r216,4; add.u64 %r642,%r245,%r1837; shl.b64 %r633,%r503,4; shl.b64 %r1838,%r456,4; add.u64 %r634,%r243,%r1838; mov.u64 %r647,0; mov.u64 %r640,%r647; .loc 1 2929 11 mov.u64 %r300,%r647; setp.gt.s64 %r2473,%r216,0; setp.le.s64 %r2312,%r456,0; add.u64 %r2313,%frame,64; add.u64 %r2314,%frame,80; add.u64 %r2315,%frame,96; $L91: .loc 1 2933 8 @ %r2473 bra $L90; $L95: .loc 1 2929 29 add.u64 %r300,%r300,1; .loc 1 2929 4 add.u64 %r640,%r640,%r504; add.u64 %r642,%r642,%r643; add.u64 %r647,%r647,%r209; setp.ne.u64 %r1840,%r217,%r300; @ %r1840 bra $L91; bra $L1; $L90: shl.b64 %r1841,%r640,4; add.u64 %r629,%r245,%r1841; shl.b64 %r1842,%r647,4; add.u64 %r652,%r244,%r1842; .loc 1 2933 8 mov.u64 %r631,%r634; mov.u64 %r630,0; $L94: .loc 1 2937 5 @ %r2312 bra $L140; shl.b64 %r1844,%r630,4; add.u64 %r625,%r243,%r1844; mov.u64 %r626,%r652; .loc 1 2936 7 mov.u64 %r2097,0; mov.u64 %r2098,%r2097; sub.u64 %r2371,%r631,%r625; add.u64 %r2372,%r2371,-16; shr.u64 %r2370,%r2372,4; and.b64 %r2374,%r2370,1; setp.ne.u64 %r2375,%r2374,0; @ ! %r2375 bra $L284; $L93: .loc 1 2938 23 ld.u64 %r1847,[%r625]; st.u64 [%frame+80],%r1847; ld.u64 %r1848,[%r625+8]; st.u64 [%frame+88],%r1848; ld.u64 %r1851,[%r626]; st.u64 [%frame+96],%r1851; ld.u64 %r1852,[%r626+8]; st.u64 [%frame+104],%r1852231r23142312203,[%frame+64]; ld.u64 %r2204,[%frame+72]; .loc 1 2938 9 add.u64 %r1857,%r2097,%r2203; set.u32.lt.u64 %r1862,%r1857,%r2097; cvt.s64.s32 %r1860,%r1862; add.u64 %r1863,%r2098,%r2204; sub.u64 %r1866,%r1863,%r1860; .loc 1 2937 5 add.u64 %r2376,%r625,16; add.u64 %r2377,%r626,16; .loc 1 2938 23 ld.u64 %r2481,[%r2376]; st.u64 [%frame+80],%r2481; ld.u64 %r2482,[%r2376+8]; st.u64 [%frame+88],%r2482; ld.u64 %r2483,[%r2377]; st.u64 [%frame+96],%r2483; ld.u64 %r2484,[%r2377+8]; st.u64 [%frame+104],%r2484231r23142312485,[%frame+64]; ld.u64 %r2486,[%frame+72]; .loc 1 2938 9 add.u64 %r2487,%r1857,%r2485; set.u32.lt.u64 %r2488,%r2487,%r1857; cvt.s64.s32 %r2489,%r2488; add.u64 %r2490,%r1866,%r2486; mov.u64 %r2097,%r2487; sub.u64 %r2098,%r2490,%r2489; .loc 1 2937 5 add.u64 %r625,%r625,32; add.u64 %r626,%r626,32; setp.ne.u64 %r2492,%r625,%r631; @ %r2492 bra $L93; bra $L92; $L140: .loc 1 2936 7 mov.u64 %r2097,0; mov.u64 %r2098,%r2097; $L92: .loc 1 2939 15 st.u64 [%r629],%r2097; st.u64 [%r629+8],%r2098; .loc 1 2933 8 add.u64 %r629,%r629,16; add.u64 %r630,%r630,%r503; add.u64 %r631,%r631,%r633; setp.ne.u64 %r1870,%r629,%r642; @ %r1870 bra $L94; bra $L95; $L88: shl.b64 %r663,%r504,4; mov.u64 %r661,%r245; shl.b64 %r655,%r503,4; .loc 1 2948 4 mov.u64 %r664,0; .loc 1 2948 11 mov.u64 %r298,%r664; setp.le.s64 %r2493,%r456,0; add.u64 %r2316,%frame,64; add.u64 %r2317,%frame,80; add.u64 %r2318,%frame,96; $L98: .loc 1 2952 8 @ %r2493 bra $L141; mov.u64 %r653,%r243; shl.b64 %r1872,%r664,4; add.u64 %r656,%r1872,%r244; add.u64 %r1873,%r456,%r664; shl.b64 %r1874,%r1873,4; add.u64 %r672,%r1874,%r244; .loc 1 2951 10 mov.u64 %r2095,0; mov.u64 %r2096,%r2095; 
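// Descriptive note (editorial): the statements that follow size the reduction
// loop for this dot-product block.  They take the byte distance between the
// current and final source positions, convert it to a count of 16-byte
// elements, and branch on its parity so the 2x-unrolled accumulation body at
// $L97 runs an even number of times, with $L285 peeling one leading
// iteration.  %r2095/%r2096 hold the low/high words of a 128-bit accumulator;
// each step folds in a 16-byte product, apparently produced by a library call
// whose body is not fully legible in this dump, using the add/set.lt.u64
// carry idiom visible below.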
sub.u64 %r2379,%r672,%r656; add.u64 %r2380,%r2379,-16; shr.u64 %r2378,%r2380,4; and.b64 %r2382,%r2378,1; setp.ne.u64 %r2383,%r2382,0; @ ! %r2383 bra $L285; $L97: .loc 1 2953 26 ld.u64 %r1877,[%r653]; st.u64 [%frame+80],%r1877; ld.u64 %r1878,[%r653+8]; st.u64 [%frame+88],%r1878; ld.u64 %r1881,[%r656]; st.u64 [%frame+96],%r1881; ld.u64 %r1882,[%r656+8]; st.u64 [%frame+104],%r1882231623172312207,[%frame+64]; ld.u64 %r2208,[%frame+72]; .loc 1 2953 5 add.u64 %r1887,%r2095,%r2207; set.u32.lt.u64 %r1892,%r1887,%r2095; cvt.s64.s32 %r1890,%r1892; add.u64 %r1893,%r2096,%r2208; sub.u64 %r1896,%r1893,%r1890; .loc 1 2952 8 add.u64 %r2499,%r653,%r655; add.u64 %r2384,%r656,16; .loc 1 2953 26 ld.u64 %r2501,[%r2499]; st.u64 [%frame+80],%r2501; ld.u64 %r2502,[%r2499+8]; st.u64 [%frame+88],%r2502; ld.u64 %r2503,[%r2384]; st.u64 [%frame+96],%r2503; ld.u64 %r2504,[%r2384+8]; st.u64 [%frame+104],%r2504231623172312505,[%frame+64]; ld.u64 %r2506,[%frame+72]; .loc 1 2953 5 add.u64 %r2507,%r1887,%r2505; set.u32.lt.u64 %r2508,%r2507,%r1887; cvt.s64.s32 %r2509,%r2508; add.u64 %r2510,%r1896,%r2506; mov.u64 %r2095,%r2507; sub.u64 %r2096,%r2510,%r2509; .loc 1 2952 8 add.u64 %r653,%r2499,%r655; add.u64 %r656,%r656,32; setp.ne.u64 %r2512,%r656,%r672; @ %r2512 bra $L97; bra $L96; $L141: .loc 1 2951 10 mov.u64 %r2095,0; mov.u64 %r2096,%r2095; $L96: .loc 1 2954 25 st.u64 [%r661],%r2095; st.u64 [%r661+8],%r2096; .loc 1 2948 29 add.u64 %r298,%r298,1; .loc 1 2948 4 add.u64 %r661,%r661,%r663; add.u64 %r664,%r664,%r209; setp.ne.u64 %r1900,%r217,%r298; @ %r1900 bra $L98; bra $L1; $L85: .loc 1 2963 7 setp.le.s64 %r1901,%r217,0; @ %r1901 bra $L1; shl.b64 %r683,%r444,4; mov.u64 %r681,%r245; shl.b64 %r675,%r503,4; shl.b64 %r678,%r502,4; mov.u64 %r684,0; .loc 1 2963 14 mov.u64 %r311,%r684; setp.le.s64 %r2513,%r456,0; add.u64 %r2319,%frame,64; add.u64 %r2320,%frame,80; add.u64 %r2321,%frame,96; $L101: .loc 1 2967 4 @ %r2513 bra $L142; mov.u64 %r673,%r243; shl.b64 %r1903,%r684,4; add.u64 %r676,%r244,%r1903; .loc 1 2966 6 mov.u64 %r2101,0; mov.u64 %r2102,%r2101; .loc 1 2967 11 mov.u64 %r313,%r2101; and.b64 %r2388,%r456,1; setp.eq.u64 %r2389,%r2388,0; @ ! 
%r2389 bra $L286; $L100: .loc 1 2968 29 ld.u64 %r1906,[%r673]; st.u64 [%frame+80],%r1906; ld.u64 %r1907,[%r673+8]; st.u64 [%frame+88],%r1907; ld.u64 %r1910,[%r676]; st.u64 [%frame+96],%r1910; ld.u64 %r1911,[%r676+8]; st.u64 [%frame+104],%r191192320232211,[%frame+64]; ld.u64 %r2212,[%frame+72]; .loc 1 2968 8 add.u64 %r1916,%r2101,%r2211; set.u32.lt.u64 %r1921,%r1916,%r2101; cvt.s64.s32 %r1919,%r1921; add.u64 %r1922,%r2102,%r2212; sub.u64 %r1925,%r1922,%r1919; .loc 1 2967 4 add.u64 %r2520,%r673,%r675; add.u64 %r2521,%r676,%r678; .loc 1 2968 29 ld.u64 %r2522,[%r2520]; st.u64 [%frame+80],%r2522; ld.u64 %r2523,[%r2520+8]; st.u64 [%frame+88],%r2523; ld.u64 %r2524,[%r2521]; st.u64 [%frame+96],%r2524; ld.u64 %r2525,[%r2521+8]; st.u64 [%frame+104],%r252523192320232526,[%frame+64]; ld.u64 %r2527,[%frame+72]; .loc 1 2968 8 add.u64 %r2528,%r1916,%r2526; set.u32.lt.u64 %r2529,%r2528,%r1916; cvt.s64.s32 %r2530,%r2529; add.u64 %r2531,%r1925,%r2527; mov.u64 %r2101,%r2528; sub.u64 %r2102,%r2531,%r2530; .loc 1 2967 28 add.u64 %r313,%r313,2; .loc 1 2967 4 add.u64 %r673,%r2520,%r675; add.u64 %r676,%r2521,%r678; setp.ne.u64 %r2533,%r456,%r313; @ %r2533 bra $L100; bra $L99; $L142: .loc 1 2966 6 mov.u64 %r2101,0; mov.u64 %r2102,%r2101; $L99: .loc 1 2969 21 st.u64 [%r681],%r2101; st.u64 [%r681+8],%r2102; .loc 1 2963 32 add.u64 %r311,%r311,1; .loc 1 2963 7 add.u64 %r681,%r681,%r683; add.u64 %r684,%r684,%r209; setp.ne.u64 %r1929,%r217,%r311; @ %r1929 bra $L101; bra $L1; $L283: .loc 1 2972 11 setp.gt.s64 %r1930,%r208,%r503; @ %r1930 bra $L102; .loc 1 2993 7 setp.gt.s64 %r1931,%r217,0; @ %r1931 bra $L103; bra $L1; $L102: .loc 1 2974 7 setp.gt.s64 %r1932,%r217,0; @ %r1932 bra $L104; bra $L1; $L108: shl.b64 %r1933,%r717,4; add.u64 %r713,%r245,%r1933; .loc 1 2975 9 mov.u64 %r310,0; and.b64 %r2400,%r216,3; setp.eq.u64 %r2403,%r2400,0; @ %r2403 bra $L105; setp.eq.u64 %r2402,%r2400,1; @ %r2402 bra $L250; setp.eq.u64 %r2401,%r2400,2; @ %r2401 bra $L251; bra $L287; $L105: .loc 1 2976 34 st.u64 [%r713],%r2328; st.u64 [%r713+8],%r2328; .loc 1 2975 2 add.u64 %r2535,%r713,%r715; .loc 1 2976 34 st.u64 [%r2535],%r2328; st.u64 [%r2535+8],%r2328; .loc 1 2975 2 add.u64 %r2537,%r2535,%r715; .loc 1 2976 34 st.u64 [%r2537],%r2328; st.u64 [%r2537+8],%r2328; .loc 1 2975 2 add.u64 %r2540,%r2537,%r715; .loc 1 2976 34 st.u64 [%r2540],%r2328; st.u64 [%r2540+8],%r2328; .loc 1 2975 27 add.u64 %r310,%r310,4; .loc 1 2975 2 add.u64 %r713,%r2540,%r715; setp.ne.u64 %r2542,%r216,%r310; @ %r2542 bra $L105; $L109: .loc 1 2974 32 add.u64 %r309,%r324,1; .loc 1 2974 7 add.u64 %r717,%r717,%r504; setp.ne.u64 %r1937,%r217,%r309; @ %r1937 bra $L143; shl.b64 %r701,%r502,4; shl.b64 %r692,%r503,4; mov.u64 %r706,0; mov.u64 %r705,%r706; mov.u64 %r325,%r706; setp.le.s64 %r2543,%r456,0; add.u64 %r2325,%frame,64; add.u64 %r2326,%frame,80; add.u64 %r2327,%frame,96; bra $L107; $L104: shl.b64 %r715,%r444,4; mov.u64 %r717,0; .loc 1 2974 14 mov.u64 %r324,%r717; setp.gt.s64 %r2544,%r216,0; .loc 1 2976 34 mov.u64 %r2328,%r717; bra $L106; $L143: mov.u64 %r324,%r309; $L106: .loc 1 2975 2 @ %r2544 bra $L108; bra $L109; $L112: shl.b64 %r1939,%r697,4; add.u64 %r690,%r243,%r1939; .loc 1 2980 4 mov.u64 %r689,%r710; .loc 1 2980 11 mov.u64 %r308,0; and.b64 %r2394,%r216,1; setp.eq.u64 %r2395,%r2394,0; @ ! 
%r2395 bra $L288; $L110: .loc 1 2983 37 ld.u64 %r1942,[%r690]; st.u64 [%frame+80],%r1942; ld.u64 %r1943,[%r690+8]; st.u64 [%frame+88],%r1943; ld.u64 %r1946,[%r699]; st.u64 [%frame+96],%r1946; ld.u64 %r1947,[%r699+8]; st.u64 [%frame+104],%r194723252326232215,[%frame+64]; ld.u64 %r2216,[%frame+72]; .loc 1 2982 36 ld.u64 %r2217,[%r689]; ld.u64 %r2218,[%r689+8]; add.u64 %r1953,%r2217,%r2215; set.u32.lt.u64 %r1958,%r1953,%r2217; cvt.s64.s32 %r1956,%r1958; add.u64 %r1959,%r2218,%r2216; sub.u64 %r1962,%r1959,%r1956; st.u64 [%r689],%r1953; st.u64 [%r689+8],%r1962; .loc 1 2980 4 add.u64 %r2549,%r689,%r715; add.u64 %r2550,%r690,%r692; .loc 1 2983 37 ld.u64 %r2551,[%r2550]; st.u64 [%frame+80],%r2551; ld.u64 %r2552,[%r2550+8]; st.u64 [%frame+88],%r2552; ld.u64 %r2553,[%r699]; st.u64 [%frame+96],%r2553; ld.u64 %r2554,[%r699+8]; st.u64 [%frame+104],%r255423252326232555,[%frame+64]; ld.u64 %r2556,[%frame+72]; .loc 1 2982 36 ld.u64 %r2557,[%r2549]; ld.u64 %r2558,[%r2549+8]; add.u64 %r2559,%r2557,%r2555; set.u32.lt.u64 %r2560,%r2559,%r2557; cvt.s64.s32 %r2561,%r2560; add.u64 %r2562,%r2558,%r2556; sub.u64 %r2563,%r2562,%r2561; st.u64 [%r2549],%r2559; st.u64 [%r2549+8],%r2563; .loc 1 2980 29 add.u64 %r308,%r308,2; .loc 1 2980 4 add.u64 %r689,%r2549,%r715; add.u64 %r690,%r2550,%r692; setp.ne.u64 %r2564,%r216,%r308; @ %r2564 bra $L110; $L113: .loc 1 2979 26 add.u64 %r307,%r307,1; .loc 1 2979 2 add.u64 %r697,%r697,%r208; add.u64 %r699,%r699,%r701; setp.eq.u64 %r1966,%r456,%r307; @ %r1966 bra $L111; $L114: .loc 1 2980 4 @ %r2544 bra $L112; bra $L113; $L111: .loc 1 2978 32 add.u64 %r306,%r325,1; .loc 1 2978 7 add.u64 %r705,%r705,%r504; add.u64 %r706,%r706,%r209; setp.eq.u64 %r1968,%r324,%r325; @ %r1968 bra $L1; mov.u64 %r325,%r306; $L107: .loc 1 2979 2 @ %r2543 bra $L111; shl.b64 %r1970,%r706,4; add.u64 %r699,%r244,%r1970; shl.b64 %r1971,%r705,4; add.u64 %r710,%r245,%r1971; mov.u64 %r697,0; .loc 1 2979 9 mov.u64 %r307,%r697; bra $L114; $L103: shl.b64 %r732,%r444,4; shl.b64 %r723,%r208,4; shl.b64 %r727,%r502,4; .loc 1 2993 7 mov.u64 %r741,0; mov.u64 %r739,%r741; .loc 1 2993 14 mov.u64 %r302,%r741; setp.gt.s64 %r2565,%r216,0; setp.le.s64 %r2332,%r456,0; add.u64 %r2333,%frame,64; add.u64 %r2334,%frame,80; add.u64 %r2335,%frame,96; $L116: .loc 1 2997 4 @ %r2565 bra $L115; $L120: .loc 1 2993 32 add.u64 %r302,%r302,1; .loc 1 2993 7 add.u64 %r739,%r739,%r504; add.u64 %r741,%r741,%r209; setp.ne.u64 %r1973,%r217,%r302; @ %r1973 bra $L116; bra $L1; $L115: shl.b64 %r1974,%r739,4; add.u64 %r730,%r245,%r1974; shl.b64 %r1975,%r741,4; add.u64 %r747,%r244,%r1975; .loc 1 2997 4 mov.u64 %r734,0; .loc 1 2997 11 mov.u64 %r303,%r734; $L119: .loc 1 3001 8 @ %r2332 bra $L144; shl.b64 %r1977,%r734,4; add.u64 %r721,%r243,%r1977; mov.u64 %r725,%r747; .loc 1 3000 10 mov.u64 %r2099,0; mov.u64 %r2100,%r2099; .loc 1 3001 15 mov.u64 %r305,%r2099; and.b64 %r2408,%r456,1; setp.eq.u64 %r2409,%r2408,0; @ ! 
%r2409 bra $L289; $L118: .loc 1 3002 28 ld.u64 %r1980,[%r721]; st.u64 [%frame+80],%r1980; ld.u64 %r1981,[%r721+8]; st.u64 [%frame+88],%r1981; ld.u64 %r1984,[%r725]; st.u64 [%frame+96],%r1984; ld.u64 %r1985,[%r725+8]; st.u64 [%frame+104],%r1985233r232332221,[%frame+64]; ld.u64 %r2222,[%frame+72]; .loc 1 3002 5 add.u64 %r1990,%r2099,%r2221; set.u32.lt.u64 %r1995,%r1990,%r2099; cvt.s64.s32 %r1993,%r1995; add.u64 %r1996,%r2100,%r2222; sub.u64 %r1999,%r1996,%r1993; .loc 1 3001 8 add.u64 %r2573,%r721,%r723; add.u64 %r2574,%r725,%r727; .loc 1 3002 28 ld.u64 %r2575,[%r2573]; st.u64 [%frame+80],%r2575; ld.u64 %r2576,[%r2573+8]; st.u64 [%frame+88],%r2576; ld.u64 %r2577,[%r2574]; st.u64 [%frame+96],%r2577; ld.u64 %r2578,[%r2574+8]; st.u64 [%frame+104],%r2578233r232332579,[%frame+64]; ld.u64 %r2580,[%frame+72]; .loc 1 3002 5 add.u64 %r2581,%r1990,%r2579; set.u32.lt.u64 %r2582,%r2581,%r1990; cvt.s64.s32 %r2583,%r2582; add.u64 %r2584,%r1999,%r2580; mov.u64 %r2099,%r2581; sub.u64 %r2100,%r2584,%r2583; .loc 1 3001 32 add.u64 %r305,%r305,2; .loc 1 3001 8 add.u64 %r721,%r2573,%r723; add.u64 %r725,%r2574,%r727; setp.ne.u64 %r2586,%r456,%r305; @ %r2586 bra $L118; bra $L117; $L144: .loc 1 3000 10 mov.u64 %r2099,0; mov.u64 %r2100,%r2099; $L117: .loc 1 3003 27 st.u64 [%r730],%r2099; st.u64 [%r730+8],%r2100; .loc 1 2997 29 add.u64 %r303,%r303,1; .loc 1 2997 4 add.u64 %r730,%r730,%r732; add.u64 %r734,%r734,%r503; setp.ne.u64 %r2003,%r216,%r303; @ %r2003 bra $L119; bra $L120; $L32: .loc 1 2920 11 setp.eq.u64 %r2004,%r208,1; @ %r2004 bra $L121; bra $L20; $L2: .loc 1 2476 6 ld.u64 %r2005,[%r748]; setp.ne.u64 %r2006,%r2005,0; @ %r2006 bra $L122; .loc 1 2485 4 ld.u64 %r2008,[%r749+56]; add.u64 %r2007,%r2008,1; ld.u64 %r2009,[%r749+48]; sub.u64 %r508,%r2007,%r2009; .loc 1 2483 15 ld.s8 %r2011,[%r750+28]; cvt.u16.u32 %r2010,%r2011; setp.eq.u16 %r2012,%r2010,1; @ %r2012 bra $L123; bra $L124; $L126: .loc 1 2560 16 ld.u64 %r503,[%r749+40]; .loc 1 2548 27 mov.u64 %r444,%r504; bra $L125; $L14: ld.u64 %r504,[%r748+40]; .loc 1 2543 6 ld.s8 %r2014,[%r748+28]; cvt.u16.u32 %r2013,%r2014; setp.eq.u16 %r2015,%r2013,1; @ ! %r2015 bra $L127; bra $L126; $L11: .loc 1 2560 16 ld.u64 %r503,[%r749+40]; mov.u32 %r446,%r331; .loc 1 2548 27 mov.u64 %r504,%r444; .loc 1 2563 14 mov.u64 %r216,1; .loc 1 2561 16 mov.u64 %r208,%r216; bra $L12; $L5: .loc 1 2485 4 ld.u64 %r2017,[%r749+56]; add.u64 %r2016,%r2017,1; ld.u64 %r2018,[%r749+48]; sub.u64 %r508,%r2016,%r2018; bra $L124; $L128: .loc 1 2509 17 ld.u64 %r2020,[%r748+56]; add.u64 %r2019,%r2020,1; .loc 1 2509 15 ld.u64 %r2021,[%r748+48]; sub.u64 %r456,%r2019,%r2021; ld.s8 %r2232,[%r750+28]; bra $L9; $L122: .loc 1 2502 12 cvta.global.u64 %r2022,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r2023,[%r2022+36]; setp.ne.u32 %r2024,%r2023,0; @ ! 
%r2024 bra $L8; bra $L128; $L1$L282: .loc 1 2908 13 ld.u64 %r2587,[%r332]; st.u64 [%frame+80],%r2587; ld.u64 %r2588,[%r332+8]; st.u64 [%frame+88],%r2588; ld.u64 %r2589,[%r400]; st.u64 [%frame+96],%r2589; ld.u64 %r2590,[%r400+8]; st.u64 [%frame+104],%r2592305230623072591,[%frame+64]; ld.u64 %r2592,[%frame+72]; .loc 1 2907 11 add.u64 %r2593,%r2045,%r2591; set.u32.lt.u64 %r2594,%r2593,%r2045; cvt.s64.s32 %r2595,%r2594; add.u64 %r2596,%r2046,%r2592; mov.u64 %r2045,%r2593; sub.u64 %r2046,%r2596,%r2595; .loc 1 2905 10 add.u64 %r332,%r332,16; add.u64 %r297,%r400,16; setp.ne.u64 %r2598,%r398,%r297; @ %r2598 bra $L83; bra $L82; $L281: .loc 1 2762 43 ld.u64 %r2599,[%r556]; st.u64 [%r330],%r2599; ld.u64 %r2600,[%r556+8]; st.u64 [%r330+8],%r2600; .loc 1 2760 9 add.u64 %r57,%r556,16; add.u64 %r73,%r330,4096; $L249: .loc 1 2762 43 ld.u64 %r2602,[%r57]; st.u64 [%r73],%r2602; ld.u64 %r2603,[%r57+8]; st.u64 [%r73+8],%r2603; .loc 1 2760 9 add.u64 %r57,%r57,16; add.u64 %r73,%r73,4096; $L248: .loc 1 2762 43 ld.u64 %r2605,[%r57]; st.u64 [%r73],%r2605; ld.u64 %r2606,[%r57+8]; st.u64 [%r73+8],%r2606; .loc 1 2760 9 add.u64 %r57,%r57,16; add.u64 %r73,%r73,4096; setp.ne.u64 %r2607,%r57,%r136; @ %r2607 bra $L58; bra $L48; $L280: .loc 1 2741 7 add.u64 %r2608,%r536,%r539; add.u64 %r2609,%r2608,%r540; .loc 1 2740 49 ld.u64 %r2610,[%r2609]; st.u64 [%r229],%r2610; ld.u64 %r2611,[%r2609+8]; st.u64 [%r229+8],%r2611; .loc 1 2742 49 ld.u64 %r2612,[%r536]; st.u64 [%r229+16],%r2612; ld.u64 %r2613,[%r536+8]; st.u64 [%r229+24],%r2613; .loc 1 2745 7 add.u64 %r2614,%r536,%r540; add.u64 %r2615,%r2614,%r539; .loc 1 2744 49 ld.u64 %r2616,[%r2615+16]; st.u64 [%r229+4096],%r2616; ld.u64 %r2617,[%r2615+24]; st.u64 [%r229+4104],%r2617; .loc 1 2746 49 ld.u64 %r2618,[%r536+16]; st.u64 [%r229+4112],%r2618; ld.u64 %r2619,[%r536+24]; st.u64 [%r229+4120],%r2619; .loc 1 2738 9 add.u64 %r229,%r229,8192; add.u64 %r211,%r536,32; setp.ne.u64 %r2620,%r211,%r527; @ %r2620 bra $L52; bra $L53; $L278: .loc 1 2688 2 @ ! %r2234 bra $L290; .loc 1 2689 22 mov.u32 %r2621262622,[%value_in]; } $L290: .loc 1 2687 24 mov.u64 %r296,2; .loc 1 2687 7 add.u64 %r620,%r245,%r621; $L247: .loc 1 2688 2 @ ! %r2234 bra $L291; .loc 1 2689 22 mov.u32 %r262426242625,[%value_in]; } $L291: .loc 1 2687 24 add.u64 %r296,%r296,1; .loc 1 2687 7 add.u64 %r620,%r620,%r621; $L246: .loc 1 2688 2 @ ! %r2234 bra $L292; .loc 1 2689 22 mov.u32 %r262726272628,[%value_in]; } $L292: .loc 1 2687 24 add.u64 %r296,%r296,1; .loc 1 2687 7 add.u64 %r620,%r620,%r621; setp.eq.u64 %r2629,%r296,%r624; @ ! %r2629 bra $L34; bra $L33; $L214: .loc 1 2689 22 mov.u32 %r2630241262631,[%value_in]; } $L279: .loc 1 2687 7 add.u64 %r2633,%r2412,%r621; .loc 1 2688 2 @ ! %r2234 bra $L293; .loc 1 2689 22 mov.u32 %r2635263262636,[%value_in]; } $L293: .loc 1 2687 7 add.u64 %r2638,%r2633,%r621; .loc 1 2688 2 @ ! %r2234 bra $L294; .loc 1 2689 22 mov.u32 %r264026362641,[%value_in]; } $L294: .loc 1 2687 24 add.u64 %r296,%r2369,3; .loc 1 2687 7 add.u64 %r620,%r2638,%r621; setp.eq.u64 %r2642,%r296,%r624; @ ! 
%r2642 bra $L34; bra $L33; $L284: .loc 1 2938 23 ld.u64 %r2643,[%r625]; st.u64 [%frame+80],%r2643; ld.u64 %r2644,[%r625+8]; st.u64 [%frame+88],%r2644; ld.u64 %r2645,[%r652]; st.u64 [%frame+96],%r2645; ld.u64 %r2646,[%r652+8]; st.u64 [%frame+104],%r26431r2314231.loc 1 2938 9 ld.u64 %r2097,[%frame+64]; ld.u64 %r2098,[%frame+72]; .loc 1 2937 5 add.u64 %r625,%r625,16; add.u64 %r626,%r652,16; setp.ne.u64 %r2654,%r625,%r631; @ %r2654 bra $L93; bra $L92; $L285: .loc 1 2953 26 ld.u64 %r2655,[%r243]; st.u64 [%frame+80],%r2655; ld.u64 %r2656,[%r243+8]; st.u64 [%frame+88],%r2656; ld.u64 %r2657,[%r656]; st.u64 [%frame+96],%r2657; ld.u64 %r2658,[%r656+8]; st.u64 [%frame+104],%r265823162317231.loc 1 2953 5 ld.u64 %r2095,[%frame+64]; ld.u64 %r2096,[%frame+72]; .loc 1 2952 8 add.u64 %r653,%r243,%r655; add.u64 %r656,%r656,16; setp.ne.u64 %r2666,%r656,%r672; @ %r2666 bra $L97; bra $L96; $L286: .loc 1 2968 29 ld.u64 %r2667,[%r243]; st.u64 [%frame+80],%r2667; ld.u64 %r2668,[%r243+8]; st.u64 [%frame+88],%r2668; ld.u64 %r2669,[%r676]; st.u64 [%frame+96],%r2669; ld.u64 %r2670,[%r676+8]; st.u64 [%frame+104],%r2672319232023.loc 1 2968 8 ld.u64 %r2101,[%frame+64]; ld.u64 %r2102,[%frame+72]; .loc 1 2967 28 mov.u64 %r313,1; .loc 1 2967 4 add.u64 %r673,%r243,%r675; add.u64 %r676,%r676,%r678; setp.ne.u64 %r2678,%r456,%r313; @ %r2678 bra $L100; bra $L99; $L288: .loc 1 2983 37 ld.u64 %r2679,[%r690]; st.u64 [%frame+80],%r2679; ld.u64 %r2680,[%r690+8]; st.u64 [%frame+88],%r2680; ld.u64 %r2681,[%r699]; st.u64 [%frame+96],%r2681; ld.u64 %r2682,[%r699+8]; st.u64 [%frame+104],%r268223252326232683,[%frame+64]; ld.u64 %r2684,[%frame+72]; .loc 1 2982 36 ld.u64 %r2685,[%r710]; ld.u64 %r2686,[%r710+8]; add.u64 %r2687,%r2685,%r2683; set.u32.lt.u64 %r2688,%r2687,%r2685; cvt.s64.s32 %r2689,%r2688; add.u64 %r2690,%r2686,%r2684; sub.u64 %r2691,%r2690,%r2689; st.u64 [%r710],%r2687; st.u64 [%r710+8],%r2691; .loc 1 2980 29 mov.u64 %r308,1; .loc 1 2980 4 add.u64 %r689,%r710,%r715; add.u64 %r690,%r690,%r692; setp.ne.u64 %r2692,%r216,%r308; @ %r2692 bra $L110; bra $L113; $L287: .loc 1 2976 34 st.u64 [%r713],%r2328; st.u64 [%r713+8],%r2328; .loc 1 2975 27 mov.u64 %r310,1; .loc 1 2975 2 add.u64 %r713,%r713,%r715; $L251: .loc 1 2976 34 st.u64 [%r713],%r2328; st.u64 [%r713+8],%r2328; .loc 1 2975 27 add.u64 %r310,%r310,1; .loc 1 2975 2 add.u64 %r713,%r713,%r715; $L250: .loc 1 2976 34 st.u64 [%r713],%r2328; st.u64 [%r713+8],%r2328; .loc 1 2975 27 add.u64 %r310,%r310,1; .loc 1 2975 2 add.u64 %r713,%r713,%r715; setp.ne.u64 %r2695,%r216,%r310; @ %r2695 bra $L105; bra $L109; $L289: .loc 1 3002 28 ld.u64 %r2696,[%r721]; st.u64 [%frame+80],%r2696; ld.u64 %r2697,[%r721+8]; st.u64 [%frame+88],%r2697; ld.u64 %r2698,[%r747]; st.u64 [%frame+96],%r2698; ld.u64 %r2699,[%r747+8]; st.u64 [%frame+104],%r269233r23233.loc 1 3002 5 ld.u64 %r2099,[%frame+64]; ld.u64 %r2100,[%frame+72]; .loc 1 3001 32 mov.u64 %r305,1; .loc 1 3001 8 add.u64 %r721,%r721,%r723; add.u64 %r725,%r747,%r727; setp.ne.u64 %r2707,%r456,%r305; @ %r2707 bra $L118; bra $L117; } matmul_r4.o/, .param .u32 %in_ar4, .param .u64 %in_ar5); .file 1 "../../../../libgfortran/generated/matmul_r4.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]FUNCTION DECL: _gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime___func__$0 .const .align 1 .u8 __func__$0[10] = {109,97,116,109,117,108,95,114,52,0 }2] = {782] = {672] = 
{8471,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,97,41,32,61,61,32,50,32,124,124,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,98,41,32,61,61,3246116,114,97,110,47,103,101,110,101,114,97,116,101,100,47,109,97,116,109,117,108,95,114,52,465,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,49,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3265,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,50,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3273,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,97,114,103,117,109,101,110,116,32,66,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,49,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }103,101,109,109,32,33,61,32,78,85,76,7_gfortran_matmul_r4 .visible .func _gfortran_matmul_r89f32 %r240; .reg .f32 %r241; .reg .f32 %r242; .reg .f32 %r243; .reg .f32 %r244; .reg .u64 %r245; .reg .f32 %r246; .reg .f32 %r247; .reg .f32 %r248; .reg .f32 %r249; .reg .u64 %r250; .reg .f32 %r251; .reg .f32 %r252; .reg .f32 %r253; .reg .f32 %r254; .reg .f32 %r255; .reg .f32 %r256; .reg .f32 %r257; .reg .f32 %r258; .reg .f32 %r259; .reg .f32 %r260; .reg .f32 %r261; .reg .f32 %r262; .reg .f32 %r263; .reg .f32 %r264; .reg .f32 %r265; .reg .f32 %r266; .reg .u64f32u64 %r280; .reg .u64 %r281; .reg .u64 %r282; .reg .u64 %r283; .reg .f32u64 %r311; .reg .u64 %r313; .reg .u64u64 %r323; .reg .u64 %r328; .reg .u64 %r331; .reg .u64u64 %r391; .reg .u64 %r397; .reg .u64u64u32 %r473; .reg .u64 %r474; .reg .u64 %r475; .reg .u64 %r476; .reg .u64u64 %r511; .reg .u64 %r512; .reg .u64 %r513; .reg .u64u64u64u64 %r571; .reg .u64 %r572; .reg .u64 %r573; .reg .u64 %r574; .reg .u64 %r575; .reg .u64 %r577; .reg .u64u64 %r591; .reg .u64u64 %r602; .reg .u64 %r604; .reg .u64 %r605; .reg .u64 %r611; .reg .u64 %r613; .reg .u64 %r614; .reg .u64 %r618; .reg .u64 %r623; .reg .u64 %r624; .reg .u64 %r626; .reg .u64 %r627; .reg .u64 %r632; .reg .u64 %r634; .reg .u64 %r635; .reg .u64 %r643; .reg .u64 %r644; .reg .u64 %r646; .reg .u64 %r647; .reg .u64 %r649; .reg .u64 %r652; .reg .u64 %r654; .reg .u64 %r655; .reg .u64 %r660; .reg .u64 %r661; .reg .u64 %r663; .reg .u64 %r668; .reg .u64 %r670; .reg .u64 %r672; .reg .u64 %r676; .reg .u64 %r677; .reg .u64 %r681; .reg .u64 %r684; .reg .u64 %r686; .reg .u64 %r688; .reg .u64 %r692; .reg .u64 %r694; .reg .u64 %r696; .reg .u64 %r698; .reg .u64 %r701; .reg .u64 %r703; .reg .u64 %r705; .reg .u64 %r710; .reg .u64 %r712; .reg .u64 %r718; .reg .u64 %r719; .reg .u64 %r720; .reg .u64 %r721; .reg .u32 %r722; .reg .u32 %r723; .reg .u64 %r724; .reg .u16 %r725; .reg .pred %r726; .reg .u16 %r727; .reg .pred %r729; .reg .u64 %r730; .reg .u32 %r731; .reg .u64 %r732; .reg .u64 %r733; .reg .u64 %r734; .reg .pred %r735; .reg .pred %r737; .reg .u64 %r740; .reg .u64 %r742; .reg .u64 %r743; .reg .u64 %r744; .reg .u64 %r746; .reg .u64 %r747; .reg .u64 %r748; .reg .u64 %r749; .reg .u64 %r750; .reg .u64 %r753; .reg .u64 %r755; .reg .u64 %r756; .reg .u64 %r758; .reg .u64 %r760; .reg .u64 %r761; .reg .u64 %r763; .reg .u64 %r764; .reg .u64 %r765; .reg .u64 %r766; .reg .u64 %r767; .reg .u32 %r768; .reg .pred %r769; .reg .u64 %r770; .reg .u64 %r771; .reg .u64 %r772; .reg .pred %r774; .reg .u64 %r775; .reg .u64 %r776; .reg .u64 %r777; .reg .pred 
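// Descriptive note (editorial): this long run of .reg declarations belongs to
// _gfortran_matmul_r4.  The nvptx back end emits one virtual .reg per pseudo
// and defers real register allocation to ptxas; .pred registers hold branch
// conditions, while the .f32 and .u64 registers carry REAL(4) data and
// descriptor/stride arithmetic respectively.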
%r778; .reg .u64 %r779; .reg .u64 %r780; .reg .u64 %r781; .reg .u16 %r782; .reg .pred %r783; .reg .u64 %r784; .reg .u64 %r786; .reg .u64 %r787; .reg .u64 %r788; .reg .u16 %r789; .reg .pred %r791; .reg .pred %r792; .reg .u64 %r793; .reg .pred %r795; .reg .u64 %r796; .reg .u64 %r798; .reg .u64 %r799; .reg .u64 %r800; .reg .u64 %r801; .reg .u64 %r802; .reg .u64 %r803; .reg .pred %r804; .reg .u64 %r805; .reg .u64 %r807; .reg .u64 %r808; .reg .u64 %r809; .reg .u16 %r810; .reg .u32 %r811; .reg .pred %r812; .reg .u16 %r813; .reg .pred %r814; .reg .u64 %r815; .reg .u64 %r816; .reg .u64 %r817; .reg .u64 %r818; .reg .u64 %r819; .reg .u64 %r820; .reg .pred %r821; .reg .u32 %r823; .reg .u32 %r824; .reg .u32 %r826; .reg .u32 %r827; .reg .u16 %r828; .reg .u16 %r829; .reg .u16 %r830; .reg .u32 %r831; .reg .u16 %r832; .reg .pred %r833; .reg .u64 %r834; .reg .u16 %r836; .reg .pred %r837; .reg .u64 %r838; .reg .u64 %r839; .reg .u64 %r840; .reg .u32 %r841; .reg .u32 %r842; .reg .u32 %r843; .reg .u32 %r844; .reg .u32 %r845; .reg .u32 %r846; .reg .u32 %r848; .reg .u32 %r849; .reg .u16 %r850; .reg .u16 %r851; .reg .u16 %r852; .reg .u32 %r853; .reg .u16 %r854; .reg .pred %r855; .reg .u32 %r857; .reg .u32 %r858; .reg .u16 %r859; .reg .u16 %r860; .reg .u16 %r861; .reg .u32 %r862; .reg .u16 %r863; .reg .pred %r864; .reg .u32 %r866; .reg .u32 %r867; .reg .u32 %r869; .reg .u32 %r870; .reg .u16 %r871; .reg .u16 %r872; .reg .u16 %r873; .reg .u32 %r874; .reg .u16 %r875; .reg .pred %r876; .reg .f32 %r877; .reg .f32 %r878; .reg .f32 %r879; .reg .f32 %r880; .reg .f32 %r881; .reg .f32 %r882; .reg .f32 %r883; .reg .pred %r884; .reg .f32 %r885; .reg .f32 %r886; .reg .pred %r887; .reg .pred %r888; .reg .u32 %r890; .reg .u32 %r891; .reg .u32 %r893; .reg .u32 %r894; .reg .u16 %r895; .reg .u16 %r896; .reg .u16 %r897; .reg .u32 %r898; .reg .u16 %r899; .reg .pred %r900; .reg .u32 %r902; .reg .u32 %r903; .reg .u32 %r905; .reg .u32 %r906; .reg .u16 %r907; .reg .u16 %r908; .reg .u16 %r909; .reg .u32 %r910; .reg .u16 %r911; .reg .pred %r912; .reg .u32 %r914; .reg .u32 %r915; .reg .u32 %r917; .reg .u32 %r918; .reg .u16 %r919; .reg .u16 %r920; .reg .u16 %r921; .reg .u32 %r922; .reg .u16 %r923; .reg .pred %r924; .reg .pred %r925; .reg .u64 %r926; .reg .u32 %r927; .reg .u64 %r928; .reg .u64 %r929; .reg .u32 %r930; .reg .pred %r931; .reg .pred %r932; .reg .u32 %r933; .reg .pred %r934; .reg .pred %r935; .reg .u32 %r950; .reg .u64 %r951; .reg .u64 %r952; .reg .u64 %r953; .reg .u64 %r954; .reg .u64 %r955; .reg .u64 %r956; .reg .u64 %r957; .reg .u16 %r958; .reg .u16 %r959; .reg .u16 %r960; .reg .u32 %r961; .reg .u16 %r962; .reg .pred %r963; .reg .pred %r964; .reg .u16 %r965; .reg .pred %r966; .reg .u64 %r967; .reg .pred %r968; .reg .u32 %r972; .reg .u64 %r974; .reg .u32 %r979; .reg .u32 %r980; .reg .u32 %r982; .reg .u32 %r983; .reg .u16 %r984; .reg .u16 %r985; .reg .u16 %r986; .reg .u32 %r988; .reg .u32 %r989; .reg .u16 %r991; .reg .u16 %r993; .reg .u32 %r994; .reg .u16 %r995; .reg .pred %r996; .reg .pred %r997; .reg .u64 %r998; .reg .u64 %r1000; .reg .u64 %r1001; .reg .u64 %r1002; .reg .pred %r1004; .reg .u64 %r1006; .reg .u64 %r1007; .reg .u64 %r1010; .reg .u64 %r1011; .reg .u64 %r1015; .reg .u64 %r1016; .reg .u64 %r1017; .reg .u64 %r1019; .reg .u64 %r1020; .reg .u64 %r1023; .reg .u64 %r1026; .reg .u64 %r1027; .reg .u64 %r1030; .reg .u64 %r1032; .reg .u64 %r1033; .reg .u64 %r1034; .reg .u64 %r1035; .reg .u64 %r1036; .reg .pred %r1038; .reg .u64 %r1044; .reg .u64 %r1045; .reg .u64 %r1046; .reg .u64 %r1047; .reg .pred %r1049; .reg .u64 %r1050; 
.reg .u64 %r1051; .reg .u64 %r1052; .reg .u64 %r1053; .reg .u64 %r1054; .reg .u64 %r1056; .reg .u64 %r1057; .reg .u64 %r1058; .reg .u64 %r1059; .reg .u64 %r1060; .reg .u64 %r1061; .reg .u64 %r1062; .reg .u64 %r1063; .reg .u64 %r1067; .reg .u64 %r1068; .reg .u64 %r1069; .reg .u64 %r1070; .reg .u64 %r1074; .reg .u64 %r1076; .reg .u64 %r1079; .reg .u64 %r1082; .reg .u64 %r1087; .reg .u64 %r1088; .reg .u64 %r1090; .reg .u64 %r1091; .reg .f32 %r1092; .reg .f32 %r1093; .reg .u64 %r1094; .reg .u64 %r1095; .reg .f32 %r1096; .reg .f32 %r1097; .reg .u64 %r1099; .reg .u64 %r1100; .reg .f32 %r1101; .reg .u64 %r1102; .reg .f32 %r1103; .reg .pred %r1104; .reg .u64 %r1107; .reg .u64 %r1108; .reg .u64 %r1109; .reg .u64 %r1110; .reg .u64 %r1112; .reg .pred %r1113; .reg .u64 %r1114; .reg .u64 %r1115; .reg .f32 %r1116; .reg .u64 %r1119; .reg .u64 %r1120; .reg .u64 %r1121; .reg .u64 %r1122; .reg .u64 %r1123; .reg .u64 %r1124; .reg .u64 %r1125; .reg .u64 %r1126; .reg .u64 %r1127; .reg .u64 %r1129; .reg .u64 %r1130; .reg .u64 %r1131; .reg .u64 %r1132; .reg .u64 %r1137; .reg .u64 %r1138; .reg .u64 %r1139; .reg .u64 %r1140; .reg .u64 %r1141; .reg .u64 %r1142; .reg .pred %r1144; .reg .pred %r1145; .reg .u64 %r1149; .reg .f32 %r1150; .reg .f32 %r1151; .reg .f32 %r1152; .reg .f32 %r1153; .reg .pred %r1155; .reg .pred %r1156; .reg .pred %r1157; .reg .u64 %r1158; .reg .u64 %r1159; .reg .u64 %r1161; .reg .u64 %r1162; .reg .u64 %r1163; .reg .u64 %r1164; .reg .u64 %r1172; .reg .u64 %r1173; .reg .u64 %r1174; .reg .u64 %r1176; .reg .f32 %r1177; .reg .f32 %r1178; .reg .u64 %r1179; .reg .f32 %r1180; .reg .u64 %r1181; .reg .f32 %r1182; .reg .pred %r1184; .reg .pred %r1185; .reg .u64 %r1188; .reg .f32 %r1189; .reg .f32 %r1190; .reg .pred %r1192; .reg .u16 %r1193; .reg .pred %r1194; .reg .u32 %r1196; .reg .u32 %r1197; .reg .u16 %r1198; .reg .u16 %r1199; .reg .u32 %r1201; .reg .u16 %r1202; .reg .pred %r1203; .reg .pred %r1204; .reg .u16 %r1205; .reg .pred %r1206; .reg .pred %r1207; .reg .pred %r1208; .reg .u64 %r1209; .reg .u64 %r1210; .reg .pred %r1212; .reg .u64 %r1213; .reg .u64 %r1214; .reg .u64 %r1216; .reg .f32 %r1217; .reg .f32 %r1218; .reg .pred %r1220; .reg .u64 %r1222; .reg .u64 %r1223; .reg .u64 %r1224; .reg .f32 %r1225; .reg .f32 %r1226; .reg .pred %r1228; .reg .pred %r1229; .reg .u64 %r1231; .reg .f32 %r1232; .reg .f32 %r1233; .reg .pred %r1235; .reg .pred %r1236; .reg .pred %r1237; .reg .pred %r1238; .reg .u64 %r1239; .reg .pred %r1242; .reg .u64 %r1244; .reg .f32 %r1245; .reg .f32 %r1246; .reg .f32 %r1247; .reg .pred %r1249; .reg .pred %r1251; .reg .u64 %r1253; .reg .u64 %r1254; .reg .pred %r1256; .reg .u64 %r1257; .reg .u64 %r1258; .reg .u64 %r1260; .reg .f32 %r1261; .reg .f32 %r1262; .reg .pred %r1264; .reg .pred %r1265; .reg .u64 %r1266; .reg .pred %r1267; .reg .u64 %r1268; .reg .u64 %r1269; .reg .u64 %r1270; .reg .u16 %r1271; .reg .u32 %r1272; .reg .pred %r1273; .reg .u16 %r1274; .reg .u32 %r1275; .reg .pred %r1276; .reg .u64 %r1277; .reg .u64 %r1278; .reg .u64 %r1279; .reg .u64 %r1280; .reg .u64 %r1281; .reg .u64 %r1282; .reg .u64 %r1283; .reg .u32 %r1284; .reg .pred %r1285; .reg .u32 %r1286; .reg .u64 %r1287; .reg .pred %r1288; .reg .pred %r1293; .reg .pred %r1294; .reg .pred %r1295; .reg .pred %r1296; .reg .pred %r1299; .reg .pred %r1300; .reg .pred %r1301; .reg .pred %r1302; .reg .pred %r1303; .reg .u64 %r1305; .reg .u64 %r1312; .reg .u64 %r1313; .reg .u64 %r1314; .reg .u64 %r1315; .reg .u64 %r1320; .reg .u64 %r1321; .reg .u64 %r1322; .reg .u64 %r1323; .reg .u64 %r1325; .reg .u64 %r1326; .reg .u64 %r1327; 
.reg .u64 %r1328; .reg .u64 %r1329; .reg .pred %r1330; .reg .u64 %r1332; .reg .pred %r1334; .reg .u64 %r1337; .reg .u64 %r1338; .reg .u64 %r1339; .reg .pred %r1340; .reg .u64 %r1341; .reg .u64 %r1342; .reg .pred %r1346; .reg .f32 %r1347; .reg .pred %r1348; .reg .u64 %r1349; .reg .u64 %r1350; .reg .u64 %r1351; .reg .u64 %r1352; .reg .u64 %r1353; .reg .pred %r1354; .reg .pred %r1355; .reg .pred %r1356; .reg .u64 %r1359; .reg .u64 %r1360; .reg .u64 %r1361; .reg .u64 %r1362; .reg .u64 %r1363; .reg .pred %r1364; .reg .pred %r1365; .reg .pred %r1366; .reg .u64 %r1368; .reg .u64 %r1369; .reg .u64 %r1370; .reg .u64 %r1371; .reg .u64 %r1372; .reg .u64 %r1373; .reg .pred %r1374; .reg .pred %r1375; .reg .pred %r1376; .reg .u64 %r1382; .reg .u64 %r1383; .reg .u64 %r1384; .reg .u64 %r1386; .reg .pred %r1387; .reg .u64 %r1388; .reg .u64 %r1389; .reg .u64 %r1390; .reg .u64 %r1391; .reg .u64 %r1392; .reg .u64 %r1393; .reg .u64 %r1394; .reg .pred %r1395; .reg .pred %r1396; .reg .pred %r1397; .reg .u64 %r1400; .reg .u64 %r1401; .reg .u64 %r1402; .reg .u64 %r1403; .reg .u64 %r1404; .reg .pred %r1405; .reg .pred %r1406; .reg .pred %r1407; .reg .u64 %r1408; .reg .u64 %r1409; .reg .u64 %r1413; .reg .pred %r1414; .reg .pred %r1415; .reg .pred %r1416; .reg .u64 %r1417; .reg .u64 %r1418; .reg .u64 %r1419; .reg .u64 %r1420; .reg .u64 %r1421; .reg .u64 %r1422; .reg .pred %r1423; .reg .pred %r1424; .reg .pred %r1425; .reg .u64 %r1428; .reg .u64 %r1429; .reg .u64 %r1430; .reg .u64 %r1431; .reg .u64 %r1432; .reg .pred %r1433; .reg .pred %r1434; .reg .pred %r1435; .reg .u64 %r1440; .reg .pred %r1441; .reg .pred %r1442; .reg .pred %r1443; .reg .u64 %r1448; .reg .pred %r1449; .reg .pred %r1450; .reg .pred %r1451; .reg .u64 %r1456; .reg .pred %r1457; .reg .pred %r1458; .reg .pred %r1459; .reg .u64 %r1464; .reg .pred %r1465; .reg .pred %r1466; .reg .pred %r1467; .reg .u64 %r1470; .reg .u64 %r1471; .reg .u64 %r1473; .reg .u64 %r1474; .reg .u64 %r1477; .reg .u64 %r1478; .reg .f32 %r1479; .reg .f32 %r1480; .reg .u64 %r1481; .reg .u64 %r1482; .reg .f32 %r1483; .reg .f32 %r1484; .reg .u64 %r1485; .reg .u64 %r1486; .reg .u64 %r1488; .reg .u64 %r1489; .reg .f32 %r1490; .reg .f32 %r1491; .reg .u64 %r1492; .reg .u64 %r1493; .reg .f32 %r1494; .reg .f32 %r1495; .reg .u64 %r1496; .reg .u64 %r1497; .reg .u64 %r1499; .reg .u64 %r1500; .reg .f32 %r1501; .reg .f32 %r1502; .reg .u64 %r1503; .reg .u64 %r1504; .reg .f32 %r1505; .reg .f32 %r1506; .reg .pred %r1507; .reg .f32 %r1510; .reg .f32 %r1514; .reg .f32 %r1518; .reg .pred %r1519; .reg .f32 %r1520; .reg .f32 %r1521; .reg .f32 %r1522; .reg .f32 %r1523; .reg .f32 %r1524; .reg .f32 %r1525; .reg .f32 %r1526; .reg .f32 %r1527; .reg .f32 %r1528; .reg .f32 %r1529; .reg .f32 %r1530; .reg .f32 %r1531; .reg .f32 %r1532; .reg .f32 %r1533; .reg .f32 %r1534; .reg .f32 %r1535; .reg .u64 %r1538; .reg .u64 %r1539; .reg .f32 %r1540; .reg .f32 %r1541; .reg .f32 %r1542; .reg .u64 %r1543; .reg .u64 %r1544; .reg .f32 %r1545; .reg .u64 %r1546; .reg .f32 %r1547; .reg .u64 %r1548; .reg .f32 %r1549; .reg .u64 %r1550; .reg .f32 %r1551; .reg .u64 %r1552; .reg .f32 %r1553; .reg .pred %r1554; .reg .f32 %r1555; .reg .f32 %r1556; .reg .f32 %r1557; .reg .f32 %r1558; .reg .f32 %r1564; .reg .f32 %r1565; .reg .f32 %r1566; .reg .f32 %r1567; .reg .f32 %r1568; .reg .f32 %r1569; .reg .f32 %r1570; .reg .f32 %r1571; .reg .f32 %r1572; .reg .f32 %r1579; .reg .f32 %r1580; .reg .f32 %r1581; .reg .f32 %r1582; .reg .f32 %r1583; .reg .f32 %r1584; .reg .f32 %r1585; .reg .f32 %r1586; .reg .f32 %r1587; .reg .f32 %r1594; .reg .f32 %r1595; 
.reg .f32 %r1596; .reg .f32 %r1597; .reg .f32 %r1598; .reg .pred %r1599; .reg .pred %r1600; .reg .f32 %r1601; .reg .f32 %r1602; .reg .f32 %r1603; .reg .f32 %r1604; .reg .u64 %r1607; .reg .f32 %r1608; .reg .u64 %r1609; .reg .f32 %r1610; .reg .f32 %r1611; .reg .f32 %r1612; .reg .f32 %r1613; .reg .u64 %r1614; .reg .f32 %r1615; .reg .f32 %r1616; .reg .u64 %r1617; .reg .f32 %r1618; .reg .f32 %r1619; .reg .u64 %r1621; .reg .u64 %r1623; .reg .f32 %r1624; .reg .u64 %r1625; .reg .f32 %r1626; .reg .f32 %r1627; .reg .f32 %r1628; .reg .f32 %r1629; .reg .u64 %r1630; .reg .f32 %r1631; .reg .f32 %r1632; .reg .u64 %r1633; .reg .f32 %r1634; .reg .f32 %r1635; .reg .u64 %r1637; .reg .u64 %r1639; .reg .f32 %r1640; .reg .u64 %r1641; .reg .f32 %r1642; .reg .f32 %r1643; .reg .u64 %r1644; .reg .f32 %r1645; .reg .u64 %r1646; .reg .f32 %r1647; .reg .pred %r1648; .reg .f32 %r1649; .reg .f32 %r1652; .reg .f32 %r1653; .reg .f32 %r1654; .reg .f32 %r1658; .reg .f32 %r1659; .reg .f32 %r1660; .reg .f32 %r1664; .reg .f32 %r1665; .reg .pred %r1666; .reg .pred %r1667; .reg .f32 %r1668; .reg .f32 %r1671; .reg .f32 %r1672; .reg .f32 %r1673; .reg .f32 %r1677; .reg .f32 %r1678; .reg .f32 %r1679; .reg .f32 %r1683; .reg .f32 %r1684; .reg .pred %r1685; .reg .pred %r1686; .reg .f32 %r1687; .reg .u64 %r1688; .reg .f32 %r1690; .reg .f32 %r1691; .reg .f32 %r1692; .reg .u64 %r1693; .reg .f32 %r1696; .reg .f32 %r1697; .reg .f32 %r1698; .reg .u64 %r1699; .reg .f32 %r1702; .reg .f32 %r1703; .reg .pred %r1704; .reg .pred %r1705; .reg .f32 %r1706; .reg .u64 %r1708; .reg .u64 %r1709; .reg .f32 %r1710; .reg .f32 %r1711; .reg .f32 %r1712; .reg .u64 %r1714; .reg .u64 %r1715; .reg .f32 %r1717; .reg .f32 %r1718; .reg .f32 %r1719; .reg .u64 %r1721; .reg .u64 %r1722; .reg .f32 %r1724; .reg .f32 %r1725; .reg .pred %r1726; .reg .u64 %r1728; .reg .u64 %r1730; .reg .u64 %r1733; .reg .pred %r1735; .reg .pred %r1736; .reg .pred %r1737; .reg .u64 %r1739; .reg .u64 %r1740; .reg .f32 %r1741; .reg .f32 %r1742; .reg .f32 %r1743; .reg .f32 %r1744; .reg .u64 %r1746; .reg .u64 %r1747; .reg .f32 %r1749; .reg .f32 %r1750; .reg .f32 %r1751; .reg .f32 %r1752; .reg .u64 %r1754; .reg .u64 %r1755; .reg .f32 %r1757; .reg .f32 %r1758; .reg .f32 %r1759; .reg .f32 %r1760; .reg .pred %r1761; .reg .pred %r1762; .reg .f32 %r1764; .reg .u64 %r1766; .reg .u64 %r1767; .reg .f32 %r1768; .reg .f32 %r1769; .reg .f32 %r1770; .reg .u64 %r1772; .reg .u64 %r1773; .reg .f32 %r1775; .reg .f32 %r1776; .reg .f32 %r1777; .reg .u64 %r1779; .reg .u64 %r1780; .reg .f32 %r1782; .reg .f32 %r1783; .reg .pred %r1784; .reg .f32 %r1785; .reg .f32 %r1786; .reg .f32 %r1788; .reg .f32 %r1789; .reg .f32 %r1791; .reg .f32 %r1792; .reg .pred %r1793; .reg .u64 %r1794; .reg .f32 %r1795; .reg .u64 %r1796; .reg .f32 %r1797; .reg .f32 %r1798; .reg .u64 %r1799; .reg .f32 %r1800; .reg .u64 %r1801; .reg .f32 %r1802; .reg .u64 %r1804; .reg .f32 %r1805; .reg .u64 %r1806; .reg .f32 %r1807; .reg .f32 %r1808; .reg .u64 %r1809; .reg .f32 %r1810; .reg .u64 %r1811; .reg .f32 %r1812; .reg .u64 %r1814; .reg .f32 %r1815; .reg .u64 %r1816; .reg .f32 %r1817; .reg .f32 %r1818; .reg .u64 %r1819; .reg .f32 %r1820; .reg .u64 %r1821; .reg .f32 %r1822; .reg .pred %r1823; .reg .f32 %r1824; .reg .f32 %r1825; .reg .f32 %r1826; .reg .f32 %r1827; .reg .f32 %r1828; .reg .f32 %r1830; .reg .f32 %r1831; .reg .f32 %r1832; .reg .f32 %r1833; .reg .f32 %r1834; .reg .f32 %r1836; .reg .f32 %r1837; .reg .f32 %r1838; .reg .f32 %r1839; .reg .f32 %r1840; .reg .pred %r1841; .reg .u64 %r1842; .reg .u64 %r1843; .reg .f32 %r1844; .reg .f32 %r1845; .reg 
.f32 %r1846; .reg .u64 %r1847; .reg .u64 %r1848; .reg .f32 %r1849; .reg .u64 %r1850; .reg .f32 %r1851; .reg .u64 %r1852; .reg .f32 %r1853; .reg .u64 %r1854; .reg .f32 %r1855; .reg .u64 %r1856; .reg .f32 %r1857; .reg .pred %r1858; .reg .f32 %r1859; .reg .f32 %r1861; .reg .f32 %r1863; .reg .pred %r1864; .reg .u64 %r1865; .reg .u64 %r1866; .reg .f32 %r1867; .reg .f32 %r1868; .reg .u64 %r1869; .reg .u64 %r1870; .reg .f32 %r1871; .reg .f32 %r1872; .reg .u64 %r1874; .reg .u64 %r1875; .reg .f32 %r1876; .reg .f32 %r1877; .reg .u64 %r1878; .reg .u64 %r1879; .reg .f32 %r1880; .reg .f32 %r1881; .reg .u64 %r1883; .reg .u64 %r1884; .reg .f32 %r1885; .reg .f32 %r1886; .reg .u64 %r1887; .reg .u64 %r1888; .reg .f32 %r1889; .reg .f32 %r1890; .reg .pred %r1891; .reg .u32 %r1892; .reg .u64 %r1893; .reg .u32 %r1895; .reg .u64 %r1896; .reg .u32 %r1898; .reg .u64 %r1899; .reg .pred %r1900; .reg .u32 %r1901; .reg .u64 %r1902; .reg .u64 %r1904; .reg .u32 %r1906; .reg .u64 %r1907; .reg .u64 %r1909; .reg .u32 %r1911; .reg .u64 %r1912; .reg .pred %r1913; .reg .f32 %r1914; .reg .f32 %r1915; .reg .f32 %r1917; .reg .f32 %r1918; .reg .f32 %r1920; .reg .f32 %r1921; .reg .pred %r1922; .reg .f32 %r1923; .reg .f32 %r1924; .reg .f32 %r1926; .reg .f32 %r1927; .reg .f32 %r1929; .reg .f32 %r1930; .reg .pred %r1931; .reg .f32 %r1932; .reg .f32 %r1933; .reg .f32 %r1935; .reg .f32 %r1936; .reg .f32 %r1938; .reg .f32 %r1939; .reg .pred %r1940; .reg .f32 %r1941; .reg .f32 %r1942; .reg .f32 %r1943; .reg .f32 %r1944; .reg .f32 %r1946; .reg .f32 %r1947; .reg .f32 %r1948; .reg .f32 %r1949; .reg .f32 %r1951; .reg .f32 %r1952; .reg .f32 %r1953; .reg .f32 %r1954; .reg .pred %r1955; .reg .pred %r1958; .reg .f32 %r1959; .reg .f32 %r1960; .reg .f32 %r1962; .reg .f32 %r1963; .reg .f32 %r1965; .reg .f32 %r1966; .reg .pred %r1967; mov.u64 %r719,%ar0; mov.u64 %r720,%ar1; mov.u64 %r721,%ar2; mov.u32 %r722,%ar3; mov.u32 %r723,%ar4; mov.u64 %r724,%ar5; .loc 1 2462 3 ld.s8 %r418,[%r720+28]; cvt.u16.u32 %r725,%r418; setp.eq.u16 %r726,%r725,2; @ %r726 bra $L2; ld.s8 %r1286,[%r721+28]; cvt.u16.u32 %r727,%r1286; setp.eq.u16 %r729,%r727,2; @ %r729 bra $L3; cvta.const.u64 %r733,$LC3; cvta.const.u64 %r732,__func__$0; mov.u32 %r731,2462; cvta.const.u64 %r7307327333: .loc 1 2476 6 ld.u64 %r734,[%r719]; setp.ne.u64 %r735,%r734,0; @ %r735 bra $L4; .loc 1 2478 10 setp.ne.u16 %r737,%r725,1; @ %r737 bra $L5; .loc 1 2480 4 st.u64 [%r719+48],%r734; ld.u64 %r740,[%r721+80]; ld.u64 %r742,[%r721+72]; sub.u64 %r743,%r740,%r742; st.u64 [%r719+56],%r743; mov.u64 %r744,1; st.u64 [%r719+40],%r744; bra $L6; $L123: .loc 1 2485 4 st.u64 [%r719+48],%r1266; add.u64 %r746,%r480,-1; st.u64 [%r719+56],%r746; mov.u64 %r747,1; st.u64 [%r719+40],%r747; bra $L6; $L124: .loc 1 2490 4 mov.u64 %r748,0; st.u64 [%r719+48],%r748; add.u64 %r749,%r480,-1; st.u64 [%r719+56],%r749; mov.u64 %r750,1; st.u64 [%r719+40],%r750; .loc 1 2493 11 st.u64 [%r719+72],%r748; ld.u64 %r753,[%r721+80]; ld.u64 %r755,[%r721+72]; sub.u64 %r756,%r753,%r755; st.u64 [%r719+80],%r756; st.u64 [%r719+64],%r480; $L6: .loc 1 2499 1719; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r758,[%value_in]; } .loc 1 2499 4760; call (%value_in),_gfortrani_xmallocarray761,[%value_in]; } .loc 1 2499 2 st.u64 [%r719],%r761; .loc 1 2500 24 mov.u64 %r763,0; st.u64 [%r719+8],%r763; .loc 1 2557 7 ld.s8 %r418,[%r720+28]; .loc 1 2564 15 ld.u64 %r765,[%r720+56]; add.u64 %r764,%r765,1; .loc 1 2564 13 ld.u64 %r766,[%r720+48]; sub.u64 %r428,%r764,%r766; bra $L7; $L4: .loc 1 2502 12 cvta.global.u64 %r767,_gfortrani_compile_options; 
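// Descriptive note (editorial): $L4 begins the runtime bounds-checking path.
// The code that follows loads a word at offset 36 of the
// _gfortrani_compile_options block and skips the extent checks when it is
// zero; this appears to correspond to the compile_options.bounds_check test
// in matmul_r4.c (.loc 2502).  Mismatches are reported through
// _gfortran_runtime_error using the $LC* format strings declared above.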
.loc 1 2502 11 ld.u32 %r768,[%r767+36]; setp.eq.u32 %r769,%r768,0; @ %r769 bra $L8; .loc 1 2509 17 ld.u64 %r771,[%r719+56]; add.u64 %r770,%r771,1; .loc 1 2509 15 ld.u64 %r772,[%r719+48]; sub.u64 %r428,%r770,%r772; .loc 1 2506 10 setp.ne.u16 %r774,%r725,1; @ %r774 bra $L9; .loc 1 2508 17 ld.u64 %r776,[%r721+80]; add.u64 %r775,%r776,1; .loc 1 2508 15 ld.u64 %r777,[%r721+72]; sub.u64 %r210,%r775,%r777; .loc 1 2510 7 setp.ne.u64 %r778,%r210,%r428; @ %r778 bra $L10; .loc 1 2564 15 ld.u64 %r780,[%r720+56]; add.u64 %r779,%r780,1; .loc 1 2564 13 ld.u64 %r781,[%r720+48]; sub.u64 %r428,%r779,%r781; .loc 1 2543 7 ld.s8 %r303,[%r719+28]; .loc 1 2548 27 ld.u64 %r416,[%r719+40]; .loc 1 2543 6 cvt.u16.u32 %r782,%r303; setp.eq.u16 %r783,%r782,1; @ %r783 bra $L11; .loc 1 2553 16 ld.u64 %r476,[%r719+64]; .loc 1 2560 16 ld.u64 %r475,[%r720+40]; .loc 1 2563 14 mov.u64 %r186,1; .loc 1 2561 16 mov.u64 %r178,%r186; bra $L12; $L10: .loc 1 2511 6 st.u64 [%stack+8],%r210; st.u64 [%stack],%r428; cvta.const.u64 %r7847stack; call _gfortran_runtime_error2485 4 ld.u64 %r787,[%r720+56]; add.u64 %r786,%r787,1; ld.u64 %r788,[%r720+48]; sub.u64 %r484,%r786,%r788; .loc 1 2515 15 cvt.u16.u32 %r789,%r1286; setp.ne.u16 %r791,%r789,1; @ %r791 bra $L13; .loc 1 2519 7 setp.eq.u64 %r792,%r484,%r428; @ %r792 bra $L14; .loc 1 2520 6 st.u64 [%stack+8],%r484; st.u64 [%stack],%r428; cvta.const.u64 %r7933_gfortran_runtime_error13: .loc 1 2528 7 setp.eq.u64 %r795,%r484,%r428; @ %r795 bra $L15; .loc 1 2529 6 st.u64 [%stack+8],%r484; st.u64 [%stack],%r428; cvta.const.u64 %r7966_gfortran_runtime_error15: .loc 1 2533 17 ld.u64 %r799,[%r721+80]; add.u64 %r798,%r799,1; .loc 1 2533 15 ld.u64 %r800,[%r721+72]; sub.u64 %r208,%r798,%r800; .loc 1 2534 17 ld.u64 %r802,[%r719+80]; add.u64 %r801,%r802,1; .loc 1 2534 15 ld.u64 %r803,[%r719+72]; sub.u64 %r209,%r801,%r803; .loc 1 2535 7 setp.eq.u64 %r804,%r208,%r209; @ %r804 bra $L14; .loc 1 2536 6 st.u64 [%stack+8],%r208; st.u64 [%stack],%r209; cvta.const.u64 %r805,$LC805_gfortran_runtime_error2564 15 ld.u64 %r808,[%r720+56]; add.u64 %r807,%r808,1; .loc 1 2564 13 ld.u64 %r809,[%r720+48]; sub.u64 %r428,%r807,%r809; $L7: .loc 1 2548 27 ld.u64 %r476,[%r719+40]; .loc 1 2543 6 ld.s8 %r811,[%r719+28]; cvt.u16.u32 %r810,%r811; setp.eq.u16 %r812,%r810,1; @ %r812 bra $L130; $L127: .loc 1 2553 16 mov.u64 %r416,%r476; ld.u64 %r476,[%r719+64]; bra $L16; $L130: .loc 1 2548 27 mov.u64 %r416,%r476; $L16: .loc 1 2560 16 ld.u64 %r475,[%r720+40]; .loc 1 2557 6 cvt.u16.u32 %r813,%r418; setp.eq.u16 %r814,%r813,1; @ %r814 bra $L131; $L125: .loc 1 2569 16 ld.u64 %r178,[%r720+64]; .loc 1 2571 15 ld.u64 %r816,[%r720+80]; add.u64 %r815,%r816,1; .loc 1 2571 13 ld.u64 %r817,[%r720+72]; mov.u64 %r186,%r428; sub.u64 %r428,%r815,%r817; bra $L12; $L131: .loc 1 2563 14 mov.u64 %r186,1; .loc 1 2561 16 mov.u64 %r178,%r186; $L12: .loc 1 2575 16 ld.u64 %r819,[%r721+56]; add.u64 %r818,%r819,1; ld.u64 %r820,[%r721+48]; sub.u64 %r64,%r818,%r820; .loc 1 2575 6 setp.eq.u64 %r821,%r64,%r428; @ %r821 bra $L17; .loc 1 2577 10 set.u32.gt.s64 %r823,%r428,0; neg.s32 %r824,%r823; .loc 1 2577 21 set.u32.gt.s64 %r826,%r64,0; neg.s32 %r827,%r826; cvt.u16.u32 %r829,%r824; cvt.u16.u32 %r830,%r827; or.b16 %r828,%r829,%r830; cvt.u32.u16 %r831,%r828; cvt.u16.u8 %r832,%r831; setp.eq.u16 %r833,%r832,0; @ %r833 bra $L17; .loc 1 2578 2 st.u64 [%stack+8],%r428; st.u64 [%stack],%r64; cvta.const.u64 %r8348stack; call _gfortran_runtime_error17: .loc 1 2583 7 ld.s8 %r66,[%r721+28]; .loc 1 2586 16 ld.u64 %r474,[%r721+40]; .loc 1 2583 6 cvt.u16.u32 %r836,%r66; 
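// Descriptive note (editorial): from here the code finishes unpacking the b
// and dest descriptors and then evaluates the BLAS dispatch heuristic
// (.loc 2612-2624).  When the operands are contiguous, the try_blas flag in
// %ar3 is set, the supplied gemm pointer in %ar5 is non-null, and
// (float)m*k*n exceeds the cube of the blas_limit argument in %ar4, the
// dimensions, alpha/beta, and the "N"/"T"/"C" transpose selectors ($LC0-$LC2)
// are marshalled on %frame and the external gemm routine is invoked through
// the $LCT0 call prototype instead of the inline loops.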
setp.eq.u16 %r837,%r836,1; @ %r837 bra $L132; .loc 1 2597 16 ld.u64 %r179,[%r721+64]; .loc 1 2598 16 ld.u64 %r839,[%r721+80]; add.u64 %r838,%r839,1; .loc 1 2598 14 ld.u64 %r840,[%r721+72]; sub.u64 %r187,%r838,%r840; bra $L18; $L132: .loc 1 2592 14 mov.u64 %r187,1; .loc 1 2591 16 mov.u64 %r179,256; $L18: .loc 1 2601 9 ld.u64 %r215,[%r720]; .loc 1 2602 9 ld.u64 %r216,[%r721]; .loc 1 2603 8 ld.u64 %r217,[%r719]; .loc 1 2612 28 set.u32.eq.u64 %r842,%r416,1; neg.s32 %r843,%r842; cvt.u32.u32 %r841,%r843; cvt.u32.u8 %r71,%r841; .loc 1 2612 46 set.u32.eq.u64 %r845,%r475,1; neg.s32 %r846,%r845; cvt.u32.u32 %r844,%r846; cvt.u32.u8 %r473,%r844; .loc 1 2612 7 set.u32.ne.u32 %r848,%r722,0; neg.s32 %r849,%r848; .loc 1 2612 16 cvt.u16.u32 %r851,%r849; cvt.u16.u32 %r852,%r71; and.b16 %r850,%r851,%r852; .loc 1 2612 6 cvt.u32.u16 %r853,%r850; cvt.u16.u8 %r854,%r853; setp.eq.u16 %r855,%r854,0; @ %r855 bra $L19; .loc 1 2612 63 set.u32.eq.u64 %r857,%r178,1; neg.s32 %r858,%r857; .loc 1 2612 51 cvt.u16.u32 %r860,%r858; cvt.u16.u32 %r861,%r473; or.b16 %r859,%r860,%r861; .loc 1 2612 33 cvt.u32.u16 %r862,%r859; cvt.u16.u8 %r863,%r862; setp.eq.u16 %r864,%r863,0; @ %r864 bra $L20; .loc 1 2613 20 set.u32.eq.u64 %r866,%r474,1; neg.s32 %r867,%r866; .loc 1 2613 37 set.u32.eq.u64 %r869,%r179,1; neg.s32 %r870,%r869; .loc 1 2613 25 cvt.u16.u32 %r872,%r867; cvt.u16.u32 %r873,%r870; or.b16 %r871,%r872,%r873; .loc 1 2613 7 cvt.u32.u16 %r874,%r871; cvt.u16.u8 %r875,%r874; setp.eq.u16 %r876,%r875,0; @ %r876 bra $L20; .loc 1 2615 13 cvt.rn.f32.s32 %r83,%r723; .loc 1 2614 12 cvt.rn.f32.s64 %r877,%r186; .loc 1 2614 50 cvt.rn.f32.s64 %r878,%r428; .loc 1 2614 47 mul.f32 %r879,%r877,%r878; .loc 1 2614 31 cvt.rn.f32.s64 %r880,%r187; .loc 1 2614 47 mul.f32 %r881,%r879,%r880; .loc 1 2615 13 mul.f32 %r882,%r83,%r83; mul.f32 %r883,%r882,%r83; .loc 1 2614 7 setp.gt.f32 %r884,%r881,%r883; @ ! 
%r884 bra $L19; .loc 1 2617 17 cvt.u32.u64 %r85,%r186; st.u32 [%frame+28],%r85; .loc 1 2617 29 cvt.u32.u64 %r86,%r187; st.u32 [%frame+24],%r86; .loc 1 2617 41 cvt.u32.u64 %r87,%r428; st.u32 [%frame+20],%r87; .loc 1 2617 52 cvt.u32.u64 %r88,%r476; st.u32 [%frame+16],%r88; .loc 1 2618 24 mov.f32 %r885,0f3f800000; st.f32 [%frame+12],%r885; .loc 1 2618 33 mov.f32 %r886,0f00000000; st.f32 [%frame+8],%r886; .loc 1 2619 17 setp.ne.u64 %r887,%r475,1; @ %r887 bra $L22; cvt.u32.u64 %r200,%r178; bra $L23; $L22: cvt.u32.u64 %r200,%r475; $L23: st.u32 [%frame+4],%r200; .loc 1 2620 3 setp.ne.u64 %r888,%r474,1; @ %r888 bra $L24; cvt.u32.u64 %r201,%r179; bra $L25; $L24: cvt.u32.u64 %r201,%r474; $L25: st.u32 [%frame],%r201; .loc 1 2622 15 set.u32.gt.s32 %r890,%r200,0; neg.s32 %r891,%r890; .loc 1 2622 26 set.u32.gt.s32 %r893,%r201,0; neg.s32 %r894,%r893; .loc 1 2622 19 cvt.u16.u32 %r896,%r891; cvt.u16.u32 %r897,%r894; and.b16 %r895,%r896,%r897; .loc 1 2622 10 cvt.u32.u16 %r898,%r895; cvt.u16.u8 %r899,%r898; setp.eq.u16 %r900,%r899,0; @ %r900 bra $L19; .loc 1 2622 37 set.u32.gt.s32 %r902,%r88,0; neg.s32 %r903,%r902; .loc 1 2622 46 set.u32.gt.s32 %r905,%r85,1; neg.s32 %r906,%r905; .loc 1 2622 41 cvt.u16.u32 %r908,%r903; cvt.u16.u32 %r909,%r906; and.b16 %r907,%r908,%r909; cvt.u32.u16 %r910,%r907; cvt.u16.u8 %r911,%r910; setp.eq.u16 %r912,%r911,0; @ %r912 bra $L19; .loc 1 2622 55 set.u32.gt.s32 %r914,%r86,1; neg.s32 %r915,%r914; .loc 1 2622 64 set.u32.gt.s32 %r917,%r87,1; neg.s32 %r918,%r917; .loc 1 2622 59 cvt.u16.u32 %r920,%r915; cvt.u16.u32 %r921,%r918; and.b16 %r919,%r920,%r921; cvt.u32.u16 %r922,%r919; cvt.u16.u8 %r923,%r922; setp.eq.u16 %r924,%r923,0; @ %r924 bra $L19; .loc 1 2624 4 setp.ne.u64 %r925,%r724,0; @ %r925 bra $L27; cvta.const.u64 %r929,$LC8; cvta.const.u64 %r928,__func__$0; mov.u32 %r927,2624; cvta.const.u64 %r926992927: .loc 1 2626 17 and.b32 %r930,%r722,2; .loc 1 2626 7 setp.ne.u32 %r931,%r930,0; @ %r931 bra $L133; .loc 1 2629 35 setp.eq.u64 %r932,%r475,1; @ %r932 bra $L134; cvta.const.u64 %r188,$LC2; bra $L28; $L133: .loc 1 2627 13 cvta.const.u64 %r188,$LC1; bra $L28; $L134: .loc 1 2629 35 cvta.const.u64 %r188,$LC0; $L28: .loc 1 2631 17 and.b32 %r933,%r722,4; .loc 1 2631 7 setp.ne.u32 %r934,%r933,0; @ %r934 bra $L135; .loc 1 2634 35 setp.eq.u64 %r935,%r474,1; @ %r935 bra $L136; cvta.const.u64 %r189,$LC2; bra $L29; $L135: .loc 1 2632 13 cvta.const.u64 %r189,$LC1; bra $L29; $L136: .loc 1 2634 35 cvta.const.u64 %r189,$LC0; $L29: .loc 1 2636 4 add.u64 %r951,%frame,16; add.u64 %r952,%frame,8; add.u64 %r953,%frame,4; add.u64 %r954,%frame,12; add.u64 %r955,%frame,20; add.u64 %r956,%frame,24; add.u64 %r957,%frame,28; mov.u32 %r950,1;,.param .u64 %in_ar3,.param .u64 %in_ar4,.param .u64 %in_ar5,.param .u64 %in_ar6,.param .u64 %in_ar7,.param .u64 %in_ar8,.param .u64 %in_ar9,.param .u64 %in_ar10,.param .u64 %in_ar11,.param .u64 %in_ar12,.param .u32 %in_ar13,.param .u32 %in_ar1418r18995795955215; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r953; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r216; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%frame; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r952; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r217; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r951; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r950; .param .u32 %out_arg15; st.param.u32 [%out_arg15],%r950; call %r724,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14,%out_arg15),$LCT0; } bra $L1; $L19: .loc 1 2643 21 cvt.u16.u32 %r959,%r71; 
cvt.u16.u32 %r960,%r473; and.b16 %r958,%r959,%r960; .loc 1 2643 6 cvt.u32.u16 %r961,%r958; cvt.u16.u8 %r962,%r961; setp.eq.u16 %r963,%r962,0; @ %r963 bra $L31; .loc 1 2643 38 setp.ne.u64 %r964,%r474,1; @ %r964 bra $L20; .loc 1 2644 7 cvt.u16.u32 %r965,%r66; setp.eq.u16 %r966,%r965,1; @ %r966 bra $L32; .loc 1 2677 16 add.u64 %r967,%r476,1; .loc 1 2678 9 shl.b64 %r104,%r967,2; sub.u64 %r219,%r217,%r104; .loc 1 2687 7 setp.le.s64 %r968,%r187,0; @ %r968 bra $L33; add.u64 %r592,%r104,-4; mov.u64 %r591,%r217; add.u64 %r595,%r187,1; .loc 1 2689 22 shl.b64 %r413,%r186,2; .loc 1 2687 13 mov.u64 %r268,%r474; setp.gt.s64 %r1303,%r186,0; and.b64 %r1413,%r187,3; setp.eq.u64 %r1416,%r1413,0; @ %r1416 bra $L34; setp.eq.u64 %r1415,%r1413,1; @ %r1415 bra $L309; setp.eq.u64 %r1414,%r1413,2; @ %r1414 bra $L310; bra $L380; $L35: .loc 1 2689 22 mov.u32 %r9797213974,[%value_in]; } $L36: .loc 1 2687 24 add.u64 %r1417,%r268,1; .loc 1 2687 7 add.u64 %r1470,%r591,%r592; .loc 1 2688 2 @ %r1303 bra $L247; bra $L381; $L34: @ %r1303 bra $L35; bra $L36; $L33: .loc 1 2692 28 set.u32.eq.u64 %r979,%r428,0; neg.s32 %r980,%r979; .loc 1 2692 13 set.u32.eq.u64 %r982,%r186,0; neg.s32 %r983,%r982; .loc 1 2692 28 cvt.u16.u32 %r985,%r980; cvt.u16.u32 %r986,%r983; or.b16 %r984,%r985,%r986; .loc 1 2692 23 set.u32.eq.u64 %r988,%r187,0; neg.s32 %r989,%r988; .loc 1 2692 28 cvt.u16.u32 %r993,%r989; or.b16 %r991,%r984,%r993; cvt.u32.u16 %r994,%r991; cvt.u16.u8 %r995,%r994; setp.eq.u16 %r996,%r995,0; @ ! %r996 bra $L1; .loc 1 2697 10 setp.eq.u64 %r997,%r178,1; selp.u64 %r198,%r476,%r178,%r997; .loc 1 2702 21 shl.b64 %r998,%r198,8; .loc 1 2702 14 add.u64 %r224,%r998,%r179; .loc 1 2706 12 min.s64 %r1000,%r224,65536; shl.b64 %r1001,%r1000,10011002,[%value_in]; } .loc 1 2710 7 setp.gt.s64 %r1004,%r187,0; @ %r1004 bra $L39; $L42: .loc 1 2910022918 7 bra $L1; $L39: .loc 1 2680 16 add.u64 %r1006,%r178,1; .loc 1 2681 9 shl.b64 %r107,%r1006,2; sub.u64 %r221,%r215,%r107; .loc 1 2683 16 add.u64 %r1007,%r179,1; .loc 1 2684 9 shl.b64 %r110,%r1007,2; shl.b64 %r565,%r179,9; shl.b64 %r567,%r179,11; shl.b64 %r577,%r179,3; sub.u64 %r568,%r216,%r110; add.u64 %r566,%r577,%r568; shl.b64 %r571,%r476,9; add.u64 %r572,%r476,%r476; add.u64 %r574,%r104,-4; neg.s64 %r1010,%r179; shl.b64 %r1011,%r1010,11; shl.b64 %r575,%r1010,2; shl.b64 %r579,%r476,11; shl.b64 %r1015,%r476,3; add.u64 %r1016,%r1015,4; add.u64 %r578,%r219,%r1016; add.u64 %r1017,%r187,-1; and.b64 %r587,%r1017,-512; add.u64 %r588,%r568,%r575; shl.b64 %r544,%r178,8; shl.b64 %r546,%r178,10; add.u64 %r547,%r107,-4; neg.s64 %r1019,%r178; shl.b64 %r1020,%r1019,10; shl.b64 %r1023,%r1019,3; shl.b64 %r509,%r178,3; add.u64 %r510,%r178,%r178; add.u64 %r202,%r110,-4; shl.b64 %r196,%r179,4; shl.b64 %r191,%r476,4; neg.s64 %r1026,%r476; shl.b64 %r1027,%r1026,2; shl.b64 %r1030,%r1010,4; mov.u64 %r573,%r574; mov.u64 %r569,%r572; mov.u64 %r563,%r179; mov.u64 %r561,0; setp.gt.s64 %r1295,%r428,0; add.u64 %r1339,%r588,4; setp.gt.s64 %r1340,%r186,0; add.u64 %r1341,%r186,1; add.u64 %r1342,%r221,8; bra $L41; $L138: mov.u64 %r561,%r562; $L41: add.u64 %r590,%r561,1; .loc 1 2715 9 sub.u64 %r226,%r187,%r561; min.s64 %r1471,%r226,512; .loc 1 2716 24 shr.s64 %r1032,%r1471,63; shr.u64 %r1033,%r1032,62; add.u64 %r1034,%r1471,%r1033; and.b64 %r1035,%r1034,3; sub.u64 %r1036,%r1035,%r1033; .loc 1 2716 10 sub.u64 %r227,%r1471,%r1036; .loc 1 2718 4 @ %r1295 bra $L40; $L45: .loc 1 2710 7 add.u64 %r562,%r561,512; add.u64 %r563,%r563,%r565; add.u64 %r566,%r566,%r567; add.u64 %r569,%r569,%r571; add.u64 %r573,%r573,%r571; add.u64 
%r575,%r575,%r1011; add.u64 %r577,%r577,%r567; add.u64 %r578,%r578,%r579; setp.ne.u64 %r1038,%r561,%r587; @ %r1038 bra $L138; bra $L42; $L40: add.u64 %r541,%r1339,%r577; .loc 1 2768 13 add.u64 %r452,%r227,%r590; add.u64 %r449,%r202,%r577; add.u64 %r447,%r202,%r449; add.u64 %r559,%r202,%r566; add.u64 %r560,%r202,%r559; mul.lo.u64 %r378,%r452,%r476; mul.lo.u64 %r371,%r452,%r179; mad.lo.u64 %r556,%r452,%r202,%r568; mov.u64 %r548,%r1023; mov.u64 %r545,%r547; mov.u64 %r542,%r178; .loc 1 2718 12 mov.u64 %r230,%r474; add.u64 %r1287,%r428,1; .loc 1 2768 8 add.u64 %r1332,%r452,-1; setp.ge.s64 %r1334,%r1332,%r590; sub.u64 %r1337,%r569,%r476; add.u64 %r1338,%r1337,%r572; $L44: .loc 1 2723 13 sub.u64 %r228,%r1287,%r230; min.s64 %r1473,%r228,256; .loc 1 2724 28 shr.u64 %r1044,%r1473,63; add.u64 %r1045,%r1473,%r1044; and.b64 %r1046,%r1045,1; sub.u64 %r1047,%r1046,%r1044; .loc 1 2724 14 sub.u64 %r229,%r1473,%r1047; .loc 1 2727 8 @ %r1340 bra $L43; $L73: .loc 1 2718 30 add.u64 %r230,%r230,256; .loc 1 2718 4 add.u64 %r541,%r541,1024; add.u64 %r542,%r542,%r544; add.u64 %r545,%r545,%r546; add.u64 %r548,%r548,%r1020; setp.ge.s64 %r1049,%r428,%r230; @ %r1049 bra $L44; bra $L45; $L43: .loc 1 2763 26 add.u64 %r1050,%r230,-1; add.u64 %r1051,%r1050,%r1473; .loc 1 2763 31 mul.lo.u64 %r286,%r1051,%r178; .loc 1 2734 13 add.u64 %r451,%r229,%r230; .loc 1 2734 8 add.u64 %r453,%r451,-1; add.u64 %r1052,%r286,1; shl.b64 %r1053,%r1052,2; add.u64 %r528,%r221,%r1053; add.u64 %r538,%r542,%r178; sub.u64 %r1054,%r538,%r286; shl.b64 %r540,%r1054,2; shl.b64 %r1056,%r1473,2; add.u64 %r1057,%r1056,-4; add.u64 %r302,%r1002,%r1057; add.u64 %r1058,%r230,%r563; add.u64 %r1059,%r1058,%r1473; shl.b64 %r1060,%r1059,2; add.u64 %r195,%r1060,%r568; shl.b64 %r558,%r230,2; add.u64 %r48,%r558,%r566; add.u64 %r46,%r558,%r559; add.u64 %r30,%r558,%r560; add.u64 %r557,%r556,%r558; add.u64 %r1061,%r371,%r230; add.u64 %r1062,%r1061,%r1473; shl.b64 %r1063,%r1062,2; add.u64 %r369,%r1063,%r568; .loc 1 2768 8 mov.u64 %r527,%r578; .loc 1 2727 16 mov.u64 %r245,%r474; setp.le.s64 %r1288,%r230,%r453; setp.gt.s64 %r1300,%r1473,%r229; setp.gt.s64 %r1299,%r1471,%r227; not.b64 %r1325,%r230; add.u64 %r1326,%r1325,%r451; and.b64 %r1327,%r1326,-2; .loc 1 2904 18 add.u64 %r1328,%r1473,%r230; .loc 1 2904 13 add.u64 %r1329,%r1328,-1; setp.gt.s64 %r1330,%r230,%r1329; $L72: .loc 1 2732 10 sub.u64 %r231,%r1341,%r245; min.s64 %r1474,%r231,256; .loc 1 2733 25 shr.s64 %r1305,%r1474,63; shr.u64 %r1067,%r1474,63; add.u64 %r1068,%r1474,%r1067; and.b64 %r1069,%r1068,1; sub.u64 %r1070,%r1069,%r1067; .loc 1 2733 11 sub.u64 %r232,%r1474,%r1070; .loc 1 2735 5 @ %r1288 bra $L46; $L55: .loc 1 2757 8 @ ! 
%r1300 bra $L48; bra $L47; $L46: .loc 1 2737 17 add.u64 %r456,%r232,%r245; .loc 1 2737 12 add.u64 %r457,%r456,-1; shl.b64 %r1074,%r1474,10; add.u64 %r1076,%r1074,-1024; add.u64 %r502,%r1002,%r1076; add.u64 %r508,%r528,%r540; add.u64 %r1079,%r1474,%r245; shl.b64 %r519,%r1079,2; shl.b64 %r523,%r1474,2; mov.u64 %r513,%r538; mov.u64 %r512,%r548; mov.u64 %r511,%r545; mov.u64 %r500,0; setp.le.s64 %r1293,%r245,%r457; setp.gt.s64 %r1301,%r1474,%r232; not.b64 %r1320,%r245; add.u64 %r1321,%r1320,%r456; and.b64 %r1322,%r1321,-2; add.u64 %r1323,%r1322,%r245; bra $L54; $L139: mov.u64 %r500,%r501; $L54: .loc 1 2738 9 @ %r1293 bra $L49; $L53: .loc 1 2749 12 @ %r1301 bra $L50; bra $L51; $L49: shl.b64 %r1082,%r500,2; add.u64 %r199,%r1002,%r1082; add.u64 %r1087,%r1323,%r513; shl.b64 %r1088,%r1087,2; add.u64 %r499,%r1088,%r1342; .loc 1 2738 9 mov.u64 %r181,%r508; sub.u64 %r1401,%r499,%r508; add.u64 %r1402,%r1401,-8; shr.u64 %r1400,%r1402,3; add.u64 %r1403,%r1400,1; and.b64 %r1404,%r1403,3; setp.eq.u64 %r1407,%r1404,0; @ %r1407 bra $L52; setp.eq.u64 %r1406,%r1404,1; @ %r1406 bra $L311; setp.eq.u64 %r1405,%r1404,2; @ %r1405 bra $L312; bra $L382; $L52: .loc 1 2741 7 add.u64 %r1090,%r181,%r511; add.u64 %r1091,%r1090,%r512; .loc 1 2740 49 ld.f32 %r1092,[%r1091]; st.f32 [%r199],%r1092; .loc 1 2742 49 ld.f32 %r1093,[%r181]; st.f32 [%r199+4],%r1093; .loc 1 2745 7 add.u64 %r1094,%r181,%r512; add.u64 %r1095,%r1094,%r511; .loc 1 2744 49 ld.f32 %r1096,[%r1095+4]; st.f32 [%r199+1024],%r1096; .loc 1 2746 49 ld.f32 %r1097,[%r181+4]; st.f32 [%r199+1028],%r1097; .loc 1 2738 9 add.u64 %r1408,%r199,2048; add.u64 %r1409,%r181,8; .loc 1 2741 7 add.u64 %r1477,%r1409,%r511; add.u64 %r1478,%r1477,%r512; .loc 1 2740 49 ld.f32 %r1479,[%r1478]; st.f32 [%r1408],%r1479; .loc 1 2742 49 ld.f32 %r1480,[%r1409]; st.f32 [%r1408+4],%r1480; .loc 1 2745 7 add.u64 %r1481,%r1409,%r512; add.u64 %r1482,%r1481,%r511; .loc 1 2744 49 ld.f32 %r1483,[%r1482+4]; st.f32 [%r1408+1024],%r1483; .loc 1 2746 49 ld.f32 %r1484,[%r1409+4]; st.f32 [%r1408+1028],%r1484; .loc 1 2738 9 add.u64 %r1485,%r199,4096; add.u64 %r1486,%r181,16; .loc 1 2741 7 add.u64 %r1488,%r1486,%r511; add.u64 %r1489,%r1488,%r512; .loc 1 2740 49 ld.f32 %r1490,[%r1489]; st.f32 [%r1485],%r1490; .loc 1 2742 49 ld.f32 %r1491,[%r1486]; st.f32 [%r1485+4],%r1491; .loc 1 2745 7 add.u64 %r1492,%r1486,%r512; add.u64 %r1493,%r1492,%r511; .loc 1 2744 49 ld.f32 %r1494,[%r1493+4]; st.f32 [%r1485+1024],%r1494; .loc 1 2746 49 ld.f32 %r1495,[%r1486+4]; st.f32 [%r1485+1028],%r1495; .loc 1 2738 9 add.u64 %r1496,%r199,6144; add.u64 %r1497,%r181,24; .loc 1 2741 7 add.u64 %r1499,%r1497,%r511; add.u64 %r1500,%r1499,%r512; .loc 1 2740 49 ld.f32 %r1501,[%r1500]; st.f32 [%r1496],%r1501; .loc 1 2742 49 ld.f32 %r1502,[%r1497]; st.f32 [%r1496+4],%r1502; .loc 1 2745 7 add.u64 %r1503,%r1497,%r512; add.u64 %r1504,%r1503,%r511; .loc 1 2744 49 ld.f32 %r1505,[%r1504+4]; st.f32 [%r1496+1024],%r1505; .loc 1 2746 49 ld.f32 %r1506,[%r1497+4]; st.f32 [%r1496+1028],%r1506; .loc 1 2738 9 add.u64 %r199,%r199,8192; add.u64 %r181,%r181,32; setp.ne.u64 %r1507,%r181,%r499; @ %r1507 bra $L52; bra $L53; $L50: .loc 1 2752 10 add.u64 %r1099,%r511,%r519; add.u64 %r1100,%r221,%r1099; .loc 1 2751 41 ld.f32 %r1101,[%r1100+-4]; st.f32 [%r502],%r1101; .loc 1 2754 10 add.u64 %r1102,%r508,%r523; .loc 1 2753 41 ld.f32 %r1103,[%r1102+-4]; st.f32 [%r502+4],%r1103; $L51: .loc 1 2735 5 add.u64 %r501,%r500,2; add.u64 %r502,%r502,8; add.u64 %r508,%r508,%r509; add.u64 %r511,%r511,%r509; add.u64 %r512,%r512,%r1023; add.u64 %r513,%r513,%r510; setp.ne.u64 
%r1104,%r500,%r1327; @ %r1104 bra $L139; bra $L55; $L48: .loc 1 2767 25 shr.u64 %r1107,%r1305,62; add.u64 %r1108,%r1474,%r1107; and.b64 %r1109,%r1108,3; sub.u64 %r1110,%r1109,%r1107; .loc 1 2767 11 sub.u64 %r235,%r1474,%r1110; .loc 1 2769 5 @ %r1334 bra $L56; bra $L57; $L47: .loc 1 2759 17 add.u64 %r132,%r1474,%r245; .loc 1 2759 12 add.u64 %r1112,%r132,-1; .loc 1 2760 9 setp.lt.s64 %r1113,%r1112,%r245; @ %r1113 bra $L48; add.u64 %r1114,%r132,%r286; shl.b64 %r1115,%r1114,2; add.u64 %r136,%r221,%r1115; mov.u64 %r73,%r302; mov.u64 %r57,%r528; sub.u64 %r1391,%r136,%r528; add.u64 %r1392,%r1391,-4; shr.u64 %r1390,%r1392,2; add.u64 %r1393,%r1390,1; and.b64 %r1394,%r1393,3; setp.eq.u64 %r1397,%r1394,0; @ %r1397 bra $L58; setp.eq.u64 %r1396,%r1394,1; @ %r1396 bra $L313; setp.eq.u64 %r1395,%r1394,2; @ %r1395 bra $L314; bra $L383; $L58: .loc 1 2762 43 ld.f32 %r1116,[%r57]; st.f32 [%r73],%r1116; ld.f32 %r1510,[%r57+4]; st.f32 [%r73+1024],%r1510; ld.f32 %r1514,[%r57+8]; st.f32 [%r73+2048],%r1514; ld.f32 %r1518,[%r57+12]; st.f32 [%r73+3072],%r1518; .loc 1 2760 9 add.u64 %r57,%r57,16; add.u64 %r73,%r73,4096; setp.ne.u64 %r1519,%r57,%r136; @ %r1519 bra $L58; bra $L48; $L57: .loc 1 2871 8 @ ! %r1299 bra $L60; bra $L59; $L56: .loc 1 2876 14 add.u64 %r460,%r235,%r245; .loc 1 2876 9 add.u64 %r461,%r460,-1; add.u64 %r192,%r527,%r1027; add.u64 %r89,%r527,%r574; shl.b64 %r1119,%r476,2; add.u64 %r42,%r1119,%r89; not.b64 %r1120,%r245; add.u64 %r1121,%r1120,%r460; shr.u64 %r1122,%r1121,2; add.u64 %r1123,%r1122,1; shl.b64 %r290,%r1123,4; add.u64 %r1124,%r1337,%r460; shl.b64 %r1125,%r1124,2; add.u64 %r440,%r1125,%r219; add.u64 %r1126,%r460,%r569; shl.b64 %r1127,%r1126,2; add.u64 %r435,%r1127,%r219; add.u64 %r1129,%r1338,%r460; shl.b64 %r1130,%r1129,2; add.u64 %r432,%r1130,%r219; add.u64 %r1131,%r460,%r573; shl.b64 %r1132,%r1131,2; add.u64 %r429,%r1132,%r219; shl.b64 %r341,%r235,8; shl.b64 %r443,%r1474,2; mov.u64 %r448,%r447; mov.u64 %r450,%r449; mov.u64 %r455,%r577; mov.u64 %r458,%r575; mov.u64 %r190,%r527; mov.u64 %r197,%r195; mov.u64 %r250,%r590; mov.u64 %r445,0; mov.u64 %r446,%r445; setp.le.s64 %r1294,%r245,%r461; setp.gt.s64 %r1302,%r1474,%r235; $L68: .loc 1 2772 9 @ %r1294 bra $L61; $L67: .loc 1 2843 12 @ ! %r1302 bra $L63; bra $L62; $L61: .loc 1 2904 13 mov.u64 %r313,0; mov.u64 %r314,4096; mov.u64 %r315,3072; mov.u64 %r316,1024; mov.u64 %r317,-2048; $L66: add.u64 %r311,%r192,%r313; .loc 1 2774 10 ld.f32 %r251,[%r311]; .loc 1 2775 10 ld.f32 %r252,[%r311+4]; add.u64 %r309,%r190,%r313; .loc 1 2776 10 ld.f32 %r253,[%r309]; .loc 1 2777 10 ld.f32 %r254,[%r309+4]; add.u64 %r307,%r89,%r313; .loc 1 2778 10 ld.f32 %r255,[%r307]; .loc 1 2779 10 ld.f32 %r256,[%r307+4]; add.u64 %r305,%r42,%r313; .loc 1 2780 10 ld.f32 %r257,[%r305]; .loc 1 2781 10 ld.f32 %r258,[%r305+4]; .loc 1 2782 10 ld.f32 %r259,[%r311+8]; .loc 1 2783 10 ld.f32 %r260,[%r311+12]; .loc 1 2784 10 ld.f32 %r261,[%r309+8]; .loc 1 2785 10 ld.f32 %r262,[%r309+12]; .loc 1 2786 10 ld.f32 %r263,[%r307+8]; .loc 1 2787 10 ld.f32 %r264,[%r307+12]; .loc 1 2788 10 ld.f32 %r265,[%r305+8]; .loc 1 2789 10 ld.f32 %r266,[%r305+12]; .loc 1 2791 6 @ %r1330 bra $L64; add.u64 %r333,%r445,%r541; add.u64 %r331,%r1002,%r316; sub.u64 %r1383,%r197,%r333; add.u64 %r1384,%r1383,-4; shr.u64 %r1382,%r1384,2; and.b64 %r1386,%r1382,1; setp.ne.u64 %r1387,%r1386,0; @ ! 
%r1387 bra $L384; $L65: add.u64 %r328,%r317,%r331; .loc 1 2793 19 add.u64 %r1137,%r328,%r316; ld.f32 %r137,[%r1137]; .loc 1 2794 14 ld.f32 %r138,[%r333]; .loc 1 2793 14 fma.rn.f32 %r1520,%r137,%r138,%r251; .loc 1 2795 19 ld.f32 %r139,[%r331]; .loc 1 2795 14 fma.rn.f32 %r1521,%r138,%r139,%r252; add.u64 %r323,%r333,%r458; .loc 1 2798 14 add.u64 %r1138,%r323,%r455; ld.f32 %r140,[%r1138]; .loc 1 2797 14 fma.rn.f32 %r1522,%r137,%r140,%r253; .loc 1 2799 14 fma.rn.f32 %r1523,%r139,%r140,%r254; .loc 1 2802 14 add.u64 %r1139,%r323,%r450; ld.f32 %r141,[%r1139]; .loc 1 2801 14 fma.rn.f32 %r1524,%r137,%r141,%r255; .loc 1 2803 14 fma.rn.f32 %r1525,%r139,%r141,%r256; .loc 1 2806 14 add.u64 %r1140,%r323,%r448; ld.f32 %r142,[%r1140]; .loc 1 2805 14 fma.rn.f32 %r1526,%r137,%r142,%r257; .loc 1 2807 14 fma.rn.f32 %r1527,%r139,%r142,%r258; .loc 1 2809 19 add.u64 %r1141,%r328,%r315; ld.f32 %r143,[%r1141]; .loc 1 2809 14 fma.rn.f32 %r1528,%r138,%r143,%r259; .loc 1 2811 19 add.u64 %r1142,%r328,%r314; ld.f32 %r144,[%r1142]; .loc 1 2811 14 fma.rn.f32 %r1529,%r138,%r144,%r260; .loc 1 2813 14 fma.rn.f32 %r1530,%r140,%r143,%r261; .loc 1 2815 14 fma.rn.f32 %r1531,%r140,%r144,%r262; .loc 1 2817 14 fma.rn.f32 %r1532,%r141,%r143,%r263; .loc 1 2819 14 fma.rn.f32 %r1533,%r141,%r144,%r264; .loc 1 2821 14 fma.rn.f32 %r1534,%r142,%r143,%r265; .loc 1 2823 14 fma.rn.f32 %r1535,%r142,%r144,%r266; .loc 1 2791 6 add.u64 %r1388,%r333,4; add.u64 %r1389,%r331,4; add.u64 %r1538,%r317,%r1389; .loc 1 2793 19 add.u64 %r1539,%r1538,%r316; ld.f32 %r1540,[%r1539]; .loc 1 2794 14 ld.f32 %r1541,[%r1388]; .loc 1 2793 14 fma.rn.f32 %r251,%r1540,%r1541,%r1520; .loc 1 2795 19 ld.f32 %r1542,[%r1389]; .loc 1 2795 14 fma.rn.f32 %r252,%r1541,%r1542,%r1521; add.u64 %r1543,%r1388,%r458; .loc 1 2798 14 add.u64 %r1544,%r1543,%r455; ld.f32 %r1545,[%r1544]; .loc 1 2797 14 fma.rn.f32 %r253,%r1540,%r1545,%r1522; .loc 1 2799 14 fma.rn.f32 %r254,%r1542,%r1545,%r1523; .loc 1 2802 14 add.u64 %r1546,%r1543,%r450; ld.f32 %r1547,[%r1546]; .loc 1 2801 14 fma.rn.f32 %r255,%r1540,%r1547,%r1524; .loc 1 2803 14 fma.rn.f32 %r256,%r1542,%r1547,%r1525; .loc 1 2806 14 add.u64 %r1548,%r1543,%r448; ld.f32 %r1549,[%r1548]; .loc 1 2805 14 fma.rn.f32 %r257,%r1540,%r1549,%r1526; .loc 1 2807 14 fma.rn.f32 %r258,%r1542,%r1549,%r1527; .loc 1 2809 19 add.u64 %r1550,%r1538,%r315; ld.f32 %r1551,[%r1550]; .loc 1 2809 14 fma.rn.f32 %r259,%r1541,%r1551,%r1528; .loc 1 2811 19 add.u64 %r1552,%r1538,%r314; ld.f32 %r1553,[%r1552]; .loc 1 2811 14 fma.rn.f32 %r260,%r1541,%r1553,%r1529; .loc 1 2813 14 fma.rn.f32 %r261,%r1545,%r1551,%r1530; .loc 1 2815 14 fma.rn.f32 %r262,%r1545,%r1553,%r1531; .loc 1 2817 14 fma.rn.f32 %r263,%r1547,%r1551,%r1532; .loc 1 2819 14 fma.rn.f32 %r264,%r1547,%r1553,%r1533; .loc 1 2821 14 fma.rn.f32 %r265,%r1549,%r1551,%r1534; .loc 1 2823 14 fma.rn.f32 %r266,%r1549,%r1553,%r1535; .loc 1 2791 6 add.u64 %r333,%r333,8; add.u64 %r331,%r331,8; setp.ne.u64 %r1554,%r197,%r333; @ %r1554 bra $L65; $L64: .loc 1 2826 24 st.f32 [%r311],%r251; .loc 1 2827 28 st.f32 [%r311+4],%r252; .loc 1 2828 30 st.f32 [%r309],%r253; .loc 1 2829 34 st.f32 [%r309+4],%r254; .loc 1 2830 30 st.f32 [%r307],%r255; .loc 1 2831 34 st.f32 [%r307+4],%r256; .loc 1 2832 30 st.f32 [%r305],%r257; .loc 1 2833 34 st.f32 [%r305+4],%r258; .loc 1 2834 28 st.f32 [%r311+8],%r259; .loc 1 2835 28 st.f32 [%r311+12],%r260; .loc 1 2836 34 st.f32 [%r309+8],%r261; .loc 1 2837 34 st.f32 [%r309+12],%r262; .loc 1 2838 34 st.f32 [%r307+8],%r263; .loc 1 2839 34 st.f32 [%r307+12],%r264; .loc 1 2840 34 st.f32 [%r305+8],%r265; .loc 1 
2841 34 st.f32 [%r305+12],%r266; .loc 1 2772 9 add.u64 %r317,%r317,-4096; add.u64 %r316,%r316,4096; add.u64 %r315,%r315,4096; add.u64 %r314,%r314,4096; add.u64 %r313,%r313,16; setp.ne.u64 %r1144,%r290,%r313; @ %r1144 bra $L66; bra $L67; $L63: .loc 1 2769 29 add.u64 %r250,%r250,4; .loc 1 2769 5 add.u64 %r197,%r197,%r196; add.u64 %r192,%r192,%r191; add.u64 %r190,%r190,%r191; add.u64 %r89,%r89,%r191; add.u64 %r42,%r42,%r191; add.u64 %r458,%r458,%r1030; add.u64 %r455,%r455,%r196; add.u64 %r450,%r450,%r196; add.u64 %r448,%r448,%r196; add.u64 %r446,%r446,%r191; add.u64 %r445,%r445,%r196; setp.le.s64 %r1145,%r250,%r1332; @ %r1145 bra $L68; bra $L57; $L62: add.u64 %r346,%r440,%r446; add.u64 %r345,%r435,%r446; add.u64 %r344,%r432,%r446; add.u64 %r343,%r429,%r446; add.u64 %r49,%r48,%r445; add.u64 %r211,%r46,%r445; add.u64 %r212,%r30,%r445; add.u64 %r444,%r192,%r443; .loc 1 2904 13 mov.u64 %r342,%r341; $L71: .loc 1 2848 14 ld.f32 %r246,[%r346]; .loc 1 2849 14 ld.f32 %r247,[%r345]; .loc 1 2850 14 ld.f32 %r248,[%r344]; .loc 1 2851 14 ld.f32 %r249,[%r343]; .loc 1 2853 10 @ %r1330 bra $L69; shl.b64 %r1149,%r342,2; add.u64 %r361,%r1002,%r1149; add.u64 %r359,%r445,%r541; mov.u64 %r353,%r212; mov.u64 %r355,%r211; mov.u64 %r357,%r49; sub.u64 %r1370,%r197,%r359; add.u64 %r1371,%r1370,-4; shr.u64 %r1369,%r1371,2; add.u64 %r1372,%r1369,1; and.b64 %r1373,%r1372,3; setp.eq.u64 %r1376,%r1373,0; @ %r1376 bra $L70; setp.eq.u64 %r1375,%r1373,1; @ %r1375 bra $L315; setp.eq.u64 %r1374,%r1373,2; @ %r1374 bra $L316; bra $L385; $L70: .loc 1 2855 16 ld.f32 %r146,[%r361]; .loc 1 2855 11 ld.f32 %r1150,[%r359]; fma.rn.f32 %r1555,%r146,%r1150,%r246; .loc 1 2857 11 ld.f32 %r1151,[%r357]; fma.rn.f32 %r1556,%r146,%r1151,%r247; .loc 1 2859 11 ld.f32 %r1152,[%r355]; fma.rn.f32 %r1557,%r146,%r1152,%r248; .loc 1 2861 11 ld.f32 %r1153,[%r353]; fma.rn.f32 %r1558,%r146,%r1153,%r249; .loc 1 2855 16 ld.f32 %r1564,[%r361+4]; .loc 1 2855 11 ld.f32 %r1565,[%r359+4]; fma.rn.f32 %r1566,%r1564,%r1565,%r1555; .loc 1 2857 11 ld.f32 %r1567,[%r357+4]; fma.rn.f32 %r1568,%r1564,%r1567,%r1556; .loc 1 2859 11 ld.f32 %r1569,[%r355+4]; fma.rn.f32 %r1570,%r1564,%r1569,%r1557; .loc 1 2861 11 ld.f32 %r1571,[%r353+4]; fma.rn.f32 %r1572,%r1564,%r1571,%r1558; .loc 1 2855 16 ld.f32 %r1579,[%r361+8]; .loc 1 2855 11 ld.f32 %r1580,[%r359+8]; fma.rn.f32 %r1581,%r1579,%r1580,%r1566; .loc 1 2857 11 ld.f32 %r1582,[%r357+8]; fma.rn.f32 %r1583,%r1579,%r1582,%r1568; .loc 1 2859 11 ld.f32 %r1584,[%r355+8]; fma.rn.f32 %r1585,%r1579,%r1584,%r1570; .loc 1 2861 11 ld.f32 %r1586,[%r353+8]; fma.rn.f32 %r1587,%r1579,%r1586,%r1572; .loc 1 2855 16 ld.f32 %r1594,[%r361+12]; .loc 1 2855 11 ld.f32 %r1595,[%r359+12]; fma.rn.f32 %r246,%r1594,%r1595,%r1581; .loc 1 2857 11 ld.f32 %r1596,[%r357+12]; fma.rn.f32 %r247,%r1594,%r1596,%r1583; .loc 1 2859 11 ld.f32 %r1597,[%r355+12]; fma.rn.f32 %r248,%r1594,%r1597,%r1585; .loc 1 2861 11 ld.f32 %r1598,[%r353+12]; fma.rn.f32 %r249,%r1594,%r1598,%r1587; .loc 1 2853 10 add.u64 %r361,%r361,16; add.u64 %r359,%r359,16; add.u64 %r357,%r357,16; add.u64 %r355,%r355,16; add.u64 %r353,%r353,16; setp.ne.u64 %r1599,%r197,%r359; @ %r1599 bra $L70; $L69: .loc 1 2864 28 st.f32 [%r346],%r246; .loc 1 2865 34 st.f32 [%r345],%r247; .loc 1 2866 34 st.f32 [%r344],%r248; .loc 1 2867 34 st.f32 [%r343],%r249; .loc 1 2846 6 add.u64 %r346,%r346,4; add.u64 %r345,%r345,4; add.u64 %r344,%r344,4; add.u64 %r343,%r343,4; add.u64 %r342,%r342,256; setp.ne.u64 %r1155,%r346,%r444; @ %r1155 bra $L71; bra $L63; $L60: .loc 1 2727 34 add.u64 %r245,%r245,256; .loc 1 2727 8 add.u64 
%r527,%r527,1024; add.u64 %r528,%r528,1024; setp.ge.s64 %r1156,%r186,%r245; @ %r1156 bra $L72; bra $L73; $L59: .loc 1 2873 17 add.u64 %r151,%r1471,%r590; .loc 1 2874 9 setp.le.s64 %r1157,%r151,%r452; @ %r1157 bra $L60; .loc 1 2876 14 add.u64 %r467,%r235,%r245; .loc 1 2876 9 add.u64 %r468,%r467,-1; .loc 1 2759 17 add.u64 %r377,%r1474,%r245; add.u64 %r1158,%r377,%r378; shl.b64 %r1159,%r1158,2; add.u64 %r379,%r1159,%r219; neg.s64 %r1161,%r377; shl.b64 %r1162,%r1161,2; shl.b64 %r1163,%r245,2; add.u64 %r362,%r1163,%r1162; shl.b64 %r1164,%r467,2; add.u64 %r364,%r1164,%r1162; shl.b64 %r406,%r235,8; .loc 1 2876 9 mov.u64 %r370,%r369; mov.u64 %r372,%r557; mov.u64 %r373,%r378; mov.u64 %r239,%r452; setp.gt.s64 %r1600,%r1474,%r235; setp.le.s64 %r1296,%r245,%r468; not.b64 %r1312,%r245; add.u64 %r1313,%r1312,%r467; and.b64 %r1314,%r1313,-4; add.u64 %r1315,%r1314,%r245; $L81: .loc 1 2877 6 @ %r1296 bra $L74; $L80: .loc 1 2901 6 @ %r1600 bra $L75; bra $L76; $L74: add.u64 %r391,%r362,%r379; add.u64 %r1172,%r1315,%r373; shl.b64 %r1173,%r1172,2; add.u64 %r1174,%r219,16; add.u64 %r381,%r1173,%r1174; .loc 1 2904 13 mov.u64 %r387,4096; mov.u64 %r388,3072; mov.u64 %r389,1024; mov.u64 %r390,-2048; $L79: .loc 1 2879 14 ld.f32 %r241,[%r391]; .loc 1 2880 14 ld.f32 %r242,[%r391+4]; .loc 1 2881 14 ld.f32 %r243,[%r391+8]; .loc 1 2882 14 ld.f32 %r244,[%r391+12]; .loc 1 2884 10 @ %r1330 bra $L77; add.u64 %r400,%r1002,%r389; mov.u64 %r401,%r372; sub.u64 %r1360,%r370,%r372; add.u64 %r1361,%r1360,-4; shr.u64 %r1359,%r1361,2; add.u64 %r1362,%r1359,1; and.b64 %r1363,%r1362,3; setp.eq.u64 %r1366,%r1363,0; @ %r1366 bra $L78; setp.eq.u64 %r1365,%r1363,1; @ %r1365 bra $L317; setp.eq.u64 %r1364,%r1363,2; @ %r1364 bra $L318; bra $L386; $L78: add.u64 %r397,%r390,%r400; .loc 1 2887 16 ld.f32 %r153,[%r401]; .loc 1 2886 16 add.u64 %r1176,%r397,%r389; .loc 1 2886 11 ld.f32 %r1177,[%r1176]; fma.rn.f32 %r1601,%r1177,%r153,%r241; .loc 1 2888 11 ld.f32 %r1178,[%r400]; fma.rn.f32 %r1602,%r153,%r1178,%r242; .loc 1 2890 16 add.u64 %r1179,%r397,%r388; .loc 1 2890 11 ld.f32 %r1180,[%r1179]; fma.rn.f32 %r1603,%r153,%r1180,%r243; .loc 1 2892 16 add.u64 %r1181,%r397,%r387; .loc 1 2892 11 ld.f32 %r1182,[%r1181]; fma.rn.f32 %r1604,%r153,%r1182,%r244; .loc 1 2884 10 add.u64 %r1368,%r400,4; add.u64 %r1607,%r390,%r1368; .loc 1 2887 16 ld.f32 %r1608,[%r401+4]; .loc 1 2886 16 add.u64 %r1609,%r1607,%r389; .loc 1 2886 11 ld.f32 %r1610,[%r1609]; fma.rn.f32 %r1611,%r1610,%r1608,%r1601; .loc 1 2888 11 ld.f32 %r1612,[%r1368]; fma.rn.f32 %r1613,%r1608,%r1612,%r1602; .loc 1 2890 16 add.u64 %r1614,%r1607,%r388; .loc 1 2890 11 ld.f32 %r1615,[%r1614]; fma.rn.f32 %r1616,%r1608,%r1615,%r1603; .loc 1 2892 16 add.u64 %r1617,%r1607,%r387; .loc 1 2892 11 ld.f32 %r1618,[%r1617]; fma.rn.f32 %r1619,%r1608,%r1618,%r1604; .loc 1 2884 10 add.u64 %r1621,%r400,8; add.u64 %r1623,%r390,%r1621; .loc 1 2887 16 ld.f32 %r1624,[%r401+8]; .loc 1 2886 16 add.u64 %r1625,%r1623,%r389; .loc 1 2886 11 ld.f32 %r1626,[%r1625]; fma.rn.f32 %r1627,%r1626,%r1624,%r1611; .loc 1 2888 11 ld.f32 %r1628,[%r1621]; fma.rn.f32 %r1629,%r1624,%r1628,%r1613; .loc 1 2890 16 add.u64 %r1630,%r1623,%r388; .loc 1 2890 11 ld.f32 %r1631,[%r1630]; fma.rn.f32 %r1632,%r1624,%r1631,%r1616; .loc 1 2892 16 add.u64 %r1633,%r1623,%r387; .loc 1 2892 11 ld.f32 %r1634,[%r1633]; fma.rn.f32 %r1635,%r1624,%r1634,%r1619; .loc 1 2884 10 add.u64 %r1637,%r400,12; add.u64 %r1639,%r390,%r1637; .loc 1 2887 16 ld.f32 %r1640,[%r401+12]; .loc 1 2886 16 add.u64 %r1641,%r1639,%r389; .loc 1 2886 11 ld.f32 %r1642,[%r1641]; fma.rn.f32 
%r241,%r1642,%r1640,%r1627; .loc 1 2888 11 ld.f32 %r1643,[%r1637]; fma.rn.f32 %r242,%r1640,%r1643,%r1629; .loc 1 2890 16 add.u64 %r1644,%r1639,%r388; .loc 1 2890 11 ld.f32 %r1645,[%r1644]; fma.rn.f32 %r243,%r1640,%r1645,%r1632; .loc 1 2892 16 add.u64 %r1646,%r1639,%r387; .loc 1 2892 11 ld.f32 %r1647,[%r1646]; fma.rn.f32 %r244,%r1640,%r1647,%r1635; .loc 1 2884 10 add.u64 %r401,%r401,16; add.u64 %r400,%r400,16; setp.ne.u64 %r1648,%r370,%r401; @ %r1648 bra $L78; $L77: .loc 1 2895 28 st.f32 [%r391],%r241; .loc 1 2896 32 st.f32 [%r391+4],%r242; .loc 1 2897 32 st.f32 [%r391+8],%r243; .loc 1 2898 32 st.f32 [%r391+12],%r244; .loc 1 2877 6 add.u64 %r391,%r391,16; add.u64 %r390,%r390,-4096; add.u64 %r389,%r389,4096; add.u64 %r388,%r388,4096; add.u64 %r387,%r387,4096; setp.ne.u64 %r1184,%r381,%r391; @ %r1184 bra $L79; bra $L80; $L76: .loc 1 2874 39 add.u64 %r239,%r239,1; .loc 1 2874 9 add.u64 %r379,%r379,%r574; add.u64 %r373,%r373,%r476; add.u64 %r372,%r372,%r202; add.u64 %r370,%r370,%r202; setp.ne.u64 %r1185,%r151,%r239; @ %r1185 bra $L81; bra $L60; $L75: add.u64 %r408,%r364,%r379; .loc 1 2904 13 mov.u64 %r407,%r406; $L84: .loc 1 2903 14 ld.f32 %r240,[%r408]; .loc 1 2905 10 @ %r1330 bra $L82; shl.b64 %r1188,%r407,2; add.u64 %r304,%r1002,%r1188; mov.u64 %r269,%r372; sub.u64 %r1350,%r370,%r372; add.u64 %r1351,%r1350,-4; shr.u64 %r1349,%r1351,2; add.u64 %r1352,%r1349,1; and.b64 %r1353,%r1352,3; setp.eq.u64 %r1356,%r1353,0; @ %r1356 bra $L83; setp.eq.u64 %r1355,%r1353,1; @ %r1355 bra $L319; setp.eq.u64 %r1354,%r1353,2; @ %r1354 bra $L320; bra $L387; $L83: .loc 1 2907 11 ld.f32 %r1189,[%r304]; ld.f32 %r1190,[%r269]; fma.rn.f32 %r1649,%r1189,%r1190,%r240; ld.f32 %r1652,[%r304+4]; ld.f32 %r1653,[%r269+4]; fma.rn.f32 %r1654,%r1652,%r1653,%r1649; ld.f32 %r1658,[%r304+8]; ld.f32 %r1659,[%r269+8]; fma.rn.f32 %r1660,%r1658,%r1659,%r1654; ld.f32 %r1664,[%r304+12]; ld.f32 %r1665,[%r269+12]; fma.rn.f32 %r240,%r1664,%r1665,%r1660; .loc 1 2905 10 add.u64 %r304,%r304,16; add.u64 %r269,%r269,16; setp.ne.u64 %r1666,%r370,%r269; @ %r1666 bra $L83; $L82: .loc 1 2910 28 st.f32 [%r408],%r240; .loc 1 2901 6 add.u64 %r408,%r408,4; add.u64 %r407,%r407,256; setp.ne.u64 %r1192,%r379,%r408; @ %r1192 bra $L84; bra $L76; $L20: .loc 1 2958 11 cvt.u16.u32 %r1193,%r418; setp.eq.u16 %r1194,%r1193,1; @ ! 
%r1194 bra $L388; bra $L85; $L31: .loc 1 2920 38 set.u32.eq.u64 %r1196,%r178,1; neg.s32 %r1197,%r1196; .loc 1 2920 26 cvt.u16.u32 %r1199,%r1197; and.b16 %r1198,%r1199,%r959; .loc 1 2920 11 cvt.u32.u16 %r1201,%r1198; cvt.u16.u8 %r1202,%r1201; setp.eq.u16 %r1203,%r1202,0; @ %r1203 bra $L20; .loc 1 2920 43 setp.ne.u64 %r1204,%r474,1; @ %r1204 bra $L20; $L121: .loc 1 2922 10 cvt.u16.u32 %r1205,%r418; setp.ne.u16 %r1206,%r1205,1; @ %r1206 bra $L87; .loc 1 2948 4 setp.gt.s64 %r1207,%r187,0; @ %r1207 bra $L88; bra $L1; $L87: .loc 1 2929 4 setp.le.s64 %r1208,%r187,0; @ %r1208 bra $L1; shl.b64 %r614,%r476,2; shl.b64 %r1209,%r186,2; add.u64 %r613,%r217,%r1209; shl.b64 %r604,%r475,2; shl.b64 %r1210,%r428,2; add.u64 %r605,%r215,%r1210; mov.u64 %r618,0; mov.u64 %r611,%r618; .loc 1 2929 11 mov.u64 %r272,%r618; setp.gt.s64 %r1667,%r186,0; setp.le.s64 %r1346,%r428,0; $L91: .loc 1 2933 8 @ %r1667 bra $L90; $L95: .loc 1 2929 29 add.u64 %r272,%r272,1; .loc 1 2929 4 add.u64 %r611,%r611,%r476; add.u64 %r613,%r613,%r614; add.u64 %r618,%r618,%r179; setp.ne.u64 %r1212,%r187,%r272; @ %r1212 bra $L91; bra $L1; $L90: shl.b64 %r1213,%r611,2; add.u64 %r600,%r217,%r1213; shl.b64 %r1214,%r618,2; add.u64 %r623,%r216,%r1214; .loc 1 2933 8 mov.u64 %r602,%r605; mov.u64 %r601,0; $L94: .loc 1 2937 5 @ %r1346 bra $L140; shl.b64 %r1216,%r601,2; add.u64 %r596,%r215,%r1216; mov.u64 %r597,%r623; .loc 1 2936 7 mov.f32 %r273,0f00000000; sub.u64 %r1419,%r602,%r596; add.u64 %r1420,%r1419,-4; shr.u64 %r1418,%r1420,2; add.u64 %r1421,%r1418,1; and.b64 %r1422,%r1421,3; setp.eq.u64 %r1425,%r1422,0; @ %r1425 bra $L93; setp.eq.u64 %r1424,%r1422,1; @ %r1424 bra $L321; setp.eq.u64 %r1423,%r1422,2; @ %r1423 bra $L322; bra $L389; $L93: .loc 1 2938 9 ld.f32 %r1217,[%r596]; ld.f32 %r1218,[%r597]; fma.rn.f32 %r1668,%r1217,%r1218,%r273; ld.f32 %r1671,[%r596+4]; ld.f32 %r1672,[%r597+4]; fma.rn.f32 %r1673,%r1671,%r1672,%r1668; ld.f32 %r1677,[%r596+8]; ld.f32 %r1678,[%r597+8]; fma.rn.f32 %r1679,%r1677,%r1678,%r1673; ld.f32 %r1683,[%r596+12]; ld.f32 %r1684,[%r597+12]; fma.rn.f32 %r273,%r1683,%r1684,%r1679; .loc 1 2937 5 add.u64 %r596,%r596,16; add.u64 %r597,%r597,16; setp.ne.u64 %r1685,%r596,%r602; @ %r1685 bra $L93; bra $L92; $L140: .loc 1 2936 7 mov.f32 %r273,0f00000000; $L92: .loc 1 2939 15 st.f32 [%r600],%r273; .loc 1 2933 8 add.u64 %r600,%r600,4; add.u64 %r601,%r601,%r475; add.u64 %r602,%r602,%r604; setp.ne.u64 %r1220,%r600,%r613; @ %r1220 bra $L94; bra $L95; $L88: shl.b64 %r634,%r476,2; mov.u64 %r632,%r217; shl.b64 %r626,%r475,2; .loc 1 2948 4 mov.u64 %r635,0; .loc 1 2948 11 mov.u64 %r270,%r635; setp.le.s64 %r1686,%r428,0; $L98: .loc 1 2952 8 @ %r1686 bra $L141; mov.u64 %r624,%r215; shl.b64 %r1222,%r635,2; add.u64 %r627,%r1222,%r216; add.u64 %r1223,%r428,%r635; shl.b64 %r1224,%r1223,2; add.u64 %r643,%r1224,%r216; .loc 1 2951 10 mov.f32 %r271,0f00000000; sub.u64 %r1429,%r643,%r627; add.u64 %r1430,%r1429,-4; shr.u64 %r1428,%r1430,2; add.u64 %r1431,%r1428,1; and.b64 %r1432,%r1431,3; setp.eq.u64 %r1435,%r1432,0; @ %r1435 bra $L97; setp.eq.u64 %r1434,%r1432,1; @ %r1434 bra $L323; setp.eq.u64 %r1433,%r1432,2; @ %r1433 bra $L324; bra $L390; $L97: .loc 1 2953 5 ld.f32 %r1225,[%r624]; ld.f32 %r1226,[%r627]; fma.rn.f32 %r1687,%r1225,%r1226,%r271; .loc 1 2952 8 add.u64 %r1688,%r624,%r626; .loc 1 2953 5 ld.f32 %r1690,[%r1688]; ld.f32 %r1691,[%r627+4]; fma.rn.f32 %r1692,%r1690,%r1691,%r1687; .loc 1 2952 8 add.u64 %r1693,%r1688,%r626; .loc 1 2953 5 ld.f32 %r1696,[%r1693]; ld.f32 %r1697,[%r627+8]; fma.rn.f32 %r1698,%r1696,%r1697,%r1692; .loc 1 2952 8 add.u64 
%r1699,%r1693,%r626; .loc 1 2953 5 ld.f32 %r1702,[%r1699]; ld.f32 %r1703,[%r627+12]; fma.rn.f32 %r271,%r1702,%r1703,%r1698; .loc 1 2952 8 add.u64 %r624,%r1699,%r626; add.u64 %r627,%r627,16; setp.ne.u64 %r1704,%r627,%r643; @ %r1704 bra $L97; bra $L96; $L141: .loc 1 2951 10 mov.f32 %r271,0f00000000; $L96: .loc 1 2954 25 st.f32 [%r632],%r271; .loc 1 2948 29 add.u64 %r270,%r270,1; .loc 1 2948 4 add.u64 %r632,%r632,%r634; add.u64 %r635,%r635,%r179; setp.ne.u64 %r1228,%r187,%r270; @ %r1228 bra $L98; bra $L1; $L85: .loc 1 2963 7 setp.le.s64 %r1229,%r187,0; @ %r1229 bra $L1; shl.b64 %r654,%r416,2; mov.u64 %r652,%r217; shl.b64 %r646,%r475,2; shl.b64 %r649,%r474,2; mov.u64 %r655,0; .loc 1 2963 14 mov.u64 %r283,%r655; setp.le.s64 %r1705,%r428,0; $L101: .loc 1 2967 4 @ %r1705 bra $L142; mov.u64 %r644,%r215; shl.b64 %r1231,%r655,2; add.u64 %r647,%r216,%r1231; .loc 1 2966 6 mov.f32 %r284,0f00000000; .loc 1 2967 11 mov.u64 %r285,0; and.b64 %r1440,%r428,3; setp.eq.u64 %r1443,%r1440,0; @ %r1443 bra $L100; setp.eq.u64 %r1442,%r1440,1; @ %r1442 bra $L325; setp.eq.u64 %r1441,%r1440,2; @ %r1441 bra $L326; bra $L391; $L100: .loc 1 2968 8 ld.f32 %r1232,[%r644]; ld.f32 %r1233,[%r647]; fma.rn.f32 %r1706,%r1232,%r1233,%r284; .loc 1 2967 4 add.u64 %r1708,%r644,%r646; add.u64 %r1709,%r647,%r649; .loc 1 2968 8 ld.f32 %r1710,[%r1708]; ld.f32 %r1711,[%r1709]; fma.rn.f32 %r1712,%r1710,%r1711,%r1706; .loc 1 2967 4 add.u64 %r1714,%r1708,%r646; add.u64 %r1715,%r1709,%r649; .loc 1 2968 8 ld.f32 %r1717,[%r1714]; ld.f32 %r1718,[%r1715]; fma.rn.f32 %r1719,%r1717,%r1718,%r1712; .loc 1 2967 4 add.u64 %r1721,%r1714,%r646; add.u64 %r1722,%r1715,%r649; .loc 1 2968 8 ld.f32 %r1724,[%r1721]; ld.f32 %r1725,[%r1722]; fma.rn.f32 %r284,%r1724,%r1725,%r1719; .loc 1 2967 28 add.u64 %r285,%r285,4; .loc 1 2967 4 add.u64 %r644,%r1721,%r646; add.u64 %r647,%r1722,%r649; setp.ne.u64 %r1726,%r428,%r285; @ %r1726 bra $L100; bra $L99; $L142: .loc 1 2966 6 mov.f32 %r284,0f00000000; $L99: .loc 1 2969 21 st.f32 [%r652],%r284; .loc 1 2963 32 add.u64 %r283,%r283,1; .loc 1 2963 7 add.u64 %r652,%r652,%r654; add.u64 %r655,%r655,%r179; setp.ne.u64 %r1235,%r187,%r283; @ %r1235 bra $L101; bra $L1; $L388: .loc 1 2972 11 setp.gt.s64 %r1236,%r178,%r475; @ %r1236 bra $L102; .loc 1 2993 7 setp.gt.s64 %r1237,%r187,0; @ %r1237 bra $L103; bra $L1; $L102: .loc 1 2974 7 setp.gt.s64 %r1238,%r187,0; @ %r1238 bra $L104; bra $L1; $L108: shl.b64 %r1239,%r688,2; add.u64 %r684,%r217,%r1239; .loc 1 2975 9 mov.u64 %r282,0; and.b64 %r1456,%r186,3; setp.eq.u64 %r1459,%r1456,0; @ %r1459 bra $L105; setp.eq.u64 %r1458,%r1456,1; @ %r1458 bra $L327; setp.eq.u64 %r1457,%r1456,2; @ %r1457 bra $L328; bra $L392; $L105: .loc 1 2976 34 st.f32 [%r684],%r1347; .loc 1 2975 2 add.u64 %r1728,%r684,%r686; .loc 1 2976 34 st.f32 [%r1728],%r1347; .loc 1 2975 2 add.u64 %r1730,%r1728,%r686; .loc 1 2976 34 st.f32 [%r1730],%r1347; .loc 1 2975 2 add.u64 %r1733,%r1730,%r686; .loc 1 2976 34 st.f32 [%r1733],%r1347; .loc 1 2975 27 add.u64 %r282,%r282,4; .loc 1 2975 2 add.u64 %r684,%r1733,%r686; setp.ne.u64 %r1735,%r186,%r282; @ %r1735 bra $L105; $L109: .loc 1 2974 32 add.u64 %r281,%r296,1; .loc 1 2974 7 add.u64 %r688,%r688,%r476; setp.ne.u64 %r1242,%r187,%r281; @ %r1242 bra $L143; shl.b64 %r672,%r474,2; shl.b64 %r663,%r475,2; mov.u64 %r677,0; mov.u64 %r676,%r677; mov.u64 %r297,%r677; setp.le.s64 %r1736,%r428,0; bra $L107; $L104: shl.b64 %r686,%r416,2; mov.u64 %r688,0; .loc 1 2974 14 mov.u64 %r296,%r688; setp.gt.s64 %r1737,%r186,0; .loc 1 2976 34 mov.f32 %r1347,0f00000000; bra $L106; $L143: mov.u64 
%r296,%r281; $L106: .loc 1 2975 2 @ %r1737 bra $L108; bra $L109; $L112: shl.b64 %r1244,%r668,2; add.u64 %r661,%r215,%r1244; .loc 1 2980 4 mov.u64 %r660,%r681; .loc 1 2980 11 mov.u64 %r280,0; and.b64 %r1448,%r186,3; setp.eq.u64 %r1451,%r1448,0; @ %r1451 bra $L110; setp.eq.u64 %r1450,%r1448,1; @ %r1450 bra $L329; setp.eq.u64 %r1449,%r1448,2; @ %r1449 bra $L330; bra $L393; $L110: .loc 1 2982 36 ld.f32 %r1245,[%r661]; ld.f32 %r1246,[%r670]; ld.f32 %r1247,[%r660]; fma.rn.f32 %r175,%r1245,%r1246,%r1247; st.f32 [%r660],%r175; .loc 1 2980 4 add.u64 %r1739,%r660,%r686; add.u64 %r1740,%r661,%r663; .loc 1 2982 36 ld.f32 %r1741,[%r1740]; ld.f32 %r1742,[%r670]; ld.f32 %r1743,[%r1739]; fma.rn.f32 %r1744,%r1741,%r1742,%r1743; st.f32 [%r1739],%r1744; .loc 1 2980 4 add.u64 %r1746,%r1739,%r686; add.u64 %r1747,%r1740,%r663; .loc 1 2982 36 ld.f32 %r1749,[%r1747]; ld.f32 %r1750,[%r670]; ld.f32 %r1751,[%r1746]; fma.rn.f32 %r1752,%r1749,%r1750,%r1751; st.f32 [%r1746],%r1752; .loc 1 2980 4 add.u64 %r1754,%r1746,%r686; add.u64 %r1755,%r1747,%r663; .loc 1 2982 36 ld.f32 %r1757,[%r1755]; ld.f32 %r1758,[%r670]; ld.f32 %r1759,[%r1754]; fma.rn.f32 %r1760,%r1757,%r1758,%r1759; st.f32 [%r1754],%r1760; .loc 1 2980 29 add.u64 %r280,%r280,4; .loc 1 2980 4 add.u64 %r660,%r1754,%r686; add.u64 %r661,%r1755,%r663; setp.ne.u64 %r1761,%r186,%r280; @ %r1761 bra $L110; $L113: .loc 1 2979 26 add.u64 %r279,%r279,1; .loc 1 2979 2 add.u64 %r668,%r668,%r178; add.u64 %r670,%r670,%r672; setp.eq.u64 %r1249,%r428,%r279; @ %r1249 bra $L111; $L114: .loc 1 2980 4 @ %r1737 bra $L112; bra $L113; $L111: .loc 1 2978 32 add.u64 %r278,%r297,1; .loc 1 2978 7 add.u64 %r676,%r676,%r476; add.u64 %r677,%r677,%r179; setp.eq.u64 %r1251,%r296,%r297; @ %r1251 bra $L1; mov.u64 %r297,%r278; $L107: .loc 1 2979 2 @ %r1736 bra $L111; shl.b64 %r1253,%r677,2; add.u64 %r670,%r216,%r1253; shl.b64 %r1254,%r676,2; add.u64 %r681,%r217,%r1254; mov.u64 %r668,0; .loc 1 2979 9 mov.u64 %r279,%r668; bra $L114; $L103: shl.b64 %r703,%r416,2; shl.b64 %r694,%r178,2; shl.b64 %r698,%r474,2; .loc 1 2993 7 mov.u64 %r712,0; mov.u64 %r710,%r712; .loc 1 2993 14 mov.u64 %r274,%r712; setp.gt.s64 %r1762,%r186,0; setp.le.s64 %r1348,%r428,0; $L116: .loc 1 2997 4 @ %r1762 bra $L115; $L120: .loc 1 2993 32 add.u64 %r274,%r274,1; .loc 1 2993 7 add.u64 %r710,%r710,%r476; add.u64 %r712,%r712,%r179; setp.ne.u64 %r1256,%r187,%r274; @ %r1256 bra $L116; bra $L1; $L115: shl.b64 %r1257,%r710,2; add.u64 %r701,%r217,%r1257; shl.b64 %r1258,%r712,2; add.u64 %r718,%r216,%r1258; .loc 1 2997 4 mov.u64 %r705,0; .loc 1 2997 11 mov.u64 %r275,%r705; $L119: .loc 1 3001 8 @ %r1348 bra $L144; shl.b64 %r1260,%r705,2; add.u64 %r692,%r215,%r1260; mov.u64 %r696,%r718; .loc 1 3000 10 mov.f32 %r276,0f00000000; .loc 1 3001 15 mov.u64 %r277,0; and.b64 %r1464,%r428,3; setp.eq.u64 %r1467,%r1464,0; @ %r1467 bra $L118; setp.eq.u64 %r1466,%r1464,1; @ %r1466 bra $L331; setp.eq.u64 %r1465,%r1464,2; @ %r1465 bra $L332; bra $L394; $L118: .loc 1 3002 5 ld.f32 %r1261,[%r692]; ld.f32 %r1262,[%r696]; fma.rn.f32 %r1764,%r1261,%r1262,%r276; .loc 1 3001 8 add.u64 %r1766,%r692,%r694; add.u64 %r1767,%r696,%r698; .loc 1 3002 5 ld.f32 %r1768,[%r1766]; ld.f32 %r1769,[%r1767]; fma.rn.f32 %r1770,%r1768,%r1769,%r1764; .loc 1 3001 8 add.u64 %r1772,%r1766,%r694; add.u64 %r1773,%r1767,%r698; .loc 1 3002 5 ld.f32 %r1775,[%r1772]; ld.f32 %r1776,[%r1773]; fma.rn.f32 %r1777,%r1775,%r1776,%r1770; .loc 1 3001 8 add.u64 %r1779,%r1772,%r694; add.u64 %r1780,%r1773,%r698; .loc 1 3002 5 ld.f32 %r1782,[%r1779]; ld.f32 %r1783,[%r1780]; fma.rn.f32 
%r276,%r1782,%r1783,%r1777; .loc 1 3001 32 add.u64 %r277,%r277,4; .loc 1 3001 8 add.u64 %r692,%r1779,%r694; add.u64 %r696,%r1780,%r698; setp.ne.u64 %r1784,%r428,%r277; @ %r1784 bra $L118; bra $L117; $L144: .loc 1 3000 10 mov.f32 %r276,0f00000000; $L117: .loc 1 3003 27 st.f32 [%r701],%r276; .loc 1 2997 29 add.u64 %r275,%r275,1; .loc 1 2997 4 add.u64 %r701,%r701,%r703; add.u64 %r705,%r705,%r475; setp.ne.u64 %r1264,%r186,%r275; @ %r1264 bra $L119; bra $L120; $L32: .loc 1 2920 11 setp.eq.u64 %r1265,%r178,1; @ %r1265 bra $L121; bra $L20; $L2: .loc 1 2476 6 ld.u64 %r1266,[%r719]; setp.ne.u64 %r1267,%r1266,0; @ %r1267 bra $L122; .loc 1 2485 4 ld.u64 %r1269,[%r720+56]; add.u64 %r1268,%r1269,1; ld.u64 %r1270,[%r720+48]; sub.u64 %r480,%r1268,%r1270; .loc 1 2483 15 ld.s8 %r1272,[%r721+28]; cvt.u16.u32 %r1271,%r1272; setp.eq.u16 %r1273,%r1271,1; @ %r1273 bra $L123; bra $L124; $L126: .loc 1 2560 16 ld.u64 %r475,[%r720+40]; .loc 1 2548 27 mov.u64 %r416,%r476; bra $L125; $L14: ld.u64 %r476,[%r719+40]; .loc 1 2543 6 ld.s8 %r1275,[%r719+28]; cvt.u16.u32 %r1274,%r1275; setp.eq.u16 %r1276,%r1274,1; @ ! %r1276 bra $L127; bra $L126; $L11: .loc 1 2560 16 ld.u64 %r475,[%r720+40]; mov.u32 %r418,%r303; .loc 1 2548 27 mov.u64 %r476,%r416; .loc 1 2563 14 mov.u64 %r186,1; .loc 1 2561 16 mov.u64 %r178,%r186; bra $L12; $L5: .loc 1 2485 4 ld.u64 %r1278,[%r720+56]; add.u64 %r1277,%r1278,1; ld.u64 %r1279,[%r720+48]; sub.u64 %r480,%r1277,%r1279; bra $L124; $L128: .loc 1 2509 17 ld.u64 %r1281,[%r719+56]; add.u64 %r1280,%r1281,1; .loc 1 2509 15 ld.u64 %r1282,[%r719+48]; sub.u64 %r428,%r1280,%r1282; ld.s8 %r1286,[%r721+28]; bra $L9; $L122: .loc 1 2502 12 cvta.global.u64 %r1283,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r1284,[%r1283+36]; setp.ne.u32 %r1285,%r1284,0; @ ! %r1285 bra $L8; bra $L128; $L1$L387: .loc 1 2907 11 ld.f32 %r1785,[%r304]; ld.f32 %r1786,[%r372]; fma.rn.f32 %r240,%r1785,%r1786,%r240; .loc 1 2905 10 add.u64 %r304,%r304,4; add.u64 %r269,%r372,4; $L320: .loc 1 2907 11 ld.f32 %r1788,[%r304]; ld.f32 %r1789,[%r269]; fma.rn.f32 %r240,%r1788,%r1789,%r240; .loc 1 2905 10 add.u64 %r304,%r304,4; add.u64 %r269,%r269,4; $L319: .loc 1 2907 11 ld.f32 %r1791,[%r304]; ld.f32 %r1792,[%r269]; fma.rn.f32 %r240,%r1791,%r1792,%r240; .loc 1 2905 10 add.u64 %r304,%r304,4; add.u64 %r269,%r269,4; setp.ne.u64 %r1793,%r370,%r269; @ %r1793 bra $L83; bra $L82; $L386: add.u64 %r1794,%r390,%r400; .loc 1 2887 16 ld.f32 %r1795,[%r372]; .loc 1 2886 16 add.u64 %r1796,%r1794,%r389; .loc 1 2886 11 ld.f32 %r1797,[%r1796]; fma.rn.f32 %r241,%r1797,%r1795,%r241; .loc 1 2888 11 ld.f32 %r1798,[%r400]; fma.rn.f32 %r242,%r1795,%r1798,%r242; .loc 1 2890 16 add.u64 %r1799,%r1794,%r388; .loc 1 2890 11 ld.f32 %r1800,[%r1799]; fma.rn.f32 %r243,%r1795,%r1800,%r243; .loc 1 2892 16 add.u64 %r1801,%r1794,%r387; .loc 1 2892 11 ld.f32 %r1802,[%r1801]; fma.rn.f32 %r244,%r1795,%r1802,%r244; .loc 1 2884 10 add.u64 %r401,%r372,4; add.u64 %r400,%r400,4; $L318: add.u64 %r1804,%r390,%r400; .loc 1 2887 16 ld.f32 %r1805,[%r401]; .loc 1 2886 16 add.u64 %r1806,%r1804,%r389; .loc 1 2886 11 ld.f32 %r1807,[%r1806]; fma.rn.f32 %r241,%r1807,%r1805,%r241; .loc 1 2888 11 ld.f32 %r1808,[%r400]; fma.rn.f32 %r242,%r1805,%r1808,%r242; .loc 1 2890 16 add.u64 %r1809,%r1804,%r388; .loc 1 2890 11 ld.f32 %r1810,[%r1809]; fma.rn.f32 %r243,%r1805,%r1810,%r243; .loc 1 2892 16 add.u64 %r1811,%r1804,%r387; .loc 1 2892 11 ld.f32 %r1812,[%r1811]; fma.rn.f32 %r244,%r1805,%r1812,%r244; .loc 1 2884 10 add.u64 %r401,%r401,4; add.u64 %r400,%r400,4; $L317: add.u64 %r1814,%r390,%r400; .loc 1 
2887 16 ld.f32 %r1815,[%r401]; .loc 1 2886 16 add.u64 %r1816,%r1814,%r389; .loc 1 2886 11 ld.f32 %r1817,[%r1816]; fma.rn.f32 %r241,%r1817,%r1815,%r241; .loc 1 2888 11 ld.f32 %r1818,[%r400]; fma.rn.f32 %r242,%r1815,%r1818,%r242; .loc 1 2890 16 add.u64 %r1819,%r1814,%r388; .loc 1 2890 11 ld.f32 %r1820,[%r1819]; fma.rn.f32 %r243,%r1815,%r1820,%r243; .loc 1 2892 16 add.u64 %r1821,%r1814,%r387; .loc 1 2892 11 ld.f32 %r1822,[%r1821]; fma.rn.f32 %r244,%r1815,%r1822,%r244; .loc 1 2884 10 add.u64 %r401,%r401,4; add.u64 %r400,%r400,4; setp.ne.u64 %r1823,%r370,%r401; @ %r1823 bra $L78; bra $L77; $L385: .loc 1 2855 16 ld.f32 %r1824,[%r361]; .loc 1 2855 11 ld.f32 %r1825,[%r359]; fma.rn.f32 %r246,%r1824,%r1825,%r246; .loc 1 2857 11 ld.f32 %r1826,[%r49]; fma.rn.f32 %r247,%r1824,%r1826,%r247; .loc 1 2859 11 ld.f32 %r1827,[%r211]; fma.rn.f32 %r248,%r1824,%r1827,%r248; .loc 1 2861 11 ld.f32 %r1828,[%r212]; fma.rn.f32 %r249,%r1824,%r1828,%r249; .loc 1 2853 10 add.u64 %r361,%r361,4; add.u64 %r359,%r359,4; add.u64 %r357,%r49,4; add.u64 %r355,%r211,4; add.u64 %r353,%r212,4; $L316: .loc 1 2855 16 ld.f32 %r1830,[%r361]; .loc 1 2855 11 ld.f32 %r1831,[%r359]; fma.rn.f32 %r246,%r1830,%r1831,%r246; .loc 1 2857 11 ld.f32 %r1832,[%r357]; fma.rn.f32 %r247,%r1830,%r1832,%r247; .loc 1 2859 11 ld.f32 %r1833,[%r355]; fma.rn.f32 %r248,%r1830,%r1833,%r248; .loc 1 2861 11 ld.f32 %r1834,[%r353]; fma.rn.f32 %r249,%r1830,%r1834,%r249; .loc 1 2853 10 add.u64 %r361,%r361,4; add.u64 %r359,%r359,4; add.u64 %r357,%r357,4; add.u64 %r355,%r355,4; add.u64 %r353,%r353,4; $L315: .loc 1 2855 16 ld.f32 %r1836,[%r361]; .loc 1 2855 11 ld.f32 %r1837,[%r359]; fma.rn.f32 %r246,%r1836,%r1837,%r246; .loc 1 2857 11 ld.f32 %r1838,[%r357]; fma.rn.f32 %r247,%r1836,%r1838,%r247; .loc 1 2859 11 ld.f32 %r1839,[%r355]; fma.rn.f32 %r248,%r1836,%r1839,%r248; .loc 1 2861 11 ld.f32 %r1840,[%r353]; fma.rn.f32 %r249,%r1836,%r1840,%r249; .loc 1 2853 10 add.u64 %r361,%r361,4; add.u64 %r359,%r359,4; add.u64 %r357,%r357,4; add.u64 %r355,%r355,4; add.u64 %r353,%r353,4; setp.ne.u64 %r1841,%r197,%r359; @ %r1841 bra $L70; bra $L69; $L384: add.u64 %r1842,%r317,%r331; .loc 1 2793 19 add.u64 %r1843,%r1842,%r316; ld.f32 %r1844,[%r1843]; .loc 1 2794 14 ld.f32 %r1845,[%r333]; .loc 1 2793 14 fma.rn.f32 %r251,%r1844,%r1845,%r251; .loc 1 2795 19 ld.f32 %r1846,[%r331]; .loc 1 2795 14 fma.rn.f32 %r252,%r1845,%r1846,%r252; add.u64 %r1847,%r333,%r458; .loc 1 2798 14 add.u64 %r1848,%r1847,%r455; ld.f32 %r1849,[%r1848]; .loc 1 2797 14 fma.rn.f32 %r253,%r1844,%r1849,%r253; .loc 1 2799 14 fma.rn.f32 %r254,%r1846,%r1849,%r254; .loc 1 2802 14 add.u64 %r1850,%r1847,%r450; ld.f32 %r1851,[%r1850]; .loc 1 2801 14 fma.rn.f32 %r255,%r1844,%r1851,%r255; .loc 1 2803 14 fma.rn.f32 %r256,%r1846,%r1851,%r256; .loc 1 2806 14 add.u64 %r1852,%r1847,%r448; ld.f32 %r1853,[%r1852]; .loc 1 2805 14 fma.rn.f32 %r257,%r1844,%r1853,%r257; .loc 1 2807 14 fma.rn.f32 %r258,%r1846,%r1853,%r258; .loc 1 2809 19 add.u64 %r1854,%r1842,%r315; ld.f32 %r1855,[%r1854]; .loc 1 2809 14 fma.rn.f32 %r259,%r1845,%r1855,%r259; .loc 1 2811 19 add.u64 %r1856,%r1842,%r314; ld.f32 %r1857,[%r1856]; .loc 1 2811 14 fma.rn.f32 %r260,%r1845,%r1857,%r260; .loc 1 2813 14 fma.rn.f32 %r261,%r1849,%r1855,%r261; .loc 1 2815 14 fma.rn.f32 %r262,%r1849,%r1857,%r262; .loc 1 2817 14 fma.rn.f32 %r263,%r1851,%r1855,%r263; .loc 1 2819 14 fma.rn.f32 %r264,%r1851,%r1857,%r264; .loc 1 2821 14 fma.rn.f32 %r265,%r1853,%r1855,%r265; .loc 1 2823 14 fma.rn.f32 %r266,%r1853,%r1857,%r266; .loc 1 2791 6 add.u64 %r333,%r333,4; add.u64 %r331,%r331,4; setp.ne.u64 
%r1858,%r197,%r333; @ %r1858 bra $L65; bra $L64; $L383: .loc 1 2762 43 ld.f32 %r1859,[%r528]; st.f32 [%r302],%r1859; .loc 1 2760 9 add.u64 %r57,%r528,4; add.u64 %r73,%r302,1024; $L314: .loc 1 2762 43 ld.f32 %r1861,[%r57]; st.f32 [%r73],%r1861; .loc 1 2760 9 add.u64 %r57,%r57,4; add.u64 %r73,%r73,1024; $L313: .loc 1 2762 43 ld.f32 %r1863,[%r57]; st.f32 [%r73],%r1863; .loc 1 2760 9 add.u64 %r57,%r57,4; add.u64 %r73,%r73,1024; setp.ne.u64 %r1864,%r57,%r136; @ %r1864 bra $L58; bra $L48; $L382: .loc 1 2741 7 add.u64 %r1865,%r508,%r511; add.u64 %r1866,%r1865,%r512; .loc 1 2740 49 ld.f32 %r1867,[%r1866]; st.f32 [%r199],%r1867; .loc 1 2742 49 ld.f32 %r1868,[%r508]; st.f32 [%r199+4],%r1868; .loc 1 2745 7 add.u64 %r1869,%r508,%r512; add.u64 %r1870,%r1869,%r511; .loc 1 2744 49 ld.f32 %r1871,[%r1870+4]; st.f32 [%r199+1024],%r1871; .loc 1 2746 49 ld.f32 %r1872,[%r508+4]; st.f32 [%r199+1028],%r1872; .loc 1 2738 9 add.u64 %r199,%r199,2048; add.u64 %r181,%r508,8; $L312: .loc 1 2741 7 add.u64 %r1874,%r181,%r511; add.u64 %r1875,%r1874,%r512; .loc 1 2740 49 ld.f32 %r1876,[%r1875]; st.f32 [%r199],%r1876; .loc 1 2742 49 ld.f32 %r1877,[%r181]; st.f32 [%r199+4],%r1877; .loc 1 2745 7 add.u64 %r1878,%r181,%r512; add.u64 %r1879,%r1878,%r511; .loc 1 2744 49 ld.f32 %r1880,[%r1879+4]; st.f32 [%r199+1024],%r1880; .loc 1 2746 49 ld.f32 %r1881,[%r181+4]; st.f32 [%r199+1028],%r1881; .loc 1 2738 9 add.u64 %r199,%r199,2048; add.u64 %r181,%r181,8; $L311: .loc 1 2741 7 add.u64 %r1883,%r181,%r511; add.u64 %r1884,%r1883,%r512; .loc 1 2740 49 ld.f32 %r1885,[%r1884]; st.f32 [%r199],%r1885; .loc 1 2742 49 ld.f32 %r1886,[%r181]; st.f32 [%r199+4],%r1886; .loc 1 2745 7 add.u64 %r1887,%r181,%r512; add.u64 %r1888,%r1887,%r511; .loc 1 2744 49 ld.f32 %r1889,[%r1888+4]; st.f32 [%r199+1024],%r1889; .loc 1 2746 49 ld.f32 %r1890,[%r181+4]; st.f32 [%r199+1028],%r1890; .loc 1 2738 9 add.u64 %r199,%r199,2048; add.u64 %r181,%r181,8; setp.ne.u64 %r1891,%r181,%r499; @ %r1891 bra $L52; bra $L53; $L380: .loc 1 2688 2 @ ! %r1303 bra $L395; .loc 1 2689 22 mov.u32 %r188921393,[%value_in]; } $L395: .loc 1 2687 24 mov.u64 %r268,2; .loc 1 2687 7 add.u64 %r591,%r217,%r592; $L310: .loc 1 2688 2 @ ! %r1303 bra $L396; .loc 1 2689 22 mov.u32 %r188951396,[%value_in]; } $L396: .loc 1 2687 24 add.u64 %r268,%r268,1; .loc 1 2687 7 add.u64 %r591,%r591,%r592; $L309: .loc 1 2688 2 @ ! %r1303 bra $L397; .loc 1 2689 22 mov.u32 %r18891399,[%value_in]; } $L397: .loc 1 2687 24 add.u64 %r268,%r268,1; .loc 1 2687 7 add.u64 %r591,%r591,%r592; setp.eq.u64 %r1900,%r268,%r595; @ ! %r1900 bra $L34; bra $L33; $L247: .loc 1 2689 22 mov.u32 %r190114901131902,[%value_in]; } $L381: .loc 1 2687 7 add.u64 %r1904,%r1470,%r592; .loc 1 2688 2 @ ! %r1303 bra $L398; .loc 1 2689 22 mov.u32 %r19061904906131907,[%value_in]; } $L398: .loc 1 2687 7 add.u64 %r1909,%r1904,%r592; .loc 1 2688 2 @ ! %r1303 bra $L399; .loc 1 2689 22 mov.u32 %r19111909911131912,[%value_in]; } $L399: .loc 1 2687 24 add.u64 %r268,%r1417,3; .loc 1 2687 7 add.u64 %r591,%r1909,%r592; setp.eq.u64 %r1913,%r268,%r595; @ ! 
%r1913 bra $L34; bra $L33; $L389: .loc 1 2938 9 ld.f32 %r1914,[%r596]; ld.f32 %r1915,[%r623]; fma.rn.f32 %r273,%r1914,%r1915,%r273; .loc 1 2937 5 add.u64 %r596,%r596,4; add.u64 %r597,%r623,4; $L322: .loc 1 2938 9 ld.f32 %r1917,[%r596]; ld.f32 %r1918,[%r597]; fma.rn.f32 %r273,%r1917,%r1918,%r273; .loc 1 2937 5 add.u64 %r596,%r596,4; add.u64 %r597,%r597,4; $L321: .loc 1 2938 9 ld.f32 %r1920,[%r596]; ld.f32 %r1921,[%r597]; fma.rn.f32 %r273,%r1920,%r1921,%r273; .loc 1 2937 5 add.u64 %r596,%r596,4; add.u64 %r597,%r597,4; setp.ne.u64 %r1922,%r596,%r602; @ %r1922 bra $L93; bra $L92; $L390: .loc 1 2953 5 ld.f32 %r1923,[%r215]; ld.f32 %r1924,[%r627]; fma.rn.f32 %r271,%r1923,%r1924,%r271; .loc 1 2952 8 add.u64 %r624,%r215,%r626; add.u64 %r627,%r627,4; $L324: .loc 1 2953 5 ld.f32 %r1926,[%r624]; ld.f32 %r1927,[%r627]; fma.rn.f32 %r271,%r1926,%r1927,%r271; .loc 1 2952 8 add.u64 %r624,%r624,%r626; add.u64 %r627,%r627,4; $L323: .loc 1 2953 5 ld.f32 %r1929,[%r624]; ld.f32 %r1930,[%r627]; fma.rn.f32 %r271,%r1929,%r1930,%r271; .loc 1 2952 8 add.u64 %r624,%r624,%r626; add.u64 %r627,%r627,4; setp.ne.u64 %r1931,%r627,%r643; @ %r1931 bra $L97; bra $L96; $L391: .loc 1 2968 8 ld.f32 %r1932,[%r215]; ld.f32 %r1933,[%r647]; fma.rn.f32 %r284,%r1932,%r1933,%r284; .loc 1 2967 28 mov.u64 %r285,1; .loc 1 2967 4 add.u64 %r644,%r215,%r646; add.u64 %r647,%r647,%r649; $L326: .loc 1 2968 8 ld.f32 %r1935,[%r644]; ld.f32 %r1936,[%r647]; fma.rn.f32 %r284,%r1935,%r1936,%r284; .loc 1 2967 28 add.u64 %r285,%r285,1; .loc 1 2967 4 add.u64 %r644,%r644,%r646; add.u64 %r647,%r647,%r649; $L325: .loc 1 2968 8 ld.f32 %r1938,[%r644]; ld.f32 %r1939,[%r647]; fma.rn.f32 %r284,%r1938,%r1939,%r284; .loc 1 2967 28 add.u64 %r285,%r285,1; .loc 1 2967 4 add.u64 %r644,%r644,%r646; add.u64 %r647,%r647,%r649; setp.ne.u64 %r1940,%r428,%r285; @ %r1940 bra $L100; bra $L99; $L393: .loc 1 2982 36 ld.f32 %r1941,[%r661]; ld.f32 %r1942,[%r670]; ld.f32 %r1943,[%r681]; fma.rn.f32 %r1944,%r1941,%r1942,%r1943; st.f32 [%r681],%r1944; .loc 1 2980 29 mov.u64 %r280,1; .loc 1 2980 4 add.u64 %r660,%r681,%r686; add.u64 %r661,%r661,%r663; $L330: .loc 1 2982 36 ld.f32 %r1946,[%r661]; ld.f32 %r1947,[%r670]; ld.f32 %r1948,[%r660]; fma.rn.f32 %r1949,%r1946,%r1947,%r1948; st.f32 [%r660],%r1949; .loc 1 2980 29 add.u64 %r280,%r280,1; .loc 1 2980 4 add.u64 %r660,%r660,%r686; add.u64 %r661,%r661,%r663; $L329: .loc 1 2982 36 ld.f32 %r1951,[%r661]; ld.f32 %r1952,[%r670]; ld.f32 %r1953,[%r660]; fma.rn.f32 %r1954,%r1951,%r1952,%r1953; st.f32 [%r660],%r1954; .loc 1 2980 29 add.u64 %r280,%r280,1; .loc 1 2980 4 add.u64 %r660,%r660,%r686; add.u64 %r661,%r661,%r663; setp.ne.u64 %r1955,%r186,%r280; @ %r1955 bra $L110; bra $L113; $L392: .loc 1 2976 34 st.f32 [%r684],%r1347; .loc 1 2975 27 mov.u64 %r282,1; .loc 1 2975 2 add.u64 %r684,%r684,%r686; $L328: .loc 1 2976 34 st.f32 [%r684],%r1347; .loc 1 2975 27 add.u64 %r282,%r282,1; .loc 1 2975 2 add.u64 %r684,%r684,%r686; $L327: .loc 1 2976 34 st.f32 [%r684],%r1347; .loc 1 2975 27 add.u64 %r282,%r282,1; .loc 1 2975 2 add.u64 %r684,%r684,%r686; setp.ne.u64 %r1958,%r186,%r282; @ %r1958 bra $L105; bra $L109; $L394: .loc 1 3002 5 ld.f32 %r1959,[%r692]; ld.f32 %r1960,[%r718]; fma.rn.f32 %r276,%r1959,%r1960,%r276; .loc 1 3001 32 mov.u64 %r277,1; .loc 1 3001 8 add.u64 %r692,%r692,%r694; add.u64 %r696,%r718,%r698; $L332: .loc 1 3002 5 ld.f32 %r1962,[%r692]; ld.f32 %r1963,[%r696]; fma.rn.f32 %r276,%r1962,%r1963,%r276; .loc 1 3001 32 add.u64 %r277,%r277,1; .loc 1 3001 8 add.u64 %r692,%r692,%r694; add.u64 %r696,%r696,%r698; $L331: .loc 1 3002 5 ld.f32 
%r1965,[%r692]; ld.f32 %r1966,[%r696]; fma.rn.f32 %r276,%r1965,%r1966,%r276; .loc 1 3001 32 add.u64 %r277,%r277,1; .loc 1 3001 8 add.u64 %r692,%r692,%r694; add.u64 %r696,%r696,%r698; setp.ne.u64 %r1967,%r428,%r277; @ %r1967 bra $L118; bra $L117; } matmul_r8.o/_gfortran_matmul_r8 .visible .func _gfortran_matmul_r8, .param .u32 %in_ar4, .param .u64 %in_ar5); .file 1 "../../../../libgfortran/generated/matmul_r8.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]FUNCTION DECL: _gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime___func__$0 .const .align 1 .u8 __func__$0[10] = {109,97,116,109,117,108,95,114,56,0 }2] = {782] = {672] = {8471,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,97,41,32,61,61,32,50,32,124,124,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,98,41,32,61,61,3246116,114,97,110,47,103,101,110,101,114,97,116,101,100,47,109,97,116,109,117,108,95,114,56,465,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,49,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3265,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,50,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3273,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,97,114,103,117,109,101,110,116,32,66,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,49,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }103,101,109,109,32,33,61,32,78,85,76,7_gfortran_matmul_r8 .visible .func _gfortran_matmul_r889f64 %r240; .reg .f64 %r241; .reg .f64 %r242; .reg .u64 %r243; .reg .f64 %r244; .reg .f64f64 %r250; .reg .f64 %r251; .reg .f64 %r252; .reg .f64 %r253; .reg .f64 %r254; .reg .f64 %r255; .reg .f64 %r256; .reg .f64 %r257; .reg .f64 %r258; .reg .f64 %r259; .reg .f64 %r260; .reg .f64 %r261; .reg .f64 %r262; .reg .f64 %r263; .reg .f64 %r264; .reg .u64 %r266; .reg .u64 %r267; .reg .u64 %r268; .reg .f64 %r269; .reg .u64u64 %r275; .reg .u64u64 %r280; .reg .u64 %r281; .reg .f64u64 %r312; .reg .u64u64 %r321; .reg .u64 %r326; .reg .u64 %r329; .reg .u64 %r331; .reg .u64u64u64 %r414; .reg .u32 %r416; .reg .u64 %r426; .reg .u64 %r427; .reg .u64u64 %r500; .reg .u64u64 %r511; .reg .u64 %r517; .reg .u64 %r521; .reg .u64u64u64u64 %r571; .reg .u64 %r572; .reg .u64 %r573; .reg .u64 %r575; .reg .u64 %r576; .reg .u64 %r577; .reg .u64 %r585; .reg .u64 %r586; .reg .u64 %r588; .reg .u64 %r589; .reg .u64 %r590; .reg .u64 %r591; .reg .u64u64 %r603; .reg .u64 %r604; .reg .u64 %r610; .reg .u64 %r612; .reg .u64 %r613; .reg .u64 %r617; .reg .u64 %r622; .reg .u64 %r623; .reg .u64 %r625; .reg .u64 %r626; .reg .u64 %r631; .reg .u64 %r633; .reg .u64 %r634; .reg .u64 %r642; .reg .u64 %r643; .reg .u64 %r645; .reg .u64 %r646; .reg .u64 %r648; .reg .u64 %r651; .reg .u64 %r653; .reg .u64 %r654; .reg .u64 %r659; .reg .u64 %r660; .reg .u64 %r662; .reg .u64 %r667; .reg .u64 %r669; .reg .u64 %r671; .reg .u64 %r675; .reg .u64 %r676; .reg .u64 %r680; .reg .u64 %r683; .reg .u64 %r685; .reg .u64 %r687; .reg .u64 %r691; .reg .u64 %r693; .reg .u64 %r695; .reg .u64 %r697; .reg .u64 %r700; .reg .u64 %r702; .reg .u64 %r704; .reg .u64 %r709; .reg .u64 %r711; .reg .u64 %r717; .reg 
.u64 %r718; .reg .u64 %r719; .reg .u64 %r720; .reg .u32 %r721; .reg .u32 %r722; .reg .u64 %r723; .reg .u16 %r724; .reg .pred %r725; .reg .u16 %r726; .reg .pred %r728; .reg .u64 %r729; .reg .u32 %r730; .reg .u64 %r731; .reg .u64 %r732; .reg .u64 %r733; .reg .pred %r734; .reg .pred %r736; .reg .u64 %r739; .reg .u64 %r741; .reg .u64 %r742; .reg .u64 %r743; .reg .u64 %r745; .reg .u64 %r746; .reg .u64 %r747; .reg .u64 %r748; .reg .u64 %r749; .reg .u64 %r752; .reg .u64 %r754; .reg .u64 %r755; .reg .u64 %r757; .reg .u64 %r759; .reg .u64 %r760; .reg .u64 %r762; .reg .u64 %r763; .reg .u64 %r764; .reg .u64 %r765; .reg .u64 %r766; .reg .u32 %r767; .reg .pred %r768; .reg .u64 %r769; .reg .u64 %r770; .reg .u64 %r771; .reg .pred %r773; .reg .u64 %r774; .reg .u64 %r775; .reg .u64 %r776; .reg .pred %r777; .reg .u64 %r778; .reg .u64 %r779; .reg .u64 %r780; .reg .u16 %r781; .reg .pred %r782; .reg .u64 %r783; .reg .u64 %r785; .reg .u64 %r786; .reg .u64 %r787; .reg .u16 %r788; .reg .pred %r790; .reg .pred %r791; .reg .u64 %r792; .reg .pred %r794; .reg .u64 %r795; .reg .u64 %r797; .reg .u64 %r798; .reg .u64 %r799; .reg .u64 %r800; .reg .u64 %r801; .reg .u64 %r802; .reg .pred %r803; .reg .u64 %r804; .reg .u64 %r806; .reg .u64 %r807; .reg .u64 %r808; .reg .u16 %r809; .reg .u32 %r810; .reg .pred %r811; .reg .u16 %r812; .reg .pred %r813; .reg .u64 %r814; .reg .u64 %r815; .reg .u64 %r816; .reg .u64 %r817; .reg .u64 %r818; .reg .u64 %r819; .reg .pred %r820; .reg .u32 %r822; .reg .u32 %r823; .reg .u32 %r825; .reg .u32 %r826; .reg .u16 %r827; .reg .u16 %r828; .reg .u16 %r829; .reg .u32 %r830; .reg .u16 %r831; .reg .pred %r832; .reg .u64 %r833; .reg .u16 %r835; .reg .pred %r836; .reg .u64 %r837; .reg .u64 %r838; .reg .u64 %r839; .reg .u32 %r840; .reg .u32 %r841; .reg .u32 %r842; .reg .u32 %r843; .reg .u32 %r844; .reg .u32 %r845; .reg .u32 %r847; .reg .u32 %r848; .reg .u16 %r849; .reg .u16 %r850; .reg .u16 %r851; .reg .u32 %r852; .reg .u16 %r853; .reg .pred %r854; .reg .u32 %r856; .reg .u32 %r857; .reg .u16 %r858; .reg .u16 %r859; .reg .u16 %r860; .reg .u32 %r861; .reg .u16 %r862; .reg .pred %r863; .reg .u32 %r865; .reg .u32 %r866; .reg .u32 %r868; .reg .u32 %r869; .reg .u16 %r870; .reg .u16 %r871; .reg .u16 %r872; .reg .u32 %r873; .reg .u16 %r874; .reg .pred %r875; .reg .f32 %r876; .reg .f32 %r877; .reg .f32 %r878; .reg .f32 %r879; .reg .f32 %r880; .reg .f32 %r881; .reg .f32 %r882; .reg .pred %r883; .reg .f64 %r884; .reg .f64 %r885; .reg .pred %r886; .reg .pred %r887; .reg .u32 %r889; .reg .u32 %r890; .reg .u32 %r892; .reg .u32 %r893; .reg .u16 %r894; .reg .u16 %r895; .reg .u16 %r896; .reg .u32 %r897; .reg .u16 %r898; .reg .pred %r899; .reg .u32 %r901; .reg .u32 %r902; .reg .u32 %r904; .reg .u32 %r905; .reg .u16 %r906; .reg .u16 %r907; .reg .u16 %r908; .reg .u32 %r909; .reg .u16 %r910; .reg .pred %r911; .reg .u32 %r913; .reg .u32 %r914; .reg .u32 %r916; .reg .u32 %r917; .reg .u16 %r918; .reg .u16 %r919; .reg .u16 %r920; .reg .u32 %r921; .reg .u16 %r922; .reg .pred %r923; .reg .pred %r924; .reg .u64 %r925; .reg .u32 %r926; .reg .u64 %r927; .reg .u64 %r928; .reg .u32 %r929; .reg .pred %r930; .reg .pred %r931; .reg .u32 %r932; .reg .pred %r933; .reg .pred %r934; .reg .u32 %r949; .reg .u64 %r950; .reg .u64 %r951; .reg .u64 %r952; .reg .u64 %r953; .reg .u64 %r954; .reg .u64 %r955; .reg .u64 %r956; .reg .u16 %r957; .reg .u16 %r958; .reg .u16 %r959; .reg .u32 %r960; .reg .u16 %r961; .reg .pred %r962; .reg .pred %r963; .reg .u16 %r964; .reg .pred %r965; .reg .u64 %r966; .reg .pred %r967; .reg .u32 %r971; .reg .u64 %r973; .reg 
.u32 %r978; .reg .u32 %r979; .reg .u32 %r981; .reg .u32 %r982; .reg .u16 %r983; .reg .u16 %r984; .reg .u16 %r985; .reg .u32 %r987; .reg .u32 %r988; .reg .u16 %r990; .reg .u16 %r992; .reg .u32 %r993; .reg .u16 %r994; .reg .pred %r995; .reg .pred %r996; .reg .u64 %r997; .reg .u64 %r999; .reg .u64 %r1000; .reg .u64 %r1001; .reg .pred %r1003; .reg .u64 %r1005; .reg .u64 %r1006; .reg .u64 %r1007; .reg .u64 %r1010; .reg .u64 %r1011; .reg .u64 %r1015; .reg .u64 %r1016; .reg .u64 %r1017; .reg .u64 %r1019; .reg .u64 %r1020; .reg .u64 %r1023; .reg .u64 %r1026; .reg .u64 %r1027; .reg .u64 %r1030; .reg .u64 %r1032; .reg .u64 %r1033; .reg .u64 %r1034; .reg .u64 %r1035; .reg .u64 %r1036; .reg .pred %r1038; .reg .u64 %r1044; .reg .u64 %r1045; .reg .u64 %r1046; .reg .u64 %r1047; .reg .pred %r1049; .reg .u64 %r1050; .reg .u64 %r1051; .reg .u64 %r1052; .reg .u64 %r1053; .reg .u64 %r1054; .reg .u64 %r1056; .reg .u64 %r1057; .reg .u64 %r1058; .reg .u64 %r1059; .reg .u64 %r1060; .reg .u64 %r1061; .reg .u64 %r1062; .reg .u64 %r1063; .reg .u64 %r1067; .reg .u64 %r1068; .reg .u64 %r1069; .reg .u64 %r1070; .reg .u64 %r1074; .reg .u64 %r1076; .reg .u64 %r1079; .reg .u64 %r1082; .reg .u64 %r1087; .reg .u64 %r1088; .reg .u64 %r1090; .reg .u64 %r1091; .reg .f64 %r1092; .reg .f64 %r1093; .reg .u64 %r1094; .reg .u64 %r1095; .reg .f64 %r1096; .reg .f64 %r1097; .reg .u64 %r1099; .reg .u64 %r1100; .reg .f64 %r1101; .reg .u64 %r1102; .reg .f64 %r1103; .reg .pred %r1104; .reg .u64 %r1107; .reg .u64 %r1108; .reg .u64 %r1109; .reg .u64 %r1110; .reg .u64 %r1112; .reg .pred %r1113; .reg .u64 %r1114; .reg .u64 %r1115; .reg .f64 %r1116; .reg .u64 %r1119; .reg .u64 %r1120; .reg .u64 %r1121; .reg .u64 %r1122; .reg .u64 %r1123; .reg .u64 %r1124; .reg .u64 %r1125; .reg .u64 %r1126; .reg .u64 %r1128; .reg .u64 %r1129; .reg .u64 %r1130; .reg .u64 %r1131; .reg .u64 %r1136; .reg .u64 %r1137; .reg .u64 %r1138; .reg .u64 %r1139; .reg .u64 %r1140; .reg .u64 %r1141; .reg .pred %r1143; .reg .pred %r1144; .reg .u64 %r1148; .reg .f64 %r1149; .reg .f64 %r1150; .reg .f64 %r1151; .reg .f64 %r1152; .reg .pred %r1154; .reg .pred %r1155; .reg .pred %r1156; .reg .u64 %r1157; .reg .u64 %r1158; .reg .u64 %r1160; .reg .u64 %r1161; .reg .u64 %r1162; .reg .u64 %r1163; .reg .u64 %r1171; .reg .u64 %r1172; .reg .u64 %r1173; .reg .u64 %r1175; .reg .f64 %r1176; .reg .f64 %r1177; .reg .u64 %r1178; .reg .f64 %r1179; .reg .u64 %r1180; .reg .f64 %r1181; .reg .pred %r1183; .reg .pred %r1184; .reg .u64 %r1187; .reg .f64 %r1188; .reg .f64 %r1189; .reg .pred %r1191; .reg .u16 %r1192; .reg .pred %r1193; .reg .u32 %r1195; .reg .u32 %r1196; .reg .u16 %r1197; .reg .u16 %r1198; .reg .u32 %r1200; .reg .u16 %r1201; .reg .pred %r1202; .reg .pred %r1203; .reg .u16 %r1204; .reg .pred %r1205; .reg .pred %r1206; .reg .pred %r1207; .reg .u64 %r1208; .reg .u64 %r1209; .reg .pred %r1211; .reg .u64 %r1212; .reg .u64 %r1213; .reg .u64 %r1215; .reg .f64 %r1216; .reg .f64 %r1217; .reg .pred %r1219; .reg .u64 %r1221; .reg .u64 %r1222; .reg .u64 %r1223; .reg .f64 %r1224; .reg .f64 %r1225; .reg .pred %r1227; .reg .pred %r1228; .reg .u64 %r1230; .reg .f64 %r1231; .reg .f64 %r1232; .reg .pred %r1234; .reg .pred %r1235; .reg .pred %r1236; .reg .pred %r1237; .reg .u64 %r1238; .reg .pred %r1241; .reg .u64 %r1243; .reg .f64 %r1244; .reg .f64 %r1245; .reg .f64 %r1246; .reg .pred %r1248; .reg .pred %r1250; .reg .u64 %r1252; .reg .u64 %r1253; .reg .pred %r1255; .reg .u64 %r1256; .reg .u64 %r1257; .reg .u64 %r1259; .reg .f64 %r1260; .reg .f64 %r1261; .reg .pred %r1263; .reg .pred %r1264; .reg .u64 
%r1265; .reg .pred %r1266; .reg .u64 %r1267; .reg .u64 %r1268; .reg .u64 %r1269; .reg .u16 %r1270; .reg .u32 %r1271; .reg .pred %r1272; .reg .u16 %r1273; .reg .u32 %r1274; .reg .pred %r1275; .reg .u64 %r1276; .reg .u64 %r1277; .reg .u64 %r1278; .reg .u64 %r1279; .reg .u64 %r1280; .reg .u64 %r1281; .reg .u64 %r1282; .reg .u32 %r1283; .reg .pred %r1284; .reg .pred %r1285; .reg .u32 %r1286; .reg .u64 %r1287; .reg .pred %r1292; .reg .pred %r1293; .reg .pred %r1294; .reg .pred %r1295; .reg .pred %r1298; .reg .pred %r1299; .reg .pred %r1300; .reg .pred %r1301; .reg .pred %r1302; .reg .u64 %r1304; .reg .u64 %r1311; .reg .u64 %r1312; .reg .u64 %r1313; .reg .u64 %r1314; .reg .u64 %r1319; .reg .u64 %r1320; .reg .u64 %r1321; .reg .u64 %r1322; .reg .u64 %r1324; .reg .u64 %r1325; .reg .u64 %r1326; .reg .u64 %r1327; .reg .u64 %r1328; .reg .pred %r1329; .reg .u64 %r1331; .reg .pred %r1333; .reg .u64 %r1336; .reg .u64 %r1337; .reg .u64 %r1338; .reg .pred %r1339; .reg .u64 %r1340; .reg .u64 %r1341; .reg .pred %r1345; .reg .f64 %r1346; .reg .pred %r1347; .reg .u64 %r1348; .reg .u64 %r1349; .reg .u64 %r1350; .reg .u64 %r1351; .reg .u64 %r1352; .reg .pred %r1353; .reg .pred %r1354; .reg .pred %r1355; .reg .u64 %r1358; .reg .u64 %r1359; .reg .u64 %r1360; .reg .u64 %r1361; .reg .u64 %r1362; .reg .pred %r1363; .reg .pred %r1364; .reg .pred %r1365; .reg .u64 %r1367; .reg .u64 %r1368; .reg .u64 %r1369; .reg .u64 %r1370; .reg .u64 %r1371; .reg .u64 %r1372; .reg .pred %r1373; .reg .pred %r1374; .reg .pred %r1375; .reg .u64 %r1381; .reg .u64 %r1382; .reg .u64 %r1383; .reg .u64 %r1385; .reg .pred %r1386; .reg .u64 %r1387; .reg .u64 %r1388; .reg .u64 %r1389; .reg .u64 %r1390; .reg .u64 %r1391; .reg .u64 %r1392; .reg .u64 %r1393; .reg .pred %r1394; .reg .pred %r1395; .reg .pred %r1396; .reg .u64 %r1399; .reg .u64 %r1400; .reg .u64 %r1401; .reg .u64 %r1402; .reg .u64 %r1403; .reg .pred %r1404; .reg .pred %r1405; .reg .pred %r1406; .reg .u64 %r1407; .reg .u64 %r1408; .reg .u64 %r1412; .reg .pred %r1413; .reg .pred %r1414; .reg .pred %r1415; .reg .u64 %r1416; .reg .u64 %r1417; .reg .u64 %r1418; .reg .u64 %r1419; .reg .u64 %r1420; .reg .u64 %r1421; .reg .pred %r1422; .reg .pred %r1423; .reg .pred %r1424; .reg .u64 %r1427; .reg .u64 %r1428; .reg .u64 %r1429; .reg .u64 %r1430; .reg .u64 %r1431; .reg .pred %r1432; .reg .pred %r1433; .reg .pred %r1434; .reg .u64 %r1439; .reg .pred %r1440; .reg .pred %r1441; .reg .pred %r1442; .reg .u64 %r1447; .reg .pred %r1448; .reg .pred %r1449; .reg .pred %r1450; .reg .u64 %r1455; .reg .pred %r1456; .reg .pred %r1457; .reg .pred %r1458; .reg .u64 %r1463; .reg .pred %r1464; .reg .pred %r1465; .reg .pred %r1466; .reg .u64 %r1469; .reg .u64 %r1470; .reg .u64 %r1472; .reg .u64 %r1473; .reg .u64 %r1476; .reg .u64 %r1477; .reg .f64 %r1478; .reg .f64 %r1479; .reg .u64 %r1480; .reg .u64 %r1481; .reg .f64 %r1482; .reg .f64 %r1483; .reg .u64 %r1484; .reg .u64 %r1485; .reg .u64 %r1487; .reg .u64 %r1488; .reg .f64 %r1489; .reg .f64 %r1490; .reg .u64 %r1491; .reg .u64 %r1492; .reg .f64 %r1493; .reg .f64 %r1494; .reg .u64 %r1495; .reg .u64 %r1496; .reg .u64 %r1498; .reg .u64 %r1499; .reg .f64 %r1500; .reg .f64 %r1501; .reg .u64 %r1502; .reg .u64 %r1503; .reg .f64 %r1504; .reg .f64 %r1505; .reg .pred %r1506; .reg .f64 %r1509; .reg .f64 %r1513; .reg .f64 %r1517; .reg .pred %r1518; .reg .f64 %r1519; .reg .f64 %r1520; .reg .f64 %r1521; .reg .f64 %r1522; .reg .f64 %r1523; .reg .f64 %r1524; .reg .f64 %r1525; .reg .f64 %r1526; .reg .f64 %r1527; .reg .f64 %r1528; .reg .f64 %r1529; .reg .f64 %r1530; .reg .f64 
%r1531; .reg .f64 %r1532; .reg .f64 %r1533; .reg .f64 %r1534; .reg .u64 %r1537; .reg .u64 %r1538; .reg .f64 %r1539; .reg .f64 %r1540; .reg .f64 %r1541; .reg .u64 %r1542; .reg .u64 %r1543; .reg .f64 %r1544; .reg .u64 %r1545; .reg .f64 %r1546; .reg .u64 %r1547; .reg .f64 %r1548; .reg .u64 %r1549; .reg .f64 %r1550; .reg .u64 %r1551; .reg .f64 %r1552; .reg .pred %r1553; .reg .f64 %r1554; .reg .f64 %r1555; .reg .f64 %r1556; .reg .f64 %r1557; .reg .f64 %r1563; .reg .f64 %r1564; .reg .f64 %r1565; .reg .f64 %r1566; .reg .f64 %r1567; .reg .f64 %r1568; .reg .f64 %r1569; .reg .f64 %r1570; .reg .f64 %r1571; .reg .f64 %r1578; .reg .f64 %r1579; .reg .f64 %r1580; .reg .f64 %r1581; .reg .f64 %r1582; .reg .f64 %r1583; .reg .f64 %r1584; .reg .f64 %r1585; .reg .f64 %r1586; .reg .f64 %r1593; .reg .f64 %r1594; .reg .f64 %r1595; .reg .f64 %r1596; .reg .f64 %r1597; .reg .pred %r1598; .reg .pred %r1599; .reg .f64 %r1600; .reg .f64 %r1601; .reg .f64 %r1602; .reg .f64 %r1603; .reg .u64 %r1606; .reg .f64 %r1607; .reg .u64 %r1608; .reg .f64 %r1609; .reg .f64 %r1610; .reg .f64 %r1611; .reg .f64 %r1612; .reg .u64 %r1613; .reg .f64 %r1614; .reg .f64 %r1615; .reg .u64 %r1616; .reg .f64 %r1617; .reg .f64 %r1618; .reg .u64 %r1620; .reg .u64 %r1622; .reg .f64 %r1623; .reg .u64 %r1624; .reg .f64 %r1625; .reg .f64 %r1626; .reg .f64 %r1627; .reg .f64 %r1628; .reg .u64 %r1629; .reg .f64 %r1630; .reg .f64 %r1631; .reg .u64 %r1632; .reg .f64 %r1633; .reg .f64 %r1634; .reg .u64 %r1636; .reg .u64 %r1638; .reg .f64 %r1639; .reg .u64 %r1640; .reg .f64 %r1641; .reg .f64 %r1642; .reg .u64 %r1643; .reg .f64 %r1644; .reg .u64 %r1645; .reg .f64 %r1646; .reg .pred %r1647; .reg .f64 %r1648; .reg .f64 %r1651; .reg .f64 %r1652; .reg .f64 %r1653; .reg .f64 %r1657; .reg .f64 %r1658; .reg .f64 %r1659; .reg .f64 %r1663; .reg .f64 %r1664; .reg .pred %r1665; .reg .pred %r1666; .reg .f64 %r1667; .reg .f64 %r1670; .reg .f64 %r1671; .reg .f64 %r1672; .reg .f64 %r1676; .reg .f64 %r1677; .reg .f64 %r1678; .reg .f64 %r1682; .reg .f64 %r1683; .reg .pred %r1684; .reg .pred %r1685; .reg .f64 %r1686; .reg .u64 %r1687; .reg .f64 %r1689; .reg .f64 %r1690; .reg .f64 %r1691; .reg .u64 %r1692; .reg .f64 %r1695; .reg .f64 %r1696; .reg .f64 %r1697; .reg .u64 %r1698; .reg .f64 %r1701; .reg .f64 %r1702; .reg .pred %r1703; .reg .pred %r1704; .reg .f64 %r1705; .reg .u64 %r1707; .reg .u64 %r1708; .reg .f64 %r1709; .reg .f64 %r1710; .reg .f64 %r1711; .reg .u64 %r1713; .reg .u64 %r1714; .reg .f64 %r1716; .reg .f64 %r1717; .reg .f64 %r1718; .reg .u64 %r1720; .reg .u64 %r1721; .reg .f64 %r1723; .reg .f64 %r1724; .reg .pred %r1725; .reg .u64 %r1727; .reg .u64 %r1729; .reg .u64 %r1732; .reg .pred %r1734; .reg .pred %r1735; .reg .pred %r1736; .reg .u64 %r1738; .reg .u64 %r1739; .reg .f64 %r1740; .reg .f64 %r1741; .reg .f64 %r1742; .reg .f64 %r1743; .reg .u64 %r1745; .reg .u64 %r1746; .reg .f64 %r1748; .reg .f64 %r1749; .reg .f64 %r1750; .reg .f64 %r1751; .reg .u64 %r1753; .reg .u64 %r1754; .reg .f64 %r1756; .reg .f64 %r1757; .reg .f64 %r1758; .reg .f64 %r1759; .reg .pred %r1760; .reg .pred %r1761; .reg .f64 %r1763; .reg .u64 %r1765; .reg .u64 %r1766; .reg .f64 %r1767; .reg .f64 %r1768; .reg .f64 %r1769; .reg .u64 %r1771; .reg .u64 %r1772; .reg .f64 %r1774; .reg .f64 %r1775; .reg .f64 %r1776; .reg .u64 %r1778; .reg .u64 %r1779; .reg .f64 %r1781; .reg .f64 %r1782; .reg .pred %r1783; .reg .f64 %r1784; .reg .f64 %r1785; .reg .f64 %r1787; .reg .f64 %r1788; .reg .f64 %r1790; .reg .f64 %r1791; .reg .pred %r1792; .reg .u64 %r1793; .reg .f64 %r1794; .reg .u64 %r1795; .reg .f64 %r1796; 
.reg .f64 %r1797; .reg .u64 %r1798; .reg .f64 %r1799; .reg .u64 %r1800; .reg .f64 %r1801; .reg .u64 %r1803; .reg .f64 %r1804; .reg .u64 %r1805; .reg .f64 %r1806; .reg .f64 %r1807; .reg .u64 %r1808; .reg .f64 %r1809; .reg .u64 %r1810; .reg .f64 %r1811; .reg .u64 %r1813; .reg .f64 %r1814; .reg .u64 %r1815; .reg .f64 %r1816; .reg .f64 %r1817; .reg .u64 %r1818; .reg .f64 %r1819; .reg .u64 %r1820; .reg .f64 %r1821; .reg .pred %r1822; .reg .f64 %r1823; .reg .f64 %r1824; .reg .f64 %r1825; .reg .f64 %r1826; .reg .f64 %r1827; .reg .f64 %r1829; .reg .f64 %r1830; .reg .f64 %r1831; .reg .f64 %r1832; .reg .f64 %r1833; .reg .f64 %r1835; .reg .f64 %r1836; .reg .f64 %r1837; .reg .f64 %r1838; .reg .f64 %r1839; .reg .pred %r1840; .reg .u64 %r1841; .reg .u64 %r1842; .reg .f64 %r1843; .reg .f64 %r1844; .reg .f64 %r1845; .reg .u64 %r1846; .reg .u64 %r1847; .reg .f64 %r1848; .reg .u64 %r1849; .reg .f64 %r1850; .reg .u64 %r1851; .reg .f64 %r1852; .reg .u64 %r1853; .reg .f64 %r1854; .reg .u64 %r1855; .reg .f64 %r1856; .reg .pred %r1857; .reg .f64 %r1858; .reg .f64 %r1860; .reg .f64 %r1862; .reg .pred %r1863; .reg .u64 %r1864; .reg .u64 %r1865; .reg .f64 %r1866; .reg .f64 %r1867; .reg .u64 %r1868; .reg .u64 %r1869; .reg .f64 %r1870; .reg .f64 %r1871; .reg .u64 %r1873; .reg .u64 %r1874; .reg .f64 %r1875; .reg .f64 %r1876; .reg .u64 %r1877; .reg .u64 %r1878; .reg .f64 %r1879; .reg .f64 %r1880; .reg .u64 %r1882; .reg .u64 %r1883; .reg .f64 %r1884; .reg .f64 %r1885; .reg .u64 %r1886; .reg .u64 %r1887; .reg .f64 %r1888; .reg .f64 %r1889; .reg .pred %r1890; .reg .u32 %r1891; .reg .u64 %r1892; .reg .u32 %r1894; .reg .u64 %r1895; .reg .u32 %r1897; .reg .u64 %r1898; .reg .pred %r1899; .reg .u32 %r1900; .reg .u64 %r1901; .reg .u64 %r1903; .reg .u32 %r1905; .reg .u64 %r1906; .reg .u64 %r1908; .reg .u32 %r1910; .reg .u64 %r1911; .reg .pred %r1912; .reg .f64 %r1913; .reg .f64 %r1914; .reg .f64 %r1916; .reg .f64 %r1917; .reg .f64 %r1919; .reg .f64 %r1920; .reg .pred %r1921; .reg .f64 %r1922; .reg .f64 %r1923; .reg .f64 %r1925; .reg .f64 %r1926; .reg .f64 %r1928; .reg .f64 %r1929; .reg .pred %r1930; .reg .f64 %r1931; .reg .f64 %r1932; .reg .f64 %r1934; .reg .f64 %r1935; .reg .f64 %r1937; .reg .f64 %r1938; .reg .pred %r1939; .reg .f64 %r1940; .reg .f64 %r1941; .reg .f64 %r1942; .reg .f64 %r1943; .reg .f64 %r1945; .reg .f64 %r1946; .reg .f64 %r1947; .reg .f64 %r1948; .reg .f64 %r1950; .reg .f64 %r1951; .reg .f64 %r1952; .reg .f64 %r1953; .reg .pred %r1954; .reg .pred %r1957; .reg .f64 %r1958; .reg .f64 %r1959; .reg .f64 %r1961; .reg .f64 %r1962; .reg .f64 %r1964; .reg .f64 %r1965; .reg .pred %r1966; mov.u64 %r718,%ar0; mov.u64 %r719,%ar1; mov.u64 %r720,%ar2; mov.u32 %r721,%ar3; mov.u32 %r722,%ar4; mov.u64 %r723,%ar5; .loc 1 2462 3 ld.s8 %r416,[%r719+28]; cvt.u16.u32 %r724,%r416; setp.eq.u16 %r725,%r724,2; @ %r725 bra $L2; ld.s8 %r1286,[%r720+28]; cvt.u16.u32 %r726,%r1286; setp.eq.u16 %r728,%r726,2; @ %r728 bra $L3; cvta.const.u64 %r732,$LC3; cvta.const.u64 %r731,__func__$0; mov.u32 %r730,2462; cvta.const.u64 %r72737323: .loc 1 2476 6 ld.u64 %r733,[%r718]; setp.ne.u64 %r734,%r733,0; @ %r734 bra $L4; .loc 1 2478 10 setp.ne.u16 %r736,%r724,1; @ %r736 bra $L5; .loc 1 2480 4 st.u64 [%r718+48],%r733; ld.u64 %r739,[%r720+80]; ld.u64 %r741,[%r720+72]; sub.u64 %r742,%r739,%r741; st.u64 [%r718+56],%r742; mov.u64 %r743,1; st.u64 [%r718+40],%r743; bra $L6; $L123: .loc 1 2485 4 st.u64 [%r718+48],%r1265; add.u64 %r745,%r478,-1; st.u64 [%r718+56],%r745; mov.u64 %r746,1; st.u64 [%r718+40],%r746; bra $L6; $L124: .loc 1 2490 4 mov.u64 %r747,0; 
st.u64 [%r718+48],%r747; add.u64 %r748,%r478,-1; st.u64 [%r718+56],%r748; mov.u64 %r749,1; st.u64 [%r718+40],%r749; .loc 1 2493 11 st.u64 [%r718+72],%r747; ld.u64 %r752,[%r720+80]; ld.u64 %r754,[%r720+72]; sub.u64 %r755,%r752,%r754; st.u64 [%r718+80],%r755; st.u64 [%r718+64],%r478; $L6: .loc 1 2499 1718; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r757,[%value_in]; } .loc 1 2499 4 mov.u64 %r759759; call (%value_in),_gfortrani_xmallocarray760,[%value_in]; } .loc 1 2499 2 st.u64 [%r718],%r760; .loc 1 2500 24 mov.u64 %r762,0; st.u64 [%r718+8],%r762; .loc 1 2557 7 ld.s8 %r416,[%r719+28]; .loc 1 2564 15 ld.u64 %r764,[%r719+56]; add.u64 %r763,%r764,1; .loc 1 2564 13 ld.u64 %r765,[%r719+48]; sub.u64 %r426,%r763,%r765; bra $L7; $L4: .loc 1 2502 12 cvta.global.u64 %r766,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r767,[%r766+36]; setp.eq.u32 %r768,%r767,0; @ %r768 bra $L8; .loc 1 2509 17 ld.u64 %r770,[%r718+56]; add.u64 %r769,%r770,1; .loc 1 2509 15 ld.u64 %r771,[%r718+48]; sub.u64 %r426,%r769,%r771; .loc 1 2506 10 setp.ne.u16 %r773,%r724,1; @ %r773 bra $L9; .loc 1 2508 17 ld.u64 %r775,[%r720+80]; add.u64 %r774,%r775,1; .loc 1 2508 15 ld.u64 %r776,[%r720+72]; sub.u64 %r208,%r774,%r776; .loc 1 2510 7 setp.ne.u64 %r777,%r208,%r426; @ %r777 bra $L10; .loc 1 2564 15 ld.u64 %r779,[%r719+56]; add.u64 %r778,%r779,1; .loc 1 2564 13 ld.u64 %r780,[%r719+48]; sub.u64 %r426,%r778,%r780; .loc 1 2543 7 ld.s8 %r301,[%r718+28]; .loc 1 2548 27 ld.u64 %r414,[%r718+40]; .loc 1 2543 6 cvt.u16.u32 %r781,%r301; setp.eq.u16 %r782,%r781,1; @ %r782 bra $L11; .loc 1 2553 16 ld.u64 %r474,[%r718+64]; .loc 1 2560 16 ld.u64 %r473,[%r719+40]; .loc 1 2563 14 mov.u64 %r186,1; .loc 1 2561 16 mov.u64 %r178,%r186; bra $L12; $L10: .loc 1 2511 6 st.u64 [%stack+8],%r208; st.u64 [%stack],%r426; cvta.const.u64 %r783783_gfortran_runtime_error2485 4 ld.u64 %r786,[%r719+56]; add.u64 %r785,%r786,1; ld.u64 %r787,[%r719+48]; sub.u64 %r482,%r785,%r787; .loc 1 2515 15 cvt.u16.u32 %r788,%r1286; setp.ne.u16 %r790,%r788,1; @ %r790 bra $L13; .loc 1 2519 7 setp.eq.u64 %r791,%r482,%r426; @ %r791 bra $L14; .loc 1 2520 6 st.u64 [%stack+8],%r482; st.u64 [%stack],%r426; cvta.const.u64 %r7922_gfortran_runtime_error13: .loc 1 2528 7 setp.eq.u64 %r794,%r482,%r426; @ %r794 bra $L15; .loc 1 2529 6 st.u64 [%stack+8],%r482; st.u64 [%stack],%r426; cvta.const.u64 %r7955_gfortran_runtime_error15: .loc 1 2533 17 ld.u64 %r798,[%r720+80]; add.u64 %r797,%r798,1; .loc 1 2533 15 ld.u64 %r799,[%r720+72]; sub.u64 %r206,%r797,%r799; .loc 1 2534 17 ld.u64 %r801,[%r718+80]; add.u64 %r800,%r801,1; .loc 1 2534 15 ld.u64 %r802,[%r718+72]; sub.u64 %r207,%r800,%r802; .loc 1 2535 7 setp.eq.u64 %r803,%r206,%r207; @ %r803 bra $L14; .loc 1 2536 6 st.u64 [%stack+8],%r206; st.u64 [%stack],%r207; cvta.const.u64 %r804,$LCstack; call _gfortran_runtime_error2564 15 ld.u64 %r807,[%r719+56]; add.u64 %r806,%r807,1; .loc 1 2564 13 ld.u64 %r808,[%r719+48]; sub.u64 %r426,%r806,%r808; $L7: .loc 1 2548 27 ld.u64 %r474,[%r718+40]; .loc 1 2543 6 ld.s8 %r810,[%r718+28]; cvt.u16.u32 %r809,%r810; setp.eq.u16 %r811,%r809,1; @ %r811 bra $L130; $L127: .loc 1 2553 16 mov.u64 %r414,%r474; ld.u64 %r474,[%r718+64]; bra $L16; $L130: .loc 1 2548 27 mov.u64 %r414,%r474; $L16: .loc 1 2560 16 ld.u64 %r473,[%r719+40]; .loc 1 2557 6 cvt.u16.u32 %r812,%r416; setp.eq.u16 %r813,%r812,1; @ %r813 bra $L131; $L125: .loc 1 2569 16 ld.u64 %r178,[%r719+64]; .loc 1 2571 15 ld.u64 %r815,[%r719+80]; add.u64 %r814,%r815,1; .loc 1 2571 13 ld.u64 %r816,[%r719+72]; mov.u64 %r186,%r426; sub.u64 
%r426,%r814,%r816; bra $L12; $L131: .loc 1 2563 14 mov.u64 %r186,1; .loc 1 2561 16 mov.u64 %r178,%r186; $L12: .loc 1 2575 16 ld.u64 %r818,[%r720+56]; add.u64 %r817,%r818,1; ld.u64 %r819,[%r720+48]; sub.u64 %r64,%r817,%r819; .loc 1 2575 6 setp.eq.u64 %r820,%r64,%r426; @ %r820 bra $L17; .loc 1 2577 10 set.u32.gt.s64 %r822,%r426,0; neg.s32 %r823,%r822; .loc 1 2577 21 set.u32.gt.s64 %r825,%r64,0; neg.s32 %r826,%r825; cvt.u16.u32 %r828,%r823; cvt.u16.u32 %r829,%r826; or.b16 %r827,%r828,%r829; cvt.u32.u16 %r830,%r827; cvt.u16.u8 %r831,%r830; setp.eq.u16 %r832,%r831,0; @ %r832 bra $L17; .loc 1 2578 2 st.u64 [%stack+8],%r426; st.u64 [%stack],%r64;_gfortran_runtime_error17: .loc 1 2583 7 ld.s8 %r66,[%r720+28]; .loc 1 2586 16 ld.u64 %r472,[%r720+40]; .loc 1 2583 6 cvt.u16.u32 %r835,%r66; setp.eq.u16 %r836,%r835,1; @ %r836 bra $L132; .loc 1 2597 16 ld.u64 %r179,[%r720+64]; .loc 1 2598 16 ld.u64 %r838,[%r720+80]; add.u64 %r837,%r838,1; .loc 1 2598 14 ld.u64 %r839,[%r720+72]; sub.u64 %r187,%r837,%r839; bra $L18; $L132: .loc 1 2592 14 mov.u64 %r187,1; .loc 1 2591 16 mov.u64 %r179,256; $L18: .loc 1 2601 9 ld.u64 %r213,[%r719]; .loc 1 2602 9 ld.u64 %r214,[%r720]; .loc 1 2603 8 ld.u64 %r215,[%r718]; .loc 1 2612 28 set.u32.eq.u64 %r841,%r414,1; neg.s32 %r842,%r841; cvt.u32.u32 %r840,%r842; cvt.u32.u8 %r71,%r840; .loc 1 2612 46 set.u32.eq.u64 %r844,%r473,1; neg.s32 %r845,%r844; cvt.u32.u32 %r843,%r845; cvt.u32.u8 %r471,%r843; .loc 1 2612 7 set.u32.ne.u32 %r847,%r721,0; neg.s32 %r848,%r847; .loc 1 2612 16 cvt.u16.u32 %r850,%r848; cvt.u16.u32 %r851,%r71; and.b16 %r849,%r850,%r851; .loc 1 2612 6 cvt.u32.u16 %r852,%r849; cvt.u16.u8 %r853,%r852; setp.eq.u16 %r854,%r853,0; @ %r854 bra $L19; .loc 1 2612 63 set.u32.eq.u64 %r856,%r178,1; neg.s32 %r857,%r856; .loc 1 2612 51 cvt.u16.u32 %r859,%r857; cvt.u16.u32 %r860,%r471; or.b16 %r858,%r859,%r860; .loc 1 2612 33 cvt.u32.u16 %r861,%r858; cvt.u16.u8 %r862,%r861; setp.eq.u16 %r863,%r862,0; @ %r863 bra $L20; .loc 1 2613 20 set.u32.eq.u64 %r865,%r472,1; neg.s32 %r866,%r865; .loc 1 2613 37 set.u32.eq.u64 %r868,%r179,1; neg.s32 %r869,%r868; .loc 1 2613 25 cvt.u16.u32 %r871,%r866; cvt.u16.u32 %r872,%r869; or.b16 %r870,%r871,%r872; .loc 1 2613 7 cvt.u32.u16 %r873,%r870; cvt.u16.u8 %r874,%r873; setp.eq.u16 %r875,%r874,0; @ %r875 bra $L20; .loc 1 2615 13 cvt.rn.f32.s32 %r83,%r722; .loc 1 2614 12 cvt.rn.f32.s64 %r876,%r186; .loc 1 2614 50 cvt.rn.f32.s64 %r877,%r426; .loc 1 2614 47 mul.f32 %r878,%r876,%r877; .loc 1 2614 31 cvt.rn.f32.s64 %r879,%r187; .loc 1 2614 47 mul.f32 %r880,%r878,%r879; .loc 1 2615 13 mul.f32 %r881,%r83,%r83; mul.f32 %r882,%r881,%r83; .loc 1 2614 7 setp.gt.f32 %r883,%r880,%r882; @ ! 
%r883 bra $L19; .loc 1 2617 17 cvt.u32.u64 %r85,%r186; st.u32 [%frame+36],%r85; .loc 1 2617 29 cvt.u32.u64 %r86,%r187; st.u32 [%frame+32],%r86; .loc 1 2617 41 cvt.u32.u64 %r87,%r426; st.u32 [%frame+28],%r87; .loc 1 2617 52 cvt.u32.u64 %r88,%r474; st.u32 [%frame+24],%r88; .loc 1 2618 24 mov.f64 %r884,0d3ff0000000000000; st.f64 [%frame+8],%r884; .loc 1 2618 33 mov.f64 %r885,0d0000000000000000; st.f64 [%frame],%r885; .loc 1 2619 17 setp.ne.u64 %r886,%r473,1; @ %r886 bra $L22; cvt.u32.u64 %r200,%r178; bra $L23; $L22: cvt.u32.u64 %r200,%r473; $L23: st.u32 [%frame+20],%r200; .loc 1 2620 3 setp.ne.u64 %r887,%r472,1; @ %r887 bra $L24; cvt.u32.u64 %r201,%r179; bra $L25; $L24: cvt.u32.u64 %r201,%r472; $L25: st.u32 [%frame+16],%r201; .loc 1 2622 15 set.u32.gt.s32 %r889,%r200,0; neg.s32 %r890,%r889; .loc 1 2622 26 set.u32.gt.s32 %r892,%r201,0; neg.s32 %r893,%r892; .loc 1 2622 19 cvt.u16.u32 %r895,%r890; cvt.u16.u32 %r896,%r893; and.b16 %r894,%r895,%r896; .loc 1 2622 10 cvt.u32.u16 %r897,%r894; cvt.u16.u8 %r898,%r897; setp.eq.u16 %r899,%r898,0; @ %r899 bra $L19; .loc 1 2622 37 set.u32.gt.s32 %r901,%r88,0; neg.s32 %r902,%r901; .loc 1 2622 46 set.u32.gt.s32 %r904,%r85,1; neg.s32 %r905,%r904; .loc 1 2622 41 cvt.u16.u32 %r907,%r902; cvt.u16.u32 %r908,%r905; and.b16 %r906,%r907,%r908; cvt.u32.u16 %r909,%r906; cvt.u16.u8 %r910,%r909; setp.eq.u16 %r911,%r910,0; @ %r911 bra $L19; .loc 1 2622 55 set.u32.gt.s32 %r913,%r86,1; neg.s32 %r914,%r913; .loc 1 2622 64 set.u32.gt.s32 %r916,%r87,1; neg.s32 %r917,%r916; .loc 1 2622 59 cvt.u16.u32 %r919,%r914; cvt.u16.u32 %r920,%r917; and.b16 %r918,%r919,%r920; cvt.u32.u16 %r921,%r918; cvt.u16.u8 %r922,%r921; setp.eq.u16 %r923,%r922,0; @ %r923 bra $L19; .loc 1 2624 4 setp.ne.u64 %r924,%r723,0; @ %r924 bra $L27; cvta.const.u64 %r928,$LC8; cvta.const.u64 %r927,__func__$0; mov.u32 %r926,2624; cvta.const.u64 %r925992827: .loc 1 2626 17 and.b32 %r929,%r721,2; .loc 1 2626 7 setp.ne.u32 %r930,%r929,0; @ %r930 bra $L133; .loc 1 2629 35 setp.eq.u64 %r931,%r473,1; @ %r931 bra $L134; cvta.const.u64 %r188,$LC2; bra $L28; $L133: .loc 1 2627 13 cvta.const.u64 %r188,$LC1; bra $L28; $L134: .loc 1 2629 35 cvta.const.u64 %r188,$LC0; $L28: .loc 1 2631 17 and.b32 %r932,%r721,4; .loc 1 2631 7 setp.ne.u32 %r933,%r932,0; @ %r933 bra $L135; .loc 1 2634 35 setp.eq.u64 %r934,%r472,1; @ %r934 bra $L136; cvta.const.u64 %r189,$LC2; bra $L29; $L135: .loc 1 2632 13 cvta.const.u64 %r189,$LC1; bra $L29; $L136: .loc 1 2634 35 cvta.const.u64 %r189,$LC0; $L29: .loc 1 2636 4 add.u64 %r950,%frame,24; add.u64 %r951,%frame,16; add.u64 %r952,%frame,20; add.u64 %r953,%frame,8; add.u64 %r954,%frame,28; add.u64 %r955,%frame,32; add.u64 %r956,%frame,36; mov.u32 %r949,1;,.param .u64 %in_ar3,.param .u64 %in_ar4,.param .u64 %in_ar5,.param .u64 %in_ar6,.param .u64 %in_ar7,.param .u64 %in_ar8,.param .u64 %in_ar9,.param .u64 %in_ar10,.param .u64 %in_ar11,.param .u64 %in_ar12,.param .u32 %in_ar13,.param .u32 %in_ar1418r189956955954213; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r952; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r214; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r951; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%frame; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r215; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r950; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r949; .param .u32 %out_arg15; st.param.u32 [%out_arg15],%r949; call %r723,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14,%out_arg15),$LCT0; } bra $L1; $L19: .loc 1 2643 21 
cvt.u16.u32 %r958,%r71; cvt.u16.u32 %r959,%r471; and.b16 %r957,%r958,%r959; .loc 1 2643 6 cvt.u32.u16 %r960,%r957; cvt.u16.u8 %r961,%r960; setp.eq.u16 %r962,%r961,0; @ %r962 bra $L31; .loc 1 2643 38 setp.ne.u64 %r963,%r472,1; @ %r963 bra $L20; .loc 1 2644 7 cvt.u16.u32 %r964,%r66; setp.eq.u16 %r965,%r964,1; @ %r965 bra $L32; .loc 1 2677 16 add.u64 %r966,%r474,1; .loc 1 2678 9 shl.b64 %r104,%r966,3; sub.u64 %r217,%r215,%r104; .loc 1 2687 7 setp.le.s64 %r967,%r187,0; @ %r967 bra $L33; add.u64 %r591,%r104,-8; mov.u64 %r590,%r215; add.u64 %r594,%r187,1; .loc 1 2689 22 shl.b64 %r411,%r186,3; .loc 1 2687 13 mov.u64 %r266,%r472; setp.gt.s64 %r1302,%r186,0; and.b64 %r1412,%r187,3; setp.eq.u64 %r1415,%r1412,0; @ %r1415 bra $L34; setp.eq.u64 %r1414,%r1412,1; @ %r1414 bra $L309; setp.eq.u64 %r1413,%r1412,2; @ %r1413 bra $L310; bra $L380; $L35: .loc 1 2689 22 mov.u32 %r971$L36: .loc 1 2687 24 add.u64 %r1416,%r266,1; .loc 1 2687 7 add.u64 %r1469,%r590,%r591; .loc 1 2688 2 @ %r1302 bra $L247; bra $L381; $L34: @ %r1302 bra $L35; bra $L36; $L33: .loc 1 2692 28 set.u32.eq.u64 %r978,%r426,0; neg.s32 %r979,%r978; .loc 1 2692 13 set.u32.eq.u64 %r981,%r186,0; neg.s32 %r982,%r981; .loc 1 2692 28 cvt.u16.u32 %r984,%r979; cvt.u16.u32 %r985,%r982; or.b16 %r983,%r984,%r985; .loc 1 2692 23 set.u32.eq.u64 %r987,%r187,0; neg.s32 %r988,%r987; .loc 1 2692 28 cvt.u16.u32 %r992,%r988; or.b16 %r990,%r983,%r992; cvt.u32.u16 %r993,%r990; cvt.u16.u8 %r994,%r993; setp.eq.u16 %r995,%r994,0; @ ! %r995 bra $L1; .loc 1 2697 10 setp.eq.u64 %r996,%r178,1; selp.u64 %r198,%r474,%r178,%r996; .loc 1 2702 21 shl.b64 %r997,%r198,8; .loc 1 2702 14 add.u64 %r222,%r997,%r179; .loc 1 2706 12 min.s64 %r999,%r222,65536; shl.b64 %r1000,%r9991001,[%value_in]; } .loc 1 2710 7 setp.gt.s64 %r1003,%r187,0; @ %r1003 bra $L39; $L42: .loc 1 2910012918 7 bra $L1; $L39: .loc 1 2680 16 add.u64 %r1005,%r178,1; .loc 1 2681 9 shl.b64 %r107,%r1005,3; sub.u64 %r219,%r213,%r107; shl.b64 %r564,%r179,9; shl.b64 %r566,%r179,12; shl.b64 %r575,%r179,4; .loc 1 2683 16 add.u64 %r1006,%r179,1; .loc 1 2684 9 shl.b64 %r1007,%r1006,3; sub.u64 %r567,%r214,%r1007; add.u64 %r565,%r575,%r567; shl.b64 %r570,%r474,9; add.u64 %r571,%r474,%r474; shl.b64 %r572,%r474,2; neg.s64 %r1010,%r179; shl.b64 %r1011,%r1010,12; shl.b64 %r573,%r1010,3; shl.b64 %r577,%r474,12; shl.b64 %r1015,%r474,4; add.u64 %r1016,%r1015,8; add.u64 %r576,%r217,%r1016; add.u64 %r1017,%r187,-1; and.b64 %r585,%r1017,-512; add.u64 %r586,%r567,%r573; shl.b64 %r543,%r178,8; shl.b64 %r545,%r178,11; add.u64 %r546,%r107,-8; neg.s64 %r1019,%r178; shl.b64 %r1020,%r1019,11; shl.b64 %r1023,%r1019,4; shl.b64 %r507,%r178,4; add.u64 %r508,%r178,%r178; shl.b64 %r196,%r179,5; shl.b64 %r191,%r474,5; neg.s64 %r1026,%r474; shl.b64 %r1027,%r1026,3; shl.b64 %r536,%r474,3; shl.b64 %r1030,%r1010,5; shl.b64 %r589,%r179,3; mov.u64 %r568,%r571; mov.u64 %r562,%r179; mov.u64 %r560,0; setp.gt.s64 %r1295,%r426,0; add.u64 %r1338,%r586,8; setp.gt.s64 %r1339,%r186,0; add.u64 %r1340,%r186,1; add.u64 %r1341,%r219,16; bra $L41; $L138: mov.u64 %r560,%r561; $L41: add.u64 %r588,%r560,1; .loc 1 2715 9 sub.u64 %r224,%r187,%r560; min.s64 %r1470,%r224,512; .loc 1 2716 24 shr.s64 %r1032,%r1470,63; shr.u64 %r1033,%r1032,62; add.u64 %r1034,%r1470,%r1033; and.b64 %r1035,%r1034,3; sub.u64 %r1036,%r1035,%r1033; .loc 1 2716 10 sub.u64 %r225,%r1470,%r1036; .loc 1 2718 4 @ %r1295 bra $L40; $L45: .loc 1 2710 7 add.u64 %r561,%r560,512; add.u64 %r562,%r562,%r564; add.u64 %r565,%r565,%r566; add.u64 %r568,%r568,%r570; add.u64 %r572,%r572,%r570; add.u64 
%r573,%r573,%r1011; add.u64 %r575,%r575,%r566; add.u64 %r576,%r576,%r577; setp.ne.u64 %r1038,%r560,%r585; @ %r1038 bra $L138; bra $L42; $L40: add.u64 %r540,%r1338,%r575; .loc 1 2768 13 add.u64 %r450,%r225,%r588; add.u64 %r447,%r575,%r589; add.u64 %r445,%r447,%r589; add.u64 %r558,%r565,%r589; add.u64 %r559,%r558,%r589; mul.lo.u64 %r376,%r450,%r474; mul.lo.u64 %r369,%r450,%r179; mad.lo.u64 %r555,%r450,%r589,%r567; mov.u64 %r547,%r1023; mov.u64 %r544,%r546; mov.u64 %r541,%r178; .loc 1 2718 12 mov.u64 %r228,%r472; add.u64 %r1287,%r426,1; .loc 1 2768 8 add.u64 %r1331,%r450,-1; setp.ge.s64 %r1333,%r1331,%r588; sub.u64 %r1336,%r568,%r474; add.u64 %r1337,%r1336,%r571; $L44: .loc 1 2723 13 sub.u64 %r226,%r1287,%r228; min.s64 %r1472,%r226,256; .loc 1 2724 28 shr.u64 %r1044,%r1472,63; add.u64 %r1045,%r1472,%r1044; and.b64 %r1046,%r1045,1; sub.u64 %r1047,%r1046,%r1044; .loc 1 2724 14 sub.u64 %r227,%r1472,%r1047; .loc 1 2727 8 @ %r1339 bra $L43; $L73: .loc 1 2718 30 add.u64 %r228,%r228,256; .loc 1 2718 4 add.u64 %r540,%r540,2048; add.u64 %r541,%r541,%r543; add.u64 %r544,%r544,%r545; add.u64 %r547,%r547,%r1020; setp.ge.s64 %r1049,%r426,%r228; @ %r1049 bra $L44; bra $L45; $L43: .loc 1 2763 26 add.u64 %r1050,%r228,-1; add.u64 %r1051,%r1050,%r1472; .loc 1 2763 31 mul.lo.u64 %r284,%r1051,%r178; .loc 1 2734 13 add.u64 %r449,%r227,%r228; .loc 1 2734 8 add.u64 %r451,%r449,-1; add.u64 %r1052,%r284,1; shl.b64 %r1053,%r1052,3; add.u64 %r526,%r219,%r1053; add.u64 %r537,%r541,%r178; sub.u64 %r1054,%r537,%r284; shl.b64 %r539,%r1054,3; shl.b64 %r1056,%r1472,3; add.u64 %r1057,%r1056,-8; add.u64 %r300,%r1001,%r1057; add.u64 %r1058,%r228,%r562; add.u64 %r1059,%r1058,%r1472; shl.b64 %r1060,%r1059,3; add.u64 %r195,%r1060,%r567; shl.b64 %r557,%r228,3; add.u64 %r48,%r557,%r565; add.u64 %r46,%r557,%r558; add.u64 %r30,%r557,%r559; add.u64 %r556,%r555,%r557; add.u64 %r1061,%r369,%r228; add.u64 %r1062,%r1061,%r1472; shl.b64 %r1063,%r1062,3; add.u64 %r367,%r1063,%r567; .loc 1 2768 8 mov.u64 %r525,%r576; .loc 1 2727 16 mov.u64 %r243,%r472; setp.le.s64 %r1285,%r228,%r451; setp.gt.s64 %r1299,%r1472,%r227; setp.gt.s64 %r1298,%r1470,%r225; not.b64 %r1324,%r228; add.u64 %r1325,%r1324,%r449; and.b64 %r1326,%r1325,-2; .loc 1 2904 18 add.u64 %r1327,%r1472,%r228; .loc 1 2904 13 add.u64 %r1328,%r1327,-1; setp.gt.s64 %r1329,%r228,%r1328; $L72: .loc 1 2732 10 sub.u64 %r229,%r1340,%r243; min.s64 %r1473,%r229,256; .loc 1 2733 25 shr.s64 %r1304,%r1473,63; shr.u64 %r1067,%r1473,63; add.u64 %r1068,%r1473,%r1067; and.b64 %r1069,%r1068,1; sub.u64 %r1070,%r1069,%r1067; .loc 1 2733 11 sub.u64 %r230,%r1473,%r1070; .loc 1 2735 5 @ %r1285 bra $L46; $L55: .loc 1 2757 8 @ ! 
%r1299 bra $L48; bra $L47; $L46: .loc 1 2737 17 add.u64 %r454,%r230,%r243; .loc 1 2737 12 add.u64 %r455,%r454,-1; shl.b64 %r1074,%r1473,11; add.u64 %r1076,%r1074,-2048; add.u64 %r500,%r1001,%r1076; add.u64 %r506,%r526,%r539; add.u64 %r1079,%r1473,%r243; shl.b64 %r517,%r1079,3; shl.b64 %r521,%r1473,3; mov.u64 %r511,%r537; mov.u64 %r510,%r547; mov.u64 %r509,%r544; mov.u64 %r498,0; setp.le.s64 %r1292,%r243,%r455; setp.gt.s64 %r1300,%r1473,%r230; not.b64 %r1319,%r243; add.u64 %r1320,%r1319,%r454; and.b64 %r1321,%r1320,-2; add.u64 %r1322,%r1321,%r243; bra $L54; $L139: mov.u64 %r498,%r499; $L54: .loc 1 2738 9 @ %r1292 bra $L49; $L53: .loc 1 2749 12 @ %r1300 bra $L50; bra $L51; $L49: shl.b64 %r1082,%r498,3; add.u64 %r199,%r1001,%r1082; add.u64 %r1087,%r1322,%r511; shl.b64 %r1088,%r1087,3; add.u64 %r497,%r1088,%r1341; .loc 1 2738 9 mov.u64 %r181,%r506; sub.u64 %r1400,%r497,%r506; add.u64 %r1401,%r1400,-16; shr.u64 %r1399,%r1401,4; add.u64 %r1402,%r1399,1; and.b64 %r1403,%r1402,3; setp.eq.u64 %r1406,%r1403,0; @ %r1406 bra $L52; setp.eq.u64 %r1405,%r1403,1; @ %r1405 bra $L311; setp.eq.u64 %r1404,%r1403,2; @ %r1404 bra $L312; bra $L382; $L52: .loc 1 2741 7 add.u64 %r1090,%r181,%r509; add.u64 %r1091,%r1090,%r510; .loc 1 2740 49 ld.f64 %r1092,[%r1091]; st.f64 [%r199],%r1092; .loc 1 2742 49 ld.f64 %r1093,[%r181]; st.f64 [%r199+8],%r1093; .loc 1 2745 7 add.u64 %r1094,%r181,%r510; add.u64 %r1095,%r1094,%r509; .loc 1 2744 49 ld.f64 %r1096,[%r1095+8]; st.f64 [%r199+2048],%r1096; .loc 1 2746 49 ld.f64 %r1097,[%r181+8]; st.f64 [%r199+2056],%r1097; .loc 1 2738 9 add.u64 %r1407,%r199,4096; add.u64 %r1408,%r181,16; .loc 1 2741 7 add.u64 %r1476,%r1408,%r509; add.u64 %r1477,%r1476,%r510; .loc 1 2740 49 ld.f64 %r1478,[%r1477]; st.f64 [%r1407],%r1478; .loc 1 2742 49 ld.f64 %r1479,[%r1408]; st.f64 [%r1407+8],%r1479; .loc 1 2745 7 add.u64 %r1480,%r1408,%r510; add.u64 %r1481,%r1480,%r509; .loc 1 2744 49 ld.f64 %r1482,[%r1481+8]; st.f64 [%r1407+2048],%r1482; .loc 1 2746 49 ld.f64 %r1483,[%r1408+8]; st.f64 [%r1407+2056],%r1483; .loc 1 2738 9 add.u64 %r1484,%r199,8192; add.u64 %r1485,%r181,32; .loc 1 2741 7 add.u64 %r1487,%r1485,%r509; add.u64 %r1488,%r1487,%r510; .loc 1 2740 49 ld.f64 %r1489,[%r1488]; st.f64 [%r1484],%r1489; .loc 1 2742 49 ld.f64 %r1490,[%r1485]; st.f64 [%r1484+8],%r1490; .loc 1 2745 7 add.u64 %r1491,%r1485,%r510; add.u64 %r1492,%r1491,%r509; .loc 1 2744 49 ld.f64 %r1493,[%r1492+8]; st.f64 [%r1484+2048],%r1493; .loc 1 2746 49 ld.f64 %r1494,[%r1485+8]; st.f64 [%r1484+2056],%r1494; .loc 1 2738 9 add.u64 %r1495,%r199,12288; add.u64 %r1496,%r181,48; .loc 1 2741 7 add.u64 %r1498,%r1496,%r509; add.u64 %r1499,%r1498,%r510; .loc 1 2740 49 ld.f64 %r1500,[%r1499]; st.f64 [%r1495],%r1500; .loc 1 2742 49 ld.f64 %r1501,[%r1496]; st.f64 [%r1495+8],%r1501; .loc 1 2745 7 add.u64 %r1502,%r1496,%r510; add.u64 %r1503,%r1502,%r509; .loc 1 2744 49 ld.f64 %r1504,[%r1503+8]; st.f64 [%r1495+2048],%r1504; .loc 1 2746 49 ld.f64 %r1505,[%r1496+8]; st.f64 [%r1495+2056],%r1505; .loc 1 2738 9 add.u64 %r199,%r199,16384; add.u64 %r181,%r181,64; setp.ne.u64 %r1506,%r181,%r497; @ %r1506 bra $L52; bra $L53; $L50: .loc 1 2752 10 add.u64 %r1099,%r509,%r517; add.u64 %r1100,%r219,%r1099; .loc 1 2751 41 ld.f64 %r1101,[%r1100+-8]; st.f64 [%r500],%r1101; .loc 1 2754 10 add.u64 %r1102,%r506,%r521; .loc 1 2753 41 ld.f64 %r1103,[%r1102+-8]; st.f64 [%r500+8],%r1103; $L51: .loc 1 2735 5 add.u64 %r499,%r498,2; add.u64 %r500,%r500,16; add.u64 %r506,%r506,%r507; add.u64 %r509,%r509,%r507; add.u64 %r510,%r510,%r1023; add.u64 %r511,%r511,%r508; 
setp.ne.u64 %r1104,%r498,%r1326; @ %r1104 bra $L139; bra $L55; $L48: .loc 1 2767 25 shr.u64 %r1107,%r1304,62; add.u64 %r1108,%r1473,%r1107; and.b64 %r1109,%r1108,3; sub.u64 %r1110,%r1109,%r1107; .loc 1 2767 11 sub.u64 %r233,%r1473,%r1110; .loc 1 2769 5 @ %r1333 bra $L56; bra $L57; $L47: .loc 1 2759 17 add.u64 %r132,%r1473,%r243; .loc 1 2759 12 add.u64 %r1112,%r132,-1; .loc 1 2760 9 setp.lt.s64 %r1113,%r1112,%r243; @ %r1113 bra $L48; add.u64 %r1114,%r132,%r284; shl.b64 %r1115,%r1114,3; add.u64 %r136,%r219,%r1115; mov.u64 %r73,%r300; mov.u64 %r57,%r526; sub.u64 %r1390,%r136,%r526; add.u64 %r1391,%r1390,-8; shr.u64 %r1389,%r1391,3; add.u64 %r1392,%r1389,1; and.b64 %r1393,%r1392,3; setp.eq.u64 %r1396,%r1393,0; @ %r1396 bra $L58; setp.eq.u64 %r1395,%r1393,1; @ %r1395 bra $L313; setp.eq.u64 %r1394,%r1393,2; @ %r1394 bra $L314; bra $L383; $L58: .loc 1 2762 43 ld.f64 %r1116,[%r57]; st.f64 [%r73],%r1116; ld.f64 %r1509,[%r57+8]; st.f64 [%r73+2048],%r1509; ld.f64 %r1513,[%r57+16]; st.f64 [%r73+4096],%r1513; ld.f64 %r1517,[%r57+24]; st.f64 [%r73+6144],%r1517; .loc 1 2760 9 add.u64 %r57,%r57,32; add.u64 %r73,%r73,8192; setp.ne.u64 %r1518,%r57,%r136; @ %r1518 bra $L58; bra $L48; $L57: .loc 1 2871 8 @ ! %r1298 bra $L60; bra $L59; $L56: .loc 1 2876 14 add.u64 %r458,%r233,%r243; .loc 1 2876 9 add.u64 %r459,%r458,-1; add.u64 %r192,%r525,%r1027; add.u64 %r89,%r525,%r536; add.u64 %r42,%r536,%r89; not.b64 %r1119,%r243; add.u64 %r1120,%r1119,%r458; shr.u64 %r1121,%r1120,2; add.u64 %r1122,%r1121,1; shl.b64 %r288,%r1122,5; add.u64 %r1123,%r1336,%r458; shl.b64 %r1124,%r1123,3; add.u64 %r438,%r1124,%r217; add.u64 %r1125,%r458,%r568; shl.b64 %r1126,%r1125,3; add.u64 %r433,%r1126,%r217; add.u64 %r1128,%r1337,%r458; shl.b64 %r1129,%r1128,3; add.u64 %r430,%r1129,%r217; add.u64 %r1130,%r458,%r572; shl.b64 %r1131,%r1130,3; add.u64 %r427,%r1131,%r217; shl.b64 %r339,%r233,8; shl.b64 %r441,%r1473,3; mov.u64 %r446,%r445; mov.u64 %r448,%r447; mov.u64 %r453,%r575; mov.u64 %r456,%r573; mov.u64 %r190,%r525; mov.u64 %r197,%r195; mov.u64 %r248,%r588; mov.u64 %r443,0; mov.u64 %r444,%r443; setp.le.s64 %r1293,%r243,%r459; setp.gt.s64 %r1301,%r1473,%r233; $L68: .loc 1 2772 9 @ %r1293 bra $L61; $L67: .loc 1 2843 12 @ ! %r1301 bra $L63; bra $L62; $L61: .loc 1 2904 13 mov.u64 %r311,0; mov.u64 %r312,8192; mov.u64 %r313,6144; mov.u64 %r314,2048; mov.u64 %r315,-4096; $L66: add.u64 %r309,%r192,%r311; .loc 1 2774 10 ld.f64 %r249,[%r309]; .loc 1 2775 10 ld.f64 %r250,[%r309+8]; add.u64 %r307,%r190,%r311; .loc 1 2776 10 ld.f64 %r251,[%r307]; .loc 1 2777 10 ld.f64 %r252,[%r307+8]; add.u64 %r305,%r89,%r311; .loc 1 2778 10 ld.f64 %r253,[%r305]; .loc 1 2779 10 ld.f64 %r254,[%r305+8]; add.u64 %r303,%r42,%r311; .loc 1 2780 10 ld.f64 %r255,[%r303]; .loc 1 2781 10 ld.f64 %r256,[%r303+8]; .loc 1 2782 10 ld.f64 %r257,[%r309+16]; .loc 1 2783 10 ld.f64 %r258,[%r309+24]; .loc 1 2784 10 ld.f64 %r259,[%r307+16]; .loc 1 2785 10 ld.f64 %r260,[%r307+24]; .loc 1 2786 10 ld.f64 %r261,[%r305+16]; .loc 1 2787 10 ld.f64 %r262,[%r305+24]; .loc 1 2788 10 ld.f64 %r263,[%r303+16]; .loc 1 2789 10 ld.f64 %r264,[%r303+24]; .loc 1 2791 6 @ %r1329 bra $L64; add.u64 %r331,%r443,%r540; add.u64 %r329,%r1001,%r314; sub.u64 %r1382,%r197,%r331; add.u64 %r1383,%r1382,-8; shr.u64 %r1381,%r1383,3; and.b64 %r1385,%r1381,1; setp.ne.u64 %r1386,%r1385,0; @ ! 
%r1386 bra $L384; $L65: add.u64 %r326,%r315,%r329; .loc 1 2793 19 add.u64 %r1136,%r326,%r314; ld.f64 %r137,[%r1136]; .loc 1 2794 14 ld.f64 %r138,[%r331]; .loc 1 2793 14 fma.rn.f64 %r1519,%r137,%r138,%r249; .loc 1 2795 19 ld.f64 %r139,[%r329]; .loc 1 2795 14 fma.rn.f64 %r1520,%r138,%r139,%r250; add.u64 %r321,%r331,%r456; .loc 1 2798 14 add.u64 %r1137,%r321,%r453; ld.f64 %r140,[%r1137]; .loc 1 2797 14 fma.rn.f64 %r1521,%r137,%r140,%r251; .loc 1 2799 14 fma.rn.f64 %r1522,%r139,%r140,%r252; .loc 1 2802 14 add.u64 %r1138,%r321,%r448; ld.f64 %r141,[%r1138]; .loc 1 2801 14 fma.rn.f64 %r1523,%r137,%r141,%r253; .loc 1 2803 14 fma.rn.f64 %r1524,%r139,%r141,%r254; .loc 1 2806 14 add.u64 %r1139,%r321,%r446; ld.f64 %r142,[%r1139]; .loc 1 2805 14 fma.rn.f64 %r1525,%r137,%r142,%r255; .loc 1 2807 14 fma.rn.f64 %r1526,%r139,%r142,%r256; .loc 1 2809 19 add.u64 %r1140,%r326,%r313; ld.f64 %r143,[%r1140]; .loc 1 2809 14 fma.rn.f64 %r1527,%r138,%r143,%r257; .loc 1 2811 19 add.u64 %r1141,%r326,%r312; ld.f64 %r144,[%r1141]; .loc 1 2811 14 fma.rn.f64 %r1528,%r138,%r144,%r258; .loc 1 2813 14 fma.rn.f64 %r1529,%r140,%r143,%r259; .loc 1 2815 14 fma.rn.f64 %r1530,%r140,%r144,%r260; .loc 1 2817 14 fma.rn.f64 %r1531,%r141,%r143,%r261; .loc 1 2819 14 fma.rn.f64 %r1532,%r141,%r144,%r262; .loc 1 2821 14 fma.rn.f64 %r1533,%r142,%r143,%r263; .loc 1 2823 14 fma.rn.f64 %r1534,%r142,%r144,%r264; .loc 1 2791 6 add.u64 %r1387,%r331,8; add.u64 %r1388,%r329,8; add.u64 %r1537,%r315,%r1388; .loc 1 2793 19 add.u64 %r1538,%r1537,%r314; ld.f64 %r1539,[%r1538]; .loc 1 2794 14 ld.f64 %r1540,[%r1387]; .loc 1 2793 14 fma.rn.f64 %r249,%r1539,%r1540,%r1519; .loc 1 2795 19 ld.f64 %r1541,[%r1388]; .loc 1 2795 14 fma.rn.f64 %r250,%r1540,%r1541,%r1520; add.u64 %r1542,%r1387,%r456; .loc 1 2798 14 add.u64 %r1543,%r1542,%r453; ld.f64 %r1544,[%r1543]; .loc 1 2797 14 fma.rn.f64 %r251,%r1539,%r1544,%r1521; .loc 1 2799 14 fma.rn.f64 %r252,%r1541,%r1544,%r1522; .loc 1 2802 14 add.u64 %r1545,%r1542,%r448; ld.f64 %r1546,[%r1545]; .loc 1 2801 14 fma.rn.f64 %r253,%r1539,%r1546,%r1523; .loc 1 2803 14 fma.rn.f64 %r254,%r1541,%r1546,%r1524; .loc 1 2806 14 add.u64 %r1547,%r1542,%r446; ld.f64 %r1548,[%r1547]; .loc 1 2805 14 fma.rn.f64 %r255,%r1539,%r1548,%r1525; .loc 1 2807 14 fma.rn.f64 %r256,%r1541,%r1548,%r1526; .loc 1 2809 19 add.u64 %r1549,%r1537,%r313; ld.f64 %r1550,[%r1549]; .loc 1 2809 14 fma.rn.f64 %r257,%r1540,%r1550,%r1527; .loc 1 2811 19 add.u64 %r1551,%r1537,%r312; ld.f64 %r1552,[%r1551]; .loc 1 2811 14 fma.rn.f64 %r258,%r1540,%r1552,%r1528; .loc 1 2813 14 fma.rn.f64 %r259,%r1544,%r1550,%r1529; .loc 1 2815 14 fma.rn.f64 %r260,%r1544,%r1552,%r1530; .loc 1 2817 14 fma.rn.f64 %r261,%r1546,%r1550,%r1531; .loc 1 2819 14 fma.rn.f64 %r262,%r1546,%r1552,%r1532; .loc 1 2821 14 fma.rn.f64 %r263,%r1548,%r1550,%r1533; .loc 1 2823 14 fma.rn.f64 %r264,%r1548,%r1552,%r1534; .loc 1 2791 6 add.u64 %r331,%r331,16; add.u64 %r329,%r329,16; setp.ne.u64 %r1553,%r197,%r331; @ %r1553 bra $L65; $L64: .loc 1 2826 24 st.f64 [%r309],%r249; .loc 1 2827 28 st.f64 [%r309+8],%r250; .loc 1 2828 30 st.f64 [%r307],%r251; .loc 1 2829 34 st.f64 [%r307+8],%r252; .loc 1 2830 30 st.f64 [%r305],%r253; .loc 1 2831 34 st.f64 [%r305+8],%r254; .loc 1 2832 30 st.f64 [%r303],%r255; .loc 1 2833 34 st.f64 [%r303+8],%r256; .loc 1 2834 28 st.f64 [%r309+16],%r257; .loc 1 2835 28 st.f64 [%r309+24],%r258; .loc 1 2836 34 st.f64 [%r307+16],%r259; .loc 1 2837 34 st.f64 [%r307+24],%r260; .loc 1 2838 34 st.f64 [%r305+16],%r261; .loc 1 2839 34 st.f64 [%r305+24],%r262; .loc 1 2840 34 st.f64 [%r303+16],%r263; 
.loc 1 2841 34 st.f64 [%r303+24],%r264; .loc 1 2772 9 add.u64 %r315,%r315,-8192; add.u64 %r314,%r314,8192; add.u64 %r313,%r313,8192; add.u64 %r312,%r312,8192; add.u64 %r311,%r311,32; setp.ne.u64 %r1143,%r288,%r311; @ %r1143 bra $L66; bra $L67; $L63: .loc 1 2769 29 add.u64 %r248,%r248,4; .loc 1 2769 5 add.u64 %r197,%r197,%r196; add.u64 %r192,%r192,%r191; add.u64 %r190,%r190,%r191; add.u64 %r89,%r89,%r191; add.u64 %r42,%r42,%r191; add.u64 %r456,%r456,%r1030; add.u64 %r453,%r453,%r196; add.u64 %r448,%r448,%r196; add.u64 %r446,%r446,%r196; add.u64 %r444,%r444,%r191; add.u64 %r443,%r443,%r196; setp.le.s64 %r1144,%r248,%r1331; @ %r1144 bra $L68; bra $L57; $L62: add.u64 %r344,%r438,%r444; add.u64 %r343,%r433,%r444; add.u64 %r342,%r430,%r444; add.u64 %r341,%r427,%r444; add.u64 %r49,%r48,%r443; add.u64 %r209,%r46,%r443; add.u64 %r210,%r30,%r443; add.u64 %r442,%r192,%r441; .loc 1 2904 13 mov.u64 %r340,%r339; $L71: .loc 1 2848 14 ld.f64 %r244,[%r344]; .loc 1 2849 14 ld.f64 %r245,[%r343]; .loc 1 2850 14 ld.f64 %r246,[%r342]; .loc 1 2851 14 ld.f64 %r247,[%r341]; .loc 1 2853 10 @ %r1329 bra $L69; shl.b64 %r1148,%r340,3; add.u64 %r359,%r1001,%r1148; add.u64 %r357,%r443,%r540; mov.u64 %r351,%r210; mov.u64 %r353,%r209; mov.u64 %r355,%r49; sub.u64 %r1369,%r197,%r357; add.u64 %r1370,%r1369,-8; shr.u64 %r1368,%r1370,3; add.u64 %r1371,%r1368,1; and.b64 %r1372,%r1371,3; setp.eq.u64 %r1375,%r1372,0; @ %r1375 bra $L70; setp.eq.u64 %r1374,%r1372,1; @ %r1374 bra $L315; setp.eq.u64 %r1373,%r1372,2; @ %r1373 bra $L316; bra $L385; $L70: .loc 1 2855 16 ld.f64 %r146,[%r359]; .loc 1 2855 11 ld.f64 %r1149,[%r357]; fma.rn.f64 %r1554,%r146,%r1149,%r244; .loc 1 2857 11 ld.f64 %r1150,[%r355]; fma.rn.f64 %r1555,%r146,%r1150,%r245; .loc 1 2859 11 ld.f64 %r1151,[%r353]; fma.rn.f64 %r1556,%r146,%r1151,%r246; .loc 1 2861 11 ld.f64 %r1152,[%r351]; fma.rn.f64 %r1557,%r146,%r1152,%r247; .loc 1 2855 16 ld.f64 %r1563,[%r359+8]; .loc 1 2855 11 ld.f64 %r1564,[%r357+8]; fma.rn.f64 %r1565,%r1563,%r1564,%r1554; .loc 1 2857 11 ld.f64 %r1566,[%r355+8]; fma.rn.f64 %r1567,%r1563,%r1566,%r1555; .loc 1 2859 11 ld.f64 %r1568,[%r353+8]; fma.rn.f64 %r1569,%r1563,%r1568,%r1556; .loc 1 2861 11 ld.f64 %r1570,[%r351+8]; fma.rn.f64 %r1571,%r1563,%r1570,%r1557; .loc 1 2855 16 ld.f64 %r1578,[%r359+16]; .loc 1 2855 11 ld.f64 %r1579,[%r357+16]; fma.rn.f64 %r1580,%r1578,%r1579,%r1565; .loc 1 2857 11 ld.f64 %r1581,[%r355+16]; fma.rn.f64 %r1582,%r1578,%r1581,%r1567; .loc 1 2859 11 ld.f64 %r1583,[%r353+16]; fma.rn.f64 %r1584,%r1578,%r1583,%r1569; .loc 1 2861 11 ld.f64 %r1585,[%r351+16]; fma.rn.f64 %r1586,%r1578,%r1585,%r1571; .loc 1 2855 16 ld.f64 %r1593,[%r359+24]; .loc 1 2855 11 ld.f64 %r1594,[%r357+24]; fma.rn.f64 %r244,%r1593,%r1594,%r1580; .loc 1 2857 11 ld.f64 %r1595,[%r355+24]; fma.rn.f64 %r245,%r1593,%r1595,%r1582; .loc 1 2859 11 ld.f64 %r1596,[%r353+24]; fma.rn.f64 %r246,%r1593,%r1596,%r1584; .loc 1 2861 11 ld.f64 %r1597,[%r351+24]; fma.rn.f64 %r247,%r1593,%r1597,%r1586; .loc 1 2853 10 add.u64 %r359,%r359,32; add.u64 %r357,%r357,32; add.u64 %r355,%r355,32; add.u64 %r353,%r353,32; add.u64 %r351,%r351,32; setp.ne.u64 %r1598,%r197,%r357; @ %r1598 bra $L70; $L69: .loc 1 2864 28 st.f64 [%r344],%r244; .loc 1 2865 34 st.f64 [%r343],%r245; .loc 1 2866 34 st.f64 [%r342],%r246; .loc 1 2867 34 st.f64 [%r341],%r247; .loc 1 2846 6 add.u64 %r344,%r344,8; add.u64 %r343,%r343,8; add.u64 %r342,%r342,8; add.u64 %r341,%r341,8; add.u64 %r340,%r340,256; setp.ne.u64 %r1154,%r344,%r442; @ %r1154 bra $L71; bra $L63; $L60: .loc 1 2727 34 add.u64 %r243,%r243,256; .loc 1 2727 8 
add.u64 %r525,%r525,2048; add.u64 %r526,%r526,2048; setp.ge.s64 %r1155,%r186,%r243; @ %r1155 bra $L72; bra $L73; $L59: .loc 1 2873 17 add.u64 %r151,%r1470,%r588; .loc 1 2874 9 setp.le.s64 %r1156,%r151,%r450; @ %r1156 bra $L60; .loc 1 2876 14 add.u64 %r465,%r233,%r243; .loc 1 2876 9 add.u64 %r466,%r465,-1; .loc 1 2759 17 add.u64 %r375,%r1473,%r243; add.u64 %r1157,%r375,%r376; shl.b64 %r1158,%r1157,3; add.u64 %r377,%r1158,%r217; neg.s64 %r1160,%r375; shl.b64 %r1161,%r1160,3; shl.b64 %r1162,%r243,3; add.u64 %r360,%r1162,%r1161; shl.b64 %r1163,%r465,3; add.u64 %r362,%r1163,%r1161; shl.b64 %r404,%r233,8; .loc 1 2876 9 mov.u64 %r368,%r367; mov.u64 %r370,%r556; mov.u64 %r371,%r376; mov.u64 %r237,%r450; setp.gt.s64 %r1599,%r1473,%r233; setp.le.s64 %r1294,%r243,%r466; not.b64 %r1311,%r243; add.u64 %r1312,%r1311,%r465; and.b64 %r1313,%r1312,-4; add.u64 %r1314,%r1313,%r243; $L81: .loc 1 2877 6 @ %r1294 bra $L74; $L80: .loc 1 2901 6 @ %r1599 bra $L75; bra $L76; $L74: add.u64 %r389,%r360,%r377; add.u64 %r1171,%r1314,%r371; shl.b64 %r1172,%r1171,3; add.u64 %r1173,%r217,32; add.u64 %r379,%r1172,%r1173; .loc 1 2904 13 mov.u64 %r385,8192; mov.u64 %r386,6144; mov.u64 %r387,2048; mov.u64 %r388,-4096; $L79: .loc 1 2879 14 ld.f64 %r239,[%r389]; .loc 1 2880 14 ld.f64 %r240,[%r389+8]; .loc 1 2881 14 ld.f64 %r241,[%r389+16]; .loc 1 2882 14 ld.f64 %r242,[%r389+24]; .loc 1 2884 10 @ %r1329 bra $L77; add.u64 %r398,%r1001,%r387; mov.u64 %r399,%r370; sub.u64 %r1359,%r368,%r370; add.u64 %r1360,%r1359,-8; shr.u64 %r1358,%r1360,3; add.u64 %r1361,%r1358,1; and.b64 %r1362,%r1361,3; setp.eq.u64 %r1365,%r1362,0; @ %r1365 bra $L78; setp.eq.u64 %r1364,%r1362,1; @ %r1364 bra $L317; setp.eq.u64 %r1363,%r1362,2; @ %r1363 bra $L318; bra $L386; $L78: add.u64 %r395,%r388,%r398; .loc 1 2887 16 ld.f64 %r153,[%r399]; .loc 1 2886 16 add.u64 %r1175,%r395,%r387; .loc 1 2886 11 ld.f64 %r1176,[%r1175]; fma.rn.f64 %r1600,%r1176,%r153,%r239; .loc 1 2888 11 ld.f64 %r1177,[%r398]; fma.rn.f64 %r1601,%r153,%r1177,%r240; .loc 1 2890 16 add.u64 %r1178,%r395,%r386; .loc 1 2890 11 ld.f64 %r1179,[%r1178]; fma.rn.f64 %r1602,%r153,%r1179,%r241; .loc 1 2892 16 add.u64 %r1180,%r395,%r385; .loc 1 2892 11 ld.f64 %r1181,[%r1180]; fma.rn.f64 %r1603,%r153,%r1181,%r242; .loc 1 2884 10 add.u64 %r1367,%r398,8; add.u64 %r1606,%r388,%r1367; .loc 1 2887 16 ld.f64 %r1607,[%r399+8]; .loc 1 2886 16 add.u64 %r1608,%r1606,%r387; .loc 1 2886 11 ld.f64 %r1609,[%r1608]; fma.rn.f64 %r1610,%r1609,%r1607,%r1600; .loc 1 2888 11 ld.f64 %r1611,[%r1367]; fma.rn.f64 %r1612,%r1607,%r1611,%r1601; .loc 1 2890 16 add.u64 %r1613,%r1606,%r386; .loc 1 2890 11 ld.f64 %r1614,[%r1613]; fma.rn.f64 %r1615,%r1607,%r1614,%r1602; .loc 1 2892 16 add.u64 %r1616,%r1606,%r385; .loc 1 2892 11 ld.f64 %r1617,[%r1616]; fma.rn.f64 %r1618,%r1607,%r1617,%r1603; .loc 1 2884 10 add.u64 %r1620,%r398,16; add.u64 %r1622,%r388,%r1620; .loc 1 2887 16 ld.f64 %r1623,[%r399+16]; .loc 1 2886 16 add.u64 %r1624,%r1622,%r387; .loc 1 2886 11 ld.f64 %r1625,[%r1624]; fma.rn.f64 %r1626,%r1625,%r1623,%r1610; .loc 1 2888 11 ld.f64 %r1627,[%r1620]; fma.rn.f64 %r1628,%r1623,%r1627,%r1612; .loc 1 2890 16 add.u64 %r1629,%r1622,%r386; .loc 1 2890 11 ld.f64 %r1630,[%r1629]; fma.rn.f64 %r1631,%r1623,%r1630,%r1615; .loc 1 2892 16 add.u64 %r1632,%r1622,%r385; .loc 1 2892 11 ld.f64 %r1633,[%r1632]; fma.rn.f64 %r1634,%r1623,%r1633,%r1618; .loc 1 2884 10 add.u64 %r1636,%r398,24; add.u64 %r1638,%r388,%r1636; .loc 1 2887 16 ld.f64 %r1639,[%r399+24]; .loc 1 2886 16 add.u64 %r1640,%r1638,%r387; .loc 1 2886 11 ld.f64 %r1641,[%r1640]; 
fma.rn.f64 %r239,%r1641,%r1639,%r1626; .loc 1 2888 11 ld.f64 %r1642,[%r1636]; fma.rn.f64 %r240,%r1639,%r1642,%r1628; .loc 1 2890 16 add.u64 %r1643,%r1638,%r386; .loc 1 2890 11 ld.f64 %r1644,[%r1643]; fma.rn.f64 %r241,%r1639,%r1644,%r1631; .loc 1 2892 16 add.u64 %r1645,%r1638,%r385; .loc 1 2892 11 ld.f64 %r1646,[%r1645]; fma.rn.f64 %r242,%r1639,%r1646,%r1634; .loc 1 2884 10 add.u64 %r399,%r399,32; add.u64 %r398,%r398,32; setp.ne.u64 %r1647,%r368,%r399; @ %r1647 bra $L78; $L77: .loc 1 2895 28 st.f64 [%r389],%r239; .loc 1 2896 32 st.f64 [%r389+8],%r240; .loc 1 2897 32 st.f64 [%r389+16],%r241; .loc 1 2898 32 st.f64 [%r389+24],%r242; .loc 1 2877 6 add.u64 %r389,%r389,32; add.u64 %r388,%r388,-8192; add.u64 %r387,%r387,8192; add.u64 %r386,%r386,8192; add.u64 %r385,%r385,8192; setp.ne.u64 %r1183,%r379,%r389; @ %r1183 bra $L79; bra $L80; $L76: .loc 1 2874 39 add.u64 %r237,%r237,1; .loc 1 2874 9 add.u64 %r377,%r377,%r536; add.u64 %r371,%r371,%r474; add.u64 %r370,%r370,%r589; add.u64 %r368,%r368,%r589; setp.ne.u64 %r1184,%r151,%r237; @ %r1184 bra $L81; bra $L60; $L75: add.u64 %r406,%r362,%r377; .loc 1 2904 13 mov.u64 %r405,%r404; $L84: .loc 1 2903 14 ld.f64 %r238,[%r406]; .loc 1 2905 10 @ %r1329 bra $L82; shl.b64 %r1187,%r405,3; add.u64 %r302,%r1001,%r1187; mov.u64 %r267,%r370; sub.u64 %r1349,%r368,%r370; add.u64 %r1350,%r1349,-8; shr.u64 %r1348,%r1350,3; add.u64 %r1351,%r1348,1; and.b64 %r1352,%r1351,3; setp.eq.u64 %r1355,%r1352,0; @ %r1355 bra $L83; setp.eq.u64 %r1354,%r1352,1; @ %r1354 bra $L319; setp.eq.u64 %r1353,%r1352,2; @ %r1353 bra $L320; bra $L387; $L83: .loc 1 2907 11 ld.f64 %r1188,[%r302]; ld.f64 %r1189,[%r267]; fma.rn.f64 %r1648,%r1188,%r1189,%r238; ld.f64 %r1651,[%r302+8]; ld.f64 %r1652,[%r267+8]; fma.rn.f64 %r1653,%r1651,%r1652,%r1648; ld.f64 %r1657,[%r302+16]; ld.f64 %r1658,[%r267+16]; fma.rn.f64 %r1659,%r1657,%r1658,%r1653; ld.f64 %r1663,[%r302+24]; ld.f64 %r1664,[%r267+24]; fma.rn.f64 %r238,%r1663,%r1664,%r1659; .loc 1 2905 10 add.u64 %r302,%r302,32; add.u64 %r267,%r267,32; setp.ne.u64 %r1665,%r368,%r267; @ %r1665 bra $L83; $L82: .loc 1 2910 28 st.f64 [%r406],%r238; .loc 1 2901 6 add.u64 %r406,%r406,8; add.u64 %r405,%r405,256; setp.ne.u64 %r1191,%r377,%r406; @ %r1191 bra $L84; bra $L76; $L20: .loc 1 2958 11 cvt.u16.u32 %r1192,%r416; setp.eq.u16 %r1193,%r1192,1; @ ! 
%r1193 bra $L388; bra $L85; $L31: .loc 1 2920 38 set.u32.eq.u64 %r1195,%r178,1; neg.s32 %r1196,%r1195; .loc 1 2920 26 cvt.u16.u32 %r1198,%r1196; and.b16 %r1197,%r1198,%r958; .loc 1 2920 11 cvt.u32.u16 %r1200,%r1197; cvt.u16.u8 %r1201,%r1200; setp.eq.u16 %r1202,%r1201,0; @ %r1202 bra $L20; .loc 1 2920 43 setp.ne.u64 %r1203,%r472,1; @ %r1203 bra $L20; $L121: .loc 1 2922 10 cvt.u16.u32 %r1204,%r416; setp.ne.u16 %r1205,%r1204,1; @ %r1205 bra $L87; .loc 1 2948 4 setp.gt.s64 %r1206,%r187,0; @ %r1206 bra $L88; bra $L1; $L87: .loc 1 2929 4 setp.le.s64 %r1207,%r187,0; @ %r1207 bra $L1; shl.b64 %r613,%r474,3; shl.b64 %r1208,%r186,3; add.u64 %r612,%r215,%r1208; shl.b64 %r603,%r473,3; shl.b64 %r1209,%r426,3; add.u64 %r604,%r213,%r1209; mov.u64 %r617,0; mov.u64 %r610,%r617; .loc 1 2929 11 mov.u64 %r270,%r617; setp.gt.s64 %r1666,%r186,0; setp.le.s64 %r1345,%r426,0; $L91: .loc 1 2933 8 @ %r1666 bra $L90; $L95: .loc 1 2929 29 add.u64 %r270,%r270,1; .loc 1 2929 4 add.u64 %r610,%r610,%r474; add.u64 %r612,%r612,%r613; add.u64 %r617,%r617,%r179; setp.ne.u64 %r1211,%r187,%r270; @ %r1211 bra $L91; bra $L1; $L90: shl.b64 %r1212,%r610,3; add.u64 %r599,%r215,%r1212; shl.b64 %r1213,%r617,3; add.u64 %r622,%r214,%r1213; .loc 1 2933 8 mov.u64 %r601,%r604; mov.u64 %r600,0; $L94: .loc 1 2937 5 @ %r1345 bra $L140; shl.b64 %r1215,%r600,3; add.u64 %r595,%r213,%r1215; mov.u64 %r596,%r622; .loc 1 2936 7 mov.f64 %r271,0d0000000000000000; sub.u64 %r1418,%r601,%r595; add.u64 %r1419,%r1418,-8; shr.u64 %r1417,%r1419,3; add.u64 %r1420,%r1417,1; and.b64 %r1421,%r1420,3; setp.eq.u64 %r1424,%r1421,0; @ %r1424 bra $L93; setp.eq.u64 %r1423,%r1421,1; @ %r1423 bra $L321; setp.eq.u64 %r1422,%r1421,2; @ %r1422 bra $L322; bra $L389; $L93: .loc 1 2938 9 ld.f64 %r1216,[%r595]; ld.f64 %r1217,[%r596]; fma.rn.f64 %r1667,%r1216,%r1217,%r271; ld.f64 %r1670,[%r595+8]; ld.f64 %r1671,[%r596+8]; fma.rn.f64 %r1672,%r1670,%r1671,%r1667; ld.f64 %r1676,[%r595+16]; ld.f64 %r1677,[%r596+16]; fma.rn.f64 %r1678,%r1676,%r1677,%r1672; ld.f64 %r1682,[%r595+24]; ld.f64 %r1683,[%r596+24]; fma.rn.f64 %r271,%r1682,%r1683,%r1678; .loc 1 2937 5 add.u64 %r595,%r595,32; add.u64 %r596,%r596,32; setp.ne.u64 %r1684,%r595,%r601; @ %r1684 bra $L93; bra $L92; $L140: .loc 1 2936 7 mov.f64 %r271,0d0000000000000000; $L92: .loc 1 2939 15 st.f64 [%r599],%r271; .loc 1 2933 8 add.u64 %r599,%r599,8; add.u64 %r600,%r600,%r473; add.u64 %r601,%r601,%r603; setp.ne.u64 %r1219,%r599,%r612; @ %r1219 bra $L94; bra $L95; $L88: shl.b64 %r633,%r474,3; mov.u64 %r631,%r215; shl.b64 %r625,%r473,3; .loc 1 2948 4 mov.u64 %r634,0; .loc 1 2948 11 mov.u64 %r268,%r634; setp.le.s64 %r1685,%r426,0; $L98: .loc 1 2952 8 @ %r1685 bra $L141; mov.u64 %r623,%r213; shl.b64 %r1221,%r634,3; add.u64 %r626,%r1221,%r214; add.u64 %r1222,%r426,%r634; shl.b64 %r1223,%r1222,3; add.u64 %r642,%r1223,%r214; .loc 1 2951 10 mov.f64 %r269,0d0000000000000000; sub.u64 %r1428,%r642,%r626; add.u64 %r1429,%r1428,-8; shr.u64 %r1427,%r1429,3; add.u64 %r1430,%r1427,1; and.b64 %r1431,%r1430,3; setp.eq.u64 %r1434,%r1431,0; @ %r1434 bra $L97; setp.eq.u64 %r1433,%r1431,1; @ %r1433 bra $L323; setp.eq.u64 %r1432,%r1431,2; @ %r1432 bra $L324; bra $L390; $L97: .loc 1 2953 5 ld.f64 %r1224,[%r623]; ld.f64 %r1225,[%r626]; fma.rn.f64 %r1686,%r1224,%r1225,%r269; .loc 1 2952 8 add.u64 %r1687,%r623,%r625; .loc 1 2953 5 ld.f64 %r1689,[%r1687]; ld.f64 %r1690,[%r626+8]; fma.rn.f64 %r1691,%r1689,%r1690,%r1686; .loc 1 2952 8 add.u64 %r1692,%r1687,%r625; .loc 1 2953 5 ld.f64 %r1695,[%r1692]; ld.f64 %r1696,[%r626+16]; fma.rn.f64 
%r1697,%r1695,%r1696,%r1691; .loc 1 2952 8 add.u64 %r1698,%r1692,%r625; .loc 1 2953 5 ld.f64 %r1701,[%r1698]; ld.f64 %r1702,[%r626+24]; fma.rn.f64 %r269,%r1701,%r1702,%r1697; .loc 1 2952 8 add.u64 %r623,%r1698,%r625; add.u64 %r626,%r626,32; setp.ne.u64 %r1703,%r626,%r642; @ %r1703 bra $L97; bra $L96; $L141: .loc 1 2951 10 mov.f64 %r269,0d0000000000000000; $L96: .loc 1 2954 25 st.f64 [%r631],%r269; .loc 1 2948 29 add.u64 %r268,%r268,1; .loc 1 2948 4 add.u64 %r631,%r631,%r633; add.u64 %r634,%r634,%r179; setp.ne.u64 %r1227,%r187,%r268; @ %r1227 bra $L98; bra $L1; $L85: .loc 1 2963 7 setp.le.s64 %r1228,%r187,0; @ %r1228 bra $L1; shl.b64 %r653,%r414,3; mov.u64 %r651,%r215; shl.b64 %r645,%r473,3; shl.b64 %r648,%r472,3; mov.u64 %r654,0; .loc 1 2963 14 mov.u64 %r281,%r654; setp.le.s64 %r1704,%r426,0; $L101: .loc 1 2967 4 @ %r1704 bra $L142; mov.u64 %r643,%r213; shl.b64 %r1230,%r654,3; add.u64 %r646,%r214,%r1230; .loc 1 2966 6 mov.f64 %r282,0d0000000000000000; .loc 1 2967 11 mov.u64 %r283,0; and.b64 %r1439,%r426,3; setp.eq.u64 %r1442,%r1439,0; @ %r1442 bra $L100; setp.eq.u64 %r1441,%r1439,1; @ %r1441 bra $L325; setp.eq.u64 %r1440,%r1439,2; @ %r1440 bra $L326; bra $L391; $L100: .loc 1 2968 8 ld.f64 %r1231,[%r643]; ld.f64 %r1232,[%r646]; fma.rn.f64 %r1705,%r1231,%r1232,%r282; .loc 1 2967 4 add.u64 %r1707,%r643,%r645; add.u64 %r1708,%r646,%r648; .loc 1 2968 8 ld.f64 %r1709,[%r1707]; ld.f64 %r1710,[%r1708]; fma.rn.f64 %r1711,%r1709,%r1710,%r1705; .loc 1 2967 4 add.u64 %r1713,%r1707,%r645; add.u64 %r1714,%r1708,%r648; .loc 1 2968 8 ld.f64 %r1716,[%r1713]; ld.f64 %r1717,[%r1714]; fma.rn.f64 %r1718,%r1716,%r1717,%r1711; .loc 1 2967 4 add.u64 %r1720,%r1713,%r645; add.u64 %r1721,%r1714,%r648; .loc 1 2968 8 ld.f64 %r1723,[%r1720]; ld.f64 %r1724,[%r1721]; fma.rn.f64 %r282,%r1723,%r1724,%r1718; .loc 1 2967 28 add.u64 %r283,%r283,4; .loc 1 2967 4 add.u64 %r643,%r1720,%r645; add.u64 %r646,%r1721,%r648; setp.ne.u64 %r1725,%r426,%r283; @ %r1725 bra $L100; bra $L99; $L142: .loc 1 2966 6 mov.f64 %r282,0d0000000000000000; $L99: .loc 1 2969 21 st.f64 [%r651],%r282; .loc 1 2963 32 add.u64 %r281,%r281,1; .loc 1 2963 7 add.u64 %r651,%r651,%r653; add.u64 %r654,%r654,%r179; setp.ne.u64 %r1234,%r187,%r281; @ %r1234 bra $L101; bra $L1; $L388: .loc 1 2972 11 setp.gt.s64 %r1235,%r178,%r473; @ %r1235 bra $L102; .loc 1 2993 7 setp.gt.s64 %r1236,%r187,0; @ %r1236 bra $L103; bra $L1; $L102: .loc 1 2974 7 setp.gt.s64 %r1237,%r187,0; @ %r1237 bra $L104; bra $L1; $L108: shl.b64 %r1238,%r687,3; add.u64 %r683,%r215,%r1238; .loc 1 2975 9 mov.u64 %r280,0; and.b64 %r1455,%r186,3; setp.eq.u64 %r1458,%r1455,0; @ %r1458 bra $L105; setp.eq.u64 %r1457,%r1455,1; @ %r1457 bra $L327; setp.eq.u64 %r1456,%r1455,2; @ %r1456 bra $L328; bra $L392; $L105: .loc 1 2976 34 st.f64 [%r683],%r1346; .loc 1 2975 2 add.u64 %r1727,%r683,%r685; .loc 1 2976 34 st.f64 [%r1727],%r1346; .loc 1 2975 2 add.u64 %r1729,%r1727,%r685; .loc 1 2976 34 st.f64 [%r1729],%r1346; .loc 1 2975 2 add.u64 %r1732,%r1729,%r685; .loc 1 2976 34 st.f64 [%r1732],%r1346; .loc 1 2975 27 add.u64 %r280,%r280,4; .loc 1 2975 2 add.u64 %r683,%r1732,%r685; setp.ne.u64 %r1734,%r186,%r280; @ %r1734 bra $L105; $L109: .loc 1 2974 32 add.u64 %r279,%r294,1; .loc 1 2974 7 add.u64 %r687,%r687,%r474; setp.ne.u64 %r1241,%r187,%r279; @ %r1241 bra $L143; shl.b64 %r671,%r472,3; shl.b64 %r662,%r473,3; mov.u64 %r676,0; mov.u64 %r675,%r676; mov.u64 %r295,%r676; setp.le.s64 %r1735,%r426,0; bra $L107; $L104: shl.b64 %r685,%r414,3; mov.u64 %r687,0; .loc 1 2974 14 mov.u64 %r294,%r687; setp.gt.s64 %r1736,%r186,0; .loc 
1 2976 34 mov.f64 %r1346,0d0000000000000000; bra $L106; $L143: mov.u64 %r294,%r279; $L106: .loc 1 2975 2 @ %r1736 bra $L108; bra $L109; $L112: shl.b64 %r1243,%r667,3; add.u64 %r660,%r213,%r1243; .loc 1 2980 4 mov.u64 %r659,%r680; .loc 1 2980 11 mov.u64 %r278,0; and.b64 %r1447,%r186,3; setp.eq.u64 %r1450,%r1447,0; @ %r1450 bra $L110; setp.eq.u64 %r1449,%r1447,1; @ %r1449 bra $L329; setp.eq.u64 %r1448,%r1447,2; @ %r1448 bra $L330; bra $L393; $L110: .loc 1 2982 36 ld.f64 %r1244,[%r660]; ld.f64 %r1245,[%r669]; ld.f64 %r1246,[%r659]; fma.rn.f64 %r175,%r1244,%r1245,%r1246; st.f64 [%r659],%r175; .loc 1 2980 4 add.u64 %r1738,%r659,%r685; add.u64 %r1739,%r660,%r662; .loc 1 2982 36 ld.f64 %r1740,[%r1739]; ld.f64 %r1741,[%r669]; ld.f64 %r1742,[%r1738]; fma.rn.f64 %r1743,%r1740,%r1741,%r1742; st.f64 [%r1738],%r1743; .loc 1 2980 4 add.u64 %r1745,%r1738,%r685; add.u64 %r1746,%r1739,%r662; .loc 1 2982 36 ld.f64 %r1748,[%r1746]; ld.f64 %r1749,[%r669]; ld.f64 %r1750,[%r1745]; fma.rn.f64 %r1751,%r1748,%r1749,%r1750; st.f64 [%r1745],%r1751; .loc 1 2980 4 add.u64 %r1753,%r1745,%r685; add.u64 %r1754,%r1746,%r662; .loc 1 2982 36 ld.f64 %r1756,[%r1754]; ld.f64 %r1757,[%r669]; ld.f64 %r1758,[%r1753]; fma.rn.f64 %r1759,%r1756,%r1757,%r1758; st.f64 [%r1753],%r1759; .loc 1 2980 29 add.u64 %r278,%r278,4; .loc 1 2980 4 add.u64 %r659,%r1753,%r685; add.u64 %r660,%r1754,%r662; setp.ne.u64 %r1760,%r186,%r278; @ %r1760 bra $L110; $L113: .loc 1 2979 26 add.u64 %r277,%r277,1; .loc 1 2979 2 add.u64 %r667,%r667,%r178; add.u64 %r669,%r669,%r671; setp.eq.u64 %r1248,%r426,%r277; @ %r1248 bra $L111; $L114: .loc 1 2980 4 @ %r1736 bra $L112; bra $L113; $L111: .loc 1 2978 32 add.u64 %r276,%r295,1; .loc 1 2978 7 add.u64 %r675,%r675,%r474; add.u64 %r676,%r676,%r179; setp.eq.u64 %r1250,%r294,%r295; @ %r1250 bra $L1; mov.u64 %r295,%r276; $L107: .loc 1 2979 2 @ %r1735 bra $L111; shl.b64 %r1252,%r676,3; add.u64 %r669,%r214,%r1252; shl.b64 %r1253,%r675,3; add.u64 %r680,%r215,%r1253; mov.u64 %r667,0; .loc 1 2979 9 mov.u64 %r277,%r667; bra $L114; $L103: shl.b64 %r702,%r414,3; shl.b64 %r693,%r178,3; shl.b64 %r697,%r472,3; .loc 1 2993 7 mov.u64 %r711,0; mov.u64 %r709,%r711; .loc 1 2993 14 mov.u64 %r272,%r711; setp.gt.s64 %r1761,%r186,0; setp.le.s64 %r1347,%r426,0; $L116: .loc 1 2997 4 @ %r1761 bra $L115; $L120: .loc 1 2993 32 add.u64 %r272,%r272,1; .loc 1 2993 7 add.u64 %r709,%r709,%r474; add.u64 %r711,%r711,%r179; setp.ne.u64 %r1255,%r187,%r272; @ %r1255 bra $L116; bra $L1; $L115: shl.b64 %r1256,%r709,3; add.u64 %r700,%r215,%r1256; shl.b64 %r1257,%r711,3; add.u64 %r717,%r214,%r1257; .loc 1 2997 4 mov.u64 %r704,0; .loc 1 2997 11 mov.u64 %r273,%r704; $L119: .loc 1 3001 8 @ %r1347 bra $L144; shl.b64 %r1259,%r704,3; add.u64 %r691,%r213,%r1259; mov.u64 %r695,%r717; .loc 1 3000 10 mov.f64 %r274,0d0000000000000000; .loc 1 3001 15 mov.u64 %r275,0; and.b64 %r1463,%r426,3; setp.eq.u64 %r1466,%r1463,0; @ %r1466 bra $L118; setp.eq.u64 %r1465,%r1463,1; @ %r1465 bra $L331; setp.eq.u64 %r1464,%r1463,2; @ %r1464 bra $L332; bra $L394; $L118: .loc 1 3002 5 ld.f64 %r1260,[%r691]; ld.f64 %r1261,[%r695]; fma.rn.f64 %r1763,%r1260,%r1261,%r274; .loc 1 3001 8 add.u64 %r1765,%r691,%r693; add.u64 %r1766,%r695,%r697; .loc 1 3002 5 ld.f64 %r1767,[%r1765]; ld.f64 %r1768,[%r1766]; fma.rn.f64 %r1769,%r1767,%r1768,%r1763; .loc 1 3001 8 add.u64 %r1771,%r1765,%r693; add.u64 %r1772,%r1766,%r697; .loc 1 3002 5 ld.f64 %r1774,[%r1771]; ld.f64 %r1775,[%r1772]; fma.rn.f64 %r1776,%r1774,%r1775,%r1769; .loc 1 3001 8 add.u64 %r1778,%r1771,%r693; add.u64 %r1779,%r1772,%r697; .loc 1 3002 
5 ld.f64 %r1781,[%r1778]; ld.f64 %r1782,[%r1779]; fma.rn.f64 %r274,%r1781,%r1782,%r1776; .loc 1 3001 32 add.u64 %r275,%r275,4; .loc 1 3001 8 add.u64 %r691,%r1778,%r693; add.u64 %r695,%r1779,%r697; setp.ne.u64 %r1783,%r426,%r275; @ %r1783 bra $L118; bra $L117; $L144: .loc 1 3000 10 mov.f64 %r274,0d0000000000000000; $L117: .loc 1 3003 27 st.f64 [%r700],%r274; .loc 1 2997 29 add.u64 %r273,%r273,1; .loc 1 2997 4 add.u64 %r700,%r700,%r702; add.u64 %r704,%r704,%r473; setp.ne.u64 %r1263,%r186,%r273; @ %r1263 bra $L119; bra $L120; $L32: .loc 1 2920 11 setp.eq.u64 %r1264,%r178,1; @ %r1264 bra $L121; bra $L20; $L2: .loc 1 2476 6 ld.u64 %r1265,[%r718]; setp.ne.u64 %r1266,%r1265,0; @ %r1266 bra $L122; .loc 1 2485 4 ld.u64 %r1268,[%r719+56]; add.u64 %r1267,%r1268,1; ld.u64 %r1269,[%r719+48]; sub.u64 %r478,%r1267,%r1269; .loc 1 2483 15 ld.s8 %r1271,[%r720+28]; cvt.u16.u32 %r1270,%r1271; setp.eq.u16 %r1272,%r1270,1; @ %r1272 bra $L123; bra $L124; $L126: .loc 1 2560 16 ld.u64 %r473,[%r719+40]; .loc 1 2548 27 mov.u64 %r414,%r474; bra $L125; $L14: ld.u64 %r474,[%r718+40]; .loc 1 2543 6 ld.s8 %r1274,[%r718+28]; cvt.u16.u32 %r1273,%r1274; setp.eq.u16 %r1275,%r1273,1; @ ! %r1275 bra $L127; bra $L126; $L11: .loc 1 2560 16 ld.u64 %r473,[%r719+40]; mov.u32 %r416,%r301; .loc 1 2548 27 mov.u64 %r474,%r414; .loc 1 2563 14 mov.u64 %r186,1; .loc 1 2561 16 mov.u64 %r178,%r186; bra $L12; $L5: .loc 1 2485 4 ld.u64 %r1277,[%r719+56]; add.u64 %r1276,%r1277,1; ld.u64 %r1278,[%r719+48]; sub.u64 %r478,%r1276,%r1278; bra $L124; $L128: .loc 1 2509 17 ld.u64 %r1280,[%r718+56]; add.u64 %r1279,%r1280,1; .loc 1 2509 15 ld.u64 %r1281,[%r718+48]; sub.u64 %r426,%r1279,%r1281; ld.s8 %r1286,[%r720+28]; bra $L9; $L122: .loc 1 2502 12 cvta.global.u64 %r1282,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r1283,[%r1282+36]; setp.ne.u32 %r1284,%r1283,0; @ ! 
%r1284 bra $L8; bra $L128; $L1$L387: .loc 1 2907 11 ld.f64 %r1784,[%r302]; ld.f64 %r1785,[%r370]; fma.rn.f64 %r238,%r1784,%r1785,%r238; .loc 1 2905 10 add.u64 %r302,%r302,8; add.u64 %r267,%r370,8; $L320: .loc 1 2907 11 ld.f64 %r1787,[%r302]; ld.f64 %r1788,[%r267]; fma.rn.f64 %r238,%r1787,%r1788,%r238; .loc 1 2905 10 add.u64 %r302,%r302,8; add.u64 %r267,%r267,8; $L319: .loc 1 2907 11 ld.f64 %r1790,[%r302]; ld.f64 %r1791,[%r267]; fma.rn.f64 %r238,%r1790,%r1791,%r238; .loc 1 2905 10 add.u64 %r302,%r302,8; add.u64 %r267,%r267,8; setp.ne.u64 %r1792,%r368,%r267; @ %r1792 bra $L83; bra $L82; $L386: add.u64 %r1793,%r388,%r398; .loc 1 2887 16 ld.f64 %r1794,[%r370]; .loc 1 2886 16 add.u64 %r1795,%r1793,%r387; .loc 1 2886 11 ld.f64 %r1796,[%r1795]; fma.rn.f64 %r239,%r1796,%r1794,%r239; .loc 1 2888 11 ld.f64 %r1797,[%r398]; fma.rn.f64 %r240,%r1794,%r1797,%r240; .loc 1 2890 16 add.u64 %r1798,%r1793,%r386; .loc 1 2890 11 ld.f64 %r1799,[%r1798]; fma.rn.f64 %r241,%r1794,%r1799,%r241; .loc 1 2892 16 add.u64 %r1800,%r1793,%r385; .loc 1 2892 11 ld.f64 %r1801,[%r1800]; fma.rn.f64 %r242,%r1794,%r1801,%r242; .loc 1 2884 10 add.u64 %r399,%r370,8; add.u64 %r398,%r398,8; $L318: add.u64 %r1803,%r388,%r398; .loc 1 2887 16 ld.f64 %r1804,[%r399]; .loc 1 2886 16 add.u64 %r1805,%r1803,%r387; .loc 1 2886 11 ld.f64 %r1806,[%r1805]; fma.rn.f64 %r239,%r1806,%r1804,%r239; .loc 1 2888 11 ld.f64 %r1807,[%r398]; fma.rn.f64 %r240,%r1804,%r1807,%r240; .loc 1 2890 16 add.u64 %r1808,%r1803,%r386; .loc 1 2890 11 ld.f64 %r1809,[%r1808]; fma.rn.f64 %r241,%r1804,%r1809,%r241; .loc 1 2892 16 add.u64 %r1810,%r1803,%r385; .loc 1 2892 11 ld.f64 %r1811,[%r1810]; fma.rn.f64 %r242,%r1804,%r1811,%r242; .loc 1 2884 10 add.u64 %r399,%r399,8; add.u64 %r398,%r398,8; $L317: add.u64 %r1813,%r388,%r398; .loc 1 2887 16 ld.f64 %r1814,[%r399]; .loc 1 2886 16 add.u64 %r1815,%r1813,%r387; .loc 1 2886 11 ld.f64 %r1816,[%r1815]; fma.rn.f64 %r239,%r1816,%r1814,%r239; .loc 1 2888 11 ld.f64 %r1817,[%r398]; fma.rn.f64 %r240,%r1814,%r1817,%r240; .loc 1 2890 16 add.u64 %r1818,%r1813,%r386; .loc 1 2890 11 ld.f64 %r1819,[%r1818]; fma.rn.f64 %r241,%r1814,%r1819,%r241; .loc 1 2892 16 add.u64 %r1820,%r1813,%r385; .loc 1 2892 11 ld.f64 %r1821,[%r1820]; fma.rn.f64 %r242,%r1814,%r1821,%r242; .loc 1 2884 10 add.u64 %r399,%r399,8; add.u64 %r398,%r398,8; setp.ne.u64 %r1822,%r368,%r399; @ %r1822 bra $L78; bra $L77; $L385: .loc 1 2855 16 ld.f64 %r1823,[%r359]; .loc 1 2855 11 ld.f64 %r1824,[%r357]; fma.rn.f64 %r244,%r1823,%r1824,%r244; .loc 1 2857 11 ld.f64 %r1825,[%r49]; fma.rn.f64 %r245,%r1823,%r1825,%r245; .loc 1 2859 11 ld.f64 %r1826,[%r209]; fma.rn.f64 %r246,%r1823,%r1826,%r246; .loc 1 2861 11 ld.f64 %r1827,[%r210]; fma.rn.f64 %r247,%r1823,%r1827,%r247; .loc 1 2853 10 add.u64 %r359,%r359,8; add.u64 %r357,%r357,8; add.u64 %r355,%r49,8; add.u64 %r353,%r209,8; add.u64 %r351,%r210,8; $L316: .loc 1 2855 16 ld.f64 %r1829,[%r359]; .loc 1 2855 11 ld.f64 %r1830,[%r357]; fma.rn.f64 %r244,%r1829,%r1830,%r244; .loc 1 2857 11 ld.f64 %r1831,[%r355]; fma.rn.f64 %r245,%r1829,%r1831,%r245; .loc 1 2859 11 ld.f64 %r1832,[%r353]; fma.rn.f64 %r246,%r1829,%r1832,%r246; .loc 1 2861 11 ld.f64 %r1833,[%r351]; fma.rn.f64 %r247,%r1829,%r1833,%r247; .loc 1 2853 10 add.u64 %r359,%r359,8; add.u64 %r357,%r357,8; add.u64 %r355,%r355,8; add.u64 %r353,%r353,8; add.u64 %r351,%r351,8; $L315: .loc 1 2855 16 ld.f64 %r1835,[%r359]; .loc 1 2855 11 ld.f64 %r1836,[%r357]; fma.rn.f64 %r244,%r1835,%r1836,%r244; .loc 1 2857 11 ld.f64 %r1837,[%r355]; fma.rn.f64 %r245,%r1835,%r1837,%r245; .loc 1 2859 11 ld.f64 
%r1838,[%r353]; fma.rn.f64 %r246,%r1835,%r1838,%r246; .loc 1 2861 11 ld.f64 %r1839,[%r351]; fma.rn.f64 %r247,%r1835,%r1839,%r247; .loc 1 2853 10 add.u64 %r359,%r359,8; add.u64 %r357,%r357,8; add.u64 %r355,%r355,8; add.u64 %r353,%r353,8; add.u64 %r351,%r351,8; setp.ne.u64 %r1840,%r197,%r357; @ %r1840 bra $L70; bra $L69; $L384: add.u64 %r1841,%r315,%r329; .loc 1 2793 19 add.u64 %r1842,%r1841,%r314; ld.f64 %r1843,[%r1842]; .loc 1 2794 14 ld.f64 %r1844,[%r331]; .loc 1 2793 14 fma.rn.f64 %r249,%r1843,%r1844,%r249; .loc 1 2795 19 ld.f64 %r1845,[%r329]; .loc 1 2795 14 fma.rn.f64 %r250,%r1844,%r1845,%r250; add.u64 %r1846,%r331,%r456; .loc 1 2798 14 add.u64 %r1847,%r1846,%r453; ld.f64 %r1848,[%r1847]; .loc 1 2797 14 fma.rn.f64 %r251,%r1843,%r1848,%r251; .loc 1 2799 14 fma.rn.f64 %r252,%r1845,%r1848,%r252; .loc 1 2802 14 add.u64 %r1849,%r1846,%r448; ld.f64 %r1850,[%r1849]; .loc 1 2801 14 fma.rn.f64 %r253,%r1843,%r1850,%r253; .loc 1 2803 14 fma.rn.f64 %r254,%r1845,%r1850,%r254; .loc 1 2806 14 add.u64 %r1851,%r1846,%r446; ld.f64 %r1852,[%r1851]; .loc 1 2805 14 fma.rn.f64 %r255,%r1843,%r1852,%r255; .loc 1 2807 14 fma.rn.f64 %r256,%r1845,%r1852,%r256; .loc 1 2809 19 add.u64 %r1853,%r1841,%r313; ld.f64 %r1854,[%r1853]; .loc 1 2809 14 fma.rn.f64 %r257,%r1844,%r1854,%r257; .loc 1 2811 19 add.u64 %r1855,%r1841,%r312; ld.f64 %r1856,[%r1855]; .loc 1 2811 14 fma.rn.f64 %r258,%r1844,%r1856,%r258; .loc 1 2813 14 fma.rn.f64 %r259,%r1848,%r1854,%r259; .loc 1 2815 14 fma.rn.f64 %r260,%r1848,%r1856,%r260; .loc 1 2817 14 fma.rn.f64 %r261,%r1850,%r1854,%r261; .loc 1 2819 14 fma.rn.f64 %r262,%r1850,%r1856,%r262; .loc 1 2821 14 fma.rn.f64 %r263,%r1852,%r1854,%r263; .loc 1 2823 14 fma.rn.f64 %r264,%r1852,%r1856,%r264; .loc 1 2791 6 add.u64 %r331,%r331,8; add.u64 %r329,%r329,8; setp.ne.u64 %r1857,%r197,%r331; @ %r1857 bra $L65; bra $L64; $L383: .loc 1 2762 43 ld.f64 %r1858,[%r526]; st.f64 [%r300],%r1858; .loc 1 2760 9 add.u64 %r57,%r526,8; add.u64 %r73,%r300,2048; $L314: .loc 1 2762 43 ld.f64 %r1860,[%r57]; st.f64 [%r73],%r1860; .loc 1 2760 9 add.u64 %r57,%r57,8; add.u64 %r73,%r73,2048; $L313: .loc 1 2762 43 ld.f64 %r1862,[%r57]; st.f64 [%r73],%r1862; .loc 1 2760 9 add.u64 %r57,%r57,8; add.u64 %r73,%r73,2048; setp.ne.u64 %r1863,%r57,%r136; @ %r1863 bra $L58; bra $L48; $L382: .loc 1 2741 7 add.u64 %r1864,%r506,%r509; add.u64 %r1865,%r1864,%r510; .loc 1 2740 49 ld.f64 %r1866,[%r1865]; st.f64 [%r199],%r1866; .loc 1 2742 49 ld.f64 %r1867,[%r506]; st.f64 [%r199+8],%r1867; .loc 1 2745 7 add.u64 %r1868,%r506,%r510; add.u64 %r1869,%r1868,%r509; .loc 1 2744 49 ld.f64 %r1870,[%r1869+8]; st.f64 [%r199+2048],%r1870; .loc 1 2746 49 ld.f64 %r1871,[%r506+8]; st.f64 [%r199+2056],%r1871; .loc 1 2738 9 add.u64 %r199,%r199,4096; add.u64 %r181,%r506,16; $L312: .loc 1 2741 7 add.u64 %r1873,%r181,%r509; add.u64 %r1874,%r1873,%r510; .loc 1 2740 49 ld.f64 %r1875,[%r1874]; st.f64 [%r199],%r1875; .loc 1 2742 49 ld.f64 %r1876,[%r181]; st.f64 [%r199+8],%r1876; .loc 1 2745 7 add.u64 %r1877,%r181,%r510; add.u64 %r1878,%r1877,%r509; .loc 1 2744 49 ld.f64 %r1879,[%r1878+8]; st.f64 [%r199+2048],%r1879; .loc 1 2746 49 ld.f64 %r1880,[%r181+8]; st.f64 [%r199+2056],%r1880; .loc 1 2738 9 add.u64 %r199,%r199,4096; add.u64 %r181,%r181,16; $L311: .loc 1 2741 7 add.u64 %r1882,%r181,%r509; add.u64 %r1883,%r1882,%r510; .loc 1 2740 49 ld.f64 %r1884,[%r1883]; st.f64 [%r199],%r1884; .loc 1 2742 49 ld.f64 %r1885,[%r181]; st.f64 [%r199+8],%r1885; .loc 1 2745 7 add.u64 %r1886,%r181,%r510; add.u64 %r1887,%r1886,%r509; .loc 1 2744 49 ld.f64 %r1888,[%r1887+8]; st.f64 
[%r199+2048],%r1888; .loc 1 2746 49 ld.f64 %r1889,[%r181+8]; st.f64 [%r199+2056],%r1889; .loc 1 2738 9 add.u64 %r199,%r199,4096; add.u64 %r181,%r181,16; setp.ne.u64 %r1890,%r181,%r497; @ %r1890 bra $L52; bra $L53; $L380: .loc 1 2688 2 @ ! %r1302 bra $L395; .loc 1 2689 22 mov.u32 %r1881892,[%value_in]; } $L395: .loc 1 2687 24 mov.u64 %r266,2; .loc 1 2687 7 add.u64 %r590,%r215,%r591; $L310: .loc 1 2688 2 @ ! %r1302 bra $L396; .loc 1 2689 22 mov.u32 %r1881895,[%value_in]; } $L396: .loc 1 2687 24 add.u64 %r266,%r266,1; .loc 1 2687 7 add.u64 %r590,%r590,%r591; $L309: .loc 1 2688 2 @ ! %r1302 bra $L397; .loc 1 2689 22 mov.u32 %r1881898,[%value_in]; } $L397: .loc 1 2687 24 add.u64 %r266,%r266,1; .loc 1 2687 7 add.u64 %r590,%r590,%r591; setp.eq.u64 %r1899,%r266,%r594; @ ! %r1899 bra $L34; bra $L33; $L247: .loc 1 2689 22 mov.u32 %r190014699001901,[%value_in]; } $L381: .loc 1 2687 7 add.u64 %r1903,%r1469,%r591; .loc 1 2688 2 @ ! %r1302 bra $L398; .loc 1 2689 22 mov.u32 %r190519019051906,[%value_in]; } $L398: .loc 1 2687 7 add.u64 %r1908,%r1903,%r591; .loc 1 2688 2 @ ! %r1302 bra $L399; .loc 1 2689 22 mov.u32 %r19101919101911,[%value_in]; } $L399: .loc 1 2687 24 add.u64 %r266,%r1416,3; .loc 1 2687 7 add.u64 %r590,%r1908,%r591; setp.eq.u64 %r1912,%r266,%r594; @ ! %r1912 bra $L34; bra $L33; $L389: .loc 1 2938 9 ld.f64 %r1913,[%r595]; ld.f64 %r1914,[%r622]; fma.rn.f64 %r271,%r1913,%r1914,%r271; .loc 1 2937 5 add.u64 %r595,%r595,8; add.u64 %r596,%r622,8; $L322: .loc 1 2938 9 ld.f64 %r1916,[%r595]; ld.f64 %r1917,[%r596]; fma.rn.f64 %r271,%r1916,%r1917,%r271; .loc 1 2937 5 add.u64 %r595,%r595,8; add.u64 %r596,%r596,8; $L321: .loc 1 2938 9 ld.f64 %r1919,[%r595]; ld.f64 %r1920,[%r596]; fma.rn.f64 %r271,%r1919,%r1920,%r271; .loc 1 2937 5 add.u64 %r595,%r595,8; add.u64 %r596,%r596,8; setp.ne.u64 %r1921,%r595,%r601; @ %r1921 bra $L93; bra $L92; $L390: .loc 1 2953 5 ld.f64 %r1922,[%r213]; ld.f64 %r1923,[%r626]; fma.rn.f64 %r269,%r1922,%r1923,%r269; .loc 1 2952 8 add.u64 %r623,%r213,%r625; add.u64 %r626,%r626,8; $L324: .loc 1 2953 5 ld.f64 %r1925,[%r623]; ld.f64 %r1926,[%r626]; fma.rn.f64 %r269,%r1925,%r1926,%r269; .loc 1 2952 8 add.u64 %r623,%r623,%r625; add.u64 %r626,%r626,8; $L323: .loc 1 2953 5 ld.f64 %r1928,[%r623]; ld.f64 %r1929,[%r626]; fma.rn.f64 %r269,%r1928,%r1929,%r269; .loc 1 2952 8 add.u64 %r623,%r623,%r625; add.u64 %r626,%r626,8; setp.ne.u64 %r1930,%r626,%r642; @ %r1930 bra $L97; bra $L96; $L391: .loc 1 2968 8 ld.f64 %r1931,[%r213]; ld.f64 %r1932,[%r646]; fma.rn.f64 %r282,%r1931,%r1932,%r282; .loc 1 2967 28 mov.u64 %r283,1; .loc 1 2967 4 add.u64 %r643,%r213,%r645; add.u64 %r646,%r646,%r648; $L326: .loc 1 2968 8 ld.f64 %r1934,[%r643]; ld.f64 %r1935,[%r646]; fma.rn.f64 %r282,%r1934,%r1935,%r282; .loc 1 2967 28 add.u64 %r283,%r283,1; .loc 1 2967 4 add.u64 %r643,%r643,%r645; add.u64 %r646,%r646,%r648; $L325: .loc 1 2968 8 ld.f64 %r1937,[%r643]; ld.f64 %r1938,[%r646]; fma.rn.f64 %r282,%r1937,%r1938,%r282; .loc 1 2967 28 add.u64 %r283,%r283,1; .loc 1 2967 4 add.u64 %r643,%r643,%r645; add.u64 %r646,%r646,%r648; setp.ne.u64 %r1939,%r426,%r283; @ %r1939 bra $L100; bra $L99; $L393: .loc 1 2982 36 ld.f64 %r1940,[%r660]; ld.f64 %r1941,[%r669]; ld.f64 %r1942,[%r680]; fma.rn.f64 %r1943,%r1940,%r1941,%r1942; st.f64 [%r680],%r1943; .loc 1 2980 29 mov.u64 %r278,1; .loc 1 2980 4 add.u64 %r659,%r680,%r685; add.u64 %r660,%r660,%r662; $L330: .loc 1 2982 36 ld.f64 %r1945,[%r660]; ld.f64 %r1946,[%r669]; ld.f64 %r1947,[%r659]; fma.rn.f64 %r1948,%r1945,%r1946,%r1947; st.f64 [%r659],%r1948; .loc 1 2980 29 add.u64 
%r278,%r278,1; .loc 1 2980 4 add.u64 %r659,%r659,%r685; add.u64 %r660,%r660,%r662; $L329: .loc 1 2982 36 ld.f64 %r1950,[%r660]; ld.f64 %r1951,[%r669]; ld.f64 %r1952,[%r659]; fma.rn.f64 %r1953,%r1950,%r1951,%r1952; st.f64 [%r659],%r1953; .loc 1 2980 29 add.u64 %r278,%r278,1; .loc 1 2980 4 add.u64 %r659,%r659,%r685; add.u64 %r660,%r660,%r662; setp.ne.u64 %r1954,%r186,%r278; @ %r1954 bra $L110; bra $L113; $L392: .loc 1 2976 34 st.f64 [%r683],%r1346; .loc 1 2975 27 mov.u64 %r280,1; .loc 1 2975 2 add.u64 %r683,%r683,%r685; $L328: .loc 1 2976 34 st.f64 [%r683],%r1346; .loc 1 2975 27 add.u64 %r280,%r280,1; .loc 1 2975 2 add.u64 %r683,%r683,%r685; $L327: .loc 1 2976 34 st.f64 [%r683],%r1346; .loc 1 2975 27 add.u64 %r280,%r280,1; .loc 1 2975 2 add.u64 %r683,%r683,%r685; setp.ne.u64 %r1957,%r186,%r280; @ %r1957 bra $L105; bra $L109; $L394: .loc 1 3002 5 ld.f64 %r1958,[%r691]; ld.f64 %r1959,[%r717]; fma.rn.f64 %r274,%r1958,%r1959,%r274; .loc 1 3001 32 mov.u64 %r275,1; .loc 1 3001 8 add.u64 %r691,%r691,%r693; add.u64 %r695,%r717,%r697; $L332: .loc 1 3002 5 ld.f64 %r1961,[%r691]; ld.f64 %r1962,[%r695]; fma.rn.f64 %r274,%r1961,%r1962,%r274; .loc 1 3001 32 add.u64 %r275,%r275,1; .loc 1 3001 8 add.u64 %r691,%r691,%r693; add.u64 %r695,%r695,%r697; $L331: .loc 1 3002 5 ld.f64 %r1964,[%r691]; ld.f64 %r1965,[%r695]; fma.rn.f64 %r274,%r1964,%r1965,%r274; .loc 1 3001 32 add.u64 %r275,%r275,1; .loc 1 3001 8 add.u64 %r691,%r691,%r693; add.u64 %r695,%r695,%r697; setp.ne.u64 %r1966,%r426,%r275; @ %r1966 bra $L118; bra $L117; } matmul_r10.o/ matmul_c4.o/, .param .u32 %in_ar4, .param .u64 %in_ar5); .file 1 "../../../../libgfortran/generated/matmul_c4.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]FUNCTION DECL: _gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime_32 $LC9[2] = {__func__$0 .const .align 1 .u8 __func__$0[10] = {109,97,116,109,117,108,95,99,52,0 }2] = {782] = {672] = {8471,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,97,41,32,61,61,32,50,32,124,124,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,98,41,32,61,61,3246116,114,97,110,47,103,101,110,101,114,97,116,101,100,4765,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,49,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3265,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,50,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3273,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,97,114,103,117,109,101,110,116,32,66,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,49,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }; // BEGIN VAR DEF: $LC8 .const .align 4 .u32 $LC8[2] = {10653532_gfortran_matmul_c4 .visible .func _gfortran_matmul_cu32 %r93u64 %r200; .reg .f32 %r201; .reg .f32 %r202; .reg .f32 %r203; .reg .f32 %r205; .reg .f32 %r206; .reg .f32 %r207; .reg .f32 %r208; .reg .f32 %r209; .reg .f32 %r211; .reg .f32 %r212; .reg .f32 %r213; .reg .f32u64 %r257; .reg .u64 %r258; .reg .u64u64 %r267; .reg .u64 %r268; .reg .u64 %r269; .reg .f32 %r270; .reg .f32 %r271; .reg .f32 %r272; .reg .f32 %r273; .reg .f32 %r274; .reg .f32 %r275; .reg .f32 
%r276; .reg .f32 %r277; .reg .f32 %r278; .reg .f32 %r279; .reg .f32 %r280; .reg .f32 %r281; .reg .f32 %r282; .reg .f32 %r283; .reg .f32 %r284; .reg .f32f32 %r290; .reg .f32 %r296; .reg .u64 %r297; .reg .u64 %r298; .reg .f32 %r299; .reg .f32 %r300; .reg .f32 %r301; .reg .f32 %r302; .reg .f32 %r303; .reg .f32 %r304; .reg .f32 %r305; .reg .f32 %r306; .reg .f32 %r307; .reg .f32 %r308; .reg .f32 %r309; .reg .f32 %r310; .reg .f32 %r311; .reg .f32 %r312; .reg .f32 %r313; .reg .f32 %r314; .reg .f32 %r315; .reg .f32 %r317; .reg .f32 %r318; .reg .f32 %r319; .reg .f32 %r320; .reg .f32 %r321; .reg .f32 %r323; .reg .f32 %r324; .reg .f32 %r325; .reg .f32 %r326; .reg .f32 %r327; .reg .f32 %r329; .reg .f32 %r330; .reg .f32 %r331; .reg .f32 %r333; .reg .f32 %r334; .reg .f32 %r335; .reg .f32 %r336; .reg .f32 %r337; .reg .f32 %r339; .reg .f32 %r340; .reg .f32 %r341; .reg .f32 %r343; .reg .f32 %r344; .reg .f32 %r345; .reg .f32 %r346; .reg .f32 %r347; .reg .f32 %r349; .reg .f32 %r350; .reg .f32 %r351; .reg .f32 %r353; .reg .f32 %r354; .reg .f32 %r355; .reg .f32 %r356; .reg .f32 %r357; .reg .f32 %r359; .reg .f32 %r360; .reg .f32 %r361; .reg .f32 %r362; .reg .f32 %r363; .reg .f32 %r365; .reg .f32 %r366; .reg .f32 %r367; .reg .f32 %r369; .reg .f32 %r370; .reg .f32 %r371; .reg .f32 %r373; .reg .f32 %r374; .reg .f32 %r375; .reg .f32 %r377; .reg .f32 %r378; .reg .f32 %r379; .reg .f32 %r381; .reg .f32 %r382; .reg .f32 %r383; .reg .f32 %r385; .reg .f32 %r386; .reg .f32 %r387; .reg .f32 %r389; .reg .f32 %r390; .reg .f32 %r391; .reg .f32 %r392; .reg .f32 %r393; .reg .f32 %r394; .reg .f32 %r395; .reg .f32 %r396; .reg .f32 %r397; .reg .f32 %r398; .reg .f32 %r399; .reg .f32 %r400; .reg .f32 %r401; .reg .f32 %r402; .reg .f32 %r403; .reg .f32 %r405; .reg .f32 %r406; .reg .f32 %r407; .reg .f32 %r408; .reg .f32 %r409; .reg .f32 %r411; .reg .f32 %r412; .reg .f32 %r413; .reg .f32 %r414; .reg .f32 %r415; .reg .f32 %r417; .reg .f32 %r418; .reg .f32 %r419; .reg .f32 %r420; .reg .f32 %r421; .reg .f32 %r423; .reg .f32 %r424; .reg .f32 %r425; .reg .f32 %r426; .reg .f32 %r427; .reg .f32 %r428; .reg .f32 %r429; .reg .f32 %r430; .reg .f32 %r431; .reg .f32 %r432; .reg .f32 %r433; .reg .f32 %r434; .reg .f32 %r435; .reg .f32 %r436; .reg .f32 %r437; .reg .f32 %r439; .reg .f32 %r440; .reg .f32 %r441; .reg .f32 %r442; .reg .f32 %r443; .reg .f32 %r445; .reg .f32 %r446; .reg .f32 %r447; .reg .f32 %r448; .reg .f32 %r449; .reg .f32 %r451; .reg .f32 %r452; .reg .f32 %r453; .reg .f32 %r454; .reg .f32 %r455; .reg .f32 %r457; .reg .f32 %r458; .reg .f32 %r459; .reg .f32 %r460; .reg .f32 %r461; .reg .f32 %r462; .reg .f32 %r463; .reg .f32 %r464; .reg .f32 %r465; .reg .f32 %r467; .reg .f32 %r468; .reg .f32 %r469; .reg .u64 %r472; .reg .u64 %r483; .reg .f32 %r495; .reg .u64 %r496; .reg .u64 %r498; .reg .u64 %r500; .reg .u64 %r502; .reg .u64 %r504; .reg .u64u64 %r515; .reg .u64 %r517; .reg .u64 %r519; .reg .u64 %r521; .reg .u64 %r523; .reg .u64 %r524; .reg .u64 %r527; .reg .u64 %r530; .reg .u64 %r535; .reg .u64 %r539; .reg .u64 %r542; .reg .u64 %r543; .reg .u64 %r549; .reg .u64 %r552; .reg .u64 %r553; .reg .u64 %r555; .reg .u64u64 %r567; .reg .u64 %r569; .reg .u64 %r574; .reg .u64 %r575; .reg .u64 %r577; .reg .u64u64 %r585; .reg .u64 %r592; .reg .u64u64 %r604; .reg .u64 %r606; .reg .u64 %r608; .reg .u64 %r610; .reg .u64 %r615; .reg .u64 %r616; .reg .u64 %r617; .reg .u64 %r621; .reg .u64 %r623; .reg .u32 %r627; .reg .u64 %r637; .reg .u64 %r638; .reg .u64 %r639; .reg .u64 %r640; .reg .u64 %r641; .reg .u64 %r642; .reg .u64 %r643; .reg .u64 %r644; .reg .u64 
%r645; .reg .u64 %r647; .reg .u64 %r648; .reg .u64 %r649; .reg .u64 %r650; .reg .u64 %r651; .reg .u64 %r657; .reg .u64 %r658; .reg .u32 %r662; .reg .u64 %r663; .reg .u64 %r664; .reg .u64 %r668; .reg .u64 %r672; .reg .u64 %r676; .reg .u64 %r677; .reg .u64 %r678; .reg .u64 %r679; .reg .u64 %r682; .reg .u64 %r683; .reg .u64 %r692; .reg .u64 %r694; .reg .u64 %r695; .reg .u64 %r696; .reg .u64 %r706; .reg .u64 %r707; .reg .u64 %r708; .reg .u64 %r709; .reg .u64 %r710; .reg .u64 %r716; .reg .u64 %r721; .reg .u64 %r726; .reg .u64 %r736; .reg .u64 %r737; .reg .u64 %r747; .reg .u64 %r749; .reg .u64 %r750; .reg .u64 %r756; .reg .u64 %r757; .reg .u64 %r758; .reg .u64 %r759; .reg .u64 %r760; .reg .u64 %r761; .reg .u64 %r762; .reg .u64 %r763; .reg .u64 %r764; .reg .u64 %r765; .reg .u64 %r766; .reg .u64 %r767; .reg .u64 %r769; .reg .u64 %r770; .reg .u64 %r771; .reg .u64 %r772; .reg .u64 %r774; .reg .u64 %r775; .reg .u64 %r782; .reg .u64 %r783; .reg .u64 %r784; .reg .u64 %r785; .reg .u64 %r786; .reg .u64 %r793; .reg .u64 %r795; .reg .u64 %r797; .reg .u64 %r798; .reg .u64 %r799; .reg .u64 %r800; .reg .u64 %r802; .reg .u64 %r806; .reg .u64 %r807; .reg .u64 %r808; .reg .u64 %r809; .reg .u64 %r810; .reg .u64 %r815; .reg .u64 %r816; .reg .u64 %r818; .reg .u64 %r821; .reg .u64 %r822; .reg .u64 %r827; .reg .u64 %r829; .reg .u64 %r830; .reg .u64 %r834; .reg .u64 %r836; .reg .u64 %r840; .reg .u64 %r842; .reg .u64 %r843; .reg .u64 %r844; .reg .u64 %r845; .reg .u64 %r851; .reg .u64 %r853; .reg .u64 %r854; .reg .u64 %r861; .reg .u64 %r862; .reg .u64 %r864; .reg .u64 %r865; .reg .u64 %r867; .reg .u64 %r870; .reg .u64 %r872; .reg .u64 %r873; .reg .u64 %r878; .reg .u64 %r881; .reg .u64 %r883; .reg .u64 %r884; .reg .u64 %r889; .reg .u64 %r891; .reg .u64 %r893; .reg .u64 %r894; .reg .u64 %r900; .reg .u32 %r901; .reg .u64 %r902; .reg .u64 %r905; .reg .u64 %r908; .reg .u64 %r910; .reg .u64 %r912; .reg .u64 %r914; .reg .u64 %r918; .reg .u64 %r920; .reg .u64 %r922; .reg .u64 %r924; .reg .u64 %r927; .reg .u64 %r929; .reg .u64 %r931; .reg .u64 %r936; .reg .u64 %r938; .reg .u64 %r944; .reg .u64 %r945; .reg .u64 %r946; .reg .u64 %r947; .reg .u32 %r948; .reg .u32 %r949; .reg .u64 %r950; .reg .u16 %r951; .reg .pred %r952; .reg .u16 %r953; .reg .pred %r955; .reg .u64 %r956; .reg .u32 %r957; .reg .u64 %r958; .reg .u64 %r959; .reg .u64 %r960; .reg .pred %r961; .reg .pred %r963; .reg .u64 %r966; .reg .u64 %r968; .reg .u64 %r969; .reg .u64 %r970; .reg .u64 %r972; .reg .u64 %r973; .reg .u64 %r974; .reg .u64 %r975; .reg .u64 %r976; .reg .u64 %r979; .reg .u64 %r981; .reg .u64 %r982; .reg .u64 %r984; .reg .u64 %r986; .reg .u64 %r987; .reg .u64 %r989; .reg .u64 %r990; .reg .u64 %r991; .reg .u64 %r992; .reg .u64 %r993; .reg .u32 %r994; .reg .pred %r995; .reg .u64 %r996; .reg .u64 %r997; .reg .u64 %r998; .reg .pred %r1000; .reg .u64 %r1001; .reg .u64 %r1002; .reg .u64 %r1003; .reg .pred %r1004; .reg .u64 %r1005; .reg .u64 %r1006; .reg .u64 %r1007; .reg .u16 %r1008; .reg .pred %r1009; .reg .u64 %r1010; .reg .u64 %r1012; .reg .u64 %r1013; .reg .u64 %r1014; .reg .u16 %r1015; .reg .pred %r1017; .reg .pred %r1018; .reg .u64 %r1019; .reg .pred %r1021; .reg .u64 %r1022; .reg .u64 %r1024; .reg .u64 %r1025; .reg .u64 %r1026; .reg .u64 %r1027; .reg .u64 %r1028; .reg .u64 %r1029; .reg .pred %r1030; .reg .u64 %r1031; .reg .u64 %r1033; .reg .u64 %r1034; .reg .u64 %r1035; .reg .u16 %r1036; .reg .u32 %r1037; .reg .pred %r1038; .reg .u16 %r1039; .reg .pred %r1040; .reg .u64 %r1041; .reg .u64 %r1042; .reg .u64 %r1043; .reg .u64 %r1044; .reg .u64 %r1045; .reg 
.u64 %r1046; .reg .pred %r1047; .reg .u32 %r1049; .reg .u32 %r1050; .reg .u32 %r1052; .reg .u32 %r1053; .reg .u16 %r1054; .reg .u16 %r1055; .reg .u16 %r1056; .reg .u32 %r1057; .reg .u16 %r1058; .reg .pred %r1059; .reg .u64 %r1060; .reg .u16 %r1062; .reg .pred %r1063; .reg .u64 %r1064; .reg .u64 %r1065; .reg .u64 %r1066; .reg .u32 %r1067; .reg .u32 %r1068; .reg .u32 %r1069; .reg .u32 %r1070; .reg .u32 %r1071; .reg .u32 %r1072; .reg .u32 %r1074; .reg .u32 %r1075; .reg .u16 %r1076; .reg .u16 %r1077; .reg .u16 %r1078; .reg .u32 %r1079; .reg .u16 %r1080; .reg .pred %r1081; .reg .u32 %r1083; .reg .u32 %r1084; .reg .u16 %r1085; .reg .u16 %r1086; .reg .u16 %r1087; .reg .u32 %r1088; .reg .u16 %r1089; .reg .pred %r1090; .reg .u32 %r1092; .reg .u32 %r1093; .reg .u32 %r1095; .reg .u32 %r1096; .reg .u16 %r1097; .reg .u16 %r1098; .reg .u16 %r1099; .reg .u32 %r1100; .reg .u16 %r1101; .reg .pred %r1102; .reg .f32 %r1103; .reg .f32 %r1104; .reg .f32 %r1105; .reg .f32 %r1106; .reg .f32 %r1107; .reg .f32 %r1108; .reg .f32 %r1109; .reg .pred %r1110; .reg .f32 %r1111; .reg .f32 %r1112; .reg .pred %r1115; .reg .pred %r1116; .reg .u32 %r1118; .reg .u32 %r1119; .reg .u32 %r1121; .reg .u32 %r1122; .reg .u16 %r1123; .reg .u16 %r1124; .reg .u16 %r1125; .reg .u32 %r1126; .reg .u16 %r1127; .reg .pred %r1128; .reg .u32 %r1130; .reg .u32 %r1131; .reg .u32 %r1133; .reg .u32 %r1134; .reg .u16 %r1135; .reg .u16 %r1136; .reg .u16 %r1137; .reg .u32 %r1138; .reg .u16 %r1139; .reg .pred %r1140; .reg .u32 %r1142; .reg .u32 %r1143; .reg .u32 %r1145; .reg .u32 %r1146; .reg .u16 %r1147; .reg .u16 %r1148; .reg .u16 %r1149; .reg .u32 %r1150; .reg .u16 %r1151; .reg .pred %r1152; .reg .pred %r1153; .reg .u64 %r1154; .reg .u32 %r1155; .reg .u64 %r1156; .reg .u64 %r1157; .reg .u32 %r1158; .reg .pred %r1159; .reg .pred %r1160; .reg .u32 %r1161; .reg .pred %r1162; .reg .pred %r1163; .reg .u32 %r1178; .reg .u64 %r1179; .reg .u64 %r1180; .reg .u64 %r1181; .reg .u64 %r1182; .reg .u64 %r1183; .reg .u64 %r1184; .reg .u64 %r1185; .reg .u16 %r1186; .reg .u16 %r1187; .reg .u16 %r1188; .reg .u32 %r1189; .reg .u16 %r1190; .reg .pred %r1191; .reg .pred %r1192; .reg .u16 %r1193; .reg .pred %r1194; .reg .u64 %r1195; .reg .pred %r1196; .reg .u32 %r1200; .reg .u64 %r1202; .reg .u32 %r1207; .reg .u32 %r1208; .reg .u32 %r1210; .reg .u32 %r1211; .reg .u16 %r1212; .reg .u16 %r1213; .reg .u16 %r1214; .reg .u32 %r1216; .reg .u32 %r1217; .reg .u16 %r1219; .reg .u16 %r1221; .reg .u32 %r1222; .reg .u16 %r1223; .reg .pred %r1224; .reg .pred %r1225; .reg .u64 %r1226; .reg .u64 %r1228; .reg .u64 %r1229; .reg .u64 %r1230; .reg .pred %r1232; .reg .u64 %r1234; .reg .u64 %r1235; .reg .u64 %r1236; .reg .u64 %r1238; .reg .u64 %r1239; .reg .u64 %r1244; .reg .u64 %r1245; .reg .u64 %r1246; .reg .u64 %r1247; .reg .u64 %r1251; .reg .u64 %r1253; .reg .u64 %r1254; .reg .u64 %r1255; .reg .u64 %r1256; .reg .u64 %r1257; .reg .pred %r1259; .reg .u64 %r1265; .reg .u64 %r1266; .reg .u64 %r1267; .reg .u64 %r1268; .reg .pred %r1270; .reg .u64 %r1271; .reg .u64 %r1272; .reg .u64 %r1273; .reg .u64 %r1274; .reg .u64 %r1275; .reg .u64 %r1276; .reg .u64 %r1277; .reg .u64 %r1278; .reg .u64 %r1279; .reg .u64 %r1283; .reg .u64 %r1284; .reg .u64 %r1285; .reg .u64 %r1286; .reg .u64 %r1289; .reg .u64 %r1290; .reg .u64 %r1291; .reg .u64 %r1292; .reg .u64 %r1293; .reg .u64 %r1295; .reg .u64 %r1296; .reg .u64 %r1297; .reg .u64 %r1298; .reg .u64 %r1299; .reg .u64 %r1300; .reg .u64 %r1304; .reg .u64 %r1305; .reg .u64 %r1308; .reg .u64 %r1313; .reg .u64 %r1314; .reg .f32 %r1316; .reg .f32 %r1317; .reg 
.f32 %r1318; .reg .f32 %r1319; .reg .f32 %r1321; .reg .f32 %r1322; .reg .pred %r1323; .reg .u64 %r1326; .reg .u64 %r1327; .reg .u64 %r1328; .reg .u64 %r1329; .reg .u64 %r1331; .reg .pred %r1332; .reg .u64 %r1333; .reg .u64 %r1334; .reg .f32 %r1335; .reg .u64 %r1338; .reg .u64 %r1339; .reg .u64 %r1340; .reg .u64 %r1341; .reg .u64 %r1342; .reg .u64 %r1343; .reg .u64 %r1344; .reg .u64 %r1346; .reg .u64 %r1347; .reg .f32 %r1351; .reg .f32 %r1352; .reg .f32 %r1353; .reg .f32 %r1354; .reg .f32 %r1355; .reg .f32 %r1356; .reg .f32 %r1357; .reg .f32 %r1358; .reg .f32 %r1359; .reg .f32 %r1360; .reg .f32 %r1361; .reg .f32 %r1362; .reg .f32 %r1363; .reg .f32 %r1364; .reg .f32 %r1365; .reg .f32 %r1366; .reg .f32 %r1367; .reg .f32 %r1368; .reg .f32 %r1369; .reg .f32 %r1370; .reg .f32 %r1371; .reg .f32 %r1372; .reg .f32 %r1373; .reg .f32 %r1374; .reg .f32 %r1375; .reg .f32 %r1376; .reg .f32 %r1377; .reg .f32 %r1378; .reg .f32 %r1379; .reg .f32 %r1380; .reg .f32 %r1381; .reg .f32 %r1382; .reg .pred %r1383; .reg .pred %r1384; .reg .pred %r1385; .reg .u64 %r1387; .reg .u64 %r1388; .reg .u64 %r1389; .reg .u64 %r1390; .reg .u64 %r1391; .reg .u64 %r1392; .reg .f32 %r1394; .reg .f32 %r1395; .reg .f32 %r1396; .reg .f32 %r1397; .reg .f32 %r1398; .reg .f32 %r1399; .reg .f32 %r1400; .reg .f32 %r1401; .reg .pred %r1402; .reg .pred %r1403; .reg .pred %r1404; .reg .u64 %r1406; .reg .u64 %r1407; .reg .u64 %r1409; .reg .u64 %r1410; .reg .u64 %r1411; .reg .u64 %r1412; .reg .u64 %r1413; .reg .u64 %r1420; .reg .u64 %r1421; .reg .f32 %r1424; .reg .f32 %r1425; .reg .f32 %r1426; .reg .f32 %r1427; .reg .f32 %r1428; .reg .f32 %r1429; .reg .f32 %r1430; .reg .f32 %r1431; .reg .pred %r1432; .reg .pred %r1433; .reg .pred %r1434; .reg .f32 %r1437; .reg .f32 %r1438; .reg .pred %r1440; .reg .u16 %r1441; .reg .pred %r1442; .reg .u32 %r1444; .reg .u32 %r1445; .reg .u16 %r1446; .reg .u16 %r1447; .reg .u32 %r1449; .reg .u16 %r1450; .reg .pred %r1451; .reg .pred %r1452; .reg .u16 %r1453; .reg .pred %r1454; .reg .pred %r1455; .reg .pred %r1456; .reg .u64 %r1457; .reg .u64 %r1458; .reg .pred %r1460; .reg .u64 %r1461; .reg .f32 %r1463; .reg .f32 %r1464; .reg .pred %r1466; .reg .u64 %r1468; .reg .u64 %r1469; .reg .u64 %r1470; .reg .f32 %r1471; .reg .f32 %r1472; .reg .pred %r1474; .reg .pred %r1475; .reg .u64 %r1477; .reg .f32 %r1478; .reg .f32 %r1479; .reg .pred %r1481; .reg .pred %r1482; .reg .pred %r1483; .reg .pred %r1484; .reg .u64 %r1485; .reg .pred %r1489; .reg .u64 %r1491; .reg .f32 %r1492; .reg .f32 %r1493; .reg .f32 %r1494; .reg .pred %r1496; .reg .pred %r1498; .reg .pred %r1502; .reg .u64 %r1503; .reg .u64 %r1504; .reg .u64 %r1506; .reg .f32 %r1507; .reg .f32 %r1508; .reg .pred %r1510; .reg .pred %r1511; .reg .u64 %r1512; .reg .pred %r1513; .reg .u64 %r1514; .reg .u64 %r1515; .reg .u64 %r1516; .reg .u16 %r1517; .reg .u32 %r1518; .reg .pred %r1519; .reg .u16 %r1520; .reg .u32 %r1521; .reg .pred %r1522; .reg .u64 %r1523; .reg .u64 %r1524; .reg .u64 %r1525; .reg .u64 %r1526; .reg .u64 %r1527; .reg .u64 %r1528; .reg .u64 %r1529; .reg .u32 %r1530; .reg .pred %r1531; .reg .u64 %r1532; .reg .u64 %r1533; .reg .u64 %r1534; .reg .f32 %r1535; .reg .pred %r1536; .reg .pred %r1542; .reg .pred %r1543; .reg .pred %r1544; .reg .f32 %r1545; .reg .u64 %r1546; .reg .pred %r1547; .reg .pred %r1548; .reg .pred %r1549; .reg .pred %r1550; .reg .u32 %r1551; .reg .u64 %r1552; .reg .pred %r1553; .reg .pred %r1556; .reg .u64 %r1562; .reg .u64 %r1563; .reg .u64 %r1564; .reg .u64 %r1565; .reg .u64 %r1566; .reg .u64 %r1570; .reg .u64 %r1571; .reg .u64 %r1572; 
.reg .u64 %r1573; .reg .u64 %r1574; .reg .u64 %r1576; .reg .u64 %r1577; .reg .u64 %r1578; .reg .u64 %r1580; .reg .pred %r1581; .reg .u64 %r1583; .reg .pred %r1585; .reg .u64 %r1588; .reg .pred %r1589; .reg .u64 %r1590; .reg .pred %r1591; .reg .u64 %r1592; .reg .u64 %r1593; .reg .pred %r1597; .reg .u64 %r1598; .reg .f32 %r1599; .reg .f32 %r1600; .reg .pred %r1601; .reg .u64 %r1602; .reg .u64 %r1603; .reg .u64 %r1604; .reg .u64 %r1605; .reg .u64 %r1606; .reg .pred %r1607; .reg .pred %r1608; .reg .pred %r1609; .reg .u64 %r1611; .reg .u64 %r1613; .reg .u64 %r1614; .reg .u64 %r1615; .reg .u64 %r1616; .reg .u64 %r1617; .reg .pred %r1618; .reg .pred %r1619; .reg .pred %r1620; .reg .u64 %r1621; .reg .u64 %r1624; .reg .u64 %r1625; .reg .u64 %r1626; .reg .u64 %r1628; .reg .pred %r1629; .reg .u64 %r1630; .reg .u64 %r1631; .reg .u64 %r1632; .reg .u64 %r1636; .reg .pred %r1637; .reg .pred %r1638; .reg .pred %r1639; .reg .u64 %r1640; .reg .u64 %r1641; .reg .u64 %r1642; .reg .u64 %r1643; .reg .u64 %r1644; .reg .u64 %r1645; .reg .pred %r1646; .reg .pred %r1647; .reg .pred %r1648; .reg .u64 %r1653; .reg .u64 %r1654; .reg .u64 %r1655; .reg .u64 %r1656; .reg .u64 %r1657; .reg .pred %r1658; .reg .pred %r1659; .reg .pred %r1660; .reg .u64 %r1661; .reg .u64 %r1665; .reg .pred %r1666; .reg .pred %r1667; .reg .pred %r1668; .reg .u64 %r1673; .reg .pred %r1674; .reg .pred %r1675; .reg .pred %r1676; .reg .u64 %r1681; .reg .pred %r1682; .reg .pred %r1683; .reg .pred %r1684; .reg .u64 %r1689; .reg .pred %r1690; .reg .pred %r1691; .reg .pred %r1692; .reg .u64 %r1695; .reg .u64 %r1696; .reg .u64 %r1698; .reg .u64 %r1699; .reg .f32 %r1703; .reg .f32 %r1704; .reg .f32 %r1705; .reg .f32 %r1706; .reg .f32 %r1707; .reg .f32 %r1708; .reg .f32 %r1709; .reg .f32 %r1710; .reg .pred %r1711; .reg .f32 %r1715; .reg .f32 %r1716; .reg .u64 %r1717; .reg .f32 %r1721; .reg .f32 %r1722; .reg .u64 %r1723; .reg .f32 %r1727; .reg .f32 %r1728; .reg .pred %r1729; .reg .pred %r1730; .reg .f32 %r1731; .reg .f32 %r1732; .reg .f32 %r1736; .reg .f32 %r1737; .reg .f32 %r1738; .reg .f32 %r1739; .reg .f32 %r1740; .reg .f32 %r1741; .reg .f32 %r1742; .reg .f32 %r1743; .reg .f32 %r1744; .reg .f32 %r1745; .reg .f32 %r1746; .reg .u64 %r1748; .reg .f32 %r1751; .reg .f32 %r1752; .reg .f32 %r1753; .reg .f32 %r1754; .reg .f32 %r1755; .reg .f32 %r1756; .reg .f32 %r1757; .reg .f32 %r1758; .reg .f32 %r1759; .reg .f32 %r1760; .reg .f32 %r1761; .reg .u64 %r1763; .reg .f32 %r1766; .reg .f32 %r1767; .reg .f32 %r1768; .reg .f32 %r1769; .reg .f32 %r1770; .reg .f32 %r1771; .reg .f32 %r1772; .reg .f32 %r1773; .reg .f32 %r1774; .reg .pred %r1775; .reg .pred %r1776; .reg .f32 %r1777; .reg .f32 %r1778; .reg .f32 %r1783; .reg .f32 %r1784; .reg .f32 %r1785; .reg .f32 %r1786; .reg .f32 %r1787; .reg .f32 %r1788; .reg .f32 %r1789; .reg .f32 %r1790; .reg .f32 %r1791; .reg .f32 %r1792; .reg .f32 %r1793; .reg .f32 %r1799; .reg .f32 %r1800; .reg .f32 %r1801; .reg .f32 %r1802; .reg .f32 %r1803; .reg .f32 %r1804; .reg .f32 %r1805; .reg .f32 %r1806; .reg .f32 %r1807; .reg .f32 %r1808; .reg .f32 %r1809; .reg .f32 %r1815; .reg .f32 %r1816; .reg .f32 %r1817; .reg .f32 %r1818; .reg .f32 %r1819; .reg .f32 %r1820; .reg .f32 %r1821; .reg .f32 %r1822; .reg .f32 %r1823; .reg .pred %r1824; .reg .pred %r1825; .reg .f32 %r1826; .reg .f32 %r1827; .reg .u64 %r1828; .reg .u64 %r1830; .reg .f32 %r1831; .reg .f32 %r1832; .reg .f32 %r1833; .reg .f32 %r1834; .reg .f32 %r1835; .reg .f32 %r1836; .reg .f32 %r1837; .reg .f32 %r1838; .reg .f32 %r1839; .reg .f32 %r1840; .reg .f32 %r1841; .reg .u64 %r1842; 
.reg .u64 %r1843; .reg .u64 %r1844; .reg .f32 %r1846; .reg .f32 %r1847; .reg .f32 %r1848; .reg .f32 %r1849; .reg .f32 %r1850; .reg .f32 %r1851; .reg .f32 %r1852; .reg .f32 %r1853; .reg .f32 %r1854; .reg .f32 %r1855; .reg .f32 %r1856; .reg .u64 %r1857; .reg .u64 %r1858; .reg .u64 %r1859; .reg .f32 %r1861; .reg .f32 %r1862; .reg .f32 %r1863; .reg .f32 %r1864; .reg .f32 %r1865; .reg .f32 %r1866; .reg .f32 %r1867; .reg .f32 %r1868; .reg .f32 %r1869; .reg .pred %r1870; .reg .pred %r1871; .reg .f32 %r1872; .reg .f32 %r1873; .reg .u64 %r1875; .reg .u64 %r1876; .reg .f32 %r1877; .reg .f32 %r1878; .reg .f32 %r1879; .reg .f32 %r1880; .reg .f32 %r1881; .reg .f32 %r1882; .reg .f32 %r1883; .reg .f32 %r1884; .reg .f32 %r1885; .reg .f32 %r1886; .reg .f32 %r1887; .reg .u64 %r1889; .reg .u64 %r1890; .reg .f32 %r1892; .reg .f32 %r1893; .reg .f32 %r1894; .reg .f32 %r1895; .reg .f32 %r1896; .reg .f32 %r1897; .reg .f32 %r1898; .reg .f32 %r1899; .reg .f32 %r1900; .reg .f32 %r1901; .reg .f32 %r1902; .reg .u64 %r1904; .reg .u64 %r1905; .reg .f32 %r1907; .reg .f32 %r1908; .reg .f32 %r1909; .reg .f32 %r1910; .reg .f32 %r1911; .reg .f32 %r1912; .reg .f32 %r1913; .reg .f32 %r1914; .reg .f32 %r1915; .reg .pred %r1916; .reg .u64 %r1920; .reg .u64 %r1922; .reg .u64 %r1925; .reg .pred %r1927; .reg .pred %r1928; .reg .pred %r1929; .reg .u64 %r1931; .reg .u64 %r1932; .reg .u64 %r1933; .reg .f32 %r1934; .reg .f32 %r1935; .reg .f32 %r1936; .reg .f32 %r1937; .reg .f32 %r1938; .reg .f32 %r1939; .reg .f32 %r1940; .reg .f32 %r1941; .reg .f32 %r1942; .reg .f32 %r1943; .reg .f32 %r1944; .reg .f32 %r1945; .reg .u64 %r1947; .reg .u64 %r1948; .reg .u64 %r1949; .reg .f32 %r1951; .reg .f32 %r1952; .reg .f32 %r1953; .reg .f32 %r1954; .reg .f32 %r1955; .reg .f32 %r1956; .reg .f32 %r1957; .reg .f32 %r1958; .reg .f32 %r1959; .reg .f32 %r1960; .reg .f32 %r1961; .reg .f32 %r1962; .reg .u64 %r1964; .reg .u64 %r1965; .reg .u64 %r1966; .reg .f32 %r1968; .reg .f32 %r1969; .reg .f32 %r1970; .reg .f32 %r1971; .reg .f32 %r1972; .reg .f32 %r1973; .reg .f32 %r1974; .reg .f32 %r1975; .reg .f32 %r1976; .reg .f32 %r1977; .reg .f32 %r1978; .reg .f32 %r1979; .reg .pred %r1980; .reg .pred %r1981; .reg .f32 %r1983; .reg .f32 %r1984; .reg .u64 %r1986; .reg .u64 %r1987; .reg .f32 %r1988; .reg .f32 %r1989; .reg .f32 %r1990; .reg .f32 %r1991; .reg .f32 %r1992; .reg .f32 %r1993; .reg .f32 %r1994; .reg .f32 %r1995; .reg .f32 %r1996; .reg .f32 %r1997; .reg .f32 %r1998; .reg .u64 %r2000; .reg .u64 %r2001; .reg .f32 %r2003; .reg .f32 %r2004; .reg .f32 %r2005; .reg .f32 %r2006; .reg .f32 %r2007; .reg .f32 %r2008; .reg .f32 %r2009; .reg .f32 %r2010; .reg .f32 %r2011; .reg .f32 %r2012; .reg .f32 %r2013; .reg .u64 %r2015; .reg .u64 %r2016; .reg .f32 %r2018; .reg .f32 %r2019; .reg .f32 %r2020; .reg .f32 %r2021; .reg .f32 %r2022; .reg .f32 %r2023; .reg .f32 %r2024; .reg .f32 %r2025; .reg .f32 %r2026; .reg .pred %r2027; .reg .f32 %r2028; .reg .f32 %r2029; .reg .f32 %r2030; .reg .f32 %r2031; .reg .f32 %r2032; .reg .f32 %r2033; .reg .f32 %r2034; .reg .f32 %r2035; .reg .f32 %r2036; .reg .f32 %r2038; .reg .f32 %r2039; .reg .f32 %r2040; .reg .f32 %r2041; .reg .f32 %r2042; .reg .f32 %r2043; .reg .f32 %r2044; .reg .f32 %r2045; .reg .f32 %r2046; .reg .f32 %r2048; .reg .f32 %r2049; .reg .f32 %r2050; .reg .f32 %r2051; .reg .f32 %r2052; .reg .f32 %r2053; .reg .f32 %r2054; .reg .f32 %r2055; .reg .f32 %r2056; .reg .pred %r2057; .reg .f32 %r2058; .reg .f32 %r2059; .reg .f32 %r2061; .reg .f32 %r2062; .reg .f32 %r2064; .reg .f32 %r2065; .reg .pred %r2066; .reg .f32 %r2067; .reg .f32 
%r2068; .reg .f32 %r2069; .reg .f32 %r2070; .reg .f32 %r2071; .reg .f32 %r2072; .reg .f32 %r2073; .reg .f32 %r2074; .reg .pred %r2075; .reg .u32 %r2076; .reg .u64 %r2077; .reg .u32 %r2079; .reg .u64 %r2080; .reg .u32 %r2082; .reg .u64 %r2083; .reg .pred %r2084; .reg .u32 %r2085; .reg .u64 %r2086; .reg .u64 %r2088; .reg .u32 %r2090; .reg .u64 %r2091; .reg .u64 %r2093; .reg .u32 %r2095; .reg .u64 %r2096; .reg .pred %r2097; .reg .f32 %r2098; .reg .f32 %r2099; .reg .f32 %r2100; .reg .f32 %r2101; .reg .f32 %r2102; .reg .f32 %r2103; .reg .f32 %r2104; .reg .f32 %r2108; .reg .f32 %r2109; .reg .f32 %r2110; .reg .f32 %r2111; .reg .f32 %r2112; .reg .f32 %r2113; .reg .f32 %r2114; .reg .f32 %r2115; .reg .f32 %r2116; .reg .f32 %r2118; .reg .f32 %r2119; .reg .f32 %r2120; .reg .f32 %r2121; .reg .f32 %r2122; .reg .f32 %r2123; .reg .f32 %r2124; .reg .f32 %r2125; .reg .f32 %r2126; .reg .pred %r2127; .reg .f32 %r2128; .reg .f32 %r2129; .reg .f32 %r2130; .reg .f32 %r2131; .reg .f32 %r2132; .reg .f32 %r2133; .reg .f32 %r2134; .reg .f32 %r2138; .reg .f32 %r2139; .reg .f32 %r2140; .reg .f32 %r2141; .reg .f32 %r2142; .reg .f32 %r2143; .reg .f32 %r2144; .reg .f32 %r2145; .reg .f32 %r2146; .reg .f32 %r2148; .reg .f32 %r2149; .reg .f32 %r2150; .reg .f32 %r2151; .reg .f32 %r2152; .reg .f32 %r2153; .reg .f32 %r2154; .reg .f32 %r2155; .reg .f32 %r2156; .reg .pred %r2157; .reg .f32 %r2158; .reg .f32 %r2159; .reg .f32 %r2160; .reg .f32 %r2161; .reg .f32 %r2162; .reg .f32 %r2163; .reg .f32 %r2164; .reg .f32 %r2168; .reg .f32 %r2169; .reg .f32 %r2170; .reg .f32 %r2171; .reg .f32 %r2172; .reg .f32 %r2173; .reg .f32 %r2174; .reg .f32 %r2175; .reg .f32 %r2176; .reg .f32 %r2178; .reg .f32 %r2179; .reg .f32 %r2180; .reg .f32 %r2181; .reg .f32 %r2182; .reg .f32 %r2183; .reg .f32 %r2184; .reg .f32 %r2185; .reg .f32 %r2186; .reg .pred %r2187; .reg .f32 %r2188; .reg .f32 %r2189; .reg .f32 %r2190; .reg .f32 %r2191; .reg .f32 %r2192; .reg .f32 %r2193; .reg .f32 %r2194; .reg .f32 %r2195; .reg .f32 %r2196; .reg .f32 %r2197; .reg .f32 %r2198; .reg .f32 %r2199; .reg .f32 %r2201; .reg .f32 %r2202; .reg .f32 %r2203; .reg .f32 %r2204; .reg .f32 %r2205; .reg .f32 %r2206; .reg .f32 %r2207; .reg .f32 %r2208; .reg .f32 %r2209; .reg .f32 %r2210; .reg .f32 %r2211; .reg .f32 %r2212; .reg .f32 %r2214; .reg .f32 %r2215; .reg .f32 %r2216; .reg .f32 %r2217; .reg .f32 %r2218; .reg .f32 %r2219; .reg .f32 %r2220; .reg .f32 %r2221; .reg .f32 %r2222; .reg .f32 %r2223; .reg .f32 %r2224; .reg .f32 %r2225; .reg .pred %r2226; .reg .pred %r2229; .reg .f32 %r2230; .reg .f32 %r2231; .reg .f32 %r2232; .reg .f32 %r2233; .reg .f32 %r2234; .reg .f32 %r2235; .reg .f32 %r2236; .reg .f32 %r2240; .reg .f32 %r2241; .reg .f32 %r2242; .reg .f32 %r2243; .reg .f32 %r2244; .reg .f32 %r2245; .reg .f32 %r2246; .reg .f32 %r2247; .reg .f32 %r2248; .reg .f32 %r2250; .reg .f32 %r2251; .reg .f32 %r2252; .reg .f32 %r2253; .reg .f32 %r2254; .reg .f32 %r2255; .reg .f32 %r2256; .reg .f32 %r2257; .reg .f32 %r2258; .reg .pred %r2259; mov.u64 %r945,%ar0; mov.u64 %r946,%ar1; mov.u64 %r947,%ar2; mov.u32 %r948,%ar3; mov.u32 %r949,%ar4; mov.u64 %r950,%ar5; .loc 1 2462 3 ld.s8 %r627,[%r946+28]; cvt.u16.u32 %r951,%r627; setp.eq.u16 %r952,%r951,2; @ %r952 bra $L2; ld.s8 %r1551,[%r947+28]; cvt.u16.u32 %r953,%r1551; setp.eq.u16 %r955,%r953,2; @ %r955 bra $L3; cvta.const.u64 %r959,$LC3; cvta.const.u64 %r958,__func__$0; mov.u32 %r957,2462; cvta.const.u64 %r995893: .loc 1 2476 6 ld.u64 %r960,[%r945]; setp.ne.u64 %r961,%r960,0; @ %r961 bra $L4; .loc 1 2478 10 setp.ne.u16 %r963,%r951,1; @ %r963 bra $L5; 
.loc 1 2480 4 st.u64 [%r945+48],%r960; ld.u64 %r966,[%r947+80]; ld.u64 %r968,[%r947+72]; sub.u64 %r969,%r966,%r968; st.u64 [%r945+56],%r969; mov.u64 %r970,1; st.u64 [%r945+40],%r970; bra $L6; $L123: .loc 1 2485 4 st.u64 [%r945+48],%r1512; add.u64 %r972,%r668,-1; st.u64 [%r945+56],%r972; mov.u64 %r973,1; st.u64 [%r945+40],%r973; bra $L6; $L124: .loc 1 2490 4 mov.u64 %r974,0; st.u64 [%r945+48],%r974; add.u64 %r975,%r668,-1; st.u64 [%r945+56],%r975; mov.u64 %r976,1; st.u64 [%r945+40],%r976; .loc 1 2493 11 st.u64 [%r945+72],%r974; ld.u64 %r979,[%r947+80]; ld.u64 %r981,[%r947+72]; sub.u64 %r982,%r979,%r981; st.u64 [%r945+80],%r982; st.u64 [%r945+64],%r668; $L6: .loc 1 2499 1945; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r984,[%value_in]; } .loc 1 2499 4 mov.u64 %r9869986; call (%value_in),_gfortrani_xmallocarray987,[%value_in]; } .loc 1 2499 2 st.u64 [%r945],%r987; .loc 1 2500 24 mov.u64 %r989,0; st.u64 [%r945+8],%r989; .loc 1 2557 7 ld.s8 %r627,[%r946+28]; .loc 1 2564 15 ld.u64 %r991,[%r946+56]; add.u64 %r990,%r991,1; .loc 1 2564 13 ld.u64 %r992,[%r946+48]; sub.u64 %r638,%r990,%r992; bra $L7; $L4: .loc 1 2502 12 cvta.global.u64 %r993,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r994,[%r993+36]; setp.eq.u32 %r995,%r994,0; @ %r995 bra $L8; .loc 1 2509 17 ld.u64 %r997,[%r945+56]; add.u64 %r996,%r997,1; .loc 1 2509 15 ld.u64 %r998,[%r945+48]; sub.u64 %r638,%r996,%r998; .loc 1 2506 10 setp.ne.u16 %r1000,%r951,1; @ %r1000 bra $L9; .loc 1 2508 17 ld.u64 %r1002,[%r947+80]; add.u64 %r1001,%r1002,1; .loc 1 2508 15 ld.u64 %r1003,[%r947+72]; sub.u64 %r217,%r1001,%r1003; .loc 1 2510 7 setp.ne.u64 %r1004,%r217,%r638; @ %r1004 bra $L10; .loc 1 2564 15 ld.u64 %r1006,[%r946+56]; add.u64 %r1005,%r1006,1; .loc 1 2564 13 ld.u64 %r1007,[%r946+48]; sub.u64 %r638,%r1005,%r1007; .loc 1 2543 7 ld.s8 %r901,[%r945+28]; .loc 1 2548 27 ld.u64 %r844,[%r945+40]; .loc 1 2543 6 cvt.u16.u32 %r1008,%r901; setp.eq.u16 %r1009,%r1008,1; @ %r1009 bra $L11; .loc 1 2553 16 ld.u64 %r664,[%r945+64]; .loc 1 2560 16 ld.u64 %r298,[%r946+40]; .loc 1 2563 14 mov.u64 %r162,1; .loc 1 2561 16 mov.u64 %r150,%r162; bra $L12; $L10: .loc 1 2511 6 st.u64 [%stack+8],%r217; st.u64 [%stack],%r638; cvta.const.u64 %r10101010_gfortran_runtime_error2485 4 ld.u64 %r1013,[%r946+56]; add.u64 %r1012,%r1013,1; ld.u64 %r1014,[%r946+48]; sub.u64 %r672,%r1012,%r1014; .loc 1 2515 15 cvt.u16.u32 %r1015,%r1551; setp.ne.u16 %r1017,%r1015,1; @ %r1017 bra $L13; .loc 1 2519 7 setp.eq.u64 %r1018,%r672,%r638; @ %r1018 bra $L14; .loc 1 2520 6 st.u64 [%stack+8],%r672; st.u64 [%stack],%r638; cvta.const.u64 %r101101_gfortran_runtime_error13: .loc 1 2528 7 setp.eq.u64 %r1021,%r672,%r638; @ %r1021 bra $L15; .loc 1 2529 6 st.u64 [%stack+8],%r672; st.u64 [%stack],%r638; cvta.const.u64 %r102210stack; call _gfortran_runtime_error15: .loc 1 2533 17 ld.u64 %r1025,[%r947+80]; add.u64 %r1024,%r1025,1; .loc 1 2533 15 ld.u64 %r1026,[%r947+72]; sub.u64 %r215,%r1024,%r1026; .loc 1 2534 17 ld.u64 %r1028,[%r945+80]; add.u64 %r1027,%r1028,1; .loc 1 2534 15 ld.u64 %r1029,[%r945+72]; sub.u64 %r216,%r1027,%r1029; .loc 1 2535 7 setp.eq.u64 %r1030,%r215,%r216; @ %r1030 bra $L14; .loc 1 2536 6 st.u64 [%stack+8],%r215; st.u64 [%stack],%r216; cvta.const.u64 %r1031,$LC10stack; call _gfortran_runtime_error2564 15 ld.u64 %r1034,[%r946+56]; add.u64 %r1033,%r1034,1; .loc 1 2564 13 ld.u64 %r1035,[%r946+48]; sub.u64 %r638,%r1033,%r1035; $L7: .loc 1 2548 27 ld.u64 %r664,[%r945+40]; .loc 1 2543 6 ld.s8 %r1037,[%r945+28]; cvt.u16.u32 %r1036,%r1037; setp.eq.u16 %r1038,%r1036,1; @ %r1038 
bra $L130; $L127: .loc 1 2553 16 mov.u64 %r844,%r664; ld.u64 %r664,[%r945+64]; bra $L16; $L130: .loc 1 2548 27 mov.u64 %r844,%r664; $L16: .loc 1 2560 16 ld.u64 %r298,[%r946+40]; .loc 1 2557 6 cvt.u16.u32 %r1039,%r627; setp.eq.u16 %r1040,%r1039,1; @ %r1040 bra $L131; $L125: .loc 1 2569 16 ld.u64 %r150,[%r946+64]; .loc 1 2571 15 ld.u64 %r1042,[%r946+80]; add.u64 %r1041,%r1042,1; .loc 1 2571 13 ld.u64 %r1043,[%r946+72]; mov.u64 %r162,%r638; sub.u64 %r638,%r1041,%r1043; bra $L12; $L131: .loc 1 2563 14 mov.u64 %r162,1; .loc 1 2561 16 mov.u64 %r150,%r162; $L12: .loc 1 2575 16 ld.u64 %r1045,[%r947+56]; add.u64 %r1044,%r1045,1; ld.u64 %r1046,[%r947+48]; sub.u64 %r65,%r1044,%r1046; .loc 1 2575 6 setp.eq.u64 %r1047,%r65,%r638; @ %r1047 bra $L17; .loc 1 2577 10 set.u32.gt.s64 %r1049,%r638,0; neg.s32 %r1050,%r1049; .loc 1 2577 21 set.u32.gt.s64 %r1052,%r65,0; neg.s32 %r1053,%r1052; cvt.u16.u32 %r1055,%r1050; cvt.u16.u32 %r1056,%r1053; or.b16 %r1054,%r1055,%r1056; cvt.u32.u16 %r1057,%r1054; cvt.u16.u8 %r1058,%r1057; setp.eq.u16 %r1059,%r1058,0; @ %r1059 bra $L17; .loc 1 2578 2 st.u64 [%stack+8],%r638; st.u64 [%stack],%r65; cvta.const.u64 %r10601060_gfortran_runtime_error17: .loc 1 2583 7 ld.s8 %r70,[%r947+28]; .loc 1 2586 16 ld.u64 %r663,[%r947+40]; .loc 1 2583 6 cvt.u16.u32 %r1062,%r70; setp.eq.u16 %r1063,%r1062,1; @ %r1063 bra $L132; .loc 1 2597 16 ld.u64 %r151,[%r947+64]; .loc 1 2598 16 ld.u64 %r1065,[%r947+80]; add.u64 %r1064,%r1065,1; .loc 1 2598 14 ld.u64 %r1066,[%r947+72]; sub.u64 %r163,%r1064,%r1066; bra $L18; $L132: .loc 1 2592 14 mov.u64 %r163,1; .loc 1 2591 16 mov.u64 %r151,256; $L18: .loc 1 2601 9 ld.u64 %r220,[%r946]; .loc 1 2602 9 ld.u64 %r221,[%r947]; .loc 1 2603 8 ld.u64 %r222,[%r945]; .loc 1 2612 28 set.u32.eq.u64 %r1068,%r844,1; neg.s32 %r1069,%r1068; cvt.u32.u32 %r1067,%r1069; cvt.u32.u8 %r75,%r1067; .loc 1 2612 46 set.u32.eq.u64 %r1071,%r298,1; neg.s32 %r1072,%r1071; cvt.u32.u32 %r1070,%r1072; cvt.u32.u8 %r662,%r1070; .loc 1 2612 7 set.u32.ne.u32 %r1074,%r948,0; neg.s32 %r1075,%r1074; .loc 1 2612 16 cvt.u16.u32 %r1077,%r1075; cvt.u16.u32 %r1078,%r75; and.b16 %r1076,%r1077,%r1078; .loc 1 2612 6 cvt.u32.u16 %r1079,%r1076; cvt.u16.u8 %r1080,%r1079; setp.eq.u16 %r1081,%r1080,0; @ %r1081 bra $L19; .loc 1 2612 63 set.u32.eq.u64 %r1083,%r150,1; neg.s32 %r1084,%r1083; .loc 1 2612 51 cvt.u16.u32 %r1086,%r1084; cvt.u16.u32 %r1087,%r662; or.b16 %r1085,%r1086,%r1087; .loc 1 2612 33 cvt.u32.u16 %r1088,%r1085; cvt.u16.u8 %r1089,%r1088; setp.eq.u16 %r1090,%r1089,0; @ %r1090 bra $L20; .loc 1 2613 20 set.u32.eq.u64 %r1092,%r663,1; neg.s32 %r1093,%r1092; .loc 1 2613 37 set.u32.eq.u64 %r1095,%r151,1; neg.s32 %r1096,%r1095; .loc 1 2613 25 cvt.u16.u32 %r1098,%r1093; cvt.u16.u32 %r1099,%r1096; or.b16 %r1097,%r1098,%r1099; .loc 1 2613 7 cvt.u32.u16 %r1100,%r1097; cvt.u16.u8 %r1101,%r1100; setp.eq.u16 %r1102,%r1101,0; @ %r1102 bra $L20; .loc 1 2615 13 cvt.rn.f32.s32 %r86,%r949; .loc 1 2614 12 cvt.rn.f32.s64 %r1103,%r162; .loc 1 2614 50 cvt.rn.f32.s64 %r1104,%r638; .loc 1 2614 47 mul.f32 %r1105,%r1103,%r1104; .loc 1 2614 31 cvt.rn.f32.s64 %r1106,%r163; .loc 1 2614 47 mul.f32 %r1107,%r1105,%r1106; .loc 1 2615 13 mul.f32 %r1108,%r86,%r86; mul.f32 %r1109,%r1108,%r86; .loc 1 2614 7 setp.gt.f32 %r1110,%r1107,%r1109; @ ! 
%r1110 bra $L19; .loc 1 2617 17 cvt.u32.u64 %r90,%r162; st.u32 [%frame+36],%r90; .loc 1 2617 29 cvt.u32.u64 %r91,%r163; st.u32 [%frame+32],%r91; .loc 1 2617 41 cvt.u32.u64 %r92,%r638; st.u32 [%frame+28],%r92; .loc 1 2617 52 cvt.u32.u64 %r93,%r664; st.u32 [%frame+24],%r93; .loc 1 2618 27 ld.const.f32 %r1111,[$LC8]; st.f32 [%frame+8],%r1111; ld.const.f32 %r1112,[$LC8+4]; st.f32 [%frame+12],%r1112; .loc 1 2618 36 ld.const.f32 %r1545,[$LC9]; st.f32 [%frame],%r1545; ld.const.f32 %r1535,[$LC9+4]; st.f32 [%frame+4],%r1535; .loc 1 2619 17 setp.ne.u64 %r1115,%r298,1; @ %r1115 bra $L22; cvt.u32.u64 %r198,%r150; bra $L23; $L22: cvt.u32.u64 %r198,%r298; $L23: st.u32 [%frame+20],%r198; .loc 1 2620 3 setp.ne.u64 %r1116,%r663,1; @ %r1116 bra $L24; cvt.u32.u64 %r199,%r151; bra $L25; $L24: cvt.u32.u64 %r199,%r663; $L25: st.u32 [%frame+16],%r199; .loc 1 2622 15 set.u32.gt.s32 %r1118,%r198,0; neg.s32 %r1119,%r1118; .loc 1 2622 26 set.u32.gt.s32 %r1121,%r199,0; neg.s32 %r1122,%r1121; .loc 1 2622 19 cvt.u16.u32 %r1124,%r1119; cvt.u16.u32 %r1125,%r1122; and.b16 %r1123,%r1124,%r1125; .loc 1 2622 10 cvt.u32.u16 %r1126,%r1123; cvt.u16.u8 %r1127,%r1126; setp.eq.u16 %r1128,%r1127,0; @ %r1128 bra $L19; .loc 1 2622 37 set.u32.gt.s32 %r1130,%r93,0; neg.s32 %r1131,%r1130; .loc 1 2622 46 set.u32.gt.s32 %r1133,%r90,1; neg.s32 %r1134,%r1133; .loc 1 2622 41 cvt.u16.u32 %r1136,%r1131; cvt.u16.u32 %r1137,%r1134; and.b16 %r1135,%r1136,%r1137; cvt.u32.u16 %r1138,%r1135; cvt.u16.u8 %r1139,%r1138; setp.eq.u16 %r1140,%r1139,0; @ %r1140 bra $L19; .loc 1 2622 55 set.u32.gt.s32 %r1142,%r91,1; neg.s32 %r1143,%r1142; .loc 1 2622 64 set.u32.gt.s32 %r1145,%r92,1; neg.s32 %r1146,%r1145; .loc 1 2622 59 cvt.u16.u32 %r1148,%r1143; cvt.u16.u32 %r1149,%r1146; and.b16 %r1147,%r1148,%r1149; cvt.u32.u16 %r1150,%r1147; cvt.u16.u8 %r1151,%r1150; setp.eq.u16 %r1152,%r1151,0; @ %r1152 bra $L19; .loc 1 2624 4 setp.ne.u64 %r1153,%r950,0; @ %r1153 bra $L27; cvta.const.u64 %r1157,$LC10; cvta.const.u64 %r1156,__func__$0; mov.u32 %r1155,2624; cvta.const.u64 %r11115415511561127: .loc 1 2626 17 and.b32 %r1158,%r948,2; .loc 1 2626 7 setp.ne.u32 %r1159,%r1158,0; @ %r1159 bra $L133; .loc 1 2629 35 setp.eq.u64 %r1160,%r298,1; @ %r1160 bra $L134; cvta.const.u64 %r164,$LC2; bra $L28; $L133: .loc 1 2627 13 cvta.const.u64 %r164,$LC1; bra $L28; $L134: .loc 1 2629 35 cvta.const.u64 %r164,$LC0; $L28: .loc 1 2631 17 and.b32 %r1161,%r948,4; .loc 1 2631 7 setp.ne.u32 %r1162,%r1161,0; @ %r1162 bra $L135; .loc 1 2634 35 setp.eq.u64 %r1163,%r663,1; @ %r1163 bra $L136; cvta.const.u64 %r165,$LC2; bra $L29; $L135: .loc 1 2632 13 cvta.const.u64 %r165,$LC1; bra $L29; $L136: .loc 1 2634 35 cvta.const.u64 %r165,$LC0; $L29: .loc 1 2636 4 add.u64 %r1179,%frame,24; add.u64 %r1180,%frame,16; add.u64 %r1181,%frame,20; add.u64 %r1182,%frame,8; add.u64 %r1183,%frame,28; add.u64 %r1184,%frame,32; add.u64 %r1185,%frame,36; mov.u32 %r1178,1;,.param .u64 %in_ar3,.param .u64 %in_ar4,.param .u64 %in_ar5,.param .u64 %in_ar6,.param .u64 %in_ar7,.param .u64 %in_ar8,.param .u64 %in_ar9,.param .u64 %in_ar10,.param .u64 %in_ar11,.param .u64 %in_ar12,.param .u32 %in_ar13,.param .u32 %in_ar14164r118118411831182220; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r1181; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r221; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r1180; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%frame; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r222; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r1179; .param .u32 %out_arg14; st.param.u32 
[%out_arg14],%r1178; .param .u32 %out_arg15; st.param.u32 [%out_arg15],%r1178; call %r950,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14,%out_arg15),$LCT0; } bra $L1; $L19: .loc 1 2643 21 cvt.u16.u32 %r1187,%r75; cvt.u16.u32 %r1188,%r662; and.b16 %r1186,%r1187,%r1188; .loc 1 2643 6 cvt.u32.u16 %r1189,%r1186; cvt.u16.u8 %r1190,%r1189; setp.eq.u16 %r1191,%r1190,0; @ %r1191 bra $L31; .loc 1 2643 38 setp.ne.u64 %r1192,%r663,1; @ %r1192 bra $L20; .loc 1 2644 7 cvt.u16.u32 %r1193,%r70; setp.eq.u16 %r1194,%r1193,1; @ %r1194 bra $L32; .loc 1 2677 16 add.u64 %r1195,%r664,1; .loc 1 2678 9 shl.b64 %r111,%r1195,3; sub.u64 %r224,%r222,%r111; .loc 1 2687 7 setp.le.s64 %r1196,%r163,0; @ %r1196 bra $L33; add.u64 %r802,%r111,-8; mov.u64 %r800,%r222; add.u64 %r806,%r163,1; .loc 1 2689 22 shl.b64 %r623,%r162,3; .loc 1 2687 13 mov.u64 %r255,%r663; setp.gt.s64 %r1556,%r162,0; and.b64 %r1636,%r163,3; setp.eq.u64 %r1639,%r1636,0; @ %r1639 bra $L34; setp.eq.u64 %r1638,%r1636,1; @ %r1638 bra $L282; setp.eq.u64 %r1637,%r1636,2; @ %r1637 bra $L283; bra $L338; $L35: .loc 1 2689 22 mov.u32 %r12002001202,[%value_in]; } $L36: .loc 1 2687 24 add.u64 %r1640,%r255,1; .loc 1 2687 7 add.u64 %r1695,%r800,%r802; .loc 1 2688 2 @ %r1556 bra $L220; bra $L339; $L34: @ %r1556 bra $L35; bra $L36; $L33: .loc 1 2692 28 set.u32.eq.u64 %r1207,%r638,0; neg.s32 %r1208,%r1207; .loc 1 2692 13 set.u32.eq.u64 %r1210,%r162,0; neg.s32 %r1211,%r1210; .loc 1 2692 28 cvt.u16.u32 %r1213,%r1208; cvt.u16.u32 %r1214,%r1211; or.b16 %r1212,%r1213,%r1214; .loc 1 2692 23 set.u32.eq.u64 %r1216,%r163,0; neg.s32 %r1217,%r1216; .loc 1 2692 28 cvt.u16.u32 %r1221,%r1217; or.b16 %r1219,%r1212,%r1221; cvt.u32.u16 %r1222,%r1219; cvt.u16.u8 %r1223,%r1222; setp.eq.u16 %r1224,%r1223,0; @ ! %r1224 bra $L1; .loc 1 2697 10 setp.eq.u64 %r1225,%r150,1; selp.u64 %r196,%r664,%r150,%r1225; .loc 1 2702 21 shl.b64 %r1226,%r196,8; .loc 1 2702 14 add.u64 %r229,%r1226,%r151; .loc 1 2706 12 min.s64 %r1228,%r229,65536; shl.b64 %r1229,%r12282291230,[%value_in]; } .loc 1 2710 7 setp.gt.s64 %r1232,%r163,0; @ %r1232 bra $L39; $L42: .loc 1 29302918 7 bra $L1; $L39: .loc 1 2680 16 add.u64 %r1234,%r150,1; .loc 1 2681 9 shl.b64 %r1235,%r1234,3; sub.u64 %r226,%r220,%r1235; .loc 1 2683 16 add.u64 %r1236,%r151,1; .loc 1 2684 9 shl.b64 %r117,%r1236,3; shl.b64 %r769,%r664,12; shl.b64 %r767,%r664,4; shl.b64 %r771,%r664,5; neg.s64 %r1238,%r664; shl.b64 %r1239,%r1238,12; shl.b64 %r772,%r1238,3; shl.b64 %r775,%r664,9; shl.b64 %r783,%r151,12; add.u64 %r784,%r117,-8; sub.u64 %r785,%r221,%r117; add.u64 %r782,%r784,%r785; add.u64 %r1244,%r151,%r151; add.u64 %r1245,%r1244,%r151; shl.b64 %r1246,%r1245,3; add.u64 %r786,%r1246,%r785; add.u64 %r1247,%r163,-1; and.b64 %r793,%r1247,-512; shl.b64 %r749,%r150,8; add.u64 %r750,%r150,%r150; add.u64 %r797,%r111,-8; sub.u64 %r798,%r224,%r797; shl.b64 %r710,%r150,4; shl.b64 %r1251,%r1238,5; shl.b64 %r67,%r151,5; mov.u64 %r774,%r664; mov.u64 %r770,%r771; mov.u64 %r765,0; setp.gt.s64 %r1553,%r638,0; add.u64 %r1590,%r798,8; setp.gt.s64 %r1591,%r162,0; add.u64 %r1592,%r162,1; add.u64 %r1593,%r226,16; bra $L41; $L138: mov.u64 %r765,%r766; $L41: add.u64 %r795,%r765,1; .loc 1 2715 9 sub.u64 %r231,%r163,%r765; min.s64 %r1696,%r231,512; .loc 1 2716 24 shr.s64 %r1253,%r1696,63; shr.u64 %r1254,%r1253,62; add.u64 %r1255,%r1696,%r1254; and.b64 %r1256,%r1255,3; sub.u64 %r1257,%r1256,%r1254; .loc 1 2716 10 sub.u64 %r232,%r1696,%r1257; .loc 1 2718 4 @ %r1553 bra $L40; $L45: .loc 1 2710 7 add.u64 %r766,%r765,512; add.u64 %r767,%r767,%r769; add.u64 
%r770,%r770,%r769; add.u64 %r772,%r772,%r1239; add.u64 %r774,%r774,%r775; add.u64 %r782,%r782,%r783; add.u64 %r786,%r786,%r783; setp.ne.u64 %r1259,%r765,%r793; @ %r1259 bra $L138; bra $L42; $L40: .loc 1 2768 13 add.u64 %r644,%r232,%r795; add.u64 %r799,%r1590,%r767; add.u64 %r760,%r782,%r784; add.u64 %r763,%r784,%r786; add.u64 %r259,%r767,%r797; mul.lo.u64 %r583,%r644,%r664; mul.lo.u64 %r577,%r644,%r151; mad.lo.u64 %r756,%r644,%r784,%r785; mov.u64 %r747,%r750; .loc 1 2718 12 mov.u64 %r235,%r663; add.u64 %r1552,%r638,1; .loc 1 2768 8 add.u64 %r1583,%r644,-1; setp.ge.s64 %r1585,%r1583,%r795; .loc 1 2873 17 add.u64 %r1588,%r1696,%r795; .loc 1 2874 9 setp.le.s64 %r1589,%r1588,%r644; $L44: .loc 1 2723 13 sub.u64 %r233,%r1552,%r235; min.s64 %r1698,%r233,256; .loc 1 2724 28 shr.u64 %r1265,%r1698,63; add.u64 %r1266,%r1698,%r1265; and.b64 %r1267,%r1266,1; sub.u64 %r1268,%r1267,%r1265; .loc 1 2724 14 sub.u64 %r234,%r1698,%r1268; .loc 1 2727 8 @ %r1591 bra $L43; $L73: .loc 1 2718 30 add.u64 %r235,%r235,256; .loc 1 2718 4 add.u64 %r747,%r747,%r749; setp.ge.s64 %r1270,%r638,%r235; @ %r1270 bra $L44; bra $L45; $L43: .loc 1 2763 26 add.u64 %r1271,%r235,-1; add.u64 %r1272,%r1271,%r1698; .loc 1 2763 31 mul.lo.u64 %r146,%r1272,%r150; .loc 1 2734 13 add.u64 %r643,%r234,%r235; .loc 1 2734 8 add.u64 %r645,%r643,-1; add.u64 %r1273,%r146,1; shl.b64 %r1274,%r1273,3; add.u64 %r737,%r226,%r1274; shl.b64 %r679,%r1698,3; add.u64 %r1275,%r679,-8; add.u64 %r678,%r1230,%r1275; add.u64 %r1276,%r679,-4; add.u64 %r683,%r1230,%r1276; shl.b64 %r758,%r235,3; add.u64 %r759,%r758,%r782; add.u64 %r761,%r758,%r760; add.u64 %r762,%r758,%r786; add.u64 %r764,%r758,%r763; .loc 1 2904 18 add.u64 %r1546,%r1698,%r235; add.u64 %r757,%r756,%r758; add.u64 %r1277,%r577,%r235; add.u64 %r1278,%r1277,%r1698; shl.b64 %r1279,%r1278,3; add.u64 %r574,%r1279,%r785; mov.u64 %r736,%r799; .loc 1 2727 16 mov.u64 %r248,%r663; setp.le.s64 %r1536,%r235,%r645; setp.gt.s64 %r1548,%r1698,%r234; setp.gt.s64 %r1547,%r1696,%r232; not.b64 %r1576,%r235; add.u64 %r1577,%r1576,%r643; and.b64 %r1578,%r1577,-2; .loc 1 2904 13 add.u64 %r1580,%r1546,-1; setp.gt.s64 %r1581,%r235,%r1580; $L72: .loc 1 2732 10 sub.u64 %r236,%r1592,%r248; min.s64 %r1699,%r236,256; .loc 1 2733 25 shr.s64 %r1533,%r1699,63; shr.u64 %r1283,%r1699,63; add.u64 %r1284,%r1699,%r1283; and.b64 %r1285,%r1284,1; sub.u64 %r1286,%r1285,%r1283; .loc 1 2733 11 sub.u64 %r237,%r1699,%r1286; .loc 1 2735 5 @ %r1536 bra $L46; $L55: .loc 1 2757 8 @ ! 
%r1548 bra $L48; bra $L47; $L46: .loc 1 2737 17 add.u64 %r647,%r237,%r248; .loc 1 2737 12 add.u64 %r648,%r647,-1; sub.u64 %r1289,%r747,%r150; add.u64 %r1290,%r1289,%r248; add.u64 %r1291,%r1290,%r1699; shl.b64 %r1292,%r1291,3; add.u64 %r1293,%r1292,-8; add.u64 %r709,%r226,%r1293; shl.b64 %r1295,%r1699,11; add.u64 %r1296,%r1295,-2048; add.u64 %r716,%r1230,%r1296; add.u64 %r1297,%r248,%r747; add.u64 %r1298,%r1297,%r1699; shl.b64 %r1299,%r1298,3; add.u64 %r1300,%r1299,-8; add.u64 %r721,%r226,%r1300; neg.s64 %r1304,%r1699; shl.b64 %r1305,%r1304,3; mov.u64 %r726,%r747; mov.u64 %r707,0; setp.le.s64 %r1542,%r248,%r648; setp.gt.s64 %r1549,%r1699,%r237; add.u64 %r1570,%r1305,8; not.b64 %r1571,%r248; add.u64 %r1572,%r1571,%r647; and.b64 %r1573,%r1572,-2; add.u64 %r1574,%r1573,%r248; bra $L54; $L139: mov.u64 %r707,%r708; $L54: .loc 1 2738 9 @ %r1542 bra $L49; $L53: .loc 1 2749 12 @ %r1549 bra $L50; bra $L51; $L49: add.u64 %r694,%r1570,%r709; shl.b64 %r1308,%r707,3; add.u64 %r695,%r1230,%r1308; add.u64 %r696,%r1570,%r721; add.u64 %r1313,%r1574,%r726; shl.b64 %r1314,%r1313,3; add.u64 %r706,%r1314,%r1593; sub.u64 %r1625,%r706,%r696; add.u64 %r1626,%r1625,-16; shr.u64 %r1624,%r1626,4; and.b64 %r1628,%r1624,1; setp.ne.u64 %r1629,%r1628,0; @ ! %r1629 bra $L340; $L52: ld.f32 %r192,[%r694+4]; .loc 1 2740 49 ld.f32 %r1316,[%r694]; st.f32 [%r695],%r1316; st.f32 [%r695+4],%r192; ld.f32 %r182,[%r696+4]; .loc 1 2742 49 ld.f32 %r1317,[%r696]; st.f32 [%r695+8],%r1317; st.f32 [%r695+12],%r182; ld.f32 %r180,[%r694+12]; .loc 1 2744 49 ld.f32 %r1318,[%r694+8]; st.f32 [%r695+2048],%r1318; st.f32 [%r695+2052],%r180; ld.f32 %r178,[%r696+12]; .loc 1 2746 49 ld.f32 %r1319,[%r696+8]; st.f32 [%r695+2056],%r1319; st.f32 [%r695+2060],%r178; .loc 1 2738 9 add.u64 %r1630,%r694,16; add.u64 %r1631,%r695,4096; add.u64 %r1632,%r696,16; ld.f32 %r1703,[%r1630+4]; .loc 1 2740 49 ld.f32 %r1704,[%r1630]; st.f32 [%r1631],%r1704; st.f32 [%r1631+4],%r1703; ld.f32 %r1705,[%r1632+4]; .loc 1 2742 49 ld.f32 %r1706,[%r1632]; st.f32 [%r1631+8],%r1706; st.f32 [%r1631+12],%r1705; ld.f32 %r1707,[%r1630+12]; .loc 1 2744 49 ld.f32 %r1708,[%r1630+8]; st.f32 [%r1631+2048],%r1708; st.f32 [%r1631+2052],%r1707; ld.f32 %r1709,[%r1632+12]; .loc 1 2746 49 ld.f32 %r1710,[%r1632+8]; st.f32 [%r1631+2056],%r1710; st.f32 [%r1631+2060],%r1709; .loc 1 2738 9 add.u64 %r694,%r694,32; add.u64 %r695,%r695,8192; add.u64 %r696,%r696,32; setp.ne.u64 %r1711,%r696,%r706; @ %r1711 bra $L52; bra $L53; $L50: ld.f32 %r174,[%r709+4]; .loc 1 2751 41 ld.f32 %r1321,[%r709]; st.f32 [%r716],%r1321; st.f32 [%r716+4],%r174; ld.f32 %r171,[%r721+4]; .loc 1 2753 41 ld.f32 %r1322,[%r721]; st.f32 [%r716+8],%r1322; st.f32 [%r716+12],%r171; $L51: .loc 1 2735 5 add.u64 %r708,%r707,2; add.u64 %r709,%r709,%r710; add.u64 %r716,%r716,16; add.u64 %r721,%r721,%r710; add.u64 %r726,%r726,%r750; setp.ne.u64 %r1323,%r707,%r1578; @ %r1323 bra $L139; bra $L55; $L48: .loc 1 2767 25 shr.u64 %r1326,%r1533,62; add.u64 %r1327,%r1699,%r1326; and.b64 %r1328,%r1327,3; sub.u64 %r1329,%r1328,%r1326; .loc 1 2767 11 sub.u64 %r240,%r1699,%r1329; .loc 1 2769 5 @ %r1585 bra $L56; bra $L57; $L47: .loc 1 2759 17 add.u64 %r131,%r1699,%r248; .loc 1 2759 12 add.u64 %r1331,%r131,-1; .loc 1 2760 9 setp.lt.s64 %r1332,%r1331,%r248; @ %r1332 bra $L48; add.u64 %r1333,%r131,%r146; shl.b64 %r1334,%r1333,3; add.u64 %r692,%r226,%r1334; mov.u64 %r682,%r683; mov.u64 %r677,%r678; mov.u64 %r676,%r737; sub.u64 %r1614,%r692,%r737; add.u64 %r1615,%r1614,-8; shr.u64 %r1613,%r1615,3; add.u64 %r1616,%r1613,1; and.b64 %r1617,%r1616,3; setp.eq.u64 
%r1620,%r1617,0; @ %r1620 bra $L58; setp.eq.u64 %r1619,%r1617,1; @ %r1619 bra $L284; setp.eq.u64 %r1618,%r1617,2; @ %r1618 bra $L285; bra $L341; $L58: ld.f32 %r168,[%r676+4]; .loc 1 2762 43 ld.f32 %r1335,[%r676]; st.f32 [%r677],%r1335; st.f32 [%r682],%r168; .loc 1 2760 9 add.u64 %r1621,%r676,8; ld.f32 %r1715,[%r1621+4]; .loc 1 2762 43 ld.f32 %r1716,[%r1621]; st.f32 [%r677+2048],%r1716; st.f32 [%r682+2048],%r1715; .loc 1 2760 9 add.u64 %r1717,%r676,16; ld.f32 %r1721,[%r1717+4]; .loc 1 2762 43 ld.f32 %r1722,[%r1717]; st.f32 [%r677+4096],%r1722; st.f32 [%r682+4096],%r1721; .loc 1 2760 9 add.u64 %r1723,%r676,24; ld.f32 %r1727,[%r1723+4]; .loc 1 2762 43 ld.f32 %r1728,[%r1723]; st.f32 [%r677+6144],%r1728; st.f32 [%r682+6144],%r1727; .loc 1 2760 9 add.u64 %r676,%r676,32; add.u64 %r677,%r677,8192; add.u64 %r682,%r682,8192; setp.ne.u64 %r1729,%r676,%r692; @ %r1729 bra $L58; bra $L48; $L57: .loc 1 2871 8 @ ! %r1547 bra $L60; bra $L59; $L56: .loc 1 2876 14 add.u64 %r650,%r240,%r248; .loc 1 2876 9 add.u64 %r651,%r650,-1; add.u64 %r1338,%r650,%r774; shl.b64 %r1339,%r1338,3; add.u64 %r140,%r224,%r1339; add.u64 %r639,%r736,%r797; add.u64 %r637,%r639,%r797; add.u64 %r51,%r637,%r797; not.b64 %r1340,%r248; add.u64 %r1341,%r1340,%r650; shr.u64 %r1342,%r1341,2; add.u64 %r1343,%r1342,1; shl.b64 %r244,%r1343,5; sub.u64 %r1344,%r650,%r248; add.u64 %r1346,%r1344,1; shl.b64 %r1347,%r1346,11; add.u64 %r552,%r1230,%r1347; shl.b64 %r238,%r1699,3; mov.u64 %r640,%r736; mov.u64 %r641,%r764; mov.u64 %r642,%r762; mov.u64 %r649,%r761; mov.u64 %r88,%r759; mov.u64 %r200,%r772; mov.u64 %r250,%r795; mov.u64 %r49,0; setp.le.s64 %r1543,%r248,%r651; setp.gt.s64 %r1550,%r1699,%r240; $L68: .loc 1 2772 9 @ %r1543 bra $L61; $L67: .loc 1 2843 12 @ ! %r1550 bra $L63; bra $L62; $L61: add.u64 %r506,%r1230,2048; add.u64 %r504,%r1230,4096; add.u64 %r502,%r1230,6144; add.u64 %r500,%r1230,8192; .loc 1 2904 13 mov.u64 %r498,0; $L66: add.u64 %r496,%r640,%r498; ld.f32 %r184,[%r496]; ld.f32 %r281,[%r496+4]; ld.f32 %r201,[%r496+8]; ld.f32 %r189,[%r496+12]; add.u64 %r483,%r639,%r498; ld.f32 %r282,[%r483]; ld.f32 %r45,[%r483+4]; ld.f32 %r278,[%r483+8]; ld.f32 %r270,[%r483+12]; add.u64 %r472,%r637,%r498; ld.f32 %r280,[%r472]; ld.f32 %r279,[%r472+4]; ld.f32 %r300,[%r472+8]; ld.f32 %r301,[%r472+12]; add.u64 %r286,%r51,%r498; ld.f32 %r296,[%r286]; ld.f32 %r299,[%r286+4]; ld.f32 %r302,[%r286+8]; ld.f32 %r303,[%r286+12]; ld.f32 %r272,[%r496+16]; ld.f32 %r273,[%r496+20]; ld.f32 %r284,[%r496+24]; ld.f32 %r283,[%r496+28]; ld.f32 %r290,[%r483+16]; ld.f32 %r274,[%r483+20]; ld.f32 %r276,[%r483+24]; ld.f32 %r275,[%r483+28]; ld.f32 %r304,[%r472+16]; ld.f32 %r305,[%r472+20]; ld.f32 %r308,[%r472+24]; ld.f32 %r309,[%r472+28]; ld.f32 %r306,[%r286+16]; ld.f32 %r307,[%r286+20]; ld.f32 %r310,[%r286+24]; ld.f32 %r311,[%r286+28]; .loc 1 2791 6 @ %r1581 bra $L64; mov.u64 %r523,0; $L65: add.u64 %r521,%r506,%r523; ld.f32 %r312,[%r521+-2048]; ld.f32 %r313,[%r521+-2044]; add.u64 %r519,%r88,%r523; ld.f32 %r314,[%r519]; ld.f32 %r315,[%r519+4]; .loc 1 2794 11 mul.f32 %r317,%r313,%r314; mul.f32 %r1351,%r313,%r315; neg.f32 %r1352,%r1351; fma.rn.f32 %r318,%r312,%r314,%r1352; fma.rn.f32 %r319,%r312,%r315,%r317; .loc 1 2793 14 add.f32 %r184,%r184,%r318; add.f32 %r281,%r281,%r319; add.u64 %r517,%r504,%r523; ld.f32 %r320,[%r517+-2048]; ld.f32 %r321,[%r517+-2044]; .loc 1 2796 11 mul.f32 %r323,%r315,%r320; mul.f32 %r1353,%r315,%r321; neg.f32 %r1354,%r1353; fma.rn.f32 %r324,%r314,%r320,%r1354; fma.rn.f32 %r325,%r314,%r321,%r323; .loc 1 2795 14 add.f32 %r201,%r201,%r324; add.f32 
%r189,%r189,%r325; add.u64 %r515,%r649,%r523; ld.f32 %r326,[%r515]; ld.f32 %r327,[%r515+4]; .loc 1 2798 11 mul.f32 %r329,%r313,%r326; mul.f32 %r1355,%r313,%r327; neg.f32 %r1356,%r1355; fma.rn.f32 %r330,%r312,%r326,%r1356; fma.rn.f32 %r331,%r312,%r327,%r329; .loc 1 2797 14 add.f32 %r282,%r282,%r330; add.f32 %r45,%r45,%r331; .loc 1 2800 11 mul.f32 %r333,%r321,%r326; mul.f32 %r1357,%r321,%r327; neg.f32 %r1358,%r1357; fma.rn.f32 %r334,%r320,%r326,%r1358; fma.rn.f32 %r335,%r320,%r327,%r333; .loc 1 2799 14 add.f32 %r278,%r278,%r334; add.f32 %r270,%r270,%r335; add.u64 %r513,%r642,%r523; ld.f32 %r336,[%r513]; ld.f32 %r337,[%r513+4]; .loc 1 2802 11 mul.f32 %r339,%r313,%r336; mul.f32 %r1359,%r313,%r337; neg.f32 %r1360,%r1359; fma.rn.f32 %r340,%r312,%r336,%r1360; fma.rn.f32 %r341,%r312,%r337,%r339; .loc 1 2801 14 add.f32 %r280,%r280,%r340; add.f32 %r279,%r279,%r341; .loc 1 2804 11 mul.f32 %r343,%r321,%r336; mul.f32 %r1361,%r321,%r337; neg.f32 %r1362,%r1361; fma.rn.f32 %r344,%r320,%r336,%r1362; fma.rn.f32 %r345,%r320,%r337,%r343; .loc 1 2803 14 add.f32 %r300,%r300,%r344; add.f32 %r301,%r301,%r345; add.u64 %r511,%r641,%r523; ld.f32 %r346,[%r511]; ld.f32 %r347,[%r511+4]; .loc 1 2806 11 mul.f32 %r349,%r313,%r346; mul.f32 %r1363,%r313,%r347; neg.f32 %r1364,%r1363; fma.rn.f32 %r350,%r312,%r346,%r1364; fma.rn.f32 %r351,%r312,%r347,%r349; .loc 1 2805 14 add.f32 %r296,%r296,%r350; add.f32 %r299,%r299,%r351; .loc 1 2808 11 mul.f32 %r353,%r321,%r346; mul.f32 %r1365,%r321,%r347; neg.f32 %r1366,%r1365; fma.rn.f32 %r354,%r320,%r346,%r1366; fma.rn.f32 %r355,%r320,%r347,%r353; .loc 1 2807 14 add.f32 %r302,%r302,%r354; add.f32 %r303,%r303,%r355; add.u64 %r509,%r502,%r523; ld.f32 %r356,[%r509+-2048]; ld.f32 %r357,[%r509+-2044]; .loc 1 2810 11 mul.f32 %r359,%r315,%r356; mul.f32 %r1367,%r315,%r357; neg.f32 %r1368,%r1367; fma.rn.f32 %r360,%r314,%r356,%r1368; fma.rn.f32 %r361,%r314,%r357,%r359; .loc 1 2809 14 add.f32 %r272,%r272,%r360; add.f32 %r273,%r273,%r361; add.u64 %r507,%r500,%r523; ld.f32 %r362,[%r507+-2048]; ld.f32 %r363,[%r507+-2044]; .loc 1 2812 11 mul.f32 %r365,%r315,%r362; mul.f32 %r1369,%r315,%r363; neg.f32 %r1370,%r1369; fma.rn.f32 %r366,%r314,%r362,%r1370; fma.rn.f32 %r367,%r314,%r363,%r365; .loc 1 2811 14 add.f32 %r284,%r284,%r366; add.f32 %r283,%r283,%r367; .loc 1 2814 11 mul.f32 %r369,%r327,%r356; mul.f32 %r1371,%r327,%r357; neg.f32 %r1372,%r1371; fma.rn.f32 %r370,%r326,%r356,%r1372; fma.rn.f32 %r371,%r326,%r357,%r369; .loc 1 2813 14 add.f32 %r290,%r290,%r370; add.f32 %r274,%r274,%r371; .loc 1 2816 11 mul.f32 %r373,%r327,%r362; mul.f32 %r1373,%r327,%r363; neg.f32 %r1374,%r1373; fma.rn.f32 %r374,%r326,%r362,%r1374; fma.rn.f32 %r375,%r326,%r363,%r373; .loc 1 2815 14 add.f32 %r276,%r276,%r374; add.f32 %r275,%r275,%r375; .loc 1 2818 11 mul.f32 %r377,%r337,%r356; mul.f32 %r1375,%r337,%r357; neg.f32 %r1376,%r1375; fma.rn.f32 %r378,%r336,%r356,%r1376; fma.rn.f32 %r379,%r336,%r357,%r377; .loc 1 2817 14 add.f32 %r304,%r304,%r378; add.f32 %r305,%r305,%r379; .loc 1 2820 11 mul.f32 %r381,%r337,%r362; mul.f32 %r1377,%r337,%r363; neg.f32 %r1378,%r1377; fma.rn.f32 %r382,%r336,%r362,%r1378; fma.rn.f32 %r383,%r336,%r363,%r381; .loc 1 2819 14 add.f32 %r308,%r308,%r382; add.f32 %r309,%r309,%r383; .loc 1 2822 11 mul.f32 %r385,%r347,%r356; mul.f32 %r1379,%r347,%r357; neg.f32 %r1380,%r1379; fma.rn.f32 %r386,%r346,%r356,%r1380; fma.rn.f32 %r387,%r346,%r357,%r385; .loc 1 2821 14 add.f32 %r306,%r306,%r386; add.f32 %r307,%r307,%r387; .loc 1 2824 11 mul.f32 %r389,%r347,%r362; mul.f32 %r1381,%r347,%r363; neg.f32 %r1382,%r1381; 
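// The repeated mul/neg/fma.rn.f32 groups in this loop are complex
// multiply-accumulates on consecutive f32 pairs (real word at +0, imaginary
// word at +4): acc_re += a_re*b_re - a_im*b_im, acc_im += a_re*b_im + a_im*b_re.
// Each $L65 iteration advances the shared operand pointer by 8 bytes (one
// complex element) and appears to update a 4x4 block of complex accumulators,
// which are stored back to the destination at $L64.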
fma.rn.f32 %r390,%r346,%r362,%r1382; fma.rn.f32 %r391,%r346,%r363,%r389; .loc 1 2823 14 add.f32 %r310,%r310,%r390; add.f32 %r311,%r311,%r391; .loc 1 2791 6 add.u64 %r523,%r523,8; setp.ne.u64 %r1383,%r523,%r679; @ %r1383 bra $L65; $L64: .loc 1 2826 24 st.f32 [%r496],%r184; st.f32 [%r496+4],%r281; .loc 1 2827 28 st.f32 [%r496+8],%r201; st.f32 [%r496+12],%r189; .loc 1 2828 30 st.f32 [%r483],%r282; st.f32 [%r483+4],%r45; .loc 1 2829 34 st.f32 [%r483+8],%r278; st.f32 [%r483+12],%r270; .loc 1 2830 30 st.f32 [%r472],%r280; st.f32 [%r472+4],%r279; .loc 1 2831 34 st.f32 [%r472+8],%r300; st.f32 [%r472+12],%r301; .loc 1 2832 30 st.f32 [%r286],%r296; st.f32 [%r286+4],%r299; .loc 1 2833 34 st.f32 [%r286+8],%r302; st.f32 [%r286+12],%r303; .loc 1 2834 28 st.f32 [%r496+16],%r272; st.f32 [%r496+20],%r273; .loc 1 2835 28 st.f32 [%r496+24],%r284; st.f32 [%r496+28],%r283; .loc 1 2836 34 st.f32 [%r483+16],%r290; st.f32 [%r483+20],%r274; .loc 1 2837 34 st.f32 [%r483+24],%r276; st.f32 [%r483+28],%r275; .loc 1 2838 34 st.f32 [%r472+16],%r304; st.f32 [%r472+20],%r305; .loc 1 2839 34 st.f32 [%r472+24],%r308; st.f32 [%r472+28],%r309; .loc 1 2840 34 st.f32 [%r286+16],%r306; st.f32 [%r286+20],%r307; .loc 1 2841 34 st.f32 [%r286+24],%r310; st.f32 [%r286+28],%r311; .loc 1 2772 9 add.u64 %r506,%r506,8192; add.u64 %r504,%r504,8192; add.u64 %r502,%r502,8192; add.u64 %r500,%r500,8192; add.u64 %r498,%r498,32; setp.ne.u64 %r1384,%r244,%r498; @ %r1384 bra $L66; bra $L67; $L63: .loc 1 2769 29 add.u64 %r250,%r250,4; .loc 1 2769 5 add.u64 %r200,%r200,%r1251; add.u64 %r140,%r140,%r771; add.u64 %r88,%r88,%r67; add.u64 %r649,%r649,%r67; add.u64 %r642,%r642,%r67; add.u64 %r641,%r641,%r67; add.u64 %r640,%r640,%r771; add.u64 %r639,%r639,%r771; add.u64 %r637,%r637,%r771; add.u64 %r51,%r51,%r771; add.u64 %r49,%r49,%r771; setp.le.s64 %r1385,%r250,%r1583; @ %r1385 bra $L68; bra $L57; $L62: add.u64 %r549,%r140,4; add.u64 %r542,%r49,%r259; add.u64 %r128,%r238,%r640; .loc 1 2904 13 mov.u64 %r553,%r552; mov.u64 %r555,%r140; add.u64 %r1532,%r49,%r767; add.u64 %r1534,%r49,%r770; $L71: ld.f32 %r392,[%r555]; ld.f32 %r393,[%r549]; add.u64 %r1387,%r1532,%r555; add.u64 %r543,%r1387,%r200; ld.f32 %r394,[%r543]; add.u64 %r1388,%r1532,%r549; add.u64 %r530,%r1388,%r200; ld.f32 %r395,[%r530]; add.u64 %r1389,%r542,%r555; add.u64 %r539,%r1389,%r200; ld.f32 %r396,[%r539]; add.u64 %r1390,%r542,%r549; add.u64 %r527,%r1390,%r200; ld.f32 %r397,[%r527]; add.u64 %r1391,%r1534,%r555; add.u64 %r535,%r1391,%r200; ld.f32 %r398,[%r535]; add.u64 %r1392,%r1534,%r549; add.u64 %r524,%r1392,%r200; ld.f32 %r399,[%r524]; .loc 1 2853 10 @ %r1581 bra $L69; mov.u64 %r566,0; $L70: add.u64 %r564,%r553,%r566; ld.f32 %r400,[%r564+-2048]; ld.f32 %r401,[%r564+-2044]; add.u64 %r562,%r88,%r566; ld.f32 %r402,[%r562]; ld.f32 %r403,[%r562+4]; .loc 1 2856 13 mul.f32 %r405,%r401,%r402; mul.f32 %r1394,%r401,%r403; neg.f32 %r1395,%r1394; fma.rn.f32 %r406,%r400,%r402,%r1395; fma.rn.f32 %r407,%r400,%r403,%r405; .loc 1 2855 11 add.f32 %r392,%r392,%r406; add.f32 %r393,%r393,%r407; add.u64 %r560,%r649,%r566; ld.f32 %r408,[%r560]; ld.f32 %r409,[%r560+4]; .loc 1 2858 13 mul.f32 %r411,%r401,%r408; mul.f32 %r1396,%r401,%r409; neg.f32 %r1397,%r1396; fma.rn.f32 %r412,%r400,%r408,%r1397; fma.rn.f32 %r413,%r400,%r409,%r411; .loc 1 2857 11 add.f32 %r394,%r394,%r412; add.f32 %r395,%r395,%r413; add.u64 %r558,%r642,%r566; ld.f32 %r414,[%r558]; ld.f32 %r415,[%r558+4]; .loc 1 2860 13 mul.f32 %r417,%r401,%r414; mul.f32 %r1398,%r401,%r415; neg.f32 %r1399,%r1398; fma.rn.f32 %r418,%r400,%r414,%r1399; fma.rn.f32 
%r419,%r400,%r415,%r417; .loc 1 2859 11 add.f32 %r396,%r396,%r418; add.f32 %r397,%r397,%r419; add.u64 %r556,%r641,%r566; ld.f32 %r420,[%r556]; ld.f32 %r421,[%r556+4]; .loc 1 2862 13 mul.f32 %r423,%r401,%r420; mul.f32 %r1400,%r401,%r421; neg.f32 %r1401,%r1400; fma.rn.f32 %r424,%r400,%r420,%r1401; fma.rn.f32 %r425,%r400,%r421,%r423; .loc 1 2861 11 add.f32 %r398,%r398,%r424; add.f32 %r399,%r399,%r425; .loc 1 2853 10 add.u64 %r566,%r566,8; setp.ne.u64 %r1402,%r566,%r679; @ %r1402 bra $L70; $L69: .loc 1 2864 28 st.f32 [%r555],%r392; st.f32 [%r549],%r393; .loc 1 2865 34 st.f32 [%r543],%r394; st.f32 [%r530],%r395; .loc 1 2866 34 st.f32 [%r539],%r396; st.f32 [%r527],%r397; .loc 1 2867 34 st.f32 [%r535],%r398; st.f32 [%r524],%r399; .loc 1 2846 6 add.u64 %r555,%r555,8; add.u64 %r553,%r553,2048; add.u64 %r549,%r549,8; setp.ne.u64 %r1403,%r128,%r555; @ %r1403 bra $L71; bra $L63; $L60: .loc 1 2727 34 add.u64 %r248,%r248,256; .loc 1 2727 8 add.u64 %r736,%r736,2048; add.u64 %r737,%r737,2048; setp.ge.s64 %r1404,%r162,%r248; @ %r1404 bra $L72; bra $L73; $L59: .loc 1 2874 9 @ %r1589 bra $L60; .loc 1 2876 14 add.u64 %r657,%r240,%r248; .loc 1 2876 9 add.u64 %r658,%r657,-1; .loc 1 2759 17 add.u64 %r582,%r1699,%r248; add.u64 %r1406,%r582,%r583; shl.b64 %r1407,%r1406,3; add.u64 %r584,%r224,%r1407; neg.s64 %r1409,%r582; shl.b64 %r1410,%r1409,3; shl.b64 %r1411,%r248,3; add.u64 %r567,%r1411,%r1410; shl.b64 %r1412,%r657,3; add.u64 %r569,%r1412,%r1410; shl.b64 %r1413,%r240,11; add.u64 %r615,%r1230,%r1413; .loc 1 2876 9 mov.u64 %r575,%r574; mov.u64 %r578,%r757; mov.u64 %r579,%r583; mov.u64 %r243,%r644; setp.gt.s64 %r1730,%r1699,%r240; setp.le.s64 %r1544,%r248,%r658; not.b64 %r1562,%r248; add.u64 %r1563,%r1562,%r657; and.b64 %r1564,%r1563,-4; add.u64 %r1565,%r1564,%r248; add.u64 %r1566,%r224,32; $L81: .loc 1 2877 6 @ %r1544 bra $L74; $L80: .loc 1 2901 6 @ %r1730 bra $L75; bra $L76; $L74: add.u64 %r599,%r567,%r584; add.u64 %r598,%r1230,2048; add.u64 %r596,%r1230,4096; add.u64 %r594,%r1230,6144; add.u64 %r592,%r1230,8192; add.u64 %r1420,%r1565,%r579; shl.b64 %r1421,%r1420,3; add.u64 %r585,%r1421,%r1566; $L79: ld.f32 %r426,[%r599]; ld.f32 %r427,[%r599+4]; ld.f32 %r428,[%r599+8]; ld.f32 %r429,[%r599+12]; ld.f32 %r430,[%r599+16]; ld.f32 %r431,[%r599+20]; ld.f32 %r432,[%r599+24]; ld.f32 %r433,[%r599+28]; .loc 1 2884 10 @ %r1581 bra $L77; mov.u64 %r610,0; $L78: add.u64 %r608,%r598,%r610; ld.f32 %r434,[%r608+-2048]; ld.f32 %r435,[%r608+-2044]; add.u64 %r606,%r578,%r610; ld.f32 %r436,[%r606]; ld.f32 %r437,[%r606+4]; .loc 1 2887 13 mul.f32 %r439,%r435,%r436; mul.f32 %r1424,%r435,%r437; neg.f32 %r1425,%r1424; fma.rn.f32 %r440,%r434,%r436,%r1425; fma.rn.f32 %r441,%r434,%r437,%r439; .loc 1 2886 11 add.f32 %r426,%r426,%r440; add.f32 %r427,%r427,%r441; add.u64 %r604,%r596,%r610; ld.f32 %r442,[%r604+-2048]; ld.f32 %r443,[%r604+-2044]; .loc 1 2889 13 mul.f32 %r445,%r437,%r442; mul.f32 %r1426,%r437,%r443; neg.f32 %r1427,%r1426; fma.rn.f32 %r446,%r436,%r442,%r1427; fma.rn.f32 %r447,%r436,%r443,%r445; .loc 1 2888 11 add.f32 %r428,%r428,%r446; add.f32 %r429,%r429,%r447; add.u64 %r602,%r594,%r610; ld.f32 %r448,[%r602+-2048]; ld.f32 %r449,[%r602+-2044]; .loc 1 2891 13 mul.f32 %r451,%r437,%r448; mul.f32 %r1428,%r437,%r449; neg.f32 %r1429,%r1428; fma.rn.f32 %r452,%r436,%r448,%r1429; fma.rn.f32 %r453,%r436,%r449,%r451; .loc 1 2890 11 add.f32 %r430,%r430,%r452; add.f32 %r431,%r431,%r453; add.u64 %r600,%r592,%r610; ld.f32 %r454,[%r600+-2048]; ld.f32 %r455,[%r600+-2044]; .loc 1 2893 13 mul.f32 %r457,%r437,%r454; mul.f32 %r1430,%r437,%r455; neg.f32 
%r1431,%r1430; fma.rn.f32 %r458,%r436,%r454,%r1431; fma.rn.f32 %r459,%r436,%r455,%r457; .loc 1 2892 11 add.f32 %r432,%r432,%r458; add.f32 %r433,%r433,%r459; .loc 1 2884 10 add.u64 %r610,%r610,8; setp.ne.u64 %r1432,%r610,%r679; @ %r1432 bra $L78; $L77: .loc 1 2895 28 st.f32 [%r599],%r426; st.f32 [%r599+4],%r427; .loc 1 2896 32 st.f32 [%r599+8],%r428; st.f32 [%r599+12],%r429; .loc 1 2897 32 st.f32 [%r599+16],%r430; st.f32 [%r599+20],%r431; .loc 1 2898 32 st.f32 [%r599+24],%r432; st.f32 [%r599+28],%r433; .loc 1 2877 6 add.u64 %r599,%r599,32; add.u64 %r598,%r598,8192; add.u64 %r596,%r596,8192; add.u64 %r594,%r594,8192; add.u64 %r592,%r592,8192; setp.ne.u64 %r1433,%r585,%r599; @ %r1433 bra $L79; bra $L80; $L76: .loc 1 2874 39 add.u64 %r243,%r243,1; .loc 1 2874 9 add.u64 %r584,%r584,%r797; add.u64 %r579,%r579,%r664; add.u64 %r578,%r578,%r784; add.u64 %r575,%r575,%r784; setp.ne.u64 %r1434,%r1588,%r243; @ %r1434 bra $L81; bra $L60; $L75: add.u64 %r617,%r569,%r584; .loc 1 2904 13 mov.u64 %r616,%r615; $L84: ld.f32 %r460,[%r617]; ld.f32 %r461,[%r617+4]; .loc 1 2905 10 @ %r1581 bra $L82; add.u64 %r621,%r616,4; mov.u64 %r256,%r578; mov.u64 %r297,%r616; sub.u64 %r1603,%r575,%r578; add.u64 %r1604,%r1603,-8; shr.u64 %r1602,%r1604,3; add.u64 %r1605,%r1602,1; and.b64 %r1606,%r1605,3; setp.eq.u64 %r1609,%r1606,0; @ %r1609 bra $L83; setp.eq.u64 %r1608,%r1606,1; @ %r1608 bra $L286; setp.eq.u64 %r1607,%r1606,2; @ %r1607 bra $L287; bra $L342; $L83: ld.f32 %r462,[%r297]; ld.f32 %r463,[%r621]; ld.f32 %r464,[%r256]; ld.f32 %r465,[%r256+4]; .loc 1 2908 13 mul.f32 %r467,%r463,%r464; mul.f32 %r1437,%r463,%r465; neg.f32 %r1438,%r1437; fma.rn.f32 %r468,%r462,%r464,%r1438; fma.rn.f32 %r469,%r462,%r465,%r467; .loc 1 2907 11 add.f32 %r1731,%r460,%r468; add.f32 %r1732,%r461,%r469; .loc 1 2905 10 add.u64 %r1611,%r256,8; ld.f32 %r1736,[%r297+8]; ld.f32 %r1737,[%r621+8]; ld.f32 %r1738,[%r1611]; ld.f32 %r1739,[%r1611+4]; .loc 1 2908 13 mul.f32 %r1740,%r1737,%r1738; mul.f32 %r1741,%r1737,%r1739; neg.f32 %r1742,%r1741; fma.rn.f32 %r1743,%r1736,%r1738,%r1742; fma.rn.f32 %r1744,%r1736,%r1739,%r1740; .loc 1 2907 11 add.f32 %r1745,%r1731,%r1743; add.f32 %r1746,%r1732,%r1744; .loc 1 2905 10 add.u64 %r1748,%r256,16; ld.f32 %r1751,[%r297+16]; ld.f32 %r1752,[%r621+16]; ld.f32 %r1753,[%r1748]; ld.f32 %r1754,[%r1748+4]; .loc 1 2908 13 mul.f32 %r1755,%r1752,%r1753; mul.f32 %r1756,%r1752,%r1754; neg.f32 %r1757,%r1756; fma.rn.f32 %r1758,%r1751,%r1753,%r1757; fma.rn.f32 %r1759,%r1751,%r1754,%r1755; .loc 1 2907 11 add.f32 %r1760,%r1745,%r1758; add.f32 %r1761,%r1746,%r1759; .loc 1 2905 10 add.u64 %r1763,%r256,24; ld.f32 %r1766,[%r297+24]; ld.f32 %r1767,[%r621+24]; ld.f32 %r1768,[%r1763]; ld.f32 %r1769,[%r1763+4]; .loc 1 2908 13 mul.f32 %r1770,%r1767,%r1768; mul.f32 %r1771,%r1767,%r1769; neg.f32 %r1772,%r1771; fma.rn.f32 %r1773,%r1766,%r1768,%r1772; fma.rn.f32 %r1774,%r1766,%r1769,%r1770; .loc 1 2907 11 add.f32 %r460,%r1760,%r1773; add.f32 %r461,%r1761,%r1774; .loc 1 2905 10 add.u64 %r297,%r297,32; add.u64 %r256,%r256,32; add.u64 %r621,%r621,32; setp.ne.u64 %r1775,%r256,%r575; @ %r1775 bra $L83; $L82: .loc 1 2910 28 st.f32 [%r617],%r460; st.f32 [%r617+4],%r461; .loc 1 2901 6 add.u64 %r617,%r617,8; add.u64 %r616,%r616,2048; setp.ne.u64 %r1440,%r584,%r617; @ %r1440 bra $L84; bra $L76; $L20: .loc 1 2958 11 cvt.u16.u32 %r1441,%r627; setp.eq.u16 %r1442,%r1441,1; @ ! 
%r1442 bra $L343; bra $L85; $L31: .loc 1 2920 38 set.u32.eq.u64 %r1444,%r150,1; neg.s32 %r1445,%r1444; .loc 1 2920 26 cvt.u16.u32 %r1447,%r1445; and.b16 %r1446,%r1447,%r1187; .loc 1 2920 11 cvt.u32.u16 %r1449,%r1446; cvt.u16.u8 %r1450,%r1449; setp.eq.u16 %r1451,%r1450,0; @ %r1451 bra $L20; .loc 1 2920 43 setp.ne.u64 %r1452,%r663,1; @ %r1452 bra $L20; $L121: .loc 1 2922 10 cvt.u16.u32 %r1453,%r627; setp.ne.u16 %r1454,%r1453,1; @ %r1454 bra $L87; .loc 1 2948 4 setp.gt.s64 %r1455,%r163,0; @ %r1455 bra $L88; bra $L1; $L87: .loc 1 2929 4 setp.le.s64 %r1456,%r163,0; @ %r1456 bra $L1; shl.b64 %r830,%r664,3; shl.b64 %r1457,%r162,3; add.u64 %r829,%r222,%r1457; shl.b64 %r836,%r151,3; mov.u64 %r834,%r221; shl.b64 %r818,%r298,3; shl.b64 %r1458,%r638,3; add.u64 %r822,%r220,%r1458; mov.u64 %r827,0; .loc 1 2929 11 mov.u64 %r258,%r827; setp.gt.s64 %r1776,%r162,0; setp.le.s64 %r1597,%r638,0; $L91: .loc 1 2933 8 @ %r1776 bra $L90; $L95: .loc 1 2929 29 add.u64 %r258,%r258,1; .loc 1 2929 4 add.u64 %r827,%r827,%r664; add.u64 %r829,%r829,%r830; add.u64 %r834,%r834,%r836; setp.ne.u64 %r1460,%r163,%r258; @ %r1460 bra $L91; bra $L1; $L90: shl.b64 %r1461,%r827,3; add.u64 %r815,%r222,%r1461; .loc 1 2933 8 mov.u64 %r821,%r822; mov.u64 %r816,%r220; $L94: .loc 1 2937 5 @ %r1597 bra $L140; add.u64 %r809,%r816,4; add.u64 %r810,%r834,4; mov.u64 %r808,%r834; mov.u64 %r807,%r816; mov.f32 %r108,0f00000000; mov.f32 %r103,%r108; sub.u64 %r1642,%r821,%r816; add.u64 %r1643,%r1642,-8; shr.u64 %r1641,%r1643,3; add.u64 %r1644,%r1641,1; and.b64 %r1645,%r1644,3; setp.eq.u64 %r1648,%r1645,0; @ %r1648 bra $L93; setp.eq.u64 %r1647,%r1645,1; @ %r1647 bra $L288; setp.eq.u64 %r1646,%r1645,2; @ %r1646 bra $L289; bra $L344; $L93: ld.f32 %r186,[%r807]; ld.f32 %r193,[%r809]; ld.f32 %r177,[%r808]; ld.f32 %r175,[%r810]; .loc 1 2938 23 mul.f32 %r44,%r177,%r193; mul.f32 %r1463,%r175,%r193; neg.f32 %r1464,%r1463; fma.rn.f32 %r43,%r177,%r186,%r1464; fma.rn.f32 %r101,%r175,%r186,%r44; .loc 1 2938 9 add.f32 %r1777,%r103,%r43; add.f32 %r1778,%r108,%r101; ld.f32 %r1783,[%r807+8]; ld.f32 %r1784,[%r809+8]; ld.f32 %r1785,[%r808+8]; ld.f32 %r1786,[%r810+8]; .loc 1 2938 23 mul.f32 %r1787,%r1785,%r1784; mul.f32 %r1788,%r1786,%r1784; neg.f32 %r1789,%r1788; fma.rn.f32 %r1790,%r1785,%r1783,%r1789; fma.rn.f32 %r1791,%r1786,%r1783,%r1787; .loc 1 2938 9 add.f32 %r1792,%r1777,%r1790; add.f32 %r1793,%r1778,%r1791; ld.f32 %r1799,[%r807+16]; ld.f32 %r1800,[%r809+16]; ld.f32 %r1801,[%r808+16]; ld.f32 %r1802,[%r810+16]; .loc 1 2938 23 mul.f32 %r1803,%r1801,%r1800; mul.f32 %r1804,%r1802,%r1800; neg.f32 %r1805,%r1804; fma.rn.f32 %r1806,%r1801,%r1799,%r1805; fma.rn.f32 %r1807,%r1802,%r1799,%r1803; .loc 1 2938 9 add.f32 %r1808,%r1792,%r1806; add.f32 %r1809,%r1793,%r1807; ld.f32 %r1815,[%r807+24]; ld.f32 %r1816,[%r809+24]; ld.f32 %r1817,[%r808+24]; ld.f32 %r1818,[%r810+24]; .loc 1 2938 23 mul.f32 %r1819,%r1817,%r1816; mul.f32 %r1820,%r1818,%r1816; neg.f32 %r1821,%r1820; fma.rn.f32 %r1822,%r1817,%r1815,%r1821; fma.rn.f32 %r1823,%r1818,%r1815,%r1819; .loc 1 2938 9 add.f32 %r103,%r1808,%r1822; add.f32 %r108,%r1809,%r1823; .loc 1 2937 5 add.u64 %r807,%r807,32; add.u64 %r808,%r808,32; add.u64 %r809,%r809,32; add.u64 %r810,%r810,32; setp.ne.u64 %r1824,%r807,%r821; @ %r1824 bra $L93; bra $L92; $L140: mov.f32 %r108,0f00000000; mov.f32 %r103,%r108; $L92: .loc 1 2939 15 st.f32 [%r815],%r103; st.f32 [%r815+4],%r108; .loc 1 2933 8 add.u64 %r815,%r815,8; add.u64 %r816,%r816,%r818; add.u64 %r821,%r821,%r818; setp.ne.u64 %r1466,%r815,%r829; @ %r1466 bra $L94; bra $L95; $L88: shl.b64 
%r853,%r664,3; mov.u64 %r851,%r222; shl.b64 %r842,%r298,3; .loc 1 2948 4 mov.u64 %r854,0; .loc 1 2948 11 mov.u64 %r257,%r854; setp.le.s64 %r1825,%r638,0; $L98: .loc 1 2952 8 @ %r1825 bra $L141; mov.u64 %r840,%r220; shl.b64 %r1468,%r854,3; add.u64 %r843,%r1468,%r221; add.u64 %r845,%r220,4; add.u64 %r1469,%r638,%r854; shl.b64 %r1470,%r1469,3; add.u64 %r861,%r1470,%r221; mov.f32 %r37,0f00000000; mov.f32 %r38,%r37; sub.u64 %r1654,%r861,%r843; add.u64 %r1655,%r1654,-8; shr.u64 %r1653,%r1655,3; add.u64 %r1656,%r1653,1; and.b64 %r1657,%r1656,3; setp.eq.u64 %r1660,%r1657,0; @ %r1660 bra $L97; setp.eq.u64 %r1659,%r1657,1; @ %r1659 bra $L290; setp.eq.u64 %r1658,%r1657,2; @ %r1658 bra $L291; bra $L345; $L97: ld.f32 %r187,[%r840]; ld.f32 %r194,[%r845]; ld.f32 %r172,[%r843]; ld.f32 %r169,[%r843+4]; .loc 1 2953 26 mul.f32 %r87,%r172,%r194; mul.f32 %r1471,%r169,%r194; neg.f32 %r1472,%r1471; fma.rn.f32 %r69,%r172,%r187,%r1472; fma.rn.f32 %r68,%r169,%r187,%r87; .loc 1 2953 5 add.f32 %r1826,%r38,%r69; add.f32 %r1827,%r37,%r68; .loc 1 2952 8 add.u64 %r1828,%r840,%r842; add.u64 %r1661,%r843,8; add.u64 %r1830,%r845,%r842; ld.f32 %r1831,[%r1828]; ld.f32 %r1832,[%r1830]; ld.f32 %r1833,[%r1661]; ld.f32 %r1834,[%r1661+4]; .loc 1 2953 26 mul.f32 %r1835,%r1833,%r1832; mul.f32 %r1836,%r1834,%r1832; neg.f32 %r1837,%r1836; fma.rn.f32 %r1838,%r1833,%r1831,%r1837; fma.rn.f32 %r1839,%r1834,%r1831,%r1835; .loc 1 2953 5 add.f32 %r1840,%r1826,%r1838; add.f32 %r1841,%r1827,%r1839; .loc 1 2952 8 add.u64 %r1842,%r1828,%r842; add.u64 %r1843,%r843,16; add.u64 %r1844,%r1830,%r842; ld.f32 %r1846,[%r1842]; ld.f32 %r1847,[%r1844]; ld.f32 %r1848,[%r1843]; ld.f32 %r1849,[%r1843+4]; .loc 1 2953 26 mul.f32 %r1850,%r1848,%r1847; mul.f32 %r1851,%r1849,%r1847; neg.f32 %r1852,%r1851; fma.rn.f32 %r1853,%r1848,%r1846,%r1852; fma.rn.f32 %r1854,%r1849,%r1846,%r1850; .loc 1 2953 5 add.f32 %r1855,%r1840,%r1853; add.f32 %r1856,%r1841,%r1854; .loc 1 2952 8 add.u64 %r1857,%r1842,%r842; add.u64 %r1858,%r843,24; add.u64 %r1859,%r1844,%r842; ld.f32 %r1861,[%r1857]; ld.f32 %r1862,[%r1859]; ld.f32 %r1863,[%r1858]; ld.f32 %r1864,[%r1858+4]; .loc 1 2953 26 mul.f32 %r1865,%r1863,%r1862; mul.f32 %r1866,%r1864,%r1862; neg.f32 %r1867,%r1866; fma.rn.f32 %r1868,%r1863,%r1861,%r1867; fma.rn.f32 %r1869,%r1864,%r1861,%r1865; .loc 1 2953 5 add.f32 %r38,%r1855,%r1868; add.f32 %r37,%r1856,%r1869; .loc 1 2952 8 add.u64 %r840,%r1857,%r842; add.u64 %r843,%r843,32; add.u64 %r845,%r1859,%r842; setp.ne.u64 %r1870,%r843,%r861; @ %r1870 bra $L97; bra $L96; $L141: mov.f32 %r37,0f00000000; mov.f32 %r38,%r37; $L96: .loc 1 2954 25 st.f32 [%r851],%r38; st.f32 [%r851+4],%r37; .loc 1 2948 29 add.u64 %r257,%r257,1; .loc 1 2948 4 add.u64 %r851,%r851,%r853; add.u64 %r854,%r854,%r151; setp.ne.u64 %r1474,%r163,%r257; @ %r1474 bra $L98; bra $L1; $L85: .loc 1 2963 7 setp.le.s64 %r1475,%r163,0; @ %r1475 bra $L1; shl.b64 %r872,%r844,3; mov.u64 %r870,%r222; shl.b64 %r864,%r298,3; shl.b64 %r867,%r663,3; mov.u64 %r873,0; .loc 1 2963 14 mov.u64 %r268,%r873; setp.le.s64 %r1871,%r638,0; $L101: .loc 1 2967 4 @ %r1871 bra $L142; mov.u64 %r862,%r220; shl.b64 %r1477,%r873,3; add.u64 %r865,%r221,%r1477; mov.f32 %r152,0f00000000; mov.f32 %r188,%r152; .loc 1 2967 11 mov.u64 %r269,0; and.b64 %r1665,%r638,3; setp.eq.u64 %r1668,%r1665,0; @ %r1668 bra $L100; setp.eq.u64 %r1667,%r1665,1; @ %r1667 bra $L292; setp.eq.u64 %r1666,%r1665,2; @ %r1666 bra $L293; bra $L346; $L100: ld.f32 %r277,[%r862]; ld.f32 %r206,[%r862+4]; ld.f32 %r195,[%r865]; ld.f32 %r205,[%r865+4]; .loc 1 2968 29 mul.f32 %r203,%r195,%r206; 
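// The $L100 loop is a dot product unrolled four times: each pass folds four
// complex multiply-accumulates into the running sums %r188 (real part) and
// %r152 (imaginary part), stepping the operand pointers by the byte strides
// held in %r864 and %r867; the accumulated pair is stored at $L99 (.loc 1 2969).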
mul.f32 %r1478,%r205,%r206; neg.f32 %r1479,%r1478; fma.rn.f32 %r191,%r195,%r277,%r1479; fma.rn.f32 %r202,%r205,%r277,%r203; .loc 1 2968 8 add.f32 %r1872,%r188,%r191; add.f32 %r1873,%r152,%r202; .loc 1 2967 4 add.u64 %r1875,%r862,%r864; add.u64 %r1876,%r865,%r867; ld.f32 %r1877,[%r1875]; ld.f32 %r1878,[%r1875+4]; ld.f32 %r1879,[%r1876]; ld.f32 %r1880,[%r1876+4]; .loc 1 2968 29 mul.f32 %r1881,%r1879,%r1878; mul.f32 %r1882,%r1880,%r1878; neg.f32 %r1883,%r1882; fma.rn.f32 %r1884,%r1879,%r1877,%r1883; fma.rn.f32 %r1885,%r1880,%r1877,%r1881; .loc 1 2968 8 add.f32 %r1886,%r1872,%r1884; add.f32 %r1887,%r1873,%r1885; .loc 1 2967 4 add.u64 %r1889,%r1875,%r864; add.u64 %r1890,%r1876,%r867; ld.f32 %r1892,[%r1889]; ld.f32 %r1893,[%r1889+4]; ld.f32 %r1894,[%r1890]; ld.f32 %r1895,[%r1890+4]; .loc 1 2968 29 mul.f32 %r1896,%r1894,%r1893; mul.f32 %r1897,%r1895,%r1893; neg.f32 %r1898,%r1897; fma.rn.f32 %r1899,%r1894,%r1892,%r1898; fma.rn.f32 %r1900,%r1895,%r1892,%r1896; .loc 1 2968 8 add.f32 %r1901,%r1886,%r1899; add.f32 %r1902,%r1887,%r1900; .loc 1 2967 4 add.u64 %r1904,%r1889,%r864; add.u64 %r1905,%r1890,%r867; ld.f32 %r1907,[%r1904]; ld.f32 %r1908,[%r1904+4]; ld.f32 %r1909,[%r1905]; ld.f32 %r1910,[%r1905+4]; .loc 1 2968 29 mul.f32 %r1911,%r1909,%r1908; mul.f32 %r1912,%r1910,%r1908; neg.f32 %r1913,%r1912; fma.rn.f32 %r1914,%r1909,%r1907,%r1913; fma.rn.f32 %r1915,%r1910,%r1907,%r1911; .loc 1 2968 8 add.f32 %r188,%r1901,%r1914; add.f32 %r152,%r1902,%r1915; .loc 1 2967 28 add.u64 %r269,%r269,4; .loc 1 2967 4 add.u64 %r862,%r1904,%r864; add.u64 %r865,%r1905,%r867; setp.ne.u64 %r1916,%r638,%r269; @ %r1916 bra $L100; bra $L99; $L142: mov.f32 %r152,0f00000000; mov.f32 %r188,%r152; $L99: .loc 1 2969 21 st.f32 [%r870],%r188; st.f32 [%r870+4],%r152; .loc 1 2963 32 add.u64 %r268,%r268,1; .loc 1 2963 7 add.u64 %r870,%r870,%r872; add.u64 %r873,%r873,%r151; setp.ne.u64 %r1481,%r163,%r268; @ %r1481 bra $L101; bra $L1; $L343: .loc 1 2972 11 setp.gt.s64 %r1482,%r150,%r298; @ %r1482 bra $L102; .loc 1 2993 7 setp.gt.s64 %r1483,%r163,0; @ %r1483 bra $L103; bra $L1; $L102: .loc 1 2974 7 setp.gt.s64 %r1484,%r163,0; @ %r1484 bra $L104; bra $L1; $L108: shl.b64 %r1485,%r914,3; add.u64 %r910,%r222,%r1485; .loc 1 2975 9 mov.u64 %r267,0; and.b64 %r1681,%r162,3; setp.eq.u64 %r1684,%r1681,0; @ %r1684 bra $L105; setp.eq.u64 %r1683,%r1681,1; @ %r1683 bra $L294; setp.eq.u64 %r1682,%r1681,2; @ %r1682 bra $L295; bra $L347; $L105: .loc 1 2976 34 st.f32 [%r910],%r1599; st.f32 [%r910+4],%r1600; .loc 1 2975 2 add.u64 %r1920,%r910,%r912; .loc 1 2976 34 st.f32 [%r1920],%r1599; st.f32 [%r1920+4],%r1600; .loc 1 2975 2 add.u64 %r1922,%r1920,%r912; .loc 1 2976 34 st.f32 [%r1922],%r1599; st.f32 [%r1922+4],%r1600; .loc 1 2975 2 add.u64 %r1925,%r1922,%r912; .loc 1 2976 34 st.f32 [%r1925],%r1599; st.f32 [%r1925+4],%r1600; .loc 1 2975 27 add.u64 %r267,%r267,4; .loc 1 2975 2 add.u64 %r910,%r1925,%r912; setp.ne.u64 %r1927,%r162,%r267; @ %r1927 bra $L105; $L109: .loc 1 2974 32 add.u64 %r266,%r288,1; .loc 1 2974 7 add.u64 %r914,%r914,%r664; setp.ne.u64 %r1489,%r163,%r266; @ %r1489 bra $L143; shl.b64 %r902,%r664,3; mov.u64 %r900,%r222; shl.b64 %r893,%r663,3; shl.b64 %r883,%r298,3; mov.u64 %r905,0; mov.u64 %r289,%r905; setp.le.s64 %r1928,%r638,0; add.u64 %r1598,%r221,4; bra $L107; $L104: shl.b64 %r912,%r844,3; mov.u64 %r914,0; .loc 1 2974 14 mov.u64 %r288,%r914; setp.gt.s64 %r1929,%r162,0; ld.const.f32 %r1599,[$LC9]; ld.const.f32 %r1600,[$LC9+4]; bra $L106; $L143: mov.u64 %r288,%r266; $L106: .loc 1 2975 2 @ %r1929 bra $L108; bra $L109; $L112: shl.b64 
%r1491,%r889,3; add.u64 %r881,%r220,%r1491; add.u64 %r884,%r900,4; .loc 1 2980 4 mov.u64 %r878,%r900; .loc 1 2980 11 mov.u64 %r265,0; and.b64 %r1673,%r162,3; setp.eq.u64 %r1676,%r1673,0; @ %r1676 bra $L110; setp.eq.u64 %r1675,%r1673,1; @ %r1675 bra $L296; setp.eq.u64 %r1674,%r1673,2; @ %r1674 bra $L297; bra $L348; $L110: ld.f32 %r209,[%r881]; ld.f32 %r153,[%r881+4]; ld.f32 %r208,[%r891]; ld.f32 %r156,[%r894]; .loc 1 2983 37 mul.f32 %r155,%r153,%r208; ld.f32 %r1492,[%r878]; fma.rn.f32 %r495,%r208,%r209,%r1492; fma.rn.f32 %r160,%r156,%r209,%r155; .loc 1 2982 36 neg.f32 %r1493,%r153; fma.rn.f32 %r207,%r1493,%r156,%r495; ld.f32 %r1494,[%r884]; add.f32 %r154,%r160,%r1494; st.f32 [%r878],%r207; st.f32 [%r884],%r154; .loc 1 2980 4 add.u64 %r1931,%r878,%r912; add.u64 %r1932,%r881,%r883; add.u64 %r1933,%r884,%r912; ld.f32 %r1934,[%r1932]; ld.f32 %r1935,[%r1932+4]; ld.f32 %r1936,[%r891]; ld.f32 %r1937,[%r894]; .loc 1 2983 37 mul.f32 %r1938,%r1935,%r1936; ld.f32 %r1939,[%r1931]; fma.rn.f32 %r1940,%r1936,%r1934,%r1939; fma.rn.f32 %r1941,%r1937,%r1934,%r1938; .loc 1 2982 36 neg.f32 %r1942,%r1935; fma.rn.f32 %r1943,%r1942,%r1937,%r1940; ld.f32 %r1944,[%r1933]; add.f32 %r1945,%r1941,%r1944; st.f32 [%r1931],%r1943; st.f32 [%r1933],%r1945; .loc 1 2980 4 add.u64 %r1947,%r1931,%r912; add.u64 %r1948,%r1932,%r883; add.u64 %r1949,%r1933,%r912; ld.f32 %r1951,[%r1948]; ld.f32 %r1952,[%r1948+4]; ld.f32 %r1953,[%r891]; ld.f32 %r1954,[%r894]; .loc 1 2983 37 mul.f32 %r1955,%r1952,%r1953; ld.f32 %r1956,[%r1947]; fma.rn.f32 %r1957,%r1953,%r1951,%r1956; fma.rn.f32 %r1958,%r1954,%r1951,%r1955; .loc 1 2982 36 neg.f32 %r1959,%r1952; fma.rn.f32 %r1960,%r1959,%r1954,%r1957; ld.f32 %r1961,[%r1949]; add.f32 %r1962,%r1958,%r1961; st.f32 [%r1947],%r1960; st.f32 [%r1949],%r1962; .loc 1 2980 4 add.u64 %r1964,%r1947,%r912; add.u64 %r1965,%r1948,%r883; add.u64 %r1966,%r1949,%r912; ld.f32 %r1968,[%r1965]; ld.f32 %r1969,[%r1965+4]; ld.f32 %r1970,[%r891]; ld.f32 %r1971,[%r894]; .loc 1 2983 37 mul.f32 %r1972,%r1969,%r1970; ld.f32 %r1973,[%r1964]; fma.rn.f32 %r1974,%r1970,%r1968,%r1973; fma.rn.f32 %r1975,%r1971,%r1968,%r1972; .loc 1 2982 36 neg.f32 %r1976,%r1969; fma.rn.f32 %r1977,%r1976,%r1971,%r1974; ld.f32 %r1978,[%r1966]; add.f32 %r1979,%r1975,%r1978; st.f32 [%r1964],%r1977; st.f32 [%r1966],%r1979; .loc 1 2980 29 add.u64 %r265,%r265,4; .loc 1 2980 4 add.u64 %r878,%r1964,%r912; add.u64 %r881,%r1965,%r883; add.u64 %r884,%r1966,%r912; setp.ne.u64 %r1980,%r162,%r265; @ %r1980 bra $L110; $L113: .loc 1 2979 26 add.u64 %r264,%r264,1; .loc 1 2979 2 add.u64 %r889,%r889,%r150; add.u64 %r891,%r891,%r893; add.u64 %r894,%r894,%r893; setp.eq.u64 %r1496,%r638,%r264; @ %r1496 bra $L111; $L114: .loc 1 2980 4 @ %r1929 bra $L112; bra $L113; $L111: .loc 1 2978 32 add.u64 %r263,%r289,1; .loc 1 2978 7 add.u64 %r900,%r900,%r902; add.u64 %r905,%r905,%r151; setp.eq.u64 %r1498,%r288,%r289; @ %r1498 bra $L1; mov.u64 %r289,%r263; $L107: .loc 1 2979 2 @ %r1928 bra $L111; shl.b64 %r908,%r905,3; add.u64 %r891,%r221,%r908; add.u64 %r894,%r1598,%r908; mov.u64 %r889,0; .loc 1 2979 9 mov.u64 %r264,%r889; bra $L114; $L103: shl.b64 %r929,%r844,3; shl.b64 %r920,%r150,3; shl.b64 %r924,%r663,3; .loc 1 2993 7 mov.u64 %r938,0; mov.u64 %r936,%r938; .loc 1 2993 14 mov.u64 %r260,%r938; setp.gt.s64 %r1981,%r162,0; setp.le.s64 %r1601,%r638,0; $L116: .loc 1 2997 4 @ %r1981 bra $L115; $L120: .loc 1 2993 32 add.u64 %r260,%r260,1; .loc 1 2993 7 add.u64 %r936,%r936,%r664; add.u64 %r938,%r938,%r151; setp.ne.u64 %r1502,%r163,%r260; @ %r1502 bra $L116; bra $L1; $L115: shl.b64 
%r1503,%r936,3; add.u64 %r927,%r222,%r1503; shl.b64 %r1504,%r938,3; add.u64 %r944,%r221,%r1504; .loc 1 2997 4 mov.u64 %r931,0; .loc 1 2997 11 mov.u64 %r261,%r931; $L119: .loc 1 3001 8 @ %r1601 bra $L144; shl.b64 %r1506,%r931,3; add.u64 %r918,%r220,%r1506; mov.u64 %r922,%r944; mov.f32 %r161,0f00000000; mov.f32 %r211,%r161; .loc 1 3001 15 mov.u64 %r262,0; and.b64 %r1689,%r638,3; setp.eq.u64 %r1692,%r1689,0; @ %r1692 bra $L118; setp.eq.u64 %r1691,%r1689,1; @ %r1691 bra $L298; setp.eq.u64 %r1690,%r1689,2; @ %r1690 bra $L299; bra $L349; $L118: ld.f32 %r285,[%r918]; ld.f32 %r214,[%r918+4]; ld.f32 %r159,[%r922]; ld.f32 %r213,[%r922+4]; .loc 1 3002 28 mul.f32 %r271,%r159,%r214; mul.f32 %r1507,%r213,%r214; neg.f32 %r1508,%r1507; fma.rn.f32 %r212,%r159,%r285,%r1508; fma.rn.f32 %r158,%r213,%r285,%r271; .loc 1 3002 5 add.f32 %r1983,%r211,%r212; add.f32 %r1984,%r161,%r158; .loc 1 3001 8 add.u64 %r1986,%r918,%r920; add.u64 %r1987,%r922,%r924; ld.f32 %r1988,[%r1986]; ld.f32 %r1989,[%r1986+4]; ld.f32 %r1990,[%r1987]; ld.f32 %r1991,[%r1987+4]; .loc 1 3002 28 mul.f32 %r1992,%r1990,%r1989; mul.f32 %r1993,%r1991,%r1989; neg.f32 %r1994,%r1993; fma.rn.f32 %r1995,%r1990,%r1988,%r1994; fma.rn.f32 %r1996,%r1991,%r1988,%r1992; .loc 1 3002 5 add.f32 %r1997,%r1983,%r1995; add.f32 %r1998,%r1984,%r1996; .loc 1 3001 8 add.u64 %r2000,%r1986,%r920; add.u64 %r2001,%r1987,%r924; ld.f32 %r2003,[%r2000]; ld.f32 %r2004,[%r2000+4]; ld.f32 %r2005,[%r2001]; ld.f32 %r2006,[%r2001+4]; .loc 1 3002 28 mul.f32 %r2007,%r2005,%r2004; mul.f32 %r2008,%r2006,%r2004; neg.f32 %r2009,%r2008; fma.rn.f32 %r2010,%r2005,%r2003,%r2009; fma.rn.f32 %r2011,%r2006,%r2003,%r2007; .loc 1 3002 5 add.f32 %r2012,%r1997,%r2010; add.f32 %r2013,%r1998,%r2011; .loc 1 3001 8 add.u64 %r2015,%r2000,%r920; add.u64 %r2016,%r2001,%r924; ld.f32 %r2018,[%r2015]; ld.f32 %r2019,[%r2015+4]; ld.f32 %r2020,[%r2016]; ld.f32 %r2021,[%r2016+4]; .loc 1 3002 28 mul.f32 %r2022,%r2020,%r2019; mul.f32 %r2023,%r2021,%r2019; neg.f32 %r2024,%r2023; fma.rn.f32 %r2025,%r2020,%r2018,%r2024; fma.rn.f32 %r2026,%r2021,%r2018,%r2022; .loc 1 3002 5 add.f32 %r211,%r2012,%r2025; add.f32 %r161,%r2013,%r2026; .loc 1 3001 32 add.u64 %r262,%r262,4; .loc 1 3001 8 add.u64 %r918,%r2015,%r920; add.u64 %r922,%r2016,%r924; setp.ne.u64 %r2027,%r638,%r262; @ %r2027 bra $L118; bra $L117; $L144: mov.f32 %r161,0f00000000; mov.f32 %r211,%r161; $L117: .loc 1 3003 27 st.f32 [%r927],%r211; st.f32 [%r927+4],%r161; .loc 1 2997 29 add.u64 %r261,%r261,1; .loc 1 2997 4 add.u64 %r927,%r927,%r929; add.u64 %r931,%r931,%r298; setp.ne.u64 %r1510,%r162,%r261; @ %r1510 bra $L119; bra $L120; $L32: .loc 1 2920 11 setp.eq.u64 %r1511,%r150,1; @ %r1511 bra $L121; bra $L20; $L2: .loc 1 2476 6 ld.u64 %r1512,[%r945]; setp.ne.u64 %r1513,%r1512,0; @ %r1513 bra $L122; .loc 1 2485 4 ld.u64 %r1515,[%r946+56]; add.u64 %r1514,%r1515,1; ld.u64 %r1516,[%r946+48]; sub.u64 %r668,%r1514,%r1516; .loc 1 2483 15 ld.s8 %r1518,[%r947+28]; cvt.u16.u32 %r1517,%r1518; setp.eq.u16 %r1519,%r1517,1; @ %r1519 bra $L123; bra $L124; $L126: .loc 1 2560 16 ld.u64 %r298,[%r946+40]; .loc 1 2548 27 mov.u64 %r844,%r664; bra $L125; $L14: ld.u64 %r664,[%r945+40]; .loc 1 2543 6 ld.s8 %r1521,[%r945+28]; cvt.u16.u32 %r1520,%r1521; setp.eq.u16 %r1522,%r1520,1; @ ! 
%r1522 bra $L127; bra $L126; $L11: .loc 1 2560 16 ld.u64 %r298,[%r946+40]; mov.u32 %r627,%r901; .loc 1 2548 27 mov.u64 %r664,%r844; .loc 1 2563 14 mov.u64 %r162,1; .loc 1 2561 16 mov.u64 %r150,%r162; bra $L12; $L5: .loc 1 2485 4 ld.u64 %r1524,[%r946+56]; add.u64 %r1523,%r1524,1; ld.u64 %r1525,[%r946+48]; sub.u64 %r668,%r1523,%r1525; bra $L124; $L128: .loc 1 2509 17 ld.u64 %r1527,[%r945+56]; add.u64 %r1526,%r1527,1; .loc 1 2509 15 ld.u64 %r1528,[%r945+48]; sub.u64 %r638,%r1526,%r1528; ld.s8 %r1551,[%r947+28]; bra $L9; $L122: .loc 1 2502 12 cvta.global.u64 %r1529,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r1530,[%r1529+36]; setp.ne.u32 %r1531,%r1530,0; @ ! %r1531 bra $L8; bra $L128; $L1$L342: ld.f32 %r2028,[%r616]; ld.f32 %r2029,[%r621]; ld.f32 %r2030,[%r578]; ld.f32 %r2031,[%r578+4]; .loc 1 2908 13 mul.f32 %r2032,%r2029,%r2030; mul.f32 %r2033,%r2029,%r2031; neg.f32 %r2034,%r2033; fma.rn.f32 %r2035,%r2028,%r2030,%r2034; fma.rn.f32 %r2036,%r2028,%r2031,%r2032; .loc 1 2907 11 add.f32 %r460,%r460,%r2035; add.f32 %r461,%r461,%r2036; .loc 1 2905 10 add.u64 %r297,%r616,8; add.u64 %r256,%r578,8; add.u64 %r621,%r616,12; $L287: ld.f32 %r2038,[%r297]; ld.f32 %r2039,[%r621]; ld.f32 %r2040,[%r256]; ld.f32 %r2041,[%r256+4]; .loc 1 2908 13 mul.f32 %r2042,%r2039,%r2040; mul.f32 %r2043,%r2039,%r2041; neg.f32 %r2044,%r2043; fma.rn.f32 %r2045,%r2038,%r2040,%r2044; fma.rn.f32 %r2046,%r2038,%r2041,%r2042; .loc 1 2907 11 add.f32 %r460,%r460,%r2045; add.f32 %r461,%r461,%r2046; .loc 1 2905 10 add.u64 %r297,%r297,8; add.u64 %r256,%r256,8; add.u64 %r621,%r621,8; $L286: ld.f32 %r2048,[%r297]; ld.f32 %r2049,[%r621]; ld.f32 %r2050,[%r256]; ld.f32 %r2051,[%r256+4]; .loc 1 2908 13 mul.f32 %r2052,%r2049,%r2050; mul.f32 %r2053,%r2049,%r2051; neg.f32 %r2054,%r2053; fma.rn.f32 %r2055,%r2048,%r2050,%r2054; fma.rn.f32 %r2056,%r2048,%r2051,%r2052; .loc 1 2907 11 add.f32 %r460,%r460,%r2055; add.f32 %r461,%r461,%r2056; .loc 1 2905 10 add.u64 %r297,%r297,8; add.u64 %r256,%r256,8; add.u64 %r621,%r621,8; setp.ne.u64 %r2057,%r256,%r575; @ %r2057 bra $L83; bra $L82; $L341: ld.f32 %r2058,[%r737+4]; .loc 1 2762 43 ld.f32 %r2059,[%r737]; st.f32 [%r678],%r2059; st.f32 [%r683],%r2058; .loc 1 2760 9 add.u64 %r676,%r737,8; add.u64 %r677,%r678,2048; add.u64 %r682,%r683,2048; $L285: ld.f32 %r2061,[%r676+4]; .loc 1 2762 43 ld.f32 %r2062,[%r676]; st.f32 [%r677],%r2062; st.f32 [%r682],%r2061; .loc 1 2760 9 add.u64 %r676,%r676,8; add.u64 %r677,%r677,2048; add.u64 %r682,%r682,2048; $L284: ld.f32 %r2064,[%r676+4]; .loc 1 2762 43 ld.f32 %r2065,[%r676]; st.f32 [%r677],%r2065; st.f32 [%r682],%r2064; .loc 1 2760 9 add.u64 %r676,%r676,8; add.u64 %r677,%r677,2048; add.u64 %r682,%r682,2048; setp.ne.u64 %r2066,%r676,%r692; @ %r2066 bra $L58; bra $L48; $L340: ld.f32 %r2067,[%r694+4]; .loc 1 2740 49 ld.f32 %r2068,[%r694]; st.f32 [%r695],%r2068; st.f32 [%r695+4],%r2067; ld.f32 %r2069,[%r696+4]; .loc 1 2742 49 ld.f32 %r2070,[%r696]; st.f32 [%r695+8],%r2070; st.f32 [%r695+12],%r2069; ld.f32 %r2071,[%r694+12]; .loc 1 2744 49 ld.f32 %r2072,[%r694+8]; st.f32 [%r695+2048],%r2072; st.f32 [%r695+2052],%r2071; ld.f32 %r2073,[%r696+12]; .loc 1 2746 49 ld.f32 %r2074,[%r696+8]; st.f32 [%r695+2056],%r2074; st.f32 [%r695+2060],%r2073; .loc 1 2738 9 add.u64 %r694,%r694,16; add.u64 %r695,%r695,4096; add.u64 %r696,%r696,16; setp.ne.u64 %r2075,%r696,%r706; @ %r2075 bra $L52; bra $L53; $L338: .loc 1 2688 2 @ ! 
%r1556 bra $L350; .loc 1 2689 22 mov.u32 %r20762076077,[%value_in]; } $L350: .loc 1 2687 24 mov.u64 %r255,2; .loc 1 2687 7 add.u64 %r800,%r222,%r802; $L283: .loc 1 2688 2 @ ! %r1556 bra $L351; .loc 1 2689 22 mov.u32 %r20792079080,[%value_in]; } $L351: .loc 1 2687 24 add.u64 %r255,%r255,1; .loc 1 2687 7 add.u64 %r800,%r800,%r802; $L282: .loc 1 2688 2 @ ! %r1556 bra $L352; .loc 1 2689 22 mov.u32 %r2082082083,[%value_in]; } $L352: .loc 1 2687 24 add.u64 %r255,%r255,1; .loc 1 2687 7 add.u64 %r800,%r800,%r802; setp.eq.u64 %r2084,%r255,%r806; @ ! %r2084 bra $L34; bra $L33; $L220: .loc 1 2689 22 mov.u32 %r208516952085086,[%value_in]; } $L339: .loc 1 2687 7 add.u64 %r2088,%r1695,%r802; .loc 1 2688 2 @ ! %r1556 bra $L353; .loc 1 2689 22 mov.u32 %r20208090091,[%value_in]; } $L353: .loc 1 2687 7 add.u64 %r2093,%r2088,%r802; .loc 1 2688 2 @ ! %r1556 bra $L354; .loc 1 2689 22 mov.u32 %r202092095096,[%value_in]; } $L354: .loc 1 2687 24 add.u64 %r255,%r1640,3; .loc 1 2687 7 add.u64 %r800,%r2093,%r802; setp.eq.u64 %r2097,%r255,%r806; @ ! %r2097 bra $L34; bra $L33; $L344: ld.f32 %r2098,[%r816]; ld.f32 %r2099,[%r809]; ld.f32 %r2100,[%r834]; ld.f32 %r2101,[%r810]; .loc 1 2938 23 mul.f32 %r2102,%r2100,%r2099; mul.f32 %r2103,%r2101,%r2099; neg.f32 %r2104,%r2103; .loc 1 2938 9 fma.rn.f32 %r103,%r2100,%r2098,%r2104; fma.rn.f32 %r108,%r2101,%r2098,%r2102; .loc 1 2937 5 add.u64 %r807,%r816,8; add.u64 %r808,%r834,8; add.u64 %r809,%r816,12; add.u64 %r810,%r834,12; $L289: ld.f32 %r2108,[%r807]; ld.f32 %r2109,[%r809]; ld.f32 %r2110,[%r808]; ld.f32 %r2111,[%r810]; .loc 1 2938 23 mul.f32 %r2112,%r2110,%r2109; mul.f32 %r2113,%r2111,%r2109; neg.f32 %r2114,%r2113; fma.rn.f32 %r2115,%r2110,%r2108,%r2114; fma.rn.f32 %r2116,%r2111,%r2108,%r2112; .loc 1 2938 9 add.f32 %r103,%r103,%r2115; add.f32 %r108,%r108,%r2116; .loc 1 2937 5 add.u64 %r807,%r807,8; add.u64 %r808,%r808,8; add.u64 %r809,%r809,8; add.u64 %r810,%r810,8; $L288: ld.f32 %r2118,[%r807]; ld.f32 %r2119,[%r809]; ld.f32 %r2120,[%r808]; ld.f32 %r2121,[%r810]; .loc 1 2938 23 mul.f32 %r2122,%r2120,%r2119; mul.f32 %r2123,%r2121,%r2119; neg.f32 %r2124,%r2123; fma.rn.f32 %r2125,%r2120,%r2118,%r2124; fma.rn.f32 %r2126,%r2121,%r2118,%r2122; .loc 1 2938 9 add.f32 %r103,%r103,%r2125; add.f32 %r108,%r108,%r2126; .loc 1 2937 5 add.u64 %r807,%r807,8; add.u64 %r808,%r808,8; add.u64 %r809,%r809,8; add.u64 %r810,%r810,8; setp.ne.u64 %r2127,%r807,%r821; @ %r2127 bra $L93; bra $L92; $L345: ld.f32 %r2128,[%r220]; ld.f32 %r2129,[%r845]; ld.f32 %r2130,[%r843]; ld.f32 %r2131,[%r843+4]; .loc 1 2953 26 mul.f32 %r2132,%r2130,%r2129; mul.f32 %r2133,%r2131,%r2129; neg.f32 %r2134,%r2133; .loc 1 2953 5 fma.rn.f32 %r38,%r2130,%r2128,%r2134; fma.rn.f32 %r37,%r2131,%r2128,%r2132; .loc 1 2952 8 add.u64 %r840,%r220,%r842; add.u64 %r843,%r843,8; add.u64 %r845,%r845,%r842; $L291: ld.f32 %r2138,[%r840]; ld.f32 %r2139,[%r845]; ld.f32 %r2140,[%r843]; ld.f32 %r2141,[%r843+4]; .loc 1 2953 26 mul.f32 %r2142,%r2140,%r2139; mul.f32 %r2143,%r2141,%r2139; neg.f32 %r2144,%r2143; fma.rn.f32 %r2145,%r2140,%r2138,%r2144; fma.rn.f32 %r2146,%r2141,%r2138,%r2142; .loc 1 2953 5 add.f32 %r38,%r38,%r2145; add.f32 %r37,%r37,%r2146; .loc 1 2952 8 add.u64 %r840,%r840,%r842; add.u64 %r843,%r843,8; add.u64 %r845,%r845,%r842; $L290: ld.f32 %r2148,[%r840]; ld.f32 %r2149,[%r845]; ld.f32 %r2150,[%r843]; ld.f32 %r2151,[%r843+4]; .loc 1 2953 26 mul.f32 %r2152,%r2150,%r2149; mul.f32 %r2153,%r2151,%r2149; neg.f32 %r2154,%r2153; fma.rn.f32 %r2155,%r2150,%r2148,%r2154; fma.rn.f32 %r2156,%r2151,%r2148,%r2152; .loc 1 2953 5 add.f32 
%r38,%r38,%r2155; add.f32 %r37,%r37,%r2156; .loc 1 2952 8 add.u64 %r840,%r840,%r842; add.u64 %r843,%r843,8; add.u64 %r845,%r845,%r842; setp.ne.u64 %r2157,%r843,%r861; @ %r2157 bra $L97; bra $L96; $L346: ld.f32 %r2158,[%r220]; ld.f32 %r2159,[%r220+4]; ld.f32 %r2160,[%r865]; ld.f32 %r2161,[%r865+4]; .loc 1 2968 29 mul.f32 %r2162,%r2160,%r2159; mul.f32 %r2163,%r2161,%r2159; neg.f32 %r2164,%r2163; .loc 1 2968 8 fma.rn.f32 %r188,%r2160,%r2158,%r2164; fma.rn.f32 %r152,%r2161,%r2158,%r2162; .loc 1 2967 28 mov.u64 %r269,1; .loc 1 2967 4 add.u64 %r862,%r220,%r864; add.u64 %r865,%r865,%r867; $L293: ld.f32 %r2168,[%r862]; ld.f32 %r2169,[%r862+4]; ld.f32 %r2170,[%r865]; ld.f32 %r2171,[%r865+4]; .loc 1 2968 29 mul.f32 %r2172,%r2170,%r2169; mul.f32 %r2173,%r2171,%r2169; neg.f32 %r2174,%r2173; fma.rn.f32 %r2175,%r2170,%r2168,%r2174; fma.rn.f32 %r2176,%r2171,%r2168,%r2172; .loc 1 2968 8 add.f32 %r188,%r188,%r2175; add.f32 %r152,%r152,%r2176; .loc 1 2967 28 add.u64 %r269,%r269,1; .loc 1 2967 4 add.u64 %r862,%r862,%r864; add.u64 %r865,%r865,%r867; $L292: ld.f32 %r2178,[%r862]; ld.f32 %r2179,[%r862+4]; ld.f32 %r2180,[%r865]; ld.f32 %r2181,[%r865+4]; .loc 1 2968 29 mul.f32 %r2182,%r2180,%r2179; mul.f32 %r2183,%r2181,%r2179; neg.f32 %r2184,%r2183; fma.rn.f32 %r2185,%r2180,%r2178,%r2184; fma.rn.f32 %r2186,%r2181,%r2178,%r2182; .loc 1 2968 8 add.f32 %r188,%r188,%r2185; add.f32 %r152,%r152,%r2186; .loc 1 2967 28 add.u64 %r269,%r269,1; .loc 1 2967 4 add.u64 %r862,%r862,%r864; add.u64 %r865,%r865,%r867; setp.ne.u64 %r2187,%r638,%r269; @ %r2187 bra $L100; bra $L99; $L348: ld.f32 %r2188,[%r881]; ld.f32 %r2189,[%r881+4]; ld.f32 %r2190,[%r891]; ld.f32 %r2191,[%r894]; .loc 1 2983 37 mul.f32 %r2192,%r2189,%r2190; ld.f32 %r2193,[%r900]; fma.rn.f32 %r2194,%r2190,%r2188,%r2193; fma.rn.f32 %r2195,%r2191,%r2188,%r2192; .loc 1 2982 36 neg.f32 %r2196,%r2189; fma.rn.f32 %r2197,%r2196,%r2191,%r2194; ld.f32 %r2198,[%r884]; add.f32 %r2199,%r2195,%r2198; st.f32 [%r900],%r2197; st.f32 [%r884],%r2199; .loc 1 2980 29 mov.u64 %r265,1; .loc 1 2980 4 add.u64 %r878,%r900,%r912; add.u64 %r881,%r881,%r883; add.u64 %r884,%r884,%r912; $L297: ld.f32 %r2201,[%r881]; ld.f32 %r2202,[%r881+4]; ld.f32 %r2203,[%r891]; ld.f32 %r2204,[%r894]; .loc 1 2983 37 mul.f32 %r2205,%r2202,%r2203; ld.f32 %r2206,[%r878]; fma.rn.f32 %r2207,%r2203,%r2201,%r2206; fma.rn.f32 %r2208,%r2204,%r2201,%r2205; .loc 1 2982 36 neg.f32 %r2209,%r2202; fma.rn.f32 %r2210,%r2209,%r2204,%r2207; ld.f32 %r2211,[%r884]; add.f32 %r2212,%r2208,%r2211; st.f32 [%r878],%r2210; st.f32 [%r884],%r2212; .loc 1 2980 29 add.u64 %r265,%r265,1; .loc 1 2980 4 add.u64 %r878,%r878,%r912; add.u64 %r881,%r881,%r883; add.u64 %r884,%r884,%r912; $L296: ld.f32 %r2214,[%r881]; ld.f32 %r2215,[%r881+4]; ld.f32 %r2216,[%r891]; ld.f32 %r2217,[%r894]; .loc 1 2983 37 mul.f32 %r2218,%r2215,%r2216; ld.f32 %r2219,[%r878]; fma.rn.f32 %r2220,%r2216,%r2214,%r2219; fma.rn.f32 %r2221,%r2217,%r2214,%r2218; .loc 1 2982 36 neg.f32 %r2222,%r2215; fma.rn.f32 %r2223,%r2222,%r2217,%r2220; ld.f32 %r2224,[%r884]; add.f32 %r2225,%r2221,%r2224; st.f32 [%r878],%r2223; st.f32 [%r884],%r2225; .loc 1 2980 29 add.u64 %r265,%r265,1; .loc 1 2980 4 add.u64 %r878,%r878,%r912; add.u64 %r881,%r881,%r883; add.u64 %r884,%r884,%r912; setp.ne.u64 %r2226,%r162,%r265; @ %r2226 bra $L110; bra $L113; $L347: .loc 1 2976 34 st.f32 [%r910],%r1599; st.f32 [%r910+4],%r1600; .loc 1 2975 27 mov.u64 %r267,1; .loc 1 2975 2 add.u64 %r910,%r910,%r912; $L295: .loc 1 2976 34 st.f32 [%r910],%r1599; st.f32 [%r910+4],%r1600; .loc 1 2975 27 add.u64 %r267,%r267,1; .loc 
1 2975 2 add.u64 %r910,%r910,%r912; $L294: .loc 1 2976 34 st.f32 [%r910],%r1599; st.f32 [%r910+4],%r1600; .loc 1 2975 27 add.u64 %r267,%r267,1; .loc 1 2975 2 add.u64 %r910,%r910,%r912; setp.ne.u64 %r2229,%r162,%r267; @ %r2229 bra $L105; bra $L109; $L349: ld.f32 %r2230,[%r918]; ld.f32 %r2231,[%r918+4]; ld.f32 %r2232,[%r944]; ld.f32 %r2233,[%r944+4]; .loc 1 3002 28 mul.f32 %r2234,%r2232,%r2231; mul.f32 %r2235,%r2233,%r2231; neg.f32 %r2236,%r2235; .loc 1 3002 5 fma.rn.f32 %r211,%r2232,%r2230,%r2236; fma.rn.f32 %r161,%r2233,%r2230,%r2234; .loc 1 3001 32 mov.u64 %r262,1; .loc 1 3001 8 add.u64 %r918,%r918,%r920; add.u64 %r922,%r944,%r924; $L299: ld.f32 %r2240,[%r918]; ld.f32 %r2241,[%r918+4]; ld.f32 %r2242,[%r922]; ld.f32 %r2243,[%r922+4]; .loc 1 3002 28 mul.f32 %r2244,%r2242,%r2241; mul.f32 %r2245,%r2243,%r2241; neg.f32 %r2246,%r2245; fma.rn.f32 %r2247,%r2242,%r2240,%r2246; fma.rn.f32 %r2248,%r2243,%r2240,%r2244; .loc 1 3002 5 add.f32 %r211,%r211,%r2247; add.f32 %r161,%r161,%r2248; .loc 1 3001 32 add.u64 %r262,%r262,1; .loc 1 3001 8 add.u64 %r918,%r918,%r920; add.u64 %r922,%r922,%r924; $L298: ld.f32 %r2250,[%r918]; ld.f32 %r2251,[%r918+4]; ld.f32 %r2252,[%r922]; ld.f32 %r2253,[%r922+4]; .loc 1 3002 28 mul.f32 %r2254,%r2252,%r2251; mul.f32 %r2255,%r2253,%r2251; neg.f32 %r2256,%r2255; fma.rn.f32 %r2257,%r2252,%r2250,%r2256; fma.rn.f32 %r2258,%r2253,%r2250,%r2254; .loc 1 3002 5 add.f32 %r211,%r211,%r2257; add.f32 %r161,%r161,%r2258; .loc 1 3001 32 add.u64 %r262,%r262,1; .loc 1 3001 8 add.u64 %r918,%r918,%r920; add.u64 %r922,%r922,%r924; setp.ne.u64 %r2259,%r638,%r262; @ %r2259 bra $L118; bra $L117; } matmul_c8.o/_gfortran_matmul_c8 .visible .func _gfortran_matmul_c8, .param .u32 %in_ar4, .param .u64 %in_ar5); .file 1 "../../../../libgfortran/generated/matmul_c8.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]FUNCTION DECL: _gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime_64 $LC9[2] = {__func__$0 .const .align 1 .u8 __func__$0[10] = {109,97,116,109,117,108,95,99,56,0 }2] = {782] = {672] = {8471,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,97,41,32,61,61,32,50,32,124,124,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,98,41,32,61,61,3246116,114,97,110,47,103,101,110,101,114,97,116,101,100,4765,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,49,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3265,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,50,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3273,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,97,114,103,117,109,101,110,116,32,66,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,49,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }; // BEGIN VAR DEF: $LC8 .const .align 8 .u64 $LC8[2] = {4607182418800017408_gfortran_matmul_c8 .visible .func _gfortran_matmul_c8f64 %r189; .reg .f64 %r191; .reg .f64 %r192; .reg .f64 %r193; .reg .f64u32 %r198; .reg .u32 %r199; .reg .u64 %r200; .reg .f64 %r201; .reg .f64 %r202; .reg .f64 %r203; .reg .f64 %r205; .reg .f64 %r206; .reg .f64 %r207; .reg .f64 %r208; .reg .f64 %r209; .reg .f64 
%r211; .reg .f64 %r212; .reg .f64 %r213; .reg .f64u64 %r257; .reg .u64 %r258; .reg .u64u64 %r267; .reg .u64 %r268; .reg .u64 %r269; .reg .f64 %r270; .reg .f64 %r271; .reg .f64 %r272; .reg .f64 %r273; .reg .f64 %r274; .reg .f64 %r275; .reg .f64 %r276; .reg .f64 %r277; .reg .f64 %r278; .reg .f64 %r279; .reg .f64 %r280; .reg .f64 %r281; .reg .f64 %r282; .reg .f64 %r283; .reg .f64 %r284; .reg .f64f64 %r290; .reg .f64 %r296; .reg .u64 %r297; .reg .u64 %r298; .reg .f64 %r299; .reg .f64 %r300; .reg .f64 %r301; .reg .f64 %r302; .reg .f64 %r303; .reg .f64 %r304; .reg .f64 %r305; .reg .f64 %r306; .reg .f64 %r307; .reg .f64 %r308; .reg .f64 %r309; .reg .f64 %r310; .reg .f64 %r311; .reg .f64 %r312; .reg .f64 %r313; .reg .f64 %r314; .reg .f64 %r315; .reg .f64 %r317; .reg .f64 %r318; .reg .f64 %r319; .reg .f64 %r320; .reg .f64 %r321; .reg .f64 %r323; .reg .f64 %r324; .reg .f64 %r325; .reg .f64 %r326; .reg .f64 %r327; .reg .f64 %r329; .reg .f64 %r330; .reg .f64 %r331; .reg .f64 %r333; .reg .f64 %r334; .reg .f64 %r335; .reg .f64 %r336; .reg .f64 %r337; .reg .f64 %r339; .reg .f64 %r340; .reg .f64 %r341; .reg .f64 %r343; .reg .f64 %r344; .reg .f64 %r345; .reg .f64 %r346; .reg .f64 %r347; .reg .f64 %r349; .reg .f64 %r350; .reg .f64 %r351; .reg .f64 %r353; .reg .f64 %r354; .reg .f64 %r355; .reg .f64 %r356; .reg .f64 %r357; .reg .f64 %r359; .reg .f64 %r360; .reg .f64 %r361; .reg .f64 %r362; .reg .f64 %r363; .reg .f64 %r365; .reg .f64 %r366; .reg .f64 %r367; .reg .f64 %r369; .reg .f64 %r370; .reg .f64 %r371; .reg .f64 %r373; .reg .f64 %r374; .reg .f64 %r375; .reg .f64 %r377; .reg .f64 %r378; .reg .f64 %r379; .reg .f64 %r381; .reg .f64 %r382; .reg .f64 %r383; .reg .f64 %r385; .reg .f64 %r386; .reg .f64 %r387; .reg .f64 %r389; .reg .f64 %r390; .reg .f64 %r391; .reg .f64 %r392; .reg .f64 %r393; .reg .f64 %r394; .reg .f64 %r395; .reg .f64 %r396; .reg .f64 %r397; .reg .f64 %r398; .reg .f64 %r399; .reg .f64 %r400; .reg .f64 %r401; .reg .f64 %r402; .reg .f64 %r403; .reg .f64 %r405; .reg .f64 %r406; .reg .f64 %r407; .reg .f64 %r408; .reg .f64 %r409; .reg .f64 %r411; .reg .f64 %r412; .reg .f64 %r413; .reg .f64 %r414; .reg .f64 %r415; .reg .f64 %r417; .reg .f64 %r418; .reg .f64 %r419; .reg .f64 %r420; .reg .f64 %r421; .reg .f64 %r423; .reg .f64 %r424; .reg .f64 %r425; .reg .f64 %r426; .reg .f64 %r427; .reg .f64 %r428; .reg .f64 %r429; .reg .f64 %r430; .reg .f64 %r431; .reg .f64 %r432; .reg .f64 %r433; .reg .f64 %r434; .reg .f64 %r435; .reg .f64 %r436; .reg .f64 %r437; .reg .f64 %r439; .reg .f64 %r440; .reg .f64 %r441; .reg .f64 %r442; .reg .f64 %r443; .reg .f64 %r445; .reg .f64 %r446; .reg .f64 %r447; .reg .f64 %r448; .reg .f64 %r449; .reg .f64 %r451; .reg .f64 %r452; .reg .f64 %r453; .reg .f64 %r454; .reg .f64 %r455; .reg .f64 %r457; .reg .f64 %r458; .reg .f64 %r459; .reg .f64 %r460; .reg .f64 %r461; .reg .f64u64 %r509; .reg .u64 %r511; .reg .u64 %r513; .reg .u64 %r515; .reg .u64 %r517; .reg .u64 %r519; .reg .u64 %r521; .reg .u64 %r523; .reg .u64 %r524; .reg .u64 %r527; .reg .u64u64 %r549; .reg .u64 %r552; .reg .u64 %r553; .reg .u64 %r555; .reg .u64u64 %r567; .reg .u64 %r569; .reg .u64 %r574; .reg .u64 %r575; .reg .u64 %r577; .reg .u64u64 %r585; .reg .u64 %r592; .reg .u64u64 %r604; .reg .u64 %r606; .reg .u64 %r608; .reg .u64 %r610; .reg .u64 %r615; .reg .u64 %r616; .reg .u64 %r617; .reg .u64 %r621; .reg .u64 %r623; .reg .u32 %r627; .reg .u64 %r637; .reg .u64 %r638; .reg .u64 %r639; .reg .u64 %r640; .reg .u64 %r641; .reg .u64 %r642; .reg .u64 %r643; .reg .u64 %r644; .reg .u64 %r645; .reg .u64 %r647; .reg .u64 %r648; .reg 
.u64 %r649; .reg .u64 %r650; .reg .u64 %r651; .reg .u64 %r657; .reg .u64 %r658; .reg .u32 %r662; .reg .u64 %r663; .reg .u64 %r664; .reg .u64 %r668; .reg .u64 %r672; .reg .u64 %r676; .reg .u64 %r677; .reg .u64 %r678; .reg .u64 %r679; .reg .u64 %r682; .reg .u64 %r683; .reg .u64 %r692; .reg .u64 %r694; .reg .u64 %r695; .reg .u64 %r696; .reg .u64 %r706; .reg .u64 %r707; .reg .u64 %r708; .reg .u64 %r709; .reg .u64 %r710; .reg .u64 %r716; .reg .u64 %r721; .reg .u64 %r726; .reg .u64 %r736; .reg .u64 %r737; .reg .u64 %r747; .reg .u64 %r749; .reg .u64 %r750; .reg .u64 %r756; .reg .u64 %r757; .reg .u64 %r758; .reg .u64 %r759; .reg .u64 %r760; .reg .u64 %r761; .reg .u64 %r762; .reg .u64 %r763; .reg .u64 %r764; .reg .u64 %r765; .reg .u64 %r766; .reg .u64 %r767; .reg .u64 %r769; .reg .u64 %r770; .reg .u64 %r771; .reg .u64 %r772; .reg .u64 %r774; .reg .u64 %r775; .reg .u64 %r782; .reg .u64 %r783; .reg .u64 %r784; .reg .u64 %r785; .reg .u64 %r786; .reg .u64 %r793; .reg .u64 %r795; .reg .u64 %r797; .reg .u64 %r798; .reg .u64 %r799; .reg .u64 %r800; .reg .u64 %r802; .reg .u64 %r806; .reg .u64 %r807; .reg .u64 %r808; .reg .u64 %r809; .reg .u64 %r810; .reg .u64 %r815; .reg .u64 %r816; .reg .u64 %r818; .reg .u64 %r821; .reg .u64 %r822; .reg .u64 %r827; .reg .u64 %r829; .reg .u64 %r830; .reg .u64 %r834; .reg .u64 %r836; .reg .u64 %r840; .reg .u64 %r842; .reg .u64 %r843; .reg .u64 %r844; .reg .u64 %r845; .reg .u64 %r851; .reg .u64 %r853; .reg .u64 %r854; .reg .u64 %r861; .reg .u64 %r862; .reg .u64 %r864; .reg .u64 %r865; .reg .u64 %r867; .reg .u64 %r870; .reg .u64 %r872; .reg .u64 %r873; .reg .u64 %r878; .reg .u64 %r881; .reg .u64 %r883; .reg .u64 %r884; .reg .u64 %r889; .reg .u64 %r891; .reg .u64 %r893; .reg .u64 %r894; .reg .u64 %r900; .reg .u32 %r901; .reg .u64 %r902; .reg .u64 %r905; .reg .u64 %r908; .reg .u64 %r910; .reg .u64 %r912; .reg .u64 %r914; .reg .u64 %r918; .reg .u64 %r920; .reg .u64 %r922; .reg .u64 %r924; .reg .u64 %r927; .reg .u64 %r929; .reg .u64 %r931; .reg .u64 %r936; .reg .u64 %r938; .reg .u64 %r944; .reg .u64 %r945; .reg .u64 %r946; .reg .u64 %r947; .reg .u32 %r948; .reg .u32 %r949; .reg .u64 %r950; .reg .u16 %r951; .reg .pred %r952; .reg .u16 %r953; .reg .pred %r955; .reg .u64 %r956; .reg .u32 %r957; .reg .u64 %r958; .reg .u64 %r959; .reg .u64 %r960; .reg .pred %r961; .reg .pred %r963; .reg .u64 %r966; .reg .u64 %r968; .reg .u64 %r969; .reg .u64 %r970; .reg .u64 %r972; .reg .u64 %r973; .reg .u64 %r974; .reg .u64 %r975; .reg .u64 %r976; .reg .u64 %r979; .reg .u64 %r981; .reg .u64 %r982; .reg .u64 %r984; .reg .u64 %r986; .reg .u64 %r987; .reg .u64 %r989; .reg .u64 %r990; .reg .u64 %r991; .reg .u64 %r992; .reg .u64 %r993; .reg .u32 %r994; .reg .pred %r995; .reg .u64 %r996; .reg .u64 %r997; .reg .u64 %r998; .reg .pred %r1000; .reg .u64 %r1001; .reg .u64 %r1002; .reg .u64 %r1003; .reg .pred %r1004; .reg .u64 %r1005; .reg .u64 %r1006; .reg .u64 %r1007; .reg .u16 %r1008; .reg .pred %r1009; .reg .u64 %r1010; .reg .u64 %r1012; .reg .u64 %r1013; .reg .u64 %r1014; .reg .u16 %r1015; .reg .pred %r1017; .reg .pred %r1018; .reg .u64 %r1019; .reg .pred %r1021; .reg .u64 %r1022; .reg .u64 %r1024; .reg .u64 %r1025; .reg .u64 %r1026; .reg .u64 %r1027; .reg .u64 %r1028; .reg .u64 %r1029; .reg .pred %r1030; .reg .u64 %r1031; .reg .u64 %r1033; .reg .u64 %r1034; .reg .u64 %r1035; .reg .u16 %r1036; .reg .u32 %r1037; .reg .pred %r1038; .reg .u16 %r1039; .reg .pred %r1040; .reg .u64 %r1041; .reg .u64 %r1042; .reg .u64 %r1043; .reg .u64 %r1044; .reg .u64 %r1045; .reg .u64 %r1046; .reg .pred %r1047; .reg .u32 
%r1049; .reg .u32 %r1050; .reg .u32 %r1052; .reg .u32 %r1053; .reg .u16 %r1054; .reg .u16 %r1055; .reg .u16 %r1056; .reg .u32 %r1057; .reg .u16 %r1058; .reg .pred %r1059; .reg .u64 %r1060; .reg .u16 %r1062; .reg .pred %r1063; .reg .u64 %r1064; .reg .u64 %r1065; .reg .u64 %r1066; .reg .u32 %r1067; .reg .u32 %r1068; .reg .u32 %r1069; .reg .u32 %r1070; .reg .u32 %r1071; .reg .u32 %r1072; .reg .u32 %r1074; .reg .u32 %r1075; .reg .u16 %r1076; .reg .u16 %r1077; .reg .u16 %r1078; .reg .u32 %r1079; .reg .u16 %r1080; .reg .pred %r1081; .reg .u32 %r1083; .reg .u32 %r1084; .reg .u16 %r1085; .reg .u16 %r1086; .reg .u16 %r1087; .reg .u32 %r1088; .reg .u16 %r1089; .reg .pred %r1090; .reg .u32 %r1092; .reg .u32 %r1093; .reg .u32 %r1095; .reg .u32 %r1096; .reg .u16 %r1097; .reg .u16 %r1098; .reg .u16 %r1099; .reg .u32 %r1100; .reg .u16 %r1101; .reg .pred %r1102; .reg .f32 %r1103; .reg .f32 %r1104; .reg .f32 %r1105; .reg .f32 %r1106; .reg .f32 %r1107; .reg .f32 %r1108; .reg .f32 %r1109; .reg .pred %r1110; .reg .f64 %r1111; .reg .f64 %r1112; .reg .pred %r1115; .reg .pred %r1116; .reg .u32 %r1118; .reg .u32 %r1119; .reg .u32 %r1121; .reg .u32 %r1122; .reg .u16 %r1123; .reg .u16 %r1124; .reg .u16 %r1125; .reg .u32 %r1126; .reg .u16 %r1127; .reg .pred %r1128; .reg .u32 %r1130; .reg .u32 %r1131; .reg .u32 %r1133; .reg .u32 %r1134; .reg .u16 %r1135; .reg .u16 %r1136; .reg .u16 %r1137; .reg .u32 %r1138; .reg .u16 %r1139; .reg .pred %r1140; .reg .u32 %r1142; .reg .u32 %r1143; .reg .u32 %r1145; .reg .u32 %r1146; .reg .u16 %r1147; .reg .u16 %r1148; .reg .u16 %r1149; .reg .u32 %r1150; .reg .u16 %r1151; .reg .pred %r1152; .reg .pred %r1153; .reg .u64 %r1154; .reg .u32 %r1155; .reg .u64 %r1156; .reg .u64 %r1157; .reg .u32 %r1158; .reg .pred %r1159; .reg .pred %r1160; .reg .u32 %r1161; .reg .pred %r1162; .reg .pred %r1163; .reg .u32 %r1178; .reg .u64 %r1179; .reg .u64 %r1180; .reg .u64 %r1181; .reg .u64 %r1182; .reg .u64 %r1183; .reg .u64 %r1184; .reg .u64 %r1185; .reg .u16 %r1186; .reg .u16 %r1187; .reg .u16 %r1188; .reg .u32 %r1189; .reg .u16 %r1190; .reg .pred %r1191; .reg .pred %r1192; .reg .u16 %r1193; .reg .pred %r1194; .reg .u64 %r1195; .reg .pred %r1196; .reg .u32 %r1200; .reg .u64 %r1202; .reg .u32 %r1207; .reg .u32 %r1208; .reg .u32 %r1210; .reg .u32 %r1211; .reg .u16 %r1212; .reg .u16 %r1213; .reg .u16 %r1214; .reg .u32 %r1216; .reg .u32 %r1217; .reg .u16 %r1219; .reg .u16 %r1221; .reg .u32 %r1222; .reg .u16 %r1223; .reg .pred %r1224; .reg .pred %r1225; .reg .u64 %r1226; .reg .u64 %r1228; .reg .u64 %r1229; .reg .u64 %r1230; .reg .pred %r1232; .reg .u64 %r1234; .reg .u64 %r1235; .reg .u64 %r1236; .reg .u64 %r1238; .reg .u64 %r1239; .reg .u64 %r1244; .reg .u64 %r1245; .reg .u64 %r1246; .reg .u64 %r1247; .reg .u64 %r1251; .reg .u64 %r1253; .reg .u64 %r1254; .reg .u64 %r1255; .reg .u64 %r1256; .reg .u64 %r1257; .reg .pred %r1259; .reg .u64 %r1265; .reg .u64 %r1266; .reg .u64 %r1267; .reg .u64 %r1268; .reg .pred %r1270; .reg .u64 %r1271; .reg .u64 %r1272; .reg .u64 %r1273; .reg .u64 %r1274; .reg .u64 %r1275; .reg .u64 %r1276; .reg .u64 %r1277; .reg .u64 %r1278; .reg .u64 %r1279; .reg .u64 %r1283; .reg .u64 %r1284; .reg .u64 %r1285; .reg .u64 %r1286; .reg .u64 %r1289; .reg .u64 %r1290; .reg .u64 %r1291; .reg .u64 %r1292; .reg .u64 %r1293; .reg .u64 %r1295; .reg .u64 %r1296; .reg .u64 %r1297; .reg .u64 %r1298; .reg .u64 %r1299; .reg .u64 %r1300; .reg .u64 %r1304; .reg .u64 %r1305; .reg .u64 %r1308; .reg .u64 %r1313; .reg .u64 %r1314; .reg .f64 %r1316; .reg .f64 %r1317; .reg .f64 %r1318; .reg .f64 %r1319; .reg .f64 
%r1321; .reg .f64 %r1322; .reg .pred %r1323; .reg .u64 %r1326; .reg .u64 %r1327; .reg .u64 %r1328; .reg .u64 %r1329; .reg .u64 %r1331; .reg .pred %r1332; .reg .u64 %r1333; .reg .u64 %r1334; .reg .f64 %r1335; .reg .u64 %r1338; .reg .u64 %r1339; .reg .u64 %r1340; .reg .u64 %r1341; .reg .u64 %r1342; .reg .u64 %r1343; .reg .u64 %r1344; .reg .u64 %r1346; .reg .u64 %r1347; .reg .f64 %r1351; .reg .f64 %r1352; .reg .f64 %r1353; .reg .f64 %r1354; .reg .f64 %r1355; .reg .f64 %r1356; .reg .f64 %r1357; .reg .f64 %r1358; .reg .f64 %r1359; .reg .f64 %r1360; .reg .f64 %r1361; .reg .f64 %r1362; .reg .f64 %r1363; .reg .f64 %r1364; .reg .f64 %r1365; .reg .f64 %r1366; .reg .f64 %r1367; .reg .f64 %r1368; .reg .f64 %r1369; .reg .f64 %r1370; .reg .f64 %r1371; .reg .f64 %r1372; .reg .f64 %r1373; .reg .f64 %r1374; .reg .f64 %r1375; .reg .f64 %r1376; .reg .f64 %r1377; .reg .f64 %r1378; .reg .f64 %r1379; .reg .f64 %r1380; .reg .f64 %r1381; .reg .f64 %r1382; .reg .pred %r1383; .reg .pred %r1384; .reg .pred %r1385; .reg .u64 %r1387; .reg .u64 %r1388; .reg .u64 %r1389; .reg .u64 %r1390; .reg .u64 %r1391; .reg .u64 %r1392; .reg .f64 %r1394; .reg .f64 %r1395; .reg .f64 %r1396; .reg .f64 %r1397; .reg .f64 %r1398; .reg .f64 %r1399; .reg .f64 %r1400; .reg .f64 %r1401; .reg .pred %r1402; .reg .pred %r1403; .reg .pred %r1404; .reg .u64 %r1406; .reg .u64 %r1407; .reg .u64 %r1409; .reg .u64 %r1410; .reg .u64 %r1411; .reg .u64 %r1412; .reg .u64 %r1413; .reg .u64 %r1420; .reg .u64 %r1421; .reg .f64 %r1424; .reg .f64 %r1425; .reg .f64 %r1426; .reg .f64 %r1427; .reg .f64 %r1428; .reg .f64 %r1429; .reg .f64 %r1430; .reg .f64 %r1431; .reg .pred %r1432; .reg .pred %r1433; .reg .pred %r1434; .reg .f64 %r1437; .reg .f64 %r1438; .reg .pred %r1440; .reg .u16 %r1441; .reg .pred %r1442; .reg .u32 %r1444; .reg .u32 %r1445; .reg .u16 %r1446; .reg .u16 %r1447; .reg .u32 %r1449; .reg .u16 %r1450; .reg .pred %r1451; .reg .pred %r1452; .reg .u16 %r1453; .reg .pred %r1454; .reg .pred %r1455; .reg .pred %r1456; .reg .u64 %r1457; .reg .u64 %r1458; .reg .pred %r1460; .reg .u64 %r1461; .reg .f64 %r1463; .reg .f64 %r1464; .reg .pred %r1466; .reg .u64 %r1468; .reg .u64 %r1469; .reg .u64 %r1470; .reg .f64 %r1471; .reg .f64 %r1472; .reg .pred %r1474; .reg .pred %r1475; .reg .u64 %r1477; .reg .f64 %r1478; .reg .f64 %r1479; .reg .pred %r1481; .reg .pred %r1482; .reg .pred %r1483; .reg .pred %r1484; .reg .u64 %r1485; .reg .pred %r1489; .reg .u64 %r1491; .reg .f64 %r1492; .reg .f64 %r1493; .reg .f64 %r1494; .reg .pred %r1496; .reg .pred %r1498; .reg .pred %r1502; .reg .u64 %r1503; .reg .u64 %r1504; .reg .u64 %r1506; .reg .f64 %r1507; .reg .f64 %r1508; .reg .pred %r1510; .reg .pred %r1511; .reg .u64 %r1512; .reg .pred %r1513; .reg .u64 %r1514; .reg .u64 %r1515; .reg .u64 %r1516; .reg .u16 %r1517; .reg .u32 %r1518; .reg .pred %r1519; .reg .u16 %r1520; .reg .u32 %r1521; .reg .pred %r1522; .reg .u64 %r1523; .reg .u64 %r1524; .reg .u64 %r1525; .reg .u64 %r1526; .reg .u64 %r1527; .reg .u64 %r1528; .reg .u64 %r1529; .reg .u32 %r1530; .reg .pred %r1531; .reg .u64 %r1532; .reg .u64 %r1533; .reg .u64 %r1534; .reg .f64 %r1535; .reg .pred %r1536; .reg .pred %r1542; .reg .pred %r1543; .reg .pred %r1544; .reg .f64 %r1545; .reg .u64 %r1546; .reg .pred %r1547; .reg .pred %r1548; .reg .pred %r1549; .reg .pred %r1550; .reg .u32 %r1551; .reg .u64 %r1552; .reg .pred %r1553; .reg .pred %r1556; .reg .u64 %r1562; .reg .u64 %r1563; .reg .u64 %r1564; .reg .u64 %r1565; .reg .u64 %r1566; .reg .u64 %r1570; .reg .u64 %r1571; .reg .u64 %r1572; .reg .u64 %r1573; .reg .u64 %r1574; .reg 
.u64 %r1576; .reg .u64 %r1577; .reg .u64 %r1578; .reg .u64 %r1580; .reg .pred %r1581; .reg .u64 %r1583; .reg .pred %r1585; .reg .u64 %r1588; .reg .pred %r1589; .reg .u64 %r1590; .reg .pred %r1591; .reg .u64 %r1592; .reg .u64 %r1593; .reg .pred %r1597; .reg .u64 %r1598; .reg .f64 %r1599; .reg .f64 %r1600; .reg .pred %r1601; .reg .u64 %r1602; .reg .u64 %r1603; .reg .u64 %r1604; .reg .u64 %r1605; .reg .u64 %r1606; .reg .pred %r1607; .reg .pred %r1608; .reg .pred %r1609; .reg .u64 %r1611; .reg .u64 %r1613; .reg .u64 %r1614; .reg .u64 %r1615; .reg .u64 %r1616; .reg .u64 %r1617; .reg .pred %r1618; .reg .pred %r1619; .reg .pred %r1620; .reg .u64 %r1621; .reg .u64 %r1624; .reg .u64 %r1625; .reg .u64 %r1626; .reg .u64 %r1628; .reg .pred %r1629; .reg .u64 %r1630; .reg .u64 %r1631; .reg .u64 %r1632; .reg .u64 %r1636; .reg .pred %r1637; .reg .pred %r1638; .reg .pred %r1639; .reg .u64 %r1640; .reg .u64 %r1641; .reg .u64 %r1642; .reg .u64 %r1643; .reg .u64 %r1644; .reg .u64 %r1645; .reg .pred %r1646; .reg .pred %r1647; .reg .pred %r1648; .reg .u64 %r1653; .reg .u64 %r1654; .reg .u64 %r1655; .reg .u64 %r1656; .reg .u64 %r1657; .reg .pred %r1658; .reg .pred %r1659; .reg .pred %r1660; .reg .u64 %r1661; .reg .u64 %r1665; .reg .pred %r1666; .reg .pred %r1667; .reg .pred %r1668; .reg .u64 %r1673; .reg .pred %r1674; .reg .pred %r1675; .reg .pred %r1676; .reg .u64 %r1681; .reg .pred %r1682; .reg .pred %r1683; .reg .pred %r1684; .reg .u64 %r1689; .reg .pred %r1690; .reg .pred %r1691; .reg .pred %r1692; .reg .u64 %r1695; .reg .u64 %r1696; .reg .u64 %r1698; .reg .u64 %r1699; .reg .f64 %r1703; .reg .f64 %r1704; .reg .f64 %r1705; .reg .f64 %r1706; .reg .f64 %r1707; .reg .f64 %r1708; .reg .f64 %r1709; .reg .f64 %r1710; .reg .pred %r1711; .reg .f64 %r1715; .reg .f64 %r1716; .reg .u64 %r1717; .reg .f64 %r1721; .reg .f64 %r1722; .reg .u64 %r1723; .reg .f64 %r1727; .reg .f64 %r1728; .reg .pred %r1729; .reg .pred %r1730; .reg .f64 %r1731; .reg .f64 %r1732; .reg .f64 %r1736; .reg .f64 %r1737; .reg .f64 %r1738; .reg .f64 %r1739; .reg .f64 %r1740; .reg .f64 %r1741; .reg .f64 %r1742; .reg .f64 %r1743; .reg .f64 %r1744; .reg .f64 %r1745; .reg .f64 %r1746; .reg .u64 %r1748; .reg .f64 %r1751; .reg .f64 %r1752; .reg .f64 %r1753; .reg .f64 %r1754; .reg .f64 %r1755; .reg .f64 %r1756; .reg .f64 %r1757; .reg .f64 %r1758; .reg .f64 %r1759; .reg .f64 %r1760; .reg .f64 %r1761; .reg .u64 %r1763; .reg .f64 %r1766; .reg .f64 %r1767; .reg .f64 %r1768; .reg .f64 %r1769; .reg .f64 %r1770; .reg .f64 %r1771; .reg .f64 %r1772; .reg .f64 %r1773; .reg .f64 %r1774; .reg .pred %r1775; .reg .pred %r1776; .reg .f64 %r1777; .reg .f64 %r1778; .reg .f64 %r1783; .reg .f64 %r1784; .reg .f64 %r1785; .reg .f64 %r1786; .reg .f64 %r1787; .reg .f64 %r1788; .reg .f64 %r1789; .reg .f64 %r1790; .reg .f64 %r1791; .reg .f64 %r1792; .reg .f64 %r1793; .reg .f64 %r1799; .reg .f64 %r1800; .reg .f64 %r1801; .reg .f64 %r1802; .reg .f64 %r1803; .reg .f64 %r1804; .reg .f64 %r1805; .reg .f64 %r1806; .reg .f64 %r1807; .reg .f64 %r1808; .reg .f64 %r1809; .reg .f64 %r1815; .reg .f64 %r1816; .reg .f64 %r1817; .reg .f64 %r1818; .reg .f64 %r1819; .reg .f64 %r1820; .reg .f64 %r1821; .reg .f64 %r1822; .reg .f64 %r1823; .reg .pred %r1824; .reg .pred %r1825; .reg .f64 %r1826; .reg .f64 %r1827; .reg .u64 %r1828; .reg .u64 %r1830; .reg .f64 %r1831; .reg .f64 %r1832; .reg .f64 %r1833; .reg .f64 %r1834; .reg .f64 %r1835; .reg .f64 %r1836; .reg .f64 %r1837; .reg .f64 %r1838; .reg .f64 %r1839; .reg .f64 %r1840; .reg .f64 %r1841; .reg .u64 %r1842; .reg .u64 %r1843; .reg .u64 %r1844; .reg 
.f64 %r1846; .reg .f64 %r1847; .reg .f64 %r1848; .reg .f64 %r1849; .reg .f64 %r1850; .reg .f64 %r1851; .reg .f64 %r1852; .reg .f64 %r1853; .reg .f64 %r1854; .reg .f64 %r1855; .reg .f64 %r1856; .reg .u64 %r1857; .reg .u64 %r1858; .reg .u64 %r1859; .reg .f64 %r1861; .reg .f64 %r1862; .reg .f64 %r1863; .reg .f64 %r1864; .reg .f64 %r1865; .reg .f64 %r1866; .reg .f64 %r1867; .reg .f64 %r1868; .reg .f64 %r1869; .reg .pred %r1870; .reg .pred %r1871; .reg .f64 %r1872; .reg .f64 %r1873; .reg .u64 %r1875; .reg .u64 %r1876; .reg .f64 %r1877; .reg .f64 %r1878; .reg .f64 %r1879; .reg .f64 %r1880; .reg .f64 %r1881; .reg .f64 %r1882; .reg .f64 %r1883; .reg .f64 %r1884; .reg .f64 %r1885; .reg .f64 %r1886; .reg .f64 %r1887; .reg .u64 %r1889; .reg .u64 %r1890; .reg .f64 %r1892; .reg .f64 %r1893; .reg .f64 %r1894; .reg .f64 %r1895; .reg .f64 %r1896; .reg .f64 %r1897; .reg .f64 %r1898; .reg .f64 %r1899; .reg .f64 %r1900; .reg .f64 %r1901; .reg .f64 %r1902; .reg .u64 %r1904; .reg .u64 %r1905; .reg .f64 %r1907; .reg .f64 %r1908; .reg .f64 %r1909; .reg .f64 %r1910; .reg .f64 %r1911; .reg .f64 %r1912; .reg .f64 %r1913; .reg .f64 %r1914; .reg .f64 %r1915; .reg .pred %r1916; .reg .u64 %r1920; .reg .u64 %r1922; .reg .u64 %r1925; .reg .pred %r1927; .reg .pred %r1928; .reg .pred %r1929; .reg .u64 %r1931; .reg .u64 %r1932; .reg .u64 %r1933; .reg .f64 %r1934; .reg .f64 %r1935; .reg .f64 %r1936; .reg .f64 %r1937; .reg .f64 %r1938; .reg .f64 %r1939; .reg .f64 %r1940; .reg .f64 %r1941; .reg .f64 %r1942; .reg .f64 %r1943; .reg .f64 %r1944; .reg .f64 %r1945; .reg .u64 %r1947; .reg .u64 %r1948; .reg .u64 %r1949; .reg .f64 %r1951; .reg .f64 %r1952; .reg .f64 %r1953; .reg .f64 %r1954; .reg .f64 %r1955; .reg .f64 %r1956; .reg .f64 %r1957; .reg .f64 %r1958; .reg .f64 %r1959; .reg .f64 %r1960; .reg .f64 %r1961; .reg .f64 %r1962; .reg .u64 %r1964; .reg .u64 %r1965; .reg .u64 %r1966; .reg .f64 %r1968; .reg .f64 %r1969; .reg .f64 %r1970; .reg .f64 %r1971; .reg .f64 %r1972; .reg .f64 %r1973; .reg .f64 %r1974; .reg .f64 %r1975; .reg .f64 %r1976; .reg .f64 %r1977; .reg .f64 %r1978; .reg .f64 %r1979; .reg .pred %r1980; .reg .pred %r1981; .reg .f64 %r1983; .reg .f64 %r1984; .reg .u64 %r1986; .reg .u64 %r1987; .reg .f64 %r1988; .reg .f64 %r1989; .reg .f64 %r1990; .reg .f64 %r1991; .reg .f64 %r1992; .reg .f64 %r1993; .reg .f64 %r1994; .reg .f64 %r1995; .reg .f64 %r1996; .reg .f64 %r1997; .reg .f64 %r1998; .reg .u64 %r2000; .reg .u64 %r2001; .reg .f64 %r2003; .reg .f64 %r2004; .reg .f64 %r2005; .reg .f64 %r2006; .reg .f64 %r2007; .reg .f64 %r2008; .reg .f64 %r2009; .reg .f64 %r2010; .reg .f64 %r2011; .reg .f64 %r2012; .reg .f64 %r2013; .reg .u64 %r2015; .reg .u64 %r2016; .reg .f64 %r2018; .reg .f64 %r2019; .reg .f64 %r2020; .reg .f64 %r2021; .reg .f64 %r2022; .reg .f64 %r2023; .reg .f64 %r2024; .reg .f64 %r2025; .reg .f64 %r2026; .reg .pred %r2027; .reg .f64 %r2028; .reg .f64 %r2029; .reg .f64 %r2030; .reg .f64 %r2031; .reg .f64 %r2032; .reg .f64 %r2033; .reg .f64 %r2034; .reg .f64 %r2035; .reg .f64 %r2036; .reg .f64 %r2038; .reg .f64 %r2039; .reg .f64 %r2040; .reg .f64 %r2041; .reg .f64 %r2042; .reg .f64 %r2043; .reg .f64 %r2044; .reg .f64 %r2045; .reg .f64 %r2046; .reg .f64 %r2048; .reg .f64 %r2049; .reg .f64 %r2050; .reg .f64 %r2051; .reg .f64 %r2052; .reg .f64 %r2053; .reg .f64 %r2054; .reg .f64 %r2055; .reg .f64 %r2056; .reg .pred %r2057; .reg .f64 %r2058; .reg .f64 %r2059; .reg .f64 %r2061; .reg .f64 %r2062; .reg .f64 %r2064; .reg .f64 %r2065; .reg .pred %r2066; .reg .f64 %r2067; .reg .f64 %r2068; .reg .f64 %r2069; .reg .f64 %r2070; 
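// Note on the declarations above: the long run of ".reg" statements is the virtual-register prologue that GCC's nvptx back end emits for the PTX body of _gfortran_matmul_c8; the real register allocation happens later (in ptxas / the PTX JIT), so the count is not a hardware-resource figure. From the way the body below reads its incoming arguments (%ar0..%ar2 as .u64, %ar3 and %ar4 as .u32, %ar5 as .u64), the function appears to correspond to the usual libgfortran prototype for this generated file. A hedged C-level sketch of that prototype (types abbreviated, not copied from this dump):
//   void _gfortran_matmul_c8 (gfc_array_c8 *retarray,  /* %ar0 */
//                             gfc_array_c8 *a,         /* %ar1 */
//                             gfc_array_c8 *b,         /* %ar2 */
//                             int try_blas,            /* %ar3 */
//                             int blas_limit,          /* %ar4 */
//                             blas_call gemm);         /* %ar5 */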
.reg .f64 %r2071; .reg .f64 %r2072; .reg .f64 %r2073; .reg .f64 %r2074; .reg .pred %r2075; .reg .u32 %r2076; .reg .u64 %r2077; .reg .u32 %r2079; .reg .u64 %r2080; .reg .u32 %r2082; .reg .u64 %r2083; .reg .pred %r2084; .reg .u32 %r2085; .reg .u64 %r2086; .reg .u64 %r2088; .reg .u32 %r2090; .reg .u64 %r2091; .reg .u64 %r2093; .reg .u32 %r2095; .reg .u64 %r2096; .reg .pred %r2097; .reg .f64 %r2098; .reg .f64 %r2099; .reg .f64 %r2100; .reg .f64 %r2101; .reg .f64 %r2102; .reg .f64 %r2103; .reg .f64 %r2104; .reg .f64 %r2108; .reg .f64 %r2109; .reg .f64 %r2110; .reg .f64 %r2111; .reg .f64 %r2112; .reg .f64 %r2113; .reg .f64 %r2114; .reg .f64 %r2115; .reg .f64 %r2116; .reg .f64 %r2118; .reg .f64 %r2119; .reg .f64 %r2120; .reg .f64 %r2121; .reg .f64 %r2122; .reg .f64 %r2123; .reg .f64 %r2124; .reg .f64 %r2125; .reg .f64 %r2126; .reg .pred %r2127; .reg .f64 %r2128; .reg .f64 %r2129; .reg .f64 %r2130; .reg .f64 %r2131; .reg .f64 %r2132; .reg .f64 %r2133; .reg .f64 %r2134; .reg .f64 %r2138; .reg .f64 %r2139; .reg .f64 %r2140; .reg .f64 %r2141; .reg .f64 %r2142; .reg .f64 %r2143; .reg .f64 %r2144; .reg .f64 %r2145; .reg .f64 %r2146; .reg .f64 %r2148; .reg .f64 %r2149; .reg .f64 %r2150; .reg .f64 %r2151; .reg .f64 %r2152; .reg .f64 %r2153; .reg .f64 %r2154; .reg .f64 %r2155; .reg .f64 %r2156; .reg .pred %r2157; .reg .f64 %r2158; .reg .f64 %r2159; .reg .f64 %r2160; .reg .f64 %r2161; .reg .f64 %r2162; .reg .f64 %r2163; .reg .f64 %r2164; .reg .f64 %r2168; .reg .f64 %r2169; .reg .f64 %r2170; .reg .f64 %r2171; .reg .f64 %r2172; .reg .f64 %r2173; .reg .f64 %r2174; .reg .f64 %r2175; .reg .f64 %r2176; .reg .f64 %r2178; .reg .f64 %r2179; .reg .f64 %r2180; .reg .f64 %r2181; .reg .f64 %r2182; .reg .f64 %r2183; .reg .f64 %r2184; .reg .f64 %r2185; .reg .f64 %r2186; .reg .pred %r2187; .reg .f64 %r2188; .reg .f64 %r2189; .reg .f64 %r2190; .reg .f64 %r2191; .reg .f64 %r2192; .reg .f64 %r2193; .reg .f64 %r2194; .reg .f64 %r2195; .reg .f64 %r2196; .reg .f64 %r2197; .reg .f64 %r2198; .reg .f64 %r2199; .reg .f64 %r2201; .reg .f64 %r2202; .reg .f64 %r2203; .reg .f64 %r2204; .reg .f64 %r2205; .reg .f64 %r2206; .reg .f64 %r2207; .reg .f64 %r2208; .reg .f64 %r2209; .reg .f64 %r2210; .reg .f64 %r2211; .reg .f64 %r2212; .reg .f64 %r2214; .reg .f64 %r2215; .reg .f64 %r2216; .reg .f64 %r2217; .reg .f64 %r2218; .reg .f64 %r2219; .reg .f64 %r2220; .reg .f64 %r2221; .reg .f64 %r2222; .reg .f64 %r2223; .reg .f64 %r2224; .reg .f64 %r2225; .reg .pred %r2226; .reg .pred %r2229; .reg .f64 %r2230; .reg .f64 %r2231; .reg .f64 %r2232; .reg .f64 %r2233; .reg .f64 %r2234; .reg .f64 %r2235; .reg .f64 %r2236; .reg .f64 %r2240; .reg .f64 %r2241; .reg .f64 %r2242; .reg .f64 %r2243; .reg .f64 %r2244; .reg .f64 %r2245; .reg .f64 %r2246; .reg .f64 %r2247; .reg .f64 %r2248; .reg .f64 %r2250; .reg .f64 %r2251; .reg .f64 %r2252; .reg .f64 %r2253; .reg .f64 %r2254; .reg .f64 %r2255; .reg .f64 %r2256; .reg .f64 %r2257; .reg .f64 %r2258; .reg .pred %r2259; mov.u64 %r945,%ar0; mov.u64 %r946,%ar1; mov.u64 %r947,%ar2; mov.u32 %r948,%ar3; mov.u32 %r949,%ar4; mov.u64 %r950,%ar5; .loc 1 2462 3 ld.s8 %r627,[%r946+28]; cvt.u16.u32 %r951,%r627; setp.eq.u16 %r952,%r951,2; @ %r952 bra $L2; ld.s8 %r1551,[%r947+28]; cvt.u16.u32 %r953,%r1551; setp.eq.u16 %r955,%r953,2; @ %r955 bra $L3; cvta.const.u64 %r959,$LC3; cvta.const.u64 %r958,__func__$0; mov.u32 %r957,2462; cvta.const.u64 %r995893: .loc 1 2476 6 ld.u64 %r960,[%r945]; setp.ne.u64 %r961,%r960,0; @ %r961 bra $L4; .loc 1 2478 10 setp.ne.u16 %r963,%r951,1; @ %r963 bra $L5; .loc 1 2480 4 st.u64 [%r945+48],%r960; ld.u64 
%r966,[%r947+80]; ld.u64 %r968,[%r947+72]; sub.u64 %r969,%r966,%r968; st.u64 [%r945+56],%r969; mov.u64 %r970,1; st.u64 [%r945+40],%r970; bra $L6; $L123: .loc 1 2485 4 st.u64 [%r945+48],%r1512; add.u64 %r972,%r668,-1; st.u64 [%r945+56],%r972; mov.u64 %r973,1; st.u64 [%r945+40],%r973; bra $L6; $L124: .loc 1 2490 4 mov.u64 %r974,0; st.u64 [%r945+48],%r974; add.u64 %r975,%r668,-1; st.u64 [%r945+56],%r975; mov.u64 %r976,1; st.u64 [%r945+40],%r976; .loc 1 2493 11 st.u64 [%r945+72],%r974; ld.u64 %r979,[%r947+80]; ld.u64 %r981,[%r947+72]; sub.u64 %r982,%r979,%r981; st.u64 [%r945+80],%r982; st.u64 [%r945+64],%r668; $L6: .loc 1 2499 1945; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r984,[%value_in]; } .loc 1 2499 4 mov.u64 %r986,16986; call (%value_in),_gfortrani_xmallocarray987,[%value_in]; } .loc 1 2499 2 st.u64 [%r945],%r987; .loc 1 2500 24 mov.u64 %r989,0; st.u64 [%r945+8],%r989; .loc 1 2557 7 ld.s8 %r627,[%r946+28]; .loc 1 2564 15 ld.u64 %r991,[%r946+56]; add.u64 %r990,%r991,1; .loc 1 2564 13 ld.u64 %r992,[%r946+48]; sub.u64 %r638,%r990,%r992; bra $L7; $L4: .loc 1 2502 12 cvta.global.u64 %r993,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r994,[%r993+36]; setp.eq.u32 %r995,%r994,0; @ %r995 bra $L8; .loc 1 2509 17 ld.u64 %r997,[%r945+56]; add.u64 %r996,%r997,1; .loc 1 2509 15 ld.u64 %r998,[%r945+48]; sub.u64 %r638,%r996,%r998; .loc 1 2506 10 setp.ne.u16 %r1000,%r951,1; @ %r1000 bra $L9; .loc 1 2508 17 ld.u64 %r1002,[%r947+80]; add.u64 %r1001,%r1002,1; .loc 1 2508 15 ld.u64 %r1003,[%r947+72]; sub.u64 %r217,%r1001,%r1003; .loc 1 2510 7 setp.ne.u64 %r1004,%r217,%r638; @ %r1004 bra $L10; .loc 1 2564 15 ld.u64 %r1006,[%r946+56]; add.u64 %r1005,%r1006,1; .loc 1 2564 13 ld.u64 %r1007,[%r946+48]; sub.u64 %r638,%r1005,%r1007; .loc 1 2543 7 ld.s8 %r901,[%r945+28]; .loc 1 2548 27 ld.u64 %r844,[%r945+40]; .loc 1 2543 6 cvt.u16.u32 %r1008,%r901; setp.eq.u16 %r1009,%r1008,1; @ %r1009 bra $L11; .loc 1 2553 16 ld.u64 %r664,[%r945+64]; .loc 1 2560 16 ld.u64 %r298,[%r946+40]; .loc 1 2563 14 mov.u64 %r162,1; .loc 1 2561 16 mov.u64 %r150,%r162; bra $L12; $L10: .loc 1 2511 6 st.u64 [%stack+8],%r217; st.u64 [%stack],%r638; cvta.const.u64 %r10101010_gfortran_runtime_error2485 4 ld.u64 %r1013,[%r946+56]; add.u64 %r1012,%r1013,1; ld.u64 %r1014,[%r946+48]; sub.u64 %r672,%r1012,%r1014; .loc 1 2515 15 cvt.u16.u32 %r1015,%r1551; setp.ne.u16 %r1017,%r1015,1; @ %r1017 bra $L13; .loc 1 2519 7 setp.eq.u64 %r1018,%r672,%r638; @ %r1018 bra $L14; .loc 1 2520 6 st.u64 [%stack+8],%r672; st.u64 [%stack],%r638; cvta.const.u64 %r101101_gfortran_runtime_error13: .loc 1 2528 7 setp.eq.u64 %r1021,%r672,%r638; @ %r1021 bra $L15; .loc 1 2529 6 st.u64 [%stack+8],%r672; st.u64 [%stack],%r638; cvta.const.u64 %r102210stack; call _gfortran_runtime_error15: .loc 1 2533 17 ld.u64 %r1025,[%r947+80]; add.u64 %r1024,%r1025,1; .loc 1 2533 15 ld.u64 %r1026,[%r947+72]; sub.u64 %r215,%r1024,%r1026; .loc 1 2534 17 ld.u64 %r1028,[%r945+80]; add.u64 %r1027,%r1028,1; .loc 1 2534 15 ld.u64 %r1029,[%r945+72]; sub.u64 %r216,%r1027,%r1029; .loc 1 2535 7 setp.eq.u64 %r1030,%r215,%r216; @ %r1030 bra $L14; .loc 1 2536 6 st.u64 [%stack+8],%r215; st.u64 [%stack],%r216; cvta.const.u64 %r1031,$LC10stack; call _gfortran_runtime_error2564 15 ld.u64 %r1034,[%r946+56]; add.u64 %r1033,%r1034,1; .loc 1 2564 13 ld.u64 %r1035,[%r946+48]; sub.u64 %r638,%r1033,%r1035; $L7: .loc 1 2548 27 ld.u64 %r664,[%r945+40]; .loc 1 2543 6 ld.s8 %r1037,[%r945+28]; cvt.u16.u32 %r1036,%r1037; setp.eq.u16 %r1038,%r1036,1; @ %r1038 bra $L130; $L127: .loc 1 2553 16 mov.u64 
%r844,%r664; ld.u64 %r664,[%r945+64]; bra $L16; $L130: .loc 1 2548 27 mov.u64 %r844,%r664; $L16: .loc 1 2560 16 ld.u64 %r298,[%r946+40]; .loc 1 2557 6 cvt.u16.u32 %r1039,%r627; setp.eq.u16 %r1040,%r1039,1; @ %r1040 bra $L131; $L125: .loc 1 2569 16 ld.u64 %r150,[%r946+64]; .loc 1 2571 15 ld.u64 %r1042,[%r946+80]; add.u64 %r1041,%r1042,1; .loc 1 2571 13 ld.u64 %r1043,[%r946+72]; mov.u64 %r162,%r638; sub.u64 %r638,%r1041,%r1043; bra $L12; $L131: .loc 1 2563 14 mov.u64 %r162,1; .loc 1 2561 16 mov.u64 %r150,%r162; $L12: .loc 1 2575 16 ld.u64 %r1045,[%r947+56]; add.u64 %r1044,%r1045,1; ld.u64 %r1046,[%r947+48]; sub.u64 %r65,%r1044,%r1046; .loc 1 2575 6 setp.eq.u64 %r1047,%r65,%r638; @ %r1047 bra $L17; .loc 1 2577 10 set.u32.gt.s64 %r1049,%r638,0; neg.s32 %r1050,%r1049; .loc 1 2577 21 set.u32.gt.s64 %r1052,%r65,0; neg.s32 %r1053,%r1052; cvt.u16.u32 %r1055,%r1050; cvt.u16.u32 %r1056,%r1053; or.b16 %r1054,%r1055,%r1056; cvt.u32.u16 %r1057,%r1054; cvt.u16.u8 %r1058,%r1057; setp.eq.u16 %r1059,%r1058,0; @ %r1059 bra $L17; .loc 1 2578 2 st.u64 [%stack+8],%r638; st.u64 [%stack],%r65; cvta.const.u64 %r10601060_gfortran_runtime_error17: .loc 1 2583 7 ld.s8 %r70,[%r947+28]; .loc 1 2586 16 ld.u64 %r663,[%r947+40]; .loc 1 2583 6 cvt.u16.u32 %r1062,%r70; setp.eq.u16 %r1063,%r1062,1; @ %r1063 bra $L132; .loc 1 2597 16 ld.u64 %r151,[%r947+64]; .loc 1 2598 16 ld.u64 %r1065,[%r947+80]; add.u64 %r1064,%r1065,1; .loc 1 2598 14 ld.u64 %r1066,[%r947+72]; sub.u64 %r163,%r1064,%r1066; bra $L18; $L132: .loc 1 2592 14 mov.u64 %r163,1; .loc 1 2591 16 mov.u64 %r151,256; $L18: .loc 1 2601 9 ld.u64 %r220,[%r946]; .loc 1 2602 9 ld.u64 %r221,[%r947]; .loc 1 2603 8 ld.u64 %r222,[%r945]; .loc 1 2612 28 set.u32.eq.u64 %r1068,%r844,1; neg.s32 %r1069,%r1068; cvt.u32.u32 %r1067,%r1069; cvt.u32.u8 %r75,%r1067; .loc 1 2612 46 set.u32.eq.u64 %r1071,%r298,1; neg.s32 %r1072,%r1071; cvt.u32.u32 %r1070,%r1072; cvt.u32.u8 %r662,%r1070; .loc 1 2612 7 set.u32.ne.u32 %r1074,%r948,0; neg.s32 %r1075,%r1074; .loc 1 2612 16 cvt.u16.u32 %r1077,%r1075; cvt.u16.u32 %r1078,%r75; and.b16 %r1076,%r1077,%r1078; .loc 1 2612 6 cvt.u32.u16 %r1079,%r1076; cvt.u16.u8 %r1080,%r1079; setp.eq.u16 %r1081,%r1080,0; @ %r1081 bra $L19; .loc 1 2612 63 set.u32.eq.u64 %r1083,%r150,1; neg.s32 %r1084,%r1083; .loc 1 2612 51 cvt.u16.u32 %r1086,%r1084; cvt.u16.u32 %r1087,%r662; or.b16 %r1085,%r1086,%r1087; .loc 1 2612 33 cvt.u32.u16 %r1088,%r1085; cvt.u16.u8 %r1089,%r1088; setp.eq.u16 %r1090,%r1089,0; @ %r1090 bra $L20; .loc 1 2613 20 set.u32.eq.u64 %r1092,%r663,1; neg.s32 %r1093,%r1092; .loc 1 2613 37 set.u32.eq.u64 %r1095,%r151,1; neg.s32 %r1096,%r1095; .loc 1 2613 25 cvt.u16.u32 %r1098,%r1093; cvt.u16.u32 %r1099,%r1096; or.b16 %r1097,%r1098,%r1099; .loc 1 2613 7 cvt.u32.u16 %r1100,%r1097; cvt.u16.u8 %r1101,%r1100; setp.eq.u16 %r1102,%r1101,0; @ %r1102 bra $L20; .loc 1 2615 13 cvt.rn.f32.s32 %r86,%r949; .loc 1 2614 12 cvt.rn.f32.s64 %r1103,%r162; .loc 1 2614 50 cvt.rn.f32.s64 %r1104,%r638; .loc 1 2614 47 mul.f32 %r1105,%r1103,%r1104; .loc 1 2614 31 cvt.rn.f32.s64 %r1106,%r163; .loc 1 2614 47 mul.f32 %r1107,%r1105,%r1106; .loc 1 2615 13 mul.f32 %r1108,%r86,%r86; mul.f32 %r1109,%r1108,%r86; .loc 1 2614 7 setp.gt.f32 %r1110,%r1107,%r1109; @ ! 
%r1110 bra $L19; .loc 1 2617 17 cvt.u32.u64 %r90,%r162; st.u32 [%frame+52],%r90; .loc 1 2617 29 cvt.u32.u64 %r91,%r163; st.u32 [%frame+48],%r91; .loc 1 2617 41 cvt.u32.u64 %r92,%r638; st.u32 [%frame+44],%r92; .loc 1 2617 52 cvt.u32.u64 %r93,%r664; st.u32 [%frame+40],%r93; .loc 1 2618 27 ld.const.f64 %r1111,[$LC8]; st.f64 [%frame+16],%r1111; ld.const.f64 %r1112,[$LC8+8]; st.f64 [%frame+24],%r1112; .loc 1 2618 36 ld.const.f64 %r1545,[$LC9]; st.f64 [%frame],%r1545; ld.const.f64 %r1535,[$LC9+8]; st.f64 [%frame+8],%r1535; .loc 1 2619 17 setp.ne.u64 %r1115,%r298,1; @ %r1115 bra $L22; cvt.u32.u64 %r198,%r150; bra $L23; $L22: cvt.u32.u64 %r198,%r298; $L23: st.u32 [%frame+36],%r198; .loc 1 2620 3 setp.ne.u64 %r1116,%r663,1; @ %r1116 bra $L24; cvt.u32.u64 %r199,%r151; bra $L25; $L24: cvt.u32.u64 %r199,%r663; $L25: st.u32 [%frame+32],%r199; .loc 1 2622 15 set.u32.gt.s32 %r1118,%r198,0; neg.s32 %r1119,%r1118; .loc 1 2622 26 set.u32.gt.s32 %r1121,%r199,0; neg.s32 %r1122,%r1121; .loc 1 2622 19 cvt.u16.u32 %r1124,%r1119; cvt.u16.u32 %r1125,%r1122; and.b16 %r1123,%r1124,%r1125; .loc 1 2622 10 cvt.u32.u16 %r1126,%r1123; cvt.u16.u8 %r1127,%r1126; setp.eq.u16 %r1128,%r1127,0; @ %r1128 bra $L19; .loc 1 2622 37 set.u32.gt.s32 %r1130,%r93,0; neg.s32 %r1131,%r1130; .loc 1 2622 46 set.u32.gt.s32 %r1133,%r90,1; neg.s32 %r1134,%r1133; .loc 1 2622 41 cvt.u16.u32 %r1136,%r1131; cvt.u16.u32 %r1137,%r1134; and.b16 %r1135,%r1136,%r1137; cvt.u32.u16 %r1138,%r1135; cvt.u16.u8 %r1139,%r1138; setp.eq.u16 %r1140,%r1139,0; @ %r1140 bra $L19; .loc 1 2622 55 set.u32.gt.s32 %r1142,%r91,1; neg.s32 %r1143,%r1142; .loc 1 2622 64 set.u32.gt.s32 %r1145,%r92,1; neg.s32 %r1146,%r1145; .loc 1 2622 59 cvt.u16.u32 %r1148,%r1143; cvt.u16.u32 %r1149,%r1146; and.b16 %r1147,%r1148,%r1149; cvt.u32.u16 %r1150,%r1147; cvt.u16.u8 %r1151,%r1150; setp.eq.u16 %r1152,%r1151,0; @ %r1152 bra $L19; .loc 1 2624 4 setp.ne.u64 %r1153,%r950,0; @ %r1153 bra $L27; cvta.const.u64 %r1157,$LC10; cvta.const.u64 %r1156,__func__$0; mov.u32 %r1155,2624; cvta.const.u64 %r11115415511561127: .loc 1 2626 17 and.b32 %r1158,%r948,2; .loc 1 2626 7 setp.ne.u32 %r1159,%r1158,0; @ %r1159 bra $L133; .loc 1 2629 35 setp.eq.u64 %r1160,%r298,1; @ %r1160 bra $L134; cvta.const.u64 %r164,$LC2; bra $L28; $L133: .loc 1 2627 13 cvta.const.u64 %r164,$LC1; bra $L28; $L134: .loc 1 2629 35 cvta.const.u64 %r164,$LC0; $L28: .loc 1 2631 17 and.b32 %r1161,%r948,4; .loc 1 2631 7 setp.ne.u32 %r1162,%r1161,0; @ %r1162 bra $L135; .loc 1 2634 35 setp.eq.u64 %r1163,%r663,1; @ %r1163 bra $L136; cvta.const.u64 %r165,$LC2; bra $L29; $L135: .loc 1 2632 13 cvta.const.u64 %r165,$LC1; bra $L29; $L136: .loc 1 2634 35 cvta.const.u64 %r165,$LC0; $L29: .loc 1 2636 4 add.u64 %r1179,%frame,40; add.u64 %r1180,%frame,32; add.u64 %r1181,%frame,36; add.u64 %r1182,%frame,16; add.u64 %r1183,%frame,44; add.u64 %r1184,%frame,48; add.u64 %r1185,%frame,52; mov.u32 %r1178,1;,.param .u64 %in_ar3,.param .u64 %in_ar4,.param .u64 %in_ar5,.param .u64 %in_ar6,.param .u64 %in_ar7,.param .u64 %in_ar8,.param .u64 %in_ar9,.param .u64 %in_ar10,.param .u64 %in_ar11,.param .u64 %in_ar12,.param .u32 %in_ar13,.param .u32 %in_ar14164r118118411831182220; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r1181; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r221; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r1180; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%frame; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r222; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r1179; .param .u32 %out_arg14; st.param.u32 
[%out_arg14],%r1178; .param .u32 %out_arg15; st.param.u32 [%out_arg15],%r1178; call %r950,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14,%out_arg15),$LCT0; } bra $L1; $L19: .loc 1 2643 21 cvt.u16.u32 %r1187,%r75; cvt.u16.u32 %r1188,%r662; and.b16 %r1186,%r1187,%r1188; .loc 1 2643 6 cvt.u32.u16 %r1189,%r1186; cvt.u16.u8 %r1190,%r1189; setp.eq.u16 %r1191,%r1190,0; @ %r1191 bra $L31; .loc 1 2643 38 setp.ne.u64 %r1192,%r663,1; @ %r1192 bra $L20; .loc 1 2644 7 cvt.u16.u32 %r1193,%r70; setp.eq.u16 %r1194,%r1193,1; @ %r1194 bra $L32; .loc 1 2677 16 add.u64 %r1195,%r664,1; .loc 1 2678 9 shl.b64 %r111,%r1195,4; sub.u64 %r224,%r222,%r111; .loc 1 2687 7 setp.le.s64 %r1196,%r163,0; @ %r1196 bra $L33; add.u64 %r802,%r111,-16; mov.u64 %r800,%r222; add.u64 %r806,%r163,1; .loc 1 2689 22 shl.b64 %r623,%r162,4; .loc 1 2687 13 mov.u64 %r255,%r663; setp.gt.s64 %r1556,%r162,0; and.b64 %r1636,%r163,3; setp.eq.u64 %r1639,%r1636,0; @ %r1639 bra $L34; setp.eq.u64 %r1638,%r1636,1; @ %r1638 bra $L282; setp.eq.u64 %r1637,%r1636,2; @ %r1637 bra $L283; bra $L338; $L35: .loc 1 2689 22 mov.u32 %r12002001202,[%value_in]; } $L36: .loc 1 2687 24 add.u64 %r1640,%r255,1; .loc 1 2687 7 add.u64 %r1695,%r800,%r802; .loc 1 2688 2 @ %r1556 bra $L220; bra $L339; $L34: @ %r1556 bra $L35; bra $L36; $L33: .loc 1 2692 28 set.u32.eq.u64 %r1207,%r638,0; neg.s32 %r1208,%r1207; .loc 1 2692 13 set.u32.eq.u64 %r1210,%r162,0; neg.s32 %r1211,%r1210; .loc 1 2692 28 cvt.u16.u32 %r1213,%r1208; cvt.u16.u32 %r1214,%r1211; or.b16 %r1212,%r1213,%r1214; .loc 1 2692 23 set.u32.eq.u64 %r1216,%r163,0; neg.s32 %r1217,%r1216; .loc 1 2692 28 cvt.u16.u32 %r1221,%r1217; or.b16 %r1219,%r1212,%r1221; cvt.u32.u16 %r1222,%r1219; cvt.u16.u8 %r1223,%r1222; setp.eq.u16 %r1224,%r1223,0; @ ! %r1224 bra $L1; .loc 1 2697 10 setp.eq.u64 %r1225,%r150,1; selp.u64 %r196,%r664,%r150,%r1225; .loc 1 2702 21 shl.b64 %r1226,%r196,8; .loc 1 2702 14 add.u64 %r229,%r1226,%r151; .loc 1 2706 12 min.s64 %r1228,%r229,65536; shl.b64 %r1229,%r12282291230,[%value_in]; } .loc 1 2710 7 setp.gt.s64 %r1232,%r163,0; @ %r1232 bra $L39; $L42: .loc 1 29302918 7 bra $L1; $L39: .loc 1 2680 16 add.u64 %r1234,%r150,1; .loc 1 2681 9 shl.b64 %r1235,%r1234,4; sub.u64 %r226,%r220,%r1235; .loc 1 2683 16 add.u64 %r1236,%r151,1; .loc 1 2684 9 shl.b64 %r117,%r1236,4; shl.b64 %r769,%r664,13; shl.b64 %r767,%r664,5; shl.b64 %r771,%r664,6; neg.s64 %r1238,%r664; shl.b64 %r1239,%r1238,13; shl.b64 %r772,%r1238,4; shl.b64 %r775,%r664,9; shl.b64 %r783,%r151,13; add.u64 %r784,%r117,-16; sub.u64 %r785,%r221,%r117; add.u64 %r782,%r784,%r785; add.u64 %r1244,%r151,%r151; add.u64 %r1245,%r1244,%r151; shl.b64 %r1246,%r1245,4; add.u64 %r786,%r1246,%r785; add.u64 %r1247,%r163,-1; and.b64 %r793,%r1247,-512; shl.b64 %r749,%r150,8; add.u64 %r750,%r150,%r150; add.u64 %r797,%r111,-16; sub.u64 %r798,%r224,%r797; shl.b64 %r710,%r150,5; shl.b64 %r1251,%r1238,6; shl.b64 %r67,%r151,6; mov.u64 %r774,%r664; mov.u64 %r770,%r771; mov.u64 %r765,0; setp.gt.s64 %r1553,%r638,0; add.u64 %r1590,%r798,16; setp.gt.s64 %r1591,%r162,0; add.u64 %r1592,%r162,1; add.u64 %r1593,%r226,32; bra $L41; $L138: mov.u64 %r765,%r766; $L41: add.u64 %r795,%r765,1; .loc 1 2715 9 sub.u64 %r231,%r163,%r765; min.s64 %r1696,%r231,512; .loc 1 2716 24 shr.s64 %r1253,%r1696,63; shr.u64 %r1254,%r1253,62; add.u64 %r1255,%r1696,%r1254; and.b64 %r1256,%r1255,3; sub.u64 %r1257,%r1256,%r1254; .loc 1 2716 10 sub.u64 %r232,%r1696,%r1257; .loc 1 2718 4 @ %r1553 bra $L40; $L45: .loc 1 2710 7 add.u64 %r766,%r765,512; add.u64 %r767,%r767,%r769; add.u64 
%r770,%r770,%r769; add.u64 %r772,%r772,%r1239; add.u64 %r774,%r774,%r775; add.u64 %r782,%r782,%r783; add.u64 %r786,%r786,%r783; setp.ne.u64 %r1259,%r765,%r793; @ %r1259 bra $L138; bra $L42; $L40: .loc 1 2768 13 add.u64 %r644,%r232,%r795; add.u64 %r799,%r1590,%r767; add.u64 %r760,%r782,%r784; add.u64 %r763,%r784,%r786; add.u64 %r259,%r767,%r797; mul.lo.u64 %r583,%r644,%r664; mul.lo.u64 %r577,%r644,%r151; mad.lo.u64 %r756,%r644,%r784,%r785; mov.u64 %r747,%r750; .loc 1 2718 12 mov.u64 %r235,%r663; add.u64 %r1552,%r638,1; .loc 1 2768 8 add.u64 %r1583,%r644,-1; setp.ge.s64 %r1585,%r1583,%r795; .loc 1 2873 17 add.u64 %r1588,%r1696,%r795; .loc 1 2874 9 setp.le.s64 %r1589,%r1588,%r644; $L44: .loc 1 2723 13 sub.u64 %r233,%r1552,%r235; min.s64 %r1698,%r233,256; .loc 1 2724 28 shr.u64 %r1265,%r1698,63; add.u64 %r1266,%r1698,%r1265; and.b64 %r1267,%r1266,1; sub.u64 %r1268,%r1267,%r1265; .loc 1 2724 14 sub.u64 %r234,%r1698,%r1268; .loc 1 2727 8 @ %r1591 bra $L43; $L73: .loc 1 2718 30 add.u64 %r235,%r235,256; .loc 1 2718 4 add.u64 %r747,%r747,%r749; setp.ge.s64 %r1270,%r638,%r235; @ %r1270 bra $L44; bra $L45; $L43: .loc 1 2763 26 add.u64 %r1271,%r235,-1; add.u64 %r1272,%r1271,%r1698; .loc 1 2763 31 mul.lo.u64 %r146,%r1272,%r150; .loc 1 2734 13 add.u64 %r643,%r234,%r235; .loc 1 2734 8 add.u64 %r645,%r643,-1; add.u64 %r1273,%r146,1; shl.b64 %r1274,%r1273,4; add.u64 %r737,%r226,%r1274; shl.b64 %r679,%r1698,4; add.u64 %r1275,%r679,-16; add.u64 %r678,%r1230,%r1275; add.u64 %r1276,%r679,-8; add.u64 %r683,%r1230,%r1276; shl.b64 %r758,%r235,4; add.u64 %r759,%r758,%r782; add.u64 %r761,%r758,%r760; add.u64 %r762,%r758,%r786; add.u64 %r764,%r758,%r763; .loc 1 2904 18 add.u64 %r1546,%r1698,%r235; add.u64 %r757,%r756,%r758; add.u64 %r1277,%r577,%r235; add.u64 %r1278,%r1277,%r1698; shl.b64 %r1279,%r1278,4; add.u64 %r574,%r1279,%r785; mov.u64 %r736,%r799; .loc 1 2727 16 mov.u64 %r248,%r663; setp.le.s64 %r1536,%r235,%r645; setp.gt.s64 %r1548,%r1698,%r234; setp.gt.s64 %r1547,%r1696,%r232; not.b64 %r1576,%r235; add.u64 %r1577,%r1576,%r643; and.b64 %r1578,%r1577,-2; .loc 1 2904 13 add.u64 %r1580,%r1546,-1; setp.gt.s64 %r1581,%r235,%r1580; $L72: .loc 1 2732 10 sub.u64 %r236,%r1592,%r248; min.s64 %r1699,%r236,256; .loc 1 2733 25 shr.s64 %r1533,%r1699,63; shr.u64 %r1283,%r1699,63; add.u64 %r1284,%r1699,%r1283; and.b64 %r1285,%r1284,1; sub.u64 %r1286,%r1285,%r1283; .loc 1 2733 11 sub.u64 %r237,%r1699,%r1286; .loc 1 2735 5 @ %r1536 bra $L46; $L55: .loc 1 2757 8 @ ! 
%r1548 bra $L48; bra $L47; $L46: .loc 1 2737 17 add.u64 %r647,%r237,%r248; .loc 1 2737 12 add.u64 %r648,%r647,-1; sub.u64 %r1289,%r747,%r150; add.u64 %r1290,%r1289,%r248; add.u64 %r1291,%r1290,%r1699; shl.b64 %r1292,%r1291,4; add.u64 %r1293,%r1292,-16; add.u64 %r709,%r226,%r1293; shl.b64 %r1295,%r1699,12; add.u64 %r1296,%r1295,-4096; add.u64 %r716,%r1230,%r1296; add.u64 %r1297,%r248,%r747; add.u64 %r1298,%r1297,%r1699; shl.b64 %r1299,%r1298,4; add.u64 %r1300,%r1299,-16; add.u64 %r721,%r226,%r1300; neg.s64 %r1304,%r1699; shl.b64 %r1305,%r1304,4; mov.u64 %r726,%r747; mov.u64 %r707,0; setp.le.s64 %r1542,%r248,%r648; setp.gt.s64 %r1549,%r1699,%r237; add.u64 %r1570,%r1305,16; not.b64 %r1571,%r248; add.u64 %r1572,%r1571,%r647; and.b64 %r1573,%r1572,-2; add.u64 %r1574,%r1573,%r248; bra $L54; $L139: mov.u64 %r707,%r708; $L54: .loc 1 2738 9 @ %r1542 bra $L49; $L53: .loc 1 2749 12 @ %r1549 bra $L50; bra $L51; $L49: add.u64 %r694,%r1570,%r709; shl.b64 %r1308,%r707,4; add.u64 %r695,%r1230,%r1308; add.u64 %r696,%r1570,%r721; add.u64 %r1313,%r1574,%r726; shl.b64 %r1314,%r1313,4; add.u64 %r706,%r1314,%r1593; sub.u64 %r1625,%r706,%r696; add.u64 %r1626,%r1625,-32; shr.u64 %r1624,%r1626,5; and.b64 %r1628,%r1624,1; setp.ne.u64 %r1629,%r1628,0; @ ! %r1629 bra $L340; $L52: ld.f64 %r192,[%r694+8]; .loc 1 2740 49 ld.f64 %r1316,[%r694]; st.f64 [%r695],%r1316; st.f64 [%r695+8],%r192; ld.f64 %r182,[%r696+8]; .loc 1 2742 49 ld.f64 %r1317,[%r696]; st.f64 [%r695+16],%r1317; st.f64 [%r695+24],%r182; ld.f64 %r180,[%r694+24]; .loc 1 2744 49 ld.f64 %r1318,[%r694+16]; st.f64 [%r695+4096],%r1318; st.f64 [%r695+4104],%r180; ld.f64 %r178,[%r696+24]; .loc 1 2746 49 ld.f64 %r1319,[%r696+16]; st.f64 [%r695+4112],%r1319; st.f64 [%r695+4120],%r178; .loc 1 2738 9 add.u64 %r1630,%r694,32; add.u64 %r1631,%r695,8192; add.u64 %r1632,%r696,32; ld.f64 %r1703,[%r1630+8]; .loc 1 2740 49 ld.f64 %r1704,[%r1630]; st.f64 [%r1631],%r1704; st.f64 [%r1631+8],%r1703; ld.f64 %r1705,[%r1632+8]; .loc 1 2742 49 ld.f64 %r1706,[%r1632]; st.f64 [%r1631+16],%r1706; st.f64 [%r1631+24],%r1705; ld.f64 %r1707,[%r1630+24]; .loc 1 2744 49 ld.f64 %r1708,[%r1630+16]; st.f64 [%r1631+4096],%r1708; st.f64 [%r1631+4104],%r1707; ld.f64 %r1709,[%r1632+24]; .loc 1 2746 49 ld.f64 %r1710,[%r1632+16]; st.f64 [%r1631+4112],%r1710; st.f64 [%r1631+4120],%r1709; .loc 1 2738 9 add.u64 %r694,%r694,64; add.u64 %r695,%r695,16384; add.u64 %r696,%r696,64; setp.ne.u64 %r1711,%r696,%r706; @ %r1711 bra $L52; bra $L53; $L50: ld.f64 %r174,[%r709+8]; .loc 1 2751 41 ld.f64 %r1321,[%r709]; st.f64 [%r716],%r1321; st.f64 [%r716+8],%r174; ld.f64 %r171,[%r721+8]; .loc 1 2753 41 ld.f64 %r1322,[%r721]; st.f64 [%r716+16],%r1322; st.f64 [%r716+24],%r171; $L51: .loc 1 2735 5 add.u64 %r708,%r707,2; add.u64 %r709,%r709,%r710; add.u64 %r716,%r716,32; add.u64 %r721,%r721,%r710; add.u64 %r726,%r726,%r750; setp.ne.u64 %r1323,%r707,%r1578; @ %r1323 bra $L139; bra $L55; $L48: .loc 1 2767 25 shr.u64 %r1326,%r1533,62; add.u64 %r1327,%r1699,%r1326; and.b64 %r1328,%r1327,3; sub.u64 %r1329,%r1328,%r1326; .loc 1 2767 11 sub.u64 %r240,%r1699,%r1329; .loc 1 2769 5 @ %r1585 bra $L56; bra $L57; $L47: .loc 1 2759 17 add.u64 %r131,%r1699,%r248; .loc 1 2759 12 add.u64 %r1331,%r131,-1; .loc 1 2760 9 setp.lt.s64 %r1332,%r1331,%r248; @ %r1332 bra $L48; add.u64 %r1333,%r131,%r146; shl.b64 %r1334,%r1333,4; add.u64 %r692,%r226,%r1334; mov.u64 %r682,%r683; mov.u64 %r677,%r678; mov.u64 %r676,%r737; sub.u64 %r1614,%r692,%r737; add.u64 %r1615,%r1614,-16; shr.u64 %r1613,%r1615,4; add.u64 %r1616,%r1613,1; and.b64 %r1617,%r1616,3; 
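// Note on this stretch (the .loc references around source lines 2710-2910 of matmul_c8.c): this is the library's buffered path, in which the loops are blocked (the min.s64 operations against 512 and 256 above, with a scratch buffer capped at 65536 elements), one operand is packed into that buffer, and the inner kernels accumulate complex(8) products with fma.rn.f64. Each mul/mul/neg/fma/fma group is one complex multiply-accumulate. A minimal C sketch of that pattern, assuming illustrative names ar/ai/br/bi and acc_re/acc_im for the real and imaginary parts (these names are not from the dump):
//   #include <math.h>
//   /* acc += (ar + i*ai) * (br + i*bi), matching the PTX operand order */
//   double t_im = ai * br;            /* mul.f64           */
//   double t_re = -(ai * bi);         /* mul.f64 + neg.f64 */
//   acc_re += fma(ar, br, t_re);      /* fma.rn.f64        */
//   acc_im += fma(ar, bi, t_im);      /* fma.rn.f64        */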
setp.eq.u64 %r1620,%r1617,0; @ %r1620 bra $L58; setp.eq.u64 %r1619,%r1617,1; @ %r1619 bra $L284; setp.eq.u64 %r1618,%r1617,2; @ %r1618 bra $L285; bra $L341; $L58: ld.f64 %r168,[%r676+8]; .loc 1 2762 43 ld.f64 %r1335,[%r676]; st.f64 [%r677],%r1335; st.f64 [%r682],%r168; .loc 1 2760 9 add.u64 %r1621,%r676,16; ld.f64 %r1715,[%r1621+8]; .loc 1 2762 43 ld.f64 %r1716,[%r1621]; st.f64 [%r677+4096],%r1716; st.f64 [%r682+4096],%r1715; .loc 1 2760 9 add.u64 %r1717,%r676,32; ld.f64 %r1721,[%r1717+8]; .loc 1 2762 43 ld.f64 %r1722,[%r1717]; st.f64 [%r677+8192],%r1722; st.f64 [%r682+8192],%r1721; .loc 1 2760 9 add.u64 %r1723,%r676,48; ld.f64 %r1727,[%r1723+8]; .loc 1 2762 43 ld.f64 %r1728,[%r1723]; st.f64 [%r677+12288],%r1728; st.f64 [%r682+12288],%r1727; .loc 1 2760 9 add.u64 %r676,%r676,64; add.u64 %r677,%r677,16384; add.u64 %r682,%r682,16384; setp.ne.u64 %r1729,%r676,%r692; @ %r1729 bra $L58; bra $L48; $L57: .loc 1 2871 8 @ ! %r1547 bra $L60; bra $L59; $L56: .loc 1 2876 14 add.u64 %r650,%r240,%r248; .loc 1 2876 9 add.u64 %r651,%r650,-1; add.u64 %r1338,%r650,%r774; shl.b64 %r1339,%r1338,4; add.u64 %r140,%r224,%r1339; add.u64 %r639,%r736,%r797; add.u64 %r637,%r639,%r797; add.u64 %r51,%r637,%r797; not.b64 %r1340,%r248; add.u64 %r1341,%r1340,%r650; shr.u64 %r1342,%r1341,2; add.u64 %r1343,%r1342,1; shl.b64 %r244,%r1343,6; sub.u64 %r1344,%r650,%r248; add.u64 %r1346,%r1344,1; shl.b64 %r1347,%r1346,12; add.u64 %r552,%r1230,%r1347; shl.b64 %r238,%r1699,4; mov.u64 %r640,%r736; mov.u64 %r641,%r764; mov.u64 %r642,%r762; mov.u64 %r649,%r761; mov.u64 %r88,%r759; mov.u64 %r200,%r772; mov.u64 %r250,%r795; mov.u64 %r49,0; setp.le.s64 %r1543,%r248,%r651; setp.gt.s64 %r1550,%r1699,%r240; $L68: .loc 1 2772 9 @ %r1543 bra $L61; $L67: .loc 1 2843 12 @ ! %r1550 bra $L63; bra $L62; $L61: add.u64 %r506,%r1230,4096; add.u64 %r504,%r1230,8192; add.u64 %r502,%r1230,12288; add.u64 %r500,%r1230,16384; .loc 1 2904 13 mov.u64 %r498,0; $L66: add.u64 %r496,%r640,%r498; ld.f64 %r184,[%r496]; ld.f64 %r281,[%r496+8]; ld.f64 %r201,[%r496+16]; ld.f64 %r189,[%r496+24]; add.u64 %r483,%r639,%r498; ld.f64 %r282,[%r483]; ld.f64 %r45,[%r483+8]; ld.f64 %r278,[%r483+16]; ld.f64 %r270,[%r483+24]; add.u64 %r472,%r637,%r498; ld.f64 %r280,[%r472]; ld.f64 %r279,[%r472+8]; ld.f64 %r300,[%r472+16]; ld.f64 %r301,[%r472+24]; add.u64 %r286,%r51,%r498; ld.f64 %r296,[%r286]; ld.f64 %r299,[%r286+8]; ld.f64 %r302,[%r286+16]; ld.f64 %r303,[%r286+24]; ld.f64 %r272,[%r496+32]; ld.f64 %r273,[%r496+40]; ld.f64 %r284,[%r496+48]; ld.f64 %r283,[%r496+56]; ld.f64 %r290,[%r483+32]; ld.f64 %r274,[%r483+40]; ld.f64 %r276,[%r483+48]; ld.f64 %r275,[%r483+56]; ld.f64 %r304,[%r472+32]; ld.f64 %r305,[%r472+40]; ld.f64 %r308,[%r472+48]; ld.f64 %r309,[%r472+56]; ld.f64 %r306,[%r286+32]; ld.f64 %r307,[%r286+40]; ld.f64 %r310,[%r286+48]; ld.f64 %r311,[%r286+56]; .loc 1 2791 6 @ %r1581 bra $L64; mov.u64 %r523,0; $L65: add.u64 %r521,%r506,%r523; ld.f64 %r312,[%r521+-4096]; ld.f64 %r313,[%r521+-4088]; add.u64 %r519,%r88,%r523; ld.f64 %r314,[%r519]; ld.f64 %r315,[%r519+8]; .loc 1 2794 11 mul.f64 %r317,%r313,%r314; mul.f64 %r1351,%r313,%r315; neg.f64 %r1352,%r1351; fma.rn.f64 %r318,%r312,%r314,%r1352; fma.rn.f64 %r319,%r312,%r315,%r317; .loc 1 2793 14 add.f64 %r184,%r184,%r318; add.f64 %r281,%r281,%r319; add.u64 %r517,%r504,%r523; ld.f64 %r320,[%r517+-4096]; ld.f64 %r321,[%r517+-4088]; .loc 1 2796 11 mul.f64 %r323,%r315,%r320; mul.f64 %r1353,%r315,%r321; neg.f64 %r1354,%r1353; fma.rn.f64 %r324,%r314,%r320,%r1354; fma.rn.f64 %r325,%r314,%r321,%r323; .loc 1 2795 14 add.f64 
%r201,%r201,%r324; add.f64 %r189,%r189,%r325; add.u64 %r515,%r649,%r523; ld.f64 %r326,[%r515]; ld.f64 %r327,[%r515+8]; .loc 1 2798 11 mul.f64 %r329,%r313,%r326; mul.f64 %r1355,%r313,%r327; neg.f64 %r1356,%r1355; fma.rn.f64 %r330,%r312,%r326,%r1356; fma.rn.f64 %r331,%r312,%r327,%r329; .loc 1 2797 14 add.f64 %r282,%r282,%r330; add.f64 %r45,%r45,%r331; .loc 1 2800 11 mul.f64 %r333,%r321,%r326; mul.f64 %r1357,%r321,%r327; neg.f64 %r1358,%r1357; fma.rn.f64 %r334,%r320,%r326,%r1358; fma.rn.f64 %r335,%r320,%r327,%r333; .loc 1 2799 14 add.f64 %r278,%r278,%r334; add.f64 %r270,%r270,%r335; add.u64 %r513,%r642,%r523; ld.f64 %r336,[%r513]; ld.f64 %r337,[%r513+8]; .loc 1 2802 11 mul.f64 %r339,%r313,%r336; mul.f64 %r1359,%r313,%r337; neg.f64 %r1360,%r1359; fma.rn.f64 %r340,%r312,%r336,%r1360; fma.rn.f64 %r341,%r312,%r337,%r339; .loc 1 2801 14 add.f64 %r280,%r280,%r340; add.f64 %r279,%r279,%r341; .loc 1 2804 11 mul.f64 %r343,%r321,%r336; mul.f64 %r1361,%r321,%r337; neg.f64 %r1362,%r1361; fma.rn.f64 %r344,%r320,%r336,%r1362; fma.rn.f64 %r345,%r320,%r337,%r343; .loc 1 2803 14 add.f64 %r300,%r300,%r344; add.f64 %r301,%r301,%r345; add.u64 %r511,%r641,%r523; ld.f64 %r346,[%r511]; ld.f64 %r347,[%r511+8]; .loc 1 2806 11 mul.f64 %r349,%r313,%r346; mul.f64 %r1363,%r313,%r347; neg.f64 %r1364,%r1363; fma.rn.f64 %r350,%r312,%r346,%r1364; fma.rn.f64 %r351,%r312,%r347,%r349; .loc 1 2805 14 add.f64 %r296,%r296,%r350; add.f64 %r299,%r299,%r351; .loc 1 2808 11 mul.f64 %r353,%r321,%r346; mul.f64 %r1365,%r321,%r347; neg.f64 %r1366,%r1365; fma.rn.f64 %r354,%r320,%r346,%r1366; fma.rn.f64 %r355,%r320,%r347,%r353; .loc 1 2807 14 add.f64 %r302,%r302,%r354; add.f64 %r303,%r303,%r355; add.u64 %r509,%r502,%r523; ld.f64 %r356,[%r509+-4096]; ld.f64 %r357,[%r509+-4088]; .loc 1 2810 11 mul.f64 %r359,%r315,%r356; mul.f64 %r1367,%r315,%r357; neg.f64 %r1368,%r1367; fma.rn.f64 %r360,%r314,%r356,%r1368; fma.rn.f64 %r361,%r314,%r357,%r359; .loc 1 2809 14 add.f64 %r272,%r272,%r360; add.f64 %r273,%r273,%r361; add.u64 %r507,%r500,%r523; ld.f64 %r362,[%r507+-4096]; ld.f64 %r363,[%r507+-4088]; .loc 1 2812 11 mul.f64 %r365,%r315,%r362; mul.f64 %r1369,%r315,%r363; neg.f64 %r1370,%r1369; fma.rn.f64 %r366,%r314,%r362,%r1370; fma.rn.f64 %r367,%r314,%r363,%r365; .loc 1 2811 14 add.f64 %r284,%r284,%r366; add.f64 %r283,%r283,%r367; .loc 1 2814 11 mul.f64 %r369,%r327,%r356; mul.f64 %r1371,%r327,%r357; neg.f64 %r1372,%r1371; fma.rn.f64 %r370,%r326,%r356,%r1372; fma.rn.f64 %r371,%r326,%r357,%r369; .loc 1 2813 14 add.f64 %r290,%r290,%r370; add.f64 %r274,%r274,%r371; .loc 1 2816 11 mul.f64 %r373,%r327,%r362; mul.f64 %r1373,%r327,%r363; neg.f64 %r1374,%r1373; fma.rn.f64 %r374,%r326,%r362,%r1374; fma.rn.f64 %r375,%r326,%r363,%r373; .loc 1 2815 14 add.f64 %r276,%r276,%r374; add.f64 %r275,%r275,%r375; .loc 1 2818 11 mul.f64 %r377,%r337,%r356; mul.f64 %r1375,%r337,%r357; neg.f64 %r1376,%r1375; fma.rn.f64 %r378,%r336,%r356,%r1376; fma.rn.f64 %r379,%r336,%r357,%r377; .loc 1 2817 14 add.f64 %r304,%r304,%r378; add.f64 %r305,%r305,%r379; .loc 1 2820 11 mul.f64 %r381,%r337,%r362; mul.f64 %r1377,%r337,%r363; neg.f64 %r1378,%r1377; fma.rn.f64 %r382,%r336,%r362,%r1378; fma.rn.f64 %r383,%r336,%r363,%r381; .loc 1 2819 14 add.f64 %r308,%r308,%r382; add.f64 %r309,%r309,%r383; .loc 1 2822 11 mul.f64 %r385,%r347,%r356; mul.f64 %r1379,%r347,%r357; neg.f64 %r1380,%r1379; fma.rn.f64 %r386,%r346,%r356,%r1380; fma.rn.f64 %r387,%r346,%r357,%r385; .loc 1 2821 14 add.f64 %r306,%r306,%r386; add.f64 %r307,%r307,%r387; .loc 1 2824 11 mul.f64 %r389,%r347,%r362; mul.f64 %r1381,%r347,%r363; 
neg.f64 %r1382,%r1381; fma.rn.f64 %r390,%r346,%r362,%r1382; fma.rn.f64 %r391,%r346,%r363,%r389; .loc 1 2823 14 add.f64 %r310,%r310,%r390; add.f64 %r311,%r311,%r391; .loc 1 2791 6 add.u64 %r523,%r523,16; setp.ne.u64 %r1383,%r523,%r679; @ %r1383 bra $L65; $L64: .loc 1 2826 24 st.f64 [%r496],%r184; st.f64 [%r496+8],%r281; .loc 1 2827 28 st.f64 [%r496+16],%r201; st.f64 [%r496+24],%r189; .loc 1 2828 30 st.f64 [%r483],%r282; st.f64 [%r483+8],%r45; .loc 1 2829 34 st.f64 [%r483+16],%r278; st.f64 [%r483+24],%r270; .loc 1 2830 30 st.f64 [%r472],%r280; st.f64 [%r472+8],%r279; .loc 1 2831 34 st.f64 [%r472+16],%r300; st.f64 [%r472+24],%r301; .loc 1 2832 30 st.f64 [%r286],%r296; st.f64 [%r286+8],%r299; .loc 1 2833 34 st.f64 [%r286+16],%r302; st.f64 [%r286+24],%r303; .loc 1 2834 28 st.f64 [%r496+32],%r272; st.f64 [%r496+40],%r273; .loc 1 2835 28 st.f64 [%r496+48],%r284; st.f64 [%r496+56],%r283; .loc 1 2836 34 st.f64 [%r483+32],%r290; st.f64 [%r483+40],%r274; .loc 1 2837 34 st.f64 [%r483+48],%r276; st.f64 [%r483+56],%r275; .loc 1 2838 34 st.f64 [%r472+32],%r304; st.f64 [%r472+40],%r305; .loc 1 2839 34 st.f64 [%r472+48],%r308; st.f64 [%r472+56],%r309; .loc 1 2840 34 st.f64 [%r286+32],%r306; st.f64 [%r286+40],%r307; .loc 1 2841 34 st.f64 [%r286+48],%r310; st.f64 [%r286+56],%r311; .loc 1 2772 9 add.u64 %r506,%r506,16384; add.u64 %r504,%r504,16384; add.u64 %r502,%r502,16384; add.u64 %r500,%r500,16384; add.u64 %r498,%r498,64; setp.ne.u64 %r1384,%r244,%r498; @ %r1384 bra $L66; bra $L67; $L63: .loc 1 2769 29 add.u64 %r250,%r250,4; .loc 1 2769 5 add.u64 %r200,%r200,%r1251; add.u64 %r140,%r140,%r771; add.u64 %r88,%r88,%r67; add.u64 %r649,%r649,%r67; add.u64 %r642,%r642,%r67; add.u64 %r641,%r641,%r67; add.u64 %r640,%r640,%r771; add.u64 %r639,%r639,%r771; add.u64 %r637,%r637,%r771; add.u64 %r51,%r51,%r771; add.u64 %r49,%r49,%r771; setp.le.s64 %r1385,%r250,%r1583; @ %r1385 bra $L68; bra $L57; $L62: add.u64 %r549,%r140,8; add.u64 %r542,%r49,%r259; add.u64 %r128,%r238,%r640; .loc 1 2904 13 mov.u64 %r553,%r552; mov.u64 %r555,%r140; add.u64 %r1532,%r49,%r767; add.u64 %r1534,%r49,%r770; $L71: ld.f64 %r392,[%r555]; ld.f64 %r393,[%r549]; add.u64 %r1387,%r1532,%r555; add.u64 %r543,%r1387,%r200; ld.f64 %r394,[%r543]; add.u64 %r1388,%r1532,%r549; add.u64 %r530,%r1388,%r200; ld.f64 %r395,[%r530]; add.u64 %r1389,%r542,%r555; add.u64 %r539,%r1389,%r200; ld.f64 %r396,[%r539]; add.u64 %r1390,%r542,%r549; add.u64 %r527,%r1390,%r200; ld.f64 %r397,[%r527]; add.u64 %r1391,%r1534,%r555; add.u64 %r535,%r1391,%r200; ld.f64 %r398,[%r535]; add.u64 %r1392,%r1534,%r549; add.u64 %r524,%r1392,%r200; ld.f64 %r399,[%r524]; .loc 1 2853 10 @ %r1581 bra $L69; mov.u64 %r566,0; $L70: add.u64 %r564,%r553,%r566; ld.f64 %r400,[%r564+-4096]; ld.f64 %r401,[%r564+-4088]; add.u64 %r562,%r88,%r566; ld.f64 %r402,[%r562]; ld.f64 %r403,[%r562+8]; .loc 1 2856 13 mul.f64 %r405,%r401,%r402; mul.f64 %r1394,%r401,%r403; neg.f64 %r1395,%r1394; fma.rn.f64 %r406,%r400,%r402,%r1395; fma.rn.f64 %r407,%r400,%r403,%r405; .loc 1 2855 11 add.f64 %r392,%r392,%r406; add.f64 %r393,%r393,%r407; add.u64 %r560,%r649,%r566; ld.f64 %r408,[%r560]; ld.f64 %r409,[%r560+8]; .loc 1 2858 13 mul.f64 %r411,%r401,%r408; mul.f64 %r1396,%r401,%r409; neg.f64 %r1397,%r1396; fma.rn.f64 %r412,%r400,%r408,%r1397; fma.rn.f64 %r413,%r400,%r409,%r411; .loc 1 2857 11 add.f64 %r394,%r394,%r412; add.f64 %r395,%r395,%r413; add.u64 %r558,%r642,%r566; ld.f64 %r414,[%r558]; ld.f64 %r415,[%r558+8]; .loc 1 2860 13 mul.f64 %r417,%r401,%r414; mul.f64 %r1398,%r401,%r415; neg.f64 %r1399,%r1398; fma.rn.f64 
%r418,%r400,%r414,%r1399; fma.rn.f64 %r419,%r400,%r415,%r417; .loc 1 2859 11 add.f64 %r396,%r396,%r418; add.f64 %r397,%r397,%r419; add.u64 %r556,%r641,%r566; ld.f64 %r420,[%r556]; ld.f64 %r421,[%r556+8]; .loc 1 2862 13 mul.f64 %r423,%r401,%r420; mul.f64 %r1400,%r401,%r421; neg.f64 %r1401,%r1400; fma.rn.f64 %r424,%r400,%r420,%r1401; fma.rn.f64 %r425,%r400,%r421,%r423; .loc 1 2861 11 add.f64 %r398,%r398,%r424; add.f64 %r399,%r399,%r425; .loc 1 2853 10 add.u64 %r566,%r566,16; setp.ne.u64 %r1402,%r566,%r679; @ %r1402 bra $L70; $L69: .loc 1 2864 28 st.f64 [%r555],%r392; st.f64 [%r549],%r393; .loc 1 2865 34 st.f64 [%r543],%r394; st.f64 [%r530],%r395; .loc 1 2866 34 st.f64 [%r539],%r396; st.f64 [%r527],%r397; .loc 1 2867 34 st.f64 [%r535],%r398; st.f64 [%r524],%r399; .loc 1 2846 6 add.u64 %r555,%r555,16; add.u64 %r553,%r553,4096; add.u64 %r549,%r549,16; setp.ne.u64 %r1403,%r128,%r555; @ %r1403 bra $L71; bra $L63; $L60: .loc 1 2727 34 add.u64 %r248,%r248,256; .loc 1 2727 8 add.u64 %r736,%r736,4096; add.u64 %r737,%r737,4096; setp.ge.s64 %r1404,%r162,%r248; @ %r1404 bra $L72; bra $L73; $L59: .loc 1 2874 9 @ %r1589 bra $L60; .loc 1 2876 14 add.u64 %r657,%r240,%r248; .loc 1 2876 9 add.u64 %r658,%r657,-1; .loc 1 2759 17 add.u64 %r582,%r1699,%r248; add.u64 %r1406,%r582,%r583; shl.b64 %r1407,%r1406,4; add.u64 %r584,%r224,%r1407; neg.s64 %r1409,%r582; shl.b64 %r1410,%r1409,4; shl.b64 %r1411,%r248,4; add.u64 %r567,%r1411,%r1410; shl.b64 %r1412,%r657,4; add.u64 %r569,%r1412,%r1410; shl.b64 %r1413,%r240,12; add.u64 %r615,%r1230,%r1413; .loc 1 2876 9 mov.u64 %r575,%r574; mov.u64 %r578,%r757; mov.u64 %r579,%r583; mov.u64 %r243,%r644; setp.gt.s64 %r1730,%r1699,%r240; setp.le.s64 %r1544,%r248,%r658; not.b64 %r1562,%r248; add.u64 %r1563,%r1562,%r657; and.b64 %r1564,%r1563,-4; add.u64 %r1565,%r1564,%r248; add.u64 %r1566,%r224,64; $L81: .loc 1 2877 6 @ %r1544 bra $L74; $L80: .loc 1 2901 6 @ %r1730 bra $L75; bra $L76; $L74: add.u64 %r599,%r567,%r584; add.u64 %r598,%r1230,4096; add.u64 %r596,%r1230,8192; add.u64 %r594,%r1230,12288; add.u64 %r592,%r1230,16384; add.u64 %r1420,%r1565,%r579; shl.b64 %r1421,%r1420,4; add.u64 %r585,%r1421,%r1566; $L79: ld.f64 %r426,[%r599]; ld.f64 %r427,[%r599+8]; ld.f64 %r428,[%r599+16]; ld.f64 %r429,[%r599+24]; ld.f64 %r430,[%r599+32]; ld.f64 %r431,[%r599+40]; ld.f64 %r432,[%r599+48]; ld.f64 %r433,[%r599+56]; .loc 1 2884 10 @ %r1581 bra $L77; mov.u64 %r610,0; $L78: add.u64 %r608,%r598,%r610; ld.f64 %r434,[%r608+-4096]; ld.f64 %r435,[%r608+-4088]; add.u64 %r606,%r578,%r610; ld.f64 %r436,[%r606]; ld.f64 %r437,[%r606+8]; .loc 1 2887 13 mul.f64 %r439,%r435,%r436; mul.f64 %r1424,%r435,%r437; neg.f64 %r1425,%r1424; fma.rn.f64 %r440,%r434,%r436,%r1425; fma.rn.f64 %r441,%r434,%r437,%r439; .loc 1 2886 11 add.f64 %r426,%r426,%r440; add.f64 %r427,%r427,%r441; add.u64 %r604,%r596,%r610; ld.f64 %r442,[%r604+-4096]; ld.f64 %r443,[%r604+-4088]; .loc 1 2889 13 mul.f64 %r445,%r437,%r442; mul.f64 %r1426,%r437,%r443; neg.f64 %r1427,%r1426; fma.rn.f64 %r446,%r436,%r442,%r1427; fma.rn.f64 %r447,%r436,%r443,%r445; .loc 1 2888 11 add.f64 %r428,%r428,%r446; add.f64 %r429,%r429,%r447; add.u64 %r602,%r594,%r610; ld.f64 %r448,[%r602+-4096]; ld.f64 %r449,[%r602+-4088]; .loc 1 2891 13 mul.f64 %r451,%r437,%r448; mul.f64 %r1428,%r437,%r449; neg.f64 %r1429,%r1428; fma.rn.f64 %r452,%r436,%r448,%r1429; fma.rn.f64 %r453,%r436,%r449,%r451; .loc 1 2890 11 add.f64 %r430,%r430,%r452; add.f64 %r431,%r431,%r453; add.u64 %r600,%r592,%r610; ld.f64 %r454,[%r600+-4096]; ld.f64 %r455,[%r600+-4088]; .loc 1 2893 13 mul.f64 
%r457,%r437,%r454; mul.f64 %r1430,%r437,%r455; neg.f64 %r1431,%r1430; fma.rn.f64 %r458,%r436,%r454,%r1431; fma.rn.f64 %r459,%r436,%r455,%r457; .loc 1 2892 11 add.f64 %r432,%r432,%r458; add.f64 %r433,%r433,%r459; .loc 1 2884 10 add.u64 %r610,%r610,16; setp.ne.u64 %r1432,%r610,%r679; @ %r1432 bra $L78; $L77: .loc 1 2895 28 st.f64 [%r599],%r426; st.f64 [%r599+8],%r427; .loc 1 2896 32 st.f64 [%r599+16],%r428; st.f64 [%r599+24],%r429; .loc 1 2897 32 st.f64 [%r599+32],%r430; st.f64 [%r599+40],%r431; .loc 1 2898 32 st.f64 [%r599+48],%r432; st.f64 [%r599+56],%r433; .loc 1 2877 6 add.u64 %r599,%r599,64; add.u64 %r598,%r598,16384; add.u64 %r596,%r596,16384; add.u64 %r594,%r594,16384; add.u64 %r592,%r592,16384; setp.ne.u64 %r1433,%r585,%r599; @ %r1433 bra $L79; bra $L80; $L76: .loc 1 2874 39 add.u64 %r243,%r243,1; .loc 1 2874 9 add.u64 %r584,%r584,%r797; add.u64 %r579,%r579,%r664; add.u64 %r578,%r578,%r784; add.u64 %r575,%r575,%r784; setp.ne.u64 %r1434,%r1588,%r243; @ %r1434 bra $L81; bra $L60; $L75: add.u64 %r617,%r569,%r584; .loc 1 2904 13 mov.u64 %r616,%r615; $L84: ld.f64 %r460,[%r617]; ld.f64 %r461,[%r617+8]; .loc 1 2905 10 @ %r1581 bra $L82; add.u64 %r621,%r616,8; mov.u64 %r256,%r578; mov.u64 %r297,%r616; sub.u64 %r1603,%r575,%r578; add.u64 %r1604,%r1603,-16; shr.u64 %r1602,%r1604,4; add.u64 %r1605,%r1602,1; and.b64 %r1606,%r1605,3; setp.eq.u64 %r1609,%r1606,0; @ %r1609 bra $L83; setp.eq.u64 %r1608,%r1606,1; @ %r1608 bra $L286; setp.eq.u64 %r1607,%r1606,2; @ %r1607 bra $L287; bra $L342; $L83: ld.f64 %r462,[%r297]; ld.f64 %r463,[%r621]; ld.f64 %r464,[%r256]; ld.f64 %r465,[%r256+8]; .loc 1 2908 13 mul.f64 %r467,%r463,%r464; mul.f64 %r1437,%r463,%r465; neg.f64 %r1438,%r1437; fma.rn.f64 %r468,%r462,%r464,%r1438; fma.rn.f64 %r469,%r462,%r465,%r467; .loc 1 2907 11 add.f64 %r1731,%r460,%r468; add.f64 %r1732,%r461,%r469; .loc 1 2905 10 add.u64 %r1611,%r256,16; ld.f64 %r1736,[%r297+16]; ld.f64 %r1737,[%r621+16]; ld.f64 %r1738,[%r1611]; ld.f64 %r1739,[%r1611+8]; .loc 1 2908 13 mul.f64 %r1740,%r1737,%r1738; mul.f64 %r1741,%r1737,%r1739; neg.f64 %r1742,%r1741; fma.rn.f64 %r1743,%r1736,%r1738,%r1742; fma.rn.f64 %r1744,%r1736,%r1739,%r1740; .loc 1 2907 11 add.f64 %r1745,%r1731,%r1743; add.f64 %r1746,%r1732,%r1744; .loc 1 2905 10 add.u64 %r1748,%r256,32; ld.f64 %r1751,[%r297+32]; ld.f64 %r1752,[%r621+32]; ld.f64 %r1753,[%r1748]; ld.f64 %r1754,[%r1748+8]; .loc 1 2908 13 mul.f64 %r1755,%r1752,%r1753; mul.f64 %r1756,%r1752,%r1754; neg.f64 %r1757,%r1756; fma.rn.f64 %r1758,%r1751,%r1753,%r1757; fma.rn.f64 %r1759,%r1751,%r1754,%r1755; .loc 1 2907 11 add.f64 %r1760,%r1745,%r1758; add.f64 %r1761,%r1746,%r1759; .loc 1 2905 10 add.u64 %r1763,%r256,48; ld.f64 %r1766,[%r297+48]; ld.f64 %r1767,[%r621+48]; ld.f64 %r1768,[%r1763]; ld.f64 %r1769,[%r1763+8]; .loc 1 2908 13 mul.f64 %r1770,%r1767,%r1768; mul.f64 %r1771,%r1767,%r1769; neg.f64 %r1772,%r1771; fma.rn.f64 %r1773,%r1766,%r1768,%r1772; fma.rn.f64 %r1774,%r1766,%r1769,%r1770; .loc 1 2907 11 add.f64 %r460,%r1760,%r1773; add.f64 %r461,%r1761,%r1774; .loc 1 2905 10 add.u64 %r297,%r297,64; add.u64 %r256,%r256,64; add.u64 %r621,%r621,64; setp.ne.u64 %r1775,%r256,%r575; @ %r1775 bra $L83; $L82: .loc 1 2910 28 st.f64 [%r617],%r460; st.f64 [%r617+8],%r461; .loc 1 2901 6 add.u64 %r617,%r617,16; add.u64 %r616,%r616,4096; setp.ne.u64 %r1440,%r584,%r617; @ %r1440 bra $L84; bra $L76; $L20: .loc 1 2958 11 cvt.u16.u32 %r1441,%r627; setp.eq.u16 %r1442,%r1441,1; @ ! 
%r1442 bra $L343; bra $L85; $L31: .loc 1 2920 38 set.u32.eq.u64 %r1444,%r150,1; neg.s32 %r1445,%r1444; .loc 1 2920 26 cvt.u16.u32 %r1447,%r1445; and.b16 %r1446,%r1447,%r1187; .loc 1 2920 11 cvt.u32.u16 %r1449,%r1446; cvt.u16.u8 %r1450,%r1449; setp.eq.u16 %r1451,%r1450,0; @ %r1451 bra $L20; .loc 1 2920 43 setp.ne.u64 %r1452,%r663,1; @ %r1452 bra $L20; $L121: .loc 1 2922 10 cvt.u16.u32 %r1453,%r627; setp.ne.u16 %r1454,%r1453,1; @ %r1454 bra $L87; .loc 1 2948 4 setp.gt.s64 %r1455,%r163,0; @ %r1455 bra $L88; bra $L1; $L87: .loc 1 2929 4 setp.le.s64 %r1456,%r163,0; @ %r1456 bra $L1; shl.b64 %r830,%r664,4; shl.b64 %r1457,%r162,4; add.u64 %r829,%r222,%r1457; shl.b64 %r836,%r151,4; mov.u64 %r834,%r221; shl.b64 %r818,%r298,4; shl.b64 %r1458,%r638,4; add.u64 %r822,%r220,%r1458; mov.u64 %r827,0; .loc 1 2929 11 mov.u64 %r258,%r827; setp.gt.s64 %r1776,%r162,0; setp.le.s64 %r1597,%r638,0; $L91: .loc 1 2933 8 @ %r1776 bra $L90; $L95: .loc 1 2929 29 add.u64 %r258,%r258,1; .loc 1 2929 4 add.u64 %r827,%r827,%r664; add.u64 %r829,%r829,%r830; add.u64 %r834,%r834,%r836; setp.ne.u64 %r1460,%r163,%r258; @ %r1460 bra $L91; bra $L1; $L90: shl.b64 %r1461,%r827,4; add.u64 %r815,%r222,%r1461; .loc 1 2933 8 mov.u64 %r821,%r822; mov.u64 %r816,%r220; $L94: .loc 1 2937 5 @ %r1597 bra $L140; add.u64 %r809,%r816,8; add.u64 %r810,%r834,8; mov.u64 %r808,%r834; mov.u64 %r807,%r816; mov.f64 %r108,0d0000000000000000; mov.f64 %r103,%r108; sub.u64 %r1642,%r821,%r816; add.u64 %r1643,%r1642,-16; shr.u64 %r1641,%r1643,4; add.u64 %r1644,%r1641,1; and.b64 %r1645,%r1644,3; setp.eq.u64 %r1648,%r1645,0; @ %r1648 bra $L93; setp.eq.u64 %r1647,%r1645,1; @ %r1647 bra $L288; setp.eq.u64 %r1646,%r1645,2; @ %r1646 bra $L289; bra $L344; $L93: ld.f64 %r186,[%r807]; ld.f64 %r193,[%r809]; ld.f64 %r177,[%r808]; ld.f64 %r175,[%r810]; .loc 1 2938 23 mul.f64 %r44,%r177,%r193; mul.f64 %r1463,%r175,%r193; neg.f64 %r1464,%r1463; fma.rn.f64 %r43,%r177,%r186,%r1464; fma.rn.f64 %r101,%r175,%r186,%r44; .loc 1 2938 9 add.f64 %r1777,%r103,%r43; add.f64 %r1778,%r108,%r101; ld.f64 %r1783,[%r807+16]; ld.f64 %r1784,[%r809+16]; ld.f64 %r1785,[%r808+16]; ld.f64 %r1786,[%r810+16]; .loc 1 2938 23 mul.f64 %r1787,%r1785,%r1784; mul.f64 %r1788,%r1786,%r1784; neg.f64 %r1789,%r1788; fma.rn.f64 %r1790,%r1785,%r1783,%r1789; fma.rn.f64 %r1791,%r1786,%r1783,%r1787; .loc 1 2938 9 add.f64 %r1792,%r1777,%r1790; add.f64 %r1793,%r1778,%r1791; ld.f64 %r1799,[%r807+32]; ld.f64 %r1800,[%r809+32]; ld.f64 %r1801,[%r808+32]; ld.f64 %r1802,[%r810+32]; .loc 1 2938 23 mul.f64 %r1803,%r1801,%r1800; mul.f64 %r1804,%r1802,%r1800; neg.f64 %r1805,%r1804; fma.rn.f64 %r1806,%r1801,%r1799,%r1805; fma.rn.f64 %r1807,%r1802,%r1799,%r1803; .loc 1 2938 9 add.f64 %r1808,%r1792,%r1806; add.f64 %r1809,%r1793,%r1807; ld.f64 %r1815,[%r807+48]; ld.f64 %r1816,[%r809+48]; ld.f64 %r1817,[%r808+48]; ld.f64 %r1818,[%r810+48]; .loc 1 2938 23 mul.f64 %r1819,%r1817,%r1816; mul.f64 %r1820,%r1818,%r1816; neg.f64 %r1821,%r1820; fma.rn.f64 %r1822,%r1817,%r1815,%r1821; fma.rn.f64 %r1823,%r1818,%r1815,%r1819; .loc 1 2938 9 add.f64 %r103,%r1808,%r1822; add.f64 %r108,%r1809,%r1823; .loc 1 2937 5 add.u64 %r807,%r807,64; add.u64 %r808,%r808,64; add.u64 %r809,%r809,64; add.u64 %r810,%r810,64; setp.ne.u64 %r1824,%r807,%r821; @ %r1824 bra $L93; bra $L92; $L140: mov.f64 %r108,0d0000000000000000; mov.f64 %r103,%r108; $L92: .loc 1 2939 15 st.f64 [%r815],%r103; st.f64 [%r815+8],%r108; .loc 1 2933 8 add.u64 %r815,%r815,16; add.u64 %r816,%r816,%r818; add.u64 %r821,%r821,%r818; setp.ne.u64 %r1466,%r815,%r829; @ %r1466 bra $L94; bra $L95; 
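The mul.f64 / neg.f64 / fma.rn.f64 groups in the accumulation loops above are the generated form of a complex multiply-accumulate: the two products of the imaginary operand are formed first, one is negated, and two fused multiply-adds produce the real and imaginary parts that are then added into the running sums. A minimal C sketch of one such step, assuming illustrative variable names (the generated code works on raw descriptor pointers, not named fields):

    #include <stdio.h>

    /* One step of the complex multiply-accumulate seen in the PTX above:
     * acc += a * b for double-precision complex values, with the real part
     * formed as a_re*b_re - (a_im*b_im) and the imaginary part as
     * a_re*b_im + a_im*b_re, matching the mul/neg/fma.rn.f64 groups. */
    static void cmac(double a_re, double a_im,
                     double b_re, double b_im,
                     double *acc_re, double *acc_im)
    {
        double t_im_re = a_im * b_re;        /* mul.f64            */
        double t_im_im = a_im * b_im;        /* mul.f64            */
        double re = a_re * b_re - t_im_im;   /* neg.f64 + fma.rn   */
        double im = a_re * b_im + t_im_re;   /* fma.rn.f64         */
        *acc_re += re;                       /* add.f64            */
        *acc_im += im;                       /* add.f64            */
    }

    int main(void)
    {
        double re = 0.0, im = 0.0;
        cmac(1.0, 2.0, 3.0, 4.0, &re, &im);  /* (1+2i)*(3+4i) = -5+10i */
        printf("%g %g\n", re, im);
        return 0;
    }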
$L88: shl.b64 %r853,%r664,4; mov.u64 %r851,%r222; shl.b64 %r842,%r298,4; .loc 1 2948 4 mov.u64 %r854,0; .loc 1 2948 11 mov.u64 %r257,%r854; setp.le.s64 %r1825,%r638,0; $L98: .loc 1 2952 8 @ %r1825 bra $L141; mov.u64 %r840,%r220; shl.b64 %r1468,%r854,4; add.u64 %r843,%r1468,%r221; add.u64 %r845,%r220,8; add.u64 %r1469,%r638,%r854; shl.b64 %r1470,%r1469,4; add.u64 %r861,%r1470,%r221; mov.f64 %r37,0d0000000000000000; mov.f64 %r38,%r37; sub.u64 %r1654,%r861,%r843; add.u64 %r1655,%r1654,-16; shr.u64 %r1653,%r1655,4; add.u64 %r1656,%r1653,1; and.b64 %r1657,%r1656,3; setp.eq.u64 %r1660,%r1657,0; @ %r1660 bra $L97; setp.eq.u64 %r1659,%r1657,1; @ %r1659 bra $L290; setp.eq.u64 %r1658,%r1657,2; @ %r1658 bra $L291; bra $L345; $L97: ld.f64 %r187,[%r840]; ld.f64 %r194,[%r845]; ld.f64 %r172,[%r843]; ld.f64 %r169,[%r843+8]; .loc 1 2953 26 mul.f64 %r87,%r172,%r194; mul.f64 %r1471,%r169,%r194; neg.f64 %r1472,%r1471; fma.rn.f64 %r69,%r172,%r187,%r1472; fma.rn.f64 %r68,%r169,%r187,%r87; .loc 1 2953 5 add.f64 %r1826,%r38,%r69; add.f64 %r1827,%r37,%r68; .loc 1 2952 8 add.u64 %r1828,%r840,%r842; add.u64 %r1661,%r843,16; add.u64 %r1830,%r845,%r842; ld.f64 %r1831,[%r1828]; ld.f64 %r1832,[%r1830]; ld.f64 %r1833,[%r1661]; ld.f64 %r1834,[%r1661+8]; .loc 1 2953 26 mul.f64 %r1835,%r1833,%r1832; mul.f64 %r1836,%r1834,%r1832; neg.f64 %r1837,%r1836; fma.rn.f64 %r1838,%r1833,%r1831,%r1837; fma.rn.f64 %r1839,%r1834,%r1831,%r1835; .loc 1 2953 5 add.f64 %r1840,%r1826,%r1838; add.f64 %r1841,%r1827,%r1839; .loc 1 2952 8 add.u64 %r1842,%r1828,%r842; add.u64 %r1843,%r843,32; add.u64 %r1844,%r1830,%r842; ld.f64 %r1846,[%r1842]; ld.f64 %r1847,[%r1844]; ld.f64 %r1848,[%r1843]; ld.f64 %r1849,[%r1843+8]; .loc 1 2953 26 mul.f64 %r1850,%r1848,%r1847; mul.f64 %r1851,%r1849,%r1847; neg.f64 %r1852,%r1851; fma.rn.f64 %r1853,%r1848,%r1846,%r1852; fma.rn.f64 %r1854,%r1849,%r1846,%r1850; .loc 1 2953 5 add.f64 %r1855,%r1840,%r1853; add.f64 %r1856,%r1841,%r1854; .loc 1 2952 8 add.u64 %r1857,%r1842,%r842; add.u64 %r1858,%r843,48; add.u64 %r1859,%r1844,%r842; ld.f64 %r1861,[%r1857]; ld.f64 %r1862,[%r1859]; ld.f64 %r1863,[%r1858]; ld.f64 %r1864,[%r1858+8]; .loc 1 2953 26 mul.f64 %r1865,%r1863,%r1862; mul.f64 %r1866,%r1864,%r1862; neg.f64 %r1867,%r1866; fma.rn.f64 %r1868,%r1863,%r1861,%r1867; fma.rn.f64 %r1869,%r1864,%r1861,%r1865; .loc 1 2953 5 add.f64 %r38,%r1855,%r1868; add.f64 %r37,%r1856,%r1869; .loc 1 2952 8 add.u64 %r840,%r1857,%r842; add.u64 %r843,%r843,64; add.u64 %r845,%r1859,%r842; setp.ne.u64 %r1870,%r843,%r861; @ %r1870 bra $L97; bra $L96; $L141: mov.f64 %r37,0d0000000000000000; mov.f64 %r38,%r37; $L96: .loc 1 2954 25 st.f64 [%r851],%r38; st.f64 [%r851+8],%r37; .loc 1 2948 29 add.u64 %r257,%r257,1; .loc 1 2948 4 add.u64 %r851,%r851,%r853; add.u64 %r854,%r854,%r151; setp.ne.u64 %r1474,%r163,%r257; @ %r1474 bra $L98; bra $L1; $L85: .loc 1 2963 7 setp.le.s64 %r1475,%r163,0; @ %r1475 bra $L1; shl.b64 %r872,%r844,4; mov.u64 %r870,%r222; shl.b64 %r864,%r298,4; shl.b64 %r867,%r663,4; mov.u64 %r873,0; .loc 1 2963 14 mov.u64 %r268,%r873; setp.le.s64 %r1871,%r638,0; $L101: .loc 1 2967 4 @ %r1871 bra $L142; mov.u64 %r862,%r220; shl.b64 %r1477,%r873,4; add.u64 %r865,%r221,%r1477; mov.f64 %r152,0d0000000000000000; mov.f64 %r188,%r152; .loc 1 2967 11 mov.u64 %r269,0; and.b64 %r1665,%r638,3; setp.eq.u64 %r1668,%r1665,0; @ %r1668 bra $L100; setp.eq.u64 %r1667,%r1665,1; @ %r1667 bra $L292; setp.eq.u64 %r1666,%r1665,2; @ %r1666 bra $L293; bra $L346; $L100: ld.f64 %r277,[%r862]; ld.f64 %r206,[%r862+8]; ld.f64 %r195,[%r865]; ld.f64 %r205,[%r865+8]; .loc 1 
2968 29 mul.f64 %r203,%r195,%r206; mul.f64 %r1478,%r205,%r206; neg.f64 %r1479,%r1478; fma.rn.f64 %r191,%r195,%r277,%r1479; fma.rn.f64 %r202,%r205,%r277,%r203; .loc 1 2968 8 add.f64 %r1872,%r188,%r191; add.f64 %r1873,%r152,%r202; .loc 1 2967 4 add.u64 %r1875,%r862,%r864; add.u64 %r1876,%r865,%r867; ld.f64 %r1877,[%r1875]; ld.f64 %r1878,[%r1875+8]; ld.f64 %r1879,[%r1876]; ld.f64 %r1880,[%r1876+8]; .loc 1 2968 29 mul.f64 %r1881,%r1879,%r1878; mul.f64 %r1882,%r1880,%r1878; neg.f64 %r1883,%r1882; fma.rn.f64 %r1884,%r1879,%r1877,%r1883; fma.rn.f64 %r1885,%r1880,%r1877,%r1881; .loc 1 2968 8 add.f64 %r1886,%r1872,%r1884; add.f64 %r1887,%r1873,%r1885; .loc 1 2967 4 add.u64 %r1889,%r1875,%r864; add.u64 %r1890,%r1876,%r867; ld.f64 %r1892,[%r1889]; ld.f64 %r1893,[%r1889+8]; ld.f64 %r1894,[%r1890]; ld.f64 %r1895,[%r1890+8]; .loc 1 2968 29 mul.f64 %r1896,%r1894,%r1893; mul.f64 %r1897,%r1895,%r1893; neg.f64 %r1898,%r1897; fma.rn.f64 %r1899,%r1894,%r1892,%r1898; fma.rn.f64 %r1900,%r1895,%r1892,%r1896; .loc 1 2968 8 add.f64 %r1901,%r1886,%r1899; add.f64 %r1902,%r1887,%r1900; .loc 1 2967 4 add.u64 %r1904,%r1889,%r864; add.u64 %r1905,%r1890,%r867; ld.f64 %r1907,[%r1904]; ld.f64 %r1908,[%r1904+8]; ld.f64 %r1909,[%r1905]; ld.f64 %r1910,[%r1905+8]; .loc 1 2968 29 mul.f64 %r1911,%r1909,%r1908; mul.f64 %r1912,%r1910,%r1908; neg.f64 %r1913,%r1912; fma.rn.f64 %r1914,%r1909,%r1907,%r1913; fma.rn.f64 %r1915,%r1910,%r1907,%r1911; .loc 1 2968 8 add.f64 %r188,%r1901,%r1914; add.f64 %r152,%r1902,%r1915; .loc 1 2967 28 add.u64 %r269,%r269,4; .loc 1 2967 4 add.u64 %r862,%r1904,%r864; add.u64 %r865,%r1905,%r867; setp.ne.u64 %r1916,%r638,%r269; @ %r1916 bra $L100; bra $L99; $L142: mov.f64 %r152,0d0000000000000000; mov.f64 %r188,%r152; $L99: .loc 1 2969 21 st.f64 [%r870],%r188; st.f64 [%r870+8],%r152; .loc 1 2963 32 add.u64 %r268,%r268,1; .loc 1 2963 7 add.u64 %r870,%r870,%r872; add.u64 %r873,%r873,%r151; setp.ne.u64 %r1481,%r163,%r268; @ %r1481 bra $L101; bra $L1; $L343: .loc 1 2972 11 setp.gt.s64 %r1482,%r150,%r298; @ %r1482 bra $L102; .loc 1 2993 7 setp.gt.s64 %r1483,%r163,0; @ %r1483 bra $L103; bra $L1; $L102: .loc 1 2974 7 setp.gt.s64 %r1484,%r163,0; @ %r1484 bra $L104; bra $L1; $L108: shl.b64 %r1485,%r914,4; add.u64 %r910,%r222,%r1485; .loc 1 2975 9 mov.u64 %r267,0; and.b64 %r1681,%r162,3; setp.eq.u64 %r1684,%r1681,0; @ %r1684 bra $L105; setp.eq.u64 %r1683,%r1681,1; @ %r1683 bra $L294; setp.eq.u64 %r1682,%r1681,2; @ %r1682 bra $L295; bra $L347; $L105: .loc 1 2976 34 st.f64 [%r910],%r1599; st.f64 [%r910+8],%r1600; .loc 1 2975 2 add.u64 %r1920,%r910,%r912; .loc 1 2976 34 st.f64 [%r1920],%r1599; st.f64 [%r1920+8],%r1600; .loc 1 2975 2 add.u64 %r1922,%r1920,%r912; .loc 1 2976 34 st.f64 [%r1922],%r1599; st.f64 [%r1922+8],%r1600; .loc 1 2975 2 add.u64 %r1925,%r1922,%r912; .loc 1 2976 34 st.f64 [%r1925],%r1599; st.f64 [%r1925+8],%r1600; .loc 1 2975 27 add.u64 %r267,%r267,4; .loc 1 2975 2 add.u64 %r910,%r1925,%r912; setp.ne.u64 %r1927,%r162,%r267; @ %r1927 bra $L105; $L109: .loc 1 2974 32 add.u64 %r266,%r288,1; .loc 1 2974 7 add.u64 %r914,%r914,%r664; setp.ne.u64 %r1489,%r163,%r266; @ %r1489 bra $L143; shl.b64 %r902,%r664,4; mov.u64 %r900,%r222; shl.b64 %r893,%r663,4; shl.b64 %r883,%r298,4; mov.u64 %r905,0; mov.u64 %r289,%r905; setp.le.s64 %r1928,%r638,0; add.u64 %r1598,%r221,8; bra $L107; $L104: shl.b64 %r912,%r844,4; mov.u64 %r914,0; .loc 1 2974 14 mov.u64 %r288,%r914; setp.gt.s64 %r1929,%r162,0; ld.const.f64 %r1599,[$LC9]; ld.const.f64 %r1600,[$LC9+8]; bra $L106; $L143: mov.u64 %r288,%r266; $L106: .loc 1 2975 2 @ %r1929 bra 
$L108; bra $L109; $L112: shl.b64 %r1491,%r889,4; add.u64 %r881,%r220,%r1491; add.u64 %r884,%r900,8; .loc 1 2980 4 mov.u64 %r878,%r900; .loc 1 2980 11 mov.u64 %r265,0; and.b64 %r1673,%r162,3; setp.eq.u64 %r1676,%r1673,0; @ %r1676 bra $L110; setp.eq.u64 %r1675,%r1673,1; @ %r1675 bra $L296; setp.eq.u64 %r1674,%r1673,2; @ %r1674 bra $L297; bra $L348; $L110: ld.f64 %r209,[%r881]; ld.f64 %r153,[%r881+8]; ld.f64 %r208,[%r891]; ld.f64 %r156,[%r894]; .loc 1 2983 37 mul.f64 %r155,%r153,%r208; ld.f64 %r1492,[%r878]; fma.rn.f64 %r495,%r208,%r209,%r1492; fma.rn.f64 %r160,%r156,%r209,%r155; .loc 1 2982 36 neg.f64 %r1493,%r153; fma.rn.f64 %r207,%r1493,%r156,%r495; ld.f64 %r1494,[%r884]; add.f64 %r154,%r160,%r1494; st.f64 [%r878],%r207; st.f64 [%r884],%r154; .loc 1 2980 4 add.u64 %r1931,%r878,%r912; add.u64 %r1932,%r881,%r883; add.u64 %r1933,%r884,%r912; ld.f64 %r1934,[%r1932]; ld.f64 %r1935,[%r1932+8]; ld.f64 %r1936,[%r891]; ld.f64 %r1937,[%r894]; .loc 1 2983 37 mul.f64 %r1938,%r1935,%r1936; ld.f64 %r1939,[%r1931]; fma.rn.f64 %r1940,%r1936,%r1934,%r1939; fma.rn.f64 %r1941,%r1937,%r1934,%r1938; .loc 1 2982 36 neg.f64 %r1942,%r1935; fma.rn.f64 %r1943,%r1942,%r1937,%r1940; ld.f64 %r1944,[%r1933]; add.f64 %r1945,%r1941,%r1944; st.f64 [%r1931],%r1943; st.f64 [%r1933],%r1945; .loc 1 2980 4 add.u64 %r1947,%r1931,%r912; add.u64 %r1948,%r1932,%r883; add.u64 %r1949,%r1933,%r912; ld.f64 %r1951,[%r1948]; ld.f64 %r1952,[%r1948+8]; ld.f64 %r1953,[%r891]; ld.f64 %r1954,[%r894]; .loc 1 2983 37 mul.f64 %r1955,%r1952,%r1953; ld.f64 %r1956,[%r1947]; fma.rn.f64 %r1957,%r1953,%r1951,%r1956; fma.rn.f64 %r1958,%r1954,%r1951,%r1955; .loc 1 2982 36 neg.f64 %r1959,%r1952; fma.rn.f64 %r1960,%r1959,%r1954,%r1957; ld.f64 %r1961,[%r1949]; add.f64 %r1962,%r1958,%r1961; st.f64 [%r1947],%r1960; st.f64 [%r1949],%r1962; .loc 1 2980 4 add.u64 %r1964,%r1947,%r912; add.u64 %r1965,%r1948,%r883; add.u64 %r1966,%r1949,%r912; ld.f64 %r1968,[%r1965]; ld.f64 %r1969,[%r1965+8]; ld.f64 %r1970,[%r891]; ld.f64 %r1971,[%r894]; .loc 1 2983 37 mul.f64 %r1972,%r1969,%r1970; ld.f64 %r1973,[%r1964]; fma.rn.f64 %r1974,%r1970,%r1968,%r1973; fma.rn.f64 %r1975,%r1971,%r1968,%r1972; .loc 1 2982 36 neg.f64 %r1976,%r1969; fma.rn.f64 %r1977,%r1976,%r1971,%r1974; ld.f64 %r1978,[%r1966]; add.f64 %r1979,%r1975,%r1978; st.f64 [%r1964],%r1977; st.f64 [%r1966],%r1979; .loc 1 2980 29 add.u64 %r265,%r265,4; .loc 1 2980 4 add.u64 %r878,%r1964,%r912; add.u64 %r881,%r1965,%r883; add.u64 %r884,%r1966,%r912; setp.ne.u64 %r1980,%r162,%r265; @ %r1980 bra $L110; $L113: .loc 1 2979 26 add.u64 %r264,%r264,1; .loc 1 2979 2 add.u64 %r889,%r889,%r150; add.u64 %r891,%r891,%r893; add.u64 %r894,%r894,%r893; setp.eq.u64 %r1496,%r638,%r264; @ %r1496 bra $L111; $L114: .loc 1 2980 4 @ %r1929 bra $L112; bra $L113; $L111: .loc 1 2978 32 add.u64 %r263,%r289,1; .loc 1 2978 7 add.u64 %r900,%r900,%r902; add.u64 %r905,%r905,%r151; setp.eq.u64 %r1498,%r288,%r289; @ %r1498 bra $L1; mov.u64 %r289,%r263; $L107: .loc 1 2979 2 @ %r1928 bra $L111; shl.b64 %r908,%r905,4; add.u64 %r891,%r221,%r908; add.u64 %r894,%r1598,%r908; mov.u64 %r889,0; .loc 1 2979 9 mov.u64 %r264,%r889; bra $L114; $L103: shl.b64 %r929,%r844,4; shl.b64 %r920,%r150,4; shl.b64 %r924,%r663,4; .loc 1 2993 7 mov.u64 %r938,0; mov.u64 %r936,%r938; .loc 1 2993 14 mov.u64 %r260,%r938; setp.gt.s64 %r1981,%r162,0; setp.le.s64 %r1601,%r638,0; $L116: .loc 1 2997 4 @ %r1981 bra $L115; $L120: .loc 1 2993 32 add.u64 %r260,%r260,1; .loc 1 2993 7 add.u64 %r936,%r936,%r664; add.u64 %r938,%r938,%r151; setp.ne.u64 %r1502,%r163,%r260; @ %r1502 bra $L116; 
bra $L1; $L115: shl.b64 %r1503,%r936,4; add.u64 %r927,%r222,%r1503; shl.b64 %r1504,%r938,4; add.u64 %r944,%r221,%r1504; .loc 1 2997 4 mov.u64 %r931,0; .loc 1 2997 11 mov.u64 %r261,%r931; $L119: .loc 1 3001 8 @ %r1601 bra $L144; shl.b64 %r1506,%r931,4; add.u64 %r918,%r220,%r1506; mov.u64 %r922,%r944; mov.f64 %r161,0d0000000000000000; mov.f64 %r211,%r161; .loc 1 3001 15 mov.u64 %r262,0; and.b64 %r1689,%r638,3; setp.eq.u64 %r1692,%r1689,0; @ %r1692 bra $L118; setp.eq.u64 %r1691,%r1689,1; @ %r1691 bra $L298; setp.eq.u64 %r1690,%r1689,2; @ %r1690 bra $L299; bra $L349; $L118: ld.f64 %r285,[%r918]; ld.f64 %r214,[%r918+8]; ld.f64 %r159,[%r922]; ld.f64 %r213,[%r922+8]; .loc 1 3002 28 mul.f64 %r271,%r159,%r214; mul.f64 %r1507,%r213,%r214; neg.f64 %r1508,%r1507; fma.rn.f64 %r212,%r159,%r285,%r1508; fma.rn.f64 %r158,%r213,%r285,%r271; .loc 1 3002 5 add.f64 %r1983,%r211,%r212; add.f64 %r1984,%r161,%r158; .loc 1 3001 8 add.u64 %r1986,%r918,%r920; add.u64 %r1987,%r922,%r924; ld.f64 %r1988,[%r1986]; ld.f64 %r1989,[%r1986+8]; ld.f64 %r1990,[%r1987]; ld.f64 %r1991,[%r1987+8]; .loc 1 3002 28 mul.f64 %r1992,%r1990,%r1989; mul.f64 %r1993,%r1991,%r1989; neg.f64 %r1994,%r1993; fma.rn.f64 %r1995,%r1990,%r1988,%r1994; fma.rn.f64 %r1996,%r1991,%r1988,%r1992; .loc 1 3002 5 add.f64 %r1997,%r1983,%r1995; add.f64 %r1998,%r1984,%r1996; .loc 1 3001 8 add.u64 %r2000,%r1986,%r920; add.u64 %r2001,%r1987,%r924; ld.f64 %r2003,[%r2000]; ld.f64 %r2004,[%r2000+8]; ld.f64 %r2005,[%r2001]; ld.f64 %r2006,[%r2001+8]; .loc 1 3002 28 mul.f64 %r2007,%r2005,%r2004; mul.f64 %r2008,%r2006,%r2004; neg.f64 %r2009,%r2008; fma.rn.f64 %r2010,%r2005,%r2003,%r2009; fma.rn.f64 %r2011,%r2006,%r2003,%r2007; .loc 1 3002 5 add.f64 %r2012,%r1997,%r2010; add.f64 %r2013,%r1998,%r2011; .loc 1 3001 8 add.u64 %r2015,%r2000,%r920; add.u64 %r2016,%r2001,%r924; ld.f64 %r2018,[%r2015]; ld.f64 %r2019,[%r2015+8]; ld.f64 %r2020,[%r2016]; ld.f64 %r2021,[%r2016+8]; .loc 1 3002 28 mul.f64 %r2022,%r2020,%r2019; mul.f64 %r2023,%r2021,%r2019; neg.f64 %r2024,%r2023; fma.rn.f64 %r2025,%r2020,%r2018,%r2024; fma.rn.f64 %r2026,%r2021,%r2018,%r2022; .loc 1 3002 5 add.f64 %r211,%r2012,%r2025; add.f64 %r161,%r2013,%r2026; .loc 1 3001 32 add.u64 %r262,%r262,4; .loc 1 3001 8 add.u64 %r918,%r2015,%r920; add.u64 %r922,%r2016,%r924; setp.ne.u64 %r2027,%r638,%r262; @ %r2027 bra $L118; bra $L117; $L144: mov.f64 %r161,0d0000000000000000; mov.f64 %r211,%r161; $L117: .loc 1 3003 27 st.f64 [%r927],%r211; st.f64 [%r927+8],%r161; .loc 1 2997 29 add.u64 %r261,%r261,1; .loc 1 2997 4 add.u64 %r927,%r927,%r929; add.u64 %r931,%r931,%r298; setp.ne.u64 %r1510,%r162,%r261; @ %r1510 bra $L119; bra $L120; $L32: .loc 1 2920 11 setp.eq.u64 %r1511,%r150,1; @ %r1511 bra $L121; bra $L20; $L2: .loc 1 2476 6 ld.u64 %r1512,[%r945]; setp.ne.u64 %r1513,%r1512,0; @ %r1513 bra $L122; .loc 1 2485 4 ld.u64 %r1515,[%r946+56]; add.u64 %r1514,%r1515,1; ld.u64 %r1516,[%r946+48]; sub.u64 %r668,%r1514,%r1516; .loc 1 2483 15 ld.s8 %r1518,[%r947+28]; cvt.u16.u32 %r1517,%r1518; setp.eq.u16 %r1519,%r1517,1; @ %r1519 bra $L123; bra $L124; $L126: .loc 1 2560 16 ld.u64 %r298,[%r946+40]; .loc 1 2548 27 mov.u64 %r844,%r664; bra $L125; $L14: ld.u64 %r664,[%r945+40]; .loc 1 2543 6 ld.s8 %r1521,[%r945+28]; cvt.u16.u32 %r1520,%r1521; setp.eq.u16 %r1522,%r1520,1; @ ! 
%r1522 bra $L127; bra $L126; $L11: .loc 1 2560 16 ld.u64 %r298,[%r946+40]; mov.u32 %r627,%r901; .loc 1 2548 27 mov.u64 %r664,%r844; .loc 1 2563 14 mov.u64 %r162,1; .loc 1 2561 16 mov.u64 %r150,%r162; bra $L12; $L5: .loc 1 2485 4 ld.u64 %r1524,[%r946+56]; add.u64 %r1523,%r1524,1; ld.u64 %r1525,[%r946+48]; sub.u64 %r668,%r1523,%r1525; bra $L124; $L128: .loc 1 2509 17 ld.u64 %r1527,[%r945+56]; add.u64 %r1526,%r1527,1; .loc 1 2509 15 ld.u64 %r1528,[%r945+48]; sub.u64 %r638,%r1526,%r1528; ld.s8 %r1551,[%r947+28]; bra $L9; $L122: .loc 1 2502 12 cvta.global.u64 %r1529,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r1530,[%r1529+36]; setp.ne.u32 %r1531,%r1530,0; @ ! %r1531 bra $L8; bra $L128; $L1$L342: ld.f64 %r2028,[%r616]; ld.f64 %r2029,[%r621]; ld.f64 %r2030,[%r578]; ld.f64 %r2031,[%r578+8]; .loc 1 2908 13 mul.f64 %r2032,%r2029,%r2030; mul.f64 %r2033,%r2029,%r2031; neg.f64 %r2034,%r2033; fma.rn.f64 %r2035,%r2028,%r2030,%r2034; fma.rn.f64 %r2036,%r2028,%r2031,%r2032; .loc 1 2907 11 add.f64 %r460,%r460,%r2035; add.f64 %r461,%r461,%r2036; .loc 1 2905 10 add.u64 %r297,%r616,16; add.u64 %r256,%r578,16; add.u64 %r621,%r616,24; $L287: ld.f64 %r2038,[%r297]; ld.f64 %r2039,[%r621]; ld.f64 %r2040,[%r256]; ld.f64 %r2041,[%r256+8]; .loc 1 2908 13 mul.f64 %r2042,%r2039,%r2040; mul.f64 %r2043,%r2039,%r2041; neg.f64 %r2044,%r2043; fma.rn.f64 %r2045,%r2038,%r2040,%r2044; fma.rn.f64 %r2046,%r2038,%r2041,%r2042; .loc 1 2907 11 add.f64 %r460,%r460,%r2045; add.f64 %r461,%r461,%r2046; .loc 1 2905 10 add.u64 %r297,%r297,16; add.u64 %r256,%r256,16; add.u64 %r621,%r621,16; $L286: ld.f64 %r2048,[%r297]; ld.f64 %r2049,[%r621]; ld.f64 %r2050,[%r256]; ld.f64 %r2051,[%r256+8]; .loc 1 2908 13 mul.f64 %r2052,%r2049,%r2050; mul.f64 %r2053,%r2049,%r2051; neg.f64 %r2054,%r2053; fma.rn.f64 %r2055,%r2048,%r2050,%r2054; fma.rn.f64 %r2056,%r2048,%r2051,%r2052; .loc 1 2907 11 add.f64 %r460,%r460,%r2055; add.f64 %r461,%r461,%r2056; .loc 1 2905 10 add.u64 %r297,%r297,16; add.u64 %r256,%r256,16; add.u64 %r621,%r621,16; setp.ne.u64 %r2057,%r256,%r575; @ %r2057 bra $L83; bra $L82; $L341: ld.f64 %r2058,[%r737+8]; .loc 1 2762 43 ld.f64 %r2059,[%r737]; st.f64 [%r678],%r2059; st.f64 [%r683],%r2058; .loc 1 2760 9 add.u64 %r676,%r737,16; add.u64 %r677,%r678,4096; add.u64 %r682,%r683,4096; $L285: ld.f64 %r2061,[%r676+8]; .loc 1 2762 43 ld.f64 %r2062,[%r676]; st.f64 [%r677],%r2062; st.f64 [%r682],%r2061; .loc 1 2760 9 add.u64 %r676,%r676,16; add.u64 %r677,%r677,4096; add.u64 %r682,%r682,4096; $L284: ld.f64 %r2064,[%r676+8]; .loc 1 2762 43 ld.f64 %r2065,[%r676]; st.f64 [%r677],%r2065; st.f64 [%r682],%r2064; .loc 1 2760 9 add.u64 %r676,%r676,16; add.u64 %r677,%r677,4096; add.u64 %r682,%r682,4096; setp.ne.u64 %r2066,%r676,%r692; @ %r2066 bra $L58; bra $L48; $L340: ld.f64 %r2067,[%r694+8]; .loc 1 2740 49 ld.f64 %r2068,[%r694]; st.f64 [%r695],%r2068; st.f64 [%r695+8],%r2067; ld.f64 %r2069,[%r696+8]; .loc 1 2742 49 ld.f64 %r2070,[%r696]; st.f64 [%r695+16],%r2070; st.f64 [%r695+24],%r2069; ld.f64 %r2071,[%r694+24]; .loc 1 2744 49 ld.f64 %r2072,[%r694+16]; st.f64 [%r695+4096],%r2072; st.f64 [%r695+4104],%r2071; ld.f64 %r2073,[%r696+24]; .loc 1 2746 49 ld.f64 %r2074,[%r696+16]; st.f64 [%r695+4112],%r2074; st.f64 [%r695+4120],%r2073; .loc 1 2738 9 add.u64 %r694,%r694,32; add.u64 %r695,%r695,8192; add.u64 %r696,%r696,32; setp.ne.u64 %r2075,%r696,%r706; @ %r2075 bra $L52; bra $L53; $L338: .loc 1 2688 2 @ ! 
%r1556 bra $L350; .loc 1 2689 22 mov.u32 %r20762076077,[%value_in]; } $L350: .loc 1 2687 24 mov.u64 %r255,2; .loc 1 2687 7 add.u64 %r800,%r222,%r802; $L283: .loc 1 2688 2 @ ! %r1556 bra $L351; .loc 1 2689 22 mov.u32 %r20792079080,[%value_in]; } $L351: .loc 1 2687 24 add.u64 %r255,%r255,1; .loc 1 2687 7 add.u64 %r800,%r800,%r802; $L282: .loc 1 2688 2 @ ! %r1556 bra $L352; .loc 1 2689 22 mov.u32 %r2082082083,[%value_in]; } $L352: .loc 1 2687 24 add.u64 %r255,%r255,1; .loc 1 2687 7 add.u64 %r800,%r800,%r802; setp.eq.u64 %r2084,%r255,%r806; @ ! %r2084 bra $L34; bra $L33; $L220: .loc 1 2689 22 mov.u32 %r208516952085086,[%value_in]; } $L339: .loc 1 2687 7 add.u64 %r2088,%r1695,%r802; .loc 1 2688 2 @ ! %r1556 bra $L353; .loc 1 2689 22 mov.u32 %r20208090091,[%value_in]; } $L353: .loc 1 2687 7 add.u64 %r2093,%r2088,%r802; .loc 1 2688 2 @ ! %r1556 bra $L354; .loc 1 2689 22 mov.u32 %r202092095096,[%value_in]; } $L354: .loc 1 2687 24 add.u64 %r255,%r1640,3; .loc 1 2687 7 add.u64 %r800,%r2093,%r802; setp.eq.u64 %r2097,%r255,%r806; @ ! %r2097 bra $L34; bra $L33; $L344: ld.f64 %r2098,[%r816]; ld.f64 %r2099,[%r809]; ld.f64 %r2100,[%r834]; ld.f64 %r2101,[%r810]; .loc 1 2938 23 mul.f64 %r2102,%r2100,%r2099; mul.f64 %r2103,%r2101,%r2099; neg.f64 %r2104,%r2103; .loc 1 2938 9 fma.rn.f64 %r103,%r2100,%r2098,%r2104; fma.rn.f64 %r108,%r2101,%r2098,%r2102; .loc 1 2937 5 add.u64 %r807,%r816,16; add.u64 %r808,%r834,16; add.u64 %r809,%r816,24; add.u64 %r810,%r834,24; $L289: ld.f64 %r2108,[%r807]; ld.f64 %r2109,[%r809]; ld.f64 %r2110,[%r808]; ld.f64 %r2111,[%r810]; .loc 1 2938 23 mul.f64 %r2112,%r2110,%r2109; mul.f64 %r2113,%r2111,%r2109; neg.f64 %r2114,%r2113; fma.rn.f64 %r2115,%r2110,%r2108,%r2114; fma.rn.f64 %r2116,%r2111,%r2108,%r2112; .loc 1 2938 9 add.f64 %r103,%r103,%r2115; add.f64 %r108,%r108,%r2116; .loc 1 2937 5 add.u64 %r807,%r807,16; add.u64 %r808,%r808,16; add.u64 %r809,%r809,16; add.u64 %r810,%r810,16; $L288: ld.f64 %r2118,[%r807]; ld.f64 %r2119,[%r809]; ld.f64 %r2120,[%r808]; ld.f64 %r2121,[%r810]; .loc 1 2938 23 mul.f64 %r2122,%r2120,%r2119; mul.f64 %r2123,%r2121,%r2119; neg.f64 %r2124,%r2123; fma.rn.f64 %r2125,%r2120,%r2118,%r2124; fma.rn.f64 %r2126,%r2121,%r2118,%r2122; .loc 1 2938 9 add.f64 %r103,%r103,%r2125; add.f64 %r108,%r108,%r2126; .loc 1 2937 5 add.u64 %r807,%r807,16; add.u64 %r808,%r808,16; add.u64 %r809,%r809,16; add.u64 %r810,%r810,16; setp.ne.u64 %r2127,%r807,%r821; @ %r2127 bra $L93; bra $L92; $L345: ld.f64 %r2128,[%r220]; ld.f64 %r2129,[%r845]; ld.f64 %r2130,[%r843]; ld.f64 %r2131,[%r843+8]; .loc 1 2953 26 mul.f64 %r2132,%r2130,%r2129; mul.f64 %r2133,%r2131,%r2129; neg.f64 %r2134,%r2133; .loc 1 2953 5 fma.rn.f64 %r38,%r2130,%r2128,%r2134; fma.rn.f64 %r37,%r2131,%r2128,%r2132; .loc 1 2952 8 add.u64 %r840,%r220,%r842; add.u64 %r843,%r843,16; add.u64 %r845,%r845,%r842; $L291: ld.f64 %r2138,[%r840]; ld.f64 %r2139,[%r845]; ld.f64 %r2140,[%r843]; ld.f64 %r2141,[%r843+8]; .loc 1 2953 26 mul.f64 %r2142,%r2140,%r2139; mul.f64 %r2143,%r2141,%r2139; neg.f64 %r2144,%r2143; fma.rn.f64 %r2145,%r2140,%r2138,%r2144; fma.rn.f64 %r2146,%r2141,%r2138,%r2142; .loc 1 2953 5 add.f64 %r38,%r38,%r2145; add.f64 %r37,%r37,%r2146; .loc 1 2952 8 add.u64 %r840,%r840,%r842; add.u64 %r843,%r843,16; add.u64 %r845,%r845,%r842; $L290: ld.f64 %r2148,[%r840]; ld.f64 %r2149,[%r845]; ld.f64 %r2150,[%r843]; ld.f64 %r2151,[%r843+8]; .loc 1 2953 26 mul.f64 %r2152,%r2150,%r2149; mul.f64 %r2153,%r2151,%r2149; neg.f64 %r2154,%r2153; fma.rn.f64 %r2155,%r2150,%r2148,%r2154; fma.rn.f64 %r2156,%r2151,%r2148,%r2152; .loc 1 2953 5 
add.f64 %r38,%r38,%r2155; add.f64 %r37,%r37,%r2156; .loc 1 2952 8 add.u64 %r840,%r840,%r842; add.u64 %r843,%r843,16; add.u64 %r845,%r845,%r842; setp.ne.u64 %r2157,%r843,%r861; @ %r2157 bra $L97; bra $L96; $L346: ld.f64 %r2158,[%r220]; ld.f64 %r2159,[%r220+8]; ld.f64 %r2160,[%r865]; ld.f64 %r2161,[%r865+8]; .loc 1 2968 29 mul.f64 %r2162,%r2160,%r2159; mul.f64 %r2163,%r2161,%r2159; neg.f64 %r2164,%r2163; .loc 1 2968 8 fma.rn.f64 %r188,%r2160,%r2158,%r2164; fma.rn.f64 %r152,%r2161,%r2158,%r2162; .loc 1 2967 28 mov.u64 %r269,1; .loc 1 2967 4 add.u64 %r862,%r220,%r864; add.u64 %r865,%r865,%r867; $L293: ld.f64 %r2168,[%r862]; ld.f64 %r2169,[%r862+8]; ld.f64 %r2170,[%r865]; ld.f64 %r2171,[%r865+8]; .loc 1 2968 29 mul.f64 %r2172,%r2170,%r2169; mul.f64 %r2173,%r2171,%r2169; neg.f64 %r2174,%r2173; fma.rn.f64 %r2175,%r2170,%r2168,%r2174; fma.rn.f64 %r2176,%r2171,%r2168,%r2172; .loc 1 2968 8 add.f64 %r188,%r188,%r2175; add.f64 %r152,%r152,%r2176; .loc 1 2967 28 add.u64 %r269,%r269,1; .loc 1 2967 4 add.u64 %r862,%r862,%r864; add.u64 %r865,%r865,%r867; $L292: ld.f64 %r2178,[%r862]; ld.f64 %r2179,[%r862+8]; ld.f64 %r2180,[%r865]; ld.f64 %r2181,[%r865+8]; .loc 1 2968 29 mul.f64 %r2182,%r2180,%r2179; mul.f64 %r2183,%r2181,%r2179; neg.f64 %r2184,%r2183; fma.rn.f64 %r2185,%r2180,%r2178,%r2184; fma.rn.f64 %r2186,%r2181,%r2178,%r2182; .loc 1 2968 8 add.f64 %r188,%r188,%r2185; add.f64 %r152,%r152,%r2186; .loc 1 2967 28 add.u64 %r269,%r269,1; .loc 1 2967 4 add.u64 %r862,%r862,%r864; add.u64 %r865,%r865,%r867; setp.ne.u64 %r2187,%r638,%r269; @ %r2187 bra $L100; bra $L99; $L348: ld.f64 %r2188,[%r881]; ld.f64 %r2189,[%r881+8]; ld.f64 %r2190,[%r891]; ld.f64 %r2191,[%r894]; .loc 1 2983 37 mul.f64 %r2192,%r2189,%r2190; ld.f64 %r2193,[%r900]; fma.rn.f64 %r2194,%r2190,%r2188,%r2193; fma.rn.f64 %r2195,%r2191,%r2188,%r2192; .loc 1 2982 36 neg.f64 %r2196,%r2189; fma.rn.f64 %r2197,%r2196,%r2191,%r2194; ld.f64 %r2198,[%r884]; add.f64 %r2199,%r2195,%r2198; st.f64 [%r900],%r2197; st.f64 [%r884],%r2199; .loc 1 2980 29 mov.u64 %r265,1; .loc 1 2980 4 add.u64 %r878,%r900,%r912; add.u64 %r881,%r881,%r883; add.u64 %r884,%r884,%r912; $L297: ld.f64 %r2201,[%r881]; ld.f64 %r2202,[%r881+8]; ld.f64 %r2203,[%r891]; ld.f64 %r2204,[%r894]; .loc 1 2983 37 mul.f64 %r2205,%r2202,%r2203; ld.f64 %r2206,[%r878]; fma.rn.f64 %r2207,%r2203,%r2201,%r2206; fma.rn.f64 %r2208,%r2204,%r2201,%r2205; .loc 1 2982 36 neg.f64 %r2209,%r2202; fma.rn.f64 %r2210,%r2209,%r2204,%r2207; ld.f64 %r2211,[%r884]; add.f64 %r2212,%r2208,%r2211; st.f64 [%r878],%r2210; st.f64 [%r884],%r2212; .loc 1 2980 29 add.u64 %r265,%r265,1; .loc 1 2980 4 add.u64 %r878,%r878,%r912; add.u64 %r881,%r881,%r883; add.u64 %r884,%r884,%r912; $L296: ld.f64 %r2214,[%r881]; ld.f64 %r2215,[%r881+8]; ld.f64 %r2216,[%r891]; ld.f64 %r2217,[%r894]; .loc 1 2983 37 mul.f64 %r2218,%r2215,%r2216; ld.f64 %r2219,[%r878]; fma.rn.f64 %r2220,%r2216,%r2214,%r2219; fma.rn.f64 %r2221,%r2217,%r2214,%r2218; .loc 1 2982 36 neg.f64 %r2222,%r2215; fma.rn.f64 %r2223,%r2222,%r2217,%r2220; ld.f64 %r2224,[%r884]; add.f64 %r2225,%r2221,%r2224; st.f64 [%r878],%r2223; st.f64 [%r884],%r2225; .loc 1 2980 29 add.u64 %r265,%r265,1; .loc 1 2980 4 add.u64 %r878,%r878,%r912; add.u64 %r881,%r881,%r883; add.u64 %r884,%r884,%r912; setp.ne.u64 %r2226,%r162,%r265; @ %r2226 bra $L110; bra $L113; $L347: .loc 1 2976 34 st.f64 [%r910],%r1599; st.f64 [%r910+8],%r1600; .loc 1 2975 27 mov.u64 %r267,1; .loc 1 2975 2 add.u64 %r910,%r910,%r912; $L295: .loc 1 2976 34 st.f64 [%r910],%r1599; st.f64 [%r910+8],%r1600; .loc 1 2975 27 add.u64 
%r267,%r267,1; .loc 1 2975 2 add.u64 %r910,%r910,%r912; $L294: .loc 1 2976 34 st.f64 [%r910],%r1599; st.f64 [%r910+8],%r1600; .loc 1 2975 27 add.u64 %r267,%r267,1; .loc 1 2975 2 add.u64 %r910,%r910,%r912; setp.ne.u64 %r2229,%r162,%r267; @ %r2229 bra $L105; bra $L109; $L349: ld.f64 %r2230,[%r918]; ld.f64 %r2231,[%r918+8]; ld.f64 %r2232,[%r944]; ld.f64 %r2233,[%r944+8]; .loc 1 3002 28 mul.f64 %r2234,%r2232,%r2231; mul.f64 %r2235,%r2233,%r2231; neg.f64 %r2236,%r2235; .loc 1 3002 5 fma.rn.f64 %r211,%r2232,%r2230,%r2236; fma.rn.f64 %r161,%r2233,%r2230,%r2234; .loc 1 3001 32 mov.u64 %r262,1; .loc 1 3001 8 add.u64 %r918,%r918,%r920; add.u64 %r922,%r944,%r924; $L299: ld.f64 %r2240,[%r918]; ld.f64 %r2241,[%r918+8]; ld.f64 %r2242,[%r922]; ld.f64 %r2243,[%r922+8]; .loc 1 3002 28 mul.f64 %r2244,%r2242,%r2241; mul.f64 %r2245,%r2243,%r2241; neg.f64 %r2246,%r2245; fma.rn.f64 %r2247,%r2242,%r2240,%r2246; fma.rn.f64 %r2248,%r2243,%r2240,%r2244; .loc 1 3002 5 add.f64 %r211,%r211,%r2247; add.f64 %r161,%r161,%r2248; .loc 1 3001 32 add.u64 %r262,%r262,1; .loc 1 3001 8 add.u64 %r918,%r918,%r920; add.u64 %r922,%r922,%r924; $L298: ld.f64 %r2250,[%r918]; ld.f64 %r2251,[%r918+8]; ld.f64 %r2252,[%r922]; ld.f64 %r2253,[%r922+8]; .loc 1 3002 28 mul.f64 %r2254,%r2252,%r2251; mul.f64 %r2255,%r2253,%r2251; neg.f64 %r2256,%r2255; fma.rn.f64 %r2257,%r2252,%r2250,%r2256; fma.rn.f64 %r2258,%r2253,%r2250,%r2254; .loc 1 3002 5 add.f64 %r211,%r211,%r2257; add.f64 %r161,%r161,%r2258; .loc 1 3001 32 add.u64 %r262,%r262,1; .loc 1 3001 8 add.u64 %r918,%r918,%r920; add.u64 %r922,%r922,%r924; setp.ne.u64 %r2259,%r638,%r262; @ %r2259 bra $L118; bra $L117; } matmul_c10.o/ matmul_l4.o/_gfortran_matmul_l4 .visible .func _gfortran_matmul_l4fortran/generated/matmul_l4.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]FUNCTION DECL: _gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_97,116,109,117,108,95,108,5246116,114,97,110,47,103,101,110,101,114,97,116,101,100,47,109,97,116,109,117,108,95,108,52,46,9973,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,97,41,32,61,61,32,50,32,124,124,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,98,41,32,61,61,3273,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,50,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,12199,111,117,110,116,32,61,61,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,69,88,84,69,78,84,40,98,44,48,473,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,49,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_matmul_l4 .visible 
.func _gfortran_matmul_l4pred %r165; .reg .predpredpredpredpred32u64 %r271; .reg .u16 %r272; .reg .u32 %r273; .reg .pred %r274; .reg .u64 %r275; .reg .u64u64 %r280; .reg .u64pred %r310; .reg .predu64 %r327; .reg .u64pred %r334; .reg .predpred %r345; .reg .pred %r346; .reg .pred %r347; .reg .pred %r348; .reg .pred %r349; .reg .pred %r350; .reg .predpredpredpredpred %r392; .reg .u32 %r394; .reg .u16 %r395; .reg .predpredpredpredpredpredpred %r450; .reg .u64 %r451; .reg .u64 %r452; .reg .u32 %r455; .reg .u16 %r456; .reg .pred %r457; .reg .u32 %r458; .reg .u16 %r459; .reg .pred %r460; .reg .u64 %r461; .reg .u64 %r462; .reg .u32 %r465; .reg .u16 %r466; .reg .pred %r467; .reg .u32mov.u64 %r152,%ar0; mov.u64 %r153,%ar1; mov.u64 %r154,%ar2; .loc 1 64 3 ld.s8 %r22,[%r153+28]; cvt.u16.u32 %r155,%r22; setp.eq.u16 %r156,%r155,2; @ %r156 bra $L2; ld.s8 %r333,[%r154+28]; cvt.u16.u32 %r157,%r333; setp.eq.u16 %r159,%r157,2; @ %r159 bra $L3; cvta.const.u64 %r163,$LC0; cvta.const.u64 %r162,__func__$0; mov.u32 %r161,64; cvta.const.u64 %r160,$LC1; {16233: .loc 1 67 6 ld.u64 %r164,[%r1524; .loc 1 69 10 setp.ne.u16 %r167,%r155,1; @ %r167 bra $L5; .loc 1 71 4 st.u64 [%r152+48],%r164; ld.u64 %r170,[%r154+80]; ld.u64 %r172,[%r154+72]; sub.u64 %r173,%r170,%r172; st.u64 [%r152+56],%r173; mov.u64 %r174,1; st.u64 [%r152+40],%r174; bra $L6; $L28: .loc 1 76 4 st.u64 [%r152+48],%r316; add.u64 %r176,%r124,-1; st.u64 [%r152+56],%r176; mov.u64 %r177,1; st.u64 [%r152+40],%r177; bra $L6; $L29: .loc 1 81 4 mov.u64 %r178,0; st.u64 [%r152+48],%r178; add.u64 %r179,%r124,-1; st.u64 [%r152+56],%r179; mov.u64 %r180,1; st.u64 [%r152+40],%r180; .loc 1 84 11 st.u64 [%r152+72],%r178; ld.u64 %r183,[%r154+80]; ld.u64 %r185,[%r154+72]; sub.u64 %r186,%r183,%r185; st.u64 [%r152+80],%r186; st.u64 [%r152+64],%r124; $L6:152; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r188,[%value_in]; } .loc 1 90 4call (%value_in),_gfortrani_xmallocarray191,[%value_in]; } .loc 1 90 2 st.u64 [%r152],%r191; .loc 1 91 24 mov.u64 %r193,0; st.u64 [%r152+8],%r193; bra $L7; $L4: .loc 1 93 14 cvta.global.u64 %r194,_gfortrani_compile_options; .loc 1 93 13 ld.u32 %r195,[%r194+36]; setp.eq.u32 %r196,%r195,0; @ %r196 bra $L7; .loc 1 100 19 ld.u64 %r198,[%r152+56]; add.u64 %r197,%r198,1; .loc 1 100 17 ld.u64 %r199,[%r152+48]; sub.u64 %r123,%r197,%r199; .loc 1 97 5 setp.ne.u16 %r201,%r155,1; @ %r201 bra $L8; .loc 1 99 19 ld.u64 %r203,[%r154+80]; add.u64 %r202,%r203,1; .loc 1 99 17 ld.u64 %r204,[%r154+72]; sub.u64 %r107,%r202,%r204; .loc 1 101 9 setp.eq.u64 %r205,%r107,%r123; @ %r205 bra $L7; .loc 1 102 8 st.u64 [%stack+8],%r107; st.u64 [%stack],%r123; cvta.const.u64 %r206,$LC2; {stack; call _gfortran_runtime_error76 4 ld.u64 %r209,[%r153+56]; add.u64 %r208,%r209,1; ld.u64 %r210,[%r153+48]; sub.u64 %r148,%r208,%r210; .loc 1 106 10 cvt.u16.u32 %r211,%r333; setp.ne.u16 %r213,%r211,1; @ %r213 bra $L9; .loc 1 110 9 setp.eq.u64 %r214,%r123,%r148; @ %r214 bra $L7; .loc 1 111 8 st.u64 [%stack+8],%r148; st.u64 [%stack],%r123; cvta.const.u64 %r215215_gfortran_runtime_error119 9 setp.eq.u64 %r217,%r123,%r148; @ %r217 bra $L10; .loc 1 120 8 st.u64 [%stack+8],%r148; st.u64 [%stack],%r123;_gfortran_runtime_error10: .loc 1 125 19 ld.u64 %r221,[%r154+80]; add.u64 %r220,%r221,1; .loc 1 125 17 ld.u64 %r222,[%r154+72]; sub.u64 %r105,%r220,%r222; .loc 1 126 19 ld.u64 %r224,[%r152+80]; add.u64 %r223,%r224,1; .loc 1 126 17 ld.u64 %r225,[%r152+72]; sub.u64 %r106,%r223,%r225; .loc 1 127 9 setp.eq.u64 %r226,%r105,%r106; @ %r226 bra $L7; .loc 1 128 8 st.u64 [%stack+8],%r105; st.u64 
[%stack],%r106; cvta.const.u64 %r227227_gfortran_runtime_error135 9 ld.u64 %r89,[%r153]; .loc 1 136 12 ld.u64 %r51,[%r153+16]; .loc 1 138 19 cvt.u32.u64 %r52,%r51; .loc 1 138 44 add.u32 %r229,%r52,-4; and.b32 %r230,%r229,-5; set.u32.eq.u32 %r232,%r230,0; neg.s32 %r233,%r232; .loc 1 138 19 add.u32 %r234,%r52,-1; set.u32.le.u32 %r236,%r234,1; neg.s32 %r237,%r236; .loc 1 138 6 cvt.u16.u32 %r239,%r233; cvt.u16.u32 %r240,%r237; or.b16 %r238,%r239,%r240; cvt.u32.u16 %r241,%r238; cvt.u16.u8 %r242,%r24111; .loc 1 140 6 setp.ne.u32 %r245,%r52,16; @ %r245 bra $L12; $L11: .loc 1 147 9 ld.u64 %r110,[%r154]; .loc 1 148 12 ld.u64 %r57,[%r154+16]; .loc 1 150 19 cvt.u32.u64 %r58,%r57; .loc 1 150 44 add.u32 %r246,%r58,-4; and.b32 %r247,%r246,-5; set.u32.eq.u32 %r249,%r247,0; neg.s32 %r250,%r249; .loc 1 150 19 add.u32 %r251,%r58,-1; set.u32.le.u32 %r253,%r251,1; neg.s32 %r254,%r253; .loc 1 150 6 cvt.u16.u32 %r256,%r250; cvt.u16.u32 %r257,%r254; or.b16 %r255,%r256,%r257; cvt.u32.u16 %r258,%r255; cvt.u16.u8 %r259,%r258; setp.ne.u16 %r260,%r259,0; @ %r260 bra $L13; bra $L122; $L12: .loc 1 145 5 cvta.const.u64 %r262,$LC5; mov.u64 %r261,call _gfortrani_internal_error122: .loc 1 152 6 setp.ne.u32 %r264,%r58,16; @ %r264 bra $L12; $L13: .loc 1 159 8 ld.u64 %r111,[%r152]; .loc 1 164 16 ld.u64 %r92,[%r152+40]; .loc 1 162 6 ld.s8 %r266,[%r152+28]; cvt.u16.u32 %r265,%r266; setp.eq.u16 %r267,%r265,1; @ %r267 bra $L31; .loc 1 170 16 ld.u64 %r93,[%r152+64]; bra $L15; $L31: .loc 1 165 16 mov.u64 %r93,%r92; $L15: .loc 1 177 17 ld.u64 %r268,[%r153+40]; mul.lo.u64 %r96,%r51,%r268; .loc 1 178 15 ld.u64 %r270,[%r153+56]; add.u64 %r269,%r270,1; .loc 1 178 13 ld.u64 %r271,[%r153+48]; sub.u64 %r94,%r269,%r271; .loc 1 175 6 ld.s8 %r273,[%r153+28]; cvt.u16.u32 %r272,%r273; setp.eq.u16 %r274,%r272,1; @ %r274 bra $L32; .loc 1 185 17 ld.u64 %r275,[%r153+64]; mul.lo.u64 %r99,%r51,%r275; .loc 1 186 15 ld.u64 %r277,[%r153+80]; add.u64 %r276,%r277,1; .loc 1 186 13 ld.u64 %r278,[%r153+72]; sub.u64 %r101,%r276,%r278; bra $L16; $L32: .loc 1 178 13 mov.u64 %r101,%r94; .loc 1 177 15 mov.u64 %r99,%r96; .loc 1 179 15 mov.u64 %r96,0; .loc 1 181 14 mov.u64 %r94,1; .loc 1 180 16 mov.u64 %r92,%r96; $L16: .loc 1 192 17 ld.u64 %r279,[%r154+40]; mul.lo.u64 %r131,%r57,%r279; .loc 1 193 7 ld.u64 %r281,[%r154+56]; add.u64 %r280,%r281,1; ld.u64 %r282,[%r154+48]; sub.u64 %r135,%r280,%r282; .loc 1 190 6 ld.s8 %r284,[%r154+28]; cvt.u16.u32 %r283,%r284; setp.ne.u16 %r285,%r283,1; @ %r285 bra $L17; .loc 1 193 7 setp.eq.u64 %r286,%r101,%r135; @ %r286 bra $L33; cvta.const.u64 %r290,$LC6; cvta.const.u64 %r289,__func__$0; mov.u32 %r288,193;r2289907: .loc 1 201 7 setp.eq.u64 %r291,%r101,%r135; @ %r291 bra $L19; cvta.const.u64 %r295,$LC6; cvta.const.u64 %r294,__func__$0; mov.u32 %r293,201;r293294959: .loc 1 202 17 ld.u64 %r296,[%r154+64]; mul.lo.u64 %r74,%r57,%r296; .loc 1 203 16 ld.u64 %r298,[%r154+80]; add.u64 %r297,%r298,1; .loc 1 203 14 ld.u64 %r299,[%r154+72]; sub.u64 %r91,%r297,%r299; .loc 1 206 3 setp.le.s64 %r300,%r91,0; @ %r300 bra $L1; mov.u64 %r122,%r93; bra $L18; $L33: mov.u64 %r74,0; mov.u64 %r122,%r74; .loc 1 196 14 mov.u64 %r91,1; $L18: .loc 1 228 16 shl.b64 %r83,%r92,2; .loc 1 231 24 mul.lo.u64 %r55,%r94,%r96; .loc 1 231 13 neg.s64 %r85,%r55; .loc 1 233 36 mul.lo.u64 %r301,%r92,%r94; .loc 1 233 24 sub.u64 %r302,%r122,%r301; .loc 1 233 12 shl.b64 %r87,%r302,2; mul.lo.u64 %r86,%r83,%r94; .loc 1 206 10 mov.u64 %r112,0; setp.gt.s64 %r334,%r94,0; setp.le.s64 %r337,%r101,0; .loc 1 215 17 cvt.u32.u64 %r338,%r112; .loc 1 221 25 mov.u32 %r339,1; bra $L21; $L34: 
.loc 1 208 7 mov.u64 %r116,%r111; mov.u64 %r117,%r89; .loc 1 208 14 mov.u64 %r118,0; $L25: .loc 1 215 17 st.u32 [%r116],%r338; .loc 1 217 11 @ %r337 bra $L22; mov.u64 %r114,%r110; mov.u64 %r113,%r117; .loc 1 217 18 mov.u64 %r115,0; and.b64 %r344,%r101,7; setp.eq.u64 %r351,%r344,0; @ %r351 bra $L24; setp.eq.u64 %r350,%r344,1; @ %r350 bra $L78; setp.eq.u64 %r349,%r344,2; @ %r349 bra $L79; setp.eq.u64 %r348,%r344,3; @ %r348 bra $L80; setp.eq.u64 %r347,%r344,4; @ %r347 bra $L81; setp.eq.u64 %r346,%r344,5; @ %r346 bra $L82; setp.eq.u64 %r345,%r344,6; @ %r345 bra $L83; bra $L123; $L24: .loc 1 219 18 ld.s8 %r306,[%r113]; cvt.u16.u32 %r305,%r306; setp.eq.u16 %r307,%r305,0; @ %r307 bra $L23; .loc 1 219 23 ld.s8 %r309,[%r114]; cvt.u16.u32 %r308,%r309; setp.eq.u16 %r310,%r308,0; @ %r310 bra $L23; $L125: .loc 1 221 25 st.u32 [%r116],%r339; .loc 1 222 19 bra $L22; $L23: .loc 1 224 18 add.u64 %r353,%r113,%r99; .loc 1 225 18 add.u64 %r354,%r114,%r131; .loc 1 217 35 add.u64 %r352,%r115,1; .loc 1 219 18 ld.s8 %r356,[%r353]; cvt.u16.u32bra $L124; $L22: .loc 1 228 16 add.u64 %r116,%r116,%r83; .loc 1 229 17 add.u64 %r117,%r117,%r96; .loc 1 208 32 add.u64 %r118,%r118,1; .loc 1 208 7 setp.ne.u64 %r313,%r94,%r118; @ %r313 bra $L25; .loc 1 229 17 add.u64 %r89,%r89,%r55; .loc 1 228 16 add.u64 %r111,%r111,%r86; $L26: .loc 1 231 13 add.u64 %r89,%r89,%r85; .loc 1 232 13 add.u64 %r110,%r110,%r74; .loc 1 233 12 add.u64 %r111,%r111,%r87; .loc 1 206 28 add.u64 %r112,%r112,1; .loc 1 206 3 setp.le.s64 %r314,%r91,%r112; @ %r314 bra $L1; $L21: .loc 1 208 7 @ %r334 bra $L34; bra $L26; $L2: .loc 1 67 6 ld.u64 %r316,[%r152]; setp.ne.u64 %r317,%r316,0; @ %r317 bra $L27; .loc 1 76 4 ld.u64 %r319,[%r153+56]; add.u64 %r318,%r319,1; ld.u64 %r320,[%r153+48]; sub.u64 %r124,%r318,%r320; .loc 1 74 15 ld.s8 %r322,[%r154+28]; cvt.u16.u32 %r321,%r322; setp.eq.u16 %r323,%r321,1; @ %r323 bra $L28; bra $L29; $L5: .loc 1 76 4 ld.u64 %r325,[%r153+56]; add.u64 %r324,%r325,1; ld.u64 %r326,[%r153+48]; sub.u64 %r124,%r324,%r326; bra $L29; $L30: .loc 1 100 19 ld.u64 %r328,[%r152+56]; add.u64 %r327,%r328,1; .loc 1 100 17 ld.u64 %r329,[%r152+48]; sub.u64 %r123,%r327,%r329; ld.s8 %r333,[%r154+28]; bra $L8; $L27: .loc 1 93 14 cvta.global.u64 %r330,_gfortrani_compile_options; .loc 1 93 13 ld.u32 %r331,[%r330+36]; setp.ne.u32 %r332,%r331,0; @ ! %r332 bra $L7; bra $L30; $L1: .loc 1 235$L123: .loc 1 219 18 ld.s8 %r359,[%r117]; cvt.u16.u32.loc 1 219 23 ld.s8 %r362,[%r110]; cvt.u16.u32 %r363,%r362; setp.eq.u16 %r364,%r363,0; @ ! %r364 bra $L125; $L43: .loc 1 224 18 add.u64 %r113,%r117,%r99; .loc 1 225 18 add.u64 %r114,%r110,%r131; .loc 1 217 35 mov.u64 %r115,1; $L83: .loc 1 219 18 ld.s8 %r366,[%r113]; cvt.u16.u32 %r367,%r366; setp.eq.u16 %r368,%r367,0; @ %r368 bra $L46; .loc 1 219 23 ld.s8 %r369,[%r114]; cvt.u16.u32 %r370,%r369; setp.eq.u16 %r371,%r370,0; @ ! %r371 bra $L125; $L46: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L82: .loc 1 219 18 ld.s8 %r373,[%r113]; cvt.u16.u32 %r374,%r373; setp.eq.u16 %r375,%r374,0; @ %r375 bra $L49; .loc 1 219 23 ld.s8 %r376,[%r114]; cvt.u16.u32 %r377,%r376; setp.eq.u16 %r378,%r377,0; @ ! %r378 bra $L125; $L49: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L81: .loc 1 219 18 ld.s8 %r380,[%r113]; cvt.u16.u32.loc 1 219 23 ld.s8 %r383,[%r114]; cvt.u16.u32 %r384,%r383; setp.eq.u16 %r385,%r384,0; @ ! 
%r385 bra $L125; $L52: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L80: .loc 1 219 18 ld.s8 %r387,[%r113]; cvt.u16.u32 %r388,%r387; setp.eq.u16 %r389,%r388,0; @ %r389 bra $L55; .loc 1 219 23 ld.s8 %r390,[%r114]; cvt.u16.u32 %r391,%r390; setp.eq.u16 %r392,%r391,0; @ ! %r392 bra $L125; $L55: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L79: .loc 1 219 18 ld.s8 %r394,[%r113]; cvt.u16.u32 %r395,%r394; setp.eq.u16 %r396,%r395,0; @ %r396 bra $L58; .loc 1 219 23 ld.s8 %r397,[%r114]; cvt.u16.u32 %r398,%r397; setp.eq.u16 %r399,%r398,0; @ ! %r399 bra $L125; $L58: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L78: .loc 1 219 18 ld.s8 %r401,[%r113]; cvt.u16.u32.loc 1 219 23 ld.s8 %r404,[%r114]; cvt.u16.u32 %r405,%r404; setp.eq.u16 %r406,%r405,0; @ ! %r406 bra $L125; $L61: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; .loc 1 217 11 setp.ne.u64 %r407,%r101,%r115; @ %r407 bra $L24; bra $L22; $L124: .loc 1 219 23 ld.s8 %r408,[%r354]; cvt.u16.u32 %r409,%r408; setp.eq.u16 %r410,%r409,0; @ ! %r410 bra $L125; $L64: .loc 1 224 18 add.u64 %r411,%r353,%r99; .loc 1 225 18 add.u64 %r412,%r354,%r131; .loc 1 219 18 ld.s8 %r415,[%r411]; cvt.u16.u32.loc 1 219 23 ld.s8 %r418,[%r412]; cvt.u16.u32 %r419,%r418; setp.eq.u16 %r420,%r419,0; @ ! %r420 bra $L125; $L66: .loc 1 224 18 add.u64 %r421,%r411,%r99; .loc 1 225 18 add.u64 %r422,%r412,%r131; .loc 1 219 18 ld.s8 %r425,[%r421]; cvt.u16.u32 %r426,%r425; setp.eq.u16 %r427,%r426,0; @ %r427 bra $L68; .loc 1 219 23 ld.s8 %r428,[%r422]; cvt.u16.u32 %r429,%r428; setp.eq.u16 %r430,%r429,0; @ ! %r430 bra $L125; $L68: .loc 1 224 18 add.u64 %r431,%r421,%r99; .loc 1 225 18 add.u64 %r432,%r422,%r131; .loc 1 219 18 ld.s8 %r435,[%r431]; cvt.u16.u32.loc 1 219 23 ld.s8 %r438,[%r432]; cvt.u16.u32 %r439,%r438; setp.eq.u16 %r440,%r439,0; @ ! %r440 bra $L125; $L70: .loc 1 224 18 add.u64 %r441,%r431,%r99; .loc 1 225 18 add.u64 %r442,%r432,%r131; .loc 1 219 18 ld.s8 %r445,[%r441]; cvt.u16.u32 %r446,%r445; setp.eq.u16 %r447,%r446,0; @ %r447 bra $L72; .loc 1 219 23 ld.s8 %r448,[%r442]; cvt.u16.u32 %r449,%r448; setp.eq.u16 %r450,%r449,0; @ ! %r450 bra $L125; $L72: .loc 1 224 18 add.u64 %r451,%r441,%r99; .loc 1 225 18 add.u64 %r452,%r442,%r131; .loc 1 219 18 ld.s8 %r455,[%r451]; cvt.u16.u32 %r456,%r455; setp.eq.u16 %r457,%r456,0; @ %r457 bra $L74; .loc 1 219 23 ld.s8 %r458,[%r452]; cvt.u16.u32 %r459,%r458; setp.eq.u16 %r460,%r459,0; @ ! %r460 bra $L125; $L74: .loc 1 224 18 add.u64 %r461,%r451,%r99; .loc 1 225 18 add.u64 %r462,%r452,%r131; .loc 1 219 18 ld.s8 %r465,[%r461]; cvt.u16.u32 %r466,%r465; setp.eq.u16 %r467,%r466,0; @ %r467 bra $L76; .loc 1 219 23 ld.s8 %r468,[%r462]; cvt.u16.u32 %r469,%r468; setp.eq.u16 %r470,%r469,0; @ ! 
%r470 bra $L125; $L76: .loc 1 224 18 add.u64 %r113,%r461,%r99; .loc 1 225 18 add.u64 %r114,%r462,%r131; .loc 1 217 35 add.u64 %r115,%r352,7; .loc 1 217 11 setp.ne.u64 %r471,%r101,%r115; @ %r471 bra $L24; bra $L22; } matmul_l8.o/_gfortran_matmul_l8 .visible .func _gfortran_matmul_l8fortran/generated/matmul_l8.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]FUNCTION DECL: _gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_97,116,109,117,108,95,108,5646116,114,97,110,47,103,101,110,101,114,97,116,101,100,47,109,97,116,109,117,108,95,108,56,46,9973,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,97,41,32,61,61,32,50,32,124,124,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,98,41,32,61,61,3273,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,50,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,12199,111,117,110,116,32,61,61,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,69,88,84,69,78,84,40,98,44,48,473,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,49,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_matmul_l8 .visible .func _gfortran_matmul_l8pred %r165; .reg .predpredpredpredpred32u64 %r271; .reg .u16 %r272; .reg .u32 %r273; .reg .pred %r274; .reg .u64 %r275; .reg .u64u64 %r280; .reg .u64pred %r310; .reg .predu64 %r327; .reg .u64pred %r345; .reg .pred %r346; .reg .pred %r347; .reg .pred %r348; .reg .predpredpredpredpredpred %r392; .reg .u32 %r394; .reg .u16 %r395; .reg .predpredpredpredpredpredpred %r450; .reg .u64 %r451; .reg .u64 %r452; .reg .u32 %r455; .reg .u16 %r456; .reg .pred %r457; .reg .u32 %r458; .reg .u16 %r459; .reg .pred %r460; .reg .u64 %r461; .reg .u64 %r462; .reg .u32 %r465; .reg .u16 %r466; .reg .pred %r467; .reg .u32mov.u64 %r152,%ar0; mov.u64 %r153,%ar1; mov.u64 %r154,%ar2; .loc 1 64 3 ld.s8 %r22,[%r153+28]; cvt.u16.u32 %r155,%r22; setp.eq.u16 %r156,%r155,2; @ %r156 bra $L2; ld.s8 %r333,[%r154+28]; cvt.u16.u32 %r157,%r333; setp.eq.u16 %r159,%r157,2; @ %r159 bra $L3; cvta.const.u64 %r163,$LC0; cvta.const.u64 %r162,__func__$0; mov.u32 %r161,64; cvta.const.u64 %r160,$LC1; {16233: .loc 1 67 6 ld.u64 %r164,[%r1524; .loc 1 69 10 setp.ne.u16 %r167,%r155,1; @ %r167 bra $L5; .loc 1 71 4 st.u64 [%r152+48],%r164; ld.u64 %r170,[%r154+80]; ld.u64 %r172,[%r154+72]; sub.u64 %r173,%r170,%r172; st.u64 [%r152+56],%r173; mov.u64 %r174,1; st.u64 [%r152+40],%r174; bra $L6; $L28: .loc 1 76 4 st.u64 [%r152+48],%r316; add.u64 %r176,%r124,-1; st.u64 [%r152+56],%r176; mov.u64 %r177,1; st.u64 [%r152+40],%r177; bra $L6; $L29: .loc 1 81 4 
mov.u64 %r178,0; st.u64 [%r152+48],%r178; add.u64 %r179,%r124,-1; st.u64 [%r152+56],%r179; mov.u64 %r180,1; st.u64 [%r152+40],%r180; .loc 1 84 11 st.u64 [%r152+72],%r178; ld.u64 %r183,[%r154+80]; ld.u64 %r185,[%r154+72]; sub.u64 %r186,%r183,%r185; st.u64 [%r152+80],%r186; st.u64 [%r152+64],%r124; $L6:152; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r188,[%value_in]; } .loc 1 90 4 mov.u64 %r19018r190; call (%value_in),_gfortrani_xmallocarray191,[%value_in]; } .loc 1 90 2 st.u64 [%r152],%r191; .loc 1 91 24 mov.u64 %r193,0; st.u64 [%r152+8],%r193; bra $L7; $L4: .loc 1 93 14 cvta.global.u64 %r194,_gfortrani_compile_options; .loc 1 93 13 ld.u32 %r195,[%r194+36]; setp.eq.u32 %r196,%r195,0; @ %r196 bra $L7; .loc 1 100 19 ld.u64 %r198,[%r152+56]; add.u64 %r197,%r198,1; .loc 1 100 17 ld.u64 %r199,[%r152+48]; sub.u64 %r123,%r197,%r199; .loc 1 97 5 setp.ne.u16 %r201,%r155,1; @ %r201 bra $L8; .loc 1 99 19 ld.u64 %r203,[%r154+80]; add.u64 %r202,%r203,1; .loc 1 99 17 ld.u64 %r204,[%r154+72]; sub.u64 %r107,%r202,%r204; .loc 1 101 9 setp.eq.u64 %r205,%r107,%r123; @ %r205 bra $L7; .loc 1 102 8 st.u64 [%stack+8],%r107; st.u64 [%stack],%r123; cvta.const.u64 %r206,$LC2; {stack; call _gfortran_runtime_error76 4 ld.u64 %r209,[%r153+56]; add.u64 %r208,%r209,1; ld.u64 %r210,[%r153+48]; sub.u64 %r148,%r208,%r210; .loc 1 106 10 cvt.u16.u32 %r211,%r333; setp.ne.u16 %r213,%r211,1; @ %r213 bra $L9; .loc 1 110 9 setp.eq.u64 %r214,%r123,%r148; @ %r214 bra $L7; .loc 1 111 8 st.u64 [%stack+8],%r148; st.u64 [%stack],%r123; cvta.const.u64 %r215215_gfortran_runtime_error119 9 setp.eq.u64 %r217,%r123,%r148; @ %r217 bra $L10; .loc 1 120 8 st.u64 [%stack+8],%r148; st.u64 [%stack],%r123;_gfortran_runtime_error10: .loc 1 125 19 ld.u64 %r221,[%r154+80]; add.u64 %r220,%r221,1; .loc 1 125 17 ld.u64 %r222,[%r154+72]; sub.u64 %r105,%r220,%r222; .loc 1 126 19 ld.u64 %r224,[%r152+80]; add.u64 %r223,%r224,1; .loc 1 126 17 ld.u64 %r225,[%r152+72]; sub.u64 %r106,%r223,%r225; .loc 1 127 9 setp.eq.u64 %r226,%r105,%r106; @ %r226 bra $L7; .loc 1 128 8 st.u64 [%stack+8],%r105; st.u64 [%stack],%r106; cvta.const.u64 %r227227_gfortran_runtime_error135 9 ld.u64 %r89,[%r153]; .loc 1 136 12 ld.u64 %r51,[%r153+16]; .loc 1 138 19 cvt.u32.u64 %r52,%r51; .loc 1 138 44 add.u32 %r229,%r52,-4; and.b32 %r230,%r229,-5; set.u32.eq.u32 %r232,%r230,0; neg.s32 %r233,%r232; .loc 1 138 19 add.u32 %r234,%r52,-1; set.u32.le.u32 %r236,%r234,1; neg.s32 %r237,%r236; .loc 1 138 6 cvt.u16.u32 %r239,%r233; cvt.u16.u32 %r240,%r237; or.b16 %r238,%r239,%r240; cvt.u32.u16 %r241,%r238; cvt.u16.u8 %r242,%r24111; .loc 1 140 6 setp.ne.u32 %r245,%r52,16; @ %r245 bra $L12; $L11: .loc 1 147 9 ld.u64 %r110,[%r154]; .loc 1 148 12 ld.u64 %r57,[%r154+16]; .loc 1 150 19 cvt.u32.u64 %r58,%r57; .loc 1 150 44 add.u32 %r246,%r58,-4; and.b32 %r247,%r246,-5; set.u32.eq.u32 %r249,%r247,0; neg.s32 %r250,%r249; .loc 1 150 19 add.u32 %r251,%r58,-1; set.u32.le.u32 %r253,%r251,1; neg.s32 %r254,%r253; .loc 1 150 6 cvt.u16.u32 %r256,%r250; cvt.u16.u32 %r257,%r254; or.b16 %r255,%r256,%r257; cvt.u32.u16 %r258,%r255; cvt.u16.u8 %r259,%r258; setp.ne.u16 %r260,%r259,0; @ %r260 bra $L13; bra $L122; $L12: .loc 1 145 5 cvta.const.u64 %r262,$LC5; mov.u64 %r261,call _gfortrani_internal_error122: .loc 1 152 6 setp.ne.u32 %r264,%r58,16; @ %r264 bra $L12; $L13: .loc 1 159 8 ld.u64 %r111,[%r152]; .loc 1 164 16 ld.u64 %r92,[%r152+40]; .loc 1 162 6 ld.s8 %r266,[%r152+28]; cvt.u16.u32 %r265,%r266; setp.eq.u16 %r267,%r265,1; @ %r267 bra $L31; .loc 1 170 16 ld.u64 %r93,[%r152+64]; bra $L15; $L31: .loc 1 
165 16 mov.u64 %r93,%r92; $L15: .loc 1 177 17 ld.u64 %r268,[%r153+40]; mul.lo.u64 %r96,%r51,%r268; .loc 1 178 15 ld.u64 %r270,[%r153+56]; add.u64 %r269,%r270,1; .loc 1 178 13 ld.u64 %r271,[%r153+48]; sub.u64 %r94,%r269,%r271; .loc 1 175 6 ld.s8 %r273,[%r153+28]; cvt.u16.u32 %r272,%r273; setp.eq.u16 %r274,%r272,1; @ %r274 bra $L32; .loc 1 185 17 ld.u64 %r275,[%r153+64]; mul.lo.u64 %r99,%r51,%r275; .loc 1 186 15 ld.u64 %r277,[%r153+80]; add.u64 %r276,%r277,1; .loc 1 186 13 ld.u64 %r278,[%r153+72]; sub.u64 %r101,%r276,%r278; bra $L16; $L32: .loc 1 178 13 mov.u64 %r101,%r94; .loc 1 177 15 mov.u64 %r99,%r96; .loc 1 179 15 mov.u64 %r96,0; .loc 1 181 14 mov.u64 %r94,1; .loc 1 180 16 mov.u64 %r92,%r96; $L16: .loc 1 192 17 ld.u64 %r279,[%r154+40]; mul.lo.u64 %r131,%r57,%r279; .loc 1 193 7 ld.u64 %r281,[%r154+56]; add.u64 %r280,%r281,1; ld.u64 %r282,[%r154+48]; sub.u64 %r135,%r280,%r282; .loc 1 190 6 ld.s8 %r284,[%r154+28]; cvt.u16.u32 %r283,%r284; setp.ne.u16 %r285,%r283,1; @ %r285 bra $L17; .loc 1 193 7 setp.eq.u64 %r286,%r101,%r135; @ %r286 bra $L33; cvta.const.u64 %r290,$LC6; cvta.const.u64 %r289,__func__$0; mov.u32 %r288,193;r2289907: .loc 1 201 7 setp.eq.u64 %r291,%r101,%r135; @ %r291 bra $L19; cvta.const.u64 %r295,$LC6; cvta.const.u64 %r294,__func__$0; mov.u32 %r293,201;r293294959: .loc 1 202 17 ld.u64 %r296,[%r154+64]; mul.lo.u64 %r74,%r57,%r296; .loc 1 203 16 ld.u64 %r298,[%r154+80]; add.u64 %r297,%r298,1; .loc 1 203 14 ld.u64 %r299,[%r154+72]; sub.u64 %r91,%r297,%r299; .loc 1 206 3 setp.le.s64 %r300,%r91,0; @ %r300 bra $L1; mov.u64 %r122,%r93; bra $L18; $L33: mov.u64 %r74,0; mov.u64 %r122,%r74; .loc 1 196 14 mov.u64 %r91,1; $L18: .loc 1 228 16 shl.b64 %r83,%r92,3; .loc 1 231 24 mul.lo.u64 %r55,%r94,%r96; .loc 1 231 13 neg.s64 %r85,%r55; .loc 1 233 36 mul.lo.u64 %r301,%r92,%r94; .loc 1 233 24 sub.u64 %r302,%r122,%r301; .loc 1 233 12 shl.b64 %r87,%r302,3; mul.lo.u64 %r86,%r83,%r94; .loc 1 206 10 mov.u64 %r112,0; setp.gt.s64 %r334,%r94,0; setp.le.s64 %r337,%r101,0; .loc 1 215 17 mov.u64 %r338,%r112; .loc 1 221 25 mov.u64 %r339,1; bra $L21; $L34: .loc 1 208 7 mov.u64 %r116,%r111; mov.u64 %r117,%r89; .loc 1 208 14 mov.u64 %r118,0; $L25: .loc 1 215 17 st.u64 [%r116],%r338; .loc 1 217 11 @ %r337 bra $L22; mov.u64 %r114,%r110; mov.u64 %r113,%r117; .loc 1 217 18 mov.u64 %r115,0; and.b64 %r344,%r101,7; setp.eq.u64 %r351,%r344,0; @ %r351 bra $L24; setp.eq.u64 %r350,%r344,1; @ %r350 bra $L78; setp.eq.u64 %r349,%r344,2; @ %r349 bra $L79; setp.eq.u64 %r348,%r344,3; @ %r348 bra $L80; setp.eq.u64 %r347,%r344,4; @ %r347 bra $L81; setp.eq.u64 %r346,%r344,5; @ %r346 bra $L82; setp.eq.u64 %r345,%r344,6; @ %r345 bra $L83; bra $L123; $L24: .loc 1 219 18 ld.s8 %r306,[%r113]; cvt.u16.u32 %r305,%r306; setp.eq.u16 %r307,%r305,0; @ %r307 bra $L23; .loc 1 219 23 ld.s8 %r309,[%r114]; cvt.u16.u32 %r308,%r309; setp.eq.u16 %r310,%r308,0; @ %r310 bra $L23; $L125: .loc 1 221 25 st.u64 [%r116],%r339; .loc 1 222 19 bra $L22; $L23: .loc 1 224 18 add.u64 %r353,%r113,%r99; .loc 1 225 18 add.u64 %r354,%r114,%r131; .loc 1 217 35 add.u64 %r352,%r115,1; .loc 1 219 18 ld.s8 %r356,[%r353]; cvt.u16.u32bra $L124; $L22: .loc 1 228 16 add.u64 %r116,%r116,%r83; .loc 1 229 17 add.u64 %r117,%r117,%r96; .loc 1 208 32 add.u64 %r118,%r118,1; .loc 1 208 7 setp.ne.u64 %r313,%r94,%r118; @ %r313 bra $L25; .loc 1 229 17 add.u64 %r89,%r89,%r55; .loc 1 228 16 add.u64 %r111,%r111,%r86; $L26: .loc 1 231 13 add.u64 %r89,%r89,%r85; .loc 1 232 13 add.u64 %r110,%r110,%r74; .loc 1 233 12 add.u64 %r111,%r111,%r87; .loc 1 206 28 add.u64 %r112,%r112,1; .loc 1 
206 3 setp.le.s64 %r314,%r91,%r112; @ %r314 bra $L1; $L21: .loc 1 208 7 @ %r334 bra $L34; bra $L26; $L2: .loc 1 67 6 ld.u64 %r316,[%r152]; setp.ne.u64 %r317,%r316,0; @ %r317 bra $L27; .loc 1 76 4 ld.u64 %r319,[%r153+56]; add.u64 %r318,%r319,1; ld.u64 %r320,[%r153+48]; sub.u64 %r124,%r318,%r320; .loc 1 74 15 ld.s8 %r322,[%r154+28]; cvt.u16.u32 %r321,%r322; setp.eq.u16 %r323,%r321,1; @ %r323 bra $L28; bra $L29; $L5: .loc 1 76 4 ld.u64 %r325,[%r153+56]; add.u64 %r324,%r325,1; ld.u64 %r326,[%r153+48]; sub.u64 %r124,%r324,%r326; bra $L29; $L30: .loc 1 100 19 ld.u64 %r328,[%r152+56]; add.u64 %r327,%r328,1; .loc 1 100 17 ld.u64 %r329,[%r152+48]; sub.u64 %r123,%r327,%r329; ld.s8 %r333,[%r154+28]; bra $L8; $L27: .loc 1 93 14 cvta.global.u64 %r330,_gfortrani_compile_options; .loc 1 93 13 ld.u32 %r331,[%r330+36]; setp.ne.u32 %r332,%r331,0; @ ! %r332 bra $L7; bra $L30; $L1: .loc 1 235$L123: .loc 1 219 18 ld.s8 %r359,[%r117]; cvt.u16.u32.loc 1 219 23 ld.s8 %r362,[%r110]; cvt.u16.u32 %r363,%r362; setp.eq.u16 %r364,%r363,0; @ ! %r364 bra $L125; $L43: .loc 1 224 18 add.u64 %r113,%r117,%r99; .loc 1 225 18 add.u64 %r114,%r110,%r131; .loc 1 217 35 mov.u64 %r115,1; $L83: .loc 1 219 18 ld.s8 %r366,[%r113]; cvt.u16.u32 %r367,%r366; setp.eq.u16 %r368,%r367,0; @ %r368 bra $L46; .loc 1 219 23 ld.s8 %r369,[%r114]; cvt.u16.u32 %r370,%r369; setp.eq.u16 %r371,%r370,0; @ ! %r371 bra $L125; $L46: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L82: .loc 1 219 18 ld.s8 %r373,[%r113]; cvt.u16.u32 %r374,%r373; setp.eq.u16 %r375,%r374,0; @ %r375 bra $L49; .loc 1 219 23 ld.s8 %r376,[%r114]; cvt.u16.u32 %r377,%r376; setp.eq.u16 %r378,%r377,0; @ ! %r378 bra $L125; $L49: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L81: .loc 1 219 18 ld.s8 %r380,[%r113]; cvt.u16.u32.loc 1 219 23 ld.s8 %r383,[%r114]; cvt.u16.u32 %r384,%r383; setp.eq.u16 %r385,%r384,0; @ ! %r385 bra $L125; $L52: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L80: .loc 1 219 18 ld.s8 %r387,[%r113]; cvt.u16.u32 %r388,%r387; setp.eq.u16 %r389,%r388,0; @ %r389 bra $L55; .loc 1 219 23 ld.s8 %r390,[%r114]; cvt.u16.u32 %r391,%r390; setp.eq.u16 %r392,%r391,0; @ ! %r392 bra $L125; $L55: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L79: .loc 1 219 18 ld.s8 %r394,[%r113]; cvt.u16.u32 %r395,%r394; setp.eq.u16 %r396,%r395,0; @ %r396 bra $L58; .loc 1 219 23 ld.s8 %r397,[%r114]; cvt.u16.u32 %r398,%r397; setp.eq.u16 %r399,%r398,0; @ ! %r399 bra $L125; $L58: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L78: .loc 1 219 18 ld.s8 %r401,[%r113]; cvt.u16.u32.loc 1 219 23 ld.s8 %r404,[%r114]; cvt.u16.u32 %r405,%r404; setp.eq.u16 %r406,%r405,0; @ ! %r406 bra $L125; $L61: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; .loc 1 217 11 setp.ne.u64 %r407,%r101,%r115; @ %r407 bra $L24; bra $L22; $L124: .loc 1 219 23 ld.s8 %r408,[%r354]; cvt.u16.u32 %r409,%r408; setp.eq.u16 %r410,%r409,0; @ ! %r410 bra $L125; $L64: .loc 1 224 18 add.u64 %r411,%r353,%r99; .loc 1 225 18 add.u64 %r412,%r354,%r131; .loc 1 219 18 ld.s8 %r415,[%r411]; cvt.u16.u32.loc 1 219 23 ld.s8 %r418,[%r412]; cvt.u16.u32 %r419,%r418; setp.eq.u16 %r420,%r419,0; @ ! 
%r420 bra $L125; $L66: .loc 1 224 18 add.u64 %r421,%r411,%r99; .loc 1 225 18 add.u64 %r422,%r412,%r131; .loc 1 219 18 ld.s8 %r425,[%r421]; cvt.u16.u32 %r426,%r425; setp.eq.u16 %r427,%r426,0; @ %r427 bra $L68; .loc 1 219 23 ld.s8 %r428,[%r422]; cvt.u16.u32 %r429,%r428; setp.eq.u16 %r430,%r429,0; @ ! %r430 bra $L125; $L68: .loc 1 224 18 add.u64 %r431,%r421,%r99; .loc 1 225 18 add.u64 %r432,%r422,%r131; .loc 1 219 18 ld.s8 %r435,[%r431]; cvt.u16.u32.loc 1 219 23 ld.s8 %r438,[%r432]; cvt.u16.u32 %r439,%r438; setp.eq.u16 %r440,%r439,0; @ ! %r440 bra $L125; $L70: .loc 1 224 18 add.u64 %r441,%r431,%r99; .loc 1 225 18 add.u64 %r442,%r432,%r131; .loc 1 219 18 ld.s8 %r445,[%r441]; cvt.u16.u32 %r446,%r445; setp.eq.u16 %r447,%r446,0; @ %r447 bra $L72; .loc 1 219 23 ld.s8 %r448,[%r442]; cvt.u16.u32 %r449,%r448; setp.eq.u16 %r450,%r449,0; @ ! %r450 bra $L125; $L72: .loc 1 224 18 add.u64 %r451,%r441,%r99; .loc 1 225 18 add.u64 %r452,%r442,%r131; .loc 1 219 18 ld.s8 %r455,[%r451]; cvt.u16.u32 %r456,%r455; setp.eq.u16 %r457,%r456,0; @ %r457 bra $L74; .loc 1 219 23 ld.s8 %r458,[%r452]; cvt.u16.u32 %r459,%r458; setp.eq.u16 %r460,%r459,0; @ ! %r460 bra $L125; $L74: .loc 1 224 18 add.u64 %r461,%r451,%r99; .loc 1 225 18 add.u64 %r462,%r452,%r131; .loc 1 219 18 ld.s8 %r465,[%r461]; cvt.u16.u32 %r466,%r465; setp.eq.u16 %r467,%r466,0; @ %r467 bra $L76; .loc 1 219 23 ld.s8 %r468,[%r462]; cvt.u16.u32 %r469,%r468; setp.eq.u16 %r470,%r469,0; @ ! %r470 bra $L125; $L76: .loc 1 224 18 add.u64 %r113,%r461,%r99; .loc 1 225 18 add.u64 %r114,%r462,%r131; .loc 1 217 35 add.u64 %r115,%r352,7; .loc 1 217 11 setp.ne.u64 %r471,%r101,%r115; @ %r471 bra $L24; bra $L22; } matmul_l16.o/_gfortran_matmul_l16 .visible .func _gfortran_matmul_l16fortran/generated/matmul_l16.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]FUNCTION DECL: _gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_97,116,109,117,108,95,108,49,54116,114,97,110,47,103,101,110,101,114,97,116,101,100,47,109,97,116,109,117,108,95,108,49,54,46,9973,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,97,41,32,61,61,32,50,32,124,124,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,98,41,32,61,61,3273,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,50,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,12199,111,117,110,116,32,61,61,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,69,88,84,69,78,84,40,98,44,48,473,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,49,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_matmul_l16 
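The archive member matmul_l16.o opened just above (and the matmul_l* member whose tail precedes it) is the nvptx PTX generated from ../../../../libgfortran/generated/matmul_l16.c, libgfortran's MATMUL for LOGICAL arrays. The .const byte initializers in its preamble are plain ASCII: they decode to the runtime diagnostics "Incorrect extent in return array in MATMUL intrinsic: is %ld, should be %ld" (and its per-dimension variants), the assertion text "GFC_DESCRIPTOR_RANK (a) == 2 || GFC_DESCRIPTOR_RANK (b) == 2", "count == GFC_DESCRIPTOR_EXTENT(b,0)", and the internal-error message "Funny sized logical array". The computational core (the .loc 1 206–233 blocks in the body that follows) is a boolean matrix product with early exit: each destination element is cleared, then set to 1 as soon as some pair of corresponding operand elements is simultaneously nonzero. A minimal C sketch of that core, under the assumption of dense row-major operands and ignoring all descriptor/stride and element-kind handling (the function name and layout here are illustrative only, not libgfortran's):

#include <stddef.h>
#include <stdint.h>

/* dest is m x n, a is m x k, b is k x n, all dense row-major.
   Mirrors the PTX: .loc 1 215 clears the element, .loc 1 217 runs the
   reduction loop, .loc 1 219 tests a(i,t) && b(t,j), and .loc 1 221/222
   store 1 and break out early on the first hit. */
void matmul_logical_sketch(int *dest, const int8_t *a, const int8_t *b,
                           size_t m, size_t n, size_t k)
{
    for (size_t i = 0; i < m; i++)
        for (size_t j = 0; j < n; j++) {
            dest[i * n + j] = 0;
            for (size_t t = 0; t < k; t++)
                if (a[i * k + t] && b[t * n + j]) {
                    dest[i * n + j] = 1;
                    break;
                }
        }
}

The real routine additionally handles the reduction-count remainder with the and.b64 …,7 dispatch into the $L78–$L83 ladder, and stores the result in the kind named by the member, which is why the matmul_l16 body issues two .u64 stores per destination element where the preceding member issues one.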
.visible .func _gfortran_matmul_l16pred %r165; .reg .predpredpredpredpred32u64 %r271; .reg .u16 %r272; .reg .u32 %r273; .reg .pred %r274; .reg .u64 %r275; .reg .u64u64 %r280; .reg .u64u64 %r302; .reg .u16 %r306; .reg .u32 %r307; .reg .pred %r308; .reg .u16 %r309; .reg .u32 %r310; .reg .pred %r311; .reg .predu64 %r346; .reg .pred %r347; .reg .pred %r348; .reg .predpred %r351; .reg .pred %r352; .reg .predpredpred %r373; .reg .u32 %r375; .reg .u16 %r376; .reg .predpredu16 %r400; .reg .predpred %r408; .reg .pred.reg .u16 %r418; .reg .predu32 %r427; .reg .u16 %r428; .reg .predpredpredpredpred %r472; .reg .pred %r473; mov.u64 %r152,%ar0; mov.u64 %r153,%ar1; mov.u64 %r154,%ar2; .loc 1 64 3 ld.s8 %r22,[%r153+28]; cvt.u16.u32 %r155,%r22; setp.eq.u16 %r156,%r155,2; @ %r156 bra $L2; ld.s8 %r335,[%r154+28]; cvt.u16.u32 %r157,%r335; setp.eq.u16 %r159,%r157,2; @ %r159 bra $L3; cvta.const.u64 %r163,$LC0; cvta.const.u64 %r162,__func__$0; mov.u32 %r161,64; cvta.const.u64 %r160,$LC1; {16233: .loc 1 67 6 ld.u64 %r164,[%r1524; .loc 1 69 10 setp.ne.u16 %r167,%r155,1; @ %r167 bra $L5; .loc 1 71 4 st.u64 [%r152+48],%r164; ld.u64 %r170,[%r154+80]; ld.u64 %r172,[%r154+72]; sub.u64 %r173,%r170,%r172; st.u64 [%r152+56],%r173; mov.u64 %r174,1; st.u64 [%r152+40],%r174; bra $L6; $L28: .loc 1 76 4 st.u64 [%r152+48],%r318; add.u64 %r176,%r124,-1; st.u64 [%r152+56],%r176; mov.u64 %r177,1; st.u64 [%r152+40],%r177; bra $L6; $L29: .loc 1 81 4 mov.u64 %r178,0; st.u64 [%r152+48],%r178; add.u64 %r179,%r124,-1; st.u64 [%r152+56],%r179; mov.u64 %r180,1; st.u64 [%r152+40],%r180; .loc 1 84 11 st.u64 [%r152+72],%r178; ld.u64 %r183,[%r154+80]; ld.u64 %r185,[%r154+72]; sub.u64 %r186,%r183,%r185; st.u64 [%r152+80],%r186; st.u64 [%r152+64],%r124; $L6:152; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r188,[%value_in]; } .loc 1 90 4 mov.u64 %r190,16call (%value_in),_gfortrani_xmallocarray191,[%value_in]; } .loc 1 90 2 st.u64 [%r152],%r191; .loc 1 91 24 mov.u64 %r193,0; st.u64 [%r152+8],%r193; bra $L7; $L4: .loc 1 93 14 cvta.global.u64 %r194,_gfortrani_compile_options; .loc 1 93 13 ld.u32 %r195,[%r194+36]; setp.eq.u32 %r196,%r195,0; @ %r196 bra $L7; .loc 1 100 19 ld.u64 %r198,[%r152+56]; add.u64 %r197,%r198,1; .loc 1 100 17 ld.u64 %r199,[%r152+48]; sub.u64 %r123,%r197,%r199; .loc 1 97 5 setp.ne.u16 %r201,%r155,1; @ %r201 bra $L8; .loc 1 99 19 ld.u64 %r203,[%r154+80]; add.u64 %r202,%r203,1; .loc 1 99 17 ld.u64 %r204,[%r154+72]; sub.u64 %r107,%r202,%r204; .loc 1 101 9 setp.eq.u64 %r205,%r107,%r123; @ %r205 bra $L7; .loc 1 102 8 st.u64 [%stack+8],%r107; st.u64 [%stack],%r123; cvta.const.u64 %r206,$LC2; {stack; call _gfortran_runtime_error76 4 ld.u64 %r209,[%r153+56]; add.u64 %r208,%r209,1; ld.u64 %r210,[%r153+48]; sub.u64 %r148,%r208,%r210; .loc 1 106 10 cvt.u16.u32 %r211,%r335; setp.ne.u16 %r213,%r211,1; @ %r213 bra $L9; .loc 1 110 9 setp.eq.u64 %r214,%r123,%r148; @ %r214 bra $L7; .loc 1 111 8 st.u64 [%stack+8],%r148; st.u64 [%stack],%r123; cvta.const.u64 %r215215_gfortran_runtime_error119 9 setp.eq.u64 %r217,%r123,%r148; @ %r217 bra $L10; .loc 1 120 8 st.u64 [%stack+8],%r148; st.u64 [%stack],%r123;_gfortran_runtime_error10: .loc 1 125 19 ld.u64 %r221,[%r154+80]; add.u64 %r220,%r221,1; .loc 1 125 17 ld.u64 %r222,[%r154+72]; sub.u64 %r105,%r220,%r222; .loc 1 126 19 ld.u64 %r224,[%r152+80]; add.u64 %r223,%r224,1; .loc 1 126 17 ld.u64 %r225,[%r152+72]; sub.u64 %r106,%r223,%r225; .loc 1 127 9 setp.eq.u64 %r226,%r105,%r106; @ %r226 bra $L7; .loc 1 128 8 st.u64 [%stack+8],%r105; st.u64 [%stack],%r106; cvta.const.u64 
%r227227_gfortran_runtime_error135 9 ld.u64 %r89,[%r153]; .loc 1 136 12 ld.u64 %r51,[%r153+16]; .loc 1 138 19 cvt.u32.u64 %r52,%r51; .loc 1 138 44 add.u32 %r229,%r52,-4; and.b32 %r230,%r229,-5; set.u32.eq.u32 %r232,%r230,0; neg.s32 %r233,%r232; .loc 1 138 19 add.u32 %r234,%r52,-1; set.u32.le.u32 %r236,%r234,1; neg.s32 %r237,%r236; .loc 1 138 6 cvt.u16.u32 %r239,%r233; cvt.u16.u32 %r240,%r237; or.b16 %r238,%r239,%r240; cvt.u32.u16 %r241,%r238; cvt.u16.u8 %r242,%r24111; .loc 1 140 6 setp.ne.u32 %r245,%r52,16; @ %r245 bra $L12; $L11: .loc 1 147 9 ld.u64 %r110,[%r154]; .loc 1 148 12 ld.u64 %r57,[%r154+16]; .loc 1 150 19 cvt.u32.u64 %r58,%r57; .loc 1 150 44 add.u32 %r246,%r58,-4; and.b32 %r247,%r246,-5; set.u32.eq.u32 %r249,%r247,0; neg.s32 %r250,%r249; .loc 1 150 19 add.u32 %r251,%r58,-1; set.u32.le.u32 %r253,%r251,1; neg.s32 %r254,%r253; .loc 1 150 6 cvt.u16.u32 %r256,%r250; cvt.u16.u32 %r257,%r254; or.b16 %r255,%r256,%r257; cvt.u32.u16 %r258,%r255; cvt.u16.u8 %r259,%r258; setp.ne.u16 %r260,%r259,0; @ %r260 bra $L13; bra $L122; $L12: .loc 1 145 5 cvta.const.u64 %r262,$LC5; mov.u64 %r261,call _gfortrani_internal_error122: .loc 1 152 6 setp.ne.u32 %r264,%r58,16; @ %r264 bra $L12; $L13: .loc 1 159 8 ld.u64 %r111,[%r152]; .loc 1 164 16 ld.u64 %r92,[%r152+40]; .loc 1 162 6 ld.s8 %r266,[%r152+28]; cvt.u16.u32 %r265,%r266; setp.eq.u16 %r267,%r265,1; @ %r267 bra $L31; .loc 1 170 16 ld.u64 %r93,[%r152+64]; bra $L15; $L31: .loc 1 165 16 mov.u64 %r93,%r92; $L15: .loc 1 177 17 ld.u64 %r268,[%r153+40]; mul.lo.u64 %r96,%r51,%r268; .loc 1 178 15 ld.u64 %r270,[%r153+56]; add.u64 %r269,%r270,1; .loc 1 178 13 ld.u64 %r271,[%r153+48]; sub.u64 %r94,%r269,%r271; .loc 1 175 6 ld.s8 %r273,[%r153+28]; cvt.u16.u32 %r272,%r273; setp.eq.u16 %r274,%r272,1; @ %r274 bra $L32; .loc 1 185 17 ld.u64 %r275,[%r153+64]; mul.lo.u64 %r99,%r51,%r275; .loc 1 186 15 ld.u64 %r277,[%r153+80]; add.u64 %r276,%r277,1; .loc 1 186 13 ld.u64 %r278,[%r153+72]; sub.u64 %r101,%r276,%r278; bra $L16; $L32: .loc 1 178 13 mov.u64 %r101,%r94; .loc 1 177 15 mov.u64 %r99,%r96; .loc 1 179 15 mov.u64 %r96,0; .loc 1 181 14 mov.u64 %r94,1; .loc 1 180 16 mov.u64 %r92,%r96; $L16: .loc 1 192 17 ld.u64 %r279,[%r154+40]; mul.lo.u64 %r131,%r57,%r279; .loc 1 193 7 ld.u64 %r281,[%r154+56]; add.u64 %r280,%r281,1; ld.u64 %r282,[%r154+48]; sub.u64 %r135,%r280,%r282; .loc 1 190 6 ld.s8 %r284,[%r154+28]; cvt.u16.u32 %r283,%r284; setp.ne.u16 %r285,%r283,1; @ %r285 bra $L17; .loc 1 193 7 setp.eq.u64 %r286,%r101,%r135; @ %r286 bra $L33; cvta.const.u64 %r290,$LC6; cvta.const.u64 %r289,__func__$0; mov.u32 %r288,193;r2289907: .loc 1 201 7 setp.eq.u64 %r291,%r101,%r135; @ %r291 bra $L19; cvta.const.u64 %r295,$LC6; cvta.const.u64 %r294,__func__$0; mov.u32 %r293,201;r293294959: .loc 1 202 17 ld.u64 %r296,[%r154+64]; mul.lo.u64 %r74,%r57,%r296; .loc 1 203 16 ld.u64 %r298,[%r154+80]; add.u64 %r297,%r298,1; .loc 1 203 14 ld.u64 %r299,[%r154+72]; sub.u64 %r91,%r297,%r299; .loc 1 206 3 setp.le.s64 %r300,%r91,0; @ %r300 bra $L1; mov.u64 %r122,%r93; bra $L18; $L33: mov.u64 %r74,0; mov.u64 %r122,%r74; .loc 1 196 14 mov.u64 %r91,1; $L18: .loc 1 228 16 shl.b64 %r83,%r92,4; .loc 1 231 24 mul.lo.u64 %r55,%r94,%r96; .loc 1 231 13 neg.s64 %r85,%r55; .loc 1 233 36 mul.lo.u64 %r301,%r92,%r94; .loc 1 233 24 sub.u64 %r302,%r122,%r301; .loc 1 233 12 shl.b64 %r87,%r302,4; mul.lo.u64 %r86,%r83,%r94; .loc 1 206 10 mov.u64 %r112,0; setp.gt.s64 %r336,%r94,0; setp.le.s64 %r339,%r101,0; .loc 1 215 17 mov.u64 %r340,%r112; .loc 1 221 25 mov.u64 %r341,1; bra $L21; $L34: .loc 1 208 7 mov.u64 %r116,%r111; 
mov.u64 %r117,%r89; .loc 1 208 14 mov.u64 %r118,0; $L25: .loc 1 215 17 st.u64 [%r116],%r340; st.u64 [%r116+8],%r340; .loc 1 217 11 @ %r339 bra $L22; mov.u64 %r114,%r110; mov.u64 %r113,%r117; .loc 1 217 18 mov.u64 %r115,0; and.b64 %r346,%r101,7; setp.eq.u64 %r353,%r346,0; @ %r353 bra $L24; setp.eq.u64 %r352,%r346,1; @ %r352 bra $L78; setp.eq.u64 %r351,%r346,2; @ %r351 bra $L79; setp.eq.u64 %r350,%r346,3; @ %r350 bra $L80; setp.eq.u64 %r349,%r346,4; @ %r349 bra $L81; setp.eq.u64 %r348,%r346,5; @ %r348 bra $L82; setp.eq.u64 %r347,%r346,6; @ %r347 bra $L83; bra $L123; $L24: .loc 1 219 18 ld.s8 %r307,[%r113]; cvt.u16.u32 %r306,%r307; setp.eq.u16 %r308,%r306,0; @ %r308 bra $L23; .loc 1 219 23 ld.s8 %r310,[%r114]; cvt.u16.u32 %r309,%r310; setp.eq.u16 %r311,%r309,0; @ %r311 bra $L23; $L125: .loc 1 221 25 st.u64 [%r116],%r341; st.u64 [%r116+8],%r340; .loc 1 222 19 bra $L22; $L23: .loc 1 224 18 add.u64 %r355,%r113,%r99; .loc 1 225 18 add.u64 %r356,%r114,%r131; .loc 1 217 35 add.u64 %r354,%r115,1; .loc 1 219 18 ld.s8 %r358,[%r355]; cvt.u16.u32bra $L124; $L22: .loc 1 228 16 add.u64 %r116,%r116,%r83; .loc 1 229 17 add.u64 %r117,%r117,%r96; .loc 1 208 32 add.u64 %r118,%r118,1; .loc 1 208 7 setp.ne.u64 %r315,%r94,%r118; @ %r315 bra $L25; .loc 1 229 17 add.u64 %r89,%r89,%r55; .loc 1 228 16 add.u64 %r111,%r111,%r86; $L26: .loc 1 231 13 add.u64 %r89,%r89,%r85; .loc 1 232 13 add.u64 %r110,%r110,%r74; .loc 1 233 12 add.u64 %r111,%r111,%r87; .loc 1 206 28 add.u64 %r112,%r112,1; .loc 1 206 3 setp.le.s64 %r316,%r91,%r112; @ %r316 bra $L1; $L21: .loc 1 208 7 @ %r336 bra $L34; bra $L26; $L2: .loc 1 67 6 ld.u64 %r318,[%r152]; setp.ne.u64 %r319,%r318,0; @ %r319 bra $L27; .loc 1 76 4 ld.u64 %r321,[%r153+56]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r153+48]; sub.u64 %r124,%r320,%r322; .loc 1 74 15 ld.s8 %r324,[%r154+28]; cvt.u16.u32 %r323,%r324; setp.eq.u16 %r325,%r323,1; @ %r325 bra $L28; bra $L29; $L5: .loc 1 76 4 ld.u64 %r327,[%r153+56]; add.u64 %r326,%r327,1; ld.u64 %r328,[%r153+48]; sub.u64 %r124,%r326,%r328; bra $L29; $L30: .loc 1 100 19 ld.u64 %r330,[%r152+56]; add.u64 %r329,%r330,1; .loc 1 100 17 ld.u64 %r331,[%r152+48]; sub.u64 %r123,%r329,%r331; ld.s8 %r335,[%r154+28]; bra $L8; $L27: .loc 1 93 14 cvta.global.u64 %r332,_gfortrani_compile_options; .loc 1 93 13 ld.u32 %r333,[%r332+36]; setp.ne.u32 %r334,%r333,0; @ ! %r334 bra $L7; bra $L30; $L1: .loc 1 235$L123: .loc 1 219 18 ld.s8 %r361,[%r117]; cvt.u16.u32 %r362,%r361; setp.eq.u16 %r363,%r362,0; @ %r363 bra $L43; .loc 1 219 23 ld.s8 %r364,[%r110]; cvt.u16.u32 %r365,%r364; setp.eq.u16 %r366,%r365,0; @ ! %r366 bra $L125; $L43: .loc 1 224 18 add.u64 %r113,%r117,%r99; .loc 1 225 18 add.u64 %r114,%r110,%r131; .loc 1 217 35 mov.u64 %r115,1; $L83: .loc 1 219 18 ld.s8 %r368,[%r113]; cvt.u16.u32 %r369,%r368; setp.eq.u16 %r370,%r369,0; @ %r370 bra $L46; .loc 1 219 23 ld.s8 %r371,[%r114]; cvt.u16.u32 %r372,%r371; setp.eq.u16 %r373,%r372,0; @ ! %r373 bra $L125; $L46: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L82: .loc 1 219 18 ld.s8 %r375,[%r113]; cvt.u16.u32 %r376,%r375; setp.eq.u16 %r377,%r376,0; @ %r377 bra $L49; .loc 1 219 23 ld.s8 %r378,[%r114]; cvt.u16.u32 %r379,%r378; setp.eq.u16 %r380,%r379,0; @ ! %r380 bra $L125; $L49: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L81: .loc 1 219 18 ld.s8 %r382,[%r113]; cvt.u16.u32.loc 1 219 23 ld.s8 %r385,[%r114]; cvt.u16.u32 %r386,%r385; setp.eq.u16 %r387,%r386,0; @ ! 
%r387 bra $L125; $L52: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L80: .loc 1 219 18 ld.s8 %r389,[%r113]; cvt.u16.u32 %r390,%r389; setp.eq.u16 %r391,%r390,0; @ %r391 bra $L55; .loc 1 219 23 ld.s8 %r392,[%r114]; cvt.u16.u32! %r394 bra $L125; $L55: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L79: .loc 1 219 18 ld.s8 %r396,[%r113]; cvt.u16.u32 %r397,%r396; setp.eq.u16 %r398,%r397,0; @ %r398 bra $L58; .loc 1 219 23 ld.s8 %r399,[%r114]; cvt.u16.u32 %r400,%r399; setp.eq.u16 %r401,%r400,0; @ ! %r401 bra $L125; $L58: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L78: .loc 1 219 18 ld.s8 %r403,[%r113]; cvt.u16.u32 %r404,%r403; setp.eq.u16 %r405,%r404,0; @ %r405 bra $L61; .loc 1 219 23 ld.s8 %r406,[%r114]; cvt.u16.u32! %r408 bra $L125; $L61: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; .loc 1 217 11 setp.ne.u64 %r409,%r101,%r115; @ %r409 bra $L24; bra $L22; $L124: .loc 1 219 23 ld.s8 %r410,[%r356]; cvt.u16.u32 %r411,%r410; setp.eq.u16 %r412,%r411,0; @ ! %r412 bra $L125; $L64: .loc 1 224 18 add.u64 %r413,%r355,%r99; .loc 1 225 18 add.u64 %r414,%r356,%r131; .loc 1 219 18 ld.s8 %r417,[%r413]; cvt.u16.u32 %r418,%r417; setp.eq.u16 %r419,%r418,0; @ %r419 bra $L66; .loc 1 219 23 ld.s8 %r420,[%r414]; cvt.u16.u32! %r422 bra $L125; $L66: .loc 1 224 18 add.u64 %r423,%r413,%r99; .loc 1 225 18 add.u64 %r424,%r414,%r131; .loc 1 219 18 ld.s8 %r427,[%r423]; cvt.u16.u32 %r428,%r427; setp.eq.u16 %r429,%r428,0; @ %r429 bra $L68; .loc 1 219 23 ld.s8 %r430,[%r424]; cvt.u16.u32 %r431,%r430; setp.eq.u16 %r432,%r431,0; @ ! %r432 bra $L125; $L68: .loc 1 224 18 add.u64 %r433,%r423,%r99; .loc 1 225 18 add.u64 %r434,%r424,%r131; .loc 1 219 18 ld.s8 %r437,[%r433]; cvt.u16.u32.loc 1 219 23 ld.s8 %r440,[%r434]; cvt.u16.u32 %r441,%r440; setp.eq.u16 %r442,%r441,0; @ ! %r442 bra $L125; $L70: .loc 1 224 18 add.u64 %r443,%r433,%r99; .loc 1 225 18 add.u64 %r444,%r434,%r131; .loc 1 219 18 ld.s8 %r447,[%r443]; cvt.u16.u32 %r448,%r447; setp.eq.u16 %r449,%r448,0; @ %r449 bra $L72; .loc 1 219 23 ld.s8 %r450,[%r444]; cvt.u16.u32 %r451,%r450; setp.eq.u16 %r452,%r451,0; @ ! %r452 bra $L125; $L72: .loc 1 224 18 add.u64 %r453,%r443,%r99; .loc 1 225 18 add.u64 %r454,%r444,%r131; .loc 1 219 18 ld.s8 %r457,[%r453]; cvt.u16.u32 %r458,%r457; setp.eq.u16 %r459,%r458,0; @ %r459 bra $L74; .loc 1 219 23 ld.s8 %r460,[%r454]; cvt.u16.u32 %r461,%r460; setp.eq.u16 %r462,%r461,0; @ ! %r462 bra $L125; $L74: .loc 1 224 18 add.u64 %r463,%r453,%r99; .loc 1 225 18 add.u64 %r464,%r454,%r131; .loc 1 219 18 ld.s8 %r467,[%r463]; cvt.u16.u32 %r468,%r467; setp.eq.u16 %r469,%r468,0; @ %r469 bra $L76; .loc 1 219 23 ld.s8 %r470,[%r464]; cvt.u16.u32 %r471,%r470; setp.eq.u16 %r472,%r471,0; @ ! 
%r472 bra $L125; $L76: .loc 1 224 18 add.u64 %r113,%r463,%r99; .loc 1 225 18 add.u64 %r114,%r464,%r131; .loc 1 217 35 add.u64 %r115,%r354,7; .loc 1 217 11 setp.ne.u64 %r473,%r101,%r115; @ %r473 bra $L24; bra $L22; } shape_i1.o/ _gfortran_shape_1 .visible .func _gfortran_shape_1fortran/generated/shape_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarraygfortran_shape_1 .visible .func _gfortran_shape_1predpred %r74; .reg .u16 %r75; .reg .predpred %r8814 ld.s8 %r22,[%r56+28]; .loc 1 44 6 ld.u64 %r57,[%r55]; setp.ne.u64 %r58,%r57,0; @ %r58 bra $L2; .loc 1 46 7 st.u64 [%r55+48],%r57; add.u32 %r60,%r22,-1; cvt.s64.s32 %r61,%r60; st.u64 [%r55+56],%r61; mov.u64 %r62,1; st.u64 [%r55+40],%r62; .loc 1 47 19 st.u64 [%r55+8],%r57; .loc 1 48 24 cvt.u32.u32 %r67,%r22; cvt.s64.s8 %r66,%r662; call (%value_in),_gfortrani_xmallocarray.loc 1 48 22 st.u64 [%r55],%r68; $L2: .loc 1 53 7 ld.u64 %r71,[%r55+56]; add.u64 %r70,%r71,1; ld.u64 %r73,[%r55+48]; sub.u64 %r72,%r70,%r73; .loc 1 53 6 setp.le.s64 %r74,%r72,0; @ %r74 bra $L1; .loc 1 56 3 cvt.u16.u32 %r75,%r22; setp.le.s16 %r76,%r75,0; @ %r76 bra $L1; add.u64 %r40,%r56,48; ld.u64 %r51,[%r55+40]; ld.u64 %r52,[%r55]; add.u64 %r50,%r56,56; cvt.u32.u32 %r78,%r22; cvt.s64.s8 %r77,%r78; add.u64 %r80,%r77,%r77; add.u64 %r81,%r80,%r77; shl.b64 %r82,%r81,3; add.u64 %r38,%r82,%r40; $L4: .loc 1 58 16 ld.u64 %r86,[%r50]; add.u64 %r85,%r86,1; .loc 1 58 14 ld.u64 %r87,[%r40]; sub.u64 %r84,%r85,%r87; .loc 1 59 56 max.s64 %r84,%r84,0; .loc 1 59 34 st.u8 [%r52],%r84; .loc 1 56 3 add.u64 %r40,%r40,24; add.u64 %r52,%r52,%r51; add.u64 %r50,%r50,24; setp.ne.u64 %r88,%r38,%r40; @ %r88 bra $L4; $L1: .loc 1 61 1 ret; } shape_i2.o/_gfortran_shape_2 .visible .func _gfortran_shape_2fortran/generated/shape_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarraygfortran_shape_2 .visible .func _gfortran_shape_predpredpred %r91; mov.u64 %r56,%ar0; mov.u64 %r57,%ar1; .loc 1 42 14 ld.s8 %r22,[%r57+28]; .loc 1 44 6 ld.u64 %r58,[%r56]; setp.ne.u64 %r59,%r58,0; @ %r59 bra $L2; .loc 1 46 7 st.u64 [%r56+48],%r58; add.u32 %r61,%r22,-1; cvt.s64.s32 %r62,%r61; st.u64 [%r56+56],%r62; mov.u64 %r63,1; st.u64 [%r56+40],%r63; .loc 1 47 19 st.u64 [%r56+8],%r58; .loc 1 48 24 cvt.u32.u32 %r68,%r22; cvt.s64.s8 %r67,%r68; mov.u64 %r66,2_gfortrani_xmallocarray.loc 1 48 22 st.u64 [%r56],%r69; $L2: .loc 1 53 7 ld.u64 %r72,[%r56+56]; add.u64 %r71,%r72,1; ld.u64 %r74,[%r56+48]; sub.u64 %r73,%r71,%r74; .loc 1 53 6 setp.le.s64 %r75,%r73,0; @ %r75 bra $L1; .loc 1 56 3 cvt.u16.u32 %r76,%r22; setp.le.s16 %r77,%r76,0; @ %r77 bra $L1; add.u64 %r40,%r57,48; ld.u64 %r78,[%r56+40]; add.u64 %r51,%r78,%r78; ld.u64 %r53,[%r56]; add.u64 %r50,%r57,56; cvt.u32.u32 %r81,%r22; cvt.s64.s8 %r80,%r81; add.u64 %r83,%r80,%r80; add.u64 %r84,%r83,%r80; shl.b64 %r85,%r84,3; add.u64 %r38,%r85,%r40; $L4: .loc 1 58 16 ld.u64 %r89,[%r50]; add.u64 %r88,%r89,1; .loc 1 58 14 ld.u64 %r90,[%r40]; sub.u64 %r87,%r88,%r90; .loc 1 59 56 max.s64 %r87,%r87,0; .loc 1 59 34 st.u16 [%r53],%r87; .loc 1 56 3 add.u64 %r40,%r40,24; add.u64 %r53,%r53,%r51; add.u64 %r50,%r50,24; setp.ne.u64 %r91,%r38,%r40; @ %r91 bra $L4; $L1: .loc 1 61 1gfortran_shape_4 .visible .func _gfortran_shape_4fortran/generated/shape_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarraygfortran_shape_4 .visible .func _gfortran_shape_4predpredpred %r90; mov.u64 %r56,%ar0; mov.u64 %r57,%ar1; .loc 1 42 14 ld.s8 %r22,[%r57+28]; .loc 1 44 6 ld.u64 %r58,[%r56]; setp.ne.u64 %r59,%r58,0; @ %r59 bra $L2; .loc 1 46 7 
st.u64 [%r56+48],%r58; add.u32 %r61,%r22,-1; cvt.s64.s32 %r62,%r61; st.u64 [%r56+56],%r62; mov.u64 %r63,1; st.u64 [%r56+40],%r63; .loc 1 47 19 st.u64 [%r56+8],%r58; .loc 1 48 24 cvt.u32.u32 %r68,%r22; cvt.s64.s8 %r67,%r68; mov.u64 %r66,4_gfortrani_xmallocarray.loc 1 48 22 st.u64 [%r56],%r69; $L2: .loc 1 53 7 ld.u64 %r72,[%r56+56]; add.u64 %r71,%r72,1; ld.u64 %r74,[%r56+48]; sub.u64 %r73,%r71,%r74; .loc 1 53 6 setp.le.s64 %r75,%r73,0; @ %r75 bra $L1; .loc 1 56 3 cvt.u16.u32 %r76,%r22; setp.le.s16 %r77,%r76,0; @ %r77 bra $L1; add.u64 %r40,%r57,48; ld.u64 %r78,[%r56+40]; shl.b64 %r51,%r78,2; ld.u64 %r53,[%r56]; add.u64 %r50,%r57,56; cvt.u32.u32 %r80,%r22; cvt.s64.s8 %r79,%r80; add.u64 %r82,%r79,%r79; add.u64 %r83,%r82,%r79; shl.b64 %r84,%r83,3; add.u64 %r38,%r84,%r40; $L4: .loc 1 58 16 ld.u64 %r88,[%r50]; add.u64 %r87,%r88,1; .loc 1 58 14 ld.u64 %r89,[%r40]; sub.u64 %r86,%r87,%r89; .loc 1 59 56 max.s64 %r86,%r86,0; .loc 1 59 34 st.u32 [%r53],%r86; .loc 1 56 3 add.u64 %r40,%r40,24; add.u64 %r53,%r53,%r51; add.u64 %r50,%r50,24; setp.ne.u64 %r90,%r38,%r40; @ %r90 bra $L4; $L1: .loc 1 61 1 ret; } shape_i8.o/ ortran_shape_8 .visible .func _gfortran_shape_8fortran/generated/shape_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarraygfortran_shape_8 .visible .func _gfortran_shape_8.reg .pred %r74; .reg .u16 %r75; .reg .predpred %r9014 ld.s8 %r22,[%r56+28]; .loc 1 44 6 ld.u64 %r57,[%r55]; setp.ne.u64 %r58,%r57,0; @ %r58 bra $L2; .loc 1 46 7 st.u64 [%r55+48],%r57; add.u32 %r60,%r22,-1; cvt.s64.s32 %r61,%r60; st.u64 [%r55+56],%r61; mov.u64 %r62,1; st.u64 [%r55+40],%r62; .loc 1 47 19 st.u64 [%r55+8],%r57; .loc 1 48 24 cvt.u32.u32 %r67,%r22; cvt.s64.s8 %r66,%r67; mov.u64 66; call (%value_in),_gfortrani_xmallocarray.loc 1 48 22 st.u64 [%r55],%r68; $L2: .loc 1 53 7 ld.u64 %r71,[%r55+56]; add.u64 %r70,%r71,1; ld.u64 %r73,[%r55+48]; sub.u64 %r72,%r70,%r73; .loc 1 53 6 setp.le.s64 %r74,%r72,0; @ %r74 bra $L1; .loc 1 56 3 cvt.u16.u32 %r75,%r22; setp.le.s16 %r76,%r75,0; @ %r76 bra $L1; add.u64 %r40,%r56,48; ld.u64 %r77,[%r55+40]; shl.b64 %r49,%r77,3; ld.u64 %r51,[%r55]; add.u64 %r48,%r56,56; cvt.u32.u32 %r79,%r22; cvt.s64.s8 %r78,%r79; add.u64 %r81,%r78,%r78; add.u64 %r82,%r81,%r78; shl.b64 %r83,%r82,3; add.u64 %r53,%r83,%r40; $L4: .loc 1 58 16 ld.u64 %r86,[%r48]; add.u64 %r85,%r86,1; .loc 1 58 14 ld.u64 %r88,[%r40]; sub.u64 %r87,%r85,%r88; .loc 1 59 56 max.s64 %r89,%r87,0; .loc 1 59 34 st.u64 [%r51],%r89; .loc 1 56 3 add.u64 %r40,%r40,24; add.u64 %r51,%r51,%r49; add.u64 %r48,%r48,24; setp.ne.u64 %r90,%r40,%r53; @ %r90 bra $L4; $L1: .loc 1 61 1 ret; } shape_i16.o/ortran_shape_16 .visible .func _gfortran_shape_16fortran/generated/shape_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarraygfortran_shape_16 .visible .func _gfortran_shape_16predpredmov.u64 %r56,%ar0; mov.u64 %r57,%ar1; .loc 1 42 14 ld.s8 %r22,[%r57+28]; .loc 1 44 6 ld.u64 %r58,[%r56]; setp.ne.u64 %r59,%r58,0; @ %r59 bra $L2; .loc 1 46 7 st.u64 [%r56+48],%r58; add.u32 %r61,%r22,-1; cvt.s64.s32 %r62,%r61; st.u64 [%r56+56],%r62; mov.u64 %r63,1; st.u64 [%r56+40],%r63; .loc 1 47 19 st.u64 [%r56+8],%r58; .loc 1 48 24 cvt.u32.u32 %r68,%r22; cvt.s64.s8 %r67,%r68; mov.u64 %r66,_gfortrani_xmallocarray.loc 1 48 22 st.u64 [%r56],%r69; $L2: .loc 1 53 7 ld.u64 %r72,[%r56+56]; add.u64 %r71,%r72,1; ld.u64 %r74,[%r56+48]; sub.u64 %r73,%r71,%r74; .loc 1 53 6 setp.le.s64 %r75,%r73,0; @ %r75 bra $L1; .loc 1 56 3 cvt.u16.u32 %r76,%r22; setp.le.s16 %r77,%r76,0; @ %r77 bra $L1; add.u64 %r40,%r57,48; ld.u64 
%r78,[%r56+40]; shl.b64 %r51,%r78,4; ld.u64 %r53,[%r56]; add.u64 %r50,%r57,56; cvt.u32.u32 %r80,%r22; cvt.s64.s8 %r79,%r80; add.u64 %r82,%r79,%r79; add.u64 %r83,%r82,%r79; shl.b64 %r84,%r83,3; add.u64 %r38,%r84,%r40; $L4: .loc 1 58 16 ld.u64 %r88,[%r50]; add.u64 %r87,%r88,1; .loc 1 58 14 ld.u64 %r89,[%r40]; sub.u64 %r86,%r87,%r89; .loc 1 59 56 max.s64 %r86,%r86,0; st.u64 [%r53],%r86; shr.s64 %r90,%r86,63; st.u64 [%r53+8],%r90; .loc 1 56 3 add.u64 %r40,%r40,24; add.u64 %r53,%r53,%r51; add.u64 %r50,%r50,24; setp.ne.u64 %r91,%r38,%r40; @ %r91 bra $L4; $L1: .loc 1 61 1 ret; } eoshift1_4.o/BEGIN FUNCTION DECL: eoshift1 .func eoshift1); .file 1 "../../../../libgfortran/generated/eoshift1_4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_eoshift1_4 .visible .func _gfortran_eoshift1_4_gfortran_eoshift1_4_char .visible .func _gfortran_eoshift1_4_charVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_bounds_reduced_extents .extern .func _gfortrani_bounds_reduced_extents)13] = {114,101,116,117,114,110,32,118,97,108,117,101] = {32,0 }8] = {69,79,83,72,73,70,8473,70,84,32,97,114,103,117,109,101,110,12] = {0,0 }; // BEGIN VAR DEF: space$0 .const .align 4 .u32 space$0[1] = {32 }; // BEGIN FUNCTION DEF: eoshift1 .func eoshift1) {predpredpredpred %r195; .reg .u32 %r197; .reg .predpredpredpred %r246; .reg .predu64 %r319; .reg .predu64u64 %r351; .reg .u64 %r352; .reg .u64 %r353; .reg .predpredmov.u64 %r165,%ar0; mov.u64 %r166,%ar1; mov.u64 %r167,%ar2; mov.u64 %r168,%ar3; mov.u64 %r169,%ar4; mov.u64 %r170,%ar5; mov.u64 %r171,%ar6; .loc 1 74 10 ld.u64 %r22,[%r166+16]; .loc 1 76 6 setp.eq.u64 %r172,%r169,0; @ %r172 bra $L51; .loc 1 77 11 ld.u32 %r173,[%r169]; add.u32 %r104,%r173,-1; bra $L2; $L51: .loc 1 79 11 cvt.u32.u64 %r104,%r169; $L2: .loc 1 81 13 mov.u64 %r174,1; st.u64 [%frame],%r174; .loc 1 82 12 mov.u64 %r175,0; st.u64 [%frame+120],%r175166; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r177,[%value_in]; } .loc 1 85 6 ld.u64 %r178,[%r165]; setp.ne.u64 %r179,%r178,0; @ %r179 bra $L3; .loc 1 87 19 st.u64 [%r165+8],%r178; .loc 1 88 7 ld.u64 %r181,[%r166+16]; st.u64 [%r165+16],%r181; ld.u64 %r182,[%r166+24]; st.u64 [%r165+24],%r182; .loc 1 89 34 ld.s8 %r162,[%r166+28]; .loc 1 89 7 cvt.u16.u32 %r183,%r162; setp.gt.s16 %r184,%r183,0; @ %r184 bra $L4; $L7: .loc 1 105 24; call (%value_in),_gfortrani_xmallocarray187,[%value_in]; } .loc 1 105 22 st.u64 [%r165],%r187; cvta.global.u64 %r360,_gfortrani_compile_options; bra $L5; $L6: .loc 1 93 9 ld.u64 %r190,[%r53+8]; ld.u64 %r192,[%r53]; .loc 1 98 17 mad.lo.u64 %r108,%r40,%r108,%r108; add.u64 %r53,%r53,24; .loc 1 93 7 sub.u64 %r40,%r190,%r192; $L49: add.u64 %r164,%r165,%r36; .loc 1 101 4 st.u64 [%r164+48],%r194; st.u64 [%r164+56],%r40; st.u64 [%r164+40],%r108; .loc 1 89 7 add.u64 %r36,%r36,24; setp.ne.u64 %r195,%r36,%r353; @ %r195 bra $L6; bra $L7; $L3: .loc 1 108 12 cvta.global.u64 %r360,_gfortrani_compile_options; .loc 1 108 11 ld.u32 %r197,[%r360+36]; setp.ne.u32 %r198,%r197,0; @ ! 
%r198 bra $L9; .loc 1 110 7 cvta.const.u64 %r202,$LC0;r16620202; call _gfortrani_bounds_equal_extents$L5: .loc 1 114 6 ld.u32 %r204,[%r360+36]; setp.eq.u32 %r205,%r204,0; @ %r205 bra $L9; .loc 1 116 7 cvta.const.u64 %r210,$LC0; cvta.const.u64 %r20104210; call _gfortrani_bounds_reduced_extents); } $L9: .loc 1 120 6 setp.eq.u64 %r211,%r177,0; @ %r211 bra $L1; .loc 1 124 23 ld.s8 %r150,[%r166+28]; cvt.u32.u32 %r212,%r150; cvt.s64.s8 %r159,%r212; .loc 1 124 3 setp.gt.s64 %r213,%r159,0; @ %r213 bra $L12; .loc 1 70 7 mov.u64 %r98,0; .loc 1 71 11 mov.u64 %r90,%r98; .loc 1 72 11 mov.u64 %r83,%r98; bra $L13; $L12: .loc 1 126 15 cvt.s64.s32 %r114,%r104; .loc 1 128 21 ld.u64 %r84,[%r165+16]; .loc 1 131 21 ld.u64 %r154,[%r166+16]; mov.u64 %r93,40; .loc 1 123 5 mov.u64 %r100,0; .loc 1 70 7 mov.u64 %r98,%r100; .loc 1 124 12 mov.u64 %r138,%r100; .loc 1 71 11 mov.u64 %r90,%r100; .loc 1 72 11 mov.u64 %r83,%r100; .loc 1 138 20 mov.u64 %r373,%r100; $L18: add.u64 %r68,%r166,%r93; .loc 1 134 17 ld.u64 %r215,[%r68+16]; add.u64 %r214,%r215,1; .loc 1 134 15 ld.u64 %r216,[%r68+8]; sub.u64 %r99,%r214,%r216; .loc 1 128 21 add.u64 %r217,%r165,%r93; ld.u64 %r218,[%r217]; mul.lo.u64 %r107,%r84,%r218; .loc 1 131 21 ld.u64 %r219,[%r68]; mul.lo.u64 %r148,%r154,%r219; .loc 1 126 10 setp.ne.u64 %r220,%r114,%r138; @ %r220 bra $L14; .loc 1 129 14 setp.ne.u64 %r221,%r107,0; .loc 1 74 8 selp.u64 %r83,%r107,%r22,%r221; .loc 1 132 14 setp.ne.u64 %r222,%r148,0; .loc 1 74 8 selp.u64 %r90,%r148,%r22,%r222; .loc 1 134 15 mov.u64 %r98,%r99; bra $L17; $L14: .loc 1 138 20 shl.b64 %r223,%r100,3; add.u64 %r224,%frame,%r223; add.u64 %r225,%r224,120; st.u64 [%r225],%r373; .loc 1 139 21 st.u64 [%r224],%r99; .loc 1 140 22 st.u64 [%r224+480],%r107; .loc 1 141 22 st.u64 [%r224+360],%r148; .loc 1 143 24 add.u64 %r239,%r100,%r100; add.u64 %r240,%r239,%r100; shl.b64 %r241,%r240,3; add.u64 %r242,%r167,%r241; .loc 1 143 22 ld.u64 %r244,[%r242+40]; st.u64 [%r224+240],%r244; .loc 1 144 12 add.u64 %r100,%r100,1; $L17: .loc 1 124 55 add.u64 %r138,%r138,1; .loc 1 124 3 add.u64 %r93,%r93,24; setp.ne.u64 %r245,%r138,%r159; @ %r245 bra $L18; $L13: .loc 1 147 14 ld.u64 %r43,[%frame+360]; .loc 1 147 6 setp.ne.u64 %r246,%r43,0; .loc 1 74 8 selp.u64 %r43,%r43,%r22,%r246; .loc 1 149 14 ld.u64 %r37,[%frame+480]; .loc 1 149 6 setp.ne.u64 %r247,%r37,0; .loc 1 74 8 selp.u64 %r37,%r37,%r22,%r247; .loc 1 151 14 ld.u64 %r39,[%frame+240]; .loc 1 151 6 setp.ne.u64 %r248,%r39,0; .loc 1 152 16 selp.u64 %r39,%r39,1,%r248; .loc 1 158 8 ld.u64 %r137,[%r165]; .loc 1 159 8 ld.u64 %r50,[%r166]; .loc 1 160 8 ld.u64 %r51,[%r167]; .loc 1 162 9 setp.eq.u64 %r249,%r137,0; @ %r249 bra $L1; .loc 1 228 12 shl.b64 %r58,%r39,2; .loc 1 186 38 set.u32.eq.u64 %r251,%r22,%r83; neg.s32 %r252,%r251; .loc 1 186 19 set.u32.eq.u64 %r254,%r22,%r90; neg.s32 %r255,%r254; .loc 1 186 27 cvt.u16.u32 %r257,%r252; cvt.u16.u32 %r258,%r255; and.b16 %r256,%r257,%r258; cvt.u32.u16 %r259,%r256; cvt.u32.u8 %r96,%r259; .loc 1 231 32 ld.u64 %r139,[%frame]; .loc 1 238 30 mul.lo.u64 %r122,%r139,%r37; .loc 1 239 30 mul.lo.u64 %r260,%r139,%r43; .loc 1 239 16 neg.s64 %r144,%r260; .loc 1 240 23 mul.lo.u64 %r261,%r139,%r39; .loc 1 240 9 shl.b64 %r262,%r261,2; neg.s64 %r145,%r262; setp.ne.u32 %r361,%r96,0; setp.ne.u64 %r356,%r168,0; setp.le.s64 %r355,%r159,2; .loc 1 168 10 cvt.u32.u64 %r364,%r98; setp.eq.u64 %r365,%r171,1; .loc 1 219 8 setp.gt.s64 %r366,%r22,0; cvt.u32.u32 %r367,%r150; cvt.s64.s8 %r368,%r367; add.u64 %r369,%r368,-1; add.u64 %r370,%frame,480; add.u64 %r371,%frame,360; add.u64 %r372,%frame,240; $L50: .loc 
1 165 10 ld.u32 %r105,[%r51]; .loc 1 166 26 setp.lt.s32 %r263,%r105,0; @ %r263 bra $L23; cvt.s64.s32 %r110,%r105; bra $L24; $L23: .loc 1 166 28 neg.s32 %r264,%r105; .loc 1 166 26 cvt.s64.s32 %r110,%r264; $L24: .loc 1 166 10 setp.le.s64 %r265,%r110,%r98; @ %r265 bra $L25; .loc 1 168 10 mov.u32 %r105,%r364; mov.u32 %r106,%r364; bra $L26; $L25: .loc 1 172 8 abs.s32 %r106,%r105; $L26: .loc 1 176 29 cvt.s64.s32 %r95,%r106; .loc 1 174 10 setp.le.s32 %r266,%r105,0; @ %r266 bra $L27; .loc 1 176 15 mad.lo.u64 %r94,%r95,%r90,%r50; mov.u64 %r87,%r137; bra $L28; $L27: .loc 1 182 16 mad.lo.u64 %r87,%r95,%r83,%r137; mov.u64 %r94,%r50; $L28: .loc 1 194 24 sub.u64 %r102,%r98,%r95; .loc 1 186 10 @ %r361 bra $L29; .loc 1 194 4 setp.gt.s64 %r270,%r102,0; @ %r270 bra $L30; bra $L31; $L29: .loc 1 188 24 mul.lo.u64 %r131,%r102,%r22; .loc 1 189 4131277,[%value_in]; } .loc 1 190 9 add.u64 %r87,%r87,%r131; bra $L31; $L30: .loc 1 198 12 mov.u64 %r129,%r87; .loc 1 194 11 mov.u64 %r130,0; $L32:285,[%value_in]; } .loc 1 197 13 add.u64 %r129,%r129,%r83; .loc 1 198 12 add.u64 %r94,%r94,%r90; .loc 1 194 34 add.u64 %r130,%r130,1; .loc 1 194 4 setp.ne.u64 %r287,%r102,%r130; @ %r287 bra $L32; .loc 1 197 13 mad.lo.u64 %r87,%r102,%r83,%r87; $L31: .loc 1 201 10 setp.ge.s32 %r289,%r105,0; selp.u64 %r87,%r87,%r137,%r289; .loc 1 212 10 add.u64 %r133,%r95,-1; .loc 1 205 10 @ %r356 bra $L34; .loc 1 212 8 setp.ne.u64 %r291,%r95,0; @ %r291 bra $L45; $L38: .loc 1 226 12 add.u64 %r137,%r137,%r37; .loc 1 227 12 add.u64 %r50,%r50,%r43; .loc 1 228 12 add.u64 %r51,%r51,%r58; .loc 1 229 15 ld.u64 %r292,[%frame+120]; add.u64 %r60,%r292,1; .loc 1 231 13 setp.eq.u64 %r293,%r60,%r139; @ %r293 bra $L36; .loc 1 229 15 st.u64 [%frame+120],%r60; bra $L50; $L34: .loc 1 206 8 setp.eq.u64 %r294,%r95,0; @ %r294 bra $L38; $L39: .loc 1 208 61681,[%value_in]; } .loc 1 209 11 add.u64 %r87,%r87,%r83; .loc 1 206 10 add.u64 %r133,%r133,-1; .loc 1 206 8 setp.ne.u64 %r303,%r133,-1; @ %r303 bra $L39; bra $L38; $L45: .loc 1 216 9 @ %r365 bra $L40; .loc 1 219 8 @ %r366 bra $L41; bra $L42; $L40: .loc 1 217 8 ld.s8 %r310,[%r170]bra $L42; $L41: .loc 1 219 15 mov.u64 %r132,0; $L43: .loc 1 220 3 add.u64 %r312,%r87,%r132 add.u64 %r132,%r132,%r171; .loc 1 219 8 setp.gt.s64 %r321,%r22,%r132; @ %r321 bra $L43; $L42: .loc 1 222 11 add.u64 %r87,%r87,%r83; .loc 1 212 10 add.u64 %r133,%r133,-1; .loc 1 212 8 setp.ne.u64 %r322,%r133,-1; @ %r322 bra $L45; bra $L38; $L36: .loc 1 235 20 mov.u64 %r323,0; st.u64 [%frame+120],%r323; .loc 1 239 16 add.u64 %r134,%r50,%r144; .loc 1 240 9 add.u64 %r135,%r51,%r145; .loc 1 242 14 @ %r355 bra $L1; add.u64 %r157,%frame,128; .loc 1 238 30 mov.u64 %r63,%r122; .loc 1 242 14 mov.u64 %r156,8; .loc 1 241 12 mov.u64 %r136,1; bra $L47; $L48: .loc 1 235 20 st.u64 [%r157],%r323; .loc 1 238 30 mul.lo.u64 %r63,%r75,%r74; .loc 1 239 30 mul.lo.u64 %r329,%r77,%r74; .loc 1 239 16 sub.u64 %r134,%r50,%r329; .loc 1 240 23 mul.lo.u64 %r330,%r79,%r74; .loc 1 240 9 shl.b64 %r331,%r330,2; sub.u64 %r135,%r51,%r331; .loc 1 241 12 add.u64 %r136,%r136,1; .loc 1 242 14 add.u64 %r157,%r157,8; add.u64 %r156,%r156,8; setp.eq.u64 %r332,%r136,%r369; @ %r332 bra $L1; $L47: .loc 1 250 23 ld.u64 %r333,[%r157]; add.u64 %r74,%r333,1; st.u64 [%r157],%r74; .loc 1 251 30 add.u64 %r335,%r370,%r156; ld.u64 %r75,[%r335]; .loc 1 251 20 sub.u64 %r336,%r75,%r63; add.u64 %r137,%r137,%r336; .loc 1 252 30 add.u64 %r338,%r371,%r156; ld.u64 %r77,[%r338]; .loc 1 252 20 add.u64 %r50,%r134,%r77; .loc 1 253 23 add.u64 %r340,%r372,%r156; ld.u64 %r79,[%r340]; .loc 1 253 13 shl.b64 %r341,%r79,2; 
add.u64 %r51,%r135,%r341; .loc 1 231 32 add.u64 %r342,%frame,%r156; ld.u64 %r82,[%r342]; .loc 1 231 13 setp.eq.u64 %r343,%r74,%r82; @ %r343 bra $L48; bra $L50; $L4: .loc 1 93 9 ld.u64 %r345,[%r166+56]; ld.u64 %r347,[%r166+48]; .loc 1 93 7 sub.u64 %r40,%r345,%r347; add.u64 %r53,%r166,72; cvt.u32.u32 %r349,%r162; cvt.s64.s8 %r348,%r349; add.u64 %r351,%r348,%r348; add.u64 %r352,%r351,%r348; shl.b64 %r353,%r352,3; mov.u64 %r36,%r178; .loc 1 96 17 mov.u64 %r108,%r174; .loc 1 101 4 mov.u64 %r194,%r36; bra $L49; $L1: .loc 1 257_gfortran_eoshift1_4 .visible .func _gfortran_eoshift1_4 {r22.loc 1 272 3 mov.u64 %r33,1; cvta.const.u64 %r32,$LC3eoshift127_gfortran_eoshift1_4_char .visible .func _gfortran_eoshift1_4_char.loc 1 295 3 mov.u64 %r36,1; cvta.const.u64 %r35,$LC4eoshift129_gfortran_eoshift1_4_char4 .visible .func _gfortran_eoshift1_4_char4.loc 1 319 3 mov.u64 %r36,4; cvta.const.u64 %r35,space$0eoshift121 1 ret; } eoshift1_8.o/BEGIN FUNCTION DECL: eoshift1 .func eoshift1); .file 1 "../../../../libgfortran/generated/eoshift1_8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_eoshift1_8 .visible .func _gfortran_eoshift1_8_gfortran_eoshift1_8_char .visible .func _gfortran_eoshift1_8_charVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_bounds_reduced_extents .extern .func _gfortrani_bounds_reduced_extents)13] = {114,101,116,117,114,110,32,118,97,108,117,101] = {32,0 }8] = {69,79,83,72,73,70,8473,70,84,32,97,114,103,117,109,101,110,12] = {0,0 }; // BEGIN VAR DEF: space$0 .const .align 4 .u32 space$0[1] = {32 }; // BEGIN FUNCTION DEF: eoshift1 .func eoshift1) {predpredpredpred %r195; .reg .u32 %r197; .reg .predpredpredpred %r246; .reg .predpred %r268; .reg .u64 %r275; .reg .u64 %r283; .reg .pred %r285; .reg .pred %r287; .reg .predpred %r301; .reg .u32u64 %r350; .reg .u64 %r351; .reg .u64 %r353; .reg .pred %r355; .reg .pred %r356; .reg .pred %r357; .reg .pred %r362; .reg .predu64 %r370; mov.u64 %r164,%ar0; mov.u64 %r165,%ar1; mov.u64 %r166,%ar2; mov.u64 %r167,%ar3; mov.u64 %r168,%ar4; mov.u64 %r169,%ar5; mov.u64 %r170,%ar6; .loc 1 74 10 ld.u64 %r22,[%r165+16]; .loc 1 76 6 setp.eq.u64 %r171,%r168,0; @ %r171 bra $L48; .loc 1 77 21 ld.u32 %r173,[%r168]; add.u32 %r105,%r173,-1; bra $L2; $L48: .loc 1 79 11 cvt.u32.u64 %r105,%r168; $L2: .loc 1 81 13 mov.u64 %r174,1; st.u64 [%frame],%r174; .loc 1 82 12 mov.u64 %r175,0; st.u64 [%frame+120],%r175165; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r177,[%value_in]; } .loc 1 85 6 ld.u64 %r178,[%r164]; setp.ne.u64 %r179,%r178,0; @ %r179 bra $L3; .loc 1 87 19 st.u64 [%r164+8],%r178; .loc 1 88 7 ld.u64 %r181,[%r165+16]; st.u64 [%r164+16],%r181; ld.u64 %r182,[%r165+24]; st.u64 [%r164+24],%r182; .loc 1 89 34 ld.s8 %r161,[%r165+28]; .loc 1 89 7 cvt.u16.u32 %r183,%r161; setp.gt.s16 %r184,%r183,0; @ %r184 bra $L4; $L7: .loc 1 105 24; call (%value_in),_gfortrani_xmallocarray187,[%value_in]; } .loc 1 105 22 st.u64 [%r164],%r187; cvta.global.u64 %r353,_gfortrani_compile_options; bra $L5; $L6: .loc 1 93 9 ld.u64 %r190,[%r55+8]; ld.u64 %r192,[%r55]; .loc 1 98 17 mad.lo.u64 %r108,%r42,%r108,%r108; add.u64 %r55,%r55,24; .loc 1 93 7 sub.u64 %r42,%r190,%r192; $L46: add.u64 %r163,%r164,%r38; .loc 1 101 4 st.u64 [%r163+48],%r194; st.u64 [%r163+56],%r42; st.u64 [%r163+40],%r108; .loc 1 89 7 add.u64 %r38,%r38,24; setp.ne.u64 %r195,%r38,%r351; 
@ %r195 bra $L6; bra $L7; $L3: .loc 1 108 12 cvta.global.u64 %r353,_gfortrani_compile_options; .loc 1 108 11 ld.u32 %r197,[%r353+36]; setp.ne.u32 %r198,%r197,0; @ ! %r198 bra $L9; .loc 1 110 7 cvta.const.u64 %r202,$LC0;164r20202; call _gfortrani_bounds_equal_extents$L5: .loc 1 114 6 ld.u32 %r204,[%r353+36]; setp.eq.u32 %r205,%r204,0; @ %r205 bra $L9; .loc 1 116 7 cvta.const.u64 %r210,$LC0; cvta.const.u64 %r200210; call _gfortrani_bounds_reduced_extents); } $L9: .loc 1 120 6 setp.eq.u64 %r211,%r177,0; @ %r211 bra $L1; .loc 1 124 23 ld.s8 %r151,[%r165+28]; cvt.u32.u32 %r212,%r151; cvt.s64.s8 %r149,%r212; .loc 1 124 3 setp.gt.s64 %r213,%r149,0; @ %r213 bra $L12; .loc 1 70 7 mov.u64 %r98,0; .loc 1 71 11 mov.u64 %r91,%r98; .loc 1 72 11 mov.u64 %r85,%r98; bra $L13; $L12: .loc 1 126 15 cvt.s64.s32 %r112,%r105; .loc 1 128 21 ld.u64 %r86,[%r164+16]; .loc 1 131 21 ld.u64 %r154,[%r165+16]; mov.u64 %r96,40; .loc 1 123 5 mov.u64 %r100,0; .loc 1 70 7 mov.u64 %r98,%r100; .loc 1 124 12 mov.u64 %r137,%r100; .loc 1 71 11 mov.u64 %r91,%r100; .loc 1 72 11 mov.u64 %r85,%r100; .loc 1 138 20 mov.u64 %r370,%r100; $L18: add.u64 %r70,%r165,%r96; .loc 1 134 17 ld.u64 %r215,[%r70+16]; add.u64 %r214,%r215,1; .loc 1 134 15 ld.u64 %r216,[%r70+8]; sub.u64 %r99,%r214,%r216; .loc 1 128 21 add.u64 %r217,%r164,%r96; ld.u64 %r218,[%r217]; mul.lo.u64 %r107,%r86,%r218; .loc 1 131 21 ld.u64 %r219,[%r70]; mul.lo.u64 %r147,%r154,%r219; .loc 1 126 10 setp.ne.u64 %r220,%r112,%r137; @ %r220 bra $L14; .loc 1 129 14 setp.ne.u64 %r221,%r107,0; .loc 1 74 8 selp.u64 %r85,%r107,%r22,%r221; .loc 1 132 14 setp.ne.u64 %r222,%r147,0; .loc 1 74 8 selp.u64 %r91,%r147,%r22,%r222; .loc 1 134 15 mov.u64 %r98,%r99; bra $L17; $L14: .loc 1 138 20 shl.b64 %r223,%r100,3; add.u64 %r224,%frame,%r223; add.u64 %r225,%r224,120; st.u64 [%r225],%r370; .loc 1 139 21 st.u64 [%r224],%r99; .loc 1 140 22 st.u64 [%r224+480],%r107; .loc 1 141 22 st.u64 [%r224+360],%r147; .loc 1 143 24 add.u64 %r239,%r100,%r100; add.u64 %r240,%r239,%r100; shl.b64 %r241,%r240,3; add.u64 %r242,%r166,%r241; .loc 1 143 22 ld.u64 %r244,[%r242+40]; st.u64 [%r224+240],%r244; .loc 1 144 12 add.u64 %r100,%r100,1; $L17: .loc 1 124 55 add.u64 %r137,%r137,1; .loc 1 124 3 add.u64 %r96,%r96,24; setp.ne.u64 %r245,%r137,%r149; @ %r245 bra $L18; $L13: .loc 1 147 14 ld.u64 %r45,[%frame+360]; .loc 1 147 6 setp.ne.u64 %r246,%r45,0; .loc 1 74 8 selp.u64 %r45,%r45,%r22,%r246; .loc 1 149 14 ld.u64 %r39,[%frame+480]; .loc 1 149 6 setp.ne.u64 %r247,%r39,0; .loc 1 74 8 selp.u64 %r39,%r39,%r22,%r247; .loc 1 151 14 ld.u64 %r41,[%frame+240]; .loc 1 151 6 setp.ne.u64 %r248,%r41,0; .loc 1 152 16 selp.u64 %r41,%r41,1,%r248; .loc 1 158 8 ld.u64 %r136,[%r164]; .loc 1 159 8 ld.u64 %r54,[%r165]; .loc 1 160 8 ld.u64 %r52,[%r166]; .loc 1 162 9 setp.eq.u64 %r249,%r136,0; @ %r249 bra $L1; .loc 1 228 12 shl.b64 %r60,%r41,3; .loc 1 186 38 set.u32.eq.u64 %r251,%r22,%r85; neg.s32 %r252,%r251; .loc 1 186 19 set.u32.eq.u64 %r254,%r22,%r91; neg.s32 %r255,%r254; .loc 1 186 27 cvt.u16.u32 %r257,%r252; cvt.u16.u32 %r258,%r255; and.b16 %r256,%r257,%r258; cvt.u32.u16 %r259,%r256; cvt.u32.u8 %r93,%r259; .loc 1 231 32 ld.u64 %r127,[%frame]; .loc 1 238 30 mul.lo.u64 %r120,%r127,%r39; .loc 1 239 30 mul.lo.u64 %r260,%r127,%r45; .loc 1 239 16 neg.s64 %r144,%r260; .loc 1 240 23 mul.lo.u64 %r261,%r127,%r41; .loc 1 240 9 shl.b64 %r262,%r261,3; neg.s64 %r145,%r262; setp.ne.u32 %r355,%r93,0; setp.ne.u64 %r357,%r167,0; setp.le.s64 %r356,%r149,2; setp.eq.u64 %r362,%r170,1; .loc 1 219 8 setp.gt.s64 %r363,%r22,0; cvt.u32.u32 %r364,%r151; cvt.s64.s8 
%r365,%r364; add.u64 %r366,%r365,-1; add.u64 %r367,%frame,480; add.u64 %r368,%frame,360; add.u64 %r369,%frame,240; $L47: .loc 1 165 10 ld.u64 %r106,[%r52]; .loc 1 166 26 abs.s64 %r47,%r106; .loc 1 166 10 setp.gt.s64 %r263,%r47,%r98; @ %r263 bra $L51; .loc 1 194 24 sub.u64 %r63,%r98,%r47; bra $L23; $L51: mov.u64 %r47,%r98; mov.u64 %r106,%r98; mov.u64 %r63,0; $L23: .loc 1 174 10 setp.le.s64 %r264,%r106,0; @ %r264 bra $L24; .loc 1 176 15 mad.lo.u64 %r94,%r47,%r91,%r54; mov.u64 %r89,%r136; bra $L25; $L24: .loc 1 182 16 mad.lo.u64 %r89,%r47,%r85,%r136; mov.u64 %r94,%r54; $L25: .loc 1 186 10 @ %r355 bra $L26; .loc 1 194 4 setp.ne.u64 %r268,%r63,0; @ %r268 bra $L27; bra $L28; $L26: .loc 1 188 24 mul.lo.u64 %r130,%r63,%r22; .loc 1 189 4130275,[%value_in]; } .loc 1 190 9 add.u64 %r89,%r89,%r130; bra $L28; $L27: .loc 1 198 12 mov.u64 %r128,%r89; .loc 1 194 11 mov.u64 %r129,0; $L29:1283,[%value_in]; } .loc 1 197 13 add.u64 %r128,%r128,%r85; .loc 1 198 12 add.u64 %r94,%r94,%r91; .loc 1 194 34 add.u64 %r129,%r129,1; .loc 1 194 4 setp.ne.u64 %r285,%r63,%r129; @ %r285 bra $L29; .loc 1 197 13 mad.lo.u64 %r89,%r63,%r85,%r89; $L28: .loc 1 201 10 setp.ge.s64 %r287,%r106,0; selp.u64 %r89,%r89,%r136,%r287; .loc 1 212 10 add.u64 %r132,%r47,-1; .loc 1 205 10 @ %r357 bra $L31; .loc 1 212 8 setp.ne.u64 %r289,%r47,0; @ %r289 bra $L42; $L35: .loc 1 226 12 add.u64 %r136,%r136,%r39; .loc 1 227 12 add.u64 %r54,%r54,%r45; .loc 1 228 12 add.u64 %r52,%r52,%r60; .loc 1 229 15 ld.u64 %r290,[%frame+120]; add.u64 %r62,%r290,1; .loc 1 231 13 setp.eq.u64 %r291,%r62,%r127; @ %r291 bra $L33; .loc 1 229 15 st.u64 [%frame+120],%r62; bra $L47; $L31: .loc 1 206 8 setp.eq.u64 %r292,%r47,0; @ %r292 bra $L35; $L36: .loc 1 208 6167299,[%value_in]; } .loc 1 209 11 add.u64 %r89,%r89,%r85; .loc 1 206 10 add.u64 %r132,%r132,-1; .loc 1 206 8 setp.ne.u64 %r301,%r132,-1; @ %r301 bra $L36; bra $L35; $L42: .loc 1 216 9 @ %r362 bra $L37; .loc 1 219 8 @ %r363 bra $L38; bra $L39; $L37: .loc 1 217 8 ld.s8 %r308,[%r169]89bra $L39; $L38: .loc 1 219 15 mov.u64 %r131,0; $L40: .loc 1 220 3 add.u64 %r310,%r89,%r131170317,[%value_in]; } .loc 1 219 32 add.u64 %r131,%r131,%r170; .loc 1 219 8 setp.gt.s64 %r319,%r22,%r131; @ %r319 bra $L40; $L39: .loc 1 222 11 add.u64 %r89,%r89,%r85; .loc 1 212 10 add.u64 %r132,%r132,-1; .loc 1 212 8 setp.ne.u64 %r320,%r132,-1; @ %r320 bra $L42; bra $L35; $L33: .loc 1 235 20 mov.u64 %r321,0; st.u64 [%frame+120],%r321; .loc 1 239 16 add.u64 %r133,%r54,%r144; .loc 1 240 9 add.u64 %r134,%r52,%r145; .loc 1 242 14 @ %r356 bra $L1; add.u64 %r157,%frame,128; .loc 1 238 30 mov.u64 %r65,%r120; .loc 1 242 14 mov.u64 %r156,8; .loc 1 241 12 mov.u64 %r135,1; bra $L44; $L45: .loc 1 235 20 st.u64 [%r157],%r321; .loc 1 238 30 mul.lo.u64 %r65,%r77,%r76; .loc 1 239 30 mul.lo.u64 %r327,%r79,%r76; .loc 1 239 16 sub.u64 %r133,%r54,%r327; .loc 1 240 23 mul.lo.u64 %r328,%r81,%r76; .loc 1 240 9 shl.b64 %r329,%r328,3; sub.u64 %r134,%r52,%r329; .loc 1 241 12 add.u64 %r135,%r135,1; .loc 1 242 14 add.u64 %r157,%r157,8; add.u64 %r156,%r156,8; setp.eq.u64 %r330,%r135,%r366; @ %r330 bra $L1; $L44: .loc 1 250 23 ld.u64 %r331,[%r157]; add.u64 %r76,%r331,1; st.u64 [%r157],%r76; .loc 1 251 30 add.u64 %r333,%r367,%r156; ld.u64 %r77,[%r333]; .loc 1 251 20 sub.u64 %r334,%r77,%r65; add.u64 %r136,%r136,%r334; .loc 1 252 30 add.u64 %r336,%r368,%r156; ld.u64 %r79,[%r336]; .loc 1 252 20 add.u64 %r54,%r133,%r79; .loc 1 253 23 add.u64 %r338,%r369,%r156; ld.u64 %r81,[%r338]; .loc 1 253 13 shl.b64 %r339,%r81,3; add.u64 %r52,%r134,%r339; .loc 1 231 32 add.u64 
%r340,%frame,%r156; ld.u64 %r84,[%r340]; .loc 1 231 13 setp.eq.u64 %r341,%r76,%r84; @ %r341 bra $L45; bra $L47; $L4: .loc 1 93 9 ld.u64 %r343,[%r165+56]; ld.u64 %r345,[%r165+48]; .loc 1 93 7 sub.u64 %r42,%r343,%r345; add.u64 %r55,%r165,72; cvt.u32.u32 %r347,%r161; cvt.s64.s8 %r346,%r347; add.u64 %r349,%r346,%r346; add.u64 %r350,%r349,%r346; shl.b64 %r351,%r350,3; mov.u64 %r38,%r178; .loc 1 96 17 mov.u64 %r108,%r174; .loc 1 101 4 mov.u64 %r194,%r38; bra $L46; $L1: .loc 1 257_gfortran_eoshift1_8 .visible .func _gfortran_eoshift1_8 {r22.loc 1 272 3 mov.u64 %r33,1; cvta.const.u64 %r32,$LC3eoshift127_gfortran_eoshift1_8_char .visible .func _gfortran_eoshift1_8_char.loc 1 295 3 mov.u64 %r36,1; cvta.const.u64 %r35,$LC4eoshift129_gfortran_eoshift1_8_char4 .visible .func _gfortran_eoshift1_8_char4.loc 1 319 3 mov.u64 %r36,4; cvta.const.u64 %r35,space$0eoshift121 1 ret; } eoshift1_16.o/BEGIN FUNCTION DECL: eoshift1 .func eoshift1); .file 1 "../../../../libgfortran/generated/eoshift1_16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_eoshift1_16 .visible .func _gfortran_eoshift1_16_gfortran_eoshift1_16_char .visible .func _gfortran_eoshift1_16_char_gfortran_eoshift1_16_char4 .visible .func _gfortran_eoshift1_16_char4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_bounds_reduced_extents .extern .func _gfortrani_bounds_reduced_extents)13] = {114,101,116,117,114,110,32,118,97,108,117,101] = {32,0 }8] = {69,79,83,72,73,70,8473,70,84,32,97,114,103,117,109,101,110,12] = {0,0 }; // BEGIN VAR DEF: space$0 .const .align 4 .u32 space$0[1] = {32 }; // BEGIN FUNCTION DEF: eoshift1 .func eoshift1) {u64 %r200; .reg .pred %r201; .reg .u32 %r203; .reg .pred %r204; .reg .u64 %r207; .reg .u64 %r208; .reg .u32pred %r251; .reg .pred %r252; .reg .predu32u64 %r269; .reg .u64 %r271; .reg .u64pred %r297; .reg .pred %r300; .reg .u64 %r302; .reg .u64 %r305; .reg .u32 %r307; .reg .u64 %r308; .reg .pred %r313; .reg .pred %r315; .reg .pred %r317; .reg .pred %r322; .reg .pred %r324; .reg .pred %r326; .reg .u64 %r334; .reg .u64 %r342; .reg .pred %r344; .reg .pred %r361; .reg .predpred %r366; .reg .u64 %r373; .reg .pred %r375; .reg .u32u64 %r391; .reg .predu64pred.reg .pred %r445; .reg .pred %r446; .reg .predu64 %r460; .reg .u64 %r461; .reg .u64 %r462; mov.u64 %r171,%ar0; mov.u64 %r172,%ar1; mov.u64 %r173,%ar2; mov.u64 %r174,%ar3; mov.u64 %r175,%ar4; mov.u64 %r176,%ar5; mov.u64 %r177,%ar6; .loc 1 74 10 ld.u64 %r22,[%r172+16]; .loc 1 76 6 setp.eq.u64 %r178,%r175,0; @ %r178 bra $L52; .loc 1 77 21 ld.u32 %r179,[%r175]; add.u32 %r105,%r179,-1; bra $L2; $L52: .loc 1 79 11 cvt.u32.u64 %r105,%r175; $L2: .loc 1 81 13 mov.u64 %r180,1; st.u64 [%frame],%r180; .loc 1 82 12 mov.u64 %r181,0; st.u64 [%frame+120],%r181172; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r183,[%value_in]; } .loc 1 85 6 ld.u64 %r184,[%r171]; setp.ne.u64 %r185,%r184,0; @ %r185 bra $L3; .loc 1 87 19 st.u64 [%r171+8],%r184; .loc 1 88 7 ld.u64 %r187,[%r172+16]; st.u64 [%r171+16],%r187; ld.u64 %r188,[%r172+24]; st.u64 [%r171+24],%r188; .loc 1 89 34 ld.s8 %r169,[%r172+28]; .loc 1 89 7 cvt.u16.u32 %r189,%r169; setp.gt.s16 %r190,%r189,0; @ %r190 bra $L4; $L7: .loc 1 105 2418r22; call (%value_in),_gfortrani_xmallocarray193,[%value_in]; } .loc 1 105 22 st.u64 [%r171],%r193; cvta.global.u64 %r444,_gfortrani_compile_options; bra $L5; 
$L6: .loc 1 93 9 ld.u64 %r196,[%r45+8]; ld.u64 %r198,[%r45]; .loc 1 98 17 mad.lo.u64 %r108,%r43,%r108,%r108; add.u64 %r45,%r45,24; .loc 1 93 7 sub.u64 %r43,%r196,%r198; $L50: add.u64 %r125,%r171,%r35; .loc 1 101 4 st.u64 [%r125+48],%r200; st.u64 [%r125+56],%r43; st.u64 [%r125+40],%r108; .loc 1 89 7 add.u64 %r35,%r35,24; setp.ne.u64 %r201,%r35,%r425; @ %r201 bra $L6; bra $L7; $L3: .loc 1 108 12 cvta.global.u64 %r444,_gfortrani_compile_options; .loc 1 108 11 ld.u32 %r203,[%r444+36]; setp.ne.u32 %r204,%r203,0; @ ! %r204 bra $L9; .loc 1 110 7 cvta.const.u64 %r208,$LC0;r17220708; call _gfortrani_bounds_equal_extents$L5: .loc 1 114 6 ld.u32 %r210,[%r444+36]; setp.eq.u32 %r211,%r210,0; @ %r211 bra $L9; .loc 1 116 7 cvta.const.u64 %r216,$LC0; cvta.const.u64 %r21517r17215216; call _gfortrani_bounds_reduced_extents); } $L9: .loc 1 120 6 setp.eq.u64 %r217,%r183,0; @ %r217 bra $L1; .loc 1 124 23 ld.s8 %r165,[%r172+28]; cvt.u32.u32 %r218,%r165; cvt.s64.s8 %r166,%r218; .loc 1 124 3 setp.gt.s64 %r219,%r166,0; @ %r219 bra $L12; .loc 1 70 7 mov.u64 %r100,0; .loc 1 71 11 mov.u64 %r94,%r100; .loc 1 72 11 mov.u64 %r86,%r100; bra $L13; $L12: .loc 1 126 15 cvt.s64.s32 %r104,%r105; .loc 1 128 21 ld.u64 %r150,[%r171+16]; .loc 1 131 21 ld.u64 %r164,[%r172+16]; mov.u64 %r89,40; .loc 1 123 5 mov.u64 %r101,0; .loc 1 70 7 mov.u64 %r100,%r101; .loc 1 124 12 mov.u64 %r138,%r101; .loc 1 71 11 mov.u64 %r94,%r101; .loc 1 72 11 mov.u64 %r86,%r101; .loc 1 138 20 mov.u64 %r462,%r101; $L18: add.u64 %r65,%r172,%r89; .loc 1 134 17 ld.u64 %r221,[%r65+16]; add.u64 %r220,%r221,1; .loc 1 134 15 ld.u64 %r222,[%r65+8]; sub.u64 %r167,%r220,%r222; .loc 1 128 21 add.u64 %r223,%r171,%r89; ld.u64 %r224,[%r223]; mul.lo.u64 %r156,%r150,%r224; .loc 1 131 21 ld.u64 %r225,[%r65]; mul.lo.u64 %r143,%r164,%r225; .loc 1 126 10 setp.ne.u64 %r226,%r104,%r138; @ %r226 bra $L14; .loc 1 129 14 setp.ne.u64 %r227,%r156,0; .loc 1 74 8 selp.u64 %r86,%r156,%r22,%r227; .loc 1 132 14 setp.ne.u64 %r228,%r143,0; .loc 1 74 8 selp.u64 %r94,%r143,%r22,%r228; .loc 1 134 15 mov.u64 %r100,%r167; bra $L17; $L14: .loc 1 138 20 shl.b64 %r229,%r101,3; add.u64 %r230,%frame,%r229; add.u64 %r231,%r230,120; st.u64 [%r231],%r462; .loc 1 139 21 st.u64 [%r230],%r167; .loc 1 140 22 st.u64 [%r230+480],%r156; .loc 1 141 22 st.u64 [%r230+360],%r143; .loc 1 143 24 add.u64 %r245,%r101,%r101; add.u64 %r246,%r245,%r101; shl.b64 %r247,%r246,3; add.u64 %r248,%r173,%r247; .loc 1 143 22 ld.u64 %r250,[%r248+40]; st.u64 [%r230+240],%r250; .loc 1 144 12 add.u64 %r101,%r101,1; $L17: .loc 1 124 55 add.u64 %r138,%r138,1; .loc 1 124 3 add.u64 %r89,%r89,24; setp.ne.u64 %r251,%r138,%r166; @ %r251 bra $L18; $L13: .loc 1 147 14 ld.u64 %r46,[%frame+360]; .loc 1 147 6 setp.ne.u64 %r252,%r46,0; .loc 1 74 8 selp.u64 %r46,%r46,%r22,%r252; .loc 1 149 14 ld.u64 %r40,[%frame+480]; .loc 1 149 6 setp.ne.u64 %r253,%r40,0; .loc 1 74 8 selp.u64 %r40,%r40,%r22,%r253; .loc 1 151 14 ld.u64 %r42,[%frame+240]; .loc 1 151 6 setp.ne.u64 %r254,%r42,0; .loc 1 152 16 selp.u64 %r42,%r42,1,%r254; .loc 1 158 8 ld.u64 %r137,[%r171]; .loc 1 159 8 ld.u64 %r53,[%r172]; .loc 1 160 8 ld.u64 %r54,[%r173]; .loc 1 162 9 setp.eq.u64 %r255,%r137,0; @ %r255 bra $L1; .loc 1 228 12 shl.b64 %r62,%r42,4; .loc 1 166 34 shr.s64 %r429,%r100,63; .loc 1 186 19 set.u32.eq.u64 %r258,%r22,%r94; neg.s32 %r259,%r258; .loc 1 186 38 set.u32.eq.u64 %r261,%r22,%r86; neg.s32 %r262,%r261; .loc 1 186 27 cvt.u16.u32 %r264,%r259; cvt.u16.u32 %r265,%r262; and.b16 %r263,%r264,%r265; cvt.u32.u16 %r266,%r263; cvt.u32.u8 %r127,%r266; .loc 1 231 32 ld.u64 
[Binary archive data, mangled during text extraction: PTX object files from the nvptx libgfortran build. The assembly bodies are corrupted and not recoverable; only the archive member names, source paths, and symbol names survive:

eoshift1_16 member (tail only): _gfortran_eoshift1_16, _gfortran_eoshift1_16_char, _gfortran_eoshift1_16_char4
eoshift3_4 member ("../../../../libgfortran/generated/eoshift3_4.c"): _gfortran_eoshift3_4, _gfortran_eoshift3_4_char, _gfortran_eoshift3_4_char4
eoshift3_8.o ("../../../../libgfortran/generated/eoshift3_8.c"): _gfortran_eoshift3_8, _gfortran_eoshift3_8_char, _gfortran_eoshift3_8_char4
eoshift3_16.o ("../../../../libgfortran/generated/eoshift3_16.c"): _gfortran_eoshift3_16, _gfortran_eoshift3_16_char, _gfortran_eoshift3_16_char4
cshift1_4.o ("../../../../libgfortran/generated/cshift1_4.c"): _gfortran_cshift1_4, _gfortran_cshift1_4_char, _gfortran_cshift1_4_char4; references _gfortrani_cshift1_4_{i1,i2,i4,i8,r4,r8,c4,c8}
cshift1_8.o ("../../../../libgfortran/generated/cshift1_8.c", listing continues past this section): _gfortran_cshift1_8, _gfortran_cshift1_8_char, _gfortran_cshift1_8_char4; references _gfortrani_cshift1_8_{i1,i2,i4,i8,r4,r8,c4,c8}

Shared external references across these members: _gfortrani_compile_options, _gfortran_size0, _gfortrani_xmallocarray, _gfortrani_bounds_equal_extents, _gfortrani_bounds_reduced_extents, _gfortran_runtime_error.]
%r72,%r177,72; cvt.u32.u32 %r414,%r162; cvt.s64.s8 %r413,%r414; add.u64 %r416,%r413,%r413; add.u64 %r417,%r416,%r413; shl.b64 %r418,%r417,3; mov.u64 %r55,%r190; .loc 1 90 17 mov.u64 %r115,1; .loc 1 95 4 mov.u64 %r206,%r55; bra $L51; $L1_gfortran_cshift1_8 .visible .func _gfortran_cshift1_8.loc 1 337call cshift1_gfortran_cshift1_8_char .visible .func _gfortran_cshift1_8_char.loc 1 357call cshift13_gfortran_cshift1_8_char4 .visible .func _gfortran_cshift1_8_char4$LFB8:call cshift1ret; $LFE8: } cshift1_16.o/BEGIN FUNCTION DECL: cshift1 .func cshift1../libgfortran/generated/cshift1_16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_cshift1_16 .visible .func _gfortran_cshift1_16gfortran_cshift1_16_char .visible .func _gfortran_cshift1_16_char_gfortran_cshift1_16_char4 .visible .func _gfortran_cshift1_16_char4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_bounds_reduced_extents .extern .func _gfortrani_bounds_reduced_extents); // BEGIN GLOBAL FUNCTION DECL: _gfortrani_cshift1_16_i1 .extern .func _gfortrani_cshift1_16_i1gfortrani_cshift1_16_i2 .extern .func _gfortrani_cshift1_16_i2gfortrani_cshift1_16_i4 .extern .func _gfortrani_cshift1_16_i4gfortrani_cshift1_16_i8 .extern .func _gfortrani_cshift1_16_i8gfortrani_cshift1_16_r4 .extern .func _gfortrani_cshift1_16_r4gfortrani_cshift1_16_r8 .extern .func _gfortrani_cshift1_16_r8gfortrani_cshift1_16_c4 .extern .func _gfortrani_cshift1_16_c4gfortrani_cshift1_16_c8 .extern .func _gfortrani_cshift1_16_c8modti3 .extern .func __modti3114,101,116,117,114,110,32,118,97,108,117,101,0 }03,117,109,101,110,116,32,39,68,73,77,39,32,105,115,32,111,117,116,32,111,102,32,114,97,110,103,101,32,105,110,32,99,97,108,108,32,116,111,32,39,67,83,72,73,70,84,397] = {67,83,72,73,70,8473,70,84,32,97,114,103,117,109,101,110,116,0 }; // BEGIN FUNCTION DEF: cshift1 .func cshift165pred %r223; .reg .u32 %r224; .reg .u32pred %r230; .reg .pred %r231; .reg .pred %r232; .reg .pred %r233; .reg .predpredpred %r242; .reg .predpred %r286; .reg .pred %r287; .reg .predu64 %r310; .reg .pred %r311; .reg .pred %r312; .reg .pred %r313; .reg .predpred %r331; .reg .u64 %r333; .reg .u64 %r336; .reg .u32 %r338; .reg .u64 %r339; .reg .u64 %r342; .reg .pred %r345; .reg .pred %r348; .reg .pred %r351; .reg .pred %r353; .reg .pred %r371; .reg .u64 %r373; .reg .u64 %r376; .reg .u32 %r378; .reg .u64 %r379; .reg .u64 %r384; .reg .u64 %r387; .reg .u32 %r389; .reg .u64 %r390; .reg .u64 %r393; .reg .pred %r396; .reg .predpred %r446; .reg .pred %r448; .reg .pred %r450; .reg .u64 %r457; .reg .pred %r459; .reg .u64 %r460; .reg .predu64 %r475; .reg .u64 %r477; .reg .u64 %r479; .reg .u64 %r480; .reg .u64 %r481; .reg .pred %r482; .reg .pred %r485; .reg .pred %r488; .reg .pred %r491; .reg .u64 %r509; .reg .u64 %r511; .reg .u64 %r512; .reg .u32u64 %r520; .reg .u64 %r521; .reg .pred %r538; .reg .pred %r540; .reg .u64 %r542; .reg .u64u64 %r554; mov.u64 %r177,%ar0; mov.u64 %r178,%ar1; mov.u64 %r179,%ar2; mov.u64 %r180,%ar3; .loc 1 66 6 setp.eq.u64 %r181,%r180,0; @ %r181 bra $L61; .loc 1 67 21 ld.u32 %r182,[%r180]; add.u32 %r122,%r182,-1; .loc 1 71 6 setp.lt.s32 %r183,%r122,0; @ ! 
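For orientation before the 16-byte body: the _gfortran_cshift1_8/_16 entry points and their _gfortrani_cshift1_* helpers back the Fortran CSHIFT intrinsic, which circularly shifts an array along one dimension. Below is a minimal, self-contained C sketch of the rank-1 semantics only; it is not the libgfortran implementation, and the name cshift1_demo is invented for illustration.

#include <stdio.h>

/* Illustrative sketch only -- not the libgfortran implementation.
 * cshift1_demo (an invented name) shows the rank-1 semantics of
 * Fortran's CSHIFT: element i of the result is element
 * MOD(i + shift, n) of the source, i.e. a circular shift. */
static void cshift1_demo(const long *src, long *dst, int n, int shift)
{
    for (int i = 0; i < n; i++)
        dst[i] = src[((i + shift) % n + n) % n];   /* handles negative shifts too */
}

int main(void)
{
    long a[5] = {1, 2, 3, 4, 5}, b[5];
    cshift1_demo(a, b, 5, 2);
    for (int i = 0; i < 5; i++)
        printf("%ld ", b[i]);          /* prints: 3 4 5 1 2 */
    printf("\n");
    return 0;
}

A positive shift moves elements toward lower indices, so shifting {1,2,3,4,5} by 2 yields 3 4 5 1 2; the type-specialized _gfortrani_cshift1_* helpers perform the equivalent copy for each supported element kind.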
The cshift1 body for the 16-byte case mirrors the 8-byte one: DIM validation, result sizing via _gfortran_size0, allocation of an unallocated result via _gfortrani_xmallocarray, optional bounds checks gated on _gfortrani_compile_options, the type-keyed dispatch to _gfortrani_cshift1_16_i1/_i2/_i4/_i8/_r4/_r8/_c4/_c8, and a generic fallback copy loop; the shift normalization is carried in 128-bit register pairs, which is what the extern __modti3 declaration serves. The exported wrappers _gfortran_cshift1_16 (.loc 1 337), _gfortran_cshift1_16_char (.loc 1 357) and _gfortran_cshift1_16_char4 each call cshift1 and return.

reshape_i4.o -- _gfortran_reshape_4, generated from ../../../../libgfortran/generated/reshape_i4.c:
  extern: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_reshape_packed
  first diagnostic string (decoded): "Incorrect size in SOURCE argument to RESHAPE intrinsic: is %ld, should be %ld"
Remaining diagnostic strings for reshape_i4.o (decoded from the byte initializers, with gaps where the lists are damaged): "rank of return array incorrect in RESHAPE intrinsic...", "...value %ld in ORDER argument to RESHAPE intrinsic...", "Incorrect extent in return value of RESHAPE intrinsic in dimension %ld: is %ld, should be %ld", "...out of range in ORDER argument to RESHAPE intrinsic...", "shape and target do not conform...".

Body of _gfortran_reshape_4 (register declarations followed by code): it takes the RET, SOURCE, SHAPE, PAD and ORDER descriptors as %ar0..%ar4, derives the result rank from the SHAPE extent and reports a mismatch through _gfortran_runtime_error, copies the requested extents into a frame-local array, builds the dimension triplets of an unallocated RET and allocates its storage with _gfortrani_xmallocarray, validates SOURCE/PAD sizes and ORDER values when checking is enabled in _gfortrani_compile_options, calls _gfortrani_reshape_packed when source, result and pad are all contiguous, and otherwise runs a general odometer-style copy loop over 4-byte elements (ld.u32/st.u32 with per-dimension stride updates).

reshape_i8.o -- _gfortran_reshape_8, generated from ../../../../libgfortran/generated/reshape_i8.c, with the same extern declarations (_gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_reshape_packed) and the same "Incorrect size in SOURCE argument to RESHAPE intrinsic: is %ld, should be %ld" string.
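All three reshape_* members implement the Fortran RESHAPE intrinsic, which consumes SOURCE elements in array-element (column-major) order and stores them into a result of the requested shape, optionally drawing extra elements from PAD and permuting dimensions per ORDER. The following is a minimal C sketch of the basic column-major copy only (no PAD/ORDER handling); it is purely illustrative and not the libgfortran code.

#include <stdio.h>

/* Illustrative sketch only -- not the libgfortran implementation.
 * Shows RESHAPE's basic rule: SOURCE elements are consumed in
 * array-element (column-major) order and stored into the result,
 * also in column-major order.  PAD and ORDER are ignored here. */
int main(void)
{
    long source[6] = {1, 2, 3, 4, 5, 6};
    long result[3][2];                  /* Fortran shape (2,3): 3 columns of 2 */

    for (int k = 0; k < 6; k++)
        result[k / 2][k % 2] = source[k];   /* column-major fill */

    /* Print as the Fortran array result(2,3) would be written row by row. */
    for (int row = 0; row < 2; row++) {
        for (int col = 0; col < 3; col++)
            printf("%ld ", result[col][row]);
        printf("\n");                   /* prints: 1 3 5 / 2 4 6 */
    }
    return 0;
}

With SOURCE = [1..6] and shape (2,3) the columns fill as (1,2), (3,4), (5,6). The generic copy loop in the PTX above performs the same walk with explicit strides, while the _gfortrani_reshape_packed fast path covers the fully contiguous case.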
The remaining reshape_i8.o diagnostic strings match those of reshape_i4.o. The body of _gfortran_reshape_8 is identical in structure to _gfortran_reshape_4, differing only in the element size: the strides and the generic copy loop operate on 8-byte elements (ld.u64/st.u64, shifts by 3 instead of 2).

reshape_i16.o -- _gfortran_reshape_16, generated from ../../../../libgfortran/generated/reshape_i16.c, again with extern declarations for _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray and _gfortrani_reshape_packed, and the same "Incorrect size in SOURCE argument to RESHAPE intrinsic: is %ld, should be %ld" string.
The remaining reshape_i16.o diagnostic strings again match those of reshape_i4.o. The _gfortran_reshape_16 definition opens with its register declarations and the same argument setup as the 4- and 8-byte variants: rank derivation from the SHAPE descriptor, collection of the requested extents into the frame, allocation of an unallocated RET via _gfortrani_xmallocarray with 16-byte elements, PAD extent collection and ORDER validation under _gfortrani_compile_options. Its body continues below.
%r404,%r403,3; add.u64 %r405,%r257,%r404; ld.u64 %r75,[%r405+40]; .loc 1 221 18 st.u64 [%r46],%r75; .loc 1 222 20 ld.u64 %r414,[%r405+56]; add.u64 %r413,%r414,1; ld.u64 %r421,[%r405+48]; sub.u64 %r79,%r413,%r421; .loc 1 223 10 setp.lt.s64 %r422,%r79,0; @ %r422 bra $L25; .loc 1 222 18 st.u64 [%r91],%r79; bra $L26; $L25: .loc 1 224 20 st.u64 [%r91],%r400; mov.u64 %r79,%r400; $L26: .loc 1 226 35 shl.b64 %r424,%r156,3; add.u64 %r425,%frame,%r424; .loc 1 226 10 ld.u64 %r426,[%r425]; setp.eq.u64 %r427,%r426,%r79; @ %r427 bra $L27; .loc 1 227 9 cvta.const.u64 %r4284_gfortran_runtime_error27: .loc 1 229 10 setp.ne.u64 %r430,%r75,%r136; @ %r430 bra $L58; .loc 1 230 15 mul.lo.u64 %r136,%r136,%r79; bra $L28; $L58: .loc 1 232 15 mov.u64 %r136,0; $L28: .loc 1 233 10 setp.le.s64 %r431,%r79,0; @ %r431 bra $L1; .loc 1 212 37 add.u64 %r185,%r185,1; .loc 1 212 3 add.u64 %r91,%r91,8; add.u64 %r83,%r83,8; add.u64 %r46,%r46,8; setp.ne.u64 %r432,%r161,%r185; @ %r432 bra $L29; .loc 1 237 8 ld.s8 %r194,[%r258+28]; .loc 1 247 17 shl.b64 %r435,%r194,3; add.u64 %r590,%frame,720; mov.u32 %r440435442,[%value_in]; } add.u64 %r197,%r258,40; add.u64 %r589,%frame,480; mov.u64 %r195,%r589; add.u64 %r592,%frame,600; mov.u64 %r169,%r592; add.u64 %r445,%r194,%r194; add.u64 %r446,%r445,%r194; shl.b64 %r447,%r446,3; add.u64 %r134,%r447,%r197; .loc 1 243 9 mov.u64 %r143,1; .loc 1 253 15 mov.u64 %r606,0; .loc 1 252 11 cvt.u32.u64 %r607,%r143; $L33: .loc 1 248 20 ld.u64 %r86,[%r197]; .loc 1 248 18 st.u64 [%r195],%r86; .loc 1 249 20 ld.u64 %r450,[%r197+16]; add.u64 %r449,%r450,1; ld.u64 %r451,[%r197+8]; sub.u64 %r90,%r449,%r451; .loc 1 250 10 setp.le.s64 %r452,%r90,0; @ %r452 bra $L30; .loc 1 249 18 st.u64 [%r169],%r90; bra $L31; $L30: .loc 1 253 15 st.u64 [%r169],%r606; .loc 1 252 11 mov.u32 %r151,%r607; $L31: .loc 1 256 10 setp.ne.u64 %r454,%r86,%r143; @ %r454 bra $L59; .loc 1 257 15 ld.u64 %r455,[%r169]; mul.lo.u64 %r143,%r143,%r455; bra $L32; $L59: .loc 1 259 15 mov.u64 %r143,0; $L32: .loc 1 245 3 add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; add.u64 %r169,%r169,8; setp.ne.u64 %r456,%r134,%r197; @ %r456 bra $L33; .loc 1 267 34 ld.u64 %r82,[%r257]; .loc 1 267 68 ld.u64 %r183,[%r258]; .loc 1 262 13 set.u32.ne.u64 %r458,%r136,0; neg.s32 %r459,%r458; .loc 1 262 32 set.u32.ne.u64 %r461,%r146,0; neg.s32 %r462,%r461; cvt.u16.u32 %r464,%r459; cvt.u16.u32 %r465,%r462; and.b16 %r463,%r464,%r465; .loc 1 262 27 set.u32.ne.u64 %r467,%r143,0; neg.s32 %r468,%r467; .loc 1 262 32 cvt.u16.u32 %r472,%r468; and.b16 %r470,%r463,%r472; cvt.u32.u16 %r473,%r470; cvt.u16.u8 %r474,%r473; setp.eq.u16 %r475,%r474,0; @ %r475 bra $L34; .loc 1 264 13 shl.b64 %r96,%r136,4; .loc 1 265 13 shl.b64 %r98,%r143,4; .loc 1 266 13 shl.b64 %r100,%r146,4; .loc 1 267 7 {1831100; call _gfortrani_reshape_packed269 7 bra $L1; $L34: .loc 1 273 12 ld.u64 %r181,[%frame+840]; .loc 1 274 12 ld.u64 %r140,[%frame+480]; .loc 1 276 6 and.b32 %r482,%r151,%r149; setp.eq.u32 %r483,%r482,0; @ %r483 bra $L3636: .loc 1 279 6 setp.eq.u32 %r484,%r151,0; @ %r484 bra $L37; .loc 1 285 7 setp.le.s64 %r485,%r145,0; @ %r485 bra $L60; .loc 1 290 13 ld.u64 %r140,[%frame+120]; .loc 1 287 16 shl.b64 %r207,%r145,3; add.u64 %r487,%frame,36288 17 add.u64 %r497,%frame,259mov.u64 %r236,%r140; mov.u64 %r200,0; .loc 1 289 26 add.u64 %r605,%frame,120; $L38: .loc 1 289 17 add.u64 %r507,%r589,%r200; st.u64 [%r507],%r236; .loc 1 285 7 add.u64 %r200,%r200,8; setp.eq.u64 %r508,%r200,%r207; @ %r508 bra $L61; .loc 1 289 26 add.u64 %r510,%r605,%r200; ld.u64 %r236,[%r510]; bra $L38; $L50: .loc 1 297 15 ld.u64 
%r587,[%r183]; ld.u64 %r588,[%r183+8]; .loc 1 297 13 st.u64 [%r82],%r587; st.u64 [%r82+8],%r588; .loc 1 299 12 add.u64 %r82,%r82,%r108; .loc 1 300 11 add.u64 %r183,%r183,%r593; .loc 1 301 16 add.u64 %r157,%r157,1; .loc 1 302 16 add.u64 %r242,%r242,1; st.u64 [%frame+720],%r242; .loc 1 306 13 setp.ne.u64 %r514,%r157,%r222; @ %r514 bra $L39; .loc 1 315 14 @ %r604 bra $L40; add.u64 %r173,%frame,1088; add.u64 %r104,%frame,848; add.u64 %r170,%frame,968; .loc 1 313 30 mov.u64 %r115,%r138; .loc 1 310 21 mov.u64 %r596,0; bra $L41; $L42: st.u64 [%r173],%r596; .loc 1 313 30 mul.lo.u64 %r115,%r118,%r117; .loc 1 315 14 add.u64 %r173,%r173,8; add.u64 %r104,%r104,8; add.u64 %r170,%r170,8; setp.eq.u64 %r520,%r173,%r201; @ %r520 bra $L40; $L41: .loc 1 323 24 ld.u64 %r521,[%r173]; add.u64 %r117,%r521,1; st.u64 [%r173],%r117; .loc 1 324 30 ld.u64 %r118,[%r104]; .loc 1 324 20 sub.u64 %r522,%r118,%r115; shl.b64 %r523,%r522,4; add.u64 %r82,%r82,%r523; .loc 1 306 34 ld.u64 %r120,[%r170]; .loc 1 306 13 setp.eq.u64 %r524,%r117,%r120; @ %r524 bra $L42; .loc 1 329 13 setp.ne.u64 %r525,%r242,%r594; @ %r525 bra $L62; .loc 1 310 21 mov.u64 %r157,0; bra $L51; $L64: mov.u64 %r157,0; .loc 1 329 13 mov.u64 %r82,%r157; $L51: .loc 1 333 21 mov.u64 %r526,0; st.u64 [%frame+720],%r526; .loc 1 336 29 ld.u64 %r527,[%frame+480]; mul.lo.u64 %r124,%r594,%r527; .loc 1 338 14 setp.ne.u64 %r528,%r194,1; @ %r528 bra $L44; $L47: .loc 1 340 19 set.u32.ne.u64 %r530,%r168,0; neg.s32 %r531,%r530; .loc 1 340 24 cvt.u16.u32 %r536,%r531; and.b16 %r535,%r536,%r599; .loc 1 340 18 cvt.u32.u16 %r538,%r535; cvt.u16.u8 %r539,%r538; setp.ne.u16 %r540,%r539,0; @ %r540 bra $L45; .loc 1 354 19 mov.u64 %r183,%r147; bra $L46; $L49: .loc 1 333 21 st.u64 [%r217],%r526; .loc 1 336 29 mul.lo.u64 %r124,%r132,%r131; .loc 1 337 12 add.u64 %r182,%r182,1; .loc 1 338 14 add.u64 %r217,%r217,8; setp.eq.u64 %r542,%r194,%r182; @ ! 
%r542 bra $L48; bra $L47; $L45: .loc 1 345 19 @ %r600 bra $L63; .loc 1 347 35552,[%value_in]; } .loc 1 348 365960220562,[%value_in]; } .loc 1 349 3620572,[%value_in]; } .loc 1 350 32 ld.u64 %r140,[%frame+480]; .loc 1 354 19 mov.u64 %r183,%r147; .loc 1 350 32 mov.u64 %r194,%r145; .loc 1 343 24 mov.u64 %r168,0; bra $L46; $L44: add.u64 %r217,%frame,728; .loc 1 337 12 mov.u64 %r182,1; $L48: .loc 1 359 24 ld.u64 %r575,[%r217]; add.u64 %r131,%r575,1; st.u64 [%r217],%r131; shl.b64 %r184,%r182,3; .loc 1 360 29 add.u64 %r577,%r589,%r184; ld.u64 %r132,[%r577]; .loc 1 360 19 sub.u64 %r578,%r132,%r124; shl.b64 %r579,%r578,4; add.u64 %r183,%r183,%r579; .loc 1 329 34 add.u64 %r581,%r592,%r184; ld.u64 %r135,[%r581]; .loc 1 329 13 setp.eq.u64 %r582,%r131,%r135; @ %r582 bra $L49; bra $L46; $L60: .loc 1 283 12 mov.u64 %r183,%r147; .loc 1 285 7 mov.u64 %r194,%r145; bra $L37; $L61: .loc 1 283 12 mov.u64 %r183,%r147; mov.u64 %r194,%r145; $L37: .loc 1 294 9 setp.eq.u64 %r583,%r82,0; @ %r583 bra $L1; .loc 1 299 12 shl.b64 %r108,%r181,4; ld.u64 %r157,[%frame+1080]; .loc 1 306 34 ld.u64 %r222,[%frame+960]; .loc 1 313 30 mul.lo.u64 %r138,%r222,%r181; .loc 1 329 34 ld.u64 %r594,[%frame+600]; .loc 1 302 13 ld.u64 %r242,[%frame+720]; add.u64 %r201,%r51,%r253; .loc 1 347 35 shl.b64 %r205,%r145,3; mov.u64 %r168,%r183; shl.b64 %r593,%r140,4; .loc 1 340 24 set.u32.ne.u64 %r597,%r260,0; neg.s32 %r598,%r597; cvt.u16.u32 %r599,%r598; .loc 1 345 19 setp.le.s64 %r600,%r145,0; .loc 1 347 35 add.u64 %r601,%frame,360; .loc 1 348 36 add.u64 %r602,%frame,240; .loc 1 349 36 add.u64 %r603,%frame,120; .loc 1 315 14 setp.eq.u64 %r604,%r161,1; bra $L50; $L63: .loc 1 354 19 mov.u64 %r183,%r147; .loc 1 345 19 mov.u64 %r194,%r145; .loc 1 343 24 mov.u64 %r168,0; $L46: .loc 1 294 9 setp.eq.u64 %r584,%r82,0; @ %r584 bra $L1; shl.b64 %r593,%r140,4; ld.u64 %r594,[%frame+600]; bra $L43; $L62: .loc 1 310 21 mov.u64 %r157,0; $L43: .loc 1 302 13 ld.u64 %r242,[%frame+720]; bra $L50; $L39: .loc 1 329 13 setp.eq.u64 %r585,%r242,%r594; @ %r585 bra $L51; bra $L50; $L40: setp.eq.u64 %r586,%r242,%r594; @ %r586 bra $L64; $L1: .loc 1 364reshape_r4.o/_gfortran_reshape_r4 .visible .func _gfortran_reshape_.file 1 "../../../../libgfortran/generated/reshape_r4.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray_gfortrani_reshape_packed .extern .func _gfortrani_reshape_packed73,110,99,111,114,114,101,99,116,32,115,105,122,101,32,105,110,32,83,79,85,82,67,69,32,97,114,103,117,109,101,110,116,32,116,111,32,82,69,83,72,65,80,69,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 
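The decimal runs embedded in each member above are the ASCII byte values of the libgfortran runtime-error strings. A minimal decoding sketch (plain Python, written only for illustration; the helper name decode_ascii_run is not part of the package):

def decode_ascii_run(run: str) -> str:
    # Split a comma-separated run of decimal byte values and drop any trailing NUL.
    codes = [int(tok) for tok in run.split(",") if tok.strip()]
    return "".join(chr(c) for c in codes if c != 0)

# Example: the SOURCE-size check string from the reshape_r4.o member.
run = ("73,110,99,111,114,114,101,99,116,32,115,105,122,101,32,105,110,32,"
       "83,79,85,82,67,69,32,97,114,103,117,109,101,110,116,32,116,111,32,"
       "82,69,83,72,65,80,69,32,105,110,116,114,105,110,115,105,99")
print(decode_ascii_run(run))  # -> Incorrect size in SOURCE argument to RESHAPE intrinsic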
[Binary archive residue: nvptx PTX dump of _gfortran_reshape_r4 (.visible .func, same five-argument signature and the same embedded RESHAPE error strings as above). The element copy loop uses 4-byte loads and stores (ld.f32/st.f32) and 4-byte strides (shl.b64 ...,2), consistent with real(4) data; the rest of the instruction stream is fused and truncated. The member ends and the next begins: reshape_r8.o / _gfortran_reshape_r8, .file 1 "../../../../libgfortran/generated/reshape_r8.c", with the same extern declarations (_gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_reshape_packed) and the same "Incorrect size in SOURCE argument to RESHAPE intrinsic: is %ld, should be %ld" string.]
[Binary archive residue: nvptx PTX dump of _gfortran_reshape_r8, analogous to the r4 variant but with 8-byte loads and stores (ld.f64/st.f64) and 8-byte strides, consistent with real(8) data. A reshape_r10 member name appears, but its body is not recoverable here. The next archive member begins: reshape_c4.o / _gfortran_reshape_c4, .file 1 "../../../../libgfortran/generated/reshape_c4.c", with the same extern declarations and the same embedded RESHAPE error strings.]
[Binary archive residue: nvptx PTX dump of _gfortran_reshape_c4. The copy loop moves pairs of f32 values per element (ld.f32/st.f32 at offsets 0 and +4), consistent with complex(4) data; the register declarations and the remainder of the instruction stream are fused, and the dump is cut off at the end of this section.]
%r588,[%frame+600]; bra $L43; $L62: .loc 1 310 21 mov.u64 %r158,0; $L43: .loc 1 302 13 ld.u64 %r243,[%frame+720]; bra $L50; $L39: .loc 1 329 13 setp.eq.u64 %r584,%r243,%r588; @ %r584 bra $L51; bra $L50; $L40: setp.eq.u64 %r585,%r243,%r588; @ %r585 bra $L64; $L1: .loc 1 364reshape_c8.o/_gfortran_reshape_c8 .visible .func _gfortran_reshape_c8.file 1 "../../../../libgfortran/generated/reshape_c8.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray_gfortrani_reshape_packed .extern .func _gfortrani_reshape_packed73,110,99,111,114,114,101,99,116,32,115,105,122,101,32,105,110,32,83,79,85,82,67,69,32,97,114,103,117,109,101,110,116,32,116,111,32,82,69,83,72,65,80,69,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,82,69,83,72,65,80,69,32,105,110,116,114,105,11097,108,117,101,32,37,108,100,32,105,110,32,79,82,68,69,82,32,97,114,103,117,109,101,110,116,32,116,111,32,82,69,83,72,65,80,69,32,105,110,116,114,105,110,115,105,9910,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,82,69,83,72,65,80,69,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10,111,117,116,32,111,102,32,114,97,110,103,101,32,105,110,32,79,82,68,69,82,32,97,114,103,117,109,101,110,116,32,116,111,32,82,69,83,72,65,80,69,32,105115,104,97,112,101,32,97,110,100,32,116,97,114,103,101,116,32,100,111,32,110,111,116,32,99,111,110,102,111,114_gfortran_reshape_c8 .visible .func _gfortran_reshape_c8 {120u64predpredu64u64 %r310; .reg .u64 %r311; .reg .pred %r312; .reg .predpredpredu64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64pred %r340; .reg .predu32u64 %r399; .reg .u64u64 %r427; .reg .predpred %r457; .reg .u32 %r459; .reg .u32 %r460; .reg .u32 %r462; .reg .u32 %r463; .reg .u16 %r464; .reg .u16 %r465; .reg .u16 %r466; .reg .u32 %r468; .reg .u32 %r469; .reg .u16 %r471; .reg .u16 %r473; .reg .u32 %r474; .reg .u16 %r475; .reg .pred %r476; .reg .u32 %r483; .reg .pred %r484; .reg .pred %r485; .reg .pred %r486; .reg .u64 %r488; .reg .u64 %r495; .reg .u64 %r498; .reg .u64 %r505; .reg .u64 %r508; .reg .pred %r509; .reg .u64 %r511; .reg .pred %r513; .reg .pred %r519; .reg .u64 %r520; .reg .u64 %r521; .reg .u64 %r522; .reg .pred %r523; .reg .pred %r524; .reg .u64 %r525; .reg .u64 %r526; .reg .pred %r527; .reg .u32 %r529; .reg .u32 %r530; .reg .u16 %r534; .reg .u16 %r535; .reg .u32pred %r541; .reg .u64 %r551; .reg .u64 %r561; .reg .u64 %r571; .reg .u64 %r574; .reg .u64 %r576; .reg .u64 %r577; .reg .u64 %r578; .reg .u64 %r580; .reg .pred %r581; .reg .pred %r582; .reg .pred %r583; .reg .pred %r584; .reg .pred %r585; .reg .u64 %r586; .reg .u64 %r587; .reg .u64 %r588; .reg .u64 %r589; .reg .u64 %r590; .reg .pred %r591; .reg .u64 %r593; .reg .u32 %r594; .reg .u32 %r595; .reg .u16 %r596; .reg .pred78 10 ld.u64 %r264,[%r260+56]; add.u64 %r263,%r264,1; .loc 1 78 8 ld.u64 %r265,[%r260+48]; sub.u64 %r162,%r263,%r265; .loc 1 83 15 ld.s8 %r266,[%r258+28]; .loc 1 83 6 setp.eq.u64 %r267,%r266,%r162; @ %r267 bra $L2; .loc 1 84_gfortran_runtime_errorld.u64 %r270,[%r260+40]; shl.b64 %r250,%r270,3; ld.u64 
%r248,[%r260]; mov.u64 %r231,%frame; shl.b64 %r254,%r162,3; add.u64 %r255,%frame,%r254; .loc 1 90 44 mov.u64 %r251,%frame; .loc 1 86 15 mov.u32 %r152,0; .loc 1 93 23 mov.u64 %r608,0; .loc 1 94 14 mov.u32 %r609,1; $L5: .loc 1 90 39 ld.u64 %r32,[%r248]; .loc 1 91 10 setp.le.s64 %r271,%r32,0; @ %r271 bra $L3; .loc 1 90 21 st.u64 [%r251],%r32; bra $L4; $L3: .loc 1 93 23 st.u64 [%r251],%r608; .loc 1 94 14 mov.u32 %r152,%r609; $L4: .loc 1 88 3 add.u64 %r248,%r248,%r250; add.u64 %r251,%r251,8; setp.ne.u64 %r273,%r251,%r255; @ %r273 bra $L5; .loc 1 98 6 ld.u64 %r274,[%r258]; setp.ne.u64 %r275,%r274,0; @ %r275 bra $L6; add.u64 %r114,%r258,40; add.u64 %r278,%r162,%r162; add.u64 %r279,%r278,%r162; shl.b64 %r280,%r279,3; add.u64 %r247,%r114,%r280; mov.u64 %r160,%r231; .loc 1 102 10 mov.u64 %r168,1; $L7: .loc 1 105 8 ld.u64 %r167,[%r160]; .loc 1 107 4 st.u64 [%r114+8],%r274; add.u64 %r282,%r167,-1; st.u64 [%r114+16],%r282; st.u64 [%r114],%r168; .loc 1 109 7 mul.lo.u64 %r168,%r168,%r167; .loc 1 103 7 add.u64 %r160,%r160,8; add.u64 %r114,%r114,24; setp.ne.u64 %r283,%r114,%r247; @ %r283 bra $L7; .loc 1 111 19 st.u64 [%r258+8],%r274; .loc 1 118 24 max.s64 %r287,%r168,0; mov.u64 %r286,16286; call (%value_in),_gfortrani_xmallocarray288,[%value_in]; } .loc 1 118 22 st.u64 [%r258],%r288; .loc 1 119 23 st.u8 [%r258+28],%r162; $L6: .loc 1 122 6 setp.ne.u32 %r290,%r152,0; @ %r290 bra $L1; .loc 1 125 6 setp.eq.u64 %r291,%r261,0; @ %r291 bra $L53; .loc 1 127 12 ld.s8 %r146,[%r261+28]; .loc 1 130 7 setp.le.s64 %r293,%r146,0; @ %r293 bra $L54; .loc 1 132 21 shl.b64 %r295,%r146,3; add.u64 %r296,%frame,36295302,[%value_in]; } add.u64 %r140,%r261,40; add.u64 %r145,%frame,120; add.u64 %r52,%frame,240; add.u64 %r306,%r146,%r146; add.u64 %r307,%r306,%r146; shl.b64 %r308,%r307,3; add.u64 %r112,%r140,%r308; .loc 1 129 14 mov.u32 %r150,%r152; .loc 1 128 13 mov.u64 %r147,1; .loc 1 138 19 mov.u64 %r606,0; .loc 1 137 15 cvt.u32.u64 %r607,%r147; $L14: .loc 1 133 24 ld.u64 %r39,[%r140]; .loc 1 133 22 st.u64 [%r145],%r39; .loc 1 134 24 ld.u64 %r310,[%r140+16]; add.u64 %r309,%r310,1; ld.u64 %r311,[%r140+8]; sub.u64 %r43,%r309,%r311; .loc 1 135 14 setp.le.s64 %r312,%r43,0; @ %r312 bra $L11; .loc 1 134 22 st.u64 [%r52],%r43; bra $L12; $L11: .loc 1 138 19 st.u64 [%r52],%r606; .loc 1 137 15 mov.u32 %r150,%r607; $L12: .loc 1 141 14 setp.ne.u64 %r314,%r39,%r147; @ %r314 bra $L55; .loc 1 142 19 ld.u64 %r315,[%r52]; mul.lo.u64 %r147,%r147,%r315; bra $L13; $L55: .loc 1 144 19 mov.u64 %r147,0; $L13: .loc 1 130 7 add.u64 %r140,%r140,24; add.u64 %r145,%r145,8; add.u64 %r52,%r52,8; setp.ne.u64 %r316,%r112,%r140; @ %r316 bra $L14; bra $L10; $L54: .loc 1 129 14 mov.u32 %r150,%r152; .loc 1 128 13 mov.u64 %r147,1; $L10: .loc 1 146 12 ld.u64 %r148,[%r261]; bra $L9; $L53: .loc 1 153 12 mov.u64 %r148,%r261; .loc 1 152 14 mov.u32 %r150,1; .loc 1 151 13 mov.u64 %r147,1; .loc 1 150 12 mov.u64 %r146,%r261; $L9: .loc 1 156 7 cvta.global.u64 %r317,_gfortrani_compile_options; .loc 1 156 6 ld.u32 %r318,[%r317+36]; setp.eq.u32 %r319,%r318,0; @ ! 
%r319 bra $L92; setp.eq.u64 %r591,%r262,0; bra $L15; $L92: add.u64 %r230,%r258,48; .loc 1 161 23 mov.u64 %r225,0; .loc 1 160 10 mov.u64 %r176,1; $L17: .loc 1 163 20 ld.u64 %r47,[%r231]; .loc 1 163 7 mul.lo.u64 %r176,%r176,%r47; .loc 1 164 17 ld.u64 %r321,[%r230+8]; add.u64 %r320,%r321,1; .loc 1 164 15 ld.u64 %r322,[%r230]; sub.u64 %r177,%r320,%r322; .loc 1 161 41 add.u64 %r225,%r225,1; .loc 1 165 7 setp.eq.u64 %r323,%r47,%r177; @ %r323 bra $L16; .loc 1 166 6 st.u64 [%stack+16],%r47; st.u64 [%stack+8],%r177; st.u64 [%stack],%r225;_gfortran_runtime_error16: .loc 1 161 7 add.u64 %r231,%r231,8; add.u64 %r230,%r230,24; setp.ne.u64 %r326,%r162,%r225; @ %r326 bra $L17; .loc 1 173 14 ld.s8 %r53,[%r259+28]; .loc 1 174 7 cvt.u16.u32 %r327,%r53; setp.le.s16 %r328,%r327,0; @ %r328 bra $L56; add.u64 %r194,%r259,48; add.u64 %r242,%r259,56; cvt.u32.u32 %r331,%r53; cvt.s64.s8 %r330,%r331; add.u64 %r333,%r330,%r330; add.u64 %r334,%r333,%r330; shl.b64 %r335,%r334,3; add.u64 %r235,%r194,%r335; .loc 1 172 21 mov.u64 %r156,1; $L19: .loc 1 177 9 ld.u64 %r338,[%r242]; add.u64 %r337,%r338,1; .loc 1 177 7 ld.u64 %r339,[%r194]; sub.u64 %r336,%r337,%r339; .loc 1 178 33 max.s64 %r336,%r336,0; .loc 1 178 18 mul.lo.u64 %r156,%r156,%r336; .loc 1 174 7 add.u64 %r194,%r194,24; add.u64 %r242,%r242,24; setp.ne.u64 %r340,%r235,%r194; @ %r340 bra $L19; bra $L18; $L56: .loc 1 172 21 mov.u64 %r156,1; $L18: .loc 1 181 10 setp.ge.s64 %r341,%r156,%r176; @ %r341 bra $L20; .loc 1 181 34 set.u32.eq.u64 %r343,%r261,0; neg.s32 %r344,%r343; .loc 1 181 39 cvt.u16.u32 %r347,%r150; cvt.u16.u32 %r350,%r344; or.b16 %r349,%r350,%r347; .loc 1 181 30.loc 1 182 2 st.u64 [%stack+8],%r176; st.u64 [%stack],%r156; cvta.const.u64 %r33_gfortran_runtime_error20: .loc 1 186 10 setp.eq.u64 %r591,%r262,0; @ %r591 bra $L15; .loc 1 192 14 shl.b64 %r358,%r162,2; add.u64 %r359,%frame,480; mov.u32 %r36358365,[%value_in]; } ld.u64 %r367,[%r262+40]; shl.b64 %r192,%r367,3; ld.u64 %r190,[%r262]; .loc 1 194 20 mov.u64 %r173,0; .loc 1 206 16 mov.u32 %r605,1; $L23: .loc 1 196 28 ld.u64 %r63,[%r190]; .loc 1 196 10 add.u64 %r172,%r63,-1; .loc 1 198 14 shr.u64 %r369,%r172,63; .loc 1 198 23 set.u32.le.s64 %r371,%r162,%r172; neg.s32 %r372,%r371; .loc 1 198 18 cvt.u16.u64 %r375,%r369; cvt.u16.u32 %r376,%r372; or.b16 %r374,%r375,%r376; .loc 1 198 11 cvt.u32.u16 %r377,%r374; cvt.u16.u8 %r378,%r377; setp.eq.u16 %r379,%r378,0; @ %r379 bra $L21; .loc 1 199 3 st.u64 [%stack],%r63_gfortran_runtime_errorshl.b64 %r382,%r172,2; add.u64 %r383,%frame,%r382; add.u64 %r384,%r383,480; .loc 1 202 11 ld.u32 %r385,[%r384]; setp.eq.u32 %r386,%r385,0; @ %r386 bra $L22;_gfortran_runtime_error22: .loc 1 206 16 st.u32 [%r384],%r605; .loc 1 194 38 add.u64 %r173,%r173,1; .loc 1 194 4 add.u64 %r190,%r190,%r192; setp.ne.u64 %r393,%r162,%r173; @ %r393 bra $L23; $L15: add.u64 %r91,%frame,960; add.u64 %r51,%frame,1080; add.u64 %r46,%frame,840; .loc 1 194 20 mov.u64 %r83,%r51; mov.u64 %r186,0; mov.u64 %r137,1; .loc 1 220 17 mov.u64 %r401,%r186; $L29: .loc 1 215 10 @ %r591 bra $L57; .loc 1 216 34 ld.u64 %r396,[%r262+40]; mul.lo.u64 %r395,%r186,%r396; .loc 1 216 31 ld.u64 %r397,[%r262]; shl.b64 %r398,%r395,3; add.u64 %r399,%r397,%r398; .loc 1 216 13 ld.u64 %r400,[%r399]; add.u64 %r157,%r400,-1; bra $L24; $L57: mov.u64 %r157,%r186; $L24: .loc 1 220 17 st.u64 [%r83],%r401; .loc 1 221 20 add.u64 %r403,%r157,%r157; add.u64 %r404,%r403,%r157; shl.b64 %r405,%r404,3; add.u64 %r406,%r258,%r405; ld.u64 %r75,[%r406+40]; .loc 1 221 18 st.u64 [%r46],%r75; .loc 1 222 20 ld.u64 %r415,[%r406+56]; add.u64 %r414,%r415,1; 
ld.u64 %r422,[%r406+48]; sub.u64 %r79,%r414,%r422; .loc 1 223 10 setp.lt.s64 %r423,%r79,0; @ %r423 bra $L25; .loc 1 222 18 st.u64 [%r91],%r79; bra $L26; $L25: .loc 1 224 20 st.u64 [%r91],%r401; mov.u64 %r79,%r401; $L26: .loc 1 226 35 shl.b64 %r425,%r157,3; add.u64 %r426,%frame,%r425; .loc 1 226 10 ld.u64 %r427,[%r426]; setp.eq.u64 %r428,%r427,%r79; @ %r428 bra $L27; .loc 1 227 9 cvta.const.u64 %r4242_gfortran_runtime_error27: .loc 1 229 10 setp.ne.u64 %r431,%r75,%r137; @ %r431 bra $L58; .loc 1 230 15 mul.lo.u64 %r137,%r137,%r79; bra $L28; $L58: .loc 1 232 15 mov.u64 %r137,0; $L28: .loc 1 233 10 setp.le.s64 %r432,%r79,0; @ %r432 bra $L1; .loc 1 212 37 add.u64 %r186,%r186,1; .loc 1 212 3 add.u64 %r91,%r91,8; add.u64 %r83,%r83,8; add.u64 %r46,%r46,8; setp.ne.u64 %r433,%r162,%r186; @ %r433 bra $L29; .loc 1 237 8 ld.s8 %r195,[%r259+28]; .loc 1 247 17 shl.b64 %r436,%r195,3; add.u64 %r590,%frame,720; mov.u32 %r441436443,[%value_in]; } add.u64 %r198,%r259,40; add.u64 %r589,%frame,480; mov.u64 %r196,%r589; add.u64 %r586,%frame,600; mov.u64 %r170,%r586; add.u64 %r446,%r195,%r195; add.u64 %r447,%r446,%r195; shl.b64 %r448,%r447,3; add.u64 %r135,%r448,%r198; .loc 1 243 9 mov.u64 %r144,1; .loc 1 253 15 mov.u64 %r603,0; .loc 1 252 11 cvt.u32.u64 %r604,%r144; $L33: .loc 1 248 20 ld.u64 %r86,[%r198]; .loc 1 248 18 st.u64 [%r196],%r86; .loc 1 249 20 ld.u64 %r451,[%r198+16]; add.u64 %r450,%r451,1; ld.u64 %r452,[%r198+8]; sub.u64 %r90,%r450,%r452; .loc 1 250 10 setp.le.s64 %r453,%r90,0; @ %r453 bra $L30; .loc 1 249 18 st.u64 [%r170],%r90; bra $L31; $L30: .loc 1 253 15 st.u64 [%r170],%r603; .loc 1 252 11 mov.u32 %r152,%r604; $L31: .loc 1 256 10 setp.ne.u64 %r455,%r86,%r144; @ %r455 bra $L59; .loc 1 257 15 ld.u64 %r456,[%r170]; mul.lo.u64 %r144,%r144,%r456; bra $L32; $L59: .loc 1 259 15 mov.u64 %r144,0; $L32: .loc 1 245 3 add.u64 %r198,%r198,24; add.u64 %r196,%r196,8; add.u64 %r170,%r170,8; setp.ne.u64 %r457,%r135,%r198; @ %r457 bra $L33; .loc 1 267 34 ld.u64 %r82,[%r258]; .loc 1 267 68 ld.u64 %r184,[%r259]; .loc 1 262 13 set.u32.ne.u64 %r459,%r137,0; neg.s32 %r460,%r459; .loc 1 262 32 set.u32.ne.u64 %r462,%r147,0; neg.s32 %r463,%r462; cvt.u16.u32 %r465,%r460; cvt.u16.u32 %r466,%r463; and.b16 %r464,%r465,%r466; .loc 1 262 27 set.u32.ne.u64 %r468,%r144,0; neg.s32 %r469,%r468; .loc 1 262 32 cvt.u16.u32 %r473,%r469; and.b16 %r471,%r464,%r473; cvt.u32.u16 %r474,%r471; cvt.u16.u8 %r475,%r474; setp.eq.u16 %r476,%r475,0; @ %r476 bra $L34; .loc 1 264 13 shl.b64 %r96,%r137,4; .loc 1 265 13 shl.b64 %r98,%r144,4; .loc 1 266 13 shl.b64 %r100,%r147,4; .loc 1 267 7 {1841100; call _gfortrani_reshape_packed269 7 bra $L1; $L34: .loc 1 273 12 ld.u64 %r182,[%frame+840]; .loc 1 274 12 ld.u64 %r141,[%frame+480]; .loc 1 276 6 and.b32 %r483,%r152,%r150; setp.eq.u32 %r484,%r483,0; @ %r484 bra $L3636: .loc 1 279 6 setp.eq.u32 %r485,%r152,0; @ %r485 bra $L37; .loc 1 285 7 setp.le.s64 %r486,%r146,0; @ %r486 bra $L60; .loc 1 290 13 ld.u64 %r141,[%frame+120]; .loc 1 287 16 shl.b64 %r208,%r146,3; add.u64 %r488,%frame,368495,[%value_in]; } .loc 1 288 17 add.u64 %r498,%frame,2208505,[%value_in]; } mov.u64 %r237,%r141; mov.u64 %r201,0; .loc 1 289 26 add.u64 %r602,%frame,120; $L38: .loc 1 289 17 add.u64 %r508,%r589,%r201; st.u64 [%r508],%r237; .loc 1 285 7 add.u64 %r201,%r201,8; setp.eq.u64 %r509,%r201,%r208; @ %r509 bra $L61; .loc 1 289 26 add.u64 %r511,%r602,%r201; ld.u64 %r237,[%r511]; bra $L38; $L50: .loc 1 297 15 ld.f64 %r106,[%r184]; ld.f64 %r107,[%r184+8]; .loc 1 297 13 st.f64 [%r82],%r106; st.f64 [%r82+8],%r107; .loc 1 299 12 add.u64 
%r82,%r82,%r109; .loc 1 300 11 add.u64 %r184,%r184,%r587; .loc 1 301 16 add.u64 %r158,%r158,1; .loc 1 302 16 add.u64 %r243,%r243,1; st.u64 [%frame+720],%r243; .loc 1 306 13 setp.ne.u64 %r513,%r158,%r223; @ %r513 bra $L39; .loc 1 315 14 @ %r601 bra $L40; add.u64 %r174,%frame,1088; add.u64 %r104,%frame,848; add.u64 %r171,%frame,968; .loc 1 313 30 mov.u64 %r116,%r139; .loc 1 310 21 mov.u64 %r593,0; bra $L41; $L42: st.u64 [%r174],%r593; .loc 1 313 30 mul.lo.u64 %r116,%r119,%r118; .loc 1 315 14 add.u64 %r174,%r174,8; add.u64 %r104,%r104,8; add.u64 %r171,%r171,8; setp.eq.u64 %r519,%r174,%r202; @ %r519 bra $L40; $L41: .loc 1 323 24 ld.u64 %r520,[%r174]; add.u64 %r118,%r520,1; st.u64 [%r174],%r118; .loc 1 324 30 ld.u64 %r119,[%r104]; .loc 1 324 20 sub.u64 %r521,%r119,%r116; shl.b64 %r522,%r521,4; add.u64 %r82,%r82,%r522; .loc 1 306 34 ld.u64 %r121,[%r171]; .loc 1 306 13 setp.eq.u64 %r523,%r118,%r121; @ %r523 bra $L42; .loc 1 329 13 setp.ne.u64 %r524,%r243,%r588; @ %r524 bra $L62; .loc 1 310 21 mov.u64 %r158,0; bra $L51; $L64: mov.u64 %r158,0; .loc 1 329 13 mov.u64 %r82,%r158; $L51: .loc 1 333 21 mov.u64 %r525,0; st.u64 [%frame+720],%r525; .loc 1 336 29 ld.u64 %r526,[%frame+480]; mul.lo.u64 %r125,%r588,%r526; .loc 1 338 14 setp.ne.u64 %r527,%r195,1; @ %r527 bra $L44; $L47: .loc 1 340 19 set.u32.ne.u64 %r529,%r169,0; neg.s32 %r530,%r529; .loc 1 340 24 cvt.u16.u32 %r535,%r530; and.b16 %r534,%r535,%r596; .loc 1 340 18 cvt.u32.u16 %r537,%r534; cvt.u16.u8 %r538,%r53745; .loc 1 354 19 mov.u64 %r184,%r148; bra $L46; $L49: .loc 1 333 21 st.u64 [%r218],%r525; .loc 1 336 29 mul.lo.u64 %r125,%r133,%r132; .loc 1 337 12 add.u64 %r183,%r183,1; .loc 1 338 14 add.u64 %r218,%r218,8; setp.eq.u64 %r541,%r195,%r183; @ ! %r541 bra $L48; bra $L47; $L45: .loc 1 345 19 @ %r597 bra $L63; .loc 1 347 356551,[%value_in]; } .loc 1 348 36206561,[%value_in]; } .loc 1 349 36206571,[%value_in]; } .loc 1 350 32 ld.u64 %r141,[%frame+480]; .loc 1 354 19 mov.u64 %r184,%r148; .loc 1 350 32 mov.u64 %r195,%r146; .loc 1 343 24 mov.u64 %r169,0; bra $L46; $L44: add.u64 %r218,%frame,728; .loc 1 337 12 mov.u64 %r183,1; $L48: .loc 1 359 24 ld.u64 %r574,[%r218]; add.u64 %r132,%r574,1; st.u64 [%r218],%r132; shl.b64 %r185,%r183,3; .loc 1 360 29 add.u64 %r576,%r589,%r185; ld.u64 %r133,[%r576]; .loc 1 360 19 sub.u64 %r577,%r133,%r125; shl.b64 %r578,%r577,4; add.u64 %r184,%r184,%r578; .loc 1 329 34 add.u64 %r580,%r586,%r185; ld.u64 %r136,[%r580]; .loc 1 329 13 setp.eq.u64 %r581,%r132,%r136; @ %r581 bra $L49; bra $L46; $L60: .loc 1 283 12 mov.u64 %r184,%r148; .loc 1 285 7 mov.u64 %r195,%r146; bra $L37; $L61: .loc 1 283 12 mov.u64 %r184,%r148; mov.u64 %r195,%r146; $L37: .loc 1 294 9 setp.eq.u64 %r582,%r82,0; @ %r582 bra $L1; .loc 1 299 12 shl.b64 %r109,%r182,4; ld.u64 %r158,[%frame+1080]; .loc 1 306 34 ld.u64 %r223,[%frame+960]; .loc 1 313 30 mul.lo.u64 %r139,%r223,%r182; .loc 1 329 34 ld.u64 %r588,[%frame+600]; .loc 1 302 13 ld.u64 %r243,[%frame+720]; add.u64 %r202,%r51,%r254; .loc 1 347 35 shl.b64 %r206,%r146,3; mov.u64 %r169,%r184; shl.b64 %r587,%r141,4; .loc 1 340 24 set.u32.ne.u64 %r594,%r261,0; neg.s32 %r595,%r594; cvt.u16.u32 %r596,%r595; .loc 1 345 19 setp.le.s64 %r597,%r146,0; .loc 1 347 35 add.u64 %r598,%frame,360; .loc 1 348 36 add.u64 %r599,%frame,240; .loc 1 349 36 add.u64 %r600,%frame,120; .loc 1 315 14 setp.eq.u64 %r601,%r162,1; bra $L50; $L63: .loc 1 354 19 mov.u64 %r184,%r148; .loc 1 345 19 mov.u64 %r195,%r146; .loc 1 343 24 mov.u64 %r169,0; $L46: .loc 1 294 9 setp.eq.u64 %r583,%r82,0; @ %r583 bra $L1; shl.b64 %r587,%r141,4; ld.u64 
%r588,[%frame+600]; bra $L43; $L62: .loc 1 310 21 mov.u64 %r158,0; $L43: .loc 1 302 13 ld.u64 %r243,[%frame+720]; bra $L50; $L39: .loc 1 329 13 setp.eq.u64 %r584,%r243,%r588; @ %r584 bra $L51; bra $L50; $L40: setp.eq.u64 %r585,%r243,%r588; @ %r585 bra $L64; $L1: .loc 1 364in_pack_i1.o/ortrani_internal_pack_1gfortrani_internal_pack_1../libgfortran/generated/in_pack_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarray_gfortrani_internal_pack_1gfortrani_internal_pack_1u32u64 %r95; .reg .u64 %r96; .reg .pred %r97; .reg .pred %r99mov.u64 %r71,%ar0; .loc 1 51 7 ld.s8 %r50,[%r71+28]; .loc 1 54 3 setp.gt.s64 %r72,%r50,0; @ %r72 bra $L2; $L4: .loc 1 73 18 ld.u64 %r70,[%r71]; bra $L1; $L2: add.u64 %r42,%frame,240; add.u64 %r40,%r71,40; mov.u64 %r32,%frame; add.u64 %r28,%frame,120; .loc 1 54 19 mov.u64 %r65,0; .loc 1 53 10 mov.u32 %r45,1; .loc 1 52 9 mov.u64 %r52,1; .loc 1 56 16 mov.u64 %r73,%r65; bra $L6; $L11: mov.u64 %r65,%r53; $L6: st.u64 [%r42],%r73; .loc 1 57 19 ld.u64 %r23,[%r40]; .loc 1 57 17 st.u64 [%r32],%r23; .loc 1 58 19 ld.u64 %r75,[%r40+16]; add.u64 %r74,%r75,1; ld.u64 %r76,[%r40+8]; sub.u64 %r27,%r74,%r76; .loc 1 58 17 st.u64 [%r28],%r27; .loc 1 59 10 setp.le.s64 %r77,%r27,0; @ %r77 bra $L4; .loc 1 66 10 setp.eq.u64 %r78,%r23,%r52; .loc 1 67 16 selp.u32 %r45,%r45,0,%r78; .loc 1 69 13 mul.lo.u64 %r52,%r52,%r27; .loc 1 54 36 add.u64 %r53,%r65,1; .loc 1 54 3 add.u64 %r42,%r42,8; add.u64 %r40,%r40,24; add.u64 %r32,%r32,8; add.u64 %r28,%r28,8; setp.ne.u64 %r79,%r50,%r53; @ %r79 bra $L11; .loc 1 72 6 setp.ne.u32 %r80,%r45,0; @ %r80 bra $L4; .loc 1 76 13 mov.u64 call (%value_in),_gfortrani_xmallocarray78 7 ld.u64 %r57,[%r71]; .loc 1 79 11 ld.u64 %r54,[%frame]; .loc 1 82 9 setp.eq.u64 %r85,%r57,0; @ %r85 bra $L1; ld.u64 %r44,[%frame+240]; .loc 1 91 32 ld.u64 %r59,[%frame+120]; .loc 1 98 28 mul.lo.u64 %r48,%r59,%r54; mov.u64 %r47,%r70; setp.eq.u64 %r99,%r50,1; add.u64 %r103,%frame,120; .loc 1 95 20 mov.u64 %r104,0; $L10: .loc 1 85 17 ld.u8 %r86,[%r57]; st.u8 [%r47],%r86; .loc 1 87 11 add.u64 %r57,%r57,%r54; .loc 1 88 15 add.u64 %r44,%r44,1; .loc 1 91 13 setp.ne.u64 %r87,%r44,%r59; @ %r87 bra $L7; .loc 1 100 14 @ %r99 bra $L1; add.u64 %r60,%frame,248; add.u64 %r67,%frame,8; .loc 1 98 28 mov.u64 %r35,%r48; .loc 1 99 12 mov.u64 %r66,%r82; bra $L8; $L9: .loc 1 95 20 st.u64 [%r60],%r104; .loc 1 98 28 mul.lo.u64 %r35,%r38,%r37; .loc 1 99 12 add.u64 %r56,%r66,1; .loc 1 100 14 add.u64 %r60,%r60,8; add.u64 %r67,%r67,8; setp.eq.u64 %r91,%r65,%r66; @ %r91 bra $L1; mov.u64 %r66,%r56; $L8: .loc 1 107 23 ld.u64 %r92,[%r60]; add.u64 %r37,%r92,1; st.u64 [%r60],%r37; .loc 1 108 28 ld.u64 %r38,[%r67]; .loc 1 108 19 sub.u64 %r93,%r38,%r35; add.u64 %r57,%r57,%r93; .loc 1 91 32 shl.b64 %r95,%r66,3; add.u64 %r96,%r103,%r95; ld.u64 %r41,[%r96]; .loc 1 91 13 setp.eq.u64 %r97,%r37,%r41; @ %r97 bra $L9; .loc 1 95 20 mov.u64 %r44,0; $L7: add.u64 %r47,%r47,1; bra $L10; $L1: .loc 1 113gfortrani_internal_pack_gfortrani_internal_pack_../libgfortran/generated/in_pack_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarray_gfortrani_internal_pack_gfortrani_internal_pack_u64 %r77; .reg .u64u16 %r89; .reg .pred %r90; .reg .predu64 %r107; .reg .u64 %r108; mov.u64 %r73,%ar0; .loc 1 51 7 ld.s8 %r50,[%r73+28]; .loc 1 54 3 setp.gt.s64 %r74,%r50,0; @ %r74 bra $L2; $L4: .loc 1 73 18 ld.u64 %r72,[%r73]; bra $L1; $L2: add.u64 %r44,%frame,240; add.u64 %r69,%r73,40; mov.u64 %r35,%frame; add.u64 %r34,%frame,120; .loc 1 54 19 mov.u64 %r65,0; .loc 1 53 10 mov.u32 %r46,1; .loc 1 52 9 
mov.u64 %r53,1; .loc 1 56 16 mov.u64 %r75,%r65; bra $L6; $L11: mov.u64 %r65,%r54; $L6: st.u64 [%r44],%r75; .loc 1 57 19 ld.u64 %r23,[%r69]; .loc 1 57 17 st.u64 [%r35],%r23; .loc 1 58 19 ld.u64 %r77,[%r69+16]; add.u64 %r76,%r77,1; ld.u64 %r78,[%r69+8]; sub.u64 %r27,%r76,%r78; .loc 1 58 17 st.u64 [%r34],%r27; .loc 1 59 10 setp.le.s64 %r79,%r27,0; @ %r79 bra $L4; .loc 1 66 10 setp.eq.u64 %r80,%r23,%r53; .loc 1 67 16 selp.u32 %r46,%r46,0,%r80; .loc 1 69 13 mul.lo.u64 %r53,%r53,%r27; .loc 1 54 36 add.u64 %r54,%r65,1; .loc 1 54 3 add.u64 %r44,%r44,8; add.u64 %r69,%r69,24; add.u64 %r35,%r35,8; add.u64 %r34,%r34,8; setp.ne.u64 %r81,%r50,%r54; @ %r81 bra $L11; .loc 1 72 6 setp.ne.u32 %r82,%r46,0; @ %r82 bra $L4; .loc 1 76 13 mov.u64 %r84,call (%value_in),_gfortrani_xmallocarray78 7 ld.u64 %r58,[%r73]; .loc 1 79 11 ld.u64 %r55,[%frame]; .loc 1 82 9 setp.eq.u64 %r87,%r58,0; @ %r87 bra $L1; .loc 1 87 11 add.u64 %r33,%r55,%r55; ld.u64 %r45,[%frame+240]; .loc 1 91 32 ld.u64 %r59,[%frame+120]; .loc 1 98 28 mul.lo.u64 %r47,%r59,%r55; mov.u64 %r56,%r72; setp.eq.u64 %r103,%r50,1; .loc 1 99 12 mov.u64 %r106,1; add.u64 %r107,%frame,120; .loc 1 95 20 mov.u64 %r108,0; $L10: .loc 1 85 17 ld.u16 %r89,[%r58]; st.u16 [%r56],%r89; .loc 1 87 11 add.u64 %r58,%r58,%r33; .loc 1 88 15 add.u64 %r45,%r45,1; .loc 1 91 13 setp.ne.u64 %r90,%r45,%r59; @ %r90 bra $L7; .loc 1 100 14 @ %r103 bra $L1; add.u64 %r60,%frame,248; add.u64 %r68,%frame,8; .loc 1 98 28 mov.u64 %r37,%r47; .loc 1 99 12 mov.u64 %r67,%r106; bra $L8; $L9: .loc 1 95 20 st.u64 [%r60],%r108; .loc 1 98 28 mul.lo.u64 %r37,%r40,%r39; .loc 1 99 12 add.u64 %r57,%r67,1; .loc 1 100 14 add.u64 %r60,%r60,8; add.u64 %r68,%r68,8; setp.eq.u64 %r94,%r65,%r67; @ %r94 bra $L1; mov.u64 %r67,%r57; $L8: .loc 1 107 23 ld.u64 %r95,[%r60]; add.u64 %r39,%r95,1; st.u64 [%r60],%r39; .loc 1 108 28 ld.u64 %r40,[%r68]; .loc 1 108 19 sub.u64 %r96,%r40,%r37; add.u64 %r97,%r96,%r96; add.u64 %r58,%r58,%r97; .loc 1 91 32 shl.b64 %r99,%r67,3; add.u64 %r100,%r107,%r99; ld.u64 %r43,[%r100]; .loc 1 91 13 setp.eq.u64 %r101,%r39,%r43; @ %r101 bra $L9; .loc 1 95 20 mov.u64 %r45,0; $L7: add.u64 %r56,%r56,2; bra $L10; $L1: .loc 1 113gfortrani_internal_pack_4gfortrani_internal_pack_4../libgfortran/generated/in_pack_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarray_gfortrani_internal_pack_4gfortrani_internal_pack_4u64 %r77; .reg .u64predpred %r99; .reg .predmov.u64 %r73,%ar0; .loc 1 51 7 ld.s8 %r50,[%r73+28]; .loc 1 54 3 setp.gt.s64 %r74,%r50,0; @ %r74 bra $L2; $L4: .loc 1 73 18 ld.u64 %r72,[%r73]; bra $L1; $L2: add.u64 %r44,%frame,240; add.u64 %r69,%r73,40; mov.u64 %r35,%frame; add.u64 %r34,%frame,120; .loc 1 54 19 mov.u64 %r65,0; .loc 1 53 10 mov.u32 %r46,1; .loc 1 52 9 mov.u64 %r53,1; .loc 1 56 16 mov.u64 %r75,%r65; bra $L6; $L11: mov.u64 %r65,%r54; $L6: st.u64 [%r44],%r75; .loc 1 57 19 ld.u64 %r23,[%r69]; .loc 1 57 17 st.u64 [%r35],%r23; .loc 1 58 19 ld.u64 %r77,[%r69+16]; add.u64 %r76,%r77,1; ld.u64 %r78,[%r69+8]; sub.u64 %r27,%r76,%r78; .loc 1 58 17 st.u64 [%r34],%r27; .loc 1 59 10 setp.le.s64 %r79,%r27,0; @ %r79 bra $L4; .loc 1 66 10 setp.eq.u64 %r80,%r23,%r53; .loc 1 67 16 selp.u32 %r46,%r46,0,%r80; .loc 1 69 13 mul.lo.u64 %r53,%r53,%r27; .loc 1 54 36 add.u64 %r54,%r65,1; .loc 1 54 3 add.u64 %r44,%r44,8; add.u64 %r69,%r69,24; add.u64 %r35,%r35,8; add.u64 %r34,%r34,8; setp.ne.u64 %r81,%r50,%r54; @ %r81 bra $L11; .loc 1 72 6 setp.ne.u32 %r82,%r46,0; @ %r82 bra $L4; .loc 1 76 13 mov.u64 %r8r84; call (%value_in),_gfortrani_xmallocarray78 7 ld.u64 %r58,[%r73]; .loc 1 79 
11 ld.u64 %r55,[%frame]; .loc 1 82 9 setp.eq.u64 %r87,%r58,0; @ %r87 bra $L1; .loc 1 87 11 shl.b64 %r33,%r55,2; ld.u64 %r45,[%frame+240]; .loc 1 91 32 ld.u64 %r59,[%frame+120]; .loc 1 98 28 mul.lo.u64 %r47,%r59,%r55; mov.u64 %r56,%r72; setp.eq.u64 %r101,%r50,1; .loc 1 99 12 mov.u64 %r104,1; add.u64 %r105,%frame,120; .loc 1 95 20 mov.u64 %r106,0; $L10: .loc 1 85 19 ld.u32 %r31,[%r58]; .loc 1 85 17 st.u32 [%r56],%r31; .loc 1 87 11 add.u64 %r58,%r58,%r33; .loc 1 88 15 add.u64 %r45,%r45,1; .loc 1 91 13 setp.ne.u64 %r88,%r45,%r59; @ %r88 bra $L7; .loc 1 100 14 @ %r101 bra $L1; add.u64 %r60,%frame,248; add.u64 %r68,%frame,8; .loc 1 98 28 mov.u64 %r37,%r47; .loc 1 99 12 mov.u64 %r67,%r104; bra $L8; $L9: .loc 1 95 20 st.u64 [%r60],%r106; .loc 1 98 28 mul.lo.u64 %r37,%r40,%r39; .loc 1 99 12 add.u64 %r57,%r67,1; .loc 1 100 14 add.u64 %r60,%r60,8; add.u64 %r68,%r68,8; setp.eq.u64 %r92,%r65,%r67; @ %r92 bra $L1; mov.u64 %r67,%r57; $L8: .loc 1 107 23 ld.u64 %r93,[%r60]; add.u64 %r39,%r93,1; st.u64 [%r60],%r39; .loc 1 108 28 ld.u64 %r40,[%r68]; .loc 1 108 19 sub.u64 %r94,%r40,%r37; shl.b64 %r95,%r94,2; add.u64 %r58,%r58,%r95; .loc 1 91 32 shl.b64 %r97,%r67,3; add.u64 %r98,%r105,%r97; ld.u64 %r43,[%r98]; .loc 1 91 13 setp.eq.u64 %r99,%r39,%r43; @ %r99 bra $L9; .loc 1 95 20 mov.u64 %r45,0; $L7: add.u64 %r56,%r56,4; bra $L10; $L1: .loc 1 113 in_pack_i8.o/gfortrani_internal_pack_8gfortrani_internal_pack_8../libgfortran/generated/in_pack_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarray_gfortrani_internal_pack_8gfortrani_internal_pack_8u64 %r77; .reg .u64predpred %r99; .reg .predmov.u64 %r73,%ar0; .loc 1 51 7 ld.s8 %r50,[%r73+28]; .loc 1 54 3 setp.gt.s64 %r74,%r50,0; @ %r74 bra $L2; $L4: .loc 1 73 18 ld.u64 %r72,[%r73]; bra $L1; $L2: add.u64 %r44,%frame,240; add.u64 %r69,%r73,40; mov.u64 %r35,%frame; add.u64 %r34,%frame,120; .loc 1 54 19 mov.u64 %r65,0; .loc 1 53 10 mov.u32 %r46,1; .loc 1 52 9 mov.u64 %r53,1; .loc 1 56 16 mov.u64 %r75,%r65; bra $L6; $L11: mov.u64 %r65,%r54; $L6: st.u64 [%r44],%r75; .loc 1 57 19 ld.u64 %r23,[%r69]; .loc 1 57 17 st.u64 [%r35],%r23; .loc 1 58 19 ld.u64 %r77,[%r69+16]; add.u64 %r76,%r77,1; ld.u64 %r78,[%r69+8]; sub.u64 %r27,%r76,%r78; .loc 1 58 17 st.u64 [%r34],%r27; .loc 1 59 10 setp.le.s64 %r79,%r27,0; @ %r79 bra $L4; .loc 1 66 10 setp.eq.u64 %r80,%r23,%r53; .loc 1 67 16 selp.u32 %r46,%r46,0,%r80; .loc 1 69 13 mul.lo.u64 %r53,%r53,%r27; .loc 1 54 36 add.u64 %r54,%r65,1; .loc 1 54 3 add.u64 %r44,%r44,8; add.u64 %r69,%r69,24; add.u64 %r35,%r35,8; add.u64 %r34,%r34,8; setp.ne.u64 %r81,%r50,%r54; @ %r81 bra $L11; .loc 1 72 6 setp.ne.u32 %r82,%r46,0; @ %r82 bra $L4; .loc 1 76 13 mov.u64 r84; call (%value_in),_gfortrani_xmallocarray78 7 ld.u64 %r58,[%r73]; .loc 1 79 11 ld.u64 %r55,[%frame]; .loc 1 82 9 setp.eq.u64 %r87,%r58,0; @ %r87 bra $L1; .loc 1 87 11 shl.b64 %r33,%r55,3; ld.u64 %r45,[%frame+240]; .loc 1 91 32 ld.u64 %r59,[%frame+120]; .loc 1 98 28 mul.lo.u64 %r47,%r59,%r55; mov.u64 %r56,%r72; setp.eq.u64 %r101,%r50,1; .loc 1 99 12 mov.u64 %r104,1; add.u64 %r105,%frame,120; .loc 1 95 20 mov.u64 %r106,0; $L10: .loc 1 85 19 ld.u64 %r31,[%r58]; .loc 1 85 17 st.u64 [%r56],%r31; .loc 1 87 11 add.u64 %r58,%r58,%r33; .loc 1 88 15 add.u64 %r45,%r45,1; .loc 1 91 13 setp.ne.u64 %r88,%r45,%r59; @ %r88 bra $L7; .loc 1 100 14 @ %r101 bra $L1; add.u64 %r60,%frame,248; add.u64 %r68,%frame,8; .loc 1 98 28 mov.u64 %r37,%r47; .loc 1 99 12 mov.u64 %r67,%r104; bra $L8; $L9: .loc 1 95 20 st.u64 [%r60],%r106; .loc 1 98 28 mul.lo.u64 %r37,%r40,%r39; .loc 1 99 12 add.u64 
%r57,%r67,1; .loc 1 100 14 add.u64 %r60,%r60,8; add.u64 %r68,%r68,8; setp.eq.u64 %r92,%r65,%r67; @ %r92 bra $L1; mov.u64 %r67,%r57; $L8: .loc 1 107 23 ld.u64 %r93,[%r60]; add.u64 %r39,%r93,1; st.u64 [%r60],%r39; .loc 1 108 28 ld.u64 %r40,[%r68]; .loc 1 108 19 sub.u64 %r94,%r40,%r37; shl.b64 %r95,%r94,3; add.u64 %r58,%r58,%r95; .loc 1 91 32 shl.b64 %r97,%r67,3; add.u64 %r98,%r105,%r97; ld.u64 %r43,[%r98]; .loc 1 91 13 setp.eq.u64 %r99,%r39,%r43; @ %r99 bra $L9; .loc 1 95 20 mov.u64 %r45,0; $L7: add.u64 %r56,%r56,8; bra $L10; $L1: .loc 1 113 in_pack_i16.o/_gfortrani_internal_pack_16gfortrani_internal_pack_16../libgfortran/generated/in_pack_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarray_gfortrani_internal_pack_16gfortrani_internal_pack_16u64 %r77; .reg .u64predpredmov.u64 %r73,%ar0; .loc 1 51 7 ld.s8 %r50,[%r73+28]; .loc 1 54 3 setp.gt.s64 %r74,%r50,0; @ %r74 bra $L2; $L4: .loc 1 73 18 ld.u64 %r72,[%r73]; bra $L1; $L2: add.u64 %r44,%frame,240; add.u64 %r69,%r73,40; mov.u64 %r35,%frame; add.u64 %r34,%frame,120; .loc 1 54 19 mov.u64 %r65,0; .loc 1 53 10 mov.u32 %r46,1; .loc 1 52 9 mov.u64 %r53,1; .loc 1 56 16 mov.u64 %r75,%r65; bra $L6; $L11: mov.u64 %r65,%r54; $L6: st.u64 [%r44],%r75; .loc 1 57 19 ld.u64 %r23,[%r69]; .loc 1 57 17 st.u64 [%r35],%r23; .loc 1 58 19 ld.u64 %r77,[%r69+16]; add.u64 %r76,%r77,1; ld.u64 %r78,[%r69+8]; sub.u64 %r27,%r76,%r78; .loc 1 58 17 st.u64 [%r34],%r27; .loc 1 59 10 setp.le.s64 %r79,%r27,0; @ %r79 bra $L4; .loc 1 66 10 setp.eq.u64 %r80,%r23,%r53; .loc 1 67 16 selp.u32 %r46,%r46,0,%r80; .loc 1 69 13 mul.lo.u64 %r53,%r53,%r27; .loc 1 54 36 add.u64 %r54,%r65,1; .loc 1 54 3 add.u64 %r44,%r44,8; add.u64 %r69,%r69,24; add.u64 %r35,%r35,8; add.u64 %r34,%r34,8; setp.ne.u64 %r81,%r50,%r54; @ %r81 bra $L11; .loc 1 72 6 setp.ne.u32 %r82,%r46,0; @ %r82 bra $L4; .loc 1 76 13r84; call (%value_in),_gfortrani_xmallocarray78 7 ld.u64 %r58,[%r73]; .loc 1 79 11 ld.u64 %r55,[%frame]; .loc 1 82 9 setp.eq.u64 %r87,%r58,0; @ %r87 bra $L1; .loc 1 87 11 shl.b64 %r33,%r55,4; ld.u64 %r45,[%frame+240]; .loc 1 91 32 ld.u64 %r59,[%frame+120]; .loc 1 98 28 mul.lo.u64 %r47,%r59,%r55; mov.u64 %r56,%r72; setp.eq.u64 %r105,%r50,1; .loc 1 99 12 mov.u64 %r108,1; add.u64 %r109,%frame,120; .loc 1 95 20 mov.u64 %r110,0; $L10: .loc 1 85 19 ld.u64 %r103,[%r58]; ld.u64 %r104,[%r58+8]; .loc 1 85 17 st.u64 [%r56],%r103; st.u64 [%r56+8],%r104; .loc 1 87 11 add.u64 %r58,%r58,%r33; .loc 1 88 15 add.u64 %r45,%r45,1; .loc 1 91 13 setp.ne.u64 %r90,%r45,%r59; @ %r90 bra $L7; .loc 1 100 14 @ %r105 bra $L1; add.u64 %r60,%frame,248; add.u64 %r68,%frame,8; .loc 1 98 28 mov.u64 %r37,%r47; .loc 1 99 12 mov.u64 %r67,%r108; bra $L8; $L9: .loc 1 95 20 st.u64 [%r60],%r110; .loc 1 98 28 mul.lo.u64 %r37,%r40,%r39; .loc 1 99 12 add.u64 %r57,%r67,1; .loc 1 100 14 add.u64 %r60,%r60,8; add.u64 %r68,%r68,8; setp.eq.u64 %r94,%r65,%r67; @ %r94 bra $L1; mov.u64 %r67,%r57; $L8: .loc 1 107 23 ld.u64 %r95,[%r60]; add.u64 %r39,%r95,1; st.u64 [%r60],%r39; .loc 1 108 28 ld.u64 %r40,[%r68]; .loc 1 108 19 sub.u64 %r96,%r40,%r37; shl.b64 %r97,%r96,4; add.u64 %r58,%r58,%r97; .loc 1 91 32 shl.b64 %r99,%r67,3; add.u64 %r100,%r109,%r99; ld.u64 %r43,[%r100]; .loc 1 91 13 setp.eq.u64 %r101,%r39,%r43; @ %r101 bra $L9; .loc 1 95 20 mov.u64 %r45,0; $L7: add.u64 %r56,%r56,16; bra $L10; $L1: .loc 1 113gfortrani_internal_pack_r4gfortrani_internal_pack_r4../libgfortran/generated/in_pack_r4.c" // BEGIN GLOBAL FUNCTION DECL: 
_gfortrani_xmallocarr_gfortrani_xmallocarray_gfortrani_internal_pack_r4gfortrani_internal_pack_r4u64 %r77; .reg .u64predpred %r99; .reg .predmov.u64 %r73,%ar0; .loc 1 51 7 ld.s8 %r50,[%r73+28]; .loc 1 54 3 setp.gt.s64 %r74,%r50,0; @ %r74 bra $L2; $L4: .loc 1 73 18 ld.u64 %r72,[%r73]; bra $L1; $L2: add.u64 %r44,%frame,240; add.u64 %r69,%r73,40; mov.u64 %r35,%frame; add.u64 %r34,%frame,120; .loc 1 54 19 mov.u64 %r65,0; .loc 1 53 10 mov.u32 %r46,1; .loc 1 52 9 mov.u64 %r53,1; .loc 1 56 16 mov.u64 %r75,%r65; bra $L6; $L11: mov.u64 %r65,%r54; $L6: st.u64 [%r44],%r75; .loc 1 57 19 ld.u64 %r23,[%r69]; .loc 1 57 17 st.u64 [%r35],%r23; .loc 1 58 19 ld.u64 %r77,[%r69+16]; add.u64 %r76,%r77,1; ld.u64 %r78,[%r69+8]; sub.u64 %r27,%r76,%r78; .loc 1 58 17 st.u64 [%r34],%r27; .loc 1 59 10 setp.le.s64 %r79,%r27,0; @ %r79 bra $L4; .loc 1 66 10 setp.eq.u64 %r80,%r23,%r53; .loc 1 67 16 selp.u32 %r46,%r46,0,%r80; .loc 1 69 13 mul.lo.u64 %r53,%r53,%r27; .loc 1 54 36 add.u64 %r54,%r65,1; .loc 1 54 3 add.u64 %r44,%r44,8; add.u64 %r69,%r69,24; add.u64 %r35,%r35,8; add.u64 %r34,%r34,8; setp.ne.u64 %r81,%r50,%r54; @ %r81 bra $L11; .loc 1 72 6 setp.ne.u32 %r82,%r46,0; @ %r82 bra $L4; .loc 1 76 13 mov.u64 %r8r84; call (%value_in),_gfortrani_xmallocarray78 7 ld.u64 %r58,[%r73]; .loc 1 79 11 ld.u64 %r55,[%frame]; .loc 1 82 9 setp.eq.u64 %r87,%r58,0; @ %r87 bra $L1; .loc 1 87 11 shl.b64 %r33,%r55,2; ld.u64 %r45,[%frame+240]; .loc 1 91 32 ld.u64 %r59,[%frame+120]; .loc 1 98 28 mul.lo.u64 %r47,%r59,%r55; mov.u64 %r56,%r72; setp.eq.u64 %r101,%r50,1; .loc 1 99 12 mov.u64 %r104,1; add.u64 %r105,%frame,120; .loc 1 95 20 mov.u64 %r106,0; $L10: .loc 1 85 19 ld.f32 %r31,[%r58]; .loc 1 85 17 st.f32 [%r56],%r31; .loc 1 87 11 add.u64 %r58,%r58,%r33; .loc 1 88 15 add.u64 %r45,%r45,1; .loc 1 91 13 setp.ne.u64 %r88,%r45,%r59; @ %r88 bra $L7; .loc 1 100 14 @ %r101 bra $L1; add.u64 %r60,%frame,248; add.u64 %r68,%frame,8; .loc 1 98 28 mov.u64 %r37,%r47; .loc 1 99 12 mov.u64 %r67,%r104; bra $L8; $L9: .loc 1 95 20 st.u64 [%r60],%r106; .loc 1 98 28 mul.lo.u64 %r37,%r40,%r39; .loc 1 99 12 add.u64 %r57,%r67,1; .loc 1 100 14 add.u64 %r60,%r60,8; add.u64 %r68,%r68,8; setp.eq.u64 %r92,%r65,%r67; @ %r92 bra $L1; mov.u64 %r67,%r57; $L8: .loc 1 107 23 ld.u64 %r93,[%r60]; add.u64 %r39,%r93,1; st.u64 [%r60],%r39; .loc 1 108 28 ld.u64 %r40,[%r68]; .loc 1 108 19 sub.u64 %r94,%r40,%r37; shl.b64 %r95,%r94,2; add.u64 %r58,%r58,%r95; .loc 1 91 32 shl.b64 %r97,%r67,3; add.u64 %r98,%r105,%r97; ld.u64 %r43,[%r98]; .loc 1 91 13 setp.eq.u64 %r99,%r39,%r43; @ %r99 bra $L9; .loc 1 95 20 mov.u64 %r45,0; $L7: add.u64 %r56,%r56,4; bra $L10; $L1: .loc 1 113 in_pack_r8.o/gfortrani_internal_pack_r8gfortrani_internal_pack_r8../libgfortran/generated/in_pack_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarray_gfortrani_internal_pack_r8gfortrani_internal_pack_r8u64 %r77; .reg .u64predpred %r99; .reg .predmov.u64 %r73,%ar0; .loc 1 51 7 ld.s8 %r50,[%r73+28]; .loc 1 54 3 setp.gt.s64 %r74,%r50,0; @ %r74 bra $L2; $L4: .loc 1 73 18 ld.u64 %r72,[%r73]; bra $L1; $L2: add.u64 %r44,%frame,240; add.u64 %r69,%r73,40; mov.u64 %r35,%frame; add.u64 %r34,%frame,120; .loc 1 54 19 mov.u64 %r65,0; .loc 1 53 10 mov.u32 %r46,1; .loc 1 52 9 mov.u64 %r53,1; .loc 1 56 16 mov.u64 %r75,%r65; bra $L6; $L11: mov.u64 %r65,%r54; $L6: st.u64 [%r44],%r75; .loc 1 57 19 ld.u64 %r23,[%r69]; .loc 1 57 17 st.u64 [%r35],%r23; .loc 1 58 19 ld.u64 %r77,[%r69+16]; add.u64 %r76,%r77,1; ld.u64 %r78,[%r69+8]; sub.u64 %r27,%r76,%r78; .loc 1 58 17 st.u64 [%r34],%r27; .loc 1 59 10 
setp.le.s64 %r79,%r27,0; @ %r79 bra $L4; .loc 1 66 10 setp.eq.u64 %r80,%r23,%r53; .loc 1 67 16 selp.u32 %r46,%r46,0,%r80; .loc 1 69 13 mul.lo.u64 %r53,%r53,%r27; .loc 1 54 36 add.u64 %r54,%r65,1; .loc 1 54 3 add.u64 %r44,%r44,8; add.u64 %r69,%r69,24; add.u64 %r35,%r35,8; add.u64 %r34,%r34,8; setp.ne.u64 %r81,%r50,%r54; @ %r81 bra $L11; .loc 1 72 6 setp.ne.u32 %r82,%r46,0; @ %r82 bra $L4; .loc 1 76 13 mov.u64 r84; call (%value_in),_gfortrani_xmallocarray78 7 ld.u64 %r58,[%r73]; .loc 1 79 11 ld.u64 %r55,[%frame]; .loc 1 82 9 setp.eq.u64 %r87,%r58,0; @ %r87 bra $L1; .loc 1 87 11 shl.b64 %r33,%r55,3; ld.u64 %r45,[%frame+240]; .loc 1 91 32 ld.u64 %r59,[%frame+120]; .loc 1 98 28 mul.lo.u64 %r47,%r59,%r55; mov.u64 %r56,%r72; setp.eq.u64 %r101,%r50,1; .loc 1 99 12 mov.u64 %r104,1; add.u64 %r105,%frame,120; .loc 1 95 20 mov.u64 %r106,0; $L10: .loc 1 85 19 ld.f64 %r31,[%r58]; .loc 1 85 17 st.f64 [%r56],%r31; .loc 1 87 11 add.u64 %r58,%r58,%r33; .loc 1 88 15 add.u64 %r45,%r45,1; .loc 1 91 13 setp.ne.u64 %r88,%r45,%r59; @ %r88 bra $L7; .loc 1 100 14 @ %r101 bra $L1; add.u64 %r60,%frame,248; add.u64 %r68,%frame,8; .loc 1 98 28 mov.u64 %r37,%r47; .loc 1 99 12 mov.u64 %r67,%r104; bra $L8; $L9: .loc 1 95 20 st.u64 [%r60],%r106; .loc 1 98 28 mul.lo.u64 %r37,%r40,%r39; .loc 1 99 12 add.u64 %r57,%r67,1; .loc 1 100 14 add.u64 %r60,%r60,8; add.u64 %r68,%r68,8; setp.eq.u64 %r92,%r65,%r67; @ %r92 bra $L1; mov.u64 %r67,%r57; $L8: .loc 1 107 23 ld.u64 %r93,[%r60]; add.u64 %r39,%r93,1; st.u64 [%r60],%r39; .loc 1 108 28 ld.u64 %r40,[%r68]; .loc 1 108 19 sub.u64 %r94,%r40,%r37; shl.b64 %r95,%r94,3; add.u64 %r58,%r58,%r95; .loc 1 91 32 shl.b64 %r97,%r67,3; add.u64 %r98,%r105,%r97; ld.u64 %r43,[%r98]; .loc 1 91 13 setp.eq.u64 %r99,%r39,%r43; @ %r99 bra $L9; .loc 1 95 20 mov.u64 %r45,0; $L7: add.u64 %r56,%r56,8; bra $L10; $L1: .loc 1 113 in_pack_r10in_pack_c4.o/ 1622802240gfortrani_internal_pack_c4gfortrani_internal_pack_c4../libgfortran/generated/in_pack_c4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarray_gfortrani_internal_pack_c4gfortrani_internal_pack_c474; .reg .predu64 %r85; .reg .u64 %r86; .reg .pred %r88; .reg .pred %r89; .reg .predmov.u64 %r74,%ar0; .loc 1 51 7 ld.s8 %r51,[%r74+28]; .loc 1 54 3 setp.gt.s64 %r75,%r51,0; @ %r75 bra $L2; $L4: .loc 1 73 18 ld.u64 %r73,[%r74]; bra $L1; $L2: add.u64 %r45,%frame,240; add.u64 %r70,%r74,40; mov.u64 %r36,%frame; add.u64 %r35,%frame,120; .loc 1 54 19 mov.u64 %r66,0; .loc 1 53 10 mov.u32 %r47,1; .loc 1 52 9 mov.u64 %r54,1; .loc 1 56 16 mov.u64 %r76,%r66; bra $L6; $L11: mov.u64 %r66,%r55; $L6: st.u64 [%r45],%r76; .loc 1 57 19 ld.u64 %r23,[%r70]; .loc 1 57 17 st.u64 [%r36],%r23; .loc 1 58 19 ld.u64 %r78,[%r70+16]; add.u64 %r77,%r78,1; ld.u64 %r79,[%r70+8]; sub.u64 %r27,%r77,%r79; .loc 1 58 17 st.u64 [%r35],%r27; .loc 1 59 10 setp.le.s64 %r80,%r27,0; @ %r80 bra $L4; .loc 1 66 10 setp.eq.u64 %r81,%r23,%r54; .loc 1 67 16 selp.u32 %r47,%r47,0,%r81; .loc 1 69 13 mul.lo.u64 %r54,%r54,%r27; .loc 1 54 36 add.u64 %r55,%r66,1; .loc 1 54 3 add.u64 %r45,%r45,8; add.u64 %r70,%r70,24; add.u64 %r36,%r36,8; add.u64 %r35,%r35,8; setp.ne.u64 %r82,%r51,%r55; @ %r82 bra $L11; .loc 1 72 6 setp.ne.u32 %r83,%r47,0; @ %r83 bra $L4; .loc 1 76 13 mov.u64 _gfortrani_xmallocarray78 7 ld.u64 %r59,[%r74]; .loc 1 79 11 ld.u64 %r56,[%frame]; .loc 1 82 9 setp.eq.u64 %r88,%r59,0; @ %r88 bra $L1; .loc 1 87 11 shl.b64 %r34,%r56,3; ld.u64 %r46,[%frame+240]; .loc 1 91 32 ld.u64 %r60,[%frame+120]; .loc 1 98 28 mul.lo.u64 %r48,%r60,%r56; mov.u64 %r57,%r73; setp.eq.u64 %r102,%r51,1; 
.loc 1 99 12 mov.u64 %r105,1; add.u64 %r106,%frame,120; .loc 1 95 20 mov.u64 %r107,0; $L10: .loc 1 85 19 ld.f32 %r31,[%r59]; ld.f32 %r32,[%r59+4]; .loc 1 85 17 st.f32 [%r57],%r31; st.f32 [%r57+4],%r32; .loc 1 87 11 add.u64 %r59,%r59,%r34; .loc 1 88 15 add.u64 %r46,%r46,1; .loc 1 91 13 setp.ne.u64 %r89,%r46,%r60; @ %r89 bra $L7; .loc 1 100 14 @ %r102 bra $L1; add.u64 %r61,%frame,248; add.u64 %r69,%frame,8; .loc 1 98 28 mov.u64 %r38,%r48; .loc 1 99 12 mov.u64 %r68,%r105; bra $L8; $L9: .loc 1 95 20 st.u64 [%r61],%r107; .loc 1 98 28 mul.lo.u64 %r38,%r41,%r40; .loc 1 99 12 add.u64 %r58,%r68,1; .loc 1 100 14 add.u64 %r61,%r61,8; add.u64 %r69,%r69,8; setp.eq.u64 %r93,%r66,%r68; @ %r93 bra $L1; mov.u64 %r68,%r58; $L8: .loc 1 107 23 ld.u64 %r94,[%r61]; add.u64 %r40,%r94,1; st.u64 [%r61],%r40; .loc 1 108 28 ld.u64 %r41,[%r69]; .loc 1 108 19 sub.u64 %r95,%r41,%r38; shl.b64 %r96,%r95,3; add.u64 %r59,%r59,%r96; .loc 1 91 32 shl.b64 %r98,%r68,3; add.u64 %r99,%r106,%r98; ld.u64 %r44,[%r99]; .loc 1 91 13 setp.eq.u64 %r100,%r40,%r44; @ %r100 bra $L9; .loc 1 95 20 mov.u64 %r46,0; $L7: add.u64 %r57,%r57,8; bra $L10; $L1: .loc 1 113gfortrani_internal_pack_c8gfortrani_internal_pack_c8../libgfortran/generated/in_pack_c8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarray_gfortrani_internal_pack_c8gfortrani_internal_pack_c874; .reg .predu64 %r85; .reg .u64 %r86; .reg .pred %r88; .reg .pred %r89; .reg .predmov.u64 %r74,%ar0; .loc 1 51 7 ld.s8 %r51,[%r74+28]; .loc 1 54 3 setp.gt.s64 %r75,%r51,0; @ %r75 bra $L2; $L4: .loc 1 73 18 ld.u64 %r73,[%r74]; bra $L1; $L2: add.u64 %r45,%frame,240; add.u64 %r70,%r74,40; mov.u64 %r36,%frame; add.u64 %r35,%frame,120; .loc 1 54 19 mov.u64 %r66,0; .loc 1 53 10 mov.u32 %r47,1; .loc 1 52 9 mov.u64 %r54,1; .loc 1 56 16 mov.u64 %r76,%r66; bra $L6; $L11: mov.u64 %r66,%r55; $L6: st.u64 [%r45],%r76; .loc 1 57 19 ld.u64 %r23,[%r70]; .loc 1 57 17 st.u64 [%r36],%r23; .loc 1 58 19 ld.u64 %r78,[%r70+16]; add.u64 %r77,%r78,1; ld.u64 %r79,[%r70+8]; sub.u64 %r27,%r77,%r79; .loc 1 58 17 st.u64 [%r35],%r27; .loc 1 59 10 setp.le.s64 %r80,%r27,0; @ %r80 bra $L4; .loc 1 66 10 setp.eq.u64 %r81,%r23,%r54; .loc 1 67 16 selp.u32 %r47,%r47,0,%r81; .loc 1 69 13 mul.lo.u64 %r54,%r54,%r27; .loc 1 54 36 add.u64 %r55,%r66,1; .loc 1 54 3 add.u64 %r45,%r45,8; add.u64 %r70,%r70,24; add.u64 %r36,%r36,8; add.u64 %r35,%r35,8; setp.ne.u64 %r82,%r51,%r55; @ %r82 bra $L11; .loc 1 72 6 setp.ne.u32 %r83,%r47,0; @ %r83 bra $L4; .loc 1 76 13_gfortrani_xmallocarray78 7 ld.u64 %r59,[%r74]; .loc 1 79 11 ld.u64 %r56,[%frame]; .loc 1 82 9 setp.eq.u64 %r88,%r59,0; @ %r88 bra $L1; .loc 1 87 11 shl.b64 %r34,%r56,4; ld.u64 %r46,[%frame+240]; .loc 1 91 32 ld.u64 %r60,[%frame+120]; .loc 1 98 28 mul.lo.u64 %r48,%r60,%r56; mov.u64 %r57,%r73; setp.eq.u64 %r102,%r51,1; .loc 1 99 12 mov.u64 %r105,1; add.u64 %r106,%frame,120; .loc 1 95 20 mov.u64 %r107,0; $L10: .loc 1 85 19 ld.f64 %r31,[%r59]; ld.f64 %r32,[%r59+8]; .loc 1 85 17 st.f64 [%r57],%r31; st.f64 [%r57+8],%r32; .loc 1 87 11 add.u64 %r59,%r59,%r34; .loc 1 88 15 add.u64 %r46,%r46,1; .loc 1 91 13 setp.ne.u64 %r89,%r46,%r60; @ %r89 bra $L7; .loc 1 100 14 @ %r102 bra $L1; add.u64 %r61,%frame,248; add.u64 %r69,%frame,8; .loc 1 98 28 mov.u64 %r38,%r48; .loc 1 99 12 mov.u64 %r68,%r105; bra $L8; $L9: .loc 1 95 20 st.u64 [%r61],%r107; .loc 1 98 28 mul.lo.u64 %r38,%r41,%r40; .loc 1 99 12 add.u64 %r58,%r68,1; .loc 1 100 14 add.u64 %r61,%r61,8; add.u64 %r69,%r69,8; setp.eq.u64 %r93,%r66,%r68; @ %r93 bra $L1; mov.u64 %r68,%r58; $L8: .loc 1 107 23 ld.u64 %r94,[%r61]; 
add.u64 %r40,%r94,1; st.u64 [%r61],%r40; .loc 1 108 28 ld.u64 %r41,[%r69]; .loc 1 108 19 sub.u64 %r95,%r41,%r38; shl.b64 %r96,%r95,4; add.u64 %r59,%r59,%r96; .loc 1 91 32 shl.b64 %r98,%r68,3; add.u64 %r99,%r106,%r98; ld.u64 %r44,[%r99]; .loc 1 91 13 setp.eq.u64 %r100,%r40,%r44; @ %r100 bra $L9; .loc 1 95 20 mov.u64 %r46,0; $L7: add.u64 %r57,%r57,16; bra $L10; $L1: .loc 1 113in_pack_c10in_unpack_i1.o/ 1622802240gfortrani_internal_unpack_1 .visible .func _gfortrani_internal_unpack_1fortran/generated/in_unpack_i1.c"ortrani_internal_unpack_1 .visible .func _gfortrani_internal_unpack_18pred.reg .u64mov.u64 %r69,%ar0; mov.u64 %r70,%ar1; .loc 1 43 8 ld.u64 %r55,[%r69]; .loc 1 44 11 set.u32.eq.u64 %r72,%r55,%r70; neg.s32 %r73,%r72; .loc 1 44 22 set.u32.eq.u64 %r75,%r70,0; neg.s32 %r76,%r75; .loc 1 44 19 cvt.u16.u32 %r78,%r73; cvt.u16.u32 %r79,%r76; or.b16 %r77,%r78,%r79; .loc 1 44 61; .loc 1 47 7 ld.s8 %r50,[%r69+28]; .loc 1 49 3 setp.le.s64 %r83,%r50,0; @ %r83 bra $L14; add.u64 %r48,%r69,40; mov.u64 %r58,0; .loc 1 49 19 mov.u64 %r62,%r58; .loc 1 48 9 mov.u64 %r43,1; add.u64 %r118,%frame,240; add.u64 %r116,%frame,120; .loc 1 51 16 mov.u64 %r86,%r58; bra $L5; $L16: mov.u64 %r62,%r56; $L5: add.u64 %r85,%r118,%r58; st.u64 [%r85],%r86; .loc 1 52 19 ld.u64 %r26,[%r48]; .loc 1 52 17 add.u64 %r87,%frame,%r58; st.u64 [%r87],%r26; .loc 1 53 19 ld.u64 %r89,[%r48+16]; add.u64 %r88,%r89,1; ld.u64 %r90,[%r48+8]; sub.u64 %r30,%r88,%r90; .loc 1 53 17 add.u64 %r92,%r116,%r58; st.u64 [%r92],%r30; .loc 1 54 10 setp.le.s64 %r93,%r30,0; @ %r93 bra $L1; .loc 1 57 10 setp.ne.u64 %r94,%r26,%r43; @ %r94 bra $L15; .loc 1 58 8 mul.lo.u64 %r43,%r43,%r30; bra $L4; $L15: .loc 1 60 8 mov.u64 %r43,0; $L4: .loc 1 49 36 add.u64 %r56,%r62,1; .loc 1 49 3 add.u64 %r48,%r48,24; add.u64 %r58,%r58,8; setp.ne.u64 %r95,%r50,%r56; @ %r95 bra $L16; .loc 1 63 6 setp.eq.u64 %r96,%r43,0; @ %r96 bra $L6; .loc 1 65 7 mov.u64 %r68,%r43; bra $L3; $L14: .loc 1 49 3 mov.u64 %r68,1; $L3: .loc 1 65 7103,[%value_in]; } .loc 1 66 7 bra $L1; $L6: .loc 1 69 11 ld.u64 %r52,[%frame]; .loc 1 71 9 setp.eq.u64 %r105,%r55,0; @ %r105 bra $L1; ld.u64 %r47,[%frame+240]; .loc 1 80 32 ld.u64 %r59,[%frame+120]; .loc 1 87 29 mul.lo.u64 %r44,%r59,%r52; mov.u64 %r45,%r70; setp.eq.u64 %r117,%r50,1; .loc 1 88 12 mov.u64 %r121,1; .loc 1 84 20 mov.u64 %r122,%r43; $L13: .loc 1 74 13 ld.u8 %r106,[%r45]; st.u8 [%r55],%r106; .loc 1 76 12 add.u64 %r55,%r55,%r52; .loc 1 77 15 add.u64 %r47,%r47,1; .loc 1 80 13 setp.ne.u64 %r107,%r47,%r59; @ %r107 bra $L8; .loc 1 89 14 @ ! 
[libgfortran static archive payload for nvptx (ar members holding PTX assembly; member timestamps 1622802240-1622802241). The assembly text in this region is truncated and interleaved with the ar member headers, so only the recoverable member names and entry symbols are kept; the generated C sources sit under libgfortran/generated/.]

  (tail of the preceding in_unpack member)
  in_unpack_i2.o    _gfortrani_internal_unpack_2    (in_unpack_i2.c; ld.u16/st.u16 element copies)
  in_unpack_i4.o    _gfortrani_internal_unpack_4    (in_unpack_i4.c; ld.u32/st.u32)
  in_unpack_i8.o    _gfortrani_internal_unpack_8    (in_unpack_i8.c; ld.u64/st.u64)
  in_unpack_i16.o   _gfortrani_internal_unpack_16   (in_unpack_i16.c; paired 64-bit copies)
  in_unpack_r4.o    _gfortrani_internal_unpack_r4   (in_unpack_r4.c; ld.f32/st.f32)
  in_unpack_r8.o    _gfortrani_internal_unpack_r8   (in_unpack_r8.c; ld.f64/st.f64)
  in_unpack_r10.o   (member header only, no PTX body in this region)
  in_unpack_c4.o    _gfortrani_internal_unpack_c4   (in_unpack_c4.c; paired f32 copies)
  in_unpack_c8.o    _gfortrani_internal_unpack_c8   (in_unpack_c8.c; paired f64 copies, body continues below)
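The loop skeleton still visible in these members is a scatter of a contiguous temporary back into a possibly strided destination array; only the per-element load/store width changes between kinds. Below is a minimal C sketch of that copy pattern. The desc_t layout and the name unpack_sketch are simplified stand-ins, not libgfortran's real array descriptor or API, and the sketch assumes rank >= 1 with every extent positive.

/* Sketch of the internal_unpack copy pattern: walk a multi-dimensional
   destination described by strides/extents and copy one contiguous
   source element into each position. */
#include <stddef.h>
#include <string.h>

typedef struct {
    char      *base;        /* destination data pointer            */
    int        rank;        /* number of dimensions (1..7)         */
    ptrdiff_t  elem_size;   /* bytes per element (2, 4, 8, 16, ...) */
    ptrdiff_t  stride[7];   /* per-dimension stride, in elements    */
    ptrdiff_t  extent[7];   /* per-dimension extent                 */
} desc_t;                   /* hypothetical, simplified descriptor  */

static void unpack_sketch(const desc_t *d, const char *src)
{
    ptrdiff_t count[7] = {0};
    char *dst = d->base;

    for (;;) {
        /* one element: the PTX specializes this memcpy into
           ld.u16/st.u16, ld.u32/st.u32, paired 64-bit moves, ... */
        memcpy(dst, src, (size_t) d->elem_size);
        src += d->elem_size;

        /* advance the multi-dimensional destination index */
        int dim = 0;
        dst += d->stride[0] * d->elem_size;
        count[0]++;
        while (count[dim] == d->extent[dim]) {
            if (dim == d->rank - 1)
                return;                       /* all elements copied */
            /* rewind the finished dimension, step the next one */
            dst -= d->extent[dim] * d->stride[dim] * d->elem_size;
            count[dim] = 0;
            dim++;
            count[dim]++;
            dst += d->stride[dim] * d->elem_size;
        }
    }
}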
[Continuation of the archive listing: base**exponent support routines for INTEGER(4) and INTEGER(8) exponents; sources under libgfortran/generated/pow_*.c.]

  (tail of _gfortrani_internal_unpack_c8)
  in_unpack_c10.o   (member header only)
  pow_i4_i4.o       _gfortran_pow_i4_i4
  pow_i8_i4.o       _gfortran_pow_i8_i4
  pow_i16_i4.o      _gfortran_pow_i16_i4
  pow_r16_i4.o      (member header only)
  pow_c4_i4.o       _gfortran_pow_c4_i4
  pow_c8_i4.o       _gfortran_pow_c8_i4
  pow_c10_i4.o      (member header only)
  pow_i4_i8.o       _gfortran_pow_i4_i8
  pow_i8_i8.o       _gfortran_pow_i8_i8
  pow_i16_i8.o      _gfortran_pow_i16_i8
  pow_r4_i8.o       _gfortran_pow_r4_i8
  pow_r8_i8.o       _gfortran_pow_r8_i8
  pow_r10_i8.o      (member header only)
  pow_c4_i8.o       _gfortran_pow_c4_i8
  pow_c8_i8.o       _gfortran_pow_c8_i8
  pow_c10_i8.o      (member header only)
  pow_i4_i16.o      _gfortran_pow_i4_i16   (body continues below)
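All of these pow members share the same loop in the PTX: test the low bit of the exponent, conditionally multiply the accumulator, shift the exponent right, square the base. That is exponentiation by squaring. The sketch below shows the INTEGER(4) case in plain C; the name pow_i4_i4_sketch is illustrative only, and the negative-exponent handling follows the special cases visible above (result 0 unless the base is 1 or -1).

/* Exponentiation by squaring, as in the _gfortran_pow_* loop above. */
#include <stdint.h>

int32_t pow_i4_i4_sketch(int32_t base, int32_t exp)
{
    if (exp == 0)
        return 1;
    if (exp < 0) {
        if (base == 1)
            return 1;
        if (base == -1)
            return (exp & 1) ? -1 : 1;   /* (-1)**n alternates          */
        return 0;                        /* |base| > 1: truncates to 0  */
    }
    uint32_t n = (uint32_t) exp;
    int32_t pow = 1;
    while (n != 0) {
        if (n & 1)                       /* odd bit: multiply into result */
            pow *= base;
        n >>= 1;
        if (n == 0)
            break;
        base *= base;                    /* square the base               */
    }
    return pow;
}

For exp = 10, for example, the loop performs three squarings and two multiplies instead of the nine multiplies of a naive product.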
[Continuation: pow members for INTEGER(16) exponents, same loop structure with the 128-bit exponent carried in two 64-bit registers.]

  (tail of _gfortran_pow_i4_i16)
  pow_i8_i16.o      _gfortran_pow_i8_i16
  pow_i16_i16.o     _gfortran_pow_i16_i16
  pow_r4_i16.o      _gfortran_pow_r4_i16
  pow_r8_i16.o      _gfortran_pow_r8_i16
  pow_c4_i16.o      _gfortran_pow_c4_i16
  pow_c8_i16.o      _gfortran_pow_c8_i16   (body continues below)
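In the complex variants a negative exponent is handled by first taking the reciprocal of the base, and the abs/compare/div/fma sequence above matches the usual scaled complex reciprocal that picks its formula from whichever component is larger in magnitude. A small C sketch under that assumption follows; cfloat_t and crecip_sketch are made-up names, not libgfortran symbols.

/* Scaled reciprocal of a complex number, avoiding overflow from
   squaring both components directly. */
#include <math.h>

typedef struct { float re, im; } cfloat_t;   /* stand-in complex type */

cfloat_t crecip_sketch(cfloat_t z)
{
    cfloat_t r;
    if (fabsf(z.re) >= fabsf(z.im)) {
        float t = z.im / z.re;               /* |t| <= 1                  */
        float d = z.re + z.im * t;           /* = (a*a + b*b) / a         */
        r.re =  1.0f / d;
        r.im = -t / d;
    } else {
        float t = z.re / z.im;
        float d = z.im + z.re * t;           /* = (a*a + b*b) / b         */
        r.re =  t / d;
        r.im = -1.0f / d;
    }
    return r;
}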
[Continuation: end of the pow members, then the PACK intrinsic support routines.]

  (tail of _gfortran_pow_c8_i16)
  pow_c10_i16       (member name only visible)
  pack_i1.o         _gfortrani_pack_i1      (../../../../libgfortran/generated/pack_i1.c)
  pack_i2.o         _gfortrani_pack_i2      (../../../../libgfortran/generated/pack_i2.c)
  pack_i4.o         _gfortrani_pack_i4      (../../../../libgfortran/generated/pack_i4.c; body continues past this excerpt)

  The pack_* members reference the external symbols _gfortrani_compile_options,
  _gfortran_runtime_error, _gfortrani_count_0 and _gfortrani_xmallocarray, and they
  embed runtime-error message text whose recoverable fragments read "...ny sized
  logical arr..." and "...correct extent in return value of PACK intrinsic; is %ld,
  should be %...".
.loc 1 99 8 ld.u64 %r103,[%r133]; .loc 1 104 15 ld.u64 %r23,[%r133+16]; .loc 1 104 13 cvt.u32.u64 %r92,%r23; .loc 1 106 53 add.u32 %r136,%r92,-4; and.b32 %r137,%r136,-5; set.u32.eq.u32 %r139,%r137,0; neg.s32 %r140,%r139; .loc 1 106 22 add.u32 %r141,%r92,-1; set.u32.le.u32 %r143,%r141,1; neg.s32 %r144,%r143; .loc 1 106 6 cvt.u16.u32 %r146,%r140; cvt.u16.u32 %r147,%r144; or.b16 %r145,%r146,%r147; cvt.u32.u16 %r148,%r145; cvt.u16.u8 %r149,%r148; setp.eq.u16 %r150,%r149,0; @ %r150 bra $L2; $L5: .loc 1 120 3 setp.gt.s64 %r151,%r91,0; @ %r151 bra $L3; mov.u32 %r87,0; bra $L4; $L2: .loc 1 108 7 setp.eq.u32 %r152,%r92,16; @ %r152 bra $L5; .loc 1 117 5 cvta.const.u64 %r11_gfortran_runtime_error122 16 cvt.u32.u32 %r156,%r22; cvt.s64.s8 %r155,%r156; shl.b64 %r114,%r155,3; add.u64 %r157,%frame,120; mov.u32 %r16111163,[%value_in]; } mov.u64 %r123,40; mov.u64 %r124,0; cvt.u32.u64 %r87,%r124; add.u64 %r268,%frame,360; add.u64 %r267,%frame,240; $L7: add.u64 %r117,%r132,%r123; .loc 1 123 19 ld.u64 %r166,[%r117+16]; add.u64 %r165,%r166,1; ld.u64 %r167,[%r117+8]; sub.u64 %r32,%r165,%r167; .loc 1 123 17 add.u64 %r168,%frame,%r124; st.u64 [%r168],%r32; .loc 1 124 10 setp.gt.s64 %r169,%r32,0; .loc 1 125 19 selp.u32 %r87,%r87,1,%r169; .loc 1 126 18 add.u64 %r171,%r268,%r124; ld.u64 %r172,[%r117]; st.u64 [%r171],%r172; .loc 1 127 18 add.u64 %r174,%r267,%r124; .loc 1 127 20 add.u64 %r175,%r133,%r123; ld.u64 %r177,[%r175]; mul.lo.u64 %r176,%r177,%r23; .loc 1 127 18 st.u64 [%r174],%r176; .loc 1 120 3 add.u64 %r124,%r124,8; add.u64 %r123,%r123,24; setp.ne.u64 %r178,%r114,%r124; @ %r178 bra $L7; $L4: .loc 1 129 14 ld.u64 %r38,[%frame+360]; .loc 1 129 6 setp.ne.u64 %r179,%r38,0; .loc 1 130 16 selp.u64 %r38,%r38,1,%r179; .loc 1 131 6 ld.u64 %r180,[%frame+240]; setp.ne.u64 %r181,%r180,0; @ %r181 bra $L9; .loc 1 132 16 cvt.s64.s32 %r182,%r92; st.u64 [%frame+240],%r182; $L9: .loc 1 134 6 setp.ne.u32 %r183,%r87,0; @ %r183 bra $L34; .loc 1 137 10 ld.u64 %r88,[%r132]; bra $L10; $L34: .loc 1 135 10 mov.u64 %r88,0; $L10: .loc 1 139 10 ld.u64 %r41,[%r131]; mov.u64 %r270,%r41; .loc 1 139 6 setp.eq.u64 %r271,%r41,0; @ %r271 bra $L11; .loc 1 139 33 cvta.global.u64 %r185,_gfortrani_compile_options; .loc 1 139 30 ld.u32 %r186,[%r185+36]; setp.eq.u32 %r187,%r186,0; @ ! %r187 bra $L11; setp.eq.u64 %r269,%r134,0; bra $L12; $L11: .loc 1 144 10 setp.eq.u64 %r269,%r134,0; @ %r269 bra $L13; .loc 1 148 12 ld.u64 %r190,[%r134+56]; add.u64 %r189,%r190,1; .loc 1 148 10 ld.u64 %r191,[%r134+48]; sub.u64 %r89,%r189,%r191; .loc 1 149 7 setp.lt.s64 %r192,%r89,0; @ ! 
%r192 bra $L15; bra $L14; $L13: .loc 1 158 12133; call (%value_in),_gfortrani_count_0,(%out_arg1); ld.param.u64 %r194,[%value_in]; } mov.u64 %r89,%r194; .loc 1 161 14 ld.u64 %r270,[%r131]; $L15: .loc 1 161 10 setp.ne.u64 %r195,%r270,0; @ %r195 bra $L16; .loc 1 164 4 st.u64 [%r131+48],%r270; add.u64 %r197,%r89,-1; st.u64 [%r131+56],%r197; mov.u64 %r198,1; st.u64 [%r131+40],%r198; .loc 1 166 16 st.u64 [%r131+8],%r270; .loc 1 169 2189201; call (%value_in),_gfortrani_xmallocarrayt.u64 [%r131],%r41; .loc 1 171 7 setp.ne.u64 %r204,%r89,0; @ %r204 bra $L12; bra $L1; $L16: .loc 1 179 17 ld.u64 %r206,[%r131+56]; add.u64 %r205,%r206,1; .loc 1 179 15 ld.u64 %r207,[%r131+48]; sub.u64 %r95,%r205,%r207; .loc 1 180 7 setp.ne.u64 %r208,%r95,%r89; @ %r208 bra $L17; .loc 1 192 8 mov.u64 %r41,%r270; setp.eq.u64 %r269,%r134,0; bra $L12; $L17: .loc 1 181 6 st.u64 [%stack+8],%r95; st.u64 [%stack],%r89;_gfortran_runtime_error12: .loc 1 187 12 ld.u64 %r82,[%r131+40]; .loc 1 188 6 setp.ne.u64 %r211,%r82,0; .loc 1 189 14 selp.u64 %r82,%r82,1,%r211; .loc 1 191 12 ld.u64 %r97,[%frame+240]; .loc 1 194 15 set.u32.ne.u64 %r213,%r103,0; neg.s32 %r214,%r213; .loc 1 194 10 set.u32.ne.u64 %r216,%r88,0; neg.s32 %r217,%r216; .loc 1 194 15 cvt.u16.u32 %r219,%r214; cvt.u16.u32 %r220,%r217; and.b16 %r218,%r219,%r220; .loc 1 194 9 cvt.u32.u16 %r221,%r218; cvt.u16.u8 %r222,%r221; setp.eq.u16 %r223,%r222,0; @ %r223 bra $L35; .loc 1 201 16 shl.b64 %r54,%r82,2; .loc 1 204 12 shl.b64 %r56,%r38,2; ld.u64 %r94,[%frame+120]; .loc 1 208 32 ld.u64 %r115,[%frame]; .loc 1 215 30 mul.lo.u64 %r28,%r115,%r38; .loc 1 216 30 mul.lo.u64 %r224,%r115,%r97; .loc 1 216 16 neg.s64 %r112,%r224; .loc 1 192 8 mov.u64 %r90,%r41; setp.le.s64 %r266,%r91,1; add.u64 %r274,%frame,360; add.u64 %r275,%frame,240; .loc 1 212 20 mov.u64 %r276,0; $L28: .loc 1 197 10 ld.s8 %r226,[%r103]; cvt.u16.u32 %r225,%r226; setp.eq.u16 %r227,%r225,0; @ %r227 bra $L20; .loc 1 200 12 ld.u32 %r52,[%r88]; .loc 1 200 10 st.u32 [%r90],%r52; .loc 1 201 16 add.u64 %r90,%r90,%r54; .loc 1 204 12 add.u64 %r88,%r88,%r56; .loc 1 205 12 add.u64 %r103,%r103,%r97; .loc 1 206 15 add.u64 %r94,%r94,1; .loc 1 208 13 setp.eq.u64 %r228,%r94,%r115; @ %r228 bra $L21; bra $L28; $L20: .loc 1 204 12 add.u64 %r88,%r88,%r56; .loc 1 205 12 add.u64 %r103,%r103,%r97; .loc 1 206 15 add.u64 %r94,%r94,1; .loc 1 208 13 setp.ne.u64 %r229,%r94,%r115; @ %r229 bra $L28; $L21: .loc 1 216 16 add.u64 %r104,%r103,%r112; .loc 1 218 14 @ ! 
%r266 bra $L52; bra $L19; $L27: .loc 1 212 20 st.u64 [%r83],%r276; .loc 1 215 30 mul.lo.u64 %r58,%r64,%r63; .loc 1 216 30 mul.lo.u64 %r232,%r66,%r63; .loc 1 216 16 sub.u64 %r104,%r103,%r232; .loc 1 217 12 add.u64 %r105,%r105,1; .loc 1 218 14 add.u64 %r83,%r83,8; add.u64 %r129,%r129,8; setp.ne.u64 %r233,%r91,%r105; @ %r233 bra $L26; bra $L19; $L52: add.u64 %r83,%frame,128; .loc 1 215 30 mov.u64 %r58,%r28; .loc 1 218 14 mov.u64 %r129,8; .loc 1 217 12 mov.u64 %r105,1; $L26: .loc 1 226 23 ld.u64 %r235,[%r83]; add.u64 %r63,%r235,1; st.u64 [%r83],%r63; .loc 1 227 30 add.u64 %r237,%r274,%r129; ld.u64 %r64,[%r237]; .loc 1 227 20 sub.u64 %r238,%r64,%r58; shl.b64 %r239,%r238,2; add.u64 %r88,%r88,%r239; .loc 1 228 30 add.u64 %r241,%r275,%r129; ld.u64 %r66,[%r241]; .loc 1 228 20 add.u64 %r103,%r104,%r66; .loc 1 208 32 add.u64 %r242,%frame,%r129; ld.u64 %r68,[%r242]; .loc 1 208 13 setp.eq.u64 %r243,%r63,%r68; @ %r243 bra $L27; .loc 1 212 20 mov.u64 %r94,0; bra $L28; $L35: .loc 1 192 8 mov.u64 %r90,%r41; $L19: .loc 1 234 6 @ %r269 bra $L1; .loc 1 236 11 ld.u64 %r246,[%r134+56]; add.u64 %r245,%r246,1; .loc 1 236 9 ld.u64 %r247,[%r134+48]; sub.u64 %r98,%r245,%r247; .loc 1 237 22 sub.u64 %r248,%r90,%r41; shr.s64 %r250,%r248,2; .loc 1 237 13 div.s64 %r99,%r250,%r82; .loc 1 238 10 setp.le.s64 %r251,%r98,%r99; @ %r251 bra $L1; .loc 1 240 20 ld.u64 %r84,[%r134+40]; .loc 1 241 14 setp.eq.u64 %r252,%r84,0; @ %r252 bra $L36; .loc 1 244 47 mul.lo.u64 %r130,%r99,%r84; bra $L30; $L36: mov.u64 %r130,%r99; .loc 1 242 22 mov.u64 %r84,1; $L30: .loc 1 244 36 shl.b64 %r253,%r130,2; .loc 1 244 16 ld.u64 %r254,[%r134]; add.u64 %r102,%r254,%r253; .loc 1 245 13 sub.u64 %r100,%r98,%r99; .loc 1 246 19 add.u64 %r101,%r100,-1; .loc 1 246 17 setp.eq.u64 %r255,%r100,0; @ %r255 bra $L1; .loc 1 249 20 shl.b64 %r79,%r82,2; .loc 1 250 20 shl.b64 %r81,%r84,2; $L31: .loc 1 248 16 ld.u32 %r77,[%r102]; .loc 1 248 14 st.u32 [%r90],%r77; .loc 1 249 20 add.u64 %r90,%r90,%r79; .loc 1 250 20 add.u64 %r102,%r102,%r81; .loc 1 246 19 add.u64 %r101,%r101,-1; .loc 1 246 17 setp.ne.u64 %r256,%r101,-1; @ %r256 bra $L31; bra $L1; $L33: .loc 1 164 4 st.u64 [%r131+48],%r270; mov.u64 %r258,-1; st.u64 [%r131+56],%r258; mov.u64 %r259,1; st.u64 [%r131+40],%r259; .loc 1 166 16 st.u64 [%r131+8],%r270; .loc 1 169 21262; call (%value_in),_gfortrani_xmallocarray263,[%value_in]; } .loc 1 169 19 st.u64 [%r131],%r263; bra $L1; $L14: .loc 1 161 10 @ %r271 bra $L33; .loc 1 151 14 mov.u64 %r134,0; .loc 1 152 15 mov.u64 %r89,%r134; bra $L16; $L1: .loc 1 254 pack_i8.o/_gfortrani_pack_i8 .visible .func _gfortrani_pack_../libgfortran/generated/pack_i8.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_count_0_gfortrani_count_0_gfortrani_xmallocarr_gfortrani_xmallocarray110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,910,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,65,67,75,32,105,110,116,114,105,110,115,105,99,59,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortrani_pack_i8 .visible .func _gfortrani_pack_pred %r150; .reg .predpredpredu64 %r206; .reg .u64 %r207; .reg .predpred %r229; .reg .u64 %r232; .reg .predpred %r251; .reg .predmov.u64 %r131,%ar0; mov.u64 %r132,%ar1; mov.u64 %r133,%ar2; mov.u64 %r134,%ar3; .loc 1 97 9 ld.s8 %r22,[%r132+28]; .loc 1 97 7 cvt.u32.u32 %r135,%r22; cvt.s64.s8 %r91,%r135; .loc 1 99 8 
ld.u64 %r103,[%r133]; .loc 1 104 15 ld.u64 %r23,[%r133+16]; .loc 1 104 13 cvt.u32.u64 %r92,%r23; .loc 1 106 53 add.u32 %r136,%r92,-4; and.b32 %r137,%r136,-5; set.u32.eq.u32 %r139,%r137,0; neg.s32 %r140,%r139; .loc 1 106 22 add.u32 %r141,%r92,-1; set.u32.le.u32 %r143,%r141,1; neg.s32 %r144,%r143; .loc 1 106 6 cvt.u16.u32 %r146,%r140; cvt.u16.u32 %r147,%r144; or.b16 %r145,%r146,%r147; cvt.u32.u16 %r148,%r145; cvt.u16.u8 %r149,%r148; setp.eq.u16 %r150,%r149,0; @ %r150 bra $L2; $L5: .loc 1 120 3 setp.gt.s64 %r151,%r91,0; @ %r151 bra $L3; mov.u32 %r87,0; bra $L4; $L2: .loc 1 108 7 setp.eq.u32 %r152,%r92,16; @ %r152 bra $L5; .loc 1 117 5 cvta.const.u64 %r11_gfortran_runtime_error122 16 cvt.u32.u32 %r156,%r22; cvt.s64.s8 %r155,%r156; shl.b64 %r114,%r155,3; add.u64 %r157,%frame,120; mov.u32 %r16111163,[%value_in]; } mov.u64 %r123,40; mov.u64 %r124,0; cvt.u32.u64 %r87,%r124; add.u64 %r268,%frame,360; add.u64 %r267,%frame,240; $L7: add.u64 %r117,%r132,%r123; .loc 1 123 19 ld.u64 %r166,[%r117+16]; add.u64 %r165,%r166,1; ld.u64 %r167,[%r117+8]; sub.u64 %r32,%r165,%r167; .loc 1 123 17 add.u64 %r168,%frame,%r124; st.u64 [%r168],%r32; .loc 1 124 10 setp.gt.s64 %r169,%r32,0; .loc 1 125 19 selp.u32 %r87,%r87,1,%r169; .loc 1 126 18 add.u64 %r171,%r268,%r124; ld.u64 %r172,[%r117]; st.u64 [%r171],%r172; .loc 1 127 18 add.u64 %r174,%r267,%r124; .loc 1 127 20 add.u64 %r175,%r133,%r123; ld.u64 %r177,[%r175]; mul.lo.u64 %r176,%r177,%r23; .loc 1 127 18 st.u64 [%r174],%r176; .loc 1 120 3 add.u64 %r124,%r124,8; add.u64 %r123,%r123,24; setp.ne.u64 %r178,%r114,%r124; @ %r178 bra $L7; $L4: .loc 1 129 14 ld.u64 %r38,[%frame+360]; .loc 1 129 6 setp.ne.u64 %r179,%r38,0; .loc 1 130 16 selp.u64 %r38,%r38,1,%r179; .loc 1 131 6 ld.u64 %r180,[%frame+240]; setp.ne.u64 %r181,%r180,0; @ %r181 bra $L9; .loc 1 132 16 cvt.s64.s32 %r182,%r92; st.u64 [%frame+240],%r182; $L9: .loc 1 134 6 setp.ne.u32 %r183,%r87,0; @ %r183 bra $L34; .loc 1 137 10 ld.u64 %r88,[%r132]; bra $L10; $L34: .loc 1 135 10 mov.u64 %r88,0; $L10: .loc 1 139 10 ld.u64 %r41,[%r131]; mov.u64 %r270,%r41; .loc 1 139 6 setp.eq.u64 %r271,%r41,0; @ %r271 bra $L11; .loc 1 139 33 cvta.global.u64 %r185,_gfortrani_compile_options; .loc 1 139 30 ld.u32 %r186,[%r185+36]; setp.eq.u32 %r187,%r186,0; @ ! %r187 bra $L11; setp.eq.u64 %r269,%r134,0; bra $L12; $L11: .loc 1 144 10 setp.eq.u64 %r269,%r134,0; @ %r269 bra $L13; .loc 1 148 12 ld.u64 %r190,[%r134+56]; add.u64 %r189,%r190,1; .loc 1 148 10 ld.u64 %r191,[%r134+48]; sub.u64 %r89,%r189,%r191; .loc 1 149 7 setp.lt.s64 %r192,%r89,0; @ ! 
%r192 bra $L15; bra $L14; $L13: .loc 1 158 12133; call (%value_in),_gfortrani_count_0,(%out_arg1); ld.param.u64 %r194,[%value_in]; } mov.u64 %r89,%r194; .loc 1 161 14 ld.u64 %r270,[%r131]; $L15: .loc 1 161 10 setp.ne.u64 %r195,%r270,0; @ %r195 bra $L16; .loc 1 164 4 st.u64 [%r131+48],%r270; add.u64 %r197,%r89,-1; st.u64 [%r131+56],%r197; mov.u64 %r198,1; st.u64 [%r131+40],%r198; .loc 1 166 16 st.u64 [%r131+8],%r270; .loc 1 169 21 mov.u64 %r20189201; call (%value_in),_gfortrani_xmallocarrayt.u64 [%r131],%r41; .loc 1 171 7 setp.ne.u64 %r204,%r89,0; @ %r204 bra $L12; bra $L1; $L16: .loc 1 179 17 ld.u64 %r206,[%r131+56]; add.u64 %r205,%r206,1; .loc 1 179 15 ld.u64 %r207,[%r131+48]; sub.u64 %r95,%r205,%r207; .loc 1 180 7 setp.ne.u64 %r208,%r95,%r89; @ %r208 bra $L17; .loc 1 192 8 mov.u64 %r41,%r270; setp.eq.u64 %r269,%r134,0; bra $L12; $L17: .loc 1 181 6 st.u64 [%stack+8],%r95; st.u64 [%stack],%r89;_gfortran_runtime_error12: .loc 1 187 12 ld.u64 %r82,[%r131+40]; .loc 1 188 6 setp.ne.u64 %r211,%r82,0; .loc 1 189 14 selp.u64 %r82,%r82,1,%r211; .loc 1 191 12 ld.u64 %r97,[%frame+240]; .loc 1 194 15 set.u32.ne.u64 %r213,%r103,0; neg.s32 %r214,%r213; .loc 1 194 10 set.u32.ne.u64 %r216,%r88,0; neg.s32 %r217,%r216; .loc 1 194 15 cvt.u16.u32 %r219,%r214; cvt.u16.u32 %r220,%r217; and.b16 %r218,%r219,%r220; .loc 1 194 9 cvt.u32.u16 %r221,%r218; cvt.u16.u8 %r222,%r221; setp.eq.u16 %r223,%r222,0; @ %r223 bra $L35; .loc 1 201 16 shl.b64 %r54,%r82,3; .loc 1 204 12 shl.b64 %r56,%r38,3; ld.u64 %r94,[%frame+120]; .loc 1 208 32 ld.u64 %r115,[%frame]; .loc 1 215 30 mul.lo.u64 %r28,%r115,%r38; .loc 1 216 30 mul.lo.u64 %r224,%r115,%r97; .loc 1 216 16 neg.s64 %r112,%r224; .loc 1 192 8 mov.u64 %r90,%r41; setp.le.s64 %r266,%r91,1; add.u64 %r274,%frame,360; add.u64 %r275,%frame,240; .loc 1 212 20 mov.u64 %r276,0; $L28: .loc 1 197 10 ld.s8 %r226,[%r103]; cvt.u16.u32 %r225,%r226; setp.eq.u16 %r227,%r225,0; @ %r227 bra $L20; .loc 1 200 12 ld.u64 %r52,[%r88]; .loc 1 200 10 st.u64 [%r90],%r52; .loc 1 201 16 add.u64 %r90,%r90,%r54; .loc 1 204 12 add.u64 %r88,%r88,%r56; .loc 1 205 12 add.u64 %r103,%r103,%r97; .loc 1 206 15 add.u64 %r94,%r94,1; .loc 1 208 13 setp.eq.u64 %r228,%r94,%r115; @ %r228 bra $L21; bra $L28; $L20: .loc 1 204 12 add.u64 %r88,%r88,%r56; .loc 1 205 12 add.u64 %r103,%r103,%r97; .loc 1 206 15 add.u64 %r94,%r94,1; .loc 1 208 13 setp.ne.u64 %r229,%r94,%r115; @ %r229 bra $L28; $L21: .loc 1 216 16 add.u64 %r104,%r103,%r112; .loc 1 218 14 @ ! 
%r266 bra $L52; bra $L19; $L27: .loc 1 212 20 st.u64 [%r83],%r276; .loc 1 215 30 mul.lo.u64 %r58,%r64,%r63; .loc 1 216 30 mul.lo.u64 %r232,%r66,%r63; .loc 1 216 16 sub.u64 %r104,%r103,%r232; .loc 1 217 12 add.u64 %r105,%r105,1; .loc 1 218 14 add.u64 %r83,%r83,8; add.u64 %r129,%r129,8; setp.ne.u64 %r233,%r91,%r105; @ %r233 bra $L26; bra $L19; $L52: add.u64 %r83,%frame,128; .loc 1 215 30 mov.u64 %r58,%r28; .loc 1 218 14 mov.u64 %r129,8; .loc 1 217 12 mov.u64 %r105,1; $L26: .loc 1 226 23 ld.u64 %r235,[%r83]; add.u64 %r63,%r235,1; st.u64 [%r83],%r63; .loc 1 227 30 add.u64 %r237,%r274,%r129; ld.u64 %r64,[%r237]; .loc 1 227 20 sub.u64 %r238,%r64,%r58; shl.b64 %r239,%r238,3; add.u64 %r88,%r88,%r239; .loc 1 228 30 add.u64 %r241,%r275,%r129; ld.u64 %r66,[%r241]; .loc 1 228 20 add.u64 %r103,%r104,%r66; .loc 1 208 32 add.u64 %r242,%frame,%r129; ld.u64 %r68,[%r242]; .loc 1 208 13 setp.eq.u64 %r243,%r63,%r68; @ %r243 bra $L27; .loc 1 212 20 mov.u64 %r94,0; bra $L28; $L35: .loc 1 192 8 mov.u64 %r90,%r41; $L19: .loc 1 234 6 @ %r269 bra $L1; .loc 1 236 11 ld.u64 %r246,[%r134+56]; add.u64 %r245,%r246,1; .loc 1 236 9 ld.u64 %r247,[%r134+48]; sub.u64 %r98,%r245,%r247; .loc 1 237 22 sub.u64 %r248,%r90,%r41; shr.s64 %r250,%r248,3; .loc 1 237 13 div.s64 %r99,%r250,%r82; .loc 1 238 10 setp.le.s64 %r251,%r98,%r99; @ %r251 bra $L1; .loc 1 240 20 ld.u64 %r84,[%r134+40]; .loc 1 241 14 setp.eq.u64 %r252,%r84,0; @ %r252 bra $L36; .loc 1 244 47 mul.lo.u64 %r130,%r99,%r84; bra $L30; $L36: mov.u64 %r130,%r99; .loc 1 242 22 mov.u64 %r84,1; $L30: .loc 1 244 36 shl.b64 %r253,%r130,3; .loc 1 244 16 ld.u64 %r254,[%r134]; add.u64 %r102,%r254,%r253; .loc 1 245 13 sub.u64 %r100,%r98,%r99; .loc 1 246 19 add.u64 %r101,%r100,-1; .loc 1 246 17 setp.eq.u64 %r255,%r100,0; @ %r255 bra $L1; .loc 1 249 20 shl.b64 %r79,%r82,3; .loc 1 250 20 shl.b64 %r81,%r84,3; $L31: .loc 1 248 16 ld.u64 %r77,[%r102]; .loc 1 248 14 st.u64 [%r90],%r77; .loc 1 249 20 add.u64 %r90,%r90,%r79; .loc 1 250 20 add.u64 %r102,%r102,%r81; .loc 1 246 19 add.u64 %r101,%r101,-1; .loc 1 246 17 setp.ne.u64 %r256,%r101,-1; @ %r256 bra $L31; bra $L1; $L33: .loc 1 164 4 st.u64 [%r131+48],%r270; mov.u64 %r258,-1; st.u64 [%r131+56],%r258; mov.u64 %r259,1; st.u64 [%r131+40],%r259; .loc 1 166 16 st.u64 [%r131+8],%r270; .loc 1 169 21 mov.u64 %r2627262; call (%value_in),_gfortrani_xmallocarray263,[%value_in]; } .loc 1 169 19 st.u64 [%r131],%r263; bra $L1; $L14: .loc 1 161 10 @ %r271 bra $L33; .loc 1 151 14 mov.u64 %r134,0; .loc 1 152 15 mov.u64 %r89,%r134; bra $L16; $L1: .loc 1 254 pack_i16.o/_gfortrani_pack_i16 .visible .func _gfortrani_pack_i16../libgfortran/generated/pack_i16.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_count_0_gfortrani_count_0_gfortrani_xmallocarr_gfortrani_xmallocarray110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,910,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,65,67,75,32,105,110,116,114,105,110,115,105,99,59,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortrani_pack_i16 .visible .func _gfortrani_pack_i16pred %r150; .reg .predpredpredu64 %r206; .reg .u64 %r207; .reg .predu16 %r222; .reg .pred %r223; .reg .u64 %r224; .reg .u16 %r225; .reg .u32 %r226; .reg .pred %r227; .reg .predpredpred %r253; .reg .predu64 %r273; .reg .pred %r274; .reg .predu64 %r282; .reg .u64 %r283; .reg .u64 %r284; 
mov.u64 %r131,%ar0; mov.u64 %r132,%ar1; mov.u64 %r133,%ar2; mov.u64 %r134,%ar3; .loc 1 97 9 ld.s8 %r22,[%r132+28]; .loc 1 97 7 cvt.u32.u32 %r135,%r22; cvt.s64.s8 %r91,%r135; .loc 1 99 8 ld.u64 %r103,[%r133]; .loc 1 104 15 ld.u64 %r23,[%r133+16]; .loc 1 104 13 cvt.u32.u64 %r92,%r23; .loc 1 106 53 add.u32 %r136,%r92,-4; and.b32 %r137,%r136,-5; set.u32.eq.u32 %r139,%r137,0; neg.s32 %r140,%r139; .loc 1 106 22 add.u32 %r141,%r92,-1; set.u32.le.u32 %r143,%r141,1; neg.s32 %r144,%r143; .loc 1 106 6 cvt.u16.u32 %r146,%r140; cvt.u16.u32 %r147,%r144; or.b16 %r145,%r146,%r147; cvt.u32.u16 %r148,%r145; cvt.u16.u8 %r149,%r148; setp.eq.u16 %r150,%r149,0; @ %r150 bra $L2; $L5: .loc 1 120 3 setp.gt.s64 %r151,%r91,0; @ %r151 bra $L3; mov.u32 %r87,0; bra $L4; $L2: .loc 1 108 7 setp.eq.u32 %r152,%r92,16; @ %r152 bra $L5; .loc 1 117 5 cvta.const.u64 %r11_gfortran_runtime_error122 16 cvt.u32.u32 %r156,%r22; cvt.s64.s8 %r155,%r156; shl.b64 %r114,%r155,3; add.u64 %r157,%frame,120; mov.u32 %r16111163,[%value_in]; } mov.u64 %r123,40; mov.u64 %r124,0; cvt.u32.u64 %r87,%r124; add.u64 %r278,%frame,360; add.u64 %r277,%frame,240; $L7: add.u64 %r117,%r132,%r123; .loc 1 123 19 ld.u64 %r166,[%r117+16]; add.u64 %r165,%r166,1; ld.u64 %r167,[%r117+8]; sub.u64 %r32,%r165,%r167; .loc 1 123 17 add.u64 %r168,%frame,%r124; st.u64 [%r168],%r32; .loc 1 124 10 setp.gt.s64 %r169,%r32,0; .loc 1 125 19 selp.u32 %r87,%r87,1,%r169; .loc 1 126 18 add.u64 %r171,%r278,%r124; ld.u64 %r172,[%r117]; st.u64 [%r171],%r172; .loc 1 127 18 add.u64 %r174,%r277,%r124; .loc 1 127 20 add.u64 %r175,%r133,%r123; ld.u64 %r177,[%r175]; mul.lo.u64 %r176,%r177,%r23; .loc 1 127 18 st.u64 [%r174],%r176; .loc 1 120 3 add.u64 %r124,%r124,8; add.u64 %r123,%r123,24; setp.ne.u64 %r178,%r114,%r124; @ %r178 bra $L7; $L4: .loc 1 129 14 ld.u64 %r38,[%frame+360]; .loc 1 129 6 setp.ne.u64 %r179,%r38,0; .loc 1 130 16 selp.u64 %r38,%r38,1,%r179; .loc 1 131 6 ld.u64 %r180,[%frame+240]; setp.ne.u64 %r181,%r180,0; @ %r181 bra $L9; .loc 1 132 16 cvt.s64.s32 %r182,%r92; st.u64 [%frame+240],%r182; $L9: .loc 1 134 6 setp.ne.u32 %r183,%r87,0; @ %r183 bra $L34; .loc 1 137 10 ld.u64 %r88,[%r132]; bra $L10; $L34: .loc 1 135 10 mov.u64 %r88,0; $L10: .loc 1 139 10 ld.u64 %r41,[%r131]; mov.u64 %r279,%r41; .loc 1 139 6 setp.eq.u64 %r275,%r41,0; @ %r275 bra $L11; .loc 1 139 33 cvta.global.u64 %r185,_gfortrani_compile_options; .loc 1 139 30 ld.u32 %r186,[%r185+36]; setp.eq.u32 %r187,%r186,0; @ ! %r187 bra $L11; setp.eq.u64 %r274,%r134,0; bra $L12; $L11: .loc 1 144 10 setp.eq.u64 %r274,%r134,0; @ %r274 bra $L13; .loc 1 148 12 ld.u64 %r190,[%r134+56]; add.u64 %r189,%r190,1; .loc 1 148 10 ld.u64 %r191,[%r134+48]; sub.u64 %r89,%r189,%r191; .loc 1 149 7 setp.lt.s64 %r192,%r89,0; @ ! 
%r192 bra $L15; bra $L14; $L13: .loc 1 158 12133; call (%value_in),_gfortrani_count_0,(%out_arg1); ld.param.u64 %r194,[%value_in]; } mov.u64 %r89,%r194; .loc 1 161 14 ld.u64 %r279,[%r131]; $L15: .loc 1 161 10 setp.ne.u64 %r195,%r279,0; @ %r195 bra $L16; .loc 1 164 4 st.u64 [%r131+48],%r279; add.u64 %r197,%r89,-1; st.u64 [%r131+56],%r197; mov.u64 %r198,1; st.u64 [%r131+40],%r198; .loc 1 166 16 st.u64 [%r131+8],%r279; .loc 1 169 21 mov.u64 %r201,1689201; call (%value_in),_gfortrani_xmallocarrayt.u64 [%r131],%r41; .loc 1 171 7 setp.ne.u64 %r204,%r89,0; @ %r204 bra $L12; bra $L1; $L16: .loc 1 179 17 ld.u64 %r206,[%r131+56]; add.u64 %r205,%r206,1; .loc 1 179 15 ld.u64 %r207,[%r131+48]; sub.u64 %r95,%r205,%r207; .loc 1 180 7 setp.ne.u64 %r208,%r95,%r89; @ %r208 bra $L17; .loc 1 192 8 mov.u64 %r41,%r279; setp.eq.u64 %r274,%r134,0; bra $L12; $L17: .loc 1 181 6 st.u64 [%stack+8],%r95; st.u64 [%stack],%r89;_gfortran_runtime_error12: .loc 1 187 12 ld.u64 %r82,[%r131+40]; .loc 1 188 6 setp.ne.u64 %r211,%r82,0; .loc 1 189 14 selp.u64 %r82,%r82,1,%r211; .loc 1 191 12 ld.u64 %r97,[%frame+240]; .loc 1 194 15 set.u32.ne.u64 %r213,%r103,0; neg.s32 %r214,%r213; .loc 1 194 10 set.u32.ne.u64 %r216,%r88,0; neg.s32 %r217,%r216; .loc 1 194 15 cvt.u16.u32 %r219,%r214; cvt.u16.u32 %r220,%r217; and.b16 %r218,%r219,%r220; .loc 1 194 9 cvt.u32.u16 %r221,%r218; cvt.u16.u8 %r222,%r221; setp.eq.u16 %r223,%r222,0; @ %r223 bra $L35; .loc 1 201 16 shl.b64 %r54,%r82,4; .loc 1 204 12 shl.b64 %r56,%r38,4; ld.u64 %r94,[%frame+120]; .loc 1 208 32 ld.u64 %r115,[%frame]; .loc 1 215 30 mul.lo.u64 %r28,%r115,%r38; .loc 1 216 30 mul.lo.u64 %r224,%r115,%r97; .loc 1 216 16 neg.s64 %r112,%r224; .loc 1 192 8 mov.u64 %r90,%r41; setp.le.s64 %r276,%r91,1; add.u64 %r282,%frame,360; add.u64 %r283,%frame,240; .loc 1 212 20 mov.u64 %r284,0; $L28: .loc 1 197 10 ld.s8 %r226,[%r103]; cvt.u16.u32 %r225,%r226; setp.eq.u16 %r227,%r225,0; @ %r227 bra $L20; .loc 1 200 12 ld.u64 %r270,[%r88]; ld.u64 %r271,[%r88+8]; .loc 1 200 10 st.u64 [%r90],%r270; st.u64 [%r90+8],%r271; .loc 1 201 16 add.u64 %r90,%r90,%r54; .loc 1 204 12 add.u64 %r88,%r88,%r56; .loc 1 205 12 add.u64 %r103,%r103,%r97; .loc 1 206 15 add.u64 %r94,%r94,1; .loc 1 208 13 setp.eq.u64 %r230,%r94,%r115; @ %r230 bra $L21; bra $L28; $L20: .loc 1 204 12 add.u64 %r88,%r88,%r56; .loc 1 205 12 add.u64 %r103,%r103,%r97; .loc 1 206 15 add.u64 %r94,%r94,1; .loc 1 208 13 setp.ne.u64 %r231,%r94,%r115; @ %r231 bra $L28; $L21: .loc 1 216 16 add.u64 %r104,%r103,%r112; .loc 1 218 14 @ ! 
%r276 bra $L52; bra $L19; $L27: .loc 1 212 20 st.u64 [%r83],%r284; .loc 1 215 30 mul.lo.u64 %r58,%r64,%r63; .loc 1 216 30 mul.lo.u64 %r234,%r66,%r63; .loc 1 216 16 sub.u64 %r104,%r103,%r234; .loc 1 217 12 add.u64 %r105,%r105,1; .loc 1 218 14 add.u64 %r83,%r83,8; add.u64 %r129,%r129,8; setp.ne.u64 %r235,%r91,%r105; @ %r235 bra $L26; bra $L19; $L52: add.u64 %r83,%frame,128; .loc 1 215 30 mov.u64 %r58,%r28; .loc 1 218 14 mov.u64 %r129,8; .loc 1 217 12 mov.u64 %r105,1; $L26: .loc 1 226 23 ld.u64 %r237,[%r83]; add.u64 %r63,%r237,1; st.u64 [%r83],%r63; .loc 1 227 30 add.u64 %r239,%r282,%r129; ld.u64 %r64,[%r239]; .loc 1 227 20 sub.u64 %r240,%r64,%r58; shl.b64 %r241,%r240,4; add.u64 %r88,%r88,%r241; .loc 1 228 30 add.u64 %r243,%r283,%r129; ld.u64 %r66,[%r243]; .loc 1 228 20 add.u64 %r103,%r104,%r66; .loc 1 208 32 add.u64 %r244,%frame,%r129; ld.u64 %r68,[%r244]; .loc 1 208 13 setp.eq.u64 %r245,%r63,%r68; @ %r245 bra $L27; .loc 1 212 20 mov.u64 %r94,0; bra $L28; $L35: .loc 1 192 8 mov.u64 %r90,%r41; $L19: .loc 1 234 6 @ %r274 bra $L1; .loc 1 236 11 ld.u64 %r248,[%r134+56]; add.u64 %r247,%r248,1; .loc 1 236 9 ld.u64 %r249,[%r134+48]; sub.u64 %r98,%r247,%r249; .loc 1 237 22 sub.u64 %r250,%r90,%r41; shr.s64 %r252,%r250,4; .loc 1 237 13 div.s64 %r99,%r252,%r82; .loc 1 238 10 setp.le.s64 %r253,%r98,%r99; @ %r253 bra $L1; .loc 1 240 20 ld.u64 %r84,[%r134+40]; .loc 1 241 14 setp.eq.u64 %r254,%r84,0; @ %r254 bra $L36; .loc 1 244 47 mul.lo.u64 %r130,%r99,%r84; bra $L30; $L36: mov.u64 %r130,%r99; .loc 1 242 22 mov.u64 %r84,1; $L30: .loc 1 244 36 shl.b64 %r255,%r130,4; .loc 1 244 16 ld.u64 %r256,[%r134]; add.u64 %r102,%r256,%r255; .loc 1 245 13 sub.u64 %r100,%r98,%r99; .loc 1 246 19 add.u64 %r101,%r100,-1; .loc 1 246 17 setp.eq.u64 %r257,%r100,0; @ %r257 bra $L1; .loc 1 249 20 shl.b64 %r79,%r82,4; .loc 1 250 20 shl.b64 %r81,%r84,4; $L31: .loc 1 248 16 ld.u64 %r272,[%r102]; ld.u64 %r273,[%r102+8]; .loc 1 248 14 st.u64 [%r90],%r272; st.u64 [%r90+8],%r273; .loc 1 249 20 add.u64 %r90,%r90,%r79; .loc 1 250 20 add.u64 %r102,%r102,%r81; .loc 1 246 19 add.u64 %r101,%r101,-1; .loc 1 246 17 setp.ne.u64 %r260,%r101,-1; @ %r260 bra $L31; bra $L1; $L33: .loc 1 164 4 st.u64 [%r131+48],%r279; mov.u64 %r262,-1; st.u64 [%r131+56],%r262; mov.u64 %r263,1; st.u64 [%r131+40],%r263; .loc 1 166 16 st.u64 [%r131+8],%r279; .loc 1 169 21 mov.u64 %r266,16266; call (%value_in),_gfortrani_xmallocarray267,[%value_in]; } .loc 1 169 19 st.u64 [%r131],%r267; bra $L1; $L14: .loc 1 161 10 @ %r275 bra $L33; .loc 1 151 14 mov.u64 %r134,0; .loc 1 152 15 mov.u64 %r89,%r134; bra $L16; $L1: .loc 1 254 pack_r4.o/_gfortrani_pack_r4 .visible .func _gfortrani_pack_r../libgfortran/generated/pack_r4.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_count_0_gfortrani_count_0_gfortrani_xmallocarr_gfortrani_xmallocarray110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,910,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,65,67,75,32,105,110,116,114,105,110,115,105,99,59,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortrani_pack_r4 .visible .func _gfortrani_pack_rpred %r150; .reg .predpredpredu64 %r206; .reg .u64 %r207; .reg .predpred %r229; .reg .u64 %r232; .reg .predpred %r251; .reg .predmov.u64 %r131,%ar0; mov.u64 %r132,%ar1; mov.u64 %r133,%ar2; mov.u64 %r134,%ar3; .loc 1 97 9 ld.s8 %r22,[%r132+28]; 
.loc 1 97 7 cvt.u32.u32 %r135,%r22; cvt.s64.s8 %r91,%r135; .loc 1 99 8 ld.u64 %r103,[%r133]; .loc 1 104 15 ld.u64 %r23,[%r133+16]; .loc 1 104 13 cvt.u32.u64 %r92,%r23; .loc 1 106 53 add.u32 %r136,%r92,-4; and.b32 %r137,%r136,-5; set.u32.eq.u32 %r139,%r137,0; neg.s32 %r140,%r139; .loc 1 106 22 add.u32 %r141,%r92,-1; set.u32.le.u32 %r143,%r141,1; neg.s32 %r144,%r143; .loc 1 106 6 cvt.u16.u32 %r146,%r140; cvt.u16.u32 %r147,%r144; or.b16 %r145,%r146,%r147; cvt.u32.u16 %r148,%r145; cvt.u16.u8 %r149,%r148; setp.eq.u16 %r150,%r149,0; @ %r150 bra $L2; $L5: .loc 1 120 3 setp.gt.s64 %r151,%r91,0; @ %r151 bra $L3; mov.u32 %r87,0; bra $L4; $L2: .loc 1 108 7 setp.eq.u32 %r152,%r92,16; @ %r152 bra $L5; .loc 1 117 5 cvta.const.u64 %r11_gfortran_runtime_error122 16 cvt.u32.u32 %r156,%r22; cvt.s64.s8 %r155,%r156; shl.b64 %r114,%r155,3; add.u64 %r157,%frame,120; mov.u32 %r16111163,[%value_in]; } mov.u64 %r123,40; mov.u64 %r124,0; cvt.u32.u64 %r87,%r124; add.u64 %r268,%frame,360; add.u64 %r267,%frame,240; $L7: add.u64 %r117,%r132,%r123; .loc 1 123 19 ld.u64 %r166,[%r117+16]; add.u64 %r165,%r166,1; ld.u64 %r167,[%r117+8]; sub.u64 %r32,%r165,%r167; .loc 1 123 17 add.u64 %r168,%frame,%r124; st.u64 [%r168],%r32; .loc 1 124 10 setp.gt.s64 %r169,%r32,0; .loc 1 125 19 selp.u32 %r87,%r87,1,%r169; .loc 1 126 18 add.u64 %r171,%r268,%r124; ld.u64 %r172,[%r117]; st.u64 [%r171],%r172; .loc 1 127 18 add.u64 %r174,%r267,%r124; .loc 1 127 20 add.u64 %r175,%r133,%r123; ld.u64 %r177,[%r175]; mul.lo.u64 %r176,%r177,%r23; .loc 1 127 18 st.u64 [%r174],%r176; .loc 1 120 3 add.u64 %r124,%r124,8; add.u64 %r123,%r123,24; setp.ne.u64 %r178,%r114,%r124; @ %r178 bra $L7; $L4: .loc 1 129 14 ld.u64 %r38,[%frame+360]; .loc 1 129 6 setp.ne.u64 %r179,%r38,0; .loc 1 130 16 selp.u64 %r38,%r38,1,%r179; .loc 1 131 6 ld.u64 %r180,[%frame+240]; setp.ne.u64 %r181,%r180,0; @ %r181 bra $L9; .loc 1 132 16 cvt.s64.s32 %r182,%r92; st.u64 [%frame+240],%r182; $L9: .loc 1 134 6 setp.ne.u32 %r183,%r87,0; @ %r183 bra $L34; .loc 1 137 10 ld.u64 %r88,[%r132]; bra $L10; $L34: .loc 1 135 10 mov.u64 %r88,0; $L10: .loc 1 139 10 ld.u64 %r41,[%r131]; mov.u64 %r270,%r41; .loc 1 139 6 setp.eq.u64 %r271,%r41,0; @ %r271 bra $L11; .loc 1 139 33 cvta.global.u64 %r185,_gfortrani_compile_options; .loc 1 139 30 ld.u32 %r186,[%r185+36]; setp.eq.u32 %r187,%r186,0; @ ! %r187 bra $L11; setp.eq.u64 %r269,%r134,0; bra $L12; $L11: .loc 1 144 10 setp.eq.u64 %r269,%r134,0; @ %r269 bra $L13; .loc 1 148 12 ld.u64 %r190,[%r134+56]; add.u64 %r189,%r190,1; .loc 1 148 10 ld.u64 %r191,[%r134+48]; sub.u64 %r89,%r189,%r191; .loc 1 149 7 setp.lt.s64 %r192,%r89,0; @ ! 
%r192 bra $L15; bra $L14; $L13: .loc 1 158 12133; call (%value_in),_gfortrani_count_0,(%out_arg1); ld.param.u64 %r194,[%value_in]; } mov.u64 %r89,%r194; .loc 1 161 14 ld.u64 %r270,[%r131]; $L15: .loc 1 161 10 setp.ne.u64 %r195,%r270,0; @ %r195 bra $L16; .loc 1 164 4 st.u64 [%r131+48],%r270; add.u64 %r197,%r89,-1; st.u64 [%r131+56],%r197; mov.u64 %r198,1; st.u64 [%r131+40],%r198; .loc 1 166 16 st.u64 [%r131+8],%r270; .loc 1 169 2189201; call (%value_in),_gfortrani_xmallocarrayt.u64 [%r131],%r41; .loc 1 171 7 setp.ne.u64 %r204,%r89,0; @ %r204 bra $L12; bra $L1; $L16: .loc 1 179 17 ld.u64 %r206,[%r131+56]; add.u64 %r205,%r206,1; .loc 1 179 15 ld.u64 %r207,[%r131+48]; sub.u64 %r95,%r205,%r207; .loc 1 180 7 setp.ne.u64 %r208,%r95,%r89; @ %r208 bra $L17; .loc 1 192 8 mov.u64 %r41,%r270; setp.eq.u64 %r269,%r134,0; bra $L12; $L17: .loc 1 181 6 st.u64 [%stack+8],%r95; st.u64 [%stack],%r89;_gfortran_runtime_error12: .loc 1 187 12 ld.u64 %r82,[%r131+40]; .loc 1 188 6 setp.ne.u64 %r211,%r82,0; .loc 1 189 14 selp.u64 %r82,%r82,1,%r211; .loc 1 191 12 ld.u64 %r97,[%frame+240]; .loc 1 194 15 set.u32.ne.u64 %r213,%r103,0; neg.s32 %r214,%r213; .loc 1 194 10 set.u32.ne.u64 %r216,%r88,0; neg.s32 %r217,%r216; .loc 1 194 15 cvt.u16.u32 %r219,%r214; cvt.u16.u32 %r220,%r217; and.b16 %r218,%r219,%r220; .loc 1 194 9 cvt.u32.u16 %r221,%r218; cvt.u16.u8 %r222,%r221; setp.eq.u16 %r223,%r222,0; @ %r223 bra $L35; .loc 1 201 16 shl.b64 %r54,%r82,2; .loc 1 204 12 shl.b64 %r56,%r38,2; ld.u64 %r94,[%frame+120]; .loc 1 208 32 ld.u64 %r115,[%frame]; .loc 1 215 30 mul.lo.u64 %r28,%r115,%r38; .loc 1 216 30 mul.lo.u64 %r224,%r115,%r97; .loc 1 216 16 neg.s64 %r112,%r224; .loc 1 192 8 mov.u64 %r90,%r41; setp.le.s64 %r266,%r91,1; add.u64 %r274,%frame,360; add.u64 %r275,%frame,240; .loc 1 212 20 mov.u64 %r276,0; $L28: .loc 1 197 10 ld.s8 %r226,[%r103]; cvt.u16.u32 %r225,%r226; setp.eq.u16 %r227,%r225,0; @ %r227 bra $L20; .loc 1 200 12 ld.f32 %r52,[%r88]; .loc 1 200 10 st.f32 [%r90],%r52; .loc 1 201 16 add.u64 %r90,%r90,%r54; .loc 1 204 12 add.u64 %r88,%r88,%r56; .loc 1 205 12 add.u64 %r103,%r103,%r97; .loc 1 206 15 add.u64 %r94,%r94,1; .loc 1 208 13 setp.eq.u64 %r228,%r94,%r115; @ %r228 bra $L21; bra $L28; $L20: .loc 1 204 12 add.u64 %r88,%r88,%r56; .loc 1 205 12 add.u64 %r103,%r103,%r97; .loc 1 206 15 add.u64 %r94,%r94,1; .loc 1 208 13 setp.ne.u64 %r229,%r94,%r115; @ %r229 bra $L28; $L21: .loc 1 216 16 add.u64 %r104,%r103,%r112; .loc 1 218 14 @ ! 
%r266 bra $L52; bra $L19; $L27: .loc 1 212 20 st.u64 [%r83],%r276; .loc 1 215 30 mul.lo.u64 %r58,%r64,%r63; .loc 1 216 30 mul.lo.u64 %r232,%r66,%r63; .loc 1 216 16 sub.u64 %r104,%r103,%r232; .loc 1 217 12 add.u64 %r105,%r105,1; .loc 1 218 14 add.u64 %r83,%r83,8; add.u64 %r129,%r129,8; setp.ne.u64 %r233,%r91,%r105; @ %r233 bra $L26; bra $L19; $L52: add.u64 %r83,%frame,128; .loc 1 215 30 mov.u64 %r58,%r28; .loc 1 218 14 mov.u64 %r129,8; .loc 1 217 12 mov.u64 %r105,1; $L26: .loc 1 226 23 ld.u64 %r235,[%r83]; add.u64 %r63,%r235,1; st.u64 [%r83],%r63; .loc 1 227 30 add.u64 %r237,%r274,%r129; ld.u64 %r64,[%r237]; .loc 1 227 20 sub.u64 %r238,%r64,%r58; shl.b64 %r239,%r238,2; add.u64 %r88,%r88,%r239; .loc 1 228 30 add.u64 %r241,%r275,%r129; ld.u64 %r66,[%r241]; .loc 1 228 20 add.u64 %r103,%r104,%r66; .loc 1 208 32 add.u64 %r242,%frame,%r129; ld.u64 %r68,[%r242]; .loc 1 208 13 setp.eq.u64 %r243,%r63,%r68; @ %r243 bra $L27; .loc 1 212 20 mov.u64 %r94,0; bra $L28; $L35: .loc 1 192 8 mov.u64 %r90,%r41; $L19: .loc 1 234 6 @ %r269 bra $L1; .loc 1 236 11 ld.u64 %r246,[%r134+56]; add.u64 %r245,%r246,1; .loc 1 236 9 ld.u64 %r247,[%r134+48]; sub.u64 %r98,%r245,%r247; .loc 1 237 22 sub.u64 %r248,%r90,%r41; shr.s64 %r250,%r248,2; .loc 1 237 13 div.s64 %r99,%r250,%r82; .loc 1 238 10 setp.le.s64 %r251,%r98,%r99; @ %r251 bra $L1; .loc 1 240 20 ld.u64 %r84,[%r134+40]; .loc 1 241 14 setp.eq.u64 %r252,%r84,0; @ %r252 bra $L36; .loc 1 244 47 mul.lo.u64 %r130,%r99,%r84; bra $L30; $L36: mov.u64 %r130,%r99; .loc 1 242 22 mov.u64 %r84,1; $L30: .loc 1 244 36 shl.b64 %r253,%r130,2; .loc 1 244 16 ld.u64 %r254,[%r134]; add.u64 %r102,%r254,%r253; .loc 1 245 13 sub.u64 %r100,%r98,%r99; .loc 1 246 19 add.u64 %r101,%r100,-1; .loc 1 246 17 setp.eq.u64 %r255,%r100,0; @ %r255 bra $L1; .loc 1 249 20 shl.b64 %r79,%r82,2; .loc 1 250 20 shl.b64 %r81,%r84,2; $L31: .loc 1 248 16 ld.f32 %r77,[%r102]; .loc 1 248 14 st.f32 [%r90],%r77; .loc 1 249 20 add.u64 %r90,%r90,%r79; .loc 1 250 20 add.u64 %r102,%r102,%r81; .loc 1 246 19 add.u64 %r101,%r101,-1; .loc 1 246 17 setp.ne.u64 %r256,%r101,-1; @ %r256 bra $L31; bra $L1; $L33: .loc 1 164 4 st.u64 [%r131+48],%r270; mov.u64 %r258,-1; st.u64 [%r131+56],%r258; mov.u64 %r259,1; st.u64 [%r131+40],%r259; .loc 1 166 16 st.u64 [%r131+8],%r270; .loc 1 169 21262; call (%value_in),_gfortrani_xmallocarray263,[%value_in]; } .loc 1 169 19 st.u64 [%r131],%r263; bra $L1; $L14: .loc 1 161 10 @ %r271 bra $L33; .loc 1 151 14 mov.u64 %r134,0; .loc 1 152 15 mov.u64 %r89,%r134; bra $L16; $L1: .loc 1 254 pack_r8.o/_gfortrani_pack_r8 .visible .func _gfortrani_pack_r8../libgfortran/generated/pack_r8.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_count_0_gfortrani_count_0_gfortrani_xmallocarr_gfortrani_xmallocarray110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,910,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,65,67,75,32,105,110,116,114,105,110,115,105,99,59,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortrani_pack_r8 .visible .func _gfortrani_pack_r8pred %r150; .reg .predpredpredu64 %r206; .reg .u64 %r207; .reg .predpred %r229; .reg .u64 %r232; .reg .predpred %r251; .reg .predmov.u64 %r131,%ar0; mov.u64 %r132,%ar1; mov.u64 %r133,%ar2; mov.u64 %r134,%ar3; .loc 1 97 9 ld.s8 %r22,[%r132+28]; .loc 1 97 7 cvt.u32.u32 %r135,%r22; cvt.s64.s8 %r91,%r135; .loc 1 
99 8 ld.u64 %r103,[%r133]; .loc 1 104 15 ld.u64 %r23,[%r133+16]; .loc 1 104 13 cvt.u32.u64 %r92,%r23; .loc 1 106 53 add.u32 %r136,%r92,-4; and.b32 %r137,%r136,-5; set.u32.eq.u32 %r139,%r137,0; neg.s32 %r140,%r139; .loc 1 106 22 add.u32 %r141,%r92,-1; set.u32.le.u32 %r143,%r141,1; neg.s32 %r144,%r143; .loc 1 106 6 cvt.u16.u32 %r146,%r140; cvt.u16.u32 %r147,%r144; or.b16 %r145,%r146,%r147; cvt.u32.u16 %r148,%r145; cvt.u16.u8 %r149,%r148; setp.eq.u16 %r150,%r149,0; @ %r150 bra $L2; $L5: .loc 1 120 3 setp.gt.s64 %r151,%r91,0; @ %r151 bra $L3; mov.u32 %r87,0; bra $L4; $L2: .loc 1 108 7 setp.eq.u32 %r152,%r92,16; @ %r152 bra $L5; .loc 1 117 5 cvta.const.u64 %r11_gfortran_runtime_error122 16 cvt.u32.u32 %r156,%r22; cvt.s64.s8 %r155,%r156; shl.b64 %r114,%r155,3; add.u64 %r157,%frame,120; mov.u32 %r16111163,[%value_in]; } mov.u64 %r123,40; mov.u64 %r124,0; cvt.u32.u64 %r87,%r124; add.u64 %r268,%frame,360; add.u64 %r267,%frame,240; $L7: add.u64 %r117,%r132,%r123; .loc 1 123 19 ld.u64 %r166,[%r117+16]; add.u64 %r165,%r166,1; ld.u64 %r167,[%r117+8]; sub.u64 %r32,%r165,%r167; .loc 1 123 17 add.u64 %r168,%frame,%r124; st.u64 [%r168],%r32; .loc 1 124 10 setp.gt.s64 %r169,%r32,0; .loc 1 125 19 selp.u32 %r87,%r87,1,%r169; .loc 1 126 18 add.u64 %r171,%r268,%r124; ld.u64 %r172,[%r117]; st.u64 [%r171],%r172; .loc 1 127 18 add.u64 %r174,%r267,%r124; .loc 1 127 20 add.u64 %r175,%r133,%r123; ld.u64 %r177,[%r175]; mul.lo.u64 %r176,%r177,%r23; .loc 1 127 18 st.u64 [%r174],%r176; .loc 1 120 3 add.u64 %r124,%r124,8; add.u64 %r123,%r123,24; setp.ne.u64 %r178,%r114,%r124; @ %r178 bra $L7; $L4: .loc 1 129 14 ld.u64 %r38,[%frame+360]; .loc 1 129 6 setp.ne.u64 %r179,%r38,0; .loc 1 130 16 selp.u64 %r38,%r38,1,%r179; .loc 1 131 6 ld.u64 %r180,[%frame+240]; setp.ne.u64 %r181,%r180,0; @ %r181 bra $L9; .loc 1 132 16 cvt.s64.s32 %r182,%r92; st.u64 [%frame+240],%r182; $L9: .loc 1 134 6 setp.ne.u32 %r183,%r87,0; @ %r183 bra $L34; .loc 1 137 10 ld.u64 %r88,[%r132]; bra $L10; $L34: .loc 1 135 10 mov.u64 %r88,0; $L10: .loc 1 139 10 ld.u64 %r41,[%r131]; mov.u64 %r270,%r41; .loc 1 139 6 setp.eq.u64 %r271,%r41,0; @ %r271 bra $L11; .loc 1 139 33 cvta.global.u64 %r185,_gfortrani_compile_options; .loc 1 139 30 ld.u32 %r186,[%r185+36]; setp.eq.u32 %r187,%r186,0; @ ! %r187 bra $L11; setp.eq.u64 %r269,%r134,0; bra $L12; $L11: .loc 1 144 10 setp.eq.u64 %r269,%r134,0; @ %r269 bra $L13; .loc 1 148 12 ld.u64 %r190,[%r134+56]; add.u64 %r189,%r190,1; .loc 1 148 10 ld.u64 %r191,[%r134+48]; sub.u64 %r89,%r189,%r191; .loc 1 149 7 setp.lt.s64 %r192,%r89,0; @ ! 
%r192 bra $L15; bra $L14; $L13: .loc 1 158 12133; call (%value_in),_gfortrani_count_0,(%out_arg1); ld.param.u64 %r194,[%value_in]; } mov.u64 %r89,%r194; .loc 1 161 14 ld.u64 %r270,[%r131]; $L15: .loc 1 161 10 setp.ne.u64 %r195,%r270,0; @ %r195 bra $L16; .loc 1 164 4 st.u64 [%r131+48],%r270; add.u64 %r197,%r89,-1; st.u64 [%r131+56],%r197; mov.u64 %r198,1; st.u64 [%r131+40],%r198; .loc 1 166 16 st.u64 [%r131+8],%r270; .loc 1 169 21 mov.u64 %r20189201; call (%value_in),_gfortrani_xmallocarrayt.u64 [%r131],%r41; .loc 1 171 7 setp.ne.u64 %r204,%r89,0; @ %r204 bra $L12; bra $L1; $L16: .loc 1 179 17 ld.u64 %r206,[%r131+56]; add.u64 %r205,%r206,1; .loc 1 179 15 ld.u64 %r207,[%r131+48]; sub.u64 %r95,%r205,%r207; .loc 1 180 7 setp.ne.u64 %r208,%r95,%r89; @ %r208 bra $L17; .loc 1 192 8 mov.u64 %r41,%r270; setp.eq.u64 %r269,%r134,0; bra $L12; $L17: .loc 1 181 6 st.u64 [%stack+8],%r95; st.u64 [%stack],%r89;_gfortran_runtime_error12: .loc 1 187 12 ld.u64 %r82,[%r131+40]; .loc 1 188 6 setp.ne.u64 %r211,%r82,0; .loc 1 189 14 selp.u64 %r82,%r82,1,%r211; .loc 1 191 12 ld.u64 %r97,[%frame+240]; .loc 1 194 15 set.u32.ne.u64 %r213,%r103,0; neg.s32 %r214,%r213; .loc 1 194 10 set.u32.ne.u64 %r216,%r88,0; neg.s32 %r217,%r216; .loc 1 194 15 cvt.u16.u32 %r219,%r214; cvt.u16.u32 %r220,%r217; and.b16 %r218,%r219,%r220; .loc 1 194 9 cvt.u32.u16 %r221,%r218; cvt.u16.u8 %r222,%r221; setp.eq.u16 %r223,%r222,0; @ %r223 bra $L35; .loc 1 201 16 shl.b64 %r54,%r82,3; .loc 1 204 12 shl.b64 %r56,%r38,3; ld.u64 %r94,[%frame+120]; .loc 1 208 32 ld.u64 %r115,[%frame]; .loc 1 215 30 mul.lo.u64 %r28,%r115,%r38; .loc 1 216 30 mul.lo.u64 %r224,%r115,%r97; .loc 1 216 16 neg.s64 %r112,%r224; .loc 1 192 8 mov.u64 %r90,%r41; setp.le.s64 %r266,%r91,1; add.u64 %r274,%frame,360; add.u64 %r275,%frame,240; .loc 1 212 20 mov.u64 %r276,0; $L28: .loc 1 197 10 ld.s8 %r226,[%r103]; cvt.u16.u32 %r225,%r226; setp.eq.u16 %r227,%r225,0; @ %r227 bra $L20; .loc 1 200 12 ld.f64 %r52,[%r88]; .loc 1 200 10 st.f64 [%r90],%r52; .loc 1 201 16 add.u64 %r90,%r90,%r54; .loc 1 204 12 add.u64 %r88,%r88,%r56; .loc 1 205 12 add.u64 %r103,%r103,%r97; .loc 1 206 15 add.u64 %r94,%r94,1; .loc 1 208 13 setp.eq.u64 %r228,%r94,%r115; @ %r228 bra $L21; bra $L28; $L20: .loc 1 204 12 add.u64 %r88,%r88,%r56; .loc 1 205 12 add.u64 %r103,%r103,%r97; .loc 1 206 15 add.u64 %r94,%r94,1; .loc 1 208 13 setp.ne.u64 %r229,%r94,%r115; @ %r229 bra $L28; $L21: .loc 1 216 16 add.u64 %r104,%r103,%r112; .loc 1 218 14 @ ! 
%r266 bra $L52; bra $L19; $L27: .loc 1 212 20 st.u64 [%r83],%r276; .loc 1 215 30 mul.lo.u64 %r58,%r64,%r63; .loc 1 216 30 mul.lo.u64 %r232,%r66,%r63; .loc 1 216 16 sub.u64 %r104,%r103,%r232; .loc 1 217 12 add.u64 %r105,%r105,1; .loc 1 218 14 add.u64 %r83,%r83,8; add.u64 %r129,%r129,8; setp.ne.u64 %r233,%r91,%r105; @ %r233 bra $L26; bra $L19; $L52: add.u64 %r83,%frame,128; .loc 1 215 30 mov.u64 %r58,%r28; .loc 1 218 14 mov.u64 %r129,8; .loc 1 217 12 mov.u64 %r105,1; $L26: .loc 1 226 23 ld.u64 %r235,[%r83]; add.u64 %r63,%r235,1; st.u64 [%r83],%r63; .loc 1 227 30 add.u64 %r237,%r274,%r129; ld.u64 %r64,[%r237]; .loc 1 227 20 sub.u64 %r238,%r64,%r58; shl.b64 %r239,%r238,3; add.u64 %r88,%r88,%r239; .loc 1 228 30 add.u64 %r241,%r275,%r129; ld.u64 %r66,[%r241]; .loc 1 228 20 add.u64 %r103,%r104,%r66; .loc 1 208 32 add.u64 %r242,%frame,%r129; ld.u64 %r68,[%r242]; .loc 1 208 13 setp.eq.u64 %r243,%r63,%r68; @ %r243 bra $L27; .loc 1 212 20 mov.u64 %r94,0; bra $L28; $L35: .loc 1 192 8 mov.u64 %r90,%r41; $L19: .loc 1 234 6 @ %r269 bra $L1; .loc 1 236 11 ld.u64 %r246,[%r134+56]; add.u64 %r245,%r246,1; .loc 1 236 9 ld.u64 %r247,[%r134+48]; sub.u64 %r98,%r245,%r247; .loc 1 237 22 sub.u64 %r248,%r90,%r41; shr.s64 %r250,%r248,3; .loc 1 237 13 div.s64 %r99,%r250,%r82; .loc 1 238 10 setp.le.s64 %r251,%r98,%r99; @ %r251 bra $L1; .loc 1 240 20 ld.u64 %r84,[%r134+40]; .loc 1 241 14 setp.eq.u64 %r252,%r84,0; @ %r252 bra $L36; .loc 1 244 47 mul.lo.u64 %r130,%r99,%r84; bra $L30; $L36: mov.u64 %r130,%r99; .loc 1 242 22 mov.u64 %r84,1; $L30: .loc 1 244 36 shl.b64 %r253,%r130,3; .loc 1 244 16 ld.u64 %r254,[%r134]; add.u64 %r102,%r254,%r253; .loc 1 245 13 sub.u64 %r100,%r98,%r99; .loc 1 246 19 add.u64 %r101,%r100,-1; .loc 1 246 17 setp.eq.u64 %r255,%r100,0; @ %r255 bra $L1; .loc 1 249 20 shl.b64 %r79,%r82,3; .loc 1 250 20 shl.b64 %r81,%r84,3; $L31: .loc 1 248 16 ld.f64 %r77,[%r102]; .loc 1 248 14 st.f64 [%r90],%r77; .loc 1 249 20 add.u64 %r90,%r90,%r79; .loc 1 250 20 add.u64 %r102,%r102,%r81; .loc 1 246 19 add.u64 %r101,%r101,-1; .loc 1 246 17 setp.ne.u64 %r256,%r101,-1; @ %r256 bra $L31; bra $L1; $L33: .loc 1 164 4 st.u64 [%r131+48],%r270; mov.u64 %r258,-1; st.u64 [%r131+56],%r258; mov.u64 %r259,1; st.u64 [%r131+40],%r259; .loc 1 166 16 st.u64 [%r131+8],%r270; .loc 1 169 21 mov.u64 %r2627262; call (%value_in),_gfortrani_xmallocarray263,[%value_in]; } .loc 1 169 19 st.u64 [%r131],%r263; bra $L1; $L14: .loc 1 161 10 @ %r271 bra $L33; .loc 1 151 14 mov.u64 %r134,0; .loc 1 152 15 mov.u64 %r89,%r134; bra $L16; $L1: .loc 1 254 pack_r10.o/ pack_c4.o/_gfortrani_pack_c4 .visible .func _gfortrani_pack_c../libgfortran/generated/pack_c4.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_count_0_gfortrani_count_0_gfortrani_xmallocarr_gfortrani_xmallocarray110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,910,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,65,67,75,32,105,110,116,114,105,110,115,105,99,59,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortrani_pack_c4 .visible .func _gfortrani_pack_cpredpred %r180; .reg .predpredpredu64 %r199; .reg .u64pred %r213; .reg .u32 %r215; .reg .u32 %r216; .reg .u32 %r218; .reg .u32 %r219; .reg .u16u16 %r227; .reg .u32 %r228; .reg .pred %r229; .reg .predu64 %r241; .reg .u64 %r243; .reg .u64 %r244; .reg .predpred %r253; .reg 
.predpredmov.u64 %r133,%ar0; mov.u64 %r134,%ar1; mov.u64 %r135,%ar2; mov.u64 %r136,%ar3; .loc 1 97 9 ld.s8 %r22,[%r134+28]; .loc 1 97 7 cvt.u32.u32 %r137,%r22; cvt.s64.s8 %r93,%r137; .loc 1 99 8 ld.u64 %r105,[%r135]; .loc 1 104 15 ld.u64 %r23,[%r135+16]; .loc 1 104 13 cvt.u32.u64 %r94,%r23; .loc 1 106 53 add.u32 %r138,%r94,-4; and.b32 %r139,%r138,-5; set.u32.eq.u32 %r141,%r139,0; neg.s32 %r142,%r141; .loc 1 106 22 add.u32 %r143,%r94,-1; set.u32.le.u32 %r145,%r143,1; neg.s32 %r146,%r145; .loc 1 106 6 cvt.u16.u32 %r148,%r142; cvt.u16.u32 %r149,%r146; or.b16 %r147,%r148,%r149; cvt.u32.u16 %r150,%r147; cvt.u16.u8 %r151,%r150; setp.eq.u16 %r152,%r151,0; @ %r152 bra $L2; $L5: .loc 1 120 3 setp.gt.s64 %r153,%r93,0; @ %r153 bra $L3; mov.u32 %r89,0; bra $L4; $L2: .loc 1 108 7 setp.eq.u32 %r154,%r94,16; @ %r154 bra $L5; .loc 1 117 5 cvta.const.u64 %r11_gfortran_runtime_error122 16 cvt.u32.u32 %r158,%r22; cvt.s64.s8 %r157,%r158; shl.b64 %r116,%r157,3; add.u64 %r159,%frame,120; mov.u32 %r163116165,[%value_in]; } mov.u64 %r125,40; mov.u64 %r126,0; cvt.u32.u64 %r89,%r126; add.u64 %r270,%frame,360; add.u64 %r269,%frame,240; $L7: add.u64 %r119,%r134,%r125; .loc 1 123 19 ld.u64 %r168,[%r119+16]; add.u64 %r167,%r168,1; ld.u64 %r169,[%r119+8]; sub.u64 %r32,%r167,%r169; .loc 1 123 17 add.u64 %r170,%frame,%r126; st.u64 [%r170],%r32; .loc 1 124 10 setp.gt.s64 %r171,%r32,0; .loc 1 125 19 selp.u32 %r89,%r89,1,%r171; .loc 1 126 18 add.u64 %r173,%r270,%r126; ld.u64 %r174,[%r119]; st.u64 [%r173],%r174; .loc 1 127 18 add.u64 %r176,%r269,%r126; .loc 1 127 20 add.u64 %r177,%r135,%r125; ld.u64 %r179,[%r177]; mul.lo.u64 %r178,%r179,%r23; .loc 1 127 18 st.u64 [%r176],%r178; .loc 1 120 3 add.u64 %r126,%r126,8; add.u64 %r125,%r125,24; setp.ne.u64 %r180,%r116,%r126; @ %r180 bra $L7; $L4: .loc 1 129 14 ld.u64 %r38,[%frame+360]; .loc 1 129 6 setp.ne.u64 %r181,%r38,0; .loc 1 130 16 selp.u64 %r38,%r38,1,%r181; .loc 1 131 6 ld.u64 %r182,[%frame+240]; setp.ne.u64 %r183,%r182,0; @ %r183 bra $L9; .loc 1 132 16 cvt.s64.s32 %r184,%r94; st.u64 [%frame+240],%r184; $L9: .loc 1 134 6 setp.ne.u32 %r185,%r89,0; @ %r185 bra $L34; .loc 1 137 10 ld.u64 %r90,[%r134]; bra $L10; $L34: .loc 1 135 10 mov.u64 %r90,0; $L10: .loc 1 139 10 ld.u64 %r41,[%r133]; mov.u64 %r272,%r41; .loc 1 139 6 setp.eq.u64 %r273,%r41,0; @ %r273 bra $L11; .loc 1 139 33 cvta.global.u64 %r187,_gfortrani_compile_options; .loc 1 139 30 ld.u32 %r188,[%r187+36]; setp.eq.u32 %r189,%r188,0; @ ! %r189 bra $L11; setp.eq.u64 %r271,%r136,0; bra $L12; $L11: .loc 1 144 10 setp.eq.u64 %r271,%r136,0; @ %r271 bra $L13; .loc 1 148 12 ld.u64 %r192,[%r136+56]; add.u64 %r191,%r192,1; .loc 1 148 10 ld.u64 %r193,[%r136+48]; sub.u64 %r91,%r191,%r193; .loc 1 149 7 setp.lt.s64 %r194,%r91,0; @ ! 
%r194 bra $L15; bra $L14; $L13: .loc 1 158 12135; call (%value_in),_gfortrani_count_0,(%out_arg1); ld.param.u64 %r196,[%value_in]; } mov.u64 %r91,%r196; .loc 1 161 14 ld.u64 %r272,[%r133]; $L15: .loc 1 161 10 setp.ne.u64 %r197,%r272,0; @ %r197 bra $L16; .loc 1 164 4 st.u64 [%r133+48],%r272; add.u64 %r199,%r91,-1; st.u64 [%r133+56],%r199; mov.u64 %r200,1; st.u64 [%r133+40],%r200; .loc 1 166 16 st.u64 [%r133+8],%r272; .loc 1 169 21 mov.u64 %r20391203; call (%value_in),_gfortrani_xmallocarrayt.u64 [%r133],%r41; .loc 1 171 7 setp.ne.u64 %r206,%r91,0; @ %r206 bra $L12; bra $L1; $L16: .loc 1 179 17 ld.u64 %r208,[%r133+56]; add.u64 %r207,%r208,1; .loc 1 179 15 ld.u64 %r209,[%r133+48]; sub.u64 %r97,%r207,%r209; .loc 1 180 7 setp.ne.u64 %r210,%r97,%r91; @ %r210 bra $L17; .loc 1 192 8 mov.u64 %r41,%r272; setp.eq.u64 %r271,%r136,0; bra $L12; $L17: .loc 1 181 6 st.u64 [%stack+8],%r97; st.u64 [%stack],%r91;_gfortran_runtime_error12: .loc 1 187 12 ld.u64 %r84,[%r133+40]; .loc 1 188 6 setp.ne.u64 %r213,%r84,0; .loc 1 189 14 selp.u64 %r84,%r84,1,%r213; .loc 1 191 12 ld.u64 %r99,[%frame+240]; .loc 1 194 15 set.u32.ne.u64 %r215,%r105,0; neg.s32 %r216,%r215; .loc 1 194 10 set.u32.ne.u64 %r218,%r90,0; neg.s32 %r219,%r218; .loc 1 194 15 cvt.u16.u32 %r221,%r216; cvt.u16.u32 %r222,%r219; and.b16 %r220,%r221,%r222; .loc 1 194 9 cvt.u32.u16 %r223,%r220; cvt.u16.u8 %r224,%r223; setp.eq.u16 %r225,%r224,0; @ %r225 bra $L35; .loc 1 201 16 shl.b64 %r55,%r84,3; .loc 1 204 12 shl.b64 %r57,%r38,3; ld.u64 %r96,[%frame+120]; .loc 1 208 32 ld.u64 %r117,[%frame]; .loc 1 215 30 mul.lo.u64 %r28,%r117,%r38; .loc 1 216 30 mul.lo.u64 %r226,%r117,%r99; .loc 1 216 16 neg.s64 %r114,%r226; .loc 1 192 8 mov.u64 %r92,%r41; setp.le.s64 %r268,%r93,1; add.u64 %r276,%frame,360; add.u64 %r277,%frame,240; .loc 1 212 20 mov.u64 %r278,0; $L28: .loc 1 197 10 ld.s8 %r228,[%r105]; cvt.u16.u32 %r227,%r228; setp.eq.u16 %r229,%r227,0; @ %r229 bra $L20; .loc 1 200 12 ld.f32 %r52,[%r90]; ld.f32 %r53,[%r90+4]; .loc 1 200 10 st.f32 [%r92],%r52; st.f32 [%r92+4],%r53; .loc 1 201 16 add.u64 %r92,%r92,%r55; .loc 1 204 12 add.u64 %r90,%r90,%r57; .loc 1 205 12 add.u64 %r105,%r105,%r99; .loc 1 206 15 add.u64 %r96,%r96,1; .loc 1 208 13 setp.eq.u64 %r230,%r96,%r117; @ %r230 bra $L21; bra $L28; $L20: .loc 1 204 12 add.u64 %r90,%r90,%r57; .loc 1 205 12 add.u64 %r105,%r105,%r99; .loc 1 206 15 add.u64 %r96,%r96,1; .loc 1 208 13 setp.ne.u64 %r231,%r96,%r117; @ %r231 bra $L28; $L21: .loc 1 216 16 add.u64 %r106,%r105,%r114; .loc 1 218 14 @ ! 
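For orientation, here is a minimal C sketch of the operation all of these per-kind routines encode, reduced to the rank-1 contiguous case. The function name pack_flat and its descriptor-free signature are invented for illustration only; the real libgfortran routines operate on array descriptors, handle arbitrary rank and strides, and report errors through _gfortran_runtime_error.

#include <stddef.h>
#include <stdlib.h>
#include <string.h>

/* Sketch only: PACK(ARRAY, MASK [, VECTOR]) for a flat, contiguous array.
   'mask' holds one byte per element; when 'vector' is non-NULL, 'vector_len'
   must be at least the number of selected elements (as Fortran requires). */
static void *pack_flat(const void *array, const unsigned char *mask, size_t n,
                       const void *vector, size_t vector_len,
                       size_t elem_size, size_t *out_len)
{
    /* First pass: count selected elements (the job _gfortrani_count_0 does
       for descriptor-based arrays). */
    size_t count = 0;
    for (size_t i = 0; i < n; i++)
        if (mask[i])
            count++;

    size_t total = vector ? vector_len : count;
    char *ret = malloc(total ? total * elem_size : 1);  /* xmallocarray-like */
    if (ret == NULL)
        return NULL;

    /* Second pass: mask-driven copy of the selected elements. */
    char *dst = ret;
    const char *src = (const char *)array;
    for (size_t i = 0; i < n; i++, src += elem_size)
        if (mask[i]) {
            memcpy(dst, src, elem_size);
            dst += elem_size;
        }

    /* Pad the tail of the result from VECTOR, as the final loop of each
       routine above does. */
    if (vector && total > count)
        memcpy(dst, (const char *)vector + count * elem_size,
               (total - count) * elem_size);

    *out_len = total;
    return ret;
}

A caller would free() the returned buffer; the kind-specific variants in this archive differ only in the element size and in whether elements are moved as u16, u32, u64, f32, f64 or pairs of those.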
[End of _gfortrani_pack_c4, followed by archive member pack_c8.o: _gfortrani_pack_c8, compiled from ../libgfortran/generated/pack_c8.c, with the same externals and overall structure as the routines above; it copies COMPLEX(8) elements as pairs of f64 loads and stores. Its raw PTX continues below.]
%r268 bra $L52; bra $L19; $L27: .loc 1 212 20 st.u64 [%r85],%r278; .loc 1 215 30 mul.lo.u64 %r59,%r65,%r64; .loc 1 216 30 mul.lo.u64 %r234,%r67,%r64; .loc 1 216 16 sub.u64 %r106,%r105,%r234; .loc 1 217 12 add.u64 %r107,%r107,1; .loc 1 218 14 add.u64 %r85,%r85,8; add.u64 %r131,%r131,8; setp.ne.u64 %r235,%r93,%r107; @ %r235 bra $L26; bra $L19; $L52: add.u64 %r85,%frame,128; .loc 1 215 30 mov.u64 %r59,%r28; .loc 1 218 14 mov.u64 %r131,8; .loc 1 217 12 mov.u64 %r107,1; $L26: .loc 1 226 23 ld.u64 %r237,[%r85]; add.u64 %r64,%r237,1; st.u64 [%r85],%r64; .loc 1 227 30 add.u64 %r239,%r276,%r131; ld.u64 %r65,[%r239]; .loc 1 227 20 sub.u64 %r240,%r65,%r59; shl.b64 %r241,%r240,4; add.u64 %r90,%r90,%r241; .loc 1 228 30 add.u64 %r243,%r277,%r131; ld.u64 %r67,[%r243]; .loc 1 228 20 add.u64 %r105,%r106,%r67; .loc 1 208 32 add.u64 %r244,%frame,%r131; ld.u64 %r69,[%r244]; .loc 1 208 13 setp.eq.u64 %r245,%r64,%r69; @ %r245 bra $L27; .loc 1 212 20 mov.u64 %r96,0; bra $L28; $L35: .loc 1 192 8 mov.u64 %r92,%r41; $L19: .loc 1 234 6 @ %r271 bra $L1; .loc 1 236 11 ld.u64 %r248,[%r136+56]; add.u64 %r247,%r248,1; .loc 1 236 9 ld.u64 %r249,[%r136+48]; sub.u64 %r100,%r247,%r249; .loc 1 237 22 sub.u64 %r250,%r92,%r41; shr.s64 %r252,%r250,4; .loc 1 237 13 div.s64 %r101,%r252,%r84; .loc 1 238 10 setp.le.s64 %r253,%r100,%r101; @ %r253 bra $L1; .loc 1 240 20 ld.u64 %r86,[%r136+40]; .loc 1 241 14 setp.eq.u64 %r254,%r86,0; @ %r254 bra $L36; .loc 1 244 47 mul.lo.u64 %r132,%r101,%r86; bra $L30; $L36: mov.u64 %r132,%r101; .loc 1 242 22 mov.u64 %r86,1; $L30: .loc 1 244 36 shl.b64 %r255,%r132,4; .loc 1 244 16 ld.u64 %r256,[%r136]; add.u64 %r104,%r256,%r255; .loc 1 245 13 sub.u64 %r102,%r100,%r101; .loc 1 246 19 add.u64 %r103,%r102,-1; .loc 1 246 17 setp.eq.u64 %r257,%r102,0; @ %r257 bra $L1; .loc 1 249 20 shl.b64 %r81,%r84,4; .loc 1 250 20 shl.b64 %r83,%r86,4; $L31: .loc 1 248 16 ld.f64 %r78,[%r104]; ld.f64 %r79,[%r104+8]; .loc 1 248 14 st.f64 [%r92],%r78; st.f64 [%r92+8],%r79; .loc 1 249 20 add.u64 %r92,%r92,%r81; .loc 1 250 20 add.u64 %r104,%r104,%r83; .loc 1 246 19 add.u64 %r103,%r103,-1; .loc 1 246 17 setp.ne.u64 %r258,%r103,-1; @ %r258 bra $L31; bra $L1; $L33: .loc 1 164 4 st.u64 [%r133+48],%r272; mov.u64 %r260,-1; st.u64 [%r133+56],%r260; mov.u64 %r261,1; st.u64 [%r133+40],%r261; .loc 1 166 16 st.u64 [%r133+8],%r272; .loc 1 169 21 mov.u64 %r264,16call (%value_in),_gfortrani_xmallocarray265,[%value_in]; } .loc 1 169 19 st.u64 [%r133],%r265; bra $L1; $L14: .loc 1 161 10 @ %r273 bra $L33; .loc 1 151 14 mov.u64 %r136,0; .loc 1 152 15 mov.u64 %r91,%r136; bra $L16; $L1: .loc 1 254pack_c10.o/ unpack_i1.o/_gfortrani_unpack0_i1 .visible .func _gfortrani_unpack0_../libgfortran/generated/unpack_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_unpack1_i1 .visible .func _gfortrani_unpack1_i1gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortrani_unpack0_i1 .visible .func _gfortrani_unpack0_0predpredpred %r202; .reg .u64 %r205; .reg .u32 %r209; .reg .u64 %r211; .reg .u64 %r213; .reg .u64u16 %r224; .reg .u16u64 %r235; .reg .pred %r236; .reg .u64 %r237; .reg .predpredpredpred %r253; .reg .u64 %r255; .reg .u64 %r257; .reg .u64 %r258; .reg .u64 %r260; .reg .u64 %r261; .reg .predmov.u64 %r130,%ar0; mov.u64 %r131,%ar1; mov.u64 %r132,%ar2; mov.u64 %r133,%ar3; .loc 1 46 23 ld.s8 %r84,[%r133]; .loc 1 62 8 ld.u64 %r95,[%r132]; .loc 1 67 15 ld.u64 %r22,[%r132+16]; .loc 1 69 22 cvt.u32.u64 %r23,%r22; .loc 1 69 53 add.u32 
%r134,%r23,-4; and.b32 %r135,%r134,-5; set.u32.eq.u32 %r137,%r135,0; neg.s32 %r138,%r137; .loc 1 69 22 add.u32 %r139,%r23,-1; set.u32.le.u32 %r141,%r139,1; neg.s32 %r142,%r141; .loc 1 69 6 cvt.u16.u32 %r144,%r138; cvt.u16.u32 %r145,%r142; or.b16 %r143,%r144,%r145;2; .loc 1 71 7 setp.eq.u32 %r150,%r23,16; @ %r150 bra $L2; .loc 1 80 5 cvta.const.u64 %r11_gfortran_runtime_errorr94,[%r130]; .loc 1 82 6 setp.ne.u64 %r153,%r94,0; @ %r153 bra $L3; .loc 1 86 11 ld.s8 %r80,[%r132+28]; .loc 1 88 7 setp.le.s64 %r155,%r80,0; @ %r155 bra $L4; .loc 1 90 13 shl.b64 %r157,%r80,3; add.u64 %r158,%frame,120; cvt.u32.u64 %r162,%r94157164,[%value_in]; } add.u64 %r117,%r130,40; add.u64 %r115,%r132,40; mov.u64 %r113,%frame; add.u64 %r112,%frame,360; add.u64 %r111,%frame,240; add.u64 %r167,%r80,%r80; add.u64 %r168,%r167,%r80; shl.b64 %r169,%r168,3; add.u64 %r98,%r169,%r117; .loc 1 60 9 cvt.u32.u64 %r38,%r94; .loc 1 87 10 mov.u64 %r88,1; .loc 1 91 4 mov.u64 %r171,%r94; $L5: st.u64 [%r117+8],%r171; ld.u64 %r173,[%r115+16]; add.u64 %r172,%r173,1; ld.u64 %r174,[%r115+8]; sub.u64 %r33,%r172,%r174; add.u64 %r175,%r33,-1; st.u64 [%r117+16],%r175; st.u64 [%r117],%r88; .loc 1 93 14 st.u64 [%r113],%r33; .loc 1 94 18 set.u32.le.s64 %r177,%r33,0; neg.s32 %r178,%r177; cvt.u16.u32 %r181,%r38; and.b16 %r180,%r181,1; cvt.u16.u32 %r184,%r178; or.b16 %r183,%r184,%r180; cvt.u32.u16 %r186,%r183; cvt.u32.u8 %r38,%r186; .loc 1 95 15 st.u64 [%r112],%r88; .loc 1 96 17 ld.u64 %r188,[%r115]; mul.lo.u64 %r187,%r188,%r22; .loc 1 96 15 st.u64 [%r111],%r187; .loc 1 97 7 mul.lo.u64 %r88,%r88,%r33; .loc 1 88 7 add.u64 %r117,%r117,24; add.u64 %r115,%r115,24; add.u64 %r113,%r113,8;8; setp.ne.u64 %r189,%r98,%r117; @ %r189 bra $L5; bra $L31; $L4: .loc 1 99 19 st.u64 [%r130+8],%r94; .loc 1 100 24 mov.u64 %r19219192; call (%value_in),_gfortrani_xmallocarray193,[%value_in]; } mov.u64 %r94,%r193; .loc 1 100 22 st.u64 [%r130],%r94; .loc 1 128 12 ld.u64 %r129,[%frame+360]; bra $L7; $L31: .loc 1 99 19 st.u64 [%r130+8],%r171; .loc 1 100 24 mov.u64 %r197r197; call (%value_in),_gfortrani_xmallocarray198,[%value_in]; } .loc 1 100 22 st.u64 [%r130],%r198; bra $L8; $L3: .loc 1 104 11 ld.s8 %r80,[%r130+28]; .loc 1 106 18 mov.u64 %r201,1; st.u64 [%frame+360],%r201; .loc 1 107 7 setp.le.s64 %r202,%r80,0; @ %r202 bra $L20; .loc 1 109 13 shl.b64 %r127,%r80,3; add.u64 %r205,%frame,120; mov.u32 %r209111,[%value_in]; } mov.u64 %r92,40; mov.u64 %r97,0; .loc 1 60 9 cvt.u32.u64 %r38,%r97; add.u64 %r263,%frame,360; add.u64 %r265,%frame,240; $L9: add.u64 %r82,%r130,%r92; .loc 1 110 16 ld.u64 %r214,[%r82+16]; add.u64 %r213,%r214,1; ld.u64 %r215,[%r82+8]; sub.u64 %r49,%r213,%r215; .loc 1 110 14 add.u64 %r216,%frame,%r97; st.u64 [%r216],%r49; .loc 1 111 18 set.u32.le.s64 %r218,%r49,0; neg.s32 %r219,%r218; cvt.u16.u32 %r222,%r38; and.b16 %r221,%r222,1; cvt.u16.u32 %r225,%r219; or.b16 %r224,%r225,%r221; cvt.u32.u16 %r227,%r224; cvt.u32.u8 %r38,%r227; .loc 1 112 15 add.u64 %r229,%r263,%r97; ld.u64 %r230,[%r82]; st.u64 [%r229],%r230; .loc 1 113 15 add.u64 %r232,%r265,%r97; .loc 1 113 17 add.u64 %r233,%r132,%r92; ld.u64 %r235,[%r233]; mul.lo.u64 %r234,%r235,%r22; .loc 1 113 15 st.u64 [%r232],%r234; .loc 1 107 7 add.u64 %r97,%r97,8; add.u64 %r92,%r92,24; setp.ne.u64 %r236,%r97,%r127; @ %r236 bra $L9; .loc 1 115 10 ld.u64 %r237,[%frame+360].loc 1 116 13 mov.u64 %r239,1; st.u64 [%frame+360],%r239; $L8: .loc 1 119 6 setp.ne.u32 %r240,%r38,0; @ %r240 bra $L1; .loc 1 128 12 ld.u64 %r129,[%frame+360]; .loc 1 130 8 ld.u64 %r94,[%r130]; bra $L7; $L20: .loc 1 107 7 mov.u64 %r129,%r201; $L7: 
.loc 1 122 14 ld.u64 %r57,[%frame+240]; .loc 1 122 6 setp.ne.u64 %r241,%r57,0; .loc 1 123 16 selp.u64 %r57,%r57,1,%r241; .loc 1 125 12 ld.u64 %r75,[%r131+40]; .loc 1 126 6 setp.ne.u64 %r242,%r75,0; .loc 1 127 14 selp.u64 %r75,%r75,1,%r242; .loc 1 131 8 ld.u64 %r76,[%r131]; .loc 1 133 9 setp.eq.u64 %r243,%r94,0; @ %r243 bra $L1; ld.u64 %r27,[%frame+120]; .loc 1 151 32 ld.u64 %r96,[%frame]; .loc 1 158 30 mul.lo.u64 %r28,%r96,%r129; .loc 1 159 30 mul.lo.u64 %r244,%r96,%r57; .loc 1 159 16 neg.s64 %r99,%r244; shl.b64 %r118,%r80,3; setp.le.s64 %r264,%r80,1; add.u64 %r270,%frame,360; add.u64 %r271,%frame,240; .loc 1 155 20 mov.u64 %r272,0; $L19: .loc 1 135 10 ld.s8 %r246,[%r95]; cvt.u16.u32 %r245,%r246; setp.eq.u16 %r247,%r245,0; @ %r247 bra $L21; .loc 1 138 12 ld.s8 %r59,[%r76]; .loc 1 139 9 add.u64 %r76,%r76,%r75; bra $L13; $L21: .loc 1 144 10 mov.u32 %r59,%r84; $L13: cvt.u32.u32 %r248,%r59; st.u8 [%r94],%r248; .loc 1 147 12 add.u64 %r94,%r94,%r129; .loc 1 148 12 add.u64 %r95,%r95,%r57; .loc 1 149 15 add.u64 %r27,%r27,1; .loc 1 151 13 setp.ne.u64 %r249,%r27,%r96; @ %r249 bra $L19; .loc 1 159 16 add.u64 %r93,%r95,%r99; .loc 1 161 14 @ ! %r264 bra $L32; bra $L1; $L18: .loc 1 155 20 st.u64 [%r87],%r272; .loc 1 158 30 mul.lo.u64 %r63,%r70,%r69; .loc 1 159 30 mul.lo.u64 %r252,%r72,%r69; .loc 1 159 16 sub.u64 %r93,%r95,%r252; .loc 1 161 14 add.u64 %r87,%r87,8; add.u64 %r124,%r124,8; setp.ne.u64 %r253,%r118,%r124; @ %r253 bra $L17; bra $L1; $L32: add.u64 %r87,%frame,128; .loc 1 158 30 mov.u64 %r63,%r28; .loc 1 161 14 mov.u64 %r124,8; $L17: .loc 1 169 23 ld.u64 %r255,[%r87]; add.u64 %r69,%r255,1; st.u64 [%r87],%r69; .loc 1 170 30 add.u64 %r257,%r270,%r124; ld.u64 %r70,[%r257]; .loc 1 170 20 sub.u64 %r258,%r70,%r63; add.u64 %r94,%r94,%r258; .loc 1 171 30 add.u64 %r260,%r271,%r124; ld.u64 %r72,[%r260]; .loc 1 171 20 add.u64 %r95,%r93,%r72; .loc 1 151 32 add.u64 %r261,%frame,%r124; ld.u64 %r74,[%r261]; .loc 1 151 13 setp.eq.u64 %r262,%r69,%r74; @ %r262 bra $L18; .loc 1 155 20 mov.u64 %r27,0; bra $L19; $L1: .loc 1 1_gfortrani_unpack1_i1 .visible .func _gfortrani_unpack1_608pred %r168; .reg .u64 %r171; .reg .u32 %r175; .reg .u64predu64 %r215; .reg .u64 %r218; .reg .pred %r219; .reg .u64 %r222; .reg .u32 %r226; .reg .u64 %r228; .reg .u64 %r230; .reg .u64 %r231; .reg .u64u32 %r236; .reg .u16u32predpredpredpred %r275; .reg .u64 %r277; .reg .u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64predmov.u64 %r143,%ar0; mov.u64 %r144,%ar1; mov.u64 %r145,%ar2; mov.u64 %r146,%ar3; .loc 1 208 8 ld.u64 %r100,[%r145]; .loc 1 213 15 ld.u64 %r22,[%r145+16]; .loc 1 215 22 cvt.u32.u64 %r23,%r22; .loc 1 215 53 add.u32 %r147,%r23,-4; and.b32 %r148,%r147,-5; set.u32.eq.u32 %r150,%r148,0; neg.s32 %r151,%r150; .loc 1 215 22 add.u32 %r152,%r23,-1; set.u32.le.u32 %r154,%r152,1; neg.s32 %r155,%r154; .loc 1 215 6 cvt.u16.u3234; .loc 1 217 7 setp.eq.u32 %r163,%r23,16; @ %r163 bra $L34; .loc 1 226 5 cvta.const.u64 %r164164_gfortran_runtime_error34: .loc 1 228 10 ld.u64 %r98,[%r143]; .loc 1 228 6 setp.ne.u64 %r166,%r98,0; @ %r166 bra $L35; .loc 1 232 11 ld.s8 %r85,[%r145+28]; .loc 1 234 7 setp.le.s64 %r168,%r85,0; @ %r168 bra $L36; .loc 1 236 13 shl.b64 %r138,%r85,3; add.u64 %r171,%frame,120; cvt.u32.u64 %r175,%r98138177,[%value_in]; } mov.u64 %r128,40; mov.u64 %r129,%r98; .loc 1 206 9 cvt.u32.u64 %r39,%r129; .loc 1 233 10 mov.u64 %r91,1; add.u64 %r289,%frame,480; add.u64 %r287,%frame,360; add.u64 %r290,%frame,240; .loc 1 237 4 mov.u64 %r179,%r129; $L37: add.u64 %r127,%r143,%r128; st.u64 [%r127+8],%r179; add.u64 %r126,%r145,%r128; 
ld.u64 %r181,[%r126+16]; add.u64 %r180,%r181,1; ld.u64 %r182,[%r126+8]; sub.u64 %r34,%r180,%r182; add.u64 %r183,%r34,-1; st.u64 [%r127+16],%r183; st.u64 [%r127],%r91; .loc 1 239 14 add.u64 %r184,%frame,%r129; st.u64 [%r184],%r34; .loc 1 240 18 set.u32.le.s64 %r186,%r34,0; neg.s32 %r187,%r186; cvt.u16.u32 %r190,%r39; and.b16 %r189,%r190,1; cvt.u16.u32 %r193,%r187; or.b16 %r192,%r193,%r189; cvt.u32.u16 %r195,%r192; cvt.u32.u8 %r39,%r195; .loc 1 241 15 add.u64 %r197,%r289,%r129; st.u64 [%r197],%r91; .loc 1 242 15 add.u64 %r199,%r287,%r129; .loc 1 242 17 add.u64 %r200,%r146,%r128; .loc 1 242 15 ld.u64 %r201,[%r200]; st.u64 [%r199],%r201; .loc 1 243 15 add.u64 %r203,%r290,%r129; .loc 1 243 17 ld.u64 %r205,[%r126]; mul.lo.u64 %r204,%r205,%r22; .loc 1 243 15 st.u64 [%r203],%r204; .loc 1 244 7 mul.lo.u64 %r91,%r91,%r34; .loc 1 234 7 add.u64 %r129,%r129,8; add.u64 %r128,%r128,24; setp.ne.u64 %r206,%r129,%r138; @ %r206 bra $L37; bra $L64; $L36: .loc 1 246 19 st.u64 [%r143+8],%r98; .loc 1 247 24 mov.u64 %r209209; call (%value_in),_gfortrani_xmallocarray210,[%value_in]; } mov.u64 %r98,%r210; .loc 1 247 22 st.u64 [%r143],%r98; .loc 1 278 12 ld.u64 %r142,[%frame+480]; bra $L39; $L64: .loc 1 246 19 st.u64 [%r143+8],%r179; .loc 1 247 24 mov.u64 %r21491214; call (%value_in),_gfortrani_xmallocarray215,[%value_in]; } .loc 1 247 22 st.u64 [%r143],%r215; bra $L40; $L35: .loc 1 251 11 ld.s8 %r85,[%r143+28]; .loc 1 253 18 mov.u64 %r218,1; st.u64 [%frame+480],%r218; .loc 1 254 7 setp.le.s64 %r219,%r85,0; @ %r219 bra $L54; .loc 1 256 13 shl.b64 %r140,%r85,3; add.u64 %r222,%frame,120; mov.u32 %r226add.u64 %r120,%r143,40; add.u64 %r118,%r146,40; add.u64 %r116,%r145,40; mov.u64 %r113,0; .loc 1 206 9 cvt.u32.u64 %r39,%r113; add.u64 %r289,%frame,480; add.u64 %r287,%frame,360; add.u64 %r290,%frame,240; $L41: .loc 1 257 16 ld.u64 %r231,[%r120+16]; add.u64 %r230,%r231,1; ld.u64 %r232,[%r120+8]; sub.u64 %r51,%r230,%r232; .loc 1 257 14 add.u64 %r233,%frame,%r113; st.u64 [%r233],%r51; .loc 1 258 18 set.u32.le.s64 %r235,%r51,0; neg.s32 %r236,%r235; cvt.u16.u32 %r239,%r39; and.b16 %r238,%r239,1; cvt.u16.u32 %r242,%r236; or.b16 %r241,%r242,%r238; cvt.u32.u16 %r244,%r241; cvt.u32.u8 %r39,%r244; .loc 1 259 15 add.u64 %r246,%r289,%r113; ld.u64 %r247,[%r120]; st.u64 [%r246],%r247; .loc 1 260 15 add.u64 %r249,%r287,%r113; ld.u64 %r250,[%r118]; st.u64 [%r249],%r250; .loc 1 261 15 add.u64 %r252,%r290,%r113; .loc 1 261 17 ld.u64 %r254,[%r116]; mul.lo.u64 %r253,%r254,%r22; .loc 1 261 15 st.u64 [%r252],%r253; .loc 1 254 7 add.u64 %r120,%r120,24; add.u64 %r118,%r118,24; add.u64 %r116,%r116,24; add.u64 %r113,%r113,8; setp.ne.u64 %r255,%r113,%r140; @ %r255 bra $L41; .loc 1 263 10 ld.u64 %r256,[%frame+480]; setp.ne.u64 %r257,%r256,0; @ %r257 bra $L40; .loc 1 264 13 mov.u64 %r258,1; st.u64 [%frame+480],%r258; $L40: .loc 1 267 6 setp.ne.u32 %r259,%r39,0; @ %r259 bra $L33; .loc 1 278 12 ld.u64 %r142,[%frame+480]; .loc 1 281 8 ld.u64 %r98,[%r143]; bra $L39; $L54: .loc 1 254 7 mov.u64 %r142,%r218; $L39: .loc 1 270 14 ld.u64 %r60,[%frame+360]; .loc 1 270 6 setp.ne.u64 %r260,%r60,0; .loc 1 271 16 selp.u64 %r60,%r60,1,%r260; .loc 1 272 14 ld.u64 %r61,[%frame+240]; .loc 1 272 6 setp.ne.u64 %r261,%r61,0; .loc 1 273 16 selp.u64 %r61,%r61,1,%r261; .loc 1 275 12 ld.u64 %r82,[%r144+40]; .loc 1 276 6 setp.ne.u64 %r262,%r82,0; .loc 1 277 14 selp.u64 %r82,%r82,1,%r262; .loc 1 282 8 ld.u64 %r99,[%r146]; .loc 1 283 8 ld.u64 %r83,[%r144]; .loc 1 285 9 setp.eq.u64 %r263,%r98,0; @ %r263 bra $L33; ld.u64 %r29,[%frame+120]; .loc 1 304 32 ld.u64 %r104,[%frame]; .loc 
1 311 30 mul.lo.u64 %r28,%r104,%r142; .loc 1 312 30 mul.lo.u64 %r264,%r104,%r60; .loc 1 312 16 neg.s64 %r110,%r264; .loc 1 313 30 mul.lo.u64 %r265,%r104,%r61; .loc 1 313 16 neg.s64 %r111,%r265; shl.b64 %r130,%r85,3; setp.le.s64 %r288,%r85,1; add.u64 %r295,%frame,480; add.u64 %r296,%frame,360; add.u64 %r297,%frame,240; .loc 1 308 20 mov.u64 %r298,0; $L53: .loc 1 287 10 ld.s8 %r267,[%r100]; cvt.u16.u32 %r266,%r267; setp.eq.u16 %r268,%r266,0; @ %r268 bra $L46; .loc 1 290 12 ld.s8 %r63,[%r83]; .loc 1 291 16 add.u64 %r83,%r83,%r82; bra $L47; $L46: .loc 1 296 12 ld.s8 %r63,[%r99]; $L47: cvt.u32.u32 %r269,%r63; st.u8 [%r98],%r269; .loc 1 299 12 add.u64 %r98,%r98,%r142; .loc 1 300 12 add.u64 %r99,%r99,%r60; .loc 1 301 12 add.u64 %r100,%r100,%r61; .loc 1 302 15 add.u64 %r29,%r29,1; .loc 1 304 13 setp.ne.u64 %r270,%r29,%r104; @ %r270 bra $L53; .loc 1 312 16 add.u64 %r96,%r99,%r110; .loc 1 313 16 add.u64 %r97,%r100,%r111; .loc 1 315 14 @ ! %r288 bra $L65; bra $L33; $L52: .loc 1 308 20 st.u64 [%r90],%r298; .loc 1 311 30 mul.lo.u64 %r66,%r75,%r74; .loc 1 312 30 mul.lo.u64 %r273,%r77,%r74; .loc 1 312 16 sub.u64 %r96,%r99,%r273; .loc 1 313 30 mul.lo.u64 %r274,%r79,%r74; .loc 1 313 16 sub.u64 %r97,%r100,%r274; .loc 1 315 14 add.u64 %r90,%r90,8; add.u64 %r137,%r137,8; setp.ne.u64 %r275,%r130,%r137; @ %r275 bra $L51; bra $L33; $L65: add.u64 %r90,%frame,128; .loc 1 311 30 mov.u64 %r66,%r28; .loc 1 315 14 mov.u64 %r137,8; $L51: .loc 1 323 23 ld.u64 %r277,[%r90]; add.u64 %r74,%r277,1; st.u64 [%r90],%r74; .loc 1 324 30 add.u64 %r279,%r295,%r137; ld.u64 %r75,[%r279]; .loc 1 324 20 sub.u64 %r280,%r75,%r66; add.u64 %r98,%r98,%r280; .loc 1 325 30 add.u64 %r282,%r296,%r137; ld.u64 %r77,[%r282]; .loc 1 325 20 add.u64 %r99,%r96,%r77; .loc 1 326 30 add.u64 %r284,%r297,%r137; ld.u64 %r79,[%r284]; .loc 1 326 20 add.u64 %r100,%r97,%r79; .loc 1 304 32 add.u64 %r285,%frame,%r137; ld.u64 %r81,[%r285]; .loc 1 304 13 setp.eq.u64 %r286,%r74,%r81; @ %r286 bra $L52; .loc 1 308 20 mov.u64 %r29,0; bra $L53; $L33: .loc 1 330 unpack_i2.o/_gfortrani_unpack0_i2 .visible .func _gfortrani_unpack0_../libgfortran/generated/unpack_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_unpack1_i2 .visible .func _gfortrani_unpack1_i2gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortrani_unpack0_i2 .visible .func _gfortrani_unpack0_0predpredpred %r205; .reg .u64 %r208; .reg .u32 %r212; .reg .u64u16 %r225; .reg .u16 %r227; .reg .u16u64 %r237; .reg .u64 %r238; .reg .pred %r239; .reg .u64 %r240; .reg .pred %r241; .reg .u64 %r242; .reg .predu32 %r251; .reg .predmov.u64 %r133,%ar0; mov.u64 %r134,%ar1; mov.u64 %r135,%ar2; mov.u64 %r136,%ar3; .loc 1 46 23 ld.s16 %r86,[%r136]; .loc 1 62 8 ld.u64 %r97,[%r135]; .loc 1 67 15 ld.u64 %r22,[%r135+16]; .loc 1 69 22 cvt.u32.u64 %r23,%r22; .loc 1 69 53 add.u32 %r137,%r23,-4; and.b32 %r138,%r137,-5; set.u32.eq.u32 %r140,%r138,0; neg.s32 %r141,%r140; .loc 1 69 22 add.u32 %r142,%r23,-1; set.u32.le.u32 %r144,%r142,1; neg.s32 %r145,%r144; .loc 1 69 6 cvt.u16.u32 %r147,%r141; cvt.u16.u32 %r148,%r145; or.b16 %r146,%r147,%r148; cvt.u32.u16 %r149,%r146; cvt.u16.u8 %r150,%r149; setp.ne.u16 %r151,%r150,0; @ %r151 bra $L2; .loc 1 71 7 setp.eq.u32 %r153,%r23,16; @ %r153 bra $L2; .loc 1 80 5 cvta.const.u64 %r1_gfortran_runtime_errorr96,[%r133]; .loc 1 82 6 setp.ne.u64 %r156,%r96,0; @ %r156 bra $L3; .loc 1 86 11 ld.s8 %r78,[%r135+28]; .loc 1 88 7 setp.le.s64 %r158,%r78,0; @ %r158 bra $L4; .loc 1 90 13 shl.b64 
%r160,%r78,3; add.u64 %r161,%frame,120; cvt.u32.u64 %r165,%r96add.u64 %r120,%r133,40; add.u64 %r118,%r135,40; mov.u64 %r116,%frame; add.u64 %r115,%frame,360; add.u64 %r114,%frame,240; add.u64 %r170,%r78,%r78; add.u64 %r171,%r170,%r78; shl.b64 %r172,%r171,3; add.u64 %r102,%r172,%r120; .loc 1 60 9 cvt.u32.u64 %r37,%r96; .loc 1 87 10 mov.u64 %r91,1; .loc 1 91 4 mov.u64 %r174,%r96; $L5: st.u64 [%r120+8],%r174; ld.u64 %r176,[%r118+16]; add.u64 %r175,%r176,1; ld.u64 %r177,[%r118+8]; sub.u64 %r32,%r175,%r177; add.u64 %r178,%r32,-1; st.u64 [%r120+16],%r178; st.u64 [%r120],%r91; .loc 1 93 14 st.u64 [%r116],%r32; .loc 1 94 18 set.u32.le.s64 %r180,%r32,0; neg.s32 %r181,%r180; cvt.u16.u32 %r184,%r37; and.b16 %r183,%r184,1; cvt.u16.u32 %r187,%r181; or.b16 %r186,%r187,%r183; cvt.u32.u16 %r189,%r186; cvt.u32.u8 %r37,%r189; .loc 1 95 15 st.u64 [%r115],%r91; .loc 1 96 17 ld.u64 %r191,[%r118]; mul.lo.u64 %r190,%r191,%r22; .loc 1 96 15 st.u64 [%r114],%r190; .loc 1 97 7 mul.lo.u64 %r91,%r91,%r32; .loc 1 88 7 add.u64 %r120,%r120,24; add.u64 %r118,%r118,24; add.u64 %r116,%r116,8; add.u64 %r115,%r115,8; add.u64 %r114,%r114,8; setp.ne.u64 %r192,%r102,%r120; @ %r192 bra $L5; bra $L31; $L4: .loc 1 99 19 st.u64 [%r133+8],%r96; .loc 1 100 24 mov.u64 %r195,2; mov.u64 %r19419195; call (%value_in),_gfortrani_xmallocarray196,[%value_in]; } mov.u64 %r96,%r196; .loc 1 100 22 st.u64 [%r133],%r96; .loc 1 128 12 ld.u64 %r132,[%frame+360]; bra $L7; $L31: .loc 1 99 19 st.u64 [%r133+8],%r174; .loc 1 100 24 mov.u64 %r200,200; call (%value_in),_gfortrani_xmallocarray201,[%value_in]; } .loc 1 100 22 st.u64 [%r133],%r201; bra $L8; $L3: .loc 1 104 11 ld.s8 %r78,[%r133+28]; .loc 1 106 18 mov.u64 %r204,1; st.u64 [%frame+360],%r204; .loc 1 107 7 setp.le.s64 %r205,%r78,0; @ %r205 bra $L20; .loc 1 109 13 shl.b64 %r130,%r78,3; add.u64 %r208,%frame,120; mov.u32 %r21mov.u64 %r99,40; mov.u64 %r100,0; .loc 1 60 9 cvt.u32.u64 %r37,%r100; add.u64 %r270,%frame,360; add.u64 %r269,%frame,240; $L9: add.u64 %r82,%r133,%r99; .loc 1 110 16 ld.u64 %r217,[%r82+16]; add.u64 %r216,%r217,1; ld.u64 %r218,[%r82+8]; sub.u64 %r48,%r216,%r218; .loc 1 110 14 add.u64 %r219,%frame,%r100; st.u64 [%r219],%r48; .loc 1 111 18 set.u32.le.s64 %r221,%r48,0; neg.s32 %r222,%r221; cvt.u16.u32 %r225,%r37; and.b16 %r224,%r225,1; cvt.u16.u32 %r228,%r222; or.b16 %r227,%r228,%r224; cvt.u32.u16 %r230,%r227; cvt.u32.u8 %r37,%r230; .loc 1 112 15 add.u64 %r232,%r270,%r100; ld.u64 %r233,[%r82]; st.u64 [%r232],%r233; .loc 1 113 15 add.u64 %r235,%r269,%r100; .loc 1 113 17 add.u64 %r236,%r135,%r99; ld.u64 %r238,[%r236]; mul.lo.u64 %r237,%r238,%r22; .loc 1 113 15 st.u64 [%r235],%r237; .loc 1 107 7 add.u64 %r100,%r100,8; add.u64 %r99,%r99,24; setp.ne.u64 %r239,%r100,%r130; @ %r239 bra $L9; .loc 1 115 10 ld.u64 %r240,[%frame+360]; setp.ne.u64 %r241,%r240,0; @ %r241 bra $L8; .loc 1 116 13 mov.u64 %r242,1; st.u64 [%frame+360],%r242; $L8: .loc 1 119 6 setp.ne.u32 %r243,%r37,0; @ %r243 bra $L1; .loc 1 128 12 ld.u64 %r132,[%frame+360]; .loc 1 130 8 ld.u64 %r96,[%r133]; bra $L7; $L20: .loc 1 107 7 mov.u64 %r132,%r204; $L7: .loc 1 122 14 ld.u64 %r56,[%frame+240]; .loc 1 122 6 setp.ne.u64 %r244,%r56,0; .loc 1 123 16 selp.u64 %r56,%r56,1,%r244; .loc 1 125 12 ld.u64 %r75,[%r134+40]; .loc 1 126 6 setp.ne.u64 %r245,%r75,0; .loc 1 127 14 selp.u64 %r75,%r75,1,%r245; .loc 1 131 8 ld.u64 %r76,[%r134]; .loc 1 133 9 setp.eq.u64 %r246,%r96,0; @ %r246 bra $L1; .loc 1 139 9 add.u64 %r60,%r75,%r75; .loc 1 147 12 add.u64 %r62,%r132,%r132; ld.u64 %r94,[%frame+120]; .loc 1 151 32 ld.u64 %r98,[%frame]; .loc 1 158 30 
mul.lo.u64 %r27,%r98,%r132; .loc 1 159 30 mul.lo.u64 %r249,%r98,%r56; .loc 1 159 16 neg.s64 %r101,%r249; shl.b64 %r121,%r78,3; setp.le.s64 %r271,%r78,1; add.u64 %r276,%frame,360; add.u64 %r277,%frame,240; .loc 1 155 20 mov.u64 %r278,0; $L19: .loc 1 135 10 ld.s8 %r251,[%r97]; cvt.u16.u32 %r250,%r251; setp.eq.u16 %r252,%r250,0; @ %r252 bra $L21; .loc 1 138 12 ld.s16 %r58,[%r76]; .loc 1 139 9 add.u64 %r76,%r76,%r60; bra $L13; $L21: .loc 1 144 10 mov.u32 %r58,%r86; $L13: cvt.u16.u32 %r253,%r58; st.u16 [%r96],%r253; .loc 1 147 12 add.u64 %r96,%r96,%r62; .loc 1 148 12 add.u64 %r97,%r97,%r56; .loc 1 149 15 add.u64 %r94,%r94,1; .loc 1 151 13 setp.ne.u64 %r254,%r94,%r98; @ %r254 bra $L19; .loc 1 159 16 add.u64 %r95,%r97,%r101; .loc 1 161 14 @ ! %r271 bra $L32; bra $L1; $L18: .loc 1 155 20 st.u64 [%r90],%r278; .loc 1 158 30 mul.lo.u64 %r64,%r70,%r69; .loc 1 159 30 mul.lo.u64 %r257,%r72,%r69; .loc 1 159 16 sub.u64 %r95,%r97,%r257; .loc 1 161 14 add.u64 %r90,%r90,8; add.u64 %r127,%r127,8; setp.ne.u64 %r258,%r121,%r127; @ %r258 bra $L17; bra $L1; $L32: add.u64 %r90,%frame,128; .loc 1 158 30 mov.u64 %r64,%r27; .loc 1 161 14 mov.u64 %r127,8; $L17: .loc 1 169 23 ld.u64 %r260,[%r90]; add.u64 %r69,%r260,1; st.u64 [%r90],%r69; .loc 1 170 30 add.u64 %r262,%r276,%r127; ld.u64 %r70,[%r262]; .loc 1 170 20 sub.u64 %r263,%r70,%r64; add.u64 %r264,%r263,%r263; add.u64 %r96,%r96,%r264; .loc 1 171 30 add.u64 %r266,%r277,%r127; ld.u64 %r72,[%r266]; .loc 1 171 20 add.u64 %r97,%r95,%r72; .loc 1 151 32 add.u64 %r267,%frame,%r127; ld.u64 %r74,[%r267]; .loc 1 151 13 setp.eq.u64 %r268,%r69,%r74; @ %r268 bra $L18; .loc 1 155 20 mov.u64 %r94,0; bra $L19; $L1: .loc 1 1_gfortrani_unpack1_i2 .visible .func _gfortrani_unpack1_608932predpredpred %r213; .reg .u64u32u64 %r257; .reg .u64predpredpredmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 208 8 ld.u64 %r106,[%r152]; .loc 1 213 15 ld.u64 %r22,[%r152+16]; .loc 1 215 22 cvt.u32.u64 %r23,%r22; .loc 1 215 53 add.u32 %r154,%r23,-4; and.b32 %r155,%r154,-5; set.u32.eq.u32 %r157,%r155,0; neg.s32 %r158,%r157; .loc 1 215 22 add.u32 %r159,%r23,-1; set.u32.le.u32 %r161,%r159,1; neg.s32 %r162,%r161; .loc 1 215 6 cvt.u16.u32 %r164,%r158; cvt.u16.u32 %r165,%r162; or%r168 bra $L34; .loc 1 217 7 setp.eq.u32 %r170,%r23,16; @ %r170 bra $L34; .loc 1 226 5 cvta.const.u64 %r171171_gfortran_runtime_error34: .loc 1 228 10 ld.u64 %r104,[%r150]; .loc 1 228 6 setp.ne.u64 %r173,%r104,0; @ %r173 bra $L35; .loc 1 232 11 ld.s8 %r92,[%r152+28]; .loc 1 234 7 setp.le.s64 %r175,%r92,0; @ %r175 bra $L36; .loc 1 236 13 shl.b64 %r145,%r92,3; add.u64 %r178,%frame,120; cvt.u32.u64 %r182,%r1041454,[%value_in]; } mov.u64 %r135,40; mov.u64 %r136,%r104; .loc 1 206 9 cvt.u32.u64 %r38,%r136; .loc 1 233 10 mov.u64 %r99,1; add.u64 %r304,%frame,480; add.u64 %r303,%frame,360; add.u64 %r302,%frame,240; .loc 1 237 4 mov.u64 %r186,%r136; $L37: add.u64 %r134,%r150,%r135; st.u64 [%r134+8],%r186; add.u64 %r133,%r152,%r135; ld.u64 %r188,[%r133+16]; add.u64 %r187,%r188,1; ld.u64 %r189,[%r133+8]; sub.u64 %r33,%r187,%r189; add.u64 %r190,%r33,-1; st.u64 [%r134+16],%r190; st.u64 [%r134],%r99; .loc 1 239 14 add.u64 %r191,%frame,%r136; st.u64 [%r191],%r33; .loc 1 240 18 set.u32.le.s64 %r193,%r33,0; neg.s32 %r194,%r193; cvt.u16.u32 %r197,%r38; and.b16 %r196,%r197,1; cvt.u16.u32 %r200,%r194; or.b16 %r199,%r200,%r196; cvt.u32.u16 %r202,%r199; cvt.u32.u8 %r38,%r202; .loc 1 241 15 add.u64 %r204,%r304,%r136; st.u64 [%r204],%r99; .loc 1 242 15 add.u64 %r206,%r303,%r136; .loc 1 242 17 add.u64 %r207,%r153,%r135; 
.loc 1 242 15 ld.u64 %r208,[%r207]; st.u64 [%r206],%r208; .loc 1 243 15 add.u64 %r210,%r302,%r136; .loc 1 243 17 ld.u64 %r212,[%r133]; mul.lo.u64 %r211,%r212,%r22; .loc 1 243 15 st.u64 [%r210],%r211; .loc 1 244 7 mul.lo.u64 %r99,%r99,%r33; .loc 1 234 7 add.u64 %r136,%r136,8; add.u64 %r135,%r135,24; setp.ne.u64 %r213,%r136,%r145; @ %r213 bra $L37; bra $L64; $L36: .loc 1 246 19 st.u64 [%r150+8],%r104; .loc 1 247 24 mov.u64 %r216,2; mov.u64 %r215216; call (%value_in),_gfortrani_xmallocarray217,[%value_in]; } mov.u64 %r104,%r217; .loc 1 247 22 st.u64 [%r150],%r104; .loc 1 278 12 ld.u64 %r149,[%frame+480]; bra $L39; $L64: .loc 1 246 19 st.u64 [%r150+8],%r186; .loc 1 247 24 mov.u64 %r221,221; call (%value_in),_gfortrani_xmallocarray222,[%value_in]; } .loc 1 247 22 st.u64 [%r150],%r222; bra $L40; $L35: .loc 1 251 11 ld.s8 %r92,[%r150+28]; .loc 1 253 18 mov.u64 %r225,1; st.u64 [%frame+480],%r225; .loc 1 254 7 setp.le.s64 %r226,%r92,0; @ %r226 bra $L54; .loc 1 256 13 shl.b64 %r147,%r92,3; add.u64 %r229,%frame,120; mov.u32 %r23314735,[%value_in]; } add.u64 %r127,%r150,40; add.u64 %r125,%r153,40; add.u64 %r123,%r152,40; mov.u64 %r121,0; .loc 1 206 9 cvt.u32.u64 %r38,%r121; add.u64 %r304,%frame,480; add.u64 %r303,%frame,360; add.u64 %r302,%frame,240; $L41: .loc 1 257 16 ld.u64 %r238,[%r127+16]; add.u64 %r237,%r238,1; ld.u64 %r239,[%r127+8]; sub.u64 %r50,%r237,%r239; .loc 1 257 14 add.u64 %r240,%frame,%r121; st.u64 [%r240],%r50; .loc 1 258 18 set.u32.le.s64 %r242,%r50,0; neg.s32 %r243,%r242; cvt.u16.u32 %r246,%r38; and.b16 %r245,%r246,1; cvt.u16.u32 %r249,%r243; or.b16 %r248,%r249,%r245; cvt.u32.u16 %r251,%r248; cvt.u32.u8 %r38,%r251; .loc 1 259 15 add.u64 %r253,%r304,%r121; ld.u64 %r254,[%r127]; st.u64 [%r253],%r254; .loc 1 260 15 add.u64 %r256,%r303,%r121; ld.u64 %r257,[%r125]; st.u64 [%r256],%r257; .loc 1 261 15 add.u64 %r259,%r302,%r121; .loc 1 261 17 ld.u64 %r261,[%r123]; mul.lo.u64 %r260,%r261,%r22; .loc 1 261 15 st.u64 [%r259],%r260; .loc 1 254 7 add.u64 %r127,%r127,24; add.u64 %r125,%r125,24; add.u64 %r123,%r123,24; add.u64 %r121,%r121,8; setp.ne.u64 %r262,%r121,%r147; @ %r262 bra $L41; .loc 1 263 10 ld.u64 %r263,[%frame+48040; .loc 1 264 13 mov.u64 %r265,1; st.u64 [%frame+480],%r265; $L40: .loc 1 267 6 setp.ne.u32 %r266,%r38,0; @ %r266 bra $L33; .loc 1 278 12 ld.u64 %r149,[%frame+480]; .loc 1 281 8 ld.u64 %r104,[%r150]; bra $L39; $L54: .loc 1 254 7 mov.u64 %r149,%r225; $L39: .loc 1 270 14 ld.u64 %r59,[%frame+360]; .loc 1 270 6 setp.ne.u64 %r267,%r59,0; .loc 1 271 16 selp.u64 %r59,%r59,1,%r267; .loc 1 272 14 ld.u64 %r60,[%frame+240]; .loc 1 272 6 setp.ne.u64 %r268,%r60,0; .loc 1 273 16 selp.u64 %r60,%r60,1,%r268; .loc 1 275 12 ld.u64 %r90,[%r151+40]; .loc 1 276 6 setp.ne.u64 %r269,%r90,0; .loc 1 277 14 selp.u64 %r90,%r90,1,%r269; .loc 1 282 8 ld.u64 %r105,[%r153]; .loc 1 283 8 ld.u64 %r91,[%r151]; .loc 1 285 9 setp.eq.u64 %r270,%r104,0; @ %r270 bra $L33; .loc 1 291 16 add.u64 %r64,%r90,%r90; .loc 1 299 12 add.u64 %r66,%r149,%r149; .loc 1 300 12 add.u64 %r68,%r59,%r59; ld.u64 %r27,[%frame+120]; .loc 1 304 32 ld.u64 %r107,[%frame]; .loc 1 311 30 mul.lo.u64 %r28,%r107,%r149; .loc 1 312 16 mul.lo.u64 %r274,%r107,%r59; add.u64 %r275,%r274,%r274; neg.s64 %r111,%r275; .loc 1 313 30 mul.lo.u64 %r276,%r107,%r60; .loc 1 313 16 neg.s64 %r112,%r276; shl.b64 %r137,%r92,3; setp.le.s64 %r301,%r92,1; add.u64 %r309,%frame,480; add.u64 %r310,%frame,360; add.u64 %r311,%frame,240; .loc 1 308 20 mov.u64 %r312,0; $L53: .loc 1 287 10 ld.s8 %r278,[%r106]; cvt.u16.u32 %r277,%r278; setp.eq.u16 %r279,%r277,0; @ %r279 bra 
$L46; .loc 1 290 12 ld.s16 %r62,[%r91]; .loc 1 291 16 add.u64 %r91,%r91,%r64; bra $L47; $L46: .loc 1 296 12 ld.s16 %r62,[%r105]; $L47: cvt.u16.u32 %r280,%r62; st.u16 [%r104],%r280; .loc 1 299 12 add.u64 %r104,%r104,%r66; .loc 1 300 12 add.u64 %r105,%r105,%r68; .loc 1 301 12 add.u64 %r106,%r106,%r60; .loc 1 302 15 add.u64 %r27,%r27,1; .loc 1 304 13 setp.ne.u64 %r281,%r27,%r107; @ %r281 bra $L53; .loc 1 312 16 add.u64 %r102,%r105,%r111; .loc 1 313 16 add.u64 %r103,%r106,%r112; .loc 1 315 14 @ ! %r301 bra $L65; bra $L33; $L52: .loc 1 308 20 st.u64 [%r98],%r312; .loc 1 311 30 mul.lo.u64 %r71,%r81,%r80; .loc 1 312 30 mul.lo.u64 %r284,%r83,%r80; .loc 1 312 16 add.u64 %r285,%r284,%r284; sub.u64 %r102,%r105,%r285; .loc 1 313 30 mul.lo.u64 %r286,%r86,%r80; .loc 1 313 16 sub.u64 %r103,%r106,%r286; .loc 1 315 14 add.u64 %r98,%r98,8; add.u64 %r144,%r144,8; setp.ne.u64 %r287,%r137,%r144; @ %r287 bra $L51; bra $L33; $L65: add.u64 %r98,%frame,128; .loc 1 311 30 mov.u64 %r71,%r28; .loc 1 315 14 mov.u64 %r144,8; $L51: .loc 1 323 23 ld.u64 %r289,[%r98]; add.u64 %r80,%r289,1; st.u64 [%r98],%r80; .loc 1 324 30 add.u64 %r291,%r309,%r144; ld.u64 %r81,[%r291]; .loc 1 324 20 sub.u64 %r292,%r81,%r71; add.u64 %r293,%r292,%r292; add.u64 %r104,%r104,%r293; .loc 1 325 30 add.u64 %r295,%r310,%r144; ld.u64 %r83,[%r295]; .loc 1 325 20 add.u64 %r296,%r83,%r83; add.u64 %r105,%r102,%r296; .loc 1 326 30 add.u64 %r298,%r311,%r144; ld.u64 %r86,[%r298]; .loc 1 326 20 add.u64 %r106,%r103,%r86; .loc 1 304 32 add.u64 %r299,%frame,%r144; ld.u64 %r88,[%r299]; .loc 1 304 13 setp.eq.u64 %r300,%r80,%r88; @ %r300 bra $L52; .loc 1 308 20 mov.u64 %r27,0; bra $L53; $L33: .loc 1 330 unpack_i4.o/_gfortrani_unpack0_i4 .visible .func _gfortrani_unpack0_../libgfortran/generated/unpack_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_unpack1_i4 .visible .func _gfortrani_unpack1_i4gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortrani_unpack0_i4 .visible .func _gfortrani_unpack0_0predpredpred %r205; .reg .u64 %r208; .reg .u32 %r212; .reg .u64u16 %r225; .reg .u16 %r227; .reg .u16u64 %r237; .reg .u64 %r238; .reg .pred %r239; .reg .u64 %r240; .reg .pred %r241; .reg .u64 %r242; .reg .predpredpredmov.u64 %r133,%ar0; mov.u64 %r134,%ar1; mov.u64 %r135,%ar2; mov.u64 %r136,%ar3; .loc 1 46 23 ld.u32 %r86,[%r136]; .loc 1 62 8 ld.u64 %r97,[%r135]; .loc 1 67 15 ld.u64 %r22,[%r135+16]; .loc 1 69 22 cvt.u32.u64 %r23,%r22; .loc 1 69 53 add.u32 %r137,%r23,-4; and.b32 %r138,%r137,-5; set.u32.eq.u32 %r140,%r138,0; neg.s32 %r141,%r140; .loc 1 69 22 add.u32 %r142,%r23,-1; set.u32.le.u32 %r144,%r142,1; neg.s32 %r145,%r144; .loc 1 69 6 cvt.u16.u32 %r147,%r141; cvt.u16.u32 %r148,%r145; or.b16 %r146,%r147,%r148; cvt.u32.u16 %r149,%r146; cvt.u16.u8 %r150,%r149; setp.ne.u16 %r151,%r150,0; @ %r151 bra $L2; .loc 1 71 7 setp.eq.u32 %r153,%r23,16; @ %r153 bra $L2; .loc 1 80 5 cvta.const.u64 %r1_gfortran_runtime_errorr96,[%r133]; .loc 1 82 6 setp.ne.u64 %r156,%r96,0; @ %r156 bra $L3; .loc 1 86 11 ld.s8 %r78,[%r135+28]; .loc 1 88 7 setp.le.s64 %r158,%r78,0; @ %r158 bra $L4; .loc 1 90 13 shl.b64 %r160,%r78,3; add.u64 %r161,%frame,120; cvt.u32.u64 %r165,%r96add.u64 %r120,%r133,40; add.u64 %r118,%r135,40; mov.u64 %r116,%frame; add.u64 %r115,%frame,360; add.u64 %r114,%frame,240; add.u64 %r170,%r78,%r78; add.u64 %r171,%r170,%r78; shl.b64 %r172,%r171,3; add.u64 %r102,%r172,%r120; .loc 1 60 9 cvt.u32.u64 %r37,%r96; .loc 1 87 10 mov.u64 %r91,1; .loc 1 91 4 mov.u64 
%r174,%r96; $L5: st.u64 [%r120+8],%r174; ld.u64 %r176,[%r118+16]; add.u64 %r175,%r176,1; ld.u64 %r177,[%r118+8]; sub.u64 %r32,%r175,%r177; add.u64 %r178,%r32,-1; st.u64 [%r120+16],%r178; st.u64 [%r120],%r91; .loc 1 93 14 st.u64 [%r116],%r32; .loc 1 94 18 set.u32.le.s64 %r180,%r32,0; neg.s32 %r181,%r180; cvt.u16.u32 %r184,%r37; and.b16 %r183,%r184,1; cvt.u16.u32 %r187,%r181; or.b16 %r186,%r187,%r183; cvt.u32.u16 %r189,%r186; cvt.u32.u8 %r37,%r189; .loc 1 95 15 st.u64 [%r115],%r91; .loc 1 96 17 ld.u64 %r191,[%r118]; mul.lo.u64 %r190,%r191,%r22; .loc 1 96 15 st.u64 [%r114],%r190; .loc 1 97 7 mul.lo.u64 %r91,%r91,%r32; .loc 1 88 7 add.u64 %r120,%r120,24; add.u64 %r118,%r118,24; add.u64 %r116,%r116,8; add.u64 %r115,%r115,8; add.u64 %r114,%r114,8; setp.ne.u64 %r192,%r102,%r120; @ %r192 bra $L5; bra $L31; $L4: .loc 1 99 19 st.u64 [%r133+8],%r96; .loc 1 100 24 mov.u64 %r195,4; mov.u64 %r19419195; call (%value_in),_gfortrani_xmallocarray196,[%value_in]; } mov.u64 %r96,%r196; .loc 1 100 22 st.u64 [%r133],%r96; .loc 1 128 12 ld.u64 %r132,[%frame+360]; bra $L7; $L31: .loc 1 99 19 st.u64 [%r133+8],%r174; .loc 1 100 2491200; call (%value_in),_gfortrani_xmallocarray201,[%value_in]; } .loc 1 100 22 st.u64 [%r133],%r201; bra $L8; $L3: .loc 1 104 11 ld.s8 %r78,[%r133+28]; .loc 1 106 18 mov.u64 %r204,1; st.u64 [%frame+360],%r204; .loc 1 107 7 setp.le.s64 %r205,%r78,0; @ %r205 bra $L20; .loc 1 109 13 shl.b64 %r130,%r78,3; add.u64 %r208,%frame,120; mov.u32 %r21mov.u64 %r99,40; mov.u64 %r100,0; .loc 1 60 9 cvt.u32.u64 %r37,%r100; add.u64 %r266,%frame,360; add.u64 %r268,%frame,240; $L9: add.u64 %r82,%r133,%r99; .loc 1 110 16 ld.u64 %r217,[%r82+16]; add.u64 %r216,%r217,1; ld.u64 %r218,[%r82+8]; sub.u64 %r48,%r216,%r218; .loc 1 110 14 add.u64 %r219,%frame,%r100; st.u64 [%r219],%r48; .loc 1 111 18 set.u32.le.s64 %r221,%r48,0; neg.s32 %r222,%r221; cvt.u16.u32 %r225,%r37; and.b16 %r224,%r225,1; cvt.u16.u32 %r228,%r222; or.b16 %r227,%r228,%r224; cvt.u32.u16 %r230,%r227; cvt.u32.u8 %r37,%r230; .loc 1 112 15 add.u64 %r232,%r266,%r100; ld.u64 %r233,[%r82]; st.u64 [%r232],%r233; .loc 1 113 15 add.u64 %r235,%r268,%r100; .loc 1 113 17 add.u64 %r236,%r135,%r99; ld.u64 %r238,[%r236]; mul.lo.u64 %r237,%r238,%r22; .loc 1 113 15 st.u64 [%r235],%r237; .loc 1 107 7 add.u64 %r100,%r100,8; add.u64 %r99,%r99,24; setp.ne.u64 %r239,%r100,%r130; @ %r239 bra $L9; .loc 1 115 10 ld.u64 %r240,[%frame+360]; setp.ne.u64 %r241,%r240,0; @ %r241 bra $L8; .loc 1 116 13 mov.u64 %r242,1; st.u64 [%frame+360],%r242; $L8: .loc 1 119 6 setp.ne.u32 %r243,%r37,0; @ %r243 bra $L1; .loc 1 128 12 ld.u64 %r132,[%frame+360]; .loc 1 130 8 ld.u64 %r96,[%r133]; bra $L7; $L20: .loc 1 107 7 mov.u64 %r132,%r204; $L7: .loc 1 122 14 ld.u64 %r56,[%frame+240]; .loc 1 122 6 setp.ne.u64 %r244,%r56,0; .loc 1 123 16 selp.u64 %r56,%r56,1,%r244; .loc 1 125 12 ld.u64 %r75,[%r134+40]; .loc 1 126 6 setp.ne.u64 %r245,%r75,0; .loc 1 127 14 selp.u64 %r75,%r75,1,%r245; .loc 1 131 8 ld.u64 %r76,[%r134]; .loc 1 133 9 setp.eq.u64 %r246,%r96,0; @ %r246 bra $L1; .loc 1 139 9 shl.b64 %r60,%r75,2; .loc 1 147 12 shl.b64 %r62,%r132,2; ld.u64 %r94,[%frame+120]; .loc 1 151 32 ld.u64 %r98,[%frame]; .loc 1 158 30 mul.lo.u64 %r27,%r98,%r132; .loc 1 159 30 mul.lo.u64 %r247,%r98,%r56; .loc 1 159 16 neg.s64 %r101,%r247; shl.b64 %r121,%r78,3; setp.le.s64 %r267,%r78,1; add.u64 %r273,%frame,360; add.u64 %r274,%frame,240; .loc 1 155 20 mov.u64 %r275,0; $L19: .loc 1 135 10 ld.s8 %r249,[%r97]; cvt.u16.u32 %r248,%r249; setp.eq.u16 %r250,%r248,0; @ %r250 bra $L21; .loc 1 138 12 ld.u32 %r58,[%r76]; .loc 1 
139 9 add.u64 %r76,%r76,%r60; bra $L13; $L21: .loc 1 144 10 mov.u32 %r58,%r86; $L13: st.u32 [%r96],%r58; .loc 1 147 12 add.u64 %r96,%r96,%r62; .loc 1 148 12 add.u64 %r97,%r97,%r56; .loc 1 149 15 add.u64 %r94,%r94,1; .loc 1 151 13 setp.ne.u64 %r251,%r94,%r98; @ %r251 bra $L19; .loc 1 159 16 add.u64 %r95,%r97,%r101; .loc 1 161 14 @ ! %r267 bra $L32; bra $L1; $L18: .loc 1 155 20 st.u64 [%r90],%r275; .loc 1 158 30 mul.lo.u64 %r64,%r70,%r69; .loc 1 159 30 mul.lo.u64 %r254,%r72,%r69; .loc 1 159 16 sub.u64 %r95,%r97,%r254; .loc 1 161 14 add.u64 %r90,%r90,8; add.u64 %r127,%r127,8; setp.ne.u64 %r255,%r121,%r127; @ %r255 bra $L17; bra $L1; $L32: add.u64 %r90,%frame,128; .loc 1 158 30 mov.u64 %r64,%r27; .loc 1 161 14 mov.u64 %r127,8; $L17: .loc 1 169 23 ld.u64 %r257,[%r90]; add.u64 %r69,%r257,1; st.u64 [%r90],%r69; .loc 1 170 30 add.u64 %r259,%r273,%r127; ld.u64 %r70,[%r259]; .loc 1 170 20 sub.u64 %r260,%r70,%r64; shl.b64 %r261,%r260,2; add.u64 %r96,%r96,%r261; .loc 1 171 30 add.u64 %r263,%r274,%r127; ld.u64 %r72,[%r263]; .loc 1 171 20 add.u64 %r97,%r95,%r72; .loc 1 151 32 add.u64 %r264,%frame,%r127; ld.u64 %r74,[%r264]; .loc 1 151 13 setp.eq.u64 %r265,%r69,%r74; @ %r265 bra $L18; .loc 1 155 20 mov.u64 %r94,0; bra $L19; $L1: .loc 1 1_gfortrani_unpack1_i4 .visible .func _gfortrani_unpack1_608932predpredpred %r213; .reg .u64u32u64 %r257; .reg .u64predu64 %r281; .reg .u64 %r282; .reg .pred %r283; .reg .u64u64 %r306; .reg .u64 %r307; .reg .u64 %r308; mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 208 8 ld.u64 %r106,[%r152]; .loc 1 213 15 ld.u64 %r22,[%r152+16]; .loc 1 215 22 cvt.u32.u64 %r23,%r22; .loc 1 215 53 add.u32 %r154,%r23,-4; and.b32 %r155,%r154,-5; set.u32.eq.u32 %r157,%r155,0; neg.s32 %r158,%r157; .loc 1 215 22 add.u32 %r159,%r23,-1; set.u32.le.u32 %r161,%r159,1; neg.s32 %r162,%r161; .loc 1 215 6 cvt.u16.u32 %r164,%r158; cvt.u16.u32 %r165,%r162; or%r168 bra $L34; .loc 1 217 7 setp.eq.u32 %r170,%r23,16; @ %r170 bra $L34; .loc 1 226 5 cvta.const.u64 %r171171_gfortran_runtime_error34: .loc 1 228 10 ld.u64 %r104,[%r150]; .loc 1 228 6 setp.ne.u64 %r173,%r104,0; @ %r173 bra $L35; .loc 1 232 11 ld.s8 %r92,[%r152+28]; .loc 1 234 7 setp.le.s64 %r175,%r92,0; @ %r175 bra $L36; .loc 1 236 13 shl.b64 %r145,%r92,3; add.u64 %r178,%frame,120; cvt.u32.u64 %r182,%r1041454,[%value_in]; } mov.u64 %r135,40; mov.u64 %r136,%r104; .loc 1 206 9 cvt.u32.u64 %r38,%r136; .loc 1 233 10 mov.u64 %r99,1; add.u64 %r297,%frame,480; add.u64 %r300,%frame,360; add.u64 %r299,%frame,240; .loc 1 237 4 mov.u64 %r186,%r136; $L37: add.u64 %r134,%r150,%r135; st.u64 [%r134+8],%r186; add.u64 %r133,%r152,%r135; ld.u64 %r188,[%r133+16]; add.u64 %r187,%r188,1; ld.u64 %r189,[%r133+8]; sub.u64 %r33,%r187,%r189; add.u64 %r190,%r33,-1; st.u64 [%r134+16],%r190; st.u64 [%r134],%r99; .loc 1 239 14 add.u64 %r191,%frame,%r136; st.u64 [%r191],%r33; .loc 1 240 18 set.u32.le.s64 %r193,%r33,0; neg.s32 %r194,%r193; cvt.u16.u32 %r197,%r38; and.b16 %r196,%r197,1; cvt.u16.u32 %r200,%r194; or.b16 %r199,%r200,%r196; cvt.u32.u16 %r202,%r199; cvt.u32.u8 %r38,%r202; .loc 1 241 15 add.u64 %r204,%r297,%r136; st.u64 [%r204],%r99; .loc 1 242 15 add.u64 %r206,%r300,%r136; .loc 1 242 17 add.u64 %r207,%r153,%r135; .loc 1 242 15 ld.u64 %r208,[%r207]; st.u64 [%r206],%r208; .loc 1 243 15 add.u64 %r210,%r299,%r136; .loc 1 243 17 ld.u64 %r212,[%r133]; mul.lo.u64 %r211,%r212,%r22; .loc 1 243 15 st.u64 [%r210],%r211; .loc 1 244 7 mul.lo.u64 %r99,%r99,%r33; .loc 1 234 7 add.u64 %r136,%r136,8; add.u64 %r135,%r135,24; setp.ne.u64 
%r213,%r136,%r145; @ %r213 bra $L37; bra $L64; $L36: .loc 1 246 19 st.u64 [%r150+8],%r104; .loc 1 247 24 mov.u64 %r216,4; mov.u64 %r215216; call (%value_in),_gfortrani_xmallocarray217,[%value_in]; } mov.u64 %r104,%r217; .loc 1 247 22 st.u64 [%r150],%r104; .loc 1 278 12 ld.u64 %r149,[%frame+480]; bra $L39; $L64: .loc 1 246 19 st.u64 [%r150+8],%r186; .loc 1 247 2499221; call (%value_in),_gfortrani_xmallocarray222,[%value_in]; } .loc 1 247 22 st.u64 [%r150],%r222; bra $L40; $L35: .loc 1 251 11 ld.s8 %r92,[%r150+28]; .loc 1 253 18 mov.u64 %r225,1; st.u64 [%frame+480],%r225; .loc 1 254 7 setp.le.s64 %r226,%r92,0; @ %r226 bra $L54; .loc 1 256 13 shl.b64 %r147,%r92,3; add.u64 %r229,%frame,120; mov.u32 %r23314735,[%value_in]; } add.u64 %r127,%r150,40; add.u64 %r125,%r153,40; add.u64 %r123,%r152,40; mov.u64 %r121,0; .loc 1 206 9 cvt.u32.u64 %r38,%r121; add.u64 %r297,%frame,480; add.u64 %r300,%frame,360; add.u64 %r299,%frame,240; $L41: .loc 1 257 16 ld.u64 %r238,[%r127+16]; add.u64 %r237,%r238,1; ld.u64 %r239,[%r127+8]; sub.u64 %r50,%r237,%r239; .loc 1 257 14 add.u64 %r240,%frame,%r121; st.u64 [%r240],%r50; .loc 1 258 18 set.u32.le.s64 %r242,%r50,0; neg.s32 %r243,%r242; cvt.u16.u32 %r246,%r38; and.b16 %r245,%r246,1; cvt.u16.u32 %r249,%r243; or.b16 %r248,%r249,%r245; cvt.u32.u16 %r251,%r248; cvt.u32.u8 %r38,%r251; .loc 1 259 15 add.u64 %r253,%r297,%r121; ld.u64 %r254,[%r127]; st.u64 [%r253],%r254; .loc 1 260 15 add.u64 %r256,%r300,%r121; ld.u64 %r257,[%r125]; st.u64 [%r256],%r257; .loc 1 261 15 add.u64 %r259,%r299,%r121; .loc 1 261 17 ld.u64 %r261,[%r123]; mul.lo.u64 %r260,%r261,%r22; .loc 1 261 15 st.u64 [%r259],%r260; .loc 1 254 7 add.u64 %r127,%r127,24; add.u64 %r125,%r125,24; add.u64 %r123,%r123,24; add.u64 %r121,%r121,8; setp.ne.u64 %r262,%r121,%r147; @ %r262 bra $L41; .loc 1 263 10 ld.u64 %r263,[%frame+48040; .loc 1 264 13 mov.u64 %r265,1; st.u64 [%frame+480],%r265; $L40: .loc 1 267 6 setp.ne.u32 %r266,%r38,0; @ %r266 bra $L33; .loc 1 278 12 ld.u64 %r149,[%frame+480]; .loc 1 281 8 ld.u64 %r104,[%r150]; bra $L39; $L54: .loc 1 254 7 mov.u64 %r149,%r225; $L39: .loc 1 270 14 ld.u64 %r59,[%frame+360]; .loc 1 270 6 setp.ne.u64 %r267,%r59,0; .loc 1 271 16 selp.u64 %r59,%r59,1,%r267; .loc 1 272 14 ld.u64 %r60,[%frame+240]; .loc 1 272 6 setp.ne.u64 %r268,%r60,0; .loc 1 273 16 selp.u64 %r60,%r60,1,%r268; .loc 1 275 12 ld.u64 %r90,[%r151+40]; .loc 1 276 6 setp.ne.u64 %r269,%r90,0; .loc 1 277 14 selp.u64 %r90,%r90,1,%r269; .loc 1 282 8 ld.u64 %r105,[%r153]; .loc 1 283 8 ld.u64 %r91,[%r151]; .loc 1 285 9 setp.eq.u64 %r270,%r104,0; @ %r270 bra $L33; .loc 1 291 16 shl.b64 %r64,%r90,2; .loc 1 299 12 shl.b64 %r66,%r149,2; .loc 1 300 12 shl.b64 %r68,%r59,2; ld.u64 %r27,[%frame+120]; .loc 1 304 32 ld.u64 %r107,[%frame]; .loc 1 311 30 mul.lo.u64 %r28,%r107,%r149; .loc 1 312 16 mul.lo.u64 %r271,%r107,%r59; shl.b64 %r272,%r271,2; neg.s64 %r111,%r272; .loc 1 313 30 mul.lo.u64 %r273,%r107,%r60; .loc 1 313 16 neg.s64 %r112,%r273; shl.b64 %r137,%r92,3; setp.le.s64 %r298,%r92,1; add.u64 %r305,%frame,480; add.u64 %r306,%frame,360; add.u64 %r307,%frame,240; .loc 1 308 20 mov.u64 %r308,0; $L53: .loc 1 287 10 ld.s8 %r275,[%r106]; cvt.u16.u32 %r274,%r275; setp.eq.u16 %r276,%r274,0; @ %r276 bra $L46; .loc 1 290 12 ld.u32 %r62,[%r91]; .loc 1 291 16 add.u64 %r91,%r91,%r64; bra $L47; $L46: .loc 1 296 12 ld.u32 %r62,[%r105]; $L47: st.u32 [%r104],%r62; .loc 1 299 12 add.u64 %r104,%r104,%r66; .loc 1 300 12 add.u64 %r105,%r105,%r68; .loc 1 301 12 add.u64 %r106,%r106,%r60; .loc 1 302 15 add.u64 %r27,%r27,1; .loc 1 304 13 setp.ne.u64 
%r277,%r27,%r107; @ %r277 bra $L53; .loc 1 312 16 add.u64 %r102,%r105,%r111; .loc 1 313 16 add.u64 %r103,%r106,%r112; .loc 1 315 14 @ ! %r298 bra $L65; bra $L33; $L52: .loc 1 308 20 st.u64 [%r98],%r308; .loc 1 311 30 mul.lo.u64 %r71,%r81,%r80; .loc 1 312 30 mul.lo.u64 %r280,%r83,%r80; .loc 1 312 16 shl.b64 %r281,%r280,2; sub.u64 %r102,%r105,%r281; .loc 1 313 30 mul.lo.u64 %r282,%r86,%r80; .loc 1 313 16 sub.u64 %r103,%r106,%r282; .loc 1 315 14 add.u64 %r98,%r98,8; add.u64 %r144,%r144,8; setp.ne.u64 %r283,%r137,%r144; @ %r283 bra $L51; bra $L33; $L65: add.u64 %r98,%frame,128; .loc 1 311 30 mov.u64 %r71,%r28; .loc 1 315 14 mov.u64 %r144,8; $L51: .loc 1 323 23 ld.u64 %r285,[%r98]; add.u64 %r80,%r285,1; st.u64 [%r98],%r80; .loc 1 324 30 add.u64 %r287,%r305,%r144; ld.u64 %r81,[%r287]; .loc 1 324 20 sub.u64 %r288,%r81,%r71; shl.b64 %r289,%r288,2; add.u64 %r104,%r104,%r289; .loc 1 325 30 add.u64 %r291,%r306,%r144; ld.u64 %r83,[%r291]; .loc 1 325 20 shl.b64 %r292,%r83,2; add.u64 %r105,%r102,%r292; .loc 1 326 30 add.u64 %r294,%r307,%r144; ld.u64 %r86,[%r294]; .loc 1 326 20 add.u64 %r106,%r103,%r86; .loc 1 304 32 add.u64 %r295,%frame,%r144; ld.u64 %r88,[%r295]; .loc 1 304 13 setp.eq.u64 %r296,%r80,%r88; @ %r296 bra $L52; .loc 1 308 20 mov.u64 %r27,0; bra $L53; $L33: .loc 1 330 unpack_i8.o/_gfortrani_unpack0_i8 .visible .func _gfortrani_unpack0_../libgfortran/generated/unpack_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_unpack1_i8 .visible .func _gfortrani_unpack1_i8gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortrani_unpack0_i8 .visible .func _gfortrani_unpack0_0predpredpred %r205; .reg .u64 %r208; .reg .u32 %r212; .reg .u64u16 %r225; .reg .u16 %r227; .reg .u16u64 %r237; .reg .u64 %r238; .reg .pred %r239; .reg .u64 %r240; .reg .pred %r241; .reg .u64 %r242; .reg .predpredpredmov.u64 %r133,%ar0; mov.u64 %r134,%ar1; mov.u64 %r135,%ar2; mov.u64 %r136,%ar3; .loc 1 46 23 ld.u64 %r86,[%r136]; .loc 1 62 8 ld.u64 %r97,[%r135]; .loc 1 67 15 ld.u64 %r22,[%r135+16]; .loc 1 69 22 cvt.u32.u64 %r23,%r22; .loc 1 69 53 add.u32 %r137,%r23,-4; and.b32 %r138,%r137,-5; set.u32.eq.u32 %r140,%r138,0; neg.s32 %r141,%r140; .loc 1 69 22 add.u32 %r142,%r23,-1; set.u32.le.u32 %r144,%r142,1; neg.s32 %r145,%r144; .loc 1 69 6 cvt.u16.u32 %r147,%r141; cvt.u16.u32 %r148,%r145; or.b16 %r146,%r147,%r148; cvt.u32.u16 %r149,%r146; cvt.u16.u8 %r150,%r149; setp.ne.u16 %r151,%r150,0; @ %r151 bra $L2; .loc 1 71 7 setp.eq.u32 %r153,%r23,16; @ %r153 bra $L2; .loc 1 80 5 cvta.const.u64 %r1_gfortran_runtime_errorr96,[%r133]; .loc 1 82 6 setp.ne.u64 %r156,%r96,0; @ %r156 bra $L3; .loc 1 86 11 ld.s8 %r78,[%r135+28]; .loc 1 88 7 setp.le.s64 %r158,%r78,0; @ %r158 bra $L4; .loc 1 90 13 shl.b64 %r160,%r78,3; add.u64 %r161,%frame,120; cvt.u32.u64 %r165,%r96add.u64 %r120,%r133,40; add.u64 %r118,%r135,40; mov.u64 %r116,%frame; add.u64 %r115,%frame,360; add.u64 %r114,%frame,240; add.u64 %r170,%r78,%r78; add.u64 %r171,%r170,%r78; shl.b64 %r172,%r171,3; add.u64 %r102,%r172,%r120; .loc 1 60 9 cvt.u32.u64 %r37,%r96; .loc 1 87 10 mov.u64 %r91,1; .loc 1 91 4 mov.u64 %r174,%r96; $L5: st.u64 [%r120+8],%r174; ld.u64 %r176,[%r118+16]; add.u64 %r175,%r176,1; ld.u64 %r177,[%r118+8]; sub.u64 %r32,%r175,%r177; add.u64 %r178,%r32,-1; st.u64 [%r120+16],%r178; st.u64 [%r120],%r91; .loc 1 93 14 st.u64 [%r116],%r32; .loc 1 94 18 set.u32.le.s64 %r180,%r32,0; neg.s32 %r181,%r180; cvt.u16.u32 %r184,%r37; and.b16 %r183,%r184,1; cvt.u16.u32 %r187,%r181; 
[nvptx PTX object code, damaged in extraction; only the archive members, exported functions, and source-file references below are recoverable from this span:]

  unpack_i8.o (continued): _gfortrani_unpack1_i8
  unpack_i16.o: _gfortrani_unpack0_i16, _gfortrani_unpack1_i16 (../libgfortran/generated/unpack_i16.c)
  unpack_r4.o:  _gfortrani_unpack0_r4,  _gfortrani_unpack1_r4  (../libgfortran/generated/unpack_r4.c)
  unpack_r8.o:  _gfortrani_unpack0_r8,  _gfortrani_unpack1_r8  (../libgfortran/generated/unpack_r8.c)
  unpack_r10.o: (member name only; contents not recoverable in this span)
  unpack_c4.o:  _gfortrani_unpack0_c4,  _gfortrani_unpack1_c4  (../libgfortran/generated/unpack_c4.c)
  unpack_c8.o:  _gfortrani_unpack0_c8,  _gfortrani_unpack1_c8  (../libgfortran/generated/unpack_c8.c)

  External symbols referenced by each member: _gfortran_runtime_error, _gfortrani_xmallocarray
  Embedded error-string fragment (decimal bytes): "...ny sized logical a..."
%r270 bra $L32; bra $L1; $L18: .loc 1 155 20 st.u64 [%r92],%r277; .loc 1 158 30 mul.lo.u64 %r65,%r71,%r70; .loc 1 159 30 mul.lo.u64 %r256,%r73,%r70; .loc 1 159 16 sub.u64 %r97,%r99,%r256; .loc 1 161 14 add.u64 %r92,%r92,8; add.u64 %r129,%r129,8; setp.ne.u64 %r257,%r123,%r129; @ %r257 bra $L17; bra $L1; $L32: add.u64 %r92,%frame,128; .loc 1 158 30 mov.u64 %r65,%r27; .loc 1 161 14 mov.u64 %r129,8; $L17: .loc 1 169 23 ld.u64 %r259,[%r92]; add.u64 %r70,%r259,1; st.u64 [%r92],%r70; .loc 1 170 30 add.u64 %r261,%r275,%r129; ld.u64 %r71,[%r261]; .loc 1 170 20 sub.u64 %r262,%r71,%r65; shl.b64 %r263,%r262,4; add.u64 %r98,%r98,%r263; .loc 1 171 30 add.u64 %r265,%r276,%r129; ld.u64 %r73,[%r265]; .loc 1 171 20 add.u64 %r99,%r97,%r73; .loc 1 151 32 add.u64 %r266,%frame,%r129; ld.u64 %r75,[%r266]; .loc 1 151 13 setp.eq.u64 %r267,%r70,%r75; @ %r267 bra $L18; .loc 1 155 20 mov.u64 %r96,0; bra $L19; $L1: .loc 1 1_gfortrani_unpack1_c8 .visible .func _gfortrani_unpack1_c8608predpred %r174; .reg .predu32u64pred %r269; .reg .predu64 %r306; .reg .u64mov.u64 %r151,%ar0; mov.u64 %r152,%ar1; mov.u64 %r153,%ar2; mov.u64 %r154,%ar3; .loc 1 208 8 ld.u64 %r107,[%r153]; .loc 1 213 15 ld.u64 %r22,[%r153+16]; .loc 1 215 22 cvt.u32.u64 %r23,%r22; .loc 1 215 53 add.u32 %r155,%r23,-4; and.b32 %r156,%r155,-5; set.u32.eq.u32 %r158,%r156,0; neg.s32 %r159,%r158; .loc 1 215 22 add.u32 %r160,%r23,-1; set.u32.le.u32 %r162,%r160,1; neg.s32 %r163,%r162; .loc 1 215 6 cvt.u16.u32 %r165,%r159; cvt.u16.u32 %r166,%r163; or.b16 %r164,%r165,%r166; cvt.u32.u16 %r167,%r164; cvt.u16.u8 %r168,%r167; setp.ne.u16 %r169,%r168,0; @ %r169 bra $L34; .loc 1 217 7 setp.eq.u32 %r171,%r23,16; @ %r171 bra $L34; .loc 1 226 5 cvta.const.u64 %r172172_gfortran_runtime_error34: .loc 1 228 10 ld.u64 %r105,[%r151]; .loc 1 228 6 setp.ne.u64 %r174,%r105,0; @ %r174 bra $L35; .loc 1 232 11 ld.s8 %r93,[%r153+28]; .loc 1 234 7 setp.le.s64 %r176,%r93,0; @ %r176 bra $L36; .loc 1 236 13 shl.b64 %r146,%r93,3; add.u64 %r179,%frame,120; cvt.u32.u64 %r183,%r1051465,[%value_in]; } mov.u64 %r136,40; mov.u64 %r137,%r105; .loc 1 206 9 cvt.u32.u64 %r38,%r137; .loc 1 233 10 mov.u64 %r100,1; add.u64 %r301,%frame,480; add.u64 %r300,%frame,360; add.u64 %r299,%frame,240; .loc 1 237 4 mov.u64 %r187,%r137; $L37: add.u64 %r135,%r151,%r136; st.u64 [%r135+8],%r187; add.u64 %r134,%r153,%r136; ld.u64 %r189,[%r134+16]; add.u64 %r188,%r189,1; ld.u64 %r190,[%r134+8]; sub.u64 %r33,%r188,%r190; add.u64 %r191,%r33,-1; st.u64 [%r135+16],%r191; st.u64 [%r135],%r100; .loc 1 239 14 add.u64 %r192,%frame,%r137; st.u64 [%r192],%r33; .loc 1 240 18 set.u32.le.s64 %r194,%r33,0; neg.s32 %r195,%r194; cvt.u16.u32 %r198,%r38; and.b16 %r197,%r198,1; cvt.u16.u32 %r201,%r195; or.b16 %r200,%r201,%r197; cvt.u32.u16 %r203,%r200; cvt.u32.u8 %r38,%r203; .loc 1 241 15 add.u64 %r205,%r301,%r137; st.u64 [%r205],%r100; .loc 1 242 15 add.u64 %r207,%r300,%r137; .loc 1 242 17 add.u64 %r208,%r154,%r136; .loc 1 242 15 ld.u64 %r209,[%r208]; st.u64 [%r207],%r209; .loc 1 243 15 add.u64 %r211,%r299,%r137; .loc 1 243 17 ld.u64 %r213,[%r134]; mul.lo.u64 %r212,%r213,%r22; .loc 1 243 15 st.u64 [%r211],%r212; .loc 1 244 7 mul.lo.u64 %r100,%r100,%r33; .loc 1 234 7 add.u64 %r137,%r137,8; add.u64 %r136,%r136,24; setp.ne.u64 %r214,%r137,%r146; @ %r214 bra $L37; bra $L64; $L36: .loc 1 246 19 st.u64 [%r151+8],%r105; .loc 1 247 24 mov.u64 %r217,16; mov.u64 %r216217; call (%value_in),_gfortrani_xmallocarray218,[%value_in]; } mov.u64 %r105,%r218; .loc 1 247 22 st.u64 [%r151],%r105; .loc 1 278 12 ld.u64 %r150,[%frame+480]; bra $L39; $L64: .loc 1 
246 19 st.u64 [%r151+8],%r187; .loc 1 247 24 mov.u64 %r222,16222; call (%value_in),_gfortrani_xmallocarray223,[%value_in]; } .loc 1 247 22 st.u64 [%r151],%r223; bra $L40; $L35: .loc 1 251 11 ld.s8 %r93,[%r151+28]; .loc 1 253 18 mov.u64 %r226,1; st.u64 [%frame+480],%r226; .loc 1 254 7 setp.le.s64 %r227,%r93,0; @ %r227 bra $L54; .loc 1 256 13 shl.b64 %r148,%r93,3; add.u64 %r230,%frame,120; mov.u32 %r23414836,[%value_in]; } add.u64 %r128,%r151,40; add.u64 %r126,%r154,40; add.u64 %r124,%r153,40; mov.u64 %r122,0; .loc 1 206 9 cvt.u32.u64 %r38,%r122; add.u64 %r301,%frame,480; add.u64 %r300,%frame,360; add.u64 %r299,%frame,240; $L41: .loc 1 257 16 ld.u64 %r239,[%r128+16]; add.u64 %r238,%r239,1; ld.u64 %r240,[%r128+8]; sub.u64 %r50,%r238,%r240; .loc 1 257 14 add.u64 %r241,%frame,%r122; st.u64 [%r241],%r50; .loc 1 258 18 set.u32.le.s64 %r243,%r50,0; neg.s32 %r244,%r243; cvt.u16.u32 %r247,%r38; and.b16 %r246,%r247,1; cvt.u16.u32 %r250,%r244; or.b16 %r249,%r250,%r246; cvt.u32.u16 %r252,%r249; cvt.u32.u8 %r38,%r252; .loc 1 259 15 add.u64 %r254,%r301,%r122; ld.u64 %r255,[%r128]; st.u64 [%r254],%r255; .loc 1 260 15 add.u64 %r257,%r300,%r122; ld.u64 %r258,[%r126]; st.u64 [%r257],%r258; .loc 1 261 15 add.u64 %r260,%r299,%r122; .loc 1 261 17 ld.u64 %r262,[%r124]; mul.lo.u64 %r261,%r262,%r22; .loc 1 261 15 st.u64 [%r260],%r261; .loc 1 254 7 add.u64 %r128,%r128,24; add.u64 %r126,%r126,24; add.u64 %r124,%r124,24; add.u64 %r122,%r122,8; setp.ne.u64 %r263,%r122,%r148; @ %r263 bra $L41; .loc 1 263 10 ld.u64 %r264,[%frame+480]; setp.ne.u64 %r265,%r264,0; @ %r265 bra $L40; .loc 1 264 13 mov.u64 %r266,1; st.u64 [%frame+480],%r266; $L40: .loc 1 267 6 setp.ne.u32 %r267,%r38,0; @ %r267 bra $L33; .loc 1 278 12 ld.u64 %r150,[%frame+480]; .loc 1 281 8 ld.u64 %r105,[%r151]; bra $L39; $L54: .loc 1 254 7 mov.u64 %r150,%r226; $L39: .loc 1 270 14 ld.u64 %r59,[%frame+360]; .loc 1 270 6 setp.ne.u64 %r268,%r59,0; .loc 1 271 16 selp.u64 %r59,%r59,1,%r268; .loc 1 272 14 ld.u64 %r60,[%frame+240]; .loc 1 272 6 setp.ne.u64 %r269,%r60,0; .loc 1 273 16 selp.u64 %r60,%r60,1,%r269; .loc 1 275 12 ld.u64 %r91,[%r152+40]; .loc 1 276 6 setp.ne.u64 %r270,%r91,0; .loc 1 277 14 selp.u64 %r91,%r91,1,%r270; .loc 1 282 8 ld.u64 %r106,[%r154]; .loc 1 283 8 ld.u64 %r92,[%r152]; .loc 1 285 9 setp.eq.u64 %r271,%r105,0; @ %r271 bra $L33; .loc 1 291 16 shl.b64 %r65,%r91,4; .loc 1 299 12 shl.b64 %r67,%r150,4; .loc 1 300 12 shl.b64 %r69,%r59,4; ld.u64 %r27,[%frame+120]; .loc 1 304 32 ld.u64 %r108,[%frame]; .loc 1 311 30 mul.lo.u64 %r28,%r108,%r150; .loc 1 312 16 mul.lo.u64 %r272,%r108,%r59; shl.b64 %r273,%r272,4; neg.s64 %r112,%r273; .loc 1 313 30 mul.lo.u64 %r274,%r108,%r60; .loc 1 313 16 neg.s64 %r113,%r274; shl.b64 %r138,%r93,3; setp.le.s64 %r298,%r93,1; add.u64 %r306,%frame,480; add.u64 %r307,%frame,360; add.u64 %r308,%frame,240; .loc 1 308 20 mov.u64 %r309,0; $L53: .loc 1 287 10 ld.s8 %r276,[%r107]; cvt.u16.u32 %r275,%r276; setp.eq.u16 %r277,%r275,0; @ %r277 bra $L46; .loc 1 290 12 ld.f64 %r62,[%r92]; ld.f64 %r63,[%r92+8]; .loc 1 291 16 add.u64 %r92,%r92,%r65; bra $L47; $L46: .loc 1 296 12 ld.f64 %r62,[%r106]; ld.f64 %r63,[%r106+8]; $L47: st.f64 [%r105],%r62; st.f64 [%r105+8],%r63; .loc 1 299 12 add.u64 %r105,%r105,%r67; .loc 1 300 12 add.u64 %r106,%r106,%r69; .loc 1 301 12 add.u64 %r107,%r107,%r60; .loc 1 302 15 add.u64 %r27,%r27,1; .loc 1 304 13 setp.ne.u64 %r278,%r27,%r108; @ %r278 bra $L53; .loc 1 312 16 add.u64 %r103,%r106,%r112; .loc 1 313 16 add.u64 %r104,%r107,%r113; .loc 1 315 14 @ ! 
%r298 bra $L65; bra $L33; $L52: .loc 1 308 20 st.u64 [%r99],%r309; .loc 1 311 30 mul.lo.u64 %r72,%r82,%r81; .loc 1 312 30 mul.lo.u64 %r281,%r84,%r81; .loc 1 312 16 shl.b64 %r282,%r281,4; sub.u64 %r103,%r106,%r282; .loc 1 313 30 mul.lo.u64 %r283,%r87,%r81; .loc 1 313 16 sub.u64 %r104,%r107,%r283; .loc 1 315 14 add.u64 %r99,%r99,8; add.u64 %r145,%r145,8; setp.ne.u64 %r284,%r138,%r145; @ %r284 bra $L51; bra $L33; $L65: add.u64 %r99,%frame,128; .loc 1 311 30 mov.u64 %r72,%r28; .loc 1 315 14 mov.u64 %r145,8; $L51: .loc 1 323 23 ld.u64 %r286,[%r99]; add.u64 %r81,%r286,1; st.u64 [%r99],%r81; .loc 1 324 30 add.u64 %r288,%r306,%r145; ld.u64 %r82,[%r288]; .loc 1 324 20 sub.u64 %r289,%r82,%r72; shl.b64 %r290,%r289,4; add.u64 %r105,%r105,%r290; .loc 1 325 30 add.u64 %r292,%r307,%r145; ld.u64 %r84,[%r292]; .loc 1 325 20 shl.b64 %r293,%r84,4; add.u64 %r106,%r103,%r293; .loc 1 326 30 add.u64 %r295,%r308,%r145; ld.u64 %r87,[%r295]; .loc 1 326 20 add.u64 %r107,%r104,%r87; .loc 1 304 32 add.u64 %r296,%frame,%r145; ld.u64 %r89,[%r296]; .loc 1 304 13 setp.eq.u64 %r297,%r81,%r89; @ %r297 bra $L52; .loc 1 308 20 mov.u64 %r27,0; bra $L53; $L33: .loc 1 330 unpack_c10.o/ 235 273 311 350 389 428 467 ../libgfortran/generated/spread_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_spread_scalar_i1 .visible .func _gfortrani_spread_scalar_i1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray114,101,116,117,114,110,32,114,97,110,107,32,116,111,111,32,108,97,114,103,101,32,105,110,32,115,112,114,101,97,100,40,410,105,109,32,111,117,116,115,105,100,101,32,111,102,32,114,97,110,107,32,105,110,32114,97,110,107,32,109,105,115,109,97,116,99,104,32,105,110,32,115,112,114,101,97,100,40,473,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,80,82,69,65,68,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10105,110,99,111,114,114,101,99,116,32,100,101,115,116,105,110,97,116,105,111,110,32,114,97,110,107,32,105,110,32,115,112,114,101,97,100,40,41100,105,109,32,116,111,111,32,108,97,114,103,101,32,105,110,32,115,112,114,101,97,100,40,41_gfortrani_spread_i1 .visible .func _gfortrani_spreapredpredpred %r184; .reg .u64 %r185; .reg .predpredpred %r198; .reg .predpred %r228; .reg .u64 %r238; .reg .u64 %r242; .reg .pred %r243; .reg .predpredpred %r287; .reg .pred %r288; .reg .u64 %r292; .reg .predmov.u64 %r122,%ar0; mov.u64 %r123,%ar1; mov.u64 %r124,%ar2; mov.u64 %r125,%ar3; .loc 1 57 11 ld.s8 %r22,[%r123+28]; .loc 1 57 9 cvt.u32.u32 %r126,%r22; cvt.s64.s8 %r82,%r126; .loc 1 59 9 add.u64 %r83,%r82,1; .loc 1 60 6 setp.le.s64 %r127,%r83,15; @ %r127 bra $L2; .loc 1 61_gfortran_runtime_errorsetp.ge.s64 %r130,%r83,%r124; @ %r130 bra $L3; .loc 1 64_gfortran_runtime_errorr99,[%r122]; .loc 1 68 6 setp.ne.u64 %r133,%r99,0; @ %r133 bra $L4; .loc 1 75 23 st.u8 [%r122+28],%r83; .loc 1 79 7 setp.le.s64 %r134,%r83,0; @ %r134 bra $L5; .loc 1 82 19 add.u64 %r117,%r124,-1; add.u64 %r102,%r122,40; .loc 1 77 11 mov.u64 %r71,%r99; .loc 1 79 14 mov.u64 %r89,%r71; .loc 1 78 10 mov.u64 %r111,1; .loc 1 40 14 mov.u64 %r67,%r71; .loc 1 90 19 mov.u64 %r322,%r71; .loc 1 84 21 add.u64 %r323,%r125,-1; bra $L8; $L37: mov.u64 %r111,%r68; $L8: .loc 1 82 7 setp.ne.u64 %r137,%r89,%r117; @ %r137 bra $L6; .loc 1 84 21 mov.u64 
%r29,%r323; .loc 1 86 11 mul.lo.u64 %r68,%r125,%r111; mov.u64 %r67,%r111; bra $L7; $L6: .loc 1 90 19 shl.b64 %r138,%r71,3; add.u64 %r139,%frame,%r138; st.u64 [%r139+120],%r322; .loc 1 91 22 add.u64 %r143,%r71,%r71; add.u64 %r144,%r143,%r71; shl.b64 %r145,%r144,3; add.u64 %r146,%r123,%r145; ld.u64 %r149,[%r146+56]; add.u64 %r148,%r149,1; ld.u64 %r156,[%r146+48]; sub.u64 %r27,%r148,%r156; .loc 1 91 20 st.u64 [%r139],%r27; .loc 1 92 21 ld.u64 %r168,[%r146+40]; st.u64 [%r139+240],%r168; .loc 1 93 21 st.u64 [%r139+360],%r111; .loc 1 95 25 add.u64 %r29,%r27,-1; .loc 1 96 11 mul.lo.u64 %r68,%r27,%r111; .loc 1 97 11 add.u64 %r71,%r71,1; $L7: .loc 1 99 4 st.u64 [%r102+8],%r322; st.u64 [%r102+16],%r29; st.u64 [%r102],%r111; .loc 1 79 31 add.u64 %r89,%r89,1; .loc 1 79 7 add.u64 %r102,%r102,24; setp.ne.u64 %r173,%r83,%r89; @ %r173 bra $L37; bra $L45; $L5: .loc 1 101 19 st.u64 [%r122+8],%r99; .loc 1 104 24 mov.u64 %r1761(%value_in),_gfortrani_xmallocarray177,[%value_in]; } mov.u64 %r99,%r177; .loc 1 104 22 st.u64 [%r122],%r99; .loc 1 181 12 ld.u64 %r50,[%frame+240]; .loc 1 40 14 mov.u64 %r67,0; bra $L10; $L45: .loc 1 101 19 st.u64 [%r122+8],%r322; .loc 1 104 24 mov.u64 %r18181; call (%value_in),_gfortrani_xmallocarray,(%out_arg1,%out_arg2); ld.param.u64 %r182,[%value_in]; } mov.u64 %r99,%r182; .loc 1 104 22 st.u64 [%r122],%r99; .loc 1 105 10 setp.le.s64 %r184,%r68,0; @ %r184 bra $L1; .loc 1 181 12 ld.u64 %r50,[%frame+240]; bra $L10; $L4: .loc 1 115 11 ld.s8 %r185,[%r122+28]; .loc 1 115 10 setp.eq.u64 %r186,%r185,%r83; @ %r186 bra $L12; .loc 1 116 2 cvta.const.u64 %r187187_gfortran_runtime_error12: .loc 1 118 11 cvta.global.u64 %r189,_gfortrani_compile_options; ld.u32 %r75,[%r189+36]; .loc 1 118 10 setp.ne.u32 %r190,%r75,0; @ %r190 bra $L13; .loc 1 156 4 setp.gt.s64 %r191,%r185,0; @ %r191 bra $L14; .loc 1 40 14 mov.u64 %r67,0; bra $L15; $L13: .loc 1 120 4 setp.le.s64 %r192,%r185,0; @ %r192 bra $L38; .loc 1 82 19 add.u64 %r116,%r124,-1; add.u64 %r73,%r122,40; .loc 1 112 18 mov.u32 %r75,0; .loc 1 114 11 mov.u64 %r72,0; .loc 1 120 11 mov.u64 %r86,%r72; .loc 1 40 14 mov.u64 %r67,%r72; .loc 1 137 16 mov.u64 %r324,%r72; $L20: .loc 1 124 21 ld.u64 %r196,[%r73+16]; add.u64 %r195,%r196,1; .loc 1 124 19 ld.u64 %r197,[%r73+8]; sub.u64 %r85,%r195,%r197; .loc 1 125 11 setp.ne.u64 %r198,%r86,%r116; @ %r198 bra $L16; .loc 1 127 12 ld.u64 %r67,[%r73]; .loc 1 129 8 setp.eq.u64 %r199,%r125,%r85; @ %r199 bra $L17; .loc 1 130 7 st.u64 [%stack+16],%r125; st.u64 [%stack+8],%r85; add.u64 %r201,%r86,1;_gfortran_runtime_error16: .loc 1 137 16 shl.b64 %r311,%r72,3; add.u64 %r204,%frame,%r311; add.u64 %r205,%r204,120; st.u64 [%r205],%r324; .loc 1 138 19 add.u64 %r309,%r72,%r72; add.u64 %r307,%r309,%r72; shl.b64 %r210,%r307,3; add.u64 %r211,%r123,%r210; ld.u64 %r214,[%r211+56]; add.u64 %r213,%r214,1; ld.u64 %r221,[%r211+48]; sub.u64 %r41,%r213,%r221; .loc 1 138 17 st.u64 [%r204],%r41; .loc 1 139 8 setp.eq.u64 %r224,%r41,%r85; @ %r224 bra $L18; .loc 1 140 7 st.u64 [%stack+16],%r41; st.u64 [%stack+8],%r85; add.u64 %r226,%r86,1;_gfortran_runtime_error18: .loc 1 146 8 setp.gt.s64 %r228,%r41,0; .loc 1 147 18 selp.u32 %r75,%r75,1,%r228; .loc 1 148 18 ld.u64 %r238,[%r211+40]; st.u64 [%r204+240],%r238; .loc 1 149 18 ld.u64 %r242,[%r73]; st.u64 [%r204+360],%r242; .loc 1 150 8 add.u64 %r72,%r72,1; $L17: .loc 1 120 28 add.u64 %r86,%r86,1; .loc 1 120 4 add.u64 %r73,%r73,24; setp.ne.u64 %r243,%r185,%r86; @ %r243 bra $L20; bra $L21; $L14: .loc 1 82 19 add.u64 %r113,%r124,-1; add.u64 %r96,%r122,40; .loc 1 114 11 mov.u64 %r74,0; .loc 1 156 11 
mov.u64 %r84,%r74; .loc 1 40 14 mov.u64 %r67,%r74; .loc 1 164 16 mov.u64 %r325,%r74; $L24: .loc 1 160 12 ld.u64 %r121,[%r96]; .loc 1 158 11 setp.eq.u64 %r246,%r84,%r113; @ %r246 bra $L39; .loc 1 164 16 shl.b64 %r313,%r74,3; add.u64 %r248,%frame,%r313; add.u64 %r249,%r248,120; st.u64 [%r249],%r325; .loc 1 165 19 add.u64 %r310,%r74,%r74; add.u64 %r308,%r310,%r74; shl.b64 %r254,%r308,3; add.u64 %r255,%r123,%r254; ld.u64 %r258,[%r255+56]; add.u64 %r257,%r258,1; ld.u64 %r265,[%r255+48]; sub.u64 %r48,%r257,%r265; .loc 1 165 17 st.u64 [%r248],%r48; .loc 1 166 8 setp.gt.s64 %r268,%r48,0; .loc 1 167 18 selp.u32 %r75,%r75,1,%r268; .loc 1 168 18 ld.u64 %r278,[%r255+40]; st.u64 [%r248+240],%r278; .loc 1 169 18 st.u64 [%r248+360],%r121; .loc 1 170 8 add.u64 %r74,%r74,1; bra $L22; $L39: .loc 1 160 12 mov.u64 %r67,%r121; $L22: .loc 1 156 28 add.u64 %r84,%r84,1; .loc 1 156 4 add.u64 %r96,%r96,24; setp.ne.u64 %r282,%r84,%r185; @ %r282 bra $L24; $L21: .loc 1 175 10 setp.ne.u32 %r283,%r75,0; @ ! %r283 bra $L15; bra $L1; $L38: .loc 1 40 14 mov.u64 %r67,0; $L15: .loc 1 178 18 ld.u64 %r50,[%frame+240]; .loc 1 178 10 setp.eq.u64 %r284,%r50,0; selp.u64 %r50,1,%r50,%r284; $L10: .loc 1 182 12 ld.u64 %r90,[%frame+360]; .loc 1 184 8 ld.u64 %r93,[%r123]; .loc 1 186 9 setp.ne.u64 %r285,%r93,0; @ %r285 bra $L25; bra $L1; $L33: .loc 1 193 16 mov.u64 %r94,%r99; .loc 1 190 14 mov.u64 %r95,0; $L26: .loc 1 192 10 ld.u8 %r286,[%r93]; st.u8 [%r94],%r286; .loc 1 193 16 add.u64 %r94,%r94,%r67; .loc 1 190 33 add.u64 %r95,%r95,1; .loc 1 190 7 setp.ne.u64 %r287,%r125,%r95; @ %r287 bra $L26; bra $L46; $L44: .loc 1 196 12 add.u64 %r93,%r93,%r50; .loc 1 197 12 add.u64 %r99,%r99,%r90; .loc 1 198 15 add.u64 %r87,%r87,1; .loc 1 200 13 setp.ne.u64 %r288,%r30,%r87; @ %r288 bra $L44; $L35: .loc 1 208 16 add.u64 %r92,%r99,%r107; .loc 1 210 14 @ %r315 bra $L1; add.u64 %r100,%frame,128; .loc 1 207 30 mov.u64 %r53,%r91; .loc 1 210 14 mov.u64 %r110,8; bra $L30; $L31: .loc 1 204 20 st.u64 [%r100],%r320; .loc 1 207 30 mul.lo.u64 %r53,%r60,%r59; .loc 1 208 30 mul.lo.u64 %r292,%r62,%r59; .loc 1 208 16 sub.u64 %r92,%r99,%r292; .loc 1 210 14 add.u64 %r100,%r100,8; add.u64 %r110,%r110,8; setp.eq.u64 %r293,%r103,%r110; @ %r293 bra $L1; $L30: .loc 1 218 23 ld.u64 %r294,[%r100]; add.u64 %r59,%r294,1; st.u64 [%r100],%r59; .loc 1 219 30 add.u64 %r296,%r318,%r110; ld.u64 %r60,[%r296]; .loc 1 219 20 sub.u64 %r297,%r60,%r53; add.u64 %r93,%r93,%r297; .loc 1 220 30 add.u64 %r299,%r319,%r110; ld.u64 %r62,[%r299]; .loc 1 220 20 add.u64 %r99,%r92,%r62; .loc 1 200 32 add.u64 %r300,%frame,%r110; ld.u64 %r64,[%r300]; .loc 1 200 13 setp.eq.u64 %r301,%r59,%r64; @ %r301 bra $L31; .loc 1 204 20 mov.u64 %r87,0; bra $L32; $L25: ld.u64 %r87,[%frame+120]; .loc 1 200 32 ld.u64 %r30,[%frame]; .loc 1 207 30 mul.lo.u64 %r91,%r50,%r30; .loc 1 208 30 mul.lo.u64 %r302,%r90,%r30; .loc 1 208 16 neg.s64 %r107,%r302; cvt.u32.u32 %r304,%r22; cvt.s64.s8 %r303,%r304; shl.b64 %r103,%r303,3; setp.le.s64 %r315,%r82,1; setp.gt.s64 %r316,%r125,0; add.u64 %r318,%frame,240; add.u64 %r319,%frame,360; .loc 1 204 20 mov.u64 %r320,0; $L32: .loc 1 190 7 @ %r316 bra $L33; bra $L44; $L46: .loc 1 196 12 add.u64 %r93,%r93,%r50; .loc 1 197 12 add.u64 %r99,%r99,%r90; .loc 1 198 15 add.u64 %r87,%r87,1; .loc 1 200 13 setp.eq.u64 %r306,%r30,%r87; @ %r306 bra $L35; bra $L32; $L1: .loc 1 224_gfortrani_spread_scalar_i1 .visible .func _gfortrani_spread_scalar_0u64 %r46; .reg .predmov.u64 %r38,%ar2; mov.u64 %r39,%ar3; .loc 1 236 6 ld.s8 %r41,[%r36+28]; cvt.u16.u32 %r40,%r41; setp.eq.u16 %r42,%r40,1; @ %r42 bra 
$L48; .loc 1 237 5 cvta.const.u64 %r43,$LC4; {stack; call _gfortran_runtime_error48: .loc 1 239 6 setp.le.s64 %r45,%r38,1_gfortran_runtime_error49: .loc 1 242 10 ld.u64 %r23,[%r36]; .loc 1 246 7 add.u64 %r35,%r39,-1; .loc 1 242 6 setp.ne.u64 %r48,%r23,0; @ %r48 bra $L50; .loc 1 244 24 mov.u64 _gfortrani_xmallocarray244 22 st.u64 [%r36],%r23; .loc 1 245 19 mov.u64 %r53,0; st.u64 [%r36+8],%r53; .loc 1 246 7 st.u64 [%r36+48],%r53; st.u64 [%r36+56],%r35; st.u64 [%r36+40],%r50; mov.u64 %r30,%r50; bra $L51; $L50: .loc 1 251 9 ld.u64 %r30,[%r36+40]; .loc 1 250 26 ld.u64 %r57,[%r36+56]; ld.u64 %r59,[%r36+48]; .loc 1 250 55 sub.u64 %r60,%r57,%r59; .loc 1 251 7 div.s64 %r62,%r60,%r30; .loc 1 250 10 setp.ge.s64 %r63,%r62,%r35; @ %r63 bra $L51; .loc 1 252 2_gfortran_runtime_error1: .loc 1 258 3 setp.gt.s64 %r66,%r39,0; @ ! %r66 bra $L47; .loc 1 258 19 mov.u64 %r34,0; $L53: .loc 1 260 13 ld.u8 %r67,[%r37]; st.u8 [%r23],%r67; .loc 1 261 12 add.u64 %r23,%r23,%r30; .loc 1 258 40 add.u64 %r34,%r34,1; .loc 1 258 3 setp.ne.u64 %r68,%r39,%r34; @ %r68 bra $L53; $L47spread_i2.o/../libgfortran/generated/spread_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_spread_scalar_i2 .visible .func _gfortrani_spread_scalar_i2VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray114,101,116,117,114,110,32,114,97,110,107,32,116,111,111,32,108,97,114,103,101,32,105,110,32,115,112,114,101,97,100,40,410,105,109,32,111,117,116,115,105,100,101,32,111,102,32,114,97,110,107,32,105,110,32114,97,110,107,32,109,105,115,109,97,116,99,104,32,105,110,32,115,112,114,101,97,100,40,473,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,80,82,69,65,68,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10105,110,99,111,114,114,101,99,116,32,100,101,115,116,105,110,97,116,105,111,110,32,114,97,110,107,32,105,110,32,115,112,114,101,97,100,40,41100,105,109,32,116,111,111,32,108,97,114,103,101,32,105,110,32,115,112,114,101,97,100,40,41_gfortrani_spread_i2 .visible .func _gfortrani_spreapredpredpredpred %r250; .reg .predu64 %r264; .reg .u64 %r265; .reg .u64 %r272; .reg .pred %r275; .reg .u64 %r285; .reg .predpred %r292; .reg .u16 %r293; .reg .pred %r294; .reg .predpredmov.u64 %r129,%ar0; mov.u64 %r130,%ar1; mov.u64 %r131,%ar2; mov.u64 %r132,%ar3; .loc 1 57 11 ld.s8 %r22,[%r130+28]; .loc 1 57 9 cvt.u32.u32 %r133,%r22; cvt.s64.s8 %r89,%r133; .loc 1 59 9 add.u64 %r90,%r89,1; .loc 1 60 6 setp.le.s64 %r134,%r90,15; @ %r134 bra $L2; .loc 1 61 5 cvta.const.u64 %r135135_gfortran_runtime_errorsetp.ge.s64 %r137,%r90,%r131; @ %r137 bra $L3; .loc 1 64_gfortran_runtime_errorr112,[%r129]; .loc 1 68 6 setp.ne.u64 %r140,%r112,0; @ %r140 bra $L4; .loc 1 75 23 st.u8 [%r129+28],%r90; .loc 1 79 7 setp.le.s64 %r141,%r90,0; @ %r141 bra $L5; .loc 1 82 19 add.u64 %r127,%r131,-1; add.u64 %r115,%r129,40; .loc 1 77 11 mov.u64 %r80,%r112; .loc 1 79 14 mov.u64 %r97,%r80; .loc 1 78 10 mov.u64 %r121,1; .loc 1 40 14 mov.u64 %r74,%r80; .loc 1 90 19 mov.u64 %r336,%r80; .loc 1 84 21 add.u64 %r337,%r132,-1; bra $L8; $L36: mov.u64 %r121,%r76; $L8: .loc 1 82 7 setp.ne.u64 %r144,%r97,%r127; @ %r144 bra $L6; .loc 1 84 21 mov.u64 %r29,%r337; .loc 1 86 11 mul.lo.u64 %r76,%r132,%r121; mov.u64 %r74,%r121; bra $L7; $L6: .loc 1 90 19 shl.b64 
%r145,%r80,3; add.u64 %r146,%frame,%r145; st.u64 [%r146+120],%r336; .loc 1 91 22 add.u64 %r150,%r80,%r80; add.u64 %r151,%r150,%r80; shl.b64 %r152,%r151,3; add.u64 %r153,%r130,%r152; ld.u64 %r156,[%r153+56]; add.u64 %r155,%r156,1; ld.u64 %r163,[%r153+48]; sub.u64 %r27,%r155,%r163; .loc 1 91 20 st.u64 [%r146],%r27; .loc 1 92 21 ld.u64 %r175,[%r153+40]; st.u64 [%r146+240],%r175; .loc 1 93 21 st.u64 [%r146+360],%r121; .loc 1 95 25 add.u64 %r29,%r27,-1; .loc 1 96 11 mul.lo.u64 %r76,%r27,%r121; .loc 1 97 11 add.u64 %r80,%r80,1; $L7: .loc 1 99 4 st.u64 [%r115+8],%r336; st.u64 [%r115+16],%r29; st.u64 [%r115],%r121; .loc 1 79 31 add.u64 %r97,%r97,1; .loc 1 79 7 add.u64 %r115,%r115,24; setp.ne.u64 %r180,%r97,%r90; @ %r180 bra $L36; bra $L45; $L5: .loc 1 101 19 st.u64 [%r129+8],%r112; .loc 1 104 24 mov.u64 %r183,2; mov.u64 %r18218183; call (%value_in),_gfortrani_xmallocarray184,[%value_in]; } mov.u64 %r112,%r184; .loc 1 104 22 st.u64 [%r129],%r112; .loc 1 181 12 ld.u64 %r50,[%frame+240]; .loc 1 40 14 mov.u64 %r74,0; bra $L10; $L45: .loc 1 101 19 st.u64 [%r129+8],%r336; .loc 1 104 24 mov.u64 %r188,2;188; call (%value_in),_gfortrani_xmallocarray189,[%value_in]; } mov.u64 %r112,%r189; .loc 1 104 22 st.u64 [%r129],%r112; .loc 1 105 10 setp.le.s64 %r191,%r76,0; @ %r191 bra $L1; .loc 1 181 12 ld.u64 %r50,[%frame+240]; bra $L10; $L4: .loc 1 115 11 ld.s8 %r192,[%r129+28]; .loc 1 115 10 setp.eq.u64 %r193,%r192,%r90; @ %r193 bra $L12; .loc 1 116 2 cvta.const.u64 %r1944_gfortran_runtime_error12: .loc 1 118 11 cvta.global.u64 %r196,_gfortrani_compile_options; ld.u32 %r85,[%r196+36]; .loc 1 118 10 setp.ne.u32 %r197,%r85,0; @ %r197 bra $L13; .loc 1 156 4 setp.gt.s64 %r198,%r192,0; @ %r198 bra $L14; .loc 1 40 14 mov.u64 %r74,0; bra $L15; $L13: .loc 1 120 4 setp.le.s64 %r199,%r192,0; @ %r199 bra $L37; .loc 1 82 19 add.u64 %r126,%r131,-1; add.u64 %r106,%r129,40; .loc 1 112 18 mov.u32 %r85,0; .loc 1 114 11 mov.u64 %r82,0; .loc 1 120 11 mov.u64 %r94,%r82; .loc 1 40 14 mov.u64 %r74,%r82; .loc 1 137 16 mov.u64 %r338,%r82; $L20: .loc 1 124 21 ld.u64 %r203,[%r106+16]; add.u64 %r202,%r203,1; .loc 1 124 19 ld.u64 %r204,[%r106+8]; sub.u64 %r93,%r202,%r204; .loc 1 125 11 setp.ne.u64 %r205,%r94,%r126; @ %r205 bra $L16; .loc 1 127 12 ld.u64 %r74,[%r106]; .loc 1 129 8 setp.eq.u64 %r206,%r132,%r93; @ %r206 bra $L17; .loc 1 130 7 st.u64 [%stack+16],%r132; st.u64 [%stack+8],%r93; add.u64 %r208,%r94,1;_gfortran_runtime_error16: .loc 1 137 16 shl.b64 %r326,%r82,3; add.u64 %r211,%frame,%r326; add.u64 %r212,%r211,120; st.u64 [%r212],%r338; .loc 1 138 19 add.u64 %r321,%r82,%r82; add.u64 %r323,%r321,%r82; shl.b64 %r217,%r323,3; add.u64 %r218,%r130,%r217; ld.u64 %r221,[%r218+56]; add.u64 %r220,%r221,1; ld.u64 %r228,[%r218+48]; sub.u64 %r41,%r220,%r228; .loc 1 138 17 st.u64 [%r211],%r41; .loc 1 139 8 setp.eq.u64 %r231,%r41,%r93; @ %r231 bra $L18; .loc 1 140 7 st.u64 [%stack+16],%r41; st.u64 [%stack+8],%r93; add.u64 %r233,%r94,1;_gfortran_runtime_error18: .loc 1 146 8 setp.gt.s64 %r235,%r41,0; .loc 1 147 18 selp.u32 %r85,%r85,1,%r235; .loc 1 148 18 ld.u64 %r245,[%r218+40]; st.u64 [%r211+240],%r245; .loc 1 149 18 ld.u64 %r249,[%r106]; st.u64 [%r211+360],%r249; .loc 1 150 8 add.u64 %r82,%r82,1; $L17: .loc 1 120 28 add.u64 %r94,%r94,1; .loc 1 120 4 add.u64 %r106,%r106,24; setp.ne.u64 %r250,%r192,%r94; @ %r250 bra $L20; bra $L21; $L14: .loc 1 82 19 add.u64 %r125,%r131,-1; add.u64 %r75,%r129,40; .loc 1 114 11 mov.u64 %r84,0; .loc 1 156 11 mov.u64 %r92,%r84; .loc 1 40 14 mov.u64 %r74,%r84; .loc 1 164 16 mov.u64 %r339,%r84; $L24: .loc 1 160 12 
ld.u64 %r128,[%r75]; .loc 1 158 11 setp.eq.u64 %r253,%r92,%r125; @ %r253 bra $L38; .loc 1 164 16 shl.b64 %r327,%r84,3; add.u64 %r255,%frame,%r327; add.u64 %r256,%r255,120; st.u64 [%r256],%r339; .loc 1 165 19 add.u64 %r322,%r84,%r84; add.u64 %r325,%r322,%r84; shl.b64 %r261,%r325,3; add.u64 %r262,%r130,%r261; ld.u64 %r265,[%r262+56]; add.u64 %r264,%r265,1; ld.u64 %r272,[%r262+48]; sub.u64 %r48,%r264,%r272; .loc 1 165 17 st.u64 [%r255],%r48; .loc 1 166 8 setp.gt.s64 %r275,%r48,0; .loc 1 167 18 selp.u32 %r85,%r85,1,%r275; .loc 1 168 18 ld.u64 %r285,[%r262+40]; st.u64 [%r255+240],%r285; .loc 1 169 18 st.u64 [%r255+360],%r128; .loc 1 170 8 add.u64 %r84,%r84,1; bra $L22; $L38: .loc 1 160 12 mov.u64 %r74,%r128; $L22: .loc 1 156 28 add.u64 %r92,%r92,1; .loc 1 156 4 add.u64 %r75,%r75,24; setp.ne.u64 %r289,%r92,%r192; @ %r289 bra $L24; $L21: .loc 1 175 10 setp.ne.u32 %r290,%r85,0; @ ! %r290 bra $L15; bra $L1; $L37: .loc 1 40 14 mov.u64 %r74,0; $L15: .loc 1 178 18 ld.u64 %r50,[%frame+240]; .loc 1 178 10 setp.eq.u64 %r291,%r50,0; selp.u64 %r50,1,%r50,%r291; $L10: .loc 1 182 12 ld.u64 %r98,[%frame+360]; .loc 1 184 8 ld.u64 %r101,[%r130]; .loc 1 186 9 setp.ne.u64 %r292,%r101,0; @ %r292 bra $L25; bra $L1; $L40: .loc 1 190 7 mov.u64 %r102,%r112; .loc 1 190 14 mov.u64 %r103,0; $L26: .loc 1 192 10 ld.u16 %r293,[%r101]; st.u16 [%r102],%r293; .loc 1 193 16 add.u64 %r102,%r102,%r53; .loc 1 190 33 add.u64 %r103,%r103,1; .loc 1 190 7 setp.ne.u64 %r294,%r132,%r103; @ %r294 bra $L26; bra $L46; $L44: .loc 1 196 12 add.u64 %r101,%r101,%r55; .loc 1 197 12 add.u64 %r112,%r112,%r57; .loc 1 198 15 add.u64 %r95,%r95,1; .loc 1 200 13 setp.ne.u64 %r295,%r30,%r95; @ %r295 bra $L44; $L34: .loc 1 208 16 add.u64 %r100,%r112,%r116; .loc 1 210 14 @ %r329 bra $L1; add.u64 %r108,%frame,128; .loc 1 207 30 mov.u64 %r59,%r99; .loc 1 210 14 mov.u64 %r124,8; bra $L30; $L31: .loc 1 204 20 st.u64 [%r108],%r334; .loc 1 207 30 mul.lo.u64 %r59,%r67,%r66; .loc 1 208 30 mul.lo.u64 %r299,%r69,%r66; .loc 1 208 16 add.u64 %r300,%r299,%r299; sub.u64 %r100,%r112,%r300; .loc 1 210 14 add.u64 %r108,%r108,8; add.u64 %r124,%r124,8; setp.eq.u64 %r301,%r117,%r124; @ %r301 bra $L1; $L30: .loc 1 218 23 ld.u64 %r302,[%r108]; add.u64 %r66,%r302,1; st.u64 [%r108],%r66; .loc 1 219 30 add.u64 %r304,%r332,%r124; ld.u64 %r67,[%r304]; .loc 1 219 20 sub.u64 %r305,%r67,%r59; add.u64 %r306,%r305,%r305; add.u64 %r101,%r101,%r306; .loc 1 220 30 add.u64 %r308,%r333,%r124; ld.u64 %r69,[%r308]; .loc 1 220 20 add.u64 %r309,%r69,%r69; add.u64 %r112,%r100,%r309; .loc 1 200 32 add.u64 %r310,%frame,%r124; ld.u64 %r72,[%r310]; .loc 1 200 13 setp.eq.u64 %r311,%r66,%r72; @ %r311 bra $L31; .loc 1 204 20 mov.u64 %r95,0; bra $L32; $L25: .loc 1 193 16 add.u64 %r53,%r74,%r74; .loc 1 196 12 add.u64 %r55,%r50,%r50; .loc 1 197 12 add.u64 %r57,%r98,%r98; ld.u64 %r95,[%frame+120]; .loc 1 200 32 ld.u64 %r30,[%frame]; .loc 1 207 30 mul.lo.u64 %r99,%r50,%r30; .loc 1 208 30 mul.lo.u64 %r315,%r98,%r30; .loc 1 208 16 add.u64 %r316,%r315,%r315; neg.s64 %r116,%r316; cvt.u32.u32 %r318,%r22; cvt.s64.s8 %r317,%r318; shl.b64 %r117,%r317,3; setp.le.s64 %r329,%r89,1; setp.gt.s64 %r330,%r132,0; add.u64 %r332,%frame,240; add.u64 %r333,%frame,360; .loc 1 204 20 mov.u64 %r334,0; $L32: .loc 1 190 7 @ %r330 bra $L40; bra $L44; $L46: .loc 1 196 12 add.u64 %r101,%r101,%r55; .loc 1 197 12 add.u64 %r112,%r112,%r57; .loc 1 198 15 add.u64 %r95,%r95,1; .loc 1 200 13 setp.eq.u64 %r320,%r30,%r95; @ %r320 bra $L34; bra $L32; $L1: .loc 1 224_gfortrani_spread_scalar_i2 .visible .func _gfortrani_spread_scalar_0predmov.u64 
%r39,%ar2; mov.u64 %r40,%ar3; .loc 1 236 6 ld.s8 %r42,[%r37+28]; cvt.u16.u32 %r41,%r42; setp.eq.u16 %r43,%r41,1; @ %r43 bra $L48; .loc 1 237 5 cvta.const.u64 %r44,$LC4; {stack; call _gfortran_runtime_error48: .loc 1 239 6 setp.le.s64 %r46,%r39,1; @ %r46 bra $L49; .loc 1 240 5 cvta.const.u64 %r47_gfortran_runtime_error49: .loc 1 242 10 ld.u64 %r23,[%r37]; .loc 1 246 7 add.u64 %r36,%r40,-1; .loc 1 242 6 setp.ne.u64 %r49,%r23,0; @ %r49 bra $L50; .loc 1 244 24 mov.u64 %r51,call (%value_in),_gfortrani_xmallocarray244 22 st.u64 [%r37],%r23; .loc 1 245 19 mov.u64 %r54,0; st.u64 [%r37+8],%r54; .loc 1 246 7 st.u64 [%r37+48],%r54; st.u64 [%r37+56],%r36; mov.u64 %r56,1; st.u64 [%r37+40],%r56; mov.u64 %r30,%r56; bra $L51; $L50: .loc 1 251 9 ld.u64 %r30,[%r37+40]; .loc 1 250 26 ld.u64 %r58,[%r37+56]; ld.u64 %r60,[%r37+48]; .loc 1 250 55 sub.u64 %r61,%r58,%r60; .loc 1 251 7 div.s64 %r63,%r61,%r30; .loc 1 250 10 setp.ge.s64 %r64,%r63,%r36; @ %r64 bra $L51; .loc 1 252 2 cvta.const.u64 %r65,$LC5stack; call _gfortran_runtime_error1: .loc 1 258 3 setp.gt.s64 %r67,%r40,0; @ ! %r67 bra $L47; .loc 1 261 12 add.u64 %r34,%r30,%r30; .loc 1 258 19 mov.u64 %r35,0; $L53: .loc 1 260 13 ld.u16 %r69,[%r38]; st.u16 [%r23],%r69; .loc 1 261 12 add.u64 %r23,%r23,%r34; .loc 1 258 40 add.u64 %r35,%r35,1; .loc 1 258 3 setp.ne.u64 %r70,%r40,%r35; @ %r70 bra $L53; $L47 spread_i4.o/_gfortrani_spread_i4 .visible .func _gfortrani_sprea../libgfortran/generated/spread_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_spread_scalar_i4 .visible .func _gfortrani_spread_scalar_i4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray114,101,116,117,114,110,32,114,97,110,107,32,116,111,111,32,108,97,114,103,101,32,105,110,32,115,112,114,101,97,100,40,410,105,109,32,111,117,116,115,105,100,101,32,111,102,32,114,97,110,107,32,105,110,32114,97,110,107,32,109,105,115,109,97,116,99,104,32,105,110,32,115,112,114,101,97,100,40,473,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,80,82,69,65,68,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10105,110,99,111,114,114,101,99,116,32,100,101,115,116,105,110,97,116,105,111,110,32,114,97,110,107,32,105,110,32,115,112,114,101,97,100,40,41100,105,109,32,116,111,111,32,108,97,114,103,101,32,105,110,32,115,112,114,101,97,100,40,41_gfortrani_spread_i4 .visible .func _gfortrani_spreapredpredpredpred %r250; .reg .predu64 %r264; .reg .u64 %r265; .reg .u64 %r272; .reg .pred %r275; .reg .u64 %r285; .reg .predpred %r292; .reg .pred %r293; .reg .predu64 %r303; .reg .u64 %r304; .reg .u64 %r305; .reg .u64pred %r316; .reg .u64 %r317; .reg .u64 %r318; .reg .u64 %r319; .reg .u64 %r321; .reg .u64 %r322; .reg .u64 %r323; .reg .pred %r325; .reg .pred %r326; .reg .u64u64 %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; mov.u64 %r129,%ar0; mov.u64 %r130,%ar1; mov.u64 %r131,%ar2; mov.u64 %r132,%ar3; .loc 1 57 11 ld.s8 %r22,[%r130+28]; .loc 1 57 9 cvt.u32.u32 %r133,%r22; cvt.s64.s8 %r89,%r133; .loc 1 59 9 add.u64 %r90,%r89,1; .loc 1 60 6 setp.le.s64 %r134,%r90,15; @ %r134 bra $L2; .loc 1 61 5 cvta.const.u64 %r135135_gfortran_runtime_errorsetp.ge.s64 %r137,%r90,%r131; @ %r137 bra $L3; .loc 1 64_gfortran_runtime_errorr112,[%r129]; .loc 1 68 6 setp.ne.u64 %r140,%r112,0; @ 
%r140 bra $L4; .loc 1 75 23 st.u8 [%r129+28],%r90; .loc 1 79 7 setp.le.s64 %r141,%r90,0; @ %r141 bra $L5; .loc 1 82 19 add.u64 %r127,%r131,-1; add.u64 %r115,%r129,40; .loc 1 77 11 mov.u64 %r80,%r112; .loc 1 79 14 mov.u64 %r97,%r80; .loc 1 78 10 mov.u64 %r121,1; .loc 1 40 14 mov.u64 %r74,%r80; .loc 1 90 19 mov.u64 %r332,%r80; .loc 1 84 21 add.u64 %r333,%r132,-1; bra $L8; $L36: mov.u64 %r121,%r76; $L8: .loc 1 82 7 setp.ne.u64 %r144,%r97,%r127; @ %r144 bra $L6; .loc 1 84 21 mov.u64 %r29,%r333; .loc 1 86 11 mul.lo.u64 %r76,%r132,%r121; mov.u64 %r74,%r121; bra $L7; $L6: .loc 1 90 19 shl.b64 %r145,%r80,3; add.u64 %r146,%frame,%r145; st.u64 [%r146+120],%r332; .loc 1 91 22 add.u64 %r150,%r80,%r80; add.u64 %r151,%r150,%r80; shl.b64 %r152,%r151,3; add.u64 %r153,%r130,%r152; ld.u64 %r156,[%r153+56]; add.u64 %r155,%r156,1; ld.u64 %r163,[%r153+48]; sub.u64 %r27,%r155,%r163; .loc 1 91 20 st.u64 [%r146],%r27; .loc 1 92 21 ld.u64 %r175,[%r153+40]; st.u64 [%r146+240],%r175; .loc 1 93 21 st.u64 [%r146+360],%r121; .loc 1 95 25 add.u64 %r29,%r27,-1; .loc 1 96 11 mul.lo.u64 %r76,%r27,%r121; .loc 1 97 11 add.u64 %r80,%r80,1; $L7: .loc 1 99 4 st.u64 [%r115+8],%r332; st.u64 [%r115+16],%r29; st.u64 [%r115],%r121; .loc 1 79 31 add.u64 %r97,%r97,1; .loc 1 79 7 add.u64 %r115,%r115,24; setp.ne.u64 %r180,%r97,%r90; @ %r180 bra $L36; bra $L45; $L5: .loc 1 101 19 st.u64 [%r129+8],%r112; .loc 1 104 24 mov.u64 %r183,4; mov.u64 %r18218183; call (%value_in),_gfortrani_xmallocarray184,[%value_in]; } mov.u64 %r112,%r184; .loc 1 104 22 st.u64 [%r129],%r112; .loc 1 181 12 ld.u64 %r50,[%frame+240]; .loc 1 40 14 mov.u64 %r74,0; bra $L10; $L45: .loc 1 101 19 st.u64 [%r129+8],%r332; .loc 1 104 24 mov.u64 %r188,4;188; call (%value_in),_gfortrani_xmallocarray189,[%value_in]; } mov.u64 %r112,%r189; .loc 1 104 22 st.u64 [%r129],%r112; .loc 1 105 10 setp.le.s64 %r191,%r76,0; @ %r191 bra $L1; .loc 1 181 12 ld.u64 %r50,[%frame+240]; bra $L10; $L4: .loc 1 115 11 ld.s8 %r192,[%r129+28]; .loc 1 115 10 setp.eq.u64 %r193,%r192,%r90; @ %r193 bra $L12; .loc 1 116 2 cvta.const.u64 %r1944_gfortran_runtime_error12: .loc 1 118 11 cvta.global.u64 %r196,_gfortrani_compile_options; ld.u32 %r85,[%r196+36]; .loc 1 118 10 setp.ne.u32 %r197,%r85,0; @ %r197 bra $L13; .loc 1 156 4 setp.gt.s64 %r198,%r192,0; @ %r198 bra $L14; .loc 1 40 14 mov.u64 %r74,0; bra $L15; $L13: .loc 1 120 4 setp.le.s64 %r199,%r192,0; @ %r199 bra $L37; .loc 1 82 19 add.u64 %r126,%r131,-1; add.u64 %r106,%r129,40; .loc 1 112 18 mov.u32 %r85,0; .loc 1 114 11 mov.u64 %r82,0; .loc 1 120 11 mov.u64 %r94,%r82; .loc 1 40 14 mov.u64 %r74,%r82; .loc 1 137 16 mov.u64 %r334,%r82; $L20: .loc 1 124 21 ld.u64 %r203,[%r106+16]; add.u64 %r202,%r203,1; .loc 1 124 19 ld.u64 %r204,[%r106+8]; sub.u64 %r93,%r202,%r204; .loc 1 125 11 setp.ne.u64 %r205,%r94,%r126; @ %r205 bra $L16; .loc 1 127 12 ld.u64 %r74,[%r106]; .loc 1 129 8 setp.eq.u64 %r206,%r132,%r93; @ %r206 bra $L17; .loc 1 130 7 st.u64 [%stack+16],%r132; st.u64 [%stack+8],%r93; add.u64 %r208,%r94,1;_gfortran_runtime_error16: .loc 1 137 16 shl.b64 %r322,%r82,3; add.u64 %r211,%frame,%r322; add.u64 %r212,%r211,120; st.u64 [%r212],%r334; .loc 1 138 19 add.u64 %r317,%r82,%r82; add.u64 %r319,%r317,%r82; shl.b64 %r217,%r319,3; add.u64 %r218,%r130,%r217; ld.u64 %r221,[%r218+56]; add.u64 %r220,%r221,1; ld.u64 %r228,[%r218+48]; sub.u64 %r41,%r220,%r228; .loc 1 138 17 st.u64 [%r211],%r41; .loc 1 139 8 setp.eq.u64 %r231,%r41,%r93; @ %r231 bra $L18; .loc 1 140 7 st.u64 [%stack+16],%r41; st.u64 [%stack+8],%r93; add.u64 %r233,%r94,1;_gfortran_runtime_error18: .loc 
1 146 8 setp.gt.s64 %r235,%r41,0; .loc 1 147 18 selp.u32 %r85,%r85,1,%r235; .loc 1 148 18 ld.u64 %r245,[%r218+40]; st.u64 [%r211+240],%r245; .loc 1 149 18 ld.u64 %r249,[%r106]; st.u64 [%r211+360],%r249; .loc 1 150 8 add.u64 %r82,%r82,1; $L17: .loc 1 120 28 add.u64 %r94,%r94,1; .loc 1 120 4 add.u64 %r106,%r106,24; setp.ne.u64 %r250,%r192,%r94; @ %r250 bra $L20; bra $L21; $L14: .loc 1 82 19 add.u64 %r125,%r131,-1; add.u64 %r75,%r129,40; .loc 1 114 11 mov.u64 %r84,0; .loc 1 156 11 mov.u64 %r92,%r84; .loc 1 40 14 mov.u64 %r74,%r84; .loc 1 164 16 mov.u64 %r335,%r84; $L24: .loc 1 160 12 ld.u64 %r128,[%r75]; .loc 1 158 11 setp.eq.u64 %r253,%r92,%r125; @ %r253 bra $L38; .loc 1 164 16 shl.b64 %r323,%r84,3; add.u64 %r255,%frame,%r323; add.u64 %r256,%r255,120; st.u64 [%r256],%r335; .loc 1 165 19 add.u64 %r318,%r84,%r84; add.u64 %r321,%r318,%r84; shl.b64 %r261,%r321,3; add.u64 %r262,%r130,%r261; ld.u64 %r265,[%r262+56]; add.u64 %r264,%r265,1; ld.u64 %r272,[%r262+48]; sub.u64 %r48,%r264,%r272; .loc 1 165 17 st.u64 [%r255],%r48; .loc 1 166 8 setp.gt.s64 %r275,%r48,0; .loc 1 167 18 selp.u32 %r85,%r85,1,%r275; .loc 1 168 18 ld.u64 %r285,[%r262+40]; st.u64 [%r255+240],%r285; .loc 1 169 18 st.u64 [%r255+360],%r128; .loc 1 170 8 add.u64 %r84,%r84,1; bra $L22; $L38: .loc 1 160 12 mov.u64 %r74,%r128; $L22: .loc 1 156 28 add.u64 %r92,%r92,1; .loc 1 156 4 add.u64 %r75,%r75,24; setp.ne.u64 %r289,%r92,%r192; @ %r289 bra $L24; $L21: .loc 1 175 10 setp.ne.u32 %r290,%r85,0; @ ! %r290 bra $L15; bra $L1; $L37: .loc 1 40 14 mov.u64 %r74,0; $L15: .loc 1 178 18 ld.u64 %r50,[%frame+240]; .loc 1 178 10 setp.eq.u64 %r291,%r50,0; selp.u64 %r50,1,%r50,%r291; $L10: .loc 1 182 12 ld.u64 %r98,[%frame+360]; .loc 1 184 8 ld.u64 %r101,[%r130]; .loc 1 186 9 setp.ne.u64 %r292,%r101,0; @ %r292 bra $L25; bra $L1; $L40: .loc 1 190 7 mov.u64 %r102,%r112; .loc 1 190 14 mov.u64 %r103,0; $L26: .loc 1 192 12 ld.u32 %r51,[%r101]; .loc 1 192 10 st.u32 [%r102],%r51; .loc 1 193 16 add.u64 %r102,%r102,%r53; .loc 1 190 33 add.u64 %r103,%r103,1; .loc 1 190 7 setp.ne.u64 %r293,%r132,%r103; @ %r293 bra $L26; bra $L46; $L44: .loc 1 196 12 add.u64 %r101,%r101,%r55; .loc 1 197 12 add.u64 %r112,%r112,%r57; .loc 1 198 15 add.u64 %r95,%r95,1; .loc 1 200 13 setp.ne.u64 %r294,%r30,%r95; @ %r294 bra $L44; $L34: .loc 1 208 16 add.u64 %r100,%r112,%r116; .loc 1 210 14 @ %r325 bra $L1; add.u64 %r108,%frame,128; .loc 1 207 30 mov.u64 %r59,%r99; .loc 1 210 14 mov.u64 %r124,8; bra $L30; $L31: .loc 1 204 20 st.u64 [%r108],%r330; .loc 1 207 30 mul.lo.u64 %r59,%r67,%r66; .loc 1 208 30 mul.lo.u64 %r298,%r69,%r66; .loc 1 208 16 shl.b64 %r299,%r298,2; sub.u64 %r100,%r112,%r299; .loc 1 210 14 add.u64 %r108,%r108,8; add.u64 %r124,%r124,8; setp.eq.u64 %r300,%r117,%r124; @ %r300 bra $L1; $L30: .loc 1 218 23 ld.u64 %r301,[%r108]; add.u64 %r66,%r301,1; st.u64 [%r108],%r66; .loc 1 219 30 add.u64 %r303,%r328,%r124; ld.u64 %r67,[%r303]; .loc 1 219 20 sub.u64 %r304,%r67,%r59; shl.b64 %r305,%r304,2; add.u64 %r101,%r101,%r305; .loc 1 220 30 add.u64 %r307,%r329,%r124; ld.u64 %r69,[%r307]; .loc 1 220 20 shl.b64 %r308,%r69,2; add.u64 %r112,%r100,%r308; .loc 1 200 32 add.u64 %r309,%frame,%r124; ld.u64 %r72,[%r309]; .loc 1 200 13 setp.eq.u64 %r310,%r66,%r72; @ %r310 bra $L31; .loc 1 204 20 mov.u64 %r95,0; bra $L32; $L25: .loc 1 193 16 shl.b64 %r53,%r74,2; .loc 1 196 12 shl.b64 %r55,%r50,2; .loc 1 197 12 shl.b64 %r57,%r98,2; ld.u64 %r95,[%frame+120]; .loc 1 200 32 ld.u64 %r30,[%frame]; .loc 1 207 30 mul.lo.u64 %r99,%r50,%r30; .loc 1 208 30 mul.lo.u64 %r311,%r98,%r30; .loc 1 208 16 shl.b64 
%r312,%r311,2; neg.s64 %r116,%r312; cvt.u32.u32 %r314,%r22; cvt.s64.s8 %r313,%r314; shl.b64 %r117,%r313,3; setp.le.s64 %r325,%r89,1; setp.gt.s64 %r326,%r132,0; add.u64 %r328,%frame,240; add.u64 %r329,%frame,360; .loc 1 204 20 mov.u64 %r330,0; $L32: .loc 1 190 7 @ %r326 bra $L40; bra $L44; $L46: .loc 1 196 12 add.u64 %r101,%r101,%r55; .loc 1 197 12 add.u64 %r112,%r112,%r57; .loc 1 198 15 add.u64 %r95,%r95,1; .loc 1 200 13 setp.eq.u64 %r316,%r30,%r95; @ %r316 bra $L34; bra $L32; $L1: .loc 1 224_gfortrani_spread_scalar_i4 .visible .func _gfortrani_spread_scalar_0predmov.u64 %r39,%ar2; mov.u64 %r40,%ar3; .loc 1 236 6 ld.s8 %r42,[%r37+28]; cvt.u16.u32 %r41,%r42; setp.eq.u16 %r43,%r41,1; @ %r43 bra $L48; .loc 1 237 5 cvta.const.u64 %r44,$LC4; {stack; call _gfortran_runtime_error48: .loc 1 239 6 setp.le.s64 %r46,%r39,1; @ %r46 bra $L49; .loc 1 240 5 cvta.const.u64 %r47_gfortran_runtime_error49: .loc 1 242 10 ld.u64 %r23,[%r37]; .loc 1 246 7 add.u64 %r36,%r40,-1; .loc 1 242 6 setp.ne.u64 %r49,%r23,0; @ %r49 bra $L50; .loc 1 244 24 mov.u64 %r51_gfortrani_xmallocarray244 22 st.u64 [%r37],%r23; .loc 1 245 19 mov.u64 %r54,0; st.u64 [%r37+8],%r54; .loc 1 246 7 st.u64 [%r37+48],%r54; st.u64 [%r37+56],%r36; mov.u64 %r56,1; st.u64 [%r37+40],%r56; mov.u64 %r30,%r56; bra $L51; $L50: .loc 1 251 9 ld.u64 %r30,[%r37+40]; .loc 1 250 26 ld.u64 %r58,[%r37+56]; ld.u64 %r60,[%r37+48]; .loc 1 250 55 sub.u64 %r61,%r58,%r60; .loc 1 251 7 div.s64 %r63,%r61,%r30; .loc 1 250 10 setp.ge.s64 %r64,%r63,%r36; @ %r64 bra $L51; .loc 1 252 2 cvta.const.u64 %r65,$LC5stack; call _gfortran_runtime_error1: .loc 1 258 3 setp.gt.s64 %r67,%r40,0; @ ! %r67 bra $L47; .loc 1 261 12 shl.b64 %r34,%r30,2; .loc 1 258 19 mov.u64 %r35,0; $L53: .loc 1 260 15 ld.u32 %r32,[%r38]; .loc 1 260 13 st.u32 [%r23],%r32; .loc 1 261 12 add.u64 %r23,%r23,%r34; .loc 1 258 40 add.u64 %r35,%r35,1; .loc 1 258 3 setp.ne.u64 %r68,%r40,%r35; @ %r68 bra $L53; $L47 spread_i8.o/../libgfortran/generated/spread_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_spread_scalar_i8 .visible .func _gfortrani_spread_scalar_i8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray114,101,116,117,114,110,32,114,97,110,107,32,116,111,111,32,108,97,114,103,101,32,105,110,32,115,112,114,101,97,100,40,410,105,109,32,111,117,116,115,105,100,101,32,111,102,32,114,97,110,107,32,105,110,32114,97,110,107,32,109,105,115,109,97,116,99,104,32,105,110,32,115,112,114,101,97,100,40,473,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,80,82,69,65,68,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10105,110,99,111,114,114,101,99,116,32,100,101,115,116,105,110,97,116,105,111,110,32,114,97,110,107,32,105,110,32,115,112,114,101,97,100,40,41100,105,109,32,116,111,111,32,108,97,114,103,101,32,105,110,32,115,112,114,101,97,100,40,41_gfortrani_spread_i8 .visible .func _gfortrani_spreapredpredpredpred %r250; .reg .predu64 %r264; .reg .u64 %r265; .reg .u64 %r272; .reg .pred %r275; .reg .u64 %r285; .reg .predpred %r292; .reg .pred %r293; .reg .predu64 %r303; .reg .u64 %r304; .reg .u64 %r305; .reg .u64pred %r316; .reg .u64 %r317; .reg .u64 %r318; .reg .u64 %r319; .reg .u64 %r321; .reg .u64 %r322; .reg .u64 %r323; .reg .pred %r325; .reg .pred %r326; .reg 
.u64u64 %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; mov.u64 %r129,%ar0; mov.u64 %r130,%ar1; mov.u64 %r131,%ar2; mov.u64 %r132,%ar3; .loc 1 57 11 ld.s8 %r22,[%r130+28]; .loc 1 57 9 cvt.u32.u32 %r133,%r22; cvt.s64.s8 %r89,%r133; .loc 1 59 9 add.u64 %r90,%r89,1; .loc 1 60 6 setp.le.s64 %r134,%r90,15; @ %r134 bra $L2; .loc 1 61 5 cvta.const.u64 %r135135_gfortran_runtime_errorsetp.ge.s64 %r137,%r90,%r131; @ %r137 bra $L3; .loc 1 64_gfortran_runtime_errorr112,[%r129]; .loc 1 68 6 setp.ne.u64 %r140,%r112,0; @ %r140 bra $L4; .loc 1 75 23 st.u8 [%r129+28],%r90; .loc 1 79 7 setp.le.s64 %r141,%r90,0; @ %r141 bra $L5; .loc 1 82 19 add.u64 %r127,%r131,-1; add.u64 %r115,%r129,40; .loc 1 77 11 mov.u64 %r80,%r112; .loc 1 79 14 mov.u64 %r97,%r80; .loc 1 78 10 mov.u64 %r121,1; .loc 1 40 14 mov.u64 %r74,%r80; .loc 1 90 19 mov.u64 %r332,%r80; .loc 1 84 21 add.u64 %r333,%r132,-1; bra $L8; $L36: mov.u64 %r121,%r76; $L8: .loc 1 82 7 setp.ne.u64 %r144,%r97,%r127; @ %r144 bra $L6; .loc 1 84 21 mov.u64 %r29,%r333; .loc 1 86 11 mul.lo.u64 %r76,%r132,%r121; mov.u64 %r74,%r121; bra $L7; $L6: .loc 1 90 19 shl.b64 %r145,%r80,3; add.u64 %r146,%frame,%r145; st.u64 [%r146+120],%r332; .loc 1 91 22 add.u64 %r150,%r80,%r80; add.u64 %r151,%r150,%r80; shl.b64 %r152,%r151,3; add.u64 %r153,%r130,%r152; ld.u64 %r156,[%r153+56]; add.u64 %r155,%r156,1; ld.u64 %r163,[%r153+48]; sub.u64 %r27,%r155,%r163; .loc 1 91 20 st.u64 [%r146],%r27; .loc 1 92 21 ld.u64 %r175,[%r153+40]; st.u64 [%r146+240],%r175; .loc 1 93 21 st.u64 [%r146+360],%r121; .loc 1 95 25 add.u64 %r29,%r27,-1; .loc 1 96 11 mul.lo.u64 %r76,%r27,%r121; .loc 1 97 11 add.u64 %r80,%r80,1; $L7: .loc 1 99 4 st.u64 [%r115+8],%r332; st.u64 [%r115+16],%r29; st.u64 [%r115],%r121; .loc 1 79 31 add.u64 %r97,%r97,1; .loc 1 79 7 add.u64 %r115,%r115,24; setp.ne.u64 %r180,%r97,%r90; @ %r180 bra $L36; bra $L45; $L5: .loc 1 101 19 st.u64 [%r129+8],%r112; .loc 1 104 24 mov.u64 %r183,8; mov.u64 %r18218183; call (%value_in),_gfortrani_xmallocarray184,[%value_in]; } mov.u64 %r112,%r184; .loc 1 104 22 st.u64 [%r129],%r112; .loc 1 181 12 ld.u64 %r50,[%frame+240]; .loc 1 40 14 mov.u64 %r74,0; bra $L10; $L45: .loc 1 101 19 st.u64 [%r129+8],%r332; .loc 1 104 24 mov.u64 %r188(%value_in),_gfortrani_xmallocarray189,[%value_in]; } mov.u64 %r112,%r189; .loc 1 104 22 st.u64 [%r129],%r112; .loc 1 105 10 setp.le.s64 %r191,%r76,0; @ %r191 bra $L1; .loc 1 181 12 ld.u64 %r50,[%frame+240]; bra $L10; $L4: .loc 1 115 11 ld.s8 %r192,[%r129+28]; .loc 1 115 10 setp.eq.u64 %r193,%r192,%r90; @ %r193 bra $L12; .loc 1 116 2 cvta.const.u64 %r1944_gfortran_runtime_error12: .loc 1 118 11 cvta.global.u64 %r196,_gfortrani_compile_options; ld.u32 %r85,[%r196+36]; .loc 1 118 10 setp.ne.u32 %r197,%r85,0; @ %r197 bra $L13; .loc 1 156 4 setp.gt.s64 %r198,%r192,0; @ %r198 bra $L14; .loc 1 40 14 mov.u64 %r74,0; bra $L15; $L13: .loc 1 120 4 setp.le.s64 %r199,%r192,0; @ %r199 bra $L37; .loc 1 82 19 add.u64 %r126,%r131,-1; add.u64 %r106,%r129,40; .loc 1 112 18 mov.u32 %r85,0; .loc 1 114 11 mov.u64 %r82,0; .loc 1 120 11 mov.u64 %r94,%r82; .loc 1 40 14 mov.u64 %r74,%r82; .loc 1 137 16 mov.u64 %r334,%r82; $L20: .loc 1 124 21 ld.u64 %r203,[%r106+16]; add.u64 %r202,%r203,1; .loc 1 124 19 ld.u64 %r204,[%r106+8]; sub.u64 %r93,%r202,%r204; .loc 1 125 11 setp.ne.u64 %r205,%r94,%r126; @ %r205 bra $L16; .loc 1 127 12 ld.u64 %r74,[%r106]; .loc 1 129 8 setp.eq.u64 %r206,%r132,%r93; @ %r206 bra $L17; .loc 1 130 7 st.u64 [%stack+16],%r132; st.u64 [%stack+8],%r93; add.u64 %r208,%r94,1;_gfortran_runtime_error16: .loc 1 137 16 shl.b64 
[Embedded nvptx object code, not reproducible as readable text: the remainder of the PTX bodies of _gfortrani_spread_i8 and its helper _gfortrani_spread_scalar_i8, followed by archive member spread_i16.o (from "../libgfortran/generated/spread_i16.c"). The spread_i16 member declares _gfortrani_spread_i16 and _gfortrani_spread_scalar_i16, references the external symbols _gfortrani_compile_options, _gfortran_runtime_error and _gfortrani_xmallocarray, and carries byte-encoded constant strings for the spread() runtime errors: "return rank too large in spread()", "dim outside of rank in spread()", "rank mismatch in spread()", "Incorrect extent in return value of SPREAD intrinsic in dimension %ld: is %ld, should be %ld", "incorrect destination rank in spread()", and "dim too large in spread()". The first part of the _gfortrani_spread_i16 body follows.]
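For orientation, the _gfortrani_spread_scalar_<kind> helpers named above handle the case where the SPREAD source is a scalar: the result is filled with ncopies copies of that single value, stepping by the result array's stride between stores, which is what the final store loop in each helper's PTX does. A minimal C sketch of that behaviour, illustrative only (the function and parameter names are made up; this is not libgfortran's actual code):

#include <stddef.h>
#include <stdint.h>

/* Sketch: write `ncopies` copies of the scalar *source into the result,
 * advancing by `stride` elements between copies (the result need not be
 * contiguous).  Shown for 64-bit integers, i.e. the i8 kind. */
static void
spread_scalar_sketch (int64_t *dest, const int64_t *source,
                      size_t ncopies, ptrdiff_t stride)
{
  for (size_t n = 0; n < ncopies; n++)
    {
      *dest = *source;   /* copy the one source element   */
      dest += stride;    /* step by the result stride     */
    }
}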
[Continuation of the _gfortrani_spread_i16 PTX body (16-byte elements are copied as pairs of 64-bit loads and stores) and of the _gfortrani_spread_scalar_i16 helper, followed by archive member spread_r4.o (from "../libgfortran/generated/spread_r4.c") with the same external declarations and error strings as spread_i16.o and the start of the _gfortrani_spread_r4 body.]
[Continuation of the _gfortrani_spread_r4 PTX body and the _gfortrani_spread_scalar_r4 helper, followed by archive member spread_r8.o (from "../libgfortran/generated/spread_r8.c") with the same declarations and error strings and the start of the _gfortrani_spread_r8 body.]
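All of these spread_<kind> variants implement the same Fortran SPREAD intrinsic and differ only in element size, visible in the PTX shift amounts used to scale strides (3 for the 8-byte kinds, 2 for r4, 4 for the 16-byte kinds): every source element is replicated ncopies times along a new dimension of the result. A minimal C sketch for the simplest case, a contiguous rank-1 source, illustrative only (the names and the flat row-major layout are assumptions, not libgfortran's code):

#include <stddef.h>

/* Sketch: spread a length-n source along a new dimension with `ncopies`
 * copies; result holds ncopies copies of each source element. */
static void
spread_rank1_sketch (double *result, const double *source,
                     size_t n, size_t ncopies)
{
  for (size_t i = 0; i < n; i++)          /* walk the source          */
    for (size_t c = 0; c < ncopies; c++)  /* replicate along new dim  */
      result[i * ncopies + c] = source[i];
}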
[Continuation of the _gfortrani_spread_r8 PTX body and the _gfortrani_spread_scalar_r8 helper, an entry for spread_r10.o, and archive member spread_c4.o (from "../libgfortran/generated/spread_c4.c") with the same declarations and error strings and the start of the _gfortrani_spread_c4 body.]
[Continuation of the _gfortrani_spread_c4 PTX body (complex(4) elements are copied as pairs of 32-bit floats) and the _gfortrani_spread_scalar_c4 helper, followed by archive member spread_c8.o (from "../libgfortran/generated/spread_c8.c") with the same declarations and error strings and the start of the _gfortrani_spread_c8 body.]
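Each of these members also imports _gfortrani_xmallocarray, which the PTX calls to allocate the return array's data when its base pointer is still null. A hedged sketch of the kind of overflow-checked array allocation such a helper performs (the name xmallocarray_sketch and the exact error handling are assumptions, not libgfortran's implementation):

#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* Sketch: allocate nelem * elem_size bytes without letting the
 * multiplication wrap around; abort on overflow or exhaustion. */
static void *
xmallocarray_sketch (size_t nelem, size_t elem_size)
{
  if (elem_size != 0 && nelem > SIZE_MAX / elem_size)
    {
      fprintf (stderr, "allocation size overflow\n");
      abort ();
    }
  void *p = malloc (nelem ? nelem * elem_size : 1);  /* never return NULL for 0 */
  if (p == NULL)
    {
      fprintf (stderr, "out of memory\n");
      abort ();
    }
  return p;
}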
[Continuation of the _gfortrani_spread_c8 PTX body (complex(8) elements are copied as pairs of 64-bit floats) and the _gfortrani_spread_scalar_c8 helper, an entry for spread_c10.o, and archive member cshift0_i1.o (from "../../../../libgfortran/generated/cshift0_i1.c") containing the declaration and the start of the _gfortrani_cshift0_i1 body; a sketch of the circular-shift operation it implements follows.]
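_gfortrani_cshift0_i1 performs a circular shift of 1-byte integer array sections: the shift count is reduced modulo the extent (the PTX uses rem.s64 and adds the extent back when the remainder is negative) and elements are then copied around the wrap point. A minimal C sketch for a contiguous rank-1 case, illustrative only (names are assumptions):

#include <stddef.h>
#include <stdint.h>

/* Sketch: result[i] = source[(i + shift) mod n], with shift reduced
 * into [0, n) first since C's % may be negative for negative shift. */
static void
cshift0_rank1_sketch (int8_t *result, const int8_t *source,
                      size_t n, ptrdiff_t shift)
{
  if (n == 0)
    return;
  ptrdiff_t s = shift % (ptrdiff_t) n;
  if (s < 0)
    s += (ptrdiff_t) n;
  for (size_t i = 0; i < n; i++)
    result[i] = source[(i + (size_t) s) % n];
}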
[PTX payload, garbled: remainder of `_gfortrani_cshift0_i1`; start of `_gfortrani_cshift0_i2` from member `cshift0_i2.o`, `.file 1 "../../../../libgfortran/generated/cshift0_i2.c"`. Instruction text not recoverable; omitted.]
[PTX payload, garbled: remainder of `_gfortrani_cshift0_i2`; start of `_gfortrani_cshift0_i4` from member `cshift0_i4.o`, `.file 1 ".../libgfortran/generated/cshift0_i4.c"`. Instruction text not recoverable; omitted.]
[PTX payload, garbled: remainder of `_gfortrani_cshift0_i4`; start of `_gfortrani_cshift0_i8` from member `cshift0_i8.o`, `.file 1 "../../../../libgfortran/generated/cshift0_i8.c"`. Instruction text not recoverable; omitted.]
[PTX payload, garbled: remainder of `_gfortrani_cshift0_i8`; start of `_gfortrani_cshift0_i16` from member `cshift0_i16.o`, `.file 1 "../../../../libgfortran/generated/cshift0_i16.c"`. Instruction text not recoverable; omitted.]
[PTX payload, garbled: remainder of `_gfortrani_cshift0_i16`; start of `_gfortrani_cshift0_r4` from member `cshift0_r4.o`, `.file 1 ".../libgfortran/generated/cshift0_r4.c"`. Instruction text not recoverable; omitted.]
[PTX payload, garbled: remainder of `_gfortrani_cshift0_r4`; start of `_gfortrani_cshift0_r8` from member `cshift0_r8.o`, `.file 1 "../../../../libgfortran/generated/cshift0_r8.c"`. Instruction text not recoverable; omitted.]
[PTX payload, garbled: remainder of `_gfortrani_cshift0_r8`; a stray `cshift0_r10` token; start of `_gfortrani_cshift0_c4` from member `cshift0_c4.o`, `.file 1 ".../libgfortran/generated/cshift0_c4.c"` (this listing continues beyond the present span). Instruction text not recoverable; omitted.]
%r174 bra $L49; bra $L4; $L2: .loc 1 133 27 cvt.u32.u32 %r175,%r114; cvt.s64.s8 %r98,%r175; .loc 1 133 7 setp.gt.s64 %r176,%r98,0; @ %r176 bra $L4; .loc 1 164 8 ld.u64 %r138,[%r162]; .loc 1 165 8 ld.u64 %r139,[%r163]; mov.u64 %r99,%r166; mov.u64 %r161,%r168; .loc 1 155 13 mov.u64 %r97,%r161; .loc 1 165 8 mov.u64 %r94,%r161; mov.u64 %r54,%r161; bra $L6; $L49: add.u64 %r69,%r163,%r53; .loc 1 87 7 ld.u64 %r177,[%r69]; setp.ne.u64 %r178,%r177,%r119; @ %r178 bra $L4; .loc 1 92 12 ld.u64 %r180,[%r127+16]; ld.u64 %r182,[%r127+8]; sub.u64 %r181,%r180,%r182; .loc 1 92 9 mad.lo.u64 %r118,%r181,%r118,%r118; .loc 1 93 12 ld.u64 %r184,[%r69+16]; ld.u64 %r186,[%r69+8]; sub.u64 %r185,%r184,%r186; .loc 1 93 9 mad.lo.u64 %r119,%r185,%r119,%r119; .loc 1 77 30 add.u64 %r120,%r120,1; .loc 1 77 7 add.u64 %r53,%r53,24; setp.ne.u64 %r187,%r98,%r120; @ %r187 bra $L7; bra $L3; $L33: .loc 1 123 13 cvt.u32.u32 %r189,%r114; cvt.s64.s8 %r188,%r189; sub.u64 %r190,%r188,%r335; shl.b64 %r156,%r190,3; add.u64 %r191,%frame,120; mov.u32 %r1add.u64 %r200,%r335,%r335; add.u64 %r201,%r200,%r335; shl.b64 %r202,%r201,3; add.u64 %r44,%r202,40; add.u64 %r76,%r163,%r44; mov.u64 %r35,%frame; add.u64 %r103,%r162,%r44; mov.u64 %r49,0; add.u64 %r337,%frame,360; add.u64 %r338,%frame,240; $L8: .loc 1 124 16 ld.u64 %r204,[%r76+16]; add.u64 %r203,%r204,1; ld.u64 %r206,[%r76+8]; sub.u64 %r205,%r203,%r206; .loc 1 124 14 st.u64 [%r35],%r205; .loc 1 125 15 add.u64 %r208,%r337,%r49; ld.u64 %r209,[%r103]; st.u64 [%r208],%r209; .loc 1 126 15 add.u64 %r211,%r338,%r49; ld.u64 %r212,[%r76]; st.u64 [%r211],%r212; .loc 1 121 7 add.u64 %r76,%r76,24; add.u64 %r35,%r35,8; add.u64 %r103,%r103,24; add.u64 %r49,%r49,8; setp.ne.u64 %r213,%r49,%r156; @ %r213 bra $L8; .loc 1 162 12 ld.u64 %r161,[%frame+360]; .loc 1 163 12 ld.u64 %r97,[%frame+240]; $L34: .loc 1 129 41 sub.u32 %r214,%r114,%r110; .loc 1 129 11 cvt.s64.s32 %r98,%r214; .loc 1 117 15 mov.u64 %r94,1; .loc 1 116 15 mov.u64 %r54,%r94; bra $L9; $L4: .loc 1 135 12 cvt.s64.s32 %r153,%r110; add.u64 %r106,%r163,40; add.u64 %r102,%r162,40; mov.u64 %r101,0; mov.u64 %r99,%r101; mov.u64 %r123,%r101; mov.u64 %r94,1; mov.u64 %r54,%r94; .loc 1 147 17 mov.u64 %r346,%r101; $L14: .loc 1 143 14 ld.u64 %r216,[%r106+16]; add.u64 %r215,%r216,1; .loc 1 143 12 ld.u64 %r217,[%r106+8]; sub.u64 %r152,%r215,%r217; .loc 1 137 16 ld.u64 %r154,[%r102]; .loc 1 140 16 ld.u64 %r143,[%r106]; .loc 1 135 7 setp.ne.u64 %r218,%r153,%r123; @ %r218 bra $L10; .loc 1 138 11 setp.ne.u64 %r219,%r154,0; .loc 1 139 11 selp.u64 %r54,%r154,1,%r219; .loc 1 141 11 setp.ne.u64 %r220,%r143,0; .loc 1 142 11 selp.u64 %r94,%r143,1,%r220; .loc 1 143 12 mov.u64 %r99,%r152; bra $L13; $L10: .loc 1 147 17 shl.b64 %r221,%r101,3; add.u64 %r222,%frame,%r221; add.u64 %r223,%r222,120; st.u64 [%r223],%r346; .loc 1 148 18 st.u64 [%r222],%r152; .loc 1 149 19 st.u64 [%r222+360],%r154; .loc 1 150 19 st.u64 [%r222+240],%r143; .loc 1 151 9 add.u64 %r101,%r101,1; $L13: .loc 1 133 59 add.u64 %r123,%r123,1; .loc 1 133 7 add.u64 %r106,%r106,24; add.u64 %r102,%r102,24; setp.lt.s64 %r233,%r123,%r98; @ %r233 bra $L14; .loc 1 154 18 ld.u64 %r97,[%frame+240]; .loc 1 156 18 ld.u64 %r161,[%frame+360]; .loc 1 154 10 setp.ne.u64 %r234,%r97,0; .loc 1 155 13 selp.u64 %r97,%r97,1,%r234; .loc 1 156 10 setp.eq.u64 %r235,%r161,0; selp.u64 %r161,1,%r161,%r235; $L9: .loc 1 164 8 ld.u64 %r138,[%r162]; .loc 1 165 8 ld.u64 %r139,[%r163]; .loc 1 168 13 shr.u64 %r237,%r164,63; .loc 1 168 26 set.u32.ge.s64 %r239,%r164,%r99; neg.s32 %r240,%r239; .loc 1 168 17 cvt.u16.u64 %r243,%r237; cvt.u16.u32 
%r244,%r240; or.b16 %r242,%r243,%r244; .loc 1 168 6 cvt.u32.u16 %r245,%r242; cvt.u16.u8 %r246,%r245; setp.eq.u16 %r247,%r246,0; @ %r247 bra $L18; .loc 1 170 28 setp.ne.u64 %r248,%r99,0; @ %r248 bra $L17; $L6: mov.u64 %r164,0; bra $L18; $L17: rem.s64 %r164,%r164,%r99; .loc 1 171 10 setp.ge.s64 %r251,%r164,0; @ %r251 bra $L18; .loc 1 172 8 add.u64 %r164,%r164,%r99; bra $L18; $L32: shl.b64 %r160,%r94,3; .loc 1 193 36 mul.lo.u64 %r62,%r164,%r160; .loc 1 198 13 shl.b64 %r66,%r54,3; .loc 1 183 11 shl.b64 %r134,%r164,3; .loc 1 184 23 sub.u64 %r145,%r99,%r164; .loc 1 184 11 shl.b64 %r135,%r145,3; .loc 1 210 12 shl.b64 %r71,%r161,3; .loc 1 211 12 shl.b64 %r73,%r97,3; .loc 1 181 35 set.u32.eq.u64 %r253,%r54,1; neg.s32 %r254,%r253; .loc 1 181 19 set.u32.eq.u64 %r256,%r94,1; neg.s32 %r257,%r256; .loc 1 181 24.loc 1 214 32 ld.u64 %r100,[%frame]; mul.lo.u64 %r158,%r66,%r145; .loc 1 221 30 mul.lo.u64 %r137,%r161,%r100; .loc 1 222 30 mul.lo.u64 %r262,%r97,%r100; .loc 1 222 16 shl.b64 %r263,%r262,3; neg.s64 %r155,%r263; shl.b64 %r264,%r98,3; add.u64 %r108,%r264,-8; setp.eq.u32 %r340,%r105,0; setp.gt.s64 %r339,%r98,2; .loc 1 195 4 setp.le.s64 %r342,%r145,0; .loc 1 201 4 setp.gt.s64 %r343,%r164,0; add.u64 %r344,%frame,360; add.u64 %r345,%frame,240; $L35: .loc 1 181 10 @ %r340 bra $L19; .loc 1 185 23 add.u64 %r266,%r139,%r134; .loc 1 185 413r26613273,[%value_in]; } .loc 1 186 17 add.u64 %r275,%r138,%r135; .loc 1 186 4342,[%value_in]; } bra $L20; $L19: .loc 1 193 25 add.u64 %r132,%r139,%r62; .loc 1 195 4 @ %r342 bra $L39; mov.u64 %r131,%r138; .loc 1 195 11 mov.u64 %r133,0; $L22: .loc 1 197 16 ld.f32 %r63,[%r132]; ld.f32 %r64,[%r132+4]; .loc 1 197 14 st.f32 [%r131],%r63; st.f32 [%r131+4],%r64; .loc 1 198 13 add.u64 %r131,%r131,%r66; .loc 1 199 12 add.u64 %r132,%r132,%r160; .loc 1 195 34 add.u64 %r133,%r133,1; .loc 1 195 4 setp.ne.u64 %r285,%r133,%r145; @ %r285 bra $L22; .loc 1 198 13 add.u64 %r128,%r138,%r158; bra $L21; $L39: .loc 1 195 4 mov.u64 %r128,%r138; $L21: .loc 1 201 4 @ ! %r343 bra $L20; mov.u64 %r129,%r139; .loc 1 201 23 mov.u64 %r130,0; $L23: .loc 1 203 16 ld.f32 %r67,[%r129]; ld.f32 %r68,[%r129+4]; .loc 1 203 14 st.f32 [%r128],%r67; st.f32 [%r128+4],%r68; .loc 1 204 13 add.u64 %r128,%r128,%r66; .loc 1 205 12 add.u64 %r129,%r129,%r160; .loc 1 201 40 add.u64 %r130,%r130,1; .loc 1 201 4 setp.ne.u64 %r287,%r130,%r164; @ %r287 bra $L23; $L20: .loc 1 210 12 add.u64 %r138,%r138,%r71; .loc 1 211 12 add.u64 %r139,%r139,%r73; .loc 1 212 15 ld.u64 %r288,[%frame+120]; add.u64 %r75,%r288,1; .loc 1 214 13 setp.eq.u64 %r289,%r75,%r100; @ %r289 bra $L25; .loc 1 212 15 st.u64 [%frame+120],%r75; bra $L35; $L25: .loc 1 218 20 mov.u64 %r290,0; st.u64 [%frame+120],%r290; .loc 1 222 16 add.u64 %r136,%r139,%r155; .loc 1 224 14 @ %r339 bra $L27; bra $L1; $L31: .loc 1 218 20 st.u64 [%r125],%r290; .loc 1 221 30 mul.lo.u64 %r78,%r86,%r85; .loc 1 222 30 mul.lo.u64 %r293,%r88,%r85; .loc 1 222 16 shl.b64 %r294,%r293,3; sub.u64 %r136,%r139,%r294; .loc 1 224 14 add.u64 %r125,%r125,8; add.u64 %r150,%r150,8; setp.eq.u64 %r295,%r108,%r150; @ ! 
%r295 bra $L30; bra $L1; $L27: add.u64 %r125,%frame,128; .loc 1 221 30 mov.u64 %r78,%r137; .loc 1 224 14 mov.u64 %r150,8; $L30: .loc 1 232 23 ld.u64 %r297,[%r125]; add.u64 %r85,%r297,1; st.u64 [%r125],%r85; .loc 1 233 30 add.u64 %r299,%r344,%r150; ld.u64 %r86,[%r299]; .loc 1 233 20 sub.u64 %r300,%r86,%r78; shl.b64 %r301,%r300,3; add.u64 %r138,%r138,%r301; .loc 1 234 30 add.u64 %r303,%r345,%r150; ld.u64 %r88,[%r303]; .loc 1 234 20 shl.b64 %r304,%r88,3; add.u64 %r139,%r136,%r304; .loc 1 214 32 add.u64 %r305,%frame,%r150; ld.u64 %r92,[%r305]; .loc 1 214 13 setp.eq.u64 %r306,%r85,%r92; @ %r306 bra $L31; bra $L35; $L18: .loc 1 175 9 setp.ne.u64 %r307,%r138,0; @ %r307 bra $L32; bra $L1; $L3: .loc 1 114 18 mov.u64 %r308,1; st.u64 [%frame+240],%r308; .loc 1 115 18 st.u64 [%frame+360],%r308; .loc 1 118 13 cvt.s64.s32 %r310,%r110; add.u64 %r312,%r310,%r310; add.u64 %r313,%r312,%r310; shl.b64 %r314,%r313,3; add.u64 %r315,%r163,%r314; ld.u64 %r30,[%r315+40]; .loc 1 119 4 ld.u64 %r325,[%r315+56]; ld.u64 %r334,[%r315+48]; sub.u64 %r333,%r325,%r334; .loc 1 118 11 mad.lo.u64 %r99,%r333,%r30,%r30; .loc 1 120 13 mul.lo.u64 %r164,%r164,%r30; .loc 1 121 16 cvt.s64.s32 %r335,%r165; .loc 1 121 7 setp.gt.s64 %r336,%r98,%r335; @ %r336 bra $L33; mov.u64 %r97,%r308; mov.u64 %r161,%r97; bra $L34; $L1: .loc 1 240cshift0_c8.o/gfortrani_cshift0_c8 .visible .func _gfortrani_cshift0_.file 1 "../../../../libgfortran/generated/cshift0_c8.c"ortrani_cshift0_c8 .visible .func _gfortrani_cshift0_pred %r213; .reg .u32predu64 %r223; .reg .predu32 %r240; .reg .u16predpredu64 %r312; .reg .u64u64 %r325; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .predmov.u64 %r162,%ar0; mov.u64 %r163,%ar1; mov.u64 %r164,%ar2; mov.u32 %r165,%ar3; .loc 1 57 9 add.u32 %r110,%r165,-1; .loc 1 58 14 mov.u64 %r166,0; st.u64 [%frame+240],%r166; .loc 1 59 14 st.u64 [%frame+360],%r166; .loc 1 61 13 mov.u64 %r168,1; st.u64 [%frame],%r168; .loc 1 62 12 st.u64 [%frame+120],%r166; .loc 1 76 13 ld.s8 %r114,[%r163+28]; .loc 1 72 6 setp.le.s32 %r170,%r110,0; @ %r170 bra $L2; .loc 1 76 11 cvt.u32.u32 %r171,%r114; cvt.s64.s8 %r98,%r171; .loc 1 77 7 setp.le.s64 %r172,%r98,0; @ %r172 bra $L3; mov.u64 %r53,40; .loc 1 70 8 mov.u64 %r119,%r168; .loc 1 69 8 mov.u64 %r118,%r119; .loc 1 77 14 mov.u64 %r120,%r166; $L7: add.u64 %r127,%r162,%r53; .loc 1 81 7 ld.u64 %r173,[%r127]; setp.ne.u64 %r174,%r173,%r118; @ ! 
%r174 bra $L49; bra $L4; $L2: .loc 1 133 27 cvt.u32.u32 %r175,%r114; cvt.s64.s8 %r98,%r175; .loc 1 133 7 setp.gt.s64 %r176,%r98,0; @ %r176 bra $L4; .loc 1 164 8 ld.u64 %r138,[%r162]; .loc 1 165 8 ld.u64 %r139,[%r163]; mov.u64 %r99,%r166; mov.u64 %r161,%r168; .loc 1 155 13 mov.u64 %r97,%r161; .loc 1 165 8 mov.u64 %r94,%r161; mov.u64 %r54,%r161; bra $L6; $L49: add.u64 %r69,%r163,%r53; .loc 1 87 7 ld.u64 %r177,[%r69]; setp.ne.u64 %r178,%r177,%r119; @ %r178 bra $L4; .loc 1 92 12 ld.u64 %r180,[%r127+16]; ld.u64 %r182,[%r127+8]; sub.u64 %r181,%r180,%r182; .loc 1 92 9 mad.lo.u64 %r118,%r181,%r118,%r118; .loc 1 93 12 ld.u64 %r184,[%r69+16]; ld.u64 %r186,[%r69+8]; sub.u64 %r185,%r184,%r186; .loc 1 93 9 mad.lo.u64 %r119,%r185,%r119,%r119; .loc 1 77 30 add.u64 %r120,%r120,1; .loc 1 77 7 add.u64 %r53,%r53,24; setp.ne.u64 %r187,%r98,%r120; @ %r187 bra $L7; bra $L3; $L33: .loc 1 123 13 cvt.u32.u32 %r189,%r114; cvt.s64.s8 %r188,%r189; sub.u64 %r190,%r188,%r335; shl.b64 %r156,%r190,3; add.u64 %r191,%frame,120; mov.u32 %r1add.u64 %r200,%r335,%r335; add.u64 %r201,%r200,%r335; shl.b64 %r202,%r201,3; add.u64 %r44,%r202,40; add.u64 %r76,%r163,%r44; mov.u64 %r35,%frame; add.u64 %r103,%r162,%r44; mov.u64 %r49,0; add.u64 %r337,%frame,360; add.u64 %r338,%frame,240; $L8: .loc 1 124 16 ld.u64 %r204,[%r76+16]; add.u64 %r203,%r204,1; ld.u64 %r206,[%r76+8]; sub.u64 %r205,%r203,%r206; .loc 1 124 14 st.u64 [%r35],%r205; .loc 1 125 15 add.u64 %r208,%r337,%r49; ld.u64 %r209,[%r103]; st.u64 [%r208],%r209; .loc 1 126 15 add.u64 %r211,%r338,%r49; ld.u64 %r212,[%r76]; st.u64 [%r211],%r212; .loc 1 121 7 add.u64 %r76,%r76,24; add.u64 %r35,%r35,8; add.u64 %r103,%r103,24; add.u64 %r49,%r49,8; setp.ne.u64 %r213,%r49,%r156; @ %r213 bra $L8; .loc 1 162 12 ld.u64 %r161,[%frame+360]; .loc 1 163 12 ld.u64 %r97,[%frame+240]; $L34: .loc 1 129 41 sub.u32 %r214,%r114,%r110; .loc 1 129 11 cvt.s64.s32 %r98,%r214; .loc 1 117 15 mov.u64 %r94,1; .loc 1 116 15 mov.u64 %r54,%r94; bra $L9; $L4: .loc 1 135 12 cvt.s64.s32 %r153,%r110; add.u64 %r106,%r163,40; add.u64 %r102,%r162,40; mov.u64 %r101,0; mov.u64 %r99,%r101; mov.u64 %r123,%r101; mov.u64 %r94,1; mov.u64 %r54,%r94; .loc 1 147 17 mov.u64 %r346,%r101; $L14: .loc 1 143 14 ld.u64 %r216,[%r106+16]; add.u64 %r215,%r216,1; .loc 1 143 12 ld.u64 %r217,[%r106+8]; sub.u64 %r152,%r215,%r217; .loc 1 137 16 ld.u64 %r154,[%r102]; .loc 1 140 16 ld.u64 %r143,[%r106]; .loc 1 135 7 setp.ne.u64 %r218,%r153,%r123; @ %r218 bra $L10; .loc 1 138 11 setp.ne.u64 %r219,%r154,0; .loc 1 139 11 selp.u64 %r54,%r154,1,%r219; .loc 1 141 11 setp.ne.u64 %r220,%r143,0; .loc 1 142 11 selp.u64 %r94,%r143,1,%r220; .loc 1 143 12 mov.u64 %r99,%r152; bra $L13; $L10: .loc 1 147 17 shl.b64 %r221,%r101,3; add.u64 %r222,%frame,%r221; add.u64 %r223,%r222,120; st.u64 [%r223],%r346; .loc 1 148 18 st.u64 [%r222],%r152; .loc 1 149 19 st.u64 [%r222+360],%r154; .loc 1 150 19 st.u64 [%r222+240],%r143; .loc 1 151 9 add.u64 %r101,%r101,1; $L13: .loc 1 133 59 add.u64 %r123,%r123,1; .loc 1 133 7 add.u64 %r106,%r106,24; add.u64 %r102,%r102,24; setp.lt.s64 %r233,%r123,%r98; @ %r233 bra $L14; .loc 1 154 18 ld.u64 %r97,[%frame+240]; .loc 1 156 18 ld.u64 %r161,[%frame+360]; .loc 1 154 10 setp.ne.u64 %r234,%r97,0; .loc 1 155 13 selp.u64 %r97,%r97,1,%r234; .loc 1 156 10 setp.eq.u64 %r235,%r161,0; selp.u64 %r161,1,%r161,%r235; $L9: .loc 1 164 8 ld.u64 %r138,[%r162]; .loc 1 165 8 ld.u64 %r139,[%r163]; .loc 1 168 13 shr.u64 %r237,%r164,63; .loc 1 168 26 set.u32.ge.s64 %r239,%r164,%r99; neg.s32 %r240,%r239; .loc 1 168 17 cvt.u16.u64 %r243,%r237; cvt.u16.u32 
%r244,%r240; or.b16 %r242,%r243,%r244; .loc 1 168 6 cvt.u32.u16 %r245,%r242; cvt.u16.u8 %r246,%r245; setp.eq.u16 %r247,%r246,0; @ %r247 bra $L18; .loc 1 170 28 setp.ne.u64 %r248,%r99,0; @ %r248 bra $L17; $L6: mov.u64 %r164,0; bra $L18; $L17: rem.s64 %r164,%r164,%r99; .loc 1 171 10 setp.ge.s64 %r251,%r164,0; @ %r251 bra $L18; .loc 1 172 8 add.u64 %r164,%r164,%r99; bra $L18; $L32: shl.b64 %r160,%r94,4; .loc 1 193 36 mul.lo.u64 %r62,%r164,%r160; .loc 1 198 13 shl.b64 %r66,%r54,4; .loc 1 183 11 shl.b64 %r134,%r164,4; .loc 1 184 23 sub.u64 %r145,%r99,%r164; .loc 1 184 11 shl.b64 %r135,%r145,4; .loc 1 210 12 shl.b64 %r71,%r161,4; .loc 1 211 12 shl.b64 %r73,%r97,4; .loc 1 181 35 set.u32.eq.u64 %r253,%r54,1; neg.s32 %r254,%r253; .loc 1 181 19 set.u32.eq.u64 %r256,%r94,1; neg.s32 %r257,%r256; .loc 1 181 24.loc 1 214 32 ld.u64 %r100,[%frame]; mul.lo.u64 %r158,%r66,%r145; .loc 1 221 30 mul.lo.u64 %r137,%r161,%r100; .loc 1 222 30 mul.lo.u64 %r262,%r97,%r100; .loc 1 222 16 shl.b64 %r263,%r262,4; neg.s64 %r155,%r263; shl.b64 %r264,%r98,3; add.u64 %r108,%r264,-8; setp.eq.u32 %r340,%r105,0; setp.gt.s64 %r339,%r98,2; .loc 1 195 4 setp.le.s64 %r342,%r145,0; .loc 1 201 4 setp.gt.s64 %r343,%r164,0; add.u64 %r344,%frame,360; add.u64 %r345,%frame,240; $L35: .loc 1 181 10 @ %r340 bra $L19; .loc 1 185 23 add.u64 %r266,%r139,%r134; .loc 1 185 413r26613273,[%value_in]; } .loc 1 186 17 add.u64 %r275,%r138,%r135; .loc 1 186 4342,[%value_in]; } bra $L20; $L19: .loc 1 193 25 add.u64 %r132,%r139,%r62; .loc 1 195 4 @ %r342 bra $L39; mov.u64 %r131,%r138; .loc 1 195 11 mov.u64 %r133,0; $L22: .loc 1 197 16 ld.f64 %r63,[%r132]; ld.f64 %r64,[%r132+8]; .loc 1 197 14 st.f64 [%r131],%r63; st.f64 [%r131+8],%r64; .loc 1 198 13 add.u64 %r131,%r131,%r66; .loc 1 199 12 add.u64 %r132,%r132,%r160; .loc 1 195 34 add.u64 %r133,%r133,1; .loc 1 195 4 setp.ne.u64 %r285,%r133,%r145; @ %r285 bra $L22; .loc 1 198 13 add.u64 %r128,%r138,%r158; bra $L21; $L39: .loc 1 195 4 mov.u64 %r128,%r138; $L21: .loc 1 201 4 @ ! %r343 bra $L20; mov.u64 %r129,%r139; .loc 1 201 23 mov.u64 %r130,0; $L23: .loc 1 203 16 ld.f64 %r67,[%r129]; ld.f64 %r68,[%r129+8]; .loc 1 203 14 st.f64 [%r128],%r67; st.f64 [%r128+8],%r68; .loc 1 204 13 add.u64 %r128,%r128,%r66; .loc 1 205 12 add.u64 %r129,%r129,%r160; .loc 1 201 40 add.u64 %r130,%r130,1; .loc 1 201 4 setp.ne.u64 %r287,%r130,%r164; @ %r287 bra $L23; $L20: .loc 1 210 12 add.u64 %r138,%r138,%r71; .loc 1 211 12 add.u64 %r139,%r139,%r73; .loc 1 212 15 ld.u64 %r288,[%frame+120]; add.u64 %r75,%r288,1; .loc 1 214 13 setp.eq.u64 %r289,%r75,%r100; @ %r289 bra $L25; .loc 1 212 15 st.u64 [%frame+120],%r75; bra $L35; $L25: .loc 1 218 20 mov.u64 %r290,0; st.u64 [%frame+120],%r290; .loc 1 222 16 add.u64 %r136,%r139,%r155; .loc 1 224 14 @ %r339 bra $L27; bra $L1; $L31: .loc 1 218 20 st.u64 [%r125],%r290; .loc 1 221 30 mul.lo.u64 %r78,%r86,%r85; .loc 1 222 30 mul.lo.u64 %r293,%r88,%r85; .loc 1 222 16 shl.b64 %r294,%r293,4; sub.u64 %r136,%r139,%r294; .loc 1 224 14 add.u64 %r125,%r125,8; add.u64 %r150,%r150,8; setp.eq.u64 %r295,%r108,%r150; @ ! 
%r295 bra $L30; bra $L1; $L27: add.u64 %r125,%frame,128; .loc 1 221 30 mov.u64 %r78,%r137; .loc 1 224 14 mov.u64 %r150,8; $L30: .loc 1 232 23 ld.u64 %r297,[%r125]; add.u64 %r85,%r297,1; st.u64 [%r125],%r85; .loc 1 233 30 add.u64 %r299,%r344,%r150; ld.u64 %r86,[%r299]; .loc 1 233 20 sub.u64 %r300,%r86,%r78; shl.b64 %r301,%r300,4; add.u64 %r138,%r138,%r301; .loc 1 234 30 add.u64 %r303,%r345,%r150; ld.u64 %r88,[%r303]; .loc 1 234 20 shl.b64 %r304,%r88,4; add.u64 %r139,%r136,%r304; .loc 1 214 32 add.u64 %r305,%frame,%r150; ld.u64 %r92,[%r305]; .loc 1 214 13 setp.eq.u64 %r306,%r85,%r92; @ %r306 bra $L31; bra $L35; $L18: .loc 1 175 9 setp.ne.u64 %r307,%r138,0; @ %r307 bra $L32; bra $L1; $L3: .loc 1 114 18 mov.u64 %r308,1; st.u64 [%frame+240],%r308; .loc 1 115 18 st.u64 [%frame+360],%r308; .loc 1 118 13 cvt.s64.s32 %r310,%r110; add.u64 %r312,%r310,%r310; add.u64 %r313,%r312,%r310; shl.b64 %r314,%r313,3; add.u64 %r315,%r163,%r314; ld.u64 %r30,[%r315+40]; .loc 1 119 4 ld.u64 %r325,[%r315+56]; ld.u64 %r334,[%r315+48]; sub.u64 %r333,%r325,%r334; .loc 1 118 11 mad.lo.u64 %r99,%r333,%r30,%r30; .loc 1 120 13 mul.lo.u64 %r164,%r164,%r30; .loc 1 121 16 cvt.s64.s32 %r335,%r165; .loc 1 121 7 setp.gt.s64 %r336,%r98,%r335; @ %r336 bra $L33; mov.u64 %r97,%r308; mov.u64 %r161,%r97; bra $L34; $L1: .loc 1 240cshift0_c10cshift1_4_i1.o/ 1622802246gfortrani_cshift1_4_i1 .visible .func _gfortrani_cshift1_4_../libgfortran/generated/cshift1_4_i1.c"ortrani_cshift1_4_i1 .visible .func _gfortrani_cshift1_4_095predpredpredpred %r185; .reg .predu16 %r197; .reg .u16pred %r201; .reg .predpredpred %r229; .reg .predu64pred %r270; .reg .predmov.u64 %r136,%ar0; mov.u64 %r137,%ar1; mov.u64 %r138,%ar2; mov.u64 %r139,%ar3; .loc 1 68 6 setp.eq.u64 %r140,%r139,0; @ %r140 bra $L31; .loc 1 69 11 ld.u32 %r141,[%r139]; add.u32 %r88,%r141,-1; bra $L2; $L31: .loc 1 71 11 cvt.u32.u64 %r88,%r139; $L2: .loc 1 73 13 mov.u64 %r142,1; st.u64 [%frame+360],%r142; .loc 1 74 12 mov.u64 %r143,0; st.u64 [%frame+480],%r143; .loc 1 82 23 ld.s8 %r126,[%r137+28]; cvt.u32.u32 %r144,%r126; cvt.s64.s8 %r132,%r144; .loc 1 82 3 setp.le.s64 %r145,%r132,0; @ %r145 bra $L32; .loc 1 84 15 cvt.s64.s32 %r24,%r88; add.u64 %r74,%r137,40; add.u64 %r45,%r136,40; .loc 1 75 5 mov.u64 %r84,%r143; .loc 1 80 7 mov.u64 %r82,%r84; .loc 1 82 12 mov.u64 %r116,%r84; .loc 1 79 11 mov.u64 %r77,%r142; .loc 1 78 11 mov.u64 %r72,%r77; .loc 1 96 20 mov.u64 %r292,%r84; $L8: .loc 1 92 17 ld.u64 %r147,[%r74+16]; add.u64 %r146,%r147,1; .loc 1 92 15 ld.u64 %r148,[%r74+8]; sub.u64 %r128,%r146,%r148; .loc 1 86 19 ld.u64 %r71,[%r45]; .loc 1 89 19 ld.u64 %r76,[%r74]; .loc 1 84 10 setp.ne.u64 %r149,%r24,%r116; @ %r149 bra $L4; .loc 1 87 14 setp.ne.u64 %r150,%r71,0; .loc 1 90 14 setp.ne.u64 %r151,%r76,0; .loc 1 92 15 mov.u64 %r82,%r128; selp.u64 %r77,%r76,1,%r151; selp.u64 %r72,%r71,1,%r150; bra $L7; $L4: .loc 1 96 20 shl.b64 %r152,%r84,3; add.u64 %r153,%frame,%r152; add.u64 %r154,%r153,480; st.u64 [%r154],%r292; .loc 1 97 21 st.u64 [%r153+360],%r128; .loc 1 98 22 st.u64 [%r153+840],%r71; .loc 1 99 22 st.u64 [%r153+720],%r76; .loc 1 100 24 add.u64 %r166,%r84,%r84; add.u64 %r167,%r166,%r84; shl.b64 %r168,%r167,3; add.u64 %r169,%r138,%r168; ld.u64 %r23,[%r169+40]; .loc 1 100 22 st.u64 [%r153+600],%r23; .loc 1 101 26 mul.lo.u64 %r177,%r128,%r71; .loc 1 101 13 st.u64 [%r153+240],%r177; .loc 1 102 26 mul.lo.u64 %r181,%r76,%r128; .loc 1 102 13 st.u64 [%r153+120],%r181; .loc 1 103 26 mul.lo.u64 %r184,%r23,%r128; .loc 1 103 13 st.u64 [%r153],%r184; .loc 1 104 12 add.u64 %r84,%r84,1; $L7: .loc 1 82 55 
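These cshift0_* members implement the Fortran CSHIFT intrinsic for a scalar shift, one specialization per element kind. As a rough illustration of the copy pattern the loops above perform (normalize the shift into [0, n), then move the two blocks), here is a minimal C sketch; the name cshift1d and the flat-buffer layout are assumptions for illustration, not libgfortran's actual code.

/* Hypothetical 1-D circular shift, illustrating the operation the
 * _gfortrani_cshift0_* specializations perform per element kind.
 * dst and src are assumed not to overlap. */
#include <stddef.h>
#include <string.h>

static void cshift1d(void *dst, const void *src, size_t n, size_t size,
                     long shift)
{
    if (n == 0)
        return;
    shift %= (long) n;          /* mirrors the rem.s64 step in the PTX above */
    if (shift < 0)
        shift += (long) n;      /* wrap negative shifts, as above */
    size_t k = (size_t) shift;
    const char *s = src;
    char *d = dst;
    memcpy(d, s + k * size, (n - k) * size);   /* elements k..n-1 move to the front */
    memcpy(d + (n - k) * size, s, k * size);   /* elements 0..k-1 wrap to the back */
}

A positive shift rotates toward lower indices, matching CSHIFT([1,2,3], 1) = [2,3,1].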
Body of _gfortrani_cshift1_4_i1 (byte-wide ld.u8/st.u8 copy loops, per-section shift counts read through ld.u32), followed by the remaining cshift1 members, each with the same structure and element-sized copies:

  cshift1_4_i2.o   _gfortrani_cshift1_4_i2   ../libgfortran/generated/cshift1_4_i2.c
  cshift1_4_i4.o   _gfortrani_cshift1_4_i4   ../libgfortran/generated/cshift1_4_i4.c
  cshift1_4_i8.o   _gfortrani_cshift1_4_i8   ../libgfortran/generated/cshift1_4_i8.c
  cshift1_4_i16.o  _gfortrani_cshift1_4_i16  ../libgfortran/generated/cshift1_4_i16.c
  cshift1_4_r4.o   _gfortrani_cshift1_4_r4   ../libgfortran/generated/cshift1_4_r4.c

The _gfortrani_cshift1_4_r4 body opens with the same prologue (argument moves, frame initialization, and the per-dimension extent/stride loop) before its copy loops.
.loc 1 128 10 setp.ge.s32 %r209,%r95,0; @ %r209 bra $L13; .loc 1 129 12 add.u32 %r95,%r95,%r295; .loc 1 130 11 cvt.s64.s32 %r132,%r95; .loc 1 130 10 setp.ge.s64 %r210,%r132,%r88; @ %r210 bra $L14; setp.ge.s32 %r211,%r95,0; @ %r211 bra $L15; $L14: .loc 1 132 13 rem.s64 %r132,%r132,%r88; .loc 1 133 7 setp.lt.s64 %r214,%r132,0; @ %r214 bra $L16; .loc 1 132 8 cvt.u32.u64 %r95,%r132; bra $L15; $L16: .loc 1 134 16 cvt.u32.u64 %r215,%r132; add.u32 %r95,%r295,%r215; .loc 1 136 22 cvt.s64.s32 %r132,%r95; $L15: .loc 1 147 24 sub.u64 %r138,%r88,%r132; .loc 1 138 10 @ %r290 bra $L17; .loc 1 147 4 setp.gt.s64 %r217,%r138,0; @ %r217 bra $L18; mov.u64 %r111,%r122; bra $L19; $L17: .loc 1 140 21 cvt.s64.s32 %r218,%r95; .loc 1 140 11 shl.b64 %r117,%r218,2; .loc 1 141 11 shl.b64 %r118,%r138,2; .loc 1 142 23 add.u64 %r219,%r123,%r117; .loc 1 142 418226,[%value_in]; } .loc 1 143 17 add.u64 %r228,%r122,%r118; .loc 1 143 42r123117235,[%value_in]; } bra $L20; $L18: .loc 1 136 22 mul.lo.u64 %r237,%r132,%r85; .loc 1 136 18 shl.b64 %r238,%r237,2; .loc 1 136 11 add.u64 %r115,%r123,%r238; mov.u64 %r114,%r122; .loc 1 147 11 mov.u64 %r116,0; $L21: .loc 1 149 16 ld.f32 %r43,[%r115]; .loc 1 149 14 st.f32 [%r114],%r43; .loc 1 150 13 add.u64 %r114,%r114,%r45; .loc 1 151 12 add.u64 %r115,%r115,%r47; .loc 1 147 31 add.u64 %r116,%r116,1; .loc 1 147 4 setp.ne.u64 %r239,%r116,%r138; @ %r239 bra $L21; .loc 1 150 13 mad.lo.u64 %r111,%r116,%r45,%r122; $L19: .loc 1 153 4 setp.gt.s64 %r241,%r132,0; @ ! %r241 bra $L20; mov.u64 %r112,%r123; .loc 1 153 23 mov.u64 %r113,0; $L22: .loc 1 155 16 ld.f32 %r48,[%r112]; .loc 1 155 14 st.f32 [%r111],%r48; .loc 1 156 13 add.u64 %r111,%r111,%r45; .loc 1 157 12 add.u64 %r112,%r112,%r47; .loc 1 153 37 add.u64 %r113,%r113,1; .loc 1 153 4 setp.ne.u64 %r242,%r113,%r132; @ %r242 bra $L22; $L20: .loc 1 162 12 add.u64 %r122,%r122,%r51; .loc 1 163 12 add.u64 %r123,%r123,%r53; .loc 1 164 12 add.u64 %r124,%r124,%r55; .loc 1 165 15 ld.u64 %r243,[%frame+480]; add.u64 %r57,%r243,1; .loc 1 167 13 setp.eq.u64 %r244,%r57,%r89; @ %r244 bra $L24; .loc 1 165 15 st.u64 [%frame+480],%r57; bra $L29; $L24: .loc 1 171 20 mov.u64 %r245,0; st.u64 [%frame+480],%r245; .loc 1 172 24 ld.u64 %r59,[%frame+240]; .loc 1 173 16 ld.u64 %r247,[%frame+120]; shl.b64 %r246,%r247,2; sub.u64 %r119,%r123,%r246; .loc 1 174 9 ld.u64 %r249,[%frame]; shl.b64 %r248,%r249,2; sub.u64 %r120,%r124,%r248; .loc 1 176 14 @ %r287 bra $L1; add.u64 %r142,%frame,488; mov.u64 %r135,8; .loc 1 175 12 mov.u64 %r121,1; bra $L27; $L28: .loc 1 171 20 st.u64 [%r142],%r245; .loc 1 172 24 add.u64 %r256,%r303,%r135; ld.u64 %r59,[%r256]; .loc 1 173 24 add.u64 %r258,%r304,%r135; .loc 1 173 16 ld.u64 %r260,[%r258]; shl.b64 %r259,%r260,2; sub.u64 %r119,%r123,%r259; .loc 1 174 17 add.u64 %r261,%frame,%r135; .loc 1 174 9 ld.u64 %r263,[%r261]; shl.b64 %r262,%r263,2; sub.u64 %r120,%r124,%r262; .loc 1 175 12 add.u64 %r121,%r121,1; .loc 1 176 14 add.u64 %r142,%r142,8; add.u64 %r135,%r135,8; setp.eq.u64 %r264,%r298,%r121; @ %r264 bra $L1; $L27: .loc 1 184 23 ld.u64 %r265,[%r142]; add.u64 %r70,%r265,1; st.u64 [%r142],%r70; .loc 1 185 30 add.u64 %r267,%r299,%r135; .loc 1 185 20 ld.u64 %r269,[%r267]; sub.u64 %r268,%r269,%r59; shl.b64 %r270,%r268,2; add.u64 %r122,%r122,%r270; .loc 1 186 30 add.u64 %r272,%r300,%r135; .loc 1 186 20 ld.u64 %r274,[%r272]; shl.b64 %r273,%r274,2; add.u64 %r123,%r119,%r273; .loc 1 187 23 add.u64 %r276,%r301,%r135; .loc 1 187 13 ld.u64 %r278,[%r276]; shl.b64 %r277,%r278,2; add.u64 %r124,%r120,%r277; .loc 1 167 32 add.u64 %r280,%r302,%r135; .loc 1 167 13 
ld.u64 %r281,[%r280]; setp.eq.u64 %r282,%r70,%r281; @ %r282 bra $L28; bra $L29; $L13: .loc 1 130 11 cvt.s64.s32 %r132,%r95; .loc 1 130 10 setp.lt.s64 %r283,%r132,%r88; @ %r283 bra $L15; .loc 1 132 13 rem.s64 %r132,%r132,%r88; .loc 1 132 8 cvt.u32.u64 %r95,%r132; bra $L15; $L1: .loc 1 191cshift1_4_r8.o/gfortrani_cshift1_4_r8 .visible .func _gfortrani_cshift1_4_r8../libgfortran/generated/cshift1_4_r8.c"ortrani_cshift1_4_r8 .visible .func _gfortrani_cshift1_4_r80u32 %r9532predpredu16 %r206; .reg .u16 %r207; .reg .u32predpred %r239; .reg .predpredu64 %r249; .reg .u64 %r256; .reg .u64 %r258; .reg .u64u64 %r267; .reg .u64u64 %r278; .reg .u64pred %r287; .reg .pred %r290; .reg .u32 %r295; .reg .u32 %r296; .reg .u64u64 %r305; mov.u64 %r145,%ar0; mov.u64 %r146,%ar1; mov.u64 %r147,%ar2; mov.u64 %r148,%ar3; .loc 1 68 6 setp.eq.u64 %r149,%r148,0; @ %r149 bra $L30; .loc 1 69 11 ld.u32 %r150,[%r148]; add.u32 %r94,%r150,-1; bra $L2; $L30: .loc 1 71 11 cvt.u32.u64 %r94,%r148; $L2: .loc 1 73 13 mov.u64 %r151,1; st.u64 [%frame+360],%r151; .loc 1 74 12 mov.u64 %r152,0; st.u64 [%frame+480],%r152; .loc 1 82 23 ld.s8 %r140,[%r146+28]; cvt.u32.u32 %r153,%r140; cvt.s64.s8 %r141,%r153; .loc 1 82 3 setp.le.s64 %r154,%r141,0; @ %r154 bra $L31; .loc 1 84 15 cvt.s64.s32 %r92,%r94; add.u64 %r93,%r146,40; add.u64 %r86,%r145,40; .loc 1 75 5 mov.u64 %r90,%r152; .loc 1 80 7 mov.u64 %r88,%r90; .loc 1 82 12 mov.u64 %r125,%r90; .loc 1 79 11 mov.u64 %r85,%r151; .loc 1 78 11 mov.u64 %r81,%r85; .loc 1 96 20 mov.u64 %r305,%r90; $L8: .loc 1 92 17 ld.u64 %r156,[%r93+16]; add.u64 %r155,%r156,1; .loc 1 92 15 ld.u64 %r157,[%r93+8]; sub.u64 %r33,%r155,%r157; .loc 1 86 19 ld.u64 %r80,[%r86]; .loc 1 89 19 ld.u64 %r84,[%r93]; .loc 1 84 10 setp.ne.u64 %r158,%r92,%r125; @ %r158 bra $L4; .loc 1 87 14 setp.ne.u64 %r159,%r80,0; .loc 1 90 14 setp.ne.u64 %r160,%r84,0; .loc 1 92 15 mov.u64 %r88,%r33; selp.u64 %r85,%r84,1,%r160; selp.u64 %r81,%r80,1,%r159; bra $L7; $L4: .loc 1 96 20 shl.b64 %r161,%r90,3; add.u64 %r162,%frame,%r161; add.u64 %r163,%r162,480; st.u64 [%r163],%r305; .loc 1 97 21 st.u64 [%r162+360],%r33; .loc 1 98 22 st.u64 [%r162+840],%r80; .loc 1 99 22 st.u64 [%r162+720],%r84; .loc 1 100 24 add.u64 %r175,%r90,%r90; add.u64 %r176,%r175,%r90; shl.b64 %r177,%r176,3; add.u64 %r178,%r147,%r177; ld.u64 %r23,[%r178+40]; .loc 1 100 22 st.u64 [%r162+600],%r23; .loc 1 101 26 mul.lo.u64 %r186,%r33,%r80; .loc 1 101 13 st.u64 [%r162+240],%r186; .loc 1 102 26 mul.lo.u64 %r190,%r33,%r84; .loc 1 102 13 st.u64 [%r162+120],%r190; .loc 1 103 26 mul.lo.u64 %r193,%r23,%r33; .loc 1 103 13 st.u64 [%r162],%r193; .loc 1 104 12 add.u64 %r90,%r90,1; $L7: .loc 1 82 55 add.u64 %r125,%r125,1; .loc 1 82 3 add.u64 %r93,%r93,24; add.u64 %r86,%r86,24; setp.ne.u64 %r194,%r125,%r141; @ %r194 bra $L8; bra $L3; $L31: .loc 1 80 7 mov.u64 %r88,%r152; .loc 1 79 11 mov.u64 %r85,%r151; .loc 1 78 11 mov.u64 %r81,%r85; $L3: .loc 1 107 14 ld.u64 %r27,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r195,%r27,0; .loc 1 108 16 selp.u64 %r27,%r27,1,%r195; .loc 1 109 14 ld.u64 %r28,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r196,%r28,0; .loc 1 110 16 selp.u64 %r28,%r28,1,%r196; .loc 1 111 14 ld.u64 %r29,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r197,%r29,0; .loc 1 112 16 selp.u64 %r29,%r29,1,%r197; .loc 1 118 8 ld.u64 %r122,[%r145]; .loc 1 119 8 ld.u64 %r123,[%r146]; .loc 1 120 8 ld.u64 %r124,[%r147]; .loc 1 122 9 setp.eq.u64 %r198,%r122,0; @ %r198 bra $L1; .loc 1 150 13 shl.b64 %r45,%r81,3; .loc 1 151 12 shl.b64 %r47,%r85,3; .loc 1 162 12 shl.b64 %r51,%r28,3; .loc 1 163 12 shl.b64 %r53,%r27,3; 
.loc 1 164 12 shl.b64 %r55,%r29,2; .loc 1 138 19 set.u32.eq.u64 %r200,%r85,1; neg.s32 %r201,%r200; .loc 1 138 35 set.u32.eq.u64 %r203,%r81,1; neg.s32 %r204,%r203; .loc 1 138 24 cvt.u16.u32 %r206,%r201; cvt.u16.u32 %r207,%r204; and.b16 %r205,%r206,%r207; cvt.u32.u16 %r208,%r205; cvt.u32.u8 %r91,%r208; .loc 1 167 32 ld.u64 %r89,[%frame+360]; setp.ne.u32 %r290,%r91,0; setp.le.s64 %r287,%r141,2; .loc 1 129 12 cvt.u32.u64 %r295,%r88; cvt.u32.u32 %r296,%r140; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; add.u64 %r299,%frame,840; add.u64 %r300,%frame,720; add.u64 %r301,%frame,600; add.u64 %r302,%frame,360; .loc 1 172 24 add.u64 %r303,%frame,240; .loc 1 173 24 add.u64 %r304,%frame,120; $L29: .loc 1 125 10 ld.u32 %r95,[%r124]; .loc 1 128 10 setp.ge.s32 %r209,%r95,0; @ %r209 bra $L13; .loc 1 129 12 add.u32 %r95,%r95,%r295; .loc 1 130 11 cvt.s64.s32 %r132,%r95; .loc 1 130 10 setp.ge.s64 %r210,%r132,%r88; @ %r210 bra $L14; setp.ge.s32 %r211,%r95,0; @ %r211 bra $L15; $L14: .loc 1 132 13 rem.s64 %r132,%r132,%r88; .loc 1 133 7 setp.lt.s64 %r214,%r132,0; @ %r214 bra $L16; .loc 1 132 8 cvt.u32.u64 %r95,%r132; bra $L15; $L16: .loc 1 134 16 cvt.u32.u64 %r215,%r132; add.u32 %r95,%r295,%r215; .loc 1 136 22 cvt.s64.s32 %r132,%r95; $L15: .loc 1 147 24 sub.u64 %r138,%r88,%r132; .loc 1 138 10 @ %r290 bra $L17; .loc 1 147 4 setp.gt.s64 %r217,%r138,0; @ %r217 bra $L18; mov.u64 %r111,%r122; bra $L19; $L17: .loc 1 140 21 cvt.s64.s32 %r218,%r95; .loc 1 140 11 shl.b64 %r117,%r218,3; .loc 1 141 11 shl.b64 %r118,%r138,3; .loc 1 142 23 add.u64 %r219,%r123,%r117; .loc 1 142 418226,[%value_in]; } .loc 1 143 17 add.u64 %r228,%r122,%r118; .loc 1 143 42r123117235,[%value_in]; } bra $L20; $L18: .loc 1 136 22 mul.lo.u64 %r237,%r132,%r85; .loc 1 136 18 shl.b64 %r238,%r237,3; .loc 1 136 11 add.u64 %r115,%r123,%r238; mov.u64 %r114,%r122; .loc 1 147 11 mov.u64 %r116,0; $L21: .loc 1 149 16 ld.f64 %r43,[%r115]; .loc 1 149 14 st.f64 [%r114],%r43; .loc 1 150 13 add.u64 %r114,%r114,%r45; .loc 1 151 12 add.u64 %r115,%r115,%r47; .loc 1 147 31 add.u64 %r116,%r116,1; .loc 1 147 4 setp.ne.u64 %r239,%r116,%r138; @ %r239 bra $L21; .loc 1 150 13 mad.lo.u64 %r111,%r116,%r45,%r122; $L19: .loc 1 153 4 setp.gt.s64 %r241,%r132,0; @ ! 
%r241 bra $L20; mov.u64 %r112,%r123; .loc 1 153 23 mov.u64 %r113,0; $L22: .loc 1 155 16 ld.f64 %r48,[%r112]; .loc 1 155 14 st.f64 [%r111],%r48; .loc 1 156 13 add.u64 %r111,%r111,%r45; .loc 1 157 12 add.u64 %r112,%r112,%r47; .loc 1 153 37 add.u64 %r113,%r113,1; .loc 1 153 4 setp.ne.u64 %r242,%r113,%r132; @ %r242 bra $L22; $L20: .loc 1 162 12 add.u64 %r122,%r122,%r51; .loc 1 163 12 add.u64 %r123,%r123,%r53; .loc 1 164 12 add.u64 %r124,%r124,%r55; .loc 1 165 15 ld.u64 %r243,[%frame+480]; add.u64 %r57,%r243,1; .loc 1 167 13 setp.eq.u64 %r244,%r57,%r89; @ %r244 bra $L24; .loc 1 165 15 st.u64 [%frame+480],%r57; bra $L29; $L24: .loc 1 171 20 mov.u64 %r245,0; st.u64 [%frame+480],%r245; .loc 1 172 24 ld.u64 %r59,[%frame+240]; .loc 1 173 16 ld.u64 %r247,[%frame+120]; shl.b64 %r246,%r247,3; sub.u64 %r119,%r123,%r246; .loc 1 174 9 ld.u64 %r249,[%frame]; shl.b64 %r248,%r249,2; sub.u64 %r120,%r124,%r248; .loc 1 176 14 @ %r287 bra $L1; add.u64 %r142,%frame,488; mov.u64 %r135,8; .loc 1 175 12 mov.u64 %r121,1; bra $L27; $L28: .loc 1 171 20 st.u64 [%r142],%r245; .loc 1 172 24 add.u64 %r256,%r303,%r135; ld.u64 %r59,[%r256]; .loc 1 173 24 add.u64 %r258,%r304,%r135; .loc 1 173 16 ld.u64 %r260,[%r258]; shl.b64 %r259,%r260,3; sub.u64 %r119,%r123,%r259; .loc 1 174 17 add.u64 %r261,%frame,%r135; .loc 1 174 9 ld.u64 %r263,[%r261]; shl.b64 %r262,%r263,2; sub.u64 %r120,%r124,%r262; .loc 1 175 12 add.u64 %r121,%r121,1; .loc 1 176 14 add.u64 %r142,%r142,8; add.u64 %r135,%r135,8; setp.eq.u64 %r264,%r298,%r121; @ %r264 bra $L1; $L27: .loc 1 184 23 ld.u64 %r265,[%r142]; add.u64 %r70,%r265,1; st.u64 [%r142],%r70; .loc 1 185 30 add.u64 %r267,%r299,%r135; .loc 1 185 20 ld.u64 %r269,[%r267]; sub.u64 %r268,%r269,%r59; shl.b64 %r270,%r268,3; add.u64 %r122,%r122,%r270; .loc 1 186 30 add.u64 %r272,%r300,%r135; .loc 1 186 20 ld.u64 %r274,[%r272]; shl.b64 %r273,%r274,3; add.u64 %r123,%r119,%r273; .loc 1 187 23 add.u64 %r276,%r301,%r135; .loc 1 187 13 ld.u64 %r278,[%r276]; shl.b64 %r277,%r278,2; add.u64 %r124,%r120,%r277; .loc 1 167 32 add.u64 %r280,%r302,%r135; .loc 1 167 13 ld.u64 %r281,[%r280]; setp.eq.u64 %r282,%r70,%r281; @ %r282 bra $L28; bra $L29; $L13: .loc 1 130 11 cvt.s64.s32 %r132,%r95; .loc 1 130 10 setp.lt.s64 %r283,%r132,%r88; @ %r283 bra $L15; .loc 1 132 13 rem.s64 %r132,%r132,%r88; .loc 1 132 8 cvt.u32.u64 %r95,%r132; bra $L15; $L1: .loc 1 191cshift1_4_r10.o/1622802246cshift1_4_c4.o/ 1622802246gfortrani_cshift1_4_c4 .visible .func _gfortrani_cshift1_4_c../libgfortran/generated/cshift1_4_c4.c"ortrani_cshift1_4_c4 .visible .func _gfortrani_cshift1_4_c0u32predpredpredpredu64 %r251; .reg .u64predu32 %r298; .reg .u64u64 %r303; .reg .u64 %r304; .reg .u64 %r305; .reg .u64 %r306; .reg .u64 %r307; mov.u64 %r147,%ar0; mov.u64 %r148,%ar1; mov.u64 %r149,%ar2; mov.u64 %r150,%ar3; .loc 1 68 6 setp.eq.u64 %r151,%r150,0; @ %r151 bra $L30; .loc 1 69 11 ld.u32 %r152,[%r150]; add.u32 %r96,%r152,-1; bra $L2; $L30: .loc 1 71 11 cvt.u32.u64 %r96,%r150; $L2: .loc 1 73 13 mov.u64 %r153,1; st.u64 [%frame+360],%r153; .loc 1 74 12 mov.u64 %r154,0; st.u64 [%frame+480],%r154; .loc 1 82 23 ld.s8 %r142,[%r148+28]; cvt.u32.u32 %r155,%r142; cvt.s64.s8 %r143,%r155; .loc 1 82 3 setp.le.s64 %r156,%r143,0; @ %r156 bra $L31; .loc 1 84 15 cvt.s64.s32 %r94,%r96; add.u64 %r95,%r148,40; add.u64 %r88,%r147,40; .loc 1 75 5 mov.u64 %r92,%r154; .loc 1 80 7 mov.u64 %r90,%r92; .loc 1 82 12 mov.u64 %r127,%r92; .loc 1 79 11 mov.u64 %r87,%r153; .loc 1 78 11 mov.u64 %r83,%r87; .loc 1 96 20 mov.u64 %r307,%r92; $L8: .loc 1 92 17 ld.u64 %r158,[%r95+16]; add.u64 
%r157,%r158,1; .loc 1 92 15 ld.u64 %r159,[%r95+8]; sub.u64 %r33,%r157,%r159; .loc 1 86 19 ld.u64 %r82,[%r88]; .loc 1 89 19 ld.u64 %r86,[%r95]; .loc 1 84 10 setp.ne.u64 %r160,%r94,%r127; @ %r160 bra $L4; .loc 1 87 14 setp.ne.u64 %r161,%r82,0; .loc 1 90 14 setp.ne.u64 %r162,%r86,0; .loc 1 92 15 mov.u64 %r90,%r33; selp.u64 %r87,%r86,1,%r162; selp.u64 %r83,%r82,1,%r161; bra $L7; $L4: .loc 1 96 20 shl.b64 %r163,%r92,3; add.u64 %r164,%frame,%r163; add.u64 %r165,%r164,480; st.u64 [%r165],%r307; .loc 1 97 21 st.u64 [%r164+360],%r33; .loc 1 98 22 st.u64 [%r164+840],%r82; .loc 1 99 22 st.u64 [%r164+720],%r86; .loc 1 100 24 add.u64 %r177,%r92,%r92; add.u64 %r178,%r177,%r92; shl.b64 %r179,%r178,3; add.u64 %r180,%r149,%r179; ld.u64 %r23,[%r180+40]; .loc 1 100 22 st.u64 [%r164+600],%r23; .loc 1 101 26 mul.lo.u64 %r188,%r33,%r82; .loc 1 101 13 st.u64 [%r164+240],%r188; .loc 1 102 26 mul.lo.u64 %r192,%r33,%r86; .loc 1 102 13 st.u64 [%r164+120],%r192; .loc 1 103 26 mul.lo.u64 %r195,%r23,%r33; .loc 1 103 13 st.u64 [%r164],%r195; .loc 1 104 12 add.u64 %r92,%r92,1; $L7: .loc 1 82 55 add.u64 %r127,%r127,1; .loc 1 82 3 add.u64 %r95,%r95,24; add.u64 %r88,%r88,24; setp.ne.u64 %r196,%r127,%r143; @ %r196 bra $L8; bra $L3; $L31: .loc 1 80 7 mov.u64 %r90,%r154; .loc 1 79 11 mov.u64 %r87,%r153; .loc 1 78 11 mov.u64 %r83,%r87; $L3: .loc 1 107 14 ld.u64 %r27,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r197,%r27,0; .loc 1 108 16 selp.u64 %r27,%r27,1,%r197; .loc 1 109 14 ld.u64 %r28,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r198,%r28,0; .loc 1 110 16 selp.u64 %r28,%r28,1,%r198; .loc 1 111 14 ld.u64 %r29,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r199,%r29,0; .loc 1 112 16 selp.u64 %r29,%r29,1,%r199; .loc 1 118 8 ld.u64 %r124,[%r147]; .loc 1 119 8 ld.u64 %r125,[%r148]; .loc 1 120 8 ld.u64 %r126,[%r149]; .loc 1 122 9 setp.eq.u64 %r200,%r124,0; @ %r200 bra $L1; .loc 1 150 13 shl.b64 %r46,%r83,3; .loc 1 151 12 shl.b64 %r48,%r87,3; .loc 1 162 12 shl.b64 %r53,%r28,3; .loc 1 163 12 shl.b64 %r55,%r27,3; .loc 1 164 12 shl.b64 %r57,%r29,2; .loc 1 138 19 set.u32.eq.u64 %r202,%r87,1; neg.s32 %r203,%r202; .loc 1 138 35 set.u32.eq.u64 %r205,%r83,1; neg.s32 %r206,%r205; .loc 1 138 24 cvt.u16.u32 %r208,%r203; cvt.u16.u32 %r209,%r206; and.b16 %r207,%r208,%r209; cvt.u32.u16 %r210,%r207; cvt.u32.u8 %r93,%r210; .loc 1 167 32 ld.u64 %r91,[%frame+360]; setp.ne.u32 %r289,%r93,0; setp.le.s64 %r291,%r143,2; .loc 1 129 12 cvt.u32.u64 %r297,%r90; cvt.u32.u32 %r298,%r142; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; add.u64 %r301,%frame,840; add.u64 %r302,%frame,720; add.u64 %r303,%frame,600; add.u64 %r304,%frame,360; .loc 1 172 24 add.u64 %r305,%frame,240; .loc 1 173 24 add.u64 %r306,%frame,120; $L29: .loc 1 125 10 ld.u32 %r97,[%r126]; .loc 1 128 10 setp.ge.s32 %r211,%r97,0; @ %r211 bra $L13; .loc 1 129 12 add.u32 %r97,%r97,%r297; .loc 1 130 11 cvt.s64.s32 %r134,%r97; .loc 1 130 10 setp.ge.s64 %r212,%r134,%r90; @ %r212 bra $L14; setp.ge.s32 %r213,%r97,0; @ %r213 bra $L15; $L14: .loc 1 132 13 rem.s64 %r134,%r134,%r90; .loc 1 133 7 setp.lt.s64 %r216,%r134,0; @ %r216 bra $L16; .loc 1 132 8 cvt.u32.u64 %r97,%r134; bra $L15; $L16: .loc 1 134 16 cvt.u32.u64 %r217,%r134; add.u32 %r97,%r297,%r217; .loc 1 136 22 cvt.s64.s32 %r134,%r97; $L15: .loc 1 147 24 sub.u64 %r140,%r90,%r134; .loc 1 138 10 @ %r289 bra $L17; .loc 1 147 4 setp.gt.s64 %r219,%r140,0; @ %r219 bra $L18; mov.u64 %r113,%r124; bra $L19; $L17: .loc 1 140 21 cvt.s64.s32 %r220,%r97; .loc 1 140 11 shl.b64 %r119,%r220,3; .loc 1 141 11 shl.b64 %r120,%r140,3; .loc 1 142 23 add.u64 %r221,%r125,%r119; .loc 1 
142 40228,[%value_in]; } .loc 1 143 17 add.u64 %r230,%r124,%r120r119237,[%value_in]; } bra $L20; $L18: .loc 1 136 22 mul.lo.u64 %r239,%r134,%r87; .loc 1 136 18 shl.b64 %r240,%r239,3; .loc 1 136 11 add.u64 %r117,%r125,%r240; mov.u64 %r116,%r124; .loc 1 147 11 mov.u64 %r118,0; $L21: .loc 1 149 16 ld.f32 %r43,[%r117]; ld.f32 %r44,[%r117+4]; .loc 1 149 14 st.f32 [%r116],%r43; st.f32 [%r116+4],%r44; .loc 1 150 13 add.u64 %r116,%r116,%r46; .loc 1 151 12 add.u64 %r117,%r117,%r48; .loc 1 147 31 add.u64 %r118,%r118,1; .loc 1 147 4 setp.ne.u64 %r241,%r118,%r140; @ %r241 bra $L21; .loc 1 150 13 mad.lo.u64 %r113,%r118,%r46,%r124; $L19: .loc 1 153 4 setp.gt.s64 %r243,%r134,0; @ ! %r243 bra $L20; mov.u64 %r114,%r125; .loc 1 153 23 mov.u64 %r115,0; $L22: .loc 1 155 16 ld.f32 %r49,[%r114]; ld.f32 %r50,[%r114+4]; .loc 1 155 14 st.f32 [%r113],%r49; st.f32 [%r113+4],%r50; .loc 1 156 13 add.u64 %r113,%r113,%r46; .loc 1 157 12 add.u64 %r114,%r114,%r48; .loc 1 153 37 add.u64 %r115,%r115,1; .loc 1 153 4 setp.ne.u64 %r244,%r115,%r134; @ %r244 bra $L22; $L20: .loc 1 162 12 add.u64 %r124,%r124,%r53; .loc 1 163 12 add.u64 %r125,%r125,%r55; .loc 1 164 12 add.u64 %r126,%r126,%r57; .loc 1 165 15 ld.u64 %r245,[%frame+480]; add.u64 %r59,%r245,1; .loc 1 167 13 setp.eq.u64 %r246,%r59,%r91; @ %r246 bra $L24; .loc 1 165 15 st.u64 [%frame+480],%r59; bra $L29; $L24: .loc 1 171 20 mov.u64 %r247,0; st.u64 [%frame+480],%r247; .loc 1 172 24 ld.u64 %r61,[%frame+240]; .loc 1 173 16 ld.u64 %r249,[%frame+120]; shl.b64 %r248,%r249,3; sub.u64 %r121,%r125,%r248; .loc 1 174 9 ld.u64 %r251,[%frame]; shl.b64 %r250,%r251,2; sub.u64 %r122,%r126,%r250; .loc 1 176 14 @ %r291 bra $L1; add.u64 %r144,%frame,488; mov.u64 %r137,8; .loc 1 175 12 mov.u64 %r123,1; bra $L27; $L28: .loc 1 171 20 st.u64 [%r144],%r247; .loc 1 172 24 add.u64 %r258,%r305,%r137; ld.u64 %r61,[%r258]; .loc 1 173 24 add.u64 %r260,%r306,%r137; .loc 1 173 16 ld.u64 %r262,[%r260]; shl.b64 %r261,%r262,3; sub.u64 %r121,%r125,%r261; .loc 1 174 17 add.u64 %r263,%frame,%r137; .loc 1 174 9 ld.u64 %r265,[%r263]; shl.b64 %r264,%r265,2; sub.u64 %r122,%r126,%r264; .loc 1 175 12 add.u64 %r123,%r123,1; .loc 1 176 14 add.u64 %r144,%r144,8; add.u64 %r137,%r137,8; setp.eq.u64 %r266,%r300,%r123; @ %r266 bra $L1; $L27: .loc 1 184 23 ld.u64 %r267,[%r144]; add.u64 %r72,%r267,1; st.u64 [%r144],%r72; .loc 1 185 30 add.u64 %r269,%r301,%r137; .loc 1 185 20 ld.u64 %r271,[%r269]; sub.u64 %r270,%r271,%r61; shl.b64 %r272,%r270,3; add.u64 %r124,%r124,%r272; .loc 1 186 30 add.u64 %r274,%r302,%r137; .loc 1 186 20 ld.u64 %r276,[%r274]; shl.b64 %r275,%r276,3; add.u64 %r125,%r121,%r275; .loc 1 187 23 add.u64 %r278,%r303,%r137; .loc 1 187 13 ld.u64 %r280,[%r278]; shl.b64 %r279,%r280,2; add.u64 %r126,%r122,%r279; .loc 1 167 32 add.u64 %r282,%r304,%r137; .loc 1 167 13 ld.u64 %r283,[%r282]; setp.eq.u64 %r284,%r72,%r283; @ %r284 bra $L28; bra $L29; $L13: .loc 1 130 11 cvt.s64.s32 %r134,%r97; .loc 1 130 10 setp.lt.s64 %r285,%r134,%r90; @ %r285 bra $L15; .loc 1 132 13 rem.s64 %r134,%r134,%r90; .loc 1 132 8 cvt.u32.u64 %r97,%r134; bra $L15; $L1: .loc 1 191cshift1_4_c8.o/ 1622802246gfortrani_cshift1_4_c8 .visible .func _gfortrani_cshift1_4_c8../libgfortran/generated/cshift1_4_c8.c"ortrani_cshift1_4_c8 .visible .func _gfortrani_cshift1_4_c80u32predpredpredpredu64 %r251; .reg .u64predu32 %r298; .reg .u64u64 %r303; .reg .u64 %r304; .reg .u64 %r305; .reg .u64 %r306; .reg .u64 %r307; mov.u64 %r147,%ar0; mov.u64 %r148,%ar1; mov.u64 %r149,%ar2; mov.u64 %r150,%ar3; .loc 1 68 6 setp.eq.u64 %r151,%r150,0; @ %r151 bra $L30; .loc 1 69 
11 ld.u32 %r152,[%r150]; add.u32 %r96,%r152,-1; bra $L2; $L30: .loc 1 71 11 cvt.u32.u64 %r96,%r150; $L2: .loc 1 73 13 mov.u64 %r153,1; st.u64 [%frame+360],%r153; .loc 1 74 12 mov.u64 %r154,0; st.u64 [%frame+480],%r154; .loc 1 82 23 ld.s8 %r142,[%r148+28]; cvt.u32.u32 %r155,%r142; cvt.s64.s8 %r143,%r155; .loc 1 82 3 setp.le.s64 %r156,%r143,0; @ %r156 bra $L31; .loc 1 84 15 cvt.s64.s32 %r94,%r96; add.u64 %r95,%r148,40; add.u64 %r88,%r147,40; .loc 1 75 5 mov.u64 %r92,%r154; .loc 1 80 7 mov.u64 %r90,%r92; .loc 1 82 12 mov.u64 %r127,%r92; .loc 1 79 11 mov.u64 %r87,%r153; .loc 1 78 11 mov.u64 %r83,%r87; .loc 1 96 20 mov.u64 %r307,%r92; $L8: .loc 1 92 17 ld.u64 %r158,[%r95+16]; add.u64 %r157,%r158,1; .loc 1 92 15 ld.u64 %r159,[%r95+8]; sub.u64 %r33,%r157,%r159; .loc 1 86 19 ld.u64 %r82,[%r88]; .loc 1 89 19 ld.u64 %r86,[%r95]; .loc 1 84 10 setp.ne.u64 %r160,%r94,%r127; @ %r160 bra $L4; .loc 1 87 14 setp.ne.u64 %r161,%r82,0; .loc 1 90 14 setp.ne.u64 %r162,%r86,0; .loc 1 92 15 mov.u64 %r90,%r33; selp.u64 %r87,%r86,1,%r162; selp.u64 %r83,%r82,1,%r161; bra $L7; $L4: .loc 1 96 20 shl.b64 %r163,%r92,3; add.u64 %r164,%frame,%r163; add.u64 %r165,%r164,480; st.u64 [%r165],%r307; .loc 1 97 21 st.u64 [%r164+360],%r33; .loc 1 98 22 st.u64 [%r164+840],%r82; .loc 1 99 22 st.u64 [%r164+720],%r86; .loc 1 100 24 add.u64 %r177,%r92,%r92; add.u64 %r178,%r177,%r92; shl.b64 %r179,%r178,3; add.u64 %r180,%r149,%r179; ld.u64 %r23,[%r180+40]; .loc 1 100 22 st.u64 [%r164+600],%r23; .loc 1 101 26 mul.lo.u64 %r188,%r33,%r82; .loc 1 101 13 st.u64 [%r164+240],%r188; .loc 1 102 26 mul.lo.u64 %r192,%r33,%r86; .loc 1 102 13 st.u64 [%r164+120],%r192; .loc 1 103 26 mul.lo.u64 %r195,%r23,%r33; .loc 1 103 13 st.u64 [%r164],%r195; .loc 1 104 12 add.u64 %r92,%r92,1; $L7: .loc 1 82 55 add.u64 %r127,%r127,1; .loc 1 82 3 add.u64 %r95,%r95,24; add.u64 %r88,%r88,24; setp.ne.u64 %r196,%r127,%r143; @ %r196 bra $L8; bra $L3; $L31: .loc 1 80 7 mov.u64 %r90,%r154; .loc 1 79 11 mov.u64 %r87,%r153; .loc 1 78 11 mov.u64 %r83,%r87; $L3: .loc 1 107 14 ld.u64 %r27,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r197,%r27,0; .loc 1 108 16 selp.u64 %r27,%r27,1,%r197; .loc 1 109 14 ld.u64 %r28,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r198,%r28,0; .loc 1 110 16 selp.u64 %r28,%r28,1,%r198; .loc 1 111 14 ld.u64 %r29,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r199,%r29,0; .loc 1 112 16 selp.u64 %r29,%r29,1,%r199; .loc 1 118 8 ld.u64 %r124,[%r147]; .loc 1 119 8 ld.u64 %r125,[%r148]; .loc 1 120 8 ld.u64 %r126,[%r149]; .loc 1 122 9 setp.eq.u64 %r200,%r124,0; @ %r200 bra $L1; .loc 1 150 13 shl.b64 %r46,%r83,4; .loc 1 151 12 shl.b64 %r48,%r87,4; .loc 1 162 12 shl.b64 %r53,%r28,4; .loc 1 163 12 shl.b64 %r55,%r27,4; .loc 1 164 12 shl.b64 %r57,%r29,2; .loc 1 138 19 set.u32.eq.u64 %r202,%r87,1; neg.s32 %r203,%r202; .loc 1 138 35 set.u32.eq.u64 %r205,%r83,1; neg.s32 %r206,%r205; .loc 1 138 24 cvt.u16.u32 %r208,%r203; cvt.u16.u32 %r209,%r206; and.b16 %r207,%r208,%r209; cvt.u32.u16 %r210,%r207; cvt.u32.u8 %r93,%r210; .loc 1 167 32 ld.u64 %r91,[%frame+360]; setp.ne.u32 %r289,%r93,0; setp.le.s64 %r291,%r143,2; .loc 1 129 12 cvt.u32.u64 %r297,%r90; cvt.u32.u32 %r298,%r142; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; add.u64 %r301,%frame,840; add.u64 %r302,%frame,720; add.u64 %r303,%frame,600; add.u64 %r304,%frame,360; .loc 1 172 24 add.u64 %r305,%frame,240; .loc 1 173 24 add.u64 %r306,%frame,120; $L29: .loc 1 125 10 ld.u32 %r97,[%r126]; .loc 1 128 10 setp.ge.s32 %r211,%r97,0; @ %r211 bra $L13; .loc 1 129 12 add.u32 %r97,%r97,%r297; .loc 1 130 11 cvt.s64.s32 %r134,%r97; .loc 1 130 
10 setp.ge.s64 %r212,%r134,%r90; @ %r212 bra $L14; setp.ge.s32 %r213,%r97,0; @ %r213 bra $L15; $L14: .loc 1 132 13 rem.s64 %r134,%r134,%r90; .loc 1 133 7 setp.lt.s64 %r216,%r134,0; @ %r216 bra $L16; .loc 1 132 8 cvt.u32.u64 %r97,%r134; bra $L15; $L16: .loc 1 134 16 cvt.u32.u64 %r217,%r134; add.u32 %r97,%r297,%r217; .loc 1 136 22 cvt.s64.s32 %r134,%r97; $L15: .loc 1 147 24 sub.u64 %r140,%r90,%r134; .loc 1 138 10 @ %r289 bra $L17; .loc 1 147 4 setp.gt.s64 %r219,%r140,0; @ %r219 bra $L18; mov.u64 %r113,%r124; bra $L19; $L17: .loc 1 140 21 cvt.s64.s32 %r220,%r97; .loc 1 140 11 shl.b64 %r119,%r220,4; .loc 1 141 11 shl.b64 %r120,%r140,4; .loc 1 142 23 add.u64 %r221,%r125,%r119; .loc 1 142 40228,[%value_in]; } .loc 1 143 17 add.u64 %r230,%r124,%r120r119237,[%value_in]; } bra $L20; $L18: .loc 1 136 22 mul.lo.u64 %r239,%r134,%r87; .loc 1 136 18 shl.b64 %r240,%r239,4; .loc 1 136 11 add.u64 %r117,%r125,%r240; mov.u64 %r116,%r124; .loc 1 147 11 mov.u64 %r118,0; $L21: .loc 1 149 16 ld.f64 %r43,[%r117]; ld.f64 %r44,[%r117+8]; .loc 1 149 14 st.f64 [%r116],%r43; st.f64 [%r116+8],%r44; .loc 1 150 13 add.u64 %r116,%r116,%r46; .loc 1 151 12 add.u64 %r117,%r117,%r48; .loc 1 147 31 add.u64 %r118,%r118,1; .loc 1 147 4 setp.ne.u64 %r241,%r118,%r140; @ %r241 bra $L21; .loc 1 150 13 mad.lo.u64 %r113,%r118,%r46,%r124; $L19: .loc 1 153 4 setp.gt.s64 %r243,%r134,0; @ ! %r243 bra $L20; mov.u64 %r114,%r125; .loc 1 153 23 mov.u64 %r115,0; $L22: .loc 1 155 16 ld.f64 %r49,[%r114]; ld.f64 %r50,[%r114+8]; .loc 1 155 14 st.f64 [%r113],%r49; st.f64 [%r113+8],%r50; .loc 1 156 13 add.u64 %r113,%r113,%r46; .loc 1 157 12 add.u64 %r114,%r114,%r48; .loc 1 153 37 add.u64 %r115,%r115,1; .loc 1 153 4 setp.ne.u64 %r244,%r115,%r134; @ %r244 bra $L22; $L20: .loc 1 162 12 add.u64 %r124,%r124,%r53; .loc 1 163 12 add.u64 %r125,%r125,%r55; .loc 1 164 12 add.u64 %r126,%r126,%r57; .loc 1 165 15 ld.u64 %r245,[%frame+480]; add.u64 %r59,%r245,1; .loc 1 167 13 setp.eq.u64 %r246,%r59,%r91; @ %r246 bra $L24; .loc 1 165 15 st.u64 [%frame+480],%r59; bra $L29; $L24: .loc 1 171 20 mov.u64 %r247,0; st.u64 [%frame+480],%r247; .loc 1 172 24 ld.u64 %r61,[%frame+240]; .loc 1 173 16 ld.u64 %r249,[%frame+120]; shl.b64 %r248,%r249,4; sub.u64 %r121,%r125,%r248; .loc 1 174 9 ld.u64 %r251,[%frame]; shl.b64 %r250,%r251,2; sub.u64 %r122,%r126,%r250; .loc 1 176 14 @ %r291 bra $L1; add.u64 %r144,%frame,488; mov.u64 %r137,8; .loc 1 175 12 mov.u64 %r123,1; bra $L27; $L28: .loc 1 171 20 st.u64 [%r144],%r247; .loc 1 172 24 add.u64 %r258,%r305,%r137; ld.u64 %r61,[%r258]; .loc 1 173 24 add.u64 %r260,%r306,%r137; .loc 1 173 16 ld.u64 %r262,[%r260]; shl.b64 %r261,%r262,4; sub.u64 %r121,%r125,%r261; .loc 1 174 17 add.u64 %r263,%frame,%r137; .loc 1 174 9 ld.u64 %r265,[%r263]; shl.b64 %r264,%r265,2; sub.u64 %r122,%r126,%r264; .loc 1 175 12 add.u64 %r123,%r123,1; .loc 1 176 14 add.u64 %r144,%r144,8; add.u64 %r137,%r137,8; setp.eq.u64 %r266,%r300,%r123; @ %r266 bra $L1; $L27: .loc 1 184 23 ld.u64 %r267,[%r144]; add.u64 %r72,%r267,1; st.u64 [%r144],%r72; .loc 1 185 30 add.u64 %r269,%r301,%r137; .loc 1 185 20 ld.u64 %r271,[%r269]; sub.u64 %r270,%r271,%r61; shl.b64 %r272,%r270,4; add.u64 %r124,%r124,%r272; .loc 1 186 30 add.u64 %r274,%r302,%r137; .loc 1 186 20 ld.u64 %r276,[%r274]; shl.b64 %r275,%r276,4; add.u64 %r125,%r121,%r275; .loc 1 187 23 add.u64 %r278,%r303,%r137; .loc 1 187 13 ld.u64 %r280,[%r278]; shl.b64 %r279,%r280,2; add.u64 %r126,%r122,%r279; .loc 1 167 32 add.u64 %r282,%r304,%r137; .loc 1 167 13 ld.u64 %r283,[%r282]; setp.eq.u64 %r284,%r72,%r283; @ %r284 bra $L28; bra 
$L29; $L13: .loc 1 130 11 cvt.s64.s32 %r134,%r97; .loc 1 130 10 setp.lt.s64 %r285,%r134,%r90; @ %r285 bra $L15; .loc 1 132 13 rem.s64 %r134,%r134,%r90; .loc 1 132 8 cvt.u32.u64 %r97,%r134; bra $L15; $L1: .loc 1 191cshift1_4_c10.o/1622802246cshift1_8_i1.o/ 1622802246gfortrani_cshift1_8_i1 .visible .func _gfortrani_cshift1_8_../libgfortran/generated/cshift1_8_i1.c"ortrani_cshift1_8_i1 .visible .func _gfortrani_cshift1_8_00predpredpredpred %r185; .reg .predu16 %r197; .reg .u16predu64u64 %r266; .reg .u64pred %r281; .reg .u32 %r288; .reg .u64 %r289; .reg .u64 %r290; .reg .u64u64 %r296; .reg .u64 %r297; mov.u64 %r135,%ar0; mov.u64 %r136,%ar1; mov.u64 %r137,%ar2; mov.u64 %r138,%ar3; .loc 1 68 629; .loc 1 69 21 ld.u32 %r141,[%r138]; add.u32 %r89,%r141,-1; bra $L2; $L29: .loc 1 71 11 cvt.u32.u64 %r89,%r138; $L2: .loc 1 73 13 mov.u64 %r142,1; st.u64 [%frame+360],%r142; .loc 1 74 12 mov.u64 %r143,0; st.u64 [%frame+480],%r143; .loc 1 82 23 ld.s8 %r132,[%r136+28]; cvt.u32.u32 %r144,%r132; cvt.s64.s8 %r133,%r144; .loc 1 82 3 setp.le.s64 %r145,%r133,0; @ %r145 bra $L30; .loc 1 84 15 cvt.s64.s32 %r88,%r89; add.u64 %r69,%r136,40; add.u64 %r42,%r135,40; .loc 1 75 5 mov.u64 %r84,%r143; .loc 1 80 7 mov.u64 %r82,%r84; .loc 1 82 12 mov.u64 %r117,%r84; .loc 1 79 11 mov.u64 %r76,%r142; .loc 1 78 11 mov.u64 %r72,%r76; .loc 1 96 20 mov.u64 %r297,%r84; $L8: .loc 1 92 17 ld.u64 %r147,[%r69+16]; add.u64 %r146,%r147,1; .loc 1 92 15 ld.u64 %r148,[%r69+8]; sub.u64 %r81,%r146,%r148; .loc 1 86 19 ld.u64 %r71,[%r42]; .loc 1 89 19 ld.u64 %r73,[%r69]; .loc 1 84 10 setp.ne.u64 %r149,%r88,%r117; @ %r149 bra $L4; .loc 1 87 14 setp.ne.u64 %r150,%r71,0; .loc 1 90 14 setp.ne.u64 %r151,%r73,0; .loc 1 92 15 mov.u64 %r82,%r81; selp.u64 %r76,%r73,1,%r151; selp.u64 %r72,%r71,1,%r150; bra $L7; $L4: .loc 1 96 20 shl.b64 %r152,%r84,3; add.u64 %r153,%frame,%r152; add.u64 %r154,%r153,480; st.u64 [%r154],%r297; .loc 1 97 21 st.u64 [%r153+360],%r81; .loc 1 98 22 st.u64 [%r153+840],%r71; .loc 1 99 22 st.u64 [%r153+720],%r73; .loc 1 100 24 add.u64 %r166,%r84,%r84; add.u64 %r167,%r166,%r84; shl.b64 %r168,%r167,3; add.u64 %r169,%r137,%r168; ld.u64 %r25,[%r169+40]; .loc 1 100 22 st.u64 [%r153+600],%r25; .loc 1 101 26 mul.lo.u64 %r177,%r71,%r81; .loc 1 101 13 st.u64 [%r153+240],%r177; .loc 1 102 26 mul.lo.u64 %r181,%r73,%r81; .loc 1 102 13 st.u64 [%r153+120],%r181; .loc 1 103 26 mul.lo.u64 %r184,%r25,%r81; .loc 1 103 13 st.u64 [%r153],%r184; .loc 1 104 12 add.u64 %r84,%r84,1; $L7: .loc 1 82 55 add.u64 %r117,%r117,1; .loc 1 82 3 add.u64 %r69,%r69,24; add.u64 %r42,%r42,24; setp.ne.u64 %r185,%r117,%r133; @ %r185 bra $L8; bra $L3; $L30: .loc 1 80 7 mov.u64 %r82,%r143; .loc 1 79 11 mov.u64 %r76,%r142; .loc 1 78 11 mov.u64 %r72,%r76; $L3: .loc 1 107 14 ld.u64 %r31,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r186,%r31,0; .loc 1 108 16 selp.u64 %r31,%r31,1,%r186; .loc 1 109 14 ld.u64 %r32,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r187,%r32,0; .loc 1 110 16 selp.u64 %r32,%r32,1,%r187; .loc 1 111 14 ld.u64 %r33,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r188,%r33,0; .loc 1 112 16 selp.u64 %r33,%r33,1,%r188; .loc 1 118 8 ld.u64 %r114,[%r135]; .loc 1 119 8 ld.u64 %r115,[%r136]; .loc 1 120 8 ld.u64 %r116,[%r137]; .loc 1 122 9 setp.eq.u64 %r189,%r114,0; @ %r189 bra $L1; .loc 1 164 12 shl.b64 %r48,%r33,3; .loc 1 138 35 set.u32.eq.u64 %r191,%r72,1; neg.s32 %r192,%r191; .loc 1 138 19 set.u32.eq.u64 %r194,%r76,1; neg.s32 %r195,%r194; .loc 1 138 24 cvt.u16.u32 %r197,%r192; cvt.u16.u32 %r198,%r195; and.b16 %r196,%r197,%r198; cvt.u32.u16 %r199,%r196; cvt.u32.u8 %r119,%r199; .loc 
1 167 32 ld.u64 %r38,[%frame+360]; setp.ne.u32 %r279,%r119,0; setp.le.s64 %r281,%r133,2; cvt.u32.u32 %r288,%r132; cvt.s64.s8 %r289,%r288; add.u64 %r290,%r289,-1; add.u64 %r291,%frame,840; add.u64 %r292,%frame,720; add.u64 %r293,%frame,600; add.u64 %r294,%frame,360; .loc 1 172 24 add.u64 %r295,%frame,240; .loc 1 173 24 add.u64 %r296,%frame,120; $L28: .loc 1 125 10 ld.u64 %r90,[%r116]; .loc 1 128 10 setp.ge.s64 %r200,%r90,0; @ %r200 bra $L13; .loc 1 129 12 add.u64 %r90,%r90,%r82; $L13: .loc 1 130 11 set.u32.ge.s64 %r202,%r90,%r82; neg.s32 %r203,%r202; shr.u64 %r205,%r90,63;4; .loc 1 132 8 rem.s64 %r90,%r90,%r82; .loc 1 133 7 setp.ge.s64 %r215,%r90,0; @ %r215 bra $L14; .loc 1 134 16 add.u64 %r90,%r90,%r82; $L14: .loc 1 147 24 sub.u64 %r75,%r82,%r90; .loc 1 138 10 @ %r279 bra $L15; .loc 1 147 4 setp.gt.s64 %r217,%r75,0; @ %r217 bra $L16; mov.u64 %r134,%r114; bra $L17; $L15: .loc 1 142 23 add.u64 %r218,%r115,%r90; .loc 1 142 41121225,[%value_in]; } .loc 1 143 17 add.u64 %r227,%r114,%r75; .loc 1 143 4234,[%value_in]; } bra $L18; $L16: .loc 1 136 11 mad.lo.u64 %r107,%r90,%r76,%r115; .loc 1 157 12 mov.u64 %r106,%r114; .loc 1 147 11 mov.u64 %r108,0; $L19: .loc 1 149 14 ld.u8 %r237,[%r107]; st.u8 [%r106],%r237; .loc 1 150 13 add.u64 %r106,%r106,%r72; .loc 1 151 12 add.u64 %r107,%r107,%r76; .loc 1 147 31 add.u64 %r108,%r108,1; .loc 1 147 4 setp.ne.u64 %r238,%r75,%r108; @ %r238 bra $L19; .loc 1 150 13 mad.lo.u64 %r134,%r75,%r72,%r114; $L17: .loc 1 153 4 setp.gt.s64 %r240,%r90,0; @ ! %r240 bra $L18; .loc 1 157 12 mov.u64 %r104,%r115; .loc 1 153 23 mov.u64 %r105,0; $L21: .loc 1 155 14 ld.u8 %r241,[%r104]; st.u8 [%r134],%r241; .loc 1 156 13 add.u64 %r134,%r134,%r72; .loc 1 157 12 add.u64 %r104,%r104,%r76; .loc 1 153 37 add.u64 %r105,%r105,1; .loc 1 153 4 setp.ne.u64 %r242,%r90,%r105; @ %r242 bra $L21; $L18: .loc 1 162 12 add.u64 %r114,%r114,%r32; .loc 1 163 12 add.u64 %r115,%r115,%r31; .loc 1 164 12 add.u64 %r116,%r116,%r48; .loc 1 165 15 ld.u64 %r243,[%frame+480]; add.u64 %r50,%r243,1; .loc 1 167 13 setp.eq.u64 %r244,%r38,%r50; @ %r244 bra $L23; .loc 1 165 15 st.u64 [%frame+480],%r50; bra $L28; $L23: .loc 1 171 20 mov.u64 %r245,0; st.u64 [%frame+480],%r245; .loc 1 172 24 ld.u64 %r52,[%frame+240]; .loc 1 173 16 ld.u64 %r246,[%frame+120]; sub.u64 %r111,%r115,%r246; .loc 1 174 9 ld.u64 %r248,[%frame]; shl.b64 %r247,%r248,3; sub.u64 %r112,%r116,%r247; .loc 1 176 14 @ %r281 bra $L1; add.u64 %r124,%frame,488; mov.u64 %r122,8; .loc 1 175 12 mov.u64 %r113,1; bra $L26; $L27: .loc 1 171 20 st.u64 [%r124],%r245; .loc 1 172 24 add.u64 %r255,%r295,%r122; ld.u64 %r52,[%r255]; .loc 1 173 24 add.u64 %r257,%r296,%r122; .loc 1 173 16 ld.u64 %r258,[%r257]; sub.u64 %r111,%r115,%r258; .loc 1 174 17 add.u64 %r259,%frame,%r122; .loc 1 174 9 ld.u64 %r261,[%r259]; shl.b64 %r260,%r261,3; sub.u64 %r112,%r116,%r260; .loc 1 175 12 add.u64 %r113,%r113,1; .loc 1 176 14 add.u64 %r124,%r124,8; add.u64 %r122,%r122,8; setp.eq.u64 %r262,%r290,%r113; @ %r262 bra $L1; $L26: .loc 1 184 23 ld.u64 %r263,[%r124]; add.u64 %r61,%r263,1; st.u64 [%r124],%r61; .loc 1 185 30 add.u64 %r265,%r291,%r122; .loc 1 185 20 ld.u64 %r267,[%r265]; sub.u64 %r266,%r267,%r52; add.u64 %r114,%r114,%r266; .loc 1 186 30 add.u64 %r269,%r292,%r122; .loc 1 186 20 ld.u64 %r270,[%r269]; add.u64 %r115,%r111,%r270; .loc 1 187 23 add.u64 %r272,%r293,%r122; .loc 1 187 13 ld.u64 %r274,[%r272]; shl.b64 %r273,%r274,3; add.u64 %r116,%r112,%r273; .loc 1 167 32 add.u64 %r276,%r294,%r122; .loc 1 167 13 ld.u64 %r277,[%r276]; setp.eq.u64 %r278,%r61,%r277; @ %r278 bra $L27; bra $L28; 
$L1: .loc 1 191cshift1_8_i2.o/ 1622802246gfortrani_cshift1_8_i2 .visible .func _gfortrani_cshift1_8_../libgfortran/generated/cshift1_8_i2.c"ortrani_cshift1_8_i2 .visible .func _gfortrani_cshift1_8_0pred32predpredu32 %r215; .reg .u32predu64 %r233; .reg .u64 %r240; .reg .u64pred %r260; .reg .u64 %r261; .reg .u6464u64u32u64 %r313; .reg .u64mov.u64 %r144,%ar0; mov.u64 %r145,%ar1; mov.u64 %r146,%ar2; mov.u64 %r147,%ar3; .loc 1 68 6 setp.eq.u64 %r148,%r147,0; @ %r148 bra $L28; .loc 1 69 21 ld.u32 %r150,[%r147]; add.u32 %r97,%r150,-1; bra $L2; $L28: .loc 1 71 11 cvt.u32.u64 %r97,%r147; $L2: .loc 1 73 13 mov.u64 %r151,1; st.u64 [%frame+360],%r151; .loc 1 74 12 mov.u64 %r152,0; st.u64 [%frame+480],%r152; .loc 1 82 23 ld.s8 %r142,[%r145+28]; cvt.u32.u32 %r153,%r142; cvt.s64.s8 %r143,%r153; .loc 1 82 3 setp.le.s64 %r154,%r143,0; @ %r154 bra $L29; .loc 1 84 15 cvt.s64.s32 %r35,%r97; add.u64 %r90,%r145,40; add.u64 %r83,%r144,40; .loc 1 75 5 mov.u64 %r94,%r152; .loc 1 80 7 mov.u64 %r92,%r94; .loc 1 82 12 mov.u64 %r127,%r94; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; .loc 1 96 20 mov.u64 %r317,%r94; $L8: .loc 1 92 17 ld.u64 %r156,[%r90+16]; add.u64 %r155,%r156,1; .loc 1 92 15 ld.u64 %r157,[%r90+8]; sub.u64 %r87,%r155,%r157; .loc 1 86 19 ld.u64 %r81,[%r83]; .loc 1 89 19 ld.u64 %r85,[%r90]; .loc 1 84 10 setp.ne.u64 %r158,%r35,%r127; @ %r158 bra $L4; .loc 1 87 14 setp.ne.u64 %r159,%r81,0; .loc 1 90 14 setp.ne.u64 %r160,%r85,0; .loc 1 92 15 mov.u64 %r92,%r87; selp.u64 %r86,%r85,1,%r160; selp.u64 %r82,%r81,1,%r159; bra $L7; $L4: .loc 1 96 20 shl.b64 %r161,%r94,3; add.u64 %r162,%frame,%r161; add.u64 %r163,%r162,480; st.u64 [%r163],%r317; .loc 1 97 21 st.u64 [%r162+360],%r87; .loc 1 98 22 st.u64 [%r162+840],%r81; .loc 1 99 22 st.u64 [%r162+720],%r85; .loc 1 100 24 add.u64 %r175,%r94,%r94; add.u64 %r176,%r175,%r94; shl.b64 %r177,%r176,3; add.u64 %r178,%r146,%r177; ld.u64 %r25,[%r178+40]; .loc 1 100 22 st.u64 [%r162+600],%r25; .loc 1 101 26 mul.lo.u64 %r186,%r81,%r87; .loc 1 101 13 st.u64 [%r162+240],%r186; .loc 1 102 26 mul.lo.u64 %r190,%r87,%r85; .loc 1 102 13 st.u64 [%r162+120],%r190; .loc 1 103 26 mul.lo.u64 %r193,%r25,%r87; .loc 1 103 13 st.u64 [%r162],%r193; .loc 1 104 12 add.u64 %r94,%r94,1; $L7: .loc 1 82 55 add.u64 %r127,%r127,1; .loc 1 82 3 add.u64 %r90,%r90,24; add.u64 %r83,%r83,24; setp.ne.u64 %r194,%r127,%r143; @ %r194 bra $L8; bra $L3; $L29: .loc 1 80 7 mov.u64 %r92,%r152; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; $L3: .loc 1 107 14 ld.u64 %r29,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r195,%r29,0; .loc 1 108 16 selp.u64 %r29,%r29,1,%r195; .loc 1 109 14 ld.u64 %r30,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r196,%r30,0; .loc 1 110 16 selp.u64 %r30,%r30,1,%r196; .loc 1 111 14 ld.u64 %r31,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r197,%r31,0; .loc 1 112 16 selp.u64 %r31,%r31,1,%r197; .loc 1 118 8 ld.u64 %r124,[%r144]; .loc 1 119 8 ld.u64 %r125,[%r145]; .loc 1 120 8 ld.u64 %r126,[%r146]; .loc 1 122 9 setp.eq.u64 %r198,%r124,0; @ %r198 bra $L1; .loc 1 150 13 add.u64 %r47,%r82,%r82; .loc 1 151 12 add.u64 %r49,%r86,%r86; .loc 1 162 12 add.u64 %r52,%r30,%r30; .loc 1 163 12 add.u64 %r54,%r29,%r29; .loc 1 164 12 shl.b64 %r56,%r31,3; .loc 1 138 35 set.u32.eq.u64 %r204,%r82,1; neg.s32 %r205,%r204; .loc 1 138 19 set.u32.eq.u64 %r207,%r86,1; neg.s32 %r208,%r207; .loc 1 138 24 cvt.u16.u32 %r210,%r205; cvt.u16.u32 %r211,%r208; and.b16 %r209,%r210,%r211; cvt.u32.u16 %r212,%r209; cvt.u32.u8 %r84,%r212; .loc 1 167 32 ld.u64 %r103,[%frame+360]; setp.ne.u32 %r303,%r84,0; 
setp.le.s64 %r300,%r143,2; cvt.u32.u32 %r308,%r142; cvt.s64.s8 %r309,%r308; add.u64 %r310,%r309,-1; add.u64 %r311,%frame,840; add.u64 %r312,%frame,720; add.u64 %r313,%frame,600; add.u64 %r314,%frame,360; .loc 1 172 24 add.u64 %r315,%frame,240; .loc 1 173 24 add.u64 %r316,%frame,120; $L27: .loc 1 125 10 ld.u64 %r98,[%r126]; .loc 1 128 10 setp.ge.s64 %r213,%r98,0; @ %r213 bra $L13; .loc 1 129 12 add.u64 %r98,%r98,%r92; $L13: .loc 1 130 11 set.u32.ge.s64 %r215,%r98,%r92; neg.s32 %r216,%r215; shr.u64 %r218,%r98,63; cvt.u16.u32 %r221,%r216; cvt.u16.u64 %r222,%r218; or.b16 %r220,%r221,%r222; .loc 1 130 10 cvt.u32.u16 %r223,%r220; cvt.u16.u8 %r224,%r223; setp.eq.u16 %r225,%r224,0; @ %r225 bra $L14; .loc 1 132 8 rem.s64 %r98,%r98,%r92; .loc 1 133 7 setp.ge.s64 %r228,%r98,0; @ %r228 bra $L14; .loc 1 134 16 add.u64 %r98,%r98,%r92; $L14: .loc 1 147 24 sub.u64 %r99,%r92,%r98; .loc 1 138 10 @ %r303 bra $L15; .loc 1 147 4 setp.gt.s64 %r230,%r99,0; @ %r230 bra $L16; mov.u64 %r113,%r124; bra $L17; $L15: .loc 1 140 11 add.u64 %r231,%r98,%r98; .loc 1 141 11 add.u64 %r232,%r99,%r99; .loc 1 142 23 add.u64 %r233,%r125,%r231; .loc 1 142 41232240,[%value_in]; } .loc 1 143 17 add.u64 %r242,%r124,%r232; .loc 1 143 4r231249,[%value_in]; } bra $L18; $L16: .loc 1 136 22 mul.lo.u64 %r251,%r98,%r86; .loc 1 136 18 add.u64 %r252,%r251,%r251; .loc 1 136 11 add.u64 %r117,%r125,%r252; mov.u64 %r116,%r124; .loc 1 147 11 mov.u64 %r118,0; $L19: .loc 1 149 14 ld.u16 %r253,[%r117]; st.u16 [%r116],%r253; .loc 1 150 13 add.u64 %r116,%r116,%r47; .loc 1 151 12 add.u64 %r117,%r117,%r49; .loc 1 147 31 add.u64 %r118,%r118,1; .loc 1 147 4 setp.ne.u64 %r254,%r99,%r118; @ %r254 bra $L19; .loc 1 150 13 mad.lo.u64 %r113,%r99,%r47,%r124; $L17: .loc 1 153 4 setp.gt.s64 %r256,%r98,0; @ ! %r256 bra $L18; mov.u64 %r114,%r125; .loc 1 153 23 mov.u64 %r115,0; $L20: .loc 1 155 14 ld.u16 %r257,[%r114]; st.u16 [%r113],%r257; .loc 1 156 13 add.u64 %r113,%r113,%r47; .loc 1 157 12 add.u64 %r114,%r114,%r49; .loc 1 153 37 add.u64 %r115,%r115,1; .loc 1 153 4 setp.ne.u64 %r258,%r98,%r115; @ %r258 bra $L20; $L18: .loc 1 162 12 add.u64 %r124,%r124,%r52; .loc 1 163 12 add.u64 %r125,%r125,%r54; .loc 1 164 12 add.u64 %r126,%r126,%r56; .loc 1 165 15 ld.u64 %r259,[%frame+480]; add.u64 %r58,%r259,1; .loc 1 167 13 setp.eq.u64 %r260,%r58,%r103; @ %r260 bra $L22; .loc 1 165 15 st.u64 [%frame+480],%r58; bra $L27; $L22: .loc 1 171 20 mov.u64 %r261,0; st.u64 [%frame+480],%r261; .loc 1 172 24 ld.u64 %r60,[%frame+240]; .loc 1 173 16 ld.u64 %r262,[%frame+120]; add.u64 %r263,%r262,%r262; sub.u64 %r121,%r125,%r263; .loc 1 174 9 ld.u64 %r265,[%frame]; shl.b64 %r264,%r265,3; sub.u64 %r122,%r126,%r264; .loc 1 176 14 @ %r300 bra $L1; add.u64 %r134,%frame,488; mov.u64 %r132,8; .loc 1 175 12 mov.u64 %r123,1; bra $L25; $L26: .loc 1 171 20 st.u64 [%r134],%r261; .loc 1 172 24 add.u64 %r272,%r315,%r132; ld.u64 %r60,[%r272]; .loc 1 173 24 add.u64 %r274,%r316,%r132; .loc 1 173 16 ld.u64 %r275,[%r274]; add.u64 %r276,%r275,%r275; sub.u64 %r121,%r125,%r276; .loc 1 174 17 add.u64 %r277,%frame,%r132; .loc 1 174 9 ld.u64 %r279,[%r277]; shl.b64 %r278,%r279,3; sub.u64 %r122,%r126,%r278; .loc 1 175 12 add.u64 %r123,%r123,1; .loc 1 176 14 add.u64 %r134,%r134,8; add.u64 %r132,%r132,8; setp.eq.u64 %r280,%r310,%r123; @ %r280 bra $L1; $L25: .loc 1 184 23 ld.u64 %r281,[%r134]; add.u64 %r70,%r281,1; st.u64 [%r134],%r70; .loc 1 185 30 add.u64 %r283,%r311,%r132; .loc 1 185 20 ld.u64 %r285,[%r283]; sub.u64 %r284,%r285,%r60; add.u64 %r286,%r284,%r284; add.u64 %r124,%r124,%r286; .loc 1 186 30 add.u64 
%r288,%r312,%r132; .loc 1 186 20 ld.u64 %r289,[%r288]; add.u64 %r290,%r289,%r289; add.u64 %r125,%r121,%r290; .loc 1 187 23 add.u64 %r292,%r313,%r132; .loc 1 187 13 ld.u64 %r294,[%r292]; shl.b64 %r293,%r294,3; add.u64 %r126,%r122,%r293; .loc 1 167 32 add.u64 %r296,%r314,%r132; .loc 1 167 13 ld.u64 %r297,[%r296]; setp.eq.u64 %r298,%r70,%r297; @ %r298 bra $L26; bra $L27; $L1: .loc 1 191cshift1_8_i4.o/ 1622802247gfortrani_cshift1_8_i4 .visible .func _gfortrani_cshift1_8_../libgfortran/generated/cshift1_8_i4.c"ortrani_cshift1_8_i4 .visible .func _gfortrani_cshift1_8_0pred32predpredu16u64 %r214; .reg .u16 %r216; .reg .u16 %r217; .reg .u16 %r218; .reg .u32 %r219; .reg .u16 %r220; .reg .pred %r221; .reg .pred %r224; .reg .pred %r226; .reg .u64 %r227; .reg .u64pred %r247; .reg .pred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .predu64 %r267; .reg .u6464u64 %r305; .reg .u64 %r306; .reg .u64mov.u64 %r144,%ar0; mov.u64 %r145,%ar1; mov.u64 %r146,%ar2; mov.u64 %r147,%ar3; .loc 1 68 6 setp.eq.u64 %r148,%r147,0; @ %r148 bra $L28; .loc 1 69 21 ld.u32 %r150,[%r147]; add.u32 %r97,%r150,-1; bra $L2; $L28: .loc 1 71 11 cvt.u32.u64 %r97,%r147; $L2: .loc 1 73 13 mov.u64 %r151,1; st.u64 [%frame+360],%r151; .loc 1 74 12 mov.u64 %r152,0; st.u64 [%frame+480],%r152; .loc 1 82 23 ld.s8 %r142,[%r145+28]; cvt.u32.u32 %r153,%r142; cvt.s64.s8 %r143,%r153; .loc 1 82 3 setp.le.s64 %r154,%r143,0; @ %r154 bra $L29; .loc 1 84 15 cvt.s64.s32 %r35,%r97; add.u64 %r90,%r145,40; add.u64 %r83,%r144,40; .loc 1 75 5 mov.u64 %r94,%r152; .loc 1 80 7 mov.u64 %r92,%r94; .loc 1 82 12 mov.u64 %r127,%r94; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; .loc 1 96 20 mov.u64 %r309,%r94; $L8: .loc 1 92 17 ld.u64 %r156,[%r90+16]; add.u64 %r155,%r156,1; .loc 1 92 15 ld.u64 %r157,[%r90+8]; sub.u64 %r87,%r155,%r157; .loc 1 86 19 ld.u64 %r81,[%r83]; .loc 1 89 19 ld.u64 %r85,[%r90]; .loc 1 84 10 setp.ne.u64 %r158,%r35,%r127; @ %r158 bra $L4; .loc 1 87 14 setp.ne.u64 %r159,%r81,0; .loc 1 90 14 setp.ne.u64 %r160,%r85,0; .loc 1 92 15 mov.u64 %r92,%r87; selp.u64 %r86,%r85,1,%r160; selp.u64 %r82,%r81,1,%r159; bra $L7; $L4: .loc 1 96 20 shl.b64 %r161,%r94,3; add.u64 %r162,%frame,%r161; add.u64 %r163,%r162,480; st.u64 [%r163],%r309; .loc 1 97 21 st.u64 [%r162+360],%r87; .loc 1 98 22 st.u64 [%r162+840],%r81; .loc 1 99 22 st.u64 [%r162+720],%r85; .loc 1 100 24 add.u64 %r175,%r94,%r94; add.u64 %r176,%r175,%r94; shl.b64 %r177,%r176,3; add.u64 %r178,%r146,%r177; ld.u64 %r25,[%r178+40]; .loc 1 100 22 st.u64 [%r162+600],%r25; .loc 1 101 26 mul.lo.u64 %r186,%r81,%r87; .loc 1 101 13 st.u64 [%r162+240],%r186; .loc 1 102 26 mul.lo.u64 %r190,%r87,%r85; .loc 1 102 13 st.u64 [%r162+120],%r190; .loc 1 103 26 mul.lo.u64 %r193,%r25,%r87; .loc 1 103 13 st.u64 [%r162],%r193; .loc 1 104 12 add.u64 %r94,%r94,1; $L7: .loc 1 82 55 add.u64 %r127,%r127,1; .loc 1 82 3 add.u64 %r90,%r90,24; add.u64 %r83,%r83,24; setp.ne.u64 %r194,%r127,%r143; @ %r194 bra $L8; bra $L3; $L29: .loc 1 80 7 mov.u64 %r92,%r152; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; $L3: .loc 1 107 14 ld.u64 %r29,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r195,%r29,0; .loc 1 108 16 selp.u64 %r29,%r29,1,%r195; .loc 1 109 14 ld.u64 %r30,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r196,%r30,0; .loc 1 110 16 selp.u64 %r30,%r30,1,%r196; .loc 1 111 14 ld.u64 %r31,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r197,%r31,0; .loc 1 112 16 selp.u64 %r31,%r31,1,%r197; .loc 1 118 8 ld.u64 %r124,[%r144]; .loc 1 119 8 ld.u64 %r125,[%r145]; .loc 1 120 8 ld.u64 %r126,[%r146]; .loc 1 122 9 setp.eq.u64 
%r198,%r124,0; @ %r198 bra $L1; .loc 1 150 13 shl.b64 %r47,%r82,2; .loc 1 151 12 shl.b64 %r49,%r86,2; .loc 1 162 12 shl.b64 %r52,%r30,2; .loc 1 163 12 shl.b64 %r54,%r29,2; .loc 1 164 12 shl.b64 %r56,%r31,3; .loc 1 138 35 set.u32.eq.u64 %r200,%r82,1; neg.s32 %r201,%r200; .loc 1 138 19 set.u32.eq.u64 %r203,%r86,1; neg.s32 %r204,%r203; .loc 1 138 24 cvt.u16.u32 %r206,%r201; cvt.u16.u32 %r207,%r204; and.b16 %r205,%r206,%r207; cvt.u32.u16 %r208,%r205; cvt.u32.u8 %r84,%r208; .loc 1 167 32 ld.u64 %r103,[%frame+360]; setp.ne.u32 %r294,%r84,0; setp.le.s64 %r292,%r143,2; cvt.u32.u32 %r300,%r142; cvt.s64.s8 %r301,%r300; add.u64 %r302,%r301,-1; add.u64 %r303,%frame,840; add.u64 %r304,%frame,720; add.u64 %r305,%frame,600; add.u64 %r306,%frame,360; .loc 1 172 24 add.u64 %r307,%frame,240; .loc 1 173 24 add.u64 %r308,%frame,120; $L27: .loc 1 125 10 ld.u64 %r98,[%r126]; .loc 1 128 10 setp.ge.s64 %r209,%r98,0; @ %r209 bra $L13; .loc 1 129 12 add.u64 %r98,%r98,%r92; $L13: .loc 1 130 11 set.u32.ge.s64 %r211,%r98,%r92; neg.s32 %r212,%r211; shr.u64 %r214,%r98,63; cvt.u16.u32 %r217,%r212; cvt.u16.u64 %r218,%r214; or.b16 %r216,%r217,%r218; .loc 1 130 10 cvt.u32.u16 %r219,%r216; cvt.u16.u8 %r220,%r219; setp.eq.u16 %r221,%r220,0; @ %r221 bra $L14; .loc 1 132 8 rem.s64 %r98,%r98,%r92; .loc 1 133 7 setp.ge.s64 %r224,%r98,0; @ %r224 bra $L14; .loc 1 134 16 add.u64 %r98,%r98,%r92; $L14: .loc 1 147 24 sub.u64 %r99,%r92,%r98; .loc 1 138 10 @ %r294 bra $L15; .loc 1 147 4 setp.gt.s64 %r226,%r99,0; @ %r226 bra $L16; mov.u64 %r113,%r124; bra $L17; $L15: .loc 1 140 11 shl.b64 %r119,%r98,2; .loc 1 141 11 shl.b64 %r120,%r99,2; .loc 1 142 23 add.u64 %r227,%r125,%r119; .loc 1 142 41270234,[%value_in]; } .loc 1 143 17 add.u64 %r236,%r124,%r120r119243,[%value_in]; } bra $L18; $L16: .loc 1 136 22 mul.lo.u64 %r245,%r98,%r86; .loc 1 136 18 shl.b64 %r246,%r245,2; .loc 1 136 11 add.u64 %r117,%r125,%r246; mov.u64 %r116,%r124; .loc 1 147 11 mov.u64 %r118,0; $L19: .loc 1 149 16 ld.u32 %r45,[%r117]; .loc 1 149 14 st.u32 [%r116],%r45; .loc 1 150 13 add.u64 %r116,%r116,%r47; .loc 1 151 12 add.u64 %r117,%r117,%r49; .loc 1 147 31 add.u64 %r118,%r118,1; .loc 1 147 4 setp.ne.u64 %r247,%r99,%r118; @ %r247 bra $L19; .loc 1 150 13 mad.lo.u64 %r113,%r99,%r47,%r124; $L17: .loc 1 153 4 setp.gt.s64 %r249,%r98,0; @ ! 
%r249 bra $L18; mov.u64 %r114,%r125; .loc 1 153 23 mov.u64 %r115,0; $L20: .loc 1 155 16 ld.u32 %r50,[%r114]; .loc 1 155 14 st.u32 [%r113],%r50; .loc 1 156 13 add.u64 %r113,%r113,%r47; .loc 1 157 12 add.u64 %r114,%r114,%r49; .loc 1 153 37 add.u64 %r115,%r115,1; .loc 1 153 4 setp.ne.u64 %r250,%r98,%r115; @ %r250 bra $L20; $L18: .loc 1 162 12 add.u64 %r124,%r124,%r52; .loc 1 163 12 add.u64 %r125,%r125,%r54; .loc 1 164 12 add.u64 %r126,%r126,%r56; .loc 1 165 15 ld.u64 %r251,[%frame+480]; add.u64 %r58,%r251,1; .loc 1 167 13 setp.eq.u64 %r252,%r58,%r103; @ %r252 bra $L22; .loc 1 165 15 st.u64 [%frame+480],%r58; bra $L27; $L22: .loc 1 171 20 mov.u64 %r253,0; st.u64 [%frame+480],%r253; .loc 1 172 24 ld.u64 %r60,[%frame+240]; .loc 1 173 16 ld.u64 %r255,[%frame+120]; shl.b64 %r254,%r255,2; sub.u64 %r121,%r125,%r254; .loc 1 174 9 ld.u64 %r257,[%frame]; shl.b64 %r256,%r257,3; sub.u64 %r122,%r126,%r256; .loc 1 176 14 @ %r292 bra $L1; add.u64 %r134,%frame,488; mov.u64 %r132,8; .loc 1 175 12 mov.u64 %r123,1; bra $L25; $L26: .loc 1 171 20 st.u64 [%r134],%r253; .loc 1 172 24 add.u64 %r264,%r307,%r132; ld.u64 %r60,[%r264]; .loc 1 173 24 add.u64 %r266,%r308,%r132; .loc 1 173 16 ld.u64 %r268,[%r266]; shl.b64 %r267,%r268,2; sub.u64 %r121,%r125,%r267; .loc 1 174 17 add.u64 %r269,%frame,%r132; .loc 1 174 9 ld.u64 %r271,[%r269]; shl.b64 %r270,%r271,3; sub.u64 %r122,%r126,%r270; .loc 1 175 12 add.u64 %r123,%r123,1; .loc 1 176 14 add.u64 %r134,%r134,8; add.u64 %r132,%r132,8; setp.eq.u64 %r272,%r302,%r123; @ %r272 bra $L1; $L25: .loc 1 184 23 ld.u64 %r273,[%r134]; add.u64 %r70,%r273,1; st.u64 [%r134],%r70; .loc 1 185 30 add.u64 %r275,%r303,%r132; .loc 1 185 20 ld.u64 %r277,[%r275]; sub.u64 %r276,%r277,%r60; shl.b64 %r278,%r276,2; add.u64 %r124,%r124,%r278; .loc 1 186 30 add.u64 %r280,%r304,%r132; .loc 1 186 20 ld.u64 %r282,[%r280]; shl.b64 %r281,%r282,2; add.u64 %r125,%r121,%r281; .loc 1 187 23 add.u64 %r284,%r305,%r132; .loc 1 187 13 ld.u64 %r286,[%r284]; shl.b64 %r285,%r286,3; add.u64 %r126,%r122,%r285; .loc 1 167 32 add.u64 %r288,%r306,%r132; .loc 1 167 13 ld.u64 %r289,[%r288]; setp.eq.u64 %r290,%r70,%r289; @ %r290 bra $L26; bra $L27; $L1: .loc 1 191cshift1_8_i8.o/ 1622802246gfortrani_cshift1_8_i8 .visible .func _gfortrani_cshift1_8_../libgfortran/generated/cshift1_8_i8.c"ortrani_cshift1_8_i8 .visible .func _gfortrani_cshift1_8_0pred32predpredu16u64 %r214; .reg .u16 %r216; .reg .u16 %r217; .reg .u16 %r218; .reg .u32 %r219; .reg .u16 %r220; .reg .pred %r221; .reg .pred %r224; .reg .pred %r226; .reg .u64 %r227; .reg .u64pred %r247; .reg .pred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .predu64 %r267; .reg .u6464u64 %r305; .reg .u64 %r306; .reg .u64mov.u64 %r144,%ar0; mov.u64 %r145,%ar1; mov.u64 %r146,%ar2; mov.u64 %r147,%ar3; .loc 1 68 6 setp.eq.u64 %r148,%r147,0; @ %r148 bra $L28; .loc 1 69 21 ld.u32 %r150,[%r147]; add.u32 %r97,%r150,-1; bra $L2; $L28: .loc 1 71 11 cvt.u32.u64 %r97,%r147; $L2: .loc 1 73 13 mov.u64 %r151,1; st.u64 [%frame+360],%r151; .loc 1 74 12 mov.u64 %r152,0; st.u64 [%frame+480],%r152; .loc 1 82 23 ld.s8 %r142,[%r145+28]; cvt.u32.u32 %r153,%r142; cvt.s64.s8 %r143,%r153; .loc 1 82 3 setp.le.s64 %r154,%r143,0; @ %r154 bra $L29; .loc 1 84 15 cvt.s64.s32 %r35,%r97; add.u64 %r90,%r145,40; add.u64 %r83,%r144,40; .loc 1 75 5 mov.u64 %r94,%r152; .loc 1 80 7 mov.u64 %r92,%r94; .loc 1 82 12 mov.u64 %r127,%r94; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; .loc 1 96 20 mov.u64 %r309,%r94; $L8: .loc 1 92 17 ld.u64 %r156,[%r90+16]; add.u64 %r155,%r156,1; .loc 1 92 15 ld.u64 
%r157,[%r90+8]; sub.u64 %r87,%r155,%r157; .loc 1 86 19 ld.u64 %r81,[%r83]; .loc 1 89 19 ld.u64 %r85,[%r90]; .loc 1 84 10 setp.ne.u64 %r158,%r35,%r127; @ %r158 bra $L4; .loc 1 87 14 setp.ne.u64 %r159,%r81,0; .loc 1 90 14 setp.ne.u64 %r160,%r85,0; .loc 1 92 15 mov.u64 %r92,%r87; selp.u64 %r86,%r85,1,%r160; selp.u64 %r82,%r81,1,%r159; bra $L7; $L4: .loc 1 96 20 shl.b64 %r161,%r94,3; add.u64 %r162,%frame,%r161; add.u64 %r163,%r162,480; st.u64 [%r163],%r309; .loc 1 97 21 st.u64 [%r162+360],%r87; .loc 1 98 22 st.u64 [%r162+840],%r81; .loc 1 99 22 st.u64 [%r162+720],%r85; .loc 1 100 24 add.u64 %r175,%r94,%r94; add.u64 %r176,%r175,%r94; shl.b64 %r177,%r176,3; add.u64 %r178,%r146,%r177; ld.u64 %r25,[%r178+40]; .loc 1 100 22 st.u64 [%r162+600],%r25; .loc 1 101 26 mul.lo.u64 %r186,%r81,%r87; .loc 1 101 13 st.u64 [%r162+240],%r186; .loc 1 102 26 mul.lo.u64 %r190,%r87,%r85; .loc 1 102 13 st.u64 [%r162+120],%r190; .loc 1 103 26 mul.lo.u64 %r193,%r25,%r87; .loc 1 103 13 st.u64 [%r162],%r193; .loc 1 104 12 add.u64 %r94,%r94,1; $L7: .loc 1 82 55 add.u64 %r127,%r127,1; .loc 1 82 3 add.u64 %r90,%r90,24; add.u64 %r83,%r83,24; setp.ne.u64 %r194,%r127,%r143; @ %r194 bra $L8; bra $L3; $L29: .loc 1 80 7 mov.u64 %r92,%r152; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; $L3: .loc 1 107 14 ld.u64 %r29,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r195,%r29,0; .loc 1 108 16 selp.u64 %r29,%r29,1,%r195; .loc 1 109 14 ld.u64 %r30,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r196,%r30,0; .loc 1 110 16 selp.u64 %r30,%r30,1,%r196; .loc 1 111 14 ld.u64 %r31,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r197,%r31,0; .loc 1 112 16 selp.u64 %r31,%r31,1,%r197; .loc 1 118 8 ld.u64 %r124,[%r144]; .loc 1 119 8 ld.u64 %r125,[%r145]; .loc 1 120 8 ld.u64 %r126,[%r146]; .loc 1 122 9 setp.eq.u64 %r198,%r124,0; @ %r198 bra $L1; .loc 1 150 13 shl.b64 %r47,%r82,3; .loc 1 151 12 shl.b64 %r49,%r86,3; .loc 1 162 12 shl.b64 %r52,%r30,3; .loc 1 163 12 shl.b64 %r54,%r29,3; .loc 1 164 12 shl.b64 %r56,%r31,3; .loc 1 138 35 set.u32.eq.u64 %r200,%r82,1; neg.s32 %r201,%r200; .loc 1 138 19 set.u32.eq.u64 %r203,%r86,1; neg.s32 %r204,%r203; .loc 1 138 24 cvt.u16.u32 %r206,%r201; cvt.u16.u32 %r207,%r204; and.b16 %r205,%r206,%r207; cvt.u32.u16 %r208,%r205; cvt.u32.u8 %r84,%r208; .loc 1 167 32 ld.u64 %r103,[%frame+360]; setp.ne.u32 %r294,%r84,0; setp.le.s64 %r292,%r143,2; cvt.u32.u32 %r300,%r142; cvt.s64.s8 %r301,%r300; add.u64 %r302,%r301,-1; add.u64 %r303,%frame,840; add.u64 %r304,%frame,720; add.u64 %r305,%frame,600; add.u64 %r306,%frame,360; .loc 1 172 24 add.u64 %r307,%frame,240; .loc 1 173 24 add.u64 %r308,%frame,120; $L27: .loc 1 125 10 ld.u64 %r98,[%r126]; .loc 1 128 10 setp.ge.s64 %r209,%r98,0; @ %r209 bra $L13; .loc 1 129 12 add.u64 %r98,%r98,%r92; $L13: .loc 1 130 11 set.u32.ge.s64 %r211,%r98,%r92; neg.s32 %r212,%r211; shr.u64 %r214,%r98,63; cvt.u16.u32 %r217,%r212; cvt.u16.u64 %r218,%r214; or.b16 %r216,%r217,%r218; .loc 1 130 10 cvt.u32.u16 %r219,%r216; cvt.u16.u8 %r220,%r219; setp.eq.u16 %r221,%r220,0; @ %r221 bra $L14; .loc 1 132 8 rem.s64 %r98,%r98,%r92; .loc 1 133 7 setp.ge.s64 %r224,%r98,0; @ %r224 bra $L14; .loc 1 134 16 add.u64 %r98,%r98,%r92; $L14: .loc 1 147 24 sub.u64 %r99,%r92,%r98; .loc 1 138 10 @ %r294 bra $L15; .loc 1 147 4 setp.gt.s64 %r226,%r99,0; @ %r226 bra $L16; mov.u64 %r113,%r124; bra $L17; $L15: .loc 1 140 11 shl.b64 %r119,%r98,3; .loc 1 141 11 shl.b64 %r120,%r99,3; .loc 1 142 23 add.u64 %r227,%r125,%r119; .loc 1 142 41270234,[%value_in]; } .loc 1 143 17 add.u64 %r236,%r124,%r120r119243,[%value_in]; } bra $L18; $L16: 
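// The $L16/$L19 block below is the strided element-copy path of cshift1 (the .loc
// references point at roughly lines 136-151 of cshift1_8_i8.c): it copies len-shift
// elements one at a time, stepping the source pointer by %r49 (sstride * 8 bytes)
// and the destination pointer by %r47 (rstride * 8 bytes). A rough source-level
// sketch, assuming the usual libgfortran cshift1 shape (names are illustrative):
//   for (n = 0; n < len - sh; n++) { dest[n*rstride] = src[(n + sh)*sstride]; }
// The $L15 branch above is the fast path taken when both strides are one element;
// there the copy is done with two block-copy (memcpy-style) calls instead.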
.loc 1 136 22 mul.lo.u64 %r245,%r98,%r86; .loc 1 136 18 shl.b64 %r246,%r245,3; .loc 1 136 11 add.u64 %r117,%r125,%r246; mov.u64 %r116,%r124; .loc 1 147 11 mov.u64 %r118,0; $L19: .loc 1 149 16 ld.u64 %r45,[%r117]; .loc 1 149 14 st.u64 [%r116],%r45; .loc 1 150 13 add.u64 %r116,%r116,%r47; .loc 1 151 12 add.u64 %r117,%r117,%r49; .loc 1 147 31 add.u64 %r118,%r118,1; .loc 1 147 4 setp.ne.u64 %r247,%r99,%r118; @ %r247 bra $L19; .loc 1 150 13 mad.lo.u64 %r113,%r99,%r47,%r124; $L17: .loc 1 153 4 setp.gt.s64 %r249,%r98,0; @ ! %r249 bra $L18; mov.u64 %r114,%r125; .loc 1 153 23 mov.u64 %r115,0; $L20: .loc 1 155 16 ld.u64 %r50,[%r114]; .loc 1 155 14 st.u64 [%r113],%r50; .loc 1 156 13 add.u64 %r113,%r113,%r47; .loc 1 157 12 add.u64 %r114,%r114,%r49; .loc 1 153 37 add.u64 %r115,%r115,1; .loc 1 153 4 setp.ne.u64 %r250,%r98,%r115; @ %r250 bra $L20; $L18: .loc 1 162 12 add.u64 %r124,%r124,%r52; .loc 1 163 12 add.u64 %r125,%r125,%r54; .loc 1 164 12 add.u64 %r126,%r126,%r56; .loc 1 165 15 ld.u64 %r251,[%frame+480]; add.u64 %r58,%r251,1; .loc 1 167 13 setp.eq.u64 %r252,%r58,%r103; @ %r252 bra $L22; .loc 1 165 15 st.u64 [%frame+480],%r58; bra $L27; $L22: .loc 1 171 20 mov.u64 %r253,0; st.u64 [%frame+480],%r253; .loc 1 172 24 ld.u64 %r60,[%frame+240]; .loc 1 173 16 ld.u64 %r255,[%frame+120]; shl.b64 %r254,%r255,3; sub.u64 %r121,%r125,%r254; .loc 1 174 9 ld.u64 %r257,[%frame]; shl.b64 %r256,%r257,3; sub.u64 %r122,%r126,%r256; .loc 1 176 14 @ %r292 bra $L1; add.u64 %r134,%frame,488; mov.u64 %r132,8; .loc 1 175 12 mov.u64 %r123,1; bra $L25; $L26: .loc 1 171 20 st.u64 [%r134],%r253; .loc 1 172 24 add.u64 %r264,%r307,%r132; ld.u64 %r60,[%r264]; .loc 1 173 24 add.u64 %r266,%r308,%r132; .loc 1 173 16 ld.u64 %r268,[%r266]; shl.b64 %r267,%r268,3; sub.u64 %r121,%r125,%r267; .loc 1 174 17 add.u64 %r269,%frame,%r132; .loc 1 174 9 ld.u64 %r271,[%r269]; shl.b64 %r270,%r271,3; sub.u64 %r122,%r126,%r270; .loc 1 175 12 add.u64 %r123,%r123,1; .loc 1 176 14 add.u64 %r134,%r134,8; add.u64 %r132,%r132,8; setp.eq.u64 %r272,%r302,%r123; @ %r272 bra $L1; $L25: .loc 1 184 23 ld.u64 %r273,[%r134]; add.u64 %r70,%r273,1; st.u64 [%r134],%r70; .loc 1 185 30 add.u64 %r275,%r303,%r132; .loc 1 185 20 ld.u64 %r277,[%r275]; sub.u64 %r276,%r277,%r60; shl.b64 %r278,%r276,3; add.u64 %r124,%r124,%r278; .loc 1 186 30 add.u64 %r280,%r304,%r132; .loc 1 186 20 ld.u64 %r282,[%r280]; shl.b64 %r281,%r282,3; add.u64 %r125,%r121,%r281; .loc 1 187 23 add.u64 %r284,%r305,%r132; .loc 1 187 13 ld.u64 %r286,[%r284]; shl.b64 %r285,%r286,3; add.u64 %r126,%r122,%r285; .loc 1 167 32 add.u64 %r288,%r306,%r132; .loc 1 167 13 ld.u64 %r289,[%r288]; setp.eq.u64 %r290,%r70,%r289; @ %r290 bra $L26; bra $L27; $L1: .loc 1 191cshift1_8_i16.o/1622802247gfortrani_cshift1_8_i16 .visible .func _gfortrani_cshift1_8_i16../libgfortran/generated/cshift1_8_i16.c"ortrani_cshift1_8_i16 .visible .func _gfortrani_cshift1_8_i160pred32predpredu16u64 %r214; .reg .u16 %r216; .reg .u16 %r217; .reg .u16 %r218; .reg .u32 %r219; .reg .u16 %r220; .reg .pred %r221; .reg .pred %r224; .reg .pred %r226; .reg .u64 %r227; .reg .u64pred %r249; .reg .predpred %r276; .reg .u64 %r277; .reg .u64 %r279; .reg .u64 %r280; .reg .u64 %r281; .reg .u64 %r282; .reg .u64predpred %r300; .reg .pred %r303; .reg .u32u64 %r313; .reg .u64mov.u64 %r144,%ar0; mov.u64 %r145,%ar1; mov.u64 %r146,%ar2; mov.u64 %r147,%ar3; .loc 1 68 6 setp.eq.u64 %r148,%r147,0; @ %r148 bra $L28; .loc 1 69 21 ld.u32 %r150,[%r147]; add.u32 %r97,%r150,-1; bra $L2; $L28: .loc 1 71 11 cvt.u32.u64 %r97,%r147; $L2: .loc 1 73 13 mov.u64 %r151,1; st.u64 
[%frame+360],%r151; .loc 1 74 12 mov.u64 %r152,0; st.u64 [%frame+480],%r152; .loc 1 82 23 ld.s8 %r142,[%r145+28]; cvt.u32.u32 %r153,%r142; cvt.s64.s8 %r143,%r153; .loc 1 82 3 setp.le.s64 %r154,%r143,0; @ %r154 bra $L29; .loc 1 84 15 cvt.s64.s32 %r35,%r97; add.u64 %r90,%r145,40; add.u64 %r83,%r144,40; .loc 1 75 5 mov.u64 %r94,%r152; .loc 1 80 7 mov.u64 %r92,%r94; .loc 1 82 12 mov.u64 %r127,%r94; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; .loc 1 96 20 mov.u64 %r317,%r94; $L8: .loc 1 92 17 ld.u64 %r156,[%r90+16]; add.u64 %r155,%r156,1; .loc 1 92 15 ld.u64 %r157,[%r90+8]; sub.u64 %r87,%r155,%r157; .loc 1 86 19 ld.u64 %r81,[%r83]; .loc 1 89 19 ld.u64 %r85,[%r90]; .loc 1 84 10 setp.ne.u64 %r158,%r35,%r127; @ %r158 bra $L4; .loc 1 87 14 setp.ne.u64 %r159,%r81,0; .loc 1 90 14 setp.ne.u64 %r160,%r85,0; .loc 1 92 15 mov.u64 %r92,%r87; selp.u64 %r86,%r85,1,%r160; selp.u64 %r82,%r81,1,%r159; bra $L7; $L4: .loc 1 96 20 shl.b64 %r161,%r94,3; add.u64 %r162,%frame,%r161; add.u64 %r163,%r162,480; st.u64 [%r163],%r317; .loc 1 97 21 st.u64 [%r162+360],%r87; .loc 1 98 22 st.u64 [%r162+840],%r81; .loc 1 99 22 st.u64 [%r162+720],%r85; .loc 1 100 24 add.u64 %r175,%r94,%r94; add.u64 %r176,%r175,%r94; shl.b64 %r177,%r176,3; add.u64 %r178,%r146,%r177; ld.u64 %r25,[%r178+40]; .loc 1 100 22 st.u64 [%r162+600],%r25; .loc 1 101 26 mul.lo.u64 %r186,%r81,%r87; .loc 1 101 13 st.u64 [%r162+240],%r186; .loc 1 102 26 mul.lo.u64 %r190,%r87,%r85; .loc 1 102 13 st.u64 [%r162+120],%r190; .loc 1 103 26 mul.lo.u64 %r193,%r25,%r87; .loc 1 103 13 st.u64 [%r162],%r193; .loc 1 104 12 add.u64 %r94,%r94,1; $L7: .loc 1 82 55 add.u64 %r127,%r127,1; .loc 1 82 3 add.u64 %r90,%r90,24; add.u64 %r83,%r83,24; setp.ne.u64 %r194,%r127,%r143; @ %r194 bra $L8; bra $L3; $L29: .loc 1 80 7 mov.u64 %r92,%r152; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; $L3: .loc 1 107 14 ld.u64 %r29,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r195,%r29,0; .loc 1 108 16 selp.u64 %r29,%r29,1,%r195; .loc 1 109 14 ld.u64 %r30,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r196,%r30,0; .loc 1 110 16 selp.u64 %r30,%r30,1,%r196; .loc 1 111 14 ld.u64 %r31,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r197,%r31,0; .loc 1 112 16 selp.u64 %r31,%r31,1,%r197; .loc 1 118 8 ld.u64 %r124,[%r144]; .loc 1 119 8 ld.u64 %r125,[%r145]; .loc 1 120 8 ld.u64 %r126,[%r146]; .loc 1 122 9 setp.eq.u64 %r198,%r124,0; @ %r198 bra $L1; .loc 1 150 13 shl.b64 %r47,%r82,4; .loc 1 151 12 shl.b64 %r49,%r86,4; .loc 1 162 12 shl.b64 %r52,%r30,4; .loc 1 163 12 shl.b64 %r54,%r29,4; .loc 1 164 12 shl.b64 %r56,%r31,3; .loc 1 138 35 set.u32.eq.u64 %r200,%r82,1; neg.s32 %r201,%r200; .loc 1 138 19 set.u32.eq.u64 %r203,%r86,1; neg.s32 %r204,%r203; .loc 1 138 24 cvt.u16.u32 %r206,%r201; cvt.u16.u32 %r207,%r204; and.b16 %r205,%r206,%r207; cvt.u32.u16 %r208,%r205; cvt.u32.u8 %r84,%r208; .loc 1 167 32 ld.u64 %r103,[%frame+360]; setp.ne.u32 %r303,%r84,0; setp.le.s64 %r300,%r143,2; cvt.u32.u32 %r308,%r142; cvt.s64.s8 %r309,%r308; add.u64 %r310,%r309,-1; add.u64 %r311,%frame,840; add.u64 %r312,%frame,720; add.u64 %r313,%frame,600; add.u64 %r314,%frame,360; .loc 1 172 24 add.u64 %r315,%frame,240; .loc 1 173 24 add.u64 %r316,%frame,120; $L27: .loc 1 125 10 ld.u64 %r98,[%r126]; .loc 1 128 10 setp.ge.s64 %r209,%r98,0; @ %r209 bra $L13; .loc 1 129 12 add.u64 %r98,%r98,%r92; $L13: .loc 1 130 11 set.u32.ge.s64 %r211,%r98,%r92; neg.s32 %r212,%r211; shr.u64 %r214,%r98,63; cvt.u16.u32 %r217,%r212; cvt.u16.u64 %r218,%r214; or.b16 %r216,%r217,%r218; .loc 1 130 10 cvt.u32.u16 %r219,%r216; cvt.u16.u8 
%r220,%r219; setp.eq.u16 %r221,%r220,0; @ %r221 bra $L14; .loc 1 132 8 rem.s64 %r98,%r98,%r92; .loc 1 133 7 setp.ge.s64 %r224,%r98,0; @ %r224 bra $L14; .loc 1 134 16 add.u64 %r98,%r98,%r92; $L14: .loc 1 147 24 sub.u64 %r99,%r92,%r98; .loc 1 138 10 @ %r303 bra $L15; .loc 1 147 4 setp.gt.s64 %r226,%r99,0; @ %r226 bra $L16; mov.u64 %r113,%r124; bra $L17; $L15: .loc 1 140 11 shl.b64 %r119,%r98,4; .loc 1 141 11 shl.b64 %r120,%r99,4; .loc 1 142 23 add.u64 %r227,%r125,%r119; .loc 1 142 41270234,[%value_in]; } .loc 1 143 17 add.u64 %r236,%r124,%r120r119243,[%value_in]; } bra $L18; $L16: .loc 1 136 22 mul.lo.u64 %r245,%r98,%r86; .loc 1 136 18 shl.b64 %r246,%r245,4; .loc 1 136 11 add.u64 %r117,%r125,%r246; mov.u64 %r116,%r124; .loc 1 147 11 mov.u64 %r118,0; $L19: .loc 1 149 16 ld.u64 %r295,[%r117]; ld.u64 %r296,[%r117+8]; .loc 1 149 14 st.u64 [%r116],%r295; st.u64 [%r116+8],%r296; .loc 1 150 13 add.u64 %r116,%r116,%r47; .loc 1 151 12 add.u64 %r117,%r117,%r49; .loc 1 147 31 add.u64 %r118,%r118,1; .loc 1 147 4 setp.ne.u64 %r249,%r99,%r118; @ %r249 bra $L19; .loc 1 150 13 mad.lo.u64 %r113,%r99,%r47,%r124; $L17: .loc 1 153 4 setp.gt.s64 %r251,%r98,0; @ ! %r251 bra $L18; mov.u64 %r114,%r125; .loc 1 153 23 mov.u64 %r115,0; $L20: .loc 1 155 16 ld.u64 %r297,[%r114]; ld.u64 %r298,[%r114+8]; .loc 1 155 14 st.u64 [%r113],%r297; st.u64 [%r113+8],%r298; .loc 1 156 13 add.u64 %r113,%r113,%r47; .loc 1 157 12 add.u64 %r114,%r114,%r49; .loc 1 153 37 add.u64 %r115,%r115,1; .loc 1 153 4 setp.ne.u64 %r254,%r98,%r115; @ %r254 bra $L20; $L18: .loc 1 162 12 add.u64 %r124,%r124,%r52; .loc 1 163 12 add.u64 %r125,%r125,%r54; .loc 1 164 12 add.u64 %r126,%r126,%r56; .loc 1 165 15 ld.u64 %r255,[%frame+480]; add.u64 %r58,%r255,1; .loc 1 167 13 setp.eq.u64 %r256,%r58,%r103; @ %r256 bra $L22; .loc 1 165 15 st.u64 [%frame+480],%r58; bra $L27; $L22: .loc 1 171 20 mov.u64 %r257,0; st.u64 [%frame+480],%r257; .loc 1 172 24 ld.u64 %r60,[%frame+240]; .loc 1 173 16 ld.u64 %r259,[%frame+120]; shl.b64 %r258,%r259,4; sub.u64 %r121,%r125,%r258; .loc 1 174 9 ld.u64 %r261,[%frame]; shl.b64 %r260,%r261,3; sub.u64 %r122,%r126,%r260; .loc 1 176 14 @ %r300 bra $L1; add.u64 %r134,%frame,488; mov.u64 %r132,8; .loc 1 175 12 mov.u64 %r123,1; bra $L25; $L26: .loc 1 171 20 st.u64 [%r134],%r257; .loc 1 172 24 add.u64 %r268,%r315,%r132; ld.u64 %r60,[%r268]; .loc 1 173 24 add.u64 %r270,%r316,%r132; .loc 1 173 16 ld.u64 %r272,[%r270]; shl.b64 %r271,%r272,4; sub.u64 %r121,%r125,%r271; .loc 1 174 17 add.u64 %r273,%frame,%r132; .loc 1 174 9 ld.u64 %r275,[%r273]; shl.b64 %r274,%r275,3; sub.u64 %r122,%r126,%r274; .loc 1 175 12 add.u64 %r123,%r123,1; .loc 1 176 14 add.u64 %r134,%r134,8; add.u64 %r132,%r132,8; setp.eq.u64 %r276,%r310,%r123; @ %r276 bra $L1; $L25: .loc 1 184 23 ld.u64 %r277,[%r134]; add.u64 %r70,%r277,1; st.u64 [%r134],%r70; .loc 1 185 30 add.u64 %r279,%r311,%r132; .loc 1 185 20 ld.u64 %r281,[%r279]; sub.u64 %r280,%r281,%r60; shl.b64 %r282,%r280,4; add.u64 %r124,%r124,%r282; .loc 1 186 30 add.u64 %r284,%r312,%r132; .loc 1 186 20 ld.u64 %r286,[%r284]; shl.b64 %r285,%r286,4; add.u64 %r125,%r121,%r285; .loc 1 187 23 add.u64 %r288,%r313,%r132; .loc 1 187 13 ld.u64 %r290,[%r288]; shl.b64 %r289,%r290,3; add.u64 %r126,%r122,%r289; .loc 1 167 32 add.u64 %r292,%r314,%r132; .loc 1 167 13 ld.u64 %r293,[%r292]; setp.eq.u64 %r294,%r70,%r293; @ %r294 bra $L26; bra $L27; $L1: .loc 1 191 cshift1_8_r4.o/ 1622802247gfortrani_cshift1_8_r4 .visible .func _gfortrani_cshift1_8_r../libgfortran/generated/cshift1_8_r4.c"ortrani_cshift1_8_r4 .visible .func 
_gfortrani_cshift1_8_r0pred32predpredu16u64 %r214; .reg .u16 %r216; .reg .u16 %r217; .reg .u16 %r218; .reg .u32 %r219; .reg .u16 %r220; .reg .pred %r221; .reg .pred %r224; .reg .pred %r226; .reg .u64 %r227; .reg .u64pred %r247; .reg .pred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .predu64 %r267; .reg .u6464u64 %r305; .reg .u64 %r306; .reg .u64mov.u64 %r144,%ar0; mov.u64 %r145,%ar1; mov.u64 %r146,%ar2; mov.u64 %r147,%ar3; .loc 1 68 6 setp.eq.u64 %r148,%r147,0; @ %r148 bra $L28; .loc 1 69 21 ld.u32 %r150,[%r147]; add.u32 %r97,%r150,-1; bra $L2; $L28: .loc 1 71 11 cvt.u32.u64 %r97,%r147; $L2: .loc 1 73 13 mov.u64 %r151,1; st.u64 [%frame+360],%r151; .loc 1 74 12 mov.u64 %r152,0; st.u64 [%frame+480],%r152; .loc 1 82 23 ld.s8 %r142,[%r145+28]; cvt.u32.u32 %r153,%r142; cvt.s64.s8 %r143,%r153; .loc 1 82 3 setp.le.s64 %r154,%r143,0; @ %r154 bra $L29; .loc 1 84 15 cvt.s64.s32 %r35,%r97; add.u64 %r90,%r145,40; add.u64 %r83,%r144,40; .loc 1 75 5 mov.u64 %r94,%r152; .loc 1 80 7 mov.u64 %r92,%r94; .loc 1 82 12 mov.u64 %r127,%r94; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; .loc 1 96 20 mov.u64 %r309,%r94; $L8: .loc 1 92 17 ld.u64 %r156,[%r90+16]; add.u64 %r155,%r156,1; .loc 1 92 15 ld.u64 %r157,[%r90+8]; sub.u64 %r87,%r155,%r157; .loc 1 86 19 ld.u64 %r81,[%r83]; .loc 1 89 19 ld.u64 %r85,[%r90]; .loc 1 84 10 setp.ne.u64 %r158,%r35,%r127; @ %r158 bra $L4; .loc 1 87 14 setp.ne.u64 %r159,%r81,0; .loc 1 90 14 setp.ne.u64 %r160,%r85,0; .loc 1 92 15 mov.u64 %r92,%r87; selp.u64 %r86,%r85,1,%r160; selp.u64 %r82,%r81,1,%r159; bra $L7; $L4: .loc 1 96 20 shl.b64 %r161,%r94,3; add.u64 %r162,%frame,%r161; add.u64 %r163,%r162,480; st.u64 [%r163],%r309; .loc 1 97 21 st.u64 [%r162+360],%r87; .loc 1 98 22 st.u64 [%r162+840],%r81; .loc 1 99 22 st.u64 [%r162+720],%r85; .loc 1 100 24 add.u64 %r175,%r94,%r94; add.u64 %r176,%r175,%r94; shl.b64 %r177,%r176,3; add.u64 %r178,%r146,%r177; ld.u64 %r25,[%r178+40]; .loc 1 100 22 st.u64 [%r162+600],%r25; .loc 1 101 26 mul.lo.u64 %r186,%r81,%r87; .loc 1 101 13 st.u64 [%r162+240],%r186; .loc 1 102 26 mul.lo.u64 %r190,%r87,%r85; .loc 1 102 13 st.u64 [%r162+120],%r190; .loc 1 103 26 mul.lo.u64 %r193,%r25,%r87; .loc 1 103 13 st.u64 [%r162],%r193; .loc 1 104 12 add.u64 %r94,%r94,1; $L7: .loc 1 82 55 add.u64 %r127,%r127,1; .loc 1 82 3 add.u64 %r90,%r90,24; add.u64 %r83,%r83,24; setp.ne.u64 %r194,%r127,%r143; @ %r194 bra $L8; bra $L3; $L29: .loc 1 80 7 mov.u64 %r92,%r152; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; $L3: .loc 1 107 14 ld.u64 %r29,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r195,%r29,0; .loc 1 108 16 selp.u64 %r29,%r29,1,%r195; .loc 1 109 14 ld.u64 %r30,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r196,%r30,0; .loc 1 110 16 selp.u64 %r30,%r30,1,%r196; .loc 1 111 14 ld.u64 %r31,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r197,%r31,0; .loc 1 112 16 selp.u64 %r31,%r31,1,%r197; .loc 1 118 8 ld.u64 %r124,[%r144]; .loc 1 119 8 ld.u64 %r125,[%r145]; .loc 1 120 8 ld.u64 %r126,[%r146]; .loc 1 122 9 setp.eq.u64 %r198,%r124,0; @ %r198 bra $L1; .loc 1 150 13 shl.b64 %r47,%r82,2; .loc 1 151 12 shl.b64 %r49,%r86,2; .loc 1 162 12 shl.b64 %r52,%r30,2; .loc 1 163 12 shl.b64 %r54,%r29,2; .loc 1 164 12 shl.b64 %r56,%r31,3; .loc 1 138 35 set.u32.eq.u64 %r200,%r82,1; neg.s32 %r201,%r200; .loc 1 138 19 set.u32.eq.u64 %r203,%r86,1; neg.s32 %r204,%r203; .loc 1 138 24 cvt.u16.u32 %r206,%r201; cvt.u16.u32 %r207,%r204; and.b16 %r205,%r206,%r207; cvt.u32.u16 %r208,%r205; cvt.u32.u8 %r84,%r208; .loc 1 167 32 ld.u64 %r103,[%frame+360]; setp.ne.u32 %r294,%r84,0; setp.le.s64 
%r292,%r143,2; cvt.u32.u32 %r300,%r142; cvt.s64.s8 %r301,%r300; add.u64 %r302,%r301,-1; add.u64 %r303,%frame,840; add.u64 %r304,%frame,720; add.u64 %r305,%frame,600; add.u64 %r306,%frame,360; .loc 1 172 24 add.u64 %r307,%frame,240; .loc 1 173 24 add.u64 %r308,%frame,120; $L27: .loc 1 125 10 ld.u64 %r98,[%r126]; .loc 1 128 10 setp.ge.s64 %r209,%r98,0; @ %r209 bra $L13; .loc 1 129 12 add.u64 %r98,%r98,%r92; $L13: .loc 1 130 11 set.u32.ge.s64 %r211,%r98,%r92; neg.s32 %r212,%r211; shr.u64 %r214,%r98,63; cvt.u16.u32 %r217,%r212; cvt.u16.u64 %r218,%r214; or.b16 %r216,%r217,%r218; .loc 1 130 10 cvt.u32.u16 %r219,%r216; cvt.u16.u8 %r220,%r219; setp.eq.u16 %r221,%r220,0; @ %r221 bra $L14; .loc 1 132 8 rem.s64 %r98,%r98,%r92; .loc 1 133 7 setp.ge.s64 %r224,%r98,0; @ %r224 bra $L14; .loc 1 134 16 add.u64 %r98,%r98,%r92; $L14: .loc 1 147 24 sub.u64 %r99,%r92,%r98; .loc 1 138 10 @ %r294 bra $L15; .loc 1 147 4 setp.gt.s64 %r226,%r99,0; @ %r226 bra $L16; mov.u64 %r113,%r124; bra $L17; $L15: .loc 1 140 11 shl.b64 %r119,%r98,2; .loc 1 141 11 shl.b64 %r120,%r99,2; .loc 1 142 23 add.u64 %r227,%r125,%r119; .loc 1 142 41270234,[%value_in]; } .loc 1 143 17 add.u64 %r236,%r124,%r120r119243,[%value_in]; } bra $L18; $L16: .loc 1 136 22 mul.lo.u64 %r245,%r98,%r86; .loc 1 136 18 shl.b64 %r246,%r245,2; .loc 1 136 11 add.u64 %r117,%r125,%r246; mov.u64 %r116,%r124; .loc 1 147 11 mov.u64 %r118,0; $L19: .loc 1 149 16 ld.f32 %r45,[%r117]; .loc 1 149 14 st.f32 [%r116],%r45; .loc 1 150 13 add.u64 %r116,%r116,%r47; .loc 1 151 12 add.u64 %r117,%r117,%r49; .loc 1 147 31 add.u64 %r118,%r118,1; .loc 1 147 4 setp.ne.u64 %r247,%r99,%r118; @ %r247 bra $L19; .loc 1 150 13 mad.lo.u64 %r113,%r99,%r47,%r124; $L17: .loc 1 153 4 setp.gt.s64 %r249,%r98,0; @ ! %r249 bra $L18; mov.u64 %r114,%r125; .loc 1 153 23 mov.u64 %r115,0; $L20: .loc 1 155 16 ld.f32 %r50,[%r114]; .loc 1 155 14 st.f32 [%r113],%r50; .loc 1 156 13 add.u64 %r113,%r113,%r47; .loc 1 157 12 add.u64 %r114,%r114,%r49; .loc 1 153 37 add.u64 %r115,%r115,1; .loc 1 153 4 setp.ne.u64 %r250,%r98,%r115; @ %r250 bra $L20; $L18: .loc 1 162 12 add.u64 %r124,%r124,%r52; .loc 1 163 12 add.u64 %r125,%r125,%r54; .loc 1 164 12 add.u64 %r126,%r126,%r56; .loc 1 165 15 ld.u64 %r251,[%frame+480]; add.u64 %r58,%r251,1; .loc 1 167 13 setp.eq.u64 %r252,%r58,%r103; @ %r252 bra $L22; .loc 1 165 15 st.u64 [%frame+480],%r58; bra $L27; $L22: .loc 1 171 20 mov.u64 %r253,0; st.u64 [%frame+480],%r253; .loc 1 172 24 ld.u64 %r60,[%frame+240]; .loc 1 173 16 ld.u64 %r255,[%frame+120]; shl.b64 %r254,%r255,2; sub.u64 %r121,%r125,%r254; .loc 1 174 9 ld.u64 %r257,[%frame]; shl.b64 %r256,%r257,3; sub.u64 %r122,%r126,%r256; .loc 1 176 14 @ %r292 bra $L1; add.u64 %r134,%frame,488; mov.u64 %r132,8; .loc 1 175 12 mov.u64 %r123,1; bra $L25; $L26: .loc 1 171 20 st.u64 [%r134],%r253; .loc 1 172 24 add.u64 %r264,%r307,%r132; ld.u64 %r60,[%r264]; .loc 1 173 24 add.u64 %r266,%r308,%r132; .loc 1 173 16 ld.u64 %r268,[%r266]; shl.b64 %r267,%r268,2; sub.u64 %r121,%r125,%r267; .loc 1 174 17 add.u64 %r269,%frame,%r132; .loc 1 174 9 ld.u64 %r271,[%r269]; shl.b64 %r270,%r271,3; sub.u64 %r122,%r126,%r270; .loc 1 175 12 add.u64 %r123,%r123,1; .loc 1 176 14 add.u64 %r134,%r134,8; add.u64 %r132,%r132,8; setp.eq.u64 %r272,%r302,%r123; @ %r272 bra $L1; $L25: .loc 1 184 23 ld.u64 %r273,[%r134]; add.u64 %r70,%r273,1; st.u64 [%r134],%r70; .loc 1 185 30 add.u64 %r275,%r303,%r132; .loc 1 185 20 ld.u64 %r277,[%r275]; sub.u64 %r276,%r277,%r60; shl.b64 %r278,%r276,2; add.u64 %r124,%r124,%r278; .loc 1 186 30 add.u64 %r280,%r304,%r132; .loc 1 186 20 
ld.u64 %r282,[%r280]; shl.b64 %r281,%r282,2; add.u64 %r125,%r121,%r281; .loc 1 187 23 add.u64 %r284,%r305,%r132; .loc 1 187 13 ld.u64 %r286,[%r284]; shl.b64 %r285,%r286,3; add.u64 %r126,%r122,%r285; .loc 1 167 32 add.u64 %r288,%r306,%r132; .loc 1 167 13 ld.u64 %r289,[%r288]; setp.eq.u64 %r290,%r70,%r289; @ %r290 bra $L26; bra $L27; $L1: .loc 1 191cshift1_8_r8.o/ 1622802247gfortrani_cshift1_8_r8 .visible .func _gfortrani_cshift1_8_r8../libgfortran/generated/cshift1_8_r8.c"ortrani_cshift1_8_r8 .visible .func _gfortrani_cshift1_8_r80pred32predpredu16u64 %r214; .reg .u16 %r216; .reg .u16 %r217; .reg .u16 %r218; .reg .u32 %r219; .reg .u16 %r220; .reg .pred %r221; .reg .pred %r224; .reg .pred %r226; .reg .u64 %r227; .reg .u64pred %r247; .reg .pred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .predu64 %r267; .reg .u6464u64 %r305; .reg .u64 %r306; .reg .u64mov.u64 %r144,%ar0; mov.u64 %r145,%ar1; mov.u64 %r146,%ar2; mov.u64 %r147,%ar3; .loc 1 68 6 setp.eq.u64 %r148,%r147,0; @ %r148 bra $L28; .loc 1 69 21 ld.u32 %r150,[%r147]; add.u32 %r97,%r150,-1; bra $L2; $L28: .loc 1 71 11 cvt.u32.u64 %r97,%r147; $L2: .loc 1 73 13 mov.u64 %r151,1; st.u64 [%frame+360],%r151; .loc 1 74 12 mov.u64 %r152,0; st.u64 [%frame+480],%r152; .loc 1 82 23 ld.s8 %r142,[%r145+28]; cvt.u32.u32 %r153,%r142; cvt.s64.s8 %r143,%r153; .loc 1 82 3 setp.le.s64 %r154,%r143,0; @ %r154 bra $L29; .loc 1 84 15 cvt.s64.s32 %r35,%r97; add.u64 %r90,%r145,40; add.u64 %r83,%r144,40; .loc 1 75 5 mov.u64 %r94,%r152; .loc 1 80 7 mov.u64 %r92,%r94; .loc 1 82 12 mov.u64 %r127,%r94; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; .loc 1 96 20 mov.u64 %r309,%r94; $L8: .loc 1 92 17 ld.u64 %r156,[%r90+16]; add.u64 %r155,%r156,1; .loc 1 92 15 ld.u64 %r157,[%r90+8]; sub.u64 %r87,%r155,%r157; .loc 1 86 19 ld.u64 %r81,[%r83]; .loc 1 89 19 ld.u64 %r85,[%r90]; .loc 1 84 10 setp.ne.u64 %r158,%r35,%r127; @ %r158 bra $L4; .loc 1 87 14 setp.ne.u64 %r159,%r81,0; .loc 1 90 14 setp.ne.u64 %r160,%r85,0; .loc 1 92 15 mov.u64 %r92,%r87; selp.u64 %r86,%r85,1,%r160; selp.u64 %r82,%r81,1,%r159; bra $L7; $L4: .loc 1 96 20 shl.b64 %r161,%r94,3; add.u64 %r162,%frame,%r161; add.u64 %r163,%r162,480; st.u64 [%r163],%r309; .loc 1 97 21 st.u64 [%r162+360],%r87; .loc 1 98 22 st.u64 [%r162+840],%r81; .loc 1 99 22 st.u64 [%r162+720],%r85; .loc 1 100 24 add.u64 %r175,%r94,%r94; add.u64 %r176,%r175,%r94; shl.b64 %r177,%r176,3; add.u64 %r178,%r146,%r177; ld.u64 %r25,[%r178+40]; .loc 1 100 22 st.u64 [%r162+600],%r25; .loc 1 101 26 mul.lo.u64 %r186,%r81,%r87; .loc 1 101 13 st.u64 [%r162+240],%r186; .loc 1 102 26 mul.lo.u64 %r190,%r87,%r85; .loc 1 102 13 st.u64 [%r162+120],%r190; .loc 1 103 26 mul.lo.u64 %r193,%r25,%r87; .loc 1 103 13 st.u64 [%r162],%r193; .loc 1 104 12 add.u64 %r94,%r94,1; $L7: .loc 1 82 55 add.u64 %r127,%r127,1; .loc 1 82 3 add.u64 %r90,%r90,24; add.u64 %r83,%r83,24; setp.ne.u64 %r194,%r127,%r143; @ %r194 bra $L8; bra $L3; $L29: .loc 1 80 7 mov.u64 %r92,%r152; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; $L3: .loc 1 107 14 ld.u64 %r29,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r195,%r29,0; .loc 1 108 16 selp.u64 %r29,%r29,1,%r195; .loc 1 109 14 ld.u64 %r30,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r196,%r30,0; .loc 1 110 16 selp.u64 %r30,%r30,1,%r196; .loc 1 111 14 ld.u64 %r31,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r197,%r31,0; .loc 1 112 16 selp.u64 %r31,%r31,1,%r197; .loc 1 118 8 ld.u64 %r124,[%r144]; .loc 1 119 8 ld.u64 %r125,[%r145]; .loc 1 120 8 ld.u64 %r126,[%r146]; .loc 1 122 9 setp.eq.u64 %r198,%r124,0; @ %r198 bra $L1; .loc 1 
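// Stride setup for the REAL(8) variant follows: the shl.b64 ...,3 instructions turn
// element-count strides into byte strides (<<3 == *8 bytes per element). The
// REAL(4)/INTEGER(4) variants above use <<2 for the same purpose, and the
// COMPLEX(8)/INTEGER(16) variants use <<4.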
150 13 shl.b64 %r47,%r82,3; .loc 1 151 12 shl.b64 %r49,%r86,3; .loc 1 162 12 shl.b64 %r52,%r30,3; .loc 1 163 12 shl.b64 %r54,%r29,3; .loc 1 164 12 shl.b64 %r56,%r31,3; .loc 1 138 35 set.u32.eq.u64 %r200,%r82,1; neg.s32 %r201,%r200; .loc 1 138 19 set.u32.eq.u64 %r203,%r86,1; neg.s32 %r204,%r203; .loc 1 138 24 cvt.u16.u32 %r206,%r201; cvt.u16.u32 %r207,%r204; and.b16 %r205,%r206,%r207; cvt.u32.u16 %r208,%r205; cvt.u32.u8 %r84,%r208; .loc 1 167 32 ld.u64 %r103,[%frame+360]; setp.ne.u32 %r294,%r84,0; setp.le.s64 %r292,%r143,2; cvt.u32.u32 %r300,%r142; cvt.s64.s8 %r301,%r300; add.u64 %r302,%r301,-1; add.u64 %r303,%frame,840; add.u64 %r304,%frame,720; add.u64 %r305,%frame,600; add.u64 %r306,%frame,360; .loc 1 172 24 add.u64 %r307,%frame,240; .loc 1 173 24 add.u64 %r308,%frame,120; $L27: .loc 1 125 10 ld.u64 %r98,[%r126]; .loc 1 128 10 setp.ge.s64 %r209,%r98,0; @ %r209 bra $L13; .loc 1 129 12 add.u64 %r98,%r98,%r92; $L13: .loc 1 130 11 set.u32.ge.s64 %r211,%r98,%r92; neg.s32 %r212,%r211; shr.u64 %r214,%r98,63; cvt.u16.u32 %r217,%r212; cvt.u16.u64 %r218,%r214; or.b16 %r216,%r217,%r218; .loc 1 130 10 cvt.u32.u16 %r219,%r216; cvt.u16.u8 %r220,%r219; setp.eq.u16 %r221,%r220,0; @ %r221 bra $L14; .loc 1 132 8 rem.s64 %r98,%r98,%r92; .loc 1 133 7 setp.ge.s64 %r224,%r98,0; @ %r224 bra $L14; .loc 1 134 16 add.u64 %r98,%r98,%r92; $L14: .loc 1 147 24 sub.u64 %r99,%r92,%r98; .loc 1 138 10 @ %r294 bra $L15; .loc 1 147 4 setp.gt.s64 %r226,%r99,0; @ %r226 bra $L16; mov.u64 %r113,%r124; bra $L17; $L15: .loc 1 140 11 shl.b64 %r119,%r98,3; .loc 1 141 11 shl.b64 %r120,%r99,3; .loc 1 142 23 add.u64 %r227,%r125,%r119; .loc 1 142 41270234,[%value_in]; } .loc 1 143 17 add.u64 %r236,%r124,%r120r119243,[%value_in]; } bra $L18; $L16: .loc 1 136 22 mul.lo.u64 %r245,%r98,%r86; .loc 1 136 18 shl.b64 %r246,%r245,3; .loc 1 136 11 add.u64 %r117,%r125,%r246; mov.u64 %r116,%r124; .loc 1 147 11 mov.u64 %r118,0; $L19: .loc 1 149 16 ld.f64 %r45,[%r117]; .loc 1 149 14 st.f64 [%r116],%r45; .loc 1 150 13 add.u64 %r116,%r116,%r47; .loc 1 151 12 add.u64 %r117,%r117,%r49; .loc 1 147 31 add.u64 %r118,%r118,1; .loc 1 147 4 setp.ne.u64 %r247,%r99,%r118; @ %r247 bra $L19; .loc 1 150 13 mad.lo.u64 %r113,%r99,%r47,%r124; $L17: .loc 1 153 4 setp.gt.s64 %r249,%r98,0; @ ! 
%r249 bra $L18; mov.u64 %r114,%r125; .loc 1 153 23 mov.u64 %r115,0; $L20: .loc 1 155 16 ld.f64 %r50,[%r114]; .loc 1 155 14 st.f64 [%r113],%r50; .loc 1 156 13 add.u64 %r113,%r113,%r47; .loc 1 157 12 add.u64 %r114,%r114,%r49; .loc 1 153 37 add.u64 %r115,%r115,1; .loc 1 153 4 setp.ne.u64 %r250,%r98,%r115; @ %r250 bra $L20; $L18: .loc 1 162 12 add.u64 %r124,%r124,%r52; .loc 1 163 12 add.u64 %r125,%r125,%r54; .loc 1 164 12 add.u64 %r126,%r126,%r56; .loc 1 165 15 ld.u64 %r251,[%frame+480]; add.u64 %r58,%r251,1; .loc 1 167 13 setp.eq.u64 %r252,%r58,%r103; @ %r252 bra $L22; .loc 1 165 15 st.u64 [%frame+480],%r58; bra $L27; $L22: .loc 1 171 20 mov.u64 %r253,0; st.u64 [%frame+480],%r253; .loc 1 172 24 ld.u64 %r60,[%frame+240]; .loc 1 173 16 ld.u64 %r255,[%frame+120]; shl.b64 %r254,%r255,3; sub.u64 %r121,%r125,%r254; .loc 1 174 9 ld.u64 %r257,[%frame]; shl.b64 %r256,%r257,3; sub.u64 %r122,%r126,%r256; .loc 1 176 14 @ %r292 bra $L1; add.u64 %r134,%frame,488; mov.u64 %r132,8; .loc 1 175 12 mov.u64 %r123,1; bra $L25; $L26: .loc 1 171 20 st.u64 [%r134],%r253; .loc 1 172 24 add.u64 %r264,%r307,%r132; ld.u64 %r60,[%r264]; .loc 1 173 24 add.u64 %r266,%r308,%r132; .loc 1 173 16 ld.u64 %r268,[%r266]; shl.b64 %r267,%r268,3; sub.u64 %r121,%r125,%r267; .loc 1 174 17 add.u64 %r269,%frame,%r132; .loc 1 174 9 ld.u64 %r271,[%r269]; shl.b64 %r270,%r271,3; sub.u64 %r122,%r126,%r270; .loc 1 175 12 add.u64 %r123,%r123,1; .loc 1 176 14 add.u64 %r134,%r134,8; add.u64 %r132,%r132,8; setp.eq.u64 %r272,%r302,%r123; @ %r272 bra $L1; $L25: .loc 1 184 23 ld.u64 %r273,[%r134]; add.u64 %r70,%r273,1; st.u64 [%r134],%r70; .loc 1 185 30 add.u64 %r275,%r303,%r132; .loc 1 185 20 ld.u64 %r277,[%r275]; sub.u64 %r276,%r277,%r60; shl.b64 %r278,%r276,3; add.u64 %r124,%r124,%r278; .loc 1 186 30 add.u64 %r280,%r304,%r132; .loc 1 186 20 ld.u64 %r282,[%r280]; shl.b64 %r281,%r282,3; add.u64 %r125,%r121,%r281; .loc 1 187 23 add.u64 %r284,%r305,%r132; .loc 1 187 13 ld.u64 %r286,[%r284]; shl.b64 %r285,%r286,3; add.u64 %r126,%r122,%r285; .loc 1 167 32 add.u64 %r288,%r306,%r132; .loc 1 167 13 ld.u64 %r289,[%r288]; setp.eq.u64 %r290,%r70,%r289; @ %r290 bra $L26; bra $L27; $L1: .loc 1 191cshift1_8_r10.o/1622802247cshift1_8_c4.o/ 1622802247gfortrani_cshift1_8_c4 .visible .func _gfortrani_cshift1_8_c../libgfortran/generated/cshift1_8_c4.c"ortrani_cshift1_8_c4 .visible .func _gfortrani_cshift1_8_c0predpredpredpredpred %r226; .reg .predu64pred %r249; .reg .pred %r251; .reg .pred %r252; .reg .u64 %r253; .reg .predu64 %r257; .reg .u64 %r258; .reg .u64 %r259; .reg .u64u64pred %r294; .reg .pred %r296; .reg .u32 %r302; .reg .u64 %r303; .reg .u64 %r304; .reg .u64 %r305; .reg .u64 %r306; .reg .u64u64 %r310; .reg .u64 %r311; mov.u64 %r146,%ar0; mov.u64 %r147,%ar1; mov.u64 %r148,%ar2; mov.u64 %r149,%ar3; .loc 1 68 6 setp.eq.u64 %r150,%r149,0; @ %r150 bra $L28; .loc 1 69 21 ld.u32 %r152,[%r149]; add.u32 %r99,%r152,-1; bra $L2; $L28: .loc 1 71 11 cvt.u32.u64 %r99,%r149; $L2: .loc 1 73 13 mov.u64 %r153,1; st.u64 [%frame+360],%r153; .loc 1 74 12 mov.u64 %r154,0; st.u64 [%frame+480],%r154; .loc 1 82 23 ld.s8 %r144,[%r147+28]; cvt.u32.u32 %r155,%r144; cvt.s64.s8 %r145,%r155; .loc 1 82 3 setp.le.s64 %r156,%r145,0; @ %r156 bra $L29; .loc 1 84 15 cvt.s64.s32 %r35,%r99; add.u64 %r92,%r147,40; add.u64 %r85,%r146,40; .loc 1 75 5 mov.u64 %r96,%r154; .loc 1 80 7 mov.u64 %r94,%r96; .loc 1 82 12 mov.u64 %r129,%r96; .loc 1 79 11 mov.u64 %r88,%r153; .loc 1 78 11 mov.u64 %r84,%r88; .loc 1 96 20 mov.u64 %r311,%r96; $L8: .loc 1 92 17 ld.u64 %r158,[%r92+16]; add.u64 %r157,%r158,1; .loc 
1 92 15 ld.u64 %r159,[%r92+8]; sub.u64 %r89,%r157,%r159; .loc 1 86 19 ld.u64 %r83,[%r85]; .loc 1 89 19 ld.u64 %r87,[%r92]; .loc 1 84 10 setp.ne.u64 %r160,%r35,%r129; @ %r160 bra $L4; .loc 1 87 14 setp.ne.u64 %r161,%r83,0; .loc 1 90 14 setp.ne.u64 %r162,%r87,0; .loc 1 92 15 mov.u64 %r94,%r89; selp.u64 %r88,%r87,1,%r162; selp.u64 %r84,%r83,1,%r161; bra $L7; $L4: .loc 1 96 20 shl.b64 %r163,%r96,3; add.u64 %r164,%frame,%r163; add.u64 %r165,%r164,480; st.u64 [%r165],%r311; .loc 1 97 21 st.u64 [%r164+360],%r89; .loc 1 98 22 st.u64 [%r164+840],%r83; .loc 1 99 22 st.u64 [%r164+720],%r87; .loc 1 100 24 add.u64 %r177,%r96,%r96; add.u64 %r178,%r177,%r96; shl.b64 %r179,%r178,3; add.u64 %r180,%r148,%r179; ld.u64 %r25,[%r180+40]; .loc 1 100 22 st.u64 [%r164+600],%r25; .loc 1 101 26 mul.lo.u64 %r188,%r83,%r89; .loc 1 101 13 st.u64 [%r164+240],%r188; .loc 1 102 26 mul.lo.u64 %r192,%r89,%r87; .loc 1 102 13 st.u64 [%r164+120],%r192; .loc 1 103 26 mul.lo.u64 %r195,%r25,%r89; .loc 1 103 13 st.u64 [%r164],%r195; .loc 1 104 12 add.u64 %r96,%r96,1; $L7: .loc 1 82 55 add.u64 %r129,%r129,1; .loc 1 82 3 add.u64 %r92,%r92,24; add.u64 %r85,%r85,24; setp.ne.u64 %r196,%r129,%r145; @ %r196 bra $L8; bra $L3; $L29: .loc 1 80 7 mov.u64 %r94,%r154; .loc 1 79 11 mov.u64 %r88,%r153; .loc 1 78 11 mov.u64 %r84,%r88; $L3: .loc 1 107 14 ld.u64 %r29,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r197,%r29,0; .loc 1 108 16 selp.u64 %r29,%r29,1,%r197; .loc 1 109 14 ld.u64 %r30,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r198,%r30,0; .loc 1 110 16 selp.u64 %r30,%r30,1,%r198; .loc 1 111 14 ld.u64 %r31,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r199,%r31,0; .loc 1 112 16 selp.u64 %r31,%r31,1,%r199; .loc 1 118 8 ld.u64 %r126,[%r146]; .loc 1 119 8 ld.u64 %r127,[%r147]; .loc 1 120 8 ld.u64 %r128,[%r148]; .loc 1 122 9 setp.eq.u64 %r200,%r126,0; @ %r200 bra $L1; .loc 1 150 13 shl.b64 %r48,%r84,3; .loc 1 151 12 shl.b64 %r50,%r88,3; .loc 1 162 12 shl.b64 %r54,%r30,3; .loc 1 163 12 shl.b64 %r56,%r29,3; .loc 1 164 12 shl.b64 %r58,%r31,3; .loc 1 138 35 set.u32.eq.u64 %r202,%r84,1; neg.s32 %r203,%r202; .loc 1 138 19 set.u32.eq.u64 %r205,%r88,1; neg.s32 %r206,%r205; .loc 1 138 24 cvt.u16.u32 %r208,%r203; cvt.u16.u32 %r209,%r206; and.b16 %r207,%r208,%r209; cvt.u32.u16 %r210,%r207; cvt.u32.u8 %r86,%r210; .loc 1 167 32 ld.u64 %r105,[%frame+360]; setp.ne.u32 %r296,%r86,0; setp.le.s64 %r294,%r145,2; cvt.u32.u32 %r302,%r144; cvt.s64.s8 %r303,%r302; add.u64 %r304,%r303,-1; add.u64 %r305,%frame,840; add.u64 %r306,%frame,720; add.u64 %r307,%frame,600; add.u64 %r308,%frame,360; .loc 1 172 24 add.u64 %r309,%frame,240; .loc 1 173 24 add.u64 %r310,%frame,120; $L27: .loc 1 125 10 ld.u64 %r100,[%r128]; .loc 1 128 10 setp.ge.s64 %r211,%r100,0; @ %r211 bra $L13; .loc 1 129 12 add.u64 %r100,%r100,%r94; $L13: .loc 1 130 11 set.u32.ge.s64 %r213,%r100,%r94; neg.s32 %r214,%r213; shr.u64 %r216,%r100,63; cvt.u16.u32 %r219,%r214; cvt.u16.u64 %r220,%r216; or.b16 %r218,%r219,%r220; .loc 1 130 10 cvt.u32.u16 %r221,%r218; cvt.u16.u8 %r222,%r221; setp.eq.u16 %r223,%r222,0; @ %r223 bra $L14; .loc 1 132 8 rem.s64 %r100,%r100,%r94; .loc 1 133 7 setp.ge.s64 %r226,%r100,0; @ %r226 bra $L14; .loc 1 134 16 add.u64 %r100,%r100,%r94; $L14: .loc 1 147 24 sub.u64 %r101,%r94,%r100; .loc 1 138 10 @ %r296 bra $L15; .loc 1 147 4 setp.gt.s64 %r228,%r101,0; @ %r228 bra $L16; mov.u64 %r115,%r126; bra $L17; $L15: .loc 1 140 11 shl.b64 %r121,%r100,3; .loc 1 141 11 shl.b64 %r122,%r101,3; .loc 1 142 23 add.u64 %r229,%r127,%r121; .loc 1 142 422236,[%value_in]; } .loc 1 143 17 add.u64 
%r238,%r126,%r122r1271245,[%value_in]; } bra $L18; $L16: .loc 1 136 22 mul.lo.u64 %r247,%r100,%r88; .loc 1 136 18 shl.b64 %r248,%r247,3; .loc 1 136 11 add.u64 %r119,%r127,%r248; mov.u64 %r118,%r126; .loc 1 147 11 mov.u64 %r120,0; $L19: .loc 1 149 16 ld.f32 %r45,[%r119]; ld.f32 %r46,[%r119+4]; .loc 1 149 14 st.f32 [%r118],%r45; st.f32 [%r118+4],%r46; .loc 1 150 13 add.u64 %r118,%r118,%r48; .loc 1 151 12 add.u64 %r119,%r119,%r50; .loc 1 147 31 add.u64 %r120,%r120,1; .loc 1 147 4 setp.ne.u64 %r249,%r101,%r120; @ %r249 bra $L19; .loc 1 150 13 mad.lo.u64 %r115,%r101,%r48,%r126; $L17: .loc 1 153 4 setp.gt.s64 %r251,%r100,0; @ ! %r251 bra $L18; mov.u64 %r116,%r127; .loc 1 153 23 mov.u64 %r117,0; $L20: .loc 1 155 16 ld.f32 %r51,[%r116]; ld.f32 %r52,[%r116+4]; .loc 1 155 14 st.f32 [%r115],%r51; st.f32 [%r115+4],%r52; .loc 1 156 13 add.u64 %r115,%r115,%r48; .loc 1 157 12 add.u64 %r116,%r116,%r50; .loc 1 153 37 add.u64 %r117,%r117,1; .loc 1 153 4 setp.ne.u64 %r252,%r100,%r117; @ %r252 bra $L20; $L18: .loc 1 162 12 add.u64 %r126,%r126,%r54; .loc 1 163 12 add.u64 %r127,%r127,%r56; .loc 1 164 12 add.u64 %r128,%r128,%r58; .loc 1 165 15 ld.u64 %r253,[%frame+480]; add.u64 %r60,%r253,1; .loc 1 167 13 setp.eq.u64 %r254,%r60,%r105; @ %r254 bra $L22; .loc 1 165 15 st.u64 [%frame+480],%r60; bra $L27; $L22: .loc 1 171 20 mov.u64 %r255,0; st.u64 [%frame+480],%r255; .loc 1 172 24 ld.u64 %r62,[%frame+240]; .loc 1 173 16 ld.u64 %r257,[%frame+120]; shl.b64 %r256,%r257,3; sub.u64 %r123,%r127,%r256; .loc 1 174 9 ld.u64 %r259,[%frame]; shl.b64 %r258,%r259,3; sub.u64 %r124,%r128,%r258; .loc 1 176 14 @ %r294 bra $L1; add.u64 %r136,%frame,488; mov.u64 %r134,8; .loc 1 175 12 mov.u64 %r125,1; bra $L25; $L26: .loc 1 171 20 st.u64 [%r136],%r255; .loc 1 172 24 add.u64 %r266,%r309,%r134; ld.u64 %r62,[%r266]; .loc 1 173 24 add.u64 %r268,%r310,%r134; .loc 1 173 16 ld.u64 %r270,[%r268]; shl.b64 %r269,%r270,3; sub.u64 %r123,%r127,%r269; .loc 1 174 17 add.u64 %r271,%frame,%r134; .loc 1 174 9 ld.u64 %r273,[%r271]; shl.b64 %r272,%r273,3; sub.u64 %r124,%r128,%r272; .loc 1 175 12 add.u64 %r125,%r125,1; .loc 1 176 14 add.u64 %r136,%r136,8; add.u64 %r134,%r134,8; setp.eq.u64 %r274,%r304,%r125; @ %r274 bra $L1; $L25: .loc 1 184 23 ld.u64 %r275,[%r136]; add.u64 %r72,%r275,1; st.u64 [%r136],%r72; .loc 1 185 30 add.u64 %r277,%r305,%r134; .loc 1 185 20 ld.u64 %r279,[%r277]; sub.u64 %r278,%r279,%r62; shl.b64 %r280,%r278,3; add.u64 %r126,%r126,%r280; .loc 1 186 30 add.u64 %r282,%r306,%r134; .loc 1 186 20 ld.u64 %r284,[%r282]; shl.b64 %r283,%r284,3; add.u64 %r127,%r123,%r283; .loc 1 187 23 add.u64 %r286,%r307,%r134; .loc 1 187 13 ld.u64 %r288,[%r286]; shl.b64 %r287,%r288,3; add.u64 %r128,%r124,%r287; .loc 1 167 32 add.u64 %r290,%r308,%r134; .loc 1 167 13 ld.u64 %r291,[%r290]; setp.eq.u64 %r292,%r72,%r291; @ %r292 bra $L26; bra $L27; $L1: .loc 1 191 cshift1_8_c8.o/ 1622802247gfortrani_cshift1_8_c8 .visible .func _gfortrani_cshift1_8_c8../libgfortran/generated/cshift1_8_c8.c"ortrani_cshift1_8_c8 .visible .func _gfortrani_cshift1_8_c80predpredpredpredpred %r226; .reg .predu64pred %r249; .reg .pred %r251; .reg .pred %r252; .reg .u64 %r253; .reg .predu64 %r257; .reg .u64 %r258; .reg .u64 %r259; .reg .u64u64pred %r294; .reg .pred %r296; .reg .u32 %r302; .reg .u64 %r303; .reg .u64 %r304; .reg .u64 %r305; .reg .u64 %r306; .reg .u64u64 %r310; .reg .u64 %r311; mov.u64 %r146,%ar0; mov.u64 %r147,%ar1; mov.u64 %r148,%ar2; mov.u64 %r149,%ar3; .loc 1 68 6 setp.eq.u64 %r150,%r149,0; @ %r150 bra $L28; .loc 1 69 21 ld.u32 %r152,[%r149]; add.u32 %r99,%r152,-1; bra 
$L2; $L28: .loc 1 71 11 cvt.u32.u64 %r99,%r149; $L2: .loc 1 73 13 mov.u64 %r153,1; st.u64 [%frame+360],%r153; .loc 1 74 12 mov.u64 %r154,0; st.u64 [%frame+480],%r154; .loc 1 82 23 ld.s8 %r144,[%r147+28]; cvt.u32.u32 %r155,%r144; cvt.s64.s8 %r145,%r155; .loc 1 82 3 setp.le.s64 %r156,%r145,0; @ %r156 bra $L29; .loc 1 84 15 cvt.s64.s32 %r35,%r99; add.u64 %r92,%r147,40; add.u64 %r85,%r146,40; .loc 1 75 5 mov.u64 %r96,%r154; .loc 1 80 7 mov.u64 %r94,%r96; .loc 1 82 12 mov.u64 %r129,%r96; .loc 1 79 11 mov.u64 %r88,%r153; .loc 1 78 11 mov.u64 %r84,%r88; .loc 1 96 20 mov.u64 %r311,%r96; $L8: .loc 1 92 17 ld.u64 %r158,[%r92+16]; add.u64 %r157,%r158,1; .loc 1 92 15 ld.u64 %r159,[%r92+8]; sub.u64 %r89,%r157,%r159; .loc 1 86 19 ld.u64 %r83,[%r85]; .loc 1 89 19 ld.u64 %r87,[%r92]; .loc 1 84 10 setp.ne.u64 %r160,%r35,%r129; @ %r160 bra $L4; .loc 1 87 14 setp.ne.u64 %r161,%r83,0; .loc 1 90 14 setp.ne.u64 %r162,%r87,0; .loc 1 92 15 mov.u64 %r94,%r89; selp.u64 %r88,%r87,1,%r162; selp.u64 %r84,%r83,1,%r161; bra $L7; $L4: .loc 1 96 20 shl.b64 %r163,%r96,3; add.u64 %r164,%frame,%r163; add.u64 %r165,%r164,480; st.u64 [%r165],%r311; .loc 1 97 21 st.u64 [%r164+360],%r89; .loc 1 98 22 st.u64 [%r164+840],%r83; .loc 1 99 22 st.u64 [%r164+720],%r87; .loc 1 100 24 add.u64 %r177,%r96,%r96; add.u64 %r178,%r177,%r96; shl.b64 %r179,%r178,3; add.u64 %r180,%r148,%r179; ld.u64 %r25,[%r180+40]; .loc 1 100 22 st.u64 [%r164+600],%r25; .loc 1 101 26 mul.lo.u64 %r188,%r83,%r89; .loc 1 101 13 st.u64 [%r164+240],%r188; .loc 1 102 26 mul.lo.u64 %r192,%r89,%r87; .loc 1 102 13 st.u64 [%r164+120],%r192; .loc 1 103 26 mul.lo.u64 %r195,%r25,%r89; .loc 1 103 13 st.u64 [%r164],%r195; .loc 1 104 12 add.u64 %r96,%r96,1; $L7: .loc 1 82 55 add.u64 %r129,%r129,1; .loc 1 82 3 add.u64 %r92,%r92,24; add.u64 %r85,%r85,24; setp.ne.u64 %r196,%r129,%r145; @ %r196 bra $L8; bra $L3; $L29: .loc 1 80 7 mov.u64 %r94,%r154; .loc 1 79 11 mov.u64 %r88,%r153; .loc 1 78 11 mov.u64 %r84,%r88; $L3: .loc 1 107 14 ld.u64 %r29,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r197,%r29,0; .loc 1 108 16 selp.u64 %r29,%r29,1,%r197; .loc 1 109 14 ld.u64 %r30,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r198,%r30,0; .loc 1 110 16 selp.u64 %r30,%r30,1,%r198; .loc 1 111 14 ld.u64 %r31,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r199,%r31,0; .loc 1 112 16 selp.u64 %r31,%r31,1,%r199; .loc 1 118 8 ld.u64 %r126,[%r146]; .loc 1 119 8 ld.u64 %r127,[%r147]; .loc 1 120 8 ld.u64 %r128,[%r148]; .loc 1 122 9 setp.eq.u64 %r200,%r126,0; @ %r200 bra $L1; .loc 1 150 13 shl.b64 %r48,%r84,4; .loc 1 151 12 shl.b64 %r50,%r88,4; .loc 1 162 12 shl.b64 %r54,%r30,4; .loc 1 163 12 shl.b64 %r56,%r29,4; .loc 1 164 12 shl.b64 %r58,%r31,3; .loc 1 138 35 set.u32.eq.u64 %r202,%r84,1; neg.s32 %r203,%r202; .loc 1 138 19 set.u32.eq.u64 %r205,%r88,1; neg.s32 %r206,%r205; .loc 1 138 24 cvt.u16.u32 %r208,%r203; cvt.u16.u32 %r209,%r206; and.b16 %r207,%r208,%r209; cvt.u32.u16 %r210,%r207; cvt.u32.u8 %r86,%r210; .loc 1 167 32 ld.u64 %r105,[%frame+360]; setp.ne.u32 %r296,%r86,0; setp.le.s64 %r294,%r145,2; cvt.u32.u32 %r302,%r144; cvt.s64.s8 %r303,%r302; add.u64 %r304,%r303,-1; add.u64 %r305,%frame,840; add.u64 %r306,%frame,720; add.u64 %r307,%frame,600; add.u64 %r308,%frame,360; .loc 1 172 24 add.u64 %r309,%frame,240; .loc 1 173 24 add.u64 %r310,%frame,120; $L27: .loc 1 125 10 ld.u64 %r100,[%r128]; .loc 1 128 10 setp.ge.s64 %r211,%r100,0; @ %r211 bra $L13; .loc 1 129 12 add.u64 %r100,%r100,%r94; $L13: .loc 1 130 11 set.u32.ge.s64 %r213,%r100,%r94; neg.s32 %r214,%r213; shr.u64 %r216,%r100,63; cvt.u16.u32 %r219,%r214; cvt.u16.u64 
%r220,%r216; or.b16 %r218,%r219,%r220; .loc 1 130 10 cvt.u32.u16 %r221,%r218; cvt.u16.u8 %r222,%r221; setp.eq.u16 %r223,%r222,0; @ %r223 bra $L14; .loc 1 132 8 rem.s64 %r100,%r100,%r94; .loc 1 133 7 setp.ge.s64 %r226,%r100,0; @ %r226 bra $L14; .loc 1 134 16 add.u64 %r100,%r100,%r94; $L14: .loc 1 147 24 sub.u64 %r101,%r94,%r100; .loc 1 138 10 @ %r296 bra $L15; .loc 1 147 4 setp.gt.s64 %r228,%r101,0; @ %r228 bra $L16; mov.u64 %r115,%r126; bra $L17; $L15: .loc 1 140 11 shl.b64 %r121,%r100,4; .loc 1 141 11 shl.b64 %r122,%r101,4; .loc 1 142 23 add.u64 %r229,%r127,%r121; .loc 1 142 422236,[%value_in]; } .loc 1 143 17 add.u64 %r238,%r126,%r122r1271245,[%value_in]; } bra $L18; $L16: .loc 1 136 22 mul.lo.u64 %r247,%r100,%r88; .loc 1 136 18 shl.b64 %r248,%r247,4; .loc 1 136 11 add.u64 %r119,%r127,%r248; mov.u64 %r118,%r126; .loc 1 147 11 mov.u64 %r120,0; $L19: .loc 1 149 16 ld.f64 %r45,[%r119]; ld.f64 %r46,[%r119+8]; .loc 1 149 14 st.f64 [%r118],%r45; st.f64 [%r118+8],%r46; .loc 1 150 13 add.u64 %r118,%r118,%r48; .loc 1 151 12 add.u64 %r119,%r119,%r50; .loc 1 147 31 add.u64 %r120,%r120,1; .loc 1 147 4 setp.ne.u64 %r249,%r101,%r120; @ %r249 bra $L19; .loc 1 150 13 mad.lo.u64 %r115,%r101,%r48,%r126; $L17: .loc 1 153 4 setp.gt.s64 %r251,%r100,0; @ ! %r251 bra $L18; mov.u64 %r116,%r127; .loc 1 153 23 mov.u64 %r117,0; $L20: .loc 1 155 16 ld.f64 %r51,[%r116]; ld.f64 %r52,[%r116+8]; .loc 1 155 14 st.f64 [%r115],%r51; st.f64 [%r115+8],%r52; .loc 1 156 13 add.u64 %r115,%r115,%r48; .loc 1 157 12 add.u64 %r116,%r116,%r50; .loc 1 153 37 add.u64 %r117,%r117,1; .loc 1 153 4 setp.ne.u64 %r252,%r100,%r117; @ %r252 bra $L20; $L18: .loc 1 162 12 add.u64 %r126,%r126,%r54; .loc 1 163 12 add.u64 %r127,%r127,%r56; .loc 1 164 12 add.u64 %r128,%r128,%r58; .loc 1 165 15 ld.u64 %r253,[%frame+480]; add.u64 %r60,%r253,1; .loc 1 167 13 setp.eq.u64 %r254,%r60,%r105; @ %r254 bra $L22; .loc 1 165 15 st.u64 [%frame+480],%r60; bra $L27; $L22: .loc 1 171 20 mov.u64 %r255,0; st.u64 [%frame+480],%r255; .loc 1 172 24 ld.u64 %r62,[%frame+240]; .loc 1 173 16 ld.u64 %r257,[%frame+120]; shl.b64 %r256,%r257,4; sub.u64 %r123,%r127,%r256; .loc 1 174 9 ld.u64 %r259,[%frame]; shl.b64 %r258,%r259,3; sub.u64 %r124,%r128,%r258; .loc 1 176 14 @ %r294 bra $L1; add.u64 %r136,%frame,488; mov.u64 %r134,8; .loc 1 175 12 mov.u64 %r125,1; bra $L25; $L26: .loc 1 171 20 st.u64 [%r136],%r255; .loc 1 172 24 add.u64 %r266,%r309,%r134; ld.u64 %r62,[%r266]; .loc 1 173 24 add.u64 %r268,%r310,%r134; .loc 1 173 16 ld.u64 %r270,[%r268]; shl.b64 %r269,%r270,4; sub.u64 %r123,%r127,%r269; .loc 1 174 17 add.u64 %r271,%frame,%r134; .loc 1 174 9 ld.u64 %r273,[%r271]; shl.b64 %r272,%r273,3; sub.u64 %r124,%r128,%r272; .loc 1 175 12 add.u64 %r125,%r125,1; .loc 1 176 14 add.u64 %r136,%r136,8; add.u64 %r134,%r134,8; setp.eq.u64 %r274,%r304,%r125; @ %r274 bra $L1; $L25: .loc 1 184 23 ld.u64 %r275,[%r136]; add.u64 %r72,%r275,1; st.u64 [%r136],%r72; .loc 1 185 30 add.u64 %r277,%r305,%r134; .loc 1 185 20 ld.u64 %r279,[%r277]; sub.u64 %r278,%r279,%r62; shl.b64 %r280,%r278,4; add.u64 %r126,%r126,%r280; .loc 1 186 30 add.u64 %r282,%r306,%r134; .loc 1 186 20 ld.u64 %r284,[%r282]; shl.b64 %r283,%r284,4; add.u64 %r127,%r123,%r283; .loc 1 187 23 add.u64 %r286,%r307,%r134; .loc 1 187 13 ld.u64 %r288,[%r286]; shl.b64 %r287,%r288,3; add.u64 %r128,%r124,%r287; .loc 1 167 32 add.u64 %r290,%r308,%r134; .loc 1 167 13 ld.u64 %r291,[%r290]; setp.eq.u64 %r292,%r72,%r291; @ %r292 bra $L26; bra $L27; $L1: .loc 1 191 cshift1_8_c10.o/1622802247cshift1_16_i1.o/1622802247gfortrani_cshift1_16_i1 .visible 
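// The cshift1_16_* members that start here take an INTEGER(16) (128-bit) shift
// count. The shift value is carried in a pair of 64-bit registers, and the
// wrap-around reduction (the rem.s64 of the 64-bit variants) apparently goes
// through the external 128-bit modulo helper __modti3, declared .extern below,
// with arguments and result spilled to %frame slots around offset 960.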
.func _gfortrani_cshift1_16_../libgfortran/generated/cshift1_16_i1.c"odti3 .extern .func __modti3ortrani_cshift1_16_i1 .visible .func _gfortrani_cshift1_16_0081predpred %r150; .reg .pred %r151; .reg .predpredu16 %r197; .reg .u16 %r198; .reg .u16 %r199; .reg .u32 %r200; .reg .predpredu64 %r266; .reg .pred %r268; .reg .pred %r270; .reg .predu64 %r282; .reg .u64 %r289; .reg .u32u64 %r320; .reg .u64u64 %r333; .reg .u64 %r334; .reg .u64u64pred %r352; .reg .predu32 %r410; .reg .u64mov.u64 %r137,%ar0; mov.u64 %r138,%ar1; mov.u64 %r139,%ar2; mov.u64 %r140,%ar3; .loc 1 68 6 setp.eq.u64 %r141,%r140,0; @ %r141 bra $L37; .loc 1 69 21 ld.u32 %r142,[%r140]; add.u32 %r91,%r142,-1; bra $L2; $L37: .loc 1 71 11 cvt.u32.u64 %r91,%r140; $L2: .loc 1 73 13 mov.u64 %r143,1; st.u64 [%frame+360],%r143; .loc 1 74 12 mov.u64 %r144,0; st.u64 [%frame+480],%r144; .loc 1 82 23 ld.s8 %r129,[%r138+28]; cvt.u32.u32 %r145,%r129; cvt.s64.s8 %r130,%r145; .loc 1 82 3 setp.le.s64 %r146,%r130,0; @ %r146 bra $L38; .loc 1 84 15 cvt.s64.s32 %r90,%r91; add.u64 %r36,%r138,40; add.u64 %r33,%r137,40; .loc 1 75 5 mov.u64 %r88,%r144; .loc 1 80 7 mov.u64 %r86,%r88; .loc 1 82 12 mov.u64 %r118,%r88; .loc 1 79 11 mov.u64 %r79,%r143; .loc 1 78 11 mov.u64 %r72,%r79; .loc 1 96 20 mov.u64 %r419,%r88; $L8: .loc 1 92 17 ld.u64 %r148,[%r36+16]; add.u64 %r147,%r148,1; .loc 1 92 15 ld.u64 %r149,[%r36+8]; sub.u64 %r73,%r147,%r149; .loc 1 86 19 ld.u64 %r71,[%r33]; .loc 1 89 19 ld.u64 %r78,[%r36]; .loc 1 84 10 setp.ne.u64 %r150,%r90,%r118; @ %r150 bra $L4; .loc 1 87 14 setp.ne.u64 %r151,%r71,0; .loc 1 90 14 setp.ne.u64 %r152,%r78,0; .loc 1 92 15 mov.u64 %r86,%r73; selp.u64 %r79,%r78,1,%r152; selp.u64 %r72,%r71,1,%r151; bra $L7; $L4: .loc 1 96 20 shl.b64 %r153,%r88,3; add.u64 %r154,%frame,%r153; add.u64 %r155,%r154,480; st.u64 [%r155],%r419; .loc 1 97 21 st.u64 [%r154+360],%r73; .loc 1 98 22 st.u64 [%r154+840],%r71; .loc 1 99 22 st.u64 [%r154+720],%r78; .loc 1 100 24 add.u64 %r167,%r88,%r88; add.u64 %r168,%r167,%r88; shl.b64 %r169,%r168,3; add.u64 %r170,%r139,%r169; ld.u64 %r25,[%r170+40]; .loc 1 100 22 st.u64 [%r154+600],%r25; .loc 1 101 26 mul.lo.u64 %r178,%r73,%r71; .loc 1 101 13 st.u64 [%r154+240],%r178; .loc 1 102 26 mul.lo.u64 %r182,%r73,%r78; .loc 1 102 13 st.u64 [%r154+120],%r182; .loc 1 103 26 mul.lo.u64 %r185,%r25,%r73; .loc 1 103 13 st.u64 [%r154],%r185; .loc 1 104 12 add.u64 %r88,%r88,1; $L7: .loc 1 82 55 add.u64 %r118,%r118,1; .loc 1 82 3 add.u64 %r36,%r36,24; add.u64 %r33,%r33,24; setp.ne.u64 %r186,%r118,%r130; @ %r186 bra $L8; bra $L3; $L38: .loc 1 80 7 mov.u64 %r86,%r144; .loc 1 79 11 mov.u64 %r79,%r143; .loc 1 78 11 mov.u64 %r72,%r79; $L3: .loc 1 107 14 ld.u64 %r29,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r187,%r29,0; .loc 1 108 16 selp.u64 %r29,%r29,1,%r187; .loc 1 109 14 ld.u64 %r30,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r188,%r30,0; .loc 1 110 16 selp.u64 %r30,%r30,1,%r188; .loc 1 111 14 ld.u64 %r31,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r189,%r31,0; .loc 1 112 16 selp.u64 %r31,%r31,1,%r189; .loc 1 118 8 ld.u64 %r115,[%r137]; .loc 1 119 8 ld.u64 %r116,[%r138]; .loc 1 120 8 ld.u64 %r117,[%r139]; .loc 1 122 9 setp.eq.u64 %r190,%r115,0; @ %r190 bra $L1; .loc 1 164 12 shl.b64 %r47,%r31,4; .loc 1 138 19 set.u32.eq.u64 %r192,%r79,1; neg.s32 %r193,%r192; .loc 1 138 35 set.u32.eq.u64 %r195,%r72,1; neg.s32 %r196,%r195; .loc 1 138 24 cvt.u16.u32 %r198,%r193; cvt.u16.u32 %r199,%r196; and.b16 %r197,%r198,%r199; cvt.u32.u16 %r200,%r197; cvt.u32.u8 %r35,%r200; .loc 1 167 32 ld.u64 %r40,[%frame+360]; .loc 1 129 12 shr.s64 %r379,%r86,63; setp.ne.u32 
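// 128-bit additions in the $L36 loop below are done pairwise: add.u64 adds the low
// halves, set.u32.lt.u64 flags a carry (sum < operand yields all-ones), the flag is
// sign-extended with cvt.s64.s32, and subtracting it from the sum of the high
// halves propagates the carry; shr.s64 %r86,63 supplies the high half of the
// sign-extended 64-bit length for those additions.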
%r398,%r35,0; setp.le.s64 %r403,%r130,2; .loc 1 132 8 add.u64 %r407,%frame,960; add.u64 %r408,%frame,976; add.u64 %r409,%frame,992; cvt.u32.u32 %r410,%r129; cvt.s64.s8 %r411,%r410; add.u64 %r412,%r411,-1; add.u64 %r413,%frame,840; add.u64 %r414,%frame,720; add.u64 %r415,%frame,600; add.u64 %r416,%frame,360; .loc 1 172 24 add.u64 %r417,%frame,240; .loc 1 173 24 add.u64 %r418,%frame,120; $L36: .loc 1 125 10 ld.u64 %r380,[%r117]; ld.u64 %r381,[%r117+8]; .loc 1 128 10 setp.ge.s64 %r203,%r381,0; @ %r203 bra $L13; .loc 1 129 12 add.u64 %r205,%r380,%r86; set.u32.lt.u64 %r210,%r205,%r380; cvt.s64.s32 %r208,%r210; add.u64 %r211,%r381,%r379; sub.u64 %r214,%r211,%r208; mov.u64 %r380,%r205; mov.u64 %r381,%r214; .loc 1 130 10 setp.gt.s64 %r217,%r379,%r214; @ %r217 bra $L39; setp.ne.u64 %r220,%r379,%r214; @ %r220 bra $L15; setp.gt.u64 %r223,%r86,%r205; @ ! %r223 bra $L15; $L39: setp.ge.s64 %r225,%r214,0; @ %r225 bra $L17; $L15: .loc 1 132 8 st.u64 [%frame+976],%r205; st.u64 [%frame+984],%r214; st.u64 [%frame+992],%r86; st.u64 [%frame+1000]960]; ld.u64 %r381,[%frame+968]; .loc 1 133 7 setp.ge.s64 %r243,%r381,0; @ %r243 bra $L17; .loc 1 134 16 add.u64 %r245,%r380,%r86; set.u32.lt.u64 %r250,%r245,%r380; cvt.s64.s32 %r248,%r250; add.u64 %r251,%r381,%r379; mov.u64 %r380,%r245; sub.u64 %r381,%r251,%r248; $L17: .loc 1 138 10 @ %r398 bra $L20; .loc 1 147 24 sub.u64 %r257,%r86,%r380; set.u32.gt.u64 %r262,%r257,%r86; cvt.s64.s32 %r260,%r262; sub.u64 %r263,%r379,%r381; add.u64 %r266,%r263,%r260; .loc 1 147 4 setp.gt.s64 %r268,%r266,0; @ %r268 bra $L21; setp.ne.u64 %r270,%r266,0; @ %r270 bra $L40; setp.ne.u64 %r272,%r257,0; @ %r272 bra $L21; $L40: mov.u64 %r136,%r115; bra $L23; $L20: .loc 1 141 11 sub.u64 %r111,%r86,%r380; .loc 1 142 23 add.u64 %r273,%r116,%r380; .loc 1 142 4110,[%value_in]; } .loc 1 143 17 add.u64 %r282,%r115,%r111; .loc 1 143 428bra $L24; $L21: .loc 1 136 11 mad.lo.u64 %r109,%r380,%r79,%r116; .loc 1 156 13 mov.u64 %r108,%r115; .loc 1 147 11 mov.u64 %r110,0; $L25: .loc 1 149 14 ld.u8 %r292,[%r109]; st.u8 [%r108],%r292; .loc 1 150 13 add.u64 %r108,%r108,%r72; .loc 1 151 12 add.u64 %r109,%r109,%r79; .loc 1 147 31 add.u64 %r110,%r110,1; .loc 1 147 4 setp.ne.u64 %r293,%r257,%r110; @ %r293 bra $L25; .loc 1 150 13 mad.lo.u64 %r136,%r257,%r72,%r115; $L23: .loc 1 153 4 setp.gt.s64 %r310,%r381,0; @ %r310 bra $L26; setp.ne.u64 %r312,%r381,0; @ %r312 bra $L24; setp.ne.u64 %r314,%r380,0; @ ! 
[PTX assembly payload, damaged in extraction. This span is the text of several libgfortran static-archive members compiled for the nvptx offload target. The archive member headers, register declarations and call sequences have been fused or truncated, so the instruction stream is not reproduced here; the information that can still be recovered is the list of members and the functions they define:

    cshift1_16_i2.o/1622802247   .visible .func _gfortrani_cshift1_16_i2   (../libgfortran/generated/cshift1_16_i2.c)
    cshift1_16_i4.o/1622802247   .visible .func _gfortrani_cshift1_16_i4   (../libgfortran/generated/cshift1_16_i4.c)
    cshift1_16_i8.o/1622802247   .visible .func _gfortrani_cshift1_16_i8   (../libgfortran/generated/cshift1_16_i8.c)
    cshift1_16_i16.o             .visible .func _gfortrani_cshift1_16_i16  (../libgfortran/generated/cshift1_16_i16.c)
    cshift1_16_r4.o/1622802247   .visible .func _gfortrani_cshift1_16_r4   (../libgfortran/generated/cshift1_16_r4.c)
    cshift1_16_r8.o/1622802248   .visible .func _gfortrani_cshift1_16_r8   (../libgfortran/generated/cshift1_16_r8.c)
    cshift1_16_c4.o/1622802248   .visible .func _gfortrani_cshift1_16_c4   (../libgfortran/generated/cshift1_16_c4.c)
    cshift1_16_c8.o/1622802248   .visible .func _gfortrani_cshift1_16_c8   (../libgfortran/generated/cshift1_16_c8.c, truncated at the end of this fragment)

The fragment opens mid-function, in the tail of the member that precedes cshift1_16_i2.o (its name is not recoverable here). Every member declares .extern .func __modti3, and every body follows the same shape: 128-bit normalisation of the shift value (the call blocks themselves are elided in the damaged spans), followed by two element-copy loops whose load/store width matches the element kind (ld.u16/st.u16 for i2, ld.u32/st.u32 for i4, ld.u64/st.u64 for i8, paired ld.u64 for i16, ld.f32 and ld.f64 with paired forms for the real and complex kinds).]
%r330 bra $L24; $L26: mov.u64 %r114,%r125; .loc 1 153 23 mov.u64 %r115,0; $L28: .loc 1 155 16 ld.f64 %r48,[%r114]; ld.f64 %r49,[%r114+8]; .loc 1 155 14 st.f64 [%r149],%r48; st.f64 [%r149+8],%r49; .loc 1 156 13 add.u64 %r149,%r149,%r47; .loc 1 157 12 add.u64 %r114,%r114,%r135; .loc 1 153 37 add.u64 %r115,%r115,1; .loc 1 153 4 setp.ne.u64 %r331,%r399,%r115; @ %r331 bra $L28; $L24: .loc 1 162 12 add.u64 %r124,%r124,%r52; .loc 1 163 12 add.u64 %r125,%r125,%r54; .loc 1 164 12 add.u64 %r126,%r126,%r56; .loc 1 165 15 ld.u64 %r332,[%frame+480]; add.u64 %r58,%r332,1; .loc 1 167 13 setp.eq.u64 %r333,%r58,%r84; @ %r333 bra $L30; .loc 1 165 15 st.u64 [%frame+480],%r58; bra $L36; $L30: .loc 1 171 20 mov.u64 %r334,0; st.u64 [%frame+480],%r334; .loc 1 172 24 ld.u64 %r60,[%frame+240]; .loc 1 173 16 ld.u64 %r336,[%frame+120]; shl.b64 %r335,%r336,4; sub.u64 %r121,%r125,%r335; .loc 1 174 9 ld.u64 %r338,[%frame]; shl.b64 %r337,%r338,4; sub.u64 %r122,%r126,%r337; .loc 1 176 14 @ %r417 bra $L1; add.u64 %r138,%frame,488; mov.u64 %r130,8; .loc 1 175 12 mov.u64 %r123,1; bra $L33; $L34: .loc 1 171 20 st.u64 [%r138],%r334; .loc 1 172 24 add.u64 %r345,%r437,%r130; ld.u64 %r60,[%r345]; .loc 1 173 24 add.u64 %r347,%r438,%r130; .loc 1 173 16 ld.u64 %r349,[%r347]; shl.b64 %r348,%r349,4; sub.u64 %r121,%r125,%r348; .loc 1 174 17 add.u64 %r350,%frame,%r130; .loc 1 174 9 ld.u64 %r352,[%r350]; shl.b64 %r351,%r352,4; sub.u64 %r122,%r126,%r351; .loc 1 175 12 add.u64 %r123,%r123,1; .loc 1 176 14 add.u64 %r138,%r138,8; add.u64 %r130,%r130,8; setp.eq.u64 %r353,%r432,%r123; @ %r353 bra $L1; $L33: .loc 1 184 23 ld.u64 %r354,[%r138]; add.u64 %r70,%r354,1; st.u64 [%r138],%r70; .loc 1 185 30 add.u64 %r356,%r433,%r130; .loc 1 185 20 ld.u64 %r358,[%r356]; sub.u64 %r357,%r358,%r60; shl.b64 %r359,%r357,4; add.u64 %r124,%r124,%r359; .loc 1 186 30 add.u64 %r361,%r434,%r130; .loc 1 186 20 ld.u64 %r363,[%r361]; shl.b64 %r362,%r363,4; add.u64 %r125,%r121,%r362; .loc 1 187 23 add.u64 %r365,%r435,%r130; .loc 1 187 13 ld.u64 %r367,[%r365]; shl.b64 %r366,%r367,4; add.u64 %r126,%r122,%r366; .loc 1 167 32 add.u64 %r369,%r436,%r130; .loc 1 167 13 ld.u64 %r370,[%r369]; setp.eq.u64 %r371,%r70,%r370; @ %r371 bra $L34; bra $L36; $L13: .loc 1 130 10 setp.gt.s64 %r374,%r398,%r400; @ %r374 bra $L17; setp.ne.u64 %r377,%r398,%r400; @ %r377 bra $L41; setp.gt.u64 %r380,%r91,%r399; @ %r380 bra $L17; $L41: .loc 1 132 8 st.u64 [%frame+976],%r399; st.u64 [%frame+984],%r400; st.u64 [%frame+992],%r91; st.u64 [%frame+1000],%r39800,[%frame+968]; bra $L17; $L1: .loc 1 191541 maxloc0_4_s1.o/ 1622802248_gfortran_maxloc0_4_s1 .visible .func _gfortran_maxloc0_4_s1); .file 1 "../../../../libgfortran/generated/maxloc0_4_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_4_s1 .visible .func _gfortran_mmaxloc0_4_s1_gfortran_smaxloc0_4_s1 .visible .func _gfortran_smaxloc0_4_s1in_ar4); // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_returnmemcmpmem_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents88,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_4_s1 .visible .func 
_gfortran_maxloc0_4_s1predpredpredpredpred %r175; .reg .u64 %r178; mov.u64 %r99,%ar0; mov.u64 %r100,%ar1; mov.u32 %r101,%ar2; mov.u64 %r102,%ar3; .loc 1 64 10 ld.s8 %r22,[%r100+28]; .loc 1 64 8 cvt.u32.u32 %r103,%r22; cvt.s64.s8 %r65,%r103; .loc 1 65 6 setp.gt.s64 %r104,%r65,0; @ %r104 bra $L2; .loc 1 66 5 cvta.const.u64 %r105,$LC0stack; call _gfortran_runtime_errorr23,[%r99]; .loc 1 68 6 setp.ne.u64 %r107,%r23,0; @ %r107 bra $L3; .loc 1 70 7 st.u64 [%r99+48],%r23; add.u64 %r109,%r65,-1; st.u64 [%r99+56],%r109; mov.u64 %r110,1; st.u64 [%r99+40],%r110; .loc 1 71 28 cvt.u32.u64 %r111,%r110; st.u8 [%r99+28],%r111; .loc 1 72 24 st.u64 [%r99+8],%r23; .loc 1 73 2965call (%value_in),_gfortrani_xmallocarray117,[%value_in]; } mov.u64 %r23,%r117; .loc 1 73 27 st.u64 [%r99],%r23; bra $L4; $L3: .loc 1 77 11 cvta.global.u64 %r119,_gfortrani_compile_options; .loc 1 77 10 ld.u32 %r120,[%r119+36]; setp.eq.u32 %r121,%r120,0; @ %r121 bra $L4; .loc 1 78r124; call _gfortrani_bounds_iforeach_return.loc 1 83 8 ld.u64 %r23,[%r99]; $L4: .loc 1 82 11 ld.u64 %r66,[%r99+40]; add.u64 %r28,%r100,40; mov.u64 %r70,0; .loc 1 84 10 mov.u64 %r86,%r70; add.u64 %r173,%frame,120; add.u64 %r172,%frame,240; .loc 1 88 16 mov.u64 %r135,%r70; bra $L7; $L19: mov.u64 %r86,%r72; $L7: .loc 1 86 18 add.u64 %r125,%frame,%r70; .loc 1 86 51 ld.u64 %r127,[%r28]; mul.lo.u64 %r126,%r127,%r102; .loc 1 86 18 st.u64 [%r125],%r126; .loc 1 87 19 ld.u64 %r129,[%r28+16]; add.u64 %r128,%r129,1; ld.u64 %r130,[%r28+8]; sub.u64 %r36,%r128,%r130; .loc 1 87 17 add.u64 %r132,%r173,%r70; st.u64 [%r132],%r36; .loc 1 88 16 add.u64 %r134,%r172,%r70; st.u64 [%r134],%r135; .loc 1 89 10 setp.gt.s64 %r136,%r36,0; @ %r136 bra $L5; shl.b64 %r90,%r66,2; mov.u64 %r27,%r23; .loc 1 92 11 mov.u64 %r73,0; .loc 1 93 24 cvt.u32.u64 %r137,%r73; $L6: st.u32 [%r27],%r137; .loc 1 92 27 add.u64 %r73,%r73,1; .loc 1 92 4 add.u64 %r27,%r27,%r90; setp.ne.u64 %r138,%r65,%r73; @ %r138 bra $L6; bra $L1; $L5: .loc 1 84 26 add.u64 %r72,%r86,1; .loc 1 84 3 add.u64 %r28,%r28,24; add.u64 %r70,%r70,8; setp.ne.u64 %r139,%r65,%r72; @ %r139 bra $L19; .loc 1 98 8 ld.u64 %r53,[%r100]; shl.b64 %r55,%r66,2; mov.u64 %r57,%r23; .loc 1 101 10 mov.u64 %r71,0; .loc 1 102 23 mov.u32 %r140,1; $L8: st.u32 [%r57],%r140; mov.u64 %r87,%r71; .loc 1 101 26 add.u64 %r71,%r71,1; .loc 1 101 3 add.u64 %r57,%r57,%r55; setp.ne.u64 %r141,%r86,%r87; @ %r141 bra $L8; .loc 1 108 9 setp.ne.u64 %r142,%r53,0; @ ! %r142 bra $L1; .loc 1 123 19 ld.u64 %r40,[%frame]; .loc 1 125 34 ld.u64 %r43,[%frame+120]; cvt.u32.u32 %r144,%r22; cvt.s64.s8 %r143,%r144; shl.b64 %r145,%r143,3; add.u64 %r61,%r172,%r145; .loc 1 106 11 mov.u64 %r59,0; add.u64 %r174,%frame,248; .loc 1 114 8 setp.eq.u32 %r175,%r101,0; .loc 1 131 13 mov.u64 %r178,%r59; $L18: .loc 1 125 21 ld.u64 %r91,[%frame+240]; mov.u64 %r42,%r91; mov.u64 %r52,%r53; $L15: .loc 1 114 8 setp.eq.u64 %r147,%r59,0; @ %r147 bra $L11; .loc 1 41 12r10151,[%value_in]; } .loc 1 114 8 @ %r175 bra $L12; .loc 1 114 24 setp.lt.s32 %r154,%r151,0; @ %r154 bra $L13; $L11: mov.u64 %r74,%r174; .loc 1 125 21 mov.u64 %r75,%r23; mov.u64 %r92,%r42; bra $L14; $L12: .loc 1 114 24 setp.gt.s32 %r156,%r151,0; @ ! 
%r156 bra $L13; bra $L11; $L14: .loc 1 119 38 cvt.u32.u64 %r158,%r92; add.u32 %r157,%r158,1; .loc 1 119 27 st.u32 [%r75],%r157; .loc 1 118 7 add.u64 %r75,%r75,%r55; setp.eq.u64 %r159,%r61,%r74; @ %r159 bra $L20; .loc 1 119 34 ld.u64 %r92,[%r74]; add.u64 %r74,%r74,8; bra $L14; $L20: mov.u64 %r59,%r52; $L13: .loc 1 123 9 add.u64 %r52,%r52,%r40; .loc 1 125 14 add.u64 %r42,%r42,1; .loc 1 125 7 st.u64 [%frame+240],%r42; setp.ne.u64 %r160,%r42,%r43; @ %r160 bra $L15; .loc 1 123 9 sub.u64 %r161,%r43,%r91; mad.lo.u64 %r53,%r161,%r40,%r53; mov.u64 %r81,%r174; mov.u64 %r51,%r43; mov.u64 %r48,%r40; .loc 1 126 9 mov.u64 %r58,0; $L17: .loc 1 131 13 st.u64 [%r81+-8],%r178; .loc 1 134 23 mul.lo.u64 %r45,%r48,%r51; mov.u64 %r94,%r58; .loc 1 135 5 add.u64 %r58,%r58,1; .loc 1 136 7 setp.eq.u64 %r165,%r94,%r86; @ %r165 bra $L1; .loc 1 144 16 ld.u64 %r166,[%r81]; add.u64 %r47,%r166,1; st.u64 [%r81],%r47; shl.b64 %r78,%r58,3; .loc 1 145 23 add.u64 %r167,%frame,%r78; ld.u64 %r48,[%r167]; .loc 1 145 13 sub.u64 %r168,%r48,%r45; add.u64 %r53,%r53,%r168; .loc 1 148 32 add.u64 %r170,%r173,%r78; ld.u64 %r51,[%r170]; .loc 1 148 7 add.u64 %r81,%r81,8; setp.eq.u64 %r171,%r47,%r51; @ %r171 bra $L17; bra $L18; $L1: .loc 1 151_gfortran_mmaxloc0_4_s1 .visible .func _gfortran_mmaxloc0_4_s1in_ar4) {u64u64u32 %r177; .reg .u16 %r178; .reg .pred %r179; .reg .predpredpred %r205; .reg .predpred %r223; .reg .u32predpredpred %r246; .reg .u64 %r249; mov.u64 %r128,%ar0; mov.u64 %r129,%ar1; mov.u64 %r130,%ar2; mov.u32 %r131,%ar3; mov.u64 %r132,%ar4; .loc 1 177 6 setp.ne.u64 %r133,%r130,0; @ %r133 bra $L321r12913132; call _gfortran_maxloc0_4_s1184 7 bra $L31; $L32: .loc 1 187 10 ld.s8 %r22,[%r129+28]; .loc 1 188 6 setp.gt.s32 %r138,%r22,0; @ %r138 bra $L34; .loc 1 189 5 cvta.const.u64 %r131stack; call _gfortran_runtime_error34: .loc 1 191 6 ld.u64 %r141,[%r128]; setp.ne.u64 %r142,%r141,0; @ %r142 bra $L35; .loc 1 193 7 st.u64 [%r128+48],%r141; add.u32 %r144,%r22,-1; cvt.s64.s32 %r145,%r144; st.u64 [%r128+56],%r145; mov.u64 %r146,1; st.u64 [%r128+40],%r146; .loc 1 194 28 cvt.u32.u64 %r147,%r146; st.u8 [%r128+28],%r147; .loc 1 195 24 st.u64 [%r128+8],%r141; .loc 1 196 29 cvt.u32.u32 %r152,%r22; cvt.s64.s8 %r151,%r152call (%value_in),_gfortrani_xmallocarray153,[%value_in]; } .loc 1 196 27 st.u64 [%r128],%r153; bra $L36; $L35: .loc 1 200 11 cvta.global.u64 %r155,_gfortrani_compile_options; .loc 1 200 10 ld.u32 %r156,[%r155+36]; setp.eq.u32 %r157,%r156,0r129160; call _gfortrani_bounds_iforeach_return.loc 1 205 4 cvta.const.u64 %r16311291630; call _gfortrani_bounds_equal_extents$L36: .loc 1 210 15 ld.u64 %r31,[%r130+16]; .loc 1 212 9 ld.u64 %r82,[%r130]; .loc 1 214 22 cvt.u32.u64 %r32,%r31; .loc 1 214 53 add.u32 %r165,%r32,-4; and.b32 %r166,%r165,-5; set.u32.eq.u32 %r168,%r166,0; neg.s32 %r169,%r168; .loc 1 214 22 add.u32 %r170,%r32,-1; set.u32.le.u32 %r172,%r170,1; neg.s32 %r173,%r172; .loc 1 214 6 cvt.u16.u32 %r175,%r169; cvt.u16.u32 %r176,%r173; or.b16 %r174,%r175,%r176; cvt.u32.u16 %r177,%r174; cvt.u16.u8 %r178,%r177; setp.ne.u16 %r179,%r178,0; @ %r179 bra $L37; .loc 1 216 7 setp.ne.u32 %r181,%r32,16; @ %r181 bra $L38; $L37: .loc 1 223 11 ld.u64 %r93,[%r128+40]; .loc 1 224 8 ld.u64 %r94,[%r128]; .loc 1 225 17 cvt.u32.u32 %r182,%r22; cvt.s64.s8 %r111,%r182; add.u64 %r89,%r129,40; add.u64 %r83,%r130,40; mov.u64 %r68,0; .loc 1 225 10 mov.u64 %r84,%r68; add.u64 %r245,%frame,120; add.u64 %r242,%frame,240; add.u64 %r243,%frame,360; .loc 1 230 16 mov.u64 %r199,%r68; bra $L39; $L38: .loc 1 221 5_gfortran_runtime_error39: .loc 1 227 18 add.u64 
%r186,%r245,%r68; .loc 1 227 51 ld.u64 %r188,[%r89]; mul.lo.u64 %r187,%r188,%r132; .loc 1 227 18 st.u64 [%r186],%r187; .loc 1 228 18 add.u64 %r189,%frame,%r68; .loc 1 228 20 ld.u64 %r191,[%r83]; mul.lo.u64 %r190,%r191,%r31; .loc 1 228 18 st.u64 [%r189],%r190; .loc 1 229 19 ld.u64 %r193,[%r89+16]; add.u64 %r192,%r193,1; ld.u64 %r194,[%r89+8]; sub.u64 %r51,%r192,%r194; .loc 1 229 17 add.u64 %r196,%r242,%r68; st.u64 [%r196],%r51; .loc 1 230 16 add.u64 %r198,%r243,%r68; st.u64 [%r198],%r199; .loc 1 231 10 setp.gt.s64 %r200,%r51,0; @ %r200 bra $L40; shl.b64 %r124,%r93,2; mov.u64 %r38,%r94; .loc 1 234 11 mov.u64 %r96,0; .loc 1 235 24 cvt.u32.u64 %r201,%r96; $L41: st.u32 [%r38],%r201; .loc 1 234 27 add.u64 %r96,%r96,1; .loc 1 234 4 add.u64 %r38,%r38,%r124; setp.ne.u64 %r202,%r96,%r111; @ %r202 bra $L41; bra $L31; $L40: .loc 1 225 26 add.u64 %r84,%r84,1; .loc 1 225 3 add.u64 %r89,%r89,24; add.u64 %r83,%r83,24; add.u64 %r68,%r68,8; setp.ne.u64 %r203,%r84,%r111; @ %r203 bra $L39; .loc 1 240 8 ld.u64 %r80,[%r129]; shl.b64 %r99,%r93,2; mov.u64 %r101,%r94; .loc 1 243 10 mov.u64 %r95,0; .loc 1 244 23 cvt.u32.u64 %r204,%r95; $L43: st.u32 [%r101],%r204; .loc 1 243 26 add.u64 %r95,%r95,1; .loc 1 243 3 add.u64 %r101,%r101,%r99; setp.ne.u64 %r205,%r95,%r111; @ %r205 bra $L43; .loc 1 251 9 setp.eq.u64 %r206,%r80,0; @ %r206 bra $L31; .loc 1 267 19 ld.u64 %r59,[%frame+120]; .loc 1 268 20 ld.u64 %r61,[%frame]; .loc 1 270 34 ld.u64 %r65,[%frame+240]; cvt.u32.u32 %r208,%r22; cvt.s64.s8 %r207,%r208; shl.b64 %r209,%r207,3; add.u64 %r103,%r243,%r209; .loc 1 249 10 mov.u64 %r86,0; add.u64 %r244,%frame,368; .loc 1 257 6 setp.eq.u32 %r246,%r131,0; .loc 1 276 13 mov.u64 %r249,%r86; $L50: .loc 1 270 21 ld.u64 %r125,[%frame+360]; mov.u64 %r63,%r125; mov.u64 %r81,%r82; mov.u64 %r79,%r80; $L48: .loc 1 257 6 ld.s8 %r212,[%r81]; cvt.u16.u32 %r211,%r212; setp.eq.u16 %r213,%r211,0; @ %r213 bra $L44; .loc 1 257 14 setp.eq.u64 %r214,%r86,0; @ %r214 bra $L45; .loc 1 4113218,[%value_in]; } .loc 1 257 6 @ %r246 bra $L46; .loc 1 258 25 setp.lt.s32 %r221,%r218,0; @ %r221 bra $L44; $L45: mov.u64 %r108,%r244; .loc 1 270 21 mov.u64 %r109,%r94; mov.u64 %r126,%r63; bra $L47; $L46: .loc 1 258 25 setp.gt.s32 %r223,%r218,0; @ ! 
%r223 bra $L44; bra $L45; $L47: .loc 1 263 38 cvt.u32.u64 %r225,%r126; add.u32 %r224,%r225,1; .loc 1 263 27 st.u32 [%r109],%r224; .loc 1 262 7 add.u64 %r109,%r109,%r99; setp.eq.u64 %r226,%r103,%r108; @ %r226 bra $L51; .loc 1 263 34 ld.u64 %r126,[%r108]; add.u64 %r108,%r108,8; bra $L47; $L51: mov.u64 %r86,%r79; $L44: .loc 1 267 9 add.u64 %r79,%r79,%r59; .loc 1 268 10 add.u64 %r81,%r81,%r61; .loc 1 270 14 add.u64 %r63,%r63,1; .loc 1 270 7 st.u64 [%frame+360],%r63; setp.ne.u64 %r227,%r63,%r65; @ %r227 bra $L48; sub.u64 %r119,%r65,%r125; .loc 1 267 9 mad.lo.u64 %r80,%r59,%r119,%r80; .loc 1 268 10 mad.lo.u64 %r82,%r61,%r119,%r82; mov.u64 %r116,%r244; mov.u64 %r75,%r61; mov.u64 %r78,%r65; mov.u64 %r73,%r59; mov.u64 %r115,8; .loc 1 271 9 mov.u64 %r85,0; $L49: .loc 1 276 13 st.u64 [%r116+-8],%r249; .loc 1 279 23 mul.lo.u64 %r67,%r73,%r78; .loc 1 280 24 mul.lo.u64 %r70,%r78,%r75; .loc 1 281 5 add.u64 %r85,%r85,1; .loc 1 282 7 setp.eq.u64 %r232,%r85,%r111; @ %r232 bra $L31; .loc 1 290 16 ld.u64 %r233,[%r116]; add.u64 %r72,%r233,1; st.u64 [%r116],%r72; .loc 1 291 23 add.u64 %r235,%r245,%r115; ld.u64 %r73,[%r235]; .loc 1 291 13 sub.u64 %r236,%r73,%r67; add.u64 %r80,%r80,%r236; .loc 1 292 24 add.u64 %r237,%frame,%r115; ld.u64 %r75,[%r237]; .loc 1 292 14 sub.u64 %r238,%r75,%r70; add.u64 %r82,%r82,%r238; .loc 1 295 32 add.u64 %r240,%r242,%r115; ld.u64 %r78,[%r240]; .loc 1 295 7 add.u64 %r116,%r116,8; add.u64 %r115,%r115,8; setp.eq.u64 %r241,%r72,%r78; @ %r241 bra $L49; bra $L50; $L31: .loc 1 298_gfortran_smaxloc0_4_s1 .visible .func _gfortran_smaxloc0_4_s1in_ar4) {4u32 %r41; .reg .predu32 %r55; .reg .u64u32ar2; mov.u32 %r38,%ar3; mov.u64 %r39,%ar4; .loc 1 317 6 setp.eq.u64 %r40,%r37,0; @ %r40 bra $L70; .loc 1 317 20 ld.u32 %r41,[%r37]; setp.eq.u32 %r42,%r41,0; @ %r42 bra $L71; $L70: .loc 1 320 7 {call _gfortran_maxloc0_4_s1324 7 bra $L69; $L71: .loc 1 327 8 ld.s8 %r30,[%r36+28]; .loc 1 329 6 setp.gt.s64 %r48,%r30,0; @ %r48 bra $L73; .loc 1 330 5 cvta.const.u64 %r49,$LC0; {stack; call _gfortran_runtime_error73: .loc 1 332 15 ld.u64 %r24,[%r35]; .loc 1 332 6 setp.ne.u64 %r51,%r24,0; @ %r51 bra $L74; .loc 1 334 7 st.u64 [%r35+48],%r24; add.u64 %r53,%r30,-1; st.u64 [%r35+56],%r53; mov.u64 %r54,1; st.u64 [%r35+40],%r54; .loc 1 335 28 cvt.u32.u64 %r55,%r54; st.u8 [%r35+28],%r55; .loc 1 336 24 st.u64 [%r35+8],%r24; .loc 1 337 29 mov.u64 %r58call (%value_in),_gfortrani_xmallocarray37 27 st.u64 [%r35],%r24; bra $L75; $L74: .loc 1 339 12 cvta.global.u64 %r63,_gfortrani_compile_options; .loc 1 339 11 ld.u32 %r64,[%r63+36]; setp.eq.u32 %r65,%r64,0; @ %r65 bra $L75; .loc 1 341 8 cvta.const.u64 %r68,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 346 8 ld.u64 %r24,[%r35]; $L75: ld.u64 %r69,[%r35+40]; shl.b64 %r34,%r69,2; mov.u64 %r29,%r24; .loc 1 347 10 mov.u64 %r32,0; .loc 1 348 23 cvt.u32.u64 %r70,%r32; $L76: st.u32 [%r29],%r70; .loc 1 347 24 add.u64 %r32,%r32,1; .loc 1 347 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r71,%r30,%r32; @ %r71 bra $L76; $L69: .loc 1 349 maxloc0_4_s4.o/ 1622802248_gfortran_maxloc0_4_s4 .visible .func _gfortran_maxloc0_4_s4); .file 1 "../../../../libgfortran/generated/maxloc0_4_s4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_4_s4 .visible .func _gfortran_mmaxloc0_4_s_gfortran_smaxloc0_4_s4 .visible .func _gfortran_smaxloc0_4_sVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern 
.func _gfortrani_bounds_iforeach_return_gfortrani_memcmp_char4_gfortrani_memcmp_char4_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents88,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_4_s4 .visible .func _gfortran_maxloc0_4_s4u64 %r120; .reg .u64 %r122; .reg .u32 %r123; .reg .predu64pred %r182; mov.u64 %r102,%ar0; mov.u64 %r103,%ar1; mov.u32 %r104,%ar2; mov.u64 %r105,%ar3; .loc 1 64 10 ld.s8 %r22,[%r103+28]; .loc 1 64 8 cvt.u32.u32 %r106,%r22; cvt.s64.s8 %r62,%r106; .loc 1 65 6 setp.gt.s64 %r107,%r62,0; @ %r107 bra $L2; .loc 1 66 5 cvta.const.u64 %r108,$LC0; {stack; call _gfortran_runtime_errorr23,[%r102]; .loc 1 68 6 setp.ne.u64 %r110,%r23,0; @ %r110 bra $L3; .loc 1 70 7 st.u64 [%r102+48],%r23; add.u64 %r112,%r62,-1; st.u64 [%r102+56],%r112; mov.u64 %r113,1; st.u64 [%r102+40],%r113; .loc 1 71 28 cvt.u32.u64 %r114,%r113; st.u8 [%r102+28],%r114; .loc 1 72 24 st.u64 [%r102+8],%r23; .loc 1 73 29117; call (%value_in),_gfortrani_xmallocarray120,[%value_in]; } mov.u64 %r23,%r120; .loc 1 73 27 st.u64 [%r102],%r23; bra $L4; $L3: .loc 1 77 11 cvta.global.u64 %r122,_gfortrani_compile_options; .loc 1 77 10 ld.u32 %r123,[%r122+36]; setp.eq.u32 %r124,%r123,0; @ %r124 bra $L4; .loc 1 78 2127; call _gfortrani_bounds_iforeach_return.loc 1 83 8 ld.u64 %r23,[%r102]; $L4: .loc 1 82 11 ld.u64 %r63,[%r102+40]; add.u64 %r27,%r103,40; mov.u64 %r68,0; .loc 1 84 10 mov.u64 %r83,%r68; add.u64 %r181,%frame,120; add.u64 %r179,%frame,240; .loc 1 88 16 mov.u64 %r138,%r68; bra $L7; $L20: mov.u64 %r83,%r70; $L7: .loc 1 86 18 add.u64 %r128,%frame,%r68; .loc 1 86 51 ld.u64 %r130,[%r27]; mul.lo.u64 %r129,%r130,%r105; .loc 1 86 18 st.u64 [%r128],%r129; .loc 1 87 19 ld.u64 %r132,[%r27+16]; add.u64 %r131,%r132,1; ld.u64 %r133,[%r27+8]; sub.u64 %r35,%r131,%r133; .loc 1 87 17 add.u64 %r135,%r181,%r68; st.u64 [%r135],%r35; .loc 1 88 16 add.u64 %r137,%r179,%r68; st.u64 [%r137],%r138; .loc 1 89 10 setp.gt.s64 %r139,%r35,0; @ %r139 bra $L5; shl.b64 %r85,%r63,2; mov.u64 %r87,%r23; .loc 1 92 11 mov.u64 %r71,0; .loc 1 93 24 cvt.u32.u64 %r140,%r71; $L6: st.u32 [%r87],%r140; .loc 1 92 27 add.u64 %r71,%r71,1; .loc 1 92 4 add.u64 %r87,%r87,%r85; setp.ne.u64 %r141,%r62,%r71; @ %r141 bra $L6; bra $L1; $L5: .loc 1 84 26 add.u64 %r70,%r83,1; .loc 1 84 3 add.u64 %r27,%r27,24; add.u64 %r68,%r68,8; setp.ne.u64 %r142,%r62,%r70; @ %r142 bra $L20; .loc 1 98 8 ld.u64 %r50,[%r103]; shl.b64 %r52,%r63,2; mov.u64 %r54,%r23; .loc 1 101 10 mov.u64 %r69,0; .loc 1 102 23 mov.u32 %r143,1; $L8: st.u32 [%r54],%r143; mov.u64 %r84,%r69; .loc 1 101 26 add.u64 %r69,%r69,1; .loc 1 101 3 add.u64 %r54,%r54,%r52; setp.ne.u64 %r144,%r83,%r84; @ %r144 bra $L8; .loc 1 108 9 setp.ne.u64 %r145,%r50,0; @ ! 
%r145 bra $L1; .loc 1 123 19 ld.u64 %r92,[%frame]; .loc 1 123 9 shl.b64 %r94,%r92,2; .loc 1 125 34 ld.u64 %r95,[%frame+120]; cvt.u32.u32 %r147,%r22; cvt.s64.s8 %r146,%r147; shl.b64 %r148,%r146,3; add.u64 %r58,%r179,%r148; .loc 1 106 11 mov.u64 %r56,0; add.u64 %r180,%frame,248; .loc 1 114 8 setp.eq.u32 %r182,%r104,0; $L19: .loc 1 125 21 ld.u64 %r96,[%frame+240]; mov.u64 %r40,%r96; mov.u64 %r49,%r50; $L16: .loc 1 114 8 setp.ne.u64 %r150,%r56,0; @ %r150 bra $L11; $L14: mov.u64 %r74,%r180; .loc 1 125 21 mov.u64 %r75,%r23; mov.u64 %r97,%r40; bra $L12; $L11: .loc 1 114 8 @ %r182 bra $L13;105; call (%value_in),_gfortrani_memcmp_char4156,[%value_in]; } .loc 1 114 24 setp.ge.s32 %r157,%r156,0; @ ! %r157 bra $L15; bra $L14105; call (%value_in),_gfortrani_memcmp_char4161,[%value_in]; } .loc 1 114 24 setp.gt.s32 %r162,%r161,0; @ ! %r162 bra $L15; bra $L14; $L12: .loc 1 119 38 cvt.u32.u64 %r164,%r97; add.u32 %r163,%r164,1; .loc 1 119 27 st.u32 [%r75],%r163; .loc 1 118 7 add.u64 %r75,%r75,%r52; setp.eq.u64 %r165,%r58,%r74; @ %r165 bra $L21; .loc 1 119 34 ld.u64 %r97,[%r74]; add.u64 %r74,%r74,8; bra $L12; $L21: .loc 1 117 14 mov.u64 %r56,%r49; $L15: .loc 1 123 9 add.u64 %r49,%r49,%r94; .loc 1 125 14 add.u64 %r40,%r40,1; .loc 1 125 7 st.u64 [%frame+240],%r40; setp.ne.u64 %r166,%r40,%r95; @ %r166 bra $L16; .loc 1 123 9 sub.u64 %r167,%r95,%r96; mad.lo.u64 %r50,%r167,%r94,%r50; mov.u64 %r81,%r180; mov.u64 %r48,%r95; mov.u64 %r45,%r92; .loc 1 126 9 mov.u64 %r55,0; .loc 1 131 13 mov.u64 %r170,%r55; $L18: st.u64 [%r81+-8],%r170; .loc 1 134 23 mul.lo.u64 %r42,%r45,%r48; mov.u64 %r36,%r55; .loc 1 135 5 add.u64 %r55,%r55,1; .loc 1 136 7 setp.eq.u64 %r171,%r36,%r83; @ %r171 bra $L1; .loc 1 144 16 ld.u64 %r172,[%r81]; add.u64 %r44,%r172,1; st.u64 [%r81],%r44; shl.b64 %r78,%r55,3; .loc 1 145 23 add.u64 %r173,%frame,%r78; ld.u64 %r45,[%r173]; .loc 1 145 13 sub.u64 %r174,%r45,%r42; shl.b64 %r175,%r174,2; add.u64 %r50,%r50,%r175; .loc 1 148 32 add.u64 %r177,%r181,%r78; ld.u64 %r48,[%r177]; .loc 1 148 7 add.u64 %r81,%r81,8; setp.eq.u64 %r178,%r44,%r48; @ %r178 bra $L18; bra $L19; $L1: .loc 1 151_gfortran_mmaxloc0_4_s4 .visible .func _gfortran_mmaxloc0_4_s, .param .u64 %in_ar4) {predpred.reg .u16 %r179; .reg .u32 %r180; .reg .u16 %r181; .reg .pred %r182; .reg .predpred %r205; .reg .pred %r206; .reg .u32 %r207; .reg .predu32 %r211; .reg .u64 %r212; .reg .u16 %r214; .reg .u32 %r215; .reg .predpred %r232; .reg .predu64 %r251; .reg .u64 %r252; .reg .pred %r253; mov.u64 %r131,%ar0; mov.u64 %r132,%ar1; mov.u64 %r133,%ar2; mov.u32 %r134,%ar3; mov.u64 %r135,%ar4; .loc 1 177 6 setp.ne.u64 %r136,%r133,0; @ %r136 bra $L301132134135; call _gfortran_maxloc0_4_s4184 7 bra $L29; $L30: .loc 1 187 10 ld.s8 %r22,[%r132+28]; .loc 1 188 6 setp.gt.s32 %r141,%r22,0; @ %r141 bra $L32; .loc 1 189 5 cvta.const.u64 %r142,$LC0stack; call _gfortran_runtime_error32: .loc 1 191 6 ld.u64 %r144,[%r131]; setp.ne.u64 %r145,%r144,0; @ %r145 bra $L33; .loc 1 193 7 st.u64 [%r131+48],%r144; add.u32 %r147,%r22,-1; cvt.s64.s32 %r148,%r147; st.u64 [%r131+56],%r148; mov.u64 %r149,1; st.u64 [%r131+40],%r149; .loc 1 194 28 cvt.u32.u64 %r150,%r149; st.u8 [%r131+28],%r150; .loc 1 195 24 st.u64 [%r131+8],%r144; .loc 1 196 29 cvt.u32.u32 %r155,%r22; cvt.s64.s8 %r154,%r155call (%value_in),_gfortrani_xmallocarray156,[%value_in]; } .loc 1 196 27 st.u64 [%r131],%r156; bra $L34; $L33: .loc 1 200 11 cvta.global.u64 %r158,_gfortrani_compile_options; .loc 1 200 10 ld.u32 %r159,[%r158+36]; setp.eq.u32 %r160,%r159,0r1163; call _gfortrani_bounds_iforeach_return.loc 1 205 4 
cvta.const.u64 %r16613r11663; call _gfortrani_bounds_equal_extents$L34: .loc 1 210 15 ld.u64 %r31,[%r133+16]; .loc 1 212 9 ld.u64 %r76,[%r133]; .loc 1 214 22 cvt.u32.u64 %r32,%r31; .loc 1 214 53 add.u32 %r168,%r32,-4; and.b32 %r169,%r168,-5; set.u32.eq.u32 %r171,%r169,0; neg.s32 %r172,%r171; .loc 1 214 22 add.u32 %r173,%r32,-1; set.u32.le.u32 %r175,%r173,1; neg.s32 %r176,%r175; .loc 1 214 6 cvt.u16.u32 %r178,%r172; cvt.u16.u32 %r179,%r176; or.b16 %r177,%r178,%r179; cvt.u32.u16 %r180,%r177; cvt.u16.u8 %r181,%r180; setp.ne.u16 %r182,%r181,0; @ %r182 bra $L35; .loc 1 216 7 setp.ne.u32 %r184,%r32,16; @ %r184 bra $L36; $L35: .loc 1 223 11 ld.u64 %r87,[%r131+40]; .loc 1 224 8 ld.u64 %r88,[%r131]; .loc 1 225 17 cvt.u32.u32 %r185,%r22; cvt.s64.s8 %r107,%r185; add.u64 %r83,%r132,40; add.u64 %r77,%r133,40; mov.u64 %r61,0; .loc 1 225 10 mov.u64 %r78,%r61; add.u64 %r250,%frame,120; add.u64 %r251,%frame,240; add.u64 %r252,%frame,360; .loc 1 230 16 mov.u64 %r202,%r61; bra $L37; $L36: .loc 1 221 5_gfortran_runtime_error37: .loc 1 227 18 add.u64 %r189,%r250,%r61; .loc 1 227 51 ld.u64 %r191,[%r83]; mul.lo.u64 %r190,%r191,%r135; .loc 1 227 18 st.u64 [%r189],%r190; .loc 1 228 18 add.u64 %r192,%frame,%r61; .loc 1 228 20 ld.u64 %r194,[%r77]; mul.lo.u64 %r193,%r194,%r31; .loc 1 228 18 st.u64 [%r192],%r193; .loc 1 229 19 ld.u64 %r196,[%r83+16]; add.u64 %r195,%r196,1; ld.u64 %r197,[%r83+8]; sub.u64 %r50,%r195,%r197; .loc 1 229 17 add.u64 %r199,%r251,%r61; st.u64 [%r199],%r50; .loc 1 230 16 add.u64 %r201,%r252,%r61; st.u64 [%r201],%r202; .loc 1 231 10 setp.gt.s64 %r203,%r50,0; @ %r203 bra $L38; shl.b64 %r115,%r87,2; mov.u64 %r117,%r88; .loc 1 234 11 mov.u64 %r91,0; .loc 1 235 24 cvt.u32.u64 %r204,%r91; $L39: st.u32 [%r117],%r204; .loc 1 234 27 add.u64 %r91,%r91,1; .loc 1 234 4 add.u64 %r117,%r117,%r115; setp.ne.u64 %r205,%r91,%r107; @ %r205 bra $L39; bra $L29; $L38: .loc 1 225 26 add.u64 %r78,%r78,1; .loc 1 225 3 add.u64 %r83,%r83,24; add.u64 %r77,%r77,24; add.u64 %r61,%r61,8; setp.ne.u64 %r206,%r78,%r107; @ %r206 bra $L37; .loc 1 240 8 ld.u64 %r73,[%r132]; shl.b64 %r96,%r87,2; mov.u64 %r98,%r88; .loc 1 243 10 mov.u64 %r90,0; .loc 1 244 23 cvt.u32.u64 %r207,%r90; $L41: st.u32 [%r98],%r207; .loc 1 243 26 add.u64 %r90,%r90,1; .loc 1 243 3 add.u64 %r98,%r98,%r96; setp.ne.u64 %r208,%r90,%r107; @ %r208 bra $L41; .loc 1 251 9 setp.eq.u64 %r209,%r73,0; @ %r209 bra $L29; .loc 1 267 19 ld.u64 %r123,[%frame+120]; .loc 1 267 9 shl.b64 %r125,%r123,2; .loc 1 268 20 ld.u64 %r126,[%frame]; .loc 1 270 34 ld.u64 %r128,[%frame+240]; cvt.u32.u32 %r211,%r22; cvt.s64.s8 %r210,%r211; shl.b64 %r212,%r210,3; add.u64 %r100,%r252,%r212; .loc 1 249 10 mov.u64 %r80,0; add.u64 %r249,%frame,368; .loc 1 257 6 setp.eq.u32 %r253,%r134,0; $L49: .loc 1 270 21 ld.u64 %r129,[%frame+360]; mov.u64 %r57,%r129; mov.u64 %r75,%r76; mov.u64 %r72,%r73; $L47: .loc 1 257 6 ld.s8 %r215,[%r75]; cvt.u16.u32 %r214,%r215; setp.eq.u16 %r216,%r214,0; @ %r216 bra $L42; .loc 1 257 14 setp.ne.u64 %r217,%r80,0; @ %r217 bra $L43; $L46: mov.u64 %r105,%r249; .loc 1 270 21 mov.u64 %r106,%r88; mov.u64 %r130,%r57; bra $L44; $L43: .loc 1 257 6 @ %r253 bra $L45; .loc 1 43135; call (%value_in),_gfortrani_memcmp_char4223,[%value_in]; } .loc 1 258 25 setp.ge.s32 %r224,%r223,0; @ ! %r224 bra $L42; bra $L46; $L45: .loc 1 43135; call (%value_in),_gfortrani_memcmp_char4228,[%value_in]; } .loc 1 258 25 setp.gt.s32 %r229,%r228,0; @ ! 
%r229 bra $L42; bra $L46; $L44: .loc 1 263 38 cvt.u32.u64 %r231,%r130; add.u32 %r230,%r231,1; .loc 1 263 27 st.u32 [%r106],%r230; .loc 1 262 7 add.u64 %r106,%r106,%r96; setp.eq.u64 %r232,%r100,%r105; @ %r232 bra $L50; .loc 1 263 34 ld.u64 %r130,[%r105]; add.u64 %r105,%r105,8; bra $L44; $L50: .loc 1 261 14 mov.u64 %r80,%r72; $L42: .loc 1 267 9 add.u64 %r72,%r72,%r125; .loc 1 268 10 add.u64 %r75,%r75,%r126; .loc 1 270 14 add.u64 %r57,%r57,1; .loc 1 270 7 st.u64 [%frame+360],%r57; setp.ne.u64 %r233,%r57,%r128; @ %r233 bra $L47; sub.u64 %r120,%r128,%r129; .loc 1 267 9 mad.lo.u64 %r73,%r120,%r125,%r73; .loc 1 268 10 mad.lo.u64 %r76,%r120,%r126,%r76; mov.u64 %r113,%r249; mov.u64 %r69,%r126; mov.u64 %r71,%r128; mov.u64 %r67,%r123; mov.u64 %r112,8; .loc 1 271 9 mov.u64 %r79,0; .loc 1 276 13 mov.u64 %r237,%r79; $L48: st.u64 [%r113+-8],%r237; .loc 1 279 23 mul.lo.u64 %r60,%r67,%r71; .loc 1 280 24 mul.lo.u64 %r63,%r71,%r69; .loc 1 281 5 add.u64 %r79,%r79,1; .loc 1 282 7 setp.eq.u64 %r238,%r79,%r107; @ %r238 bra $L29; .loc 1 290 16 ld.u64 %r239,[%r113]; add.u64 %r66,%r239,1; st.u64 [%r113],%r66; .loc 1 291 23 add.u64 %r241,%r250,%r112; ld.u64 %r67,[%r241]; .loc 1 291 13 sub.u64 %r242,%r67,%r60; shl.b64 %r243,%r242,2; add.u64 %r73,%r73,%r243; .loc 1 292 24 add.u64 %r244,%frame,%r112; ld.u64 %r69,[%r244]; .loc 1 292 14 sub.u64 %r245,%r69,%r63; add.u64 %r76,%r76,%r245; .loc 1 295 32 add.u64 %r247,%r251,%r112; ld.u64 %r71,[%r247]; .loc 1 295 7 add.u64 %r113,%r113,8; add.u64 %r112,%r112,8; setp.eq.u64 %r248,%r66,%r71; @ %r248 bra $L48; bra $L49; $L29: .loc 1 298_gfortran_smaxloc0_4_s4 .visible .func _gfortran_smaxloc0_4_s, .param .u64 %in_ar4) {4u32 %r41; .reg .predu32 %r55; .reg .u64u32ar2; mov.u32 %r38,%ar3; mov.u64 %r39,%ar4; .loc 1 317 6 setp.eq.u64 %r40,%r37,0; @ %r40 bra $L66; .loc 1 317 20 ld.u32 %r41,[%r37]; setp.eq.u32 %r42,%r41,0; @ %r42 bra $L67; $L66: .loc 1 320 7 {call _gfortran_maxloc0_4_s4324 7 bra $L65; $L67: .loc 1 327 8 ld.s8 %r30,[%r36+28]; .loc 1 329 6 setp.gt.s64 %r48,%r30,0; @ %r48 bra $L69; .loc 1 330 5 cvta.const.u64 %r49,$LC0; {stack; call _gfortran_runtime_error69: .loc 1 332 15 ld.u64 %r24,[%r35]; .loc 1 332 6 setp.ne.u64 %r51,%r24,0; @ %r51 bra $L70; .loc 1 334 7 st.u64 [%r35+48],%r24; add.u64 %r53,%r30,-1; st.u64 [%r35+56],%r53; mov.u64 %r54,1; st.u64 [%r35+40],%r54; .loc 1 335 28 cvt.u32.u64 %r55,%r54; st.u8 [%r35+28],%r55; .loc 1 336 24 st.u64 [%r35+8],%r24; .loc 1 337 29 mov.u64 %r58call (%value_in),_gfortrani_xmallocarray37 27 st.u64 [%r35],%r24; bra $L71; $L70: .loc 1 339 12 cvta.global.u64 %r63,_gfortrani_compile_options; .loc 1 339 11 ld.u32 %r64,[%r63+36]; setp.eq.u32 %r65,%r64,0; @ %r65 bra $L71; .loc 1 341 8 cvta.const.u64 %r68,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 346 8 ld.u64 %r24,[%r35]; $L71: ld.u64 %r69,[%r35+40]; shl.b64 %r34,%r69,2; mov.u64 %r29,%r24; .loc 1 347 10 mov.u64 %r32,0; .loc 1 348 23 cvt.u32.u64 %r70,%r32; $L72: st.u32 [%r29],%r70; .loc 1 347 24 add.u64 %r32,%r32,1; .loc 1 347 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r71,%r30,%r32; @ %r71 bra $L72; $L65: .loc 1 349 maxloc0_8_s1.o/ 1622802248_gfortran_maxloc0_8_s1 .visible .func _gfortran_maxloc0_8_s1); .file 1 "../../../../libgfortran/generated/maxloc0_8_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_8_s1 .visible .func _gfortran_mmaxloc0_8_s1_gfortran_smaxloc0_8_s1 .visible .func _gfortran_smaxloc0_8_s1in_ar4); // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func 
_gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_returnmemcmpmem_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents88,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_8_s1 .visible .func _gfortran_maxloc0_8_s132u64u32 %r142; .reg .u64 %r143; .reg .predpred %r157; .reg .u64 %r158; .reg .predpredpred %r172; .reg .u64 %r175; mov.u64 %r97,%ar0; mov.u64 %r98,%ar1; mov.u32 %r99,%ar2; mov.u64 %r100,%ar3; .loc 1 64 10 ld.s8 %r22,[%r98+28]; .loc 1 64 8 cvt.u32.u32 %r101,%r22; cvt.s64.s8 %r64,%r101; .loc 1 65 6 setp.gt.s64 %r102,%r64,0; @ %r102 bra $L2; .loc 1 66 5 cvta.const.u64 %r103103_gfortran_runtime_errorr23,[%r97]; .loc 1 68 6 setp.ne.u64 %r105,%r23,0; @ %r105 bra $L3; .loc 1 70 7 st.u64 [%r97+48],%r23; add.u64 %r107,%r64,-1; st.u64 [%r97+56],%r107; mov.u64 %r108,1; st.u64 [%r97+40],%r108; .loc 1 71 28 cvt.u32.u64 %r109,%r108; st.u8 [%r97+28],%r109; .loc 1 72 24 st.u64 [%r97+8],%r23; .loc 1 73 29 mov.u64 %r11112; call (%value_in),_gfortrani_xmallocarray115,[%value_in]; } mov.u64 %r23,%r115; .loc 1 73 27 st.u64 [%r97],%r23; bra $L4; $L3: .loc 1 77 11 cvta.global.u64 %r117,_gfortrani_compile_options; .loc 1 77 10 ld.u32 %r118,[%r117+36]; setp.eq.u32 %r119,%r118,0; @ %r119 bra $L4; .loc 1 7892; call _gfortrani_bounds_iforeach_return.loc 1 83 8 ld.u64 %r23,[%r97]; $L4: .loc 1 82 11 ld.u64 %r65,[%r97+40]; add.u64 %r67,%r98,40; mov.u64 %r94,0; .loc 1 84 10 mov.u64 %r85,%r94; add.u64 %r170,%frame,120; add.u64 %r169,%frame,240; .loc 1 88 16 mov.u64 %r133,%r94; bra $L7; $L19: mov.u64 %r85,%r69; $L7: .loc 1 86 18 add.u64 %r123,%frame,%r94; .loc 1 86 51 ld.u64 %r125,[%r67]; mul.lo.u64 %r124,%r125,%r100; .loc 1 86 18 st.u64 [%r123],%r124; .loc 1 87 19 ld.u64 %r127,[%r67+16]; add.u64 %r126,%r127,1; ld.u64 %r128,[%r67+8]; sub.u64 %r37,%r126,%r128; .loc 1 87 17 add.u64 %r130,%r170,%r94; st.u64 [%r130],%r37; .loc 1 88 16 add.u64 %r132,%r169,%r94; st.u64 [%r132],%r133; .loc 1 89 10 setp.gt.s64 %r134,%r37,0; @ %r134 bra $L5; shl.b64 %r84,%r65,3; mov.u64 %r27,%r23; .loc 1 92 11 mov.u64 %r70,0; .loc 1 93 24 mov.u64 %r135,%r70; $L6: st.u64 [%r27],%r135; .loc 1 92 27 add.u64 %r70,%r70,1; .loc 1 92 4 add.u64 %r27,%r27,%r84; setp.ne.u64 %r136,%r64,%r70; @ %r136 bra $L6; bra $L1; $L5: .loc 1 84 26 add.u64 %r69,%r85,1; .loc 1 84 3 add.u64 %r67,%r67,24; add.u64 %r94,%r94,8; setp.ne.u64 %r137,%r64,%r69; @ %r137 bra $L19; .loc 1 98 8 ld.u64 %r53,[%r98]; shl.b64 %r29,%r65,3; mov.u64 %r50,%r23; .loc 1 101 10 mov.u64 %r68,0; .loc 1 102 23 mov.u64 %r138,1; $L8: st.u64 [%r50],%r138; mov.u64 %r86,%r68; .loc 1 101 26 add.u64 %r68,%r68,1; .loc 1 101 3 add.u64 %r50,%r50,%r29; setp.ne.u64 %r139,%r85,%r86; @ %r139 bra $L8; .loc 1 108 9 setp.ne.u64 %r140,%r53,0; @ ! 
%r140 bra $L1; .loc 1 123 19 ld.u64 %r40,[%frame]; .loc 1 125 34 ld.u64 %r43,[%frame+120]; cvt.u32.u32 %r142,%r22; cvt.s64.s8 %r141,%r142; shl.b64 %r143,%r141,3; add.u64 %r55,%r169,%r143; .loc 1 106 11 mov.u64 %r58,0; add.u64 %r171,%frame,248; .loc 1 114 8 setp.eq.u32 %r172,%r99,0; .loc 1 131 13 mov.u64 %r175,%r58; $L18: .loc 1 125 21 ld.u64 %r87,[%frame+240]; mov.u64 %r42,%r87; mov.u64 %r52,%r53; $L15: .loc 1 114 8 setp.eq.u64 %r145,%r58,0; @ %r145 bra $L11; .loc 1 41 12r100149,[%value_in]; } .loc 1 114 8 @ %r172 bra $L12; .loc 1 114 24 setp.lt.s32 %r152,%r149,0; @ %r152 bra $L13; $L11: mov.u64 %r63,%r171; .loc 1 125 21 mov.u64 %r71,%r23; mov.u64 %r88,%r42; bra $L14; $L12: .loc 1 114 24 setp.gt.s32 %r154,%r149,0; @ ! %r154 bra $L13; bra $L11; $L14: .loc 1 119 38 add.u64 %r155,%r88,1; .loc 1 119 27 st.u64 [%r71],%r155; .loc 1 118 7 add.u64 %r71,%r71,%r29; setp.eq.u64 %r156,%r55,%r63; @ %r156 bra $L20; .loc 1 119 34 ld.u64 %r88,[%r63]; add.u64 %r63,%r63,8; bra $L14; $L20: mov.u64 %r58,%r52; $L13: .loc 1 123 9 add.u64 %r52,%r52,%r40; .loc 1 125 14 add.u64 %r42,%r42,1; .loc 1 125 7 st.u64 [%frame+240],%r42; setp.ne.u64 %r157,%r42,%r43; @ %r157 bra $L15; .loc 1 123 9 sub.u64 %r158,%r43,%r87; mad.lo.u64 %r53,%r158,%r40,%r53; mov.u64 %r77,%r171; mov.u64 %r51,%r43; mov.u64 %r48,%r40; .loc 1 126 9 mov.u64 %r57,0; $L17: .loc 1 131 13 st.u64 [%r77+-8],%r175; .loc 1 134 23 mul.lo.u64 %r45,%r48,%r51; mov.u64 %r92,%r57; .loc 1 135 5 add.u64 %r57,%r57,1; .loc 1 136 7 setp.eq.u64 %r162,%r92,%r85; @ %r162 bra $L1; .loc 1 144 16 ld.u64 %r163,[%r77]; add.u64 %r47,%r163,1; st.u64 [%r77],%r47; shl.b64 %r74,%r57,3; .loc 1 145 23 add.u64 %r164,%frame,%r74; ld.u64 %r48,[%r164]; .loc 1 145 13 sub.u64 %r165,%r48,%r45; add.u64 %r53,%r53,%r165; .loc 1 148 32 add.u64 %r167,%r170,%r74; ld.u64 %r51,[%r167]; .loc 1 148 7 add.u64 %r77,%r77,8; setp.eq.u64 %r168,%r47,%r51; @ %r168 bra $L17; bra $L18; $L1: .loc 1 151_gfortran_mmaxloc0_8_s1 .visible .func _gfortran_mmaxloc0_8_s1in_ar4) {pred %r131; .reg .predpredpred %r155; .reg .u64 %r158; .reg .u64 %r161; .reg .u32pred %r198; .reg .u64predpredpred %r229; .reg .u64pred %r238; .reg .u64pred %r243; .reg .u64 %r246; mov.u64 %r126,%ar0; mov.u64 %r127,%ar1; mov.u64 %r128,%ar2; mov.u32 %r129,%ar3; mov.u64 %r130,%ar4; .loc 1 177 6 setp.ne.u64 %r131,%r128,0; @ %r131 bra $L32; .loc 1 180 7130; call _gfortran_maxloc0_8_s1184 7 bra $L31; $L32: .loc 1 187 10 ld.s8 %r22,[%r127+28]; .loc 1 188 6 setp.gt.s32 %r136,%r22,0; @ %r136 bra $L34; .loc 1 189 5 cvta.const.u64 %r131stack; call _gfortran_runtime_error34: .loc 1 191 6 ld.u64 %r139,[%r126]; setp.ne.u64 %r140,%r139,0; @ %r140 bra $L35; .loc 1 193 7 st.u64 [%r126+48],%r139; add.u32 %r142,%r22,-1; cvt.s64.s32 %r143,%r142; st.u64 [%r126+56],%r143; mov.u64 %r144,1; st.u64 [%r126+40],%r144; .loc 1 194 28 cvt.u32.u64 %r145,%r144; st.u8 [%r126+28],%r145; .loc 1 195 24 st.u64 [%r126+8],%r139; .loc 1 196 29 cvt.u32.u32 %r150,%r22; cvt.s64.s8 %r149,%r150; mov.u64 %r148call (%value_in),_gfortrani_xmallocarray151,[%value_in]; } .loc 1 196 27 st.u64 [%r126],%r151; bra $L36; $L35: .loc 1 200 11 cvta.global.u64 %r153,_gfortrani_compile_options; .loc 1 200 10 ld.u32 %r154,[%r153+36]; setp.eq.u32 %r155,%r154,0; @ %r155 bra $L36; .loc 1 203 4 cvta.const.u64 %r158,$LC1gfortrani_bounds_iforeach_return.loc 1 205 4 cvta.const.u64 %r1611r12716158; call _gfortrani_bounds_equal_extents$L36: .loc 1 210 15 ld.u64 %r31,[%r128+16]; .loc 1 212 9 ld.u64 %r79,[%r128]; .loc 1 214 22 cvt.u32.u64 %r32,%r31; .loc 1 214 53 add.u32 %r163,%r32,-4; and.b32 %r164,%r163,-5; 
set.u32.eq.u32 %r166,%r164,0; neg.s32 %r167,%r166; .loc 1 214 22 add.u32 %r168,%r32,-1; set.u32.le.u32 %r170,%r168,1; neg.s32 %r171,%r170; .loc 1 214 6 cvt.u16.u32 %r173,%r167; cvt.u16.u32 %r174,%r171; or.b16 %r172,%r173,%r174; cvt.u32.u16 %r175,%r172; cvt.u16.u8 %r176,%r175; setp.ne.u16 %r177,%r176,0; @ %r177 bra $L37; .loc 1 216 7 setp.ne.u32 %r179,%r32,16; @ %r179 bra $L38; $L37: .loc 1 223 11 ld.u64 %r92,[%r126+40]; .loc 1 224 8 ld.u64 %r93,[%r126]; .loc 1 225 17 cvt.u32.u32 %r180,%r22; cvt.s64.s8 %r112,%r180; add.u64 %r80,%r127,40; add.u64 %r65,%r128,40; mov.u64 %r44,0; .loc 1 225 10 mov.u64 %r81,%r44; add.u64 %r242,%frame,120; add.u64 %r239,%frame,240; add.u64 %r240,%frame,360; .loc 1 230 16 mov.u64 %r197,%r44; bra $L39; $L38: .loc 1 221 5_gfortran_runtime_error39: .loc 1 227 18 add.u64 %r184,%r242,%r44; .loc 1 227 51 ld.u64 %r186,[%r80]; mul.lo.u64 %r185,%r186,%r130; .loc 1 227 18 st.u64 [%r184],%r185; .loc 1 228 18 add.u64 %r187,%frame,%r44; .loc 1 228 20 ld.u64 %r189,[%r65]; mul.lo.u64 %r188,%r189,%r31; .loc 1 228 18 st.u64 [%r187],%r188; .loc 1 229 19 ld.u64 %r191,[%r80+16]; add.u64 %r190,%r191,1; ld.u64 %r192,[%r80+8]; sub.u64 %r50,%r190,%r192; .loc 1 229 17 add.u64 %r194,%r239,%r44; st.u64 [%r194],%r50; .loc 1 230 16 add.u64 %r196,%r240,%r44; st.u64 [%r196],%r197; .loc 1 231 10 setp.gt.s64 %r198,%r50,0; @ %r198 bra $L40; shl.b64 %r122,%r92,3; mov.u64 %r37,%r93; .loc 1 234 11 mov.u64 %r95,0; .loc 1 235 24 mov.u64 %r199,%r95; $L41: st.u64 [%r37],%r199; .loc 1 234 27 add.u64 %r95,%r95,1; .loc 1 234 4 add.u64 %r37,%r37,%r122; setp.ne.u64 %r200,%r95,%r112; @ %r200 bra $L41; bra $L31; $L40: .loc 1 225 26 add.u64 %r81,%r81,1; .loc 1 225 3 add.u64 %r80,%r80,24; add.u64 %r65,%r65,24; add.u64 %r44,%r44,8; setp.ne.u64 %r201,%r81,%r112; @ %r201 bra $L39; .loc 1 240 8 ld.u64 %r77,[%r127]; shl.b64 %r87,%r92,3; mov.u64 %r99,%r93; .loc 1 243 10 mov.u64 %r94,0; .loc 1 244 23 mov.u64 %r202,%r94; $L43: st.u64 [%r99],%r202; .loc 1 243 26 add.u64 %r94,%r94,1; .loc 1 243 3 add.u64 %r99,%r99,%r87; setp.ne.u64 %r203,%r94,%r112; @ %r203 bra $L43; .loc 1 251 9 setp.eq.u64 %r204,%r77,0; @ %r204 bra $L31; .loc 1 267 19 ld.u64 %r57,[%frame+120]; .loc 1 268 20 ld.u64 %r59,[%frame]; .loc 1 270 34 ld.u64 %r62,[%frame+240]; cvt.u32.u32 %r206,%r22; cvt.s64.s8 %r205,%r206; shl.b64 %r207,%r205,3; add.u64 %r101,%r240,%r207; .loc 1 249 10 mov.u64 %r83,0; add.u64 %r241,%frame,368; .loc 1 257 6 setp.eq.u32 %r243,%r129,0; .loc 1 276 13 mov.u64 %r246,%r83; $L50: .loc 1 270 21 ld.u64 %r123,[%frame+360]; mov.u64 %r61,%r123; mov.u64 %r78,%r79; mov.u64 %r76,%r77; $L48: .loc 1 257 6 ld.s8 %r210,[%r78]; cvt.u16.u32 %r209,%r210; setp.eq.u16 %r211,%r209,0; @ %r211 bra $L44; .loc 1 257 14 setp.eq.u64 %r212,%r83,0; @ %r212 bra $L45;257 6 @ %r243 bra $L46; .loc 1 258 25 setp.lt.s32 %r219,%r216,0; @ %r219 bra $L44; $L45: mov.u64 %r106,%r241; .loc 1 270 21 mov.u64 %r107,%r93; mov.u64 %r124,%r61; bra $L47; $L46: .loc 1 258 25 setp.gt.s32 %r221,%r216,0; @ ! 
%r221 bra $L44; bra $L45; $L47: .loc 1 263 38 add.u64 %r222,%r124,1; .loc 1 263 27 st.u64 [%r107],%r222; .loc 1 262 7 add.u64 %r107,%r107,%r87; setp.eq.u64 %r223,%r101,%r106; @ %r223 bra $L51; .loc 1 263 34 ld.u64 %r124,[%r106]; add.u64 %r106,%r106,8; bra $L47; $L51: mov.u64 %r83,%r76; $L44: .loc 1 267 9 add.u64 %r76,%r76,%r57; .loc 1 268 10 add.u64 %r78,%r78,%r59; .loc 1 270 14 add.u64 %r61,%r61,1; .loc 1 270 7 st.u64 [%frame+360],%r61; setp.ne.u64 %r224,%r61,%r62; @ %r224 bra $L48; sub.u64 %r117,%r62,%r123; .loc 1 267 9 mad.lo.u64 %r77,%r57,%r117,%r77; .loc 1 268 10 mad.lo.u64 %r79,%r59,%r117,%r79; mov.u64 %r114,%r241; mov.u64 %r72,%r59; mov.u64 %r75,%r62; mov.u64 %r70,%r57; mov.u64 %r113,8; .loc 1 271 9 mov.u64 %r82,0; $L49: .loc 1 276 13 st.u64 [%r114+-8],%r246; .loc 1 279 23 mul.lo.u64 %r64,%r70,%r75; .loc 1 280 24 mul.lo.u64 %r67,%r75,%r72; .loc 1 281 5 add.u64 %r82,%r82,1; .loc 1 282 7 setp.eq.u64 %r229,%r82,%r112; @ %r229 bra $L31; .loc 1 290 16 ld.u64 %r230,[%r114]; add.u64 %r69,%r230,1; st.u64 [%r114],%r69; .loc 1 291 23 add.u64 %r232,%r242,%r113; ld.u64 %r70,[%r232]; .loc 1 291 13 sub.u64 %r233,%r70,%r64; add.u64 %r77,%r77,%r233; .loc 1 292 24 add.u64 %r234,%frame,%r113; ld.u64 %r72,[%r234]; .loc 1 292 14 sub.u64 %r235,%r72,%r67; add.u64 %r79,%r79,%r235; .loc 1 295 32 add.u64 %r237,%r239,%r113; ld.u64 %r75,[%r237]; .loc 1 295 7 add.u64 %r114,%r114,8; add.u64 %r113,%r113,8; setp.eq.u64 %r238,%r69,%r75; @ %r238 bra $L49; bra $L50; $L31: .loc 1 298_gfortran_smaxloc0_8_s1 .visible .func _gfortran_smaxloc0_8_s1in_ar4) {4u32 %r41; .reg .predu32 %r55; .reg .u64u32ar2; mov.u32 %r38,%ar3; mov.u64 %r39,%ar4; .loc 1 317 6 setp.eq.u64 %r40,%r37,0; @ %r40 bra $L70; .loc 1 317 20 ld.u32 %r41,[%r37]; setp.eq.u32 %r42,%r41,0; @ %r42 bra $L71; $L70: .loc 1 320 7 {call _gfortran_maxloc0_8_s1324 7 bra $L69; $L71: .loc 1 327 8 ld.s8 %r30,[%r36+28]; .loc 1 329 6 setp.gt.s64 %r48,%r30,0; @ %r48 bra $L73; .loc 1 330 5 cvta.const.u64 %r49,$LC0; {stack; call _gfortran_runtime_error73: .loc 1 332 15 ld.u64 %r24,[%r35]; .loc 1 332 6 setp.ne.u64 %r51,%r24,0; @ %r51 bra $L74; .loc 1 334 7 st.u64 [%r35+48],%r24; add.u64 %r53,%r30,-1; st.u64 [%r35+56],%r53; mov.u64 %r54,1; st.u64 [%r35+40],%r54; .loc 1 335 28 cvt.u32.u64 %r55,%r54; st.u8 [%r35+28],%r55; .loc 1 336 24 st.u64 [%r35+8],%r24; .loc 1 337 29 mov.u64 call (%value_in),_gfortrani_xmallocarray37 27 st.u64 [%r35],%r24; bra $L75; $L74: .loc 1 339 12 cvta.global.u64 %r63,_gfortrani_compile_options; .loc 1 339 11 ld.u32 %r64,[%r63+36]; setp.eq.u32 %r65,%r64,0; @ %r65 bra $L75; .loc 1 341 8 cvta.const.u64 %r68,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 346 8 ld.u64 %r24,[%r35]; $L75: ld.u64 %r69,[%r35+40]; shl.b64 %r34,%r69,3; mov.u64 %r29,%r24; .loc 1 347 10 mov.u64 %r32,0; .loc 1 348 23 mov.u64 %r70,%r32; $L76: st.u64 [%r29],%r70; .loc 1 347 24 add.u64 %r32,%r32,1; .loc 1 347 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r71,%r30,%r32; @ %r71 bra $L76; $L69: .loc 1 349 maxloc0_8_s4.o/ 1622802248_gfortran_maxloc0_8_s4 .visible .func _gfortran_maxloc0_8_s4); .file 1 "../../../../libgfortran/generated/maxloc0_8_s4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_8_s4 .visible .func _gfortran_mmaxloc0_8_s_gfortran_smaxloc0_8_s4 .visible .func _gfortran_smaxloc0_8_sVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func 
_gfortrani_bounds_iforeach_return_gfortrani_memcmp_char4_gfortrani_memcmp_char4_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents88,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_8_s4 .visible .func _gfortran_maxloc0_8_s4predu64 %r146; .reg .pred %r148; .reg .u32 %r154; .reg .pred %r155; .reg .u32 %r159; .reg .predpredpredpredpred64 10 ld.s8 %r22,[%r101+28]; .loc 1 64 8 cvt.u32.u32 %r104,%r22; cvt.s64.s8 %r61,%r104; .loc 1 65 6 setp.gt.s64 %r105,%r61,0; @ %r105 bra $L2; .loc 1 66 5 cvta.const.u64 %r106,$LC0; {stack; call _gfortran_runtime_errorr23,[%r100]; .loc 1 68 6 setp.ne.u64 %r108,%r23,0; @ %r108 bra $L3; .loc 1 70 7 st.u64 [%r100+48],%r23; add.u64 %r110,%r61,-1; st.u64 [%r100+56],%r110; mov.u64 %r111,1; st.u64 [%r100+40],%r111; .loc 1 71 28 cvt.u32.u64 %r112,%r111; st.u8 [%r100+28],%r112; .loc 1 72 24 st.u64 [%r100+8],%r23; .loc 1 73 29 mov.u64 %r11561call (%value_in),_gfortrani_xmallocarray118,[%value_in]; } mov.u64 %r23,%r118; .loc 1 73 27 st.u64 [%r100],%r23; bra $L4; $L3: .loc 1 77 11 cvta.global.u64 %r120,_gfortrani_compile_options; .loc 1 77 10 ld.u32 %r121,[%r120+364; .loc 1 78r1015; call _gfortrani_bounds_iforeach_return.loc 1 83 8 ld.u64 %r23,[%r100]; $L4: .loc 1 82 11 ld.u64 %r62,[%r100+40]; add.u64 %r65,%r101,40; mov.u64 %r98,0; .loc 1 84 10 mov.u64 %r82,%r98; add.u64 %r178,%frame,120; add.u64 %r176,%frame,240; .loc 1 88 16 mov.u64 %r136,%r98; bra $L7; $L20: mov.u64 %r82,%r67; $L7: .loc 1 86 18 add.u64 %r126,%frame,%r98; .loc 1 86 51 ld.u64 %r128,[%r65]; mul.lo.u64 %r127,%r128,%r103; .loc 1 86 18 st.u64 [%r126],%r127; .loc 1 87 19 ld.u64 %r130,[%r65+16]; add.u64 %r129,%r130,1; ld.u64 %r131,[%r65+8]; sub.u64 %r36,%r129,%r131; .loc 1 87 17 add.u64 %r133,%r178,%r98; st.u64 [%r133],%r36; .loc 1 88 16 add.u64 %r135,%r176,%r98; st.u64 [%r135],%r136; .loc 1 89 10 setp.gt.s64 %r137,%r36,0; @ %r137 bra $L5; shl.b64 %r79,%r62,3; mov.u64 %r81,%r23; .loc 1 92 11 mov.u64 %r68,0; .loc 1 93 24 mov.u64 %r138,%r68; $L6: st.u64 [%r81],%r138; .loc 1 92 27 add.u64 %r68,%r68,1; .loc 1 92 4 add.u64 %r81,%r81,%r79; setp.ne.u64 %r139,%r61,%r68; @ %r139 bra $L6; bra $L1; $L5: .loc 1 84 26 add.u64 %r67,%r82,1; .loc 1 84 3 add.u64 %r65,%r65,24; add.u64 %r98,%r98,8; setp.ne.u64 %r140,%r61,%r67; @ %r140 bra $L20; .loc 1 98 8 ld.u64 %r50,[%r101]; shl.b64 %r28,%r62,3; mov.u64 %r47,%r23; .loc 1 101 10 mov.u64 %r66,0; .loc 1 102 23 mov.u64 %r141,1; $L8: st.u64 [%r47],%r141; mov.u64 %r83,%r66; .loc 1 101 26 add.u64 %r66,%r66,1; .loc 1 101 3 add.u64 %r47,%r47,%r28; setp.ne.u64 %r142,%r82,%r83; @ %r142 bra $L8; .loc 1 108 9 setp.ne.u64 %r143,%r50,0; @ ! %r143 bra $L1; .loc 1 123 19 ld.u64 %r88,[%frame]; .loc 1 123 9 shl.b64 %r90,%r88,2; .loc 1 125 34 ld.u64 %r91,[%frame+120]; cvt.u32.u32 %r145,%r22; cvt.s64.s8 %r144,%r145; shl.b64 %r146,%r144,3; add.u64 %r52,%r176,%r146; .loc 1 106 11 mov.u64 %r55,0; add.u64 %r177,%frame,248; .loc 1 114 8 setp.eq.u32 %r179,%r102,0; $L19: .loc 1 125 21 ld.u64 %r92,[%frame+240]; mov.u64 %r40,%r92; mov.u64 %r49,%r50; $L16: .loc 1 114 8 setp.ne.u64 %r148,%r55,0; @ %r148 bra $L11; $L14: mov.u64 %r60,%r177; .loc 1 125 21 mov.u64 %r71,%r23; mov.u64 %r93,%r40; bra $L12; $L11: .loc 1 114 8 @ %r179 bra $L13;5_gfortrani_memcmp_char4154,[%value_in]; } .loc 1 114 24 setp.ge.s32 %r155,%r154,0; @ ! 
[nvptx PTX payload, not reproduced: the remainder of this span is machine-generated assembly for the libgfortran MAXLOC/MINLOC string intrinsics, extracted from the static-archive members maxloc0_8_s4.o, maxloc0_16_s1.o, maxloc0_16_s4.o, minloc0_4_s1.o, minloc0_4_s4.o and minloc0_8_s1.o. The span opens partway through maxloc0_8_s4.o and is cut off inside minloc0_8_s1.o. Each member is generated from the corresponding ../../../../libgfortran/generated/*.c source and exports three .visible .func entry points, _gfortran_<name>, _gfortran_m<name> (masked) and _gfortran_s<name> (scalar mask), while declaring the externs _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return, _gfortrani_bounds_equal_extents and _gfortrani_memcmp_char4. The instruction streams themselves are truncated and garbled in this extraction and are omitted.]
%r221 bra $L44; bra $L45; $L47: .loc 1 263 38 add.u64 %r222,%r124,1; .loc 1 263 27 st.u64 [%r107],%r222; .loc 1 262 7 add.u64 %r107,%r107,%r87; setp.eq.u64 %r223,%r101,%r106; @ %r223 bra $L51; .loc 1 263 34 ld.u64 %r124,[%r106]; add.u64 %r106,%r106,8; bra $L47; $L51: mov.u64 %r83,%r76; $L44: .loc 1 267 9 add.u64 %r76,%r76,%r57; .loc 1 268 10 add.u64 %r78,%r78,%r59; .loc 1 270 14 add.u64 %r61,%r61,1; .loc 1 270 7 st.u64 [%frame+360],%r61; setp.ne.u64 %r224,%r61,%r62; @ %r224 bra $L48; sub.u64 %r117,%r62,%r123; .loc 1 267 9 mad.lo.u64 %r77,%r57,%r117,%r77; .loc 1 268 10 mad.lo.u64 %r79,%r59,%r117,%r79; mov.u64 %r114,%r241; mov.u64 %r72,%r59; mov.u64 %r75,%r62; mov.u64 %r70,%r57; mov.u64 %r113,8; .loc 1 271 9 mov.u64 %r82,0; $L49: .loc 1 276 13 st.u64 [%r114+-8],%r246; .loc 1 279 23 mul.lo.u64 %r64,%r70,%r75; .loc 1 280 24 mul.lo.u64 %r67,%r75,%r72; .loc 1 281 5 add.u64 %r82,%r82,1; .loc 1 282 7 setp.eq.u64 %r229,%r82,%r112; @ %r229 bra $L31; .loc 1 290 16 ld.u64 %r230,[%r114]; add.u64 %r69,%r230,1; st.u64 [%r114],%r69; .loc 1 291 23 add.u64 %r232,%r242,%r113; ld.u64 %r70,[%r232]; .loc 1 291 13 sub.u64 %r233,%r70,%r64; add.u64 %r77,%r77,%r233; .loc 1 292 24 add.u64 %r234,%frame,%r113; ld.u64 %r72,[%r234]; .loc 1 292 14 sub.u64 %r235,%r72,%r67; add.u64 %r79,%r79,%r235; .loc 1 295 32 add.u64 %r237,%r239,%r113; ld.u64 %r75,[%r237]; .loc 1 295 7 add.u64 %r114,%r114,8; add.u64 %r113,%r113,8; setp.eq.u64 %r238,%r69,%r75; @ %r238 bra $L49; bra $L50; $L31: .loc 1 298_gfortran_sminloc0_8_s1 .visible .func _gfortran_sminloc0_8_s1in_ar4) {4u32 %r41; .reg .predu32 %r55; .reg .u64u32ar2; mov.u32 %r38,%ar3; mov.u64 %r39,%ar4; .loc 1 317 6 setp.eq.u64 %r40,%r37,0; @ %r40 bra $L70; .loc 1 317 20 ld.u32 %r41,[%r37]; setp.eq.u32 %r42,%r41,0; @ %r42 bra $L71; $L70: .loc 1 320 7 {call _gfortran_minloc0_8_s1324 7 bra $L69; $L71: .loc 1 327 8 ld.s8 %r30,[%r36+28]; .loc 1 329 6 setp.gt.s64 %r48,%r30,0; @ %r48 bra $L73; .loc 1 330 5 cvta.const.u64 %r49,$LC0; {stack; call _gfortran_runtime_error73: .loc 1 332 15 ld.u64 %r24,[%r35]; .loc 1 332 6 setp.ne.u64 %r51,%r24,0; @ %r51 bra $L74; .loc 1 334 7 st.u64 [%r35+48],%r24; add.u64 %r53,%r30,-1; st.u64 [%r35+56],%r53; mov.u64 %r54,1; st.u64 [%r35+40],%r54; .loc 1 335 28 cvt.u32.u64 %r55,%r54; st.u8 [%r35+28],%r55; .loc 1 336 24 st.u64 [%r35+8],%r24; .loc 1 337 29 mov.u64 call (%value_in),_gfortrani_xmallocarray37 27 st.u64 [%r35],%r24; bra $L75; $L74: .loc 1 339 12 cvta.global.u64 %r63,_gfortrani_compile_options; .loc 1 339 11 ld.u32 %r64,[%r63+36]; setp.eq.u32 %r65,%r64,0; @ %r65 bra $L75; .loc 1 341 8 cvta.const.u64 %r68,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 346 8 ld.u64 %r24,[%r35]; $L75: ld.u64 %r69,[%r35+40]; shl.b64 %r34,%r69,3; mov.u64 %r29,%r24; .loc 1 347 10 mov.u64 %r32,0; .loc 1 348 23 mov.u64 %r70,%r32; $L76: st.u64 [%r29],%r70; .loc 1 347 24 add.u64 %r32,%r32,1; .loc 1 347 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r71,%r30,%r32; @ %r71 bra $L76; $L69: .loc 1 349 minloc0_8_s4.o/ 1622802248_gfortran_minloc0_8_s4 .visible .func _gfortran_minloc0_8_s4); .file 1 "../../../../libgfortran/generated/minloc0_8_s4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_8_s4 .visible .func _gfortran_mminloc0_8_s_gfortran_sminloc0_8_s4 .visible .func _gfortran_sminloc0_8_sVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func 
_gfortrani_bounds_iforeach_return_gfortrani_memcmp_char4_gfortrani_memcmp_char4_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_8_s4 .visible .func _gfortran_minloc0_8_s4predu64 %r146; .reg .pred %r148; .reg .u32 %r154; .reg .pred %r155; .reg .u32 %r159; .reg .predpredpredpredpred64 10 ld.s8 %r22,[%r101+28]; .loc 1 64 8 cvt.u32.u32 %r104,%r22; cvt.s64.s8 %r61,%r104; .loc 1 65 6 setp.gt.s64 %r105,%r61,0; @ %r105 bra $L2; .loc 1 66 5 cvta.const.u64 %r106,$LC0; {stack; call _gfortran_runtime_errorr23,[%r100]; .loc 1 68 6 setp.ne.u64 %r108,%r23,0; @ %r108 bra $L3; .loc 1 70 7 st.u64 [%r100+48],%r23; add.u64 %r110,%r61,-1; st.u64 [%r100+56],%r110; mov.u64 %r111,1; st.u64 [%r100+40],%r111; .loc 1 71 28 cvt.u32.u64 %r112,%r111; st.u8 [%r100+28],%r112; .loc 1 72 24 st.u64 [%r100+8],%r23; .loc 1 73 29 mov.u64 %r11561call (%value_in),_gfortrani_xmallocarray118,[%value_in]; } mov.u64 %r23,%r118; .loc 1 73 27 st.u64 [%r100],%r23; bra $L4; $L3: .loc 1 77 11 cvta.global.u64 %r120,_gfortrani_compile_options; .loc 1 77 10 ld.u32 %r121,[%r120+364; .loc 1 78r1015; call _gfortrani_bounds_iforeach_return.loc 1 83 8 ld.u64 %r23,[%r100]; $L4: .loc 1 82 11 ld.u64 %r62,[%r100+40]; add.u64 %r65,%r101,40; mov.u64 %r98,0; .loc 1 84 10 mov.u64 %r82,%r98; add.u64 %r178,%frame,120; add.u64 %r176,%frame,240; .loc 1 88 16 mov.u64 %r136,%r98; bra $L7; $L20: mov.u64 %r82,%r67; $L7: .loc 1 86 18 add.u64 %r126,%frame,%r98; .loc 1 86 51 ld.u64 %r128,[%r65]; mul.lo.u64 %r127,%r128,%r103; .loc 1 86 18 st.u64 [%r126],%r127; .loc 1 87 19 ld.u64 %r130,[%r65+16]; add.u64 %r129,%r130,1; ld.u64 %r131,[%r65+8]; sub.u64 %r36,%r129,%r131; .loc 1 87 17 add.u64 %r133,%r178,%r98; st.u64 [%r133],%r36; .loc 1 88 16 add.u64 %r135,%r176,%r98; st.u64 [%r135],%r136; .loc 1 89 10 setp.gt.s64 %r137,%r36,0; @ %r137 bra $L5; shl.b64 %r79,%r62,3; mov.u64 %r81,%r23; .loc 1 92 11 mov.u64 %r68,0; .loc 1 93 24 mov.u64 %r138,%r68; $L6: st.u64 [%r81],%r138; .loc 1 92 27 add.u64 %r68,%r68,1; .loc 1 92 4 add.u64 %r81,%r81,%r79; setp.ne.u64 %r139,%r61,%r68; @ %r139 bra $L6; bra $L1; $L5: .loc 1 84 26 add.u64 %r67,%r82,1; .loc 1 84 3 add.u64 %r65,%r65,24; add.u64 %r98,%r98,8; setp.ne.u64 %r140,%r61,%r67; @ %r140 bra $L20; .loc 1 98 8 ld.u64 %r50,[%r101]; shl.b64 %r28,%r62,3; mov.u64 %r47,%r23; .loc 1 101 10 mov.u64 %r66,0; .loc 1 102 23 mov.u64 %r141,1; $L8: st.u64 [%r47],%r141; mov.u64 %r83,%r66; .loc 1 101 26 add.u64 %r66,%r66,1; .loc 1 101 3 add.u64 %r47,%r47,%r28; setp.ne.u64 %r142,%r82,%r83; @ %r142 bra $L8; .loc 1 108 9 setp.ne.u64 %r143,%r50,0; @ ! %r143 bra $L1; .loc 1 123 19 ld.u64 %r88,[%frame]; .loc 1 123 9 shl.b64 %r90,%r88,2; .loc 1 125 34 ld.u64 %r91,[%frame+120]; cvt.u32.u32 %r145,%r22; cvt.s64.s8 %r144,%r145; shl.b64 %r146,%r144,3; add.u64 %r52,%r176,%r146; .loc 1 106 11 mov.u64 %r55,0; add.u64 %r177,%frame,248; .loc 1 114 8 setp.eq.u32 %r179,%r102,0; $L19: .loc 1 125 21 ld.u64 %r92,[%frame+240]; mov.u64 %r40,%r92; mov.u64 %r49,%r50; $L16: .loc 1 114 8 setp.ne.u64 %r148,%r55,0; @ %r148 bra $L11; $L14: mov.u64 %r60,%r177; .loc 1 125 21 mov.u64 %r71,%r23; mov.u64 %r93,%r40; bra $L12; $L11: .loc 1 114 8 @ %r179 bra $L13;5_gfortrani_memcmp_char4154,[%value_in]; } .loc 1 114 24 setp.le.s32 %r155,%r154,0; @ ! 
%r155 bra $L15; bra $L145_gfortrani_memcmp_char4.loc 1 114 24 setp.lt.s32 %r160,%r159,0; @ ! %r160 bra $L15; bra $L14; $L12: .loc 1 119 38 add.u64 %r161,%r93,1; .loc 1 119 27 st.u64 [%r71],%r161; .loc 1 118 7 add.u64 %r71,%r71,%r28; setp.eq.u64 %r162,%r52,%r60; @ %r162 bra $L21; .loc 1 119 34 ld.u64 %r93,[%r60]; add.u64 %r60,%r60,8; bra $L12; $L21: .loc 1 117 14 mov.u64 %r55,%r49; $L15: .loc 1 123 9 add.u64 %r49,%r49,%r90; .loc 1 125 14 add.u64 %r40,%r40,1; .loc 1 125 7 st.u64 [%frame+240],%r40; setp.ne.u64 %r163,%r40,%r91; @ %r163 bra $L16; .loc 1 123 9 sub.u64 %r164,%r91,%r92; mad.lo.u64 %r50,%r164,%r90,%r50; mov.u64 %r77,%r177; mov.u64 %r48,%r91; mov.u64 %r45,%r88; .loc 1 126 9 mov.u64 %r54,0; .loc 1 131 13 mov.u64 %r167,%r54; $L18: st.u64 [%r77+-8],%r167; .loc 1 134 23 mul.lo.u64 %r42,%r45,%r48; mov.u64 %r38,%r54; .loc 1 135 5 add.u64 %r54,%r54,1; .loc 1 136 7 setp.eq.u64 %r168,%r38,%r82; @ %r168 bra $L1; .loc 1 144 16 ld.u64 %r169,[%r77]; add.u64 %r44,%r169,1; st.u64 [%r77],%r44; shl.b64 %r74,%r54,3; .loc 1 145 23 add.u64 %r170,%frame,%r74; ld.u64 %r45,[%r170]; .loc 1 145 13 sub.u64 %r171,%r45,%r42; shl.b64 %r172,%r171,2; add.u64 %r50,%r50,%r172; .loc 1 148 32 add.u64 %r174,%r178,%r74; ld.u64 %r48,[%r174]; .loc 1 148 7 add.u64 %r77,%r77,8; setp.eq.u64 %r175,%r44,%r48; @ %r175 bra $L18; bra $L19; $L1: .loc 1 151_gfortran_mminloc0_8_s4 .visible .func _gfortran_mminloc0_8_s, .param .u64 %in_ar4) {8pred %r134; .reg .predpredpredpred %r204; .reg .u64 %r205; .reg .predu32 %r209; .reg .u64pred %r222; .reg .u32u64 %r242; .reg .u64mov.u64 %r129,%ar0; mov.u64 %r130,%ar1; mov.u64 %r131,%ar2; mov.u32 %r132,%ar3; mov.u64 %r133,%ar4; .loc 1 177 6 setp.ne.u64 %r134,%r131,0; @ %r134 bra $L302133; call _gfortran_minloc0_8_s4184 7 bra $L29; $L30: .loc 1 187 10 ld.s8 %r22,[%r130+28]; .loc 1 188 6 setp.gt.s32 %r139,%r22,0; @ %r139 bra $L32; .loc 1 189 5 cvta.const.u64 %r140140_gfortran_runtime_error32: .loc 1 191 6 ld.u64 %r142,[%r129]; setp.ne.u64 %r143,%r142,0; @ %r143 bra $L33; .loc 1 193 7 st.u64 [%r129+48],%r142; add.u32 %r145,%r22,-1; cvt.s64.s32 %r146,%r145; st.u64 [%r129+56],%r146; mov.u64 %r147,1; st.u64 [%r129+40],%r147; .loc 1 194 28 cvt.u32.u64 %r148,%r147; st.u8 [%r129+28],%r148; .loc 1 195 24 st.u64 [%r129+8],%r142; .loc 1 196 29 cvt.u32.u32 %r153,%r22; cvt.s64.s8 %r152,%r153; mov.u64 %r1511151; call (%value_in),_gfortrani_xmallocarray154,[%value_in]; } .loc 1 196 27 st.u64 [%r129],%r154; bra $L34; $L33: .loc 1 200 11 cvta.global.u64 %r156,_gfortrani_compile_options; .loc 1 200 10 ld.u32 %r157,[%r156+36]; setp.eq.u32 %r158,%r157,0gfortrani_bounds_iforeach_return.loc 1 205 4 cvta.const.u64 %r1641; call _gfortrani_bounds_equal_extents$L34: .loc 1 210 15 ld.u64 %r31,[%r131+16]; .loc 1 212 9 ld.u64 %r73,[%r131]; .loc 1 214 22 cvt.u32.u64 %r32,%r31; .loc 1 214 53 add.u32 %r166,%r32,-4; and.b32 %r167,%r166,-5; set.u32.eq.u32 %r169,%r167,0; neg.s32 %r170,%r169; .loc 1 214 22 add.u32 %r171,%r32,-1; set.u32.le.u32 %r173,%r171,1; neg.s32 %r174,%r173; .loc 1 214 6 cvt.u16.u32 %r176,%r170; cvt.u16.u32 %r177,%r174; or35; .loc 1 216 7 setp.ne.u32 %r182,%r32,16; @ %r182 bra $L36; $L35: .loc 1 223 11 ld.u64 %r85,[%r129+40]; .loc 1 224 8 ld.u64 %r86,[%r129]; .loc 1 225 17 cvt.u32.u32 %r183,%r22; cvt.s64.s8 %r107,%r183; add.u64 %r74,%r130,40; add.u64 %r58,%r131,40; mov.u64 %r37,0; .loc 1 225 10 mov.u64 %r75,%r37; add.u64 %r247,%frame,120; add.u64 %r248,%frame,240; add.u64 %r249,%frame,360; .loc 1 230 16 mov.u64 %r200,%r37; bra $L37; $L36: .loc 1 221 5_gfortran_runtime_error37: .loc 1 227 18 add.u64 
%r187,%r247,%r37; .loc 1 227 51 ld.u64 %r189,[%r74]; mul.lo.u64 %r188,%r189,%r133; .loc 1 227 18 st.u64 [%r187],%r188; .loc 1 228 18 add.u64 %r190,%frame,%r37; .loc 1 228 20 ld.u64 %r192,[%r58]; mul.lo.u64 %r191,%r192,%r31; .loc 1 228 18 st.u64 [%r190],%r191; .loc 1 229 19 ld.u64 %r194,[%r74+16]; add.u64 %r193,%r194,1; ld.u64 %r195,[%r74+8]; sub.u64 %r49,%r193,%r195; .loc 1 229 17 add.u64 %r197,%r248,%r37; st.u64 [%r197],%r49; .loc 1 230 16 add.u64 %r199,%r249,%r37; st.u64 [%r199],%r200; .loc 1 231 10 setp.gt.s64 %r201,%r49,0; @ %r201 bra $L38; shl.b64 %r113,%r85,3; mov.u64 %r115,%r86; .loc 1 234 11 mov.u64 %r90,0; .loc 1 235 24 mov.u64 %r202,%r90; $L39: st.u64 [%r115],%r202; .loc 1 234 27 add.u64 %r90,%r90,1; .loc 1 234 4 add.u64 %r115,%r115,%r113; setp.ne.u64 %r203,%r90,%r107; @ %r203 bra $L39; bra $L29; $L38: .loc 1 225 26 add.u64 %r75,%r75,1; .loc 1 225 3 add.u64 %r74,%r74,24; add.u64 %r58,%r58,24; add.u64 %r37,%r37,8; setp.ne.u64 %r204,%r75,%r107; @ %r204 bra $L37; .loc 1 240 8 ld.u64 %r70,[%r130]; shl.b64 %r88,%r85,3; mov.u64 %r96,%r86; .loc 1 243 10 mov.u64 %r89,0; .loc 1 244 23 mov.u64 %r205,%r89; $L41: st.u64 [%r96],%r205; .loc 1 243 26 add.u64 %r89,%r89,1; .loc 1 243 3 add.u64 %r96,%r96,%r88; setp.ne.u64 %r206,%r89,%r107; @ %r206 bra $L41; .loc 1 251 9 setp.eq.u64 %r207,%r70,0; @ %r207 bra $L29; .loc 1 267 19 ld.u64 %r121,[%frame+120]; .loc 1 267 9 shl.b64 %r123,%r121,2; .loc 1 268 20 ld.u64 %r124,[%frame]; .loc 1 270 34 ld.u64 %r126,[%frame+240]; cvt.u32.u32 %r209,%r22; cvt.s64.s8 %r208,%r209; shl.b64 %r210,%r208,3; add.u64 %r98,%r249,%r210; .loc 1 249 10 mov.u64 %r77,0; add.u64 %r246,%frame,368; .loc 1 257 6 setp.eq.u32 %r250,%r132,0; $L49: .loc 1 270 21 ld.u64 %r127,[%frame+360]; mov.u64 %r55,%r127; mov.u64 %r72,%r73; mov.u64 %r69,%r70; $L47: .loc 1 257 6 ld.s8 %r213,[%r72]; cvt.u16.u32 %r212,%r213; setp.eq.u16 %r214,%r212,0; @ %r214 bra $L42; .loc 1 257 14 setp.ne.u64 %r215,%r77,0; @ %r215 bra $L43; $L46: mov.u64 %r103,%r246; .loc 1 270 21 mov.u64 %r104,%r86; mov.u64 %r128,%r55; bra $L44; $L43: .loc 1 257 6 @ %r250 bra $L45; .loc 1 43r133; call (%value_in),_gfortrani_memcmp_char4221,[%value_in]; } .loc 1 258 23 setp.le.s32 %r222,%r221,0; @ ! %r222 bra $L42; bra $L46; $L45: .loc 1 43r133; call (%value_in),_gfortrani_memcmp_char4226,[%value_in]; } .loc 1 258 23 setp.lt.s32 %r227,%r226,0; @ ! 
%r227 bra $L42; bra $L46; $L44: .loc 1 263 38 add.u64 %r228,%r128,1; .loc 1 263 27 st.u64 [%r104],%r228; .loc 1 262 7 add.u64 %r104,%r104,%r88; setp.eq.u64 %r229,%r98,%r103; @ %r229 bra $L50; .loc 1 263 34 ld.u64 %r128,[%r103]; add.u64 %r103,%r103,8; bra $L44; $L50: .loc 1 261 14 mov.u64 %r77,%r69; $L42: .loc 1 267 9 add.u64 %r69,%r69,%r123; .loc 1 268 10 add.u64 %r72,%r72,%r124; .loc 1 270 14 add.u64 %r55,%r55,1; .loc 1 270 7 st.u64 [%frame+360],%r55; setp.ne.u64 %r230,%r55,%r126; @ %r230 bra $L47; sub.u64 %r118,%r126,%r127; .loc 1 267 9 mad.lo.u64 %r70,%r118,%r123,%r70; .loc 1 268 10 mad.lo.u64 %r73,%r118,%r124,%r73; mov.u64 %r111,%r246; mov.u64 %r66,%r124; mov.u64 %r68,%r126; mov.u64 %r64,%r121; mov.u64 %r110,8; .loc 1 271 9 mov.u64 %r76,0; .loc 1 276 13 mov.u64 %r234,%r76; $L48: st.u64 [%r111+-8],%r234; .loc 1 279 23 mul.lo.u64 %r57,%r64,%r68; .loc 1 280 24 mul.lo.u64 %r60,%r68,%r66; .loc 1 281 5 add.u64 %r76,%r76,1; .loc 1 282 7 setp.eq.u64 %r235,%r76,%r107; @ %r235 bra $L29; .loc 1 290 16 ld.u64 %r236,[%r111]; add.u64 %r63,%r236,1; st.u64 [%r111],%r63; .loc 1 291 23 add.u64 %r238,%r247,%r110; ld.u64 %r64,[%r238]; .loc 1 291 13 sub.u64 %r239,%r64,%r57; shl.b64 %r240,%r239,2; add.u64 %r70,%r70,%r240; .loc 1 292 24 add.u64 %r241,%frame,%r110; ld.u64 %r66,[%r241]; .loc 1 292 14 sub.u64 %r242,%r66,%r60; add.u64 %r73,%r73,%r242; .loc 1 295 32 add.u64 %r244,%r248,%r110; ld.u64 %r68,[%r244]; .loc 1 295 7 add.u64 %r111,%r111,8; add.u64 %r110,%r110,8; setp.eq.u64 %r245,%r63,%r68; @ %r245 bra $L48; bra $L49; $L29: .loc 1 298_gfortran_sminloc0_8_s4 .visible .func _gfortran_sminloc0_8_s, .param .u64 %in_ar4) {4u32 %r41; .reg .predu32 %r55; .reg .u64u32ar2; mov.u32 %r38,%ar3; mov.u64 %r39,%ar4; .loc 1 317 6 setp.eq.u64 %r40,%r37,0; @ %r40 bra $L66; .loc 1 317 20 ld.u32 %r41,[%r37]; setp.eq.u32 %r42,%r41,0; @ %r42 bra $L67; $L66: .loc 1 320 7 {call _gfortran_minloc0_8_s4324 7 bra $L65; $L67: .loc 1 327 8 ld.s8 %r30,[%r36+28]; .loc 1 329 6 setp.gt.s64 %r48,%r30,0; @ %r48 bra $L69; .loc 1 330 5 cvta.const.u64 %r49,$LC0; {stack; call _gfortran_runtime_error69: .loc 1 332 15 ld.u64 %r24,[%r35]; .loc 1 332 6 setp.ne.u64 %r51,%r24,0; @ %r51 bra $L70; .loc 1 334 7 st.u64 [%r35+48],%r24; add.u64 %r53,%r30,-1; st.u64 [%r35+56],%r53; mov.u64 %r54,1; st.u64 [%r35+40],%r54; .loc 1 335 28 cvt.u32.u64 %r55,%r54; st.u8 [%r35+28],%r55; .loc 1 336 24 st.u64 [%r35+8],%r24; .loc 1 337 29 mov.u64 call (%value_in),_gfortrani_xmallocarray37 27 st.u64 [%r35],%r24; bra $L71; $L70: .loc 1 339 12 cvta.global.u64 %r63,_gfortrani_compile_options; .loc 1 339 11 ld.u32 %r64,[%r63+36]; setp.eq.u32 %r65,%r64,0; @ %r65 bra $L71; .loc 1 341 8 cvta.const.u64 %r68,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 346 8 ld.u64 %r24,[%r35]; $L71: ld.u64 %r69,[%r35+40]; shl.b64 %r34,%r69,3; mov.u64 %r29,%r24; .loc 1 347 10 mov.u64 %r32,0; .loc 1 348 23 mov.u64 %r70,%r32; $L72: st.u64 [%r29],%r70; .loc 1 347 24 add.u64 %r32,%r32,1; .loc 1 347 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r71,%r30,%r32; @ %r71 bra $L72; $L65: .loc 1 349minloc0_16_s1.o/1622802249_gfortran_minloc0_16_s1 .visible .func _gfortran_minloc0_16_s1); .file 1 "../../../../libgfortran/generated/minloc0_16_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_16_s1 .visible .func _gfortran_mminloc0_16_s1_gfortran_sminloc0_16_s1 .visible .func _gfortran_sminloc0_16_s1in_ar4); // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func 
_gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_returnmemcmpmem_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_16_s1 .visible .func _gfortran_minloc0_16_s1predu64 %r146; .reg .predu64 %r162; .reg .predpredpred %r176; .reg .u64.loc 1 64 10 ld.s8 %r22,[%r99+28]; .loc 1 64 8 cvt.u32.u32 %r102,%r22; cvt.s64.s8 %r65,%r102; .loc 1 65 6 setp.gt.s64 %r103,%r65,0; @ %r103 bra $L2; .loc 1 66 5 cvta.const.u64 %r104,$LC0; {_gfortran_runtime_errorr23,[%r98]; .loc 1 3; .loc 1 70 7 st.u64 [%r98+48],%r23; add.u64 %r108,%r65,-1; st.u64 [%r98+56],%r108; mov.u64 %r109,1; st.u64 [%r98+40],%r109; .loc 1 71 28 cvt.u32.u64 %r110,%r109; st.u8 [%r98+28],%r110; .loc 1 72 24 st.u64 [%r98+8],%r23; .loc 1 73 29 mov.u64 %r113,1665call (%value_in),_gfortrani_xmallocarray116,[%value_in]; } mov.u64 %r23,%r116; .loc 1 73 27 st.u64 [%r98],%r23; bra $L4; $L3: .loc 1 77 11 cvta.global.u64 %r118,_gfortrani_compile_options; .loc 1 77 10 ld.u32 %r119,[%r118+36]; setp.eq.u32 %r120,%r119,0; @ %r120 bra $L4; .loc 1 783; call _gfortrani_bounds_iforeach_return.loc 1 83 8 ld.u64 %r23,[%r98]; $L4: .loc 1 82 11 ld.u64 %r66,[%r98+40]; add.u64 %r62,%r99,40; mov.u64 %r94,0; .loc 1 84 10 mov.u64 %r86,%r94; add.u64 %r175,%frame,120; add.u64 %r173,%frame,240; .loc 1 88 16 mov.u64 %r134,%r94; bra $L7; $L19: mov.u64 %r86,%r71; $L7: .loc 1 86 18 add.u64 %r124,%frame,%r94; .loc 1 86 51 ld.u64 %r126,[%r62]; mul.lo.u64 %r125,%r126,%r101; .loc 1 86 18 st.u64 [%r124],%r125; .loc 1 87 19 ld.u64 %r128,[%r62+16]; add.u64 %r127,%r128,1; ld.u64 %r129,[%r62+8]; sub.u64 %r36,%r127,%r129; .loc 1 87 17 add.u64 %r131,%r175,%r94; st.u64 [%r131],%r36; .loc 1 88 16 add.u64 %r133,%r173,%r94; st.u64 [%r133],%r134; .loc 1 89 10 setp.gt.s64 %r135,%r36,0; @ %r135 bra $L5; shl.b64 %r88,%r66,4; mov.u64 %r27,%r23; .loc 1 92 11 mov.u64 %r72,0; .loc 1 93 24 mov.u64 %r136,%r72; $L6: st.u64 [%r27],%r136; st.u64 [%r27+8],%r136; .loc 1 92 27 add.u64 %r72,%r72,1; .loc 1 92 4 add.u64 %r27,%r27,%r88; setp.ne.u64 %r138,%r65,%r72; @ %r138 bra $L6; bra $L1; $L5: .loc 1 84 26 add.u64 %r71,%r86,1; .loc 1 84 3 add.u64 %r62,%r62,24; add.u64 %r94,%r94,8; setp.ne.u64 %r139,%r65,%r71; @ %r139 bra $L19; .loc 1 98 8 ld.u64 %r53,[%r99]; shl.b64 %r50,%r66,4; mov.u64 %r55,%r23; .loc 1 101 10 mov.u64 %r70,0; .loc 1 102 23 mov.u64 %r140,1; mov.u64 %r141,%r70; $L8: st.u64 [%r55],%r140; st.u64 [%r55+8],%r141; mov.u64 %r87,%r70; .loc 1 101 26 add.u64 %r70,%r70,1; .loc 1 101 3 add.u64 %r55,%r55,%r50; setp.ne.u64 %r142,%r86,%r87; @ %r142 bra $L8; .loc 1 108 9 setp.ne.u64 %r143,%r53,0; @ ! 
%r143 bra $L1; .loc 1 123 19 ld.u64 %r40,[%frame]; .loc 1 125 34 ld.u64 %r43,[%frame+120]; cvt.u32.u32 %r145,%r22; cvt.s64.s8 %r144,%r145; shl.b64 %r146,%r144,3; add.u64 %r59,%r173,%r146; .loc 1 106 11 mov.u64 %r58,0; add.u64 %r174,%frame,248; .loc 1 114 8 setp.eq.u32 %r176,%r100,0; .loc 1 131 13 mov.u64 %r179,%r58; $L18: .loc 1 125 21 ld.u64 %r89,[%frame+240]; mov.u64 %r42,%r89; mov.u64 %r52,%r53; $L15: .loc 1 114 8 setp.eq.u64 %r148,%r58,0; @ %r148 bra $L11; .loc 1 41 12r101152,[%value_in]; } .loc 1 114 8 @ %r176 bra $L12; .loc 1 114 24 setp.gt.s32 %r155,%r152,0; @ %r155 bra $L13; $L11: mov.u64 %r69,%r174; .loc 1 125 21 mov.u64 %r73,%r23; mov.u64 %r90,%r42; bra $L14; $L12: .loc 1 114 24 setp.lt.s32 %r157,%r152,0; @ ! %r157 bra $L13; bra $L11; $L14: .loc 1 119 38 add.u64 %r158,%r90,1; st.u64 [%r73],%r158; shr.s64 %r159,%r158,63; st.u64 [%r73+8],%r159; .loc 1 118 7 add.u64 %r73,%r73,%r50; setp.eq.u64 %r160,%r59,%r69; @ %r160 bra $L20; .loc 1 119 34 ld.u64 %r90,[%r69]; add.u64 %r69,%r69,8; bra $L14; $L20: mov.u64 %r58,%r52; $L13: .loc 1 123 9 add.u64 %r52,%r52,%r40; .loc 1 125 14 add.u64 %r42,%r42,1; .loc 1 125 7 st.u64 [%frame+240],%r42; setp.ne.u64 %r161,%r42,%r43; @ %r161 bra $L15; .loc 1 123 9 sub.u64 %r162,%r43,%r89; mad.lo.u64 %r53,%r162,%r40,%r53; mov.u64 %r79,%r174; mov.u64 %r51,%r43; mov.u64 %r48,%r40; .loc 1 126 9 mov.u64 %r57,0; $L17: .loc 1 131 13 st.u64 [%r79+-8],%r179; .loc 1 134 23 mul.lo.u64 %r45,%r48,%r51; mov.u64 %r96,%r57; .loc 1 135 5 add.u64 %r57,%r57,1; .loc 1 136 7 setp.eq.u64 %r166,%r96,%r86; @ %r166 bra $L1; .loc 1 144 16 ld.u64 %r167,[%r79]; add.u64 %r47,%r167,1; st.u64 [%r79],%r47; shl.b64 %r76,%r57,3; .loc 1 145 23 add.u64 %r168,%frame,%r76; ld.u64 %r48,[%r168]; .loc 1 145 13 sub.u64 %r169,%r48,%r45; add.u64 %r53,%r53,%r169; .loc 1 148 32 add.u64 %r171,%r175,%r76; ld.u64 %r51,[%r171]; .loc 1 148 7 add.u64 %r79,%r79,8; setp.eq.u64 %r172,%r47,%r51; @ %r172 bra $L17; bra $L18; $L1: .loc 1 151_gfortran_mminloc0_16_s1 .visible .func _gfortran_mminloc0_16_s1in_ar4) {predpredu32 %r209; .reg .u64pred %r222; .reg .predpredmov.u32 %r130,%ar3; mov.u64 %r131,%ar4; .loc 1 177 6 setp.ne.u64 %r132,%r129,0; @ %r132 bra $L32; .loc 1 180 7r128130131; call _gfortran_minloc0_16_s1184 7 bra $L31; $L32: .loc 1 187 10 ld.s8 %r22,[%r128+28]; .loc 1 188 6 setp.gt.s32 %r137,%r22,0; @ %r137 bra $L34;_gfortran_runtime_error34: .loc 1 191 6 ld.u64 %r140,[%r127]; setp.ne.u64 %r141,%r140,0; @ %r141 bra $L35; .loc 1 193 7 st.u64 [%r127+48],%r140; add.u32 %r143,%r22,-1; cvt.s64.s32 %r144,%r143; st.u64 [%r127+56],%r144; mov.u64 %r145,1; st.u64 [%r127+40],%r145; .loc 1 194 28 cvt.u32.u64 %r146,%r145; st.u8 [%r127+28],%r146; .loc 1 195 24 st.u64 [%r127+8],%r140; .loc 1 196 29 cvt.u32.u32 %r151,%r22; cvt.s64.s8 %r150,%r151; mov.u64 %r149,16call (%value_in),_gfortrani_xmallocarray152,[%value_in]; } .loc 1 196 27 st.u64 [%r127],%r152; bra $L36; $L35: .loc 1 200 11 cvta.global.u64 %r154,_gfortrani_compile_options; .loc 1 200 10 ld.u32 %r155,[%r154+36]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L36; .loc 1 203 4r12159; call _gfortrani_bounds_iforeach_return.loc 1 205 4 cvta.const.u64 %r16262159; call _gfortrani_bounds_equal_extents$L36: .loc 1 210 15 ld.u64 %r32,[%r129+16]; .loc 1 212 9 ld.u64 %r81,[%r129]; .loc 1 214 22 cvt.u32.u64 %r33,%r32; .loc 1 214 53 add.u32 %r164,%r33,-4; and.b32 %r165,%r164,-5; set.u32.eq.u32 %r167,%r165,0; neg.s32 %r168,%r167; .loc 1 214 22 add.u32 %r169,%r33,-1; set.u32.le.u32 %r171,%r169,1; neg.s32 %r172,%r171; .loc 1 214 6 cvt.u16.u32 %r174,%r168; cvt.u16.u32 %r175,%r172; 
or.b16 %r173,%r174,%r175; cvt.u32.u16 %r176,%r173; cvt.u16.u8 %r177,%r176; setp.ne.u16 %r178,%r177,0; @ %r178 bra $L37; .loc 1 216 7 setp.ne.u32 %r180,%r33,16; @ %r180 bra $L38; $L37: .loc 1 223 11 ld.u64 %r93,[%r127+40]; .loc 1 224 8 ld.u64 %r94,[%r127]; .loc 1 225 17 cvt.u32.u32 %r181,%r22; cvt.s64.s8 %r112,%r181; add.u64 %r86,%r128,40; add.u64 %r76,%r129,40; mov.u64 %r58,0; .loc 1 225 10 mov.u64 %r83,%r58; add.u64 %r244,%frame,120; add.u64 %r245,%frame,240; add.u64 %r246,%frame,360; .loc 1 230 16 mov.u64 %r198,%r58; bra $L39; $L38: .loc 1 221 5_gfortran_runtime_error39: .loc 1 227 18 add.u64 %r185,%r244,%r58; .loc 1 227 51 ld.u64 %r187,[%r86]; mul.lo.u64 %r186,%r187,%r131; .loc 1 227 18 st.u64 [%r185],%r186; .loc 1 228 18 add.u64 %r188,%frame,%r58; .loc 1 228 20 ld.u64 %r190,[%r76]; mul.lo.u64 %r189,%r190,%r32; .loc 1 228 18 st.u64 [%r188],%r189; .loc 1 229 19 ld.u64 %r192,[%r86+16]; add.u64 %r191,%r192,1; ld.u64 %r193,[%r86+8]; sub.u64 %r50,%r191,%r193; .loc 1 229 17 add.u64 %r195,%r245,%r58; st.u64 [%r195],%r50; .loc 1 230 16 add.u64 %r197,%r246,%r58; st.u64 [%r197],%r198; .loc 1 231 10 setp.gt.s64 %r199,%r50,0; @ %r199 bra $L40; shl.b64 %r123,%r93,4; mov.u64 %r38,%r94; .loc 1 234 11 mov.u64 %r96,0; .loc 1 235 24 mov.u64 %r200,%r96; $L41: st.u64 [%r38],%r200; st.u64 [%r38+8],%r200; .loc 1 234 27 add.u64 %r96,%r96,1; .loc 1 234 4 add.u64 %r38,%r38,%r123; setp.ne.u64 %r202,%r96,%r112; @ %r202 bra $L41; bra $L31; $L40: .loc 1 225 26 add.u64 %r83,%r83,1; .loc 1 225 3 add.u64 %r86,%r86,24; add.u64 %r76,%r76,24; add.u64 %r58,%r58,8; setp.ne.u64 %r203,%r83,%r112; @ %r203 bra $L39; .loc 1 240 8 ld.u64 %r79,[%r128]; shl.b64 %r98,%r93,4; mov.u64 %r100,%r94; .loc 1 243 10 mov.u64 %r95,0; .loc 1 244 23 mov.u64 %r204,%r95; $L43: st.u64 [%r100],%r204; st.u64 [%r100+8],%r204; .loc 1 243 26 add.u64 %r95,%r95,1; .loc 1 243 3 add.u64 %r100,%r100,%r98; setp.ne.u64 %r206,%r95,%r112; @ %r206 bra $L43; .loc 1 251 9 setp.eq.u64 %r207,%r79,0; @ %r207 bra $L31; .loc 1 267 19 ld.u64 %r59,[%frame+120]; .loc 1 268 20 ld.u64 %r61,[%frame]; .loc 1 270 34 ld.u64 %r64,[%frame+240]; cvt.u32.u32 %r209,%r22; cvt.s64.s8 %r208,%r209; shl.b64 %r210,%r208,3; add.u64 %r102,%r246,%r210; .loc 1 249 10 mov.u64 %r85,0; add.u64 %r243,%frame,368; .loc 1 257 6 setp.eq.u32 %r247,%r130,0; .loc 1 276 13 mov.u64 %r250,%r85; $L50: .loc 1 270 21 ld.u64 %r124,[%frame+360]; mov.u64 %r63,%r124; mov.u64 %r80,%r81; mov.u64 %r78,%r79; $L48: .loc 1 257 6 ld.s8 %r213,[%r80]; cvt.u16.u32 %r212,%r213; setp.eq.u16 %r214,%r212,0; @ %r214 bra $L44; .loc 1 257 14 setp.eq.u64 %r215,%r85,0; @ %r215 bra $L45; .loc 1 41 12131219,[%value_in]; } .loc 1 257 6 @ %r247 bra $L46; .loc 1 258 23 setp.gt.s32 %r222,%r219,0; @ %r222 bra $L44; $L45: mov.u64 %r107,%r243; .loc 1 270 21 mov.u64 %r108,%r94; mov.u64 %r125,%r63; bra $L47; $L46: .loc 1 258 23 setp.lt.s32 %r224,%r219,0; @ ! 
%r224 bra $L44; bra $L45; $L47: .loc 1 263 38 add.u64 %r225,%r125,1; st.u64 [%r108],%r225; shr.s64 %r226,%r225,63; st.u64 [%r108+8],%r226; .loc 1 262 7 add.u64 %r108,%r108,%r98; setp.eq.u64 %r227,%r102,%r107; @ %r227 bra $L51; .loc 1 263 34 ld.u64 %r125,[%r107]; add.u64 %r107,%r107,8; bra $L47; $L51: mov.u64 %r85,%r78; $L44: .loc 1 267 9 add.u64 %r78,%r78,%r59; .loc 1 268 10 add.u64 %r80,%r80,%r61; .loc 1 270 14 add.u64 %r63,%r63,1; .loc 1 270 7 st.u64 [%frame+360],%r63; setp.ne.u64 %r228,%r63,%r64; @ %r228 bra $L48; sub.u64 %r118,%r64,%r124; .loc 1 267 9 mad.lo.u64 %r79,%r59,%r118,%r79; .loc 1 268 10 mad.lo.u64 %r81,%r61,%r118,%r81; mov.u64 %r115,%r243; mov.u64 %r74,%r61; mov.u64 %r77,%r64; mov.u64 %r72,%r59; mov.u64 %r114,8; .loc 1 271 9 mov.u64 %r84,0; $L49: .loc 1 276 13 st.u64 [%r115+-8],%r250; .loc 1 279 23 mul.lo.u64 %r66,%r72,%r77; .loc 1 280 24 mul.lo.u64 %r69,%r77,%r74; .loc 1 281 5 add.u64 %r84,%r84,1; .loc 1 282 7 setp.eq.u64 %r233,%r84,%r112; @ %r233 bra $L31; .loc 1 290 16 ld.u64 %r234,[%r115]; add.u64 %r71,%r234,1; st.u64 [%r115],%r71; .loc 1 291 23 add.u64 %r236,%r244,%r114; ld.u64 %r72,[%r236]; .loc 1 291 13 sub.u64 %r237,%r72,%r66; add.u64 %r79,%r79,%r237; .loc 1 292 24 add.u64 %r238,%frame,%r114; ld.u64 %r74,[%r238]; .loc 1 292 14 sub.u64 %r239,%r74,%r69; add.u64 %r81,%r81,%r239; .loc 1 295 32 add.u64 %r241,%r245,%r114; ld.u64 %r77,[%r241]; .loc 1 295 7 add.u64 %r115,%r115,8; add.u64 %r114,%r114,8; setp.eq.u64 %r242,%r71,%r77; @ %r242 bra $L49; bra $L50; $L31: .loc 1 298_gfortran_sminloc0_16_s1 .visible .func _gfortran_sminloc0_16_s1in_ar4) {4u32 %r41; .reg .predu32 %r55; .reg .u64u32ar2; mov.u32 %r38,%ar3; mov.u64 %r39,%ar4; .loc 1 317 6 setp.eq.u64 %r40,%r37,0; @ %r40 bra $L70; .loc 1 317 20 ld.u32 %r41,[%r37]; setp.eq.u32 %r42,%r41,0; @ %r42 bra $L71; $L70: .loc 1 320 7 {call _gfortran_minloc0_16_s1324 7 bra $L69; $L71: .loc 1 327 8 ld.s8 %r30,[%r36+28]; .loc 1 329 6 setp.gt.s64 %r48,%r30,0; @ %r48 bra $L73; .loc 1 330 5 cvta.const.u64 %r49,$LC0; {stack; call _gfortran_runtime_error73: .loc 1 332 15 ld.u64 %r24,[%r35]; .loc 1 332 6 setp.ne.u64 %r51,%r24,0; @ %r51 bra $L74; .loc 1 334 7 st.u64 [%r35+48],%r24; add.u64 %r53,%r30,-1; st.u64 [%r35+56],%r53; mov.u64 %r54,1; st.u64 [%r35+40],%r54; .loc 1 335 28 cvt.u32.u64 %r55,%r54; st.u8 [%r35+28],%r55; .loc 1 336 24 st.u64 [%r35+8],%r24; .loc 1 337 29call (%value_in),_gfortrani_xmallocarray37 27 st.u64 [%r35],%r24; bra $L75; $L74: .loc 1 339 12 cvta.global.u64 %r63,_gfortrani_compile_options; .loc 1 339 11 ld.u32 %r64,[%r63+36]; setp.eq.u32 %r65,%r64,0; @ %r65 bra $L75; .loc 1 341 8 cvta.const.u64 %r68,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 346 8 ld.u64 %r24,[%r35]; $L75: ld.u64 %r69,[%r35+40]; shl.b64 %r34,%r69,4; mov.u64 %r29,%r24; .loc 1 347 10 mov.u64 %r32,0; .loc 1 348 23 mov.u64 %r70,%r32; $L76: st.u64 [%r29],%r70; st.u64 [%r29+8],%r70; .loc 1 347 24 add.u64 %r32,%r32,1; .loc 1 347 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r72,%r30,%r32; @ %r72 bra $L76; $L69: .loc 1 349minloc0_16_s4.o/1622802249_gfortran_minloc0_16_s4 .visible .func _gfortran_minloc0_16_s4); .file 1 "../../../../libgfortran/generated/minloc0_16_s4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_16_s4 .visible .func _gfortran_mminloc0_16_s_gfortran_sminloc0_16_s4 .visible .func _gfortran_sminloc0_16_sVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: 
_gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_memcmp_char4_gfortrani_memcmp_char4_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_16_s4 .visible .func _gfortran_minloc0_16_s4u64 %r104; .reg .u32 %r105; .reg .pred %r106; .reg .u64 %r107; .reg .predpredpredmov.u64 %r101,%ar0; mov.u64 %r102,%ar1; mov.u32 %r103,%ar2; mov.u64 %r104,%ar3; .loc 1 64 10 ld.s8 %r22,[%r102+28]; .loc 1 64 8 cvt.u32.u32 %r105,%r22; cvt.s64.s8 %r62,%r105; .loc 1 65 6 setp.gt.s64 %r106,%r62,0; @ %r106 bra $L2; .loc 1 66 5 cvta.const.u64 %r107,$LC0; {stack; call _gfortran_runtime_errorr23,[%r101]; .loc 1 68 6 setp.ne.u64 %r109,%r23,0; @ %r109 bra $L3; .loc 1 70 7 st.u64 [%r101+48],%r23; add.u64 %r111,%r62,-1; st.u64 [%r101+56],%r111; mov.u64 %r112,1; st.u64 [%r101+40],%r112; .loc 1 71 28 cvt.u32.u64 %r113,%r112; st.u8 [%r101+28],%r113; .loc 1 72 24 st.u64 [%r101+8],%r23; .loc 1 73 29 mov.u64 %r116,16116; call (%value_in),_gfortrani_xmallocarray119,[%value_in]; } mov.u64 %r23,%r119; .loc 1 73 27 st.u64 [%r101],%r23; bra $L4; $L3: .loc 1 77 11 cvta.global.u64 %r121,_gfortrani_compile_options; .loc 1 77 10 ld.u32 %r122,[%r121+36]; setp.eq.u32 %r123,%r122,0; @ %r123 bra $L4; .loc 1 78 2 cvta.const.u64 %r126,$LC1; {r1026; call _gfortrani_bounds_iforeach_return.loc 1 83 8 ld.u64 %r23,[%r101]; $L4: .loc 1 82 11 ld.u64 %r63,[%r101+40]; add.u64 %r59,%r102,40; mov.u64 %r99,0; .loc 1 84 10 mov.u64 %r82,%r99; add.u64 %r180,%frame,120; add.u64 %r181,%frame,240; .loc 1 88 16 mov.u64 %r137,%r99; bra $L7; $L20: mov.u64 %r82,%r69; $L7: .loc 1 86 18 add.u64 %r127,%frame,%r99; .loc 1 86 51 ld.u64 %r129,[%r59]; mul.lo.u64 %r128,%r129,%r104; .loc 1 86 18 st.u64 [%r127],%r128; .loc 1 87 19 ld.u64 %r131,[%r59+16]; add.u64 %r130,%r131,1; ld.u64 %r132,[%r59+8]; sub.u64 %r35,%r130,%r132; .loc 1 87 17 add.u64 %r134,%r180,%r99; st.u64 [%r134],%r35; .loc 1 88 16 add.u64 %r136,%r181,%r99; st.u64 [%r136],%r137; .loc 1 89 10 setp.gt.s64 %r138,%r35,0; @ %r138 bra $L5; shl.b64 %r81,%r63,4; mov.u64 %r85,%r23; .loc 1 92 11 mov.u64 %r70,0; .loc 1 93 24 mov.u64 %r139,%r70; $L6: st.u64 [%r85],%r139; st.u64 [%r85+8],%r139; .loc 1 92 27 add.u64 %r70,%r70,1; .loc 1 92 4 add.u64 %r85,%r85,%r81; setp.ne.u64 %r141,%r62,%r70; @ %r141 bra $L6; bra $L1; $L5: .loc 1 84 26 add.u64 %r69,%r82,1; .loc 1 84 3 add.u64 %r59,%r59,24; add.u64 %r99,%r99,8; setp.ne.u64 %r142,%r62,%r69; @ %r142 bra $L20; .loc 1 98 8 ld.u64 %r50,[%r102]; shl.b64 %r47,%r63,4; mov.u64 %r52,%r23; .loc 1 101 10 mov.u64 %r68,0; .loc 1 102 23 mov.u64 %r143,1; mov.u64 %r144,%r68; $L8: st.u64 [%r52],%r143; st.u64 [%r52+8],%r144; mov.u64 %r83,%r68; .loc 1 101 26 add.u64 %r68,%r68,1; .loc 1 101 3 add.u64 %r52,%r52,%r47; setp.ne.u64 %r145,%r82,%r83; @ %r145 bra $L8; .loc 1 108 9 setp.ne.u64 %r146,%r50,0; @ ! 
%r146 bra $L1; .loc 1 123 19 ld.u64 %r90,[%frame]; .loc 1 123 9 shl.b64 %r92,%r90,2; .loc 1 125 34 ld.u64 %r93,[%frame+120]; cvt.u32.u32 %r148,%r22; cvt.s64.s8 %r147,%r148; shl.b64 %r149,%r147,3; add.u64 %r56,%r181,%r149; .loc 1 106 11 mov.u64 %r55,0; add.u64 %r182,%frame,248; .loc 1 114 8 setp.eq.u32 %r183,%r103,0; $L19: .loc 1 125 21 ld.u64 %r94,[%frame+240]; mov.u64 %r40,%r94; mov.u64 %r49,%r50; $L16: .loc 1 114 8 setp.ne.u64 %r151,%r55,0; @ %r151 bra $L11; $L14: mov.u64 %r67,%r182; .loc 1 125 21 mov.u64 %r73,%r23; mov.u64 %r95,%r40; bra $L12; $L11: .loc 1 114 8 @ %r183 bra $L13;104; call (%value_in),_gfortrani_memcmp_char4157,[%value_in]; } .loc 1 114 24 setp.le.s32 %r158,%r157,0; @ ! %r158 bra $L15; bra $L14104; call (%value_in),_gfortrani_memcmp_char4162,[%value_in]; } .loc 1 114 24 setp.lt.s32 %r163,%r162,0; @ ! %r163 bra $L15; bra $L14; $L12: .loc 1 119 38 add.u64 %r164,%r95,1; st.u64 [%r73],%r164; shr.s64 %r165,%r164,63; st.u64 [%r73+8],%r165; .loc 1 118 7 add.u64 %r73,%r73,%r47; setp.eq.u64 %r166,%r56,%r67; @ %r166 bra $L21; .loc 1 119 34 ld.u64 %r95,[%r67]; add.u64 %r67,%r67,8; bra $L12; $L21: .loc 1 117 14 mov.u64 %r55,%r49; $L15: .loc 1 123 9 add.u64 %r49,%r49,%r92; .loc 1 125 14 add.u64 %r40,%r40,1; .loc 1 125 7 st.u64 [%frame+240],%r40; setp.ne.u64 %r167,%r40,%r93; @ %r167 bra $L16; .loc 1 123 9 sub.u64 %r168,%r93,%r94; mad.lo.u64 %r50,%r168,%r92,%r50; mov.u64 %r79,%r182; mov.u64 %r48,%r93; mov.u64 %r45,%r90; .loc 1 126 9 mov.u64 %r54,0; .loc 1 131 13 mov.u64 %r171,%r54; $L18: st.u64 [%r79+-8],%r171; .loc 1 134 23 mul.lo.u64 %r42,%r45,%r48; mov.u64 %r37,%r54; .loc 1 135 5 add.u64 %r54,%r54,1; .loc 1 136 7 setp.eq.u64 %r172,%r37,%r82; @ %r172 bra $L1; .loc 1 144 16 ld.u64 %r173,[%r79]; add.u64 %r44,%r173,1; st.u64 [%r79],%r44; shl.b64 %r76,%r54,3; .loc 1 145 23 add.u64 %r174,%frame,%r76; ld.u64 %r45,[%r174]; .loc 1 145 13 sub.u64 %r175,%r45,%r42; shl.b64 %r176,%r175,2; add.u64 %r50,%r50,%r176; .loc 1 148 32 add.u64 %r178,%r180,%r76; ld.u64 %r48,[%r178]; .loc 1 148 7 add.u64 %r79,%r79,8; setp.eq.u64 %r179,%r44,%r48; @ %r179 bra $L18; bra $L19; $L1: .loc 1 151_gfortran_mminloc0_16_s4 .visible .func _gfortran_mminloc0_16_s, .param .u64 %in_ar4) {9pred %r135; .reg .predpredpred %r159; .reg .u64.reg .u32 %r179; .reg .u16 %r180; .reg .pred %r181; .reg .pred.reg .pred %r206; .reg .u64 %r207; .reg .pred %r209; .reg .pred %r210; .reg .u64u16 %r215; .reg .u32 %r216; .reg .pred %r217; .reg .pred %r218; .reg .u32 %r224; .reg .pred %r225; .reg .u32 %r229; .reg .pred %r230; .reg .u64 %r231; .reg .u64 %r232; .reg .pred %r233; .reg .pred.reg .u64 %r252; .reg .u64 %r253; .reg .pred %r254; mov.u64 %r130,%ar0; mov.u64 %r131,%ar1; mov.u64 %r132,%ar2; mov.u32 %r133,%ar3; mov.u64 %r134,%ar4; .loc 1 177 6 setp.ne.u64 %r135,%r132,0; @ %r135 bra $L3011133134; call _gfortran_minloc0_16_s4184 7 bra $L29; $L30: .loc 1 187 10 ld.s8 %r22,[%r131+28]; .loc 1 188 6 setp.gt.s32 %r140,%r22,0; @ %r140 bra $L32; .loc 1 189 5 cvta.const.u64 %r1411stack; call _gfortran_runtime_error32: .loc 1 191 6 ld.u64 %r143,[%r130]; setp.ne.u64 %r144,%r143,0; @ %r144 bra $L33; .loc 1 193 7 st.u64 [%r130+48],%r143; add.u32 %r146,%r22,-1; cvt.s64.s32 %r147,%r146; st.u64 [%r130+56],%r147; mov.u64 %r148,1; st.u64 [%r130+40],%r148; .loc 1 194 28 cvt.u32.u64 %r149,%r148; st.u8 [%r130+28],%r149; .loc 1 195 24 st.u64 [%r130+8],%r143; .loc 1 196 29 cvt.u32.u32 %r154,%r22; cvt.s64.s8 %r153,%r154; mov.u64 %r152,16call (%value_in),_gfortrani_xmallocarray155,[%value_in]; } .loc 1 196 27 st.u64 [%r130],%r155; bra $L34; $L33: .loc 1 200 11 
cvta.global.u64 %r157,_gfortrani_compile_options; .loc 1 200 10 ld.u32 %r158,[%r157+36]; setp.eq.u32 %r159,%r158,0r1162; call _gfortrani_bounds_iforeach_return.loc 1 205 4 cvta.const.u64 %r1651162; call _gfortrani_bounds_equal_extents$L34: .loc 1 210 15 ld.u64 %r32,[%r132+16]; .loc 1 212 9 ld.u64 %r75,[%r132]; .loc 1 214 22 cvt.u32.u64 %r33,%r32; .loc 1 214 53 add.u32 %r167,%r33,-4; and.b32 %r168,%r167,-5; set.u32.eq.u32 %r170,%r168,0; neg.s32 %r171,%r170; .loc 1 214 22 add.u32 %r172,%r33,-1; set.u32.le.u32 %r174,%r172,1; neg.s32 %r175,%r174; .loc 1 214 6 cvt.u16.u32 %r177,%r171; cvt.u16.u32 %r178,%r175; or.b16 %r176,%r177,%r178; cvt.u32.u16 %r179,%r176; cvt.u16.u8 %r180,%r179; setp.ne.u16 %r181,%r180,0; @ %r181 bra $L35; .loc 1 216 7 setp.ne.u32 %r183,%r33,16; @ %r183 bra $L36; $L35: .loc 1 223 11 ld.u64 %r86,[%r130+40]; .loc 1 224 8 ld.u64 %r87,[%r130]; .loc 1 225 17 cvt.u32.u32 %r184,%r22; cvt.s64.s8 %r106,%r184; add.u64 %r80,%r131,40; add.u64 %r73,%r132,40; mov.u64 %r56,0; .loc 1 225 10 mov.u64 %r77,%r56; add.u64 %r252,%frame,120; add.u64 %r253,%frame,240; add.u64 %r250,%frame,360; .loc 1 230 16 mov.u64 %r201,%r56; bra $L37; $L36: .loc 1 221 5_gfortran_runtime_error37: .loc 1 227 18 add.u64 %r188,%r252,%r56; .loc 1 227 51 ld.u64 %r190,[%r80]; mul.lo.u64 %r189,%r190,%r134; .loc 1 227 18 st.u64 [%r188],%r189; .loc 1 228 18 add.u64 %r191,%frame,%r56; .loc 1 228 20 ld.u64 %r193,[%r73]; mul.lo.u64 %r192,%r193,%r32; .loc 1 228 18 st.u64 [%r191],%r192; .loc 1 229 19 ld.u64 %r195,[%r80+16]; add.u64 %r194,%r195,1; ld.u64 %r196,[%r80+8]; sub.u64 %r49,%r194,%r196; .loc 1 229 17 add.u64 %r198,%r253,%r56; st.u64 [%r198],%r49; .loc 1 230 16 add.u64 %r200,%r250,%r56; st.u64 [%r200],%r201; .loc 1 231 10 setp.gt.s64 %r202,%r49,0; @ %r202 bra $L38; shl.b64 %r114,%r86,4; mov.u64 %r116,%r87; .loc 1 234 11 mov.u64 %r91,0; .loc 1 235 24 mov.u64 %r203,%r91; $L39: st.u64 [%r116],%r203; st.u64 [%r116+8],%r203; .loc 1 234 27 add.u64 %r91,%r91,1; .loc 1 234 4 add.u64 %r116,%r116,%r114; setp.ne.u64 %r205,%r91,%r106; @ %r205 bra $L39; bra $L29; $L38: .loc 1 225 26 add.u64 %r77,%r77,1; .loc 1 225 3 add.u64 %r80,%r80,24; add.u64 %r73,%r73,24; add.u64 %r56,%r56,8; setp.ne.u64 %r206,%r77,%r106; @ %r206 bra $L37; .loc 1 240 8 ld.u64 %r72,[%r131]; shl.b64 %r95,%r86,4; mov.u64 %r97,%r87; .loc 1 243 10 mov.u64 %r90,0; .loc 1 244 23 mov.u64 %r207,%r90; $L41: st.u64 [%r97],%r207; st.u64 [%r97+8],%r207; .loc 1 243 26 add.u64 %r90,%r90,1; .loc 1 243 3 add.u64 %r97,%r97,%r95; setp.ne.u64 %r209,%r90,%r106; @ %r209 bra $L41; .loc 1 251 9 setp.eq.u64 %r210,%r72,0; @ %r210 bra $L29; .loc 1 267 19 ld.u64 %r122,[%frame+120]; .loc 1 267 9 shl.b64 %r124,%r122,2; .loc 1 268 20 ld.u64 %r125,[%frame]; .loc 1 270 34 ld.u64 %r127,[%frame+240]; cvt.u32.u32 %r212,%r22; cvt.s64.s8 %r211,%r212; shl.b64 %r213,%r211,3; add.u64 %r99,%r250,%r213; .loc 1 249 10 mov.u64 %r79,0; add.u64 %r251,%frame,368; .loc 1 257 6 setp.eq.u32 %r254,%r133,0; $L49: .loc 1 270 21 ld.u64 %r128,[%frame+360]; mov.u64 %r57,%r128; mov.u64 %r74,%r75; mov.u64 %r71,%r72; $L47: .loc 1 257 6 ld.s8 %r216,[%r74]; cvt.u16.u32 %r215,%r216; setp.eq.u16 %r217,%r215,0; @ %r217 bra $L42; .loc 1 257 14 setp.ne.u64 %r218,%r79,0; @ %r218 bra $L43; $L46: mov.u64 %r104,%r251; .loc 1 270 21 mov.u64 %r105,%r87; mov.u64 %r129,%r57; bra $L44; $L43: .loc 1 257 6 @ %r254 bra $L45; .loc 1 43134; call (%value_in),_gfortrani_memcmp_char4224,[%value_in]; } .loc 1 258 23 setp.le.s32 %r225,%r224,0; @ ! 
%r225 bra $L42; bra $L46; $L45: .loc 1 43134; call (%value_in),_gfortrani_memcmp_char4229,[%value_in]; } .loc 1 258 23 setp.lt.s32 %r230,%r229,0; @ ! %r230 bra $L42; bra $L46; $L44: .loc 1 263 38 add.u64 %r231,%r129,1; st.u64 [%r105],%r231; shr.s64 %r232,%r231,63; st.u64 [%r105+8],%r232; .loc 1 262 7 add.u64 %r105,%r105,%r95; setp.eq.u64 %r233,%r99,%r104; @ %r233 bra $L50; .loc 1 263 34 ld.u64 %r129,[%r104]; add.u64 %r104,%r104,8; bra $L44; $L50: .loc 1 261 14 mov.u64 %r79,%r71; $L42: .loc 1 267 9 add.u64 %r71,%r71,%r124; .loc 1 268 10 add.u64 %r74,%r74,%r125; .loc 1 270 14 add.u64 %r57,%r57,1; .loc 1 270 7 st.u64 [%frame+360],%r57; setp.ne.u64 %r234,%r57,%r127; @ %r234 bra $L47; sub.u64 %r119,%r127,%r128; .loc 1 267 9 mad.lo.u64 %r72,%r119,%r124,%r72; .loc 1 268 10 mad.lo.u64 %r75,%r119,%r125,%r75; mov.u64 %r112,%r251; mov.u64 %r68,%r125; mov.u64 %r70,%r127; mov.u64 %r66,%r122; mov.u64 %r111,8; .loc 1 271 9 mov.u64 %r78,0; .loc 1 276 13 mov.u64 %r238,%r78; $L48: st.u64 [%r112+-8],%r238; .loc 1 279 23 mul.lo.u64 %r59,%r66,%r70; .loc 1 280 24 mul.lo.u64 %r62,%r70,%r68; .loc 1 281 5 add.u64 %r78,%r78,1; .loc 1 282 7 setp.eq.u64 %r239,%r78,%r106; @ %r239 bra $L29; .loc 1 290 16 ld.u64 %r240,[%r112]; add.u64 %r65,%r240,1; st.u64 [%r112],%r65; .loc 1 291 23 add.u64 %r242,%r252,%r111; ld.u64 %r66,[%r242]; .loc 1 291 13 sub.u64 %r243,%r66,%r59; shl.b64 %r244,%r243,2; add.u64 %r72,%r72,%r244; .loc 1 292 24 add.u64 %r245,%frame,%r111; ld.u64 %r68,[%r245]; .loc 1 292 14 sub.u64 %r246,%r68,%r62; add.u64 %r75,%r75,%r246; .loc 1 295 32 add.u64 %r248,%r253,%r111; ld.u64 %r70,[%r248]8; setp.eq.u64 %r249,%r65,%r70; @ %r249 bra $L48; bra $L49; $L29: .loc 1 298_gfortran_sminloc0_16_s4 .visible .func _gfortran_sminloc0_16_s, .param .u64 %in_ar4) {4u32 %r41; .reg .predu32 %r55; .reg .u64u32ar2; mov.u32 %r38,%ar3; mov.u64 %r39,%ar4; .loc 1 317 6 setp.eq.u64 %r40,%r37,0; @ %r40 bra $L66; .loc 1 317 20 ld.u32 %r41,[%r37]; setp.eq.u32 %r42,%r41,0; @ %r42 bra $L67; $L66: .loc 1 320 7 {call _gfortran_minloc0_16_s4324 7 bra $L65; $L67: .loc 1 327 8 ld.s8 %r30,[%r36+28]; .loc 1 329 6 setp.gt.s64 %r48,%r30,0; @ %r48 bra $L69; .loc 1 330 5 cvta.const.u64 %r49,$LC0; {stack; call _gfortran_runtime_error69: .loc 1 332 15 ld.u64 %r24,[%r35]; .loc 1 332 6 setp.ne.u64 %r51,%r24,0; @ %r51 bra $L70; .loc 1 334 7 st.u64 [%r35+48],%r24; add.u64 %r53,%r30,-1; st.u64 [%r35+56],%r53; mov.u64 %r54,1; st.u64 [%r35+40],%r54; .loc 1 335 28 cvt.u32.u64 %r55,%r54; st.u8 [%r35+28],%r55; .loc 1 336 24 st.u64 [%r35+8],%r24; .loc 1 337 29call (%value_in),_gfortrani_xmallocarray37 27 st.u64 [%r35],%r24; bra $L71; $L70: .loc 1 339 12 cvta.global.u64 %r63,_gfortrani_compile_options; .loc 1 339 11 ld.u32 %r64,[%r63+36]; setp.eq.u32 %r65,%r64,0; @ %r65 bra $L71; .loc 1 341 8 cvta.const.u64 %r68,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 346 8 ld.u64 %r24,[%r35]; $L71: ld.u64 %r69,[%r35+40]; shl.b64 %r34,%r69,4; mov.u64 %r29,%r24; .loc 1 347 10 mov.u64 %r32,0; .loc 1 348 23 mov.u64 %r70,%r32; $L72: st.u64 [%r29],%r70; st.u64 [%r29+8],%r70; .loc 1 347 24 add.u64 %r32,%r32,1; .loc 1 347 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r72,%r30,%r32; @ %r72 bra $L72; $L65: .loc 1 349 maxloc1_4_s1.o/ 1622802249_gfortran_maxloc1_4_s1 .visible .func _gfortran_maxloc1_4_s1fortran/generated/maxloc1_4_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_4_s1 .visible .func _gfortran_mmaxloc1_4_s1_gfortran_smaxloc1_4_s1 .visible .func _gfortran_smaxloc1_4_s1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 
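For orientation only (this is not the libgfortran source, which walks array descriptors of arbitrary rank): a minimal C sketch of what a minloc0-style routine computes, namely the 1-based position of the smallest fixed-length character element, with kind-1 elements compared via memcmp and kind-4 elements via _gfortrani_memcmp_char4 in the real code. The function name and parameters below are hypothetical.

#include <stddef.h>
#include <string.h>

/* Hypothetical sketch: return the 1-based position of the smallest of n
   fixed-length character elements stored back to back in base, each len
   bytes long, or 0 when n == 0.  The real minloc0_* routines additionally
   walk an array descriptor of arbitrary rank, and the m- and s-variants
   apply an array or scalar logical mask. */
static size_t minloc0_sketch(const char *base, size_t n, size_t len)
{
    size_t best = 0;                               /* 0 means "none yet" */
    for (size_t i = 0; i < n; i++) {
        const char *cur = base + i * len;
        if (best == 0 || memcmp(cur, base + (best - 1) * len, len) < 0)
            best = i + 1;                          /* Fortran is 1-based */
    }
    return best;
}

Per libgfortran's naming convention, the _4/_8/_16 in the real symbol names appears to select the integer kind of the returned positions, and _s1/_s4 the character kind being compared.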
maxloc1_4_s1.o, archive timestamp 1622802249, from ../../../../libgfortran/generated/maxloc1_4_s1.c:
  visible functions: _gfortran_maxloc1_4_s1, _gfortran_mmaxloc1_4_s1, _gfortran_smaxloc1_4_s1
  external symbols: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, memcmp, _gfortrani_bounds_equal_extents
  string constants (decoded from the embedded byte arrays, truncated where the dump is damaged): "MAXLOC", "MASK argument", "Funny sized logical arr…", "…incorrect in MAXLOC intrinsic: is %ld, should be between 1 and %ld", "Rank of return array incorrect in MAXLOC intrinsic: is %ld, should be %ld", "Incorrect extent in return value of MAXLOC intrinsic in dimension %ld: is %ld, should be %ld", "return value"
  bodies: _gfortran_maxloc1_4_s1, followed by _gfortran_mmaxloc1_4_s1, which is cut off mid-function where this fragment ends; the _gfortran_smaxloc1_4_s1 body does not survive here.

A conceptual sketch of the along-one-dimension reduction these maxloc1 entry points perform follows below.
1 329 4 bra $L54; $L115: .loc 1 332 24382; call (%value_in),_gfortrani_xmallocarray83,[%value_in]; } .loc 1 332 22 st.u64 [%r230],%r383; bra $L79; $L69: .loc 1 337 19 ld.s8 %r385,[%r230+28]; .loc 1 337 10 setp.eq.u64 %r386,%r385,%r152; @ %r386 bra $L80; .loc 1 338 2 cvta.const.u64 %r387387_gfortran_runtime_error80: .loc 1 340 11 cvta.global.u64 %r389,_gfortrani_compile_options; .loc 1 340 10 ld.u32 %r390,[%r389+36]; setp.eq.u32 %r391,%r390,0; @ %r391 bra $L79; .loc 1 342 4 add.u64 %r455,%frame,360; cvta.const.u64 %r395,$LC2r455394395; call _gfortrani_bounds_ifunction_return344 4 cvta.const.u64 %r399,$LC3r2399395; call _gfortrani_bounds_equal_extents$L79: .loc 1 349 3 setp.ne.u64 %r401,%r152,0; @ %r401 bra $L81; $L84: .loc 1 357 8 ld.u64 %r165,[%r230]; .loc 1 358 8 ld.u64 %r163,[%r231]; .loc 1 360 9 setp.ne.u64 %r402,%r163,0; @ %r402 bra $L82; bra $L54; $L81: add.u64 %r174,%frame,480; add.u64 %r149,%r230,40; add.u64 %r147,%frame,120; cvt.u32.u32 %r404,%r23; cvt.s64.s8 %r403,%r404; add.u64 %r134,%r403,-1; .loc 1 349 3 mov.u64 %r169,0; add.u64 %r455,%frame,360; .loc 1 351 16 mov.u64 %r405,%r169; $L83: st.u64 [%r174],%r405; .loc 1 352 18 ld.u64 %r406,[%r149]; st.u64 [%r147],%r406; .loc 1 353 17 shl.b64 %r408,%r169,3; add.u64 %r409,%r455,%r408; .loc 1 353 10 ld.u64 %r410,[%r409]; setp.le.s64 %r411,%r410,0; @ %r411 bra $L54; .loc 1 349 26 add.u64 %r169,%r169,1; .loc 1 349 3 add.u64 %r174,%r174,8; add.u64 %r149,%r149,24; add.u64 %r147,%r147,8; setp.ne.u64 %r412,%r134,%r169; @ %r412 bra $L83; bra $L84; $L82: .loc 1 282 44 mul.lo.u64 %r42,%r40,%r235; .loc 1 283 12 mul.lo.u64 %r46,%r43,%r34; .loc 1 396 22 ld.u64 %r99,[%frame+240]; .loc 1 397 23 ld.u64 %r101,[%frame]; .loc 1 398 22 ld.u64 %r103,[%frame+120]; .loc 1 398 12 shl.b64 %r104,%r103,2; .loc 1 400 32 ld.u64 %r29,[%frame+360]; .loc 1 407 23 mul.lo.u64 %r166,%r99,%r29; .loc 1 408 24 mul.lo.u64 %r413,%r101,%r29; .loc 1 408 10 neg.s64 %r177,%r413; .loc 1 409 23 mul.lo.u64 %r414,%r103,%r29; .loc 1 409 9 shl.b64 %r415,%r414,2; neg.s64 %r182,%r415; setp.le.s64 %r461,%r152,1; .loc 1 384 13 setp.eq.u32 %r464,%r234,0; cvt.u32.u32 %r465,%r23; cvt.s64.s8 %r466,%r465; add.u64 %r467,%r466,-1; add.u64 %r468,%frame,240; add.u64 %r469,%frame,360; add.u64 %r470,%frame,120; $L98: .loc 1 349 3 mov.u64 %r168,%r164; mov.u64 %r141,%r163; .loc 1 372 9 mov.u64 %r159,0; $L88: .loc 1 375 7 ld.s8 %r199,[%r168]; .loc 1 375 6 setp.eq.u32 %r456,%r199,0; @ %r456 bra $L85; .loc 1 378 11 cvt.u32.u64 %r417,%r159; add.u32 %r139,%r417,1; .loc 1 382 6 setp.le.s64 %r418,%r153,%r159; @ %r418 bra $L86; add.u64 %r180,%r159,1; mov.u64 %r167,%r141; bra $L87; $L85: .loc 1 372 24 add.u64 %r159,%r159,1; .loc 1 372 32 add.u64 %r141,%r141,%r42; .loc 1 372 47 add.u64 %r168,%r168,%r46; .loc 1 372 2 setp.ne.u64 %r419,%r153,%r159; @ %r419 bra $L88; .loc 1 371 9 mov.u32 %r139,%r199; bra $L86; $L87: .loc 1 384 6 @ %r456 bra $L89; .loc 1 40(%value_in),memcmp,(%out_arg1,%out_arg2,%out_arg3); ld.param.u32 %r424,[%value_in]; } .loc 1 384 13 @ %r464 bra $L90; setp.ge.s32 %r427,%r424,0; @ %r427 bra $L91; bra $L89; $L90: setp.le.s32 %r428,%r424,0; @ %r428 bra $L89; $L91: .loc 1 388 14 cvt.u32.u64 %r139,%r180; mov.u64 %r141,%r167; $L89: .loc 1 382 31 add.u64 %r167,%r167,%r42; .loc 1 382 46 add.u64 %r168,%r168,%r46; .loc 1 382 6 add.u64 %r181,%r180,1; setp.eq.u64 %r429,%r153,%r180; @ %r429 bra $L86; .loc 1 384 7 ld.s8 %r199,[%r168]; mov.u64 %r180,%r181; setp.eq.u32 %r456,%r199,0; bra $L87; $L86: .loc 1 392 8 st.u32 [%r165],%r139; .loc 1 395 15 ld.u64 %r430,[%frame+480]; add.u64 %r97,%r430,1; .loc 1 396 12 
add.u64 %r163,%r163,%r99; .loc 1 397 13 add.u64 %r164,%r164,%r101; .loc 1 398 12 add.u64 %r165,%r165,%r104; .loc 1 400 13 setp.eq.u64 %r431,%r29,%r97; @ %r431 bra $L92; .loc 1 395 15 st.u64 [%frame+480],%r97; bra $L98; $L92: .loc 1 404 13 mov.u64 %r432,0; st.u64 [%frame+480],%r432; .loc 1 408 10 add.u64 %r160,%r164,%r177; .loc 1 409 9 add.u64 %r161,%r165,%r182; .loc 1 411 7 @ %r461 bra $L54; add.u64 %r150,%frame,488; .loc 1 407 23 mov.u64 %r107,%r166; .loc 1 411 7 mov.u64 %r197,8; .loc 1 410 5 mov.u64 %r162,1; bra $L95; $L96: .loc 1 404 13 st.u64 [%r150],%r432; .loc 1 407 23 mul.lo.u64 %r107,%r120,%r119; .loc 1 408 24 mul.lo.u64 %r438,%r122,%r119; .loc 1 408 10 sub.u64 %r160,%r164,%r438; .loc 1 409 23 mul.lo.u64 %r439,%r124,%r119; .loc 1 409 9 shl.b64 %r440,%r439,2; sub.u64 %r161,%r165,%r440; .loc 1 410 5 add.u64 %r162,%r162,1; .loc 1 411 7 add.u64 %r150,%r150,8; add.u64 %r197,%r197,8; setp.eq.u64 %r441,%r162,%r467; @ %r441 bra $L54; $L95: .loc 1 419 16 ld.u64 %r442,[%r150]; add.u64 %r119,%r442,1; st.u64 [%r150],%r119; .loc 1 420 23 add.u64 %r444,%r468,%r197; ld.u64 %r120,[%r444]; .loc 1 420 13 sub.u64 %r445,%r120,%r107; add.u64 %r163,%r163,%r445; .loc 1 421 24 add.u64 %r446,%frame,%r197; ld.u64 %r122,[%r446]; .loc 1 421 14 add.u64 %r164,%r160,%r122; .loc 1 422 23 add.u64 %r448,%r470,%r197; ld.u64 %r124,[%r448]; .loc 1 422 13 shl.b64 %r449,%r124,2; add.u64 %r165,%r161,%r449; .loc 1 400 32 add.u64 %r451,%r469,%r197; ld.u64 %r128,[%r451]; .loc 1 400 13 setp.eq.u64 %r452,%r119,%r128; @ %r452 bra $L96; bra $L98; $L114: .loc 1 305 6 ld.u64 %r453,[%r230]; setp.eq.u64 %r454,%r453,0; @ ! %r454 bra $L69; bra $L97; $L54: .loc 1 426_gfortran_smaxloc1_4_s1 .visible .func _gfortran_smaxloc1_4_s1368100predu16 %r180; .reg .predpred %r222; .reg .predu64 %r231; .reg .u16 %r233; .reg .u16 %r234; .reg .u32pred %r259; .reg .pred %r260; .reg .u64u64 %r267; .reg .u32pred %r303; .reg .u64 %r304; .reg .pred %r305; .reg .u64 %r306; .reg .u64 %r307; .reg .u32mov.u64 %r156,%ar0; mov.u64 %r157,%ar1; mov.u64 %r158,%ar2; mov.u64 %r159,%ar3; mov.u32 %r160,%ar4; mov.u64 %r161,%ar5; .loc 1 449 6 setp.eq.u64 %r162,%r159,0; @ %r162 bra $L117; .loc 1 449 20 ld.u32 %r163,[%r159]; setp.eq.u32 %r164,%r163,0; @ %r164 bra $L118; $L117: .loc 1 452 7r1571580161; call _gfortran_maxloc1_4_s1); } .loc 1 456 7 bra $L116; $L118: .loc 1 459 10 ld.u64 %r23,[%r158]; .loc 1 459 7 add.u64 %r95,%r23,-1; .loc 1 460 10 ld.s8 %r24,[%r157+28]; .loc 1 460 38 add.u32 %r26,%r24,-1; .loc 1 460 8 cvt.s64.s32 %r306,%r26; .loc 1 462 7 shr.u64 %r171,%r95,63; set.u32.gt.s64 %r173,%r95,%r306; neg.s32 %r174,%r173; cvt.u16.u64 %r177,%r171; cvt.u16.u32 %r178,%r174; or.b16 %r176,%r177,%r178; .loc 1 462 6 cvt.u32.u16 %r179,%r176; cvt.u16.u8 %r180,%r179; setp.ne.u16 %r181,%r180,0; @ %r181 bra $L120; .loc 1 469 3 setp.ne.u64 %r182,%r95,0; @ %r182 bra $L121; $L127: .loc 1 477 3 setp.lt.s64 %r183,%r95,%r306; @ %r183 bra $L122; bra $L175; $L120: .loc 1 464 7 cvt.u32.u32 %r186,%r24; cvt.s64.s8 %r185,%r186; st.u64 [%stack+8],%r1851841stack; call _gfortran_runtime_error121: add.u64 %r92,%r157,48; add.u64 %r114,%frame,120; add.u64 %r146,%r157,56; add.u64 %r188,%r157,24; add.u64 %r190,%r23,%r23; add.u64 %r191,%r190,%r23; shl.b64 %r192,%r191,3; add.u64 %r152,%r188,%r192; .loc 1 474 12 mov.u64 %r312,0; $L126: .loc 1 471 19 ld.u64 %r194,[%r146]; ld.u64 %r196,[%r92]; sub.u64 %r195,%r194,%r196; .loc 1 471 50 mad.lo.u64 %r38,%r195,%r161,%r161; .loc 1 473 10 setp.le.s64 %r197,%r38,0; @ %r197 bra $L124; .loc 1 471 17 st.u64 [%r114],%r38; bra $L125; $L124: .loc 1 474 12 st.u64 
[%r114],%r312; $L125: .loc 1 469 3 add.u64 %r92,%r92,24; add.u64 %r114,%r114,8; add.u64 %r146,%r146,24; setp.ne.u64 %r199,%r92,%r152; @ %r199 bra $L126; bra $L127; $L175: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r200,%r107,0; @ ! %r200 bra $L129; bra $L128; $L122: add.u64 %r202,%r23,%r23; add.u64 %r203,%r202,%r23; shl.b64 %r204,%r203,3; add.u64 %r205,%r204,48; add.u64 %r83,%r157,%r205; add.u64 %r304,%frame,120; shl.b64 %r207,%r23,3; add.u64 %r208,%r207,-8; add.u64 %r144,%r304,%r208; add.u64 %r209,%r157,48; cvt.u32.u32 %r211,%r24; cvt.s64.s8 %r210,%r211; add.u64 %r213,%r210,%r210; add.u64 %r214,%r213,%r210; shl.b64 %r215,%r214,3; add.u64 %r72,%r209,%r215; .loc 1 483 12 mov.u64 %r311,0; $L132: .loc 1 480 2 ld.u64 %r217,[%r83+8]; ld.u64 %r219,[%r83]; sub.u64 %r218,%r217,%r219; .loc 1 480 37 mad.lo.u64 %r46,%r218,%r161,%r161; .loc 1 482 10 setp.le.s64 %r220,%r46,0; @ %r220 bra $L130; .loc 1 479 17 st.u64 [%r144],%r46; bra $L131; $L130: .loc 1 483 12 st.u64 [%r144],%r311; $L131: .loc 1 477 3 add.u64 %r83,%r83,24; add.u64 %r144,%r144,8; setp.ne.u64 %r222,%r72,%r83; @ %r222 bra $L132; bra $L176; $L128: .loc 1 490 7 setp.eq.u64 %r223,%r306,0; @ %r223 bra $L134; add.u64 %r304,%frame,120; $L154: add.u64 %r125,%r156,40; mov.u64 %r123,%r304; add.u64 %r224,%r156,16; cvt.u32.u32 %r226,%r24; cvt.s64.s8 %r225,%r226; add.u64 %r228,%r225,%r225; add.u64 %r229,%r228,%r225; shl.b64 %r230,%r229,3; add.u64 %r117,%r224,%r230; .loc 1 493 10 mov.u64 %r89,1; .loc 1 497 4 mov.u64 %r249,0; bra $L135; $L134: .loc 1 501 24 mov.u64 %r231,0; st.u64 [%r156+8],%r231; .loc 1 502 28 cvt.u16.u32 %r234,%r24; add.u16 %r233,%r234,-1; cvt.u32.u16 %r235,%r233; st.u8 [%r156+28],%r235; .loc 1 504 20 add.u32 %r236,%r24,-2; cvt.s64.s32 %r56,%r236; add.u64 %r238,%r56,%r56; add.u64 %r239,%r238,%r56; shl.b64 %r240,%r239,3; add.u64 %r241,%r156,%r240; .loc 1 504 67 shl.b64 %r243,%r56,3; add.u64 %r244,%frame,%r243; .loc 1 504 59 ld.u64 %r246,[%r241+40]; ld.u64 %r247,[%r244+120]; mul.lo.u64 %r103,%r246,%r247; .loc 1 506 10 setp.eq.u64 %r248,%r103,0; @ %r248 bra $L136; bra $L177; $L138: .loc 1 495 48 mul.lo.u64 %r89,%r52,%r89; $L135: .loc 1 497 4 st.u64 [%r125+8],%r249; ld.u64 %r52,[%r123]; add.u64 %r250,%r52,-1; st.u64 [%r125+16],%r250; st.u64 [%r125],%r89; .loc 1 490 7 add.u64 %r125,%r125,24; add.u64 %r123,%r123,8; setp.ne.u64 %r251,%r117,%r125; @ %r251 bra $L138; bra $L134; $L136: .loc 1 509 4 st.u64 [%r156+48],%r103; mov.u64 %r253,-1; st.u64 [%r156+56],%r253; mov.u64 %r254,1; st.u64 [%r156+40],%r254; .loc 1 510 4 bra $L116; $L177: .loc 1 513 24256; call (%value_in),_gfortrani_xmallocarray257,[%value_in]; } mov.u64 %r107,%r257; .loc 1 513 22 st.u64 [%r156],%r107; $L142: .loc 1 539 3 setp.ne.u64 %r259,%r306,0; @ %r259 bra $L139; bra $L140; $L129: .loc 1 517 19 ld.s8 %r61,[%r156+28]; .loc 1 517 10 setp.eq.u64 %r260,%r61,%r306; @ %r260 bra $L141; .loc 1 518 2 st.u64 [%stack+8],%r306; st.u64 [%stack],%r61;_gfortran_runtime_error141: .loc 1 523 11 cvta.global.u64 %r263,_gfortrani_compile_options; .loc 1 523 10 ld.u32 %r264,[%r263+36]; setp.eq.u32 %r265,%r264,0; @ %r265 bra $L142; .loc 1 525 4 setp.eq.u64 %r266,%r306,0; @ %r266 bra $L140; add.u64 %r115,%r156,48; add.u64 %r105,%frame,120; cvt.u32.u32 %r268,%r24; cvt.s64.s8 %r267,%r268; add.u64 %r86,%r267,-1; .loc 1 525 10 mov.u64 %r113,0; $L144: .loc 1 529 21 ld.u64 %r270,[%r115+8]; add.u64 %r269,%r270,1; .loc 1 529 19 ld.u64 %r271,[%r115]; sub.u64 %r100,%r269,%r271; .loc 1 530 18 ld.u64 %r70,[%r105]; .loc 1 525 25 add.u64 %r113,%r113,1; .loc 1 530 11 setp.eq.u64 
%r272,%r70,%r100; @ %r272 bra $L143; .loc 1 531 3 st.u64 [%stack+16],%r70; st.u64 [%stack+8],%r100; st.u64 [%stack],%r113;_gfortran_runtime_error143: .loc 1 525 4 add.u64 %r115,%r115,24; add.u64 %r105,%r105,8; setp.ne.u64 %r275,%r86,%r113; @ %r275 bra $L144; $L139: .loc 1 541 16 cvt.u16.u32 %r277,%r24; setp.le.s16 %r278,%r277,1; @ %r278 bra $L145; shl.b64 %r276,%r306,3; bra $L146; $L145: mov.u64 %r276,8; $L146: add.u64 %r280,%frame,240; mov.u32 %r2842766,[%value_in]; } add.u64 %r130,%r156,40; mov.u64 %r128,%frame; mov.u64 %r109,0; $L147: .loc 1 542 18 ld.u64 %r288,[%r130]; st.u64 [%r128],%r288; .loc 1 539 26 add.u64 %r109,%r109,1; .loc 1 539 3 add.u64 %r130,%r130,24; add.u64 %r128,%r128,8; setp.gt.s64 %r289,%r306,%r109; @ %r289 bra $L147; $L140: ld.u64 %r88,[%frame+240]; .loc 1 551 22 ld.u64 %r75,[%frame]; .loc 1 551 12 shl.b64 %r76,%r75,2; .loc 1 553 32 ld.u64 %r65,[%frame+120]; .loc 1 560 23 mul.lo.u64 %r49,%r65,%r75; setp.le.s64 %r305,%r306,1; .loc 1 549 13 mov.u32 %r290,0; cvt.u32.u32 %r308,%r24; cvt.s64.s8 %r309,%r308; add.u64 %r310,%r309,-1; $L153: st.u32 [%r107],%r290; .loc 1 550 15 add.u64 %r88,%r88,1; .loc 1 551 12 add.u64 %r107,%r107,%r76; .loc 1 553 13 setp.ne.u64 %r291,%r65,%r88; @ %r291 bra $L153; .loc 1 562 7 @ ! %r305 bra $L178; bra $L116; $L152: .loc 1 557 13 st.u64 [%r108],%r307; .loc 1 560 23 mul.lo.u64 %r78,%r81,%r80; .loc 1 561 5 add.u64 %r106,%r106,1; .loc 1 562 7 add.u64 %r108,%r108,8; add.u64 %r138,%r138,8; add.u64 %r137,%r137,8; setp.ne.u64 %r294,%r106,%r310; @ %r294 bra $L151; bra $L116; $L178: add.u64 %r108,%frame,248; add.u64 %r138,%frame,8; add.u64 %r137,%frame,128; .loc 1 560 23 mov.u64 %r78,%r49; .loc 1 561 5 mov.u64 %r106,1; .loc 1 557 13 mov.u64 %r307,0; $L151: .loc 1 566 16 ld.u64 %r299,[%r108]; add.u64 %r80,%r299,1; st.u64 [%r108],%r80; .loc 1 567 23 ld.u64 %r81,[%r138]; .loc 1 567 13 sub.u64 %r300,%r81,%r78; shl.b64 %r301,%r300,2; add.u64 %r107,%r107,%r301; .loc 1 553 32 ld.u64 %r84,[%r137]; .loc 1 553 13 setp.eq.u64 %r302,%r80,%r84; @ %r302 bra $L152; .loc 1 557 13 mov.u64 %r88,0; bra $L153; $L176: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r303,%r107,0; @ ! 
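The routines above all implement one pattern: the Fortran MAXLOC intrinsic reduced along a single dimension, specialised for character arrays (the _s1/_s4 variants compare 1-byte and 4-byte character elements via memcmp and _gfortrani_memcmp_char4, and the m/s prefixes add array-mask and scalar-mask handling). As a rough orientation only, the sketch below shows the core reduction for a much simpler case, a 2-D integer array reduced along dim=1 in row-major storage; the function name maxloc_dim1, the fixed layout and the integer comparison are illustrative assumptions and do not correspond to the descriptor-based generated code.

/* Hypothetical illustration: MAXLOC(a, dim=1) for a rows x cols integer
   array stored row-major.  The real maxloc1_* routines walk array
   descriptors (rank, strides, bounds), honour MASK arguments and compare
   character elements with memcmp instead of '>'. */
#include <stdio.h>

/* For each column j, store the 1-based row index of its maximum element. */
static void maxloc_dim1(const int *a, int rows, int cols, int *loc)
{
    for (int j = 0; j < cols; j++) {
        int best = 0;                       /* 0-based index of current max */
        for (int i = 1; i < rows; i++)
            if (a[i * cols + j] > a[best * cols + j])
                best = i;
        loc[j] = best + 1;                  /* Fortran results are 1-based */
    }
}

int main(void)
{
    const int a[3 * 2] = { 1, 9,
                           7, 2,
                           3, 8 };          /* 3x2 test array */
    int loc[2];
    maxloc_dim1(a, 3, 2, loc);
    printf("%d %d\n", loc[0], loc[1]);      /* prints: 2 1 */
    return 0;
}

In the generated library code the same loop nest is driven by the source and destination descriptors, which is why the PTX above is dominated by stride arithmetic (the repeated mul.lo/shl/add sequences) around a small comparison kernel.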
// maxloc1_4_s4.o/ 1622802249  (from fortran/generated/maxloc1_4_s4.c)
// BEGIN GLOBAL FUNCTION DECL: _gfortran_maxloc1_4_s4    .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_4_s4   .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_smaxloc1_4_s4   .visible .func
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options     .extern .global .align 8 .u64
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error   .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarray   .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return  .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_memcmp_char4   .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents     .extern .func
// String constants (decoded from the byte tables; leading text lost where marked):
//   "MAXLOC"
//   "... incorrect in MAXLOC intrinsic: is %ld, should be between 1 and %ld\n"
//   "Funny sized logical arr..."
//   "rank of return array incorrect in MAXLOC intrinsic: is %ld, should be %ld\n"
//   "MASK argumen..."
//   "Incorrect extent in return value of MAXLOC intrinsic in dimension %ld: is %ld, should be %ld\n"
//   "return value"
// The PTX bodies of _gfortran_maxloc1_4_s4, _gfortran_mmaxloc1_4_s4 and
// _gfortran_smaxloc1_4_s4 follow here; they are not recoverable from this dump.
// maxloc1_8_s1.o/ 1622802249  (from fortran/generated/maxloc1_8_s1.c)
// BEGIN GLOBAL FUNCTION DECL: _gfortran_maxloc1_8_s1    .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_8_s1   .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_smaxloc1_8_s1   .visible .func
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options     .extern .global .align 8 .u64
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error   .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarray   .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return  .extern .func
// BEGIN GLOBAL FUNCTION DECL: memcmp                    .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents     .extern .func
// String constants: the same MAXLOC error-message set as in maxloc1_4_s4.o above.
// The PTX body of _gfortran_maxloc1_8_s1 begins here and continues below;
// it is not recoverable from this dump.
1 287 51 ld.u64 %r310,[%r208]; mul.lo.u64 %r309,%r310,%r231; .loc 1 287 18 st.u64 [%r210],%r309; .loc 1 288 20 ld.u64 %r312,[%r211]; mul.lo.u64 %r311,%r312,%r35; .loc 1 288 18 st.u64 [%r213],%r311; .loc 1 289 19 ld.u64 %r314,[%r208+16]; add.u64 %r313,%r314,1; ld.u64 %r315,[%r208+8]; sub.u64 %r60,%r313,%r315; .loc 1 291 10 setp.lt.s64 %r316,%r60,0; @ %r316 bra $L61; .loc 1 289 17 st.u64 [%r214],%r60; bra $L62; $L61: .loc 1 292 12 st.u64 [%r214],%r468; $L62: .loc 1 285 3 add.u64 %r208,%r208,24; add.u64 %r210,%r210,8; add.u64 %r211,%r211,24; add.u64 %r213,%r213,8; add.u64 %r214,%r214,8; setp.ne.u64 %r318,%r208,%r223; @ %r318 bra $L63; bra $L64; $L109: .loc 1 305 6 ld.u64 %r319,[%r226]; setp.eq.u64 %r320,%r319,0; @ ! %r320 bra $L66; bra $L65; $L59: add.u64 %r322,%r22,%r22; add.u64 %r323,%r322,%r22; shl.b64 %r324,%r323,3; add.u64 %r40,%r324,40; add.u64 %r71,%r227,%r40; shl.b64 %r325,%r22,3; add.u64 %r171,%r325,-8; add.u64 %r450,%frame,240; add.u64 %r30,%r450,%r171; add.u64 %r173,%r229,%r40; add.u64 %r79,%frame,%r171; add.u64 %r456,%frame,360; add.u64 %r164,%r456,%r171; add.u64 %r328,%r227,40; cvt.u32.u32 %r330,%r23; cvt.s64.s8 %r329,%r330; add.u64 %r332,%r329,%r329; add.u64 %r333,%r332,%r329; shl.b64 %r334,%r333,3; add.u64 %r205,%r328,%r334; .loc 1 302 12 mov.u64 %r467,0; $L69: .loc 1 297 55 ld.u64 %r336,[%r71]; mul.lo.u64 %r335,%r336,%r231; .loc 1 297 18 st.u64 [%r30],%r335; .loc 1 298 20 ld.u64 %r338,[%r173]; mul.lo.u64 %r337,%r338,%r35; .loc 1 298 18 st.u64 [%r79],%r337; .loc 1 299 19 ld.u64 %r340,[%r71+16]; add.u64 %r339,%r340,1; ld.u64 %r341,[%r71+8]; sub.u64 %r76,%r339,%r341; .loc 1 301 10 setp.lt.s64 %r342,%r76,0; @ %r342 bra $L67; .loc 1 299 17 st.u64 [%r164],%r76; bra $L68; $L67: .loc 1 302 12 st.u64 [%r164],%r467; $L68: .loc 1 295 3 add.u64 %r71,%r71,24; add.u64 %r30,%r30,8; add.u64 %r173,%r173,24; add.u64 %r79,%r79,8; add.u64 %r164,%r164,8; setp.ne.u64 %r344,%r71,%r205; @ %r344 bra $L69; bra $L110; $L65: .loc 1 309 7 setp.eq.u64 %r345,%r151,0; @ %r345 bra $L71; add.u64 %r456,%frame,360; $L92: add.u64 %r130,%r226,40; mov.u64 %r128,%r456; cvt.u32.u32 %r347,%r23; cvt.s64.s8 %r346,%r347; add.u64 %r349,%r346,%r346; add.u64 %r350,%r349,%r346; shl.b64 %r351,%r350,3; add.u64 %r352,%r226,16; add.u64 %r78,%r351,%r352; .loc 1 312 10 mov.u64 %r137,1; .loc 1 316 4 mov.u64 %r371,0; bra $L72; $L71: .loc 1 320 20 add.u32 %r353,%r23,-2; cvt.s64.s32 %r84,%r353; add.u64 %r355,%r84,%r84; add.u64 %r356,%r355,%r84; shl.b64 %r357,%r356,3; add.u64 %r358,%r226,%r357; .loc 1 320 67 shl.b64 %r360,%r84,3; add.u64 %r361,%frame,%r360; .loc 1 320 59 ld.u64 %r363,[%r358+40]; ld.u64 %r364,[%r361+360]; mul.lo.u64 %r157,%r363,%r364; .loc 1 322 24 mov.u64 %r365,0; st.u64 [%r226+8],%r365; .loc 1 323 28 cvt.u16.u32 %r368,%r23; add.u16 %r367,%r368,-1; cvt.u32.u16 %r369,%r367; st.u8 [%r226+28],%r369; .loc 1 325 10 setp.eq.u64 %r370,%r157,0; @ %r370 bra $L73; bra $L111; $L75: .loc 1 314 47 mul.lo.u64 %r137,%r81,%r137; $L72: .loc 1 316 4 st.u64 [%r130+8],%r371; ld.u64 %r81,[%r128]; add.u64 %r372,%r81,-1; st.u64 [%r130+16],%r372; st.u64 [%r130],%r137; .loc 1 309 7 add.u64 %r130,%r130,24; add.u64 %r128,%r128,8; setp.ne.u64 %r373,%r78,%r130; @ %r373 bra $L75; bra $L71; $L73: .loc 1 328 4 st.u64 [%r226+48],%r157; mov.u64 %r375,-1; st.u64 [%r226+56],%r375; mov.u64 %r376,1; st.u64 [%r226+40],%r376; .loc 1 329 4 bra $L51; $L111: .loc 1 332 24 mov.u64 %r378378; call (%value_in),_gfortrani_xmallocarray79,[%value_in]; } .loc 1 332 22 st.u64 [%r226],%r379; bra $L76; $L66: .loc 1 337 19 ld.s8 %r381,[%r226+28]; .loc 1 337 10 setp.eq.u64 
%r382,%r381,%r151; @ %r382 bra $L77; .loc 1 338 2 cvta.const.u64 %r383383_gfortran_runtime_error77: .loc 1 340 11 cvta.global.u64 %r385,_gfortrani_compile_options; .loc 1 340 10 ld.u32 %r386,[%r385+36]; setp.eq.u32 %r387,%r386,0; @ %r387 bra $L76; .loc 1 342 4 add.u64 %r456,%frame,360; cvta.const.u64 %r391,$LC2r456390391; call _gfortrani_bounds_ifunction_return344 4 cvta.const.u64 %r395,$LC39391; call _gfortrani_bounds_equal_extents$L76: .loc 1 349 3 setp.ne.u64 %r397,%r151,0; @ %r397 bra $L78; $L81: .loc 1 357 8 ld.u64 %r163,[%r226]; .loc 1 358 8 ld.u64 %r161,[%r227]; .loc 1 360 9 setp.ne.u64 %r398,%r161,0; @ %r398 bra $L79; bra $L51; $L78: add.u64 %r174,%frame,480; add.u64 %r165,%r226,40; add.u64 %r148,%frame,120; cvt.u32.u32 %r400,%r23; cvt.s64.s8 %r399,%r400; add.u64 %r135,%r399,-1; .loc 1 349 3 mov.u64 %r169,0; add.u64 %r456,%frame,360; .loc 1 351 16 mov.u64 %r401,%r169; $L80: st.u64 [%r174],%r401; .loc 1 352 18 ld.u64 %r402,[%r165]; st.u64 [%r148],%r402; .loc 1 353 17 shl.b64 %r404,%r169,3; add.u64 %r405,%r456,%r404; .loc 1 353 10 ld.u64 %r406,[%r405]; setp.le.s64 %r407,%r406,0; @ %r407 bra $L51; .loc 1 349 26 add.u64 %r169,%r169,1; .loc 1 349 3 add.u64 %r174,%r174,8; add.u64 %r165,%r165,24; add.u64 %r148,%r148,8; setp.ne.u64 %r408,%r135,%r169; @ %r408 bra $L80; bra $L81; $L79: .loc 1 282 44 mul.lo.u64 %r44,%r42,%r231; .loc 1 283 12 mul.lo.u64 %r48,%r45,%r35; .loc 1 396 22 ld.u64 %r100,[%frame+240]; .loc 1 397 23 ld.u64 %r102,[%frame]; .loc 1 398 22 ld.u64 %r104,[%frame+120]; .loc 1 398 12 shl.b64 %r106,%r104,3; ld.u64 %r457,[%frame+480]; setp.le.s64 %r454,%r151,1; .loc 1 384 13 setp.eq.u32 %r460,%r230,0; cvt.u32.u32 %r461,%r23; cvt.s64.s8 %r462,%r461; add.u64 %r463,%r462,-1; add.u64 %r464,%frame,240; add.u64 %r465,%frame,360; add.u64 %r466,%frame,120; $L93: .loc 1 349 3 mov.u64 %r168,%r162; mov.u64 %r143,%r161; .loc 1 372 9 mov.u64 %r141,0; $L85: .loc 1 375 7 ld.s8 %r132,[%r168]; mov.u64 %r190,%r141; .loc 1 378 11 add.u64 %r141,%r141,1; .loc 1 375 6 setp.eq.u32 %r453,%r132,0; @ %r453 bra $L82; .loc 1 382 6 setp.gt.s64 %r410,%r152,%r190; @ %r410 bra $L94; bra $L84; $L82: .loc 1 372 32 add.u64 %r143,%r143,%r44; .loc 1 372 47 add.u64 %r168,%r168,%r48; .loc 1 372 2 setp.ne.u64 %r411,%r152,%r141; @ %r411 bra $L85; .loc 1 371 9 mov.u64 %r141,0; bra $L84; $L94: .loc 1 382 6 mov.u64 %r167,%r143; $L83: .loc 1 388 14 add.u64 %r190,%r190,1; .loc 1 384 6 @ %r453 bra $L86; .loc 1 40(%value_in),memcmp,(%out_arg1,%out_arg2,%out_arg3); ld.param.u32 %r416,[%value_in]; } .loc 1 384 13 @ %r460 bra $L87; setp.lt.s32 %r419,%r416,0; selp.u64 %r143,%r143,%r167,%r419; selp.u64 %r141,%r141,%r190,%r419; bra $L86; $L87: setp.le.s32 %r420,%r416,0; selp.u64 %r143,%r143,%r167,%r420; selp.u64 %r141,%r141,%r190,%r420; $L86: .loc 1 382 31 add.u64 %r167,%r167,%r44; .loc 1 382 46 add.u64 %r168,%r168,%r48; .loc 1 382 6 setp.eq.u64 %r421,%r152,%r190; @ %r421 bra $L84; .loc 1 384 7 ld.s8 %r132,[%r168]; setp.eq.u32 %r453,%r132,0; bra $L83; $L84: .loc 1 392 8 st.u64 [%r163],%r141; .loc 1 395 15 add.u64 %r457,%r457,1; st.u64 [%frame+480],%r457; .loc 1 396 12 add.u64 %r161,%r161,%r100; .loc 1 397 13 add.u64 %r162,%r162,%r102; .loc 1 398 12 add.u64 %r163,%r163,%r106; .loc 1 400 32 ld.u64 %r29,[%frame+360]; .loc 1 400 13 setp.ne.u64 %r423,%r29,%r457; @ %r423 bra $L93; .loc 1 404 13 mov.u64 %r424,0; st.u64 [%frame+480],%r424; .loc 1 407 23 mul.lo.u64 %r109,%r457,%r100; .loc 1 408 24 mul.lo.u64 %r425,%r457,%r102; .loc 1 408 10 sub.u64 %r158,%r162,%r425; .loc 1 409 23 mul.lo.u64 %r426,%r457,%r104; .loc 1 409 9 shl.b64 
%r427,%r426,3; sub.u64 %r159,%r163,%r427; .loc 1 411 7 @ %r454 bra $L51; add.u64 %r197,%frame,488; mov.u64 %r196,8; .loc 1 410 5 mov.u64 %r160,1; bra $L90; $L91: .loc 1 404 13 st.u64 [%r197],%r424; .loc 1 407 23 mul.lo.u64 %r109,%r120,%r119; .loc 1 408 24 mul.lo.u64 %r433,%r122,%r119; .loc 1 408 10 sub.u64 %r158,%r162,%r433; .loc 1 409 23 mul.lo.u64 %r434,%r124,%r119; .loc 1 409 9 shl.b64 %r435,%r434,3; sub.u64 %r159,%r163,%r435; .loc 1 410 5 add.u64 %r160,%r160,1; .loc 1 411 7 add.u64 %r197,%r197,8; add.u64 %r196,%r196,8; setp.eq.u64 %r436,%r160,%r463; @ %r436 bra $L51; $L90: .loc 1 419 16 ld.u64 %r437,[%r197]; add.u64 %r119,%r437,1; st.u64 [%r197],%r119; .loc 1 420 23 add.u64 %r439,%r464,%r196; ld.u64 %r120,[%r439]; .loc 1 420 13 sub.u64 %r440,%r120,%r109; add.u64 %r161,%r161,%r440; .loc 1 421 24 add.u64 %r441,%frame,%r196; ld.u64 %r122,[%r441]; .loc 1 421 14 add.u64 %r162,%r158,%r122; .loc 1 422 23 add.u64 %r443,%r466,%r196; ld.u64 %r124,[%r443]; .loc 1 422 13 shl.b64 %r444,%r124,3; add.u64 %r163,%r159,%r444; .loc 1 400 32 add.u64 %r446,%r465,%r196; ld.u64 %r127,[%r446]; .loc 1 400 13 setp.eq.u64 %r447,%r119,%r127; @ %r447 bra $L91; ld.u64 %r457,[%frame+480]; bra $L93; $L110: .loc 1 305 6 ld.u64 %r448,[%r226]; setp.eq.u64 %r449,%r448,0; @ ! %r449 bra $L66; bra $L92; $L51: .loc 1 426_gfortran_smaxloc1_8_s1 .visible .func _gfortran_smaxloc1_8_s1368100predu16 %r180; .reg .predpred %r222; .reg .predu64 %r231; .reg .u16 %r233; .reg .u16 %r234; .reg .u32pred %r259; .reg .pred %r260; .reg .u64u64 %r267; .reg .u32pred %r303; .reg .u64 %r304; .reg .pred %r305; .reg .u64 %r306; .reg .u32mov.u64 %r156,%ar0; mov.u64 %r157,%ar1; mov.u64 %r158,%ar2; mov.u64 %r159,%ar3; mov.u32 %r160,%ar4; mov.u64 %r161,%ar5; .loc 1 449 6 setp.eq.u64 %r162,%r159,0; @ %r162 bra $L113; .loc 1 449 20 ld.u32 %r163,[%r159]; setp.eq.u32 %r164,%r163,0; @ %r164 bra $L114; $L113: .loc 1 452 7r1571580161; call _gfortran_maxloc1_8_s1); } .loc 1 456 7 bra $L112; $L114: .loc 1 459 10 ld.u64 %r23,[%r158]; .loc 1 459 7 add.u64 %r95,%r23,-1; .loc 1 460 10 ld.s8 %r24,[%r157+28]; .loc 1 460 38 add.u32 %r26,%r24,-1; .loc 1 460 8 cvt.s64.s32 %r306,%r26; .loc 1 462 7 shr.u64 %r171,%r95,63; set.u32.gt.s64 %r173,%r95,%r306; neg.s32 %r174,%r173; cvt.u16.u64 %r177,%r171; cvt.u16.u32 %r178,%r174; or.b16 %r176,%r177,%r178; .loc 1 462 6 cvt.u32.u16 %r179,%r176; cvt.u16.u8 %r180,%r179; setp.ne.u16 %r181,%r180,0; @ %r181 bra $L116; .loc 1 469 3 setp.ne.u64 %r182,%r95,0; @ %r182 bra $L117; $L123: .loc 1 477 3 setp.lt.s64 %r183,%r95,%r306; @ %r183 bra $L118; bra $L171; $L116: .loc 1 464 7 cvt.u32.u32 %r186,%r24; cvt.s64.s8 %r185,%r186; st.u64 [%stack+8],%r1851841stack; call _gfortran_runtime_error117: add.u64 %r92,%r157,48; add.u64 %r114,%frame,120; add.u64 %r146,%r157,56; add.u64 %r188,%r157,24; add.u64 %r190,%r23,%r23; add.u64 %r191,%r190,%r23; shl.b64 %r192,%r191,3; add.u64 %r152,%r188,%r192; .loc 1 474 12 mov.u64 %r312,0; $L122: .loc 1 471 19 ld.u64 %r194,[%r146]; ld.u64 %r196,[%r92]; sub.u64 %r195,%r194,%r196; .loc 1 471 50 mad.lo.u64 %r38,%r195,%r161,%r161; .loc 1 473 10 setp.le.s64 %r197,%r38,0; @ %r197 bra $L120; .loc 1 471 17 st.u64 [%r114],%r38; bra $L121; $L120: .loc 1 474 12 st.u64 [%r114],%r312; $L121: .loc 1 469 3 add.u64 %r92,%r92,24; add.u64 %r114,%r114,8; add.u64 %r146,%r146,24; setp.ne.u64 %r199,%r92,%r152; @ %r199 bra $L122; bra $L123; $L171: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r200,%r107,0; @ ! 
%r200 bra $L125; bra $L124; $L118: add.u64 %r202,%r23,%r23; add.u64 %r203,%r202,%r23; shl.b64 %r204,%r203,3; add.u64 %r205,%r204,48; add.u64 %r83,%r157,%r205; add.u64 %r304,%frame,120; shl.b64 %r207,%r23,3; add.u64 %r208,%r207,-8; add.u64 %r144,%r304,%r208; add.u64 %r209,%r157,48; cvt.u32.u32 %r211,%r24; cvt.s64.s8 %r210,%r211; add.u64 %r213,%r210,%r210; add.u64 %r214,%r213,%r210; shl.b64 %r215,%r214,3; add.u64 %r72,%r209,%r215; .loc 1 483 12 mov.u64 %r311,0; $L128: .loc 1 480 2 ld.u64 %r217,[%r83+8]; ld.u64 %r219,[%r83]; sub.u64 %r218,%r217,%r219; .loc 1 480 37 mad.lo.u64 %r46,%r218,%r161,%r161; .loc 1 482 10 setp.le.s64 %r220,%r46,0; @ %r220 bra $L126; .loc 1 479 17 st.u64 [%r144],%r46; bra $L127; $L126: .loc 1 483 12 st.u64 [%r144],%r311; $L127: .loc 1 477 3 add.u64 %r83,%r83,24; add.u64 %r144,%r144,8; setp.ne.u64 %r222,%r72,%r83; @ %r222 bra $L128; bra $L172; $L124: .loc 1 490 7 setp.eq.u64 %r223,%r306,0; @ %r223 bra $L130; add.u64 %r304,%frame,120; $L150: add.u64 %r125,%r156,40; mov.u64 %r123,%r304; add.u64 %r224,%r156,16; cvt.u32.u32 %r226,%r24; cvt.s64.s8 %r225,%r226; add.u64 %r228,%r225,%r225; add.u64 %r229,%r228,%r225; shl.b64 %r230,%r229,3; add.u64 %r117,%r224,%r230; .loc 1 493 10 mov.u64 %r89,1; .loc 1 497 4 mov.u64 %r249,0; bra $L131; $L130: .loc 1 501 24 mov.u64 %r231,0; st.u64 [%r156+8],%r231; .loc 1 502 28 cvt.u16.u32 %r234,%r24; add.u16 %r233,%r234,-1; cvt.u32.u16 %r235,%r233; st.u8 [%r156+28],%r235; .loc 1 504 20 add.u32 %r236,%r24,-2; cvt.s64.s32 %r56,%r236; add.u64 %r238,%r56,%r56; add.u64 %r239,%r238,%r56; shl.b64 %r240,%r239,3; add.u64 %r241,%r156,%r240; .loc 1 504 67 shl.b64 %r243,%r56,3; add.u64 %r244,%frame,%r243; .loc 1 504 59 ld.u64 %r246,[%r241+40]; ld.u64 %r247,[%r244+120]; mul.lo.u64 %r103,%r246,%r247; .loc 1 506 10 setp.eq.u64 %r248,%r103,0; @ %r248 bra $L132; bra $L173; $L134: .loc 1 495 48 mul.lo.u64 %r89,%r52,%r89; $L131: .loc 1 497 4 st.u64 [%r125+8],%r249; ld.u64 %r52,[%r123]; add.u64 %r250,%r52,-1; st.u64 [%r125+16],%r250; st.u64 [%r125],%r89; .loc 1 490 7 add.u64 %r125,%r125,24; add.u64 %r123,%r123,8; setp.ne.u64 %r251,%r117,%r125; @ %r251 bra $L134; bra $L130; $L132: .loc 1 509 4 st.u64 [%r156+48],%r103; mov.u64 %r253,-1; st.u64 [%r156+56],%r253; mov.u64 %r254,1; st.u64 [%r156+40],%r254; .loc 1 510 4 bra $L112; $L173: .loc 1 513 24 mov.u64 %r256(%value_in),_gfortrani_xmallocarray257,[%value_in]; } mov.u64 %r107,%r257; .loc 1 513 22 st.u64 [%r156],%r107; $L138: .loc 1 539 3 setp.ne.u64 %r259,%r306,0; @ %r259 bra $L135; bra $L136; $L125: .loc 1 517 19 ld.s8 %r61,[%r156+28]; .loc 1 517 10 setp.eq.u64 %r260,%r61,%r306; @ %r260 bra $L137; .loc 1 518 2 st.u64 [%stack+8],%r306; st.u64 [%stack],%r61;_gfortran_runtime_error137: .loc 1 523 11 cvta.global.u64 %r263,_gfortrani_compile_options; .loc 1 523 10 ld.u32 %r264,[%r263+36]; setp.eq.u32 %r265,%r264,0; @ %r265 bra $L138; .loc 1 525 4 setp.eq.u64 %r266,%r306,0; @ %r266 bra $L136; add.u64 %r115,%r156,48; add.u64 %r105,%frame,120; cvt.u32.u32 %r268,%r24; cvt.s64.s8 %r267,%r268; add.u64 %r86,%r267,-1; .loc 1 525 10 mov.u64 %r113,0; $L140: .loc 1 529 21 ld.u64 %r270,[%r115+8]; add.u64 %r269,%r270,1; .loc 1 529 19 ld.u64 %r271,[%r115]; sub.u64 %r100,%r269,%r271; .loc 1 530 18 ld.u64 %r70,[%r105]; .loc 1 525 25 add.u64 %r113,%r113,1; .loc 1 530 11 setp.eq.u64 %r272,%r70,%r100; @ %r272 bra $L139; .loc 1 531 3 st.u64 [%stack+16],%r70; st.u64 [%stack+8],%r100; st.u64 [%stack],%r113;_gfortran_runtime_error139: .loc 1 525 4 add.u64 %r115,%r115,24; add.u64 %r105,%r105,8; setp.ne.u64 %r275,%r86,%r113; @ %r275 bra $L140; 
$L135: .loc 1 541 16 cvt.u16.u32 %r277,%r24; setp.le.s16 %r278,%r277,1; @ %r278 bra $L141; shl.b64 %r276,%r306,3; bra $L142; $L141: mov.u64 %r276,8; $L142: add.u64 %r280,%frame,240; mov.u32 %r2842766,[%value_in]; } add.u64 %r130,%r156,40; mov.u64 %r128,%frame; mov.u64 %r109,0; $L143: .loc 1 542 18 ld.u64 %r288,[%r130]; st.u64 [%r128],%r288; .loc 1 539 26 add.u64 %r109,%r109,1; .loc 1 539 3 add.u64 %r130,%r130,24; add.u64 %r128,%r128,8; setp.gt.s64 %r289,%r306,%r109; @ %r289 bra $L143; $L136: ld.u64 %r88,[%frame+240]; .loc 1 551 22 ld.u64 %r75,[%frame]; .loc 1 551 12 shl.b64 %r76,%r75,3; .loc 1 553 32 ld.u64 %r65,[%frame+120]; .loc 1 560 23 mul.lo.u64 %r49,%r65,%r75; setp.le.s64 %r305,%r306,1; .loc 1 549 13 mov.u64 %r290,0; cvt.u32.u32 %r308,%r24; cvt.s64.s8 %r309,%r308; add.u64 %r310,%r309,-1; $L149: st.u64 [%r107],%r290; .loc 1 550 15 add.u64 %r88,%r88,1; .loc 1 551 12 add.u64 %r107,%r107,%r76; .loc 1 553 13 setp.ne.u64 %r291,%r65,%r88; @ %r291 bra $L149; .loc 1 562 7 @ ! %r305 bra $L174; bra $L112; $L148: .loc 1 557 13 st.u64 [%r108],%r290; .loc 1 560 23 mul.lo.u64 %r78,%r81,%r80; .loc 1 561 5 add.u64 %r106,%r106,1; .loc 1 562 7 add.u64 %r108,%r108,8; add.u64 %r138,%r138,8; add.u64 %r137,%r137,8; setp.ne.u64 %r294,%r106,%r310; @ %r294 bra $L147; bra $L112; $L174: add.u64 %r108,%frame,248; add.u64 %r138,%frame,8; add.u64 %r137,%frame,128; .loc 1 560 23 mov.u64 %r78,%r49; .loc 1 561 5 mov.u64 %r106,1; $L147: .loc 1 566 16 ld.u64 %r299,[%r108]; add.u64 %r80,%r299,1; st.u64 [%r108],%r80; .loc 1 567 23 ld.u64 %r81,[%r138]; .loc 1 567 13 sub.u64 %r300,%r81,%r78; shl.b64 %r301,%r300,3; add.u64 %r107,%r107,%r301; .loc 1 553 32 ld.u64 %r84,[%r137]; .loc 1 553 13 setp.eq.u64 %r302,%r80,%r84; @ %r302 bra $L148; .loc 1 557 13 mov.u64 %r88,0; bra $L149; $L172: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r303,%r107,0; @ ! 
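For orientation, the memcmp/selp inner loop summarized above computes, for each run along the reduced dimension, the 1-based position of the largest character element. Below is a minimal C sketch of just that comparison loop, assuming a flat run of fixed-length elements; the function name, argument list and layout are illustrative only, not the libgfortran source, which walks a strided descriptor and also handles DIM, BACK, allocation and bounds checking.

#include <stddef.h>
#include <string.h>

/* Sketch only: return the 1-based position of the lexicographically largest
   fixed-length element in a contiguous run, or 0 for an empty run.  */
static ptrdiff_t
maxloc_str_run (const char *base, ptrdiff_t n, size_t len)
{
  if (n <= 0)
    return 0;
  ptrdiff_t best = 0;                    /* index of the current maximum */
  for (ptrdiff_t i = 1; i < n; i++)
    if (memcmp (base + i * len, base + best * len, len) > 0)
      best = i;                          /* strictly greater: first maximum wins */
  return best + 1;                       /* Fortran index results are 1-based */
}

The two predicate flavours visible in the PTX (a setp.lt path and a setp.le path feeding the selp updates) plausibly correspond to the strict versus non-strict comparison selected by the BACK= argument of MAXLOC.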
[Archive member maxloc1_8_s4.o/ 1622802249: PTX text generated from libgfortran/generated/maxloc1_8_s4.c. Global declarations: .visible .func _gfortran_maxloc1_8_s4, _gfortran_mmaxloc1_8_s4 and _gfortran_smaxloc1_8_s4; extern references to _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_memcmp_char4 and _gfortrani_bounds_equal_extents. The string constants decode to the diagnostics "MAXLOC", "... incorrect in MAXLOC intrinsic: is %ld, should be between 1 and %ld", "Funny sized logical array", "rank of return array incorrect in MAXLOC intrinsic: is %ld, should be %ld", "MASK argument ...", "Incorrect extent in return value of MAXLOC intrinsic in dimension %ld: is %ld, should be %ld" and "return value". The three function bodies mirror the maxloc1_8_s1 variants above, except that character(kind=4) elements are compared through _gfortrani_memcmp_char4 and the source strides are scaled by 4-byte code units (shl.b64 by 2) instead of bytes.]
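The kind=4 member replaces the plain memcmp calls with _gfortrani_memcmp_char4 because character(kind=4) data is a sequence of 32-bit code units rather than bytes. The following is a rough, self-contained model of that ordering; the name and signature are assumptions made for this sketch and are not the library's prototype.

#include <stddef.h>
#include <stdint.h>

/* Illustrative stand-in for the _gfortrani_memcmp_char4 calls seen in the
   PTX: compare n 32-bit code units and order them by value.  */
static int
memcmp_char4_sketch (const uint32_t *a, const uint32_t *b, size_t n)
{
  for (size_t i = 0; i < n; i++)
    if (a[i] != b[i])
      return a[i] < b[i] ? -1 : 1;
  return 0;
}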
[Archive member maxloc1_16_s1.o/1622802249: PTX text generated from libgfortran/generated/maxloc1_16_s1.c. Global declarations: .visible .func _gfortran_maxloc1_16_s1, _gfortran_mmaxloc1_16_s1 and _gfortran_smaxloc1_16_s1, with the same extern references (memcmp rather than _gfortrani_memcmp_char4) and the same MAXLOC diagnostic strings as the members above. These are the integer(kind=16) result variants of the same reduction: character(kind=1) elements are still compared with memcmp, but result strides are scaled by 16 bytes (shl.b64 by 4) and each index is stored as a pair of 64-bit halves. The member's PTX text continues below.]
%r377,%r378,-1; cvt.u32.u16 %r379,%r377; st.u8 [%r236+28],%r379; .loc 1 325 10 setp.eq.u64 %r380,%r157,0; @ %r380 bra $L76; bra $L117; $L78: .loc 1 314 47 mul.lo.u64 %r138,%r80,%r138; $L75: .loc 1 316 4 st.u64 [%r128+8],%r381; ld.u64 %r80,[%r108]; add.u64 %r382,%r80,-1; st.u64 [%r128+16],%r382; st.u64 [%r128],%r138; .loc 1 309 7 add.u64 %r128,%r128,24; add.u64 %r108,%r108,8; setp.ne.u64 %r383,%r60,%r128; @ %r383 bra $L78; bra $L74; $L76: .loc 1 328 4 st.u64 [%r236+48],%r157; mov.u64 %r385,-1; st.u64 [%r236+56],%r385; mov.u64 %r386,1; st.u64 [%r236+40],%r386; .loc 1 329 4 bra $L54; $L117: .loc 1 332 24 mov.u64 %r388,16388; call (%value_in),_gfortrani_xmallocarray89,[%value_in]; } .loc 1 332 22 st.u64 [%r236],%r389; bra $L79; $L69: .loc 1 337 19 ld.s8 %r391,[%r236+28]; .loc 1 337 10 setp.eq.u64 %r392,%r391,%r151; @ %r392 bra $L80; .loc 1 338 2 cvta.const.u64 %r3933_gfortran_runtime_error80: .loc 1 340 11 cvta.global.u64 %r395,_gfortrani_compile_options; .loc 1 340 10 ld.u32 %r396,[%r395+36]; setp.eq.u32 %r397,%r396,0; @ %r397 bra $L79; .loc 1 342 4 add.u64 %r532,%frame,360; cvta.const.u64 %r401,$LC2r500401; call _gfortrani_bounds_ifunction_return344 4 cvta.const.u64 %r405,$LC20401; call _gfortrani_bounds_equal_extents$L79: .loc 1 349 3 setp.ne.u64 %r407,%r151,0; @ %r407 bra $L81; $L84: .loc 1 357 8 ld.u64 %r165,[%r236]; .loc 1 358 8 ld.u64 %r163,[%r237]; .loc 1 360 9 setp.ne.u64 %r408,%r163,0; @ %r408 bra $L82; bra $L54; $L81: add.u64 %r146,%frame,480; add.u64 %r145,%r236,40; add.u64 %r140,%frame,120; cvt.u32.u32 %r410,%r23; cvt.s64.s8 %r409,%r410; add.u64 %r129,%r409,-1; .loc 1 349 3 mov.u64 %r172,0; add.u64 %r532,%frame,360; .loc 1 351 16 mov.u64 %r411,%r172; $L83: st.u64 [%r146],%r411; .loc 1 352 18 ld.u64 %r412,[%r145]; st.u64 [%r140],%r412; .loc 1 353 17 shl.b64 %r414,%r172,3; add.u64 %r415,%r532,%r414; .loc 1 353 10 ld.u64 %r416,[%r415]; setp.le.s64 %r417,%r416,0; @ %r417 bra $L54; .loc 1 349 26 add.u64 %r172,%r172,1; .loc 1 349 3 add.u64 %r146,%r146,8; add.u64 %r145,%r145,24; add.u64 %r140,%r140,8; setp.ne.u64 %r418,%r129,%r172; @ %r418 bra $L83; bra $L84; $L82: .loc 1 282 44 mul.lo.u64 %r43,%r41,%r241; .loc 1 283 12 mul.lo.u64 %r47,%r44,%r34; .loc 1 396 22 ld.u64 %r100,[%frame+240]; .loc 1 397 23 ld.u64 %r102,[%frame]; .loc 1 398 22 ld.u64 %r104,[%frame+120]; .loc 1 398 12 shl.b64 %r105,%r104,4; .loc 1 400 32 ld.u64 %r29,[%frame+360]; .loc 1 407 23 mul.lo.u64 %r168,%r100,%r29; .loc 1 408 24 mul.lo.u64 %r419,%r102,%r29; .loc 1 408 10 neg.s64 %r144,%r419; .loc 1 409 23 mul.lo.u64 %r420,%r104,%r29; .loc 1 409 9 shl.b64 %r421,%r420,4; neg.s64 %r188,%r421; setp.le.s64 %r533,%r151,1; add.u64 %r539,%r152,-1; .loc 1 384 13 setp.eq.u32 %r540,%r240,0; cvt.u32.u32 %r542,%r23; cvt.s64.s8 %r543,%r542; add.u64 %r544,%r543,-1; add.u64 %r545,%frame,240; $L99: .loc 1 349 3 mov.u64 %r171,%r164; mov.u64 %r142,%r163; .loc 1 372 9 mov.u64 %r199,0; $L88: .loc 1 375 7 ld.s8 %r205,[%r171]; mov.u64 %r209,%r199; add.u64 %r199,%r199,1; .loc 1 375 6 setp.eq.u32 %r535,%r205,0; @ %r535 bra $L85; .loc 1 378 11 mov.u64 %r510,%r199; shr.s64 %r511,%r510,63; .loc 1 382 6 setp.le.s64 %r424,%r152,%r209; @ %r424 bra $L86; add.u64 %r514,%r209,1; shr.s64 %r515,%r514,63; sub.u64 %r428,%r539,%r209; shr.s64 %r431,%r209,63; add.u64 %r434,%r209,2; set.u32.lt.u64 %r439,%r434,%r209; cvt.s64.s32 %r437,%r439; sub.u64 %r443,%r431,%r437; add.u64 %r445,%r428,%r434; set.u32.lt.u64 %r450,%r445,%r428; cvt.s64.s32 %r448,%r450; sub.u64 %r454,%r443,%r448; mov.u64 %r170,%r142; mov.u64 %r526,1; bra $L87; $L85: .loc 1 372 32 add.u64 
%r142,%r142,%r43; .loc 1 372 47 add.u64 %r171,%r171,%r47; .loc 1 372 2 setp.ne.u64 %r455,%r152,%r199; @ %r455 bra $L88; .loc 1 371 9 mov.u64 %r510,0; mov.u64 %r511,%r510; bra $L86; $L87: .loc 1 384 6 @ %r535 bra $L89; .loc 1 4017142241460,[%value_in]; } .loc 1 384 13 @ %r540 bra $L90; setp.ge.s32 %r463,%r460,0; @ %r463 bra $L91; bra $L89; $L90: setp.le.s32 %r464,%r460,0; @ %r464 bra $L89; $L91: .loc 1 388 14 mov.u64 %r510,%r514; mov.u64 %r511,%r515; mov.u64 %r142,%r170; $L89: .loc 1 382 31 add.u64 %r170,%r170,%r43; .loc 1 382 46 add.u64 %r171,%r171,%r47; .loc 1 382 6 add.u64 %r467,%r514,%r526; set.u32.lt.u64 %r472,%r467,%r514; cvt.s64.s32 %r470,%r472; mov.u64 %r514,%r467; sub.u64 %r515,%r515,%r470; setp.ne.u64 %r479,%r445,%r514; @ %r479 bra $L100; setp.ne.u64 %r482,%r454,%r515; @ ! %r482 bra $L86; $L100: .loc 1 384 7 ld.s8 %r205,[%r171]; setp.eq.u32 %r535,%r205,0; bra $L87; $L86: .loc 1 392 8 st.u64 [%r165],%r510; st.u64 [%r165+8],%r511; .loc 1 395 15 ld.u64 %r485,[%frame+480]; add.u64 %r98,%r485,1; .loc 1 396 12 add.u64 %r163,%r163,%r100; .loc 1 397 13 add.u64 %r164,%r164,%r102; .loc 1 398 12 add.u64 %r165,%r165,%r105; .loc 1 400 13 setp.eq.u64 %r486,%r29,%r98; @ %r486 bra $L93; .loc 1 395 15 st.u64 [%frame+480],%r98; bra $L99; $L93: .loc 1 404 13 mov.u64 %r487,0; st.u64 [%frame+480],%r487; .loc 1 408 10 add.u64 %r160,%r164,%r144; .loc 1 409 9 add.u64 %r161,%r165,%r188; .loc 1 411 7 @ %r533 bra $L54; add.u64 %r201,%frame,488; .loc 1 407 23 mov.u64 %r107,%r168; .loc 1 411 7 mov.u64 %r197,8; .loc 1 410 5 mov.u64 %r162,1; add.u64 %r532,%frame,360; add.u64 %r530,%frame,120; bra $L96; $L97: .loc 1 404 13 st.u64 [%r201],%r487; .loc 1 407 23 mul.lo.u64 %r107,%r119,%r118; .loc 1 408 24 mul.lo.u64 %r493,%r121,%r118; .loc 1 408 10 sub.u64 %r160,%r164,%r493; .loc 1 409 23 mul.lo.u64 %r494,%r123,%r118; .loc 1 409 9 shl.b64 %r495,%r494,4; sub.u64 %r161,%r165,%r495; .loc 1 410 5 add.u64 %r162,%r162,1; .loc 1 411 7 add.u64 %r201,%r201,8; add.u64 %r197,%r197,8; setp.eq.u64 %r496,%r162,%r544; @ %r496 bra $L54; $L96: .loc 1 419 16 ld.u64 %r497,[%r201]; add.u64 %r118,%r497,1; st.u64 [%r201],%r118; .loc 1 420 23 add.u64 %r499,%r545,%r197; ld.u64 %r119,[%r499]; .loc 1 420 13 sub.u64 %r500,%r119,%r107; add.u64 %r163,%r163,%r500; .loc 1 421 24 add.u64 %r501,%frame,%r197; ld.u64 %r121,[%r501]; .loc 1 421 14 add.u64 %r164,%r160,%r121; .loc 1 422 23 add.u64 %r503,%r530,%r197; ld.u64 %r123,[%r503]; .loc 1 422 13 shl.b64 %r504,%r123,4; add.u64 %r165,%r161,%r504; .loc 1 400 32 add.u64 %r506,%r532,%r197; ld.u64 %r127,[%r506]; .loc 1 400 13 setp.eq.u64 %r507,%r118,%r127; @ %r507 bra $L97; bra $L99; $L116: .loc 1 305 6 ld.u64 %r508,[%r236]; setp.eq.u64 %r509,%r508,0; @ ! 
%r509 bra $L69; bra $L98; $L54: .loc 1 426_gfortran_smaxloc1_16_s1 .visible .func _gfortran_smaxloc1_16_s1368100predu16 %r180; .reg .predpred %r222; .reg .predu64 %r231; .reg .u16 %r233; .reg .u16 %r234; .reg .u32pred %r259; .reg .pred %r260; .reg .u64u64 %r267; .reg .u32predmov.u64 %r156,%ar0; mov.u64 %r157,%ar1; mov.u64 %r158,%ar2; mov.u64 %r159,%ar3; mov.u32 %r160,%ar4; mov.u64 %r161,%ar5; .loc 1 449 6 setp.eq.u64 %r162,%r159,0; @ %r162 bra $L119; .loc 1 449 20 ld.u32 %r163,[%r159]; setp.eq.u32 %r164,%r163,0; @ %r164 bra $L120; $L119: .loc 1 452 7r1571580161; call _gfortran_maxloc1_16_s1); } .loc 1 456 7 bra $L118; $L120: .loc 1 459 10 ld.u64 %r23,[%r158]; .loc 1 459 7 add.u64 %r95,%r23,-1; .loc 1 460 10 ld.s8 %r24,[%r157+28]; .loc 1 460 38 add.u32 %r26,%r24,-1; .loc 1 460 8 cvt.s64.s32 %r307,%r26; .loc 1 462 7 shr.u64 %r171,%r95,63; set.u32.gt.s64 %r173,%r95,%r307; neg.s32 %r174,%r173; cvt.u16.u64 %r177,%r171; cvt.u16.u32 %r178,%r174; or.b16 %r176,%r177,%r178; .loc 1 462 6 cvt.u32.u16 %r179,%r176; cvt.u16.u8 %r180,%r179; setp.ne.u16 %r181,%r180,0; @ %r181 bra $L122; .loc 1 469 3 setp.ne.u64 %r182,%r95,0; @ %r182 bra $L123; $L129: .loc 1 477 3 setp.lt.s64 %r183,%r95,%r307; @ %r183 bra $L124; bra $L177; $L122: .loc 1 464 7 cvt.u32.u32 %r186,%r24; cvt.s64.s8 %r185,%r186; st.u64 [%stack+8],%r1851841stack; call _gfortran_runtime_error123: add.u64 %r92,%r157,48; add.u64 %r114,%frame,120; add.u64 %r146,%r157,56; add.u64 %r188,%r157,24; add.u64 %r190,%r23,%r23; add.u64 %r191,%r190,%r23; shl.b64 %r192,%r191,3; add.u64 %r152,%r188,%r192; .loc 1 474 12 mov.u64 %r313,0; $L128: .loc 1 471 19 ld.u64 %r194,[%r146]; ld.u64 %r196,[%r92]; sub.u64 %r195,%r194,%r196; .loc 1 471 50 mad.lo.u64 %r38,%r195,%r161,%r161; .loc 1 473 10 setp.le.s64 %r197,%r38,0; @ %r197 bra $L126; .loc 1 471 17 st.u64 [%r114],%r38; bra $L127; $L126: .loc 1 474 12 st.u64 [%r114],%r313; $L127: .loc 1 469 3 add.u64 %r92,%r92,24; add.u64 %r114,%r114,8; add.u64 %r146,%r146,24; setp.ne.u64 %r199,%r92,%r152; @ %r199 bra $L128; bra $L129; $L177: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r200,%r107,0; @ ! 
%r200 bra $L131; bra $L130; $L124: add.u64 %r202,%r23,%r23; add.u64 %r203,%r202,%r23; shl.b64 %r204,%r203,3; add.u64 %r205,%r204,48; add.u64 %r83,%r157,%r205; add.u64 %r305,%frame,120; shl.b64 %r207,%r23,3; add.u64 %r208,%r207,-8; add.u64 %r144,%r305,%r208; add.u64 %r209,%r157,48; cvt.u32.u32 %r211,%r24; cvt.s64.s8 %r210,%r211; add.u64 %r213,%r210,%r210; add.u64 %r214,%r213,%r210; shl.b64 %r215,%r214,3; add.u64 %r72,%r209,%r215; .loc 1 483 12 mov.u64 %r312,0; $L134: .loc 1 480 2 ld.u64 %r217,[%r83+8]; ld.u64 %r219,[%r83]; sub.u64 %r218,%r217,%r219; .loc 1 480 37 mad.lo.u64 %r46,%r218,%r161,%r161; .loc 1 482 10 setp.le.s64 %r220,%r46,0; @ %r220 bra $L132; .loc 1 479 17 st.u64 [%r144],%r46; bra $L133; $L132: .loc 1 483 12 st.u64 [%r144],%r312; $L133: .loc 1 477 3 add.u64 %r83,%r83,24; add.u64 %r144,%r144,8; setp.ne.u64 %r222,%r72,%r83; @ %r222 bra $L134; bra $L178; $L130: .loc 1 490 7 setp.eq.u64 %r223,%r307,0; @ %r223 bra $L136; add.u64 %r305,%frame,120; $L156: add.u64 %r125,%r156,40; mov.u64 %r123,%r305; add.u64 %r224,%r156,16; cvt.u32.u32 %r226,%r24; cvt.s64.s8 %r225,%r226; add.u64 %r228,%r225,%r225; add.u64 %r229,%r228,%r225; shl.b64 %r230,%r229,3; add.u64 %r117,%r224,%r230; .loc 1 493 10 mov.u64 %r89,1; .loc 1 497 4 mov.u64 %r249,0; bra $L137; $L136: .loc 1 501 24 mov.u64 %r231,0; st.u64 [%r156+8],%r231; .loc 1 502 28 cvt.u16.u32 %r234,%r24; add.u16 %r233,%r234,-1; cvt.u32.u16 %r235,%r233; st.u8 [%r156+28],%r235; .loc 1 504 20 add.u32 %r236,%r24,-2; cvt.s64.s32 %r56,%r236; add.u64 %r238,%r56,%r56; add.u64 %r239,%r238,%r56; shl.b64 %r240,%r239,3; add.u64 %r241,%r156,%r240; .loc 1 504 67 shl.b64 %r243,%r56,3; add.u64 %r244,%frame,%r243; .loc 1 504 59 ld.u64 %r246,[%r241+40]; ld.u64 %r247,[%r244+120]; mul.lo.u64 %r103,%r246,%r247; .loc 1 506 10 setp.eq.u64 %r248,%r103,0; @ %r248 bra $L138; bra $L179; $L140: .loc 1 495 48 mul.lo.u64 %r89,%r52,%r89; $L137: .loc 1 497 4 st.u64 [%r125+8],%r249; ld.u64 %r52,[%r123]; add.u64 %r250,%r52,-1; st.u64 [%r125+16],%r250; st.u64 [%r125],%r89; .loc 1 490 7 add.u64 %r125,%r125,24; add.u64 %r123,%r123,8; setp.ne.u64 %r251,%r117,%r125; @ %r251 bra $L140; bra $L136; $L138: .loc 1 509 4 st.u64 [%r156+48],%r103; mov.u64 %r253,-1; st.u64 [%r156+56],%r253; mov.u64 %r254,1; st.u64 [%r156+40],%r254; .loc 1 510 4 bra $L118; $L179: .loc 1 513 24 mov.u64 %r256,16256; call (%value_in),_gfortrani_xmallocarray257,[%value_in]; } mov.u64 %r107,%r257; .loc 1 513 22 st.u64 [%r156],%r107; $L144: .loc 1 539 3 setp.ne.u64 %r259,%r307,0; @ %r259 bra $L141; bra $L142; $L131: .loc 1 517 19 ld.s8 %r61,[%r156+28]; .loc 1 517 10 setp.eq.u64 %r260,%r61,%r307; @ %r260 bra $L143; .loc 1 518 2 st.u64 [%stack+8],%r307; st.u64 [%stack],%r61;_gfortran_runtime_error143: .loc 1 523 11 cvta.global.u64 %r263,_gfortrani_compile_options; .loc 1 523 10 ld.u32 %r264,[%r263+36]; setp.eq.u32 %r265,%r264,0; @ %r265 bra $L144; .loc 1 525 4 setp.eq.u64 %r266,%r307,0; @ %r266 bra $L142; add.u64 %r115,%r156,48; add.u64 %r105,%frame,120; cvt.u32.u32 %r268,%r24; cvt.s64.s8 %r267,%r268; add.u64 %r86,%r267,-1; .loc 1 525 10 mov.u64 %r113,0; $L146: .loc 1 529 21 ld.u64 %r270,[%r115+8]; add.u64 %r269,%r270,1; .loc 1 529 19 ld.u64 %r271,[%r115]; sub.u64 %r100,%r269,%r271; .loc 1 530 18 ld.u64 %r70,[%r105]; .loc 1 525 25 add.u64 %r113,%r113,1; .loc 1 530 11 setp.eq.u64 %r272,%r70,%r100; @ %r272 bra $L145; .loc 1 531 3 st.u64 [%stack+16],%r70; st.u64 [%stack+8],%r100; st.u64 [%stack],%r113;_gfortran_runtime_error145: .loc 1 525 4 add.u64 %r115,%r115,24; add.u64 %r105,%r105,8; setp.ne.u64 %r275,%r86,%r113; @ %r275 
bra $L146; $L141: .loc 1 541 16 cvt.u16.u32 %r277,%r24; setp.le.s16 %r278,%r277,1; @ %r278 bra $L147; shl.b64 %r276,%r307,3; bra $L148; $L147: mov.u64 %r276,8; $L148: add.u64 %r280,%frame,240; mov.u32 %r2842766,[%value_in]; } add.u64 %r130,%r156,40; mov.u64 %r128,%frame; mov.u64 %r109,0; $L149: .loc 1 542 18 ld.u64 %r288,[%r130]; st.u64 [%r128],%r288; .loc 1 539 26 add.u64 %r109,%r109,1; .loc 1 539 3 add.u64 %r130,%r130,24; add.u64 %r128,%r128,8; setp.gt.s64 %r289,%r307,%r109; @ %r289 bra $L149; $L142: ld.u64 %r88,[%frame+240]; .loc 1 551 22 ld.u64 %r75,[%frame]; .loc 1 551 12 shl.b64 %r76,%r75,4; .loc 1 553 32 ld.u64 %r65,[%frame+120]; .loc 1 560 23 mul.lo.u64 %r49,%r65,%r75; setp.le.s64 %r306,%r307,1; .loc 1 549 13 mov.u64 %r290,0; cvt.u32.u32 %r309,%r24; cvt.s64.s8 %r310,%r309; add.u64 %r311,%r310,-1; $L155: st.u64 [%r107],%r290; st.u64 [%r107+8],%r290; .loc 1 550 15 add.u64 %r88,%r88,1; .loc 1 551 12 add.u64 %r107,%r107,%r76; .loc 1 553 13 setp.ne.u64 %r292,%r65,%r88; @ %r292 bra $L155; .loc 1 562 7 @ ! %r306 bra $L180; bra $L118; $L154: .loc 1 557 13 st.u64 [%r108],%r290; .loc 1 560 23 mul.lo.u64 %r78,%r81,%r80; .loc 1 561 5 add.u64 %r106,%r106,1; .loc 1 562 7 add.u64 %r108,%r108,8; add.u64 %r138,%r138,8; add.u64 %r137,%r137,8; setp.ne.u64 %r295,%r106,%r311; @ %r295 bra $L153; bra $L118; $L180: add.u64 %r108,%frame,248; add.u64 %r138,%frame,8; add.u64 %r137,%frame,128; .loc 1 560 23 mov.u64 %r78,%r49; .loc 1 561 5 mov.u64 %r106,1; $L153: .loc 1 566 16 ld.u64 %r300,[%r108]; add.u64 %r80,%r300,1; st.u64 [%r108],%r80; .loc 1 567 23 ld.u64 %r81,[%r138]; .loc 1 567 13 sub.u64 %r301,%r81,%r78; shl.b64 %r302,%r301,4; add.u64 %r107,%r107,%r302; .loc 1 553 32 ld.u64 %r84,[%r137]; .loc 1 553 13 setp.eq.u64 %r303,%r80,%r84; @ %r303 bra $L154; .loc 1 557 13 mov.u64 %r88,0; bra $L155; $L178: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r304,%r107,0; @ ! 
%r304 bra $L131; bra $L156; $L118: .loc 1 maxloc1_16_s4.o/1622802249_gfortran_maxloc1_16_s4 .visible .func _gfortran_maxloc1_16_sfortran/generated/maxloc1_16_s4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_16_s4 .visible .func _gfortran_mmaxloc1_16_s_gfortran_smaxloc1_16_s4 .visible .func _gfortran_smaxloc1_16_sVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_memcmp_char4_gfortrani_memcmp_char4_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents7] = {77,65,88,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,914,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,117,101114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99_gfortran_maxloc1_16_s4 .visible .func _gfortran_maxloc1_16_s, .param .u64 %in_ar4) {48u64predpredu32 %r259; .reg .u64predu64 %r280; .reg .u16 %r282; .reg .u16 %r283; .reg .u32 %r284; .reg .u32pred %r301; .reg .u64predu64 %r324; .reg .u64pred %r361; .reg .pred %r364; .reg .u64u64 %r375; .reg .u64u64pred %r389; .reg .u64mov.u64 %r192,%ar0; mov.u64 %r193,%ar1; mov.u64 %r194,%ar2; mov.u32 %r195,%ar3; mov.u64 %r196,%ar4; .loc 1 70 10 ld.s8 %r22,[%r193+28]; .loc 1 70 38 add.u32 %r197,%r22,-1; .loc 1 70 8 cvt.s64.s32 %r104,%r197; .loc 1 71 10 ld.u64 %r25,[%r194]; .loc 1 71 7 add.u64 %r105,%r25,-1; .loc 1 73 7 shr.u64 %r199,%r105,63; set.u32.lt.s64 %r201,%r104,%r105; neg.s32 %r202,%r201; cvt.u16.u64 %r205,%r199; cvt.u16.u32 %r206,%r202; or.b16 %r204,%r205,%r206; .loc 1 73 6 cvt.u32.u16 %r207,%r204; cvt.u16.u8 %r208,%r207; setp.eq.u16 %r209,%r208,0; @ %r209 bra $L2; .loc 1 75 7 cvt.u32.u32 %r212,%r22; cvt.s64.s8 %r211,%r212; st.u64 [%stack+8],%r2110210_gfortran_runtime_errorr215,%r105,%r105; add.u64 %r216,%r215,%r105; shl.b64 %r217,%r216,3; add.u64 %r218,%r193,%r217; ld.u64 %r30,[%r218+56]; ld.u64 %r32,[%r218+48]; .loc 1 83 11 ld.u64 %r33,[%r218+40]; .loc 1 85 3 setp.ne.u64 %r232,%r105,0; @ %r232 bra $L3; $L9: .loc 1 93 3 setp.gt.s64 %r233,%r104,%r105; @ %r233 bra $L4; bra $L50; $L3: add.u64 %r103,%r193,40; add.u64 %r101,%frame,120; add.u64 %r100,%frame,240; add.u64 %r234,%r193,16; add.u64 %r236,%r25,%r25; add.u64 %r237,%r236,%r25.loc 1 91 12 mov.u64 %r411,0; $L8: .loc 1 87 51 ld.u64 %r240,[%r103]; mul.lo.u64 %r239,%r240,%r196; .loc 1 87 18 st.u64 [%r101],%r239; .loc 1 88 19 
ld.u64 %r242,[%r103+16]; add.u64 %r241,%r242,1; ld.u64 %r243,[%r103+8]; sub.u64 %r42,%r241,%r243; .loc 1 90 10 setp.lt.s64 %r244,%r42,0; @ %r244 bra $L6; .loc 1 88 17 st.u64 [%r100],%r42; bra $L7; $L6: .loc 1 91 12 st.u64 [%r100],%r411; $L7: .loc 1 85 3 add.u64 %r103,%r103,24; add.u64 %r101,%r101,8; add.u64 %r100,%r100,8; setp.ne.u64 %r246,%r89,%r103; @ %r246 bra $L8; bra $L9; $L50: .loc 1 102 6 ld.u64 %r247,[%r192]; setp.eq.u64 %r248,%r247,0; @ ! %r248 bra $L11; bra $L10; $L4: add.u64 %r250,%r25,%r25; add.u64 %r251,%r250,%r25; shl.b64 %r252,%r251,3; add.u64 %r253,%r252,40; add.u64 %r151,%r193,%r253; shl.b64 %r254,%r25,3; add.u64 %r144,%r254,-8; add.u64 %r399,%frame,120; add.u64 %r146,%r399,%r144; add.u64 %r398,%frame,240; add.u64 %r137,%r398,%r144; add.u64 %r257,%r193,40; cvt.u32.u32 %r259,%r22; cvt.s64.s8 %r258,%r259; add.u64 %r261,%r258,%r258; add.u64 %r262,%r261,%r258; shl.b64 %r263,%r262,3; add.u64 %r125,%r257,%r263; .loc 1 99 12 mov.u64 %r410,0; $L14: .loc 1 95 56 ld.u64 %r265,[%r151]; mul.lo.u64 %r264,%r265,%r196; .loc 1 95 18 st.u64 [%r146],%r264; .loc 1 96 19 ld.u64 %r267,[%r151+16]; add.u64 %r266,%r267,1; ld.u64 %r268,[%r151+8]; sub.u64 %r51,%r266,%r268; .loc 1 98 10 setp.lt.s64 %r269,%r51,0; @ %r269 bra $L12; .loc 1 96 17 st.u64 [%r137],%r51; bra $L13; $L12: .loc 1 99 12 st.u64 [%r137],%r410; $L13: .loc 1 93 3 add.u64 %r151,%r151,24; add.u64 %r146,%r146,8; add.u64 %r137,%r137,8; setp.ne.u64 %r271,%r125,%r151; @ %r271 bra $L14; bra $L51; $L10: .loc 1 106 7 setp.le.s64 %r272,%r104,0; @ %r272 bra $L16; add.u64 %r398,%frame,240; $L38: add.u64 %r160,%r192,40; mov.u64 %r158,%r398; add.u64 %r273,%r192,16; cvt.u32.u32 %r275,%r22; cvt.s64.s8 %r274,%r275; add.u64 %r277,%r274,%r274; add.u64 %r278,%r277,%r274; shl.b64 %r279,%r278,3; add.u64 %r152,%r273,%r279; .loc 1 109 10 mov.u64 %r95,1; .loc 1 113 4 mov.u64 %r302,0; bra $L17; $L16: .loc 1 117 24 mov.u64 %r280,0; st.u64 [%r192+8],%r280; .loc 1 118 28 cvt.u16.u32 %r283,%r22; add.u16 %r282,%r283,-1; cvt.u32.u16 %r284,%r282; st.u8 [%r192+28],%r284; .loc 1 120 20 add.u32 %r285,%r22,-2; cvt.s64.s32 %r58,%r285; add.u64 %r287,%r58,%r58; add.u64 %r288,%r287,%r58; shl.b64 %r289,%r288,3; add.u64 %r290,%r192,%r289; .loc 1 120 67 shl.b64 %r292,%r58,3; add.u64 %r293,%frame,%r292; .loc 1 120 59 ld.u64 %r295,[%r290+40]; ld.u64 %r296,[%r293+240]; mul.lo.u64 %r110,%r295,%r296; .loc 1 122 29 mov.u64 %r298,1622 27 st.u64 [%r192],%r299; .loc 1 123 10 setp.eq.u64 %r301,%r110,0; @ ! 
%r301 bra $L21; bra $L18; $L20: .loc 1 111 48 mul.lo.u64 %r95,%r54,%r95; $L17: .loc 1 113 4 st.u64 [%r160+8],%r302; ld.u64 %r54,[%r158]; add.u64 %r303,%r54,-1; st.u64 [%r160+16],%r303; st.u64 [%r160],%r95; .loc 1 106 7 add.u64 %r160,%r160,24; add.u64 %r158,%r158,8; setp.ne.u64 %r304,%r152,%r160; @ %r304 bra $L20; bra $L16; $L18: .loc 1 126 4 st.u64 [%r192+48],%r110; mov.u64 %r306,-1; st.u64 [%r192+56],%r306; mov.u64 %r307,1; st.u64 [%r192+40],%r307; .loc 1 127 4 bra $L1; $L11: .loc 1 133 19 ld.s8 %r64,[%r192+28]; .loc 1 133 10 setp.eq.u64 %r308,%r64,%r104; @ %r308 bra $L23; .loc 1 134 2 st.u64 [%stack+8],%r104; st.u64 [%stack],%r64;_gfortran_runtime_error23: .loc 1 139 11 cvta.global.u64 %r311,_gfortrani_compile_options; .loc 1 139 10 ld.u32 %r312,[%r311+36]; setp.eq.u32 %r313,%r312,0; @ %r313 bra $L21; .loc 1 140 2 add.u64 %r398,%frame,240; cvta.const.u64 %r317,$LC2317; call _gfortrani_bounds_ifunction_return$L21: .loc 1 144 3 setp.gt.s64 %r319,%r104,0; @ %r319 bra $L24; $L28: .loc 1 80 9 add.u64 %r320,%r30,1; .loc 1 80 7 sub.u64 %r106,%r320,%r32; max.s64 %r108,%r106,0; .loc 1 152 8 ld.u64 %r115,[%r193]; .loc 1 153 8 ld.u64 %r116,[%r192]; .loc 1 170 36 mul.lo.u64 %r321,%r33,%r196; shl.b64 %r68,%r321,2; .loc 1 186 22 ld.u64 %r184,[%frame+120]; .loc 1 186 12 shl.b64 %r186,%r184,2; .loc 1 187 22 ld.u64 %r187,[%frame]; .loc 1 187 12 shl.b64 %r189,%r187,4; .loc 1 185 12 ld.u64 %r190,[%frame+360]; setp.gt.s64 %r400,%r106,0; setp.le.s64 %r401,%r104,1; .loc 1 173 6 setp.eq.u32 %r404,%r195,0; cvt.u32.u32 %r405,%r22; cvt.s64.s8 %r406,%r405; add.u64 %r407,%r406,-1; add.u64 %r408,%frame,120; add.u64 %r409,%frame,240; bra $L25; $L24: add.u64 %r173,%frame,360; add.u64 %r172,%r192,40; mov.u64 %r170,%frame; cvt.u32.u32 %r323,%r22; cvt.s64.s8 %r322,%r323; add.u64 %r161,%r322,-1; .loc 1 144 3 mov.u64 %r117,0; add.u64 %r398,%frame,240; .loc 1 146 16 mov.u64 %r324,%r117; $L27: st.u64 [%r173],%r324; .loc 1 147 18 ld.u64 %r325,[%r172]; st.u64 [%r170],%r325; .loc 1 148 17 shl.b64 %r327,%r117,3; add.u64 %r328,%r398,%r327; .loc 1 148 10 ld.u64 %r329,[%r328]; setp.le.s64 %r330,%r329,0; @ %r330 bra $L1; .loc 1 144 26 add.u64 %r117,%r117,1; .loc 1 144 3 add.u64 %r173,%r173,8; add.u64 %r172,%r172,24; add.u64 %r170,%r170,8; setp.ne.u64 %r331,%r117,%r161; @ %r331 bra $L27; bra $L28; $L25: .loc 1 166 5 @ %r400 bra $L39; .loc 1 167 10 mov.u64 %r333,0; st.u64 [%r116],%r333; st.u64 [%r116+8],%r333; bra $L30; $L39: mov.u64 %r111,%r115; mov.u64 %r392,1; mov.u64 %r393,0; .loc 1 164 9 mov.u64 %r99,%r393; .loc 1 165 9 mov.u64 %r390,%r393; mov.u64 %r391,%r393; $L29: .loc 1 173 6 setp.ne.u64 %r335,%r99,0; @ %r335 bra $L31; $L34: .loc 1 177 14 mov.u64 %r390,%r392; mov.u64 %r391,%r393; mov.u64 %r99,%r111; bra $L32; $L31: .loc 1 173 6 @ %r404 bra $L33; .loc 1 42196; call (%value_in),_gfortrani_memcmp_char4340,[%value_in]; } .loc 1 173 22 setp.ge.s32 %r341,%r340,0; @ %r341 bra $L34; bra $L32; $L33: .loc 1 42196; call (%value_in),_gfortrani_memcmp_char4345,[%value_in]; } .loc 1 173 22 setp.gt.s32 %r346,%r345,0; @ %r346 bra $L34; $L32: .loc 1 170 36 add.u64 %r111,%r111,%r68; .loc 1 170 6 add.u64 %r349,%r392,1; set.u32.lt.u64 %r354,%r349,%r392; cvt.s64.s32 %r352,%r354; mov.u64 %r392,%r349; sub.u64 %r393,%r393,%r352; add.u64 %r359,%r392,-1; setp.gt.s64 %r361,%r108,%r359; @ %r361 bra $L29; .loc 1 181 12 st.u64 [%r116],%r390; st.u64 [%r116+8],%r391; $L30: .loc 1 185 15 add.u64 %r190,%r190,1; .loc 1 186 12 add.u64 %r115,%r115,%r186; .loc 1 187 12 add.u64 %r116,%r116,%r189; .loc 1 189 32 ld.u64 %r139,[%frame+240]; .loc 1 189 13 setp.ne.u64 
%r364,%r190,%r139; @ %r364 bra $L25; .loc 1 196 23 mul.lo.u64 %r365,%r190,%r184; .loc 1 196 9 shl.b64 %r366,%r365,2; sub.u64 %r112,%r115,%r366; .loc 1 197 23 mul.lo.u64 %r367,%r190,%r187; .loc 1 197 9 shl.b64 %r368,%r367,4; sub.u64 %r113,%r116,%r368; .loc 1 199 7 @ %r401 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r143,8; .loc 1 198 5 mov.u64 %r114,1; .loc 1 193 13 mov.u64 %r402,0; bra $L36; $L37: st.u64 [%r122],%r402; .loc 1 196 23 mul.lo.u64 %r374,%r80,%r79; .loc 1 196 9 shl.b64 %r375,%r374,2; sub.u64 %r112,%r115,%r375; .loc 1 197 23 mul.lo.u64 %r376,%r83,%r79; .loc 1 197 9 shl.b64 %r377,%r376,4; sub.u64 %r113,%r116,%r377; .loc 1 198 5 add.u64 %r114,%r114,1; .loc 1 199 7 add.u64 %r122,%r122,8; add.u64 %r143,%r143,8; setp.eq.u64 %r378,%r114,%r407; @ %r378 bra $L1; $L36: .loc 1 207 16 ld.u64 %r379,[%r122]; add.u64 %r79,%r379,1; st.u64 [%r122],%r79; .loc 1 208 23 add.u64 %r381,%r408,%r143; ld.u64 %r80,[%r381]; .loc 1 208 13 shl.b64 %r382,%r80,2; add.u64 %r115,%r112,%r382; .loc 1 209 23 add.u64 %r383,%frame,%r143; ld.u64 %r83,[%r383]; .loc 1 209 13 shl.b64 %r384,%r83,4; add.u64 %r116,%r113,%r384; .loc 1 189 32 add.u64 %r386,%r409,%r143; ld.u64 %r86,[%r386]; .loc 1 189 13 setp.eq.u64 %r387,%r79,%r86; @ %r387 bra $L37; mov.u64 %r190,0; bra $L25; $L51: .loc 1 102 6 ld.u64 %r388,[%r192]; setp.eq.u64 %r389,%r388,0; @ ! %r389 bra $L11; bra $L38; $L1_gfortran_mmaxloc1_16_s4 .visible .func _gfortran_mmaxloc1_16_s60810pred %r265; .reg .u64 %r266; .reg .u64 %r267; .reg .u32 %r268; .reg .u64 %r273; .reg .u64 %r274; .reg .u64 %r276; .reg .u64 %r277; .reg .u64 %r284; .reg .pred %r285; .reg .u32 %r286; .reg .u32 %r287; .reg .u32predu64predu16 %r382; .reg .u16 %r383; .reg .u32 %r384; .reg .pred %r385; .reg .u64 %r386; .reg .u64u64 %r410; .reg .pred %r412; .reg .pred %r413; .reg .u64 %r414; .reg .u32pred %r423; .reg .u64 %r424; .reg .pred %r427; .reg .u64 %r431; .reg .u64 %r434; .reg .u64 %r437; .reg .u64 %r440; .reg .u32 %r442; .reg .u64 %r446; .reg .u64 %r448; .reg .u64 %r451; .reg .u32 %r453; .reg .u64 %r457; .reg .pred %r458; .reg .u32 %r464; .reg .pred %r465; .reg .u32 %r469; .reg .pred %r470; .reg .u64predu64 %r511; .reg .u64 %r513; .reg .u64 %r514; .reg .u64 %r516; .reg .pred %r517; .reg .u64 %r518; .reg .pred %r519; .reg .u64 %r520; .reg .u64 %r521; .reg .u64pred %r543; .reg .u64 %r544; .reg .u64 %r545; .reg .pred %r546; .reg .u64 %r549; .reg .pred %r550; .reg .u32 %r551; .reg .u64u64 %r556; .reg .u64 %r557; .reg .u64 %r558;.loc 1 244 6 setp.ne.u64 %r247,%r244,0; @ %r247 bra $L42243245246; call _gfortran_maxloc1_16_s4); } .loc 1 251 7 bra $L52; $L53: .loc 1 254 10 ld.u64 %r22,[%r243]; .loc 1 254 7 add.u64 %r145,%r22,-1; .loc 1 255 10 ld.s8 %r23,[%r242+28]; .loc 1 255 38 add.u32 %r253,%r23,-1; .loc 1 255 8 cvt.s64.s32 %r146,%r253; .loc 1 258 7 shr.u64 %r255,%r145,63; set.u32.gt.s64 %r257,%r145,%r146; neg.s32 %r258,%r257; cvt.u16.u64 %r261,%r255; cvt.u16.u32 %r262,%r258; or.b16 %r260,%r261,%r262; .loc 1 258 6 cvt.u32.u16 %r263,%r260; cvt.u16.u8 %r264,%r263; setp.eq.u16 %r265,%r264,0; @ %r265 bra $L55; .loc 1 260 7 cvt.u32.u32 %r268,%r23; cvt.s64.s8 %r267,%r268; st.u64 [%stack+8],%r267; st.u64 [%stack],%r22; cvta.const.u64 %r266266_gfortran_runtime_error5: .loc 1 265 9 add.u64 %r542,%r145,%r145; add.u64 %r541,%r542,%r145; shl.b64 %r273,%r541,3; add.u64 %r274,%r242,%r273; ld.u64 %r277,[%r274+56]; add.u64 %r276,%r277,1; .loc 1 265 7 ld.u64 %r284,[%r274+48]; sub.u64 %r147,%r276,%r284; .loc 1 266 6 setp.le.s64 %r285,%r147,0; @ %r285 bra $L52; .loc 1 269 9 ld.u64 %r159,[%r244]; .loc 1 271 15 ld.u64 
%r35,[%r244+16]; .loc 1 273 22 cvt.u32.u64 %r36,%r35; .loc 1 273 53 add.u32 %r286,%r36,-4; and.b32 %r287,%r286,-5; set.u32.eq.u32 %r289,%r287,0; neg.s32 %r290,%r289; .loc 1 273 22 add.u32 %r291,%r36,-1; set.u32.le.u32 %r293,%r291,1; neg.s32 %r294,%r293; .loc 1 273 6 cvt.u16.u32 %r296,%r290; cvt.u16.u32 %r297,%r294; or.loc 1 275 7 setp.ne.u32 %r302,%r36,16; @ %r302 bra $L58; $L57: .loc 1 282 11 shl.b64 %r306,%r541,3; add.u64 %r307,%r242,%r306; ld.u64 %r39,[%r307+40]; .loc 1 283 12 add.u64 %r313,%r244,%r306; ld.u64 %r41,[%r313+40]; .loc 1 285 3 setp.ne.u64 %r315,%r145,0; @ %r315 bra $L59; $L65: .loc 1 295 3 setp.lt.s64 %r316,%r145,%r146; @ %r316 bra $L60; bra $L112; $L58: .loc 1 280 5 cvta.const.u64 %r317317_gfortran_runtime_error9: add.u64 %r74,%r242,40; add.u64 %r224,%frame,240; add.u64 %r93,%r244,40; mov.u64 %r232,%frame; add.u64 %r195,%frame,360; add.u64 %r320,%r22,%r22; add.u64 %r321,%r320,%r22; shl.b64 %r322,%r321,3; add.u64 %r323,%r242,16; add.u64 %r236,%r322,%r323; .loc 1 292 12 mov.u64 %r558,0; $L64: .loc 1 287 51 ld.u64 %r325,[%r74]; mul.lo.u64 %r324,%r325,%r246; .loc 1 287 18 st.u64 [%r224],%r324; .loc 1 288 20 ld.u64 %r327,[%r93]; mul.lo.u64 %r326,%r327,%r35; .loc 1 288 18 st.u64 [%r232],%r326; .loc 1 289 19 ld.u64 %r329,[%r74+16]; add.u64 %r328,%r329,1; ld.u64 %r330,[%r74+8]; sub.u64 %r56,%r328,%r330; .loc 1 291 10 setp.lt.s64 %r331,%r56,0; @ %r331 bra $L62; .loc 1 289 17 st.u64 [%r195],%r56; bra $L63; $L62: .loc 1 292 12 st.u64 [%r195],%r558; $L63: .loc 1 285 3 add.u64 %r74,%r74,24; add.u64 %r224,%r224,8; add.u64 %r93,%r93,24; add.u64 %r232,%r232,8; add.u64 %r195,%r195,8; setp.ne.u64 %r333,%r74,%r236; @ %r333 bra $L64; bra $L65; $L112: .loc 1 305 6 ld.u64 %r334,[%r241]; setp.eq.u64 %r335,%r334,0; @ ! %r335 bra $L67; bra $L66; $L60: add.u64 %r337,%r22,%r22; add.u64 %r338,%r337,%r22; shl.b64 %r339,%r338,3; add.u64 %r129,%r339,40; add.u64 %r135,%r242,%r129; shl.b64 %r340,%r22,3; add.u64 %r125,%r340,-8; add.u64 %r545,%frame,240; add.u64 %r127,%r545,%r125; add.u64 %r122,%r244,%r129; add.u64 %r89,%frame,%r125; add.u64 %r544,%frame,360; add.u64 %r43,%r544,%r125; add.u64 %r343,%r242,40; cvt.u32.u32 %r345,%r23; cvt.s64.s8 %r344,%r345; add.u64 %r347,%r344,%r344; add.u64 %r348,%r347,%r344; shl.b64 %r349,%r348,3; add.u64 %r217,%r343,%r349; .loc 1 302 12 mov.u64 %r557,0; $L70: .loc 1 297 55 ld.u64 %r351,[%r135]; mul.lo.u64 %r350,%r351,%r246; .loc 1 297 18 st.u64 [%r127],%r350; .loc 1 298 20 ld.u64 %r353,[%r122]; mul.lo.u64 %r352,%r353,%r35; .loc 1 298 18 st.u64 [%r89],%r352; .loc 1 299 19 ld.u64 %r355,[%r135+16]; add.u64 %r354,%r355,1; ld.u64 %r356,[%r135+8]; sub.u64 %r72,%r354,%r356; .loc 1 301 10 setp.lt.s64 %r357,%r72,0; @ %r357 bra $L68; .loc 1 299 17 st.u64 [%r43],%r72; bra $L69; $L68: .loc 1 302 12 st.u64 [%r43],%r557; $L69: .loc 1 295 3 add.u64 %r135,%r135,24; add.u64 %r127,%r127,8; add.u64 %r122,%r122,24; add.u64 %r89,%r89,8; add.u64 %r43,%r43,8; setp.ne.u64 %r359,%r135,%r217; @ %r359 bra $L70; bra $L113; $L66: .loc 1 309 7 setp.eq.u64 %r360,%r146,0; @ %r360 bra $L72; add.u64 %r544,%frame,360; $L95: add.u64 %r174,%r241,40; mov.u64 %r161,%r544; cvt.u32.u32 %r362,%r23; cvt.s64.s8 %r361,%r362; add.u64 %r364,%r361,%r361; add.u64 %r365,%r364,%r361; shl.b64 %r366,%r365,3; add.u64 %r367,%r241,16; add.u64 %r137,%r366,%r367; .loc 1 312 10 mov.u64 %r133,1; .loc 1 316 4 mov.u64 %r386,0; bra $L73; $L72: .loc 1 320 20 add.u32 %r368,%r23,-2; cvt.s64.s32 %r80,%r368; add.u64 %r370,%r80,%r80; add.u64 %r371,%r370,%r80; shl.b64 %r372,%r371,3; add.u64 %r373,%r241,%r372; .loc 1 320 67 shl.b64 
%r375,%r80,3; add.u64 %r376,%frame,%r375; .loc 1 320 59 ld.u64 %r378,[%r373+40]; ld.u64 %r379,[%r376+360]; mul.lo.u64 %r153,%r378,%r379; .loc 1 322 24 mov.u64 %r380,0; st.u64 [%r241+8],%r380; .loc 1 323 28 cvt.u16.u32 %r383,%r23; add.u16 %r382,%r383,-1; cvt.u32.u16 %r384,%r382; st.u8 [%r241+28],%r384; .loc 1 325 10 setp.eq.u64 %r385,%r153,0; @ %r385 bra $L74; bra $L114; $L76: .loc 1 314 47 mul.lo.u64 %r133,%r77,%r133; $L73: .loc 1 316 4 st.u64 [%r174+8],%r386; ld.u64 %r77,[%r161]; add.u64 %r387,%r77,-1; st.u64 [%r174+16],%r387; st.u64 [%r174],%r133; .loc 1 309 7 add.u64 %r174,%r174,24; add.u64 %r161,%r161,8; setp.ne.u64 %r388,%r137,%r174; @ %r388 bra $L76; bra $L72; $L74: .loc 1 328 4 st.u64 [%r241+48],%r153; mov.u64 %r390,-1; st.u64 [%r241+56],%r390; mov.u64 %r391,1; st.u64 [%r241+40],%r391; .loc 1 329 4 bra $L52; $L114: .loc 1 332 24 mov.u64 %r393,16393; call (%value_in),_gfortrani_xmallocarray94,[%value_in]; } .loc 1 332 22 st.u64 [%r241],%r394; bra $L77; $L67: .loc 1 337 19 ld.s8 %r396,[%r241+28]; .loc 1 337 10 setp.eq.u64 %r397,%r396,%r146; @ %r397 bra $L78; .loc 1 338 2 cvta.const.u64 %r398,$LC5stack; call _gfortran_runtime_error78: .loc 1 340 11 cvta.global.u64 %r400,_gfortrani_compile_options; .loc 1 340 10 ld.u32 %r401,[%r400+36]; setp.eq.u32 %r402,%r401,0; @ %r402 bra $L77; .loc 1 342 4 add.u64 %r544,%frame,360; cvta.const.u64 %r406,$LC2r5440406; call _gfortrani_bounds_ifunction_return344 4 cvta.const.u64 %r410,$LC424210406; call _gfortrani_bounds_equal_extents$L77: .loc 1 349 3 setp.ne.u64 %r412,%r146,0; @ %r412 bra $L79; $L82: .loc 1 357 8 ld.u64 %r160,[%r241]; .loc 1 358 8 ld.u64 %r158,[%r242]; .loc 1 360 9 setp.ne.u64 %r413,%r158,0; @ %r413 bra $L80; bra $L52; $L79: add.u64 %r196,%frame,480; add.u64 %r193,%r241,40; add.u64 %r189,%frame,120; cvt.u32.u32 %r415,%r23; cvt.s64.s8 %r414,%r415; add.u64 %r177,%r414,-1; .loc 1 349 3 mov.u64 %r165,0; add.u64 %r544,%frame,360; .loc 1 351 16 mov.u64 %r416,%r165; $L81: st.u64 [%r196],%r416; .loc 1 352 18 ld.u64 %r417,[%r193]; st.u64 [%r189],%r417; .loc 1 353 17 shl.b64 %r419,%r165,3; add.u64 %r420,%r544,%r419; .loc 1 353 10 ld.u64 %r421,[%r420]; setp.le.s64 %r422,%r421,0; @ %r422 bra $L52; .loc 1 349 26 add.u64 %r165,%r165,1; .loc 1 349 3 add.u64 %r196,%r196,8; add.u64 %r193,%r193,24; add.u64 %r189,%r189,8; setp.ne.u64 %r423,%r165,%r177; @ %r423 bra $L81; bra $L82; $L80: .loc 1 283 12 mul.lo.u64 %r44,%r41,%r35; .loc 1 372 32 mul.lo.u64 %r424,%r39,%r246; shl.b64 %r92,%r424,2; .loc 1 396 22 ld.u64 %r219,[%frame+240]; .loc 1 396 12 shl.b64 %r222,%r219,2; .loc 1 397 23 ld.u64 %r223,[%frame]; .loc 1 398 22 ld.u64 %r226,[%frame+120]; .loc 1 398 12 shl.b64 %r228,%r226,4; setp.le.s64 %r543,%r146,1; add.u64 %r549,%r147,-1; .loc 1 384 13 setp.eq.u32 %r550,%r245,0; cvt.u32.u32 %r551,%r23; cvt.s64.s8 %r552,%r551; add.u64 %r553,%r552,-1; add.u64 %r554,%frame,240; add.u64 %r555,%frame,360; add.u64 %r556,%frame,120; $L96: .loc 1 349 3 mov.u64 %r164,%r159; mov.u64 %r154,%r158; .loc 1 372 9 mov.u64 %r198,0; $L86: .loc 1 375 7 ld.s8 %r230,[%r164]; mov.u64 %r235,%r198; add.u64 %r198,%r198,1; .loc 1 375 6 setp.eq.u32 %r546,%r230,0; @ %r546 bra $L83; .loc 1 378 11 mov.u64 %r520,%r198; shr.s64 %r521,%r520,63; .loc 1 382 6 setp.le.s64 %r427,%r147,%r235; @ %r427 bra $L84; add.u64 %r524,%r235,1; shr.s64 %r525,%r524,63; sub.u64 %r431,%r549,%r235; shr.s64 %r434,%r235,63; add.u64 %r437,%r235,2; set.u32.lt.u64 %r442,%r437,%r235; cvt.s64.s32 %r440,%r442; sub.u64 %r446,%r434,%r440; add.u64 %r448,%r431,%r437; set.u32.lt.u64 %r453,%r448,%r431; cvt.s64.s32 %r451,%r453; 
sub.u64 %r457,%r446,%r451; mov.u64 %r163,%r154; bra $L85; $L83: .loc 1 372 32 add.u64 %r154,%r154,%r92; .loc 1 372 47 add.u64 %r164,%r164,%r44; .loc 1 372 2 setp.ne.u64 %r458,%r147,%r198; @ %r458 bra $L86; .loc 1 371 9 mov.u64 %r520,0; mov.u64 %r521,%r520; bra $L84; $L85: .loc 1 384 6 @ %r546 bra $L87; .loc 1 384 13 @ %r550 bra $L88; .loc 1 4216r154246; call (%value_in),_gfortrani_memcmp_char4464,[%value_in]; } .loc 1 384 13 setp.lt.s32 %r465,%r464,0; @ %r465 bra $L87; $L89: .loc 1 388 14 mov.u64 %r520,%r524; mov.u64 %r521,%r525; mov.u64 %r154,%r163; bra $L87; $L88: .loc 1 4216r154246; call (%value_in),_gfortrani_memcmp_char4469,[%value_in]; } .loc 1 384 13 setp.gt.s32 %r470,%r469,0; @ %r470 bra $L89; $L87: .loc 1 382 31 add.u64 %r163,%r163,%r92; .loc 1 382 46 add.u64 %r164,%r164,%r44; .loc 1 382 6 add.u64 %r473,%r524,1; set.u32.lt.u64 %r478,%r473,%r524; cvt.s64.s32 %r476,%r478; mov.u64 %r524,%r473; sub.u64 %r525,%r525,%r476; setp.ne.u64 %r485,%r448,%r524; @ %r485 bra $L97; setp.ne.u64 %r488,%r457,%r525; @ ! %r488 bra $L84; $L97: .loc 1 384 7 ld.s8 %r230,[%r164]; setp.eq.u32 %r546,%r230,0; bra $L85; $L84: .loc 1 392 8 st.u64 [%r160],%r520; st.u64 [%r160+8],%r521; .loc 1 395 15 ld.u64 %r491,[%frame+480]; add.u64 %r95,%r491,1; st.u64 [%frame+480],%r95; .loc 1 396 12 add.u64 %r158,%r158,%r222; .loc 1 397 13 add.u64 %r159,%r159,%r223; .loc 1 398 12 add.u64 %r160,%r160,%r228; .loc 1 400 32 ld.u64 %r29,[%frame+360]; .loc 1 400 13 setp.ne.u64 %r492,%r29,%r95; @ %r492 bra $L96; .loc 1 404 13 mov.u64 %r493,0; st.u64 [%frame+480],%r493; .loc 1 407 23 mul.lo.u64 %r105,%r29,%r219; .loc 1 408 24 mul.lo.u64 %r494,%r29,%r223; .loc 1 408 10 sub.u64 %r155,%r159,%r494; .loc 1 409 23 mul.lo.u64 %r495,%r29,%r226; .loc 1 409 9 shl.b64 %r496,%r495,4; sub.u64 %r156,%r160,%r496; .loc 1 411 7 @ %r543 bra $L52; add.u64 %r215,%frame,488; mov.u64 %r214,8; .loc 1 410 5 mov.u64 %r157,1; bra $L93; $L94: .loc 1 404 13 st.u64 [%r215],%r493; .loc 1 407 23 mul.lo.u64 %r105,%r115,%r114; .loc 1 408 24 mul.lo.u64 %r502,%r117,%r114; .loc 1 408 10 sub.u64 %r155,%r159,%r502; .loc 1 409 23 mul.lo.u64 %r503,%r119,%r114; .loc 1 409 9 shl.b64 %r504,%r503,4; sub.u64 %r156,%r160,%r504; .loc 1 410 5 add.u64 %r157,%r157,1; .loc 1 411 7 add.u64 %r215,%r215,8; add.u64 %r214,%r214,8; setp.eq.u64 %r505,%r157,%r553; @ %r505 bra $L52; $L93: .loc 1 419 16 ld.u64 %r506,[%r215]; add.u64 %r114,%r506,1; st.u64 [%r215],%r114; .loc 1 420 23 add.u64 %r508,%r554,%r214; ld.u64 %r115,[%r508]; .loc 1 420 13 sub.u64 %r509,%r115,%r105; shl.b64 %r510,%r509,2; add.u64 %r158,%r158,%r510; .loc 1 421 24 add.u64 %r511,%frame,%r214; ld.u64 %r117,[%r511]; .loc 1 421 14 add.u64 %r159,%r155,%r117; .loc 1 422 23 add.u64 %r513,%r556,%r214; ld.u64 %r119,[%r513]; .loc 1 422 13 shl.b64 %r514,%r119,4; add.u64 %r160,%r156,%r514; .loc 1 400 32 add.u64 %r516,%r555,%r214; ld.u64 %r123,[%r516]; .loc 1 400 13 setp.eq.u64 %r517,%r114,%r123; @ %r517 bra $L94; bra $L96; $L113: .loc 1 305 6 ld.u64 %r518,[%r241]; setp.eq.u64 %r519,%r518,0; @ ! 
%r519 bra $L67; bra $L95; $L52: .loc 1 426_gfortran_smaxloc1_16_s4 .visible .func _gfortran_smaxloc1_16_s368100predu16 %r180; .reg .predpred %r222; .reg .predu64 %r231; .reg .u16 %r233; .reg .u16 %r234; .reg .u32pred %r259; .reg .pred %r260; .reg .u64u64 %r267; .reg .u32predmov.u64 %r156,%ar0; mov.u64 %r157,%ar1; mov.u64 %r158,%ar2; mov.u64 %r159,%ar3; mov.u32 %r160,%ar4; mov.u64 %r161,%ar5; .loc 1 449 6 setp.eq.u64 %r162,%r159,0; @ %r162 bra $L116; .loc 1 449 20 ld.u32 %r163,[%r159]; setp.eq.u32 %r164,%r163,0; @ %r164 bra $L117; $L116: .loc 1 452 7r1571580161; call _gfortran_maxloc1_16_s4); } .loc 1 456 7 bra $L115; $L117: .loc 1 459 10 ld.u64 %r23,[%r158]; .loc 1 459 7 add.u64 %r95,%r23,-1; .loc 1 460 10 ld.s8 %r24,[%r157+28]; .loc 1 460 38 add.u32 %r26,%r24,-1; .loc 1 460 8 cvt.s64.s32 %r307,%r26; .loc 1 462 7 shr.u64 %r171,%r95,63; set.u32.gt.s64 %r173,%r95,%r307; neg.s32 %r174,%r173; cvt.u16.u64 %r177,%r171; cvt.u16.u32 %r178,%r174; or.b16 %r176,%r177,%r178; .loc 1 462 6 cvt.u32.u16 %r179,%r176; cvt.u16.u8 %r180,%r179; setp.ne.u16 %r181,%r180,0; @ %r181 bra $L119; .loc 1 469 3 setp.ne.u64 %r182,%r95,0; @ %r182 bra $L120; $L126: .loc 1 477 3 setp.lt.s64 %r183,%r95,%r307; @ %r183 bra $L121; bra $L174; $L119: .loc 1 464 7 cvt.u32.u32 %r186,%r24; cvt.s64.s8 %r185,%r186; st.u64 [%stack+8],%r1851841stack; call _gfortran_runtime_error120: add.u64 %r92,%r157,48; add.u64 %r114,%frame,120; add.u64 %r146,%r157,56; add.u64 %r188,%r157,24; add.u64 %r190,%r23,%r23; add.u64 %r191,%r190,%r23; shl.b64 %r192,%r191,3; add.u64 %r152,%r188,%r192; .loc 1 474 12 mov.u64 %r313,0; $L125: .loc 1 471 19 ld.u64 %r194,[%r146]; ld.u64 %r196,[%r92]; sub.u64 %r195,%r194,%r196; .loc 1 471 50 mad.lo.u64 %r38,%r195,%r161,%r161; .loc 1 473 10 setp.le.s64 %r197,%r38,0; @ %r197 bra $L123; .loc 1 471 17 st.u64 [%r114],%r38; bra $L124; $L123: .loc 1 474 12 st.u64 [%r114],%r313; $L124: .loc 1 469 3 add.u64 %r92,%r92,24; add.u64 %r114,%r114,8; add.u64 %r146,%r146,24; setp.ne.u64 %r199,%r92,%r152; @ %r199 bra $L125; bra $L126; $L174: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r200,%r107,0; @ ! 
%r200 bra $L128; bra $L127; $L121: add.u64 %r202,%r23,%r23; add.u64 %r203,%r202,%r23; shl.b64 %r204,%r203,3; add.u64 %r205,%r204,48; add.u64 %r83,%r157,%r205; add.u64 %r305,%frame,120; shl.b64 %r207,%r23,3; add.u64 %r208,%r207,-8; add.u64 %r144,%r305,%r208; add.u64 %r209,%r157,48; cvt.u32.u32 %r211,%r24; cvt.s64.s8 %r210,%r211; add.u64 %r213,%r210,%r210; add.u64 %r214,%r213,%r210; shl.b64 %r215,%r214,3; add.u64 %r72,%r209,%r215; .loc 1 483 12 mov.u64 %r312,0; $L131: .loc 1 480 2 ld.u64 %r217,[%r83+8]; ld.u64 %r219,[%r83]; sub.u64 %r218,%r217,%r219; .loc 1 480 37 mad.lo.u64 %r46,%r218,%r161,%r161; .loc 1 482 10 setp.le.s64 %r220,%r46,0; @ %r220 bra $L129; .loc 1 479 17 st.u64 [%r144],%r46; bra $L130; $L129: .loc 1 483 12 st.u64 [%r144],%r312; $L130: .loc 1 477 3 add.u64 %r83,%r83,24; add.u64 %r144,%r144,8; setp.ne.u64 %r222,%r72,%r83; @ %r222 bra $L131; bra $L175; $L127: .loc 1 490 7 setp.eq.u64 %r223,%r307,0; @ %r223 bra $L133; add.u64 %r305,%frame,120; $L153: add.u64 %r125,%r156,40; mov.u64 %r123,%r305; add.u64 %r224,%r156,16; cvt.u32.u32 %r226,%r24; cvt.s64.s8 %r225,%r226; add.u64 %r228,%r225,%r225; add.u64 %r229,%r228,%r225; shl.b64 %r230,%r229,3; add.u64 %r117,%r224,%r230; .loc 1 493 10 mov.u64 %r89,1; .loc 1 497 4 mov.u64 %r249,0; bra $L134; $L133: .loc 1 501 24 mov.u64 %r231,0; st.u64 [%r156+8],%r231; .loc 1 502 28 cvt.u16.u32 %r234,%r24; add.u16 %r233,%r234,-1; cvt.u32.u16 %r235,%r233; st.u8 [%r156+28],%r235; .loc 1 504 20 add.u32 %r236,%r24,-2; cvt.s64.s32 %r56,%r236; add.u64 %r238,%r56,%r56; add.u64 %r239,%r238,%r56; shl.b64 %r240,%r239,3; add.u64 %r241,%r156,%r240; .loc 1 504 67 shl.b64 %r243,%r56,3; add.u64 %r244,%frame,%r243; .loc 1 504 59 ld.u64 %r246,[%r241+40]; ld.u64 %r247,[%r244+120]; mul.lo.u64 %r103,%r246,%r247; .loc 1 506 10 setp.eq.u64 %r248,%r103,0; @ %r248 bra $L135; bra $L176; $L137: .loc 1 495 48 mul.lo.u64 %r89,%r52,%r89; $L134: .loc 1 497 4 st.u64 [%r125+8],%r249; ld.u64 %r52,[%r123]; add.u64 %r250,%r52,-1; st.u64 [%r125+16],%r250; st.u64 [%r125],%r89; .loc 1 490 7 add.u64 %r125,%r125,24; add.u64 %r123,%r123,8; setp.ne.u64 %r251,%r117,%r125; @ %r251 bra $L137; bra $L133; $L135: .loc 1 509 4 st.u64 [%r156+48],%r103; mov.u64 %r253,-1; st.u64 [%r156+56],%r253; mov.u64 %r254,1; st.u64 [%r156+40],%r254; .loc 1 510 4 bra $L115; $L176: .loc 1 513 24 mov.u64 %r256,16256; call (%value_in),_gfortrani_xmallocarray257,[%value_in]; } mov.u64 %r107,%r257; .loc 1 513 22 st.u64 [%r156],%r107; $L141: .loc 1 539 3 setp.ne.u64 %r259,%r307,0; @ %r259 bra $L138; bra $L139; $L128: .loc 1 517 19 ld.s8 %r61,[%r156+28]; .loc 1 517 10 setp.eq.u64 %r260,%r61,%r307; @ %r260 bra $L140; .loc 1 518 2 st.u64 [%stack+8],%r307; st.u64 [%stack],%r61;_gfortran_runtime_error140: .loc 1 523 11 cvta.global.u64 %r263,_gfortrani_compile_options; .loc 1 523 10 ld.u32 %r264,[%r263+36]; setp.eq.u32 %r265,%r264,0; @ %r265 bra $L141; .loc 1 525 4 setp.eq.u64 %r266,%r307,0; @ %r266 bra $L139; add.u64 %r115,%r156,48; add.u64 %r105,%frame,120; cvt.u32.u32 %r268,%r24; cvt.s64.s8 %r267,%r268; add.u64 %r86,%r267,-1; .loc 1 525 10 mov.u64 %r113,0; $L143: .loc 1 529 21 ld.u64 %r270,[%r115+8]; add.u64 %r269,%r270,1; .loc 1 529 19 ld.u64 %r271,[%r115]; sub.u64 %r100,%r269,%r271; .loc 1 530 18 ld.u64 %r70,[%r105]; .loc 1 525 25 add.u64 %r113,%r113,1; .loc 1 530 11 setp.eq.u64 %r272,%r70,%r100; @ %r272 bra $L142; .loc 1 531 3 st.u64 [%stack+16],%r70; st.u64 [%stack+8],%r100; st.u64 [%stack],%r113;_gfortran_runtime_error142: .loc 1 525 4 add.u64 %r115,%r115,24; add.u64 %r105,%r105,8; setp.ne.u64 %r275,%r86,%r113; @ %r275 
bra $L143; $L138: .loc 1 541 16 cvt.u16.u32 %r277,%r24; setp.le.s16 %r278,%r277,1; @ %r278 bra $L144; shl.b64 %r276,%r307,3; bra $L145; $L144: mov.u64 %r276,8; $L145: add.u64 %r280,%frame,240; mov.u32 %r2842766,[%value_in]; } add.u64 %r130,%r156,40; mov.u64 %r128,%frame; mov.u64 %r109,0; $L146: .loc 1 542 18 ld.u64 %r288,[%r130]; st.u64 [%r128],%r288; .loc 1 539 26 add.u64 %r109,%r109,1; .loc 1 539 3 add.u64 %r130,%r130,24; add.u64 %r128,%r128,8; setp.gt.s64 %r289,%r307,%r109; @ %r289 bra $L146; $L139: ld.u64 %r88,[%frame+240]; .loc 1 551 22 ld.u64 %r75,[%frame]; .loc 1 551 12 shl.b64 %r76,%r75,4; .loc 1 553 32 ld.u64 %r65,[%frame+120]; .loc 1 560 23 mul.lo.u64 %r49,%r65,%r75; setp.le.s64 %r306,%r307,1; .loc 1 549 13 mov.u64 %r290,0; cvt.u32.u32 %r309,%r24; cvt.s64.s8 %r310,%r309; add.u64 %r311,%r310,-1; $L152: st.u64 [%r107],%r290; st.u64 [%r107+8],%r290; .loc 1 550 15 add.u64 %r88,%r88,1; .loc 1 551 12 add.u64 %r107,%r107,%r76; .loc 1 553 13 setp.ne.u64 %r292,%r65,%r88; @ %r292 bra $L152; .loc 1 562 7 @ ! %r306 bra $L177; bra $L115; $L151: .loc 1 557 13 st.u64 [%r108],%r290; .loc 1 560 23 mul.lo.u64 %r78,%r81,%r80; .loc 1 561 5 add.u64 %r106,%r106,1; .loc 1 562 7 add.u64 %r108,%r108,8; add.u64 %r138,%r138,8; add.u64 %r137,%r137,8; setp.ne.u64 %r295,%r106,%r311; @ %r295 bra $L150; bra $L115; $L177: add.u64 %r108,%frame,248; add.u64 %r138,%frame,8; add.u64 %r137,%frame,128; .loc 1 560 23 mov.u64 %r78,%r49; .loc 1 561 5 mov.u64 %r106,1; $L150: .loc 1 566 16 ld.u64 %r300,[%r108]; add.u64 %r80,%r300,1; st.u64 [%r108],%r80; .loc 1 567 23 ld.u64 %r81,[%r138]; .loc 1 567 13 sub.u64 %r301,%r81,%r78; shl.b64 %r302,%r301,4; add.u64 %r107,%r107,%r302; .loc 1 553 32 ld.u64 %r84,[%r137]; .loc 1 553 13 setp.eq.u64 %r303,%r80,%r84; @ %r303 bra $L151; .loc 1 557 13 mov.u64 %r88,0; bra $L152; $L175: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r304,%r107,0; @ ! 
[Archive member minloc1_4_s1.o (timestamp 1622802249), compiled from fortran/generated/minloc1_4_s1.c as PTX text for the nvptx target. It defines the visible functions _gfortran_minloc1_4_s1, _gfortran_mminloc1_4_s1 and _gfortran_sminloc1_4_s1, and declares the external symbols _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, memcmp and _gfortrani_bounds_equal_extents. Its string constants decode to the standard MINLOC bounds-checking diagnostics ("... incorrect in MINLOC intrinsic: is %ld, should be between 1 and %ld", "Funny sized logical array", "rank of return array incorrect in MINLOC intrinsic: is %ld, should be %ld", "MASK argumen...", "Incorrect extent in return value of MINLOC intrinsic in dimension %ld: is %ld, should be %ld", "return value"). The PTX bodies of the three functions survive only as damaged, partially truncated instruction streams in the dump and are not reconstructed here.]
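For orientation: by libgfortran's naming convention, the minloc1_4_s1 entry points implement the MINLOC intrinsic reduced along one dimension for default-kind (1-byte) character data, returning 4-byte integer indices; the m- and s-prefixed variants take an array mask and a scalar mask respectively. The following is a minimal, self-contained C sketch of that reduction for the unmasked, fixed-length, column-major case; it is illustrative only and does not reproduce libgfortran's descriptor handling, masking or bounds checks.

/* Conceptual sketch only -- not the libgfortran implementation.  For each
 * column of a rank-2 array of fixed-length byte strings it computes the
 * 1-based row index of the lexicographically smallest element (0 if the
 * reduced extent is empty), which is what a minloc1-style reduction over
 * character data returns. */
#include <stdio.h>
#include <string.h>

static void minloc_dim1(const char *base, size_t len,  /* element length   */
                        size_t rows, size_t cols,      /* array extents    */
                        int *result)                   /* cols entries out */
{
    for (size_t c = 0; c < cols; c++) {
        int best = 0;                  /* 0 means "no element seen yet" */
        const char *best_s = NULL;
        for (size_t r = 0; r < rows; r++) {
            const char *s = base + (c * rows + r) * len;   /* column-major */
            if (best == 0 || memcmp(s, best_s, len) < 0) {
                best = (int)r + 1;     /* Fortran indices are 1-based */
                best_s = s;
            }
        }
        result[c] = best;
    }
}

int main(void)
{
    /* 3x2 array of 2-character strings, stored column-major as in Fortran. */
    const char data[] = "bb" "aa" "cc"    /* column 1 */
                        "zz" "yy" "xx";   /* column 2 */
    int loc[2];
    minloc_dim1(data, 2, 3, 2, loc);
    printf("%d %d\n", loc[0], loc[1]);    /* prints: 2 3 */
    return 0;
}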
[Archive member minloc1_4_s4.o (timestamp 1622802249), compiled from fortran/generated/minloc1_4_s4.c as PTX text for the nvptx target. It defines _gfortran_minloc1_4_s4, _gfortran_mminloc1_4_s4 and _gfortran_sminloc1_4_s4 and references the same external helpers as the _s1 member, except that element comparisons go through _gfortrani_memcmp_char4 rather than memcmp. The embedded MINLOC diagnostic strings are identical to those of minloc1_4_s1.o. As above, the function bodies survive only as damaged PTX and are not reconstructed here.]
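The s4 members differ from the s1 members mainly in how elements are compared: the dump references _gfortrani_memcmp_char4 instead of plain memcmp, since kind-4 character data has to be compared as whole 32-bit code units rather than as raw bytes (a byte-wise memcmp would give the wrong ordering on little-endian targets). Below is a hedged sketch of such a comparison; only the symbol name comes from the dump, the function name and exact semantics here are assumptions for illustration.

/* Illustrative sketch of a 4-byte-character comparison: compare n code
 * units as unsigned 32-bit values, returning <0, 0 or >0 like memcmp. */
#include <stdint.h>
#include <stddef.h>

static int memcmp_char4_sketch(const uint32_t *a, const uint32_t *b, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        if (a[i] != b[i])
            return a[i] < b[i] ? -1 : 1;
    }
    return 0;
}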
[Archive member minloc1_8_s1.o (timestamp 1622802249), compiled from fortran/generated/minloc1_8_s1.c as PTX text for the nvptx target. It declares _gfortran_minloc1_8_s1, _gfortran_mminloc1_8_s1 and _gfortran_sminloc1_8_s1 together with the same external symbols and MINLOC diagnostic strings as the _4_ members; the damaged PTX body of _gfortran_minloc1_8_s1 follows.]
mov.u64 %r128,1; bra $L34; $L35: .loc 1 193 13 st.u64 [%r139],%r364; .loc 1 196 23 mul.lo.u64 %r343,%r92,%r91; .loc 1 196 9 sub.u64 %r126,%r129,%r343; .loc 1 197 23 mul.lo.u64 %r344,%r94,%r91; .loc 1 197 9 shl.b64 %r345,%r344,3; sub.u64 %r127,%r130,%r345; .loc 1 198 5 add.u64 %r128,%r128,1; .loc 1 199 7 add.u64 %r139,%r139,8; add.u64 %r159,%r159,8; setp.eq.u64 %r346,%r128,%r367; @ %r346 bra $L1; $L34: .loc 1 207 16 ld.u64 %r347,[%r139]; add.u64 %r91,%r347,1; st.u64 [%r139],%r91; .loc 1 208 23 add.u64 %r349,%r368,%r159; ld.u64 %r92,[%r349]; .loc 1 208 13 add.u64 %r129,%r126,%r92; .loc 1 209 23 add.u64 %r350,%frame,%r159; ld.u64 %r94,[%r350]; .loc 1 209 13 shl.b64 %r351,%r94,3; add.u64 %r130,%r127,%r351; .loc 1 189 32 add.u64 %r353,%r369,%r159; ld.u64 %r97,[%r353]; .loc 1 189 13 setp.eq.u64 %r354,%r91,%r97; @ %r354 bra $L35; mov.u64 %r176,0; bra $L25; $L50: .loc 1 102 6 ld.u64 %r355,[%r183]; setp.eq.u64 %r356,%r355,0; @ ! %r356 bra $L11; bra $L36; $L1_gfortran_mminloc1_8_s1 .visible .func _gfortran_mminloc1_8_s1608u64 %r210; .reg .u64 %r211; .reg .u64 %r213; .reg .u64u64 %r229; .reg .u32 %r230; .reg .u64 %r231; .reg .pred %r232; .reg .u32 %r238; .reg .u64 %r240; .reg .u32 %r242; .reg .u32u64 %r258; .reg .u64u64u64u64predu64 %r357; .reg .u64 %r358; .reg .u64u32pred %r421; .reg .predpred %r447; .reg .u64 %r448; .reg .pred %r449; .reg .u64 %r450; .reg .u64 %r452; .reg .pred %r453; .reg .pred %r454; .reg .u64mov.u64 %r226,%ar0; mov.u64 %r227,%ar1; mov.u64 %r228,%ar2; mov.u64 %r229,%ar3; mov.u32 %r230,%ar4; mov.u64 %r231,%ar5; .loc 1 244 6 setp.ne.u64 %r232,%r229,0; @ %r232 bra $L228230231; call _gfortran_minloc1_8_s1); } .loc 1 251 7 bra $L51; $L52: .loc 1 254 10 ld.u64 %r22,[%r228]; .loc 1 254 7 add.u64 %r150,%r22,-1; .loc 1 255 10 ld.s8 %r23,[%r227+28]; .loc 1 255 38 add.u32 %r238,%r23,-1; .loc 1 255 8 cvt.s64.s32 %r151,%r238; .loc 1 258 7 shr.u64 %r240,%r150,63; set.u32.gt.s64 %r242,%r150,%r151; neg.s32 %r243,%r242; cvt.u16.u64 %r246,%r240; cvt.u16.u32 %r247,%r243; or.b16 %r245,%r246,%r247; .loc 1 258 660 7 cvt.u32.u32 %r253,%r23; cvt.s64.s8 %r252,%r253; st.u64 [%stack+8],%r252; st.u64 [%stack],%r22; cvta.const.u64 %r22_gfortran_runtime_error4: .loc 1 265 9 add.u64 %r455,%r150,%r150; add.u64 %r452,%r455,%r150; shl.b64 %r258,%r452,3; add.u64 %r259,%r227,%r258; ld.u64 %r262,[%r259+56]; add.u64 %r261,%r262,1; .loc 1 265 7 ld.u64 %r269,[%r259+48]; sub.u64 %r152,%r261,%r269; .loc 1 266 6 setp.le.s64 %r270,%r152,0; @ %r270 bra $L51; .loc 1 269 9 ld.u64 %r162,[%r229]; .loc 1 271 15 ld.u64 %r35,[%r229+16]; .loc 1 273 22 cvt.u32.u64 %r36,%r35; .loc 1 273 53 add.u32 %r271,%r36,-4; and.b32 %r272,%r271,-5; set.u32.eq.u32 %r274,%r272,0; neg.s32 %r275,%r274; .loc 1 273 22 add.u32 %r276,%r36,-1; set.u32.le.u32 %r278,%r276,1; neg.s32 %r279,%r278; .loc 1 273 6 cvt.u16.u32 %r281,%r275; cvt.u16.u32 %r282,%r279; or.loc 1 275 7 setp.ne.u32 %r287,%r36,16; @ %r287 bra $L57; $L56: .loc 1 282 11 shl.b64 %r291,%r452,3; add.u64 %r292,%r227,%r291; ld.u64 %r42,[%r292+40]; .loc 1 283 12 add.u64 %r298,%r229,%r291; ld.u64 %r45,[%r298+40]; .loc 1 285 3 setp.ne.u64 %r300,%r150,0; @ %r300 bra $L58; $L64: .loc 1 295 3 setp.lt.s64 %r301,%r150,%r151; @ %r301 bra $L59; bra $L109; $L57: .loc 1 280 5 cvta.const.u64 %r302302_gfortran_runtime_error8: add.u64 %r208,%r227,40; add.u64 %r210,%frame,240; add.u64 %r211,%r229,40; mov.u64 %r213,%frame; add.u64 %r214,%frame,360; add.u64 %r304,%r227,16; add.u64 %r306,%r22,%r22; add.u64 %r307,%r306,%r22; shl.b64 %r308,%r307,3; add.u64 %r223,%r304,%r308; .loc 1 292 12 mov.u64 %r468,0; $L63: .loc 
1 287 51 ld.u64 %r310,[%r208]; mul.lo.u64 %r309,%r310,%r231; .loc 1 287 18 st.u64 [%r210],%r309; .loc 1 288 20 ld.u64 %r312,[%r211]; mul.lo.u64 %r311,%r312,%r35; .loc 1 288 18 st.u64 [%r213],%r311; .loc 1 289 19 ld.u64 %r314,[%r208+16]; add.u64 %r313,%r314,1; ld.u64 %r315,[%r208+8]; sub.u64 %r60,%r313,%r315; .loc 1 291 10 setp.lt.s64 %r316,%r60,0; @ %r316 bra $L61; .loc 1 289 17 st.u64 [%r214],%r60; bra $L62; $L61: .loc 1 292 12 st.u64 [%r214],%r468; $L62: .loc 1 285 3 add.u64 %r208,%r208,24; add.u64 %r210,%r210,8; add.u64 %r211,%r211,24; add.u64 %r213,%r213,8; add.u64 %r214,%r214,8; setp.ne.u64 %r318,%r208,%r223; @ %r318 bra $L63; bra $L64; $L109: .loc 1 305 6 ld.u64 %r319,[%r226]; setp.eq.u64 %r320,%r319,0; @ ! %r320 bra $L66; bra $L65; $L59: add.u64 %r322,%r22,%r22; add.u64 %r323,%r322,%r22; shl.b64 %r324,%r323,3; add.u64 %r40,%r324,40; add.u64 %r71,%r227,%r40; shl.b64 %r325,%r22,3; add.u64 %r171,%r325,-8; add.u64 %r450,%frame,240; add.u64 %r30,%r450,%r171; add.u64 %r173,%r229,%r40; add.u64 %r79,%frame,%r171; add.u64 %r456,%frame,360; add.u64 %r164,%r456,%r171; add.u64 %r328,%r227,40; cvt.u32.u32 %r330,%r23; cvt.s64.s8 %r329,%r330; add.u64 %r332,%r329,%r329; add.u64 %r333,%r332,%r329; shl.b64 %r334,%r333,3; add.u64 %r205,%r328,%r334; .loc 1 302 12 mov.u64 %r467,0; $L69: .loc 1 297 55 ld.u64 %r336,[%r71]; mul.lo.u64 %r335,%r336,%r231; .loc 1 297 18 st.u64 [%r30],%r335; .loc 1 298 20 ld.u64 %r338,[%r173]; mul.lo.u64 %r337,%r338,%r35; .loc 1 298 18 st.u64 [%r79],%r337; .loc 1 299 19 ld.u64 %r340,[%r71+16]; add.u64 %r339,%r340,1; ld.u64 %r341,[%r71+8]; sub.u64 %r76,%r339,%r341; .loc 1 301 10 setp.lt.s64 %r342,%r76,0; @ %r342 bra $L67; .loc 1 299 17 st.u64 [%r164],%r76; bra $L68; $L67: .loc 1 302 12 st.u64 [%r164],%r467; $L68: .loc 1 295 3 add.u64 %r71,%r71,24; add.u64 %r30,%r30,8; add.u64 %r173,%r173,24; add.u64 %r79,%r79,8; add.u64 %r164,%r164,8; setp.ne.u64 %r344,%r71,%r205; @ %r344 bra $L69; bra $L110; $L65: .loc 1 309 7 setp.eq.u64 %r345,%r151,0; @ %r345 bra $L71; add.u64 %r456,%frame,360; $L92: add.u64 %r130,%r226,40; mov.u64 %r128,%r456; cvt.u32.u32 %r347,%r23; cvt.s64.s8 %r346,%r347; add.u64 %r349,%r346,%r346; add.u64 %r350,%r349,%r346; shl.b64 %r351,%r350,3; add.u64 %r352,%r226,16; add.u64 %r78,%r351,%r352; .loc 1 312 10 mov.u64 %r137,1; .loc 1 316 4 mov.u64 %r371,0; bra $L72; $L71: .loc 1 320 20 add.u32 %r353,%r23,-2; cvt.s64.s32 %r84,%r353; add.u64 %r355,%r84,%r84; add.u64 %r356,%r355,%r84; shl.b64 %r357,%r356,3; add.u64 %r358,%r226,%r357; .loc 1 320 67 shl.b64 %r360,%r84,3; add.u64 %r361,%frame,%r360; .loc 1 320 59 ld.u64 %r363,[%r358+40]; ld.u64 %r364,[%r361+360]; mul.lo.u64 %r157,%r363,%r364; .loc 1 322 24 mov.u64 %r365,0; st.u64 [%r226+8],%r365; .loc 1 323 28 cvt.u16.u32 %r368,%r23; add.u16 %r367,%r368,-1; cvt.u32.u16 %r369,%r367; st.u8 [%r226+28],%r369; .loc 1 325 10 setp.eq.u64 %r370,%r157,0; @ %r370 bra $L73; bra $L111; $L75: .loc 1 314 47 mul.lo.u64 %r137,%r81,%r137; $L72: .loc 1 316 4 st.u64 [%r130+8],%r371; ld.u64 %r81,[%r128]; add.u64 %r372,%r81,-1; st.u64 [%r130+16],%r372; st.u64 [%r130],%r137; .loc 1 309 7 add.u64 %r130,%r130,24; add.u64 %r128,%r128,8; setp.ne.u64 %r373,%r78,%r130; @ %r373 bra $L75; bra $L71; $L73: .loc 1 328 4 st.u64 [%r226+48],%r157; mov.u64 %r375,-1; st.u64 [%r226+56],%r375; mov.u64 %r376,1; st.u64 [%r226+40],%r376; .loc 1 329 4 bra $L51; $L111: .loc 1 332 24 mov.u64 %r378378; call (%value_in),_gfortrani_xmallocarray79,[%value_in]; } .loc 1 332 22 st.u64 [%r226],%r379; bra $L76; $L66: .loc 1 337 19 ld.s8 %r381,[%r226+28]; .loc 1 337 10 setp.eq.u64 
%r382,%r381,%r151; @ %r382 bra $L77; .loc 1 338 2 cvta.const.u64 %r383383_gfortran_runtime_error77: .loc 1 340 11 cvta.global.u64 %r385,_gfortrani_compile_options; .loc 1 340 10 ld.u32 %r386,[%r385+36]; setp.eq.u32 %r387,%r386,0; @ %r387 bra $L76; .loc 1 342 4 add.u64 %r456,%frame,360; cvta.const.u64 %r391,$LC2r456390391; call _gfortrani_bounds_ifunction_return344 4 cvta.const.u64 %r395,$LC39391; call _gfortrani_bounds_equal_extents$L76: .loc 1 349 3 setp.ne.u64 %r397,%r151,0; @ %r397 bra $L78; $L81: .loc 1 357 8 ld.u64 %r163,[%r226]; .loc 1 358 8 ld.u64 %r161,[%r227]; .loc 1 360 9 setp.ne.u64 %r398,%r161,0; @ %r398 bra $L79; bra $L51; $L78: add.u64 %r174,%frame,480; add.u64 %r165,%r226,40; add.u64 %r148,%frame,120; cvt.u32.u32 %r400,%r23; cvt.s64.s8 %r399,%r400; add.u64 %r135,%r399,-1; .loc 1 349 3 mov.u64 %r169,0; add.u64 %r456,%frame,360; .loc 1 351 16 mov.u64 %r401,%r169; $L80: st.u64 [%r174],%r401; .loc 1 352 18 ld.u64 %r402,[%r165]; st.u64 [%r148],%r402; .loc 1 353 17 shl.b64 %r404,%r169,3; add.u64 %r405,%r456,%r404; .loc 1 353 10 ld.u64 %r406,[%r405]; setp.le.s64 %r407,%r406,0; @ %r407 bra $L51; .loc 1 349 26 add.u64 %r169,%r169,1; .loc 1 349 3 add.u64 %r174,%r174,8; add.u64 %r165,%r165,24; add.u64 %r148,%r148,8; setp.ne.u64 %r408,%r135,%r169; @ %r408 bra $L80; bra $L81; $L79: .loc 1 282 44 mul.lo.u64 %r44,%r42,%r231; .loc 1 283 12 mul.lo.u64 %r48,%r45,%r35; .loc 1 396 22 ld.u64 %r100,[%frame+240]; .loc 1 397 23 ld.u64 %r102,[%frame]; .loc 1 398 22 ld.u64 %r104,[%frame+120]; .loc 1 398 12 shl.b64 %r106,%r104,3; ld.u64 %r457,[%frame+480]; setp.le.s64 %r454,%r151,1; .loc 1 384 13 setp.eq.u32 %r460,%r230,0; cvt.u32.u32 %r461,%r23; cvt.s64.s8 %r462,%r461; add.u64 %r463,%r462,-1; add.u64 %r464,%frame,240; add.u64 %r465,%frame,360; add.u64 %r466,%frame,120; $L93: .loc 1 349 3 mov.u64 %r168,%r162; mov.u64 %r143,%r161; .loc 1 372 9 mov.u64 %r141,0; $L85: .loc 1 375 7 ld.s8 %r132,[%r168]; mov.u64 %r190,%r141; .loc 1 378 11 add.u64 %r141,%r141,1; .loc 1 375 6 setp.eq.u32 %r453,%r132,0; @ %r453 bra $L82; .loc 1 382 6 setp.gt.s64 %r410,%r152,%r190; @ %r410 bra $L94; bra $L84; $L82: .loc 1 372 32 add.u64 %r143,%r143,%r44; .loc 1 372 47 add.u64 %r168,%r168,%r48; .loc 1 372 2 setp.ne.u64 %r411,%r152,%r141; @ %r411 bra $L85; .loc 1 371 9 mov.u64 %r141,0; bra $L84; $L94: .loc 1 382 6 mov.u64 %r167,%r143; $L83: .loc 1 388 14 add.u64 %r190,%r190,1; .loc 1 384 6 @ %r453 bra $L86; .loc 1 40(%value_in),memcmp,(%out_arg1,%out_arg2,%out_arg3); ld.param.u32 %r416,[%value_in]; } .loc 1 384 13 @ %r460 bra $L87; setp.gt.s32 %r419,%r416,0; selp.u64 %r143,%r143,%r167,%r419; selp.u64 %r141,%r141,%r190,%r419; bra $L86; $L87: setp.ge.s32 %r420,%r416,0; selp.u64 %r143,%r143,%r167,%r420; selp.u64 %r141,%r141,%r190,%r420; $L86: .loc 1 382 31 add.u64 %r167,%r167,%r44; .loc 1 382 46 add.u64 %r168,%r168,%r48; .loc 1 382 6 setp.eq.u64 %r421,%r152,%r190; @ %r421 bra $L84; .loc 1 384 7 ld.s8 %r132,[%r168]; setp.eq.u32 %r453,%r132,0; bra $L83; $L84: .loc 1 392 8 st.u64 [%r163],%r141; .loc 1 395 15 add.u64 %r457,%r457,1; st.u64 [%frame+480],%r457; .loc 1 396 12 add.u64 %r161,%r161,%r100; .loc 1 397 13 add.u64 %r162,%r162,%r102; .loc 1 398 12 add.u64 %r163,%r163,%r106; .loc 1 400 32 ld.u64 %r29,[%frame+360]; .loc 1 400 13 setp.ne.u64 %r423,%r29,%r457; @ %r423 bra $L93; .loc 1 404 13 mov.u64 %r424,0; st.u64 [%frame+480],%r424; .loc 1 407 23 mul.lo.u64 %r109,%r457,%r100; .loc 1 408 24 mul.lo.u64 %r425,%r457,%r102; .loc 1 408 10 sub.u64 %r158,%r162,%r425; .loc 1 409 23 mul.lo.u64 %r426,%r457,%r104; .loc 1 409 9 shl.b64 
%r427,%r426,3; sub.u64 %r159,%r163,%r427; .loc 1 411 7 @ %r454 bra $L51; add.u64 %r197,%frame,488; mov.u64 %r196,8; .loc 1 410 5 mov.u64 %r160,1; bra $L90; $L91: .loc 1 404 13 st.u64 [%r197],%r424; .loc 1 407 23 mul.lo.u64 %r109,%r120,%r119; .loc 1 408 24 mul.lo.u64 %r433,%r122,%r119; .loc 1 408 10 sub.u64 %r158,%r162,%r433; .loc 1 409 23 mul.lo.u64 %r434,%r124,%r119; .loc 1 409 9 shl.b64 %r435,%r434,3; sub.u64 %r159,%r163,%r435; .loc 1 410 5 add.u64 %r160,%r160,1; .loc 1 411 7 add.u64 %r197,%r197,8; add.u64 %r196,%r196,8; setp.eq.u64 %r436,%r160,%r463; @ %r436 bra $L51; $L90: .loc 1 419 16 ld.u64 %r437,[%r197]; add.u64 %r119,%r437,1; st.u64 [%r197],%r119; .loc 1 420 23 add.u64 %r439,%r464,%r196; ld.u64 %r120,[%r439]; .loc 1 420 13 sub.u64 %r440,%r120,%r109; add.u64 %r161,%r161,%r440; .loc 1 421 24 add.u64 %r441,%frame,%r196; ld.u64 %r122,[%r441]; .loc 1 421 14 add.u64 %r162,%r158,%r122; .loc 1 422 23 add.u64 %r443,%r466,%r196; ld.u64 %r124,[%r443]; .loc 1 422 13 shl.b64 %r444,%r124,3; add.u64 %r163,%r159,%r444; .loc 1 400 32 add.u64 %r446,%r465,%r196; ld.u64 %r127,[%r446]; .loc 1 400 13 setp.eq.u64 %r447,%r119,%r127; @ %r447 bra $L91; ld.u64 %r457,[%frame+480]; bra $L93; $L110: .loc 1 305 6 ld.u64 %r448,[%r226]; setp.eq.u64 %r449,%r448,0; @ ! %r449 bra $L66; bra $L92; $L51: .loc 1 426_gfortran_sminloc1_8_s1 .visible .func _gfortran_sminloc1_8_s1368100predu16 %r180; .reg .predpred %r222; .reg .predu64 %r231; .reg .u16 %r233; .reg .u16 %r234; .reg .u32pred %r259; .reg .pred %r260; .reg .u64u64 %r267; .reg .u32pred %r303; .reg .u64 %r304; .reg .pred %r305; .reg .u64 %r306; .reg .u32mov.u64 %r156,%ar0; mov.u64 %r157,%ar1; mov.u64 %r158,%ar2; mov.u64 %r159,%ar3; mov.u32 %r160,%ar4; mov.u64 %r161,%ar5; .loc 1 449 6 setp.eq.u64 %r162,%r159,0; @ %r162 bra $L113; .loc 1 449 20 ld.u32 %r163,[%r159]; setp.eq.u32 %r164,%r163,0; @ %r164 bra $L114; $L113: .loc 1 452 7r1571580161; call _gfortran_minloc1_8_s1); } .loc 1 456 7 bra $L112; $L114: .loc 1 459 10 ld.u64 %r23,[%r158]; .loc 1 459 7 add.u64 %r95,%r23,-1; .loc 1 460 10 ld.s8 %r24,[%r157+28]; .loc 1 460 38 add.u32 %r26,%r24,-1; .loc 1 460 8 cvt.s64.s32 %r306,%r26; .loc 1 462 7 shr.u64 %r171,%r95,63; set.u32.gt.s64 %r173,%r95,%r306; neg.s32 %r174,%r173; cvt.u16.u64 %r177,%r171; cvt.u16.u32 %r178,%r174; or.b16 %r176,%r177,%r178; .loc 1 462 6 cvt.u32.u16 %r179,%r176; cvt.u16.u8 %r180,%r179; setp.ne.u16 %r181,%r180,0; @ %r181 bra $L116; .loc 1 469 3 setp.ne.u64 %r182,%r95,0; @ %r182 bra $L117; $L123: .loc 1 477 3 setp.lt.s64 %r183,%r95,%r306; @ %r183 bra $L118; bra $L171; $L116: .loc 1 464 7 cvt.u32.u32 %r186,%r24; cvt.s64.s8 %r185,%r186; st.u64 [%stack+8],%r1851841stack; call _gfortran_runtime_error117: add.u64 %r92,%r157,48; add.u64 %r114,%frame,120; add.u64 %r146,%r157,56; add.u64 %r188,%r157,24; add.u64 %r190,%r23,%r23; add.u64 %r191,%r190,%r23; shl.b64 %r192,%r191,3; add.u64 %r152,%r188,%r192; .loc 1 474 12 mov.u64 %r312,0; $L122: .loc 1 471 19 ld.u64 %r194,[%r146]; ld.u64 %r196,[%r92]; sub.u64 %r195,%r194,%r196; .loc 1 471 50 mad.lo.u64 %r38,%r195,%r161,%r161; .loc 1 473 10 setp.le.s64 %r197,%r38,0; @ %r197 bra $L120; .loc 1 471 17 st.u64 [%r114],%r38; bra $L121; $L120: .loc 1 474 12 st.u64 [%r114],%r312; $L121: .loc 1 469 3 add.u64 %r92,%r92,24; add.u64 %r114,%r114,8; add.u64 %r146,%r146,24; setp.ne.u64 %r199,%r92,%r152; @ %r199 bra $L122; bra $L123; $L171: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r200,%r107,0; @ ! 
%r200 bra $L125; bra $L124; $L118: add.u64 %r202,%r23,%r23; add.u64 %r203,%r202,%r23; shl.b64 %r204,%r203,3; add.u64 %r205,%r204,48; add.u64 %r83,%r157,%r205; add.u64 %r304,%frame,120; shl.b64 %r207,%r23,3; add.u64 %r208,%r207,-8; add.u64 %r144,%r304,%r208; add.u64 %r209,%r157,48; cvt.u32.u32 %r211,%r24; cvt.s64.s8 %r210,%r211; add.u64 %r213,%r210,%r210; add.u64 %r214,%r213,%r210; shl.b64 %r215,%r214,3; add.u64 %r72,%r209,%r215; .loc 1 483 12 mov.u64 %r311,0; $L128: .loc 1 480 2 ld.u64 %r217,[%r83+8]; ld.u64 %r219,[%r83]; sub.u64 %r218,%r217,%r219; .loc 1 480 37 mad.lo.u64 %r46,%r218,%r161,%r161; .loc 1 482 10 setp.le.s64 %r220,%r46,0; @ %r220 bra $L126; .loc 1 479 17 st.u64 [%r144],%r46; bra $L127; $L126: .loc 1 483 12 st.u64 [%r144],%r311; $L127: .loc 1 477 3 add.u64 %r83,%r83,24; add.u64 %r144,%r144,8; setp.ne.u64 %r222,%r72,%r83; @ %r222 bra $L128; bra $L172; $L124: .loc 1 490 7 setp.eq.u64 %r223,%r306,0; @ %r223 bra $L130; add.u64 %r304,%frame,120; $L150: add.u64 %r125,%r156,40; mov.u64 %r123,%r304; add.u64 %r224,%r156,16; cvt.u32.u32 %r226,%r24; cvt.s64.s8 %r225,%r226; add.u64 %r228,%r225,%r225; add.u64 %r229,%r228,%r225; shl.b64 %r230,%r229,3; add.u64 %r117,%r224,%r230; .loc 1 493 10 mov.u64 %r89,1; .loc 1 497 4 mov.u64 %r249,0; bra $L131; $L130: .loc 1 501 24 mov.u64 %r231,0; st.u64 [%r156+8],%r231; .loc 1 502 28 cvt.u16.u32 %r234,%r24; add.u16 %r233,%r234,-1; cvt.u32.u16 %r235,%r233; st.u8 [%r156+28],%r235; .loc 1 504 20 add.u32 %r236,%r24,-2; cvt.s64.s32 %r56,%r236; add.u64 %r238,%r56,%r56; add.u64 %r239,%r238,%r56; shl.b64 %r240,%r239,3; add.u64 %r241,%r156,%r240; .loc 1 504 67 shl.b64 %r243,%r56,3; add.u64 %r244,%frame,%r243; .loc 1 504 59 ld.u64 %r246,[%r241+40]; ld.u64 %r247,[%r244+120]; mul.lo.u64 %r103,%r246,%r247; .loc 1 506 10 setp.eq.u64 %r248,%r103,0; @ %r248 bra $L132; bra $L173; $L134: .loc 1 495 48 mul.lo.u64 %r89,%r52,%r89; $L131: .loc 1 497 4 st.u64 [%r125+8],%r249; ld.u64 %r52,[%r123]; add.u64 %r250,%r52,-1; st.u64 [%r125+16],%r250; st.u64 [%r125],%r89; .loc 1 490 7 add.u64 %r125,%r125,24; add.u64 %r123,%r123,8; setp.ne.u64 %r251,%r117,%r125; @ %r251 bra $L134; bra $L130; $L132: .loc 1 509 4 st.u64 [%r156+48],%r103; mov.u64 %r253,-1; st.u64 [%r156+56],%r253; mov.u64 %r254,1; st.u64 [%r156+40],%r254; .loc 1 510 4 bra $L112; $L173: .loc 1 513 24 mov.u64 %r256(%value_in),_gfortrani_xmallocarray257,[%value_in]; } mov.u64 %r107,%r257; .loc 1 513 22 st.u64 [%r156],%r107; $L138: .loc 1 539 3 setp.ne.u64 %r259,%r306,0; @ %r259 bra $L135; bra $L136; $L125: .loc 1 517 19 ld.s8 %r61,[%r156+28]; .loc 1 517 10 setp.eq.u64 %r260,%r61,%r306; @ %r260 bra $L137; .loc 1 518 2 st.u64 [%stack+8],%r306; st.u64 [%stack],%r61;_gfortran_runtime_error137: .loc 1 523 11 cvta.global.u64 %r263,_gfortrani_compile_options; .loc 1 523 10 ld.u32 %r264,[%r263+36]; setp.eq.u32 %r265,%r264,0; @ %r265 bra $L138; .loc 1 525 4 setp.eq.u64 %r266,%r306,0; @ %r266 bra $L136; add.u64 %r115,%r156,48; add.u64 %r105,%frame,120; cvt.u32.u32 %r268,%r24; cvt.s64.s8 %r267,%r268; add.u64 %r86,%r267,-1; .loc 1 525 10 mov.u64 %r113,0; $L140: .loc 1 529 21 ld.u64 %r270,[%r115+8]; add.u64 %r269,%r270,1; .loc 1 529 19 ld.u64 %r271,[%r115]; sub.u64 %r100,%r269,%r271; .loc 1 530 18 ld.u64 %r70,[%r105]; .loc 1 525 25 add.u64 %r113,%r113,1; .loc 1 530 11 setp.eq.u64 %r272,%r70,%r100; @ %r272 bra $L139; .loc 1 531 3 st.u64 [%stack+16],%r70; st.u64 [%stack+8],%r100; st.u64 [%stack],%r113;_gfortran_runtime_error139: .loc 1 525 4 add.u64 %r115,%r115,24; add.u64 %r105,%r105,8; setp.ne.u64 %r275,%r86,%r113; @ %r275 bra $L140; 
$L135: .loc 1 541 16 cvt.u16.u32 %r277,%r24; setp.le.s16 %r278,%r277,1; @ %r278 bra $L141; shl.b64 %r276,%r306,3; bra $L142; $L141: mov.u64 %r276,8; $L142: add.u64 %r280,%frame,240; mov.u32 %r2842766,[%value_in]; } add.u64 %r130,%r156,40; mov.u64 %r128,%frame; mov.u64 %r109,0; $L143: .loc 1 542 18 ld.u64 %r288,[%r130]; st.u64 [%r128],%r288; .loc 1 539 26 add.u64 %r109,%r109,1; .loc 1 539 3 add.u64 %r130,%r130,24; add.u64 %r128,%r128,8; setp.gt.s64 %r289,%r306,%r109; @ %r289 bra $L143; $L136: ld.u64 %r88,[%frame+240]; .loc 1 551 22 ld.u64 %r75,[%frame]; .loc 1 551 12 shl.b64 %r76,%r75,3; .loc 1 553 32 ld.u64 %r65,[%frame+120]; .loc 1 560 23 mul.lo.u64 %r49,%r65,%r75; setp.le.s64 %r305,%r306,1; .loc 1 549 13 mov.u64 %r290,0; cvt.u32.u32 %r308,%r24; cvt.s64.s8 %r309,%r308; add.u64 %r310,%r309,-1; $L149: st.u64 [%r107],%r290; .loc 1 550 15 add.u64 %r88,%r88,1; .loc 1 551 12 add.u64 %r107,%r107,%r76; .loc 1 553 13 setp.ne.u64 %r291,%r65,%r88; @ %r291 bra $L149; .loc 1 562 7 @ ! %r305 bra $L174; bra $L112; $L148: .loc 1 557 13 st.u64 [%r108],%r290; .loc 1 560 23 mul.lo.u64 %r78,%r81,%r80; .loc 1 561 5 add.u64 %r106,%r106,1; .loc 1 562 7 add.u64 %r108,%r108,8; add.u64 %r138,%r138,8; add.u64 %r137,%r137,8; setp.ne.u64 %r294,%r106,%r310; @ %r294 bra $L147; bra $L112; $L174: add.u64 %r108,%frame,248; add.u64 %r138,%frame,8; add.u64 %r137,%frame,128; .loc 1 560 23 mov.u64 %r78,%r49; .loc 1 561 5 mov.u64 %r106,1; $L147: .loc 1 566 16 ld.u64 %r299,[%r108]; add.u64 %r80,%r299,1; st.u64 [%r108],%r80; .loc 1 567 23 ld.u64 %r81,[%r138]; .loc 1 567 13 sub.u64 %r300,%r81,%r78; shl.b64 %r301,%r300,3; add.u64 %r107,%r107,%r301; .loc 1 553 32 ld.u64 %r84,[%r137]; .loc 1 553 13 setp.eq.u64 %r302,%r80,%r84; @ %r302 bra $L148; .loc 1 557 13 mov.u64 %r88,0; bra $L149; $L172: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r303,%r107,0; @ ! 
%r303 bra $L125; bra $L150; $L112: .loc 1 minloc1_8_s4.o/ 1622802249_gfortran_minloc1_8_s4 .visible .func _gfortran_minloc1_8_sfortran/generated/minloc1_8_s4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_8_s4 .visible .func _gfortran_mminloc1_8_s_gfortran_sminloc1_8_s4 .visible .func _gfortran_sminloc1_8_sVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_memcmp_char4_gfortrani_memcmp_char4_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents7] = {77,73,78,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,914,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,117,101114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99_gfortran_minloc1_8_s4 .visible .func _gfortran_minloc1_8_s, .param .u64 %in_ar4) {48predpred %r229; .reg .predpredu64 %r258; .reg .u64pred %r266; .reg .pred64pred %r310; .reg .u64 %r313; .reg .u64 %r314; .reg .pred %r316; .reg .u64 %r317; .reg .u64 %r318; .reg .u64 %r319; .reg .u32 %r320; .reg .u64pred %r337; .reg .u32 %r341; .reg .predpredpredpredmov.u64 %r189,%ar0; mov.u64 %r190,%ar1; mov.u64 %r191,%ar2; mov.u32 %r192,%ar3; mov.u64 %r193,%ar4; .loc 1 70 10 ld.s8 %r22,[%r190+28]; .loc 1 70 38 add.u32 %r194,%r22,-1; .loc 1 70 8 cvt.s64.s32 %r114,%r194; .loc 1 71 10 ld.u64 %r25,[%r191]; .loc 1 71 7 add.u64 %r115,%r25,-1; .loc 1 73 7 shr.u64 %r196,%r115,63; set.u32.lt.s64 %r198,%r114,%r115; neg.s32 %r199,%r198; cvt.u16.u64 %r202,%r196;73%r206 bra $L2; .loc 1 75 7 cvt.u32.u32 %r209,%r22; cvt.s64.s8 %r208,%r209; st.u64 [%stack+8],%r208007_gfortran_runtime_errorr212,%r115,%r115; add.u64 %r213,%r212,%r115; shl.b64 %r214,%r213,3; add.u64 %r215,%r190,%r214; ld.u64 %r31,[%r215+56]; ld.u64 %r33,[%r215+48]; .loc 1 83 11 ld.u64 %r34,[%r215+40]; .loc 1 85 3 setp.ne.u64 %r229,%r115,0; @ %r229 bra $L3; $L9: .loc 1 93 3 setp.gt.s64 %r230,%r114,%r115; @ %r230 bra $L4; bra $L50; $L3: add.u64 %r54,%r190,40; add.u64 %r113,%frame,120; add.u64 %r127,%frame,240; add.u64 %r231,%r190,16; add.u64 %r233,%r25,%r25; add.u64 %r234,%r233,%r25; shl.b64 %r235,%r234,3; add.u64 %r55,%r231,%r235; .loc 1 91 12 mov.u64 %r383,0; $L8: .loc 1 87 51 ld.u64 %r237,[%r54]; mul.lo.u64 %r236,%r237,%r193; .loc 1 87 18 st.u64 [%r113],%r236; .loc 1 88 19 ld.u64 %r239,[%r54+16]; add.u64 %r238,%r239,1; ld.u64 %r240,[%r54+8]; 
sub.u64 %r43,%r238,%r240; .loc 1 90 10 setp.lt.s64 %r241,%r43,0; @ %r241 bra $L6; .loc 1 88 17 st.u64 [%r127],%r43; bra $L7; $L6: .loc 1 91 12 st.u64 [%r127],%r383; $L7: .loc 1 85 3 add.u64 %r54,%r54,24; add.u64 %r113,%r113,8; add.u64 %r127,%r127,8; setp.ne.u64 %r243,%r54,%r55; @ %r243 bra $L8; bra $L9; $L50: .loc 1 102 6 ld.u64 %r244,[%r189]; setp.eq.u64 %r245,%r244,0; @ ! %r245 bra $L11; bra $L10; $L4: add.u64 %r247,%r25,%r25; add.u64 %r248,%r247,%r25; shl.b64 %r249,%r248,3; add.u64 %r250,%r249,40; add.u64 %r130,%r190,%r250; shl.b64 %r251,%r25,3; add.u64 %r105,%r251,-8; add.u64 %r373,%frame,120; add.u64 %r108,%r373,%r105; add.u64 %r372,%frame,240; add.u64 %r100,%r372,%r105; add.u64 %r254,%r190,40; cvt.u32.u32 %r256,%r22; cvt.s64.s8 %r255,%r256; add.u64 %r258,%r255,%r255; add.u64 %r259,%r258,%r255; shl.b64 %r260,%r259,3; add.u64 %r70,%r254,%r260; .loc 1 99 12 mov.u64 %r382,0; $L14: .loc 1 95 56 ld.u64 %r262,[%r130]; mul.lo.u64 %r261,%r262,%r193; .loc 1 95 18 st.u64 [%r108],%r261; .loc 1 96 19 ld.u64 %r264,[%r130+16]; add.u64 %r263,%r264,1; ld.u64 %r265,[%r130+8]; sub.u64 %r52,%r263,%r265; .loc 1 98 10 setp.lt.s64 %r266,%r52,0; @ %r266 bra $L12; .loc 1 96 17 st.u64 [%r100],%r52; bra $L13; $L12: .loc 1 99 12 st.u64 [%r100],%r382; $L13: .loc 1 93 3 add.u64 %r130,%r130,24; add.u64 %r108,%r108,8; add.u64 %r100,%r100,8; setp.ne.u64 %r268,%r70,%r130; @ %r268 bra $L14; bra $L51; $L10: .loc 1 106 7 setp.le.s64 %r269,%r114,0; @ %r269 bra $L16; add.u64 %r372,%frame,240; $L38: add.u64 %r147,%r189,40; mov.u64 %r145,%r372; cvt.u32.u32 %r271,%r22; cvt.s64.s8 %r270,%r271; add.u64 %r273,%r270,%r270; add.u64 %r274,%r273,%r270; shl.b64 %r275,%r274,3; add.u64 %r276,%r189,16; add.u64 %r135,%r275,%r276; .loc 1 109 10 mov.u64 %r102,1; .loc 1 113 4 mov.u64 %r299,0; bra $L17; $L16: .loc 1 117 24 mov.u64 %r277,0; st.u64 [%r189+8],%r277; .loc 1 118 28 cvt.u16.u32 %r280,%r22; add.u16 %r279,%r280,-1; cvt.u32.u16 %r281,%r279; st.u8 [%r189+28],%r281; .loc 1 120 20 add.u32 %r282,%r22,-2; cvt.s64.s32 %r61,%r282; add.u64 %r284,%r61,%r61; add.u64 %r285,%r284,%r61; shl.b64 %r286,%r285,3; add.u64 %r287,%r189,%r286; .loc 1 120 67 shl.b64 %r289,%r61,3; add.u64 %r290,%frame,%r289; .loc 1 120 59 ld.u64 %r292,[%r287+40]; ld.u64 %r293,[%r290+240]; mul.lo.u64 %r120,%r292,%r293; .loc 1 122 29 mov.u64 %r29512295; call (%value_in),_gfortrani_xmallocarray296,[%value_in]; } .loc 1 122 27 st.u64 [%r189],%r296; .loc 1 123 10 setp.eq.u64 %r298,%r120,0; @ ! 
%r298 bra $L21; bra $L18; $L20: .loc 1 111 48 mul.lo.u64 %r102,%r57,%r102; $L17: .loc 1 113 4 st.u64 [%r147+8],%r299; ld.u64 %r57,[%r145]; add.u64 %r300,%r57,-1; st.u64 [%r147+16],%r300; st.u64 [%r147],%r102; .loc 1 106 7 add.u64 %r147,%r147,24; add.u64 %r145,%r145,8; setp.ne.u64 %r301,%r135,%r147; @ %r301 bra $L20; bra $L16; $L18: .loc 1 126 4 st.u64 [%r189+48],%r120; mov.u64 %r303,-1; st.u64 [%r189+56],%r303; mov.u64 %r304,1; st.u64 [%r189+40],%r304; .loc 1 127 4 bra $L1; $L11: .loc 1 133 19 ld.s8 %r68,[%r189+28]; .loc 1 133 10 setp.eq.u64 %r305,%r68,%r114; @ %r305 bra $L23; .loc 1 134 2 st.u64 [%stack+8],%r114; st.u64 [%stack],%r68;_gfortran_runtime_error23: .loc 1 139 11 cvta.global.u64 %r308,_gfortrani_compile_options; .loc 1 139 10 ld.u32 %r309,[%r308+36]; setp.eq.u32 %r310,%r309,0; @ %r310 bra $L21; .loc 1 140 2 add.u64 %r372,%frame,240; cvta.const.u64 %r314,$LC2r372313314; call _gfortrani_bounds_ifunction_return$L21: .loc 1 144 3 setp.gt.s64 %r316,%r114,0; @ %r316 bra $L24; $L28: .loc 1 80 9 add.u64 %r317,%r31,1; .loc 1 80 7 sub.u64 %r116,%r317,%r33; max.s64 %r118,%r116,0; .loc 1 152 8 ld.u64 %r101,[%r190]; .loc 1 153 8 ld.u64 %r125,[%r189]; .loc 1 170 36 mul.lo.u64 %r318,%r34,%r193; shl.b64 %r75,%r318,2; .loc 1 186 22 ld.u64 %r177,[%frame+120]; .loc 1 186 12 shl.b64 %r180,%r177,2; .loc 1 187 22 ld.u64 %r181,[%frame]; .loc 1 187 12 shl.b64 %r184,%r181,3; .loc 1 185 12 ld.u64 %r185,[%frame+360]; setp.gt.s64 %r370,%r116,0; setp.le.s64 %r371,%r114,1; .loc 1 173 6 setp.eq.u32 %r376,%r192,0; cvt.u32.u32 %r377,%r22; cvt.s64.s8 %r378,%r377; add.u64 %r379,%r378,-1; add.u64 %r380,%frame,120; add.u64 %r381,%frame,240; bra $L25; $L24: add.u64 %r166,%frame,360; add.u64 %r165,%r189,40; mov.u64 %r163,%frame; cvt.u32.u32 %r320,%r22; cvt.s64.s8 %r319,%r320; add.u64 %r151,%r319,-1; .loc 1 144 3 mov.u64 %r126,0; add.u64 %r372,%frame,240; .loc 1 146 16 mov.u64 %r321,%r126; $L27: st.u64 [%r166],%r321; .loc 1 147 18 ld.u64 %r322,[%r165]; st.u64 [%r163],%r322; .loc 1 148 17 shl.b64 %r324,%r126,3; add.u64 %r325,%r372,%r324; .loc 1 148 10 ld.u64 %r326,[%r325]; setp.le.s64 %r327,%r326,0; @ %r327 bra $L1; .loc 1 144 26 add.u64 %r126,%r126,1; .loc 1 144 3 add.u64 %r166,%r166,8; add.u64 %r165,%r165,24; add.u64 %r163,%r163,8; setp.ne.u64 %r328,%r126,%r151; @ %r328 bra $L27; bra $L28; $L25: .loc 1 166 5 @ %r370 bra $L39; .loc 1 167 10 mov.u64 %r330,0; st.u64 [%r125],%r330; bra $L30; $L39: mov.u64 %r121,%r101; .loc 1 164 9 mov.u64 %r106,0; .loc 1 165 9 mov.u64 %r104,%r106; .loc 1 170 13 mov.u64 %r152,%r106; $L29: .loc 1 177 14 add.u64 %r152,%r152,1; .loc 1 173 6 setp.ne.u64 %r331,%r106,0; @ %r331 bra $L31; $L34: .loc 1 170 13 mov.u64 %r106,%r121; .loc 1 177 14 mov.u64 %r104,%r152; bra $L32; $L31: .loc 1 173 6 @ %r376 bra $L33; .loc 1 42 12 { .param .u32 %value_in;(%value_in),_gfortrani_memcmp_char4336,[%value_in]; } .loc 1 173 22 setp.le.s32 %r337,%r336,0; @ ! 
%r337 bra $L32; bra $L34; $L33: .loc 1 42(%value_in),_gfortrani_memcmp_char4341,[%value_in]; } .loc 1 173 22 setp.lt.s32 %r342,%r341,0; @ %r342 bra $L34; $L32: .loc 1 170 36 add.u64 %r121,%r121,%r75; .loc 1 170 6 setp.gt.s64 %r343,%r118,%r152; @ %r343 bra $L29; .loc 1 181 12 st.u64 [%r125],%r104; $L30: .loc 1 185 15 add.u64 %r185,%r185,1; .loc 1 186 12 add.u64 %r101,%r101,%r180; .loc 1 187 12 add.u64 %r125,%r125,%r184; .loc 1 189 32 ld.u64 %r132,[%frame+240]; .loc 1 189 13 setp.ne.u64 %r344,%r185,%r132; @ %r344 bra $L25; .loc 1 196 23 mul.lo.u64 %r345,%r185,%r177; .loc 1 196 9 shl.b64 %r346,%r345,2; sub.u64 %r122,%r101,%r346; .loc 1 197 23 mul.lo.u64 %r347,%r185,%r181; .loc 1 197 9 shl.b64 %r348,%r347,3; sub.u64 %r123,%r125,%r348; .loc 1 199 7 @ %r371 bra $L1; add.u64 %r134,%frame,368; mov.u64 %r155,8; .loc 1 198 5 mov.u64 %r124,1; .loc 1 193 13 mov.u64 %r374,0; bra $L36; $L37: st.u64 [%r134],%r374; .loc 1 196 23 mul.lo.u64 %r354,%r90,%r89; .loc 1 196 9 shl.b64 %r355,%r354,2; sub.u64 %r122,%r101,%r355; .loc 1 197 23 mul.lo.u64 %r356,%r93,%r89; .loc 1 197 9 shl.b64 %r357,%r356,3; sub.u64 %r123,%r125,%r357; .loc 1 198 5 add.u64 %r124,%r124,1; .loc 1 199 7 add.u64 %r134,%r134,8; add.u64 %r155,%r155,8; setp.eq.u64 %r358,%r124,%r379; @ %r358 bra $L1; $L36: .loc 1 207 16 ld.u64 %r359,[%r134]; add.u64 %r89,%r359,1; st.u64 [%r134],%r89; .loc 1 208 23 add.u64 %r361,%r380,%r155; ld.u64 %r90,[%r361]; .loc 1 208 13 shl.b64 %r362,%r90,2; add.u64 %r101,%r122,%r362; .loc 1 209 23 add.u64 %r363,%frame,%r155; ld.u64 %r93,[%r363]; .loc 1 209 13 shl.b64 %r364,%r93,3; add.u64 %r125,%r123,%r364; .loc 1 189 32 add.u64 %r366,%r381,%r155; ld.u64 %r96,[%r366]; .loc 1 189 13 setp.eq.u64 %r367,%r89,%r96; @ %r367 bra $L37; mov.u64 %r185,0; bra $L25; $L51: .loc 1 102 6 ld.u64 %r368,[%r189]; setp.eq.u64 %r369,%r368,0; @ ! 
%r369 bra $L11; bra $L38; $L1_gfortran_mminloc1_8_s4 .visible .func _gfortran_mminloc1_8_s608pred %r289; .reg .predpred %r304; .reg .pred %r305; .reg .u64u64 %r313; .reg .u64u64pred %r324; .reg .u64predu16 %r371; .reg .u16 %r372; .reg .u32 %r373; .reg .pred %r374; .reg .u64 %r375; .reg .u64 %r376; .reg .predu64u64 %r399; .reg .predpred %r416; .reg .u32 %r422; .reg .pred %r423; .reg .u32 %r427; .reg .pred %r428; .reg .predpred.reg .u64u64u64 %r464; .reg .u64 %r465; .reg .pred.reg .u64 %r475; .reg .u64 %r476; mov.u64 %r230,%ar0; mov.u64 %r231,%ar1; mov.u64 %r232,%ar2; mov.u64 %r233,%ar3; mov.u32 %r234,%ar4; mov.u64 %r235,%ar5; .loc 1 244 6 setp.ne.u64 %r236,%r233,0; @ %r236 bra $L2234235; call _gfortran_minloc1_8_s4); } .loc 1 251 7 bra $L52; $L53: .loc 1 254 10 ld.u64 %r22,[%r232]; .loc 1 254 7 add.u64 %r149,%r22,-1; .loc 1 255 10 ld.s8 %r23,[%r231+28]; .loc 1 255 38 add.u32 %r242,%r23,-1; .loc 1 255 8 cvt.s64.s32 %r150,%r242; .loc 1 258 7 shr.u64 %r244,%r149,63; set.u32.gt.s64 %r246,%r149,%r150; neg.s32 %r247,%r246; cvt.u16.u64 %r250,%r244; cvt.u16.u32 %r251,%r247; or.b16 %r249,%r250,%r251; .loc 1 258 6.loc 1 260 7 cvt.u32.u32 %r257,%r23; cvt.s64.s8 %r256,%r257; st.u64 [%stack+8],%r256; st.u64 [%stack],%r22; cvta.const.u64 %r22_gfortran_runtime_error5: .loc 1 265 9 add.u64 %r464,%r149,%r149; add.u64 %r462,%r464,%r149; shl.b64 %r262,%r462,3; add.u64 %r263,%r231,%r262; ld.u64 %r266,[%r263+56]; add.u64 %r265,%r266,1; .loc 1 265 7 ld.u64 %r273,[%r263+48]; sub.u64 %r151,%r265,%r273; .loc 1 266 6 setp.le.s64 %r274,%r151,0; @ %r274 bra $L52; .loc 1 269 9 ld.u64 %r164,[%r233]; .loc 1 271 15 ld.u64 %r35,[%r233+16]; .loc 1 273 22 cvt.u32.u64 %r36,%r35; .loc 1 273 53 add.u32 %r275,%r36,-4; and.b32 %r276,%r275,-5; set.u32.eq.u32 %r278,%r276,0; neg.s32 %r279,%r278; .loc 1 273 22 add.u32 %r280,%r36,-1; set.u32.le.u32 %r282,%r280,1; neg.s32 %r283,%r282; .loc 1 273 6 cvt.u16.u32 %r285,%r279; cvt.u16.u32 %r286,%r283; or.b16 %r284,%r285,%r286; cvt.u32.u16 %r287,%r284; cvt.u16.u8 %r288,%r287; setp.ne.u16 %r289,%r288,0; @ %r289 bra $L57; .loc 1 275 7 setp.ne.u32 %r291,%r36,16; @ %r291 bra $L58; $L57: .loc 1 282 11 shl.b64 %r295,%r462,3; add.u64 %r296,%r231,%r295; ld.u64 %r40,[%r296+40]; .loc 1 283 12 add.u64 %r302,%r233,%r295; ld.u64 %r42,[%r302+40]; .loc 1 285 3 setp.ne.u64 %r304,%r149,0; @ %r304 bra $L59; $L65: .loc 1 295 3 setp.lt.s64 %r305,%r149,%r150; @ %r305 bra $L60; bra $L111; $L58: .loc 1 280 5 cvta.const.u64 %r306306_gfortran_runtime_error9: add.u64 %r103,%r231,40; add.u64 %r213,%frame,240; add.u64 %r214,%r233,40; mov.u64 %r216,%frame; add.u64 %r217,%frame,360; add.u64 %r308,%r231,16; add.u64 %r310,%r22,%r22; add.u64 %r311,%r310,%r22; shl.b64 %r312,%r311,3; add.u64 %r226,%r308,%r312; .loc 1 292 12 mov.u64 %r476,0; $L64: .loc 1 287 51 ld.u64 %r314,[%r103]; mul.lo.u64 %r313,%r314,%r235; .loc 1 287 18 st.u64 [%r213],%r313; .loc 1 288 20 ld.u64 %r316,[%r214]; mul.lo.u64 %r315,%r316,%r35; .loc 1 288 18 st.u64 [%r216],%r315; .loc 1 289 19 ld.u64 %r318,[%r103+16]; add.u64 %r317,%r318,1; ld.u64 %r319,[%r103+8]; sub.u64 %r57,%r317,%r319; .loc 1 291 10 setp.lt.s64 %r320,%r57,0; @ %r320 bra $L62; .loc 1 289 17 st.u64 [%r217],%r57; bra $L63; $L62: .loc 1 292 12 st.u64 [%r217],%r476; $L63: .loc 1 285 3 add.u64 %r103,%r103,24; add.u64 %r213,%r213,8; add.u64 %r214,%r214,24; add.u64 %r216,%r216,8; add.u64 %r217,%r217,8; setp.ne.u64 %r322,%r103,%r226; @ %r322 bra $L64; bra $L65; $L111: .loc 1 305 6 ld.u64 %r323,[%r230]; setp.eq.u64 %r324,%r323,0; @ ! 
%r324 bra $L67; bra $L66; $L60: add.u64 %r326,%r22,%r22; add.u64 %r327,%r326,%r22; shl.b64 %r328,%r327,3; add.u64 %r58,%r328,40; add.u64 %r77,%r231,%r58; shl.b64 %r329,%r22,3; add.u64 %r148,%r329,-8; add.u64 %r460,%frame,240; add.u64 %r39,%r460,%r148; add.u64 %r173,%r233,%r58; add.u64 %r143,%frame,%r148; add.u64 %r465,%frame,360; add.u64 %r178,%r465,%r148; add.u64 %r332,%r231,40; cvt.u32.u32 %r334,%r23; cvt.s64.s8 %r333,%r334; add.u64 %r336,%r333,%r333; add.u64 %r337,%r336,%r333; shl.b64 %r338,%r337,3; add.u64 %r100,%r332,%r338; .loc 1 302 12 mov.u64 %r475,0; $L70: .loc 1 297 55 ld.u64 %r340,[%r77]; mul.lo.u64 %r339,%r340,%r235; .loc 1 297 18 st.u64 [%r39],%r339; .loc 1 298 20 ld.u64 %r342,[%r173]; mul.lo.u64 %r341,%r342,%r35; .loc 1 298 18 st.u64 [%r143],%r341; .loc 1 299 19 ld.u64 %r344,[%r77+16]; add.u64 %r343,%r344,1; ld.u64 %r345,[%r77+8]; sub.u64 %r75,%r343,%r345; .loc 1 301 10 setp.lt.s64 %r346,%r75,0; @ %r346 bra $L68; .loc 1 299 17 st.u64 [%r178],%r75; bra $L69; $L68: .loc 1 302 12 st.u64 [%r178],%r475; $L69: .loc 1 295 3 add.u64 %r77,%r77,24; add.u64 %r39,%r39,8; add.u64 %r173,%r173,24; add.u64 %r143,%r143,8; add.u64 %r178,%r178,8; setp.ne.u64 %r348,%r77,%r100; @ %r348 bra $L70; bra $L112; $L66: .loc 1 309 7 setp.eq.u64 %r349,%r150,0; @ %r349 bra $L72; add.u64 %r465,%frame,360; $L94: add.u64 %r130,%r230,40; mov.u64 %r128,%r465; cvt.u32.u32 %r351,%r23; cvt.s64.s8 %r350,%r351; add.u64 %r353,%r350,%r350; add.u64 %r354,%r353,%r350; shl.b64 %r355,%r354,3; add.u64 %r356,%r230,16; add.u64 %r92,%r355,%r356; .loc 1 312 10 mov.u64 %r135,1; .loc 1 316 4 mov.u64 %r375,0; bra $L73; $L72: .loc 1 320 20 add.u32 %r357,%r23,-2; cvt.s64.s32 %r83,%r357; add.u64 %r359,%r83,%r83; add.u64 %r360,%r359,%r83; shl.b64 %r361,%r360,3; add.u64 %r362,%r230,%r361; .loc 1 320 67 shl.b64 %r364,%r83,3; add.u64 %r365,%frame,%r364; .loc 1 320 59 ld.u64 %r367,[%r362+40]; ld.u64 %r368,[%r365+360]; mul.lo.u64 %r157,%r367,%r368; .loc 1 322 24 mov.u64 %r369,0; st.u64 [%r230+8],%r369; .loc 1 323 28 cvt.u16.u32 %r372,%r23; add.u16 %r371,%r372,-1; cvt.u32.u16 %r373,%r371; st.u8 [%r230+28],%r373; .loc 1 325 10 setp.eq.u64 %r374,%r157,0; @ %r374 bra $L74; bra $L113; $L76: .loc 1 314 47 mul.lo.u64 %r135,%r80,%r135; $L73: .loc 1 316 4 st.u64 [%r130+8],%r375; ld.u64 %r80,[%r128]; add.u64 %r376,%r80,-1; st.u64 [%r130+16],%r376; st.u64 [%r130],%r135; .loc 1 309 7 add.u64 %r130,%r130,24; add.u64 %r128,%r128,8; setp.ne.u64 %r377,%r92,%r130; @ %r377 bra $L76; bra $L72; $L74: .loc 1 328 4 st.u64 [%r230+48],%r157; mov.u64 %r379,-1; st.u64 [%r230+56],%r379; mov.u64 %r380,1; st.u64 [%r230+40],%r380; .loc 1 329 4 bra $L52; $L113: .loc 1 332 24 mov.u64 %r38382; call (%value_in),_gfortrani_xmallocarray83,[%value_in]; } .loc 1 332 22 st.u64 [%r230],%r383; bra $L77; $L67: .loc 1 337 19 ld.s8 %r385,[%r230+28]; .loc 1 337 10 setp.eq.u64 %r386,%r385,%r150; @ %r386 bra $L78; .loc 1 338 2 cvta.const.u64 %r387387_gfortran_runtime_error78: .loc 1 340 11 cvta.global.u64 %r389,_gfortrani_compile_options; .loc 1 340 10 ld.u32 %r390,[%r389+36]; setp.eq.u32 %r391,%r390,0; @ %r391 bra $L77; .loc 1 342 4 add.u64 %r465,%frame,360; cvta.const.u64 %r395,$LC2r394395; call _gfortrani_bounds_ifunction_return344 4 cvta.const.u64 %r399,$LC3r2399395; call _gfortrani_bounds_equal_extents$L77: .loc 1 349 3 setp.ne.u64 %r401,%r150,0; @ %r401 bra $L79; $L82: .loc 1 357 8 ld.u64 %r165,[%r230]; .loc 1 358 8 ld.u64 %r163,[%r231]; .loc 1 360 9 setp.ne.u64 %r402,%r163,0; @ %r402 bra $L80; bra $L52; $L79: add.u64 %r174,%frame,480; add.u64 %r166,%r230,40; add.u64 
%r145,%frame,120; cvt.u32.u32 %r404,%r23; cvt.s64.s8 %r403,%r404; add.u64 %r131,%r403,-1; .loc 1 349 3 mov.u64 %r171,0; add.u64 %r465,%frame,360; .loc 1 351 16 mov.u64 %r405,%r171; $L81: st.u64 [%r174],%r405; .loc 1 352 18 ld.u64 %r406,[%r166]; st.u64 [%r145],%r406; .loc 1 353 17 shl.b64 %r408,%r171,3; add.u64 %r409,%r465,%r408; .loc 1 353 10 ld.u64 %r410,[%r409]; setp.le.s64 %r411,%r410,0; @ %r411 bra $L52; .loc 1 349 26 add.u64 %r171,%r171,1; .loc 1 349 3 add.u64 %r174,%r174,8; add.u64 %r166,%r166,24; add.u64 %r145,%r145,8; setp.ne.u64 %r412,%r131,%r171; @ %r412 bra $L81; bra $L82; $L80: .loc 1 283 12 mul.lo.u64 %r45,%r42,%r35; .loc 1 372 32 mul.lo.u64 %r413,%r40,%r235; shl.b64 %r95,%r413,2; .loc 1 396 22 ld.u64 %r197,[%frame+240]; .loc 1 396 12 shl.b64 %r200,%r197,2; .loc 1 397 23 ld.u64 %r201,[%frame]; .loc 1 398 22 ld.u64 %r203,[%frame+120]; .loc 1 398 12 shl.b64 %r205,%r203,3; setp.le.s64 %r463,%r150,1; .loc 1 384 13 setp.eq.u32 %r468,%r234,0; cvt.u32.u32 %r469,%r23; cvt.s64.s8 %r470,%r469; add.u64 %r471,%r470,-1; add.u64 %r472,%frame,240; add.u64 %r473,%frame,360; add.u64 %r474,%frame,120; $L95: .loc 1 349 3 mov.u64 %r170,%r164; mov.u64 %r159,%r163; .loc 1 372 9 mov.u64 %r139,0; $L86: .loc 1 375 7 ld.s8 %r206,[%r170]; mov.u64 %r190,%r139; .loc 1 378 11 add.u64 %r139,%r139,1; .loc 1 375 6 setp.eq.u32 %r459,%r206,0; @ %r459 bra $L83; .loc 1 382 6 setp.gt.s64 %r415,%r151,%r190; @ %r415 bra $L96; bra $L85; $L83: .loc 1 372 32 add.u64 %r159,%r159,%r95; .loc 1 372 47 add.u64 %r170,%r170,%r45; .loc 1 372 2 setp.ne.u64 %r416,%r151,%r139; @ %r416 bra $L86; .loc 1 371 9 mov.u64 %r139,0; bra $L85; $L96: .loc 1 382 6 mov.u64 %r169,%r159; $L84: .loc 1 388 14 add.u64 %r190,%r190,1; .loc 1 384 6 @ %r459 bra $L87; .loc 1 384 13 @ %r468 bra $L88; .loc 1 42call (%value_in),_gfortrani_memcmp_char4422,[%value_in]; } .loc 1 384 13 setp.gt.s32 %r423,%r422,0; @ %r423 bra $L87; $L89: .loc 1 382 6 mov.u64 %r159,%r169; .loc 1 388 14 mov.u64 %r139,%r190; bra $L87; $L88: .loc 1 42call (%value_in),_gfortrani_memcmp_char4427,[%value_in]; } .loc 1 384 13 setp.lt.s32 %r428,%r427,0; @ %r428 bra $L89; $L87: .loc 1 382 31 add.u64 %r169,%r169,%r95; .loc 1 382 46 add.u64 %r170,%r170,%r45; .loc 1 382 6 setp.eq.u64 %r429,%r151,%r190; @ %r429 bra $L85; .loc 1 384 7 ld.s8 %r206,[%r170]; setp.eq.u32 %r459,%r206,0; bra $L84; $L85: .loc 1 392 8 st.u64 [%r165],%r139; .loc 1 395 15 ld.u64 %r430,[%frame+480]; add.u64 %r98,%r430,1; st.u64 [%frame+480],%r98; .loc 1 396 12 add.u64 %r163,%r163,%r200; .loc 1 397 13 add.u64 %r164,%r164,%r201; .loc 1 398 12 add.u64 %r165,%r165,%r205; .loc 1 400 32 ld.u64 %r29,[%frame+360]; .loc 1 400 13 setp.ne.u64 %r431,%r29,%r98; @ %r431 bra $L95; .loc 1 404 13 mov.u64 %r432,0; st.u64 [%frame+480],%r432; .loc 1 407 23 mul.lo.u64 %r107,%r29,%r197; .loc 1 408 24 mul.lo.u64 %r433,%r29,%r201; .loc 1 408 10 sub.u64 %r160,%r164,%r433; .loc 1 409 23 mul.lo.u64 %r434,%r29,%r203; .loc 1 409 9 shl.b64 %r435,%r434,3; sub.u64 %r161,%r165,%r435; .loc 1 411 7 @ %r463 bra $L52; add.u64 %r198,%frame,488; mov.u64 %r196,8; .loc 1 410 5 mov.u64 %r162,1; bra $L92; $L93: .loc 1 404 13 st.u64 [%r198],%r432; .loc 1 407 23 mul.lo.u64 %r107,%r119,%r118; .loc 1 408 24 mul.lo.u64 %r441,%r121,%r118; .loc 1 408 10 sub.u64 %r160,%r164,%r441; .loc 1 409 23 mul.lo.u64 %r442,%r123,%r118; .loc 1 409 9 shl.b64 %r443,%r442,3; sub.u64 %r161,%r165,%r443; .loc 1 410 5 add.u64 %r162,%r162,1; .loc 1 411 7 add.u64 %r198,%r198,8; add.u64 %r196,%r196,8; setp.eq.u64 %r444,%r162,%r471; @ %r444 bra $L52; $L92: .loc 1 419 16 ld.u64 %r445,[%r198]; 
add.u64 %r118,%r445,1; st.u64 [%r198],%r118; .loc 1 420 23 add.u64 %r447,%r472,%r196; ld.u64 %r119,[%r447]; .loc 1 420 13 sub.u64 %r448,%r119,%r107; shl.b64 %r449,%r448,2; add.u64 %r163,%r163,%r449; .loc 1 421 24 add.u64 %r450,%frame,%r196; ld.u64 %r121,[%r450]; .loc 1 421 14 add.u64 %r164,%r160,%r121; .loc 1 422 23 add.u64 %r452,%r474,%r196; ld.u64 %r123,[%r452]; .loc 1 422 13 shl.b64 %r453,%r123,3; add.u64 %r165,%r161,%r453; .loc 1 400 32 add.u64 %r455,%r473,%r196; ld.u64 %r127,[%r455]; .loc 1 400 13 setp.eq.u64 %r456,%r118,%r127; @ %r456 bra $L93; bra $L95; $L112: .loc 1 305 6 ld.u64 %r457,[%r230]; setp.eq.u64 %r458,%r457,0; @ ! %r458 bra $L67; bra $L94; $L52: .loc 1 426_gfortran_sminloc1_8_s4 .visible .func _gfortran_sminloc1_8_s368100predu16 %r180; .reg .predpred %r222; .reg .predu64 %r231; .reg .u16 %r233; .reg .u16 %r234; .reg .u32pred %r259; .reg .pred %r260; .reg .u64u64 %r267; .reg .u32pred %r303; .reg .u64 %r304; .reg .pred %r305; .reg .u64 %r306; .reg .u32mov.u64 %r156,%ar0; mov.u64 %r157,%ar1; mov.u64 %r158,%ar2; mov.u64 %r159,%ar3; mov.u32 %r160,%ar4; mov.u64 %r161,%ar5; .loc 1 449 6 setp.eq.u64 %r162,%r159,0; @ %r162 bra $L115; .loc 1 449 20 ld.u32 %r163,[%r159]; setp.eq.u32 %r164,%r163,0; @ %r164 bra $L116; $L115: .loc 1 452 7r1571580161; call _gfortran_minloc1_8_s4); } .loc 1 456 7 bra $L114; $L116: .loc 1 459 10 ld.u64 %r23,[%r158]; .loc 1 459 7 add.u64 %r95,%r23,-1; .loc 1 460 10 ld.s8 %r24,[%r157+28]; .loc 1 460 38 add.u32 %r26,%r24,-1; .loc 1 460 8 cvt.s64.s32 %r306,%r26; .loc 1 462 7 shr.u64 %r171,%r95,63; set.u32.gt.s64 %r173,%r95,%r306; neg.s32 %r174,%r173; cvt.u16.u64 %r177,%r171; cvt.u16.u32 %r178,%r174; or.b16 %r176,%r177,%r178; .loc 1 462 6 cvt.u32.u16 %r179,%r176; cvt.u16.u8 %r180,%r179; setp.ne.u16 %r181,%r180,0; @ %r181 bra $L118; .loc 1 469 3 setp.ne.u64 %r182,%r95,0; @ %r182 bra $L119; $L125: .loc 1 477 3 setp.lt.s64 %r183,%r95,%r306; @ %r183 bra $L120; bra $L173; $L118: .loc 1 464 7 cvt.u32.u32 %r186,%r24; cvt.s64.s8 %r185,%r186; st.u64 [%stack+8],%r1851841stack; call _gfortran_runtime_error119: add.u64 %r92,%r157,48; add.u64 %r114,%frame,120; add.u64 %r146,%r157,56; add.u64 %r188,%r157,24; add.u64 %r190,%r23,%r23; add.u64 %r191,%r190,%r23; shl.b64 %r192,%r191,3; add.u64 %r152,%r188,%r192; .loc 1 474 12 mov.u64 %r312,0; $L124: .loc 1 471 19 ld.u64 %r194,[%r146]; ld.u64 %r196,[%r92]; sub.u64 %r195,%r194,%r196; .loc 1 471 50 mad.lo.u64 %r38,%r195,%r161,%r161; .loc 1 473 10 setp.le.s64 %r197,%r38,0; @ %r197 bra $L122; .loc 1 471 17 st.u64 [%r114],%r38; bra $L123; $L122: .loc 1 474 12 st.u64 [%r114],%r312; $L123: .loc 1 469 3 add.u64 %r92,%r92,24; add.u64 %r114,%r114,8; add.u64 %r146,%r146,24; setp.ne.u64 %r199,%r92,%r152; @ %r199 bra $L124; bra $L125; $L173: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r200,%r107,0; @ ! 
%r200 bra $L127; bra $L126; $L120: add.u64 %r202,%r23,%r23; add.u64 %r203,%r202,%r23; shl.b64 %r204,%r203,3; add.u64 %r205,%r204,48; add.u64 %r83,%r157,%r205; add.u64 %r304,%frame,120; shl.b64 %r207,%r23,3; add.u64 %r208,%r207,-8; add.u64 %r144,%r304,%r208; add.u64 %r209,%r157,48; cvt.u32.u32 %r211,%r24; cvt.s64.s8 %r210,%r211; add.u64 %r213,%r210,%r210; add.u64 %r214,%r213,%r210; shl.b64 %r215,%r214,3; add.u64 %r72,%r209,%r215; .loc 1 483 12 mov.u64 %r311,0; $L130: .loc 1 480 2 ld.u64 %r217,[%r83+8]; ld.u64 %r219,[%r83]; sub.u64 %r218,%r217,%r219; .loc 1 480 37 mad.lo.u64 %r46,%r218,%r161,%r161; .loc 1 482 10 setp.le.s64 %r220,%r46,0; @ %r220 bra $L128; .loc 1 479 17 st.u64 [%r144],%r46; bra $L129; $L128: .loc 1 483 12 st.u64 [%r144],%r311; $L129: .loc 1 477 3 add.u64 %r83,%r83,24; add.u64 %r144,%r144,8; setp.ne.u64 %r222,%r72,%r83; @ %r222 bra $L130; bra $L174; $L126: .loc 1 490 7 setp.eq.u64 %r223,%r306,0; @ %r223 bra $L132; add.u64 %r304,%frame,120; $L152: add.u64 %r125,%r156,40; mov.u64 %r123,%r304; add.u64 %r224,%r156,16; cvt.u32.u32 %r226,%r24; cvt.s64.s8 %r225,%r226; add.u64 %r228,%r225,%r225; add.u64 %r229,%r228,%r225; shl.b64 %r230,%r229,3; add.u64 %r117,%r224,%r230; .loc 1 493 10 mov.u64 %r89,1; .loc 1 497 4 mov.u64 %r249,0; bra $L133; $L132: .loc 1 501 24 mov.u64 %r231,0; st.u64 [%r156+8],%r231; .loc 1 502 28 cvt.u16.u32 %r234,%r24; add.u16 %r233,%r234,-1; cvt.u32.u16 %r235,%r233; st.u8 [%r156+28],%r235; .loc 1 504 20 add.u32 %r236,%r24,-2; cvt.s64.s32 %r56,%r236; add.u64 %r238,%r56,%r56; add.u64 %r239,%r238,%r56; shl.b64 %r240,%r239,3; add.u64 %r241,%r156,%r240; .loc 1 504 67 shl.b64 %r243,%r56,3; add.u64 %r244,%frame,%r243; .loc 1 504 59 ld.u64 %r246,[%r241+40]; ld.u64 %r247,[%r244+120]; mul.lo.u64 %r103,%r246,%r247; .loc 1 506 10 setp.eq.u64 %r248,%r103,0; @ %r248 bra $L134; bra $L175; $L136: .loc 1 495 48 mul.lo.u64 %r89,%r52,%r89; $L133: .loc 1 497 4 st.u64 [%r125+8],%r249; ld.u64 %r52,[%r123]; add.u64 %r250,%r52,-1; st.u64 [%r125+16],%r250; st.u64 [%r125],%r89; .loc 1 490 7 add.u64 %r125,%r125,24; add.u64 %r123,%r123,8; setp.ne.u64 %r251,%r117,%r125; @ %r251 bra $L136; bra $L132; $L134: .loc 1 509 4 st.u64 [%r156+48],%r103; mov.u64 %r253,-1; st.u64 [%r156+56],%r253; mov.u64 %r254,1; st.u64 [%r156+40],%r254; .loc 1 510 4 bra $L114; $L175: .loc 1 513 24 mov.u64 %r256(%value_in),_gfortrani_xmallocarray257,[%value_in]; } mov.u64 %r107,%r257; .loc 1 513 22 st.u64 [%r156],%r107; $L140: .loc 1 539 3 setp.ne.u64 %r259,%r306,0; @ %r259 bra $L137; bra $L138; $L127: .loc 1 517 19 ld.s8 %r61,[%r156+28]; .loc 1 517 10 setp.eq.u64 %r260,%r61,%r306; @ %r260 bra $L139; .loc 1 518 2 st.u64 [%stack+8],%r306; st.u64 [%stack],%r61;_gfortran_runtime_error139: .loc 1 523 11 cvta.global.u64 %r263,_gfortrani_compile_options; .loc 1 523 10 ld.u32 %r264,[%r263+36]; setp.eq.u32 %r265,%r264,0; @ %r265 bra $L140; .loc 1 525 4 setp.eq.u64 %r266,%r306,0; @ %r266 bra $L138; add.u64 %r115,%r156,48; add.u64 %r105,%frame,120; cvt.u32.u32 %r268,%r24; cvt.s64.s8 %r267,%r268; add.u64 %r86,%r267,-1; .loc 1 525 10 mov.u64 %r113,0; $L142: .loc 1 529 21 ld.u64 %r270,[%r115+8]; add.u64 %r269,%r270,1; .loc 1 529 19 ld.u64 %r271,[%r115]; sub.u64 %r100,%r269,%r271; .loc 1 530 18 ld.u64 %r70,[%r105]; .loc 1 525 25 add.u64 %r113,%r113,1; .loc 1 530 11 setp.eq.u64 %r272,%r70,%r100; @ %r272 bra $L141; .loc 1 531 3 st.u64 [%stack+16],%r70; st.u64 [%stack+8],%r100; st.u64 [%stack],%r113;_gfortran_runtime_error141: .loc 1 525 4 add.u64 %r115,%r115,24; add.u64 %r105,%r105,8; setp.ne.u64 %r275,%r86,%r113; @ %r275 bra $L142; 
[Binary payload: nvptx PTX object-archive members from libgfortran; not recoverable as document text. The span covers the tail of the preceding minloc1_16 archive member and then the members minloc1_16_s1.o/1622802249, minloc1_16_s4.o/1622802250, maxloc2_4_s1.o/1622802249, and the beginning of maxloc2_4_s4.o/1622802250, each compiled from the matching ../../../../libgfortran/generated/*.c source. Visible functions defined: _gfortran_minloc1_16_s1, _gfortran_mminloc1_16_s1, _gfortran_sminloc1_16_s1; _gfortran_minloc1_16_s4, _gfortran_mminloc1_16_s4, _gfortran_sminloc1_16_s4; _gfortran_maxloc2_4_s1, _gfortran_mmaxloc2_4_s1, _gfortran_smaxloc2_4_s1; _gfortran_maxloc2_4_s4 with its masked and scalar variants. External declarations referenced: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents, _gfortrani_internal_error, _gfortrani_memcmp_char4, memcmp. Embedded string constants are the MINLOC/MAXLOC runtime diagnostics, e.g. "incorrect in MINLOC intrinsic: is %ld, should be between 1 and %ld", "Funny sized logical array", "rank of return array incorrect in MINLOC intrinsic: is %ld, should be %ld", "Incorrect extent in return value of MINLOC intrinsic in dimension %ld: is %ld, should be %ld".]
119 24 add.u64 %r52,%r52,1; .loc 1 119 3 setp.ne.u64 %r91,%r48,%r52; @ %r91 bra $L13; .loc 1 100 12 mov.u32 %r58,0; bra $L9; $L14: .loc 1 126 6 setp.eq.u64 %r92,%r48,%r52; @ %r92 bra $L21; .loc 1 129 7 add.u64 %r42,%r52,1; .loc 1 130 30 mul.lo.u64 %r93,%r27,%r52; .loc 1 130 26 shl.b64 %r94,%r93,2; .loc 1 130 7 ld.u64 %r95,[%r59]; add.u64 %r44,%r95,%r94; .loc 1 133 3 setp.lt.s64 %r96,%r48,%r42; @ %r96 bra $L15; .loc 1 141 11 shl.b64 %r41,%r27,2; mov.u64 %r55,%r42; .loc 1 130 7 mov.u64 %r54,%r44; .loc 1 135 18 setp.eq.u32 %r119,%r61,0; $L19: .loc 1 135 10 ld.s8 %r98,[%r51]; cvt.u16.u32 %r97,%r98; setp.eq.u16 %r99,%r97,0; @ %r99 bra $L16; .loc 1 135 18 @ %r119 bra $L17; .loc 1 39_gfortrani_memcmp_char44,[%value_in]; } .loc 1 135 18 not.b32 %r106,%r104; shr.u32 %r107,%r106,31; cvt.u32.u32 %r108,%r107; cvt.u32.u8 %r53,%r108; bra $L18; $L17: .loc 1 39_gfortrani_memcmp_char4112,[%value_in]; } .loc 1 135 18 set.u32.gt.s32 %r114,%r112,0; neg.s32 %r115,%r114; cvt.u32.u32 %r113,%r115; cvt.u32.u8 %r53,%r113; bra $L18; $L16: .loc 1 141 11 add.u64 %r54,%r54,%r41; .loc 1 142 13 add.u64 %r51,%r51,%r34; .loc 1 133 27 add.u64 %r55,%r55,1; .loc 1 133 3 setp.ge.s64 %r116,%r48,%r55; @ %r116 bra $L19; $L15: .loc 1 144 10 cvt.u32.u64 %r58,%r42; bra $L9; $L20: .loc 1 100 12 mov.u32 %r58,0; bra $L9; $L21: mov.u32 %r58,0; bra $L9; $L18: .loc 1 135 18 setp.eq.u32 %r118,%r53,0; selp.u64 %r44,%r44,%r54,%r118; selp.u64 %r42,%r42,%r55,%r118; bra $L16;gfortran_smaxloc2_4_s4gfortran_smaxloc2_4_s4predmov.u64 %r28,%ar3; .loc 1 155 6 setp.eq.u64 %r29,%r26,0; @ %r29 bra $L31; .loc 1 156 12 cvt.s64.s32 %r33,%r27; cvt.u32.u64 %r31,%r225(%value_in),_gfortran_maxloc2_4_s4mov.u32 %r24,%r34; bra $L29; $L31: .loc 1 158 12 cvt.u32gfortran_maxloc2_8_s1gfortran_maxloc2_8_s1.file 1 "../../../../libgfortran/generated/maxloc2_8_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc2_8_s1gfortran_mmaxloc2_8_s1); // BEGIN GLOBAL FUNCTION DECL: _gfortran_smaxloc2_8_s1gfortran_smaxloc2_8_s1); // BEGIN GLOBAL FUNCTION DECL: memcmpmem_gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc2_8_s1gfortran_maxloc2_8_s1value;mov.u64 %r37,%ar0; mov.u32 %r38,%ar1; mov.u64 %r39,%ar2; .loc 1 56 12 ld.u64 %r41,[%r37+56]; add.u64 %r40,%r41,1; .loc 1 56 10 ld.u64 %r42,[%r37+48]; sub.u64 %r29,%r40,%r42; .loc 1 57 6 setp.le.s64 %r43,%r29,0; @ %r43 bra $L7; .loc 1 60 44 ld.u64 %r44,[%r37+40]; mul.lo.u64 %r27,%r39,%r44; .loc 1 63 7 ld.u64 %r32,[%r37]; add.u64 %r35,%r29,1; mov.u64 %r28,%r32; .loc 1 65 9 mov.u64 %r33,1; mov.u64 %r36,%r33; .loc 1 67 10 setp.eq.u32 %r56,%r38,0; bra $L3; $L6: setp.eq.u64 %r45,%r28,0; @ %r45 bra $L8; .loc 1 3710 @ %r56 bra $L5; .loc 1 67 26 setp.lt.s32 %r52,%r49,0; selp.u64 %r28,%r28,%r32,%r52; selp.u64 %r36,%r36,%r33,%r52; bra $L3; $L5: setp.le.s32 %r53,%r49,0; selp.u64 %r28,%r28,%r32,%r53; selp.u64 %r36,%r36,%r33,%r53; bra $L3; $L8: .loc 1 73 11 mov.u64 %r28,%r32; mov.u64 %r36,%r33; $L3: add.u64 %r32,%r32,%r27; .loc 1 65 25 add.u64 %r33,%r33,1; .loc 1 65 3 setp.ne.u64 %r54,%r33,%r35; @ %r54 bra $L6; bra $L1; $L7: .loc 1 58 12 mov.u64 %r36,0; $L1: .loc 1 7_gfortran_mmaxloc2_8_s1gfortran_mmaxloc2_8_s1.reg .u32 %r75; .reg .u16 %r76; .reg .u16 %r77; .reg .u16 %r78; .reg .u32pred32pred %r91; .reg .u64 %r93; .reg .predpred %r97; .reg .u32 %r101; .reg .u32pred %r114; mov.u64 %r58,%ar0; mov.u64 %r59,%ar1; mov.u32 %r60,%ar2; mov.u64 %r61,%ar3; .loc 1 98 12 ld.u64 %r63,[%r58+56]; add.u64 %r62,%r63,1; .loc 1 98 10 ld.u64 %r64,[%r58+48]; sub.u64 
%r48,%r62,%r64; .loc 1 99 6 setp.le.s64 %r65,%r48,0; @ %r65 bra $L19; .loc 1 102 44 ld.u64 %r66,[%r58+40]; mul.lo.u64 %r27,%r61,%r66; .loc 1 104 15 ld.u64 %r28,[%r59+16]; .loc 1 105 9 ld.u64 %r51,[%r59]; .loc 1 107 22 cvt.u32.u64 %r29,%r28; .loc 1 107 53 add.u32 %r67,%r29,-4; and.b32 %r68,%r67,-5; set.u32.eq.u32 %r70,%r68,0; neg.s32 %r71,%r70; .loc 1 107 22 add.u32 %r72,%r29,-1; set.u32.le.u32 %r74,%r72,1; neg.s32 %r75,%r74; .loc 1 107 6 cvt.u16.u32 %r77,%r71; cvt.u16.u32 %r78,%r75; or.b16 %r76,%r77,%r78; cvt.u32.u16 %r79,%r76; cvt.u16.u8 %r80,%r79; setp.ne.u16 %r81,%r80,0; @ %r81 bra $L11; .loc 1 109 7 setp.ne.u32 %r83,%r29,16; @ %r83 bra $L12; $L11: .loc 1 116 13 ld.u64 %r84,[%r59+40]; mul.lo.u64 %r37,%r28,%r84; .loc 1 119 9 mov.u64 %r52,0; bra $L13; $L12: .loc 1 114 5 cvta.const.u64 %r86,$LC0; mov.u64 %r85,r86; call _gfortrani_internal_error13: .loc 1 121 10 ld.s8 %r88,[%r51]; cvt.u16.u32 %r87,%r88; setp.ne.u16 %r89,%r87,0; @ %r89 bra $L14; .loc 1 123 13 add.u64 %r51,%r51,%r37; .loc 1 119 24 add.u64 %r52,%r52,1; .loc 1 119 3 setp.ne.u64 %r90,%r48,%r52; @ %r90 bra $L13; .loc 1 100 12 mov.u64 %r57,0; bra $L9; $L14: .loc 1 126 6 setp.eq.u64 %r91,%r48,%r52; @ %r91 bra $L20; .loc 1 129 7 add.u64 %r57,%r52,1; .loc 1 130 7 ld.u64 %r93,[%r58]; mad.lo.u64 %r43,%r27,%r52,%r93; .loc 1 133 3 setp.lt.s64 %r94,%r48,%r57; @ %r94 bra $L9; add.u64 %r34,%r48,1; mov.u64 %r55,%r57; .loc 1 130 7 mov.u64 %r54,%r43; .loc 1 135 18 setp.eq.u32 %r114,%r60,0; $L18: .loc 1 135 10 ld.s8 %r96,[%r51]; cvt.u16.u32 %r95,%r96; setp.eq.u16 %r97,%r95,0; @ %r97 bra $L15; .loc 1 371,[%value_in]; } .loc 1 135 18 @ %r114 bra $L16; not.b32 %r105,%r101; shr.u32 %r106,%r105,31; cvt.u32.u32 %r107,%r106; cvt.u32.u8 %r53,%r107; bra $L17; $L16: set.u32.gt.s32 %r109,%r101,0; neg.s32 %r110,%r109; cvt.u32.u32 %r108,%r110; cvt.u32.u8 %r53,%r108; bra $L17; $L15: .loc 1 141 11 add.u64 %r54,%r54,%r27; .loc 1 142 13 add.u64 %r51,%r51,%r37; .loc 1 133 27 add.u64 %r55,%r55,1; .loc 1 133 3 setp.ne.u64 %r111,%r34,%r55; @ %r111 bra $L18; bra $L9; $L19: .loc 1 100 12 mov.u64 %r57,0; bra $L9; $L20: mov.u64 %r57,0; bra $L9; $L17: .loc 1 135 18 setp.eq.u32 %r113,%r53,0; selp.u64 %r43,%r43,%r54,%r113; selp.u64 %r57,%r57,%r55,%r113; bra $L15; $L9predmov.u64 %r28,%ar3; .loc 1 155 6 setp.eq.u64 %r29,%r26,0; @ %r29 bra $L30; .loc 1 156 12 cvt.s64.s32 %r33,%r27; cvt.u32.u64 %r31,%r2825(%value_in),_gfortran_maxloc2_8_s1mov.u64 %r24,%r34; bra $L28; $L30: .loc 1 158 12 mov.u64 %r24,%r26; $L28 maxloc2_8_s4.o/ 1622802250gfortran_maxloc2_8_s4gfortran_maxloc2_8_s4.file 1 "../../../../libgfortran/generated/maxloc2_8_s4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc2_8_s4gfortran_mmaxloc2_8_s4); // BEGIN GLOBAL FUNCTION DECL: _gfortran_smaxloc2_8_s4gfortran_smaxloc2_8_s4); // BEGIN GLOBAL FUNCTION DECL: _gfortrani_memcmp_char4_gfortrani_memcmp_char4_gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc2_8_s4gfortran_maxloc2_8_s4value;predpred %r57; .reg .pred %r59; mov.u64 %r36,%ar0; mov.u32 %r37,%ar1; mov.u64 %r38,%ar2; .loc 1 56 12 ld.u64 %r40,[%r36+56]; add.u64 %r39,%r40,1; .loc 1 56 10 ld.u64 %r41,[%r36+48]; sub.u64 %r30,%r39,%r41; .loc 1 57 6 setp.le.s64 %r42,%r30,0; @ %r42 bra $L7; .loc 1 63 7 ld.u64 %r31,[%r36]; .loc 1 73 11 ld.u64 %r44,[%r36+40]; mul.lo.u64 %r43,%r38,%r44; shl.b64 %r27,%r43,2; .loc 1 63 7 mov.u64 %r29,%r31; .loc 1 65 9 mov.u64 %r32,1; mov.u64 %r35,%r32; .loc 1 67 10 setp.eq.u32 %r59,%r37,0; bra $L3; $L6: setp.eq.u64 %r45,%r29,0; @ %r45 bra $L8; @ 
%r59 bra $L5; .loc 1 39 1229_gfortrani_memcmp_char4.loc 1 67 26 setp.lt.s32 %r51,%r50,0; .loc 1 73 11 selp.u64 %r29,%r29,%r31,%r51; selp.u64 %r35,%r35,%r32,%r51; bra $L3; $L5: .loc 1 39 1229_gfortrani_memcmp_char4.loc 1 67 26 setp.le.s32 %r56,%r55,0; .loc 1 73 11 selp.u64 %r29,%r29,%r31,%r56; selp.u64 %r35,%r35,%r32,%r56; bra $L3; $L8: mov.u64 %r29,%r31; mov.u64 %r35,%r32; $L3: add.u64 %r31,%r31,%r27; .loc 1 65 25 add.u64 %r32,%r32,1; .loc 1 65 3 setp.ge.s64 %r57,%r30,%r32; @ %r57 bra $L6; bra $L1; $L7: .loc 1 58 12 mov.u64 %r35,0; $L1: .loc 1 76gfortran_mmaxloc2_8_s4gfortran_mmaxloc2_8_s4.reg .u32 %r75; .reg .u16 %r76; .reg .u16 %r77; .reg .u16 %r78; .reg .u32predpred %r89; .reg .pred %r90; .reg .predpred %r117; .reg .pred %r118; mov.u64 %r58,%ar0; mov.u64 %r59,%ar1; mov.u32 %r60,%ar2; mov.u64 %r61,%ar3; .loc 1 98 12 ld.u64 %r63,[%r58+56]; add.u64 %r62,%r63,1; .loc 1 98 10 ld.u64 %r64,[%r58+48]; sub.u64 %r47,%r62,%r64; .loc 1 99 6 setp.le.s64 %r65,%r47,0; @ %r65 bra $L19; .loc 1 102 44 ld.u64 %r66,[%r58+40]; mul.lo.u64 %r27,%r61,%r66; .loc 1 104 15 ld.u64 %r28,[%r59+16]; .loc 1 105 9 ld.u64 %r50,[%r59]; .loc 1 107 22 cvt.u32.u64 %r29,%r28; .loc 1 107 53 add.u32 %r67,%r29,-4; and.b32 %r68,%r67,-5; set.u32.eq.u32 %r70,%r68,0; neg.s32 %r71,%r70; .loc 1 107 22 add.u32 %r72,%r29,-1; set.u32.le.u32 %r74,%r72,1; neg.s32 %r75,%r74; .loc 1 107 6 cvt.u16.u32 %r77,%r71; cvt.u16.u32 %r78,%r75; or.b16 %r76,%r77,%r78; cvt.u32.u16 %r79,%r76; cvt.u16.u8 %r80,%r79; setp.ne.u16 %r81,%r80,0; @ %r81 bra $L11; .loc 1 109 7 setp.ne.u32 %r83,%r29,16; @ %r83 bra $L12; $L11: .loc 1 116 13 ld.u64 %r84,[%r59+40]; mul.lo.u64 %r34,%r28,%r84; .loc 1 119 9 mov.u64 %r51,0; bra $L13; $L12: .loc 1 114 5 cvta.const.u64 %r86,$LC0; mov.u64 %r85,r86; call _gfortrani_internal_error13: .loc 1 121 10 ld.s8 %r88,[%r50]; cvt.u16.u32 %r87,%r88; setp.ne.u16 %r89,%r87,0; @ %r89 bra $L14; .loc 1 123 13 add.u64 %r50,%r50,%r34; .loc 1 119 24 add.u64 %r51,%r51,1; .loc 1 119 3 setp.ne.u64 %r90,%r47,%r51; @ %r90 bra $L13; .loc 1 100 12 mov.u64 %r57,0; bra $L9; $L14: .loc 1 126 6 setp.eq.u64 %r91,%r47,%r51; @ %r91 bra $L20; .loc 1 129 7 add.u64 %r57,%r51,1; .loc 1 130 30 mul.lo.u64 %r92,%r27,%r51; .loc 1 130 26 shl.b64 %r93,%r92,2; .loc 1 130 7 ld.u64 %r94,[%r58]; add.u64 %r43,%r94,%r93; .loc 1 133 3 setp.lt.s64 %r95,%r47,%r57; @ %r95 bra $L9; .loc 1 141 11 shl.b64 %r41,%r27,2; mov.u64 %r54,%r57; .loc 1 130 7 mov.u64 %r53,%r43; .loc 1 135 18 setp.eq.u32 %r118,%r60,0; $L18: .loc 1 135 10 ld.s8 %r97,[%r50]; cvt.u16.u32 %r96,%r97; setp.eq.u16 %r98,%r96,0; @ %r98 bra $L15; .loc 1 135 18 @ %r118 bra $L16; .loc 1 39call (%value_in),_gfortrani_memcmp_char43,[%value_in]; } .loc 1 135 18 not.b32 %r105,%r103; shr.u32 %r106,%r105,31; cvt.u32.u32 %r107,%r106; cvt.u32.u8 %r52,%r107; bra $L17; $L16: .loc 1 39call (%value_in),_gfortrani_memcmp_char4111,[%value_in]; } .loc 1 135 18 set.u32.gt.s32 %r113,%r111,0; neg.s32 %r114,%r113; cvt.u32.u32 %r112,%r114; cvt.u32.u8 %r52,%r112; bra $L17; $L15: .loc 1 141 11 add.u64 %r53,%r53,%r41; .loc 1 142 13 add.u64 %r50,%r50,%r34; .loc 1 133 27 add.u64 %r54,%r54,1; .loc 1 133 3 setp.ge.s64 %r115,%r47,%r54; @ %r115 bra $L18; bra $L9; $L19: .loc 1 100 12 mov.u64 %r57,0; bra $L9; $L20: mov.u64 %r57,0; bra $L9; $L17: .loc 1 135 18 setp.eq.u32 %r117,%r52,0; selp.u64 %r43,%r43,%r53,%r117; selp.u64 %r57,%r57,%r54,%r117; bra $L15; $L9predmov.u64 %r28,%ar3; .loc 1 155 6 setp.eq.u64 %r29,%r26,0; @ %r29 bra $L30; .loc 1 156 12 cvt.s64.s32 %r33,%r27; cvt.u32.u64 %r31,%r2825(%value_in),_gfortran_maxloc2_8_s4mov.u64 %r24,%r34; bra 
$L28; $L30: .loc 1 158 12 mov.u64 %r24,%r26; $L28gfortran_maxloc2_16_s1 .visible .func _gfortran_maxloc2_16_s1); .file 1 "../../../../libgfortran/generated/maxloc2_16_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc2_16_s1 .visible .func _gfortran_mmaxloc2_16_s1_gfortran_smaxloc2_16_s1 .visible .func _gfortran_smaxloc2_16_s1memcmpmem_gfortrani_internal_error .extern .func _gfortrani_internal_21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc2_16_s1 .visible .func _gfortran_maxloc2_16_s1predmov.u32 %r40,%ar2; mov.u64 %r41,%ar3; .loc 1 56 12 ld.u64 %r43,[%r39+56]; add.u64 %r42,%r43,1; .loc 1 56 10 ld.u64 %r44,[%r39+48]; sub.u64 %r31,%r42,%r44; .loc 1 57 6 setp.gt.s64 %r45,%r31,0; @ %r45 bra $L2; .loc 1 58 12 mov.u64 %r46,0; st.u64 [%r38],%r46; st.u64 [%r38+8],%r46; bra $L1; $L2: .loc 1 60 44 ld.u64 %r48,[%r39+40]; mul.lo.u64 %r27,%r41,%r48; .loc 1 63 7 ld.u64 %r34,[%r39]; add.u64 %r37,%r31,1; mov.u64 %r30,%r34; .loc 1 65 9 mov.u64 %r35,1; mov.u64 %r29,%r35; .loc 1 67 10 setp.eq.u32 %r60,%r40,0; bra $L4; $L7: setp.eq.u64 %r49,%r30,0; @ %r49 bra $L8; .loc 1 37(%value_in),memcmp,(%out_arg1,%out_arg2,67 10 @ %r60 bra $L6; .loc 1 67 26 setp.lt.s32 %r56,%r53,0; selp.u64 %r30,%r30,%r34,%r56; selp.u64 %r29,%r29,%r35,%r56; bra $L4; $L6: setp.le.s32 %r57,%r53,0; selp.u64 %r30,%r30,%r34,%r57; selp.u64 %r29,%r29,%r35,%r57; bra $L4; $L8: .loc 1 73 11 mov.u64 %r30,%r34; mov.u64 %r29,%r35; $L4: add.u64 %r34,%r34,%r27; .loc 1 65 25 add.u64 %r35,%r35,1; .loc 1 65 3 setp.ne.u64 %r58,%r35,%r37; @ %r58 bra $L7; .loc 1 75 10 st.u64 [%r38],%r29; shr.s64 %r59,%r29,63; st.u64 [%r38+8],%r59;_gfortran_mmaxloc2_16_s1 .visible .func _gfortran_mmaxloc2_16_s1in_ar4) {u16 %r84; .reg .pred %r85; .reg .predu32.loc 1 98 12 ld.u64 %r65,[%r60+56]; add.u64 %r64,%r65,1; .loc 1 98 10 ld.u64 %r66,[%r60+48]; sub.u64 %r50,%r64,%r66; .loc 1 99 6 setp.gt.s64 %r67,%r50,0; @ %r67 bra $L10; $L16: .loc 1 100 12 mov.u64 %r68,0; st.u64 [%r59],%r68; st.u64 [%r59+8],%r68; bra $L9; $L10: .loc 1 102 44 ld.u64 %r70,[%r60+40]; mul.lo.u64 %r27,%r63,%r70; .loc 1 104 15 ld.u64 %r28,[%r61+16]; .loc 1 105 9 ld.u64 %r53,[%r61]; .loc 1 107 22 cvt.u32.u64 %r29,%r28; .loc 1 107 53 add.u32 %r71,%r29,-4; and.b32 %r72,%r71,-5;.loc 1 107 22 add.u32 %r76,%r29,-1; set.u32.le.u32 %r78,%r76,1; neg.s32 %r79,%r78; .loc 1 107 6 cvt.u16.u32 %r81,%r75; cvt.u16.u32 %r82,%r79; or.b16 %r80,%r81,%r82;@ %r85 bra $L12; .loc 1 109 7 setp.ne.u32 %r87,%r29,16; @ %r87 bra $L13; $L12: .loc 1 116 13 ld.u64 %r88,[%r61+40]; mul.lo.u64 %r37,%r28,%r88; .loc 1 119 9 mov.u64 %r54,0; bra $L14; $L13: .loc 1 114 5 cvta.const.u64 %r90,$LC0; mov.u64 %r89,0; {r90; call _gfortrani_internal_error14: .loc 1 121 10 ld.s8 %r92,[%r53]; cvt.u16.u32 %r91,%r92; setp.ne.u16 %r93,%r91,0; @ %r93 bra $L15; .loc 1 123 13 add.u64 %r53,%r53,%r37; .loc 1 119 24 add.u64 %r54,%r54,1; .loc 1 119 3 setp.ne.u64 %r94,%r50,%r54; @ %r94 bra $L14; bra $L16; $L15: .loc 1 126 6 setp.eq.u64 %r95,%r50,%r54; @ %r95 bra $L16; .loc 1 129 7 add.u64 %r44,%r54,1; .loc 1 130 7 ld.u64 %r97,[%r60]; mad.lo.u64 %r45,%r27,%r54,%r97; .loc 1 133 3 setp.lt.s64 %r98,%r50,%r44; @ %r98 bra $L17; add.u64 %r33,%r50,1; mov.u64 %r57,%r44; .loc 1 130 7 mov.u64 %r56,%r45; .loc 1 135 18 setp.eq.u32 %r118,%r62,0; $L21: .loc 1 135 10 ld.s8 %r100,[%r53]; cvt.u16.u32 %r99,%r100; setp.eq.u16 %r101,%r99,0; @ %r101 bra $L18; .loc 1 37105,[%value_in]; } .loc 1 135 18 @ %r118 bra $L19; not.b32 %r109,%r105; shr.u32 %r110,%r109,31; cvt.u32.u32 %r111,%r110; cvt.u32.u8 %r55,%r111; bra $L20; $L19: set.u32.gt.s32 %r113,%r105,0; 
neg.s32 %r114,%r113; cvt.u32.u32 %r112,%r114; cvt.u32.u8 %r55,%r112; bra $L20; $L18: .loc 1 141 11 add.u64 %r56,%r56,%r27; .loc 1 142 13 add.u64 %r53,%r53,%r37; .loc 1 133 27 add.u64 %r57,%r57,1; .loc 1 133 3 setp.ne.u64 %r115,%r33,%r57; @ %r115 bra $L21; $L17: .loc 1 144 10 st.u64 [%r59],%r44; shr.s64 %r116,%r44,63; st.u64 [%r59+8],%r116; bra $L9; $L20: .loc 1 135 18 setp.eq.u32 %r117,%r55,0; selp.u64 %r45,%r45,%r56,%r117; selp.u64 %r44,%r44,%r57,%r117; bra $L18; $L9: .loc 1 1_gfortran_smaxloc2_16_s1 .visible .func _gfortran_smaxloc2_16_s1in_ar4) {predmov.u64 %r29,%ar4; .loc 1 155 6 setp.eq.u64 %r30,%r27,0; @ %r30 bra $L31; .loc 1 156 12 cvt.s64.s32 %r35,%r28; cvt.u32.u64 %r33,%r29; {35; call _gfortran_maxloc2_16_s1ld.u64 %r38,[%frame]; ld.u64 %r39,[%frame+8]; bra $L30; $L31: .loc 1 158 12 mov.u64 %r38,%r27; mov.u64 %r39,%r38; $L30: st.u64 [%r25],%r38; st.u64 [%r25+8],%r39; .loc 1 159maxloc2_16_s4.o/1622802250gfortran_maxloc2_16_s4 .visible .func _gfortran_maxloc2_16_s4); .file 1 "../../../../libgfortran/generated/maxloc2_16_s4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc2_16_s4 .visible .func _gfortran_mmaxloc2_16_s_gfortran_smaxloc2_16_s4 .visible .func _gfortran_smaxloc2_16_s_gfortrani_memcmp_char4_gfortrani_memcmp_char4_gfortrani_internal_error .extern .func _gfortrani_internal_21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc2_16_s4 .visible .func _gfortran_maxloc2_16_s4pred.loc 1 56 12 ld.u64 %r42,[%r38+56]; add.u64 %r41,%r42,1; .loc 1 56 10 ld.u64 %r43,[%r38+48]; sub.u64 %r32,%r41,%r43; .loc 1 57 6 setp.gt.s64 %r44,%r32,0; @ %r44 bra $L2; .loc 1 58 12 mov.u64 %r45,0; st.u64 [%r37],%r45; st.u64 [%r37+8],%r45; bra $L1; $L2: .loc 1 63 7 ld.u64 %r33,[%r38]; .loc 1 73 11 ld.u64 %r48,[%r38+40]; mul.lo.u64 %r47,%r40,%r48; shl.b64 %r27,%r47,2; .loc 1 63 7 mov.u64 %r30,%r33; .loc 1 65 9 mov.u64 %r34,1; mov.u64 %r29,%r34; .loc 1 67 10 setp.eq.u32 %r63,%r39,0; bra $L4; $L7: setp.eq.u64 %r49,%r30,0; @ %r49 bra $L8; @ %r63 bra $L(%value_in),_gfortrani_memcmp_char467 26 setp.lt.s32 %r55,%r54,0; .loc 1 73 11 selp.u64 %r30,%r30,%r33,%r55; selp.u64 %r29,%r29,%r34,%r55; bra $L4; $L(%value_in),_gfortrani_memcmp_char4.loc 1 67 26 setp.le.s32 %r60,%r59,0; .loc 1 73 11 selp.u64 %r30,%r30,%r33,%r60; selp.u64 %r29,%r29,%r34,%r60; bra $L4; $L8: mov.u64 %r30,%r33; mov.u64 %r29,%r34; $L4: add.u64 %r33,%r33,%r27; .loc 1 65 25 add.u64 %r34,%r34,1; .loc 1 65 3 setp.ge.s64 %r61,%r32,%r34; @ %r61 bra $L7; .loc 1 75 10 st.u64 [%r37],%r29; shr.s64 %r62,%r29,63; st.u64 [%r37+8],%r62;_gfortran_mmaxloc2_16_s4 .visible .func _gfortran_mmaxloc2_16_s, .param .u64 %in_ar4) {predpredpred.loc 1 98 12 ld.u64 %r65,[%r60+56]; add.u64 %r64,%r65,1; .loc 1 98 10 ld.u64 %r66,[%r60+48]; sub.u64 %r49,%r64,%r66; .loc 1 99 6 setp.gt.s64 %r67,%r49,0; @ %r67 bra $L10; $L16: .loc 1 100 12 mov.u64 %r68,0; st.u64 [%r59],%r68; st.u64 [%r59+8],%r68; bra $L9; $L10: .loc 1 102 44 ld.u64 %r70,[%r60+40]; mul.lo.u64 %r27,%r63,%r70; .loc 1 104 15 ld.u64 %r28,[%r61+16]; .loc 1 105 9 ld.u64 %r52,[%r61]; .loc 1 107 22 cvt.u32.u64 %r29,%r28; .loc 1 107 53 add.u32 %r71,%r29,-4; and.b32 %r72,%r71,-5;.loc 1 107 22 add.u32 %r76,%r29,-1; set.u32.le.u32 %r78,%r76,1; neg.s32 %r79,%r78; .loc 1 107 6 cvt.u16.u32 %r81,%r75; cvt.u16.u32 %r82,%r79; or.b16 %r80,%r81,%r82;@ %r85 bra $L12; .loc 1 109 7 setp.ne.u32 %r87,%r29,16; @ %r87 bra $L13; $L12: .loc 1 116 13 ld.u64 %r88,[%r61+40]; mul.lo.u64 %r34,%r28,%r88; .loc 1 119 9 mov.u64 %r53,0; bra $L14; $L13: .loc 1 114 5 cvta.const.u64 %r90,$LC0; mov.u64 %r89,0; {r90; call 
_gfortrani_internal_error14: .loc 1 121 10 ld.s8 %r92,[%r52]; cvt.u16.u32 %r91,%r92; setp.ne.u16 %r93,%r91,0; @ %r93 bra $L15; .loc 1 123 13 add.u64 %r52,%r52,%r34; .loc 1 119 24 add.u64 %r53,%r53,1; .loc 1 119 3 setp.ne.u64 %r94,%r49,%r53; @ %r94 bra $L14; bra $L16; $L15: .loc 1 126 6 setp.eq.u64 %r95,%r49,%r53; @ %r95 bra $L16; .loc 1 129 7 add.u64 %r43,%r53,1; .loc 1 130 30 mul.lo.u64 %r96,%r27,%r53; .loc 1 130 26 shl.b64 %r97,%r96,2; .loc 1 130 7 ld.u64 %r98,[%r60]; add.u64 %r45,%r98,%r97; .loc 1 133 3 setp.lt.s64 %r99,%r49,%r43; @ %r99 bra $L17; .loc 1 141 11 shl.b64 %r41,%r27,2; mov.u64 %r56,%r43; .loc 1 130 7 mov.u64 %r55,%r45; .loc 1 135 18 setp.eq.u32 %r122,%r62,0; $L21: .loc 1 135 10 ld.s8 %r101,[%r52]; cvt.u16.u32 %r100,%r101; setp.eq.u16 %r102,%r100,0; @ %r102 bra $L18; .loc 1 135 18 @ %r122 bra $L19; .loc 1 39_gfortrani_memcmp_char47,[%value_in]; } .loc 1 135 18 not.b32 %r109,%r107; shr.u32 %r110,%r109,31; cvt.u32.u32 %r111,%r110; cvt.u32.u8 %r54,%r111; bra $L20; $L19: .loc 1 39_gfortrani_memcmp_char4115,[%value_in]; } .loc 1 135 18 set.u32.gt.s32 %r117,%r115,0; neg.s32 %r118,%r117; cvt.u32.u32 %r116,%r118; cvt.u32.u8 %r54,%r116; bra $L20; $L18: .loc 1 141 11 add.u64 %r55,%r55,%r41; .loc 1 142 13 add.u64 %r52,%r52,%r34; .loc 1 133 27 add.u64 %r56,%r56,1; .loc 1 133 3 setp.ge.s64 %r119,%r49,%r56; @ %r119 bra $L21; $L17: .loc 1 144 10 st.u64 [%r59],%r43; shr.s64 %r120,%r43,63; st.u64 [%r59+8],%r120; bra $L9; $L20: .loc 1 135 18 setp.eq.u32 %r121,%r54,0; selp.u64 %r45,%r45,%r55,%r121; selp.u64 %r43,%r43,%r56,%r121; bra $L18; $L9: .loc 1 1_gfortran_smaxloc2_16_s4 .visible .func _gfortran_smaxloc2_16_s, .param .u64 %in_ar4) {mov.u64 %r29,%ar4; .loc 1 155 6 setp.eq.u64 %r30,%r27,0; @ %r30 bra $L30; .loc 1 156 12 cvt.s64.s32 %r35,%r28; cvt.u32.u64 %r33,%r29; {35; call _gfortran_maxloc2_16_s4ld.u64 %r40,[%frame]; ld.u64 %r41,[%frame+8]; st.u64 [%r25],%r40; st.u64 [%r25+8],%r41; bra $L29; $L30: .loc 1 158 12 st.u64 [%r25],%r27; st.u64 [%r25+8],%r27; $L29: .loc 1 159minloc2_4_s1.o/ 1622802250gfortran_minloc2_4_s1gfortran_minloc2_4_s1.file 1 "../../../../libgfortran/generated/minloc2_4_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc2_4_s1gfortran_mminloc2_4_s1); // BEGIN GLOBAL FUNCTION DECL: _gfortran_sminloc2_4_s1gfortran_sminloc2_4_s1); // BEGIN GLOBAL FUNCTION DECL: memcmpmem_gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc2_4_s1gfortran_minloc2_4_s1value;u64 %r42; .reg .u64 %r43; .reg .pred %r44; .reg .u64 %r45; .reg .predmov.u64 %r38,%ar0; mov.u32 %r39,%ar1; mov.u64 %r40,%ar2; .loc 1 57 12 ld.u64 %r42,[%r38+56]; add.u64 %r41,%r42,1; .loc 1 57 10 ld.u64 %r43,[%r38+48]; sub.u64 %r30,%r41,%r43; .loc 1 58 6 setp.le.s64 %r44,%r30,0; @ %r44 bra $L7; .loc 1 61 44 ld.u64 %r45,[%r38+40]; mul.lo.u64 %r27,%r40,%r45; .loc 1 64 7 ld.u64 %r33,[%r38]; add.u64 %r36,%r30,1; mov.u64 %r29,%r33; .loc 1 66 9 mov.u64 %r34,1; mov.u64 %r28,%r34; .loc 1 68 10 setp.eq.u32 %r57,%r39,0; bra $L3; $L6:8(%value_in),memcmp,(%out_arg1,%out_arg2,%out_arg3); ld.param.u32 %r50,[%value_in]; } .loc 1 68 10 @ %r57 bra $L5; .loc 1 68 26 setp.gt.s32 %r53,%r50,0; selp.u64 %r29,%r29,%r33,%r53; selp.u64 %r28,%r28,%r34,%r53; bra $L3; $L5: setp.ge.s32 %r54,%r50,0; selp.u64 %r29,%r29,%r33,%r54; selp.u64 %r28,%r28,%r34,%r54; bra $L3; $L8: .loc 1 74 11 mov.u64 %r29,%r33; mov.u64 %r28,%r34; $L3: add.u64 %r33,%r33,%r27; .loc 1 66 25 add.u64 %r34,%r34,1; .loc 1 66 3 setp.ne.u64 %r55,%r34,%r36; @ %r55 bra $L6; .loc 1 76 10 cvt.u32.u64 
%r37,%r28; bra $L1; $L7: .loc 1 59 12 mov.u32 %r37,0; $L1: .loc 1 77_gfortran_mminloc2_4_s1gfortran_mminloc2_4_s1predu16 %r88; .reg .u32 %r89; .reg .pred %r90; .reg .pred %r91; .reg .pred.reg .pred %r98; .reg .u32 %r102; .reg .u32.loc 1 99 12 ld.u64 %r64,[%r59+56]; add.u64 %r63,%r64,1; .loc 1 99 10 ld.u64 %r65,[%r59+48]; sub.u64 %r48,%r63,%r65; .loc 1 100 6 setp.le.s64 %r66,%r48,0; @ %r66 bra $L20; .loc 1 103 44 ld.u64 %r67,[%r59+40]; mul.lo.u64 %r27,%r62,%r67; .loc 1 105 15 ld.u64 %r28,[%r60+16]; .loc 1 106 9 ld.u64 %r51,[%r60]; .loc 1 108 22 cvt.u32.u64 %r29,%r28; .loc 1 108 53 add.u32 %r68,%r29,-4; and.b32 %r69,%r68,-5; set.u32.eq.u32 %r71,%r69,0; neg.s32 %r72,%r71; .loc 1 108 22 add.u32 %r73,%r29,-1; set.u32.le.u32 %r75,%r73,1; neg.s32 %r76,%r75; .loc 1 108 6 cvt.u16.u32 %r78,%r72; cvt.u16.u32 %r79,%r76; or.b16 %r77,%r78,%r79;110 7 setp.ne.u32 %r84,%r29,16; @ %r84 bra $L12; $L11: .loc 1 117 13 ld.u64 %r85,[%r60+40]; mul.lo.u64 %r37,%r28,%r85; .loc 1 120 9 mov.u64 %r52,0; bra $L13; $L12: .loc 1 115 5 cvta.const.u64 %r87,$LC0; mov.u64 %r86,r87; call _gfortrani_internal_error13: .loc 1 122 10 ld.s8 %r89,[%r51]; cvt.u16.u32 %r88,%r89; setp.ne.u16 %r90,%r88,0; @ %r90 bra $L14; .loc 1 124 13 add.u64 %r51,%r51,%r37; .loc 1 120 24 add.u64 %r52,%r52,1; .loc 1 120 3 setp.ne.u64 %r91,%r48,%r52; @ %r91 bra $L13; .loc 1 101 12 mov.u32 %r58,0; bra $L9; $L14: .loc 1 127 6 setp.eq.u64 %r92,%r48,%r52; @ %r92 bra $L21; .loc 1 130 7 add.u64 %r43,%r52,1; .loc 1 131 7 ld.u64 %r94,[%r59]; mad.lo.u64 %r44,%r27,%r52,%r94; .loc 1 134 3 setp.lt.s64 %r95,%r48,%r43; @ %r95 bra $L15; add.u64 %r34,%r48,1; mov.u64 %r55,%r43; .loc 1 131 7 mov.u64 %r54,%r44; .loc 1 137 18 setp.eq.u32 %r114,%r61,0; $L19: .loc 1 137 10 ld.s8 %r97,[%r51]; cvt.u16.u32 %r96,%r97; setp.eq.u16 %r98,%r96,0; @ %r98 bra $L16; .loc 1 37102,[%value_in]; } .loc 1 137 18 @ %r114 bra $L17; set.u32.le.s32 %r106,%r102,0; neg.s32 %r107,%r106; cvt.u32.u32 %r105,%r107; cvt.u32.u8 %r53,%r105; bra $L18; $L17: shr.u32 %r109,%r102,31; cvt.u32.u32 %r110,%r109; cvt.u32.u8 %r53,%r110; bra $L18; $L16: .loc 1 143 11 add.u64 %r54,%r54,%r27; .loc 1 144 13 add.u64 %r51,%r51,%r37; .loc 1 134 27 add.u64 %r55,%r55,1; .loc 1 134 3 setp.ne.u64 %r111,%r34,%r55; @ %r111 bra $L19; $L15: .loc 1 146 10 cvt.u32.u64 %r58,%r43; bra $L9; $L20: .loc 1 101 12 mov.u32 %r58,0; bra $L9; $L21: mov.u32 %r58,0; bra $L9; $L18: .loc 1 137 18 setp.eq.u32 %r113,%r53,0; selp.u64 %r44,%r44,%r54,%r113; selp.u64 %r43,%r43,%r55,%r113; bra $L16;gfortran_sminloc2_4_s1gfortran_sminloc2_4_s1predmov.u64 %r28,%ar3; .loc 1 157 6 setp.eq.u64 %r29,%r26,0; @ %r29 bra $L31; .loc 1 158 12 cvt.s64.s32 %r33,%r27; cvt.u32.u64 %r31,%r225(%value_in),_gfortran_minloc2_4_s1mov.u32 %r24,%r34; bra $L29; $L31: .loc 1 160 12 cvt.u32gfortran_minloc2_4_s4gfortran_minloc2_4_s4.file 1 "../../../../libgfortran/generated/minloc2_4_s4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc2_4_s4gfortran_mminloc2_4_s4); // BEGIN GLOBAL FUNCTION DECL: _gfortran_sminloc2_4_s4gfortran_sminloc2_4_s4); // BEGIN GLOBAL FUNCTION DECL: _gfortrani_memcmp_char4_gfortrani_memcmp_char4_gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc2_4_s4gfortran_minloc2_4_s4value;u64mov.u64 %r37,%ar0; mov.u32 %r38,%ar1; mov.u64 %r39,%ar2; .loc 1 57 12 ld.u64 %r41,[%r37+56]; add.u64 %r40,%r41,1; .loc 1 57 10 ld.u64 %r42,[%r37+48]; sub.u64 %r31,%r40,%r42; .loc 1 58 6 setp.le.s64 %r43,%r31,0; @ %r43 bra $L7; .loc 1 64 7 ld.u64 %r32,[%r37]; .loc 1 74 11 ld.u64 
%r45,[%r37+40]; mul.lo.u64 %r44,%r39,%r45; shl.b64 %r27,%r44,2; .loc 1 64 7 mov.u64 %r29,%r32; .loc 1 66 9 mov.u64 %r33,1; mov.u64 %r28,%r33; .loc 1 68 10 setp.eq.u32 %r60,%r38,0; bra $L3; $L6:8; @ %r60 bra $L5; .loc 1 3932_gfortrani_memcmp_char4.loc 1 68 26 setp.gt.s32 %r52,%r51,0; .loc 1 74 11 selp.u64 %r29,%r29,%r32,%r52; selp.u64 %r28,%r28,%r33,%r52; bra $L3; $L5: .loc 1 3932_gfortrani_memcmp_char4.loc 1 68 26 setp.ge.s32 %r57,%r56,0; .loc 1 74 11 selp.u64 %r29,%r29,%r32,%r57; selp.u64 %r28,%r28,%r33,%r57; bra $L3; $L8: mov.u64 %r29,%r32; mov.u64 %r28,%r33; $L3: add.u64 %r32,%r32,%r27; .loc 1 66 25 add.u64 %r33,%r33,1; .loc 1 66 3 setp.ge.s64 %r58,%r31,%r33; @ %r58 bra $L6; .loc 1 76 10 cvt.u32.u64 %r36,%r28; bra $L1; $L7: .loc 1 59 12 mov.u32 %r36,0; $L1: .loc 1 _gfortran_mminloc2_4_s4gfortran_mminloc2_4_s4predu16 %r88; .reg .u32 %r89; .reg .pred %r90; .reg .pred %r91; .reg .predpredpred %r115; .reg .pred.loc 1 99 12 ld.u64 %r64,[%r59+56]; add.u64 %r63,%r64,1; .loc 1 99 10 ld.u64 %r65,[%r59+48]; sub.u64 %r48,%r63,%r65; .loc 1 100 6 setp.le.s64 %r66,%r48,0; @ %r66 bra $L20; .loc 1 103 44 ld.u64 %r67,[%r59+40]; mul.lo.u64 %r27,%r62,%r67; .loc 1 105 15 ld.u64 %r28,[%r60+16]; .loc 1 106 9 ld.u64 %r51,[%r60]; .loc 1 108 22 cvt.u32.u64 %r29,%r28; .loc 1 108 53 add.u32 %r68,%r29,-4; and.b32 %r69,%r68,-5; set.u32.eq.u32 %r71,%r69,0; neg.s32 %r72,%r71; .loc 1 108 22 add.u32 %r73,%r29,-1; set.u32.le.u32 %r75,%r73,1; neg.s32 %r76,%r75; .loc 1 108 6 cvt.u16.u32 %r78,%r72; cvt.u16.u32 %r79,%r76; or.b16 %r77,%r78,%r79;110 7 setp.ne.u32 %r84,%r29,16; @ %r84 bra $L12; $L11: .loc 1 117 13 ld.u64 %r85,[%r60+40]; mul.lo.u64 %r34,%r28,%r85; .loc 1 120 9 mov.u64 %r52,0; bra $L13; $L12: .loc 1 115 5 cvta.const.u64 %r87,$LC0; mov.u64 %r86,r87; call _gfortrani_internal_error13: .loc 1 122 10 ld.s8 %r89,[%r51]; cvt.u16.u32 %r88,%r89; setp.ne.u16 %r90,%r88,0; @ %r90 bra $L14; .loc 1 124 13 add.u64 %r51,%r51,%r34; .loc 1 120 24 add.u64 %r52,%r52,1; .loc 1 120 3 setp.ne.u64 %r91,%r48,%r52; @ %r91 bra $L13; .loc 1 101 12 mov.u32 %r58,0; bra $L9; $L14: .loc 1 127 6 setp.eq.u64 %r92,%r48,%r52; @ %r92 bra $L21; .loc 1 130 7 add.u64 %r42,%r52,1; .loc 1 131 30 mul.lo.u64 %r93,%r27,%r52; .loc 1 131 26 shl.b64 %r94,%r93,2; .loc 1 131 7 ld.u64 %r95,[%r59]; add.u64 %r44,%r95,%r94; .loc 1 134 3 setp.lt.s64 %r96,%r48,%r42; @ %r96 bra $L15; .loc 1 143 11 shl.b64 %r41,%r27,2; mov.u64 %r55,%r42; .loc 1 131 7 mov.u64 %r54,%r44; .loc 1 137 18 setp.eq.u32 %r118,%r61,0; $L19: .loc 1 137 10 ld.s8 %r98,[%r51]; cvt.u16.u32 %r97,%r98; setp.eq.u16 %r99,%r97,0; @ %r99 bra $L16; .loc 1 137 18 @ %r118 bra $L17; .loc 1 39_gfortrani_memcmp_char44,[%value_in]; } .loc 1 137 18 set.u32.le.s32 %r106,%r104,0; neg.s32 %r107,%r106; cvt.u32.u32 %r105,%r107; cvt.u32.u8 %r53,%r105; bra $L18; $L17: .loc 1 39_gfortrani_memcmp_char4111,[%value_in]; } .loc 1 137 18 shr.u32 %r113,%r111,31; cvt.u32.u32 %r114,%r113; cvt.u32.u8 %r53,%r114; bra $L18; $L16: .loc 1 143 11 add.u64 %r54,%r54,%r41; .loc 1 144 13 add.u64 %r51,%r51,%r34; .loc 1 134 27 add.u64 %r55,%r55,1; .loc 1 134 3 setp.ge.s64 %r115,%r48,%r55; @ %r115 bra $L19; $L15: .loc 1 146 10 cvt.u32.u64 %r58,%r42; bra $L9; $L20: .loc 1 101 12 mov.u32 %r58,0; bra $L9; $L21: mov.u32 %r58,0; bra $L9; $L18: .loc 1 137 18 setp.eq.u32 %r117,%r53,0; selp.u64 %r44,%r44,%r54,%r117; selp.u64 %r42,%r42,%r55,%r117; bra $L16;gfortran_sminloc2_4_s4gfortran_sminloc2_4_s4predmov.u64 %r28,%ar3; .loc 1 157 6 setp.eq.u64 %r29,%r26,0; @ %r29 bra $L31; .loc 1 158 12 cvt.s64.s32 %r33,%r27; cvt.u32.u64 
%r31,%r225(%value_in),_gfortran_minloc2_4_s4mov.u32 %r24,%r34; bra $L29; $L31: .loc 1 160 12 cvt.u32 minloc2_8_s1.o/ 1622802250gfortran_minloc2_8_s1gfortran_minloc2_8_s1.file 1 "../../../../libgfortran/generated/minloc2_8_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc2_8_s1gfortran_mminloc2_8_s1); // BEGIN GLOBAL FUNCTION DECL: _gfortran_sminloc2_8_s1gfortran_sminloc2_8_s1); // BEGIN GLOBAL FUNCTION DECL: memcmpmem_gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc2_8_s1gfortran_minloc2_8_s1value;mov.u64 %r37,%ar0; mov.u32 %r38,%ar1; mov.u64 %r39,%ar2; .loc 1 57 12 ld.u64 %r41,[%r37+56]; add.u64 %r40,%r41,1; .loc 1 57 10 ld.u64 %r42,[%r37+48]; sub.u64 %r29,%r40,%r42; .loc 1 58 6 setp.le.s64 %r43,%r29,0; @ %r43 bra $L7; .loc 1 61 44 ld.u64 %r44,[%r37+40]; mul.lo.u64 %r27,%r39,%r44; .loc 1 64 7 ld.u64 %r32,[%r37]; add.u64 %r35,%r29,1; mov.u64 %r28,%r32; .loc 1 66 9 mov.u64 %r33,1; mov.u64 %r36,%r33; .loc 1 68 10 setp.eq.u32 %r56,%r38,0; bra $L3; $L6: setp.eq.u64 %r45,%r28,0; @ %r45 bra $L8; .loc 1 3710 @ %r56 bra $L5; .loc 1 68 26 setp.gt.s32 %r52,%r49,0; selp.u64 %r28,%r28,%r32,%r52; selp.u64 %r36,%r36,%r33,%r52; bra $L3; $L5: setp.ge.s32 %r53,%r49,0; selp.u64 %r28,%r28,%r32,%r53; selp.u64 %r36,%r36,%r33,%r53; bra $L3; $L8: .loc 1 74 11 mov.u64 %r28,%r32; mov.u64 %r36,%r33; $L3: add.u64 %r32,%r32,%r27; .loc 1 66 25 add.u64 %r33,%r33,1; .loc 1 66 3 setp.ne.u64 %r54,%r33,%r35; @ %r54 bra $L6; bra $L1; $L7: .loc 1 59 12 mov.u64 %r36,0; $L1: .loc 1 77_gfortran_mminloc2_8_s1gfortran_mminloc2_8_s1.reg .u32 %r75; .reg .u16 %r76; .reg .u16 %r77; .reg .u16 %r78; .reg .u32pred32pred %r91; .reg .u64 %r93; .reg .predpred %r97; .reg .u32pred %r110; .reg .pred %r112; .reg .pred %r113; mov.u64 %r58,%ar0; mov.u64 %r59,%ar1; mov.u32 %r60,%ar2; mov.u64 %r61,%ar3; .loc 1 99 12 ld.u64 %r63,[%r58+56]; add.u64 %r62,%r63,1; .loc 1 99 10 ld.u64 %r64,[%r58+48]; sub.u64 %r48,%r62,%r64; .loc 1 100 6 setp.le.s64 %r65,%r48,0; @ %r65 bra $L19; .loc 1 103 44 ld.u64 %r66,[%r58+40]; mul.lo.u64 %r27,%r61,%r66; .loc 1 105 15 ld.u64 %r28,[%r59+16]; .loc 1 106 9 ld.u64 %r51,[%r59]; .loc 1 108 22 cvt.u32.u64 %r29,%r28; .loc 1 108 53 add.u32 %r67,%r29,-4; and.b32 %r68,%r67,-5; set.u32.eq.u32 %r70,%r68,0; neg.s32 %r71,%r70; .loc 1 108 22 add.u32 %r72,%r29,-1; set.u32.le.u32 %r74,%r72,1; neg.s32 %r75,%r74; .loc 1 108 6 cvt.u16.u32 %r77,%r71; cvt.u16.u32 %r78,%r75; or.b16 %r76,%r77,%r78; cvt.u32.u16 %r79,%r76; cvt.u16.u8 %r80,%r79; setp.ne.u16 %r81,%r80,0; @ %r81 bra $L11; .loc 1 110 7 setp.ne.u32 %r83,%r29,16; @ %r83 bra $L12; $L11: .loc 1 117 13 ld.u64 %r84,[%r59+40]; mul.lo.u64 %r37,%r28,%r84; .loc 1 120 9 mov.u64 %r52,0; bra $L13; $L12: .loc 1 115 5 cvta.const.u64 %r86,$LC0; mov.u64 %r85,r86; call _gfortrani_internal_error13: .loc 1 122 10 ld.s8 %r88,[%r51]; cvt.u16.u32 %r87,%r88; setp.ne.u16 %r89,%r87,0; @ %r89 bra $L14; .loc 1 124 13 add.u64 %r51,%r51,%r37; .loc 1 120 24 add.u64 %r52,%r52,1; .loc 1 120 3 setp.ne.u64 %r90,%r48,%r52; @ %r90 bra $L13; .loc 1 101 12 mov.u64 %r57,0; bra $L9; $L14: .loc 1 127 6 setp.eq.u64 %r91,%r48,%r52; @ %r91 bra $L20; .loc 1 130 7 add.u64 %r57,%r52,1; .loc 1 131 7 ld.u64 %r93,[%r58]; mad.lo.u64 %r43,%r27,%r52,%r93; .loc 1 134 3 setp.lt.s64 %r94,%r48,%r57; @ %r94 bra $L9; add.u64 %r34,%r48,1; mov.u64 %r55,%r57; .loc 1 131 7 mov.u64 %r54,%r43; .loc 1 137 18 setp.eq.u32 %r113,%r60,0; $L18: .loc 1 137 10 ld.s8 %r96,[%r51]; cvt.u16.u32 %r95,%r96; setp.eq.u16 %r97,%r95,0; @ %r97 bra $L15; .loc 1 
371,[%value_in]; } .loc 1 137 18 @ %r113 bra $L16; set.u32.le.s32 %r105,%r101,0; neg.s32 %r106,%r105; cvt.u32.u32 %r104,%r106; cvt.u32.u8 %r53,%r104; bra $L17; $L16: shr.u32 %r108,%r101,31; cvt.u32.u32 %r109,%r108; cvt.u32.u8 %r53,%r109; bra $L17; $L15: .loc 1 143 11 add.u64 %r54,%r54,%r27; .loc 1 144 13 add.u64 %r51,%r51,%r37; .loc 1 134 27 add.u64 %r55,%r55,1; .loc 1 134 3 setp.ne.u64 %r110,%r34,%r55; @ %r110 bra $L18; bra $L9; $L19: .loc 1 101 12 mov.u64 %r57,0; bra $L9; $L20: mov.u64 %r57,0; bra $L9; $L17: .loc 1 137 18 setp.eq.u32 %r112,%r53,0; selp.u64 %r43,%r43,%r54,%r112; selp.u64 %r57,%r57,%r55,%r112; bra $L15; $L9predmov.u64 %r28,%ar3; .loc 1 157 6 setp.eq.u64 %r29,%r26,0; @ %r29 bra $L30; .loc 1 158 12 cvt.s64.s32 %r33,%r27; cvt.u32.u64 %r31,%r2825(%value_in),_gfortran_minloc2_8_s1mov.u64 %r24,%r34; bra $L28; $L30: .loc 1 160 12 mov.u64 %r24,%r26; $L28gfortran_minloc2_8_s4gfortran_minloc2_8_s4.file 1 "../../../../libgfortran/generated/minloc2_8_s4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc2_8_s4gfortran_mminloc2_8_s4); // BEGIN GLOBAL FUNCTION DECL: _gfortran_sminloc2_8_s4gfortran_sminloc2_8_s4); // BEGIN GLOBAL FUNCTION DECL: _gfortrani_memcmp_char4_gfortrani_memcmp_char4_gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc2_8_s4gfortran_minloc2_8_s4value;predpred %r57; .reg .pred %r59; mov.u64 %r36,%ar0; mov.u32 %r37,%ar1; mov.u64 %r38,%ar2; .loc 1 57 12 ld.u64 %r40,[%r36+56]; add.u64 %r39,%r40,1; .loc 1 57 10 ld.u64 %r41,[%r36+48]; sub.u64 %r30,%r39,%r41; .loc 1 58 6 setp.le.s64 %r42,%r30,0; @ %r42 bra $L7; .loc 1 64 7 ld.u64 %r31,[%r36]; .loc 1 74 11 ld.u64 %r44,[%r36+40]; mul.lo.u64 %r43,%r38,%r44; shl.b64 %r27,%r43,2; .loc 1 64 7 mov.u64 %r29,%r31; .loc 1 66 9 mov.u64 %r32,1; mov.u64 %r35,%r32; .loc 1 68 10 setp.eq.u32 %r59,%r37,0; bra $L3; $L6: setp.eq.u64 %r45,%r29,0; @ %r45 bra $L8; @ %r59 bra $L5; .loc 1 39 1229_gfortrani_memcmp_char4.loc 1 68 26 setp.gt.s32 %r51,%r50,0; .loc 1 74 11 selp.u64 %r29,%r29,%r31,%r51; selp.u64 %r35,%r35,%r32,%r51; bra $L3; $L5: .loc 1 39 1229_gfortrani_memcmp_char4.loc 1 68 26 setp.ge.s32 %r56,%r55,0; .loc 1 74 11 selp.u64 %r29,%r29,%r31,%r56; selp.u64 %r35,%r35,%r32,%r56; bra $L3; $L8: mov.u64 %r29,%r31; mov.u64 %r35,%r32; $L3: add.u64 %r31,%r31,%r27; .loc 1 66 25 add.u64 %r32,%r32,1; .loc 1 66 3 setp.ge.s64 %r57,%r30,%r32; @ %r57 bra $L6; bra $L1; $L7: .loc 1 59 12 mov.u64 %r35,0; $L1: .loc 1 77gfortran_mminloc2_8_s4gfortran_mminloc2_8_s4.reg .u32 %r75; .reg .u16 %r76; .reg .u16 %r77; .reg .u16 %r78; .reg .u32predpred %r89; .reg .pred %r90; .reg .predpredmov.u64 %r58,%ar0; mov.u64 %r59,%ar1; mov.u32 %r60,%ar2; mov.u64 %r61,%ar3; .loc 1 99 12 ld.u64 %r63,[%r58+56]; add.u64 %r62,%r63,1; .loc 1 99 10 ld.u64 %r64,[%r58+48]; sub.u64 %r47,%r62,%r64; .loc 1 100 6 setp.le.s64 %r65,%r47,0; @ %r65 bra $L19; .loc 1 103 44 ld.u64 %r66,[%r58+40]; mul.lo.u64 %r27,%r61,%r66; .loc 1 105 15 ld.u64 %r28,[%r59+16]; .loc 1 106 9 ld.u64 %r50,[%r59]; .loc 1 108 22 cvt.u32.u64 %r29,%r28; .loc 1 108 53 add.u32 %r67,%r29,-4; and.b32 %r68,%r67,-5; set.u32.eq.u32 %r70,%r68,0; neg.s32 %r71,%r70; .loc 1 108 22 add.u32 %r72,%r29,-1; set.u32.le.u32 %r74,%r72,1; neg.s32 %r75,%r74; .loc 1 108 6 cvt.u16.u32 %r77,%r71; cvt.u16.u32 %r78,%r75; or.b16 %r76,%r77,%r78; cvt.u32.u16 %r79,%r76; cvt.u16.u8 %r80,%r79; setp.ne.u16 %r81,%r80,0; @ %r81 bra $L11; .loc 1 110 7 setp.ne.u32 %r83,%r29,16; @ %r83 bra $L12; $L11: .loc 1 117 13 ld.u64 %r84,[%r59+40]; mul.lo.u64 %r34,%r28,%r84; .loc 
1 120 9 mov.u64 %r51,0; bra $L13; $L12: .loc 1 115 5 cvta.const.u64 %r86,$LC0; mov.u64 %r85,r86; call _gfortrani_internal_error13: .loc 1 122 10 ld.s8 %r88,[%r50]; cvt.u16.u32 %r87,%r88; setp.ne.u16 %r89,%r87,0; @ %r89 bra $L14; .loc 1 124 13 add.u64 %r50,%r50,%r34; .loc 1 120 24 add.u64 %r51,%r51,1; .loc 1 120 3 setp.ne.u64 %r90,%r47,%r51; @ %r90 bra $L13; .loc 1 101 12 mov.u64 %r57,0; bra $L9; $L14: .loc 1 127 6 setp.eq.u64 %r91,%r47,%r51; @ %r91 bra $L20; .loc 1 130 7 add.u64 %r57,%r51,1; .loc 1 131 30 mul.lo.u64 %r92,%r27,%r51; .loc 1 131 26 shl.b64 %r93,%r92,2; .loc 1 131 7 ld.u64 %r94,[%r58]; add.u64 %r43,%r94,%r93; .loc 1 134 3 setp.lt.s64 %r95,%r47,%r57; @ %r95 bra $L9; .loc 1 143 11 shl.b64 %r41,%r27,2; mov.u64 %r54,%r57; .loc 1 131 7 mov.u64 %r53,%r43; .loc 1 137 18 setp.eq.u32 %r117,%r60,0; $L18: .loc 1 137 10 ld.s8 %r97,[%r50]; cvt.u16.u32 %r96,%r97; setp.eq.u16 %r98,%r96,0; @ %r98 bra $L15; .loc 1 137 18 @ %r117 bra $L16; .loc 1 39call (%value_in),_gfortrani_memcmp_char43,[%value_in]; } .loc 1 137 18 set.u32.le.s32 %r105,%r103,0; neg.s32 %r106,%r105; cvt.u32.u32 %r104,%r106; cvt.u32.u8 %r52,%r104; bra $L17; $L16: .loc 1 39call (%value_in),_gfortrani_memcmp_char4110,[%value_in]; } .loc 1 137 18 shr.u32 %r112,%r110,31; cvt.u32.u32 %r113,%r112; cvt.u32.u8 %r52,%r113; bra $L17; $L15: .loc 1 143 11 add.u64 %r53,%r53,%r41; .loc 1 144 13 add.u64 %r50,%r50,%r34; .loc 1 134 27 add.u64 %r54,%r54,1; .loc 1 134 3 setp.ge.s64 %r114,%r47,%r54; @ %r114 bra $L18; bra $L9; $L19: .loc 1 101 12 mov.u64 %r57,0; bra $L9; $L20: mov.u64 %r57,0; bra $L9; $L17: .loc 1 137 18 setp.eq.u32 %r116,%r52,0; selp.u64 %r43,%r43,%r53,%r116; selp.u64 %r57,%r57,%r54,%r116; bra $L15; $L9predmov.u64 %r28,%ar3; .loc 1 157 6 setp.eq.u64 %r29,%r26,0; @ %r29 bra $L30; .loc 1 158 12 cvt.s64.s32 %r33,%r27; cvt.u32.u64 %r31,%r2825(%value_in),_gfortran_minloc2_8_s4mov.u64 %r24,%r34; bra $L28; $L30: .loc 1 160 12 mov.u64 %r24,%r26; $L28 minloc2_16_s1.o/1622802250gfortran_minloc2_16_s1 .visible .func _gfortran_minloc2_16_s1); .file 1 "../../../../libgfortran/generated/minloc2_16_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc2_16_s1 .visible .func _gfortran_mminloc2_16_s1_gfortran_sminloc2_16_s1 .visible .func _gfortran_sminloc2_16_s1memcmpmem_gfortrani_internal_error .extern .func _gfortrani_internal_21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc2_16_s1 .visible .func _gfortran_minloc2_16_s1predmov.u32 %r40,%ar2; mov.u64 %r41,%ar3; .loc 1 57 12 ld.u64 %r43,[%r39+56]; add.u64 %r42,%r43,1; .loc 1 57 10 ld.u64 %r44,[%r39+48]; sub.u64 %r31,%r42,%r44; .loc 1 58 6 setp.gt.s64 %r45,%r31,0; @ %r45 bra $L2; .loc 1 59 12 mov.u64 %r46,0; st.u64 [%r38],%r46; st.u64 [%r38+8],%r46; bra $L1; $L2: .loc 1 61 44 ld.u64 %r48,[%r39+40]; mul.lo.u64 %r27,%r41,%r48; .loc 1 64 7 ld.u64 %r34,[%r39]; add.u64 %r37,%r31,1; mov.u64 %r30,%r34; .loc 1 66 9 mov.u64 %r35,1; mov.u64 %r29,%r35; .loc 1 68 10 setp.eq.u32 %r60,%r40,0; bra $L4; $L7: setp.eq.u64 %r49,%r30,0; @ %r49 bra $L8; .loc 1 37(%value_in),memcmp,(%out_arg1,%out_arg2,68 10 @ %r60 bra $L6; .loc 1 68 26 setp.gt.s32 %r56,%r53,0; selp.u64 %r30,%r30,%r34,%r56; selp.u64 %r29,%r29,%r35,%r56; bra $L4; $L6: setp.ge.s32 %r57,%r53,0; selp.u64 %r30,%r30,%r34,%r57; selp.u64 %r29,%r29,%r35,%r57; bra $L4; $L8: .loc 1 74 11 mov.u64 %r30,%r34; mov.u64 %r29,%r35; $L4: add.u64 %r34,%r34,%r27; .loc 1 66 25 add.u64 %r35,%r35,1; .loc 1 66 3 setp.ne.u64 %r58,%r35,%r37; @ %r58 bra $L7; .loc 1 76 10 st.u64 [%r38],%r29; shr.s64 %r59,%r29,63; st.u64 [%r38+8],%r59; $L1: .loc 1 
77_gfortran_mminloc2_16_s1 .visible .func _gfortran_mminloc2_16_s1in_ar4) {u16 %r84; .reg .pred %r85; .reg .predpred.loc 1 99 12 ld.u64 %r65,[%r60+56]; add.u64 %r64,%r65,1; .loc 1 99 10 ld.u64 %r66,[%r60+48]; sub.u64 %r50,%r64,%r66; .loc 1 100 6 setp.gt.s64 %r67,%r50,0; @ %r67 bra $L10; $L16: .loc 1 101 12 mov.u64 %r68,0; st.u64 [%r59],%r68; st.u64 [%r59+8],%r68; bra $L9; $L10: .loc 1 103 44 ld.u64 %r70,[%r60+40]; mul.lo.u64 %r27,%r63,%r70; .loc 1 105 15 ld.u64 %r28,[%r61+16]; .loc 1 106 9 ld.u64 %r53,[%r61]; .loc 1 108 22 cvt.u32.u64 %r29,%r28; .loc 1 108 53 add.u32 %r71,%r29,-4; and.b32 %r72,%r71,-5;.loc 1 108 22 add.u32 %r76,%r29,-1; set.u32.le.u32 %r78,%r76,1; neg.s32 %r79,%r78; .loc 1 108 6 cvt.u16.u32 %r81,%r75; cvt.u16.u32 %r82,%r79; or.b16 %r80,%r81,%r82;@ %r85 bra $L12; .loc 1 110 7 setp.ne.u32 %r87,%r29,16; @ %r87 bra $L13; $L12: .loc 1 117 13 ld.u64 %r88,[%r61+40]; mul.lo.u64 %r37,%r28,%r88; .loc 1 120 9 mov.u64 %r54,0; bra $L14; $L13: .loc 1 115 5 cvta.const.u64 %r90,$LC0; mov.u64 %r89,0; {r90; call _gfortrani_internal_error14: .loc 1 122 10 ld.s8 %r92,[%r53]; cvt.u16.u32 %r91,%r92; setp.ne.u16 %r93,%r91,0; @ %r93 bra $L15; .loc 1 124 13 add.u64 %r53,%r53,%r37; .loc 1 120 24 add.u64 %r54,%r54,1; .loc 1 120 3 setp.ne.u64 %r94,%r50,%r54; @ %r94 bra $L14; bra $L16; $L15: .loc 1 127 6 setp.eq.u64 %r95,%r50,%r54; @ %r95 bra $L16; .loc 1 130 7 add.u64 %r44,%r54,1; .loc 1 131 7 ld.u64 %r97,[%r60]; mad.lo.u64 %r45,%r27,%r54,%r97; .loc 1 134 3 setp.lt.s64 %r98,%r50,%r44; @ %r98 bra $L17; add.u64 %r33,%r50,1; mov.u64 %r57,%r44; .loc 1 131 7 mov.u64 %r56,%r45; .loc 1 137 18 setp.eq.u32 %r117,%r62,0; $L21: .loc 1 137 10 ld.s8 %r100,[%r53]; cvt.u16.u32 %r99,%r100; setp.eq.u16 %r101,%r99,0; @ %r101 bra $L18; .loc 1 37105,[%value_in]; } .loc 1 137 18 @ %r117 bra $L19; set.u32.le.s32 %r109,%r105,0; neg.s32 %r110,%r109; cvt.u32.u32 %r108,%r110; cvt.u32.u8 %r55,%r108; bra $L20; $L19: shr.u32 %r112,%r105,31; cvt.u32.u32 %r113,%r112; cvt.u32.u8 %r55,%r113; bra $L20; $L18: .loc 1 143 11 add.u64 %r56,%r56,%r27; .loc 1 144 13 add.u64 %r53,%r53,%r37; .loc 1 134 27 add.u64 %r57,%r57,1; .loc 1 134 3 setp.ne.u64 %r114,%r33,%r57; @ %r114 bra $L21; $L17: .loc 1 146 10 st.u64 [%r59],%r44; shr.s64 %r115,%r44,63; st.u64 [%r59+8],%r115; bra $L9; $L20: .loc 1 137 18 setp.eq.u32 %r116,%r55,0; selp.u64 %r45,%r45,%r56,%r116; selp.u64 %r44,%r44,%r57,%r116; bra $L18; $L9:_gfortran_sminloc2_16_s1 .visible .func _gfortran_sminloc2_16_s1in_ar4) {predmov.u64 %r29,%ar4; .loc 1 157 6 setp.eq.u64 %r30,%r27,0; @ %r30 bra $L31; .loc 1 158 12 cvt.s64.s32 %r35,%r28; cvt.u32.u64 %r33,%r29; {35; call _gfortran_minloc2_16_s1ld.u64 %r38,[%frame]; ld.u64 %r39,[%frame+8]; bra $L30; $L31: .loc 1 160 12 mov.u64 %r38,%r27; mov.u64 %r39,%r38; $L30: st.u64 [%r25],%r38; st.u64 [%r25+8],%r39; .loc 1 161 minloc2_16_s4.o/1622802250gfortran_minloc2_16_s4 .visible .func _gfortran_minloc2_16_s4); .file 1 "../../../../libgfortran/generated/minloc2_16_s4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc2_16_s4 .visible .func _gfortran_mminloc2_16_s_gfortran_sminloc2_16_s4 .visible .func _gfortran_sminloc2_16_s_gfortrani_memcmp_char4_gfortrani_memcmp_char4_gfortrani_internal_error .extern .func _gfortrani_internal_21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc2_16_s4 .visible .func _gfortran_minloc2_16_s4pred.loc 1 57 12 ld.u64 %r42,[%r38+56]; add.u64 %r41,%r42,1; .loc 1 57 10 ld.u64 %r43,[%r38+48]; sub.u64 %r32,%r41,%r43; .loc 1 58 6 setp.gt.s64 %r44,%r32,0; @ %r44 bra $L2; .loc 1 59 12 mov.u64 %r45,0; st.u64 
[%r37],%r45; st.u64 [%r37+8],%r45; bra $L1; $L2: .loc 1 64 7 ld.u64 %r33,[%r38]; .loc 1 74 11 ld.u64 %r48,[%r38+40]; mul.lo.u64 %r47,%r40,%r48; shl.b64 %r27,%r47,2; .loc 1 64 7 mov.u64 %r30,%r33; .loc 1 66 9 mov.u64 %r34,1; mov.u64 %r29,%r34; .loc 1 68 10 setp.eq.u32 %r63,%r39,0; bra $L4; $L7: setp.eq.u64 %r49,%r30,0; @ %r49 bra $L8; @ %r63 bra $L(%value_in),_gfortrani_memcmp_char468 26 setp.gt.s32 %r55,%r54,0; .loc 1 74 11 selp.u64 %r30,%r30,%r33,%r55; selp.u64 %r29,%r29,%r34,%r55; bra $L4; $L(%value_in),_gfortrani_memcmp_char4.loc 1 68 26 setp.ge.s32 %r60,%r59,0; .loc 1 74 11 selp.u64 %r30,%r30,%r33,%r60; selp.u64 %r29,%r29,%r34,%r60; bra $L4; $L8: mov.u64 %r30,%r33; mov.u64 %r29,%r34; $L4: add.u64 %r33,%r33,%r27; .loc 1 66 25 add.u64 %r34,%r34,1; .loc 1 66 3 setp.ge.s64 %r61,%r32,%r34; @ %r61 bra $L7; .loc 1 76 10 st.u64 [%r37],%r29; shr.s64 %r62,%r29,63; st.u64 [%r37+8],%r62; $L1: .loc 1 77_gfortran_mminloc2_16_s4 .visible .func _gfortran_mminloc2_16_s, .param .u64 %in_ar4) {predpredu32 %r110; .reg .u32 %r114; .reg .u32 %r116; .reg .u32 %r117; .reg .pred.loc 1 99 12 ld.u64 %r65,[%r60+56]; add.u64 %r64,%r65,1; .loc 1 99 10 ld.u64 %r66,[%r60+48]; sub.u64 %r49,%r64,%r66; .loc 1 100 6 setp.gt.s64 %r67,%r49,0; @ %r67 bra $L10; $L16: .loc 1 101 12 mov.u64 %r68,0; st.u64 [%r59],%r68; st.u64 [%r59+8],%r68; bra $L9; $L10: .loc 1 103 44 ld.u64 %r70,[%r60+40]; mul.lo.u64 %r27,%r63,%r70; .loc 1 105 15 ld.u64 %r28,[%r61+16]; .loc 1 106 9 ld.u64 %r52,[%r61]; .loc 1 108 22 cvt.u32.u64 %r29,%r28; .loc 1 108 53 add.u32 %r71,%r29,-4; and.b32 %r72,%r71,-5;.loc 1 108 22 add.u32 %r76,%r29,-1; set.u32.le.u32 %r78,%r76,1; neg.s32 %r79,%r78; .loc 1 108 6 cvt.u16.u32 %r81,%r75; cvt.u16.u32 %r82,%r79; or.b16 %r80,%r81,%r82;@ %r85 bra $L12; .loc 1 110 7 setp.ne.u32 %r87,%r29,16; @ %r87 bra $L13; $L12: .loc 1 117 13 ld.u64 %r88,[%r61+40]; mul.lo.u64 %r34,%r28,%r88; .loc 1 120 9 mov.u64 %r53,0; bra $L14; $L13: .loc 1 115 5 cvta.const.u64 %r90,$LC0; mov.u64 %r89,0; {r90; call _gfortrani_internal_error14: .loc 1 122 10 ld.s8 %r92,[%r52]; cvt.u16.u32 %r91,%r92; setp.ne.u16 %r93,%r91,0; @ %r93 bra $L15; .loc 1 124 13 add.u64 %r52,%r52,%r34; .loc 1 120 24 add.u64 %r53,%r53,1; .loc 1 120 3 setp.ne.u64 %r94,%r49,%r53; @ %r94 bra $L14; bra $L16; $L15: .loc 1 127 6 setp.eq.u64 %r95,%r49,%r53; @ %r95 bra $L16; .loc 1 130 7 add.u64 %r43,%r53,1; .loc 1 131 30 mul.lo.u64 %r96,%r27,%r53; .loc 1 131 26 shl.b64 %r97,%r96,2; .loc 1 131 7 ld.u64 %r98,[%r60]; add.u64 %r45,%r98,%r97; .loc 1 134 3 setp.lt.s64 %r99,%r49,%r43; @ %r99 bra $L17; .loc 1 143 11 shl.b64 %r41,%r27,2; mov.u64 %r56,%r43; .loc 1 131 7 mov.u64 %r55,%r45; .loc 1 137 18 setp.eq.u32 %r121,%r62,0; $L21: .loc 1 137 10 ld.s8 %r101,[%r52]; cvt.u16.u32 %r100,%r101; setp.eq.u16 %r102,%r100,0; @ %r102 bra $L18; .loc 1 137 18 @ %r121 bra $L19; .loc 1 39_gfortrani_memcmp_char47,[%value_in]; } .loc 1 137 18 set.u32.le.s32 %r109,%r107,0; neg.s32 %r110,%r109; cvt.u32.u32 %r108,%r110; cvt.u32.u8 %r54,%r108; bra $L20; $L19: .loc 1 39_gfortrani_memcmp_char4114,[%value_in]; } .loc 1 137 18 shr.u32 %r116,%r114,31; cvt.u32.u32 %r117,%r116; cvt.u32.u8 %r54,%r117; bra $L20; $L18: .loc 1 143 11 add.u64 %r55,%r55,%r41; .loc 1 144 13 add.u64 %r52,%r52,%r34; .loc 1 134 27 add.u64 %r56,%r56,1; .loc 1 134 3 setp.ge.s64 %r118,%r49,%r56; @ %r118 bra $L21; $L17: .loc 1 146 10 st.u64 [%r59],%r43; shr.s64 %r119,%r43,63; st.u64 [%r59+8],%r119; bra $L9; $L20: .loc 1 137 18 setp.eq.u32 %r120,%r54,0; selp.u64 %r45,%r45,%r55,%r120; selp.u64 %r43,%r43,%r56,%r120; bra $L18; $L9:_gfortran_sminloc2_16_s4 
// [end of the preceding archive member: epilogue of a _gfortran_sminloc2_16_s* wrapper that either calls _gfortran_minloc2_16_s4 or stores a zero result]

maxval0_s1.o/   .file … "…/libgfortran/generated/maxval0_s1.c"
// BEGIN GLOBAL FUNCTION DECL: _gfortran_maxval0_s1   .visible .func _gfortran_maxval0_s1
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxval0_s1  .visible .func _gfortran_mmaxval0_s1
// BEGIN GLOBAL FUNCTION DECL: _gfortran_smaxval0_s1  .visible .func _gfortran_smaxval0_s1
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error  .extern .func _gfortran_runtime_error
// BEGIN GLOBAL FUNCTION DECL: memcmp  .extern .func memcmp
// String constants (__func__$0/$1 and $LC* literals, decoded from the .const .u8 initializers):
//   "…xval0_s1", "maxval0_s1", "…tran/generated/maxval0_s1.c",
//   "Rank of array needs to be > 0", "…len == le…", "Funny sized logical a…"
// [PTX bodies of _gfortran_maxval0_s1, _gfortran_mmaxval0_s1 and _gfortran_smaxval0_s1]

maxval0_s4.o/   .file … "…/libgfortran/generated/maxval0_s4.c"
// BEGIN GLOBAL FUNCTION DECL: _gfortran_maxval0_s4   .visible .func _gfortran_maxval0_s4
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxval0_s4  .visible .func _gfortran_mmaxval0_s4
// BEGIN GLOBAL FUNCTION DECL: _gfortran_smaxval0_s4  .visible .func _gfortran_smaxval0_s4
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error  .extern .func _gfortran_runtime_error
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_memcmp_char4  .extern .func _gfortrani_memcmp_char4
// String constants (decoded): "…xval0_s4", "maxval0_s4", "…tran/generated/maxval0_s4.c",
//   "Rank of array needs to be > 0", "…len == le…", "Funny sized logical a…"
// [PTX bodies of _gfortran_maxval0_s4, _gfortran_mmaxval0_s4 and _gfortran_smaxval0_s4]
minval0_s1.o/   .file … "…/libgfortran/generated/minval0_s1.c"
// BEGIN GLOBAL FUNCTION DECL: _gfortran_minval0_s1   .visible .func _gfortran_minval0_s1
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mminval0_s1  .visible .func _gfortran_mminval0_s1
// BEGIN GLOBAL FUNCTION DECL: _gfortran_sminval0_s1  .visible .func _gfortran_sminval0_s1
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error  .extern .func _gfortran_runtime_error
// BEGIN GLOBAL FUNCTION DECL: memcmp  .extern .func memcmp
// String constants (__func__$0/$1 and $LC* literals, decoded from the .const .u8 initializers):
//   "…inval0_s1", "minval0_s1", "…tran/generated/minval0_s1.c",
//   "Rank of array needs to be > 0", "…len == le…", "Funny sized logical a…"
// [PTX bodies of _gfortran_minval0_s1, _gfortran_mminval0_s1 and _gfortran_sminval0_s1]

minval0_s4.o/   .file … "…/libgfortran/generated/minval0_s4.c"
// BEGIN GLOBAL FUNCTION DECL: _gfortran_minval0_s4   .visible .func _gfortran_minval0_s4
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mminval0_s4  .visible .func _gfortran_mminval0_s4
// BEGIN GLOBAL FUNCTION DECL: _gfortran_sminval0_s4  .visible .func _gfortran_sminval0_s4
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error  .extern .func _gfortran_runtime_error
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_memcmp_char4  .extern .func _gfortrani_memcmp_char4
// String constants (decoded): "…inval0_s4", "minval0_s4", "…tran/generated/minval0_s4.c",
//   "Rank of array needs to be > 0", "…len == le…", "Funny sized logical a…"
// [PTX bodies of _gfortran_minval0_s4, _gfortran_mminval0_s4 and _gfortran_sminval0_s4]
maxval1_s1.o/   .file 1 "../../../../libgfortran/generated/maxval1_s1.c"
// BEGIN GLOBAL FUNCTION DECL: _gfortran_maxval1_s1   .visible .func _gfortran_maxval1_s1
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxval1_s1  .visible .func _gfortran_mmaxval1_s1
// BEGIN GLOBAL FUNCTION DECL: _gfortran_smaxval1_s1  .visible .func _gfortran_smaxval1_s1
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options  .extern .global .align 8 .u64 _gfortrani_compile_options[6]
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error  .extern .func _gfortran_runtime_error
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarray  .extern .func _gfortrani_xmallocarray
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return  .extern .func _gfortrani_bounds_ifunction_return
// BEGIN GLOBAL FUNCTION DECL: memcmp  .extern .func memcmp
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents  .extern .func _gfortrani_bounds_equal_extents
// String constants (__func__$0/$1 and $LC* literals, decoded from the .const .u8 initializers):
//   "Incorrect extent in return value of MAXVAL intrinsic in dimension %ld: is %ld, should be %…",
//   "…xval1_s1", "maxval1_s1", "…xlen == string_le…", "…tran/generated/maxval1_s1.c",
//   "Dim argument incorrect in MAXVAL intrinsic: is %ld, should be between 1 and %l…",
//   "…ank of return array incorrect in MAXVAL intrinsic: is %ld, should be %l…",
//   "MAXVAL…", "…return va…", "Funny sized logical arra…",
//   "…ank of return array incorrect in MAXVAL intrinsic"
// [PTX bodies of _gfortran_maxval1_s1, _gfortran_mmaxval1_s1 and _gfortran_smaxval1_s1]

maxval1_s4.o/   .file 1 "../../../../libgfortran/generated/maxval1_s4.c"
// BEGIN GLOBAL FUNCTION DECL: _gfortran_maxval1_s4   .visible .func _gfortran_maxval1_s4
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxval1_s4  .visible .func _gfortran_mmaxval1_s4
// BEGIN GLOBAL FUNCTION DECL: _gfortran_smaxval1_s4  .visible .func _gfortran_smaxval1_s4
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options  .extern .global .align 8 .u64 _gfortrani_compile_options[6]
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error  .extern .func _gfortran_runtime_error
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarray  .extern .func _gfortrani_xmallocarray
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return  .extern .func _gfortrani_bounds_ifunction_return
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_memcmp_char4  .extern .func _gfortrani_memcmp_char4
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents  .extern .func _gfortrani_bounds_equal_extents
// String constants (decoded): the same set of messages and names as in maxval1_s1.o, with "maxval1_s4" and maxval1_s4.c in place of the _s1 names
%r219,%r190,%r218; ld.u64 %r31,[%r219+56]; ld.u64 %r33,[%r219+48]; .loc 1 82 11 ld.u64 %r34,[%r219+40]; .loc 1 84 3 setp.ne.u64 %r233,%r115,0; @ %r233 bra $L4; $L10: .loc 1 92 3 setp.gt.s64 %r234,%r114,%r115; @ %r234 bra $L5; bra $L48; $L4: add.u64 %r100,%r190,40; add.u64 %r84,%frame,120; add.u64 %r73,%frame,240; add.u64 %r236,%r25,%r25; add.u64 %r237,%r236,%r25190,16; add.u64 %r128,%r238,%r239; .loc 1 90 12 mov.u64 %r396,0; $L9: .loc 1 86 51 ld.u64 %r241,[%r100]; mul.lo.u64 %r240,%r241,%r192; .loc 1 86 18 st.u64 [%r84],%r240; .loc 1 87 19 ld.u64 %r243,[%r100+16]; add.u64 %r242,%r243,1; ld.u64 %r244,[%r100+8]; sub.u64 %r43,%r242,%r244; .loc 1 89 10 setp.lt.s64 %r245,%r43,0; @ %r245 bra $L7; .loc 1 87 17 st.u64 [%r73],%r43; bra $L8; $L7: .loc 1 90 12 st.u64 [%r73],%r396; $L8: .loc 1 84 3 add.u64 %r100,%r100,24; add.u64 %r84,%r84,8; add.u64 %r73,%r73,8; setp.ne.u64 %r247,%r100,%r128; @ %r247 bra $L9; bra $L10; $L48: .loc 1 101 6 ld.u64 %r248,[%r188]; setp.eq.u64 %r249,%r248,0; @ ! %r249 bra $L12; bra $L11; $L5: add.u64 %r251,%r25,%r25; add.u64 %r252,%r251,%r25; shl.b64 %r253,%r252,3; add.u64 %r254,%r253,40; add.u64 %r148,%r190,%r254; shl.b64 %r255,%r25,3; add.u64 %r137,%r255,-8; add.u64 %r388,%frame,120; add.u64 %r140,%r388,%r137; add.u64 %r387,%frame,240; add.u64 %r132,%r387,%r137; add.u64 %r258,%r190,40; cvt.u32.u32 %r260,%r22; cvt.s64.s8 %r259,%r260; add.u64 %r262,%r259,%r259; add.u64 %r263,%r262,%r259; shl.b64 %r264,%r263,3; add.u64 %r104,%r258,%r264; .loc 1 98 12 mov.u64 %r395,0; $L15: .loc 1 94 56 ld.u64 %r266,[%r148]; mul.lo.u64 %r265,%r266,%r192; .loc 1 94 18 st.u64 [%r140],%r265; .loc 1 95 19 ld.u64 %r268,[%r148+16]; add.u64 %r267,%r268,1; ld.u64 %r269,[%r148+8]; sub.u64 %r53,%r267,%r269; .loc 1 97 10 setp.lt.s64 %r270,%r53,0; @ %r270 bra $L13; .loc 1 95 17 st.u64 [%r132],%r53; bra $L14; $L13: .loc 1 98 12 st.u64 [%r132],%r395; $L14: .loc 1 92 3 add.u64 %r148,%r148,24; add.u64 %r140,%r140,8; add.u64 %r132,%r132,8; setp.ne.u64 %r272,%r104,%r148; @ %r272 bra $L15; bra $L49; $L11: .loc 1 105 7 setp.le.s64 %r273,%r114,0; @ %r273 bra $L17; add.u64 %r387,%frame,240; $L36: add.u64 %r164,%r188,40; mov.u64 %r162,%r387; add.u64 %r274,%r188,16; cvt.u32.u32 %r276,%r22; cvt.s64.s8 %r275,%r276; add.u64 %r278,%r275,%r275; add.u64 %r279,%r278,%r275; shl.b64 %r280,%r279,3; add.u64 %r152,%r274,%r280; .loc 1 108 10 mov.u64 %r107,1; .loc 1 112 4 mov.u64 %r304,0; bra $L18; $L17: .loc 1 116 24 mov.u64 %r281,0; st.u64 [%r188+8],%r281; .loc 1 117 28 cvt.u16.u32 %r284,%r22; add.u16 %r283,%r284,-1; cvt.u32.u16 %r285,%r283; st.u8 [%r188+28],%r285; .loc 1 119 20 add.u32 %r286,%r22,-2; cvt.s64.s32 %r62,%r286; add.u64 %r288,%r62,%r62; add.u64 %r289,%r288,%r62; shl.b64 %r290,%r289,3; add.u64 %r291,%r188,%r290; .loc 1 119 67 shl.b64 %r293,%r62,3; add.u64 %r294,%frame,%r293; .loc 1 119 59 ld.u64 %r297,[%r291+40]; ld.u64 %r298,[%r294+240]; mul.lo.u64 %r296,%r297,%r298; .loc 1 119 18 mul.lo.u64 %r120,%r296,%r192; .loc 1 122 29300; call (%value_in),_gfortrani_xmallocarray01,[%value_in]; } .loc 1 122 27 st.u64 [%r188],%r301; .loc 1 123 10 setp.eq.u64 %r303,%r120,0; @ ! 
%r303 bra $L22; bra $L19; $L21: .loc 1 110 48 mul.lo.u64 %r107,%r58,%r107; $L18: .loc 1 112 4 st.u64 [%r164+8],%r304; ld.u64 %r58,[%r162]; add.u64 %r305,%r58,-1; st.u64 [%r164+16],%r305; st.u64 [%r164],%r107; .loc 1 105 7 add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r306,%r152,%r164; @ %r306 bra $L21; bra $L17; $L19: .loc 1 126 4 st.u64 [%r188+48],%r120; mov.u64 %r308,-1; st.u64 [%r188+56],%r308; mov.u64 %r309,1; st.u64 [%r188+40],%r309; .loc 1 127 4 bra $L1; $L12: .loc 1 133 19 ld.s8 %r69,[%r188+28]; .loc 1 133 10 setp.eq.u64 %r310,%r69,%r114; @ %r310 bra $L24; .loc 1 134 2 st.u64 [%stack+8],%r114; st.u64 [%stack],%r69_gfortran_runtime_error24: .loc 1 139 11 cvta.global.u64 %r313,_gfortrani_compile_options; .loc 1 139 10 ld.u32 %r314,[%r313+36]; setp.eq.u32 %r315,%r314,0; @ %r315 bra $L22; .loc 1 140 2 add.u64 %r387,%frame,240; cvta.const.u64 %r319,$LC4; cvta.const.u64 %r31818r387318319; call _gfortrani_bounds_ifunction_return$L22: .loc 1 144 3 setp.gt.s64 %r321,%r114,0; @ %r321 bra $L25; $L29: .loc 1 78 9 add.u64 %r322,%r31,1; .loc 1 78 7 sub.u64 %r116,%r322,%r33; max.s64 %r118,%r116,0; .loc 1 152 8 ld.u64 %r106,[%r190]; .loc 1 153 8 ld.u64 %r126,[%r188]; shl.b64 %r153,%r192,2; .loc 1 168 36 mul.lo.u64 %r79,%r34,%r153; .loc 1 182 22 ld.u64 %r181,[%frame+120]; .loc 1 182 12 shl.b64 %r183,%r181,2; .loc 1 183 22 ld.u64 %r184,[%frame]; .loc 1 183 12 shl.b64 %r186,%r184,2; .loc 1 181 12 ld.u64 %r187,[%frame+360]; setp.gt.s64 %r384,%r116,0; setp.le.s64 %r385,%r114,1; cvt.u32.u32 %r390,%r22; cvt.s64.s8 %r391,%r390; add.u64 %r392,%r391,-1; add.u64 %r393,%frame,120; add.u64 %r394,%frame,240; bra $L26; $L25: add.u64 %r174,%r188,40; cvt.u32.u32 %r324,%r22; cvt.s64.s8 %r323,%r324; add.u64 %r165,%r323,-1; .loc 1 144 3 mov.u64 %r172,0; mov.u64 %r127,%r172; add.u64 %r387,%frame,240; add.u64 %r386,%frame,360; .loc 1 146 16 mov.u64 %r327,%r172; $L28: add.u64 %r326,%r386,%r172; st.u64 [%r326],%r327; .loc 1 147 18 add.u64 %r328,%frame,%r172; .loc 1 147 54 ld.u64 %r330,[%r174]; mul.lo.u64 %r329,%r330,%r192; .loc 1 147 18 st.u64 [%r328],%r329; .loc 1 148 17 add.u64 %r332,%r387,%r172; .loc 1 148 10 ld.u64 %r333,[%r332]; setp.le.s64 %r334,%r333,0; @ %r334 bra $L1; .loc 1 144 26 add.u64 %r127,%r127,1; .loc 1 144 3 add.u64 %r174,%r174,24; add.u64 %r172,%r172,8; setp.ne.u64 %r335,%r127,%r165; @ %r335 bra $L28; bra $L29; $L26: .loc 1 164 5 @ %r384 bra $L37; .loc 1 165 4 mov.u32 %r340bra $L31; $L37: mov.u64 %r109,%r106; mov.u64 %r122,%r106; .loc 1 168 13 mov.u64 %r121,0; $L30: .loc 1 401109192; call (%value_in),_gfortrani_memcmp_char4347,[%value_in]; } .loc 1 171 6 setp.le.s32 %r348,%r347,0; selp.u64 %r109,%r109,%r122,%r348; .loc 1 168 28 add.u64 %r121,%r121,1; .loc 1 168 36 add.u64 %r122,%r122,%r79; .loc 1 168 6 setp.gt.s64 %r349,%r118,%r121356,[%value_in]; } $L31: .loc 1 181 15 add.u64 %r187,%r187,1; .loc 1 182 12 add.u64 %r106,%r106,%r183; .loc 1 183 12 add.u64 %r126,%r126,%r186; .loc 1 185 32 ld.u64 %r133,[%frame+240]; .loc 1 185 13 setp.ne.u64 %r358,%r187,%r133; @ %r358 bra $L26; .loc 1 192 23 mul.lo.u64 %r359,%r187,%r181; .loc 1 192 9 shl.b64 %r360,%r359,2; sub.u64 %r123,%r106,%r360; .loc 1 193 23 mul.lo.u64 %r361,%r187,%r184; .loc 1 193 9 shl.b64 %r362,%r361,2; sub.u64 %r124,%r126,%r362; .loc 1 195 7 @ %r385 bra $L1; add.u64 %r135,%frame,368; mov.u64 %r156,8; .loc 1 194 5 mov.u64 %r125,1; .loc 1 189 13 mov.u64 %r389,0; bra $L34; $L35: st.u64 [%r135],%r389; .loc 1 192 23 mul.lo.u64 %r368,%r92,%r91; .loc 1 192 9 shl.b64 %r369,%r368,2; sub.u64 %r123,%r106,%r369; .loc 1 193 23 mul.lo.u64 
%r370,%r95,%r91; .loc 1 193 9 shl.b64 %r371,%r370,2; sub.u64 %r124,%r126,%r371; .loc 1 194 5 add.u64 %r125,%r125,1; .loc 1 195 7 add.u64 %r135,%r135,8; add.u64 %r156,%r156,8; setp.eq.u64 %r372,%r125,%r392; @ %r372 bra $L1; $L34: .loc 1 203 16 ld.u64 %r373,[%r135]; add.u64 %r91,%r373,1; st.u64 [%r135],%r91; .loc 1 204 23 add.u64 %r375,%r393,%r156; ld.u64 %r92,[%r375]; .loc 1 204 13 shl.b64 %r376,%r92,2; add.u64 %r106,%r123,%r376; .loc 1 205 23 add.u64 %r377,%frame,%r156; ld.u64 %r95,[%r377]; .loc 1 205 13 shl.b64 %r378,%r95,2; add.u64 %r126,%r124,%r378; .loc 1 185 32 add.u64 %r380,%r394,%r156; ld.u64 %r98,[%r380]; .loc 1 185 13 setp.eq.u64 %r381,%r91,%r98; @ %r381 bra $L35; mov.u64 %r187,0; bra $L26; $L49: .loc 1 101 6 ld.u64 %r382,[%r188]; setp.eq.u64 %r383,%r382,0; @ ! %r383 bra $L12; bra $L36; $L1: .loc 1 209_gfortran_mmaxval1_s4 .visible .func _gfortran_mmaxval1_s4608pred %r235; .reg .predu32 %r261; .reg .u64 %r266; .reg .u64pred %r278; .reg .u32u32 %r291; .reg .u16 %r292; .reg .predpredu64 %r318; .reg .u64u64 %r333; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r349; .reg .pred %r350; .reg .pred %r352; .reg .predu64 %r374; .reg .u16 %r376; .reg .u16 %r377; .reg .u32 %r378; .reg .predpred %r396; .reg .u64 %r399; .reg .u64 %r400; .reg .u64 %r404; .reg .pred %r406; .reg .predpred %r420; .reg .predpred %r429; .reg .predpred %r437; .reg .pred %r438; .reg .u64 %r445; .reg .u64 %r447; .reg .predpred.reg .pred %r473; .reg .u64 %r474; .reg .predu64 %r490; .reg .u64 %r491; mov.u64 %r229,%ar0; mov.u64 %r230,%ar1; mov.u64 %r231,%ar2; mov.u64 %r232,%ar3; mov.u64 %r233,%ar4; mov.u64 %r234,%ar5; .loc 1 242 6 setp.ne.u64 %r235,%r233,0; @ %r235 bra $L23232234; call _gfortran_maxval1_s4); } .loc 1 245 7 bra $L50; $L51: .loc 1 248 3 setp.eq.u64 %r241,%r230,%r234; @ %r241 bra $L53; cvta.const.u64 %r245,$LC0; cvta.const.u64 %r244,__func__$0; mov.u32 %r243,248;r2432444553: .loc 1 250 10 ld.u64 %r22,[%r232]; .loc 1 250 7 add.u64 %r147,%r22,-1; .loc 1 251 10 ld.s8 %r23,[%r231+28]; .loc 1 251 38 add.u32 %r246,%r23,-1; .loc 1 251 8 cvt.s64.s32 %r148,%r246; .loc 1 253 7 shr.u64 %r248,%r147,63; set.u32.gt.s64 %r250,%r147,%r148; neg.s32 %r251,%r250; cvt.u16.u64 %r254,%r248; cvt.u16.u32 %r255,%r251; or.b16 %r253,%r254,%r255; .loc 1 253 6 cvt.u32.u16 %r256,%r253; cvt.u16.u8 %r257,%r256; setp.eq.u16 %r258,%r257,0; @ %r258 bra $L54; .loc 1 255 7 cvt.u32.u32 %r261,%r23; cvt.s64.s8 %r260,%r261; st.u64 [%stack+8],%r260; st.u64 [%stack],%r22; cvta.const.u64 %r22_gfortran_runtime_error4: .loc 1 260 9 add.u64 %r481,%r147,%r147; add.u64 %r479,%r481,%r147; shl.b64 %r266,%r479,3; add.u64 %r267,%r231,%r266; ld.u64 %r270,[%r267+56]; add.u64 %r269,%r270,1; .loc 1 260 7 ld.u64 %r277,[%r267+48]; sub.u64 %r149,%r269,%r277; .loc 1 261 6 setp.le.s64 %r278,%r149,0; @ %r278 bra $L50; .loc 1 264 9 ld.u64 %r164,[%r233]; .loc 1 266 15 ld.u64 %r33,[%r233+16]; .loc 1 268 22 cvt.u32.u64 %r34,%r33; .loc 1 268 53 add.u32 %r279,%r34,-4; and.b32 %r280,%r279,-5; set.u32.eq.u32 %r282,%r280,0; neg.s32 %r283,%r282; .loc 1 268 22 add.u32 %r284,%r34,-1; set.u32.le.u32 %r286,%r284,1; neg.s32 %r287,%r286; .loc 1 268 6 cvt.u16.u32 %r289,%r283; cvt.u16.u32 %r290,%r287; or.b16 %r288,%r289,%r290; cvt.u32.u16 %r291,%r288; cvt.u16.u8 %r292,%r291; setp.ne.u16 %r293,%r292,0; @ %r293 bra $L56; .loc 1 270 7 setp.ne.u32 %r295,%r34,16; @ %r295 bra $L57; $L56: .loc 1 277 11 shl.b64 %r299,%r479,3; add.u64 %r300,%r231,%r299; ld.u64 %r39,[%r300+40]; .loc 1 278 12 add.u64 %r306,%r233,%r299; ld.u64 %r41,[%r306+40]; .loc 1 280 3 setp.ne.u64 %r308,%r147,0; @ %r308 bra 
$L58; $L64: .loc 1 290 3 setp.lt.s64 %r309,%r147,%r148; @ %r309 bra $L59; bra $L109; $L57: .loc 1 275 5 cvta.const.u64 %r310,$LC310_gfortran_runtime_error8: add.u64 %r104,%r231,40; add.u64 %r213,%frame,240; add.u64 %r214,%r233,40; mov.u64 %r216,%frame; add.u64 %r217,%frame,360; add.u64 %r312,%r231,16; add.u64 %r314,%r22,%r22; add.u64 %r315,%r314,%r22; shl.b64 %r316,%r315,3; add.u64 %r226,%r312,%r316; .loc 1 287 12 mov.u64 %r491,0; $L63: .loc 1 282 51 ld.u64 %r318,[%r104]; mul.lo.u64 %r317,%r318,%r234; .loc 1 282 18 st.u64 [%r213],%r317; .loc 1 283 20 ld.u64 %r320,[%r214]; mul.lo.u64 %r319,%r320,%r33; .loc 1 283 18 st.u64 [%r216],%r319; .loc 1 284 19 ld.u64 %r322,[%r104+16]; add.u64 %r321,%r322,1; ld.u64 %r323,[%r104+8]; sub.u64 %r56,%r321,%r323; .loc 1 286 10 setp.lt.s64 %r324,%r56,0; @ %r324 bra $L61; .loc 1 284 17 st.u64 [%r217],%r56; bra $L62; $L61: .loc 1 287 12 st.u64 [%r217],%r491; $L62: .loc 1 280 3 add.u64 %r104,%r104,24; add.u64 %r213,%r213,8; add.u64 %r214,%r214,24; add.u64 %r216,%r216,8; add.u64 %r217,%r217,8; setp.ne.u64 %r326,%r104,%r226; @ %r326 bra $L63; bra $L64; $L109: .loc 1 300 6 ld.u64 %r327,[%r229]; setp.eq.u64 %r328,%r327,0; @ ! %r328 bra $L66; bra $L65; $L59: add.u64 %r330,%r22,%r22; add.u64 %r331,%r330,%r22; shl.b64 %r332,%r331,3; add.u64 %r130,%r332,40; add.u64 %r134,%r231,%r130; shl.b64 %r333,%r22,3; add.u64 %r106,%r333,-8; add.u64 %r477,%frame,240; add.u64 %r127,%r477,%r106; add.u64 %r88,%r233,%r130; add.u64 %r37,%frame,%r106; add.u64 %r476,%frame,360; add.u64 %r168,%r476,%r106; add.u64 %r336,%r231,40; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; add.u64 %r340,%r337,%r337; add.u64 %r341,%r340,%r337; shl.b64 %r342,%r341,3; add.u64 %r101,%r336,%r342; .loc 1 297 12 mov.u64 %r490,0; $L69: .loc 1 292 55 ld.u64 %r344,[%r134]; mul.lo.u64 %r343,%r344,%r234; .loc 1 292 18 st.u64 [%r127],%r343; .loc 1 293 20 ld.u64 %r346,[%r88]; mul.lo.u64 %r345,%r346,%r33; .loc 1 293 18 st.u64 [%r37],%r345; .loc 1 294 19 ld.u64 %r348,[%r134+16]; add.u64 %r347,%r348,1; ld.u64 %r349,[%r134+8]; sub.u64 %r70,%r347,%r349; .loc 1 296 10 setp.lt.s64 %r350,%r70,0; @ %r350 bra $L67; .loc 1 294 17 st.u64 [%r168],%r70; bra $L68; $L67: .loc 1 297 12 st.u64 [%r168],%r490; $L68: .loc 1 290 3 add.u64 %r134,%r134,24; add.u64 %r127,%r127,8; add.u64 %r88,%r88,24; add.u64 %r37,%r37,8; add.u64 %r168,%r168,8; setp.ne.u64 %r352,%r101,%r134; @ %r352 bra $L69; bra $L110; $L65: .loc 1 304 7 setp.eq.u64 %r353,%r148,0; @ %r353 bra $L71; add.u64 %r476,%frame,360; $L91: add.u64 %r146,%r229,40; mov.u64 %r144,%r476; add.u64 %r354,%r229,16; cvt.u32.u32 %r356,%r23; cvt.s64.s8 %r355,%r356; add.u64 %r358,%r355,%r355; add.u64 %r359,%r358,%r355; shl.b64 %r360,%r359,3; add.u64 %r135,%r354,%r360; .loc 1 307 10 mov.u64 %r138,1; .loc 1 311 4 mov.u64 %r380,0; bra $L72; $L71: .loc 1 315 20 add.u32 %r361,%r23,-2; cvt.s64.s32 %r77,%r361; add.u64 %r363,%r77,%r77; add.u64 %r364,%r363,%r77; shl.b64 %r365,%r364,3; add.u64 %r366,%r229,%r365; .loc 1 315 67 shl.b64 %r368,%r77,3; add.u64 %r369,%frame,%r368; .loc 1 315 59 ld.u64 %r372,[%r366+40]; ld.u64 %r373,[%r369+360]; mul.lo.u64 %r371,%r372,%r373; .loc 1 315 18 mul.lo.u64 %r155,%r371,%r234; .loc 1 318 24 mov.u64 %r374,0; st.u64 [%r229+8],%r374; .loc 1 319 28 cvt.u16.u32 %r377,%r23; add.u16 %r376,%r377,-1; cvt.u32.u16 %r378,%r376; st.u8 [%r229+28],%r378; .loc 1 321 10 setp.eq.u64 %r379,%r155,0; @ %r379 bra $L73; bra $L111; $L75: .loc 1 309 47 mul.lo.u64 %r138,%r74,%r138; $L72: .loc 1 311 4 st.u64 [%r146+8],%r380; ld.u64 %r74,[%r144]; add.u64 %r381,%r74,-1; st.u64 [%r146+16],%r381; 
st.u64 [%r146],%r138; .loc 1 304 7 add.u64 %r146,%r146,24; add.u64 %r144,%r144,8; setp.ne.u64 %r382,%r135,%r146; @ %r382 bra $L75; bra $L71; $L73: .loc 1 324 4 st.u64 [%r229+48],%r155; mov.u64 %r384,-1; st.u64 [%r229+56],%r384; mov.u64 %r385,1; st.u64 [%r229+40],%r385; .loc 1 325 4 bra $L50; $L111: .loc 1 328 24387; call (%value_in),_gfortrani_xmallocarray88,[%value_in]; } .loc 1 328 22 st.u64 [%r229],%r388; bra $L76; $L66: .loc 1 333 19 ld.s8 %r390,[%r229+28]; .loc 1 333 10 setp.eq.u64 %r391,%r390,%r148; @ %r391 bra $L77; .loc 1 334 2 cvta.const.u64 %r392,$LC7;stack; call _gfortran_runtime_error77: .loc 1 336 11 cvta.global.u64 %r394,_gfortrani_compile_options; .loc 1 336 10 ld.u32 %r395,[%r394+36]; setp.eq.u32 %r396,%r395,0; @ %r396 bra $L76; .loc 1 338 4 add.u64 %r476,%frame,360; cvta.const.u64 %r400,$LC4; cvta.const.u64 %r39399400; call _gfortrani_bounds_ifunction_return340 4 cvta.const.u64 %r40423r204400; call _gfortrani_bounds_equal_extents$L76: .loc 1 345 3 setp.ne.u64 %r406,%r148,0; @ %r406 bra $L78; $L81: .loc 1 353 8 ld.u64 %r165,[%r229]; .loc 1 354 8 ld.u64 %r163,[%r231]; .loc 1 356 9 setp.ne.u64 %r407,%r163,0; @ %r407 bra $L79; bra $L50; $L78: add.u64 %r185,%r229,40; cvt.u32.u32 %r409,%r23; cvt.s64.s8 %r408,%r409; add.u64 %r169,%r408,-1; .loc 1 345 3 mov.u64 %r179,0; mov.u64 %r62,%r179; add.u64 %r476,%frame,360; add.u64 %r480,%frame,480; add.u64 %r478,%frame,120; .loc 1 347 16 mov.u64 %r412,%r179; $L80: add.u64 %r411,%r480,%r179; st.u64 [%r411],%r412; .loc 1 348 18 add.u64 %r414,%r478,%r179; .loc 1 348 54 ld.u64 %r416,[%r185]; mul.lo.u64 %r415,%r416,%r234; .loc 1 348 18 st.u64 [%r414],%r415; .loc 1 349 17 add.u64 %r418,%r476,%r179; .loc 1 349 10 ld.u64 %r419,[%r418]; setp.le.s64 %r420,%r419,0; @ %r420 bra $L50; .loc 1 345 26 add.u64 %r62,%r62,1; .loc 1 345 3 add.u64 %r185,%r185,24; add.u64 %r179,%r179,8; setp.ne.u64 %r421,%r62,%r169; @ %r421 bra $L80; bra $L81; $L79: .loc 1 278 12 mul.lo.u64 %r44,%r41,%r33; .loc 1 366 2 shl.b64 %r95,%r234,2; .loc 1 368 32 mul.lo.u64 %r96,%r39,%r95; .loc 1 389 22 ld.u64 %r201,[%frame+240]; .loc 1 389 12 shl.b64 %r204,%r201,2; .loc 1 390 23 ld.u64 %r205,[%frame]; .loc 1 391 22 ld.u64 %r208,[%frame+120]; .loc 1 391 12 shl.b64 %r211,%r208,2; setp.le.s64 %r482,%r148,1; cvt.u32.u32 %r484,%r23; cvt.s64.s8 %r485,%r484; add.u64 %r486,%r485,-1; add.u64 %r487,%frame,240; add.u64 %r488,%frame,360; add.u64 %r489,%frame,120; $L92: .loc 1 366 2 mov.u32 %r425mov.u64 %r159,%r164; mov.u64 %r158,%r163; .loc 1 368 9 mov.u64 %r157,0; $L83: .loc 1 371 7 ld.s8 %r212,[%r159]; .loc 1 371 6 setp.ne.u32 %r429,%r212,0; @ %r429 bra $L82; .loc 1 368 24 add.u64 %r157,%r157,1; .loc 1 368 32 add.u64 %r158,%r158,%r96; .loc 1 368 47 add.u64 %r159,%r159,%r44; .loc 1 368 2 setp.ne.u64 %r430,%r149,%r157; @ %r430 bra $L83; mov.u64 %r158,%r165; bra $L84; $L82: .loc 1 377 6 setp.le.s64 %r431,%r149,%r157; @ %r431 bra $L84; mov.u64 %r166,%r158; $L86: .loc 1 379 6 setp.eq.u32 %r432,%r212,0; @ %r432 bra $L85; .loc 1 40(%value_in),_gfortrani_memcmp_char4436,[%value_in]; } .loc 1 379 13 setp.le.s32 %r437,%r436,0; selp.u64 %r158,%r158,%r166,%r437; $L85: .loc 1 377 23 add.u64 %r157,%r157,1; .loc 1 377 31 add.u64 %r166,%r166,%r96; .loc 1 377 46 add.u64 %r159,%r159,%r44; .loc 1 377 6 setp.eq.u64 %r438,%r149,%r157; @ %r438 bra $L84; .loc 1 379 7 ld.s8 %r212,[%r159]; bra $L86; $L84: .loc 1 385 2445,[%value_in]; } .loc 1 388 15 ld.u64 %r447,[%frame+480]; add.u64 %r100,%r447,1; st.u64 [%frame+480],%r100; .loc 1 389 12 add.u64 %r163,%r163,%r204; .loc 1 390 13 add.u64 %r164,%r164,%r205; .loc 1 391 12 
add.u64 %r165,%r165,%r211; .loc 1 393 32 ld.u64 %r136,[%frame+360]; .loc 1 393 13 setp.ne.u64 %r448,%r100,%r136; @ %r448 bra $L92; .loc 1 397 13 mov.u64 %r449,0; st.u64 [%frame+480],%r449; .loc 1 400 23 mul.lo.u64 %r108,%r100,%r201; .loc 1 401 24 mul.lo.u64 %r450,%r100,%r205; .loc 1 401 10 sub.u64 %r160,%r164,%r450; .loc 1 402 23 mul.lo.u64 %r451,%r100,%r208; .loc 1 402 9 shl.b64 %r452,%r451,2; sub.u64 %r161,%r165,%r452; .loc 1 404 7 @ %r482 bra $L50; add.u64 %r198,%frame,488; mov.u64 %r197,8; .loc 1 403 5 mov.u64 %r162,1; bra $L89; $L90: .loc 1 397 13 st.u64 [%r198],%r449; .loc 1 400 23 mul.lo.u64 %r108,%r120,%r119; .loc 1 401 24 mul.lo.u64 %r458,%r122,%r119; .loc 1 401 10 sub.u64 %r160,%r164,%r458; .loc 1 402 23 mul.lo.u64 %r459,%r124,%r119; .loc 1 402 9 shl.b64 %r460,%r459,2; sub.u64 %r161,%r165,%r460; .loc 1 403 5 add.u64 %r162,%r162,1; .loc 1 404 7 add.u64 %r198,%r198,8; add.u64 %r197,%r197,8; setp.eq.u64 %r461,%r162,%r486; @ %r461 bra $L50; $L89: .loc 1 412 16 ld.u64 %r462,[%r198]; add.u64 %r119,%r462,1; st.u64 [%r198],%r119; .loc 1 413 23 add.u64 %r464,%r487,%r197; ld.u64 %r120,[%r464]; .loc 1 413 13 sub.u64 %r465,%r120,%r108; shl.b64 %r466,%r465,2; add.u64 %r163,%r163,%r466; .loc 1 414 24 add.u64 %r467,%frame,%r197; ld.u64 %r122,[%r467]; .loc 1 414 14 add.u64 %r164,%r160,%r122; .loc 1 415 23 add.u64 %r469,%r489,%r197; ld.u64 %r124,[%r469]; .loc 1 415 13 shl.b64 %r470,%r124,2; add.u64 %r165,%r161,%r470; .loc 1 393 32 add.u64 %r472,%r488,%r197; ld.u64 %r128,[%r472]; .loc 1 393 13 setp.eq.u64 %r473,%r119,%r128; @ %r473 bra $L90; bra $L92; $L110: .loc 1 300 6 ld.u64 %r474,[%r229]; setp.eq.u64 %r475,%r474,0; @ ! %r475 bra $L66; bra $L91; $L50: .loc 1 419_gfortran_smaxval1_s4 .visible .func _gfortran_smaxval1_s4368predu16 %r180; .reg .predpredu32u64 %r229; .reg .u16predpred %r264; .reg .pred %r265; .reg .u64 %r266; .reg .u32pred %r274; .reg .u64 %r275; .reg .u16u32 %r283; .reg .u64predpred %r310; .reg .predu32 %r316; .reg .u64 %r317; .reg .u64 %r318; .reg .u64 %r319; .reg .u64 %r320; mov.u64 %r156,%ar0; mov.u64 %r157,%ar1; mov.u64 %r158,%ar2; mov.u64 %r159,%ar3; mov.u64 %r160,%ar4; mov.u64 %r161,%ar5; .loc 1 445 6 setp.eq.u64 %r162,%r160,0; @ %r162 bra $L113; .loc 1 445 20 ld.u32 %r163,[%r160]; setp.eq.u32 %r164,%r163,0; @ %r164 bra $L114; $L113: .loc 1 447 7r157158161; call _gfortran_maxval1_s4); } .loc 1 448 7 bra $L112; $L114: .loc 1 451 10 ld.u64 %r23,[%r159]; .loc 1 451 7 add.u64 %r91,%r23,-1; .loc 1 452 10 ld.s8 %r24,[%r158+28]; .loc 1 452 38 add.u32 %r26,%r24,-1; .loc 1 452 8 cvt.s64.s32 %r314,%r26; .loc 1 454 7 shr.u64 %r171,%r91,63; set.u32.gt.s64 %r173,%r91,%r314; neg.s32 %r174,%r173; cvt.u16.u64 %r177,%r171; cvt.u16.u32 %r178,%r174; or.b16 %r176,%r177,%r178; .loc 1 454 6 cvt.u32.u16 %r179,%r176; cvt.u16.u8 %r180,%r179; setp.ne.u16 %r181,%r180,0; @ %r181 bra $L116; .loc 1 461 3 setp.ne.u64 %r182,%r91,0; @ %r182 bra $L117; $L123: .loc 1 469 3 setp.lt.s64 %r183,%r91,%r314; @ %r183 bra $L118; bra $L169; $L116: .loc 1 456 7 cvt.u32.u32 %r186,%r24; cvt.s64.s8 %r185,%r186; st.u64 [%stack+8],%r1851841stack; call _gfortran_runtime_error117: add.u64 %r100,%r158,48; add.u64 %r72,%frame,120; add.u64 %r59,%r158,56; add.u64 %r188,%r158,24; add.u64 %r190,%r23,%r23; add.u64 %r191,%r190,%r23; shl.b64 %r192,%r191,3; add.u64 %r152,%r188,%r192; .loc 1 466 12 mov.u64 %r320,0; $L122: .loc 1 463 19 ld.u64 %r194,[%r59]; add.u64 %r193,%r194,1; ld.u64 %r195,[%r100]; sub.u64 %r35,%r193,%r195; .loc 1 465 10 setp.le.s64 %r196,%r35,0; @ %r196 bra $L120; .loc 1 463 17 st.u64 [%r72],%r35; bra $L121; $L120: 
.loc 1 466 12 st.u64 [%r72],%r320; $L121: .loc 1 461 3 add.u64 %r100,%r100,24; add.u64 %r72,%r72,8; add.u64 %r59,%r59,24; setp.ne.u64 %r198,%r100,%r152; @ %r198 bra $L122; bra $L123; $L169: .loc 1 478 15 ld.u64 %r102,[%r156]; .loc 1 478 6 setp.eq.u64 %r199,%r102,0; @ ! %r199 bra $L125; bra $L124; $L118: add.u64 %r201,%r23,%r23; add.u64 %r202,%r201,%r23; shl.b64 %r203,%r202,3; add.u64 %r204,%r203,48; add.u64 %r81,%r158,%r204; add.u64 %r312,%frame,120; shl.b64 %r206,%r23,3; add.u64 %r207,%r206,-8; add.u64 %r30,%r312,%r207; add.u64 %r208,%r158,48; cvt.u32.u32 %r210,%r24; cvt.s64.s8 %r209,%r210; add.u64 %r212,%r209,%r209; add.u64 %r213,%r212,%r209; shl.b64 %r214,%r213,3; add.u64 %r97,%r208,%r214; .loc 1 475 12 mov.u64 %r319,0; $L128: .loc 1 472 2 ld.u64 %r216,[%r81+8]; add.u64 %r215,%r216,1; ld.u64 %r217,[%r81]; sub.u64 %r40,%r215,%r217; .loc 1 474 10 setp.le.s64 %r218,%r40,0; @ %r218 bra $L126; .loc 1 471 17 st.u64 [%r30],%r40; bra $L127; $L126: .loc 1 475 12 st.u64 [%r30],%r319; $L127: .loc 1 469 3 add.u64 %r81,%r81,24; add.u64 %r30,%r30,8; setp.ne.u64 %r220,%r81,%r97; @ %r220 bra $L128; bra $L170; $L124: .loc 1 482 7 setp.eq.u64 %r221,%r314,0; @ %r221 bra $L130; add.u64 %r312,%frame,120; $L149: add.u64 %r119,%r156,40; mov.u64 %r117,%r312; cvt.u32.u32 %r223,%r24; cvt.s64.s8 %r222,%r223; add.u64 %r225,%r222,%r222; add.u64 %r226,%r225,%r222; shl.b64 %r227,%r226,3; add.u64 %r228,%r156,16; add.u64 %r110,%r227,%r228; .loc 1 485 10 mov.u64 %r86,1; .loc 1 489 4 mov.u64 %r248,0; bra $L131; $L130: .loc 1 493 24 mov.u64 %r229,0; st.u64 [%r156+8],%r229; .loc 1 494 28 cvt.u16.u32 %r232,%r24; add.u16 %r231,%r232,-1; cvt.u32.u16 %r233,%r231; st.u8 [%r156+28],%r233; .loc 1 496 20 add.u32 %r234,%r24,-2; cvt.s64.s32 %r49,%r234; add.u64 %r236,%r49,%r49; add.u64 %r237,%r236,%r49156,%r238; .loc 1 496 67 shl.b64 %r241,%r49,3; add.u64 %r242,%frame,%r241; .loc 1 496 59 ld.u64 %r245,[%r239+40]; ld.u64 %r246,[%r242+120]; mul.lo.u64 %r244,%r245,%r246; .loc 1 496 18 mul.lo.u64 %r99,%r244,%r161; .loc 1 499 10 setp.eq.u64 %r247,%r99,0; @ %r247 bra $L132; bra $L171; $L134: .loc 1 487 48 mul.lo.u64 %r86,%r45,%r86; $L131: .loc 1 489 4 st.u64 [%r119+8],%r248; ld.u64 %r45,[%r117]; add.u64 %r249,%r45,-1; st.u64 [%r119+16],%r249; st.u64 [%r119],%r86; .loc 1 482 7 add.u64 %r119,%r119,24; add.u64 %r117,%r117,8; setp.ne.u64 %r250,%r110,%r119; @ %r250 bra $L134; bra $L130; $L132: .loc 1 502 4 st.u64 [%r156+48],%r99; mov.u64 %r252,-1; st.u64 [%r156+56],%r252; mov.u64 %r253,1; st.u64 [%r156+40],%r253; .loc 1 503 4 bra $L112; $L171: .loc 1 506 2499call (%value_in),_gfortrani_xmallocarray256,[%value_in]; } mov.u64 %r102,%r256; .loc 1 506 22 st.u64 [%r156],%r102; $L138: .loc 1 532 3 setp.ne.u64 %r258,%r314,0; @ %r258 bra $L135; bra $L136; $L125: .loc 1 510 19 ld.s8 %r55,[%r156+28]; .loc 1 510 10 setp.eq.u64 %r259,%r55,%r314; @ %r259 bra $L137; .loc 1 511 2 st.u64 [%stack+8],%r314; st.u64 [%stack],%r5_gfortran_runtime_error137: .loc 1 516 11 cvta.global.u64 %r262,_gfortrani_compile_options; .loc 1 516 10 ld.u32 %r263,[%r262+36]; setp.eq.u32 %r264,%r263,0; @ %r264 bra $L138; .loc 1 518 4 setp.eq.u64 %r265,%r314,0; @ %r265 bra $L136; add.u64 %r105,%r156,48; add.u64 %r88,%frame,120; cvt.u32.u32 %r267,%r24; cvt.s64.s8 %r266,%r267; add.u64 %r83,%r266,-1; .loc 1 518 10 mov.u64 %r109,0; $L140: .loc 1 522 21 ld.u64 %r269,[%r105+8]; add.u64 %r268,%r269,1; .loc 1 522 19 ld.u64 %r270,[%r105]; sub.u64 %r96,%r268,%r270; .loc 1 523 18 ld.u64 %r63,[%r88]; .loc 1 524 3 add.u64 %r109,%r109,1; .loc 1 523 11 setp.eq.u64 %r271,%r63,%r96; @ %r271 bra $L139; 
.loc 1 524 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r96; st.u64 [%stack],%r109; cvta.const.u64 %r272,$LC272_gfortran_runtime_error139: .loc 1 518 4 add.u64 %r105,%r105,24; add.u64 %r88,%r88,8; setp.ne.u64 %r274,%r83,%r109; @ %r274 bra $L140; $L135: .loc 1 534 16 cvt.u16.u32 %r276,%r24; setp.le.s16 %r277,%r276,1; @ %r277 bra $L141; shl.b64 %r275,%r314,3; bra $L142; $L141: mov.u64 %r275,8; $L142: add.u64 %r279,%frame,240; mov.u32 %r2832755,[%value_in]; } add.u64 %r125,%r156,40; mov.u64 %r104,0; $L143: .loc 1 535 18 shl.b64 %r287,%r104,3; add.u64 %r288,%frame,%r287; .loc 1 535 54 ld.u64 %r290,[%r125]; mul.lo.u64 %r289,%r290,%r161; .loc 1 535 18 st.u64 [%r288],%r289; .loc 1 532 26 add.u64 %r104,%r104,1; .loc 1 532 3 add.u64 %r125,%r125,24; setp.gt.s64 %r291,%r314,%r104; @ %r291 bra $L143; $L136: .loc 1 542 7 shl.b64 %r70,%r161,2; .loc 1 546 32 ld.u64 %r140,[%frame+120]; .loc 1 544 22 ld.u64 %r141,[%frame]; .loc 1 544 12 shl.b64 %r143,%r141,2; .loc 1 543 12 ld.u64 %r147,[%frame+240]; .loc 1 553 23 mul.lo.u64 %r151,%r140,%r141; setp.le.s64 %r313,%r314,1; cvt.u32.u32 %r316,%r24; cvt.s64.s8 %r317,%r316; add.u64 %r318,%r317,-1; $L148: .loc 1 542 7 mov.u32 %r297,[%value_in]; } .loc 1 543 15 add.u64 %r147,%r147,1; .loc 1 544 12 add.u64 %r102,%r102,%r143; .loc 1 546 13 setp.ne.u64 %r299,%r147,%r140; @ %r299 bra $L148; .loc 1 555 7 @ %r313 bra $L112; add.u64 %r103,%frame,248; add.u64 %r133,%frame,8; add.u64 %r132,%frame,128; .loc 1 553 23 mov.u64 %r75,%r151; .loc 1 554 5 mov.u64 %r101,1; .loc 1 550 13 mov.u64 %r315,0; bra $L146; $L147: st.u64 [%r103],%r315; .loc 1 553 23 mul.lo.u64 %r75,%r78,%r77; .loc 1 554 5 add.u64 %r101,%r101,1; .loc 1 555 7 add.u64 %r103,%r103,8; add.u64 %r133,%r133,8; add.u64 %r132,%r132,8; setp.eq.u64 %r306,%r101,%r318; @ %r306 bra $L112; $L146: .loc 1 559 16 ld.u64 %r307,[%r103]; add.u64 %r77,%r307,1; st.u64 [%r103],%r77; .loc 1 560 23 ld.u64 %r78,[%r133]; .loc 1 560 13 sub.u64 %r308,%r78,%r75; shl.b64 %r309,%r308,2; add.u64 %r102,%r102,%r309; .loc 1 546 32 ld.u64 %r80,[%r132]; .loc 1 546 13 setp.eq.u64 %r310,%r77,%r80; @ %r310 bra $L147; mov.u64 %r147,0; bra $L148; $L170: .loc 1 478 15 ld.u64 %r102,[%r156]; .loc 1 478 6 setp.eq.u64 %r311,%r102,0; @ ! 
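The comparison pattern visible in the extracted PTX above (a call to _gfortrani_memcmp_char4 followed by a selp on the sign of the result) is the string-comparison step of a MAXVAL reduction over a CHARACTER(KIND=4) array along one dimension. The following is a minimal conceptual sketch of that reduction in C, assuming a flat row-major layout; the names char4_t, memcmp_char4 and maxval_dim1_char4 are hypothetical and do not correspond to libgfortran's actual descriptors or entry points.

/* Conceptual sketch only: a MAXVAL-style reduction over a
 * CHARACTER(KIND=4) array along the row dimension.
 * Hypothetical names; not the libgfortran implementation. */
#include <stdio.h>
#include <string.h>
#include <stdint.h>

typedef uint32_t char4_t;   /* one CHARACTER(KIND=4) code point */

/* memcmp-style comparison of two equal-length kind=4 strings:
 * returns <0, 0, >0, as the _gfortrani_memcmp_char4 helper
 * referenced by the extracted PTX is used. */
static int memcmp_char4(const char4_t *a, const char4_t *b, size_t len)
{
    for (size_t i = 0; i < len; i++) {
        if (a[i] != b[i])
            return a[i] < b[i] ? -1 : 1;
    }
    return 0;
}

/* Reduce a rows x cols matrix of strings along the row dimension:
 * result[j] = lexicographically largest string in column j. */
static void maxval_dim1_char4(const char4_t *src, size_t rows, size_t cols,
                              size_t string_len, char4_t *result)
{
    for (size_t j = 0; j < cols; j++) {
        const char4_t *best = src + j * string_len;           /* row 0 */
        for (size_t i = 1; i < rows; i++) {
            const char4_t *cur = src + (i * cols + j) * string_len;
            /* Keep the running maximum unless the new element compares
             * greater -- mirrors the setp/selp pattern in the PTX. */
            if (memcmp_char4(cur, best, string_len) > 0)
                best = cur;
        }
        memcpy(result + j * string_len, best, string_len * sizeof(char4_t));
    }
}

int main(void)
{
    /* 2x2 array of 3-character kind=4 strings. */
    char4_t data[2 * 2 * 3] = {
        'a','b','c',  'x','y','z',
        'd','e','f',  'm','n','o',
    };
    char4_t out[2 * 3];
    maxval_dim1_char4(data, 2, 2, 3, out);
    for (size_t j = 0; j < 2; j++) {
        for (size_t k = 0; k < 3; k++)
            putchar((int)out[j * 3 + k]);
        putchar('\n');          /* prints "def" then "xyz" */
    }
    return 0;
}

The MINVAL members that follow differ only in the sense of the comparison (keep the candidate when it compares smaller), which matches the setp.ge/selp sequences in their PTX.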
[Extracted PTX archive member: minval1_s1.o (preceded by the truncated tail of _gfortran_smaxval1_s4). Compiled from ../../../../libgfortran/generated/minval1_s1.c; defines _gfortran_minval1_s1, _gfortran_mminval1_s1 and _gfortran_sminval1_s1, and references _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, memcmp (kind=1 character comparison) and _gfortrani_bounds_equal_extents. The embedded diagnostic strings are the MINVAL counterparts of those listed for maxval1_s4.o; __func__ decodes to "minval1_s1". The instruction stream is binary-extraction residue and is not reproduced here.]
[Extracted PTX archive member: minval1_s4.o (preceded by the truncated tail of _gfortran_sminval1_s1). Compiled from ../../../../libgfortran/generated/minval1_s4.c; defines _gfortran_minval1_s4, _gfortran_mminval1_s4 and _gfortran_sminval1_s4, and references the same runtime helpers as maxval1_s4.o, including _gfortrani_memcmp_char4. The embedded MINVAL diagnostic strings parallel those above; __func__ decodes to "minval1_s4". The member is cut off at this point in the extraction; its remaining instruction stream is binary residue and is not reproduced here.]
193 23 mul.lo.u64 %r367,%r95,%r91; .loc 1 193 9 shl.b64 %r368,%r367,2; sub.u64 %r124,%r126,%r368; .loc 1 194 5 add.u64 %r125,%r125,1; .loc 1 195 7 add.u64 %r135,%r135,8; add.u64 %r156,%r156,8; setp.eq.u64 %r369,%r125,%r389; @ %r369 bra $L1; $L34: .loc 1 203 16 ld.u64 %r370,[%r135]; add.u64 %r91,%r370,1; st.u64 [%r135],%r91; .loc 1 204 23 add.u64 %r372,%r390,%r156; ld.u64 %r92,[%r372]; .loc 1 204 13 shl.b64 %r373,%r92,2; add.u64 %r106,%r123,%r373; .loc 1 205 23 add.u64 %r374,%frame,%r156; ld.u64 %r95,[%r374]; .loc 1 205 13 shl.b64 %r375,%r95,2; add.u64 %r126,%r124,%r375; .loc 1 185 32 add.u64 %r377,%r391,%r156; ld.u64 %r98,[%r377]; .loc 1 185 13 setp.eq.u64 %r378,%r91,%r98; @ %r378 bra $L35; mov.u64 %r187,0; bra $L26; $L49: .loc 1 101 6 ld.u64 %r379,[%r188]; setp.eq.u64 %r380,%r379,0; @ ! %r380 bra $L12; bra $L36; $L1: .loc 1 209_gfortran_mminval1_s4 .visible .func _gfortran_mminval1_s4608pred %r235; .reg .predu32 %r261; .reg .u64 %r266; .reg .u64pred %r278; .reg .u32u32 %r291; .reg .u16 %r292; .reg .predpredu64 %r318; .reg .u64u64 %r333; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r349; .reg .pred %r350; .reg .pred %r352; .reg .predu64 %r374; .reg .u16 %r376; .reg .u16 %r377; .reg .u32 %r378; .reg .predpred %r396; .reg .u64 %r399; .reg .u64 %r400; .reg .u64 %r404; .reg .pred %r406; .reg .predpred %r420; .reg .predpred %r427; .reg .pred %r428; .reg .pred %r429; .reg .u32 %r433; .reg .predpredpred %r470; .reg .u64 %r471; .reg .predpred488; mov.u64 %r229,%ar0; mov.u64 %r230,%ar1; mov.u64 %r231,%ar2; mov.u64 %r232,%ar3; mov.u64 %r233,%ar4; mov.u64 %r234,%ar5; .loc 1 242 6 setp.ne.u64 %r235,%r233,0; @ %r235 bra $L23232234; call _gfortran_minval1_s4); } .loc 1 245 7 bra $L50; $L51: .loc 1 248 3 setp.eq.u64 %r241,%r230,%r234; @ %r241 bra $L53; cvta.const.u64 %r245,$LC0; cvta.const.u64 %r244,__func__$0; mov.u32 %r243,248;r2432444553: .loc 1 250 10 ld.u64 %r22,[%r232]; .loc 1 250 7 add.u64 %r147,%r22,-1; .loc 1 251 10 ld.s8 %r23,[%r231+28]; .loc 1 251 38 add.u32 %r246,%r23,-1; .loc 1 251 8 cvt.s64.s32 %r148,%r246; .loc 1 253 7 shr.u64 %r248,%r147,63; set.u32.gt.s64 %r250,%r147,%r148; neg.s32 %r251,%r250; cvt.u16.u64 %r254,%r248; cvt.u16.u32 %r255,%r251; or.b16 %r253,%r254,%r255; .loc 1 253 6 cvt.u32.u16 %r256,%r253; cvt.u16.u8 %r257,%r256; setp.eq.u16 %r258,%r257,0; @ %r258 bra $L54; .loc 1 255 7 cvt.u32.u32 %r261,%r23; cvt.s64.s8 %r260,%r261; st.u64 [%stack+8],%r260; st.u64 [%stack],%r22; cvta.const.u64 %r22_gfortran_runtime_error4: .loc 1 260 9 add.u64 %r478,%r147,%r147; add.u64 %r475,%r478,%r147; shl.b64 %r266,%r475,3; add.u64 %r267,%r231,%r266; ld.u64 %r270,[%r267+56]; add.u64 %r269,%r270,1; .loc 1 260 7 ld.u64 %r277,[%r267+48]; sub.u64 %r149,%r269,%r277; .loc 1 261 6 setp.le.s64 %r278,%r149,0; @ %r278 bra $L50; .loc 1 264 9 ld.u64 %r164,[%r233]; .loc 1 266 15 ld.u64 %r33,[%r233+16]; .loc 1 268 22 cvt.u32.u64 %r34,%r33; .loc 1 268 53 add.u32 %r279,%r34,-4; and.b32 %r280,%r279,-5; set.u32.eq.u32 %r282,%r280,0; neg.s32 %r283,%r282; .loc 1 268 22 add.u32 %r284,%r34,-1; set.u32.le.u32 %r286,%r284,1; neg.s32 %r287,%r286; .loc 1 268 6 cvt.u16.u32 %r289,%r283; cvt.u16.u32 %r290,%r287; or.b16 %r288,%r289,%r290; cvt.u32.u16 %r291,%r288; cvt.u16.u8 %r292,%r291; setp.ne.u16 %r293,%r292,0; @ %r293 bra $L56; .loc 1 270 7 setp.ne.u32 %r295,%r34,16; @ %r295 bra $L57; $L56: .loc 1 277 11 shl.b64 %r299,%r475,3; add.u64 %r300,%r231,%r299; ld.u64 %r39,[%r300+40]; .loc 1 278 12 add.u64 %r306,%r233,%r299; ld.u64 %r41,[%r306+40]; .loc 1 280 3 setp.ne.u64 %r308,%r147,0; @ %r308 bra $L58; $L64: .loc 1 290 3 
setp.lt.s64 %r309,%r147,%r148; @ %r309 bra $L59; bra $L109; $L57: .loc 1 275 5 cvta.const.u64 %r310,$LC310_gfortran_runtime_error8: add.u64 %r104,%r231,40; add.u64 %r213,%frame,240; add.u64 %r214,%r233,40; mov.u64 %r216,%frame; add.u64 %r217,%frame,360; add.u64 %r312,%r231,16; add.u64 %r314,%r22,%r22; add.u64 %r315,%r314,%r22; shl.b64 %r316,%r315,3; add.u64 %r226,%r312,%r316; .loc 1 287 12 mov.u64 %r488,0; $L63: .loc 1 282 51 ld.u64 %r318,[%r104]; mul.lo.u64 %r317,%r318,%r234; .loc 1 282 18 st.u64 [%r213],%r317; .loc 1 283 20 ld.u64 %r320,[%r214]; mul.lo.u64 %r319,%r320,%r33; .loc 1 283 18 st.u64 [%r216],%r319; .loc 1 284 19 ld.u64 %r322,[%r104+16]; add.u64 %r321,%r322,1; ld.u64 %r323,[%r104+8]; sub.u64 %r56,%r321,%r323; .loc 1 286 10 setp.lt.s64 %r324,%r56,0; @ %r324 bra $L61; .loc 1 284 17 st.u64 [%r217],%r56; bra $L62; $L61: .loc 1 287 12 st.u64 [%r217],%r488; $L62: .loc 1 280 3 add.u64 %r104,%r104,24; add.u64 %r213,%r213,8; add.u64 %r214,%r214,24; add.u64 %r216,%r216,8; add.u64 %r217,%r217,8; setp.ne.u64 %r326,%r104,%r226; @ %r326 bra $L63; bra $L64; $L109: .loc 1 300 6 ld.u64 %r327,[%r229]; setp.eq.u64 %r328,%r327,0; @ ! %r328 bra $L66; bra $L65; $L59: add.u64 %r330,%r22,%r22; add.u64 %r331,%r330,%r22; shl.b64 %r332,%r331,3; add.u64 %r130,%r332,40; add.u64 %r134,%r231,%r130; shl.b64 %r333,%r22,3; add.u64 %r106,%r333,-8; add.u64 %r473,%frame,240; add.u64 %r127,%r473,%r106; add.u64 %r88,%r233,%r130; add.u64 %r37,%frame,%r106; add.u64 %r479,%frame,360; add.u64 %r168,%r479,%r106; add.u64 %r336,%r231,40; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; add.u64 %r340,%r337,%r337; add.u64 %r341,%r340,%r337; shl.b64 %r342,%r341,3; add.u64 %r101,%r336,%r342; .loc 1 297 12 mov.u64 %r487,0; $L69: .loc 1 292 55 ld.u64 %r344,[%r134]; mul.lo.u64 %r343,%r344,%r234; .loc 1 292 18 st.u64 [%r127],%r343; .loc 1 293 20 ld.u64 %r346,[%r88]; mul.lo.u64 %r345,%r346,%r33; .loc 1 293 18 st.u64 [%r37],%r345; .loc 1 294 19 ld.u64 %r348,[%r134+16]; add.u64 %r347,%r348,1; ld.u64 %r349,[%r134+8]; sub.u64 %r70,%r347,%r349; .loc 1 296 10 setp.lt.s64 %r350,%r70,0; @ %r350 bra $L67; .loc 1 294 17 st.u64 [%r168],%r70; bra $L68; $L67: .loc 1 297 12 st.u64 [%r168],%r487; $L68: .loc 1 290 3 add.u64 %r134,%r134,24; add.u64 %r127,%r127,8; add.u64 %r88,%r88,24; add.u64 %r37,%r37,8; add.u64 %r168,%r168,8; setp.ne.u64 %r352,%r101,%r134; @ %r352 bra $L69; bra $L110; $L65: .loc 1 304 7 setp.eq.u64 %r353,%r148,0; @ %r353 bra $L71; add.u64 %r479,%frame,360; $L91: add.u64 %r146,%r229,40; mov.u64 %r144,%r479; add.u64 %r354,%r229,16; cvt.u32.u32 %r356,%r23; cvt.s64.s8 %r355,%r356; add.u64 %r358,%r355,%r355; add.u64 %r359,%r358,%r355; shl.b64 %r360,%r359,3; add.u64 %r135,%r354,%r360; .loc 1 307 10 mov.u64 %r138,1; .loc 1 311 4 mov.u64 %r380,0; bra $L72; $L71: .loc 1 315 20 add.u32 %r361,%r23,-2; cvt.s64.s32 %r77,%r361; add.u64 %r363,%r77,%r77; add.u64 %r364,%r363,%r77; shl.b64 %r365,%r364,3; add.u64 %r366,%r229,%r365; .loc 1 315 67 shl.b64 %r368,%r77,3; add.u64 %r369,%frame,%r368; .loc 1 315 59 ld.u64 %r372,[%r366+40]; ld.u64 %r373,[%r369+360]; mul.lo.u64 %r371,%r372,%r373; .loc 1 315 18 mul.lo.u64 %r155,%r371,%r234; .loc 1 318 24 mov.u64 %r374,0; st.u64 [%r229+8],%r374; .loc 1 319 28 cvt.u16.u32 %r377,%r23; add.u16 %r376,%r377,-1; cvt.u32.u16 %r378,%r376; st.u8 [%r229+28],%r378; .loc 1 321 10 setp.eq.u64 %r379,%r155,0; @ %r379 bra $L73; bra $L111; $L75: .loc 1 309 47 mul.lo.u64 %r138,%r74,%r138; $L72: .loc 1 311 4 st.u64 [%r146+8],%r380; ld.u64 %r74,[%r144]; add.u64 %r381,%r74,-1; st.u64 [%r146+16],%r381; st.u64 [%r146],%r138; .loc 1 
304 7 add.u64 %r146,%r146,24; add.u64 %r144,%r144,8; setp.ne.u64 %r382,%r135,%r146; @ %r382 bra $L75; bra $L71; $L73: .loc 1 324 4 st.u64 [%r229+48],%r155; mov.u64 %r384,-1; st.u64 [%r229+56],%r384; mov.u64 %r385,1; st.u64 [%r229+40],%r385; .loc 1 325 4 bra $L50; $L111: .loc 1 328 24387; call (%value_in),_gfortrani_xmallocarray88,[%value_in]; } .loc 1 328 22 st.u64 [%r229],%r388; bra $L76; $L66: .loc 1 333 19 ld.s8 %r390,[%r229+28]; .loc 1 333 10 setp.eq.u64 %r391,%r390,%r148; @ %r391 bra $L77; .loc 1 334 2 cvta.const.u64 %r392,$LC7;stack; call _gfortran_runtime_error77: .loc 1 336 11 cvta.global.u64 %r394,_gfortrani_compile_options; .loc 1 336 10 ld.u32 %r395,[%r394+36]; setp.eq.u32 %r396,%r395,0; @ %r396 bra $L76; .loc 1 338 4 add.u64 %r479,%frame,360; cvta.const.u64 %r400,$LC4; cvta.const.u64 %r39399400; call _gfortrani_bounds_ifunction_return340 4 cvta.const.u64 %r40423r204400; call _gfortrani_bounds_equal_extents$L76: .loc 1 345 3 setp.ne.u64 %r406,%r148,0; @ %r406 bra $L78; $L81: .loc 1 353 8 ld.u64 %r165,[%r229]; .loc 1 354 8 ld.u64 %r163,[%r231]; .loc 1 356 9 setp.ne.u64 %r407,%r163,0; @ %r407 bra $L79; bra $L50; $L78: add.u64 %r185,%r229,40; cvt.u32.u32 %r409,%r23; cvt.s64.s8 %r408,%r409; add.u64 %r169,%r408,-1; .loc 1 345 3 mov.u64 %r179,0; mov.u64 %r62,%r179; add.u64 %r479,%frame,360; add.u64 %r476,%frame,480; add.u64 %r474,%frame,120; .loc 1 347 16 mov.u64 %r412,%r179; $L80: add.u64 %r411,%r476,%r179; st.u64 [%r411],%r412; .loc 1 348 18 add.u64 %r414,%r474,%r179; .loc 1 348 54 ld.u64 %r416,[%r185]; mul.lo.u64 %r415,%r416,%r234; .loc 1 348 18 st.u64 [%r414],%r415; .loc 1 349 17 add.u64 %r418,%r479,%r179; .loc 1 349 10 ld.u64 %r419,[%r418]; setp.le.s64 %r420,%r419,0; @ %r420 bra $L50; .loc 1 345 26 add.u64 %r62,%r62,1; .loc 1 345 3 add.u64 %r185,%r185,24; add.u64 %r179,%r179,8; setp.ne.u64 %r421,%r62,%r169; @ %r421 bra $L80; bra $L81; $L79: .loc 1 278 12 mul.lo.u64 %r44,%r41,%r33; .loc 1 366 2 shl.b64 %r95,%r234,2; .loc 1 368 32 mul.lo.u64 %r96,%r39,%r95; .loc 1 389 22 ld.u64 %r201,[%frame+240]; .loc 1 389 12 shl.b64 %r204,%r201,2; .loc 1 390 23 ld.u64 %r205,[%frame]; .loc 1 391 22 ld.u64 %r208,[%frame+120]; .loc 1 391 12 shl.b64 %r211,%r208,2; setp.le.s64 %r477,%r148,1; cvt.u32.u32 %r481,%r23; cvt.s64.s8 %r482,%r481; add.u64 %r483,%r482,-1; add.u64 %r484,%frame,240; add.u64 %r485,%frame,360; add.u64 %r486,%frame,120; $L92: .loc 1 366 2 mov.u32 %r423,255mov.u64 %r159,%r164; mov.u64 %r158,%r163; .loc 1 368 9 mov.u64 %r157,0; $L83: .loc 1 371 7 ld.s8 %r212,[%r159]; .loc 1 371 6 setp.ne.u32 %r426,%r212,0; @ %r426 bra $L82; .loc 1 368 24 add.u64 %r157,%r157,1; .loc 1 368 32 add.u64 %r158,%r158,%r96; .loc 1 368 47 add.u64 %r159,%r159,%r44; .loc 1 368 2 setp.ne.u64 %r427,%r149,%r157; @ %r427 bra $L83; mov.u64 %r158,%r165; bra $L84; $L82: .loc 1 377 6 setp.le.s64 %r428,%r149,%r157; @ %r428 bra $L84; mov.u64 %r166,%r158; $L86: .loc 1 379 6 setp.eq.u32 %r429,%r212,0; @ %r429 bra $L85; .loc 1 40(%value_in),_gfortrani_memcmp_char4433,[%value_in]; } .loc 1 379 13 setp.ge.s32 %r434,%r433,0; selp.u64 %r158,%r158,%r166,%r434; $L85: .loc 1 377 23 add.u64 %r157,%r157,1; .loc 1 377 31 add.u64 %r166,%r166,%r96; .loc 1 377 46 add.u64 %r159,%r159,%r44; .loc 1 377 6 setp.eq.u64 %r435,%r149,%r157; @ %r435 bra $L84; .loc 1 379 7 ld.s8 %r212,[%r159]; bra $L86; $L84: .loc 1 385 2442,[%value_in]; } .loc 1 388 15 ld.u64 %r444,[%frame+480]; add.u64 %r100,%r444,1; st.u64 [%frame+480],%r100; .loc 1 389 12 add.u64 %r163,%r163,%r204; .loc 1 390 13 add.u64 %r164,%r164,%r205; .loc 1 391 12 add.u64 
%r165,%r165,%r211; .loc 1 393 32 ld.u64 %r136,[%frame+360]; .loc 1 393 13 setp.ne.u64 %r445,%r100,%r136; @ %r445 bra $L92; .loc 1 397 13 mov.u64 %r446,0; st.u64 [%frame+480],%r446; .loc 1 400 23 mul.lo.u64 %r108,%r100,%r201; .loc 1 401 24 mul.lo.u64 %r447,%r100,%r205; .loc 1 401 10 sub.u64 %r160,%r164,%r447; .loc 1 402 23 mul.lo.u64 %r448,%r100,%r208; .loc 1 402 9 shl.b64 %r449,%r448,2; sub.u64 %r161,%r165,%r449; .loc 1 404 7 @ %r477 bra $L50; add.u64 %r198,%frame,488; mov.u64 %r197,8; .loc 1 403 5 mov.u64 %r162,1; bra $L89; $L90: .loc 1 397 13 st.u64 [%r198],%r446; .loc 1 400 23 mul.lo.u64 %r108,%r120,%r119; .loc 1 401 24 mul.lo.u64 %r455,%r122,%r119; .loc 1 401 10 sub.u64 %r160,%r164,%r455; .loc 1 402 23 mul.lo.u64 %r456,%r124,%r119; .loc 1 402 9 shl.b64 %r457,%r456,2; sub.u64 %r161,%r165,%r457; .loc 1 403 5 add.u64 %r162,%r162,1; .loc 1 404 7 add.u64 %r198,%r198,8; add.u64 %r197,%r197,8; setp.eq.u64 %r458,%r162,%r483; @ %r458 bra $L50; $L89: .loc 1 412 16 ld.u64 %r459,[%r198]; add.u64 %r119,%r459,1; st.u64 [%r198],%r119; .loc 1 413 23 add.u64 %r461,%r484,%r197; ld.u64 %r120,[%r461]; .loc 1 413 13 sub.u64 %r462,%r120,%r108; shl.b64 %r463,%r462,2; add.u64 %r163,%r163,%r463; .loc 1 414 24 add.u64 %r464,%frame,%r197; ld.u64 %r122,[%r464]; .loc 1 414 14 add.u64 %r164,%r160,%r122; .loc 1 415 23 add.u64 %r466,%r486,%r197; ld.u64 %r124,[%r466]; .loc 1 415 13 shl.b64 %r467,%r124,2; add.u64 %r165,%r161,%r467; .loc 1 393 32 add.u64 %r469,%r485,%r197; ld.u64 %r128,[%r469]; .loc 1 393 13 setp.eq.u64 %r470,%r119,%r128; @ %r470 bra $L90; bra $L92; $L110: .loc 1 300 6 ld.u64 %r471,[%r229]; setp.eq.u64 %r472,%r471,0; @ ! %r472 bra $L66; bra $L91; $L50: .loc 1 419_gfortran_sminval1_s4 .visible .func _gfortran_sminval1_s4368predu16 %r180; .reg .predpredu32u64 %r229; .reg .u16predpred %r264; .reg .pred %r265; .reg .u64 %r266; .reg .u32pred %r274; .reg .u64 %r275; .reg .u16u32 %r283; .reg .u64predu32 %r313; .reg .u64mov.u64 %r156,%ar0; mov.u64 %r157,%ar1; mov.u64 %r158,%ar2; mov.u64 %r159,%ar3; mov.u64 %r160,%ar4; mov.u64 %r161,%ar5; .loc 1 445 6 setp.eq.u64 %r162,%r160,0; @ %r162 bra $L113; .loc 1 445 20 ld.u32 %r163,[%r160]; setp.eq.u32 %r164,%r163,0; @ %r164 bra $L114; $L113: .loc 1 447 7r157158161; call _gfortran_minval1_s4); } .loc 1 448 7 bra $L112; $L114: .loc 1 451 10 ld.u64 %r23,[%r159]; .loc 1 451 7 add.u64 %r91,%r23,-1; .loc 1 452 10 ld.s8 %r24,[%r158+28]; .loc 1 452 38 add.u32 %r26,%r24,-1; .loc 1 452 8 cvt.s64.s32 %r311,%r26; .loc 1 454 7 shr.u64 %r171,%r91,63; set.u32.gt.s64 %r173,%r91,%r311; neg.s32 %r174,%r173; cvt.u16.u64 %r177,%r171; cvt.u16.u32 %r178,%r174; or.b16 %r176,%r177,%r178; .loc 1 454 6 cvt.u32.u16 %r179,%r176; cvt.u16.u8 %r180,%r179; setp.ne.u16 %r181,%r180,0; @ %r181 bra $L116; .loc 1 461 3 setp.ne.u64 %r182,%r91,0; @ %r182 bra $L117; $L123: .loc 1 469 3 setp.lt.s64 %r183,%r91,%r311; @ %r183 bra $L118; bra $L169; $L116: .loc 1 456 7 cvt.u32.u32 %r186,%r24; cvt.s64.s8 %r185,%r186; st.u64 [%stack+8],%r1851841stack; call _gfortran_runtime_error117: add.u64 %r100,%r158,48; add.u64 %r72,%frame,120; add.u64 %r59,%r158,56; add.u64 %r188,%r158,24; add.u64 %r190,%r23,%r23; add.u64 %r191,%r190,%r23; shl.b64 %r192,%r191,3; add.u64 %r152,%r188,%r192; .loc 1 466 12 mov.u64 %r317,0; $L122: .loc 1 463 19 ld.u64 %r194,[%r59]; add.u64 %r193,%r194,1; ld.u64 %r195,[%r100]; sub.u64 %r35,%r193,%r195; .loc 1 465 10 setp.le.s64 %r196,%r35,0; @ %r196 bra $L120; .loc 1 463 17 st.u64 [%r72],%r35; bra $L121; $L120: .loc 1 466 12 st.u64 [%r72],%r317; $L121: .loc 1 461 3 add.u64 %r100,%r100,24; add.u64 
%r72,%r72,8; add.u64 %r59,%r59,24; setp.ne.u64 %r198,%r100,%r152; @ %r198 bra $L122; bra $L123; $L169: .loc 1 478 15 ld.u64 %r102,[%r156]; .loc 1 478 6 setp.eq.u64 %r199,%r102,0; @ ! %r199 bra $L125; bra $L124; $L118: add.u64 %r201,%r23,%r23; add.u64 %r202,%r201,%r23; shl.b64 %r203,%r202,3; add.u64 %r204,%r203,48; add.u64 %r81,%r158,%r204; add.u64 %r309,%frame,120; shl.b64 %r206,%r23,3; add.u64 %r207,%r206,-8; add.u64 %r30,%r309,%r207; add.u64 %r208,%r158,48; cvt.u32.u32 %r210,%r24; cvt.s64.s8 %r209,%r210; add.u64 %r212,%r209,%r209; add.u64 %r213,%r212,%r209; shl.b64 %r214,%r213,3; add.u64 %r97,%r208,%r214; .loc 1 475 12 mov.u64 %r316,0; $L128: .loc 1 472 2 ld.u64 %r216,[%r81+8]; add.u64 %r215,%r216,1; ld.u64 %r217,[%r81]; sub.u64 %r40,%r215,%r217; .loc 1 474 10 setp.le.s64 %r218,%r40,0; @ %r218 bra $L126; .loc 1 471 17 st.u64 [%r30],%r40; bra $L127; $L126: .loc 1 475 12 st.u64 [%r30],%r316; $L127: .loc 1 469 3 add.u64 %r81,%r81,24; add.u64 %r30,%r30,8; setp.ne.u64 %r220,%r81,%r97; @ %r220 bra $L128; bra $L170; $L124: .loc 1 482 7 setp.eq.u64 %r221,%r311,0; @ %r221 bra $L130; add.u64 %r309,%frame,120; $L149: add.u64 %r119,%r156,40; mov.u64 %r117,%r309; cvt.u32.u32 %r223,%r24; cvt.s64.s8 %r222,%r223; add.u64 %r225,%r222,%r222; add.u64 %r226,%r225,%r222; shl.b64 %r227,%r226,3; add.u64 %r228,%r156,16; add.u64 %r110,%r227,%r228; .loc 1 485 10 mov.u64 %r86,1; .loc 1 489 4 mov.u64 %r248,0; bra $L131; $L130: .loc 1 493 24 mov.u64 %r229,0; st.u64 [%r156+8],%r229; .loc 1 494 28 cvt.u16.u32 %r232,%r24; add.u16 %r231,%r232,-1; cvt.u32.u16 %r233,%r231; st.u8 [%r156+28],%r233; .loc 1 496 20 add.u32 %r234,%r24,-2; cvt.s64.s32 %r49,%r234; add.u64 %r236,%r49,%r49; add.u64 %r237,%r236,%r49156,%r238; .loc 1 496 67 shl.b64 %r241,%r49,3; add.u64 %r242,%frame,%r241; .loc 1 496 59 ld.u64 %r245,[%r239+40]; ld.u64 %r246,[%r242+120]; mul.lo.u64 %r244,%r245,%r246; .loc 1 496 18 mul.lo.u64 %r99,%r244,%r161; .loc 1 499 10 setp.eq.u64 %r247,%r99,0; @ %r247 bra $L132; bra $L171; $L134: .loc 1 487 48 mul.lo.u64 %r86,%r45,%r86; $L131: .loc 1 489 4 st.u64 [%r119+8],%r248; ld.u64 %r45,[%r117]; add.u64 %r249,%r45,-1; st.u64 [%r119+16],%r249; st.u64 [%r119],%r86; .loc 1 482 7 add.u64 %r119,%r119,24; add.u64 %r117,%r117,8; setp.ne.u64 %r250,%r110,%r119; @ %r250 bra $L134; bra $L130; $L132: .loc 1 502 4 st.u64 [%r156+48],%r99; mov.u64 %r252,-1; st.u64 [%r156+56],%r252; mov.u64 %r253,1; st.u64 [%r156+40],%r253; .loc 1 503 4 bra $L112; $L171: .loc 1 506 2499call (%value_in),_gfortrani_xmallocarray256,[%value_in]; } mov.u64 %r102,%r256; .loc 1 506 22 st.u64 [%r156],%r102; $L138: .loc 1 532 3 setp.ne.u64 %r258,%r311,0; @ %r258 bra $L135; bra $L136; $L125: .loc 1 510 19 ld.s8 %r55,[%r156+28]; .loc 1 510 10 setp.eq.u64 %r259,%r55,%r311; @ %r259 bra $L137; .loc 1 511 2 st.u64 [%stack+8],%r311; st.u64 [%stack],%r5_gfortran_runtime_error137: .loc 1 516 11 cvta.global.u64 %r262,_gfortrani_compile_options; .loc 1 516 10 ld.u32 %r263,[%r262+36]; setp.eq.u32 %r264,%r263,0; @ %r264 bra $L138; .loc 1 518 4 setp.eq.u64 %r265,%r311,0; @ %r265 bra $L136; add.u64 %r105,%r156,48; add.u64 %r88,%frame,120; cvt.u32.u32 %r267,%r24; cvt.s64.s8 %r266,%r267; add.u64 %r83,%r266,-1; .loc 1 518 10 mov.u64 %r109,0; $L140: .loc 1 522 21 ld.u64 %r269,[%r105+8]; add.u64 %r268,%r269,1; .loc 1 522 19 ld.u64 %r270,[%r105]; sub.u64 %r96,%r268,%r270; .loc 1 523 18 ld.u64 %r63,[%r88]; .loc 1 524 3 add.u64 %r109,%r109,1; .loc 1 523 11 setp.eq.u64 %r271,%r63,%r96; @ %r271 bra $L139; .loc 1 524 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r96; st.u64 [%stack],%r109; 
cvta.const.u64 %r272,$LC272_gfortran_runtime_error139: .loc 1 518 4 add.u64 %r105,%r105,24; add.u64 %r88,%r88,8; setp.ne.u64 %r274,%r83,%r109; @ %r274 bra $L140; $L135: .loc 1 534 16 cvt.u16.u32 %r276,%r24; setp.le.s16 %r277,%r276,1; @ %r277 bra $L141; shl.b64 %r275,%r311,3; bra $L142; $L141: mov.u64 %r275,8; $L142: add.u64 %r279,%frame,240; mov.u32 %r2832755,[%value_in]; } add.u64 %r125,%r156,40; mov.u64 %r104,0; $L143: .loc 1 535 18 shl.b64 %r287,%r104,3; add.u64 %r288,%frame,%r287; .loc 1 535 54 ld.u64 %r290,[%r125]; mul.lo.u64 %r289,%r290,%r161; .loc 1 535 18 st.u64 [%r288],%r289; .loc 1 532 26 add.u64 %r104,%r104,1; .loc 1 532 3 add.u64 %r125,%r125,24; setp.gt.s64 %r291,%r311,%r104; @ %r291 bra $L143; $L136: .loc 1 542 7 shl.b64 %r70,%r161,2; .loc 1 546 32 ld.u64 %r140,[%frame+120]; .loc 1 544 22 ld.u64 %r141,[%frame]; .loc 1 544 12 shl.b64 %r143,%r141,2; .loc 1 543 12 ld.u64 %r147,[%frame+240]; .loc 1 553 23 mul.lo.u64 %r151,%r140,%r141; setp.le.s64 %r310,%r311,1; cvt.u32.u32 %r313,%r24; cvt.s64.s8 %r314,%r313; add.u64 %r315,%r314,-1; $L148: .loc 1 542 7 mov.u32 %r293,255543 15 add.u64 %r147,%r147,1; .loc 1 544 12 add.u64 %r102,%r102,%r143; .loc 1 546 13 setp.ne.u64 %r296,%r147,%r140; @ %r296 bra $L148; .loc 1 555 7 @ %r310 bra $L112; add.u64 %r103,%frame,248; add.u64 %r133,%frame,8; add.u64 %r132,%frame,128; .loc 1 553 23 mov.u64 %r75,%r151; .loc 1 554 5 mov.u64 %r101,1; .loc 1 550 13 mov.u64 %r312,0; bra $L146; $L147: st.u64 [%r103],%r312; .loc 1 553 23 mul.lo.u64 %r75,%r78,%r77; .loc 1 554 5 add.u64 %r101,%r101,1; .loc 1 555 7 add.u64 %r103,%r103,8; add.u64 %r133,%r133,8; add.u64 %r132,%r132,8; setp.eq.u64 %r303,%r101,%r315; @ %r303 bra $L112; $L146: .loc 1 559 16 ld.u64 %r304,[%r103]; add.u64 %r77,%r304,1; st.u64 [%r103],%r77; .loc 1 560 23 ld.u64 %r78,[%r133]; .loc 1 560 13 sub.u64 %r305,%r78,%r75; shl.b64 %r306,%r305,2; add.u64 %r102,%r102,%r306; .loc 1 546 32 ld.u64 %r80,[%r132]; .loc 1 546 13 setp.eq.u64 %r307,%r77,%r80; @ %r307 bra $L147; mov.u64 %r147,0; bra $L148; $L170: .loc 1 478 15 ld.u64 %r102,[%r156]; .loc 1 478 6 setp.eq.u64 %r308,%r102,0; @ ! 
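The PTX bodies above walk the source array through its descriptor strides and keep the running minimum by calling _gfortrani_memcmp_char4 on each candidate string. The following is a minimal C sketch of that per-slice reduction only, under the simplifying assumption of a contiguous layout and a fixed string length; memcmp_char4 and minval_slice_s4 are illustrative names, not the libgfortran routines, and the real code additionally handles descriptors, bounds checks, and the masked variants.

#include <stddef.h>
#include <stdint.h>

typedef uint32_t gfc_char4_t;   /* one character(kind=4) code unit */

/* Lexicographic comparison of two kind=4 strings of equal length,
   the job the PTX delegates to _gfortrani_memcmp_char4. */
static int memcmp_char4(const gfc_char4_t *a, const gfc_char4_t *b, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        if (a[i] != b[i])
            return a[i] < b[i] ? -1 : 1;
    }
    return 0;
}

/* Minimum of `count` strings stored back to back, each `len` units long;
   returns a pointer to the first occurrence of the smallest one. */
static const gfc_char4_t *
minval_slice_s4(const gfc_char4_t *base, size_t count, size_t len)
{
    const gfc_char4_t *min = base;
    for (size_t i = 1; i < count; i++) {
        const gfc_char4_t *cur = base + i * len;
        if (memcmp_char4(cur, min, len) < 0)
            min = cur;
    }
    return min;
}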
// findloc0_i1.o — PTX object compiled from ../../../../libgfortran/generated/findloc0_i1.c
// Visible entry points: _gfortran_findloc0_i1; _gfortran_mfindloc0_i1 (array MASK); _gfortran_sfindloc0_i1 (scalar MASK, which calls _gfortran_findloc0_i1 when the MASK pointer is null or the scalar mask is true, and otherwise zero-fills the result vector)
// External declarations: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return, _gfortrani_bounds_equal_extents, _gfortrani_internal_error
// Constant strings: "FINDLOC"; "MASK argument"; "Rank of array needs to be > 0"; "Funny sized logical array"
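FINDLOC returns the position of the first (or, with BACK=.TRUE., last) element equal to VALUE, using 1-based positions and 0 when no element matches. As a minimal sketch of that search pattern for the rank-1, kind=1 case — findloc0_i1_rank1 is an illustrative name, and the real routine walks an array descriptor of arbitrary rank and fills a vector of per-dimension positions:

#include <stddef.h>
#include <stdint.h>

/* Returns the 1-based position of the first (or last, when back != 0)
   element equal to `value`, or 0 when the value does not occur. */
static size_t findloc0_i1_rank1(const int8_t *array, size_t n,
                                int8_t value, int back)
{
    if (back) {
        for (size_t i = n; i > 0; i--)
            if (array[i - 1] == value)
                return i;
    } else {
        for (size_t i = 0; i < n; i++)
            if (array[i] == value)
                return i + 1;
    }
    return 0;
}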
// findloc0_i2.o — PTX object compiled from ../../../../libgfortran/generated/findloc0_i2.c
// Visible entry points: _gfortran_findloc0_i2; _gfortran_mfindloc0_i2 (array MASK); _gfortran_sfindloc0_i2 (scalar MASK, which calls _gfortran_findloc0_i2 when the MASK pointer is null or the scalar mask is true, and otherwise zero-fills the result vector)
// External declarations: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return, _gfortrani_bounds_equal_extents, _gfortrani_internal_error
// Constant strings: "FINDLOC"; "MASK argument"; "Rank of array needs to be > 0"; "Funny sized logical array"
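The _gfortran_sfindloc0_* wrappers shown in these objects only dispatch on a scalar MASK: they fall through to the unmasked kernel when MASK is absent or true, and produce an all-zero result vector when it is false. A minimal C sketch of that dispatch, with sfindloc0_dispatch, findloc0_unmasked, and zero-filling written out as hypothetical stand-ins rather than the libgfortran interfaces:

#include <stddef.h>
#include <stdint.h>

typedef int32_t gfc_logical4;   /* logical(kind=4) mask value */

static void sfindloc0_dispatch(size_t *result, size_t rank,
                               const gfc_logical4 *mask,
                               void (*findloc0_unmasked)(size_t *))
{
    /* MASK absent or scalar .true.: same as FINDLOC without MASK. */
    if (mask == NULL || *mask) {
        findloc0_unmasked(result);
        return;
    }
    /* Scalar .false.: no element can match, so every coordinate is 0. */
    for (size_t i = 0; i < rank; i++)
        result[i] = 0;
}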
// findloc0_i4.o — PTX object compiled from ../../../../libgfortran/generated/findloc0_i4.c
// Visible entry points: _gfortran_findloc0_i4; _gfortran_mfindloc0_i4 (array MASK); _gfortran_sfindloc0_i4 (scalar MASK, which calls _gfortran_findloc0_i4 when the MASK pointer is null or the scalar mask is true, and otherwise zero-fills the result vector)
// External declarations: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return, _gfortrani_bounds_equal_extents, _gfortrani_internal_error
// Constant strings: "FINDLOC"; "MASK argument"; "Rank of array needs to be > 0"; "Funny sized logical array"
%r161,%ar1; mov.u32 %r162,%ar2; mov.u64 %r163,%ar3; mov.u32 %r164,%ar4; .loc 1 190 10 ld.s8 %r22,[%r161+28]; .loc 1 190 8 cvt.u32.u32 %r165,%r22; cvt.s64.s8 %r104,%r165; .loc 1 191 6 setp.gt.s64 %r166,%r104,0; @ %r166 bra $L34; .loc 1 192 5 cvta.const.u64 %r16167_gfortran_runtime_error34: .loc 1 194 6 ld.u64 %r169,[%r160; .loc 1 196 7 st.u64 [%r160+48],%r169; add.u64 %r172,%r104,-1; st.u64 [%r160+56],%r172; mov.u64 %r173,1; st.u64 [%r160+40],%r173; .loc 1 197 28 cvt.u32.u64 %r174,%r173; st.u8 [%r160+28],%r174; .loc 1 198 24 st.u64 [%r160+8],%r169; .loc 1 199 29 mov.u64 %r177call (%value_in),_gfortrani_xmallocarray,(%out_arg1,%out_arg2); ld.param.u64 %r180,[%value_in]; } .loc 1 199 27 st.u64 [%r160],%r180; bra $L36; $L35: .loc 1 203 11 cvta.global.u64 %r182,_gfortrani_compile_options; .loc 1 203 10 ld.u32 %r183,[%r182+36]; setp.eq.u32 %r184,%r183,0; @ %r184 bra $L36; .loc 1 205 4 cvta.const.u64 %r187,$LC1; {161187; call _gfortrani_bounds_iforeach_return.loc 1 207 4 cvta.const.u64 %r19016r161190187; call _gfortrani_bounds_equal_extents$L36: .loc 1 212 13 ld.u32 %r105,[%r163+16]; .loc 1 214 9 ld.u64 %r60,[%r163]; .loc 1 216 53 add.u32 %r192,%r105,-4; and.b32 %r193,%r192,-5; set.u32.eq.u32 %r195,%r193,0; neg.s32 %r196,%r195; .loc 1 216 22 add.u32 %r197,%r105,-1; set.u32.le.u32 %r199,%r197,1; neg.s32 %r200,%r199; .loc 1 216 6 cvt.u16.u32 %r202,%r196; cvt.u16.u32 %r203,%r200; or.b16 %r201,%r202,%r203;%r206 bra $L37; .loc 1 218 7 setp.ne.u32 %r207,%r105,16; @ %r207 bra $L38; $L37: ld.u64 %r208,[%r160+40]; shl.b64 %r98,%r208,3; ld.u64 %r133,[%r160]; .loc 1 226 8 mov.u64 %r84,%r133; .loc 1 229 10 mov.u64 %r118,0; .loc 1 230 23 mov.u64 %r211,%r118; bra $L39; $L38: .loc 1 223 5 cvta.const.u64 %r210,$LC3; mov.u64 %r209,210; call _gfortrani_internal_error39: .loc 1 230 23 st.u64 [%r84],%r211; mov.u64 %r134,%r118; .loc 1 229 26 add.u64 %r118,%r134,1; .loc 1 229 3 add.u64 %r84,%r84,%r98; setp.ne.u64 %r212,%r104,%r118; @ %r212 bra $L39; .loc 1 236 20 ld.u64 %r37,[%r163+16]; add.u64 %r143,%r161,40; add.u64 %r285,%frame,120; mov.u64 %r28,%r285; add.u64 %r80,%r163,40; mov.u64 %r82,%frame; add.u64 %r78,%frame,240; mov.u64 %r54,%r78; .loc 1 232 6 mov.u64 %r116,1; .loc 1 233 10 mov.u64 %r132,0; bra $L41; $L54: mov.u64 %r132,%r117; $L41: .loc 1 235 18 ld.u64 %r213,[%r143]; st.u64 [%r28],%r213; .loc 1 236 20 ld.u64 %r215,[%r80]; mul.lo.u64 %r214,%r215,%r37; .loc 1 236 18 st.u64 [%r82],%r214; .loc 1 237 19 ld.u64 %r217,[%r143+16]; add.u64 %r216,%r217,1; ld.u64 %r218,[%r143+8]; sub.u64 %r43,%r216,%r218; .loc 1 237 17 st.u64 [%r54],%r43; .loc 1 238 10 mul.lo.u64 %r116,%r116,%r43; .loc 1 239 10 setp.le.s64 %r219,%r43,0; @ %r219 bra $L33; .loc 1 233 26 add.u64 %r117,%r132,1; .loc 1 233 3 add.u64 %r143,%r143,24; add.u64 %r28,%r28,8; add.u64 %r80,%r80,24; add.u64 %r82,%r82,8; add.u64 %r54,%r54,8; setp.ne.u64 %r220,%r134,%r132; @ %r220 bra $L54; .loc 1 244 16 cvt.u32.u32 %r222,%r22; cvt.s64.s8 %r221,%r222; shl.b64 %r141,%r221,3; add.u64 %r283,%frame,360; mov.u32 %r227227261 23 ld.u64 %r155,[%frame+120]; .loc 1 261 13 shl.b64 %r157,%r155,2; .loc 1 262 24 ld.u64 %r158,[%frame]; .loc 1 263 34 ld.u64 %r151,[%frame+240]; .loc 1 248 19 ld.u64 %r47,[%r161]; .loc 1 246 6 setp.eq.u32 %r231,%r164,0; @ ! 
%r231 bra $L72; add.u64 %r284,%frame,368; mov.u32 %r293,1; .loc 1 324 33 add.u64 %r296,%frame,240; bra $L42; $L72: .loc 1 248 37 add.u64 %r44,%r116,-1; .loc 1 248 31 shl.b64 %r232,%r44,2; .loc 1 248 12 add.u64 %r96,%r47,%r232; .loc 1 249 32 cvt.s64.s32 %r233,%r105; .loc 1 249 13 mad.lo.u64 %r97,%r233,%r44,%r60; .loc 1 261 13 neg.s64 %r58,%r157; .loc 1 262 14 neg.s64 %r59,%r158; add.u64 %r284,%frame,368; mov.u32 %r287,1; .loc 1 284 33 add.u64 %r290,%frame,240; $L48: ld.u64 %r130,[%frame+360]; mov.u64 %r152,%r130; mov.u32 %r33,%r227; $L46: .loc 1 254 11 ld.s8 %r236,[%r97]; cvt.u16.u32 %r235,%r236; setp.eq.u16 %r237,%r235,0; @ %r237 bra $L43; ld.u32 %r238,[%r96]; setp.ne.u32 %r239,%r238,%r162; @ %r239 bra $L43; setp.eq.u32 %r240,%r33,0; selp.u64 %r152,%r130,%r152,%r240; add.u64 %r139,%frame,248; mov.u64 %r138,%r284; add.u64 %r136,%r78,%r141; $L45: .loc 1 257 37 sub.u64 %r243,%r151,%r152; .loc 1 257 25 st.u64 [%r133],%r243; .loc 1 256 5 add.u64 %r133,%r133,%r98; setp.eq.u64 %r244,%r136,%r139; @ %r244 bra $L33; .loc 1 257 33 ld.u64 %r151,[%r139]; .loc 1 257 44 ld.u64 %r152,[%r138]; add.u64 %r139,%r139,8; add.u64 %r138,%r138,8; bra $L45; $L43: .loc 1 261 13 add.u64 %r96,%r96,%r58; .loc 1 262 14 add.u64 %r97,%r97,%r59; .loc 1 263 14 add.u64 %r152,%r152,1; mov.u32 %r33,%r287; .loc 1 263 6 setp.ne.u64 %r245,%r152,%r151; @ %r245 bra $L46; st.u64 [%frame+360],%r151; mov.u64 %r125,%r284; mov.u64 %r69,%r158; mov.u64 %r72,%r151; mov.u64 %r67,%r155; mov.u64 %r124,8; .loc 1 265 6 mov.u64 %r100,0; .loc 1 270 17 mov.u64 %r247,%r100; $L47: st.u64 [%r125+-8],%r247; .loc 1 273 27 mul.lo.u64 %r62,%r67,%r72; .loc 1 274 28 mul.lo.u64 %r64,%r72,%r69; mov.u64 %r142,%r100; .loc 1 275 9 add.u64 %r100,%r100,1; .loc 1 276 11 setp.eq.u64 %r248,%r142,%r134; @ %r248 bra $L33; .loc 1 280 13 ld.u64 %r249,[%r125]; add.u64 %r66,%r249,1; st.u64 [%r125],%r66; .loc 1 281 20 add.u64 %r251,%r285,%r124; ld.u64 %r67,[%r251]; .loc 1 281 10 sub.u64 %r252,%r62,%r67; shl.b64 %r253,%r252,2; add.u64 %r96,%r96,%r253; .loc 1 282 21 add.u64 %r254,%frame,%r124; ld.u64 %r69,[%r254]; .loc 1 282 11 sub.u64 %r255,%r69,%r64; add.u64 %r97,%r97,%r255; .loc 1 284 33 add.u64 %r257,%r290,%r124; ld.u64 %r72,[%r257]; .loc 1 284 6 add.u64 %r125,%r125,8; add.u64 %r124,%r124,8; setp.eq.u64 %r258,%r66,%r72; @ %r258 bra $L47; bra $L48; $L42: ld.u64 %r55,[%frame+360]; .loc 1 265 6 mov.u64 %r146,%r55; mov.u32 %r126,%r164; $L52: .loc 1 294 11 ld.s8 %r260,[%r60]; cvt.u16.u32 %r259,%r260; setp.eq.u16 %r261,%r259,0; @ %r261 bra $L49; ld.u32 %r262,[%r47]; setp.ne.u32 %r263,%r262,%r162; @ %r263 bra $L49; setp.eq.u32 %r264,%r126,0; selp.u64 %r146,%r55,%r146,%r264; mov.u64 %r119,%r284; add.u64 %r77,%r283,%r141; $L51: .loc 1 297 36 add.u64 %r267,%r146,1; .loc 1 297 25 st.u64 [%r133],%r267; .loc 1 296 5 add.u64 %r133,%r133,%r98; setp.eq.u64 %r268,%r77,%r119; @ %r268 bra $L33; .loc 1 297 32 ld.u64 %r146,[%r119]; add.u64 %r119,%r119,8; bra $L51; $L49: .loc 1 301 13 add.u64 %r47,%r47,%r157; .loc 1 302 14 add.u64 %r60,%r60,%r158; .loc 1 303 14 add.u64 %r146,%r146,1; mov.u32 %r126,%r293; .loc 1 303 6 setp.ne.u64 %r269,%r146,%r151; @ %r269 bra $L52; st.u64 [%frame+360],%r151; mov.u64 %r153,%r284; mov.u64 %r93,%r158; mov.u64 %r95,%r151; mov.u64 %r91,%r155; mov.u64 %r150,8; .loc 1 305 6 mov.u64 %r101,0; .loc 1 310 17 mov.u64 %r271,%r101; $L53: st.u64 [%r153+-8],%r271; .loc 1 313 27 mul.lo.u64 %r86,%r91,%r95; .loc 1 314 28 mul.lo.u64 %r88,%r95,%r93; mov.u64 %r103,%r101; .loc 1 315 9 add.u64 %r101,%r101,1; .loc 1 316 11 setp.eq.u64 %r272,%r103,%r134; @ %r272 bra $L33; .loc 1 320 
13 ld.u64 %r273,[%r153]; add.u64 %r90,%r273,1; st.u64 [%r153],%r90; .loc 1 321 20 add.u64 %r275,%r285,%r150; ld.u64 %r91,[%r275]; .loc 1 321 10 sub.u64 %r276,%r91,%r86; shl.b64 %r277,%r276,2; add.u64 %r47,%r47,%r277; .loc 1 322 21 add.u64 %r278,%frame,%r150; ld.u64 %r93,[%r278]; .loc 1 322 11 sub.u64 %r279,%r93,%r88; add.u64 %r60,%r60,%r279; .loc 1 324 33 add.u64 %r281,%r296,%r150; ld.u64 %r95,[%r281]; .loc 1 324 6 add.u64 %r153,%r153,8; add.u64 %r150,%r150,8; setp.eq.u64 %r282,%r90,%r95; @ %r282 bra $L53; bra $L42; $L33: .loc 1 328_gfortran_sfindloc0_i4 .visible .func _gfortran_sfindloc0_i4predar2; mov.u64 %r38,%ar3; mov.u32 %r39,%ar4; .loc 1 345 6 setp.eq.u64 %r40,%r38,0; @ %r40 bra $L74; .loc 1 345 20 ld.u32 %r41,[%r38]; setp.eq.u32 %r42,%r41,0; @ %r42 bra $L75; $L74: .loc 1 347 7 {call _gfortran_findloc0_348 7 bra $L73; $L75: .loc 1 351 8 ld.s8 %r30,[%r36+28]; .loc 1 353 6 setp.gt.s64 %r48,%r30,0; @ %r48 bra $L77; .loc 1 354 5 cvta.const.u64 %r50,$LC0; mov.u64 %r49,0; {_gfortrani_internal_error77: .loc 1 356 15 ld.u64 %r24,[%r35]; .loc 1 356 6 setp.ne.u64 %r51,%r24,0; @ %r51 bra $L78; .loc 1 358 7 st.u64 [%r35+48],%r24; add.u64 %r53,%r30,-1; st.u64 [%r35+56],%r53; mov.u64 %r54,1; st.u64 [%r35+40],%r54; .loc 1 359 28 cvt.u32.u64 %r55,%r54; st.u8 [%r35+28],%r55; .loc 1 360 24 st.u64 [%r35+8],%r24; .loc 1 361 29 mov.u64 call (%value_in),_gfortrani_xmallocarray61 27 st.u64 [%r35],%r24; bra $L79; $L78: .loc 1 363 12 cvta.global.u64 %r63,_gfortrani_compile_options; .loc 1 363 11 ld.u32 %r64,[%r63+36]; setp.eq.u32 %r65,%r64,0; @ %r65 bra $L79; .loc 1 365 8 cvta.const.u64 %r68,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 370 8 ld.u64 %r24,[%r35]; $L79: ld.u64 %r69,[%r35+40]; shl.b64 %r34,%r69,3; mov.u64 %r29,%r24; .loc 1 371 10 mov.u64 %r32,0; .loc 1 372 23 mov.u64 %r70,%r32; $L80: st.u64 [%r29],%r70; .loc 1 371 24 add.u64 %r32,%r32,1; .loc 1 371 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r71,%r30,%r32; @ %r71 bra $L80; $L73: .loc 1 373 1 ret; } findloc0_i8.o/_gfortran_findloc0_i8 .visible .func _gfortran_findloc0_i8.file 1 "../../../../libgfortran/generated/findloc0_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc0_i8 .visible .func _gfortran_mfindloc0__gfortran_sfindloc0_i8 .visible .func _gfortran_sfindloc0_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_internal_error .extern .func _gfortrani_internal_70,73,78,68,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_findloc0_i8 .visible .func _gfortran_findloc0_i888pred.reg .u64 %r140; .reg .u32 %r141; .reg .predpredu64 %r166; .reg .pred.reg .pred %r178; .reg .predu64pred %r192; .reg .u64 %r195; .reg .predpredu32 %r211; .reg .u64 %r215; .reg .u32 %r216; .reg .u64 %r220; mov.u64 %r120,%ar0; mov.u64 %r121,%ar1; mov.u64 %r122,%ar2; mov.u32 %r123,%ar3; .loc 1 51 10 ld.s8 %r22,[%r121+28]; .loc 1 51 8 cvt.u32.u32 %r124,%r22; cvt.s64.s8 %r80,%r124; .loc 1 52 6 setp.gt.s64 %r125,%r80,0; @ %r125 bra $L2; .loc 1 53 5 cvta.const.u64 %r126,$LCstack; call _gfortran_runtime_errorr23,[%r120]; .loc 1 55 6 setp.ne.u64 %r128,%r23,0; @ %r128 
bra $L3; .loc 1 57 7 st.u64 [%r120+48],%r23; add.u64 %r130,%r80,-1; st.u64 [%r120+56],%r130; mov.u64 %r131,1; st.u64 [%r120+40],%r131; .loc 1 58 28 cvt.u32.u64 %r132,%r131; st.u8 [%r120+28],%r132; .loc 1 59 24 st.u64 [%r120+8],%r23; .loc 1 60 29 mov.u64 %r135135; call (%value_in),_gfortrani_xmallocarray138,[%value_in]; } mov.u64 %r23,%r138; .loc 1 60 27 st.u64 [%r120],%r23; bra $L4; $L3: .loc 1 64 11 cvta.global.u64 %r140,_gfortrani_compile_options; .loc 1 64 10 ld.u32 %r141,[%r140+36]; setp.eq.u32 %r142,%r141,0; @ %r142 bra $L4; .loc 1 6gfortrani_bounds_iforeach_return.loc 1 70 8 ld.u64 %r23,[%r120]; $L4: ld.u64 %r146,[%r120+40]; shl.b64 %r118,%r146,3; mov.u64 %r102,%r23; .loc 1 69 11 mov.u64 %r116,%r102; .loc 1 73 10 mov.u64 %r89,0; .loc 1 74 23 mov.u64 %r147,%r89; $L5: st.u64 [%r116],%r147; mov.u64 %r103,%r89; .loc 1 73 26 add.u64 %r89,%r103,1; .loc 1 73 3 add.u64 %r116,%r116,%r118; setp.ne.u64 %r148,%r80,%r89; @ %r148 bra $L5; add.u64 %r39,%r121,40; mov.u64 %r74,%frame; .loc 1 76 6 mov.u64 %r87,1; .loc 1 77 10 mov.u64 %r101,0; add.u64 %r208,%frame,120; bra $L7; $L20: mov.u64 %r101,%r88; $L7: .loc 1 79 18 ld.u64 %r149,[%r39]; st.u64 [%r74],%r149; .loc 1 80 19 ld.u64 %r151,[%r39+16]; add.u64 %r150,%r151,1; ld.u64 %r152,[%r39+8]; sub.u64 %r33,%r150,%r152; .loc 1 80 17 shl.b64 %r154,%r101,3; add.u64 %r155,%r208,%r154; st.u64 [%r155],%r33; .loc 1 81 10 mul.lo.u64 %r87,%r87,%r33; .loc 1 82 10 setp.le.s64 %r156,%r33,0; @ %r156 bra $L1; .loc 1 77 26 add.u64 %r88,%r101,1; .loc 1 77 3 add.u64 %r39,%r39,24; add.u64 %r74,%r74,8; setp.ne.u64 %r157,%r103,%r101; @ %r157 bra $L20; .loc 1 87 16 cvt.u32.u32 %r159,%r22; cvt.s64.s8 %r158,%r159; shl.b64 %r95,%r158,3; add.u64 %r209,%frame,240; mov.u32 %r164%r113,[%frame]; .loc 1 104 13 shl.b64 %r115,%r113,3; .loc 1 105 34 ld.u64 %r109,[%frame+120]; .loc 1 91 19 ld.u64 %r45,[%r121]; .loc 1 89 6 setp.eq.u32 %r168,%r123,0; @ ! 
%r168 bra $L32; add.u64 %r210,%frame,248; mov.u32 %r216,1; .loc 1 149 17 mov.u64 %r220,0; bra $L8; $L32: .loc 1 91 37 add.u64 %r169,%r87,-1; .loc 1 91 31 shl.b64 %r170,%r169,3; .loc 1 91 12 add.u64 %r68,%r45,%r170; .loc 1 104 13 neg.s64 %r44,%r115; add.u64 %r210,%frame,248; mov.u32 %r211,1; .loc 1 112 17 mov.u64 %r215,0; $L14: ld.u64 %r38,[%frame+240]; mov.u64 %r110,%r38; mov.u32 %r98,%r164; $L12: .loc 1 97 11 ld.u64 %r171,[%r68]; setp.ne.u64 %r172,%r171,%r122; @ %r172 bra $L9; setp.eq.u32 %r173,%r98,0; selp.u64 %r110,%r38,%r110,%r173; add.u64 %r93,%frame,128; mov.u64 %r92,%r210; add.u64 %r86,%r208,%r95; $L11: .loc 1 100 37 sub.u64 %r177,%r109,%r110; .loc 1 100 25 st.u64 [%r102],%r177; .loc 1 99 5 add.u64 %r102,%r102,%r118; setp.eq.u64 %r178,%r86,%r93; @ %r178 bra $L1; .loc 1 100 33 ld.u64 %r109,[%r93]; .loc 1 100 44 ld.u64 %r110,[%r92]; add.u64 %r93,%r93,8; add.u64 %r92,%r92,8; bra $L11; $L9: .loc 1 104 13 add.u64 %r68,%r68,%r44; .loc 1 105 14 add.u64 %r110,%r110,1; mov.u32 %r98,%r211; .loc 1 105 6 setp.ne.u64 %r179,%r110,%r109; @ %r179 bra $L12; st.u64 [%frame+240],%r109; mov.u64 %r111,%r210; mov.u64 %r53,%r109; mov.u64 %r51,%r113; .loc 1 107 6 mov.u64 %r73,0; $L13: .loc 1 112 17 st.u64 [%r111+-8],%r215; .loc 1 115 27 mul.lo.u64 %r48,%r51,%r53; mov.u64 %r79,%r73; .loc 1 116 9 add.u64 %r73,%r73,1; .loc 1 117 11 setp.eq.u64 %r182,%r79,%r103; @ %r182 bra $L1; .loc 1 121 13 ld.u64 %r183,[%r111]; add.u64 %r50,%r183,1; st.u64 [%r111],%r50; shl.b64 %r106,%r73,3; .loc 1 122 20 add.u64 %r184,%frame,%r106; ld.u64 %r51,[%r184]; .loc 1 122 10 sub.u64 %r185,%r48,%r51; shl.b64 %r186,%r185,3; add.u64 %r68,%r68,%r186; .loc 1 124 33 add.u64 %r188,%r208,%r106; ld.u64 %r53,[%r188]; .loc 1 124 6 add.u64 %r111,%r111,8; setp.eq.u64 %r189,%r50,%r53; @ %r189 bra $L13; bra $L14; $L8: ld.u64 %r96,[%frame+240]; .loc 1 107 6 mov.u64 %r105,%r96; mov.u32 %r59,%r123; $L18: .loc 1 134 11 ld.u64 %r190,[%r45]; setp.ne.u64 %r191,%r190,%r122; @ %r191 bra $L15; setp.eq.u32 %r192,%r59,0; selp.u64 %r105,%r96,%r105,%r192; mov.u64 %r72,%r210; add.u64 %r27,%r209,%r95; $L17: .loc 1 137 36 add.u64 %r195,%r105,1; .loc 1 137 25 st.u64 [%r102],%r195; .loc 1 136 5 add.u64 %r102,%r102,%r118; setp.eq.u64 %r196,%r27,%r72; @ %r196 bra $L1; .loc 1 137 32 ld.u64 %r105,[%r72]; add.u64 %r72,%r72,8; bra $L17; $L15: .loc 1 141 13 add.u64 %r45,%r45,%r115; .loc 1 142 14 add.u64 %r105,%r105,1; mov.u32 %r59,%r216; .loc 1 142 6 setp.ne.u64 %r197,%r105,%r109; @ %r197 bra $L18; st.u64 [%frame+240],%r109; mov.u64 %r71,%r210; mov.u64 %r67,%r109; mov.u64 %r65,%r113; .loc 1 144 6 mov.u64 %r75,0; $L19: .loc 1 149 17 st.u64 [%r71+-8],%r220; .loc 1 152 27 mul.lo.u64 %r62,%r65,%r67; mov.u64 %r78,%r75; .loc 1 153 9 add.u64 %r75,%r75,1; .loc 1 154 11 setp.eq.u64 %r200,%r78,%r103; @ %r200 bra $L1; .loc 1 158 13 ld.u64 %r201,[%r71]; add.u64 %r64,%r201,1; st.u64 [%r71],%r64; shl.b64 %r76,%r75,3; .loc 1 159 20 add.u64 %r202,%frame,%r76; ld.u64 %r65,[%r202]; .loc 1 159 10 sub.u64 %r203,%r65,%r62; shl.b64 %r204,%r203,3; add.u64 %r45,%r45,%r204; .loc 1 161 33 add.u64 %r206,%r208,%r76; ld.u64 %r67,[%r206]; .loc 1 161 6 add.u64 %r71,%r71,8; setp.eq.u64 %r207,%r64,%r67; @ %r207 bra $L19; bra $L8; $L1: .loc 1 165_gfortran_mfindloc0_i8 .visible .func _gfortran_mfindloc0_, .param .u32 %in_ar4) {pred %r166; .reg .u64pred %r184; .reg .u64 %r187; .reg .u64 %r190; .reg .u32u64 %r213; .reg .u64u32 %r227; .reg .u64 %r229; .reg .pred %r231; .reg .u64 %r232; .reg .u64 %r233; .reg .u16u64 %r255; .reg .u64 %r257; .reg .pred %r258; .reg .u16 %r259; .reg .u32mov.u64 %r160,%ar0; mov.u64 
%r161,%ar1; mov.u64 %r162,%ar2; mov.u64 %r163,%ar3; mov.u32 %r164,%ar4; .loc 1 190 10 ld.s8 %r22,[%r161+28]; .loc 1 190 8 cvt.u32.u32 %r165,%r22; cvt.s64.s8 %r104,%r165; .loc 1 191 6 setp.gt.s64 %r166,%r104,0; @ %r166 bra $L34; .loc 1 192 5 cvta.const.u64 %r16167_gfortran_runtime_error34: .loc 1 194 6 ld.u64 %r169,[%r160; .loc 1 196 7 st.u64 [%r160+48],%r169; add.u64 %r172,%r104,-1; st.u64 [%r160+56],%r172; mov.u64 %r173,1; st.u64 [%r160+40],%r173; .loc 1 197 28 cvt.u32.u64 %r174,%r173; st.u8 [%r160+28],%r174; .loc 1 198 24 st.u64 [%r160+8],%r169; .loc 1 199 29 mov.u64 %r177call (%value_in),_gfortrani_xmallocarray,(%out_arg1,%out_arg2); ld.param.u64 %r180,[%value_in]; } .loc 1 199 27 st.u64 [%r160],%r180; bra $L36; $L35: .loc 1 203 11 cvta.global.u64 %r182,_gfortrani_compile_options; .loc 1 203 10 ld.u32 %r183,[%r182+36]; setp.eq.u32 %r184,%r183,0; @ %r184 bra $L36; .loc 1 205 4 cvta.const.u64 %r187,$LC1; {161187; call _gfortrani_bounds_iforeach_return.loc 1 207 4 cvta.const.u64 %r19016r161190187; call _gfortrani_bounds_equal_extents$L36: .loc 1 212 13 ld.u32 %r105,[%r163+16]; .loc 1 214 9 ld.u64 %r60,[%r163]; .loc 1 216 53 add.u32 %r192,%r105,-4; and.b32 %r193,%r192,-5; set.u32.eq.u32 %r195,%r193,0; neg.s32 %r196,%r195; .loc 1 216 22 add.u32 %r197,%r105,-1; set.u32.le.u32 %r199,%r197,1; neg.s32 %r200,%r199; .loc 1 216 6 cvt.u16.u32 %r202,%r196; cvt.u16.u32 %r203,%r200; or.b16 %r201,%r202,%r203;%r206 bra $L37; .loc 1 218 7 setp.ne.u32 %r207,%r105,16; @ %r207 bra $L38; $L37: ld.u64 %r208,[%r160+40]; shl.b64 %r98,%r208,3; ld.u64 %r133,[%r160]; .loc 1 226 8 mov.u64 %r84,%r133; .loc 1 229 10 mov.u64 %r118,0; .loc 1 230 23 mov.u64 %r211,%r118; bra $L39; $L38: .loc 1 223 5 cvta.const.u64 %r210,$LC3; mov.u64 %r209,210; call _gfortrani_internal_error39: .loc 1 230 23 st.u64 [%r84],%r211; mov.u64 %r134,%r118; .loc 1 229 26 add.u64 %r118,%r134,1; .loc 1 229 3 add.u64 %r84,%r84,%r98; setp.ne.u64 %r212,%r104,%r118; @ %r212 bra $L39; .loc 1 236 20 ld.u64 %r37,[%r163+16]; add.u64 %r143,%r161,40; add.u64 %r285,%frame,120; mov.u64 %r28,%r285; add.u64 %r80,%r163,40; mov.u64 %r82,%frame; add.u64 %r78,%frame,240; mov.u64 %r54,%r78; .loc 1 232 6 mov.u64 %r116,1; .loc 1 233 10 mov.u64 %r132,0; bra $L41; $L54: mov.u64 %r132,%r117; $L41: .loc 1 235 18 ld.u64 %r213,[%r143]; st.u64 [%r28],%r213; .loc 1 236 20 ld.u64 %r215,[%r80]; mul.lo.u64 %r214,%r215,%r37; .loc 1 236 18 st.u64 [%r82],%r214; .loc 1 237 19 ld.u64 %r217,[%r143+16]; add.u64 %r216,%r217,1; ld.u64 %r218,[%r143+8]; sub.u64 %r43,%r216,%r218; .loc 1 237 17 st.u64 [%r54],%r43; .loc 1 238 10 mul.lo.u64 %r116,%r116,%r43; .loc 1 239 10 setp.le.s64 %r219,%r43,0; @ %r219 bra $L33; .loc 1 233 26 add.u64 %r117,%r132,1; .loc 1 233 3 add.u64 %r143,%r143,24; add.u64 %r28,%r28,8; add.u64 %r80,%r80,24; add.u64 %r82,%r82,8; add.u64 %r54,%r54,8; setp.ne.u64 %r220,%r134,%r132; @ %r220 bra $L54; .loc 1 244 16 cvt.u32.u32 %r222,%r22; cvt.s64.s8 %r221,%r222; shl.b64 %r141,%r221,3; add.u64 %r283,%frame,360; mov.u32 %r227227261 23 ld.u64 %r155,[%frame+120]; .loc 1 261 13 shl.b64 %r157,%r155,3; .loc 1 262 24 ld.u64 %r158,[%frame]; .loc 1 263 34 ld.u64 %r151,[%frame+240]; .loc 1 248 19 ld.u64 %r47,[%r161]; .loc 1 246 6 setp.eq.u32 %r231,%r164,0; @ ! 
%r231 bra $L72; add.u64 %r284,%frame,368; mov.u32 %r293,1; .loc 1 324 33 add.u64 %r296,%frame,240; bra $L42; $L72: .loc 1 248 37 add.u64 %r44,%r116,-1; .loc 1 248 31 shl.b64 %r232,%r44,3; .loc 1 248 12 add.u64 %r96,%r47,%r232; .loc 1 249 32 cvt.s64.s32 %r233,%r105; .loc 1 249 13 mad.lo.u64 %r97,%r233,%r44,%r60; .loc 1 261 13 neg.s64 %r58,%r157; .loc 1 262 14 neg.s64 %r59,%r158; add.u64 %r284,%frame,368; mov.u32 %r287,1; .loc 1 284 33 add.u64 %r290,%frame,240; $L48: ld.u64 %r130,[%frame+360]; mov.u64 %r152,%r130; mov.u32 %r33,%r227; $L46: .loc 1 254 11 ld.s8 %r236,[%r97]; cvt.u16.u32 %r235,%r236; setp.eq.u16 %r237,%r235,0; @ %r237 bra $L43; ld.u64 %r238,[%r96]; setp.ne.u64 %r239,%r238,%r162; @ %r239 bra $L43; setp.eq.u32 %r240,%r33,0; selp.u64 %r152,%r130,%r152,%r240; add.u64 %r139,%frame,248; mov.u64 %r138,%r284; add.u64 %r136,%r78,%r141; $L45: .loc 1 257 37 sub.u64 %r243,%r151,%r152; .loc 1 257 25 st.u64 [%r133],%r243; .loc 1 256 5 add.u64 %r133,%r133,%r98; setp.eq.u64 %r244,%r136,%r139; @ %r244 bra $L33; .loc 1 257 33 ld.u64 %r151,[%r139]; .loc 1 257 44 ld.u64 %r152,[%r138]; add.u64 %r139,%r139,8; add.u64 %r138,%r138,8; bra $L45; $L43: .loc 1 261 13 add.u64 %r96,%r96,%r58; .loc 1 262 14 add.u64 %r97,%r97,%r59; .loc 1 263 14 add.u64 %r152,%r152,1; mov.u32 %r33,%r287; .loc 1 263 6 setp.ne.u64 %r245,%r152,%r151; @ %r245 bra $L46; st.u64 [%frame+360],%r151; mov.u64 %r125,%r284; mov.u64 %r69,%r158; mov.u64 %r72,%r151; mov.u64 %r67,%r155; mov.u64 %r124,8; .loc 1 265 6 mov.u64 %r100,0; .loc 1 270 17 mov.u64 %r247,%r100; $L47: st.u64 [%r125+-8],%r247; .loc 1 273 27 mul.lo.u64 %r62,%r67,%r72; .loc 1 274 28 mul.lo.u64 %r64,%r72,%r69; mov.u64 %r142,%r100; .loc 1 275 9 add.u64 %r100,%r100,1; .loc 1 276 11 setp.eq.u64 %r248,%r142,%r134; @ %r248 bra $L33; .loc 1 280 13 ld.u64 %r249,[%r125]; add.u64 %r66,%r249,1; st.u64 [%r125],%r66; .loc 1 281 20 add.u64 %r251,%r285,%r124; ld.u64 %r67,[%r251]; .loc 1 281 10 sub.u64 %r252,%r62,%r67; shl.b64 %r253,%r252,3; add.u64 %r96,%r96,%r253; .loc 1 282 21 add.u64 %r254,%frame,%r124; ld.u64 %r69,[%r254]; .loc 1 282 11 sub.u64 %r255,%r69,%r64; add.u64 %r97,%r97,%r255; .loc 1 284 33 add.u64 %r257,%r290,%r124; ld.u64 %r72,[%r257]; .loc 1 284 6 add.u64 %r125,%r125,8; add.u64 %r124,%r124,8; setp.eq.u64 %r258,%r66,%r72; @ %r258 bra $L47; bra $L48; $L42: ld.u64 %r55,[%frame+360]; .loc 1 265 6 mov.u64 %r146,%r55; mov.u32 %r126,%r164; $L52: .loc 1 294 11 ld.s8 %r260,[%r60]; cvt.u16.u32 %r259,%r260; setp.eq.u16 %r261,%r259,0; @ %r261 bra $L49; ld.u64 %r262,[%r47]; setp.ne.u64 %r263,%r262,%r162; @ %r263 bra $L49; setp.eq.u32 %r264,%r126,0; selp.u64 %r146,%r55,%r146,%r264; mov.u64 %r119,%r284; add.u64 %r77,%r283,%r141; $L51: .loc 1 297 36 add.u64 %r267,%r146,1; .loc 1 297 25 st.u64 [%r133],%r267; .loc 1 296 5 add.u64 %r133,%r133,%r98; setp.eq.u64 %r268,%r77,%r119; @ %r268 bra $L33; .loc 1 297 32 ld.u64 %r146,[%r119]; add.u64 %r119,%r119,8; bra $L51; $L49: .loc 1 301 13 add.u64 %r47,%r47,%r157; .loc 1 302 14 add.u64 %r60,%r60,%r158; .loc 1 303 14 add.u64 %r146,%r146,1; mov.u32 %r126,%r293; .loc 1 303 6 setp.ne.u64 %r269,%r146,%r151; @ %r269 bra $L52; st.u64 [%frame+360],%r151; mov.u64 %r153,%r284; mov.u64 %r93,%r158; mov.u64 %r95,%r151; mov.u64 %r91,%r155; mov.u64 %r150,8; .loc 1 305 6 mov.u64 %r101,0; .loc 1 310 17 mov.u64 %r271,%r101; $L53: st.u64 [%r153+-8],%r271; .loc 1 313 27 mul.lo.u64 %r86,%r91,%r95; .loc 1 314 28 mul.lo.u64 %r88,%r95,%r93; mov.u64 %r103,%r101; .loc 1 315 9 add.u64 %r101,%r101,1; .loc 1 316 11 setp.eq.u64 %r272,%r103,%r134; @ %r272 bra $L33; .loc 1 320 
13 ld.u64 %r273,[%r153]; add.u64 %r90,%r273,1; st.u64 [%r153],%r90; .loc 1 321 20 add.u64 %r275,%r285,%r150; ld.u64 %r91,[%r275]; .loc 1 321 10 sub.u64 %r276,%r91,%r86; shl.b64 %r277,%r276,3; add.u64 %r47,%r47,%r277; .loc 1 322 21 add.u64 %r278,%frame,%r150; ld.u64 %r93,[%r278]; .loc 1 322 11 sub.u64 %r279,%r93,%r88; add.u64 %r60,%r60,%r279; .loc 1 324 33 add.u64 %r281,%r296,%r150; ld.u64 %r95,[%r281]; .loc 1 324 6 add.u64 %r153,%r153,8; add.u64 %r150,%r150,8; setp.eq.u64 %r282,%r90,%r95; @ %r282 bra $L53; bra $L42; $L33: .loc 1 328_gfortran_sfindloc0_i8 .visible .func _gfortran_sfindloc0_, .param .u32 %in_ar4) {reg .u64 %r24u64 %r38; .reg .u32predar2; mov.u64 %r38,%ar3; mov.u32 %r39,%ar4; .loc 1 345 6 setp.eq.u64 %r40,%r38,0; @ %r40 bra $L74; .loc 1 345 20 ld.u32 %r41,[%r38]; setp.eq.u32 %r42,%r41,0; @ %r42 bra $L75; $L74: .loc 1 347 7 {call _gfortran_findloc0_348 7 bra $L73; $L75: .loc 1 351 8 ld.s8 %r30,[%r36+28]; .loc 1 353 6 setp.gt.s64 %r48,%r30,0; @ %r48 bra $L77; .loc 1 354 5 cvta.const.u64 %r50,$LC0; mov.u64 %r49,0; {_gfortrani_internal_error77: .loc 1 356 15 ld.u64 %r24,[%r35]; .loc 1 356 6 setp.ne.u64 %r51,%r24,0; @ %r51 bra $L78; .loc 1 358 7 st.u64 [%r35+48],%r24; add.u64 %r53,%r30,-1; st.u64 [%r35+56],%r53; mov.u64 %r54,1; st.u64 [%r35+40],%r54; .loc 1 359 28 cvt.u32.u64 %r55,%r54; st.u8 [%r35+28],%r55; .loc 1 360 24 st.u64 [%r35+8],%r24; .loc 1 361 29 mov.u64 call (%value_in),_gfortrani_xmallocarray61 27 st.u64 [%r35],%r24; bra $L79; $L78: .loc 1 363 12 cvta.global.u64 %r63,_gfortrani_compile_options; .loc 1 363 11 ld.u32 %r64,[%r63+36]; setp.eq.u32 %r65,%r64,0; @ %r65 bra $L79; .loc 1 365 8 cvta.const.u64 %r68,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 370 8 ld.u64 %r24,[%r35]; $L79: ld.u64 %r69,[%r35+40]; shl.b64 %r34,%r69,3; mov.u64 %r29,%r24; .loc 1 371 10 mov.u64 %r32,0; .loc 1 372 23 mov.u64 %r70,%r32; $L80: st.u64 [%r29],%r70; .loc 1 371 24 add.u64 %r32,%r32,1; .loc 1 371 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r71,%r30,%r32; @ %r71 bra $L80; $L73: .loc 1 373 1 ret; } findloc0_i16.o/ 1622802251_gfortran_findloc0_i16 .visible .func _gfortran_findloc0_i16.file 1 "../../../../libgfortran/generated/findloc0_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc0_i16 .visible .func _gfortran_mfindloc0_i16_gfortran_sfindloc0_i16 .visible .func _gfortran_sfindloc0_i16VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_internal_error .extern .func _gfortrani_internal_70,73,78,68,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_findloc0_i16 .visible .func _gfortran_findloc0_i1688predpred %r157; .reg .predpredpredpred %r209; .reg .u64 %r210; .reg .u64predmov.u64 %r120,%ar0; mov.u64 %r121,%ar1; mov.u64 %r122,%ar2; ld.u64 %r217,[%r122]; ld.u64 %r218,[%r122+8]; mov.u32 %r124,%ar3; .loc 1 51 10 ld.s8 %r22,[%r121+28]; .loc 1 51 8 cvt.u32.u32 %r125,%r22; cvt.s64.s8 %r80,%r125; .loc 1 52 6 setp.gt.s64 %r126,%r80,0; @ %r126 bra $L2; .loc 1 53 5_gfortran_runtime_errorr23,[%r120]; .loc 1 55 6 setp.ne.u64 %r129,%r23,0; @ %r129 bra $L3; .loc 1 57 7 st.u64 [%r120+48],%r23; 
add.u64 %r131,%r80,-1; st.u64 [%r120+56],%r131; mov.u64 %r132,1; st.u64 [%r120+40],%r132; .loc 1 58 28 cvt.u32.u64 %r133,%r132; st.u8 [%r120+28],%r133; .loc 1 59 24 st.u64 [%r120+8],%r23; .loc 1 60 29 mov.u64 %r136136; call (%value_in),_gfortrani_xmallocarray139,[%value_in]; } mov.u64 %r23,%r139; .loc 1 60 27 st.u64 [%r120],%r23; bra $L4; $L3: .loc 1 64 11 cvta.global.u64 %r141,_gfortrani_compile_options; .loc 1 64 10 ld.u32 %r142,[%r141+36]; setp.eq.u32 %r143,%r142,0; @ %r143 bra $L4; .loc 1 6gfortrani_bounds_iforeach_return.loc 1 70 8 ld.u64 %r23,[%r120]; $L4: ld.u64 %r147,[%r120+40]; shl.b64 %r118,%r147,3; mov.u64 %r102,%r23; .loc 1 69 11 mov.u64 %r116,%r102; .loc 1 73 10 mov.u64 %r89,0; .loc 1 74 23 mov.u64 %r148,%r89; $L5: st.u64 [%r116],%r148; mov.u64 %r103,%r89; .loc 1 73 26 add.u64 %r89,%r103,1; .loc 1 73 3 add.u64 %r116,%r116,%r118; setp.ne.u64 %r149,%r80,%r89; @ %r149 bra $L5; add.u64 %r39,%r121,40; mov.u64 %r74,%frame; .loc 1 76 6 mov.u64 %r87,1; .loc 1 77 10 mov.u64 %r101,0; add.u64 %r221,%frame,120; bra $L7; $L20: mov.u64 %r101,%r88; $L7: .loc 1 79 18 ld.u64 %r150,[%r39]; st.u64 [%r74],%r150; .loc 1 80 19 ld.u64 %r152,[%r39+16]; add.u64 %r151,%r152,1; ld.u64 %r153,[%r39+8]; sub.u64 %r33,%r151,%r153; .loc 1 80 17 shl.b64 %r155,%r101,3; add.u64 %r156,%r221,%r155; st.u64 [%r156],%r33; .loc 1 81 10 mul.lo.u64 %r87,%r87,%r33; .loc 1 82 10 setp.le.s64 %r157,%r33,0; @ %r157 bra $L1; .loc 1 77 26 add.u64 %r88,%r101,1; .loc 1 77 3 add.u64 %r39,%r39,24; add.u64 %r74,%r74,8; setp.ne.u64 %r158,%r103,%r101; @ %r158 bra $L20; .loc 1 87 16 cvt.u32.u32 %r160,%r22; cvt.s64.s8 %r159,%r160; shl.b64 %r95,%r159,3; add.u64 %r219,%frame,240; mov.u32 %r165%r113,[%frame]; .loc 1 104 13 shl.b64 %r115,%r113,4; .loc 1 105 34 ld.u64 %r109,[%frame+120]; .loc 1 91 19 ld.u64 %r45,[%r121]; .loc 1 89 6 setp.eq.u32 %r169,%r124,0; @ ! 
%r169 bra $L32; add.u64 %r220,%frame,248; mov.u32 %r227,1; .loc 1 149 17 mov.u64 %r231,0; bra $L8; $L32: .loc 1 91 37 add.u64 %r170,%r87,-1; .loc 1 91 31 shl.b64 %r171,%r170,4; .loc 1 91 12 add.u64 %r68,%r45,%r171; .loc 1 104 13 neg.s64 %r44,%r115; add.u64 %r220,%frame,248; mov.u32 %r222,1; .loc 1 112 17 mov.u64 %r226,0; $L14: ld.u64 %r38,[%frame+240]; mov.u64 %r110,%r38; mov.u32 %r98,%r165; $L12: .loc 1 97 11 ld.u64 %r172,[%r68]; setp.ne.u64 %r174,%r172,%r217; @ %r174 bra $L9; ld.u64 %r175,[%r68+8]; setp.ne.u64 %r177,%r175,%r218; @ %r177 bra $L9; setp.eq.u32 %r178,%r98,0; selp.u64 %r110,%r38,%r110,%r178; add.u64 %r93,%frame,128; mov.u64 %r92,%r220; add.u64 %r86,%r221,%r95; $L11: .loc 1 100 37 sub.u64 %r182,%r109,%r110; .loc 1 100 25 st.u64 [%r102],%r182; .loc 1 99 5 add.u64 %r102,%r102,%r118; setp.eq.u64 %r183,%r86,%r93; @ %r183 bra $L1; .loc 1 100 33 ld.u64 %r109,[%r93]; .loc 1 100 44 ld.u64 %r110,[%r92]; add.u64 %r93,%r93,8; add.u64 %r92,%r92,8; bra $L11; $L9: .loc 1 104 13 add.u64 %r68,%r68,%r44; .loc 1 105 14 add.u64 %r110,%r110,1; mov.u32 %r98,%r222; .loc 1 105 6 setp.ne.u64 %r184,%r110,%r109; @ %r184 bra $L12; st.u64 [%frame+240],%r109; mov.u64 %r111,%r220; mov.u64 %r53,%r109; mov.u64 %r51,%r113; .loc 1 107 6 mov.u64 %r73,0; $L13: .loc 1 112 17 st.u64 [%r111+-8],%r226; .loc 1 115 27 mul.lo.u64 %r48,%r51,%r53; mov.u64 %r79,%r73; .loc 1 116 9 add.u64 %r73,%r73,1; .loc 1 117 11 setp.eq.u64 %r187,%r79,%r103; @ %r187 bra $L1; .loc 1 121 13 ld.u64 %r188,[%r111]; add.u64 %r50,%r188,1; st.u64 [%r111],%r50; shl.b64 %r106,%r73,3; .loc 1 122 20 add.u64 %r189,%frame,%r106; ld.u64 %r51,[%r189]; .loc 1 122 10 sub.u64 %r190,%r48,%r51; shl.b64 %r191,%r190,4; add.u64 %r68,%r68,%r191; .loc 1 124 33 add.u64 %r193,%r221,%r106; ld.u64 %r53,[%r193]; .loc 1 124 6 add.u64 %r111,%r111,8; setp.eq.u64 %r194,%r50,%r53; @ %r194 bra $L13; bra $L14; $L8: ld.u64 %r96,[%frame+240]; .loc 1 107 6 mov.u64 %r105,%r96; mov.u32 %r59,%r124; $L18: .loc 1 134 11 ld.u64 %r195,[%r45]; setp.ne.u64 %r197,%r195,%r217; @ %r197 bra $L15; ld.u64 %r198,[%r45+8]; setp.ne.u64 %r200,%r198,%r218; @ %r200 bra $L15; setp.eq.u32 %r201,%r59,0; selp.u64 %r105,%r96,%r105,%r201; mov.u64 %r72,%r220; add.u64 %r27,%r219,%r95; $L17: .loc 1 137 36 add.u64 %r204,%r105,1; .loc 1 137 25 st.u64 [%r102],%r204; .loc 1 136 5 add.u64 %r102,%r102,%r118; setp.eq.u64 %r205,%r27,%r72; @ %r205 bra $L1; .loc 1 137 32 ld.u64 %r105,[%r72]; add.u64 %r72,%r72,8; bra $L17; $L15: .loc 1 141 13 add.u64 %r45,%r45,%r115; .loc 1 142 14 add.u64 %r105,%r105,1; mov.u32 %r59,%r227; .loc 1 142 6 setp.ne.u64 %r206,%r105,%r109; @ %r206 bra $L18; st.u64 [%frame+240],%r109; mov.u64 %r71,%r220; mov.u64 %r67,%r109; mov.u64 %r65,%r113; .loc 1 144 6 mov.u64 %r75,0; $L19: .loc 1 149 17 st.u64 [%r71+-8],%r231; .loc 1 152 27 mul.lo.u64 %r62,%r65,%r67; mov.u64 %r78,%r75; .loc 1 153 9 add.u64 %r75,%r75,1; .loc 1 154 11 setp.eq.u64 %r209,%r78,%r103; @ %r209 bra $L1; .loc 1 158 13 ld.u64 %r210,[%r71]; add.u64 %r64,%r210,1; st.u64 [%r71],%r64; shl.b64 %r76,%r75,3; .loc 1 159 20 add.u64 %r211,%frame,%r76; ld.u64 %r65,[%r211]; .loc 1 159 10 sub.u64 %r212,%r65,%r62; shl.b64 %r213,%r212,4; add.u64 %r45,%r45,%r213; .loc 1 161 33 add.u64 %r215,%r221,%r76; ld.u64 %r67,[%r215]; .loc 1 161 6 add.u64 %r71,%r71,8; setp.eq.u64 %r216,%r64,%r67; @ %r216 bra $L19; bra $L8; $L1: .loc 1 165_gfortran_mfindloc0_i16 .visible .func _gfortran_mfindloc0_i16, .param .u32 %in_ar4) {predpredu64u32 %r196; .reg .u32 %r197; .reg .u32u16pred %r207; .reg .pred64predpred %r249; .reg .pred %r250; .reg .u64 %r252; .reg 
.predu64.reg .predpred %r272; .reg .pred %r273; .reg .u64 %r276; .reg .predu64u64mov.u64 %r160,%ar0; mov.u64 %r161,%ar1; mov.u64 %r162,%ar2; ld.u64 %r292,[%r162]; ld.u64 %r293,[%r162+8]; mov.u64 %r164,%ar3; mov.u32 %r165,%ar4; .loc 1 190 10 ld.s8 %r22,[%r161+28]; .loc 1 190 8 cvt.u32.u32 %r166,%r22; cvt.s64.s8 %r104,%r166; .loc 1 191 6 setp.gt.s64 %r167,%r104,0; @ %r167 bra $L34;_gfortran_runtime_error34: .loc 1 194 6 ld.u64 %r170,[%r160]; setp.ne.u64 %r171,%r170,0; @ %r171 bra $L35; .loc 1 196 7 st.u64 [%r160+48],%r170; add.u64 %r173,%r104,-1; st.u64 [%r160+56],%r173; mov.u64 %r174,1; st.u64 [%r160+40],%r174; .loc 1 197 28 cvt.u32.u64 %r175,%r174; st.u8 [%r160+28],%r175; .loc 1 198 24 st.u64 [%r160+8],%r170; .loc 1 199 29 mov.u64 %r178call (%value_in),_gfortrani_xmallocarray,(%out_arg1,%out_arg2); ld.param.u64 %r181,[%value_in]; } .loc 1 199 27 st.u64 [%r160],%r181; bra $L36; $L35: .loc 1 203 11 cvta.global.u64 %r183,_gfortrani_compile_options; .loc 1 203 10 ld.u32 %r184,[%r183+36]; setp.eq.u32 %r185,%r184,0; @ %r185 bra $L36; .loc 1 205 4 cvta.const.u64 %r188,$LC1; {161188; call _gfortrani_bounds_iforeach_return.loc 1 207 4 cvta.const.u64 %r1911619188; call _gfortrani_bounds_equal_extents$L36: .loc 1 212 13 ld.u32 %r105,[%r164+16]; .loc 1 214 9 ld.u64 %r60,[%r164]; .loc 1 216 53 add.u32 %r193,%r105,-4; and.b32 %r194,%r193,-5; set.u32.eq.u32 %r196,%r194,0; neg.s32 %r197,%r196; .loc 1 216 22 add.u32 %r198,%r105,-1; set.u32.le.u32 %r200,%r198,1; neg.s32 %r201,%r200; .loc 1 216 6 cvt.u16.u32 %r203,%r197; cvt.u16.u32 %r204,%r201; or.b16 %r202,%r203,%r204; cvt.u32.u16 %r205,%r202; cvt.u16.u8 %r206,%r205; setp.ne.u16 %r207,%r206,0; @ %r207 bra $L37; .loc 1 218 7 setp.ne.u32 %r208,%r105,16; @ %r208 bra $L38; $L37: ld.u64 %r209,[%r160+40]; shl.b64 %r98,%r209,3; ld.u64 %r133,[%r160]; .loc 1 226 8 mov.u64 %r84,%r133; .loc 1 229 10 mov.u64 %r118,0; .loc 1 230 23 mov.u64 %r212,%r118; bra $L39; $L38: .loc 1 223 5 cvta.const.u64 %r211,$LC3; mov.u64 %r210,211; call _gfortrani_internal_error39: .loc 1 230 23 st.u64 [%r84],%r212; mov.u64 %r134,%r118; .loc 1 229 26 add.u64 %r118,%r134,1; .loc 1 229 3 add.u64 %r84,%r84,%r98; setp.ne.u64 %r213,%r104,%r118; @ %r213 bra $L39; .loc 1 236 20 ld.u64 %r37,[%r164+16]; add.u64 %r143,%r161,40; add.u64 %r296,%frame,120; mov.u64 %r28,%r296; add.u64 %r80,%r164,40; mov.u64 %r82,%frame; add.u64 %r78,%frame,240; mov.u64 %r54,%r78; .loc 1 232 6 mov.u64 %r116,1; .loc 1 233 10 mov.u64 %r132,0; bra $L41; $L54: mov.u64 %r132,%r117; $L41: .loc 1 235 18 ld.u64 %r214,[%r143]; st.u64 [%r28],%r214; .loc 1 236 20 ld.u64 %r216,[%r80]; mul.lo.u64 %r215,%r216,%r37; .loc 1 236 18 st.u64 [%r82],%r215; .loc 1 237 19 ld.u64 %r218,[%r143+16]; add.u64 %r217,%r218,1; ld.u64 %r219,[%r143+8]; sub.u64 %r43,%r217,%r219; .loc 1 237 17 st.u64 [%r54],%r43; .loc 1 238 10 mul.lo.u64 %r116,%r116,%r43; .loc 1 239 10 setp.le.s64 %r220,%r43,0; @ %r220 bra $L33; .loc 1 233 26 add.u64 %r117,%r132,1; .loc 1 233 3 add.u64 %r143,%r143,24; add.u64 %r28,%r28,8; add.u64 %r80,%r80,24; add.u64 %r82,%r82,8; add.u64 %r54,%r54,8; setp.ne.u64 %r221,%r134,%r132; @ %r221 bra $L54; .loc 1 244 16 cvt.u32.u32 %r223,%r22; cvt.s64.s8 %r222,%r223; shl.b64 %r141,%r222,3; add.u64 %r294,%frame,360; mov.u32 %r228261 23 ld.u64 %r155,[%frame+120]; .loc 1 261 13 shl.b64 %r157,%r155,4; .loc 1 262 24 ld.u64 %r158,[%frame]; .loc 1 263 34 ld.u64 %r151,[%frame+240]; .loc 1 248 19 ld.u64 %r47,[%r161]; .loc 1 246 6 setp.eq.u32 %r232,%r165,0; @ ! 
%r232 bra $L72; add.u64 %r295,%frame,368; mov.u32 %r304,1; .loc 1 324 33 add.u64 %r307,%frame,240; bra $L42; $L72: .loc 1 248 37 add.u64 %r44,%r116,-1; .loc 1 248 31 shl.b64 %r233,%r44,4; .loc 1 248 12 add.u64 %r96,%r47,%r233; .loc 1 249 32 cvt.s64.s32 %r234,%r105; .loc 1 249 13 mad.lo.u64 %r97,%r234,%r44,%r60; .loc 1 261 13 neg.s64 %r58,%r157; .loc 1 262 14 neg.s64 %r59,%r158; add.u64 %r295,%frame,368; mov.u32 %r298,1; .loc 1 284 33 add.u64 %r301,%frame,240; $L48: ld.u64 %r130,[%frame+360]; mov.u64 %r152,%r130; mov.u32 %r33,%r228; $L46: .loc 1 254 11 ld.s8 %r237,[%r97]; cvt.u16.u32 %r236,%r237; setp.eq.u16 %r238,%r236,0; @ %r238 bra $L43; ld.u64 %r239,[%r96]; setp.ne.u64 %r241,%r239,%r292; @ %r241 bra $L43; ld.u64 %r242,[%r96+8]; setp.ne.u64 %r244,%r242,%r293; @ %r244 bra $L43; setp.eq.u32 %r245,%r33,0; selp.u64 %r152,%r130,%r152,%r245; add.u64 %r139,%frame,248; mov.u64 %r138,%r295; add.u64 %r136,%r78,%r141; $L45: .loc 1 257 37 sub.u64 %r248,%r151,%r152; .loc 1 257 25 st.u64 [%r133],%r248; .loc 1 256 5 add.u64 %r133,%r133,%r98; setp.eq.u64 %r249,%r136,%r139; @ %r249 bra $L33; .loc 1 257 33 ld.u64 %r151,[%r139]; .loc 1 257 44 ld.u64 %r152,[%r138]; add.u64 %r139,%r139,8; add.u64 %r138,%r138,8; bra $L45; $L43: .loc 1 261 13 add.u64 %r96,%r96,%r58; .loc 1 262 14 add.u64 %r97,%r97,%r59; .loc 1 263 14 add.u64 %r152,%r152,1; mov.u32 %r33,%r298; .loc 1 263 6 setp.ne.u64 %r250,%r152,%r151; @ %r250 bra $L46; st.u64 [%frame+360],%r151; mov.u64 %r125,%r295; mov.u64 %r69,%r158; mov.u64 %r72,%r151; mov.u64 %r67,%r155; mov.u64 %r124,8; .loc 1 265 6 mov.u64 %r100,0; .loc 1 270 17 mov.u64 %r252,%r100; $L47: st.u64 [%r125+-8],%r252; .loc 1 273 27 mul.lo.u64 %r62,%r67,%r72; .loc 1 274 28 mul.lo.u64 %r64,%r72,%r69; mov.u64 %r142,%r100; .loc 1 275 9 add.u64 %r100,%r100,1; .loc 1 276 11 setp.eq.u64 %r253,%r142,%r134; @ %r253 bra $L33; .loc 1 280 13 ld.u64 %r254,[%r125]; add.u64 %r66,%r254,1; st.u64 [%r125],%r66; .loc 1 281 20 add.u64 %r256,%r296,%r124; ld.u64 %r67,[%r256]; .loc 1 281 10 sub.u64 %r257,%r62,%r67; shl.b64 %r258,%r257,4; add.u64 %r96,%r96,%r258; .loc 1 282 21 add.u64 %r259,%frame,%r124; ld.u64 %r69,[%r259]; .loc 1 282 11 sub.u64 %r260,%r69,%r64; add.u64 %r97,%r97,%r260; .loc 1 284 33 add.u64 %r262,%r301,%r124; ld.u64 %r72,[%r262]; .loc 1 284 6 add.u64 %r125,%r125,8; add.u64 %r124,%r124,8; setp.eq.u64 %r263,%r66,%r72; @ %r263 bra $L47; bra $L48; $L42: ld.u64 %r55,[%frame+360]; .loc 1 265 6 mov.u64 %r146,%r55; mov.u32 %r126,%r165; $L52: .loc 1 294 11 ld.s8 %r265,[%r60]; cvt.u16.u32 %r264,%r265; setp.eq.u16 %r266,%r264,0; @ %r266 bra $L49; ld.u64 %r267,[%r47]; setp.ne.u64 %r269,%r267,%r292; @ %r269 bra $L49; ld.u64 %r270,[%r47+8]; setp.ne.u64 %r272,%r270,%r293; @ %r272 bra $L49; setp.eq.u32 %r273,%r126,0; selp.u64 %r146,%r55,%r146,%r273; mov.u64 %r119,%r295; add.u64 %r77,%r294,%r141; $L51: .loc 1 297 36 add.u64 %r276,%r146,1; .loc 1 297 25 st.u64 [%r133],%r276; .loc 1 296 5 add.u64 %r133,%r133,%r98; setp.eq.u64 %r277,%r77,%r119; @ %r277 bra $L33; .loc 1 297 32 ld.u64 %r146,[%r119]; add.u64 %r119,%r119,8; bra $L51; $L49: .loc 1 301 13 add.u64 %r47,%r47,%r157; .loc 1 302 14 add.u64 %r60,%r60,%r158; .loc 1 303 14 add.u64 %r146,%r146,1; mov.u32 %r126,%r304; .loc 1 303 6 setp.ne.u64 %r278,%r146,%r151; @ %r278 bra $L52; st.u64 [%frame+360],%r151; mov.u64 %r153,%r295; mov.u64 %r93,%r158; mov.u64 %r95,%r151; mov.u64 %r91,%r155; mov.u64 %r150,8; .loc 1 305 6 mov.u64 %r101,0; .loc 1 310 17 mov.u64 %r280,%r101; $L53: st.u64 [%r153+-8],%r280; .loc 1 313 27 mul.lo.u64 %r86,%r91,%r95; .loc 1 314 28 mul.lo.u64 
%r88,%r95,%r93; mov.u64 %r103,%r101; .loc 1 315 9 add.u64 %r101,%r101,1; .loc 1 316 11 setp.eq.u64 %r281,%r103,%r134; @ %r281 bra $L33; .loc 1 320 13 ld.u64 %r282,[%r153]; add.u64 %r90,%r282,1; st.u64 [%r153],%r90; .loc 1 321 20 add.u64 %r284,%r296,%r150; ld.u64 %r91,[%r284]; .loc 1 321 10 sub.u64 %r285,%r91,%r86; shl.b64 %r286,%r285,4; add.u64 %r47,%r47,%r286; .loc 1 322 21 add.u64 %r287,%frame,%r150; ld.u64 %r93,[%r287]; .loc 1 322 11 sub.u64 %r288,%r93,%r88; add.u64 %r60,%r60,%r288; .loc 1 324 33 add.u64 %r290,%r307,%r150; ld.u64 %r95,[%r290]; .loc 1 324 6 add.u64 %r153,%r153,8; add.u64 %r150,%r150,8; setp.eq.u64 %r291,%r90,%r95; @ %r291 bra $L53; bra $L42; $L33: .loc 1 328_gfortran_sfindloc0_i16 .visible .func _gfortran_sfindloc0_i16, .param .u32 %in_ar4) {u64ar2; ld.u64 %r75,[%r37]; ld.u64 %r76,[%r37+8]; mov.u64 %r39,%ar3; mov.u32 %r40,%ar4; .loc 1 345 6 setp.eq.u64 %r41,%r39,0; @ %r41 bra $L74; .loc 1 345 20 ld.u32 %r42,[%r39]; setp.eq.u32 %r43,%r42,0; @ %r43 bra $L75; $L74: .loc 1 347 7 st.u64 [%frame],%r75; st.u64 [%frame+8],%r76; {framecall _gfortran_findloc0_i16348 7 bra $L73; $L75: .loc 1 351 8 ld.s8 %r30,[%r36+28]; .loc 1 353 6 setp.gt.s64 %r51,%r30,0; @ %r51 bra $L77; .loc 1 354 5 cvta.const.u64 %r53,$LC0; mov.u64 %r52,0; {call _gfortrani_internal_error77: .loc 1 356 15 ld.u64 %r24,[%r35]; .loc 1 356 6 setp.ne.u64 %r54,%r24,0; @ %r54 bra $L78; .loc 1 358 7 st.u64 [%r35+48],%r24; add.u64 %r56,%r30,-1; st.u64 [%r35+56],%r56; mov.u64 %r57,1; st.u64 [%r35+40],%r57; .loc 1 359 28 cvt.u32.u64 %r58,%r57; st.u8 [%r35+28],%r58; .loc 1 360 24 st.u64 [%r35+8],%r24; .loc 1 361 29 mov.u64 call (%value_in),_gfortrani_xmallocarray61 27 st.u64 [%r35],%r24; bra $L79; $L78: .loc 1 363 12 cvta.global.u64 %r66,_gfortrani_compile_options; .loc 1 363 11 ld.u32 %r67,[%r66+36]365 8 cvta.const.u64 %r71,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 370 8 ld.u64 %r24,[%r35]; $L79: ld.u64 %r72,[%r35+40]; shl.b64 %r34,%r72,3; mov.u64 %r29,%r24; .loc 1 371 10 mov.u64 %r32,0; .loc 1 372 23 mov.u64 %r73,%r32; $L80: st.u64 [%r29],%r73; .loc 1 371 24 add.u64 %r32,%r32,1; .loc 1 371 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r74,%r30,%r32; @ %r74 bra $L80; $L7findloc0_r4.o/_gfortran_findloc0_r4 .visible .func _gfortran_findloc0_r../libgfortran/generated/findloc0_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc0_r4 .visible .func _gfortran_mfindloc0_r_gfortran_sfindloc0_r4 .visible .func _gfortran_sfindloc0_rVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_internal_error .extern .func _gfortrani_internal_70,73,78,68,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_findloc0_r4 .visible .func _gfortran_findloc0_r688pred.reg .u64 %r140; .reg .u32 %r141; .reg .predpredu64predu64pred %r192; .reg .u64 %r195; .reg .predpredu32 %r211; .reg .u64 %r215; .reg .u32 %r216; .reg .u64 %r220; mov.u64 %r120,%ar0; mov.u64 %r121,%ar1; mov.f32 %r122,%ar2; mov.u32 %r123,%ar3; .loc 1 51 10 ld.s8 %r22,[%r121+28]; .loc 1 51 8 cvt.u32.u32 %r124,%r22; cvt.s64.s8 %r80,%r124; .loc 1 52 6 setp.gt.s64 %r125,%r80,0; @ 
%r125 bra $L2; .loc 1 53 5 cvta.const.u64 %r126,$LCstack; call _gfortran_runtime_errorr23,[%r120]; .loc 1 55 6 setp.ne.u64 %r128,%r23,0; @ %r128 bra $L3; .loc 1 57 7 st.u64 [%r120+48],%r23; add.u64 %r130,%r80,-1; st.u64 [%r120+56],%r130; mov.u64 %r131,1; st.u64 [%r120+40],%r131; .loc 1 58 28 cvt.u32.u64 %r132,%r131; st.u8 [%r120+28],%r132; .loc 1 59 24 st.u64 [%r120+8],%r23; .loc 1 60 29 mov.u64 %r135135; call (%value_in),_gfortrani_xmallocarray138,[%value_in]; } mov.u64 %r23,%r138; .loc 1 60 27 st.u64 [%r120],%r23; bra $L4; $L3: .loc 1 64 11 cvta.global.u64 %r140,_gfortrani_compile_options; .loc 1 64 10 ld.u32 %r141,[%r140+36]; setp.eq.u32 %r142,%r141,0; @ %r142 bra $L4; .loc 1 6gfortrani_bounds_iforeach_return.loc 1 70 8 ld.u64 %r23,[%r120]; $L4: ld.u64 %r146,[%r120+40]; shl.b64 %r118,%r146,3; mov.u64 %r102,%r23; .loc 1 69 11 mov.u64 %r116,%r102; .loc 1 73 10 mov.u64 %r89,0; .loc 1 74 23 mov.u64 %r147,%r89; $L5: st.u64 [%r116],%r147; mov.u64 %r103,%r89; .loc 1 73 26 add.u64 %r89,%r103,1; .loc 1 73 3 add.u64 %r116,%r116,%r118; setp.ne.u64 %r148,%r80,%r89; @ %r148 bra $L5; add.u64 %r39,%r121,40; mov.u64 %r74,%frame; .loc 1 76 6 mov.u64 %r87,1; .loc 1 77 10 mov.u64 %r101,0; add.u64 %r208,%frame,120; bra $L7; $L20: mov.u64 %r101,%r88; $L7: .loc 1 79 18 ld.u64 %r149,[%r39]; st.u64 [%r74],%r149; .loc 1 80 19 ld.u64 %r151,[%r39+16]; add.u64 %r150,%r151,1; ld.u64 %r152,[%r39+8]; sub.u64 %r33,%r150,%r152; .loc 1 80 17 shl.b64 %r154,%r101,3; add.u64 %r155,%r208,%r154; st.u64 [%r155],%r33; .loc 1 81 10 mul.lo.u64 %r87,%r87,%r33; .loc 1 82 10 setp.le.s64 %r156,%r33,0; @ %r156 bra $L1; .loc 1 77 26 add.u64 %r88,%r101,1; .loc 1 77 3 add.u64 %r39,%r39,24; add.u64 %r74,%r74,8; setp.ne.u64 %r157,%r103,%r101; @ %r157 bra $L20; .loc 1 87 16 cvt.u32.u32 %r159,%r22; cvt.s64.s8 %r158,%r159; shl.b64 %r95,%r158,3; add.u64 %r209,%frame,240; mov.u32 %r164%r113,[%frame]; .loc 1 104 13 shl.b64 %r115,%r113,2; .loc 1 105 34 ld.u64 %r109,[%frame+120]; .loc 1 91 19 ld.u64 %r45,[%r121]; .loc 1 89 6 setp.eq.u32 %r168,%r123,0; @ ! 
%r168 bra $L32; add.u64 %r210,%frame,248; mov.u32 %r216,1; .loc 1 149 17 mov.u64 %r220,0; bra $L8; $L32: .loc 1 91 37 add.u64 %r169,%r87,-1; .loc 1 91 31 shl.b64 %r170,%r169,2; .loc 1 91 12 add.u64 %r68,%r45,%r170; .loc 1 104 13 neg.s64 %r44,%r115; add.u64 %r210,%frame,248; mov.u32 %r211,1; .loc 1 112 17 mov.u64 %r215,0; $L14: ld.u64 %r38,[%frame+240]; mov.u64 %r110,%r38; mov.u32 %r98,%r164; $L12: .loc 1 97 11 ld.f32 %r171,[%r68]; setp.neu.f32 %r172,%r171,%r122; @ %r172 bra $L9; setp.eq.u32 %r173,%r98,0; selp.u64 %r110,%r38,%r110,%r173; add.u64 %r93,%frame,128; mov.u64 %r92,%r210; add.u64 %r86,%r208,%r95; $L11: .loc 1 100 37 sub.u64 %r177,%r109,%r110; .loc 1 100 25 st.u64 [%r102],%r177; .loc 1 99 5 add.u64 %r102,%r102,%r118; setp.eq.u64 %r178,%r86,%r93; @ %r178 bra $L1; .loc 1 100 33 ld.u64 %r109,[%r93]; .loc 1 100 44 ld.u64 %r110,[%r92]; add.u64 %r93,%r93,8; add.u64 %r92,%r92,8; bra $L11; $L9: .loc 1 104 13 add.u64 %r68,%r68,%r44; .loc 1 105 14 add.u64 %r110,%r110,1; mov.u32 %r98,%r211; .loc 1 105 6 setp.ne.u64 %r179,%r110,%r109; @ %r179 bra $L12; st.u64 [%frame+240],%r109; mov.u64 %r111,%r210; mov.u64 %r53,%r109; mov.u64 %r51,%r113; .loc 1 107 6 mov.u64 %r73,0; $L13: .loc 1 112 17 st.u64 [%r111+-8],%r215; .loc 1 115 27 mul.lo.u64 %r48,%r51,%r53; mov.u64 %r79,%r73; .loc 1 116 9 add.u64 %r73,%r73,1; .loc 1 117 11 setp.eq.u64 %r182,%r79,%r103; @ %r182 bra $L1; .loc 1 121 13 ld.u64 %r183,[%r111]; add.u64 %r50,%r183,1; st.u64 [%r111],%r50; shl.b64 %r106,%r73,3; .loc 1 122 20 add.u64 %r184,%frame,%r106; ld.u64 %r51,[%r184]; .loc 1 122 10 sub.u64 %r185,%r48,%r51; shl.b64 %r186,%r185,2; add.u64 %r68,%r68,%r186; .loc 1 124 33 add.u64 %r188,%r208,%r106; ld.u64 %r53,[%r188]; .loc 1 124 6 add.u64 %r111,%r111,8; setp.eq.u64 %r189,%r50,%r53; @ %r189 bra $L13; bra $L14; $L8: ld.u64 %r96,[%frame+240]; .loc 1 107 6 mov.u64 %r105,%r96; mov.u32 %r59,%r123; $L18: .loc 1 134 11 ld.f32 %r190,[%r45]; setp.neu.f32 %r191,%r190,%r122; @ %r191 bra $L15; setp.eq.u32 %r192,%r59,0; selp.u64 %r105,%r96,%r105,%r192; mov.u64 %r72,%r210; add.u64 %r27,%r209,%r95; $L17: .loc 1 137 36 add.u64 %r195,%r105,1; .loc 1 137 25 st.u64 [%r102],%r195; .loc 1 136 5 add.u64 %r102,%r102,%r118; setp.eq.u64 %r196,%r27,%r72; @ %r196 bra $L1; .loc 1 137 32 ld.u64 %r105,[%r72]; add.u64 %r72,%r72,8; bra $L17; $L15: .loc 1 141 13 add.u64 %r45,%r45,%r115; .loc 1 142 14 add.u64 %r105,%r105,1; mov.u32 %r59,%r216; .loc 1 142 6 setp.ne.u64 %r197,%r105,%r109; @ %r197 bra $L18; st.u64 [%frame+240],%r109; mov.u64 %r71,%r210; mov.u64 %r67,%r109; mov.u64 %r65,%r113; .loc 1 144 6 mov.u64 %r75,0; $L19: .loc 1 149 17 st.u64 [%r71+-8],%r220; .loc 1 152 27 mul.lo.u64 %r62,%r65,%r67; mov.u64 %r78,%r75; .loc 1 153 9 add.u64 %r75,%r75,1; .loc 1 154 11 setp.eq.u64 %r200,%r78,%r103; @ %r200 bra $L1; .loc 1 158 13 ld.u64 %r201,[%r71]; add.u64 %r64,%r201,1; st.u64 [%r71],%r64; shl.b64 %r76,%r75,3; .loc 1 159 20 add.u64 %r202,%frame,%r76; ld.u64 %r65,[%r202]; .loc 1 159 10 sub.u64 %r203,%r65,%r62; shl.b64 %r204,%r203,2; add.u64 %r45,%r45,%r204; .loc 1 161 33 add.u64 %r206,%r208,%r76; ld.u64 %r67,[%r206]; .loc 1 161 6 add.u64 %r71,%r71,8; setp.eq.u64 %r207,%r64,%r67; @ %r207 bra $L19; bra $L8; $L1: .loc 1 165_gfortran_mfindloc0_r4 .visible .func _gfortran_mfindloc0_rpred %r166; .reg .u64pred %r184; .reg .u64 %r187; .reg .u64 %r190; .reg .u32u64 %r213; .reg .u64u32 %r227; .reg .u64 %r229; .reg .pred %r231; .reg .u64 %r232; .reg .u64 %r233; .reg .u16u64 %r255; .reg .u64 %r257; .reg .pred %r258; .reg .u16 %r259; .reg .u32 %r260; .reg .predmov.u64 %r160,%ar0; mov.u64 
%r161,%ar1; mov.f32 %r162,%ar2; mov.u64 %r163,%ar3; mov.u32 %r164,%ar4; .loc 1 190 10 ld.s8 %r22,[%r161+28]; .loc 1 190 8 cvt.u32.u32 %r165,%r22; cvt.s64.s8 %r104,%r165; .loc 1 191 6 setp.gt.s64 %r166,%r104,0; @ %r166 bra $L34; .loc 1 192 5 cvta.const.u64 %r16167_gfortran_runtime_error34: .loc 1 194 6 ld.u64 %r169,[%r160; .loc 1 196 7 st.u64 [%r160+48],%r169; add.u64 %r172,%r104,-1; st.u64 [%r160+56],%r172; mov.u64 %r173,1; st.u64 [%r160+40],%r173; .loc 1 197 28 cvt.u32.u64 %r174,%r173; st.u8 [%r160+28],%r174; .loc 1 198 24 st.u64 [%r160+8],%r169; .loc 1 199 29 mov.u64 %r177call (%value_in),_gfortrani_xmallocarray,(%out_arg1,%out_arg2); ld.param.u64 %r180,[%value_in]; } .loc 1 199 27 st.u64 [%r160],%r180; bra $L36; $L35: .loc 1 203 11 cvta.global.u64 %r182,_gfortrani_compile_options; .loc 1 203 10 ld.u32 %r183,[%r182+36]; setp.eq.u32 %r184,%r183,0; @ %r184 bra $L36; .loc 1 205 4 cvta.const.u64 %r187,$LC1; {161187; call _gfortrani_bounds_iforeach_return.loc 1 207 4 cvta.const.u64 %r19016r161190187; call _gfortrani_bounds_equal_extents$L36: .loc 1 212 13 ld.u32 %r105,[%r163+16]; .loc 1 214 9 ld.u64 %r60,[%r163]; .loc 1 216 53 add.u32 %r192,%r105,-4; and.b32 %r193,%r192,-5; set.u32.eq.u32 %r195,%r193,0; neg.s32 %r196,%r195; .loc 1 216 22 add.u32 %r197,%r105,-1; set.u32.le.u32 %r199,%r197,1; neg.s32 %r200,%r199; .loc 1 216 6 cvt.u16.u32 %r202,%r196; cvt.u16.u32 %r203,%r200; or.b16 %r201,%r202,%r203;%r206 bra $L37; .loc 1 218 7 setp.ne.u32 %r207,%r105,16; @ %r207 bra $L38; $L37: ld.u64 %r208,[%r160+40]; shl.b64 %r98,%r208,3; ld.u64 %r133,[%r160]; .loc 1 226 8 mov.u64 %r84,%r133; .loc 1 229 10 mov.u64 %r118,0; .loc 1 230 23 mov.u64 %r211,%r118; bra $L39; $L38: .loc 1 223 5 cvta.const.u64 %r210,$LC3; mov.u64 %r209,210; call _gfortrani_internal_error39: .loc 1 230 23 st.u64 [%r84],%r211; mov.u64 %r134,%r118; .loc 1 229 26 add.u64 %r118,%r134,1; .loc 1 229 3 add.u64 %r84,%r84,%r98; setp.ne.u64 %r212,%r104,%r118; @ %r212 bra $L39; .loc 1 236 20 ld.u64 %r37,[%r163+16]; add.u64 %r143,%r161,40; add.u64 %r285,%frame,120; mov.u64 %r28,%r285; add.u64 %r80,%r163,40; mov.u64 %r82,%frame; add.u64 %r78,%frame,240; mov.u64 %r54,%r78; .loc 1 232 6 mov.u64 %r116,1; .loc 1 233 10 mov.u64 %r132,0; bra $L41; $L54: mov.u64 %r132,%r117; $L41: .loc 1 235 18 ld.u64 %r213,[%r143]; st.u64 [%r28],%r213; .loc 1 236 20 ld.u64 %r215,[%r80]; mul.lo.u64 %r214,%r215,%r37; .loc 1 236 18 st.u64 [%r82],%r214; .loc 1 237 19 ld.u64 %r217,[%r143+16]; add.u64 %r216,%r217,1; ld.u64 %r218,[%r143+8]; sub.u64 %r43,%r216,%r218; .loc 1 237 17 st.u64 [%r54],%r43; .loc 1 238 10 mul.lo.u64 %r116,%r116,%r43; .loc 1 239 10 setp.le.s64 %r219,%r43,0; @ %r219 bra $L33; .loc 1 233 26 add.u64 %r117,%r132,1; .loc 1 233 3 add.u64 %r143,%r143,24; add.u64 %r28,%r28,8; add.u64 %r80,%r80,24; add.u64 %r82,%r82,8; add.u64 %r54,%r54,8; setp.ne.u64 %r220,%r134,%r132; @ %r220 bra $L54; .loc 1 244 16 cvt.u32.u32 %r222,%r22; cvt.s64.s8 %r221,%r222; shl.b64 %r141,%r221,3; add.u64 %r283,%frame,360; mov.u32 %r227227261 23 ld.u64 %r155,[%frame+120]; .loc 1 261 13 shl.b64 %r157,%r155,2; .loc 1 262 24 ld.u64 %r158,[%frame]; .loc 1 263 34 ld.u64 %r151,[%frame+240]; .loc 1 248 19 ld.u64 %r47,[%r161]; .loc 1 246 6 setp.eq.u32 %r231,%r164,0; @ ! 
%r231 bra $L72; add.u64 %r284,%frame,368; mov.u32 %r293,1; .loc 1 324 33 add.u64 %r296,%frame,240; bra $L42; $L72: .loc 1 248 37 add.u64 %r44,%r116,-1; .loc 1 248 31 shl.b64 %r232,%r44,2; .loc 1 248 12 add.u64 %r96,%r47,%r232; .loc 1 249 32 cvt.s64.s32 %r233,%r105; .loc 1 249 13 mad.lo.u64 %r97,%r233,%r44,%r60; .loc 1 261 13 neg.s64 %r58,%r157; .loc 1 262 14 neg.s64 %r59,%r158; add.u64 %r284,%frame,368; mov.u32 %r287,1; .loc 1 284 33 add.u64 %r290,%frame,240; $L48: ld.u64 %r130,[%frame+360]; mov.u64 %r152,%r130; mov.u32 %r33,%r227; $L46: .loc 1 254 11 ld.s8 %r236,[%r97]; cvt.u16.u32 %r235,%r236; setp.eq.u16 %r237,%r235,0; @ %r237 bra $L43; ld.f32 %r238,[%r96]; setp.neu.f32 %r239,%r238,%r162; @ %r239 bra $L43; setp.eq.u32 %r240,%r33,0; selp.u64 %r152,%r130,%r152,%r240; add.u64 %r139,%frame,248; mov.u64 %r138,%r284; add.u64 %r136,%r78,%r141; $L45: .loc 1 257 37 sub.u64 %r243,%r151,%r152; .loc 1 257 25 st.u64 [%r133],%r243; .loc 1 256 5 add.u64 %r133,%r133,%r98; setp.eq.u64 %r244,%r136,%r139; @ %r244 bra $L33; .loc 1 257 33 ld.u64 %r151,[%r139]; .loc 1 257 44 ld.u64 %r152,[%r138]; add.u64 %r139,%r139,8; add.u64 %r138,%r138,8; bra $L45; $L43: .loc 1 261 13 add.u64 %r96,%r96,%r58; .loc 1 262 14 add.u64 %r97,%r97,%r59; .loc 1 263 14 add.u64 %r152,%r152,1; mov.u32 %r33,%r287; .loc 1 263 6 setp.ne.u64 %r245,%r152,%r151; @ %r245 bra $L46; st.u64 [%frame+360],%r151; mov.u64 %r125,%r284; mov.u64 %r69,%r158; mov.u64 %r72,%r151; mov.u64 %r67,%r155; mov.u64 %r124,8; .loc 1 265 6 mov.u64 %r100,0; .loc 1 270 17 mov.u64 %r247,%r100; $L47: st.u64 [%r125+-8],%r247; .loc 1 273 27 mul.lo.u64 %r62,%r67,%r72; .loc 1 274 28 mul.lo.u64 %r64,%r72,%r69; mov.u64 %r142,%r100; .loc 1 275 9 add.u64 %r100,%r100,1; .loc 1 276 11 setp.eq.u64 %r248,%r142,%r134; @ %r248 bra $L33; .loc 1 280 13 ld.u64 %r249,[%r125]; add.u64 %r66,%r249,1; st.u64 [%r125],%r66; .loc 1 281 20 add.u64 %r251,%r285,%r124; ld.u64 %r67,[%r251]; .loc 1 281 10 sub.u64 %r252,%r62,%r67; shl.b64 %r253,%r252,2; add.u64 %r96,%r96,%r253; .loc 1 282 21 add.u64 %r254,%frame,%r124; ld.u64 %r69,[%r254]; .loc 1 282 11 sub.u64 %r255,%r69,%r64; add.u64 %r97,%r97,%r255; .loc 1 284 33 add.u64 %r257,%r290,%r124; ld.u64 %r72,[%r257]; .loc 1 284 6 add.u64 %r125,%r125,8; add.u64 %r124,%r124,8; setp.eq.u64 %r258,%r66,%r72; @ %r258 bra $L47; bra $L48; $L42: ld.u64 %r55,[%frame+360]; .loc 1 265 6 mov.u64 %r146,%r55; mov.u32 %r126,%r164; $L52: .loc 1 294 11 ld.s8 %r260,[%r60]; cvt.u16.u32 %r259,%r260; setp.eq.u16 %r261,%r259,0; @ %r261 bra $L49; ld.f32 %r262,[%r47]; setp.neu.f32 %r263,%r262,%r162; @ %r263 bra $L49; setp.eq.u32 %r264,%r126,0; selp.u64 %r146,%r55,%r146,%r264; mov.u64 %r119,%r284; add.u64 %r77,%r283,%r141; $L51: .loc 1 297 36 add.u64 %r267,%r146,1; .loc 1 297 25 st.u64 [%r133],%r267; .loc 1 296 5 add.u64 %r133,%r133,%r98; setp.eq.u64 %r268,%r77,%r119; @ %r268 bra $L33; .loc 1 297 32 ld.u64 %r146,[%r119]; add.u64 %r119,%r119,8; bra $L51; $L49: .loc 1 301 13 add.u64 %r47,%r47,%r157; .loc 1 302 14 add.u64 %r60,%r60,%r158; .loc 1 303 14 add.u64 %r146,%r146,1; mov.u32 %r126,%r293; .loc 1 303 6 setp.ne.u64 %r269,%r146,%r151; @ %r269 bra $L52; st.u64 [%frame+360],%r151; mov.u64 %r153,%r284; mov.u64 %r93,%r158; mov.u64 %r95,%r151; mov.u64 %r91,%r155; mov.u64 %r150,8; .loc 1 305 6 mov.u64 %r101,0; .loc 1 310 17 mov.u64 %r271,%r101; $L53: st.u64 [%r153+-8],%r271; .loc 1 313 27 mul.lo.u64 %r86,%r91,%r95; .loc 1 314 28 mul.lo.u64 %r88,%r95,%r93; mov.u64 %r103,%r101; .loc 1 315 9 add.u64 %r101,%r101,1; .loc 1 316 11 setp.eq.u64 %r272,%r103,%r134; @ %r272 bra $L33; .loc 1 320 
13 ld.u64 %r273,[%r153]; add.u64 %r90,%r273,1; st.u64 [%r153],%r90; .loc 1 321 20 add.u64 %r275,%r285,%r150; ld.u64 %r91,[%r275]; .loc 1 321 10 sub.u64 %r276,%r91,%r86; shl.b64 %r277,%r276,2; add.u64 %r47,%r47,%r277; .loc 1 322 21 add.u64 %r278,%frame,%r150; ld.u64 %r93,[%r278]; .loc 1 322 11 sub.u64 %r279,%r93,%r88; add.u64 %r60,%r60,%r279; .loc 1 324 33 add.u64 %r281,%r296,%r150; ld.u64 %r95,[%r281]; .loc 1 324 6 add.u64 %r153,%r153,8; add.u64 %r150,%r150,8; setp.eq.u64 %r282,%r90,%r95; @ %r282 bra $L53; bra $L42; $L33: .loc 1 328_gfortran_sfindloc0_r4 .visible .func _gfortran_sfindloc0_rreg .u64 %r24u64 %r38; .reg .u32predov.f32 %r37,%ar2; mov.u64 %r38,%ar3; mov.u32 %r39,%ar4; .loc 1 345 6 setp.eq.u64 %r40,%r38,0; @ %r40 bra $L74; .loc 1 345 20 ld.u32 %r41,[%r38]; setp.eq.u32 %r42,%r41,0; @ %r42 bra $L75; $L74: .loc 1 347 7 {call _gfortran_findloc0_r4348 7 bra $L73; $L75: .loc 1 351 8 ld.s8 %r30,[%r36+28]; .loc 1 353 6 setp.gt.s64 %r48,%r30,0; @ %r48 bra $L77; .loc 1 354 5 cvta.const.u64 %r50,$LC0; mov.u64 %r49,0; {_gfortrani_internal_error77: .loc 1 356 15 ld.u64 %r24,[%r35]; .loc 1 356 6 setp.ne.u64 %r51,%r24,0; @ %r51 bra $L78; .loc 1 358 7 st.u64 [%r35+48],%r24; add.u64 %r53,%r30,-1; st.u64 [%r35+56],%r53; mov.u64 %r54,1; st.u64 [%r35+40],%r54; .loc 1 359 28 cvt.u32.u64 %r55,%r54; st.u8 [%r35+28],%r55; .loc 1 360 24 st.u64 [%r35+8],%r24; .loc 1 361 29 mov.u64 call (%value_in),_gfortrani_xmallocarray61 27 st.u64 [%r35],%r24; bra $L79; $L78: .loc 1 363 12 cvta.global.u64 %r63,_gfortrani_compile_options; .loc 1 363 11 ld.u32 %r64,[%r63+36]; setp.eq.u32 %r65,%r64,0; @ %r65 bra $L79; .loc 1 365 8 cvta.const.u64 %r68,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 370 8 ld.u64 %r24,[%r35]; $L79: ld.u64 %r69,[%r35+40]; shl.b64 %r34,%r69,3; mov.u64 %r29,%r24; .loc 1 371 10 mov.u64 %r32,0; .loc 1 372 23 mov.u64 %r70,%r32; $L80: st.u64 [%r29],%r70; .loc 1 371 24 add.u64 %r32,%r32,1; .loc 1 371 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r71,%r30,%r32; @ %r71 bra $L80; $L73: .loc 1 373 1 ret; } findloc0_r8.o/_gfortran_findloc0_r8 .visible .func _gfortran_findloc0_r8, .param .u32 %in_ar3); .file 1 "../../../../libgfortran/generated/findloc0_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc0_r8 .visible .func _gfortran_mfindloc0_r8_gfortran_sfindloc0_r8 .visible .func _gfortran_sfindloc0_r8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_internal_error .extern .func _gfortrani_internal_70,73,78,68,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_findloc0_r8 .visible .func _gfortran_findloc0_r8688pred.reg .u64 %r140; .reg .u32 %r141; .reg .predpredu64predu64pred %r192; .reg .u64 %r195; .reg .predpredu32 %r211; .reg .u64 %r215; .reg .u32 %r216; .reg .u64 %r220; mov.u64 %r120,%ar0; mov.u64 %r121,%ar1; mov.f64 %r122,%ar2; mov.u32 %r123,%ar3; .loc 1 51 10 ld.s8 %r22,[%r121+28]; .loc 1 51 8 cvt.u32.u32 %r124,%r22; cvt.s64.s8 %r80,%r124; .loc 1 52 6 setp.gt.s64 %r125,%r80,0; @ %r125 bra $L2; .loc 1 53 5 cvta.const.u64 %r126,$LCstack; call _gfortran_runtime_errorr23,[%r120]; .loc 1 55 6 
setp.ne.u64 %r128,%r23,0; @ %r128 bra $L3; .loc 1 57 7 st.u64 [%r120+48],%r23; add.u64 %r130,%r80,-1; st.u64 [%r120+56],%r130; mov.u64 %r131,1; st.u64 [%r120+40],%r131; .loc 1 58 28 cvt.u32.u64 %r132,%r131; st.u8 [%r120+28],%r132; .loc 1 59 24 st.u64 [%r120+8],%r23; .loc 1 60 29 mov.u64 %r135135; call (%value_in),_gfortrani_xmallocarray138,[%value_in]; } mov.u64 %r23,%r138; .loc 1 60 27 st.u64 [%r120],%r23; bra $L4; $L3: .loc 1 64 11 cvta.global.u64 %r140,_gfortrani_compile_options; .loc 1 64 10 ld.u32 %r141,[%r140+36]; setp.eq.u32 %r142,%r141,0; @ %r142 bra $L4; .loc 1 6gfortrani_bounds_iforeach_return.loc 1 70 8 ld.u64 %r23,[%r120]; $L4: ld.u64 %r146,[%r120+40]; shl.b64 %r118,%r146,3; mov.u64 %r102,%r23; .loc 1 69 11 mov.u64 %r116,%r102; .loc 1 73 10 mov.u64 %r89,0; .loc 1 74 23 mov.u64 %r147,%r89; $L5: st.u64 [%r116],%r147; mov.u64 %r103,%r89; .loc 1 73 26 add.u64 %r89,%r103,1; .loc 1 73 3 add.u64 %r116,%r116,%r118; setp.ne.u64 %r148,%r80,%r89; @ %r148 bra $L5; add.u64 %r39,%r121,40; mov.u64 %r74,%frame; .loc 1 76 6 mov.u64 %r87,1; .loc 1 77 10 mov.u64 %r101,0; add.u64 %r208,%frame,120; bra $L7; $L20: mov.u64 %r101,%r88; $L7: .loc 1 79 18 ld.u64 %r149,[%r39]; st.u64 [%r74],%r149; .loc 1 80 19 ld.u64 %r151,[%r39+16]; add.u64 %r150,%r151,1; ld.u64 %r152,[%r39+8]; sub.u64 %r33,%r150,%r152; .loc 1 80 17 shl.b64 %r154,%r101,3; add.u64 %r155,%r208,%r154; st.u64 [%r155],%r33; .loc 1 81 10 mul.lo.u64 %r87,%r87,%r33; .loc 1 82 10 setp.le.s64 %r156,%r33,0; @ %r156 bra $L1; .loc 1 77 26 add.u64 %r88,%r101,1; .loc 1 77 3 add.u64 %r39,%r39,24; add.u64 %r74,%r74,8; setp.ne.u64 %r157,%r103,%r101; @ %r157 bra $L20; .loc 1 87 16 cvt.u32.u32 %r159,%r22; cvt.s64.s8 %r158,%r159; shl.b64 %r95,%r158,3; add.u64 %r209,%frame,240; mov.u32 %r164%r113,[%frame]; .loc 1 104 13 shl.b64 %r115,%r113,3; .loc 1 105 34 ld.u64 %r109,[%frame+120]; .loc 1 91 19 ld.u64 %r45,[%r121]; .loc 1 89 6 setp.eq.u32 %r168,%r123,0; @ ! 
%r168 bra $L32; add.u64 %r210,%frame,248; mov.u32 %r216,1; .loc 1 149 17 mov.u64 %r220,0; bra $L8; $L32: .loc 1 91 37 add.u64 %r169,%r87,-1; .loc 1 91 31 shl.b64 %r170,%r169,3; .loc 1 91 12 add.u64 %r68,%r45,%r170; .loc 1 104 13 neg.s64 %r44,%r115; add.u64 %r210,%frame,248; mov.u32 %r211,1; .loc 1 112 17 mov.u64 %r215,0; $L14: ld.u64 %r38,[%frame+240]; mov.u64 %r110,%r38; mov.u32 %r98,%r164; $L12: .loc 1 97 11 ld.f64 %r171,[%r68]; setp.neu.f64 %r172,%r171,%r122; @ %r172 bra $L9; setp.eq.u32 %r173,%r98,0; selp.u64 %r110,%r38,%r110,%r173; add.u64 %r93,%frame,128; mov.u64 %r92,%r210; add.u64 %r86,%r208,%r95; $L11: .loc 1 100 37 sub.u64 %r177,%r109,%r110; .loc 1 100 25 st.u64 [%r102],%r177; .loc 1 99 5 add.u64 %r102,%r102,%r118; setp.eq.u64 %r178,%r86,%r93; @ %r178 bra $L1; .loc 1 100 33 ld.u64 %r109,[%r93]; .loc 1 100 44 ld.u64 %r110,[%r92]; add.u64 %r93,%r93,8; add.u64 %r92,%r92,8; bra $L11; $L9: .loc 1 104 13 add.u64 %r68,%r68,%r44; .loc 1 105 14 add.u64 %r110,%r110,1; mov.u32 %r98,%r211; .loc 1 105 6 setp.ne.u64 %r179,%r110,%r109; @ %r179 bra $L12; st.u64 [%frame+240],%r109; mov.u64 %r111,%r210; mov.u64 %r53,%r109; mov.u64 %r51,%r113; .loc 1 107 6 mov.u64 %r73,0; $L13: .loc 1 112 17 st.u64 [%r111+-8],%r215; .loc 1 115 27 mul.lo.u64 %r48,%r51,%r53; mov.u64 %r79,%r73; .loc 1 116 9 add.u64 %r73,%r73,1; .loc 1 117 11 setp.eq.u64 %r182,%r79,%r103; @ %r182 bra $L1; .loc 1 121 13 ld.u64 %r183,[%r111]; add.u64 %r50,%r183,1; st.u64 [%r111],%r50; shl.b64 %r106,%r73,3; .loc 1 122 20 add.u64 %r184,%frame,%r106; ld.u64 %r51,[%r184]; .loc 1 122 10 sub.u64 %r185,%r48,%r51; shl.b64 %r186,%r185,3; add.u64 %r68,%r68,%r186; .loc 1 124 33 add.u64 %r188,%r208,%r106; ld.u64 %r53,[%r188]; .loc 1 124 6 add.u64 %r111,%r111,8; setp.eq.u64 %r189,%r50,%r53; @ %r189 bra $L13; bra $L14; $L8: ld.u64 %r96,[%frame+240]; .loc 1 107 6 mov.u64 %r105,%r96; mov.u32 %r59,%r123; $L18: .loc 1 134 11 ld.f64 %r190,[%r45]; setp.neu.f64 %r191,%r190,%r122; @ %r191 bra $L15; setp.eq.u32 %r192,%r59,0; selp.u64 %r105,%r96,%r105,%r192; mov.u64 %r72,%r210; add.u64 %r27,%r209,%r95; $L17: .loc 1 137 36 add.u64 %r195,%r105,1; .loc 1 137 25 st.u64 [%r102],%r195; .loc 1 136 5 add.u64 %r102,%r102,%r118; setp.eq.u64 %r196,%r27,%r72; @ %r196 bra $L1; .loc 1 137 32 ld.u64 %r105,[%r72]; add.u64 %r72,%r72,8; bra $L17; $L15: .loc 1 141 13 add.u64 %r45,%r45,%r115; .loc 1 142 14 add.u64 %r105,%r105,1; mov.u32 %r59,%r216; .loc 1 142 6 setp.ne.u64 %r197,%r105,%r109; @ %r197 bra $L18; st.u64 [%frame+240],%r109; mov.u64 %r71,%r210; mov.u64 %r67,%r109; mov.u64 %r65,%r113; .loc 1 144 6 mov.u64 %r75,0; $L19: .loc 1 149 17 st.u64 [%r71+-8],%r220; .loc 1 152 27 mul.lo.u64 %r62,%r65,%r67; mov.u64 %r78,%r75; .loc 1 153 9 add.u64 %r75,%r75,1; .loc 1 154 11 setp.eq.u64 %r200,%r78,%r103; @ %r200 bra $L1; .loc 1 158 13 ld.u64 %r201,[%r71]; add.u64 %r64,%r201,1; st.u64 [%r71],%r64; shl.b64 %r76,%r75,3; .loc 1 159 20 add.u64 %r202,%frame,%r76; ld.u64 %r65,[%r202]; .loc 1 159 10 sub.u64 %r203,%r65,%r62; shl.b64 %r204,%r203,3; add.u64 %r45,%r45,%r204; .loc 1 161 33 add.u64 %r206,%r208,%r76; ld.u64 %r67,[%r206]; .loc 1 161 6 add.u64 %r71,%r71,8; setp.eq.u64 %r207,%r64,%r67; @ %r207 bra $L19; bra $L8; $L1: .loc 1 165_gfortran_mfindloc0_r8 .visible .func _gfortran_mfindloc0_r8pred %r166; .reg .u64pred %r184; .reg .u64 %r187; .reg .u64 %r190; .reg .u32u64 %r213; .reg .u64u32 %r227; .reg .u64 %r229; .reg .pred %r231; .reg .u64 %r232; .reg .u64 %r233; .reg .u16u64 %r255; .reg .u64 %r257; .reg .pred %r258; .reg .u16 %r259; .reg .u32 %r260; .reg .predmov.u64 %r160,%ar0; mov.u64 
%r161,%ar1; mov.f64 %r162,%ar2; mov.u64 %r163,%ar3; mov.u32 %r164,%ar4; .loc 1 190 10 ld.s8 %r22,[%r161+28]; .loc 1 190 8 cvt.u32.u32 %r165,%r22; cvt.s64.s8 %r104,%r165; .loc 1 191 6 setp.gt.s64 %r166,%r104,0; @ %r166 bra $L34; .loc 1 192 5 cvta.const.u64 %r16167_gfortran_runtime_error34: .loc 1 194 6 ld.u64 %r169,[%r160; .loc 1 196 7 st.u64 [%r160+48],%r169; add.u64 %r172,%r104,-1; st.u64 [%r160+56],%r172; mov.u64 %r173,1; st.u64 [%r160+40],%r173; .loc 1 197 28 cvt.u32.u64 %r174,%r173; st.u8 [%r160+28],%r174; .loc 1 198 24 st.u64 [%r160+8],%r169; .loc 1 199 29 mov.u64 %r177call (%value_in),_gfortrani_xmallocarray,(%out_arg1,%out_arg2); ld.param.u64 %r180,[%value_in]; } .loc 1 199 27 st.u64 [%r160],%r180; bra $L36; $L35: .loc 1 203 11 cvta.global.u64 %r182,_gfortrani_compile_options; .loc 1 203 10 ld.u32 %r183,[%r182+36]; setp.eq.u32 %r184,%r183,0; @ %r184 bra $L36; .loc 1 205 4 cvta.const.u64 %r187,$LC1; {161187; call _gfortrani_bounds_iforeach_return.loc 1 207 4 cvta.const.u64 %r19016r161190187; call _gfortrani_bounds_equal_extents$L36: .loc 1 212 13 ld.u32 %r105,[%r163+16]; .loc 1 214 9 ld.u64 %r60,[%r163]; .loc 1 216 53 add.u32 %r192,%r105,-4; and.b32 %r193,%r192,-5; set.u32.eq.u32 %r195,%r193,0; neg.s32 %r196,%r195; .loc 1 216 22 add.u32 %r197,%r105,-1; set.u32.le.u32 %r199,%r197,1; neg.s32 %r200,%r199; .loc 1 216 6 cvt.u16.u32 %r202,%r196; cvt.u16.u32 %r203,%r200; or.b16 %r201,%r202,%r203;%r206 bra $L37; .loc 1 218 7 setp.ne.u32 %r207,%r105,16; @ %r207 bra $L38; $L37: ld.u64 %r208,[%r160+40]; shl.b64 %r98,%r208,3; ld.u64 %r133,[%r160]; .loc 1 226 8 mov.u64 %r84,%r133; .loc 1 229 10 mov.u64 %r118,0; .loc 1 230 23 mov.u64 %r211,%r118; bra $L39; $L38: .loc 1 223 5 cvta.const.u64 %r210,$LC3; mov.u64 %r209,210; call _gfortrani_internal_error39: .loc 1 230 23 st.u64 [%r84],%r211; mov.u64 %r134,%r118; .loc 1 229 26 add.u64 %r118,%r134,1; .loc 1 229 3 add.u64 %r84,%r84,%r98; setp.ne.u64 %r212,%r104,%r118; @ %r212 bra $L39; .loc 1 236 20 ld.u64 %r37,[%r163+16]; add.u64 %r143,%r161,40; add.u64 %r285,%frame,120; mov.u64 %r28,%r285; add.u64 %r80,%r163,40; mov.u64 %r82,%frame; add.u64 %r78,%frame,240; mov.u64 %r54,%r78; .loc 1 232 6 mov.u64 %r116,1; .loc 1 233 10 mov.u64 %r132,0; bra $L41; $L54: mov.u64 %r132,%r117; $L41: .loc 1 235 18 ld.u64 %r213,[%r143]; st.u64 [%r28],%r213; .loc 1 236 20 ld.u64 %r215,[%r80]; mul.lo.u64 %r214,%r215,%r37; .loc 1 236 18 st.u64 [%r82],%r214; .loc 1 237 19 ld.u64 %r217,[%r143+16]; add.u64 %r216,%r217,1; ld.u64 %r218,[%r143+8]; sub.u64 %r43,%r216,%r218; .loc 1 237 17 st.u64 [%r54],%r43; .loc 1 238 10 mul.lo.u64 %r116,%r116,%r43; .loc 1 239 10 setp.le.s64 %r219,%r43,0; @ %r219 bra $L33; .loc 1 233 26 add.u64 %r117,%r132,1; .loc 1 233 3 add.u64 %r143,%r143,24; add.u64 %r28,%r28,8; add.u64 %r80,%r80,24; add.u64 %r82,%r82,8; add.u64 %r54,%r54,8; setp.ne.u64 %r220,%r134,%r132; @ %r220 bra $L54; .loc 1 244 16 cvt.u32.u32 %r222,%r22; cvt.s64.s8 %r221,%r222; shl.b64 %r141,%r221,3; add.u64 %r283,%frame,360; mov.u32 %r227227261 23 ld.u64 %r155,[%frame+120]; .loc 1 261 13 shl.b64 %r157,%r155,3; .loc 1 262 24 ld.u64 %r158,[%frame]; .loc 1 263 34 ld.u64 %r151,[%frame+240]; .loc 1 248 19 ld.u64 %r47,[%r161]; .loc 1 246 6 setp.eq.u32 %r231,%r164,0; @ ! 
%r231 bra $L72; add.u64 %r284,%frame,368; mov.u32 %r293,1; .loc 1 324 33 add.u64 %r296,%frame,240; bra $L42; $L72: .loc 1 248 37 add.u64 %r44,%r116,-1; .loc 1 248 31 shl.b64 %r232,%r44,3; .loc 1 248 12 add.u64 %r96,%r47,%r232; .loc 1 249 32 cvt.s64.s32 %r233,%r105; .loc 1 249 13 mad.lo.u64 %r97,%r233,%r44,%r60; .loc 1 261 13 neg.s64 %r58,%r157; .loc 1 262 14 neg.s64 %r59,%r158; add.u64 %r284,%frame,368; mov.u32 %r287,1; .loc 1 284 33 add.u64 %r290,%frame,240; $L48: ld.u64 %r130,[%frame+360]; mov.u64 %r152,%r130; mov.u32 %r33,%r227; $L46: .loc 1 254 11 ld.s8 %r236,[%r97]; cvt.u16.u32 %r235,%r236; setp.eq.u16 %r237,%r235,0; @ %r237 bra $L43; ld.f64 %r238,[%r96]; setp.neu.f64 %r239,%r238,%r162; @ %r239 bra $L43; setp.eq.u32 %r240,%r33,0; selp.u64 %r152,%r130,%r152,%r240; add.u64 %r139,%frame,248; mov.u64 %r138,%r284; add.u64 %r136,%r78,%r141; $L45: .loc 1 257 37 sub.u64 %r243,%r151,%r152; .loc 1 257 25 st.u64 [%r133],%r243; .loc 1 256 5 add.u64 %r133,%r133,%r98; setp.eq.u64 %r244,%r136,%r139; @ %r244 bra $L33; .loc 1 257 33 ld.u64 %r151,[%r139]; .loc 1 257 44 ld.u64 %r152,[%r138]; add.u64 %r139,%r139,8; add.u64 %r138,%r138,8; bra $L45; $L43: .loc 1 261 13 add.u64 %r96,%r96,%r58; .loc 1 262 14 add.u64 %r97,%r97,%r59; .loc 1 263 14 add.u64 %r152,%r152,1; mov.u32 %r33,%r287; .loc 1 263 6 setp.ne.u64 %r245,%r152,%r151; @ %r245 bra $L46; st.u64 [%frame+360],%r151; mov.u64 %r125,%r284; mov.u64 %r69,%r158; mov.u64 %r72,%r151; mov.u64 %r67,%r155; mov.u64 %r124,8; .loc 1 265 6 mov.u64 %r100,0; .loc 1 270 17 mov.u64 %r247,%r100; $L47: st.u64 [%r125+-8],%r247; .loc 1 273 27 mul.lo.u64 %r62,%r67,%r72; .loc 1 274 28 mul.lo.u64 %r64,%r72,%r69; mov.u64 %r142,%r100; .loc 1 275 9 add.u64 %r100,%r100,1; .loc 1 276 11 setp.eq.u64 %r248,%r142,%r134; @ %r248 bra $L33; .loc 1 280 13 ld.u64 %r249,[%r125]; add.u64 %r66,%r249,1; st.u64 [%r125],%r66; .loc 1 281 20 add.u64 %r251,%r285,%r124; ld.u64 %r67,[%r251]; .loc 1 281 10 sub.u64 %r252,%r62,%r67; shl.b64 %r253,%r252,3; add.u64 %r96,%r96,%r253; .loc 1 282 21 add.u64 %r254,%frame,%r124; ld.u64 %r69,[%r254]; .loc 1 282 11 sub.u64 %r255,%r69,%r64; add.u64 %r97,%r97,%r255; .loc 1 284 33 add.u64 %r257,%r290,%r124; ld.u64 %r72,[%r257]; .loc 1 284 6 add.u64 %r125,%r125,8; add.u64 %r124,%r124,8; setp.eq.u64 %r258,%r66,%r72; @ %r258 bra $L47; bra $L48; $L42: ld.u64 %r55,[%frame+360]; .loc 1 265 6 mov.u64 %r146,%r55; mov.u32 %r126,%r164; $L52: .loc 1 294 11 ld.s8 %r260,[%r60]; cvt.u16.u32 %r259,%r260; setp.eq.u16 %r261,%r259,0; @ %r261 bra $L49; ld.f64 %r262,[%r47]; setp.neu.f64 %r263,%r262,%r162; @ %r263 bra $L49; setp.eq.u32 %r264,%r126,0; selp.u64 %r146,%r55,%r146,%r264; mov.u64 %r119,%r284; add.u64 %r77,%r283,%r141; $L51: .loc 1 297 36 add.u64 %r267,%r146,1; .loc 1 297 25 st.u64 [%r133],%r267; .loc 1 296 5 add.u64 %r133,%r133,%r98; setp.eq.u64 %r268,%r77,%r119; @ %r268 bra $L33; .loc 1 297 32 ld.u64 %r146,[%r119]; add.u64 %r119,%r119,8; bra $L51; $L49: .loc 1 301 13 add.u64 %r47,%r47,%r157; .loc 1 302 14 add.u64 %r60,%r60,%r158; .loc 1 303 14 add.u64 %r146,%r146,1; mov.u32 %r126,%r293; .loc 1 303 6 setp.ne.u64 %r269,%r146,%r151; @ %r269 bra $L52; st.u64 [%frame+360],%r151; mov.u64 %r153,%r284; mov.u64 %r93,%r158; mov.u64 %r95,%r151; mov.u64 %r91,%r155; mov.u64 %r150,8; .loc 1 305 6 mov.u64 %r101,0; .loc 1 310 17 mov.u64 %r271,%r101; $L53: st.u64 [%r153+-8],%r271; .loc 1 313 27 mul.lo.u64 %r86,%r91,%r95; .loc 1 314 28 mul.lo.u64 %r88,%r95,%r93; mov.u64 %r103,%r101; .loc 1 315 9 add.u64 %r101,%r101,1; .loc 1 316 11 setp.eq.u64 %r272,%r103,%r134; @ %r272 bra $L33; .loc 1 320 
13 ld.u64 %r273,[%r153]; add.u64 %r90,%r273,1; st.u64 [%r153],%r90; .loc 1 321 20 add.u64 %r275,%r285,%r150; ld.u64 %r91,[%r275]; .loc 1 321 10 sub.u64 %r276,%r91,%r86; shl.b64 %r277,%r276,3; add.u64 %r47,%r47,%r277; .loc 1 322 21 add.u64 %r278,%frame,%r150; ld.u64 %r93,[%r278]; .loc 1 322 11 sub.u64 %r279,%r93,%r88; add.u64 %r60,%r60,%r279; .loc 1 324 33 add.u64 %r281,%r296,%r150; ld.u64 %r95,[%r281]; .loc 1 324 6 add.u64 %r153,%r153,8; add.u64 %r150,%r150,8; setp.eq.u64 %r282,%r90,%r95; @ %r282 bra $L53; bra $L42; $L33: .loc 1 328_gfortran_sfindloc0_r8 .visible .func _gfortran_sfindloc0_r8reg .u64 %r24u64 %r38; .reg .u32predar2; mov.u64 %r38,%ar3; mov.u32 %r39,%ar4; .loc 1 345 6 setp.eq.u64 %r40,%r38,0; @ %r40 bra $L74; .loc 1 345 20 ld.u32 %r41,[%r38]; setp.eq.u32 %r42,%r41,0; @ %r42 bra $L75; $L74: .loc 1 347 7 {call _gfortran_findloc0_r8348 7 bra $L73; $L75: .loc 1 351 8 ld.s8 %r30,[%r36+28]; .loc 1 353 6 setp.gt.s64 %r48,%r30,0; @ %r48 bra $L77; .loc 1 354 5 cvta.const.u64 %r50,$LC0; mov.u64 %r49,0; {_gfortrani_internal_error77: .loc 1 356 15 ld.u64 %r24,[%r35]; .loc 1 356 6 setp.ne.u64 %r51,%r24,0; @ %r51 bra $L78; .loc 1 358 7 st.u64 [%r35+48],%r24; add.u64 %r53,%r30,-1; st.u64 [%r35+56],%r53; mov.u64 %r54,1; st.u64 [%r35+40],%r54; .loc 1 359 28 cvt.u32.u64 %r55,%r54; st.u8 [%r35+28],%r55; .loc 1 360 24 st.u64 [%r35+8],%r24; .loc 1 361 29 mov.u64 call (%value_in),_gfortrani_xmallocarray61 27 st.u64 [%r35],%r24; bra $L79; $L78: .loc 1 363 12 cvta.global.u64 %r63,_gfortrani_compile_options; .loc 1 363 11 ld.u32 %r64,[%r63+36]; setp.eq.u32 %r65,%r64,0; @ %r65 bra $L79; .loc 1 365 8 cvta.const.u64 %r68,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 370 8 ld.u64 %r24,[%r35]; $L79: ld.u64 %r69,[%r35+40]; shl.b64 %r34,%r69,3; mov.u64 %r29,%r24; .loc 1 371 10 mov.u64 %r32,0; .loc 1 372 23 mov.u64 %r70,%r32; $L80: st.u64 [%r29],%r70; .loc 1 371 24 add.u64 %r32,%r32,1; .loc 1 371 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r71,%r30,%r32; @ %r71 bra $L80; $L73: .loc 1 373 1 ret; } findloc0_r10.o/ 1622802251findloc0_c4.o/_gfortran_findloc0_c4 .visible .func _gfortran_findloc0_cfortran/generated/findloc0_c4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc0_c4 .visible .func _gfortran_mfindloc0_c_gfortran_sfindloc0_c4 .visible .func _gfortran_sfindloc0_cVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_internal_error .extern .func _gfortrani_internal_77,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,488] = {70,73,78,68,770,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_findloc0_c4 .visible .func _gfortran_findloc0_c64predu64u64 %r158; .reg .predu32 %r170; .reg .u32 %r175; .reg .u64 %r177; .reg .predu16 %r191; .reg .u16 %r192; .reg .u32u32 %r216; .reg .f32 %r218; .reg .u32 %r219; .reg .u32 %r220; .reg .u16 %r221; .reg .u16 %r222; .reg .u16 %r223; .reg .u32 %r224; .reg .u16 %r225; .reg .predpred %r231; .reg .pred %r232; .reg .predmov.u64 %r130,%ar0; mov.u64 %r131,%ar1; mov.f32 %r132,%ar2; mov.f32 %r133,%ar3; mov.u32 %r134,%ar4; .loc 1 51 10 ld.s8 %r22,[%r131+28]; .loc 1 51 8 cvt.u32.u32 %r135,%r22; cvt.s64.s8 %r71,%r135; .loc 1 52 
6 setp.gt.s64 %r136,%r71,0; @ %r136 bra $L2; .loc 1 53 5 cvta.const.u64 %r131stack; call _gfortran_runtime_errorr23,[%r130]; .loc 1 55 6 setp.ne.u64 %r139,%r23,0; @ %r139 bra $L3; .loc 1 57 7 st.u64 [%r130+48],%r23; add.u64 %r141,%r71,-1; st.u64 [%r130+56],%r141; mov.u64 %r142,1; st.u64 [%r130+40],%r142; .loc 1 58 28 cvt.u32.u64 %r143,%r142; st.u8 [%r130+28],%r143; .loc 1 59 24 st.u64 [%r130+8],%r23; .loc 1 60 29 mov.u64 %r14671call (%value_in),_gfortrani_xmallocarray9,[%value_in]; } mov.u64 %r23,%r149; .loc 1 60 27 st.u64 [%r130],%r23; bra $L4; $L3: .loc 1 64 11 cvta.global.u64 %r151,_gfortrani_compile_options; .loc 1 64 10 ld.u32 %r152,[%r151+36]; setp.eq.u32 %r153,%r152,0; @ %r153 bra $L4; .loc 1 6gfortrani_bounds_iforeach_return.loc 1 70 8 ld.u64 %r23,[%r130]; $L4: ld.u64 %r157,[%r130+40]; shl.b64 %r95,%r157,3; mov.u64 %r97,%r23; .loc 1 69 11 mov.u64 %r57,%r97; .loc 1 73 10 mov.u64 %r81,0; .loc 1 74 23 mov.u64 %r158,%r81; $L5: st.u64 [%r57],%r158; mov.u64 %r98,%r81; .loc 1 73 26 add.u64 %r81,%r98,1; .loc 1 73 3 add.u64 %r57,%r57,%r95; setp.ne.u64 %r159,%r71,%r81; @ %r159 bra $L5; add.u64 %r117,%r131,40; mov.u64 %r116,%frame; .loc 1 76 6 mov.u64 %r79,1; .loc 1 77 10 mov.u64 %r96,0; add.u64 %r245,%frame,120; bra $L7; $L20: mov.u64 %r96,%r80; $L7: .loc 1 79 18 ld.u64 %r160,[%r117]; st.u64 [%r116],%r160; .loc 1 80 19 ld.u64 %r162,[%r117+16]; add.u64 %r161,%r162,1; ld.u64 %r163,[%r117+8]; sub.u64 %r34,%r161,%r163; .loc 1 80 17 shl.b64 %r165,%r96,3; add.u64 %r166,%r245,%r165; st.u64 [%r166],%r34; .loc 1 81 10 mul.lo.u64 %r79,%r79,%r34; .loc 1 82 10 setp.le.s64 %r167,%r34,0; @ %r167 bra $L1; .loc 1 77 26 add.u64 %r80,%r96,1; .loc 1 77 3 add.u64 %r117,%r117,24; add.u64 %r116,%r116,8; setp.ne.u64 %r168,%r98,%r96; @ %r168 bra $L20; .loc 1 87 16 cvt.u32.u32 %r170,%r22; cvt.s64.s8 %r169,%r170; shl.b64 %r114,%r169,3; add.u64 %r243,%frame,240; mov.u32 %r17511177,[%value_in]; } .loc 1 104 23 ld.u64 %r127,[%frame]; .loc 1 104 13 shl.b64 %r129,%r127,3; .loc 1 105 34 ld.u64 %r121,[%frame+120]; .loc 1 91 19 ld.u64 %r43,[%r131]; .loc 1 89 6 setp.eq.u32 %r179,%r134,0; @ ! 
%r179 bra $L32; add.u64 %r244,%frame,248; mov.u32 %r251,1; .loc 1 149 17 mov.u64 %r255,0; bra $L8; $L32: .loc 1 91 37 add.u64 %r180,%r79,-1; .loc 1 91 31 shl.b64 %r181,%r180,3; .loc 1 91 12 add.u64 %r66,%r43,%r181; .loc 1 104 13 neg.s64 %r42,%r129; add.u64 %r244,%frame,248; mov.u32 %r246,1; .loc 1 112 17 mov.u64 %r250,0; $L14: ld.u64 %r70,[%frame+240]; mov.u64 %r122,%r70; mov.u32 %r39,%r175; $L12: .loc 1 97 11 ld.f32 %r183,[%r66]; set.u32.eq.f32 %r184,%r183,%r132; neg.s32 %r185,%r184; ld.f32 %r187,[%r66+4]; set.u32.eq.f32 %r188,%r187,%r133; neg.s32 %r189,%r188; cvt.u16.u32 %r191,%r185; cvt.u16.u32 %r192,%r189; and.b16 %r190,%r191,%r192; cvt.u32.u16 %r193,%r190; cvt.u16.u8 %r194,%r193; setp.eq.u16 %r195,%r194,0; @ %r195 bra $L9; setp.eq.u32 %r196,%r39,0; selp.u64 %r122,%r70,%r122,%r196; add.u64 %r112,%frame,128; mov.u64 %r111,%r244; add.u64 %r108,%r245,%r114; $L11: .loc 1 100 37 sub.u64 %r200,%r121,%r122; .loc 1 100 25 st.u64 [%r97],%r200; .loc 1 99 5 add.u64 %r97,%r97,%r95; setp.eq.u64 %r201,%r108,%r112; @ %r201 bra $L1; .loc 1 100 33 ld.u64 %r121,[%r112]; .loc 1 100 44 ld.u64 %r122,[%r111];8; bra $L11; $L9: .loc 1 104 13 add.u64 %r66,%r66,%r42; .loc 1 105 14 add.u64 %r122,%r122,1; mov.u32 %r39,%r246; .loc 1 105 6 setp.ne.u64 %r202,%r122,%r121; @ %r202 bra $L12; st.u64 [%frame+240],%r121; mov.u64 %r105,%r244; mov.u64 %r50,%r121; mov.u64 %r48,%r127; .loc 1 107 6 mov.u64 %r68,0; $L13: .loc 1 112 17 st.u64 [%r105+-8],%r250; .loc 1 115 27 mul.lo.u64 %r45,%r48,%r50; mov.u64 %r124,%r68; .loc 1 116 9 add.u64 %r68,%r68,1; .loc 1 117 11 setp.eq.u64 %r205,%r124,%r98; @ %r205 bra $L1; .loc 1 121 13 ld.u64 %r206,[%r105]; add.u64 %r47,%r206,1; st.u64 [%r105],%r47; shl.b64 %r102,%r68,3; .loc 1 122 20 add.u64 %r207,%frame,%r102; ld.u64 %r48,[%r207]; .loc 1 122 10 sub.u64 %r208,%r45,%r48; shl.b64 %r209,%r208,3; add.u64 %r66,%r66,%r209; .loc 1 124 33 add.u64 %r211,%r245,%r102; ld.u64 %r50,[%r211]; .loc 1 124 6 add.u64 %r105,%r105,8; setp.eq.u64 %r212,%r47,%r50; @ %r212 bra $L13; bra $L14; $L8: ld.u64 %r91,[%frame+240]; .loc 1 107 6 mov.u64 %r118,%r91; mov.u32 %r88,%r134; $L18: .loc 1 134 11 ld.f32 %r214,[%r43]; set.u32.eq.f32 %r215,%r214,%r132; neg.s32 %r216,%r215; ld.f32 %r218,[%r43+4]; set.u32.eq.f32 %r219,%r218,%r133; neg.s32 %r220,%r219; cvt.u16.u32 %r222,%r216; cvt.u16.u32 %r223,%r220; and.b16 %r221,%r222,%r223; cvt.u32.u16 %r224,%r221; cvt.u16.u8 %r225,%r224; setp.eq.u16 %r226,%r225,0; @ %r226 bra $L15; setp.eq.u32 %r227,%r88,0; selp.u64 %r118,%r91,%r118,%r227; mov.u64 %r99,%r244; add.u64 %r83,%r243,%r114; $L17: .loc 1 137 36 add.u64 %r230,%r118,1; .loc 1 137 25 st.u64 [%r97],%r230; .loc 1 136 5 add.u64 %r97,%r97,%r95; setp.eq.u64 %r231,%r83,%r99; @ %r231 bra $L1; .loc 1 137 32 ld.u64 %r118,[%r99]; add.u64 %r99,%r99,8; bra $L17; $L15: .loc 1 141 13 add.u64 %r43,%r43,%r129; .loc 1 142 14 add.u64 %r118,%r118,1; mov.u32 %r88,%r251; .loc 1 142 6 setp.ne.u64 %r232,%r118,%r121; @ %r232 bra $L18; st.u64 [%frame+240],%r121; mov.u64 %r51,%r244; mov.u64 %r65,%r121; mov.u64 %r63,%r127; .loc 1 144 6 mov.u64 %r69,0; $L19: .loc 1 149 17 st.u64 [%r51+-8],%r255; .loc 1 152 27 mul.lo.u64 %r60,%r63,%r65; mov.u64 %r90,%r69; .loc 1 153 9 add.u64 %r69,%r69,1; .loc 1 154 11 setp.eq.u64 %r235,%r90,%r98; @ %r235 bra $L1; .loc 1 158 13 ld.u64 %r236,[%r51]; add.u64 %r62,%r236,1; st.u64 [%r51],%r62; shl.b64 %r126,%r69,3; .loc 1 159 20 add.u64 %r237,%frame,%r126; ld.u64 %r63,[%r237]; .loc 1 159 10 sub.u64 %r238,%r63,%r60; shl.b64 %r239,%r238,3; add.u64 %r43,%r43,%r239; .loc 1 161 33 add.u64 %r241,%r245,%r126; ld.u64 
%r65,[%r241]; .loc 1 161 6 add.u64 %r51,%r51,8; setp.eq.u64 %r242,%r62,%r65; @ %r242 bra $L19; bra $L8; $L1: .loc 1 165_gfortran_mfindloc0_c4 .visible .func _gfortran_mfindloc0_c7u64.reg .u64u32 %r211; .reg .u16pred %r217; .reg .predu64 %r222; .reg .pred %r223; .reg .u64u64 %r229; .reg .pred %r230; .reg .pred %r231; .reg .u64 %r232; .reg .u32 %r233; .reg .u32 %r238; .reg .u64 %r240; .reg .predpred %r248; .reg .f32 %r250; .reg .u32 %r251; .reg .u32 %r252; .reg .f3264pred %r284; .reg .f32 %r286; .reg .u32 %r287; .reg .u32 %r288; .reg .f32 %r290; .reg .u32 %r291; .reg .u32 %r292; .reg .u16 %r293; .reg .u16 %r294; .reg .u16 %r295; .reg .u32 %r296; .reg .u16 %r297; .reg .pred %r298; .reg .predpredu64 %r311; .reg .u64 %r312; .reg .u64mov.u64 %r170,%ar0; mov.u64 %r171,%ar1; mov.f32 %r172,%ar2; mov.f32 %r173,%ar3; mov.u64 %r174,%ar4; mov.u32 %r175,%ar5; .loc 1 190 10 ld.s8 %r22,[%r171+28]; .loc 1 190 8 cvt.u32.u32 %r176,%r22; cvt.s64.s8 %r96,%r176; .loc 1 191 6 setp.gt.s64 %r177,%r96,0; @ %r177 bra $L34;_gfortran_runtime_error34: .loc 1 194 6 ld.u64 %r180,[%r170]; setp.ne.u64 %r181,%r180,0; @ %r181 bra $L35; .loc 1 196 7 st.u64 [%r170+48],%r180; add.u64 %r183,%r96,-1; st.u64 [%r170+56],%r183; mov.u64 %r184,1; st.u64 [%r170+40],%r184; .loc 1 197 28 cvt.u32.u64 %r185,%r184; st.u8 [%r170+28],%r185; .loc 1 198 24 st.u64 [%r170+8],%r180; .loc 1 199 29 mov.u64 %r18896call (%value_in),_gfortrani_xmallocarray191,[%value_in]; } .loc 1 199 27 st.u64 [%r170],%r191; bra $L36; $L35: .loc 1 203 11 cvta.global.u64 %r193,_gfortrani_compile_options; .loc 1 203 10 ld.u32 %r194,[%r193+36]; setp.eq.u32 %r195,%r194,0r198; call _gfortrani_bounds_iforeach_return.loc 1 207 4 cvta.const.u64 %r2011717120198; call _gfortrani_bounds_equal_extents$L36: .loc 1 212 13 ld.u32 %r97,[%r174+16]; .loc 1 214 9 ld.u64 %r61,[%r174]; .loc 1 216 53 add.u32 %r203,%r97,-4; and.b32 %r204,%r203,-5;.loc 1 216 22 add.u32 %r208,%r97,-1; set.u32.le.u32 %r210,%r208,1; neg.s32 %r211,%r210; .loc 1 216 6 cvt.u16.u32 %r213,%r207; cvt.u16.u32 %r214,%r211; or.b16 %r212,%r213,%r214; cvt.u32.u16 %r215,%r212; cvt.u16.u8 %r216,%r215; setp.ne.u16 %r217,%r216,0; @ %r217 bra $L37; .loc 1 218 7 setp.ne.u32 %r218,%r97,16; @ %r218 bra $L38; $L37: ld.u64 %r219,[%r170+40]; shl.b64 %r158,%r219,3; ld.u64 %r126,[%r170]; .loc 1 226 8 mov.u64 %r156,%r126; .loc 1 229 10 mov.u64 %r109,0; .loc 1 230 23 mov.u64 %r222,%r109; bra $L39; $L38: .loc 1 223 5 cvta.const.u64 %r221,$LC3; mov.u64 %r220,22221; call _gfortrani_internal_error39: .loc 1 230 23 st.u64 [%r156],%r222; mov.u64 %r127,%r109; .loc 1 229 26 add.u64 %r109,%r127,1; .loc 1 229 3 add.u64 %r156,%r156,%r158; setp.ne.u64 %r223,%r96,%r109; @ %r223 bra $L39; .loc 1 236 20 ld.u64 %r39,[%r174+16]; add.u64 %r118,%r171,40; add.u64 %r318,%frame,120; mov.u64 %r116,%r318; add.u64 %r115,%r174,40; mov.u64 %r113,%frame; add.u64 %r34,%frame,240; mov.u64 %r106,%r34; .loc 1 232 6 mov.u64 %r107,1; .loc 1 233 10 mov.u64 %r125,0; bra $L41; $L54: mov.u64 %r125,%r108; $L41: .loc 1 235 18 ld.u64 %r224,[%r118]; st.u64 [%r116],%r224; .loc 1 236 20 ld.u64 %r226,[%r115]; mul.lo.u64 %r225,%r226,%r39; .loc 1 236 18 st.u64 [%r113],%r225; .loc 1 237 19 ld.u64 %r228,[%r118+16]; add.u64 %r227,%r228,1; ld.u64 %r229,[%r118+8]; sub.u64 %r45,%r227,%r229; .loc 1 237 17 st.u64 [%r106],%r45; .loc 1 238 10 mul.lo.u64 %r107,%r107,%r45; .loc 1 239 10 setp.le.s64 %r230,%r45,0; @ %r230 bra $L33; .loc 1 233 26 add.u64 %r108,%r125,1; .loc 1 233 3 add.u64 %r118,%r118,24; add.u64 %r116,%r116,8; add.u64 %r115,%r115,24; add.u64 %r113,%r113,8; add.u64 %r106,%r106,8; 
setp.ne.u64 %r231,%r127,%r125; @ %r231 bra $L54; .loc 1 244 16 cvt.u32.u32 %r233,%r22; cvt.s64.s8 %r232,%r233; shl.b64 %r152,%r232,3; add.u64 %r319,%frame,360; mov.u32 %r23815240,[%value_in]; } .loc 1 261 23 ld.u64 %r165,[%frame+120]; .loc 1 261 13 shl.b64 %r167,%r165,3; .loc 1 262 24 ld.u64 %r168,[%frame]; .loc 1 263 34 ld.u64 %r160,[%frame+240]; .loc 1 248 19 ld.u64 %r49,[%r171]; .loc 1 246 6 setp.eq.u32 %r242,%r175,0; @ ! %r242 bra $L78; add.u64 %r320,%frame,368; mov.u32 %r328,1; .loc 1 324 33 add.u64 %r331,%frame,240; bra $L42; $L78: .loc 1 248 37 add.u64 %r46,%r107,-1; .loc 1 248 31 shl.b64 %r243,%r46,3; .loc 1 248 12 add.u64 %r91,%r49,%r243; .loc 1 249 32 cvt.s64.s32 %r244,%r97; .loc 1 249 13 mad.lo.u64 %r92,%r244,%r46,%r61; .loc 1 261 13 neg.s64 %r59,%r167; .loc 1 262 14 neg.s64 %r60,%r168; add.u64 %r320,%frame,368; mov.u32 %r322,1; .loc 1 284 33 add.u64 %r325,%frame,240; $L48: ld.u64 %r120,[%frame+360]; mov.u64 %r161,%r120; mov.u32 %r112,%r238; $L46: .loc 1 254 11 ld.s8 %r247,[%r92]; cvt.u16.u32 %r246,%r247; setp.eq.u16 %r248,%r246,0; @ %r248 bra $L43; ld.f32 %r250,[%r91]; set.u32.eq.f32 %r251,%r250,%r172; neg.s32 %r252,%r251; ld.f32 %r254,[%r91+4]; set.u32.eq.f32 %r255,%r254,%r173; neg.s32 %r256,%r255; cvt.u16.u32 %r258,%r252; cvt.u16.u32 %r259,%r256; and.b16 %r257,%r258,%r259; cvt.u32.u16 %r260,%r257; cvt.u16.u8 %r261,%r260; setp.eq.u16 %r262,%r261,0; @ %r262 bra $L43; setp.eq.u32 %r263,%r112,0; selp.u64 %r161,%r120,%r161,%r263; add.u64 %r150,%frame,248; mov.u64 %r149,%r320; add.u64 %r147,%r34,%r152; $L45: .loc 1 257 37 sub.u64 %r266,%r160,%r161; .loc 1 257 25 st.u64 [%r126],%r266; .loc 1 256 5 add.u64 %r126,%r126,%r158; setp.eq.u64 %r267,%r147,%r150; @ %r267 bra $L33; .loc 1 257 33 ld.u64 %r160,[%r150]; .loc 1 257 44 ld.u64 %r161,[%r149]; add.u64 %r150,%r150,8; add.u64 %r149,%r149,8; bra $L45; $L43: .loc 1 261 13 add.u64 %r91,%r91,%r59; .loc 1 262 14 add.u64 %r92,%r92,%r60; .loc 1 263 14 add.u64 %r161,%r161,1; mov.u32 %r112,%r322; .loc 1 263 6 setp.ne.u64 %r268,%r161,%r160; @ %r268 bra $L46; st.u64 [%frame+360],%r160; mov.u64 %r144,%r320; mov.u64 %r70,%r168; mov.u64 %r73,%r160; mov.u64 %r68,%r165; mov.u64 %r143,8; .loc 1 265 6 mov.u64 %r93,0; .loc 1 270 17 mov.u64 %r270,%r93; $L47: st.u64 [%r144+-8],%r270; .loc 1 273 27 mul.lo.u64 %r63,%r68,%r73; .loc 1 274 28 mul.lo.u64 %r65,%r73,%r70; mov.u64 %r153,%r93; .loc 1 275 9 add.u64 %r93,%r93,1; .loc 1 276 11 setp.eq.u64 %r271,%r153,%r127; @ %r271 bra $L33; .loc 1 280 13 ld.u64 %r272,[%r144]; add.u64 %r67,%r272,1; st.u64 [%r144],%r67; .loc 1 281 20 add.u64 %r274,%r318,%r143; ld.u64 %r68,[%r274]; .loc 1 281 10 sub.u64 %r275,%r63,%r68; shl.b64 %r276,%r275,3; add.u64 %r91,%r91,%r276; .loc 1 282 21 add.u64 %r277,%frame,%r143; ld.u64 %r70,[%r277]; .loc 1 282 11 sub.u64 %r278,%r70,%r65; add.u64 %r92,%r92,%r278; .loc 1 284 33 add.u64 %r280,%r325,%r143; ld.u64 %r73,[%r280]; .loc 1 284 6 add.u64 %r144,%r144,8; add.u64 %r143,%r143,8; setp.eq.u64 %r281,%r67,%r73; @ %r281 bra $L47; bra $L48; $L42: ld.u64 %r35,[%frame+360]; .loc 1 265 6 mov.u64 %r155,%r35; mov.u32 %r33,%r175; $L52: .loc 1 294 11 ld.s8 %r283,[%r61]; cvt.u16.u32 %r282,%r283; setp.eq.u16 %r284,%r282,0; @ %r284 bra $L49; ld.f32 %r286,[%r49]; set.u32.eq.f32 %r287,%r286,%r172; neg.s32 %r288,%r287; ld.f32 %r290,[%r49+4]; set.u32.eq.f32 %r291,%r290,%r173; neg.s32 %r292,%r291; cvt.u16.u32 %r294,%r288; cvt.u16.u32 %r295,%r292; and.b16 %r293,%r294,%r295; cvt.u32.u16 %r296,%r293; cvt.u16.u8 %r297,%r296; setp.eq.u16 %r298,%r297,0; @ %r298 bra $L49; setp.eq.u32 %r299,%r33,0; selp.u64 
%r155,%r35,%r155,%r299; mov.u64 %r138,%r320; add.u64 %r135,%r319,%r152; $L51: .loc 1 297 36 add.u64 %r302,%r155,1; .loc 1 297 25 st.u64 [%r126],%r302; .loc 1 296 5 add.u64 %r126,%r126,%r158; setp.eq.u64 %r303,%r135,%r138; @ %r303 bra $L33; .loc 1 297 32 ld.u64 %r155,[%r138]; add.u64 %r138,%r138,8; bra $L51; $L49: .loc 1 301 13 add.u64 %r49,%r49,%r167; .loc 1 302 14 add.u64 %r61,%r61,%r168; .loc 1 303 14 add.u64 %r155,%r155,1; mov.u32 %r33,%r328; .loc 1 303 6 setp.ne.u64 %r304,%r155,%r160; @ %r304 bra $L52; st.u64 [%frame+360],%r160; mov.u64 %r133,%r320; mov.u64 %r88,%r168; mov.u64 %r90,%r160; mov.u64 %r86,%r165; mov.u64 %r132,8; .loc 1 305 6 mov.u64 %r94,0; .loc 1 310 17 mov.u64 %r306,%r94; $L53: st.u64 [%r133+-8],%r306; .loc 1 313 27 mul.lo.u64 %r81,%r86,%r90; .loc 1 314 28 mul.lo.u64 %r83,%r90,%r88; mov.u64 %r95,%r94; .loc 1 315 9 add.u64 %r94,%r94,1; .loc 1 316 11 setp.eq.u64 %r307,%r95,%r127; @ %r307 bra $L33; .loc 1 320 13 ld.u64 %r308,[%r133]; add.u64 %r85,%r308,1; st.u64 [%r133],%r85; .loc 1 321 20 add.u64 %r310,%r318,%r132; ld.u64 %r86,[%r310]; .loc 1 321 10 sub.u64 %r311,%r86,%r81; shl.b64 %r312,%r311,3; add.u64 %r49,%r49,%r312; .loc 1 322 21 add.u64 %r313,%frame,%r132; ld.u64 %r88,[%r313]; .loc 1 322 11 sub.u64 %r314,%r88,%r83; add.u64 %r61,%r61,%r314; .loc 1 324 33 add.u64 %r316,%r331,%r132; ld.u64 %r90,[%r316]; .loc 1 324 6 add.u64 %r133,%r133,8; add.u64 %r132,%r132,8; setp.eq.u64 %r317,%r85,%r90; @ %r317 bra $L53; bra $L42; $L33: .loc 1 328_gfortran_sfindloc0_c4 .visible .func _gfortran_sfindloc0_cu64 %r65; .reg .u32 %r66; .reg .pred %r67; .reg .u64 %r70; .reg .u64 %r71; .reg .u64 %r72; .reg .pred %r73; mov.u64 %r35,%ar0; mov.u64 %r36,%ar1; mov.f32 %r37,%ar2; mov.f32 %r38,%ar3; mov.u64 %r39,%ar4; mov.u32 %r40,%ar5; .loc 1 345 6 setp.eq.u64 %r41,%r39,0; @ %r41 bra $L80; .loc 1 345 20 ld.u32 %r42,[%r39]; setp.eq.u32 %r43,%r42,0; @ %r43 bra $L81; $L80: .loc 1 347 7 {.param .f32 %out_arg4; st.param.f32 [%out_arg4],%r3840; call _gfortran_findloc0_c4); } .loc 1 348 7 bra $L79; $L81: .loc 1 351 8 ld.s8 %r30,[%r36+28]; .loc 1 353 6 setp.gt.s64 %r50,%r30,0; @ %r50 bra $L83; .loc 1 354 5 cvta.const.u64 %r52,$LC0; mov.u64 %r51,r52; call _gfortrani_internal_error83: .loc 1 356 15 ld.u64 %r24,[%r35]; .loc 1 356 6 setp.ne.u64 %r53,%r24,0; @ %r53 bra $L84; .loc 1 358 7 st.u64 [%r35+48],%r24; add.u64 %r55,%r30,-1; st.u64 [%r35+56],%r55; mov.u64 %r56,1; st.u64 [%r35+40],%r56; .loc 1 359 28 cvt.u32.u64 %r57,%r56; st.u8 [%r35+28],%r57; .loc 1 360 24 st.u64 [%r35+8],%r24; .loc 1 361 29 mov.u64 _gfortrani_xmallocarray61 27 st.u64 [%r35],%r24; bra $L85; $L84: .loc 1 363 12 cvta.global.u64 %r65,_gfortrani_compile_options; .loc 1 363 11 ld.u32 %r66,[%r65+36]; setp.eq.u32 %r67,%r66,0; @ %r67 bra $L85; .loc 1 365 8 cvta.const.u64 %r70,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 370 8 ld.u64 %r24,[%r35]; $L85: ld.u64 %r71,[%r35+40]; shl.b64 %r34,%r71,3; mov.u64 %r29,%r24; .loc 1 371 10 mov.u64 %r32,0; .loc 1 372 23 mov.u64 %r72,%r32; $L86: st.u64 [%r29],%r72; .loc 1 371 24 add.u64 %r32,%r32,1; .loc 1 371 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r73,%r30,%r32; @ %r73 bra $L86; $L79: .loc 1 373 1 ret; } findloc0_c8.o/_gfortran_findloc0_c8 .visible .func _gfortran_findloc0_c8fortran/generated/findloc0_c8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc0_c8 .visible .func _gfortran_mfindloc0_c8_gfortran_sfindloc0_c8 .visible .func _gfortran_sfindloc0_c8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func 
_gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_internal_error .extern .func _gfortrani_internal_77,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,488] = {70,73,78,68,770,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_findloc0_c8 .visible .func _gfortran_findloc0_c864predu64u64 %r158; .reg .predu32 %r170; .reg .u32 %r175; .reg .u64 %r177; .reg .predu16 %r191; .reg .u16 %r192; .reg .u32u32 %r216; .reg .f64 %r218; .reg .u32 %r219; .reg .u32 %r220; .reg .u16 %r221; .reg .u16 %r222; .reg .u16 %r223; .reg .u32 %r224; .reg .u16 %r225; .reg .predpred %r231; .reg .pred %r232; .reg .predmov.u64 %r130,%ar0; mov.u64 %r131,%ar1; mov.f64 %r132,%ar2; mov.f64 %r133,%ar3; mov.u32 %r134,%ar4; .loc 1 51 10 ld.s8 %r22,[%r131+28]; .loc 1 51 8 cvt.u32.u32 %r135,%r22; cvt.s64.s8 %r71,%r135; .loc 1 52 6 setp.gt.s64 %r136,%r71,0; @ %r136 bra $L2; .loc 1 53 5 cvta.const.u64 %r131stack; call _gfortran_runtime_errorr23,[%r130]; .loc 1 55 6 setp.ne.u64 %r139,%r23,0; @ %r139 bra $L3; .loc 1 57 7 st.u64 [%r130+48],%r23; add.u64 %r141,%r71,-1; st.u64 [%r130+56],%r141; mov.u64 %r142,1; st.u64 [%r130+40],%r142; .loc 1 58 28 cvt.u32.u64 %r143,%r142; st.u8 [%r130+28],%r143; .loc 1 59 24 st.u64 [%r130+8],%r23; .loc 1 60 29 mov.u64 %r14671call (%value_in),_gfortrani_xmallocarray9,[%value_in]; } mov.u64 %r23,%r149; .loc 1 60 27 st.u64 [%r130],%r23; bra $L4; $L3: .loc 1 64 11 cvta.global.u64 %r151,_gfortrani_compile_options; .loc 1 64 10 ld.u32 %r152,[%r151+36]; setp.eq.u32 %r153,%r152,0; @ %r153 bra $L4; .loc 1 6gfortrani_bounds_iforeach_return.loc 1 70 8 ld.u64 %r23,[%r130]; $L4: ld.u64 %r157,[%r130+40]; shl.b64 %r95,%r157,3; mov.u64 %r97,%r23; .loc 1 69 11 mov.u64 %r57,%r97; .loc 1 73 10 mov.u64 %r81,0; .loc 1 74 23 mov.u64 %r158,%r81; $L5: st.u64 [%r57],%r158; mov.u64 %r98,%r81; .loc 1 73 26 add.u64 %r81,%r98,1; .loc 1 73 3 add.u64 %r57,%r57,%r95; setp.ne.u64 %r159,%r71,%r81; @ %r159 bra $L5; add.u64 %r117,%r131,40; mov.u64 %r116,%frame; .loc 1 76 6 mov.u64 %r79,1; .loc 1 77 10 mov.u64 %r96,0; add.u64 %r245,%frame,120; bra $L7; $L20: mov.u64 %r96,%r80; $L7: .loc 1 79 18 ld.u64 %r160,[%r117]; st.u64 [%r116],%r160; .loc 1 80 19 ld.u64 %r162,[%r117+16]; add.u64 %r161,%r162,1; ld.u64 %r163,[%r117+8]; sub.u64 %r34,%r161,%r163; .loc 1 80 17 shl.b64 %r165,%r96,3; add.u64 %r166,%r245,%r165; st.u64 [%r166],%r34; .loc 1 81 10 mul.lo.u64 %r79,%r79,%r34; .loc 1 82 10 setp.le.s64 %r167,%r34,0; @ %r167 bra $L1; .loc 1 77 26 add.u64 %r80,%r96,1; .loc 1 77 3 add.u64 %r117,%r117,24; add.u64 %r116,%r116,8; setp.ne.u64 %r168,%r98,%r96; @ %r168 bra $L20; .loc 1 87 16 cvt.u32.u32 %r170,%r22; cvt.s64.s8 %r169,%r170; shl.b64 %r114,%r169,3; add.u64 %r243,%frame,240; mov.u32 %r17511177,[%value_in]; } .loc 1 104 23 ld.u64 %r127,[%frame]; .loc 1 104 13 shl.b64 %r129,%r127,4; .loc 1 105 34 ld.u64 %r121,[%frame+120]; .loc 1 91 19 ld.u64 %r43,[%r131]; .loc 1 89 6 setp.eq.u32 %r179,%r134,0; @ ! 
%r179 bra $L32; add.u64 %r244,%frame,248; mov.u32 %r251,1; .loc 1 149 17 mov.u64 %r255,0; bra $L8; $L32: .loc 1 91 37 add.u64 %r180,%r79,-1; .loc 1 91 31 shl.b64 %r181,%r180,4; .loc 1 91 12 add.u64 %r66,%r43,%r181; .loc 1 104 13 neg.s64 %r42,%r129; add.u64 %r244,%frame,248; mov.u32 %r246,1; .loc 1 112 17 mov.u64 %r250,0; $L14: ld.u64 %r70,[%frame+240]; mov.u64 %r122,%r70; mov.u32 %r39,%r175; $L12: .loc 1 97 11 ld.f64 %r183,[%r66]; set.u32.eq.f64 %r184,%r183,%r132; neg.s32 %r185,%r184; ld.f64 %r187,[%r66+8]; set.u32.eq.f64 %r188,%r187,%r133; neg.s32 %r189,%r188; cvt.u16.u32 %r191,%r185; cvt.u16.u32 %r192,%r189; and.b16 %r190,%r191,%r192; cvt.u32.u16 %r193,%r190; cvt.u16.u8 %r194,%r193; setp.eq.u16 %r195,%r194,0; @ %r195 bra $L9; setp.eq.u32 %r196,%r39,0; selp.u64 %r122,%r70,%r122,%r196; add.u64 %r112,%frame,128; mov.u64 %r111,%r244; add.u64 %r108,%r245,%r114; $L11: .loc 1 100 37 sub.u64 %r200,%r121,%r122; .loc 1 100 25 st.u64 [%r97],%r200; .loc 1 99 5 add.u64 %r97,%r97,%r95; setp.eq.u64 %r201,%r108,%r112; @ %r201 bra $L1; .loc 1 100 33 ld.u64 %r121,[%r112]; .loc 1 100 44 ld.u64 %r122,[%r111];8; bra $L11; $L9: .loc 1 104 13 add.u64 %r66,%r66,%r42; .loc 1 105 14 add.u64 %r122,%r122,1; mov.u32 %r39,%r246; .loc 1 105 6 setp.ne.u64 %r202,%r122,%r121; @ %r202 bra $L12; st.u64 [%frame+240],%r121; mov.u64 %r105,%r244; mov.u64 %r50,%r121; mov.u64 %r48,%r127; .loc 1 107 6 mov.u64 %r68,0; $L13: .loc 1 112 17 st.u64 [%r105+-8],%r250; .loc 1 115 27 mul.lo.u64 %r45,%r48,%r50; mov.u64 %r124,%r68; .loc 1 116 9 add.u64 %r68,%r68,1; .loc 1 117 11 setp.eq.u64 %r205,%r124,%r98; @ %r205 bra $L1; .loc 1 121 13 ld.u64 %r206,[%r105]; add.u64 %r47,%r206,1; st.u64 [%r105],%r47; shl.b64 %r102,%r68,3; .loc 1 122 20 add.u64 %r207,%frame,%r102; ld.u64 %r48,[%r207]; .loc 1 122 10 sub.u64 %r208,%r45,%r48; shl.b64 %r209,%r208,4; add.u64 %r66,%r66,%r209; .loc 1 124 33 add.u64 %r211,%r245,%r102; ld.u64 %r50,[%r211]; .loc 1 124 6 add.u64 %r105,%r105,8; setp.eq.u64 %r212,%r47,%r50; @ %r212 bra $L13; bra $L14; $L8: ld.u64 %r91,[%frame+240]; .loc 1 107 6 mov.u64 %r118,%r91; mov.u32 %r88,%r134; $L18: .loc 1 134 11 ld.f64 %r214,[%r43]; set.u32.eq.f64 %r215,%r214,%r132; neg.s32 %r216,%r215; ld.f64 %r218,[%r43+8]; set.u32.eq.f64 %r219,%r218,%r133; neg.s32 %r220,%r219; cvt.u16.u32 %r222,%r216; cvt.u16.u32 %r223,%r220; and.b16 %r221,%r222,%r223; cvt.u32.u16 %r224,%r221; cvt.u16.u8 %r225,%r224; setp.eq.u16 %r226,%r225,0; @ %r226 bra $L15; setp.eq.u32 %r227,%r88,0; selp.u64 %r118,%r91,%r118,%r227; mov.u64 %r99,%r244; add.u64 %r83,%r243,%r114; $L17: .loc 1 137 36 add.u64 %r230,%r118,1; .loc 1 137 25 st.u64 [%r97],%r230; .loc 1 136 5 add.u64 %r97,%r97,%r95; setp.eq.u64 %r231,%r83,%r99; @ %r231 bra $L1; .loc 1 137 32 ld.u64 %r118,[%r99]; add.u64 %r99,%r99,8; bra $L17; $L15: .loc 1 141 13 add.u64 %r43,%r43,%r129; .loc 1 142 14 add.u64 %r118,%r118,1; mov.u32 %r88,%r251; .loc 1 142 6 setp.ne.u64 %r232,%r118,%r121; @ %r232 bra $L18; st.u64 [%frame+240],%r121; mov.u64 %r51,%r244; mov.u64 %r65,%r121; mov.u64 %r63,%r127; .loc 1 144 6 mov.u64 %r69,0; $L19: .loc 1 149 17 st.u64 [%r51+-8],%r255; .loc 1 152 27 mul.lo.u64 %r60,%r63,%r65; mov.u64 %r90,%r69; .loc 1 153 9 add.u64 %r69,%r69,1; .loc 1 154 11 setp.eq.u64 %r235,%r90,%r98; @ %r235 bra $L1; .loc 1 158 13 ld.u64 %r236,[%r51]; add.u64 %r62,%r236,1; st.u64 [%r51],%r62; shl.b64 %r126,%r69,3; .loc 1 159 20 add.u64 %r237,%frame,%r126; ld.u64 %r63,[%r237]; .loc 1 159 10 sub.u64 %r238,%r63,%r60; shl.b64 %r239,%r238,4; add.u64 %r43,%r43,%r239; .loc 1 161 33 add.u64 %r241,%r245,%r126; ld.u64 
%r65,[%r241]; .loc 1 161 6 add.u64 %r51,%r51,8; setp.eq.u64 %r242,%r62,%r65; @ %r242 bra $L19; bra $L8; $L1: .loc 1 165_gfortran_mfindloc0_c8 .visible .func _gfortran_mfindloc0_c87u64.reg .u64u32 %r211; .reg .u16pred %r217; .reg .predu64 %r222; .reg .pred %r223; .reg .u64u64 %r229; .reg .pred %r230; .reg .pred %r231; .reg .u64 %r232; .reg .u32 %r233; .reg .u32 %r238; .reg .u64 %r240; .reg .predpred %r248; .reg .f64 %r250; .reg .u32 %r251; .reg .u32 %r252; .reg .f6464pred %r284; .reg .f64 %r286; .reg .u32 %r287; .reg .u32 %r288; .reg .f64 %r290; .reg .u32 %r291; .reg .u32 %r292; .reg .u16 %r293; .reg .u16 %r294; .reg .u16 %r295; .reg .u32 %r296; .reg .u16 %r297; .reg .pred %r298; .reg .predpredu64 %r311; .reg .u64 %r312; .reg .u64mov.u64 %r170,%ar0; mov.u64 %r171,%ar1; mov.f64 %r172,%ar2; mov.f64 %r173,%ar3; mov.u64 %r174,%ar4; mov.u32 %r175,%ar5; .loc 1 190 10 ld.s8 %r22,[%r171+28]; .loc 1 190 8 cvt.u32.u32 %r176,%r22; cvt.s64.s8 %r96,%r176; .loc 1 191 6 setp.gt.s64 %r177,%r96,0; @ %r177 bra $L34;_gfortran_runtime_error34: .loc 1 194 6 ld.u64 %r180,[%r170]; setp.ne.u64 %r181,%r180,0; @ %r181 bra $L35; .loc 1 196 7 st.u64 [%r170+48],%r180; add.u64 %r183,%r96,-1; st.u64 [%r170+56],%r183; mov.u64 %r184,1; st.u64 [%r170+40],%r184; .loc 1 197 28 cvt.u32.u64 %r185,%r184; st.u8 [%r170+28],%r185; .loc 1 198 24 st.u64 [%r170+8],%r180; .loc 1 199 29 mov.u64 %r18896call (%value_in),_gfortrani_xmallocarray191,[%value_in]; } .loc 1 199 27 st.u64 [%r170],%r191; bra $L36; $L35: .loc 1 203 11 cvta.global.u64 %r193,_gfortrani_compile_options; .loc 1 203 10 ld.u32 %r194,[%r193+36]; setp.eq.u32 %r195,%r194,0r198; call _gfortrani_bounds_iforeach_return.loc 1 207 4 cvta.const.u64 %r2011717120198; call _gfortrani_bounds_equal_extents$L36: .loc 1 212 13 ld.u32 %r97,[%r174+16]; .loc 1 214 9 ld.u64 %r61,[%r174]; .loc 1 216 53 add.u32 %r203,%r97,-4; and.b32 %r204,%r203,-5;.loc 1 216 22 add.u32 %r208,%r97,-1; set.u32.le.u32 %r210,%r208,1; neg.s32 %r211,%r210; .loc 1 216 6 cvt.u16.u32 %r213,%r207; cvt.u16.u32 %r214,%r211; or.b16 %r212,%r213,%r214; cvt.u32.u16 %r215,%r212; cvt.u16.u8 %r216,%r215; setp.ne.u16 %r217,%r216,0; @ %r217 bra $L37; .loc 1 218 7 setp.ne.u32 %r218,%r97,16; @ %r218 bra $L38; $L37: ld.u64 %r219,[%r170+40]; shl.b64 %r158,%r219,3; ld.u64 %r126,[%r170]; .loc 1 226 8 mov.u64 %r156,%r126; .loc 1 229 10 mov.u64 %r109,0; .loc 1 230 23 mov.u64 %r222,%r109; bra $L39; $L38: .loc 1 223 5 cvta.const.u64 %r221,$LC3; mov.u64 %r220,22221; call _gfortrani_internal_error39: .loc 1 230 23 st.u64 [%r156],%r222; mov.u64 %r127,%r109; .loc 1 229 26 add.u64 %r109,%r127,1; .loc 1 229 3 add.u64 %r156,%r156,%r158; setp.ne.u64 %r223,%r96,%r109; @ %r223 bra $L39; .loc 1 236 20 ld.u64 %r39,[%r174+16]; add.u64 %r118,%r171,40; add.u64 %r318,%frame,120; mov.u64 %r116,%r318; add.u64 %r115,%r174,40; mov.u64 %r113,%frame; add.u64 %r34,%frame,240; mov.u64 %r106,%r34; .loc 1 232 6 mov.u64 %r107,1; .loc 1 233 10 mov.u64 %r125,0; bra $L41; $L54: mov.u64 %r125,%r108; $L41: .loc 1 235 18 ld.u64 %r224,[%r118]; st.u64 [%r116],%r224; .loc 1 236 20 ld.u64 %r226,[%r115]; mul.lo.u64 %r225,%r226,%r39; .loc 1 236 18 st.u64 [%r113],%r225; .loc 1 237 19 ld.u64 %r228,[%r118+16]; add.u64 %r227,%r228,1; ld.u64 %r229,[%r118+8]; sub.u64 %r45,%r227,%r229; .loc 1 237 17 st.u64 [%r106],%r45; .loc 1 238 10 mul.lo.u64 %r107,%r107,%r45; .loc 1 239 10 setp.le.s64 %r230,%r45,0; @ %r230 bra $L33; .loc 1 233 26 add.u64 %r108,%r125,1; .loc 1 233 3 add.u64 %r118,%r118,24; add.u64 %r116,%r116,8; add.u64 %r115,%r115,24; add.u64 %r113,%r113,8; add.u64 %r106,%r106,8; 
setp.ne.u64 %r231,%r127,%r125; @ %r231 bra $L54; .loc 1 244 16 cvt.u32.u32 %r233,%r22; cvt.s64.s8 %r232,%r233; shl.b64 %r152,%r232,3; add.u64 %r319,%frame,360; mov.u32 %r23815240,[%value_in]; } .loc 1 261 23 ld.u64 %r165,[%frame+120]; .loc 1 261 13 shl.b64 %r167,%r165,4; .loc 1 262 24 ld.u64 %r168,[%frame]; .loc 1 263 34 ld.u64 %r160,[%frame+240]; .loc 1 248 19 ld.u64 %r49,[%r171]; .loc 1 246 6 setp.eq.u32 %r242,%r175,0; @ ! %r242 bra $L78; add.u64 %r320,%frame,368; mov.u32 %r328,1; .loc 1 324 33 add.u64 %r331,%frame,240; bra $L42; $L78: .loc 1 248 37 add.u64 %r46,%r107,-1; .loc 1 248 31 shl.b64 %r243,%r46,4; .loc 1 248 12 add.u64 %r91,%r49,%r243; .loc 1 249 32 cvt.s64.s32 %r244,%r97; .loc 1 249 13 mad.lo.u64 %r92,%r244,%r46,%r61; .loc 1 261 13 neg.s64 %r59,%r167; .loc 1 262 14 neg.s64 %r60,%r168; add.u64 %r320,%frame,368; mov.u32 %r322,1; .loc 1 284 33 add.u64 %r325,%frame,240; $L48: ld.u64 %r120,[%frame+360]; mov.u64 %r161,%r120; mov.u32 %r112,%r238; $L46: .loc 1 254 11 ld.s8 %r247,[%r92]; cvt.u16.u32 %r246,%r247; setp.eq.u16 %r248,%r246,0; @ %r248 bra $L43; ld.f64 %r250,[%r91]; set.u32.eq.f64 %r251,%r250,%r172; neg.s32 %r252,%r251; ld.f64 %r254,[%r91+8]; set.u32.eq.f64 %r255,%r254,%r173; neg.s32 %r256,%r255; cvt.u16.u32 %r258,%r252; cvt.u16.u32 %r259,%r256; and.b16 %r257,%r258,%r259; cvt.u32.u16 %r260,%r257; cvt.u16.u8 %r261,%r260; setp.eq.u16 %r262,%r261,0; @ %r262 bra $L43; setp.eq.u32 %r263,%r112,0; selp.u64 %r161,%r120,%r161,%r263; add.u64 %r150,%frame,248; mov.u64 %r149,%r320; add.u64 %r147,%r34,%r152; $L45: .loc 1 257 37 sub.u64 %r266,%r160,%r161; .loc 1 257 25 st.u64 [%r126],%r266; .loc 1 256 5 add.u64 %r126,%r126,%r158; setp.eq.u64 %r267,%r147,%r150; @ %r267 bra $L33; .loc 1 257 33 ld.u64 %r160,[%r150]; .loc 1 257 44 ld.u64 %r161,[%r149]; add.u64 %r150,%r150,8; add.u64 %r149,%r149,8; bra $L45; $L43: .loc 1 261 13 add.u64 %r91,%r91,%r59; .loc 1 262 14 add.u64 %r92,%r92,%r60; .loc 1 263 14 add.u64 %r161,%r161,1; mov.u32 %r112,%r322; .loc 1 263 6 setp.ne.u64 %r268,%r161,%r160; @ %r268 bra $L46; st.u64 [%frame+360],%r160; mov.u64 %r144,%r320; mov.u64 %r70,%r168; mov.u64 %r73,%r160; mov.u64 %r68,%r165; mov.u64 %r143,8; .loc 1 265 6 mov.u64 %r93,0; .loc 1 270 17 mov.u64 %r270,%r93; $L47: st.u64 [%r144+-8],%r270; .loc 1 273 27 mul.lo.u64 %r63,%r68,%r73; .loc 1 274 28 mul.lo.u64 %r65,%r73,%r70; mov.u64 %r153,%r93; .loc 1 275 9 add.u64 %r93,%r93,1; .loc 1 276 11 setp.eq.u64 %r271,%r153,%r127; @ %r271 bra $L33; .loc 1 280 13 ld.u64 %r272,[%r144]; add.u64 %r67,%r272,1; st.u64 [%r144],%r67; .loc 1 281 20 add.u64 %r274,%r318,%r143; ld.u64 %r68,[%r274]; .loc 1 281 10 sub.u64 %r275,%r63,%r68; shl.b64 %r276,%r275,4; add.u64 %r91,%r91,%r276; .loc 1 282 21 add.u64 %r277,%frame,%r143; ld.u64 %r70,[%r277]; .loc 1 282 11 sub.u64 %r278,%r70,%r65; add.u64 %r92,%r92,%r278; .loc 1 284 33 add.u64 %r280,%r325,%r143; ld.u64 %r73,[%r280]; .loc 1 284 6 add.u64 %r144,%r144,8; add.u64 %r143,%r143,8; setp.eq.u64 %r281,%r67,%r73; @ %r281 bra $L47; bra $L48; $L42: ld.u64 %r35,[%frame+360]; .loc 1 265 6 mov.u64 %r155,%r35; mov.u32 %r33,%r175; $L52: .loc 1 294 11 ld.s8 %r283,[%r61]; cvt.u16.u32 %r282,%r283; setp.eq.u16 %r284,%r282,0; @ %r284 bra $L49; ld.f64 %r286,[%r49]; set.u32.eq.f64 %r287,%r286,%r172; neg.s32 %r288,%r287; ld.f64 %r290,[%r49+8]; set.u32.eq.f64 %r291,%r290,%r173; neg.s32 %r292,%r291; cvt.u16.u32 %r294,%r288; cvt.u16.u32 %r295,%r292; and.b16 %r293,%r294,%r295; cvt.u32.u16 %r296,%r293; cvt.u16.u8 %r297,%r296; setp.eq.u16 %r298,%r297,0; @ %r298 bra $L49; setp.eq.u32 %r299,%r33,0; selp.u64 
%r155,%r35,%r155,%r299; mov.u64 %r138,%r320; add.u64 %r135,%r319,%r152; $L51: .loc 1 297 36 add.u64 %r302,%r155,1; .loc 1 297 25 st.u64 [%r126],%r302; .loc 1 296 5 add.u64 %r126,%r126,%r158; setp.eq.u64 %r303,%r135,%r138; @ %r303 bra $L33; .loc 1 297 32 ld.u64 %r155,[%r138]; add.u64 %r138,%r138,8; bra $L51; $L49: .loc 1 301 13 add.u64 %r49,%r49,%r167; .loc 1 302 14 add.u64 %r61,%r61,%r168; .loc 1 303 14 add.u64 %r155,%r155,1; mov.u32 %r33,%r328; .loc 1 303 6 setp.ne.u64 %r304,%r155,%r160; @ %r304 bra $L52; st.u64 [%frame+360],%r160; mov.u64 %r133,%r320; mov.u64 %r88,%r168; mov.u64 %r90,%r160; mov.u64 %r86,%r165; mov.u64 %r132,8; .loc 1 305 6 mov.u64 %r94,0; .loc 1 310 17 mov.u64 %r306,%r94; $L53: st.u64 [%r133+-8],%r306; .loc 1 313 27 mul.lo.u64 %r81,%r86,%r90; .loc 1 314 28 mul.lo.u64 %r83,%r90,%r88; mov.u64 %r95,%r94; .loc 1 315 9 add.u64 %r94,%r94,1; .loc 1 316 11 setp.eq.u64 %r307,%r95,%r127; @ %r307 bra $L33; .loc 1 320 13 ld.u64 %r308,[%r133]; add.u64 %r85,%r308,1; st.u64 [%r133],%r85; .loc 1 321 20 add.u64 %r310,%r318,%r132; ld.u64 %r86,[%r310]; .loc 1 321 10 sub.u64 %r311,%r86,%r81; shl.b64 %r312,%r311,4; add.u64 %r49,%r49,%r312; .loc 1 322 21 add.u64 %r313,%frame,%r132; ld.u64 %r88,[%r313]; .loc 1 322 11 sub.u64 %r314,%r88,%r83; add.u64 %r61,%r61,%r314; .loc 1 324 33 add.u64 %r316,%r331,%r132; ld.u64 %r90,[%r316]; .loc 1 324 6 add.u64 %r133,%r133,8; add.u64 %r132,%r132,8; setp.eq.u64 %r317,%r85,%r90; @ %r317 bra $L53; bra $L42; $L33: .loc 1 328_gfortran_sfindloc0_c8 .visible .func _gfortran_sfindloc0_c8u64 %r65; .reg .u32 %r66; .reg .pred %r67; .reg .u64 %r70; .reg .u64 %r71; .reg .u64 %r72; .reg .pred %r73; mov.u64 %r35,%ar0; mov.u64 %r36,%ar1; mov.f64 %r37,%ar2; mov.f64 %r38,%ar3; mov.u64 %r39,%ar4; mov.u32 %r40,%ar5; .loc 1 345 6 setp.eq.u64 %r41,%r39,0; @ %r41 bra $L80; .loc 1 345 20 ld.u32 %r42,[%r39]; setp.eq.u32 %r43,%r42,0; @ %r43 bra $L81; $L80: .loc 1 347 7 {.param .f64 %out_arg4; st.param.f64 [%out_arg4],%r3840; call _gfortran_findloc0_c8); } .loc 1 348 7 bra $L79; $L81: .loc 1 351 8 ld.s8 %r30,[%r36+28]; .loc 1 353 6 setp.gt.s64 %r50,%r30,0; @ %r50 bra $L83; .loc 1 354 5 cvta.const.u64 %r52,$LC0; mov.u64 %r51,r52; call _gfortrani_internal_error83: .loc 1 356 15 ld.u64 %r24,[%r35]; .loc 1 356 6 setp.ne.u64 %r53,%r24,0; @ %r53 bra $L84; .loc 1 358 7 st.u64 [%r35+48],%r24; add.u64 %r55,%r30,-1; st.u64 [%r35+56],%r55; mov.u64 %r56,1; st.u64 [%r35+40],%r56; .loc 1 359 28 cvt.u32.u64 %r57,%r56; st.u8 [%r35+28],%r57; .loc 1 360 24 st.u64 [%r35+8],%r24; .loc 1 361 29 mov.u64 _gfortrani_xmallocarray61 27 st.u64 [%r35],%r24; bra $L85; $L84: .loc 1 363 12 cvta.global.u64 %r65,_gfortrani_compile_options; .loc 1 363 11 ld.u32 %r66,[%r65+36]; setp.eq.u32 %r67,%r66,0; @ %r67 bra $L85; .loc 1 365 8 cvta.const.u64 %r70,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 370 8 ld.u64 %r24,[%r35]; $L85: ld.u64 %r71,[%r35+40]; shl.b64 %r34,%r71,3; mov.u64 %r29,%r24; .loc 1 371 10 mov.u64 %r32,0; .loc 1 372 23 mov.u64 %r72,%r32; $L86: st.u64 [%r29],%r72; .loc 1 371 24 add.u64 %r32,%r32,1; .loc 1 371 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r73,%r30,%r32; @ %r73 bra $L86; $L79: .loc 1 373 1 ret; } findloc0_c10.o/ 1622802251findloc0_s1.o/_gfortran_findloc0_s1 .visible .func _gfortran_findloc0_s1.file 1 "../../../../libgfortran/generated/findloc0_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc0_s1 .visible .func _gfortran_mfindloc0_s1_gfortran_sfindloc0_s1 .visible .func _gfortran_sfindloc0_s1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 
_gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortran_compare_gfortran_compare); // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_internal_error .extern .func _gfortrani_internal_82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,98] = {70,73,78,68,777,65,83,75,32,97,114,103,117,109,101,110_gfortran_findloc0_s1 .visible .func _gfortran_findloc0_s13689pred.reg .u64 %r140; .reg .u32 %r141; .reg .predpredpred %r193; .reg .u32u64 %r202; .reg .pred %r203; .reg .pred %r205; .reg .u64 %r207; .reg .predmov.u64 %r118,%ar0; mov.u64 %r119,%ar1; mov.u64 %r120,%ar2; mov.u32 %r121,%ar3; mov.u64 %r122,%ar4; mov.u64 %r123,%ar5; .loc 1 52 10 ld.s8 %r22,[%r119+28]; .loc 1 52 8 cvt.u32.u32 %r124,%r22; cvt.s64.s8 %r79,%r124; .loc 1 53 6 setp.gt.s64 %r125,%r79,0; @ %r125 bra $L2; .loc 1 54 5 cvta.const.u64 %r126,$LCstack; call _gfortran_runtime_errorr23,[%r118]; .loc 1 56 6 setp.ne.u64 %r128,%r23,0; @ %r128 bra $L3; .loc 1 58 7 st.u64 [%r118+48],%r23; add.u64 %r130,%r79,-1; st.u64 [%r118+56],%r130; mov.u64 %r131,1; st.u64 [%r118+40],%r131; .loc 1 59 28 cvt.u32.u64 %r132,%r131; st.u8 [%r118+28],%r132; .loc 1 60 24 st.u64 [%r118+8],%r23; .loc 1 61 29 mov.u64 %r13579call (%value_in),_gfortrani_xmallocarray138,[%value_in]; } mov.u64 %r23,%r138; .loc 1 61 27 st.u64 [%r118],%r23; bra $L4; $L3: .loc 1 65 11 cvta.global.u64 %r140,_gfortrani_compile_options; .loc 1 65 10 ld.u32 %r141,[%r140+36]; setp.eq.u32 %r142,%r141,0; @ %r142 bra $L4; .loc 1 66r119145; call _gfortrani_bounds_iforeach_return.loc 1 71 8 ld.u64 %r23,[%r118]; $L4: ld.u64 %r146,[%r118+40]; shl.b64 %r116,%r146,3; mov.u64 %r95,%r23; .loc 1 70 11 mov.u64 %r114,%r95; .loc 1 74 10 mov.u64 %r87,0; .loc 1 75 23 mov.u64 %r147,%r87; $L5: st.u64 [%r114],%r147; mov.u64 %r96,%r87; .loc 1 74 26 add.u64 %r87,%r96,1; .loc 1 74 3 add.u64 %r114,%r114,%r116; setp.ne.u64 %r148,%r79,%r87; @ %r148 bra $L5; add.u64 %r106,%r119,40; mov.u64 %r108,%frame; .loc 1 77 6 mov.u64 %r85,1; .loc 1 78 10 mov.u64 %r94,0; add.u64 %r216,%frame,120; bra $L7; $L18: mov.u64 %r94,%r86; $L7: .loc 1 80 18 ld.u64 %r149,[%r106]; st.u64 [%r108],%r149; .loc 1 81 19 ld.u64 %r151,[%r106+16]; add.u64 %r150,%r151,1; ld.u64 %r152,[%r106+8]; sub.u64 %r32,%r150,%r152; .loc 1 81 17 shl.b64 %r154,%r94,3; add.u64 %r155,%r216,%r154; st.u64 [%r155],%r32; .loc 1 82 10 mul.lo.u64 %r85,%r85,%r32; .loc 1 83 10 setp.le.s64 %r156,%r32,0; @ %r156 bra $L1; .loc 1 78 26 add.u64 %r86,%r94,1; .loc 1 78 3 add.u64 %r106,%r106,24; add.u64 %r108,%r108,8; setp.ne.u64 %r157,%r96,%r94; @ %r157 bra $L18; .loc 1 88 16 cvt.u32.u32 %r159,%r22; cvt.s64.s8 %r158,%r159; shl.b64 %r78,%r158,3; add.u64 %r217,%frame,240; mov.u32 %r16492 19 ld.u64 %r48,[%r119]; .loc 1 90 6 setp.eq.u32 %r168,%r121,0; @ ! 
[Binary payload placeholder: this span of the package archive holds PTX assembly text from libgfortran object files built for the nvptx offload target. The instruction stream is too badly garbled by extraction to reproduce verbatim; only the metadata listed below is recoverable from it.]

Continuation of the findloc0_s1 object (FINDLOC over character(kind=1) data, returning the whole index vector): bodies of _gfortran_findloc0_s1, _gfortran_mfindloc0_s1 and _gfortran_sfindloc0_s1, with calls to _gfortran_compare_string, _gfortrani_xmallocarray, _gfortran_runtime_error, _gfortrani_internal_error, _gfortrani_bounds_iforeach_return and _gfortrani_bounds_equal_extents, and reads of _gfortrani_compile_options.

findloc0_s4.o (compiled from ../../../../libgfortran/generated/findloc0_s4.c): _gfortran_findloc0_s4, _gfortran_mfindloc0_s4 and _gfortran_sfindloc0_s4; same external references, except that string comparison goes through _gfortran_compare_string_char4.

findloc1_i1.o (compiled from ../../../../libgfortran/generated/findloc1_i1.c, FINDLOC along a dimension of integer(kind=1) data): _gfortran_findloc1_i1, _gfortran_mfindloc1_i1 and _gfortran_sfindloc1_i1; the bounds check used here is _gfortrani_bounds_ifunction_return rather than the iforeach variant.

findloc1_i2.o (compiled from ../../../../libgfortran/generated/findloc1_i2.c): the start of _gfortran_findloc1_i2 plus declarations of _gfortran_mfindloc1_i2 and _gfortran_sfindloc1_i2, with the same set of external symbols.

Embedded read-only strings are the FINDLOC runtime diagnostics, including "FINDLOC", "MASK argumen...", "Funny sized logical arr...", "Rank of array needs to be ...", "... incorrect in FINDLOC intrinsic: is %ld, should be between 1 and %ld" and "Rank of return array incorrect in FINDLOC intrinsic: is %ld, should be %ld".
%r265 bra $L54; $L58: .loc 1 255 3 setp.ne.u64 %r266,%r148,0; @ %r266 bra $L55; $L62: .loc 1 264 3 setp.gt.s64 %r267,%r147,%r148; @ %r267 bra $L56; bra $L113; $L54: .loc 1 248 7 setp.eq.u32 %r269,%r39,16; @ %r269 bra $L58; .loc 1 253 5 cvta.const.u64 %r271,$LC4; mov.u64 %r270,27271; call _gfortrani_internal_error5: add.u64 %r146,%r205,40; add.u64 %r167,%r208,40; add.u64 %r273,%r25,%r25; add.u64 %r274,%r273,%r25; shl.b64 %r275,%r274,3; add.u64 %r276,%r205,16; add.u64 %r101,%r275,%r276; .loc 1 255 3 mov.u64 %r200,0; add.u64 %r422,%frame,240; add.u64 %r423,%frame,120; add.u64 %r420,%frame,360; .loc 1 262 12 mov.u64 %r437,%r200; $L61: .loc 1 257 18 add.u64 %r278,%r422,%r200; ld.u64 %r279,[%r146]; st.u64 [%r278],%r279; .loc 1 258 18 add.u64 %r281,%r423,%r200; .loc 1 258 20 ld.u64 %r283,[%r167]; mul.lo.u64 %r282,%r283,%r36; .loc 1 258 18 st.u64 [%r281],%r282; .loc 1 259 19 ld.u64 %r285,[%r146+16]; add.u64 %r284,%r285,1; ld.u64 %r286,[%r146+8]; sub.u64 %r53,%r284,%r286; .loc 1 261 10 setp.lt.s64 %r287,%r53,0; @ %r287 bra $L59; .loc 1 259 17 add.u64 %r289,%r420,%r200; st.u64 [%r289],%r53; bra $L60; $L59: .loc 1 262 12 add.u64 %r291,%r420,%r200; st.u64 [%r291],%r437; $L60: .loc 1 255 3 add.u64 %r146,%r146,24; add.u64 %r167,%r167,24; add.u64 %r200,%r200,8; setp.ne.u64 %r293,%r101,%r146; @ %r293 bra $L61; bra $L62; $L113: .loc 1 274 6 ld.u64 %r294,[%r204]; setp.eq.u64 %r295,%r294,0; @ ! %r295 bra $L64; bra $L63; $L56: add.u64 %r297,%r25,%r25; add.u64 %r298,%r297,%r25; shl.b64 %r299,%r298,3; add.u64 %r143,%r299,40; add.u64 %r156,%r205,%r143; add.u64 %r141,%r208,%r143; shl.b64 %r300,%r25,3; add.u64 %r137,%r300,-8; cvt.u32.u32 %r302,%r22; cvt.s64.s8 %r301,%r302; shl.b64 %r303,%r301,3; add.u64 %r84,%r303,-8; add.u64 %r422,%frame,240; add.u64 %r423,%frame,120; add.u64 %r420,%frame,360; .loc 1 271 12 mov.u64 %r436,0; $L67: .loc 1 266 18 add.u64 %r305,%r422,%r137; ld.u64 %r306,[%r156]; st.u64 [%r305],%r306; .loc 1 267 18 add.u64 %r308,%r423,%r137; .loc 1 267 20 ld.u64 %r310,[%r141]; mul.lo.u64 %r309,%r310,%r36; .loc 1 267 18 st.u64 [%r308],%r309; .loc 1 268 19 ld.u64 %r312,[%r156+16]; add.u64 %r311,%r312,1; ld.u64 %r313,[%r156+8]; sub.u64 %r62,%r311,%r313; .loc 1 270 10 setp.lt.s64 %r314,%r62,0; @ %r314 bra $L65; .loc 1 268 17 add.u64 %r316,%r420,%r137; st.u64 [%r316],%r62; bra $L66; $L65: .loc 1 271 12 add.u64 %r318,%r420,%r137; st.u64 [%r318],%r436; $L66: .loc 1 264 3 add.u64 %r156,%r156,24; add.u64 %r141,%r141,24; add.u64 %r137,%r137,8; setp.ne.u64 %r320,%r84,%r137; @ %r320 bra $L67; bra $L114; $L63: .loc 1 278 7 setp.le.s64 %r321,%r147,0; @ %r321 bra $L69; add.u64 %r420,%frame,360; $L90: add.u64 %r179,%r204,40; mov.u64 %r176,%r420; add.u64 %r322,%r204,16; cvt.u32.u32 %r324,%r22; cvt.s64.s8 %r323,%r324; add.u64 %r326,%r323,%r323; add.u64 %r327,%r326,%r323; shl.b64 %r328,%r327,3; add.u64 %r169,%r322,%r328; .loc 1 281 10 mov.u64 %r140,1; .loc 1 285 4 mov.u64 %r351,0; bra $L70; $L69: .loc 1 289 24 mov.u64 %r329,0; st.u64 [%r204+8],%r329; .loc 1 290 28 cvt.u16.u32 %r332,%r22; add.u16 %r331,%r332,-1; cvt.u32.u16 %r333,%r331; st.u8 [%r204+28],%r333; .loc 1 292 20 add.u32 %r334,%r22,-2; cvt.s64.s32 %r72,%r334; add.u64 %r336,%r72,%r72; add.u64 %r337,%r336,%r72; shl.b64 %r338,%r337,3; add.u64 %r339,%r204,%r338; .loc 1 292 67 shl.b64 %r341,%r72,3; add.u64 %r342,%frame,%r341; .loc 1 292 59 ld.u64 %r344,[%r339+40]; ld.u64 %r345,[%r342+360]; mul.lo.u64 %r154,%r344,%r345; .loc 1 294 29 mov.u64 %r3471547; call (%value_in),_gfortrani_xmallocarray8,[%value_in]; } .loc 1 294 27 st.u64 [%r204],%r348; .loc 1 295 10 
setp.eq.u64 %r350,%r154,0; @ ! %r350 bra $L74; bra $L71; $L73: .loc 1 283 48 mul.lo.u64 %r140,%r68,%r140; $L70: .loc 1 285 4 st.u64 [%r179+8],%r351; ld.u64 %r68,[%r176]; add.u64 %r352,%r68,-1; st.u64 [%r179+16],%r352; st.u64 [%r179],%r140; .loc 1 278 7 add.u64 %r179,%r179,24; add.u64 %r176,%r176,8; setp.ne.u64 %r353,%r169,%r179; @ %r353 bra $L73; bra $L69; $L71: .loc 1 298 4 st.u64 [%r204+48],%r154; mov.u64 %r355,-1; st.u64 [%r204+56],%r355; mov.u64 %r356,1; st.u64 [%r204+40],%r356; .loc 1 299 4 bra $L52; $L64: .loc 1 304 19 ld.s8 %r79,[%r204+28]; .loc 1 304 10 setp.eq.u64 %r357,%r79,%r147; @ %r357 bra $L76; .loc 1 305 2 st.u64 [%stack+8],%r147; st.u64 [%stack],%r79;_gfortran_runtime_error76: .loc 1 310 11 cvta.global.u64 %r360,_gfortrani_compile_options; .loc 1 310 10 ld.u32 %r361,[%r360+36]; setp.eq.u32 %r362,%r361,0; @ %r362 bra $L74; .loc 1 311 2 add.u64 %r420,%frame,360; cvta.const.u64 %r366,$LC2r420366; call _gfortrani_bounds_ifunction_return$L74: .loc 1 315 3 setp.gt.s64 %r368,%r147,0; @ %r368 bra $L77; $L80: .loc 1 235 9 add.u64 %r369,%r31,1; .loc 1 235 7 sub.u64 %r149,%r369,%r33; max.s64 %r65,%r149,0; .loc 1 240 12 mul.lo.u64 %r37,%r34,%r36; .loc 1 323 8 ld.u64 %r164,[%r204]; .loc 1 326 8 ld.u64 %r163,[%r205]; .loc 1 351 35 add.u64 %r100,%r151,%r151; .loc 1 336 22 add.u64 %r64,%r65,-1; .loc 1 336 15 mul.lo.u64 %r89,%r64,%r100; .loc 1 337 29 mul.lo.u64 %r177,%r37,%r64; .loc 1 338 34 neg.s64 %r95,%r100; .loc 1 363 22 ld.u64 %r102,[%frame+240]; .loc 1 363 12 add.u64 %r104,%r102,%r102; .loc 1 364 23 ld.u64 %r105,[%frame+120]; .loc 1 365 22 ld.u64 %r107,[%frame]; .loc 1 365 12 shl.b64 %r109,%r107,3; .loc 1 362 12 ld.u64 %r203,[%frame+480]; setp.ne.u32 %r421,%r209,0; add.u64 %r419,%frame,488; .loc 1 338 4 setp.le.s64 %r429,%r149,0; .loc 1 340 18 cvt.u16.u32 %r430,%r206; .loc 1 382 23 add.u64 %r432,%frame,240; .loc 1 367 32 add.u64 %r433,%frame,360; bra $L78; $L77: add.u64 %r198,%frame,480; add.u64 %r197,%r204,40; mov.u64 %r195,%frame; cvt.u32.u32 %r373,%r22; cvt.s64.s8 %r372,%r373; add.u64 %r181,%r372,-1; .loc 1 315 3 mov.u64 %r165,0; add.u64 %r420,%frame,360; .loc 1 317 16 mov.u64 %r374,%r165; $L79: st.u64 [%r198],%r374; .loc 1 318 18 ld.u64 %r375,[%r197]; st.u64 [%r195],%r375; .loc 1 319 17 shl.b64 %r377,%r165,3; add.u64 %r378,%r420,%r377; .loc 1 319 10 ld.u64 %r379,[%r378]; setp.le.s64 %r380,%r379,0; @ %r380 bra $L52; .loc 1 315 26 add.u64 %r165,%r165,1; .loc 1 315 3 add.u64 %r198,%r198,8; add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; setp.ne.u64 %r381,%r165,%r181; @ %r381 bra $L79; bra $L80; $L78: .loc 1 334 10 @ %r421 bra $L81; .loc 1 351 4 setp.gt.s64 %r383,%r149,0; @ %r383 bra $L91; .loc 1 333 14 mov.u64 %r158,0; bra $L83; $L81: .loc 1 336 8 add.u64 %r29,%r163,%r89; .loc 1 337 9 add.u64 %r159,%r160,%r177; .loc 1 338 4 @ %r429 bra $L92; .loc 1 338 11 mov.u64 %r158,%r65; $L85: .loc 1 340 11 ld.s8 %r386,[%r159]; cvt.u16.u32 %r385,%r386; setp.eq.u16 %r387,%r385,0; @ %r387 bra $L84; .loc 1 340 18 ld.u16 %r388,[%r29]; setp.eq.u16 %r390,%r388,%r430; @ %r390 bra $L83; $L84: .loc 1 338 26 add.u64 %r158,%r158,-1; .loc 1 338 34 add.u64 %r29,%r29,%r95; .loc 1 338 53 sub.u64 %r159,%r159,%r37; .loc 1 338 4 setp.ne.u64 %r391,%r158,0; @ %r391 bra $L85; bra $L83; $L91: .loc 1 351 4 mov.u64 %r157,%r160; mov.u64 %r111,%r163; .loc 1 351 11 mov.u64 %r158,1; $L82: .loc 1 353 11 ld.s8 %r393,[%r157]; cvt.u16.u32 %r392,%r393; setp.eq.u16 %r394,%r392,0; @ %r394 bra $L86; .loc 1 353 18 ld.u16 %r395,[%r111]; setp.eq.u16 %r397,%r395,%r430; @ %r397 bra $L83; $L86: .loc 1 351 27 add.u64 %r158,%r158,1; .loc 1 
351 35 add.u64 %r111,%r111,%r100; .loc 1 351 54 add.u64 %r157,%r157,%r37; .loc 1 351 4 setp.ge.s64 %r398,%r65,%r158; @ %r398 bra $L82; .loc 1 333 14 mov.u64 %r158,0; bra $L83; $L92: mov.u64 %r158,0; $L83: .loc 1 360 13 st.u64 [%r164],%r158; .loc 1 362 15 add.u64 %r203,%r203,1; st.u64 [%frame+480],%r203; .loc 1 363 12 add.u64 %r163,%r163,%r104; .loc 1 364 13 add.u64 %r160,%r160,%r105; .loc 1 365 12 add.u64 %r164,%r164,%r109; .loc 1 367 32 ld.u64 %r130,[%frame+360]; .loc 1 367 13 setp.ne.u64 %r399,%r203,%r130; @ %r399 bra $L78; mov.u64 %r126,%r419; mov.u64 %r127,%r107; mov.u64 %r202,%r105; mov.u64 %r124,%r102; mov.u64 %r120,8; .loc 1 366 9 mov.u64 %r162,0; .loc 1 369 13 mov.u64 %r401,%r162; .loc 1 371 20 add.u64 %r426,%frame,120; $L89: .loc 1 369 13 st.u64 [%r126+-8],%r401; .loc 1 370 23 mul.lo.u64 %r113,%r130,%r124; .loc 1 371 24 mul.lo.u64 %r402,%r130,%r202; .loc 1 371 10 sub.u64 %r160,%r160,%r402; .loc 1 372 23 mul.lo.u64 %r119,%r130,%r127; .loc 1 373 5 add.u64 %r162,%r162,1; .loc 1 374 7 setp.le.s64 %r403,%r147,%r162; @ %r403 bra $L52; .loc 1 381 16 ld.u64 %r404,[%r126]; add.u64 %r123,%r404,1; st.u64 [%r126],%r123; .loc 1 382 23 add.u64 %r406,%r432,%r120; ld.u64 %r124,[%r406]; .loc 1 382 13 sub.u64 %r407,%r124,%r113; add.u64 %r408,%r407,%r407; add.u64 %r163,%r163,%r408; .loc 1 383 23 add.u64 %r409,%frame,%r120; ld.u64 %r127,[%r409]; .loc 1 383 13 sub.u64 %r410,%r127,%r119; shl.b64 %r411,%r410,3; add.u64 %r164,%r164,%r411; .loc 1 367 32 add.u64 %r413,%r433,%r120; ld.u64 %r130,[%r413]; .loc 1 367 13 add.u64 %r126,%r126,8; setp.eq.u64 %r414,%r123,%r130; @ %r414 bra $L88; .loc 1 362 12 ld.u64 %r203,[%frame+480]; bra $L78; $L88: .loc 1 371 20 add.u64 %r416,%r426,%r120; ld.u64 %r202,[%r416]; add.u64 %r120,%r120,8; bra $L89; $L114: .loc 1 274 6 ld.u64 %r417,[%r204]; setp.eq.u64 %r418,%r417,0; @ ! 
%r418 bra $L64; bra $L90; $L52: .loc 1 387_gfortran_sfindloc1_i2 .visible .func _gfortran_sfindloc1_i268u64 %r126; .reg .u64u32pred %r190; .reg .pred %r192; .reg .u64pred %r213; .reg .predu32predpredpred %r257; .reg .u64 %r260; .reg .u64 %r261; .reg .pred %r263; .reg .u64 %r264; .reg .u32 %r265; .reg .u64 %r266; .reg .u6464predmov.u64 %r149,%ar0; mov.u64 %r150,%ar1; mov.u32 %r151,%ar2; mov.u64 %r152,%ar3; mov.u64 %r153,%ar4; mov.u32 %r154,%ar5; .loc 1 410 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L116; .loc 1 410 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L117; $L116:call _gfortran_findloc1_i2); } .loc 1 413 7 bra $L115; $L117: .loc 1 416 10 ld.s8 %r23,[%r150+28]; .loc 1 416 38 add.u32 %r163,%r23,-1; .loc 1 416 8 cvt.s64.s32 %r90,%r163; .loc 1 417 10 ld.u64 %r26,[%r152]; .loc 1 417 7 add.u64 %r91,%r26,-1; .loc 1 419 7 shr.u64 %r165,%r91,63; set.u32.lt.s64 %r167,%r90,%r91; neg.s32 %r168,%r167; cvt.u16.u64 %r171,%r165; cvt.u16.u32 %r172,%r168; or.b16 %r170,%r171,%r172; .loc 1 419 6 cvt.u32.u16 %r173,%r170; cvt.u16.u8 %r174,%r173; setp.ne.u16 %r175,%r174,0; @ %r175 bra $L119; .loc 1 430 3 setp.ne.u64 %r176,%r91,0; @ %r176 bra $L120; $L126: .loc 1 438 3 setp.gt.s64 %r177,%r90,%r91; @ %r177 bra $L121; bra $L163; $L119: .loc 1 421 7 cvt.u32.u32 %r180,%r23; cvt.s64.s8 %r179,%r180; st.u64 [%stack+8],%r179_gfortran_runtime_error120: add.u64 %r136,%r150,48; add.u64 %r138,%frame,120; add.u64 %r139,%r150,56; add.u64 %r182,%r150,24; add.u64 %r184,%r26,%r26; add.u64 %r185,%r184,%r26; shl.b64 %r186,%r185,3; add.u64 %r145,%r182,%r186; .loc 1 435 12 mov.u64 %r302,0; $L125: .loc 1 432 19 ld.u64 %r188,[%r139]; add.u64 %r187,%r188,1; ld.u64 %r189,[%r136]; sub.u64 %r35,%r187,%r189; .loc 1 434 10 setp.le.s64 %r190,%r35,0; @ %r190 bra $L123; .loc 1 432 17 st.u64 [%r138],%r35; bra $L124; $L123: .loc 1 435 12 st.u64 [%r138],%r302; $L124: .loc 1 430 3 add.u64 %r136,%r136,24; add.u64 %r138,%r138,8; add.u64 %r139,%r139,24; setp.ne.u64 %r192,%r136,%r145; @ %r192 bra $L125; bra $L126; $L163: .loc 1 448 6 ld.u64 %r193,[%r149]; setp.eq.u64 %r194,%r193,0; @ ! 
%r194 bra $L128; bra $L127; $L121: add.u64 %r196,%r26,%r26; add.u64 %r197,%r196,%r26; shl.b64 %r198,%r197,3; add.u64 %r199,%r198,48; add.u64 %r125,%r150,%r199; add.u64 %r294,%frame,120; shl.b64 %r201,%r26,3; add.u64 %r202,%r201,-8; add.u64 %r45,%r294,%r202; add.u64 %r203,%r150,48; cvt.u32.u32 %r205,%r23; cvt.s64.s8 %r204,%r205; add.u64 %r207,%r204,%r204; add.u64 %r208,%r207,%r204; shl.b64 %r209,%r208,3; add.u64 %r132,%r203,%r209; .loc 1 444 12 mov.u64 %r301,0; $L131: .loc 1 441 2 ld.u64 %r211,[%r125+8]; add.u64 %r210,%r211,1; ld.u64 %r212,[%r125]; sub.u64 %r40,%r210,%r212; .loc 1 443 10 setp.le.s64 %r213,%r40,0; @ %r213 bra $L129; .loc 1 440 17 st.u64 [%r45],%r40; bra $L130; $L129: .loc 1 444 12 st.u64 [%r45],%r301; $L130: .loc 1 438 3 add.u64 %r125,%r125,24; add.u64 %r45,%r45,8; setp.ne.u64 %r215,%r125,%r132; @ %r215 bra $L131; bra $L164; $L127: .loc 1 452 7 setp.le.s64 %r216,%r90,0; @ %r216 bra $L133; add.u64 %r294,%frame,120; $L149: add.u64 %r78,%r149,40; mov.u64 %r61,%r294; cvt.u32.u32 %r218,%r23; cvt.s64.s8 %r217,%r218; add.u64 %r220,%r217,%r217; add.u64 %r221,%r220,%r217; shl.b64 %r222,%r221,3; add.u64 %r223,%r149,16; add.u64 %r126,%r222,%r223; .loc 1 455 10 mov.u64 %r81,1; .loc 1 459 4 mov.u64 %r246,0; bra $L134; $L133: .loc 1 462 24 mov.u64 %r224,0; st.u64 [%r149+8],%r224; .loc 1 463 28 cvt.u16.u32 %r227,%r23; add.u16 %r226,%r227,-1; cvt.u32.u16 %r228,%r226; st.u8 [%r149+28],%r228; .loc 1 465 20 add.u32 %r229,%r23,-2; cvt.s64.s32 %r51,%r229; add.u64 %r231,%r51,%r51; add.u64 %r232,%r231,%r51; shl.b64 %r233,%r232,3; add.u64 %r234,%r149,%r233; .loc 1 465 67 shl.b64 %r236,%r51,3; add.u64 %r237,%frame,%r236; .loc 1 465 59 ld.u64 %r239,[%r234+40]; ld.u64 %r240,[%r237+120]; mul.lo.u64 %r94,%r239,%r240; .loc 1 467 29 mov.u64 %r24242; call (%value_in),_gfortrani_xmallocarray3,[%value_in]; } .loc 1 467 27 st.u64 [%r149],%r243; .loc 1 468 10 setp.eq.u64 %r245,%r94,0; @ ! 
%r245 bra $L138; bra $L135; $L137: .loc 1 457 48 mul.lo.u64 %r81,%r47,%r81; $L134: .loc 1 459 4 st.u64 [%r78+8],%r246; ld.u64 %r47,[%r61]; add.u64 %r247,%r47,-1; st.u64 [%r78+16],%r247; st.u64 [%r78],%r81; .loc 1 452 7 add.u64 %r78,%r78,24; add.u64 %r61,%r61,8; setp.ne.u64 %r248,%r78,%r126; @ %r248 bra $L137; bra $L133; $L135: .loc 1 471 4 st.u64 [%r149+48],%r94; mov.u64 %r250,-1; st.u64 [%r149+56],%r250; mov.u64 %r251,1; st.u64 [%r149+40],%r251; .loc 1 472 4 bra $L115; $L128: .loc 1 477 19 ld.s8 %r58,[%r149+28]; .loc 1 477 10 setp.eq.u64 %r252,%r58,%r90; @ %r252 bra $L139; .loc 1 478 2 st.u64 [%stack+8],%r90; st.u64 [%stack],%r58;_gfortran_runtime_error139: .loc 1 483 11 cvta.global.u64 %r255,_gfortrani_compile_options; .loc 1 483 10 ld.u32 %r256,[%r255+36]; setp.eq.u32 %r257,%r256,0; @ %r257 bra $L138; .loc 1 484 2 add.u64 %r294,%frame,120; cvta.const.u64 %r261,$LC2261; call _gfortrani_bounds_ifunction_return$L138: .loc 1 488 3 setp.gt.s64 %r263,%r90,0; @ %r263 bra $L140; $L143: .loc 1 495 8 ld.u64 %r97,[%r149]; ld.u64 %r42,[%frame+240]; .loc 1 503 22 ld.u64 %r64,[%frame]; .loc 1 503 12 shl.b64 %r66,%r64,3; setp.le.s64 %r295,%r90,1; cvt.u32.u32 %r297,%r23; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; add.u64 %r300,%frame,120; bra $L141; $L140: add.u64 %r114,%frame,240; add.u64 %r113,%r149,40; mov.u64 %r108,%frame; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r79,%r264,-1; .loc 1 488 3 mov.u64 %r98,0; add.u64 %r294,%frame,120; .loc 1 490 16 mov.u64 %r266,%r98; $L142: st.u64 [%r114],%r266; .loc 1 491 18 ld.u64 %r267,[%r113]; st.u64 [%r108],%r267; .loc 1 492 17 shl.b64 %r269,%r98,3; add.u64 %r270,%r294,%r269; .loc 1 492 10 ld.u64 %r271,[%r270]; setp.le.s64 %r272,%r271,0; @ %r272 bra $L115; .loc 1 488 26 add.u64 %r98,%r98,1; .loc 1 488 3 add.u64 %r114,%r114,8; add.u64 %r113,%r113,24; add.u64 %r108,%r108,8; setp.ne.u64 %r273,%r79,%r98; @ %r273 bra $L142; bra $L143; $L141: .loc 1 500 13 mov.u64 %r274,0; st.u64 [%r97],%r274; .loc 1 502 15 add.u64 %r42,%r42,1; .loc 1 503 12 add.u64 %r97,%r97,%r66; .loc 1 505 32 ld.u64 %r88,[%frame+120]; .loc 1 505 13 setp.ne.u64 %r275,%r42,%r88; @ %r275 bra $L141; .loc 1 508 23 mul.lo.u64 %r276,%r64,%r42; .loc 1 508 9 shl.b64 %r277,%r276,3; sub.u64 %r30,%r97,%r277; .loc 1 510 7 @ ! %r295 bra $L165; bra $L115; $L148: .loc 1 507 13 st.u64 [%r102],%r274; .loc 1 508 23 mul.lo.u64 %r280,%r73,%r72; .loc 1 508 9 shl.b64 %r281,%r280,3; sub.u64 %r30,%r97,%r281; .loc 1 509 5 add.u64 %r96,%r96,1; .loc 1 510 7 add.u64 %r102,%r102,8; setp.ne.u64 %r282,%r96,%r299; @ %r282 bra $L147; bra $L115; $L165: add.u64 %r102,%frame,248; .loc 1 509 5 mov.u64 %r96,1; $L147: .loc 1 517 16 ld.u64 %r286,[%r102]; add.u64 %r72,%r286,1; st.u64 [%r102],%r72; shl.b64 %r120,%r96,3; .loc 1 518 23 add.u64 %r287,%frame,%r120; ld.u64 %r73,[%r287]; .loc 1 518 13 shl.b64 %r288,%r73,3; add.u64 %r97,%r30,%r288; .loc 1 505 32 add.u64 %r290,%r300,%r120; ld.u64 %r77,[%r290]; .loc 1 505 13 setp.eq.u64 %r291,%r72,%r77; @ %r291 bra $L148; .loc 1 507 13 mov.u64 %r42,0; bra $L141; $L164: .loc 1 448 6 ld.u64 %r292,[%r149]; setp.eq.u64 %r293,%r292,0; @ ! 
%r293 bra $L128; bra $L149; $L115: .loc 1 522 findloc1_i4.o/_gfortran_findloc1_i4 .visible .func _gfortran_findloc1_i4); .file 1 "../../../../libgfortran/generated/findloc1_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc1_i4 .visible .func _gfortran_mfindloc1_i4_gfortran_sfindloc1_i4 .visible .func _gfortran_sfindloc1_i4); // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_internal_error .extern .func _gfortrani_internal_8] = {70,73,78,68,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,914,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_findloc1_i4 .visible .func _gfortran_findloc1_i44898predpred %r238; .reg .u64pred %r260; .reg .pred %r262; .reg .pred %r263; .reg .u64 %r264; .reg .u32 %r265; .reg .u64u64 %r279; .reg .u64 %r280; .reg .u6464pred %r327; .reg .predu64u64 %r352; .reg .u64 %r354; .reg .predpred %r358; .reg .pred %r359; .reg .u64predmov.u64 %r185,%ar0; mov.u64 %r186,%ar1; mov.u32 %r187,%ar2; mov.u64 %r188,%ar3; mov.u32 %r189,%ar4; .loc 1 54 10 ld.s8 %r22,[%r186+28]; .loc 1 54 38 add.u32 %r190,%r22,-1; .loc 1 54 8 cvt.s64.s32 %r118,%r190; .loc 1 55 10 ld.u64 %r25,[%r188]; .loc 1 55 7 add.u64 %r119,%r25,-1; .loc 1 57 7 shr.u64 %r192,%r119,63; set.u32.lt.s64 %r194,%r118,%r119; neg.s32 %r195,%r194; cvt.u16.u64 %r198,%r192; cvt.u16.u32 %r199,%r195; or.b16 %r197,%r198,%r199; .loc 1 57 6 cvt.u32.u16 %r200,%r197; cvt.u16.u8 %r201,%r200; setp.eq.u16 %r202,%r201,0; @ %r202 bra $L2; .loc 1 59 7 cvt.u32.u32 %r205,%r22; cvt.s64.s8 %r204,%r205; st.u64 [%stack+8],%r20403203_gfortran_runtime_errorr208,%r119,%r119; add.u64 %r209,%r208,%r119; shl.b64 %r210,%r209,3; add.u64 %r211,%r186,%r210; ld.u64 %r31,[%r211+56]; ld.u64 %r33,[%r211+48]; .loc 1 67 9 ld.u64 %r122,[%r211+40]; .loc 1 69 3 setp.ne.u64 %r225,%r119,0; @ %r225 bra $L3; $L9: .loc 1 77 3 setp.gt.s64 %r226,%r118,%r119; @ %r226 bra $L4; bra $L50; $L3: add.u64 %r177,%r186,40; add.u64 %r67,%frame,120; add.u64 %r70,%frame,240; add.u64 %r228,%r25,%r25; add.u64 %r229,%r228,%r25; shl.b64 %r230,%r229,3; add.u64 %r231,%r186,16; add.u64 %r182,%r230,%r231; .loc 1 75 12 mov.u64 %r371,0; $L8: .loc 1 71 18 ld.u64 %r232,[%r177]; st.u64 [%r67],%r232; .loc 1 72 19 ld.u64 %r234,[%r177+16]; add.u64 %r233,%r234,1; ld.u64 %r235,[%r177+8]; sub.u64 %r38,%r233,%r235; .loc 1 74 10 setp.lt.s64 %r236,%r38,0; @ %r236 bra $L6; .loc 1 72 17 st.u64 [%r70],%r38; bra $L7; $L6: .loc 1 75 12 st.u64 [%r70],%r371; $L7: .loc 1 69 3 add.u64 %r177,%r177,24; add.u64 %r67,%r67,8; add.u64 %r70,%r70,8; setp.ne.u64 %r238,%r177,%r182; @ %r238 bra $L8; bra $L9; $L50: .loc 1 86 6 ld.u64 %r239,[%r185]; setp.eq.u64 %r240,%r239,0; @ ! 
%r240 bra $L11; bra $L10; $L4: add.u64 %r242,%r25,%r25; add.u64 %r243,%r242,%r25; shl.b64 %r244,%r243,3; add.u64 %r245,%r244,40; add.u64 %r115,%r186,%r245; shl.b64 %r246,%r25,3; add.u64 %r105,%r246,-8; add.u64 %r361,%frame,120; add.u64 %r108,%r361,%r105; add.u64 %r360,%frame,240; add.u64 %r62,%r360,%r105; add.u64 %r249,%r186,40; cvt.u32.u32 %r251,%r22; cvt.s64.s8 %r250,%r251; add.u64 %r253,%r250,%r250; add.u64 %r254,%r253,%r250; shl.b64 %r255,%r254,3; add.u64 %r180,%r249,%r255; .loc 1 83 12 mov.u64 %r370,0; $L14: .loc 1 79 18 ld.u64 %r256,[%r115]; st.u64 [%r108],%r256; .loc 1 80 19 ld.u64 %r258,[%r115+16]; add.u64 %r257,%r258,1; ld.u64 %r259,[%r115+8]; sub.u64 %r45,%r257,%r259; .loc 1 82 10 setp.lt.s64 %r260,%r45,0; @ %r260 bra $L12; .loc 1 80 17 st.u64 [%r62],%r45; bra $L13; $L12: .loc 1 83 12 st.u64 [%r62],%r370; $L13: .loc 1 77 3 add.u64 %r115,%r115,24; add.u64 %r108,%r108,8; add.u64 %r62,%r62,8; setp.ne.u64 %r262,%r115,%r180; @ %r262 bra $L14; bra $L51; $L10: .loc 1 90 7 setp.le.s64 %r263,%r118,0; @ %r263 bra $L16; add.u64 %r360,%frame,240; $L36: add.u64 %r153,%r185,40; mov.u64 %r149,%r360; cvt.u32.u32 %r265,%r22; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r270,%r185,16; add.u64 %r116,%r269,%r270; .loc 1 93 10 mov.u64 %r112,1; .loc 1 97 4 mov.u64 %r293,0; bra $L17; $L16: .loc 1 101 24 mov.u64 %r271,0; st.u64 [%r185+8],%r271; .loc 1 102 28 cvt.u16.u32 %r274,%r22; add.u16 %r273,%r274,-1; cvt.u32.u16 %r275,%r273; st.u8 [%r185+28],%r275; .loc 1 104 20 add.u32 %r276,%r22,-2; cvt.s64.s32 %r54,%r276; add.u64 %r278,%r54,%r54; add.u64 %r279,%r278,%r54; shl.b64 %r280,%r279,3; add.u64 %r281,%r185,%r280; .loc 1 104 67 shl.b64 %r283,%r54,3; add.u64 %r284,%frame,%r283; .loc 1 104 59 ld.u64 %r286,[%r281+40]; ld.u64 %r287,[%r284+240]; mul.lo.u64 %r124,%r286,%r287; .loc 1 106 29 mov.u64 %r289189; call (%value_in),_gfortrani_xmallocarray290,[%value_in]; } .loc 1 106 27 st.u64 [%r185],%r290; .loc 1 107 10 setp.eq.u64 %r292,%r124,0; @ ! 
%r292 bra $L21; bra $L18; $L20: .loc 1 95 48 mul.lo.u64 %r112,%r50,%r112; $L17: .loc 1 97 4 st.u64 [%r153+8],%r293; ld.u64 %r50,[%r149]; add.u64 %r294,%r50,-1; st.u64 [%r153+16],%r294; st.u64 [%r153],%r112; .loc 1 90 7 add.u64 %r153,%r153,24; add.u64 %r149,%r149,8; setp.ne.u64 %r295,%r116,%r153; @ %r295 bra $L20; bra $L16; $L18: .loc 1 110 4 st.u64 [%r185+48],%r124; mov.u64 %r297,-1; st.u64 [%r185+56],%r297; mov.u64 %r298,1; st.u64 [%r185+40],%r298; .loc 1 111 4 bra $L1; $L11: .loc 1 116 19 ld.s8 %r60,[%r185+28]; .loc 1 116 10 setp.eq.u64 %r299,%r60,%r118; @ %r299 bra $L23; .loc 1 117 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 122 11 cvta.global.u64 %r302,_gfortrani_compile_options; .loc 1 122 10 ld.u32 %r303,[%r302+36]; setp.eq.u32 %r304,%r303,0; @ %r304 bra $L21; .loc 1 123 2 add.u64 %r360,%frame,240; cvta.const.u64 %r308,$LC2r360307308; call _gfortrani_bounds_ifunction_return$L21: .loc 1 127 3 setp.gt.s64 %r310,%r118,0; @ %r310 bra $L24; $L27: .loc 1 64 9 add.u64 %r311,%r31,1; .loc 1 64 7 sub.u64 %r120,%r311,%r33; max.s64 %r43,%r120,0; .loc 1 135 8 ld.u64 %r111,[%r185]; .loc 1 138 8 ld.u64 %r132,[%r186]; .loc 1 160 35 shl.b64 %r74,%r122,2; .loc 1 147 22 add.u64 %r312,%r43,-1; .loc 1 147 15 mul.lo.u64 %r68,%r312,%r74; .loc 1 148 34 neg.s64 %r71,%r74; ld.u64 %r82,[%frame+360]; .loc 1 172 22 ld.u64 %r75,[%frame+120]; .loc 1 172 12 shl.b64 %r77,%r75,2; .loc 1 173 22 ld.u64 %r78,[%frame]; .loc 1 173 12 shl.b64 %r80,%r78,3; setp.ne.u32 %r359,%r189,0; setp.le.s64 %r358,%r118,1; .loc 1 148 4 setp.le.s64 %r363,%r120,0; .loc 1 160 4 setp.gt.s64 %r364,%r120,0; cvt.u32.u32 %r365,%r22; cvt.s64.s8 %r366,%r365; add.u64 %r367,%r366,-1; add.u64 %r368,%frame,120; add.u64 %r369,%frame,240; bra $L25; $L24: add.u64 %r170,%frame,360; add.u64 %r169,%r185,40; mov.u64 %r167,%frame; cvt.u32.u32 %r314,%r22; cvt.s64.s8 %r313,%r314; add.u64 %r154,%r313,-1; .loc 1 127 3 mov.u64 %r133,0; add.u64 %r360,%frame,240; .loc 1 129 16 mov.u64 %r315,%r133; $L26: st.u64 [%r170],%r315; .loc 1 130 18 ld.u64 %r316,[%r169]; st.u64 [%r167],%r316; .loc 1 131 17 shl.b64 %r318,%r133,3; add.u64 %r319,%r360,%r318; .loc 1 131 10 ld.u64 %r320,[%r319]; setp.le.s64 %r321,%r320,0; @ %r321 bra $L1; .loc 1 127 26 add.u64 %r133,%r133,1; .loc 1 127 3 add.u64 %r170,%r170,8; add.u64 %r169,%r169,24; add.u64 %r167,%r167,8; setp.ne.u64 %r322,%r133,%r154; @ %r322 bra $L26; bra $L27; $L25: .loc 1 145 10 @ %r359 bra $L28; .loc 1 160 4 @ %r364 bra $L37; .loc 1 144 14 mov.u64 %r126,0; bra $L30; $L28: .loc 1 147 8 add.u64 %r127,%r132,%r68; .loc 1 148 4 @ %r363 bra $L38; .loc 1 148 11 mov.u64 %r126,%r43; $L31: .loc 1 150 11 ld.u32 %r326,[%r127]; setp.eq.u32 %r327,%r326,%r187; @ %r327 bra $L30; .loc 1 148 26 add.u64 %r126,%r126,-1; .loc 1 148 34 add.u64 %r127,%r127,%r71; .loc 1 148 4 setp.ne.u64 %r328,%r126,0; @ %r328 bra $L31; bra $L30; $L37: .loc 1 160 4 mov.u64 %r125,%r132; .loc 1 160 11 mov.u64 %r126,1; $L29: .loc 1 162 11 ld.u32 %r329,[%r125]; setp.eq.u32 %r330,%r329,%r187; @ %r330 bra $L30; .loc 1 160 27 add.u64 %r126,%r126,1; .loc 1 160 35 add.u64 %r125,%r125,%r74; .loc 1 160 4 setp.ge.s64 %r331,%r43,%r126; @ %r331 bra $L29; .loc 1 144 14 mov.u64 %r126,0; bra $L30; $L38: mov.u64 %r126,0; $L30: .loc 1 169 13 st.u64 [%r111],%r126; .loc 1 171 15 add.u64 %r82,%r82,1; .loc 1 172 12 add.u64 %r132,%r132,%r77; .loc 1 173 12 add.u64 %r111,%r111,%r80; .loc 1 175 32 ld.u64 %r141,[%frame+240]; .loc 1 175 13 setp.ne.u64 %r332,%r82,%r141; @ %r332 bra $L25; .loc 1 178 23 mul.lo.u64 %r333,%r75,%r82; .loc 1 178 9 shl.b64 
%r334,%r333,2; sub.u64 %r129,%r132,%r334; .loc 1 179 23 mul.lo.u64 %r335,%r78,%r82; .loc 1 179 9 shl.b64 %r336,%r335,3; sub.u64 %r130,%r111,%r336; .loc 1 181 7 @ %r358 bra $L1; add.u64 %r66,%frame,368; mov.u64 %r159,8; .loc 1 180 5 mov.u64 %r131,1; .loc 1 177 13 mov.u64 %r362,0; bra $L34; $L35: st.u64 [%r66],%r362; .loc 1 178 23 mul.lo.u64 %r342,%r95,%r94; .loc 1 178 9 shl.b64 %r343,%r342,2; sub.u64 %r129,%r132,%r343; .loc 1 179 23 mul.lo.u64 %r344,%r98,%r94; .loc 1 179 9 shl.b64 %r345,%r344,3; sub.u64 %r130,%r111,%r345; .loc 1 180 5 add.u64 %r131,%r131,1; .loc 1 181 7 add.u64 %r66,%r66,8; add.u64 %r159,%r159,8; setp.eq.u64 %r346,%r131,%r367; @ %r346 bra $L1; $L34: .loc 1 188 16 ld.u64 %r347,[%r66]; add.u64 %r94,%r347,1; st.u64 [%r66],%r94; .loc 1 189 23 add.u64 %r349,%r368,%r159; ld.u64 %r95,[%r349]; .loc 1 189 13 shl.b64 %r350,%r95,2; add.u64 %r132,%r129,%r350; .loc 1 190 23 add.u64 %r351,%frame,%r159; ld.u64 %r98,[%r351]; .loc 1 190 13 shl.b64 %r352,%r98,3; add.u64 %r111,%r130,%r352; .loc 1 175 32 add.u64 %r354,%r369,%r159; ld.u64 %r102,[%r354]; .loc 1 175 13 setp.eq.u64 %r355,%r94,%r102; @ %r355 bra $L35; .loc 1 177 13 mov.u64 %r82,0; bra $L25; $L51: .loc 1 86 6 ld.u64 %r356,[%r185]; setp.eq.u64 %r357,%r356,0; @ ! %r357 bra $L11; bra $L36; $L1: .loc 1 194_gfortran_mfindloc1_i4 .visible .func _gfortran_mfindloc1_i4608u32 %r209; .reg .u32 %r210; .reg .u64 %r212; .reg .u32 %r214; .reg .u32 %r215; .reg .u16 %r217; .reg .u16 %r218; .reg .u16 %r219; .reg .u32 %r220; .reg .u16 %r221; .reg .predu64 %r230; .reg .u64 %r231; .reg .u64 %r249; .reg .u32u32u64 %r281; .reg .u64 %r282; .reg .u64 %r283; .reg .u64predu64u16 %r331; .reg .u16predpredu64 %r375; .reg .u64pred %r379; .reg .pred %r381; .reg .u16 %r383; .reg .u32 %r384; .reg .pred %r385; .reg .u32pred %r393; .reg .pred %r394; .reg .predu64u64 %r430; .reg .u64 %r431; mov.u64 %r204,%ar0; mov.u64 %r205,%ar1; mov.u32 %r206,%ar2; mov.u64 %r207,%ar3; mov.u64 %r208,%ar4; mov.u32 %r209,%ar5; .loc 1 225 10 ld.s8 %r22,[%r205+28]; .loc 1 225 38 add.u32 %r210,%r22,-1; .loc 1 225 8 cvt.s64.s32 %r147,%r210; .loc 1 226 10 ld.u64 %r25,[%r207]; .loc 1 226 7 add.u64 %r148,%r25,-1; .loc 1 228 7 shr.u64 %r212,%r148,63; set.u32.lt.s64 %r214,%r147,%r148; neg.s32 %r215,%r214; cvt.u16.u64 %r218,%r212; cvt.u16.u32 %r219,%r215; or.b16 %r217,%r218,%r219; .loc 1 228 6 cvt.u32.u16 %r220,%r217; cvt.u16.u8 %r221,%r220; setp.eq.u16 %r222,%r221,0; @ %r222 bra $L53; .loc 1 230 7 cvt.u32.u32 %r225,%r22; cvt.s64.s8 %r224,%r225; st.u64 [%stack+8],%r224232stack; call _gfortran_runtime_error3: .loc 1 235 9 add.u64 %r228,%r148,%r148; add.u64 %r229,%r228,%r148; shl.b64 %r230,%r229,3; add.u64 %r231,%r205,%r230; ld.u64 %r31,[%r231+56]; ld.u64 %r33,[%r231+48]; .loc 1 239 9 ld.u64 %r151,[%r231+40]; .loc 1 240 12 add.u64 %r249,%r208,%r230; ld.u64 %r34,[%r249+40]; ld.u64 %r36,[%r208+16]; .loc 1 242 9 ld.u64 %r160,[%r208]; .loc 1 246 22 cvt.u32.u64 %r39,%r36; .loc 1 246 53 add.u32 %r251,%r39,-4; and.b32 %r252,%r251,-5; set.u32.eq.u32 %r254,%r252,0; neg.s32 %r255,%r254; .loc 1 246 22 add.u32 %r256,%r39,-1; set.u32.le.u32 %r258,%r256,1; neg.s32 %r259,%r258; .loc 1 246 6 cvt.u16.u32 %r261,%r255; cvt.u16.u32 %r262,%r259; or.b16 %r260,%r261,%r262; cvt.u32.u16 %r263,%r260; cvt.u16.u8 %r264,%r263; setp.eq.u16 %r265,%r264,0; @ %r265 bra $L54; $L58: .loc 1 255 3 setp.ne.u64 %r266,%r148,0; @ %r266 bra $L55; $L62: .loc 1 264 3 setp.gt.s64 %r267,%r147,%r148; @ %r267 bra $L56; bra $L113; $L54: .loc 1 248 7 setp.eq.u32 %r269,%r39,16; @ %r269 bra $L58; .loc 1 253 5 cvta.const.u64 %r271,$LC4; mov.u64 
%r270,27271; call _gfortrani_internal_error5: add.u64 %r146,%r205,40; add.u64 %r167,%r208,40; add.u64 %r273,%r25,%r25; add.u64 %r274,%r273,%r25; shl.b64 %r275,%r274,3; add.u64 %r276,%r205,16; add.u64 %r101,%r275,%r276; .loc 1 255 3 mov.u64 %r200,0; add.u64 %r418,%frame,240; add.u64 %r419,%frame,120; add.u64 %r416,%frame,360; .loc 1 262 12 mov.u64 %r431,%r200; $L61: .loc 1 257 18 add.u64 %r278,%r418,%r200; ld.u64 %r279,[%r146]; st.u64 [%r278],%r279; .loc 1 258 18 add.u64 %r281,%r419,%r200; .loc 1 258 20 ld.u64 %r283,[%r167]; mul.lo.u64 %r282,%r283,%r36; .loc 1 258 18 st.u64 [%r281],%r282; .loc 1 259 19 ld.u64 %r285,[%r146+16]; add.u64 %r284,%r285,1; ld.u64 %r286,[%r146+8]; sub.u64 %r53,%r284,%r286; .loc 1 261 10 setp.lt.s64 %r287,%r53,0; @ %r287 bra $L59; .loc 1 259 17 add.u64 %r289,%r416,%r200; st.u64 [%r289],%r53; bra $L60; $L59: .loc 1 262 12 add.u64 %r291,%r416,%r200; st.u64 [%r291],%r431; $L60: .loc 1 255 3 add.u64 %r146,%r146,24; add.u64 %r167,%r167,24; add.u64 %r200,%r200,8; setp.ne.u64 %r293,%r101,%r146; @ %r293 bra $L61; bra $L62; $L113: .loc 1 274 6 ld.u64 %r294,[%r204]; setp.eq.u64 %r295,%r294,0; @ ! %r295 bra $L64; bra $L63; $L56: add.u64 %r297,%r25,%r25; add.u64 %r298,%r297,%r25; shl.b64 %r299,%r298,3; add.u64 %r143,%r299,40; add.u64 %r156,%r205,%r143; add.u64 %r141,%r208,%r143; shl.b64 %r300,%r25,3; add.u64 %r137,%r300,-8; cvt.u32.u32 %r302,%r22; cvt.s64.s8 %r301,%r302; shl.b64 %r303,%r301,3; add.u64 %r84,%r303,-8; add.u64 %r418,%frame,240; add.u64 %r419,%frame,120; add.u64 %r416,%frame,360; .loc 1 271 12 mov.u64 %r430,0; $L67: .loc 1 266 18 add.u64 %r305,%r418,%r137; ld.u64 %r306,[%r156]; st.u64 [%r305],%r306; .loc 1 267 18 add.u64 %r308,%r419,%r137; .loc 1 267 20 ld.u64 %r310,[%r141]; mul.lo.u64 %r309,%r310,%r36; .loc 1 267 18 st.u64 [%r308],%r309; .loc 1 268 19 ld.u64 %r312,[%r156+16]; add.u64 %r311,%r312,1; ld.u64 %r313,[%r156+8]; sub.u64 %r62,%r311,%r313; .loc 1 270 10 setp.lt.s64 %r314,%r62,0; @ %r314 bra $L65; .loc 1 268 17 add.u64 %r316,%r416,%r137; st.u64 [%r316],%r62; bra $L66; $L65: .loc 1 271 12 add.u64 %r318,%r416,%r137; st.u64 [%r318],%r430; $L66: .loc 1 264 3 add.u64 %r156,%r156,24; add.u64 %r141,%r141,24; add.u64 %r137,%r137,8; setp.ne.u64 %r320,%r84,%r137; @ %r320 bra $L67; bra $L114; $L63: .loc 1 278 7 setp.le.s64 %r321,%r147,0; @ %r321 bra $L69; add.u64 %r416,%frame,360; $L90: add.u64 %r179,%r204,40; mov.u64 %r176,%r416; add.u64 %r322,%r204,16; cvt.u32.u32 %r324,%r22; cvt.s64.s8 %r323,%r324; add.u64 %r326,%r323,%r323; add.u64 %r327,%r326,%r323; shl.b64 %r328,%r327,3; add.u64 %r169,%r322,%r328; .loc 1 281 10 mov.u64 %r140,1; .loc 1 285 4 mov.u64 %r351,0; bra $L70; $L69: .loc 1 289 24 mov.u64 %r329,0; st.u64 [%r204+8],%r329; .loc 1 290 28 cvt.u16.u32 %r332,%r22; add.u16 %r331,%r332,-1; cvt.u32.u16 %r333,%r331; st.u8 [%r204+28],%r333; .loc 1 292 20 add.u32 %r334,%r22,-2; cvt.s64.s32 %r72,%r334; add.u64 %r336,%r72,%r72; add.u64 %r337,%r336,%r72; shl.b64 %r338,%r337,3; add.u64 %r339,%r204,%r338; .loc 1 292 67 shl.b64 %r341,%r72,3; add.u64 %r342,%frame,%r341; .loc 1 292 59 ld.u64 %r344,[%r339+40]; ld.u64 %r345,[%r342+360]; mul.lo.u64 %r154,%r344,%r345; .loc 1 294 29 mov.u64 %r3471547; call (%value_in),_gfortrani_xmallocarray8,[%value_in]; } .loc 1 294 27 st.u64 [%r204],%r348; .loc 1 295 10 setp.eq.u64 %r350,%r154,0; @ ! 
%r350 bra $L74; bra $L71; $L73: .loc 1 283 48 mul.lo.u64 %r140,%r68,%r140; $L70: .loc 1 285 4 st.u64 [%r179+8],%r351; ld.u64 %r68,[%r176]; add.u64 %r352,%r68,-1; st.u64 [%r179+16],%r352; st.u64 [%r179],%r140; .loc 1 278 7 add.u64 %r179,%r179,24; add.u64 %r176,%r176,8; setp.ne.u64 %r353,%r169,%r179; @ %r353 bra $L73; bra $L69; $L71: .loc 1 298 4 st.u64 [%r204+48],%r154; mov.u64 %r355,-1; st.u64 [%r204+56],%r355; mov.u64 %r356,1; st.u64 [%r204+40],%r356; .loc 1 299 4 bra $L52; $L64: .loc 1 304 19 ld.s8 %r79,[%r204+28]; .loc 1 304 10 setp.eq.u64 %r357,%r79,%r147; @ %r357 bra $L76; .loc 1 305 2 st.u64 [%stack+8],%r147; st.u64 [%stack],%r79;_gfortran_runtime_error76: .loc 1 310 11 cvta.global.u64 %r360,_gfortrani_compile_options; .loc 1 310 10 ld.u32 %r361,[%r360+36]; setp.eq.u32 %r362,%r361,0; @ %r362 bra $L74; .loc 1 311 2 add.u64 %r416,%frame,360; cvta.const.u64 %r366,$LC2r416366; call _gfortrani_bounds_ifunction_return$L74: .loc 1 315 3 setp.gt.s64 %r368,%r147,0; @ %r368 bra $L77; $L80: .loc 1 235 9 add.u64 %r369,%r31,1; .loc 1 235 7 sub.u64 %r149,%r369,%r33; max.s64 %r65,%r149,0; .loc 1 240 12 mul.lo.u64 %r37,%r34,%r36; .loc 1 323 8 ld.u64 %r164,[%r204]; .loc 1 326 8 ld.u64 %r163,[%r205]; .loc 1 351 35 shl.b64 %r100,%r151,2; .loc 1 336 22 add.u64 %r64,%r65,-1; .loc 1 336 15 mul.lo.u64 %r89,%r64,%r100; .loc 1 337 29 mul.lo.u64 %r177,%r37,%r64; .loc 1 338 34 neg.s64 %r95,%r100; .loc 1 363 22 ld.u64 %r102,[%frame+240]; .loc 1 363 12 shl.b64 %r104,%r102,2; .loc 1 364 23 ld.u64 %r105,[%frame+120]; .loc 1 365 22 ld.u64 %r107,[%frame]; .loc 1 365 12 shl.b64 %r109,%r107,3; .loc 1 362 12 ld.u64 %r203,[%frame+480]; setp.ne.u32 %r417,%r209,0; add.u64 %r415,%frame,488; .loc 1 338 4 setp.le.s64 %r423,%r149,0; .loc 1 382 23 add.u64 %r424,%frame,240; .loc 1 367 32 add.u64 %r425,%frame,360; .loc 1 371 20 add.u64 %r426,%frame,120; bra $L78; $L77: add.u64 %r198,%frame,480; add.u64 %r197,%r204,40; mov.u64 %r195,%frame; cvt.u32.u32 %r371,%r22; cvt.s64.s8 %r370,%r371; add.u64 %r181,%r370,-1; .loc 1 315 3 mov.u64 %r165,0; add.u64 %r416,%frame,360; .loc 1 317 16 mov.u64 %r372,%r165; $L79: st.u64 [%r198],%r372; .loc 1 318 18 ld.u64 %r373,[%r197]; st.u64 [%r195],%r373; .loc 1 319 17 shl.b64 %r375,%r165,3; add.u64 %r376,%r416,%r375; .loc 1 319 10 ld.u64 %r377,[%r376]; setp.le.s64 %r378,%r377,0; @ %r378 bra $L52; .loc 1 315 26 add.u64 %r165,%r165,1; .loc 1 315 3 add.u64 %r198,%r198,8; add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; setp.ne.u64 %r379,%r165,%r181; @ %r379 bra $L79; bra $L80; $L78: .loc 1 334 10 @ %r417 bra $L81; .loc 1 351 4 setp.gt.s64 %r381,%r149,0; @ %r381 bra $L91; .loc 1 333 14 mov.u64 %r158,0; bra $L83; $L81: .loc 1 336 8 add.u64 %r29,%r163,%r89; .loc 1 337 9 add.u64 %r159,%r160,%r177; .loc 1 338 4 @ %r423 bra $L92; .loc 1 338 11 mov.u64 %r158,%r65; $L85: .loc 1 340 11 ld.s8 %r384,[%r159]; cvt.u16.u32 %r383,%r384; setp.eq.u16 %r385,%r383,0; @ %r385 bra $L84; .loc 1 340 18 ld.u32 %r386,[%r29]; setp.eq.u32 %r387,%r386,%r206; @ %r387 bra $L83; $L84: .loc 1 338 26 add.u64 %r158,%r158,-1; .loc 1 338 34 add.u64 %r29,%r29,%r95; .loc 1 338 53 sub.u64 %r159,%r159,%r37; .loc 1 338 4 setp.ne.u64 %r388,%r158,0; @ %r388 bra $L85; bra $L83; $L91: .loc 1 351 4 mov.u64 %r157,%r160; mov.u64 %r111,%r163; .loc 1 351 11 mov.u64 %r158,1; $L82: .loc 1 353 11 ld.s8 %r390,[%r157]; cvt.u16.u32 %r389,%r390; setp.eq.u16 %r391,%r389,0; @ %r391 bra $L86; .loc 1 353 18 ld.u32 %r392,[%r111]; setp.eq.u32 %r393,%r392,%r206; @ %r393 bra $L83; $L86: .loc 1 351 27 add.u64 %r158,%r158,1; .loc 1 351 35 add.u64 %r111,%r111,%r100; .loc 
1 351 54 add.u64 %r157,%r157,%r37; .loc 1 351 4 setp.ge.s64 %r394,%r65,%r158; @ %r394 bra $L82; .loc 1 333 14 mov.u64 %r158,0; bra $L83; $L92: mov.u64 %r158,0; $L83: .loc 1 360 13 st.u64 [%r164],%r158; .loc 1 362 15 add.u64 %r203,%r203,1; st.u64 [%frame+480],%r203; .loc 1 363 12 add.u64 %r163,%r163,%r104; .loc 1 364 13 add.u64 %r160,%r160,%r105; .loc 1 365 12 add.u64 %r164,%r164,%r109; .loc 1 367 32 ld.u64 %r130,[%frame+360]; .loc 1 367 13 setp.ne.u64 %r395,%r203,%r130; @ %r395 bra $L78; mov.u64 %r126,%r415; mov.u64 %r127,%r107; mov.u64 %r202,%r105; mov.u64 %r124,%r102; mov.u64 %r120,8; .loc 1 366 9 mov.u64 %r162,0; .loc 1 369 13 mov.u64 %r397,%r162; $L89: st.u64 [%r126+-8],%r397; .loc 1 370 23 mul.lo.u64 %r113,%r130,%r124; .loc 1 371 24 mul.lo.u64 %r398,%r130,%r202; .loc 1 371 10 sub.u64 %r160,%r160,%r398; .loc 1 372 23 mul.lo.u64 %r119,%r130,%r127; .loc 1 373 5 add.u64 %r162,%r162,1; .loc 1 374 7 setp.le.s64 %r399,%r147,%r162; @ %r399 bra $L52; .loc 1 381 16 ld.u64 %r400,[%r126]; add.u64 %r123,%r400,1; st.u64 [%r126],%r123; .loc 1 382 23 add.u64 %r402,%r424,%r120; ld.u64 %r124,[%r402]; .loc 1 382 13 sub.u64 %r403,%r124,%r113; shl.b64 %r404,%r403,2; add.u64 %r163,%r163,%r404; .loc 1 383 23 add.u64 %r405,%frame,%r120; ld.u64 %r127,[%r405]; .loc 1 383 13 sub.u64 %r406,%r127,%r119; shl.b64 %r407,%r406,3; add.u64 %r164,%r164,%r407; .loc 1 367 32 add.u64 %r409,%r425,%r120; ld.u64 %r130,[%r409]; .loc 1 367 13 add.u64 %r126,%r126,8; setp.eq.u64 %r410,%r123,%r130; @ %r410 bra $L88; .loc 1 362 12 ld.u64 %r203,[%frame+480]; bra $L78; $L88: .loc 1 371 20 add.u64 %r412,%r426,%r120; ld.u64 %r202,[%r412]; add.u64 %r120,%r120,8; bra $L89; $L114: .loc 1 274 6 ld.u64 %r413,[%r204]; setp.eq.u64 %r414,%r413,0; @ ! %r414 bra $L64; bra $L90; $L52: .loc 1 387_gfortran_sfindloc1_i4 .visible .func _gfortran_sfindloc1_i468u64 %r126; .reg .u64u32pred %r190; .reg .pred %r192; .reg .u64pred %r213; .reg .predu32predpredpred %r257; .reg .u64 %r260; .reg .u64 %r261; .reg .pred %r263; .reg .u64 %r264; .reg .u32 %r265; .reg .u64 %r266; .reg .u6464predmov.u64 %r149,%ar0; mov.u64 %r150,%ar1; mov.u32 %r151,%ar2; mov.u64 %r152,%ar3; mov.u64 %r153,%ar4; mov.u32 %r154,%ar5; .loc 1 410 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L116; .loc 1 410 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L117; $L116:call _gfortran_findloc1_i4); } .loc 1 413 7 bra $L115; $L117: .loc 1 416 10 ld.s8 %r23,[%r150+28]; .loc 1 416 38 add.u32 %r163,%r23,-1; .loc 1 416 8 cvt.s64.s32 %r90,%r163; .loc 1 417 10 ld.u64 %r26,[%r152]; .loc 1 417 7 add.u64 %r91,%r26,-1; .loc 1 419 7 shr.u64 %r165,%r91,63; set.u32.lt.s64 %r167,%r90,%r91; neg.s32 %r168,%r167; cvt.u16.u64 %r171,%r165; cvt.u16.u32 %r172,%r168; or.b16 %r170,%r171,%r172; .loc 1 419 6 cvt.u32.u16 %r173,%r170; cvt.u16.u8 %r174,%r173; setp.ne.u16 %r175,%r174,0; @ %r175 bra $L119; .loc 1 430 3 setp.ne.u64 %r176,%r91,0; @ %r176 bra $L120; $L126: .loc 1 438 3 setp.gt.s64 %r177,%r90,%r91; @ %r177 bra $L121; bra $L163; $L119: .loc 1 421 7 cvt.u32.u32 %r180,%r23; cvt.s64.s8 %r179,%r180; st.u64 [%stack+8],%r179_gfortran_runtime_error120: add.u64 %r136,%r150,48; add.u64 %r138,%frame,120; add.u64 %r139,%r150,56; add.u64 %r182,%r150,24; add.u64 %r184,%r26,%r26; add.u64 %r185,%r184,%r26; shl.b64 %r186,%r185,3; add.u64 %r145,%r182,%r186; .loc 1 435 12 mov.u64 %r302,0; $L125: .loc 1 432 19 ld.u64 %r188,[%r139]; add.u64 %r187,%r188,1; ld.u64 %r189,[%r136]; sub.u64 %r35,%r187,%r189; .loc 1 434 10 setp.le.s64 %r190,%r35,0; @ %r190 bra $L123; .loc 1 432 17 st.u64 [%r138],%r35; bra $L124; $L123: .loc 
1 435 12 st.u64 [%r138],%r302; $L124: .loc 1 430 3 add.u64 %r136,%r136,24; add.u64 %r138,%r138,8; add.u64 %r139,%r139,24; setp.ne.u64 %r192,%r136,%r145; @ %r192 bra $L125; bra $L126; $L163: .loc 1 448 6 ld.u64 %r193,[%r149]; setp.eq.u64 %r194,%r193,0; @ ! %r194 bra $L128; bra $L127; $L121: add.u64 %r196,%r26,%r26; add.u64 %r197,%r196,%r26; shl.b64 %r198,%r197,3; add.u64 %r199,%r198,48; add.u64 %r125,%r150,%r199; add.u64 %r294,%frame,120; shl.b64 %r201,%r26,3; add.u64 %r202,%r201,-8; add.u64 %r45,%r294,%r202; add.u64 %r203,%r150,48; cvt.u32.u32 %r205,%r23; cvt.s64.s8 %r204,%r205; add.u64 %r207,%r204,%r204; add.u64 %r208,%r207,%r204; shl.b64 %r209,%r208,3; add.u64 %r132,%r203,%r209; .loc 1 444 12 mov.u64 %r301,0; $L131: .loc 1 441 2 ld.u64 %r211,[%r125+8]; add.u64 %r210,%r211,1; ld.u64 %r212,[%r125]; sub.u64 %r40,%r210,%r212; .loc 1 443 10 setp.le.s64 %r213,%r40,0; @ %r213 bra $L129; .loc 1 440 17 st.u64 [%r45],%r40; bra $L130; $L129: .loc 1 444 12 st.u64 [%r45],%r301; $L130: .loc 1 438 3 add.u64 %r125,%r125,24; add.u64 %r45,%r45,8; setp.ne.u64 %r215,%r125,%r132; @ %r215 bra $L131; bra $L164; $L127: .loc 1 452 7 setp.le.s64 %r216,%r90,0; @ %r216 bra $L133; add.u64 %r294,%frame,120; $L149: add.u64 %r78,%r149,40; mov.u64 %r61,%r294; cvt.u32.u32 %r218,%r23; cvt.s64.s8 %r217,%r218; add.u64 %r220,%r217,%r217; add.u64 %r221,%r220,%r217; shl.b64 %r222,%r221,3; add.u64 %r223,%r149,16; add.u64 %r126,%r222,%r223; .loc 1 455 10 mov.u64 %r81,1; .loc 1 459 4 mov.u64 %r246,0; bra $L134; $L133: .loc 1 462 24 mov.u64 %r224,0; st.u64 [%r149+8],%r224; .loc 1 463 28 cvt.u16.u32 %r227,%r23; add.u16 %r226,%r227,-1; cvt.u32.u16 %r228,%r226; st.u8 [%r149+28],%r228; .loc 1 465 20 add.u32 %r229,%r23,-2; cvt.s64.s32 %r51,%r229; add.u64 %r231,%r51,%r51; add.u64 %r232,%r231,%r51; shl.b64 %r233,%r232,3; add.u64 %r234,%r149,%r233; .loc 1 465 67 shl.b64 %r236,%r51,3; add.u64 %r237,%frame,%r236; .loc 1 465 59 ld.u64 %r239,[%r234+40]; ld.u64 %r240,[%r237+120]; mul.lo.u64 %r94,%r239,%r240; .loc 1 467 29 mov.u64 %r24242; call (%value_in),_gfortrani_xmallocarray3,[%value_in]; } .loc 1 467 27 st.u64 [%r149],%r243; .loc 1 468 10 setp.eq.u64 %r245,%r94,0; @ ! 
%r245 bra $L138; bra $L135; $L137: .loc 1 457 48 mul.lo.u64 %r81,%r47,%r81; $L134: .loc 1 459 4 st.u64 [%r78+8],%r246; ld.u64 %r47,[%r61]; add.u64 %r247,%r47,-1; st.u64 [%r78+16],%r247; st.u64 [%r78],%r81; .loc 1 452 7 add.u64 %r78,%r78,24; add.u64 %r61,%r61,8; setp.ne.u64 %r248,%r78,%r126; @ %r248 bra $L137; bra $L133; $L135: .loc 1 471 4 st.u64 [%r149+48],%r94; mov.u64 %r250,-1; st.u64 [%r149+56],%r250; mov.u64 %r251,1; st.u64 [%r149+40],%r251; .loc 1 472 4 bra $L115; $L128: .loc 1 477 19 ld.s8 %r58,[%r149+28]; .loc 1 477 10 setp.eq.u64 %r252,%r58,%r90; @ %r252 bra $L139; .loc 1 478 2 st.u64 [%stack+8],%r90; st.u64 [%stack],%r58;_gfortran_runtime_error139: .loc 1 483 11 cvta.global.u64 %r255,_gfortrani_compile_options; .loc 1 483 10 ld.u32 %r256,[%r255+36]; setp.eq.u32 %r257,%r256,0; @ %r257 bra $L138; .loc 1 484 2 add.u64 %r294,%frame,120; cvta.const.u64 %r261,$LC2261; call _gfortrani_bounds_ifunction_return$L138: .loc 1 488 3 setp.gt.s64 %r263,%r90,0; @ %r263 bra $L140; $L143: .loc 1 495 8 ld.u64 %r97,[%r149]; ld.u64 %r42,[%frame+240]; .loc 1 503 22 ld.u64 %r64,[%frame]; .loc 1 503 12 shl.b64 %r66,%r64,3; setp.le.s64 %r295,%r90,1; cvt.u32.u32 %r297,%r23; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; add.u64 %r300,%frame,120; bra $L141; $L140: add.u64 %r114,%frame,240; add.u64 %r113,%r149,40; mov.u64 %r108,%frame; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r79,%r264,-1; .loc 1 488 3 mov.u64 %r98,0; add.u64 %r294,%frame,120; .loc 1 490 16 mov.u64 %r266,%r98; $L142: st.u64 [%r114],%r266; .loc 1 491 18 ld.u64 %r267,[%r113]; st.u64 [%r108],%r267; .loc 1 492 17 shl.b64 %r269,%r98,3; add.u64 %r270,%r294,%r269; .loc 1 492 10 ld.u64 %r271,[%r270]; setp.le.s64 %r272,%r271,0; @ %r272 bra $L115; .loc 1 488 26 add.u64 %r98,%r98,1; .loc 1 488 3 add.u64 %r114,%r114,8; add.u64 %r113,%r113,24; add.u64 %r108,%r108,8; setp.ne.u64 %r273,%r79,%r98; @ %r273 bra $L142; bra $L143; $L141: .loc 1 500 13 mov.u64 %r274,0; st.u64 [%r97],%r274; .loc 1 502 15 add.u64 %r42,%r42,1; .loc 1 503 12 add.u64 %r97,%r97,%r66; .loc 1 505 32 ld.u64 %r88,[%frame+120]; .loc 1 505 13 setp.ne.u64 %r275,%r42,%r88; @ %r275 bra $L141; .loc 1 508 23 mul.lo.u64 %r276,%r64,%r42; .loc 1 508 9 shl.b64 %r277,%r276,3; sub.u64 %r30,%r97,%r277; .loc 1 510 7 @ ! %r295 bra $L165; bra $L115; $L148: .loc 1 507 13 st.u64 [%r102],%r274; .loc 1 508 23 mul.lo.u64 %r280,%r73,%r72; .loc 1 508 9 shl.b64 %r281,%r280,3; sub.u64 %r30,%r97,%r281; .loc 1 509 5 add.u64 %r96,%r96,1; .loc 1 510 7 add.u64 %r102,%r102,8; setp.ne.u64 %r282,%r96,%r299; @ %r282 bra $L147; bra $L115; $L165: add.u64 %r102,%frame,248; .loc 1 509 5 mov.u64 %r96,1; $L147: .loc 1 517 16 ld.u64 %r286,[%r102]; add.u64 %r72,%r286,1; st.u64 [%r102],%r72; shl.b64 %r120,%r96,3; .loc 1 518 23 add.u64 %r287,%frame,%r120; ld.u64 %r73,[%r287]; .loc 1 518 13 shl.b64 %r288,%r73,3; add.u64 %r97,%r30,%r288; .loc 1 505 32 add.u64 %r290,%r300,%r120; ld.u64 %r77,[%r290]; .loc 1 505 13 setp.eq.u64 %r291,%r72,%r77; @ %r291 bra $L148; .loc 1 507 13 mov.u64 %r42,0; bra $L141; $L164: .loc 1 448 6 ld.u64 %r292,[%r149]; setp.eq.u64 %r293,%r292,0; @ ! 
%r293 bra $L128; bra $L149; $L115: .loc 1 522findloc1_i8.o/_gfortran_findloc1_i8 .visible .func _gfortran_findloc1_fortran/generated/findloc1_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc1_i8 .visible .func _gfortran_mfindloc1_i8GLOBAL FUNCTION DECL: _gfortran_sfindloc1_i8 .visible .func _gfortran_sfindloc1_i8GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_internal_error .extern .func _gfortrani_internal_8] = {70,73,78,68,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,914,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_findloc1_i8 .visible .func _gfortran_findloc1_, .param .u32 %in_ar4) {98predpred %r238; .reg .u64pred %r260; .reg .pred %r262; .reg .pred %r263; .reg .u64 %r264; .reg .u32 %r265; .reg .u64u64 %r279; .reg .u64 %r280; .reg .u6464predpred %r330; .reg .pred %r331; .reg .pred %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64u64 %r352; .reg .u64 %r354; .reg .predpred %r358; .reg .pred %r359; .reg .u64predmov.u64 %r185,%ar0; mov.u64 %r186,%ar1; mov.u64 %r187,%ar2; mov.u64 %r188,%ar3; mov.u32 %r189,%ar4; .loc 1 54 10 ld.s8 %r22,[%r186+28]; .loc 1 54 38 add.u32 %r190,%r22,-1; .loc 1 54 8 cvt.s64.s32 %r118,%r190; .loc 1 55 10 ld.u64 %r25,[%r188]; .loc 1 55 7 add.u64 %r119,%r25,-1; .loc 1 57 7 shr.u64 %r192,%r119,63; set.u32.lt.s64 %r194,%r118,%r119; neg.s32 %r195,%r194; cvt.u16.u64 %r198,%r192; cvt.u16.u32 %r199,%r195; or.b16 %r197,%r198,%r199; .loc 1 57 6 cvt.u32.u16 %r200,%r197; cvt.u16.u8 %r201,%r200; setp.eq.u16 %r202,%r201,0; @ %r202 bra $L2; .loc 1 59 7 cvt.u32.u32 %r205,%r22; cvt.s64.s8 %r204,%r205; st.u64 [%stack+8],%r20403203_gfortran_runtime_errorr208,%r119,%r119; add.u64 %r209,%r208,%r119; shl.b64 %r210,%r209,3; add.u64 %r211,%r186,%r210; ld.u64 %r31,[%r211+56]; ld.u64 %r33,[%r211+48]; .loc 1 67 9 ld.u64 %r122,[%r211+40]; .loc 1 69 3 setp.ne.u64 %r225,%r119,0; @ %r225 bra $L3; $L9: .loc 1 77 3 setp.gt.s64 %r226,%r118,%r119; @ %r226 bra $L4; bra $L50; $L3: add.u64 %r177,%r186,40; add.u64 %r67,%frame,120; add.u64 %r70,%frame,240; add.u64 %r228,%r25,%r25; add.u64 %r229,%r228,%r25; shl.b64 %r230,%r229,3; add.u64 %r231,%r186,16; add.u64 %r182,%r230,%r231; .loc 1 75 12 mov.u64 %r371,0; $L8: .loc 1 71 18 ld.u64 %r232,[%r177]; st.u64 [%r67],%r232; .loc 1 72 19 ld.u64 %r234,[%r177+16]; add.u64 %r233,%r234,1; ld.u64 %r235,[%r177+8]; sub.u64 %r38,%r233,%r235; .loc 1 74 10 setp.lt.s64 %r236,%r38,0; @ %r236 bra $L6; .loc 1 72 17 st.u64 [%r70],%r38; bra $L7; $L6: .loc 1 75 12 st.u64 [%r70],%r371; $L7: .loc 1 69 3 add.u64 %r177,%r177,24; add.u64 %r67,%r67,8; add.u64 %r70,%r70,8; setp.ne.u64 %r238,%r177,%r182; @ %r238 bra $L8; bra $L9; $L50: .loc 1 86 6 ld.u64 %r239,[%r185]; setp.eq.u64 %r240,%r239,0; @ ! 
%r240 bra $L11; bra $L10; $L4: add.u64 %r242,%r25,%r25; add.u64 %r243,%r242,%r25; shl.b64 %r244,%r243,3; add.u64 %r245,%r244,40; add.u64 %r115,%r186,%r245; shl.b64 %r246,%r25,3; add.u64 %r105,%r246,-8; add.u64 %r361,%frame,120; add.u64 %r108,%r361,%r105; add.u64 %r360,%frame,240; add.u64 %r62,%r360,%r105; add.u64 %r249,%r186,40; cvt.u32.u32 %r251,%r22; cvt.s64.s8 %r250,%r251; add.u64 %r253,%r250,%r250; add.u64 %r254,%r253,%r250; shl.b64 %r255,%r254,3; add.u64 %r180,%r249,%r255; .loc 1 83 12 mov.u64 %r370,0; $L14: .loc 1 79 18 ld.u64 %r256,[%r115]; st.u64 [%r108],%r256; .loc 1 80 19 ld.u64 %r258,[%r115+16]; add.u64 %r257,%r258,1; ld.u64 %r259,[%r115+8]; sub.u64 %r45,%r257,%r259; .loc 1 82 10 setp.lt.s64 %r260,%r45,0; @ %r260 bra $L12; .loc 1 80 17 st.u64 [%r62],%r45; bra $L13; $L12: .loc 1 83 12 st.u64 [%r62],%r370; $L13: .loc 1 77 3 add.u64 %r115,%r115,24; add.u64 %r108,%r108,8; add.u64 %r62,%r62,8; setp.ne.u64 %r262,%r115,%r180; @ %r262 bra $L14; bra $L51; $L10: .loc 1 90 7 setp.le.s64 %r263,%r118,0; @ %r263 bra $L16; add.u64 %r360,%frame,240; $L36: add.u64 %r153,%r185,40; mov.u64 %r149,%r360; cvt.u32.u32 %r265,%r22; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r270,%r185,16; add.u64 %r116,%r269,%r270; .loc 1 93 10 mov.u64 %r112,1; .loc 1 97 4 mov.u64 %r293,0; bra $L17; $L16: .loc 1 101 24 mov.u64 %r271,0; st.u64 [%r185+8],%r271; .loc 1 102 28 cvt.u16.u32 %r274,%r22; add.u16 %r273,%r274,-1; cvt.u32.u16 %r275,%r273; st.u8 [%r185+28],%r275; .loc 1 104 20 add.u32 %r276,%r22,-2; cvt.s64.s32 %r54,%r276; add.u64 %r278,%r54,%r54; add.u64 %r279,%r278,%r54; shl.b64 %r280,%r279,3; add.u64 %r281,%r185,%r280; .loc 1 104 67 shl.b64 %r283,%r54,3; add.u64 %r284,%frame,%r283; .loc 1 104 59 ld.u64 %r286,[%r281+40]; ld.u64 %r287,[%r284+240]; mul.lo.u64 %r124,%r286,%r287; .loc 1 106 29 mov.u64 %r289189; call (%value_in),_gfortrani_xmallocarray290,[%value_in]; } .loc 1 106 27 st.u64 [%r185],%r290; .loc 1 107 10 setp.eq.u64 %r292,%r124,0; @ ! 
%r292 bra $L21; bra $L18; $L20: .loc 1 95 48 mul.lo.u64 %r112,%r50,%r112; $L17: .loc 1 97 4 st.u64 [%r153+8],%r293; ld.u64 %r50,[%r149]; add.u64 %r294,%r50,-1; st.u64 [%r153+16],%r294; st.u64 [%r153],%r112; .loc 1 90 7 add.u64 %r153,%r153,24; add.u64 %r149,%r149,8; setp.ne.u64 %r295,%r116,%r153; @ %r295 bra $L20; bra $L16; $L18: .loc 1 110 4 st.u64 [%r185+48],%r124; mov.u64 %r297,-1; st.u64 [%r185+56],%r297; mov.u64 %r298,1; st.u64 [%r185+40],%r298; .loc 1 111 4 bra $L1; $L11: .loc 1 116 19 ld.s8 %r60,[%r185+28]; .loc 1 116 10 setp.eq.u64 %r299,%r60,%r118; @ %r299 bra $L23; .loc 1 117 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 122 11 cvta.global.u64 %r302,_gfortrani_compile_options; .loc 1 122 10 ld.u32 %r303,[%r302+36]; setp.eq.u32 %r304,%r303,0; @ %r304 bra $L21; .loc 1 123 2 add.u64 %r360,%frame,240; cvta.const.u64 %r308,$LC2r360307308; call _gfortrani_bounds_ifunction_return$L21: .loc 1 127 3 setp.gt.s64 %r310,%r118,0; @ %r310 bra $L24; $L27: .loc 1 64 9 add.u64 %r311,%r31,1; .loc 1 64 7 sub.u64 %r120,%r311,%r33; max.s64 %r43,%r120,0; .loc 1 135 8 ld.u64 %r111,[%r185]; .loc 1 138 8 ld.u64 %r132,[%r186]; .loc 1 160 35 shl.b64 %r74,%r122,3; .loc 1 147 22 add.u64 %r312,%r43,-1; .loc 1 147 15 mul.lo.u64 %r68,%r312,%r74; .loc 1 148 34 neg.s64 %r71,%r74; ld.u64 %r82,[%frame+360]; .loc 1 172 22 ld.u64 %r75,[%frame+120]; .loc 1 172 12 shl.b64 %r77,%r75,3; .loc 1 173 22 ld.u64 %r78,[%frame]; .loc 1 173 12 shl.b64 %r80,%r78,3; setp.ne.u32 %r359,%r189,0; setp.le.s64 %r358,%r118,1; .loc 1 148 4 setp.le.s64 %r363,%r120,0; .loc 1 160 4 setp.gt.s64 %r364,%r120,0; cvt.u32.u32 %r365,%r22; cvt.s64.s8 %r366,%r365; add.u64 %r367,%r366,-1; add.u64 %r368,%frame,120; add.u64 %r369,%frame,240; bra $L25; $L24: add.u64 %r170,%frame,360; add.u64 %r169,%r185,40; mov.u64 %r167,%frame; cvt.u32.u32 %r314,%r22; cvt.s64.s8 %r313,%r314; add.u64 %r154,%r313,-1; .loc 1 127 3 mov.u64 %r133,0; add.u64 %r360,%frame,240; .loc 1 129 16 mov.u64 %r315,%r133; $L26: st.u64 [%r170],%r315; .loc 1 130 18 ld.u64 %r316,[%r169]; st.u64 [%r167],%r316; .loc 1 131 17 shl.b64 %r318,%r133,3; add.u64 %r319,%r360,%r318; .loc 1 131 10 ld.u64 %r320,[%r319]; setp.le.s64 %r321,%r320,0; @ %r321 bra $L1; .loc 1 127 26 add.u64 %r133,%r133,1; .loc 1 127 3 add.u64 %r170,%r170,8; add.u64 %r169,%r169,24; add.u64 %r167,%r167,8; setp.ne.u64 %r322,%r133,%r154; @ %r322 bra $L26; bra $L27; $L25: .loc 1 145 10 @ %r359 bra $L28; .loc 1 160 4 @ %r364 bra $L37; .loc 1 144 14 mov.u64 %r126,0; bra $L30; $L28: .loc 1 147 8 add.u64 %r127,%r132,%r68; .loc 1 148 4 @ %r363 bra $L38; .loc 1 148 11 mov.u64 %r126,%r43; $L31: .loc 1 150 11 ld.u64 %r326,[%r127]; setp.eq.u64 %r327,%r326,%r187; @ %r327 bra $L30; .loc 1 148 26 add.u64 %r126,%r126,-1; .loc 1 148 34 add.u64 %r127,%r127,%r71; .loc 1 148 4 setp.ne.u64 %r328,%r126,0; @ %r328 bra $L31; bra $L30; $L37: .loc 1 160 4 mov.u64 %r125,%r132; .loc 1 160 11 mov.u64 %r126,1; $L29: .loc 1 162 11 ld.u64 %r329,[%r125]; setp.eq.u64 %r330,%r329,%r187; @ %r330 bra $L30; .loc 1 160 27 add.u64 %r126,%r126,1; .loc 1 160 35 add.u64 %r125,%r125,%r74; .loc 1 160 4 setp.ge.s64 %r331,%r43,%r126; @ %r331 bra $L29; .loc 1 144 14 mov.u64 %r126,0; bra $L30; $L38: mov.u64 %r126,0; $L30: .loc 1 169 13 st.u64 [%r111],%r126; .loc 1 171 15 add.u64 %r82,%r82,1; .loc 1 172 12 add.u64 %r132,%r132,%r77; .loc 1 173 12 add.u64 %r111,%r111,%r80; .loc 1 175 32 ld.u64 %r141,[%frame+240]; .loc 1 175 13 setp.ne.u64 %r332,%r82,%r141; @ %r332 bra $L25; .loc 1 178 23 mul.lo.u64 %r333,%r75,%r82; .loc 1 178 9 shl.b64 
%r334,%r333,3; sub.u64 %r129,%r132,%r334; .loc 1 179 23 mul.lo.u64 %r335,%r78,%r82; .loc 1 179 9 shl.b64 %r336,%r335,3; sub.u64 %r130,%r111,%r336; .loc 1 181 7 @ %r358 bra $L1; add.u64 %r66,%frame,368; mov.u64 %r159,8; .loc 1 180 5 mov.u64 %r131,1; .loc 1 177 13 mov.u64 %r362,0; bra $L34; $L35: st.u64 [%r66],%r362; .loc 1 178 23 mul.lo.u64 %r342,%r95,%r94; .loc 1 178 9 shl.b64 %r343,%r342,3; sub.u64 %r129,%r132,%r343; .loc 1 179 23 mul.lo.u64 %r344,%r98,%r94; .loc 1 179 9 shl.b64 %r345,%r344,3; sub.u64 %r130,%r111,%r345; .loc 1 180 5 add.u64 %r131,%r131,1; .loc 1 181 7 add.u64 %r66,%r66,8; add.u64 %r159,%r159,8; setp.eq.u64 %r346,%r131,%r367; @ %r346 bra $L1; $L34: .loc 1 188 16 ld.u64 %r347,[%r66]; add.u64 %r94,%r347,1; st.u64 [%r66],%r94; .loc 1 189 23 add.u64 %r349,%r368,%r159; ld.u64 %r95,[%r349]; .loc 1 189 13 shl.b64 %r350,%r95,3; add.u64 %r132,%r129,%r350; .loc 1 190 23 add.u64 %r351,%frame,%r159; ld.u64 %r98,[%r351]; .loc 1 190 13 shl.b64 %r352,%r98,3; add.u64 %r111,%r130,%r352; .loc 1 175 32 add.u64 %r354,%r369,%r159; ld.u64 %r102,[%r354]; .loc 1 175 13 setp.eq.u64 %r355,%r94,%r102; @ %r355 bra $L35; .loc 1 177 13 mov.u64 %r82,0; bra $L25; $L51: .loc 1 86 6 ld.u64 %r356,[%r185]; setp.eq.u64 %r357,%r356,0; @ ! %r357 bra $L11; bra $L36; $L1: .loc 1 194_gfortran_mfindloc1_i8 .visible .func _gfortran_mfindloc1_i8 {608u32 %r209; .reg .u32 %r210; .reg .u64 %r212; .reg .u32 %r214; .reg .u32 %r215; .reg .u16 %r217; .reg .u16 %r218; .reg .u16 %r219; .reg .u32 %r220; .reg .u16 %r221; .reg .predu64 %r230; .reg .u64 %r231; .reg .u64 %r249; .reg .u32u32u64 %r281; .reg .u64 %r282; .reg .u64 %r283; .reg .u64predu64u16 %r331; .reg .u16predpredu64 %r375; .reg .u64pred %r379; .reg .pred %r381; .reg .u16pred %r388; .reg .u16 %r389; .reg .u32predu64u64 %r430; .reg .u64 %r431; mov.u64 %r204,%ar0; mov.u64 %r205,%ar1; mov.u64 %r206,%ar2; mov.u64 %r207,%ar3; mov.u64 %r208,%ar4; mov.u32 %r209,%ar5; .loc 1 225 10 ld.s8 %r22,[%r205+28]; .loc 1 225 38 add.u32 %r210,%r22,-1; .loc 1 225 8 cvt.s64.s32 %r147,%r210; .loc 1 226 10 ld.u64 %r25,[%r207]; .loc 1 226 7 add.u64 %r148,%r25,-1; .loc 1 228 7 shr.u64 %r212,%r148,63; set.u32.lt.s64 %r214,%r147,%r148; neg.s32 %r215,%r214; cvt.u16.u64 %r218,%r212; cvt.u16.u32 %r219,%r215; or.b16 %r217,%r218,%r219; .loc 1 228 6 cvt.u32.u16 %r220,%r217; cvt.u16.u8 %r221,%r220; setp.eq.u16 %r222,%r221,0; @ %r222 bra $L53; .loc 1 230 7 cvt.u32.u32 %r225,%r22; cvt.s64.s8 %r224,%r225; st.u64 [%stack+8],%r224232stack; call _gfortran_runtime_error3: .loc 1 235 9 add.u64 %r228,%r148,%r148; add.u64 %r229,%r228,%r148; shl.b64 %r230,%r229,3; add.u64 %r231,%r205,%r230; ld.u64 %r31,[%r231+56]; ld.u64 %r33,[%r231+48]; .loc 1 239 9 ld.u64 %r151,[%r231+40]; .loc 1 240 12 add.u64 %r249,%r208,%r230; ld.u64 %r34,[%r249+40]; ld.u64 %r36,[%r208+16]; .loc 1 242 9 ld.u64 %r160,[%r208]; .loc 1 246 22 cvt.u32.u64 %r39,%r36; .loc 1 246 53 add.u32 %r251,%r39,-4; and.b32 %r252,%r251,-5; set.u32.eq.u32 %r254,%r252,0; neg.s32 %r255,%r254; .loc 1 246 22 add.u32 %r256,%r39,-1; set.u32.le.u32 %r258,%r256,1; neg.s32 %r259,%r258; .loc 1 246 6 cvt.u16.u32 %r261,%r255; cvt.u16.u32 %r262,%r259; or.b16 %r260,%r261,%r262; cvt.u32.u16 %r263,%r260; cvt.u16.u8 %r264,%r263; setp.eq.u16 %r265,%r264,0; @ %r265 bra $L54; $L58: .loc 1 255 3 setp.ne.u64 %r266,%r148,0; @ %r266 bra $L55; $L62: .loc 1 264 3 setp.gt.s64 %r267,%r147,%r148; @ %r267 bra $L56; bra $L113; $L54: .loc 1 248 7 setp.eq.u32 %r269,%r39,16; @ %r269 bra $L58; .loc 1 253 5 cvta.const.u64 %r271,$LC4; mov.u64 %r270,27271; call _gfortrani_internal_error5: add.u64 
%r146,%r205,40; add.u64 %r167,%r208,40; add.u64 %r273,%r25,%r25; add.u64 %r274,%r273,%r25; shl.b64 %r275,%r274,3; add.u64 %r276,%r205,16; add.u64 %r101,%r275,%r276; .loc 1 255 3 mov.u64 %r200,0; add.u64 %r418,%frame,240; add.u64 %r419,%frame,120; add.u64 %r416,%frame,360; .loc 1 262 12 mov.u64 %r431,%r200; $L61: .loc 1 257 18 add.u64 %r278,%r418,%r200; ld.u64 %r279,[%r146]; st.u64 [%r278],%r279; .loc 1 258 18 add.u64 %r281,%r419,%r200; .loc 1 258 20 ld.u64 %r283,[%r167]; mul.lo.u64 %r282,%r283,%r36; .loc 1 258 18 st.u64 [%r281],%r282; .loc 1 259 19 ld.u64 %r285,[%r146+16]; add.u64 %r284,%r285,1; ld.u64 %r286,[%r146+8]; sub.u64 %r53,%r284,%r286; .loc 1 261 10 setp.lt.s64 %r287,%r53,0; @ %r287 bra $L59; .loc 1 259 17 add.u64 %r289,%r416,%r200; st.u64 [%r289],%r53; bra $L60; $L59: .loc 1 262 12 add.u64 %r291,%r416,%r200; st.u64 [%r291],%r431; $L60: .loc 1 255 3 add.u64 %r146,%r146,24; add.u64 %r167,%r167,24; add.u64 %r200,%r200,8; setp.ne.u64 %r293,%r101,%r146; @ %r293 bra $L61; bra $L62; $L113: .loc 1 274 6 ld.u64 %r294,[%r204]; setp.eq.u64 %r295,%r294,0; @ ! %r295 bra $L64; bra $L63; $L56: add.u64 %r297,%r25,%r25; add.u64 %r298,%r297,%r25; shl.b64 %r299,%r298,3; add.u64 %r143,%r299,40; add.u64 %r156,%r205,%r143; add.u64 %r141,%r208,%r143; shl.b64 %r300,%r25,3; add.u64 %r137,%r300,-8; cvt.u32.u32 %r302,%r22; cvt.s64.s8 %r301,%r302; shl.b64 %r303,%r301,3; add.u64 %r84,%r303,-8; add.u64 %r418,%frame,240; add.u64 %r419,%frame,120; add.u64 %r416,%frame,360; .loc 1 271 12 mov.u64 %r430,0; $L67: .loc 1 266 18 add.u64 %r305,%r418,%r137; ld.u64 %r306,[%r156]; st.u64 [%r305],%r306; .loc 1 267 18 add.u64 %r308,%r419,%r137; .loc 1 267 20 ld.u64 %r310,[%r141]; mul.lo.u64 %r309,%r310,%r36; .loc 1 267 18 st.u64 [%r308],%r309; .loc 1 268 19 ld.u64 %r312,[%r156+16]; add.u64 %r311,%r312,1; ld.u64 %r313,[%r156+8]; sub.u64 %r62,%r311,%r313; .loc 1 270 10 setp.lt.s64 %r314,%r62,0; @ %r314 bra $L65; .loc 1 268 17 add.u64 %r316,%r416,%r137; st.u64 [%r316],%r62; bra $L66; $L65: .loc 1 271 12 add.u64 %r318,%r416,%r137; st.u64 [%r318],%r430; $L66: .loc 1 264 3 add.u64 %r156,%r156,24; add.u64 %r141,%r141,24; add.u64 %r137,%r137,8; setp.ne.u64 %r320,%r84,%r137; @ %r320 bra $L67; bra $L114; $L63: .loc 1 278 7 setp.le.s64 %r321,%r147,0; @ %r321 bra $L69; add.u64 %r416,%frame,360; $L90: add.u64 %r179,%r204,40; mov.u64 %r176,%r416; add.u64 %r322,%r204,16; cvt.u32.u32 %r324,%r22; cvt.s64.s8 %r323,%r324; add.u64 %r326,%r323,%r323; add.u64 %r327,%r326,%r323; shl.b64 %r328,%r327,3; add.u64 %r169,%r322,%r328; .loc 1 281 10 mov.u64 %r140,1; .loc 1 285 4 mov.u64 %r351,0; bra $L70; $L69: .loc 1 289 24 mov.u64 %r329,0; st.u64 [%r204+8],%r329; .loc 1 290 28 cvt.u16.u32 %r332,%r22; add.u16 %r331,%r332,-1; cvt.u32.u16 %r333,%r331; st.u8 [%r204+28],%r333; .loc 1 292 20 add.u32 %r334,%r22,-2; cvt.s64.s32 %r72,%r334; add.u64 %r336,%r72,%r72; add.u64 %r337,%r336,%r72; shl.b64 %r338,%r337,3; add.u64 %r339,%r204,%r338; .loc 1 292 67 shl.b64 %r341,%r72,3; add.u64 %r342,%frame,%r341; .loc 1 292 59 ld.u64 %r344,[%r339+40]; ld.u64 %r345,[%r342+360]; mul.lo.u64 %r154,%r344,%r345; .loc 1 294 29 mov.u64 %r3471547; call (%value_in),_gfortrani_xmallocarray8,[%value_in]; } .loc 1 294 27 st.u64 [%r204],%r348; .loc 1 295 10 setp.eq.u64 %r350,%r154,0; @ ! 
%r350 bra $L74; bra $L71; $L73: .loc 1 283 48 mul.lo.u64 %r140,%r68,%r140; $L70: .loc 1 285 4 st.u64 [%r179+8],%r351; ld.u64 %r68,[%r176]; add.u64 %r352,%r68,-1; st.u64 [%r179+16],%r352; st.u64 [%r179],%r140; .loc 1 278 7 add.u64 %r179,%r179,24; add.u64 %r176,%r176,8; setp.ne.u64 %r353,%r169,%r179; @ %r353 bra $L73; bra $L69; $L71: .loc 1 298 4 st.u64 [%r204+48],%r154; mov.u64 %r355,-1; st.u64 [%r204+56],%r355; mov.u64 %r356,1; st.u64 [%r204+40],%r356; .loc 1 299 4 bra $L52; $L64: .loc 1 304 19 ld.s8 %r79,[%r204+28]; .loc 1 304 10 setp.eq.u64 %r357,%r79,%r147; @ %r357 bra $L76; .loc 1 305 2 st.u64 [%stack+8],%r147; st.u64 [%stack],%r79;_gfortran_runtime_error76: .loc 1 310 11 cvta.global.u64 %r360,_gfortrani_compile_options; .loc 1 310 10 ld.u32 %r361,[%r360+36]; setp.eq.u32 %r362,%r361,0; @ %r362 bra $L74; .loc 1 311 2 add.u64 %r416,%frame,360; cvta.const.u64 %r366,$LC2r416366; call _gfortrani_bounds_ifunction_return$L74: .loc 1 315 3 setp.gt.s64 %r368,%r147,0; @ %r368 bra $L77; $L80: .loc 1 235 9 add.u64 %r369,%r31,1; .loc 1 235 7 sub.u64 %r149,%r369,%r33; max.s64 %r65,%r149,0; .loc 1 240 12 mul.lo.u64 %r37,%r34,%r36; .loc 1 323 8 ld.u64 %r164,[%r204]; .loc 1 326 8 ld.u64 %r163,[%r205]; .loc 1 351 35 shl.b64 %r100,%r151,3; .loc 1 336 22 add.u64 %r64,%r65,-1; .loc 1 336 15 mul.lo.u64 %r89,%r64,%r100; .loc 1 337 29 mul.lo.u64 %r177,%r37,%r64; .loc 1 338 34 neg.s64 %r95,%r100; .loc 1 363 22 ld.u64 %r102,[%frame+240]; .loc 1 363 12 shl.b64 %r104,%r102,3; .loc 1 364 23 ld.u64 %r105,[%frame+120]; .loc 1 365 22 ld.u64 %r107,[%frame]; .loc 1 365 12 shl.b64 %r109,%r107,3; .loc 1 362 12 ld.u64 %r203,[%frame+480]; setp.ne.u32 %r417,%r209,0; add.u64 %r415,%frame,488; .loc 1 338 4 setp.le.s64 %r423,%r149,0; .loc 1 382 23 add.u64 %r424,%frame,240; .loc 1 367 32 add.u64 %r425,%frame,360; .loc 1 371 20 add.u64 %r426,%frame,120; bra $L78; $L77: add.u64 %r198,%frame,480; add.u64 %r197,%r204,40; mov.u64 %r195,%frame; cvt.u32.u32 %r371,%r22; cvt.s64.s8 %r370,%r371; add.u64 %r181,%r370,-1; .loc 1 315 3 mov.u64 %r165,0; add.u64 %r416,%frame,360; .loc 1 317 16 mov.u64 %r372,%r165; $L79: st.u64 [%r198],%r372; .loc 1 318 18 ld.u64 %r373,[%r197]; st.u64 [%r195],%r373; .loc 1 319 17 shl.b64 %r375,%r165,3; add.u64 %r376,%r416,%r375; .loc 1 319 10 ld.u64 %r377,[%r376]; setp.le.s64 %r378,%r377,0; @ %r378 bra $L52; .loc 1 315 26 add.u64 %r165,%r165,1; .loc 1 315 3 add.u64 %r198,%r198,8; add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; setp.ne.u64 %r379,%r165,%r181; @ %r379 bra $L79; bra $L80; $L78: .loc 1 334 10 @ %r417 bra $L81; .loc 1 351 4 setp.gt.s64 %r381,%r149,0; @ %r381 bra $L91; .loc 1 333 14 mov.u64 %r158,0; bra $L83; $L81: .loc 1 336 8 add.u64 %r29,%r163,%r89; .loc 1 337 9 add.u64 %r159,%r160,%r177; .loc 1 338 4 @ %r423 bra $L92; .loc 1 338 11 mov.u64 %r158,%r65; $L85: .loc 1 340 11 ld.s8 %r384,[%r159]; cvt.u16.u32 %r383,%r384; setp.eq.u16 %r385,%r383,0; @ %r385 bra $L84; .loc 1 340 18 ld.u64 %r386,[%r29]; setp.eq.u64 %r387,%r386,%r206; @ %r387 bra $L83; $L84: .loc 1 338 26 add.u64 %r158,%r158,-1; .loc 1 338 34 add.u64 %r29,%r29,%r95; .loc 1 338 53 sub.u64 %r159,%r159,%r37; .loc 1 338 4 setp.ne.u64 %r388,%r158,0; @ %r388 bra $L85; bra $L83; $L91: .loc 1 351 4 mov.u64 %r157,%r160; mov.u64 %r111,%r163; .loc 1 351 11 mov.u64 %r158,1; $L82: .loc 1 353 11 ld.s8 %r390,[%r157]; cvt.u16.u32 %r389,%r390; setp.eq.u16 %r391,%r389,0; @ %r391 bra $L86; .loc 1 353 18 ld.u64 %r392,[%r111]; setp.eq.u64 %r393,%r392,%r206; @ %r393 bra $L83; $L86: .loc 1 351 27 add.u64 %r158,%r158,1; .loc 1 351 35 add.u64 %r111,%r111,%r100; .loc 
1 351 54 add.u64 %r157,%r157,%r37; .loc 1 351 4 setp.ge.s64 %r394,%r65,%r158; @ %r394 bra $L82; .loc 1 333 14 mov.u64 %r158,0; bra $L83; $L92: mov.u64 %r158,0; $L83: .loc 1 360 13 st.u64 [%r164],%r158; .loc 1 362 15 add.u64 %r203,%r203,1; st.u64 [%frame+480],%r203; .loc 1 363 12 add.u64 %r163,%r163,%r104; .loc 1 364 13 add.u64 %r160,%r160,%r105; .loc 1 365 12 add.u64 %r164,%r164,%r109; .loc 1 367 32 ld.u64 %r130,[%frame+360]; .loc 1 367 13 setp.ne.u64 %r395,%r203,%r130; @ %r395 bra $L78; mov.u64 %r126,%r415; mov.u64 %r127,%r107; mov.u64 %r202,%r105; mov.u64 %r124,%r102; mov.u64 %r120,8; .loc 1 366 9 mov.u64 %r162,0; .loc 1 369 13 mov.u64 %r397,%r162; $L89: st.u64 [%r126+-8],%r397; .loc 1 370 23 mul.lo.u64 %r113,%r130,%r124; .loc 1 371 24 mul.lo.u64 %r398,%r130,%r202; .loc 1 371 10 sub.u64 %r160,%r160,%r398; .loc 1 372 23 mul.lo.u64 %r119,%r130,%r127; .loc 1 373 5 add.u64 %r162,%r162,1; .loc 1 374 7 setp.le.s64 %r399,%r147,%r162; @ %r399 bra $L52; .loc 1 381 16 ld.u64 %r400,[%r126]; add.u64 %r123,%r400,1; st.u64 [%r126],%r123; .loc 1 382 23 add.u64 %r402,%r424,%r120; ld.u64 %r124,[%r402]; .loc 1 382 13 sub.u64 %r403,%r124,%r113; shl.b64 %r404,%r403,3; add.u64 %r163,%r163,%r404; .loc 1 383 23 add.u64 %r405,%frame,%r120; ld.u64 %r127,[%r405]; .loc 1 383 13 sub.u64 %r406,%r127,%r119; shl.b64 %r407,%r406,3; add.u64 %r164,%r164,%r407; .loc 1 367 32 add.u64 %r409,%r425,%r120; ld.u64 %r130,[%r409]; .loc 1 367 13 add.u64 %r126,%r126,8; setp.eq.u64 %r410,%r123,%r130; @ %r410 bra $L88; .loc 1 362 12 ld.u64 %r203,[%frame+480]; bra $L78; $L88: .loc 1 371 20 add.u64 %r412,%r426,%r120; ld.u64 %r202,[%r412]; add.u64 %r120,%r120,8; bra $L89; $L114: .loc 1 274 6 ld.u64 %r413,[%r204]; setp.eq.u64 %r414,%r413,0; @ ! %r414 bra $L64; bra $L90; $L52: .loc 1 387_gfortran_sfindloc1_i8 .visible .func _gfortran_sfindloc1_i8 {68u64 %r126; .reg .u64u32pred %r190; .reg .pred %r192; .reg .u64pred %r213; .reg .predu32predpredpred %r257; .reg .u64 %r260; .reg .u64 %r261; .reg .pred %r263; .reg .u64 %r264; .reg .u32 %r265; .reg .u64 %r266; .reg .u6464predmov.u64 %r149,%ar0; mov.u64 %r150,%ar1; mov.u64 %r151,%ar2; mov.u64 %r152,%ar3; mov.u64 %r153,%ar4; mov.u32 %r154,%ar5; .loc 1 410 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L116; .loc 1 410 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L117; $L116:call _gfortran_findloc1_i8); } .loc 1 413 7 bra $L115; $L117: .loc 1 416 10 ld.s8 %r23,[%r150+28]; .loc 1 416 38 add.u32 %r163,%r23,-1; .loc 1 416 8 cvt.s64.s32 %r90,%r163; .loc 1 417 10 ld.u64 %r26,[%r152]; .loc 1 417 7 add.u64 %r91,%r26,-1; .loc 1 419 7 shr.u64 %r165,%r91,63; set.u32.lt.s64 %r167,%r90,%r91; neg.s32 %r168,%r167; cvt.u16.u64 %r171,%r165; cvt.u16.u32 %r172,%r168; or.b16 %r170,%r171,%r172; .loc 1 419 6 cvt.u32.u16 %r173,%r170; cvt.u16.u8 %r174,%r173; setp.ne.u16 %r175,%r174,0; @ %r175 bra $L119; .loc 1 430 3 setp.ne.u64 %r176,%r91,0; @ %r176 bra $L120; $L126: .loc 1 438 3 setp.gt.s64 %r177,%r90,%r91; @ %r177 bra $L121; bra $L163; $L119: .loc 1 421 7 cvt.u32.u32 %r180,%r23; cvt.s64.s8 %r179,%r180; st.u64 [%stack+8],%r179_gfortran_runtime_error120: add.u64 %r136,%r150,48; add.u64 %r138,%frame,120; add.u64 %r139,%r150,56; add.u64 %r182,%r150,24; add.u64 %r184,%r26,%r26; add.u64 %r185,%r184,%r26; shl.b64 %r186,%r185,3; add.u64 %r145,%r182,%r186; .loc 1 435 12 mov.u64 %r302,0; $L125: .loc 1 432 19 ld.u64 %r188,[%r139]; add.u64 %r187,%r188,1; ld.u64 %r189,[%r136]; sub.u64 %r35,%r187,%r189; .loc 1 434 10 setp.le.s64 %r190,%r35,0; @ %r190 bra $L123; .loc 1 432 17 st.u64 [%r138],%r35; bra $L124; $L123: 
.loc 1 435 12 st.u64 [%r138],%r302; $L124: .loc 1 430 3 add.u64 %r136,%r136,24; add.u64 %r138,%r138,8; add.u64 %r139,%r139,24; setp.ne.u64 %r192,%r136,%r145; @ %r192 bra $L125; bra $L126; $L163: .loc 1 448 6 ld.u64 %r193,[%r149]; setp.eq.u64 %r194,%r193,0; @ ! %r194 bra $L128; bra $L127; $L121: add.u64 %r196,%r26,%r26; add.u64 %r197,%r196,%r26; shl.b64 %r198,%r197,3; add.u64 %r199,%r198,48; add.u64 %r125,%r150,%r199; add.u64 %r294,%frame,120; shl.b64 %r201,%r26,3; add.u64 %r202,%r201,-8; add.u64 %r45,%r294,%r202; add.u64 %r203,%r150,48; cvt.u32.u32 %r205,%r23; cvt.s64.s8 %r204,%r205; add.u64 %r207,%r204,%r204; add.u64 %r208,%r207,%r204; shl.b64 %r209,%r208,3; add.u64 %r132,%r203,%r209; .loc 1 444 12 mov.u64 %r301,0; $L131: .loc 1 441 2 ld.u64 %r211,[%r125+8]; add.u64 %r210,%r211,1; ld.u64 %r212,[%r125]; sub.u64 %r40,%r210,%r212; .loc 1 443 10 setp.le.s64 %r213,%r40,0; @ %r213 bra $L129; .loc 1 440 17 st.u64 [%r45],%r40; bra $L130; $L129: .loc 1 444 12 st.u64 [%r45],%r301; $L130: .loc 1 438 3 add.u64 %r125,%r125,24; add.u64 %r45,%r45,8; setp.ne.u64 %r215,%r125,%r132; @ %r215 bra $L131; bra $L164; $L127: .loc 1 452 7 setp.le.s64 %r216,%r90,0; @ %r216 bra $L133; add.u64 %r294,%frame,120; $L149: add.u64 %r78,%r149,40; mov.u64 %r61,%r294; cvt.u32.u32 %r218,%r23; cvt.s64.s8 %r217,%r218; add.u64 %r220,%r217,%r217; add.u64 %r221,%r220,%r217; shl.b64 %r222,%r221,3; add.u64 %r223,%r149,16; add.u64 %r126,%r222,%r223; .loc 1 455 10 mov.u64 %r81,1; .loc 1 459 4 mov.u64 %r246,0; bra $L134; $L133: .loc 1 462 24 mov.u64 %r224,0; st.u64 [%r149+8],%r224; .loc 1 463 28 cvt.u16.u32 %r227,%r23; add.u16 %r226,%r227,-1; cvt.u32.u16 %r228,%r226; st.u8 [%r149+28],%r228; .loc 1 465 20 add.u32 %r229,%r23,-2; cvt.s64.s32 %r51,%r229; add.u64 %r231,%r51,%r51; add.u64 %r232,%r231,%r51; shl.b64 %r233,%r232,3; add.u64 %r234,%r149,%r233; .loc 1 465 67 shl.b64 %r236,%r51,3; add.u64 %r237,%frame,%r236; .loc 1 465 59 ld.u64 %r239,[%r234+40]; ld.u64 %r240,[%r237+120]; mul.lo.u64 %r94,%r239,%r240; .loc 1 467 29 mov.u64 %r24242; call (%value_in),_gfortrani_xmallocarray3,[%value_in]; } .loc 1 467 27 st.u64 [%r149],%r243; .loc 1 468 10 setp.eq.u64 %r245,%r94,0; @ ! 
%r245 bra $L138; bra $L135; $L137: .loc 1 457 48 mul.lo.u64 %r81,%r47,%r81; $L134: .loc 1 459 4 st.u64 [%r78+8],%r246; ld.u64 %r47,[%r61]; add.u64 %r247,%r47,-1; st.u64 [%r78+16],%r247; st.u64 [%r78],%r81; .loc 1 452 7 add.u64 %r78,%r78,24; add.u64 %r61,%r61,8; setp.ne.u64 %r248,%r78,%r126; @ %r248 bra $L137; bra $L133; $L135: .loc 1 471 4 st.u64 [%r149+48],%r94; mov.u64 %r250,-1; st.u64 [%r149+56],%r250; mov.u64 %r251,1; st.u64 [%r149+40],%r251; .loc 1 472 4 bra $L115; $L128: .loc 1 477 19 ld.s8 %r58,[%r149+28]; .loc 1 477 10 setp.eq.u64 %r252,%r58,%r90; @ %r252 bra $L139; .loc 1 478 2 st.u64 [%stack+8],%r90; st.u64 [%stack],%r58;_gfortran_runtime_error139: .loc 1 483 11 cvta.global.u64 %r255,_gfortrani_compile_options; .loc 1 483 10 ld.u32 %r256,[%r255+36]; setp.eq.u32 %r257,%r256,0; @ %r257 bra $L138; .loc 1 484 2 add.u64 %r294,%frame,120; cvta.const.u64 %r261,$LC2261; call _gfortrani_bounds_ifunction_return$L138: .loc 1 488 3 setp.gt.s64 %r263,%r90,0; @ %r263 bra $L140; $L143: .loc 1 495 8 ld.u64 %r97,[%r149]; ld.u64 %r42,[%frame+240]; .loc 1 503 22 ld.u64 %r64,[%frame]; .loc 1 503 12 shl.b64 %r66,%r64,3; setp.le.s64 %r295,%r90,1; cvt.u32.u32 %r297,%r23; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; add.u64 %r300,%frame,120; bra $L141; $L140: add.u64 %r114,%frame,240; add.u64 %r113,%r149,40; mov.u64 %r108,%frame; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r79,%r264,-1; .loc 1 488 3 mov.u64 %r98,0; add.u64 %r294,%frame,120; .loc 1 490 16 mov.u64 %r266,%r98; $L142: st.u64 [%r114],%r266; .loc 1 491 18 ld.u64 %r267,[%r113]; st.u64 [%r108],%r267; .loc 1 492 17 shl.b64 %r269,%r98,3; add.u64 %r270,%r294,%r269; .loc 1 492 10 ld.u64 %r271,[%r270]; setp.le.s64 %r272,%r271,0; @ %r272 bra $L115; .loc 1 488 26 add.u64 %r98,%r98,1; .loc 1 488 3 add.u64 %r114,%r114,8; add.u64 %r113,%r113,24; add.u64 %r108,%r108,8; setp.ne.u64 %r273,%r79,%r98; @ %r273 bra $L142; bra $L143; $L141: .loc 1 500 13 mov.u64 %r274,0; st.u64 [%r97],%r274; .loc 1 502 15 add.u64 %r42,%r42,1; .loc 1 503 12 add.u64 %r97,%r97,%r66; .loc 1 505 32 ld.u64 %r88,[%frame+120]; .loc 1 505 13 setp.ne.u64 %r275,%r42,%r88; @ %r275 bra $L141; .loc 1 508 23 mul.lo.u64 %r276,%r64,%r42; .loc 1 508 9 shl.b64 %r277,%r276,3; sub.u64 %r30,%r97,%r277; .loc 1 510 7 @ ! %r295 bra $L165; bra $L115; $L148: .loc 1 507 13 st.u64 [%r102],%r274; .loc 1 508 23 mul.lo.u64 %r280,%r73,%r72; .loc 1 508 9 shl.b64 %r281,%r280,3; sub.u64 %r30,%r97,%r281; .loc 1 509 5 add.u64 %r96,%r96,1; .loc 1 510 7 add.u64 %r102,%r102,8; setp.ne.u64 %r282,%r96,%r299; @ %r282 bra $L147; bra $L115; $L165: add.u64 %r102,%frame,248; .loc 1 509 5 mov.u64 %r96,1; $L147: .loc 1 517 16 ld.u64 %r286,[%r102]; add.u64 %r72,%r286,1; st.u64 [%r102],%r72; shl.b64 %r120,%r96,3; .loc 1 518 23 add.u64 %r287,%frame,%r120; ld.u64 %r73,[%r287]; .loc 1 518 13 shl.b64 %r288,%r73,3; add.u64 %r97,%r30,%r288; .loc 1 505 32 add.u64 %r290,%r300,%r120; ld.u64 %r77,[%r290]; .loc 1 505 13 setp.eq.u64 %r291,%r72,%r77; @ %r291 bra $L148; .loc 1 507 13 mov.u64 %r42,0; bra $L141; $L164: .loc 1 448 6 ld.u64 %r292,[%r149]; setp.eq.u64 %r293,%r292,0; @ ! 
%r293 bra $L128; bra $L149; $L115: .loc 1 522findloc1_i16.o/ 1622802252_gfortran_findloc1_i16 .visible .func _gfortran_findloc1_i16fortran/generated/findloc1_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc1_i16 .visible .func _gfortran_mfindloc1_i16GLOBAL FUNCTION DECL: _gfortran_sfindloc1_i16 .visible .func _gfortran_sfindloc1_i16GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_internal_error .extern .func _gfortrani_internal_8] = {70,73,78,68,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,914,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_findloc1_i16 .visible .func _gfortran_findloc1_i16, .param .u32 %in_ar4) {98predpredu64 %r257; .reg .u64u64 %r279; .reg .u64 %r280; .reg .u64 %r281; .reg .u64u64 %r303; .reg .u32 %r304; .reg .pred %r305; .reg .u64 %r308; .reg .u64 %r309; .reg .pred %r311; .reg .u64 %r312; .reg .u64 %r313; .reg .u64u64predpred %r332; .reg .predu64 %r337; .reg .pred %r339; .reg .pred %r340; .reg .predpredpred %r374; .reg .predu64 %r381; .reg .u64 %r382; mov.u64 %r185,%ar0; mov.u64 %r186,%ar1; mov.u64 %r187,%ar2; ld.u64 %r367,[%r187]; ld.u64 %r368,[%r187+8]; mov.u64 %r189,%ar3; mov.u32 %r190,%ar4; .loc 1 54 10 ld.s8 %r22,[%r186+28]; .loc 1 54 38 add.u32 %r191,%r22,-1; .loc 1 54 8 cvt.s64.s32 %r118,%r191; .loc 1 55 10 ld.u64 %r25,[%r189]; .loc 1 55 7 add.u64 %r119,%r25,-1; .loc 1 57 7 shr.u64 %r193,%r119,63; set.u32.lt.s64 %r195,%r118,%r119; neg.s32 %r196,%r195; cvt.u16.u64 %r199,%r193; cvt.u16.u32 %r200,%r196; or.b16 %r198,%r199,%r200; .loc 1 57 6 cvt.u32.u16 %r201,%r198; cvt.u16.u8 %r202,%r201; setp.eq.u16 %r203,%r202,0; @ %r203 bra $L2; .loc 1 59 7 cvt.u32.u32 %r206,%r22; cvt.s64.s8 %r205,%r206; st.u64 [%stack+8],%r20504stack; call _gfortran_runtime_errorr209,%r119,%r119; add.u64 %r210,%r209,%r119; shl.b64 %r211,%r210,3; add.u64 %r212,%r186,%r211; ld.u64 %r31,[%r212+56]; ld.u64 %r33,[%r212+48]; .loc 1 67 9 ld.u64 %r122,[%r212+40]; .loc 1 69 3 setp.ne.u64 %r226,%r119,0; @ %r226 bra $L3; $L9: .loc 1 77 3 setp.gt.s64 %r227,%r118,%r119; @ %r227 bra $L4; bra $L54; $L3: add.u64 %r177,%r186,40; add.u64 %r67,%frame,120; add.u64 %r70,%frame,240; add.u64 %r229,%r25,%r25; add.u64 %r230,%r229,%r25; shl.b64 %r231,%r230,3; add.u64 %r232,%r186,16; add.u64 %r182,%r231,%r232; .loc 1 75 12 mov.u64 %r382,0; $L8: .loc 1 71 18 ld.u64 %r233,[%r177]; st.u64 [%r67],%r233; .loc 1 72 19 ld.u64 %r235,[%r177+16]; add.u64 %r234,%r235,1; ld.u64 %r236,[%r177+8]; sub.u64 %r38,%r234,%r236; .loc 1 74 10 setp.lt.s64 %r237,%r38,0; @ %r237 bra $L6; .loc 1 72 17 st.u64 [%r70],%r38; bra $L7; $L6: .loc 1 75 12 st.u64 [%r70],%r382; $L7: .loc 1 69 3 add.u64 %r177,%r177,24; add.u64 %r67,%r67,8; add.u64 %r70,%r70,8; setp.ne.u64 %r239,%r177,%r182; @ %r239 bra 
$L8; bra $L9; $L54: .loc 1 86 6 ld.u64 %r240,[%r185]; setp.eq.u64 %r241,%r240,0; @ ! %r241 bra $L11; bra $L10; $L4: add.u64 %r243,%r25,%r25; add.u64 %r244,%r243,%r25; shl.b64 %r245,%r244,3; add.u64 %r246,%r245,40; add.u64 %r115,%r186,%r246; shl.b64 %r247,%r25,3; add.u64 %r105,%r247,-8; add.u64 %r371,%frame,120; add.u64 %r108,%r371,%r105; add.u64 %r370,%frame,240; add.u64 %r62,%r370,%r105; add.u64 %r250,%r186,40; cvt.u32.u32 %r252,%r22; cvt.s64.s8 %r251,%r252; add.u64 %r254,%r251,%r251; add.u64 %r255,%r254,%r251; shl.b64 %r256,%r255,3; add.u64 %r180,%r250,%r256; .loc 1 83 12 mov.u64 %r381,0; $L14: .loc 1 79 18 ld.u64 %r257,[%r115]; st.u64 [%r108],%r257; .loc 1 80 19 ld.u64 %r259,[%r115+16]; add.u64 %r258,%r259,1; ld.u64 %r260,[%r115+8]; sub.u64 %r45,%r258,%r260; .loc 1 82 10 setp.lt.s64 %r261,%r45,0; @ %r261 bra $L12; .loc 1 80 17 st.u64 [%r62],%r45; bra $L13; $L12: .loc 1 83 12 st.u64 [%r62],%r381; $L13: .loc 1 77 3 add.u64 %r115,%r115,24; add.u64 %r108,%r108,8; add.u64 %r62,%r62,8; setp.ne.u64 %r263,%r115,%r180; @ %r263 bra $L14; bra $L55; $L10: .loc 1 90 7 setp.le.s64 %r264,%r118,0; @ %r264 bra $L16; add.u64 %r370,%frame,240; $L38: add.u64 %r153,%r185,40; mov.u64 %r149,%r370; cvt.u32.u32 %r266,%r22; cvt.s64.s8 %r265,%r266; add.u64 %r268,%r265,%r265; add.u64 %r269,%r268,%r265; shl.b64 %r270,%r269,3; add.u64 %r271,%r185,16; add.u64 %r116,%r270,%r271; .loc 1 93 10 mov.u64 %r112,1; .loc 1 97 4 mov.u64 %r294,0; bra $L17; $L16: .loc 1 101 24 mov.u64 %r272,0; st.u64 [%r185+8],%r272; .loc 1 102 28 cvt.u16.u32 %r275,%r22; add.u16 %r274,%r275,-1; cvt.u32.u16 %r276,%r274; st.u8 [%r185+28],%r276; .loc 1 104 20 add.u32 %r277,%r22,-2; cvt.s64.s32 %r54,%r277; add.u64 %r279,%r54,%r54; add.u64 %r280,%r279,%r54; shl.b64 %r281,%r280,3; add.u64 %r282,%r185,%r281; .loc 1 104 67 shl.b64 %r284,%r54,3; add.u64 %r285,%frame,%r284; .loc 1 104 59 ld.u64 %r287,[%r282+40]; ld.u64 %r288,[%r285+240]; mul.lo.u64 %r124,%r287,%r288; .loc 1 106 29 mov.u64 %r290190; call (%value_in),_gfortrani_xmallocarray291,[%value_in]; } .loc 1 106 27 st.u64 [%r185],%r291; .loc 1 107 10 setp.eq.u64 %r293,%r124,0; @ ! 
%r293 bra $L21; bra $L18; $L20: .loc 1 95 48 mul.lo.u64 %r112,%r50,%r112; $L17: .loc 1 97 4 st.u64 [%r153+8],%r294; ld.u64 %r50,[%r149]; add.u64 %r295,%r50,-1; st.u64 [%r153+16],%r295; st.u64 [%r153],%r112; .loc 1 90 7 add.u64 %r153,%r153,24; add.u64 %r149,%r149,8; setp.ne.u64 %r296,%r116,%r153; @ %r296 bra $L20; bra $L16; $L18: .loc 1 110 4 st.u64 [%r185+48],%r124; mov.u64 %r298,-1; st.u64 [%r185+56],%r298; mov.u64 %r299,1; st.u64 [%r185+40],%r299; .loc 1 111 4 bra $L1; $L11: .loc 1 116 19 ld.s8 %r60,[%r185+28]; .loc 1 116 10 setp.eq.u64 %r300,%r60,%r118; @ %r300 bra $L23; .loc 1 117 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 122 11 cvta.global.u64 %r303,_gfortrani_compile_options; .loc 1 122 10 ld.u32 %r304,[%r303+36]; setp.eq.u32 %r305,%r304,0; @ %r305 bra $L21; .loc 1 123 2 add.u64 %r370,%frame,240; cvta.const.u64 %r309,$LC2r370308309; call _gfortrani_bounds_ifunction_return$L21: .loc 1 127 3 setp.gt.s64 %r311,%r118,0; @ %r311 bra $L24; $L27: .loc 1 64 9 add.u64 %r312,%r31,1; .loc 1 64 7 sub.u64 %r120,%r312,%r33; max.s64 %r43,%r120,0; .loc 1 135 8 ld.u64 %r111,[%r185]; .loc 1 138 8 ld.u64 %r132,[%r186]; .loc 1 160 35 shl.b64 %r74,%r122,4; .loc 1 147 22 add.u64 %r313,%r43,-1; .loc 1 147 15 mul.lo.u64 %r68,%r313,%r74; .loc 1 148 34 neg.s64 %r71,%r74; ld.u64 %r82,[%frame+360]; .loc 1 172 22 ld.u64 %r75,[%frame+120]; .loc 1 172 12 shl.b64 %r77,%r75,4; .loc 1 173 22 ld.u64 %r78,[%frame]; .loc 1 173 12 shl.b64 %r80,%r78,3; setp.ne.u32 %r369,%r190,0; setp.le.s64 %r372,%r118,1; .loc 1 148 4 setp.le.s64 %r374,%r120,0; .loc 1 160 4 setp.gt.s64 %r375,%r120,0; cvt.u32.u32 %r376,%r22; cvt.s64.s8 %r377,%r376; add.u64 %r378,%r377,-1; add.u64 %r379,%frame,120; add.u64 %r380,%frame,240; bra $L25; $L24: add.u64 %r170,%frame,360; add.u64 %r169,%r185,40; mov.u64 %r167,%frame; cvt.u32.u32 %r315,%r22; cvt.s64.s8 %r314,%r315; add.u64 %r154,%r314,-1; .loc 1 127 3 mov.u64 %r133,0; add.u64 %r370,%frame,240; .loc 1 129 16 mov.u64 %r316,%r133; $L26: st.u64 [%r170],%r316; .loc 1 130 18 ld.u64 %r317,[%r169]; st.u64 [%r167],%r317; .loc 1 131 17 shl.b64 %r319,%r133,3; add.u64 %r320,%r370,%r319; .loc 1 131 10 ld.u64 %r321,[%r320]; setp.le.s64 %r322,%r321,0; @ %r322 bra $L1; .loc 1 127 26 add.u64 %r133,%r133,1; .loc 1 127 3 add.u64 %r170,%r170,8; add.u64 %r169,%r169,24; add.u64 %r167,%r167,8; setp.ne.u64 %r323,%r133,%r154; @ %r323 bra $L26; bra $L27; $L25: .loc 1 145 10 @ %r369 bra $L28; .loc 1 160 4 @ %r375 bra $L39; .loc 1 144 14 mov.u64 %r126,0; bra $L30; $L28: .loc 1 147 8 add.u64 %r127,%r132,%r68; .loc 1 148 4 @ %r374 bra $L40; .loc 1 148 11 mov.u64 %r126,%r43; $L32: .loc 1 150 11 ld.u64 %r327,[%r127]; setp.ne.u64 %r329,%r327,%r367; @ %r329 bra $L41; ld.u64 %r330,[%r127+8]; setp.ne.u64 %r332,%r330,%r368; @ ! %r332 bra $L30; $L41: .loc 1 148 26 add.u64 %r126,%r126,-1; .loc 1 148 34 add.u64 %r127,%r127,%r71; .loc 1 148 4 setp.ne.u64 %r333,%r126,0; @ %r333 bra $L32; bra $L30; $L39: .loc 1 160 4 mov.u64 %r125,%r132; .loc 1 160 11 mov.u64 %r126,1; $L29: .loc 1 162 11 ld.u64 %r334,[%r125]; setp.ne.u64 %r336,%r334,%r367; @ %r336 bra $L42; ld.u64 %r337,[%r125+8]; setp.ne.u64 %r339,%r337,%r368; @ ! 
%r339 bra $L30; $L42: .loc 1 160 27 add.u64 %r126,%r126,1; .loc 1 160 35 add.u64 %r125,%r125,%r74; .loc 1 160 4 setp.ge.s64 %r340,%r43,%r126; @ %r340 bra $L29; .loc 1 144 14 mov.u64 %r126,0; bra $L30; $L40: mov.u64 %r126,0; $L30: .loc 1 169 13 st.u64 [%r111],%r126; .loc 1 171 15 add.u64 %r82,%r82,1; .loc 1 172 12 add.u64 %r132,%r132,%r77; .loc 1 173 12 add.u64 %r111,%r111,%r80; .loc 1 175 32 ld.u64 %r141,[%frame+240]; .loc 1 175 13 setp.ne.u64 %r341,%r82,%r141; @ %r341 bra $L25; .loc 1 178 23 mul.lo.u64 %r342,%r75,%r82; .loc 1 178 9 shl.b64 %r343,%r342,4; sub.u64 %r129,%r132,%r343; .loc 1 179 23 mul.lo.u64 %r344,%r78,%r82; .loc 1 179 9 shl.b64 %r345,%r344,3; sub.u64 %r130,%r111,%r345; .loc 1 181 7 @ %r372 bra $L1; add.u64 %r66,%frame,368; mov.u64 %r159,8; .loc 1 180 5 mov.u64 %r131,1; .loc 1 177 13 mov.u64 %r373,0; bra $L36; $L37: st.u64 [%r66],%r373; .loc 1 178 23 mul.lo.u64 %r351,%r95,%r94; .loc 1 178 9 shl.b64 %r352,%r351,4; sub.u64 %r129,%r132,%r352; .loc 1 179 23 mul.lo.u64 %r353,%r98,%r94; .loc 1 179 9 shl.b64 %r354,%r353,3; sub.u64 %r130,%r111,%r354; .loc 1 180 5 add.u64 %r131,%r131,1; .loc 1 181 7 add.u64 %r66,%r66,8; add.u64 %r159,%r159,8; setp.eq.u64 %r355,%r131,%r378; @ %r355 bra $L1; $L36: .loc 1 188 16 ld.u64 %r356,[%r66]; add.u64 %r94,%r356,1; st.u64 [%r66],%r94; .loc 1 189 23 add.u64 %r358,%r379,%r159; ld.u64 %r95,[%r358]; .loc 1 189 13 shl.b64 %r359,%r95,4; add.u64 %r132,%r129,%r359; .loc 1 190 23 add.u64 %r360,%frame,%r159; ld.u64 %r98,[%r360]; .loc 1 190 13 shl.b64 %r361,%r98,3; add.u64 %r111,%r130,%r361; .loc 1 175 32 add.u64 %r363,%r380,%r159; ld.u64 %r102,[%r363]; .loc 1 175 13 setp.eq.u64 %r364,%r94,%r102; @ %r364 bra $L37; .loc 1 177 13 mov.u64 %r82,0; bra $L25; $L55: .loc 1 86 6 ld.u64 %r365,[%r185]; setp.eq.u64 %r366,%r365,0; @ ! 
%r366 bra $L11; bra $L38; $L1: .loc 1 194_gfortran_mfindloc1_i16 .visible .func _gfortran_mfindloc1_i16 {608u32 %r210; .reg .u32 %r211; .reg .u64 %r213; .reg .u32 %r215; .reg .u32 %r216; .reg .u16u64u64 %r282; .reg .u64 %r283; .reg .u64 %r284; .reg .u64predu64 %r313; .reg .u64 %r314; .reg .pred %r315; .reg .u64 %r317; .reg .u64 %r319; .reg .predu32 %r325; .reg .u64 %r327; .reg .u64u16 %r332; .reg .u16 %r333; .reg .u32 %r334; .reg .u32pred %r354; .reg .u64 %r356; .reg .u64 %r357; .reg .predpredpred %r379; .reg .pred %r380; .reg .pred %r382; .reg .u16 %r384; .reg .u32 %r385; .reg .pred %r386; .reg .u64 %r387; .reg .predpredpred %r403; .reg .predu64 %r414; .reg .u64 %r415; .reg .u64 %r416; .reg .u64 %r418; .reg .pred %r419; .reg .u64 %r421; .reg .u64 %r422; .reg .predpredpred.reg .u64 %r442; mov.u64 %r204,%ar0; mov.u64 %r205,%ar1; mov.u64 %r206,%ar2; ld.u64 %r424,[%r206]; ld.u64 %r425,[%r206+8]; mov.u64 %r208,%ar3; mov.u64 %r209,%ar4; mov.u32 %r210,%ar5; .loc 1 225 10 ld.s8 %r22,[%r205+28]; .loc 1 225 38 add.u32 %r211,%r22,-1; .loc 1 225 8 cvt.s64.s32 %r147,%r211; .loc 1 226 10 ld.u64 %r25,[%r208]; .loc 1 226 7 add.u64 %r148,%r25,-1; .loc 1 228 7 shr.u64 %r213,%r148,63; set.u32.lt.s64 %r215,%r147,%r148; neg.s32 %r216,%r215; cvt.u16.u64 %r219,%r213; cvt.u16.u32 %r220,%r216; or.b16 %r218,%r219,%r220; .loc 1 228 6 cvt.u32.u16 %r221,%r218; cvt.u16.u8 %r222,%r221; setp.eq.u16 %r223,%r222,0; @ %r223 bra $L57; .loc 1 230 7 cvt.u32.u32 %r226,%r22; cvt.s64.s8 %r225,%r226; st.u64 [%stack+8],%r225242stack; call _gfortran_runtime_error7: .loc 1 235 9 add.u64 %r229,%r148,%r148; add.u64 %r230,%r229,%r148; shl.b64 %r231,%r230,3; add.u64 %r232,%r205,%r231; ld.u64 %r31,[%r232+56]; ld.u64 %r33,[%r232+48]; .loc 1 239 9 ld.u64 %r151,[%r232+40]; .loc 1 240 12 add.u64 %r250,%r209,%r231; ld.u64 %r34,[%r250+40]; ld.u64 %r36,[%r209+16]; .loc 1 242 9 ld.u64 %r160,[%r209]; .loc 1 246 22 cvt.u32.u64 %r39,%r36; .loc 1 246 53 add.u32 %r252,%r39,-4; and.b32 %r253,%r252,-5; set.u32.eq.u32 %r255,%r253,0; neg.s32 %r256,%r255; .loc 1 246 22 add.u32 %r257,%r39,-1; set.u32.le.u32 %r259,%r257,1; neg.s32 %r260,%r259; .loc 1 246 6 cvt.u16.u32 %r262,%r256; cvt.u16.u32 %r263,%r260; or.b16 %r261,%r262,%r263; cvt.u32.u16 %r264,%r261; cvt.u16.u8 %r265,%r264; setp.eq.u16 %r266,%r265,0; @ %r266 bra $L58; $L62: .loc 1 255 3 setp.ne.u64 %r267,%r148,0; @ %r267 bra $L59; $L66: .loc 1 264 3 setp.gt.s64 %r268,%r147,%r148; @ %r268 bra $L60; bra $L119; $L58: .loc 1 248 7 setp.eq.u32 %r270,%r39,16; @ %r270 bra $L62; .loc 1 253 5 cvta.const.u64 %r272,$LC4; mov.u64 %r271,call _gfortrani_internal_error9: add.u64 %r146,%r205,40; add.u64 %r167,%r209,40; add.u64 %r274,%r25,%r25; add.u64 %r275,%r274,%r25; shl.b64 %r276,%r275,3; add.u64 %r277,%r205,16; add.u64 %r101,%r276,%r277; .loc 1 255 3 mov.u64 %r200,0; add.u64 %r429,%frame,240; add.u64 %r430,%frame,120; add.u64 %r427,%frame,360; .loc 1 262 12 mov.u64 %r442,%r200; $L65: .loc 1 257 18 add.u64 %r279,%r429,%r200; ld.u64 %r280,[%r146]; st.u64 [%r279],%r280; .loc 1 258 18 add.u64 %r282,%r430,%r200; .loc 1 258 20 ld.u64 %r284,[%r167]; mul.lo.u64 %r283,%r284,%r36; .loc 1 258 18 st.u64 [%r282],%r283; .loc 1 259 19 ld.u64 %r286,[%r146+16]; add.u64 %r285,%r286,1; ld.u64 %r287,[%r146+8]; sub.u64 %r53,%r285,%r287; .loc 1 261 10 setp.lt.s64 %r288,%r53,0; @ %r288 bra $L63; .loc 1 259 17 add.u64 %r290,%r427,%r200; st.u64 [%r290],%r53; bra $L64; $L63: .loc 1 262 12 add.u64 %r292,%r427,%r200; st.u64 [%r292],%r442; $L64: .loc 1 255 3 add.u64 %r146,%r146,24; add.u64 %r167,%r167,24; add.u64 %r200,%r200,8; setp.ne.u64 
%r294,%r101,%r146; @ %r294 bra $L65; bra $L66; $L119: .loc 1 274 6 ld.u64 %r295,[%r204]; setp.eq.u64 %r296,%r295,0; @ ! %r296 bra $L68; bra $L67; $L60: add.u64 %r298,%r25,%r25; add.u64 %r299,%r298,%r25; shl.b64 %r300,%r299,3; add.u64 %r143,%r300,40; add.u64 %r156,%r205,%r143; add.u64 %r141,%r209,%r143; shl.b64 %r301,%r25,3; add.u64 %r137,%r301,-8; cvt.u32.u32 %r303,%r22; cvt.s64.s8 %r302,%r303; shl.b64 %r304,%r302,3; add.u64 %r84,%r304,-8; add.u64 %r429,%frame,240; add.u64 %r430,%frame,120; add.u64 %r427,%frame,360; .loc 1 271 12 mov.u64 %r441,0; $L71: .loc 1 266 18 add.u64 %r306,%r429,%r137; ld.u64 %r307,[%r156]; st.u64 [%r306],%r307; .loc 1 267 18 add.u64 %r309,%r430,%r137; .loc 1 267 20 ld.u64 %r311,[%r141]; mul.lo.u64 %r310,%r311,%r36; .loc 1 267 18 st.u64 [%r309],%r310; .loc 1 268 19 ld.u64 %r313,[%r156+16]; add.u64 %r312,%r313,1; ld.u64 %r314,[%r156+8]; sub.u64 %r62,%r312,%r314; .loc 1 270 10 setp.lt.s64 %r315,%r62,0; @ %r315 bra $L69; .loc 1 268 17 add.u64 %r317,%r427,%r137; st.u64 [%r317],%r62; bra $L70; $L69: .loc 1 271 12 add.u64 %r319,%r427,%r137; st.u64 [%r319],%r441; $L70: .loc 1 264 3 add.u64 %r156,%r156,24; add.u64 %r141,%r141,24; add.u64 %r137,%r137,8; setp.ne.u64 %r321,%r84,%r137; @ %r321 bra $L71; bra $L120; $L67: .loc 1 278 7 setp.le.s64 %r322,%r147,0; @ %r322 bra $L73; add.u64 %r427,%frame,360; $L96: add.u64 %r179,%r204,40; mov.u64 %r176,%r427; add.u64 %r323,%r204,16; cvt.u32.u32 %r325,%r22; cvt.s64.s8 %r324,%r325; add.u64 %r327,%r324,%r324; add.u64 %r328,%r327,%r324; shl.b64 %r329,%r328,3; add.u64 %r169,%r323,%r329; .loc 1 281 10 mov.u64 %r140,1; .loc 1 285 4 mov.u64 %r352,0; bra $L74; $L73: .loc 1 289 24 mov.u64 %r330,0; st.u64 [%r204+8],%r330; .loc 1 290 28 cvt.u16.u32 %r333,%r22; add.u16 %r332,%r333,-1; cvt.u32.u16 %r334,%r332; st.u8 [%r204+28],%r334; .loc 1 292 20 add.u32 %r335,%r22,-2; cvt.s64.s32 %r72,%r335; add.u64 %r337,%r72,%r72; add.u64 %r338,%r337,%r72; shl.b64 %r339,%r338,3; add.u64 %r340,%r204,%r339; .loc 1 292 67 shl.b64 %r342,%r72,3; add.u64 %r343,%frame,%r342; .loc 1 292 59 ld.u64 %r345,[%r340+40]; ld.u64 %r346,[%r343+360]; mul.lo.u64 %r154,%r345,%r346; .loc 1 294 29 mov.u64 %r3481548; call (%value_in),_gfortrani_xmallocarray9,[%value_in]; } .loc 1 294 27 st.u64 [%r204],%r349; .loc 1 295 10 setp.eq.u64 %r351,%r154,0; @ ! 
%r351 bra $L78; bra $L75; $L77: .loc 1 283 48 mul.lo.u64 %r140,%r68,%r140; $L74: .loc 1 285 4 st.u64 [%r179+8],%r352; ld.u64 %r68,[%r176]; add.u64 %r353,%r68,-1; st.u64 [%r179+16],%r353; st.u64 [%r179],%r140; .loc 1 278 7 add.u64 %r179,%r179,24; add.u64 %r176,%r176,8; setp.ne.u64 %r354,%r169,%r179; @ %r354 bra $L77; bra $L73; $L75: .loc 1 298 4 st.u64 [%r204+48],%r154; mov.u64 %r356,-1; st.u64 [%r204+56],%r356; mov.u64 %r357,1; st.u64 [%r204+40],%r357; .loc 1 299 4 bra $L56; $L68: .loc 1 304 19 ld.s8 %r79,[%r204+28]; .loc 1 304 10 setp.eq.u64 %r358,%r79,%r147; @ %r358 bra $L80; .loc 1 305 2 st.u64 [%stack+8],%r147; st.u64 [%stack],%r79;_gfortran_runtime_error80: .loc 1 310 11 cvta.global.u64 %r361,_gfortrani_compile_options; .loc 1 310 10 ld.u32 %r362,[%r361+36]; setp.eq.u32 %r363,%r362,0; @ %r363 bra $L78; .loc 1 311 2 add.u64 %r427,%frame,360; cvta.const.u64 %r367,$LC2r4273667; call _gfortrani_bounds_ifunction_return$L78: .loc 1 315 3 setp.gt.s64 %r369,%r147,0; @ %r369 bra $L81; $L84: .loc 1 235 9 add.u64 %r370,%r31,1; .loc 1 235 7 sub.u64 %r149,%r370,%r33; max.s64 %r65,%r149,0; .loc 1 240 12 mul.lo.u64 %r37,%r34,%r36; .loc 1 323 8 ld.u64 %r164,[%r204]; .loc 1 326 8 ld.u64 %r163,[%r205]; .loc 1 351 35 shl.b64 %r100,%r151,4; .loc 1 336 22 add.u64 %r64,%r65,-1; .loc 1 336 15 mul.lo.u64 %r89,%r64,%r100; .loc 1 337 29 mul.lo.u64 %r177,%r37,%r64; .loc 1 338 34 neg.s64 %r95,%r100; .loc 1 363 22 ld.u64 %r102,[%frame+240]; .loc 1 363 12 shl.b64 %r104,%r102,4; .loc 1 364 23 ld.u64 %r105,[%frame+120]; .loc 1 365 22 ld.u64 %r107,[%frame]; .loc 1 365 12 shl.b64 %r109,%r107,3; .loc 1 362 12 ld.u64 %r203,[%frame+480]; setp.ne.u32 %r428,%r210,0; add.u64 %r426,%frame,488; .loc 1 338 4 setp.le.s64 %r434,%r149,0; .loc 1 382 23 add.u64 %r435,%frame,240; .loc 1 367 32 add.u64 %r436,%frame,360; .loc 1 371 20 add.u64 %r437,%frame,120; bra $L82; $L81: add.u64 %r198,%frame,480; add.u64 %r197,%r204,40; mov.u64 %r195,%frame; cvt.u32.u32 %r372,%r22; cvt.s64.s8 %r371,%r372; add.u64 %r181,%r371,-1; .loc 1 315 3 mov.u64 %r165,0; add.u64 %r427,%frame,360; .loc 1 317 16 mov.u64 %r373,%r165; $L83: st.u64 [%r198],%r373; .loc 1 318 18 ld.u64 %r374,[%r197]; st.u64 [%r195],%r374; .loc 1 319 17 shl.b64 %r376,%r165,3; add.u64 %r377,%r427,%r376; .loc 1 319 10 ld.u64 %r378,[%r377]; setp.le.s64 %r379,%r378,0; @ %r379 bra $L56; .loc 1 315 26 add.u64 %r165,%r165,1; .loc 1 315 3 add.u64 %r198,%r198,8; add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; setp.ne.u64 %r380,%r165,%r181; @ %r380 bra $L83; bra $L84; $L82: .loc 1 334 10 @ %r428 bra $L85; .loc 1 351 4 setp.gt.s64 %r382,%r149,0; @ %r382 bra $L97; .loc 1 333 14 mov.u64 %r158,0; bra $L87; $L85: .loc 1 336 8 add.u64 %r29,%r163,%r89; .loc 1 337 9 add.u64 %r159,%r160,%r177; .loc 1 338 4 @ %r434 bra $L98; .loc 1 338 11 mov.u64 %r158,%r65; $L90: .loc 1 340 11 ld.s8 %r385,[%r159]; cvt.u16.u32 %r384,%r385; setp.eq.u16 %r386,%r384,0; @ %r386 bra $L88; .loc 1 340 18 ld.u64 %r387,[%r29]; setp.ne.u64 %r389,%r387,%r424; @ %r389 bra $L88; ld.u64 %r390,[%r29+8]; setp.ne.u64 %r392,%r390,%r425; @ ! 
%r392 bra $L87; $L88: .loc 1 338 26 add.u64 %r158,%r158,-1; .loc 1 338 34 add.u64 %r29,%r29,%r95; .loc 1 338 53 sub.u64 %r159,%r159,%r37; .loc 1 338 4 setp.ne.u64 %r393,%r158,0; @ %r393 bra $L90; bra $L87; $L97: .loc 1 351 4 mov.u64 %r157,%r160; mov.u64 %r111,%r163; .loc 1 351 11 mov.u64 %r158,1; $L86: .loc 1 353 11 ld.s8 %r395,[%r157]; cvt.u16.u32 %r394,%r395; setp.eq.u16 %r396,%r394,0; @ %r396 bra $L91; .loc 1 353 18 ld.u64 %r397,[%r111]; setp.ne.u64 %r399,%r397,%r424; @ %r399 bra $L91; ld.u64 %r400,[%r111+8]; setp.ne.u64 %r402,%r400,%r425; @ ! %r402 bra $L87; $L91: .loc 1 351 27 add.u64 %r158,%r158,1; .loc 1 351 35 add.u64 %r111,%r111,%r100; .loc 1 351 54 add.u64 %r157,%r157,%r37; .loc 1 351 4 setp.ge.s64 %r403,%r65,%r158; @ %r403 bra $L86; .loc 1 333 14 mov.u64 %r158,0; bra $L87; $L98: mov.u64 %r158,0; $L87: .loc 1 360 13 st.u64 [%r164],%r158; .loc 1 362 15 add.u64 %r203,%r203,1; st.u64 [%frame+480],%r203; .loc 1 363 12 add.u64 %r163,%r163,%r104; .loc 1 364 13 add.u64 %r160,%r160,%r105; .loc 1 365 12 add.u64 %r164,%r164,%r109; .loc 1 367 32 ld.u64 %r130,[%frame+360]; .loc 1 367 13 setp.ne.u64 %r404,%r203,%r130; @ %r404 bra $L82; mov.u64 %r126,%r426; mov.u64 %r127,%r107; mov.u64 %r202,%r105; mov.u64 %r124,%r102; mov.u64 %r120,8; .loc 1 366 9 mov.u64 %r162,0; .loc 1 369 13 mov.u64 %r406,%r162; $L95: st.u64 [%r126+-8],%r406; .loc 1 370 23 mul.lo.u64 %r113,%r130,%r124; .loc 1 371 24 mul.lo.u64 %r407,%r130,%r202; .loc 1 371 10 sub.u64 %r160,%r160,%r407; .loc 1 372 23 mul.lo.u64 %r119,%r130,%r127; .loc 1 373 5 add.u64 %r162,%r162,1; .loc 1 374 7 setp.le.s64 %r408,%r147,%r162; @ %r408 bra $L56; .loc 1 381 16 ld.u64 %r409,[%r126]; add.u64 %r123,%r409,1; st.u64 [%r126],%r123; .loc 1 382 23 add.u64 %r411,%r435,%r120; ld.u64 %r124,[%r411]; .loc 1 382 13 sub.u64 %r412,%r124,%r113; shl.b64 %r413,%r412,4; add.u64 %r163,%r163,%r413; .loc 1 383 23 add.u64 %r414,%frame,%r120; ld.u64 %r127,[%r414]; .loc 1 383 13 sub.u64 %r415,%r127,%r119; shl.b64 %r416,%r415,3; add.u64 %r164,%r164,%r416; .loc 1 367 32 add.u64 %r418,%r436,%r120; ld.u64 %r130,[%r418]; .loc 1 367 13 add.u64 %r126,%r126,8; setp.eq.u64 %r419,%r123,%r130; @ %r419 bra $L94; .loc 1 362 12 ld.u64 %r203,[%frame+480]; bra $L82; $L94: .loc 1 371 20 add.u64 %r421,%r437,%r120; ld.u64 %r202,[%r421]; add.u64 %r120,%r120,8; bra $L95; $L120: .loc 1 274 6 ld.u64 %r422,[%r204]; setp.eq.u64 %r423,%r422,0; @ ! 
%r423 bra $L68; bra $L96; $L56: .loc 1 387_gfortran_sfindloc1_i16 .visible .func _gfortran_sfindloc1_i16 {84u64 %r126; .reg .u64predpredpredpredu64predpredmov.u64 %r149,%ar0; mov.u64 %r150,%ar1; mov.u64 %r151,%ar2; ld.u64 %r298,[%r151]; ld.u64 %r299,[%r151+8]; mov.u64 %r153,%ar3; mov.u64 %r154,%ar4; mov.u32 %r155,%ar5; .loc 1 410 6 setp.eq.u64 %r156,%r154,0; @ %r156 bra $L122; .loc 1 410 20 ld.u32 %r157,[%r154]; setp.eq.u32 %r158,%r157,0; @ %r158 bra $L123; $L122: .loc 1 412 7 st.u64 [%frame+368],%r298; st.u64 [%frame+376],%r299; add.u64 %r166,%frame,368155; call _gfortran_findloc1_i16); } .loc 1 413 7 bra $L121; $L123: .loc 1 416 10 ld.s8 %r23,[%r150+28]; .loc 1 416 38 add.u32 %r167,%r23,-1; .loc 1 416 8 cvt.s64.s32 %r90,%r167; .loc 1 417 10 ld.u64 %r26,[%r153]; .loc 1 417 7 add.u64 %r91,%r26,-1; .loc 1 419 7 shr.u64 %r169,%r91,63; set.u32.lt.s64 %r171,%r90,%r91; neg.s32 %r172,%r171; cvt.u16.u64 %r175,%r169; cvt.u16.u32 %r176,%r172; or.b16 %r174,%r175,%r176; .loc 1 419 6 cvt.u32.u16 %r177,%r174; cvt.u16.u8 %r178,%r177; setp.ne.u16 %r179,%r178,0; @ %r179 bra $L125; .loc 1 430 3 setp.ne.u64 %r180,%r91,0; @ %r180 bra $L126; $L132: .loc 1 438 3 setp.gt.s64 %r181,%r90,%r91; @ %r181 bra $L127; bra $L169; $L125: .loc 1 421 7 cvt.u32.u32 %r184,%r23; cvt.s64.s8 %r183,%r184; st.u64 [%stack+8],%r183182182_gfortran_runtime_error126: add.u64 %r136,%r150,48; add.u64 %r138,%frame,120; add.u64 %r139,%r150,56; add.u64 %r186,%r150,24; add.u64 %r188,%r26,%r26; add.u64 %r189,%r188,%r26; shl.b64 %r190,%r189,3; add.u64 %r145,%r186,%r190; .loc 1 435 12 mov.u64 %r308,0; $L131: .loc 1 432 19 ld.u64 %r192,[%r139]; add.u64 %r191,%r192,1; ld.u64 %r193,[%r136]; sub.u64 %r35,%r191,%r193; .loc 1 434 10 setp.le.s64 %r194,%r35,0; @ %r194 bra $L129; .loc 1 432 17 st.u64 [%r138],%r35; bra $L130; $L129: .loc 1 435 12 st.u64 [%r138],%r308; $L130: .loc 1 430 3 add.u64 %r136,%r136,24; add.u64 %r138,%r138,8; add.u64 %r139,%r139,24; setp.ne.u64 %r196,%r136,%r145; @ %r196 bra $L131; bra $L132; $L169: .loc 1 448 6 ld.u64 %r197,[%r149]; setp.eq.u64 %r198,%r197,0; @ ! 
%r198 bra $L134; bra $L133; $L127: add.u64 %r200,%r26,%r26; add.u64 %r201,%r200,%r26; shl.b64 %r202,%r201,3; add.u64 %r203,%r202,48; add.u64 %r125,%r150,%r203; add.u64 %r300,%frame,120; shl.b64 %r205,%r26,3; add.u64 %r206,%r205,-8; add.u64 %r45,%r300,%r206; add.u64 %r207,%r150,48; cvt.u32.u32 %r209,%r23; cvt.s64.s8 %r208,%r209; add.u64 %r211,%r208,%r208; add.u64 %r212,%r211,%r208; shl.b64 %r213,%r212,3; add.u64 %r132,%r207,%r213; .loc 1 444 12 mov.u64 %r307,0; $L137: .loc 1 441 2 ld.u64 %r215,[%r125+8]; add.u64 %r214,%r215,1; ld.u64 %r216,[%r125]; sub.u64 %r40,%r214,%r216; .loc 1 443 10 setp.le.s64 %r217,%r40,0; @ %r217 bra $L135; .loc 1 440 17 st.u64 [%r45],%r40; bra $L136; $L135: .loc 1 444 12 st.u64 [%r45],%r307; $L136: .loc 1 438 3 add.u64 %r125,%r125,24; add.u64 %r45,%r45,8; setp.ne.u64 %r219,%r125,%r132; @ %r219 bra $L137; bra $L170; $L133: .loc 1 452 7 setp.le.s64 %r220,%r90,0; @ %r220 bra $L139; add.u64 %r300,%frame,120; $L155: add.u64 %r78,%r149,40; mov.u64 %r61,%r300; cvt.u32.u32 %r222,%r23; cvt.s64.s8 %r221,%r222; add.u64 %r224,%r221,%r221; add.u64 %r225,%r224,%r221; shl.b64 %r226,%r225,3; add.u64 %r227,%r149,16; add.u64 %r126,%r226,%r227; .loc 1 455 10 mov.u64 %r81,1; .loc 1 459 4 mov.u64 %r250,0; bra $L140; $L139: .loc 1 462 24 mov.u64 %r228,0; st.u64 [%r149+8],%r228; .loc 1 463 28 cvt.u16.u32 %r231,%r23; add.u16 %r230,%r231,-1; cvt.u32.u16 %r232,%r230; st.u8 [%r149+28],%r232; .loc 1 465 20 add.u32 %r233,%r23,-2; cvt.s64.s32 %r51,%r233; add.u64 %r235,%r51,%r51; add.u64 %r236,%r235,%r51; shl.b64 %r237,%r236,3; add.u64 %r238,%r149,%r237; .loc 1 465 67 shl.b64 %r240,%r51,3; add.u64 %r241,%frame,%r240; .loc 1 465 59 ld.u64 %r243,[%r238+40]; ld.u64 %r244,[%r241+120]; mul.lo.u64 %r94,%r243,%r244; .loc 1 467 29 mov.u64 %r246246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } .loc 1 467 27 st.u64 [%r149],%r247; .loc 1 468 10 setp.eq.u64 %r249,%r94,0; @ ! 
%r249 bra $L144; bra $L141; $L143: .loc 1 457 48 mul.lo.u64 %r81,%r47,%r81; $L140: .loc 1 459 4 st.u64 [%r78+8],%r250; ld.u64 %r47,[%r61]; add.u64 %r251,%r47,-1; st.u64 [%r78+16],%r251; st.u64 [%r78],%r81; .loc 1 452 7 add.u64 %r78,%r78,24; add.u64 %r61,%r61,8; setp.ne.u64 %r252,%r78,%r126; @ %r252 bra $L143; bra $L139; $L141: .loc 1 471 4 st.u64 [%r149+48],%r94; mov.u64 %r254,-1; st.u64 [%r149+56],%r254; mov.u64 %r255,1; st.u64 [%r149+40],%r255; .loc 1 472 4 bra $L121; $L134: .loc 1 477 19 ld.s8 %r58,[%r149+28]; .loc 1 477 10 setp.eq.u64 %r256,%r58,%r90; @ %r256 bra $L145; .loc 1 478 2 st.u64 [%stack+8],%r90; st.u64 [%stack],%r58;_gfortran_runtime_error145: .loc 1 483 11 cvta.global.u64 %r259,_gfortrani_compile_options; .loc 1 483 10 ld.u32 %r260,[%r259+36]; setp.eq.u32 %r261,%r260,0; @ %r261 bra $L144; .loc 1 484 2 add.u64 %r300,%frame,120; cvta.const.u64 %r265,$LC2r3002645; call _gfortrani_bounds_ifunction_return$L144: .loc 1 488 3 setp.gt.s64 %r267,%r90,0; @ %r267 bra $L146; $L149: .loc 1 495 8 ld.u64 %r97,[%r149]; ld.u64 %r42,[%frame+240]; .loc 1 503 22 ld.u64 %r64,[%frame]; .loc 1 503 12 shl.b64 %r66,%r64,3; setp.le.s64 %r301,%r90,1; cvt.u32.u32 %r303,%r23; cvt.s64.s8 %r304,%r303; add.u64 %r305,%r304,-1; add.u64 %r306,%frame,120; bra $L147; $L146: add.u64 %r114,%frame,240; add.u64 %r113,%r149,40; mov.u64 %r108,%frame; cvt.u32.u32 %r269,%r23; cvt.s64.s8 %r268,%r269; add.u64 %r79,%r268,-1; .loc 1 488 3 mov.u64 %r98,0; add.u64 %r300,%frame,120; .loc 1 490 16 mov.u64 %r270,%r98; $L148: st.u64 [%r114],%r270; .loc 1 491 18 ld.u64 %r271,[%r113]; st.u64 [%r108],%r271; .loc 1 492 17 shl.b64 %r273,%r98,3; add.u64 %r274,%r300,%r273; .loc 1 492 10 ld.u64 %r275,[%r274]; setp.le.s64 %r276,%r275,0; @ %r276 bra $L121; .loc 1 488 26 add.u64 %r98,%r98,1; .loc 1 488 3 add.u64 %r114,%r114,8; add.u64 %r113,%r113,24; add.u64 %r108,%r108,8; setp.ne.u64 %r277,%r79,%r98; @ %r277 bra $L148; bra $L149; $L147: .loc 1 500 13 mov.u64 %r278,0; st.u64 [%r97],%r278; .loc 1 502 15 add.u64 %r42,%r42,1; .loc 1 503 12 add.u64 %r97,%r97,%r66; .loc 1 505 32 ld.u64 %r88,[%frame+120]; .loc 1 505 13 setp.ne.u64 %r279,%r42,%r88; @ %r279 bra $L147; .loc 1 508 23 mul.lo.u64 %r280,%r64,%r42; .loc 1 508 9 shl.b64 %r281,%r280,3; sub.u64 %r30,%r97,%r281; .loc 1 510 7 @ ! %r301 bra $L171; bra $L121; $L154: .loc 1 507 13 st.u64 [%r102],%r278; .loc 1 508 23 mul.lo.u64 %r284,%r73,%r72; .loc 1 508 9 shl.b64 %r285,%r284,3; sub.u64 %r30,%r97,%r285; .loc 1 509 5 add.u64 %r96,%r96,1; .loc 1 510 7 add.u64 %r102,%r102,8; setp.ne.u64 %r286,%r96,%r305; @ %r286 bra $L153; bra $L121; $L171: add.u64 %r102,%frame,248; .loc 1 509 5 mov.u64 %r96,1; $L153: .loc 1 517 16 ld.u64 %r290,[%r102]; add.u64 %r72,%r290,1; st.u64 [%r102],%r72; shl.b64 %r120,%r96,3; .loc 1 518 23 add.u64 %r291,%frame,%r120; ld.u64 %r73,[%r291]; .loc 1 518 13 shl.b64 %r292,%r73,3; add.u64 %r97,%r30,%r292; .loc 1 505 32 add.u64 %r294,%r306,%r120; ld.u64 %r77,[%r294]; .loc 1 505 13 setp.eq.u64 %r295,%r72,%r77; @ %r295 bra $L154; .loc 1 507 13 mov.u64 %r42,0; bra $L147; $L170: .loc 1 448 6 ld.u64 %r296,[%r149]; setp.eq.u64 %r297,%r296,0; @ ! 
%r297 bra $L134; bra $L155; $L121: .loc 1 522findloc1_r4.o/_gfortran_findloc1_r4 .visible .func _gfortran_findloc1_rfortran/generated/findloc1_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc1_r4 .visible .func _gfortran_mfindloc1_r_gfortran_sfindloc1_r4 .visible .func _gfortran_sfindloc1_rVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_internal_error .extern .func _gfortrani_internal_8] = {70,73,78,68,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,914,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_findloc1_r4 .visible .func _gfortran_findloc1_r4898predpred %r238; .reg .u64pred %r260; .reg .pred %r262; .reg .pred %r263; .reg .u64 %r264; .reg .u32 %r265; .reg .u64u64 %r279; .reg .u64 %r280; .reg .u6464pred %r327; .reg .predpred %r331; .reg .pred %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64u64 %r352; .reg .u64 %r354; .reg .predpred %r358; .reg .pred %r359; .reg .u64predmov.u64 %r185,%ar0; mov.u64 %r186,%ar1; mov.f32 %r187,%ar2; mov.u64 %r188,%ar3; mov.u32 %r189,%ar4; .loc 1 54 10 ld.s8 %r22,[%r186+28]; .loc 1 54 38 add.u32 %r190,%r22,-1; .loc 1 54 8 cvt.s64.s32 %r118,%r190; .loc 1 55 10 ld.u64 %r25,[%r188]; .loc 1 55 7 add.u64 %r119,%r25,-1; .loc 1 57 7 shr.u64 %r192,%r119,63; set.u32.lt.s64 %r194,%r118,%r119; neg.s32 %r195,%r194; cvt.u16.u64 %r198,%r192; cvt.u16.u32 %r199,%r195; or.b16 %r197,%r198,%r199; .loc 1 57 6 cvt.u32.u16 %r200,%r197; cvt.u16.u8 %r201,%r200; setp.eq.u16 %r202,%r201,0; @ %r202 bra $L2; .loc 1 59 7 cvt.u32.u32 %r205,%r22; cvt.s64.s8 %r204,%r205; st.u64 [%stack+8],%r20403203_gfortran_runtime_errorr208,%r119,%r119; add.u64 %r209,%r208,%r119; shl.b64 %r210,%r209,3; add.u64 %r211,%r186,%r210; ld.u64 %r31,[%r211+56]; ld.u64 %r33,[%r211+48]; .loc 1 67 9 ld.u64 %r122,[%r211+40]; .loc 1 69 3 setp.ne.u64 %r225,%r119,0; @ %r225 bra $L3; $L9: .loc 1 77 3 setp.gt.s64 %r226,%r118,%r119; @ %r226 bra $L4; bra $L50; $L3: add.u64 %r177,%r186,40; add.u64 %r67,%frame,120; add.u64 %r70,%frame,240; add.u64 %r228,%r25,%r25; add.u64 %r229,%r228,%r25; shl.b64 %r230,%r229,3; add.u64 %r231,%r186,16; add.u64 %r182,%r230,%r231; .loc 1 75 12 mov.u64 %r371,0; $L8: .loc 1 71 18 ld.u64 %r232,[%r177]; st.u64 [%r67],%r232; .loc 1 72 19 ld.u64 %r234,[%r177+16]; add.u64 %r233,%r234,1; ld.u64 %r235,[%r177+8]; sub.u64 %r38,%r233,%r235; .loc 1 74 10 setp.lt.s64 %r236,%r38,0; @ %r236 bra $L6; .loc 1 72 17 st.u64 [%r70],%r38; bra $L7; $L6: .loc 1 75 12 st.u64 [%r70],%r371; $L7: .loc 1 69 3 add.u64 %r177,%r177,24; add.u64 %r67,%r67,8; add.u64 %r70,%r70,8; setp.ne.u64 %r238,%r177,%r182; @ %r238 bra $L8; bra $L9; $L50: .loc 1 86 6 ld.u64 %r239,[%r185]; setp.eq.u64 %r240,%r239,0; @ ! 
%r240 bra $L11; bra $L10; $L4: add.u64 %r242,%r25,%r25; add.u64 %r243,%r242,%r25; shl.b64 %r244,%r243,3; add.u64 %r245,%r244,40; add.u64 %r115,%r186,%r245; shl.b64 %r246,%r25,3; add.u64 %r105,%r246,-8; add.u64 %r361,%frame,120; add.u64 %r108,%r361,%r105; add.u64 %r360,%frame,240; add.u64 %r62,%r360,%r105; add.u64 %r249,%r186,40; cvt.u32.u32 %r251,%r22; cvt.s64.s8 %r250,%r251; add.u64 %r253,%r250,%r250; add.u64 %r254,%r253,%r250; shl.b64 %r255,%r254,3; add.u64 %r180,%r249,%r255; .loc 1 83 12 mov.u64 %r370,0; $L14: .loc 1 79 18 ld.u64 %r256,[%r115]; st.u64 [%r108],%r256; .loc 1 80 19 ld.u64 %r258,[%r115+16]; add.u64 %r257,%r258,1; ld.u64 %r259,[%r115+8]; sub.u64 %r45,%r257,%r259; .loc 1 82 10 setp.lt.s64 %r260,%r45,0; @ %r260 bra $L12; .loc 1 80 17 st.u64 [%r62],%r45; bra $L13; $L12: .loc 1 83 12 st.u64 [%r62],%r370; $L13: .loc 1 77 3 add.u64 %r115,%r115,24; add.u64 %r108,%r108,8; add.u64 %r62,%r62,8; setp.ne.u64 %r262,%r115,%r180; @ %r262 bra $L14; bra $L51; $L10: .loc 1 90 7 setp.le.s64 %r263,%r118,0; @ %r263 bra $L16; add.u64 %r360,%frame,240; $L36: add.u64 %r153,%r185,40; mov.u64 %r149,%r360; cvt.u32.u32 %r265,%r22; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r270,%r185,16; add.u64 %r116,%r269,%r270; .loc 1 93 10 mov.u64 %r112,1; .loc 1 97 4 mov.u64 %r293,0; bra $L17; $L16: .loc 1 101 24 mov.u64 %r271,0; st.u64 [%r185+8],%r271; .loc 1 102 28 cvt.u16.u32 %r274,%r22; add.u16 %r273,%r274,-1; cvt.u32.u16 %r275,%r273; st.u8 [%r185+28],%r275; .loc 1 104 20 add.u32 %r276,%r22,-2; cvt.s64.s32 %r54,%r276; add.u64 %r278,%r54,%r54; add.u64 %r279,%r278,%r54; shl.b64 %r280,%r279,3; add.u64 %r281,%r185,%r280; .loc 1 104 67 shl.b64 %r283,%r54,3; add.u64 %r284,%frame,%r283; .loc 1 104 59 ld.u64 %r286,[%r281+40]; ld.u64 %r287,[%r284+240]; mul.lo.u64 %r124,%r286,%r287; .loc 1 106 29 mov.u64 %r289189; call (%value_in),_gfortrani_xmallocarray290,[%value_in]; } .loc 1 106 27 st.u64 [%r185],%r290; .loc 1 107 10 setp.eq.u64 %r292,%r124,0; @ ! 
%r292 bra $L21; bra $L18; $L20: .loc 1 95 48 mul.lo.u64 %r112,%r50,%r112; $L17: .loc 1 97 4 st.u64 [%r153+8],%r293; ld.u64 %r50,[%r149]; add.u64 %r294,%r50,-1; st.u64 [%r153+16],%r294; st.u64 [%r153],%r112; .loc 1 90 7 add.u64 %r153,%r153,24; add.u64 %r149,%r149,8; setp.ne.u64 %r295,%r116,%r153; @ %r295 bra $L20; bra $L16; $L18: .loc 1 110 4 st.u64 [%r185+48],%r124; mov.u64 %r297,-1; st.u64 [%r185+56],%r297; mov.u64 %r298,1; st.u64 [%r185+40],%r298; .loc 1 111 4 bra $L1; $L11: .loc 1 116 19 ld.s8 %r60,[%r185+28]; .loc 1 116 10 setp.eq.u64 %r299,%r60,%r118; @ %r299 bra $L23; .loc 1 117 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 122 11 cvta.global.u64 %r302,_gfortrani_compile_options; .loc 1 122 10 ld.u32 %r303,[%r302+36]; setp.eq.u32 %r304,%r303,0; @ %r304 bra $L21; .loc 1 123 2 add.u64 %r360,%frame,240; cvta.const.u64 %r308,$LC2r360307308; call _gfortrani_bounds_ifunction_return$L21: .loc 1 127 3 setp.gt.s64 %r310,%r118,0; @ %r310 bra $L24; $L27: .loc 1 64 9 add.u64 %r311,%r31,1; .loc 1 64 7 sub.u64 %r120,%r311,%r33; max.s64 %r43,%r120,0; .loc 1 135 8 ld.u64 %r111,[%r185]; .loc 1 138 8 ld.u64 %r132,[%r186]; .loc 1 160 35 shl.b64 %r74,%r122,2; .loc 1 147 22 add.u64 %r312,%r43,-1; .loc 1 147 15 mul.lo.u64 %r68,%r312,%r74; .loc 1 148 34 neg.s64 %r71,%r74; ld.u64 %r82,[%frame+360]; .loc 1 172 22 ld.u64 %r75,[%frame+120]; .loc 1 172 12 shl.b64 %r77,%r75,2; .loc 1 173 22 ld.u64 %r78,[%frame]; .loc 1 173 12 shl.b64 %r80,%r78,3; setp.ne.u32 %r359,%r189,0; setp.le.s64 %r358,%r118,1; .loc 1 148 4 setp.le.s64 %r363,%r120,0; .loc 1 160 4 setp.gt.s64 %r364,%r120,0; cvt.u32.u32 %r365,%r22; cvt.s64.s8 %r366,%r365; add.u64 %r367,%r366,-1; add.u64 %r368,%frame,120; add.u64 %r369,%frame,240; bra $L25; $L24: add.u64 %r170,%frame,360; add.u64 %r169,%r185,40; mov.u64 %r167,%frame; cvt.u32.u32 %r314,%r22; cvt.s64.s8 %r313,%r314; add.u64 %r154,%r313,-1; .loc 1 127 3 mov.u64 %r133,0; add.u64 %r360,%frame,240; .loc 1 129 16 mov.u64 %r315,%r133; $L26: st.u64 [%r170],%r315; .loc 1 130 18 ld.u64 %r316,[%r169]; st.u64 [%r167],%r316; .loc 1 131 17 shl.b64 %r318,%r133,3; add.u64 %r319,%r360,%r318; .loc 1 131 10 ld.u64 %r320,[%r319]; setp.le.s64 %r321,%r320,0; @ %r321 bra $L1; .loc 1 127 26 add.u64 %r133,%r133,1; .loc 1 127 3 add.u64 %r170,%r170,8; add.u64 %r169,%r169,24; add.u64 %r167,%r167,8; setp.ne.u64 %r322,%r133,%r154; @ %r322 bra $L26; bra $L27; $L25: .loc 1 145 10 @ %r359 bra $L28; .loc 1 160 4 @ %r364 bra $L37; .loc 1 144 14 mov.u64 %r126,0; bra $L30; $L28: .loc 1 147 8 add.u64 %r127,%r132,%r68; .loc 1 148 4 @ %r363 bra $L38; .loc 1 148 11 mov.u64 %r126,%r43; $L31: .loc 1 150 11 ld.f32 %r326,[%r127]; setp.eq.f32 %r327,%r326,%r187; @ %r327 bra $L30; .loc 1 148 26 add.u64 %r126,%r126,-1; .loc 1 148 34 add.u64 %r127,%r127,%r71; .loc 1 148 4 setp.ne.u64 %r328,%r126,0; @ %r328 bra $L31; bra $L30; $L37: .loc 1 160 4 mov.u64 %r125,%r132; .loc 1 160 11 mov.u64 %r126,1; $L29: .loc 1 162 11 ld.f32 %r329,[%r125]; setp.eq.f32 %r330,%r329,%r187; @ %r330 bra $L30; .loc 1 160 27 add.u64 %r126,%r126,1; .loc 1 160 35 add.u64 %r125,%r125,%r74; .loc 1 160 4 setp.ge.s64 %r331,%r43,%r126; @ %r331 bra $L29; .loc 1 144 14 mov.u64 %r126,0; bra $L30; $L38: mov.u64 %r126,0; $L30: .loc 1 169 13 st.u64 [%r111],%r126; .loc 1 171 15 add.u64 %r82,%r82,1; .loc 1 172 12 add.u64 %r132,%r132,%r77; .loc 1 173 12 add.u64 %r111,%r111,%r80; .loc 1 175 32 ld.u64 %r141,[%frame+240]; .loc 1 175 13 setp.ne.u64 %r332,%r82,%r141; @ %r332 bra $L25; .loc 1 178 23 mul.lo.u64 %r333,%r75,%r82; .loc 1 178 9 shl.b64 
%r334,%r333,2; sub.u64 %r129,%r132,%r334; .loc 1 179 23 mul.lo.u64 %r335,%r78,%r82; .loc 1 179 9 shl.b64 %r336,%r335,3; sub.u64 %r130,%r111,%r336; .loc 1 181 7 @ %r358 bra $L1; add.u64 %r66,%frame,368; mov.u64 %r159,8; .loc 1 180 5 mov.u64 %r131,1; .loc 1 177 13 mov.u64 %r362,0; bra $L34; $L35: st.u64 [%r66],%r362; .loc 1 178 23 mul.lo.u64 %r342,%r95,%r94; .loc 1 178 9 shl.b64 %r343,%r342,2; sub.u64 %r129,%r132,%r343; .loc 1 179 23 mul.lo.u64 %r344,%r98,%r94; .loc 1 179 9 shl.b64 %r345,%r344,3; sub.u64 %r130,%r111,%r345; .loc 1 180 5 add.u64 %r131,%r131,1; .loc 1 181 7 add.u64 %r66,%r66,8; add.u64 %r159,%r159,8; setp.eq.u64 %r346,%r131,%r367; @ %r346 bra $L1; $L34: .loc 1 188 16 ld.u64 %r347,[%r66]; add.u64 %r94,%r347,1; st.u64 [%r66],%r94; .loc 1 189 23 add.u64 %r349,%r368,%r159; ld.u64 %r95,[%r349]; .loc 1 189 13 shl.b64 %r350,%r95,2; add.u64 %r132,%r129,%r350; .loc 1 190 23 add.u64 %r351,%frame,%r159; ld.u64 %r98,[%r351]; .loc 1 190 13 shl.b64 %r352,%r98,3; add.u64 %r111,%r130,%r352; .loc 1 175 32 add.u64 %r354,%r369,%r159; ld.u64 %r102,[%r354]; .loc 1 175 13 setp.eq.u64 %r355,%r94,%r102; @ %r355 bra $L35; .loc 1 177 13 mov.u64 %r82,0; bra $L25; $L51: .loc 1 86 6 ld.u64 %r356,[%r185]; setp.eq.u64 %r357,%r356,0; @ ! %r357 bra $L11; bra $L36; $L1: .loc 1 194_gfortran_mfindloc1_r4 .visible .func _gfortran_mfindloc1_r608u32 %r209; .reg .u32 %r210; .reg .u64 %r212; .reg .u32 %r214; .reg .u32 %r215; .reg .u16 %r217; .reg .u16 %r218; .reg .u16 %r219; .reg .u32 %r220; .reg .u16 %r221; .reg .predu64 %r230; .reg .u64 %r231; .reg .u64 %r249; .reg .u32u32u64 %r281; .reg .u64 %r282; .reg .u64 %r283; .reg .u64predu64u16 %r331; .reg .u16predpredu64 %r375; .reg .u64pred %r379; .reg .pred %r381; .reg .u16 %r383; .reg .u32 %r384; .reg .pred %r385; .reg .f32u32 %r390; .reg .pred %r391; .reg .f32 %r392; .reg .pred %r393; .reg .pred %r394; .reg .predu64u64 %r430; .reg .u64 %r431; mov.u64 %r204,%ar0; mov.u64 %r205,%ar1; mov.f32 %r206,%ar2; mov.u64 %r207,%ar3; mov.u64 %r208,%ar4; mov.u32 %r209,%ar5; .loc 1 225 10 ld.s8 %r22,[%r205+28]; .loc 1 225 38 add.u32 %r210,%r22,-1; .loc 1 225 8 cvt.s64.s32 %r147,%r210; .loc 1 226 10 ld.u64 %r25,[%r207]; .loc 1 226 7 add.u64 %r148,%r25,-1; .loc 1 228 7 shr.u64 %r212,%r148,63; set.u32.lt.s64 %r214,%r147,%r148; neg.s32 %r215,%r214; cvt.u16.u64 %r218,%r212; cvt.u16.u32 %r219,%r215; or.b16 %r217,%r218,%r219; .loc 1 228 6 cvt.u32.u16 %r220,%r217; cvt.u16.u8 %r221,%r220; setp.eq.u16 %r222,%r221,0; @ %r222 bra $L53; .loc 1 230 7 cvt.u32.u32 %r225,%r22; cvt.s64.s8 %r224,%r225; st.u64 [%stack+8],%r224232stack; call _gfortran_runtime_error3: .loc 1 235 9 add.u64 %r228,%r148,%r148; add.u64 %r229,%r228,%r148; shl.b64 %r230,%r229,3; add.u64 %r231,%r205,%r230; ld.u64 %r31,[%r231+56]; ld.u64 %r33,[%r231+48]; .loc 1 239 9 ld.u64 %r151,[%r231+40]; .loc 1 240 12 add.u64 %r249,%r208,%r230; ld.u64 %r34,[%r249+40]; ld.u64 %r36,[%r208+16]; .loc 1 242 9 ld.u64 %r160,[%r208]; .loc 1 246 22 cvt.u32.u64 %r39,%r36; .loc 1 246 53 add.u32 %r251,%r39,-4; and.b32 %r252,%r251,-5; set.u32.eq.u32 %r254,%r252,0; neg.s32 %r255,%r254; .loc 1 246 22 add.u32 %r256,%r39,-1; set.u32.le.u32 %r258,%r256,1; neg.s32 %r259,%r258; .loc 1 246 6 cvt.u16.u32 %r261,%r255; cvt.u16.u32 %r262,%r259; or.b16 %r260,%r261,%r262; cvt.u32.u16 %r263,%r260; cvt.u16.u8 %r264,%r263; setp.eq.u16 %r265,%r264,0; @ %r265 bra $L54; $L58: .loc 1 255 3 setp.ne.u64 %r266,%r148,0; @ %r266 bra $L55; $L62: .loc 1 264 3 setp.gt.s64 %r267,%r147,%r148; @ %r267 bra $L56; bra $L113; $L54: .loc 1 248 7 setp.eq.u32 %r269,%r39,16; @ %r269 bra $L58; 
.loc 1 253 5 cvta.const.u64 %r271,$LC4; mov.u64 %r270,27271; call _gfortrani_internal_error5: add.u64 %r146,%r205,40; add.u64 %r167,%r208,40; add.u64 %r273,%r25,%r25; add.u64 %r274,%r273,%r25; shl.b64 %r275,%r274,3; add.u64 %r276,%r205,16; add.u64 %r101,%r275,%r276; .loc 1 255 3 mov.u64 %r200,0; add.u64 %r418,%frame,240; add.u64 %r419,%frame,120; add.u64 %r416,%frame,360; .loc 1 262 12 mov.u64 %r431,%r200; $L61: .loc 1 257 18 add.u64 %r278,%r418,%r200; ld.u64 %r279,[%r146]; st.u64 [%r278],%r279; .loc 1 258 18 add.u64 %r281,%r419,%r200; .loc 1 258 20 ld.u64 %r283,[%r167]; mul.lo.u64 %r282,%r283,%r36; .loc 1 258 18 st.u64 [%r281],%r282; .loc 1 259 19 ld.u64 %r285,[%r146+16]; add.u64 %r284,%r285,1; ld.u64 %r286,[%r146+8]; sub.u64 %r53,%r284,%r286; .loc 1 261 10 setp.lt.s64 %r287,%r53,0; @ %r287 bra $L59; .loc 1 259 17 add.u64 %r289,%r416,%r200; st.u64 [%r289],%r53; bra $L60; $L59: .loc 1 262 12 add.u64 %r291,%r416,%r200; st.u64 [%r291],%r431; $L60: .loc 1 255 3 add.u64 %r146,%r146,24; add.u64 %r167,%r167,24; add.u64 %r200,%r200,8; setp.ne.u64 %r293,%r101,%r146; @ %r293 bra $L61; bra $L62; $L113: .loc 1 274 6 ld.u64 %r294,[%r204]; setp.eq.u64 %r295,%r294,0; @ ! %r295 bra $L64; bra $L63; $L56: add.u64 %r297,%r25,%r25; add.u64 %r298,%r297,%r25; shl.b64 %r299,%r298,3; add.u64 %r143,%r299,40; add.u64 %r156,%r205,%r143; add.u64 %r141,%r208,%r143; shl.b64 %r300,%r25,3; add.u64 %r137,%r300,-8; cvt.u32.u32 %r302,%r22; cvt.s64.s8 %r301,%r302; shl.b64 %r303,%r301,3; add.u64 %r84,%r303,-8; add.u64 %r418,%frame,240; add.u64 %r419,%frame,120; add.u64 %r416,%frame,360; .loc 1 271 12 mov.u64 %r430,0; $L67: .loc 1 266 18 add.u64 %r305,%r418,%r137; ld.u64 %r306,[%r156]; st.u64 [%r305],%r306; .loc 1 267 18 add.u64 %r308,%r419,%r137; .loc 1 267 20 ld.u64 %r310,[%r141]; mul.lo.u64 %r309,%r310,%r36; .loc 1 267 18 st.u64 [%r308],%r309; .loc 1 268 19 ld.u64 %r312,[%r156+16]; add.u64 %r311,%r312,1; ld.u64 %r313,[%r156+8]; sub.u64 %r62,%r311,%r313; .loc 1 270 10 setp.lt.s64 %r314,%r62,0; @ %r314 bra $L65; .loc 1 268 17 add.u64 %r316,%r416,%r137; st.u64 [%r316],%r62; bra $L66; $L65: .loc 1 271 12 add.u64 %r318,%r416,%r137; st.u64 [%r318],%r430; $L66: .loc 1 264 3 add.u64 %r156,%r156,24; add.u64 %r141,%r141,24; add.u64 %r137,%r137,8; setp.ne.u64 %r320,%r84,%r137; @ %r320 bra $L67; bra $L114; $L63: .loc 1 278 7 setp.le.s64 %r321,%r147,0; @ %r321 bra $L69; add.u64 %r416,%frame,360; $L90: add.u64 %r179,%r204,40; mov.u64 %r176,%r416; add.u64 %r322,%r204,16; cvt.u32.u32 %r324,%r22; cvt.s64.s8 %r323,%r324; add.u64 %r326,%r323,%r323; add.u64 %r327,%r326,%r323; shl.b64 %r328,%r327,3; add.u64 %r169,%r322,%r328; .loc 1 281 10 mov.u64 %r140,1; .loc 1 285 4 mov.u64 %r351,0; bra $L70; $L69: .loc 1 289 24 mov.u64 %r329,0; st.u64 [%r204+8],%r329; .loc 1 290 28 cvt.u16.u32 %r332,%r22; add.u16 %r331,%r332,-1; cvt.u32.u16 %r333,%r331; st.u8 [%r204+28],%r333; .loc 1 292 20 add.u32 %r334,%r22,-2; cvt.s64.s32 %r72,%r334; add.u64 %r336,%r72,%r72; add.u64 %r337,%r336,%r72; shl.b64 %r338,%r337,3; add.u64 %r339,%r204,%r338; .loc 1 292 67 shl.b64 %r341,%r72,3; add.u64 %r342,%frame,%r341; .loc 1 292 59 ld.u64 %r344,[%r339+40]; ld.u64 %r345,[%r342+360]; mul.lo.u64 %r154,%r344,%r345; .loc 1 294 29 mov.u64 %r3471547; call (%value_in),_gfortrani_xmallocarray8,[%value_in]; } .loc 1 294 27 st.u64 [%r204],%r348; .loc 1 295 10 setp.eq.u64 %r350,%r154,0; @ ! 
%r350 bra $L74; bra $L71; $L73: .loc 1 283 48 mul.lo.u64 %r140,%r68,%r140; $L70: .loc 1 285 4 st.u64 [%r179+8],%r351; ld.u64 %r68,[%r176]; add.u64 %r352,%r68,-1; st.u64 [%r179+16],%r352; st.u64 [%r179],%r140; .loc 1 278 7 add.u64 %r179,%r179,24; add.u64 %r176,%r176,8; setp.ne.u64 %r353,%r169,%r179; @ %r353 bra $L73; bra $L69; $L71: .loc 1 298 4 st.u64 [%r204+48],%r154; mov.u64 %r355,-1; st.u64 [%r204+56],%r355; mov.u64 %r356,1; st.u64 [%r204+40],%r356; .loc 1 299 4 bra $L52; $L64: .loc 1 304 19 ld.s8 %r79,[%r204+28]; .loc 1 304 10 setp.eq.u64 %r357,%r79,%r147; @ %r357 bra $L76; .loc 1 305 2 st.u64 [%stack+8],%r147; st.u64 [%stack],%r79;_gfortran_runtime_error76: .loc 1 310 11 cvta.global.u64 %r360,_gfortrani_compile_options; .loc 1 310 10 ld.u32 %r361,[%r360+36]; setp.eq.u32 %r362,%r361,0; @ %r362 bra $L74; .loc 1 311 2 add.u64 %r416,%frame,360; cvta.const.u64 %r366,$LC2r416366; call _gfortrani_bounds_ifunction_return$L74: .loc 1 315 3 setp.gt.s64 %r368,%r147,0; @ %r368 bra $L77; $L80: .loc 1 235 9 add.u64 %r369,%r31,1; .loc 1 235 7 sub.u64 %r149,%r369,%r33; max.s64 %r65,%r149,0; .loc 1 240 12 mul.lo.u64 %r37,%r34,%r36; .loc 1 323 8 ld.u64 %r164,[%r204]; .loc 1 326 8 ld.u64 %r163,[%r205]; .loc 1 351 35 shl.b64 %r100,%r151,2; .loc 1 336 22 add.u64 %r64,%r65,-1; .loc 1 336 15 mul.lo.u64 %r89,%r64,%r100; .loc 1 337 29 mul.lo.u64 %r177,%r37,%r64; .loc 1 338 34 neg.s64 %r95,%r100; .loc 1 363 22 ld.u64 %r102,[%frame+240]; .loc 1 363 12 shl.b64 %r104,%r102,2; .loc 1 364 23 ld.u64 %r105,[%frame+120]; .loc 1 365 22 ld.u64 %r107,[%frame]; .loc 1 365 12 shl.b64 %r109,%r107,3; .loc 1 362 12 ld.u64 %r203,[%frame+480]; setp.ne.u32 %r417,%r209,0; add.u64 %r415,%frame,488; .loc 1 338 4 setp.le.s64 %r423,%r149,0; .loc 1 382 23 add.u64 %r424,%frame,240; .loc 1 367 32 add.u64 %r425,%frame,360; .loc 1 371 20 add.u64 %r426,%frame,120; bra $L78; $L77: add.u64 %r198,%frame,480; add.u64 %r197,%r204,40; mov.u64 %r195,%frame; cvt.u32.u32 %r371,%r22; cvt.s64.s8 %r370,%r371; add.u64 %r181,%r370,-1; .loc 1 315 3 mov.u64 %r165,0; add.u64 %r416,%frame,360; .loc 1 317 16 mov.u64 %r372,%r165; $L79: st.u64 [%r198],%r372; .loc 1 318 18 ld.u64 %r373,[%r197]; st.u64 [%r195],%r373; .loc 1 319 17 shl.b64 %r375,%r165,3; add.u64 %r376,%r416,%r375; .loc 1 319 10 ld.u64 %r377,[%r376]; setp.le.s64 %r378,%r377,0; @ %r378 bra $L52; .loc 1 315 26 add.u64 %r165,%r165,1; .loc 1 315 3 add.u64 %r198,%r198,8; add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; setp.ne.u64 %r379,%r165,%r181; @ %r379 bra $L79; bra $L80; $L78: .loc 1 334 10 @ %r417 bra $L81; .loc 1 351 4 setp.gt.s64 %r381,%r149,0; @ %r381 bra $L91; .loc 1 333 14 mov.u64 %r158,0; bra $L83; $L81: .loc 1 336 8 add.u64 %r29,%r163,%r89; .loc 1 337 9 add.u64 %r159,%r160,%r177; .loc 1 338 4 @ %r423 bra $L92; .loc 1 338 11 mov.u64 %r158,%r65; $L85: .loc 1 340 11 ld.s8 %r384,[%r159]; cvt.u16.u32 %r383,%r384; setp.eq.u16 %r385,%r383,0; @ %r385 bra $L84; .loc 1 340 18 ld.f32 %r386,[%r29]; setp.eq.f32 %r387,%r386,%r206; @ %r387 bra $L83; $L84: .loc 1 338 26 add.u64 %r158,%r158,-1; .loc 1 338 34 add.u64 %r29,%r29,%r95; .loc 1 338 53 sub.u64 %r159,%r159,%r37; .loc 1 338 4 setp.ne.u64 %r388,%r158,0; @ %r388 bra $L85; bra $L83; $L91: .loc 1 351 4 mov.u64 %r157,%r160; mov.u64 %r111,%r163; .loc 1 351 11 mov.u64 %r158,1; $L82: .loc 1 353 11 ld.s8 %r390,[%r157]; cvt.u16.u32 %r389,%r390; setp.eq.u16 %r391,%r389,0; @ %r391 bra $L86; .loc 1 353 18 ld.f32 %r392,[%r111]; setp.eq.f32 %r393,%r392,%r206; @ %r393 bra $L83; $L86: .loc 1 351 27 add.u64 %r158,%r158,1; .loc 1 351 35 add.u64 %r111,%r111,%r100; .loc 
1 351 54 add.u64 %r157,%r157,%r37; .loc 1 351 4 setp.ge.s64 %r394,%r65,%r158; @ %r394 bra $L82; .loc 1 333 14 mov.u64 %r158,0; bra $L83; $L92: mov.u64 %r158,0; $L83: .loc 1 360 13 st.u64 [%r164],%r158; .loc 1 362 15 add.u64 %r203,%r203,1; st.u64 [%frame+480],%r203; .loc 1 363 12 add.u64 %r163,%r163,%r104; .loc 1 364 13 add.u64 %r160,%r160,%r105; .loc 1 365 12 add.u64 %r164,%r164,%r109; .loc 1 367 32 ld.u64 %r130,[%frame+360]; .loc 1 367 13 setp.ne.u64 %r395,%r203,%r130; @ %r395 bra $L78; mov.u64 %r126,%r415; mov.u64 %r127,%r107; mov.u64 %r202,%r105; mov.u64 %r124,%r102; mov.u64 %r120,8; .loc 1 366 9 mov.u64 %r162,0; .loc 1 369 13 mov.u64 %r397,%r162; $L89: st.u64 [%r126+-8],%r397; .loc 1 370 23 mul.lo.u64 %r113,%r130,%r124; .loc 1 371 24 mul.lo.u64 %r398,%r130,%r202; .loc 1 371 10 sub.u64 %r160,%r160,%r398; .loc 1 372 23 mul.lo.u64 %r119,%r130,%r127; .loc 1 373 5 add.u64 %r162,%r162,1; .loc 1 374 7 setp.le.s64 %r399,%r147,%r162; @ %r399 bra $L52; .loc 1 381 16 ld.u64 %r400,[%r126]; add.u64 %r123,%r400,1; st.u64 [%r126],%r123; .loc 1 382 23 add.u64 %r402,%r424,%r120; ld.u64 %r124,[%r402]; .loc 1 382 13 sub.u64 %r403,%r124,%r113; shl.b64 %r404,%r403,2; add.u64 %r163,%r163,%r404; .loc 1 383 23 add.u64 %r405,%frame,%r120; ld.u64 %r127,[%r405]; .loc 1 383 13 sub.u64 %r406,%r127,%r119; shl.b64 %r407,%r406,3; add.u64 %r164,%r164,%r407; .loc 1 367 32 add.u64 %r409,%r425,%r120; ld.u64 %r130,[%r409]; .loc 1 367 13 add.u64 %r126,%r126,8; setp.eq.u64 %r410,%r123,%r130; @ %r410 bra $L88; .loc 1 362 12 ld.u64 %r203,[%frame+480]; bra $L78; $L88: .loc 1 371 20 add.u64 %r412,%r426,%r120; ld.u64 %r202,[%r412]; add.u64 %r120,%r120,8; bra $L89; $L114: .loc 1 274 6 ld.u64 %r413,[%r204]; setp.eq.u64 %r414,%r413,0; @ ! %r414 bra $L64; bra $L90; $L52: .loc 1 387_gfortran_sfindloc1_r4 .visible .func _gfortran_sfindloc1_r68u64 %r126; .reg .u64u32pred %r190; .reg .pred %r192; .reg .u64pred %r213; .reg .predu32predpredpred %r257; .reg .u64 %r260; .reg .u64 %r261; .reg .pred %r263; .reg .u64 %r264; .reg .u32 %r265; .reg .u64 %r266; .reg .u6464predmov.u64 %r149,%ar0; mov.u64 %r150,%ar1; mov.f32 %r151,%ar2; mov.u64 %r152,%ar3; mov.u64 %r153,%ar4; mov.u32 %r154,%ar5; .loc 1 410 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L116; .loc 1 410 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L117; $L116:call _gfortran_findloc1_); } .loc 1 413 7 bra $L115; $L117: .loc 1 416 10 ld.s8 %r23,[%r150+28]; .loc 1 416 38 add.u32 %r163,%r23,-1; .loc 1 416 8 cvt.s64.s32 %r90,%r163; .loc 1 417 10 ld.u64 %r26,[%r152]; .loc 1 417 7 add.u64 %r91,%r26,-1; .loc 1 419 7 shr.u64 %r165,%r91,63; set.u32.lt.s64 %r167,%r90,%r91; neg.s32 %r168,%r167; cvt.u16.u64 %r171,%r165; cvt.u16.u32 %r172,%r168; or.b16 %r170,%r171,%r172; .loc 1 419 6 cvt.u32.u16 %r173,%r170; cvt.u16.u8 %r174,%r173; setp.ne.u16 %r175,%r174,0; @ %r175 bra $L119; .loc 1 430 3 setp.ne.u64 %r176,%r91,0; @ %r176 bra $L120; $L126: .loc 1 438 3 setp.gt.s64 %r177,%r90,%r91; @ %r177 bra $L121; bra $L163; $L119: .loc 1 421 7 cvt.u32.u32 %r180,%r23; cvt.s64.s8 %r179,%r180; st.u64 [%stack+8],%r179_gfortran_runtime_error120: add.u64 %r136,%r150,48; add.u64 %r138,%frame,120; add.u64 %r139,%r150,56; add.u64 %r182,%r150,24; add.u64 %r184,%r26,%r26; add.u64 %r185,%r184,%r26; shl.b64 %r186,%r185,3; add.u64 %r145,%r182,%r186; .loc 1 435 12 mov.u64 %r302,0; $L125: .loc 1 432 19 ld.u64 %r188,[%r139]; add.u64 %r187,%r188,1; ld.u64 %r189,[%r136]; sub.u64 %r35,%r187,%r189; .loc 1 434 10 setp.le.s64 %r190,%r35,0; @ %r190 bra $L123; .loc 1 432 17 st.u64 [%r138],%r35; bra $L124; $L123: .loc 1 
435 12 st.u64 [%r138],%r302; $L124: .loc 1 430 3 add.u64 %r136,%r136,24; add.u64 %r138,%r138,8; add.u64 %r139,%r139,24; setp.ne.u64 %r192,%r136,%r145; @ %r192 bra $L125; bra $L126; $L163: .loc 1 448 6 ld.u64 %r193,[%r149]; setp.eq.u64 %r194,%r193,0; @ ! %r194 bra $L128; bra $L127; $L121: add.u64 %r196,%r26,%r26; add.u64 %r197,%r196,%r26; shl.b64 %r198,%r197,3; add.u64 %r199,%r198,48; add.u64 %r125,%r150,%r199; add.u64 %r294,%frame,120; shl.b64 %r201,%r26,3; add.u64 %r202,%r201,-8; add.u64 %r45,%r294,%r202; add.u64 %r203,%r150,48; cvt.u32.u32 %r205,%r23; cvt.s64.s8 %r204,%r205; add.u64 %r207,%r204,%r204; add.u64 %r208,%r207,%r204; shl.b64 %r209,%r208,3; add.u64 %r132,%r203,%r209; .loc 1 444 12 mov.u64 %r301,0; $L131: .loc 1 441 2 ld.u64 %r211,[%r125+8]; add.u64 %r210,%r211,1; ld.u64 %r212,[%r125]; sub.u64 %r40,%r210,%r212; .loc 1 443 10 setp.le.s64 %r213,%r40,0; @ %r213 bra $L129; .loc 1 440 17 st.u64 [%r45],%r40; bra $L130; $L129: .loc 1 444 12 st.u64 [%r45],%r301; $L130: .loc 1 438 3 add.u64 %r125,%r125,24; add.u64 %r45,%r45,8; setp.ne.u64 %r215,%r125,%r132; @ %r215 bra $L131; bra $L164; $L127: .loc 1 452 7 setp.le.s64 %r216,%r90,0; @ %r216 bra $L133; add.u64 %r294,%frame,120; $L149: add.u64 %r78,%r149,40; mov.u64 %r61,%r294; cvt.u32.u32 %r218,%r23; cvt.s64.s8 %r217,%r218; add.u64 %r220,%r217,%r217; add.u64 %r221,%r220,%r217; shl.b64 %r222,%r221,3; add.u64 %r223,%r149,16; add.u64 %r126,%r222,%r223; .loc 1 455 10 mov.u64 %r81,1; .loc 1 459 4 mov.u64 %r246,0; bra $L134; $L133: .loc 1 462 24 mov.u64 %r224,0; st.u64 [%r149+8],%r224; .loc 1 463 28 cvt.u16.u32 %r227,%r23; add.u16 %r226,%r227,-1; cvt.u32.u16 %r228,%r226; st.u8 [%r149+28],%r228; .loc 1 465 20 add.u32 %r229,%r23,-2; cvt.s64.s32 %r51,%r229; add.u64 %r231,%r51,%r51; add.u64 %r232,%r231,%r51; shl.b64 %r233,%r232,3; add.u64 %r234,%r149,%r233; .loc 1 465 67 shl.b64 %r236,%r51,3; add.u64 %r237,%frame,%r236; .loc 1 465 59 ld.u64 %r239,[%r234+40]; ld.u64 %r240,[%r237+120]; mul.lo.u64 %r94,%r239,%r240; .loc 1 467 29 mov.u64 %r24242; call (%value_in),_gfortrani_xmallocarray3,[%value_in]; } .loc 1 467 27 st.u64 [%r149],%r243; .loc 1 468 10 setp.eq.u64 %r245,%r94,0; @ ! 
%r245 bra $L138; bra $L135; $L137: .loc 1 457 48 mul.lo.u64 %r81,%r47,%r81; $L134: .loc 1 459 4 st.u64 [%r78+8],%r246; ld.u64 %r47,[%r61]; add.u64 %r247,%r47,-1; st.u64 [%r78+16],%r247; st.u64 [%r78],%r81; .loc 1 452 7 add.u64 %r78,%r78,24; add.u64 %r61,%r61,8; setp.ne.u64 %r248,%r78,%r126; @ %r248 bra $L137; bra $L133; $L135: .loc 1 471 4 st.u64 [%r149+48],%r94; mov.u64 %r250,-1; st.u64 [%r149+56],%r250; mov.u64 %r251,1; st.u64 [%r149+40],%r251; .loc 1 472 4 bra $L115; $L128: .loc 1 477 19 ld.s8 %r58,[%r149+28]; .loc 1 477 10 setp.eq.u64 %r252,%r58,%r90; @ %r252 bra $L139; .loc 1 478 2 st.u64 [%stack+8],%r90; st.u64 [%stack],%r58;_gfortran_runtime_error139: .loc 1 483 11 cvta.global.u64 %r255,_gfortrani_compile_options; .loc 1 483 10 ld.u32 %r256,[%r255+36]; setp.eq.u32 %r257,%r256,0; @ %r257 bra $L138; .loc 1 484 2 add.u64 %r294,%frame,120; cvta.const.u64 %r261,$LC2261; call _gfortrani_bounds_ifunction_return$L138: .loc 1 488 3 setp.gt.s64 %r263,%r90,0; @ %r263 bra $L140; $L143: .loc 1 495 8 ld.u64 %r97,[%r149]; ld.u64 %r42,[%frame+240]; .loc 1 503 22 ld.u64 %r64,[%frame]; .loc 1 503 12 shl.b64 %r66,%r64,3; setp.le.s64 %r295,%r90,1; cvt.u32.u32 %r297,%r23; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; add.u64 %r300,%frame,120; bra $L141; $L140: add.u64 %r114,%frame,240; add.u64 %r113,%r149,40; mov.u64 %r108,%frame; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r79,%r264,-1; .loc 1 488 3 mov.u64 %r98,0; add.u64 %r294,%frame,120; .loc 1 490 16 mov.u64 %r266,%r98; $L142: st.u64 [%r114],%r266; .loc 1 491 18 ld.u64 %r267,[%r113]; st.u64 [%r108],%r267; .loc 1 492 17 shl.b64 %r269,%r98,3; add.u64 %r270,%r294,%r269; .loc 1 492 10 ld.u64 %r271,[%r270]; setp.le.s64 %r272,%r271,0; @ %r272 bra $L115; .loc 1 488 26 add.u64 %r98,%r98,1; .loc 1 488 3 add.u64 %r114,%r114,8; add.u64 %r113,%r113,24; add.u64 %r108,%r108,8; setp.ne.u64 %r273,%r79,%r98; @ %r273 bra $L142; bra $L143; $L141: .loc 1 500 13 mov.u64 %r274,0; st.u64 [%r97],%r274; .loc 1 502 15 add.u64 %r42,%r42,1; .loc 1 503 12 add.u64 %r97,%r97,%r66; .loc 1 505 32 ld.u64 %r88,[%frame+120]; .loc 1 505 13 setp.ne.u64 %r275,%r42,%r88; @ %r275 bra $L141; .loc 1 508 23 mul.lo.u64 %r276,%r64,%r42; .loc 1 508 9 shl.b64 %r277,%r276,3; sub.u64 %r30,%r97,%r277; .loc 1 510 7 @ ! %r295 bra $L165; bra $L115; $L148: .loc 1 507 13 st.u64 [%r102],%r274; .loc 1 508 23 mul.lo.u64 %r280,%r73,%r72; .loc 1 508 9 shl.b64 %r281,%r280,3; sub.u64 %r30,%r97,%r281; .loc 1 509 5 add.u64 %r96,%r96,1; .loc 1 510 7 add.u64 %r102,%r102,8; setp.ne.u64 %r282,%r96,%r299; @ %r282 bra $L147; bra $L115; $L165: add.u64 %r102,%frame,248; .loc 1 509 5 mov.u64 %r96,1; $L147: .loc 1 517 16 ld.u64 %r286,[%r102]; add.u64 %r72,%r286,1; st.u64 [%r102],%r72; shl.b64 %r120,%r96,3; .loc 1 518 23 add.u64 %r287,%frame,%r120; ld.u64 %r73,[%r287]; .loc 1 518 13 shl.b64 %r288,%r73,3; add.u64 %r97,%r30,%r288; .loc 1 505 32 add.u64 %r290,%r300,%r120; ld.u64 %r77,[%r290]; .loc 1 505 13 setp.eq.u64 %r291,%r72,%r77; @ %r291 bra $L148; .loc 1 507 13 mov.u64 %r42,0; bra $L141; $L164: .loc 1 448 6 ld.u64 %r292,[%r149]; setp.eq.u64 %r293,%r292,0; @ ! 
%r293 bra $L128; bra $L149; $L115: .loc 1 522findloc1_r8.o/_gfortran_findloc1_r8 .visible .func _gfortran_findloc1_r8fortran/generated/findloc1_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc1_r8 .visible .func _gfortran_mfindloc1_r8_gfortran_sfindloc1_r8 .visible .func _gfortran_sfindloc1_r8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_internal_error .extern .func _gfortrani_internal_8] = {70,73,78,68,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,914,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_findloc1_r8 .visible .func _gfortran_findloc1_r84898predpred %r238; .reg .u64pred %r260; .reg .pred %r262; .reg .pred %r263; .reg .u64 %r264; .reg .u32 %r265; .reg .u64u64 %r279; .reg .u64 %r280; .reg .u6464pred %r327; .reg .predpred %r331; .reg .pred %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64u64 %r352; .reg .u64 %r354; .reg .predpred %r358; .reg .pred %r359; .reg .u64predmov.u64 %r185,%ar0; mov.u64 %r186,%ar1; mov.f64 %r187,%ar2; mov.u64 %r188,%ar3; mov.u32 %r189,%ar4; .loc 1 54 10 ld.s8 %r22,[%r186+28]; .loc 1 54 38 add.u32 %r190,%r22,-1; .loc 1 54 8 cvt.s64.s32 %r118,%r190; .loc 1 55 10 ld.u64 %r25,[%r188]; .loc 1 55 7 add.u64 %r119,%r25,-1; .loc 1 57 7 shr.u64 %r192,%r119,63; set.u32.lt.s64 %r194,%r118,%r119; neg.s32 %r195,%r194; cvt.u16.u64 %r198,%r192; cvt.u16.u32 %r199,%r195; or.b16 %r197,%r198,%r199; .loc 1 57 6 cvt.u32.u16 %r200,%r197; cvt.u16.u8 %r201,%r200; setp.eq.u16 %r202,%r201,0; @ %r202 bra $L2; .loc 1 59 7 cvt.u32.u32 %r205,%r22; cvt.s64.s8 %r204,%r205; st.u64 [%stack+8],%r20403203_gfortran_runtime_errorr208,%r119,%r119; add.u64 %r209,%r208,%r119; shl.b64 %r210,%r209,3; add.u64 %r211,%r186,%r210; ld.u64 %r31,[%r211+56]; ld.u64 %r33,[%r211+48]; .loc 1 67 9 ld.u64 %r122,[%r211+40]; .loc 1 69 3 setp.ne.u64 %r225,%r119,0; @ %r225 bra $L3; $L9: .loc 1 77 3 setp.gt.s64 %r226,%r118,%r119; @ %r226 bra $L4; bra $L50; $L3: add.u64 %r177,%r186,40; add.u64 %r67,%frame,120; add.u64 %r70,%frame,240; add.u64 %r228,%r25,%r25; add.u64 %r229,%r228,%r25; shl.b64 %r230,%r229,3; add.u64 %r231,%r186,16; add.u64 %r182,%r230,%r231; .loc 1 75 12 mov.u64 %r371,0; $L8: .loc 1 71 18 ld.u64 %r232,[%r177]; st.u64 [%r67],%r232; .loc 1 72 19 ld.u64 %r234,[%r177+16]; add.u64 %r233,%r234,1; ld.u64 %r235,[%r177+8]; sub.u64 %r38,%r233,%r235; .loc 1 74 10 setp.lt.s64 %r236,%r38,0; @ %r236 bra $L6; .loc 1 72 17 st.u64 [%r70],%r38; bra $L7; $L6: .loc 1 75 12 st.u64 [%r70],%r371; $L7: .loc 1 69 3 add.u64 %r177,%r177,24; add.u64 %r67,%r67,8; add.u64 %r70,%r70,8; setp.ne.u64 %r238,%r177,%r182; @ %r238 bra $L8; bra $L9; $L50: .loc 1 86 6 ld.u64 %r239,[%r185]; setp.eq.u64 %r240,%r239,0; @ ! 
%r240 bra $L11; bra $L10; $L4: add.u64 %r242,%r25,%r25; add.u64 %r243,%r242,%r25; shl.b64 %r244,%r243,3; add.u64 %r245,%r244,40; add.u64 %r115,%r186,%r245; shl.b64 %r246,%r25,3; add.u64 %r105,%r246,-8; add.u64 %r361,%frame,120; add.u64 %r108,%r361,%r105; add.u64 %r360,%frame,240; add.u64 %r62,%r360,%r105; add.u64 %r249,%r186,40; cvt.u32.u32 %r251,%r22; cvt.s64.s8 %r250,%r251; add.u64 %r253,%r250,%r250; add.u64 %r254,%r253,%r250; shl.b64 %r255,%r254,3; add.u64 %r180,%r249,%r255; .loc 1 83 12 mov.u64 %r370,0; $L14: .loc 1 79 18 ld.u64 %r256,[%r115]; st.u64 [%r108],%r256; .loc 1 80 19 ld.u64 %r258,[%r115+16]; add.u64 %r257,%r258,1; ld.u64 %r259,[%r115+8]; sub.u64 %r45,%r257,%r259; .loc 1 82 10 setp.lt.s64 %r260,%r45,0; @ %r260 bra $L12; .loc 1 80 17 st.u64 [%r62],%r45; bra $L13; $L12: .loc 1 83 12 st.u64 [%r62],%r370; $L13: .loc 1 77 3 add.u64 %r115,%r115,24; add.u64 %r108,%r108,8; add.u64 %r62,%r62,8; setp.ne.u64 %r262,%r115,%r180; @ %r262 bra $L14; bra $L51; $L10: .loc 1 90 7 setp.le.s64 %r263,%r118,0; @ %r263 bra $L16; add.u64 %r360,%frame,240; $L36: add.u64 %r153,%r185,40; mov.u64 %r149,%r360; cvt.u32.u32 %r265,%r22; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r270,%r185,16; add.u64 %r116,%r269,%r270; .loc 1 93 10 mov.u64 %r112,1; .loc 1 97 4 mov.u64 %r293,0; bra $L17; $L16: .loc 1 101 24 mov.u64 %r271,0; st.u64 [%r185+8],%r271; .loc 1 102 28 cvt.u16.u32 %r274,%r22; add.u16 %r273,%r274,-1; cvt.u32.u16 %r275,%r273; st.u8 [%r185+28],%r275; .loc 1 104 20 add.u32 %r276,%r22,-2; cvt.s64.s32 %r54,%r276; add.u64 %r278,%r54,%r54; add.u64 %r279,%r278,%r54; shl.b64 %r280,%r279,3; add.u64 %r281,%r185,%r280; .loc 1 104 67 shl.b64 %r283,%r54,3; add.u64 %r284,%frame,%r283; .loc 1 104 59 ld.u64 %r286,[%r281+40]; ld.u64 %r287,[%r284+240]; mul.lo.u64 %r124,%r286,%r287; .loc 1 106 29 mov.u64 %r289189; call (%value_in),_gfortrani_xmallocarray290,[%value_in]; } .loc 1 106 27 st.u64 [%r185],%r290; .loc 1 107 10 setp.eq.u64 %r292,%r124,0; @ ! 
%r292 bra $L21; bra $L18; $L20: .loc 1 95 48 mul.lo.u64 %r112,%r50,%r112; $L17: .loc 1 97 4 st.u64 [%r153+8],%r293; ld.u64 %r50,[%r149]; add.u64 %r294,%r50,-1; st.u64 [%r153+16],%r294; st.u64 [%r153],%r112; .loc 1 90 7 add.u64 %r153,%r153,24; add.u64 %r149,%r149,8; setp.ne.u64 %r295,%r116,%r153; @ %r295 bra $L20; bra $L16; $L18: .loc 1 110 4 st.u64 [%r185+48],%r124; mov.u64 %r297,-1; st.u64 [%r185+56],%r297; mov.u64 %r298,1; st.u64 [%r185+40],%r298; .loc 1 111 4 bra $L1; $L11: .loc 1 116 19 ld.s8 %r60,[%r185+28]; .loc 1 116 10 setp.eq.u64 %r299,%r60,%r118; @ %r299 bra $L23; .loc 1 117 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 122 11 cvta.global.u64 %r302,_gfortrani_compile_options; .loc 1 122 10 ld.u32 %r303,[%r302+36]; setp.eq.u32 %r304,%r303,0; @ %r304 bra $L21; .loc 1 123 2 add.u64 %r360,%frame,240; cvta.const.u64 %r308,$LC2r360307308; call _gfortrani_bounds_ifunction_return$L21: .loc 1 127 3 setp.gt.s64 %r310,%r118,0; @ %r310 bra $L24; $L27: .loc 1 64 9 add.u64 %r311,%r31,1; .loc 1 64 7 sub.u64 %r120,%r311,%r33; max.s64 %r43,%r120,0; .loc 1 135 8 ld.u64 %r111,[%r185]; .loc 1 138 8 ld.u64 %r132,[%r186]; .loc 1 160 35 shl.b64 %r74,%r122,3; .loc 1 147 22 add.u64 %r312,%r43,-1; .loc 1 147 15 mul.lo.u64 %r68,%r312,%r74; .loc 1 148 34 neg.s64 %r71,%r74; ld.u64 %r82,[%frame+360]; .loc 1 172 22 ld.u64 %r75,[%frame+120]; .loc 1 172 12 shl.b64 %r77,%r75,3; .loc 1 173 22 ld.u64 %r78,[%frame]; .loc 1 173 12 shl.b64 %r80,%r78,3; setp.ne.u32 %r359,%r189,0; setp.le.s64 %r358,%r118,1; .loc 1 148 4 setp.le.s64 %r363,%r120,0; .loc 1 160 4 setp.gt.s64 %r364,%r120,0; cvt.u32.u32 %r365,%r22; cvt.s64.s8 %r366,%r365; add.u64 %r367,%r366,-1; add.u64 %r368,%frame,120; add.u64 %r369,%frame,240; bra $L25; $L24: add.u64 %r170,%frame,360; add.u64 %r169,%r185,40; mov.u64 %r167,%frame; cvt.u32.u32 %r314,%r22; cvt.s64.s8 %r313,%r314; add.u64 %r154,%r313,-1; .loc 1 127 3 mov.u64 %r133,0; add.u64 %r360,%frame,240; .loc 1 129 16 mov.u64 %r315,%r133; $L26: st.u64 [%r170],%r315; .loc 1 130 18 ld.u64 %r316,[%r169]; st.u64 [%r167],%r316; .loc 1 131 17 shl.b64 %r318,%r133,3; add.u64 %r319,%r360,%r318; .loc 1 131 10 ld.u64 %r320,[%r319]; setp.le.s64 %r321,%r320,0; @ %r321 bra $L1; .loc 1 127 26 add.u64 %r133,%r133,1; .loc 1 127 3 add.u64 %r170,%r170,8; add.u64 %r169,%r169,24; add.u64 %r167,%r167,8; setp.ne.u64 %r322,%r133,%r154; @ %r322 bra $L26; bra $L27; $L25: .loc 1 145 10 @ %r359 bra $L28; .loc 1 160 4 @ %r364 bra $L37; .loc 1 144 14 mov.u64 %r126,0; bra $L30; $L28: .loc 1 147 8 add.u64 %r127,%r132,%r68; .loc 1 148 4 @ %r363 bra $L38; .loc 1 148 11 mov.u64 %r126,%r43; $L31: .loc 1 150 11 ld.f64 %r326,[%r127]; setp.eq.f64 %r327,%r326,%r187; @ %r327 bra $L30; .loc 1 148 26 add.u64 %r126,%r126,-1; .loc 1 148 34 add.u64 %r127,%r127,%r71; .loc 1 148 4 setp.ne.u64 %r328,%r126,0; @ %r328 bra $L31; bra $L30; $L37: .loc 1 160 4 mov.u64 %r125,%r132; .loc 1 160 11 mov.u64 %r126,1; $L29: .loc 1 162 11 ld.f64 %r329,[%r125]; setp.eq.f64 %r330,%r329,%r187; @ %r330 bra $L30; .loc 1 160 27 add.u64 %r126,%r126,1; .loc 1 160 35 add.u64 %r125,%r125,%r74; .loc 1 160 4 setp.ge.s64 %r331,%r43,%r126; @ %r331 bra $L29; .loc 1 144 14 mov.u64 %r126,0; bra $L30; $L38: mov.u64 %r126,0; $L30: .loc 1 169 13 st.u64 [%r111],%r126; .loc 1 171 15 add.u64 %r82,%r82,1; .loc 1 172 12 add.u64 %r132,%r132,%r77; .loc 1 173 12 add.u64 %r111,%r111,%r80; .loc 1 175 32 ld.u64 %r141,[%frame+240]; .loc 1 175 13 setp.ne.u64 %r332,%r82,%r141; @ %r332 bra $L25; .loc 1 178 23 mul.lo.u64 %r333,%r75,%r82; .loc 1 178 9 shl.b64 
%r334,%r333,3; sub.u64 %r129,%r132,%r334; .loc 1 179 23 mul.lo.u64 %r335,%r78,%r82; .loc 1 179 9 shl.b64 %r336,%r335,3; sub.u64 %r130,%r111,%r336; .loc 1 181 7 @ %r358 bra $L1; add.u64 %r66,%frame,368; mov.u64 %r159,8; .loc 1 180 5 mov.u64 %r131,1; .loc 1 177 13 mov.u64 %r362,0; bra $L34; $L35: st.u64 [%r66],%r362; .loc 1 178 23 mul.lo.u64 %r342,%r95,%r94; .loc 1 178 9 shl.b64 %r343,%r342,3; sub.u64 %r129,%r132,%r343; .loc 1 179 23 mul.lo.u64 %r344,%r98,%r94; .loc 1 179 9 shl.b64 %r345,%r344,3; sub.u64 %r130,%r111,%r345; .loc 1 180 5 add.u64 %r131,%r131,1; .loc 1 181 7 add.u64 %r66,%r66,8; add.u64 %r159,%r159,8; setp.eq.u64 %r346,%r131,%r367; @ %r346 bra $L1; $L34: .loc 1 188 16 ld.u64 %r347,[%r66]; add.u64 %r94,%r347,1; st.u64 [%r66],%r94; .loc 1 189 23 add.u64 %r349,%r368,%r159; ld.u64 %r95,[%r349]; .loc 1 189 13 shl.b64 %r350,%r95,3; add.u64 %r132,%r129,%r350; .loc 1 190 23 add.u64 %r351,%frame,%r159; ld.u64 %r98,[%r351]; .loc 1 190 13 shl.b64 %r352,%r98,3; add.u64 %r111,%r130,%r352; .loc 1 175 32 add.u64 %r354,%r369,%r159; ld.u64 %r102,[%r354]; .loc 1 175 13 setp.eq.u64 %r355,%r94,%r102; @ %r355 bra $L35; .loc 1 177 13 mov.u64 %r82,0; bra $L25; $L51: .loc 1 86 6 ld.u64 %r356,[%r185]; setp.eq.u64 %r357,%r356,0; @ ! %r357 bra $L11; bra $L36; $L1: .loc 1 194_gfortran_mfindloc1_r8 .visible .func _gfortran_mfindloc1_r8608u32 %r209; .reg .u32 %r210; .reg .u64 %r212; .reg .u32 %r214; .reg .u32 %r215; .reg .u16 %r217; .reg .u16 %r218; .reg .u16 %r219; .reg .u32 %r220; .reg .u16 %r221; .reg .predu64 %r230; .reg .u64 %r231; .reg .u64 %r249; .reg .u32u32u64 %r281; .reg .u64 %r282; .reg .u64 %r283; .reg .u64predu64u16 %r331; .reg .u16predpredu64 %r375; .reg .u64pred %r379; .reg .pred %r381; .reg .u16 %r383; .reg .u32 %r384; .reg .pred %r385; .reg .f64u32 %r390; .reg .pred %r391; .reg .f64 %r392; .reg .pred %r393; .reg .pred %r394; .reg .predu64u64 %r430; .reg .u64 %r431; mov.u64 %r204,%ar0; mov.u64 %r205,%ar1; mov.f64 %r206,%ar2; mov.u64 %r207,%ar3; mov.u64 %r208,%ar4; mov.u32 %r209,%ar5; .loc 1 225 10 ld.s8 %r22,[%r205+28]; .loc 1 225 38 add.u32 %r210,%r22,-1; .loc 1 225 8 cvt.s64.s32 %r147,%r210; .loc 1 226 10 ld.u64 %r25,[%r207]; .loc 1 226 7 add.u64 %r148,%r25,-1; .loc 1 228 7 shr.u64 %r212,%r148,63; set.u32.lt.s64 %r214,%r147,%r148; neg.s32 %r215,%r214; cvt.u16.u64 %r218,%r212; cvt.u16.u32 %r219,%r215; or.b16 %r217,%r218,%r219; .loc 1 228 6 cvt.u32.u16 %r220,%r217; cvt.u16.u8 %r221,%r220; setp.eq.u16 %r222,%r221,0; @ %r222 bra $L53; .loc 1 230 7 cvt.u32.u32 %r225,%r22; cvt.s64.s8 %r224,%r225; st.u64 [%stack+8],%r224232stack; call _gfortran_runtime_error3: .loc 1 235 9 add.u64 %r228,%r148,%r148; add.u64 %r229,%r228,%r148; shl.b64 %r230,%r229,3; add.u64 %r231,%r205,%r230; ld.u64 %r31,[%r231+56]; ld.u64 %r33,[%r231+48]; .loc 1 239 9 ld.u64 %r151,[%r231+40]; .loc 1 240 12 add.u64 %r249,%r208,%r230; ld.u64 %r34,[%r249+40]; ld.u64 %r36,[%r208+16]; .loc 1 242 9 ld.u64 %r160,[%r208]; .loc 1 246 22 cvt.u32.u64 %r39,%r36; .loc 1 246 53 add.u32 %r251,%r39,-4; and.b32 %r252,%r251,-5; set.u32.eq.u32 %r254,%r252,0; neg.s32 %r255,%r254; .loc 1 246 22 add.u32 %r256,%r39,-1; set.u32.le.u32 %r258,%r256,1; neg.s32 %r259,%r258; .loc 1 246 6 cvt.u16.u32 %r261,%r255; cvt.u16.u32 %r262,%r259; or.b16 %r260,%r261,%r262; cvt.u32.u16 %r263,%r260; cvt.u16.u8 %r264,%r263; setp.eq.u16 %r265,%r264,0; @ %r265 bra $L54; $L58: .loc 1 255 3 setp.ne.u64 %r266,%r148,0; @ %r266 bra $L55; $L62: .loc 1 264 3 setp.gt.s64 %r267,%r147,%r148; @ %r267 bra $L56; bra $L113; $L54: .loc 1 248 7 setp.eq.u32 %r269,%r39,16; @ %r269 bra $L58; 
.loc 1 253 5 cvta.const.u64 %r271,$LC4; mov.u64 %r270,27271; call _gfortrani_internal_error5: add.u64 %r146,%r205,40; add.u64 %r167,%r208,40; add.u64 %r273,%r25,%r25; add.u64 %r274,%r273,%r25; shl.b64 %r275,%r274,3; add.u64 %r276,%r205,16; add.u64 %r101,%r275,%r276; .loc 1 255 3 mov.u64 %r200,0; add.u64 %r418,%frame,240; add.u64 %r419,%frame,120; add.u64 %r416,%frame,360; .loc 1 262 12 mov.u64 %r431,%r200; $L61: .loc 1 257 18 add.u64 %r278,%r418,%r200; ld.u64 %r279,[%r146]; st.u64 [%r278],%r279; .loc 1 258 18 add.u64 %r281,%r419,%r200; .loc 1 258 20 ld.u64 %r283,[%r167]; mul.lo.u64 %r282,%r283,%r36; .loc 1 258 18 st.u64 [%r281],%r282; .loc 1 259 19 ld.u64 %r285,[%r146+16]; add.u64 %r284,%r285,1; ld.u64 %r286,[%r146+8]; sub.u64 %r53,%r284,%r286; .loc 1 261 10 setp.lt.s64 %r287,%r53,0; @ %r287 bra $L59; .loc 1 259 17 add.u64 %r289,%r416,%r200; st.u64 [%r289],%r53; bra $L60; $L59: .loc 1 262 12 add.u64 %r291,%r416,%r200; st.u64 [%r291],%r431; $L60: .loc 1 255 3 add.u64 %r146,%r146,24; add.u64 %r167,%r167,24; add.u64 %r200,%r200,8; setp.ne.u64 %r293,%r101,%r146; @ %r293 bra $L61; bra $L62; $L113: .loc 1 274 6 ld.u64 %r294,[%r204]; setp.eq.u64 %r295,%r294,0; @ ! %r295 bra $L64; bra $L63; $L56: add.u64 %r297,%r25,%r25; add.u64 %r298,%r297,%r25; shl.b64 %r299,%r298,3; add.u64 %r143,%r299,40; add.u64 %r156,%r205,%r143; add.u64 %r141,%r208,%r143; shl.b64 %r300,%r25,3; add.u64 %r137,%r300,-8; cvt.u32.u32 %r302,%r22; cvt.s64.s8 %r301,%r302; shl.b64 %r303,%r301,3; add.u64 %r84,%r303,-8; add.u64 %r418,%frame,240; add.u64 %r419,%frame,120; add.u64 %r416,%frame,360; .loc 1 271 12 mov.u64 %r430,0; $L67: .loc 1 266 18 add.u64 %r305,%r418,%r137; ld.u64 %r306,[%r156]; st.u64 [%r305],%r306; .loc 1 267 18 add.u64 %r308,%r419,%r137; .loc 1 267 20 ld.u64 %r310,[%r141]; mul.lo.u64 %r309,%r310,%r36; .loc 1 267 18 st.u64 [%r308],%r309; .loc 1 268 19 ld.u64 %r312,[%r156+16]; add.u64 %r311,%r312,1; ld.u64 %r313,[%r156+8]; sub.u64 %r62,%r311,%r313; .loc 1 270 10 setp.lt.s64 %r314,%r62,0; @ %r314 bra $L65; .loc 1 268 17 add.u64 %r316,%r416,%r137; st.u64 [%r316],%r62; bra $L66; $L65: .loc 1 271 12 add.u64 %r318,%r416,%r137; st.u64 [%r318],%r430; $L66: .loc 1 264 3 add.u64 %r156,%r156,24; add.u64 %r141,%r141,24; add.u64 %r137,%r137,8; setp.ne.u64 %r320,%r84,%r137; @ %r320 bra $L67; bra $L114; $L63: .loc 1 278 7 setp.le.s64 %r321,%r147,0; @ %r321 bra $L69; add.u64 %r416,%frame,360; $L90: add.u64 %r179,%r204,40; mov.u64 %r176,%r416; add.u64 %r322,%r204,16; cvt.u32.u32 %r324,%r22; cvt.s64.s8 %r323,%r324; add.u64 %r326,%r323,%r323; add.u64 %r327,%r326,%r323; shl.b64 %r328,%r327,3; add.u64 %r169,%r322,%r328; .loc 1 281 10 mov.u64 %r140,1; .loc 1 285 4 mov.u64 %r351,0; bra $L70; $L69: .loc 1 289 24 mov.u64 %r329,0; st.u64 [%r204+8],%r329; .loc 1 290 28 cvt.u16.u32 %r332,%r22; add.u16 %r331,%r332,-1; cvt.u32.u16 %r333,%r331; st.u8 [%r204+28],%r333; .loc 1 292 20 add.u32 %r334,%r22,-2; cvt.s64.s32 %r72,%r334; add.u64 %r336,%r72,%r72; add.u64 %r337,%r336,%r72; shl.b64 %r338,%r337,3; add.u64 %r339,%r204,%r338; .loc 1 292 67 shl.b64 %r341,%r72,3; add.u64 %r342,%frame,%r341; .loc 1 292 59 ld.u64 %r344,[%r339+40]; ld.u64 %r345,[%r342+360]; mul.lo.u64 %r154,%r344,%r345; .loc 1 294 29 mov.u64 %r3471547; call (%value_in),_gfortrani_xmallocarray8,[%value_in]; } .loc 1 294 27 st.u64 [%r204],%r348; .loc 1 295 10 setp.eq.u64 %r350,%r154,0; @ ! 
%r350 bra $L74; bra $L71; $L73: .loc 1 283 48 mul.lo.u64 %r140,%r68,%r140; $L70: .loc 1 285 4 st.u64 [%r179+8],%r351; ld.u64 %r68,[%r176]; add.u64 %r352,%r68,-1; st.u64 [%r179+16],%r352; st.u64 [%r179],%r140; .loc 1 278 7 add.u64 %r179,%r179,24; add.u64 %r176,%r176,8; setp.ne.u64 %r353,%r169,%r179; @ %r353 bra $L73; bra $L69; $L71: .loc 1 298 4 st.u64 [%r204+48],%r154; mov.u64 %r355,-1; st.u64 [%r204+56],%r355; mov.u64 %r356,1; st.u64 [%r204+40],%r356; .loc 1 299 4 bra $L52; $L64: .loc 1 304 19 ld.s8 %r79,[%r204+28]; .loc 1 304 10 setp.eq.u64 %r357,%r79,%r147; @ %r357 bra $L76; .loc 1 305 2 st.u64 [%stack+8],%r147; st.u64 [%stack],%r79;_gfortran_runtime_error76: .loc 1 310 11 cvta.global.u64 %r360,_gfortrani_compile_options; .loc 1 310 10 ld.u32 %r361,[%r360+36]; setp.eq.u32 %r362,%r361,0; @ %r362 bra $L74; .loc 1 311 2 add.u64 %r416,%frame,360; cvta.const.u64 %r366,$LC2r416366; call _gfortrani_bounds_ifunction_return$L74: .loc 1 315 3 setp.gt.s64 %r368,%r147,0; @ %r368 bra $L77; $L80: .loc 1 235 9 add.u64 %r369,%r31,1; .loc 1 235 7 sub.u64 %r149,%r369,%r33; max.s64 %r65,%r149,0; .loc 1 240 12 mul.lo.u64 %r37,%r34,%r36; .loc 1 323 8 ld.u64 %r164,[%r204]; .loc 1 326 8 ld.u64 %r163,[%r205]; .loc 1 351 35 shl.b64 %r100,%r151,3; .loc 1 336 22 add.u64 %r64,%r65,-1; .loc 1 336 15 mul.lo.u64 %r89,%r64,%r100; .loc 1 337 29 mul.lo.u64 %r177,%r37,%r64; .loc 1 338 34 neg.s64 %r95,%r100; .loc 1 363 22 ld.u64 %r102,[%frame+240]; .loc 1 363 12 shl.b64 %r104,%r102,3; .loc 1 364 23 ld.u64 %r105,[%frame+120]; .loc 1 365 22 ld.u64 %r107,[%frame]; .loc 1 365 12 shl.b64 %r109,%r107,3; .loc 1 362 12 ld.u64 %r203,[%frame+480]; setp.ne.u32 %r417,%r209,0; add.u64 %r415,%frame,488; .loc 1 338 4 setp.le.s64 %r423,%r149,0; .loc 1 382 23 add.u64 %r424,%frame,240; .loc 1 367 32 add.u64 %r425,%frame,360; .loc 1 371 20 add.u64 %r426,%frame,120; bra $L78; $L77: add.u64 %r198,%frame,480; add.u64 %r197,%r204,40; mov.u64 %r195,%frame; cvt.u32.u32 %r371,%r22; cvt.s64.s8 %r370,%r371; add.u64 %r181,%r370,-1; .loc 1 315 3 mov.u64 %r165,0; add.u64 %r416,%frame,360; .loc 1 317 16 mov.u64 %r372,%r165; $L79: st.u64 [%r198],%r372; .loc 1 318 18 ld.u64 %r373,[%r197]; st.u64 [%r195],%r373; .loc 1 319 17 shl.b64 %r375,%r165,3; add.u64 %r376,%r416,%r375; .loc 1 319 10 ld.u64 %r377,[%r376]; setp.le.s64 %r378,%r377,0; @ %r378 bra $L52; .loc 1 315 26 add.u64 %r165,%r165,1; .loc 1 315 3 add.u64 %r198,%r198,8; add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; setp.ne.u64 %r379,%r165,%r181; @ %r379 bra $L79; bra $L80; $L78: .loc 1 334 10 @ %r417 bra $L81; .loc 1 351 4 setp.gt.s64 %r381,%r149,0; @ %r381 bra $L91; .loc 1 333 14 mov.u64 %r158,0; bra $L83; $L81: .loc 1 336 8 add.u64 %r29,%r163,%r89; .loc 1 337 9 add.u64 %r159,%r160,%r177; .loc 1 338 4 @ %r423 bra $L92; .loc 1 338 11 mov.u64 %r158,%r65; $L85: .loc 1 340 11 ld.s8 %r384,[%r159]; cvt.u16.u32 %r383,%r384; setp.eq.u16 %r385,%r383,0; @ %r385 bra $L84; .loc 1 340 18 ld.f64 %r386,[%r29]; setp.eq.f64 %r387,%r386,%r206; @ %r387 bra $L83; $L84: .loc 1 338 26 add.u64 %r158,%r158,-1; .loc 1 338 34 add.u64 %r29,%r29,%r95; .loc 1 338 53 sub.u64 %r159,%r159,%r37; .loc 1 338 4 setp.ne.u64 %r388,%r158,0; @ %r388 bra $L85; bra $L83; $L91: .loc 1 351 4 mov.u64 %r157,%r160; mov.u64 %r111,%r163; .loc 1 351 11 mov.u64 %r158,1; $L82: .loc 1 353 11 ld.s8 %r390,[%r157]; cvt.u16.u32 %r389,%r390; setp.eq.u16 %r391,%r389,0; @ %r391 bra $L86; .loc 1 353 18 ld.f64 %r392,[%r111]; setp.eq.f64 %r393,%r392,%r206; @ %r393 bra $L83; $L86: .loc 1 351 27 add.u64 %r158,%r158,1; .loc 1 351 35 add.u64 %r111,%r111,%r100; .loc 
1 351 54 add.u64 %r157,%r157,%r37; .loc 1 351 4 setp.ge.s64 %r394,%r65,%r158; @ %r394 bra $L82; .loc 1 333 14 mov.u64 %r158,0; bra $L83; $L92: mov.u64 %r158,0; $L83: .loc 1 360 13 st.u64 [%r164],%r158; .loc 1 362 15 add.u64 %r203,%r203,1; st.u64 [%frame+480],%r203; .loc 1 363 12 add.u64 %r163,%r163,%r104; .loc 1 364 13 add.u64 %r160,%r160,%r105; .loc 1 365 12 add.u64 %r164,%r164,%r109; .loc 1 367 32 ld.u64 %r130,[%frame+360]; .loc 1 367 13 setp.ne.u64 %r395,%r203,%r130; @ %r395 bra $L78; mov.u64 %r126,%r415; mov.u64 %r127,%r107; mov.u64 %r202,%r105; mov.u64 %r124,%r102; mov.u64 %r120,8; .loc 1 366 9 mov.u64 %r162,0; .loc 1 369 13 mov.u64 %r397,%r162; $L89: st.u64 [%r126+-8],%r397; .loc 1 370 23 mul.lo.u64 %r113,%r130,%r124; .loc 1 371 24 mul.lo.u64 %r398,%r130,%r202; .loc 1 371 10 sub.u64 %r160,%r160,%r398; .loc 1 372 23 mul.lo.u64 %r119,%r130,%r127; .loc 1 373 5 add.u64 %r162,%r162,1; .loc 1 374 7 setp.le.s64 %r399,%r147,%r162; @ %r399 bra $L52; .loc 1 381 16 ld.u64 %r400,[%r126]; add.u64 %r123,%r400,1; st.u64 [%r126],%r123; .loc 1 382 23 add.u64 %r402,%r424,%r120; ld.u64 %r124,[%r402]; .loc 1 382 13 sub.u64 %r403,%r124,%r113; shl.b64 %r404,%r403,3; add.u64 %r163,%r163,%r404; .loc 1 383 23 add.u64 %r405,%frame,%r120; ld.u64 %r127,[%r405]; .loc 1 383 13 sub.u64 %r406,%r127,%r119; shl.b64 %r407,%r406,3; add.u64 %r164,%r164,%r407; .loc 1 367 32 add.u64 %r409,%r425,%r120; ld.u64 %r130,[%r409]; .loc 1 367 13 add.u64 %r126,%r126,8; setp.eq.u64 %r410,%r123,%r130; @ %r410 bra $L88; .loc 1 362 12 ld.u64 %r203,[%frame+480]; bra $L78; $L88: .loc 1 371 20 add.u64 %r412,%r426,%r120; ld.u64 %r202,[%r412]; add.u64 %r120,%r120,8; bra $L89; $L114: .loc 1 274 6 ld.u64 %r413,[%r204]; setp.eq.u64 %r414,%r413,0; @ ! %r414 bra $L64; bra $L90; $L52: .loc 1 387_gfortran_sfindloc1_r8 .visible .func _gfortran_sfindloc1_r868u64 %r126; .reg .u64u32pred %r190; .reg .pred %r192; .reg .u64pred %r213; .reg .predu32predpredpred %r257; .reg .u64 %r260; .reg .u64 %r261; .reg .pred %r263; .reg .u64 %r264; .reg .u32 %r265; .reg .u64 %r266; .reg .u6464predmov.u64 %r149,%ar0; mov.u64 %r150,%ar1; mov.f64 %r151,%ar2; mov.u64 %r152,%ar3; mov.u64 %r153,%ar4; mov.u32 %r154,%ar5; .loc 1 410 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L116; .loc 1 410 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L117; $L116:call _gfortran_findloc1_); } .loc 1 413 7 bra $L115; $L117: .loc 1 416 10 ld.s8 %r23,[%r150+28]; .loc 1 416 38 add.u32 %r163,%r23,-1; .loc 1 416 8 cvt.s64.s32 %r90,%r163; .loc 1 417 10 ld.u64 %r26,[%r152]; .loc 1 417 7 add.u64 %r91,%r26,-1; .loc 1 419 7 shr.u64 %r165,%r91,63; set.u32.lt.s64 %r167,%r90,%r91; neg.s32 %r168,%r167; cvt.u16.u64 %r171,%r165; cvt.u16.u32 %r172,%r168; or.b16 %r170,%r171,%r172; .loc 1 419 6 cvt.u32.u16 %r173,%r170; cvt.u16.u8 %r174,%r173; setp.ne.u16 %r175,%r174,0; @ %r175 bra $L119; .loc 1 430 3 setp.ne.u64 %r176,%r91,0; @ %r176 bra $L120; $L126: .loc 1 438 3 setp.gt.s64 %r177,%r90,%r91; @ %r177 bra $L121; bra $L163; $L119: .loc 1 421 7 cvt.u32.u32 %r180,%r23; cvt.s64.s8 %r179,%r180; st.u64 [%stack+8],%r179_gfortran_runtime_error120: add.u64 %r136,%r150,48; add.u64 %r138,%frame,120; add.u64 %r139,%r150,56; add.u64 %r182,%r150,24; add.u64 %r184,%r26,%r26; add.u64 %r185,%r184,%r26; shl.b64 %r186,%r185,3; add.u64 %r145,%r182,%r186; .loc 1 435 12 mov.u64 %r302,0; $L125: .loc 1 432 19 ld.u64 %r188,[%r139]; add.u64 %r187,%r188,1; ld.u64 %r189,[%r136]; sub.u64 %r35,%r187,%r189; .loc 1 434 10 setp.le.s64 %r190,%r35,0; @ %r190 bra $L123; .loc 1 432 17 st.u64 [%r138],%r35; bra $L124; $L123: .loc 1 
435 12 st.u64 [%r138],%r302; $L124: .loc 1 430 3 add.u64 %r136,%r136,24; add.u64 %r138,%r138,8; add.u64 %r139,%r139,24; setp.ne.u64 %r192,%r136,%r145; @ %r192 bra $L125; bra $L126; $L163: .loc 1 448 6 ld.u64 %r193,[%r149]; setp.eq.u64 %r194,%r193,0; @ ! %r194 bra $L128; bra $L127; $L121: add.u64 %r196,%r26,%r26; add.u64 %r197,%r196,%r26; shl.b64 %r198,%r197,3; add.u64 %r199,%r198,48; add.u64 %r125,%r150,%r199; add.u64 %r294,%frame,120; shl.b64 %r201,%r26,3; add.u64 %r202,%r201,-8; add.u64 %r45,%r294,%r202; add.u64 %r203,%r150,48; cvt.u32.u32 %r205,%r23; cvt.s64.s8 %r204,%r205; add.u64 %r207,%r204,%r204; add.u64 %r208,%r207,%r204; shl.b64 %r209,%r208,3; add.u64 %r132,%r203,%r209; .loc 1 444 12 mov.u64 %r301,0; $L131: .loc 1 441 2 ld.u64 %r211,[%r125+8]; add.u64 %r210,%r211,1; ld.u64 %r212,[%r125]; sub.u64 %r40,%r210,%r212; .loc 1 443 10 setp.le.s64 %r213,%r40,0; @ %r213 bra $L129; .loc 1 440 17 st.u64 [%r45],%r40; bra $L130; $L129: .loc 1 444 12 st.u64 [%r45],%r301; $L130: .loc 1 438 3 add.u64 %r125,%r125,24; add.u64 %r45,%r45,8; setp.ne.u64 %r215,%r125,%r132; @ %r215 bra $L131; bra $L164; $L127: .loc 1 452 7 setp.le.s64 %r216,%r90,0; @ %r216 bra $L133; add.u64 %r294,%frame,120; $L149: add.u64 %r78,%r149,40; mov.u64 %r61,%r294; cvt.u32.u32 %r218,%r23; cvt.s64.s8 %r217,%r218; add.u64 %r220,%r217,%r217; add.u64 %r221,%r220,%r217; shl.b64 %r222,%r221,3; add.u64 %r223,%r149,16; add.u64 %r126,%r222,%r223; .loc 1 455 10 mov.u64 %r81,1; .loc 1 459 4 mov.u64 %r246,0; bra $L134; $L133: .loc 1 462 24 mov.u64 %r224,0; st.u64 [%r149+8],%r224; .loc 1 463 28 cvt.u16.u32 %r227,%r23; add.u16 %r226,%r227,-1; cvt.u32.u16 %r228,%r226; st.u8 [%r149+28],%r228; .loc 1 465 20 add.u32 %r229,%r23,-2; cvt.s64.s32 %r51,%r229; add.u64 %r231,%r51,%r51; add.u64 %r232,%r231,%r51; shl.b64 %r233,%r232,3; add.u64 %r234,%r149,%r233; .loc 1 465 67 shl.b64 %r236,%r51,3; add.u64 %r237,%frame,%r236; .loc 1 465 59 ld.u64 %r239,[%r234+40]; ld.u64 %r240,[%r237+120]; mul.lo.u64 %r94,%r239,%r240; .loc 1 467 29 mov.u64 %r24242; call (%value_in),_gfortrani_xmallocarray3,[%value_in]; } .loc 1 467 27 st.u64 [%r149],%r243; .loc 1 468 10 setp.eq.u64 %r245,%r94,0; @ ! 
%r245 bra $L138; bra $L135; $L137: .loc 1 457 48 mul.lo.u64 %r81,%r47,%r81; $L134: .loc 1 459 4 st.u64 [%r78+8],%r246; ld.u64 %r47,[%r61]; add.u64 %r247,%r47,-1; st.u64 [%r78+16],%r247; st.u64 [%r78],%r81; .loc 1 452 7 add.u64 %r78,%r78,24; add.u64 %r61,%r61,8; setp.ne.u64 %r248,%r78,%r126; @ %r248 bra $L137; bra $L133; $L135: .loc 1 471 4 st.u64 [%r149+48],%r94; mov.u64 %r250,-1; st.u64 [%r149+56],%r250; mov.u64 %r251,1; st.u64 [%r149+40],%r251; .loc 1 472 4 bra $L115; $L128: .loc 1 477 19 ld.s8 %r58,[%r149+28]; .loc 1 477 10 setp.eq.u64 %r252,%r58,%r90; @ %r252 bra $L139; .loc 1 478 2 st.u64 [%stack+8],%r90; st.u64 [%stack],%r58;_gfortran_runtime_error139: .loc 1 483 11 cvta.global.u64 %r255,_gfortrani_compile_options; .loc 1 483 10 ld.u32 %r256,[%r255+36]; setp.eq.u32 %r257,%r256,0; @ %r257 bra $L138; .loc 1 484 2 add.u64 %r294,%frame,120; cvta.const.u64 %r261,$LC2261; call _gfortrani_bounds_ifunction_return$L138: .loc 1 488 3 setp.gt.s64 %r263,%r90,0; @ %r263 bra $L140; $L143: .loc 1 495 8 ld.u64 %r97,[%r149]; ld.u64 %r42,[%frame+240]; .loc 1 503 22 ld.u64 %r64,[%frame]; .loc 1 503 12 shl.b64 %r66,%r64,3; setp.le.s64 %r295,%r90,1; cvt.u32.u32 %r297,%r23; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; add.u64 %r300,%frame,120; bra $L141; $L140: add.u64 %r114,%frame,240; add.u64 %r113,%r149,40; mov.u64 %r108,%frame; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r79,%r264,-1; .loc 1 488 3 mov.u64 %r98,0; add.u64 %r294,%frame,120; .loc 1 490 16 mov.u64 %r266,%r98; $L142: st.u64 [%r114],%r266; .loc 1 491 18 ld.u64 %r267,[%r113]; st.u64 [%r108],%r267; .loc 1 492 17 shl.b64 %r269,%r98,3; add.u64 %r270,%r294,%r269; .loc 1 492 10 ld.u64 %r271,[%r270]; setp.le.s64 %r272,%r271,0; @ %r272 bra $L115; .loc 1 488 26 add.u64 %r98,%r98,1; .loc 1 488 3 add.u64 %r114,%r114,8; add.u64 %r113,%r113,24; add.u64 %r108,%r108,8; setp.ne.u64 %r273,%r79,%r98; @ %r273 bra $L142; bra $L143; $L141: .loc 1 500 13 mov.u64 %r274,0; st.u64 [%r97],%r274; .loc 1 502 15 add.u64 %r42,%r42,1; .loc 1 503 12 add.u64 %r97,%r97,%r66; .loc 1 505 32 ld.u64 %r88,[%frame+120]; .loc 1 505 13 setp.ne.u64 %r275,%r42,%r88; @ %r275 bra $L141; .loc 1 508 23 mul.lo.u64 %r276,%r64,%r42; .loc 1 508 9 shl.b64 %r277,%r276,3; sub.u64 %r30,%r97,%r277; .loc 1 510 7 @ ! %r295 bra $L165; bra $L115; $L148: .loc 1 507 13 st.u64 [%r102],%r274; .loc 1 508 23 mul.lo.u64 %r280,%r73,%r72; .loc 1 508 9 shl.b64 %r281,%r280,3; sub.u64 %r30,%r97,%r281; .loc 1 509 5 add.u64 %r96,%r96,1; .loc 1 510 7 add.u64 %r102,%r102,8; setp.ne.u64 %r282,%r96,%r299; @ %r282 bra $L147; bra $L115; $L165: add.u64 %r102,%frame,248; .loc 1 509 5 mov.u64 %r96,1; $L147: .loc 1 517 16 ld.u64 %r286,[%r102]; add.u64 %r72,%r286,1; st.u64 [%r102],%r72; shl.b64 %r120,%r96,3; .loc 1 518 23 add.u64 %r287,%frame,%r120; ld.u64 %r73,[%r287]; .loc 1 518 13 shl.b64 %r288,%r73,3; add.u64 %r97,%r30,%r288; .loc 1 505 32 add.u64 %r290,%r300,%r120; ld.u64 %r77,[%r290]; .loc 1 505 13 setp.eq.u64 %r291,%r72,%r77; @ %r291 bra $L148; .loc 1 507 13 mov.u64 %r42,0; bra $L141; $L164: .loc 1 448 6 ld.u64 %r292,[%r149]; setp.eq.u64 %r293,%r292,0; @ ! 
%r293 bra $L128; bra $L149; $L115: .loc 1 522findloc1_r10.o/ 1622802252findloc1_c4.o/_gfortran_findloc1_c4 .visible .func _gfortran_findloc1_c.file 1 "../../../../libgfortran/generated/findloc1_c4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc1_c4 .visible .func _gfortran_mfindloc1_c_gfortran_sfindloc1_c4 .visible .func _gfortran_sfindloc1_cVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_internal_error .extern .func _gfortrani_internal_,105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,10114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,914,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,108] = {70,73,78,68,76,79,67_gfortran_findloc1_c4 .visible .func _gfortran_findloc1_c486410416u64 %r222; .reg .pred %r236; .reg .predpred %r247; .reg .predu64 %r265; .reg .u64 %r266; .reg .u64u64u64 %r281; .reg .u64 %r282; .reg .u16pred %r315; .reg .u64u6464pred %r333; .reg .f32 %r338; .reg .u32 %r339; .reg .u32 %r340; .reg .f32 %r342; .reg .u32 %r343; .reg .u32pred %r365; .reg .pred %r366; .reg .predpredu64 %r405; .reg .u64 %r406; mov.u64 %r195,%ar0; mov.u64 %r196,%ar1; mov.f32 %r197,%ar2; mov.f32 %r198,%ar3; mov.u64 %r199,%ar4; mov.u32 %r200,%ar5; .loc 1 54 10 ld.s8 %r22,[%r196+28]; .loc 1 54 38 add.u32 %r201,%r22,-1; .loc 1 54 8 cvt.s64.s32 %r118,%r201; .loc 1 55 10 ld.u64 %r25,[%r199]; .loc 1 55 7 add.u64 %r119,%r25,-1; .loc 1 57 7 shr.u64 %r203,%r119,63; set.u32.lt.s64 %r205,%r118,%r119; neg.s32 %r206,%r205; cvt.u16.u64 %r209,%r203; cvt.u16.u32 %r210,%r206; or.b16 %r208,%r209,%r210; .loc 1 57 6 cvt.u32.u16 %r211,%r208; cvt.u16.u8 %r212,%r211; setp.eq.u16 %r213,%r212,0; @ %r213 bra $L2; .loc 1 59 7 cvt.u32.u32 %r216,%r22; cvt.s64.s8 %r215,%r216; st.u64 [%stack+8],%r21514214_gfortran_runtime_errorr219,%r119,%r119; add.u64 %r220,%r219,%r119; shl.b64 %r221,%r220,3; add.u64 %r222,%r196,%r221; ld.u64 %r30,[%r222+56]; ld.u64 %r32,[%r222+48]; .loc 1 67 9 ld.u64 %r122,[%r222+40]; .loc 1 69 3 setp.ne.u64 %r236,%r119,0; @ %r236 bra $L3; $L9: .loc 1 77 3 setp.gt.s64 %r237,%r118,%r119; @ %r237 bra $L4; bra $L50; $L3: add.u64 %r63,%r196,40; add.u64 %r48,%frame,120; add.u64 %r133,%frame,240; add.u64 %r238,%r196,16; add.u64 %r240,%r25,%r25; add.u64 %r241,%r240,%r25; shl.b64 %r242,%r241,3; add.u64 %r47,%r238,%r242; .loc 1 75 12 mov.u64 %r406,0; $L8: .loc 1 71 18 ld.u64 %r243,[%r63]; st.u64 [%r48],%r243; .loc 1 72 19 ld.u64 %r245,[%r63+16]; add.u64 %r244,%r245,1; ld.u64 %r246,[%r63+8]; sub.u64 %r37,%r244,%r246; .loc 1 74 10 setp.lt.s64 %r247,%r37,0; @ %r247 bra $L6; .loc 1 72 17 st.u64 [%r133],%r37; bra $L7; $L6: .loc 1 75 12 st.u64 [%r133],%r406; $L7: .loc 1 69 3 add.u64 %r63,%r63,24; add.u64 %r48,%r48,8; add.u64 %r133,%r133,8; setp.ne.u64 %r249,%r47,%r63; @ %r249 bra $L8; bra $L9; $L50: .loc 1 86 6 ld.u64 %r250,[%r195]; setp.eq.u64 %r251,%r250,0; @ ! 
%r251 bra $L11; bra $L10; $L4: add.u64 %r253,%r25,%r25; add.u64 %r254,%r253,%r25; shl.b64 %r255,%r254,3; add.u64 %r256,%r255,40; add.u64 %r161,%r196,%r256; shl.b64 %r257,%r25,3; add.u64 %r146,%r257,-8; add.u64 %r395,%frame,120; add.u64 %r151,%r395,%r146; add.u64 %r394,%frame,240; add.u64 %r143,%r394,%r146; add.u64 %r260,%r196,40; cvt.u32.u32 %r262,%r22; cvt.s64.s8 %r261,%r262; add.u64 %r264,%r261,%r261; add.u64 %r265,%r264,%r261; shl.b64 %r266,%r265,3; add.u64 %r104,%r260,%r266; .loc 1 83 12 mov.u64 %r405,0; $L14: .loc 1 79 18 ld.u64 %r267,[%r161]; st.u64 [%r151],%r267; .loc 1 80 19 ld.u64 %r269,[%r161+16]; add.u64 %r268,%r269,1; ld.u64 %r270,[%r161+8]; sub.u64 %r44,%r268,%r270; .loc 1 82 10 setp.lt.s64 %r271,%r44,0; @ %r271 bra $L12; .loc 1 80 17 st.u64 [%r143],%r44; bra $L13; $L12: .loc 1 83 12 st.u64 [%r143],%r405; $L13: .loc 1 77 3 add.u64 %r161,%r161,24; add.u64 %r151,%r151,8; add.u64 %r143,%r143,8; setp.ne.u64 %r273,%r104,%r161; @ %r273 bra $L14; bra $L51; $L10: .loc 1 90 7 setp.le.s64 %r274,%r118,0; @ %r274 bra $L16; add.u64 %r394,%frame,240; $L36: add.u64 %r170,%r195,40; mov.u64 %r168,%r394; add.u64 %r275,%r195,16; cvt.u32.u32 %r277,%r22; cvt.s64.s8 %r276,%r277; add.u64 %r279,%r276,%r276; add.u64 %r280,%r279,%r276; shl.b64 %r281,%r280,3; add.u64 %r162,%r275,%r281; .loc 1 93 10 mov.u64 %r110,1; .loc 1 97 4 mov.u64 %r304,0; bra $L17; $L16: .loc 1 101 24 mov.u64 %r282,0; st.u64 [%r195+8],%r282; .loc 1 102 28 cvt.u16.u32 %r285,%r22; add.u16 %r284,%r285,-1; cvt.u32.u16 %r286,%r284; st.u8 [%r195+28],%r286; .loc 1 104 20 add.u32 %r287,%r22,-2; cvt.s64.s32 %r55,%r287; add.u64 %r289,%r55,%r55; add.u64 %r290,%r289,%r55; shl.b64 %r291,%r290,3; add.u64 %r292,%r195,%r291; .loc 1 104 67 shl.b64 %r294,%r55,3; add.u64 %r295,%frame,%r294; .loc 1 104 59 ld.u64 %r297,[%r292+40]; ld.u64 %r298,[%r295+240]; mul.lo.u64 %r124,%r297,%r298; .loc 1 106 29 mov.u64 %r3001300; call (%value_in),_gfortrani_xmallocarray01,[%value_in]; } .loc 1 106 27 st.u64 [%r195],%r301; .loc 1 107 10 setp.eq.u64 %r303,%r124,0; @ ! 
%r303 bra $L21; bra $L18; $L20: .loc 1 95 48 mul.lo.u64 %r110,%r51,%r110; $L17: .loc 1 97 4 st.u64 [%r170+8],%r304; ld.u64 %r51,[%r168]; add.u64 %r305,%r51,-1; st.u64 [%r170+16],%r305; st.u64 [%r170],%r110; .loc 1 90 7 add.u64 %r170,%r170,24; add.u64 %r168,%r168,8; setp.ne.u64 %r306,%r162,%r170; @ %r306 bra $L20; bra $L16; $L18: .loc 1 110 4 st.u64 [%r195+48],%r124; mov.u64 %r308,-1; st.u64 [%r195+56],%r308; mov.u64 %r309,1; st.u64 [%r195+40],%r309; .loc 1 111 4 bra $L1; $L11: .loc 1 116 19 ld.s8 %r62,[%r195+28]; .loc 1 116 10 setp.eq.u64 %r310,%r62,%r118; @ %r310 bra $L23; .loc 1 117 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 122 11 cvta.global.u64 %r313,_gfortrani_compile_options; .loc 1 122 10 ld.u32 %r314,[%r313+36]; setp.eq.u32 %r315,%r314,0; @ %r315 bra $L21; .loc 1 123 2 add.u64 %r394,%frame,240; cvta.const.u64 %r319,$LC2r394318319; call _gfortrani_bounds_ifunction_return$L21: .loc 1 127 3 setp.gt.s64 %r321,%r118,0; @ %r321 bra $L24; $L27: .loc 1 64 9 add.u64 %r322,%r30,1; .loc 1 64 7 sub.u64 %r120,%r322,%r32; max.s64 %r42,%r120,0; .loc 1 135 8 ld.u64 %r109,[%r195]; .loc 1 138 8 ld.u64 %r131,[%r196]; .loc 1 160 35 shl.b64 %r73,%r122,3; .loc 1 147 22 add.u64 %r323,%r42,-1; .loc 1 147 15 mul.lo.u64 %r69,%r323,%r73; .loc 1 148 34 neg.s64 %r70,%r73; ld.u64 %r71,[%frame+360]; .loc 1 172 22 ld.u64 %r74,[%frame+120]; .loc 1 172 12 shl.b64 %r76,%r74,3; .loc 1 173 22 ld.u64 %r77,[%frame]; .loc 1 173 12 shl.b64 %r79,%r77,3; setp.ne.u32 %r393,%r200,0; setp.le.s64 %r396,%r118,1; .loc 1 148 4 setp.le.s64 %r398,%r120,0; .loc 1 160 4 setp.gt.s64 %r399,%r120,0; cvt.u32.u32 %r400,%r22; cvt.s64.s8 %r401,%r400; add.u64 %r402,%r401,-1; add.u64 %r403,%frame,120; add.u64 %r404,%frame,240; bra $L25; $L24: add.u64 %r183,%frame,360; add.u64 %r182,%r195,40; mov.u64 %r180,%frame; cvt.u32.u32 %r325,%r22; cvt.s64.s8 %r324,%r325; add.u64 %r171,%r324,-1; .loc 1 127 3 mov.u64 %r132,0; add.u64 %r394,%frame,240; .loc 1 129 16 mov.u64 %r326,%r132; $L26: st.u64 [%r183],%r326; .loc 1 130 18 ld.u64 %r327,[%r182]; st.u64 [%r180],%r327; .loc 1 131 17 shl.b64 %r329,%r132,3; add.u64 %r330,%r394,%r329; .loc 1 131 10 ld.u64 %r331,[%r330]; setp.le.s64 %r332,%r331,0; @ %r332 bra $L1; .loc 1 127 26 add.u64 %r132,%r132,1; .loc 1 127 3 add.u64 %r183,%r183,8; add.u64 %r182,%r182,24; add.u64 %r180,%r180,8; setp.ne.u64 %r333,%r132,%r171; @ %r333 bra $L26; bra $L27; $L25: .loc 1 145 10 @ %r393 bra $L28; .loc 1 160 4 @ %r399 bra $L37; .loc 1 144 14 mov.u64 %r126,0; bra $L30; $L28: .loc 1 147 8 add.u64 %r127,%r131,%r69; .loc 1 148 4 @ %r398 bra $L38; .loc 1 148 11 mov.u64 %r126,%r42; $L31: .loc 1 150 11 ld.f32 %r338,[%r127+4]; set.u32.eq.f32 %r339,%r338,%r198; neg.s32 %r340,%r339; ld.f32 %r342,[%r127]; set.u32.eq.f32 %r343,%r342,%r197; neg.s32 %r344,%r343; cvt.u16.u32 %r346,%r340; cvt.u16.u32 %r347,%r344; and.b16 %r345,%r346,%r347; cvt.u32.u16 %r348,%r345; cvt.u16.u8 %r349,%r348; setp.ne.u16 %r350,%r349,0; @ %r350 bra $L30; .loc 1 148 26 add.u64 %r126,%r126,-1; .loc 1 148 34 add.u64 %r127,%r127,%r70; .loc 1 148 4 setp.ne.u64 %r351,%r126,0; @ %r351 bra $L31; bra $L30; $L37: .loc 1 160 4 mov.u64 %r125,%r131; .loc 1 160 11 mov.u64 %r126,1; $L29: .loc 1 162 11 ld.f32 %r353,[%r125+4]; set.u32.eq.f32 %r354,%r353,%r198; neg.s32 %r355,%r354; ld.f32 %r357,[%r125]; set.u32.eq.f32 %r358,%r357,%r197; neg.s32 %r359,%r358; cvt.u16.u32 %r361,%r355; cvt.u16.u32 %r362,%r359; and.b16 %r360,%r361,%r362; cvt.u32.u16 %r363,%r360; cvt.u16.u8 %r364,%r363; setp.ne.u16 %r365,%r364,0; @ %r365 bra $L30; .loc 1 160 27 add.u64 
%r126,%r126,1; .loc 1 160 35 add.u64 %r125,%r125,%r73; .loc 1 160 4 setp.ge.s64 %r366,%r42,%r126; @ %r366 bra $L29; .loc 1 144 14 mov.u64 %r126,0; bra $L30; $L38: mov.u64 %r126,0; $L30: .loc 1 169 13 st.u64 [%r109],%r126; .loc 1 171 15 add.u64 %r71,%r71,1; .loc 1 172 12 add.u64 %r131,%r131,%r76; .loc 1 173 12 add.u64 %r109,%r109,%r79; .loc 1 175 32 ld.u64 %r138,[%frame+240]; .loc 1 175 13 setp.ne.u64 %r367,%r71,%r138; @ %r367 bra $L25; .loc 1 178 23 mul.lo.u64 %r368,%r74,%r71; .loc 1 178 9 shl.b64 %r369,%r368,3; sub.u64 %r128,%r131,%r369; .loc 1 179 23 mul.lo.u64 %r370,%r77,%r71; .loc 1 179 9 shl.b64 %r371,%r370,3; sub.u64 %r129,%r109,%r371; .loc 1 181 7 @ %r396 bra $L1; add.u64 %r68,%frame,368; mov.u64 %r155,8; .loc 1 180 5 mov.u64 %r130,1; .loc 1 177 13 mov.u64 %r397,0; bra $L34; $L35: st.u64 [%r68],%r397; .loc 1 178 23 mul.lo.u64 %r377,%r92,%r91; .loc 1 178 9 shl.b64 %r378,%r377,3; sub.u64 %r128,%r131,%r378; .loc 1 179 23 mul.lo.u64 %r379,%r95,%r91; .loc 1 179 9 shl.b64 %r380,%r379,3; sub.u64 %r129,%r109,%r380; .loc 1 180 5 add.u64 %r130,%r130,1; .loc 1 181 7 add.u64 %r68,%r68,8; add.u64 %r155,%r155,8; setp.eq.u64 %r381,%r130,%r402; @ %r381 bra $L1; $L34: .loc 1 188 16 ld.u64 %r382,[%r68]; add.u64 %r91,%r382,1; st.u64 [%r68],%r91; .loc 1 189 23 add.u64 %r384,%r403,%r155; ld.u64 %r92,[%r384]; .loc 1 189 13 shl.b64 %r385,%r92,3; add.u64 %r131,%r128,%r385; .loc 1 190 23 add.u64 %r386,%frame,%r155; ld.u64 %r95,[%r386]; .loc 1 190 13 shl.b64 %r387,%r95,3; add.u64 %r109,%r129,%r387; .loc 1 175 32 add.u64 %r389,%r404,%r155; ld.u64 %r98,[%r389]; .loc 1 175 13 setp.eq.u64 %r390,%r91,%r98; @ %r390 bra $L35; .loc 1 177 13 mov.u64 %r71,0; bra $L25; $L51: .loc 1 86 6 ld.u64 %r391,[%r195]; setp.eq.u64 %r392,%r391,0; @ ! %r392 bra $L11; bra $L36; $L1: .loc 1 194_gfortran_mfindloc1_c4 .visible .func _gfortran_mfindloc1_c608u32 %r226; .reg .u16 %r228; .reg .u16 %r229; .reg .u16predu32 %r265; .reg .u32 %r266; .reg .u32 %r267; .reg .u32 %r269; .reg .u32 %r270; .reg .u16 %r271; .reg .u16 %r272; .reg .u16 %r273; .reg .u32predu64 %r282; .reg .u64 %r283; .reg .u64u64predu64pred %r331; .reg .pred %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u32u16 %r342; .reg .u16predpredu32pred %r389; .reg .pred %r390; .reg .predpred %r411; .reg .u16 %r412; .reg .u32 %r413; .reg .pred %r414; .reg .f32pred %r428; .reg .predu64 %r433; .reg .pred.reg .u64pred %r445; .reg .u64 %r447; .reg .u64 %r448; .reg .predpredmov.f32 %r216,%ar2; mov.f32 %r217,%ar3; mov.u64 %r218,%ar4; mov.u64 %r219,%ar5; mov.u32 %r220,%ar6; .loc 1 225 10 ld.s8 %r22,[%r215+28]; .loc 1 225 38 add.u32 %r221,%r22,-1; .loc 1 225 8 cvt.s64.s32 %r143,%r221; .loc 1 226 10 ld.u64 %r25,[%r218]; .loc 1 226 7 add.u64 %r144,%r25,-1; .loc 1 228 7 shr.u64 %r223,%r144,63; set.u32.lt.s64 %r225,%r143,%r144; neg.s32 %r226,%r225; cvt.u16.u64 %r229,%r223; cvt.u16.u32 %r230,%r226; or.b16 %r228,%r229,%r230; .loc 1 228 6 cvt.u32.u16 %r231,%r228; cvt.u16.u8 %r232,%r231; setp.eq.u16 %r233,%r232,0; @ %r233 bra $L53; .loc 1 230 7 cvt.u32.u32 %r236,%r22; cvt.s64.s8 %r235,%r236; st.u64 [%stack+8],%r235_gfortran_runtime_error3: .loc 1 235 9 add.u64 %r239,%r144,%r144; add.u64 %r240,%r239,%r144; shl.b64 %r241,%r240,3; add.u64 %r242,%r215,%r241; ld.u64 %r31,[%r242+56]; ld.u64 %r33,[%r242+48]; .loc 1 239 9 ld.u64 %r147,[%r242+40]; .loc 1 240 12 add.u64 %r260,%r219,%r241; ld.u64 %r34,[%r260+40]; ld.u64 %r36,[%r219+16]; .loc 1 242 9 ld.u64 %r156,[%r219]; .loc 1 246 22 cvt.u32.u64 %r39,%r36; .loc 1 246 53 add.u32 %r262,%r39,-4; and.b32 %r263,%r262,-5; set.u32.eq.u32 %r265,%r263,0; neg.s32 
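// _gfortran_mfindloc1_c4, declared above, is presumably the masked form of the same
// complex(4) FINDLOC runtime: in its search loops further down, each candidate element is
// first gated on a logical mask value (the ld.s8 loads stepped by the mask stride held in
// %r37) before the paired f32 compares against what are presumably the real and imaginary
// parts of the target value in %r216/%r217.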
%r266,%r265; .loc 1 246 22 add.u32 %r267,%r39,-1; set.u32.le.u32 %r269,%r267,1; neg.s32 %r270,%r269; .loc 1 246 6 cvt.u16.u32 %r272,%r266; cvt.u16.u32 %r273,%r270; or.b16 %r271,%r272,%r273; cvt.u32.u16 %r274,%r271; cvt.u16.u8 %r275,%r274; setp.eq.u16 %r276,%r275,0; @ %r276 bra $L54; $L58: .loc 1 255 3 setp.ne.u64 %r277,%r144,0; @ %r277 bra $L55; $L62: .loc 1 264 3 setp.gt.s64 %r278,%r143,%r144; @ %r278 bra $L56; bra $L113; $L54: .loc 1 248 7 setp.eq.u32 %r280,%r39,16; @ %r280 bra $L58; .loc 1 253 5 cvta.const.u64 %r282,$LC4; mov.u64 %r281,call _gfortrani_internal_error5: add.u64 %r125,%r215,40; add.u64 %r84,%r219,40; add.u64 %r283,%r215,16; add.u64 %r285,%r25,%r25; add.u64 %r286,%r285,%r25; shl.b64 %r287,%r286,3; add.u64 %r65,%r283,%r287; .loc 1 255 3 mov.u64 %r142,0; add.u64 %r453,%frame,240; add.u64 %r450,%frame,120; add.u64 %r452,%frame,360; .loc 1 262 12 mov.u64 %r466,%r142; $L61: .loc 1 257 18 add.u64 %r289,%r453,%r142; ld.u64 %r290,[%r125]; st.u64 [%r289],%r290; .loc 1 258 18 add.u64 %r292,%r450,%r142; .loc 1 258 20 ld.u64 %r294,[%r84]; mul.lo.u64 %r293,%r294,%r36; .loc 1 258 18 st.u64 [%r292],%r293; .loc 1 259 19 ld.u64 %r296,[%r125+16]; add.u64 %r295,%r296,1; ld.u64 %r297,[%r125+8]; sub.u64 %r53,%r295,%r297; .loc 1 261 10 setp.lt.s64 %r298,%r53,0; @ %r298 bra $L59; .loc 1 259 17 add.u64 %r300,%r452,%r142; st.u64 [%r300],%r53; bra $L60; $L59: .loc 1 262 12 add.u64 %r302,%r452,%r142; st.u64 [%r302],%r466; $L60: .loc 1 255 3 add.u64 %r125,%r125,24; add.u64 %r84,%r84,24; add.u64 %r142,%r142,8; setp.ne.u64 %r304,%r65,%r125; @ %r304 bra $L61; bra $L62; $L113: .loc 1 274 6 ld.u64 %r305,[%r214]; setp.eq.u64 %r306,%r305,0; @ ! %r306 bra $L64; bra $L63; $L56: add.u64 %r308,%r25,%r25; add.u64 %r309,%r308,%r25; shl.b64 %r310,%r309,3; add.u64 %r174,%r310,40; add.u64 %r179,%r215,%r174; add.u64 %r172,%r219,%r174; shl.b64 %r311,%r25,3; add.u64 %r152,%r311,-8; cvt.u32.u32 %r313,%r22; cvt.s64.s8 %r312,%r313; shl.b64 %r314,%r312,3; add.u64 %r129,%r314,-8; add.u64 %r453,%frame,240; add.u64 %r450,%frame,120; add.u64 %r452,%frame,360; .loc 1 271 12 mov.u64 %r465,0; $L67: .loc 1 266 18 add.u64 %r316,%r453,%r152; ld.u64 %r317,[%r179]; st.u64 [%r316],%r317; .loc 1 267 18 add.u64 %r319,%r450,%r152; .loc 1 267 20 ld.u64 %r321,[%r172]; mul.lo.u64 %r320,%r321,%r36; .loc 1 267 18 st.u64 [%r319],%r320; .loc 1 268 19 ld.u64 %r323,[%r179+16]; add.u64 %r322,%r323,1; ld.u64 %r324,[%r179+8]; sub.u64 %r62,%r322,%r324; .loc 1 270 10 setp.lt.s64 %r325,%r62,0; @ %r325 bra $L65; .loc 1 268 17 add.u64 %r327,%r452,%r152; st.u64 [%r327],%r62; bra $L66; $L65: .loc 1 271 12 add.u64 %r329,%r452,%r152; st.u64 [%r329],%r465; $L66: .loc 1 264 3 add.u64 %r179,%r179,24; add.u64 %r172,%r172,24; add.u64 %r152,%r152,8; setp.ne.u64 %r331,%r129,%r152; @ %r331 bra $L67; bra $L114; $L63: .loc 1 278 7 setp.le.s64 %r332,%r143,0; @ %r332 bra $L69; add.u64 %r452,%frame,360; $L90: add.u64 %r193,%r214,40; mov.u64 %r191,%r452; add.u64 %r333,%r214,16; cvt.u32.u32 %r335,%r22; cvt.s64.s8 %r334,%r335; add.u64 %r337,%r334,%r334; add.u64 %r338,%r337,%r334; shl.b64 %r339,%r338,3; add.u64 %r185,%r333,%r339; .loc 1 281 10 mov.u64 %r134,1; .loc 1 285 4 mov.u64 %r362,0; bra $L70; $L69: .loc 1 289 24 mov.u64 %r340,0; st.u64 [%r214+8],%r340; .loc 1 290 28 cvt.u16.u32 %r343,%r22; add.u16 %r342,%r343,-1; cvt.u32.u16 %r344,%r342; st.u8 [%r214+28],%r344; .loc 1 292 20 add.u32 %r345,%r22,-2; cvt.s64.s32 %r73,%r345; add.u64 %r347,%r73,%r73; add.u64 %r348,%r347,%r73; shl.b64 %r349,%r348,3; add.u64 %r350,%r214,%r349; .loc 1 292 67 shl.b64 %r352,%r73,3; add.u64 
%r353,%frame,%r352; .loc 1 292 59 ld.u64 %r355,[%r350+40]; ld.u64 %r356,[%r353+360]; mul.lo.u64 %r150,%r355,%r356; .loc 1 294 29 mov.u64 %r358call (%value_in),_gfortrani_xmallocarray59,[%value_in]; } .loc 1 294 27 st.u64 [%r214],%r359; .loc 1 295 10 setp.eq.u64 %r361,%r150,0; @ ! %r361 bra $L74; bra $L71; $L73: .loc 1 283 48 mul.lo.u64 %r134,%r69,%r134; $L70: .loc 1 285 4 st.u64 [%r193+8],%r362; ld.u64 %r69,[%r191]; add.u64 %r363,%r69,-1; st.u64 [%r193+16],%r363; st.u64 [%r193],%r134; .loc 1 278 7 add.u64 %r193,%r193,24; add.u64 %r191,%r191,8; setp.ne.u64 %r364,%r185,%r193; @ %r364 bra $L73; bra $L69; $L71: .loc 1 298 4 st.u64 [%r214+48],%r150; mov.u64 %r366,-1; st.u64 [%r214+56],%r366; mov.u64 %r367,1; st.u64 [%r214+40],%r367; .loc 1 299 4 bra $L52; $L64: .loc 1 304 19 ld.s8 %r80,[%r214+28]; .loc 1 304 10 setp.eq.u64 %r368,%r80,%r143; @ %r368 bra $L76; .loc 1 305 2 st.u64 [%stack+8],%r143; st.u64 [%stack],%r80;_gfortran_runtime_error76: .loc 1 310 11 cvta.global.u64 %r371,_gfortrani_compile_options; .loc 1 310 10 ld.u32 %r372,[%r371+36]; setp.eq.u32 %r373,%r372,0; @ %r373 bra $L74; .loc 1 311 2 add.u64 %r452,%frame,360; cvta.const.u64 %r377,$LC2376377; call _gfortrani_bounds_ifunction_return$L74: .loc 1 315 3 setp.gt.s64 %r379,%r143,0; @ %r379 bra $L77; $L80: .loc 1 235 9 add.u64 %r380,%r31,1; .loc 1 235 7 sub.u64 %r145,%r380,%r33; max.s64 %r66,%r145,0; .loc 1 240 12 mul.lo.u64 %r37,%r34,%r36; .loc 1 323 8 ld.u64 %r160,[%r214]; .loc 1 326 8 ld.u64 %r159,[%r215]; .loc 1 351 35 shl.b64 %r95,%r147,3; .loc 1 336 22 add.u64 %r64,%r66,-1; .loc 1 336 15 mul.lo.u64 %r87,%r64,%r95; .loc 1 337 29 mul.lo.u64 %r44,%r37,%r64; .loc 1 338 34 neg.s64 %r91,%r95; .loc 1 363 22 ld.u64 %r96,[%frame+240]; .loc 1 363 12 shl.b64 %r98,%r96,3; .loc 1 364 23 ld.u64 %r99,[%frame+120]; .loc 1 365 22 ld.u64 %r101,[%frame]; .loc 1 365 12 shl.b64 %r103,%r101,3; .loc 1 362 12 ld.u64 %r213,[%frame+480]; setp.ne.u32 %r454,%r220,0; add.u64 %r451,%frame,488; .loc 1 338 4 setp.le.s64 %r458,%r145,0; .loc 1 382 23 add.u64 %r459,%frame,240; .loc 1 367 32 add.u64 %r460,%frame,360; .loc 1 371 20 add.u64 %r461,%frame,120; bra $L78; $L77: add.u64 %r206,%frame,480; add.u64 %r205,%r214,40; mov.u64 %r203,%frame; cvt.u32.u32 %r382,%r22; cvt.s64.s8 %r381,%r382; add.u64 %r194,%r381,-1; .loc 1 315 3 mov.u64 %r161,0; add.u64 %r452,%frame,360; .loc 1 317 16 mov.u64 %r383,%r161; $L79: st.u64 [%r206],%r383; .loc 1 318 18 ld.u64 %r384,[%r205]; st.u64 [%r203],%r384; .loc 1 319 17 shl.b64 %r386,%r161,3; add.u64 %r387,%r452,%r386; .loc 1 319 10 ld.u64 %r388,[%r387]; setp.le.s64 %r389,%r388,0; @ %r389 bra $L52; .loc 1 315 26 add.u64 %r161,%r161,1; .loc 1 315 3 add.u64 %r206,%r206,8; add.u64 %r205,%r205,24; add.u64 %r203,%r203,8; setp.ne.u64 %r390,%r161,%r194; @ %r390 bra $L79; bra $L80; $L78: .loc 1 334 10 @ %r454 bra $L81; .loc 1 351 4 setp.gt.s64 %r392,%r145,0; @ %r392 bra $L91; .loc 1 333 14 mov.u64 %r154,0; bra $L83; $L81: .loc 1 336 8 add.u64 %r29,%r159,%r87; .loc 1 337 9 add.u64 %r155,%r156,%r44; .loc 1 338 4 @ %r458 bra $L92; .loc 1 338 11 mov.u64 %r154,%r66; $L85: .loc 1 340 11 ld.s8 %r395,[%r155]; cvt.u16.u32 %r394,%r395; setp.eq.u16 %r396,%r394,0; @ %r396 bra $L84; .loc 1 340 18 ld.f32 %r398,[%r29+4]; set.u32.eq.f32 %r399,%r398,%r217; neg.s32 %r400,%r399; ld.f32 %r402,[%r29]; set.u32.eq.f32 %r403,%r402,%r216; neg.s32 %r404,%r403; cvt.u16.u32 %r406,%r400; cvt.u16.u32 %r407,%r404; and.b16 %r405,%r406,%r407; cvt.u32.u16 %r408,%r405; cvt.u16.u8 %r409,%r408; setp.ne.u16 %r410,%r409,0; @ %r410 bra $L83; $L84: .loc 1 338 26 add.u64 
%r154,%r154,-1; .loc 1 338 34 add.u64 %r29,%r29,%r91; .loc 1 338 53 sub.u64 %r155,%r155,%r37; .loc 1 338 4 setp.ne.u64 %r411,%r154,0; @ %r411 bra $L85; bra $L83; $L91: .loc 1 351 4 mov.u64 %r153,%r156; mov.u64 %r104,%r159; .loc 1 351 11 mov.u64 %r154,1; $L82: .loc 1 353 11 ld.s8 %r413,[%r153]; cvt.u16.u32 %r412,%r413; setp.eq.u16 %r414,%r412,0; @ %r414 bra $L86; .loc 1 353 18 ld.f32 %r416,[%r104]; set.u32.eq.f32 %r417,%r416,%r216; neg.s32 %r418,%r417; ld.f32 %r420,[%r104+4]; set.u32.eq.f32 %r421,%r420,%r217; neg.s32 %r422,%r421; cvt.u16.u32 %r424,%r418; cvt.u16.u32 %r425,%r422; and.b16 %r423,%r424,%r425; cvt.u32.u16 %r426,%r423; cvt.u16.u8 %r427,%r426; setp.ne.u16 %r428,%r427,0; @ %r428 bra $L83; $L86: .loc 1 351 27 add.u64 %r154,%r154,1; .loc 1 351 35 add.u64 %r104,%r104,%r95; .loc 1 351 54 add.u64 %r153,%r153,%r37; .loc 1 351 4 setp.ge.s64 %r429,%r66,%r154; @ %r429 bra $L82; .loc 1 333 14 mov.u64 %r154,0; bra $L83; $L92: mov.u64 %r154,0; $L83: .loc 1 360 13 st.u64 [%r160],%r154; .loc 1 362 15 add.u64 %r213,%r213,1; st.u64 [%frame+480],%r213; .loc 1 363 12 add.u64 %r159,%r159,%r98; .loc 1 364 13 add.u64 %r156,%r156,%r99; .loc 1 365 12 add.u64 %r160,%r160,%r103; .loc 1 367 32 ld.u64 %r121,[%frame+360]; .loc 1 367 13 setp.ne.u64 %r430,%r213,%r121; @ %r430 bra $L78; mov.u64 %r118,%r451; mov.u64 %r119,%r101; mov.u64 %r211,%r99; mov.u64 %r116,%r96; mov.u64 %r112,8; .loc 1 366 9 mov.u64 %r158,0; .loc 1 369 13 mov.u64 %r432,%r158; $L89: st.u64 [%r118+-8],%r432; .loc 1 370 23 mul.lo.u64 %r106,%r121,%r116; .loc 1 371 24 mul.lo.u64 %r433,%r121,%r211; .loc 1 371 10 sub.u64 %r156,%r156,%r433; .loc 1 372 23 mul.lo.u64 %r111,%r121,%r119; .loc 1 373 5 add.u64 %r158,%r158,1; .loc 1 374 7 setp.le.s64 %r434,%r143,%r158; @ %r434 bra $L52; .loc 1 381 16 ld.u64 %r435,[%r118]; add.u64 %r115,%r435,1; st.u64 [%r118],%r115; .loc 1 382 23 add.u64 %r437,%r459,%r112; ld.u64 %r116,[%r437]; .loc 1 382 13 sub.u64 %r438,%r116,%r106; shl.b64 %r439,%r438,3; add.u64 %r159,%r159,%r439; .loc 1 383 23 add.u64 %r440,%frame,%r112; ld.u64 %r119,[%r440]; .loc 1 383 13 sub.u64 %r441,%r119,%r111; shl.b64 %r442,%r441,3; add.u64 %r160,%r160,%r442; .loc 1 367 32 add.u64 %r444,%r460,%r112; ld.u64 %r121,[%r444]; .loc 1 367 13 add.u64 %r118,%r118,8; setp.eq.u64 %r445,%r115,%r121; @ %r445 bra $L88; .loc 1 362 12 ld.u64 %r213,[%frame+480]; bra $L78; $L88: .loc 1 371 20 add.u64 %r447,%r461,%r112; ld.u64 %r211,[%r447]; add.u64 %r112,%r112,8; bra $L89; $L114: .loc 1 274 6 ld.u64 %r448,[%r214]; setp.eq.u64 %r449,%r448,0; @ ! 
%r449 bra $L64; bra $L90; $L52: .loc 1 387_gfortran_sfindloc1_c4 .visible .func _gfortran_sfindloc1_c368u64 %r126; .reg .u64u64 %r154; .reg .u32 %r155; .reg .predu16 %r173; .reg .u16 %r174; .reg .u32 %r175; .reg .u16 %r176; .reg .pred %r177; .reg .pred %r178; .reg .predpredpredu16 %r229; .reg .u32predu64predpredu64 %r303; .reg .u64 %r304; mov.u64 %r149,%ar0; mov.u64 %r150,%ar1; mov.f32 %r151,%ar2; mov.f32 %r152,%ar3; mov.u64 %r153,%ar4; mov.u64 %r154,%ar5; mov.u32 %r155,%ar6; .loc 1 410 6 setp.eq.u64 %r156,%r154,0; @ %r156 bra $L116; .loc 1 410 20 ld.u32 %r157,[%r154]; setp.eq.u32 %r158,%r157,0; @ %r158 bra $L117; $L116:.param .f32 %out_arg4; st.param.f32 [%out_arg4],%r152153155; call _gfortran_findloc1_c4413 7 bra $L115; $L117: .loc 1 416 10 ld.s8 %r23,[%r150+28]; .loc 1 416 38 add.u32 %r165,%r23,-1; .loc 1 416 8 cvt.s64.s32 %r90,%r165; .loc 1 417 10 ld.u64 %r26,[%r153]; .loc 1 417 7 add.u64 %r91,%r26,-1; .loc 1 419 7 shr.u64 %r167,%r91,63; set.u32.lt.s64 %r169,%r90,%r91; neg.s32 %r170,%r169; cvt.u16.u64 %r173,%r167; cvt.u16.u32 %r174,%r170; or.b16 %r172,%r173,%r174; .loc 1 419 6 cvt.u32.u16 %r175,%r172; cvt.u16.u8 %r176,%r175; setp.ne.u16 %r177,%r176,0; @ %r177 bra $L119; .loc 1 430 3 setp.ne.u64 %r178,%r91,0; @ %r178 bra $L120; $L126: .loc 1 438 3 setp.gt.s64 %r179,%r90,%r91; @ %r179 bra $L121; bra $L163; $L119: .loc 1 421 7 cvt.u32.u32 %r182,%r23; cvt.s64.s8 %r181,%r182; st.u64 [%stack+8],%r18180180_gfortran_runtime_error120: add.u64 %r136,%r150,48; add.u64 %r138,%frame,120; add.u64 %r139,%r150,56; add.u64 %r184,%r150,24; add.u64 %r186,%r26,%r26; add.u64 %r187,%r186,%r26; shl.b64 %r188,%r187,3; add.u64 %r145,%r184,%r188; .loc 1 435 12 mov.u64 %r304,0; $L125: .loc 1 432 19 ld.u64 %r190,[%r139]; add.u64 %r189,%r190,1; ld.u64 %r191,[%r136]; sub.u64 %r35,%r189,%r191; .loc 1 434 10 setp.le.s64 %r192,%r35,0; @ %r192 bra $L123; .loc 1 432 17 st.u64 [%r138],%r35; bra $L124; $L123: .loc 1 435 12 st.u64 [%r138],%r304; $L124: .loc 1 430 3 add.u64 %r136,%r136,24; add.u64 %r138,%r138,8; add.u64 %r139,%r139,24; setp.ne.u64 %r194,%r136,%r145; @ %r194 bra $L125; bra $L126; $L163: .loc 1 448 6 ld.u64 %r195,[%r149]; setp.eq.u64 %r196,%r195,0; @ ! 
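// _gfortran_sfindloc1_c4, declared above, takes its mask as a scalar: when the mask
// pointer in %ar5 is null, or the logical it points to is true, control reaches $L116 and
// the arguments are forwarded to the unmasked _gfortran_findloc1_c4; a false scalar mask
// branches to $L117, which appears to allocate the result and fill it with zeros ($L141).
// This is a reading of the visible control flow, not taken from the libgfortran sources.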
%r196 bra $L128; bra $L127; $L121: add.u64 %r198,%r26,%r26; add.u64 %r199,%r198,%r26; shl.b64 %r200,%r199,3; add.u64 %r201,%r200,48; add.u64 %r125,%r150,%r201; add.u64 %r296,%frame,120; shl.b64 %r203,%r26,3; add.u64 %r204,%r203,-8; add.u64 %r45,%r296,%r204; add.u64 %r205,%r150,48; cvt.u32.u32 %r207,%r23; cvt.s64.s8 %r206,%r207; add.u64 %r209,%r206,%r206; add.u64 %r210,%r209,%r206; shl.b64 %r211,%r210,3; add.u64 %r132,%r205,%r211; .loc 1 444 12 mov.u64 %r303,0; $L131: .loc 1 441 2 ld.u64 %r213,[%r125+8]; add.u64 %r212,%r213,1; ld.u64 %r214,[%r125]; sub.u64 %r40,%r212,%r214; .loc 1 443 10 setp.le.s64 %r215,%r40,0; @ %r215 bra $L129; .loc 1 440 17 st.u64 [%r45],%r40; bra $L130; $L129: .loc 1 444 12 st.u64 [%r45],%r303; $L130: .loc 1 438 3 add.u64 %r125,%r125,24; add.u64 %r45,%r45,8; setp.ne.u64 %r217,%r125,%r132; @ %r217 bra $L131; bra $L164; $L127: .loc 1 452 7 setp.le.s64 %r218,%r90,0; @ %r218 bra $L133; add.u64 %r296,%frame,120; $L149: add.u64 %r78,%r149,40; mov.u64 %r61,%r296; cvt.u32.u32 %r220,%r23; cvt.s64.s8 %r219,%r220; add.u64 %r222,%r219,%r219; add.u64 %r223,%r222,%r219; shl.b64 %r224,%r223,3; add.u64 %r225,%r149,16; add.u64 %r126,%r224,%r225; .loc 1 455 10 mov.u64 %r81,1; .loc 1 459 4 mov.u64 %r248,0; bra $L134; $L133: .loc 1 462 24 mov.u64 %r226,0; st.u64 [%r149+8],%r226; .loc 1 463 28 cvt.u16.u32 %r229,%r23; add.u16 %r228,%r229,-1; cvt.u32.u16 %r230,%r228; st.u8 [%r149+28],%r230; .loc 1 465 20 add.u32 %r231,%r23,-2; cvt.s64.s32 %r51,%r231; add.u64 %r233,%r51,%r51; add.u64 %r234,%r233,%r51; shl.b64 %r235,%r234,3; add.u64 %r236,%r149,%r235; .loc 1 465 67 shl.b64 %r238,%r51,3; add.u64 %r239,%frame,%r238; .loc 1 465 59 ld.u64 %r241,[%r236+40]; ld.u64 %r242,[%r239+120]; mul.lo.u64 %r94,%r241,%r242; .loc 1 467 29 mov.u64 %r244244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } .loc 1 467 27 st.u64 [%r149],%r245; .loc 1 468 10 setp.eq.u64 %r247,%r94,0; @ ! 
%r247 bra $L138; bra $L135; $L137: .loc 1 457 48 mul.lo.u64 %r81,%r47,%r81; $L134: .loc 1 459 4 st.u64 [%r78+8],%r248; ld.u64 %r47,[%r61]; add.u64 %r249,%r47,-1; st.u64 [%r78+16],%r249; st.u64 [%r78],%r81; .loc 1 452 7 add.u64 %r78,%r78,24; add.u64 %r61,%r61,8; setp.ne.u64 %r250,%r78,%r126; @ %r250 bra $L137; bra $L133; $L135: .loc 1 471 4 st.u64 [%r149+48],%r94; mov.u64 %r252,-1; st.u64 [%r149+56],%r252; mov.u64 %r253,1; st.u64 [%r149+40],%r253; .loc 1 472 4 bra $L115; $L128: .loc 1 477 19 ld.s8 %r58,[%r149+28]; .loc 1 477 10 setp.eq.u64 %r254,%r58,%r90; @ %r254 bra $L139; .loc 1 478 2 st.u64 [%stack+8],%r90; st.u64 [%stack],%r58;_gfortran_runtime_error139: .loc 1 483 11 cvta.global.u64 %r257,_gfortrani_compile_options; .loc 1 483 10 ld.u32 %r258,[%r257+36]; setp.eq.u32 %r259,%r258,0; @ %r259 bra $L138; .loc 1 484 2 add.u64 %r296,%frame,120; cvta.const.u64 %r263,$LC2r2962623; call _gfortrani_bounds_ifunction_return$L138: .loc 1 488 3 setp.gt.s64 %r265,%r90,0; @ %r265 bra $L140; $L143: .loc 1 495 8 ld.u64 %r97,[%r149]; ld.u64 %r42,[%frame+240]; .loc 1 503 22 ld.u64 %r64,[%frame]; .loc 1 503 12 shl.b64 %r66,%r64,3; setp.le.s64 %r297,%r90,1; cvt.u32.u32 %r299,%r23; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; add.u64 %r302,%frame,120; bra $L141; $L140: add.u64 %r114,%frame,240; add.u64 %r113,%r149,40; mov.u64 %r108,%frame; cvt.u32.u32 %r267,%r23; cvt.s64.s8 %r266,%r267; add.u64 %r79,%r266,-1; .loc 1 488 3 mov.u64 %r98,0; add.u64 %r296,%frame,120; .loc 1 490 16 mov.u64 %r268,%r98; $L142: st.u64 [%r114],%r268; .loc 1 491 18 ld.u64 %r269,[%r113]; st.u64 [%r108],%r269; .loc 1 492 17 shl.b64 %r271,%r98,3; add.u64 %r272,%r296,%r271; .loc 1 492 10 ld.u64 %r273,[%r272]; setp.le.s64 %r274,%r273,0; @ %r274 bra $L115; .loc 1 488 26 add.u64 %r98,%r98,1; .loc 1 488 3 add.u64 %r114,%r114,8; add.u64 %r113,%r113,24; add.u64 %r108,%r108,8; setp.ne.u64 %r275,%r79,%r98; @ %r275 bra $L142; bra $L143; $L141: .loc 1 500 13 mov.u64 %r276,0; st.u64 [%r97],%r276; .loc 1 502 15 add.u64 %r42,%r42,1; .loc 1 503 12 add.u64 %r97,%r97,%r66; .loc 1 505 32 ld.u64 %r88,[%frame+120]; .loc 1 505 13 setp.ne.u64 %r277,%r42,%r88; @ %r277 bra $L141; .loc 1 508 23 mul.lo.u64 %r278,%r64,%r42; .loc 1 508 9 shl.b64 %r279,%r278,3; sub.u64 %r30,%r97,%r279; .loc 1 510 7 @ ! %r297 bra $L165; bra $L115; $L148: .loc 1 507 13 st.u64 [%r102],%r276; .loc 1 508 23 mul.lo.u64 %r282,%r73,%r72; .loc 1 508 9 shl.b64 %r283,%r282,3; sub.u64 %r30,%r97,%r283; .loc 1 509 5 add.u64 %r96,%r96,1; .loc 1 510 7 add.u64 %r102,%r102,8; setp.ne.u64 %r284,%r96,%r301; @ %r284 bra $L147; bra $L115; $L165: add.u64 %r102,%frame,248; .loc 1 509 5 mov.u64 %r96,1; $L147: .loc 1 517 16 ld.u64 %r288,[%r102]; add.u64 %r72,%r288,1; st.u64 [%r102],%r72; shl.b64 %r120,%r96,3; .loc 1 518 23 add.u64 %r289,%frame,%r120; ld.u64 %r73,[%r289]; .loc 1 518 13 shl.b64 %r290,%r73,3; add.u64 %r97,%r30,%r290; .loc 1 505 32 add.u64 %r292,%r302,%r120; ld.u64 %r77,[%r292]; .loc 1 505 13 setp.eq.u64 %r293,%r72,%r77; @ %r293 bra $L148; .loc 1 507 13 mov.u64 %r42,0; bra $L141; $L164: .loc 1 448 6 ld.u64 %r294,[%r149]; setp.eq.u64 %r295,%r294,0; @ ! 
%r295 bra $L128; bra $L149; $L115: .loc 1 522findloc1_c8.o/_gfortran_findloc1_c8 .visible .func _gfortran_findloc1_c8.file 1 "../../../../libgfortran/generated/findloc1_c8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc1_c8 .visible .func _gfortran_mfindloc1_c8_gfortran_sfindloc1_c8 .visible .func _gfortran_sfindloc1_c8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_internal_error .extern .func _gfortrani_internal_,105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,10114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,914,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,108] = {70,73,78,68,76,79,67_gfortran_findloc1_c8 .visible .func _gfortran_findloc1_c8486410416u64 %r222; .reg .pred %r236; .reg .predpred %r247; .reg .predu64 %r265; .reg .u64 %r266; .reg .u64u64u64 %r281; .reg .u64 %r282; .reg .u16pred %r315; .reg .u64u6464pred %r333; .reg .f64 %r338; .reg .u32 %r339; .reg .u32 %r340; .reg .f64 %r342; .reg .u32 %r343; .reg .u32pred %r365; .reg .pred %r366; .reg .predpredu64 %r405; .reg .u64 %r406; mov.u64 %r195,%ar0; mov.u64 %r196,%ar1; mov.f64 %r197,%ar2; mov.f64 %r198,%ar3; mov.u64 %r199,%ar4; mov.u32 %r200,%ar5; .loc 1 54 10 ld.s8 %r22,[%r196+28]; .loc 1 54 38 add.u32 %r201,%r22,-1; .loc 1 54 8 cvt.s64.s32 %r118,%r201; .loc 1 55 10 ld.u64 %r25,[%r199]; .loc 1 55 7 add.u64 %r119,%r25,-1; .loc 1 57 7 shr.u64 %r203,%r119,63; set.u32.lt.s64 %r205,%r118,%r119; neg.s32 %r206,%r205; cvt.u16.u64 %r209,%r203; cvt.u16.u32 %r210,%r206; or.b16 %r208,%r209,%r210; .loc 1 57 6 cvt.u32.u16 %r211,%r208; cvt.u16.u8 %r212,%r211; setp.eq.u16 %r213,%r212,0; @ %r213 bra $L2; .loc 1 59 7 cvt.u32.u32 %r216,%r22; cvt.s64.s8 %r215,%r216; st.u64 [%stack+8],%r21514214_gfortran_runtime_errorr219,%r119,%r119; add.u64 %r220,%r219,%r119; shl.b64 %r221,%r220,3; add.u64 %r222,%r196,%r221; ld.u64 %r30,[%r222+56]; ld.u64 %r32,[%r222+48]; .loc 1 67 9 ld.u64 %r122,[%r222+40]; .loc 1 69 3 setp.ne.u64 %r236,%r119,0; @ %r236 bra $L3; $L9: .loc 1 77 3 setp.gt.s64 %r237,%r118,%r119; @ %r237 bra $L4; bra $L50; $L3: add.u64 %r63,%r196,40; add.u64 %r48,%frame,120; add.u64 %r133,%frame,240; add.u64 %r238,%r196,16; add.u64 %r240,%r25,%r25; add.u64 %r241,%r240,%r25; shl.b64 %r242,%r241,3; add.u64 %r47,%r238,%r242; .loc 1 75 12 mov.u64 %r406,0; $L8: .loc 1 71 18 ld.u64 %r243,[%r63]; st.u64 [%r48],%r243; .loc 1 72 19 ld.u64 %r245,[%r63+16]; add.u64 %r244,%r245,1; ld.u64 %r246,[%r63+8]; sub.u64 %r37,%r244,%r246; .loc 1 74 10 setp.lt.s64 %r247,%r37,0; @ %r247 bra $L6; .loc 1 72 17 st.u64 [%r133],%r37; bra $L7; $L6: .loc 1 75 12 st.u64 [%r133],%r406; $L7: .loc 1 69 3 add.u64 %r63,%r63,24; add.u64 %r48,%r48,8; add.u64 %r133,%r133,8; setp.ne.u64 %r249,%r47,%r63; @ %r249 bra $L8; bra $L9; $L50: .loc 1 86 6 ld.u64 %r250,[%r195]; setp.eq.u64 %r251,%r250,0; @ ! 
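// The archive member beginning above (findloc1_c8.o, built from
// libgfortran/generated/findloc1_c8.c per the .file directive) carries the same three
// entry points for complex(8): _gfortran_findloc1_c8, _gfortran_mfindloc1_c8 and
// _gfortran_sfindloc1_c8. Assuming it mirrors the complex(4) code above, the substantive
// differences in the loops that follow are the f64 compares and the 16-byte element
// stride (shl.b64 ...,4).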
%r251 bra $L11; bra $L10; $L4: add.u64 %r253,%r25,%r25; add.u64 %r254,%r253,%r25; shl.b64 %r255,%r254,3; add.u64 %r256,%r255,40; add.u64 %r161,%r196,%r256; shl.b64 %r257,%r25,3; add.u64 %r146,%r257,-8; add.u64 %r395,%frame,120; add.u64 %r151,%r395,%r146; add.u64 %r394,%frame,240; add.u64 %r143,%r394,%r146; add.u64 %r260,%r196,40; cvt.u32.u32 %r262,%r22; cvt.s64.s8 %r261,%r262; add.u64 %r264,%r261,%r261; add.u64 %r265,%r264,%r261; shl.b64 %r266,%r265,3; add.u64 %r104,%r260,%r266; .loc 1 83 12 mov.u64 %r405,0; $L14: .loc 1 79 18 ld.u64 %r267,[%r161]; st.u64 [%r151],%r267; .loc 1 80 19 ld.u64 %r269,[%r161+16]; add.u64 %r268,%r269,1; ld.u64 %r270,[%r161+8]; sub.u64 %r44,%r268,%r270; .loc 1 82 10 setp.lt.s64 %r271,%r44,0; @ %r271 bra $L12; .loc 1 80 17 st.u64 [%r143],%r44; bra $L13; $L12: .loc 1 83 12 st.u64 [%r143],%r405; $L13: .loc 1 77 3 add.u64 %r161,%r161,24; add.u64 %r151,%r151,8; add.u64 %r143,%r143,8; setp.ne.u64 %r273,%r104,%r161; @ %r273 bra $L14; bra $L51; $L10: .loc 1 90 7 setp.le.s64 %r274,%r118,0; @ %r274 bra $L16; add.u64 %r394,%frame,240; $L36: add.u64 %r170,%r195,40; mov.u64 %r168,%r394; add.u64 %r275,%r195,16; cvt.u32.u32 %r277,%r22; cvt.s64.s8 %r276,%r277; add.u64 %r279,%r276,%r276; add.u64 %r280,%r279,%r276; shl.b64 %r281,%r280,3; add.u64 %r162,%r275,%r281; .loc 1 93 10 mov.u64 %r110,1; .loc 1 97 4 mov.u64 %r304,0; bra $L17; $L16: .loc 1 101 24 mov.u64 %r282,0; st.u64 [%r195+8],%r282; .loc 1 102 28 cvt.u16.u32 %r285,%r22; add.u16 %r284,%r285,-1; cvt.u32.u16 %r286,%r284; st.u8 [%r195+28],%r286; .loc 1 104 20 add.u32 %r287,%r22,-2; cvt.s64.s32 %r55,%r287; add.u64 %r289,%r55,%r55; add.u64 %r290,%r289,%r55; shl.b64 %r291,%r290,3; add.u64 %r292,%r195,%r291; .loc 1 104 67 shl.b64 %r294,%r55,3; add.u64 %r295,%frame,%r294; .loc 1 104 59 ld.u64 %r297,[%r292+40]; ld.u64 %r298,[%r295+240]; mul.lo.u64 %r124,%r297,%r298; .loc 1 106 29 mov.u64 %r3001300; call (%value_in),_gfortrani_xmallocarray01,[%value_in]; } .loc 1 106 27 st.u64 [%r195],%r301; .loc 1 107 10 setp.eq.u64 %r303,%r124,0; @ ! 
%r303 bra $L21; bra $L18; $L20: .loc 1 95 48 mul.lo.u64 %r110,%r51,%r110; $L17: .loc 1 97 4 st.u64 [%r170+8],%r304; ld.u64 %r51,[%r168]; add.u64 %r305,%r51,-1; st.u64 [%r170+16],%r305; st.u64 [%r170],%r110; .loc 1 90 7 add.u64 %r170,%r170,24; add.u64 %r168,%r168,8; setp.ne.u64 %r306,%r162,%r170; @ %r306 bra $L20; bra $L16; $L18: .loc 1 110 4 st.u64 [%r195+48],%r124; mov.u64 %r308,-1; st.u64 [%r195+56],%r308; mov.u64 %r309,1; st.u64 [%r195+40],%r309; .loc 1 111 4 bra $L1; $L11: .loc 1 116 19 ld.s8 %r62,[%r195+28]; .loc 1 116 10 setp.eq.u64 %r310,%r62,%r118; @ %r310 bra $L23; .loc 1 117 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 122 11 cvta.global.u64 %r313,_gfortrani_compile_options; .loc 1 122 10 ld.u32 %r314,[%r313+36]; setp.eq.u32 %r315,%r314,0; @ %r315 bra $L21; .loc 1 123 2 add.u64 %r394,%frame,240; cvta.const.u64 %r319,$LC2r394318319; call _gfortrani_bounds_ifunction_return$L21: .loc 1 127 3 setp.gt.s64 %r321,%r118,0; @ %r321 bra $L24; $L27: .loc 1 64 9 add.u64 %r322,%r30,1; .loc 1 64 7 sub.u64 %r120,%r322,%r32; max.s64 %r42,%r120,0; .loc 1 135 8 ld.u64 %r109,[%r195]; .loc 1 138 8 ld.u64 %r131,[%r196]; .loc 1 160 35 shl.b64 %r73,%r122,4; .loc 1 147 22 add.u64 %r323,%r42,-1; .loc 1 147 15 mul.lo.u64 %r69,%r323,%r73; .loc 1 148 34 neg.s64 %r70,%r73; ld.u64 %r71,[%frame+360]; .loc 1 172 22 ld.u64 %r74,[%frame+120]; .loc 1 172 12 shl.b64 %r76,%r74,4; .loc 1 173 22 ld.u64 %r77,[%frame]; .loc 1 173 12 shl.b64 %r79,%r77,3; setp.ne.u32 %r393,%r200,0; setp.le.s64 %r396,%r118,1; .loc 1 148 4 setp.le.s64 %r398,%r120,0; .loc 1 160 4 setp.gt.s64 %r399,%r120,0; cvt.u32.u32 %r400,%r22; cvt.s64.s8 %r401,%r400; add.u64 %r402,%r401,-1; add.u64 %r403,%frame,120; add.u64 %r404,%frame,240; bra $L25; $L24: add.u64 %r183,%frame,360; add.u64 %r182,%r195,40; mov.u64 %r180,%frame; cvt.u32.u32 %r325,%r22; cvt.s64.s8 %r324,%r325; add.u64 %r171,%r324,-1; .loc 1 127 3 mov.u64 %r132,0; add.u64 %r394,%frame,240; .loc 1 129 16 mov.u64 %r326,%r132; $L26: st.u64 [%r183],%r326; .loc 1 130 18 ld.u64 %r327,[%r182]; st.u64 [%r180],%r327; .loc 1 131 17 shl.b64 %r329,%r132,3; add.u64 %r330,%r394,%r329; .loc 1 131 10 ld.u64 %r331,[%r330]; setp.le.s64 %r332,%r331,0; @ %r332 bra $L1; .loc 1 127 26 add.u64 %r132,%r132,1; .loc 1 127 3 add.u64 %r183,%r183,8; add.u64 %r182,%r182,24; add.u64 %r180,%r180,8; setp.ne.u64 %r333,%r132,%r171; @ %r333 bra $L26; bra $L27; $L25: .loc 1 145 10 @ %r393 bra $L28; .loc 1 160 4 @ %r399 bra $L37; .loc 1 144 14 mov.u64 %r126,0; bra $L30; $L28: .loc 1 147 8 add.u64 %r127,%r131,%r69; .loc 1 148 4 @ %r398 bra $L38; .loc 1 148 11 mov.u64 %r126,%r42; $L31: .loc 1 150 11 ld.f64 %r338,[%r127+8]; set.u32.eq.f64 %r339,%r338,%r198; neg.s32 %r340,%r339; ld.f64 %r342,[%r127]; set.u32.eq.f64 %r343,%r342,%r197; neg.s32 %r344,%r343; cvt.u16.u32 %r346,%r340; cvt.u16.u32 %r347,%r344; and.b16 %r345,%r346,%r347; cvt.u32.u16 %r348,%r345; cvt.u16.u8 %r349,%r348; setp.ne.u16 %r350,%r349,0; @ %r350 bra $L30; .loc 1 148 26 add.u64 %r126,%r126,-1; .loc 1 148 34 add.u64 %r127,%r127,%r70; .loc 1 148 4 setp.ne.u64 %r351,%r126,0; @ %r351 bra $L31; bra $L30; $L37: .loc 1 160 4 mov.u64 %r125,%r131; .loc 1 160 11 mov.u64 %r126,1; $L29: .loc 1 162 11 ld.f64 %r353,[%r125+8]; set.u32.eq.f64 %r354,%r353,%r198; neg.s32 %r355,%r354; ld.f64 %r357,[%r125]; set.u32.eq.f64 %r358,%r357,%r197; neg.s32 %r359,%r358; cvt.u16.u32 %r361,%r355; cvt.u16.u32 %r362,%r359; and.b16 %r360,%r361,%r362; cvt.u32.u16 %r363,%r360; cvt.u16.u8 %r364,%r363; setp.ne.u16 %r365,%r364,0; @ %r365 bra $L30; .loc 1 160 27 add.u64 
%r126,%r126,1; .loc 1 160 35 add.u64 %r125,%r125,%r73; .loc 1 160 4 setp.ge.s64 %r366,%r42,%r126; @ %r366 bra $L29; .loc 1 144 14 mov.u64 %r126,0; bra $L30; $L38: mov.u64 %r126,0; $L30: .loc 1 169 13 st.u64 [%r109],%r126; .loc 1 171 15 add.u64 %r71,%r71,1; .loc 1 172 12 add.u64 %r131,%r131,%r76; .loc 1 173 12 add.u64 %r109,%r109,%r79; .loc 1 175 32 ld.u64 %r138,[%frame+240]; .loc 1 175 13 setp.ne.u64 %r367,%r71,%r138; @ %r367 bra $L25; .loc 1 178 23 mul.lo.u64 %r368,%r74,%r71; .loc 1 178 9 shl.b64 %r369,%r368,4; sub.u64 %r128,%r131,%r369; .loc 1 179 23 mul.lo.u64 %r370,%r77,%r71; .loc 1 179 9 shl.b64 %r371,%r370,3; sub.u64 %r129,%r109,%r371; .loc 1 181 7 @ %r396 bra $L1; add.u64 %r68,%frame,368; mov.u64 %r155,8; .loc 1 180 5 mov.u64 %r130,1; .loc 1 177 13 mov.u64 %r397,0; bra $L34; $L35: st.u64 [%r68],%r397; .loc 1 178 23 mul.lo.u64 %r377,%r92,%r91; .loc 1 178 9 shl.b64 %r378,%r377,4; sub.u64 %r128,%r131,%r378; .loc 1 179 23 mul.lo.u64 %r379,%r95,%r91; .loc 1 179 9 shl.b64 %r380,%r379,3; sub.u64 %r129,%r109,%r380; .loc 1 180 5 add.u64 %r130,%r130,1; .loc 1 181 7 add.u64 %r68,%r68,8; add.u64 %r155,%r155,8; setp.eq.u64 %r381,%r130,%r402; @ %r381 bra $L1; $L34: .loc 1 188 16 ld.u64 %r382,[%r68]; add.u64 %r91,%r382,1; st.u64 [%r68],%r91; .loc 1 189 23 add.u64 %r384,%r403,%r155; ld.u64 %r92,[%r384]; .loc 1 189 13 shl.b64 %r385,%r92,4; add.u64 %r131,%r128,%r385; .loc 1 190 23 add.u64 %r386,%frame,%r155; ld.u64 %r95,[%r386]; .loc 1 190 13 shl.b64 %r387,%r95,3; add.u64 %r109,%r129,%r387; .loc 1 175 32 add.u64 %r389,%r404,%r155; ld.u64 %r98,[%r389]; .loc 1 175 13 setp.eq.u64 %r390,%r91,%r98; @ %r390 bra $L35; .loc 1 177 13 mov.u64 %r71,0; bra $L25; $L51: .loc 1 86 6 ld.u64 %r391,[%r195]; setp.eq.u64 %r392,%r391,0; @ ! %r392 bra $L11; bra $L36; $L1: .loc 1 194_gfortran_mfindloc1_c8 .visible .func _gfortran_mfindloc1_c8608u32 %r226; .reg .u16 %r228; .reg .u16 %r229; .reg .u16predu32 %r265; .reg .u32 %r266; .reg .u32 %r267; .reg .u32 %r269; .reg .u32 %r270; .reg .u16 %r271; .reg .u16 %r272; .reg .u16 %r273; .reg .u32predu64 %r282; .reg .u64 %r283; .reg .u64u64predu64pred %r331; .reg .pred %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u32u16 %r342; .reg .u16predpredu32pred %r389; .reg .pred %r390; .reg .predpred %r411; .reg .u16 %r412; .reg .u32 %r413; .reg .pred %r414; .reg .f64pred %r428; .reg .predu64 %r433; .reg .pred.reg .u64pred %r445; .reg .u64 %r447; .reg .u64 %r448; .reg .predpredmov.f64 %r216,%ar2; mov.f64 %r217,%ar3; mov.u64 %r218,%ar4; mov.u64 %r219,%ar5; mov.u32 %r220,%ar6; .loc 1 225 10 ld.s8 %r22,[%r215+28]; .loc 1 225 38 add.u32 %r221,%r22,-1; .loc 1 225 8 cvt.s64.s32 %r143,%r221; .loc 1 226 10 ld.u64 %r25,[%r218]; .loc 1 226 7 add.u64 %r144,%r25,-1; .loc 1 228 7 shr.u64 %r223,%r144,63; set.u32.lt.s64 %r225,%r143,%r144; neg.s32 %r226,%r225; cvt.u16.u64 %r229,%r223; cvt.u16.u32 %r230,%r226; or.b16 %r228,%r229,%r230; .loc 1 228 6 cvt.u32.u16 %r231,%r228; cvt.u16.u8 %r232,%r231; setp.eq.u16 %r233,%r232,0; @ %r233 bra $L53; .loc 1 230 7 cvt.u32.u32 %r236,%r22; cvt.s64.s8 %r235,%r236; st.u64 [%stack+8],%r235_gfortran_runtime_error3: .loc 1 235 9 add.u64 %r239,%r144,%r144; add.u64 %r240,%r239,%r144; shl.b64 %r241,%r240,3; add.u64 %r242,%r215,%r241; ld.u64 %r31,[%r242+56]; ld.u64 %r33,[%r242+48]; .loc 1 239 9 ld.u64 %r147,[%r242+40]; .loc 1 240 12 add.u64 %r260,%r219,%r241; ld.u64 %r34,[%r260+40]; ld.u64 %r36,[%r219+16]; .loc 1 242 9 ld.u64 %r156,[%r219]; .loc 1 246 22 cvt.u32.u64 %r39,%r36; .loc 1 246 53 add.u32 %r262,%r39,-4; and.b32 %r263,%r262,-5; set.u32.eq.u32 %r265,%r263,0; neg.s32 
%r266,%r265; .loc 1 246 22 add.u32 %r267,%r39,-1; set.u32.le.u32 %r269,%r267,1; neg.s32 %r270,%r269; .loc 1 246 6 cvt.u16.u32 %r272,%r266; cvt.u16.u32 %r273,%r270; or.b16 %r271,%r272,%r273; cvt.u32.u16 %r274,%r271; cvt.u16.u8 %r275,%r274; setp.eq.u16 %r276,%r275,0; @ %r276 bra $L54; $L58: .loc 1 255 3 setp.ne.u64 %r277,%r144,0; @ %r277 bra $L55; $L62: .loc 1 264 3 setp.gt.s64 %r278,%r143,%r144; @ %r278 bra $L56; bra $L113; $L54: .loc 1 248 7 setp.eq.u32 %r280,%r39,16; @ %r280 bra $L58; .loc 1 253 5 cvta.const.u64 %r282,$LC4; mov.u64 %r281,call _gfortrani_internal_error5: add.u64 %r125,%r215,40; add.u64 %r84,%r219,40; add.u64 %r283,%r215,16; add.u64 %r285,%r25,%r25; add.u64 %r286,%r285,%r25; shl.b64 %r287,%r286,3; add.u64 %r65,%r283,%r287; .loc 1 255 3 mov.u64 %r142,0; add.u64 %r453,%frame,240; add.u64 %r450,%frame,120; add.u64 %r452,%frame,360; .loc 1 262 12 mov.u64 %r466,%r142; $L61: .loc 1 257 18 add.u64 %r289,%r453,%r142; ld.u64 %r290,[%r125]; st.u64 [%r289],%r290; .loc 1 258 18 add.u64 %r292,%r450,%r142; .loc 1 258 20 ld.u64 %r294,[%r84]; mul.lo.u64 %r293,%r294,%r36; .loc 1 258 18 st.u64 [%r292],%r293; .loc 1 259 19 ld.u64 %r296,[%r125+16]; add.u64 %r295,%r296,1; ld.u64 %r297,[%r125+8]; sub.u64 %r53,%r295,%r297; .loc 1 261 10 setp.lt.s64 %r298,%r53,0; @ %r298 bra $L59; .loc 1 259 17 add.u64 %r300,%r452,%r142; st.u64 [%r300],%r53; bra $L60; $L59: .loc 1 262 12 add.u64 %r302,%r452,%r142; st.u64 [%r302],%r466; $L60: .loc 1 255 3 add.u64 %r125,%r125,24; add.u64 %r84,%r84,24; add.u64 %r142,%r142,8; setp.ne.u64 %r304,%r65,%r125; @ %r304 bra $L61; bra $L62; $L113: .loc 1 274 6 ld.u64 %r305,[%r214]; setp.eq.u64 %r306,%r305,0; @ ! %r306 bra $L64; bra $L63; $L56: add.u64 %r308,%r25,%r25; add.u64 %r309,%r308,%r25; shl.b64 %r310,%r309,3; add.u64 %r174,%r310,40; add.u64 %r179,%r215,%r174; add.u64 %r172,%r219,%r174; shl.b64 %r311,%r25,3; add.u64 %r152,%r311,-8; cvt.u32.u32 %r313,%r22; cvt.s64.s8 %r312,%r313; shl.b64 %r314,%r312,3; add.u64 %r129,%r314,-8; add.u64 %r453,%frame,240; add.u64 %r450,%frame,120; add.u64 %r452,%frame,360; .loc 1 271 12 mov.u64 %r465,0; $L67: .loc 1 266 18 add.u64 %r316,%r453,%r152; ld.u64 %r317,[%r179]; st.u64 [%r316],%r317; .loc 1 267 18 add.u64 %r319,%r450,%r152; .loc 1 267 20 ld.u64 %r321,[%r172]; mul.lo.u64 %r320,%r321,%r36; .loc 1 267 18 st.u64 [%r319],%r320; .loc 1 268 19 ld.u64 %r323,[%r179+16]; add.u64 %r322,%r323,1; ld.u64 %r324,[%r179+8]; sub.u64 %r62,%r322,%r324; .loc 1 270 10 setp.lt.s64 %r325,%r62,0; @ %r325 bra $L65; .loc 1 268 17 add.u64 %r327,%r452,%r152; st.u64 [%r327],%r62; bra $L66; $L65: .loc 1 271 12 add.u64 %r329,%r452,%r152; st.u64 [%r329],%r465; $L66: .loc 1 264 3 add.u64 %r179,%r179,24; add.u64 %r172,%r172,24; add.u64 %r152,%r152,8; setp.ne.u64 %r331,%r129,%r152; @ %r331 bra $L67; bra $L114; $L63: .loc 1 278 7 setp.le.s64 %r332,%r143,0; @ %r332 bra $L69; add.u64 %r452,%frame,360; $L90: add.u64 %r193,%r214,40; mov.u64 %r191,%r452; add.u64 %r333,%r214,16; cvt.u32.u32 %r335,%r22; cvt.s64.s8 %r334,%r335; add.u64 %r337,%r334,%r334; add.u64 %r338,%r337,%r334; shl.b64 %r339,%r338,3; add.u64 %r185,%r333,%r339; .loc 1 281 10 mov.u64 %r134,1; .loc 1 285 4 mov.u64 %r362,0; bra $L70; $L69: .loc 1 289 24 mov.u64 %r340,0; st.u64 [%r214+8],%r340; .loc 1 290 28 cvt.u16.u32 %r343,%r22; add.u16 %r342,%r343,-1; cvt.u32.u16 %r344,%r342; st.u8 [%r214+28],%r344; .loc 1 292 20 add.u32 %r345,%r22,-2; cvt.s64.s32 %r73,%r345; add.u64 %r347,%r73,%r73; add.u64 %r348,%r347,%r73; shl.b64 %r349,%r348,3; add.u64 %r350,%r214,%r349; .loc 1 292 67 shl.b64 %r352,%r73,3; add.u64 
%r353,%frame,%r352; .loc 1 292 59 ld.u64 %r355,[%r350+40]; ld.u64 %r356,[%r353+360]; mul.lo.u64 %r150,%r355,%r356; .loc 1 294 29 mov.u64 %r358call (%value_in),_gfortrani_xmallocarray59,[%value_in]; } .loc 1 294 27 st.u64 [%r214],%r359; .loc 1 295 10 setp.eq.u64 %r361,%r150,0; @ ! %r361 bra $L74; bra $L71; $L73: .loc 1 283 48 mul.lo.u64 %r134,%r69,%r134; $L70: .loc 1 285 4 st.u64 [%r193+8],%r362; ld.u64 %r69,[%r191]; add.u64 %r363,%r69,-1; st.u64 [%r193+16],%r363; st.u64 [%r193],%r134; .loc 1 278 7 add.u64 %r193,%r193,24; add.u64 %r191,%r191,8; setp.ne.u64 %r364,%r185,%r193; @ %r364 bra $L73; bra $L69; $L71: .loc 1 298 4 st.u64 [%r214+48],%r150; mov.u64 %r366,-1; st.u64 [%r214+56],%r366; mov.u64 %r367,1; st.u64 [%r214+40],%r367; .loc 1 299 4 bra $L52; $L64: .loc 1 304 19 ld.s8 %r80,[%r214+28]; .loc 1 304 10 setp.eq.u64 %r368,%r80,%r143; @ %r368 bra $L76; .loc 1 305 2 st.u64 [%stack+8],%r143; st.u64 [%stack],%r80;_gfortran_runtime_error76: .loc 1 310 11 cvta.global.u64 %r371,_gfortrani_compile_options; .loc 1 310 10 ld.u32 %r372,[%r371+36]; setp.eq.u32 %r373,%r372,0; @ %r373 bra $L74; .loc 1 311 2 add.u64 %r452,%frame,360; cvta.const.u64 %r377,$LC2376377; call _gfortrani_bounds_ifunction_return$L74: .loc 1 315 3 setp.gt.s64 %r379,%r143,0; @ %r379 bra $L77; $L80: .loc 1 235 9 add.u64 %r380,%r31,1; .loc 1 235 7 sub.u64 %r145,%r380,%r33; max.s64 %r66,%r145,0; .loc 1 240 12 mul.lo.u64 %r37,%r34,%r36; .loc 1 323 8 ld.u64 %r160,[%r214]; .loc 1 326 8 ld.u64 %r159,[%r215]; .loc 1 351 35 shl.b64 %r95,%r147,4; .loc 1 336 22 add.u64 %r64,%r66,-1; .loc 1 336 15 mul.lo.u64 %r87,%r64,%r95; .loc 1 337 29 mul.lo.u64 %r44,%r37,%r64; .loc 1 338 34 neg.s64 %r91,%r95; .loc 1 363 22 ld.u64 %r96,[%frame+240]; .loc 1 363 12 shl.b64 %r98,%r96,4; .loc 1 364 23 ld.u64 %r99,[%frame+120]; .loc 1 365 22 ld.u64 %r101,[%frame]; .loc 1 365 12 shl.b64 %r103,%r101,3; .loc 1 362 12 ld.u64 %r213,[%frame+480]; setp.ne.u32 %r454,%r220,0; add.u64 %r451,%frame,488; .loc 1 338 4 setp.le.s64 %r458,%r145,0; .loc 1 382 23 add.u64 %r459,%frame,240; .loc 1 367 32 add.u64 %r460,%frame,360; .loc 1 371 20 add.u64 %r461,%frame,120; bra $L78; $L77: add.u64 %r206,%frame,480; add.u64 %r205,%r214,40; mov.u64 %r203,%frame; cvt.u32.u32 %r382,%r22; cvt.s64.s8 %r381,%r382; add.u64 %r194,%r381,-1; .loc 1 315 3 mov.u64 %r161,0; add.u64 %r452,%frame,360; .loc 1 317 16 mov.u64 %r383,%r161; $L79: st.u64 [%r206],%r383; .loc 1 318 18 ld.u64 %r384,[%r205]; st.u64 [%r203],%r384; .loc 1 319 17 shl.b64 %r386,%r161,3; add.u64 %r387,%r452,%r386; .loc 1 319 10 ld.u64 %r388,[%r387]; setp.le.s64 %r389,%r388,0; @ %r389 bra $L52; .loc 1 315 26 add.u64 %r161,%r161,1; .loc 1 315 3 add.u64 %r206,%r206,8; add.u64 %r205,%r205,24; add.u64 %r203,%r203,8; setp.ne.u64 %r390,%r161,%r194; @ %r390 bra $L79; bra $L80; $L78: .loc 1 334 10 @ %r454 bra $L81; .loc 1 351 4 setp.gt.s64 %r392,%r145,0; @ %r392 bra $L91; .loc 1 333 14 mov.u64 %r154,0; bra $L83; $L81: .loc 1 336 8 add.u64 %r29,%r159,%r87; .loc 1 337 9 add.u64 %r155,%r156,%r44; .loc 1 338 4 @ %r458 bra $L92; .loc 1 338 11 mov.u64 %r154,%r66; $L85: .loc 1 340 11 ld.s8 %r395,[%r155]; cvt.u16.u32 %r394,%r395; setp.eq.u16 %r396,%r394,0; @ %r396 bra $L84; .loc 1 340 18 ld.f64 %r398,[%r29+8]; set.u32.eq.f64 %r399,%r398,%r217; neg.s32 %r400,%r399; ld.f64 %r402,[%r29]; set.u32.eq.f64 %r403,%r402,%r216; neg.s32 %r404,%r403; cvt.u16.u32 %r406,%r400; cvt.u16.u32 %r407,%r404; and.b16 %r405,%r406,%r407; cvt.u32.u16 %r408,%r405; cvt.u16.u8 %r409,%r408; setp.ne.u16 %r410,%r409,0; @ %r410 bra $L83; $L84: .loc 1 338 26 add.u64 
%r154,%r154,-1; .loc 1 338 34 add.u64 %r29,%r29,%r91; .loc 1 338 53 sub.u64 %r155,%r155,%r37; .loc 1 338 4 setp.ne.u64 %r411,%r154,0; @ %r411 bra $L85; bra $L83; $L91: .loc 1 351 4 mov.u64 %r153,%r156; mov.u64 %r104,%r159; .loc 1 351 11 mov.u64 %r154,1; $L82: .loc 1 353 11 ld.s8 %r413,[%r153]; cvt.u16.u32 %r412,%r413; setp.eq.u16 %r414,%r412,0; @ %r414 bra $L86; .loc 1 353 18 ld.f64 %r416,[%r104]; set.u32.eq.f64 %r417,%r416,%r216; neg.s32 %r418,%r417; ld.f64 %r420,[%r104+8]; set.u32.eq.f64 %r421,%r420,%r217; neg.s32 %r422,%r421; cvt.u16.u32 %r424,%r418; cvt.u16.u32 %r425,%r422; and.b16 %r423,%r424,%r425; cvt.u32.u16 %r426,%r423; cvt.u16.u8 %r427,%r426; setp.ne.u16 %r428,%r427,0; @ %r428 bra $L83; $L86: .loc 1 351 27 add.u64 %r154,%r154,1; .loc 1 351 35 add.u64 %r104,%r104,%r95; .loc 1 351 54 add.u64 %r153,%r153,%r37; .loc 1 351 4 setp.ge.s64 %r429,%r66,%r154; @ %r429 bra $L82; .loc 1 333 14 mov.u64 %r154,0; bra $L83; $L92: mov.u64 %r154,0; $L83: .loc 1 360 13 st.u64 [%r160],%r154; .loc 1 362 15 add.u64 %r213,%r213,1; st.u64 [%frame+480],%r213; .loc 1 363 12 add.u64 %r159,%r159,%r98; .loc 1 364 13 add.u64 %r156,%r156,%r99; .loc 1 365 12 add.u64 %r160,%r160,%r103; .loc 1 367 32 ld.u64 %r121,[%frame+360]; .loc 1 367 13 setp.ne.u64 %r430,%r213,%r121; @ %r430 bra $L78; mov.u64 %r118,%r451; mov.u64 %r119,%r101; mov.u64 %r211,%r99; mov.u64 %r116,%r96; mov.u64 %r112,8; .loc 1 366 9 mov.u64 %r158,0; .loc 1 369 13 mov.u64 %r432,%r158; $L89: st.u64 [%r118+-8],%r432; .loc 1 370 23 mul.lo.u64 %r106,%r121,%r116; .loc 1 371 24 mul.lo.u64 %r433,%r121,%r211; .loc 1 371 10 sub.u64 %r156,%r156,%r433; .loc 1 372 23 mul.lo.u64 %r111,%r121,%r119; .loc 1 373 5 add.u64 %r158,%r158,1; .loc 1 374 7 setp.le.s64 %r434,%r143,%r158; @ %r434 bra $L52; .loc 1 381 16 ld.u64 %r435,[%r118]; add.u64 %r115,%r435,1; st.u64 [%r118],%r115; .loc 1 382 23 add.u64 %r437,%r459,%r112; ld.u64 %r116,[%r437]; .loc 1 382 13 sub.u64 %r438,%r116,%r106; shl.b64 %r439,%r438,4; add.u64 %r159,%r159,%r439; .loc 1 383 23 add.u64 %r440,%frame,%r112; ld.u64 %r119,[%r440]; .loc 1 383 13 sub.u64 %r441,%r119,%r111; shl.b64 %r442,%r441,3; add.u64 %r160,%r160,%r442; .loc 1 367 32 add.u64 %r444,%r460,%r112; ld.u64 %r121,[%r444]; .loc 1 367 13 add.u64 %r118,%r118,8; setp.eq.u64 %r445,%r115,%r121; @ %r445 bra $L88; .loc 1 362 12 ld.u64 %r213,[%frame+480]; bra $L78; $L88: .loc 1 371 20 add.u64 %r447,%r461,%r112; ld.u64 %r211,[%r447]; add.u64 %r112,%r112,8; bra $L89; $L114: .loc 1 274 6 ld.u64 %r448,[%r214]; setp.eq.u64 %r449,%r448,0; @ ! 
%r449 bra $L64; bra $L90; $L52: .loc 1 387_gfortran_sfindloc1_c8 .visible .func _gfortran_sfindloc1_c8368u64 %r126; .reg .u64u64 %r154; .reg .u32 %r155; .reg .predu16 %r173; .reg .u16 %r174; .reg .u32 %r175; .reg .u16 %r176; .reg .pred %r177; .reg .pred %r178; .reg .predpredpredu16 %r229; .reg .u32predu64predpredu64 %r303; .reg .u64 %r304; mov.u64 %r149,%ar0; mov.u64 %r150,%ar1; mov.f64 %r151,%ar2; mov.f64 %r152,%ar3; mov.u64 %r153,%ar4; mov.u64 %r154,%ar5; mov.u32 %r155,%ar6; .loc 1 410 6 setp.eq.u64 %r156,%r154,0; @ %r156 bra $L116; .loc 1 410 20 ld.u32 %r157,[%r154]; setp.eq.u32 %r158,%r157,0; @ %r158 bra $L117; $L116:.param .f64 %out_arg4; st.param.f64 [%out_arg4],%r152153155; call _gfortran_findloc1_c8413 7 bra $L115; $L117: .loc 1 416 10 ld.s8 %r23,[%r150+28]; .loc 1 416 38 add.u32 %r165,%r23,-1; .loc 1 416 8 cvt.s64.s32 %r90,%r165; .loc 1 417 10 ld.u64 %r26,[%r153]; .loc 1 417 7 add.u64 %r91,%r26,-1; .loc 1 419 7 shr.u64 %r167,%r91,63; set.u32.lt.s64 %r169,%r90,%r91; neg.s32 %r170,%r169; cvt.u16.u64 %r173,%r167; cvt.u16.u32 %r174,%r170; or.b16 %r172,%r173,%r174; .loc 1 419 6 cvt.u32.u16 %r175,%r172; cvt.u16.u8 %r176,%r175; setp.ne.u16 %r177,%r176,0; @ %r177 bra $L119; .loc 1 430 3 setp.ne.u64 %r178,%r91,0; @ %r178 bra $L120; $L126: .loc 1 438 3 setp.gt.s64 %r179,%r90,%r91; @ %r179 bra $L121; bra $L163; $L119: .loc 1 421 7 cvt.u32.u32 %r182,%r23; cvt.s64.s8 %r181,%r182; st.u64 [%stack+8],%r18180180_gfortran_runtime_error120: add.u64 %r136,%r150,48; add.u64 %r138,%frame,120; add.u64 %r139,%r150,56; add.u64 %r184,%r150,24; add.u64 %r186,%r26,%r26; add.u64 %r187,%r186,%r26; shl.b64 %r188,%r187,3; add.u64 %r145,%r184,%r188; .loc 1 435 12 mov.u64 %r304,0; $L125: .loc 1 432 19 ld.u64 %r190,[%r139]; add.u64 %r189,%r190,1; ld.u64 %r191,[%r136]; sub.u64 %r35,%r189,%r191; .loc 1 434 10 setp.le.s64 %r192,%r35,0; @ %r192 bra $L123; .loc 1 432 17 st.u64 [%r138],%r35; bra $L124; $L123: .loc 1 435 12 st.u64 [%r138],%r304; $L124: .loc 1 430 3 add.u64 %r136,%r136,24; add.u64 %r138,%r138,8; add.u64 %r139,%r139,24; setp.ne.u64 %r194,%r136,%r145; @ %r194 bra $L125; bra $L126; $L163: .loc 1 448 6 ld.u64 %r195,[%r149]; setp.eq.u64 %r196,%r195,0; @ ! 
%r196 bra $L128; bra $L127; $L121: add.u64 %r198,%r26,%r26; add.u64 %r199,%r198,%r26; shl.b64 %r200,%r199,3; add.u64 %r201,%r200,48; add.u64 %r125,%r150,%r201; add.u64 %r296,%frame,120; shl.b64 %r203,%r26,3; add.u64 %r204,%r203,-8; add.u64 %r45,%r296,%r204; add.u64 %r205,%r150,48; cvt.u32.u32 %r207,%r23; cvt.s64.s8 %r206,%r207; add.u64 %r209,%r206,%r206; add.u64 %r210,%r209,%r206; shl.b64 %r211,%r210,3; add.u64 %r132,%r205,%r211; .loc 1 444 12 mov.u64 %r303,0; $L131: .loc 1 441 2 ld.u64 %r213,[%r125+8]; add.u64 %r212,%r213,1; ld.u64 %r214,[%r125]; sub.u64 %r40,%r212,%r214; .loc 1 443 10 setp.le.s64 %r215,%r40,0; @ %r215 bra $L129; .loc 1 440 17 st.u64 [%r45],%r40; bra $L130; $L129: .loc 1 444 12 st.u64 [%r45],%r303; $L130: .loc 1 438 3 add.u64 %r125,%r125,24; add.u64 %r45,%r45,8; setp.ne.u64 %r217,%r125,%r132; @ %r217 bra $L131; bra $L164; $L127: .loc 1 452 7 setp.le.s64 %r218,%r90,0; @ %r218 bra $L133; add.u64 %r296,%frame,120; $L149: add.u64 %r78,%r149,40; mov.u64 %r61,%r296; cvt.u32.u32 %r220,%r23; cvt.s64.s8 %r219,%r220; add.u64 %r222,%r219,%r219; add.u64 %r223,%r222,%r219; shl.b64 %r224,%r223,3; add.u64 %r225,%r149,16; add.u64 %r126,%r224,%r225; .loc 1 455 10 mov.u64 %r81,1; .loc 1 459 4 mov.u64 %r248,0; bra $L134; $L133: .loc 1 462 24 mov.u64 %r226,0; st.u64 [%r149+8],%r226; .loc 1 463 28 cvt.u16.u32 %r229,%r23; add.u16 %r228,%r229,-1; cvt.u32.u16 %r230,%r228; st.u8 [%r149+28],%r230; .loc 1 465 20 add.u32 %r231,%r23,-2; cvt.s64.s32 %r51,%r231; add.u64 %r233,%r51,%r51; add.u64 %r234,%r233,%r51; shl.b64 %r235,%r234,3; add.u64 %r236,%r149,%r235; .loc 1 465 67 shl.b64 %r238,%r51,3; add.u64 %r239,%frame,%r238; .loc 1 465 59 ld.u64 %r241,[%r236+40]; ld.u64 %r242,[%r239+120]; mul.lo.u64 %r94,%r241,%r242; .loc 1 467 29 mov.u64 %r244244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } .loc 1 467 27 st.u64 [%r149],%r245; .loc 1 468 10 setp.eq.u64 %r247,%r94,0; @ ! 
%r247 bra $L138; bra $L135; $L137: .loc 1 457 48 mul.lo.u64 %r81,%r47,%r81; $L134: .loc 1 459 4 st.u64 [%r78+8],%r248; ld.u64 %r47,[%r61]; add.u64 %r249,%r47,-1; st.u64 [%r78+16],%r249; st.u64 [%r78],%r81; .loc 1 452 7 add.u64 %r78,%r78,24; add.u64 %r61,%r61,8; setp.ne.u64 %r250,%r78,%r126; @ %r250 bra $L137; bra $L133; $L135: .loc 1 471 4 st.u64 [%r149+48],%r94; mov.u64 %r252,-1; st.u64 [%r149+56],%r252; mov.u64 %r253,1; st.u64 [%r149+40],%r253; .loc 1 472 4 bra $L115; $L128: .loc 1 477 19 ld.s8 %r58,[%r149+28]; .loc 1 477 10 setp.eq.u64 %r254,%r58,%r90; @ %r254 bra $L139; .loc 1 478 2 st.u64 [%stack+8],%r90; st.u64 [%stack],%r58;_gfortran_runtime_error139: .loc 1 483 11 cvta.global.u64 %r257,_gfortrani_compile_options; .loc 1 483 10 ld.u32 %r258,[%r257+36]; setp.eq.u32 %r259,%r258,0; @ %r259 bra $L138; .loc 1 484 2 add.u64 %r296,%frame,120; cvta.const.u64 %r263,$LC2r2962623; call _gfortrani_bounds_ifunction_return$L138: .loc 1 488 3 setp.gt.s64 %r265,%r90,0; @ %r265 bra $L140; $L143: .loc 1 495 8 ld.u64 %r97,[%r149]; ld.u64 %r42,[%frame+240]; .loc 1 503 22 ld.u64 %r64,[%frame]; .loc 1 503 12 shl.b64 %r66,%r64,3; setp.le.s64 %r297,%r90,1; cvt.u32.u32 %r299,%r23; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; add.u64 %r302,%frame,120; bra $L141; $L140: add.u64 %r114,%frame,240; add.u64 %r113,%r149,40; mov.u64 %r108,%frame; cvt.u32.u32 %r267,%r23; cvt.s64.s8 %r266,%r267; add.u64 %r79,%r266,-1; .loc 1 488 3 mov.u64 %r98,0; add.u64 %r296,%frame,120; .loc 1 490 16 mov.u64 %r268,%r98; $L142: st.u64 [%r114],%r268; .loc 1 491 18 ld.u64 %r269,[%r113]; st.u64 [%r108],%r269; .loc 1 492 17 shl.b64 %r271,%r98,3; add.u64 %r272,%r296,%r271; .loc 1 492 10 ld.u64 %r273,[%r272]; setp.le.s64 %r274,%r273,0; @ %r274 bra $L115; .loc 1 488 26 add.u64 %r98,%r98,1; .loc 1 488 3 add.u64 %r114,%r114,8; add.u64 %r113,%r113,24; add.u64 %r108,%r108,8; setp.ne.u64 %r275,%r79,%r98; @ %r275 bra $L142; bra $L143; $L141: .loc 1 500 13 mov.u64 %r276,0; st.u64 [%r97],%r276; .loc 1 502 15 add.u64 %r42,%r42,1; .loc 1 503 12 add.u64 %r97,%r97,%r66; .loc 1 505 32 ld.u64 %r88,[%frame+120]; .loc 1 505 13 setp.ne.u64 %r277,%r42,%r88; @ %r277 bra $L141; .loc 1 508 23 mul.lo.u64 %r278,%r64,%r42; .loc 1 508 9 shl.b64 %r279,%r278,3; sub.u64 %r30,%r97,%r279; .loc 1 510 7 @ ! %r297 bra $L165; bra $L115; $L148: .loc 1 507 13 st.u64 [%r102],%r276; .loc 1 508 23 mul.lo.u64 %r282,%r73,%r72; .loc 1 508 9 shl.b64 %r283,%r282,3; sub.u64 %r30,%r97,%r283; .loc 1 509 5 add.u64 %r96,%r96,1; .loc 1 510 7 add.u64 %r102,%r102,8; setp.ne.u64 %r284,%r96,%r301; @ %r284 bra $L147; bra $L115; $L165: add.u64 %r102,%frame,248; .loc 1 509 5 mov.u64 %r96,1; $L147: .loc 1 517 16 ld.u64 %r288,[%r102]; add.u64 %r72,%r288,1; st.u64 [%r102],%r72; shl.b64 %r120,%r96,3; .loc 1 518 23 add.u64 %r289,%frame,%r120; ld.u64 %r73,[%r289]; .loc 1 518 13 shl.b64 %r290,%r73,3; add.u64 %r97,%r30,%r290; .loc 1 505 32 add.u64 %r292,%r302,%r120; ld.u64 %r77,[%r292]; .loc 1 505 13 setp.eq.u64 %r293,%r72,%r77; @ %r293 bra $L148; .loc 1 507 13 mov.u64 %r42,0; bra $L141; $L164: .loc 1 448 6 ld.u64 %r294,[%r149]; setp.eq.u64 %r295,%r294,0; @ ! 
%r295 bra $L128; bra $L149; $L115: .loc 1 522findloc1_c10.o/ 1622802252findloc1_s1.o/_gfortran_findloc1_s1 .visible .func _gfortran_findloc1_s1; .file 1 "../../../../libgfortran/generated/findloc1_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc1_s1 .visible .func _gfortran_mfindloc1_s1); // BEGIN GLOBAL FUNCTION DECL: _gfortran_sfindloc1_s1 .visible .func _gfortran_sfindloc1_s1); // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortran_compare_gfortran_compare); // BEGIN GLOBAL FUNCTION DECL: _gfortrani_internal_error .extern .func _gfortrani_internal_,105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,1218] = {70,73,78,68,7114,101,116,117,114,110,32,118,97,108,11_gfortran_findloc1_s1 .visible .func _gfortran_findloc1_s148predu64u64 %r258; .reg .u64pred %r262; .reg .pred %r264; .reg .pred %r265; .reg .u64u64predu32 %r316; .reg .u64 %r317; .reg .u64pred %r323; .reg .predu32 %r339; .reg .pred64pred %r367; .reg .predpred %r373; .reg .predu64 %r381; mov.u64 %r185,%ar0; mov.u64 %r186,%ar1; mov.u64 %r187,%ar2; mov.u64 %r188,%ar3; mov.u32 %r189,%ar4; mov.u64 %r190,%ar5; mov.u64 %r191,%ar6; .loc 1 56 10 ld.s8 %r22,[%r186+28]; .loc 1 56 38 add.u32 %r192,%r22,-1; .loc 1 56 8 cvt.s64.s32 %r119,%r192; .loc 1 57 10 ld.u64 %r25,[%r188]; .loc 1 57 7 add.u64 %r120,%r25,-1; .loc 1 59 7 shr.u64 %r194,%r120,63; set.u32.lt.s64 %r196,%r119,%r120; neg.s32 %r197,%r196; cvt.u16.u64 %r200,%r194; cvt.u16.u32 %r201,%r197; or.b16 %r199,%r200,%r201; .loc 1 59 6 cvt.u32.u16 %r202,%r199; cvt.u16.u8 %r203,%r202; setp.eq.u16 %r204,%r203,0; @ %r204 bra $L2; .loc 1 61 7 cvt.u32.u32 %r207,%r22; cvt.s64.s8 %r206,%r207; st.u64 [%stack+8],%r20605205_gfortran_runtime_errorr210,%r120,%r120; add.u64 %r211,%r210,%r120; shl.b64 %r212,%r211,3; add.u64 %r213,%r186,%r212; ld.u64 %r31,[%r213+56]; ld.u64 %r33,[%r213+48]; .loc 1 69 9 ld.u64 %r123,[%r213+40]; .loc 1 71 3 setp.ne.u64 %r227,%r120,0; @ %r227 bra $L3; $L9: .loc 1 79 3 setp.gt.s64 %r228,%r119,%r120; @ %r228 bra $L4; bra $L56; $L3: add.u64 %r81,%r186,40; add.u64 %r174,%frame,120; add.u64 %r175,%frame,240; add.u64 %r229,%r186,16; add.u64 %r231,%r25,%r25; add.u64 %r232,%r231,%r25; shl.b64 %r233,%r232,3; add.u64 %r182,%r229,%r233; .loc 1 77 12 mov.u64 %r381,0; $L8: .loc 1 73 18 ld.u64 %r234,[%r81]; st.u64 [%r174],%r234; .loc 1 74 19 ld.u64 %r236,[%r81+16]; add.u64 %r235,%r236,1; ld.u64 %r237,[%r81+8]; sub.u64 %r38,%r235,%r237; .loc 1 76 10 setp.lt.s64 %r238,%r38,0; @ %r238 bra $L6; .loc 1 74 17 st.u64 [%r175],%r38; bra $L7; $L6: .loc 1 77 12 st.u64 [%r175],%r381; $L7: .loc 1 71 3 add.u64 %r81,%r81,24; add.u64 %r174,%r174,8; add.u64 %r175,%r175,8; setp.ne.u64 %r240,%r81,%r182; @ %r240 bra $L8; bra $L9; $L56: .loc 1 88 6 ld.u64 %r241,[%r185]; setp.eq.u64 %r242,%r241,0; @ ! 
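// The member beginning above (findloc1_s1.o, from libgfortran/generated/findloc1_s1.c)
// appears to provide _gfortran_findloc1_s1, _gfortran_mfindloc1_s1 and
// _gfortran_sfindloc1_s1: FINDLOC over character(kind=1) arrays along one dimension,
// where elements are matched by calling the external _gfortran_compare_string helper
// rather than by inline equality tests. The embedded decimal byte runs are the member's
// string constants, e.g. "... incorrect in FINDLOC intrinsic: is %ld, should be between
// 1 and %l...", "rank of return array incorrect in FINDLOC intrinsic ..." and
// "Funny sized logical array".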
%r242 bra $L11; bra $L10; $L4: add.u64 %r244,%r25,%r25; add.u64 %r245,%r244,%r25; shl.b64 %r246,%r245,3; add.u64 %r247,%r246,40; add.u64 %r62,%r186,%r247; shl.b64 %r248,%r25,3; add.u64 %r170,%r248,-8; add.u64 %r370,%frame,120; add.u64 %r135,%r370,%r170; add.u64 %r369,%frame,240; add.u64 %r110,%r369,%r170; add.u64 %r251,%r186,40; cvt.u32.u32 %r253,%r22; cvt.s64.s8 %r252,%r253; add.u64 %r255,%r252,%r252; add.u64 %r256,%r255,%r252; shl.b64 %r257,%r256,3; add.u64 %r78,%r251,%r257; .loc 1 85 12 mov.u64 %r380,0; $L14: .loc 1 81 18 ld.u64 %r258,[%r62]; st.u64 [%r135],%r258; .loc 1 82 19 ld.u64 %r260,[%r62+16]; add.u64 %r259,%r260,1; ld.u64 %r261,[%r62+8]; sub.u64 %r45,%r259,%r261; .loc 1 84 10 setp.lt.s64 %r262,%r45,0; @ %r262 bra $L12; .loc 1 82 17 st.u64 [%r110],%r45; bra $L13; $L12: .loc 1 85 12 st.u64 [%r110],%r380; $L13: .loc 1 79 3 add.u64 %r62,%r62,24; add.u64 %r135,%r135,8; add.u64 %r110,%r110,8; setp.ne.u64 %r264,%r62,%r78; @ %r264 bra $L14; bra $L57; $L10: .loc 1 92 7 setp.le.s64 %r265,%r119,0; @ %r265 bra $L16; add.u64 %r369,%frame,240; $L37: add.u64 %r109,%r185,40; mov.u64 %r106,%r369; cvt.u32.u32 %r267,%r22; cvt.s64.s8 %r266,%r267; add.u64 %r269,%r266,%r266; add.u64 %r270,%r269,%r266; shl.b64 %r271,%r270,3; add.u64 %r272,%r185,16; add.u64 %r65,%r271,%r272; .loc 1 95 10 mov.u64 %r108,1; .loc 1 99 4 mov.u64 %r295,0; bra $L17; $L16: .loc 1 103 24 mov.u64 %r273,0; st.u64 [%r185+8],%r273; .loc 1 104 28 cvt.u16.u32 %r276,%r22; add.u16 %r275,%r276,-1; cvt.u32.u16 %r277,%r275; st.u8 [%r185+28],%r277; .loc 1 106 20 add.u32 %r278,%r22,-2; cvt.s64.s32 %r54,%r278; add.u64 %r280,%r54,%r54; add.u64 %r281,%r280,%r54; shl.b64 %r282,%r281,3; add.u64 %r283,%r185,%r282; .loc 1 106 67 shl.b64 %r285,%r54,3; add.u64 %r286,%frame,%r285; .loc 1 106 59 ld.u64 %r288,[%r283+40]; ld.u64 %r289,[%r286+240]; mul.lo.u64 %r125,%r288,%r289; .loc 1 108 29 mov.u64 %r291291; call (%value_in),_gfortrani_xmallocarray292,[%value_in]; } .loc 1 108 27 st.u64 [%r185],%r292; .loc 1 109 10 setp.eq.u64 %r294,%r125,0; @ ! 
[Embedded nvptx PTX object code: continuation of the findloc1_s1 routines, with the bodies of _gfortran_findloc1_s1 (continued), _gfortran_mfindloc1_s1 and _gfortran_sfindloc1_s1; the loops walk the array descriptors and call _gfortran_compare_string, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_internal_error and _gfortran_runtime_error, referencing _gfortrani_compile_options.]
[Embedded nvptx PTX object code, findloc1_s4.o (compiled from ../../../../libgfortran/generated/findloc1_s4.c): declarations and bodies of _gfortran_findloc1_s4, _gfortran_mfindloc1_s4 and _gfortran_sfindloc1_s4; external symbols _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortran_compare_string_char4 and _gfortrani_internal_error; constant strings carrying the FINDLOC runtime error messages ("... incorrect in FINDLOC intrinsic: is %ld, should be ...").]
[Embedded nvptx PTX object code, findloc2_s1.o (compiled from ../../../../libgfortran/generated/findloc2_s1.c): _gfortran_findloc2_s1, _gfortran_mfindloc2_s1 and _gfortran_sfindloc2_s1, calling _gfortran_compare_string and _gfortrani_internal_error; followed by the start of findloc2_s4.o (compiled from ../../../../libgfortran/generated/findloc2_s4.c) with _gfortran_findloc2_s4, _gfortran_mfindloc2_s4, _gfortran_sfindloc2_s4 and calls to _gfortran_compare_string_char4.]
[Embedded nvptx PTX object code: remainder of the findloc2_s4 routines (_gfortran_mfindloc2_s4 and _gfortran_sfindloc2_s4), followed by declarations from ISO_Fortran_binding.c: _gfortran_cfi_desc_to_gfc_desc, _gfortran_gfc_desc_to_cfi_desc, CFI_address, CFI_allocate, CFI_deallocate, CFI_establish, CFI_is_contiguous, CFI_section, CFI_select_part and CFI_setpointer, with external references to _impure_ptr, _gfortrani_compile_options and fwrite, and the start of the CFI_* error-message string constants ("CFI_address: base address of C Descriptor must not be NULL.").]
[.const .u8 string tables ($LC0 through roughly $LC42), each holding one run-time diagnostic for the CFI_* routines as raw byte values. Decoded, they read like "CFI_address: base address of C Descriptor must not be NULL.", "CFI_allocate: Base address of C descriptor must be NULL.", "CFI_establish: Rank must be between 0 and %d, 0 < rank (0 !< %d).", "CFI_section: Source must not be NULL." and "CFI_setpointer: Result shall be the address of a C descriptor for a Fortran pointer."; the rest report NULL descriptors, invalid attributes, mismatched rank/type/elem_len, and bounds or strides that fall outside the Fortran array.]
[Remaining CFI_section bounds and stride diagnostics, then the PTX body of _gfortran_cfi_desc_to_gfc_desc: it copies the base address, maps the CFI type code onto the gfc descriptor dtype, recomputes the element length, rank and per-dimension lower bound, upper bound and stride, and rebuilds the array offset. The opening of _gfortran_gfc_desc_to_cfi_desc, the reverse conversion (which allocates a CFI descriptor when the caller passed none), follows.]
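These conversion helpers are what let gfortran hand an array to C through the standardized descriptor layout. As a hedged illustration (a hypothetical routine, not part of this package), a C function receiving an interoperable assumed-shape argument only needs the public CFI_cdesc_t fields that the helpers above populate:

#include <ISO_Fortran_binding.h>

/* Hypothetical BIND(C)-callable routine: sum a rank-1 array of doubles
   passed from Fortran as an assumed-shape dummy (i.e. as a CFI_cdesc_t).  */
double sum_all (const CFI_cdesc_t *dv)
{
  double s = 0.0;
  if (dv->rank != 1 || dv->type != CFI_type_double)
    return s;

  /* dim[0].sm is the stride in bytes between consecutive elements,
     so non-contiguous sections are handled transparently.  */
  char *p = (char *) dv->base_addr;
  for (CFI_index_t i = 0; i < dv->dim[0].extent; i++, p += dv->dim[0].sm)
    s += *(const double *) p;
  return s;
}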
[PTX bodies, continued: the rest of _gfortran_gfc_desc_to_cfi_desc, then CFI_address (base address plus per-subscript stride arithmetic, with bounds checks that report through fwrite when checking is enabled via _gfortrani_compile_options), CFI_allocate (attribute and base-address validation, per-dimension lower/upper bound and stride setup, the allocation call and its failure path), and the opening of CFI_deallocate.]
[PTX bodies, continued: the rest of CFI_deallocate (descriptor validation, then the base address is released and reset to NULL), CFI_establish (rank limited to 15, attribute and type checks, element-length selection that is caller-supplied for character and derived types and derived from the type code otherwise, per-dimension extent and stride initialisation), and the opening of CFI_is_contiguous.]
[PTX bodies, continued: the rest of CFI_is_contiguous (walks dim[], comparing each stride multiplier against extent times element length), the large CFI_section routine (validates source and result descriptors, counts zero strides, checks the requested lower and upper bounds against the Fortran array bounds, fills in the result's dim[] entries and finally calls CFI_address for the new base address), and the opening of CFI_select_part.]
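A hedged sketch of the kind of request CFI_section validates above: taking every other element of a rank-1 source. The helper and its arguments are hypothetical, and the result descriptor is assumed to have been established by the caller with the same type and element length as the source.

#include <ISO_Fortran_binding.h>

/* Hypothetical helper: fill 'result' with a stride-2 section of 'source'.
   'result' must already be an established rank-1 descriptor matching
   the source's type and elem_len.  */
int every_other (CFI_cdesc_t *result, const CFI_cdesc_t *source)
{
  CFI_index_t lb[1] = { source->dim[0].lower_bound };
  CFI_index_t ub[1] = { source->dim[0].lower_bound
                        + source->dim[0].extent - 1 };
  CFI_index_t stride[1] = { 2 };
  return CFI_section (result, source, lb, ub, stride);
}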
[PTX bodies, continued: the rest of CFI_select_part and CFI_setpointer (consistency checks on elem_len, rank and type before copying the base address, bounds and strides from source to result, or nulling the result when source is NULL), then the objects for libgfortran/io/size_from_kind.c (_gfortrani_size_from_real_kind and _gfortrani_size_from_complex_kind, mapping real and complex kinds to byte sizes), intrinsics/associated.c (_gfortran_associated, comparing descriptor base addresses, offsets, dtypes and per-dimension extents and strides), intrinsics/abort.c (_gfortran_abort, a direct call into _gfortrani_sys_abort), and the declaration block of intrinsics/args.c: _gfortran_getarg_i4/_i8, _gfortran_iargc, _gfortran_get_command_argument_i4/_i8, plus externs for _gfortrani_get_args and _gfortran_runtime_error.]
[PTX assembly text, libgfortran/intrinsics/args.c, garbled. Recoverable content: the bodies of _gfortran_getarg_i4 and _gfortran_getarg_i8 (fetch the saved argc/argv through _gfortrani_get_args, bounds-check the requested index, then copy that argument into the caller's buffer, truncating to the buffer length), _gfortran_iargc (returns argc - 1, i.e. the argument count without the program name), _gfortran_get_command_argument_i4 (optionally returns the argument text, its length, and a status value that is 0 on success and nonzero when the argument is missing or the buffer is too short), _gfortran_get_command_argument_i8 (forwards to the _i4 version and widens the results), and _gfortran_get_command_i4 (reassembles the full command line from argv, blank-separated, with the same length/status outputs). A diagnostic string ("...argument to get_command_argument...") is embedded as an array of character codes.]
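What the argument-query entry points compute can be pictured with the small sketch below. It takes an explicit argc/argv pair, whereas the packaged code obtains the same pair through the internal _gfortrani_get_args helper; the blank-padding detail is an assumption about Fortran character semantics rather than something readable from the PTX.

#include <string.h>

/* IARGC(): number of command-line arguments, excluding the program name. */
static int iargc_from(int argc)
{
    return argc - 1;
}

/* GETARG(n, buf): copy argument n into a fixed-length, blank-padded buffer. */
static void getarg_from(int argc, char **argv, int n,
                        char *buf, size_t buflen)
{
    memset(buf, ' ', buflen);            /* Fortran-style blank padding */
    if (n < 0 || n >= argc)
        return;                          /* out of range: leave blanks  */
    size_t len = strlen(argv[n]);
    memcpy(buf, argv[n], len < buflen ? len : buflen);
}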
[PTX assembly text, garbled: the tail of _gfortran_get_command_i8, which forwards to _gfortran_get_command_i4 and widens the returned length and status to 8-byte integers. Then libgfortran/intrinsics/cshift0.c: declarations of the public entry points _gfortran_cshift0_{1,2,4,8,16} and their _char/_char4 variants, extern declarations of the typed helpers _gfortrani_cshift0_{i1,i2,i4,i8,i16,r4,r8,c4,c8,...}, of _gfortrani_bounds_equal_extents, _gfortrani_xmallocarray, _gfortran_size0, _gfortran_runtime_error and _gfortrani_compile_options, the string constants "CSHIFT" and "Argument 'DIM' is out of range in call to 'CSHIFT'", and the start of the internal generic cshift0 routine.]
[PTX assembly text, garbled: body of the internal generic cshift0 routine. Recoverable logic: check that DIM lies within 1..rank (otherwise report "Argument 'DIM' is out of range in call to 'CSHIFT'" through _gfortran_runtime_error), allocate the result data with _gfortrani_xmallocarray when the descriptor is still empty (or, with bounds checking enabled, verify the shapes via _gfortrani_bounds_equal_extents), combine the element type and element size into a single key and dispatch on it to one of the typed helpers _gfortrani_cshift0_{i1,i2,i4,i8,i16,r4,r8,c4,c8} where possible, and otherwise run a generic loop that walks every rank-1 section along DIM and copies the two wrap-around pieces byte-wise using the descriptor strides.]
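Reduced to one contiguous dimension, the circular shift that both the typed helpers and the generic fallback perform looks like the sketch below (semantics only, not the packaged code, which works through array descriptors and strides):

#include <string.h>
#include <stddef.h>

/* Circular shift of n elements of elem_size bytes; ret and src must not
   overlap.  A positive shift moves elements toward lower indices, as
   CSHIFT does. */
static void cshift_1d(void *ret, const void *src, size_t n,
                      ptrdiff_t shift, size_t elem_size)
{
    if (n == 0)
        return;
    shift %= (ptrdiff_t) n;          /* reduce the shift modulo the extent */
    if (shift < 0)
        shift += (ptrdiff_t) n;
    size_t lead = (size_t) shift;    /* elements that wrap around to the end */
    size_t rest = n - lead;          /* elements that move to the front      */
    memcpy(ret, (const char *) src + lead * elem_size, rest * elem_size);
    memcpy((char *) ret + rest * elem_size, src, lead * elem_size);
}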
[PTX assembly text, garbled: the remainder of the generic cshift0 copy loop, followed by the thin public wrappers _gfortran_cshift0_{1,2,4,8,16} and their _char/_char4 variants. Each wrapper reads the DIM argument when the pointer is non-null (defaulting to 1 otherwise), derives the element size (the _char4 variants multiply the character length by 4), and calls the internal cshift0. Then libgfortran/intrinsics/eoshift0.c begins with declarations of the public _gfortran_eoshift0_* entry points.]
[PTX assembly text, garbled: the remaining eoshift0.c declarations (_gfortran_eoshift0_{8,16} with their _char/_char4 variants, the extern helpers _gfortran_size0, _gfortrani_xmallocarray and _gfortrani_bounds_equal_extents, the blank-fill constants space$0..space$4 and the "EOSHIFT" string), followed by the body of the internal generic eoshift0 routine: allocate or shape-check the result as cshift0 does, compute strides and extents for every dimension, and then, for each rank-1 section along DIM, copy the elements that survive the end-off shift from the source and fill the vacated positions from the boundary value, stepping through both arrays with their descriptor strides.]
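The end-off shift itself, reduced to one contiguous dimension, can be sketched as follows (semantics only; the packaged routine performs the same split per rank-1 section along DIM using descriptor strides):

#include <string.h>
#include <stddef.h>

/* End-off shift: shifted-out elements are dropped, vacated positions are
   filled with the boundary value.  ret and src must not overlap. */
static void eoshift_1d(void *ret, const void *src, size_t n,
                       ptrdiff_t shift, const void *boundary,
                       size_t elem_size)
{
    char *dst = ret;
    const char *s = src;
    size_t mag  = (size_t) (shift < 0 ? -shift : shift);
    size_t keep = mag >= n ? 0 : n - mag;       /* elements that survive */

    if (shift >= 0) {
        /* shift toward lower indices: copy src[shift..], fill the tail */
        memcpy(dst, s + (n - keep) * elem_size, keep * elem_size);
        for (size_t i = keep; i < n; i++)
            memcpy(dst + i * elem_size, boundary, elem_size);
    } else {
        /* shift toward higher indices: fill the head, then copy src[0..keep) */
        for (size_t i = 0; i < n - keep; i++)
            memcpy(dst + i * elem_size, boundary, elem_size);
        memcpy(dst + (n - keep) * elem_size, s, keep * elem_size);
    }
}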
[PTX assembly text, garbled: the remainder of the generic eoshift0 loop, then the public wrappers _gfortran_eoshift0_{1,2,4,8,16} with their _char/_char4 variants. Each wrapper reads DIM when present (defaulting to 1), supplies a default boundary when none is given (a zero constant, or one of the blank space$N constants for character data), derives the element size, and calls the internal eoshift0. Then libgfortran/intrinsics/eoshift2.c: declarations of _gfortran_eoshift2_{1,2,4,8,16} with _char/_char4 variants, the same extern helpers and space$/"EOSHIFT" constants, and the start of the internal generic eoshift2 routine (apparently the variant whose BOUNDARY argument is an array rather than a scalar).]
[PTX assembly text, garbled: body of the internal generic eoshift2 routine. The structure mirrors eoshift0 (allocate or shape-check the result, compute per-dimension strides and extents, then, for every rank-1 section along DIM, copy the surviving elements and fill the vacated positions), except that the fill value is read per section through an additional descriptor stride rather than taken from a single scalar boundary.]
[PTX assembly text, garbled: the remainder of the eoshift2 loop, then the public wrappers _gfortran_eoshift2_{1,2,4,8,16} with their _char/_char4 variants, again supplying the blank space$N constants as the default character boundary. Then libgfortran/intrinsics/erfc_scaled.c: declarations of _gfortran_erfc_scaled_r4 and _gfortran_erfc_scaled_r8, extern declarations of expf and exp, the coefficient tables c$7, c$18, d$6 and d$17 stored as raw IEEE bit patterns, and the start of the _gfortran_erfc_scaled_r4 body.]
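For context, this is the function the two entry points compute; the argument split below is inferred from the constants 0.46875 and 4.0 that appear among the PTX bit patterns (the break points used by the classic Cody-style rational approximations) and is not spelled out in the package itself.

\[
  \operatorname{erfc\_scaled}(x) \;=\; e^{x^{2}}\operatorname{erfc}(x)
  \;=\; e^{x^{2}}\,\frac{2}{\sqrt{\pi}}\int_{x}^{\infty} e^{-t^{2}}\,dt,
\]
\[
  \text{with separate rational approximations on }\;
  |x| \le 0.46875, \qquad 0.46875 < |x| \le 4, \qquad |x| > 4.
\]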
%r78 bra $L27; .loc 1 115 10 setp.gt.f32 %r79,%r48,0f337ffae5; @ ! %r79 bra $L28; .loc 1 116 6 mul.f32 %r47,%r48,%r48; .loc 1 121 24 fma.rn.f32 %r28,%r47,0f3e3e3c83,0f404a4fda; fma.rn.f32 %r50,%r28,%r47,0f42e3ba72; .loc 1 122 24 add.f32 %r80,%r47,0f41bccf72; fma.rn.f32 %r30,%r80,%r47,0f4374064f; .loc 1 121 24 fma.rn.f32 %r52,%r50,%r47,0f43bcbe1c; .loc 1 124 23 fma.rn.f32 %r33,%r52,%r47,0f4548960b; .loc 1 122 24 fma.rn.f32 %r53,%r30,%r47,0f44a053bb; .loc 1 124 39 fma.rn.f32 %r22,%r53,%r47,0f4531c3ca; bra $L5; $L28: mov.f32 %r22,0f4531c3ca; mov.f32 %r33,0f4548960b; .loc 1 114 11 mov.f32 %r47,0f00000000; $L5: .loc 1 124 15 mul.f32 %r81,%r76,%r33; .loc 1 124 11 div.rn.f32 %r59,%r81,%r22; .loc 1 125 11 mov.f32 %r82,0f3f800000; sub.f32 %r60,%r82,%r59; .loc 1 126 13 { .param .f32 %value_in; .param .f32 %out_arg1; st.param.f32 [%out_arg1],%r47; call (%value_in),expf,(%out_arg1); ld.param.f32 %r84,[%value_in]; } .loc 1 126 11 mul.f32 %r75,%r84,%r60; .loc 1 127 14 bra $L1; $L27: .loc 1 129 11 setp.le.f32 %r85,%r48,0f40800000; @ ! %r85 bra $L29; .loc 1 131 12 mul.f32 %r54,%r48,0f32b8f382; cvta.const.u64 %r72,c$18; cvta.const.u64 %r69,d$17; add.u64 %r65,%r72,28; .loc 1 132 12 mov.f32 %r55,%r48; .loc 1 131 12 mov.f32 %r45,0f417beb38; mov.f32 %r46,0f3f106ea8; $L11: .loc 1 135 17 add.f32 %r25,%r46,%r54; .loc 1 135 9 mul.f32 %r54,%r25,%r48; .loc 1 136 17 add.f32 %r26,%r45,%r55; .loc 1 136 9 mul.f32 %r55,%r26,%r48; .loc 1 133 7 add.u64 %r72,%r72,4; add.u64 %r69,%r69,4; setp.eq.u64 %r86,%r65,%r72; @ %r86 bra $L10; .loc 1 135 20 ld.f32 %r46,[%r72]; .loc 1 136 20 ld.f32 %r45,[%r69]; bra $L11; $L10: .loc 1 138 19 add.f32 %r87,%r54,0f4499cadc; .loc 1 138 35 add.f32 %r88,%r55,0f4499cadc; .loc 1 138 11 div.rn.f32 %r75,%r87,%r88; bra $L12; $L29: .loc 1 143 10 setp.ge.f32 %r89,%r48,0f41131aa0; @ ! %r89 bra $L13; .loc 1 145 14 setp.ge.f32 %r90,%r48,0f7e1024d1; @ %r90 bra $L19; .loc 1 147 14 setp.ge.f32 %r91,%r48,0f45354000; @ ! %r91 bra $L13; .loc 1 149 12 mov.f32 %r92,0f3f106ebb; div.rn.f32 %r75,%r92,%r48; .loc 1 150 8 bra $L12; $L13: .loc 1 153 29 mul.f32 %r93,%r48,%r48; .loc 1 153 11 mov.f32 %r94,0f3f800000; div.rn.f32 %r49,%r94,%r93; .loc 1 158 24 fma.rn.f32 %r61,%r49,0f3c85a7d9,0f3e9c53c6; fma.rn.f32 %r66,%r49,%r61,0f3eb87f20; .loc 1 159 24 add.f32 %r95,%r49,0f402462a3; fma.rn.f32 %r68,%r49,%r95,0f3fefbceb; .loc 1 158 24 fma.rn.f32 %r70,%r49,%r66,0f3e00cced; .loc 1 159 24 fma.rn.f32 %r71,%r49,%r68,0f3f0724ca; .loc 1 158 24 fma.rn.f32 %r73,%r49,%r70,0f3c83c224; .loc 1 159 24 fma.rn.f32 %r74,%r49,%r71,0f3d77e215; .loc 1 161 24 fma.rn.f32 %r34,%r49,%r73,0f3a2cafe8; .loc 1 161 17 mul.f32 %r35,%r34,%r49; .loc 1 161 40 fma.rn.f32 %r36,%r49,%r74,0f3b190a3d; .loc 1 161 11 div.rn.f32 %r96,%r35,%r36; .loc 1 162 20 mov.f32 %r97,0f3f106ebb; sub.f32 %r37,%r97,%r96; .loc 1 162 11 div.rn.f32 %r75,%r37,%r48; bra $L12; $L19: .loc 1 142 11 mov.f32 %r75,0f00000000; $L12: .loc 1 166 6 setp.lt.f32 %r98,%r76,0f00000000; @ ! 
%r98 bra $L1; .loc 1 168 10 setp.lt.f32 %r99,%r76,0fc1161cac; @ %r99 bra $L20; .loc 1 172 10 mul.f32 %r100,%r76,0f41800000; cvt.rzi.f32.f32 %r39,%r100; .loc 1 172 8 mul.f32 %r56,%r39,0f3d800000; .loc 1 174 8 mul.f32 %r102,%r56,%r56;expf,(%out_arg1); ld.param.f32 %r103,[%value_in]; } .loc 1 173 12 sub.f32 %r105,%r76,%r56; .loc 1 173 20 add.f32 %r106,%r76,%r56; .loc 1 173 8 mul.f32 %r107,%r105,%r106; .loc 1 174 23 { .param .f32 %value_in; .param .f32 %out_arg1; st.param.f32 [%out_arg1],%r107; call (%value_in),expf,(%out_arg1); ld.param.f32 %r108,[%value_in]; } neg.f32 %r67,%r75; .loc 1 174 6 mul.f32 %r109,%r103,%r108; .loc 1 175 8 fma.rn.f32 %r75,%r109,0f40000000,%r67; bra $L1; $L20: .loc 1 169 6 mov.f32 %r75,0f7f800000;gfortran_erfc_scaled_r8 .visible .func (.param .f64 %value_out) _gfortran_erfc_scaled_r8 (predpredf64 %r100; .reg .f64 %r102; .reg .f64 %r103; .reg .f64 %r105; .reg .f64 %r106; .reg .f64 %r107; .reg .f64 %r108; .reg .f64 %r109; mov.f64 %r76,%ar0; .loc 1 111 18 setp.gt.f64 %r77,%r76,0d0000000000000000; @ %r77 bra $L46; neg.f64 %r48,%r76; bra $L31; $L46: mov.f64 %r48,%r76; $L31: .loc 1 112 6 setp.le.f64 %r78,%r48,0d3fde000000000000; @ ! %r78 bra $L56; .loc 1 115 10 setp.gt.f64 %r79,%r48,0d3c9ffe5ab7e8ad5e; @ ! %r79 bra $L57; .loc 1 116 6 mul.f64 %r47,%r48,%r48; .loc 1 121 24 fma.rn.f64 %r28,%r47,0d3fc7c7905a31c322,0d400949fb3ed443e9; fma.rn.f64 %r50,%r28,%r47,0d405c774e4d365da3; .loc 1 122 24 add.f64 %r80,%r47,0d403799ee342fb2de; fma.rn.f64 %r30,%r80,%r47,0d406e80c9d57e55b8; .loc 1 121 24 fma.rn.f64 %r52,%r50,%r47,0d407797c38897528b; .loc 1 124 23 fma.rn.f64 %r33,%r52,%r47,0d40a912c1535d121a; .loc 1 122 24 fma.rn.f64 %r53,%r30,%r47,0d40940a77529cadc8; .loc 1 124 39 fma.rn.f64 %r22,%r53,%r47,0d40a63879423b87ad; bra $L34; $L57: mov.f64 %r22,0d40a63879423b87ad; mov.f64 %r33,0d40a912c1535d121a; .loc 1 114 11 mov.f64 %r47,0d0000000000000000; $L34: .loc 1 124 15 mul.f64 %r81,%r76,%r33; .loc 1 124 11 div.rn.f64 %r59,%r81,%r22; .loc 1 125 11 mov.f64 %r82,0d3ff0000000000000; sub.f64 %r60,%r82,%r59; .loc 1 126 13 { .param .f64 %value_in; .param .f64 %out_arg1; st.param.f64 [%out_arg1],%r47; call (%value_in),exp,(%out_arg1); ld.param.f64 %r84,[%value_in]; } .loc 1 126 11 mul.f64 %r75,%r84,%r60; .loc 1 127 14 bra $L30; $L56: .loc 1 129 11 setp.le.f64 %r85,%r48,0d4010000000000000; @ ! %r85 bra $L58; .loc 1 131 12 mul.f64 %r54,%r48,0d3e571e703c5f5815; cvta.const.u64 %r72,c$7; cvta.const.u64 %r69,d$6; add.u64 %r65,%r72,56; .loc 1 132 12 mov.f64 %r55,%r48; .loc 1 131 12 mov.f64 %r45,0d402f7d66f486ded5; mov.f64 %r46,0d3fe20dd508eb103e; $L40: .loc 1 135 17 add.f64 %r25,%r46,%r54; .loc 1 135 9 mul.f64 %r54,%r25,%r48; .loc 1 136 17 add.f64 %r26,%r45,%r55; .loc 1 136 9 mul.f64 %r55,%r26,%r48; .loc 1 133 7 add.u64 %r72,%r72,8; add.u64 %r69,%r69,8; setp.eq.u64 %r86,%r65,%r72; @ %r86 bra $L39; .loc 1 135 20 ld.f64 %r46,[%r72]; .loc 1 136 20 ld.f64 %r45,[%r69]; bra $L40; $L39: .loc 1 138 19 add.f64 %r87,%r54,0d4093395b7fd2fc8e; .loc 1 138 35 add.f64 %r88,%r55,0d4093395b7fd35f61; .loc 1 138 11 div.rn.f64 %r75,%r87,%r88; bra $L41; $L58: .loc 1 143 10 setp.ge.f64 %r89,%r48,0d403a8b020c49ba5e; @ ! %r89 bra $L42; .loc 1 145 14 setp.ge.f64 %r90,%r48,0d7fc203a27a37512d; @ %r90 bra $L48; .loc 1 147 14 setp.ge.f64 %r91,%r48,0d418ffeeb00000000; @ ! 
%r91 bra $L42; .loc 1 149 12 mov.f64 %r92,0d3fe20dd750429b6d; div.rn.f64 %r75,%r92,%r48; .loc 1 150 8 bra $L41; $L42: .loc 1 153 29 mul.f64 %r93,%r48,%r48; .loc 1 153 11 mov.f64 %r94,0d3ff0000000000000; div.rn.f64 %r49,%r94,%r93; .loc 1 158 24 fma.rn.f64 %r61,%r49,0d3f90b4fb18b485c7,0d3fd38a78b9f065f6; fma.rn.f64 %r66,%r49,%r61,0d3fd70fe40e2425b8; .loc 1 159 24 add.f64 %r95,%r49,0d40048c54508800db; fma.rn.f64 %r68,%r49,%r95,0d3ffdf79d6855f0ad; .loc 1 158 24 fma.rn.f64 %r70,%r49,%r66,0d3fc0199d980a842f; .loc 1 159 24 fma.rn.f64 %r71,%r49,%r68,0d3fe0e4993e122c39; .loc 1 158 24 fma.rn.f64 %r73,%r49,%r70,0d3f9078448cd6c5b5; .loc 1 159 24 fma.rn.f64 %r74,%r49,%r71,0d3faefc42917d7de7; .loc 1 161 24 fma.rn.f64 %r34,%r49,%r73,0d3f4595fd0d71e33c; .loc 1 161 17 mul.f64 %r35,%r34,%r49; .loc 1 161 40 fma.rn.f64 %r36,%r49,%r74,0d3f632147a014bad1; .loc 1 161 11 div.rn.f64 %r96,%r35,%r36; .loc 1 162 20 mov.f64 %r97,0d3fe20dd750429b6d; sub.f64 %r37,%r97,%r96; .loc 1 162 11 div.rn.f64 %r75,%r37,%r48; bra $L41; $L48: .loc 1 142 11 mov.f64 %r75,0d0000000000000000; $L41: .loc 1 166 6 setp.lt.f64 %r98,%r76,0d0000000000000000; @ ! %r98 bra $L30; .loc 1 168 10 setp.lt.f64 %r99,%r76,0dc03aa0c49ba5e354; @ %r99 bra $L49; .loc 1 172 10 mul.f64 %r100,%r76,0d4030000000000000; cvt.rzi.f64.f64 %r39,%r100; .loc 1 172 8 mul.f64 %r56,%r39,0d3fb0000000000000; .loc 1 174 8 mul.f64 %r102,%r56,%r56;exp,(%out_arg1); ld.param.f64 %r103,[%value_in]; } .loc 1 173 12 sub.f64 %r105,%r76,%r56; .loc 1 173 20 add.f64 %r106,%r76,%r56; .loc 1 173 8 mul.f64 %r107,%r105,%r106; .loc 1 174 23 { .param .f64 %value_in; .param .f64 %out_arg1; st.param.f64 [%out_arg1],%r107; call (%value_in),exp,(%out_arg1); ld.param.f64 %r108,[%value_in]; } neg.f64 %r67,%r75; .loc 1 174 6 mul.f64 %r109,%r103,%r108; .loc 1 175 8 fma.rn.f64 %r75,%r109,0d4000000000000000,%r67; bra $L30; $L49: .loc 1 169 6 mov.f64 %r75,0d7ff0000000000000; $L30:618 gfortran_is_extension_ofgfortran_is_extension_offortran/intrinsics/extends_type_ofgfortran_is_extension_ofgfortran_is_extension_ofpredpred54 6 setp.eq.u64 %r27,%r26,0; @ %r27 bra $L4; .loc 1 56 6 setp.eq.u64 %r28,%r25,0; @ %r28 bra $L5; .loc 1 61 25 ld.u32 %r23,[%r26]; $L3: .loc 1 61 10 ld.u32 %r29,[%r25]; setp.eq.u32 %r30,%r29,%r23; @ %r30 bra $L6; .loc 1 62 10 ld.u64 %r25,[%r25+16]; .loc 1 59 9 setp.ne.u64 %r31,%r25,0; @ %r31 bra $L3; .loc 1 57 12 cvt.u32.u64 %r24,%r25; bra $L1; $L4: .loc 1 55 12 mov.u32 %r24,1; bra $L1; $L5: .loc 1 57 12 cvt.u32.u64 %r24,%r25; bra $L1; $L6: .loc 1 55 12 mov.u32 %r24,1; $L1: .loc 1 65gfortran_fnum_gfortran_fnum_../libgfortran/intrinsics/fnu_gfortran_fnum_gfortran_fnum_i8_gfortrani_unit_to_f_gfortrani_unit_to_fdF: _gfortran_fnum_gfortran_fnum_mov.u64 %r24,%ar0; .loc 1 38_gfortrani_unit_to_fd,(39_gfortran_fnum_gfortran_fnum_u32 %r27; .reg .u32 %r29; mov.u64 %r26,%ar0; .loc 1 47 10 ld.u32 %r27,[%r26]_gfortrani_unit_to_fd,(%out_arg1); ld.param.u32 %r29,[%value_in]; } cvt.s64.s32 %value,%r29; .loc 1 48 ierrno.o/ _gfortran_ierrno_gfortran_ierrno_i4; .file 1 "../../../../libgfortran/intrinsics/ierrno.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_ierrno_gfortran_ierrno_i8; // BEGIN GLOBAL FUNCTION DECL: __errno__errnogfortran_ierrno_gfortran_ierrno_i4 { .reg .u32 %value; .reg .u64 %r24; .loc 1 39 26__errno; ld.param.u64 %r24,[%value_in]; } .loc 1 39 10 ld.u32 %value,[%r24]; .loc_gfortran_ierrno_gfortran_ierrno_i8 { .reg .u64 %value; .reg .u64 %r25; .loc 1 48 26__errno48 10 ld.s32 %value,[%r25]; .loc 1 4 ishftc.o/ 
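_gfortran_erfc_scaled_r4 and _r8 above evaluate the scaled complementary error function, exp(x*x) * erfc(x), using piecewise rational approximations (the c$ and d$ constant tables) so the result stays finite where exp(x*x) alone would overflow. The direct definition, usable only for small arguments, is the one-liner below; the function name is illustrative.

#include <math.h>

/* Naive sketch of what ERFC_SCALED computes; the runtime's piecewise
   approximation avoids the overflow of exp(x*x) for large x. */
static double erfc_scaled_sketch (double x)
{
  return exp (x * x) * erfc (x);
}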
gfortran_ishftc4gfortran_ishftc4fortran/intrinsics/ishft_gfortran_ishftc8gfortran_ishftc8_gfortran_ishftc16 .visible .func _gfortran_ishftc16ortran_ishftc4gfortran_ishftc4u32mov.u32 %r37,%ar0; mov.u32 %r38,%ar1; mov.u32 %r39,%ar2; .loc 1 36 6 setp.ge.s32 %r40,%r38,0; @ %r40 bra $L2; .loc 1 37 11 add.u32 %r38,%r38,%r39; $L2: .loc 1 39 13 set.u32.eq.u32 %r42,%r38,0; neg.s32 %r43,%r42; .loc 1 39 27 set.u32.eq.u32 %r45,%r38,%r39; neg.s32 %r46,%r45; .loc 1 39 18 cvt.u16.u32 %r48,%r43; cvt.u16.u32 %r49,%r46; or.b16 %r47,%r48,%r49; .loc 1 39 6 cvt.u32.u16 %r50,%r47; cvt.u16.u8 %r51,%r5; .loc 1 45 10 setp.eq.u32 %r53,%r39,32; @ %r53 bra $L6; .loc 1 45 67 mov.u32 %r55,-1; shl.b32 %r54,%r55,%r39; .loc 1 45 10 not.b32 %r32,%r54; bra $L4; $L6: mov.u32 %r32,-1; $L4: .loc 1 47 8 and.b32 %r35,%r37,%r32; .loc 1 49 31 shl.b32 %r56,%r35,%r38; .loc 1 49 22 xor.b32 %r57,%r56,%r37; and.b32 %r58,%r57,%r32; xor.b32 %r59,%r58,%r37; .loc 1 49 66 sub.u32 %r60,%r39,%r38; .loc 1 49 57 shr.u32 %r61,%r35,%r60; .loc 1 49 49 or.b32 %r36,%r59,%r61; bra $L1; $L5: .loc 1 40 12 mov.u32 %r36,%r37; $L1: .loc 1 50_gfortran_ishftc8gfortran_ishftc832u16 %r48; .reg .u16 %r49; .reg .u32mov.u64 %r37,%ar0; mov.u32 %r38,%ar1; mov.u32 %r39,%ar2; .loc 1 60 6 setp.ge.s32 %r40,%r38,0; @ %r40 bra $L8; .loc 1 61 11 add.u32 %r38,%r38,%r39; $L8: .loc 1 63 13 set.u32.eq.u32 %r42,%r38,0; neg.s32 %r43,%r42; .loc 1 63 27 set.u32.eq.u32 %r45,%r38,%r39; neg.s32 %r46,%r45; .loc 1 63 18 cvt.u16.u32 %r48,%r43; cvt.u16.u32 %r49,%r46; or.b16 %r47,%r48,%r49; .loc 1 63 6 cvt.u32.u16 %r50,%r47; cvt.u16.u8 %r51,%r50; setp.ne.u16 %r52,%r51,0; @ %r52 bra $L11; .loc 1 69 10 setp.eq.u32 %r53,%r39,64; @ %r53 bra $L12; .loc 1 69 67 mov.u64 %r55,-1; shl.b64 %r54,%r55,%r39; .loc 1 69 10 not.b64 %r32,%r54; bra $L10; $L12: mov.u64 %r32,-1; $L10: .loc 1 71 8 and.b64 %r35,%r37,%r32; .loc 1 73 31 shl.b64 %r56,%r35,%r38; .loc 1 73 22 xor.b64 %r57,%r56,%r37; and.b64 %r58,%r57,%r32; xor.b64 %r59,%r58,%r37; .loc 1 73 66 sub.u32 %r60,%r39,%r38; .loc 1 73 57 shr.u64 %r61,%r35,%r60; .loc 1 73 49 or.b64 %r36,%r59,%r61; bra $L7; $L11: .loc 1 64 12 mov.u64 %r36,%r37; $L7: .loc 1 74_gfortran_ishftc16 .visible .func _gfortran_ishftc16predpred %r57; .reg .u32 %r59; .reg .pred9186,[%r38]; ld.u64 %r187,[%r38+8]; mov.u32 %r40,%ar2; mov.u32 %r41,%ar3; .loc 1 85 6 setp.ge.s32 %r42,%r40,0; @ %r42 bra $L14; .loc 1 86 11 add.u32 %r40,%r40,%r41; $L14:or.b16 %r49,%r50,%r51; .loc 1 88 6; .loc 1 89 12 st.u64 [%r37],%r186; st.u64 [%r37+8],%r187; bra $L13; $L15: .loc 1 94 10 setp.eq.u32 %r57,%r41,128; @ %r57 bra $L24; .loc 1 94 70 add.u32 %r59,%r41,-64; setp.lt.s32 %r73,%r59,0; @ %r73 bra $L18; mov.u64 %r75,-1; shl.b64 %r189,%r75,%r59; mov.u64 %r188,0; bra $L19; $L18: mov.u64 %r77,-1; mov.u64 %r76,9223372036854775807; mov.u32 %r79,63; sub.u32 %r78,%r79,%r41; shr.u64 %r80,%r76,%r78; shl.b64 %r81,%r77,%r41; or.b64 %r189,%r81,%r80; mov.u64 %r188,%r81; $L19: .loc 1 94 10 not.b64 %r182,%r188; not.b64 %r183,%r189; bra $L17; $L24: mov.u64 %r182,-1; mov.u64 %r183,%r182; $L17: .loc 1 96 8 and.b64 %r184,%r186,%r182; and.b64 %r185,%r187,%r183; .loc 1 98 31 add.u32 %r97,%r40,-64; setp.lt.s32 %r111,%r97,0; @ %r111 bra $L20; shl.b64 %r191,%r184,%r97; mov.u64 %r190,0; bra $L21; $L20: shr.u64 %r114,%r184,1; mov.u32 %r117,63; sub.u32 %r116,%r117,%r40; shr.u64 %r118,%r114,%r116; shl.b64 %r119,%r185,%r40; or.b64 %r191,%r119,%r118; shl.b64 %r190,%r184,%r40; $L21: .loc 1 98 22 xor.b64 %r125,%r190,%r186; xor.b64 %r128,%r191,%r187; and.b64 %r132,%r125,%r182; and.b64 %r135,%r128,%r183; xor.b64 %r196,%r132,%r186; xor.b64 
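The _gfortran_ishftc4 body above is readable enough to show the standard ISHFTC recipe: normalize a negative shift, build a mask for the rightmost SIZE bits (with a special case to avoid a full-width shift), rotate the masked field, and leave the bits above it untouched. A hedged C restatement of the 32-bit logic follows; it mirrors the visible instruction sequence but is not the libgfortran source.

#include <stdint.h>

/* Circular shift of the rightmost `size` bits of `i` by `shift`
   (Fortran ISHFTC semantics).  Negative shifts rotate right. */
static int32_t ishftc4_sketch (int32_t i, int32_t shift, int32_t size)
{
  uint32_t mask, bits;

  if (shift < 0)
    shift += size;                  /* express a right rotation as a left one */
  if (shift == 0 || shift == size)
    return i;                       /* nothing moves */

  /* Mask for the rightmost `size` bits; the size == 32 case avoids an
     undefined full-width shift, matching the special case in the PTX. */
  mask = (size == 32) ? ~0u : ~(~0u << size);
  bits = (uint32_t) i & mask;

  return (int32_t) (((bits << shift) & mask)
                    | (bits >> (size - shift))
                    | ((uint32_t) i & ~mask));
}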
%r197,%r135,%r187; .loc 1 98 66 sub.u32 %r145,%r41,%r40; .loc 1 98 57 add.u32 %r147,%r145,-64; setp.lt.s32 %r161,%r147,0; @ %r161 bra $L22; shr.u64 %r198,%r185,%r147; mov.u64 %r199,0; bra $L23; $L22: shl.b64 %r164,%r185,1; mov.u32 %r167,63; sub.u32 %r166,%r167,%r145; shl.b64 %r168,%r164,%r166; shr.u64 %r169,%r184,%r145; or.b64 %r198,%r169,%r168; shr.u64 %r199,%r185,%r145; $L23: .loc 1 98 49 or.b64 %r174,%r196,%r198; st.u64 [%r37],%r174; or.b64 %r177,%r197,%r199; st.u64 [%r37+8],%r177; $L13: .loc 1 99 1 ret; } is_contiguous.o/1622802253gfortran_is_contiguous0gfortran_is_contiguous0../libgfortran/intrinsics/is_contiguousgfortran_is_contiguous0gfortran_is_contiguous0pred %r43; mov.u64 %r35,%ar0; .loc 1 35 7 ld.s8 %r28,[%r35+28]; .loc 1 38 3 setp.le.s64 %r36,%r28,0; @ %r36 bra $L4; add.u64 %r27,%r35,40; .loc 1 37 10 mov.u64 %r30,1; .loc 1 38 10 mov.u64 %r31,0; $L3: .loc 1 41 10 ld.u64 %r37,[%r27]; setp.ne.u64 %r38,%r37,%r30; @ %r38 bra $L5; .loc 1 44 17 ld.u64 %r40,[%r27+16]; ld.u64 %r42,[%r27+8]; sub.u64 %r41,%r40,%r42; .loc 1 44 14 mad.lo.u64 %r30,%r41,%r30,%r30; .loc 1 38 25 add.u64 %r31,%r31,1; .loc 1 38 3 add.u64 %r27,%r27,24; setp.ne.u64 %r43,%r28,%r31; @ %r43 bra $L3; .loc 1 47 10 mov.u32 %r34,1; bra $L1; $L4: mov.u32 %r34,1; bra $L1; $L5: .loc 1 42 9 mov.u32 %r34,0; $L1: .loc 1 48mvbits.o/ gfortran_mvbits_i1 .visible .func _gfortran_mvbits_i1.file 1 "../../../../libgfortran/intrinsics/mvbitgfortran_mvbits_i2 .visible .func _gfortran_mvbits_i2_gfortran_mvbits_i4 .visible .func _gfortran_mvbits_i4_gfortran_mvbits_i8 .visible .func _gfortran_mvbits_i8_gfortran_mvbits_i16 .visible .func _gfortran_mvbits_i16F: _gfortran_mvbits_i1 .visible .func _gfortran_mvbits_i1 {.reg .u16 %r52; .reg .u16u16 %r72; .reg .u32 %r74; .reg .u16 %r75; mov.u64 %r43,%ar0; mov.u64 %r44,%ar1; mov.u64 %r45,%ar2; mov.u64 %r46,%ar3; mov.u64 %r47,%ar4; .loc 1 46 14 ld.u32 %r22,[%r45]; .loc 1 46 11 setp.eq.u32 %r48,%r22,8; @ %r48 bra $L3; .loc 1 46 61 mov.u32 %r50,1; shl.b32 %r49,%r50,%r22; .loc 1 46 70 cvt.u16.u32 %r53,%r49; add.u16 %r52,%r53,-1; .loc 1 47 43 cvt.u32.u16 %r54,%r52; cvt.s32.s8 %r42,%r54; bra $L2; $L3: mov.u32 %r42,-1; $L2: .loc 1 47 57 ld.u32 %r32,[%r47]; .loc 1 47 15 ld.u8 %r55,[%r43]; .loc 1 47 30 ld.u32 %r57,[%r44]; shr.s32 %r56,%r55,%r57; .loc 1 47 43 and.b32 %r58,%r56,%r42; .loc 1 47 54 shl.b32 %r59,%r58,%r32; .loc 1 48 30 shl.b32 %r60,%r42,%r32; .loc 1 48 17 cvt.u16.u32 %r63,%r60; not.b16 %r62,%r63; .loc 1 48 11 ld.u8 %r75,[%r46]; mov.u16 %r68,%r75; and.b16 %r66,%r62,%r68; .loc 1 50 7 cvt.u16.u32 %r72,%r59; or.b16 %r71,%r72,%r66; cvt.u32.u16 %r74,%r71; st.u8 [%r46],%r74;_gfortran_mvbits_i2 .visible .func _gfortran_mvbits_i2 {mov.u64 %r43,%ar0; mov.u64 %r44,%ar1; mov.u64 %r45,%ar2; mov.u64 %r46,%ar3; mov.u64 %r47,%ar4; .loc 1 46 14 ld.u32 %r22,[%r45]; .loc 1 46 11 setp.eq.u32 %r48,%r22,16; @ %r48 bra $L6; .loc 1 46 61 mov.u32 %r50,1; shl.b32 %r49,%r50,%r22; .loc 1 46 70 cvt.u16.u32 %r52,%r49; add.u16 %r51,%r52,-1; .loc 1 47 43 cvt.s32.s16 %r42,%r51; bra $L5; $L6: mov.u32 %r42,-1; $L5: .loc 1 47 57 ld.u32 %r32,[%r47]; .loc 1 47 15 ld.u16 %r53,[%r43]; .loc 1 47 30 ld.u32 %r55,[%r44]; shr.s32 %r54,%r53,%r55; .loc 1 47 43 and.b32 %r56,%r54,%r42; .loc 1 47 54 shl.b32 %r57,%r56,%r32; .loc 1 48 30 shl.b32 %r58,%r42,%r32; .loc 1 48 17 cvt.u16.u32 %r60,%r58; not.b16 %r59,%r60; .loc 1 48 11 ld.u16 %r62,[%r46]; and.b16 %r61,%r59,%r62; .loc 1 50 7 cvt.u16.u32 %r64,%r57; or.b16 %r63,%r61,%r64; st.u16 [%r46],%r63;_gfortran_mvbits_i4 .visible .func _gfortran_mvbits_i4 {46 14 ld.u32 %r22,[%r41]; .loc 1 46 50 
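_gfortran_is_contiguous0 above walks the array descriptor and accepts it only if every dimension's stride equals the running product of the extents below it. The sketch below restates that test; the per-dimension descriptor layout (stride, lower bound, upper bound) is an assumption made for illustration.

/* Contiguity test: the stride of dimension d must equal the product of the
   extents of dimensions 0..d-1, in element units, starting at 1. */
struct dim_sketch { long stride, lbound, ubound; };

static int is_contiguous_sketch (const struct dim_sketch *dims, int rank)
{
  long expected = 1;
  for (int d = 0; d < rank; d++)
    {
      if (dims[d].stride != expected)
        return 0;                   /* gap or reordered stride: not contiguous */
      expected *= dims[d].ubound - dims[d].lbound + 1;
    }
  return 1;
}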
setp.eq.u32 %r44,%r22,32; @ %r44 bra $L9; .loc 1 46 61 mov.u32 %r46,1; shl.b32 %r45,%r46,%r22; .loc 1 46 50 add.u32 %r35,%r45,-1; .loc 1 47 43 mov.u32 %r38,%r35; bra $L8; $L9: mov.u32 %r38,-1; .loc 1 46 50 mov.u32 %r35,%r38; $L8: .loc 1 47 57 ld.u32 %r29,[%r43]; .loc 1 47 30 ld.u32 %r48,[%r39]; ld.u32 %r49,[%r40]; shr.u32 %r47,%r48,%r49; .loc 1 47 43 and.b32 %r50,%r47,%r38; .loc 1 47 54 shl.b32 %r51,%r50,%r29; .loc 1 48 30 shl.b32 %r52,%r35,%r29; .loc 1 48 20 not.b32 %r53,%r52; .loc 1 48 11 ld.u32 %r55,[%r42]; and.b32 %r54,%r53,%r55; .loc 1 50 17 or.b32 %r56,%r51,%r54; .loc 1 50 7 st.u32 [%r42],%r56;_gfortran_mvbits_i8 .visible .func _gfortran_mvbits_i8 {pred46 14 ld.u32 %r22,[%r41]; .loc 1 46 50 setp.eq.u32 %r44,%r22,64; @ %r44 bra $L12; .loc 1 46 61 mov.u64 %r46,1; shl.b64 %r45,%r46,%r22; .loc 1 46 50 add.u64 %r35,%r45,-1; .loc 1 47 43 mov.u64 %r38,%r35; bra $L11; $L12: mov.u64 %r38,-1; .loc 1 46 50 mov.u64 %r35,%r38; $L11: .loc 1 47 57 ld.u32 %r29,[%r43]; .loc 1 47 30 ld.u64 %r48,[%r39]; ld.u32 %r49,[%r40]; shr.u64 %r47,%r48,%r49; .loc 1 47 43 and.b64 %r50,%r47,%r38; .loc 1 47 54 shl.b64 %r51,%r50,%r29; .loc 1 48 30 shl.b64 %r52,%r35,%r29; .loc 1 48 20 not.b64 %r53,%r52; .loc 1 48 11 ld.u64 %r55,[%r42]; and.b64 %r54,%r53,%r55; .loc 1 50 17 or.b64 %r56,%r51,%r54; .loc 1 50 7 st.u64 [%r42],%r56;_gfortran_mvbits_i16 .visible .func _gfortran_mvbits_i16 {pred %r44; .reg .u32 %r46; .reg .predpredu32u64 %r124; .reg .pred %r142; .reg .u64 %r145; .reg .u32346 14 ld.u32 %r22,[%r41]; .loc 1 46 50 setp.eq.u32 %r44,%r22,128; @ %r44 bra $L23; .loc 1 46 61 add.u32 %r46,%r22,-64; setp.lt.s32 %r60,%r46,0; @ %r60 bra $L15; mov.u64 %r62,1; shl.b64 %r206,%r62,%r46; mov.u64 %r205,0; bra $L16; $L15: mov.u64 %r64,1; mov.u64 %r206,0; shl.b64 %r205,%r64,%r22; $L16: .loc 1 46 50 mov.u64 %r207,-1; add.u64 %r75,%r205,%r207; set.u32.lt.u64 %r80,%r75,%r205; cvt.s64.s32 %r78,%r80; add.u64 %r81,%r206,%r207; mov.u64 %r201,%r75; sub.u64 %r202,%r81,%r78; .loc 1 47 43 mov.u64 %r203,%r201; mov.u64 %r204,%r202; bra $L14; $L23: mov.u64 %r203,-1; mov.u64 %r204,%r203; .loc 1 46 50 mov.u64 %r201,%r203; mov.u64 %r202,%r203; $L14: .loc 1 47 57 ld.u32 %r29,[%r43]; .loc 1 47 30 ld.u32 %r87,[%r40]; add.u32 %r86,%r87,-64; setp.lt.s32 %r104,%r86,0; @ %r104 bra $L17; ld.u64 %r106,[%r39+8]; shr.u64 %r211,%r106,%r86; mov.u64 %r212,0; bra $L18; $L17: ld.u64 %r108,[%r39+8]; shl.b64 %r107,%r108,1; mov.u32 %r110,63; sub.u32 %r109,%r110,%r87; shl.b64 %r112,%r107,%r109; ld.u64 %r114,[%r39]; shr.u64 %r113,%r114,%r87; or.b64 %r211,%r113,%r112; shr.u64 %r212,%r108,%r87; $L18: .loc 1 47 43 and.b64 %r121,%r211,%r203; and.b64 %r124,%r212,%r204; .loc 1 47 54 add.u32 %r223,%r29,-64; setp.lt.s32 %r142,%r223,0; @ %r142 bra $L19; shl.b64 %r216,%r121,%r223; mov.u64 %r215,0; bra $L20; $L19: shr.u64 %r145,%r121,1; mov.u32 %r148,63; sub.u32 %r147,%r148,%r29; shr.u64 %r149,%r145,%r147; shl.b64 %r150,%r124,%r29; or.b64 %r216,%r150,%r149; shl.b64 %r215,%r121,%r29; $L20: .loc 1 48 30 setp.lt.s32 %r170,%r223,0; @ %r170 bra $L21; shl.b64 %r218,%r201,%r223; mov.u64 %r217,0; bra $L22; $L21: shr.u64 %r173,%r201,1; mov.u32 %r176,63; sub.u32 %r175,%r176,%r29; shr.u64 %r177,%r173,%r175; shl.b64 %r178,%r202,%r29; or.b64 %r218,%r178,%r177; shl.b64 %r217,%r201,%r29; $L22: .loc 1 48 20 not.b64 %r184,%r217; not.b64 %r186,%r218; .loc 1 48 11 ld.u64 %r190,[%r42]; and.b64 %r189,%r190,%r184; ld.u64 %r193,[%r42+8]; and.b64 %r192,%r193,%r186; .loc 1 50 17 or.b64 %r195,%r215,%r189; st.u64 [%r42],%r195; or.b64 %r198,%r216,%r192; st.u64 [%r42+8],%r198; .loc 1 51 1 ret; } 
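The _gfortran_mvbits_i1 through _i16 bodies above all implement the same bit-field move: build a LEN-bit mask (special-cased when LEN equals the full word width), extract the field at FROMPOS, and splice it into TO at TOPOS without disturbing the surrounding bits. A 32-bit C sketch of that operation follows; it illustrates the MVBITS semantics and is not the runtime source.

#include <stdint.h>

/* Copy `len` bits of `from`, starting at bit `frompos`, into `*to`
   starting at bit `topos` (Fortran MVBITS semantics). */
static void mvbits4_sketch (int32_t from, int frompos, int len,
                            int32_t *to, int topos)
{
  /* len == 32 would make (1u << len) undefined, hence the special case,
     mirroring the width test visible in the PTX. */
  uint32_t mask = (len == 32) ? ~0u : (1u << len) - 1;

  *to = (int32_t) ((((uint32_t) *to) & ~(mask << topos))
                   | ((((uint32_t) from >> frompos) & mask) << topos));
}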
move_alloc.o/gfortran_move_alloc .visible .func _gfortran_move_allocfortran/intrinsics/move__gfortran_move_alloc_c .visible .func _gfortran_move_alloc_c, F: _gfortran_move_alloc .visible .func _gfortran_move_allocmov.u64 %r38,%ar0; mov.u64 %r39,%ar1; .loc 1 37 3 ld.u64 %r40,[%r39];.loc 1 39 19 ld.s8 %r30,[%r38+28]; .loc 1 39 3 cvt.u16.u32 %r41,%r30; setp.le.s16 %r42,%r41,0; @ %r42 bra $L2; add.u32 %r43,%r30,-1; cvt.u64.u32 %r44,%r43; add.u64 %r46,%r44,%r44; add.u64 %r47,%r46,%r44; shl.b64 %r48,%r47,3; add.u64 %r37,%r48,64; mov.u64 %r27,40; .loc 1 44 7 mov.u64 %r50,0; $L3: add.u64 %r31,%r38,%r27; .loc 1 41 7 ld.u64 %r23,[%r31+8]; add.u64 %r32,%r39,%r27; st.u64 [%r32+8],%r23; ld.u64 %r24,[%r31+16]; st.u64 [%r32+16],%r24; ld.u64 %r25,[%r31]; st.u64 [%r32],%r25; .loc 1 44 7 ld.u64 %r49,[%r31+8]; st.u64 [%r31+16],%r49; st.u64 [%r31],%r50; .loc 1 39 3 add.u64 %r27,%r27,24; setp.ne.u64 %r51,%r27,%r37; @ %r51 bra $L3; $L2: .loc 1 48 20 ld.u64 %r28,[%r38+8]; .loc 1 48 14 st.u64 [%r39+8],%r28; .loc 1 49 3 ld.u64 %r52,[%r38+16]; st.u64 [%r39+16],%r52; ld.u64 %r53,[%r38+24]; st.u64 [%r39+24],%r53; .loc 1 50 23 ld.u64 %r29,[%r38]; .loc 1 50 17 st.u64 [%r39],%r29; .loc 1 51 19 mov.u64 %r54,0; st.u64 [%r38],%r54; .loc 1 5_gfortran_move_alloc_c .visible .func _gfortran_move_alloc_c_gfortran_move_alloc,(%out_arg1,%out_arg2); } .loc 1 65 1 ret; } pack_generic.o/ 1622802254BEGIN FUNCTION DECL: pack_internal .func pack_internal.file 1 "../../../../libgfortran/intrinsics/pack_generic.c" // BEGIN FUNCTION DECL: pack_s_internal .func pack_s_internal_gfortran_pack .visible .func _gfortran_packgfortran_pack_char .visible .func _gfortran_pack_char_gfortran_pack_char4 .visible .func _gfortran_pack_char4_gfortran_pack_s .visible .func _gfortran_pack_sgfortran_pack_s_char .visible .func _gfortran_pack_s_char_gfortran_pack_s_char4 .visible .func _gfortran_pack_s_char4); // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_count_0_gfortrani_count_0_gfortrani_xmallocarr_gfortrani_xmallocarray_gfortrani_pack_i1 .extern .func _gfortrani_pack_i1gfortrani_pack_i2 .extern .func _gfortrani_pack_i2gfortrani_pack_i4 .extern .func _gfortrani_pack_i4gfortrani_pack_i8 .extern .func _gfortrani_pack_i8gfortrani_pack_i16 .extern .func _gfortrani_pack_i16gfortrani_pack_r4 .extern .func _gfortrani_pack_r4gfortrani_pack_r8 .extern .func _gfortrani_pack_r8gfortrani_pack_c4 .extern .func _gfortrani_pack_c4gfortrani_pack_c8 .extern .func _gfortrani_pack_c874] = {73,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,65,67,75,32,105,110,116,114,105,110,115,105,99,59,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,0 }; // BEGIN FUNCTION DEF: pack_internal .func pack_internal {48pred %r155; .reg .pred %r156; .reg .pred %r157; .reg .predpredu64 %r198; .reg .predpredpredpred %r258; .reg .u64u64 %r284; .reg .u64 %r285; mov.u64 %r135,%ar0; mov.u64 %r136,%ar1; mov.u64 %r137,%ar2; mov.u64 %r138,%ar3; mov.u64 %r139,%ar4; .loc 1 94 9 ld.s8 %r22,[%r136+28]; .loc 1 94 7 cvt.u32.u32 %r140,%r22; cvt.s64.s8 %r93,%r140; .loc 1 96 8 ld.u64 %r107,[%r136]; .loc 1 97 8 ld.u64 %r45,[%r137]; .loc 1 102 15 ld.u64 %r23,[%r137+16]; .loc 1 102 13 cvt.u32.u64 %r94,%r23; .loc 1 104 53 add.u32 %r141,%r94,-4; and.b32 %r142,%r141,-5; set.u32.eq.u32 
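_gfortran_move_alloc above copies the per-dimension triplets and the data pointer from the FROM descriptor into TO and then nulls FROM's pointer, which is the observable effect of the MOVE_ALLOC intrinsic: TO takes over the allocation and FROM ends up unallocated. The sketch below shows that transfer with a deliberately simplified descriptor; per the Fortran standard any prior allocation of TO is released as well, which is not shown here.

#include <stddef.h>

struct alloc_sketch { void *data; size_t nelems; };   /* simplified descriptor */

static void move_alloc_sketch (struct alloc_sketch *from, struct alloc_sketch *to)
{
  to->data     = from->data;        /* TO takes over the allocation */
  to->nelems   = from->nelems;
  from->data   = NULL;              /* FROM is left unallocated */
  from->nelems = 0;
}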
%r144,%r142,0; neg.s32 %r145,%r144; .loc 1 104 22 add.u32 %r146,%r94,-1; set.u32.le.u32 %r148,%r146,1; neg.s32 %r149,%r148; .loc 1 104 6 cvt.u16.u32 %r151,%r145; cvt.u16.u32 %r152,%r149; or.b16 %r150,%r151,%r152; cvt.u32.u16 %r153,%r150; cvt.u16.u8 %r154,%r153; setp.eq.u16 %r155,%r154,0; @ %r155 bra $L2; $L5: .loc 1 117 3 setp.gt.s64 %r156,%r93,0; @ %r156 bra $L3; $L7: .loc 1 124 14 ld.u64 %r40,[%frame+360]; .loc 1 124 6 setp.ne.u64 %r157,%r40,0; .loc 1 125 16 selp.u64 %r40,%r40,%r139,%r157; .loc 1 126 6 ld.u64 %r186,[%frame+240]; setp.ne.u64 %r187,%r186,0; @ %r187 bra $L8; bra $L45; $L2: .loc 1 106 7 setp.eq.u32 %r158,%r94,16; @ %r158 bra $L5; .loc 1 115 5 cvta.const.u64 %r151_gfortran_runtime_error121 20 ld.u64 %r33,[%r136+16]; .loc 1 119 16 cvt.u32.u32 %r162,%r22; cvt.s64.s8 %r161,%r162; shl.b64 %r128,%r161,3; add.u64 %r163,%frame,120; mov.u32 %r167167169,[%value_in]; } mov.u64 %r121,40; mov.u64 %r122,0; add.u64 %r281,%frame,360; add.u64 %r280,%frame,240; $L6: add.u64 %r120,%r136,%r121; .loc 1 120 17 add.u64 %r171,%frame,%r122; .loc 1 120 19 ld.u64 %r173,[%r120+16]; add.u64 %r172,%r173,1; ld.u64 %r175,[%r120+8]; sub.u64 %r174,%r172,%r175; .loc 1 120 17 st.u64 [%r171],%r174; .loc 1 121 18 add.u64 %r177,%r281,%r122; .loc 1 121 20 ld.u64 %r179,[%r120]; mul.lo.u64 %r178,%r179,%r33; .loc 1 121 18 st.u64 [%r177],%r178; .loc 1 122 18 add.u64 %r181,%r280,%r122; .loc 1 122 20 add.u64 %r182,%r137,%r121; ld.u64 %r184,[%r182]; mul.lo.u64 %r183,%r184,%r23; .loc 1 122 18 st.u64 [%r181],%r183; .loc 1 117 3 add.u64 %r122,%r122,8; add.u64 %r121,%r121,24; setp.ne.u64 %r185,%r122,%r128; @ %r185 bra $L6; bra $L7; $L45: .loc 1 127 16 cvt.s64.s32 %r188,%r94; st.u64 [%frame+240],%r188; $L8: .loc 1 129 10 ld.u64 %r89,[%r135]; .loc 1 129 6 setp.eq.u64 %r189,%r89,0; @ %r189 bra $L9; .loc 1 129 33 cvta.global.u64 %r190,_gfortrani_compile_options; .loc 1 129 30 ld.u32 %r191,[%r190+36]; setp.eq.u32 %r192,%r191,0; @ %r192 bra $L10; $L9: .loc 1 134 10 setp.eq.u64 %r193,%r138,0; @ %r193 bra $L11; .loc 1 138 12 ld.u64 %r195,[%r138+56]; add.u64 %r194,%r195,1; .loc 1 138 10 ld.u64 %r196,[%r138+48]; sub.u64 %r91,%r194,%r196; bra $L12; $L11: .loc 1 144 12137; call (%value_in),_gfortrani_count_0,(%out_arg1); ld.param.u64 %r198,[%value_in]; } mov.u64 %r91,%r198; .loc 1 147 14 ld.u64 %r89,[%r135]; $L12: .loc 1 147 10 setp.ne.u64 %r199,%r89,0; @ %r199 bra $L13; .loc 1 150 4 st.u64 [%r135+48],%r89; add.u64 %r201,%r91,-1; st.u64 [%r135+56],%r201; mov.u64 %r202,1; st.u64 [%r135+40],%r202; .loc 1 152 16 st.u64 [%r135+8],%r89; .loc 1 154 21139; call (%value_in),_gfortrani_xmallocarray206,[%value_in]; } mov.u64 %r89,%r206; .loc 1 154 19 st.u64 [%r135],%r89; .loc 1 156 7 setp.ne.u64 %r208,%r91,0; @ %r208 bra $L10; bra $L1; $L13: .loc 1 164 17 ld.u64 %r210,[%r135+56]; add.u64 %r209,%r210,1; .loc 1 164 15 ld.u64 %r211,[%r135+48]; sub.u64 %r97,%r209,%r211; .loc 1 165 7 setp.eq.u64 %r212,%r91,%r97; @ %r212 bra $L10; .loc 1 166 6 st.u64 [%stack+8],%r97; st.u64 [%stack],%r91;_gfortran_runtime_error10: .loc 1 172 14 ld.u64 %r215,[%r135+40]; ld.u64 %r216,[%r135+16]; mul.lo.u64 %r88,%r215,%r216; .loc 1 173 6 setp.ne.u64 %r217,%r88,0; .loc 1 174 14 selp.u64 %r88,%r88,%r139,%r217; .loc 1 176 12 ld.u64 %r99,[%frame+240]; .loc 1 179 10 set.u32.ne.u64 %r219,%r107,0; neg.s32 %r220,%r219; .loc 1 179 15 set.u32.ne.u64 %r222,%r45,0; neg.s32 %r223,%r222; cvt.u16.u32 %r225,%r220; cvt.u16.u32 %r226,%r223; and.b16 %r224,%r225,%r226; .loc 1 179 9 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L15; .loc 1 193 32 
ld.u64 %r132,[%frame]; .loc 1 200 30 mul.lo.u64 %r96,%r132,%r40; .loc 1 201 30 mul.lo.u64 %r230,%r132,%r99; .loc 1 201 16 neg.s64 %r116,%r230; setp.le.s64 %r279,%r93,1; add.u64 %r284,%frame,360; add.u64 %r285,%frame,240; $L27: .loc 1 182 10 ld.s8 %r232,[%r450; @ %r233 bra $L16; .loc 1 185 11107139240,[%value_in]; } .loc 1 186 16 add.u64 %r89,%r89,%r88; $L16: .loc 1 189 12 add.u64 %r107,%r107,%r40; .loc 1 190 12 add.u64 %r45,%r45,%r99; .loc 1 191 15 ld.u64 %r242,[%frame+120]; add.u64 %r63,%r242,1; .loc 1 193 13 setp.eq.u64 %r243,%r63,%r132; @ %r243 bra $L17; .loc 1 191 15 st.u64 [%frame+120],%r63; bra $L27; $L17: .loc 1 197 20 mov.u64 %r244,0; st.u64 [%frame+120],%r244; .loc 1 201 16 add.u64 %r105,%r45,%r116; .loc 1 203 14 @ %r279 bra $L15; add.u64 %r108,%frame,128; .loc 1 200 30 mov.u64 %r65,%r96; .loc 1 203 14 mov.u64 %r127,8; .loc 1 202 12 mov.u64 %r106,1; bra $L20; $L21: .loc 1 197 20 st.u64 [%r108],%r244; .loc 1 200 30 mul.lo.u64 %r65,%r71,%r70; .loc 1 201 30 mul.lo.u64 %r248,%r73,%r70; .loc 1 201 16 sub.u64 %r105,%r45,%r248; .loc 1 202 12 add.u64 %r106,%r106,1; .loc 1 203 14 add.u64 %r108,%r108,8; add.u64 %r127,%r127,8; setp.eq.u64 %r249,%r93,%r106; @ %r249 bra $L15; $L20: .loc 1 211 23 ld.u64 %r250,[%r108]; add.u64 %r70,%r250,1; st.u64 [%r108],%r70; .loc 1 212 30 add.u64 %r252,%r284,%r127; ld.u64 %r71,[%r252]; .loc 1 212 20 sub.u64 %r253,%r71,%r65; add.u64 %r107,%r107,%r253; .loc 1 213 30 add.u64 %r255,%r285,%r127; ld.u64 %r73,[%r255]; .loc 1 213 20 add.u64 %r45,%r105,%r73; .loc 1 193 32 add.u64 %r256,%frame,%r127; ld.u64 %r75,[%r256]; .loc 1 193 13 setp.eq.u64 %r257,%r70,%r75; @ %r257 bra $L21; bra $L27; $L15: .loc 1 219 6 setp.ne.u64 %r258,%r138,0; @ ! %r258 bra $L1; .loc 1 221 11 ld.u64 %r260,[%r138+56]; add.u64 %r259,%r260,1; .loc 1 221 9 ld.u64 %r261,[%r138+48]; sub.u64 %r100,%r259,%r261; .loc 1 222 22 ld.u64 %r263,[%r135]; sub.u64 %r262,%r89,%r263; .loc 1 222 13 div.s64 %r101,%r262,%r88; .loc 1 223 10 setp.le.s64 %r264,%r100,%r101; @ %r264 bra $L1; .loc 1 225 22 ld.u64 %r265,[%r138+40]; ld.u64 %r266,[%r138+16]; mul.lo.u64 %r84,%r265,%r266; .loc 1 226 14 setp.ne.u64 %r267,%r84,0; selp.u64 %r84,%r84,%r139,%r267; .loc 1 229 16 ld.u64 %r269,[%r138]; mad.lo.u64 %r104,%r84,%r101,%r269; .loc 1 230 13 sub.u64 %r102,%r100,%r101; .loc 1 231 19 add.u64 %r103,%r102,-1; $L26: .loc 1 28939276,[%value_in]; } .loc 1 234 20 add.u64 %r89,%r89,%r88; .loc 1 235 20 add.u64 %r104,%r104,%r84; .loc 1 231 19 add.u64 %r103,%r103,-1; .loc 1 231 17 setp.ne.u64 %r278,%r103,-1; @ %r278 bra $L26; $L1: .loc 1 239FUNCTION DEF: pack_s_internal .func pack_s_internal {99predpred %r146; .reg .pred %r147; .reg .predpredpredpredpred %r173; .reg .u32u32 %r180; .reg .u16 %r181; .reg .u16 %r182; .reg .u16 %r183; .reg .u32predpredmov.u64 %r117,%ar0; mov.u64 %r118,%ar1; mov.u64 %r119,%ar2; mov.u64 %r120,%ar3; mov.u64 %r121,%ar4; .loc 1 470 9 ld.s8 %r22,[%r118+28]; .loc 1 470 7 cvt.u32.u32 %r122,%r22; cvt.s64.s8 %r72,%r122; .loc 1 473 14 st.u64 [%frame+240],%r121; .loc 1 475 3 setp.le.s64 %r123,%r72,0; @ %r123 bra $L69; .loc 1 482 20 ld.u64 %r29,[%r118+16]; .loc 1 477 16 shl.b64 %r106,%r72,3; add.u64 %r126,%frame,120; mov.u32 %r130106132,[%value_in]; } add.u64 %r89,%r118,40; mov.u64 %r87,0; .loc 1 474 9 mov.u64 %r96,1; add.u64 %r245,%frame,240; .loc 1 480 12 mov.u64 %r248,%r87; $L50: .loc 1 478 19 ld.u64 %r135,[%r89+16]; add.u64 %r134,%r135,1; ld.u64 %r136,[%r89+8]; sub.u64 %r26,%r134,%r136; .loc 1 479 10 setp.lt.s64 %r137,%r26,0; @ %r137 bra $L48; .loc 1 478 17 add.u64 %r138,%frame,%r87; st.u64 [%r138],%r26; .loc 1 483 
13 mul.lo.u64 %r96,%r96,%r26; bra $L49; $L48: .loc 1 480 12 add.u64 %r139,%frame,%r87; st.u64 [%r139],%r248; mov.u64 %r96,0; $L49: .loc 1 482 18 add.u64 %r142,%r245,%r87; .loc 1 482 20 ld.u64 %r144,[%r89]; mul.lo.u64 %r143,%r144,%r29; .loc 1 482 18 st.u64 [%r142],%r143; .loc 1 475 3 add.u64 %r89,%r89,24; add.u64 %r87,%r87,8; setp.ne.u64 %r145,%r87,%r106; @ %r145 bra $L50; .loc 1 485 14 ld.u64 %r86,[%frame+240]; .loc 1 485 6 setp.ne.u64 %r146,%r86,0; selp.u64 %r86,%r86,%r121,%r146; .loc 1 490 6 setp.eq.u64 %r147,%r96,0; @ %r147 bra $L70; bra $L47; $L69: .loc 1 486 16 mov.u64 %r86,%r121; .loc 1 474 9 mov.u64 %r96,1; $L47: .loc 1 491 10 ld.u64 %r78,[%r118]; bra $L52; $L70: .loc 1 493 10 mov.u64 %r78,%r96; $L52: .loc 1 495 10 ld.u64 %r66,[%r117]; .loc 1 495 6 setp.ne.u64 %r148,%r66,0; @ ! %r148 bra $L85; setp.eq.u64 %r244,%r120,0; bra $L53; $L85: .loc 1 499 10 setp.eq.u64 %r244,%r120,0; @ %r244 bra $L54; .loc 1 503 12 ld.u64 %r151,[%r120+56]; add.u64 %r150,%r151,1; .loc 1 503 10 ld.u64 %r152,[%r120+48]; sub.u64 %r76,%r150,%r152; .loc 1 504 7 setp.le.s64 %r153,%r76,0; @ ! %r153 bra $L86; bra $L55; $L54: .loc 1 512 7 ld.u32 %r154,[%r119]; setp.eq.u32 %r155,%r154,0; @ %r155 bra $L55; .loc 1 516 14 ld.u64 %r75,[%frame]; .loc 1 517 8 setp.le.s64 %r246,%r72,1; @ %r246 bra $L57; add.u64 %r99,%frame,8; cvt.u32.u32 %r158,%r22; cvt.s64.s8 %r157,%r158; shl.b64 %r159,%r157,3; add.u64 %r92,%frame,%r159; $L58: .loc 1 518 9 ld.u64 %r160,[%r99]; mul.lo.u64 %r75,%r75,%r160; .loc 1 517 8 add.u64 %r99,%r99,8; setp.ne.u64 %r161,%r92,%r99; @ %r161 bra $L58; $L57: .loc 1 526 7 mov.u64 %r162,0; st.u64 [%r117+48],%r162; add.u64 %r163,%r75,-1; st.u64 [%r117+56],%r163; mov.u64 %r164,1; st.u64 [%r117+40],%r164; .loc 1 528 19 st.u64 [%r117+8],%r162; .loc 1 530 24121; call (%value_in),_gfortrani_xmallocarray530 22 st.u64 [%r117],%r66; .loc 1 532 10 setp.ne.u64 %r170,%r75,0; @ ! 
%r170 bra $L46; $L53: .loc 1 536 14 ld.u64 %r171,[%r117+40]; ld.u64 %r172,[%r117+16]; mul.lo.u64 %r65,%r171,%r172; .loc 1 537 6 setp.ne.u64 %r173,%r65,0; selp.u64 %r65,%r65,%r121,%r173; .loc 1 547 6 ld.u32 %r175,[%r119]; set.u32.ne.u32 %r176,%r175,0; neg.s32 %r177,%r176; .loc 1 547 13 set.u32.ne.u64 %r179,%r96,0; neg.s32 %r180,%r179; cvt.u16.u32 %r182,%r177; cvt.u16.u32 %r183,%r180; and.b16 %r181,%r182,%r183; cvt.u32.u16 %r184,%r181; cvt.u16.u8 %r185,%r184; setp.eq.u16 %r186,%r185,0; @ %r186 bra $L60; .loc 1 549 13 setp.eq.u64 %r187,%r78,0; @ %r187 bra $L60; .loc 1 516 14 ld.u64 %r110,[%frame]; .loc 1 557 9 ld.u64 %r111,[%frame+120]; .loc 1 566 27 mul.lo.u64 %r114,%r86,%r110; setp.le.s64 %r246,%r72,1; $L64: .loc 1 552 41194,[%value_in]; } .loc 1 553 9 add.u64 %r66,%r66,%r65; .loc 1 556 9 add.u64 %r78,%r78,%r86; .loc 1 557 12 add.u64 %r111,%r111,1; .loc 1 559 10 setp.ne.u64 %r196,%r111,%r110; @ %r196 bra $L64; .loc 1 568 11 @ %r246 bra $L60; add.u64 %r84,%frame,128; add.u64 %r105,%frame,248; .loc 1 566 27 mov.u64 %r45,%r114; .loc 1 567 9 mov.u64 %r77,1; .loc 1 563 17 mov.u64 %r247,0; bra $L62; $L63: st.u64 [%r84],%r247; .loc 1 566 27 mul.lo.u64 %r45,%r48,%r47; .loc 1 567 9 add.u64 %r77,%r77,1; .loc 1 568 11 add.u64 %r84,%r84,8; add.u64 %r105,%r105,8; setp.eq.u64 %r201,%r72,%r77; @ %r201 bra $L60; $L62: .loc 1 576 13 ld.u64 %r202,[%r84]; add.u64 %r47,%r202,1; st.u64 [%r84],%r47; .loc 1 577 20 ld.u64 %r48,[%r105]; .loc 1 577 10 sub.u64 %r203,%r48,%r45; add.u64 %r78,%r78,%r203; .loc 1 559 29 shl.b64 %r204,%r77,3; add.u64 %r205,%frame,%r204; ld.u64 %r50,[%r205]; .loc 1 559 10 setp.eq.u64 %r206,%r47,%r50; @ %r206 bra $L63; mov.u64 %r111,0; bra $L64; $L60: .loc 1 584 6 @ %r244 bra $L46; .loc 1 586 11 ld.u64 %r209,[%r120+56]; add.u64 %r208,%r209,1; .loc 1 586 9 ld.u64 %r210,[%r120+48]; sub.u64 %r79,%r208,%r210; .loc 1 587 22 ld.u64 %r212,[%r117]; sub.u64 %r211,%r66,%r212; .loc 1 587 13 div.s64 %r80,%r211,%r65; .loc 1 588 10 setp.le.s64 %r213,%r79,%r80; @ %r213 bra $L46; .loc 1 590 22 ld.u64 %r214,[%r120+40]; ld.u64 %r215,[%r120+16]; mul.lo.u64 %r59,%r214,%r215; .loc 1 591 14 setp.ne.u64 %r216,%r59,0; selp.u64 %r59,%r59,%r121,%r216; .loc 1 594 16 ld.u64 %r218,[%r120]; mad.lo.u64 %r83,%r59,%r80,%r218; .loc 1 595 13 sub.u64 %r81,%r79,%r80; .loc 1 596 19 add.u64 %r82,%r81,-1; $L67: .loc 1 59121225,[%value_in]; } .loc 1 599 20 add.u64 %r66,%r66,%r65; .loc 1 600 20 add.u64 %r83,%r83,%r59; .loc 1 596 19 add.u64 %r82,%r82,-1; .loc 1 596 17 setp.ne.u64 %r227,%r82,-1; @ %r227 bra $L67; bra $L46; $L86: .loc 1 526 7 st.u64 [%r117+48],%r66; add.u64 %r229,%r76,-1; st.u64 [%r117+56],%r229; mov.u64 %r230,1; st.u64 [%r117+40],%r230; .loc 1 528 19 st.u64 [%r117+8],%r66; .loc 1 530 24121; call (%value_in),_gfortrani_xmallocarray530 22 st.u64 [%r117],%r66; bra $L53; $L55: .loc 1 526 7 mov.u64 %r236,0; st.u64 [%r117+48],%r236; mov.u64 %r237,-1; st.u64 [%r117+56],%r237; mov.u64 %r238,1; st.u64 [%r117+40],%r238; .loc 1 528 19 st.u64 [%r117+8],%r236; .loc 1 530 24call (%value_in),_gfortrani_xmallocarray2,[%value_in]; } .loc 1 530 22 st.u64 [%r117],%r242; $L46: .loc 1 604_gfortran_pack .visible .func _gfortran_packpred %r90predpred %r180; .reg .predpred %r184; mov.u64 %r66,%ar0; mov.u64 %r67,%ar1; mov.u64 %r68,%ar2; mov.u64 %r69,%ar3; .loc 1 252 15 ld.u64 %r26,[%r67+16]; ld.s8 %r70,[%r67+29]; shl.b32 %r71,%r70,4; cvt.s64.s32 %r72,%r71; shl.b64 %r73,%r26,7; or.b64 %r28,%r72,%r73; .loc 1 254 3 setp.eq.u64 %r74,%r28,1040; @ %r74 bra $L88; setp.gt.u64 %r75,%r28,1040; @ %r75 bra $L89; setp.eq.u64 %r76,%r28,288; @ %r76 bra 
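pack_internal and pack_s_internal above are the generic, stride-walking forms of the PACK intrinsic: they count the true MASK entries (or take the VECTOR length), allocate the result if needed through _gfortrani_xmallocarray, and copy the selected elements in array element order. For a contiguous rank-1 argument the effect reduces to the loop below; the names are illustrative.

#include <stddef.h>

/* Copy the elements of `array` whose mask entry is true into `out`,
   preserving element order; returns the number written.  When PACK has a
   VECTOR argument, the remaining slots of the (larger) result are then
   filled from it, which is not shown here. */
static size_t pack_sketch (const double *array, const _Bool *mask,
                           size_t n, double *out)
{
  size_t count = 0;
  for (size_t i = 0; i < n; i++)
    if (mask[i])
      out[count++] = array[i];
  return count;
}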
$L90; setp.gt.u64 %r77,%r28,288; @ %r77 bra $L91; setp.eq.u64 %r78,%r28,160; @ %r78 bra $L92; setp.eq.u64 %r79,%r28,272; @ %r79 bra $L90; setp.eq.u64 %r80,%r28,144; @ %r80 bra $L92; bra $L93; $L91: setp.eq.u64 %r81,%r28,544; @ %r81 bra $L94; setp.eq.u64 %r82,%r28,560; @ %r82 bra $L95; setp.eq.u64 %r83,%r28,528; @ %r83 bra $L94; bra $L93; $L89: setp.eq.u64 %r84,%r28,1088; @ %r84 bra $L96; setp.gt.u64 %r85,%r28,1088; @ %r85 bra $L97; setp.eq.u64 %r86,%r28,1056; @ %r86 bra $L88; setp.eq.u64 %r87,%r28,1072; @ %r87 bra $L98; bra $L93; $L97: setp.eq.u64 %r88,%r28,2080; @ %r88 bra $L99; setp.eq.u64 %r89,%r28,2112; @ %r89 bra $L100; setp.eq.u64 %r90,%r28,2064; @ %r90 bra $L99; bra $L93; $L92:66260 7 bra $L87; $L90:667 bra $L87; $L94:66272 7 bra $L87; $L88:66278 7 bra $L87; $L99:66call _gfortrani_pack_i16285 7 bra $L87; $L95:66291 7 bra $L87; $L98:667 bra $L87; $L96:66323 7 bra $L87; $L100:66328 7 bra $L87; $L93: .loc 1 356 3 setp.eq.u64 %r127,%r26,4; @ %r127 bra $L102; setp.gt.u64 %r128,%r26,4; @ %r128 bra $L103; setp.eq.u64 %r129,%r26,1; @ %r129 bra $L92; setp.eq.u64 %r130,%r26,2; @ %r130 bra $L104; bra $L105; $L103: setp.eq.u64 %r131,%r26,8; @ %r131 bra $L106; setp.eq.u64 %r132,%r26,16; @ %r132 bra $L107; bra $L105; $L104: .loc 1 364 11 ld.u64 %r134,[%r66]; and.b64 %r133,%r134,1; .loc 1 364 10 setp.ne.u64 %r135,%r133,0; @ %r135 bra $L105; .loc 1 364 46 ld.u64 %r137,[%r67]; and.b64 %r136,%r137,1; .loc 1 364 43 setp.ne.u64 %r138,%r136,0; @ %r138 bra $L105; .loc 1 365 4 setp.eq.u64 %r139,%r69,0; @ %r139 bra $L108; .loc 1 365 18 ld.u64 %r141,[%r69]; and.b64 %r140,%r141,1; .loc 1 365 15 setp.ne.u64 %r142,%r140,0; @ %r142 bra $L105; $L108:371 4 bra $L87; $L102: .loc 1 375 11 ld.u64 %r148,[%r66]; and.b64 %r147,%r148,3; .loc 1 375 10 setp.ne.u64 %r149,%r147,0; @ %r149 bra $L105; .loc 1 375 46 ld.u64 %r151,[%r67]; and.b64 %r150,%r151,3; .loc 1 375 43 setp.ne.u64 %r152,%r150,0; @ %r152 bra $L105; .loc 1 376 4 setp.eq.u64 %r153,%r69,0; @ %r153 bra $L109; .loc 1 376 18 ld.u64 %r155,[%r69]; and.b64 %r154,%r155,3; .loc 1 376 15 setp.ne.u64 %r156,%r154,0; @ %r156 bra $L105; $L109:382 4 bra $L87; $L106: .loc 1 386 11 ld.u64 %r162,[%r66]; and.b64 %r161,%r162,7; .loc 1 386 10 setp.ne.u64 %r163,%r161,0; @ %r163 bra $L105; .loc 1 386 46 ld.u64 %r165,[%r67]; and.b64 %r164,%r165,7; .loc 1 386 43 setp.ne.u64 %r166,%r164,0; @ %r166 bra $L105; .loc 1 387 4 setp.eq.u64 %r167,%r69,0; @ %r167 bra $L110; .loc 1 387 18 ld.u64 %r169,[%r69]; and.b64 %r168,%r169,7; .loc 1 387 15 setp.ne.u64 %r170,%r168,0; @ %r170 bra $L105; $L11393 4 bra $L87; $L107: .loc 1 398 11 ld.u64 %r176,[%r66]; and.b64 %r175,%r176,15; .loc 1 398 10 setp.ne.u64 %r177,%r175,0; @ %r177 bra $L105; .loc 1 398 47 ld.u64 %r179,[%r67]; and.b64 %r178,%r179,15; .loc 1 398 44 setp.ne.u64 %r180,%r178,0; @ %r180 bra $L105; .loc 1 399 4 setp.eq.u64 %r181,%r69,0; @ %r181 bra $L111; .loc 1 399 18 ld.u64 %r183,[%r69]; and.b64 %r182,%r183,15; .loc 1 399 15 setp.ne.u64 %r184,%r182,0; @ %r184 bra $L105; $L111:call _gfortrani_pack_i16405 4 bra $L87; $L105:66call p,%out_arg4,%out_arg5); } $L87_gfortran_pack_char .visible .func _gfortran_pack_char429 3 cvt.s64.s32 %r35,%r28; {p,%out_arg4,%out_arg5); }_gfortran_pack_char4 .visible .func _gfortran_pack_char4mov.u64 %r29,%ar4; mov.u32 %r30,%ar5; .loc 1 445 57 cvt.s64.s32 %r37,%r30; shl.b64 %r38,%r37,2; .loc 1 445 3 {call p,%out_arg4,%out_arg5); }_gfortran_pack_s .visible .func _gfortran_pack_s614 3 ld.u64 %r33,[%r25+16]; {call pack_s,%out_arg4,%out_arg5); }_gfortran_pack_s_char .visible .func _gfortran_pack_s_char631 3 
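_gfortran_pack above does not copy anything itself: it folds what appear to be the descriptor's element size and type code into a single key (one field shifted left by 7, the other by 4, in the generated code), switches on it to reach a kind-specific kernel such as _gfortrani_pack_i4 or _gfortrani_pack_r8, and falls back to the byte-wise pack_internal when the key is unknown or the data fails the alignment checks. The sketch below models that dispatch shape only; the key values, field meanings, and kernel names in it are assumptions.

#include <stddef.h>
#include <stdint.h>

static void pack_int4_kernel (void *ret, const void *array, const void *mask)
{ (void) ret; (void) array; (void) mask; }               /* stand-in fast path */

static void pack_bytes_generic (void *ret, const void *array,
                                const void *mask, size_t elem)
{ (void) ret; (void) array; (void) mask; (void) elem; }  /* stand-in fallback */

static void pack_dispatch_sketch (void *ret, const void *array,
                                  const void *mask, int type, size_t elem)
{
  /* One key per (element size, type) pair, switched on as in the PTX. */
  uint64_t key = ((uint64_t) elem << 7) | ((uint64_t) type << 4);

  switch (key)
    {
    case (4u << 7) | (1u << 4):       /* hypothetical 4-byte integer code */
      pack_int4_kernel (ret, array, mask);
      break;
    default:
      pack_bytes_generic (ret, array, mask, elem);
      break;
    }
}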
cvt.s64.s32 %r35,%r28; {pack_s_internal,(%out_arg1,%out_arg2,%out_arg3,%out_arg4,%out_arg5); } .loc 1 63_gfortran_pack_s_char4 .visible .func _gfortran_pack_s_char4mov.u64 %r29,%ar4; mov.u32 %r30,%ar5; .loc 1 649 19 cvt.s64.s32 %r37,%r30; shl.b64 %r38,%r37,2; .loc 1 648 3 {call pack_s,%out_arg4,%out_arg5); } .loc 1 650 1 ret; } /637 _gfortran_selected_char_gfortran_selected_char_kindfortran/intrinsics/selected_char_kind.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_f_gfortrani_fstrlenncasecmp15,111,95,49,48,54,52,54,0 }6] = {97,115,99,10508_gfortran_selected_char_gfortran_selected_char_kind32u64 %r46; .reg .u32 %r48; .reg .predr27; call (%value_in),_gfortrani_fstrle.loc 1 39 6 setp.ne.u64 %r32,%r31,5; @ %r32 bra $L2; .loc 1 39 20strncasecmp.loc 1 39 17 setp.ne.u32 %r37,%r36,0; .loc 1 41 12 selp.u32 %r26,-1,1,%r37; bra $L1; $L2: .loc 1 40 7 setp.ne.u64 %r38,%r31,7; @ %r38 bra $L4; .loc 1 40 23 cvta.const.u64 %r40,$LCcall (%value_in),strncasecmp.loc 1 40 20 setp.ne.u32 %r43,%r42,0; .loc 1 41 12 selp.u32 %r26,-1,1,%r43; bra $L1; $L4: .loc 1 42 11 setp.ne.u64 %r44,%r31,9; @ %r44 bra $L7; .loc 1 42 24 cvta.const.u64 %r46,$LC2call (%value_in),strncasecmp.loc 1 42 21 setp.ne.u32 %r49,%r48,0; .loc 1 43 12 selp.u32 %r26,-1,4,%r49; bra $L1; $L7: .loc 1 45 12 mov.u32 %r26,-1; $L1: .loc 1 46gfortran_size0gfortran_size0../libgfortran/intrinsics/siz_gfortran_size1gfortran_size1gfortran_size0gfortran_size0u64 %r29; .reg .u32 %r30; .reg .u64 %r33; .reg .u64 %r39pred %r56; mov.u64 %r42,%ar0; .loc 1 36 19 ld.s8 %r30,[%r42+28]; .loc 1 36 3 cvt.u16.u32 %r43,%r30; setp.le.s16 %r44,%r43,0; @ %r44 bra $L4; add.u64 %r33,%r42,48; add.u64 %r27,%r42,56; add.u32 %r45,%r30,-1; cvt.u64.u32 %r46,%r45; add.u64 %r48,%r46,%r46; add.u64 %r49,%r48,%r46; shl.b64 %r50,%r49,3; add.u64 %r51,%r42,72; add.u64 %r39,%r50,%r51; .loc 1 35 8 mov.u64 %r41,1; $L3: .loc 1 38 13 ld.u64 %r53,[%r27]; add.u64 %r52,%r53,1; .loc 1 38 11 ld.u64 %r54,[%r33]; sub.u64 %r29,%r52,%r54; .loc 1 41 12 max.s64 %r55,%r29,0; mul.lo.u64 %r41,%r41,%r55; .loc 1 36 3 add.u64 %r33,%r33,24; add.u64 %r27,%r27,24; setp.ne.u64 %r56,%r33,%r39; @ %r56 bra $L3; bra $L1; $L4: .loc 1 35 8 mov.u64 %r41,1; $L1: .loc 1 44gfortran_size1gfortran_size1d.u64 %r25,%r29,-1; .loc 1 57 10 add.u64 %r31,%r25,%r25; add.u64 %r32,%r31,%r25; shl.b64 %r33,%r32,3; add.u64 %r34,%r28,%r33; ld.u64 %r37,[%r34+56]; add.u64 %r36,%r37,1; .loc 1 57 8 ld.u64 %r44,[%r34+48]; sub.u64 %r26,%r36,%r44; .loc 1 60 10 max.s64 %value,%r26,0; .loc 1 61 /659 BEGIN FUNCTION DECL: spread_internal .func spread_internal../libgfortran/intrinsics/spread_generic.c" // BEGIN FUNCTION DECL: spread_internal_scalar$isra$0 .func spread_internal_scalar$isra$0gfortran_spread .visible .func _gfortran_spreadgfortran_spread_char .visible .func _gfortran_spread_char_gfortran_spread_char4 .visible .func _gfortran_spread_char4_gfortran_spread_scalar .visible .func _gfortran_spread_scalargfortran_spread_char_scalar .visible .func _gfortran_spread_char_scalar_gfortran_spread_char4_scalar .visible .func _gfortran_spread_char4_scalarGLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray_gfortrani_spread_i1 .extern .func _gfortrani_spread_i1gfortrani_spread_i2 .extern .func _gfortrani_spread_i2gfortrani_spread_i4 .extern .func _gfortrani_spread_i4gfortrani_spread_i8 .extern .func _gfortrani_spread_i8gfortrani_spread_i16 .extern .func 
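_gfortran_size0 above computes the total element count the way SIZE defines it: the product over all dimensions of max(ubound - lbound + 1, 0), so a dimension with negative extent contributes zero. The sketch below restates that loop; the descriptor layout is assumed for illustration.

struct dim_sketch { long lbound, ubound, stride; };   /* assumed layout */

static long size0_sketch (const struct dim_sketch *dims, int rank)
{
  long size = 1;
  for (int d = 0; d < rank; d++)
    {
      long extent = dims[d].ubound - dims[d].lbound + 1;
      size *= extent > 0 ? extent : 0;   /* empty dimensions clamp to 0 */
    }
  return size;
}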
_gfortrani_spread_i16gfortrani_spread_r4 .extern .func _gfortrani_spread_r4gfortrani_spread_r8 .extern .func _gfortrani_spread_r8gfortrani_spread_c4 .extern .func _gfortrani_spread_c4gfortrani_spread_c8 .extern .func _gfortrani_spread_c8gfortrani_spread_scalar_i1 .extern .func _gfortrani_spread_scalar_i1gfortrani_spread_scalar_i2 .extern .func _gfortrani_spread_scalar_i2gfortrani_spread_scalar_i4 .extern .func _gfortrani_spread_scalar_i4gfortrani_spread_scalar_i8 .extern .func _gfortrani_spread_scalar_i8gfortrani_spread_scalar_i16 .extern .func _gfortrani_spread_scalar_i16gfortrani_spread_scalar_r4 .extern .func _gfortrani_spread_scalar_r4gfortrani_spread_scalar_r8 .extern .func _gfortrani_spread_scalar_r8gfortrani_spread_scalar_c4 .extern .func _gfortrani_spread_scalar_c4gfortrani_spread_scalar_c8 .extern .func _gfortrani_spread_scalar_c8114,101,116,117,114,110,32,114,97,110,107,32,116,111,111,32,108,97,114,103,101,32,105,110,32114,97,110,107,32,109,105,115,109,97,116,99,104,32,105,110,32,115,112,114,101,97,100,40,473,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,80,82,69,65,68,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10105,110,99,111,114,114,101,99,116,32,100,101,115,116,105,110,97,116,105,111,110,32,114,97,110,107,32,105,110,32,115,112,114,101,97,100,40,410,105,109,32,111,117,116,115,105,100,101,32,111,102,32,114,97,110,107,32,105,110,32,115,112,114,101,97,100,40,41114,101,116,117,114,110,32,97,114,114,97,121,32,109,105,115,115,105,110,103,32,100,101,115,99,114,105,112,116,111,114,32,105,110,32,115,112,114,101,97,100,40,41100,105,109,32,116,111,111,32,108,97,114,103,101,32,105,110,32,115,112,114,101,97,100,40,41,0 }; // BEGIN FUNCTION DEF: spread_internal .func spread_internalu64 %r97; .reg .u6432pred %r160; .reg .predpredu64 %r198; .reg .predu64predpredpredu64 %r342; .reg .u64u64 %r357; .reg .u64mov.u64 %r149,%ar0; mov.u64 %r150,%ar1; mov.u64 %r151,%ar2; mov.u64 %r152,%ar3; .loc 1 54 10 ld.u64 %r22,[%r150+16]; .loc 1 56 11 ld.s8 %r23,[%r150+28]; .loc 1 56 9 cvt.u32.u32 %r153,%r23; cvt.s64.s8 %r118,%r153; .loc 1 58 9 add.u64 %r119,%r118,1; .loc 1 59 6 setp.le.s64 %r154,%r119,15; @ %r154 bra $L2; .loc 1 60 5 cvta.const.u64 %r11_gfortran_runtime_errorld.u64 %r24,[%r151]; .loc 1 62 6 setp.le.s64 %r157,%r24,%r119; @ %r157 bra $L3; .loc 1 63_gfortran_runtime_errorr120,[%r152]; .loc 1 67 10 ld.u64 %r128,[%r149]; .loc 1 67 6 setp.ne.u64 %r160,%r128,0; @ %r160 bra $L4; .loc 1 74 23 st.u8 [%r149+28],%r119; .loc 1 78 7 setp.le.s64 %r161,%r119,0; @ %r161 bra $L5; add.u64 %r102,%r149,40; .loc 1 76 11 mov.u64 %r107,%r128; .loc 1 78 14 mov.u64 %r124,%r107; .loc 1 77 10 mov.u64 %r140,1; .loc 1 36 14 mov.u64 %r99,%r107; .loc 1 89 19 mov.u64 %r365,%r107; .loc 1 83 21 add.u64 %r366,%r120,-1; bra $L8; $L35: mov.u64 %r140,%r100; $L8: .loc 1 84 15 mul.lo.u64 %r146,%r22,%r140; .loc 1 81 20 ld.u64 %r165,[%r151]; add.u64 %r164,%r165,-1; .loc 1 81 7 setp.ne.u64 %r166,%r164,%r124; @ %r166 bra $L6; .loc 1 83 21 mov.u64 %r36,%r366; .loc 1 85 11 mul.lo.u64 %r100,%r120,%r140; .loc 1 84 15 mov.u64 %r99,%r146; bra $L7; $L6: .loc 1 89 19 shl.b64 %r167,%r107,3; add.u64 %r168,%frame,%r167; st.u64 [%r168+120],%r365; .loc 1 90 22 add.u64 %r172,%r107,%r107; add.u64 %r173,%r172,%r107; shl.b64 %r174,%r173,3; add.u64 %r175,%r150,%r174; ld.u64 %r178,[%r175+56]; add.u64 %r177,%r178,1; ld.u64 
%r185,[%r175+48]; sub.u64 %r31,%r177,%r185; .loc 1 90 20 st.u64 [%r168],%r31; .loc 1 91 23 ld.u64 %r198,[%r175+40]; mul.lo.u64 %r197,%r198,%r22; .loc 1 91 21 st.u64 [%r168+240],%r197; .loc 1 92 21 st.u64 [%r168+360],%r146; .loc 1 94 24 add.u64 %r36,%r31,-1; .loc 1 95 11 mul.lo.u64 %r100,%r31,%r140; .loc 1 96 11 add.u64 %r107,%r107,1; $L7: .loc 1 99 4 st.u64 [%r102+8],%r365; st.u64 [%r102+16],%r36; st.u64 [%r102],%r140; .loc 1 78 31 add.u64 %r124,%r124,1; .loc 1 78 7 add.u64 %r102,%r102,24; setp.ne.u64 %r203,%r119,%r124; @ %r203 bra $L35; bra $L43; $L5: .loc 1 101 19 st.u64 [%r149+8],%r128; .loc 1 102 24 mov.u64 %r20522; call (%value_in),_gfortrani_xmallocarray207,[%value_in]; } mov.u64 %r128,%r207; .loc 1 102 22 st.u64 [%r149],%r128; .loc 1 180 12 ld.u64 %r103,[%frame+240]; .loc 1 36 14 mov.u64 %r99,0; bra $L10; $L43: .loc 1 101 19 st.u64 [%r149+8],%r365; .loc 1 102 2410_gfortrani_xmallocarray212,[%value_in]; } mov.u64 %r128,%r212; .loc 1 102 22 st.u64 [%r149],%r128; .loc 1 104 10 setp.le.s64 %r214,%r100,0; @ %r214 bra $L1; .loc 1 180 12 ld.u64 %r103,[%frame+240]; bra $L10; $L4: .loc 1 114 11 ld.s8 %r215,[%r149+28]; .loc 1 114 10 setp.eq.u64 %r216,%r215,%r119; @ %r216 bra $L12; .loc 1 115 2 cvta.const.u64 %r217,$LC2stack; call _gfortran_runtime_error12: .loc 1 117 26 cvta.global.u64 %r219,_gfortrani_compile_options; ld.u32 %r111,[%r219+36]; .loc 1 117 10 setp.ne.u32 %r220,%r111,0; @ %r220 bra $L13; .loc 1 155 4 setp.gt.s64 %r221,%r215,0; @ %r221 bra $L14; .loc 1 36 14 mov.u64 %r99,0; bra $L15; $L13: .loc 1 119 4 setp.le.s64 %r222,%r215,0; @ %r222 bra $L36; .loc 1 157 24 add.u64 %r112,%r24,-1; add.u64 %r63,%r149,40; .loc 1 111 18 mov.u32 %r111,0; .loc 1 113 11 mov.u64 %r108,0; .loc 1 119 11 mov.u64 %r123,%r108; .loc 1 36 14 mov.u64 %r99,%r108; .loc 1 136 16 mov.u64 %r367,%r108; $L20: .loc 1 123 21 ld.u64 %r226,[%r63+16]; add.u64 %r225,%r226,1; .loc 1 123 19 ld.u64 %r227,[%r63+8]; sub.u64 %r122,%r225,%r227; .loc 1 124 11 setp.ne.u64 %r228,%r112,%r123; @ %r228 bra $L16; .loc 1 126 14 ld.u64 %r229,[%r63]; ld.u64 %r230,[%r149+16]; mul.lo.u64 %r99,%r229,%r230; .loc 1 128 8 setp.eq.u64 %r231,%r120,%r122; @ %r231 bra $L17; .loc 1 129 7 st.u64 [%stack+16],%r120; st.u64 [%stack+8],%r122; add.u64 %r233,%r112,1;_gfortran_runtime_error16: .loc 1 136 16 shl.b64 %r356,%r108,3; add.u64 %r236,%frame,%r356; add.u64 %r237,%r236,120; st.u64 [%r237],%r367; .loc 1 137 19 add.u64 %r358,%r108,%r108; add.u64 %r354,%r358,%r108; shl.b64 %r242,%r354,3; add.u64 %r243,%r150,%r242; ld.u64 %r246,[%r243+56]; add.u64 %r245,%r246,1; ld.u64 %r253,[%r243+48]; sub.u64 %r51,%r245,%r253; .loc 1 137 17 st.u64 [%r236],%r51; .loc 1 138 8 setp.eq.u64 %r256,%r51,%r122; @ %r256 bra $L18; .loc 1 139 7 st.u64 [%stack+16],%r51; st.u64 [%stack+8],%r122; add.u64 %r258,%r123,1;_gfortran_runtime_error18: .loc 1 145 8 setp.gt.s64 %r260,%r51,0; .loc 1 146 18 selp.u32 %r111,%r111,1,%r260; .loc 1 147 20 ld.u64 %r271,[%r243+40]; mul.lo.u64 %r270,%r271,%r22; .loc 1 147 18 st.u64 [%r236+240],%r270; .loc 1 148 20 ld.u64 %r276,[%r63]; ld.u64 %r277,[%r149+16]; mul.lo.u64 %r275,%r276,%r277; .loc 1 148 18 st.u64 [%r236+360],%r275; .loc 1 149 8 add.u64 %r108,%r108,1; $L17: .loc 1 119 28 add.u64 %r123,%r123,1; .loc 1 119 4 add.u64 %r63,%r63,24; setp.ne.u64 %r278,%r215,%r123; @ %r278 bra $L20; bra $L21; $L14: .loc 1 157 24 add.u64 %r105,%r24,-1; add.u64 %r104,%r149,40; .loc 1 159 14 ld.u64 %r138,[%r149+16]; .loc 1 113 11 mov.u64 %r109,0; .loc 1 155 11 mov.u64 %r121,%r109; .loc 1 36 14 mov.u64 %r99,%r109; .loc 1 163 16 mov.u64 %r368,%r109; $L24: .loc 1 159 
14 ld.u64 %r281,[%r104]; mul.lo.u64 %r147,%r138,%r281; .loc 1 157 11 setp.eq.u64 %r282,%r105,%r121; @ %r282 bra $L37; .loc 1 163 16 shl.b64 %r357,%r109,3; add.u64 %r284,%frame,%r357; add.u64 %r285,%r284,120; st.u64 [%r285],%r368; .loc 1 164 19 add.u64 %r359,%r109,%r109; add.u64 %r355,%r359,%r109; shl.b64 %r290,%r355,3; add.u64 %r291,%r150,%r290; ld.u64 %r294,[%r291+56]; add.u64 %r293,%r294,1; ld.u64 %r301,[%r291+48]; sub.u64 %r69,%r293,%r301; .loc 1 164 17 st.u64 [%r284],%r69; .loc 1 165 8 setp.gt.s64 %r304,%r69,0; .loc 1 166 18 selp.u32 %r111,%r111,1,%r304; .loc 1 167 20 ld.u64 %r315,[%r291+40]; mul.lo.u64 %r314,%r315,%r22; .loc 1 167 18 st.u64 [%r284+240],%r314; .loc 1 168 18 st.u64 [%r284+360],%r147; .loc 1 169 8 add.u64 %r109,%r109,1; bra $L22; $L37: .loc 1 159 12 mov.u64 %r99,%r147; $L22: .loc 1 155 28 add.u64 %r121,%r121,1; .loc 1 155 4 add.u64 %r104,%r104,24; setp.ne.u64 %r319,%r215,%r121; @ %r319 bra $L24; $L21: .loc 1 174 10 setp.ne.u32 %r320,%r111,0; @ ! %r320 bra $L15; bra $L1; $L36: .loc 1 36 14 mov.u64 %r99,0; $L15: .loc 1 177 18 ld.u64 %r80,[%frame+240]; .loc 1 177 10 setp.eq.u64 %r321,%r80,0; selp.u64 %r103,%r22,%r80,%r321; $L10: .loc 1 181 12 ld.u64 %r125,[%frame+360]; .loc 1 183 8 ld.u64 %r127,[%r150]; .loc 1 185 9 setp.ne.u64 %r322,%r127,0; @ %r322 bra $L25; bra $L1; $L32: .loc 1 192 16 mov.u64 %r129,%r128; .loc 1 189 14 mov.u64 %r130,0; $L26: .loc 1 191 1127329,[%value_in]; } .loc 1 192 16 add.u64 %r129,%r129,%r99; .loc 1 189 33 add.u64 %r130,%r130,1; .loc 1 189 7 setp.ne.u64 %r331,%r120,%r130; @ %r331 bra $L26; $L33: .loc 1 195 12 add.u64 %r127,%r127,%r103; .loc 1 196 12 add.u64 %r128,%r128,%r125; .loc 1 197 15 ld.u64 %r332,[%frame+120]; add.u64 %r82,%r332,1; .loc 1 199 13 setp.eq.u64 %r333,%r82,%r98; @ %r333 bra $L27; .loc 1 197 15 st.u64 [%frame+120],%r82; bra $L34; $L27: .loc 1 203 20 mov.u64 %r334,0; st.u64 [%frame+120],%r334; .loc 1 207 16 add.u64 %r126,%r128,%r141; .loc 1 209 14 @ %r360 bra $L1; add.u64 %r142,%frame,128; .loc 1 206 30 mov.u64 %r86,%r79; .loc 1 209 14 mov.u64 %r135,8; bra $L30; $L31: .loc 1 203 20 st.u64 [%r142],%r334; .loc 1 206 30 mul.lo.u64 %r86,%r92,%r91; .loc 1 207 30 mul.lo.u64 %r338,%r94,%r91; .loc 1 207 16 sub.u64 %r126,%r128,%r338; .loc 1 209 14 add.u64 %r142,%r142,8; add.u64 %r135,%r135,8; setp.eq.u64 %r339,%r106,%r135; @ %r339 bra $L1; $L30: .loc 1 217 23 ld.u64 %r340,[%r142]; add.u64 %r91,%r340,1; st.u64 [%r142],%r91; .loc 1 218 30 add.u64 %r342,%r363,%r135; ld.u64 %r92,[%r342]; .loc 1 218 20 sub.u64 %r343,%r92,%r86; add.u64 %r127,%r127,%r343; .loc 1 219 30 add.u64 %r345,%r364,%r135; ld.u64 %r94,[%r345]; .loc 1 219 20 add.u64 %r128,%r126,%r94; .loc 1 199 32 add.u64 %r346,%frame,%r135; ld.u64 %r97,[%r346]; .loc 1 199 13 setp.eq.u64 %r347,%r91,%r97; @ %r347 bra $L31; bra $L34; $L25: .loc 1 199 32 ld.u64 %r98,[%frame]; .loc 1 206 30 mul.lo.u64 %r79,%r103,%r98; .loc 1 207 30 mul.lo.u64 %r348,%r125,%r98; .loc 1 207 16 neg.s64 %r141,%r348; cvt.u32.u32 %r350,%r23; cvt.s64.s8 %r349,%r350; shl.b64 %r106,%r349,3; setp.le.s64 %r360,%r118,1; setp.gt.s64 %r361,%r120,0; add.u64 %r363,%frame,240; add.u64 %r364,%frame,360; $L34: .loc 1 189 7 @ %r361 bra $L32; bra $L33; $L1FUNCTION DEF: spread_internal_scalar$isra$0 .func spread_internal_scalar$isra$0pred %r56; .reg .u64 %r57; .reg .u64u32 %r63; .reg .u64u64 %r100; .reg .pred %r102; mov.u64 %r50,%ar0; mov.u64 %r51,%ar1; mov.u64 %r52,%ar2; mov.u64 %r53,%ar3; .loc 1 233 7 cvt.u32.u64 %r22,%r53; .loc 1 237 8 ld.u64 %r23,[%r50+16]; .loc 1 239 6 ld.s8 %r55,[%r50+28]; cvt.u16.u32 %r54,%r55; setp.eq.u16 
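spread_internal above implements the array form of SPREAD: the result gains one extra dimension of extent NCOPIES at position DIM, and every source element is replicated along it, with stride bookkeeping for arbitrary rank. For a contiguous rank-1 source and DIM = 1 the effect collapses to the double loop below; the names and the flat indexing are illustrative.

#include <stddef.h>

/* SPREAD(source, dim=1, ncopies) for a rank-1 source of length n:
   result(c, i) = source(i), stored here with c varying fastest. */
static void spread_rank1_sketch (const double *source, size_t n,
                                 size_t ncopies, double *ret)
{
  for (size_t i = 0; i < n; i++)
    for (size_t c = 0; c < ncopies; c++)
      ret[i * ncopies + c] = source[i];
}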
%r56,%r54,1; @ %r56 bra $L45; .loc 1 240 5 cvta.const.u64 %r57,$LC4; {stack; call _gfortran_runtime_error45: .loc 1 242 6 ld.u64 %r59,[%r52]; setp.le.s64 %r60,%r59,1; @ %r60 bra $L46; .loc 1 243 5 cvta.const.u64 %r61,$LC1stack; call _gfortran_runtime_error46: .loc 1 249 7 add.u32 %r63,%r22,-1; cvt.s64.s32 %r49,%r63; .loc 1 245 6 ld.u64 %r64,[%r50]; setp.ne.u64 %r65,%r64,0; @ %r65 bra $L47; .loc 1 247 24 cvt.s64.s32 %r68,%r2call (%value_in),_gfortrani_xmallocarray.loc 1 247 22 st.u64 [%r50],%r69; .loc 1 248 19 st.u64 [%r50+8],%r64; .loc 1 249 7 st.u64 [%r50+48],%r64; st.u64 [%r50+56],%r49; mov.u64 %r73,1; st.u64 [%r50+40],%r73; $L49: .loc 1 258 3 setp.gt.s32 %r74,%r22,0; @ %r74 bra $L48; bra $L44; $L47: .loc 1 253 26 ld.u64 %r76,[%r50+56]; ld.u64 %r78,[%r50+48]; .loc 1 253 56 sub.u64 %r79,%r76,%r78; .loc 1 254 7 ld.u64 %r82,[%r50+40]; div.s64 %r81,%r79,%r82; .loc 1 253 10 setp.ge.s64 %r83,%r81,%r49; @ %r83 bra $L49; .loc 1 255 2 cvta.const.u64 %r84,$LC5; {stack; call _gfortran_runtime_error48: cvt.u32.u64 %r87,%r53; add.u32 %r86,%r87,-1; cvt.u64.u32 %r45,%r86; .loc 1 258 3 mov.u64 %r43,0; $L50: .loc 1 260 43 ld.u64 %r89,[%r50+40]; ld.u64 %r90,[%r50+16]; mul.lo.u64 %r88,%r89,%r90; .loc 1 260 12 ld.u64 %r93,[%r50]; mad.lo.u64 %r92,%r88,%r43,%r93; .loc 1 261 7100,[%value_in]; } mov.u64 %r44,%r43; .loc 1 258 3 add.u64 %r43,%r43,1; setp.ne.u64 %r102,%r44,%r45; @ %r102 bra $L50; $L44_gfortran_spread .visible .func _gfortran_spreadpred %r90; .reg .predpredpred.reg .predpredpredpredpredpred %r181; .reg .u64 %r184; .reg .u64 %r185; mov.u64 %r79,%ar0; mov.u64 %r80,%ar1; mov.u64 %r81,%ar2; mov.u64 %r82,%ar3; .loc 1 275 15 ld.u64 %r26,[%r79+16]; ld.s8 %r83,[%r79+29]; shl.b32 %r84,%r83,4; cvt.s64.s32 %r85,%r84; shl.b64 %r86,%r26,7; or.b64 %r28,%r85,%r86; .loc 1 276 3 setp.eq.u64 %r87,%r28,1040; @ %r87 bra $L54; setp.gt.u64 %r88,%r28,1040; @ %r88 bra $L55; setp.eq.u64 %r89,%r28,288; @ %r89 bra $L56; setp.gt.u64 %r90,%r28,288; @ %r90 bra $L57; setp.eq.u64 %r91,%r28,160; @ %r91 bra $L58; setp.eq.u64 %r92,%r28,272; @ %r92 bra $L56; setp.eq.u64 %r93,%r28,144; @ %r93 bra $L58; bra $L59; $L57: setp.eq.u64 %r94,%r28,544; @ %r94 bra $L60; setp.eq.u64 %r95,%r28,560; @ %r95 bra $L61; setp.eq.u64 %r96,%r28,528; @ %r96 bra $L60; bra $L59; $L55: setp.eq.u64 %r97,%r28,1088; @ %r97 bra $L62; setp.gt.u64 %r98,%r28,1088; @ %r98 bra $L63; setp.eq.u64 %r99,%r28,1056; @ %r99 bra $L54; setp.eq.u64 %r100,%r28,1072; @ %r100 bra $L64; bra $L59; $L63: setp.eq.u64 %r101,%r28,2080; @ %r101 bra $L65; setp.eq.u64 %r102,%r28,2112; @ %r102 bra $L66; setp.eq.u64 %r103,%r28,2064; @ %r103 bra $L65; bra $L59; $L58: .loc 1 280 7 ld.u64 %r107,[%r82]; ld.u64 %r106,[%r106107; call _gfortrani_spread_282 7 bra $L53; $L56: .loc 1 286 7 ld.u64 %r111,[%r82]; ld.u64 %r110,[%r110111; call _gfortrani_spread_288 7 bra $L53; $L60: .loc 1 292 7 ld.u64 %r115,[%r82]; ld.u64 %r114,[%r114115; call _gfortrani_spread_294 7 bra $L53; $L54: .loc 1 298 7 ld.u64 %r119,[%r82]; ld.u64 %r118,[%r118119; call _gfortrani_spread_300 7 bra $L53; $L65: .loc 1 305 7 ld.u64 %r123,[%r82]; ld.u64 %r122,[%r123; call _gfortrani_spread_i16307 7 bra $L53; $L61: .loc 1 311 7 ld.u64 %r127,[%r82]; ld.u64 %r126,[%r127; call _gfortrani_spread_r4313 7 bra $L53; $L64: .loc 1 316 7 ld.u64 %r131,[%r82]; ld.u64 %r130,[%r130131; call _gfortrani_spread_r8318 7 bra $L53; $L62: .loc 1 343 7 ld.u64 %r135,[%r82]; ld.u64 %r134,[%r134135; call _gfortrani_spread_c4345 7 bra $L53; $L66: .loc 1 348 7 ld.u64 %r139,[%r82]; ld.u64 %r138,[%r138139; call _gfortrani_spread_c87 bra $L53; $L59: .loc 1 376 3 
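spread_internal_scalar above handles the case where SOURCE is a scalar: after the rank and bounds checks (and an allocation through _gfortrani_xmallocarray when the result descriptor is empty) it writes NCOPIES copies of the single element at successive stride positions. That inner loop is sketched below; the byte-stride formulation is an assumption for illustration.

#include <string.h>

/* Replicate one element of `elem_len` bytes `ncopies` times, honoring the
   result's stride in bytes. */
static void spread_scalar_sketch (char *ret, const char *source,
                                  size_t elem_len, size_t stride_bytes,
                                  size_t ncopies)
{
  for (size_t n = 0; n < ncopies; n++)
    memcpy (ret + n * stride_bytes, source, elem_len);
}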
setp.eq.u64 %r140,%r26,4; @ %r140 bra $L68; setp.gt.u64 %r141,%r26,4; @ %r141 bra $L69; setp.eq.u64 %r142,%r26,1; @ %r142 bra $L58; setp.eq.u64 %r143,%r26,2; @ %r143 bra $L70; bra $L71; $L69: setp.eq.u64 %r144,%r26,8; @ %r144 bra $L72; setp.eq.u64 %r145,%r26,16; @ %r145 bra $L73; bra $L71; $L70: .loc 1 384 11 ld.u64 %r147,[%r79]; and.b64 %r146,%r147,1; .loc 1 384 10 setp.ne.u64 %r148,%r146,0; @ %r148 bra $L71; .loc 1 384 46 ld.u64 %r150,[%r80]; and.b64 %r149,%r150,1; .loc 1 384 43 setp.ne.u64 %r151,%r149,0; @ %r151 bra $L71; .loc 1 388 4 ld.u64 %r155,[%r82]; ld.u64 %r154,[%r154155; call _gfortrani_spread_390 4 bra $L53; $L68: .loc 1 394 11 ld.u64 %r157,[%r79]; and.b64 %r156,%r157,3; .loc 1 394 10 setp.ne.u64 %r158,%r156,0; @ %r158 bra $L71; .loc 1 394 46 ld.u64 %r160,[%r80]; and.b64 %r159,%r160,3; .loc 1 394 43 setp.ne.u64 %r161,%r159,0; @ %r161 bra $L71; .loc 1 398 4 ld.u64 %r165,[%r82]; ld.u64 %r164,[%r1645; call _gfortrani_spread_400 4 bra $L53; $L72: .loc 1 404 11 ld.u64 %r167,[%r79]; and.b64 %r166,%r167,7; .loc 1 404 10 setp.ne.u64 %r168,%r166,0; @ %r168 bra $L71; .loc 1 404 46 ld.u64 %r170,[%r80]; and.b64 %r169,%r170,7; .loc 1 404 43 setp.ne.u64 %r171,%r169,0; @ %r171 bra $L71; .loc 1 408 4 ld.u64 %r175,[%r82]; ld.u64 %r174,[%r174175; call _gfortrani_spread_410 4 bra $L53; $L73: .loc 1 414 11 ld.u64 %r177,[%r79]; and.b64 %r176,%r177,15; .loc 1 414 10 setp.ne.u64 %r178,%r176,0; @ %r178 bra $L71; .loc 1 415 7 ld.u64 %r180,[%r80]; and.b64 %r179,%r180,15; .loc 1 415 4 setp.ne.u64 %r181,%r179,0; @ %r181 bra $L71; .loc 1 419 4 ld.u64 %r185,[%r82]; ld.u64 %r184,[%r1call _gfortrani_spread_i16421 4 bra $L53; $L7179call spread,%out_arg4); } $L53:_gfortran_spread_char .visible .func _gfortran_spread_char.loc 1 443call spread,%out_arg4); }_gfortran_spread_char4 .visible .func _gfortran_spread_char4$LFB12:call spread,%out_arg4); }_gfortran_spread_scalar .visible .func _gfortran_spread_scalar064predpredpred %r165; .reg .u64 %r168; mov.u64 %r64,%ar0; mov.u64 %r65,%ar1; mov.u64 %r66,%ar2; mov.u64 %r67,%ar3; .loc 1 476 7 ld.u64 %r22,[%r64+16]; .loc 1 476 6 setp.ne.u64 %r68,%r22,0; @ %r68 bra $L110; .loc 1 477 5 cvta.const.u64 %r69,$LCstack; call _gfortran_runtime_error110: .loc 1 479 15 ld.s8 %r71,[%r64+29]; shl.b32 %r72,%r71,4; cvt.s64.s32 %r73,%r72; shl.b64 %r74,%r22,7; or.b64 %r28,%r73,%r74; .loc 1 484 7 ld.u64 %r46,[%r67]; .loc 1 480 3 setp.eq.u64 %r75,%r28,1040; @ %r75 bra $L111; setp.gt.u64 %r76,%r28,1040; @ %r76 bra $L112; setp.eq.u64 %r77,%r28,288; @ %r77 bra $L113; setp.gt.u64 %r78,%r28,288; @ %r78 bra $L114; setp.eq.u64 %r79,%r28,160; @ %r79 bra $L115; setp.eq.u64 %r80,%r28,272; @ %r80 bra $L113; setp.eq.u64 %r81,%r28,144; @ %r81 bra $L115; bra $L116; $L114: setp.eq.u64 %r82,%r28,544; @ %r82 bra $L117; setp.eq.u64 %r83,%r28,560; @ %r83 bra $L118; setp.eq.u64 %r84,%r28,528; @ %r84 bra $L117; bra $L116; $L112: setp.eq.u64 %r85,%r28,1088; @ %r85 bra $L119; setp.gt.u64 %r86,%r28,1088; @ %r86 bra $L120; setp.eq.u64 %r87,%r28,1056; @ %r87 bra $L111; setp.eq.u64 %r88,%r28,1072; @ %r88 bra $L121; bra $L116; $L120: setp.eq.u64 %r89,%r28,2080; @ %r89 bra $L122; setp.eq.u64 %r90,%r28,2112; @ %r90 bra $L123; setp.eq.u64 %r91,%r28,2064; @ %r91 bra $L122; bra $L116; $L115: .loc 1 484 7 ld.u64 %r94,[%r66call _gfortrani_spread_scalar_7 bra $L109; $L113: .loc 1 490 7 ld.u64 %r98,[%r66call _gfortrani_spread_scalar_492 7 bra $L109; $L117: .loc 1 496 7 ld.u64 %r102,[%r66102call _gfortrani_spread_scalar_498 7 bra $L109; $L111: .loc 1 502 7 ld.u64 %r106,[%r66106call _gfortrani_spread_scalar_504 7 bra $L109; 
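The dispatch pattern visible in the PTX above, a branch tree over the packed dtype value followed by a size-plus-alignment fallback, can be illustrated with a small C sketch. The helper names below (spread_dispatch, spread_u32, spread_bytes) are hypothetical stand-ins for illustration only, not the actual _gfortrani_spread_* entry points.

#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Generic byte-copy fallback: replicate one element n times. */
static void spread_bytes(void *dest, const void *src, size_t elem, size_t n)
{
    char *d = dest;
    for (size_t i = 0; i < n; i++)
        memcpy(d + i * elem, src, elem);
}

/* Hypothetical typed fast path for 4-byte elements. */
static void spread_u32(uint32_t *dest, const uint32_t *src, size_t n)
{
    for (size_t i = 0; i < n; i++)
        dest[i] = *src;
}

/* Sketch of the dispatch: use a typed helper when the element size matches
   and both pointers are suitably aligned, otherwise take the byte path,
   mirroring the alignment-mask checks (and with 1/3/7/15) in the PTX. */
void spread_dispatch(void *dest, const void *src, size_t elem, size_t n)
{
    if (elem == sizeof(uint32_t)
        && (((uintptr_t)dest | (uintptr_t)src) & 3) == 0)
        spread_u32(dest, src, n);
    else
        spread_bytes(dest, src, elem, n);
}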
[Continuation: _gfortran_spread_scalar with the same dtype dispatch to _gfortrani_spread_scalar_* helpers (i16, r4, r8, c4, c8 and integer kinds) and the size/alignment fallback; _gfortran_spread_char_scalar and _gfortran_spread_char4_scalar, each validating the element length (branching to _gfortran_runtime_error on a zero length) before calling the internal scalar spread routine. The unit then switches to ../libgfortran/intrinsics/string_intrinsics_inc.c, with global declarations for _gfortran_string_len_trim, _gfortrani_memcmp_char4, _gfortran_compare_string, _gfortran_concat_string, _gfortran_string_trim, _gfortran_string_index, _gfortran_adjustl, _gfortran_adjustr, _gfortran_string_scan, _gfortran_string_verify, _gfortran_string_minmax, and the corresponding _char4 variants.]
[Continuation of the string-intrinsics unit: the remaining _char4 declarations plus externs for memcmp, _gfortrani_xmallocarray and _gfortran_runtime_error; data definitions for zero_length_string, zero_length_string_char4, the literals "MAX" and "MIN", and the messages "First argument of '%s' intrinsic should be present" and "Second argument of '%s' intrinsic should be present". Function bodies follow for _gfortran_string_len_trim$part$0 (a trailing-blank scan that consumes aligned eight-byte words equal to 2314885530818453536, i.e. 0x2020202020202020, eight ASCII spaces, before finishing byte by byte), _gfortrani_memcmp_char4, _gfortran_compare_string (memcmp on the common prefix, then blank-padded comparison of the longer tail), _gfortran_concat_string, _gfortran_string_trim, _gfortran_string_len_trim, and the start of _gfortran_string_index.]
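The eight-spaces constant visible in the LEN_TRIM body above suggests a word-at-a-time trailing-blank scan. A minimal C sketch of that idea, simplified (the real routine first aligns the pointer, while this version uses memcpy so it is alignment-safe either way):

#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Walk back from the end of a blank-padded string, consuming eight blanks
   at a time where a 64-bit word equals 0x2020202020202020 (eight ASCII
   spaces), then finish byte by byte. Returns the trimmed length. */
size_t len_trim_sketch(const char *s, size_t len)
{
    const uint64_t eight_blanks = 0x2020202020202020ULL;
    size_t i = len;
    while (i >= 8) {
        uint64_t w;
        memcpy(&w, s + i - 8, 8);   /* alignment-safe load */
        if (w != eight_blanks)
            break;
        i -= 8;
    }
    while (i > 0 && s[i - 1] == ' ')
        i--;
    return i;
}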
[Continuation: bodies of _gfortran_string_index (forward or backward substring search depending on the BACK argument), _gfortran_adjustl and _gfortran_adjustr (shift the non-blank text and pad the other end with blanks), _gfortran_string_scan and _gfortran_string_verify (character-set scans, again honouring BACK), and _gfortran_string_minmax, which walks a varargs list of strings, reports a missing first or second argument through _gfortran_runtime_error, selects the extreme value via _gfortran_compare_string, and allocates the blank-padded result with _gfortrani_xmallocarray.]
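The MIN/MAX code above selects strings through _gfortran_compare_string, whose body appears earlier in this unit. Fortran compares character values as if the shorter operand were padded with blanks; a sketch of those semantics in C (names are illustrative, not the library's):

#include <stddef.h>
#include <string.h>

/* Compare two Fortran strings as if the shorter were blank padded:
   compare the common prefix, then let the tail of the longer string
   decide, where a trailing blank compares equal. */
int compare_padded(const char *a, size_t la, const char *b, size_t lb)
{
    size_t n = la < lb ? la : lb;
    int r = memcmp(a, b, n);
    if (r != 0 || la == lb)
        return r;
    const char *tail = la > lb ? a + n : b + n;
    size_t rest = la > lb ? la - n : lb - n;
    int sign = la > lb ? 1 : -1;   /* the longer string "wins" on non-blank */
    for (size_t i = 0; i < rest; i++)
        if ((unsigned char)tail[i] != ' ')
            return (unsigned char)tail[i] > ' ' ? sign : -sign;
    return 0;
}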
[Continuation: the UCS-4 (_char4) variants of the same intrinsics, operating on 32-bit characters: _gfortran_compare_string_char4, _gfortran_concat_string_char4, _gfortran_string_trim_char4, _gfortran_string_len_trim_char4, _gfortran_string_index_char4, _gfortran_adjustl_char4, _gfortran_adjustr_char4, _gfortran_string_scan_char4, _gfortran_string_verify_char4, and the start of _gfortran_string_minmax_char4.]
[Remainder of _gfortran_string_minmax_char4, then the unit built from ../libgfortran/intrinsics/rand.c: the rand_seed global, _gfortran_srand (stores its argument, substituting 123459876 when it is zero), and _gfortran_irand and _gfortran_rand, both stepping the generator seed = seed * 16807 mod 2147483647 and, in _gfortran_rand, scaling the result into a REAL(4) in [0,1). The unit from ../libgfortran/intrinsics/random.c (with time_1.h) then begins: declarations for init_rand_state$constprop$0, _gfortran_random_r4/_r8, _gfortran_arandom_r4/_r8 and _gfortran_random_seed_i4/_i8, externs for read, close, time and _gfortran_runtime_error; data for master_state, rand_state$1, the jump table JUMP$0, the xor_keys table {-4824380571461165239, -3143405005619208392, -4949652464462711484, 1245905266060733753}, the path "/dev/urandom", and the messages "RANDOM_SEED should have at most one argument present", "Array rank of GET is not 1", "Array size of GET is too small", "Array rank of PUT is not 1" and "Array size of PUT is too small"; then the start of init_rand_state$constprop$0.]
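The constants 16807, 2147483647 and the zero-seed fallback 123459876 visible above identify the legacy RAND/IRAND generator as the Park-Miller "minimal standard" LCG. A minimal C sketch of that arithmetic (function names are mine; the low-bit masking and scale factor follow the PTX, the sign handling of the seed argument is simplified):

#include <stdint.h>

static uint64_t rand_seed_state = 123459876;

void srand_sketch(int32_t s)
{
    rand_seed_state = (s != 0) ? (uint64_t)(uint32_t)s : 123459876u;
}

uint32_t irand_sketch(void)
{
    /* seed' = seed * 16807 mod (2^31 - 1) */
    rand_seed_state = (rand_seed_state * 16807u) % 2147483647u;
    return (uint32_t)rand_seed_state;
}

float rand_sketch(void)
{
    /* Scale to [0,1) as in the PTX: subtract one, clear the low nine bits,
       then multiply by 2^-31. */
    uint32_t r = irand_sketch();
    return (float)((r - 1u) & ~511u) * (1.0f / 2147483648.0f);
}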
[Body of init_rand_state$constprop$0: if master_state is not yet initialised, obtain an entropy word (read from /dev/urandom, falling back to time() XORed with the constant 324108011427370141), expand it into the four master_state words with a splitmix64-style mixer (constants -7046029254386353131, -4658895280553007687 and -7723592293110705685 with shifts 30, 27 and 31), then derive rand_state$1 from the master state by applying the JUMP$0 jump polynomial with the generator's xorshift/rotate state update. The body of _gfortran_random_r4 begins: it lazily calls init_rand_state$constprop$0, steps the four-word state, and converts the output word to a REAL(4).]
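The three multiply/xor-shift constants in the initialization loop above are those of the splitmix64 mixing function, commonly used to expand a single 64-bit seed into a larger generator state. A C sketch that reproduces the chaining visible in the PTX (the mixed value is fed back as the next input; function names are mine):

#include <stdint.h>

/* One splitmix64 mixing step; the constants appear above in signed decimal. */
static uint64_t mix(uint64_t x)
{
    x += 0x9E3779B97F4A7C15ULL;
    x = (x ^ (x >> 30)) * 0xBF58476D1CE4E5B9ULL;
    x = (x ^ (x >> 27)) * 0x94D049BB133111EBULL;
    return x ^ (x >> 31);
}

/* Expand one seed word into a four-word state, as the loop above does for
   the xoshiro-style state kept in master_state. */
void seed_state(uint64_t seed, uint64_t s[4])
{
    for (int i = 0; i < 4; i++) {
        seed = mix(seed);   /* output feeds back as the next input */
        s[i] = seed;
    }
}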
[Remainder of _gfortran_random_r4 and the body of _gfortran_random_r8: both step the four-word xoshiro-style state (shift by 17, word XORs, rotate by 45; output scrambler of multiply-by-5, rotate-left-7, multiply-by-9) and scale the output into [0,1), keeping 24 bits for REAL(4) (mask -256, multiply by 2^-32) and 53 bits for REAL(8) (mask -2048, multiply by 2^-64). The array version _gfortran_arandom_r4 begins, setting up per-dimension extents and strides from the array descriptor.]
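The shift/rotate pattern above (17, 45, multiply-by-5, rotate-left-7, multiply-by-9) matches a xoshiro256**-style generator, and the masks and multipliers show how the 64-bit output word becomes a REAL(4) or REAL(8) in [0,1). A hedged C sketch of both pieces (not the library's code, just the arithmetic it appears to perform):

#include <stdint.h>

static inline uint64_t rotl64(uint64_t x, int k)
{
    return (x << k) | (x >> (64 - k));
}

/* One xoshiro256**-style step over a four-word state. */
uint64_t xoshiro_next(uint64_t s[4])
{
    uint64_t result = rotl64(s[1] * 5u, 7) * 9u;
    uint64_t t = s[1] << 17;
    s[2] ^= s[0];
    s[3] ^= s[1];
    s[1] ^= s[2];
    s[0] ^= s[3];
    s[2] ^= t;
    s[3] = rotl64(s[3], 45);
    return result;
}

/* Keep the top 24 bits for REAL(4), the top 53 for REAL(8), scale by
   2^-32 and 2^-64 respectively, as the masks -256 and -2048 suggest. */
float  to_r4(uint64_t x) { return (float)((uint32_t)(x >> 32) & ~0xffu)
                                  * (1.0f / 4294967296.0f); }
double to_r8(uint64_t x) { return (double)(x & ~0x7ffULL)
                                  * (1.0 / 18446744073709551616.0); }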
[Continuation: the fill loops of _gfortran_arandom_r4 and _gfortran_arandom_r8, which walk the descriptor dimensions and store one random REAL(4)/REAL(8) per element using the same generator step, writing the updated state back to rand_state$1; then _gfortran_random_seed_i4, which errors through _gfortran_runtime_error if more than one of SIZE, PUT and GET is present, stores 8 for SIZE, checks that GET and PUT have rank 1 and at least 8 elements, initialises the state on demand, and copies the state to or from the user array through an XOR mask taken from xor_keys.]
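The GET/PUT paths above XOR the internal state against the xor_keys table before copying it to or from the user's seed array. A short C sketch of that masking, using the xor_keys values recovered from the data section earlier (the helper name is mine; XOR is its own inverse, so one loop serves both directions):

#include <stdint.h>

static const uint64_t xor_keys_vals[4] = {
    (uint64_t)-4824380571461165239LL, (uint64_t)-3143405005619208392LL,
    (uint64_t)-4949652464462711484LL,  1245905266060733753ULL
};

/* Scramble the generator state on GET, unscramble user input on PUT. */
void scramble_seed(uint64_t dest[4], const uint64_t src[4])
{
    for (int i = 0; i < 4; i++)
        dest[i] = src[i] ^ xor_keys_vals[i];
}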
[Continuation: the remainder of _gfortran_random_seed_i4 and the body of _gfortran_random_seed_i8, the 64-bit variant with a seed array of size 4, following the same SIZE/PUT/GET validation, xor_keys masking and reinitialisation through init_rand_state$constprop$0. The unit built from ../libgfortran/intrinsics/reshape_generic.c then begins: declarations for reshape_internal, _gfortran_reshape, _gfortran_reshape_char and _gfortran_reshape_char4, the global _gfortrani_compile_options, and the extern _gfortran_runtime_error.]
// [Truncated PTX: the declarations and most of the body of reshape_internal
// from ../../../../libgfortran/intrinsics/reshape_generic.c, together with
// the visible entry points _gfortran_reshape, _gfortran_reshape_char and
// _gfortran_reshape_char4 and the externs _gfortrani_compile_options,
// _gfortran_runtime_error, _gfortrani_xmallocarray and
// _gfortrani_reshape_packed.  The byte-encoded string constants decode to the
// usual RESHAPE diagnostics: "Incorrect size in SOURCE argument to RESHAPE
// intrinsic: is %ld, should be %ld", "... value %ld in ORDER argument to
// RESHAPE intrinsic", "Incorrect extent in return value of RESHAPE intrinsic
// in dimension %ld: is %ld, should be %ld", "rank of return array incorrect
// in RESHAPE intrinsic", "out of range in ORDER argument to RESHAPE
// intrinsic" and "shape and target do not conform".  The body apparently
// walks the SHAPE, ORDER and PAD descriptors, allocates the result with
// _gfortrani_xmallocarray when needed, performs the bounds checks gated on
// _gfortrani_compile_options, and either calls _gfortrani_reshape_packed for
// contiguous data or copies element by element; the instruction stream itself
// is not reproducible.]
// [Truncated PTX: the copy loops that finish reshape_internal, the thin
// wrappers _gfortran_reshape, _gfortran_reshape_char and
// _gfortran_reshape_char4 that forward to reshape_internal (the char4 variant
// scaling the length by 4), _gfortrani_reshape_packed from
// ../../../../libgfortran/intrinsics/reshape_packed.c (a small copy loop
// bounded by min.s64 of the remaining counts), and the start of
// _gfortran_selected_int_kind from
// ../libgfortran/intrinsics/selected_int_kind.f90 with its constant table
// int_infos$0[5] = {8589934593, 17179869186, 38654705668, 77309411336,
// 163208757264} and the first half of its lookup loop.]
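Judging from the decoded diagnostics and the extent/size comparisons in the
truncated body, reshape_internal validates its arguments before copying.  The
following C fragment is only a simplified sketch of those checks (the real
routine operates on gfc array descriptors and also verifies the extents of the
return value per dimension); check_reshape, its parameters and the exit code
are illustrative names, not part of libgfortran:

#include <stdio.h>
#include <stdlib.h>

static void check_reshape(long rank, const long *shape, long source_size,
                          long pad_size, const long *order)
{
    long result_size = 1;
    for (long d = 0; d < rank; d++)
        result_size *= shape[d] > 0 ? shape[d] : 0;

    /* Without PAD, SOURCE must supply every element of the result. */
    if (source_size < result_size && pad_size == 0) {
        fprintf(stderr,
                "Incorrect size in SOURCE argument to RESHAPE intrinsic: "
                "is %ld, should be %ld\n", source_size, result_size);
        exit(2);
    }

    /* ORDER, when present, must be a permutation of 1..rank. */
    char *seen = calloc((size_t) rank, 1);
    for (long d = 0; order && d < rank; d++) {
        long o = order[d];
        if (o < 1 || o > rank || seen[o - 1]) {
            fprintf(stderr, "bad ORDER value %ld in RESHAPE intrinsic\n", o);
            exit(2);
        }
        seen[o - 1] = 1;
    }
    free(seen);
}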
// [Truncated PTX: the remainder of the _gfortran_selected_int_kind lookup
// loop (five table entries, result -1 when none matches), followed by
// _gfortran_selected_real_kind2008 from
// ../libgfortran/intrinsics/selected_real_kind.f90.  The surviving constants
// (precision limits 6 and 15, range limits 37 and 307, radix 2, results 4 and
// 8, negative codes down to -5) identify the usual kind-selection logic; the
// instruction stream itself is damaged.]
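The int_infos$0 table above appears to pack one (kind, decimal range) pair per
64-bit entry: decoding the five constants gives {1,2}, {2,4}, {4,9}, {8,18}
and {16,38}, and the surviving loop compares the requested range against
successive entries.  A C sketch of that lookup, with illustrative names:

/* Decoded view of int_infos$0: (range << 32) | kind per entry. */
struct int_info { int kind; int range; };

static const struct int_info int_infos[] = {
    {1, 2}, {2, 4}, {4, 9}, {8, 18}, {16, 38},
};

/* SELECTED_INT_KIND(R): smallest integer kind whose decimal range covers R,
   or -1 when no kind does. */
int selected_int_kind(int r)
{
    for (unsigned i = 0; i < sizeof(int_infos) / sizeof(int_infos[0]); i++)
        if (r <= int_infos[i].range)
            return int_infos[i].kind;
    return -1;
}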
// [Truncated PTX: the tail of _gfortran_selected_real_kind2008, the
// _gfortran_selected_real_kind wrapper that forwards to it, and the
// declarations for the degree-argument trigonometric routines from
// ../libgfortran/intrinsics/trigd.inc: the sind/cosd/tand entry points in r4
// and r8 variants, the extern math calls fmodf, sinf, cosf, tanf and fmod,
// sin, cos, tan, and the constants tiny$0[1] = {103582791429521408} and
// tiny$1[1] = {226492416}.  The definition of _gfortran_sind_r4 begins here
// and continues in the next fragment.]
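The constants that survive in _gfortran_selected_real_kind2008 (precision
limits 6 and 15, decimal ranges 37 and 307, results 4 and 8, negative codes
down to -5) fit the usual SELECTED_REAL_KIND logic.  A much-simplified C
sketch that ignores the RADIX argument and the -4/-5 codes; the function name
and structure are illustrative:

/* Pick the smallest real kind providing at least p decimal digits and a
   decimal exponent range of at least r; negative results report which
   requirement could not be met. */
int selected_real_kind_sketch(int p, int r)
{
    int p_ok4 = (p <= 6),  p_ok8 = (p <= 15);
    int r_ok4 = (r <= 37), r_ok8 = (r <= 307);

    if (p_ok4 && r_ok4) return 4;
    if (p_ok8 && r_ok8) return 8;
    if (!p_ok8 && r_ok8) return -1;   /* precision unsatisfiable */
    if (p_ok8 && !r_ok8) return -2;   /* range unsatisfiable */
    return -3;                        /* neither requirement met */
}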
// [Truncated PTX: the bodies of _gfortran_sind_r4 and _gfortran_cosd_r4 and
// the start of _gfortran_tand_r4.  The recoverable structure: NaN/Inf inputs
// return x - x; very small arguments take a short path (sind converts them
// straight to radians, cosd returns 1 or 1 minus the tiny$1 constant); the
// magnitude is otherwise reduced with fmodf(|x|, 360.0) and exact multiples
// of 30 and 90 degrees get exact results (0, +-0.5, +-sqrt(3)/2 ~ 0.8660254,
// +-1); remaining angles are folded into a narrow interval, converted to
// radians with the split constant 0f3c8f0000 + 0fb6395dad, and passed to sinf
// or cosf, with the sign fixed afterwards.]
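Putting the recoverable pieces of _gfortran_sind_r4 together: the argument is
reduced modulo 360 degrees, exact multiples of 90 degrees (and, in the full
routine, 30 and 60 degrees) get exact results, and everything else is
converted to radians with a two-part constant and handed to sinf.  A C sketch
under those assumptions; the two constants reproduce the bit patterns
0x3C8F0000 and 0xB6395DAD seen above, while sind_sketch and the reduced set of
special cases are illustrative:

#include <float.h>
#include <math.h>

#define PIO180_HI  0x1.1ep-6f       /* 0x3C8F0000, slightly above pi/180 */
#define PIO180_LO -0x1.72bb5ap-19f  /* 0xB6395DAD, the correction term   */

float sind_sketch(float x)
{
    if (!(fabsf(x) <= FLT_MAX))          /* NaN or infinity */
        return x - x;

    float sign = copysignf(1.0f, x);
    float a = fmodf(fabsf(x), 360.0f);   /* reduce to [0, 360) degrees */

    if (a == 0.0f || a == 180.0f)        /* exact zeros */
        return sign * 0.0f;
    if (a == 90.0f)
        return sign;
    if (a == 270.0f)
        return -sign;

    /* General case: convert to radians with the split constant, defer to sinf. */
    float rad = fmaf(a, PIO180_HI, a * PIO180_LO);
    return sign * sinf(rad);
}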
// [Truncated PTX: the remainder of _gfortran_tand_r4 (exact results at
// multiples of 45 and 90 degrees, including +-Inf at odd multiples of 90,
// otherwise reduction and a call to tanf), followed by the double-precision
// variants _gfortran_sind_r8 and _gfortran_cosd_r8 and the start of
// _gfortran_tand_r8, which mirror the r4 logic using fmod/sin/cos/tan and the
// split constant 0d3f91df4600000000 + 0d3e444a53a722b83b.]
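For the r8 variants the same conversion uses a double-precision high/low split
of pi/180; the two literals in the fragment decode to the hex floats below.  A
small C helper showing just that step (deg2rad is an illustrative name):

#include <math.h>

static double deg2rad(double a)
{
    const double hi = 0x1.1df46p-6;           /* 0x3F91DF4600000000 */
    const double lo = 0x1.44a53a722b83bp-27;  /* 0x3E444A53A722B83B */
    return fma(a, hi, a * lo);                /* a * (hi + lo) ~ a * pi/180 */
}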
// [Truncated PTX: the tail of _gfortran_tand_r8, then the declarations from
// ../libgfortran/intrinsics/unpack_generic.c: unpack_bounds, unpack_internal,
// _gfortran_unpack1, _gfortran_unpack1_char, _gfortran_unpack1_char4,
// _gfortran_unpack0, _gfortran_unpack0_char, _gfortran_unpack0_char4, and the
// extern helpers _gfortran_size0, _gfortrani_count_0,
// _gfortran_runtime_error, _gfortrani_bounds_equal_extents,
// _gfortrani_xmallocarray and the typed _gfortrani_unpack1_* /
// _gfortrani_unpack0_* routines (i1, i2, i4, i8, i16, r4, r8, c4, c8).  The
// byte-encoded strings decode to "...pack_internal" (the __func__ string,
// leading bytes lost), "Incorrect size of return value in UNPACK intrinsic:
// should be at least %ld, is %ld", "UNPACK", "FIELD", "return valu...",
// "funny sized logical array", a diagnostic beginning "dim > ", and the
// source path ...tran/intrinsics/unpack_generic.c.  The definition of
// unpack_bounds begins here.]
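unpack_bounds, whose definition starts above and finishes in the next
fragment, appears to compare a _gfortran_size0 result against
_gfortrani_count_0(MASK) and raise the decoded diagnostic when the former is
smaller; the per-dimension extent checks are delegated to
_gfortrani_bounds_equal_extents.  A C sketch of just the size comparison
(function and parameter names are illustrative):

#include <stdio.h>
#include <stdlib.h>

static void unpack_bounds_sketch(long vector_size, long mask_true_count)
{
    if (vector_size < mask_true_count) {
        fprintf(stderr,
                "Incorrect size of return value in UNPACK intrinsic: "
                "should be at least %ld, is %ld\n",
                mask_true_count, vector_size);
        exit(2);
    }
}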
// [Truncated PTX: the rest of unpack_bounds (calls to _gfortran_size0 and
// _gfortrani_count_0, a runtime error when the counted true elements exceed
// the available size, and two _gfortrani_bounds_equal_extents checks against
// the MASK extents), followed by the definition of unpack_internal, which
// walks the MASK with per-dimension strides taken from the descriptors,
// allocates the result with _gfortrani_xmallocarray when required, and copies
// either the next VECTOR element or the corresponding FIELD element depending
// on the mask value, advancing the result, field and mask pointers in step.]
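The inner loop of unpack_internal, a test of the current mask byte followed by
one of two copies and three stride updates, is the standard UNPACK semantics.
A contiguous-memory C sketch of that semantics (the real routine walks array
descriptors with per-dimension strides; unpack1_sketch and the double element
type are illustrative):

#include <stddef.h>

/* Elements of VECTOR are scattered to the positions where MASK is true;
   everywhere else the corresponding FIELD element is copied. */
static void unpack1_sketch(double *ret, const double *vector,
                           const _Bool *mask, const double *field, size_t n)
{
    size_t v = 0;
    for (size_t i = 0; i < n; i++)
        ret[i] = mask[i] ? vector[v++] : field[i];
}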
%r336,[%r169+16]; mul.lo.u64 %r73,%r335,%r336; .loc 1 150 12 ld.u64 %r108,[%frame+480]; .loc 1 151 12 ld.u64 %r109,[%frame+360]; .loc 1 152 12 ld.u64 %r110,[%frame+240]; .loc 1 153 8 ld.u64 %r114,[%r168]; .loc 1 154 8 ld.u64 %r98,[%r171]; .loc 1 155 8 ld.u64 %r94,[%r169]; .loc 1 157 9 setp.ne.u64 %r337,%r114,0; @ %r337 bra $L28; bra $L11; $L18: .loc 1 143 6 setp.ne.u32 %r338,%r34,0; @ ! %r338 bra $L38; $L11: .loc 1 202_gfortran_unpack1 .visible .func _gfortran_unpack1predpred %r90; .reg .predpredpredpredpredpredpred %r189; mov.u64 %r68,%ar0; mov.u64 %r69,%ar1; mov.u64 %r70,%ar2; mov.u64 %r71,%ar3; .loc 1 215 7 cvta.global.u64 %r72,_gfortrani_compile_options; .loc 1 215 6 ld.u32 %r73,[%r72+36]; setp.eq.u32 %r74,%r73,0; @ %r74 bra $L40;call unpack_bounds$L40: .loc 1 218 15 ld.u64 %r27,[%r69+16]; ld.s8 %r79,[%r69+29]; shl.b32 %r80,%r79,4; cvt.s64.s32 %r81,%r80; shl.b64 %r82,%r27,7; or.b64 %r29,%r81,%r82; .loc 1 221 3 setp.eq.u64 %r83,%r29,1040; @ %r83 bra $L41; setp.gt.u64 %r84,%r29,1040; @ %r84 bra $L42; setp.eq.u64 %r85,%r29,288; @ %r85 bra $L43; setp.gt.u64 %r86,%r29,288; @ %r86 bra $L44; setp.eq.u64 %r87,%r29,160; @ %r87 bra $L45; setp.eq.u64 %r88,%r29,272; @ %r88 bra $L43; setp.eq.u64 %r89,%r29,144; @ %r89 bra $L45; bra $L46; $L44: setp.eq.u64 %r90,%r29,544; @ %r90 bra $L47; setp.eq.u64 %r91,%r29,560; @ %r91 bra $L48; setp.eq.u64 %r92,%r29,528; @ %r92 bra $L47; bra $L46; $L42: setp.eq.u64 %r93,%r29,1088; @ %r93 bra $L49; setp.gt.u64 %r94,%r29,1088; @ %r94 bra $L50; setp.eq.u64 %r95,%r29,1056; @ %r95 bra $L41; setp.eq.u64 %r96,%r29,1072; @ %r96 bra $L51; bra $L46; $L50: setp.eq.u64 %r97,%r29,2080; @ %r97 bra $L52; setp.eq.u64 %r98,%r29,2112; @ %r98 bra $L53; setp.eq.u64 %r99,%r29,2064; @ %r99 bra $L52; bra $L46; $L45:call _gfortrani_unpack1_227 7 bra $L39; $L4call _gfortrani_unpack1_233 7 bra $L39; $L47:call _gfortrani_unpack1_239 7 bra $L39; $L41:call _gfortrani_unpack1_245 7 bra $L39; $L52:call _gfortrani_unpack1_i167 bra $L39; $L48:call _gfortrani_unpack1_r47 bra $L39; $L51:call _gfortrani_unpack1_r8263 7 bra $L39; $L49:call _gfortrani_unpack1_c4290 7 bra $L39; $Lcall _gfortrani_unpack1_c8295 7 bra $L39; $L46: .loc 1 321 11 ld.u64 %r30,[%r68+16]; .loc 1 321 3 setp.eq.u64 %r136,%r30,4; @ %r136 bra $L55; setp.gt.u64 %r137,%r30,4; @ %r137 bra $L56; setp.eq.u64 %r138,%r30,1; @ %r138 bra $L45; setp.eq.u64 %r139,%r30,2; @ %r139 bra $L57; bra $L58; $L56: setp.eq.u64 %r140,%r30,8; @ %r140 bra $L59; setp.eq.u64 %r141,%r30,16; @ %r141 bra $L60; bra $L58; $L57: .loc 1 329 11 ld.u64 %r143,[%r68]; and.b64 %r142,%r143,1; .loc 1 329 10 setp.ne.u64 %r144,%r142,0; @ %r144 bra $L58; .loc 1 329 46 ld.u64 %r146,[%r69]; and.b64 %r145,%r146,1; .loc 1 329 43 setp.ne.u64 %r147,%r145,0; @ %r147 bra $L58; .loc 1 330 7 ld.u64 %r149,[%r71]; and.b64 %r148,%r149,1; .loc 1 330 4 setp.ne.u64 %r150,%r148,0; @ %r150 bra $L58call _gfortrani_unpack1_4 bra $L39; $L55: .loc 1 340 11 ld.u64 %r156,[%r68]; and.b64 %r155,%r156,3; .loc 1 340 10 setp.ne.u64 %r157,%r155,0; @ %r157 bra $L58; .loc 1 340 46 ld.u64 %r159,[%r69]; and.b64 %r158,%r159,3; .loc 1 340 43 setp.ne.u64 %r160,%r158,0; @ %r160 bra $L58; .loc 1 341 7 ld.u64 %r162,[%r71]; and.b64 %r161,%r162,3; .loc 1 341 4 setp.ne.u64 %r163,%r161,0; @ %r163 bra $L58call _gfortrani_unpack1_347 4 bra $L39; $L59: .loc 1 351 11 ld.u64 %r169,[%r68]; and.b64 %r168,%r169,7; .loc 1 351 10 setp.ne.u64 %r170,%r168,0; @ %r170 bra $L58; .loc 1 351 46 ld.u64 %r172,[%r69]; and.b64 %r171,%r172,7; .loc 1 351 43 setp.ne.u64 %r173,%r171,0; @ %r173 bra $L58; .loc 1 352 7 ld.u64 %r175,[%r71]; 
and.b64 %r174,%r175,7; .loc 1 352 4 setp.ne.u64 %r176,%r174,0; @ %r176 bra $L58call _gfortrani_unpack1_4 bra $L39; $L60: .loc 1 363 11 ld.u64 %r182,[%r68]; and.b64 %r181,%r182,15; .loc 1 363 10 setp.ne.u64 %r183,%r181,0; @ %r183 bra $L58; .loc 1 364 7 ld.u64 %r185,[%r69]; and.b64 %r184,%r185,15; .loc 1 364 4 setp.ne.u64 %r186,%r184,0; @ %r186 bra $L58; .loc 1 365 7 ld.u64 %r188,[%r71]; and.b64 %r187,%r188,15; .loc 1 365 4 setp.ne.u64 %r189,%r187,0; @ %r189 bra $L58call _gfortrani_unpack1_i16371 4 bra $L39; $L58call unp,%out_arg4,%out_arg5); } $L39:_gfortran_unpack1_char .visible .func _gfortran_unpack1_charmov.u64 %r27,%ar3; mov.u64 %r28,%ar4; mov.u32 %r29,%ar5; .loc 1 396 7 cvta.global.u64 %r31,_gfortrani_compile_options; .loc 1 396 6 ld.u32 %r32,[%r31+36]; setp.eq.u32 %r33,%r32,0; @ %r33 bra $L98; .loc 1 397 5 {call unpack_bounds$L98: .loc 1 399 3 cvt.s64.s32 %r43,%r29; {call unp,%out_arg4,%out_arg5); }_gfortran_unpack1_char4 .visible .func _gfortran_unpack1_char4mov.u64 %r29,%ar3; mov.u64 %r30,%ar4; mov.u32 %r31,%ar5; .loc 1 417 7 cvta.global.u64 %r33,_gfortrani_compile_options; .loc 1 417 6 ld.u32 %r34,[%r33+36]; setp.eq.u32 %r35,%r34,0; @ %r35 bra $L103; .loc 1 418 5call unpack_bounds$L103: .loc 1 421 20 cvt.s64.s32 %r45,%r31; shl.b64 %r46,%r45,2; .loc 1 420 3call unp,%out_arg4,%out_arg5); } .loc 1 42_gfortran_unpack0 .visible .func _gfortran_unpack0pred %r90; .reg .predpredu64 %r149; .reg .u64 %r150; .reg .predpredpredpredpred184; mov.u64 %r64,%ar0; mov.u64 %r65,%ar1; mov.u64 %r66,%ar2; mov.u64 %r67,%ar3; .loc 1 437 7 cvta.global.u64 %r68,_gfortrani_compile_options; .loc 1 437 6 ld.u32 %r69,[%r68+36]; setp.eq.u32 %r70,%r69,0; @ %r70 bra $L108; .loc 1 438 5 mov.u64 %r74,rcall unpack_bounds$L108: .loc 1 440 15 ld.u64 %r27,[%r65+16]; ld.s8 %r75,[%r65+29]; shl.b32 %r76,%r75,4; cvt.s64.s32 %r77,%r76; shl.b64 %r78,%r27,7; or.b64 %r29,%r77,%r78; .loc 1 442 3 setp.eq.u64 %r79,%r29,1040; @ %r79 bra $L109; setp.gt.u64 %r80,%r29,1040; @ %r80 bra $L110; setp.eq.u64 %r81,%r29,288; @ %r81 bra $L111; setp.gt.u64 %r82,%r29,288; @ %r82 bra $L112; setp.eq.u64 %r83,%r29,160; @ %r83 bra $L113; setp.eq.u64 %r84,%r29,272; @ %r84 bra $L111; setp.eq.u64 %r85,%r29,144; @ %r85 bra $L113; bra $L114; $L112: setp.eq.u64 %r86,%r29,544; @ %r86 bra $L115; setp.eq.u64 %r87,%r29,560; @ %r87 bra $L116; setp.eq.u64 %r88,%r29,528; @ %r88 bra $L115; bra $L114; $L110: setp.eq.u64 %r89,%r29,1088; @ %r89 bra $L117; setp.gt.u64 %r90,%r29,1088; @ %r90 bra $L118; setp.eq.u64 %r91,%r29,1056; @ %r91 bra $L109; setp.eq.u64 %r92,%r29,1072; @ %r92 bra $L119; bra $L114; $L118: setp.eq.u64 %r93,%r29,2080; @ %r93 bra $L120; setp.eq.u64 %r94,%r29,2112; @ %r94 bra $L121; setp.eq.u64 %r95,%r29,2064; @ %r95 bra $L120; bra $L114; call _gfortrani_unpack0_448 7 bra $L107; $L111:call _gfortrani_unpack0_7 bra $L107; $L115:call _gfortrani_unpack0_460 7 bra $L107; $L109:call _gfortrani_unpack0_7 bra $L107; $L120:call _gfortrani_unpack0_i16473 7 bra $L107; $L116:call _gfortrani_unpack0_r4479 7 bra $L107; $L119:call _gfortrani_unpack0_r8484 7 bra $L107; $L117:call _gfortrani_unpack0_c4511 7 bra $L107; $L121:call _gfortrani_unpack0_c87 bra $L107; $L114: .loc 1 542 11 ld.u64 %r30,[%r64+16]; .loc 1 542 3 setp.eq.u64 %r132,%r30,4; @ %r132 bra $L123; setp.gt.u64 %r133,%r30,4; @ %r133 bra $L124; setp.eq.u64 %r134,%r30,1; @ %r134 bra $L113; setp.eq.u64 %r135,%r30,2; @ %r135 bra $L125; bra $L126; $L124: setp.eq.u64 %r136,%r30,8; @ %r136 bra $L127; setp.eq.u64 %r137,%r30,16; @ %r137 bra $L128; bra $L126; $L125: .loc 1 550 11 ld.u64 %r139,[%r64]; and.b64 
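Nothing in the surrounding dump spells the dispatch out, so the following is a minimal C sketch of the pattern the PTX encodes: form a key from the descriptor's element length and type code, switch to a kind-specific helper, and fall back to a generic path. The descriptor layout, the concrete case values and the helper names (unpack1_i4, unpack1_generic, ...) are illustrative assumptions, not the real libgfortran definitions.

#include <stddef.h>

/* Hypothetical, simplified descriptor: only the fields the dispatch reads
   (the PTX loads the element length at offset 16 and a type byte at 29). */
typedef struct {
    void   *base_addr;
    size_t  elem_len;   /* element size in bytes */
    int     type;       /* small type code */
} desc_t;

/* Kind-specific fast paths; assumed to be defined elsewhere. */
void unpack1_i4(desc_t *ret, const desc_t *vec, const desc_t *mask, const desc_t *field);
void unpack1_r8(desc_t *ret, const desc_t *vec, const desc_t *mask, const desc_t *field);
void unpack1_generic(desc_t *ret, const desc_t *vec, const desc_t *mask, const desc_t *field);

void unpack1_dispatch(desc_t *ret, const desc_t *vec,
                      const desc_t *mask, const desc_t *field)
{
    /* The PTX forms the key as (elem_len << 7) | (type << 4). */
    size_t key = (vec->elem_len << 7) | ((size_t)vec->type << 4);

    switch (key) {
    case (4u << 7) | (1u << 4):          /* e.g. a 4-byte integer kind */
        unpack1_i4(ret, vec, mask, field);
        return;
    case (8u << 7) | (3u << 4):          /* e.g. an 8-byte real kind */
        unpack1_r8(ret, vec, mask, field);
        return;
    default:
        /* Alignment-checked generic element copy. */
        unpack1_generic(ret, vec, mask, field);
        return;
    }
}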
[Continuation: the aligned/unaligned fallback of _gfortran_unpack0 and the character entry points _gfortran_unpack0_char and _gfortran_unpack0_char4, followed by the in_pack_generic.c object: extern declarations for _gfortrani_internal_pack_1/_2/_4/_8/_16/_r4/_r8/_c4/_c8 and _gfortrani_xmallocarray, then the dtype dispatch and the contiguity-scan loop at the start of _gfortran_internal_pack.]
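For the generic path of _gfortran_internal_pack the dump only shows the loop skeleton, so here is a hedged C sketch of the idea under assumed descriptor fields: decide whether the array is already contiguous, and if not, allocate a packed buffer and copy element by element with per-dimension counters. The field names, the 16-dimension limit and the alloc callback are illustrative, not the real interface.

#include <stddef.h>
#include <string.h>

typedef struct { ptrdiff_t stride, lbound, ubound; } dim_t;   /* illustrative */

/* Returns either the original base pointer (already packed) or a freshly
   allocated contiguous copy (cf. the _gfortrani_xmallocarray call in the PTX). */
void *internal_pack_sketch(void *base, size_t elem_len, int rank,
                           const dim_t *dim, void *(*alloc)(size_t, size_t))
{
    ptrdiff_t expected = 1;     /* stride a packed array would have here */
    size_t count = 1;
    int packed = 1;

    for (int n = 0; n < rank; n++) {
        ptrdiff_t extent = dim[n].ubound + 1 - dim[n].lbound;
        if (extent <= 0)
            return base;                    /* empty array: nothing to copy */
        if (dim[n].stride != expected)
            packed = 0;
        expected *= extent;
        count *= (size_t)extent;
    }
    if (packed)
        return base;                        /* already contiguous */

    char *buf = alloc(count, elem_len);
    const char *src = base;
    char *dst = buf;
    size_t idx[16] = {0};                   /* per-dimension counters; rank <= 16 assumed */

    for (size_t i = 0; i < count; i++) {
        memcpy(dst, src, elem_len);
        dst += elem_len;
        /* Odometer step: advance the innermost counter, carrying outwards. */
        for (int n = 0; n < rank; n++) {
            idx[n]++;
            src += dim[n].stride * (ptrdiff_t)elem_len;
            if (idx[n] < (size_t)(dim[n].ubound + 1 - dim[n].lbound))
                break;
            src -= (ptrdiff_t)idx[n] * dim[n].stride * (ptrdiff_t)elem_len;
            idx[n] = 0;
        }
    }
    return buf;
}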
[Continuation of _gfortran_internal_pack: when the source is not contiguous it allocates a packed buffer through _gfortrani_xmallocarray and copies element by element with per-dimension counters. Then the in_unpack_generic.c object begins: extern declarations for _gfortrani_internal_unpack_1/_2/_4/_8/_16/_r4/_r8/_c4/_c8 and the opening of _gfortran_internal_unpack, which first checks whether the packed pointer is NULL or already the array's own storage and then dispatches on the same dtype key.]
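_gfortran_internal_unpack starts with a cheap short-circuit that is still visible in the dump: if the packed pointer is NULL or already equals the descriptor's own data, there is nothing to copy back. A minimal C sketch of that guard, with an assumed one-field descriptor standing in for the real layout:

#include <stddef.h>

/* Sketch only: the single field stands in for the real descriptor layout. */
typedef struct { void *base_addr; } unpack_desc_t;

void internal_unpack_sketch(unpack_desc_t *d, const void *packed)
{
    /* Guard visible in the PTX: nothing to do if the packed buffer is NULL
       or is the array's own storage (no copy was made by internal_pack). */
    if (packed == NULL || packed == d->base_addr)
        return;

    /* ...otherwise: dispatch on the dtype key to _gfortrani_internal_unpack_<kind>,
       or run the generic per-dimension copy in the reverse direction. */
}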
[Tail of the generic copy loop of _gfortran_internal_unpack, followed by the first of the specific intrinsic wrapper objects: _gfortran_specific__abs_c4 (calls cabsf), _abs_c8 (calls cabs), _abs_i4, _abs_i8 and the start of _abs_i16.]
[Specific intrinsic wrappers generated from libgfortran/generated/*.F90, one small PTX function per type/kind: _gfortran_specific__abs_{i16,r4,r8}, __aimag_{c4,c8}, __exp_{r4,r8,c4,c8}, __log_{r4,r8,c4,c8}, __log10_{r4,r8}, __sqrt_{r4,r8,c4,c8}, __asin/__asinh/__acos/__acosh/__atan/__atanh_{r4,r8}, __sin_{r4,r8,c4,c8}, __cos_{r4,r8,c4,c8}, __tan/__sinh/__cosh/__tanh_{r4,r8}, __conjg_{4,8}, __aint/__anint_{r4,r8}, __sign_{i4,i8,i16,r4,r8}, __dim_{i4,i8,i16,r4,r8}, __atan2_r4 and the start of __atan2_r8. Each loads its argument(s) by reference and either uses a single PTX instruction (abs, sqrt.rn, cvt.rzi, copysign, max) or forwards to the matching newlib/libm routine (expf/exp, logf/log, log10f, sinf, cosf, tanf, roundf, atan2f, cexpf, clogf, csqrtf, csinf, ccosf, ...). The REAL(10)/COMPLEX(10) archive members (_abs_r10.o, _sqrt_c10.o, _conjg_c10.o, ...) appear as empty entries.]
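The specific wrappers all follow one tiny pattern: take the argument by reference, apply the operation, return the value; real kinds map onto a libm call or a single instruction, integer kinds are done inline. A hedged C rendering of two representative wrappers, with illustrative names and signatures:

#include <math.h>

/* e.g. _gfortran_specific__exp_r4: load the REAL(4) argument by reference
   and forward to expf. */
float specific_exp_r4(const float *x)
{
    return expf(*x);
}

/* e.g. _gfortran_specific__sign_i4: SIGN(a, b) for 4-byte integers, done
   branch-free much as the PTX does with xor/shr/add (assumes arithmetic
   right shift, as on nvptx). */
int specific_sign_i4(const int *a, const int *b)
{
    int s = (*a ^ *b) >> 31;     /* all-ones iff a and b have opposite signs */
    return (*a + s) ^ s;         /* conditionally negate a */
}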
[Continuation: the rest of _gfortran_specific__atan2_r8 (calls atan2), the __mod family (__mod_{i4,i8} via rem, __mod_i16 via __modti3, __mod_{r4,r8} via fmodf/fmod), the misc_specifics.F90 object with _gfortran_specific__nint_{4,8,16}_{4,8} (via lroundf/lround), __char_1_{i4,i8,i16}, __len_1_{i4,i8,i16} and __index_1_{i4,i8,i16} (via _gfortran_string_index), then _gfortran_specific__dprod_r8 and the opening declarations of the f2c_specifics.o object.]
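The misc_specifics wrappers are equally thin; NINT for the various kind combinations forwards to lroundf/lround and then narrows or widens the result. A small C sketch under the same caveat that the names are illustrative:

#include <math.h>
#include <stdint.h>

/* e.g. _gfortran_specific__nint_4_4: NINT of a REAL(4), INTEGER(4) result. */
int32_t specific_nint_4_4(const float *x)
{
    return (int32_t)lroundf(*x);
}

/* e.g. _gfortran_specific__nint_8_8: NINT of a REAL(8), INTEGER(8) result.
   The PTX in the dump truncates lround's result to 32 bits before
   sign-extending, so only the low half is meaningful on this target. */
int64_t specific_nint_8_8(const double *x)
{
    return (int64_t)(int32_t)lround(*x);
}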
_gfortran_f2c_specific__aimag_c8_gfortran_f2c_specific__exp_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__exp_r4_gfortran_f2c_specific__exp_c4 .visible .func _gfortran_f2c_specific__exp_c); // BEGIN GLOBAL FUNCTION DECL: _gfortran_f2c_specific__exp_c8 .visible .func _gfortran_f2c_specific__exp_c8; // BEGIN GLOBAL FUNCTION DECL: _gfortran_f2c_specific__log_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__log_r4_gfortran_f2c_specific__log_c4 .visible .func _gfortran_f2c_specific__log_c); // BEGIN GLOBAL FUNCTION DECL: _gfortran_f2c_specific__log_c8 .visible .func _gfortran_f2c_specific__log_c8; // BEGIN GLOBAL FUNCTION DECL: _gfortran_f2c_specific__log10_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__log10_r4_gfortran_f2c_specific__sqrt_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__sqrt_r4_gfortran_f2c_specific__sqrt_c4 .visible .func _gfortran_f2c_specific__sqrt_c); // BEGIN GLOBAL FUNCTION DECL: _gfortran_f2c_specific__sqrt_c8 .visible .func _gfortran_f2c_specific__sqrt_c8; // BEGIN GLOBAL FUNCTION DECL: _gfortran_f2c_specific__asin_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__asin_r4_gfortran_f2c_specific__acos_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__acos_r4_gfortran_f2c_specific__atan_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__atan_r4_gfortran_f2c_specific__asinh_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__asinh_r4_gfortran_f2c_specific__acosh_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__acosh_r4_gfortran_f2c_specific__atanh_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__atanh_r4_gfortran_f2c_specific__sin_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__sin_r4_gfortran_f2c_specific__sin_c4 .visible .func _gfortran_f2c_specific__sin_c); // BEGIN GLOBAL FUNCTION DECL: _gfortran_f2c_specific__sin_c8 .visible .func _gfortran_f2c_specific__sin_c8; // BEGIN GLOBAL FUNCTION DECL: _gfortran_f2c_specific__cos_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__cos_r4_gfortran_f2c_specific__cos_c4 .visible .func _gfortran_f2c_specific__cos_c); // BEGIN GLOBAL FUNCTION DECL: _gfortran_f2c_specific__cos_c8 .visible .func _gfortran_f2c_specific__cos_c8; // BEGIN GLOBAL FUNCTION DECL: _gfortran_f2c_specific__tan_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__tan_r4_gfortran_f2c_specific__sinh_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__sinh_r4_gfortran_f2c_specific__cosh_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__cosh_r4_gfortran_f2c_specific__tanh_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__tanh_r4_gfortran_f2c_specific__aint_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__aint_r4_gfortran_f2c_specific__anint_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__anint_r4_gfortran_f2c_specific__sign_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__sign_r); // BEGIN GLOBAL FUNCTION DECL: _gfortran_f2c_specific__dim_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__dim_r); // BEGIN GLOBAL FUNCTION DECL: _gfortran_f2c_specific__atan2_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__atan2_r); // BEGIN GLOBAL FUNCTION DECL: _gfortran_f2c_specific__mod_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__); // BEGIN GLOBAL FUNCTION DECL: _gfortran_f2c_specific__conjg_4 
.visible .func _gfortran_f2c_specific__conjg_); // BEGIN GLOBAL FUNCTION DECL: _gfortran_f2c_specific__conjg_8 .visible .func _gfortran_f2c_specific__conjg_8; // BEGIN GLOBAL FUNCTION DECL: cabsf .extern .func (.param .f32 %value_out) cabsfCL: expf .extern .func (.param .f32 %value_out) expfcexpf .extern .func cexpcexp .extern .func cexp); // BEGIN GLOBAL FUNCTION DECL: logf .extern .func (.param .f32 %value_out) logfclogf .extern .func clogclog .extern .func clog10f .extern .func (.param .f32 %value_out) log10fcsqrtf .extern .func csqrcsqrt); // BEGIN GLOBAL FUNCTION DECL: asinf .extern .func (.param .f32 %value_out) aacosf .extern .func (.param .f32 %value_out) acosfatanf .extern .func (.param .f32 %value_out) atanfasinhf .extern .func (.param .f32 %value_out) asinhfacoshf .extern .func (.param .f32 %value_out) acoshfatanhf .extern .func (.param .f32 %value_out) atanhfsinf .extern .func (.param .f32 %value_out) csinf .extern .func csincsin .extern .func csccosf .extern .func ccosccos .extern .func ccotanf .extern .func (.param .f32 %value_out) tanfsinhf .extern .func (.param .f32 %value_out) sinhfcoshf .extern .func (.param .f32 %value_out) coshftanhf .extern .func (.param .f32 %value_out) tanhfroundf .extern .func (.param .f32 %value_out) roundfatan2f .extern .func (.param .f32 %value_out) atan2fCL: fmodf .extern .func (.param .f32 %value_out) fmodfortran_f2c_specific__abs_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__abs_r4mov.u64 %r25,%ar0; .loc 1 66 73 ld.f32 %r28,[%r25]; abs.f32 %r27,%r28; cvt.f64.f32 %value,%r27; .loc 1 66 86_gfortran_f2c_specific__abs_c4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__abs_c4mov.u64 %r26,%ar0; .loc 1 72 17 ld.f32 %r29,[%r26+4]; ld.f32 %r27,[%r26]; { .param .f32 %value_in; .param .f32 %out_arg1; st.param.f32 [%out_arg1],%r27(%value_in),cabscvt.f64.f32 %value,%r30; .loc 1 73 12_gfortran_f2c_specific__aimag_c4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__aimag_c4mov.u64 %r24,%ar0; .loc 1 80 48 ld.f32 %r26,[%r24+4]; cvt.f64.f32 %value,%r26; .loc 1 81 12_gfortran_f2c_specific__aimag_c8 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__aimag_c8mov.u64 %r23,%ar0; .loc 1 87 12 ld.f64 %value,[%r23+8]_gfortran_f2c_specific__exp_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__exp_r4mov.u64 %r25,%ar0; .loc 1 91 73expf,(%out_arg1); ld.param.f32 %r27,[%value_in]; } cvt.f64.f32 %value,%r27; .loc 1 91 86_gfortran_f2c_specific__exp_c4 .visible .func _gfortran_f2c_specific__exp_c493 81 ld.f32 %r31,[%r27+4]; ld.f32 %r29,[%r27];.param .f32 %out_arg3; st.param.f32 [%out_arg3],%r31; call cexpfst.f32 [%r26],%r24; st.f32 [%r26+4],%r25; .loc 1 93 96_gfortran_f2c_specific__exp_c8 .visible .func _gfortran_f2c_specific__exp_c895 95 ld.f64 %r31,[%r27+8]; ld.f64 %r29,[%r27];.param .f64 %out_arg3; st.param.f64 [%out_arg3],%r31; call cexpst.f64 [%r26],%r24; st.f64 [%r26+8],%r25;_gfortran_f2c_specific__log_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__log_r4mov.u64 %r25,%ar0; .loc 1 98 73logf,(%out_arg1); ld.param.f32 %r27,[%value_in]; } cvt.f64.f32 %value,%r27; .loc 1 98 86_gfortran_f2c_specific__log_c4 .visible .func _gfortran_f2c_specific__log_c4100 81 ld.f32 %r31,[%r27+4]; ld.f32 %r29,[%r27];.param .f32 %out_arg3; st.param.f32 [%out_arg3],%r31; call clogfst.f32 [%r26],%r24; st.f32 [%r26+4],%r25;_gfortran_f2c_specific__log_c8 .visible .func _gfortran_f2c_specific__log_c8[%r27+8]; ld.f64 %r29,[%r27];.param .f64 %out_arg3; st.param.f64 [%out_arg3],%r31; call clogst.f64 
[Continuation of f2c.o: definitions for __log10_r4 (log10f), __sqrt_r4 (inline sqrt.rn.f32), __sqrt_c4/__sqrt_c8 (csqrtf/csqrt), __asin_r4, __acos_r4, __atan_r4, __asinh_r4, __acosh_r4, __atanh_r4, __sin_r4/_c4/_c8, __cos_r4/_c4/_c8, __tan_r4, __sinh_r4, __cosh_r4, __tanh_r4, __aint_r4 (truncate via cvt.rzi.f32.f32), __anint_r4 (roundf), __sign_r4 (copysign.f32), __dim_r4 (positive difference clamped to zero), __atan2_r4, __mod_r4 (fmodf) and __conjg_4/__conjg_8 (negate the imaginary part), all following the same load, call, widen pattern. The next member is random_init.o: a declaration of _gfortran_random_init, externs for _gfortran_random_seed_i4, _gfortran_runtime_error_at and _gfortran_os_error_at, a global seed$0 array, a once$1 guard word, .const message strings (location strings naming .../intrinsics/random_init.f90, "Attempting to allocate already allocated variable '%s'", and the literal "seed"), and the start of the _gfortran_random_init body.]
[Rest of _gfortran_random_init: it builds an array descriptor around seed$0, reporting allocation problems through _gfortran_runtime_error_at or _gfortran_os_error_at, offsets each stored seed word by its third argument when the second argument is set, and then reseeds through _gfortran_random_seed_i4, either replaying the saved seed or refreshing it depending on its first argument. The following member is libbacktrace's alloc.o: backtrace_alloc and backtrace_free (plain allocation helpers that report failures through the error callback with __errno, using the .const strings "malloc" and "realloc"), backtrace_vector_grow, backtrace_vector_finish and backtrace_vector_release. The surviving backtrace_vector_grow body shows its growth policy: when the remaining space is smaller than the request it picks 32 times the request for an empty vector, doubles the current contents while they are under 4096 bytes, adds 4096 above that, never goes below current contents plus request, and then reallocates.]
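That growth policy can be written out directly in C. A sketch under the assumption of the plain malloc/realloc configuration; the field names follow libbacktrace's vector helpers, but the error reporting is reduced to comments:

#include <stdlib.h>

struct vector { void *base; size_t size; size_t alc; };

/* Make room for SIZE more bytes in VEC, mirroring the policy in the
   PTX above: 32*size for an empty vector, doubling below 4096 bytes of
   contents, +4096 above that, and never less than contents + size.  */
static void *
vector_grow (struct vector *vec, size_t size)
{
  if (size > vec->alc)
    {
      size_t alc;

      if (vec->size == 0)
        alc = 32 * size;
      else if (vec->size >= 4096)
        alc = vec->size + 4096;
      else
        alc = 2 * vec->size;
      if (alc < vec->size + size)
        alc = vec->size + size;

      void *base = realloc (vec->base, alc);
      if (base == NULL)
        return NULL;            /* the real code reports errno here */
      vec->base = base;
      vec->alc = alc - vec->size;
    }

  void *ret = (char *) vec->base + vec->size;
  vec->size += size;
  vec->alc -= size;
  return ret;
}

The doubling-then-linear schedule keeps reallocation cost amortized while capping the per-step over-allocation at 4 KiB once a vector is large.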
[Rest of alloc.o: backtrace_vector_finish hands the accumulated block to the caller and resets the vector, and backtrace_vector_release shrinks the allocation to the bytes actually used, both reporting "realloc" failures through the error callback. An atomic.o member follows, then backtrace.o: a static unwind callback and backtrace_full, with externs for _Unwind_GetIPInfo, _Unwind_Backtrace, backtrace_pcinfo, backtrace_alloc and backtrace_free. The unwind callback fetches the frame PC with _Unwind_GetIPInfo, skips the requested number of leading frames, steps the PC back into the call instruction when ip_before_insn is false, and passes the PC either straight to the user callback or through backtrace_pcinfo; backtrace_full packs its arguments into a local state block (bumping skip by one for itself and probing backtrace_alloc/backtrace_free to decide whether allocation is usable) and walks the stack by calling _Unwind_Backtrace with that callback. The dwarf.o member then begins with declarations for units_search, function_addrs_search, unit_addrs_search, line_search and abbrev_compare.]
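The shape of backtrace_full and its callback is still recognizable in the PTX. A compressed C sketch, assuming the libgcc unwinder interface (_Unwind_Backtrace and _Unwind_GetIPInfo from <unwind.h>); the state struct and callback type are simplified stand-ins, not the packaged definitions:

#include <unwind.h>
#include <stdint.h>

struct walk_state { int skip; int ret; void *data;
                    int (*cb) (void *data, uintptr_t pc); };

/* Invoked once per frame by _Unwind_Backtrace.  */
static _Unwind_Reason_Code
unwind_cb (struct _Unwind_Context *ctx, void *vstate)
{
  struct walk_state *s = vstate;
  int ip_before_insn = 0;
  uintptr_t pc = _Unwind_GetIPInfo (ctx, &ip_before_insn);

  if (s->skip > 0) { --s->skip; return _URC_NO_REASON; }
  if (!ip_before_insn)
    --pc;                       /* point back inside the call instruction */

  s->ret = s->cb (s->data, pc); /* the real code may go via backtrace_pcinfo */
  return s->ret != 0 ? _URC_END_OF_STACK : _URC_NO_REASON;
}

/* Walk the current stack, skipping SKIP caller frames plus this one.  */
int
backtrace_walk (int skip, int (*cb) (void *, uintptr_t), void *data)
{
  struct walk_state s = { skip + 1, 0, data, cb };
  _Unwind_Backtrace (unwind_cb, &s);
  return s.ret;
}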
[Remaining dwarf.o declarations: report_inlined_functions, lookup_abbrev, add_function_range, add_unit_addr, free_abbrevs, unit_addrs_compare, line_compare, function_addrs_compare, add_line$isra$0, update_pcrange$isra$0, read_uleb128, read_sleb128, advance, read_byte, read_uint16, read_uint24, read_uint32, read_uint64, read_address, resolve_addr_index$isra$0, add_ranges_from_rnglists$isra$0, resolve_string, read_attribute, read_line_header_format_entries, read_referenced_name, read_referenced_name_from_attr, find_address_ranges, read_function_entry, dwarf_lookup_pc, dwarf_fileline and backtrace_dwarf_add, with externs for bsearch, strcmp, snprintf, backtrace_alloc, backtrace_free, backtrace_vector_grow, backtrace_vector_release, backtrace_qsort and backtrace_vector_finish, and a .file reference to ../../../../libbacktrace/internal.h. These are followed by the .const string table, stored as decimal byte arrays: DWARF diagnostic messages ("invalid abbreviation code", "DWARF underflow", "LEB128 overflows uint64_t", "signed LEB128 overflows uint64", "unrecognized address size", "DW_FORM_addrx value out of range", "rnglists offset/index out of range", "unrecognized DW_RLE value", "ranges offset out of range", "DW_FORM_strx value/offset out of range", "DW_FORM_strp / DW_FORM_line_strp / DW_FORM_strp_sup out of range", "DW_FORM_indirect to DW_FORM_implicit_const", "invalid directory index / file number in line number program", "missing file name in line number program header", "unrecognized DWARF version", "unrecognized DWARF form", "abbrev offset out of range", "non-zero segment_selector", "abstract origin or specification out of range", "invalid file number in DW_AT_call_file attribute", "unit line offset out of range") and section names (.debug_addr, .debug_rnglists, .debug_ranges, .debug_str_offsets, .debug_abbrev, .debug_line, .debug_info). The function definitions then follow, beginning with units_search.]
[dwarf.o definitions, first group: units_search, function_addrs_search, unit_addrs_search and line_search are the bsearch comparison callbacks that place a target PC relative to an entry's [low, high) address range (return -1 below the range, 1 at or above its end, 0 inside); abbrev_compare orders abbreviation codes the same way. report_inlined_functions follows: it bsearches the function's function_addrs for the PC, narrows the hit to the innermost covering range, recurses to report the caller chain, invokes the user callback with the current file and line, and then switches the reported filename and line number to the inlined entry's DW_AT_call_file / DW_AT_call_line values.]
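The search callbacks above all share one shape: bsearch hands them a key PC and an array element covering an address range, and they report which side of the range the PC falls on. A small C sketch of that comparator, with a made-up entry type standing in for the unit, function and line records:

#include <stdlib.h>
#include <stdint.h>

/* Illustrative record: any entry that covers [low, high).  */
struct range_entry { uintptr_t low; uintptr_t high; };

/* bsearch comparator in the style of units_search / unit_addrs_search:
   KEY points at the PC being looked up, VENTRY at one array element.  */
static int
range_search (const void *key, const void *ventry)
{
  uintptr_t pc = *(const uintptr_t *) key;
  const struct range_entry *entry = ventry;

  if (pc < entry->low)
    return -1;          /* PC sorts before this range */
  else if (pc >= entry->high)
    return 1;           /* PC sorts after this range */
  else
    return 0;           /* PC lies inside the range */
}

/* Usage: bsearch (&pc, entries, n, sizeof *entries, range_search); */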
[Second group of dwarf.o definitions: lookup_abbrev (find an abbreviation code, first by direct index and otherwise via bsearch with abbrev_compare, reporting "invalid abbreviation code" on failure); add_function_range and add_unit_addr (append a [low, high) range to a growing vector, merging with the previous entry when the ranges touch and the associated function or unit is the same, otherwise growing the vector by one 24-byte record); free_abbrevs; and the sort comparators unit_addrs_compare, line_compare and function_addrs_compare, which order entries by low address, then high address, then a tie-breaker (the owning unit's line-table offset, the line entry's insertion index, or the function name via strcmp, respectively). The definition of add_line$isra$0 begins here: it skips exact duplicates of the previous entry and otherwise appends a (pc, filename, lineno, index) record to the line vector.]
[Third group: the rest of add_line$isra$0; update_pcrange$isra$0, which records DW_AT_low_pc (0x11), DW_AT_high_pc (0x12) and DW_AT_ranges (0x55) attribute values into a pcrange structure, distinguishing address, constant and index-form encodings; and the primitive readers read_uleb128 and read_sleb128. Both LEB128 readers loop over input bytes, accumulate the low seven bits of each byte shifted into place, warn once through the buffer's error path ("... overflows uint64_t") when more than 64 bits arrive, and stop at the first byte with the high bit clear; read_sleb128 additionally sign-extends when the final byte's 0x40 bit is set. The definition of advance begins here: it checks the remaining byte count, reporting "DWARF underflow" once on failure, and otherwise moves the read pointer forward.]
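The ULEB128 loop in that PTX is the standard DWARF varint decoder. A self-contained C sketch of the unsigned case, with a simplified buffer type in place of the dwarf_buf the real reader threads through (the error handling is reduced to comments):

#include <stdint.h>
#include <stddef.h>

struct buf { const unsigned char *p; size_t left; };

/* Decode one unsigned LEB128 value: each byte contributes its low
   seven bits, least-significant group first, until a byte with the
   high bit clear terminates the value.  */
static uint64_t
read_uleb128 (struct buf *b)
{
  uint64_t ret = 0;
  unsigned shift = 0;
  unsigned char byte;

  do
    {
      if (b->left == 0)
        return 0;               /* real code reports "DWARF underflow" */
      byte = *b->p++;
      b->left--;
      if (shift < 64)
        ret |= (uint64_t) (byte & 0x7f) << shift;
      /* else: real code warns once that the value overflows uint64_t */
      shift += 7;
    }
  while (byte & 0x80);

  return ret;
}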
[Fourth group: the rest of advance, plus the fixed-width readers read_byte, read_uint16, read_uint24, read_uint32 and the start of read_uint64. Each checks that enough bytes remain (otherwise reporting "DWARF underflow" once and returning 0), advances the buffer, and assembles the value from individual bytes in either big- or little-endian order depending on a stored endianness flag.]
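Those readers amount to byte-wise, endian-selected loads, which keeps them independent of the byte order of the machine running the reader. A sketch of the 32-bit case, under a buffer type like the one in the previous sketch extended with the producer's endianness flag:

#include <stdint.h>
#include <stddef.h>

struct dbuf { const unsigned char *p; size_t left; int is_bigendian; };

/* Read a 4-byte unsigned value, assembling it byte by byte so the
   result matches the DWARF producer's byte order on any host.  */
static uint32_t
read_uint32 (struct dbuf *b)
{
  const unsigned char *p;

  if (b->left < 4)
    return 0;                   /* real code reports "DWARF underflow" */
  p = b->p;
  b->p += 4;
  b->left -= 4;

  if (b->is_bigendian)
    return ((uint32_t) p[0] << 24) | ((uint32_t) p[1] << 16)
         | ((uint32_t) p[2] << 8)  | (uint32_t) p[3];
  else
    return ((uint32_t) p[3] << 24) | ((uint32_t) p[2] << 16)
         | ((uint32_t) p[1] << 8)  | (uint32_t) p[0];
}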
ld.param.u32 %r79,[%value_in]; } .loc 1 419 3 ld.u64 %r84,[%r64+48]; ld.u64 %r85,[%r64+40]; { $LCT1170; call %r85,(%out_arg1,%out_arg2,%out_arg3),$LCT11; } .loc 1 434 31 mov.u32 %r86,1; st.u32 [%r64+56],%r86; .loc 1 553 12 mov.u64 %r63,0; bra $L145; $L147: .loc 1 555 14 ld.u8 %r87,[%r47]; .loc 1 555 30 shl.b64 %r88,%r87,56; .loc 1 555 40 ld.u8 %r89,[%r47+1]; .loc 1 555 56 shl.b64 %r90,%r89,48; .loc 1 555 37 or.b64 %r91,%r88,%r90; .loc 1 558 33 ld.u8 %r92,[%r47+7]; .loc 1 558 31 or.b64 %r93,%r91,%r92; .loc 1 556 9 ld.u8 %r94,[%r47+2]; .loc 1 556 25 shl.b64 %r95,%r94,40; .loc 1 558 31 or.b64 %r96,%r93,%r95; .loc 1 556 35 ld.u8 %r97,[%r47+3]; .loc 1 556 51 shl.b64 %r98,%r97,32; .loc 1 558 31 or.b64 %r99,%r96,%r98; .loc 1 557 9 ld.u8 %r100,[%r47+4]; .loc 1 557 25 shl.b64 %r101,%r100,24; .loc 1 558 31 or.b64 %r102,%r99,%r101; .loc 1 557 35 ld.u8 %r103,[%r47+5]; .loc 1 557 51 shl.b64 %r104,%r103,16; .loc 1 558 31 or.b64 %r105,%r102,%r104; .loc 1 558 9 ld.u8 %r106,[%r47+6]; .loc 1 558 25 shl.b64 %r107,%r106,8; .loc 1 558 31 or.b64 %r63,%r105,%r107; bra $L145; $L151: .loc 1 563 31 ld.u8 %r108,[%r47]; ld.u8 %r109,[%r47+1]; shl.b64 %r110,%r109,8; or.b64 %r111,%r110,%r108; ld.u8 %r112,[%r47+2]; shl.b64 %r113,%r112,16; or.b64 %r114,%r113,%r111; ld.u8 %r115,[%r47+3]; shl.b64 %r116,%r115,24; or.b64 %r117,%r116,%r114; ld.u8 %r118,[%r47+4]; shl.b64 %r119,%r118,32; or.b64 %r120,%r119,%r117; ld.u8 %r121,[%r47+5]; shl.b64 %r122,%r121,40; or.b64 %r123,%r122,%r120; ld.u8 %r124,[%r47+6]; shl.b64 %r125,%r124,48; or.b64 %r126,%r125,%r123; ld.u8 %r127,[%r47+7]; shl.b64 %r128,%r127,56; or.b64 %r63,%r128,%r126; bra $L145; $L150: .loc 1 553 12 mov.u64 %r63,0; $L145: .loc 1 5FUNCTION DEF: read_address .func (.param .u64 %value_out) read_addressmov.u32 %r34,%ar1; .loc 1 584 3 setp.eq.u32 %r35,%r34,4; @ %r35 bra $L153; setp.gt.s32 %r36,%r34,4; @ %r36 bra $L154; setp.eq.u32 %r37,%r34,1; @ %r37 bra $L155; setp.eq.u32 %r38,%r34,2; @ %r38 bra $L156; bra $L157; $L154: setp.eq.u32 %r39,%r34,8; @ %r39 bra $L158call (%value_in),read_byte,(%out_arg1); ld.param.u32 %r41,[%value_in]; } cvt.u32.u32 %r43,%r41; cvt.u64.u8call (%value_in),read_uint16,(cvt.u16.u32 %r47,%r45; cvt.u64call (%value_in),read_uint32,(%out_arg1); ld.param.u32 %r49,[%value_in]; } cvt.u64.u32 %r32,%r49; bra $L152; $L158: .loc 1 59333; call (%value_in),read_uint64,(%out_arg1); ld.param.u64 %r51,[%value_in]; } mov.u64 %r32,%r51; bra $L152; $L157: .loc 1 418 38 ld.u64 %r56,[%r33+16]; ld.u64 %r57,[%r33+8]; sub.u64 %r55,%r56,%r57; .loc 1 417 3 st.u32 [%stack+16],%r55; ld.u64 %r58,[%r33]; st.u64 [%stack+8],%r58; cvta.const.u64 %r59,$LC5; st.u64 [%stack],%r59; cvta.const.u64 %r54,$LC1; mov.u64 %r53,20framestack; call (%value_in),sn,%out_arg4); ld.param.u32 %r60,[%value_in]; } .loc 1 419 3 ld.u64 %r65,[%r33+48]; ld.u64 %r66,[%r33+40]; mov.u32 %r64,0; { $LCT1264; call %r66,(%out_arg1,%out_arg2,%out_arg3),$LCT12; } .loc 1 596 14 mov.u64 %r32,0; $L152: .loc 1 598FUNCTION DEF: resolve_addr_index$isra$0 .func (.param .u32 %value_out) resolve_addr_index$isra$0mov.u64 %r38,%ar8; .loc 1 1099 23 cvt.s64.s32 %r22,%r33; .loc 1 1099 10 mad.lo.u64 %r24,%r22,%r35,%r32; .loc 1 1100 14 add.u64 %r40,%r22,%r24; .loc 1 1100 6.loc 1 1102 7 mov.u32 %r44,0; cvta.const.u64 %r43,$LC6; { $LCT1313; } .loc 1 1103 14 mov.u32 %r29,%r44; bra $L160; $L161: .loc 1 1106 17 cvta.const.u64 %r45,$LC7; st.u64 [%frame],%r45; .loc 1 1107 18 st.u64 [%frame+8],%r30; .loc 1 1108 51 add.u64 %r46,%r30,%r24; .loc 1 1108 16 st.u64 [%frame+16],%r46; .loc 1 1109 52 sub.u64 %r47,%r31,%r24; .loc 1 1109 17 st.u64 
[%frame+24],%r47; .loc 1 1110 25 st.u32 [%frame+32],%r34; .loc 1 1111 27 st.u64 [%frame+40],%r36; .loc 1 1112 17 st.u64 [%frame+48],%r37; .loc 1 1113 31 mov.u32 %r48,0; st.u32 [%frame+56]frame33; call (%value_in),read_addres1115 12 st.u64 [%r38],%r51; .loc 1 1116 10 mov.u32 %r29,1; $L160: .loc 1 1117FUNCTION DEF: add_ranges_from_rnglists$isra$0 .func (.param .u32 %value_out) add_ranges_from_rnglists$isra$0, .param .u64 %in_ar11, .param .u64 %in_ar1ar11; ld.param.u64 %ar11,[%in_ar11]; .reg .u64 %ar12; ld.param.u64 %ar12,[%in_ar1288u64predpredpredpred %r201; .reg .u64 %r204; .reg .u32u64 %r235; .reg .u32pred %r282; .reg .u32 %r285; .reg .u64 %r287; .reg .u32 %r290; .reg .u64 %r292; .reg .u32 %r295; .reg .u64 %r297; .reg .u64 %r305; .reg .u64 %r306; .reg .u32 %r307; .reg .predu64 %r333; .reg .u64 %r334; .reg .u32 %r335; .reg .u32u32 %r343; .reg .u32 %r344; .reg .u64predmov.u64 %r109,%ar0; mov.u64 %r110,%ar1; mov.u64 %r111,%ar2; mov.u32 %r112,%ar3; mov.u64 %r113,%ar4; mov.u64 %r114,%ar5; st.u64 [%frame+272],%r114; mov.u64 %r115,%ar6; mov.u32 %r116,%ar7; mov.u64 %r117,%ar8; mov.u64 %r118,%ar9; mov.u64 %r119,%ar10; mov.u64 %r120,%ar11; mov.u64 %r121,%ar12; .loc 1 1740 37 ld.u64 %r84,[%r110+136]; .loc 1 1736 6 setp.eq.u32 %r122,%r116,0; @ %r122 bra $L164; .loc 1 1739 15 ld.u64 %r23,[%r113+80]; .loc 1 1739 53 ld.u32 %r24,[%r113+44]; .loc 1 1739 70 setp.eq.u32 %r123,%r24,0; selp.u64 %r25,4,8,%r123; .loc 1 1739 12 mad.lo.u64 %r27,%r25,%r115,%r23; .loc 1 1740 6 setp.lt.u64 %r125,%r27,%r84; @ %r125 bra $L166; $L187: .loc 1 1742 7 mov.u32 %r128,0; cvta.const.u64 %r127,$LC8; { $LCT14121128; call %r119,$LCT14; } .loc 1 1743 14 mov.u32 %r108,%r128; bra $L163; $L189: .loc 1 573 12 add.u64 %r357,%frame,357; call (%value_in),read_uint64,(%out_arg1); ld.param.u64 %r131,[%value_in]; } mov.u64 %r93,%r131; bra $L168; $L217: .loc 1 575 12 add.u64 %r357,%frame,207; call (%value_in),read_uint32,(%out_arg1); ld.param.u32 %r134,[%value_in]; } cvt.u64.u32 %r93,%r134; $L168: .loc 1 1758 14 ld.u64 %r135,[%r113+80]; add.u64 %r29,%r93,%r135; .loc 1 1759 41 ld.u64 %r30,[%r110+136]; .loc 1 1759 10 setp.lt.u64 %r136,%r29,%r30; @ %r136 bra $L169; .loc 1 1761 4 mov.u32 %r139,0; cvta.const.u64 %r138,$LC9; { $LCT15120%r119,(%out_arg1,%out_arg2,%out_arg3),$LCT15; } .loc 1 1762 11 mov.u32 %r108,%r139; bra $L163; $L169: .loc 1 1764 63 ld.u64 %r141,[%r110+64]; add.u64 %r140,%r141,%r29; .loc 1 1764 24 st.u64 [%frame+216],%r140; .loc 1 1765 64 sub.u64 %r142,%r30,%r29; .loc 1 1765 25 st.u64 [%frame+224],%r142; $L186: .loc 1 1796 11 add.u64 %r358,%frame,264; .loc 1 1782 11 add.u64 %r359,%frame,272; $L170: .loc 1 17357; call (%value_in),read_byte,(%out_arg1); ld.param.u32 %r145,[%value_in]; } .loc 1 1773 10 setp.eq.u32 %r147,%r145,0; @ %r147 bra $L171; .loc 1 1775 7 cvt.u16.u32 %r148,%r145; setp.eq.u16 %r149,%r148,4; @ %r149 bra $L172; setp.gt.u16 %r151,%r148,4; @ %r151 bra $L173; setp.eq.u16 %r153,%r148,2; @ %r153 bra $L174; setp.eq.u16 %r155,%r148,3; @ %r155 bra $L175; setp.eq.u16 %r157,%r148,1; @ %r157 bra $L176; bra $L177; $L173: setp.eq.u16 %r159,%r148,6; @ %r159 bra $L178; setp.eq.u16 %r161,%r148,7; @ %r161 bra $L179; setp.eq.u16 %r163,%r148,5; @ %r163 bra $L180; bra $L177; $L17read_uleb128,(%out_arg1); ld.param.u64 %r166,[%value_in]; } .loc 1 1782 11 ld.u32 %r177,[%r113+48]; ld.u64 %r178,[%r113+72]; ld.u64 %r179,[%r110+112]; ld.u64 %r180,[%r110+4018179178119; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r120; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r359; call (%value_in),resolve_addr_index$isra$0,%out_arg8,%out_arg9); 
ld.param.u32 %r181,[%value_in]; } .loc 1 1782 9 setp.ne.u32 %r182,%r181,0; @ %r182 bra $L170; $L185: .loc 1 1785 15 mov.u32 %r108,0; bra $L163; $L174:read_uleb128.loc 1 1796 11 ld.u32 %r196,[%r113+48]; ld.u64 %r197,[%r113+72]; ld.u64 %r198,[%r110+112]; ld.u64 %r199,[%r110+4097119; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r120; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r358; call (%value_in),resolve_addr_index$isra$0,%out_arg8,%out_arg9); ld.param.u32 %r200,[%value_in]; } .loc 1 1796 9 setp.eq.u32 %r201,%r200,0; @ %r201 bra $L182read_uleb128,(%out_arg1); ld.param.u64 %r204,[%value_in]; } .loc 1 1801 11 ld.u32 %r214,[%r113+48]; ld.u64 %r215,[%r113+72]; ld.u64 %r216,[%r110+112]; ld.u64 %r217,[%r110+40119; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r120; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%frame; call (%value_in),resolve_addr_index$isra$0,%out_arg8,%out_arg9); ld.param.u32 %r218,[%value_in]; } .loc 1 1801 9 setp.eq.u32 %r219,%r218,0; @ %r219 bra $L182; .loc 1 1805 11 ld.u64 %r228,[%frame]; add.u64 %r227,%r111,%r228; ld.u64 %r230,[%frame+264]; add.u64 %r229,%r111,%r230; { .param .u32 %value_in; $LCT16: .callprototype ( .param .u32 %value_out),.param .u64 %in_ar2,.param .u64 %in_ar3,.param .u64 %in_ar4,227119%r117,$LCT16; ld.param.u32 %r231,[%value_in]; } .loc 1 1805 9 setp.eq.u32 %r232,%r231,0; @ ! %r232 bra $L170; $L182: .loc 1 1799 15 mov.u32 %r108,0; bra $L163; $L175:read_uleb128,(%out_arg1); ld.param.u64 %r235,[%value_in]; } .loc 1 1819 11 ld.u32 %r245,[%r113+48]; ld.u64 %r246,[%r113+72]; ld.u64 %r247,[%r110+112]; ld.u64 %r248,[%r110+40.param .u64 %out_arg8; st.param.u64 [%out_arg8],%r120; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%frame; call (%value_in),resolve_addr_index$isra$0,%out_arg8,%out_arg9); ld.param.u32 %r249,[%value_in]; } .loc 1 1819 9 setp.eq.u32 %r250,%r249,0; @ %r250 bra $L184; .loc 1 1823357; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r253,[%value_in]; } .loc 1 1824 10 ld.u64 %r254,[%frame]; add.u64 %r58,%r111,%r254; st.u64 [%frame],%r58; .loc 1 1825 11 add.u64 %r262,%r253,%r58; { .param .u32 %value_in; $LCT17: .callprototype ( .param .u32 %value_out),.param .u64 %in_ar2,.param .u64 %in_ar3,.param .u64 %in_ar4,182119%r117,$LCT17; ld.param.u32 %r263,[%value_in]; } .loc 1 1825 9 setp.eq.u32 %r264,%r263,0; @ ! %r264 bra $L170; $L184: .loc 1 1822 15 mov.u32 %r108,0; bra $L163; $L172: .loc 1 1836 127; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r267,[%value_in]; } .loc 1 1837 137; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r270,[%value_in]; } ld.u64 %r271,[%frame+272]; add.u64 %r107,%r111,%r271; .loc 1 1838 11 add.u64 %r279,%r270,%r107; add.u64 %r280,%r267,%r107; { .param .u32 %value_in; $LCT18: .callprototype ( .param .u32 %value_out),.param .u64 %in_ar2,.param .u64 %in_ar3,.param .u64 %in_ar4,2119%r117,$LCT18; ld.param.u32 %r281,[%value_in]; } .loc 1 1838 9 setp.eq.u32 %r282,%r281,0; @ ! 
%r282 bra $L170; bra $L185; $L180: .loc 1 1846 11 ld.u32 %r285,[%r113+48]285; call (%value_in),read_addres287,[%value_in]; } .loc 1 1846 9 st.u64 [%frame+272],%r287; .loc 1 1847 4 bra $L170; $L178: .loc 1 1854 12 ld.u32 %r290,[%r113+48]290; call (%value_in),read_addres2,[%value_in]; } .loc 1 1855 13 ld.u32 %r295,[%r113+48]295; call (%value_in),read_addres7,[%value_in]; } .loc 1 1856 11 add.u64 %r305,%r111,%r297; add.u64 %r306,%r111,%r292; { .param .u32 %value_in; $LCT19: .callprototype ( .param .u32 %value_out),.param .u64 %in_ar2,.param .u64 %in_ar3,.param .u64 %in_ar4,306305119%r117,$LCT19; ld.param.u32 %r307,[%value_in]; } .loc 1 1856 9 setp.eq.u32 %r308,%r307,0; @ ! %r308 bra $L170; bra $L185; $L179: .loc 1 1868 12 ld.u32 %r311,[%r113+48]311; call (%value_in),read_addres13,[%value_in]; } .loc 1 1869357; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r316,[%value_in]; } .loc 1 1870 10 add.u64 %r79,%r111,%r313; .loc 1 1871 11 add.u64 %r324,%r316,%r79; { .param .u32 %value_in; $LCT20: .callprototype ( .param .u32 %value_out),.param .u64 %in_ar2,.param .u64 %in_ar3,.param .u64 %in_ar4,324119%r117,$LCT20; ld.param.u32 %r325,[%value_in]; } .loc 1 1871 9 setp.eq.u32 %r326,%r325,0; @ ! %r326 bra $L170; bra $L185; $L177: .loc 1 418 38 ld.u64 %r331,[%frame+216]; ld.u64 %r332,[%frame+208]; sub.u64 %r330,%r331,%r332; .loc 1 417 3 st.u32 [%stack+16],%r330; ld.u64 %r333,[%frame+200]; st.u64 [%stack+8],%r333; cvta.const.u64 %r334,$LC10; st.u64 [%stack],%r334; cvta.const.u64 %r329,$LC1; mov2stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r335,[%value_in]; } .loc 1 419 3 ld.u64 %r340,[%frame+248]; ld.u64 %r341,[%frame+240]; mov.u32 %r339,0; { $LCT21frame339; call %r341,$LCT21; } .loc 1 1879 11 mov.u32 %r108,%r339; .loc 1 420 1 bra $L163; $L171: .loc 1 1883 6 ld.u32 %r343,[%frame+256]; set.u32.eq.u32 %r344,%r343,0; neg.s32 %r108,%r344; bra $L163; $L216: .loc 1 1746 21 cvta.const.u64 %r347,$LC11; st.u64 [%frame+200],%r347; .loc 1 1747 44 ld.u64 %r102,[%r110+64]; .loc 1 1747 22 st.u64 [%frame+208],%r102; .loc 1 1748 59 add.u64 %r348,%r102,%r115; .loc 1 1748 20 st.u64 [%frame+216],%r348; .loc 1 1749 60 sub.u64 %r349,%r84,%r115; .loc 1 1749 21 st.u64 [%frame+224],%r349; .loc 1 1750 29 st.u32 [%frame+232],%r112; .loc 1 1751 31 st.u64 [%frame+240],%r119; .loc 1 1752 21 st.u64 [%frame+248],%r120; .loc 1 1753 35 st.u32 [%frame+256],%r116; add.u64 %r357,%frame,200; bra $L186; $L164: .loc 1 1740 6 setp.ge.u64 %r351,%r115,%r84; @ ! 
[Garbled PTX: error and success exits of the same routine, then the definition of add_ranges (.func returning .param .u32). add_ranges first handles a simple low/high PC pair, resolving indexed forms with resolve_addr_index$isra$0 and invoking the add-range callback directly; otherwise, when a ranges value is present, it dispatches on the unit's DWARF version (compare against 4) and sets up a buffer descriptor over the ranges data.]
[Garbled PTX: the rest of add_ranges. The pre-DWARF-5 path walks .debug_ranges entry pairs with read_address, treats an all-ones first word (0xff, 0xffff, 0xffffffff or -1 according to the 1/2/4/8-byte address size) as a base-address selection entry, and hands each concrete pair to the callback; newer units are delegated to add_ranges_from_rnglists$isra$0. Next comes the definition of resolve_string: an already-resolved string (case 5) is returned directly, while an indexed string (case 6) is turned into a 4- or 8-byte position in the string-offsets data, bounds-checked, read with read_uint32/read_uint64, bounds-checked again against the string section and returned as a pointer into it, with range errors reported through the error callback. The register declarations opening read_attribute follow.]
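The indexed-string path in resolve_string above appears to implement DWARF 5's .debug_str_offsets mechanism: the index selects a 4- or 8-byte slot in the offsets table, and the offset read from that slot is then bounds-checked against .debug_str. A hedged C sketch of that lookup follows; the parameter names are assumptions and the reads are host-endian shortcuts, so this is not the library's actual code.

#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Resolve a string index against .debug_str_offsets / .debug_str.
   Returns NULL when either the table slot or the string offset is
   out of range.  Host-endian reads; DWARF64 uses 8-byte slots.  */
static const char *
resolve_str_index (const unsigned char *str_offsets, size_t str_offsets_size,
                   const char *debug_str, size_t debug_str_size,
                   uint64_t index, int is_dwarf64, uint64_t base)
{
  size_t slot = is_dwarf64 ? 8 : 4;
  uint64_t pos = base + index * slot;
  uint64_t off;

  if (pos > str_offsets_size || str_offsets_size - pos < slot)
    return NULL;                      /* slot past the end of the table */

  if (is_dwarf64)
    {
      uint64_t v;
      memcpy (&v, str_offsets + pos, 8);
      off = v;
    }
  else
    {
      uint32_t v;
      memcpy (&v, str_offsets + pos, 4);
      off = v;
    }

  if (off >= debug_str_size)
    return NULL;                      /* offset outside .debug_str */
  return debug_str + off;
}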
[Garbled PTX: the body of read_attribute. A long compare-and-branch chain switches on the DWARF form code (values 1 through 43 plus vendor extensions around 0x1f00-0x1f21) and reads the value with read_byte, read_uint16, read_uint24, read_uint32, read_uint64, read_uleb128, read_sleb128 or read_address, skips inline strings and blocks with advance, resolves string- and address-index forms against the corresponding sections, stores an encoding tag and value into the caller's attribute record, and reports unexpected or out-of-range data through the error callback.]
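That compare-and-branch chain is read_attribute choosing a reader of the right width for each form. The sketch below shows the shape of such a dispatch for a few fixed-width constant forms only; the cursor type and helper names are invented for the example, and the real function also handles LEB128, string, block, indexed and vendor-extension forms.

#include <stdint.h>
#include <stddef.h>

/* A few DW_FORM_* codes from the DWARF specification.  */
enum
{
  DW_FORM_data2 = 0x05,
  DW_FORM_data4 = 0x06,
  DW_FORM_data8 = 0x07,
  DW_FORM_data1 = 0x0b
};

struct cursor { const unsigned char *p; size_t left; };

/* Read an n-byte little-endian value through the cursor; 0 on overrun.  */
static int
read_fixed (struct cursor *c, size_t n, uint64_t *out)
{
  uint64_t v = 0;
  size_t i;

  if (c->left < n)
    return 0;
  for (i = 0; i < n; i++)
    v |= (uint64_t) c->p[i] << (8 * i);
  c->p += n;
  c->left -= n;
  *out = v;
  return 1;
}

/* Dispatch on the form code the way read_attribute's switch does, reduced to
   four constant forms; DW_FORM_udata/sdata would call the LEB128 decoder
   sketched earlier.  */
static int
read_constant_form (struct cursor *c, uint64_t form, uint64_t *val)
{
  switch (form)
    {
    case DW_FORM_data1: return read_fixed (c, 1, val);
    case DW_FORM_data2: return read_fixed (c, 2, val);
    case DW_FORM_data4: return read_fixed (c, 4, val);
    case DW_FORM_data8: return read_fixed (c, 8, val);
    default:            return 0;  /* form not handled in this sketch */
    }
}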
[Garbled PTX: the final cases and error exit of read_attribute, then the definition of read_line_header_format_entries. That routine reads a format count with read_byte, allocates a table with backtrace_alloc, fills it with pairs of read_uleb128 values, reads the entry count, and for each entry reads its attributes with read_attribute, resolves names with resolve_string, concatenates a directory, a '/' byte (47) and the file name into a freshly allocated buffer, and releases the table with backtrace_free on error. The definition of read_referenced_name follows: it bounds-checks the referenced DIE offset, builds a buffer descriptor over the unit data, reads the abbreviation code with read_uleb128, looks it up with lookup_abbrev and starts iterating that abbrev's attributes with read_attribute.]
[Garbled PTX: the rest of read_referenced_name. Name attributes (code 3) are resolved with resolve_string, linkage-name attributes (110 and 0x2007) are resolved the same way and returned when present, and specification/abstract-origin references (code 71) are followed through read_referenced_name_from_attr$part$0. That helper is defined next: depending on the reference encoding it maps a section-relative offset to its compilation unit with bsearch and units_search and recurses into read_referenced_name for the referenced DIE. The definition of find_address_ranges then begins with its register declarations, the abbreviation-code read (read_uleb128) and the lookup_abbrev call.]
[Garbled PTX: the body of find_address_ranges. For each DIE it reads the attributes with read_attribute, feeds low-pc/high-pc/ranges attributes to update_pcrange$isra$0, records the unit's line-table offset, name, compilation directory and several unit-level base offsets (resolving indexed strings with resolve_string), recurses into child DIEs, and for compile-unit and subprogram DIEs passes the collected ranges to add_ranges with add_unit_addr as the callback. The definition of the function-entry reader (printed as read_function_ent) begins at the end of this span with its register declarations.]
[Garbled PTX: the body of read_function_ent. It walks subprogram and inlined-subroutine DIEs recursively: each one gets a record from backtrace_alloc, its attributes are read with read_attribute, the name is resolved through resolve_string or read_referenced_name_from_attr$part$0 (following abstract-origin and specification references), call-site file and line attributes (codes 88 and 89) are recorded for inlined entries, PC ranges are accumulated with update_pcrange$isra$0 and handed to add_ranges with add_function_range as the callback, and child DIEs are processed by a recursive call.]
[Garbled PTX: the remaining exits of read_function_ent, which free unused records with backtrace_free, release the growth vector with backtrace_vector_release, sort the collected function address ranges with function_addrs_compare and store the sorted array and its count into the enclosing record. The definition of dwarf_lookup_pc follows; only its long block of .reg declarations falls inside this excerpt.]
.reg .u64 %r1005; .reg .pred %r1006; .reg .u32 %r1008; .reg .u32 %r1009; .reg .u32 %r1010; .reg .u32 %r1012; .reg .u64 %r1013; .reg .u32 %r1015; .reg .u32 %r1016; .reg .u32 %r1017; .reg .u64 %r1027; .reg .u64 %r1028; .reg .pred %r1029; .reg .u64 %r1032; .reg .pred %r1033; .reg .u32 %r1034; .reg .pred %r1035; .reg .u64 %r1037; .reg .u64 %r1038; .reg .u64 %r1039; .reg .u64 %r1040; .reg .u64 %r1041; .reg .u64 %r1042; .reg .u32 %r1043; .reg .u64 %r1048; .reg .u64 %r1049; .reg .u32 %r1050; .reg .u16 %r1051; .reg .pred %r1052; .reg .pred %r1054; .reg .pred %r1056; .reg .pred %r1058; .reg .pred %r1060; .reg .u32 %r1063; .reg .u64 %r1065; .reg .u64 %r1068; .reg .pred %r1070; .reg .u32 %r1071; .reg .pred %r1072; .reg .u64 %r1074; .reg .u64 %r1075; .reg .u64 %r1076; .reg .u64 %r1077; .reg .u64 %r1078; .reg .u64 %r1079; .reg .u32 %r1080; .reg .u64 %r1085; .reg .u64 %r1086; .reg .u32 %r1087; .reg .u64 %r1088; .reg .u64 %r1089; .reg .u64 %r1092; .reg .u64 %r1095; .reg .u64 %r1098; .reg .u16 %r1099; .reg .u32 %r1100; .reg .pred %r1101; .reg .pred %r1102; .reg .u32 %r1103; .reg .pred %r1104; .reg .u32 %r1105; .reg .u64 %r1106; .reg .pred %r1107; .reg .u64 %r1108; .reg .u64 %r1109; .reg .u64 %r1110; .reg .u64 %r1112; .reg .u64 %r1113; .reg .u64 %r1114; .reg .u64 %r1115; .reg .u64 %r1116; .reg .u64 %r1117; .reg .u64 %r1118; .reg .u32 %r1119; .reg .u32 %r1123; .reg .u64 %r1124; .reg .u64 %r1125; .reg .u64 %r1127; .reg .u64 %r1130; .reg .u64 %r1136; .reg .u64 %r1137; .reg .u64 %r1138; .reg .u64 %r1139; .reg .u64 %r1140; .reg .pred %r1142; .reg .u64 %r1149; .reg .u64 %r1151; .reg .u32 %r1152; .reg .u64 %r1153; .reg .u64 %r1154; .reg .u64 %r1155; .reg .u64 %r1162; .reg .u64 %r1166; .reg .pred %r1167; .reg .u32 %r1168; .reg .pred %r1169; .reg .u64 %r1171; .reg .u64 %r1172; .reg .u64 %r1173; .reg .u64 %r1174; .reg .u64 %r1175; .reg .u64 %r1176; .reg .u64 %r1177; .reg .u32 %r1178; .reg .u64 %r1183; .reg .u64 %r1184; .reg .u32 %r1185; .reg .u64 %r1186; .reg .u64 %r1187; .reg .u16 %r1188; .reg .pred %r1189; .reg .pred %r1191; .reg .pred %r1193; .reg .pred %r1195; .reg .pred %r1197; .reg .pred %r1199; .reg .pred %r1201; .reg .pred %r1203; .reg .pred %r1205; .reg .pred %r1207; .reg .pred %r1209; .reg .pred %r1211; .reg .pred %r1213; .reg .u64 %r1223; .reg .u64 %r1224; .reg .u64 %r1227; .reg .u64 %r1228; .reg .u64 %r1229; .reg .u64 %r1230; .reg .u64 %r1232; .reg .u64 %r1234; .reg .u64 %r1237; .reg .u32 %r1238; .reg .u64 %r1241; .reg .pred %r1242; .reg .u64 %r1243; .reg .u64 %r1244; .reg .pred %r1245; .reg .u64 %r1247; .reg .u64 %r1248; .reg .u64 %r1249; .reg .u64 %r1250; .reg .u64 %r1251; .reg .u64 %r1252; .reg .u64 %r1253; .reg .u32 %r1254; .reg .u32 %r1258; .reg .u64 %r1259; .reg .u64 %r1260; .reg .u64 %r1261; .reg .u64 %r1262; .reg .u64 %r1263; .reg .u64 %r1267; .reg .u32 %r1268; .reg .u32 %r1269; .reg .u32 %r1271; .reg .u32 %r1272; .reg .u32 %r1273; .reg .u32 %r1274; .reg .u32 %r1276; .reg .u64 %r1277; .reg .u32 %r1280; .reg .u64 %r1282; .reg .u16 %r1283; .reg .u64 %r1286; .reg .u32 %r1287; .reg .u64 %r1288; .reg .u64 %r1289; .reg .u64 %r1290; .reg .pred %r1291; .reg .u64 %r1294; .reg .pred %r1295; .reg .pred %r1296; .reg .u64 %r1297; .reg .pred %r1298; .reg .u64 %r1300; .reg .u64 %r1305; .reg .pred %r1306; .reg .u64 %r1307; .reg .u64 %r1308; .reg .u32 %r1315; .reg .pred %r1316; .reg .u64 %r1320; .reg .u64 %r1321; .reg .u32 %r1322; .reg .pred %r1323; .reg .u64 %r1324; .reg .u64 %r1325; .reg .u64 %r1326; .reg .u64 %r1327; .reg .u32 %r1333; .reg .pred %r1334; .reg .u64 %r1340; .reg .u64 %r1341; .reg .u64 %r1347; 
.reg .u64 %r1348; .reg .u64 %r1349; .reg .u64 %r1355; .reg .u64 %r1356; .reg .u64 %r1357; .reg .u32 %r1358; .reg .u32 %r1359; .reg .u64 %r1363; .reg .u32 %r1371; .reg .pred %r1372; .reg .pred %r1373; .reg .pred %r1374; .reg .pred %r1375; .reg .u32 %r1381; .reg .pred %r1382; .reg .u64 %r1387; .reg .pred %r1388; .reg .u64 %r1392; .reg .u64 %r1393; .reg .pred %r1394; .reg .u64 %r1400; .reg .u64 %r1401; .reg .u64 %r1407; .reg .u64 %r1408; .reg .u64 %r1409; .reg .u32 %r1410; .reg .pred %r1411; .reg .pred %r1412; .reg .pred %r1413; .reg .u64 %r1421; .reg .u32 %r1422; .reg .u32 %r1426; .reg .u64 %r1427; .reg .u64 %r1428; .reg .u32 %r1429; .reg .u64 %r1433; .reg .u64 %r1434; .reg .u64 %r1435; .reg .u64 %r1436; .reg .u64 %r1438; .reg .pred %r1439; .reg .pred %r1440; .reg .pred %r1441; .reg .u16 %r1442; .reg .u32 %r1443; .reg .pred %r1444; .reg .pred %r1445; .reg .u64 %r1447; .reg .u64 %r1450; .reg .u64 %r1456; .reg .u64 %r1457; .reg .u64 %r1458; .reg .pred %r1460; .reg .u64 %r1467; .reg .u64 %r1469; .reg .u32 %r1470; .reg .u64 %r1471; .reg .u64 %r1472; .reg .u64 %r1473; .reg .u64 %r1480; .reg .u32 %r1485; .reg .u64 %r1486; .reg .u64 %r1487; .reg .u32 %r1488; .reg .pred %r1489; .reg .u32 %r1495; .reg .u64 %r1496; .reg .u64 %r1497; .reg .u32 %r1498; .reg .u64 %r1503; .reg .u64 %r1504; .reg .u64 %r1506; .reg .pred %r1507; .reg .u64 %r1508; .reg .u64 %r1510; .reg .u64 %r1512; .reg .pred %r1513; .reg .u32 %r1519; .reg .u64 %r1520; .reg .u64 %r1521; .reg .u32 %r1522; .reg .u64 %r1523; .reg .u64 %r1525; .reg .u64 %r1527; .reg .pred %r1528; .reg .u64 %r1529; .reg .pred %r1530; .reg .u64 %r1531; .reg .pred %r1532; .reg .u64 %r1533; .reg .u32 %r1534; .reg .u64 %r1541; .reg .u32 %r1542; .reg .pred %r1543; .reg .u64 %r1546; .reg .u64 %r1548; .reg .u32 %r1549; .reg .u64 %r1550; .reg .u32 %r1551; .reg .u32 %r1553; .reg .pred %r1554; .reg .u64 %r1555; .reg .u64 %r1556; .reg .u64 %r1557; .reg .u64 %r1558; .reg .u64 %r1559; .reg .pred %r1560; .reg .u64 %r1561; .reg .u64 %r1562; .reg .u64 %r1563; .reg .u64 %r1564; .reg .u64 %r1565; .reg .u16 %r1566; mov.u64 %r514,%ar0; mov.u64 %r515,%ar1; mov.u64 %r516,%ar2; st.u64 [%frame+432],%r516; mov.u64 %r517,%ar3; mov.u64 %r518,%ar4; mov.u64 %r519,%ar5; mov.u64 %r520,%ar6; .loc 1 3575 10 mov.u32 %r521,1; st.u32 [%r520],%r521; .loc 1 3578 17 ld.u64 %r22,[%r515+32]; .loc 1 3580 5 setp.eq.u64 %r522,%r22,0; @ %r522 bra $L764; .loc 1 3580 7 ld.u64 %r528,[%r515+24]; add.u64 %r1564,%frame,432; mov.u64 %r527,unit_addrs_search;56525527; call (%value_in),bsearch530,0; @ %r531 bra $L764; .loc 1 3593 33 ld.u64 %r493,[%r515+24]; .loc 1 3593 53 ld.u64 %r182,[%r515+32]; .loc 1 3593 26 sub.u64 %r532,%r530,%r493; shr.s64 %r534,%r532,3; .loc 1 3593 42 mad.lo.u64 %r536,%r534,-6148914691236517205,1; .loc 1 3593 9 setp.le.u64 %r537,%r182,%r536; @ %r537 bra $L653; .loc 1 3594 9 ld.u64 %r29,[%frame+432]; add.u64 %r495,%r530,24; add.u64 %r491,%r530,32; bra $L654; $L764: .loc 1 3585 14 mov.u32 %r538,0; st.u32 [%r520],%r538; .loc 1 3586 14 mov.u32 %r513,%r538; bra $L651; $L656: .loc 1 3593 9 add.u64 %r495,%r495,24; add.u64 %r491,%r491,24; .loc 1 3593 26 sub.u64 %r539,%r486,%r493; shr.s64 %r541,%r539,3; .loc 1 3593 42 mad.lo.u64 %r543,%r541,-6148914691236517205,1; .loc 1 3593 9 setp.ge.u64 %r544,%r543,%r182; @ %r544 bra $L771; $L654: add.u64 %r92,%r495,-24; mov.u64 %r486,%r495; .loc 1 3594 3 ld.u64 %r545,[%r495]; setp.gt.u64 %r546,%r545,%r29; @ %r546 bra $L653; .loc 1 3595 3 ld.u64 %r547,[%r491]; setp.lt.u64 %r548,%r29,%r547; @ %r548 bra $L656; bra $L653; $L771: mov.u64 %r92,%r486; $L653: .loc 1 3604 5 
ld.u64 %r93,[%r92+16]; .loc 1 3605 12 ld.u64 %r34,[%r93+128]; .loc 1 3626 12 ld.u32 %r272,[%r514+8]; .loc 1 3610 9 setp.gt.u64 %r549,%r92,%r493; @ %r549 bra $L814; setp.eq.u32 %r1560,%r272,0; bra $L658; $L814: setp.eq.u32 %r1560,%r272,0; bra $L657; $L660: .loc 1 3614 10 @ %r1560 bra $L659; .loc 1 3615 26 ld.u64 %r33,[%r93+128]; membar.sys; .loc 1 3615 10 mov.u64 %r34,%r33; .loc 1 3626 12 ld.u32 %r272,[%r514+8]; setp.eq.u32 %r1560,%r272,0; $L659: .loc 1 3617 10 setp.ne.u64 %r551,%r34,-1; @ %r551 bra $L658; .loc 1 3620 7 add.u64 %r92,%r92,-24; .loc 1 3622 9 ld.u64 %r93,[%r92+16]; .loc 1 3623 16 ld.u64 %r34,[%r93+128]; .loc 1 3610 9 ld.u64 %r552,[%r515+24]; setp.ge.u64 %r553,%r552,%r92; @ %r553 bra $L658; $L657: .loc 1 3611 9 ld.u64 %r37,[%frame+432]; .loc 1 3611 3 ld.u64 %r554,[%r92+-24]; setp.gt.u64 %r555,%r554,%r37; @ %r555 bra $L658; .loc 1 3612 3 ld.u64 %r556,[%r92+-16]; setp.lt.u64 %r557,%r37,%r556; @ %r557 bra $L660; $L658: .loc 1 3626 6 @ %r1560 bra $L661; .loc 1 3627 13 ld.u64 %r40,[%r93+128]; membar.sys; mov.u64 %r34,%r40; $L661: .loc 1 3630 6 setp.eq.u64 %r559,%r34,0; @ %r559 bra $L662; .loc 1 3685 6 setp.eq.u64 %r560,%r34,-1; @ ! %r560 bra $L664; bra $L663; $L662: .loc 1 3641 11 ld.u64 %r41,[%r92+16]; .loc 1 2908 3 add.u64 %r1563,%frame,400; st.u64 [%r1563],%r34; st.u64 [%r1563+8],%r34; st.u64 [%r1563+16],%r34; .loc 1 2909 13 st.u64 [%frame+424],%r34; .loc 1 2911 3 add.u64 %r1556,%frame,200; st.u64 [%r1556],%r34; st.u64 [%r1556+8],%r34; st.u64 [%r1556+16],%r34; st.u64 [%r1556+24],%r34; st.u64 [%r1556+32],%r34; st.u64 [%r1556+40],%r34; st.u64 [%r1556+48],%r34; st.u64 [%r1556+56],%r34; st.u64 [%r1556+64],%r34; .loc 1 2914 10 ld.u64 %r101,[%r41+56]; .loc 1 2914 59 ld.u64 %r102,[%r515+136]; .loc 1 2913 6 setp.lt.u64 %r576,%r101,%r102; @ %r576 bra $L665; .loc 1 2916 7 cvt.u32.u64 %r579,%r34; cvta.const.u64 %r578,$LC29; { $LCT38%r518,$LCT38; } .loc 1 2917 7 bra $L666; $L665: .loc 1 2920 17 cvta.const.u64 %r580,$LC30; st.u64 [%frame+336],%r580; .loc 1 2921 46 ld.u64 %r103,[%r515+64]; .loc 1 2921 18 st.u64 [%frame+344],%r103; .loc 1 2922 57 add.u64 %r581,%r103,%r101; .loc 1 2922 16 st.u64 [%frame+352],%r581; .loc 1 2923 58 sub.u64 %r582,%r102,%r101; .loc 1 2923 17 st.u64 [%frame+360],%r582; .loc 1 2924 25 ld.u32 %r583,[%r515+200]; st.u32 [%frame+368],%r583; .loc 1 2925 27 st.u64 [%frame+376],%r518; .loc 1 2926 17 st.u64 [%frame+384],%r519; .loc 1 2927 31 cvt.u32.u64 %r584,%r34; st.u32 [%frame+392],%r584; .loc 1 717 9 add.u64 %r1561,%frame,3361561; call (%value_in),read_uint32,(%out_arg1); ld.param.u32 %r587,[%value_in]; } .loc 1 717 7 cvt.u64.u32 %r145,%r587; .loc 1 718 6 setp.ne.u64 %r588,%r145,4294967295; @ %r588 bra $L772; .loc 1 720 1361; call (%value_in),read_uint64,(%out_arg1); ld.param.u64 %r591,[%value_in]; } mov.u64 %r145,%r591; .loc 1 721 19 mov.u32 %r509,1; bra $L667; $L772: .loc 1 724 17 cvt.u32.u64 %r509,%r34; $L667: .loc 1 2930 17 st.u64 [%frame+360],%r145; .loc 1 2616 11561; call (%value_in),read_uint16,(%out_arg1); ld.param.u32 %r594,[%value_in]; } .loc 1 2616 16 st.u32 [%frame+200],%r594; .loc 1 2617 24 add.u32 %r596,%r594,-2; .loc 1 2617 6 setp.le.u32 %r597,%r596,3; @ %r597 bra $L668; .loc 1 418 38 ld.u64 %r602,[%frame+352]; ld.u64 %r603,[%frame+344]; sub.u64 %r601,%r602,%r603; .loc 1 417 3 st.u32 [%stack+16],%r601; ld.u64 %r604,[%frame+336]; st.u64 [%stack+8],%r604; cvta.const.u64 %r605,$LC31; st.u64 [%stack],%r605; cvta.const.u64 %r600,$LC1; mov599600stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r606,[%value_in]; } .loc 1 419 3 ld.u64 %r611,[%frame+384]; ld.u64 
%r612,[%frame+376]; mov.u32 %r610,0; { $LCT39%r612,$LCT39; } bra $L666; $L668: .loc 1 2623 6 setp.eq.u32 %r613,%r594,5; @ %r613 bra $L669; .loc 1 2624 19 ld.u32 %r614,[%r41+48]; st.u32 [%frame+204],%r614; bra $L670; $L669: .loc 1 2627 231561; call (%value_in),read_byte,(%out_arg1); ld.param.u32 %r617,[%value_in]; } st.u32 [%frame+204],%r617; .loc 1 261561; call (%value_in),read_byte,(%out_arg1); ld.param.u32 %r621,[%value_in]; } .loc 1 2630 10 setp.eq.u32 %r623,%r621,0; @ %r623 bra $L670; .loc 1 418 38 ld.u64 %r628,[%frame+352]; ld.u64 %r629,[%frame+344]; sub.u64 %r627,%r628,%r629; .loc 1 417 3 st.u32 [%stack+16],%r627; ld.u64 %r630,[%frame+336]; st.u64 [%stack+8],%r630; cvta.const.u64 %r631,$LC32; st.u64 [%stack],%r631; cvta.const.u64 %r626,$LC1; mov625stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r632,[%value_in]; } .loc 1 419 3 ld.u64 %r637,[%frame+384]; ld.u64 %r638,[%frame+376]; mov.u32 %r636,0; { $LCT46frame636; call %r638,$LCT40; } bra $L666; $L670: .loc 1 572 6 setp.eq.u32 %r639,%r509,0; @ %r639 bra $L671; .loc 1 573 1261; call (%value_in),read_uint64,(%out_arg1); ld.param.u64 %r642,[%value_in]; } mov.u64 %r160,%r642; bra $L672; $L6711561; call (%value_in),read_uint32,(%out_arg1); ld.param.u32 %r645,[%value_in]; } cvt.u64.u32 %r160,%r645; $L672: .loc 1 2640 11 add.u64 %r1559,%frame,272; mov.u64 %r648,645596164852,[%value_in]; } .loc 1 2641 16 st.u64 [%frame+296],%r160; .loc 1 428 10 ld.u64 %r162,[%frame+360]; .loc 1 428 6 setp.gt.u64 %r654,%r160,%r162; @ %r654 bra $L673; .loc 1 448 12 ld.u64 %r656,[%frame+352]; add.u64 %r655,%r656,%r160; st.u64 [%frame+352],%r655; .loc 1 449 13 sub.u64 %r657,%r162,%r160; st.u64 [%frame+360],%r657; .loc 1 2646 231559; call (%value_in),read_byte,(%out_arg1); ld.param.u32 %r660,[%value_in]; } st.u32 [%frame+208],%r660; .loc 1 2647 6 ld.u32 %r662,[%frame+200]; setp.le.s32 %r663,%r662,3; @ %r663 bra $L674; bra $L878; $L673: .loc 1 431 6 ld.u32 %r664,[%frame+392]; setp.eq.u32 %r665,%r664,0; @ ! 
%r665 bra $L666; .loc 1 418 38 ld.u64 %r670,[%frame+352]; ld.u64 %r671,[%frame+344]; sub.u64 %r669,%r670,%r671; .loc 1 417 3 st.u32 [%stack+16],%r669; ld.u64 %r672,[%frame+336]; st.u64 [%stack+8],%r672; cvta.const.u64 %r673,$LC2; st.u64 [%stack],%r673; cvta.const.u64 %r668,$LC1; mov667668stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r674,[%value_in]; } .loc 1 419 3 ld.u64 %r679,[%frame+384]; ld.u64 %r680,[%frame+376]; { $LCT41frame664; call %r680,$LCT41; } .loc 1 434 31 mov.u32 %r681,1; st.u32 [%frame+392],%r681; bra $L666; $L674: .loc 1 2648 27 mov.u32 %r682,1; st.u32 [%frame+212],%r682; bra $L678; $L878: .loc 1 2650 21559; call (%value_in),read_byte,(%out_arg1); ld.param.u32 %r685,[%value_in]; } st.u32 [%frame+212],%r685; $L6781559; call (%value_in),read_byte,(%out_arg1); ld.param.u32 %r689,[%value_in]; } .loc 1 489 24 ld.u64 %r175,[%frame+288]; .loc 1 428 10 ld.u64 %r181,[%frame+296]; .loc 1 428 6 setp.ne.u64 %r690,%r181,0; @ %r690 bra $L679; .loc 1 431 11 ld.u32 %r412,[%frame+328]; .loc 1 431 6 setp.ne.u32 %r691,%r412,0; @ %r691 bra $L773; .loc 1 418 38 ld.u64 %r696,[%frame+280]; sub.u64 %r695,%r175,%r696; .loc 1 417 3 st.u32 [%stack+16],%r695; ld.u64 %r697,[%frame+272]; st.u64 [%stack+8],%r697; cvta.const.u64 %r698,$LC2; st.u64 [%stack],%r698; cvta.const.u64 %r694,$LC1; mov693694stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r699,[%value_in]; } .loc 1 419 3 ld.u64 %r704,[%frame+320]; ld.u64 %r705,[%frame+312]; { $LCT42frame412; call %r705,$LCT42; } .loc 1 434 31 mov.u32 %r706,1; st.u32 [%frame+328],%r706; bra $L680; $L679: .loc 1 448 12 add.u64 %r707,%r175,1; st.u64 [%frame+288],%r707; .loc 1 449 13 add.u64 %r708,%r181,-1; st.u64 [%frame+296],%r708; .loc 1 493 22 ld.u8 %r1566,[%r175]; mov.u16 %r711,%r1566; xor.b16 %r710,%r711,-128; add.u16 %r713,%r710,-128; .loc 1 2655 20 cvt.u32.u16 %r715,%r713; cvt.s32.s8 %r412,%r715; .loc 1 493 22 bra $L680; $L773: cvt.u32.u64 %r412,%r181; $L680: .loc 1 2655 18 st.u32 [%frame+216],%r412; .loc 1 265659; call (%value_in),read_byte,(%out_arg1); ld.param.u32 %r718,[%value_in]; } st.u32 [%frame+220],%r718; .loc 1 2658 221559; call (%value_in),read_byte,(%out_arg1); ld.param.u32 %r722,[%value_in]; } .loc 1 2658 20 st.u32 [%frame+224],%r722; .loc 1 2659 32 ld.u64 %r134,[%frame+288]; .loc 1 2659 23 st.u64 [%frame+232],%r134; .loc 1 2660 44 add.u32 %r724,%r722,-1; .loc 1 2660 8 cvt.u64.u32 %r136,%r724; .loc 1 428 10 ld.u64 %r191,[%frame+296]; .loc 1 428 6 setp.le.u64 %r725,%r136,%r191; @ %r725 bra $L681; .loc 1 431 6 ld.u32 %r726,[%frame+328]; setp.ne.u32 %r727,%r726,0; @ %r727 bra $L666; .loc 1 418 38 ld.u64 %r732,[%frame+280]; sub.u64 %r731,%r134,%r732; .loc 1 417 3 st.u32 [%stack+16],%r731; ld.u64 %r733,[%frame+272]; st.u64 [%stack+8],%r733; cvta.const.u64 %r734,$LC2; st.u64 [%stack],%r734; cvta.const.u64 %r730,$LC1; mov729730stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r735,[%value_in]; } .loc 1 419 3 ld.u64 %r740,[%frame+320]; ld.u64 %r741,[%frame+312]; { $LCT43frame726; call %r741,$LCT43; } bra $L666; $L681: .loc 1 448 12 add.u64 %r218,%r134,%r136; st.u64 [%frame+288],%r218; .loc 1 449 13 sub.u64 %r742,%r191,%r136; st.u64 [%frame+296],%r742; .loc 1 2663 6 ld.u32 %r743,[%frame+200]; setp.gt.s32 %r744,%r743,4; @ %r744 bra $L683; .loc 1 2322 19 mov.u64 %r745,0; st.u64 [%frame+240],%r745; .loc 1 2324 8 add.u64 %r201,%r134,%r191; .loc 1 2325 9 setp.lt.u64 %r746,%r218,%r201; @ %r746 bra $L774; .loc 1 2331 13 st.u64 [%frame+248],%r745; bra $L685; $L687: .loc 1 2327 43 sub.u64 %r750,%r201,%r206; .loc 1 2327 12750; call 
(%value_in),strnle751,[%value_in]; } .loc 1 2327 9 add.u64 %r753,%r751,1; add.u64 %r206,%r206,%r753; .loc 1 2328 7 add.u64 %r207,%r207,1; st.u64 [%frame+240],%r207; .loc 1 2325 9 setp.le.u64 %r754,%r201,%r206; @ ! %r754 bra $L684; bra $L686; $L774: mov.u64 %r206,%r218; mov.u64 %r207,%r745; $L684: .loc 1 2325 19 ld.u8 %r755,[%r206]; setp.ne.u16 %r756,%r755,0; @ %r756 bra $L687; $L686: .loc 1 2331 13 mov.u64 %r757,0; st.u64 [%frame+248],%r757; .loc 1 2332 6 setp.ne.u64 %r758,%r207,0; @ %r758 bra $L688; $L685: .loc 1 2344 9 ld.u8 %r759,[%r218]; setp.ne.u16 %r760,%r759,0; @ %r760 bra $L775; bra $L690; $L688: .loc 1 2335 6 ld.u64 %r765,[%frame+320]; ld.u64 %r766,[%frame+312]; shl.b64 %r767,%r2077665; call (%value_in),backtrace_allocld.param.u64 %r768,[%value_in]; } .loc 1 2334 17 st.u64 [%frame+248],%r768; .loc 1 2339 10 setp.eq.u64 %r770,%r768,0; @ %r770 bra $L666; .loc 1 2344 18 ld.u64 %r218,[%frame+288]; bra $L685; $L775: .loc 1 2344 9 mov.u64 %r496,0; $L689: .loc 1 2346 10 ld.u32 %r771,[%frame+328]; setp.ne.u32 %r772,%r771,0; @ %r772 bra $L666; .loc 1 2349 16 ld.u64 %r773,[%frame+248]; add.u64 %r215,%r773,%r496; .loc 1 459 16 ld.u64 %r273,[%frame+296]call (%value_in),strnle776,[%value_in]; } .loc 1 464 10 add.u64 %r275,%r776,1; .loc 1 428 6 setp.ge.u64 %r778,%r273,%r275; @ %r778 bra $L691; .loc 1 418 38 ld.u64 %r783,[%frame+280]; sub.u64 %r782,%r218,%r783; .loc 1 417 3 st.u32 [%stack+16],%r782; ld.u64 %r784,[%frame+272]; st.u64 [%stack+8],%r784; cvta.const.u64 %r785,$LC2; st.u64 [%stack],%r785; cvta.const.u64 %r781,$LC1; mov07stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r786,[%value_in]; } .loc 1 419 3 ld.u64 %r791,[%frame+320]; ld.u64 %r792,[%frame+312]; { $LCT44%r792,$LCT44; } .loc 1 434 31 mov.u32 %r793,1; st.u32 [%frame+328],%r793; .loc 1 2350 20 ld.u64 %r794,[%frame+248]; add.u64 %r416,%r794,%r496; .loc 1 467 12 mov.u64 %r218,%r34; bra $L692; $L691: .loc 1 448 12 ld.u64 %r796,[%frame+288]; add.u64 %r795,%r796,%r275; st.u64 [%frame+288],%r795; .loc 1 449 13 sub.u64 %r797,%r273,%r275; st.u64 [%frame+296],%r797; mov.u64 %r416,%r215; $L692: .loc 1 2349 20 st.u64 [%r215],%r218; .loc 1 2350 10 ld.u64 %r798,[%r416]; setp.eq.u64 %r799,%r798,0; @ %r799 bra $L666; .loc 1 2344 18 ld.u64 %r218,[%frame+288]; .loc 1 2344 9 add.u64 %r496,%r496,8; ld.u8 %r800,[%r218]; setp.ne.u16 %r801,%r800,0; @ %r801 bra $L689; $L690: .loc 1 428 10 ld.u64 %r291,[%frame+296]; .loc 1 428 6 setp.ne.u64 %r802,%r291,0; @ %r802 bra $L693; .loc 1 431 6 ld.u32 %r803,[%frame+328]; setp.ne.u32 %r804,%r803,0; @ %r804 bra $L666; .loc 1 418 38 ld.u64 %r809,[%frame+280]; sub.u64 %r808,%r218,%r809; .loc 1 417 3 st.u32 [%stack+16],%r808; ld.u64 %r810,[%frame+272]; st.u64 [%stack+8],%r810; cvta.const.u64 %r811,$LC2; st.u64 [%stack],%r811; cvta.const.u64 %r807,$LC1; mov806807stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r812,[%value_in]; } .loc 1 419 3 ld.u64 %r817,[%frame+320]; ld.u64 %r818,[%frame+312]; { $LCT45frame3; call %r818,$LCT45; } bra $L666; $L693: .loc 1 448 12 ld.u64 %r300,[%frame+288]; add.u64 %r237,%r300,1; st.u64 [%frame+288],%r237; .loc 1 449 13 add.u64 %r819,%r291,-1; st.u64 [%frame+296],%r819; .loc 1 2358 24 mov.u64 %r820,0; st.u64 [%frame+256],%r820; .loc 1 2360 8 add.u64 %r221,%r300,%r291; .loc 1 2361 9 setp.gt.u64 %r821,%r221,%r237; @ %r821 bra $L776; $L702: .loc 1 2371 7 ld.u64 %r826,[%frame+320]; ld.u64 %r827,[%frame+312]; ld.u64 %r829,[%frame+256]; shl.b64 %r828,%r8298826; call (%value_in),backtrace_allocld.param.u64 %r830,[%value_in]; } .loc 1 2370 18 st.u64 [%frame+264],%r830; .loc 1 2375 6 
setp.eq.u64 %r832,%r830,0; @ ! %r832 bra $L879; bra $L666; $L703: .loc 1 2363 44 sub.u64 %r835,%r221,%r237; .loc 1 2363835; call (%value_in),strnle836,[%value_in]; } .loc 1 2363 9 add.u64 %r838,%r836,1; add.u64 %r226,%r237,%r838; .loc 1 702 9 ld.s8 %r840,[%r226]; cvt.u16.u32 %r839,%r840; setp.ge.s16 %r841,%r839,0; @ %r841 bra $L777; add.u64 %r1557,%r226,1; $L697: .loc 1 704 7 mov.u64 %r227,%r1557; add.u64 %r1557,%r1557,1; sub.u64 %r485,%r1557,%r226; .loc 1 702 9 ld.s8 %r844,[%r227]; cvt.u16.u32 %r843,%r844; setp.lt.s16 %r845,%r843,0; @ %r845 bra $L697; bra $L696; $L777: .loc 1 701 7 mov.u64 %r485,%r1565; $L696: .loc 1 2364 9 add.u64 %r230,%r226,%r485; .loc 1 702 9 ld.s8 %r847,[%r230]; cvt.u16.u32 %r846,%r847; setp.ge.s16 %r848,%r846,0; @ %r848 bra $L778; add.u64 %r1555,%r230,1; $L699: .loc 1 704 7 mov.u64 %r31,%r1555; add.u64 %r1555,%r1555,1; sub.u64 %r483,%r1555,%r230; .loc 1 702 9 ld.s8 %r851,[%r31]; cvt.u16.u32 %r850,%r851; setp.lt.s16 %r852,%r850,0; @ %r852 bra $L699; bra $L698; $L778: .loc 1 701 7 mov.u64 %r483,%r1565; $L698: .loc 1 2365 9 add.u64 %r233,%r230,%r483; .loc 1 702 9 ld.s8 %r854,[%r233]; cvt.u16.u32 %r853,%r854; setp.ge.s16 %r855,%r853,0; @ %r855 bra $L779; add.u64 %r1558,%r233,1; $L701: .loc 1 704 7 mov.u64 %r234,%r1558; add.u64 %r1558,%r1558,1; sub.u64 %r279,%r1558,%r233; .loc 1 702 9 ld.s8 %r858,[%r234]; cvt.u16.u32 %r857,%r858; setp.lt.s16 %r859,%r857,0; @ %r859 bra $L701; bra $L700; $L779: .loc 1 701 7 mov.u64 %r279,%r1565; $L700: .loc 1 2366 9 add.u64 %r237,%r233,%r279; .loc 1 2367 7 add.u64 %r89,%r89,1; st.u64 [%frame+256],%r89; .loc 1 2361 9 setp.le.u64 %r860,%r221,%r237; @ ! %r860 bra $L694; bra $L702; $L776: mov.u64 %r89,%r820; .loc 1 701 7 mov.u64 %r1565,1; $L694: .loc 1 2361 19 ld.u8 %r861,[%r237]; setp.ne.u16 %r862,%r861,0; @ %r862 bra $L703; bra $L702; $L879: .loc 1 2378 18 ld.u64 %r1562,[%frame+288]; .loc 1 2378 9 ld.u8 %r863,[%r1562]; setp.ne.u16 %r864,%r863,0; @ %r864 bra $L780; $L712: .loc 1 2680 6 ld.u32 %r865,[%frame+328]; setp.ne.u32 %r866,%r865,0; @ ! 
%r866 bra $L880; bra $L666; $L780: .loc 1 2378 9 mov.u64 %r504,0; $L704: .loc 1 2383 10 ld.u32 %r867,[%frame+328]; setp.ne.u32 %r868,%r867,0; @ %r868 bra $L666; .loc 1 459 16 ld.u64 %r302,[%frame+296]56302; call (%value_in),strnlen,(%out_arg1,%out_arg2); ld.param.u64 %r871,[%value_in]; } .loc 1 464 10 add.u64 %r304,%r871,1; .loc 1 428 6 setp.ge.u64 %r873,%r302,%r304; @ %r873 bra $L706; .loc 1 418 38 ld.u64 %r878,[%frame+280]; sub.u64 %r877,%r1562,%r878; .loc 1 417 3 st.u32 [%stack+16],%r877; ld.u64 %r879,[%frame+272]; st.u64 [%stack+8],%r879; cvta.const.u64 %r880,$LC2; st.u64 [%stack],%r880; cvta.const.u64 %r876,$LC1; mov875876stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r881,[%value_in]; } .loc 1 419 3 ld.u64 %r886,[%frame+320]; ld.u64 %r887,[%frame+312]; { $LCT46886%r887,$LCT46; } bra $L666; $L706: .loc 1 448 12 add.u64 %r888,%r1562,%r304; st.u64 [%frame+288],%r888; .loc 1 449 13 sub.u64 %r890,%r302,%r304; st.u64 [%frame+296],%r890; .loc 1 2389 1959; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r893,[%value_in]; } .loc 1 2390 10 ld.s8 %r895,[%r1562]; cvt.u16.u32 %r894,%r895; setp.eq.u16 %r896,%r894,47; @ %r896 bra $L707; .loc 1 2391 4 setp.ne.u64 %r897,%r893,0; @ %r897 bra $L708; .loc 1 2391 27 ld.u64 %r247,[%r41+96]; .loc 1 2391 23 setp.ne.u64 %r898,%r247,0; @ %r898 bra $L709; $L707: .loc 1 2392 20 ld.u64 %r899,[%frame+264]; add.u64 %r900,%r899,%r504; st.u64 [%r900],%r1562; bra $L710; $L708: .loc 1 2402 23 add.u64 %r901,%r893,-1; .loc 1 2402 12 ld.u64 %r902,[%frame+240]; setp.ge.u64 %r903,%r901,%r902; @ %r903 bra $L711; .loc 1 2403 10 ld.u64 %r904,[%frame+248]; shl.b64 %r905,%r893,3; add.u64 %r906,%r904,%r905; ld.u64 %r247,[%r906+-8]; bra $L709; $L711: .loc 1 418 38 ld.u64 %r912,[%frame+288]; ld.u64 %r913,[%frame+280]; sub.u64 %r911,%r912,%r913; .loc 1 417 3 st.u32 [%stack+16],%r911; ld.u64 %r914,[%frame+272]; st.u64 [%stack+8],%r914; cvta.const.u64 %r915,$LC23; st.u64 [%stack],%r915; cvta.const.u64 %r910,$LC1; mov909910stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r916,[%value_in]; } .loc 1 419 3 ld.u64 %r921,[%frame+320]; ld.u64 %r922,[%frame+312]; { $LCT47%r922,$LCT47; } bra $L666; $L709:247924,[%value_in]; } .loc 1 2412 19927,[%value_in]; } .loc 1 2413 18 ld.u64 %r933,[%frame+320]; ld.u64 %r934,[%frame+312]; .loc 1 2413 50 add.u64 %r935,%r924,%r927; .loc 1 2413 18 add.u64 %r936,%r935,19r934933; call (%value_in),backtrace_allocld.param.u64 %r937,[%value_in]; } .loc 1 2416 7 setp.eq.u64 %r939,%r937,0; @ %r939 bra $L666; .loc 1 2418 492474946,[%value_in]; } .loc 1 2422 15 add.u64 %r948,%r937,%r924; mov.u32 %r949,47; st.u8 [%r948],%r949; .loc 1 2423 24 add.u64 %r950,%r924,1; add.u64 %r951,%r937,%r950; .loc 1 2423 4 add.u64 %r952,%r927629959,[%value_in]; } .loc 1 2424 22 ld.u64 %r961,[%frame+264]; add.u64 %r962,%r961,%r504; st.u64 [%r962],%r937; $L710: .loc 1 2428 759; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r965,[%value_in]; } .loc 1 2429 759; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r968,[%value_in]; } .loc 1 2378 18 ld.u64 %r1562,[%frame+288]; .loc 1 2378 9 add.u64 %r504,%r504,8; ld.u8 %r969,[%r1562]; setp.ne.u16 %r970,%r969,0; @ %r970 bra $L704; bra $L712; $L683: .loc 1 2670 12 add.u64 %r979,%frame,248; add.u64 %r981,%frame,24515151551556read_line_header_format_entries84,[%value_in]; } .loc 1 2670 10 setp.eq.u32 %r985,%r984,0; @ %r985 bra $L666; .loc 1 2674 12 add.u64 %r994,%frame,264; add.u64 %r996,%frame,256515151551556read_line_header_format_entries99,[%value_in]; } .loc 1 2674 10 setp.eq.u32 %r1000,%r999,0; @ ! 
%r1000 bra $L712; bra $L666; $L880: .loc 1 2702 6 ld.u64 %r1001,[%frame+256]; setp.eq.u64 %r1002,%r1001,0; @ %r1002 bra $L781; .loc 1 2703 20 ld.u64 %r1003,[%frame+264]; ld.u64 %r326,[%r1003]; bra $L713; $L781: .loc 1 2705 20 cvta.const.u64 %r326,$LC27; $L713: .loc 1 2708 18 ld.u64 %r91,[%frame+360]; .loc 1 2708 9 setp.eq.u64 %r1004,%r91,0; @ %r1004 bra $L714; .loc 1 2706 12 mov.u64 %r429,%r326; .loc 1 2707 10 mov.u32 %r137,1; .loc 1 2700 11 mov.u64 %r327,0; .loc 1 2701 12 cvt.u32.u64 %r293,%r327; $L746: .loc 1 477 24 ld.u64 %r408,[%frame+352]; .loc 1 448 12 add.u64 %r1005,%r408,1; st.u64 [%frame+352],%r1005; .loc 1 449 13 add.u64 %r91,%r91,-1; st.u64 [%frame+360],%r91; .loc 1 481 11 ld.u8 %r409,[%r408]; .loc 1 2713 20 ld.u32 %r329,[%frame+224]; .loc 1 2713 10 setp.lt.u32 %r1006,%r409,%r329; @ %r1006 bra $L715; .loc 1 2718 7 sub.u32 %r330,%r409,%r329; .loc 1 2719 22 ld.u32 %r331,[%frame+220]; .loc 1 2719 12 div.u32 %r1008,%r330,%r331; .loc 1 2720 46 add.u32 %r334,%r1008,%r293; .loc 1 2721 14 ld.u32 %r336,[%frame+212]; .loc 1 2720 34 ld.u32 %r1010,[%frame+208]; mul.lo.u32 %r1009,%r334,%r1010; .loc 1 2721 9 div.u32 %r1012,%r1009,%r336; cvt.u64.u32 %r1013,%r1012; .loc 1 2720 12 add.u64 %r327,%r327,%r1013; .loc 1 2722 13 rem.u32 %r293,%r334,%r336; .loc 1 2723 41 rem.u32 %r1015,%r330,%r331; .loc 1 2723 29 ld.u32 %r1017,[%frame+216]; add.u32 %r1016,%r1015,%r1017; .loc 1 2723 11 add.u32 %r137,%r137,%r1016; .loc 1 2724 4 ld.u64 %r1027,[%frame+384]; ld.u64 %r1028,[%frame+3765151510227; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r1563; call add_line$isra$0,%out_arg8); } .loc 1 2708 18 ld.u64 %r91,[%frame+360]; bra $L716; $L715: .loc 1 2727 15 setp.ne.u32 %r1029,%r409,0; @ %r1029 bra $L717;1561; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r1032,[%value_in]; } .loc 1 477 24 ld.u64 %r414,[%frame+352]; .loc 1 428 10 ld.u64 %r91,[%frame+360]; .loc 1 428 6 setp.ne.u64 %r1033,%r91,0; @ %r1033 bra $L718; .loc 1 431 6 ld.u32 %r1034,[%frame+392]; setp.ne.u32 %r1035,%r1034,0; @ %r1035 bra $L719; .loc 1 418 38 ld.u64 %r1040,[%frame+344]; sub.u64 %r1039,%r414,%r1040; .loc 1 417 3 st.u32 [%stack+16],%r1039; ld.u64 %r1041,[%frame+336]; st.u64 [%stack+8],%r1041; cvta.const.u64 %r1042,$LC2; st.u64 [%stack],%r1042; cvta.const.u64 %r1038,$LC1; mov.u64 %r1037,20101038stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r1043,[%value_in]; } .loc 1 419 3 ld.u64 %r1048,[%frame+384]; ld.u64 %r1049,[%frame+376]; { $LCT4810frame,$LCT48; } .loc 1 434 31 mov.u32 %r1050,1; st.u32 [%frame+392],%r1050; .loc 1 428 10 ld.u64 %r91,[%frame+360]; bra $L719; $L718: .loc 1 448 12 add.u64 %r424,%r414,1; st.u64 [%frame+352],%r424; .loc 1 449 13 add.u64 %r91,%r91,-1; st.u64 [%frame+360],%r91; .loc 1 2733 4 ld.u8 %r1051,[%r414]; setp.eq.u16 %r1052,%r1051,3; @ %r1052 bra $L720; setp.gt.u16 %r1054,%r1051,3; @ %r1054 bra $L721; setp.eq.u16 %r1056,%r1051,1; @ %r1056 bra $L782; setp.eq.u16 %r1058,%r1051,2; @ %r1058 bra $L722; bra $L719; $L721: setp.eq.u16 %r1060,%r1051,4; @ %r1060 bra $L723; bra $L719; $L722: .loc 1 2745 18 ld.u32 %r1063,[%frame+204]51063; call (%value_in),read_address,(%out_arg1,%out_arg2); ld.param.u64 %r1065,[%value_in]; } mov.u64 %r327,%r1065; .loc 1 2708 18 ld.u64 %r91,[%frame+360]; .loc 1 2746 8 bra $L716; $L720: .loc 1 459 16424strnle1068,[%value_in]; } .loc 1 464 10 add.u64 %r426,%r1068,1; .loc 1 428 6 setp.ge.u64 %r1070,%r91,%r426; @ %r1070 bra $L724; .loc 1 431 6 ld.u32 %r1071,[%frame+392]; setp.ne.u32 %r1072,%r1071,0; @ %r1072 bra $L666; .loc 1 418 38 ld.u64 %r1077,[%frame+344]; sub.u64 
%r1076,%r424,%r1077; .loc 1 417 3 st.u32 [%stack+16],%r1076; ld.u64 %r1078,[%frame+336]; st.u64 [%stack+8],%r1078; cvta.const.u64 %r1079,$LC2; st.u64 [%stack],%r1079; cvta.const.u64 %r1075,$LC1; mov.u64 %r1074,201074107stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r1080,[%value_in]; } .loc 1 419 3 ld.u64 %r1085,[%frame+384]; ld.u64 %r1086,[%frame+376]; { $LCT491085,$LCT49; } .loc 1 434 31 mov.u32 %r1087,1; st.u32 [%frame+392],%r1087; bra $L666; $L724: .loc 1 448 12 add.u64 %r1088,%r424,%r426; st.u64 [%frame+352],%r1088; .loc 1 449 13 sub.u64 %r1089,%r91,%r426; st.u64 [%frame+360],%r1089; .loc 1 27551561; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r1092,[%value_in]; } .loc 1 2757 361; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r1095,[%value_in]; } .loc 1 2758 361; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r1098,[%value_in]; } .loc 1 2759 6 ld.s8 %r1100,[%r414+1]; cvt.u16.u32 %r1099,%r1100; setp.ne.u16 %r1101,%r1099,47; @ %r1101 bra $L726; .loc 1 2708 18 ld.u64 %r91,[%frame+360]; mov.u64 %r429,%r424; bra $L716; $L726: .loc 1 2755 13 cvt.u32.u64 %r348,%r1092; .loc 1 2768 10 setp.ne.u32 %r1102,%r348,0; @ %r1102 bra $L727; .loc 1 2768 26 ld.u32 %r1103,[%frame+200]; setp.gt.s32 %r1104,%r1103,4; @ %r1104 bra $L727; .loc 1 2769 13 ld.u64 %r357,[%r41+96]; bra $L728; $L727: .loc 1 2770 26 add.u32 %r1105,%r348,-1; cvt.u64.u32 %r352,%r1105; .loc 1 2770 15 ld.u64 %r1106,[%frame+240]; setp.ge.u64 %r1107,%r352,%r1106; @ %r1107 bra $L729; .loc 1 2771 13 ld.u64 %r1108,[%frame+248]; shl.b64 %r1109,%r352,3; add.u64 %r1110,%r1108,%r1109; ld.u64 %r357,[%r1110]; bra $L728; $L729: .loc 1 418 38 ld.u64 %r1115,[%frame+352]; ld.u64 %r1116,[%frame+344]; sub.u64 %r1114,%r1115,%r1116; .loc 1 417 3 st.u32 [%stack+16],%r1114; ld.u64 %r1117,[%frame+336]; st.u64 [%stack+8],%r1117; cvta.const.u64 %r1118,$LC33; st.u64 [%stack],%r1118; cvta.const.u64 %r1113,$LC1; mov.u64 %r1112,2011121113stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r1119,[%value_in]; } .loc 1 419 3 ld.u64 %r1124,[%frame+384]; ld.u64 %r1125,[%frame+376]; mov.u32 %r1123,0; { $LCT511frame,$LCT50; } bra $L666; $L728: .loc 1 2779 1771127,[%value_in]; } .loc 1 27804241130,[%value_in]; } .loc 1 2782 5 ld.u64 %r1136,[%frame+384]; ld.u64 %r1137,[%frame+376]; .loc 1 2782 37 add.u64 %r1138,%r1127,%r1130; .loc 1 2782 5 add.u64 %r1139,%r1138,11111371136; call (%value_in),backtrace_allocld.param.u64 %r1140,[%value_in]; } mov.u64 %r429,%r1140; .loc 1 2785 10 setp.eq.u64 %r1142,%r429,0; @ %r1142 bra $L666; .loc 1 2787 71271149,[%value_in]; } .loc 1 2792 18 add.u64 %r1151,%r429,%r1127; mov.u32 %r1152,47; st.u8 [%r1151],%r1152; .loc 1 2793 27 add.u64 %r1153,%r1127,1; add.u64 %r1154,%r429,%r1153; .loc 1 2793 7 add.u64 %r1155,%r1111r424111162,[%value_in]; } .loc 1 2708 18 ld.u64 %r91,[%frame+360]; bra $L716; $L723: .loc 1 28001561; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r1166,[%value_in]; } .loc 1 2708 18 ld.u64 %r91,[%frame+360]; .loc 1 2801 8 bra $L716; $L719: .loc 1 2803 13 add.u64 %r368,%r1032,-1; .loc 1 428 6 setp.le.u64 %r1167,%r368,%r91; @ %r1167 bra $L730; .loc 1 431 6 ld.u32 %r1168,[%frame+392]; setp.ne.u32 %r1169,%r1168,0; @ %r1169 bra $L666; .loc 1 418 38 ld.u64 %r1174,[%frame+352]; ld.u64 %r1175,[%frame+344]; sub.u64 %r1173,%r1174,%r1175; .loc 1 417 3 st.u32 [%stack+16],%r1173; ld.u64 %r1176,[%frame+336]; st.u64 [%stack+8],%r1176; cvta.const.u64 %r1177,$LC2; st.u64 [%stack],%r1177; cvta.const.u64 %r1172,$LC1; mov.u64 %r1171,2011711172stack; call (%value_in),sn,%out_arg4); ld.param.u32 
%r1178,[%value_in]; } .loc 1 419 3 ld.u64 %r1183,[%frame+384]; ld.u64 %r1184,[%frame+376]; { $LCT51118ame,$LCT51; } .loc 1 434 31 mov.u32 %r1185,1; st.u32 [%frame+392],%r1185; bra $L666; $L730: .loc 1 448 12 ld.u64 %r1187,[%frame+352]; add.u64 %r1186,%r1187,%r368; st.u64 [%frame+352],%r1186; .loc 1 449 13 sub.u64 %r91,%r91,%r368; st.u64 [%frame+360],%r91; bra $L716; $L717: .loc 1 2810 4 cvt.u16.u32 %r1188,%r409; setp.eq.u16 %r1189,%r1188,5; @ %r1189 bra $L731; setp.gt.u16 %r1191,%r1188,5; @ %r1191 bra $L732; setp.eq.u16 %r1193,%r1188,3; @ %r1193 bra $L733; setp.gt.u16 %r1195,%r1188,3; @ %r1195 bra $L734; setp.eq.u16 %r1197,%r1188,1; @ %r1197 bra $L735; setp.eq.u16 %r1199,%r1188,2; @ %r1199 bra $L736; bra $L737; $L734: setp.eq.u16 %r1201,%r1188,4; @ %r1201 bra $L738; bra $L737; $L732: setp.eq.u16 %r1203,%r1188,9; @ %r1203 bra $L739; setp.gt.u16 %r1205,%r1188,9; @ %r1205 bra $L740; setp.le.u16 %r1207,%r1188,7; @ %r1207 bra $L716; setp.eq.u16 %r1209,%r1188,8; @ %r1209 bra $L741; bra $L737; $L740: setp.le.u16 %r1211,%r1188,11; @ %r1211 bra $L716; setp.eq.u16 %r1213,%r1188,12; @ %r1213 bra $L742; bra $L737; $L735: .loc 1 2813 8 ld.u64 %r1223,[%frame+384]; ld.u64 %r1224,[%frame+376515151224223; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r1563; call add_line$isra$0,%out_arg8); } .loc 1 2708 18 ld.u64 %r91,[%frame+360]; .loc 1 2815 8 bra $L716; $L736: .loc 1 2820 1361; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r1227,[%value_in]; } .loc 1 2821 45 cvt.u64.u32 %r1228,%r293; add.u64 %r375,%r1228,%r1227; .loc 1 2822 13 ld.u32 %r378,[%frame+212]; .loc 1 2821 18 ld.u32 %r1229,[%frame+208]; .loc 1 2821 33 mul.lo.u64 %r1230,%r1229,%r375; .loc 1 2822 8 div.u64 %r1232,%r1230,%r378; .loc 1 2821 11 add.u64 %r327,%r327,%r1232; .loc 1 2823 35 rem.u64 %r1234,%r375,%r378; .loc 1 2823 12 cvt.u32.u64 %r293,%r1234; .loc 1 2708 18 ld.u64 %r91,[%frame+360]; .loc 1 2825 8 bra $L716; $L733: .loc 1 2827 2461; call (%value_in),read_sleb128,(%out_arg1); ld.param.u64 %r1237,[%value_in]; } .loc 1 2827 15 cvt.u32.u64 %r1238,%r1237; add.u32 %r137,%r137,%r1238; .loc 1 2708 18 ld.u64 %r91,[%frame+360]; .loc 1 2828 8 bra $L716; $L738: .loc 1 2833 1261; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r1241,[%value_in]; } .loc 1 2834 6 setp.ne.u64 %r1242,%r1241,0; @ %r1242 bra $L743; .loc 1 2708 18 ld.u64 %r91,[%frame+360]; .loc 1 2835 14 cvta.const.u64 %r429,$LC27; bra $L716; $L743: .loc 1 2838 18 add.u64 %r1243,%r1241,-1; .loc 1 2838 10 ld.u64 %r1244,[%frame+256]; setp.lt.u64 %r1245,%r1243,%r1244; @ %r1245 bra $L744; .loc 1 418 38 ld.u64 %r1250,[%frame+352]; ld.u64 %r1251,[%frame+344]; sub.u64 %r1249,%r1250,%r1251; .loc 1 417 3 st.u32 [%stack+16],%r1249; ld.u64 %r1252,[%frame+336]; st.u64 [%stack+8],%r1252; cvta.const.u64 %r1253,$LC34; st.u64 [%stack],%r1253; cvta.const.u64 %r1248,$LC1; mov.u64 %r1247,20124748stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r1254,[%value_in]; } .loc 1 419 3 ld.u64 %r1259,[%frame+384]; ld.u64 %r1260,[%frame+376]; mov.u32 %r1258,0; { $LCT5212,$LCT52; } bra $L666; $L744: .loc 1 2845 16 ld.u64 %r1261,[%frame+264]; shl.b64 %r1262,%r1241,3; add.u64 %r1263,%r1261,%r1262; ld.u64 %r429,[%r1263+-8]; .loc 1 2708 18 ld.u64 %r91,[%frame+360]; bra $L716; $L731: .loc 1 28501561; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r1267,[%value_in]; } .loc 1 2708 18 ld.u64 %r91,[%frame+360]; .loc 1 2851 8 bra $L716; $L741: .loc 1 2860 6 mov.u32 %r1269,255; sub.u32 %r1268,%r1269,%r329; .loc 1 2861 11 ld.u32 %r1272,[%frame+220]; div.u32 %r1271,%r1268,%r1272; .loc 1 2862 45 
add.u32 %r394,%r1271,%r293; .loc 1 2863 13 ld.u32 %r396,[%frame+212]; .loc 1 2862 33 ld.u32 %r1274,[%frame+208]; mul.lo.u32 %r1273,%r394,%r1274; .loc 1 2863 8 div.u32 %r1276,%r1273,%r396; cvt.u64.u32 %r1277,%r1276; .loc 1 2862 11 add.u64 %r327,%r327,%r1277; .loc 1 2864 12 rem.u32 %r293,%r394,%r396; .loc 1 2866 8 bra $L716; $L739: .loc 1 2868 11561; call (%value_in),read_uint16,(%out_arg1); ld.param.u32 %r1280,[%value_in]; } cvt.u16.u32 %r1283,%r1280; cvt.u64.u16 %r1282,%r1283; .loc 1 2868 16 add.u64 %r327,%r327,%r1282; .loc 1 2708 18 ld.u64 %r91,[%frame+360]; .loc 1 2869 17 mov.u32 %r293,0; .loc 1 2870 8 bra $L716; $L742: .loc 1 28761561; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r1286,[%value_in]; } .loc 1 2708 18 ld.u64 %r91,[%frame+360]; .loc 1 2877 8 bra $L716; $L737: .loc 1 2882 35 add.u32 %r1287,%r409,-1; cvt.u64.u32 %r1288,%r1287; .loc 1 2882 31 ld.u64 %r1289,[%frame+232]; add.u64 %r1290,%r1289,%r1288; .loc 1 2882 10 ld.u8 %r406,[%r1290]; .loc 1 2882 3 setp.eq.u32 %r1291,%r406,0; @ %r1291 bra $L716; $L745: .loc 1 2883 561; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r1294,[%value_in]; } .loc 1 2882 48 add.u32 %r406,%r406,-1; .loc 1 2882 3 setp.ne.u32 %r1295,%r406,0; @ %r1295 bra $L745; .loc 1 2708 18 ld.u64 %r91,[%frame+360]; bra $L716; $L782: .loc 1 2733 4 mov.u64 %r429,%r326; mov.u32 %r293,%r409; mov.u32 %r137,1; mov.u64 %r327,0; $L716: .loc 1 2708 9 setp.ne.u64 %r1296,%r91,0; @ %r1296 bra $L746; bra $L714; $L882: .loc 1 2941 6 ld.u64 %r1297,[%frame+424]; setp.eq.u64 %r1298,%r1297,0; @ %r1298 bra $L666; .loc 1 2951 2 mov.u64 %r13005113005181563; call (%value_in),backtrace_vector_grow05,[%value_in]; } .loc 1 2953 6 setp.eq.u64 %r1306,%r1305,0; @ %r1306 bra $L666; .loc 1 2955 10 mov.u64 %r1307,-1; st.u64 [%r1305],%r1307; .loc 1 2956 16 mov.u64 %r1308,0; st.u64 [%r1305+8],%r1308; .loc 1 2957 14 st.u64 [%r1305+16],%r13085118519; call (%value_in),backtrace_vector_releaseld.param.u32 %r1315,[%value_in]; } .loc 1 2960 6 setp.eq.u32 %r1316,%r1315,0; @ %r1316 bra $L666; .loc 1 2963 6 ld.u64 %r111,[%frame+400]; .loc 1 2964 3 ld.u64 %r1321,[%frame+424]; mov.u64 %r1320,line_compare313001320; call backtrace_qso); } .loc 1 2967 21 ld.u64 %r113,[%frame+424]; .loc 1 3648 7 ld.u32 %r1322,[%r514+8]; setp.ne.u32 %r1323,%r1322,0; @ %r1323 bra $L747; bra $L881; $L666: .loc 2 267 12 ld.u64 %r1325,[%frame+416]; ld.u64 %r1326,[%frame+408]; add.u64 %r1324,%r1325,%r1326; st.u64 [%frame+416],%r1324; .loc 2 268 13 mov.u64 %r1327,0; st.u64 [%frame+408],%r1327; .loc 2 269 3118519; call (%value_in),backtrace_vector_releaseld.param.u32 %r1333,[%value_in]; } .loc 1 2302 10 ld.u64 %r463,[%frame+240]; .loc 1 2302 6 setp.eq.u64 %r1334,%r463,0; @ %r1334 bra $L749; .loc 1 2303 5 shl.b64 %r1340,%r463,3; ld.u64 %r1341,[%frame+2413411340518519; call backtrace_free); } $L749: .loc 1 2305 3 ld.u64 %r1348,[%frame+256]; shl.b64 %r1347,%r1348,3; ld.u64 %r1349,[%frame+26451134913519; call backtrace_free); } .loc 1 3639 22 mov.u64 %r506,%r34; .loc 1 2975 16 mov.u64 %r113,0; .loc 1 2974 10 mov.u64 %r34,-1; .loc 1 3640 28 mov.u64 %r508,%r113; .loc 1 3629 12 cvt.u32.u64 %r88,%r113; bra $L750; $L747: .loc 1 3652 4 ld.u64 %r411,[%r92+16]; .loc 1 3450 7 st.u64 [%r1559],%r1308; st.u64 [%r1559+8],%r1308; st.u64 [%r1559+16],%r1308; st.u64 [%r1559+24],%r1308; mov.u64 %r45,%r34; .loc 1 3451 13 mov.u64 %r94,%r1559; bra $L751; $L881: .loc 1 3651 12 add.u64 %r94,%r515,208; .loc 1 3652 4 ld.u64 %r411,[%r92+16]; mov.u64 %r45,%r94; $L751: .loc 1 3454 17 cvta.const.u64 %r1355,$LC25; st.u64 [%frame],%r1355; .loc 1 3455 
18 ld.u64 %r1356,[%r515+56]; st.u64 [%frame+8],%r1356; .loc 1 3456 16 ld.u64 %r1357,[%r411]; st.u64 [%frame+16],%r1357; .loc 1 3457 20 ld.u64 %r276,[%r411+8]; .loc 1 3457 17 st.u64 [%frame+24],%r276; .loc 1 3458 25 ld.u32 %r1358,[%r515+200]; st.u32 [%frame+32],%r1358; .loc 1 3459 27 st.u64 [%frame+40],%r518; .loc 1 3460 17 st.u64 [%frame+48],%r519; .loc 1 3461 31 mov.u32 %r1359,0; st.u32 [%frame+56],%r1359; .loc 1 3463 9 bra $L752; $L754: .loc 1 3465 12 mov.u64 %r1363,5151511363frame1556518; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r519; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r94; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r94; call (%value_in),read_function_ent,%out_arg8,%out_arg9,%out_arg10); ld.param.u32 %r1371,[%value_in]; } .loc 1 3465 10 setp.eq.u32 %r1372,%r1371,0; @ %r1372 bra $L753; .loc 1 3463 18 ld.u64 %r276,[%frame+24]; $L752: .loc 1 3463 9 setp.ne.u64 %r1373,%r276,0; @ %r1373 bra $L754; .loc 1 3470 12 ld.u64 %r508,[%r94+24]; .loc 1 3470 6 setp.eq.u64 %r1374,%r508,0; @ %r1374 bra $L753; .loc 1 3475 6 setp.ne.u64 %r1375,%r45,0; @ %r1375 bra $L755; .loc 1 3477159518519; call (%value_in),backtrace_vector_releaseld.param.u32 %r1381,[%value_in]; } .loc 1 3477 10 setp.eq.u32 %r1382,%r1381,0; @ %r1382 bra $L753; .loc 1 3479 13 ld.u64 %r506,[%r94]; bra $L756; $L755: .loc 1 3486 951518519; call (%value_in),backtrace_vector_finishld.param.u64 %r1387,[%value_in]; } mov.u64 %r506,%r1387; .loc 1 3488 10 setp.eq.u64 %r1388,%r506,0; @ %r1388 bra $L753; .loc 1 3490 19 st.u64 [%r45+24],%r276; $L756: .loc 1 3493 3 mov.u64 %r1393,function_addrs_compare; mov3921393; call backtrace_qso); } bra $L757; $L753: .loc 1 3639 22 mov.u64 %r506,%r34; .loc 1 3640 28 mov.u64 %r508,0; $L757: .loc 1 2302 10 ld.u64 %r469,[%frame+240]; .loc 1 2302 6 setp.eq.u64 %r1394,%r469,0; @ %r1394 bra $L758; .loc 1 2303 5 shl.b64 %r1400,%r469,3; ld.u64 %r1401,[%frame+2414011400518519; call backtrace_free); } $L758: .loc 1 2305 3 ld.u64 %r1408,[%frame+256]; shl.b64 %r1407,%r1408,3; ld.u64 %r1409,[%frame+2645114091407518519; call backtrace_free); } .loc 1 2966 10 mov.u64 %r34,%r111; .loc 1 3656 13 mov.u32 %r88,1; $L750: .loc 1 3666 10 ld.u32 %r1410,[%r514+8]; setp.ne.u32 %r1411,%r1410,0; @ %r1411 bra $L759; .loc 1 3668 19 st.u64 [%r93+136],%r113; .loc 1 3669 22 st.u64 [%r93+144],%r506; .loc 1 3670 28 st.u64 [%r93+152],%r508; .loc 1 3671 13 st.u64 [%r93+128],%r34; bra $L760; $L759: .loc 1 3675 4 membar.sys; st.u64 [%r93+136],%r113; .loc 1 3676 4 membar.sys; st.u64 [%r93+144],%r506; .loc 1 3677 4 membar.sys; st.u64 [%r93+152],%r508; .loc 1 3679 4 membar.sys; st.u64 [%r93+128],%r34; $L760: .loc 1 3685 6 setp.ne.u64 %r1412,%r34,-1; @ %r1412 bra $L664; .loc 1 3690 10 setp.eq.u32 %r1413,%r88,0; @ %r1413 bra $L663; .loc 1 3691 9 ld.u64 %r1421,[%frame+43515151427518dwarf_lookup_pc22,[%value_in]; } mov.u32 %r513,%r1422; bra $L651; $L663: .loc 1 3693 14 ld.u64 %r1428,[%frame+432]; mov.u64 %r1427,0; cvt.u32.u64 %r1426,%r1427; { .param .u32 %value_in; $LCT53: .callprototype ( .param .u32 %value_out),.param .u64 %in_ar2,1427141427; call (%value_in),%r517),$LCT53; ld.param.u32 %r1429,[%value_in]; } mov.u32 %r513,%r1429; bra $L651; $L664: .loc 1 3698 24 ld.u64 %r1435,[%r92+16]; ld.u64 %r1436,[%r1435+136]; mov.u64 %r1434,line_search; mov.u64 %r1433564143614331434; call (%value_in),bsearch1438,[%value_in]; } .loc 1 3700 6 setp.ne.u64 %r1439,%r1438,0; @ %r1439 bra $L761; .loc 1 3707 16 ld.u64 %r56,[%r92+16]; .loc 1 3707 19 ld.u64 %r90,[%r56+104]; .loc 1 3707 10 setp.ne.u64 %r1440,%r90,0; @ %r1440 bra $L762; .loc 1 3711 
13 ld.u64 %r90,[%r56+88]; .loc 1 3712 7 setp.eq.u64 %r1441,%r90,0; @ %r1441 bra $L763; .loc 1 3713 8 ld.s8 %r1443,[%r90]; cvt.u16.u32 %r1442,%r1443; setp.eq.u16 %r1444,%r1442,47; @ %r1444 bra $L763; .loc 1 3714 19 ld.u64 %r58,[%r56+96]; .loc 1 3714 8 setp.eq.u64 %r1445,%r58,0; @ %r1445 bra $L763; .loc 1 3721 23901447,[%value_in]; } .loc 1 3723 1581450,[%value_in]; } .loc 1 3724 53 add.u64 %r1456,%r1447,%r1450; .loc 1 3724 21 add.u64 %r1457,%r1456,11457518519; call (%value_in),backtrace_allocld.param.u64 %r1458,[%value_in]; } .loc 1 3726 11 setp.eq.u64 %r1460,%r1458,0; @ %r1460 bra $L764; .loc 1 3731145141467,[%value_in]; } .loc 1 3733 19 add.u64 %r1469,%r1458,%r1450; mov.u32 %r1470,47; st.u8 [%r1469],%r1470; .loc 1 3734 28 add.u64 %r1471,%r1450,1; add.u64 %r1472,%r1458,%r1471; .loc 1 3734 8 add.u64 %r1473,%r14471474731480,[%value_in]; } .loc 1 3737 9 ld.u64 %r56,[%r92+16]; mov.u64 %r90,%r1458; $L763: .loc 1 3737 27 st.u64 [%r56+104],%r90; $L762: .loc 1 3740 14 ld.u64 %r1487,[%frame+432]; mov.u64 %r1486,0; cvt.u32.u64 %r1485,%r1486; { .param .u32 %value_in; $LCT54: .callprototype ( .param .u32 %value_out),.param .u64 %in_ar2,487141486; call (%value_in),%r517),$LCT54; ld.param.u32 %r1488,[%value_in]; } mov.u32 %r513,%r1488; bra $L651; $L761: .loc 1 3745 12 ld.u64 %r66,[%r92+16]; .loc 1 3745 15 ld.u64 %r67,[%r66+152]; .loc 1 3745 6 setp.ne.u64 %r1489,%r67,0; @ %r1489 bra $L765; .loc 1 3746 12 ld.u32 %r1495,[%r1438+16]; ld.u64 %r1496,[%r1438+8]; ld.u64 %r1497,[%frame+432]; { .param .u32 %value_in; $LCT55: .callprototype ( .param .u32 %value_out),.param .u64 %in_ar2,141495%r517),$LCT55; ld.param.u32 %r1498,[%value_in]; } mov.u32 %r513,%r1498; bra $L651; $L765: .loc 1 3749 7 ld.u64 %r1504,[%r66+144]; mov.u64 %r1503,function_addrs_search560414331503; call (%value_in),bsearch1506,[%value_in]; } mov.u64 %r484,%r1506; .loc 1 3753 6 setp.eq.u64 %r1507,%r1506,0; @ %r1507 bra $L766; .loc 1 3759 43 ld.u64 %r492,[%r92+16]; .loc 1 3759 46 ld.u64 %r487,[%r492+144]; .loc 1 3760 14 ld.u64 %r289,[%r492+152]; .loc 1 3770 9 ld.u64 %r141,[%frame+432]; .loc 1 3759 36 sub.u64 %r1508,%r1506,%r487; shr.s64 %r1510,%r1508,3; .loc 1 3759 63 mad.lo.u64 %r1512,%r1510,-6148914691236517205,1; .loc 1 3759 9 setp.le.u64 %r1513,%r289,%r1512; @ %r1513 bra $L767; add.u64 %r481,%r1506,24; add.u64 %r512,%r1506,32; bra $L768; $L766: .loc 1 3754 12 ld.u32 %r1519,[%r1438+16]; ld.u64 %r1520,[%r1438+8]; ld.u64 %r1521,[%frame+432]; { .param .u32 %value_in; $LCT56: .callprototype ( .param .u32 %value_out),.param .u64 %in_ar2,115201511506; call (%value_in),%r517),$LCT56; ld.param.u32 %r1522,[%value_in]; } mov.u32 %r513,%r1522; bra $L651; $L769: .loc 1 3759 9 add.u64 %r481,%r481,24; add.u64 %r512,%r512,24; .loc 1 3759 36 sub.u64 %r1523,%r507,%r487; shr.s64 %r1525,%r1523,3; .loc 1 3759 63 mad.lo.u64 %r1527,%r1525,-6148914691236517205,1; .loc 1 3759 9 setp.ge.u64 %r1528,%r1527,%r289; @ %r1528 bra $L783; $L768: add.u64 %r484,%r481,-24; mov.u64 %r507,%r481; .loc 1 3761 3 ld.u64 %r1529,[%r481]; setp.gt.u64 %r1530,%r1529,%r141; @ %r1530 bra $L767; .loc 1 3762 3 ld.u64 %r1531,[%r512]; setp.gt.u64 %r1532,%r1531,%r141; @ %r1532 bra $L769; bra $L767; $L783: mov.u64 %r484,%r507; $L767: .loc 1 3765 12 ld.u64 %r96,[%r484+16]; .loc 1 3767 12 ld.u64 %r1533,[%r1438+8]; st.u64 [%frame],%r1533; .loc 1 3768 10 ld.u32 %r1534,[%r1438+16]; st.u32 [%frame+200],%r1534; .loc 1 3770 9 add.u64 %r1541,%frame,201517frame1541; call (%value_in),report_inlined_functions2,[%value_in]; } mov.u32 %r513,%r1542; .loc 1 3772 6 setp.ne.u32 %r1543,%r513,0; @ %r1543 bra $L651; 
[nvptx PTX assembly for the libbacktrace objects bundled in this package: dwarf.o (dwarf_fileline, backtrace_dwarf_add), fileline.o (fileline_initialize, backtrace_pcinfo, backtrace_syminfo), posix.o (backtrace_open, backtrace_close), print.o (print_callback, error_callback, backtrace_print), read.o (backtrace_get_view, backtrace_release_view), simple.o (simple_unwind, backtrace_simple), sort.o (backtrace_qsort), state.o (backtrace_create_state), unknown.o (unknown_fileline, backtrace_initialize); followed by fragments of the libgfortran.la libtool archive]

libgfortran.spec:
#
# This spec file is read by gfortran when linking.
# It is used to specify the libraries we need to link in, in the right
# order.
#
%rename lib liborig
*lib: -lm %(libgcc) %(liborig)
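The two directives above use GCC's generic spec-file mechanism rather than anything specific to this package: %rename saves the driver's built-in lib spec under a new name, and the *lib: rule then redefines lib so that a gfortran link pulls in the math library and the libgcc spec ahead of the original library list. The restatement below is only an annotated sketch under that reading; the two rules are copied from the spec file, while the comments and the described expansion order are explanatory additions, not part of the installed file.

#
# Annotated sketch (semantics assumed from GCC's standard spec machinery):
#   %rename lib liborig  -- keep the original built-in 'lib' spec reachable
#                           under the name %(liborig)
#   *lib: ...            -- redefine 'lib' so it expands to -lm, then the
#                           'libgcc' spec, then the saved original spec
#

%rename lib liborig
*lib: -lm %(libgcc) %(liborig)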
%r119 bra $L179; .loc 2 1168 4 ld.u8 %r69,[%r42+207]; ld.u32 %r70,[%r42+88];75 11 ld.u8 %r73,[%r25]; setp.eq.u16 %r74,%r73,0; @ %r74 bra $L181; .loc 2 1179 35 cvt.u32.u32 %r75,%r66; st.u8 [%r25],%r75; .loc 5 63 10 @ %r123 atom.add.u32 %r76,[%r25+16],1; shfl.idx.b32 %r76,%r76,%r122,31; .loc 5 64 1 bra $L179; $L181: .loc 2 1182 16 ld.u8 %r77,[%r25+1]; setp.eq.u16 %r78,%r77,0; @ %r78 bra $L179; .loc 2 1186 38 cvt.u32.u16 %r79,%r73; st.u8 [%r25+1],%r79; .loc 5 63 10 @ %r123 atom.add.u32 %r80,[%r25+16],1; shfl.idx.b32 %r80,%r80,%r122,31; $L179: .loc 2 1191 10 setp.ne.u64 %r81,%r43,0; @ ! %r81 bra $L183; .loc 2 1193 4 ld.u8 %r89,[%r42+207]; ld.u32 %r90,[%r42+88]; add.u64 %r91,%r43,8; mov.u32 %r87,0; mov.u32 %r82837 cvt.u32.u32 %r94,%r87; st.u8 [%r43+48],%r94; .loc 5 63 10 @ %r123 atom.add.u32 %r95,[%r43+52],1; shfl.idx.b32 %r95,%r95,%r122,31; $L183: .loc 2 1206 7 ld.u8 %r103,[%r42+207]; ld.u32 %r104,[%r42+88]; mov.u32 %r101,0; mov.u32 %r100,1.loc 2 1211 7 ld.u32 %r107,[%r54+1256]; add.u32 %r106,%r107,1; st.u32 [%r54+1256],%r106; .loc 2 1212 7 ld.u32 %r109,[%r54+1260]; add.u32 %r108,%r109,1; st.u32 [%r54+1260],%r10pred %r74; .reg .u32 %r75; .reg .pred %r76; .reg .u32 %r77; .reg .u32 %r79; .reg .pred %r80; .reg .u64 %r81; .reg .pred %r82; .reg .u64 %r85; .reg .u64predu64 %r117; .reg .u64 %r118; .reg .predu32u32 %r153; .reg .u32 %r155; .reg .u32 %r157; .reg .pred %r158; .reg .u32 %r164; .reg .u32 %r167; .reg .pred %r168; .reg .pred %r169; .reg .u32180180ld.u32 %r180100,[%r24+120]; st.u64 [%r100102,[%r45+16]; set.u32.eq.u64 %r103,%r102,0; neg.s32 %r104,%r103; cvt.u32.u32 %r101,%r104; cvt.u32.u8 %r42,%r101; $L398: .loc 4 283 7 cvt.u16.u32 %r106,%r67; cvt.u16.u32 %r107,%r42180; .loc 4 214 5 add.u64 %r117mov.u64 %r60,%r118; ld.u32 %r180.loc 2 613 14 mov.u32 %r157,2; st.u32 [%r24+200],%r157; .loc 2 614 6 setp.ne.u64 %r158,%r27,0; @ %r158 bra $L413; bra $L409; $L400: .loc 2 610 3 cvt.u32.u64 %r16467180,%r162,%r441; .loc 2 709 25 mov.u64 %r100,0; .loc 2 709 10 mov.u64 %r98,%r100100100predpredu64 %r213; .reg .u32 %r214; .reg .u64pred %r232; .reg .u32 %r233; .reg .u32predpred %r259; .reg .u32 %r260; .reg .u64 %r261; .reg .pred %r262; .reg .predu32 %r266; .reg .u64109116 ld.u64 %r163,[%r44+72]; setp.ne.u64 %r164,%r163,0; @ %r164 bra $L544; $L547: .loc 2 1227 12 mov.u64 %r60,%r265; bra $L545; $L544: .loc 2 1231 6 ld.u64 %r165,[%r44]; setp.eq.u64 %r166,%r1$L57$L545: .loc 2 1245 21 ld.u64 %r82,[%r44]; .loc 2 1246 6 setp.eq.u64 %r173,%r82,0; @ %r173 bra $L548; .loc 2 1252 6 ld.u8 %r174,[%r44+207]; setp.ne.u16 %r175,%r174,0; @ ! 
%r175 bra $L550; .loc 2 1253 18 ld.u64 %r85,[%r82+64]; .loc 2 1253 10 ld.u64 %r176,[%r85+8]; add.u64 %r87,%r176,-1; .loc 2 1253 7 st.u64 [%r85+8],%r87; setp.ne.u64 %r177,%r87,0; @ %r177 bra $L550; .loc 2 1254 7 ld.u8 %r178,[%r85+1]; setp.eq.u16 %r179,%r178,0; @ %r179 bra $L550; .loc 2 1256 40 cvt.u32.u64 %r180,%r87; st.u8 [%r85+1],%r180; .loc 5 63 10 @ %r271 atom.add.u32 %r181,[%r85+16],1; shfl.idx.b32 %r181,%r181,%r270,31; $L550: .loc 3 1329 10 add.u64 %r108,%r44,112; .loc 4 466 6 ld.u64 %r182,[%r82+8]; setp.eq.u64 %r183,%r182,0; @ %r183 bra $L552; .loc 2 1260 7 add.u64 %r90,%r82,8; .loc 4 468 7119119,%r290; @ %r291 bra $L775; .loc 4 424 10 setp.eq.u64 %r292,%r1198686868658585858,0; bra $L858; $L957: .loc 2 387 7 setp.eq.u64 %r459,%r58,0; @ %r459 bra $L859; .loc 2 387 21 ld.u8 %r208,[%r58585858,0; $L862:58585858585810058100,0; @ %r354 bra $L887; .loc 2 490 11 ld.u8 %r355,[%r100100100]; .loc 2 493 5 setp.eq.u64 %r359,%r75,0; @ %r359 bra $L888; .loc 2 494 5 ld.u8 %r360,[%r75+49]; setp.ne.u16 %r361,%r360,0; @ ! %r361 bra $L888; bra $L885; $L883: .loc 2 498 10 setp.eq.u64 %r362,%r100,0; @ %r362 bra $L887; $L888: .loc 2 499 25 ld.u64 %r364,[%r100+56]; add.u64 %r363,%r364,1; st.u64 [%r10058100585858589predpred.reg .u32u32 %r158; .reg .predu32 %r166; .reg .pred %r167; .reg .pred %r171; .reg .pred %r172; .reg .u64predu32 %r222; .reg .u64 %r223; .reg .predpred %r231; .reg .u64 %r232; .reg .pred %r233; .reg .u32 %r234; .reg .u64 %r237; .reg .u32 %r238; .reg .u64 %r239; .reg .u32 %r240; .reg .u32 %r241; .reg .u64 %r242; .reg .pred %r243; .reg .predpredpredpred1000; .loc 2 1906 7 ld.u64 %r139,[%r56+56]; setp.eq.u64 %r140,%r139,0; @ %r140 bra $L1001; .loc 4 191 6 ld.u64 %r141,[%r55+8]; setp.ne.u64 %r142,%r141,0; @ %r142 bra $L10021003;bra $L1005; $L10011048: .loc 4 396 11 ld.u64 %r151,[%r56+8]; .loc 3 1320 10 ld.u64 %r152,[%r151+8]; add.u64 %r49,%r152,-128; $L1005: .loc 2 1931 10 ld.u32 %r153,[%r49+200]; setp.ne.u32 %r154,%r153,2; @ %r154 bra $L1003; .loc 2 1934 8 ld.u64 %r156,[%r49];156.loc 2 1935 7 setp.eq.u32 %r160,%r158,0; @ %r160 bra $L1008; .loc 2 1937 11 setp.eq.u64 %r161,%r50,0; @ %r161 bra $L10162,%r66,0; @ %r162 bra $L1010; .loc 3 92$L1010: .loc 2 1940bra $L1009; $L1003: .loc 2 1954 33 mov.u32 %r165,1; st.u8 [%r56+48],%r165; mov.u64 %r49,0; $L10086 ld.u64 %r187,[%r49+72]; setp.ne.u64 %r188,%r187,0; @ %r188 bra $L1021; $L1024: .loc 2 1227 12 mov.u64 %r69,0; bra $L1022; $L1021: .loc 2 1231 6 ld.u64 %r189,[%r49]; setp.eq.u64 %r190,%r189,0; @ %r190 bra $L10231024; $L1050122,%r49,112; .loc 4 466 6 ld.u64 %r206,[%r97+8]; setp.eq.u64 %r207,%r206,0; @ %r207 bra $L1029; .loc 2 1260 7 add.u64 %r104,%r97,8; .loc 4 468 7 mov.u32 %r208,10r1042214,%r122; @ %r215 bra $L1031; .loc 4 424 10 setp.eq.u64 %r216,%r114,%r214; @ %r216 bra $L1034; .loc 4 425 14 st.u64 [%r97+24],%r114; bra $L1031; $L10341032: .loc 2 1262 16 ld.u64 %r105,[%r97+64]; .loc 2 1262 7 setp.eq.u64 %r218,%r105,0; @ %r218 bra $L1031; .loc 2 1262 27 ld.u8 %r219,[%r105]; setp.eq.u16 %r220,%r219,0; @ %r220 bra $L1031; .loc 2 1264 37 mov.u32 %r221,0; st.u8 [%r105],%r221; .loc 5 63 10 @ %r301 atom.add.u32 %r222,[%r105+16],1; shfl.idx.b32 %r222,%r222,%r300,31; $L1031: .loc 2 1267 39 mov.u64 %r223,0; st.u64 [%r49+112],%r223; .loc 2 1268 39 st.u64 [%r49+120],%r223; bra $L1025; $L1054: .loc 2 1235 12 mov.u64 %r69,%r291; $L1025: .loc 4 141 35 ld.u64 %r94,[%r491036; $L1035: .loc 2 108 25 ld.u64 %r9595,0; @ ! 
%r228 bra $L1036; mov.u64 %r96,%r95; $L1037: .loc 2 112 49 st.u64 [%r96+-112],%r94; .loc 2 113 4 ld.u64 %r96,[%r96]; .loc 2 115 5 setp.ne.u64 %r230,%r95,%r96; @ %r230 bra $L1037; $L10361000: .loc 4 402 20 mov.u32 %r271,1; st.u8 [%frame],%r271; .loc 3 1320 10 add.u64 %r49,%r62,-128; .loc 4 403 14 bra $L1005; $L999: .loc 4 385 4 mov.u32 %r274,0; mov.u32 %r272,2;27292741005; bra $L1048; $L1002: .loc 4 385 4 cvt.u32.u64 %r283,%r62; mov.u32 %r281,2831005; bra $L1049; $L1112: .loc 4 402 20 mov.u32 %r290,1; st.u8 [%frame],%r290; .loc 3 1320 10 add.u64 %r49,%r64,-112; .loc 4 403 14 bra $L1005mov.u32 %r30,%ar1; .loc 1 37 6 and.b32 %r31,%r30,1;9+4pred %r64; mov.u64 %r42,%ar0; mov.u32 %r43,%ar1; .loc 1 97 22 add.u64 %r41,%r42,4; .loc 1 80 6 and.b32 %r44,%r43,1; setp.eq.u32 %r45,%r44,0; @ %r45 bra $L12cvt.s64.s32 %r47,%r46; .loc 1 84 25ld.u64 %r30,[%r50+16]; .loc 1 86 20 ld.u32 %r51,[%r42];ld.u32 %r60,[%r42];ptrlock.o/ 5656565656; $L52: add.u64 %r74,%r171,4; st.u64 [%frame+24],%r74; .loc 1 101 6 setp.gt.u64 %r333,%r56,%r5656; .loc 1 115 7 min.u64 %r342,%r343,%r69; .loc 1 115 22 add.u64 %r344,%r190,%r56835656; .loc 1 96 16 sub.u64 %r362,%r191,%r5656; $L57: add.u64 %r91,%r174,4; st.u64 [%frame+24],%r91; .loc 1 101 6 setp.gt.u64 %r372,%r56,%r91100,%r47,%r170; st.u64 [%frame+24],%r100100141141,1; st.u64 [%frame+24],%r121; .loc 1 101 6 setp.gt.u64 %r474,%r60,%r121; @ %r474 bra $L26; mov.u64 %r60; .loc 1 96 16 sub.u64 %r482,%r191,%r60; $L72: add.u64 %r126,%r146,2; st.u64 [%frame+24],%r126; .loc 1 101 6 setp.gt.u64 %r491,%r60,%r126; @ %r491 bra $L26; mov.u64 %r60; .loc 1 115 7 min.u64 %r560,%r561,%r113; .loc 1 115 22 add.u64 %r562,%r190,%r60; $L80: add.u64 %r155,%r113,%r127; st.u64 [%frame+24],%r155; .loc 1 118 6 setp.gt.u64 %r564,%r6060; $L82: add.u64 %r167,%r441,%r163; st.u64 [%frame+24],%r167; .loc 1 101 6 setp.gt.u64 %r590,%rpredpredpred %r85; .reg .u64 %r8686,__nvptx_uni; add.u64 %r86,%r86,%ustmp1; ld.shared.u32 %r84,[%r884,%r84,%ustmp0; setp.eq.u32 %r85,%r84,%ustmp0; }ar2; .loc 2 631 15 ld.u32 %r33,[%r36]; .loc 2 632 3 ld.global.u32 %r3@ %r85 atom.global.exch.b32 %r34,[goacc_prof_lock],1; shfl.idx.b32 %r34,%r34,%r84,3$L199: .loc 2 657 7 ld.u64 %r81,[%r29];oacc-targetacc_on_device_h_acc_on_device_hacc_onacc_onacc_on_device_h_acc_on_device_h_mov.u64 %r27,%ar0; st.u64 [%frame+16],%r27; ld.u64 %r28,[%frame+16]; ld.u32 %r22,[%r28]; mov.u32 %r29,%r22acc_on_device,(mov.u32 %r23,%r30; set.u32.ne.u32 %r24,%r23,0; neg.s32 %r24,%r24; st.u32 [%frame],%r24; ld.u32 %r25,[%frame]; mov.u32 %r26,%r25; mov__tls_get_addrlibc.so.6ld-linux-x86-64.so.285F(11IHn΁HHH1HD$ 1IHLD$ @S1tA@\I@.Am@F$3$AL$9vt Hlj5$H$E1[fD$E1H48A18[]A\A]A^18fDSHHHDH Ht[ÐHtXH

ApDxEDBHBYPSXF`BhEpAxHEGBFBJEEEEIPSXF`BhEpAxHEGBFBJEEEEIPSXF`BhEpAxHEGBFBJEEEBIPSXH`BhDpAxDEGBFBJEEEBIXK`BhDpAxGEGBFBGBEEBIXF`BhEpAxHEGBFBJBEEBIXF`BhEpAxHEGBFBJBEEBIXF`BhEpAxHEGBFBJBEEBIXF`BhEpBxHEGBGBJBEEBIPQXB`DhApDxEEEKBXPQXB`DhBpDxEEBGBJBGBGEBBBBEBBABEEEEBLPSXK`BhDpBxGEDBFBJBBBBEBBBBEBBABEBBABEEEEELPQXB`DhBpDxEEBHBJBGAGEEEEBLPSXF`EhDpBxEYPSXF`EhDpBxEYPQXB`DhBpDxEEBGBJBGAGEEEEBLPQXB`DhApDxEEBGBJBGAGEEEEELPSXJ`BhEpAxDEDBFBJEEEBIXF`BhEpAxHEGBFBJBEEBIXF`BhEpAxHEGBFBJBEEBIXF`BhEpAxHEGBFBJBEEBIXF`BhEpAxHEGBFBJBEEBIXF`BhEpBxHEGBGBJBEEBIPQXB`EhApHxEDEJBUPQXB`DhBpDxEEBGBJBGBGEBBBBEBBABEEEEBLPQXB`DhBpDxEEBHBJBGAGEEEEBLPSXG`BhHpAxGEGBFBJEEEBOPLXG`BhHpAxGEGBFBJEEBBIXG`BhHpBxGEGBGBJEEEBIPSXF`BhEpAxHEGBFBJEEEBIPSXF`BhEpAxHEGBFBJEEEBIPQXB`DhBpDxEEEKBXPSXG`BhHpAxGEGBFBJEEEBIPKXB`GhBpGxEGEJBNPQXB`DhApDxEHEKB^P[XF`BhEpBxHEGBFBJEEEEIPQXB`DhApDxEEEKBXPSXF`BhEpBxHEGBFBJBEEEIPFXB`EhApHxEGBGBJBBABEEEEBLPSXF`BhEpBxHEGBGBJEEEEIPQXB`EhApHxEDEJEUPQXB`DhApDxEEEKEXPQXB`DhBpDxEEEKBXPQXB`DhApDxEEBKBXPQXB`DhBpDxEEBHBJBGAGEEEEBLPQXB`DhBpDxEEEKBXPQXB`HhBpHxEDBGBUP`XB`DhApDxEEEKBXPQXB`DhApDxEHEKEXPSXF`BhEpAxHEGBFBJEEEBIPQXB`DhApDxEHEKBXPQXB`DhApDxEEEKB_PZXB`DhApDxEHEKBXPQXB`DhBpDxEEEKBXPSXF`BhDpBxEEHBGBJEBEBIPSXF`BhEpAxHEGBGBJEEEBIPSXF`BhEpBxHEGBFBJBBBBEBBABEEEEBLPQXB`EhApHxEGEJBUPSXF`BhEpBxHEGBGBJBEEBIXF`BhEpAxHEGBFBJEEEBIPSXF`BhEpAxHEGBGBJBBBBEBBBBEEEEBLPSXF`BhDpBxHYPQXB`DhApDxEHBKBUPQXB`DhBpDxEEEKBXPQXB`DhApDxEEEKBXPQXB`DhApDxEEBGBJBGAGEBEEBLPSXF`BhEpAxHEGBFBJEEEBIPSXF`BhEpAxHEGBFBJEEEBIPSXK`BhDpAxGEDBFBJEEEEIPSXF`BhEpAxHEGBFBJEEEEIPSXF`BhEpAxHEGBFBJEEEEIPSXG`BhHpAxGEGBFBJEEEBIXG`BhHpAxGEGBFBJEEBBIXG`BhHpBxGEGBGBJEEEBIPSXF`BhDpBxEEHBFBJEBEBIPQXB`EhApHxEJEJBUPGXB`HhBpGxEJEJBNPMXB`GhOpBxHEGBBIPGXB`HhBpGxEJEGBNP`XB`DhApExEKEJBXPGXB`HhBpGxEJEJBNPQXB`DhApDxEEBHBJBGAGEEEEBLPSXK`BhDpBxHEGBFBJEBBBPPRXB`DhBpDxNLEJBIPSXG`BhDpBxHEGBGBJBBEBPPRXB`EhBpHxIOBJBPPRXB`EhBpDxLPBJBOPXXD`BhGpJxOBEEEOPSXF`BhHpMxOOPSXI`BhGpKxOOPQXB`DhApDxNOEJBIPQXB`DhApDxEEBHBJBGAGEEEEBLPQXB`DhApExEKEJBXPGXB`HhBpGxEJEJBNPMXB`GhOpBxHEEBBOPGXB`HhBpGxEJEGBNPbXF`BhEpAxHEDBGBJEBEBIPSXG`BhDpAxHEGBFBJEEEEIXF`BhEpAxHEGBGBJEEEEIPFXB`EhBpHxEGBGBJBBABEEEEBLPSXF`BhDpBxEEHBGBJEBEBIPQXB`DhBpDxEDBEBYPQXB`DhApExEKEJBXPKXB`GhBpGxEGEJBNPQXB`DhApExEKEJBXPGXB`HhBpGxEJEJBNPQXB`DhApDxEHBKEXPFXB`EhApHxEGBGBJBBABEEEEELXF`EhDpBxEYPSXF`EhDpBxEYPQXB`DhApDxEEEKBXPQXB`DhApDxEEEKEXPQXB`DhApDxEEBGBJBGAGEEEEBLPQXB`DhApDxEHEKBXPQXB`DhBpDxEHEKBXPQXB`EhApDxEGBGBYPQXB`DhApDxEEBHBJBGAGEEEEBLPQXB`DhApDxEEBHBJBGAGEEEEELPQXB`DhApDxEGBHBYPQXB`DhBpDxEGBHBYPQXB`DhApDxEEBHBJBGAGEEEEBLPSXG`BhHpAxGEDBGBJBBBBEBBBBEEEEBLPQXB`DhBpDxEGBHBYPQXB`DhApExEKEJBXPGXB`HhBpGxEJEJBNPQXB`DhApExEKEJBXPGXB`HhBpGxEJEJBNPQXB`DhApExEKEJBXPGXB`HhBpGxEJEJBNPSXG`BhHpAxGEGBFBJEEEBIXK`BhGpBxGEDBGBJEEEBIPQXB`DhApExEKEJBXPGXB`HhBpGxEJEJBNPQXB`DhApExEKEJBXPGXB`HhBpGxEJEJBNPMXB`GhOpBxHEGBBIPGXB`HhBpGxEJEGBNPQXB`DhApExEKEJBXPGXB`HhBpGxEJEJBNPQXB`DhApExEKEJBXPKXB`GhBpGxEGEJBNPQXB`DhApExEKEJBXPGXB`HhBpGxEJEJBNPQXB`DhApDxEDBEB_PS 8A0A(B BBBG 0xwAAN D DAG QDAIJ'Db0ܲ'DbH7TR J FLhBBD B(C0A8D 8A0A(B BBBA 8Do,̵kAD T CD  CJ 4 AAD@ CAC F CAF |8#BBB B(D0A8DPi 8A0A(B BBBD  8F0A(B BBBF  8F0A(B BBBE 8dBBD A(F0 (A ABBA dȽ BBE B(D0A8Jz 8A0A(B BBBB  8F0A(B BBBA \p `plMBBB B(A0A8DP 8F0A(B BBBA D 8G0A(B BBBE <XOAI R HAL NAAHH H<BBE B(D0A8J`c 8D0A(B BBBF `VJx3Am`(yBBB B(D0A8Kpc 8A0A(B BBBK K 8F0A(B BBBF DHl[(|[AAD O AAI <dDu G 4XAAD c AAE % DAF H$BBB B(A0A8FP 8D0A(B BBBD L BBB B(A0A8D 8D0A(B BBBA 8,@BBC A(J0W (C ABBG 0hBAD I0T  CABF :-Ak(AAL0R AAF (RAAG P CAC  hAV I C0\UAV I pPAV I i(p AAN0f CAF LMBBA A(O@s (A ABBB K (A FBBJ LTAAPpO FAH [ AAC W AAG ] 
AAA H<BBD A(G@ (C ABBG c(F CBBDA0Aj,AHBBG B(A0A8K 8C0A(B BBBA (AD0Aj`}7-Ak |AV I (,AAS P CAG 8AD ~ CJ g CF u CH C  BBB D(A0W (C BBBE 28D@LHJPR08F@AHEP`0H8D@LHJPR0U8D@LHJPR0U8D@LHJPR0U8D@LHJPR0U8D@LHJPR0U8H@KHJPU0U8D@LHJPR0O8D@LHJPR0U8D@LHJPR0U8D@LHJPR0L8D@LHJPR0U8D@LHJPR0t8D@LHJPR0 @|A J fdx xA J nHBBD A(G0W (C ABBA w8F@IHBP\0T'Aa,h BAD  ABI (4HAAJ W AAK `t:Dc I I BBB B(A0A8D`y 8D0A(B BBBD hWpExFP`+hBpExPP`IhGpTxHR`hBpExPP`KhBpExPP`hBpExPP`hBpExPP`++ $*KAT K _ A J$+ 0+_AI0 AG l. AGPW AH DX:Xd:wAW H R x:A I ](;AG@Y AF [A@<AAG [ AAJ o AAG g AAG  p<8G I 4( =BBD A(J0(A ABB8`  >BBE A(D0(A BBB8 >EEE D(D0w (A BBBK 8 ?BBE D(A0(H BBB$ x?]AAG QAA< ?P ?DABl ?DAB $@ $  @PAAG DAA$ H@PAAG DAA$ p@BAAG vAA< @BBB A(D0l (I BBBJ 8T ABBD A(J`p (A ABBG  lA, hABAD G0v AAB8 ABBD A(J`p (A ABBG  ,B,$ (BBAD G0v AAB$T BPAAG DAA$| BPAAG DAA$ BPAAG DAA$ CPAAG DAA$ (CPAAG DAA, PCxBAD JPb AABHL CFBBD A(M` (A ABBF o(A ABB D/Gg D# D( DZEGD FABEE)Ga8E;AyT8EhDE?A}HhEEHB D(D0e (A BBBE m(A BBB(E6BKA ]AB<EBBB A(D0a (A BBBE <PF7Au$XtFPAAG DAA$FPAAG DAA,FEDD p ABH |4G 8A0A(B BBBJ b 8A0A(B BBBD  8A0A(B BBBA $XHlAAJP]AA$HD d H 40>,AADps CAC  CAA 0h>BAA Dps  CABG D>@IAA W ABK q FBG  FBI D>\IAA W ABK \ FBD M FBC (,?kYAK D AAE HX? d 8A0A(B BBBA (? AAG0h CAK (? VAD`a AI [ AD <? ]BAD D0o  ACBH P AAB<@ 7Dd H lX@ fBBD A(D0Lpb 0A(A BBBE z 0A(F BBBA W 0A(F BBBD 0@ =AAI`S CAF PCA@ :Ax8A4 AG@S AD  FE L FF NFLTA "BBA A(G`H (C ABBC J (H ABBI LA BBA A(G@ (A ABBE L (A ABBF tAH BBE B(A0A8DPQ 8A0A(B BBBD  8F0A(B BBBH Y8F0A(B BBBLlB BBA A(G@ (A ABBE L (A ABBF B BBE B(A0A8DP| 8A0A(B BBBI  8F0A(B BBBF X 8F0A(B BBBI G 8F0A(B BBBJ PPC BAD D0Q  AABH l  FABC L  FABK `C ph 8A0A(B BBBH  8A0A(B BBBD PD BAD D0Q  AABH l  FABC L  FABK P\D@ BAD D0Q  AABH l  FABC L  FABK PD BBA A(G@Q (A ABBD  (F ABBD Eh 1E ,E (DE AAN@z DAA 4pE bAAN@t GAL e DAF (ET Hb F A O S E E 5i$E BAAG vAAF< Cw$Fx DS BAC M0s  AABF  AAB P> /4P> /H 8D0A(B BBBA 8P? QAG @ AAE K IAK <Q,@ qBAD J0s  AABH  AABL@QlA 7 BBG B(F0A8G 8A0A(B BBBB HQ\J `BBG B(F0A8Gq 8C0A(B BBBH @QpO KBAC N`!  
AABG P  FABG  R|Q KDn F R`@RQ BBE B(E0A8Dp 8A0A(B BBBH  8A0A(B BBBF XRS YOBA A(L0H (A ABBC n(A ABBDP08ST AAG0m AAH  AAH $D] G U(lA!jAD M FH CF<@l"AC G* K U K 48@"AG`\ AK 8XA"rBBD A(GP (C ABBH H(D"+ BBB B(D0A8JQ 8A0A(B BBBF  M"C|HM"EBBB B(D0A8Mv 8A0A(B BBBF S 8F0A(B BBBF  8F0A(B BBBG HtQ"9BBE B(A0A8J 8A0A(B BBBF H a"0BBB B(A0A8I`C 8A0A(B BBBH < d"+BEB A(A0 (A BBBF (Lg"-AGPx DD I DK (xh".AG`y DK L DH Li"BBB B(A0A8D 8A0A(B BBBI Pp"BAD G@  AABE g  FABH L  FABK HHtq"4 BBD B(A0A8Of 8D0A(B BBBB h|" d|" `|"B(Ђ|"RAG dAAH}"0$|"AAD z AAF HAA8XO6BBA A(G0"(A ABBp(}"BBD A(G0Q (C ABBG B (A ABBH W (C ABBA v (C ABBB LD~"KAD D ABI @H  AB_ ,X"}BAA M ABD H"BBE B(D0A8D@ 8D0A(B BBBJ 4Ԅ"QAJ Q CAG c FAA  0"L <"BBD A(D@ (A ABBE \ (F ABBA ,p܂"BAA } ABD L"BAD n ABH n CBE T CBG TCB\"KBA A(GP (A ABBE ` (A ABBJ PHP_ (A ABBC  (A ABBA x"s^v D F "rFA I `@@"AAG0 CAD X CAD } CAG "SQAH"BBB B(D0A8Dp) 8D0A(B BBBI HhuBBD A(G0V (D ABBE x(A ABB8H"RAD _AAHH cJA@"KBD A(J0J (A ABBD @(4"lAD G AC ZA<`L"BBB A(A0 (A BBBI ("`G QH a AF Ḧp"BBD A(L0h (A ABBE A(A ABBP"BAD D0  AABB   AABC {  AABA $l`"AAG AAD"BBB A(D0L 0D(A BBBJ ,܉"BAD  ABE 8  "tBBE A(D0^(A BBBHd"<<\"KDC O ABD XH  "?LL"AAD k CAK O CAE O CAE o CAE FrO"SAu J R84"mRZ P"cF` R cEdtؚ"BBE B(A0A8D`$ 8F0A(B BBBD  8C0A(B BBBF ܋p"T]k H C"sD m G ""`PHq A 4< "~AAD0A AAG T DAA th"# "bD| H E A Р"AU J č@"jihH"yBB A(A0 (A BBBG 58M@b8A0d0,"njBB B(A0A8GB 8A0A(B BBBF (4"NAAF I CAK lčX"BBA A(D@HDPBXH`U@L (F ABBD  (C ABBI _ (C ABBA d4ح"BBB B(A0A8DP)XG`NXAPbXG`NXAPH 8A0A(B BBBI H"sBBB B(A0A8Dp 8A0A(B BBBK (D"RAAD  AAK ¯BA@0\"<AG@rA,P|"F} E [ E WAPS"RNgQ@"RNjNH"BBE B(D0A8Jt 8D0A(B BBBE (Ƚ"AAD  AAE 00"AAD0s CAC FCAd"(Af,"#mBD A(G0{(C ABBb0T (A ABBF px0D (J ABB 8D0A(B BBBF $Pȭ7AAJ hAA0xt"MAD q ABJ X"cL"BBB B(A0A8Dpr 8H0A(B BBBL  8F0A(B BBBP A 8A0A(B BBBF c 8J0A(B BBBJ <X"KBD A(J0o(A ABBG"0"YBAA D0\  CABF ,<"dBAD  ABA |";AJ nAP0"BAA G  AABH   AABF x  AABD h" d"p"-"ԓ"5LD"AAJ S AAG T AAB ` AAF P GAH 8"Fr SP "p$dh"2AZ E C E A("AD L CD h CE "e4̔@"AG Z CK i AF e CH H"MBBE B(A0A8GV 8D0A(B BBBI P"D{ A S M mth"W\"'BBB B(A0A8J 8D0A(B BBBD `eDt##HD\ H [g CAD T AAB Y FAH Lp#3BBE B(A0A8J 8C0A(B BBBF |` #KBE B(A0A8G 8A0A(B BBBK D 8F0A(B BBBE hFP|#?PAD IP  AABI pHP[  CABG З# 0#JG@U AI d CI TCIÄD#BKB C(A0 (A BBBD  (C BBBJ Q (C BBBF R (C BBBE q (F BBBC L\# BBB B(D0A8D 8C0A(B BBBE L,#BBA A(D@E (A ABBK  (A ABBF H@# BBB B(A0A8J 8C0A(B BBBF H0,#JBBB B(A0A8D`* 8C0A(B BBBD 4ؙ43#AAG B DAH F DAE 03#BAD GPv  DABE HDx4#86#BBA A(D@i (A ABBG 4̚08#XAAG0 AAH [ CAA 8X9#iBBA A(G0 (A ABBD T@:#AAJ0p GAT U AAA S CAI W CAE DFA8:#BBB A(D0_ (A BBBG ԛH>#vFU E FJ>#AO H hS#KAD z ABC >B pU#)!CxAF[A3BBE B(A0A8ICsB_FfBC\BCOBwjfA\^CNoC_AAGV8A0A(B BBB,(~#jSAGDAAG,$h#BAA G@ AAB`T#BBB B(D0A8Lc 8A0A(B BBBA BBER#xHY G `؞d#zBBA A(G@$ (D ABBF  (D ABBH  (D ABBH @<#KBD A(J0@ (A ABBF h,#QEAD z ABI 4L#mAAJ@A AAA D CAA `#-QEB B(D0A8J8A0A(B BBBFH@L#AAD  AAK I AAE CAd<#@ 8A0A(B BBBF | 8H0A(B BBBS d#M BBB B(D0A8D 8A0A(B BBBA  8A0A(B BBBG L`|#7BBA A(Dp (A ABBJ   (A ABBF l#@Lġ# BBB B(A0A8D 8D0A(B BBBI L8#%BBD A(D0I (C ABBB ^ (F CBBE Xd#3BBB A(D0D 0A(A BBBH K 0C(A BBBK #HԢ#`l 8D0A(B BBBI ` #BBB B(D0A8J 8D0A(B BBBF jDBKN$(#iMT O MCHHp# BBE B(A0A8GX 8A0A(B BBBJ $$( 0$BAA AB8$<L@$LAG  AAH TCAHH $Tx$T<$^AC HG G  H G A ( $oBAD dABL(T $BBB B(A0A8D 8C0A(B BBBE (x$AAI | AAG $B$B̥$ $($AC M D  $ 4$H$\$p $ $"$$40$JAG N AAF xC X$FBBB A(A0H`  0A(A BBBD O 0A(A BBBA (@$SFAG }AACl$f<$K$x$jAAD aAA0$KAAQ Z AAA LIA$$|Ao@ L$jBAC \ ABK ] ABH NABdx$xt$&$<$5$<Ds̨$5Db J D0$lj$Yt$BBA A(N0 (C ABBJ  (A CBBA  (C ABBA  (A CBBA 8\"$BBA A(D`e (A ABBK ̩($L($BBB B(K0A8IN 8D0A(B BBBH 0l:$>Al C 
G(P:$QAD@r AAF |;$DV$;$.AAD eAA$;$;AAD rAA ;$XBNf B 0t<$BAK D@  AABG H<@=$BBE B(D0A8G 8C0A(B BBBA K$H\ D XTL$BBB A(A0G@ 0D(A BBBB D 0C(D BBBG 8M$IAD G0v AABAH0H@LN$BBB B(A0A8FP 8D0A(B BBBH LO$\BBB B(D0A8J 8D0A(B BBBA ܬU$JAG0@AV$JAG0@Ad0V$BBE A(D0G@z 0D(A BBBF HQPGXA`U@c 0D(D BBBH HW$eBBE B(D0A8JPf 8C0A(B BBBD $ЭY$*AAQ QDA$Y$*AAQ QDAL Y$BBE B(A0A8J 8D0A(B BBBI LpL`$BBB B(A0A8M 8A0A(B BBBA Lb$BBE B(A0A8J 8A0A(B BBBH Lh$(`\k$AAD@ DAA @0l$8BBB A(A0DP  0D(A BBBG Я,n$bDu A n$]As A (n$AAJ | AAF |48o$BBB A(D0R@ 0A(A BBBA D 0F(A BBBG D 0C(A BBBJ D0G(A BBBpo$qBBA A(D0 (D ABBE \ (D DBBH ^ (G DBBK d (G ABBE |(p$PBBG B(D0A8T`D 8D0A(B BBBI N 8G0A(B BBBJ  8A0A(B BBBH t$LBB B(D0A8PPGWA~ 8D0A(B BBBG IcAFl4w$S 8D0A(B BBBA hEhB]IYA,z$BAA L ABA Բ({$JR D sE`{$BBE B(D0A8D@| 8C0A(B BBBD R 8D0A(B BBBA \\0|$5TBE B(D0A8OP8A0A(B BBBCPPH}$BBI B(D0A8Mx 8D0A(B BBBJ 4$AAI0f AAE Z AAA (@,$AAD@ CAA ,l$_AG: AE T AC $pl$]As A ̴$]As A D$BBB B(D0A8P@k8D0A(B BBB40L$AAI0s AAH c AAA Hh$H$BBE B(A0A8Np 8D0A(B BBBG $~\$jAAD z DAK P DAK B DAI F DAE I DAB t($\$$%BAA ] ABD v ABG F ABG I ABD A IBE \$ BAA Y ABH K ABJ F ABG X ABE A IBE \H$KBE A(G0(A ABBEH0D (C ABBL L4$KBE A(G0 (A ABBF PH0H$AAD F DAG F DAE D IAJ IDA\DH$BAF G0f  AABF D  FABC D  CABF D IABH$AAD O DAF F DAE D IAJ IDAH$AAD @ DAE F DAE D IAJ IDAH<@$AAL O DAF I DAJ F DAE DIAL$SAJ0v EAF BAADH0T IAE HعD$AAD F DAG F DAE D IAJ IDA\$$%BAA ] ABD v ABG F ABG I ABD A IBE DX$BAA B ABG U ABH F ABG D̺$BAA B ABG U ABH F ABG Lh$BAA I ABH F ABG A IBL IABLd$AAD S DAB c DAH F DAE D IAE L$AAD S DAB c DAH F DAE D IAE \$%BAA ] ABD v ABG F ABG I ABD A IBE LdX$BAA j ABG G ABF J ABK aIBH$AAD ~ DAG F DAE D IAJ IDAH,$AAD ~ DAG F DAE D IAJ IDALLp$BAA U ABD F ABG I ABD AIB$#Hܘ$AAD ~ DAG F DAE D IAJ IDAH $AAD ~ DAG F DAE D IAJ IDALHd$BAA U ABD F ABG I ABD AIB$#Й$,A^HȾ$RAC N0m AABKH0i AABKH$3KU0l$3KUL$3KUh$3KVlؚ$BBF A(L@ (A ABBI D (F ABBA D (C ABBD D(I ABBX$ 0T$TAAR Y GAS DFA<$o^Z(T؛$ZAl C F J I G IH $BBE B(D0A8IP; 8D0A(B BBBG l`$BBA A(D0 (D ABBD D (D DBBH D (G DBBE \(I ABBL<$AAD [ DAJ c DAH F DAE D IAE \p$1BAD D`l  DABB ;  JABH Q  GABE hApWhA`PP$xBAD DmNAl  DABF   GABK L@|$BAA O ABJ F ABG A IBL IABL̦$BAA o ABJ G ABF A IBL JABL<$AAD S DAB c DAH F DAE D IAE L0$BBB B(D0A8M P 8A0A(B BBBD L$ X$xAq V O M P$=BAA  ABH  ABF G ABF i IBE H $AAJ c CAM \ GA\ D CAH JCALX$BAA  ABD F ABG I ABD IAB$#$#L$_BBB B(D0A8Mk 8A0A(B BBBA H ̹$AAD ~ DAG F DAE D IAJ IDAHl$AAD ~ DAG F DAE D IAJ IDA\T$B|AX 8D0A(B BBBF <4$AAI T DAD I DAJ DIALX$AAD [ DAJ c DAH F DAE D IAE d$ BAA y ABH F ABG N ABG p ABE I ABD AIB $D A K V A 4($TDq A Pl$RIYNL L I4t$*AAJ M CAC  AAL L$BBA A(JO (A ABBK D (C ABBD P@$BAF L@  DABC Q  GABE D  IABA @P$AAF0X DAK P DAK I DAB HH$AAD ~ DAG F DAE D IAJ IDA8$BAC ^ ABI ^ ABA L$BAA N ABK G ABF J ABC AIBLlP$KAD ] ABP V ABG A FBG `L$BAA N ABK G ABF J ABC AIB, 0$BAA F0v DABH<$c AAI ~ IAH DFAEL$BAA N ABK G ABF J ABC AIB|L`$ TBB B(A0A8Dpd 8I0A(B BBBJ Y8A0A(B BBBEHpl$BBA A(D0n (D ABBG D (C DBBI D (I ABBN D(F DBBl<`$BBA A(D0n (D ABBG D (C DBBI D (I ABBN D(F DBBL$BAA N ABK G ABF J ABC AIB $;FcD$;Fc4h$3H$#\$#p$#$#$# $#H<$BBB B(A0A8Dp 8D0A(B BBBE  $ $4$(H$sFAI NEALtH$wAAQ  FAM F DAE E DAF xIA`x$BAA D0  AABJ A  AABK M  AABG z  AABA t($$ BBE B(D0A8MP 8P0A(B BBBQ F 8A0A(B BBBH D8K0G(B BBB$$$$ $ X$BBD A(J0 (P ABBK G (D ABBH D(S ABB\`\$BBE B(D0A8PWIAD 8D0A(B BBBH ( $Gx A F J A O C<$AAF0C DAH Q GAG IDA,$3KUH$3KUd($3KUL$3KV\p$aBAA  ABC @ ABE F ABG a ABD A IBL @$AAD0A DAD f DAE u IAI $@$TGh Q F J CLhT$BBB B(D0A8J 8A0A(B BBBK H$KAF L@  AABH D FABCC@Ph$:BAA D@  DABD D  IABH D  FABA `XT$BBB A(D0 (D BBBJ m (D BBBI _ (D BBBG 4$BBA A(F`(D 
ABBX$T$P$ 0L$ $5Ac$5Ac$aBBB A(A0 (D BBBG j (D BBBD D (D BBBB b (D BBBD A (I BBBE `P$ 8I0A(B BBBF m 8D0A(B BBBF <8$\BBB A(A0 (A BBBD X$)Dd p$ l$\4h$BBD A(GP (D ABBA D (C DBBA `(D DBBH$AAD | DAI F DAE D IAJ IDA$%IO@$AAD W DAF F DAE R DAA L<$SAJ0{ EAI BAADH0L IAE H0$_BBE B(A0A8J@ 8D0A(B BBBE $D$TAl K P H C|$0x$JAI0pAAJC0$H$FG DAFC pL$3Aap$ l$rAn Q Q M $zAs T Q M $$#H@$AAD @ DAE F DAE D IAJ IDAH$MKBB B(A0A8Jpn 8A0A(B BBBK  8A0A(B BBBD  8F0A(B BBBE t 8F0A(B BBBE CpH$$%BBE B(D0A8M 8D0A(B BBBJ D$BBA A(D0 (D ABBF D (C DBBA L (D ABBK D (I ABBN T (D ABBC T (D ABBC H$BBB B(A0A8F 8D0A(B BBBF D($BAM  ABM { ABB A FBG pL$kFc G x($JI s AI DCI@$9BAD G0#  AABK D  CABF <%BBA A(I (D ABBH LD%BBG B(A0A8O 8D0A(B BBBD <8%sAAR x AAB D FAE DIA4x%AAD@[ DAJ V DAE @ %AAD0 DAC I DAB F DAE PP%pBAA D@  DABF I  DABH D  FABK L%AAD ^ DAG c DAH F DAE D IAE LH%AAD S DAB c DAH F DAE D IAE DD%|FAD |AAGH D IAJ DFA%BBA A(D0 (D ABBA D (C DBBA L (D ABBK n (D ABBI D (F DBBF d(I ABBH  %BBB B(D0A8M 8D0A(B BBBD \l %BAA d ABE G ABF F ABG X ABE I ABE  % \ %BAA D`  DABE ^  IABF D  FABK Q  GABA L@<%AAD S DAB c DAH F DAE D IAE L%AAD S DAB c DAH F DAE D IAE $<%bAAI QDA(%[FAI ~AA,4%RAC HG D Ld%BBE B(A0A8Du 8A0A(B BBBH <h%JAK AAFH D FAE %PXH`aXAPGXG`[XAP 8F0A(B BBBH F 8A0A(B BBBH { 8F0A(B BBBA @%AAD Y DAD F DAE K DAA < %AAD J DAK D GAL FDA\% $X% H8T%BBB B(D0A8JP@ 8D0A(B BBBD \% BBD A(R0~ (D ABBF D (F ABBI R (D ABBE Lh%SAJ0t HAE BAADH0d IAE H4%xBBB B(A0A8DP| 8D0A(B BBBI HL%FAI D AAJ DFAEH DIA|%BBA A(G0U (A ABBH D (F ABBA M (C ABBK D (D ABBK D(G ABBL%7Gb G Fl%%%x%BBE B(A0A8D@ 8D0A(B BBBH ^ 8D0A(B BBBM T8C0A(B BBB$ %8$ %LL0 %BAA J0  AABC m  AABG T CAB`!%BBE A(D0 (D BBBC  (D BBBH K (D BBBK $"%AAAD uDA$("%AH Ch D PL#%9dx#%(x#%VBAD ICB<#%AAL Q EAK D DAG DCA$%@4$%"A`H$%>(t$%%QO|D$%AH G H H M K H H M K d L H H H H H H H H H H H H H H H H H H H H A E ,%%RFAG i AAG P0%%RGDD f ABH P(%%0<%%UAAG r DAH GDA$p&%/AAF bCA($&%CFAJ cCAH@H&%KBD A(J0{ (A ABBC @&% &% 0&%/QYL&%=Qgh&%&QP&%.QX  '%^Au J H H H'%EQo|'%MkH'%8Qb4(%AF E D Q G V P|)%NDl H Up)%,)%dD\ H S A $*%YIG fAIH (P*%dAAG 5 AAH +%LBBD A(O@m (G ABBW [ (K ABBM G (D ABBH e (A DBBJ V (A DBBA \H% 4XH%[FAJ l DAF DCAHH%<AY F WTI%{hR lpI%D C U K P,J%]Dm Q pJ%|J%+J% <J%SAAJ U AAE O CAE DGAP(J%BAD W ABG i ABD V ABG I ABL @|pK%|BAD b ABD J ABK R ABC \K%BAD G0o  CABE c  CABG ]  EABC D  GABJ \ BBE B(D0A8P 8A0A(B BBBC Hhā&BBB B(D0A8LP 8D0A(B BBBF @X&BBE A(D0JP 0AXB`FhBpTPGXK`]XAP_ 8D0A(B BBBM ` ̆&5Dl<x &BEB A(D0f (A BBBA  D&P P&'S l&TDb A h &c 8A0A(B BBBF :GDDDDQh!ԋ&:BBB B(D0A8O] 8A0A(B BBBE  GPBJGGGGV^G]A;G`AlGVB"d&BBB B(A0A8Gj 8A0A(B BBBK GDDDDQ2GDDDDQ\"&BBE B(D0A8G 8A0A(B BBBF LA\#&BBE B(D0A8P 8A0A(B BBBC pBtA`#\&,Aj|#p&:DuL#&"BBA A(D@ (A ABBK ` (A ABBJ #x&#& $& $|&%4$&1](H$Ĭ&CAAD Y HAG (t$&AAD0 AAA $\&($h&mRAD qNB($&{QAJpK AAG 8 %&A(D0 (A ABBH ,%&zKAD ^ABG$%ذ&AAR DA%`&`%&sA@ J & &sA@ J H0&l&BBD A(M0q (A ABBC l (A AZABH&&e 8D0A(B BBBD  '&G H 0('d&;BAD I  AABA 8\'p&zBBD A(IP (D ABBF '&@IT C PL'Է&ABBB B(A0A8D 8A0A(B BBBA (Ծ&cA O d$((&BBE B(D0A8P@ 8H0A(B BBBO HRPFHA@HEP[HA@8(&{BBA A(D0I (G ABBH (&QDt E (8&((4&5BAD fAB($)H&BAA  ABS P)&(d)&5BAD fAB)&^A J )&[A| J )D&^A J )&zAG J *&AP *&[A| J <*0&^A J X*t&^A J ,t*&6BAD { ABK *&AR J 4*H&sBAA w HBK A GBE (*&8BAD iAB(+&(<+&8BAD iABh+&(|+&8BAD iABD+&BBB B(D0A8M@8G0A(B BBB+&:Da A H ,&4BBE B(D0A8LP8J0A(B BBBDX,&BBE B(C0A8P@8A0A(B BBB,|&40,&bJAG wAAAB ,,&fRAD p DBE -$&AO84-(&vBBE A(D0`(A BBB4p-l&xBBD A(S0U(A ABB(-&EBAD zAB,-&BAD  ABE .x&(.&AG  AJ a AF D.h&1A[ A L`.&OBB B(D0A8J 8A0A(B BBBA \.L&KBBB B(A0A8MDPA 8D0A(B BBBA /<&gA[ D F0/&SD/&nAl`/,&t/(&-8/D&BBD A(P0O (G GBBF 0/&BAA D  
AABE /D&% 0`&- 0|&-40&=xH0&BBE B(D0A8J`hApchA`D 8D0A(B BBBG hQpIxA``RhHpAxDW`d0h&! 8A0L(B BBBJ L,1`&BBD B(D0A8J 8A0A 8A0A(B BBBH H1t&BBB B(A0A8D`Z 8A0A(B BBBF 02&BAD KH  AABB H2&BBB B(A0A8DAqFAH 8A0A(B BBBG S2&#A]D2& AC HD A  A  G  H H<3&BBB B(A0A8DP 8A0A(B BBBE H3&:BBE B(D0A8Gp 8A0A(B BBBE (3x&BAD ABH4&ZP) 8A0A(B BBBL L4' BBB B(D0A8G 8A0A(B BBBE oQBA]EPBA]UTFA_xJAC]AjAAlBAjAH$58'VBBB B(D0A8DP:8A0A(B BBB(p5L'BDG AB85'DBB H(A0x(D BBB5D' 5@' 6<'t6'BBB B(D0A8GED\Aj 8A0A(B BBBB  DGD\6l''BBE B(A0A8G 8D0A(B BBBD d07*'BBB B(A0A8D` 8A0A(B BBBA q 8A0A(B BBBE 87$.'BBD A(G`" (A ABBH \7/' BBB B(D0A8Gc 8A0A(B BBBG AmOH48;'3BBB B(D0A8M 8A0A(B BBBH 48?'BEA A(M0(G ABBH8d@'BBE B(D0A8M` 8A0A(B BBBB 98C'DEB L$98C'DKB ID98C'DKB Id98C' x94C'_d9D'(BBB B(A0A8D( 8G0A(B BBBJ b 8A0A(B BBBD <9HK'FEBB D(A0 (D BBBE 4:XN'tDG A P:N'fd:O' BBA A(G0J (D ABBH t (G ABBH  (G ABBH X (D ABBO D (D ABBE H:Q'BBB B(D0A8L 8D0A(B BBBF 8;S'DQ8P;';R'BBE A(D0G 0D(A BBBD \;U'n BBB B(A0A8J  8A0A(B BBBE $ N _ A p4<`'g 8A0A(B BBBF eKBEy`K^AX<p'.BBD A(G08I@BHEPR0M (A ABBF D (D ABBE =p'"BBE B(A0A8G  D G D F D Y  8A0A(B BBBA / E { A   z g B <=x'TAD Ju AABDH >܌'BBA A(Jd (A ABBF l (A ABBF p><'8,BBB B(A0A8J  O S B  8A0A(B BBBE  B s B  Q a + P i B  I d A # B  B V B r B s C C C C B `  O T A t B a B P  P M | B ^ A   A g   P P  B D B a  B  A  Q X E  C C C C B [ [  Q Z B B b A   E K  J G G G G V O B [ B i  B ^ B l t@x' DK P@|' DK P@'vBBB B(D0A8Gp" 8A0A(B BBBH  8A0A(B BBBK xBBBXp^xNXxBpxHfxApxBZAIpt xBBJ dAH'DG QAL'5BBE B(D0A8G 8A0A(B BBBE  LBGBNMHeAhLcBBGBNxLaA8B'DN KTB'DK IpB'DK I$B'2A]D FHAB'DG Q$B'2A]D FHAB'BBE B(D0A8I# 8A0A(B BBBJ ROiBJBGQVIiI<C'BAD LBYL AABBHC'BBB B(D0A8Jx 8A0A(B BBBG  D'H Dl'(lD'[BAD ~ ABH HD4'0BBB B(A0A8D 8D0A(B BBBI @D'BAD KE  AABE c  AABA L(Et' DA dABEE( 0A8G CCCCCUd 8A0A(B BBBB `F(BBB B(D0A8D` 8A0A(B BBBC 8A0A(B BBB(GD(@BBB B(D0A8P@ 8G0A(B BBBM Dax(JAG  AAE Y GAG @B 4aP(yBBD A(P0V(D ABBdb(TLBB A(A0D@m0A(A BBBEP@ 0A(A BBBA (lb(OO B M C `PDb$(BGD U CBI V DBD  FBG b|((b(hAAI F AAA 8 c\(VAA  ABB AFBG\c0(Qpc|(hkh A JXc(sbBH A(D0L 0F(A BBBE I 0A(A BBBG Pc(?BBB A(D0J8D@DHDPDXD`I0 (D BBBK @d(WAo P \d (Ax W xd(Ax W d(A} R A\d(DBBE A(D0 (G BBBF B (J BBBV A(D BBBex(DN,e(@e|(DQXe( le(DQe( e(DQe( e(DQe( e(DQf( f(EDl A 8f(=Dd A 0Tf(CBAA D0  AABF f)^f)'DQ K F<f)0EEE D(A0 (A BBBK $g)fAAP QAA(g)D A (Dg)JRAI cAAG@pg)BAA Dp@  DABA   DABD HgX)tBBA A(I0 (A ABBI (A ABB8h)EAA " ABD g ABF L)BBB B(A0A8M 8A0A(B BBBD PGGUL4nN)BBB B(D0A8G' 8D0A(B BBBH 0nHP)BAD I  AABD (nQ)sAALbAA nXQ)DAFps AE hoQ)wBBB B(D0A8LIDDDDI 8A0A(B BBBC LtoY)'BBB B(A0A8J 8A0A(B BBBF Lox])BBE B(D0A8S] 8A0A(B BBBF @p`)BBB A(A0D0A(A BBB(Xpa)GBAD xABLpa)EBBE B(A0A8M& 8A0A(B BBBF ppf)YBBB B(A0A8JZGCA- 8A0A(B BBBH yKVA(Hqj)9AAJ V AAA 4tqj)sAAD ~ HAK D AAA @qk)BAA Dj  AABJ t  AABA $qk)(AAJ YAA4rk)BAA D  AABA rp)4BBB B(D0A8R 8A0A(B BBBE GqBpAcAgGnDd$sxu)BBB B(A0A8JP  8A0A(B BBBB s 8D0A(B BBBE Xs{){BA A(J`\(A ABBEH`X (A ABBJ Ls})*BAD L  AABB f  AABF U AAB8t})GBBB B(D0A8Je 8A0A(B BBBJ ZCFCEC^CFCEC[t)gBBB B(D0A8MBBE_ 8A0A(B BBBA GGDDDDI 8G0A(B BBBJ \u)BBE B(D0A8P E 8A0A(B BBBA  D G D F D Y  G D G D F X |u)BBE B(D0A8O 8A0A(B BBBE .FAKSXCCCCBe`dv)BBD A(G@3 (A ABBG  (D ABBH  (G ABBH dv4)p 8A0A(B BBBK xKDDDDSph0w\)BBE B(D0A8O 8A0A(B BBBH @HDDDDIw)UBBE B(A0A8JCCCCCYDCCCCBQECICCkm CECCCYCECEC`JBEGUBDBYV 8A0A(B BBBB )DGDFDSBBEeCCCCF^hHBET IGBM ECCCC]LPOXBBEjBHGXugCCCCIDy*KAD0qJAT0n AAH DGALy* BBB B(A0A8M 8A0A(B BBBE L8zd*BBD A(G (A ABBH E (A ABBE Lz$*BBE B(D0A8JJ 8D0A(B BBBG Hzd*BBE A(D0 (A BBBH R(A BBBp${*>BBE B(G0A8Jw 8A0A(B BBBB BaAEdB{(* {(*"BBE B(A0A8MCFCFCY 8A0A(B BBBD FCECCYAEB_`L|`E*^BBE 
B(D0A8P`u 8S0A(B BBBO R 8A0A(B BBBD |\J* |hJ*8|tJ*AAMy AAF I AAE }M*$(}M*\AAL HDA<P}M*BBE A(D0 (A BBB] ,}N*BAD  ABA H}\O*BBE B(A0A8Lp 8G0A(B BBBR L ~Q*BBB B(D0A8L 8A0A(B BBBC \~pV*wAFPkDL|~V*"BBB B(D0A8P [ 8A0A(B BBBF t~Y*sFEBZ` 8A0A(B BBBH ESANLD8\*BBB B(A0A8LQ 8A0A(B BBBG H_* BBE B(A0A8J 8A0A(B BBBD (\i*DBAD yABH i*BBB B(A0A8D 8A0A(B BBBA 8Xi*BBD A(P@ (A ABBE hj*BEGEHENP 8A0A(B BBBA 4k*BBD A(MPq(A ABB$8Tl*LAAO0xAA,`|l*AF HK A 4m*AC HK G  K (ȁ4p*:AC G F 0Hq*BAC M0  AABH X(r*0BBE B(C0A8LKB_8A0A(B BBB,v*8AC HK E ,x*AF HH K Lz*BBH B(A0A8G 8A0A(B BBBH ,4x}*AF HH K ,dX*IAC HS F 4x*}BBG A(L@^(A ABBH̃*BBH B(D0A8D`_ 8A0A(B BBBH HD*gBBH B(D0A8DP 8A0A(B BBBF Hdh*eBBE B(G0A8GP 8A0A(B BBBA <*BBH A(D0JP0A(A BBBH*C HD F ,l*AF HH K LА*BBH A(D0G@HEP[HA@0A(A BBBl@*BBH B(D0A8GpxBBnpqxKBEUp8A0A(B BBB,\*UAI HR A ,*AI HGY F X *@BBE B(G0A8GpxTJASpN8A0A(B BBB4*5AC HK F R F ,P *kAF HK K 4L*2AC E D w I  I ,T*AC C D  K L* 8A0A(B BBBD 48*cBAD \ ABJ ZKB`p*ABAD J   AABB _  AABE Y  AABK R  AABJ $Ԉ*aAAG FLAL*$BBD B(A0A8J  8A0A(B BBBK PLв*BBE B(D0A8P@`HHPAXD`X8A0A(B BBB,,*{AC HD F LЉ|*BBE B(D0A8I' 8A0A(B BBBF  <*Xpx4@*AC HN D l D Lxt*BBE B(E0A8J 8A0A(B BBBF \Ȋ*BBB B(D0A8G6 8A0A(B BBBD MkAL(*=BBE B(D0A8J 8A0A(B BBBF Hxt*L`BRHLADFDBVL 8A0A(B BBBJ PGGGGGfK]AkQDIBFDG^+ZBBhiBHBU GBBZLJGGGGQHJGGGGN|*f  G G G G G a   U W B G 8A0A(B BBBA  P D B B G D G f g K ] A   M D B D B B B a | J G G G G ^ K L _ A X M L D D B B B D G [  Q D G B F A D G G D G a  E B E c L*BBB B(D0A8Qd 8A0A(B BBBD L(* BBB B(D0A8J] 8A0A(B BBBB L4*(BBE B(A0A8J? 8A0A(B BBBH +^AE J I+]< 8A0A(B BBBB p 8A0A(B BBBD M 8A0A(B BBBA Hx +BBD B(A0A8Opb8A0A(B BBBL +#BBG A(J@ (A ABBH  (A ABBD H@+BBD B(A0A8O8A0A(B BBBL +LBBB B(A0A8G`V 8A0A(B BBBG ܑ +d,+BBB B(D0A8D 8A0A(B BBBE  8A0A(B BBBA $Xd+tAAP PLAL+BBE B(A0A8IR 8A0A(B BBBF `В\+;BBE B(A0A8JWKBES 8A0A(B BBBF $48-+AAN vDA\-+,Af4x-+BBD A(O@(A ABB,.+BBG A(D0L@r 0A(A BBBJ Y 0F(A BBBJ  0F(A BBBJ Y 0F(A BBBJ 845+R~L5+L` 6+\ BBE B(D0A8J a 8A0A(B BBBK $A+Du G t L O I xؔB+BBB B(A0A8DPn 8G0A(B BBBL O 8D0A(B BBBD K 8C0A(B BBBQ TF+% 8A0A(B BBBI JAEZBHBRBDBYnJGGGGQ,AGB`MJDDB^YDDBDVGJDGZCO[APLKBIDPDGDGDZ^MJD^iG[AHܖ+B A(A0D@0A(A BBB,hk+fBAA W ABA dl+BBB B(D0A8GY 8A0A(B BBBI  8A0A(B BBBF Tlp+BBB A(D0M` 0A(A BBBK 0A(A BBB,Xr+_AC HG F HDs+dBBB B(A0A8JPx 8A0A(B BBBJ XԘhu+BBB A(D0JS 0A(A BBBH  0A(A BBBA H0 w+_BBB B(A0A8M  8A0A(B BBBI H| +IBBB B(A0A8M M 8A0A(B BBBB `ș$+ Y 8A0A(B BBBF  G G G N \, +d$BBB B(A0A8M 8A0A(B BBBK *HQB(+BAD gJBH+BBE B(D0A8P8A0A(B BBBLh+BBB B(A0A8D  8A0A(B BBBF 4T+OEBA H(T0d(A ABB\+BBE B(D0A8MA 8A0A(B BBBH GiA+BBE B(D0A8JODEAFUeLkA^ 8A0A(B BBBF OADZx| + BBE B(D0A8M  8A0A(B BBBG .KmACCCCCIL+BBB B(D0A8Ji 8A0A(B BBBF (H$+lBAN SABHth+KBBE B(D0A8MP 8A0A(B BBBE 4l+BBD A(PP(L ABB`+Z8A0A 8A0A(B BBBE `+BBE B(D0A8G 8A0A(B BBBD (EBEV8 @+DBBD A(GP (A ABBD HT+JBBG B(A0A8G_Bo_ByBofBBUDP|BEB\AWZGADb8A0A(B BBBA0 +BAA G@  AABA 0@|+dBAD G@:  AABD t++++LĠ+JBBB B(D0A8I@ 8A0A(B BBBH 0,IBAD G@6 AABH,\,p,(,AAG0 AAG ` ,`BBD A(G (A ABBD  BlAHBGX ,>(,((< ,aoAM SCADLhd,BBE B(D0A8UB 8A0A(B BBBG 4,BBD A(GP(A ABB<L,hBBE A(D0L@B0D(A BBBL0|,6 8A0A(B BBBA `L,NEBB A(D0 (A BBBN n (D BBBH  (D BBBC 08,AAI F IAM _DA(,MO D P`VJLD(,MBBC A(O0{ (A ABBH  (A ABBE L(, BBE B(D0A8L 8A0A(B BBBG D` DF mG``,4,sBBB B(D0A8O 8A0A(B BBBF  BHBZĥH;,)إd;,)TT|;,ccO^P;,$X<,8D=,LP=,ZDB J Idl=,` 8A0A(B BBBF  8A0A(B BBBF |ԦA,BBB B(D0A8Jph 8A0A(B BBBG  8A0A(B BBBF I 8A0A(B BBBE T(F,BBE B(D0A8Gc 8A0A(B BBBD f 8A0A(B BBBH  8A0A(B BBBH bBDBY`, 8A0A(B BBBD  8A0A(B BBBI \TR,BBD A(GPZ (D ABBE jXB`HhBpUP! 
(D DBBH H8T,ZRBB B(A0A8G 8A0A(B BBBB L[,x [,qG i AF 8$\,ViP G DTh\,LBD A(G0(A ABBL0x\,v2YUIINJ 8A0A(B BBBA XIIN$,AAG0AA8@l,8BBD A(K` (A ABBD 8|p, BBC A(O (A ABBC 4D,RBD A(D0(A ABBL,BBB B(D0A8J 8A0A(B BBBJ D@l,BBE A(A0G 0Ak 8A0A(B BBBJ Hԫ,BBE B(D0A8M 8A0A(B BBBG @ \,BBE A(D0Ju 0A(A BBBK @dت,*BBE A(D0J 0A(A BBBF 0Ĭ,BAD J  AABG (ܬP,KAAL AAG @t,BBB A(D0Mr 0A(A BBBF DL,BBB A(A0J   0A(A BBBD 8X,BBC A(O(A ABBXЭ,) BBB A(A0Dh 0D(A BBBA | 0D(D BBBF L,,BBB B(D0A8Kpp 8D0A(B BBBK <|,BAA G  AABI ] AABH,`(8D0D(B BBBH,BBE B(A0A8F8A0A(B BBBHTh,nBBE B(D0A8JP 8D0A(B BBBH  8D0A(B BBBO \ 8D0A(B B { 8A0A(B BBBD LTh,BBE B(D0A8M< 8D0A(B BBBJ ,9,+̰,[DGDFDRDDDDDI|X,BBE B(D0A8MwHBDV  8A0A(B BBBB SCCCCC^ر,BBE B(A0A8G  C F C H C a  C F C H C [ D D D D G D ]  8A0A(B BBBI |0,yBBB B(D0A8J  C C C C C d [ C F C H C ^  8A0A(B BBBE (,BBE B(D0A8R  D D D D G c  C C C C F ^ ` 8A0A(B BBBH T D D D D G c ,+Db$ij(,^Aj E I G I A @`, BBD A(A0G 0A(A BBBC (0,,AGu AB A\,"`p, BBE B(D0A8L  8A0A(B BBBG  K B B f lԴX-MBBE B(D0A8GzDJAv 8A0A(B BBBH LDUA`D8-V B   A @-AAD AAI _ AAG IAA(@-AAL AAA 0$-BAC I  AABF L-~DE G i4l -fAAGM AAH AA(X-ADpJ AH ~Aж -~DE G i(l-ADpJ AH ~A ->BBE B(D0A8G 8A0A(B BBBC QDDZ~K^ANMAU$-BBB B(D0A8G= 8A0A(B BBBE OBDU|K^AOBFU0l+-MBB B(D0A8N 8A0A(B BBBE [JdBEXQB[BA`B`A$ܸ2-jAAI0\AA<2-BBC A(G (A ABBD pDx4-*BBA A(I@B (A ABBI ZHYPFXA`K@|HKP]HA@}HAPAXD`K@|HLP]HA@846-KBBA A(D` (A ABBA LH7-BBB B(A0A8Jb 8A0A(B BBBH HDF- L-BBB B(A0A8G 8A0A(B BBBD 0O-DO-0XO-BAA G0w  AABJ @Q-Fr|Q-FrQ-"PBE B(A0A8GAGDbHDAIAbAGB_PA 8A0A(B BBBJ C^CFCFCUs- BBB B(D0A8G BDA]DDDDD[5SAl 8A0A(B BBBE \,|-KA(0-iAAP A AAK t-?̽-!DQ B I-}`-!BBB A(A0s (D BBBK G (A BBBB Y (A BBBH d-3x-n(p-AAN  AAA -FDS I I G \ܾ-,BBA A(G0 (D ABBE X (D ABBG D(G DBBH<->BBB B(A0A8DP> 8A0A(B BBBJ ԭ-eL0-BBB B(A0A8D 8A0A(B BBBK -4Dc I Զ-4Dc I $-4Dc I @-4Dc I \@-4Dc I xd-4Dc I --Ak$-EAAM oDAķ-EFc G P(-Dm G f B N B f A $-D B L@-DBBB B(A0A8IN 8D0A(B BBBB $-OAAQ kFA4Կ-NAAD0Y DAD  DAH-;BBG B(A0A8DpL 8D0A(B BBBD <- D[T-h-)\|-jBBD A(G0^ (G ABBF p (D ABBG H (D ABBG x -BBB B(A0A8JZ 8C0A(B BBBF LFFb 8F0A(B BBBD LX-[BBE B(D0A8M{ 8D0A(B BBBK T-BBE B(D0A8JTQ8A0A(B BBBAL,-BBA A(J@Q (A ABBI B (A ABBH 8P-RBF A(D@(A ABBX-BBB A(A0DPb 0A(A BBBJ  0A(A BBBE -DAn A H- X 8A0A(B BBBE Pl-D@ L nLp- BBB B(D0A8J 8D0A(B BBBF X-BBE A(A0D@h 0D(A BBBF  0D(A BBBF X-AG H A W A O d D e K e K e K ~ B A L | G C A Lxt-YAD u FBE pH A CBB pD-ZBBL A(A0x (D BBBD  (A BBBM R (A BBBO  (D BBBE ^ (A BBBK T-.h4- |0-,-'H-)d-p-|-78-fBBA A(D0L (C ABBB (0-RAA CBT\~BBB B(D0A8I@HSPBXD`Q@U8A0A(B BBB4bBBD A(J0E(D ABBH-BBA A(D0L (C ABBB (A ABB88-BBE A(D0(A BBBHt*BBE B(D0A8Pp8A0A(B BBBRXP`+8A0A(B BBBH`].BBB B(D0A8D8D0A(B BBB(@-AC GU H L-BBA A(D@f (C ABBH L (D ABBK 0(-IBAA D@  CABG \-,p-$8-DrQp-EsQ-C-Gdb-AlTl_TH<-9 BBB B(A0A8DN 8C0A(B BBBH \0.D$tAADP{DA,BBAC I0 DAB(.AAD  AAE D .{Ai F JH .zBBB B(D0A8F`^ 8D0A(B BBBB (d .AAD0B CAD d.XA`IXAPY 8D0A(B BBBA DXZ`BhCpIPH.BBB B(A0A8DPS 8C0A(B BBBC HD(.zBA A(G0I(C ABBJ^0HBBB B(D0A8R\8D0A(B BBBLP.BBB A(A0i (C BBBF  (A BBBG L,.|`.nAR UCAH.Jpi 8C0A(B BBBJ 8".BBA A(D`L (C ABBB ,#.~DQ K I A HP$.xBB B(A0A8D`Y 8C0A(B BBBA H`&.P 8D0A(B BBBD 0(.BAD JPj  CABG \`).UBB B(A0A8Jf8C0A(B BBBGHH|0.RBB B(A0A8D@p 8C0A(B BBBF ?,FdA4OBBA A(G0(A ABBl1.6O 8C0A(B BBBD ,bjAPZnAA.>D e A dB.KBB B(D0A8Jq 8A0A(B BBBE C0|H.BAA Dpe  CABE HDI.BBE B(A0A8D`" 8C0A(B BBBA dJ.LBE B(D0A8J` 8A0A(B BBBG tC`4\.ZAE J t\.ZAE J 08\.BAA G0  AABG ,l^.AC G A { @`.ZAL C `.ZAL C ,`.!AG _ AH B AE  a.A^ Q ,,0b.AAI D CAE /, |/ ABA Z4Q(A BBBK5(n5ԉ5(L5IDD p ABD (x$5 AAG0 AAA H5X_6THr F , _68x6H s] PH U$\y6 (A BBBA X68T6BBG A(F0(A BBB6 (A ABBD 4:t;;C (A ABBG 
module omp_lib_kinds
  implicit none
  integer, parameter :: omp_lock_kind = 4
  integer, parameter :: omp_nest_lock_kind = 8
  integer, parameter :: omp_sched_kind = 4
  integer, parameter :: omp_proc_bind_kind = 4
  integer, parameter :: omp_lock_hint_kind = 4
  integer, parameter :: omp_pause_resource_kind = 4
  integer (omp_sched_kind), parameter :: omp_sched_static = 1
  integer (omp_sched_kind), parameter :: omp_sched_dynamic = 2
  integer (omp_sched_kind), parameter :: omp_sched_guided = 3
  integer (omp_sched_kind), parameter :: omp_sched_auto = 4
  integer (omp_proc_bind_kind), &
       parameter :: omp_proc_bind_false = 0
  integer (omp_proc_bind_kind), &
       parameter :: omp_proc_bind_true = 1
  integer (omp_proc_bind_kind), &
       parameter :: omp_proc_bind_master = 2
  integer (omp_proc_bind_kind), &
       parameter :: omp_proc_bind_close = 3
  integer (omp_proc_bind_kind), &
       parameter :: omp_proc_bind_spread = 4
  integer (omp_lock_hint_kind), &
       parameter :: omp_lock_hint_none = 0
  integer (omp_lock_hint_kind), &
       parameter :: omp_lock_hint_uncontended = 1
  integer (omp_lock_hint_kind), &
       parameter :: omp_lock_hint_contended = 2
  integer (omp_lock_hint_kind), &
       parameter :: omp_lock_hint_nonspeculative = 4
  integer (omp_lock_hint_kind), &
       parameter :: omp_lock_hint_speculative = 8
  integer (kind=omp_pause_resource_kind), &
       parameter :: omp_pause_soft = 1
  integer (kind=omp_pause_resource_kind), &
       parameter :: omp_pause_hard = 2
end module

module omp_lib
  use omp_lib_kinds
  implicit none
  integer, parameter :: openmp_version = 201511
  interface
    subroutine omp_init_lock (svar)
      use omp_lib_kinds
      integer (omp_lock_kind), intent (out) :: svar
    end subroutine omp_init_lock
  end interface
  interface
    subroutine omp_init_lock_with_hint (svar, hint)
      use omp_lib_kinds
      integer (omp_lock_kind), intent (out) :: svar
      integer (omp_lock_hint_kind), intent (in) :: hint
    end subroutine omp_init_lock_with_hint
  end interface
  interface
    subroutine omp_init_nest_lock (nvar)
      use omp_lib_kinds
      integer (omp_nest_lock_kind), intent (out) :: nvar
    end subroutine omp_init_nest_lock
  end interface
  interface
    subroutine omp_init_nest_lock_with_hint (nvar, hint)
      use omp_lib_kinds
      integer (omp_nest_lock_kind), intent (out) :: nvar
      integer (omp_lock_hint_kind), intent (in) :: hint
    end subroutine omp_init_nest_lock_with_hint
  end interface
  interface
    subroutine omp_destroy_lock (svar)
      use omp_lib_kinds
      integer (omp_lock_kind), intent (inout) :: svar
    end subroutine omp_destroy_lock
  end interface
  interface
    subroutine omp_destroy_nest_lock (nvar)
      use omp_lib_kinds
      integer (omp_nest_lock_kind), intent (inout) :: nvar
    end subroutine omp_destroy_nest_lock
  end interface
  interface
    subroutine omp_set_lock (svar)
      use omp_lib_kinds
      integer (omp_lock_kind), intent (inout) :: svar
    end subroutine omp_set_lock
  end interface
  interface
    subroutine omp_set_nest_lock (nvar)
      use omp_lib_kinds
      integer (omp_nest_lock_kind), intent (inout) :: nvar
    end subroutine omp_set_nest_lock
  end interface
  interface
    subroutine omp_unset_lock (svar)
      use omp_lib_kinds
      integer (omp_lock_kind), intent (inout) :: svar
    end subroutine omp_unset_lock
  end interface
  interface
    subroutine omp_unset_nest_lock (nvar)
      use omp_lib_kinds
      integer (omp_nest_lock_kind), intent (inout) :: nvar
    end subroutine omp_unset_nest_lock
  end interface
  interface omp_set_dynamic
    subroutine omp_set_dynamic (dynamic_threads)
      logical (4), intent (in) :: dynamic_threads
    end subroutine omp_set_dynamic
    subroutine omp_set_dynamic_8 (dynamic_threads)
      logical (8), intent (in) :: dynamic_threads
    end subroutine omp_set_dynamic_8
  end interface
  interface omp_set_nested
    subroutine omp_set_nested (nested)
      logical (4), intent (in) :: nested
    end subroutine omp_set_nested
    subroutine omp_set_nested_8 (nested)
      logical (8), intent (in) :: nested
    end subroutine omp_set_nested_8
  end interface
  interface omp_set_num_threads
    subroutine omp_set_num_threads (num_threads)
      integer (4), intent (in) :: num_threads
    end subroutine omp_set_num_threads
    subroutine omp_set_num_threads_8 (num_threads)
      integer (8), intent (in) :: num_threads
    end subroutine omp_set_num_threads_8
  end interface
  interface
    function omp_get_dynamic ()
      logical (4) :: omp_get_dynamic
    end function omp_get_dynamic
  end interface
  interface
    function omp_get_nested ()
      logical (4) :: omp_get_nested
    end function omp_get_nested
  end interface
  interface
    function omp_in_parallel ()
      logical (4) :: omp_in_parallel
    end function omp_in_parallel
  end interface
  interface
    function omp_test_lock (svar)
      use omp_lib_kinds
      logical (4) :: omp_test_lock
      integer (omp_lock_kind), intent (inout) :: svar
    end function omp_test_lock
  end interface
  interface
    function omp_get_max_threads ()
      integer (4) :: omp_get_max_threads
    end function omp_get_max_threads
  end interface
  interface
    function omp_get_num_procs ()
      integer (4) :: omp_get_num_procs
    end function omp_get_num_procs
  end interface
  interface
    function omp_get_num_threads ()
      integer (4) :: omp_get_num_threads
    end function omp_get_num_threads
  end interface
  interface
    function omp_get_thread_num ()
      integer (4) :: omp_get_thread_num
    end function omp_get_thread_num
  end interface
  interface
    function omp_test_nest_lock (nvar)
      use omp_lib_kinds
      integer (4) :: omp_test_nest_lock
      integer (omp_nest_lock_kind), intent (inout) :: nvar
    end function omp_test_nest_lock
  end interface
  interface
    function omp_get_wtick ()
      double precision :: omp_get_wtick
    end function omp_get_wtick
  end interface
  interface
    function omp_get_wtime ()
      double precision :: omp_get_wtime
    end function omp_get_wtime
  end interface
  interface omp_set_schedule
    subroutine omp_set_schedule (kind, chunk_size)
      use omp_lib_kinds
      integer (omp_sched_kind), intent (in) :: kind
      integer (4), intent (in) :: chunk_size
    end subroutine omp_set_schedule
    subroutine omp_set_schedule_8 (kind, chunk_size)
      use omp_lib_kinds
      integer (omp_sched_kind), intent (in) :: kind
      integer (8), intent (in) :: chunk_size
    end subroutine omp_set_schedule_8
  end interface
  interface omp_get_schedule
    subroutine omp_get_schedule (kind, chunk_size)
      use omp_lib_kinds
      integer (omp_sched_kind), intent (out) :: kind
      integer (4), intent (out) :: chunk_size
    end subroutine omp_get_schedule
    subroutine omp_get_schedule_8 (kind, chunk_size)
      use omp_lib_kinds
      integer (omp_sched_kind), intent (out) :: kind
      integer (8), intent (out) :: chunk_size
    end subroutine omp_get_schedule_8
  end interface
  interface
    function omp_get_thread_limit ()
      integer (4) :: omp_get_thread_limit
    end function omp_get_thread_limit
  end interface
  interface omp_set_max_active_levels
    subroutine omp_set_max_active_levels (max_levels)
      integer (4), intent (in) :: max_levels
    end subroutine omp_set_max_active_levels
    subroutine omp_set_max_active_levels_8 (max_levels)
      integer (8), intent (in) :: max_levels
    end subroutine omp_set_max_active_levels_8
  end interface
  interface
    function omp_get_max_active_levels ()
      integer (4) :: omp_get_max_active_levels
    end function omp_get_max_active_levels
  end interface
  interface
    function omp_get_level ()
      integer (4) :: omp_get_level
    end function omp_get_level
  end interface
  interface omp_get_ancestor_thread_num
    function omp_get_ancestor_thread_num (level)
      integer (4), intent (in) :: level
      integer (4) :: omp_get_ancestor_thread_num
    end function omp_get_ancestor_thread_num
    function omp_get_ancestor_thread_num_8 (level)
      integer (8), intent (in) :: level
      integer (4) :: omp_get_ancestor_thread_num_8
    end function omp_get_ancestor_thread_num_8
  end interface
  interface omp_get_team_size
    function omp_get_team_size (level)
      integer (4), intent (in) :: level
      integer (4) :: omp_get_team_size
    end function omp_get_team_size
    function omp_get_team_size_8 (level)
      integer (8), intent (in) :: level
      integer (4) :: omp_get_team_size_8
    end function omp_get_team_size_8
  end interface
  interface
    function omp_get_active_level ()
      integer (4) :: omp_get_active_level
    end function omp_get_active_level
  end interface
  interface
    function omp_in_final ()
      logical (4) :: omp_in_final
    end function omp_in_final
  end interface
  interface
    function omp_get_cancellation ()
      logical (4) :: omp_get_cancellation
    end function omp_get_cancellation
  end interface
  interface
    function omp_get_proc_bind ()
      use omp_lib_kinds
      integer (omp_proc_bind_kind) :: omp_get_proc_bind
    end function omp_get_proc_bind
  end interface
  interface
    function omp_get_num_places ()
      integer (4) :: omp_get_num_places
    end function omp_get_num_places
  end interface
  interface omp_get_place_num_procs
    function omp_get_place_num_procs (place_num)
      integer (4), intent(in) :: place_num
      integer (4) :: omp_get_place_num_procs
    end function omp_get_place_num_procs
    function omp_get_place_num_procs_8 (place_num)
      integer (8), intent(in) :: place_num
      integer (4) :: omp_get_place_num_procs_8
    end function omp_get_place_num_procs_8
  end interface
  interface omp_get_place_proc_ids
    subroutine omp_get_place_proc_ids (place_num, ids)
      integer (4), intent(in) :: place_num
      integer (4), intent(out) :: ids(*)
    end subroutine omp_get_place_proc_ids
    subroutine omp_get_place_proc_ids_8 (place_num, ids)
      integer (8), intent(in) :: place_num
      integer (8), intent(out) :: ids(*)
    end subroutine omp_get_place_proc_ids_8
  end interface
  interface
    function omp_get_place_num ()
      integer (4) :: omp_get_place_num
    end function omp_get_place_num
  end interface
  interface
    function omp_get_partition_num_places ()
      integer (4) :: omp_get_partition_num_places
    end function omp_get_partition_num_places
  end interface
  interface omp_get_partition_place_nums
    subroutine omp_get_partition_place_nums (place_nums)
      integer (4), intent(out) :: place_nums(*)
    end subroutine omp_get_partition_place_nums
    subroutine omp_get_partition_place_nums_8 (place_nums)
      integer (8), intent(out) :: place_nums(*)
    end subroutine omp_get_partition_place_nums_8
  end interface
  interface omp_set_default_device
    subroutine omp_set_default_device (device_num)
      integer (4), intent (in) :: device_num
    end subroutine omp_set_default_device
    subroutine omp_set_default_device_8 (device_num)
      integer (8), intent (in) :: device_num
    end subroutine omp_set_default_device_8
  end interface
  interface
    function omp_get_default_device ()
      integer (4) :: omp_get_default_device
    end function omp_get_default_device
  end interface
  interface
    function omp_get_num_devices ()
      integer (4) :: omp_get_num_devices
    end function omp_get_num_devices
  end interface
  interface
    function omp_get_num_teams ()
      integer (4) :: omp_get_num_teams
    end function omp_get_num_teams
  end interface
  interface
    function omp_get_team_num ()
      integer (4) :: omp_get_team_num
    end function omp_get_team_num
  end interface
  interface
    function omp_is_initial_device ()
      logical (4) :: omp_is_initial_device
    end function omp_is_initial_device
  end interface
  interface
    function omp_get_initial_device ()
      integer (4) :: omp_get_initial_device
    end function omp_get_initial_device
  end interface
  interface
    function omp_get_max_task_priority ()
      integer (4) :: omp_get_max_task_priority
    end function omp_get_max_task_priority
  end interface
  interface
    subroutine omp_set_affinity_format (format)
      character(len=*), intent(in) :: format
    end subroutine omp_set_affinity_format
  end interface
  interface
    function omp_get_affinity_format (buffer)
      integer (4) :: omp_get_affinity_format
      character(len=*), intent(out) :: buffer
    end function omp_get_affinity_format
  end interface
  interface
    subroutine omp_display_affinity (format)
      character(len=*), intent(in) :: format
    end subroutine omp_display_affinity
  end interface
  interface
    function omp_capture_affinity (buffer, format)
      integer (4) :: omp_capture_affinity
      character(len=*), intent(out) :: buffer
      character(len=*), intent(in) :: format
    end function omp_capture_affinity
  end interface
  interface
    function omp_pause_resource (kind, device_num)
      use omp_lib_kinds
      integer (4) :: omp_pause_resource
      integer (kind=omp_pause_resource_kind), &
           intent(in) :: kind
      integer (4) :: device_num
    end function
  end interface
  interface
    function omp_pause_resource_all (kind)
      use omp_lib_kinds
      integer (4) :: omp_pause_resource_all
      integer (kind=omp_pause_resource_kind), &
           intent(in) :: kind
    end function
  end interface
end module omp_lib
finclude/omp_lib.h
      integer omp_lock_kind, omp_nest_lock_kind, openmp_version
      parameter (omp_lock_kind = 4)
      parameter (omp_nest_lock_kind = 8)
      integer omp_sched_kind
      parameter (omp_sched_kind = 4)
      integer (omp_sched_kind) omp_sched_static, omp_sched_dynamic
      integer (omp_sched_kind) omp_sched_guided, omp_sched_auto
      parameter (omp_sched_static = 1)
      parameter (omp_sched_dynamic = 2)
      parameter (omp_sched_guided = 3)
      parameter (omp_sched_auto = 4)
      integer omp_proc_bind_kind
      parameter (omp_proc_bind_kind = 4)
      integer (omp_proc_bind_kind) omp_proc_bind_false
      integer (omp_proc_bind_kind) omp_proc_bind_true
      integer (omp_proc_bind_kind) omp_proc_bind_master
      integer (omp_proc_bind_kind) omp_proc_bind_close
      integer (omp_proc_bind_kind) omp_proc_bind_spread
      parameter (omp_proc_bind_false = 0)
      parameter (omp_proc_bind_true = 1)
      parameter (omp_proc_bind_master = 2)
      parameter (omp_proc_bind_close = 3)
      parameter (omp_proc_bind_spread = 4)
      integer omp_lock_hint_kind
      parameter (omp_lock_hint_kind = 4)
      integer (omp_lock_hint_kind) omp_lock_hint_none
      integer (omp_lock_hint_kind) omp_lock_hint_uncontended
      integer (omp_lock_hint_kind) omp_lock_hint_contended
      integer (omp_lock_hint_kind) omp_lock_hint_nonspeculative
      integer (omp_lock_hint_kind) omp_lock_hint_speculative
      parameter (omp_lock_hint_none = 0)
      parameter (omp_lock_hint_uncontended = 1)
      parameter (omp_lock_hint_contended = 2)
      parameter (omp_lock_hint_nonspeculative = 4)
      parameter (omp_lock_hint_speculative = 8)
      parameter (openmp_version = 201511)
      integer omp_pause_resource_kind
      parameter (omp_pause_resource_kind = 4)
      integer (omp_pause_resource_kind) omp_pause_soft
      integer (omp_pause_resource_kind) omp_pause_hard
      parameter (omp_pause_soft = 1)
      parameter (omp_pause_hard = 2)
      external omp_init_lock, omp_init_nest_lock
      external omp_init_lock_with_hint
      external omp_init_nest_lock_with_hint
      external omp_destroy_lock, omp_destroy_nest_lock
      external omp_set_lock, omp_set_nest_lock
      external omp_unset_lock, omp_unset_nest_lock
      external omp_set_dynamic, omp_set_nested
      external omp_set_num_threads
      external omp_get_dynamic, omp_get_nested
      logical(4) omp_get_dynamic, omp_get_nested
      external omp_test_lock, omp_in_parallel
      logical(4) omp_test_lock, omp_in_parallel
      external omp_get_max_threads, omp_get_num_procs
      integer(4) omp_get_max_threads, omp_get_num_procs
      external omp_get_num_threads, omp_get_thread_num
      integer(4) omp_get_num_threads, omp_get_thread_num
      external omp_test_nest_lock
      integer(4) omp_test_nest_lock
      external omp_get_wtick, omp_get_wtime
      double precision omp_get_wtick, omp_get_wtime
      external omp_set_schedule, omp_get_schedule
      external omp_get_thread_limit, omp_set_max_active_levels
      external omp_get_max_active_levels, omp_get_level
      external omp_get_ancestor_thread_num, omp_get_team_size
      external omp_get_active_level
      integer(4) omp_get_thread_limit, omp_get_max_active_levels
      integer(4) omp_get_level, omp_get_ancestor_thread_num
      integer(4) omp_get_team_size, omp_get_active_level
      external omp_in_final
      logical(4) omp_in_final
      external omp_get_cancellation
      logical(4) omp_get_cancellation
      external omp_get_proc_bind
      integer(omp_proc_bind_kind) omp_get_proc_bind
      integer(4) omp_get_num_places
      external omp_get_num_places
      integer(4) omp_get_place_num_procs
      external omp_get_place_num_procs
      external omp_get_place_proc_ids
      integer(4) omp_get_place_num
      external omp_get_place_num
      integer(4) omp_get_partition_num_places
      external omp_get_partition_num_places
      external omp_get_partition_place_nums
      external omp_set_default_device, omp_get_default_device
      external omp_get_num_devices, omp_get_num_teams
      external omp_get_team_num
      integer(4) omp_get_default_device, omp_get_num_devices
      integer(4) omp_get_num_teams, omp_get_team_num
      external omp_is_initial_device
      logical(4) omp_is_initial_device
      external omp_get_initial_device
      integer(4) omp_get_initial_device
      external omp_get_max_task_priority
      integer(4) omp_get_max_task_priority
      external omp_set_affinity_format, omp_get_affinity_format
      external omp_display_affinity, omp_capture_affinity
      integer(4) omp_get_affinity_format
      integer(4) omp_capture_affinity
      external omp_pause_resource, omp_pause_resource_all
      integer(4) omp_pause_resource
      integer(4) omp_pause_resource_all

finclude/omp_lib.mod
(binary compiled Fortran module data; not reproducible as text)
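The fixed-source header above (finclude/omp_lib.h) mirrors the module for code that cannot use Fortran modules. A minimal sketch of its use, again not shipped in the package and assuming the compiler's finclude directory is on the include path (gfortran normally arranges this itself):

c     Sketch: the header supplies the kind constants and external
c     declarations, so it can replace "use omp_lib" in fixed-form code.
      program omp_hdr_demo
      implicit none
      include 'omp_lib.h'
      integer tid
!$omp parallel private (tid)
      tid = omp_get_thread_num ()
      print *, 'thread', tid, 'of', omp_get_num_threads ()
!$omp end parallel
      end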
finclude/openacc.f90
! OpenACC Runtime Library Definitions.
!
! Contributed by Tobias Burnus
! and Mentor Embedded
! Keep in sync with config/accel/openacc.f90 and openacc_lib.h.

module openacc_kinds
  use iso_fortran_env, only: int32
  implicit none
  public
  private :: int32
  ! When adding items, also update 'public' setting in 'module openacc' below.
  integer, parameter :: acc_device_kind = int32
  !
  integer (acc_device_kind), parameter :: acc_device_current = -1
  integer (acc_device_kind), parameter :: acc_device_none = 0
  integer (acc_device_kind), parameter :: acc_device_default = 1
  integer (acc_device_kind), parameter :: acc_device_host = 2
  ! integer (acc_device_kind), parameter ::
  integer (acc_device_kind), parameter :: acc_device_not_host = 4
  integer (acc_device_kind), parameter :: acc_device_nvidia = 5
  integer (acc_device_kind), parameter :: acc_device_radeon = 8
  integer, parameter :: acc_device_property_kind = int32
  ! OpenACC 2.6/2.7/3.0 used acc_device_property; in a spec update the
  ! missing '_kind' was added for consistency.  For backward compatibility, keep:
  integer, parameter :: acc_device_property = acc_device_property_kind
  !
  integer (acc_device_property_kind), parameter :: acc_property_memory = 1
  integer (acc_device_property_kind), parameter :: acc_property_free_memory = 2
  integer (acc_device_property_kind), parameter :: acc_property_name = int(Z'10001')
  integer (acc_device_property_kind), parameter :: acc_property_vendor = int(Z'10002')
  integer (acc_device_property_kind), parameter :: acc_property_driver = int(Z'10003')
  integer, parameter :: acc_handle_kind = int32
  !
  integer (acc_handle_kind), parameter :: acc_async_noval = -1
  integer (acc_handle_kind), parameter :: acc_async_sync = -2
end module openacc_kinds

module openacc_internal
  use openacc_kinds
  implicit none
  interface
    function acc_get_num_devices_h (devicetype)
      import
      integer acc_get_num_devices_h
      integer (acc_device_kind) devicetype
    end function
    subroutine acc_set_device_type_h (devicetype)
      import
      integer (acc_device_kind) devicetype
    end subroutine
    function acc_get_device_type_h ()
      import
      integer (acc_device_kind) acc_get_device_type_h
    end function
    subroutine acc_set_device_num_h (devicenum, devicetype)
      import
      integer devicenum
      integer (acc_device_kind) devicetype
    end subroutine
    function acc_get_device_num_h (devicetype)
      import
      integer acc_get_device_num_h
      integer (acc_device_kind) devicetype
    end function
    function acc_get_property_h (devicenum, devicetype, property)
      use iso_c_binding, only: c_size_t
      import
      implicit none (type, external)
      integer (c_size_t) :: acc_get_property_h
      integer, value :: devicenum
      integer (acc_device_kind), value :: devicetype
      integer (acc_device_property_kind), value :: property
    end function
    subroutine acc_get_property_string_h (devicenum, devicetype, property, string)
      import
      implicit none (type, external)
      integer, value :: devicenum
      integer (acc_device_kind), value :: devicetype
      integer (acc_device_property_kind), value :: property
      character (*) :: string
    end subroutine
    function acc_async_test_h (arg)
      logical acc_async_test_h
      integer arg
    end function
    function acc_async_test_all_h ()
      logical acc_async_test_all_h
    end function
    subroutine acc_wait_h (arg)
      integer arg
    end subroutine
    subroutine acc_wait_async_h (arg, async)
      integer arg, async
    end subroutine
    subroutine acc_wait_all_h ()
    end subroutine
    subroutine acc_wait_all_async_h (async)
      integer async
    end subroutine
    subroutine acc_init_h (devicetype)
      import
      integer (acc_device_kind) devicetype
    end subroutine
    subroutine acc_shutdown_h (devicetype)
      import
      integer (acc_device_kind) devicetype
    end subroutine
    function
acc_on_device_h (devicetype) import integer (acc_device_kind) devicetype logical acc_on_device_h end function subroutine acc_copyin_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_copyin_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_copyin_array_h (a) type (*), dimension (..), contiguous :: a end subroutine subroutine acc_present_or_copyin_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_present_or_copyin_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_present_or_copyin_array_h (a) type (*), dimension (..), contiguous :: a end subroutine subroutine acc_create_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_create_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_create_array_h (a) type (*), dimension (..), contiguous :: a end subroutine subroutine acc_present_or_create_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_present_or_create_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_present_or_create_array_h (a) type (*), dimension (..), contiguous :: a end subroutine subroutine acc_copyout_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_copyout_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_copyout_array_h (a) type (*), dimension (..), contiguous :: a end subroutine subroutine acc_copyout_finalize_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_copyout_finalize_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_copyout_finalize_array_h (a) type (*), dimension (..), contiguous :: a end subroutine subroutine acc_delete_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_delete_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_delete_array_h (a) type (*), dimension (..), contiguous :: a end subroutine subroutine acc_delete_finalize_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_delete_finalize_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ 
ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_delete_finalize_array_h (a) type (*), dimension (..), contiguous :: a end subroutine subroutine acc_update_device_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_update_device_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_update_device_array_h (a) type (*), dimension (..), contiguous :: a end subroutine subroutine acc_update_self_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_update_self_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_update_self_array_h (a) type (*), dimension (..), contiguous :: a end subroutine function acc_is_present_32_h (a, len) use iso_c_binding, only: c_int32_t logical acc_is_present_32_h !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end function function acc_is_present_64_h (a, len) use iso_c_binding, only: c_int64_t logical acc_is_present_64_h !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end function function acc_is_present_array_h (a) logical acc_is_present_array_h type (*), dimension (..), contiguous :: a end function subroutine acc_copyin_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_copyin_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_copyin_async_array_h (a, async) use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async end subroutine subroutine acc_create_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_create_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_create_async_array_h (a, async) use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async end subroutine subroutine acc_copyout_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_copyout_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine 
subroutine acc_copyout_async_array_h (a, async) use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async end subroutine subroutine acc_delete_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_delete_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_delete_async_array_h (a, async) use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async end subroutine subroutine acc_update_device_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_update_device_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_update_device_async_array_h (a, async) use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async end subroutine subroutine acc_update_self_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_update_self_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_update_self_async_array_h (a, async) use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async end subroutine end interface interface function acc_get_num_devices_l (devicetype) & bind (C, name = "acc_get_num_devices") use iso_c_binding, only: c_int integer (c_int) :: acc_get_num_devices_l integer (c_int), value :: devicetype end function subroutine acc_set_device_type_l (devicetype) & bind (C, name = "acc_set_device_type") use iso_c_binding, only: c_int integer (c_int), value :: devicetype end subroutine function acc_get_device_type_l () & bind (C, name = "acc_get_device_type") use iso_c_binding, only: c_int integer (c_int) :: acc_get_device_type_l end function subroutine acc_set_device_num_l (devicenum, devicetype) & bind (C, name = "acc_set_device_num") use iso_c_binding, only: c_int integer (c_int), value :: devicenum, devicetype end subroutine function acc_get_device_num_l (devicetype) & bind (C, name = "acc_get_device_num") use iso_c_binding, only: c_int integer (c_int) :: acc_get_device_num_l integer (c_int), value :: devicetype end function function acc_get_property_l (devicenum, devicetype, property) & bind (C, name = "acc_get_property") use iso_c_binding, only: c_int, c_size_t implicit none (type, external) integer (c_size_t) :: acc_get_property_l integer (c_int), value :: devicenum integer (c_int), value :: devicetype integer 
(c_int), value :: property end function function acc_get_property_string_l (devicenum, devicetype, property) & bind (C, name = "acc_get_property_string") use iso_c_binding, only: c_int, c_ptr implicit none (type, external) type (c_ptr) :: acc_get_property_string_l integer (c_int), value :: devicenum integer (c_int), value :: devicetype integer (c_int), value :: property end function function acc_async_test_l (a) & bind (C, name = "acc_async_test") use iso_c_binding, only: c_int integer (c_int) :: acc_async_test_l integer (c_int), value :: a end function function acc_async_test_all_l () & bind (C, name = "acc_async_test_all") use iso_c_binding, only: c_int integer (c_int) :: acc_async_test_all_l end function subroutine acc_wait_l (a) & bind (C, name = "acc_wait") use iso_c_binding, only: c_int integer (c_int), value :: a end subroutine subroutine acc_wait_async_l (arg, async) & bind (C, name = "acc_wait_async") use iso_c_binding, only: c_int integer (c_int), value :: arg, async end subroutine subroutine acc_wait_all_l () & bind (C, name = "acc_wait_all") use iso_c_binding, only: c_int end subroutine subroutine acc_wait_all_async_l (async) & bind (C, name = "acc_wait_all_async") use iso_c_binding, only: c_int integer (c_int), value :: async end subroutine subroutine acc_init_l (devicetype) & bind (C, name = "acc_init") use iso_c_binding, only: c_int integer (c_int), value :: devicetype end subroutine subroutine acc_shutdown_l (devicetype) & bind (C, name = "acc_shutdown") use iso_c_binding, only: c_int integer (c_int), value :: devicetype end subroutine function acc_on_device_l (devicetype) & bind (C, name = "acc_on_device") use iso_c_binding, only: c_int integer (c_int) :: acc_on_device_l integer (c_int), value :: devicetype end function subroutine acc_copyin_l (a, len) & bind (C, name = "acc_copyin") use iso_c_binding, only: c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine subroutine acc_present_or_copyin_l (a, len) & bind (C, name = "acc_present_or_copyin") use iso_c_binding, only: c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine subroutine acc_create_l (a, len) & bind (C, name = "acc_create") use iso_c_binding, only: c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine subroutine acc_present_or_create_l (a, len) & bind (C, name = "acc_present_or_create") use iso_c_binding, only: c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine subroutine acc_copyout_l (a, len) & bind (C, name = "acc_copyout") use iso_c_binding, only: c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine subroutine acc_copyout_finalize_l (a, len) & bind (C, name = "acc_copyout_finalize") use iso_c_binding, only: c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine subroutine acc_delete_l (a, len) & bind (C, name = "acc_delete") use iso_c_binding, only: c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine subroutine acc_delete_finalize_l (a, len) & bind (C, name = "acc_delete_finalize") use iso_c_binding, only: c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine subroutine 
acc_update_device_l (a, len) & bind (C, name = "acc_update_device") use iso_c_binding, only: c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine subroutine acc_update_self_l (a, len) & bind (C, name = "acc_update_self") use iso_c_binding, only: c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len end subroutine function acc_is_present_l (a, len) & bind (C, name = "acc_is_present") use iso_c_binding, only: c_int32_t, c_size_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a integer (c_int32_t) :: acc_is_present_l type (*), dimension (*) :: a integer (c_size_t), value :: len end function subroutine acc_copyin_async_l (a, len, async) & bind (C, name = "acc_copyin_async") use iso_c_binding, only: c_size_t, c_int !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len integer (c_int), value :: async end subroutine subroutine acc_create_async_l (a, len, async) & bind (C, name = "acc_create_async") use iso_c_binding, only: c_size_t, c_int !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len integer (c_int), value :: async end subroutine subroutine acc_copyout_async_l (a, len, async) & bind (C, name = "acc_copyout_async") use iso_c_binding, only: c_size_t, c_int !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len integer (c_int), value :: async end subroutine subroutine acc_delete_async_l (a, len, async) & bind (C, name = "acc_delete_async") use iso_c_binding, only: c_size_t, c_int !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len integer (c_int), value :: async end subroutine subroutine acc_update_device_async_l (a, len, async) & bind (C, name = "acc_update_device_async") use iso_c_binding, only: c_size_t, c_int !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len integer (c_int), value :: async end subroutine subroutine acc_update_self_async_l (a, len, async) & bind (C, name = "acc_update_self_async") use iso_c_binding, only: c_size_t, c_int !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_size_t), value :: len integer (c_int), value :: async end subroutine end interface end module openacc_internal module openacc use openacc_kinds use openacc_internal implicit none private ! 
From openacc_kinds public :: acc_device_kind public :: acc_device_none, acc_device_default, acc_device_host public :: acc_device_not_host, acc_device_nvidia, acc_device_radeon public :: acc_device_property_kind, acc_device_property public :: acc_property_memory, acc_property_free_memory public :: acc_property_name, acc_property_vendor, acc_property_driver public :: acc_handle_kind public :: acc_async_noval, acc_async_sync public :: openacc_version public :: acc_get_num_devices, acc_set_device_type, acc_get_device_type public :: acc_set_device_num, acc_get_device_num public :: acc_get_property, acc_get_property_string public :: acc_async_test, acc_async_test_all public :: acc_wait, acc_async_wait, acc_wait_async public :: acc_wait_all, acc_async_wait_all, acc_wait_all_async public :: acc_init, acc_shutdown, acc_on_device public :: acc_copyin, acc_present_or_copyin, acc_pcopyin, acc_create public :: acc_present_or_create, acc_pcreate, acc_copyout, acc_delete public :: acc_update_device, acc_update_self, acc_is_present public :: acc_copyin_async, acc_create_async, acc_copyout_async public :: acc_delete_async, acc_update_device_async, acc_update_self_async public :: acc_copyout_finalize, acc_delete_finalize integer, parameter :: openacc_version = 201711 interface acc_get_num_devices procedure :: acc_get_num_devices_h end interface interface acc_set_device_type procedure :: acc_set_device_type_h end interface interface acc_get_device_type procedure :: acc_get_device_type_h end interface interface acc_set_device_num procedure :: acc_set_device_num_h end interface interface acc_get_device_num procedure :: acc_get_device_num_h end interface interface acc_get_property procedure :: acc_get_property_h end interface interface acc_get_property_string procedure :: acc_get_property_string_h end interface interface acc_async_test procedure :: acc_async_test_h end interface interface acc_async_test_all procedure :: acc_async_test_all_h end interface interface acc_wait procedure :: acc_wait_h end interface ! acc_async_wait is an OpenACC 1.0 compatibility name for acc_wait. interface acc_async_wait procedure :: acc_wait_h end interface interface acc_wait_async procedure :: acc_wait_async_h end interface interface acc_wait_all procedure :: acc_wait_all_h end interface ! acc_async_wait_all is an OpenACC 1.0 compatibility name for acc_wait_all. interface acc_async_wait_all procedure :: acc_wait_all_h end interface interface acc_wait_all_async procedure :: acc_wait_all_async_h end interface interface acc_init procedure :: acc_init_h end interface interface acc_shutdown procedure :: acc_shutdown_h end interface interface acc_on_device procedure :: acc_on_device_h end interface ! acc_malloc: Only available in C/C++ ! acc_free: Only available in C/C++ ! As vendor extension, the following code supports both 32bit and 64bit ! arguments for "size"; the OpenACC standard only permits default-kind ! integers, which are of kind 4 (i.e. 32 bits). ! Additionally, the two-argument version also takes arrays as argument. ! and the one argument version also scalars. Note that the code assumes ! that the arrays are contiguous. 
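! --- Editorial usage sketch (not part of the shipped source): a hypothetical
! --- program exercising the vendor extension described in the comment above.
! --- It assumes this 'openacc' module, gfortran's sizeof extension and a
! --- working OpenACC runtime; the program name and 'buf' are made up.
program demo_vendor_copyin
  use openacc
  use iso_c_binding, only: c_int32_t, c_int64_t
  implicit none
  real :: buf(1024)
  buf = 0.0
  ! Two-argument form: the byte count may be a 32-bit or a 64-bit integer.
  call acc_copyin (buf, int (sizeof (buf), kind=c_int32_t))
  call acc_delete (buf, int (sizeof (buf), kind=c_int64_t))
  ! One-argument form: a contiguous array; its size is derived via sizeof.
  call acc_copyin (buf)
  call acc_copyout (buf)
end program demo_vendor_copyin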
interface acc_copyin procedure :: acc_copyin_32_h procedure :: acc_copyin_64_h procedure :: acc_copyin_array_h end interface interface acc_present_or_copyin procedure :: acc_present_or_copyin_32_h procedure :: acc_present_or_copyin_64_h procedure :: acc_present_or_copyin_array_h end interface interface acc_pcopyin procedure :: acc_present_or_copyin_32_h procedure :: acc_present_or_copyin_64_h procedure :: acc_present_or_copyin_array_h end interface interface acc_create procedure :: acc_create_32_h procedure :: acc_create_64_h procedure :: acc_create_array_h end interface interface acc_present_or_create procedure :: acc_present_or_create_32_h procedure :: acc_present_or_create_64_h procedure :: acc_present_or_create_array_h end interface interface acc_pcreate procedure :: acc_present_or_create_32_h procedure :: acc_present_or_create_64_h procedure :: acc_present_or_create_array_h end interface interface acc_copyout procedure :: acc_copyout_32_h procedure :: acc_copyout_64_h procedure :: acc_copyout_array_h end interface interface acc_copyout_finalize procedure :: acc_copyout_finalize_32_h procedure :: acc_copyout_finalize_64_h procedure :: acc_copyout_finalize_array_h end interface interface acc_delete procedure :: acc_delete_32_h procedure :: acc_delete_64_h procedure :: acc_delete_array_h end interface interface acc_delete_finalize procedure :: acc_delete_finalize_32_h procedure :: acc_delete_finalize_64_h procedure :: acc_delete_finalize_array_h end interface interface acc_update_device procedure :: acc_update_device_32_h procedure :: acc_update_device_64_h procedure :: acc_update_device_array_h end interface interface acc_update_self procedure :: acc_update_self_32_h procedure :: acc_update_self_64_h procedure :: acc_update_self_array_h end interface ! acc_map_data: Only available in C/C++ ! acc_unmap_data: Only available in C/C++ ! acc_deviceptr: Only available in C/C++ ! acc_hostptr: Only available in C/C++ interface acc_is_present procedure :: acc_is_present_32_h procedure :: acc_is_present_64_h procedure :: acc_is_present_array_h end interface ! acc_memcpy_to_device: Only available in C/C++ ! 
acc_memcpy_from_device: Only available in C/C++ interface acc_copyin_async procedure :: acc_copyin_async_32_h procedure :: acc_copyin_async_64_h procedure :: acc_copyin_async_array_h end interface interface acc_create_async procedure :: acc_create_async_32_h procedure :: acc_create_async_64_h procedure :: acc_create_async_array_h end interface interface acc_copyout_async procedure :: acc_copyout_async_32_h procedure :: acc_copyout_async_64_h procedure :: acc_copyout_async_array_h end interface interface acc_delete_async procedure :: acc_delete_async_32_h procedure :: acc_delete_async_64_h procedure :: acc_delete_async_array_h end interface interface acc_update_device_async procedure :: acc_update_device_async_32_h procedure :: acc_update_device_async_64_h procedure :: acc_update_device_async_array_h end interface interface acc_update_self_async procedure :: acc_update_self_async_32_h procedure :: acc_update_self_async_64_h procedure :: acc_update_self_async_array_h end interface end module openacc function acc_get_num_devices_h (devicetype) use openacc_internal, only: acc_get_num_devices_l use openacc_kinds integer acc_get_num_devices_h integer (acc_device_kind) devicetype acc_get_num_devices_h = acc_get_num_devices_l (devicetype) end function subroutine acc_set_device_type_h (devicetype) use openacc_internal, only: acc_set_device_type_l use openacc_kinds integer (acc_device_kind) devicetype call acc_set_device_type_l (devicetype) end subroutine function acc_get_device_type_h () use openacc_internal, only: acc_get_device_type_l use openacc_kinds integer (acc_device_kind) acc_get_device_type_h acc_get_device_type_h = acc_get_device_type_l () end function subroutine acc_set_device_num_h (devicenum, devicetype) use openacc_internal, only: acc_set_device_num_l use openacc_kinds integer devicenum integer (acc_device_kind) devicetype call acc_set_device_num_l (devicenum, devicetype) end subroutine function acc_get_device_num_h (devicetype) use openacc_internal, only: acc_get_device_num_l use openacc_kinds integer acc_get_device_num_h integer (acc_device_kind) devicetype acc_get_device_num_h = acc_get_device_num_l (devicetype) end function function acc_get_property_h (devicenum, devicetype, property) use iso_c_binding, only: c_size_t use openacc_internal, only: acc_get_property_l use openacc_kinds implicit none (type, external) integer (c_size_t) :: acc_get_property_h integer, value :: devicenum integer (acc_device_kind), value :: devicetype integer (acc_device_property_kind), value :: property acc_get_property_h = acc_get_property_l (devicenum, devicetype, property) end function subroutine acc_get_property_string_h (devicenum, devicetype, property, string) use iso_c_binding, only: c_char, c_size_t, c_ptr, c_f_pointer, c_associated use openacc_internal, only: acc_get_property_string_l use openacc_kinds implicit none (type, external) integer, value :: devicenum integer (acc_device_kind), value :: devicetype integer (acc_device_property_kind), value :: property character (*) :: string type (c_ptr) :: cptr integer(c_size_t) :: clen, slen, i character (kind=c_char, len=1), pointer, contiguous :: sptr (:) interface function strlen (s) bind (C, name = "strlen") use iso_c_binding, only: c_ptr, c_size_t type (c_ptr), intent(in), value :: s integer (c_size_t) :: strlen end function strlen end interface cptr = acc_get_property_string_l (devicenum, devicetype, property) string = "" if (.not. 
c_associated (cptr)) then return end if clen = strlen (cptr) call c_f_pointer (cptr, sptr, [clen]) slen = min (clen, len (string, kind=c_size_t)) do i = 1, slen string (i:i) = sptr (i) end do end subroutine function acc_async_test_h (arg) use openacc_internal, only: acc_async_test_l logical acc_async_test_h integer arg acc_async_test_h = acc_async_test_l (arg) /= 0 end function function acc_async_test_all_h () use openacc_internal, only: acc_async_test_all_l logical acc_async_test_all_h acc_async_test_all_h = acc_async_test_all_l () /= 0 end function subroutine acc_wait_h (arg) use openacc_internal, only: acc_wait_l integer arg call acc_wait_l (arg) end subroutine subroutine acc_wait_async_h (arg, async) use openacc_internal, only: acc_wait_async_l integer arg, async call acc_wait_async_l (arg, async) end subroutine subroutine acc_wait_all_h () use openacc_internal, only: acc_wait_all_l call acc_wait_all_l () end subroutine subroutine acc_wait_all_async_h (async) use openacc_internal, only: acc_wait_all_async_l integer async call acc_wait_all_async_l (async) end subroutine subroutine acc_init_h (devicetype) use openacc_internal, only: acc_init_l use openacc_kinds integer (acc_device_kind) devicetype call acc_init_l (devicetype) end subroutine subroutine acc_shutdown_h (devicetype) use openacc_internal, only: acc_shutdown_l use openacc_kinds integer (acc_device_kind) devicetype call acc_shutdown_l (devicetype) end subroutine function acc_on_device_h (devicetype) use openacc_internal, only: acc_on_device_l use openacc_kinds integer (acc_device_kind) devicetype logical acc_on_device_h acc_on_device_h = acc_on_device_l (devicetype) /= 0 end function subroutine acc_copyin_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_copyin_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_copyin_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_copyin_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_copyin_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_copyin_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_copyin_array_h (a) use openacc_internal, only: acc_copyin_l type (*), dimension (..), contiguous :: a call acc_copyin_l (a, sizeof (a)) end subroutine subroutine acc_present_or_copyin_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_present_or_copyin_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_present_or_copyin_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_present_or_copyin_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_present_or_copyin_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_present_or_copyin_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_present_or_copyin_array_h (a) use openacc_internal, only: acc_present_or_copyin_l type (*), dimension (..), contiguous :: a call acc_present_or_copyin_l (a, sizeof (a)) end subroutine subroutine acc_create_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_create_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_create_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_create_64_h (a, len) use iso_c_binding, only: c_int64_t, 
c_size_t use openacc_internal, only: acc_create_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_create_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_create_array_h (a) use openacc_internal, only: acc_create_l type (*), dimension (..), contiguous :: a call acc_create_l (a, sizeof (a)) end subroutine subroutine acc_present_or_create_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_present_or_create_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_present_or_create_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_present_or_create_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_present_or_create_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_present_or_create_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_present_or_create_array_h (a) use openacc_internal, only: acc_present_or_create_l type (*), dimension (..), contiguous :: a call acc_present_or_create_l (a, sizeof (a)) end subroutine subroutine acc_copyout_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_copyout_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_copyout_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_copyout_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_copyout_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_copyout_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_copyout_array_h (a) use openacc_internal, only: acc_copyout_l type (*), dimension (..), contiguous :: a call acc_copyout_l (a, sizeof (a)) end subroutine subroutine acc_copyout_finalize_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_copyout_finalize_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_copyout_finalize_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_copyout_finalize_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_copyout_finalize_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_copyout_finalize_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_copyout_finalize_array_h (a) use openacc_internal, only: acc_copyout_finalize_l type (*), dimension (..), contiguous :: a call acc_copyout_finalize_l (a, sizeof (a)) end subroutine subroutine acc_delete_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_delete_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_delete_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_delete_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_delete_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_delete_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_delete_array_h (a) use openacc_internal, only: acc_delete_l type (*), dimension (..), contiguous :: a call acc_delete_l (a, sizeof (a)) end subroutine subroutine acc_delete_finalize_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: 
acc_delete_finalize_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_delete_finalize_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_delete_finalize_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_delete_finalize_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_delete_finalize_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_delete_finalize_array_h (a) use openacc_internal, only: acc_delete_finalize_l type (*), dimension (..), contiguous :: a call acc_delete_finalize_l (a, sizeof (a)) end subroutine subroutine acc_update_device_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_update_device_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_update_device_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_update_device_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_update_device_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_update_device_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_update_device_array_h (a) use openacc_internal, only: acc_update_device_l type (*), dimension (..), contiguous :: a call acc_update_device_l (a, sizeof (a)) end subroutine subroutine acc_update_self_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_update_self_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len call acc_update_self_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_update_self_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_update_self_l !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len call acc_update_self_l (a, int (len, kind = c_size_t)) end subroutine subroutine acc_update_self_array_h (a) use openacc_internal, only: acc_update_self_l type (*), dimension (..), contiguous :: a call acc_update_self_l (a, sizeof (a)) end subroutine function acc_is_present_32_h (a, len) use iso_c_binding, only: c_int32_t, c_size_t use openacc_internal, only: acc_is_present_l logical acc_is_present_32_h !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len acc_is_present_32_h = acc_is_present_l (a, int (len, kind = c_size_t)) /= 0 end function function acc_is_present_64_h (a, len) use iso_c_binding, only: c_int64_t, c_size_t use openacc_internal, only: acc_is_present_l logical acc_is_present_64_h !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len acc_is_present_64_h = acc_is_present_l (a, int (len, kind = c_size_t)) /= 0 end function function acc_is_present_array_h (a) use openacc_internal, only: acc_is_present_l logical acc_is_present_array_h type (*), dimension (..), contiguous :: a acc_is_present_array_h = acc_is_present_l (a, sizeof (a)) /= 0 end function subroutine acc_copyin_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t, c_size_t, c_int use openacc_internal, only: acc_copyin_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async call acc_copyin_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_copyin_async_64_h 
(a, len, async) use iso_c_binding, only: c_int64_t, c_size_t, c_int use openacc_internal, only: acc_copyin_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async call acc_copyin_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_copyin_async_array_h (a, async) use iso_c_binding, only: c_int use openacc_internal, only: acc_copyin_async_l use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async call acc_copyin_async_l (a, sizeof (a), int (async, kind = c_int)) end subroutine subroutine acc_create_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t, c_size_t, c_int use openacc_internal, only: acc_create_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async call acc_create_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_create_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t, c_size_t, c_int use openacc_internal, only: acc_create_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async call acc_create_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_create_async_array_h (a, async) use iso_c_binding, only: c_int use openacc_internal, only: acc_create_async_l use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async call acc_create_async_l (a, sizeof (a), int (async, kind = c_int)) end subroutine subroutine acc_copyout_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t, c_size_t, c_int use openacc_internal, only: acc_copyout_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async call acc_copyout_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_copyout_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t, c_size_t, c_int use openacc_internal, only: acc_copyout_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async call acc_copyout_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_copyout_async_array_h (a, async) use iso_c_binding, only: c_int use openacc_internal, only: acc_copyout_async_l use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async call acc_copyout_async_l (a, sizeof (a), int (async, kind = c_int)) end subroutine subroutine acc_delete_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t, c_size_t, c_int use openacc_internal, only: acc_delete_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async call acc_delete_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_delete_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t, c_size_t, c_int use openacc_internal, only: acc_delete_async_l use openacc_kinds, 
only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async call acc_delete_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_delete_async_array_h (a, async) use iso_c_binding, only: c_int use openacc_internal, only: acc_delete_async_l use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async call acc_delete_async_l (a, sizeof (a), int (async, kind = c_int)) end subroutine subroutine acc_update_device_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t, c_size_t, c_int use openacc_internal, only: acc_update_device_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async call acc_update_device_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_update_device_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t, c_size_t, c_int use openacc_internal, only: acc_update_device_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async call acc_update_device_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_update_device_async_array_h (a, async) use iso_c_binding, only: c_int use openacc_internal, only: acc_update_device_async_l use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async call acc_update_device_async_l (a, sizeof (a), int (async, kind = c_int)) end subroutine subroutine acc_update_self_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t, c_size_t, c_int use openacc_internal, only: acc_update_self_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async call acc_update_self_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_update_self_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t, c_size_t, c_int use openacc_internal, only: acc_update_self_async_l use openacc_kinds, only: acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async call acc_update_self_async_l (a, int (len, kind = c_size_t), int (async, kind = c_int)) end subroutine subroutine acc_update_self_async_array_h (a, async) use iso_c_binding, only: c_int use openacc_internal, only: acc_update_self_async_l use openacc_kinds, only: acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async call acc_update_self_async_l (a, sizeof (a), int (async, kind = c_int)) end subroutine finclude/openacc.mod finclude/openacc_kinds.mod ! and Mentor Embedded ! NOTE: Due to the use of dimension (..), the code only works when compiled ! with -std=f2008ts/gnu/legacy but not with other standard settings. ! Alternatively, the user can use the module version, which permits ! compilation with -std=f95. !
Keep in sync with openacc.f90 and config/accel/openacc.f90. integer, parameter :: acc_device_kind = 4 ! integer (acc_device_kind), parameter :: acc_device_current = -1 integer (acc_device_kind), parameter :: acc_device_none = 0 integer (acc_device_kind), parameter :: acc_device_default = 1 integer (acc_device_kind), parameter :: acc_device_host = 2 ! integer (acc_device_kind), parameter :: acc_device_host_nonshm = 3 ! removed. integer (acc_device_kind), parameter :: acc_device_not_host = 4 integer (acc_device_kind), parameter :: acc_device_nvidia = 5 integer (acc_device_kind), parameter :: acc_device_radeon = 8 integer, parameter :: acc_device_property_kind = 4 ! OpenACC 2.6/2.7/3.0 used acc_device_property; in a spec update the ! missing '_kind' was added for consistency. For backward compatibility, keep: integer, parameter :: acc_device_property & & = acc_device_property_kind integer (acc_device_property_kind), parameter :: & & acc_property_memory = 1 integer (acc_device_property_kind), parameter :: & & acc_property_free_memory = 2 integer (acc_device_property_kind), parameter :: & & acc_property_name = int(Z'10001') integer (acc_device_property_kind), parameter :: & & acc_property_vendor = int(Z'10002') integer (acc_device_property_kind), parameter :: & & acc_property_driver = int(Z'10003') integer, parameter :: acc_handle_kind = 4 ! integer (acc_handle_kind), parameter :: acc_async_noval = -1 integer (acc_handle_kind), parameter :: acc_async_sync = -2 integer, parameter :: openacc_version = 201711 interface acc_get_num_devices function acc_get_num_devices_h (devicetype) import acc_device_kind integer acc_get_num_devices_h integer (acc_device_kind) devicetype end function end interface interface acc_set_device_type subroutine acc_set_device_type_h (devicetype) import acc_device_kind integer (acc_device_kind) devicetype end subroutine end interface interface acc_get_device_type function acc_get_device_type_h () import acc_device_kind integer (acc_device_kind) acc_get_device_type_h end function end interface interface acc_set_device_num subroutine acc_set_device_num_h (devicenum, devicetype) import acc_device_kind integer devicenum integer (acc_device_kind) devicetype end subroutine end interface interface acc_get_device_num function acc_get_device_num_h (devicetype) import acc_device_kind integer acc_get_device_num_h integer (acc_device_kind) devicetype end function end interface interface acc_get_property function acc_get_property_h (devicenum, devicetype, & & property) use iso_c_binding, only: c_size_t import acc_device_kind, acc_device_property_kind implicit none (type, external) integer (c_size_t) :: acc_get_property_h integer, value :: devicenum integer (acc_device_kind), value :: devicetype integer (acc_device_property_kind), value :: property end function end interface interface acc_get_property_string subroutine acc_get_property_string_h (devicenum, devicetype, & & property, string) import acc_device_kind, acc_device_property_kind implicit none (type, external) integer, value :: devicenum integer (acc_device_kind), value :: devicetype integer (acc_device_property_kind), value :: property character (*) :: string end subroutine end interface interface acc_async_test function acc_async_test_h (arg) logical acc_async_test_h integer arg end function end interface interface acc_async_test_all function acc_async_test_all_h () logical acc_async_test_all_h end function end interface interface acc_wait subroutine acc_wait_h (arg) integer arg end subroutine end interface ! 
acc_async_wait is an OpenACC 1.0 compatibility name for acc_wait. interface acc_async_wait procedure :: acc_wait_h end interface interface acc_wait_async subroutine acc_wait_async_h (arg, async) integer arg, async end subroutine end interface interface acc_wait_all subroutine acc_wait_all_h () end subroutine end interface ! acc_async_wait_all is an OpenACC 1.0 compatibility name for ! acc_wait_all. interface acc_async_wait_all procedure :: acc_wait_all_h end interface interface acc_wait_all_async subroutine acc_wait_all_async_h (async) integer async end subroutine end interface interface acc_init subroutine acc_init_h (devicetype) import acc_device_kind integer (acc_device_kind) devicetype end subroutine end interface interface acc_shutdown subroutine acc_shutdown_h (devicetype) import acc_device_kind integer (acc_device_kind) devicetype end subroutine end interface interface acc_on_device function acc_on_device_h (devicetype) import acc_device_kind logical acc_on_device_h integer (acc_device_kind) devicetype end function end interface ! acc_malloc: Only available in C/C++ ! acc_free: Only available in C/C++ interface acc_copyin subroutine acc_copyin_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_copyin_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_copyin_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface interface acc_present_or_copyin subroutine acc_present_or_copyin_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_present_or_copyin_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_present_or_copyin_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface interface acc_pcopyin procedure :: acc_present_or_copyin_32_h procedure :: acc_present_or_copyin_64_h procedure :: acc_present_or_copyin_array_h end interface interface acc_create subroutine acc_create_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_create_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_create_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface interface acc_present_or_create subroutine acc_present_or_create_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_present_or_create_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_present_or_create_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface interface acc_pcreate procedure :: acc_present_or_create_32_h procedure :: acc_present_or_create_64_h procedure :: acc_present_or_create_array_h end interface interface acc_copyout subroutine acc_copyout_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: 
a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_copyout_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_copyout_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface interface acc_copyout_finalize subroutine acc_copyout_finalize_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_copyout_finalize_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_copyout_finalize_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface interface acc_delete subroutine acc_delete_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_delete_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_delete_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface interface acc_delete_finalize subroutine acc_delete_finalize_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_delete_finalize_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_delete_finalize_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface interface acc_update_device subroutine acc_update_device_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_update_device_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_update_device_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface interface acc_update_self subroutine acc_update_self_32_h (a, len) use iso_c_binding, only: c_int32_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end subroutine subroutine acc_update_self_64_h (a, len) use iso_c_binding, only: c_int64_t !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end subroutine subroutine acc_update_self_array_h (a) type (*), dimension (..), contiguous :: a end subroutine end interface ! acc_map_data: Only available in C/C++ ! acc_unmap_data: Only available in C/C++ ! acc_deviceptr: Only available in C/C++ ! 
acc_hostptr: Only available in C/C++ interface acc_is_present function acc_is_present_32_h (a, len) use iso_c_binding, only: c_int32_t logical acc_is_present_32_h !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len end function function acc_is_present_64_h (a, len) use iso_c_binding, only: c_int64_t logical acc_is_present_64_h !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len end function function acc_is_present_array_h (a) logical acc_is_present_array_h type (*), dimension (..), contiguous :: a end function end interface ! acc_memcpy_to_device: Only available in C/C++ ! acc_memcpy_from_device: Only available in C/C++ interface acc_copyin_async subroutine acc_copyin_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_copyin_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_copyin_async_array_h (a, async_) import acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async_ end subroutine end interface interface acc_create_async subroutine acc_create_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_create_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_create_async_array_h (a, async_) import acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async_ end subroutine end interface interface acc_copyout_async subroutine acc_copyout_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_copyout_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_copyout_async_array_h (a, async_) import acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async_ end subroutine end interface interface acc_delete_async subroutine acc_delete_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_delete_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_delete_async_array_h (a, async_) import acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async_ end subroutine end interface interface acc_update_device_async subroutine acc_update_device_async_32_h (a, len, async) use 
iso_c_binding, only: c_int32_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_update_device_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_update_device_async_array_h (a, async_) import acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async_ end subroutine end interface interface acc_update_self_async subroutine acc_update_self_async_32_h (a, len, async) use iso_c_binding, only: c_int32_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int32_t) len integer (acc_handle_kind) async end subroutine subroutine acc_update_self_async_64_h (a, len, async) use iso_c_binding, only: c_int64_t import acc_handle_kind !GCC$ ATTRIBUTES NO_ARG_CHECK :: a type (*), dimension (*) :: a integer (c_int64_t) len integer (acc_handle_kind) async end subroutine subroutine acc_update_self_async_array_h (a, async_) import acc_handle_kind type (*), dimension (..), contiguous :: a integer (acc_handle_kind) async_ end subroutine end interface ISO_Fortran_binding.h /* Declarations for ISO Fortran binding. Contributed by Daniel Celis Garza This file is part of the GNU Fortran runtime library (libgfortran). */ #ifndef ISO_FORTRAN_BINDING_H #define ISO_FORTRAN_BINDING_H #include <stddef.h> /* Standard ptrdiff_t and size_t. */ #include <stdint.h> /* Integer types. */ /* Constants, defined as macros. */ #define CFI_VERSION 1 #define CFI_MAX_RANK 15 /* Attributes. */ #define CFI_attribute_pointer 0 #define CFI_attribute_allocatable 1 #define CFI_attribute_other 2 /* Error codes. CFI_INVALID_STRIDE should be defined in the standard because they are useful to the implementation of the functions. */ #define CFI_SUCCESS 0 #define CFI_FAILURE 1 #define CFI_ERROR_BASE_ADDR_NULL 2 #define CFI_ERROR_BASE_ADDR_NOT_NULL 3 #define CFI_INVALID_ELEM_LEN 4 #define CFI_INVALID_RANK 5 #define CFI_INVALID_TYPE 6 #define CFI_INVALID_ATTRIBUTE 7 #define CFI_INVALID_EXTENT 8 #define CFI_INVALID_STRIDE 9 #define CFI_INVALID_DESCRIPTOR 10 #define CFI_ERROR_MEM_ALLOCATION 11 #define CFI_ERROR_OUT_OF_BOUNDS 12 /* CFI type definitions. */ typedef ptrdiff_t CFI_index_t; typedef int8_t CFI_rank_t; typedef int8_t CFI_attribute_t; typedef int16_t CFI_type_t; /* CFI_dim_t. */ typedef struct CFI_dim_t { CFI_index_t lower_bound; CFI_index_t extent; CFI_index_t sm; } CFI_dim_t; /* CFI_cdesc_t, C descriptors are cast to this structure as follows: CFI_CDESC_T(CFI_MAX_RANK) foo; CFI_cdesc_t * bar = (CFI_cdesc_t *) &foo; */ typedef struct CFI_cdesc_t { void *base_addr; size_t elem_len; int version; CFI_rank_t rank; CFI_attribute_t attribute; CFI_type_t type; CFI_dim_t dim[]; } CFI_cdesc_t; /* CFI_CDESC_T with an explicit type. */ #define CFI_CDESC_TYPE_T(r, base_type) \ struct { \ base_type *base_addr; \ size_t elem_len; \ int version; \ CFI_rank_t rank; \ CFI_attribute_t attribute; \ CFI_type_t type; \ CFI_dim_t dim[r]; \ } #define CFI_CDESC_T(r) CFI_CDESC_TYPE_T (r, void) /* CFI function declarations.
*/ extern void *CFI_address (const CFI_cdesc_t *, const CFI_index_t []); extern int CFI_allocate (CFI_cdesc_t *, const CFI_index_t [], const CFI_index_t [], size_t); extern int CFI_deallocate (CFI_cdesc_t *); extern int CFI_establish (CFI_cdesc_t *, void *, CFI_attribute_t, CFI_type_t, size_t, CFI_rank_t, const CFI_index_t []); extern int CFI_is_contiguous (const CFI_cdesc_t *); extern int CFI_section (CFI_cdesc_t *, const CFI_cdesc_t *, const CFI_index_t [], const CFI_index_t [], const CFI_index_t []); extern int CFI_select_part (CFI_cdesc_t *, const CFI_cdesc_t *, size_t, size_t); extern int CFI_setpointer (CFI_cdesc_t *, CFI_cdesc_t *, const CFI_index_t []); /* Types and kind numbers. Allows bitwise and to reveal the intrinsic type of a kind type. It also allows us to find the kind parameter by inverting the bit-shift equation. CFI_type_kind_shift = 8 CFI_intrinsic_type = 0 0 0 0 0 0 0 0 0 0 1 0 CFI_type_kind = 0 0 0 0 0 0 0 0 1 0 0 0 CFI_type_example = CFI_intrinsic_type + (CFI_type_kind << CFI_type_kind_shift) Defining the CFI_type_example. CFI_type_kind = 0 0 0 0 0 0 0 0 1 0 0 0 << CFI_type_kind_shift ------------------------- 1 0 0 0 0 0 0 0 0 0 0 0 + CFI_intrinsic_type = 0 0 0 0 0 0 0 0 0 0 1 0 ------------------------- CFI_type_example = 1 0 0 0 0 0 0 0 0 0 1 0 Finding the intrinsic type with the logical mask. CFI_type_example = 1 0 0 0 0 0 0 0 0 0 1 0 & CFI_type_mask = 0 0 0 0 1 1 1 1 1 1 1 1 ------------------------- CFI_intrinsic_type = 0 0 0 0 0 0 0 0 0 0 1 0 Using the intrinsic type and kind shift to find the kind value of the type. CFI_type_kind = (CFI_type_example - CFI_intrinsic_type) >> CFI_type_kind_shift CFI_type_example = 1 0 0 0 0 0 0 0 0 0 1 0 - CFI_intrinsic_type = 0 0 0 0 0 0 0 0 0 0 1 0 ------------------------- 1 0 0 0 0 0 0 0 0 0 0 0 >> CFI_type_kind_shift ------------------------- CFI_type_kind = 0 0 0 0 0 0 0 0 1 0 0 0 */ #define CFI_type_mask 0xFF #define CFI_type_kind_shift 8 /* Intrinsic types. Their kind number defines their storage size. */ #define CFI_type_Integer 1 #define CFI_type_Logical 2 #define CFI_type_Real 3 #define CFI_type_Complex 4 #define CFI_type_Character 5 /* Types with no kind. */ #define CFI_type_struct 6 #define CFI_type_cptr 7 #define CFI_type_cfunptr 8 #define CFI_type_other -1 /* Types with kind parameter. The kind parameter represents the type's byte size. The exception is kind = 10, which has byte size of 64 but 80 bit precision. Complex variables are double the byte size of their real counterparts. The ucs4_char matches wchar_t if sizeof (wchar_t) == 4. */ #define CFI_type_char (CFI_type_Character + (1 << CFI_type_kind_shift)) #define CFI_type_ucs4_char (CFI_type_Character + (4 << CFI_type_kind_shift)) /* C-Fortran Interoperability types. 
#define CFI_type_signed_char (CFI_type_Integer + (1 << CFI_type_kind_shift))
#define CFI_type_short (CFI_type_Integer + (2 << CFI_type_kind_shift))
#define CFI_type_int (CFI_type_Integer + (4 << CFI_type_kind_shift))
#define CFI_type_long (CFI_type_Integer + (8 << CFI_type_kind_shift))
#define CFI_type_long_long (CFI_type_Integer + (8 << CFI_type_kind_shift))
#define CFI_type_size_t (CFI_type_Integer + (8 << CFI_type_kind_shift))
#define CFI_type_int8_t (CFI_type_Integer + (1 << CFI_type_kind_shift))
#define CFI_type_int16_t (CFI_type_Integer + (2 << CFI_type_kind_shift))
#define CFI_type_int32_t (CFI_type_Integer + (4 << CFI_type_kind_shift))
#define CFI_type_int64_t (CFI_type_Integer + (8 << CFI_type_kind_shift))
#define CFI_type_int_least8_t (CFI_type_Integer + (1 << CFI_type_kind_shift))
#define CFI_type_int_least16_t (CFI_type_Integer + (2 << CFI_type_kind_shift))
#define CFI_type_int_least32_t (CFI_type_Integer + (4 << CFI_type_kind_shift))
#define CFI_type_int_least64_t (CFI_type_Integer + (8 << CFI_type_kind_shift))
#define CFI_type_int_fast8_t (CFI_type_Integer + (1 << CFI_type_kind_shift))
#define CFI_type_int_fast16_t (CFI_type_Integer + (2 << CFI_type_kind_shift))
#define CFI_type_int_fast32_t (CFI_type_Integer + (4 << CFI_type_kind_shift))
#define CFI_type_int_fast64_t (CFI_type_Integer + (8 << CFI_type_kind_shift))
#define CFI_type_intmax_t (CFI_type_Integer + (8 << CFI_type_kind_shift))
#define CFI_type_intptr_t (CFI_type_Integer + (8 << CFI_type_kind_shift))
#define CFI_type_ptrdiff_t (CFI_type_Integer + (8 << CFI_type_kind_shift))
#define CFI_type_int128_t (CFI_type_Integer + (16 << CFI_type_kind_shift))
#define CFI_type_int_least128_t (CFI_type_Integer + (16 << CFI_type_kind_shift))
#define CFI_type_int_fast128_t (CFI_type_Integer + (16 << CFI_type_kind_shift))
#define CFI_type_Bool (CFI_type_Logical + (1 << CFI_type_kind_shift))
#define CFI_type_float (CFI_type_Real + (4 << CFI_type_kind_shift))
#define CFI_type_double (CFI_type_Real + (8 << CFI_type_kind_shift))
#define CFI_type_long_double (CFI_type_Real + (10 << CFI_type_kind_shift))
#define CFI_type_float128 (CFI_type_Real + (16 << CFI_type_kind_shift))
#define CFI_type_float_Complex (CFI_type_Complex + (4 << CFI_type_kind_shift))
#define CFI_type_double_Complex (CFI_type_Complex + (8 << CFI_type_kind_shift))
#define CFI_type_long_double_Complex (CFI_type_Complex + (10 << CFI_type_kind_shift))
#define CFI_type_float128_Complex (CFI_type_Complex + (16 << CFI_type_kind_shift))

#ifdef __cplusplus
}
#endif

#endif /* ISO_FORTRAN_BINDING_H */
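To make the type/kind encoding above concrete, here is a small, hedged C sketch (not part of the package): it establishes a rank-1 descriptor over an existing C array with CFI_establish and then inverts the encoding with CFI_type_mask and CFI_type_kind_shift, following the derivation in the comment block. The array name, extent, and build setup are illustrative assumptions.

/* Sketch only: exercises CFI_establish and the type/kind encoding declared
   in ISO_Fortran_binding.h.  Names and values below are examples.  */
#include <stdio.h>
#include "ISO_Fortran_binding.h"

int main (void)
{
  double data[10];
  CFI_CDESC_T (1) desc;                     /* storage for a rank-1 descriptor */
  CFI_cdesc_t *dv = (CFI_cdesc_t *) &desc;  /* cast as prescribed by the header */
  CFI_index_t extents[1] = { 10 };

  /* Describe the existing, non-allocatable C array.  elem_len is only
     consulted for character/struct/other types, so sizeof (double) is
     passed merely for completeness.  */
  if (CFI_establish (dv, data, CFI_attribute_other, CFI_type_double,
                     sizeof (double), 1, extents) != CFI_SUCCESS)
    return 1;

  /* Invert the encoding: the low byte is the intrinsic type, the rest the kind.  */
  int intrinsic = dv->type & CFI_type_mask;
  int kind = (dv->type - intrinsic) >> CFI_type_kind_shift;

  printf ("type %d (CFI_type_Real), kind %d, rank %d, extent %ld\n",
          intrinsic, kind, (int) dv->rank, (long) dv->dim[0].extent);
  return 0;
}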
caf_single.a

single.o: nvptx PTX object compiled from libgfortran/caf/single.c (the
single-image coarray runtime).  The member is stored as PTX assembly text;
only the symbol-level information survives legibly and is summarised here,
the generated instruction stream itself is not reproduced.

Local helper functions:
  assign_char1_from_char4, caf_runtime_error, caf_internal_error,
  convert_type, copy_data, get_for_ref (constprop/isra clone),
  send_by_ref (constprop/isra clone)

Visible (.visible .func) coarray ABI entry points:
  _gfortran_caf_init, _gfortran_caf_finalize, _gfortran_caf_this_image,
  _gfortran_caf_num_images, _gfortran_caf_register, _gfortran_caf_deregister,
  _gfortran_caf_sync_all, _gfortran_caf_sync_memory, _gfortran_caf_sync_images,
  _gfortran_caf_stop_numeric, _gfortran_caf_stop_str, _gfortran_caf_error_stop,
  _gfortran_caf_error_stop_str, _gfortran_caf_fail_image,
  _gfortran_caf_image_status, _gfortran_caf_failed_images,
  _gfortran_caf_stopped_images, _gfortran_caf_co_broadcast,
  _gfortran_caf_co_sum, _gfortran_caf_co_min, _gfortran_caf_co_max,
  _gfortran_caf_co_reduce, _gfortran_caf_get, _gfortran_caf_send,
  _gfortran_caf_sendget, _gfortran_caf_get_by_ref, _gfortran_caf_send_by_ref,
  _gfortran_caf_sendget_by_ref, _gfortran_caf_atomic_define,
  _gfortran_caf_atomic_ref, _gfortran_caf_atomic_cas, _gfortran_caf_atomic_op,
  _gfortran_caf_event_post, _gfortran_caf_event_wait, _gfortran_caf_event_query,
  _gfortran_caf_lock, _gfortran_caf_unlock, _gfortran_caf_is_present

The object defines the global variable caf_static_list and references external
symbols including the _impure_ptr data pointer, fwrite, fputc, snprintf,
memmove, __fixdfti, __floattisf and __floattidf.  Its embedded string
constants are runtime diagnostics such as "Fortran runtime error:",
"libcaf_single RUNTIME ERROR: Cannot convert type %d kind %d to type %d",
"ERROR STOP", "IMAGE FAILED!", "Variable is not locked" and "Fatal error:
unreachable alternative found.".  Being the single-image library, both
_gfortran_caf_this_image and _gfortran_caf_num_images simply return 1.
The remainder of the member (PTX for _gfortran_caf_register and the
send_by_ref helper) continues below.
%r410 bra $L410; ld.s8 %r1400,[%r364+29]; bra $L277; $L410: .loc 1 2083 12 ld.u64 %r33,[%r32]; .loc 1 2084 18 ld.s8 %r373,[%r32+29]; ld.s8 %r1400,[%r364+29]; $L277: .loc 1 2089 8 mov.u64 %r419,1; {37314001397; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r24; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r419; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r372; call copy_data,%out_arg8,%out_arg9,%out_arg10); } bra $L278; $L273: .loc 1 2093 6 ld.s8 %r433,[%r364+29]; mov.u64 %r431,r33731397; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r24; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r431; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r372; call copy_data,%out_arg8,%out_arg9,%out_arg10); } $L278: .loc 1 2096 4 ld.u64 %r435,[%r361]; add.u64 %r434,%r435,1; st.u64 [%r361],%r434; .loc 1 2097 4 bra $L266; $L271: .loc 1 2093 6 ld.s8 %r306,[%r364+29]; .loc 1 2059 5 ld.u64 %r316,[%r360+16]; .loc 1 2103 11 setp.eq.u64 %r436,%r23,0; @ %r436 bra $L279; mov.u64 %r353,%r362; shl.b64 %r437,%r23,3; add.u64 %r350,%r353,%r437; .loc 1 2043 17 mov.u64 %r47,%r346; $L280: .loc 1 2106 24 ld.u64 %r438,[%r353]; add.u64 %r47,%r47,%r438; .loc 1 2105 5 add.u64 %r353,%r353,8; setp.ne.u64 %r439,%r350,%r353; @ %r439 bra $L280; .loc 1 2107 5 mad.lo.u64 %r451,%r24,%r47,%r366; mov.u64 %r448,r451373.param .u64 %out_arg8; st.param.u64 [%out_arg8],%r24; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r448; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r372; call copy_data,%out_arg8,%out_arg9,%out_arg10); } bra $L281; $L279: .loc 1 2112 3 mov.u64 %r460,r3373.param .u64 %out_arg8; st.param.u64 [%out_arg8],%r24; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r460; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r372; call copy_data,%out_arg8,%out_arg9,%out_arg10); } $L281: .loc 1 2115 11 ld.u64 %r463,[%r361]; add.u64 %r462,%r463,1; st.u64 [%r361],%r462; .loc 1 2116 8 bra $L266; $L270: .loc 1 2120 4 cvta.const.u64 %r464,unreachable464caf_runtime_error268: .loc 1 2124 3 setp.eq.u32 %r466,%r359,1; @ %r466 bra $L282; setp.eq.u32 %r467,%r359,2; @ %r467 bra $L283; setp.ne.u32 %r468,%r359,0; @ %r468 bra $L270; .loc 1 2127 19 ld.u64 %r54,[%r360+32]; .loc 1 2050 25 ld.u64 %r469,[%r360+24]; add.u64 %r247,%r365,%r469; .loc 1 2127 10 setp.le.s64 %r470,%r54,0; @ %r470 bra $L284; .loc 1 2129 7 ld.u64 %r471,[%r247]; setp.ne.u64 %r472,%r471,0; @ %r472 bra $L285; .loc 1 2136 35 ld.u64 %r473,[%r364+16]; st.u64 [%r247+16],%r473; ld.u64 %r474,[%r364+24]; st.u64 [%r247+24],%r474; .loc 1 2137 20 st.u64 [%r247+8],%r471; .loc 1 2139 8 setp.ne.u64 %r476,%r23,0; @ %r476 bra $L352; $L289: .loc 1 2149 34 mov.u64 %r477,0; st.u64 [%r247],%r477; .loc 1 2154 14 add.u64 %r485,%r365,%r54; .loc 1 2152 8 ld.u64 %r487,[%r360+16]; mul.lo.u64 %r486,%r371,%r487; mov.u32 %r479,r47984737_gfortran_caf_register2158 11 setp.ne.u64 %r488,%r372,0; @ %r488 bra $L287; bra $L288; $L352: .loc 1 2139 8 mov.u64 %r349,%r471; .loc 1 2138 19 mov.u64 %r61,1; .loc 1 2139 20 mov.u64 %r62,%r349; .loc 1 2142 40 mov.u64 %r492,%r349; $L286: add.u64 %r348,%r364,%r349; .loc 1 2141 18 ld.u64 %r490,[%r348+56]; add.u64 %r489,%r490,1; .loc 1 2141 16 ld.u64 %r491,[%r348+48]; sub.u64 %r59,%r489,%r491; add.u64 %r347,%r247,%r349; .loc 1 2142 40 st.u64 [%r347+48],%r492; .loc 1 2143 53 add.u64 %r493,%r59,-1; .loc 1 2143 40 st.u64 [%r347+56],%r493; .loc 1 2144 40 st.u64 [%r347+40],%r61; .loc 1 2145 16 mul.lo.u64 %r61,%r61,%r59; .loc 1 2139 41 add.u64 %r62,%r62,1; .loc 1 2139 8 add.u64 %r349,%r349,24; setp.ne.u64 %r494,%r23,%r62; @ %r494 bra $L286; bra $L289; $L288: 
.loc 1 2165 11 ld.u64 %r346,[%r360]; .loc 1 2166 25 ld.u64 %r495,[%r360+24]; add.u64 %r247,%r365,%r495; .loc 1 2161 17 ld.u64 %r496,[%r360+32]; add.u64 %r497,%r365,%r496; ld.u64 %r84,[%r497]; .loc 1 2165 7 setp.ne.u64 %r498,%r346,0; @ ! %r498 bra $L291; bra $L290; $L287: .loc 1 2158 25 ld.u32 %r499,[%r372]; setp.ne.u32 %r500,%r499,0; @ ! %r500 bra $L288; bra $L266; $L290: .loc 1 2165 18 ld.u32 %r501,[%r346+8]; setp.ne.u32 %r502,%r501,0; @ %r502 bra $L291; .loc 1 2166 9 ld.u64 %r247,[%r247]; $L291: .loc 1 2170 4 ld.u64 %r517,[%r84+8]; mov.u64 %r512,362517364366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r512; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r370; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } bra $L266; $L284: .loc 1 2175 2 mov.u64 %r5236247364366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r527; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r370; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } bra $L266; $L282: .loc 1 2181 24 add.u64 %r532,%r360,%r369; ld.u8 %r238,[%r532+24]; .loc 1 2181 10 setp.ne.u32 %r534,%r238,0; @ %r534 bra $L292; .loc 1 2183 4 mov.u64 %r544,3625364366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r544; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r370; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2186 4 bra $L266; $L292: .loc 1 2190 10 setp.ne.u64 %r549,%r369,0; @ %r549 bra $L293; .loc 1 2191 5 ld.u64 %r365,[%r363]; $L293: .loc 1 2192 7 cvt.u16.u32 %r550,%r238; setp.eq.u16 %r551,%r550,4; @ %r551 bra $L294; setp.gt.u16 %r553,%r550,4; @ %r553 bra $L295; setp.eq.u16 %r555,%r550,2; @ %r555 bra $L296; setp.eq.u16 %r557,%r550,3; @ %r557 bra $L297; setp.eq.u16 %r559,%r550,1; @ %r559 bra $L298; bra $L270; $L295: setp.eq.u16 %r561,%r550,5; @ %r561 bra $L299; setp.eq.u16 %r563,%r550,6; @ %r563 bra $L300; bra $L270; $L298: .loc 1 2196 13 shl.b64 %r564,%r370,3; add.u64 %r73,%r362,%r564; .loc 1 2196 23 mov.u64 %r565,0; st.u64 [%r73],%r565; .loc 1 2197 54 add.u64 %r1398,%r369,%r369; add.u64 %r1396,%r1398,%r369; shl.b64 %r569,%r1396,3; add.u64 %r570,%r360,%r569; add.u64 %r571,%r570,56; .loc 1 2197 4 ld.u64 %r572,[%r571]; setp.eq.u64 %r573,%r572,0; @ %r573 bra $L266; .loc 1 2197 16 mov.u64 %r122,%r565; .loc 1 2207 39 add.u64 %r579,%r570,64; .loc 1 2211 3 add.u64 %r1420,%r570,48; add.u64 %r1421,%r363,%r569; add.u64 %r1422,%r1421,48; add.u64 %r1423,%r1421,40; .loc 1 2224 8 add.u64 %r1424,%r370,1; add.u64 %r1425,%r369,1; .loc 1 2228 11 setp.eq.u64 %r1426,%r23,0; .loc 1 2230 10 add.u64 
%r1427,%r370,%r370; add.u64 %r1428,%r1427,%r370; shl.b64 %r1429,%r1428,3; add.u64 %r1430,%r364,%r1429; add.u64 %r1431,%r1430,40; $L310: .loc 1 2207 39 ld.u32 %r74,[%r579]; setp.eq.u32 %r580,%r74,4; @ %r580 bra $L301; setp.gt.s32 %r581,%r74,4; @ %r581 bra $L302; setp.eq.u32 %r582,%r74,1; @ %r582 bra $L303; setp.eq.u32 %r583,%r74,2; @ %r583 bra $L304; bra $L305; $L302: setp.eq.u32 %r584,%r74,8; @ %r584 bra $L306; setp.eq.u32 %r585,%r74,16; @ %r585 bra $L307; bra $L305; $L303: .loc 1 2209 3 ld.u64 %r592,[%r1420]; add.u64 %r593,%r592,%r122; ld.s8 %r594,[%r593]; ld.u64 %r602,[%r1422]; sub.u64 %r601,%r594,%r602; ld.u64 %r609,[%r1423]; mul.lo.u64 %r82,%r601,%r609; bra $L308; $L304: .loc 1 2210 3 ld.u64 %r616,[%r1420]; add.u64 %r617,%r122,%r122; add.u64 %r618,%r616,%r617; ld.s16 %r619,[%r618]; ld.u64 %r627,[%r1422]; sub.u64 %r626,%r619,%r627; ld.u64 %r634,[%r1423]; mul.lo.u64 %r82,%r626,%r634; bra $L308; $L301: .loc 1 2211 3 ld.u64 %r641,[%r1420]; shl.b64 %r642,%r122,2; add.u64 %r643,%r641,%r642; ld.s32 %r644,[%r643]; ld.u64 %r652,[%r1422]; sub.u64 %r651,%r644,%r652; ld.u64 %r659,[%r1423]; mul.lo.u64 %r82,%r651,%r659; bra $L308; $L306: .loc 1 2213 3 ld.u64 %r666,[%r1420]; shl.b64 %r667,%r122,3; add.u64 %r668,%r666,%r667; ld.u64 %r676,[%r668]; ld.u64 %r677,[%r1422]; sub.u64 %r675,%r676,%r677; ld.u64 %r684,[%r1423]; mul.lo.u64 %r82,%r675,%r684; bra $L308; $L307: .loc 1 2216 3 ld.u64 %r691,[%r1420]; shl.b64 %r692,%r122,4; add.u64 %r693,%r691,%r692; ld.u64 %r701,[%r693]; ld.u64 %r702,[%r1422]; sub.u64 %r700,%r701,%r702; ld.u64 %r709,[%r1423]; mul.lo.u64 %r82,%r700,%r709; bra $L308; $L305: .loc 1 2219 5 mov.u64 %r713,0; {2713713stack; call caf_internal_error); } .loc 1 2220 5 bra $L266; $L308: .loc 1 2225 29 ld.u64 %r732,[%r360+16]; .loc 1 2224 8 mad.lo.u64 %r733,%r82,%r732,%r365363613623364366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r1425; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r1424; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2228 11 @ %r1426 bra $L309; .loc 1 2230 7 ld.u64 %r742,[%r73]; ld.u64 %r743,[%r1431]; add.u64 %r741,%r742,%r743; st.u64 [%r73],%r741; $L309: .loc 1 2198 9 add.u64 %r122,%r122,1; .loc 1 2197 4 ld.u64 %r750,[%r571]; setp.lt.u64 %r751,%r122,%r750; @ %r751 bra $L310; bra $L266; $L296: .loc 1 2234 4 add.u64 %r1398,%r369,%r369; add.u64 %r1396,%r1398,%r369; shl.b64 %r755,%r1396,3; add.u64 %r756,%r360,%r755; ld.u64 %r124,[%r756+64]; abs.s64 %r125,%r124; add.u64 %r762,%r363,%r755; ld.u64 %r313,[%r762+56]; ld.u64 %r312,[%r762+48]; setp.le.s64 %r770,%r124,0; @ %r770 bra $L311; add.u64 %r771,%r313,1; sub.u64 %r127,%r771,%r312; bra $L312; $L311: add.u64 %r772,%r312,1; sub.u64 %r127,%r772,%r313; $L312: set.u32.le.s64 %r774,%r127,0; neg.s32 %r775,%r774; set.u32.le.s64 %r777,%r125,0; neg.s32 %r778,%r777; cvt.u16.u32 %r780,%r775; cvt.u16.u32 %r781,%r778; or.b16 %r779,%r780,%r781; cvt.u32.u16 %r782,%r779; cvt.u16.u8 %r783,%r782; setp.ne.u16 %r784,%r783,0; @ %r784 bra $L266; setp.eq.u64 %r785,%r125,1; @ %r785 bra $L313; add.u64 %r786,%r127,-1; div.s64 %r790,%r786,%r125; add.u64 %r127,%r790,1; $L313: .loc 1 2239 17 shl.b64 %r794,%r1396,3; add.u64 %r795,%r363,%r794; .loc 1 2239 15 ld.u64 %r797,[%r795+40]; mul.lo.u64 
%r135,%r124,%r797; .loc 1 2241 13 shl.b64 %r798,%r370,3; add.u64 %r137,%r362,%r798; .loc 1 2241 23 mov.u64 %r799,0; st.u64 [%r137],%r799; .loc 1 2224 8 add.u64 %r309,%r370,1; add.u64 %r324,%r369,1; .loc 1 2242 20 mov.u64 %r145,%r799; .loc 1 2238 21 mov.u64 %r146,%r145; setp.eq.u64 %r1399,%r23,0; .loc 1 2251 10 add.u64 %r1438,%r370,%r370; add.u64 %r1439,%r1438,%r370; shl.b64 %r1440,%r1439,3; add.u64 %r1441,%r364,%r1440; add.u64 %r1442,%r1441,40; $L315: .loc 1 2246 29 ld.u64 %r815,[%r360+16]; .loc 1 2245 8 mad.lo.u64 %r816,%r146,%r815,%r365363613623364366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r324; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r309; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2249 11 @ %r1399 bra $L314; .loc 1 2251 7 ld.u64 %r825,[%r137]; ld.u64 %r826,[%r1442]; add.u64 %r824,%r825,%r826; st.u64 [%r137],%r824; $L314: .loc 1 2243 9 add.u64 %r145,%r145,1; .loc 1 2243 33 add.u64 %r146,%r146,%r135; .loc 1 2242 4 setp.ne.u64 %r827,%r145,%r127; @ %r827 bra $L315; bra $L266; $L297: .loc 1 2255 4 add.u64 %r1398,%r369,%r369; add.u64 %r1396,%r1398,%r369; shl.b64 %r831,%r1396,3; add.u64 %r832,%r360,%r831; ld.u64 %r147,[%r832+64]; abs.s64 %r148,%r147; ld.u64 %r311,[%r832+56]; ld.u64 %r305,[%r832+48]; setp.le.s64 %r846,%r147,0; @ %r846 bra $L316; add.u64 %r847,%r311,1; sub.u64 %r150,%r847,%r305; bra $L317; $L316: add.u64 %r848,%r305,1; sub.u64 %r150,%r848,%r311; $L317: set.u32.le.s64 %r850,%r150,0; neg.s32 %r851,%r850; set.u32.le.s64 %r853,%r148,0; neg.s32 %r854,%r853; cvt.u16.u32 %r856,%r851; cvt.u16.u32 %r857,%r854; or.b16 %r855,%r856,%r857; cvt.u32.u16 %r858,%r855; cvt.u16.u8 %r859,%r858; setp.ne.u16 %r860,%r859,0; @ %r860 bra $L266; setp.eq.u64 %r861,%r148,1; @ %r861 bra $L318; add.u64 %r862,%r150,-1; div.s64 %r866,%r862,%r148; add.u64 %r150,%r866,1; $L318: .loc 1 2260 10 shl.b64 %r870,%r1396,3; add.u64 %r871,%r363,%r870; .loc 1 2259 21 ld.u64 %r873,[%r871+48]; sub.u64 %r170,%r305,%r873; .loc 1 2261 15 ld.u64 %r880,[%r871+40]; mul.lo.u64 %r159,%r147,%r880; .loc 1 2263 13 shl.b64 %r881,%r370,3; add.u64 %r161,%r362,%r881; .loc 1 2263 23 mov.u64 %r882,0; st.u64 [%r161],%r882; .loc 1 2224 8 add.u64 %r325,%r370,1; add.u64 %r310,%r369,1; .loc 1 2264 20 mov.u64 %r171,%r882; setp.eq.u64 %r1399,%r23,0; .loc 1 2272 10 add.u64 %r1433,%r370,%r370; add.u64 %r1434,%r1433,%r370; shl.b64 %r1435,%r1434,3; add.u64 %r1436,%r364,%r1435; add.u64 %r1437,%r1436,40; $L320: .loc 1 2267 29 ld.u64 %r898,[%r360+16]; .loc 1 2266 8 mad.lo.u64 %r899,%r170,%r898,%r365363613623364366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r310; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r325; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2270 11 @ %r1399 bra $L319; .loc 1 2272 7 ld.u64 %r908,[%r161]; ld.u64 %r909,[%r1437]; add.u64 %r907,%r908,%r909; st.u64 [%r161],%r907; $L319: .loc 1 2273 25 
add.u64 %r170,%r170,%r159; .loc 1 2264 47 add.u64 %r171,%r171,1; .loc 1 2264 4 setp.ne.u64 %r910,%r171,%r150; @ %r910 bra $L320; bra $L266; $L294: .loc 1 2280 4 add.u64 %r925,%r369,1; .loc 1 2277 47 add.u64 %r927,%r369,%r369; add.u64 %r928,%r927,%r369; shl.b64 %r929,%r928,3; add.u64 %r930,%r360,%r929; .loc 1 2278 13 add.u64 %r936,%r363,%r929; .loc 1 2278 11 ld.u64 %r939,[%r930+48]; ld.u64 %r940,[%r936+48]; sub.u64 %r938,%r939,%r940; .loc 1 2277 21 ld.u64 %r948,[%r936+40]; mul.lo.u64 %r947,%r938,%r948; .loc 1 2281 29 ld.u64 %r950,[%r360+16]; .loc 1 2280 4 mad.lo.u64 %r951,%r947,%r950,%r365363613623364366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r925; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r370; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2284 4 bra $L266; $L299: .loc 1 2286 4 add.u64 %r1398,%r369,%r369; add.u64 %r1396,%r1398,%r369; shl.b64 %r955,%r1396,3; add.u64 %r956,%r360,%r955; ld.u64 %r182,[%r956+64]; abs.s64 %r183,%r182; .loc 1 2234 4 add.u64 %r962,%r363,%r955; ld.u64 %r49,[%r962+56]; .loc 1 2255 4 ld.u64 %r355,[%r956+48]; .loc 1 2286 4 setp.le.s64 %r970,%r182,0; @ %r970 bra $L321; add.u64 %r971,%r49,1; sub.u64 %r185,%r971,%r355; bra $L322; $L321: add.u64 %r972,%r355,1; sub.u64 %r185,%r972,%r49; $L322: set.u32.le.s64 %r974,%r185,0; neg.s32 %r975,%r974; set.u32.le.s64 %r977,%r183,0; neg.s32 %r978,%r977; cvt.u16.u32 %r980,%r975; cvt.u16.u32 %r981,%r978; or.b16 %r979,%r980,%r981; cvt.u32.u16 %r982,%r979; cvt.u16.u8 %r983,%r982; setp.ne.u16 %r984,%r983,0; @ %r984 bra $L266; setp.eq.u64 %r985,%r183,1; @ %r985 bra $L323; add.u64 %r986,%r185,-1; div.s64 %r990,%r986,%r183; add.u64 %r185,%r990,1; $L323: .loc 1 2291 10 shl.b64 %r994,%r1396,3; add.u64 %r995,%r363,%r994; .loc 1 2290 21 ld.u64 %r997,[%r995+48]; sub.u64 %r204,%r355,%r997; .loc 1 2292 15 ld.u64 %r1004,[%r995+40]; mul.lo.u64 %r194,%r182,%r1004; .loc 1 2294 13 shl.b64 %r1005,%r370,3; add.u64 %r196,%r362,%r1005; .loc 1 2294 23 mov.u64 %r1006,0; st.u64 [%r196],%r1006; .loc 1 2224 8 add.u64 %r326,%r370,1; add.u64 %r328,%r369,1; .loc 1 2295 20 mov.u64 %r205,%r1006; setp.eq.u64 %r1399,%r23,0; .loc 1 2303 10 add.u64 %r1448,%r370,%r370; add.u64 %r1449,%r1448,%r370; shl.b64 %r1450,%r1449,3; add.u64 %r1451,%r364,%r1450; add.u64 %r1452,%r1451,40; $L325: .loc 1 2298 29 ld.u64 %r1022,[%r360+16]; .loc 1 2297 8 mad.lo.u64 %r1023,%r204,%r1022,%r3653636136233641023366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r328; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r326; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2301 11 @ %r1399 bra $L324; .loc 1 2303 7 ld.u64 %r1032,[%r196]; ld.u64 %r1033,[%r1452]; add.u64 %r1031,%r1032,%r1033; st.u64 [%r196],%r1031; $L324: .loc 1 2304 25 add.u64 %r204,%r204,%r194; .loc 1 2295 47 add.u64 %r205,%r205,1; .loc 1 2295 4 setp.ne.u64 %r1034,%r205,%r185; @ %r1034 bra $L325; bra $L266; 
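// $L300 below is one more arm of the dispatch on the array-reference mode that
// begins after $L292: each arm derives an extent and stride from the descriptor,
// then loops, recursing into send_by_ref$constprop$0$isra$0 once per element and
// updating what appears to be the destination offset, before falling back to the
// common $L266 exit.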
$L300: .loc 1 2308 4 add.u64 %r1398,%r369,%r369; add.u64 %r1396,%r1398,%r369; shl.b64 %r1038,%r1396,3; add.u64 %r1039,%r360,%r1038; ld.u64 %r206,[%r1039+64]; abs.s64 %r207,%r206; .loc 1 2234 4 add.u64 %r1045,%r363,%r1038; ld.u64 %r356,[%r1045+48]; .loc 1 2255 4 ld.u64 %r357,[%r1039+56]; .loc 1 2308 4 setp.le.s64 %r1053,%r206,0; @ %r1053 bra $L326; add.u64 %r1054,%r357,1; sub.u64 %r209,%r1054,%r356; bra $L327; $L326: add.u64 %r1055,%r356,1; sub.u64 %r209,%r1055,%r357; $L327: set.u32.le.s64 %r1057,%r209,0; neg.s32 %r1058,%r1057; set.u32.le.s64 %r1060,%r207,0; neg.s32 %r1061,%r1060; cvt.u16.u32 %r1063,%r1058; cvt.u16.u32 %r1064,%r1061; or.b16 %r1062,%r1063,%r1064; cvt.u32.u16 %r1065,%r1062; cvt.u16.u8 %r1066,%r1065; setp.ne.u16 %r1067,%r1066,0; @ %r1067 bra $L266; setp.eq.u64 %r1068,%r207,1; @ %r1068 bra $L328; add.u64 %r1069,%r209,-1; div.s64 %r1073,%r1069,%r207; add.u64 %r209,%r1073,1; $L328: .loc 1 2313 17 shl.b64 %r1077,%r1396,3; add.u64 %r1078,%r363,%r1077; .loc 1 2313 15 ld.u64 %r1080,[%r1078+40]; mul.lo.u64 %r217,%r206,%r1080; .loc 1 2315 13 shl.b64 %r1081,%r370,3; add.u64 %r219,%r362,%r1081; .loc 1 2315 23 mov.u64 %r1082,0; st.u64 [%r219],%r1082; .loc 1 2224 8 add.u64 %r329,%r370,1; add.u64 %r327,%r369,1; .loc 1 2316 20 mov.u64 %r229,%r1082; .loc 1 2312 21 mov.u64 %r228,%r229; setp.eq.u64 %r1399,%r23,0; .loc 1 2324 10 add.u64 %r1443,%r370,%r370; add.u64 %r1444,%r1443,%r370; shl.b64 %r1445,%r1444,3; add.u64 %r1446,%r364,%r1445; add.u64 %r1447,%r1446,40; $L330: .loc 1 2319 29 ld.u64 %r1098,[%r360+16]; .loc 1 2318 8 mad.lo.u64 %r1099,%r228,%r1098,%r3653636136233641099366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r327; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r329; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2322 11 @ %r1399 bra $L329; .loc 1 2324 7 ld.u64 %r1108,[%r219]; ld.u64 %r1109,[%r1447]; add.u64 %r1107,%r1108,%r1109; st.u64 [%r219],%r1107; $L329: .loc 1 2325 25 add.u64 %r228,%r228,%r217; .loc 1 2316 47 add.u64 %r229,%r229,1; .loc 1 2316 4 setp.ne.u64 %r1110,%r229,%r209; @ %r1110 bra $L330; bra $L266; $L283: .loc 1 2101 21 add.u64 %r1111,%r360,%r369; ld.u8 %r238,[%r1111+24]; .loc 1 2333 10 setp.ne.u32 %r1113,%r238,0; @ %r1113 bra $L331; .loc 1 2335 4 mov.u64 %r1123,3621123364366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r1123; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r370; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2338 4 bra $L266; $L331: .loc 1 2340 7 cvt.u16.u32 %r1128,%r238; setp.eq.u16 %r1129,%r1128,3; @ %r1129 bra $L332; setp.gt.u16 %r1131,%r1128,3; @ %r1131 bra $L333; setp.eq.u16 %r1133,%r1128,1; @ %r1133 bra $L334; setp.eq.u16 %r1135,%r1128,2; @ %r1135 bra $L335; bra $L270; $L333: setp.eq.u16 %r1137,%r1128,4; @ %r1137 bra $L336; bra $L270; $L334: .loc 1 2344 13 shl.b64 %r1138,%r370,3; add.u64 %r231,%r362,%r1138; .loc 1 2344 23 
mov.u64 %r1139,0; st.u64 [%r231],%r1139; .loc 1 2345 54 add.u64 %r1398,%r369,%r369; add.u64 %r1396,%r1398,%r369; shl.b64 %r1143,%r1396,3; add.u64 %r1144,%r360,%r1143; add.u64 %r1145,%r1144,56; .loc 1 2345 4 ld.u64 %r1146,[%r1145]; setp.eq.u64 %r1147,%r1146,0; @ %r1147 bra $L266; .loc 1 2345 16 mov.u64 %r221,%r1139; .loc 1 2352 39 add.u64 %r1153,%r1144,64; .loc 1 2356 3 add.u64 %r1406,%r1144,48; .loc 1 2369 8 add.u64 %r1407,%r370,1; add.u64 %r1408,%r369,1; .loc 1 2374 8 add.u64 %r1409,%r370,%r370; add.u64 %r1410,%r1409,%r370; shl.b64 %r1411,%r1410,3; add.u64 %r1412,%r364,%r1411; add.u64 %r1413,%r1412,40; $L344: .loc 1 2352 39 ld.u32 %r232,[%r1153]; setp.eq.u32 %r1154,%r232,4; @ %r1154 bra $L337; setp.gt.s32 %r1155,%r232,4; @ %r1155 bra $L338; setp.eq.u32 %r1156,%r232,1; @ %r1156 bra $L339; setp.eq.u32 %r1157,%r232,2; @ %r1157 bra $L340; bra $L270; $L338: setp.eq.u32 %r1158,%r232,8; @ %r1158 bra $L341; setp.eq.u32 %r1159,%r232,16; @ %r1159 bra $L342; bra $L270; $L339: .loc 1 2354 3 ld.u64 %r1166,[%r1406]; add.u64 %r1167,%r1166,%r221; ld.s8 %r236,[%r1167]; bra $L343; $L340: .loc 1 2355 3 ld.u64 %r1174,[%r1406]; add.u64 %r1175,%r221,%r221; add.u64 %r1176,%r1174,%r1175; ld.s16 %r236,[%r1176]; bra $L343; $L337: .loc 1 2356 3 ld.u64 %r1183,[%r1406]; shl.b64 %r1184,%r221,2; add.u64 %r1185,%r1183,%r1184; ld.s32 %r236,[%r1185]; bra $L343; $L341: .loc 1 2358 3 ld.u64 %r1192,[%r1406]; shl.b64 %r1193,%r221,3; add.u64 %r1194,%r1192,%r1193; ld.u64 %r236,[%r1194]; bra $L343; $L342: .loc 1 2361 3 ld.u64 %r1201,[%r1406]; shl.b64 %r1202,%r221,4; add.u64 %r1203,%r1201,%r1202; ld.u64 %r236,[%r1203]; $L343: .loc 1 2370 29 ld.u64 %r1221,[%r360+16]; .loc 1 2369 8 mad.lo.u64 %r1222,%r236,%r1221,%r365; mov.u64 %r12036136212073641222366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r1408; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r1407; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2374 5 ld.u64 %r1230,[%r231]; ld.u64 %r1231,[%r1413]; add.u64 %r1229,%r1230,%r1231; st.u64 [%r231],%r1229; .loc 1 2346 9 add.u64 %r221,%r221,1; .loc 1 2345 4 ld.u64 %r1238,[%r1145]; setp.lt.u64 %r1239,%r221,%r1238; @ %r1239 bra $L344; bra $L266; $L335: .loc 1 2378 13 shl.b64 %r1240,%r370,3; add.u64 %r262,%r362,%r1240; .loc 1 2378 23 mov.u64 %r1241,0; st.u64 [%r262],%r1241; .loc 1 2380 52 add.u64 %r1398,%r369,%r369; add.u64 %r1396,%r1398,%r369; shl.b64 %r1245,%r1396,3; add.u64 %r1246,%r360,%r1245; add.u64 %r1247,%r1246,56; .loc 1 2379 4 ld.u64 %r1248,[%r1247]; setp.lt.s64 %r1249,%r1248,0; @ %r1249 bra $L266; .loc 1 2224 8 add.u64 %r331,%r370,1; add.u64 %r330,%r369,1; .loc 1 2379 26 mov.u64 %r272,%r1241; setp.eq.u64 %r1399,%r23,0; .loc 1 2389 10 add.u64 %r1401,%r370,%r370; add.u64 %r1402,%r1401,%r370; shl.b64 %r1403,%r1402,3; add.u64 %r1404,%r364,%r1403; add.u64 %r1405,%r1404,40; .loc 1 2381 52 add.u64 %r1282,%r1246,64; $L346: .loc 1 2384 29 ld.u64 %r1265,[%r360+16]; .loc 1 2383 8 mad.lo.u64 %r1266,%r272,%r1265,%r365; mov.u64 %r1253,3636136212533641266366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r330; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r331; 
.param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2387 11 @ %r1399 bra $L345; .loc 1 2389 7 ld.u64 %r1275,[%r262]; ld.u64 %r1276,[%r1405]; add.u64 %r1274,%r1275,%r1276; st.u64 [%r262],%r1274; $L345: .loc 1 2381 26 ld.u64 %r1283,[%r1282]; add.u64 %r272,%r272,%r1283; .loc 1 2379 4 ld.u64 %r1290,[%r1247]; setp.le.s64 %r1291,%r272,%r1290; @ %r1291 bra $L346; bra $L266; $L332: .loc 1 2393 4 add.u64 %r1398,%r369,%r369; add.u64 %r1396,%r1398,%r369; shl.b64 %r1295,%r1396,3; add.u64 %r1296,%r360,%r1295; ld.u64 %r274,[%r1296+64]; abs.s64 %r275,%r274; .loc 1 2255 4 ld.u64 %r358,[%r1296+56]; ld.u64 %r295,[%r1296+48]; .loc 1 2393 4 setp.le.s64 %r1310,%r274,0; @ %r1310 bra $L347; add.u64 %r1311,%r358,1; sub.u64 %r277,%r1311,%r295; bra $L348; $L347: add.u64 %r1312,%r295,1; sub.u64 %r277,%r1312,%r358; $L348: set.u32.le.s64 %r1314,%r277,0; neg.s32 %r1315,%r1314; set.u32.le.s64 %r1317,%r275,0; neg.s32 %r1318,%r1317; cvt.u16.u32 %r1320,%r1315; cvt.u16.u32 %r1321,%r1318; or.b16 %r1319,%r1320,%r1321; cvt.u32.u16 %r1322,%r1319; cvt.u16.u8 %r1323,%r1322; setp.ne.u16 %r1324,%r1323,0; @ %r1324 bra $L266; setp.eq.u64 %r1325,%r275,1; @ %r1325 bra $L349; add.u64 %r1326,%r277,-1; div.s64 %r1330,%r1326,%r275; add.u64 %r277,%r1330,1; $L349: .loc 1 2398 13 shl.b64 %r1331,%r370,3; add.u64 %r285,%r362,%r1331; .loc 1 2398 23 mov.u64 %r1332,0; st.u64 [%r285],%r1332; .loc 1 2224 8 add.u64 %r333,%r370,1; add.u64 %r332,%r369,1; .loc 1 2399 20 mov.u64 %r297,%r1332; setp.eq.u64 %r1399,%r23,0; .loc 1 2407 10 add.u64 %r1415,%r370,%r370; add.u64 %r1416,%r1415,%r370; shl.b64 %r1417,%r1416,3; add.u64 %r1418,%r364,%r1417; add.u64 %r1419,%r1418,40; .loc 1 2408 51 shl.b64 %r1363,%r1396,3; add.u64 %r1364,%r360,%r1363; add.u64 %r1365,%r1364,64; $L351: .loc 1 2402 29 ld.u64 %r1348,[%r360+16]; .loc 1 2401 8 mad.lo.u64 %r1349,%r295,%r1348,%r365; mov.u64 %r1336,3636136213336449366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r332; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r333; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2405 11 @ %r1399 bra $L350; .loc 1 2407 7 ld.u64 %r1358,[%r285]; ld.u64 %r1359,[%r1419]; add.u64 %r1357,%r1358,%r1359; st.u64 [%r285],%r1357; $L350: .loc 1 2408 25 ld.u64 %r1366,[%r1365]; add.u64 %r295,%r295,%r1366; .loc 1 2399 47 add.u64 %r297,%r297,1; .loc 1 2399 4 setp.ne.u64 %r1367,%r297,%r277; @ %r1367 bra $L351; bra $L266; $L336: .loc 1 2413 4 add.u64 %r1382,%r369,1; .loc 1 2412 21 add.u64 %r1384,%r369,%r369; add.u64 %r1385,%r1384,%r369; shl.b64 %r1386,%r1385,3; add.u64 %r1387,%r360,%r1386; .loc 1 2414 32 ld.u64 %r1390,[%r360+16]; ld.u64 %r1391,[%r1387+48]; .loc 1 2413 4 mad.lo.u64 %r1392,%r1390,%r1391,%r365; mov.u64 %r1371,363613621336492366; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r367; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r368; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r1382; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r370; .param .u64 %out_arg12; st.param.u64 
[%out_arg12],%r371; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r372; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r373; call send_by_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14); } .loc 1 2417 4 bra $L266; $L409: .loc 1 2124 3 setp.eq.u32 %r1393,%r359,1; @ %r1393 bra $L292; setp.eq.u32 %r1394,%r359,2; @ ! %r1394 bra $L270; bra $L331; $L285: .loc 1 2161 17 add.u64 %r1395,%r365,%r54; ld.u64 %r84,[%r1395]; bra $L290; $L266: .loc 1 2_gfortran_caf_deregister .visible .func _gfortran_caf_deregister16.reg .pred %r40; .reg .u32 %r41; mov.u64 %r26,%ar0; mov.u32 %r27,%ar1; mov.u64 %r28,%ar2; .loc 1 201 22 ld.u64 %r25,[%r26]; .loc 1 203 6 ld.u8 %r31,[%r25+16]; setp.eq.u16 %r32,%r31,0; @ %r32 bra $L412; .loc 1 203 50 ld.u64 %r23,[%r25]; .loc 1 203 356 setp.eq.u32 %r35,%r27,1; @ %r35 bra $L413; .loc 1 208 7 ld.u64 %r36,[%r26209 14 mov.u64 %r37,0; st.u64 [%r26],%r37; bra $L414; $L413: .loc 1 213 28 mov.u64 %r38,0; st.u64 [%r25],%r38; .loc 1 214 35 cvt.u32.u64 %r39,%r38; st.u8 [%r25+16],%r39; $L414: .loc 1 217 6 setp.eq.u64 %r40,%r28,0; @ %r40 bra $L411; .loc 1 218 11 mov.u32 %r41,0; st.u32 [%r28],%r_gfortran_caf_sync_all .visible .func _gfortran_caf_sync_allr22; .reg .pred.loc 1 227 3 .loc 1 228 6 setp.eq.u64 %r25,%r22,0; @ %r25 bra $L425; .loc 1 229 11 mov.u32 %r26,0; st.u32 [%r22],%r26; $L425_gfortran_caf_sync_memory .visible .func _gfortran_caf_sync_memoryr22; .reg .pred %r25; .reg .u32 %r26; $LFB50: mov.u64 %r22,%ar0; setp.eq.u64 %r25,%r22,0; @ %r25 bra $L430; mov.u32 %r26,0; st.u32 [%r22],%r26; $L430_gfortran_caf_sync_images .visible .func _gfortran_caf_sync_imagespred %r27; .reg .u32 %r28; mov.u64 %r24,%ar2; .loc 1 263 3 .loc 1 264 6 setp.eq.u64 %r27,%r24,0; @ %r27 bra $L435; .loc 1 265 11 mov.u32 %r28,0; st.u32 [%r24],%r28_gfortran_caf_stop_numeric .visible .func _gfortran_caf_stop_numericpredmov.u32 %r24,%ar0; mov.u32 %r25,%ar1; .loc 1 272 6 setp.ne.u32 %r26,%r25,0; @ %r26 bra $L441; .loc 1 273 14 ld.global.u64 %r29,[_impure_ptr]; .loc 1 273 5 ld.u64 %r30,[%r29+24];431,[%value_in]; } $L441: .loc 1 274 3 mov.u32 %r33,0_gfortran_caf_stop_str .visible .func _gfortran_caf_stop_strpredpredpredmov.u32 %r35,%ar2; .loc 1 281 6 setp.ne.u32 %r36,%r35,0; @ %r36 bra $L443; .loc 1 283 23 ld.global.u64 %r41,[_impure_ptr]; .loc 1 283 7 ld.u64 %r42,[%r41+24]; mov.u64 %r39,5; mov.u64 %r38,1; cvta.const.u64 %r37ld.param.u64 %r43,[%value_in]; } .loc 1 284 13 setp.eq.u64 %r44,%r34,0; @ %r44 bra $L444; mov.u64 %r32,%r33; add.u64 %r31,%r32,%r34; $L445: .loc 1 285 22 ld.global.u64 %r47,[_impure_ptr]; .loc 1 285 2 ld.u64 %r48,[%r47+24]; ld.s8 %r49,[%r32]call (%value_in),fputc.loc 1 284 13 add.u64 %r32,%r32,1; setp.ne.u64 %r51,%r31,%r32; @ %r51 bra $L445; $L444: .loc 1 286 20 ld.global.u64 %r54,[_impure_ptr]; .loc 1 286 7 ld.u64 %r55,[%r54+24];call (%value_in),fputc$L443: .loc 1 288 3 mov.u32 %r57,0_gfortran_caf_error_stop_str .visible .func _gfortran_caf_error_stop_strpredpredpredmov.u32 %r35,%ar2; .loc 1 295 6 setp.ne.u32 %r36,%r35,0; @ %r36 bra $L451; .loc 1 297 29 ld.global.u64 %r41,[_impure_ptr]; .loc 1 297 7 ld.u64 %r42,[%r41+24]; mov.u64 %r39,11; mov.u64 %r38,1; cvta.const.u64 %r37,$LC6ld.param.u64 %r43,[%value_in]; } .loc 1 298 13 setp.eq.u64 %r44,%r34,0; @ %r44 bra $L452; mov.u64 %r32,%r33; add.u64 %r31,%r32,%r34; $L453: .loc 1 299 22 ld.global.u64 %r47,[_impure_ptr]; .loc 1 299 2 ld.u64 %r48,[%r47+24]; ld.s8 %r49,[%r32]call (%value_in),fputc.loc 1 298 13 add.u64 %r32,%r32,1; setp.ne.u64 %r51,%r31,%r32; @ %r51 bra $L453; $L452: .loc 1 300 20 
ld.global.u64 %r54,[_impure_ptr]; .loc 1 300 7 ld.u64 %r55,[%r54+24];call (%value_in),fputc$L451: .loc 1 302 3 mov.u32 %r57,1_gfortran_caf_fail_image .visible .func _gfortran_caf_fail_image {loc 1 312 29 ld.global.u64 %r28,[_impure_ptr]; .loc 1 312 3 ld.u64 %r29,[%r28+24]; mov.u64 %r26,14; mov.u64 %r25,1; cvta.const.u64 %r24,$LC7313 3 mov.u32 %r31,0_gfortran_caf_image_statu_gfortran_caf_image_status (r23; .reg .pred %r25; mov.u32 %r23,%ar0; .loc 1 323 6 setp.eq.u32 %r25,%r23,1; .loc 1 326 12 selp.u32 %value,0,6000,%r25; .loc 1 32_gfortran_caf_failed_images .visible .func _gfortran_caf_failed_imagespredmov.u64 %r24,%ar0; mov.u64 %r26,%ar2; .loc 1 338 41 setp.eq.u64 %r27,%r26,0; @ %r27 bra $L464; .loc 1 342 25 ld.s32 %r23,[%r26]; bra $L463; $L464: mov.u64 %r23,4; $L463: .loc 1 340 20 mov.u64 %r28,0; st.u64 [%r24],%r28; .loc 1 341 21 mov.u32 %r29,1; st.u8 [%r24+29],%r29; .loc 1 3345 29 st.u64 [%r24+48],%r28; .loc 1 346 25 mov.u64 %r31,-1; st.u64 [%r24+56],%r31; .loc 1 347 25 mov.u64 %r32,1; st.u64 [%r24+40],%r32; .loc 1 348 17 st.u64 [%r24+8],%r28;_gfortran_caf_stopped_images .visible .func _gfortran_caf_stopped_imagespred$LFB56: mov.u64 %r24,%ar0; mov.u64 %r26,%ar2; setp.eq.u64 %r27,%r26,0; @ %r27 bra $L467; ld.s32 %r22,[%r26]; bra $L466; $L467: mov.u64 %r22,4; $L466: mov.u64 %r28,0; st.u64 [%r24],%r28; mov.u32 %r29,1; st.u8 [%r24+29],%r29; st.u64 [%r24+16],%r22; st.u64 [%r24+48],%r28; mov.u64 %r31,-1; st.u64 [%r24+56],%r31; mov.u64 %r32,1; st.u64 [%r24+40],%r32; st.u64 [%r24+8],%r28;_gfortran_caf_error_stop .visible .func _gfortran_caf_error_stoppredmov.u32 %r24,%ar0; mov.u32 %r25,%ar1; .loc 1 377 6 setp.ne.u32 %r26,%r25,0; @ %r26 bra $L469; .loc 1 378 14 ld.global.u64 %r29,[_impure_ptr]; .loc 1 378 5 ld.u64 %r30,[%r29+24]; st.u32 [%stack],%r24; cvta.const.u64 %r28,$LC31,[%value_in]; } $L469: .loc 1 379 3_gfortran_caf_co_broadcast .visible .func _gfortran_caf_co_broadcastr24; .reg .pred %r27; .reg .u32 %r28; mov.u64 %r24,%ar2; .loc 1 389 6 setp.eq.u64 %r27,%r24,0; @ %r27 bra $L470; .loc 1 390 11 mov.u32 %r28,0; st.u32 [%r24],%r28; $L470:_gfortran_caf_co_sum .visible .func _gfortran_caf_co_sumr24; .reg .pred %r27; .reg .u32 %r28; $LFB52: mov.u64 %r24,%ar2; setp.eq.u64 %r27,%r24,0; @ %r27 bra $L475; mov.u32 %r28,0; st.u32 [%r24],%r28; $L475_gfortran_caf_co_min .visible .func _gfortran_caf_co_mpred %r28; .reg .u32 %r29; mov.u64 %r24,%ar2480; .loc 1 411 11 mov.u32 %r29,0; st.u32 [%r24],%r29; $L480: .loc 1 41_gfortran_caf_co_max .visible .func _gfortran_caf_co_maxpred %r28; .reg .u32 %r29; $LFB54: mov.u64 %r24,%ar2;485; mov.u32 %r29,0; st.u32 [%r24],%r29; $L485_gfortran_caf_co_reduce .visible .func _gfortran_caf_co_reducer26; .reg .pred %r30; .reg .u32 %r31; mov.u64 %r26,%ar4; .loc 1 436 7 setp.eq.u64 %r30,%r26,0; @ %r30 bra $L490; .loc 1 437 12 mov.u32 %r31,0; st.u32 [%r26],%r31; $L490: .loc 1 438_gfortran_caf_get .visible .func _gfortran_caf_getu64u64 %r249; .reg .u64 %r251; .reg .u64u64 %r272; .reg .u64u64u64u16 %r337; .reg .predu32pred %r374; .reg .pred %r376; .reg .predpredu64 %r387; .reg .u32 %r388; .reg .pred %r389; .reg .u64 %r398; .reg .u64pred %r408; .reg .predpred %r413; .reg .u32predpred16u64 %r474; .reg .u64 %r475; .reg .u64 %r477; .reg .pred.reg .u64 %r492; .reg .u16 %r494; .reg .u16 %r495; .reg .u32 %r496; .reg .u32 %r497; .reg .u16 %r498; .reg .u16 %r499; .reg .u16 %r500; .reg .u32 %r501; .reg .u16 %r502; .reg .predu32 %r512; .reg .u32 %r513; .reg .u32 %r514; .reg .u16 %r518; .reg .u16 %r519; .reg .u32predu64 %r565; .reg .pred %r567; .reg .u64 %r570; .reg .u64 %r571; .reg .u64 
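// The short functions above (_gfortran_caf_sync_all, _gfortran_caf_sync_memory,
// _gfortran_caf_sync_images, _gfortran_caf_co_broadcast, _gfortran_caf_co_sum,
// _gfortran_caf_co_min, _gfortran_caf_co_max, _gfortran_caf_co_reduce) are
// single-image stubs: each only checks its optional stat pointer for NULL and,
// if present, stores 0 through it. The stop/error-stop string variants also
// print their message one character at a time via fputc on a stream fetched
// through newlib's _impure_ptr. The .reg declarations continuing below open the
// body of _gfortran_caf_get.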
%r572; .reg .u64 %r573; .reg .u64 %r574; .reg .u64 %r576; .reg .u64 %r577; .reg .u64 %r579; .reg .u16 %r580; .reg .pred %r581; .reg .u32 %r582; .reg .u64 %r583; .reg .u64 %r585; .reg .u64 %r586; .reg .u64 %r587; .reg .u64u16 %r616; .reg .u16 %r617; .reg .u32 %r618; .reg .u16 %r619; .reg .predu32 %r631; .reg .u16 %r635; .reg .u16 %r636; .reg .u32 %r638; .reg .u16 %r639; .reg .pred %r640; .reg .u32 %r645; .reg .u64 %r647; .reg .u64 %r648; .reg .u64 %r649; .reg .pred %r650; .reg .u32 %r651; .reg .pred %r652; .reg .pred %r654; .reg .u32 %r661; .reg .predpred %r673; .reg .u32 %r675; .reg .u64 %r676; .reg .u64 %r677; .reg .pred %r678; .reg .pred %r680; .reg .pred %r681; .reg .u32 %r682; .reg .u32 %r683; .reg .u16 %r684; .reg .u64 %r685; .reg .u64 %r686; .reg .pred %r687; .reg .pred %r688; .reg .u32 %r689; .reg .u32 %r690; .reg .u16 %r691; mov.u64 %r310,%ar0; mov.u64 %r311,%ar1; mov.u64 %r313,%ar3; mov.u64 %r315,%ar5; mov.u32 %r316,%ar6; mov.u32 %r317,%ar7; mov.u32 %r318,%ar8; mov.u64 %r319,%ar9; .loc 1 740 14 ld.s8 %r22,[%r315+28]; .loc 1 741 10 ld.u64 %r149,[%r313+16]; .loc 1 742 10 ld.u64 %r150,[%r315+16]; .loc 1 744 6 setp.eq.u64 %r320,%r319,0; @ %r320 bra $L496; .loc 1 745 11 mov.u32 %r321,0; st.u32 [%r319],%r321; $L496: .loc 1 747 6 setp.eq.u32 %r322,%r22,0; @ %r322 bra $L497; .loc 1 778 3 setp.gt.s32 %r323,%r22,0; @ %r323 bra $L498; .loc 1 777 8 mov.u64 %r170,1; add.u32 %r675,%r22,-1; bra $L499; $L497: .loc 1 749 13 ld.u64 %r324,[%r310]; add.u64 %r171,%r324,%r311; .loc 1 750 11 ld.s8 %r24,[%r315+29]; .loc 1 750 41 ld.s8 %r25,[%r313+29]; .loc 1 753 4 ld.u64 %r304,[%r315]; .loc 1 751 4 set.u32.eq.u32 %r326,%r317,%r316; neg.s32 %r327,%r326; .loc 1 750 10 cvt.u16.u32 %r329,%r24; cvt.u16.u32 %r330,%r25; set.u32.eq.u16 %r331,%r329,%r330; neg.s32 %r332,%r331; .loc 1 751 4 cvt.u16.u32 %r334,%r327; cvt.u16.u32 %r335,%r332; and.b16 %r333,%r334,%r335; cvt.u32.u16 %r336,%r333; cvt.u16.u8 %r337,%r336; setp.eq.u16 %r338,%r337,0; @ %r338 bra $L500; .loc 1 753 4 min.u64 %r343,%r149,%r15ld.param.u64 %r344,[%value_in]; } .loc 1 755 7 ld.s8 %r347,[%r315+29]; cvt.u16.u32 %r346,%r347; set.u32.eq.u16 %r348,%r346,6; neg.s32 %r349,%r348; .loc 1 755 51 set.u32.lt.u64 %r351,%r149,%r150495; .loc 1 757 11 setp.ne.u32 %r359,%r317,1; @ %r359 bra $L502; .loc 1 758 3 sub.u64 %r366,%r150,%r149; ld.u64 %r368,[%r315]; add.u64 %r367,%r368,%r149;bra $L495; $L502: .loc 1 761 10 shr.u64 %r172,%r149,2; .loc 1 761 36 shr.u64 %r147,%r150,2; .loc 1 761 3 setp.le.u64 %r370,%r147,%r172; @ %r370 bra $L495; .loc 1 762 17 ld.u64 %r31,[%r315]; shl.b64 %r371,%r172,2; add.u64 %r290,%r31,%r371; shl.b64 %r372,%r147,2; add.u64 %r292,%r31,%r372; .loc 1 762 48 mov.u32 %r373,32; $L504: st.u32 [%r290],%r373; .loc 1 761 3 add.u64 %r290,%r290,4; setp.ne.u64 %r374,%r292,%r290; @ %r374 bra $L504; bra $L495; $L500: .loc 1 765 15 setp.ne.u16 %r376,%r329,6; @ %r376 bra $L505; .loc 1 765 59 setp.ne.u32 %r377,%r317,1; @ %r377 bra $L506; .loc 1 766 2304171; call assign_char1_from_char4bra $L495; $L506: .loc 1 446 15 shr.u64 %r173,%r150,2; .loc 1 446 5 min.u64 %r174,%r149,%r173; .loc 1 447 3 setp.eq.u64 %r382,%r174,0; @ %r382 bra $L507; mov.u64 %r282,%r171; mov.u64 %r281,%r304; add.u64 %r278,%r282,%r174; $L508: .loc 1 448 14 ld.u8 %r383,[%r282]; st.u32 [%r281],%r383; .loc 1 447 3 add.u64 %r282,%r282,1; add.u64 %r281,%r281,4; setp.ne.u64 %r384,%r278,%r282; @ %r384 bra $L508; $L507: .loc 1 449 3 setp.le.u64 %r385,%r173,%r174; @ %r385 bra $L495; shl.b64 %r386,%r174,2; add.u64 %r291,%r304,%r386; shl.b64 %r387,%r173,2; add.u64 %r283,%r304,%r387; .loc 1 450 12 
mov.u32 %r388,32; $L509: st.u32 [%r291],%r388; .loc 1 449 3 add.u64 %r291,%r291,4; setp.ne.u64 %r389,%r283,%r291; @ %r389 bra $L509; bra $L495; $L505: .loc 1 772 2317convert_typebra $L495; $L498: add.u64 %r197,%r315,48; add.u64 %r193,%r315,56; add.u32 %r675,%r22,-1; cvt.u64.u32 %r398,%r675; add.u64 %r400,%r398,%r398; add.u64 %r401,%r400,%r398; shl.b64 %r402,%r401,3; add.u64 %r403,%r315,72; add.u64 %r184,%r402,%r403; .loc 1 777 8 mov.u64 %r170,1; $L510: .loc 1 780 50 ld.u64 %r405,[%r193]; ld.u64 %r406,[%r197]; sub.u64 %r404,%r405,%r406; .loc 1 780 17 add.u64 %r169,%r404,1; .loc 1 783 12 max.s64 %r407,%r169,0; mul.lo.u64 %r170,%r170,%r407; .loc 1 778 3 add.u64 %r197,%r197,24; add.u64 %r193,%r193,24; setp.ne.u64 %r408,%r184,%r197; @ %r408 bra $L510; .loc 1 786 6 setp.eq.u64 %r409,%r170,0; @ %r409 bra $L495; $L499: .loc 1 837 8 set.u32.eq.u32 %r411,%r317,%r316; neg.s32 %r412,%r411; cvt.u32.u32 %r410,%r412; cvt.u32.u8 %r305,%r410; .loc 1 847 14 shr.u64 %r306,%r149,2; .loc 1 446 15 shr.u64 %r307,%r150,2; .loc 1 446 5 min.u64 %r308,%r149,%r307; .loc 1 789 6 setp.ne.u32 %r413,%r318,0; @ %r413 bra $L511; add.u32 %r414,%r22,-2; cvt.u64.u32 %r415,%r414; add.u64 %r417,%r415,%r415; add.u64 %r418,%r417,%r415; shl.b64 %r419,%r418,3; add.u64 %r420,%r315,64; add.u64 %r199,%r419,%r420; shl.b64 %r232,%r308,2; shl.b64 %r229,%r307,2; shl.b64 %r238,%r306,2; .loc 1 741 10 mov.u64 %r298,%r149; .loc 1 742 10 mov.u64 %r297,%r150; .loc 1 865 10 mov.u64 %r155,0; cvt.s64.s32 %r677,%r675; setp.gt.s32 %r678,%r675,0; .loc 1 879 59 add.u64 %r570,%r677,%r677; add.u64 %r571,%r570,%r677; shl.b64 %r572,%r571,3; add.u64 %r573,%r315,%r572; add.u64 %r574,%r573,40; add.u64 %r686,%r313,64; .loc 1 894 57 add.u64 %r602,%r313,%r572; add.u64 %r603,%r602,40; .loc 1 899 4 cvt.u16.u32 %r617,%r305; .loc 1 911 59 setp.ne.u32 %r687,%r317,1; .loc 1 447 3 setp.eq.u64 %r688,%r308,0; .loc 1 902 51 set.u32.lt.u64 %r689,%r149,%r150; neg.s32 %r690,%r689; cvt.u16.u32 %r691,%r690; bra $L512; $L511: .loc 1 792 19 mul.lo.u64 %r422,%r149,%r1742223,[%value_in]; } mov.u64 %r251,%r423; mov.u64 %r240,%r423; .loc 1 741 10 mov.u64 %r301,%r149; .loc 1 795 14 mov.u64 %r167,0; cvt.s64.s32 %r677,%r675; add.u64 %r685,%r313,64; .loc 1 809 54 add.u64 %r442,%r677,%r677; add.u64 %r443,%r442,%r677; shl.b64 %r444,%r443,3; add.u64 %r445,%r313,%r444; add.u64 %r446,%r445,40; bra $L513; $L517: add.u64 %r249,%r313,40; add.u32 %r425,%r200,-2; cvt.u64.u32 %r426,%r425; add.u64 %r428,%r426,%r426; add.u64 %r429,%r428,%r426; shl.b64 %r430,%r429,3; add.u64 %r241,%r430,%r685; .loc 1 799 14 mov.u64 %r45,1; .loc 1 798 14 mov.u64 %r48,%r45; .loc 1 797 14 mov.u64 %r168,0; $L514: .loc 1 802 40 mul.lo.u64 %r432,%r45,%r48; .loc 1 802 31 div.u64 %r41,%r167,%r432; .loc 1 804 9 ld.u64 %r434,[%r249+16]; ld.u64 %r435,[%r249+8]; sub.u64 %r433,%r434,%r435; .loc 1 804 35 add.u64 %r45,%r433,1; .loc 1 805 19 ld.u64 %r48,[%r249]; .loc 1 803 7 rem.u64 %r437,%r41,%r45; .loc 1 802 24 mad.lo.u64 %r52,%r437,%r48,%r168; mov.u64 %r168,%r52; .loc 1 800 4 add.u64 %r249,%r249,24; setp.ne.u64 %r439,%r241,%r249; @ %r439 bra $L514; .loc 1 809 26 div.u64 %r302,%r167,%r45; $L518: .loc 1 809 36 ld.u64 %r448,[%r446]; .loc 1 809 20 mad.lo.u64 %r449,%r302,%r448,%r52; .loc 1 811 6 mad.lo.u64 %r451,%r449,%r301,%r311; .loc 1 810 10 ld.u64 %r453,[%r310]; add.u64 %r452,%r453,%r451; .loc 1 812 112449460,[%value_in]; } .loc 1 795 30 add.u64 %r167,%r167,1; .loc 1 795 7 add.u64 %r240,%r240,%r149; setp.lt.u64 %r462,%r167,%r170; @ %r462 bra $L515; add.u32 %r463,%r22,-2; cvt.u64.u32 %r464,%r463; add.u64 %r466,%r464,%r464; add.u64 
%r467,%r466,%r464; shl.b64 %r468,%r467,3; add.u64 %r469,%r315,64; add.u64 %r252,%r468,%r469; shl.b64 %r270,%r308,2; shl.b64 %r267,%r307,2; shl.b64 %r276,%r306,2; .loc 1 817 14 mov.u64 %r164,0; setp.gt.s32 %r678,%r675,0; .loc 1 831 56 add.u64 %r482,%r677,%r677; add.u64 %r483,%r482,%r677; shl.b64 %r484,%r483,3; add.u64 %r485,%r315,%r484; add.u64 %r486,%r485,40; .loc 1 837 8 cvt.u16.u32 %r500,%r305; .loc 1 851 56 setp.ne.u32 %r680,%r317,1; .loc 1 447 3 setp.eq.u64 %r681,%r308,0; .loc 1 841 12 set.u32.lt.u64 %r682,%r149,%r150; neg.s32 %r683,%r682; cvt.u16.u32 %r684,%r683; bra $L516; $L515: .loc 1 811 24 ld.u64 %r301,[%r313+16]; $L513: .loc 1 800 20 ld.s8 %r200,[%r313+28]; .loc 1 800 4 cvt.u16.u32 %r470,%r200; setp.gt.s16 %r471,%r470,1; @ %r471 bra $L517; mov.u64 %r302,%r167; mov.u64 %r52,0; bra $L518; $L532: add.u64 %r260,%r315,40; .loc 1 821 14 mov.u64 %r67,1; .loc 1 820 14 mov.u64 %r70,%r67; .loc 1 819 14 mov.u64 %r165,0; $L519: .loc 1 824 41 mul.lo.u64 %r472,%r70,%r67; .loc 1 824 32 div.u64 %r63,%r164,%r472; .loc 1 826 11 ld.u64 %r474,[%r260+16]; ld.u64 %r475,[%r260+8]; sub.u64 %r473,%r474,%r475; .loc 1 826 38 add.u64 %r67,%r473,1; .loc 1 827 21 ld.u64 %r70,[%r260]; .loc 1 825 8 rem.u64 %r477,%r63,%r67; .loc 1 824 25 mad.lo.u64 %r74,%r477,%r70,%r165; mov.u64 %r165,%r74; .loc 1 822 4 add.u64 %r260,%r260,24; setp.ne.u64 %r479,%r252,%r260; @ %r479 bra $L519; .loc 1 831 27 div.u64 %r303,%r164,%r67; $L533: .loc 1 831 37 ld.u64 %r488,[%r486]; .loc 1 831 21 mad.lo.u64 %r489,%r303,%r488,%r74; .loc 1 833 27 ld.u64 %r491,[%r315+16]; .loc 1 832 10 ld.u64 %r492,[%r315]; mad.lo.u64 %r161,%r489,%r491,%r492; .loc 1 836 8 ld.s8 %r82,[%r315+29]; .loc 1 836 38 ld.s8 %r83,[%r313+29]; .loc 1 836 7 cvt.u16.u32 %r494,%r82; cvt.u16.u32 %r495,%r83; set.u32.eq.u16 %r496,%r494,%r495; neg.s32 %r497,%r496; .loc 1 837 8 cvt.u16.u32 %r499,%r497; and.b16 %r498,%r499,%r500; cvt.u32.u16 %r501,%r498; cvt.u16.u8 %r502,%r501; setp.eq.u16 %r503,%r502,0; @ %r503 bra $L520; .loc 1 839 8 min.u64 %r508,%r149,%r15(%value_in),memmove9,[%value_in]; } .loc 1 840 11 ld.s8 %r512,[%r315+29]; cvt.u16.u32 %r511,%r512; set.u32.eq.u16 %r513,%r511,6; neg.s32 %r514,%r513; .loc 1 841 12 cvt.u16.u32 %r519,%r514; and.b16 %r518,%r519,%r684; cvt.u32.u16 %r521,%r518; cvt.u16.u8 %r522,%r521; setp.ne.u16 %r523,%r522,0; @ ! 
%r523 bra $L522; .loc 1 843 8 @ %r680 bra $L523; .loc 1 844 7 sub.u64 %r530,%r150,%r149; add.u64 %r531,%r161,%r149;bra $L522; $L523: .loc 1 847 7 setp.ge.u64 %r533,%r306,%r307; @ %r533 bra $L522; add.u64 %r277,%r161,%r276; add.u64 %r272,%r161,%r267; .loc 1 848 29 mov.u32 %r534,32; $L525: st.u32 [%r277],%r534; .loc 1 847 7 add.u64 %r277,%r277,4; setp.ne.u64 %r535,%r272,%r277; @ %r535 bra $L525; bra $L522; $L520: .loc 1 851 12 setp.ne.u16 %r537,%r494,6; @ %r537 bra $L526; .loc 1 851 56 @ %r680 bra $L527; .loc 1 852 6251; call assign_char1_from_char4bra $L522; $L527: .loc 1 447 3 @ %r681 bra $L528; mov.u64 %r263,%r161; mov.u64 %r264,%r251; add.u64 %r676,%r251,%r308; $L529: .loc 1 448 14 ld.u8 %r544,[%r264]; st.u32 [%r263],%r544; .loc 1 447 3 add.u64 %r264,%r264,1; add.u64 %r263,%r263,4; setp.ne.u64 %r546,%r676,%r264; @ %r546 bra $L529; $L528: .loc 1 449 3 setp.le.u64 %r547,%r307,%r308; @ %r547 bra $L522; add.u64 %r271,%r161,%r270; add.u64 %r265,%r161,%r267; .loc 1 450 12 mov.u32 %r548,32; $L530: st.u32 [%r271],%r548; .loc 1 449 3 add.u64 %r271,%r271,4; setp.ne.u64 %r549,%r265,%r271; @ %r549 bra $L530; bra $L522; $L526:317convert_type$L522: .loc 1 817 30 add.u64 %r164,%r164,1; .loc 1 817 7 add.u64 %r251,%r251,%r149; setp.ge.u64 %r557,%r164,%r170; @ %r557 bra $L531; $L516: .loc 1 822 4 @ %r678 bra $L532; mov.u64 %r303,%r164; mov.u64 %r74,0; bra $L533; $L531:__nvptx_free,(%out_arg1); } .loc 1 862 7 bra $L495; $L548: add.u64 %r210,%r315,40; .loc 1 869 17 mov.u64 %r96,1; .loc 1 868 17 mov.u64 %r99,%r96; .loc 1 867 17 mov.u64 %r159,0; $L534: .loc 1 872 37 mul.lo.u64 %r560,%r96,%r99; .loc 1 872 28 div.u64 %r92,%r155,%r560; .loc 1 874 7 ld.u64 %r562,[%r210+16]; ld.u64 %r563,[%r210+8]; sub.u64 %r561,%r562,%r563; .loc 1 874 34 add.u64 %r96,%r561,1; .loc 1 875 24 ld.u64 %r99,[%r210]; .loc 1 873 11 rem.u64 %r565,%r92,%r96; .loc 1 872 21 mad.lo.u64 %r103,%r565,%r99,%r159; mov.u64 %r159,%r103; .loc 1 870 7 add.u64 %r210,%r210,24; setp.ne.u64 %r567,%r199,%r210; @ %r567 bra $L534; .loc 1 879 30 div.u64 %r299,%r155,%r96; $L549: .loc 1 879 40 ld.u64 %r576,[%r574]; .loc 1 879 24 mad.lo.u64 %r577,%r299,%r576,%r103; .loc 1 880 13 ld.u64 %r579,[%r315]; mad.lo.u64 %r152,%r577,%r297,%r579; .loc 1 885 23 ld.s8 %r195,[%r313+28]; .loc 1 885 7 cvt.u16.u32 %r580,%r195; setp.le.s16 %r581,%r580,1; @ %r581 bra $L550; add.u64 %r221,%r313,40; add.u32 %r582,%r195,-2; cvt.u64.u32 %r583,%r582; add.u64 %r585,%r583,%r583; add.u64 %r586,%r585,%r583; shl.b64 %r587,%r586,3; add.u64 %r211,%r587,%r686; .loc 1 882 17 mov.u64 %r156,0; .loc 1 884 14 mov.u64 %r117,1; .loc 1 883 14 mov.u64 %r120,%r117; $L536: .loc 1 887 36 mul.lo.u64 %r589,%r117,%r120; .loc 1 887 27 div.u64 %r113,%r155,%r589; .loc 1 889 7 ld.u64 %r591,[%r221+16]; ld.u64 %r592,[%r221+8]; sub.u64 %r590,%r591,%r592; .loc 1 889 33 add.u64 %r117,%r590,1; .loc 1 890 23 ld.u64 %r120,[%r221]; .loc 1 888 11 rem.u64 %r594,%r113,%r117; .loc 1 887 20 mad.lo.u64 %r124,%r594,%r120,%r156; mov.u64 %r156,%r124; .loc 1 885 7 add.u64 %r221,%r221,24; setp.ne.u64 %r596,%r211,%r221; @ %r596 bra $L536; .loc 1 894 29 div.u64 %r300,%r155,%r117; bra $L535; $L550: .loc 1 885 7 mov.u64 %r300,%r155; mov.u64 %r124,0; $L535: .loc 1 894 39 ld.u64 %r605,[%r603]; .loc 1 894 23 mad.lo.u64 %r606,%r300,%r605,%r124; .loc 1 896 6 mad.lo.u64 %r608,%r606,%r298,%r311; .loc 1 895 13 ld.u64 %r609,[%r310]; add.u64 %r153,%r609,%r608; .loc 1 898 11 ld.s8 %r135,[%r315+29]; .loc 1 898 41 ld.s8 %r136,[%r313+29]; .loc 1 898 10 cvt.u16.u32 %r611,%r135; cvt.u16.u32 %r612,%r136; set.u32.eq.u16 %r613,%r611,%r612; neg.s32 
%r614,%r613; .loc 1 899 4 cvt.u16.u32 %r616,%r614; and.b16 %r615,%r616,%r617; cvt.u32.u16 %r618,%r615; cvt.u16.u8 %r619,%r618; setp.eq.u16 %r620,%r619,0; @ %r620 bra $L537; .loc 1 901 4 min.u64 %r625,%r149,%r1511535; call (%value_in),memmove26,[%value_in]; } .loc 1 902 7 ld.s8 %r629,[%r315+29]; cvt.u16.u32 %r628,%r629; set.u32.eq.u16 %r630,%r628,6; neg.s32 %r631,%r630; .loc 1 902 51 cvt.u16.u32 %r636,%r631; and.b16 %r635,%r636,%r691; cvt.u32.u16 %r638,%r635; cvt.u16.u8 %r639,%r638; setp.ne.u16 %r640,%r639,0; @ ! %r640 bra $L539; .loc 1 904 11 @ %r687 bra $L540; .loc 1 905 3 sub.u64 %r647,%r150,%r149; add.u64 %r648,%r152,%r149; mov.u32 %r664645647649,[%value_in]; } bra $L539; $L540: .loc 1 907 3 setp.ge.u64 %r650,%r306,%r307; @ %r650 bra $L539; add.u64 %r239,%r152,%r238; add.u64 %r234,%r152,%r229; .loc 1 908 25 mov.u32 %r651,32; $L542: st.u32 [%r239],%r651; .loc 1 907 3 add.u64 %r239,%r239,4; setp.ne.u64 %r652,%r234,%r239; @ %r652 bra $L542; bra $L539; $L537: .loc 1 911 15 setp.ne.u16 %r654,%r611,6; @ %r654 bra $L543; .loc 1 911 59 @ %r687 bra $L544; .loc 1 912 252153; call assign_char1_from_char4bra $L539; $L544: .loc 1 447 3 @ %r688 bra $L545; mov.u64 %r226,%r153; mov.u64 %r225,%r152; add.u64 %r222,%r226,%r308; $L546: .loc 1 448 14 ld.u8 %r661,[%r226]; st.u32 [%r225],%r661; .loc 1 447 3 add.u64 %r226,%r226,1; add.u64 %r225,%r225,4; setp.ne.u64 %r662,%r222,%r226; @ %r662 bra $L546; $L545: .loc 1 449 3 setp.le.u64 %r663,%r307,%r308; @ %r663 bra $L539; add.u64 %r233,%r152,%r232; add.u64 %r227,%r152,%r229; .loc 1 450 12 mov.u32 %r664,32; $L547: st.u32 [%r233],%r664; .loc 1 449 3 add.u64 %r233,%r233,4; setp.ne.u64 %r665,%r227,%r233; @ %r665 bra $L547; bra $L539; $L543: .loc 1 916 217convert_type$L539: .loc 1 865 26 add.u64 %r155,%r155,1; .loc 1 865 3 setp.ge.u64 %r673,%r155,%r170; @ %r673 bra $L495; .loc 1 880 54 ld.u64 %r297,[%r315+16]; .loc 1 896 24 ld.u64 %r298,[%r313+16]; $L512: .loc 1 870 7 @ %r678 bra $L548; mov.u64 %r299,%r155; mov.u64 %r103,0; bra $L5_gfortran_caf_send .visible .func _gfortran_caf_sendu64u64 %r302; .reg .u64 %r303; .reg .u64 %r304; .reg .u64predpred %r325; .reg .u64 %r326; .reg .u32 %r328; .reg .u32 %r329; .reg .u16 %r331; .reg .u16pred %r340; .reg .u64 %r345; .reg .u64 %r346; .reg .u16 %r348; .reg .u32pred %r376; .reg .predpredu64 %r387; .reg .u32 %r388; .reg .pred %r389; .reg .u64 %r398; .reg .u64pred %r408; .reg .predpred %r413; .reg .u32predpredpred489; .reg .u64 %r491; .reg .predu64 %r505; .reg .u64 %r506; .reg .u16 %r508; .reg .u16 %r509; .reg .u32 %r510; .reg .u32 %r511; .reg .u16 %r512; .reg .u16 %r513; .reg .u16 %r514; .reg .u32 %r515; .reg .u16 %r516; .reg .pred %r517; .reg .u64 %r522; .reg .u64 %r523; .reg .u16 %r525; .reg .u32 %r526; .reg .u32 %r527; .reg .u32 %r528; .reg .u16 %r532; .reg .u16 %r533; .reg .u32 %r535; .reg .u16 %r536; .reg .predpred %r549; .reg .predu32 %r571; .reg .pred %r572; .reg .pred %r573; .reg .u64 %r576; .reg .u64 %r577; .reg .u64pred %r583; .reg .u64 %r586; .reg .u64 %r587; .reg .u64 %r588; .reg .u64 %r589; .reg .u64 %r590; .reg .u64 %r592; .reg .u64 %r593; .reg .u64 %r595; .reg .u64 %r596; .reg .pred %r597; .reg .u16 %r598; .reg .pred %r599; .reg .u32 %r600; .reg .u64 %r601; .reg .u64.reg .pred %r614; .reg .u64 %r623; .reg .u64 %r624; .reg .u64 %r626; .reg .u16 %r628; .reg .u16pred %r667; .reg .u32 %r668; .reg .pred %r669; .reg .pred %r671; .reg .u32 %r678; .reg .pred %r679; .reg .pred %r680; .reg .u32 %r681; .reg .pred %r682; .reg .pred %r690; .reg .u32 %r692; .reg .u64 %r693; .reg .pred %r694; .reg .pred %r696; .reg .pred %r697; 
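// The declarations continuing below belong to _gfortran_caf_send which, per the
// gfortran coarray ABI, performs the opposite transfer to _gfortran_caf_get above
// (local buffer into the coarray rather than out of it); its visible body reuses
// the same memmove / assign_char1_from_char4 / convert_type helpers and the same
// blank-padding loops that store 32 (ASCII space) into character elements.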
.reg .u32 %r698; .reg .u32 %r699; .reg .u16 %r700; .reg .u64 %r701; .reg .u64 %r702; .reg .u64 %r703; .reg .u64 %r704; .reg .pred %r705; .reg .pred %r706; .reg .u32 %r707; .reg .u32 %r708; .reg .u16 %r709; mov.u64 %r312,%ar0; mov.u64 %r313,%ar1; mov.u64 %r315,%ar3; mov.u64 %r317,%ar5; mov.u32 %r318,%ar6; mov.u32 %r319,%ar7; mov.u32 %r320,%ar8; mov.u64 %r321,%ar9; .loc 1 933 14 ld.s8 %r22,[%r315+28]; .loc 1 934 10 ld.u64 %r157,[%r317+16]; .loc 1 935 10 ld.u64 %r158,[%r315+16]; .loc 1 937 6 setp.eq.u64 %r322,%r321,0; @ %r322 bra $L582; .loc 1 938 11 mov.u32 %r323,0; st.u32 [%r321],%r323; $L582: .loc 1 940 6 setp.eq.u32 %r324,%r22,0; @ %r324 bra $L583; .loc 1 971 3 setp.gt.s32 %r325,%r22,0; @ %r325 bra $L584; .loc 1 970 8 mov.u64 %r176,1; add.u32 %r692,%r22,-1; bra $L585; $L583: .loc 1 942 13 ld.u64 %r326,[%r312]; add.u64 %r177,%r326,%r313; .loc 1 943 11 ld.s8 %r24,[%r315+29]; .loc 1 943 41 ld.s8 %r25,[%r317+29]; .loc 1 946 18 ld.u64 %r306,[%r317]; .loc 1 944 4 set.u32.eq.u32 %r328,%r318,%r319; neg.s32 %r329,%r328; .loc 1 943 10 cvt.u16.u32 %r331,%r24; cvt.u16.u32 %r332,%r25; set.u32.eq.u16 %r333,%r331,%r332; neg.s32 %r334,%r333; .loc 1 944 4 cvt.u16.u32 %r336,%r329; cvt.u16.u32 %r337,%r334; and.b16 %r335,%r336,%r337; cvt.u32.u16 %r338,%r335; cvt.u16.u8 %r339,%r338; setp.eq.u16 %r340,%r339,0; @ %r340 bra $L586; .loc 1 946 4 min.u64 %r345,%r157,%r15845; call (%value_in),memmove346,[%value_in]; } .loc 1 948 7 ld.s8 %r349,[%r315+29]; cvt.u16.u32 %r348,%r349; set.u32.eq.u16 %r350,%r348,6; neg.s32 %r351,%r350; .loc 1 948 51581; .loc 1 950 11 setp.ne.u32 %r361,%r318,1; @ %r361 bra $L588; .loc 1 951 3 sub.u64 %r367,%r158,%r157; add.u64 %r368,%r177,%r157; mov.u32 %r3636r367369,[%value_in]; } bra $L581; $L588: .loc 1 953 10 shr.u64 %r178,%r157,2; .loc 1 953 36 shr.u64 %r149,%r158,2; .loc 1 953 3 setp.le.u64 %r370,%r149,%r178; @ %r370 bra $L581; shl.b64 %r371,%r178,2; add.u64 %r292,%r177,%r371; shl.b64 %r372,%r149,2; add.u64 %r293,%r177,%r372; .loc 1 954 25 mov.u32 %r373,32; $L590: st.u32 [%r292],%r373; .loc 1 953 3 add.u64 %r292,%r292,4; setp.ne.u64 %r374,%r293,%r292; @ %r374 bra $L590; bra $L581; $L586: .loc 1 957 15 setp.ne.u16 %r376,%r331,6; @ %r376 bra $L591; .loc 1 957 59 setp.ne.u32 %r377,%r318,1; @ %r377 bra $L592; .loc 1 958 215r157177306; call assign_char1_from_char4bra $L581; $L592: .loc 1 446 15 shr.u64 %r179,%r158,2; .loc 1 446 5 min.u64 %r180,%r157,%r179; .loc 1 447 3 setp.eq.u64 %r382,%r180,0; @ %r382 bra $L593; mov.u64 %r284,%r306; mov.u64 %r283,%r177; add.u64 %r280,%r284,%r180; $L594: .loc 1 448 14 ld.u8 %r383,[%r284]; st.u32 [%r283],%r383; .loc 1 447 3 add.u64 %r284,%r284,1; add.u64 %r283,%r283,4; setp.ne.u64 %r384,%r280,%r284; @ %r384 bra $L594; $L593: .loc 1 449 3 setp.le.u64 %r385,%r179,%r180; @ %r385 bra $L581; shl.b64 %r386,%r180,2; add.u64 %r294,%r177,%r386; shl.b64 %r387,%r179,2; add.u64 %r285,%r177,%r387; .loc 1 450 12 mov.u32 %r388,32; $L595: st.u32 [%r294],%r388; .loc 1 449 3 add.u64 %r294,%r294,4; setp.ne.u64 %r389,%r285,%r294; @ %r389 bra $L595; bra $L581; $L591: .loc 1 964 2318convert_typebra $L581; $L584: add.u64 %r204,%r315,48; add.u64 %r201,%r315,56; add.u32 %r692,%r22,-1; cvt.u64.u32 %r398,%r692; add.u64 %r400,%r398,%r398; add.u64 %r401,%r400,%r398; shl.b64 %r402,%r401,3; add.u64 %r403,%r315,72; add.u64 %r192,%r402,%r403; .loc 1 970 8 mov.u64 %r176,1; $L596: .loc 1 973 50 ld.u64 %r405,[%r201]; ld.u64 %r406,[%r204]; sub.u64 %r404,%r405,%r406; .loc 1 973 17 add.u64 %r175,%r404,1; .loc 1 976 12 max.s64 %r407,%r175,0; mul.lo.u64 %r176,%r176,%r407; .loc 1 971 3 add.u64 
%r204,%r204,24; add.u64 %r201,%r201,24; setp.ne.u64 %r408,%r192,%r204; @ %r408 bra $L596; .loc 1 979 6 setp.eq.u64 %r409,%r176,0; @ %r409 bra $L581; $L585: .loc 1 1038 8 set.u32.eq.u32 %r411,%r318,%r319; neg.s32 %r412,%r411; cvt.u32.u32 %r410,%r412; cvt.u32.u8 %r307,%r410; .loc 1 1049 14 shr.u64 %r308,%r157,2; .loc 1 446 15 shr.u64 %r309,%r158,2; .loc 1 446 5 min.u64 %r310,%r157,%r309; .loc 1 982 6 setp.ne.u32 %r413,%r320,0; @ %r413 bra $L597; add.u32 %r414,%r22,-2; cvt.u64.u32 %r415,%r414; add.u64 %r417,%r415,%r415; add.u64 %r418,%r417,%r415; shl.b64 %r419,%r418,3; add.u64 %r420,%r315,64; add.u64 %r205,%r419,%r420; shl.b64 %r235,%r310,2; shl.b64 %r232,%r309,2; shl.b64 %r241,%r308,2; .loc 1 935 10 mov.u64 %r299,%r158; .loc 1 1067 10 mov.u64 %r165,0; cvt.s64.s32 %r693,%r692; setp.gt.s32 %r694,%r692,0; .loc 1 1081 59 add.u64 %r586,%r693,%r693; add.u64 %r587,%r586,%r693; shl.b64 %r588,%r587,3; add.u64 %r589,%r315,%r588; add.u64 %r590,%r589,40; .loc 1 1099 54 add.u64 %r702,%r317,%r588; add.u64 %r703,%r702,40; add.u64 %r704,%r317,64; .loc 1 1107 4 cvt.u16.u32 %r634,%r307; .loc 1 1120 59 setp.ne.u32 %r705,%r318,1; .loc 1 447 3 setp.eq.u64 %r706,%r310,0; .loc 1 1111 51 set.u32.lt.u64 %r707,%r157,%r158; neg.s32 %r708,%r707; cvt.u16.u32 %r709,%r708; bra $L598; $L597: .loc 1 987 10 ld.s8 %r422,[%r317+28]; cvt.u16.u32 %r421,%r422; setp.ne.u16 %r423,%r421,0; @ %r423 bra $L599; .loc 1 98915725,[%value_in]; } mov.u64 %r152,%r425; .loc 1 990 4 ld.u64 %r427,[%r317]935 10 mov.u64 %r304,%r158; cvt.s64.s32 %r693,%r692; bra $L600; $L599: .loc 1 994 10 mul.lo.u64 %r437,%r157,%r17643738,[%value_in]; } mov.u64 %r152,%r438; mov.u64 %r243,%r152; .loc 1 934 10 mov.u64 %r302,%r157; .loc 1 996 11 mov.u64 %r168,0; cvt.s64.s32 %r693,%r692; add.u64 %r701,%r317,64; .loc 1 1010 58 add.u64 %r457,%r693,%r693; add.u64 %r458,%r457,%r693; shl.b64 %r459,%r458,3; add.u64 %r460,%r317,%r459; add.u64 %r461,%r460,40; bra $L601; $L604: add.u64 %r252,%r317,40; add.u32 %r440,%r202,-2; cvt.u64.u32 %r441,%r440; add.u64 %r443,%r441,%r441; add.u64 %r444,%r443,%r441; shl.b64 %r445,%r444,3; add.u64 %r244,%r445,%r701; .loc 1 1000 18 mov.u64 %r45,1; .loc 1 999 18 mov.u64 %r48,%r45; .loc 1 998 18 mov.u64 %r169,0; $L602: .loc 1 1003 37 mul.lo.u64 %r447,%r45,%r48; .loc 1 1003 28 div.u64 %r41,%r168,%r447; .loc 1 1005 7 ld.u64 %r449,[%r252+16]; ld.u64 %r450,[%r252+8]; sub.u64 %r448,%r449,%r450; .loc 1 1005 33 add.u64 %r45,%r448,1; .loc 1 1006 23 ld.u64 %r48,[%r252]; .loc 1 1004 11 rem.u64 %r452,%r41,%r45; .loc 1 1003 21 mad.lo.u64 %r52,%r452,%r48,%r169; mov.u64 %r169,%r52; .loc 1 1001 8 add.u64 %r252,%r252,24; setp.ne.u64 %r454,%r244,%r252; @ %r454 bra $L602; .loc 1 1010 30 div.u64 %r303,%r168,%r45; $L605: .loc 1 1010 40 ld.u64 %r463,[%r461]; .loc 1 1010 24 mad.lo.u64 %r464,%r303,%r463,%r52; .loc 1 1011 14 ld.u64 %r467,[%r317]; mad.lo.u64 %r466,%r464,%r302,%r467; .loc 1 1013996 27 add.u64 %r168,%r168,1; .loc 1 996 4 add.u64 %r243,%r243,%r157; setp.ge.u64 %r476,%r168,%r176; @ %r476 bra $L603; .loc 1 1012 26 ld.u64 %r302,[%r317+16]; $L601: .loc 1 1001 24 ld.s8 %r202,[%r317+28]; .loc 1 1001 8 cvt.u16.u32 %r477,%r202; setp.gt.s16 %r478,%r477,1; @ %r478 bra $L604; mov.u64 %r303,%r168; mov.u64 %r52,0; bra $L605; $L603: .loc 1 1035 28 ld.u64 %r304,[%r315+16]; $L600: add.u32 %r479,%r22,-2; cvt.u64.u32 %r480,%r479; add.u64 %r482,%r480,%r480; add.u64 %r483,%r482,%r480; shl.b64 %r484,%r483,3; add.u64 %r485,%r315,64; add.u64 %r253,%r484,%r485; shl.b64 %r272,%r310,2; shl.b64 %r269,%r309,2; shl.b64 %r278,%r308,2; .loc 1 1018 23 mov.u64 %r151,0; .loc 1 1019 14 
mov.u64 %r173,%r151; setp.gt.s32 %r694,%r692,0; .loc 1 1033 56 add.u64 %r496,%r693,%r693; add.u64 %r497,%r496,%r693; shl.b64 %r498,%r497,3; add.u64 %r499,%r315,%r498; add.u64 %r500,%r499,40; .loc 1 1038 8 cvt.u16.u32 %r514,%r307; .loc 1 1053 56 setp.ne.u32 %r696,%r318,1; .loc 1 447 3 setp.eq.u64 %r697,%r310,0; .loc 1 1043 5 set.u32.lt.u64 %r698,%r157,%r158; neg.s32 %r699,%r698; cvt.u16.u32 %r700,%r699; bra $L606; $L621: add.u64 %r261,%r315,40; .loc 1 1023 14 mov.u64 %r66,1; .loc 1 1022 14 mov.u64 %r69,%r66; .loc 1 1021 14 mov.u64 %r174,0; $L607: .loc 1 1026 41 mul.lo.u64 %r486,%r69,%r66; .loc 1 1026 32 div.u64 %r62,%r173,%r486; .loc 1 1028 11 ld.u64 %r488,[%r261+16]; ld.u64 %r489,[%r261+8]; sub.u64 %r487,%r488,%r489; .loc 1 1028 38 add.u64 %r66,%r487,1; .loc 1 1029 21 ld.u64 %r69,[%r261]; .loc 1 1027 8 rem.u64 %r491,%r62,%r66; .loc 1 1026 25 mad.lo.u64 %r73,%r491,%r69,%r174; mov.u64 %r174,%r73; .loc 1 1024 4 add.u64 %r261,%r261,24; setp.ne.u64 %r493,%r253,%r261; @ %r493 bra $L607; .loc 1 1033 27 div.u64 %r305,%r173,%r66; $L622: .loc 1 1033 37 ld.u64 %r502,[%r500]; .loc 1 1033 21 mad.lo.u64 %r503,%r305,%r502,%r73; .loc 1 1035 9 mad.lo.u64 %r505,%r503,%r304,%r313; .loc 1 1034 10 ld.u64 %r506,[%r312]; add.u64 %r170,%r506,%r505; .loc 1 1036 17 add.u64 %r171,%r152,%r151; .loc 1 1037 8 ld.s8 %r82,[%r315+29]; .loc 1 1037 38 ld.s8 %r83,[%r317+29]; .loc 1 1037 7 cvt.u16.u32 %r508,%r82; cvt.u16.u32 %r509,%r83; set.u32.eq.u16 %r510,%r508,%r509; neg.s32 %r511,%r510; .loc 1 1038 8 cvt.u16.u32 %r513,%r511; and.b16 %r512,%r513,%r514; cvt.u32.u16 %r515,%r512; cvt.u16.u8 %r516,%r515; setp.eq.u16 %r517,%r516,0; @ %r517 bra $L608; .loc 1 1040 8 min.u64 %r522,%r157,%r158522; call (%value_in),memmove523,[%value_in]; } .loc 1 1042 11 ld.s8 %r526,[%r315+29]; cvt.u16.u32 %r525,%r526; set.u32.eq.u16 %r527,%r525,6; neg.s32 %r528,%r527; .loc 1 1043 5 cvt.u16.u32 %r533,%r528; and.b16 %r532,%r533,%r700; cvt.u32.u16 %r535,%r532; cvt.u16.u8 %r536,%r535; setp.ne.u16 %r537,%r536,0; @ ! 
%r537 bra $L610; .loc 1 1045 8 @ %r696 bra $L611; .loc 1 1046 7 sub.u64 %r544,%r158,%r157; add.u64 %r545,%r170,%r157; mov.u32 %r5454546,[%value_in]; } bra $L610; $L611: .loc 1 1049 7 setp.ge.u64 %r547,%r308,%r309; @ %r547 bra $L610; add.u64 %r279,%r170,%r278; add.u64 %r274,%r170,%r269; .loc 1 1050 29 mov.u32 %r548,32; $L613: st.u32 [%r279],%r548; .loc 1 1049 7 add.u64 %r279,%r279,4; setp.ne.u64 %r549,%r274,%r279; @ %r549 bra $L613; bra $L610; $L608: .loc 1 1053 12 setp.ne.u16 %r551,%r508,6; @ %r551 bra $L614; .loc 1 1053 56 @ %r696 bra $L615; .loc 1 1054 615r157170171; call assign_char1_from_char4bra $L610; $L615: .loc 1 447 3 @ %r697 bra $L616; mov.u64 %r266,%r171; mov.u64 %r265,%r170; add.u64 %r262,%r266,%r310; $L617: .loc 1 448 14 ld.u8 %r558,[%r266]; st.u32 [%r265],%r558; .loc 1 447 3 add.u64 %r266,%r266,1; add.u64 %r265,%r265,4; setp.ne.u64 %r559,%r262,%r266; @ %r559 bra $L617; $L616: .loc 1 449 3 setp.le.u64 %r560,%r309,%r310; @ %r560 bra $L610; add.u64 %r273,%r170,%r272; add.u64 %r267,%r170,%r269; .loc 1 450 12 mov.u32 %r561,32; $L618: st.u32 [%r273],%r561; .loc 1 449 3 add.u64 %r273,%r273,4; setp.ne.u64 %r562,%r267,%r273; @ %r562 bra $L618; bra $L610; $L614: .loc 1 1058 6 {318convert_type$L610: .loc 1 1060 14 ld.s8 %r571,[%r317+28]; cvt.u16.u32 %r570,%r571; setp.eq.u16 %r572,%r570,0; @ %r572 bra $L619; .loc 1 1061 22 add.u64 %r151,%r151,%r157; $L619: .loc 1 1019 30 add.u64 %r173,%r173,1; .loc 1 1019 7 setp.ge.u64 %r573,%r173,%r176; @ %r573 bra $L620; .loc 1 1035 28 ld.u64 %r304,[%r315+16]; $L606: .loc 1 1024 4 @ %r694 bra $L621; mov.u64 %r305,%r173; mov.u64 %r73,0; bra $L622; $L620: .loc 1 1063__nvptx_free,(%out_arg1); } .loc 1 1064 7 bra $L581; $L638: add.u64 %r215,%r315,40; .loc 1 1071 17 mov.u64 %r98,1; .loc 1 1070 17 mov.u64 %r101,%r98; .loc 1 1069 17 mov.u64 %r166,0; $L623: .loc 1 1074 37 mul.lo.u64 %r576,%r101,%r98; .loc 1 1074 28 div.u64 %r94,%r165,%r576; .loc 1 1076 7 ld.u64 %r578,[%r215+16]; ld.u64 %r579,[%r215+8]; sub.u64 %r577,%r578,%r579; .loc 1 1076 34 add.u64 %r98,%r577,1; .loc 1 1077 24 ld.u64 %r101,[%r215]; .loc 1 1075 11 rem.u64 %r581,%r94,%r98; .loc 1 1074 21 mad.lo.u64 %r105,%r581,%r101,%r166; mov.u64 %r166,%r105; .loc 1 1072 7 add.u64 %r215,%r215,24; setp.ne.u64 %r583,%r205,%r215; @ %r583 bra $L623; .loc 1 1081 30 div.u64 %r300,%r165,%r98; $L639: .loc 1 1081 40 ld.u64 %r592,[%r590]; .loc 1 1081 24 mad.lo.u64 %r593,%r300,%r592,%r105; .loc 1 1083 7 mad.lo.u64 %r595,%r593,%r299,%r313; .loc 1 1082 13 ld.u64 %r596,[%r312]; add.u64 %r160,%r596,%r595; .loc 1 1085 11 ld.s8 %r114,[%r317+28]; .loc 1 1100 30 ld.u64 %r153,[%r317]; .loc 1 1085 10 setp.eq.u32 %r597,%r114,0; @ %r597 bra $L624; .loc 1 1090 4 cvt.u16.u32 %r598,%r114; setp.le.s16 %r599,%r598,1; @ %r599 bra $L640; add.u64 %r224,%r317,40; add.u32 %r600,%r114,-2; cvt.u64.u32 %r601,%r600; add.u64 %r603,%r601,%r601; add.u64 %r604,%r603,%r601; shl.b64 %r605,%r604,3; add.u64 %r216,%r605,%r704; .loc 1 1087 14 mov.u64 %r161,0; .loc 1 1089 11 mov.u64 %r121,1; .loc 1 1088 11 mov.u64 %r124,%r121; $L626: .loc 1 1092 40 mul.lo.u64 %r607,%r121,%r124; .loc 1 1092 31 div.u64 %r117,%r165,%r607; .loc 1 1094 10 ld.u64 %r609,[%r224+16]; ld.u64 %r610,[%r224+8]; sub.u64 %r608,%r609,%r610; .loc 1 1094 36 add.u64 %r121,%r608,1; .loc 1 1095 19 ld.u64 %r124,[%r224]; .loc 1 1093 7 rem.u64 %r612,%r117,%r121; .loc 1 1092 24 mad.lo.u64 %r128,%r612,%r124,%r161; mov.u64 %r161,%r128; .loc 1 1090 4 add.u64 %r224,%r224,24; setp.ne.u64 %r614,%r216,%r224; @ %r614 bra $L626; .loc 1 1099 26 div.u64 %r301,%r165,%r121; bra $L625; $L640: .loc 1 1090 4 
mov.u64 %r301,%r165; mov.u64 %r128,0; $L625: .loc 1 1099 36 ld.u64 %r623,[%r703]; .loc 1 1099 20 mad.lo.u64 %r624,%r301,%r623,%r128; .loc 1 1101 21 ld.u64 %r626,[%r317+16]; .loc 1 1100 7 mad.lo.u64 %r153,%r624,%r626,%r153; $L624: .loc 1 1106 11 ld.s8 %r138,[%r315+29]; .loc 1 1106 41 ld.s8 %r139,[%r317+29]; .loc 1 1106 10 cvt.u16.u32 %r628,%r138; cvt.u16.u32 %r629,%r139; set.u32.eq.u16 %r630,%r628,%r629; neg.s32 %r631,%r630; .loc 1 1107 4 cvt.u16.u32 %r633,%r631; and.b16 %r632,%r633,%r634; cvt.u32.u16 %r635,%r632; cvt.u16.u8 %r636,%r635; setp.eq.u16 %r637,%r636,0; @ %r637 bra $L627; .loc 1 1109 4 min.u64 %r642,%r157,%r158642; call (%value_in),memmove43,[%value_in]; } .loc 1 1111 7 ld.s8 %r646,[%r315+29]; cvt.u16.u32 %r645,%r646; set.u32.eq.u16 %r647,%r645,6; neg.s32 %r648,%r647; .loc 1 1111 51 cvt.u16.u32 %r653,%r648; and.b16 %r652,%r653,%r709; cvt.u32.u16 %r655,%r652; cvt.u16.u8 %r656,%r655; setp.ne.u16 %r657,%r656,0; @ ! %r657 bra $L629; .loc 1 1113 11 @ %r705 bra $L630; .loc 1 1114 3 sub.u64 %r664,%r158,%r157; add.u64 %r665,%r160,%r157; mov.u32 %r6666666,[%value_in]; } bra $L629; $L630: .loc 1 1116 3 setp.ge.u64 %r667,%r308,%r309; @ %r667 bra $L629; add.u64 %r242,%r160,%r241; add.u64 %r237,%r160,%r232; .loc 1 1117 25 mov.u32 %r668,32; $L632: st.u32 [%r242],%r668; .loc 1 1116 3 add.u64 %r242,%r242,4; setp.ne.u64 %r669,%r237,%r242; @ %r669 bra $L632; bra $L629; $L627: .loc 1 1120 15 setp.ne.u16 %r671,%r628,6; @ %r671 bra $L633; .loc 1 1120 59 @ %r705 bra $L634; .loc 1 1121 215r1571153; call assign_char1_from_char4bra $L629; $L634: .loc 1 447 3 @ %r706 bra $L635; mov.u64 %r229,%r153; mov.u64 %r228,%r160; add.u64 %r225,%r229,%r310; $L636: .loc 1 448 14 ld.u8 %r678,[%r229]; st.u32 [%r228],%r678; .loc 1 447 3 add.u64 %r229,%r229,1; add.u64 %r228,%r228,4; setp.ne.u64 %r679,%r225,%r229; @ %r679 bra $L636; $L635: .loc 1 449 3 setp.le.u64 %r680,%r309,%r310; @ %r680 bra $L629; add.u64 %r236,%r160,%r235; add.u64 %r230,%r160,%r232; .loc 1 450 12 mov.u32 %r681,32; $L637: st.u32 [%r236],%r681; .loc 1 449 3 add.u64 %r236,%r236,4; setp.ne.u64 %r682,%r230,%r236; @ %r682 bra $L637; bra $L629; $L633: .loc 1 1125 2 {318convert_type$L629: .loc 1 1067 26 add.u64 %r165,%r165,1; .loc 1 1067 3 setp.ge.u64 %r690,%r165,%r176; @ %r690 bra $L581; .loc 1 1083 26 ld.u64 %r299,[%r315+16]; $L598: .loc 1 1072 7 @ %r694 bra $L638; mov.u64 %r300,%r165; mov.u64 %r105,0; bra $L639; $L581: .loc 1 112_gfortran_caf_sendget .visible .func _gfortran_caf_sendge, .param .u32 %in_ar11, .param .u32 %in_ar1ar10; ld.param.u32 %ar10,[%in_ar10]; .reg .u32 %ar11; ld.param.u32 %ar11,[%in_ar11]; .reg .u32 %ar12; ld.param.u32 %ar12,[%in_ar1mov.u64 %r28,%ar3; mov.u64 %r29,%ar4; mov.u64 %r30,%ar5; mov.u64 %r31,%ar6; mov.u64 %r33,%ar8; mov.u32 %r35,%ar10; mov.u32 %r36,%ar11; mov.u32 %r37,%ar12; .loc 1 1144 9 ld.u64 %r24,[%r33]; .loc 1 1145 31 ld.u64 %r39,[%r30]; add.u64 %r38,%r39,%r31; .loc 1 1145 29 st.u64 [%r33],%r38; .loc 1 1147 3 mov.u64 %r49,0; {3; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r36; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r37; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r49; call _gfortran_caf_send,%out_arg8,%out_arg9,%out_arg10); } .loc 1 1149 29 st.u64 [%r33],%r24; .loc 1 115_gfortran_caf_get_by_ref .visible .func _gfortran_caf_get_by_ref65u32 %r25; .reg .u64 %r26; .reg .u32 %r33; .reg .u64 %r83; .reg .u64 %r86; .reg .u64 %r9364u64 %r283; .reg .u64u64 %r287; .reg .u64 %r292; .reg .u64u64u64u64u64u64 %r372; .reg .u64 %r375; .reg .u64 %r376; .reg .u64u64 %r404; .reg .pred %r405; .reg .pred %r407; .reg 
.predpred %r414; .reg .u64 %r416; .reg .pred %r417; .reg .u16 %r418; .reg .pred %r419; .reg .pred %r421; .reg .pred %r423; .reg .pred %r425; .reg .pred %r427; .reg .predpred %r433; .reg .predu64.reg .u64495; .reg .u32 %r496; .reg .u16 %r497; .reg .pred %r498; .reg .pred %r499; .reg .u64 %r500; .reg .u64 %r504; .reg .pred %r505; .reg .u64 %r506; .reg .u64 %r507; .reg .u32 %r509; .reg .u32 %r510; .reg .u32 %r512; .reg .u32 %r513; .reg .u16 %r514; .reg .u16 %r515; .reg .u16 %r516; .reg .u32 %r517; .reg .u16 %r518; .reg .pred %r519; .reg .pred %r520; .reg .u64 %r521; .reg .u64u64 %r531; .reg .u64 %r532; .reg .u64 %r533; .reg .u64 %r534; .reg .u64 %r535; .reg .u64 %r536; .reg .u64u32 %r546; .reg .u32 %r547; .reg .u16 %r548; .reg .u16 %r549; .reg .u16 %r550; .reg .u32 %r551; .reg .u16 %r552; .reg .pred %r553; .reg .pred %r554; .reg .u64u64 %r565; .reg .predu32 %r571; .reg .u32 %r573; .reg .u32 %r574; .reg .u16 %r575; .reg .u16 %r576; .reg .u16 %r577; .reg .u32 %r578; .reg .u16 %r579; .reg .pred %r580; .reg .pred %r581; .reg .u64 %r582; .reg .u64 %r586; .reg .u64 %r590; .reg .pred %r593; .reg .u32 %r595; .reg .u32 %r596; .reg .u32 %r598; .reg .u32 %r599; .reg .u16 %r600; .reg .u16 %r601; .reg .u16 %r602; .reg .u32 %r603; .reg .u16 %r604; .reg .pred %r605; .reg .u64 %r609; .reg .pred %r612; .reg .pred %r613; .reg .u64 %r614; .reg .pred %r615; .reg .u32 %r617; .reg .u32 %r618; .reg .u32 %r620; .reg .u32 %r621; .reg .u16 %r622; .reg .u16 %r623; .reg .u16 %r624; .reg .u32 %r625; .reg .u16 %r626; .reg .pred %r627; .reg .u64 %r631; .reg .pred %r634; .reg .pred %r635; .reg .pred %r637; .reg .u32 %r638; .reg .pred %r639; .reg .u32 %r641; .reg .u64 %r642; .reg .u64 %r644; .reg .u64 %r645; .reg .u64 %r646; .reg .u64 %r647; .reg .pred %r651; .reg .u64 %r655; .reg .u64 %r661; .reg .u64 %r662; .reg .u64 %r663; .reg .u6464u64 %r683; .reg .u64u32 %r710; .reg .u16 %r711; .reg .u16 %r712; .reg .u16 %r713; .reg .u32 %r714; .reg .pred %r715; .reg .u64 %r716; .reg .pred %r717; .reg .pred %r718; .reg .u32 %r719; .reg .pred %r720; .reg .u64 %r721; .reg .u32 %r722; .reg .u64 %r723; .reg .u64 %r724; .reg .u64 %r725; .reg .pred %r726; .reg .u16 %r727; .reg .pred %r728; .reg .pred %r730; .reg .pred %r732; .reg .pred %r734; .reg .pred %r736; .reg .pred %r737; .reg .pred %r738; .reg .pred %r739; .reg .pred %r740; .reg .pred %r741; .reg .pred %r742; .reg .u64 %r743; .reg .u64 %r744; .reg .u64 %r746; .reg .u64 %r747; .reg .u64 %r748; .reg .u64 %r750; .reg .u64 %r751; .reg .u64 %r752; .reg .u64 %r754; .reg .u64 %r755; .reg .u64 %r757; .reg .u64 %r758; .reg .u64 %r759; .reg .u64 %r761; .reg .u64 %r762; .reg .u64 %r764; .reg .u64 %r765; .reg .u64 %r766; .reg .pred %r767; .reg .u64 %r768; .reg .u64 %r769; .reg .u32 %r771; .reg .u32 %r772; .reg .u32 %r774; .reg .u32 %r775; .reg .u16 %r776; .reg .u16 %r777; .reg .u16 %r778; .reg .u32 %r779; .reg .u16 %r780; .reg .pred %r781; .reg .pred %r782; .reg .u64 %r783; .reg .u64 %r787; .reg .u64 %r788; .reg .u64 %r790; .reg .u64 %r791; .reg .u64 %r792; .reg .u64 %r793; .reg .u64 %r795; .reg .pred %r796; .reg .u32 %r798; .reg .u32 %r799; .reg .u32 %r801; .reg .u32 %r802; .reg .u16 %r803; .reg .u16 %r804; .reg .u16 %r805; .reg .u32 %r806; .reg .u16 %r807; .reg .pred %r808; .reg .pred %r809; .reg .u64 %r811; .reg .pred %r812; .reg .u32 %r814; .reg .u32 %r815; .reg .u32 %r817; .reg .u32 %r818; .reg .u16 %r819; .reg .u16 %r820; .reg .u16 %r821; .reg .u32 %r822; .reg .u16 %r823; .reg .pred %r824; .reg .pred %r825; .reg .pred %r826; .reg .u64 %r831; .reg .u64 %r832; .reg .u64 %r833; .reg .u64 
%r834; .reg .u64 %r836; .reg .u64 %r837; .reg .u64 %r845; .reg .pred %r846; .reg .pred %r848; .reg .u64 %r853; .reg .u64 %r858; .reg .u64 %r859; .reg .u64 %r860; .reg .u64 %r861; .reg .u32 %r879; .reg .u32 %r880; .reg .u16 %r881; .reg .u16 %r882; .reg .u16 %r883; .reg .u32 %r884; .reg .pred %r885; .reg .u64 %r886; .reg .pred %r887; .reg .pred %r888; .reg .u32 %r889; .reg .pred %r890; .reg .u64 %r891; .reg .u32 %r892; .reg .u64 %r893; .reg .u64 %r894; .reg .u64 %r898; .reg .pred %r901; .reg .u64 %r903; .reg .u32 %r904; .reg .u32 %r905; .reg .u32 %r907; .reg .u32 %r908; .reg .u16 %r909; .reg .u16 %r910; .reg .u16 %r911; .reg .u32 %r912; .reg .u16 %r913; .reg .pred %r914; .reg .pred %r915; .reg .pred %r916; .reg .u64 %r918; .reg .u64 %r919; .reg .u64 %r920; .reg .pred %r922; .reg .pred %r923; .reg .u64 %r924; .reg .u32 %r925; .reg .u64 %r926; .reg .u64 %r927; .reg .pred %r928; .reg .pred %r929; .reg .u64 %r930; .reg .pred %r931; .reg .u32 %r932; .reg .u64 %r933; .reg .u64 %r935; .reg .u64 %r936; .reg .u64 %r937; .reg .u64 %r938; .reg .u64 %r939; .reg .pred %r942; .reg .u64 %r950; .reg .u32 %r952; .reg .u64 %r954; .reg .u64 %r956; .reg .u64 %r962; .reg .u64 %r970; .reg .u64 %r971; .reg .u64 %r972; .reg .u64 %r973; .reg .u64 %r974; .reg .u64 %r975; .reg .u64 %r976; .reg .pred %r977; .reg .u64 %r978; .reg .u64 %r979; .reg .u64 %r980; .reg .pred %r987; .reg .pred %r988; .reg .pred %r994; .reg .pred %r995; .reg .u64 %r996; .reg .pred %r997; .reg .u32 %r998; .reg .pred %r999; .reg .pred %r1000; mov.u64 %r316,%ar0; mov.u64 %r318,%ar2; mov.u64 %r319,%ar3; mov.u32 %r320,%ar4; mov.u32 %r321,%ar5; mov.u32 %r323,%ar7; mov.u64 %r324,%ar8; mov.u32 %r325,%ar9; .loc 1 1550 14 add.u64 %r971,%frame,361; cvta.const.u64 %r327,$LC9; mov.u64 %r328,6328332,[%value_in]; } .loc 1 1552 14 add.u64 %r973,%frame,481; cvta.const.u64 %r335,$LC10; mov.u64 %r336,5836340,[%value_in]; } .loc 1 1554 14 add.u64 %r980,%frame,297; cvta.const.u64 %r343,$LC11; mov.u64 %r344,6444348,[%value_in]; } .loc 1 1556 14 add.u64 %r979,%frame,594; cvta.const.u64 %r351,$LC12; mov.u64 %r352,53356,[%value_in]; } .loc 1 1558 14 add.u64 %r976,%frame,539; cvta.const.u64 %r359,$LC13; mov.u64 %r360,5560364,[%value_in]; } .loc 1 1560 14 add.u64 %r972,%frame,423; cvta.const.u64 %r367,$LC1436372,[%value_in]; } .loc 1 1562 14 add.u64 %r975,%frame,120; cvta.const.u64 %r375,$LC15; mov.u64 %r376,93760,[%value_in]; } .loc 1 1564 14 add.u64 %r978,%frame,210; cvta.const.u64 %r383,$LC16; mov.u64 %r384,8971568 7 ld.s8 %r242,[%r318+28]; .loc 1 1572 9 ld.u64 %r274,[%r316]; .loc 1 1573 21 ld.u64 %r275,[%r316+8]; .loc 1 1585 39 ld.u64 %r23,[%r318]; .loc 1 1585 37 set.u32.eq.u64 %r391,%r23,0; neg.s32 %r392,%r391; cvt.u32.u32 %r390,%r392; cvt.u32.u8 %r277,%r390; .loc 1 1587 3 cvt.u16.u32 %r394,%r323; xor.b16 %r393,%r394,1; cvt.u32.u16 %r395,%r393; cvt.u32.u8 %r24,%r395; cvt.u16.u32 %r397,%r24; cvt.u16.u32 %r398,%r277; and.b16 %r396,%r397,%r398; cvt.u32.u16 %r399,%r396; cvt.u32.u8 %r25,%r399; setp.eq.u32 %r400,%r25,0; @ %r400 bra $L679; cvta.const.u64 %r404,$LC17; cvta.const.u64 %r403,__func__$5; mov.u32 %r402,1587; cvta.const.u64 %r401,$LC183401589 6 setp.eq.u64 %r405,%r324,0; @ %r405 bra $L680; .loc 1 1590 11 st.u32 [%r324],%r25; $L680: .loc 1 1595 9 setp.eq.u64 %r407,%r319,0; @ %r407 bra $L678; .loc 1 1597 20 ld.u32 %r160,[%r319+8]; mov.u32 %r266,%r25; mov.u64 %r304,%r319; mov.u64 %r264,%r275; mov.u64 %r161,%r274; mov.u32 %r212,0; mov.u64 %r86,1; setp.eq.u32 %r977,%r160,1; .loc 1 1670 11 mov.u64 %r996,%r86; .loc 1 1768 6 setp.eq.u64 %r997,%r23,0; .loc 1 1796 4 
cvt.u32.u64 %r998,%r86; .loc 1 1775 9 setp.eq.u32 %r999,%r24,0; setp.ne.u32 %r1000,%r24,0; $L764: .loc 1 1597 7 @ %r977 bra $L682; setp.eq.u32 %r409,%r160,2; @ %r409 bra $L683; setp.ne.u32 %r410,%r160,0; @ %r410 bra $L684; .loc 1 1600 18 ld.u64 %r26,[%r304+32]; .loc 1 1613 17 ld.u64 %r241,[%r304]; .loc 1 1600 7 setp.eq.u64 %r411,%r26,0; @ %r411 bra $L685; .loc 1 1602 21 add.u64 %r412,%r161,%r26; ld.u64 %r287,[%r412]; .loc 1 1604 15 ld.u64 %r161,[%r287]; .loc 1 1605 12 ld.u64 %r264,[%r287+8]; bra $L686; $L685: .loc 1 1609 15 ld.u64 %r413,[%r304+24]; add.u64 %r161,%r161,%r413; .loc 1 1613 11 setp.eq.u64 %r414,%r241,0; @ %r414 bra $L687; .loc 1 1613 38 ld.u32 %r160,[%r241+8]; .loc 1 1613 24 setp.eq.u32 %r977,%r160,1; @ %r977 bra $L774; .loc 1 1616 7 mov.u64 %r264,%r26; bra $L689; $L682: .loc 1 1620 11 mov.u64 %r416,0; st.u64 [%frame+648],%r416; .loc 1 1620 31 ld.u8 %r169,[%r304+24]; .loc 1 1620 4 setp.ne.u32 %r417,%r169,0; @ %r417 bra $L690; .loc 1 1613 17 ld.u64 %r241,[%r304]; bra $L686; $L690: add.u64 %r137,%r304,48; add.u64 %r120,%r264,40; .loc 1 1620 4 mov.u32 %r270,%r25; .loc 1 1620 31 mov.u64 %r167,%r416; .loc 1 1733 9 setp.ne.u32 %r994,%r266,0; .loc 1 1743 10 setp.eq.u32 %r995,%r323,0; $L730: cvt.u16.u32 %r418,%r169; setp.eq.u16 %r419,%r418,4; @ %r419 bra $L691; setp.gt.u16 %r421,%r418,4; @ %r421 bra $L692; setp.eq.u16 %r423,%r418,2; @ %r423 bra $L693; setp.eq.u16 %r425,%r418,3; @ %r425 bra $L694; setp.eq.u16 %r427,%r418,1; @ %r427 bra $L695; bra $L696; $L692: setp.eq.u16 %r429,%r418,5; @ %r429 bra $L697; setp.eq.u16 %r431,%r418,6; @ %r431 bra $L698; bra $L696; $L695: .loc 1 1625 11 ld.u64 %r314,[%r137+8]; .loc 1 1634 32 ld.u32 %r33,[%r137+16]; setp.eq.u32 %r432,%r33,4; @ %r432 bra $L699; setp.gt.s32 %r433,%r33,4; @ %r433 bra $L700; setp.eq.u32 %r434,%r33,1; @ %r434 bra $L701; setp.eq.u32 %r435,%r33,2; @ %r435 bra $L702; bra $L703; $L700: setp.eq.u32 %r436,%r33,8; @ %r436 bra $L704; setp.eq.u32 %r437,%r33,16; @ %r437 bra $L705; bra $L703; $L701: .loc 1 1636 7 ld.u64 %r438,[%r137]; ld.s8 %r439,[%r438]; ld.u64 %r441,[%r120+8]; sub.u64 %r440,%r439,%r441; ld.u64 %r443,[%r120]; mul.lo.u64 %r442,%r440,%r443; ld.u64 %r445,[%r304+16]; mad.lo.u64 %r161,%r442,%r445,%r161; bra $L706; $L702: .loc 1 1637 7 ld.u64 %r446,[%r137]; ld.s16 %r447,[%r446]; ld.u64 %r449,[%r120+8]; sub.u64 %r448,%r447,%r449; ld.u64 %r451,[%r120]; mul.lo.u64 %r450,%r448,%r451; ld.u64 %r453,[%r304+16]; mad.lo.u64 %r161,%r450,%r453,%r161; bra $L706; $L699: .loc 1 1638 7 ld.u64 %r454,[%r137]; ld.s32 %r455,[%r454]; ld.u64 %r457,[%r120+8]; sub.u64 %r456,%r455,%r457; ld.u64 %r459,[%r120]; mul.lo.u64 %r458,%r456,%r459; ld.u64 %r461,[%r304+16]; mad.lo.u64 %r161,%r458,%r461,%r161; bra $L706; $L704: .loc 1 1640 7 ld.u64 %r462,[%r137]; ld.u64 %r464,[%r462]; ld.u64 %r465,[%r120+8]; sub.u64 %r463,%r464,%r465; ld.u64 %r467,[%r120]; mul.lo.u64 %r466,%r463,%r467; ld.u64 %r469,[%r304+16]; mad.lo.u64 %r161,%r466,%r469,%r161; bra $L706; $L705: .loc 1 1643 7 ld.u64 %r470,[%r137]; ld.u64 %r472,[%r470]; ld.u64 %r473,[%r120+8]; sub.u64 %r471,%r472,%r473; ld.u64 %r475,[%r120]; mul.lo.u64 %r474,%r471,%r475; ld.u64 %r477,[%r304+16]; mad.lo.u64 %r161,%r474,%r477,%r161; bra $L706; $L703: .loc 1 1646 9 mov.u64 %r481,48stack; call caf_internal_error); } .loc 1 1647 9 bra $L678; $L693: .loc 1 1652 5 ld.u64 %r83,[%r137+16]; abs.s64 %r286,%r83; ld.u64 %r268,[%r120+16]; .loc 1 1636 7 ld.u64 %r267,[%r120+8]; .loc 1 1652 5 setp.le.s64 %r484,%r83,0; @ %r484 bra $L708; add.u64 %r485,%r268,1; sub.u64 %r314,%r485,%r267; bra $L709; $L708: add.u64 %r486,%r267,1; 
sub.u64 %r314,%r486,%r268; $L709: set.u32.le.s64 %r488,%r314,0; neg.s32 %r489,%r488; set.u32.le.s64 %r491,%r286,0; neg.s32 %r492,%r491; cvt.u16.u32 %r494,%r489; cvt.u16.u32 %r495,%r492; or.b16 %r493,%r494,%r495; cvt.u32.u16 %r496,%r493; cvt.u16.u8 %r497,%r496; setp.ne.u16 %r498,%r497,0; @ %r498 bra $L678; setp.eq.u64 %r499,%r286,1; @ %r499 bra $L710; add.u64 %r500,%r314,-1; div.s64 %r504,%r500,%r286; add.u64 %r314,%r504,1; bra $L710; $L694: .loc 1 1660 5 ld.u64 %r93,[%r137+16]; abs.s64 %r285,%r93; .loc 1 1636 7 ld.u64 %r168,[%r137]; .loc 1 1625 32 ld.u64 %r313,[%r137+8]; .loc 1 1660 5 setp.le.s64 %r505,%r93,0; @ %r505 bra $L711; add.u64 %r506,%r313,1; sub.u64 %r314,%r506,%r168; bra $L712; $L711: add.u64 %r507,%r168,1; sub.u64 %r314,%r507,%r313; $L712: set.u32.le.s64 %r509,%r314,0; neg.s32 %r510,%r509; set.u32.le.s64 %r512,%r285,0; neg.s32 %r513,%r512; cvt.u16.u32 %r515,%r510; cvt.u16.u32 %r516,%r513; or.b16 %r514,%r515,%r516; cvt.u32.u16 %r517,%r514; cvt.u16.u8 %r518,%r517; setp.ne.u16 %r519,%r518,0; @ %r519 bra $L678; setp.eq.u64 %r520,%r285,1; @ %r520 bra $L713; add.u64 %r521,%r314,-1; div.s64 %r525,%r521,%r285; add.u64 %r314,%r525,1; $L713: .loc 1 1665 9 ld.u64 %r527,[%r120+8]; sub.u64 %r526,%r168,%r527; .loc 1 1666 9 ld.u64 %r529,[%r120]; mul.lo.u64 %r528,%r526,%r529; .loc 1 1667 9 ld.u64 %r531,[%r304+16]; .loc 1 1664 12 mad.lo.u64 %r161,%r528,%r531,%r161; .loc 1 1668 5 bra $L710; $L691: .loc 1 1672 9 ld.u64 %r533,[%r137]; ld.u64 %r534,[%r120+8]; sub.u64 %r532,%r533,%r534; .loc 1 1673 9 ld.u64 %r536,[%r120]; mul.lo.u64 %r535,%r532,%r536; .loc 1 1674 9 ld.u64 %r538,[%r304+16]; .loc 1 1671 12 mad.lo.u64 %r161,%r535,%r538,%r161; .loc 1 1670 11 mov.u64 %r314,%r996; bra $L714; $L697: .loc 1 1677 5 ld.u64 %r119,[%r137+16]; abs.s64 %r284,%r119; .loc 1 1636 7 ld.u64 %r273,[%r137]; .loc 1 1652 5 ld.u64 %r310,[%r120+16]; .loc 1 1677 5 setp.le.s64 %r539,%r119,0; @ %r539 bra $L715; add.u64 %r540,%r310,1; sub.u64 %r314,%r540,%r273; bra $L716; $L715: add.u64 %r541,%r273,1; sub.u64 %r314,%r541,%r310; $L716: set.u32.le.s64 %r543,%r314,0; neg.s32 %r544,%r543; set.u32.le.s64 %r546,%r284,0; neg.s32 %r547,%r546; cvt.u16.u32 %r549,%r544; cvt.u16.u32 %r550,%r547; or.b16 %r548,%r549,%r550; cvt.u32.u16 %r551,%r548; cvt.u16.u8 %r552,%r551; setp.ne.u16 %r553,%r552,0; @ %r553 bra $L678; setp.eq.u64 %r554,%r284,1; @ %r554 bra $L717; add.u64 %r555,%r314,-1; div.s64 %r559,%r555,%r284; add.u64 %r314,%r559,1; $L717: .loc 1 1682 9 ld.u64 %r561,[%r120+8]; sub.u64 %r560,%r273,%r561; .loc 1 1683 9 ld.u64 %r563,[%r120]; mul.lo.u64 %r562,%r560,%r563; .loc 1 1684 9 ld.u64 %r565,[%r304+16]; .loc 1 1681 12 mad.lo.u64 %r161,%r562,%r565,%r161; .loc 1 1685 5 bra $L710; $L698: .loc 1 1687 5 ld.u64 %r136,[%r137+16]; abs.s64 %r283,%r136; .loc 1 1636 7 ld.u64 %r302,[%r120+8]; .loc 1 1625 32 ld.u64 %r276,[%r137+8]; .loc 1 1687 5 setp.le.s64 %r566,%r136,0; @ %r566 bra $L718; add.u64 %r567,%r276,1; sub.u64 %r314,%r567,%r302; bra $L719; $L718: add.u64 %r568,%r302,1; sub.u64 %r314,%r568,%r276; $L719: set.u32.le.s64 %r570,%r314,0; neg.s32 %r571,%r570; set.u32.le.s64 %r573,%r283,0; neg.s32 %r574,%r573; cvt.u16.u32 %r576,%r571; cvt.u16.u32 %r577,%r574; or.b16 %r575,%r576,%r577; cvt.u32.u16 %r578,%r575; cvt.u16.u8 %r579,%r578; setp.ne.u16 %r580,%r579,0; @ %r580 bra $L678; setp.eq.u64 %r581,%r283,1; @ %r581 bra $L710; add.u64 %r582,%r314,-1; div.s64 %r586,%r582,%r283; add.u64 %r314,%r586,1; bra $L710; $L696: .loc 1 1695 5 mov.u64 %r590,98324590590stack; call caf_internal_error); } .loc 1 1696 5 bra $L678; $L706: .loc 1 1698 11 setp.le.s64 
%r593,%r314,0; @ %r593 bra $L678; $L710: .loc 1 1702 18 set.u32.gt.s64 %r595,%r314,1; neg.s32 %r596,%r595; .loc 1 1702 34 set.u32.eq.u32 %r598,%r242,0; neg.s32 %r599,%r598; .loc 1 1702 22 cvt.u16.u32 %r601,%r596; cvt.u16.u32 %r602,%r599; and.b16 %r600,%r601,%r602; .loc 1 1702 11 cvt.u32.u16 %r603,%r600; cvt.u16.u8 %r604,%r603; setp.eq.u16 %r605,%r604,0; @ %r605 bra $L714; $L751: .loc 1 1705 5 mov.u64 %r609,9460stack; call caf_internal_error); } .loc 1 1706 5 bra $L678; $L714: .loc 1 1709 11 setp.ne.u32 %r612,%r242,-1; @ %r612 bra $L720; .loc 1 1709 27 setp.ne.u64 %r613,%r23,0; @ %r613 bra $L721; .loc 1 1711 14 add.u32 %r242,%r212,1; .loc 1 1712 31 st.u8 [%r318+28],%r242; .loc 1 1713 31 cvt.s64.s32 %r614,%r320; st.u64 [%r318+16],%r614; bra $L722; $L720: .loc 1 1716 11 setp.le.s32 %r615,%r242,0; @ %r615 bra $L721; .loc 1 1720 21 set.u32.le.s32 %r617,%r242,%r212; neg.s32 %r618,%r617; .loc 1 1720 42 set.u32.ne.u64 %r620,%r314,1; neg.s32 %r621,%r620; .loc 1 1720 33 cvt.u16.u32 %r623,%r618; cvt.u16.u32 %r624,%r621; and.b16 %r622,%r623,%r624; .loc 1 1720 8 cvt.u32.u16 %r625,%r622; cvt.u16.u8 %r626,%r625; setp.eq.u16 %r627,%r626,0; @ %r627 bra $L722; $L755: .loc 1 1722 9 mov.u64 %r631,6363stack; call caf_internal_error); } .loc 1 1723 9 bra $L678; $L722: .loc 1 1726 13 setp.eq.u64 %r634,%r314,1; @ %r634 bra $L723; .loc 1 1730 12 setp.ne.u32 %r635,%r270,0; @ %r635 bra $L724; .loc 1 1733 9 @ %r994 bra $L725; .loc 1 1740 13 setp.eq.u64 %r637,%r167,0; @ %r637 bra $L724; .loc 1 1741 22 ld.s8 %r638,[%r264+28]; .loc 1 1741 7 setp.ne.u32 %r639,%r638,%r242; @ %r639 bra $L724; .loc 1 1747 46 cvt.u32.u64 %r212,%r167; .loc 1 1743 10 @ %r995 bra $L724; add.u64 %r269,%r318,40; add.u32 %r641,%r212,-1; cvt.u64.u32 %r642,%r641; add.u64 %r644,%r642,%r642; add.u64 %r645,%r644,%r642; shl.b64 %r646,%r645,3; add.u64 %r647,%r318,64; add.u64 %r140,%r646,%r647; $L726: .loc 1 1749 12 st.u64 [%r269+8],%r996; st.u64 [%r269+16],%r996; st.u64 [%r269],%r996; .loc 1 1747 11 add.u64 %r269,%r269,24; setp.ne.u64 %r651,%r140,%r269; @ %r651 bra $L726; bra $L724; $L725: .loc 1 1760 10 mov.u64 %r655,97r32465655stack; call caf_internal_error); } .loc 1 1762 10 bra $L678; $L724: .loc 1 1768 6 @ ! 
%r997 bra $L841; cvt.s64.s32 %r974,%r212; bra $L727; $L841: .loc 1 1768 9 cvt.s64.s32 %r974,%r212; add.u64 %r661,%r974,%r974; add.u64 %r662,%r661,%r974; shl.b64 %r663,%r662,3; add.u64 %r664,%r318,%r663; ld.u64 %r667,[%r664+56]; add.u64 %r666,%r667,1; ld.u64 %r675,[%r664+48]; sub.u64 %r159,%r666,%r675; .loc 1 1768 6 setp.ne.u64 %r676,%r159,%r314; @ %r676 bra $L728; .loc 1 1771 12 setp.eq.u32 %r677,%r277,0; @ %r677 bra $L775; .loc 1 1775 9 @ %r1000 bra $L729; mov.u32 %r270,%r277; bra $L723; $L728: @ %r999 bra $L727; $L729: .loc 1 1777 10 st.u64 [%stack+8],%r159; st.u64 [%stack],%r314; mov.u64 %r683,683683stack; call caf_internal_error); } .loc 1 1781 10 bra $L678; $L727: .loc 1 1796 4 add.u64 %r688,%r974,%r974; add.u64 %r689,%r688,%r974; shl.b64 %r690,%r689,3; add.u64 %r691,%r318,%r690; st.u64 [%r691+48],%r996; st.u64 [%r691+56],%r314; st.u64 [%r691+40],%r86; mov.u32 %r270,%r998; .loc 1 1791 21 mov.u32 %r277,%r998; bra $L723; $L775: mov.u32 %r270,1; $L723: .loc 1 1801 37 set.u32.lt.s32 %r709,%r212,%r242; neg.s32 %r710,%r709; .loc 1 1801 22 cvt.u16.u32 %r712,%r710; cvt.u16.u32 %r713,%r270; and.b16 %r711,%r712,%r713; cvt.u32.u16 %r714,%r711; cvt.u32.u8 %r165,%r714; .loc 1 1801 8 setp.eq.u32 %r715,%r165,0; @ %r715 bra $L721; .loc 1 1802 7 add.u32 %r212,%r212,1; mov.u32 %r270,%r165; $L721: .loc 1 1804 13 mul.lo.u64 %r86,%r86,%r314; .loc 1 1620 56 add.u64 %r167,%r167,1; st.u64 [%frame+648],%r167; .loc 1 1620 31 add.u64 %r716,%r304,%r167; ld.u8 %r169,[%r716+24]; .loc 1 1620 4 add.u64 %r137,%r137,24; add.u64 %r120,%r120,24; setp.ne.u32 %r717,%r169,0; @ %r717 bra $L730; .loc 1 1806 7 setp.ne.u32 %r718,%r270,0; @ %r718 bra $L731; .loc 1 1613 17 ld.u64 %r241,[%r304]; bra $L686; $L731: .loc 1 1812 8 ld.s8 %r719,[%r318+28]; setp.ne.u32 %r720,%r719,%r212; @ %r720 bra $L732; .loc 1 1613 17 ld.u64 %r241,[%r304]; .loc 1 1808 27 mov.u32 %r266,%r270; bra $L686; $L732: .loc 1 1812 8 cvta.const.u64 %r724,$LC19; cvta.const.u64 %r723,__func__$5; mov.u32 %r722,1812; cvta.const.u64 %r721,$LC182721816 11 mov.u64 %r725,0; st.u64 [%frame+648],%r725; .loc 1 1816 31 ld.u8 %r252,[%r304+24]; .loc 1 1816 4 setp.ne.u32 %r726,%r252,0; @ %r726 bra $L733; .loc 1 1613 17 ld.u64 %r241,[%r304]; bra $L686; $L733: add.u64 %r292,%r304,48; .loc 1 1816 4 mov.u32 %r271,%r25; .loc 1 1816 31 mov.u64 %r251,%r725; .loc 1 1908 9 setp.ne.u32 %r987,%r266,0; .loc 1 1884 27 setp.ne.u64 %r988,%r23,0; $L761: cvt.u16.u32 %r727,%r252; setp.eq.u16 %r728,%r727,3; @ %r728 bra $L734; setp.gt.u16 %r730,%r727,3; @ %r730 bra $L735; setp.eq.u16 %r732,%r727,1; @ %r732 bra $L736; setp.eq.u16 %r734,%r727,2; @ %r734 bra $L737; bra $L696; $L735: setp.eq.u16 %r736,%r727,4; @ %r736 bra $L738; bra $L696; $L736: .loc 1 1821 11 ld.u64 %r315,[%r292+8]; .loc 1 1827 32 ld.u32 %r176,[%r292+16]; setp.eq.u32 %r737,%r176,4; @ %r737 bra $L739; setp.gt.s32 %r738,%r176,4; @ %r738 bra $L740; setp.eq.u32 %r739,%r176,1; @ %r739 bra $L741; setp.eq.u32 %r740,%r176,2; @ %r740 bra $L742; bra $L703; $L740: setp.eq.u32 %r741,%r176,8; @ %r741 bra $L743; setp.eq.u32 %r742,%r176,16; @ %r742 bra $L744; bra $L703; $L741: .loc 1 1829 7 ld.u64 %r743,[%r292]; ld.s8 %r744,[%r743]; ld.u64 %r746,[%r304+16]; mad.lo.u64 %r161,%r744,%r746,%r161; bra $L745; $L742: .loc 1 1830 7 ld.u64 %r747,[%r292]; ld.s16 %r748,[%r747]; ld.u64 %r750,[%r304+16]; mad.lo.u64 %r161,%r748,%r750,%r161; bra $L745; $L739: .loc 1 1831 7 ld.u64 %r751,[%r292]; ld.s32 %r752,[%r751]; ld.u64 %r754,[%r304+16]; mad.lo.u64 %r161,%r752,%r754,%r161; bra $L745; $L743: .loc 1 1833 7 ld.u64 %r755,[%r292]; ld.u64 %r757,[%r755]; ld.u64 
%r758,[%r304+16]; mad.lo.u64 %r161,%r757,%r758,%r161; bra $L745; $L744: .loc 1 1836 7 ld.u64 %r759,[%r292]; ld.u64 %r761,[%r759]; ld.u64 %r762,[%r304+16]; mad.lo.u64 %r161,%r761,%r762,%r161; bra $L745; $L737: .loc 1 1845 37 ld.u64 %r765,[%r292+8]; ld.u64 %r766,[%r292+16]; div.s64 %r764,%r765,%r766; .loc 1 1845 11 add.u64 %r315,%r764,1; .loc 1 1849 5 bra $L745; $L734: .loc 1 1851 5 ld.u64 %r207,[%r292+16]; abs.s64 %r278,%r207; .loc 1 1821 32 ld.u64 %r312,[%r292+8]; .loc 1 1829 7 ld.u64 %r265,[%r292]; .loc 1 1851 5 setp.le.s64 %r767,%r207,0; @ %r767 bra $L746; add.u64 %r768,%r312,1; sub.u64 %r315,%r768,%r265; bra $L747; $L746: add.u64 %r769,%r265,1; sub.u64 %r315,%r769,%r312; $L747: set.u32.le.s64 %r771,%r315,0; neg.s32 %r772,%r771; set.u32.le.s64 %r774,%r278,0; neg.s32 %r775,%r774; cvt.u16.u32 %r777,%r772; cvt.u16.u32 %r778,%r775; or.b16 %r776,%r777,%r778; cvt.u32.u16 %r779,%r776; cvt.u16.u8 %r780,%r779; setp.ne.u16 %r781,%r780,0; @ %r781 bra $L678; setp.eq.u64 %r782,%r278,1; @ %r782 bra $L748; add.u64 %r783,%r315,-1; div.s64 %r787,%r783,%r278; add.u64 %r315,%r787,1; $L748: .loc 1 1856 9 mul.lo.u64 %r788,%r207,%r265; .loc 1 1857 9 ld.u64 %r790,[%r304+16]; .loc 1 1855 12 mad.lo.u64 %r161,%r788,%r790,%r161; bra $L749; $L738: .loc 1 1862 9 ld.u64 %r792,[%r292]; ld.u64 %r793,[%r292+16]; mul.lo.u64 %r791,%r792,%r793; .loc 1 1863 9 ld.u64 %r795,[%r304+16]; .loc 1 1861 12 mad.lo.u64 %r161,%r791,%r795,%r161; .loc 1 1860 11 mov.u64 %r315,%r996; bra $L750; $L745: .loc 1 1873 11 setp.le.s64 %r796,%r315,0; @ %r796 bra $L678; $L749: .loc 1 1877 18 set.u32.gt.s64 %r798,%r315,1; neg.s32 %r799,%r798; .loc 1 1877 34 set.u32.eq.u32 %r801,%r242,0; neg.s32 %r802,%r801; .loc 1 1877 22 cvt.u16.u32 %r804,%r799; cvt.u16.u32 %r805,%r802; and.b16 %r803,%r804,%r805; .loc 1 1877 11 cvt.u32.u16 %r806,%r803; cvt.u16.u8 %r807,%r806; setp.ne.u16 %r808,%r807,0; @ %r808 bra $L751; $L750: .loc 1 1884 11 setp.ne.u32 %r809,%r242,-1; @ %r809 bra $L752; .loc 1 1884 27 @ %r988 bra $L753; .loc 1 1886 14 add.u32 %r242,%r212,1; .loc 1 1887 31 st.u8 [%r318+28],%r242; .loc 1 1888 31 cvt.s64.s32 %r811,%r320; st.u64 [%r318+16],%r811; bra $L754; $L752: .loc 1 1891 11 setp.le.s32 %r812,%r242,0; @ %r812 bra $L753; .loc 1 1895 21 set.u32.ge.s32 %r814,%r212,%r242; neg.s32 %r815,%r814; .loc 1 1895 42 set.u32.ne.u64 %r817,%r315,1; neg.s32 %r818,%r817; .loc 1 1895 33 cvt.u16.u32 %r820,%r815; cvt.u16.u32 %r821,%r818; and.b16 %r819,%r820,%r821; .loc 1 1895 8 cvt.u32.u16 %r822,%r819; cvt.u16.u8 %r823,%r822; setp.ne.u16 %r824,%r823,0; @ %r824 bra $L755; $L754: .loc 1 1901 13 setp.eq.u64 %r825,%r315,1; @ %r825 bra $L756; .loc 1 1905 12 setp.ne.u32 %r826,%r271,0; @ %r826 bra $L757; .loc 1 1908 9 @ %r987 bra $L725; .loc 1 1914 22 cvt.u32.u64 %r212,%r251; $L757: .loc 1 1926 6 @ ! %r997 bra $L842; cvt.s64.s32 %r974,%r212; bra $L758; $L842: .loc 1 1926 9 cvt.s64.s32 %r974,%r212; add.u64 %r831,%r974,%r974; add.u64 %r832,%r831,%r974; shl.b64 %r833,%r832,3; add.u64 %r834,%r318,%r833; ld.u64 %r837,[%r834+56]; add.u64 %r836,%r837,1; ld.u64 %r845,[%r834+48]; sub.u64 %r240,%r836,%r845; .loc 1 1926 6 setp.eq.u64 %r846,%r240,%r315; @ %r846 bra $L759; .loc 1 1933 9 @ ! 
%r1000 bra $L758; bra $L760; $L759: .loc 1 1929 12 setp.eq.u32 %r848,%r277,0; @ %r848 bra $L776; .loc 1 1933 9 @ %r999 bra $L777; $L760: .loc 1 1935 10 st.u64 [%stack+8],%r240; st.u64 [%stack],%r315; mov.u64 %r853,3853stack; call caf_internal_error); } .loc 1 1939 10 bra $L678; $L758: .loc 1 1954 4 add.u64 %r858,%r974,%r974; add.u64 %r859,%r858,%r974; shl.b64 %r860,%r859,3; add.u64 %r861,%r318,%r860; st.u64 [%r861+48],%r996; st.u64 [%r861+56],%r315; st.u64 [%r861+40],%r86; mov.u32 %r271,%r998; .loc 1 1949 21 mov.u32 %r277,%r998; bra $L756; $L776: mov.u32 %r271,1; bra $L756; $L777: .loc 1 1933 9 mov.u32 %r271,%r277; $L756: .loc 1 1958 37 set.u32.lt.s32 %r879,%r212,%r242; neg.s32 %r880,%r879; .loc 1 1958 22 cvt.u16.u32 %r882,%r880; cvt.u16.u32 %r883,%r271; and.b16 %r881,%r882,%r883; cvt.u32.u16 %r884,%r881; cvt.u32.u8 %r249,%r884; .loc 1 1958 8 setp.eq.u32 %r885,%r249,0; @ %r885 bra $L753; .loc 1 1959 7 add.u32 %r212,%r212,1; mov.u32 %r271,%r249; $L753: .loc 1 1961 13 mul.lo.u64 %r86,%r86,%r315; .loc 1 1816 56 add.u64 %r251,%r251,1; st.u64 [%frame+648],%r251; .loc 1 1816 31 add.u64 %r886,%r304,%r251; ld.u8 %r252,[%r886+24]; .loc 1 1816 4 add.u64 %r292,%r292,24; setp.ne.u32 %r887,%r252,0; @ %r887 bra $L761; .loc 1 1963 7 setp.ne.u32 %r888,%r271,0; @ %r888 bra $L762; .loc 1 1613 17 ld.u64 %r241,[%r304]; bra $L686; $L762: .loc 1 1969 8 ld.s8 %r889,[%r318+28]; setp.ne.u32 %r890,%r889,%r212; @ %r890 bra $L763; .loc 1 1613 17 ld.u64 %r241,[%r304]; .loc 1 1965 27 mov.u32 %r266,%r271; bra $L686; $L763: .loc 1 1969 8 cvta.const.u64 %r894,$LC19; cvta.const.u64 %r893,__func__$5; mov.u32 %r892,1969; cvta.const.u64 %r891,$LC18892893891973 4 mov.u64 %r898,97r324898898stack; call caf_internal_error); } .loc 1 1974 4 bra $L678; $L686: .loc 1 1595 9 setp.eq.u64 %r901,%r241,0; @ %r901 bra $L687; $L689: .loc 1 1613 38 ld.u32 %r160,[%r241+8]; mov.u64 %r304,%r241; setp.eq.u32 %r977,%r160,1; bra $L764; $L774: mov.u64 %r304,%r241; .loc 1 1614 7 mov.u64 %r264,%r161; bra $L764; $L687: .loc 1 1979 29 ld.u64 %r903,[%r304+16]; set.u32.eq.u64 %r904,%r903,0; neg.s32 %r905,%r904; .loc 1 1979 12 set.u32.eq.u64 %r907,%r86,0; neg.s32 %r908,%r907; .loc 1 1979 17 cvt.u16.u32 %r910,%r905; cvt.u16.u32 %r911,%r908; or.b16 %r909,%r910,%r911; .loc 1 1979 6 cvt.u32.u16 %r912,%r909; cvt.u16.u8 %r913,%r912; setp.ne.u16 %r914,%r913,0; @ %r914 bra $L678; .loc 1 1986 6 setp.eq.u32 %r915,%r277,0; @ %r915 bra $L765; .loc 1 1988 10 setp.eq.u32 %r916,%r266,0; @ %r916 bra $L766; $L770: .loc 1 2003 35 ld.u64 %r919,[%r318+16]; mul.lo.u64 %r918,%r86,%r919918920,[%value_in]; } .loc 1 2003 33 st.u64 [%r318],%r920; .loc 1 2004 10 setp.eq.u64 %r922,%r920,0; @ ! 
%r922 bra $L765; bra $L767; $L766: .loc 1 1990 4 setp.eq.u64 %r923,%r86,1; @ %r923 bra $L768; cvta.const.u64 %r927,$LC20; cvta.const.u64 %r926,__func__$5; mov.u32 %r925,1990; cvta.const.u64 %r924,$LC1899271992 7 setp.ne.u32 %r928,%r242,-1; @ %r928 bra $L769; .loc 1 1992 23 setp.ne.u64 %r929,%r23,0; @ %r929 bra $L770; .loc 1 1994 17 add.u32 %r242,%r212,1; .loc 1 1995 34 st.u8 [%r318+28],%r242; .loc 1 1996 34 cvt.s64.s32 %r930,%r320; st.u64 [%r318+16],%r930; bra $L771; $L769: .loc 1 1999 4 setp.le.s32 %r931,%r242,0; @ %r931 bra $L770; $L771: add.u64 %r206,%r318,40; add.u32 %r932,%r242,-1; cvt.u64.u32 %r933,%r932; add.u64 %r935,%r933,%r933; add.u64 %r936,%r935,%r933; shl.b64 %r937,%r936,3; add.u64 %r938,%r318,64; add.u64 %r280,%r937,%r938; .loc 1 2000 6 mov.u64 %r939,1; $L772: st.u64 [%r206+8],%r939; st.u64 [%r206+16],%r939; st.u64 [%r206],%r939; .loc 1 1999 4 add.u64 %r206,%r206,24; setp.ne.u64 %r942,%r206,%r280; @ %r942 bra $L772; bra $L770; $L767: .loc 1 200697324920stack; call caf_internal_error); } .loc 1 2007 4 bra $L678; $L765: .loc 1 2015 3 mov.u64 %r950,120; mov.u32 %r95frame950954,[%value_in]; } .loc 1 2016 5 mov.u64 %r956,0; st.u64 [%frame+648],%r956; .loc 1 2017 3 add.u64 %r970,%frame,648; ld.u64 %r962,[%r31frame3185274; .param .u32 %out_arg8; st.param.u32 [%out_arg8],%r320; .param .u32 %out_arg9; st.param.u32 [%out_arg9],%r321; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r956; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r956; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r324; .param .u32 %out_arg13; st.param.u32 [%out_arg13],%r325; call get_for_ref$constprop$0$isra$0,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13); } $L678: .loc 1 20_gfortran_caf_send_by_ref .visible .func _gfortran_caf_send_by_ref608u64 %r265; .reg .u64 %r266; .reg .u64 %r267; .reg .u64u64 %r280; .reg .u64u64u64 %r320; .reg .u64 %r321; .reg .u64 %r325; .reg .u64u64 %r349; .reg .pred %r351; .reg .u32 %r352; .reg .pred %r353; .reg .pred %r354; .reg .pred %r355; .reg .predpred %r373; .reg .pred %r375; .reg .pred %r377; .reg .pred %r379; .reg .pred %r381; .reg .pred %r383; .reg .predu64u64u64 %r410; .reg .u64u32 %r450; .reg .u16 %r451; .reg .pred %r452; .reg .pred %r453; .reg .u64 %r454; .reg .u64 %r458; .reg .pred %r459; .reg .u64 %r460; .reg .u64 %r461; .reg .u32 %r463; .reg .u32 %r464; .reg .u32 %r466; .reg .u32 %r467; .reg .u16 %r468; .reg .u16 %r469; .reg .u16 %r470; .reg .u32 %r471; .reg .u16 %r472; .reg .pred %r473; .reg .pred %r474; .reg .u64 %r475; .reg .u64 %r479; .reg .u64pred498; .reg .u32 %r500; .reg .u32 %r501; .reg .u16 %r502; .reg .u16 %r503; .reg .u16 %r504; .reg .u32 %r505; .reg .u16 %r506; .reg .pred %r507; .reg .predu64 %r514; .reg .u64 %r515; .reg .u64 %r516; .reg .u64 %r517; .reg .u64 %r519; .reg .pred %r520; .reg .u64 %r521; .reg .u64 %r522; .reg .u32 %r524; .reg .u32 %r525; .reg .u32 %r527; .reg .u32 %r528; .reg .u16 %r529; .reg .u16 %r530; .reg .u16 %r531; .reg .u32pred %r560; .reg .u64 %r564; .reg .pred %r567; .reg .u64 %r570; .reg .u64 %r571; .reg .u64 %r572; .reg .u64 %r573; .reg .u64 %r575; .reg .u64 %r576; .reg .u64 %r584; .reg .u64 %r585; .reg .pred %r586; .reg .u64 %r590; .reg .u64 %r592; .reg .u64 %r594; .reg .u64 %r595; .reg .u64 %r596; .reg .u64 %r597; .reg .u64 %r599; .reg .u64 %r600; .reg .u64 %r608; .reg .u64 %r609; .reg .u64 %r611; .reg .pred %r612; .reg .u64 %r616; .reg .u64 %r621; .reg .u64 %r622; .reg .u64 %r623; .reg .u64 %r624; .reg .u64 %r643; .reg .u64 %r644; .reg .u64 %r645; .reg .u64 %r646; .reg .u64 %r648; .reg .u64 %r649; .reg 
.u64 %r657; .reg .u64 %r658; .reg .pred %r659; .reg .u64 %r662; .reg .pred %r663; .reg .pred %r665; .reg .u16 %r666; .reg .pred %r667; .reg .pred %r669; .reg .predpred %r679; .reg .pred %r680; .reg .pred %r681; .reg .u64 %r682; .reg .u64 %r683; .reg .u64 %r685; .reg .u64 %r686; .reg .u64 %r687; .reg .u64 %r689; .reg .u64 %r690; .reg .u64 %r691; .reg .u64 %r693; .reg .u64 %r694; .reg .u64 %r696; .reg .u64 %r697; .reg .u64 %r698; .reg .u64 %r700; .reg .u64 %r701; .reg .u64 %r703; .reg .u64 %r704; .reg .u64 %r705; .reg .pred %r706; .reg .u64 %r707; .reg .u64 %r708; .reg .u32 %r710; .reg .u32 %r711; .reg .u32 %r713; .reg .u32 %r714; .reg .u16 %r715; .reg .u16 %r716; .reg .u16 %r717; .reg .u32 %r718; .reg .u16 %r719; .reg .pred %r720; .reg .pred %r721; .reg .u64 %r722; .reg .u64 %r726; .reg .u64 %r727; .reg .u64 %r729; .reg .u64 %r730; .reg .u64 %r731; .reg .u64 %r732; .reg .u64 %r734; .reg .pred %r735; .reg .u32 %r737; .reg .u32 %r738; .reg .u16 %r742; .reg .u16 %r743; .reg .u32 %r745; .reg .u16 %r746; .reg .pred %r747; .reg .pred %r748; .reg .u64 %r749; .reg .u64 %r751; .reg .u64 %r752; .reg .u64 %r753; .reg .u64 %r754; .reg .u64 %r756; .reg .u64 %r757; .reg .u64 %r765; .reg .pred %r766; .reg .u64 %r770; .reg .u64 %r773; .reg .u64 %r775; .reg .pred %r776; .reg .u64 %r780; .reg .pred %r783; .reg .u64 %r785; .reg .u32 %r786; .reg .u32 %r787; .reg .u32 %r789; .reg .u32 %r790; .reg .u16 %r791; .reg .u16 %r792; .reg .u16 %r793; .reg .u32 %r794; .reg .u16 %r795; .reg .pred %r796; .reg .u64 %r798; .reg .u32 %r800; .reg .u64 %r802; .reg .u64 %r811; .reg .u64 %r819; .reg .u64 %r820; .reg .pred %r821; .reg .u64 %r822; .reg .u32 %r823; .reg .u64 %r824; .reg .u64 %r825; .reg .u64 %r827; .reg .u64 %r828; .reg .u64 %r829; .reg .u64 %r830; .reg .u64 %r831; .reg .u64 %r832; .reg .u64 %r833; .reg .u64 %r834; .reg .u64 %r839; .reg .u64 %r844; .reg .u64 %r845; .reg .u64 %r846; .reg .u64 %r847; .reg .u32 %r848; .reg .u32 %r849; .reg .u16 %r850; .reg .pred %r851; mov.u64 %r285,%ar0; mov.u64 %r287,%ar2; mov.u64 %r288,%ar3; mov.u32 %r289,%ar4; mov.u32 %r290,%ar5; mov.u32 %r292,%ar7; mov.u64 %r293,%ar8; mov.u32 %r294,%ar9; .loc 1 2439 14 add.u64 %r833,%frame,358; cvta.const.u64 %r296,$LC9; mov.u64 %r297,683r296297301,[%value_in]; } .loc 1 2441 14 add.u64 %r829,%frame,479; cvta.const.u64 %r304,$LC21; mov.u64 %r305,590309,[%value_in]; } .loc 1 2443 14 add.u64 %r832,%frame,293; cvta.const.u64 %r312,$LC22; mov.u64 %r313,6513317,[%value_in]; } .loc 1 2445 14 add.u64 %r831,%frame,538; cvta.const.u64 %r320,$LC23; mov.u64 %r321,5421325,[%value_in]; } .loc 1 2447 14 add.u64 %r827,%frame,120; cvta.const.u64 %r328,$LC24; mov.u64 %r329,9629333,[%value_in]; } .loc 1 2449 14 add.u64 %r828,%frame,420; cvta.const.u64 %r336,$LC250341,[%value_in]; } .loc 1 2451 14 add.u64 %r830,%frame,216; cvta.const.u64 %r344,$LC26; mov.u64 %r345,784434349,[%value_in]; } .loc 1 2457 7 ld.s8 %r248,[%r287+28]; .loc 1 2461 9 ld.u64 %r251,[%r285]; .loc 1 2462 21 ld.u64 %r252,[%r285+8]; .loc 1 2469 6 setp.eq.u64 %r351,%r293,0; @ %r351 bra $L844; .loc 1 2470 11 mov.u32 %r352,0; st.u32 [%r293],%r352; $L844: .loc 1 2475 9 setp.eq.u64 %r353,%r288,0; @ %r353 bra $L843; mov.u64 %r153,%r288; mov.u64 %r152,%r252; mov.u64 %r240,%r251; mov.u32 %r239,0; mov.u64 %r129,1; .loc 1 2525 11 mov.u64 %r847,0; .loc 1 2624 34 set.u32.gt.s32 %r848,%r248,0; neg.s32 %r849,%r848; .loc 1 2624 22 cvt.u16.u32 %r850,%r849; .loc 1 2647 9 setp.ne.u32 %r851,%r292,0; bra $L908; $L914: mov.u64 %r153,%r274; $L908: .loc 1 2477 20 ld.u32 %r23,[%r153+8]; .loc 1 2477 7 setp.eq.u32 
%r354,%r23,1; @ %r354 bra $L846; setp.eq.u32 %r355,%r23,2; @ %r355 bra $L847; setp.ne.u32 %r356,%r23,0; @ %r356 bra $L848; .loc 1 2487 18 ld.u64 %r24,[%r153+32]; .loc 1 2492 29 ld.u64 %r357,[%r153+24]; add.u64 %r152,%r240,%r357; .loc 1 2487 7 setp.le.s64 %r358,%r24,0; @ %r358 bra $L911; .loc 1 2492 12 ld.u64 %r25,[%r152]; .loc 1 2492 11 setp.ne.u64 %r359,%r25,0; @ %r359 bra $L850; .loc 1 2496 8 @ %r851 bra $L912; .loc 1 2498 98r293stack; call caf_internal_error); } .loc 1 2499 9 bra $L843; $L850: .loc 1 2506 21 add.u64 %r367,%r240,%r24; .loc 1 2509 12 ld.u64 %r368,[%r367]; mov.u64 %r240,%r152; ld.u64 %r152,[%r368+8]; bra $L849; $L846: .loc 1 2519 7 setp.eq.u64 %r369,%r152,0; @ %r369 bra $L913; .loc 1 2520 13 ld.u64 %r240,[%r152]; bra $L852; $L913: mov.u64 %r152,%r287; $L852: .loc 1 2525 11 st.u64 [%frame+592],%r847; .loc 1 2525 31 ld.u8 %r171,[%r153+24]; .loc 1 2525 4 setp.eq.u32 %r371,%r171,0; @ %r371 bra $L849; add.u64 %r284,%r153,25; add.u64 %r282,%r153,48; add.u64 %r272,%r152,40; .loc 1 2667 4 mov.u64 %r844,1; .loc 1 2525 56 mov.u64 %r845,-24; sub.u64 %r846,%r845,%r153; $L886: cvt.u16.u32 %r372,%r171; setp.eq.u16 %r373,%r372,4; @ %r373 bra $L853; setp.gt.u16 %r375,%r372,4; @ %r375 bra $L854; setp.eq.u16 %r377,%r372,2; @ %r377 bra $L855; setp.eq.u16 %r379,%r372,3; @ %r379 bra $L856; setp.eq.u16 %r381,%r372,1; @ %r381 bra $L857; bra $L858; $L854: setp.eq.u16 %r383,%r372,5; @ %r383 bra $L859; setp.eq.u16 %r385,%r372,6; @ %r385 bra $L860; bra $L858; $L857: .loc 1 2530 11 ld.u64 %r269,[%r282+8]; .loc 1 2539 32 ld.u32 %r29,[%r282+16]; setp.eq.u32 %r386,%r29,4; @ %r386 bra $L861; setp.gt.s32 %r387,%r29,4; @ %r387 bra $L862; setp.eq.u32 %r388,%r29,1; @ %r388 bra $L863; setp.eq.u32 %r389,%r29,2; @ %r389 bra $L864; bra $L865; $L862: setp.eq.u32 %r390,%r29,8; @ %r390 bra $L866; setp.eq.u32 %r391,%r29,16; @ %r391 bra $L867; bra $L865; $L863: .loc 1 2541 7 ld.u64 %r392,[%r282]; ld.s8 %r393,[%r392]; ld.u64 %r395,[%r272+8]; sub.u64 %r394,%r393,%r395; ld.u64 %r397,[%r272]; mul.lo.u64 %r396,%r394,%r397; ld.u64 %r399,[%r153+16]; mad.lo.u64 %r240,%r396,%r399,%r240; bra $L868; $L864: .loc 1 2542 7 ld.u64 %r400,[%r282]; ld.s16 %r401,[%r400]; ld.u64 %r403,[%r272+8]; sub.u64 %r402,%r401,%r403; ld.u64 %r405,[%r272]; mul.lo.u64 %r404,%r402,%r405; ld.u64 %r407,[%r153+16]; mad.lo.u64 %r240,%r404,%r407,%r240; bra $L868; $L861: .loc 1 2543 7 ld.u64 %r408,[%r282]; ld.s32 %r409,[%r408]; ld.u64 %r411,[%r272+8]; sub.u64 %r410,%r409,%r411; ld.u64 %r413,[%r272]; mul.lo.u64 %r412,%r410,%r413; ld.u64 %r415,[%r153+16]; mad.lo.u64 %r240,%r412,%r415,%r240; bra $L868; $L866: .loc 1 2545 7 ld.u64 %r416,[%r282]; ld.u64 %r418,[%r416]; ld.u64 %r419,[%r272+8]; sub.u64 %r417,%r418,%r419; ld.u64 %r421,[%r272]; mul.lo.u64 %r420,%r417,%r421; ld.u64 %r423,[%r153+16]; mad.lo.u64 %r240,%r420,%r423,%r240; bra $L868; $L867: .loc 1 2548 7 ld.u64 %r424,[%r282]; ld.u64 %r426,[%r424]; ld.u64 %r427,[%r272+8]; sub.u64 %r425,%r426,%r427; ld.u64 %r429,[%r272]; mul.lo.u64 %r428,%r425,%r429; ld.u64 %r431,[%r153+16]; mad.lo.u64 %r240,%r428,%r431,%r240; bra $L868; $L865: .loc 1 2551 9 mov.u64 %r435,83r2933435stack; call caf_internal_error); } .loc 1 2552 9 bra $L843; $L855: .loc 1 2558 7 ld.u64 %r79,[%r282+16]; abs.s64 %r268,%r79; ld.u64 %r126,[%r272+16]; .loc 1 2541 7 ld.u64 %r262,[%r272+8]; .loc 1 2558 7 setp.le.s64 %r438,%r79,0; @ %r438 bra $L869; add.u64 %r439,%r126,1; sub.u64 %r269,%r439,%r262; bra $L870; $L869: add.u64 %r440,%r262,1; sub.u64 %r269,%r440,%r126; $L870: set.u32.le.s64 %r442,%r269,0; neg.s32 %r443,%r442; set.u32.le.s64 
%r445,%r268,0; neg.s32 %r446,%r445; cvt.u16.u32 %r448,%r443; cvt.u16.u32 %r449,%r446; or.b16 %r447,%r448,%r449; cvt.u32.u16 %r450,%r447; cvt.u16.u8 %r451,%r450; setp.ne.u16 %r452,%r451,0; @ %r452 bra $L843; setp.eq.u64 %r453,%r268,1; @ %r453 bra $L871; add.u64 %r454,%r269,-1; div.s64 %r458,%r454,%r268; add.u64 %r269,%r458,1; bra $L871; $L856: .loc 1 2569 5 ld.u64 %r94,[%r282+16]; abs.s64 %r266,%r94; .loc 1 2541 7 ld.u64 %r92,[%r282]; .loc 1 2530 32 ld.u64 %r93,[%r282+8]; .loc 1 2569 5 setp.le.s64 %r459,%r94,0; @ %r459 bra $L872; add.u64 %r460,%r93,1; sub.u64 %r269,%r460,%r92; bra $L873; $L872: add.u64 %r461,%r92,1; sub.u64 %r269,%r461,%r93; $L873: set.u32.le.s64 %r463,%r269,0; neg.s32 %r464,%r463; set.u32.le.s64 %r466,%r266,0; neg.s32 %r467,%r466; cvt.u16.u32 %r469,%r464; cvt.u16.u32 %r470,%r467; or.b16 %r468,%r469,%r470; cvt.u32.u16 %r471,%r468; cvt.u16.u8 %r472,%r471; setp.ne.u16 %r473,%r472,0; @ %r473 bra $L843; setp.eq.u64 %r474,%r266,1; @ %r474 bra $L874; add.u64 %r475,%r269,-1; div.s64 %r479,%r475,%r266; add.u64 %r269,%r479,1; $L874: .loc 1 2574 9 ld.u64 %r481,[%r272+8]; sub.u64 %r480,%r92,%r481; .loc 1 2575 9 ld.u64 %r483,[%r272]; mul.lo.u64 %r482,%r480,%r483; .loc 1 2576 9 ld.u64 %r485,[%r153+16]; .loc 1 2573 12 mad.lo.u64 %r240,%r482,%r485,%r240; .loc 1 2577 5 bra $L871; $L853: .loc 1 2581 9 ld.u64 %r487,[%r282]; ld.u64 %r488,[%r272+8]; sub.u64 %r486,%r487,%r488; .loc 1 2582 9 ld.u64 %r490,[%r272]; mul.lo.u64 %r489,%r486,%r490; .loc 1 2583 9 ld.u64 %r492,[%r153+16]; .loc 1 2580 12 mad.lo.u64 %r240,%r489,%r492,%r240; bra $L875; $L859: .loc 1 2587 7 ld.u64 %r118,[%r282+16]; abs.s64 %r265,%r118; .loc 1 2541 7 ld.u64 %r267,[%r282]; .loc 1 2558 7 ld.u64 %r243,[%r272+16]; .loc 1 2587 7 setp.le.s64 %r493,%r118,0; @ %r493 bra $L876; add.u64 %r494,%r243,1; sub.u64 %r269,%r494,%r267; bra $L877; $L876: add.u64 %r495,%r267,1; sub.u64 %r269,%r495,%r243; $L877: set.u32.le.s64 %r497,%r269,0; neg.s32 %r498,%r497; set.u32.le.s64 %r500,%r265,0; neg.s32 %r501,%r500; cvt.u16.u32 %r503,%r498; cvt.u16.u32 %r504,%r501; or.b16 %r502,%r503,%r504; cvt.u32.u16 %r505,%r502; cvt.u16.u8 %r506,%r505; setp.ne.u16 %r507,%r506,0; @ %r507 bra $L843; setp.eq.u64 %r508,%r265,1; @ %r508 bra $L878; add.u64 %r509,%r269,-1; div.s64 %r513,%r509,%r265; add.u64 %r269,%r513,1; $L878: .loc 1 2597 9 ld.u64 %r515,[%r272+8]; sub.u64 %r514,%r267,%r515; .loc 1 2598 9 ld.u64 %r517,[%r272]; mul.lo.u64 %r516,%r514,%r517; .loc 1 2599 9 ld.u64 %r519,[%r153+16]; .loc 1 2596 12 mad.lo.u64 %r240,%r516,%r519,%r240; .loc 1 2600 5 bra $L871; $L860: .loc 1 2603 7 ld.u64 %r139,[%r282+16]; abs.s64 %r261,%r139; .loc 1 2541 7 ld.u64 %r89,[%r272+8]; .loc 1 2530 32 ld.u64 %r90,[%r282+8]; .loc 1 2603 7 setp.le.s64 %r520,%r139,0; @ %r520 bra $L879; add.u64 %r521,%r90,1; sub.u64 %r269,%r521,%r89; bra $L880; $L879: add.u64 %r522,%r89,1; sub.u64 %r269,%r522,%r90; $L880: set.u32.le.s64 %r524,%r269,0; neg.s32 %r525,%r524; set.u32.le.s64 %r527,%r261,0; neg.s32 %r528,%r527; cvt.u16.u32 %r530,%r525; cvt.u16.u32 %r531,%r528; or.b16 %r529,%r530,%r531; cvt.u32.u16 %r532,%r529; cvt.u16.u8 %r533,%r532; setp.ne.u16 %r534,%r533,0; @ %r534 bra $L843; setp.eq.u64 %r535,%r261,1; @ %r535 bra $L871; add.u64 %r536,%r269,-1; div.s64 %r540,%r536,%r261; add.u64 %r269,%r540,1; bra $L871; $L858: .loc 1 2616 5 mov.u64 %r544,8293544544stack; call caf_internal_error); } .loc 1 2617 5 bra $L843; $L868: .loc 1 2620 11 setp.le.s64 %r547,%r269,0; @ %r547 bra $L843; $L871: .loc 1 2624 18 set.u32.gt.s64 %r549,%r269,1; neg.s32 %r550,%r549; .loc 1 2624 22 cvt.u16.u32 %r555,%r550; 
[PTX assembly, extraction-damaged — libgfortran caf/single.c (single-image coarray runtime): the tail of the send/get "by reference" helper. It walks the chain of caf_reference entries, compares each array reference's extent (ubound + 1 - lbound) against the destination descriptor, calls caf_internal_error on rank or size mismatches, accumulates the total element count, and finally invokes the send_by_ref$constprop clone. It is followed by the start of _gfortran_caf_sendget_by_ref, which forwards its arguments to _gfortran_caf_get_by_ref and then to _gfortran_caf_send_by_ref.]
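The sendget entry point above appears to be a thin composition of the two by-reference primitives. Below is a minimal C sketch of that control flow, assuming simplified stand-in types and helper names (temp_desc_sketch, get_by_ref_sketch, send_by_ref_sketch are hypothetical; this is not libgfortran's actual API or source):

#include <stdlib.h>

struct temp_desc_sketch { void *base_addr; };   /* stand-in for the on-stack descriptor seen in the PTX */

/* hypothetical stand-ins for the real get/send helpers */
static void get_by_ref_sketch(void *token, void *refs,
                              struct temp_desc_sketch *t, int *stat)
{ (void) token; (void) refs; (void) t; if (stat) *stat = 0; }

static void send_by_ref_sketch(void *token, void *refs,
                               struct temp_desc_sketch *t, int *stat)
{ (void) token; (void) refs; (void) t; if (stat) *stat = 0; }

static void sendget_by_ref_sketch(void *dst_token, void *dst_refs,
                                  void *src_token, void *src_refs,
                                  int *dst_stat, int *src_stat)
{
  struct temp_desc_sketch tmp = { 0 };
  get_by_ref_sketch(src_token, src_refs, &tmp, src_stat);   /* fetch source data into a temporary */
  if (src_stat == NULL || *src_stat == 0)                   /* forward only if the get succeeded */
    send_by_ref_sketch(dst_token, dst_refs, &tmp, dst_stat);
  free(tmp.base_addr);                                      /* release the temporary buffer, if any */
}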
[PTX assembly, extraction-damaged — libgfortran caf/single.c: the remainder of _gfortran_caf_sendget_by_ref, then _gfortran_caf_atomic_define and _gfortran_caf_atomic_ref (plain 32-bit load/store, rejecting kinds other than 4), _gfortran_caf_atomic_cas (atom.cas.b32), _gfortran_caf_atomic_op (atom.add / and / or / xor on a 32-bit value, with optional old-value and stat outputs), _gfortran_caf_event_post and _gfortran_caf_event_wait (atomic increment and decrement of a 32-bit event counter), _gfortran_caf_event_query, and _gfortran_caf_lock / _gfortran_caf_unlock (a byte flag per lock, with acquired_lock/stat outputs and an error path through _gfortran_caf_error_stop_str). The start of _gfortran_caf_is_present follows.]
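The atomic and event routines above are straightforward in the single-image case. The following C sketch mirrors the dispatch visible in the PTX for the atomic op and the event post, using GCC's __atomic builtins; the enum values and all names are assumptions for illustration, not taken from libgfortran headers:

#include <stdint.h>
#include <stddef.h>

enum sketch_atomic_op { OP_ADD = 1, OP_AND = 2, OP_OR = 3, OP_XOR = 4 };

/* only 4-byte integers are handled; the op code selects the operation */
static void atomic_op_sketch(int op, uint32_t *atom, uint32_t value,
                             uint32_t *old, int *stat)
{
  uint32_t prev;
  switch (op)
    {
    case OP_ADD: prev = __atomic_fetch_add(atom, value, __ATOMIC_RELAXED); break;
    case OP_AND: prev = __atomic_fetch_and(atom, value, __ATOMIC_RELAXED); break;
    case OP_OR:  prev = __atomic_fetch_or (atom, value, __ATOMIC_RELAXED); break;
    default:     prev = __atomic_fetch_xor(atom, value, __ATOMIC_RELAXED); break;
    }
  if (old)  *old = prev;   /* optional "old value" output */
  if (stat) *stat = 0;     /* success */
}

/* event post in the PTX is a bare atomic increment of a 32-bit counter */
static void event_post_sketch(uint32_t *event, int *stat)
{
  __atomic_fetch_add(event, 1, __ATOMIC_RELAXED);
  if (stat) *stat = 0;
}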
[PTX assembly, extraction-damaged — libgfortran caf/single.c: _gfortran_caf_is_present, which walks a caf_reference chain (component, array and static-array cases) to decide whether the referenced allocatable object is allocated. This is followed by the libcaf_single.la libtool stub (-lcaf_single -lm) and the start of the libgfortran.a archive member index: compile_options.o/ maxloc0_16_i16.o/ maxloc0_16_r10.o/ maxloc0_16_r16.o/ maxloc1_16_i16.o/ maxloc1_16_r10.o/ maxloc1_16_r16.o/ minloc0_16_i16.o/ minloc0_16_r10.o/ minloc0_16_r16.o/ minloc1_16_i16.o/ minloc1_16_r10.o/ minloc1_16_r16.o/ matmulavx128_i1.o/ matmulavx128_i2.o/]
[Archive member index, continued: matmulavx128_i4.o/ matmulavx128_i8.o/ matmulavx128_i16.o/ matmulavx128_r4.o/ matmulavx128_r8.o/ matmulavx128_r10.o/ matmulavx128_r16.o/ matmulavx128_c4.o/ matmulavx128_c8.o/ matmulavx128_c10.o/ matmulavx128_c16.o/ cshift1_16_i16.o/ cshift1_16_r10.o/ cshift1_16_r16.o/ cshift1_16_c10.o/ cshift1_16_c16.o/ ISO_Fortran_binding.o/ size_from_kind.o/ extends_type_of.o/ selected_char_kind.o/ spread_generic.o/ string_intrinsics.o/ reshape_generic.o/ reshape_packed.o/ selected_int_kind.o/ selected_real_kind.o/ unpack_generic.o/ in_pack_generic.o/ in_unpack_generic.o/ misc_specifics.o/ bounds.o/ — then the bounds.o PTX begins: declarations for _gfortrani_bounds_iforeach_return, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents and _gfortrani_bounds_reduced_extents (plus _gfortrani_count_0, _gfortran_size0, _gfortran_runtime_error and _gfortrani_internal_error), and .const byte-array definitions of their error strings ("... size in %s of %s intrinsic: should not be zero-sized", "... size of %s of %s intrinsic: should be zero-sized", "Funny sized logical array in count_...", the __func__ names "bounds_reduced_extents" and "bounds_equal_extents").]
[Further .const byte-array string definitions for the bounds checks ("...correct extent in return value of %s intrinsic: is %ld, should be %ld", "...correct size in/of %s of %s intrinsic: should (not) be zero-sized", "...correct extent in %s of %s intrinsic in dimension %ld: is %ld, should be %ld", the assertion text "GFC_DESCRIPTOR_RANK(a) == GFC_DESCRIPTOR_RANK(b)" and the bounds.c file name), followed by extraction-damaged PTX for _gfortrani_bounds_iforeach_return (checks that the return array's extent matches the source rank, otherwise _gfortran_runtime_error) and the start of _gfortrani_bounds_ifunction_return.]
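The bounds routines above all reduce to comparing per-dimension extents of two descriptors and raising a runtime error naming the first mismatch. A minimal C sketch of that check follows, assuming a simplified stand-in descriptor layout (dim_sketch/desc_sketch are illustrative, not the real gfc_array descriptor, and the error path is simplified to fprintf/exit):

#include <stdio.h>
#include <stdlib.h>

struct dim_sketch  { long lower, upper; };
struct desc_sketch { int rank; struct dim_sketch dim[7]; };

static void check_equal_extents_sketch(const struct desc_sketch *a,
                                       const struct desc_sketch *b,
                                       const char *intrinsic)
{
  for (int d = 0; d < a->rank; d++)
    {
      long ea = a->dim[d].upper - a->dim[d].lower + 1;   /* extent = ubound - lbound + 1 */
      long eb = b->dim[d].upper - b->dim[d].lower + 1;
      if (ea != eb)
        {
          fprintf(stderr,
                  "Incorrect extent in %s intrinsic in dimension %d: is %ld, should be %ld\n",
                  intrinsic, d + 1, eb, ea);
          exit(2);
        }
    }
}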
[PTX assembly, extraction-damaged — libgfortran runtime/bounds.c: the rest of _gfortrani_bounds_ifunction_return, then _gfortrani_bounds_equal_extents and _gfortrani_bounds_reduced_extents (each compares two descriptors' extents dimension by dimension, after first asserting the expected rank relationship, and calls _gfortran_runtime_error with the offending dimension on mismatch), and _gfortrani_count_0 (counts true elements of a logical array, accepting element sizes 1, 2, 4, 8 and 16 and calling _gfortrani_internal_error for a "Funny sized logical array"). The compile_options.o member follows: declarations for _gfortran_set_options, _gfortrani_init_compile_options, _gfortran_set_convert, _gfortran_set_record_marker and _gfortran_set_max_subrecord_length, the _gfortrani_compile_options global, and the "...lid value for record marker" / "...value for maximum subrecord length" error strings.]
[PTX assembly, extraction-damaged — libgfortran runtime/compile_options.c: _gfortran_set_options copies up to seven option words from the caller into _gfortrani_compile_options; _gfortrani_init_compile_options stores the defaults; _gfortran_set_convert stores the convert field; _gfortran_set_record_marker accepts only 4 or 8 and otherwise calls _gfortran_runtime_error; _gfortran_set_max_subrecord_length range-checks its argument before storing it. The memory.o member follows: declarations for _gfortrani_xmalloc, _gfortrani_xmallocarray, _gfortrani_xcalloc and _gfortrani_xrealloc with their error strings ("...failed in xmallocarray", "Allocating cleared memory failed...", "...failure in xrealloc"), and the start of _gfortrani_xmalloc (malloc of at least one byte, _gfortran_os_error on NULL).]
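The record-marker setter above has a particularly simple visible shape: accept 4 or 8, reject everything else. A hedged C sketch of that behavior, with an illustrative stand-in for the compile_options structure and a simplified error path (not the libgfortran source):

#include <stdio.h>
#include <stdlib.h>

static struct { size_t record_marker; } compile_options_sketch;

static void set_record_marker_sketch(int length)
{
  switch (length)
    {
    case 4:
    case 8:
      compile_options_sketch.record_marker = (size_t) length;
      break;
    default:
      fputs("Invalid value for record marker\n", stderr);  /* matches the string table above */
      exit(2);
    }
}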
[PTX assembly, extraction-damaged — libgfortran runtime/memory.c: the end of _gfortrani_xmalloc, then _gfortrani_xmallocarray (a zero element count or element size is treated as a one-byte allocation; otherwise nelems * elsize is computed with an explicit 64-bit overflow check, and on overflow errno is set to 12 (ENOMEM) and _gfortran_os_error is called; a failed malloc is also fatal), _gfortrani_xcalloc and _gfortrani_xrealloc. The string.o member follows: declarations for _gfortrani_fstrlen, _gfortrani_fstrcpy, _gfortrani_cf_strcpy, _gfortrani_fc_strdup, _gfortrani_fc_strdup_notrim, _gfortrani_find_option and _gfortrani_gfc_itoa (with external strncasecmp, strndup, __umodti3, __udivti3), and the start of _gfortrani_fstrlen.]
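The overflow-checked allocation is worth spelling out. This C sketch reproduces the behavior described above with a division-based overflow test instead of the PTX's 32x32 multiply decomposition; all names are illustrative, not libgfortran's:

#include <stdlib.h>
#include <stdio.h>

static void *xmallocarray_sketch(size_t nelem, size_t elsize)
{
  size_t bytes;
  if (nelem == 0 || elsize == 0)
    bytes = 1;                              /* allocate a token byte for empty arrays */
  else if (nelem > (size_t) -1 / elsize)    /* would nelem * elsize overflow? */
    {
      fputs("Integer overflow in xmallocarray\n", stderr);
      exit(2);
    }
  else
    bytes = nelem * elsize;

  void *p = malloc(bytes);
  if (!p)
    {
      fputs("Memory allocation failed in xmallocarray\n", stderr);
      exit(2);
    }
  return p;
}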
[PTX assembly, extraction-damaged — libgfortran runtime/string.c: _gfortrani_fstrlen (walks back from the declared length past trailing blanks), _gfortrani_fstrcpy and _gfortrani_cf_strcpy (copy between C and Fortran strings with truncation or blank padding), _gfortrani_fc_strdup (trims trailing blanks, then strndup, calling _gfortran_os_error on failure) and _gfortrani_fc_strdup_notrim, _gfortrani_find_option (trims the option name, walks an option table with strncasecmp and calls _gfortran_generate_error if nothing matches), and _gfortrani_gfc_itoa (converts a 128-bit integer to decimal digit by digit via __umodti3/__udivti3, prefixing '-' for negative values). The select.o member follows with declarations for _gfortran_select_string and _gfortran_select_string_char4 and the external _gfortran_compare_string / _gfortran_compare_string_char4.]
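Fortran character arguments are blank padded to their declared length, which is what the fstrlen routine undoes. A minimal C sketch of the trailing-blank trimming visible in the PTX (the comparison against character code 32), with an illustrative name:

#include <stddef.h>

static size_t fstrlen_sketch(const char *s, size_t len)
{
  while (len > 0 && s[len - 1] == ' ')   /* Fortran strings are blank padded */
    len--;
  return len;
}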
[PTX assembly, extraction-damaged — libgfortran runtime/select_inc.c: _gfortran_select_string and its UCS-4 twin _gfortran_select_string_char4. Each walks a table of CASE ranges: an entry with null low and high bounds supplies the default result, leading unbounded ranges are handled linearly, and the remaining ranges are binary-searched using _gfortran_compare_string (respectively _gfortran_compare_string_char4), returning the matching case value or the default (-1 when there is none). The minimal.o member follows: declarations for _gfortrani_stupid_function_name_for_static_linking, _gfortran_set_args, _gfortrani_get_args, _gfortrani_estr_write, _gfortrani_st_printf, _gfortrani_sys_abort, recursion_check, _gfortrani_exit_error, _gfortrani_show_locus, _gfortran_os_error and _gfortran_runtime_error, continuing below.]
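The selection strategy amounts to a binary search over ranges sorted on their low bound. The sketch below shows that idea in plain C with strcmp standing in for the Fortran blank-padded comparison; the case_sketch type and the overall simplification are assumptions, not the exact libgfortran algorithm:

#include <stddef.h>
#include <string.h>

struct case_sketch { const char *low, *high; int value; };

static int select_string_sketch(const struct case_sketch *table, int n,
                                const char *s)
{
  int def = -1;

  /* an entry with no bounds at all supplies the default result */
  for (int i = 0; i < n; i++)
    if (!table[i].low && !table[i].high)
      def = table[i].value;

  /* binary search for the last entry whose low bound is <= s
     (table assumed sorted by low bound, NULL meaning unbounded below) */
  int lo = 0, hi = n - 1, best = -1;
  while (lo <= hi)
    {
      int mid = (lo + hi) / 2;
      if (!table[mid].low || strcmp(table[mid].low, s) <= 0)
        { best = mid; lo = mid + 1; }
      else
        hi = mid - 1;
    }

  /* accept the candidate only if s also falls under its high bound */
  if (best >= 0 && (!table[best].high || strcmp(s, table[best].high) <= 0))
    return table[best].value;
  return def;
}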
[PTX assembly, extraction-damaged — libgfortran runtime/minimal.c (the cut-down nvptx runtime): the remaining declarations (_gfortran_runtime_error_at, _gfortran_runtime_warning_at, _gfortrani_internal_error, _gfortran_stop_numeric, _gfortran_stop_string, _gfortran_error_stop_string, _gfortran_error_stop_numeric), the argc_save/argv_save/magic$0 globals, the _gfortrani_options table, and the message strings ("Fortran runtime warning: ", "ERROR STOP ", "Program aborted.", "Internal Error: ", "STOP %d", "STOP ", "ERROR STOP %d", "At line %d of file %s (unit = %d...)", "Operating system error: ", "Fortran runtime error: "). Then the bodies: _gfortrani_stupid_function_name_for_static_linking is empty; _gfortran_set_args/_gfortrani_get_args store and reload argc/argv; _gfortrani_estr_write writes a string to file descriptor 2; _gfortrani_st_printf forwards to the vsnprintf-based formatter; _gfortrani_sys_abort and _gfortrani_exit_error optionally print a backtrace notice depending on the options; recursion_check compares magic$0 against 551452929 and aborts on re-entry; _gfortrani_show_locus prints the "At line %d of file %s" locus; _gfortran_os_error(_at), _gfortran_runtime_error(_at), _gfortran_runtime_warning_at and _gfortrani_internal_error each run recursion_check, emit their prefix and message via _gfortrani_estr_write, and exit with a small status; _gfortran_stop_numeric prints "STOP %d" unless told to be quiet, and _gfortran_stop_string begins.]
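The error machinery above is deliberately minimal: messages go straight to file descriptor 2 and fatal errors exit with a small status. A C sketch of that structure, mirroring the prefix string from the table above but with illustrative names and a simplified exit status (not the exact libgfortran code):

#include <unistd.h>
#include <string.h>
#include <stdlib.h>

static void estr_write_sketch(const char *s)
{
  write(2, s, strlen(s));            /* fd 2 == stderr, as in the PTX */
}

static void runtime_error_sketch(const char *message)
{
  estr_write_sketch("Fortran runtime error: ");
  estr_write_sketch(message);
  estr_write_sketch("\n");
  exit(2);                           /* fatal: no I/O or unwinding in the minimal runtime */
}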
.reg .u16 %r38; .reg .pred10 set.u32.ne.u64 %r29,%r25,0; neg.s32 %r30,%r29; .loc 1 345 6 cvt.u16.u32 %r32,%r27; xor.b16 %r31,%r32,1; .loc 1 348 10 cvt.u16.u32 %r35,%r30; and.b16 %r34,%r35,%r31; cvt.u32.u16 %r37,%r34; cvt.u16.u8 %r38,%r37; setp.eq.u16 %r39,%r38,0; @ %r39 bra $L35; .loc 1 350 4 cvta.const.u64 %r40,$LC12gfortrani_estr_write351 11 mov.u32 %r42,2.loc 1 352 4 cvta.const.u64 %r4646; call (%value_in),_gfortrani_estr_write$L35} // BEGIN GLOBAL FUNCTION DEF: _gfortran_error_stop_string .visible .func _gfortran_error_stop_stringpred.loc 1 3740; .loc 1 373 7 cvta.const.u64 %r26,$LC1326; call (%value_in),_gfortrani_estr_write,(4 14 mov.u32 %r28,2cvta.const.u64 %r32; call (%value_in),_gfortrani_estr_write,(%out_arg1); ld.param.u64 %r33,[%value_in]; } $L40: .loc 1 377 3 mov.u32 %r34; call _gfortrani_exit_error_gfortran_error_stop_numeric .visible .func _gfortran_error_stop_numericpred %r24; .reg .u64 %r25; .reg .u32 %r26; mov.u32 %r22,%ar0; mov.u32 %r23,%ar1; .loc 1 389 6 setp.ne.u32 %r24,%r23,0; @ %r24 bra $L42; .loc 1 392 7 st.u32 [%stack],%r22; cvta.const.u64 %r25,$LC14_gfortrani_st_$L42: .loc 1 394 3_gfortrani_exit_errorall_l1.o/_gfortran_all_l1 .visible .func _gfortran_all_l1fortran/generated/all_l1.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,32,105,110,32,65,76,76,32,105,110,116,114,105,110,115,105,99,0 }114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,65,76,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,10810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,65,76,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_all_l1 .visible .func _gfortran_all_l1480pred %r217; .reg .predu64 %r228; .reg .pred %r229; .reg .pred %r231; .reg .u64 %r232; .reg .predu32u64 %r250; .reg .u64 %r251; .reg .u64 %r252; .reg .u64 %r253; .reg .predu32u16u64pred %r293; .reg .u32 %r294; .reg .u32predu64 %r318; .reg .u6464u64 %r341; .reg .u32predpred %r355; .reg .u64 %r356; .reg .u64 %r358; .reg .u64u64 %r375; .reg .u64 %r376; mov.u64 %r196,%ar0; mov.u64 %r197,%ar1; mov.u64 %r198,%ar2; .loc 1 56 10 ld.u64 %r22,[%r198]; .loc 1 56 7 add.u64 %r121,%r22,-1; .loc 1 57 10 ld.s8 %r23,[%r197+28]; .loc 1 57 38 add.u32 %r25,%r23,-1; .loc 1 57 8 cvt.s64.s32 %r122,%r25; .loc 1 59 14 ld.u64 %r26,[%r197+16]; .loc 1 61 9 add.u64 %r200,%r121,%r121; add.u64 %r201,%r200,%r121; shl.b64 %r202,%r201,3; add.u64 %r203,%r197,%r202; ld.u64 %r27,[%r203+56]; ld.u64 %r29,[%r203+48]; .loc 1 65 11 ld.u64 %r30,[%r203+40]; .loc 1 67 3 setp.gt.s64 %r217,%r121,0; @ %r217 bra $L2; $L8: .loc 1 75 3 setp.lt.s64 %r218,%r121,%r122; @ %r218 bra $L3; bra $L4; $L2: add.u64 %r52,%r197,40; add.u64 %r148,%frame,120; add.u64 %r67,%frame,240; add.u64 %r220,%r22,%r22; add.u64 %r221,%r220,%r22; shl.b64 %r222,%r221,3; add.u64 %r223,%r197,16; add.u64 %r193,%r222,%r223; .loc 1 73 12 mov.u64 %r376,0; $L7: .loc 1 69 20 ld.u64 %r225,[%r52]; mul.lo.u64 %r224,%r225,%r26; .loc 1 69 18 
Body of _gfortran_all_l1: it first sets up the result descriptor, allocating its data through _gfortrani_xmallocarray when the result has no storage yet, reporting rank or extent mismatches through _gfortran_runtime_error, and reporting an unexpected logical element size through _gfortrani_internal_error; it then walks the source array along the reduced dimension and stores 1 into each destination element unless a zero logical value is found, in which case it stores 0.
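As a reading aid, here is a minimal C sketch of that descriptor handling, reconstructed from the loads, compares and calls visible in the PTX. The struct layout, the helper names without their _gfortran/_gfortrani prefixes, and the bounds_check parameter are stand-ins, not the shipped libgfortran code.

#include <stddef.h>

extern void *xmallocarray (size_t nelem, size_t elem_size); /* stand-in for _gfortrani_xmallocarray */
extern void runtime_error (const char *fmt, ...);           /* stand-in for _gfortran_runtime_error */

struct descriptor {        /* stand-in for the array descriptor the PTX reads */
  void *base_addr;
  signed char rank;        /* loaded with ld.s8 at offset 28 in the PTX */
  ptrdiff_t extent[7];
};

/* Allocate or validate the ALL result descriptor: allocate when no data is
   attached; otherwise check the rank, and check the per-dimension extents
   only when bounds checking was compiled in.  */
static void
prepare_result (struct descriptor *ret, ptrdiff_t want_rank,
                const ptrdiff_t *want_extent, int bounds_check)
{
  if (ret->base_addr == NULL)
    {
      size_t n = 1;
      for (ptrdiff_t d = 0; d < want_rank; d++)
        n *= (size_t) want_extent[d];
      ret->base_addr = xmallocarray (n, 1);   /* 1-byte elements in all_l1 */
      return;
    }
  if (ret->rank != want_rank)
    runtime_error ("rank of return array incorrect in ALL intrinsic:"
                   " is %ld, should be %ld",
                   (long) ret->rank, (long) want_rank);
  if (bounds_check)
    for (ptrdiff_t d = 0; d < want_rank; d++)
      if (ret->extent[d] != want_extent[d])
        runtime_error ("Incorrect extent in return value of ALL intrinsic"
                       " in dimension %d: is %ld, should be %ld",
                       (int) (d + 1), (long) ret->extent[d],
                       (long) want_extent[d]);
}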
Archive member all_l2.o, function _gfortran_all_l2: the same ALL reduction for logical(kind=2), compiled from generated/all_l2.c, with the same extern declarations and embedded error texts; its code differs from all_l1 essentially in writing 2-byte result elements.
Archive member all_l4.o, function _gfortran_all_l4: the ALL reduction for logical(kind=4), compiled from generated/all_l4.c, with the same structure and error texts and 4-byte result elements.
Archive member all_l8.o, function _gfortran_all_l8: the ALL reduction for logical(kind=8), compiled from generated/all_l8.c, with the same structure and error texts and 8-byte result elements.
Archive member all_l16.o, function _gfortran_all_l16: the ALL reduction for logical(kind=16), compiled from generated/all_l16.c, with the same structure and error texts; its 16-byte result elements are written as two 64-bit stores.
Archive member any_l1.o, function _gfortran_any_l1: the ANY intrinsic reduction for logical(kind=1), compiled from generated/any_l1.c. It uses the same descriptor handling and extern declarations as all_l1, its embedded error texts name the ANY intrinsic, and its inner loop stores 0 by default and 1 as soon as a nonzero element is found.
add.u64 %r161,%r161,1; .loc 1 189 12 add.u64 %r137,%r137,%r86; .loc 1 190 12 add.u64 %r138,%r138,%r88; .loc 1 192 13 setp.ne.u64 %r347,%r161,%r159; @ %r347 bra $L36; .loc 1 199 16 add.u64 %r134,%r137,%r146; .loc 1 200 16 add.u64 %r135,%r138,%r157; .loc 1 202 14 @ %r364 bra $L1; add.u64 %r32,%frame,368; mov.u64 %r160,8; .loc 1 201 12 mov.u64 %r136,1; bra $L34; $L35: .loc 1 196 20 st.u64 [%r32],%r373; .loc 1 199 30 mul.lo.u64 %r353,%r98,%r97; .loc 1 199 16 sub.u64 %r134,%r137,%r353; .loc 1 200 30 mul.lo.u64 %r354,%r100,%r97; .loc 1 200 16 sub.u64 %r135,%r138,%r354; .loc 1 201 12 add.u64 %r136,%r136,1; .loc 1 202 14 add.u64 %r32,%r32,8; add.u64 %r160,%r160,8; setp.eq.u64 %r355,%r136,%r370; @ %r355 bra $L1; $L34: .loc 1 210 23 ld.u64 %r356,[%r32]; add.u64 %r97,%r356,1; st.u64 [%r32],%r97; .loc 1 211 30 add.u64 %r358,%r371,%r160; ld.u64 %r98,[%r358]; .loc 1 211 20 add.u64 %r137,%r134,%r98; .loc 1 212 30 add.u64 %r359,%frame,%r160; ld.u64 %r100,[%r359]; .loc 1 212 20 add.u64 %r138,%r135,%r100; .loc 1 192 32 add.u64 %r361,%r372,%r160; ld.u64 %r103,[%r361]; .loc 1 192 13 setp.eq.u64 %r362,%r97,%r103; @ %r362 bra $L35; .loc 1 196 20 mov.u64 %r161,0; bra $L36; $L1: .loc 1 216 1 ret; } any_l2.o/_gfortran_any_l2 .visible .func _gfortran_any_l2fortran/generated/any_lgfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,32,105,110,32,65,78,89,32,105,110,116,114,105,110,115,105,99,0 }114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,65,78,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,10810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,65,78,89,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_any_l2 .visible .func _gfortran_any_l2480u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .u64 %r232; .reg .u64 %r233; .reg .pred %r234; .reg .pred %r236; .reg .u64u64u64 %r257; .reg .u64 %r258; .reg .predu64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64pred %r320; .reg .u64u64 %r332; .reg .u64 %r333; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .pred %r338; .reg .pred %r339; .reg .pred %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64 %r346; .reg .u64 %r347; .reg .u64 %r348; .reg .u16 %r350; .reg .u1664u64 %r370; .reg .u64 %r372; .reg .predmov.u64 %r201,%ar0; mov.u64 %r202,%ar1; mov.u64 %r203,%ar2; .loc 1 56 10 ld.u64 %r22,[%r203]; .loc 1 56 7 add.u64 %r126,%r22,-1; .loc 1 57 10 ld.s8 %r23,[%r202+28]; .loc 1 57 38 add.u32 %r25,%r23,-1; .loc 1 57 8 cvt.s64.s32 %r127,%r25; .loc 1 59 14 ld.u64 %r26,[%r202+16]; .loc 1 61 9 add.u64 %r205,%r126,%r126; add.u64 %r206,%r205,%r126; shl.b64 %r207,%r206,3; add.u64 %r208,%r202,%r207; ld.u64 %r27,[%r208+56]; ld.u64 %r29,[%r208+48]; .loc 1 65 11 ld.u64 %r30,[%r208+40]; .loc 1 67 3 setp.gt.s64 %r222,%r126,0; @ %r222 bra $L2; $L8: .loc 1 75 3 setp.lt.s64 %r223,%r126,%r127; @ %r223 bra $L3; bra $L4; $L2: add.u64 %r151,%r202,40; add.u64 %r52,%frame,120; add.u64 %r109,%frame,240; add.u64 
%r225,%r22,%r22; add.u64 %r226,%r225,%r22; shl.b64 %r227,%r226,3; add.u64 %r228,%r202,16; add.u64 %r160,%r227,%r228; .loc 1 73 12 mov.u64 %r387,0; $L7: .loc 1 69 20 ld.u64 %r230,[%r151]; mul.lo.u64 %r229,%r230,%r26; .loc 1 69 18 st.u64 [%r52],%r229; .loc 1 70 19 ld.u64 %r232,[%r151+16]; add.u64 %r231,%r232,1; ld.u64 %r233,[%r151+8]; sub.u64 %r41,%r231,%r233; .loc 1 72 10 setp.lt.s64 %r234,%r41,0; @ %r234 bra $L5; .loc 1 70 17 st.u64 [%r109],%r41; bra $L6; $L5: .loc 1 73 12 st.u64 [%r109],%r387; $L6: .loc 1 67 3 add.u64 %r151,%r151,24; add.u64 %r52,%r52,8; add.u64 %r109,%r109,8; setp.ne.u64 %r236,%r151,%r160; @ %r236 bra $L7; bra $L8; $L4: .loc 1 84 6 ld.u64 %r237,[%r201]; setp.eq.u64 %r238,%r237,0; @ ! %r238 bra $L51; bra $L9; $L3: add.u64 %r240,%r22,%r22; add.u64 %r241,%r240,%r22; shl.b64 %r242,%r241,3; add.u64 %r243,%r242,40; add.u64 %r145,%r202,%r243; shl.b64 %r244,%r22,3; add.u64 %r112,%r244,-8; add.u64 %r376,%frame,120; add.u64 %r117,%r376,%r112; add.u64 %r375,%frame,240; add.u64 %r110,%r375,%r112; add.u64 %r247,%r202,40; cvt.u32.u32 %r249,%r23; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r198,%r247,%r253; .loc 1 81 12 mov.u64 %r386,0; $L13: .loc 1 77 20 ld.u64 %r255,[%r145]; mul.lo.u64 %r254,%r255,%r26; .loc 1 77 18 st.u64 [%r117],%r254; .loc 1 78 19 ld.u64 %r257,[%r145+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r145+8]; sub.u64 %r49,%r256,%r258; .loc 1 80 10 setp.lt.s64 %r259,%r49,0; @ %r259 bra $L11; .loc 1 78 17 st.u64 [%r110],%r49; bra $L12; $L11: .loc 1 81 12 st.u64 [%r110],%r386; $L12: .loc 1 75 3 add.u64 %r145,%r145,24; add.u64 %r117,%r117,8; add.u64 %r110,%r110,8; setp.ne.u64 %r261,%r145,%r198; @ %r261 bra $L13; bra $L4; $L9: .loc 1 88 7 setp.le.s32 %r262,%r25,0; @ %r262 bra $L14; add.u64 %r179,%r201,40; add.u64 %r177,%frame,240; add.u64 %r263,%r201,16; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r171,%r263,%r269; .loc 1 91 17 mov.u64 %r115,1; bra $L15; $L14: .loc 1 99 24 mov.u64 %r270,0; st.u64 [%r201+8],%r270; .loc 1 100 28 cvt.u16.u32 %r273,%r23; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r201+28],%r274; .loc 1 102 20 add.u32 %r275,%r23,-2; cvt.s64.s32 %r58,%r275; add.u64 %r277,%r58,%r58; add.u64 %r278,%r277,%r58; shl.b64 %r279,%r278,3; add.u64 %r280,%r201,%r279; .loc 1 102 67 shl.b64 %r282,%r58,3; add.u64 %r283,%frame,%r282; .loc 1 102 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r134,%r285,%r286; .loc 1 104 10 setp.eq.u64 %r287,%r134,0; @ %r287 bra $L16; bra $L52; $L18: .loc 1 93 55 mul.lo.u64 %r115,%r54,%r115; $L15: .loc 1 95 4 st.u64 [%r179+8],%r237; ld.u64 %r54,[%r177]; add.u64 %r289,%r54,-1; st.u64 [%r179+16],%r289; st.u64 [%r179],%r115; .loc 1 88 7 add.u64 %r179,%r179,24; add.u64 %r177,%r177,8; setp.ne.u64 %r290,%r171,%r179; @ %r290 bra $L18; bra $L14; $L16: .loc 1 107 4 st.u64 [%r201+48],%r134; mov.u64 %r292,-1; st.u64 [%r201+56],%r292; mov.u64 %r293,1; st.u64 [%r201+40],%r293; .loc 1 108 4 bra $L1; $L52: .loc 1 111 24 mov.u64 %r295,1295; call (%value_in),_gfortrani_xmallocarray296,[%value_in]; } .loc 1 111 22 st.u64 [%r201],%r296; $L25: .loc 1 137 3 setp.gt.s32 %r298,%r25,0; @ ! 
%r298 bra $L26; add.u64 %r375,%frame,240; bra $L20; $L26: .loc 1 145 8 ld.u64 %r140,[%r202]; .loc 1 147 21 cvt.u32.u64 %r76,%r26; .loc 1 147 50 add.u32 %r299,%r76,-4; and.b32 %r300,%r299,-5; set.u32.eq.u32 %r302,%r300,0; neg.s32 %r303,%r302; .loc 1 147 21 add.u32 %r304,%r76,-1; set.u32.le.u32 %r306,%r304,1; neg.s32 %r307,%r306; .loc 1 147 6 cvt.u16.u32 %r309,%r303; cvt.u16.u32 %r310,%r307; orbra $L53; $L20: add.u64 %r189,%frame,360; add.u64 %r188,%r201,40; mov.u64 %r186,%frame; .loc 1 123 10 mov.u64 %r142,0; .loc 1 139 16 mov.u64 %r332,%r142; bra $L23; $L51: .loc 1 115 19 ld.s8 %r65,[%r201+28]; .loc 1 115 10 setp.eq.u64 %r314,%r65,%r127; @ %r314 bra $L24; .loc 1 116 2 st.u64 [%stack+8],%r127; st.u64 [%stack],%r65; cvta.const.u64 %r315315_gfortran_runtime_error24: .loc 1 121 11 cvta.global.u64 %r317,_gfortrani_compile_options; .loc 1 121 10 ld.u32 %r318,[%r317+36]; setp.eq.u32 %r319,%r318,0; @ %r319 bra $L25; .loc 1 123 4 setp.le.s64 %r320,%r127,0; @ %r320 bra $L26; add.u64 %r170,%r201,48; add.u64 %r375,%frame,240; mov.u64 %r168,%r375; add.u64 %r162,%r201,56; cvt.u32.u32 %r322,%r23; cvt.s64.s8 %r321,%r322; add.u64 %r149,%r321,-1; .loc 1 123 10 mov.u64 %r114,0; $L28: .loc 1 127 21 ld.u64 %r324,[%r162]; add.u64 %r323,%r324,1; .loc 1 127 19 ld.u64 %r325,[%r170]; sub.u64 %r133,%r323,%r325; .loc 1 128 18 ld.u64 %r71,[%r168]; .loc 1 128 11 setp.eq.u64 %r326,%r71,%r133; @ %r326 bra $L27; .loc 1 129 3 st.u64 [%stack+16],%r71; st.u64 [%stack+8],%r133; cvt.u32.u64 %r329,%r114; add.u32 %r328,%r329,1; st.u32 [%stack],%r328; cvta.const.u64 %r327,$LC1stack; call _gfortran_runtime_error27: .loc 1 123 25 add.u64 %r114,%r114,1; .loc 1 123 4 add.u64 %r170,%r170,24; add.u64 %r168,%r168,8; add.u64 %r162,%r162,24; setp.ne.u64 %r331,%r114,%r149; @ %r331 bra $L28; bra $L20; $L23: .loc 1 139 16 st.u64 [%r189],%r332; .loc 1 140 18 ld.u64 %r333,[%r188]; st.u64 [%r186],%r333; .loc 1 141 17 shl.b64 %r335,%r142,3; add.u64 %r336,%r375,%r335; .loc 1 141 10 ld.u64 %r337,[%r336]; setp.le.s64 %r338,%r337,0; @ %r338 bra $L1; .loc 1 137 26 add.u64 %r142,%r142,1; .loc 1 137 3 add.u64 %r189,%r189,8; add.u64 %r188,%r188,24; add.u64 %r186,%r186,8; setp.gt.s64 %r339,%r127,%r142; @ %r339 bra $L23; bra $L26; $L53: .loc 1 149 7 setp.eq.u32 %r341,%r76,16; @ %r341 bra $L21; .loc 1 157 5 cvta.const.u64 %r343,$LC2; mov.u64 %r342,3343; call _gfortrani_internal_error61 9 add.u64 %r344,%r27,1; .loc 1 61 7 sub.u64 %r129,%r344,%r29; max.s64 %r131,%r129,0; .loc 1 65 11 mul.lo.u64 %r33,%r30,%r26; .loc 1 159 8 ld.u64 %r141,[%r201]; ld.u64 %r166,[%frame+360]; .loc 1 189 22 ld.u64 %r86,[%frame+120]; .loc 1 190 22 ld.u64 %r88,[%frame]; .loc 1 190 12 add.u64 %r89,%r88,%r88; .loc 1 192 32 ld.u64 %r164,[%frame+240]; .loc 1 199 30 mul.lo.u64 %r346,%r164,%r86; .loc 1 199 16 neg.s64 %r150,%r346; .loc 1 200 30 mul.lo.u64 %r347,%r164,%r88; .loc 1 200 16 add.u64 %r348,%r347,%r347; neg.s64 %r148,%r348; setp.gt.s64 %r377,%r129,0; setp.le.s32 %r374,%r25,1; cvt.u32.u32 %r379,%r23; cvt.s64.s8 %r380,%r379; add.u64 %r381,%r380,-1; add.u64 %r382,%frame,120; add.u64 %r383,%frame,240; .loc 1 196 20 mov.u64 %r384,0; $L36: .loc 1 170 12 @ %r377 bra $L37; .loc 1 171 10 mov.u16 %r350,0; st.u16 [%r141],%r350; bra $L30; $L37: mov.u64 %r136,%r140; .loc 1 174 13 mov.u64 %r135,0; $L29: .loc 1 178 6 ld.s8 %r352,[%r136]; cvt.u16.u32 %r351,%r352; setp.ne.u16 %r353,%r351,0; @ %r353 bra $L38; .loc 1 174 28 add.u64 %r135,%r135,1; .loc 1 174 36 add.u64 %r136,%r136,%r33; .loc 1 174 6 setp.gt.s64 %r354,%r131,%r135; @ %r354 bra $L29; .loc 1 169 10 mov.u32 %r116,0; bra $L31; $L38: .loc 
1 180 14 mov.u32 %r116,1; $L31: .loc 1 184 12 cvt.u16.u32 %r355,%r116; st.u16 [%r141],%r355; $L30: .loc 1 188 15 add.u64 %r166,%r166,1; .loc 1 189 12 add.u64 %r140,%r140,%r86; .loc 1 190 12 add.u64 %r141,%r141,%r89; .loc 1 192 13 setp.ne.u64 %r356,%r166,%r164; @ %r356 bra $L36; .loc 1 199 16 add.u64 %r137,%r140,%r150; .loc 1 200 16 add.u64 %r138,%r141,%r148; .loc 1 202 14 @ %r374 bra $L1; add.u64 %r32,%frame,368; mov.u64 %r165,8; .loc 1 201 12 mov.u64 %r139,1; bra $L34; $L35: .loc 1 196 20 st.u64 [%r32],%r384; .loc 1 199 30 mul.lo.u64 %r362,%r101,%r100; .loc 1 199 16 sub.u64 %r137,%r140,%r362; .loc 1 200 30 mul.lo.u64 %r363,%r103,%r100; .loc 1 200 16 add.u64 %r364,%r363,%r363; sub.u64 %r138,%r141,%r364; .loc 1 201 12 add.u64 %r139,%r139,1; .loc 1 202 14 add.u64 %r32,%r32,8; add.u64 %r165,%r165,8; setp.eq.u64 %r365,%r139,%r381; @ %r365 bra $L1; $L34: .loc 1 210 23 ld.u64 %r366,[%r32]; add.u64 %r100,%r366,1; st.u64 [%r32],%r100; .loc 1 211 30 add.u64 %r368,%r382,%r165; ld.u64 %r101,[%r368]; .loc 1 211 20 add.u64 %r140,%r137,%r101; .loc 1 212 30 add.u64 %r369,%frame,%r165; ld.u64 %r103,[%r369]; .loc 1 212 20 add.u64 %r370,%r103,%r103; add.u64 %r141,%r138,%r370; .loc 1 192 32 add.u64 %r372,%r383,%r165; ld.u64 %r107,[%r372]; .loc 1 192 13 setp.eq.u64 %r373,%r100,%r107; @ %r373 bra $L35; .loc 1 196 20 mov.u64 %r166,0; bra $L36; $L1: .loc 1 216 1 ret; } any_l4.o/_gfortran_any_l4 .visible .func _gfortran_any_l4fortran/generated/any_l4.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,32,105,110,32,65,78,89,32,105,110,116,114,105,110,115,105,99,0 }114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,65,78,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,10810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,65,78,89,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_any_l4 .visible .func _gfortran_any_l4480u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .u64 %r232; .reg .u64 %r233; .reg .pred %r234; .reg .pred %r236; .reg .u64u64u64 %r257; .reg .u64 %r258; .reg .predu64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64pred %r320; .reg .u64u64 %r332; .reg .u64 %r333; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .pred %r338; .reg .pred %r339; .reg .pred %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64 %r345; .reg .u64 %r346; .reg .u64 %r347; .reg .u32 %r349; .reg .u16 %r350; .reg .u32 %r351; .reg .pred %r352; .reg .pred %r353; .reg .pred %r354; .reg .u64pred %r363; .reg .u64u64 %r385; mov.u64 %r201,%ar0; mov.u64 %r202,%ar1; mov.u64 %r203,%ar2; .loc 1 56 10 ld.u64 %r22,[%r203]; .loc 1 56 7 add.u64 %r126,%r22,-1; .loc 1 57 10 ld.s8 %r23,[%r202+28]; .loc 1 57 38 add.u32 %r25,%r23,-1; .loc 1 57 8 cvt.s64.s32 %r127,%r25; .loc 1 59 14 ld.u64 %r26,[%r202+16]; .loc 1 61 9 add.u64 %r205,%r126,%r126; add.u64 %r206,%r205,%r126; shl.b64 %r207,%r206,3; add.u64 %r208,%r202,%r207; ld.u64 %r27,[%r208+56]; 
ld.u64 %r29,[%r208+48]; .loc 1 65 11 ld.u64 %r30,[%r208+40]; .loc 1 67 3 setp.gt.s64 %r222,%r126,0; @ %r222 bra $L2; $L8: .loc 1 75 3 setp.lt.s64 %r223,%r126,%r127; @ %r223 bra $L3; bra $L4; $L2: add.u64 %r151,%r202,40; add.u64 %r52,%frame,120; add.u64 %r109,%frame,240; add.u64 %r225,%r22,%r22; add.u64 %r226,%r225,%r22; shl.b64 %r227,%r226,3; add.u64 %r228,%r202,16; add.u64 %r160,%r227,%r228; .loc 1 73 12 mov.u64 %r385,0; $L7: .loc 1 69 20 ld.u64 %r230,[%r151]; mul.lo.u64 %r229,%r230,%r26; .loc 1 69 18 st.u64 [%r52],%r229; .loc 1 70 19 ld.u64 %r232,[%r151+16]; add.u64 %r231,%r232,1; ld.u64 %r233,[%r151+8]; sub.u64 %r41,%r231,%r233; .loc 1 72 10 setp.lt.s64 %r234,%r41,0; @ %r234 bra $L5; .loc 1 70 17 st.u64 [%r109],%r41; bra $L6; $L5: .loc 1 73 12 st.u64 [%r109],%r385; $L6: .loc 1 67 3 add.u64 %r151,%r151,24; add.u64 %r52,%r52,8; add.u64 %r109,%r109,8; setp.ne.u64 %r236,%r151,%r160; @ %r236 bra $L7; bra $L8; $L4: .loc 1 84 6 ld.u64 %r237,[%r201]; setp.eq.u64 %r238,%r237,0; @ ! %r238 bra $L51; bra $L9; $L3: add.u64 %r240,%r22,%r22; add.u64 %r241,%r240,%r22; shl.b64 %r242,%r241,3; add.u64 %r243,%r242,40; add.u64 %r145,%r202,%r243; shl.b64 %r244,%r22,3; add.u64 %r112,%r244,-8; add.u64 %r375,%frame,120; add.u64 %r117,%r375,%r112; add.u64 %r374,%frame,240; add.u64 %r110,%r374,%r112; add.u64 %r247,%r202,40; cvt.u32.u32 %r249,%r23; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r198,%r247,%r253; .loc 1 81 12 mov.u64 %r384,0; $L13: .loc 1 77 20 ld.u64 %r255,[%r145]; mul.lo.u64 %r254,%r255,%r26; .loc 1 77 18 st.u64 [%r117],%r254; .loc 1 78 19 ld.u64 %r257,[%r145+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r145+8]; sub.u64 %r49,%r256,%r258; .loc 1 80 10 setp.lt.s64 %r259,%r49,0; @ %r259 bra $L11; .loc 1 78 17 st.u64 [%r110],%r49; bra $L12; $L11: .loc 1 81 12 st.u64 [%r110],%r384; $L12: .loc 1 75 3 add.u64 %r145,%r145,24; add.u64 %r117,%r117,8; add.u64 %r110,%r110,8; setp.ne.u64 %r261,%r145,%r198; @ %r261 bra $L13; bra $L4; $L9: .loc 1 88 7 setp.le.s32 %r262,%r25,0; @ %r262 bra $L14; add.u64 %r179,%r201,40; add.u64 %r177,%frame,240; add.u64 %r263,%r201,16; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r171,%r263,%r269; .loc 1 91 17 mov.u64 %r115,1; bra $L15; $L14: .loc 1 99 24 mov.u64 %r270,0; st.u64 [%r201+8],%r270; .loc 1 100 28 cvt.u16.u32 %r273,%r23; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r201+28],%r274; .loc 1 102 20 add.u32 %r275,%r23,-2; cvt.s64.s32 %r58,%r275; add.u64 %r277,%r58,%r58; add.u64 %r278,%r277,%r58; shl.b64 %r279,%r278,3; add.u64 %r280,%r201,%r279; .loc 1 102 67 shl.b64 %r282,%r58,3; add.u64 %r283,%frame,%r282; .loc 1 102 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r134,%r285,%r286; .loc 1 104 10 setp.eq.u64 %r287,%r134,0; @ %r287 bra $L16; bra $L52; $L18: .loc 1 93 55 mul.lo.u64 %r115,%r54,%r115; $L15: .loc 1 95 4 st.u64 [%r179+8],%r237; ld.u64 %r54,[%r177]; add.u64 %r289,%r54,-1; st.u64 [%r179+16],%r289; st.u64 [%r179],%r115; .loc 1 88 7 add.u64 %r179,%r179,24; add.u64 %r177,%r177,8; setp.ne.u64 %r290,%r171,%r179; @ %r290 bra $L18; bra $L14; $L16: .loc 1 107 4 st.u64 [%r201+48],%r134; mov.u64 %r292,-1; st.u64 [%r201+56],%r292; mov.u64 %r293,1; st.u64 [%r201+40],%r293; .loc 1 108 4 bra $L1; $L52: .loc 1 111 24295; call (%value_in),_gfortrani_xmallocarray296,[%value_in]; } .loc 1 111 22 st.u64 [%r201],%r296; $L25: .loc 1 137 3 setp.gt.s32 %r298,%r25,0; @ ! 
%r298 bra $L26; add.u64 %r374,%frame,240; bra $L20; $L26: .loc 1 145 8 ld.u64 %r140,[%r202]; .loc 1 147 21 cvt.u32.u64 %r76,%r26; .loc 1 147 50 add.u32 %r299,%r76,-4; and.b32 %r300,%r299,-5; set.u32.eq.u32 %r302,%r300,0; neg.s32 %r303,%r302; .loc 1 147 21 add.u32 %r304,%r76,-1; set.u32.le.u32 %r306,%r304,1; neg.s32 %r307,%r306; .loc 1 147 6 cvt.u16.u32 %r309,%r303; cvt.u16.u32 %r310,%r307; orbra $L53; $L20: add.u64 %r189,%frame,360; add.u64 %r188,%r201,40; mov.u64 %r186,%frame; .loc 1 123 10 mov.u64 %r142,0; .loc 1 139 16 mov.u64 %r332,%r142; bra $L23; $L51: .loc 1 115 19 ld.s8 %r65,[%r201+28]; .loc 1 115 10 setp.eq.u64 %r314,%r65,%r127; @ %r314 bra $L24; .loc 1 116 2 st.u64 [%stack+8],%r127; st.u64 [%stack],%r65; cvta.const.u64 %r315315_gfortran_runtime_error24: .loc 1 121 11 cvta.global.u64 %r317,_gfortrani_compile_options; .loc 1 121 10 ld.u32 %r318,[%r317+36]; setp.eq.u32 %r319,%r318,0; @ %r319 bra $L25; .loc 1 123 4 setp.le.s64 %r320,%r127,0; @ %r320 bra $L26; add.u64 %r170,%r201,48; add.u64 %r374,%frame,240; mov.u64 %r168,%r374; add.u64 %r162,%r201,56; cvt.u32.u32 %r322,%r23; cvt.s64.s8 %r321,%r322; add.u64 %r149,%r321,-1; .loc 1 123 10 mov.u64 %r114,0; $L28: .loc 1 127 21 ld.u64 %r324,[%r162]; add.u64 %r323,%r324,1; .loc 1 127 19 ld.u64 %r325,[%r170]; sub.u64 %r133,%r323,%r325; .loc 1 128 18 ld.u64 %r71,[%r168]; .loc 1 128 11 setp.eq.u64 %r326,%r71,%r133; @ %r326 bra $L27; .loc 1 129 3 st.u64 [%stack+16],%r71; st.u64 [%stack+8],%r133; cvt.u32.u64 %r329,%r114; add.u32 %r328,%r329,1; st.u32 [%stack],%r328; cvta.const.u64 %r327,$LC1stack; call _gfortran_runtime_error27: .loc 1 123 25 add.u64 %r114,%r114,1; .loc 1 123 4 add.u64 %r170,%r170,24; add.u64 %r168,%r168,8; add.u64 %r162,%r162,24; setp.ne.u64 %r331,%r114,%r149; @ %r331 bra $L28; bra $L20; $L23: .loc 1 139 16 st.u64 [%r189],%r332; .loc 1 140 18 ld.u64 %r333,[%r188]; st.u64 [%r186],%r333; .loc 1 141 17 shl.b64 %r335,%r142,3; add.u64 %r336,%r374,%r335; .loc 1 141 10 ld.u64 %r337,[%r336]; setp.le.s64 %r338,%r337,0; @ %r338 bra $L1; .loc 1 137 26 add.u64 %r142,%r142,1; .loc 1 137 3 add.u64 %r189,%r189,8; add.u64 %r188,%r188,24; add.u64 %r186,%r186,8; setp.gt.s64 %r339,%r127,%r142; @ %r339 bra $L23; bra $L26; $L53: .loc 1 149 7 setp.eq.u32 %r341,%r76,16; @ %r341 bra $L21; .loc 1 157 5 cvta.const.u64 %r343,$LC2; mov.u64 %r342,3343; call _gfortrani_internal_error61 9 add.u64 %r344,%r27,1; .loc 1 61 7 sub.u64 %r129,%r344,%r29; max.s64 %r131,%r129,0; .loc 1 65 11 mul.lo.u64 %r33,%r30,%r26; .loc 1 159 8 ld.u64 %r141,[%r201]; ld.u64 %r166,[%frame+360]; .loc 1 189 22 ld.u64 %r86,[%frame+120]; .loc 1 190 22 ld.u64 %r88,[%frame]; .loc 1 190 12 shl.b64 %r89,%r88,2; .loc 1 192 32 ld.u64 %r164,[%frame+240]; .loc 1 199 30 mul.lo.u64 %r345,%r164,%r86; .loc 1 199 16 neg.s64 %r150,%r345; .loc 1 200 30 mul.lo.u64 %r346,%r164,%r88; .loc 1 200 16 shl.b64 %r347,%r346,2; neg.s64 %r148,%r347; setp.gt.s64 %r372,%r129,0; setp.le.s32 %r373,%r25,1; cvt.u32.u32 %r377,%r23; cvt.s64.s8 %r378,%r377; add.u64 %r379,%r378,-1; add.u64 %r380,%frame,120; add.u64 %r381,%frame,240; .loc 1 196 20 mov.u64 %r382,0; $L36: .loc 1 170 12 @ %r372 bra $L37; .loc 1 171 10 mov.u32 %r349,0; st.u32 [%r141],%r349; bra $L30; $L37: mov.u64 %r136,%r140; .loc 1 174 13 mov.u64 %r135,0; $L29: .loc 1 178 6 ld.s8 %r351,[%r136]; cvt.u16.u32 %r350,%r351; setp.ne.u16 %r352,%r350,0; @ %r352 bra $L38; .loc 1 174 28 add.u64 %r135,%r135,1; .loc 1 174 36 add.u64 %r136,%r136,%r33; .loc 1 174 6 setp.gt.s64 %r353,%r131,%r135; @ %r353 bra $L29; .loc 1 169 10 mov.u32 %r116,0; bra $L31; $L38: .loc 1 180 
14 mov.u32 %r116,1; $L31: .loc 1 184 12 st.u32 [%r141],%r116; $L30: .loc 1 188 15 add.u64 %r166,%r166,1; .loc 1 189 12 add.u64 %r140,%r140,%r86; .loc 1 190 12 add.u64 %r141,%r141,%r89; .loc 1 192 13 setp.ne.u64 %r354,%r166,%r164; @ %r354 bra $L36; .loc 1 199 16 add.u64 %r137,%r140,%r150; .loc 1 200 16 add.u64 %r138,%r141,%r148; .loc 1 202 14 @ %r373 bra $L1; add.u64 %r32,%frame,368; mov.u64 %r165,8; .loc 1 201 12 mov.u64 %r139,1; bra $L34; $L35: .loc 1 196 20 st.u64 [%r32],%r382; .loc 1 199 30 mul.lo.u64 %r360,%r101,%r100; .loc 1 199 16 sub.u64 %r137,%r140,%r360; .loc 1 200 30 mul.lo.u64 %r361,%r103,%r100; .loc 1 200 16 shl.b64 %r362,%r361,2; sub.u64 %r138,%r141,%r362; .loc 1 201 12 add.u64 %r139,%r139,1; .loc 1 202 14 add.u64 %r32,%r32,8; add.u64 %r165,%r165,8; setp.eq.u64 %r363,%r139,%r379; @ %r363 bra $L1; $L34: .loc 1 210 23 ld.u64 %r364,[%r32]; add.u64 %r100,%r364,1; st.u64 [%r32],%r100; .loc 1 211 30 add.u64 %r366,%r380,%r165; ld.u64 %r101,[%r366]; .loc 1 211 20 add.u64 %r140,%r137,%r101; .loc 1 212 30 add.u64 %r367,%frame,%r165; ld.u64 %r103,[%r367]; .loc 1 212 20 shl.b64 %r368,%r103,2; add.u64 %r141,%r138,%r368; .loc 1 192 32 add.u64 %r370,%r381,%r165; ld.u64 %r107,[%r370]; .loc 1 192 13 setp.eq.u64 %r371,%r100,%r107; @ %r371 bra $L35; .loc 1 196 20 mov.u64 %r166,0; bra $L36; $L1: .loc 1 216 1 ret; } any_l8.o/_gfortran_any_l8 .visible .func _gfortran_any_l8fortran/generated/any_l8.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,32,105,110,32,65,78,89,32,105,110,116,114,105,110,115,105,99,0 }114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,65,78,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,10810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,65,78,89,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_any_l8 .visible .func _gfortran_any_l8480u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .u64 %r232; .reg .u64 %r233; .reg .pred %r234; .reg .pred %r236; .reg .u64u64u64 %r257; .reg .u64 %r258; .reg .predu64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64pred %r320; .reg .u64u64 %r332; .reg .u64 %r333; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .pred %r338; .reg .pred %r339; .reg .pred %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64 %r345; .reg .u64 %r346; .reg .u64 %r347; .reg .u16 %r350; .reg .u32 %r351; .reg .pred %r352; .reg .pred %r353; .reg .pred %r354; .reg .u64pred %r363; .reg .u64u64 %r385; mov.u64 %r201,%ar0; mov.u64 %r202,%ar1; mov.u64 %r203,%ar2; .loc 1 56 10 ld.u64 %r22,[%r203]; .loc 1 56 7 add.u64 %r126,%r22,-1; .loc 1 57 10 ld.s8 %r23,[%r202+28]; .loc 1 57 38 add.u32 %r25,%r23,-1; .loc 1 57 8 cvt.s64.s32 %r127,%r25; .loc 1 59 14 ld.u64 %r26,[%r202+16]; .loc 1 61 9 add.u64 %r205,%r126,%r126; add.u64 %r206,%r205,%r126; shl.b64 %r207,%r206,3; add.u64 %r208,%r202,%r207; ld.u64 %r27,[%r208+56]; ld.u64 %r29,[%r208+48]; .loc 1 65 11 ld.u64 %r30,[%r208+40]; 
.loc 1 67 3 setp.gt.s64 %r222,%r126,0; @ %r222 bra $L2; $L8: .loc 1 75 3 setp.lt.s64 %r223,%r126,%r127; @ %r223 bra $L3; bra $L4; $L2: add.u64 %r151,%r202,40; add.u64 %r52,%frame,120; add.u64 %r109,%frame,240; add.u64 %r225,%r22,%r22; add.u64 %r226,%r225,%r22; shl.b64 %r227,%r226,3; add.u64 %r228,%r202,16; add.u64 %r160,%r227,%r228; .loc 1 73 12 mov.u64 %r385,0; $L7: .loc 1 69 20 ld.u64 %r230,[%r151]; mul.lo.u64 %r229,%r230,%r26; .loc 1 69 18 st.u64 [%r52],%r229; .loc 1 70 19 ld.u64 %r232,[%r151+16]; add.u64 %r231,%r232,1; ld.u64 %r233,[%r151+8]; sub.u64 %r41,%r231,%r233; .loc 1 72 10 setp.lt.s64 %r234,%r41,0; @ %r234 bra $L5; .loc 1 70 17 st.u64 [%r109],%r41; bra $L6; $L5: .loc 1 73 12 st.u64 [%r109],%r385; $L6: .loc 1 67 3 add.u64 %r151,%r151,24; add.u64 %r52,%r52,8; add.u64 %r109,%r109,8; setp.ne.u64 %r236,%r151,%r160; @ %r236 bra $L7; bra $L8; $L4: .loc 1 84 6 ld.u64 %r237,[%r201]; setp.eq.u64 %r238,%r237,0; @ ! %r238 bra $L51; bra $L9; $L3: add.u64 %r240,%r22,%r22; add.u64 %r241,%r240,%r22; shl.b64 %r242,%r241,3; add.u64 %r243,%r242,40; add.u64 %r145,%r202,%r243; shl.b64 %r244,%r22,3; add.u64 %r112,%r244,-8; add.u64 %r375,%frame,120; add.u64 %r117,%r375,%r112; add.u64 %r374,%frame,240; add.u64 %r110,%r374,%r112; add.u64 %r247,%r202,40; cvt.u32.u32 %r249,%r23; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r198,%r247,%r253; .loc 1 81 12 mov.u64 %r384,0; $L13: .loc 1 77 20 ld.u64 %r255,[%r145]; mul.lo.u64 %r254,%r255,%r26; .loc 1 77 18 st.u64 [%r117],%r254; .loc 1 78 19 ld.u64 %r257,[%r145+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r145+8]; sub.u64 %r49,%r256,%r258; .loc 1 80 10 setp.lt.s64 %r259,%r49,0; @ %r259 bra $L11; .loc 1 78 17 st.u64 [%r110],%r49; bra $L12; $L11: .loc 1 81 12 st.u64 [%r110],%r384; $L12: .loc 1 75 3 add.u64 %r145,%r145,24; add.u64 %r117,%r117,8; add.u64 %r110,%r110,8; setp.ne.u64 %r261,%r145,%r198; @ %r261 bra $L13; bra $L4; $L9: .loc 1 88 7 setp.le.s32 %r262,%r25,0; @ %r262 bra $L14; add.u64 %r179,%r201,40; add.u64 %r177,%frame,240; add.u64 %r263,%r201,16; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r171,%r263,%r269; .loc 1 91 17 mov.u64 %r115,1; bra $L15; $L14: .loc 1 99 24 mov.u64 %r270,0; st.u64 [%r201+8],%r270; .loc 1 100 28 cvt.u16.u32 %r273,%r23; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r201+28],%r274; .loc 1 102 20 add.u32 %r275,%r23,-2; cvt.s64.s32 %r58,%r275; add.u64 %r277,%r58,%r58; add.u64 %r278,%r277,%r58; shl.b64 %r279,%r278,3; add.u64 %r280,%r201,%r279; .loc 1 102 67 shl.b64 %r282,%r58,3; add.u64 %r283,%frame,%r282; .loc 1 102 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r134,%r285,%r286; .loc 1 104 10 setp.eq.u64 %r287,%r134,0; @ %r287 bra $L16; bra $L52; $L18: .loc 1 93 55 mul.lo.u64 %r115,%r54,%r115; $L15: .loc 1 95 4 st.u64 [%r179+8],%r237; ld.u64 %r54,[%r177]; add.u64 %r289,%r54,-1; st.u64 [%r179+16],%r289; st.u64 [%r179],%r115; .loc 1 88 7 add.u64 %r179,%r179,24; add.u64 %r177,%r177,8; setp.ne.u64 %r290,%r171,%r179; @ %r290 bra $L18; bra $L14; $L16: .loc 1 107 4 st.u64 [%r201+48],%r134; mov.u64 %r292,-1; st.u64 [%r201+56],%r292; mov.u64 %r293,1; st.u64 [%r201+40],%r293; .loc 1 108 4 bra $L1; $L52: .loc 1 111 24 mov.u64 %r2951295; call (%value_in),_gfortrani_xmallocarray296,[%value_in]; } .loc 1 111 22 st.u64 [%r201],%r296; $L25: .loc 1 137 3 setp.gt.s32 %r298,%r25,0; @ ! 
%r298 bra $L26; add.u64 %r374,%frame,240; bra $L20; $L26: .loc 1 145 8 ld.u64 %r140,[%r202]; .loc 1 147 21 cvt.u32.u64 %r76,%r26; .loc 1 147 50 add.u32 %r299,%r76,-4; and.b32 %r300,%r299,-5; set.u32.eq.u32 %r302,%r300,0; neg.s32 %r303,%r302; .loc 1 147 21 add.u32 %r304,%r76,-1; set.u32.le.u32 %r306,%r304,1; neg.s32 %r307,%r306; .loc 1 147 6 cvt.u16.u32 %r309,%r303; cvt.u16.u32 %r310,%r307; orbra $L53; $L20: add.u64 %r189,%frame,360; add.u64 %r188,%r201,40; mov.u64 %r186,%frame; .loc 1 123 10 mov.u64 %r142,0; .loc 1 139 16 mov.u64 %r332,%r142; bra $L23; $L51: .loc 1 115 19 ld.s8 %r65,[%r201+28]; .loc 1 115 10 setp.eq.u64 %r314,%r65,%r127; @ %r314 bra $L24; .loc 1 116 2 st.u64 [%stack+8],%r127; st.u64 [%stack],%r65; cvta.const.u64 %r315315_gfortran_runtime_error24: .loc 1 121 11 cvta.global.u64 %r317,_gfortrani_compile_options; .loc 1 121 10 ld.u32 %r318,[%r317+36]; setp.eq.u32 %r319,%r318,0; @ %r319 bra $L25; .loc 1 123 4 setp.le.s64 %r320,%r127,0; @ %r320 bra $L26; add.u64 %r170,%r201,48; add.u64 %r374,%frame,240; mov.u64 %r168,%r374; add.u64 %r162,%r201,56; cvt.u32.u32 %r322,%r23; cvt.s64.s8 %r321,%r322; add.u64 %r149,%r321,-1; .loc 1 123 10 mov.u64 %r114,0; $L28: .loc 1 127 21 ld.u64 %r324,[%r162]; add.u64 %r323,%r324,1; .loc 1 127 19 ld.u64 %r325,[%r170]; sub.u64 %r133,%r323,%r325; .loc 1 128 18 ld.u64 %r71,[%r168]; .loc 1 128 11 setp.eq.u64 %r326,%r71,%r133; @ %r326 bra $L27; .loc 1 129 3 st.u64 [%stack+16],%r71; st.u64 [%stack+8],%r133; cvt.u32.u64 %r329,%r114; add.u32 %r328,%r329,1; st.u32 [%stack],%r328; cvta.const.u64 %r327,$LC1stack; call _gfortran_runtime_error27: .loc 1 123 25 add.u64 %r114,%r114,1; .loc 1 123 4 add.u64 %r170,%r170,24; add.u64 %r168,%r168,8; add.u64 %r162,%r162,24; setp.ne.u64 %r331,%r114,%r149; @ %r331 bra $L28; bra $L20; $L23: .loc 1 139 16 st.u64 [%r189],%r332; .loc 1 140 18 ld.u64 %r333,[%r188]; st.u64 [%r186],%r333; .loc 1 141 17 shl.b64 %r335,%r142,3; add.u64 %r336,%r374,%r335; .loc 1 141 10 ld.u64 %r337,[%r336]; setp.le.s64 %r338,%r337,0; @ %r338 bra $L1; .loc 1 137 26 add.u64 %r142,%r142,1; .loc 1 137 3 add.u64 %r189,%r189,8; add.u64 %r188,%r188,24; add.u64 %r186,%r186,8; setp.gt.s64 %r339,%r127,%r142; @ %r339 bra $L23; bra $L26; $L53: .loc 1 149 7 setp.eq.u32 %r341,%r76,16; @ %r341 bra $L21; .loc 1 157 5 cvta.const.u64 %r343,$LC2; mov.u64 %r342,3343; call _gfortrani_internal_error61 9 add.u64 %r344,%r27,1; .loc 1 61 7 sub.u64 %r129,%r344,%r29; max.s64 %r131,%r129,0; .loc 1 65 11 mul.lo.u64 %r33,%r30,%r26; .loc 1 159 8 ld.u64 %r141,[%r201]; ld.u64 %r166,[%frame+360]; .loc 1 189 22 ld.u64 %r86,[%frame+120]; .loc 1 190 22 ld.u64 %r88,[%frame]; .loc 1 190 12 shl.b64 %r89,%r88,3; .loc 1 192 32 ld.u64 %r164,[%frame+240]; .loc 1 199 30 mul.lo.u64 %r345,%r164,%r86; .loc 1 199 16 neg.s64 %r150,%r345; .loc 1 200 30 mul.lo.u64 %r346,%r164,%r88; .loc 1 200 16 shl.b64 %r347,%r346,3; neg.s64 %r148,%r347; setp.gt.s64 %r372,%r129,0; setp.le.s32 %r373,%r25,1; .loc 1 169 10 mov.u64 %r377,0; cvt.u32.u32 %r378,%r23; cvt.s64.s8 %r379,%r378; add.u64 %r380,%r379,-1; add.u64 %r381,%frame,120; add.u64 %r382,%frame,240; $L36: .loc 1 170 12 @ %r372 bra $L37; .loc 1 171 10 st.u64 [%r141],%r377; bra $L30; $L37: mov.u64 %r136,%r140; .loc 1 174 13 mov.u64 %r135,0; $L29: .loc 1 178 6 ld.s8 %r351,[%r136]; cvt.u16.u32 %r350,%r351; setp.ne.u16 %r352,%r350,0; @ %r352 bra $L38; .loc 1 174 28 add.u64 %r135,%r135,1; .loc 1 174 36 add.u64 %r136,%r136,%r33; .loc 1 174 6 setp.gt.s64 %r353,%r131,%r135; @ %r353 bra $L29; .loc 1 169 10 mov.u64 %r116,%r377; bra $L31; $L38: .loc 1 180 14 mov.u64 
%r116,1; $L31: .loc 1 184 12 st.u64 [%r141],%r116; $L30: .loc 1 188 15 add.u64 %r166,%r166,1; .loc 1 189 12 add.u64 %r140,%r140,%r86; .loc 1 190 12 add.u64 %r141,%r141,%r89; .loc 1 192 13 setp.ne.u64 %r354,%r166,%r164; @ %r354 bra $L36; .loc 1 199 16 add.u64 %r137,%r140,%r150; .loc 1 200 16 add.u64 %r138,%r141,%r148; .loc 1 202 14 @ %r373 bra $L1; add.u64 %r32,%frame,368; mov.u64 %r165,8; .loc 1 201 12 mov.u64 %r139,1; bra $L34; $L35: .loc 1 196 20 st.u64 [%r32],%r377; .loc 1 199 30 mul.lo.u64 %r360,%r101,%r100; .loc 1 199 16 sub.u64 %r137,%r140,%r360; .loc 1 200 30 mul.lo.u64 %r361,%r103,%r100; .loc 1 200 16 shl.b64 %r362,%r361,3; sub.u64 %r138,%r141,%r362; .loc 1 201 12 add.u64 %r139,%r139,1; .loc 1 202 14 add.u64 %r32,%r32,8; add.u64 %r165,%r165,8; setp.eq.u64 %r363,%r139,%r380; @ %r363 bra $L1; $L34: .loc 1 210 23 ld.u64 %r364,[%r32]; add.u64 %r100,%r364,1; st.u64 [%r32],%r100; .loc 1 211 30 add.u64 %r366,%r381,%r165; ld.u64 %r101,[%r366]; .loc 1 211 20 add.u64 %r140,%r137,%r101; .loc 1 212 30 add.u64 %r367,%frame,%r165; ld.u64 %r103,[%r367]; .loc 1 212 20 shl.b64 %r368,%r103,3; add.u64 %r141,%r138,%r368; .loc 1 192 32 add.u64 %r370,%r382,%r165; ld.u64 %r107,[%r370]; .loc 1 192 13 setp.eq.u64 %r371,%r100,%r107; @ %r371 bra $L35; .loc 1 196 20 mov.u64 %r166,0; bra $L36; $L1: .loc 1 216 1 ret; } any_l16.o/fortran/generated/any_l16.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,32,105,110,32,65,78,89,32,105,110,116,114,105,110,115,105,99,0 }114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,65,78,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,10810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,65,78,89,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_any_l16 .visible .func _gfortran_any_l16480u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .u64 %r232; .reg .u64 %r233; .reg .pred %r234; .reg .pred %r236; .reg .u64u64u64 %r257; .reg .u64 %r258; .reg .predu64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64pred %r320; .reg .u64u64 %r332; .reg .u64 %r333; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .pred %r338; .reg .pred %r339; .reg .pred %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64 %r345; .reg .u64 %r346; .reg .u64 %r347; .reg .u16 %r351; .reg .u32 %r352; .reg .predpredmov.u64 %r201,%ar0; mov.u64 %r202,%ar1; mov.u64 %r203,%ar2; .loc 1 56 10 ld.u64 %r22,[%r203]; .loc 1 56 7 add.u64 %r126,%r22,-1; .loc 1 57 10 ld.s8 %r23,[%r202+28]; .loc 1 57 38 add.u32 %r25,%r23,-1; .loc 1 57 8 cvt.s64.s32 %r127,%r25; .loc 1 59 14 ld.u64 %r26,[%r202+16]; .loc 1 61 9 add.u64 %r205,%r126,%r126; add.u64 %r206,%r205,%r126; shl.b64 %r207,%r206,3; add.u64 %r208,%r202,%r207; ld.u64 %r27,[%r208+56]; ld.u64 %r29,[%r208+48]; .loc 1 65 11 ld.u64 %r30,[%r208+40]; .loc 1 67 3 setp.gt.s64 %r222,%r126,0; @ %r222 bra $L2; $L8: .loc 1 75 3 setp.lt.s64 %r223,%r126,%r127; @ %r223 bra $L3; bra $L4; $L2: 
add.u64 %r151,%r202,40; add.u64 %r52,%frame,120; add.u64 %r109,%frame,240; add.u64 %r225,%r22,%r22; add.u64 %r226,%r225,%r22; shl.b64 %r227,%r226,3; add.u64 %r228,%r202,16; add.u64 %r160,%r227,%r228; .loc 1 73 12 mov.u64 %r390,0; $L7: .loc 1 69 20 ld.u64 %r230,[%r151]; mul.lo.u64 %r229,%r230,%r26; .loc 1 69 18 st.u64 [%r52],%r229; .loc 1 70 19 ld.u64 %r232,[%r151+16]; add.u64 %r231,%r232,1; ld.u64 %r233,[%r151+8]; sub.u64 %r41,%r231,%r233; .loc 1 72 10 setp.lt.s64 %r234,%r41,0; @ %r234 bra $L5; .loc 1 70 17 st.u64 [%r109],%r41; bra $L6; $L5: .loc 1 73 12 st.u64 [%r109],%r390; $L6: .loc 1 67 3 add.u64 %r151,%r151,24; add.u64 %r52,%r52,8; add.u64 %r109,%r109,8; setp.ne.u64 %r236,%r151,%r160; @ %r236 bra $L7; bra $L8; $L4: .loc 1 84 6 ld.u64 %r237,[%r201]; setp.eq.u64 %r238,%r237,0; @ ! %r238 bra $L51; bra $L9; $L3: add.u64 %r240,%r22,%r22; add.u64 %r241,%r240,%r22; shl.b64 %r242,%r241,3; add.u64 %r243,%r242,40; add.u64 %r145,%r202,%r243; shl.b64 %r244,%r22,3; add.u64 %r112,%r244,-8; add.u64 %r379,%frame,120; add.u64 %r117,%r379,%r112; add.u64 %r378,%frame,240; add.u64 %r110,%r378,%r112; add.u64 %r247,%r202,40; cvt.u32.u32 %r249,%r23; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r198,%r247,%r253; .loc 1 81 12 mov.u64 %r389,0; $L13: .loc 1 77 20 ld.u64 %r255,[%r145]; mul.lo.u64 %r254,%r255,%r26; .loc 1 77 18 st.u64 [%r117],%r254; .loc 1 78 19 ld.u64 %r257,[%r145+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r145+8]; sub.u64 %r49,%r256,%r258; .loc 1 80 10 setp.lt.s64 %r259,%r49,0; @ %r259 bra $L11; .loc 1 78 17 st.u64 [%r110],%r49; bra $L12; $L11: .loc 1 81 12 st.u64 [%r110],%r389; $L12: .loc 1 75 3 add.u64 %r145,%r145,24; add.u64 %r117,%r117,8; add.u64 %r110,%r110,8; setp.ne.u64 %r261,%r145,%r198; @ %r261 bra $L13; bra $L4; $L9: .loc 1 88 7 setp.le.s32 %r262,%r25,0; @ %r262 bra $L14; add.u64 %r179,%r201,40; add.u64 %r177,%frame,240; add.u64 %r263,%r201,16; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r171,%r263,%r269; .loc 1 91 17 mov.u64 %r115,1; bra $L15; $L14: .loc 1 99 24 mov.u64 %r270,0; st.u64 [%r201+8],%r270; .loc 1 100 28 cvt.u16.u32 %r273,%r23; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r201+28],%r274; .loc 1 102 20 add.u32 %r275,%r23,-2; cvt.s64.s32 %r58,%r275; add.u64 %r277,%r58,%r58; add.u64 %r278,%r277,%r58; shl.b64 %r279,%r278,3; add.u64 %r280,%r201,%r279; .loc 1 102 67 shl.b64 %r282,%r58,3; add.u64 %r283,%frame,%r282; .loc 1 102 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r134,%r285,%r286; .loc 1 104 10 setp.eq.u64 %r287,%r134,0; @ %r287 bra $L16; bra $L52; $L18: .loc 1 93 55 mul.lo.u64 %r115,%r54,%r115; $L15: .loc 1 95 4 st.u64 [%r179+8],%r237; ld.u64 %r54,[%r177]; add.u64 %r289,%r54,-1; st.u64 [%r179+16],%r289; st.u64 [%r179],%r115; .loc 1 88 7 add.u64 %r179,%r179,24; add.u64 %r177,%r177,8; setp.ne.u64 %r290,%r171,%r179; @ %r290 bra $L18; bra $L14; $L16: .loc 1 107 4 st.u64 [%r201+48],%r134; mov.u64 %r292,-1; st.u64 [%r201+56],%r292; mov.u64 %r293,1; st.u64 [%r201+40],%r293; .loc 1 108 4 bra $L1; $L52: .loc 1 111 24 mov.u64 %r295,16295; call (%value_in),_gfortrani_xmallocarray296,[%value_in]; } .loc 1 111 22 st.u64 [%r201],%r296; $L25: .loc 1 137 3 setp.gt.s32 %r298,%r25,0; @ ! 
%r298 bra $L26; add.u64 %r378,%frame,240; bra $L20; $L26: .loc 1 145 8 ld.u64 %r140,[%r202]; .loc 1 147 21 cvt.u32.u64 %r76,%r26; .loc 1 147 50 add.u32 %r299,%r76,-4; and.b32 %r300,%r299,-5; set.u32.eq.u32 %r302,%r300,0; neg.s32 %r303,%r302; .loc 1 147 21 add.u32 %r304,%r76,-1; set.u32.le.u32 %r306,%r304,1; neg.s32 %r307,%r306; .loc 1 147 6 cvt.u16.u32 %r309,%r303; cvt.u16.u32 %r310,%r307; orbra $L53; $L20: add.u64 %r189,%frame,360; add.u64 %r188,%r201,40; mov.u64 %r186,%frame; .loc 1 123 10 mov.u64 %r142,0; .loc 1 139 16 mov.u64 %r332,%r142; bra $L23; $L51: .loc 1 115 19 ld.s8 %r65,[%r201+28]; .loc 1 115 10 setp.eq.u64 %r314,%r65,%r127; @ %r314 bra $L24; .loc 1 116 2 st.u64 [%stack+8],%r127; st.u64 [%stack],%r65; cvta.const.u64 %r315315_gfortran_runtime_error24: .loc 1 121 11 cvta.global.u64 %r317,_gfortrani_compile_options; .loc 1 121 10 ld.u32 %r318,[%r317+36]; setp.eq.u32 %r319,%r318,0; @ %r319 bra $L25; .loc 1 123 4 setp.le.s64 %r320,%r127,0; @ %r320 bra $L26; add.u64 %r170,%r201,48; add.u64 %r378,%frame,240; mov.u64 %r168,%r378; add.u64 %r162,%r201,56; cvt.u32.u32 %r322,%r23; cvt.s64.s8 %r321,%r322; add.u64 %r149,%r321,-1; .loc 1 123 10 mov.u64 %r114,0; $L28: .loc 1 127 21 ld.u64 %r324,[%r162]; add.u64 %r323,%r324,1; .loc 1 127 19 ld.u64 %r325,[%r170]; sub.u64 %r133,%r323,%r325; .loc 1 128 18 ld.u64 %r71,[%r168]; .loc 1 128 11 setp.eq.u64 %r326,%r71,%r133; @ %r326 bra $L27; .loc 1 129 3 st.u64 [%stack+16],%r71; st.u64 [%stack+8],%r133; cvt.u32.u64 %r329,%r114; add.u32 %r328,%r329,1; st.u32 [%stack],%r328; cvta.const.u64 %r327,$LC1stack; call _gfortran_runtime_error27: .loc 1 123 25 add.u64 %r114,%r114,1; .loc 1 123 4 add.u64 %r170,%r170,24; add.u64 %r168,%r168,8; add.u64 %r162,%r162,24; setp.ne.u64 %r331,%r114,%r149; @ %r331 bra $L28; bra $L20; $L23: .loc 1 139 16 st.u64 [%r189],%r332; .loc 1 140 18 ld.u64 %r333,[%r188]; st.u64 [%r186],%r333; .loc 1 141 17 shl.b64 %r335,%r142,3; add.u64 %r336,%r378,%r335; .loc 1 141 10 ld.u64 %r337,[%r336]; setp.le.s64 %r338,%r337,0; @ %r338 bra $L1; .loc 1 137 26 add.u64 %r142,%r142,1; .loc 1 137 3 add.u64 %r189,%r189,8; add.u64 %r188,%r188,24; add.u64 %r186,%r186,8; setp.gt.s64 %r339,%r127,%r142; @ %r339 bra $L23; bra $L26; $L53: .loc 1 149 7 setp.eq.u32 %r341,%r76,16; @ %r341 bra $L21; .loc 1 157 5 cvta.const.u64 %r343,$LC2; mov.u64 %r342,3343; call _gfortrani_internal_error61 9 add.u64 %r344,%r27,1; .loc 1 61 7 sub.u64 %r129,%r344,%r29; max.s64 %r131,%r129,0; .loc 1 65 11 mul.lo.u64 %r33,%r30,%r26; .loc 1 159 8 ld.u64 %r141,[%r201]; ld.u64 %r166,[%frame+360]; .loc 1 189 22 ld.u64 %r86,[%frame+120]; .loc 1 190 22 ld.u64 %r88,[%frame]; .loc 1 190 12 shl.b64 %r89,%r88,4; .loc 1 192 32 ld.u64 %r164,[%frame+240]; .loc 1 199 30 mul.lo.u64 %r345,%r164,%r86; .loc 1 199 16 neg.s64 %r150,%r345; .loc 1 200 30 mul.lo.u64 %r346,%r164,%r88; .loc 1 200 16 shl.b64 %r347,%r346,4; neg.s64 %r148,%r347; setp.gt.s64 %r380,%r129,0; setp.le.s32 %r377,%r25,1; .loc 1 180 14 mov.u64 %r382,0; cvt.u32.u32 %r384,%r23; cvt.s64.s8 %r385,%r384; add.u64 %r386,%r385,-1; add.u64 %r387,%frame,120; add.u64 %r388,%frame,240; $L36: .loc 1 170 12 @ %r380 bra $L37; .loc 1 171 10 st.u64 [%r141],%r382; st.u64 [%r141+8],%r382; bra $L30; $L37: mov.u64 %r136,%r140; .loc 1 174 13 mov.u64 %r135,0; $L29: .loc 1 178 6 ld.s8 %r352,[%r136]; cvt.u16.u32 %r351,%r352; setp.ne.u16 %r353,%r351,0; @ %r353 bra $L38; .loc 1 174 28 add.u64 %r135,%r135,1; .loc 1 174 36 add.u64 %r136,%r136,%r33; .loc 1 174 6 setp.gt.s64 %r354,%r131,%r135; @ %r354 bra $L29; .loc 1 169 10 mov.u64 %r375,%r382; mov.u64 
%r376,%r382; bra $L31; $L38: .loc 1 180 14 mov.u64 %r375,1; mov.u64 %r376,%r382; $L31: .loc 1 184 12 st.u64 [%r141],%r375; st.u64 [%r141+8],%r376; $L30: .loc 1 188 15 add.u64 %r166,%r166,1; .loc 1 189 12 add.u64 %r140,%r140,%r86; .loc 1 190 12 add.u64 %r141,%r141,%r89; .loc 1 192 13 setp.ne.u64 %r357,%r166,%r164; @ %r357 bra $L36; .loc 1 199 16 add.u64 %r137,%r140,%r150; .loc 1 200 16 add.u64 %r138,%r141,%r148; .loc 1 202 14 @ %r377 bra $L1; add.u64 %r32,%frame,368; mov.u64 %r165,8; .loc 1 201 12 mov.u64 %r139,1; bra $L34; $L35: .loc 1 196 20 st.u64 [%r32],%r382; .loc 1 199 30 mul.lo.u64 %r363,%r101,%r100; .loc 1 199 16 sub.u64 %r137,%r140,%r363; .loc 1 200 30 mul.lo.u64 %r364,%r103,%r100; .loc 1 200 16 shl.b64 %r365,%r364,4; sub.u64 %r138,%r141,%r365; .loc 1 201 12 add.u64 %r139,%r139,1; .loc 1 202 14 add.u64 %r32,%r32,8; add.u64 %r165,%r165,8; setp.eq.u64 %r366,%r139,%r386; @ %r366 bra $L1; $L34: .loc 1 210 23 ld.u64 %r367,[%r32]; add.u64 %r100,%r367,1; st.u64 [%r32],%r100; .loc 1 211 30 add.u64 %r369,%r387,%r165; ld.u64 %r101,[%r369]; .loc 1 211 20 add.u64 %r140,%r137,%r101; .loc 1 212 30 add.u64 %r370,%frame,%r165; ld.u64 %r103,[%r370]; .loc 1 212 20 shl.b64 %r371,%r103,4; add.u64 %r141,%r138,%r371; .loc 1 192 32 add.u64 %r373,%r388,%r165; ld.u64 %r107,[%r373]; .loc 1 192 13 setp.eq.u64 %r374,%r100,%r107; @ %r374 bra $L35; .loc 1 196 20 mov.u64 %r166,0; bra $L36; $L1: .loc 1 216 1 ret; } count_1_l.o/_gfortran_count_1_l .visible .func _gfortran_count_1_lfortran/generated/count_1_l.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,32,105,110,32,67,79,85,78,84,32,105,110,116,114,105,110,115,105,99,0 }114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,67,79,85,78,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,10810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,67,79,85,78,84,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_count_1_l .visible .func _gfortran_count_1_l48096u64u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .pred %r232; .reg .predu32 %r263; .reg .u64 %r265; .reg .u64 %r266; .reg .u64 %r267; .reg .u64 %r268; .reg .u16 %r270; .reg .u16 %r271; .reg .u32 %r272; .reg .u32 %r273; .reg .u64 %r275; .reg .u64u32 %r320; .reg .u64pred %r336; .reg .predu16 %r347; .reg .u32 %r348; .reg .predpred %r355; .reg .u32u64 %r371; .reg .pred %r372; .reg .predu64 %r385; .reg .u64 %r386; mov.u64 %r199,%ar0; mov.u64 %r200,%ar1; mov.u64 %r201,%ar2; .loc 1 56 10 ld.u64 %r22,[%r201]; .loc 1 56 7 add.u64 %r126,%r22,-1; .loc 1 57 10 ld.s8 %r23,[%r200+28]; .loc 1 57 38 add.u32 %r25,%r23,-1; .loc 1 57 8 cvt.s64.s32 %r127,%r25; .loc 1 59 14 ld.u64 %r26,[%r200+16]; .loc 1 61 9 add.u64 %r203,%r126,%r126; add.u64 %r204,%r203,%r126; shl.b64 %r205,%r204,3; add.u64 %r206,%r200,%r205; ld.u64 %r27,[%r206+56]; ld.u64 %r29,[%r206+48]; .loc 1 65 11 ld.u64 %r30,[%r206+40]; .loc 1 67 3 setp.gt.s64 %r220,%r126,0; @ %r220 bra $L2; $L8: .loc 1 
75 3 setp.lt.s64 %r221,%r126,%r127; @ %r221 bra $L3; bra $L4; $L2: add.u64 %r52,%r200,40; add.u64 %r51,%frame,120; add.u64 %r68,%frame,240; add.u64 %r222,%r200,16; add.u64 %r224,%r22,%r22; add.u64 %r225,%r224,%r22; shl.b64 %r226,%r225,3; add.u64 %r96,%r222,%r226; .loc 1 73 12 mov.u64 %r386,0; $L7: .loc 1 69 20 ld.u64 %r228,[%r52]; mul.lo.u64 %r227,%r228,%r26; .loc 1 69 18 st.u64 [%r51],%r227; .loc 1 70 19 ld.u64 %r230,[%r52+16]; add.u64 %r229,%r230,1; ld.u64 %r231,[%r52+8]; sub.u64 %r41,%r229,%r231; .loc 1 72 10 setp.lt.s64 %r232,%r41,0; @ %r232 bra $L5; .loc 1 70 17 st.u64 [%r68],%r41; bra $L6; $L5: .loc 1 73 12 st.u64 [%r68],%r386; $L6: .loc 1 67 3 add.u64 %r52,%r52,24; add.u64 %r51,%r51,8; add.u64 %r68,%r68,8; setp.ne.u64 %r234,%r52,%r96; @ %r234 bra $L7; bra $L8; $L4: .loc 1 84 6 ld.u64 %r235,[%r199]; setp.eq.u64 %r236,%r235,0; @ ! %r236 bra $L53; bra $L9; $L3: add.u64 %r238,%r22,%r22; add.u64 %r239,%r238,%r22; shl.b64 %r240,%r239,3; add.u64 %r241,%r240,40; add.u64 %r121,%r200,%r241; shl.b64 %r242,%r22,3; add.u64 %r111,%r242,-8; add.u64 %r376,%frame,120; add.u64 %r113,%r376,%r111; add.u64 %r375,%frame,240; add.u64 %r108,%r375,%r111; add.u64 %r245,%r200,40; cvt.u32.u32 %r247,%r23; cvt.s64.s8 %r246,%r247; add.u64 %r249,%r246,%r246; add.u64 %r250,%r249,%r246; shl.b64 %r251,%r250,3; add.u64 %r192,%r245,%r251; .loc 1 81 12 mov.u64 %r385,0; $L13: .loc 1 77 20 ld.u64 %r253,[%r121]; mul.lo.u64 %r252,%r253,%r26; .loc 1 77 18 st.u64 [%r113],%r252; .loc 1 78 19 ld.u64 %r255,[%r121+16]; add.u64 %r254,%r255,1; ld.u64 %r256,[%r121+8]; sub.u64 %r49,%r254,%r256; .loc 1 80 10 setp.lt.s64 %r257,%r49,0; @ %r257 bra $L11; .loc 1 78 17 st.u64 [%r108],%r49; bra $L12; $L11: .loc 1 81 12 st.u64 [%r108],%r385; $L12: .loc 1 75 3 add.u64 %r121,%r121,24; add.u64 %r113,%r113,8; add.u64 %r108,%r108,8; setp.ne.u64 %r259,%r121,%r192; @ %r259 bra $L13; bra $L4; $L9: .loc 1 88 7 setp.le.s32 %r260,%r25,0; @ %r260 bra $L14; add.u64 %r175,%r199,40; add.u64 %r173,%frame,240; add.u64 %r261,%r199,16; cvt.u32.u32 %r263,%r23; cvt.s64.s8 %r262,%r263; add.u64 %r265,%r262,%r262; add.u64 %r266,%r265,%r262; shl.b64 %r267,%r266,3; add.u64 %r167,%r261,%r267; .loc 1 91 17 mov.u64 %r115,1; bra $L15; $L14: .loc 1 99 24 mov.u64 %r268,0; st.u64 [%r199+8],%r268; .loc 1 100 28 cvt.u16.u32 %r271,%r23; add.u16 %r270,%r271,-1; cvt.u32.u16 %r272,%r270; st.u8 [%r199+28],%r272; .loc 1 102 20 add.u32 %r273,%r23,-2; cvt.s64.s32 %r59,%r273; add.u64 %r275,%r59,%r59; add.u64 %r276,%r275,%r59; shl.b64 %r277,%r276,3; add.u64 %r278,%r199,%r277; .loc 1 102 67 shl.b64 %r280,%r59,3; add.u64 %r281,%frame,%r280; .loc 1 102 59 ld.u64 %r283,[%r278+40]; ld.u64 %r284,[%r281+240]; mul.lo.u64 %r135,%r283,%r284; .loc 1 104 10 setp.eq.u64 %r285,%r135,0; @ %r285 bra $L16; bra $L54; $L18: .loc 1 93 55 mul.lo.u64 %r115,%r55,%r115; $L15: .loc 1 95 4 st.u64 [%r175+8],%r235; ld.u64 %r55,[%r173]; add.u64 %r287,%r55,-1; st.u64 [%r175+16],%r287; st.u64 [%r175],%r115; .loc 1 88 7 add.u64 %r175,%r175,24; add.u64 %r173,%r173,8; setp.ne.u64 %r288,%r167,%r175; @ %r288 bra $L18; bra $L14; $L16: .loc 1 107 4 st.u64 [%r199+48],%r135; mov.u64 %r290,-1; st.u64 [%r199+56],%r290; mov.u64 %r291,1; st.u64 [%r199+40],%r291; .loc 1 108 4 bra $L1; $L54: .loc 1 111 24 mov.u64 %r29293; call (%value_in),_gfortrani_xmallocarray294,[%value_in]; } .loc 1 111 22 st.u64 [%r199],%r294; $L25: .loc 1 137 3 setp.gt.s32 %r296,%r25,0; @ ! 
%r296 bra $L26; add.u64 %r375,%frame,240; bra $L20; $L26: .loc 1 145 8 ld.u64 %r142,[%r200]; .loc 1 147 21 cvt.u32.u64 %r80,%r26; .loc 1 147 50 add.u32 %r297,%r80,-4; and.b32 %r298,%r297,-5; set.u32.eq.u32 %r300,%r298,0; neg.s32 %r301,%r300; .loc 1 147 21 add.u32 %r302,%r80,-1; set.u32.le.u32 %r304,%r302,1; neg.s32 %r305,%r304; .loc 1 147 6 cvt.u16; bra $L55; $L20: add.u64 %r185,%frame,360; add.u64 %r184,%r199,40; mov.u64 %r182,%frame; .loc 1 123 10 mov.u64 %r114,0; .loc 1 139 16 mov.u64 %r330,%r114; bra $L23; $L53: .loc 1 115 19 ld.s8 %r66,[%r199+28]; .loc 1 115 10 setp.eq.u64 %r312,%r66,%r127; @ %r312 bra $L24; .loc 1 116 2 st.u64 [%stack+8],%r1273_gfortran_runtime_error24: .loc 1 121 11 cvta.global.u64 %r315,_gfortrani_compile_options; .loc 1 121 10 ld.u32 %r316,[%r315+36]; setp.eq.u32 %r317,%r316,0; @ %r317 bra $L25; .loc 1 123 4 setp.le.s64 %r318,%r127,0; @ %r318 bra $L26; add.u64 %r163,%r199,48; add.u64 %r375,%frame,240; mov.u64 %r157,%r375; add.u64 %r153,%r199,56; cvt.u32.u32 %r320,%r23; cvt.s64.s8 %r319,%r320; add.u64 %r123,%r319,-1; .loc 1 123 10 mov.u64 %r134,0; $L28: .loc 1 127 21 ld.u64 %r322,[%r153]; add.u64 %r321,%r322,1; .loc 1 127 19 ld.u64 %r323,[%r163]; sub.u64 %r133,%r321,%r323; .loc 1 128 18 ld.u64 %r74,[%r157]; .loc 1 128 11 setp.eq.u64 %r324,%r74,%r133; @ %r324 bra $L27; .loc 1 129 3 st.u64 [%stack+16],%r74; st.u64 [%stack+8],%r133; cvt.u32.u64 %r327,%r134; add.u32 %r326,%r327,1; st.u32 [%stack],%r326;_gfortran_runtime_error27: .loc 1 123 25 add.u64 %r134,%r134,1; .loc 1 123 4 add.u64 %r163,%r163,24; add.u64 %r157,%r157,8; add.u64 %r153,%r153,24; setp.ne.u64 %r329,%r123,%r134; @ %r329 bra $L28; bra $L20; $L23: .loc 1 139 16 st.u64 [%r185],%r330; .loc 1 140 18 ld.u64 %r331,[%r184]; st.u64 [%r182],%r331; .loc 1 141 17 shl.b64 %r333,%r114,3; add.u64 %r334,%r375,%r333; .loc 1 141 10 ld.u64 %r335,[%r334]; setp.le.s64 %r336,%r335,0; @ %r336 bra $L1; .loc 1 137 26 add.u64 %r114,%r114,1; .loc 1 137 3 add.u64 %r185,%r185,8; add.u64 %r184,%r184,24; add.u64 %r182,%r182,8; setp.gt.s64 %r337,%r127,%r114; @ %r337 bra $L23; bra $L26; $L55: .loc 1 149 7 setp.eq.u32 %r339,%r80,16; @ %r339 bra $L21; .loc 1 157 5 cvta.const.u64 %r341,$LC2; mov.u64 %r340,34341; call _gfortrani_internal_error61 9 add.u64 %r342,%r27,1; .loc 1 61 7 sub.u64 %r129,%r342,%r29; max.s64 %r131,%r129,0; .loc 1 65 11 mul.lo.u64 %r33,%r30,%r26; .loc 1 159 8 ld.u64 %r143,[%r199]; ld.u64 %r165,[%frame+360]; .loc 1 185 22 ld.u64 %r89,[%frame+120]; .loc 1 186 22 ld.u64 %r91,[%frame]; .loc 1 188 32 ld.u64 %r70,[%frame+240]; .loc 1 195 30 mul.lo.u64 %r343,%r89,%r70; .loc 1 195 16 neg.s64 %r154,%r343; .loc 1 196 30 mul.lo.u64 %r344,%r91,%r70; .loc 1 196 16 neg.s64 %r158,%r344; setp.gt.s64 %r373,%r129,0; setp.le.s32 %r374,%r25,1; cvt.u32.u32 %r378,%r23; cvt.s64.s8 %r379,%r378; add.u64 %r380,%r379,-1; add.u64 %r381,%frame,120; add.u64 %r382,%frame,240; .loc 1 192 20 mov.u64 %r383,0; $L36: .loc 1 170 12 @ %r373 bra $L37; .loc 1 171 10 mov.u32 %r346,0; st.u8 [%r143],%r346; bra $L30; $L37: mov.u64 %r137,%r142; .loc 1 169 10 mov.u32 %r117,0; .loc 1 174 13 mov.u64 %r136,0; $L29: .loc 1 177 6 ld.s8 %r348,[%r137]; cvt.u16.u32 %r347,%r348; setp.eq.u16 %r349,%r347,0; @ %r349 bra $L31; .loc 1 178 11 cvt.u16.u32 %r352,%r117; add.u16 %r351,%r352,1; cvt.u32.u16 %r353,%r351; cvt.s32.s8 %r117,%r353; $L31: .loc 1 174 28 add.u64 %r136,%r136,1; .loc 1 174 36 add.u64 %r137,%r137,%r33; .loc 1 174 6 setp.gt.s64 %r355,%r131,%r136; @ %r355 bra $L29; .loc 1 180 12 cvt.u32.u32 %r356,%r117; st.u8 [%r143],%r356; $L30: .loc 1 184 15 add.u64 
%r165,%r165,1; .loc 1 185 12 add.u64 %r142,%r142,%r89; .loc 1 186 12 add.u64 %r143,%r143,%r91; .loc 1 188 13 setp.ne.u64 %r357,%r70,%r165; @ %r357 bra $L36; .loc 1 195 16 add.u64 %r139,%r142,%r154; .loc 1 196 16 add.u64 %r140,%r143,%r158; .loc 1 198 14 @ %r374 bra $L1; add.u64 %r32,%frame,368; mov.u64 %r164,8; .loc 1 197 12 mov.u64 %r141,1; bra $L34; $L35: .loc 1 192 20 st.u64 [%r32],%r383; .loc 1 195 30 mul.lo.u64 %r363,%r103,%r102; .loc 1 195 16 sub.u64 %r139,%r142,%r363; .loc 1 196 30 mul.lo.u64 %r364,%r105,%r102; .loc 1 196 16 sub.u64 %r140,%r143,%r364; .loc 1 197 12 add.u64 %r141,%r141,1; .loc 1 198 14 add.u64 %r32,%r32,8; add.u64 %r164,%r164,8; setp.eq.u64 %r365,%r141,%r380; @ %r365 bra $L1; $L34: .loc 1 206 23 ld.u64 %r366,[%r32]; add.u64 %r102,%r366,1; st.u64 [%r32],%r102; .loc 1 207 30 add.u64 %r368,%r381,%r164; ld.u64 %r103,[%r368]; .loc 1 207 20 add.u64 %r142,%r139,%r103; .loc 1 208 30 add.u64 %r369,%frame,%r164; ld.u64 %r105,[%r369]; .loc 1 208 20 add.u64 %r143,%r140,%r105; .loc 1 188 32 add.u64 %r371,%r382,%r164; ld.u64 %r107,[%r371]; .loc 1 188 13 setp.eq.u64 %r372,%r102,%r107; @ %r372 bra $L35; .loc 1 192 20 mov.u64 %r165,0; bra $L36; $L1: .loc 1 212 1 ret; } count_2_l.o/_gfortran_count_2_l .visible .func _gfortran_count_2_lfortran/generated/count_2_l.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,32,105,110,32,67,79,85,78,84,32,105,110,116,114,105,110,115,105,99,0 }114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,67,79,85,78,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,10810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,67,79,85,78,84,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_count_2_l .visible .func _gfortran_count_2_l48092predpred %r238; .reg .u64predu64 %r279; .reg .u64 %r280; .reg .u64 %r281; .reg .u64pred %r316; .reg .u64 %r317; .reg .u64 %r319; .reg .u32predpredmov.u64 %r203,%ar0; mov.u64 %r204,%ar1; mov.u64 %r205,%ar2; .loc 1 56 10 ld.u64 %r22,[%r205]; .loc 1 56 7 add.u64 %r125,%r22,-1; .loc 1 57 10 ld.s8 %r23,[%r204+28]; .loc 1 57 38 add.u32 %r25,%r23,-1; .loc 1 57 8 cvt.s64.s32 %r126,%r25; .loc 1 59 14 ld.u64 %r26,[%r204+16]; .loc 1 61 9 add.u64 %r207,%r125,%r125; add.u64 %r208,%r207,%r125; shl.b64 %r209,%r208,3; add.u64 %r210,%r204,%r209; ld.u64 %r27,[%r210+56]; ld.u64 %r29,[%r210+48]; .loc 1 65 11 ld.u64 %r30,[%r210+40]; .loc 1 67 3 setp.gt.s64 %r224,%r125,0; @ %r224 bra $L2; $L8: .loc 1 75 3 setp.lt.s64 %r225,%r125,%r126; @ %r225 bra $L3; bra $L4; $L2: add.u64 %r197,%r204,40; add.u64 %r51,%frame,120; add.u64 %r52,%frame,240; add.u64 %r227,%r22,%r22; add.u64 %r228,%r227,%r22; shl.b64 %r229,%r228,3; add.u64 %r230,%r204,16; add.u64 %r73,%r229,%r230; .loc 1 73 12 mov.u64 %r392,0; $L7: .loc 1 69 20 ld.u64 %r232,[%r197]; mul.lo.u64 %r231,%r232,%r26; .loc 1 69 18 st.u64 [%r51],%r231; .loc 1 70 19 ld.u64 %r234,[%r197+16]; add.u64 %r233,%r234,1; ld.u64 %r235,[%r197+8]; sub.u64 
%r41,%r233,%r235; .loc 1 72 10 setp.lt.s64 %r236,%r41,0; @ %r236 bra $L5; .loc 1 70 17 st.u64 [%r52],%r41; bra $L6; $L5: .loc 1 73 12 st.u64 [%r52],%r392; $L6: .loc 1 67 3 add.u64 %r197,%r197,24; add.u64 %r51,%r51,8; add.u64 %r52,%r52,8; setp.ne.u64 %r238,%r73,%r197; @ %r238 bra $L7; bra $L8; $L4: .loc 1 84 6 ld.u64 %r239,[%r203]; setp.eq.u64 %r240,%r239,0; @ ! %r240 bra $L53; bra $L9; $L3: add.u64 %r242,%r22,%r22; add.u64 %r243,%r242,%r22; shl.b64 %r244,%r243,3; add.u64 %r245,%r244,40; add.u64 %r136,%r204,%r245; shl.b64 %r246,%r22,3; add.u64 %r112,%r246,-8; add.u64 %r381,%frame,120; add.u64 %r117,%r381,%r112; add.u64 %r380,%frame,240; add.u64 %r110,%r380,%r112; add.u64 %r249,%r204,40; cvt.u32.u32 %r251,%r23; cvt.s64.s8 %r250,%r251; add.u64 %r253,%r250,%r250; add.u64 %r254,%r253,%r250; shl.b64 %r255,%r254,3; add.u64 %r200,%r249,%r255; .loc 1 81 12 mov.u64 %r391,0; $L13: .loc 1 77 20 ld.u64 %r257,[%r136]; mul.lo.u64 %r256,%r257,%r26; .loc 1 77 18 st.u64 [%r117],%r256; .loc 1 78 19 ld.u64 %r259,[%r136+16]; add.u64 %r258,%r259,1; ld.u64 %r260,[%r136+8]; sub.u64 %r49,%r258,%r260; .loc 1 80 10 setp.lt.s64 %r261,%r49,0; @ %r261 bra $L11; .loc 1 78 17 st.u64 [%r110],%r49; bra $L12; $L11: .loc 1 81 12 st.u64 [%r110],%r391; $L12: .loc 1 75 3 add.u64 %r136,%r136,24; add.u64 %r117,%r117,8; add.u64 %r110,%r110,8; setp.ne.u64 %r263,%r136,%r200; @ %r263 bra $L13; bra $L4; $L9: .loc 1 88 7 setp.le.s32 %r264,%r25,0; @ %r264 bra $L14; add.u64 %r180,%r203,40; add.u64 %r178,%frame,240; add.u64 %r265,%r203,16; cvt.u32.u32 %r267,%r23; cvt.s64.s8 %r266,%r267; add.u64 %r269,%r266,%r266; add.u64 %r270,%r269,%r266; shl.b64 %r271,%r270,3; add.u64 %r172,%r265,%r271; .loc 1 91 17 mov.u64 %r115,1; bra $L15; $L14: .loc 1 99 24 mov.u64 %r272,0; st.u64 [%r203+8],%r272; .loc 1 100 28 cvt.u16.u32 %r275,%r23; add.u16 %r274,%r275,-1; cvt.u32.u16 %r276,%r274; st.u8 [%r203+28],%r276; .loc 1 102 20 add.u32 %r277,%r23,-2; cvt.s64.s32 %r58,%r277; add.u64 %r279,%r58,%r58; add.u64 %r280,%r279,%r58; shl.b64 %r281,%r280,3; add.u64 %r282,%r203,%r281; .loc 1 102 67 shl.b64 %r284,%r58,3; add.u64 %r285,%frame,%r284; .loc 1 102 59 ld.u64 %r287,[%r282+40]; ld.u64 %r288,[%r285+240]; mul.lo.u64 %r133,%r287,%r288; .loc 1 104 10 setp.eq.u64 %r289,%r133,0; @ %r289 bra $L16; bra $L54; $L18: .loc 1 93 55 mul.lo.u64 %r115,%r54,%r115; $L15: .loc 1 95 4 st.u64 [%r180+8],%r239; ld.u64 %r54,[%r178]; add.u64 %r291,%r54,-1; st.u64 [%r180+16],%r291; st.u64 [%r180],%r115; .loc 1 88 7 add.u64 %r180,%r180,24; add.u64 %r178,%r178,8; setp.ne.u64 %r292,%r172,%r180; @ %r292 bra $L18; bra $L14; $L16: .loc 1 107 4 st.u64 [%r203+48],%r133; mov.u64 %r294,-1; st.u64 [%r203+56],%r294; mov.u64 %r295,1; st.u64 [%r203+40],%r295; .loc 1 108 4 bra $L1; $L54: .loc 1 111 24 mov.u64 %r297,13(%value_in),_gfortrani_xmallocarray298,[%value_in]; } .loc 1 111 22 st.u64 [%r203],%r298; $L25: .loc 1 137 3 setp.gt.s32 %r300,%r25,0; @ ! 
%r300 bra $L26; add.u64 %r380,%frame,240; bra $L20; $L26: .loc 1 145 8 ld.u64 %r140,[%r204]; .loc 1 147 21 cvt.u32.u64 %r78,%r26; .loc 1 147 50 add.u32 %r301,%r78,-4; and.b32 %r302,%r301,-5; set.u32.eq.u32 %r304,%r302,0; neg.s32 %r305,%r304; .loc 1 147 21 add.u32 %r306,%r78,-1; set.u32.le.u32 %r308,%r306,1; neg.s32 %r309,%r308; .loc 1 147 6 cvt.u16.u32 %r311,%r305; cvt.u16.u32 %r312,%r309; orbra $L55; $L20: add.u64 %r190,%frame,360; add.u64 %r189,%r203,40; mov.u64 %r187,%frame; .loc 1 123 10 mov.u64 %r142,0; .loc 1 139 16 mov.u64 %r334,%r142; bra $L23; $L53: .loc 1 115 19 ld.s8 %r65,[%r203+28]; .loc 1 115 10 setp.eq.u64 %r316,%r65,%r126; @ %r316 bra $L24; .loc 1 116 2 st.u64 [%stack+8],%r126; st.u64 [%stack],%r65; cvta.const.u64 %r31317_gfortran_runtime_error24: .loc 1 121 11 cvta.global.u64 %r319,_gfortrani_compile_options; .loc 1 121 10 ld.u32 %r320,[%r319+36]; setp.eq.u32 %r321,%r320,0; @ %r321 bra $L25; .loc 1 123 4 setp.le.s64 %r322,%r126,0; @ %r322 bra $L26; add.u64 %r171,%r203,48; add.u64 %r380,%frame,240; mov.u64 %r166,%r380; add.u64 %r164,%r203,56; cvt.u32.u32 %r324,%r23; cvt.s64.s8 %r323,%r324; add.u64 %r146,%r323,-1; .loc 1 123 10 mov.u64 %r114,0; $L28: .loc 1 127 21 ld.u64 %r326,[%r164]; add.u64 %r325,%r326,1; .loc 1 127 19 ld.u64 %r327,[%r171]; sub.u64 %r132,%r325,%r327; .loc 1 128 18 ld.u64 %r72,[%r166]; .loc 1 128 11 setp.eq.u64 %r328,%r72,%r132; @ %r328 bra $L27; .loc 1 129 3 st.u64 [%stack+16],%r72; st.u64 [%stack+8],%r132; cvt.u32.u64 %r331,%r114; add.u32 %r330,%r331,1; st.u32 [%stack],%r330;_gfortran_runtime_error27: .loc 1 123 25 add.u64 %r114,%r114,1; .loc 1 123 4 add.u64 %r171,%r171,24; add.u64 %r166,%r166,8; add.u64 %r164,%r164,24; setp.ne.u64 %r333,%r114,%r146; @ %r333 bra $L28; bra $L20; $L23: .loc 1 139 16 st.u64 [%r190],%r334; .loc 1 140 18 ld.u64 %r335,[%r189]; st.u64 [%r187],%r335; .loc 1 141 17 shl.b64 %r337,%r142,3; add.u64 %r338,%r380,%r337; .loc 1 141 10 ld.u64 %r339,[%r338]; setp.le.s64 %r340,%r339,0; @ %r340 bra $L1; .loc 1 137 26 add.u64 %r142,%r142,1; .loc 1 137 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.gt.s64 %r341,%r126,%r142; @ %r341 bra $L23; bra $L26; $L55: .loc 1 149 7 setp.eq.u32 %r343,%r78,16; @ %r343 bra $L21; .loc 1 157 5 cvta.const.u64 %r345,$LC2; mov.u64 %r344,3445; call _gfortrani_internal_error61 9 add.u64 %r346,%r27,1; .loc 1 61 7 sub.u64 %r128,%r346,%r29; max.s64 %r130,%r128,0; .loc 1 65 11 mul.lo.u64 %r33,%r30,%r26; .loc 1 159 8 ld.u64 %r141,[%r203]; ld.u64 %r168,[%frame+360]; .loc 1 185 22 ld.u64 %r89,[%frame+120]; .loc 1 186 22 ld.u64 %r91,[%frame]; .loc 1 186 12 add.u64 %r92,%r91,%r91; .loc 1 188 32 ld.u64 %r68,[%frame+240]; .loc 1 195 30 mul.lo.u64 %r348,%r89,%r68; .loc 1 195 16 neg.s64 %r154,%r348; .loc 1 196 30 mul.lo.u64 %r349,%r91,%r68; .loc 1 196 16 add.u64 %r350,%r349,%r349; neg.s64 %r165,%r350; setp.gt.s64 %r382,%r128,0; setp.le.s32 %r379,%r25,1; cvt.u32.u32 %r384,%r23; cvt.s64.s8 %r385,%r384; add.u64 %r386,%r385,-1; add.u64 %r387,%frame,120; add.u64 %r388,%frame,240; .loc 1 192 20 mov.u64 %r389,0; $L36: .loc 1 170 12 @ %r382 bra $L37; .loc 1 171 10 mov.u16 %r352,0; st.u16 [%r141],%r352; bra $L30; $L37: mov.u64 %r135,%r140; .loc 1 169 10 mov.u32 %r116,0; .loc 1 174 13 mov.u64 %r134,0; $L29: .loc 1 177 6 ld.s8 %r354,[%r135]; cvt.u16.u32 %r353,%r354; setp.eq.u16 %r355,%r353,0; @ %r355 bra $L31; .loc 1 178 11 cvt.u16.u32 %r357,%r116; add.u16 %r356,%r357,1; cvt.s32.s16 %r116,%r356; $L31: .loc 1 174 28 add.u64 %r134,%r134,1; .loc 1 174 36 add.u64 %r135,%r135,%r33; .loc 1 174 6 setp.gt.s64 
%r359,%r130,%r134; @ %r359 bra $L29; .loc 1 180 12 cvt.u16.u32 %r360,%r116; st.u16 [%r141],%r360; $L30: .loc 1 184 15 add.u64 %r168,%r168,1; .loc 1 185 12 add.u64 %r140,%r140,%r89; .loc 1 186 12 add.u64 %r141,%r141,%r92; .loc 1 188 13 setp.ne.u64 %r361,%r68,%r168; @ %r361 bra $L36; .loc 1 195 16 add.u64 %r137,%r140,%r154; .loc 1 196 16 add.u64 %r138,%r141,%r165; .loc 1 198 14 @ %r379 bra $L1; add.u64 %r32,%frame,368; mov.u64 %r167,8; .loc 1 197 12 mov.u64 %r139,1; bra $L34; $L35: .loc 1 192 20 st.u64 [%r32],%r389; .loc 1 195 30 mul.lo.u64 %r367,%r103,%r102; .loc 1 195 16 sub.u64 %r137,%r140,%r367; .loc 1 196 30 mul.lo.u64 %r368,%r105,%r102; .loc 1 196 16 add.u64 %r369,%r368,%r368; sub.u64 %r138,%r141,%r369; .loc 1 197 12 add.u64 %r139,%r139,1; .loc 1 198 14 add.u64 %r32,%r32,8; add.u64 %r167,%r167,8; setp.eq.u64 %r370,%r139,%r386; @ %r370 bra $L1; $L34: .loc 1 206 23 ld.u64 %r371,[%r32]; add.u64 %r102,%r371,1; st.u64 [%r32],%r102; .loc 1 207 30 add.u64 %r373,%r387,%r167; ld.u64 %r103,[%r373]; .loc 1 207 20 add.u64 %r140,%r137,%r103; .loc 1 208 30 add.u64 %r374,%frame,%r167; ld.u64 %r105,[%r374]; .loc 1 208 20 add.u64 %r375,%r105,%r105; add.u64 %r141,%r138,%r375; .loc 1 188 32 add.u64 %r377,%r388,%r167; ld.u64 %r108,[%r377]; .loc 1 188 13 setp.eq.u64 %r378,%r102,%r108; @ %r378 bra $L35; .loc 1 192 20 mov.u64 %r168,0; bra $L36; $L1: .loc 1 212 1 ret; } count_4_l.o/_gfortran_count_4_l .visible .func _gfortran_count_4_lfortran/generated/count_4_l.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,32,105,110,32,67,79,85,78,84,32,105,110,116,114,105,110,115,105,99,0 }114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,67,79,85,78,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,10810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,67,79,85,78,84,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_count_4_l .visible .func _gfortran_count_4_l480u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .u64 %r232; .reg .u64 %r233; .reg .pred %r234; .reg .pred %r236; .reg .u64u64u64 %r257; .reg .u64 %r258; .reg .predu64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64pred %r320; .reg .u64u64 %r332; .reg .u64 %r333; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .pred %r338; .reg .pred %r339; .reg .pred %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64 %r345; .reg .u64 %r346; .reg .u64 %r347; .reg .u32 %r349; .reg .u16 %r350; .reg .u32 %r351; .reg .pred %r352; .reg .pred %r353; .reg .pred %r354; .reg .u64pred %r363; .reg .u64u64 %r385; mov.u64 %r201,%ar0; mov.u64 %r202,%ar1; mov.u64 %r203,%ar2; .loc 1 56 10 ld.u64 %r22,[%r203]; .loc 1 56 7 add.u64 %r123,%r22,-1; .loc 1 57 10 ld.s8 %r23,[%r202+28]; .loc 1 57 38 add.u32 %r25,%r23,-1; .loc 1 57 8 cvt.s64.s32 %r124,%r25; .loc 1 59 14 ld.u64 %r26,[%r202+16]; .loc 1 61 9 add.u64 %r205,%r123,%r123; add.u64 %r206,%r205,%r123; shl.b64 %r207,%r206,3; add.u64 
%r208,%r202,%r207; ld.u64 %r27,[%r208+56]; ld.u64 %r29,[%r208+48]; .loc 1 65 11 ld.u64 %r30,[%r208+40]; .loc 1 67 3 setp.gt.s64 %r222,%r123,0; @ %r222 bra $L2; $L8: .loc 1 75 3 setp.lt.s64 %r223,%r123,%r124; @ %r223 bra $L3; bra $L4; $L2: add.u64 %r195,%r202,40; add.u64 %r51,%frame,120; add.u64 %r52,%frame,240; add.u64 %r225,%r22,%r22; add.u64 %r226,%r225,%r22; shl.b64 %r227,%r226,3; add.u64 %r228,%r202,16; add.u64 %r73,%r227,%r228; .loc 1 73 12 mov.u64 %r385,0; $L7: .loc 1 69 20 ld.u64 %r230,[%r195]; mul.lo.u64 %r229,%r230,%r26; .loc 1 69 18 st.u64 [%r51],%r229; .loc 1 70 19 ld.u64 %r232,[%r195+16]; add.u64 %r231,%r232,1; ld.u64 %r233,[%r195+8]; sub.u64 %r41,%r231,%r233; .loc 1 72 10 setp.lt.s64 %r234,%r41,0; @ %r234 bra $L5; .loc 1 70 17 st.u64 [%r52],%r41; bra $L6; $L5: .loc 1 73 12 st.u64 [%r52],%r385; $L6: .loc 1 67 3 add.u64 %r195,%r195,24; add.u64 %r51,%r51,8; add.u64 %r52,%r52,8; setp.ne.u64 %r236,%r73,%r195; @ %r236 bra $L7; bra $L8; $L4: .loc 1 84 6 ld.u64 %r237,[%r201]; setp.eq.u64 %r238,%r237,0; @ ! %r238 bra $L53; bra $L9; $L3: add.u64 %r240,%r22,%r22; add.u64 %r241,%r240,%r22; shl.b64 %r242,%r241,3; add.u64 %r243,%r242,40; add.u64 %r134,%r202,%r243; shl.b64 %r244,%r22,3; add.u64 %r110,%r244,-8; add.u64 %r375,%frame,120; add.u64 %r115,%r375,%r110; add.u64 %r374,%frame,240; add.u64 %r108,%r374,%r110; add.u64 %r247,%r202,40; cvt.u32.u32 %r249,%r23; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r198,%r247,%r253; .loc 1 81 12 mov.u64 %r384,0; $L13: .loc 1 77 20 ld.u64 %r255,[%r134]; mul.lo.u64 %r254,%r255,%r26; .loc 1 77 18 st.u64 [%r115],%r254; .loc 1 78 19 ld.u64 %r257,[%r134+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r134+8]; sub.u64 %r49,%r256,%r258; .loc 1 80 10 setp.lt.s64 %r259,%r49,0; @ %r259 bra $L11; .loc 1 78 17 st.u64 [%r108],%r49; bra $L12; $L11: .loc 1 81 12 st.u64 [%r108],%r384; $L12: .loc 1 75 3 add.u64 %r134,%r134,24; add.u64 %r115,%r115,8; add.u64 %r108,%r108,8; setp.ne.u64 %r261,%r134,%r198; @ %r261 bra $L13; bra $L4; $L9: .loc 1 88 7 setp.le.s32 %r262,%r25,0; @ %r262 bra $L14; add.u64 %r178,%r201,40; add.u64 %r176,%frame,240; add.u64 %r263,%r201,16; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r170,%r263,%r269; .loc 1 91 17 mov.u64 %r113,1; bra $L15; $L14: .loc 1 99 24 mov.u64 %r270,0; st.u64 [%r201+8],%r270; .loc 1 100 28 cvt.u16.u32 %r273,%r23; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r201+28],%r274; .loc 1 102 20 add.u32 %r275,%r23,-2; cvt.s64.s32 %r58,%r275; add.u64 %r277,%r58,%r58; add.u64 %r278,%r277,%r58; shl.b64 %r279,%r278,3; add.u64 %r280,%r201,%r279; .loc 1 102 67 shl.b64 %r282,%r58,3; add.u64 %r283,%frame,%r282; .loc 1 102 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r131,%r285,%r286; .loc 1 104 10 setp.eq.u64 %r287,%r131,0; @ %r287 bra $L16; bra $L54; $L18: .loc 1 93 55 mul.lo.u64 %r113,%r54,%r113; $L15: .loc 1 95 4 st.u64 [%r178+8],%r237; ld.u64 %r54,[%r176]; add.u64 %r289,%r54,-1; st.u64 [%r178+16],%r289; st.u64 [%r178],%r113; .loc 1 88 7 add.u64 %r178,%r178,24; add.u64 %r176,%r176,8; setp.ne.u64 %r290,%r170,%r178; @ %r290 bra $L18; bra $L14; $L16: .loc 1 107 4 st.u64 [%r201+48],%r131; mov.u64 %r292,-1; st.u64 [%r201+56],%r292; mov.u64 %r293,1; st.u64 [%r201+40],%r293; .loc 1 108 4 bra $L1; $L54: .loc 1 111 24295; call (%value_in),_gfortrani_xmallocarray296,[%value_in]; } .loc 1 111 22 st.u64 [%r201],%r296; $L25: .loc 1 137 3 setp.gt.s32 %r298,%r25,0; @ ! 
%r298 bra $L26; add.u64 %r374,%frame,240; bra $L20; $L26: .loc 1 145 8 ld.u64 %r138,[%r202]; .loc 1 147 21 cvt.u32.u64 %r78,%r26; .loc 1 147 50 add.u32 %r299,%r78,-4; and.b32 %r300,%r299,-5; set.u32.eq.u32 %r302,%r300,0; neg.s32 %r303,%r302; .loc 1 147 21 add.u32 %r304,%r78,-1; set.u32.le.u32 %r306,%r304,1; neg.s32 %r307,%r306; .loc 1 147 6 cvt.u16.u32 %r309,%r303; cvt.u16.u32 %r310,%r307; orbra $L55; $L20: add.u64 %r188,%frame,360; add.u64 %r187,%r201,40; mov.u64 %r185,%frame; .loc 1 123 10 mov.u64 %r140,0; .loc 1 139 16 mov.u64 %r332,%r140; bra $L23; $L53: .loc 1 115 19 ld.s8 %r65,[%r201+28]; .loc 1 115 10 setp.eq.u64 %r314,%r65,%r124; @ %r314 bra $L24; .loc 1 116 2 st.u64 [%stack+8],%r124; st.u64 [%stack],%r65; cvta.const.u64 %r315315_gfortran_runtime_error24: .loc 1 121 11 cvta.global.u64 %r317,_gfortrani_compile_options; .loc 1 121 10 ld.u32 %r318,[%r317+36]; setp.eq.u32 %r319,%r318,0; @ %r319 bra $L25; .loc 1 123 4 setp.le.s64 %r320,%r124,0; @ %r320 bra $L26; add.u64 %r169,%r201,48; add.u64 %r374,%frame,240; mov.u64 %r164,%r374; add.u64 %r162,%r201,56; cvt.u32.u32 %r322,%r23; cvt.s64.s8 %r321,%r322; add.u64 %r144,%r321,-1; .loc 1 123 10 mov.u64 %r112,0; $L28: .loc 1 127 21 ld.u64 %r324,[%r162]; add.u64 %r323,%r324,1; .loc 1 127 19 ld.u64 %r325,[%r169]; sub.u64 %r130,%r323,%r325; .loc 1 128 18 ld.u64 %r72,[%r164]; .loc 1 128 11 setp.eq.u64 %r326,%r72,%r130; @ %r326 bra $L27; .loc 1 129 3 st.u64 [%stack+16],%r72; st.u64 [%stack+8],%r130; cvt.u32.u64 %r329,%r112; add.u32 %r328,%r329,1; st.u32 [%stack],%r328; cvta.const.u64 %r327,$LC1stack; call _gfortran_runtime_error27: .loc 1 123 25 add.u64 %r112,%r112,1; .loc 1 123 4 add.u64 %r169,%r169,24; add.u64 %r164,%r164,8; add.u64 %r162,%r162,24; setp.ne.u64 %r331,%r112,%r144; @ %r331 bra $L28; bra $L20; $L23: .loc 1 139 16 st.u64 [%r188],%r332; .loc 1 140 18 ld.u64 %r333,[%r187]; st.u64 [%r185],%r333; .loc 1 141 17 shl.b64 %r335,%r140,3; add.u64 %r336,%r374,%r335; .loc 1 141 10 ld.u64 %r337,[%r336]; setp.le.s64 %r338,%r337,0; @ %r338 bra $L1; .loc 1 137 26 add.u64 %r140,%r140,1; .loc 1 137 3 add.u64 %r188,%r188,8; add.u64 %r187,%r187,24; add.u64 %r185,%r185,8; setp.gt.s64 %r339,%r124,%r140; @ %r339 bra $L23; bra $L26; $L55: .loc 1 149 7 setp.eq.u32 %r341,%r78,16; @ %r341 bra $L21; .loc 1 157 5 cvta.const.u64 %r343,$LC2; mov.u64 %r342,3343; call _gfortrani_internal_error61 9 add.u64 %r344,%r27,1; .loc 1 61 7 sub.u64 %r126,%r344,%r29; max.s64 %r128,%r126,0; .loc 1 65 11 mul.lo.u64 %r33,%r30,%r26; .loc 1 159 8 ld.u64 %r139,[%r201]; ld.u64 %r166,[%frame+360]; .loc 1 185 22 ld.u64 %r87,[%frame+120]; .loc 1 186 22 ld.u64 %r89,[%frame]; .loc 1 186 12 shl.b64 %r90,%r89,2; .loc 1 188 32 ld.u64 %r68,[%frame+240]; .loc 1 195 30 mul.lo.u64 %r345,%r87,%r68; .loc 1 195 16 neg.s64 %r152,%r345; .loc 1 196 30 mul.lo.u64 %r346,%r89,%r68; .loc 1 196 16 shl.b64 %r347,%r346,2; neg.s64 %r163,%r347; setp.gt.s64 %r372,%r126,0; setp.le.s32 %r373,%r25,1; cvt.u32.u32 %r377,%r23; cvt.s64.s8 %r378,%r377; add.u64 %r379,%r378,-1; add.u64 %r380,%frame,120; add.u64 %r381,%frame,240; .loc 1 192 20 mov.u64 %r382,0; $L36: .loc 1 170 12 @ %r372 bra $L37; .loc 1 171 10 mov.u32 %r349,0; st.u32 [%r139],%r349; bra $L30; $L37: mov.u64 %r133,%r138; .loc 1 169 10 mov.u32 %r114,0; .loc 1 174 13 mov.u64 %r132,0; $L29: .loc 1 177 6 ld.s8 %r351,[%r133]; cvt.u16.u32 %r350,%r351; setp.eq.u16 %r352,%r350,0; @ %r352 bra $L31; .loc 1 178 11 add.u32 %r114,%r114,1; $L31: .loc 1 174 28 add.u64 %r132,%r132,1; .loc 1 174 36 add.u64 %r133,%r133,%r33; .loc 1 174 6 setp.gt.s64 %r353,%r128,%r132; @ 
%r353 bra $L29; .loc 1 180 12 st.u32 [%r139],%r114; $L30: .loc 1 184 15 add.u64 %r166,%r166,1; .loc 1 185 12 add.u64 %r138,%r138,%r87; .loc 1 186 12 add.u64 %r139,%r139,%r90; .loc 1 188 13 setp.ne.u64 %r354,%r68,%r166; @ %r354 bra $L36; .loc 1 195 16 add.u64 %r135,%r138,%r152; .loc 1 196 16 add.u64 %r136,%r139,%r163; .loc 1 198 14 @ %r373 bra $L1; add.u64 %r32,%frame,368; mov.u64 %r165,8; .loc 1 197 12 mov.u64 %r137,1; bra $L34; $L35: .loc 1 192 20 st.u64 [%r32],%r382; .loc 1 195 30 mul.lo.u64 %r360,%r101,%r100; .loc 1 195 16 sub.u64 %r135,%r138,%r360; .loc 1 196 30 mul.lo.u64 %r361,%r103,%r100; .loc 1 196 16 shl.b64 %r362,%r361,2; sub.u64 %r136,%r139,%r362; .loc 1 197 12 add.u64 %r137,%r137,1; .loc 1 198 14 add.u64 %r32,%r32,8; add.u64 %r165,%r165,8; setp.eq.u64 %r363,%r137,%r379; @ %r363 bra $L1; $L34: .loc 1 206 23 ld.u64 %r364,[%r32]; add.u64 %r100,%r364,1; st.u64 [%r32],%r100; .loc 1 207 30 add.u64 %r366,%r380,%r165; ld.u64 %r101,[%r366]; .loc 1 207 20 add.u64 %r138,%r135,%r101; .loc 1 208 30 add.u64 %r367,%frame,%r165; ld.u64 %r103,[%r367]; .loc 1 208 20 shl.b64 %r368,%r103,2; add.u64 %r139,%r136,%r368; .loc 1 188 32 add.u64 %r370,%r381,%r165; ld.u64 %r106,[%r370]; .loc 1 188 13 setp.eq.u64 %r371,%r100,%r106; @ %r371 bra $L35; .loc 1 192 20 mov.u64 %r166,0; bra $L36; $L1: .loc 1 212 1 ret; } count_8_l.o/_gfortran_count_8_l .visible .func _gfortran_count_8_lfortran/generated/count_8_l.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,32,105,110,32,67,79,85,78,84,32,105,110,116,114,105,110,115,105,99,0 }114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,67,79,85,78,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,10810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,67,79,85,78,84,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_count_8_l .visible .func _gfortran_count_8_l480u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .u64 %r232; .reg .u64 %r233; .reg .pred %r234; .reg .pred %r236; .reg .u64u64u64 %r257; .reg .u64 %r258; .reg .predu64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64pred %r320; .reg .u64u64 %r332; .reg .u64 %r333; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .pred %r338; .reg .pred %r339; .reg .pred %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64 %r345; .reg .u64 %r346; .reg .u64 %r347; .reg .u16 %r350; .reg .u32 %r351; .reg .pred %r352; .reg .pred %r353; .reg .pred %r354; .reg .u64pred %r363; .reg .u64u64 %r385; mov.u64 %r201,%ar0; mov.u64 %r202,%ar1; mov.u64 %r203,%ar2; .loc 1 56 10 ld.u64 %r22,[%r203]; .loc 1 56 7 add.u64 %r123,%r22,-1; .loc 1 57 10 ld.s8 %r23,[%r202+28]; .loc 1 57 38 add.u32 %r25,%r23,-1; .loc 1 57 8 cvt.s64.s32 %r124,%r25; .loc 1 59 14 ld.u64 %r26,[%r202+16]; .loc 1 61 9 add.u64 %r205,%r123,%r123; add.u64 %r206,%r205,%r123; shl.b64 %r207,%r206,3; add.u64 %r208,%r202,%r207; ld.u64 %r27,[%r208+56]; ld.u64 %r29,[%r208+48]; .loc 1 65 
11 ld.u64 %r30,[%r208+40]; .loc 1 67 3 setp.gt.s64 %r222,%r123,0; @ %r222 bra $L2; $L8: .loc 1 75 3 setp.lt.s64 %r223,%r123,%r124; @ %r223 bra $L3; bra $L4; $L2: add.u64 %r195,%r202,40; add.u64 %r51,%frame,120; add.u64 %r52,%frame,240; add.u64 %r225,%r22,%r22; add.u64 %r226,%r225,%r22; shl.b64 %r227,%r226,3; add.u64 %r228,%r202,16; add.u64 %r73,%r227,%r228; .loc 1 73 12 mov.u64 %r385,0; $L7: .loc 1 69 20 ld.u64 %r230,[%r195]; mul.lo.u64 %r229,%r230,%r26; .loc 1 69 18 st.u64 [%r51],%r229; .loc 1 70 19 ld.u64 %r232,[%r195+16]; add.u64 %r231,%r232,1; ld.u64 %r233,[%r195+8]; sub.u64 %r41,%r231,%r233; .loc 1 72 10 setp.lt.s64 %r234,%r41,0; @ %r234 bra $L5; .loc 1 70 17 st.u64 [%r52],%r41; bra $L6; $L5: .loc 1 73 12 st.u64 [%r52],%r385; $L6: .loc 1 67 3 add.u64 %r195,%r195,24; add.u64 %r51,%r51,8; add.u64 %r52,%r52,8; setp.ne.u64 %r236,%r73,%r195; @ %r236 bra $L7; bra $L8; $L4: .loc 1 84 6 ld.u64 %r237,[%r201]; setp.eq.u64 %r238,%r237,0; @ ! %r238 bra $L53; bra $L9; $L3: add.u64 %r240,%r22,%r22; add.u64 %r241,%r240,%r22; shl.b64 %r242,%r241,3; add.u64 %r243,%r242,40; add.u64 %r134,%r202,%r243; shl.b64 %r244,%r22,3; add.u64 %r110,%r244,-8; add.u64 %r375,%frame,120; add.u64 %r115,%r375,%r110; add.u64 %r374,%frame,240; add.u64 %r108,%r374,%r110; add.u64 %r247,%r202,40; cvt.u32.u32 %r249,%r23; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r198,%r247,%r253; .loc 1 81 12 mov.u64 %r384,0; $L13: .loc 1 77 20 ld.u64 %r255,[%r134]; mul.lo.u64 %r254,%r255,%r26; .loc 1 77 18 st.u64 [%r115],%r254; .loc 1 78 19 ld.u64 %r257,[%r134+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r134+8]; sub.u64 %r49,%r256,%r258; .loc 1 80 10 setp.lt.s64 %r259,%r49,0; @ %r259 bra $L11; .loc 1 78 17 st.u64 [%r108],%r49; bra $L12; $L11: .loc 1 81 12 st.u64 [%r108],%r384; $L12: .loc 1 75 3 add.u64 %r134,%r134,24; add.u64 %r115,%r115,8; add.u64 %r108,%r108,8; setp.ne.u64 %r261,%r134,%r198; @ %r261 bra $L13; bra $L4; $L9: .loc 1 88 7 setp.le.s32 %r262,%r25,0; @ %r262 bra $L14; add.u64 %r178,%r201,40; add.u64 %r176,%frame,240; add.u64 %r263,%r201,16; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r170,%r263,%r269; .loc 1 91 17 mov.u64 %r113,1; bra $L15; $L14: .loc 1 99 24 mov.u64 %r270,0; st.u64 [%r201+8],%r270; .loc 1 100 28 cvt.u16.u32 %r273,%r23; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r201+28],%r274; .loc 1 102 20 add.u32 %r275,%r23,-2; cvt.s64.s32 %r58,%r275; add.u64 %r277,%r58,%r58; add.u64 %r278,%r277,%r58; shl.b64 %r279,%r278,3; add.u64 %r280,%r201,%r279; .loc 1 102 67 shl.b64 %r282,%r58,3; add.u64 %r283,%frame,%r282; .loc 1 102 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r131,%r285,%r286; .loc 1 104 10 setp.eq.u64 %r287,%r131,0; @ %r287 bra $L16; bra $L54; $L18: .loc 1 93 55 mul.lo.u64 %r113,%r54,%r113; $L15: .loc 1 95 4 st.u64 [%r178+8],%r237; ld.u64 %r54,[%r176]; add.u64 %r289,%r54,-1; st.u64 [%r178+16],%r289; st.u64 [%r178],%r113; .loc 1 88 7 add.u64 %r178,%r178,24; add.u64 %r176,%r176,8; setp.ne.u64 %r290,%r170,%r178; @ %r290 bra $L18; bra $L14; $L16: .loc 1 107 4 st.u64 [%r201+48],%r131; mov.u64 %r292,-1; st.u64 [%r201+56],%r292; mov.u64 %r293,1; st.u64 [%r201+40],%r293; .loc 1 108 4 bra $L1; $L54: .loc 1 111 24 mov.u64 %r2951295; call (%value_in),_gfortrani_xmallocarray296,[%value_in]; } .loc 1 111 22 st.u64 [%r201],%r296; $L25: .loc 1 137 3 setp.gt.s32 %r298,%r25,0; @ ! 
%r298 bra $L26; add.u64 %r374,%frame,240; bra $L20; $L26: .loc 1 145 8 ld.u64 %r138,[%r202]; .loc 1 147 21 cvt.u32.u64 %r78,%r26; .loc 1 147 50 add.u32 %r299,%r78,-4; and.b32 %r300,%r299,-5; set.u32.eq.u32 %r302,%r300,0; neg.s32 %r303,%r302; .loc 1 147 21 add.u32 %r304,%r78,-1; set.u32.le.u32 %r306,%r304,1; neg.s32 %r307,%r306; .loc 1 147 6 cvt.u16.u32 %r309,%r303; cvt.u16.u32 %r310,%r307; orbra $L55; $L20: add.u64 %r188,%frame,360; add.u64 %r187,%r201,40; mov.u64 %r185,%frame; .loc 1 123 10 mov.u64 %r140,0; .loc 1 139 16 mov.u64 %r332,%r140; bra $L23; $L53: .loc 1 115 19 ld.s8 %r65,[%r201+28]; .loc 1 115 10 setp.eq.u64 %r314,%r65,%r124; @ %r314 bra $L24; .loc 1 116 2 st.u64 [%stack+8],%r124; st.u64 [%stack],%r65; cvta.const.u64 %r315315_gfortran_runtime_error24: .loc 1 121 11 cvta.global.u64 %r317,_gfortrani_compile_options; .loc 1 121 10 ld.u32 %r318,[%r317+36]; setp.eq.u32 %r319,%r318,0; @ %r319 bra $L25; .loc 1 123 4 setp.le.s64 %r320,%r124,0; @ %r320 bra $L26; add.u64 %r169,%r201,48; add.u64 %r374,%frame,240; mov.u64 %r164,%r374; add.u64 %r162,%r201,56; cvt.u32.u32 %r322,%r23; cvt.s64.s8 %r321,%r322; add.u64 %r144,%r321,-1; .loc 1 123 10 mov.u64 %r112,0; $L28: .loc 1 127 21 ld.u64 %r324,[%r162]; add.u64 %r323,%r324,1; .loc 1 127 19 ld.u64 %r325,[%r169]; sub.u64 %r130,%r323,%r325; .loc 1 128 18 ld.u64 %r72,[%r164]; .loc 1 128 11 setp.eq.u64 %r326,%r72,%r130; @ %r326 bra $L27; .loc 1 129 3 st.u64 [%stack+16],%r72; st.u64 [%stack+8],%r130; cvt.u32.u64 %r329,%r112; add.u32 %r328,%r329,1; st.u32 [%stack],%r328; cvta.const.u64 %r327,$LC1stack; call _gfortran_runtime_error27: .loc 1 123 25 add.u64 %r112,%r112,1; .loc 1 123 4 add.u64 %r169,%r169,24; add.u64 %r164,%r164,8; add.u64 %r162,%r162,24; setp.ne.u64 %r331,%r112,%r144; @ %r331 bra $L28; bra $L20; $L23: .loc 1 139 16 st.u64 [%r188],%r332; .loc 1 140 18 ld.u64 %r333,[%r187]; st.u64 [%r185],%r333; .loc 1 141 17 shl.b64 %r335,%r140,3; add.u64 %r336,%r374,%r335; .loc 1 141 10 ld.u64 %r337,[%r336]; setp.le.s64 %r338,%r337,0; @ %r338 bra $L1; .loc 1 137 26 add.u64 %r140,%r140,1; .loc 1 137 3 add.u64 %r188,%r188,8; add.u64 %r187,%r187,24; add.u64 %r185,%r185,8; setp.gt.s64 %r339,%r124,%r140; @ %r339 bra $L23; bra $L26; $L55: .loc 1 149 7 setp.eq.u32 %r341,%r78,16; @ %r341 bra $L21; .loc 1 157 5 cvta.const.u64 %r343,$LC2; mov.u64 %r342,3343; call _gfortrani_internal_error61 9 add.u64 %r344,%r27,1; .loc 1 61 7 sub.u64 %r126,%r344,%r29; max.s64 %r128,%r126,0; .loc 1 65 11 mul.lo.u64 %r33,%r30,%r26; .loc 1 159 8 ld.u64 %r139,[%r201]; ld.u64 %r166,[%frame+360]; .loc 1 185 22 ld.u64 %r87,[%frame+120]; .loc 1 186 22 ld.u64 %r89,[%frame]; .loc 1 186 12 shl.b64 %r90,%r89,3; .loc 1 188 32 ld.u64 %r68,[%frame+240]; .loc 1 195 30 mul.lo.u64 %r345,%r87,%r68; .loc 1 195 16 neg.s64 %r152,%r345; .loc 1 196 30 mul.lo.u64 %r346,%r89,%r68; .loc 1 196 16 shl.b64 %r347,%r346,3; neg.s64 %r163,%r347; setp.gt.s64 %r372,%r126,0; setp.le.s32 %r373,%r25,1; .loc 1 171 10 mov.u64 %r377,0; cvt.u32.u32 %r378,%r23; cvt.s64.s8 %r379,%r378; add.u64 %r380,%r379,-1; add.u64 %r381,%frame,120; add.u64 %r382,%frame,240; $L36: .loc 1 170 12 @ %r372 bra $L37; .loc 1 171 10 st.u64 [%r139],%r377; bra $L30; $L37: mov.u64 %r133,%r138; .loc 1 169 10 mov.u64 %r114,0; .loc 1 174 13 mov.u64 %r132,%r114; $L29: .loc 1 177 6 ld.s8 %r351,[%r133]; cvt.u16.u32 %r350,%r351; setp.eq.u16 %r352,%r350,0; @ %r352 bra $L31; .loc 1 178 11 add.u64 %r114,%r114,1; $L31: .loc 1 174 28 add.u64 %r132,%r132,1; .loc 1 174 36 add.u64 %r133,%r133,%r33; .loc 1 174 6 setp.gt.s64 %r353,%r128,%r132; @ %r353 bra $L29; 
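// Editorial annotation (assumption, based on libgfortran/generated/count_8_l.c):
// the $L29 loop just above walks one column of the logical source array
// (%r133 is the element pointer, stepped by the byte stride in %r33) and
// increments the running count %r114 for every nonzero element; the store
// that follows writes that count into the current element of the integer(8)
// result array. Rough C equivalent of the loop, as a sketch only:
//   GFC_INTEGER_8 result = 0;
//   for (n = 0; n < len; n++, src += delta)
//     if (*src)
//       result++;
//   *dest = result;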
.loc 1 180 12 st.u64 [%r139],%r114; $L30: .loc 1 184 15 add.u64 %r166,%r166,1; .loc 1 185 12 add.u64 %r138,%r138,%r87; .loc 1 186 12 add.u64 %r139,%r139,%r90; .loc 1 188 13 setp.ne.u64 %r354,%r68,%r166; @ %r354 bra $L36; .loc 1 195 16 add.u64 %r135,%r138,%r152; .loc 1 196 16 add.u64 %r136,%r139,%r163; .loc 1 198 14 @ %r373 bra $L1; add.u64 %r32,%frame,368; mov.u64 %r165,8; .loc 1 197 12 mov.u64 %r137,1; bra $L34; $L35: .loc 1 192 20 st.u64 [%r32],%r377; .loc 1 195 30 mul.lo.u64 %r360,%r101,%r100; .loc 1 195 16 sub.u64 %r135,%r138,%r360; .loc 1 196 30 mul.lo.u64 %r361,%r103,%r100; .loc 1 196 16 shl.b64 %r362,%r361,3; sub.u64 %r136,%r139,%r362; .loc 1 197 12 add.u64 %r137,%r137,1; .loc 1 198 14 add.u64 %r32,%r32,8; add.u64 %r165,%r165,8; setp.eq.u64 %r363,%r137,%r380; @ %r363 bra $L1; $L34: .loc 1 206 23 ld.u64 %r364,[%r32]; add.u64 %r100,%r364,1; st.u64 [%r32],%r100; .loc 1 207 30 add.u64 %r366,%r381,%r165; ld.u64 %r101,[%r366]; .loc 1 207 20 add.u64 %r138,%r135,%r101; .loc 1 208 30 add.u64 %r367,%frame,%r165; ld.u64 %r103,[%r367]; .loc 1 208 20 shl.b64 %r368,%r103,3; add.u64 %r139,%r136,%r368; .loc 1 188 32 add.u64 %r370,%r382,%r165; ld.u64 %r106,[%r370]; .loc 1 188 13 setp.eq.u64 %r371,%r100,%r106; @ %r371 bra $L35; .loc 1 192 20 mov.u64 %r166,0; bra $L36; $L1: .loc 1 212 1 ret; } count_16_l.o/_gfortran_count_16_l .visible .func _gfortran_count_16_lfortran/generated/count_16_l.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,32,105,110,32,67,79,85,78,84,32,105,110,116,114,105,110,115,105,99,0 }114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,67,79,85,78,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,10810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,67,79,85,78,84,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_count_16_l .visible .func _gfortran_count_16_l480u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .u64 %r232; .reg .u64 %r233; .reg .pred %r234; .reg .pred %r236; .reg .u64u64u64 %r257; .reg .u64 %r258; .reg .predu64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64pred %r320; .reg .u64u64 %r332; .reg .u64 %r333; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .pred %r338; .reg .pred %r339; .reg .pred %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64 %r345; .reg .u64 %r346; .reg .u64 %r347; .reg .u16 %r351; .reg .u32 %r352; .reg .pred %r353; .reg .u64 %r356; .reg .u64 %r359; .reg .u32 %r361; .reg .pred %r366; .reg .pred %r369; .reg .u64 %r375; .reg .u64pred %r396; .reg .u64 %r398; .reg .u64 %r399; .reg .u32 %r401; .reg .u64mov.u64 %r201,%ar0; mov.u64 %r202,%ar1; mov.u64 %r203,%ar2; .loc 1 56 10 ld.u64 %r22,[%r203]; .loc 1 56 7 add.u64 %r123,%r22,-1; .loc 1 57 10 ld.s8 %r23,[%r202+28]; .loc 1 57 38 add.u32 %r25,%r23,-1; .loc 1 57 8 cvt.s64.s32 %r124,%r25; .loc 1 59 14 ld.u64 %r26,[%r202+16]; .loc 1 61 9 add.u64 %r205,%r123,%r123; add.u64 %r206,%r205,%r123; shl.b64 
%r207,%r206,3; add.u64 %r208,%r202,%r207; ld.u64 %r27,[%r208+56]; ld.u64 %r29,[%r208+48]; .loc 1 65 11 ld.u64 %r30,[%r208+40]; .loc 1 67 3 setp.gt.s64 %r222,%r123,0; @ %r222 bra $L2; $L8: .loc 1 75 3 setp.lt.s64 %r223,%r123,%r124; @ %r223 bra $L3; bra $L4; $L2: add.u64 %r195,%r202,40; add.u64 %r51,%frame,120; add.u64 %r52,%frame,240; add.u64 %r225,%r22,%r22; add.u64 %r226,%r225,%r22; shl.b64 %r227,%r226,3; add.u64 %r228,%r202,16; add.u64 %r73,%r227,%r228; .loc 1 73 12 mov.u64 %r407,0; $L7: .loc 1 69 20 ld.u64 %r230,[%r195]; mul.lo.u64 %r229,%r230,%r26; .loc 1 69 18 st.u64 [%r51],%r229; .loc 1 70 19 ld.u64 %r232,[%r195+16]; add.u64 %r231,%r232,1; ld.u64 %r233,[%r195+8]; sub.u64 %r41,%r231,%r233; .loc 1 72 10 setp.lt.s64 %r234,%r41,0; @ %r234 bra $L5; .loc 1 70 17 st.u64 [%r52],%r41; bra $L6; $L5: .loc 1 73 12 st.u64 [%r52],%r407; $L6: .loc 1 67 3 add.u64 %r195,%r195,24; add.u64 %r51,%r51,8; add.u64 %r52,%r52,8; setp.ne.u64 %r236,%r73,%r195; @ %r236 bra $L7; bra $L8; $L4: .loc 1 84 6 ld.u64 %r237,[%r201]; setp.eq.u64 %r238,%r237,0; @ ! %r238 bra $L53; bra $L9; $L3: add.u64 %r240,%r22,%r22; add.u64 %r241,%r240,%r22; shl.b64 %r242,%r241,3; add.u64 %r243,%r242,40; add.u64 %r134,%r202,%r243; shl.b64 %r244,%r22,3; add.u64 %r110,%r244,-8; add.u64 %r394,%frame,120; add.u64 %r115,%r394,%r110; add.u64 %r393,%frame,240; add.u64 %r108,%r393,%r110; add.u64 %r247,%r202,40; cvt.u32.u32 %r249,%r23; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r198,%r247,%r253; .loc 1 81 12 mov.u64 %r406,0; $L13: .loc 1 77 20 ld.u64 %r255,[%r134]; mul.lo.u64 %r254,%r255,%r26; .loc 1 77 18 st.u64 [%r115],%r254; .loc 1 78 19 ld.u64 %r257,[%r134+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r134+8]; sub.u64 %r49,%r256,%r258; .loc 1 80 10 setp.lt.s64 %r259,%r49,0; @ %r259 bra $L11; .loc 1 78 17 st.u64 [%r108],%r49; bra $L12; $L11: .loc 1 81 12 st.u64 [%r108],%r406; $L12: .loc 1 75 3 add.u64 %r134,%r134,24; add.u64 %r115,%r115,8; add.u64 %r108,%r108,8; setp.ne.u64 %r261,%r134,%r198; @ %r261 bra $L13; bra $L4; $L9: .loc 1 88 7 setp.le.s32 %r262,%r25,0; @ %r262 bra $L14; add.u64 %r178,%r201,40; add.u64 %r176,%frame,240; add.u64 %r263,%r201,16; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r170,%r263,%r269; .loc 1 91 17 mov.u64 %r113,1; bra $L15; $L14: .loc 1 99 24 mov.u64 %r270,0; st.u64 [%r201+8],%r270; .loc 1 100 28 cvt.u16.u32 %r273,%r23; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r201+28],%r274; .loc 1 102 20 add.u32 %r275,%r23,-2; cvt.s64.s32 %r58,%r275; add.u64 %r277,%r58,%r58; add.u64 %r278,%r277,%r58; shl.b64 %r279,%r278,3; add.u64 %r280,%r201,%r279; .loc 1 102 67 shl.b64 %r282,%r58,3; add.u64 %r283,%frame,%r282; .loc 1 102 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r131,%r285,%r286; .loc 1 104 10 setp.eq.u64 %r287,%r131,0; @ %r287 bra $L16; bra $L54; $L18: .loc 1 93 55 mul.lo.u64 %r113,%r54,%r113; $L15: .loc 1 95 4 st.u64 [%r178+8],%r237; ld.u64 %r54,[%r176]; add.u64 %r289,%r54,-1; st.u64 [%r178+16],%r289; st.u64 [%r178],%r113; .loc 1 88 7 add.u64 %r178,%r178,24; add.u64 %r176,%r176,8; setp.ne.u64 %r290,%r170,%r178; @ %r290 bra $L18; bra $L14; $L16: .loc 1 107 4 st.u64 [%r201+48],%r131; mov.u64 %r292,-1; st.u64 [%r201+56],%r292; mov.u64 %r293,1; st.u64 [%r201+40],%r293; .loc 1 108 4 bra $L1; $L54: .loc 1 111 24 mov.u64 %r295,16295; call (%value_in),_gfortrani_xmallocarray296,[%value_in]; } .loc 1 111 22 st.u64 [%r201],%r296; $L25: .loc 1 
137 3 setp.gt.s32 %r298,%r25,0; @ ! %r298 bra $L26; add.u64 %r393,%frame,240; bra $L20; $L26: .loc 1 145 8 ld.u64 %r138,[%r202]; .loc 1 147 21 cvt.u32.u64 %r78,%r26; .loc 1 147 50 add.u32 %r299,%r78,-4; and.b32 %r300,%r299,-5; set.u32.eq.u32 %r302,%r300,0; neg.s32 %r303,%r302; .loc 1 147 21 add.u32 %r304,%r78,-1; set.u32.le.u32 %r306,%r304,1; neg.s32 %r307,%r306; .loc 1 147 6 cvt.u16.u32 %r309,%r303; cvt.u16.u32 %r310,%r307; orbra $L55; $L20: add.u64 %r188,%frame,360; add.u64 %r187,%r201,40; mov.u64 %r185,%frame; .loc 1 123 10 mov.u64 %r140,0; .loc 1 139 16 mov.u64 %r332,%r140; bra $L23; $L53: .loc 1 115 19 ld.s8 %r65,[%r201+28]; .loc 1 115 10 setp.eq.u64 %r314,%r65,%r124; @ %r314 bra $L24; .loc 1 116 2 st.u64 [%stack+8],%r124; st.u64 [%stack],%r65; cvta.const.u64 %r315315_gfortran_runtime_error24: .loc 1 121 11 cvta.global.u64 %r317,_gfortrani_compile_options; .loc 1 121 10 ld.u32 %r318,[%r317+36]; setp.eq.u32 %r319,%r318,0; @ %r319 bra $L25; .loc 1 123 4 setp.le.s64 %r320,%r124,0; @ %r320 bra $L26; add.u64 %r169,%r201,48; add.u64 %r393,%frame,240; mov.u64 %r164,%r393; add.u64 %r162,%r201,56; cvt.u32.u32 %r322,%r23; cvt.s64.s8 %r321,%r322; add.u64 %r144,%r321,-1; .loc 1 123 10 mov.u64 %r112,0; $L28: .loc 1 127 21 ld.u64 %r324,[%r162]; add.u64 %r323,%r324,1; .loc 1 127 19 ld.u64 %r325,[%r169]; sub.u64 %r130,%r323,%r325; .loc 1 128 18 ld.u64 %r72,[%r164]; .loc 1 128 11 setp.eq.u64 %r326,%r72,%r130; @ %r326 bra $L27; .loc 1 129 3 st.u64 [%stack+16],%r72; st.u64 [%stack+8],%r130; cvt.u32.u64 %r329,%r112; add.u32 %r328,%r329,1; st.u32 [%stack],%r328; cvta.const.u64 %r327,$LC1stack; call _gfortran_runtime_error27: .loc 1 123 25 add.u64 %r112,%r112,1; .loc 1 123 4 add.u64 %r169,%r169,24; add.u64 %r164,%r164,8; add.u64 %r162,%r162,24; setp.ne.u64 %r331,%r112,%r144; @ %r331 bra $L28; bra $L20; $L23: .loc 1 139 16 st.u64 [%r188],%r332; .loc 1 140 18 ld.u64 %r333,[%r187]; st.u64 [%r185],%r333; .loc 1 141 17 shl.b64 %r335,%r140,3; add.u64 %r336,%r393,%r335; .loc 1 141 10 ld.u64 %r337,[%r336]; setp.le.s64 %r338,%r337,0; @ %r338 bra $L1; .loc 1 137 26 add.u64 %r140,%r140,1; .loc 1 137 3 add.u64 %r188,%r188,8; add.u64 %r187,%r187,24; add.u64 %r185,%r185,8; setp.gt.s64 %r339,%r124,%r140; @ %r339 bra $L23; bra $L26; $L55: .loc 1 149 7 setp.eq.u32 %r341,%r78,16; @ %r341 bra $L21; .loc 1 157 5 cvta.const.u64 %r343,$LC2; mov.u64 %r342,3343; call _gfortrani_internal_error61 9 add.u64 %r344,%r27,1; .loc 1 61 7 sub.u64 %r126,%r344,%r29; max.s64 %r128,%r126,0; .loc 1 65 11 mul.lo.u64 %r33,%r30,%r26; .loc 1 159 8 ld.u64 %r139,[%r201]; ld.u64 %r166,[%frame+360]; .loc 1 185 22 ld.u64 %r87,[%frame+120]; .loc 1 186 22 ld.u64 %r89,[%frame]; .loc 1 186 12 shl.b64 %r90,%r89,4; .loc 1 188 32 ld.u64 %r68,[%frame+240]; .loc 1 195 30 mul.lo.u64 %r345,%r87,%r68; .loc 1 195 16 neg.s64 %r152,%r345; .loc 1 196 30 mul.lo.u64 %r346,%r89,%r68; .loc 1 196 16 shl.b64 %r347,%r346,4; neg.s64 %r163,%r347; setp.gt.s64 %r395,%r126,0; setp.le.s32 %r396,%r25,1; .loc 1 169 10 mov.u64 %r399,0; cvt.u32.u32 %r401,%r23; cvt.s64.s8 %r402,%r401; add.u64 %r403,%r402,-1; add.u64 %r404,%frame,120; add.u64 %r405,%frame,240; $L36: .loc 1 170 12 @ %r395 bra $L37; .loc 1 171 10 st.u64 [%r139],%r399; st.u64 [%r139+8],%r399; bra $L30; $L37: mov.u64 %r133,%r138; .loc 1 169 10 mov.u64 %r387,%r399; mov.u64 %r388,%r399; .loc 1 174 13 mov.u64 %r132,0; .loc 1 178 11 mov.u64 %r398,1; $L29: .loc 1 177 6 ld.s8 %r352,[%r133]; cvt.u16.u32 %r351,%r352; setp.eq.u16 %r353,%r351,0; @ %r353 bra $L31; .loc 1 178 11 add.u64 %r356,%r387,%r398; set.u32.lt.u64 
%r361,%r356,%r387; cvt.s64.s32 %r359,%r361; mov.u64 %r387,%r356; sub.u64 %r388,%r388,%r359; $L31: .loc 1 174 28 add.u64 %r132,%r132,1; .loc 1 174 36 add.u64 %r133,%r133,%r33; .loc 1 174 6 setp.gt.s64 %r366,%r128,%r132; @ %r366 bra $L29; .loc 1 180 12 st.u64 [%r139],%r387; st.u64 [%r139+8],%r388; $L30: .loc 1 184 15 add.u64 %r166,%r166,1; .loc 1 185 12 add.u64 %r138,%r138,%r87; .loc 1 186 12 add.u64 %r139,%r139,%r90; .loc 1 188 13 setp.ne.u64 %r369,%r68,%r166; @ %r369 bra $L36; .loc 1 195 16 add.u64 %r135,%r138,%r152; .loc 1 196 16 add.u64 %r136,%r139,%r163; .loc 1 198 14 @ %r396 bra $L1; add.u64 %r32,%frame,368; mov.u64 %r165,8; .loc 1 197 12 mov.u64 %r137,1; bra $L34; $L35: .loc 1 192 20 st.u64 [%r32],%r399; .loc 1 195 30 mul.lo.u64 %r375,%r101,%r100; .loc 1 195 16 sub.u64 %r135,%r138,%r375; .loc 1 196 30 mul.lo.u64 %r376,%r103,%r100; .loc 1 196 16 shl.b64 %r377,%r376,4; sub.u64 %r136,%r139,%r377; .loc 1 197 12 add.u64 %r137,%r137,1; .loc 1 198 14 add.u64 %r32,%r32,8; add.u64 %r165,%r165,8; setp.eq.u64 %r378,%r137,%r403; @ %r378 bra $L1; $L34: .loc 1 206 23 ld.u64 %r379,[%r32]; add.u64 %r100,%r379,1; st.u64 [%r32],%r100; .loc 1 207 30 add.u64 %r381,%r404,%r165; ld.u64 %r101,[%r381]; .loc 1 207 20 add.u64 %r138,%r135,%r101; .loc 1 208 30 add.u64 %r382,%frame,%r165; ld.u64 %r103,[%r382]; .loc 1 208 20 shl.b64 %r383,%r103,4; add.u64 %r139,%r136,%r383; .loc 1 188 32 add.u64 %r385,%r405,%r165; ld.u64 %r106,[%r385]; .loc 1 188 13 setp.eq.u64 %r386,%r100,%r106; @ %r386 bra $L35; .loc 1 192 20 mov.u64 %r166,0; bra $L36; $L1: .loc 1 212 1 ret; } maxloc0_4_i1.o/_gfortran_maxloc0_4_i1 .visible .func _gfortran_maxloc0_4_i1libgfortran/generated/maxloc0_4_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_4_i1 .visible .func _gfortran_mmaxloc0_4_i1_gfortran_smaxloc0_4_i1 .visible .func _gfortran_smaxloc0_4_i1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,65,88,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_4_i1 .visible .func _gfortran_maxloc0_4_i1stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[368u64 %r124; .reg .u32 %r125; .reg .u64u16 %r167; .reg .u16 %r168; .reg .predmov.u32 %r116,%ar2; .loc 1 50 10 ld.s8 %r22,[%r115+28]; .loc 1 50 8 cvt.u32.u32 %r117,%r22; cvt.s64.s8 %r69,%r117; .loc 1 51 6 setp.gt.s64 %r118,%r69,0; @ %r118 bra $L2; .loc 1 52 5 cvta.const.u64 %r1111_gfortran_runtime_errorr23,[%r114]; .loc 1 54 6 setp.ne.u64 %r121,%r23,0; @ %r121 bra $L3; .loc 1 56 7 st.u64 [%r114+48],%r23; add.u64 %r123,%r69,-1; st.u64 [%r114+56],%r123; mov.u64 %r124,1; st.u64 [%r114+40],%r124; .loc 1 57 28 cvt.u32.u64 %r125,%r124; st.u8 [%r114+28],%r125; .loc 1 58 24 st.u64 [%r114+8],%r23; .loc 1 59 29128; call (%value_in),_gfortrani_xmallocarray131,[%value_in]; } mov.u64 %r23,%r131; .loc 1 59 27 st.u64 [%r114],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r133,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r134,[%r133+36]; setp.eq.u32 %r135,%r134,0; @ %r135 bra $L4; .loc 1 64r115138; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r114]; $L4: .loc 1 
68 11 ld.u64 %r70,[%r114+40]; add.u64 %r38,%r115,40; mov.u64 %r43,%frame; add.u64 %r34,%frame,120; .loc 1 70 10 mov.u64 %r96,0; add.u64 %r189,%frame,240; .loc 1 74 16 mov.u64 %r146,%r96; bra $L7; $L22: mov.u64 %r96,%r75; $L7: .loc 1 72 18 ld.u64 %r139,[%r38]; st.u64 [%r43],%r139; .loc 1 73 19 ld.u64 %r141,[%r38+16]; add.u64 %r140,%r141,1; ld.u64 %r142,[%r38+8]; sub.u64 %r31,%r140,%r142; .loc 1 73 17 st.u64 [%r34],%r31; .loc 1 74 16 shl.b64 %r144,%r96,3; add.u64 %r145,%r189,%r144; st.u64 [%r145],%r146; .loc 1 75 10 setp.gt.s64 %r147,%r31,0; @ %r147 bra $L5; shl.b64 %r81,%r70,2; mov.u64 %r83,%r23; .loc 1 78 11 mov.u64 %r76,0; .loc 1 79 24 cvt.u32.u64 %r148,%r76; $L6: st.u32 [%r83],%r148; .loc 1 78 27 add.u64 %r76,%r76,1; .loc 1 78 4 add.u64 %r83,%r83,%r81; setp.ne.u64 %r149,%r69,%r76; @ %r149 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r75,%r96,1; .loc 1 70 3 add.u64 %r38,%r38,24; add.u64 %r43,%r43,8; add.u64 %r34,%r34,8; setp.ne.u64 %r150,%r69,%r75; @ %r150 bra $L22; .loc 1 84 8 ld.u64 %r61,[%r115]; shl.b64 %r78,%r70,2; mov.u64 %r89,%r23; .loc 1 87 10 mov.u64 %r74,0; .loc 1 88 23 mov.u32 %r151,1; $L8: st.u32 [%r89],%r151; mov.u64 %r97,%r74; .loc 1 87 26 add.u64 %r74,%r74,1; .loc 1 87 3 add.u64 %r89,%r89,%r78; setp.ne.u64 %r152,%r96,%r97; @ %r152 bra $L8; .loc 1 101 9 setp.ne.u64 %r153,%r61,0; @ ! %r153 bra $L1; .loc 1 149 19 ld.u64 %r47,[%frame]; .loc 1 151 34 ld.u64 %r50,[%frame+120]; cvt.u32.u32 %r155,%r22; cvt.s64.s8 %r154,%r155; shl.b64 %r156,%r154,3; add.u64 %r100,%r189,%r156; .loc 1 99 12 mov.u32 %r64,-128; setp.eq.u32 %r188,%r116,0; add.u64 %r190,%frame,248; .loc 1 157 13 mov.u64 %r193,0; .loc 1 174 32 add.u64 %r194,%frame,120; $L21: .loc 1 151 21 ld.u64 %r108,[%frame+240]; .loc 1 126 12 @ %r188 bra $L23; mov.u64 %r39,%r108; mov.u64 %r59,%r61; $L14: .loc 1 129 12 ld.s8 %r32,[%r59]; .loc 1 129 11 cvt.u16.u32 %r159,%r32; cvt.u16.u32 %r160,%r64; setp.lt.s16 %r161,%r159,%r160; @ %r161 bra $L12; mov.u64 %r41,%r190; mov.u64 %r62,%r23; mov.u64 %r103,%r39; $L13: .loc 1 133 35 cvt.u32.u64 %r164,%r103; add.u32 %r163,%r164,1; .loc 1 133 24 st.u32 [%r62],%r163; .loc 1 132 10 add.u64 %r62,%r62,%r78; setp.eq.u64 %r165,%r41,%r100; @ %r165 bra $L24; .loc 1 133 31 ld.u64 %r103,[%r41]; add.u64 %r41,%r41,8; bra $L13; $L24: mov.u32 %r64,%r32; $L12: .loc 1 135 12 add.u64 %r59,%r59,%r47; .loc 1 137 17 add.u64 %r39,%r39,1; .loc 1 137 10 st.u64 [%frame+240],%r39; setp.ne.u64 %r166,%r39,%r50; @ %r166 bra $L14; bra $L34; $L23: mov.u64 %r49,%r108; mov.u64 %r60,%r61; $L11: .loc 1 141 11 ld.s8 %r40,[%r60]; .loc 1 141 10 cvt.u16.u32 %r167,%r40; cvt.u16.u32 %r168,%r64; setp.le.s16 %r169,%r167,%r168; @ %r169 bra $L16; mov.u64 %r106,%r190; mov.u64 %r107,%r23; mov.u64 %r42,%r49; $L17: .loc 1 145 35 cvt.u32.u64 %r172,%r42; add.u32 %r171,%r172,1; .loc 1 145 24 st.u32 [%r107],%r171; .loc 1 144 4 add.u64 %r107,%r107,%r78; setp.eq.u64 %r173,%r100,%r106; @ %r173 bra $L25; .loc 1 145 31 ld.u64 %r42,[%r106]; add.u64 %r106,%r106,8; bra $L17; $L25: mov.u32 %r64,%r40; $L16: .loc 1 149 9 add.u64 %r60,%r60,%r47; .loc 1 151 14 add.u64 %r49,%r49,1; .loc 1 151 7 st.u64 [%frame+240],%r49; setp.ne.u64 %r174,%r49,%r50; @ %r174 bra $L11; .loc 1 149 9 sub.u64 %r175,%r50,%r108; mad.lo.u64 %r61,%r175,%r47,%r61; bra $L18; $L34: .loc 1 135 12 sub.u64 %r177,%r50,%r108; mad.lo.u64 %r61,%r177,%r47,%r61; $L18: mov.u64 %r77,%r190; .loc 1 151 34 mov.u64 %r58,%r50; mov.u64 %r56,%r47; .loc 1 152 9 mov.u64 %r63,0; $L20: .loc 1 157 13 st.u64 [%r77+-8],%r193; .loc 1 160 23 mul.lo.u64 %r53,%r56,%r58; mov.u64 %r98,%r63; .loc 1 161 5 add.u64 %r63,%r63,1; 
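// Editorial annotation (assumption, based on libgfortran/generated/maxloc0_4_i1.c):
// the $L20 loop here is the usual libgfortran "odometer" that advances the
// multi-dimensional counters after a column has been scanned: the current
// dimension's counter is reset to zero, the source pointer is rewound by
// extent*stride for the exhausted dimension, and the next dimension's counter
// is bumped, falling through to $L1 (return) once every dimension has wrapped.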
.loc 1 162 7 setp.eq.u64 %r181,%r98,%r96; @ %r181 bra $L1; .loc 1 170 16 ld.u64 %r182,[%r77]; add.u64 %r55,%r182,1; st.u64 [%r77],%r55; shl.b64 %r67,%r63,3; .loc 1 171 23 add.u64 %r183,%frame,%r67; ld.u64 %r56,[%r183]; .loc 1 171 13 sub.u64 %r184,%r56,%r53; add.u64 %r61,%r61,%r184; .loc 1 174 32 add.u64 %r186,%r194,%r67; ld.u64 %r58,[%r186]; .loc 1 174 7 add.u64 %r77,%r77,8; setp.eq.u64 %r187,%r55,%r58; @ %r187 bra $L20; bra $L21; $L_gfortran_mmaxloc0_4_i1 .visible .func _gfortran_mmaxloc0_4_i1reg .u64 %stack; mov.u64 %stack,0480pred %r168; .reg .predpredpredpredpred %r239; .reg .pred %r240; .reg .u32predpred %r295; mov.u64 %r164,%ar0; mov.u64 %r165,%ar1; mov.u64 %r166,%ar2; mov.u32 %r167,%ar3; .loc 1 202 6 setp.ne.u64 %r168,%r166,0; @ %r168 bra $L361617; call _gfortran_maxloc0_4_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r165+28]; .loc 1 209 6 setp.gt.s32 %r172,%r22,0; @ %r172 bra $L38; .loc 1 210 5 cvta.const.u64 %r173173_gfortran_runtime_error%r175,[%r164]; setp.ne.u64 %r176,%r175,0; @ %r176 bra $L39; .loc 1 214 7 st.u64 [%r164+48],%r175; add.u32 %r178,%r22,-1; cvt.s64.s32 %r179,%r178; st.u64 [%r164+56],%r179; mov.u64 %r180,1; st.u64 [%r164+40],%r180; .loc 1 215 28 cvt.u32.u64 %r181,%r180; st.u8 [%r164+28],%r181; .loc 1 216 24 st.u64 [%r164+8],%r175; .loc 1 217 29 cvt.u32.u32 %r186,%r22; cvt.s64.s8 %r185,%r18call (%value_in),_gfortrani_xmallocarray187,[%value_in]; } .loc 1 217 27 st.u64 [%r164],%r187; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r189,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r190,[%r189+36]; setp.eq.u32 %r191,%r190,0164r194; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19716r197194; call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r166+16]; .loc 1 233 9 ld.u64 %r93,[%r166]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r199,%r32,-4; and.b32 %r200,%r199,-5; set.u32.eq.u32 %r202,%r200,0; neg.s32 %r203,%r202; .loc 1 235 22 add.u32 %r204,%r32,-1; set.u32.le.u32 %r206,%r204,1; neg.s32 %r207,%r206; .loc 1 235 6 cvt.u16.u32 %r209,%r203; cvt.u16.u32 %r210,%r207; or.b16 %r208,%r209,%r210; cvt.u32.u16 %r211,%r208; cvt.u16.u8 %r212,%r211; setp.ne.u16 %r213,%r212,0; @ %r213 bra $L41; .loc 1 237 7 setp.ne.u32 %r215,%r32,16; @ %r215 bra $L42; $L41: .loc 1 244 11 ld.u64 %r106,[%r164+40]; .loc 1 245 8 ld.u64 %r107,[%r164]; .loc 1 246 17 cvt.u32.u32 %r216,%r22; cvt.s64.s8 %r65,%r216; add.u64 %r52,%r165,40; add.u64 %r285,%frame,120; mov.u64 %r115,%r285; add.u64 %r136,%r166,40; mov.u64 %r163,%frame; add.u64 %r160,%frame,240; add.u64 %r157,%frame,360; mov.u64 %r158,%r157; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r225,%r29; bra $L43; $L42: .loc 1 242 5 cvta.const.u64 %r217,$LC3stack; call _gfortran_runtime_error64: mov.u64 %r29,%r112; $L43: .loc 1 248 18 ld.u64 %r219,[%r52]; st.u64 [%r115],%r219; .loc 1 249 20 ld.u64 %r221,[%r136]; mul.lo.u64 %r220,%r221,%r31; .loc 1 249 18 st.u64 [%r163],%r220; .loc 1 250 19 ld.u64 %r223,[%r52+16]; add.u64 %r222,%r223,1; ld.u64 %r224,[%r52+8]; sub.u64 %r44,%r222,%r224; .loc 1 250 17 st.u64 [%r160],%r44; .loc 1 251 16 st.u64 [%r158],%r225; .loc 1 252 10 setp.gt.s64 %r226,%r44,0; @ %r226 bra $L44; shl.b64 %r147,%r106,2; mov.u64 %r137,%r107; .loc 1 255 11 mov.u64 %r113,0; .loc 1 256 24 cvt.u32.u64 %r227,%r113; $L45: st.u32 [%r137],%r227; .loc 1 255 27 add.u64 %r113,%r113,1; .loc 1 255 4 add.u64 %r137,%r137,%r147; setp.ne.u64 %r228,%r65,%r113; @ %r228 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r112,%r29,1; .loc 1 246 3 add.u64 %r52,%r52,24; add.u64 %r115,%r115,8; 
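// Editorial annotation (assumption, based on libgfortran/generated/maxloc0_4_i1.c):
// this per-dimension setup loop of _gfortran_mmaxloc0_4_i1 (the masked
// variant) runs once per array dimension, caching the source stride, mask
// stride, extent and a zeroed counter in the scratch arrays on %frame before
// the element scan; a non-positive extent short-circuits to the path that
// fills the whole result array with 0.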
add.u64 %r136,%r136,24; add.u64 %r163,%r163,8; add.u64 %r160,%r160,8; add.u64 %r158,%r158,8; setp.ne.u64 %r229,%r65,%r112; @ %r229 bra $L64; .loc 1 261 8 ld.u64 %r89,[%r165]; shl.b64 %r62,%r106,2; mov.u64 %r91,%r107; .loc 1 264 10 mov.u64 %r111,0; .loc 1 265 23 cvt.u32.u64 %r230,%r111; $L47: st.u32 [%r91],%r230; mov.u64 %r140,%r111; .loc 1 264 26 add.u64 %r111,%r111,1; .loc 1 264 3 add.u64 %r91,%r91,%r62; setp.ne.u64 %r231,%r29,%r140; @ %r231 bra $L47; .loc 1 276 9 setp.eq.u64 %r232,%r89,0; @ %r232 bra $L35; .loc 1 331 19 ld.u64 %r69,[%frame+120]; .loc 1 332 20 ld.u64 %r71,[%frame]; .loc 1 334 34 ld.u64 %r74,[%frame+240]; ld.u64 %r161,[%frame+360]; cvt.u32.u32 %r234,%r22; cvt.s64.s8 %r233,%r234; shl.b64 %r235,%r233,3; add.u64 %r125,%r235,%r157; .loc 1 274 12 mov.u32 %r98,-128; setp.eq.u32 %r287,1,0; add.u64 %r286,%frame,368; .loc 1 340 13 mov.u64 %r292,0; .loc 1 359 32 add.u64 %r293,%frame,240; .loc 1 308 12 setp.eq.u32 %r295,%r167,0; $L63: .loc 1 280 10 @ %r287 bra $L48; mov.u64 %r53,%r161; mov.u32 %r90,0; mov.u32 %r289,1; $L53: mov.u64 %r45,%r89; .loc 1 300 13 add.u64 %r89,%r45,%r69; .loc 1 284 11 ld.s8 %r238,[%r93]; cvt.u16.u32 %r237,%r238; setp.eq.u16 %r239,%r237,0; @ %r239 bra $L49; setp.eq.u32 %r240,%r90,0; selp.u64 %r161,%r161,%r53,%r240; .loc 1 294 16 ld.s8 %r98,[%r45]; mov.u64 %r135,%r286; mov.u64 %r148,%r107; mov.u64 %r159,%r161; $L52: .loc 1 296 33 cvt.u32.u64 %r243,%r159; add.u32 %r242,%r243,1; .loc 1 296 22 st.u32 [%r148],%r242; .loc 1 295 9 add.u64 %r148,%r148,%r62; setp.eq.u64 %r244,%r125,%r135; @ %r244 bra $L65; .loc 1 296 29 ld.u64 %r159,[%r135]; add.u64 %r135,%r135,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r93,%r93,%r71; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r90,%r289; .loc 1 303 4 setp.ne.u64 %r245,%r53,%r74; @ %r245 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r295 bra $L66; mov.u64 %r61,%r161; mov.u64 %r87,%r89; $L58: .loc 1 311 11 ld.s8 %r248,[%r93]; cvt.u16.u32 %r247,%r248; setp.eq.u16 %r249,%r247,0; @ %r249 bra $L56; .loc 1 311 22 ld.s8 %r57,[%r87]; .loc 1 311 19 cvt.u16.u32 %r250,%r57; cvt.u16.u32 %r251,%r98; setp.lt.s16 %r252,%r250,%r251; @ %r252 bra $L56; mov.u64 %r116,%r286; mov.u64 %r117,%r107; mov.u64 %r154,%r61; .loc 1 314 19 mov.u64 %r141,%r292; $L57: .loc 1 315 36 cvt.u32.u64 %r255,%r154; add.u32 %r254,%r255,1; .loc 1 315 25 st.u32 [%r117],%r254; .loc 1 314 35 add.u64 %r109,%r141,1; .loc 1 314 12 add.u64 %r117,%r117,%r62; setp.le.s64 %r256,%r29,%r141; @ %r256 bra $L67; .loc 1 315 32 ld.u64 %r154,[%r116]; add.u64 %r116,%r116,8; mov.u64 %r141,%r109; bra $L57; $L67: mov.u32 %r98,%r57; $L56: .loc 1 317 13 add.u64 %r87,%r87,%r69; .loc 1 319 11 add.u64 %r61,%r61,1; .loc 1 319 4 st.u64 [%frame+360],%r61; setp.ne.u64 %r257,%r61,%r74; @ %r257 bra $L58; .loc 1 317 13 sub.u64 %r258,%r74,%r161; mad.lo.u64 %r89,%r258,%r69,%r89; bra $L59; $L66: mov.u64 %r73,%r161; mov.u64 %r92,%r93; mov.u64 %r88,%r89; $L55: .loc 1 323 11 ld.s8 %r261,[%r92]; cvt.u16.u32 %r260,%r261; setp.eq.u16 %r262,%r260,0; @ %r262 bra $L60; .loc 1 323 22 ld.s8 %r64,[%r88]; .loc 1 323 19 cvt.u16.u32 %r263,%r64; cvt.u16.u32 %r264,%r98; setp.le.s16 %r265,%r263,%r264; @ %r265 bra $L60; mov.u64 %r97,%r286; mov.u64 %r100,%r107; mov.u64 %r149,%r73; .loc 1 326 12 mov.u64 %r142,%r292; $L61: .loc 1 327 36 cvt.u32.u64 %r268,%r149; add.u32 %r267,%r268,1; .loc 1 327 25 st.u32 [%r100],%r267; .loc 1 326 28 add.u64 %r108,%r142,1; .loc 1 326 5 add.u64 %r100,%r100,%r62; setp.le.s64 %r269,%r29,%r142; @ %r269 bra $L68; .loc 1 327 32 ld.u64 %r149,[%r97]; add.u64 %r97,%r97,8; mov.u64 %r142,%r108; bra $L61; $L68: mov.u32 
%r98,%r64; $L60: .loc 1 331 9 add.u64 %r88,%r88,%r69; .loc 1 332 10 add.u64 %r92,%r92,%r71; .loc 1 334 14 add.u64 %r73,%r73,1; .loc 1 334 7 st.u64 [%frame+360],%r73; setp.ne.u64 %r270,%r73,%r74; @ %r270 bra $L55; sub.u64 %r132,%r74,%r161; .loc 1 331 9 mad.lo.u64 %r89,%r69,%r132,%r89; .loc 1 332 10 mad.lo.u64 %r93,%r71,%r132,%r93; bra $L59; $L89: st.u64 [%frame+360],%r74; $L59: mov.u64 %r123,%r286; mov.u64 %r83,%r71; .loc 1 334 34 mov.u64 %r85,%r74; mov.u64 %r81,%r69; mov.u64 %r122,8; .loc 1 335 9 mov.u64 %r96,0; $L62: .loc 1 340 13 st.u64 [%r123+-8],%r292; .loc 1 343 23 mul.lo.u64 %r76,%r81,%r85; .loc 1 344 24 mul.lo.u64 %r78,%r85,%r83; mov.u64 %r99,%r96; .loc 1 345 5 add.u64 %r96,%r96,1; .loc 1 346 7 setp.eq.u64 %r275,%r29,%r99; @ %r275 bra $L35; .loc 1 354 16 ld.u64 %r276,[%r123]; add.u64 %r80,%r276,1; st.u64 [%r123],%r80; .loc 1 355 23 add.u64 %r278,%r285,%r122; ld.u64 %r81,[%r278]; .loc 1 355 13 sub.u64 %r279,%r81,%r76; add.u64 %r89,%r89,%r279; .loc 1 356 24 add.u64 %r280,%frame,%r122; ld.u64 %r83,[%r280]; .loc 1 356 14 sub.u64 %r281,%r83,%r78; add.u64 %r93,%r93,%r281; .loc 1 359 32 add.u64 %r283,%r293,%r122; ld.u64 %r85,[%r283]; .loc 1 359 7 add.u64 %r123,%r123,8; add.u64 %r122,%r122,8; setp.eq.u64 %r284,%r80,%r85; @ %r284 bra $L62; ld.u64 %r161,[%frame+360]; bra $L63; $L65: mov.u64 %r89,%r45; setp.eq.u32 %r287,1,1; bra $L48; $L35: .loc 1 _gfortran_smaxloc0_4_i1 .visible .func _gfortran_smaxloc0_4_i1u32pred.reg .u64u32 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 379 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 379 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 381 7 {_gfortran_maxloc0_4_.loc 1 382 7 bra $L90; $L92: .loc 1 385 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 390 15 ld.u64 %r24,[%r35]; .loc 1 390 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 392 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 393 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 394 24 st.u64 [%r35+8],%r24; .loc 1 395 29 mov.u64 %r56call (%value_in),_gfortrani_xmallocarray95 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 397 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 397 11 ld.u32 %r62,[%r61+36; .loc 1 399 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 404 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,2; mov.u64 %r29,%r24; .loc 1 405 10 mov.u64 %r32,0; .loc 1 406 23 cvt.u32.u64 %r68,%r32; $L97: st.u32 [%r29],%r68; .loc 1 405 24 add.u64 %r32,%r32,1; .loc 1 405 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L97; $L90: .loc 1 407 1 ret; } maxloc0_8_i1.o/_gfortran_maxloc0_8_i1 .visible .func _gfortran_maxloc0_8_i1libgfortran/generated/maxloc0_8_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_8_i1 .visible .func _gfortran_mmaxloc0_8_i1_gfortran_smaxloc0_8_i1 .visible .func _gfortran_smaxloc0_8_i1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = 
{77,65,88,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_8_i1 .visible .func _gfortran_maxloc0_8_i1stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[368pred %r145; .reg .pred %r146; .reg .u64 %r147; .reg .pred %r148; .reg .predu64 %r152; .reg .u16 %r155; .reg .u16predpredu64 %r188; mov.u64 %r110,%ar0; mov.u64 %r111,%ar1; mov.u32 %r112,%ar2; .loc 1 50 10 ld.s8 %r22,[%r111+28]; .loc 1 50 8 cvt.u32.u32 %r113,%r22; cvt.s64.s8 %r67,%r113; .loc 1 51 6 setp.gt.s64 %r114,%r67,0; @ %r114 bra $L2; .loc 1 52 5 cvta.const.u64 %r115115_gfortran_runtime_errorr23,[%r110]; .loc 1 54 6 setp.ne.u64 %r117,%r23,0; @ %r117 bra $L3; .loc 1 56 7 st.u64 [%r110+48],%r23; add.u64 %r119,%r67,-1; st.u64 [%r110+56],%r119; mov.u64 %r120,1; st.u64 [%r110+40],%r120; .loc 1 57 28 cvt.u32.u64 %r121,%r120; st.u8 [%r110+28],%r121; .loc 1 58 24 st.u64 [%r110+8],%r23; .loc 1 59 29 mov.u64 %r16call (%value_in),_gfortrani_xmallocarray127,[%value_in]; } mov.u64 %r23,%r127; .loc 1 59 27 st.u64 [%r110],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r129,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r130,[%r129+36]; setp.eq.u32 %r131,%r130,0; @ %r131 bra $L4; .loc 1 64110134; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r110]; $L4: .loc 1 68 11 ld.u64 %r68,[%r110+40]; add.u64 %r38,%r111,40; mov.u64 %r102,%frame; add.u64 %r103,%frame,120; .loc 1 70 10 mov.u64 %r94,0; add.u64 %r183,%frame,240; .loc 1 74 16 mov.u64 %r142,%r94; bra $L7; $L22: mov.u64 %r94,%r74; $L7: .loc 1 72 18 ld.u64 %r135,[%r38]; st.u64 [%r102],%r135; .loc 1 73 19 ld.u64 %r137,[%r38+16]; add.u64 %r136,%r137,1; ld.u64 %r138,[%r38+8]; sub.u64 %r31,%r136,%r138; .loc 1 73 17 st.u64 [%r103],%r31; .loc 1 74 16 shl.b64 %r140,%r94,3; add.u64 %r141,%r183,%r140; st.u64 [%r141],%r142; .loc 1 75 10 setp.gt.s64 %r143,%r31,0; @ %r143 bra $L5; shl.b64 %r76,%r68,3; mov.u64 %r82,%r23; .loc 1 78 11 mov.u64 %r75,0; .loc 1 79 24 mov.u64 %r144,%r75; $L6: st.u64 [%r82],%r144; .loc 1 78 27 add.u64 %r75,%r75,1; .loc 1 78 4 add.u64 %r82,%r82,%r76; setp.ne.u64 %r145,%r67,%r75; @ %r145 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r74,%r94,1; .loc 1 70 3 add.u64 %r38,%r38,24; add.u64 %r102,%r102,8; add.u64 %r103,%r103,8; setp.ne.u64 %r146,%r67,%r74; @ %r146 bra $L22; .loc 1 84 8 ld.u64 %r58,[%r111]; shl.b64 %r41,%r68,3; mov.u64 %r37,%r23; .loc 1 87 10 mov.u64 %r73,0; .loc 1 88 23 mov.u64 %r147,1; $L8: st.u64 [%r37],%r147; mov.u64 %r95,%r73; .loc 1 87 26 add.u64 %r73,%r73,1; .loc 1 87 3 add.u64 %r37,%r37,%r41; setp.ne.u64 %r148,%r94,%r95; @ %r148 bra $L8; .loc 1 101 9 setp.ne.u64 %r149,%r58,0; @ ! 
%r149 bra $L1; .loc 1 149 19 ld.u64 %r43,[%frame]; .loc 1 151 34 ld.u64 %r48,[%frame+120]; cvt.u32.u32 %r151,%r22; cvt.s64.s8 %r150,%r151; shl.b64 %r152,%r150,3; add.u64 %r69,%r183,%r152; .loc 1 99 12 mov.u32 %r63,-128; setp.eq.u32 %r182,%r112,0; add.u64 %r184,%frame,248; .loc 1 157 13 mov.u64 %r187,0; .loc 1 174 32 add.u64 %r188,%frame,120; $L21: .loc 1 151 21 ld.u64 %r100,[%frame+240]; .loc 1 126 12 @ %r182 bra $L23; mov.u64 %r39,%r100; mov.u64 %r56,%r58; $L14: .loc 1 129 12 ld.s8 %r32,[%r56]; .loc 1 129 11 cvt.u16.u32 %r155,%r32; cvt.u16.u32 %r156,%r63; setp.lt.s16 %r157,%r155,%r156; @ %r157 bra $L12; mov.u64 %r71,%r184; mov.u64 %r33,%r23; mov.u64 %r85,%r39; $L13: .loc 1 133 35 add.u64 %r159,%r85,1; .loc 1 133 24 st.u64 [%r33],%r159; .loc 1 132 10 add.u64 %r33,%r33,%r41; setp.eq.u64 %r160,%r69,%r71; @ %r160 bra $L24; .loc 1 133 31 ld.u64 %r85,[%r71]; add.u64 %r71,%r71,8; bra $L13; $L24: mov.u32 %r63,%r32; $L12: .loc 1 135 12 add.u64 %r56,%r56,%r43; .loc 1 137 17 add.u64 %r39,%r39,1; .loc 1 137 10 st.u64 [%frame+240],%r39; setp.ne.u64 %r161,%r39,%r48; @ %r161 bra $L14; bra $L34; $L23: mov.u64 %r46,%r100; mov.u64 %r57,%r58; $L11: .loc 1 141 11 ld.s8 %r40,[%r57]; .loc 1 141 10 cvt.u16.u32 %r162,%r40; cvt.u16.u32 %r163,%r63; setp.le.s16 %r164,%r162,%r163; @ %r164 bra $L16; mov.u64 %r80,%r184; mov.u64 %r98,%r23; mov.u64 %r81,%r46; $L17: .loc 1 145 35 add.u64 %r166,%r81,1; .loc 1 145 24 st.u64 [%r98],%r166; .loc 1 144 4 add.u64 %r98,%r98,%r41; setp.eq.u64 %r167,%r69,%r80; @ %r167 bra $L25; .loc 1 145 31 ld.u64 %r81,[%r80]; add.u64 %r80,%r80,8; bra $L17; $L25: mov.u32 %r63,%r40; $L16: .loc 1 149 9 add.u64 %r57,%r57,%r43; .loc 1 151 14 add.u64 %r46,%r46,1; .loc 1 151 7 st.u64 [%frame+240],%r46; setp.ne.u64 %r168,%r46,%r48; @ %r168 bra $L11; .loc 1 149 9 sub.u64 %r169,%r48,%r100; mad.lo.u64 %r58,%r169,%r43,%r58; bra $L18; $L34: .loc 1 135 12 sub.u64 %r171,%r48,%r100; mad.lo.u64 %r58,%r171,%r43,%r58; $L18: mov.u64 %r66,%r184; .loc 1 151 34 mov.u64 %r55,%r48; mov.u64 %r53,%r43; .loc 1 152 9 mov.u64 %r62,0; $L20: .loc 1 157 13 st.u64 [%r66+-8],%r187; .loc 1 160 23 mul.lo.u64 %r50,%r53,%r55; mov.u64 %r96,%r62; .loc 1 161 5 add.u64 %r62,%r62,1; .loc 1 162 7 setp.eq.u64 %r175,%r96,%r94; @ %r175 bra $L1; .loc 1 170 16 ld.u64 %r176,[%r66]; add.u64 %r52,%r176,1; st.u64 [%r66],%r52; shl.b64 %r61,%r62,3; .loc 1 171 23 add.u64 %r177,%frame,%r61; ld.u64 %r53,[%r177]; .loc 1 171 13 sub.u64 %r178,%r53,%r50; add.u64 %r58,%r58,%r178; .loc 1 174 32 add.u64 %r180,%r188,%r61; ld.u64 %r55,[%r180]; .loc 1 174 7 add.u64 %r66,%r66,8; setp.eq.u64 %r181,%r52,%r55; @ %r181 bra $L20; bra $L21; $L_gfortran_mmaxloc0_8_i1 .visible .func _gfortran_mmaxloc0_8_i1reg .u64 %stack; mov.u64 %stack,0480u32u64pred %r185; .reg .u64 %r188; .reg .u64u32 %r196; .reg .u32 %r197; .reg .u32u16pred %r223; .reg .u64 %r224; .reg .predu32 %r232; .reg .pred %r233; .reg .pred %r234; .reg .u64 %r236; .reg .predpred %r248; .reg .pred %r249; .reg .u64 %r250; .reg .u16 %r252; .reg .u32 %r253; .reg .predu64 %r274; .reg .pred %r275; .reg .u64mov.u64 %r158,%ar0; mov.u64 %r159,%ar1; mov.u64 %r160,%ar2; mov.u32 %r161,%ar3; .loc 1 202 6 setp.ne.u64 %r162,%r160,0; @ %r162 bra $L3615r159161; call _gfortran_maxloc0_8_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r159+28]; .loc 1 209 6 setp.gt.s32 %r166,%r22,0; @ %r166 bra $L38; .loc 1 210 5 cvta.const.u64 %r16167_gfortran_runtime_error%r169,[%r1539; .loc 1 214 7 st.u64 [%r158+48],%r169; add.u32 %r172,%r22,-1; cvt.s64.s32 %r173,%r172; st.u64 [%r158+56],%r173; mov.u64 %r174,1; st.u64 [%r158+40],%r174; 
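// Editorial annotation (assumption, based on libgfortran/generated/maxloc0_8_i1.c):
// when the caller passes an unallocated return array, _gfortran_mmaxloc0_8_i1
// fills in a rank-1 descriptor itself - the stores just above set stride 1,
// lower bound 0 and upper bound rank-1 for dim[0] - and then allocates the
// result data (rank elements of 8 bytes each, per the generated source) via
// _gfortrani_xmallocarray before proceeding to the scan.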
.loc 1 215 28 cvt.u32.u64 %r175,%r174; st.u8 [%r158+28],%r175; .loc 1 216 24 st.u64 [%r158+8],%r169; .loc 1 217 29 cvt.u32.u32 %r180,%r22; cvt.s64.s8 %r179,%r180; mov.u64 %r178call (%value_in),_gfortrani_xmallocarray181,[%value_in]; } .loc 1 217 27 st.u64 [%r158],%r181; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r183,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r184,[%r183+36]; setp.eq.u32 %r185,%r184,0r159188; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r191,$LC2; {15919188; call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r160+16]; .loc 1 233 9 ld.u64 %r91,[%r160]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r193,%r32,-4; and.b32 %r194,%r193,-5; set.u32.eq.u32 %r196,%r194,0; neg.s32 %r197,%r196; .loc 1 235 22 add.u32 %r198,%r32,-1; set.u32.le.u32 %r200,%r198,1; neg.s32 %r201,%r200; .loc 1 235 6 cvt.u16.u32 %r203,%r197; cvt.u16.u32 %r204,%r201; or.b16 %r202,%r203,%r204; cvt.u32.u16 %r205,%r202; cvt.u16.u8 %r206,%r205; setp.ne.u16 %r207,%r206,0; @ %r207 bra $L41; .loc 1 237 7 setp.ne.u32 %r209,%r32,16; @ %r209 bra $L42; $L41: .loc 1 244 11 ld.u64 %r103,[%r158+40]; .loc 1 245 8 ld.u64 %r104,[%r158]; .loc 1 246 17 cvt.u32.u32 %r210,%r22; cvt.s64.s8 %r67,%r210; add.u64 %r151,%r159,40; add.u64 %r278,%frame,120; mov.u64 %r147,%r278; add.u64 %r145,%r160,40; mov.u64 %r144,%frame; add.u64 %r140,%frame,240; add.u64 %r61,%frame,360; mov.u64 %r131,%r61; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r219,%r29; bra $L43; $L42: .loc 1 242 5_gfortran_runtime_error64: mov.u64 %r29,%r109; $L43: .loc 1 248 18 ld.u64 %r213,[%r151]; st.u64 [%r147],%r213; .loc 1 249 20 ld.u64 %r215,[%r145]; mul.lo.u64 %r214,%r215,%r31; .loc 1 249 18 st.u64 [%r144],%r214; .loc 1 250 19 ld.u64 %r217,[%r151+16]; add.u64 %r216,%r217,1; ld.u64 %r218,[%r151+8]; sub.u64 %r44,%r216,%r218; .loc 1 250 17 st.u64 [%r140],%r44; .loc 1 251 16 st.u64 [%r131],%r219; .loc 1 252 10 setp.gt.s64 %r220,%r44,0; @ %r220 bra $L44; shl.b64 %r134,%r103,3; mov.u64 %r125,%r104; .loc 1 255 11 mov.u64 %r110,0; .loc 1 256 24 mov.u64 %r221,%r110; $L45: st.u64 [%r125],%r221; .loc 1 255 27 add.u64 %r110,%r110,1; .loc 1 255 4 add.u64 %r125,%r125,%r134; setp.ne.u64 %r222,%r67,%r110; @ %r222 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r109,%r29,1; .loc 1 246 3 add.u64 %r151,%r151,24; add.u64 %r147,%r147,8; add.u64 %r145,%r145,24; add.u64 %r144,%r144,8; add.u64 %r140,%r140,8; add.u64 %r131,%r131,8; setp.ne.u64 %r223,%r67,%r109; @ %r223 bra $L64; .loc 1 261 8 ld.u64 %r88,[%r159]; shl.b64 %r156,%r103,3; mov.u64 %r154,%r104; .loc 1 264 10 mov.u64 %r108,0; .loc 1 265 23 mov.u64 %r224,%r108; $L47: st.u64 [%r154],%r224; mov.u64 %r137,%r108; .loc 1 264 26 add.u64 %r108,%r108,1; .loc 1 264 3 add.u64 %r154,%r154,%r156; setp.ne.u64 %r225,%r29,%r137; @ %r225 bra $L47; .loc 1 276 9 setp.eq.u64 %r226,%r88,0; @ %r226 bra $L35; .loc 1 331 19 ld.u64 %r69,[%frame+120]; .loc 1 332 20 ld.u64 %r71,[%frame]; .loc 1 334 34 ld.u64 %r74,[%frame+240]; ld.u64 %r153,[%frame+360]; cvt.u32.u32 %r228,%r22; cvt.s64.s8 %r227,%r228; shl.b64 %r229,%r227,3; add.u64 %r115,%r229,%r61; .loc 1 274 12 mov.u32 %r95,-128; setp.eq.u32 %r277,1,0; add.u64 %r276,%frame,368; .loc 1 340 13 mov.u64 %r283,0; .loc 1 359 32 add.u64 %r284,%frame,240; .loc 1 308 12 setp.eq.u32 %r286,%r161,0; $L63: .loc 1 280 10 @ %r277 bra $L48; mov.u64 %r53,%r153; mov.u32 %r89,0; mov.u32 %r280,1; $L53: mov.u64 %r45,%r88; .loc 1 300 13 add.u64 %r88,%r45,%r69; .loc 1 284 11 ld.s8 %r232,[%r910; @ %r233 bra $L49; setp.eq.u32 %r234,%r89,0; selp.u64 
%r153,%r153,%r53,%r234; .loc 1 294 16 ld.s8 %r95,[%r45]; mov.u64 %r124,%r276; mov.u64 %r136,%r104; mov.u64 %r149,%r153; $L52: .loc 1 296 33 add.u64 %r236,%r149,1; .loc 1 296 22 st.u64 [%r136],%r236; .loc 1 295 9 add.u64 %r136,%r136,%r156; setp.eq.u64 %r237,%r115,%r124; @ %r237 bra $L65; .loc 1 296 29 ld.u64 %r149,[%r124]; add.u64 %r124,%r124,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r91,%r91,%r71; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r89,%r280; .loc 1 303 4 setp.ne.u64 %r238,%r53,%r74; @ %r238 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r286 bra $L66; mov.u64 %r63,%r153; mov.u64 %r86,%r88; $L58: .loc 1 311 11 ld.s8 %r241,[%r91]; cvt.u16.u32 %r240,%r241; setp.eq.u16 %r242,%r240,0; @ %r242 bra $L56; .loc 1 311 22 ld.s8 %r59,[%r86]; .loc 1 311 19 cvt.u16.u32 %r243,%r59; cvt.u16.u32 %r244,%r95; setp.lt.s16 %r245,%r243,%r244; @ %r245 bra $L56; mov.u64 %r92,%r276; mov.u64 %r94,%r104; mov.u64 %r146,%r63; .loc 1 314 19 mov.u64 %r138,%r283; $L57: .loc 1 315 36 add.u64 %r247,%r146,1; .loc 1 315 25 st.u64 [%r94],%r247; .loc 1 314 35 add.u64 %r106,%r138,1; .loc 1 314 12 add.u64 %r94,%r94,%r156; setp.le.s64 %r248,%r29,%r138; @ %r248 bra $L67; .loc 1 315 32 ld.u64 %r146,[%r92]; add.u64 %r92,%r92,8; mov.u64 %r138,%r106; bra $L57; $L67: mov.u32 %r95,%r59; $L56: .loc 1 317 13 add.u64 %r86,%r86,%r69; .loc 1 319 11 add.u64 %r63,%r63,1; .loc 1 319 4 st.u64 [%frame+360],%r63; setp.ne.u64 %r249,%r63,%r74; @ %r249 bra $L58; .loc 1 317 13 sub.u64 %r250,%r74,%r153; mad.lo.u64 %r88,%r250,%r69,%r88; bra $L59; $L66: mov.u64 %r73,%r153; mov.u64 %r90,%r91; mov.u64 %r87,%r88; $L55: .loc 1 323 11 ld.s8 %r253,[%r90]; cvt.u16.u32 %r252,%r253; setp.eq.u16 %r254,%r252,0; @ %r254 bra $L60; .loc 1 323 22 ld.s8 %r66,[%r87]; .loc 1 323 19 cvt.u16.u32 %r255,%r66; cvt.u16.u32 %r256,%r95; setp.le.s16 %r257,%r255,%r256; @ %r257 bra $L60; mov.u64 %r111,%r276; mov.u64 %r56,%r104; mov.u64 %r142,%r73; .loc 1 326 12 mov.u64 %r139,%r283; $L61: .loc 1 327 36 add.u64 %r259,%r142,1; .loc 1 327 25 st.u64 [%r56],%r259; .loc 1 326 28 add.u64 %r105,%r139,1; .loc 1 326 5 add.u64 %r56,%r56,%r156; setp.le.s64 %r260,%r29,%r139; @ %r260 bra $L68; .loc 1 327 32 ld.u64 %r142,[%r111]; add.u64 %r111,%r111,8; mov.u64 %r139,%r105; bra $L61; $L68: mov.u32 %r95,%r66; $L60: .loc 1 331 9 add.u64 %r87,%r87,%r69; .loc 1 332 10 add.u64 %r90,%r90,%r71; .loc 1 334 14 add.u64 %r73,%r73,1; .loc 1 334 7 st.u64 [%frame+360],%r73; setp.ne.u64 %r261,%r73,%r74; @ %r261 bra $L55; sub.u64 %r121,%r74,%r153; .loc 1 331 9 mad.lo.u64 %r88,%r69,%r121,%r88; .loc 1 332 10 mad.lo.u64 %r91,%r71,%r121,%r91; bra $L59; $L89: st.u64 [%frame+360],%r74; $L59: mov.u64 %r113,%r276; mov.u64 %r83,%r71; .loc 1 334 34 mov.u64 %r85,%r74; mov.u64 %r81,%r69; mov.u64 %r112,8; .loc 1 335 9 mov.u64 %r93,0; $L62: .loc 1 340 13 st.u64 [%r113+-8],%r283; .loc 1 343 23 mul.lo.u64 %r76,%r81,%r85; .loc 1 344 24 mul.lo.u64 %r78,%r85,%r83; mov.u64 %r54,%r93; .loc 1 345 5 add.u64 %r93,%r93,1; .loc 1 346 7 setp.eq.u64 %r266,%r29,%r54; @ %r266 bra $L35; .loc 1 354 16 ld.u64 %r267,[%r113]; add.u64 %r80,%r267,1; st.u64 [%r113],%r80; .loc 1 355 23 add.u64 %r269,%r278,%r112; ld.u64 %r81,[%r269]; .loc 1 355 13 sub.u64 %r270,%r81,%r76; add.u64 %r88,%r88,%r270; .loc 1 356 24 add.u64 %r271,%frame,%r112; ld.u64 %r83,[%r271]; .loc 1 356 14 sub.u64 %r272,%r83,%r78; add.u64 %r91,%r91,%r272; .loc 1 359 32 add.u64 %r274,%r284,%r112; ld.u64 %r85,[%r274]; .loc 1 359 7 add.u64 %r113,%r113,8; add.u64 %r112,%r112,8; setp.eq.u64 %r275,%r80,%r85; @ %r275 bra $L62; ld.u64 %r153,[%frame+360]; bra $L63; $L65: mov.u64 %r88,%r45; 
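For orientation, a rough C sketch of what the maxloc0_<rkind>_i<akind> entry points appear to compute, flattened to rank 1. The names and the comments follow the usual libgfortran naming conventions (mmaxloc0_* takes a logical mask array, smaxloc0_* a scalar mask); this is an illustrative reconstruction under those assumptions, not the libgfortran source.

/* Illustrative sketch only (assumed semantics, rank flattened to 1):
 * maxloc0_8_i1 style - int8_t elements, kind=8 (int64_t) result position. */
#include <stdint.h>
#include <stddef.h>

static void maxloc0_8_i1_sketch(int64_t *pos, const int8_t *a, size_t n)
{
    int8_t maxval = INT8_MIN;   /* the PTX seeds the running max with -128 */
    *pos = 0;                   /* 0 means no element selected yet */
    for (size_t i = 0; i < n; i++) {
        if (a[i] > maxval) {    /* ">" loop; a second ">=" loop appears to
                                   handle the BACK argument */
            maxval = a[i];
            *pos = (int64_t)(i + 1);   /* Fortran positions are 1-based */
        }
    }
}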
_gfortran_smaxloc0_8_i1 .visible .func _gfortran_smaxloc0_8_i1
[garbled body, .loc lines 379-407: if the scalar MASK is absent or true the routine forwards to _gfortran_maxloc0_8_*; otherwise it allocates (or bounds-checks via _gfortrani_bounds_iforeach_return) a rank-sized result and zero-fills it]

maxloc0_16_i1.o/  (source: "libgfortran/generated/maxloc0_16_i1.c")
// BEGIN GLOBAL FUNCTION DECL: _gfortran_maxloc0_16_i1   .visible .func _gfortran_maxloc0_16_i1
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_16_i1  .visible .func _gfortran_mmaxloc0_16_i1
// BEGIN GLOBAL FUNCTION DECL: _gfortran_smaxloc0_16_i1  .visible .func _gfortran_smaxloc0_16_i1
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options  .extern .global .align 8 .u64 _gfortrani_compile_options
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error  .extern .func _gfortran_runtime_error
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarray  .extern .func _gfortrani_xmallocarray
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return  .extern .func _gfortrani_bounds_iforeach_return
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents  .extern .func _gfortrani_bounds_equal_extents
[string constants stored as decimal byte arrays ($LC labels); the readable prefixes decode to "Rank of array needs to be > 0", "MAX...", "MASK argumen...", and "...unny sized logical a..." - see the decoding sketch below]

_gfortran_maxloc0_16_i1 .visible .func _gfortran_maxloc0_16_i1
[garbled register declarations and body, local frame %frame_ar[368], .loc lines 50 onward: rank check (else _gfortran_runtime_error), allocation of a 16-byte-element result via _gfortrani_xmallocarray or a bounds check via _gfortrani_bounds_iforeach_return, extent and stride setup, zero- or one-initialisation of the result, and the start of the compare loops]
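The decimal byte arrays above are the string constants that the error paths pass to _gfortran_runtime_error and the bounds-check helpers. A minimal C sketch that decodes the first array back to text; the bytes are copied verbatim from the dump, and no $LC label is assumed because the label-to-array association is not recoverable here.

#include <stdio.h>

int main(void)
{
    /* bytes copied from the first string constant embedded above */
    static const unsigned char msg[] = {
        82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,
        101,100,115,32,116,111,32,98,101,32,62,32,48
    };
    for (size_t i = 0; i < sizeof msg; i++)
        putchar(msg[i]);        /* prints: Rank of array needs to be > 0 */
    putchar('\n');
    return 0;
}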
[garbled PTX: remainder of the _gfortran_maxloc0_16_i1 compare loops and its stride-advance outer loop, then]

_gfortran_mmaxloc0_16_i1 .visible .func _gfortran_mmaxloc0_16_i1
[garbled register declarations and body, .loc lines 202-359: same structure as _gfortran_mmaxloc0_8_i1 above, but with 16-byte result elements; each position is stored as a 128-bit pair, low word plus sign-extended high word]
[garbled PTX: tail of the _gfortran_mmaxloc0_16_i1 masked loops, then]

_gfortran_smaxloc0_16_i1 .visible .func _gfortran_smaxloc0_16_i1
[garbled body, .loc lines 379-407: the same scalar-mask wrapper as _gfortran_smaxloc0_8_i1, forwarding to _gfortran_maxloc0_16_* or zero-filling a 16-byte-element result]

maxloc0_4_i2.o/  (source: "libgfortran/generated/maxloc0_4_i2.c")
[same global declarations as maxloc0_16_i1.o above: _gfortran_maxloc0_4_i2, _gfortran_mmaxloc0_4_i2 and _gfortran_smaxloc0_4_i2 as .visible .func, the extern declarations of _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return and _gfortrani_bounds_equal_extents, and the same byte-array string constants]

_gfortran_maxloc0_4_i2 .visible .func _gfortran_maxloc0_4_i2
[garbled register declarations and body, local frame %frame_ar[368], .loc lines 50-174: the same layout as the i1 variants, but reading 16-bit source elements (ld.s16, running maximum seeded with -32768) and writing 32-bit result positions (st.u32, element size 4)]

_gfortran_mmaxloc0_4_i2 .visible .func _gfortran_mmaxloc0_4_i2
[garbled register declarations and the start of the body, .loc lines 202 onward: mask-descriptor check with forwarding to _gfortran_maxloc0_4_*, rank check, and the beginning of result allocation / bounds checking]
[garbled PTX: continuation of the _gfortran_mmaxloc0_4_i2 body - result allocation, extent and stride setup, and the masked compare loops over the 16-bit source elements]
[garbled PTX: tail of the _gfortran_mmaxloc0_4_i2 loops, then]

_gfortran_smaxloc0_4_i2 .visible .func _gfortran_smaxloc0_4_i2
[garbled body, .loc lines 379-407: scalar-mask wrapper forwarding to _gfortran_maxloc0_4_* or zero-filling a 4-byte-element result]

maxloc0_8_i2.o/  (source: "libgfortran/generated/maxloc0_8_i2.c")
[same global declarations and byte-array string constants as the members above]

_gfortran_maxloc0_8_i2 .visible .func _gfortran_maxloc0_8_i2
[garbled register declarations and body, local frame %frame_ar[368], .loc lines 50-174: 16-bit source elements (ld.s16, running maximum seeded with -32768) and 64-bit result positions (st.u64)]
[garbled PTX: tail of the _gfortran_maxloc0_8_i2 outer loop, then]

_gfortran_mmaxloc0_8_i2 .visible .func _gfortran_mmaxloc0_8_i2
[garbled register declarations and body, .loc lines 202-359: the same masked-variant structure as above, for 16-bit source elements and 64-bit result positions]
[garbled PTX: tail of the _gfortran_mmaxloc0_8_i2 loops, then]

_gfortran_smaxloc0_8_i2 .visible .func _gfortran_smaxloc0_8_i2
[garbled body, .loc lines 379-407: scalar-mask wrapper forwarding to _gfortran_maxloc0_8_* or zero-filling an 8-byte-element result]

maxloc0_16_i2.o/  (source: "libgfortran/generated/maxloc0_16_i2.c")
[same global declarations and byte-array string constants as the members above]

_gfortran_maxloc0_16_i2 .visible .func _gfortran_maxloc0_16_i2
[garbled register declarations and body, local frame %frame_ar[368], .loc lines 50 onward: 16-bit source elements, 16-byte result positions stored as sign-extended 128-bit pairs, and the start of the compare loops]
%r155 bra $L1; .loc 1 149 19 ld.u64 %r47,[%frame]; .loc 1 149 9 add.u64 %r49,%r47,%r47; .loc 1 151 34 ld.u64 %r51,[%frame+120]; cvt.u32.u32 %r158,%r22; cvt.s64.s8 %r157,%r158; shl.b64 %r159,%r157,3; add.u64 %r89,%r193,%r159; .loc 1 99 12 mov.u32 %r66,-32768; setp.eq.u32 %r192,%r116,0; add.u64 %r194,%frame,248; .loc 1 157 13 mov.u64 %r197,0; .loc 1 174 32 add.u64 %r198,%frame,120; $L21: .loc 1 151 21 ld.u64 %r107,[%frame+240]; .loc 1 126 12 @ %r192 bra $L23; mov.u64 %r41,%r107; mov.u64 %r60,%r62; $L14: .loc 1 129 12 ld.s16 %r34,[%r60]; .loc 1 129 11 cvt.u16.u32 %r162,%r34; cvt.u16.u32 %r163,%r66; setp.lt.s16 %r164,%r162,%r163; @ %r164 bra $L12; mov.u64 %r27,%r194; mov.u64 %r63,%r23; mov.u64 %r103,%r41; $L13: .loc 1 133 35 add.u64 %r166,%r103,1; st.u64 [%r63],%r166; shr.s64 %r167,%r166,63; st.u64 [%r63+8],%r167; .loc 1 132 10 add.u64 %r63,%r63,%r38; setp.eq.u64 %r168,%r27,%r89; @ %r168 bra $L24; .loc 1 133 31 ld.u64 %r103,[%r27]; add.u64 %r27,%r27,8; bra $L13; $L24: mov.u32 %r66,%r34; $L12: .loc 1 135 12 add.u64 %r60,%r60,%r49; .loc 1 137 17 add.u64 %r41,%r41,1; .loc 1 137 10 st.u64 [%frame+240],%r41; setp.ne.u64 %r169,%r41,%r51; @ %r169 bra $L14; bra $L34; $L23: mov.u64 %r50,%r107; mov.u64 %r61,%r62; $L11: .loc 1 141 11 ld.s16 %r42,[%r61]; .loc 1 141 10 cvt.u16.u32 %r170,%r42; cvt.u16.u32 %r171,%r66; setp.le.s16 %r172,%r170,%r171; @ %r172 bra $L16; mov.u64 %r104,%r194; mov.u64 %r105,%r23; mov.u64 %r84,%r50; $L17: .loc 1 145 35 add.u64 %r174,%r84,1; st.u64 [%r105],%r174; shr.s64 %r175,%r174,63; st.u64 [%r105+8],%r175; .loc 1 144 4 add.u64 %r105,%r105,%r38; setp.eq.u64 %r176,%r89,%r104; @ %r176 bra $L25; .loc 1 145 31 ld.u64 %r84,[%r104]; add.u64 %r104,%r104,8; bra $L17; $L25: mov.u32 %r66,%r42; $L16: .loc 1 149 9 add.u64 %r61,%r61,%r49; .loc 1 151 14 add.u64 %r50,%r50,1; .loc 1 151 7 st.u64 [%frame+240],%r50; setp.ne.u64 %r177,%r50,%r51; @ %r177 bra $L11; .loc 1 149 9 sub.u64 %r178,%r51,%r107; mad.lo.u64 %r62,%r178,%r49,%r62; bra $L18; $L34: .loc 1 135 12 sub.u64 %r180,%r51,%r107; mad.lo.u64 %r62,%r180,%r49,%r62; $L18: mov.u64 %r77,%r194; .loc 1 151 34 mov.u64 %r59,%r51; mov.u64 %r57,%r47; .loc 1 152 9 mov.u64 %r65,0; $L20: .loc 1 157 13 st.u64 [%r77+-8],%r197; .loc 1 160 23 mul.lo.u64 %r54,%r57,%r59; mov.u64 %r101,%r65; .loc 1 161 5 add.u64 %r65,%r65,1; .loc 1 162 7 setp.eq.u64 %r184,%r101,%r100; @ %r184 bra $L1; .loc 1 170 16 ld.u64 %r185,[%r77]; add.u64 %r56,%r185,1; st.u64 [%r77],%r56; shl.b64 %r68,%r65,3; .loc 1 171 23 add.u64 %r186,%frame,%r68; ld.u64 %r57,[%r186]; .loc 1 171 13 sub.u64 %r187,%r57,%r54; add.u64 %r188,%r187,%r187; add.u64 %r62,%r62,%r188; .loc 1 174 32 add.u64 %r190,%r198,%r68; ld.u64 %r59,[%r190]; .loc 1 174 7 add.u64 %r77,%r77,8; setp.eq.u64 %r191,%r56,%r59; @ %r191 bra $L20; bra $L21; $L_gfortran_mmaxloc0_16_i2 .visible .func _gfortran_mmaxloc0_16_i2reg .u64 %stack; mov.u64 %stack,048098predpredpredu16predpred %r246; .reg .pred %r247; .reg .u16 %r249; .reg .u32 %r250; .reg .predpredpredu64 %r280; .reg .u64 %r281; .reg .u64predpred %r298; mov.u64 %r163,%ar0; mov.u64 %r164,%ar1; mov.u64 %r165,%ar2; mov.u32 %r166,%ar3; .loc 1 202 6 setp.ne.u64 %r167,%r165,0; @ %r167 bra $L3616r1166; call _gfortran_maxloc0_16_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r164+28]; .loc 1 209 6 setp.gt.s32 %r171,%r22,0; @ %r171 bra $L38; .loc 1 210 5 cvta.const.u64 %r172172_gfortran_runtime_error%r174,[%r163]; setp.ne.u64 %r175,%r174,0; @ %r175 bra $L39; .loc 1 214 7 st.u64 [%r163+48],%r174; add.u32 %r177,%r22,-1; cvt.s64.s32 %r178,%r177; st.u64 [%r163+56],%r178; mov.u64 
%r179,1; st.u64 [%r163+40],%r179; .loc 1 215 28 cvt.u32.u64 %r180,%r179; st.u8 [%r163+28],%r180; .loc 1 216 24 st.u64 [%r163+8],%r174; .loc 1 217 29 cvt.u32.u32 %r185,%r22; cvt.s64.s8 %r184,%r185; mov.u64 %r183,16call (%value_in),_gfortrani_xmallocarray186,[%value_in]; } .loc 1 217 27 st.u64 [%r163],%r186; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r188,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r189,[%r188+36]; setp.eq.u32 %r190,%r189,0r164193; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r196193; call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r165+16]; .loc 1 233 9 ld.u64 %r90,[%r165]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r198,%r32,-4; and.b32 %r199,%r198,-5; set.u32.eq.u32 %r201,%r199,0; neg.s32 %r202,%r201; .loc 1 235 22 add.u32 %r203,%r32,-1; set.u32.le.u32 %r205,%r203,1; neg.s32 %r206,%r205; .loc 1 235 6 cvt.u16.u32 %r208,%r202; cvt.u16.u32 %r209,%r206; or.b16 %r207,%r208,%r209;41; .loc 1 237 7 setp.ne.u32 %r214,%r32,16; @ %r214 bra $L42; $L41: .loc 1 244 11 ld.u64 %r102,[%r163+40]; .loc 1 245 8 ld.u64 %r103,[%r163]; .loc 1 246 17 cvt.u32.u32 %r215,%r22; cvt.s64.s8 %r65,%r215; add.u64 %r161,%r164,40; add.u64 %r289,%frame,120; mov.u64 %r159,%r289; add.u64 %r158,%r165,40; mov.u64 %r154,%frame; add.u64 %r155,%frame,240; add.u64 %r151,%frame,360; mov.u64 %r150,%r151; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r224,%r29; bra $L43; $L42: .loc 1 242 5_gfortran_runtime_error64: mov.u64 %r29,%r109; $L43: .loc 1 248 18 ld.u64 %r218,[%r161]; st.u64 [%r159],%r218; .loc 1 249 20 ld.u64 %r220,[%r158]; mul.lo.u64 %r219,%r220,%r31; .loc 1 249 18 st.u64 [%r154],%r219; .loc 1 250 19 ld.u64 %r222,[%r161+16]; add.u64 %r221,%r222,1; ld.u64 %r223,[%r161+8]; sub.u64 %r44,%r221,%r223; .loc 1 250 17 st.u64 [%r155],%r44; .loc 1 251 16 st.u64 [%r150],%r224; .loc 1 252 10 setp.gt.s64 %r225,%r44,0; @ %r225 bra $L44; shl.b64 %r144,%r102,4; mov.u64 %r133,%r103; .loc 1 255 11 mov.u64 %r110,0; .loc 1 256 24 mov.u64 %r226,%r110; $L45: st.u64 [%r133],%r226; st.u64 [%r133+8],%r226; .loc 1 255 27 add.u64 %r110,%r110,1; .loc 1 255 4 add.u64 %r133,%r133,%r144; setp.ne.u64 %r228,%r65,%r110; @ %r228 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r109,%r29,1; .loc 1 246 3 add.u64 %r161,%r161,24; add.u64 %r159,%r159,8; add.u64 %r158,%r158,24; add.u64 %r154,%r154,8; add.u64 %r155,%r155,8; add.u64 %r150,%r150,8; setp.ne.u64 %r229,%r65,%r109; @ %r229 bra $L64; .loc 1 261 8 ld.u64 %r88,[%r164]; shl.b64 %r112,%r102,4; mov.u64 %r52,%r103; .loc 1 264 10 mov.u64 %r108,0; .loc 1 265 23 mov.u64 %r230,%r108; $L47: st.u64 [%r52],%r230; st.u64 [%r52+8],%r230; mov.u64 %r140,%r108; .loc 1 264 26 add.u64 %r108,%r108,1; .loc 1 264 3 add.u64 %r52,%r52,%r112; setp.ne.u64 %r232,%r29,%r140; @ %r232 bra $L47; .loc 1 276 9 setp.eq.u64 %r233,%r88,0; @ %r233 bra $L35; .loc 1 331 19 ld.u64 %r68,[%frame+120]; .loc 1 331 9 add.u64 %r70,%r68,%r68; .loc 1 332 20 ld.u64 %r71,[%frame]; .loc 1 334 34 ld.u64 %r74,[%frame+240]; ld.u64 %r160,[%frame+360]; cvt.u32.u32 %r236,%r22; cvt.s64.s8 %r235,%r236; shl.b64 %r237,%r235,3; add.u64 %r122,%r237,%r151; .loc 1 274 12 mov.u32 %r95,-32768; setp.eq.u32 %r288,1,0; add.u64 %r290,%frame,368; .loc 1 340 13 mov.u64 %r295,0; .loc 1 359 32 add.u64 %r296,%frame,240; .loc 1 308 12 setp.eq.u32 %r298,%r166,0; $L63: .loc 1 280 10 @ %r288 bra $L48; mov.u64 %r53,%r160; mov.u32 %r46,0; mov.u32 %r292,1; $L53: mov.u64 %r45,%r88; .loc 1 300 13 add.u64 %r88,%r45,%r70; .loc 1 284 11 ld.s8 %r240,[%r90]; cvt.u16.u32 %r239,%r240; setp.eq.u16 
%r241,%r239,0; @ %r241 bra $L49; setp.eq.u32 %r242,%r46,0; selp.u64 %r160,%r160,%r53,%r242; .loc 1 294 16 ld.s16 %r95,[%r45]; mov.u64 %r131,%r290; mov.u64 %r145,%r103; mov.u64 %r157,%r160; $L52: .loc 1 296 33 add.u64 %r244,%r157,1; st.u64 [%r145],%r244; shr.s64 %r245,%r244,63; st.u64 [%r145+8],%r245; .loc 1 295 9 add.u64 %r145,%r145,%r112; setp.eq.u64 %r246,%r122,%r131; @ %r246 bra $L65; .loc 1 296 29 ld.u64 %r157,[%r131]; add.u64 %r131,%r131,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r90,%r90,%r71; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r46,%r292; .loc 1 303 4 setp.ne.u64 %r247,%r53,%r74; @ %r247 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r298 bra $L66; mov.u64 %r61,%r160; mov.u64 %r86,%r88; $L58: .loc 1 311 11 ld.s8 %r250,[%r90]; cvt.u16.u32 %r249,%r250; setp.eq.u16 %r251,%r249,0; @ %r251 bra $L56; .loc 1 311 22 ld.s16 %r57,[%r86]; .loc 1 311 19 cvt.u16.u32 %r252,%r57; cvt.u16.u32 %r253,%r95; setp.lt.s16 %r254,%r252,%r253; @ %r254 bra $L56; mov.u64 %r97,%r290; mov.u64 %r107,%r103; mov.u64 %r153,%r61; .loc 1 314 19 mov.u64 %r141,%r295; $L57: .loc 1 315 36 add.u64 %r256,%r153,1; st.u64 [%r107],%r256; shr.s64 %r257,%r256,63; st.u64 [%r107+8],%r257; .loc 1 314 35 add.u64 %r106,%r141,1; .loc 1 314 12 add.u64 %r107,%r107,%r112; setp.le.s64 %r258,%r29,%r141; @ %r258 bra $L67; .loc 1 315 32 ld.u64 %r153,[%r97]; add.u64 %r97,%r97,8; mov.u64 %r141,%r106; bra $L57; $L67: mov.u32 %r95,%r57; $L56: .loc 1 317 13 add.u64 %r86,%r86,%r70; .loc 1 319 11 add.u64 %r61,%r61,1; .loc 1 319 4 st.u64 [%frame+360],%r61; setp.ne.u64 %r259,%r61,%r74; @ %r259 bra $L58; .loc 1 317 13 sub.u64 %r260,%r74,%r160; mad.lo.u64 %r88,%r260,%r70,%r88; bra $L59; $L66: mov.u64 %r73,%r160; mov.u64 %r89,%r90; mov.u64 %r87,%r88; $L55: .loc 1 323 11 ld.s8 %r263,[%r89]; cvt.u16.u32 %r262,%r263; setp.eq.u16 %r264,%r262,0; @ %r264 bra $L60; .loc 1 323 22 ld.s16 %r64,[%r87]; .loc 1 323 19 cvt.u16.u32 %r265,%r64; cvt.u16.u32 %r266,%r95; setp.le.s16 %r267,%r265,%r266; @ %r267 bra $L60; mov.u64 %r62,%r290; mov.u64 %r91,%r103; mov.u64 %r149,%r73; .loc 1 326 12 mov.u64 %r142,%r295; $L61: .loc 1 327 36 add.u64 %r269,%r149,1; st.u64 [%r91],%r269; shr.s64 %r270,%r269,63; st.u64 [%r91+8],%r270; .loc 1 326 28 add.u64 %r105,%r142,1; .loc 1 326 5 add.u64 %r91,%r91,%r112; setp.le.s64 %r271,%r29,%r142; @ %r271 bra $L68; .loc 1 327 32 ld.u64 %r149,[%r62]; add.u64 %r62,%r62,8; mov.u64 %r142,%r105; bra $L61; $L68: mov.u32 %r95,%r64; $L60: .loc 1 331 9 add.u64 %r87,%r87,%r70; .loc 1 332 10 add.u64 %r89,%r89,%r71; .loc 1 334 14 add.u64 %r73,%r73,1; .loc 1 334 7 st.u64 [%frame+360],%r73; setp.ne.u64 %r272,%r73,%r74; @ %r272 bra $L55; sub.u64 %r128,%r74,%r160; .loc 1 331 9 mad.lo.u64 %r88,%r70,%r128,%r88; .loc 1 332 10 mad.lo.u64 %r90,%r71,%r128,%r90; bra $L59; $L89: st.u64 [%frame+360],%r74; $L59: mov.u64 %r120,%r290; mov.u64 %r83,%r71; .loc 1 334 34 mov.u64 %r85,%r74; mov.u64 %r81,%r68; mov.u64 %r119,8; .loc 1 335 9 mov.u64 %r93,0; $L62: .loc 1 340 13 st.u64 [%r120+-8],%r295; .loc 1 343 23 mul.lo.u64 %r76,%r81,%r85; .loc 1 344 24 mul.lo.u64 %r78,%r85,%r83; mov.u64 %r98,%r93; .loc 1 345 5 add.u64 %r93,%r93,1; .loc 1 346 7 setp.eq.u64 %r277,%r29,%r98; @ %r277 bra $L35; .loc 1 354 16 ld.u64 %r278,[%r120]; add.u64 %r80,%r278,1; st.u64 [%r120],%r80; .loc 1 355 23 add.u64 %r280,%r289,%r119; ld.u64 %r81,[%r280]; .loc 1 355 13 sub.u64 %r281,%r81,%r76; add.u64 %r282,%r281,%r281; add.u64 %r88,%r88,%r282; .loc 1 356 24 add.u64 %r283,%frame,%r119; ld.u64 %r83,[%r283]; .loc 1 356 14 sub.u64 %r284,%r83,%r78; add.u64 %r90,%r90,%r284; .loc 1 359 32 add.u64 
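All of the recoverable symbols in this dump belong to libgfortran's runtime support for the Fortran MAXLOC intrinsic with DIM absent (the maxloc0 family). Reading the names: the first suffix is the kind of the returned index values and the second is the array element type (so maxloc0_16_i2 computes INTEGER(16)-kind indices over an INTEGER(2) array), while the m and s prefixes mark the array-MASK and scalar-MASK entry points. As an orientation aid only, here is a minimal rank-1 sketch in C of the reduction these routines perform; it is not the generated libgfortran code, and the real routines operate on GFC array descriptors of arbitrary rank and fill one index per dimension:

#include <stdint.h>
#include <stdio.h>

/* Illustrative sketch only: 1-based position of the first maximum element
   of a rank-1 INTEGER(4) array, or 0 for an empty array. */
int64_t maxloc0_sketch(const int32_t *a, int64_t n)
{
    int64_t pos = 0;            /* 0 = "no element found" (empty array) */
    int32_t best = INT32_MIN;   /* matches the INT_MIN-style seeds
                                   (-32768, -2147483648, ...) visible in
                                   the dump */
    for (int64_t i = 0; i < n; i++)
        if (pos == 0 || a[i] > best) {
            best = a[i];
            pos = i + 1;        /* Fortran MAXLOC results are 1-based */
        }
    return pos;
}

int main(void)
{
    int32_t a[] = { 3, 9, 2, 9, 5 };
    printf("%lld\n", (long long)maxloc0_sketch(a, 5));   /* prints 2 */
    return 0;
}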
[Truncated PTX, continued: the _gfortran_smaxloc0_16_i2 body, then the maxloc0_4_i4.o member built from libgfortran/generated/maxloc0_4_i4.c, containing _gfortran_maxloc0_4_i4, _gfortran_mmaxloc0_4_i4 and _gfortran_smaxloc0_4_i4, with the same external declarations and error-string constants as above. These INTEGER(4) variants seed the running maximum with -2147483648 and store 32-bit index values (st.u32), where the maxloc0_16_i2 member stores its 128-bit index values as two 64-bit halves.]
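The _gfortran_mmaxloc0_* bodies differ from the unmasked ones mainly in that each element is gated by a byte load from the logical MASK array (the ld.s8 of the mask element followed by a compare against zero). A hedged rank-1 sketch of that gating, in the same illustrative flat-array style as the sketch above:

#include <stdint.h>

/* Hedged rank-1 sketch of the array-MASK variant: elements whose mask byte
   is zero are skipped; if no element qualifies the result stays 0. */
int64_t mmaxloc0_sketch(const int32_t *a, const int8_t *mask, int64_t n)
{
    int64_t pos = 0;
    int32_t best = INT32_MIN;
    for (int64_t i = 0; i < n; i++) {
        if (mask[i] == 0)
            continue;                 /* MASK(i) is .FALSE. */
        if (pos == 0 || a[i] > best) {
            best = a[i];
            pos = i + 1;
        }
    }
    return pos;
}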
[Truncated PTX, continued: the _gfortran_smaxloc0_4_i4 body, then the maxloc0_8_i4.o member built from libgfortran/generated/maxloc0_8_i4.c, containing _gfortran_maxloc0_8_i4, _gfortran_mmaxloc0_8_i4 and _gfortran_smaxloc0_8_i4 (INTEGER(8) result indices over an INTEGER(4) array; same external declarations and error strings as above).]
[Truncated PTX, continued: the _gfortran_smaxloc0_8_i4 body, then the maxloc0_16_i4.o member built from libgfortran/generated/maxloc0_16_i4.c, containing _gfortran_maxloc0_16_i4, _gfortran_mmaxloc0_16_i4 and _gfortran_smaxloc0_16_i4.]
%r88,%r70,%r128,%r88; .loc 1 332 10 mad.lo.u64 %r90,%r71,%r128,%r90; bra $L59; $L89: st.u64 [%frame+360],%r74; $L59: mov.u64 %r120,%r285; mov.u64 %r83,%r71; .loc 1 334 34 mov.u64 %r85,%r74; mov.u64 %r81,%r68; mov.u64 %r119,8; .loc 1 335 9 mov.u64 %r93,0; $L62: .loc 1 340 13 st.u64 [%r120+-8],%r290; .loc 1 343 23 mul.lo.u64 %r76,%r81,%r85; .loc 1 344 24 mul.lo.u64 %r78,%r85,%r83; mov.u64 %r98,%r93; .loc 1 345 5 add.u64 %r93,%r93,1; .loc 1 346 7 setp.eq.u64 %r272,%r29,%r98; @ %r272 bra $L35; .loc 1 354 16 ld.u64 %r273,[%r120]; add.u64 %r80,%r273,1; st.u64 [%r120],%r80; .loc 1 355 23 add.u64 %r275,%r284,%r119; ld.u64 %r81,[%r275]; .loc 1 355 13 sub.u64 %r276,%r81,%r76; shl.b64 %r277,%r276,2; add.u64 %r88,%r88,%r277; .loc 1 356 24 add.u64 %r278,%frame,%r119; ld.u64 %r83,[%r278]; .loc 1 356 14 sub.u64 %r279,%r83,%r78; add.u64 %r90,%r90,%r279; .loc 1 359 32 add.u64 %r281,%r291,%r119; ld.u64 %r85,[%r281]; .loc 1 359 7 add.u64 %r120,%r120,8; add.u64 %r119,%r119,8; setp.eq.u64 %r282,%r80,%r85; @ %r282 bra $L62; ld.u64 %r160,[%frame+360]; bra $L63; $L65: mov.u64 %r88,%r45; setp.eq.u32 %r283,1,1; bra $L48; $L35: .loc 1 362 1 ret; } // BEGIN GLOBAL FUNCTION DEF: _gfortran_smaxloc0_16_i4 .visible .func _gfortran_smaxloc0_16_i4u32pred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 379 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 379 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 381 7 {_gfortran_maxloc0_16_.loc 1 382 7 bra $L90; $L92: .loc 1 385 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 390 15 ld.u64 %r24,[%r35]; .loc 1 390 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 392 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 393 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 394 24 st.u64 [%r35+8],%r24; .loc 1 395 29call (%value_in),_gfortrani_xmallocarray95 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 397 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 397 11 ld.u32 %r62,[%r61+36; .loc 1 399 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 404 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,4; mov.u64 %r29,%r24; .loc 1 405 10 mov.u64 %r32,0; .loc 1 406 23 mov.u64 %r68,%r32; $L97: st.u64 [%r29],%r68; st.u64 [%r29+8],%r68; .loc 1 405 24 add.u64 %r32,%r32,1; .loc 1 405 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r70,%r30,%r32; @ %r70 bra $L97; $L90: .loc 1 407 1 ret; } maxloc0_4_i8.o/_gfortran_maxloc0_4_i8 .visible .func _gfortran_maxloc0_4_i8libgfortran/generated/maxloc0_4_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_4_i8 .visible .func _gfortran_mmaxloc0_4_i8_gfortran_smaxloc0_4_i8 .visible .func _gfortran_smaxloc0_4_i8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,65,88,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_4_i8 .visible .func _gfortran_maxloc0_4_i8stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[368u64predpred %r154; .reg .predpredpredmov.u64 %r116,%ar0; mov.u64 
  _gfortran_maxloc0_4_i8 body: frame setup (.local .align 16 .b8 %frame_ar[368]), rank check (rank <= 0 raises _gfortran_runtime_error), allocation of the index vector through _gfortrani_xmallocarray, optional _gfortrani_bounds_iforeach_return call, extent/stride gathering loops, a zero-fill path for empty extents, and the scan loops (.loc markers into lines 50-174 of the C source) that keep the running maximum in a 64-bit register initialised to -9223372036854775808 and store one-based positions as 32-bit values.
  _gfortran_mmaxloc0_4_i8 body: the array-MASK variant; a null MASK descriptor forwards to _gfortran_maxloc0_4_i8, an unsupported mask kind raises _gfortran_runtime_error, bounds checking goes through _gfortrani_bounds_iforeach_return and _gfortrani_bounds_equal_extents, and the masked scan loops ($L43-$L68, .loc markers spanning lines 202-359) read the mask bytes with ld.s8 before each comparison.
  _gfortran_smaxloc0_4_i8 body: scalar-mask wrapper; a null or true mask forwards to _gfortran_maxloc0_4_i8, a false mask allocates the result and zero-fills it.

maxloc0_8_i8.o/ — PTX generated from "libgfortran/generated/maxloc0_8_i8.c"
  GLOBAL FUNCTION DEFS (.visible .func): _gfortran_maxloc0_8_i8, _gfortran_mmaxloc0_8_i8, _gfortran_smaxloc0_8_i8
  GLOBAL DECLS: _gfortrani_compile_options (.extern .global .align 8 .u64); _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return, _gfortrani_bounds_equal_extents (.extern .func)
  String constants (stored as decimal byte lists): "Rank of array needs to be > 0", a fragment mentioning MAXLOC and a MASK argument, "Funny sized logical array"
  _gfortran_maxloc0_8_i8 body begins: same structure as the _4_i8 member, but storing 64-bit positions.
  _gfortran_maxloc0_8_i8 body (continued) and _gfortran_mmaxloc0_8_i8 body: same control flow as the _4_i8 member, with the running maximum kept as a signed 64-bit value and positions stored as 64-bit integers; the masked variant again forwards to _gfortran_maxloc0_8_i8 for a null MASK, validates the mask kind, and calls _gfortrani_bounds_iforeach_return / _gfortrani_bounds_equal_extents when bounds checking is enabled.
  _gfortran_smaxloc0_8_i8 body: scalar-mask wrapper; a null or true mask forwards to _gfortran_maxloc0_8_i8, a false mask allocates the result and zero-fills it.

maxloc0_16_i8.o/ — PTX generated from "libgfortran/generated/maxloc0_16_i8.c"
  GLOBAL FUNCTION DEFS (.visible .func): _gfortran_maxloc0_16_i8, _gfortran_mmaxloc0_16_i8, _gfortran_smaxloc0_16_i8
  GLOBAL DECLS: _gfortrani_compile_options (.extern .global .align 8 .u64); _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return, _gfortrani_bounds_equal_extents (.extern .func)
  String constants (stored as decimal byte lists): "Rank of array needs to be > 0", a fragment mentioning MAXLOC and a MASK argument, "Funny sized logical array"
  _gfortran_maxloc0_16_i8 body begins: frame setup, rank check, result allocation with 16-byte elements, and scan loops that store each position as a 128-bit pair (st.u64 of the low word plus its shr.s64 ...,63 sign extension into the high word).
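The m-prefixed entry points summarised above take an additional logical MASK array and only let elements with a true mask byte compete; the PTX reflects this in the ld.s8 loads of the mask and the branches that skip masked-off elements. As a rough illustration only (hypothetical name and flat-array signature, not libgfortran's API), a masked whole-array MAXLOC can be sketched in C as:

    #include <stddef.h>
    #include <stdint.h>

    /* Illustrative masked MAXLOC over a flat array: one-based position of the
     * first maximum among elements whose mask byte is non-zero, 0 if every
     * element is masked off.  Hypothetical helper; the real _gfortran_mmaxloc0_*
     * routines work on multi-dimensional descriptors and fill a rank-long
     * index vector. */
    static int64_t mmaxloc0_model(const int64_t *a, const int8_t *mask, size_t n)
    {
        int64_t pos = 0;      /* stays 0 when no element is selected */
        int64_t best = 0;
        int found = 0;
        for (size_t i = 0; i < n; i++) {
            if (!mask[i])
                continue;                 /* element excluded by MASK */
            if (!found || a[i] > best) {  /* first unmasked element always wins */
                found = 1;
                best = a[i];
                pos = (int64_t)i + 1;     /* Fortran positions are one-based */
            }
        }
        return pos;
    }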
  _gfortran_maxloc0_16_i8 body (continued) and _gfortran_mmaxloc0_16_i8 body: same control flow as the smaller-index members, with the running maximum held as a signed 64-bit value and every position written as a 128-bit pair (low word plus sign extension).
  _gfortran_smaxloc0_16_i8 body: scalar-mask wrapper; a null or true mask forwards to _gfortran_maxloc0_16_i8, a false mask allocates the result and zero-fills both words of every position.

maxloc0_4_i16.o/ — PTX generated from "libgfortran/generated/maxloc0_4_i16.c"
  GLOBAL FUNCTION DEFS (.visible .func): _gfortran_maxloc0_4_i16, _gfortran_mmaxloc0_4_i16, _gfortran_smaxloc0_4_i16
  GLOBAL DECLS: _gfortrani_compile_options (.extern .global .align 8 .u64); _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return, _gfortrani_bounds_equal_extents (.extern .func)
  String constants (stored as decimal byte lists): "Rank of array needs to be > 0", a fragment mentioning MAXLOC and a MASK argument, "Funny sized logical array"
  _gfortran_maxloc0_4_i16 body begins: the INTEGER(16) array elements are loaded as 64-bit pairs and compared with a signed test on the high word combined with an unsigned test on the low word; positions are stored as 32-bit values.
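The s-prefixed wrappers are much simpler, and their shape is still readable in the dump: they test the scalar MASK and either delegate to the unmasked routine or produce an all-zero index vector. A minimal sketch of that dispatch (names, types, and the function-pointer plumbing are all hypothetical):

    #include <stdint.h>
    #include <string.h>

    /* Illustrative scalar-mask dispatch: a missing (NULL) or true MASK behaves
     * like plain MAXLOC; a present, false MASK yields an all-zero index vector,
     * which is what the zero-fill loops in the _gfortran_smaxloc0_* bodies do. */
    typedef void (*maxloc0_fn)(int32_t *result, int rank, const int64_t *a, size_t n);

    static void smaxloc0_model(int32_t *result, int rank,
                               const int64_t *a, size_t n,
                               const int32_t *mask, maxloc0_fn maxloc0)
    {
        if (mask == NULL || *mask) {
            maxloc0(result, rank, a, n);                   /* forward to the unmasked scan */
            return;
        }
        memset(result, 0, (size_t)rank * sizeof *result);  /* MASK=.false. */
    }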
  _gfortran_maxloc0_4_i16 body (continued) and _gfortran_mmaxloc0_4_i16 body: the scan keeps the running maximum as a 128-bit pair (low word initialised to 0, high word to -9223372036854775808) and performs the two-word signed/unsigned comparison on every candidate; the masked variant adds the usual MASK handling, mask-kind validation, and bounds-check calls.
  _gfortran_smaxloc0_4_i16 body: scalar-mask wrapper; a null or true mask forwards to _gfortran_maxloc0_4_i16, a false mask allocates the result and zero-fills it.

maxloc0_8_i16.o/ — PTX generated from "libgfortran/generated/maxloc0_8_i16.c"
  GLOBAL FUNCTION DEFS (.visible .func): _gfortran_maxloc0_8_i16, _gfortran_mmaxloc0_8_i16, _gfortran_smaxloc0_8_i16
  GLOBAL DECLS: _gfortrani_compile_options (.extern .global .align 8 .u64); _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return, _gfortrani_bounds_equal_extents (.extern .func)
  String constants (stored as decimal byte lists): "Rank of array needs to be > 0", a fragment mentioning MAXLOC and a MASK argument, "Funny sized logical array"
  _gfortran_maxloc0_8_i16 body begins with the same frame setup, rank check, and allocation sequence, continuing into its extent-gathering loop.
%r36,%r36,24; add.u64 %r104,%r104,8; add.u64 %r105,%r105,8; setp.ne.u64 %r148,%r71,%r79; @ %r148 bra $L24; .loc 1 84 8 ld.u64 %r62,[%r113]; shl.b64 %r84,%r72,3; mov.u64 %r39,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u64 %r149,1; $L8: st.u64 [%r39],%r149; mov.u64 %r32,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r39,%r39,%r84; setp.ne.u64 %r150,%r98,%r32; @ %r150 bra $L8; .loc 1 101 9 setp.ne.u64 %r151,%r62,0; @ ! %r151 bra $L1; .loc 1 149 19 ld.u64 %r45,[%frame]; .loc 1 149 9 shl.b64 %r47,%r45,4; .loc 1 151 34 ld.u64 %r51,[%frame+120]; cvt.u32.u32 %r153,%r22; cvt.s64.s8 %r152,%r153; shl.b64 %r154,%r152,3; add.u64 %r73,%r204,%r154; .loc 1 99 12 mov.u64 %r201,0; mov.u64 %r202,-9223372036854775808; setp.eq.u32 %r203,%r114,0; add.u64 %r205,%frame,248; .loc 1 157 13 mov.u64 %r208,%r201; .loc 1 174 32 add.u64 %r209,%frame,120; $L23: .loc 1 151 21 ld.u64 %r103,[%frame+240]; .loc 1 126 12 @ %r203 bra $L25; mov.u64 %r40,%r103; mov.u64 %r60,%r62; $L15: .loc 1 129 12 ld.u64 %r197,[%r60]; ld.u64 %r198,[%r60+8]; .loc 1 129 11 setp.gt.s64 %r159,%r202,%r198; @ %r159 bra $L12; setp.ne.u64 %r162,%r202,%r198; @ %r162 bra $L28; setp.gt.u64 %r165,%r201,%r197; @ %r165 bra $L12; $L28: mov.u64 %r76,%r205; mov.u64 %r34,%r23; mov.u64 %r90,%r40; $L14: .loc 1 133 35 add.u64 %r167,%r90,1; .loc 1 133 24 st.u64 [%r34],%r167; .loc 1 132 10 add.u64 %r34,%r34,%r84; setp.eq.u64 %r168,%r73,%r76; @ %r168 bra $L26; .loc 1 133 31 ld.u64 %r90,[%r76]; add.u64 %r76,%r76,8; bra $L14; $L26: mov.u64 %r201,%r197; mov.u64 %r202,%r198; $L12: .loc 1 135 12 add.u64 %r60,%r60,%r47; .loc 1 137 17 add.u64 %r40,%r40,1; .loc 1 137 10 st.u64 [%frame+240],%r40; setp.ne.u64 %r169,%r40,%r51; @ %r169 bra $L15; bra $L38; $L25: mov.u64 %r49,%r103; mov.u64 %r61,%r62; $L11: .loc 1 141 11 ld.u64 %r199,[%r61]; ld.u64 %r200,[%r61+8]; .loc 1 141 10 setp.gt.s64 %r172,%r200,%r202; @ %r172 bra $L29; setp.ne.u64 %r175,%r200,%r202; @ %r175 bra $L17; setp.gt.u64 %r178,%r199,%r201; @ ! 
%r178 bra $L17; $L29: mov.u64 %r85,%r205; mov.u64 %r101,%r23; mov.u64 %r86,%r49; $L19: .loc 1 145 35 add.u64 %r180,%r86,1; .loc 1 145 24 st.u64 [%r101],%r180; .loc 1 144 4 add.u64 %r101,%r101,%r84; setp.eq.u64 %r181,%r73,%r85; @ %r181 bra $L27; .loc 1 145 31 ld.u64 %r86,[%r85]; add.u64 %r85,%r85,8; bra $L19; $L27: mov.u64 %r201,%r199; mov.u64 %r202,%r200; $L17: .loc 1 149 9 add.u64 %r61,%r61,%r47; .loc 1 151 14 add.u64 %r49,%r49,1; .loc 1 151 7 st.u64 [%frame+240],%r49; setp.ne.u64 %r182,%r49,%r51; @ %r182 bra $L11; .loc 1 149 9 sub.u64 %r183,%r51,%r103; mad.lo.u64 %r62,%r183,%r47,%r62; bra $L20; $L38: .loc 1 135 12 sub.u64 %r185,%r51,%r103; mad.lo.u64 %r62,%r185,%r47,%r62; $L20: mov.u64 %r70,%r205; .loc 1 151 34 mov.u64 %r59,%r51; mov.u64 %r57,%r45; .loc 1 152 9 mov.u64 %r66,0; $L22: .loc 1 157 13 st.u64 [%r70+-8],%r208; .loc 1 160 23 mul.lo.u64 %r54,%r57,%r59; mov.u64 %r99,%r66; .loc 1 161 5 add.u64 %r66,%r66,1; .loc 1 162 7 setp.eq.u64 %r189,%r99,%r98; @ %r189 bra $L1; .loc 1 170 16 ld.u64 %r190,[%r70]; add.u64 %r56,%r190,1; st.u64 [%r70],%r56; shl.b64 %r65,%r66,3; .loc 1 171 23 add.u64 %r191,%frame,%r65; ld.u64 %r57,[%r191]; .loc 1 171 13 sub.u64 %r192,%r57,%r54; shl.b64 %r193,%r192,4; add.u64 %r62,%r62,%r193; .loc 1 174 32 add.u64 %r195,%r209,%r65; ld.u64 %r59,[%r195]; .loc 1 174 7 add.u64 %r70,%r70,8; setp.eq.u64 %r196,%r56,%r59; @ %r196 bra $L22; bra $L23; $L_gfortran_mmaxloc0_8_i16 .visible .func _gfortran_mmaxloc0_8_i16reg .u64 %stack; mov.u64 %stack,0480106; .reg .u64 %r107pred %r187; .reg .u64 %r190; .reg .u64u64pred %r222; .reg .u64 %r223; .reg .predu32 %r230; .reg .u64 %r231; .reg .u16 %r233; .reg .u32 %r234; .reg .pred %r235; .reg .pred %r236; .reg .u64 %r238; .reg .pred %r239; .reg .pred %r240; .reg .u16 %r242; .reg .u32 %r243; .reg .pred %r244; .reg .pred %r247; .reg .pred %r250; .reg .predu32 %r261; .reg .pred %r262; .reg .pred %r265; .reg .pred %r268; .reg .pred %r271; .reg .u64 %r273; .reg .pred %r274; .reg .predu64 %r283; .reg .u64u64 %r305; .reg .pred %r307; mov.u64 %r160,%ar0; mov.u64 %r161,%ar1; mov.u64 %r162,%ar2; mov.u32 %r163,%ar3; .loc 1 202 6 setp.ne.u64 %r164,%r162,0; @ %r164 bra $L40; .loc 1 204 7 {1163; call _gfortran_maxloc0_8_i16.loc 1 205 7 bra $L39; $L40: .loc 1 208 10 ld.s8 %r22,[%r161+28]; .loc 1 209 6 setp.gt.s32 %r168,%r22,0; @ %r168 bra $L42; .loc 1 210 5 cvta.const.u64 %r16stack; call _gfortran_runtime_error42: .loc 1 212 6 ld.u64 %r171,[%r160]; setp.ne.u64 %r172,%r171,0; @ %r172 bra $L43; .loc 1 214 7 st.u64 [%r160+48],%r171; add.u32 %r174,%r22,-1; cvt.s64.s32 %r175,%r174; st.u64 [%r160+56],%r175; mov.u64 %r176,1; st.u64 [%r160+40],%r176; .loc 1 215 28 cvt.u32.u64 %r177,%r176; st.u8 [%r160+28],%r177; .loc 1 216 24 st.u64 [%r160+8],%r171; .loc 1 217 29 cvt.u32.u32 %r182,%r22; cvt.s64.s8 %r181,%r182; mov.u64 %r180call (%value_in),_gfortrani_xmallocarray183,[%value_in]; } .loc 1 217 27 st.u64 [%r160],%r183; bra $L44; $L43: .loc 1 221 11 cvta.global.u64 %r185,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r186,[%r185+36]; setp.eq.u32 %r187,%r186,0; @ %r187 bra $L44; .loc 1 224 4 cvta.const.u64 %r190,$LC1; {161190; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19316161193190; call _gfortrani_bounds_equal_extents$L44: .loc 1 231 15 ld.u64 %r31,[%r162+16]; .loc 1 233 9 ld.u64 %r92,[%r162]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r195,%r32,-4; and.b32 %r196,%r195,-5; set.u32.eq.u32 %r198,%r196,0; neg.s32 %r199,%r198; .loc 1 235 22 add.u32 %r200,%r32,-1; set.u32.le.u32 %r202,%r200,1; neg.s32 %r203,%r202; .loc 1 
235 6 cvt.u16.u32 %r205,%r199; cvt.u16.u32 %r206,%r203; or.b16 %r204,%r205,%r206; cvt.u32.u16 %r207,%r204; cvt.u16.u8 %r208,%r207; .loc 1 237 7 setp.ne.u32 %r211,%r32,16; @ %r211 bra $L46; $L45: .loc 1 244 11 ld.u64 %r106,[%r160+40]; .loc 1 245 8 ld.u64 %r107,[%r160]; .loc 1 246 17 cvt.u32.u32 %r212,%r22; cvt.s64.s8 %r68,%r212; add.u64 %r153,%r161,40; add.u64 %r298,%frame,120; mov.u64 %r149,%r298; add.u64 %r147,%r162,40; mov.u64 %r146,%frame; add.u64 %r143,%frame,240; add.u64 %r62,%frame,360; mov.u64 %r47,%r62; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r221,%r29; bra $L47; $L46: .loc 1 242 5_gfortran_runtime_error70: mov.u64 %r29,%r113; $L47: .loc 1 248 18 ld.u64 %r215,[%r153]; st.u64 [%r149],%r215; .loc 1 249 20 ld.u64 %r217,[%r147]; mul.lo.u64 %r216,%r217,%r31; .loc 1 249 18 st.u64 [%r146],%r216; .loc 1 250 19 ld.u64 %r219,[%r153+16]; add.u64 %r218,%r219,1; ld.u64 %r220,[%r153+8]; sub.u64 %r44,%r218,%r220; .loc 1 250 17 st.u64 [%r143],%r44; .loc 1 251 16 st.u64 [%r47],%r221; .loc 1 252 10 setp.gt.s64 %r222,%r44,0; @ %r222 bra $L48; shl.b64 %r137,%r106,3; mov.u64 %r129,%r107; .loc 1 255 11 mov.u64 %r114,0; .loc 1 256 24 mov.u64 %r223,%r114; $L49: st.u64 [%r129],%r223; .loc 1 255 27 add.u64 %r114,%r114,1; .loc 1 255 4 add.u64 %r129,%r129,%r137; setp.ne.u64 %r224,%r68,%r114; @ %r224 bra $L49; bra $L39; $L48: .loc 1 246 26 add.u64 %r113,%r29,1; .loc 1 246 3 add.u64 %r153,%r153,24; add.u64 %r149,%r149,8; add.u64 %r147,%r147,24; add.u64 %r146,%r146,8; add.u64 %r143,%r143,8; add.u64 %r47,%r47,8; setp.ne.u64 %r225,%r68,%r113; @ %r225 bra $L70; .loc 1 261 8 ld.u64 %r90,[%r161]; shl.b64 %r158,%r106,3; mov.u64 %r156,%r107; .loc 1 264 10 mov.u64 %r112,0; .loc 1 265 23 mov.u64 %r226,%r112; $L51: st.u64 [%r156],%r226; mov.u64 %r140,%r112; .loc 1 264 26 add.u64 %r112,%r112,1; .loc 1 264 3 add.u64 %r156,%r156,%r158; setp.ne.u64 %r227,%r29,%r140; @ %r227 bra $L51; .loc 1 276 9 setp.eq.u64 %r228,%r90,0; @ %r228 bra $L39; .loc 1 331 19 ld.u64 %r70,[%frame+120]; .loc 1 331 9 shl.b64 %r72,%r70,4; .loc 1 332 20 ld.u64 %r73,[%frame]; .loc 1 334 34 ld.u64 %r76,[%frame+240]; ld.u64 %r155,[%frame+360]; cvt.u32.u32 %r230,%r22; cvt.s64.s8 %r229,%r230; shl.b64 %r231,%r229,3; add.u64 %r119,%r231,%r62; .loc 1 274 12 mov.u64 %r295,0; mov.u64 %r296,-9223372036854775808; setp.eq.u32 %r297,1,0; add.u64 %r299,%frame,368; .loc 1 340 13 mov.u64 %r304,%r295; .loc 1 359 32 add.u64 %r305,%frame,240; .loc 1 308 12 setp.eq.u32 %r307,%r163,0; $L69: .loc 1 280 10 @ %r297 bra $L52; mov.u64 %r56,%r155; mov.u32 %r46,0; mov.u32 %r301,1; $L57: mov.u64 %r45,%r90; .loc 1 300 13 add.u64 %r90,%r45,%r72; .loc 1 284 11 ld.s8 %r234,[%r92]; cvt.u16.u32 %r233,%r234; setp.eq.u16 %r235,%r233,0; @ %r235 bra $L53; setp.eq.u32 %r236,%r46,0; selp.u64 %r155,%r155,%r56,%r236; .loc 1 294 16 ld.u64 %r295,[%r45]; ld.u64 %r296,[%r45+8]; mov.u64 %r128,%r299; mov.u64 %r139,%r107; mov.u64 %r151,%r155; $L56: .loc 1 296 33 add.u64 %r238,%r151,1; .loc 1 296 22 st.u64 [%r139],%r238; .loc 1 295 9 add.u64 %r139,%r139,%r158; setp.eq.u64 %r239,%r119,%r128; @ %r239 bra $L71; .loc 1 296 29 ld.u64 %r151,[%r128]; add.u64 %r128,%r128,8; bra $L56; $L53: .loc 1 301 14 add.u64 %r92,%r92,%r73; .loc 1 303 11 add.u64 %r56,%r56,1; mov.u32 %r46,%r301; .loc 1 303 4 setp.ne.u64 %r240,%r56,%r76; @ %r240 bra $L57; bra $L97; $L52: .loc 1 308 12 @ %r307 bra $L72; mov.u64 %r64,%r155; mov.u64 %r88,%r90; $L63: .loc 1 311 11 ld.s8 %r243,[%r92]; cvt.u16.u32 %r242,%r243; setp.eq.u16 %r244,%r242,0; @ %r244 bra $L60; .loc 1 311 22 ld.u64 %r291,[%r88]; ld.u64 %r292,[%r88+8]; .loc 1 
311 19 setp.gt.s64 %r247,%r296,%r292; @ %r247 bra $L60; setp.ne.u64 %r250,%r296,%r292; @ %r250 bra $L75; setp.gt.u64 %r253,%r295,%r291; @ %r253 bra $L60; $L75: mov.u64 %r93,%r299; mov.u64 %r96,%r107; mov.u64 %r148,%r64; .loc 1 314 19 mov.u64 %r141,%r304; $L62: .loc 1 315 36 add.u64 %r255,%r148,1; .loc 1 315 25 st.u64 [%r96],%r255; .loc 1 314 35 add.u64 %r110,%r141,1; .loc 1 314 12 add.u64 %r96,%r96,%r158; setp.le.s64 %r256,%r29,%r141; @ %r256 bra $L73; .loc 1 315 32 ld.u64 %r148,[%r93]; add.u64 %r93,%r93,8; mov.u64 %r141,%r110; bra $L62; $L73: mov.u64 %r295,%r291; mov.u64 %r296,%r292; $L60: .loc 1 317 13 add.u64 %r88,%r88,%r72; .loc 1 319 11 add.u64 %r64,%r64,1; .loc 1 319 4 st.u64 [%frame+360],%r64; setp.ne.u64 %r257,%r64,%r76; @ %r257 bra $L63; .loc 1 317 13 sub.u64 %r258,%r76,%r155; mad.lo.u64 %r90,%r258,%r72,%r90; bra $L64; $L72: mov.u64 %r75,%r155; mov.u64 %r91,%r92; mov.u64 %r89,%r90; $L59: .loc 1 323 11 ld.s8 %r261,[%r91]; cvt.u16.u32 %r260,%r261; setp.eq.u16 %r262,%r260,0; @ %r262 bra $L65; .loc 1 323 22 ld.u64 %r293,[%r89]; ld.u64 %r294,[%r89+8]; .loc 1 323 19 setp.gt.s64 %r265,%r294,%r296; @ %r265 bra $L76; setp.ne.u64 %r268,%r294,%r296; @ %r268 bra $L65; setp.gt.u64 %r271,%r293,%r295; @ ! %r271 bra $L65; $L76: mov.u64 %r115,%r299; mov.u64 %r57,%r107; mov.u64 %r144,%r75; .loc 1 326 12 mov.u64 %r142,%r304; $L67: .loc 1 327 36 add.u64 %r273,%r144,1; .loc 1 327 25 st.u64 [%r57],%r273; .loc 1 326 28 add.u64 %r109,%r142,1; .loc 1 326 5 add.u64 %r57,%r57,%r158; setp.le.s64 %r274,%r29,%r142; @ %r274 bra $L74; .loc 1 327 32 ld.u64 %r144,[%r115]; add.u64 %r115,%r115,8; mov.u64 %r142,%r109; bra $L67; $L74: mov.u64 %r295,%r293; mov.u64 %r296,%r294; $L65: .loc 1 331 9 add.u64 %r89,%r89,%r72; .loc 1 332 10 add.u64 %r91,%r91,%r73; .loc 1 334 14 add.u64 %r75,%r75,1; .loc 1 334 7 st.u64 [%frame+360],%r75; setp.ne.u64 %r275,%r75,%r76; @ %r275 bra $L59; sub.u64 %r125,%r76,%r155; .loc 1 331 9 mad.lo.u64 %r90,%r72,%r125,%r90; .loc 1 332 10 mad.lo.u64 %r92,%r73,%r125,%r92; bra $L64; $L97: st.u64 [%frame+360],%r76; $L64: mov.u64 %r117,%r299; mov.u64 %r85,%r73; .loc 1 334 34 mov.u64 %r87,%r76; mov.u64 %r83,%r70; mov.u64 %r116,8; .loc 1 335 9 mov.u64 %r95,0; $L68: .loc 1 340 13 st.u64 [%r117+-8],%r304; .loc 1 343 23 mul.lo.u64 %r78,%r83,%r87; .loc 1 344 24 mul.lo.u64 %r80,%r87,%r85; mov.u64 %r94,%r95; .loc 1 345 5 add.u64 %r95,%r95,1; .loc 1 346 7 setp.eq.u64 %r280,%r29,%r94; @ %r280 bra $L39; .loc 1 354 16 ld.u64 %r281,[%r117]; add.u64 %r82,%r281,1; st.u64 [%r117],%r82; .loc 1 355 23 add.u64 %r283,%r298,%r116; ld.u64 %r83,[%r283]; .loc 1 355 13 sub.u64 %r284,%r83,%r78; shl.b64 %r285,%r284,4; add.u64 %r90,%r90,%r285; .loc 1 356 24 add.u64 %r286,%frame,%r116; ld.u64 %r85,[%r286]; .loc 1 356 14 sub.u64 %r287,%r85,%r80; add.u64 %r92,%r92,%r287; .loc 1 359 32 add.u64 %r289,%r305,%r116; ld.u64 %r87,[%r289]; .loc 1 359 7 add.u64 %r117,%r117,8; add.u64 %r116,%r116,8; setp.eq.u64 %r290,%r82,%r87; @ %r290 bra $L68; ld.u64 %r155,[%frame+360]; bra $L69; $L71: mov.u64 %r90,%r45; setp.eq.u32 %r297,1,1; bra $L52; $L39: .loc 1 _gfortran_smaxloc0_8_i16 .visible .func _gfortran_smaxloc0_8_i16u32pred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 379 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L99; .loc 1 379 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L100; $L99: .loc 1 381 7 {_gfortran_maxloc0_8_i16.loc 1 382 7 bra $L98; $L100: .loc 1 385 8 ld.s8 %r30,[%r36+28]; .loc 1 387 6 setp.gt.s64 %r46,%r30,0; @ %r46 bra $L102_gfortran_runtime_error102: .loc 1 390 15 ld.u64 %r24,[%r35]; .loc 1 390 6 
setp.ne.u64 %r49,%r24,0; @ %r49 bra $L103; .loc 1 392 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 393 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 394 24 st.u64 [%r35+8],%r24; .loc 1 395 29 mov.u64 call (%value_in),_gfortrani_xmallocarray95 27 st.u64 [%r35],%r24; bra $L104; $L103: .loc 1 397 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 397 11 ld.u32 %r62,[%r61+36.loc 1 399 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 404 8 ld.u64 %r24,[%r35]; $L104: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,3; mov.u64 %r29,%r24; .loc 1 405 10 mov.u64 %r32,0; .loc 1 406 23 mov.u64 %r68,%r32; $L105: st.u64 [%r29],%r68; .loc 1 405 24 add.u64 %r32,%r32,1; .loc 1 405 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L105; $L98: .loc 1 407 1 ret; } /19 _gfortran_maxloc0_16_i16 .visible .func _gfortran_maxloc0_16_i16libgfortran/generated/maxloc0_16_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_16_i16 .visible .func _gfortran_mmaxloc0_16_i16_gfortran_smaxloc0_16_i16 .visible .func _gfortran_smaxloc0_16_i16VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,65,88,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_16_i16 .visible .func _gfortran_maxloc0_16_i16stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[3683264u32 %r157; .reg .u64 %r158; .reg .pred %r163; .reg .pred %r166; .reg .predpredpredpredu64 %r191; .reg .predpred64mov.u32 %r116,%ar2; .loc 1 50 10 ld.s8 %r22,[%r115+28]; .loc 1 50 8 cvt.u32.u32 %r117,%r22; cvt.s64.s8 %r71,%r117; .loc 1 51 6 setp.gt.s64 %r118,%r71,0; @ %r118 bra $L2; .loc 1 52 5 cvta.const.u64 %r1111_gfortran_runtime_errorr23,[%r114]; .loc 1 54 6 setp.ne.u64 %r121,%r23,0; @ %r121 bra $L3; .loc 1 56 7 st.u64 [%r114+48],%r23; add.u64 %r123,%r71,-1; st.u64 [%r114+56],%r123; mov.u64 %r124,1; st.u64 [%r114+40],%r124; .loc 1 57 28 cvt.u32.u64 %r125,%r124; st.u8 [%r114+28],%r125; .loc 1 58 24 st.u64 [%r114+8],%r23; .loc 1 59 29 mov.u64 %r128,1671call (%value_in),_gfortrani_xmallocarray131,[%value_in]; } mov.u64 %r23,%r131; .loc 1 59 27 st.u64 [%r114],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r133,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r134,[%r133+36]; setp.eq.u32 %r135,%r134,0; @ %r135 bra $L4; .loc 1 64r115138; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r114]; $L4: .loc 1 68 11 ld.u64 %r72,[%r114+40]; add.u64 %r44,%r115,40; mov.u64 %r35,%frame; add.u64 %r52,%frame,120; .loc 1 70 10 mov.u64 %r100,0; add.u64 %r210,%frame,240; .loc 1 74 16 mov.u64 %r146,%r100; bra $L7; $L24: mov.u64 %r100,%r79; $L7: .loc 1 72 18 ld.u64 %r139,[%r44]; st.u64 [%r35],%r139; .loc 1 73 19 ld.u64 %r141,[%r44+16]; add.u64 %r140,%r141,1; ld.u64 %r142,[%r44+8]; sub.u64 %r32,%r140,%r142; .loc 1 73 17 st.u64 [%r52],%r32; .loc 1 74 16 shl.b64 %r144,%r100,3; add.u64 %r145,%r210,%r144; st.u64 [%r145],%r146; .loc 1 75 10 setp.gt.s64 %r147,%r32,0; @ %r147 bra $L5; shl.b64 %r86,%r72,4; mov.u64 %r88,%r23; .loc 1 78 11 mov.u64 
%r80,0; .loc 1 79 24 mov.u64 %r148,%r80; $L6: st.u64 [%r88],%r148; st.u64 [%r88+8],%r148; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r88,%r88,%r86; setp.ne.u64 %r150,%r71,%r80; @ %r150 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r100,1; .loc 1 70 3 add.u64 %r44,%r44,24; add.u64 %r35,%r35,8; add.u64 %r52,%r52,8; setp.ne.u64 %r151,%r71,%r79; @ %r151 bra $L24; .loc 1 84 8 ld.u64 %r62,[%r115]; shl.b64 %r38,%r72,4; mov.u64 %r85,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u64 %r152,1; mov.u64 %r153,%r78; $L8: st.u64 [%r85],%r152; st.u64 [%r85+8],%r153; mov.u64 %r33,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r85,%r85,%r38; setp.ne.u64 %r154,%r100,%r33; @ %r154 bra $L8; .loc 1 101 9 setp.ne.u64 %r155,%r62,0; @ ! %r155 bra $L1; .loc 1 149 19 ld.u64 %r47,[%frame]; .loc 1 149 9 shl.b64 %r49,%r47,4; .loc 1 151 34 ld.u64 %r51,[%frame+120]; cvt.u32.u32 %r157,%r22; cvt.s64.s8 %r156,%r157; shl.b64 %r158,%r156,3; add.u64 %r89,%r210,%r158; .loc 1 99 12 mov.u64 %r207,0; mov.u64 %r208,-9223372036854775808; setp.eq.u32 %r209,%r116,0; add.u64 %r211,%frame,248; .loc 1 157 13 mov.u64 %r214,%r207; .loc 1 174 32 add.u64 %r215,%frame,120; $L23: .loc 1 151 21 ld.u64 %r107,[%frame+240]; .loc 1 126 12 @ %r209 bra $L25; mov.u64 %r41,%r107; mov.u64 %r60,%r62; $L15: .loc 1 129 12 ld.u64 %r203,[%r60]; ld.u64 %r204,[%r60+8]; .loc 1 129 11 setp.gt.s64 %r163,%r208,%r204; @ %r163 bra $L12; setp.ne.u64 %r166,%r208,%r204; @ %r166 bra $L28; setp.gt.u64 %r169,%r207,%r203; @ %r169 bra $L12; $L28: mov.u64 %r27,%r211; mov.u64 %r63,%r23; mov.u64 %r103,%r41; $L14: .loc 1 133 35 add.u64 %r171,%r103,1; st.u64 [%r63],%r171; shr.s64 %r172,%r171,63; st.u64 [%r63+8],%r172; .loc 1 132 10 add.u64 %r63,%r63,%r38; setp.eq.u64 %r173,%r27,%r89; @ %r173 bra $L26; .loc 1 133 31 ld.u64 %r103,[%r27]; add.u64 %r27,%r27,8; bra $L14; $L26: mov.u64 %r207,%r203; mov.u64 %r208,%r204; $L12: .loc 1 135 12 add.u64 %r60,%r60,%r49; .loc 1 137 17 add.u64 %r41,%r41,1; .loc 1 137 10 st.u64 [%frame+240],%r41; setp.ne.u64 %r174,%r41,%r51; @ %r174 bra $L15; bra $L38; $L25: mov.u64 %r50,%r107; mov.u64 %r61,%r62; $L11: .loc 1 141 11 ld.u64 %r205,[%r61]; ld.u64 %r206,[%r61+8]; .loc 1 141 10 setp.gt.s64 %r177,%r206,%r208; @ %r177 bra $L29; setp.ne.u64 %r180,%r206,%r208; @ %r180 bra $L17; setp.gt.u64 %r183,%r205,%r207; @ ! 
%r183 bra $L17; $L29: mov.u64 %r104,%r211; mov.u64 %r105,%r23; mov.u64 %r84,%r50; $L19: .loc 1 145 35 add.u64 %r185,%r84,1; st.u64 [%r105],%r185; shr.s64 %r186,%r185,63; st.u64 [%r105+8],%r186; .loc 1 144 4 add.u64 %r105,%r105,%r38; setp.eq.u64 %r187,%r89,%r104; @ %r187 bra $L27; .loc 1 145 31 ld.u64 %r84,[%r104]; add.u64 %r104,%r104,8; bra $L19; $L27: mov.u64 %r207,%r205; mov.u64 %r208,%r206; $L17: .loc 1 149 9 add.u64 %r61,%r61,%r49; .loc 1 151 14 add.u64 %r50,%r50,1; .loc 1 151 7 st.u64 [%frame+240],%r50; setp.ne.u64 %r188,%r50,%r51; @ %r188 bra $L11; .loc 1 149 9 sub.u64 %r189,%r51,%r107; mad.lo.u64 %r62,%r189,%r49,%r62; bra $L20; $L38: .loc 1 135 12 sub.u64 %r191,%r51,%r107; mad.lo.u64 %r62,%r191,%r49,%r62; $L20: mov.u64 %r77,%r211; .loc 1 151 34 mov.u64 %r59,%r51; mov.u64 %r57,%r47; .loc 1 152 9 mov.u64 %r65,0; $L22: .loc 1 157 13 st.u64 [%r77+-8],%r214; .loc 1 160 23 mul.lo.u64 %r54,%r57,%r59; mov.u64 %r101,%r65; .loc 1 161 5 add.u64 %r65,%r65,1; .loc 1 162 7 setp.eq.u64 %r195,%r101,%r100; @ %r195 bra $L1; .loc 1 170 16 ld.u64 %r196,[%r77]; add.u64 %r56,%r196,1; st.u64 [%r77],%r56; shl.b64 %r68,%r65,3; .loc 1 171 23 add.u64 %r197,%frame,%r68; ld.u64 %r57,[%r197]; .loc 1 171 13 sub.u64 %r198,%r57,%r54; shl.b64 %r199,%r198,4; add.u64 %r62,%r62,%r199; .loc 1 174 32 add.u64 %r201,%r215,%r68; ld.u64 %r59,[%r201]; .loc 1 174 7 add.u64 %r77,%r77,8; setp.eq.u64 %r202,%r56,%r59; @ %r202 bra $L22; bra $L23; $L_gfortran_mmaxloc0_16_i16 .visible .func _gfortran_mmaxloc0_16_i16reg .u64 %stack; mov.u64 %stack,0480u64 %r98predpredpredpred %r272; .reg .predpredu64u64 %r302; .reg .u64mov.u64 %r163,%ar0; mov.u64 %r164,%ar1; mov.u64 %r165,%ar2; mov.u32 %r166,%ar3; .loc 1 202 6 setp.ne.u64 %r167,%r165,0; @ %r167 bra $L4016r1166; call _gfortran_maxloc0_16_i16.loc 1 205 7 bra $L39; $L40: .loc 1 208 10 ld.s8 %r22,[%r164+28]; .loc 1 209 6 setp.gt.s32 %r171,%r22,0; @ %r171 bra $L42; .loc 1 210 5 cvta.const.u64 %r172172_gfortran_runtime_error42: .loc 1 212 6 ld.u64 %r174,[%r163]; setp.ne.u64 %r175,%r174,0; @ %r175 bra $L43; .loc 1 214 7 st.u64 [%r163+48],%r174; add.u32 %r177,%r22,-1; cvt.s64.s32 %r178,%r177; st.u64 [%r163+56],%r178; mov.u64 %r179,1; st.u64 [%r163+40],%r179; .loc 1 215 28 cvt.u32.u64 %r180,%r179; st.u8 [%r163+28],%r180; .loc 1 216 24 st.u64 [%r163+8],%r174; .loc 1 217 29 cvt.u32.u32 %r185,%r22; cvt.s64.s8 %r184,%r185; mov.u64 %r183,16call (%value_in),_gfortrani_xmallocarray186,[%value_in]; } .loc 1 217 27 st.u64 [%r163],%r186; bra $L44; $L43: .loc 1 221 11 cvta.global.u64 %r188,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r189,[%r188+36]; setp.eq.u32 %r190,%r189,0r164193; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r196193; call _gfortrani_bounds_equal_extents$L44: .loc 1 231 15 ld.u64 %r31,[%r165+16]; .loc 1 233 9 ld.u64 %r90,[%r165]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r198,%r32,-4; and.b32 %r199,%r198,-5; set.u32.eq.u32 %r201,%r199,0; neg.s32 %r202,%r201; .loc 1 235 22 add.u32 %r203,%r32,-1; set.u32.le.u32 %r205,%r203,1; neg.s32 %r206,%r205; .loc 1 235 6 cvt.u16.u32 %r208,%r202; cvt.u16.u32 %r209,%r206; or.b16 %r207,%r208,%r209;45; .loc 1 237 7 setp.ne.u32 %r214,%r32,16; @ %r214 bra $L46; $L45: .loc 1 244 11 ld.u64 %r102,[%r163+40]; .loc 1 245 8 ld.u64 %r103,[%r163]; .loc 1 246 17 cvt.u32.u32 %r215,%r22; cvt.s64.s8 %r65,%r215; add.u64 %r161,%r164,40; add.u64 %r306,%frame,120; mov.u64 %r159,%r306; add.u64 %r158,%r165,40; mov.u64 %r154,%frame; add.u64 %r155,%frame,240; add.u64 %r151,%frame,360; mov.u64 %r150,%r151; .loc 1 246 10 mov.u64 
%r29,0; .loc 1 251 16 mov.u64 %r224,%r29; bra $L47; $L46: .loc 1 242 5_gfortran_runtime_error70: mov.u64 %r29,%r109; $L47: .loc 1 248 18 ld.u64 %r218,[%r161]; st.u64 [%r159],%r218; .loc 1 249 20 ld.u64 %r220,[%r158]; mul.lo.u64 %r219,%r220,%r31; .loc 1 249 18 st.u64 [%r154],%r219; .loc 1 250 19 ld.u64 %r222,[%r161+16]; add.u64 %r221,%r222,1; ld.u64 %r223,[%r161+8]; sub.u64 %r44,%r221,%r223; .loc 1 250 17 st.u64 [%r155],%r44; .loc 1 251 16 st.u64 [%r150],%r224; .loc 1 252 10 setp.gt.s64 %r225,%r44,0; @ %r225 bra $L48; shl.b64 %r144,%r102,4; mov.u64 %r133,%r103; .loc 1 255 11 mov.u64 %r110,0; .loc 1 256 24 mov.u64 %r226,%r110; $L49: st.u64 [%r133],%r226; st.u64 [%r133+8],%r226; .loc 1 255 27 add.u64 %r110,%r110,1; .loc 1 255 4 add.u64 %r133,%r133,%r144; setp.ne.u64 %r228,%r65,%r110; @ %r228 bra $L49; bra $L39; $L48: .loc 1 246 26 add.u64 %r109,%r29,1; .loc 1 246 3 add.u64 %r161,%r161,24; add.u64 %r159,%r159,8; add.u64 %r158,%r158,24; add.u64 %r154,%r154,8; add.u64 %r155,%r155,8; add.u64 %r150,%r150,8; setp.ne.u64 %r229,%r65,%r109; @ %r229 bra $L70; .loc 1 261 8 ld.u64 %r88,[%r164]; shl.b64 %r112,%r102,4; mov.u64 %r52,%r103; .loc 1 264 10 mov.u64 %r108,0; .loc 1 265 23 mov.u64 %r230,%r108; $L51: st.u64 [%r52],%r230; st.u64 [%r52+8],%r230; mov.u64 %r140,%r108; .loc 1 264 26 add.u64 %r108,%r108,1; .loc 1 264 3 add.u64 %r52,%r52,%r112; setp.ne.u64 %r232,%r29,%r140; @ %r232 bra $L51; .loc 1 276 9 setp.eq.u64 %r233,%r88,0; @ %r233 bra $L39; .loc 1 331 19 ld.u64 %r68,[%frame+120]; .loc 1 331 9 shl.b64 %r70,%r68,4; .loc 1 332 20 ld.u64 %r71,[%frame]; .loc 1 334 34 ld.u64 %r74,[%frame+240]; ld.u64 %r160,[%frame+360]; cvt.u32.u32 %r235,%r22; cvt.s64.s8 %r234,%r235; shl.b64 %r236,%r234,3; add.u64 %r122,%r236,%r151; .loc 1 274 12 mov.u64 %r303,0; mov.u64 %r304,-9223372036854775808; setp.eq.u32 %r307,1,0; add.u64 %r305,%frame,368; .loc 1 340 13 mov.u64 %r312,%r303; .loc 1 359 32 add.u64 %r313,%frame,240; .loc 1 308 12 setp.eq.u32 %r315,%r166,0; $L69: .loc 1 280 10 @ %r307 bra $L52; mov.u64 %r53,%r160; mov.u32 %r46,0; mov.u32 %r309,1; $L57: mov.u64 %r45,%r88; .loc 1 300 13 add.u64 %r88,%r45,%r70; .loc 1 284 11 ld.s8 %r239,[%r90]; cvt.u16.u32 %r238,%r239; setp.eq.u16 %r240,%r238,0; @ %r240 bra $L53; setp.eq.u32 %r241,%r46,0; selp.u64 %r160,%r160,%r53,%r241; .loc 1 294 16 ld.u64 %r303,[%r45]; ld.u64 %r304,[%r45+8]; mov.u64 %r131,%r305; mov.u64 %r145,%r103; mov.u64 %r157,%r160; $L56: .loc 1 296 33 add.u64 %r243,%r157,1; st.u64 [%r145],%r243; shr.s64 %r244,%r243,63; st.u64 [%r145+8],%r244; .loc 1 295 9 add.u64 %r145,%r145,%r112; setp.eq.u64 %r245,%r122,%r131; @ %r245 bra $L71; .loc 1 296 29 ld.u64 %r157,[%r131]; add.u64 %r131,%r131,8; bra $L56; $L53: .loc 1 301 14 add.u64 %r90,%r90,%r71; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r46,%r309; .loc 1 303 4 setp.ne.u64 %r246,%r53,%r74; @ %r246 bra $L57; bra $L97; $L52: .loc 1 308 12 @ %r315 bra $L72; mov.u64 %r61,%r160; mov.u64 %r86,%r88; $L63: .loc 1 311 11 ld.s8 %r249,[%r90]; cvt.u16.u32 %r248,%r249; setp.eq.u16 %r250,%r248,0; @ %r250 bra $L60; .loc 1 311 22 ld.u64 %r299,[%r86]; ld.u64 %r300,[%r86+8]; .loc 1 311 19 setp.gt.s64 %r253,%r304,%r300; @ %r253 bra $L60; setp.ne.u64 %r256,%r304,%r300; @ %r256 bra $L75; setp.gt.u64 %r259,%r303,%r299; @ %r259 bra $L60; $L75: mov.u64 %r97,%r305; mov.u64 %r107,%r103; mov.u64 %r153,%r61; .loc 1 314 19 mov.u64 %r141,%r312; $L62: .loc 1 315 36 add.u64 %r261,%r153,1; st.u64 [%r107],%r261; shr.s64 %r262,%r261,63; st.u64 [%r107+8],%r262; .loc 1 314 35 add.u64 %r106,%r141,1; .loc 1 314 12 add.u64 %r107,%r107,%r112; setp.le.s64 
%r263,%r29,%r141; @ %r263 bra $L73; .loc 1 315 32 ld.u64 %r153,[%r97]; add.u64 %r97,%r97,8; mov.u64 %r141,%r106; bra $L62; $L73: mov.u64 %r303,%r299; mov.u64 %r304,%r300; $L60: .loc 1 317 13 add.u64 %r86,%r86,%r70; .loc 1 319 11 add.u64 %r61,%r61,1; .loc 1 319 4 st.u64 [%frame+360],%r61; setp.ne.u64 %r264,%r61,%r74; @ %r264 bra $L63; .loc 1 317 13 sub.u64 %r265,%r74,%r160; mad.lo.u64 %r88,%r265,%r70,%r88; bra $L64; $L72: mov.u64 %r73,%r160; mov.u64 %r89,%r90; mov.u64 %r87,%r88; $L59: .loc 1 323 11 ld.s8 %r268,[%r89]; cvt.u16.u32 %r267,%r268; setp.eq.u16 %r269,%r267,0; @ %r269 bra $L65; .loc 1 323 22 ld.u64 %r301,[%r87]; ld.u64 %r302,[%r87+8]; .loc 1 323 19 setp.gt.s64 %r272,%r302,%r304; @ %r272 bra $L76; setp.ne.u64 %r275,%r302,%r304; @ %r275 bra $L65; setp.gt.u64 %r278,%r301,%r303; @ ! %r278 bra $L65; $L76: mov.u64 %r62,%r305; mov.u64 %r91,%r103; mov.u64 %r149,%r73; .loc 1 326 12 mov.u64 %r142,%r312; $L67: .loc 1 327 36 add.u64 %r280,%r149,1; st.u64 [%r91],%r280; shr.s64 %r281,%r280,63; st.u64 [%r91+8],%r281; .loc 1 326 28 add.u64 %r105,%r142,1; .loc 1 326 5 add.u64 %r91,%r91,%r112; setp.le.s64 %r282,%r29,%r142; @ %r282 bra $L74; .loc 1 327 32 ld.u64 %r149,[%r62]; add.u64 %r62,%r62,8; mov.u64 %r142,%r105; bra $L67; $L74: mov.u64 %r303,%r301; mov.u64 %r304,%r302; $L65: .loc 1 331 9 add.u64 %r87,%r87,%r70; .loc 1 332 10 add.u64 %r89,%r89,%r71; .loc 1 334 14 add.u64 %r73,%r73,1; .loc 1 334 7 st.u64 [%frame+360],%r73; setp.ne.u64 %r283,%r73,%r74; @ %r283 bra $L59; sub.u64 %r128,%r74,%r160; .loc 1 331 9 mad.lo.u64 %r88,%r70,%r128,%r88; .loc 1 332 10 mad.lo.u64 %r90,%r71,%r128,%r90; bra $L64; $L97: st.u64 [%frame+360],%r74; $L64: mov.u64 %r120,%r305; mov.u64 %r83,%r71; .loc 1 334 34 mov.u64 %r85,%r74; mov.u64 %r81,%r68; mov.u64 %r119,8; .loc 1 335 9 mov.u64 %r93,0; $L68: .loc 1 340 13 st.u64 [%r120+-8],%r312; .loc 1 343 23 mul.lo.u64 %r76,%r81,%r85; .loc 1 344 24 mul.lo.u64 %r78,%r85,%r83; mov.u64 %r98,%r93; .loc 1 345 5 add.u64 %r93,%r93,1; .loc 1 346 7 setp.eq.u64 %r288,%r29,%r98; @ %r288 bra $L39; .loc 1 354 16 ld.u64 %r289,[%r120]; add.u64 %r80,%r289,1; st.u64 [%r120],%r80; .loc 1 355 23 add.u64 %r291,%r306,%r119; ld.u64 %r81,[%r291]; .loc 1 355 13 sub.u64 %r292,%r81,%r76; shl.b64 %r293,%r292,4; add.u64 %r88,%r88,%r293; .loc 1 356 24 add.u64 %r294,%frame,%r119; ld.u64 %r83,[%r294]; .loc 1 356 14 sub.u64 %r295,%r83,%r78; add.u64 %r90,%r90,%r295; .loc 1 359 32 add.u64 %r297,%r313,%r119; ld.u64 %r85,[%r297]; .loc 1 359 7 add.u64 %r120,%r120,8; add.u64 %r119,%r119,8; setp.eq.u64 %r298,%r80,%r85; @ %r298 bra $L68; ld.u64 %r160,[%frame+360]; bra $L69; $L71: mov.u64 %r88,%r45; setp.eq.u32 %r307,1,1; bra $L52; $L39: .loc 1 _gfortran_smaxloc0_16_i16 .visible .func _gfortran_smaxloc0_16_i16u32pred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 379 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L99; .loc 1 379 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L100; $L99: .loc 1 381 7 {_gfortran_maxloc0_16_i16.loc 1 382 7 bra $L98; $L100: .loc 1 385 8 ld.s8 %r30,[%r36+28]; .loc 1 387 6 setp.gt.s64 %r46,%r30,0; @ %r46 bra $L102_gfortran_runtime_error102: .loc 1 390 15 ld.u64 %r24,[%r35]; .loc 1 390 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L103; .loc 1 392 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 393 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 394 24 st.u64 [%r35+8],%r24; .loc 1 395 29call (%value_in),_gfortrani_xmallocarray95 27 st.u64 [%r35],%r24; bra $L104; $L103: .loc 1 397 12 cvta.global.u64 
%r61,_gfortrani_compile_options; .loc 1 397 11 ld.u32 %r62,[%r61+36.loc 1 399 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 404 8 ld.u64 %r24,[%r35]; $L104: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,4; mov.u64 %r29,%r24; .loc 1 405 10 mov.u64 %r32,0; .loc 1 406 23 mov.u64 %r68,%r32; $L105: st.u64 [%r29],%r68; st.u64 [%r29+8],%r68; .loc 1 405 24 add.u64 %r32,%r32,1; .loc 1 405 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r70,%r30,%r32; @ %r70 bra $L105; $L98: .loc 1 407 1 ret; } maxloc0_4_r4.o/_gfortran_maxloc0_4_r4 .visible .func _gfortran_maxloc0_4_libgfortran/generated/maxloc0_4_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_4_r4 .visible .func _gfortran_mmaxloc0_4_r4_gfortran_smaxloc0_4_r4 .visible .func _gfortran_smaxloc0_4_r4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,65,88,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_4_r4 .visible .func _gfortran_maxloc0_4_stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[368.reg .u64predpredpredu32 %r176; .reg .pred %r177; .reg .pred %r178; .reg .pred %r180; .reg .u32 %r182; .reg .u32 %r183; .reg .pred64pred %r208; .reg .u32 %r210; .reg .u64 %r211; .reg .u64 %r212; .reg .pred50 10 ld.s8 %r22,[%r128+28]; .loc 1 50 8 cvt.u32.u32 %r130,%r22; cvt.s64.s8 %r79,%r130; .loc 1 51 6 setp.gt.s64 %r131,%r79,0; @ %r131 bra $L2; .loc 1 52 5 cvta.const.u64 %r1321stack; call _gfortran_runtime_errorr23,[%r127]; .loc 1 54 6 setp.ne.u64 %r134,%r23,0; @ %r134 bra $L3; .loc 1 56 7 st.u64 [%r127+48],%r23; add.u64 %r136,%r79,-1; st.u64 [%r127+56],%r136; mov.u64 %r137,1; st.u64 [%r127+40],%r137; .loc 1 57 28 cvt.u32.u64 %r138,%r137; st.u8 [%r127+28],%r138; .loc 1 58 24 st.u64 [%r127+8],%r23; .loc 1 59 2979call (%value_in),_gfortrani_xmallocarray4,[%value_in]; } mov.u64 %r23,%r144; .loc 1 59 27 st.u64 [%r127],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r146,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r147,[%r146+36]; setp.eq.u32 %r148,%r147,0; @ %r148 bra $L4; .loc 1 64 2r12151; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r127]; $L4: .loc 1 68 11 ld.u64 %r80,[%r127+40]; add.u64 %r44,%r128,40; mov.u64 %r37,%frame; add.u64 %r47,%frame,120; .loc 1 70 10 mov.u64 %r103,0; add.u64 %r206,%frame,240; .loc 1 74 16 mov.u64 %r159,%r103; bra $L7; $L32: mov.u64 %r103,%r88; $L7: .loc 1 72 18 ld.u64 %r152,[%r44]; st.u64 [%r37],%r152; .loc 1 73 19 ld.u64 %r154,[%r44+16]; add.u64 %r153,%r154,1; ld.u64 %r155,[%r44+8]; sub.u64 %r31,%r153,%r155; .loc 1 73 17 st.u64 [%r47],%r31; .loc 1 74 16 shl.b64 %r157,%r103,3; add.u64 %r158,%r206,%r157; st.u64 [%r158],%r159; .loc 1 75 10 setp.gt.s64 %r160,%r31,0; @ %r160 bra $L5; shl.b64 %r122,%r80,2; mov.u64 %r101,%r23; .loc 1 78 11 mov.u64 %r89,0; .loc 1 79 24 cvt.u32.u64 %r161,%r89; $L6: st.u32 [%r101],%r161; .loc 1 78 27 add.u64 %r89,%r89,1; .loc 1 78 4 add.u64 %r101,%r101,%r122; setp.ne.u64 %r162,%r79,%r89; @ %r162 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r88,%r103,1; .loc 1 70 3 add.u64 %r44,%r44,24; add.u64 %r37,%r37,8; add.u64 %r47,%r47,8; setp.ne.u64 
%r163,%r79,%r88; @ %r163 bra $L32; .loc 1 84 8 ld.u64 %r67,[%r128]; shl.b64 %r108,%r80,2; mov.u64 %r106,%r23; .loc 1 87 10 mov.u64 %r87,0; .loc 1 88 23 mov.u32 %r164,1; $L8: st.u32 [%r106],%r164; mov.u64 %r104,%r87; .loc 1 87 26 add.u64 %r87,%r87,1; .loc 1 87 3 add.u64 %r106,%r106,%r108; setp.ne.u64 %r165,%r103,%r104; @ %r165 bra $L8; .loc 1 101 9 setp.eq.u64 %r166,%r67,0; @ %r166 bra $L1; .loc 1 149 19 ld.u64 %r51,[%frame]; .loc 1 149 9 shl.b64 %r53,%r51,2; .loc 1 151 34 ld.u64 %r56,[%frame+120]; ld.u64 %r123,[%frame+240]; .loc 1 141 11 ld.f32 %r125,[%r67]; cvt.u32.u32 %r168,%r22; cvt.s64.s8 %r167,%r168; shl.b64 %r169,%r167,3; add.u64 %r113,%r206,%r169; .loc 1 97 12 mov.f32 %r73,0fff800000; setp.eq.u32 %r208,1,0; add.u64 %r207,%frame,248; .loc 1 157 13 mov.u64 %r211,0; .loc 1 174 32 add.u64 %r212,%frame,120; .loc 1 126 12 setp.eq.u32 %r214,%r129,0; $L30: .loc 1 106 10 @ %r208 bra $L10; mov.u64 %r38,%r123; mov.u32 %r85,0; .loc 1 110 12 mov.u32 %r210,1; $L17: .loc 1 118 13 add.u64 %r126,%r67,%r53; .loc 1 110 11 setp.le.f32 %r172,%r73,%r125; @ ! %r172 bra $L51; setp.eq.u32 %r173,%r85,0; selp.u64 %r123,%r123,%r38,%r173; mov.u64 %r90,%r207; mov.u64 %r95,%r23; mov.u64 %r119,%r123; $L15: .loc 1 115 36 cvt.u32.u64 %r176,%r119; add.u32 %r175,%r176,1; .loc 1 115 25 st.u32 [%r95],%r175; .loc 1 114 5 add.u64 %r95,%r95,%r108; setp.eq.u64 %r177,%r90,%r113; @ %r177 bra $L33; .loc 1 115 32 ld.u64 %r119,[%r90]; add.u64 %r90,%r90,8; bra $L15; $L51: .loc 1 120 11 add.u64 %r38,%r38,1; .loc 1 120 4 setp.eq.u64 %r178,%r38,%r56; @ %r178 bra $L16; .loc 1 110 12 ld.f32 %r125,[%r126]; mov.u64 %r67,%r126; mov.u32 %r85,%r210; bra $L17; $L10: .loc 1 126 12 @ %r214 bra $L34; mov.u64 %r46,%r123; mov.u64 %r65,%r67; $L24: .loc 1 129 11 setp.le.f32 %r180,%r73,%r125; @ ! %r180 bra $L19; mov.u64 %r84,%r207; mov.u64 %r39,%r23; mov.u64 %r116,%r46; $L21: .loc 1 133 35 cvt.u32.u64 %r183,%r116; add.u32 %r182,%r183,1; .loc 1 133 24 st.u32 [%r39],%r182; .loc 1 132 10 add.u64 %r39,%r39,%r108; setp.eq.u64 %r184,%r84,%r113; @ %r184 bra $L35; .loc 1 133 31 ld.u64 %r116,[%r84]; add.u64 %r84,%r84,8; bra $L21; $L35: .loc 1 131 18 mov.f32 %r73,%r125; $L19: .loc 1 135 12 add.u64 %r65,%r65,%r53; .loc 1 137 17 add.u64 %r46,%r46,1; .loc 1 137 10 st.u64 [%frame+240],%r46; setp.ne.u64 %r185,%r46,%r56; @ %r185 bra $L22; .loc 1 135 12 sub.u64 %r186,%r56,%r123; mad.lo.u64 %r67,%r186,%r53,%r67; bra $L23; $L22: .loc 1 129 12 ld.f32 %r125,[%r65]; bra $L24; $L34: mov.u64 %r55,%r123; mov.u64 %r66,%r67; $L18: .loc 1 141 10 setp.lt.f32 %r188,%r73,%r125; @ ! 
%r188 bra $L25; mov.u64 %r117,%r207; mov.u64 %r120,%r23; mov.u64 %r110,%r55; $L27: .loc 1 145 35 cvt.u32.u64 %r191,%r110; add.u32 %r190,%r191,1; .loc 1 145 24 st.u32 [%r120],%r190; .loc 1 144 4 add.u64 %r120,%r120,%r108; setp.eq.u64 %r192,%r113,%r117; @ %r192 bra $L36; .loc 1 145 31 ld.u64 %r110,[%r117]; add.u64 %r117,%r117,8; bra $L27; $L36: .loc 1 143 18 mov.f32 %r73,%r125; $L25: .loc 1 149 9 add.u64 %r66,%r66,%r53; .loc 1 151 14 add.u64 %r55,%r55,1; .loc 1 151 7 st.u64 [%frame+240],%r55; setp.ne.u64 %r193,%r55,%r56; @ %r193 bra $L28; .loc 1 149 9 sub.u64 %r194,%r56,%r123; mad.lo.u64 %r67,%r194,%r53,%r67; bra $L23; $L28: .loc 1 141 11 ld.f32 %r125,[%r66]; bra $L18; $L16: st.u64 [%frame+240],%r56; .loc 1 118 13 mov.u64 %r67,%r126; $L23: mov.u64 %r76,%r207; .loc 1 151 34 mov.u64 %r63,%r56; mov.u64 %r61,%r51; .loc 1 152 9 mov.u64 %r71,0; $L29: .loc 1 157 13 st.u64 [%r76+-8],%r211; .loc 1 160 23 mul.lo.u64 %r58,%r61,%r63; mov.u64 %r105,%r71; .loc 1 161 5 add.u64 %r71,%r71,1; .loc 1 162 7 setp.eq.u64 %r198,%r105,%r103; @ %r198 bra $L1; .loc 1 170 16 ld.u64 %r199,[%r76]; add.u64 %r60,%r199,1; st.u64 [%r76],%r60; shl.b64 %r69,%r71,3; .loc 1 171 23 add.u64 %r200,%frame,%r69; ld.u64 %r61,[%r200]; .loc 1 171 13 sub.u64 %r201,%r61,%r58; shl.b64 %r202,%r201,2; add.u64 %r67,%r67,%r202; .loc 1 174 32 add.u64 %r204,%r212,%r69; ld.u64 %r63,[%r204]; .loc 1 174 7 add.u64 %r76,%r76,8; setp.eq.u64 %r205,%r60,%r63; @ %r205 bra $L29; ld.u64 %r123,[%frame+240]; .loc 1 141 11 ld.f32 %r125,[%r67]; bra $L30; $L33: mov.f32 %r73,%r125; setp.eq.u32 %r208,1,1; bra $L10; $L_gfortran_mmaxloc0_4_r4 .visible .func _gfortran_mmaxloc0_4_r4reg .u64 %stack; mov.u64 %stack,0480100pred %r172; .reg .predpredpred %r195; .reg .u64 %r198; .reg .u64u32 %r211; .reg .u16pred %r217; .reg .predpredu16 %r257; .reg .u32 %r258; .reg .pred %r259; .reg .pred %r260; .reg .u32 %r262; .reg .u32 %r263; .reg .pred %r264; .reg .pred %r265; .reg .u64 %r266; .reg .u16 %r268; .reg .u32 %r269; .reg .pred %r270; .reg .pred %r271; .reg .u32 %r273; .reg .u32 %r274; .reg .pred %r275; .reg .pred %r276; .reg .pred %r281; .reg .u64 %r282; .reg .u64pred %r294; .reg .u64 %r296; .reg .u64 %r297; .reg .pred %r299; mov.u64 %r168,%ar0; mov.u64 %r169,%ar1; mov.u64 %r170,%ar2; mov.u32 %r171,%ar3; .loc 1 202 6 setp.ne.u64 %r172,%r170,0; @ %r172 bra $Lcall _gfortran_maxloc0_4_r4.loc 1 205 7 bra $L52; $L53: .loc 1 208 10 ld.s8 %r22,[%r169+28]; .loc 1 209 6 setp.gt.s32 %r176,%r22,0; @ %r176 bra $L55; .loc 1 210 5 cvta.const.u64 %r177_gfortran_runtime_error5: .loc 1 212 6 ld.u64 %r179,[%r168]; setp.ne.u64 %r180,%r179,0; @ %r180 bra $L56; .loc 1 214 7 st.u64 [%r168+48],%r179; add.u32 %r182,%r22,-1; cvt.s64.s32 %r183,%r182; st.u64 [%r168+56],%r183; mov.u64 %r184,1; st.u64 [%r168+40],%r184; .loc 1 215 28 cvt.u32.u64 %r185,%r184; st.u8 [%r168+28],%r185; .loc 1 216 24 st.u64 [%r168+8],%r179; .loc 1 217 29 cvt.u32.u32 %r190,%r22; cvt.s64.s8 %r189,%r1call (%value_in),_gfortrani_xmallocarray191,[%value_in]; } .loc 1 217 27 st.u64 [%r168],%r191; bra $L57; $L56: .loc 1 221 11 cvta.global.u64 %r193,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r194,[%r193+36]; setp.eq.u32 %r195,%r194,0r169198; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r2011716920198; call _gfortrani_bounds_equal_extents$L57: .loc 1 231 15 ld.u64 %r31,[%r170+16]; .loc 1 233 9 ld.u64 %r94,[%r170]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r203,%r32,-4; and.b32 %r204,%r203,-5;.loc 1 235 22 add.u32 %r208,%r32,-1; set.u32.le.u32 %r210,%r208,1; neg.s32 %r211,%r210; .loc 1 
235 6 cvt.u16.u32 %r213,%r207; cvt.u16.u32 %r214,%r211; or.b16 %r212,%r213,%r214; cvt.u32.u16 %r215,%r212; cvt.u16.u8 %r216,%r215; setp.ne.u16 %r217,%r216,0; @ %r217 bra $L58; .loc 1 237 7 setp.ne.u32 %r219,%r32,16; @ %r219 bra $L59; $L58: .loc 1 244 11 ld.u64 %r112,[%r168+40]; .loc 1 245 8 ld.u64 %r113,[%r168]; .loc 1 246 17 cvt.u32.u32 %r220,%r22; cvt.s64.s8 %r135,%r220; add.u64 %r166,%r169,40; add.u64 %r292,%frame,120; mov.u64 %r163,%r292; add.u64 %r160,%r170,40; mov.u64 %r159,%frame; add.u64 %r156,%frame,240; add.u64 %r152,%frame,360; mov.u64 %r154,%r152; .loc 1 246 10 mov.u64 %r120,0; .loc 1 251 16 mov.u64 %r229,%r120; bra $L60; $L59: .loc 1 242 5_gfortran_runtime_error248 18 ld.u64 %r223,[%r166]; st.u64 [%r163],%r223; .loc 1 249 20 ld.u64 %r225,[%r160]; mul.lo.u64 %r224,%r225,%r31; .loc 1 249 18 st.u64 [%r159],%r224; .loc 1 250 19 ld.u64 %r227,[%r166+16]; add.u64 %r226,%r227,1; ld.u64 %r228,[%r166+8]; sub.u64 %r45,%r226,%r228; .loc 1 250 17 st.u64 [%r156],%r45; .loc 1 251 16 st.u64 [%r154],%r229; .loc 1 252 10 setp.gt.s64 %r230,%r45,0; @ %r230 bra $L61; shl.b64 %r131,%r112,2; mov.u64 %r133,%r113; .loc 1 255 11 mov.u64 %r121,0; .loc 1 256 24 cvt.u32.u64 %r231,%r121; $L62: st.u32 [%r133],%r231; .loc 1 255 27 add.u64 %r121,%r121,1; .loc 1 255 4 add.u64 %r133,%r133,%r131; setp.ne.u64 %r232,%r121,%r135; @ %r232 bra $L62; bra $L52; $L61: .loc 1 246 26 add.u64 %r120,%r120,1; .loc 1 246 3 add.u64 %r166,%r166,24; add.u64 %r163,%r163,8; add.u64 %r160,%r160,24; add.u64 %r159,%r159,8; add.u64 %r156,%r156,8; add.u64 %r154,%r154,8; setp.ne.u64 %r233,%r120,%r135; @ %r233 bra $L60; .loc 1 261 8 ld.u64 %r91,[%r169]; shl.b64 %r124,%r112,2; mov.u64 %r122,%r113; .loc 1 264 10 mov.u64 %r119,0; .loc 1 265 23 cvt.u32.u64 %r234,%r119; $L64: st.u32 [%r122],%r234; .loc 1 264 26 add.u64 %r119,%r119,1; .loc 1 264 3 add.u64 %r122,%r122,%r124; setp.ne.u64 %r235,%r119,%r135; @ %r235 bra $L64; .loc 1 276 9 setp.eq.u64 %r236,%r91,0; @ %r236 bra $L52; .loc 1 331 19 ld.u64 %r71,[%frame+120]; .loc 1 331 9 shl.b64 %r73,%r71,2; .loc 1 332 20 ld.u64 %r74,[%frame]; .loc 1 334 34 ld.u64 %r77,[%frame+240]; .loc 1 334 21 ld.u64 %r164,[%frame+360]; cvt.u32.u32 %r238,%r22; cvt.s64.s8 %r237,%r238; shl.b64 %r239,%r237,3; add.u64 %r96,%r239,%r152; .loc 1 272 12 mov.f32 %r102,0fff800000; setp.eq.u32 %r294,1,0; add.u64 %r293,%frame,368; .loc 1 340 13 mov.u64 %r296,0; .loc 1 359 32 add.u64 %r297,%frame,240; .loc 1 308 12 setp.eq.u32 %r299,%r171,0; $L83: .loc 1 280 10 @ %r294 bra $L65; $L72: .loc 1 284 11 ld.s8 %r242,[%r94]; cvt.u16.u32 %r241,%r242; setp.eq.u16 %r243,%r241,0; @ %r243 bra $L66; .loc 1 287 8 ld.u32 %r244,[%r113]; setp.ne.u32 %r245,%r244,0; @ %r245 bra $L67; mov.u64 %r101,%r293; mov.u64 %r104,%r113; mov.u64 %r157,%r164; $L68: .loc 1 289 38 cvt.u32.u64 %r248,%r157; add.u32 %r247,%r248,1; .loc 1 289 27 st.u32 [%r104],%r247; .loc 1 288 7 add.u64 %r104,%r104,%r124; setp.eq.u64 %r249,%r96,%r101; @ %r249 bra $L67; .loc 1 289 34 ld.u64 %r157,[%r101]; add.u64 %r101,%r101,8; bra $L68; $L67: .loc 1 290 9 ld.f32 %r52,[%r91]; .loc 1 290 8 setp.ge.f32 %r250,%r52,%r102; @ ! 
%r250 bra $L66; mov.u64 %r128,%r293; mov.u64 %r129,%r113; mov.u64 %r161,%r164; $L71: .loc 1 296 33 cvt.u32.u64 %r253,%r161; add.u32 %r252,%r253,1; .loc 1 296 22 st.u32 [%r129],%r252; .loc 1 295 9 add.u64 %r129,%r129,%r124; setp.eq.u64 %r254,%r96,%r128; @ %r254 bra $L84; .loc 1 296 29 ld.u64 %r161,[%r128]; add.u64 %r128,%r128,8; bra $L71; $L66: .loc 1 300 13 add.u64 %r91,%r91,%r73; .loc 1 301 14 add.u64 %r94,%r94,%r74; .loc 1 303 11 add.u64 %r164,%r164,1; .loc 1 303 4 st.u64 [%frame+360],%r164; setp.ne.u64 %r255,%r164,%r77; @ %r255 bra $L72; bra $L73; $L65: .loc 1 308 12 @ %r299 bra $L85; mov.u64 %r64,%r164; mov.u64 %r89,%r91; $L78: .loc 1 311 11 ld.s8 %r258,[%r94]; cvt.u16.u32 %r257,%r258; setp.eq.u16 %r259,%r257,0; @ %r259 bra $L75; .loc 1 311 22 ld.f32 %r60,[%r89]; .loc 1 311 19 setp.ge.f32 %r260,%r60,%r102; @ ! %r260 bra $L75; mov.u64 %r67,%r293; mov.u64 %r92,%r113; mov.u64 %r155,%r64; .loc 1 314 19 mov.u64 %r117,0; $L77: .loc 1 315 36 cvt.u32.u64 %r263,%r155; add.u32 %r262,%r263,1; .loc 1 315 25 st.u32 [%r92],%r262; .loc 1 314 35 add.u64 %r117,%r117,1; .loc 1 314 12 add.u64 %r92,%r92,%r124; setp.ge.s64 %r264,%r117,%r135; @ %r264 bra $L86; .loc 1 315 32 ld.u64 %r155,[%r67]; add.u64 %r67,%r67,8; bra $L77; $L86: .loc 1 313 19 mov.f32 %r102,%r60; $L75: .loc 1 317 13 add.u64 %r89,%r89,%r73; .loc 1 319 11 add.u64 %r64,%r64,1; .loc 1 319 4 st.u64 [%frame+360],%r64; setp.ne.u64 %r265,%r64,%r77; @ %r265 bra $L78; .loc 1 317 13 sub.u64 %r266,%r77,%r164; mad.lo.u64 %r91,%r266,%r73,%r91; bra $L73; $L85: mov.u64 %r76,%r164; mov.u64 %r93,%r94; mov.u64 %r90,%r91; $L74: .loc 1 323 11 ld.s8 %r269,[%r93]; cvt.u16.u32 %r268,%r269; setp.eq.u16 %r270,%r268,0; @ %r270 bra $L79; .loc 1 323 22 ld.f32 %r66,[%r90]; .loc 1 323 19 setp.gt.f32 %r271,%r66,%r102; @ ! %r271 bra $L79; mov.u64 %r46,%r293; mov.u64 %r56,%r113; mov.u64 %r151,%r76; .loc 1 326 12 mov.u64 %r115,0; $L81: .loc 1 327 36 cvt.u32.u64 %r274,%r151; add.u32 %r273,%r274,1; .loc 1 327 25 st.u32 [%r56],%r273; .loc 1 326 28 add.u64 %r115,%r115,1; .loc 1 326 5 add.u64 %r56,%r56,%r124; setp.ge.s64 %r275,%r115,%r135; @ %r275 bra $L87; .loc 1 327 32 ld.u64 %r151,[%r46]; add.u64 %r46,%r46,8; bra $L81; $L87: .loc 1 325 12 mov.f32 %r102,%r66; $L79: .loc 1 331 9 add.u64 %r90,%r90,%r73; .loc 1 332 10 add.u64 %r93,%r93,%r74; .loc 1 334 14 add.u64 %r76,%r76,1; .loc 1 334 7 st.u64 [%frame+360],%r76; setp.ne.u64 %r276,%r76,%r77; @ %r276 bra $L74; sub.u64 %r138,%r77,%r164; .loc 1 331 9 mad.lo.u64 %r91,%r73,%r138,%r91; .loc 1 332 10 mad.lo.u64 %r94,%r74,%r138,%r94; $L73: mov.u64 %r125,%r293; .loc 1 325 12 mov.u64 %r86,%r74; .loc 1 334 34 mov.u64 %r88,%r77; mov.u64 %r84,%r71; mov.u64 %r118,8; .loc 1 335 9 mov.u64 %r100,0; $L82: .loc 1 340 13 st.u64 [%r125+-8],%r296; .loc 1 343 23 mul.lo.u64 %r79,%r84,%r88; .loc 1 344 24 mul.lo.u64 %r81,%r88,%r86; .loc 1 345 5 add.u64 %r100,%r100,1; .loc 1 346 7 setp.eq.u64 %r281,%r100,%r135; @ %r281 bra $L52; .loc 1 354 16 ld.u64 %r282,[%r125]; add.u64 %r83,%r282,1; st.u64 [%r125],%r83; .loc 1 355 23 add.u64 %r284,%r292,%r118; ld.u64 %r84,[%r284]; .loc 1 355 13 sub.u64 %r285,%r84,%r79; shl.b64 %r286,%r285,2; add.u64 %r91,%r91,%r286; .loc 1 356 24 add.u64 %r287,%frame,%r118; ld.u64 %r86,[%r287]; .loc 1 356 14 sub.u64 %r288,%r86,%r81; add.u64 %r94,%r94,%r288; .loc 1 359 32 add.u64 %r290,%r297,%r118; ld.u64 %r88,[%r290]; .loc 1 359 7 add.u64 %r125,%r125,8; add.u64 %r118,%r118,8; setp.eq.u64 %r291,%r83,%r88; @ %r291 bra $L82; .loc 1 334 21 ld.u64 %r164,[%frame+360]; bra $L83; $L84: .loc 1 290 9 mov.f32 %r102,%r52; setp.eq.u32 %r294,1,1; 
[PTX listings (nvptx target): libgfortran MAXLOC support routines generated from libgfortran/generated/maxloc0_<ikind>_<rtype>.c. This span covers the end of archive member maxloc0_4_r4.o, then maxloc0_8_r4.o, maxloc0_16_r4.o, maxloc0_4_r8.o, maxloc0_8_r8.o, and the opening declarations of maxloc0_16_r8.o. Each member defines three .visible .func entry points, _gfortran_maxloc0_<ikind>_<rtype> (unmasked), _gfortran_mmaxloc0_<ikind>_<rtype> (array MASK), and _gfortran_smaxloc0_<ikind>_<rtype> (scalar MASK), and references the extern symbols _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return, and _gfortrani_bounds_equal_extents, together with the runtime error strings "Rank of array needs to be > 0", a MAXLOC/MASK argument message, and "Funny sized logical array". The listings allocate or validate the integer result array, then scan the source array with strided loops, keeping the running maximum (initialised to -infinity) and storing the 1-based position of the first element that exceeds it.]
[Archive member maxloc0_16_r8.o, built from ../../../libgfortran/generated/maxloc0_16_r8.c: declarations and PTX bodies of _gfortran_maxloc0_16_r8, _gfortran_mmaxloc0_16_r8 and _gfortran_smaxloc0_16_r8; extern references to _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return and _gfortrani_bounds_equal_extents; string constants including "Rank of array needs to be > 0", "MAXLOC", "MASK argument" and "Funny sized logical array".]
[Archive members maxloc0_4_r10.o and maxloc0_8_r16.o follow as further members of the same archive.]
[Archive member maxloc1_4_i1.o, built from ../../../libgfortran/generated/maxloc1_4_i1.c: declarations and PTX bodies of _gfortran_maxloc1_4_i1, _gfortran_mmaxloc1_4_i1 and _gfortran_smaxloc1_4_i1; extern references to _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents; MAXLOC error-message string constants, including "rank of return array incorrect in MAXLOC intrinsic: is %ld, should be %ld", "Incorrect extent in return value of MAXLOC intrinsic in dimension %ld: is %ld, should be %ld", "Funny sized logical array" and "MASK argument".]
[Archive member maxloc1_8_i1.o, built from ../../../libgfortran/generated/maxloc1_8_i1.c: declarations of _gfortran_maxloc1_8_i1, _gfortran_mmaxloc1_8_i1 and _gfortran_smaxloc1_8_i1, the same extern references and MAXLOC error-message string constants, and the PTX body of _gfortran_maxloc1_8_i1, which is cut off at this point in the extract.]
$L36: .loc 1 201 13 st.u64 [%r147],%r352; .loc 1 204 23 mul.lo.u64 %r331,%r88,%r87; .loc 1 204 9 sub.u64 %r119,%r122,%r331; .loc 1 205 23 mul.lo.u64 %r332,%r90,%r87; .loc 1 205 9 shl.b64 %r333,%r332,3; sub.u64 %r120,%r123,%r333; .loc 1 206 5 add.u64 %r121,%r121,1; .loc 1 207 7 add.u64 %r147,%r147,8; add.u64 %r109,%r109,8; setp.eq.u64 %r334,%r121,%r355; @ %r334 bra $L1; $L35: .loc 1 215 16 ld.u64 %r335,[%r147]; add.u64 %r87,%r335,1; st.u64 [%r147],%r87; .loc 1 216 23 add.u64 %r337,%r356,%r109; ld.u64 %r88,[%r337]; .loc 1 216 13 add.u64 %r122,%r119,%r88; .loc 1 217 23 add.u64 %r338,%frame,%r109; ld.u64 %r90,[%r338]; .loc 1 217 13 shl.b64 %r339,%r90,3; add.u64 %r123,%r120,%r339; .loc 1 197 32 add.u64 %r341,%r357,%r109; ld.u64 %r93,[%r341]; .loc 1 197 13 setp.eq.u64 %r342,%r87,%r93; @ %r342 bra $L36; .loc 1 201 13 mov.u64 %r48,0; bra $L25; $L51: .loc 1 90 6 ld.u64 %r343,[%r176]; setp.eq.u64 %r344,%r343,0; @ ! %r344 bra $L11; bra $L37; $L1:_gfortran_mmaxloc1_8_i1 .visible .func _gfortran_mmaxloc1_8_, .param .u32 %in_ar4) {608101; .reg .u64 %r106; .reg .u64.reg .u64u64 %r214; .reg .u64u32 %r227; .reg .u16 %r229; .reg .u16u16 %r233; .reg .pred %r234; .reg .u64u64pred646464pred %r335; .reg .u64 %r336; .reg .u32 %r337; .reg .u64u64 %r342; .reg .u32 %r343; .reg .u64pred %r372; .reg .u64 %r373; .reg .u64 %r375; .reg .u32 %r376; .reg .predu32pred %r397; .reg .pred %r398; .reg .pred %r401; .reg .pred %r402; .reg .predpredpred %r414; .reg .pred %r415; .reg .predpredpredpredmov.u64 %r212,%ar0; mov.u64 %r213,%ar1; mov.u64 %r214,%ar2; mov.u64 %r215,%ar3; mov.u32 %r216,%ar4; .loc 1 251 6 setp.ne.u64 %r217,%r215,0; @ %r217 bra $L121321216; call _gfortran_maxloc1_8_7 bra $L52; $L53: .loc 1 261 10 ld.u64 %r22,[%r214]; .loc 1 261 7 add.u64 %r142,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r213+28]; .loc 1 262 38 add.u32 %r222,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r143,%r222; .loc 1 265 7 shr.u64 %r224,%r142,63; set.u32.gt.s64 %r226,%r142,%r143; neg.s32 %r227,%r226; cvt.u16.u64 %r230,%r224; cvt.u16.u32 %r231,%r227; or.b16 %r229,%r230,%r231; .loc 1 265 6 cvt.u32.u16 %r232,%r229; cvt.u16.u8 %r233,%r232; setp.eq.u16 %r234,%r233,0; @ %r234 bra $L55; .loc 1 267 7 cvt.u32.u32 %r237,%r23; cvt.s64.s8 %r236,%r237; st.u64 [%stack+8],%r236; st.u64 [%stack],%r22; cvta.const.u64 %r235235_gfortran_runtime_error5: .loc 1 272 9 add.u64 %r447,%r142,%r142; add.u64 %r448,%r447,%r142; shl.b64 %r242,%r448,3; add.u64 %r243,%r213,%r242; ld.u64 %r246,[%r243+56]; add.u64 %r245,%r246,1; .loc 1 272 7 ld.u64 %r253,[%r243+48]; sub.u64 %r144,%r245,%r253; .loc 1 273 6 setp.le.s64 %r254,%r144,0; @ %r254 bra $L52; .loc 1 276 9 ld.u64 %r160,[%r215]; .loc 1 278 15 ld.u64 %r34,[%r215+16]; .loc 1 280 22 cvt.u32.u64 %r35,%r34; .loc 1 280 53 add.u32 %r255,%r35,-4; and.b32 %r256,%r255,-5; set.u32.eq.u32 %r258,%r256,0; neg.s32 %r259,%r258; .loc 1 280 22 add.u32 %r260,%r35,-1; set.u32.le.u32 %r262,%r260,1; neg.s32 %r263,%r262; .loc 1 280 6 cvt.u16.u32 %r265,%r259; cvt.u16.u32 %r266,%r263; or.loc 1 282 7 setp.ne.u32 %r271,%r35,16; @ %r271 bra $L58; $L57: .loc 1 289 9 shl.b64 %r275,%r448,3; add.u64 %r276,%r213,%r275; ld.u64 %r147,[%r276+40]; .loc 1 290 12 add.u64 %r282,%r215,%r275; ld.u64 %r38,[%r282+40]; .loc 1 292 3 setp.ne.u64 %r284,%r142,0; @ %r284 bra $L59; $L65: .loc 1 302 3 setp.lt.s64 %r285,%r142,%r143; @ %r285 bra $L60; bra $L119; $L58: .loc 1 287 5 cvta.const.u64 %r286286_gfortran_runtime_error9: add.u64 %r99,%r213,40; add.u64 %r62,%r215,40; add.u64 %r289,%r22,%r22; add.u64 %r290,%r289,%r22; shl.b64 %r291,%r290,3; add.u64 %r292,%r213,16; add.u64 
%r88,%r291,%r292; .loc 1 292 3 mov.u64 %r40,0; add.u64 %r445,%frame,240; add.u64 %r443,%frame,360; .loc 1 299 12 mov.u64 %r458,%r40; $L64: .loc 1 294 18 add.u64 %r294,%r445,%r40; ld.u64 %r295,[%r99]; st.u64 [%r294],%r295; .loc 1 295 18 add.u64 %r296,%frame,%r40; .loc 1 295 20 ld.u64 %r298,[%r62]; mul.lo.u64 %r297,%r298,%r34; .loc 1 295 18 st.u64 [%r296],%r297; .loc 1 296 19 ld.u64 %r300,[%r99+16]; add.u64 %r299,%r300,1; ld.u64 %r301,[%r99+8]; sub.u64 %r50,%r299,%r301; .loc 1 298 10 setp.lt.s64 %r302,%r50,0; @ %r302 bra $L62; .loc 1 296 17 add.u64 %r304,%r443,%r40; st.u64 [%r304],%r50; bra $L63; $L62: .loc 1 299 12 add.u64 %r306,%r443,%r40; st.u64 [%r306],%r458; $L63: .loc 1 292 3 add.u64 %r99,%r99,24; add.u64 %r62,%r62,24; add.u64 %r40,%r40,8; setp.ne.u64 %r308,%r88,%r99; @ %r308 bra $L64; bra $L65; $L119: .loc 1 312 6 ld.u64 %r309,[%r212]; setp.eq.u64 %r310,%r309,0; @ ! %r310 bra $L67; bra $L66; $L60: add.u64 %r312,%r22,%r22; add.u64 %r313,%r312,%r22; shl.b64 %r314,%r313,3; add.u64 %r122,%r314,40; add.u64 %r125,%r213,%r122; add.u64 %r106,%r215,%r122; shl.b64 %r315,%r22,3; add.u64 %r59,%r315,-8; cvt.u32.u32 %r317,%r23; cvt.s64.s8 %r316,%r317; shl.b64 %r318,%r316,3; add.u64 %r200,%r318,-8; add.u64 %r445,%frame,240; add.u64 %r443,%frame,360; .loc 1 309 12 mov.u64 %r457,0; $L70: .loc 1 304 18 add.u64 %r320,%r445,%r59; ld.u64 %r321,[%r125]; st.u64 [%r320],%r321; .loc 1 305 18 add.u64 %r322,%frame,%r59; .loc 1 305 20 ld.u64 %r324,[%r106]; mul.lo.u64 %r323,%r324,%r34; .loc 1 305 18 st.u64 [%r322],%r323; .loc 1 306 19 ld.u64 %r326,[%r125+16]; add.u64 %r325,%r326,1; ld.u64 %r327,[%r125+8]; sub.u64 %r61,%r325,%r327; .loc 1 308 10 setp.lt.s64 %r328,%r61,0; @ %r328 bra $L68; .loc 1 306 17 add.u64 %r330,%r443,%r59; st.u64 [%r330],%r61; bra $L69; $L68: .loc 1 309 12 add.u64 %r332,%r443,%r59; st.u64 [%r332],%r457; $L69: .loc 1 302 3 add.u64 %r125,%r125,24; add.u64 %r106,%r106,24; add.u64 %r59,%r59,8; setp.ne.u64 %r334,%r59,%r200; @ %r334 bra $L70; bra $L120; $L66: .loc 1 316 7 setp.eq.u64 %r335,%r143,0; @ %r335 bra $L72; add.u64 %r443,%frame,360; $L97: add.u64 %r139,%r212,40; mov.u64 %r137,%r443; cvt.u32.u32 %r337,%r23; cvt.s64.s8 %r336,%r337; add.u64 %r339,%r336,%r336; add.u64 %r340,%r339,%r336; shl.b64 %r341,%r340,3; add.u64 %r342,%r212,16; add.u64 %r128,%r341,%r342; .loc 1 319 10 mov.u64 %r126,1; .loc 1 323 4 mov.u64 %r361,0; bra $L73; $L72: .loc 1 327 20 add.u32 %r343,%r23,-2; cvt.s64.s32 %r70,%r343; add.u64 %r345,%r70,%r70; add.u64 %r346,%r345,%r70; shl.b64 %r347,%r346,3; add.u64 %r348,%r212,%r347; .loc 1 327 67 shl.b64 %r350,%r70,3; add.u64 %r351,%frame,%r350; .loc 1 327 59 ld.u64 %r353,[%r348+40]; ld.u64 %r354,[%r351+360]; mul.lo.u64 %r150,%r353,%r354; .loc 1 329 24 mov.u64 %r355,0; st.u64 [%r212+8],%r355; .loc 1 330 28 cvt.u16.u32 %r358,%r23; add.u16 %r357,%r358,-1; cvt.u32.u16 %r359,%r357; st.u8 [%r212+28],%r359; .loc 1 332 10 setp.eq.u64 %r360,%r150,0; @ %r360 bra $L74; bra $L121; $L76: .loc 1 321 47 mul.lo.u64 %r126,%r67,%r126; $L73: .loc 1 323 4 st.u64 [%r139+8],%r361; ld.u64 %r67,[%r137]; add.u64 %r362,%r67,-1; st.u64 [%r139+16],%r362; st.u64 [%r139],%r126; .loc 1 316 7 add.u64 %r139,%r139,24; add.u64 %r137,%r137,8; setp.ne.u64 %r363,%r128,%r139; @ %r363 bra $L76; bra $L72; $L74: .loc 1 335 4 st.u64 [%r212+48],%r150; mov.u64 %r365,-1; st.u64 [%r212+56],%r365; mov.u64 %r366,1; st.u64 [%r212+40],%r366; .loc 1 336 4 bra $L52; $L121: .loc 1 339 24 mov.u64 %r368368; call (%value_in),_gfortrani_xmallocarray69,[%value_in]; } .loc 1 339 22 st.u64 [%r212],%r369; bra $L77; $L67: .loc 1 344 19 ld.s8 
%r371,[%r212+28]; .loc 1 344 10 setp.eq.u64 %r372,%r371,%r143; @ %r372 bra $L78; .loc 1 345 2 cvta.const.u64 %r373373_gfortran_runtime_error78: .loc 1 347 11 cvta.global.u64 %r375,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r376,[%r375+36]; setp.eq.u32 %r377,%r376,0; @ %r377 bra $L77; .loc 1 349 4 add.u64 %r443,%frame,360; cvta.const.u64 %r381,$LC2r443380381; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r385,$LC38381; call _gfortrani_bounds_equal_extents$L77: .loc 1 356 3 setp.ne.u64 %r387,%r143,0; @ %r387 bra $L79; $L82: .loc 1 364 8 ld.u64 %r161,[%r212]; .loc 1 365 8 ld.u64 %r159,[%r213]; .loc 1 367 9 setp.ne.u64 %r388,%r159,0; @ %r388 bra $L80; bra $L52; $L79: add.u64 %r187,%frame,480; add.u64 %r184,%r212,40; add.u64 %r179,%frame,120; cvt.u32.u32 %r390,%r23; cvt.s64.s8 %r389,%r390; add.u64 %r155,%r389,-1; .loc 1 356 3 mov.u64 %r162,0; add.u64 %r443,%frame,360; .loc 1 358 16 mov.u64 %r391,%r162; $L81: st.u64 [%r187],%r391; .loc 1 359 18 ld.u64 %r392,[%r184]; st.u64 [%r179],%r392; .loc 1 360 17 shl.b64 %r394,%r162,3; add.u64 %r395,%r443,%r394; .loc 1 360 10 ld.u64 %r396,[%r395]; setp.le.s64 %r397,%r396,0; @ %r397 bra $L52; .loc 1 356 26 add.u64 %r162,%r162,1; .loc 1 356 3 add.u64 %r187,%r187,8; add.u64 %r184,%r184,24; add.u64 %r179,%r179,8; setp.ne.u64 %r398,%r155,%r162; @ %r398 bra $L81; bra $L82; $L80: .loc 1 290 12 mul.lo.u64 %r41,%r38,%r34; ld.u64 %r186,[%frame+480]; .loc 1 430 22 ld.u64 %r92,[%frame+240]; .loc 1 431 23 ld.u64 %r94,[%frame]; .loc 1 432 22 ld.u64 %r96,[%frame+120]; .loc 1 432 12 shl.b64 %r98,%r96,3; setp.le.s64 %r442,%r143,1; .loc 1 408 9 setp.ne.u32 %r450,%r216,0; cvt.u32.u32 %r451,%r23; cvt.s64.s8 %r452,%r451; add.u64 %r453,%r452,-1; add.u64 %r454,%frame,240; add.u64 %r455,%frame,360; add.u64 %r456,%frame,120; $L98: .loc 1 356 3 mov.u64 %r154,%r160; mov.u64 %r153,%r159; .loc 1 386 9 mov.u64 %r133,0; $L86: .loc 1 389 7 ld.s8 %r205,[%r154]; mov.u64 %r183,%r133; .loc 1 386 24 add.u64 %r133,%r133,1; .loc 1 389 6 setp.eq.u32 %r444,%r205,0; @ %r444 bra $L83; .loc 1 397 11 ld.s8 %r136,[%r153]; .loc 1 408 9 @ %r450 bra $L84; bra $L122; $L83: .loc 1 386 32 add.u64 %r153,%r153,%r147; .loc 1 386 47 add.u64 %r154,%r154,%r41; .loc 1 386 2 setp.ne.u64 %r401,%r144,%r133; @ %r401 bra $L86; .loc 1 385 9 mov.u64 %r133,0; bra $L87; $L122: .loc 1 418 8 setp.gt.s64 %r402,%r144,%r183; @ %r402 bra $L92; bra $L87; $L84: .loc 1 409 8 setp.le.s64 %r403,%r144,%r183; @ %r403 bra $L87; $L90: .loc 1 414 16 add.u64 %r183,%r183,1; .loc 1 411 8 @ %r444 bra $L89; .loc 1 411 18 ld.s8 %r85,[%r153]; .loc 1 411 15 cvt.u16.u32 %r405,%r85; cvt.u16.u32 %r406,%r136; setp.ge.s16 %r407,%r405,%r406; selp.u32 %r136,%r85,%r136,%r407; selp.u64 %r133,%r183,%r133,%r407; $L89: .loc 1 409 33 add.u64 %r153,%r153,%r147; .loc 1 409 48 add.u64 %r154,%r154,%r41; .loc 1 409 8 setp.eq.u64 %r408,%r144,%r183; @ %r408 bra $L87; .loc 1 411 9 ld.s8 %r205,[%r154]; setp.eq.u32 %r444,%r205,0; bra $L90; $L92: .loc 1 423 16 add.u64 %r183,%r183,1; .loc 1 420 8 ld.s8 %r410,[%r154]; cvt.u16.u32 %r409,%r410; setp.eq.u16 %r411,%r409,0; @ %r411 bra $L91; .loc 1 420 18 ld.s8 %r90,[%r153]; .loc 1 420 15 cvt.u16.u32 %r412,%r90; cvt.u16.u32 %r413,%r136; setp.gt.s16 %r414,%r412,%r413; selp.u32 %r136,%r90,%r136,%r414; selp.u64 %r133,%r183,%r133,%r414; $L91: .loc 1 418 33 add.u64 %r153,%r153,%r147; .loc 1 418 48 add.u64 %r154,%r154,%r41; .loc 1 418 8 setp.ne.u64 %r415,%r144,%r183; @ %r415 bra $L92; $L87: .loc 1 426 8 st.u64 [%r161],%r133; .loc 1 429 15 add.u64 %r186,%r186,1; .loc 1 430 12 add.u64 %r159,%r159,%r92; .loc 1 431 13 
add.u64 %r160,%r160,%r94; .loc 1 432 12 add.u64 %r161,%r161,%r98; .loc 1 434 32 ld.u64 %r29,[%frame+360]; .loc 1 434 13 setp.ne.u64 %r416,%r29,%r186; @ %r416 bra $L98; .loc 1 441 23 mul.lo.u64 %r101,%r186,%r92; .loc 1 442 24 mul.lo.u64 %r417,%r186,%r94; .loc 1 442 10 sub.u64 %r156,%r160,%r417; .loc 1 443 23 mul.lo.u64 %r418,%r186,%r96; .loc 1 443 9 shl.b64 %r419,%r418,3; sub.u64 %r157,%r161,%r419; .loc 1 445 7 @ %r442 bra $L52; add.u64 %r197,%frame,488; mov.u64 %r196,8; .loc 1 444 5 mov.u64 %r158,1; .loc 1 438 13 mov.u64 %r449,0; bra $L95; $L96: st.u64 [%r197],%r449; .loc 1 441 23 mul.lo.u64 %r101,%r113,%r112; .loc 1 442 24 mul.lo.u64 %r425,%r115,%r112; .loc 1 442 10 sub.u64 %r156,%r160,%r425; .loc 1 443 23 mul.lo.u64 %r426,%r117,%r112; .loc 1 443 9 shl.b64 %r427,%r426,3; sub.u64 %r157,%r161,%r427; .loc 1 444 5 add.u64 %r158,%r158,1; .loc 1 445 7 add.u64 %r197,%r197,8; add.u64 %r196,%r196,8; setp.eq.u64 %r428,%r158,%r453; @ %r428 bra $L52; $L95: .loc 1 453 16 ld.u64 %r429,[%r197]; add.u64 %r112,%r429,1; st.u64 [%r197],%r112; .loc 1 454 23 add.u64 %r431,%r454,%r196; ld.u64 %r113,[%r431]; .loc 1 454 13 sub.u64 %r432,%r113,%r101; add.u64 %r159,%r159,%r432; .loc 1 455 24 add.u64 %r433,%frame,%r196; ld.u64 %r115,[%r433]; .loc 1 455 14 add.u64 %r160,%r156,%r115; .loc 1 456 23 add.u64 %r435,%r456,%r196; ld.u64 %r117,[%r435]; .loc 1 456 13 shl.b64 %r436,%r117,3; add.u64 %r161,%r157,%r436; .loc 1 434 32 add.u64 %r438,%r455,%r196; ld.u64 %r121,[%r438]; .loc 1 434 13 setp.eq.u64 %r439,%r112,%r121; @ %r439 bra $L96; .loc 1 438 13 mov.u64 %r186,0; bra $L98; $L120: .loc 1 312 6 ld.u64 %r440,[%r212]; setp.eq.u64 %r441,%r440,0; @ ! %r441 bra $L67; bra $L97; $L52_gfortran_smaxloc1_8_i1 .visible .func _gfortran_smaxloc1_8_, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L124; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L125; $L124:r152154; call _gfortran_maxloc1_8_490 7 bra $L123; $L125: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r296,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L127; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L128; $L134: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L129; bra $L182; $L127: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error128: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r302,0; $L133: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L131; .loc 1 505 17 st.u64 [%r139],%r36; bra $L132; $L131: .loc 1 508 12 st.u64 [%r139],%r302; $L132: 
.loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L133; bra $L134; $L182: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! %r191 bra $L136; bra $L135; $L129: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r301,0; $L139: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L137; .loc 1 513 17 st.u64 [%r43],%r41; bra $L138; $L137: .loc 1 517 12 st.u64 [%r43],%r301; $L138: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L139; bra $L183; $L135: .loc 1 524 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L141; add.u64 %r295,%frame,120; $L161: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L142; $L141: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L143; bra $L184; $L145: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L142: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L145; bra $L141; $L143: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L123; $L184: .loc 1 547 24 mov.u64 %r24696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L149: .loc 1 573 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L146; bra $L147; $L136: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L148; .loc 1 552 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error148: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L149; .loc 1 559 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L147; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L151: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L150; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; 
st.u64 [%stack],%r104;_gfortran_runtime_error150: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L151; $L146: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ %r268 bra $L152; shl.b64 %r266,%r296,3; bra $L153; $L152: mov.u64 %r266,8; $L153: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L154: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L154; $L147: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,3; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 583 13 mov.u64 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L160: st.u64 [%r98],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L160; .loc 1 596 7 @ ! %r294 bra $L185; bra $L123; $L159: .loc 1 591 13 st.u64 [%r99],%r280; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L158; bra $L123; $L185: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; $L158: .loc 1 600 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,3; add.u64 %r98,%r98,%r291; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L159; .loc 1 591 13 mov.u64 %r80,0; bra $L160; $L183: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r293,%r98,0; @ ! 
%r293 bra $L136; bra $L161; $L123: .loc 1 605 1 ret; } maxloc1_16_i1.o/_gfortran_maxloc1_16_i1 .visible .func _gfortran_maxloc1_16_i1.file 1 "../../../libgfortran/generated/maxloc1_16_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_16_i1 .visible .func _gfortran_mmaxloc1_16__gfortran_smaxloc1_16_i1 .visible .func _gfortran_smaxloc1_16_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_maxloc1_16_i1 .visible .func _gfortran_maxloc1_16_i1480pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64predpred %r344; .reg .u64 %r350; .reg .u64 %r351; .reg .u64 %r352; .reg .predmov.u64 %r178,%ar0; mov.u64 %r179,%ar1; mov.u64 %r180,%ar2; mov.u32 %r181,%ar3; .loc 1 58 10 ld.s8 %r22,[%r179+28]; .loc 1 58 38 add.u32 %r182,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r107,%r182; .loc 1 59 10 ld.u64 %r25,[%r180]; .loc 1 59 7 add.u64 %r108,%r25,-1; .loc 1 61 7 shr.u64 %r184,%r108,63; set.u32.lt.s64 %r186,%r107,%r108; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 61 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 63 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r108,%r108; add.u64 %r201,%r200,%r108; shl.b64 %r202,%r201,3; add.u64 %r203,%r179,%r202; ld.u64 %r31,[%r203+56]; ld.u64 %r33,[%r203+48]; .loc 1 71 9 ld.u64 %r111,[%r203+40]; .loc 1 73 3 setp.ne.u64 %r217,%r108,0; @ %r217 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r218,%r107,%r108; @ %r218 bra $L4; bra $L49; $L3: add.u64 %r126,%r179,40; add.u64 %r173,%frame,120; add.u64 %r171,%frame,240; add.u64 %r219,%r179,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r68,%r219,%r223; .loc 1 79 12 mov.u64 %r386,0; $L8: .loc 1 75 18 ld.u64 %r224,[%r126]; st.u64 [%r173],%r224; .loc 1 76 19 ld.u64 %r226,[%r126+16]; add.u64 %r225,%r226,1; ld.u64 %r227,[%r126+8]; sub.u64 %r38,%r225,%r227; .loc 1 78 10 
setp.lt.s64 %r228,%r38,0; @ %r228 bra $L6; .loc 1 76 17 st.u64 [%r171],%r38; bra $L7; $L6: .loc 1 79 12 st.u64 [%r171],%r386; $L7: .loc 1 73 3 add.u64 %r126,%r126,24; add.u64 %r173,%r173,8; add.u64 %r171,%r171,8; setp.ne.u64 %r230,%r68,%r126; @ %r230 bra $L8; bra $L9; $L49: .loc 1 90 6 ld.u64 %r231,[%r178]; setp.eq.u64 %r232,%r231,0; @ ! %r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r128,%r179,%r237; shl.b64 %r238,%r25,3; add.u64 %r99,%r238,-8; add.u64 %r374,%frame,120; add.u64 %r103,%r374,%r99; add.u64 %r373,%frame,240; add.u64 %r94,%r373,%r99; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r179,40; add.u64 %r106,%r246,%r247; .loc 1 87 12 mov.u64 %r385,0; $L14: .loc 1 83 18 ld.u64 %r248,[%r128]; st.u64 [%r103],%r248; .loc 1 84 19 ld.u64 %r250,[%r128+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r128+8]; sub.u64 %r45,%r249,%r251; .loc 1 86 10 setp.lt.s64 %r252,%r45,0; @ %r252 bra $L12; .loc 1 84 17 st.u64 [%r94],%r45; bra $L13; $L12: .loc 1 87 12 st.u64 [%r94],%r385; $L13: .loc 1 81 3 add.u64 %r128,%r128,24; add.u64 %r103,%r103,8; add.u64 %r94,%r94,8; setp.ne.u64 %r254,%r106,%r128; @ %r254 bra $L14; bra $L50; $L10: .loc 1 94 7 setp.le.s64 %r255,%r107,0; @ %r255 bra $L16; add.u64 %r373,%frame,240; $L38: add.u64 %r147,%r178,40; mov.u64 %r141,%r373; add.u64 %r256,%r178,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r130,%r256,%r262; .loc 1 97 10 mov.u64 %r96,1; .loc 1 101 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r263,0; st.u64 [%r178+8],%r263; .loc 1 106 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r178+28],%r267; .loc 1 108 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r56,%r268; add.u64 %r270,%r56,%r56; add.u64 %r271,%r270,%r56; shl.b64 %r272,%r271,3; add.u64 %r273,%r178,%r272; .loc 1 108 67 shl.b64 %r275,%r56,3; add.u64 %r276,%frame,%r275; .loc 1 108 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r113,%r278,%r279; .loc 1 110 29 mov.u64 %r281,16281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 110 27 st.u64 [%r178],%r282; .loc 1 111 10 setp.eq.u64 %r284,%r113,0; @ ! 
%r284 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r96,%r52,%r96; $L17: .loc 1 101 4 st.u64 [%r147+8],%r285; ld.u64 %r52,[%r141]; add.u64 %r286,%r52,-1; st.u64 [%r147+16],%r286; st.u64 [%r147],%r96; .loc 1 94 7 add.u64 %r147,%r147,2420; bra $L16; $L18: .loc 1 114 4 st.u64 [%r178+48],%r113; mov.u64 %r289,-1; st.u64 [%r178+56],%r289; mov.u64 %r290,1; st.u64 [%r178+40],%r290; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r62,[%r178+28]; .loc 1 121 10 setp.eq.u64 %r291,%r62,%r107; @ %r291 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r107; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r294,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r295,[%r294+36]; setp.eq.u32 %r296,%r295,0; @ %r296 bra $L21; .loc 1 128 2 add.u64 %r373,%frame,240; cvta.const.u64 %r300,$LC2r373299300; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r302,%r107,0; @ %r302 bra $L24; $L27: .loc 1 68 9 add.u64 %r303,%r31,1; .loc 1 68 7 sub.u64 %r109,%r303,%r33; max.s64 %r46,%r109,0; .loc 1 140 8 ld.u64 %r121,[%r179]; .loc 1 141 8 ld.u64 %r122,[%r178]; ld.u64 %r74,[%frame+360]; .loc 1 194 22 ld.u64 %r70,[%frame+120]; .loc 1 195 22 ld.u64 %r72,[%frame]; .loc 1 195 12 shl.b64 %r73,%r72,4; .loc 1 197 32 ld.u64 %r102,[%frame+240]; .loc 1 204 23 mul.lo.u64 %r304,%r70,%r102; .loc 1 204 9 neg.s64 %r133,%r304; .loc 1 205 23 mul.lo.u64 %r305,%r72,%r102; .loc 1 205 9 shl.b64 %r306,%r305,4; neg.s64 %r139,%r306; setp.gt.s64 %r375,%r109,0; setp.le.s64 %r376,%r107,1; .loc 1 180 31 mov.u64 %r378,1; setp.eq.u32 %r379,%r181,0; cvt.u32.u32 %r380,%r22; cvt.s64.s8 %r381,%r380; add.u64 %r382,%r381,-1; add.u64 %r383,%frame,120; add.u64 %r384,%frame,240; bra $L25; $L24: add.u64 %r160,%frame,360; add.u64 %r159,%r178,40; mov.u64 %r157,%frame; cvt.u32.u32 %r308,%r22; cvt.s64.s8 %r307,%r308; add.u64 %r148,%r307,-1; .loc 1 132 3 mov.u64 %r123,0; add.u64 %r373,%frame,240; .loc 1 134 16 mov.u64 %r309,%r123; $L26: st.u64 [%r160],%r309; .loc 1 135 18 ld.u64 %r310,[%r159]; st.u64 [%r157],%r310; .loc 1 136 17 shl.b64 %r312,%r123,3; add.u64 %r313,%r373,%r312; .loc 1 136 10 ld.u64 %r314,[%r313]; setp.le.s64 %r315,%r314,0; @ %r315 bra $L1; .loc 1 132 26 add.u64 %r123,%r123,1; .loc 1 132 3 add.u64 %r160,%r160,8; add.u64 %r159,%r159,24; add.u64 %r157,%r157,8; setp.ne.u64 %r316,%r123,%r148; @ %r316 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r375 bra $L28; .loc 1 159 10 mov.u64 %r318,0; st.u64 [%r122],%r318; st.u64 [%r122+8],%r318; bra $L29; $L28: .loc 1 180 31 mov.u64 %r116,%r121; mov.u64 %r366,%r378; mov.u64 %r367,0; .loc 1 155 9 mov.u32 %r100,-128; .loc 1 157 9 mov.u64 %r364,1; mov.u64 %r365,%r367; $L33: .loc 1 182 14 ld.s8 %r177,[%r116]; .loc 1 182 7 @ %r379 bra $L30; .loc 1 182 6 cvt.u16.u32 %r321,%r100; cvt.u16.u32 %r322,%r177; setp.le.s16 %r323,%r321,%r322; @ %r323 bra $L31; bra $L32; $L30: cvt.u16.u32 %r324,%r100; cvt.u16.u32 %r325,%r177; setp.ge.s16 %r326,%r324,%r325; @ %r326 bra $L32; $L31: .loc 1 185 14 mov.u64 %r364,%r366; mov.u64 %r365,%r367; .loc 1 182 14 mov.u32 %r100,%r177; $L32: .loc 1 180 31 add.u64 %r116,%r116,%r111; .loc 1 180 6 add.u64 %r329,%r366,%r378; set.u32.lt.u64 %r334,%r329,%r366; cvt.s64.s32 %r332,%r334; mov.u64 %r366,%r329; sub.u64 %r367,%r367,%r332; add.u64 %r339,%r366,-1; setp.gt.s64 %r341,%r46,%r339; @ %r341 bra $L33; .loc 1 189 12 st.u64 [%r122],%r364; st.u64 [%r122+8],%r365; $L29: .loc 1 193 15 add.u64 %r74,%r74,1; .loc 1 194 12 add.u64 %r121,%r121,%r70; .loc 1 195 12 add.u64 %r122,%r122,%r73; .loc 1 197 13 setp.ne.u64 %r344,%r74,%r102; @ %r344 bra $L25; .loc 1 204 9 
add.u64 %r118,%r121,%r133; .loc 1 205 9 add.u64 %r119,%r122,%r139; .loc 1 207 7 @ %r376 bra $L1; add.u64 %r129,%frame,368; mov.u64 %r143,8; .loc 1 206 5 mov.u64 %r120,1; .loc 1 201 13 mov.u64 %r377,0; bra $L36; $L37: st.u64 [%r129],%r377; .loc 1 204 23 mul.lo.u64 %r350,%r85,%r84; .loc 1 204 9 sub.u64 %r118,%r121,%r350; .loc 1 205 23 mul.lo.u64 %r351,%r87,%r84; .loc 1 205 9 shl.b64 %r352,%r351,4; sub.u64 %r119,%r122,%r352; .loc 1 206 5 add.u64 %r120,%r120,1; .loc 1 207 7 add.u64 %r129,%r129,8; add.u64 %r143,%r143,8; setp.eq.u64 %r353,%r120,%r382; @ %r353 bra $L1; $L36: .loc 1 215 16 ld.u64 %r354,[%r129]; add.u64 %r84,%r354,1; st.u64 [%r129],%r84; .loc 1 216 23 add.u64 %r356,%r383,%r143; ld.u64 %r85,[%r356]; .loc 1 216 13 add.u64 %r121,%r118,%r85; .loc 1 217 23 add.u64 %r357,%frame,%r143; ld.u64 %r87,[%r357]; .loc 1 217 13 shl.b64 %r358,%r87,4; add.u64 %r122,%r119,%r358; .loc 1 197 32 add.u64 %r360,%r384,%r143; ld.u64 %r90,[%r360]; .loc 1 197 13 setp.eq.u64 %r361,%r84,%r90; @ %r361 bra $L37; .loc 1 201 13 mov.u64 %r74,0; bra $L25; $L50: .loc 1 90 6 ld.u64 %r362,[%r178]; setp.eq.u64 %r363,%r362,0; @ ! %r363 bra $L11; bra $L38; $L1:_gfortran_mmaxloc1_16_i1 .visible .func _gfortran_mmaxloc1_16_, .param .u32 %in_ar4) {608pred %r289; .reg .predpred %r304; .reg .pred %r305; .reg .u64u64u64 %r318; .reg .u64pred %r322; .reg .u64 %r324; .reg .u64 %r326; .reg .pred %r328; .reg .u64 %r329; .reg .predu64pred %r354; .reg .predpredpred %r397; .reg .u64 %r400; .reg .u64 %r401; .reg .u64 %r405; .reg .pred.reg .u64 %r416; .reg .predpred %r425; .reg .pred %r426; .reg .pred %r427; .reg .u64 %r431; .reg .u64 %r434; .reg .u64pred %r476; .reg .pred %r479; .reg .u64 %r483; .reg .u64 %r486; .reg .u64 %r489; .reg .u64 %r492; .reg .u32 %r494; .reg .u64 %r498; .reg .u64 %r500; .reg .u64 %r503; .reg .u32 %r505; .reg .u64 %r509; .reg .u16 %r510; .reg .u32 %r511; .reg .pred %r512; .reg .u16 %r513; .reg .u16 %r514; .reg .pred %r515; .reg .u64 %r518; .reg .u64 %r521; .reg .u32 %r523; .reg .predpred %r558; .reg .u64 %r559; .reg .u64 %r560; .reg .u64u64 %r593; .reg .pred.reg .u64 %r612; .reg .u64 %r613; mov.u64 %r232,%ar0; mov.u64 %r233,%ar1; mov.u64 %r234,%ar2; mov.u64 %r235,%ar3; mov.u32 %r236,%ar4; .loc 1 251 6 setp.ne.u64 %r237,%r235,0; @ %r237 bra $L23236; call _gfortran_maxloc1_16_7 bra $L51; $L52: .loc 1 261 10 ld.u64 %r22,[%r234]; .loc 1 261 7 add.u64 %r150,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r233+28]; .loc 1 262 38 add.u32 %r242,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r151,%r242; .loc 1 265 7 shr.u64 %r244,%r150,63; set.u32.gt.s64 %r246,%r150,%r151; neg.s32 %r247,%r246; cvt.u16.u64 %r250,%r244; cvt.u16.u32 %r251,%r247; or.b16 %r249,%r250,%r251; .loc 1 265 6.loc 1 267 7 cvt.u32.u32 %r257,%r23; cvt.s64.s8 %r256,%r257; st.u64 [%stack+8],%r256; st.u64 [%stack],%r22; cvta.const.u64 %r22_gfortran_runtime_error4: .loc 1 272 9 add.u64 %r598,%r150,%r150; add.u64 %r602,%r598,%r150; shl.b64 %r262,%r602,3; add.u64 %r263,%r233,%r262; ld.u64 %r266,[%r263+56]; add.u64 %r265,%r266,1; .loc 1 272 7 ld.u64 %r273,[%r263+48]; sub.u64 %r152,%r265,%r273; .loc 1 273 6 setp.le.s64 %r274,%r152,0; @ %r274 bra $L51; .loc 1 276 9 ld.u64 %r170,[%r235]; .loc 1 278 15 ld.u64 %r35,[%r235+16]; .loc 1 280 22 cvt.u32.u64 %r36,%r35; .loc 1 280 53 add.u32 %r275,%r36,-4; and.b32 %r276,%r275,-5; set.u32.eq.u32 %r278,%r276,0; neg.s32 %r279,%r278; .loc 1 280 22 add.u32 %r280,%r36,-1; set.u32.le.u32 %r282,%r280,1; neg.s32 %r283,%r282; .loc 1 280 6 cvt.u16.u32 %r285,%r279; cvt.u16.u32 %r286,%r283; or.b16 %r284,%r285,%r286; cvt.u32.u16 %r287,%r284; cvt.u16.u8 
%r288,%r287; setp.ne.u16 %r289,%r288,0; @ %r289 bra $L56; .loc 1 282 7 setp.ne.u32 %r291,%r36,16; @ %r291 bra $L57; $L56: .loc 1 289 9 shl.b64 %r295,%r602,3; add.u64 %r296,%r233,%r295; ld.u64 %r155,[%r296+40]; .loc 1 290 12 add.u64 %r302,%r235,%r295; ld.u64 %r41,[%r302+40]; .loc 1 292 3 setp.ne.u64 %r304,%r150,0; @ %r304 bra $L58; $L64: .loc 1 302 3 setp.lt.s64 %r305,%r150,%r151; @ %r305 bra $L59; bra $L118; $L57: .loc 1 287 5 cvta.const.u64 %r306306_gfortran_runtime_error8: add.u64 %r85,%r233,40; add.u64 %r228,%r235,40; add.u64 %r309,%r22,%r22; add.u64 %r310,%r309,%r22; shl.b64 %r311,%r310,3; add.u64 %r312,%r233,16; add.u64 %r76,%r311,%r312; .loc 1 292 3 mov.u64 %r163,0; add.u64 %r599,%frame,240; add.u64 %r603,%frame,360; .loc 1 299 12 mov.u64 %r613,%r163; $L63: .loc 1 294 18 add.u64 %r314,%r599,%r163; ld.u64 %r315,[%r85]; st.u64 [%r314],%r315; .loc 1 295 18 add.u64 %r316,%frame,%r163; .loc 1 295 20 ld.u64 %r318,[%r228]; mul.lo.u64 %r317,%r318,%r35; .loc 1 295 18 st.u64 [%r316],%r317; .loc 1 296 19 ld.u64 %r320,[%r85+16]; add.u64 %r319,%r320,1; ld.u64 %r321,[%r85+8]; sub.u64 %r53,%r319,%r321; .loc 1 298 10 setp.lt.s64 %r322,%r53,0; @ %r322 bra $L61; .loc 1 296 17 add.u64 %r324,%r603,%r163; st.u64 [%r324],%r53; bra $L62; $L61: .loc 1 299 12 add.u64 %r326,%r603,%r163; st.u64 [%r326],%r613; $L62: .loc 1 292 3 add.u64 %r85,%r85,24; add.u64 %r228,%r228,24; add.u64 %r163,%r163,8; setp.ne.u64 %r328,%r76,%r85; @ %r328 bra $L63; bra $L64; $L118: .loc 1 312 6 ld.u64 %r329,[%r232]; setp.eq.u64 %r330,%r329,0; @ ! %r330 bra $L66; bra $L65; $L59: add.u64 %r332,%r22,%r22; add.u64 %r333,%r332,%r22; shl.b64 %r334,%r333,3; add.u64 %r176,%r334,40; add.u64 %r149,%r233,%r176; add.u64 %r221,%r235,%r176; shl.b64 %r335,%r22,3; add.u64 %r68,%r335,-8; cvt.u32.u32 %r337,%r23; cvt.s64.s8 %r336,%r337; shl.b64 %r338,%r336,3; add.u64 %r181,%r338,-8; add.u64 %r599,%frame,240; add.u64 %r603,%frame,360; .loc 1 309 12 mov.u64 %r612,0; $L69: .loc 1 304 18 add.u64 %r340,%r599,%r68; ld.u64 %r341,[%r149]; st.u64 [%r340],%r341; .loc 1 305 18 add.u64 %r342,%frame,%r68; .loc 1 305 20 ld.u64 %r344,[%r221]; mul.lo.u64 %r343,%r344,%r35; .loc 1 305 18 st.u64 [%r342],%r343; .loc 1 306 19 ld.u64 %r346,[%r149+16]; add.u64 %r345,%r346,1; ld.u64 %r347,[%r149+8]; sub.u64 %r66,%r345,%r347; .loc 1 308 10 setp.lt.s64 %r348,%r66,0; @ %r348 bra $L67; .loc 1 306 17 add.u64 %r350,%r603,%r68; st.u64 [%r350],%r66; bra $L68; $L67: .loc 1 309 12 add.u64 %r352,%r603,%r68; st.u64 [%r352],%r612; $L68: .loc 1 302 3 add.u64 %r149,%r149,24; add.u64 %r221,%r221,24; add.u64 %r68,%r68,8; setp.ne.u64 %r354,%r68,%r181; @ %r354 bra $L69; bra $L119; $L65: .loc 1 316 7 setp.eq.u64 %r355,%r151,0; @ %r355 bra $L71; add.u64 %r603,%frame,360; $L97: add.u64 %r132,%r232,40; mov.u64 %r130,%r603; cvt.u32.u32 %r357,%r23; cvt.s64.s8 %r356,%r357; add.u64 %r359,%r356,%r356; add.u64 %r360,%r359,%r356; shl.b64 %r361,%r360,3; add.u64 %r362,%r232,16; add.u64 %r59,%r361,%r362; .loc 1 319 10 mov.u64 %r133,1; .loc 1 323 4 mov.u64 %r381,0; bra $L72; $L71: .loc 1 327 20 add.u32 %r363,%r23,-2; cvt.s64.s32 %r74,%r363; add.u64 %r365,%r74,%r74; add.u64 %r366,%r365,%r74; shl.b64 %r367,%r366,3; add.u64 %r368,%r232,%r367; .loc 1 327 67 shl.b64 %r370,%r74,3; add.u64 %r371,%frame,%r370; .loc 1 327 59 ld.u64 %r373,[%r368+40]; ld.u64 %r374,[%r371+360]; mul.lo.u64 %r158,%r373,%r374; .loc 1 329 24 mov.u64 %r375,0; st.u64 [%r232+8],%r375; .loc 1 330 28 cvt.u16.u32 %r378,%r23; add.u16 %r377,%r378,-1; cvt.u32.u16 %r379,%r377; st.u8 [%r232+28],%r379; .loc 1 332 10 setp.eq.u64 %r380,%r158,0; @ %r380 
bra $L73; bra $L120; $L75: .loc 1 321 47 mul.lo.u64 %r133,%r71,%r133; $L72: .loc 1 323 4 st.u64 [%r132+8],%r381; ld.u64 %r71,[%r130]; add.u64 %r382,%r71,-1; st.u64 [%r132+16],%r382; st.u64 [%r132],%r133; .loc 1 316 7 add.u64 %r132,%r132,24; add.u64 %r130,%r130,8; setp.ne.u64 %r383,%r59,%r132; @ %r383 bra $L75; bra $L71; $L73: .loc 1 335 4 st.u64 [%r232+48],%r158; mov.u64 %r385,-1; st.u64 [%r232+56],%r385; mov.u64 %r386,1; st.u64 [%r232+40],%r386; .loc 1 336 4 bra $L51; $L120: .loc 1 339 24 mov.u64 %r388,16388; call (%value_in),_gfortrani_xmallocarray89,[%value_in]; } .loc 1 339 22 st.u64 [%r232],%r389; bra $L76; $L66: .loc 1 344 19 ld.s8 %r391,[%r232+28]; .loc 1 344 10 setp.eq.u64 %r392,%r391,%r151; @ %r392 bra $L77; .loc 1 345 2 cvta.const.u64 %r3933_gfortran_runtime_error77: .loc 1 347 11 cvta.global.u64 %r395,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r396,[%r395+36]; setp.eq.u32 %r397,%r396,0; @ %r397 bra $L76; .loc 1 349 4 add.u64 %r603,%frame,360; cvta.const.u64 %r401,$LC2r60300401; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r405,$LC40401; call _gfortrani_bounds_equal_extents$L76: .loc 1 356 3 setp.ne.u64 %r407,%r151,0; @ %r407 bra $L78; $L81: .loc 1 364 8 ld.u64 %r171,[%r232]; .loc 1 365 8 ld.u64 %r169,[%r233]; .loc 1 367 9 setp.ne.u64 %r408,%r169,0; @ %r408 bra $L79; bra $L51; $L78: add.u64 %r186,%frame,480; add.u64 %r148,%r232,40; add.u64 %r145,%frame,120; cvt.u32.u32 %r410,%r23; cvt.s64.s8 %r409,%r410; add.u64 %r134,%r409,-1; .loc 1 356 3 mov.u64 %r172,0; add.u64 %r603,%frame,360; .loc 1 358 16 mov.u64 %r411,%r172; $L80: st.u64 [%r186],%r411; .loc 1 359 18 ld.u64 %r412,[%r148]; st.u64 [%r145],%r412; .loc 1 360 17 shl.b64 %r414,%r172,3; add.u64 %r415,%r603,%r414; .loc 1 360 10 ld.u64 %r416,[%r415]; setp.le.s64 %r417,%r416,0; @ %r417 bra $L51; .loc 1 356 26 add.u64 %r172,%r172,1; .loc 1 356 3 add.u64 %r186,%r186,8; add.u64 %r148,%r148,24; add.u64 %r145,%r145,8; setp.ne.u64 %r418,%r134,%r172; @ %r418 bra $L80; bra $L81; $L79: .loc 1 290 12 mul.lo.u64 %r44,%r41,%r35; ld.u64 %r188,[%frame+480]; .loc 1 430 22 ld.u64 %r101,[%frame+240]; .loc 1 431 23 ld.u64 %r103,[%frame]; .loc 1 432 22 ld.u64 %r105,[%frame+120]; .loc 1 432 12 shl.b64 %r106,%r105,4; .loc 1 434 32 ld.u64 %r29,[%frame+360]; .loc 1 441 23 mul.lo.u64 %r39,%r101,%r29; .loc 1 442 24 mul.lo.u64 %r419,%r103,%r29; .loc 1 442 10 neg.s64 %r92,%r419; .loc 1 443 23 mul.lo.u64 %r420,%r105,%r29; .loc 1 443 9 shl.b64 %r421,%r420,4; neg.s64 %r182,%r421; setp.le.s64 %r597,%r151,1; .loc 1 408 9 setp.ne.u32 %r605,%r236,0; add.u64 %r606,%r152,-1; cvt.u32.u32 %r607,%r23; cvt.s64.s8 %r608,%r607; add.u64 %r609,%r608,-1; add.u64 %r610,%frame,240; add.u64 %r611,%frame,360; $L98: .loc 1 356 3 mov.u64 %r165,%r170; mov.u64 %r164,%r169; .loc 1 386 9 mov.u64 %r199,0; $L85: .loc 1 389 7 ld.s8 %r227,[%r165]; mov.u64 %r229,%r199; .loc 1 386 24 add.u64 %r199,%r199,1; .loc 1 389 6 setp.eq.u32 %r600,%r227,0; @ %r600 bra $L82; .loc 1 397 11 ld.s8 %r143,[%r164]; .loc 1 398 11 mov.u64 %r559,%r199; shr.s64 %r560,%r559,63; .loc 1 408 9 @ %r605 bra $L83; bra $L121; $L82: .loc 1 386 32 add.u64 %r164,%r164,%r155; .loc 1 386 47 add.u64 %r165,%r165,%r44; .loc 1 386 2 setp.ne.u64 %r425,%r152,%r199; @ %r425 bra $L85; .loc 1 385 9 mov.u64 %r559,0; mov.u64 %r560,%r559; bra $L86; $L121: .loc 1 418 8 setp.gt.s64 %r426,%r152,%r229; @ %r426 bra $L87; bra $L86; $L83: .loc 1 409 8 setp.le.s64 %r427,%r152,%r229; @ %r427 bra $L86; add.u64 %r567,%r229,1; shr.s64 %r568,%r567,63; sub.u64 %r431,%r606,%r229; shr.s64 %r434,%r229,63; add.u64 %r437,%r229,2; 
set.u32.lt.u64 %r442,%r437,%r229; cvt.s64.s32 %r440,%r442; sub.u64 %r446,%r434,%r440; add.u64 %r448,%r431,%r437; set.u32.lt.u64 %r453,%r448,%r431; cvt.s64.s32 %r451,%r453; sub.u64 %r457,%r446,%r451; mov.u64 %r579,1; $L90: .loc 1 411 8 @ %r600 bra $L88; .loc 1 411 18 ld.s8 %r90,[%r164]; .loc 1 411 15 cvt.u16.u32 %r459,%r90; cvt.u16.u32 %r460,%r143; setp.lt.s16 %r461,%r459,%r460; .loc 1 414 16 selp.u64 %r559,%r559,%r567,%r461; selp.u64 %r560,%r560,%r568,%r461; selp.u32 %r143,%r143,%r90,%r461; $L88: .loc 1 409 33 add.u64 %r164,%r164,%r155; .loc 1 409 48 add.u64 %r165,%r165,%r44; .loc 1 409 8 add.u64 %r464,%r567,%r579; set.u32.lt.u64 %r469,%r464,%r567; cvt.s64.s32 %r467,%r469; mov.u64 %r567,%r464; sub.u64 %r568,%r568,%r467; setp.ne.u64 %r476,%r448,%r567; @ %r476 bra $L99; setp.ne.u64 %r479,%r457,%r568; @ ! %r479 bra $L86; $L99: .loc 1 411 9 ld.s8 %r227,[%r165]; setp.eq.u32 %r600,%r227,0; bra $L90; $L87: add.u64 %r563,%r229,1; shr.s64 %r564,%r563,63; sub.u64 %r483,%r606,%r229; shr.s64 %r486,%r229,63; add.u64 %r489,%r229,2; set.u32.lt.u64 %r494,%r489,%r229; cvt.s64.s32 %r492,%r494; sub.u64 %r498,%r486,%r492; add.u64 %r500,%r483,%r489; set.u32.lt.u64 %r505,%r500,%r483; cvt.s64.s32 %r503,%r505; sub.u64 %r509,%r498,%r503; .loc 1 418 8 mov.u64 %r593,1; $L113: .loc 1 420 8 ld.s8 %r511,[%r165]; cvt.u16.u32 %r510,%r511; setp.eq.u16 %r512,%r510,0; @ %r512 bra $L91; .loc 1 420 18 ld.s8 %r95,[%r164]; .loc 1 420 15 cvt.u16.u32 %r513,%r95; cvt.u16.u32 %r514,%r143; setp.le.s16 %r515,%r513,%r514; .loc 1 423 16 selp.u64 %r559,%r559,%r563,%r515; selp.u64 %r560,%r560,%r564,%r515; selp.u32 %r143,%r143,%r95,%r515; $L91: .loc 1 418 33 add.u64 %r164,%r164,%r155; .loc 1 418 48 add.u64 %r165,%r165,%r44; .loc 1 418 8 add.u64 %r518,%r563,%r593; set.u32.lt.u64 %r523,%r518,%r563; cvt.s64.s32 %r521,%r523; mov.u64 %r563,%r518; sub.u64 %r564,%r564,%r521; setp.ne.u64 %r530,%r500,%r563; @ %r530 bra $L113; setp.ne.u64 %r533,%r509,%r564; @ %r533 bra $L113; $L86: .loc 1 426 8 st.u64 [%r171],%r559; st.u64 [%r171+8],%r560; .loc 1 429 15 add.u64 %r188,%r188,1; .loc 1 430 12 add.u64 %r169,%r169,%r101; .loc 1 431 13 add.u64 %r170,%r170,%r103; .loc 1 432 12 add.u64 %r171,%r171,%r106; .loc 1 434 13 setp.ne.u64 %r536,%r29,%r188; @ %r536 bra $L98; .loc 1 442 10 add.u64 %r166,%r170,%r92; .loc 1 443 9 add.u64 %r167,%r171,%r182; .loc 1 445 7 @ %r597 bra $L51; add.u64 %r220,%frame,488; .loc 1 441 23 mov.u64 %r110,%r39; .loc 1 445 7 mov.u64 %r219,8; .loc 1 444 5 mov.u64 %r168,1; add.u64 %r601,%frame,120; .loc 1 438 13 mov.u64 %r604,0; bra $L95; $L96: st.u64 [%r220],%r604; .loc 1 441 23 mul.lo.u64 %r110,%r122,%r121; .loc 1 442 24 mul.lo.u64 %r542,%r124,%r121; .loc 1 442 10 sub.u64 %r166,%r170,%r542; .loc 1 443 23 mul.lo.u64 %r543,%r126,%r121; .loc 1 443 9 shl.b64 %r544,%r543,4; sub.u64 %r167,%r171,%r544; .loc 1 444 5 add.u64 %r168,%r168,1; .loc 1 445 7 add.u64 %r220,%r220,8; add.u64 %r219,%r219,8; setp.eq.u64 %r545,%r168,%r609; @ %r545 bra $L51; $L95: .loc 1 453 16 ld.u64 %r546,[%r220]; add.u64 %r121,%r546,1; st.u64 [%r220],%r121; .loc 1 454 23 add.u64 %r548,%r610,%r219; ld.u64 %r122,[%r548]; .loc 1 454 13 sub.u64 %r549,%r122,%r110; add.u64 %r169,%r169,%r549; .loc 1 455 24 add.u64 %r550,%frame,%r219; ld.u64 %r124,[%r550]; .loc 1 455 14 add.u64 %r170,%r166,%r124; .loc 1 456 23 add.u64 %r552,%r601,%r219; ld.u64 %r126,[%r552]; .loc 1 456 13 shl.b64 %r553,%r126,4; add.u64 %r171,%r167,%r553; .loc 1 434 32 add.u64 %r555,%r611,%r219; ld.u64 %r129,[%r555]; .loc 1 434 13 setp.eq.u64 %r556,%r121,%r129; @ %r556 bra $L96; .loc 1 438 13 mov.u64 %r188,0; bra 
$L98; $L119: .loc 1 312 6 ld.u64 %r557,[%r232]; setp.eq.u64 %r558,%r557,0; @ ! %r558 bra $L66; bra $L97; $L51_gfortran_smaxloc1_16_i1 .visible .func _gfortran_smaxloc1_16_, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L123; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L124; r152154; call _gfortran_maxloc1_16_490 7 bra $L122; $L124: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r297,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L126; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L127; $L133: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L128; bra $L181; $L126: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error127: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r303,0; $L132: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L130; .loc 1 505 17 st.u64 [%r139],%r36; bra $L131; $L130: .loc 1 508 12 st.u64 [%r139],%r303; $L131: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L132; bra $L133; $L181: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L135; bra $L134; $L128: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r302,0; $L138: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L136; .loc 1 513 17 st.u64 [%r43],%r41; bra $L137; $L136: .loc 1 517 12 st.u64 [%r43],%r302; $L137: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L138; bra $L182; $L134: .loc 1 524 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L140; add.u64 %r296,%frame,120; $L160: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L141; $L140: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L142; bra $L183; $L144: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L141: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L144; bra $L140; $L142: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L122; $L183: .loc 1 547 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L148: .loc 1 573 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L145; bra $L146; $L135: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L147; .loc 1 552 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error147: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L148; .loc 1 559 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L146; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L150: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L149; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error149: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L150; $L145: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L151; shl.b64 %r266,%r297,3; bra $L152; $L151: mov.u64 %r266,8; $L152: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L153: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L153; $L146: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,4; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 583 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L159: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L159; .loc 1 596 7 @ ! %r295 bra $L184; bra $L122; $L158: .loc 1 591 13 st.u64 [%r99],%r280; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L157; bra $L122; $L184: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; $L157: .loc 1 600 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L158; .loc 1 591 13 mov.u64 %r80,0; bra $L159; $L182: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r294,%r98,0; @ ! 
[archive member maxloc1_4_i2.o — nvptx PTX generated from ../../../libgfortran/generated/maxloc1_4_i2.c]
.visible .func: _gfortran_maxloc1_4_i2, _gfortran_mmaxloc1_4_i2, _gfortran_smaxloc1_4_i2
.extern declarations: _gfortrani_compile_options (.global .align 8 .u64), _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents
string constants ($LC...), decoded from their .b8 byte arrays:
  "rank of return array incorrect in MAXLOC intrinsic: is %ld, should be %l..."
  "MAXLOC"
  "... incorrect in MAXLOC intrinsic: is %ld, should be between 1 and %ld"
  "Funny sized logical array"
  "rank of return array incorrect in MAXLOC intrin..." (truncated duplicate)
  "MASK argumen..." (truncated)
  "Incorrect extent in return value of MAXLOC intrinsic in dimension %ld: is %ld, should be %l..."
  "return val..." (truncated)
[function bodies: machine-generated PTX for the three entry points above, not reproduced]
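The entry-point names above follow libgfortran's usual generated-source scheme: maxloc1_<rkind>_i2 evaluates Fortran's MAXLOC along one dimension of an INTEGER(2) array and returns INTEGER(<rkind>) indices; the m-prefixed variant additionally takes an array-valued MASK and the s-prefixed variant a scalar MASK (hence the error strings about MASK and return-array extents). As orientation only — not the library's implementation, which handles arbitrary rank, strides, masking, and the bounds checks seen above — a minimal C sketch of such a reduction over the first dimension of a contiguous, column-major rank-2 array might look like this (maxloc_dim1_sketch is a hypothetical name):

#include <stddef.h>
#include <stdint.h>

/* Illustrative sketch only: 1-based index of the maximum of each column
 * of a rows x cols INTEGER(2) matrix, stored column-major, with
 * INTEGER(4)-sized results -- roughly what a maxloc1_4_i2-style
 * reduction over dim=1 produces for the unmasked, contiguous case. */
static void maxloc_dim1_sketch(const int16_t *a, size_t rows, size_t cols,
                               int32_t *result)
{
    for (size_t j = 0; j < cols; ++j) {
        int32_t pos = rows ? 1 : 0;        /* MAXLOC over an empty extent is 0 */
        int16_t best = rows ? a[j * rows] : 0;
        for (size_t i = 1; i < rows; ++i) {
            if (a[j * rows + i] > best) {  /* element (i+1, j+1) in Fortran terms */
                best = a[j * rows + i];
                pos = (int32_t)(i + 1);    /* Fortran indices are 1-based */
            }
        }
        result[j] = pos;
    }
}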
[archive member maxloc1_8_i2.o — nvptx PTX generated from ../../../libgfortran/generated/maxloc1_8_i2.c]
.visible .func: _gfortran_maxloc1_8_i2, _gfortran_mmaxloc1_8_i2, _gfortran_smaxloc1_8_i2
.extern declarations and $LC... string constants: identical to those listed for maxloc1_4_i2.o above
[function bodies: machine-generated PTX for the three entry points above, not reproduced]
%r268 bra $L152; shl.b64 %r266,%r296,3; bra $L153; $L152: mov.u64 %r266,8; $L153: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L154: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L154; $L147: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,3; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 583 13 mov.u64 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L160: st.u64 [%r98],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L160; .loc 1 596 7 @ ! %r294 bra $L185; bra $L123; $L159: .loc 1 591 13 st.u64 [%r99],%r280; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L158; bra $L123; $L185: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; $L158: .loc 1 600 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,3; add.u64 %r98,%r98,%r291; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L159; .loc 1 591 13 mov.u64 %r80,0; bra $L160; $L183: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r293,%r98,0; @ ! %r293 bra $L136; bra $L161; $L123: .loc 1 605 1 ret; } maxloc1_16_i2.o/_gfortran_maxloc1_16_i2 .visible .func _gfortran_maxloc1_16_i2.file 1 "../../../libgfortran/generated/maxloc1_16_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_16_i2 .visible .func _gfortran_mmaxloc1_16__gfortran_smaxloc1_16_i2 .visible .func _gfortran_smaxloc1_16_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,76,79,67,0 
},105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_maxloc1_16_i2 .visible .func _gfortran_maxloc1_16_i248007predpredu64 %r231; .reg .u64 %r232; .reg .pred %r233; .reg .pred %r235; .reg .u64 %r236; .reg .pred %r237; .reg .u64u64predu64 %r264; .reg .u64 %r265; .reg .u64 %r266; .reg .u64 %r267; .reg .u64 %r268; .reg .u16 %r270; .reg .u16 %r271; .reg .u32 %r272; .reg .u32 %r273; .reg .u64 %r275; .reg .u64u64predu64 %r312; .reg .u64u32 %r316; .reg .u64 %r317; .reg .u64pred %r323; .reg .pred %r324; .reg .u64 %r326; .reg .u16 %r329; .reg .u16 %r330; .reg .predpred %r334; .reg .u64 %r337; .reg .u64 %r340; .reg .u32 %r342; .reg .u64 %r347; .reg .pred %r349; .reg .pred %r352; .reg .u64 %r358; .reg .u64 %r359; .reg .u64 %r360; .reg .u64 %r361; .reg .pred %r362; .reg .u64 %r363; .reg .u64predpred %r385; .reg .predpredmov.u64 %r183,%ar0; mov.u64 %r184,%ar1; mov.u64 %r185,%ar2; mov.u32 %r186,%ar3; .loc 1 58 10 ld.s8 %r22,[%r184+28]; .loc 1 58 38 add.u32 %r187,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r110,%r187; .loc 1 59 10 ld.u64 %r25,[%r185]; .loc 1 59 7 add.u64 %r111,%r25,-1; .loc 1 61 7 shr.u64 %r189,%r111,63; set.u32.lt.s64 %r191,%r110,%r111; neg.s32 %r192,%r191; cvt.u16.u64 %r195,%r189; cvt.u16.u32 %r196,%r192; or.b16 %r194,%r195,%r196; .loc 1 61 6 cvt.u32.u16 %r197,%r194; cvt.u16.u8 %r198,%r197; setp.eq.u16 %r199,%r198,0; @ %r199 bra $L2; .loc 1 63 7 cvt.u32.u32 %r202,%r22; cvt.s64.s8 %r201,%r202; st.u64 [%stack+8],%r2000200_gfortran_runtime_errorr205,%r111,%r111; add.u64 %r206,%r205,%r111; shl.b64 %r207,%r206,3; add.u64 %r208,%r184,%r207; ld.u64 %r31,[%r208+56]; ld.u64 %r33,[%r208+48]; .loc 1 71 9 ld.u64 %r114,[%r208+40]; .loc 1 73 3 setp.ne.u64 %r222,%r111,0; @ %r222 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r223,%r110,%r111; @ %r223 bra $L4; bra $L49; $L3: add.u64 %r109,%r184,40; add.u64 %r128,%frame,120; add.u64 %r181,%frame,240; add.u64 %r224,%r184,16; add.u64 %r226,%r25,%r25; add.u64 %r227,%r226,%r25; shl.b64 %r228,%r227,3; add.u64 %r137,%r224,%r228; .loc 1 79 12 mov.u64 %r396,0; $L8: .loc 1 75 18 ld.u64 %r229,[%r109]; st.u64 [%r128],%r229; .loc 1 76 19 ld.u64 %r231,[%r109+16]; add.u64 %r230,%r231,1; ld.u64 %r232,[%r109+8]; sub.u64 %r38,%r230,%r232; .loc 1 78 10 setp.lt.s64 %r233,%r38,0; @ %r233 bra $L6; .loc 1 76 17 st.u64 [%r181],%r38; bra $L7; $L6: .loc 1 79 12 st.u64 [%r181],%r396; $L7: .loc 1 73 3 add.u64 %r109,%r109,24; add.u64 %r128,%r128,8; add.u64 %r181,%r181,8; setp.ne.u64 %r235,%r109,%r137; @ %r235 bra $L8; bra $L9; $L49: .loc 1 90 6 ld.u64 %r236,[%r183]; setp.eq.u64 %r237,%r236,0; @ ! 
%r237 bra $L11; bra $L10; $L4: add.u64 %r239,%r25,%r25; add.u64 %r240,%r239,%r25; shl.b64 %r241,%r240,3; add.u64 %r242,%r241,40; add.u64 %r134,%r184,%r242; shl.b64 %r243,%r25,3; add.u64 %r107,%r243,-8; add.u64 %r384,%frame,120; add.u64 %r120,%r384,%r107; add.u64 %r383,%frame,240; add.u64 %r104,%r383,%r107; add.u64 %r246,%r184,40; cvt.u32.u32 %r248,%r22; cvt.s64.s8 %r247,%r248; add.u64 %r250,%r247,%r247; add.u64 %r251,%r250,%r247; shl.b64 %r252,%r251,3; add.u64 %r63,%r246,%r252; .loc 1 87 12 mov.u64 %r395,0; $L14: .loc 1 83 18 ld.u64 %r253,[%r134]; st.u64 [%r120],%r253; .loc 1 84 19 ld.u64 %r255,[%r134+16]; add.u64 %r254,%r255,1; ld.u64 %r256,[%r134+8]; sub.u64 %r45,%r254,%r256; .loc 1 86 10 setp.lt.s64 %r257,%r45,0; @ %r257 bra $L12; .loc 1 84 17 st.u64 [%r104],%r45; bra $L13; $L12: .loc 1 87 12 st.u64 [%r104],%r395; $L13: .loc 1 81 3 add.u64 %r134,%r134,24; add.u64 %r120,%r120,8; add.u64 %r104,%r104,8; setp.ne.u64 %r259,%r63,%r134; @ %r259 bra $L14; bra $L50; $L10: .loc 1 94 7 setp.le.s64 %r260,%r110,0; @ %r260 bra $L16; add.u64 %r383,%frame,240; $L37: add.u64 %r154,%r183,40; mov.u64 %r152,%r383; cvt.u32.u32 %r262,%r22; cvt.s64.s8 %r261,%r262; add.u64 %r264,%r261,%r261; add.u64 %r265,%r264,%r261; shl.b64 %r266,%r265,3; add.u64 %r267,%r183,16; add.u64 %r138,%r266,%r267; .loc 1 97 10 mov.u64 %r101,1; .loc 1 101 4 mov.u64 %r290,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r268,0; st.u64 [%r183+8],%r268; .loc 1 106 28 cvt.u16.u32 %r271,%r22; add.u16 %r270,%r271,-1; cvt.u32.u16 %r272,%r270; st.u8 [%r183+28],%r272; .loc 1 108 20 add.u32 %r273,%r22,-2; cvt.s64.s32 %r56,%r273; add.u64 %r275,%r56,%r56; add.u64 %r276,%r275,%r56; shl.b64 %r277,%r276,3; add.u64 %r278,%r183,%r277; .loc 1 108 67 shl.b64 %r280,%r56,3; add.u64 %r281,%frame,%r280; .loc 1 108 59 ld.u64 %r283,[%r278+40]; ld.u64 %r284,[%r281+240]; mul.lo.u64 %r116,%r283,%r284; .loc 1 110 29 mov.u64 %r286,16286; call (%value_in),_gfortrani_xmallocarray287,[%value_in]; } .loc 1 110 27 st.u64 [%r183],%r287; .loc 1 111 10 setp.eq.u64 %r289,%r116,0; @ ! 
%r289 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r101,%r52,%r101; $L17: .loc 1 101 4 st.u64 [%r154+8],%r290; ld.u64 %r52,[%r152]; add.u64 %r291,%r52,-1; st.u64 [%r154+16],%r291; st.u64 [%r154],%r101; .loc 1 94 7 add.u64 %r154,%r154,24; add.u64 %r152,%r152,8; setp.ne.u64 %r292,%r138,%r154; @ %r292 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r183+48],%r116; mov.u64 %r294,-1; st.u64 [%r183+56],%r294; mov.u64 %r295,1; st.u64 [%r183+40],%r295; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r62,[%r183+28]; .loc 1 121 10 setp.eq.u64 %r296,%r62,%r110; @ %r296 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r110; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r299,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r300,[%r299+36]; setp.eq.u32 %r301,%r300,0; @ %r301 bra $L21; .loc 1 128 2 add.u64 %r383,%frame,240; cvta.const.u64 %r305,$LC204305; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r307,%r110,0; @ %r307 bra $L24; $L27: .loc 1 68 9 add.u64 %r308,%r31,1; .loc 1 68 7 sub.u64 %r112,%r308,%r33; max.s64 %r46,%r112,0; .loc 1 140 8 ld.u64 %r124,[%r184]; .loc 1 141 8 ld.u64 %r125,[%r183]; .loc 1 180 31 add.u64 %r70,%r114,%r114; ld.u64 %r77,[%frame+360]; .loc 1 194 22 ld.u64 %r72,[%frame+120]; .loc 1 194 12 add.u64 %r73,%r72,%r72; .loc 1 195 22 ld.u64 %r75,[%frame]; .loc 1 195 12 shl.b64 %r76,%r75,4; .loc 1 197 32 ld.u64 %r105,[%frame+240]; .loc 1 204 23 mul.lo.u64 %r311,%r72,%r105; .loc 1 204 9 add.u64 %r312,%r311,%r311; neg.s64 %r135,%r312; .loc 1 205 23 mul.lo.u64 %r313,%r75,%r105; .loc 1 205 9 shl.b64 %r314,%r313,4; neg.s64 %r139,%r314; setp.gt.s64 %r385,%r112,0; setp.le.s64 %r386,%r110,1; mov.u64 %r388,1; setp.eq.u32 %r389,%r186,0; cvt.u32.u32 %r390,%r22; cvt.s64.s8 %r391,%r390; add.u64 %r392,%r391,-1; add.u64 %r393,%frame,120; add.u64 %r394,%frame,240; bra $L25; $L24: add.u64 %r167,%frame,360; add.u64 %r166,%r183,40; mov.u64 %r164,%frame; cvt.u32.u32 %r316,%r22; cvt.s64.s8 %r315,%r316; add.u64 %r155,%r315,-1; .loc 1 132 3 mov.u64 %r126,0; add.u64 %r383,%frame,240; .loc 1 134 16 mov.u64 %r317,%r126; $L26: st.u64 [%r167],%r317; .loc 1 135 18 ld.u64 %r318,[%r166]; st.u64 [%r164],%r318; .loc 1 136 17 shl.b64 %r320,%r126,3; add.u64 %r321,%r383,%r320; .loc 1 136 10 ld.u64 %r322,[%r321]; setp.le.s64 %r323,%r322,0; @ %r323 bra $L1; .loc 1 132 26 add.u64 %r126,%r126,1; .loc 1 132 3 add.u64 %r167,%r167,8; add.u64 %r166,%r166,24; add.u64 %r164,%r164,8; setp.ne.u64 %r324,%r126,%r155; @ %r324 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r385 bra $L38; .loc 1 159 10 mov.u64 %r326,0; st.u64 [%r125],%r326; st.u64 [%r125+8],%r326; bra $L29; $L38: mov.u64 %r119,%r124; mov.u64 %r376,%r388; mov.u64 %r377,0; .loc 1 155 9 mov.u32 %r103,-32768; .loc 1 157 9 mov.u64 %r374,1; mov.u64 %r375,%r377; $L28: .loc 1 182 14 ld.s16 %r182,[%r119]; .loc 1 182 7 @ %r389 bra $L30; .loc 1 182 6 cvt.u16.u32 %r329,%r103; cvt.u16.u32 %r330,%r182; setp.le.s16 %r331,%r329,%r330; @ %r331 bra $L31; bra $L32; $L30: cvt.u16.u32 %r332,%r103; cvt.u16.u32 %r333,%r182; setp.ge.s16 %r334,%r332,%r333; @ %r334 bra $L32; $L31: .loc 1 185 14 mov.u64 %r374,%r376; mov.u64 %r375,%r377; .loc 1 182 14 mov.u32 %r103,%r182; $L32: .loc 1 180 31 add.u64 %r119,%r119,%r70; .loc 1 180 6 add.u64 %r337,%r376,%r388; set.u32.lt.u64 %r342,%r337,%r376; cvt.s64.s32 %r340,%r342; mov.u64 %r376,%r337; sub.u64 %r377,%r377,%r340; add.u64 %r347,%r376,-1; setp.gt.s64 %r349,%r46,%r347; @ %r349 bra $L28; .loc 1 189 12 st.u64 [%r125],%r374; st.u64 [%r125+8],%r375; $L29: .loc 1 193 15 add.u64 %r77,%r77,1; .loc 1 194 12 
add.u64 %r124,%r124,%r73; .loc 1 195 12 add.u64 %r125,%r125,%r76; .loc 1 197 13 setp.ne.u64 %r352,%r77,%r105; @ %r352 bra $L25; .loc 1 204 9 add.u64 %r121,%r124,%r135; .loc 1 205 9 add.u64 %r122,%r125,%r139; .loc 1 207 7 @ %r386 bra $L1; add.u64 %r130,%frame,368; mov.u64 %r149,8; .loc 1 206 5 mov.u64 %r123,1; .loc 1 201 13 mov.u64 %r387,0; bra $L35; $L36: st.u64 [%r130],%r387; .loc 1 204 23 mul.lo.u64 %r358,%r89,%r88; .loc 1 204 9 add.u64 %r359,%r358,%r358; sub.u64 %r121,%r124,%r359; .loc 1 205 23 mul.lo.u64 %r360,%r92,%r88; .loc 1 205 9 shl.b64 %r361,%r360,4; sub.u64 %r122,%r125,%r361; .loc 1 206 5 add.u64 %r123,%r123,1; .loc 1 207 7 add.u64 %r130,%r130,8; add.u64 %r149,%r149,8; setp.eq.u64 %r362,%r123,%r392; @ %r362 bra $L1; $L35: .loc 1 215 16 ld.u64 %r363,[%r130]; add.u64 %r88,%r363,1; st.u64 [%r130],%r88; .loc 1 216 23 add.u64 %r365,%r393,%r149; ld.u64 %r89,[%r365]; .loc 1 216 13 add.u64 %r366,%r89,%r89; add.u64 %r124,%r121,%r366; .loc 1 217 23 add.u64 %r367,%frame,%r149; ld.u64 %r92,[%r367]; .loc 1 217 13 shl.b64 %r368,%r92,4; add.u64 %r125,%r122,%r368; .loc 1 197 32 add.u64 %r370,%r394,%r149; ld.u64 %r95,[%r370]; .loc 1 197 13 setp.eq.u64 %r371,%r88,%r95; @ %r371 bra $L36; .loc 1 201 13 mov.u64 %r77,0; bra $L25; $L50: .loc 1 90 6 ld.u64 %r372,[%r183]; setp.eq.u64 %r373,%r372,0; @ ! %r373 bra $L11; bra $L37; $L1:_gfortran_mmaxloc1_16_i2 .visible .func _gfortran_mmaxloc1_16_, .param .u32 %in_ar4) {60897u64u64 %r267; .reg .u64 %r274; .reg .pred %r275; .reg .u32 %r276; .reg .u32 %r277; .reg .u32u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .pred %r329; .reg .u64 %r330; .reg .pred %r331; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32 %r338; .reg .u64 %r339; .reg .u64 %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64pred %r355; .reg .pred64predpred %r408; .reg .pred %r409; .reg .u64 %r410; .reg .u32.reg .u64 %r417; .reg .pred %r418; .reg .predpred %r428; .reg .pred %r429; .reg .pred %r430; .reg .u64.reg .u32 %r456; .reg .u64 %r460; .reg .u16 %r462; .reg .u16 %r463; .reg .predpred %r479; .reg .pred %r482; .reg .u64 %r486; .reg .u64 %r489; .reg .u64 %r492; .reg .u64 %r495; .reg .u32u64 %r547; .reg .pred %r548; .reg .u64 %r549; .reg .u64 %r551; .reg .u64u64 %r557; .reg .u64u64 %r572; .reg .u64 %r583; .reg .u64 %r597; .reg .predpred %r609; .reg .u64 %r610; .reg .u32 %r611; .reg .u64 %r612; .reg .u64 %r613; .reg .u64 %r614; .reg .u64 %r615; .reg .u64 %r616; .reg .u64 %r617; mov.u64 %r233,%ar0; mov.u64 %r234,%ar1; mov.u64 %r235,%ar2; mov.u64 %r236,%ar3; mov.u32 %r237,%ar4; .loc 1 251 6 setp.ne.u64 %r238,%r236,0; @ %r238 bra $L3r2235237; call _gfortran_maxloc1_16_7 bra $L51; $L52: .loc 1 261 10 ld.u64 %r22,[%r235]; .loc 1 261 7 add.u64 %r153,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r234+28]; .loc 1 262 38 add.u32 %r243,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r154,%r243; .loc 1 265 7 shr.u64 %r245,%r153,63; set.u32.gt.s64 %r247,%r153,%r154; neg.s32 %r248,%r247; cvt.u16.u64 %r251,%r245; cvt.u16.u32 %r252,%r248; or.b16 %r250,%r251,%r252; .loc 1 265 6 cvt.u32.u16 %r253,%r250; cvt.u16.u8 %r254,%r253; setp.eq.u16 %r255,%r254,0; @ %r255 bra $L54; .loc 1 267 7 cvt.u32.u32 %r258,%r23; cvt.s64.s8 %r257,%r258; st.u64 [%stack+8],%r257; st.u64 [%stack],%r22; cvta.const.u64 %r22stack; call _gfortran_runtime_error4: .loc 1 272 9 add.u64 %r602,%r153,%r153; add.u64 %r607,%r602,%r153; shl.b64 %r263,%r607,3; add.u64 %r264,%r234,%r263; ld.u64 %r267,[%r264+56]; add.u64 %r266,%r267,1; .loc 1 272 7 ld.u64 %r274,[%r264+48]; sub.u64 
%r155,%r266,%r274; .loc 1 273 6 setp.le.s64 %r275,%r155,0; @ %r275 bra $L51; .loc 1 276 9 ld.u64 %r173,[%r236]; .loc 1 278 15 ld.u64 %r35,[%r236+16]; .loc 1 280 22 cvt.u32.u64 %r36,%r35; .loc 1 280 53 add.u32 %r276,%r36,-4; and.b32 %r277,%r276,-5; set.u32.eq.u32 %r279,%r277,0; neg.s32 %r280,%r279; .loc 1 280 22 add.u32 %r281,%r36,-1; set.u32.le.u32 %r283,%r281,1; neg.s32 %r284,%r283; .loc 1 280 6 cvt.u16.u32 %r286,%r280; cvt.u16.u32 %r287,%r284; or.b16 %r285,%r286,%r287;56; .loc 1 282 7 setp.ne.u32 %r292,%r36,16; @ %r292 bra $L57; $L56: .loc 1 289 9 shl.b64 %r296,%r607,3; add.u64 %r297,%r234,%r296; ld.u64 %r158,[%r297+40]; .loc 1 290 12 add.u64 %r303,%r236,%r296; ld.u64 %r41,[%r303+40]; .loc 1 292 3 setp.ne.u64 %r305,%r153,0; @ %r305 bra $L58; $L64: .loc 1 302 3 setp.lt.s64 %r306,%r153,%r154; @ %r306 bra $L59; bra $L118; $L57: .loc 1 287 5 cvta.const.u64 %r307307_gfortran_runtime_error8: add.u64 %r164,%r234,40; add.u64 %r150,%r236,40; add.u64 %r309,%r234,16; add.u64 %r311,%r22,%r22; add.u64 %r312,%r311,%r22; shl.b64 %r313,%r312,3; add.u64 %r223,%r309,%r313; .loc 1 292 3 mov.u64 %r53,0; add.u64 %r603,%frame,240; add.u64 %r606,%frame,360; .loc 1 299 12 mov.u64 %r617,%r53; $L63: .loc 1 294 18 add.u64 %r315,%r603,%r53; ld.u64 %r316,[%r164]; st.u64 [%r315],%r316; .loc 1 295 18 add.u64 %r317,%frame,%r53; .loc 1 295 20 ld.u64 %r319,[%r150]; mul.lo.u64 %r318,%r319,%r35; .loc 1 295 18 st.u64 [%r317],%r318; .loc 1 296 19 ld.u64 %r321,[%r164+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r164+8]; sub.u64 %r52,%r320,%r322; .loc 1 298 10 setp.lt.s64 %r323,%r52,0; @ %r323 bra $L61; .loc 1 296 17 add.u64 %r325,%r606,%r53; st.u64 [%r325],%r52; bra $L62; $L61: .loc 1 299 12 add.u64 %r327,%r606,%r53; st.u64 [%r327],%r617; $L62: .loc 1 292 3 add.u64 %r164,%r164,24; add.u64 %r150,%r150,24; add.u64 %r53,%r53,8; setp.ne.u64 %r329,%r164,%r223; @ %r329 bra $L63; bra $L64; $L118: .loc 1 312 6 ld.u64 %r330,[%r233]; setp.eq.u64 %r331,%r330,0; @ ! 
%r331 bra $L66; bra $L65; $L59: add.u64 %r333,%r22,%r22; add.u64 %r334,%r333,%r22; shl.b64 %r335,%r334,3; add.u64 %r178,%r335,40; add.u64 %r59,%r234,%r178; add.u64 %r180,%r236,%r178; shl.b64 %r336,%r22,3; add.u64 %r91,%r336,-8; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; shl.b64 %r339,%r337,3; add.u64 %r97,%r339,-8; add.u64 %r603,%frame,240; add.u64 %r606,%frame,360; .loc 1 309 12 mov.u64 %r616,0; $L69: .loc 1 304 18 add.u64 %r341,%r603,%r91; ld.u64 %r342,[%r59]; st.u64 [%r341],%r342; .loc 1 305 18 add.u64 %r343,%frame,%r91; .loc 1 305 20 ld.u64 %r345,[%r180]; mul.lo.u64 %r344,%r345,%r35; .loc 1 305 18 st.u64 [%r343],%r344; .loc 1 306 19 ld.u64 %r347,[%r59+16]; add.u64 %r346,%r347,1; ld.u64 %r348,[%r59+8]; sub.u64 %r66,%r346,%r348; .loc 1 308 10 setp.lt.s64 %r349,%r66,0; @ %r349 bra $L67; .loc 1 306 17 add.u64 %r351,%r606,%r91; st.u64 [%r351],%r66; bra $L68; $L67: .loc 1 309 12 add.u64 %r353,%r606,%r91; st.u64 [%r353],%r616; $L68: .loc 1 302 3 add.u64 %r59,%r59,24; add.u64 %r180,%r180,24; add.u64 %r91,%r91,8; setp.ne.u64 %r355,%r91,%r97; @ %r355 bra $L69; bra $L119; $L65: .loc 1 316 7 setp.eq.u64 %r356,%r154,0; @ %r356 bra $L71; add.u64 %r606,%frame,360; $L97: add.u64 %r137,%r233,40; mov.u64 %r134,%r606; cvt.u32.u32 %r358,%r23; cvt.s64.s8 %r357,%r358; add.u64 %r360,%r357,%r357; add.u64 %r361,%r360,%r357; shl.b64 %r362,%r361,3; add.u64 %r363,%r233,16; add.u64 %r64,%r362,%r363; .loc 1 319 10 mov.u64 %r136,1; .loc 1 323 4 mov.u64 %r382,0; bra $L72; $L71: .loc 1 327 20 add.u32 %r364,%r23,-2; cvt.s64.s32 %r75,%r364; add.u64 %r366,%r75,%r75; add.u64 %r367,%r366,%r75; shl.b64 %r368,%r367,3; add.u64 %r369,%r233,%r368; .loc 1 327 67 shl.b64 %r371,%r75,3; add.u64 %r372,%frame,%r371; .loc 1 327 59 ld.u64 %r374,[%r369+40]; ld.u64 %r375,[%r372+360]; mul.lo.u64 %r162,%r374,%r375; .loc 1 329 24 mov.u64 %r376,0; st.u64 [%r233+8],%r376; .loc 1 330 28 cvt.u16.u32 %r379,%r23; add.u16 %r378,%r379,-1; cvt.u32.u16 %r380,%r378; st.u8 [%r233+28],%r380; .loc 1 332 10 setp.eq.u64 %r381,%r162,0; @ %r381 bra $L73; bra $L120; $L75: .loc 1 321 47 mul.lo.u64 %r136,%r72,%r136; $L72: .loc 1 323 4 st.u64 [%r137+8],%r382; ld.u64 %r72,[%r134]; add.u64 %r383,%r72,-1; st.u64 [%r137+16],%r383; st.u64 [%r137],%r136; .loc 1 316 7 add.u64 %r137,%r137,24; add.u64 %r134,%r134,8; setp.ne.u64 %r384,%r64,%r137; @ %r384 bra $L75; bra $L71; $L73: .loc 1 335 4 st.u64 [%r233+48],%r162; mov.u64 %r386,-1; st.u64 [%r233+56],%r386; mov.u64 %r387,1; st.u64 [%r233+40],%r387; .loc 1 336 4 bra $L51; $L120: .loc 1 339 24 mov.u64 %r389,16389; call (%value_in),_gfortrani_xmallocarray90,[%value_in]; } .loc 1 339 22 st.u64 [%r233],%r390; bra $L76; $L66: .loc 1 344 19 ld.s8 %r392,[%r233+28]; .loc 1 344 10 setp.eq.u64 %r393,%r392,%r154; @ %r393 bra $L77; .loc 1 345 2 cvta.const.u64 %r3944_gfortran_runtime_error77: .loc 1 347 11 cvta.global.u64 %r396,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r397,[%r396+36]; setp.eq.u32 %r398,%r397,0; @ %r398 bra $L76; .loc 1 349 4 add.u64 %r606,%frame,360; cvta.const.u64 %r402,$LC24002; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r406,$LC406402; call _gfortrani_bounds_equal_extents$L76: .loc 1 356 3 setp.ne.u64 %r408,%r154,0; @ %r408 bra $L78; $L81: .loc 1 364 8 ld.u64 %r174,[%r233]; .loc 1 365 8 ld.u64 %r172,[%r234]; .loc 1 367 9 setp.ne.u64 %r409,%r172,0; @ %r409 bra $L79; bra $L51; $L78: add.u64 %r188,%frame,480; add.u64 %r181,%r233,40; add.u64 %r149,%frame,120; cvt.u32.u32 %r411,%r23; cvt.s64.s8 %r410,%r411; add.u64 %r138,%r410,-1; .loc 1 356 3 mov.u64 %r175,0; add.u64 %r606,%frame,360; .loc 
1 358 16 mov.u64 %r412,%r175; $L80: st.u64 [%r188],%r412; .loc 1 359 18 ld.u64 %r413,[%r181]; st.u64 [%r149],%r413; .loc 1 360 17 shl.b64 %r415,%r175,3; add.u64 %r416,%r606,%r415; .loc 1 360 10 ld.u64 %r417,[%r416]; setp.le.s64 %r418,%r417,0; @ %r418 bra $L51; .loc 1 356 26 add.u64 %r175,%r175,1; .loc 1 356 3 add.u64 %r188,%r188,8; add.u64 %r181,%r181,24; add.u64 %r149,%r149,8; setp.ne.u64 %r419,%r138,%r175; @ %r419 bra $L80; bra $L81; $L79: .loc 1 290 12 mul.lo.u64 %r43,%r41,%r35; .loc 1 386 32 add.u64 %r88,%r158,%r158; ld.u64 %r189,[%frame+480]; .loc 1 430 22 ld.u64 %r102,[%frame+240]; .loc 1 430 12 add.u64 %r103,%r102,%r102; .loc 1 431 23 ld.u64 %r105,[%frame]; .loc 1 432 22 ld.u64 %r107,[%frame+120]; .loc 1 432 12 shl.b64 %r108,%r107,4; .loc 1 434 32 ld.u64 %r29,[%frame+360]; .loc 1 441 23 mul.lo.u64 %r39,%r29,%r102; .loc 1 442 24 mul.lo.u64 %r422,%r29,%r105; .loc 1 442 10 neg.s64 %r92,%r422; .loc 1 443 23 mul.lo.u64 %r423,%r29,%r107; .loc 1 443 9 shl.b64 %r424,%r423,4; neg.s64 %r185,%r424; setp.le.s64 %r601,%r154,1; .loc 1 408 9 setp.ne.u32 %r609,%r237,0; add.u64 %r610,%r155,-1; cvt.u32.u32 %r611,%r23; cvt.s64.s8 %r612,%r611; add.u64 %r613,%r612,-1; add.u64 %r614,%frame,240; add.u64 %r615,%frame,360; $L98: .loc 1 356 3 mov.u64 %r168,%r173; mov.u64 %r167,%r172; .loc 1 386 9 mov.u64 %r197,0; $L85: .loc 1 389 7 ld.s8 %r224,[%r168]; mov.u64 %r230,%r197; .loc 1 386 24 add.u64 %r197,%r197,1; .loc 1 389 6 setp.eq.u32 %r604,%r224,0; @ %r604 bra $L82; .loc 1 397 11 ld.s16 %r145,[%r167]; .loc 1 398 11 mov.u64 %r563,%r197; shr.s64 %r564,%r563,63; .loc 1 408 9 @ %r609 bra $L83; bra $L121; $L82: .loc 1 386 32 add.u64 %r167,%r167,%r88; .loc 1 386 47 add.u64 %r168,%r168,%r43; .loc 1 386 2 setp.ne.u64 %r428,%r155,%r197; @ %r428 bra $L85; .loc 1 385 9 mov.u64 %r563,0; mov.u64 %r564,%r563; bra $L86; $L121: .loc 1 418 8 setp.gt.s64 %r429,%r155,%r230; @ %r429 bra $L87; bra $L86; $L83: .loc 1 409 8 setp.le.s64 %r430,%r155,%r230; @ %r430 bra $L86; add.u64 %r571,%r230,1; shr.s64 %r572,%r571,63; sub.u64 %r434,%r610,%r230; shr.s64 %r437,%r230,63; add.u64 %r440,%r230,2; set.u32.lt.u64 %r445,%r440,%r230; cvt.s64.s32 %r443,%r445; sub.u64 %r449,%r437,%r443; add.u64 %r451,%r434,%r440; set.u32.lt.u64 %r456,%r451,%r434; cvt.s64.s32 %r454,%r456; sub.u64 %r460,%r449,%r454; mov.u64 %r583,1; $L90: .loc 1 411 8 @ %r604 bra $L88; .loc 1 411 18 ld.s16 %r89,[%r167]; .loc 1 411 15 cvt.u16.u32 %r462,%r89; cvt.u16.u32 %r463,%r145; setp.lt.s16 %r464,%r462,%r463; .loc 1 414 16 selp.u64 %r563,%r563,%r571,%r464; selp.u64 %r564,%r564,%r572,%r464; selp.u32 %r145,%r145,%r89,%r464; $L88: .loc 1 409 33 add.u64 %r167,%r167,%r88; .loc 1 409 48 add.u64 %r168,%r168,%r43; .loc 1 409 8 add.u64 %r467,%r571,%r583; set.u32.lt.u64 %r472,%r467,%r571; cvt.s64.s32 %r470,%r472; mov.u64 %r571,%r467; sub.u64 %r572,%r572,%r470; setp.ne.u64 %r479,%r451,%r571; @ %r479 bra $L99; setp.ne.u64 %r482,%r460,%r572; @ ! 
%r482 bra $L86; $L99: .loc 1 411 9 ld.s8 %r224,[%r168]; setp.eq.u32 %r604,%r224,0; bra $L90; $L87: add.u64 %r567,%r230,1; shr.s64 %r568,%r567,63; sub.u64 %r486,%r610,%r230; shr.s64 %r489,%r230,63; add.u64 %r492,%r230,2; set.u32.lt.u64 %r497,%r492,%r230; cvt.s64.s32 %r495,%r497; sub.u64 %r501,%r489,%r495; add.u64 %r503,%r486,%r492; set.u32.lt.u64 %r508,%r503,%r486; cvt.s64.s32 %r506,%r508; sub.u64 %r512,%r501,%r506; .loc 1 418 8 mov.u64 %r597,1; $L113: .loc 1 420 8 ld.s8 %r514,[%r168]; cvt.u16.u32 %r513,%r514; setp.eq.u16 %r515,%r513,0; @ %r515 bra $L91; .loc 1 420 18 ld.s16 %r94,[%r167]; .loc 1 420 15 cvt.u16.u32 %r516,%r94; cvt.u16.u32 %r517,%r145; setp.le.s16 %r518,%r516,%r517; .loc 1 423 16 selp.u64 %r563,%r563,%r567,%r518; selp.u64 %r564,%r564,%r568,%r518; selp.u32 %r145,%r145,%r94,%r518; $L91: .loc 1 418 33 add.u64 %r167,%r167,%r88; .loc 1 418 48 add.u64 %r168,%r168,%r43; .loc 1 418 8 add.u64 %r521,%r567,%r597; set.u32.lt.u64 %r526,%r521,%r567; cvt.s64.s32 %r524,%r526; mov.u64 %r567,%r521; sub.u64 %r568,%r568,%r524; setp.ne.u64 %r533,%r503,%r567; @ %r533 bra $L113; setp.ne.u64 %r536,%r512,%r568; @ %r536 bra $L113; $L86: .loc 1 426 8 st.u64 [%r174],%r563; st.u64 [%r174+8],%r564; .loc 1 429 15 add.u64 %r189,%r189,1; .loc 1 430 12 add.u64 %r172,%r172,%r103; .loc 1 431 13 add.u64 %r173,%r173,%r105; .loc 1 432 12 add.u64 %r174,%r174,%r108; .loc 1 434 13 setp.ne.u64 %r539,%r29,%r189; @ %r539 bra $L98; .loc 1 442 10 add.u64 %r169,%r173,%r92; .loc 1 443 9 add.u64 %r170,%r174,%r185; .loc 1 445 7 @ %r601 bra $L51; add.u64 %r219,%frame,488; .loc 1 441 23 mov.u64 %r112,%r39; .loc 1 445 7 mov.u64 %r218,8; .loc 1 444 5 mov.u64 %r171,1; add.u64 %r605,%frame,120; .loc 1 438 13 mov.u64 %r608,0; bra $L95; $L96: st.u64 [%r219],%r608; .loc 1 441 23 mul.lo.u64 %r112,%r125,%r124; .loc 1 442 24 mul.lo.u64 %r545,%r127,%r124; .loc 1 442 10 sub.u64 %r169,%r173,%r545; .loc 1 443 23 mul.lo.u64 %r546,%r129,%r124; .loc 1 443 9 shl.b64 %r547,%r546,4; sub.u64 %r170,%r174,%r547; .loc 1 444 5 add.u64 %r171,%r171,1; .loc 1 445 7 add.u64 %r219,%r219,8; add.u64 %r218,%r218,8; setp.eq.u64 %r548,%r171,%r613; @ %r548 bra $L51; $L95: .loc 1 453 16 ld.u64 %r549,[%r219]; add.u64 %r124,%r549,1; st.u64 [%r219],%r124; .loc 1 454 23 add.u64 %r551,%r614,%r218; ld.u64 %r125,[%r551]; .loc 1 454 13 sub.u64 %r552,%r125,%r112; add.u64 %r553,%r552,%r552; add.u64 %r172,%r172,%r553; .loc 1 455 24 add.u64 %r554,%frame,%r218; ld.u64 %r127,[%r554]; .loc 1 455 14 add.u64 %r173,%r169,%r127; .loc 1 456 23 add.u64 %r556,%r605,%r218; ld.u64 %r129,[%r556]; .loc 1 456 13 shl.b64 %r557,%r129,4; add.u64 %r174,%r170,%r557; .loc 1 434 32 add.u64 %r559,%r615,%r218; ld.u64 %r133,[%r559]; .loc 1 434 13 setp.eq.u64 %r560,%r124,%r133; @ %r560 bra $L96; .loc 1 438 13 mov.u64 %r189,0; bra $L98; $L119: .loc 1 312 6 ld.u64 %r561,[%r233]; setp.eq.u64 %r562,%r561,0; @ ! 
%r562 bra $L66; bra $L97; $L51_gfortran_smaxloc1_16_i2 .visible .func _gfortran_smaxloc1_16_, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L123; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L124; r152154; call _gfortran_maxloc1_16_490 7 bra $L122; $L124: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r297,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L126; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L127; $L133: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L128; bra $L181; $L126: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error127: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r303,0; $L132: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L130; .loc 1 505 17 st.u64 [%r139],%r36; bra $L131; $L130: .loc 1 508 12 st.u64 [%r139],%r303; $L131: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L132; bra $L133; $L181: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L135; bra $L134; $L128: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r302,0; $L138: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L136; .loc 1 513 17 st.u64 [%r43],%r41; bra $L137; $L136: .loc 1 517 12 st.u64 [%r43],%r302; $L137: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L138; bra $L182; $L134: .loc 1 524 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L140; add.u64 %r296,%frame,120; $L160: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L141; $L140: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L142; bra $L183; $L144: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L141: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L144; bra $L140; $L142: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L122; $L183: .loc 1 547 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L148: .loc 1 573 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L145; bra $L146; $L135: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L147; .loc 1 552 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error147: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L148; .loc 1 559 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L146; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L150: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L149; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error149: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L150; $L145: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L151; shl.b64 %r266,%r297,3; bra $L152; $L151: mov.u64 %r266,8; $L152: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L153: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L153; $L146: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,4; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 583 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L159: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L159; .loc 1 596 7 @ ! %r295 bra $L184; bra $L122; $L158: .loc 1 591 13 st.u64 [%r99],%r280; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L157; bra $L122; $L184: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; $L157: .loc 1 600 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L158; .loc 1 591 13 mov.u64 %r80,0; bra $L159; $L182: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r294,%r98,0; @ ! 
%r294 bra $L135; bra $L160; $L122: .loc 1 605 1 ret; } maxloc1_4_i4.o/_gfortran_maxloc1_4_i4 .visible .func _gfortran_maxloc1_4_fortran/generated/maxloc1_4_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_4_i4 .visible .func _gfortran_mmaxloc1_4__gfortran_smaxloc1_4_i4 .visible .func _gfortran_smaxloc1_4_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_maxloc1_4_i4 .visible .func _gfortran_maxloc1_4_i4480pred %r221; .reg .pred %r222; .reg .u64u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .pred %r232; .reg .predu64 %r280; .reg .u64 %r282; .reg .u64predu32u64 %r317; .reg .u64pred %r325; .reg .pred %r326; .reg .u32 %r327; .reg .pred %r328; .reg .pred %r329; .reg .u64 %r335; .reg .u64pred %r358; .reg .u32 %r359; .reg .u64u64 %r365; mov.u64 %r182,%ar0; mov.u64 %r183,%ar1; mov.u64 %r184,%ar2; mov.u32 %r185,%ar3; .loc 1 58 10 ld.s8 %r22,[%r183+28]; .loc 1 58 38 add.u32 %r186,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r108,%r186; .loc 1 59 10 ld.u64 %r25,[%r184]; .loc 1 59 7 add.u64 %r109,%r25,-1; .loc 1 61 7 shr.u64 %r188,%r109,63; set.u32.lt.s64 %r190,%r108,%r109; neg.s32 %r191,%r190; cvt.u16.u64 %r194,%r188; cvt.u16.u32 %r195,%r191; or.b16 %r193,%r194,%r195; .loc 1 61 6 cvt.u32.u16 %r196,%r193; cvt.u16.u8 %r197,%r196; setp.eq.u16 %r198,%r197,0; @ %r198 bra $L2; .loc 1 63 7 cvt.u32.u32 %r201,%r22; cvt.s64.s8 %r200,%r201; st.u64 [%stack+8],%r20019_gfortran_runtime_errorr204,%r109,%r109; add.u64 %r205,%r204,%r109; shl.b64 %r206,%r205,3; add.u64 %r207,%r183,%r206; ld.u64 %r31,[%r207+56]; ld.u64 %r33,[%r207+48]; .loc 1 71 9 ld.u64 %r112,[%r207+40]; .loc 1 73 3 setp.ne.u64 %r221,%r109,0; @ %r221 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r222,%r108,%r109; @ %r222 bra $L4; bra $L49; $L3: add.u64 %r29,%r183,40; add.u64 %r125,%frame,120; add.u64 %r126,%frame,240; add.u64 %r223,%r183,16; add.u64 %r225,%r25,%r25; add.u64 %r226,%r225,%r25; shl.b64 %r227,%r226,3; add.u64 %r147,%r223,%r227; .loc 1 79 12 mov.u64 %r365,0; $L8: .loc 1 75 18 ld.u64 %r228,[%r29]; st.u64 [%r125],%r228; .loc 1 76 
19 ld.u64 %r230,[%r29+16]; add.u64 %r229,%r230,1; ld.u64 %r231,[%r29+8]; sub.u64 %r38,%r229,%r231; .loc 1 78 10 setp.lt.s64 %r232,%r38,0; @ %r232 bra $L6; .loc 1 76 17 st.u64 [%r126],%r38; bra $L7; $L6: .loc 1 79 12 st.u64 [%r126],%r365; $L7: .loc 1 73 3 add.u64 %r29,%r29,24; add.u64 %r125,%r125,8; add.u64 %r126,%r126,8; setp.ne.u64 %r234,%r29,%r147; @ %r234 bra $L8; bra $L9; $L49: .loc 1 90 6 ld.u64 %r235,[%r182]; setp.eq.u64 %r236,%r235,0; @ ! %r236 bra $L11; bra $L10; $L4: add.u64 %r238,%r25,%r25; add.u64 %r239,%r238,%r25; shl.b64 %r240,%r239,3; add.u64 %r241,%r240,40; add.u64 %r137,%r183,%r241; shl.b64 %r242,%r25,3; add.u64 %r105,%r242,-8; add.u64 %r352,%frame,120; add.u64 %r127,%r352,%r105; add.u64 %r351,%frame,240; add.u64 %r103,%r351,%r105; cvt.u32.u32 %r246,%r22; cvt.s64.s8 %r245,%r246; add.u64 %r248,%r245,%r245; add.u64 %r249,%r248,%r245; shl.b64 %r250,%r249,3; add.u64 %r251,%r183,40; add.u64 %r64,%r250,%r251; .loc 1 87 12 mov.u64 %r364,0; $L14: .loc 1 83 18 ld.u64 %r252,[%r137]; st.u64 [%r127],%r252; .loc 1 84 19 ld.u64 %r254,[%r137+16]; add.u64 %r253,%r254,1; ld.u64 %r255,[%r137+8]; sub.u64 %r44,%r253,%r255; .loc 1 86 10 setp.lt.s64 %r256,%r44,0; @ %r256 bra $L12; .loc 1 84 17 st.u64 [%r103],%r44; bra $L13; $L12: .loc 1 87 12 st.u64 [%r103],%r364; $L13: .loc 1 81 3 add.u64 %r137,%r137,24; add.u64 %r127,%r127,8; add.u64 %r103,%r103,8; setp.ne.u64 %r258,%r64,%r137; @ %r258 bra $L14; bra $L50; $L10: .loc 1 94 7 setp.le.s64 %r259,%r108,0; @ %r259 bra $L16; add.u64 %r351,%frame,240; $L37: add.u64 %r154,%r182,40; mov.u64 %r152,%r351; cvt.u32.u32 %r261,%r22; cvt.s64.s8 %r260,%r261; add.u64 %r263,%r260,%r260; add.u64 %r264,%r263,%r260; shl.b64 %r265,%r264,3; add.u64 %r266,%r182,16; add.u64 %r138,%r265,%r266; .loc 1 97 10 mov.u64 %r100,1; .loc 1 101 4 mov.u64 %r289,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r267,0; st.u64 [%r182+8],%r267; .loc 1 106 28 cvt.u16.u32 %r270,%r22; add.u16 %r269,%r270,-1; cvt.u32.u16 %r271,%r269; st.u8 [%r182+28],%r271; .loc 1 108 20 add.u32 %r272,%r22,-2; cvt.s64.s32 %r54,%r272; add.u64 %r274,%r54,%r54; add.u64 %r275,%r274,%r54; shl.b64 %r276,%r275,3; add.u64 %r277,%r182,%r276; .loc 1 108 67 shl.b64 %r279,%r54,3; add.u64 %r280,%frame,%r279; .loc 1 108 59 ld.u64 %r282,[%r277+40]; ld.u64 %r283,[%r280+240]; mul.lo.u64 %r114,%r282,%r283; .loc 1 110 29285; call (%value_in),_gfortrani_xmallocarray286,[%value_in]; } .loc 1 110 27 st.u64 [%r182],%r286; .loc 1 111 10 setp.eq.u64 %r288,%r114,0; @ ! 
%r288 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r100,%r50,%r100; $L17: .loc 1 101 4 st.u64 [%r154+8],%r289; ld.u64 %r50,[%r152]; add.u64 %r290,%r50,-1; st.u64 [%r154+16],%r290; st.u64 [%r154],%r100; .loc 1 94 7 add.u64 %r154,%r154,24; add.u64 %r152,%r152,8; setp.ne.u64 %r291,%r138,%r154; @ %r291 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r182+48],%r114; mov.u64 %r293,-1; st.u64 [%r182+56],%r293; mov.u64 %r294,1; st.u64 [%r182+40],%r294; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r60,[%r182+28]; .loc 1 121 10 setp.eq.u64 %r295,%r60,%r108; @ %r295 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r108; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r298,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r299,[%r298+36]; setp.eq.u32 %r300,%r299,0; @ %r300 bra $L21; .loc 1 128 2 add.u64 %r351,%frame,240; cvta.const.u64 %r304,$LC2r351303304; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r306,%r108,0; @ %r306 bra $L24; $L27: .loc 1 68 9 add.u64 %r307,%r31,1; .loc 1 68 7 sub.u64 %r110,%r307,%r33; max.s64 %r45,%r110,0; .loc 1 140 8 ld.u64 %r122,[%r183]; .loc 1 141 8 ld.u64 %r123,[%r182]; .loc 1 180 31 shl.b64 %r68,%r112,2; ld.u64 %r75,[%frame+360]; .loc 1 194 22 ld.u64 %r70,[%frame+120]; .loc 1 194 12 shl.b64 %r71,%r70,2; .loc 1 195 22 ld.u64 %r73,[%frame]; .loc 1 195 12 shl.b64 %r74,%r73,2; .loc 1 197 32 ld.u64 %r143,[%frame+240]; .loc 1 204 23 mul.lo.u64 %r308,%r70,%r143; .loc 1 204 9 shl.b64 %r309,%r308,2; neg.s64 %r134,%r309; .loc 1 205 23 mul.lo.u64 %r310,%r73,%r143; .loc 1 205 9 shl.b64 %r311,%r310,2; neg.s64 %r128,%r311; setp.gt.s64 %r353,%r110,0; setp.le.s64 %r354,%r108,1; .loc 1 155 9 mov.u32 %r357,-2147483648; setp.eq.u32 %r358,%r185,0; cvt.u32.u32 %r359,%r22; cvt.s64.s8 %r360,%r359; add.u64 %r361,%r360,-1; add.u64 %r362,%frame,120; add.u64 %r363,%frame,240; bra $L25; $L24: add.u64 %r167,%frame,360; add.u64 %r166,%r182,40; mov.u64 %r164,%frame; cvt.u32.u32 %r313,%r22; cvt.s64.s8 %r312,%r313; add.u64 %r155,%r312,-1; .loc 1 132 3 mov.u64 %r124,0; add.u64 %r351,%frame,240; .loc 1 134 16 mov.u64 %r314,%r124; $L26: st.u64 [%r167],%r314; .loc 1 135 18 ld.u64 %r315,[%r166]; st.u64 [%r164],%r315; .loc 1 136 17 shl.b64 %r317,%r124,3; add.u64 %r318,%r351,%r317; .loc 1 136 10 ld.u64 %r319,[%r318]; setp.le.s64 %r320,%r319,0; @ %r320 bra $L1; .loc 1 132 26 add.u64 %r124,%r124,1; .loc 1 132 3 add.u64 %r167,%r167,8; add.u64 %r166,%r166,24; add.u64 %r164,%r164,8; setp.ne.u64 %r321,%r124,%r155; @ %r321 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r353 bra $L38; .loc 1 159 10 mov.u32 %r323,0; st.u32 [%r123],%r323; bra $L29; $L38: mov.u64 %r118,%r122; .loc 1 155 9 mov.u32 %r102,%r357; .loc 1 157 9 mov.u32 %r101,1; .loc 1 178 8 mov.u64 %r117,0; $L28: .loc 1 182 14 ld.u32 %r181,[%r118]; .loc 1 182 7 @ %r358 bra $L30; .loc 1 182 6 setp.le.s32 %r325,%r102,%r181; @ %r325 bra $L31; bra $L32; $L30: setp.ge.s32 %r326,%r102,%r181; @ %r326 bra $L32; $L31: cvt.u32.u64 %r327,%r117; add.u32 %r101,%r327,1; .loc 1 182 14 mov.u32 %r102,%r181; $L32: .loc 1 180 23 add.u64 %r117,%r117,1; .loc 1 180 31 add.u64 %r118,%r118,%r68; .loc 1 180 6 setp.gt.s64 %r328,%r45,%r117; @ %r328 bra $L28; .loc 1 189 12 st.u32 [%r123],%r101; $L29: .loc 1 193 15 add.u64 %r75,%r75,1; .loc 1 194 12 add.u64 %r122,%r122,%r71; .loc 1 195 12 add.u64 %r123,%r123,%r74; .loc 1 197 13 setp.ne.u64 %r329,%r75,%r143; @ %r329 bra $L25; .loc 1 204 9 add.u64 %r119,%r122,%r134; .loc 1 205 9 add.u64 %r120,%r123,%r128; .loc 1 207 7 @ %r354 bra $L1; add.u64 %r133,%frame,368; mov.u64 %r149,8; .loc 1 
206 5 mov.u64 %r121,1; .loc 1 201 13 mov.u64 %r356,0; bra $L35; $L36: st.u64 [%r133],%r356; .loc 1 204 23 mul.lo.u64 %r335,%r87,%r86; .loc 1 204 9 shl.b64 %r336,%r335,2; sub.u64 %r119,%r122,%r336; .loc 1 205 23 mul.lo.u64 %r337,%r90,%r86; .loc 1 205 9 shl.b64 %r338,%r337,2; sub.u64 %r120,%r123,%r338; .loc 1 206 5 add.u64 %r121,%r121,1; .loc 1 207 7 add.u64 %r133,%r133,8; add.u64 %r149,%r149,8; setp.eq.u64 %r339,%r121,%r361; @ %r339 bra $L1; $L35: .loc 1 215 16 ld.u64 %r340,[%r133]; add.u64 %r86,%r340,1; st.u64 [%r133],%r86; .loc 1 216 23 add.u64 %r342,%r362,%r149; ld.u64 %r87,[%r342]; .loc 1 216 13 shl.b64 %r343,%r87,2; add.u64 %r122,%r119,%r343; .loc 1 217 23 add.u64 %r344,%frame,%r149; ld.u64 %r90,[%r344]; .loc 1 217 13 shl.b64 %r345,%r90,2; add.u64 %r123,%r120,%r345; .loc 1 197 32 add.u64 %r347,%r363,%r149; ld.u64 %r93,[%r347]; .loc 1 197 13 setp.eq.u64 %r348,%r86,%r93; @ %r348 bra $L36; .loc 1 201 13 mov.u64 %r75,0; bra $L25; $L50: .loc 1 90 6 ld.u64 %r349,[%r182]; setp.eq.u64 %r350,%r349,0; @ ! %r350 bra $L11; bra $L37; $L1:_gfortran_mmaxloc1_4_i4 .visible .func _gfortran_mmaxloc1_4_, .param .u32 %in_ar4) {60898u16 %r238; .reg .u16u64 %r303; .reg .u64 %r304; .reg .u64 %r305; .reg .u64 %r306; .reg .u64pred %r317; .reg .u64 %r318; .reg .pred %r319; .reg .u64u32u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64u64predu32 %r385; .reg .predu64 %r394; .reg .predu64predu64 %r410; .reg .u32 %r412; .reg .pred %r414; .reg .pred %r415; .reg .pred %r416; .reg .predpred %r423; .reg .predpredpred %r446; .reg .u64 %r447; .reg .predmov.u64 %r221,%ar0; mov.u64 %r222,%ar1; mov.u64 %r223,%ar2; mov.u64 %r224,%ar3; mov.u32 %r225,%ar4; .loc 1 251 6 setp.ne.u64 %r226,%r224,0; @ %r226 bra $L223225; call _gfortran_maxloc1_4_7 bra $L51; $L52: .loc 1 261 10 ld.u64 %r22,[%r223]; .loc 1 261 7 add.u64 %r151,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r222+28]; .loc 1 262 38 add.u32 %r231,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r152,%r231; .loc 1 265 7 shr.u64 %r233,%r151,63; set.u32.gt.s64 %r235,%r151,%r152; neg.s32 %r236,%r235; cvt.u16.u64 %r239,%r233; cvt.u16.u32 %r240,%r236; or.b16 %r238,%r239,%r240; .loc 1 265 6 cvt.u32.u16 %r241,%r238; cvt.u16.u8 %r242,%r241; setp.eq.u16 %r243,%r242,0; @ %r243 bra $L54; .loc 1 267 7 cvt.u32.u32 %r246,%r23; cvt.s64.s8 %r245,%r246; st.u64 [%stack+8],%r245; st.u64 [%stack],%r22; cvta.const.u64 %r244244_gfortran_runtime_error4: .loc 1 272 9 add.u64 %r449,%r151,%r151; add.u64 %r455,%r449,%r151; shl.b64 %r251,%r455,3; add.u64 %r252,%r222,%r251; ld.u64 %r255,[%r252+56]; add.u64 %r254,%r255,1; .loc 1 272 7 ld.u64 %r262,[%r252+48]; sub.u64 %r153,%r254,%r262; .loc 1 273 6 setp.le.s64 %r263,%r153,0; @ %r263 bra $L51; .loc 1 276 9 ld.u64 %r169,[%r224]; .loc 1 278 15 ld.u64 %r34,[%r224+16]; .loc 1 280 22 cvt.u32.u64 %r35,%r34; .loc 1 280 53 add.u32 %r264,%r35,-4; and.b32 %r265,%r264,-5; set.u32.eq.u32 %r267,%r265,0; neg.s32 %r268,%r267; .loc 1 280 22 add.u32 %r269,%r35,-1; set.u32.le.u32 %r271,%r269,1; neg.s32 %r272,%r271; .loc 1 280 6 cvt.u16.u32 %r274,%r268; cvt.u16.u32 %r275,%r272; or.loc 1 282 7 setp.ne.u32 %r280,%r35,16; @ %r280 bra $L57; $L56: .loc 1 289 9 shl.b64 %r284,%r455,3; add.u64 %r285,%r222,%r284; ld.u64 %r156,[%r285+40]; .loc 1 290 12 add.u64 %r291,%r224,%r284; ld.u64 %r41,[%r291+40]; .loc 1 292 3 setp.ne.u64 %r293,%r151,0; @ %r293 bra $L58; $L64: .loc 1 302 3 setp.lt.s64 %r294,%r151,%r152; @ %r294 bra $L59; bra $L116; $L57: .loc 1 287 5 cvta.const.u64 %r2955_gfortran_runtime_error8: add.u64 %r96,%r222,40; add.u64 %r68,%r224,40; add.u64 %r298,%r22,%r22; add.u64 %r299,%r298,%r22; 
shl.b64 %r300,%r299,3; add.u64 %r301,%r222,16; add.u64 %r39,%r300,%r301; .loc 1 292 3 mov.u64 %r139,0; add.u64 %r452,%frame,240; add.u64 %r451,%frame,360; .loc 1 299 12 mov.u64 %r465,%r139; $L63: .loc 1 294 18 add.u64 %r303,%r452,%r139; ld.u64 %r304,[%r96]; st.u64 [%r303],%r304; .loc 1 295 18 add.u64 %r305,%frame,%r139; .loc 1 295 20 ld.u64 %r307,[%r68]; mul.lo.u64 %r306,%r307,%r34; .loc 1 295 18 st.u64 [%r305],%r306; .loc 1 296 19 ld.u64 %r309,[%r96+16]; add.u64 %r308,%r309,1; ld.u64 %r310,[%r96+8]; sub.u64 %r53,%r308,%r310; .loc 1 298 10 setp.lt.s64 %r311,%r53,0; @ %r311 bra $L61; .loc 1 296 17 add.u64 %r313,%r451,%r139; st.u64 [%r313],%r53; bra $L62; $L61: .loc 1 299 12 add.u64 %r315,%r451,%r139; st.u64 [%r315],%r465; $L62: .loc 1 292 3 add.u64 %r96,%r96,24; add.u64 %r68,%r68,24; add.u64 %r139,%r139,8; setp.ne.u64 %r317,%r39,%r96; @ %r317 bra $L63; bra $L64; $L116: .loc 1 312 6 ld.u64 %r318,[%r221]; setp.eq.u64 %r319,%r318,0; @ ! %r319 bra $L66; bra $L65; $L59: add.u64 %r321,%r22,%r22; add.u64 %r322,%r321,%r22; shl.b64 %r323,%r322,3; add.u64 %r130,%r323,40; add.u64 %r134,%r222,%r130; add.u64 %r111,%r224,%r130; shl.b64 %r324,%r22,3; add.u64 %r58,%r324,-8; cvt.u32.u32 %r326,%r23; cvt.s64.s8 %r325,%r326; shl.b64 %r327,%r325,3; add.u64 %r69,%r327,-8; add.u64 %r452,%frame,240; add.u64 %r451,%frame,360; .loc 1 309 12 mov.u64 %r464,0; $L69: .loc 1 304 18 add.u64 %r329,%r452,%r58; ld.u64 %r330,[%r134]; st.u64 [%r329],%r330; .loc 1 305 18 add.u64 %r331,%frame,%r58; .loc 1 305 20 ld.u64 %r333,[%r111]; mul.lo.u64 %r332,%r333,%r34; .loc 1 305 18 st.u64 [%r331],%r332; .loc 1 306 19 ld.u64 %r335,[%r134+16]; add.u64 %r334,%r335,1; ld.u64 %r336,[%r134+8]; sub.u64 %r65,%r334,%r336; .loc 1 308 10 setp.lt.s64 %r337,%r65,0; @ %r337 bra $L67; .loc 1 306 17 add.u64 %r339,%r451,%r58; st.u64 [%r339],%r65; bra $L68; $L67: .loc 1 309 12 add.u64 %r341,%r451,%r58; st.u64 [%r341],%r464; $L68: .loc 1 302 3 add.u64 %r134,%r134,24; add.u64 %r111,%r111,24; add.u64 %r58,%r58,8; setp.ne.u64 %r343,%r69,%r58; @ %r343 bra $L69; bra $L117; $L65: .loc 1 316 7 setp.eq.u64 %r344,%r152,0; @ %r344 bra $L71; add.u64 %r451,%frame,360; $L96: add.u64 %r148,%r221,40; mov.u64 %r146,%r451; cvt.u32.u32 %r346,%r23; cvt.s64.s8 %r345,%r346; add.u64 %r348,%r345,%r345; add.u64 %r349,%r348,%r345; shl.b64 %r350,%r349,3; add.u64 %r351,%r221,16; add.u64 %r135,%r350,%r351; .loc 1 319 10 mov.u64 %r136,1; .loc 1 323 4 mov.u64 %r370,0; bra $L72; $L71: .loc 1 327 20 add.u32 %r352,%r23,-2; cvt.s64.s32 %r76,%r352; add.u64 %r354,%r76,%r76; add.u64 %r355,%r354,%r76; shl.b64 %r356,%r355,3; add.u64 %r357,%r221,%r356; .loc 1 327 67 shl.b64 %r359,%r76,3; add.u64 %r360,%frame,%r359; .loc 1 327 59 ld.u64 %r362,[%r357+40]; ld.u64 %r363,[%r360+360]; mul.lo.u64 %r160,%r362,%r363; .loc 1 329 24 mov.u64 %r364,0; st.u64 [%r221+8],%r364; .loc 1 330 28 cvt.u16.u32 %r367,%r23; add.u16 %r366,%r367,-1; cvt.u32.u16 %r368,%r366; st.u8 [%r221+28],%r368; .loc 1 332 10 setp.eq.u64 %r369,%r160,0; @ %r369 bra $L73; bra $L118; $L75: .loc 1 321 47 mul.lo.u64 %r136,%r73,%r136; $L72: .loc 1 323 4 st.u64 [%r148+8],%r370; ld.u64 %r73,[%r146]; add.u64 %r371,%r73,-1; st.u64 [%r148+16],%r371; st.u64 [%r148],%r136; .loc 1 316 7 add.u64 %r148,%r148,24; add.u64 %r146,%r146,8; setp.ne.u64 %r372,%r135,%r148; @ %r372 bra $L75; bra $L71; $L73: .loc 1 335 4 st.u64 [%r221+48],%r160; mov.u64 %r374,-1; st.u64 [%r221+56],%r374; mov.u64 %r375,1; st.u64 [%r221+40],%r375; .loc 1 336 4 bra $L51; $L118: .loc 1 339 24call (%value_in),_gfortrani_xmallocarray78,[%value_in]; } .loc 1 339 22 st.u64 
[%r221],%r378; bra $L76; $L66: .loc 1 344 19 ld.s8 %r380,[%r221+28]; .loc 1 344 10 setp.eq.u64 %r381,%r380,%r152; @ %r381 bra $L77; .loc 1 345 2 cvta.const.u64 %r382,$LC5; {stack; call _gfortran_runtime_error77: .loc 1 347 11 cvta.global.u64 %r384,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r385,[%r384+36]; setp.eq.u32 %r386,%r385,0; @ %r386 bra $L76; .loc 1 349 4 add.u64 %r451,%frame,360; cvta.const.u64 %r390,$LC2390; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r394,$LC222394390; call _gfortrani_bounds_equal_extents$L76: .loc 1 356 3 setp.ne.u64 %r396,%r152,0; @ %r396 bra $L78; $L81: .loc 1 364 8 ld.u64 %r170,[%r221]; .loc 1 365 8 ld.u64 %r168,[%r222]; .loc 1 367 9 setp.ne.u64 %r397,%r168,0; @ %r397 bra $L79; bra $L51; $L78: add.u64 %r198,%frame,480; add.u64 %r197,%r221,40; add.u64 %r195,%frame,120; cvt.u32.u32 %r399,%r23; cvt.s64.s8 %r398,%r399; add.u64 %r164,%r398,-1; .loc 1 356 3 mov.u64 %r171,0; add.u64 %r451,%frame,360; .loc 1 358 16 mov.u64 %r400,%r171; $L80: st.u64 [%r198],%r400; .loc 1 359 18 ld.u64 %r401,[%r197]; st.u64 [%r195],%r401; .loc 1 360 17 shl.b64 %r403,%r171,3; add.u64 %r404,%r451,%r403; .loc 1 360 10 ld.u64 %r405,[%r404]; setp.le.s64 %r406,%r405,0; @ %r406 bra $L51; .loc 1 356 26 add.u64 %r171,%r171,1; .loc 1 356 3 add.u64 %r198,%r198,8; add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; setp.ne.u64 %r407,%r164,%r171; @ %r407 bra $L80; bra $L81; $L79: .loc 1 290 12 mul.lo.u64 %r44,%r41,%r34; .loc 1 386 32 shl.b64 %r91,%r156,2; ld.u64 %r187,[%frame+480]; .loc 1 430 22 ld.u64 %r101,[%frame+240]; .loc 1 430 12 shl.b64 %r102,%r101,2; .loc 1 431 23 ld.u64 %r104,[%frame]; .loc 1 432 22 ld.u64 %r106,[%frame+120]; .loc 1 432 12 shl.b64 %r107,%r106,2; .loc 1 434 32 ld.u64 %r29,[%frame+360]; .loc 1 441 23 mul.lo.u64 %r38,%r29,%r101; .loc 1 442 24 mul.lo.u64 %r408,%r29,%r104; .loc 1 442 10 neg.s64 %r137,%r408; .loc 1 443 23 mul.lo.u64 %r409,%r29,%r106; .loc 1 443 9 shl.b64 %r410,%r409,2; neg.s64 %r188,%r410; setp.le.s64 %r450,%r152,1; .loc 1 408 9 setp.ne.u32 %r457,%r225,0; cvt.u32.u32 %r458,%r23; cvt.s64.s8 %r459,%r458; add.u64 %r460,%r459,-1; add.u64 %r461,%frame,240; add.u64 %r462,%frame,360; add.u64 %r463,%frame,120; $L97: .loc 1 356 3 mov.u64 %r163,%r169; mov.u64 %r162,%r168; .loc 1 386 9 mov.u64 %r161,0; $L85: .loc 1 389 7 ld.s8 %r220,[%r163]; .loc 1 389 6 setp.eq.u32 %r453,%r220,0; @ %r453 bra $L82; .loc 1 397 11 ld.u32 %r145,[%r162]; .loc 1 398 11 cvt.u32.u64 %r412,%r161; add.u32 %r143,%r412,1; .loc 1 408 9 @ %r457 bra $L83; bra $L119; $L82: .loc 1 386 24 add.u64 %r161,%r161,1; .loc 1 386 32 add.u64 %r162,%r162,%r91; .loc 1 386 47 add.u64 %r163,%r163,%r44; .loc 1 386 2 setp.ne.u64 %r414,%r153,%r161; @ %r414 bra $L85; .loc 1 385 9 mov.u32 %r143,%r220; bra $L86; $L119: .loc 1 418 8 setp.gt.s64 %r415,%r153,%r161; @ %r415 bra $L87; bra $L86; $L83: .loc 1 409 8 setp.le.s64 %r416,%r153,%r161; @ %r416 bra $L86; add.u64 %r205,%r161,1; $L89: .loc 1 411 8 @ %r453 bra $L88; .loc 1 411 18 ld.u32 %r93,[%r162]; .loc 1 411 15 setp.lt.s32 %r418,%r93,%r145; @ %r418 bra $L88; .loc 1 414 16 cvt.u32.u64 %r143,%r205; mov.u32 %r145,%r93; $L88: .loc 1 409 33 add.u64 %r162,%r162,%r91; .loc 1 409 48 add.u64 %r163,%r163,%r44; .loc 1 409 8 add.u64 %r206,%r205,1; setp.eq.u64 %r419,%r153,%r205; @ %r419 bra $L86; .loc 1 411 9 ld.s8 %r220,[%r163]; mov.u64 %r205,%r206; setp.eq.u32 %r453,%r220,0; bra $L89; $L87: add.u64 %r201,%r161,1; bra $L91; $L98: mov.u64 %r201,%r202; $L91: .loc 1 420 8 ld.s8 %r421,[%r163]; cvt.u16.u32 %r420,%r421; setp.eq.u16 %r422,%r420,0; @ %r422 bra $L90; .loc 1 420 
18 ld.u32 %r98,[%r162]; .loc 1 420 15 setp.le.s32 %r423,%r98,%r145; @ %r423 bra $L90; .loc 1 423 16 cvt.u32.u64 %r143,%r201; mov.u32 %r145,%r98; $L90: .loc 1 418 33 add.u64 %r162,%r162,%r91; .loc 1 418 48 add.u64 %r163,%r163,%r44; .loc 1 418 8 add.u64 %r202,%r201,1; setp.ne.u64 %r424,%r153,%r201; @ %r424 bra $L98; $L86: .loc 1 426 8 st.u32 [%r170],%r143; .loc 1 429 15 add.u64 %r187,%r187,1; .loc 1 430 12 add.u64 %r168,%r168,%r102; .loc 1 431 13 add.u64 %r169,%r169,%r104; .loc 1 432 12 add.u64 %r170,%r170,%r107; .loc 1 434 13 setp.ne.u64 %r425,%r29,%r187; @ %r425 bra $L97; .loc 1 442 10 add.u64 %r165,%r169,%r137; .loc 1 443 9 add.u64 %r166,%r170,%r188; .loc 1 445 7 @ %r450 bra $L51; add.u64 %r150,%frame,488; .loc 1 441 23 mov.u64 %r110,%r38; .loc 1 445 7 mov.u64 %r215,8; .loc 1 444 5 mov.u64 %r167,1; .loc 1 438 13 mov.u64 %r456,0; bra $L94; $L95: st.u64 [%r150],%r456; .loc 1 441 23 mul.lo.u64 %r110,%r123,%r122; .loc 1 442 24 mul.lo.u64 %r431,%r125,%r122; .loc 1 442 10 sub.u64 %r165,%r169,%r431; .loc 1 443 23 mul.lo.u64 %r432,%r127,%r122; .loc 1 443 9 shl.b64 %r433,%r432,2; sub.u64 %r166,%r170,%r433; .loc 1 444 5 add.u64 %r167,%r167,1; .loc 1 445 7 add.u64 %r150,%r150,8; add.u64 %r215,%r215,8; setp.eq.u64 %r434,%r167,%r460; @ %r434 bra $L51; $L94: .loc 1 453 16 ld.u64 %r435,[%r150]; add.u64 %r122,%r435,1; st.u64 [%r150],%r122; .loc 1 454 23 add.u64 %r437,%r461,%r215; ld.u64 %r123,[%r437]; .loc 1 454 13 sub.u64 %r438,%r123,%r110; shl.b64 %r439,%r438,2; add.u64 %r168,%r168,%r439; .loc 1 455 24 add.u64 %r440,%frame,%r215; ld.u64 %r125,[%r440]; .loc 1 455 14 add.u64 %r169,%r165,%r125; .loc 1 456 23 add.u64 %r442,%r463,%r215; ld.u64 %r127,[%r442]; .loc 1 456 13 shl.b64 %r443,%r127,2; add.u64 %r170,%r166,%r443; .loc 1 434 32 add.u64 %r445,%r462,%r215; ld.u64 %r131,[%r445]; .loc 1 434 13 setp.eq.u64 %r446,%r122,%r131; @ %r446 bra $L95; .loc 1 438 13 mov.u64 %r187,0; bra $L97; $L117: .loc 1 312 6 ld.u64 %r447,[%r221]; setp.eq.u64 %r448,%r447,0; @ ! 
%r448 bra $L66; bra $L96; $L51_gfortran_smaxloc1_4_i4 .visible .func _gfortran_smaxloc1_4_, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u32 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L121; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L122; $L121:r152154; call _gfortran_maxloc1_4_490 7 bra $L120; $L122: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r296,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L124; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L125; $L131: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L126; bra $L179; $L124: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error125: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r302,0; $L130: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L128; .loc 1 505 17 st.u64 [%r139],%r36; bra $L129; $L128: .loc 1 508 12 st.u64 [%r139],%r302; $L129: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L130; bra $L131; $L179: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L133; bra $L132; $L126: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r301,0; $L136: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L134; .loc 1 513 17 st.u64 [%r43],%r41; bra $L135; $L134: .loc 1 517 12 st.u64 [%r43],%r301; $L135: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L136; bra $L180; $L132: .loc 1 524 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L138; add.u64 %r295,%frame,120; $L158: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L139; $L138: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L140; bra $L181; $L142: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L139: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L142; bra $L138; $L140: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L120; $L181: .loc 1 547 2496246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L146: .loc 1 573 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L143; bra $L144; $L133: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L145; .loc 1 552 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error145: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L146; .loc 1 559 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L144; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L148: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L147; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error147: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L148; $L143: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ %r268 bra 
$L149; shl.b64 %r266,%r296,3; bra $L150; $L149: mov.u64 %r266,8; $L150: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L151: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L151; $L144: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,2; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 583 13 mov.u32 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L157: st.u32 [%r98],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L157; .loc 1 596 7 @ ! %r294 bra $L182; bra $L120; $L156: .loc 1 591 13 st.u64 [%r99],%r297; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L155; bra $L120; $L182: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; .loc 1 591 13 mov.u64 %r297,0; $L155: .loc 1 600 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,2; add.u64 %r98,%r98,%r291; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L156; .loc 1 591 13 mov.u64 %r80,0; bra $L157; $L180: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r293,%r98,0; @ ! 
%r293 bra $L133; bra $L158; $L120: .loc 1 605 1 ret; } maxloc1_8_i4.o/_gfortran_maxloc1_8_i4 .visible .func _gfortran_maxloc1_8_fortran/generated/maxloc1_8_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_8_i4 .visible .func _gfortran_mmaxloc1_8__gfortran_smaxloc1_8_i4 .visible .func _gfortran_smaxloc1_8_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_maxloc1_8_i4 .visible .func _gfortran_maxloc1_8_i448095predu64u64 %r229; .reg .u64 %r230; .reg .predpred %r255; .reg .predu64 %r266; .reg .u16u64predpred %r315; .reg .predu64 %r327; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .pred %r337; .reg .u64 %r338; .reg .u64 %r340; .reg .u64 %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r345; .reg .pred %r346; .reg .u64 %r347; .reg .pred %r348; .reg .u64 %r349; .reg .predu64 %r353; .reg .pred %r355; .reg .u64u64 %r359; .reg .u64mov.u64 %r181,%ar0; mov.u64 %r182,%ar1; mov.u64 %r183,%ar2; mov.u32 %r184,%ar3; .loc 1 58 10 ld.s8 %r22,[%r182+28]; .loc 1 58 38 add.u32 %r185,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r110,%r185; .loc 1 59 10 ld.u64 %r25,[%r183]; .loc 1 59 7 add.u64 %r111,%r25,-1; .loc 1 61 7 shr.u64 %r187,%r111,63; set.u32.lt.s64 %r189,%r110,%r111; neg.s32 %r190,%r189; cvt.u16.u64 %r193,%r187; cvt.u16.u32 %r194,%r190; or.b16 %r192,%r193,%r194; .loc 1 61 6 cvt.u32.u16 %r195,%r192; cvt.u16.u8 %r196,%r195; setp.eq.u16 %r197,%r196,0; @ %r197 bra $L2; .loc 1 63 7 cvt.u32.u32 %r200,%r22; cvt.s64.s8 %r199,%r200; st.u64 [%stack+8],%r199_gfortran_runtime_errorr203,%r111,%r111; add.u64 %r204,%r203,%r111; shl.b64 %r205,%r204,3; add.u64 %r206,%r182,%r205; ld.u64 %r31,[%r206+56]; ld.u64 %r33,[%r206+48]; .loc 1 71 9 ld.u64 %r114,[%r206+40]; .loc 1 73 3 setp.ne.u64 %r220,%r111,0; @ %r220 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r221,%r110,%r111; @ %r221 bra $L4; bra $L50; $L3: add.u64 %r63,%r182,40; add.u64 %r29,%frame,120; add.u64 %r124,%frame,240; add.u64 %r222,%r182,16; add.u64 %r224,%r25,%r25; add.u64 %r225,%r224,%r25; shl.b64 %r226,%r225,3; add.u64 
%r174,%r222,%r226; .loc 1 79 12 mov.u64 %r363,0; $L8: .loc 1 75 18 ld.u64 %r227,[%r63]; st.u64 [%r29],%r227; .loc 1 76 19 ld.u64 %r229,[%r63+16]; add.u64 %r228,%r229,1; ld.u64 %r230,[%r63+8]; sub.u64 %r38,%r228,%r230; .loc 1 78 10 setp.lt.s64 %r231,%r38,0; @ %r231 bra $L6; .loc 1 76 17 st.u64 [%r124],%r38; bra $L7; $L6: .loc 1 79 12 st.u64 [%r124],%r363; $L7: .loc 1 73 3 add.u64 %r63,%r63,24; add.u64 %r29,%r29,8; add.u64 %r124,%r124,8; setp.ne.u64 %r233,%r63,%r174; @ %r233 bra $L8; bra $L9; $L50: .loc 1 90 6 ld.u64 %r234,[%r181]; setp.eq.u64 %r235,%r234,0; @ ! %r235 bra $L11; bra $L10; $L4: add.u64 %r237,%r25,%r25; add.u64 %r238,%r237,%r25; shl.b64 %r239,%r238,3; add.u64 %r240,%r239,40; add.u64 %r132,%r182,%r240; shl.b64 %r241,%r25,3; add.u64 %r107,%r241,-8; add.u64 %r349,%frame,120; add.u64 %r126,%r349,%r107; add.u64 %r353,%frame,240; add.u64 %r105,%r353,%r107; cvt.u32.u32 %r245,%r22; cvt.s64.s8 %r244,%r245; add.u64 %r247,%r244,%r244; add.u64 %r248,%r247,%r244; shl.b64 %r249,%r248,3; add.u64 %r250,%r182,40; add.u64 %r81,%r249,%r250; .loc 1 87 12 mov.u64 %r362,0; $L14: .loc 1 83 18 ld.u64 %r251,[%r132]; st.u64 [%r126],%r251; .loc 1 84 19 ld.u64 %r253,[%r132+16]; add.u64 %r252,%r253,1; ld.u64 %r254,[%r132+8]; sub.u64 %r45,%r252,%r254; .loc 1 86 10 setp.lt.s64 %r255,%r45,0; @ %r255 bra $L12; .loc 1 84 17 st.u64 [%r105],%r45; bra $L13; $L12: .loc 1 87 12 st.u64 [%r105],%r362; $L13: .loc 1 81 3 add.u64 %r132,%r132,24; add.u64 %r126,%r126,8; add.u64 %r105,%r105,8; setp.ne.u64 %r257,%r81,%r132; @ %r257 bra $L14; bra $L51; $L10: .loc 1 94 7 setp.le.s64 %r258,%r110,0; @ %r258 bra $L16; add.u64 %r353,%frame,240; $L36: add.u64 %r153,%r181,40; mov.u64 %r151,%r353; cvt.u32.u32 %r260,%r22; cvt.s64.s8 %r259,%r260; add.u64 %r262,%r259,%r259; add.u64 %r263,%r262,%r259; shl.b64 %r264,%r263,3; add.u64 %r265,%r181,16; add.u64 %r138,%r264,%r265; .loc 1 97 10 mov.u64 %r102,1; .loc 1 101 4 mov.u64 %r288,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r266,0; st.u64 [%r181+8],%r266; .loc 1 106 28 cvt.u16.u32 %r269,%r22; add.u16 %r268,%r269,-1; cvt.u32.u16 %r270,%r268; st.u8 [%r181+28],%r270; .loc 1 108 20 add.u32 %r271,%r22,-2; cvt.s64.s32 %r56,%r271; add.u64 %r273,%r56,%r56; add.u64 %r274,%r273,%r56; shl.b64 %r275,%r274,3; add.u64 %r276,%r181,%r275; .loc 1 108 67 shl.b64 %r278,%r56,3; add.u64 %r279,%frame,%r278; .loc 1 108 59 ld.u64 %r281,[%r276+40]; ld.u64 %r282,[%r279+240]; mul.lo.u64 %r116,%r281,%r282; .loc 1 110 29 mov.u64 %r284284; call (%value_in),_gfortrani_xmallocarray285,[%value_in]; } .loc 1 110 27 st.u64 [%r181],%r285; .loc 1 111 10 setp.eq.u64 %r287,%r116,0; @ ! 
%r287 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r102,%r52,%r102; $L17: .loc 1 101 4 st.u64 [%r153+8],%r288; ld.u64 %r52,[%r151]; add.u64 %r289,%r52,-1; st.u64 [%r153+16],%r289; st.u64 [%r153],%r102; .loc 1 94 7 add.u64 %r153,%r153,24; add.u64 %r151,%r151,8; setp.ne.u64 %r290,%r138,%r153; @ %r290 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r181+48],%r116; mov.u64 %r292,-1; st.u64 [%r181+56],%r292; mov.u64 %r293,1; st.u64 [%r181+40],%r293; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r62,[%r181+28]; .loc 1 121 10 setp.eq.u64 %r294,%r62,%r110; @ %r294 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r110; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r297,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r298,[%r297+36]; setp.eq.u32 %r299,%r298,0; @ %r299 bra $L21; .loc 1 128 2 add.u64 %r353,%frame,240; cvta.const.u64 %r303,$LC2r353302303; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r305,%r110,0; @ %r305 bra $L24; $L27: .loc 1 68 9 add.u64 %r306,%r31,1; .loc 1 68 7 sub.u64 %r112,%r306,%r33; max.s64 %r46,%r112,0; .loc 1 140 8 ld.u64 %r121,[%r182]; .loc 1 141 8 ld.u64 %r122,[%r181]; .loc 1 180 31 shl.b64 %r69,%r114,2; ld.u64 %r48,[%frame+360]; .loc 1 194 22 ld.u64 %r70,[%frame+120]; .loc 1 194 12 shl.b64 %r72,%r70,2; .loc 1 195 22 ld.u64 %r73,[%frame]; .loc 1 195 12 shl.b64 %r75,%r73,3; setp.gt.s64 %r350,%r112,0; setp.le.s64 %r351,%r110,1; setp.eq.u32 %r355,%r184,0; .loc 1 159 10 mov.u64 %r356,0; cvt.u32.u32 %r357,%r22; cvt.s64.s8 %r358,%r357; add.u64 %r359,%r358,-1; add.u64 %r360,%frame,120; add.u64 %r361,%frame,240; bra $L25; $L24: add.u64 %r166,%frame,360; add.u64 %r165,%r181,40; mov.u64 %r163,%frame; cvt.u32.u32 %r308,%r22; cvt.s64.s8 %r307,%r308; add.u64 %r154,%r307,-1; .loc 1 132 3 mov.u64 %r123,0; add.u64 %r353,%frame,240; .loc 1 134 16 mov.u64 %r309,%r123; $L26: st.u64 [%r166],%r309; .loc 1 135 18 ld.u64 %r310,[%r165]; st.u64 [%r163],%r310; .loc 1 136 17 shl.b64 %r312,%r123,3; add.u64 %r313,%r353,%r312; .loc 1 136 10 ld.u64 %r314,[%r313]; setp.le.s64 %r315,%r314,0; @ %r315 bra $L1; .loc 1 132 26 add.u64 %r123,%r123,1; .loc 1 132 3 add.u64 %r166,%r166,8; add.u64 %r165,%r165,24; add.u64 %r163,%r163,8; setp.ne.u64 %r316,%r123,%r154; @ %r316 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r350 bra $L37; .loc 1 159 10 st.u64 [%r122],%r356; bra $L29; $L37: mov.u64 %r117,%r121; .loc 1 155 9 mov.u32 %r104,-2147483648; .loc 1 157 9 mov.u64 %r103,1; .loc 1 178 8 mov.u64 %r149,0; $L28: .loc 1 182 14 ld.u32 %r180,[%r117]; .loc 1 185 14 add.u64 %r149,%r149,1; .loc 1 182 7 @ %r355 bra $L30; .loc 1 182 6 setp.le.s32 %r320,%r104,%r180; selp.u32 %r104,%r180,%r104,%r320; selp.u64 %r103,%r149,%r103,%r320; bra $L31; $L30: setp.lt.s32 %r321,%r104,%r180; selp.u32 %r104,%r180,%r104,%r321; selp.u64 %r103,%r149,%r103,%r321; $L31: .loc 1 180 31 add.u64 %r117,%r117,%r69; .loc 1 180 6 setp.gt.s64 %r322,%r46,%r149; @ %r322 bra $L28; .loc 1 189 12 st.u64 [%r122],%r103; $L29: .loc 1 193 15 add.u64 %r48,%r48,1; .loc 1 194 12 add.u64 %r121,%r121,%r72; .loc 1 195 12 add.u64 %r122,%r122,%r75; .loc 1 197 32 ld.u64 %r145,[%frame+240]; .loc 1 197 13 setp.ne.u64 %r323,%r48,%r145; @ %r323 bra $L25; .loc 1 204 23 mul.lo.u64 %r324,%r70,%r48; .loc 1 204 9 shl.b64 %r325,%r324,2; sub.u64 %r118,%r121,%r325; .loc 1 205 23 mul.lo.u64 %r326,%r73,%r48; .loc 1 205 9 shl.b64 %r327,%r326,3; sub.u64 %r119,%r122,%r327; .loc 1 207 7 @ %r351 bra $L1; add.u64 %r150,%frame,368; mov.u64 %r109,8; .loc 1 206 5 mov.u64 %r120,1; bra $L34; $L35: .loc 1 201 13 st.u64 [%r150],%r356; .loc 1 
204 23 mul.lo.u64 %r333,%r89,%r88; .loc 1 204 9 shl.b64 %r334,%r333,2; sub.u64 %r118,%r121,%r334; .loc 1 205 23 mul.lo.u64 %r335,%r92,%r88; .loc 1 205 9 shl.b64 %r336,%r335,3; sub.u64 %r119,%r122,%r336; .loc 1 206 5 add.u64 %r120,%r120,1; .loc 1 207 7 add.u64 %r150,%r150,8; add.u64 %r109,%r109,8; setp.eq.u64 %r337,%r120,%r359; @ %r337 bra $L1; $L34: .loc 1 215 16 ld.u64 %r338,[%r150]; add.u64 %r88,%r338,1; st.u64 [%r150],%r88; .loc 1 216 23 add.u64 %r340,%r360,%r109; ld.u64 %r89,[%r340]; .loc 1 216 13 shl.b64 %r341,%r89,2; add.u64 %r121,%r118,%r341; .loc 1 217 23 add.u64 %r342,%frame,%r109; ld.u64 %r92,[%r342]; .loc 1 217 13 shl.b64 %r343,%r92,3; add.u64 %r122,%r119,%r343; .loc 1 197 32 add.u64 %r345,%r361,%r109; ld.u64 %r95,[%r345]; .loc 1 197 13 setp.eq.u64 %r346,%r88,%r95; @ %r346 bra $L35; .loc 1 201 13 mov.u64 %r48,0; bra $L25; $L51: .loc 1 90 6 ld.u64 %r347,[%r181]; setp.eq.u64 %r348,%r347,0; @ ! %r348 bra $L11; bra $L36; $L1:_gfortran_mmaxloc1_8_i4 .visible .func _gfortran_mmaxloc1_8_, .param .u32 %in_ar4) {608pred %r218; .reg .u32 %r223; .reg .u6464u64pred %r373; .reg .u64 %r374; .reg .u64 %r376; .reg .u32 %r377; .reg .pred %r378; .reg .u64 %r381; .reg .u64 %r382; .reg .u64 %r386; .reg .pred %r388; .reg .pred %r389; .reg .u64 %r390; .reg .u32 %r391; .reg .u64pred %r398; .reg .pred %r399; .reg .pred %r402; .reg .pred %r403; .reg .pred %r404; .reg .pred %r406; .reg .pred %r407; .reg .u16 %r408; .reg .u32 %r409; .reg .pred %r410; .reg .pred %r411; .reg .pred %r412; .reg .pred.reg .u64 %r416; .reg .u64predpredmov.u64 %r213,%ar0; mov.u64 %r214,%ar1; mov.u64 %r215,%ar2; mov.u64 %r216,%ar3; mov.u32 %r217,%ar4; .loc 1 251 6 setp.ne.u64 %r218,%r216,0; @ %r218 bra $L1r214215217; call _gfortran_maxloc1_8_7 bra $L52; $L53: .loc 1 261 10 ld.u64 %r22,[%r215]; .loc 1 261 7 add.u64 %r146,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r214+28]; .loc 1 262 38 add.u32 %r223,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r147,%r223; .loc 1 265 7 shr.u64 %r225,%r146,63; set.u32.gt.s64 %r227,%r146,%r147; neg.s32 %r228,%r227; cvt.u16.u64 %r231,%r225; cvt.u16.u32 %r232,%r228; or.b16 %r230,%r231,%r232; .loc 1 265 6 cvt.u32.u16 %r233,%r230; cvt.u16.u8 %r234,%r233; setp.eq.u16 %r235,%r234,0; @ %r235 bra $L55; .loc 1 267 7 cvt.u32.u32 %r238,%r23; cvt.s64.s8 %r237,%r238; st.u64 [%stack+8],%r237; st.u64 [%stack],%r22; cvta.const.u64 %r236236_gfortran_runtime_error5: .loc 1 272 9 add.u64 %r445,%r146,%r146; add.u64 %r444,%r445,%r146; shl.b64 %r243,%r444,3; add.u64 %r244,%r214,%r243; ld.u64 %r247,[%r244+56]; add.u64 %r246,%r247,1; .loc 1 272 7 ld.u64 %r254,[%r244+48]; sub.u64 %r148,%r246,%r254; .loc 1 273 6 setp.le.s64 %r255,%r148,0; @ %r255 bra $L52; .loc 1 276 9 ld.u64 %r166,[%r216]; .loc 1 278 15 ld.u64 %r34,[%r216+16]; .loc 1 280 22 cvt.u32.u64 %r35,%r34; .loc 1 280 53 add.u32 %r256,%r35,-4; and.b32 %r257,%r256,-5; set.u32.eq.u32 %r259,%r257,0; neg.s32 %r260,%r259; .loc 1 280 22 add.u32 %r261,%r35,-1; set.u32.le.u32 %r263,%r261,1; neg.s32 %r264,%r263; .loc 1 280 6 cvt.u16.u32 %r266,%r260; cvt.u16.u32 %r267,%r264; or.b16 %r265,%r266,%r267; cvt.u32.u16 %r268,%r265; cvt.u16.u8 %r269,%r268; setp.ne.u16 %r270,%r269,0; @ %r270 bra $L57; .loc 1 282 7 setp.ne.u32 %r272,%r35,16; @ %r272 bra $L58; $L57: .loc 1 289 9 shl.b64 %r276,%r444,3; add.u64 %r277,%r214,%r276; ld.u64 %r151,[%r277+40]; .loc 1 290 12 add.u64 %r283,%r216,%r276; ld.u64 %r39,[%r283+40]; .loc 1 292 3 setp.ne.u64 %r285,%r146,0; @ %r285 bra $L59; $L65: .loc 1 302 3 setp.lt.s64 %r286,%r146,%r147; @ %r286 bra $L60; bra $L119; $L58: .loc 1 287 5_gfortran_runtime_error9: add.u64 
%r205,%r214,40; add.u64 %r67,%r216,40; add.u64 %r289,%r214,16; add.u64 %r291,%r22,%r22; add.u64 %r292,%r291,%r22; shl.b64 %r293,%r292,3; add.u64 %r210,%r289,%r293; .loc 1 292 3 mov.u64 %r91,0; add.u64 %r442,%frame,240; add.u64 %r440,%frame,360; .loc 1 299 12 mov.u64 %r456,%r91; $L64: .loc 1 294 18 add.u64 %r295,%r442,%r91; ld.u64 %r296,[%r205]; st.u64 [%r295],%r296; .loc 1 295 18 add.u64 %r297,%frame,%r91; .loc 1 295 20 ld.u64 %r299,[%r67]; mul.lo.u64 %r298,%r299,%r34; .loc 1 295 18 st.u64 [%r297],%r298; .loc 1 296 19 ld.u64 %r301,[%r205+16]; add.u64 %r300,%r301,1; ld.u64 %r302,[%r205+8]; sub.u64 %r51,%r300,%r302; .loc 1 298 10 setp.lt.s64 %r303,%r51,0; @ %r303 bra $L62; .loc 1 296 17 add.u64 %r305,%r440,%r91; st.u64 [%r305],%r51; bra $L63; $L62: .loc 1 299 12 add.u64 %r307,%r440,%r91; st.u64 [%r307],%r456; $L63: .loc 1 292 3 add.u64 %r205,%r205,24; add.u64 %r67,%r67,24; add.u64 %r91,%r91,8; setp.ne.u64 %r309,%r205,%r210; @ %r309 bra $L64; bra $L65; $L119: .loc 1 312 6 ld.u64 %r310,[%r213]; setp.eq.u64 %r311,%r310,0; @ ! %r311 bra $L67; bra $L66; $L60: add.u64 %r313,%r22,%r22; add.u64 %r314,%r313,%r22; shl.b64 %r315,%r314,3; add.u64 %r128,%r315,40; add.u64 %r133,%r214,%r128; add.u64 %r124,%r216,%r128; shl.b64 %r316,%r22,3; add.u64 %r83,%r316,-8; cvt.u32.u32 %r318,%r23; cvt.s64.s8 %r317,%r318; shl.b64 %r319,%r317,3; add.u64 %r209,%r319,-8; add.u64 %r442,%frame,240; add.u64 %r440,%frame,360; .loc 1 309 12 mov.u64 %r455,0; $L70: .loc 1 304 18 add.u64 %r321,%r442,%r83; ld.u64 %r322,[%r133]; st.u64 [%r321],%r322; .loc 1 305 18 add.u64 %r323,%frame,%r83; .loc 1 305 20 ld.u64 %r325,[%r124]; mul.lo.u64 %r324,%r325,%r34; .loc 1 305 18 st.u64 [%r323],%r324; .loc 1 306 19 ld.u64 %r327,[%r133+16]; add.u64 %r326,%r327,1; ld.u64 %r328,[%r133+8]; sub.u64 %r64,%r326,%r328; .loc 1 308 10 setp.lt.s64 %r329,%r64,0; @ %r329 bra $L68; .loc 1 306 17 add.u64 %r331,%r440,%r83; st.u64 [%r331],%r64; bra $L69; $L68: .loc 1 309 12 add.u64 %r333,%r440,%r83; st.u64 [%r333],%r455; $L69: .loc 1 302 3 add.u64 %r133,%r133,24; add.u64 %r124,%r124,24; add.u64 %r83,%r83,8; setp.ne.u64 %r335,%r83,%r209; @ %r335 bra $L70; bra $L120; $L66: .loc 1 316 7 setp.eq.u64 %r336,%r147,0; @ %r336 bra $L72; add.u64 %r440,%frame,360; $L96: add.u64 %r161,%r213,40; mov.u64 %r143,%r440; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; add.u64 %r340,%r337,%r337; add.u64 %r341,%r340,%r337; shl.b64 %r342,%r341,3; add.u64 %r343,%r213,16; add.u64 %r135,%r342,%r343; .loc 1 319 10 mov.u64 %r131,1; .loc 1 323 4 mov.u64 %r362,0; bra $L73; $L72: .loc 1 327 20 add.u32 %r344,%r23,-2; cvt.s64.s32 %r73,%r344; add.u64 %r346,%r73,%r73; add.u64 %r347,%r346,%r73; shl.b64 %r348,%r347,3; add.u64 %r349,%r213,%r348; .loc 1 327 67 shl.b64 %r351,%r73,3; add.u64 %r352,%frame,%r351; .loc 1 327 59 ld.u64 %r354,[%r349+40]; ld.u64 %r355,[%r352+360]; mul.lo.u64 %r155,%r354,%r355; .loc 1 329 24 mov.u64 %r356,0; st.u64 [%r213+8],%r356; .loc 1 330 28 cvt.u16.u32 %r359,%r23; add.u16 %r358,%r359,-1; cvt.u32.u16 %r360,%r358; st.u8 [%r213+28],%r360; .loc 1 332 10 setp.eq.u64 %r361,%r155,0; @ %r361 bra $L74; bra $L121; $L76: .loc 1 321 47 mul.lo.u64 %r131,%r70,%r131; $L73: .loc 1 323 4 st.u64 [%r161+8],%r362; ld.u64 %r70,[%r143]; add.u64 %r363,%r70,-1; st.u64 [%r161+16],%r363; st.u64 [%r161],%r131; .loc 1 316 7 add.u64 %r161,%r161,24; add.u64 %r143,%r143,8; setp.ne.u64 %r364,%r135,%r161; @ %r364 bra $L76; bra $L72; $L74: .loc 1 335 4 st.u64 [%r213+48],%r155; mov.u64 %r366,-1; st.u64 [%r213+56],%r366; mov.u64 %r367,1; st.u64 [%r213+40],%r367; .loc 1 336 4 bra $L52; $L121: .loc 1 339 
24 mov.u64 %r369369; call (%value_in),_gfortrani_xmallocarray70,[%value_in]; } .loc 1 339 22 st.u64 [%r213],%r370; bra $L77; $L67: .loc 1 344 19 ld.s8 %r372,[%r213+28]; .loc 1 344 10 setp.eq.u64 %r373,%r372,%r147; @ %r373 bra $L78; .loc 1 345 2 cvta.const.u64 %r374374_gfortran_runtime_error78: .loc 1 347 11 cvta.global.u64 %r376,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r377,[%r376+36]; setp.eq.u32 %r378,%r377,0; @ %r378 bra $L77; .loc 1 349 4 add.u64 %r440,%frame,360; cvta.const.u64 %r382,$LC2r43382; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r386,$LC386382; call _gfortrani_bounds_equal_extents$L77: .loc 1 356 3 setp.ne.u64 %r388,%r147,0; @ %r388 bra $L79; $L82: .loc 1 364 8 ld.u64 %r167,[%r213]; .loc 1 365 8 ld.u64 %r165,[%r214]; .loc 1 367 9 setp.ne.u64 %r389,%r165,0; @ %r389 bra $L80; bra $L52; $L79: add.u64 %r194,%frame,480; add.u64 %r193,%r213,40; add.u64 %r188,%frame,120; cvt.u32.u32 %r391,%r23; cvt.s64.s8 %r390,%r391; add.u64 %r174,%r390,-1; .loc 1 356 3 mov.u64 %r168,0; add.u64 %r440,%frame,360; .loc 1 358 16 mov.u64 %r392,%r168; $L81: st.u64 [%r194],%r392; .loc 1 359 18 ld.u64 %r393,[%r193]; st.u64 [%r188],%r393; .loc 1 360 17 shl.b64 %r395,%r168,3; add.u64 %r396,%r440,%r395; .loc 1 360 10 ld.u64 %r397,[%r396]; setp.le.s64 %r398,%r397,0; @ %r398 bra $L52; .loc 1 356 26 add.u64 %r168,%r168,1; .loc 1 356 3 add.u64 %r194,%r194,8; add.u64 %r193,%r193,24; add.u64 %r188,%r188,8; setp.ne.u64 %r399,%r168,%r174; @ %r399 bra $L81; bra $L82; $L80: .loc 1 290 12 mul.lo.u64 %r42,%r39,%r34; .loc 1 386 32 shl.b64 %r87,%r151,2; ld.u64 %r68,[%frame+480]; .loc 1 430 22 ld.u64 %r97,[%frame+240]; .loc 1 430 12 shl.b64 %r99,%r97,2; .loc 1 431 23 ld.u64 %r100,[%frame]; .loc 1 432 22 ld.u64 %r102,[%frame+120]; .loc 1 432 12 shl.b64 %r104,%r102,3; setp.le.s64 %r446,%r147,1; .loc 1 408 9 setp.ne.u32 %r448,%r217,0; cvt.u32.u32 %r449,%r23; cvt.s64.s8 %r450,%r449; add.u64 %r451,%r450,-1; add.u64 %r452,%frame,240; add.u64 %r453,%frame,360; add.u64 %r454,%frame,120; $L97: .loc 1 356 3 mov.u64 %r160,%r166; mov.u64 %r159,%r165; .loc 1 386 9 mov.u64 %r137,0; $L86: .loc 1 389 7 ld.s8 %r208,[%r160]; mov.u64 %r189,%r137; .loc 1 386 24 add.u64 %r137,%r137,1; .loc 1 389 6 setp.eq.u32 %r441,%r208,0; @ %r441 bra $L83; .loc 1 397 11 ld.u32 %r141,[%r159]; .loc 1 408 9 @ %r448 bra $L84; bra $L122; $L83: .loc 1 386 32 add.u64 %r159,%r159,%r87; .loc 1 386 47 add.u64 %r160,%r160,%r42; .loc 1 386 2 setp.ne.u64 %r402,%r148,%r137; @ %r402 bra $L86; .loc 1 385 9 mov.u64 %r137,0; bra $L87; $L122: .loc 1 418 8 setp.gt.s64 %r403,%r148,%r189; @ %r403 bra $L88; bra $L87; $L84: .loc 1 409 8 setp.le.s64 %r404,%r148,%r189; @ %r404 bra $L87; $L90: .loc 1 414 16 add.u64 %r189,%r189,1; .loc 1 411 8 @ %r441 bra $L89; .loc 1 411 18 ld.u32 %r88,[%r159]; .loc 1 411 15 setp.ge.s32 %r406,%r88,%r141; selp.u32 %r141,%r88,%r141,%r406; selp.u64 %r137,%r189,%r137,%r406; $L89: .loc 1 409 33 add.u64 %r159,%r159,%r87; .loc 1 409 48 add.u64 %r160,%r160,%r42; .loc 1 409 8 setp.eq.u64 %r407,%r148,%r189; @ %r407 bra $L87; .loc 1 411 9 ld.s8 %r208,[%r160]; setp.eq.u32 %r441,%r208,0; bra $L90; $L88: .loc 1 423 16 add.u64 %r189,%r189,1; .loc 1 420 8 ld.s8 %r409,[%r160]; cvt.u16.u32 %r408,%r409; setp.eq.u16 %r410,%r408,0; @ %r410 bra $L91; .loc 1 420 18 ld.u32 %r93,[%r159]; .loc 1 420 15 setp.gt.s32 %r411,%r93,%r141; selp.u32 %r141,%r93,%r141,%r411; selp.u64 %r137,%r189,%r137,%r411; $L91: .loc 1 418 33 add.u64 %r159,%r159,%r87; .loc 1 418 48 add.u64 %r160,%r160,%r42; .loc 1 418 8 setp.ne.u64 %r412,%r148,%r189; @ %r412 bra $L88; $L87: .loc 
1 426 8 st.u64 [%r167],%r137; .loc 1 429 15 add.u64 %r68,%r68,1; .loc 1 430 12 add.u64 %r165,%r165,%r99; .loc 1 431 13 add.u64 %r166,%r166,%r100; .loc 1 432 12 add.u64 %r167,%r167,%r104; .loc 1 434 32 ld.u64 %r29,[%frame+360]; .loc 1 434 13 setp.ne.u64 %r413,%r29,%r68; @ %r413 bra $L97; .loc 1 441 23 mul.lo.u64 %r106,%r68,%r97; .loc 1 442 24 mul.lo.u64 %r414,%r68,%r100; .loc 1 442 10 sub.u64 %r162,%r166,%r414; .loc 1 443 23 mul.lo.u64 %r415,%r68,%r102; .loc 1 443 9 shl.b64 %r416,%r415,3; sub.u64 %r163,%r167,%r416; .loc 1 445 7 @ %r446 bra $L52; add.u64 %r204,%frame,488; mov.u64 %r203,8; .loc 1 444 5 mov.u64 %r164,1; .loc 1 438 13 mov.u64 %r447,0; bra $L94; $L95: st.u64 [%r204],%r447; .loc 1 441 23 mul.lo.u64 %r106,%r117,%r116; .loc 1 442 24 mul.lo.u64 %r422,%r119,%r116; .loc 1 442 10 sub.u64 %r162,%r166,%r422; .loc 1 443 23 mul.lo.u64 %r423,%r121,%r116; .loc 1 443 9 shl.b64 %r424,%r423,3; sub.u64 %r163,%r167,%r424; .loc 1 444 5 add.u64 %r164,%r164,1; .loc 1 445 7 add.u64 %r204,%r204,8; add.u64 %r203,%r203,8; setp.eq.u64 %r425,%r164,%r451; @ %r425 bra $L52; $L94: .loc 1 453 16 ld.u64 %r426,[%r204]; add.u64 %r116,%r426,1; st.u64 [%r204],%r116; .loc 1 454 23 add.u64 %r428,%r452,%r203; ld.u64 %r117,[%r428]; .loc 1 454 13 sub.u64 %r429,%r117,%r106; shl.b64 %r430,%r429,2; add.u64 %r165,%r165,%r430; .loc 1 455 24 add.u64 %r431,%frame,%r203; ld.u64 %r119,[%r431]; .loc 1 455 14 add.u64 %r166,%r162,%r119; .loc 1 456 23 add.u64 %r433,%r454,%r203; ld.u64 %r121,[%r433]; .loc 1 456 13 shl.b64 %r434,%r121,3; add.u64 %r167,%r163,%r434; .loc 1 434 32 add.u64 %r436,%r453,%r203; ld.u64 %r125,[%r436]; .loc 1 434 13 setp.eq.u64 %r437,%r116,%r125; @ %r437 bra $L95; .loc 1 438 13 mov.u64 %r68,0; bra $L97; $L120: .loc 1 312 6 ld.u64 %r438,[%r213]; setp.eq.u64 %r439,%r438,0; @ ! 
%r439 bra $L67; bra $L96; $L52_gfortran_smaxloc1_8_i4 .visible .func _gfortran_smaxloc1_8_, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L124; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L125; $L124:r152154; call _gfortran_maxloc1_8_490 7 bra $L123; $L125: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r296,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L127; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L128; $L134: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L129; bra $L182; $L127: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error128: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r302,0; $L133: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L131; .loc 1 505 17 st.u64 [%r139],%r36; bra $L132; $L131: .loc 1 508 12 st.u64 [%r139],%r302; $L132: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L133; bra $L134; $L182: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L136; bra $L135; $L129: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r301,0; $L139: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L137; .loc 1 513 17 st.u64 [%r43],%r41; bra $L138; $L137: .loc 1 517 12 st.u64 [%r43],%r301; $L138: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L139; bra $L183; $L135: .loc 1 524 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L141; add.u64 %r295,%frame,120; $L161: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L142; $L141: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L143; bra $L184; $L145: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L142: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L145; bra $L141; $L143: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L123; $L184: .loc 1 547 24 mov.u64 %r24696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L149: .loc 1 573 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L146; bra $L147; $L136: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L148; .loc 1 552 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error148: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L149; .loc 1 559 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L147; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L151: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L150; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error150: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L151; $L146: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ 
%r268 bra $L152; shl.b64 %r266,%r296,3; bra $L153; $L152: mov.u64 %r266,8; $L153: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L154: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L154; $L147: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,3; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 583 13 mov.u64 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L160: st.u64 [%r98],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L160; .loc 1 596 7 @ ! %r294 bra $L185; bra $L123; $L159: .loc 1 591 13 st.u64 [%r99],%r280; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L158; bra $L123; $L185: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; $L158: .loc 1 600 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,3; add.u64 %r98,%r98,%r291; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L159; .loc 1 591 13 mov.u64 %r80,0; bra $L160; $L183: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r293,%r98,0; @ ! %r293 bra $L136; bra $L161; $L123: .loc 1 605 1 ret; } maxloc1_16_i4.o/_gfortran_maxloc1_16_i4 .visible .func _gfortran_maxloc1_16_fortran/generated/maxloc1_16_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_16_i4 .visible .func _gfortran_mmaxloc1_16__gfortran_smaxloc1_16_i4 .visible .func _gfortran_smaxloc1_16_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,76,79,67,0 
},105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_maxloc1_16_i4 .visible .func _gfortran_maxloc1_16_i448007predpredu64 %r231; .reg .u64 %r232; .reg .pred %r233; .reg .pred %r235; .reg .u64 %r236; .reg .pred %r237; .reg .u64u64predu64 %r264; .reg .u64 %r265; .reg .u64 %r266; .reg .u64 %r267; .reg .u64 %r268; .reg .u16 %r270; .reg .u16 %r271; .reg .u32 %r272; .reg .u32 %r273; .reg .u64 %r275; .reg .u64u64pred64pred64predpredpred %r383; .reg .u32u64 %r389; .reg .u64 %r390; mov.u64 %r183,%ar0; mov.u64 %r184,%ar1; mov.u64 %r185,%ar2; mov.u32 %r186,%ar3; .loc 1 58 10 ld.s8 %r22,[%r184+28]; .loc 1 58 38 add.u32 %r187,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r110,%r187; .loc 1 59 10 ld.u64 %r25,[%r185]; .loc 1 59 7 add.u64 %r111,%r25,-1; .loc 1 61 7 shr.u64 %r189,%r111,63; set.u32.lt.s64 %r191,%r110,%r111; neg.s32 %r192,%r191; cvt.u16.u64 %r195,%r189; cvt.u16.u32 %r196,%r192; or.b16 %r194,%r195,%r196; .loc 1 61 6 cvt.u32.u16 %r197,%r194; cvt.u16.u8 %r198,%r197; setp.eq.u16 %r199,%r198,0; @ %r199 bra $L2; .loc 1 63 7 cvt.u32.u32 %r202,%r22; cvt.s64.s8 %r201,%r202; st.u64 [%stack+8],%r2000200_gfortran_runtime_errorr205,%r111,%r111; add.u64 %r206,%r205,%r111; shl.b64 %r207,%r206,3; add.u64 %r208,%r184,%r207; ld.u64 %r31,[%r208+56]; ld.u64 %r33,[%r208+48]; .loc 1 71 9 ld.u64 %r114,[%r208+40]; .loc 1 73 3 setp.ne.u64 %r222,%r111,0; @ %r222 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r223,%r110,%r111; @ %r223 bra $L4; bra $L49; $L3: add.u64 %r109,%r184,40; add.u64 %r128,%frame,120; add.u64 %r181,%frame,240; add.u64 %r224,%r184,16; add.u64 %r226,%r25,%r25; add.u64 %r227,%r226,%r25; shl.b64 %r228,%r227,3; add.u64 %r137,%r224,%r228; .loc 1 79 12 mov.u64 %r390,0; $L8: .loc 1 75 18 ld.u64 %r229,[%r109]; st.u64 [%r128],%r229; .loc 1 76 19 ld.u64 %r231,[%r109+16]; add.u64 %r230,%r231,1; ld.u64 %r232,[%r109+8]; sub.u64 %r38,%r230,%r232; .loc 1 78 10 setp.lt.s64 %r233,%r38,0; @ %r233 bra $L6; .loc 1 76 17 st.u64 [%r181],%r38; bra $L7; $L6: .loc 1 79 12 st.u64 [%r181],%r390; $L7: .loc 1 73 3 add.u64 %r109,%r109,24; add.u64 %r128,%r128,8; add.u64 %r181,%r181,8; setp.ne.u64 %r235,%r109,%r137; @ %r235 bra $L8; bra $L9; $L49: .loc 1 90 6 ld.u64 %r236,[%r183]; setp.eq.u64 %r237,%r236,0; @ ! 
%r237 bra $L11; bra $L10; $L4: add.u64 %r239,%r25,%r25; add.u64 %r240,%r239,%r25; shl.b64 %r241,%r240,3; add.u64 %r242,%r241,40; add.u64 %r134,%r184,%r242; shl.b64 %r243,%r25,3; add.u64 %r107,%r243,-8; add.u64 %r378,%frame,120; add.u64 %r120,%r378,%r107; add.u64 %r377,%frame,240; add.u64 %r104,%r377,%r107; add.u64 %r246,%r184,40; cvt.u32.u32 %r248,%r22; cvt.s64.s8 %r247,%r248; add.u64 %r250,%r247,%r247; add.u64 %r251,%r250,%r247; shl.b64 %r252,%r251,3; add.u64 %r63,%r246,%r252; .loc 1 87 12 mov.u64 %r389,0; $L14: .loc 1 83 18 ld.u64 %r253,[%r134]; st.u64 [%r120],%r253; .loc 1 84 19 ld.u64 %r255,[%r134+16]; add.u64 %r254,%r255,1; ld.u64 %r256,[%r134+8]; sub.u64 %r45,%r254,%r256; .loc 1 86 10 setp.lt.s64 %r257,%r45,0; @ %r257 bra $L12; .loc 1 84 17 st.u64 [%r104],%r45; bra $L13; $L12: .loc 1 87 12 st.u64 [%r104],%r389; $L13: .loc 1 81 3 add.u64 %r134,%r134,24; add.u64 %r120,%r120,8; add.u64 %r104,%r104,8; setp.ne.u64 %r259,%r63,%r134; @ %r259 bra $L14; bra $L50; $L10: .loc 1 94 7 setp.le.s64 %r260,%r110,0; @ %r260 bra $L16; add.u64 %r377,%frame,240; $L37: add.u64 %r154,%r183,40; mov.u64 %r152,%r377; cvt.u32.u32 %r262,%r22; cvt.s64.s8 %r261,%r262; add.u64 %r264,%r261,%r261; add.u64 %r265,%r264,%r261; shl.b64 %r266,%r265,3; add.u64 %r267,%r183,16; add.u64 %r138,%r266,%r267; .loc 1 97 10 mov.u64 %r101,1; .loc 1 101 4 mov.u64 %r290,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r268,0; st.u64 [%r183+8],%r268; .loc 1 106 28 cvt.u16.u32 %r271,%r22; add.u16 %r270,%r271,-1; cvt.u32.u16 %r272,%r270; st.u8 [%r183+28],%r272; .loc 1 108 20 add.u32 %r273,%r22,-2; cvt.s64.s32 %r56,%r273; add.u64 %r275,%r56,%r56; add.u64 %r276,%r275,%r56; shl.b64 %r277,%r276,3; add.u64 %r278,%r183,%r277; .loc 1 108 67 shl.b64 %r280,%r56,3; add.u64 %r281,%frame,%r280; .loc 1 108 59 ld.u64 %r283,[%r278+40]; ld.u64 %r284,[%r281+240]; mul.lo.u64 %r116,%r283,%r284; .loc 1 110 29 mov.u64 %r286,16286; call (%value_in),_gfortrani_xmallocarray287,[%value_in]; } .loc 1 110 27 st.u64 [%r183],%r287; .loc 1 111 10 setp.eq.u64 %r289,%r116,0; @ ! 
%r289 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r101,%r52,%r101; $L17: .loc 1 101 4 st.u64 [%r154+8],%r290; ld.u64 %r52,[%r152]; add.u64 %r291,%r52,-1; st.u64 [%r154+16],%r291; st.u64 [%r154],%r101; .loc 1 94 7 add.u64 %r154,%r154,24; add.u64 %r152,%r152,8; setp.ne.u64 %r292,%r138,%r154; @ %r292 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r183+48],%r116; mov.u64 %r294,-1; st.u64 [%r183+56],%r294; mov.u64 %r295,1; st.u64 [%r183+40],%r295; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r62,[%r183+28]; .loc 1 121 10 setp.eq.u64 %r296,%r62,%r110; @ %r296 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r110; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r299,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r300,[%r299+36]; setp.eq.u32 %r301,%r300,0; @ %r301 bra $L21; .loc 1 128 2 add.u64 %r377,%frame,240; cvta.const.u64 %r305,$LC2call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r307,%r110,0; @ %r307 bra $L24; $L27: .loc 1 68 9 add.u64 %r308,%r31,1; .loc 1 68 7 sub.u64 %r112,%r308,%r33; max.s64 %r46,%r112,0; .loc 1 140 8 ld.u64 %r124,[%r184]; .loc 1 141 8 ld.u64 %r125,[%r183]; .loc 1 180 31 shl.b64 %r70,%r114,2; ld.u64 %r77,[%frame+360]; .loc 1 194 22 ld.u64 %r72,[%frame+120]; .loc 1 194 12 shl.b64 %r73,%r72,2; .loc 1 195 22 ld.u64 %r75,[%frame]; .loc 1 195 12 shl.b64 %r76,%r75,4; .loc 1 197 32 ld.u64 %r105,[%frame+240]; .loc 1 204 23 mul.lo.u64 %r309,%r72,%r105; .loc 1 204 9 shl.b64 %r310,%r309,2; neg.s64 %r135,%r310; .loc 1 205 23 mul.lo.u64 %r311,%r75,%r105; .loc 1 205 9 shl.b64 %r312,%r311,4; neg.s64 %r139,%r312; setp.gt.s64 %r379,%r112,0; setp.le.s64 %r380,%r110,1; mov.u64 %r382,1; setp.eq.u32 %r383,%r186,0; cvt.u32.u32 %r384,%r22; cvt.s64.s8 %r385,%r384; add.u64 %r386,%r385,-1; add.u64 %r387,%frame,120; add.u64 %r388,%frame,240; bra $L25; $L24: add.u64 %r167,%frame,360; add.u64 %r166,%r183,40; mov.u64 %r164,%frame; cvt.u32.u32 %r314,%r22; cvt.s64.s8 %r313,%r314; add.u64 %r155,%r313,-1; .loc 1 132 3 mov.u64 %r126,0; add.u64 %r377,%frame,240; .loc 1 134 16 mov.u64 %r315,%r126; $L26: st.u64 [%r167],%r315; .loc 1 135 18 ld.u64 %r316,[%r166]; st.u64 [%r164],%r316; .loc 1 136 17 shl.b64 %r318,%r126,3; add.u64 %r319,%r377,%r318; .loc 1 136 10 ld.u64 %r320,[%r319]; setp.le.s64 %r321,%r320,0; @ %r321 bra $L1; .loc 1 132 26 add.u64 %r126,%r126,1; .loc 1 132 3 add.u64 %r167,%r167,8; add.u64 %r166,%r166,24; add.u64 %r164,%r164,8; setp.ne.u64 %r322,%r126,%r155; @ %r322 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r379 bra $L38; .loc 1 159 10 mov.u64 %r324,0; st.u64 [%r125],%r324; st.u64 [%r125+8],%r324; bra $L29; $L38: mov.u64 %r119,%r124; mov.u64 %r370,%r382; mov.u64 %r371,0; .loc 1 155 9 mov.u32 %r103,-2147483648; .loc 1 157 9 mov.u64 %r368,1; mov.u64 %r369,%r371; $L28: .loc 1 182 14 ld.u32 %r182,[%r119]; .loc 1 182 7 @ %r383 bra $L30; .loc 1 182 6 setp.le.s32 %r327,%r103,%r182; @ %r327 bra $L31; bra $L32; $L30: setp.ge.s32 %r328,%r103,%r182; @ %r328 bra $L32; $L31: .loc 1 185 14 mov.u64 %r368,%r370; mov.u64 %r369,%r371; .loc 1 182 14 mov.u32 %r103,%r182; $L32: .loc 1 180 31 add.u64 %r119,%r119,%r70; .loc 1 180 6 add.u64 %r331,%r370,%r382; set.u32.lt.u64 %r336,%r331,%r370; cvt.s64.s32 %r334,%r336; mov.u64 %r370,%r331; sub.u64 %r371,%r371,%r334; add.u64 %r341,%r370,-1; setp.gt.s64 %r343,%r46,%r341; @ %r343 bra $L28; .loc 1 189 12 st.u64 [%r125],%r368; st.u64 [%r125+8],%r369; $L29: .loc 1 193 15 add.u64 %r77,%r77,1; .loc 1 194 12 add.u64 %r124,%r124,%r73; .loc 1 195 12 add.u64 %r125,%r125,%r76; .loc 1 197 13 setp.ne.u64 %r346,%r77,%r105; @ 
%r346 bra $L25; .loc 1 204 9 add.u64 %r121,%r124,%r135; .loc 1 205 9 add.u64 %r122,%r125,%r139; .loc 1 207 7 @ %r380 bra $L1; add.u64 %r130,%frame,368; mov.u64 %r149,8; .loc 1 206 5 mov.u64 %r123,1; .loc 1 201 13 mov.u64 %r381,0; bra $L35; $L36: st.u64 [%r130],%r381; .loc 1 204 23 mul.lo.u64 %r352,%r89,%r88; .loc 1 204 9 shl.b64 %r353,%r352,2; sub.u64 %r121,%r124,%r353; .loc 1 205 23 mul.lo.u64 %r354,%r92,%r88; .loc 1 205 9 shl.b64 %r355,%r354,4; sub.u64 %r122,%r125,%r355; .loc 1 206 5 add.u64 %r123,%r123,1; .loc 1 207 7 add.u64 %r130,%r130,8; add.u64 %r149,%r149,8; setp.eq.u64 %r356,%r123,%r386; @ %r356 bra $L1; $L35: .loc 1 215 16 ld.u64 %r357,[%r130]; add.u64 %r88,%r357,1; st.u64 [%r130],%r88; .loc 1 216 23 add.u64 %r359,%r387,%r149; ld.u64 %r89,[%r359]; .loc 1 216 13 shl.b64 %r360,%r89,2; add.u64 %r124,%r121,%r360; .loc 1 217 23 add.u64 %r361,%frame,%r149; ld.u64 %r92,[%r361]; .loc 1 217 13 shl.b64 %r362,%r92,4; add.u64 %r125,%r122,%r362; .loc 1 197 32 add.u64 %r364,%r388,%r149; ld.u64 %r95,[%r364]; .loc 1 197 13 setp.eq.u64 %r365,%r88,%r95; @ %r365 bra $L36; .loc 1 201 13 mov.u64 %r77,0; bra $L25; $L50: .loc 1 90 6 ld.u64 %r366,[%r183]; setp.eq.u64 %r367,%r366,0; @ ! %r367 bra $L11; bra $L37; $L1:_gfortran_mmaxloc1_16_i4 .visible .func _gfortran_mmaxloc1_16_, .param .u32 %in_ar4) {60897u64u64 %r267; .reg .u64 %r274; .reg .pred %r275; .reg .u32 %r276; .reg .u32 %r277; .reg .u32u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .pred %r329; .reg .u64 %r330; .reg .pred %r331; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32 %r338; .reg .u64 %r339; .reg .u64 %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64pred %r355; .reg .pred64predpred %r408; .reg .pred %r409; .reg .u64 %r410; .reg .u32.reg .u64 %r417; .reg .pred %r418; .reg .predpred %r428; .reg .u64 %r432; .reg .u64pred %r460; .reg .u64 %r463; .reg .u64 %r466; .reg .u32 %r468; .reg .pred %r475; .reg .pred.reg .u32 %r504; .reg .u64 %r508; .reg .u16 %r509; .reg .u32 %r510; .reg .pred %r511; .reg .pred %r512; .reg .u64 %r515; .reg .u64 %r518; .reg .u32 %r520; .reg .pred %r527; .reg .pred %r530; .reg .pred %r533; .reg .u64pred %r542; .reg .u64 %r543; .reg .u64pred.reg .u64 %r610; .reg .u64 %r611; mov.u64 %r233,%ar0; mov.u64 %r234,%ar1; mov.u64 %r235,%ar2; mov.u64 %r236,%ar3; mov.u32 %r237,%ar4; .loc 1 251 6 setp.ne.u64 %r238,%r236,0; @ %r238 bra $L3r2235237; call _gfortran_maxloc1_16_7 bra $L51; $L52: .loc 1 261 10 ld.u64 %r22,[%r235]; .loc 1 261 7 add.u64 %r153,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r234+28]; .loc 1 262 38 add.u32 %r243,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r154,%r243; .loc 1 265 7 shr.u64 %r245,%r153,63; set.u32.gt.s64 %r247,%r153,%r154; neg.s32 %r248,%r247; cvt.u16.u64 %r251,%r245; cvt.u16.u32 %r252,%r248; or.b16 %r250,%r251,%r252; .loc 1 265 6 cvt.u32.u16 %r253,%r250; cvt.u16.u8 %r254,%r253; setp.eq.u16 %r255,%r254,0; @ %r255 bra $L54; .loc 1 267 7 cvt.u32.u32 %r258,%r23; cvt.s64.s8 %r257,%r258; st.u64 [%stack+8],%r257; st.u64 [%stack],%r22; cvta.const.u64 %r22stack; call _gfortran_runtime_error4: .loc 1 272 9 add.u64 %r596,%r153,%r153; add.u64 %r601,%r596,%r153; shl.b64 %r263,%r601,3; add.u64 %r264,%r234,%r263; ld.u64 %r267,[%r264+56]; add.u64 %r266,%r267,1; .loc 1 272 7 ld.u64 %r274,[%r264+48]; sub.u64 %r155,%r266,%r274; .loc 1 273 6 setp.le.s64 %r275,%r155,0; @ %r275 bra $L51; .loc 1 276 9 ld.u64 %r173,[%r236]; .loc 1 278 15 ld.u64 %r35,[%r236+16]; .loc 1 280 22 cvt.u32.u64 %r36,%r35; .loc 1 280 53 add.u32 %r276,%r36,-4; and.b32 
%r277,%r276,-5; set.u32.eq.u32 %r279,%r277,0; neg.s32 %r280,%r279; .loc 1 280 22 add.u32 %r281,%r36,-1; set.u32.le.u32 %r283,%r281,1; neg.s32 %r284,%r283; .loc 1 280 6 cvt.u16.u32 %r286,%r280; cvt.u16.u32 %r287,%r284; or.b16 %r285,%r286,%r287;56; .loc 1 282 7 setp.ne.u32 %r292,%r36,16; @ %r292 bra $L57; $L56: .loc 1 289 9 shl.b64 %r296,%r601,3; add.u64 %r297,%r234,%r296; ld.u64 %r158,[%r297+40]; .loc 1 290 12 add.u64 %r303,%r236,%r296; ld.u64 %r41,[%r303+40]; .loc 1 292 3 setp.ne.u64 %r305,%r153,0; @ %r305 bra $L58; $L64: .loc 1 302 3 setp.lt.s64 %r306,%r153,%r154; @ %r306 bra $L59; bra $L118; $L57: .loc 1 287 5 cvta.const.u64 %r307307_gfortran_runtime_error8: add.u64 %r164,%r234,40; add.u64 %r150,%r236,40; add.u64 %r309,%r234,16; add.u64 %r311,%r22,%r22; add.u64 %r312,%r311,%r22; shl.b64 %r313,%r312,3; add.u64 %r223,%r309,%r313; .loc 1 292 3 mov.u64 %r53,0; add.u64 %r597,%frame,240; add.u64 %r600,%frame,360; .loc 1 299 12 mov.u64 %r611,%r53; $L63: .loc 1 294 18 add.u64 %r315,%r597,%r53; ld.u64 %r316,[%r164]; st.u64 [%r315],%r316; .loc 1 295 18 add.u64 %r317,%frame,%r53; .loc 1 295 20 ld.u64 %r319,[%r150]; mul.lo.u64 %r318,%r319,%r35; .loc 1 295 18 st.u64 [%r317],%r318; .loc 1 296 19 ld.u64 %r321,[%r164+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r164+8]; sub.u64 %r52,%r320,%r322; .loc 1 298 10 setp.lt.s64 %r323,%r52,0; @ %r323 bra $L61; .loc 1 296 17 add.u64 %r325,%r600,%r53; st.u64 [%r325],%r52; bra $L62; $L61: .loc 1 299 12 add.u64 %r327,%r600,%r53; st.u64 [%r327],%r611; $L62: .loc 1 292 3 add.u64 %r164,%r164,24; add.u64 %r150,%r150,24; add.u64 %r53,%r53,8; setp.ne.u64 %r329,%r164,%r223; @ %r329 bra $L63; bra $L64; $L118: .loc 1 312 6 ld.u64 %r330,[%r233]; setp.eq.u64 %r331,%r330,0; @ ! %r331 bra $L66; bra $L65; $L59: add.u64 %r333,%r22,%r22; add.u64 %r334,%r333,%r22; shl.b64 %r335,%r334,3; add.u64 %r178,%r335,40; add.u64 %r59,%r234,%r178; add.u64 %r180,%r236,%r178; shl.b64 %r336,%r22,3; add.u64 %r91,%r336,-8; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; shl.b64 %r339,%r337,3; add.u64 %r97,%r339,-8; add.u64 %r597,%frame,240; add.u64 %r600,%frame,360; .loc 1 309 12 mov.u64 %r610,0; $L69: .loc 1 304 18 add.u64 %r341,%r597,%r91; ld.u64 %r342,[%r59]; st.u64 [%r341],%r342; .loc 1 305 18 add.u64 %r343,%frame,%r91; .loc 1 305 20 ld.u64 %r345,[%r180]; mul.lo.u64 %r344,%r345,%r35; .loc 1 305 18 st.u64 [%r343],%r344; .loc 1 306 19 ld.u64 %r347,[%r59+16]; add.u64 %r346,%r347,1; ld.u64 %r348,[%r59+8]; sub.u64 %r66,%r346,%r348; .loc 1 308 10 setp.lt.s64 %r349,%r66,0; @ %r349 bra $L67; .loc 1 306 17 add.u64 %r351,%r600,%r91; st.u64 [%r351],%r66; bra $L68; $L67: .loc 1 309 12 add.u64 %r353,%r600,%r91; st.u64 [%r353],%r610; $L68: .loc 1 302 3 add.u64 %r59,%r59,24; add.u64 %r180,%r180,24; add.u64 %r91,%r91,8; setp.ne.u64 %r355,%r91,%r97; @ %r355 bra $L69; bra $L119; $L65: .loc 1 316 7 setp.eq.u64 %r356,%r154,0; @ %r356 bra $L71; add.u64 %r600,%frame,360; $L97: add.u64 %r137,%r233,40; mov.u64 %r134,%r600; cvt.u32.u32 %r358,%r23; cvt.s64.s8 %r357,%r358; add.u64 %r360,%r357,%r357; add.u64 %r361,%r360,%r357; shl.b64 %r362,%r361,3; add.u64 %r363,%r233,16; add.u64 %r64,%r362,%r363; .loc 1 319 10 mov.u64 %r136,1; .loc 1 323 4 mov.u64 %r382,0; bra $L72; $L71: .loc 1 327 20 add.u32 %r364,%r23,-2; cvt.s64.s32 %r75,%r364; add.u64 %r366,%r75,%r75; add.u64 %r367,%r366,%r75; shl.b64 %r368,%r367,3; add.u64 %r369,%r233,%r368; .loc 1 327 67 shl.b64 %r371,%r75,3; add.u64 %r372,%frame,%r371; .loc 1 327 59 ld.u64 %r374,[%r369+40]; ld.u64 %r375,[%r372+360]; mul.lo.u64 %r162,%r374,%r375; .loc 1 329 24 mov.u64 %r376,0; 
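// Annotation (added; hedged reading of the surrounding, partially truncated dump):
// this region appears to belong to the masked variant _gfortran_mmaxloc1_16_i4.
// Judging from the visible .loc markers into maxloc1_16_i4.c (around source lines
// 327-340) and the calls that follow, the block sets up the result descriptor when
// it was not preallocated: it multiplies the reduced extents to get the allocation
// size, stores offset = 0 and the result rank into the descriptor, and then either
// marks an empty result (size 0) or calls _gfortrani_xmallocarray to allocate the
// data.  The earlier comparison of the mask element size against 16, with a
// _gfortran_runtime_error fallback whose string constant decodes to
// "Funny sized logical array", appears to restrict the mask to the supported
// GFC_LOGICAL kinds.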
[PTX instruction stream, continued from the preceding lines: the remainder of the masked MAXLOC reduction kernel, followed by _gfortran_smaxloc1_16_i4, the scalar-mask wrapper that forwards to the unmasked _gfortran_maxloc1_16_* entry point when the mask is absent or true and otherwise allocates the result array via _gfortrani_xmallocarray and zero-fills it.]
[Final return of _gfortran_smaxloc1_16_i4, then archive member maxloc1_4_i8.o (generated from ../../../libgfortran/generated/maxloc1_4_i8.c). The member declares _gfortran_maxloc1_4_i8, _gfortran_mmaxloc1_4_i8 and _gfortran_smaxloc1_4_i8; references the runtime externals _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents; and embeds the error strings "rank of return array incorrect in MAXLOC intrinsic: is %ld, should be %l…", "MAXLOC", "… incorrect in MAXLOC intrinsic: is %ld, should be between 1 and %l…", "Funny sized logical array", "MASK argumen…", "Incorrect extent in return value of MAXLOC intrinsic in dimension %ld: is %ld, should be %l…" and "return valu…". It then carries the PTX bodies of _gfortran_maxloc1_4_i8 (unmasked reduction along the chosen dimension), _gfortran_mmaxloc1_4_i8 (array-mask variant) and most of _gfortran_smaxloc1_4_i8 (scalar-mask wrapper).]
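All of these kernels perform the same per-column reduction; only the element kind (integer(4) versus integer(8) data, 4/8/16-byte result) and the mask handling differ. A rough, hypothetical C sketch of that reduction follows — it illustrates the technique, not the libgfortran source; the function name, parameters and the zero-position sentinel are assumptions made for clarity.

#include <stddef.h>
#include <stdint.h>

/* Hypothetical sketch, not libgfortran code: one output element of
   MAXLOC(array, DIM) over integer(4) data.  Walk one column of length n
   along DIM, honour an optional logical mask, and return the 1-based
   position of the maximum (0 if every element is masked out).  'back'
   selects the last occurrence of the maximum instead of the first. */
static int64_t maxloc_column(const int32_t *src, ptrdiff_t sstride,
                             const unsigned char *mask, ptrdiff_t mstride,
                             size_t n, int back)
{
    int64_t pos = 0;                 /* 0: no element kept yet            */
    int32_t best = 0;
    for (size_t i = 0; i < n; i++) {
        if (mask && !mask[i * mstride])
            continue;                /* element excluded by MASK          */
        int32_t v = src[i * sstride];
        if (pos == 0 || (back ? v >= best : v > best)) {
            best = v;                /* new maximum (or first kept value) */
            pos = (int64_t)i + 1;    /* Fortran positions are 1-based     */
        }
    }
    return pos;
}

Around this loop the generated PTX additionally shapes and allocates the result descriptor through _gfortrani_xmallocarray and, when bounds checking is enabled in _gfortrani_compile_options, validates the result and mask extents with _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents.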
[Final return of _gfortran_smaxloc1_4_i8, then archive member maxloc1_8_i8.o (generated from ../../../libgfortran/generated/maxloc1_8_i8.c). It declares _gfortran_maxloc1_8_i8, _gfortran_mmaxloc1_8_i8 and _gfortran_smaxloc1_8_i8, references the same runtime externals and MAXLOC error strings as the previous member, and carries the PTX bodies of _gfortran_maxloc1_8_i8, _gfortran_mmaxloc1_8_i8 and the opening of _gfortran_smaxloc1_8_i8, which continues below.]
%r191 bra $L136; bra $L135; $L129: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r301,0; $L139: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L137; .loc 1 513 17 st.u64 [%r43],%r41; bra $L138; $L137: .loc 1 517 12 st.u64 [%r43],%r301; $L138: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L139; bra $L183; $L135: .loc 1 524 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L141; add.u64 %r295,%frame,120; $L161: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L142; $L141: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L143; bra $L184; $L145: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L142: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L145; bra $L141; $L143: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L123; $L184: .loc 1 547 24 mov.u64 %r24696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L149: .loc 1 573 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L146; bra $L147; $L136: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L148; .loc 1 552 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error148: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L149; .loc 1 559 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L147; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L151: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L150; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error150: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L151; $L146: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ 
%r268 bra $L152; shl.b64 %r266,%r296,3; bra $L153; $L152: mov.u64 %r266,8; $L153: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L154: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L154; $L147: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,3; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 583 13 mov.u64 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L160: st.u64 [%r98],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L160; .loc 1 596 7 @ ! %r294 bra $L185; bra $L123; $L159: .loc 1 591 13 st.u64 [%r99],%r280; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L158; bra $L123; $L185: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; $L158: .loc 1 600 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,3; add.u64 %r98,%r98,%r291; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L159; .loc 1 591 13 mov.u64 %r80,0; bra $L160; $L183: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r293,%r98,0; @ ! %r293 bra $L136; bra $L161; $L123: .loc 1 605 1 ret; } maxloc1_16_i8.o/_gfortran_maxloc1_16_i8 .visible .func _gfortran_maxloc1_16_i8.file 1 "../../../libgfortran/generated/maxloc1_16_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_16_i8 .visible .func _gfortran_mmaxloc1_16__gfortran_smaxloc1_16_i8 .visible .func _gfortran_smaxloc1_16_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,76,79,67,0 
},105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_maxloc1_16_i8 .visible .func _gfortran_maxloc1_16_i8480predpredu64 %r231; .reg .u64 %r232; .reg .pred %r233; .reg .pred %r235; .reg .u64 %r236; .reg .pred %r237; .reg .u64u64predu64 %r264; .reg .u64 %r265; .reg .u64 %r266; .reg .u64 %r267; .reg .u64 %r268; .reg .u16 %r270; .reg .u16 %r271; .reg .u32 %r272; .reg .u32 %r273; .reg .u64 %r275; .reg .u64u64pred64pred64predpredpred %r383; .reg .u32u64 %r389; .reg .u64 %r390; mov.u64 %r183,%ar0; mov.u64 %r184,%ar1; mov.u64 %r185,%ar2; mov.u32 %r186,%ar3; .loc 1 58 10 ld.s8 %r22,[%r184+28]; .loc 1 58 38 add.u32 %r187,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r110,%r187; .loc 1 59 10 ld.u64 %r25,[%r185]; .loc 1 59 7 add.u64 %r111,%r25,-1; .loc 1 61 7 shr.u64 %r189,%r111,63; set.u32.lt.s64 %r191,%r110,%r111; neg.s32 %r192,%r191; cvt.u16.u64 %r195,%r189; cvt.u16.u32 %r196,%r192; or.b16 %r194,%r195,%r196; .loc 1 61 6 cvt.u32.u16 %r197,%r194; cvt.u16.u8 %r198,%r197; setp.eq.u16 %r199,%r198,0; @ %r199 bra $L2; .loc 1 63 7 cvt.u32.u32 %r202,%r22; cvt.s64.s8 %r201,%r202; st.u64 [%stack+8],%r2000200_gfortran_runtime_errorr205,%r111,%r111; add.u64 %r206,%r205,%r111; shl.b64 %r207,%r206,3; add.u64 %r208,%r184,%r207; ld.u64 %r31,[%r208+56]; ld.u64 %r33,[%r208+48]; .loc 1 71 9 ld.u64 %r114,[%r208+40]; .loc 1 73 3 setp.ne.u64 %r222,%r111,0; @ %r222 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r223,%r110,%r111; @ %r223 bra $L4; bra $L49; $L3: add.u64 %r109,%r184,40; add.u64 %r128,%frame,120; add.u64 %r181,%frame,240; add.u64 %r224,%r184,16; add.u64 %r226,%r25,%r25; add.u64 %r227,%r226,%r25; shl.b64 %r228,%r227,3; add.u64 %r137,%r224,%r228; .loc 1 79 12 mov.u64 %r390,0; $L8: .loc 1 75 18 ld.u64 %r229,[%r109]; st.u64 [%r128],%r229; .loc 1 76 19 ld.u64 %r231,[%r109+16]; add.u64 %r230,%r231,1; ld.u64 %r232,[%r109+8]; sub.u64 %r38,%r230,%r232; .loc 1 78 10 setp.lt.s64 %r233,%r38,0; @ %r233 bra $L6; .loc 1 76 17 st.u64 [%r181],%r38; bra $L7; $L6: .loc 1 79 12 st.u64 [%r181],%r390; $L7: .loc 1 73 3 add.u64 %r109,%r109,24; add.u64 %r128,%r128,8; add.u64 %r181,%r181,8; setp.ne.u64 %r235,%r109,%r137; @ %r235 bra $L8; bra $L9; $L49: .loc 1 90 6 ld.u64 %r236,[%r183]; setp.eq.u64 %r237,%r236,0; @ ! 
%r237 bra $L11; bra $L10; $L4: add.u64 %r239,%r25,%r25; add.u64 %r240,%r239,%r25; shl.b64 %r241,%r240,3; add.u64 %r242,%r241,40; add.u64 %r134,%r184,%r242; shl.b64 %r243,%r25,3; add.u64 %r107,%r243,-8; add.u64 %r378,%frame,120; add.u64 %r120,%r378,%r107; add.u64 %r377,%frame,240; add.u64 %r104,%r377,%r107; add.u64 %r246,%r184,40; cvt.u32.u32 %r248,%r22; cvt.s64.s8 %r247,%r248; add.u64 %r250,%r247,%r247; add.u64 %r251,%r250,%r247; shl.b64 %r252,%r251,3; add.u64 %r63,%r246,%r252; .loc 1 87 12 mov.u64 %r389,0; $L14: .loc 1 83 18 ld.u64 %r253,[%r134]; st.u64 [%r120],%r253; .loc 1 84 19 ld.u64 %r255,[%r134+16]; add.u64 %r254,%r255,1; ld.u64 %r256,[%r134+8]; sub.u64 %r45,%r254,%r256; .loc 1 86 10 setp.lt.s64 %r257,%r45,0; @ %r257 bra $L12; .loc 1 84 17 st.u64 [%r104],%r45; bra $L13; $L12: .loc 1 87 12 st.u64 [%r104],%r389; $L13: .loc 1 81 3 add.u64 %r134,%r134,24; add.u64 %r120,%r120,8; add.u64 %r104,%r104,8; setp.ne.u64 %r259,%r63,%r134; @ %r259 bra $L14; bra $L50; $L10: .loc 1 94 7 setp.le.s64 %r260,%r110,0; @ %r260 bra $L16; add.u64 %r377,%frame,240; $L37: add.u64 %r154,%r183,40; mov.u64 %r152,%r377; cvt.u32.u32 %r262,%r22; cvt.s64.s8 %r261,%r262; add.u64 %r264,%r261,%r261; add.u64 %r265,%r264,%r261; shl.b64 %r266,%r265,3; add.u64 %r267,%r183,16; add.u64 %r138,%r266,%r267; .loc 1 97 10 mov.u64 %r101,1; .loc 1 101 4 mov.u64 %r290,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r268,0; st.u64 [%r183+8],%r268; .loc 1 106 28 cvt.u16.u32 %r271,%r22; add.u16 %r270,%r271,-1; cvt.u32.u16 %r272,%r270; st.u8 [%r183+28],%r272; .loc 1 108 20 add.u32 %r273,%r22,-2; cvt.s64.s32 %r56,%r273; add.u64 %r275,%r56,%r56; add.u64 %r276,%r275,%r56; shl.b64 %r277,%r276,3; add.u64 %r278,%r183,%r277; .loc 1 108 67 shl.b64 %r280,%r56,3; add.u64 %r281,%frame,%r280; .loc 1 108 59 ld.u64 %r283,[%r278+40]; ld.u64 %r284,[%r281+240]; mul.lo.u64 %r116,%r283,%r284; .loc 1 110 29 mov.u64 %r286,16286; call (%value_in),_gfortrani_xmallocarray287,[%value_in]; } .loc 1 110 27 st.u64 [%r183],%r287; .loc 1 111 10 setp.eq.u64 %r289,%r116,0; @ ! 
%r289 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r101,%r52,%r101; $L17: .loc 1 101 4 st.u64 [%r154+8],%r290; ld.u64 %r52,[%r152]; add.u64 %r291,%r52,-1; st.u64 [%r154+16],%r291; st.u64 [%r154],%r101; .loc 1 94 7 add.u64 %r154,%r154,24; add.u64 %r152,%r152,8; setp.ne.u64 %r292,%r138,%r154; @ %r292 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r183+48],%r116; mov.u64 %r294,-1; st.u64 [%r183+56],%r294; mov.u64 %r295,1; st.u64 [%r183+40],%r295; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r62,[%r183+28]; .loc 1 121 10 setp.eq.u64 %r296,%r62,%r110; @ %r296 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r110; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r299,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r300,[%r299+36]; setp.eq.u32 %r301,%r300,0; @ %r301 bra $L21; .loc 1 128 2 add.u64 %r377,%frame,240; cvta.const.u64 %r305,$LC2call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r307,%r110,0; @ %r307 bra $L24; $L27: .loc 1 68 9 add.u64 %r308,%r31,1; .loc 1 68 7 sub.u64 %r112,%r308,%r33; max.s64 %r46,%r112,0; .loc 1 140 8 ld.u64 %r124,[%r184]; .loc 1 141 8 ld.u64 %r125,[%r183]; .loc 1 180 31 shl.b64 %r70,%r114,3; ld.u64 %r77,[%frame+360]; .loc 1 194 22 ld.u64 %r72,[%frame+120]; .loc 1 194 12 shl.b64 %r73,%r72,3; .loc 1 195 22 ld.u64 %r75,[%frame]; .loc 1 195 12 shl.b64 %r76,%r75,4; .loc 1 197 32 ld.u64 %r105,[%frame+240]; .loc 1 204 23 mul.lo.u64 %r309,%r72,%r105; .loc 1 204 9 shl.b64 %r310,%r309,3; neg.s64 %r135,%r310; .loc 1 205 23 mul.lo.u64 %r311,%r75,%r105; .loc 1 205 9 shl.b64 %r312,%r311,4; neg.s64 %r139,%r312; setp.gt.s64 %r379,%r112,0; setp.le.s64 %r380,%r110,1; mov.u64 %r382,1; setp.eq.u32 %r383,%r186,0; cvt.u32.u32 %r384,%r22; cvt.s64.s8 %r385,%r384; add.u64 %r386,%r385,-1; add.u64 %r387,%frame,120; add.u64 %r388,%frame,240; bra $L25; $L24: add.u64 %r167,%frame,360; add.u64 %r166,%r183,40; mov.u64 %r164,%frame; cvt.u32.u32 %r314,%r22; cvt.s64.s8 %r313,%r314; add.u64 %r155,%r313,-1; .loc 1 132 3 mov.u64 %r126,0; add.u64 %r377,%frame,240; .loc 1 134 16 mov.u64 %r315,%r126; $L26: st.u64 [%r167],%r315; .loc 1 135 18 ld.u64 %r316,[%r166]; st.u64 [%r164],%r316; .loc 1 136 17 shl.b64 %r318,%r126,3; add.u64 %r319,%r377,%r318; .loc 1 136 10 ld.u64 %r320,[%r319]; setp.le.s64 %r321,%r320,0; @ %r321 bra $L1; .loc 1 132 26 add.u64 %r126,%r126,1; .loc 1 132 3 add.u64 %r167,%r167,8; add.u64 %r166,%r166,24; add.u64 %r164,%r164,8; setp.ne.u64 %r322,%r126,%r155; @ %r322 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r379 bra $L38; .loc 1 159 10 mov.u64 %r324,0; st.u64 [%r125],%r324; st.u64 [%r125+8],%r324; bra $L29; $L38: mov.u64 %r119,%r124; mov.u64 %r370,%r382; mov.u64 %r371,0; .loc 1 155 9 mov.u64 %r103,-9223372036854775808; .loc 1 157 9 mov.u64 %r368,1; mov.u64 %r369,%r371; $L28: .loc 1 182 14 ld.u64 %r182,[%r119]; .loc 1 182 7 @ %r383 bra $L30; .loc 1 182 6 setp.le.s64 %r327,%r103,%r182; @ %r327 bra $L31; bra $L32; $L30: setp.ge.s64 %r328,%r103,%r182; @ %r328 bra $L32; $L31: .loc 1 185 14 mov.u64 %r368,%r370; mov.u64 %r369,%r371; .loc 1 182 14 mov.u64 %r103,%r182; $L32: .loc 1 180 31 add.u64 %r119,%r119,%r70; .loc 1 180 6 add.u64 %r331,%r370,%r382; set.u32.lt.u64 %r336,%r331,%r370; cvt.s64.s32 %r334,%r336; mov.u64 %r370,%r331; sub.u64 %r371,%r371,%r334; add.u64 %r341,%r370,-1; setp.gt.s64 %r343,%r46,%r341; @ %r343 bra $L28; .loc 1 189 12 st.u64 [%r125],%r368; st.u64 [%r125+8],%r369; $L29: .loc 1 193 15 add.u64 %r77,%r77,1; .loc 1 194 12 add.u64 %r124,%r124,%r73; .loc 1 195 12 add.u64 %r125,%r125,%r76; .loc 1 197 13 setp.ne.u64 
%r346,%r77,%r105; @ %r346 bra $L25; .loc 1 204 9 add.u64 %r121,%r124,%r135; .loc 1 205 9 add.u64 %r122,%r125,%r139; .loc 1 207 7 @ %r380 bra $L1; add.u64 %r130,%frame,368; mov.u64 %r149,8; .loc 1 206 5 mov.u64 %r123,1; .loc 1 201 13 mov.u64 %r381,0; bra $L35; $L36: st.u64 [%r130],%r381; .loc 1 204 23 mul.lo.u64 %r352,%r89,%r88; .loc 1 204 9 shl.b64 %r353,%r352,3; sub.u64 %r121,%r124,%r353; .loc 1 205 23 mul.lo.u64 %r354,%r92,%r88; .loc 1 205 9 shl.b64 %r355,%r354,4; sub.u64 %r122,%r125,%r355; .loc 1 206 5 add.u64 %r123,%r123,1; .loc 1 207 7 add.u64 %r130,%r130,8; add.u64 %r149,%r149,8; setp.eq.u64 %r356,%r123,%r386; @ %r356 bra $L1; $L35: .loc 1 215 16 ld.u64 %r357,[%r130]; add.u64 %r88,%r357,1; st.u64 [%r130],%r88; .loc 1 216 23 add.u64 %r359,%r387,%r149; ld.u64 %r89,[%r359]; .loc 1 216 13 shl.b64 %r360,%r89,3; add.u64 %r124,%r121,%r360; .loc 1 217 23 add.u64 %r361,%frame,%r149; ld.u64 %r92,[%r361]; .loc 1 217 13 shl.b64 %r362,%r92,4; add.u64 %r125,%r122,%r362; .loc 1 197 32 add.u64 %r364,%r388,%r149; ld.u64 %r95,[%r364]; .loc 1 197 13 setp.eq.u64 %r365,%r88,%r95; @ %r365 bra $L36; .loc 1 201 13 mov.u64 %r77,0; bra $L25; $L50: .loc 1 90 6 ld.u64 %r366,[%r183]; setp.eq.u64 %r367,%r366,0; @ ! %r367 bra $L11; bra $L37; $L1:_gfortran_mmaxloc1_16_i8 .visible .func _gfortran_mmaxloc1_16_, .param .u32 %in_ar4) {60897u64u64 %r267; .reg .u64 %r274; .reg .pred %r275; .reg .u32 %r276; .reg .u32 %r277; .reg .u32u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .pred %r329; .reg .u64 %r330; .reg .pred %r331; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32 %r338; .reg .u64 %r339; .reg .u64 %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64pred %r355; .reg .pred64predpred %r408; .reg .pred %r409; .reg .u64 %r410; .reg .u32.reg .u64 %r417; .reg .pred %r418; .reg .predpred %r428; .reg .u64 %r432; .reg .u64pred %r460; .reg .u64 %r463; .reg .u64 %r466; .reg .u32 %r468; .reg .pred %r475; .reg .pred.reg .u32 %r504; .reg .u64 %r508; .reg .u16 %r509; .reg .u32 %r510; .reg .pred %r511; .reg .pred %r512; .reg .u64 %r515; .reg .u64 %r518; .reg .u32 %r520; .reg .pred %r527; .reg .pred %r530; .reg .pred %r533; .reg .u64pred %r542; .reg .u64 %r543; .reg .u64pred.reg .u64 %r610; .reg .u64 %r611; mov.u64 %r233,%ar0; mov.u64 %r234,%ar1; mov.u64 %r235,%ar2; mov.u64 %r236,%ar3; mov.u32 %r237,%ar4; .loc 1 251 6 setp.ne.u64 %r238,%r236,0; @ %r238 bra $L3r2235237; call _gfortran_maxloc1_16_7 bra $L51; $L52: .loc 1 261 10 ld.u64 %r22,[%r235]; .loc 1 261 7 add.u64 %r153,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r234+28]; .loc 1 262 38 add.u32 %r243,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r154,%r243; .loc 1 265 7 shr.u64 %r245,%r153,63; set.u32.gt.s64 %r247,%r153,%r154; neg.s32 %r248,%r247; cvt.u16.u64 %r251,%r245; cvt.u16.u32 %r252,%r248; or.b16 %r250,%r251,%r252; .loc 1 265 6 cvt.u32.u16 %r253,%r250; cvt.u16.u8 %r254,%r253; setp.eq.u16 %r255,%r254,0; @ %r255 bra $L54; .loc 1 267 7 cvt.u32.u32 %r258,%r23; cvt.s64.s8 %r257,%r258; st.u64 [%stack+8],%r257; st.u64 [%stack],%r22; cvta.const.u64 %r22stack; call _gfortran_runtime_error4: .loc 1 272 9 add.u64 %r596,%r153,%r153; add.u64 %r601,%r596,%r153; shl.b64 %r263,%r601,3; add.u64 %r264,%r234,%r263; ld.u64 %r267,[%r264+56]; add.u64 %r266,%r267,1; .loc 1 272 7 ld.u64 %r274,[%r264+48]; sub.u64 %r155,%r266,%r274; .loc 1 273 6 setp.le.s64 %r275,%r155,0; @ %r275 bra $L51; .loc 1 276 9 ld.u64 %r173,[%r236]; .loc 1 278 15 ld.u64 %r35,[%r236+16]; .loc 1 280 22 cvt.u32.u64 %r36,%r35; .loc 1 280 53 add.u32 
%r276,%r36,-4; and.b32 %r277,%r276,-5; set.u32.eq.u32 %r279,%r277,0; neg.s32 %r280,%r279; .loc 1 280 22 add.u32 %r281,%r36,-1; set.u32.le.u32 %r283,%r281,1; neg.s32 %r284,%r283; .loc 1 280 6 cvt.u16.u32 %r286,%r280; cvt.u16.u32 %r287,%r284; or.b16 %r285,%r286,%r287;56; .loc 1 282 7 setp.ne.u32 %r292,%r36,16; @ %r292 bra $L57; $L56: .loc 1 289 9 shl.b64 %r296,%r601,3; add.u64 %r297,%r234,%r296; ld.u64 %r158,[%r297+40]; .loc 1 290 12 add.u64 %r303,%r236,%r296; ld.u64 %r41,[%r303+40]; .loc 1 292 3 setp.ne.u64 %r305,%r153,0; @ %r305 bra $L58; $L64: .loc 1 302 3 setp.lt.s64 %r306,%r153,%r154; @ %r306 bra $L59; bra $L118; $L57: .loc 1 287 5 cvta.const.u64 %r307307_gfortran_runtime_error8: add.u64 %r164,%r234,40; add.u64 %r150,%r236,40; add.u64 %r309,%r234,16; add.u64 %r311,%r22,%r22; add.u64 %r312,%r311,%r22; shl.b64 %r313,%r312,3; add.u64 %r223,%r309,%r313; .loc 1 292 3 mov.u64 %r53,0; add.u64 %r597,%frame,240; add.u64 %r600,%frame,360; .loc 1 299 12 mov.u64 %r611,%r53; $L63: .loc 1 294 18 add.u64 %r315,%r597,%r53; ld.u64 %r316,[%r164]; st.u64 [%r315],%r316; .loc 1 295 18 add.u64 %r317,%frame,%r53; .loc 1 295 20 ld.u64 %r319,[%r150]; mul.lo.u64 %r318,%r319,%r35; .loc 1 295 18 st.u64 [%r317],%r318; .loc 1 296 19 ld.u64 %r321,[%r164+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r164+8]; sub.u64 %r52,%r320,%r322; .loc 1 298 10 setp.lt.s64 %r323,%r52,0; @ %r323 bra $L61; .loc 1 296 17 add.u64 %r325,%r600,%r53; st.u64 [%r325],%r52; bra $L62; $L61: .loc 1 299 12 add.u64 %r327,%r600,%r53; st.u64 [%r327],%r611; $L62: .loc 1 292 3 add.u64 %r164,%r164,24; add.u64 %r150,%r150,24; add.u64 %r53,%r53,8; setp.ne.u64 %r329,%r164,%r223; @ %r329 bra $L63; bra $L64; $L118: .loc 1 312 6 ld.u64 %r330,[%r233]; setp.eq.u64 %r331,%r330,0; @ ! %r331 bra $L66; bra $L65; $L59: add.u64 %r333,%r22,%r22; add.u64 %r334,%r333,%r22; shl.b64 %r335,%r334,3; add.u64 %r178,%r335,40; add.u64 %r59,%r234,%r178; add.u64 %r180,%r236,%r178; shl.b64 %r336,%r22,3; add.u64 %r91,%r336,-8; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; shl.b64 %r339,%r337,3; add.u64 %r97,%r339,-8; add.u64 %r597,%frame,240; add.u64 %r600,%frame,360; .loc 1 309 12 mov.u64 %r610,0; $L69: .loc 1 304 18 add.u64 %r341,%r597,%r91; ld.u64 %r342,[%r59]; st.u64 [%r341],%r342; .loc 1 305 18 add.u64 %r343,%frame,%r91; .loc 1 305 20 ld.u64 %r345,[%r180]; mul.lo.u64 %r344,%r345,%r35; .loc 1 305 18 st.u64 [%r343],%r344; .loc 1 306 19 ld.u64 %r347,[%r59+16]; add.u64 %r346,%r347,1; ld.u64 %r348,[%r59+8]; sub.u64 %r66,%r346,%r348; .loc 1 308 10 setp.lt.s64 %r349,%r66,0; @ %r349 bra $L67; .loc 1 306 17 add.u64 %r351,%r600,%r91; st.u64 [%r351],%r66; bra $L68; $L67: .loc 1 309 12 add.u64 %r353,%r600,%r91; st.u64 [%r353],%r610; $L68: .loc 1 302 3 add.u64 %r59,%r59,24; add.u64 %r180,%r180,24; add.u64 %r91,%r91,8; setp.ne.u64 %r355,%r91,%r97; @ %r355 bra $L69; bra $L119; $L65: .loc 1 316 7 setp.eq.u64 %r356,%r154,0; @ %r356 bra $L71; add.u64 %r600,%frame,360; $L97: add.u64 %r137,%r233,40; mov.u64 %r134,%r600; cvt.u32.u32 %r358,%r23; cvt.s64.s8 %r357,%r358; add.u64 %r360,%r357,%r357; add.u64 %r361,%r360,%r357; shl.b64 %r362,%r361,3; add.u64 %r363,%r233,16; add.u64 %r64,%r362,%r363; .loc 1 319 10 mov.u64 %r136,1; .loc 1 323 4 mov.u64 %r382,0; bra $L72; $L71: .loc 1 327 20 add.u32 %r364,%r23,-2; cvt.s64.s32 %r75,%r364; add.u64 %r366,%r75,%r75; add.u64 %r367,%r366,%r75; shl.b64 %r368,%r367,3; add.u64 %r369,%r233,%r368; .loc 1 327 67 shl.b64 %r371,%r75,3; add.u64 %r372,%frame,%r371; .loc 1 327 59 ld.u64 %r374,[%r369+40]; ld.u64 %r375,[%r372+360]; mul.lo.u64 %r162,%r374,%r375; .loc 1 329 
24 mov.u64 %r376,0; st.u64 [%r233+8],%r376; .loc 1 330 28 cvt.u16.u32 %r379,%r23; add.u16 %r378,%r379,-1; cvt.u32.u16 %r380,%r378; st.u8 [%r233+28],%r380; .loc 1 332 10 setp.eq.u64 %r381,%r162,0; @ %r381 bra $L73; bra $L120; $L75: .loc 1 321 47 mul.lo.u64 %r136,%r72,%r136; $L72: .loc 1 323 4 st.u64 [%r137+8],%r382; ld.u64 %r72,[%r134]; add.u64 %r383,%r72,-1; st.u64 [%r137+16],%r383; st.u64 [%r137],%r136; .loc 1 316 7 add.u64 %r137,%r137,24; add.u64 %r134,%r134,8; setp.ne.u64 %r384,%r64,%r137; @ %r384 bra $L75; bra $L71; $L73: .loc 1 335 4 st.u64 [%r233+48],%r162; mov.u64 %r386,-1; st.u64 [%r233+56],%r386; mov.u64 %r387,1; st.u64 [%r233+40],%r387; .loc 1 336 4 bra $L51; $L120: .loc 1 339 24 mov.u64 %r389,16389; call (%value_in),_gfortrani_xmallocarray90,[%value_in]; } .loc 1 339 22 st.u64 [%r233],%r390; bra $L76; $L66: .loc 1 344 19 ld.s8 %r392,[%r233+28]; .loc 1 344 10 setp.eq.u64 %r393,%r392,%r154; @ %r393 bra $L77; .loc 1 345 2 cvta.const.u64 %r3944_gfortran_runtime_error77: .loc 1 347 11 cvta.global.u64 %r396,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r397,[%r396+36]; setp.eq.u32 %r398,%r397,0; @ %r398 bra $L76; .loc 1 349 4 add.u64 %r600,%frame,360; cvta.const.u64 %r402,$LC2r600002; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r406,$LC406402; call _gfortrani_bounds_equal_extents$L76: .loc 1 356 3 setp.ne.u64 %r408,%r154,0; @ %r408 bra $L78; $L81: .loc 1 364 8 ld.u64 %r174,[%r233]; .loc 1 365 8 ld.u64 %r172,[%r234]; .loc 1 367 9 setp.ne.u64 %r409,%r172,0; @ %r409 bra $L79; bra $L51; $L78: add.u64 %r188,%frame,480; add.u64 %r181,%r233,40; add.u64 %r149,%frame,120; cvt.u32.u32 %r411,%r23; cvt.s64.s8 %r410,%r411; add.u64 %r138,%r410,-1; .loc 1 356 3 mov.u64 %r175,0; add.u64 %r600,%frame,360; .loc 1 358 16 mov.u64 %r412,%r175; $L80: st.u64 [%r188],%r412; .loc 1 359 18 ld.u64 %r413,[%r181]; st.u64 [%r149],%r413; .loc 1 360 17 shl.b64 %r415,%r175,3; add.u64 %r416,%r600,%r415; .loc 1 360 10 ld.u64 %r417,[%r416]; setp.le.s64 %r418,%r417,0; @ %r418 bra $L51; .loc 1 356 26 add.u64 %r175,%r175,1; .loc 1 356 3 add.u64 %r188,%r188,8; add.u64 %r181,%r181,24; add.u64 %r149,%r149,8; setp.ne.u64 %r419,%r138,%r175; @ %r419 bra $L80; bra $L81; $L79: .loc 1 290 12 mul.lo.u64 %r43,%r41,%r35; .loc 1 386 32 shl.b64 %r88,%r158,3; ld.u64 %r189,[%frame+480]; .loc 1 430 22 ld.u64 %r102,[%frame+240]; .loc 1 430 12 shl.b64 %r103,%r102,3; .loc 1 431 23 ld.u64 %r105,[%frame]; .loc 1 432 22 ld.u64 %r107,[%frame+120]; .loc 1 432 12 shl.b64 %r108,%r107,4; .loc 1 434 32 ld.u64 %r29,[%frame+360]; .loc 1 441 23 mul.lo.u64 %r39,%r29,%r102; .loc 1 442 24 mul.lo.u64 %r420,%r29,%r105; .loc 1 442 10 neg.s64 %r92,%r420; .loc 1 443 23 mul.lo.u64 %r421,%r29,%r107; .loc 1 443 9 shl.b64 %r422,%r421,4; neg.s64 %r185,%r422; setp.le.s64 %r595,%r154,1; .loc 1 408 9 setp.ne.u32 %r603,%r237,0; add.u64 %r604,%r155,-1; cvt.u32.u32 %r605,%r23; cvt.s64.s8 %r606,%r605; add.u64 %r607,%r606,-1; add.u64 %r608,%frame,240; add.u64 %r609,%frame,360; $L98: .loc 1 356 3 mov.u64 %r168,%r173; mov.u64 %r167,%r172; .loc 1 386 9 mov.u64 %r197,0; $L85: .loc 1 389 7 ld.s8 %r224,[%r168]; mov.u64 %r230,%r197; .loc 1 386 24 add.u64 %r197,%r197,1; .loc 1 389 6 setp.eq.u32 %r598,%r224,0; @ %r598 bra $L82; .loc 1 397 11 ld.u64 %r145,[%r167]; .loc 1 398 11 mov.u64 %r557,%r197; shr.s64 %r558,%r557,63; .loc 1 408 9 @ %r603 bra $L83; bra $L121; $L82: .loc 1 386 32 add.u64 %r167,%r167,%r88; .loc 1 386 47 add.u64 %r168,%r168,%r43; .loc 1 386 2 setp.ne.u64 %r426,%r155,%r197; @ %r426 bra $L85; .loc 1 385 9 mov.u64 %r557,0; mov.u64 %r558,%r557; bra $L86; 
$L121: .loc 1 418 8 setp.gt.s64 %r427,%r155,%r230; @ %r427 bra $L87; bra $L86; $L83: .loc 1 409 8 setp.le.s64 %r428,%r155,%r230; @ %r428 bra $L86; add.u64 %r565,%r230,1; shr.s64 %r566,%r565,63; sub.u64 %r432,%r604,%r230; shr.s64 %r435,%r230,63; add.u64 %r438,%r230,2; set.u32.lt.u64 %r443,%r438,%r230; cvt.s64.s32 %r441,%r443; sub.u64 %r447,%r435,%r441; add.u64 %r449,%r432,%r438; set.u32.lt.u64 %r454,%r449,%r432; cvt.s64.s32 %r452,%r454; sub.u64 %r458,%r447,%r452; mov.u64 %r577,1; $L90: .loc 1 411 8 @ %r598 bra $L88; .loc 1 411 18 ld.u64 %r89,[%r167]; .loc 1 411 15 setp.lt.s64 %r460,%r89,%r145; .loc 1 414 16 selp.u64 %r557,%r557,%r565,%r460; selp.u64 %r558,%r558,%r566,%r460; selp.u64 %r145,%r145,%r89,%r460; $L88: .loc 1 409 33 add.u64 %r167,%r167,%r88; .loc 1 409 48 add.u64 %r168,%r168,%r43; .loc 1 409 8 add.u64 %r463,%r565,%r577; set.u32.lt.u64 %r468,%r463,%r565; cvt.s64.s32 %r466,%r468; mov.u64 %r565,%r463; sub.u64 %r566,%r566,%r466; setp.ne.u64 %r475,%r449,%r565; @ %r475 bra $L99; setp.ne.u64 %r478,%r458,%r566; @ ! %r478 bra $L86; $L99: .loc 1 411 9 ld.s8 %r224,[%r168]; setp.eq.u32 %r598,%r224,0; bra $L90; $L87: add.u64 %r561,%r230,1; shr.s64 %r562,%r561,63; sub.u64 %r482,%r604,%r230; shr.s64 %r485,%r230,63; add.u64 %r488,%r230,2; set.u32.lt.u64 %r493,%r488,%r230; cvt.s64.s32 %r491,%r493; sub.u64 %r497,%r485,%r491; add.u64 %r499,%r482,%r488; set.u32.lt.u64 %r504,%r499,%r482; cvt.s64.s32 %r502,%r504; sub.u64 %r508,%r497,%r502; .loc 1 418 8 mov.u64 %r591,1; $L113: .loc 1 420 8 ld.s8 %r510,[%r168]; cvt.u16.u32 %r509,%r510; setp.eq.u16 %r511,%r509,0; @ %r511 bra $L91; .loc 1 420 18 ld.u64 %r94,[%r167]; .loc 1 420 15 setp.le.s64 %r512,%r94,%r145; .loc 1 423 16 selp.u64 %r557,%r557,%r561,%r512; selp.u64 %r558,%r558,%r562,%r512; selp.u64 %r145,%r145,%r94,%r512; $L91: .loc 1 418 33 add.u64 %r167,%r167,%r88; .loc 1 418 48 add.u64 %r168,%r168,%r43; .loc 1 418 8 add.u64 %r515,%r561,%r591; set.u32.lt.u64 %r520,%r515,%r561; cvt.s64.s32 %r518,%r520; mov.u64 %r561,%r515; sub.u64 %r562,%r562,%r518; setp.ne.u64 %r527,%r499,%r561; @ %r527 bra $L113; setp.ne.u64 %r530,%r508,%r562; @ %r530 bra $L113; $L86: .loc 1 426 8 st.u64 [%r174],%r557; st.u64 [%r174+8],%r558; .loc 1 429 15 add.u64 %r189,%r189,1; .loc 1 430 12 add.u64 %r172,%r172,%r103; .loc 1 431 13 add.u64 %r173,%r173,%r105; .loc 1 432 12 add.u64 %r174,%r174,%r108; .loc 1 434 13 setp.ne.u64 %r533,%r29,%r189; @ %r533 bra $L98; .loc 1 442 10 add.u64 %r169,%r173,%r92; .loc 1 443 9 add.u64 %r170,%r174,%r185; .loc 1 445 7 @ %r595 bra $L51; add.u64 %r219,%frame,488; .loc 1 441 23 mov.u64 %r112,%r39; .loc 1 445 7 mov.u64 %r218,8; .loc 1 444 5 mov.u64 %r171,1; add.u64 %r599,%frame,120; .loc 1 438 13 mov.u64 %r602,0; bra $L95; $L96: st.u64 [%r219],%r602; .loc 1 441 23 mul.lo.u64 %r112,%r125,%r124; .loc 1 442 24 mul.lo.u64 %r539,%r127,%r124; .loc 1 442 10 sub.u64 %r169,%r173,%r539; .loc 1 443 23 mul.lo.u64 %r540,%r129,%r124; .loc 1 443 9 shl.b64 %r541,%r540,4; sub.u64 %r170,%r174,%r541; .loc 1 444 5 add.u64 %r171,%r171,1; .loc 1 445 7 add.u64 %r219,%r219,8; add.u64 %r218,%r218,8; setp.eq.u64 %r542,%r171,%r607; @ %r542 bra $L51; $L95: .loc 1 453 16 ld.u64 %r543,[%r219]; add.u64 %r124,%r543,1; st.u64 [%r219],%r124; .loc 1 454 23 add.u64 %r545,%r608,%r218; ld.u64 %r125,[%r545]; .loc 1 454 13 sub.u64 %r546,%r125,%r112; shl.b64 %r547,%r546,3; add.u64 %r172,%r172,%r547; .loc 1 455 24 add.u64 %r548,%frame,%r218; ld.u64 %r127,[%r548]; .loc 1 455 14 add.u64 %r173,%r169,%r127; .loc 1 456 23 add.u64 %r550,%r599,%r218; ld.u64 %r129,[%r550]; .loc 1 456 13 shl.b64 
%r551,%r129,4; add.u64 %r174,%r170,%r551; .loc 1 434 32 add.u64 %r553,%r609,%r218; ld.u64 %r133,[%r553]; .loc 1 434 13 setp.eq.u64 %r554,%r124,%r133; @ %r554 bra $L96; .loc 1 438 13 mov.u64 %r189,0; bra $L98; $L119: .loc 1 312 6 ld.u64 %r555,[%r233]; setp.eq.u64 %r556,%r555,0; @ ! %r556 bra $L66; bra $L97; $L51_gfortran_smaxloc1_16_i8 .visible .func _gfortran_smaxloc1_16_, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L123; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L124; r152154; call _gfortran_maxloc1_16_490 7 bra $L122; $L124: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r297,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L126; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L127; $L133: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L128; bra $L181; $L126: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error127: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r303,0; $L132: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L130; .loc 1 505 17 st.u64 [%r139],%r36; bra $L131; $L130: .loc 1 508 12 st.u64 [%r139],%r303; $L131: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L132; bra $L133; $L181: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L135; bra $L134; $L128: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r302,0; $L138: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L136; .loc 1 513 17 st.u64 [%r43],%r41; bra $L137; $L136: .loc 1 517 12 st.u64 [%r43],%r302; $L137: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L138; bra $L182; $L134: .loc 1 524 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L140; add.u64 %r296,%frame,120; $L160: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L141; $L140: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L142; bra $L183; $L144: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L141: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L144; bra $L140; $L142: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L122; $L183: .loc 1 547 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L148: .loc 1 573 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L145; bra $L146; $L135: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L147; .loc 1 552 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error147: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L148; .loc 1 559 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L146; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L150: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L149; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error149: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L150; $L145: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L151; shl.b64 %r266,%r297,3; bra $L152; $L151: mov.u64 %r266,8; $L152: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L153: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L153; $L146: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,4; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 583 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L159: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L159; .loc 1 596 7 @ ! %r295 bra $L184; bra $L122; $L158: .loc 1 591 13 st.u64 [%r99],%r280; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L157; bra $L122; $L184: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; $L157: .loc 1 600 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L158; .loc 1 591 13 mov.u64 %r80,0; bra $L159; $L182: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r294,%r98,0; @ ! 
%r294 bra $L135; bra $L160; $L122: .loc 1 605 1 ret; } maxloc1_4_i16.o/_gfortran_maxloc1_4_i16 .visible .func _gfortran_maxloc1_4_i16.file 1 "../../../libgfortran/generated/maxloc1_4_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_4_i16 .visible .func _gfortran_mmaxloc1_4_i16_gfortran_smaxloc1_4_i16 .visible .func _gfortran_smaxloc1_4_i16VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_maxloc1_4_i16 .visible .func _gfortran_maxloc1_4_i16480pred %r221; .reg .pred %r222; .reg .u64u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .pred %r232; .reg .predu64 %r280; .reg .u64 %r282; .reg .u64predu32u64 %r317; .reg .u64predpred %r339; .reg .pred %r342; .reg .u32pred %r355; .reg .u64 %r356; .reg .u64 %r358; .reg .u64 %r359; .reg .u64 %r360; .reg .u64 %r361; .reg .u64 %r363; .reg .predpredu64u64 %r384; .reg .u64 %r385; mov.u64 %r182,%ar0; mov.u64 %r183,%ar1; mov.u64 %r184,%ar2; mov.u32 %r185,%ar3; .loc 1 58 10 ld.s8 %r22,[%r183+28]; .loc 1 58 38 add.u32 %r186,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r108,%r186; .loc 1 59 10 ld.u64 %r25,[%r184]; .loc 1 59 7 add.u64 %r109,%r25,-1; .loc 1 61 7 shr.u64 %r188,%r109,63; set.u32.lt.s64 %r190,%r108,%r109; neg.s32 %r191,%r190; cvt.u16.u64 %r194,%r188; cvt.u16.u32 %r195,%r191; or.b16 %r193,%r194,%r195; .loc 1 61 6 cvt.u32.u16 %r196,%r193; cvt.u16.u8 %r197,%r196; setp.eq.u16 %r198,%r197,0; @ %r198 bra $L2; .loc 1 63 7 cvt.u32.u32 %r201,%r22; cvt.s64.s8 %r200,%r201; st.u64 [%stack+8],%r20019_gfortran_runtime_errorr204,%r109,%r109; add.u64 %r205,%r204,%r109; shl.b64 %r206,%r205,3; add.u64 %r207,%r183,%r206; ld.u64 %r31,[%r207+56]; ld.u64 %r33,[%r207+48]; .loc 1 71 9 ld.u64 %r112,[%r207+40]; .loc 1 73 3 setp.ne.u64 %r221,%r109,0; @ %r221 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r222,%r108,%r109; @ %r222 bra $L4; bra $L51; $L3: add.u64 %r29,%r183,40; add.u64 %r125,%frame,120; add.u64 %r126,%frame,240; add.u64 %r223,%r183,16; add.u64 %r225,%r25,%r25; add.u64 %r226,%r225,%r25; shl.b64 %r227,%r226,3; add.u64 %r147,%r223,%r227; .loc 1 79 12 mov.u64 
%r385,0; $L8: .loc 1 75 18 ld.u64 %r228,[%r29]; st.u64 [%r125],%r228; .loc 1 76 19 ld.u64 %r230,[%r29+16]; add.u64 %r229,%r230,1; ld.u64 %r231,[%r29+8]; sub.u64 %r38,%r229,%r231; .loc 1 78 10 setp.lt.s64 %r232,%r38,0; @ %r232 bra $L6; .loc 1 76 17 st.u64 [%r126],%r38; bra $L7; $L6: .loc 1 79 12 st.u64 [%r126],%r385; $L7: .loc 1 73 3 add.u64 %r29,%r29,24; add.u64 %r125,%r125,8; add.u64 %r126,%r126,8; setp.ne.u64 %r234,%r29,%r147; @ %r234 bra $L8; bra $L9; $L51: .loc 1 90 6 ld.u64 %r235,[%r182]; setp.eq.u64 %r236,%r235,0; @ ! %r236 bra $L11; bra $L10; $L4: add.u64 %r238,%r25,%r25; add.u64 %r239,%r238,%r25; shl.b64 %r240,%r239,3; add.u64 %r241,%r240,40; add.u64 %r137,%r183,%r241; shl.b64 %r242,%r25,3; add.u64 %r105,%r242,-8; add.u64 %r373,%frame,120; add.u64 %r127,%r373,%r105; add.u64 %r372,%frame,240; add.u64 %r103,%r372,%r105; cvt.u32.u32 %r246,%r22; cvt.s64.s8 %r245,%r246; add.u64 %r248,%r245,%r245; add.u64 %r249,%r248,%r245; shl.b64 %r250,%r249,3; add.u64 %r251,%r183,40; add.u64 %r64,%r250,%r251; .loc 1 87 12 mov.u64 %r384,0; $L14: .loc 1 83 18 ld.u64 %r252,[%r137]; st.u64 [%r127],%r252; .loc 1 84 19 ld.u64 %r254,[%r137+16]; add.u64 %r253,%r254,1; ld.u64 %r255,[%r137+8]; sub.u64 %r44,%r253,%r255; .loc 1 86 10 setp.lt.s64 %r256,%r44,0; @ %r256 bra $L12; .loc 1 84 17 st.u64 [%r103],%r44; bra $L13; $L12: .loc 1 87 12 st.u64 [%r103],%r384; $L13: .loc 1 81 3 add.u64 %r137,%r137,24; add.u64 %r127,%r127,8; add.u64 %r103,%r103,8; setp.ne.u64 %r258,%r64,%r137; @ %r258 bra $L14; bra $L52; $L10: .loc 1 94 7 setp.le.s64 %r259,%r108,0; @ %r259 bra $L16; add.u64 %r372,%frame,240; $L39: add.u64 %r154,%r182,40; mov.u64 %r152,%r372; cvt.u32.u32 %r261,%r22; cvt.s64.s8 %r260,%r261; add.u64 %r263,%r260,%r260; add.u64 %r264,%r263,%r260; shl.b64 %r265,%r264,3; add.u64 %r266,%r182,16; add.u64 %r138,%r265,%r266; .loc 1 97 10 mov.u64 %r100,1; .loc 1 101 4 mov.u64 %r289,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r267,0; st.u64 [%r182+8],%r267; .loc 1 106 28 cvt.u16.u32 %r270,%r22; add.u16 %r269,%r270,-1; cvt.u32.u16 %r271,%r269; st.u8 [%r182+28],%r271; .loc 1 108 20 add.u32 %r272,%r22,-2; cvt.s64.s32 %r54,%r272; add.u64 %r274,%r54,%r54; add.u64 %r275,%r274,%r54; shl.b64 %r276,%r275,3; add.u64 %r277,%r182,%r276; .loc 1 108 67 shl.b64 %r279,%r54,3; add.u64 %r280,%frame,%r279; .loc 1 108 59 ld.u64 %r282,[%r277+40]; ld.u64 %r283,[%r280+240]; mul.lo.u64 %r114,%r282,%r283; .loc 1 110 29285; call (%value_in),_gfortrani_xmallocarray286,[%value_in]; } .loc 1 110 27 st.u64 [%r182],%r286; .loc 1 111 10 setp.eq.u64 %r288,%r114,0; @ ! 
%r288 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r100,%r50,%r100; $L17: .loc 1 101 4 st.u64 [%r154+8],%r289; ld.u64 %r50,[%r152]; add.u64 %r290,%r50,-1; st.u64 [%r154+16],%r290; st.u64 [%r154],%r100; .loc 1 94 7 add.u64 %r154,%r154,24; add.u64 %r152,%r152,8; setp.ne.u64 %r291,%r138,%r154; @ %r291 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r182+48],%r114; mov.u64 %r293,-1; st.u64 [%r182+56],%r293; mov.u64 %r294,1; st.u64 [%r182+40],%r294; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r60,[%r182+28]; .loc 1 121 10 setp.eq.u64 %r295,%r60,%r108; @ %r295 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r108; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r298,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r299,[%r298+36]; setp.eq.u32 %r300,%r299,0; @ %r300 bra $L21; .loc 1 128 2 add.u64 %r372,%frame,240; cvta.const.u64 %r304,$LC2r372303304; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r306,%r108,0; @ %r306 bra $L24; $L27: .loc 1 68 9 add.u64 %r307,%r31,1; .loc 1 68 7 sub.u64 %r110,%r307,%r33; max.s64 %r45,%r110,0; .loc 1 140 8 ld.u64 %r122,[%r183]; .loc 1 141 8 ld.u64 %r123,[%r182]; .loc 1 180 31 shl.b64 %r68,%r112,4; ld.u64 %r75,[%frame+360]; .loc 1 194 22 ld.u64 %r70,[%frame+120]; .loc 1 194 12 shl.b64 %r71,%r70,4; .loc 1 195 22 ld.u64 %r73,[%frame]; .loc 1 195 12 shl.b64 %r74,%r73,2; .loc 1 197 32 ld.u64 %r143,[%frame+240]; .loc 1 204 23 mul.lo.u64 %r308,%r70,%r143; .loc 1 204 9 shl.b64 %r309,%r308,4; neg.s64 %r134,%r309; .loc 1 205 23 mul.lo.u64 %r310,%r73,%r143; .loc 1 205 9 shl.b64 %r311,%r310,2; neg.s64 %r128,%r311; setp.gt.s64 %r374,%r110,0; setp.le.s64 %r375,%r108,1; .loc 1 155 9 mov.u64 %r377,0; setp.eq.u32 %r378,%r185,0; cvt.u32.u32 %r379,%r22; cvt.s64.s8 %r380,%r379; add.u64 %r381,%r380,-1; add.u64 %r382,%frame,120; add.u64 %r383,%frame,240; bra $L25; $L24: add.u64 %r167,%frame,360; add.u64 %r166,%r182,40; mov.u64 %r164,%frame; cvt.u32.u32 %r313,%r22; cvt.s64.s8 %r312,%r313; add.u64 %r155,%r312,-1; .loc 1 132 3 mov.u64 %r124,0; add.u64 %r372,%frame,240; .loc 1 134 16 mov.u64 %r314,%r124; $L26: st.u64 [%r167],%r314; .loc 1 135 18 ld.u64 %r315,[%r166]; st.u64 [%r164],%r315; .loc 1 136 17 shl.b64 %r317,%r124,3; add.u64 %r318,%r372,%r317; .loc 1 136 10 ld.u64 %r319,[%r318]; setp.le.s64 %r320,%r319,0; @ %r320 bra $L1; .loc 1 132 26 add.u64 %r124,%r124,1; .loc 1 132 3 add.u64 %r167,%r167,8; add.u64 %r166,%r166,24; add.u64 %r164,%r164,8; setp.ne.u64 %r321,%r124,%r155; @ %r321 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r374 bra $L40; .loc 1 159 10 mov.u32 %r323,0; st.u32 [%r123],%r323; bra $L29; $L40: mov.u64 %r118,%r122; .loc 1 155 9 mov.u64 %r367,%r377; mov.u64 %r368,-9223372036854775808; .loc 1 157 9 mov.u32 %r101,1; .loc 1 178 8 mov.u64 %r117,0; $L28: .loc 1 182 14 ld.u64 %r369,[%r118]; ld.u64 %r370,[%r118+8]; .loc 1 182 7 @ %r378 bra $L30; .loc 1 182 6 setp.gt.s64 %r327,%r368,%r370; @ %r327 bra $L33; setp.ne.u64 %r330,%r368,%r370; @ %r330 bra $L31; setp.gt.u64 %r333,%r367,%r369; @ %r333 bra $L33; bra $L31; $L30: setp.gt.s64 %r336,%r370,%r368; @ %r336 bra $L31; setp.ne.u64 %r339,%r370,%r368; @ %r339 bra $L33; setp.gt.u64 %r342,%r369,%r367; @ ! 
%r342 bra $L33; $L31: cvt.u32.u64 %r343,%r117; add.u32 %r101,%r343,1; .loc 1 182 14 mov.u64 %r367,%r369; mov.u64 %r368,%r370; $L33: .loc 1 180 23 add.u64 %r117,%r117,1; .loc 1 180 31 add.u64 %r118,%r118,%r68; .loc 1 180 6 setp.gt.s64 %r344,%r45,%r117; @ %r344 bra $L28; .loc 1 189 12 st.u32 [%r123],%r101; $L29: .loc 1 193 15 add.u64 %r75,%r75,1; .loc 1 194 12 add.u64 %r122,%r122,%r71; .loc 1 195 12 add.u64 %r123,%r123,%r74; .loc 1 197 13 setp.ne.u64 %r345,%r75,%r143; @ %r345 bra $L25; .loc 1 204 9 add.u64 %r119,%r122,%r134; .loc 1 205 9 add.u64 %r120,%r123,%r128; .loc 1 207 7 @ %r375 bra $L1; add.u64 %r133,%frame,368; mov.u64 %r149,8; .loc 1 206 5 mov.u64 %r121,1; bra $L37; $L38: .loc 1 201 13 st.u64 [%r133],%r377; .loc 1 204 23 mul.lo.u64 %r351,%r87,%r86; .loc 1 204 9 shl.b64 %r352,%r351,4; sub.u64 %r119,%r122,%r352; .loc 1 205 23 mul.lo.u64 %r353,%r90,%r86; .loc 1 205 9 shl.b64 %r354,%r353,2; sub.u64 %r120,%r123,%r354; .loc 1 206 5 add.u64 %r121,%r121,1; .loc 1 207 7 add.u64 %r133,%r133,8; add.u64 %r149,%r149,8; setp.eq.u64 %r355,%r121,%r381; @ %r355 bra $L1; $L37: .loc 1 215 16 ld.u64 %r356,[%r133]; add.u64 %r86,%r356,1; st.u64 [%r133],%r86; .loc 1 216 23 add.u64 %r358,%r382,%r149; ld.u64 %r87,[%r358]; .loc 1 216 13 shl.b64 %r359,%r87,4; add.u64 %r122,%r119,%r359; .loc 1 217 23 add.u64 %r360,%frame,%r149; ld.u64 %r90,[%r360]; .loc 1 217 13 shl.b64 %r361,%r90,2; add.u64 %r123,%r120,%r361; .loc 1 197 32 add.u64 %r363,%r383,%r149; ld.u64 %r93,[%r363]; .loc 1 197 13 setp.eq.u64 %r364,%r86,%r93; @ %r364 bra $L38; .loc 1 201 13 mov.u64 %r75,0; bra $L25; $L52: .loc 1 90 6 ld.u64 %r365,[%r182]; setp.eq.u64 %r366,%r365,0; @ ! %r366 bra $L11; bra $L39; $L1:_gfortran_mmaxloc1_4_i16 .visible .func _gfortran_mmaxloc1_4_i16, .param .u32 %in_ar4) {60896u16 %r238; .reg .u16u64 %r303; .reg .u64 %r304; .reg .u64 %r305; .reg .u64 %r306; .reg .u64pred %r317; .reg .u64 %r318; .reg .pred %r319; .reg .u64u32u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64u64predu32 %r385; .reg .predu64 %r394; .reg .predu64predu64 %r410; .reg .u32 %r412; .reg .pred %r414; .reg .pred %r415; .reg .pred %r416; .reg .pred %r420; .reg .pred %r423; .reg .pred %r426; .reg .predpred %r430; .reg .pred %r433; .reg .pred %r436; .reg .pred %r439; .reg .predpred %r462; .reg .u64 %r463; .reg .predpred %r473; .reg .u64 %r475; .reg .pred %r476; .reg .u64 %r477; .reg .u64 %r478; .reg .pred %r479; .reg .u32u64 %r484; .reg .u64 %r485; .reg .u64 %r486; .reg .u64 %r487; mov.u64 %r221,%ar0; mov.u64 %r222,%ar1; mov.u64 %r223,%ar2; mov.u64 %r224,%ar3; mov.u32 %r225,%ar4; .loc 1 251 6 setp.ne.u64 %r226,%r224,0; @ %r226 bra $L223225; call _gfortran_maxloc1_4_i167 bra $L53; $L54: .loc 1 261 10 ld.u64 %r22,[%r223]; .loc 1 261 7 add.u64 %r151,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r222+28]; .loc 1 262 38 add.u32 %r231,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r152,%r231; .loc 1 265 7 shr.u64 %r233,%r151,63; set.u32.gt.s64 %r235,%r151,%r152; neg.s32 %r236,%r235; cvt.u16.u64 %r239,%r233; cvt.u16.u32 %r240,%r236; or.b16 %r238,%r239,%r240; .loc 1 265 6 cvt.u32.u16 %r241,%r238; cvt.u16.u8 %r242,%r241; setp.eq.u16 %r243,%r242,0; @ %r243 bra $L56; .loc 1 267 7 cvt.u32.u32 %r246,%r23; cvt.s64.s8 %r245,%r246; st.u64 [%stack+8],%r245; st.u64 [%stack],%r22; cvta.const.u64 %r244244_gfortran_runtime_error6: .loc 1 272 9 add.u64 %r472,%r151,%r151; add.u64 %r475,%r472,%r151; shl.b64 %r251,%r475,3; add.u64 %r252,%r222,%r251; ld.u64 %r255,[%r252+56]; add.u64 %r254,%r255,1; .loc 1 272 7 ld.u64 %r262,[%r252+48]; sub.u64 %r153,%r254,%r262; .loc 1 273 6 setp.le.s64 %r263,%r153,0; 
[Binary nvptx object payload from the package; the PTX text in this stretch is heavily damaged by extraction (register declarations, calls and labels are fused or cut) and cannot be reconstructed instruction-for-instruction. Only the following is reliably recoverable.]

maxloc1_4_i16.o (continued): the tail of the preceding masked MAXLOC routine and the scalar-mask wrapper _gfortran_smaxloc1_4_i16, which calls _gfortran_maxloc1_4_i16, _gfortrani_xmallocarray and _gfortran_runtime_error.

maxloc1_8_i16.o, compiled from ../../../libgfortran/generated/maxloc1_8_i16.c: function definitions _gfortran_maxloc1_8_i16, _gfortran_mmaxloc1_8_i16 and _gfortran_smaxloc1_8_i16; external declarations _gfortrani_compile_options (.global .align 8 .u64), _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents; constant string data (decoded from the byte arrays, truncated where shown): "rank of return array incorrect in MAXLOC intrinsic: is %ld, should be %l...", "MAXLOC", "...incorrect in MAXLOC intrinsic: is %ld, should be between 1 and %l...", "Funny sized logical array...", "rank of return array incorrect in MAXLOC intrinsi...", "MASK argumen...", "Incorrect extent in return value of MAXLOC intrinsic in dimension %ld: is %ld, should be %l...", "return valu...".

maxloc1_16_i16.o, compiled from ../../../libgfortran/generated/maxloc1_16_i16.c: the same set of external declarations and error strings, declarations for _gfortran_maxloc1_16_i16, _gfortran_mmaxloc1_16_i16 and _gfortran_smaxloc1_16_i16, the body of _gfortran_maxloc1_16_i16, and the beginning of _gfortran_mmaxloc1_16_i16 (cut off at this point).

By libgfortran naming convention these are the MAXLOC-along-a-dimension reduction helpers for integer(kind=16) arguments, with result index kinds 4, 8 and 16 and plain, masked (m-prefixed) and scalar-mask (s-prefixed) entry points.
%r393,%r392,%r154; @ %r393 bra $L79; .loc 1 345 2 cvta.const.u64 %r3944_gfortran_runtime_error79: .loc 1 347 11 cvta.global.u64 %r396,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r397,[%r396+36]; setp.eq.u32 %r398,%r397,0; @ %r398 bra $L78; .loc 1 349 4 add.u64 %r618,%frame,360; cvta.const.u64 %r402,$LC2r61002; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r406,$LC406402; call _gfortrani_bounds_equal_extents$L78: .loc 1 356 3 setp.ne.u64 %r408,%r154,0; @ %r408 bra $L80; $L83: .loc 1 364 8 ld.u64 %r174,[%r233]; .loc 1 365 8 ld.u64 %r172,[%r234]; .loc 1 367 9 setp.ne.u64 %r409,%r172,0; @ %r409 bra $L81; bra $L53; $L80: add.u64 %r188,%frame,480; add.u64 %r181,%r233,40; add.u64 %r149,%frame,120; cvt.u32.u32 %r411,%r23; cvt.s64.s8 %r410,%r411; add.u64 %r138,%r410,-1; .loc 1 356 3 mov.u64 %r175,0; add.u64 %r618,%frame,360; .loc 1 358 16 mov.u64 %r412,%r175; $L82: st.u64 [%r188],%r412; .loc 1 359 18 ld.u64 %r413,[%r181]; st.u64 [%r149],%r413; .loc 1 360 17 shl.b64 %r415,%r175,3; add.u64 %r416,%r618,%r415; .loc 1 360 10 ld.u64 %r417,[%r416]; setp.le.s64 %r418,%r417,0; @ %r418 bra $L53; .loc 1 356 26 add.u64 %r175,%r175,1; .loc 1 356 3 add.u64 %r188,%r188,8; add.u64 %r181,%r181,24; add.u64 %r149,%r149,8; setp.ne.u64 %r419,%r138,%r175; @ %r419 bra $L82; bra $L83; $L81: .loc 1 290 12 mul.lo.u64 %r43,%r41,%r35; .loc 1 386 32 shl.b64 %r88,%r158,4; ld.u64 %r189,[%frame+480]; .loc 1 430 22 ld.u64 %r102,[%frame+240]; .loc 1 430 12 shl.b64 %r103,%r102,4; .loc 1 431 23 ld.u64 %r105,[%frame]; .loc 1 432 22 ld.u64 %r107,[%frame+120]; .loc 1 432 12 shl.b64 %r108,%r107,4; .loc 1 434 32 ld.u64 %r29,[%frame+360]; .loc 1 441 23 mul.lo.u64 %r39,%r29,%r102; .loc 1 442 24 mul.lo.u64 %r420,%r29,%r105; .loc 1 442 10 neg.s64 %r92,%r420; .loc 1 443 23 mul.lo.u64 %r421,%r29,%r107; .loc 1 443 9 shl.b64 %r422,%r421,4; neg.s64 %r185,%r422; setp.le.s64 %r621,%r154,1; .loc 1 408 9 setp.ne.u32 %r625,%r237,0; add.u64 %r626,%r155,-1; cvt.u32.u32 %r627,%r23; cvt.s64.s8 %r628,%r627; add.u64 %r629,%r628,-1; add.u64 %r630,%frame,240; add.u64 %r631,%frame,360; $L102: .loc 1 356 3 mov.u64 %r168,%r173; mov.u64 %r167,%r172; .loc 1 386 9 mov.u64 %r197,0; $L87: .loc 1 389 7 ld.s8 %r224,[%r168]; mov.u64 %r230,%r197; .loc 1 386 24 add.u64 %r197,%r197,1; .loc 1 389 6 setp.eq.u32 %r623,%r224,0; @ %r623 bra $L84; .loc 1 397 11 ld.u64 %r579,[%r167]; ld.u64 %r580,[%r167+8]; .loc 1 398 11 mov.u64 %r577,%r197; shr.s64 %r578,%r577,63; .loc 1 408 9 @ %r625 bra $L85; bra $L127; $L84: .loc 1 386 32 add.u64 %r167,%r167,%r88; .loc 1 386 47 add.u64 %r168,%r168,%r43; .loc 1 386 2 setp.ne.u64 %r426,%r155,%r197; @ %r426 bra $L87; .loc 1 385 9 mov.u64 %r577,0; mov.u64 %r578,%r577; bra $L88; $L127: .loc 1 418 8 setp.gt.s64 %r427,%r155,%r230; @ %r427 bra $L89; bra $L88; $L85: .loc 1 409 8 setp.le.s64 %r428,%r155,%r230; @ %r428 bra $L88; add.u64 %r587,%r230,1; shr.s64 %r588,%r587,63; sub.u64 %r432,%r626,%r230; shr.s64 %r435,%r230,63; add.u64 %r438,%r230,2; set.u32.lt.u64 %r443,%r438,%r230; cvt.s64.s32 %r441,%r443; sub.u64 %r447,%r435,%r441; add.u64 %r449,%r432,%r438; set.u32.lt.u64 %r454,%r449,%r432; cvt.s64.s32 %r452,%r454; sub.u64 %r458,%r447,%r452; mov.u64 %r599,1; $L93: .loc 1 411 8 @ %r623 bra $L90; .loc 1 411 18 ld.u64 %r573,[%r167]; ld.u64 %r574,[%r167+8]; .loc 1 411 15 setp.gt.s64 %r462,%r580,%r574; @ %r462 bra $L90; setp.ne.u64 %r465,%r580,%r574; @ %r465 bra $L103; setp.gt.u64 %r468,%r579,%r573; @ %r468 bra $L90; $L103: .loc 1 414 16 mov.u64 %r577,%r587; mov.u64 %r578,%r588; mov.u64 %r579,%r573; mov.u64 %r580,%r574; $L90: .loc 1 409 33 
add.u64 %r167,%r167,%r88; .loc 1 409 48 add.u64 %r168,%r168,%r43; .loc 1 409 8 add.u64 %r471,%r587,%r599; set.u32.lt.u64 %r476,%r471,%r587; cvt.s64.s32 %r474,%r476; mov.u64 %r587,%r471; sub.u64 %r588,%r588,%r474; setp.ne.u64 %r483,%r449,%r587; @ %r483 bra $L104; setp.ne.u64 %r486,%r458,%r588; @ ! %r486 bra $L88; $L104: .loc 1 411 9 ld.s8 %r224,[%r168]; setp.eq.u32 %r623,%r224,0; bra $L93; $L89: add.u64 %r583,%r230,1; shr.s64 %r584,%r583,63; sub.u64 %r490,%r626,%r230; shr.s64 %r493,%r230,63; add.u64 %r496,%r230,2; set.u32.lt.u64 %r501,%r496,%r230; cvt.s64.s32 %r499,%r501; sub.u64 %r505,%r493,%r499; add.u64 %r507,%r490,%r496; set.u32.lt.u64 %r512,%r507,%r490; cvt.s64.s32 %r510,%r512; sub.u64 %r516,%r505,%r510; .loc 1 418 8 mov.u64 %r613,1; $L119: .loc 1 420 8 ld.s8 %r518,[%r168]; cvt.u16.u32 %r517,%r518; setp.eq.u16 %r519,%r517,0; @ %r519 bra $L94; .loc 1 420 18 ld.u64 %r575,[%r167]; ld.u64 %r576,[%r167+8]; .loc 1 420 15 setp.gt.s64 %r522,%r576,%r580; @ %r522 bra $L105; setp.ne.u64 %r525,%r576,%r580; @ %r525 bra $L94; setp.gt.u64 %r528,%r575,%r579; @ ! %r528 bra $L94; $L105: .loc 1 423 16 mov.u64 %r577,%r583; mov.u64 %r578,%r584; mov.u64 %r579,%r575; mov.u64 %r580,%r576; $L94: .loc 1 418 33 add.u64 %r167,%r167,%r88; .loc 1 418 48 add.u64 %r168,%r168,%r43; .loc 1 418 8 add.u64 %r531,%r583,%r613; set.u32.lt.u64 %r536,%r531,%r583; cvt.s64.s32 %r534,%r536; mov.u64 %r583,%r531; sub.u64 %r584,%r584,%r534; setp.ne.u64 %r543,%r507,%r583; @ %r543 bra $L119; setp.ne.u64 %r546,%r516,%r584; @ %r546 bra $L119; $L88: .loc 1 426 8 st.u64 [%r174],%r577; st.u64 [%r174+8],%r578; .loc 1 429 15 add.u64 %r189,%r189,1; .loc 1 430 12 add.u64 %r172,%r172,%r103; .loc 1 431 13 add.u64 %r173,%r173,%r105; .loc 1 432 12 add.u64 %r174,%r174,%r108; .loc 1 434 13 setp.ne.u64 %r549,%r29,%r189; @ %r549 bra $L102; .loc 1 442 10 add.u64 %r169,%r173,%r92; .loc 1 443 9 add.u64 %r170,%r174,%r185; .loc 1 445 7 @ %r621 bra $L53; add.u64 %r219,%frame,488; .loc 1 441 23 mov.u64 %r112,%r39; .loc 1 445 7 mov.u64 %r218,8; .loc 1 444 5 mov.u64 %r171,1; add.u64 %r617,%frame,120; .loc 1 438 13 mov.u64 %r624,0; bra $L99; $L100: st.u64 [%r219],%r624; .loc 1 441 23 mul.lo.u64 %r112,%r125,%r124; .loc 1 442 24 mul.lo.u64 %r555,%r127,%r124; .loc 1 442 10 sub.u64 %r169,%r173,%r555; .loc 1 443 23 mul.lo.u64 %r556,%r129,%r124; .loc 1 443 9 shl.b64 %r557,%r556,4; sub.u64 %r170,%r174,%r557; .loc 1 444 5 add.u64 %r171,%r171,1; .loc 1 445 7 add.u64 %r219,%r219,8; add.u64 %r218,%r218,8; setp.eq.u64 %r558,%r171,%r629; @ %r558 bra $L53; $L99: .loc 1 453 16 ld.u64 %r559,[%r219]; add.u64 %r124,%r559,1; st.u64 [%r219],%r124; .loc 1 454 23 add.u64 %r561,%r630,%r218; ld.u64 %r125,[%r561]; .loc 1 454 13 sub.u64 %r562,%r125,%r112; shl.b64 %r563,%r562,4; add.u64 %r172,%r172,%r563; .loc 1 455 24 add.u64 %r564,%frame,%r218; ld.u64 %r127,[%r564]; .loc 1 455 14 add.u64 %r173,%r169,%r127; .loc 1 456 23 add.u64 %r566,%r617,%r218; ld.u64 %r129,[%r566]; .loc 1 456 13 shl.b64 %r567,%r129,4; add.u64 %r174,%r170,%r567; .loc 1 434 32 add.u64 %r569,%r631,%r218; ld.u64 %r133,[%r569]; .loc 1 434 13 setp.eq.u64 %r570,%r124,%r133; @ %r570 bra $L100; .loc 1 438 13 mov.u64 %r189,0; bra $L102; $L125: .loc 1 312 6 ld.u64 %r571,[%r233]; setp.eq.u64 %r572,%r571,0; @ ! 
%r572 bra $L68; bra $L101; $L53_gfortran_smaxloc1_16_i16 .visible .func _gfortran_smaxloc1_16_i16, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L129; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L130; $L129:r152154; call _gfortran_maxloc1_16_i16490 7 bra $L128; $L130: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r297,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L132; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L133; $L139: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L134; bra $L187; $L132: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error133: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r303,0; $L138: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L136; .loc 1 505 17 st.u64 [%r139],%r36; bra $L137; $L136: .loc 1 508 12 st.u64 [%r139],%r303; $L137: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L138; bra $L139; $L187: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L141; bra $L140; $L134: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r302,0; $L144: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L142; .loc 1 513 17 st.u64 [%r43],%r41; bra $L143; $L142: .loc 1 517 12 st.u64 [%r43],%r302; $L143: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L144; bra $L188; $L140: .loc 1 524 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L146; add.u64 %r296,%frame,120; $L166: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L147; $L146: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L148; bra $L189; $L150: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L147: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L150; bra $L146; $L148: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L128; $L189: .loc 1 547 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L154: .loc 1 573 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L151; bra $L152; $L141: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L153; .loc 1 552 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error153: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L154; .loc 1 559 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L152; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L156: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L155; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error155: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L156; $L151: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L157; shl.b64 %r266,%r297,3; bra $L158; $L157: mov.u64 %r266,8; $L158: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L159: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L159; $L152: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,4; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 583 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L165: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L165; .loc 1 596 7 @ ! %r295 bra $L190; bra $L128; $L164: .loc 1 591 13 st.u64 [%r99],%r280; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L163; bra $L128; $L190: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; $L163: .loc 1 600 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L164; .loc 1 591 13 mov.u64 %r80,0; bra $L165; $L188: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r294,%r98,0; @ ! 
%r294 bra $L141; bra $L166; $L128: .loc 1 605 1 ret; } maxloc1_4_r4.o/_gfortran_maxloc1_4_r4 .visible .func _gfortran_maxloc1_4_rfortran/generated/maxloc1_4_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_4_r4 .visible .func _gfortran_mmaxloc1_4_r_gfortran_smaxloc1_4_r4 .visible .func _gfortran_smaxloc1_4_rVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_maxloc1_4_r4 .visible .func _gfortran_maxloc1_4_r4480100; .reg .u64 %r102; .reg .u32predpredpred %r235; .reg .predpredu64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64 %r283; .reg .u64predu64u32 %r316; .reg .u64 %r317; .reg .u64pred %r323; .reg .pred %r324; .reg .u32 %r326; .reg .predpred %r330; .reg .predpred %r345; .reg .u64mov.u64 %r185,%ar0; mov.u64 %r186,%ar1; mov.u64 %r187,%ar2; mov.u32 %r188,%ar3; .loc 1 58 10 ld.s8 %r22,[%r186+28]; .loc 1 58 38 add.u32 %r189,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r112,%r189; .loc 1 59 10 ld.u64 %r25,[%r187]; .loc 1 59 7 add.u64 %r113,%r25,-1; .loc 1 61 7 shr.u64 %r191,%r113,63; set.u32.lt.s64 %r193,%r112,%r113; neg.s32 %r194,%r193; cvt.u16.u64 %r197,%r191; cvt.u16.u32 %r198,%r194; or.b16 %r196,%r197,%r198; .loc 1 61 6 cvt.u32.u16 %r199,%r196; cvt.u16.u8 %r200,%r199; setp.eq.u16 %r201,%r200,0; @ %r201 bra $L2; .loc 1 63 7 cvt.u32.u32 %r204,%r22; cvt.s64.s8 %r203,%r204; st.u64 [%stack+8],%r20302202_gfortran_runtime_errorr207,%r113,%r113; add.u64 %r208,%r207,%r113; shl.b64 %r209,%r208,3; add.u64 %r210,%r186,%r209; ld.u64 %r30,[%r210+56]; ld.u64 %r32,[%r210+48]; .loc 1 71 9 ld.u64 %r116,[%r210+40]; .loc 1 73 3 setp.ne.u64 %r224,%r113,0; @ %r224 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r225,%r112,%r113; @ %r225 bra $L4; bra $L55; $L3: add.u64 %r184,%r186,40; add.u64 %r181,%frame,120; add.u64 %r47,%frame,240; add.u64 %r226,%r186,16; add.u64 %r228,%r25,%r25; add.u64 %r229,%r228,%r25; shl.b64 %r230,%r229,3; add.u64 %r148,%r226,%r230; .loc 1 79 12 mov.u64 %r371,0; $L8: .loc 1 75 18 ld.u64 %r231,[%r184]; st.u64 [%r181],%r231; .loc 1 76 19 ld.u64 %r233,[%r184+16]; add.u64 
%r232,%r233,1; ld.u64 %r234,[%r184+8]; sub.u64 %r37,%r232,%r234; .loc 1 78 10 setp.lt.s64 %r235,%r37,0; @ %r235 bra $L6; .loc 1 76 17 st.u64 [%r47],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 [%r47],%r371; $L7: .loc 1 73 3 add.u64 %r184,%r184,24; add.u64 %r181,%r181,8; add.u64 %r47,%r47,8; setp.ne.u64 %r237,%r148,%r184; @ %r237 bra $L8; bra $L9; $L55: .loc 1 90 6 ld.u64 %r238,[%r185]; setp.eq.u64 %r239,%r238,0; @ ! %r239 bra $L11; bra $L10; $L4: add.u64 %r241,%r25,%r25; add.u64 %r242,%r241,%r25; shl.b64 %r243,%r242,3; add.u64 %r244,%r243,40; add.u64 %r130,%r186,%r244; shl.b64 %r245,%r25,3; add.u64 %r102,%r245,-8; add.u64 %r359,%frame,120; add.u64 %r106,%r359,%r102; add.u64 %r358,%frame,240; add.u64 %r98,%r358,%r102; cvt.u32.u32 %r249,%r22; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r254,%r186,40; add.u64 %r127,%r253,%r254; .loc 1 87 12 mov.u64 %r370,0; $L14: .loc 1 83 18 ld.u64 %r255,[%r130]; st.u64 [%r106],%r255; .loc 1 84 19 ld.u64 %r257,[%r130+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r130+8]; sub.u64 %r44,%r256,%r258; .loc 1 86 10 setp.lt.s64 %r259,%r44,0; @ %r259 bra $L12; .loc 1 84 17 st.u64 [%r98],%r44; bra $L13; $L12: .loc 1 87 12 st.u64 [%r98],%r370; $L13: .loc 1 81 3 add.u64 %r130,%r130,24; add.u64 %r106,%r106,8; add.u64 %r98,%r98,8; setp.ne.u64 %r261,%r127,%r130; @ %r261 bra $L14; bra $L56; $L10: .loc 1 94 7 setp.le.s64 %r262,%r112,0; @ %r262 bra $L16; add.u64 %r358,%frame,240; $L42: add.u64 %r155,%r185,40; mov.u64 %r146,%r358; add.u64 %r263,%r185,16; cvt.u32.u32 %r265,%r22; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r134,%r263,%r269; .loc 1 97 10 mov.u64 %r100,1; .loc 1 101 4 mov.u64 %r292,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r270,0; st.u64 [%r185+8],%r270; .loc 1 106 28 cvt.u16.u32 %r273,%r22; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r185+28],%r274; .loc 1 108 20 add.u32 %r275,%r22,-2; cvt.s64.s32 %r53,%r275; add.u64 %r277,%r53,%r53; add.u64 %r278,%r277,%r53; shl.b64 %r279,%r278,3; add.u64 %r280,%r185,%r279; .loc 1 108 67 shl.b64 %r282,%r53,3; add.u64 %r283,%frame,%r282; .loc 1 108 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r118,%r285,%r286; .loc 1 110 29288; call (%value_in),_gfortrani_xmallocarray289,[%value_in]; } .loc 1 110 27 st.u64 [%r185],%r289; .loc 1 111 10 setp.eq.u64 %r291,%r118,0; @ ! 
%r291 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r100,%r49,%r100; $L17: .loc 1 101 4 st.u64 [%r155+8],%r292; ld.u64 %r49,[%r146]; add.u64 %r293,%r49,-1; st.u64 [%r155+16],%r293; st.u64 [%r155],%r100; .loc 1 94 7 add.u64 %r155,%r155,24; add.u64 %r146,%r146,8; setp.ne.u64 %r294,%r134,%r155; @ %r294 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r185+48],%r118; mov.u64 %r296,-1; st.u64 [%r185+56],%r296; mov.u64 %r297,1; st.u64 [%r185+40],%r297; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r59,[%r185+28]; .loc 1 121 10 setp.eq.u64 %r298,%r59,%r112; @ %r298 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r112; st.u64 [%stack],%r59;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r301,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r302,[%r301+36]; setp.eq.u32 %r303,%r302,0; @ %r303 bra $L21; .loc 1 128 2 add.u64 %r358,%frame,240; cvta.const.u64 %r307,$LC2r35306307; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r309,%r112,0; @ %r309 bra $L24; $L27: .loc 1 68 9 add.u64 %r310,%r30,1; .loc 1 68 7 sub.u64 %r114,%r310,%r32; max.s64 %r131,%r114,0; .loc 1 140 8 ld.u64 %r124,[%r186]; .loc 1 141 8 ld.u64 %r125,[%r185]; .loc 1 168 42 shl.b64 %r66,%r116,2; ld.u64 %r135,[%frame+360]; .loc 1 194 22 ld.u64 %r70,[%frame+120]; .loc 1 194 12 shl.b64 %r71,%r70,2; .loc 1 195 22 ld.u64 %r73,[%frame]; .loc 1 195 12 shl.b64 %r74,%r73,2; .loc 1 197 32 ld.u64 %r61,[%frame+240]; .loc 1 204 23 mul.lo.u64 %r311,%r61,%r70; .loc 1 204 9 shl.b64 %r312,%r311,2; neg.s64 %r149,%r312; .loc 1 205 23 mul.lo.u64 %r313,%r61,%r73; .loc 1 205 9 shl.b64 %r314,%r313,2; neg.s64 %r132,%r314; setp.gt.s64 %r360,%r114,0; setp.le.s64 %r361,%r112,1; .loc 1 157 9 mov.u32 %r363,1; setp.eq.u32 %r364,%r188,0; cvt.u32.u32 %r365,%r22; cvt.s64.s8 %r366,%r365; add.u64 %r367,%r366,-1; add.u64 %r368,%frame,120; add.u64 %r369,%frame,240; bra $L25; $L24: add.u64 %r168,%frame,360; add.u64 %r167,%r185,40; mov.u64 %r165,%frame; cvt.u32.u32 %r316,%r22; cvt.s64.s8 %r315,%r316; add.u64 %r156,%r315,-1; .loc 1 132 3 mov.u64 %r126,0; add.u64 %r358,%frame,240; .loc 1 134 16 mov.u64 %r317,%r126; $L26: st.u64 [%r168],%r317; .loc 1 135 18 ld.u64 %r318,[%r167]; st.u64 [%r165],%r318; .loc 1 136 17 shl.b64 %r320,%r126,3; add.u64 %r321,%r358,%r320; .loc 1 136 10 ld.u64 %r322,[%r321]; setp.le.s64 %r323,%r322,0; @ %r323 bra $L1; .loc 1 132 26 add.u64 %r126,%r126,1; .loc 1 132 3 add.u64 %r168,%r168,8; add.u64 %r167,%r167,24; add.u64 %r165,%r165,8; setp.ne.u64 %r324,%r126,%r156; @ %r324 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r360 bra $L43; .loc 1 159 10 mov.u32 %r326,0; st.u32 [%r125],%r326; bra $L29; $L43: mov.u64 %r120,%r124; .loc 1 168 19 mov.u64 %r119,0; $L28: .loc 1 170 7 ld.f32 %r105,[%r120]; .loc 1 170 6 setp.ge.f32 %r327,%r105,0fff800000; @ ! %r327 bra $L57; .loc 1 173 14 cvt.u32.u64 %r328,%r119; add.u32 %r103,%r328,1; .loc 1 180 6 setp.le.s64 %r329,%r131,%r119; @ %r329 bra $L32; add.u64 %r171,%r119,1; .loc 1 170 7 mov.f32 %r183,%r105; bra $L33; $L57: .loc 1 168 34 add.u64 %r119,%r119,1; .loc 1 168 42 add.u64 %r120,%r120,%r66; .loc 1 168 12 setp.lt.s64 %r330,%r119,%r131; @ %r330 bra $L28; .loc 1 157 9 mov.u32 %r103,%r363; bra $L32; $L33: .loc 1 182 7 @ %r364 bra $L34; .loc 1 182 6 setp.le.f32 %r332,%r105,%r183; @ %r332 bra $L35; bra $L36; $L34: setp.lt.f32 %r333,%r105,%r183; @ ! 
%r333 bra $L36; $L35: .loc 1 185 14 cvt.u32.u64 %r103,%r171; .loc 1 184 14 mov.f32 %r105,%r183; $L36: .loc 1 180 31 add.u64 %r120,%r120,%r66; .loc 1 180 6 add.u64 %r172,%r171,1; setp.eq.u64 %r334,%r131,%r171; @ %r334 bra $L32; .loc 1 182 14 ld.f32 %r183,[%r120]; mov.u64 %r171,%r172; bra $L33; $L32: .loc 1 189 12 st.u32 [%r125],%r103; $L29: .loc 1 193 15 add.u64 %r135,%r135,1; .loc 1 194 12 add.u64 %r124,%r124,%r71; .loc 1 195 12 add.u64 %r125,%r125,%r74; .loc 1 197 13 setp.ne.u64 %r335,%r61,%r135; @ %r335 bra $L25; .loc 1 204 9 add.u64 %r121,%r124,%r149; .loc 1 205 9 add.u64 %r122,%r125,%r132; .loc 1 207 7 @ %r361 bra $L1; add.u64 %r137,%frame,368; mov.u64 %r180,8; .loc 1 206 5 mov.u64 %r123,1; .loc 1 201 13 mov.u64 %r362,0; bra $L40; $L41: st.u64 [%r137],%r362; .loc 1 204 23 mul.lo.u64 %r341,%r87,%r86; .loc 1 204 9 shl.b64 %r342,%r341,2; sub.u64 %r121,%r124,%r342; .loc 1 205 23 mul.lo.u64 %r343,%r90,%r86; .loc 1 205 9 shl.b64 %r344,%r343,2; sub.u64 %r122,%r125,%r344; .loc 1 206 5 add.u64 %r123,%r123,1; .loc 1 207 7 add.u64 %r137,%r137,8; add.u64 %r180,%r180,8; setp.eq.u64 %r345,%r123,%r367; @ %r345 bra $L1; $L40: .loc 1 215 16 ld.u64 %r346,[%r137]; add.u64 %r86,%r346,1; st.u64 [%r137],%r86; .loc 1 216 23 add.u64 %r348,%r368,%r180; ld.u64 %r87,[%r348]; .loc 1 216 13 shl.b64 %r349,%r87,2; add.u64 %r124,%r121,%r349; .loc 1 217 23 add.u64 %r350,%frame,%r180; ld.u64 %r90,[%r350]; .loc 1 217 13 shl.b64 %r351,%r90,2; add.u64 %r125,%r122,%r351; .loc 1 197 32 add.u64 %r353,%r369,%r180; ld.u64 %r94,[%r353]; .loc 1 197 13 setp.eq.u64 %r354,%r86,%r94; @ %r354 bra $L41; .loc 1 201 13 mov.u64 %r135,0; bra $L25; $L56: .loc 1 90 6 ld.u64 %r355,[%r185]; setp.eq.u64 %r356,%r355,0; @ ! %r356 bra $L11; bra $L42; $L1:_gfortran_mmaxloc1_4_r4 .visible .func _gfortran_mmaxloc1_4_r, .param .u32 %in_ar4) {608u32 %r267; .reg .u32 %r269; .reg .u32 %r270; .reg .u32 %r271; .reg .u32 %r273; .reg .u32predu64 %r293; .reg .predu64 %r302; .reg .u64 %r303; .reg .u64 %r305; .reg .u64 %r306; .reg .u64u64 %r327; .reg .u32u64 %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64pred %r339; .reg .u64 %r341; .reg .u64 %r343; .reg .pred %r345; .reg .predpred %r383; .reg .u64 %r384; .reg .u64 %r386; .reg .u32 %r387; .reg .pred %r388; .reg .u64 %r391; .reg .u64pred %r409; .reg .u64 %r410; .reg .u64 %r411; .reg .u64 %r412; .reg .pred %r414; .reg .u32pred %r418; .reg .pred %r419; .reg .pred %r422; .reg .pred %r423; .reg .pred %r425; .reg .pred %r426; .reg .predu64predmov.u64 %r223,%ar0; mov.u64 %r224,%ar1; mov.u64 %r225,%ar2; mov.u64 %r226,%ar3; mov.u32 %r227,%ar4; .loc 1 251 6 setp.ne.u64 %r228,%r226,0; @ %r228 bra $L227; call _gfortran_maxloc1_4_r47 bra $L58; $L59: .loc 1 261 10 ld.u64 %r22,[%r225]; .loc 1 261 7 add.u64 %r159,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r224+28]; .loc 1 262 38 add.u32 %r233,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r160,%r233; .loc 1 265 7 shr.u64 %r235,%r159,63; set.u32.gt.s64 %r237,%r159,%r160; neg.s32 %r238,%r237; cvt.u16.u64 %r241,%r235; cvt.u16.u32 %r242,%r238; or.b16 %r240,%r241,%r242; .loc 1 265 6 cvt.u32.u16 %r243,%r240; cvt.u16.u8 %r244,%r243; setp.eq.u16 %r245,%r244,0; @ %r245 bra $L61; .loc 1 267 7 cvt.u32.u32 %r248,%r23; cvt.s64.s8 %r247,%r248; st.u64 [%stack+8],%r247; st.u64 [%stack],%r22; cvta.const.u64 %r246246_gfortran_runtime_error61: .loc 1 272 9 add.u64 %r456,%r159,%r159; add.u64 %r457,%r456,%r159; shl.b64 %r253,%r457,3; add.u64 %r254,%r224,%r253; ld.u64 %r257,[%r254+56]; add.u64 %r256,%r257,1; .loc 1 272 7 ld.u64 %r264,[%r254+48]; sub.u64 %r161,%r256,%r264; .loc 1 273 6 setp.le.s64 
%r265,%r161,0; @ %r265 bra $L58; .loc 1 276 9 ld.u64 %r180,[%r226]; .loc 1 278 15 ld.u64 %r34,[%r226+16]; .loc 1 280 22 cvt.u32.u64 %r35,%r34; .loc 1 280 53 add.u32 %r266,%r35,-4; and.b32 %r267,%r266,-5; set.u32.eq.u32 %r269,%r267,0; neg.s32 %r270,%r269; .loc 1 280 22 add.u32 %r271,%r35,-1; set.u32.le.u32 %r273,%r271,1; neg.s32 %r274,%r273; .loc 1 280 6 cvt.u16.u32 %r276,%r270; cvt.u16.u32 %r277,%r274; or.b16 %r275,%r276,%r277; cvt.u32.u16 %r278,%r275; cvt.u16.u8 %r279,%r278; setp.ne.u16 %r280,%r279,0; @ %r280 bra $L63; .loc 1 282 7 setp.ne.u32 %r282,%r35,16; @ %r282 bra $L64; $L63: .loc 1 289 9 shl.b64 %r286,%r457,3; add.u64 %r287,%r224,%r286; ld.u64 %r164,[%r287+40]; .loc 1 290 12 add.u64 %r293,%r226,%r286; ld.u64 %r41,[%r293+40]; .loc 1 292 3 setp.ne.u64 %r295,%r159,0; @ %r295 bra $L65; $L71: .loc 1 302 3 setp.lt.s64 %r296,%r159,%r160; @ %r296 bra $L66; bra $L129; $L64: .loc 1 287 5 cvta.const.u64 %r2977_gfortran_runtime_error65: add.u64 %r91,%r224,40; add.u64 %r40,%r226,40; add.u64 %r300,%r22,%r22; add.u64 %r301,%r300,%r22; shl.b64 %r302,%r301,3; add.u64 %r303,%r224,16; add.u64 %r186,%r302,%r303; .loc 1 292 3 mov.u64 %r148,0; add.u64 %r453,%frame,240; add.u64 %r452,%frame,360; .loc 1 299 12 mov.u64 %r467,%r148; $L70: .loc 1 294 18 add.u64 %r305,%r453,%r148; ld.u64 %r306,[%r91]; st.u64 [%r305],%r306; .loc 1 295 18 add.u64 %r307,%frame,%r148; .loc 1 295 20 ld.u64 %r309,[%r40]; mul.lo.u64 %r308,%r309,%r34; .loc 1 295 18 st.u64 [%r307],%r308; .loc 1 296 19 ld.u64 %r311,[%r91+16]; add.u64 %r310,%r311,1; ld.u64 %r312,[%r91+8]; sub.u64 %r52,%r310,%r312; .loc 1 298 10 setp.lt.s64 %r313,%r52,0; @ %r313 bra $L68; .loc 1 296 17 add.u64 %r315,%r452,%r148; st.u64 [%r315],%r52; bra $L69; $L68: .loc 1 299 12 add.u64 %r317,%r452,%r148; st.u64 [%r317],%r467; $L69: .loc 1 292 3 add.u64 %r91,%r91,24; add.u64 %r40,%r40,24; add.u64 %r148,%r148,8; setp.ne.u64 %r319,%r186,%r91; @ %r319 bra $L70; bra $L71; $L129: .loc 1 312 6 ld.u64 %r320,[%r223]; setp.eq.u64 %r321,%r320,0; @ ! 
%r321 bra $L73; bra $L72; $L66: add.u64 %r323,%r22,%r22; add.u64 %r324,%r323,%r22; shl.b64 %r325,%r324,3; add.u64 %r116,%r325,40; add.u64 %r133,%r224,%r116; add.u64 %r87,%r226,%r116; shl.b64 %r326,%r22,3; add.u64 %r157,%r326,-8; cvt.u32.u32 %r328,%r23; cvt.s64.s8 %r327,%r328; shl.b64 %r329,%r327,3; add.u64 %r96,%r329,-8; add.u64 %r453,%frame,240; add.u64 %r452,%frame,360; .loc 1 309 12 mov.u64 %r466,0; $L76: .loc 1 304 18 add.u64 %r331,%r453,%r157; ld.u64 %r332,[%r133]; st.u64 [%r331],%r332; .loc 1 305 18 add.u64 %r333,%frame,%r157; .loc 1 305 20 ld.u64 %r335,[%r87]; mul.lo.u64 %r334,%r335,%r34; .loc 1 305 18 st.u64 [%r333],%r334; .loc 1 306 19 ld.u64 %r337,[%r133+16]; add.u64 %r336,%r337,1; ld.u64 %r338,[%r133+8]; sub.u64 %r61,%r336,%r338; .loc 1 308 10 setp.lt.s64 %r339,%r61,0; @ %r339 bra $L74; .loc 1 306 17 add.u64 %r341,%r452,%r157; st.u64 [%r341],%r61; bra $L75; $L74: .loc 1 309 12 add.u64 %r343,%r452,%r157; st.u64 [%r343],%r466; $L75: .loc 1 302 3 add.u64 %r133,%r133,24; add.u64 %r87,%r87,24; add.u64 %r157,%r157,8; setp.ne.u64 %r345,%r96,%r157; @ %r345 bra $L76; bra $L130; $L72: .loc 1 316 7 setp.eq.u64 %r346,%r160,0; @ %r346 bra $L78; add.u64 %r452,%frame,360; $L106: add.u64 %r146,%r223,40; mov.u64 %r143,%r452; cvt.u32.u32 %r348,%r23; cvt.s64.s8 %r347,%r348; add.u64 %r350,%r347,%r347; add.u64 %r351,%r350,%r347; shl.b64 %r352,%r351,3; add.u64 %r353,%r223,16; add.u64 %r134,%r352,%r353; .loc 1 319 10 mov.u64 %r139,1; .loc 1 323 4 mov.u64 %r372,0; bra $L79; $L78: .loc 1 327 20 add.u32 %r354,%r23,-2; cvt.s64.s32 %r71,%r354; add.u64 %r356,%r71,%r71; add.u64 %r357,%r356,%r71; shl.b64 %r358,%r357,3; add.u64 %r359,%r223,%r358; .loc 1 327 67 shl.b64 %r361,%r71,3; add.u64 %r362,%frame,%r361; .loc 1 327 59 ld.u64 %r364,[%r359+40]; ld.u64 %r365,[%r362+360]; mul.lo.u64 %r168,%r364,%r365; .loc 1 329 24 mov.u64 %r366,0; st.u64 [%r223+8],%r366; .loc 1 330 28 cvt.u16.u32 %r369,%r23; add.u16 %r368,%r369,-1; cvt.u32.u16 %r370,%r368; st.u8 [%r223+28],%r370; .loc 1 332 10 setp.eq.u64 %r371,%r168,0; @ %r371 bra $L80; bra $L131; $L82: .loc 1 321 47 mul.lo.u64 %r139,%r68,%r139; $L79: .loc 1 323 4 st.u64 [%r146+8],%r372; ld.u64 %r68,[%r143]; add.u64 %r373,%r68,-1; st.u64 [%r146+16],%r373; st.u64 [%r146],%r139; .loc 1 316 7 add.u64 %r146,%r146,24; add.u64 %r143,%r143,8; setp.ne.u64 %r374,%r134,%r146; @ %r374 bra $L82; bra $L78; $L80: .loc 1 335 4 st.u64 [%r223+48],%r168; mov.u64 %r376,-1; st.u64 [%r223+56],%r376; mov.u64 %r377,1; st.u64 [%r223+40],%r377; .loc 1 336 4 bra $L58; $L131: .loc 1 339 24call (%value_in),_gfortrani_xmallocarray80,[%value_in]; } .loc 1 339 22 st.u64 [%r223],%r380; bra $L83; $L73: .loc 1 344 19 ld.s8 %r382,[%r223+28]; .loc 1 344 10 setp.eq.u64 %r383,%r382,%r160; @ %r383 bra $L84; .loc 1 345 2 cvta.const.u64 %r3843stack; call _gfortran_runtime_error84: .loc 1 347 11 cvta.global.u64 %r386,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r387,[%r386+36]; setp.eq.u32 %r388,%r387,0; @ %r388 bra $L83; .loc 1 349 4 add.u64 %r452,%frame,360; cvta.const.u64 %r392,$LC239392; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r396,$LC396392; call _gfortrani_bounds_equal_extents$L83: .loc 1 356 3 setp.ne.u64 %r398,%r160,0; @ %r398 bra $L85; $L88: .loc 1 364 8 ld.u64 %r181,[%r223]; .loc 1 365 8 ld.u64 %r179,[%r224]; .loc 1 367 9 setp.ne.u64 %r399,%r179,0; @ %r399 bra $L86; bra $L58; $L85: add.u64 %r199,%frame,480; add.u64 %r195,%r223,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r401,%r23; cvt.s64.s8 %r400,%r401; add.u64 %r149,%r400,-1; .loc 1 356 3 mov.u64 %r182,0; add.u64 %r452,%frame,360; 
.loc 1 358 16 mov.u64 %r402,%r182; $L87: st.u64 [%r199],%r402; .loc 1 359 18 ld.u64 %r403,[%r195]; st.u64 [%r187],%r403; .loc 1 360 17 shl.b64 %r405,%r182,3; add.u64 %r406,%r452,%r405; .loc 1 360 10 ld.u64 %r407,[%r406]; setp.le.s64 %r408,%r407,0; @ %r408 bra $L58; .loc 1 356 26 add.u64 %r182,%r182,1; .loc 1 356 3 add.u64 %r199,%r199,8; add.u64 %r195,%r195,24; add.u64 %r187,%r187,8; setp.ne.u64 %r409,%r149,%r182; @ %r409 bra $L87; bra $L88; $L86: .loc 1 290 12 mul.lo.u64 %r43,%r41,%r34; .loc 1 386 32 shl.b64 %r86,%r164,2; ld.u64 %r39,[%frame+480]; .loc 1 430 22 ld.u64 %r100,[%frame+240]; .loc 1 430 12 shl.b64 %r101,%r100,2; .loc 1 431 23 ld.u64 %r103,[%frame]; .loc 1 432 22 ld.u64 %r105,[%frame+120]; .loc 1 432 12 shl.b64 %r106,%r105,2; .loc 1 434 32 ld.u64 %r172,[%frame+360]; .loc 1 441 23 mul.lo.u64 %r38,%r172,%r100; .loc 1 442 24 mul.lo.u64 %r410,%r172,%r103; .loc 1 442 10 neg.s64 %r94,%r410; .loc 1 443 23 mul.lo.u64 %r411,%r172,%r105; .loc 1 443 9 shl.b64 %r412,%r411,2; neg.s64 %r140,%r412; setp.le.s64 %r451,%r160,1; .loc 1 408 9 setp.ne.u32 %r459,%r227,0; cvt.u32.u32 %r460,%r23; cvt.s64.s8 %r461,%r460; add.u64 %r462,%r461,-1; add.u64 %r463,%frame,240; add.u64 %r464,%frame,360; add.u64 %r465,%frame,120; $L107: .loc 1 356 3 mov.u64 %r174,%r180; mov.u64 %r173,%r179; .loc 1 383 16 mov.u32 %r147,0; .loc 1 386 9 mov.u64 %r171,0; $L94: .loc 1 389 7 ld.s8 %r222,[%r174]; .loc 1 389 6 setp.eq.u32 %r454,%r222,0; @ %r454 bra $L89; .loc 1 392 10 setp.ne.u32 %r414,%r147,0; @ %r414 bra $L90; cvt.u32.u64 %r415,%r171; add.u32 %r147,%r415,1; $L90: .loc 1 394 11 ld.f32 %r145,[%r173]; .loc 1 394 10 setp.ge.f32 %r416,%r145,0fff800000; @ ! %r416 bra $L89; .loc 1 398 11 cvt.u32.u64 %r417,%r171; add.u32 %r170,%r417,1; .loc 1 404 9 setp.le.s64 %r418,%r161,%r171; @ ! %r418 bra $L132; bra $L92; $L89: .loc 1 386 24 add.u64 %r171,%r171,1; .loc 1 386 32 add.u64 %r173,%r173,%r86; .loc 1 386 47 add.u64 %r174,%r174,%r43; .loc 1 386 2 setp.ne.u64 %r419,%r161,%r171; @ %r419 bra $L94; bra $L92; $L132: .loc 1 408 9 @ %r459 bra $L95; add.u64 %r205,%r171,1; .loc 1 398 11 mov.u32 %r147,%r170; bra $L96; $L95: add.u64 %r209,%r171,1; mov.u32 %r147,%r170; $L99: .loc 1 411 8 @ %r454 bra $L97; .loc 1 411 18 ld.f32 %r89,[%r173]; .loc 1 411 15 setp.ge.f32 %r422,%r89,%r145; @ ! %r422 bra $L97; .loc 1 414 16 cvt.u32.u64 %r147,%r209; .loc 1 413 16 mov.f32 %r145,%r89; $L97: .loc 1 409 33 add.u64 %r173,%r173,%r86; .loc 1 409 48 add.u64 %r174,%r174,%r43; .loc 1 409 8 add.u64 %r210,%r209,1; setp.eq.u64 %r423,%r161,%r209; @ %r423 bra $L92; .loc 1 411 9 ld.s8 %r222,[%r174]; mov.u64 %r209,%r210; setp.eq.u32 %r454,%r222,0; bra $L99; $L96: .loc 1 420 8 @ %r454 bra $L100; .loc 1 420 18 ld.f32 %r93,[%r173]; .loc 1 420 15 setp.gt.f32 %r425,%r93,%r145; @ ! 
%r425 bra $L100; .loc 1 423 16 cvt.u32.u64 %r147,%r205; .loc 1 422 16 mov.f32 %r145,%r93; $L100: .loc 1 418 33 add.u64 %r173,%r173,%r86; .loc 1 418 48 add.u64 %r174,%r174,%r43; .loc 1 418 8 add.u64 %r206,%r205,1; setp.eq.u64 %r426,%r161,%r205; @ %r426 bra $L92; .loc 1 420 9 ld.s8 %r222,[%r174]; mov.u64 %r205,%r206; setp.eq.u32 %r454,%r222,0; bra $L96; $L92: .loc 1 426 8 st.u32 [%r181],%r147; .loc 1 429 15 add.u64 %r39,%r39,1; .loc 1 430 12 add.u64 %r179,%r179,%r101; .loc 1 431 13 add.u64 %r180,%r180,%r103; .loc 1 432 12 add.u64 %r181,%r181,%r106; .loc 1 434 13 setp.ne.u64 %r427,%r39,%r172; @ %r427 bra $L107; .loc 1 442 10 add.u64 %r176,%r180,%r94; .loc 1 443 9 add.u64 %r177,%r181,%r140; .loc 1 445 7 @ %r451 bra $L58; add.u64 %r158,%frame,488; .loc 1 441 23 mov.u64 %r110,%r38; .loc 1 445 7 mov.u64 %r219,8; .loc 1 444 5 mov.u64 %r178,1; .loc 1 438 13 mov.u64 %r458,0; bra $L104; $L105: st.u64 [%r158],%r458; .loc 1 441 23 mul.lo.u64 %r110,%r124,%r123; .loc 1 442 24 mul.lo.u64 %r433,%r126,%r123; .loc 1 442 10 sub.u64 %r176,%r180,%r433; .loc 1 443 23 mul.lo.u64 %r434,%r128,%r123; .loc 1 443 9 shl.b64 %r435,%r434,2; sub.u64 %r177,%r181,%r435; .loc 1 444 5 add.u64 %r178,%r178,1; .loc 1 445 7 add.u64 %r158,%r158,8; add.u64 %r219,%r219,8; setp.eq.u64 %r436,%r178,%r462; @ %r436 bra $L58; $L104: .loc 1 453 16 ld.u64 %r437,[%r158]; add.u64 %r123,%r437,1; st.u64 [%r158],%r123; .loc 1 454 23 add.u64 %r439,%r463,%r219; ld.u64 %r124,[%r439]; .loc 1 454 13 sub.u64 %r440,%r124,%r110; shl.b64 %r441,%r440,2; add.u64 %r179,%r179,%r441; .loc 1 455 24 add.u64 %r442,%frame,%r219; ld.u64 %r126,[%r442]; .loc 1 455 14 add.u64 %r180,%r176,%r126; .loc 1 456 23 add.u64 %r444,%r465,%r219; ld.u64 %r128,[%r444]; .loc 1 456 13 shl.b64 %r445,%r128,2; add.u64 %r181,%r177,%r445; .loc 1 434 32 add.u64 %r447,%r464,%r219; ld.u64 %r131,[%r447]; .loc 1 434 13 setp.eq.u64 %r448,%r123,%r131; @ %r448 bra $L105; .loc 1 438 13 mov.u64 %r39,0; bra $L107; $L130: .loc 1 312 6 ld.u64 %r449,[%r223]; setp.eq.u64 %r450,%r449,0; @ ! 
%r450 bra $L73; bra $L106; $L5_gfortran_smaxloc1_4_r4 .visible .func _gfortran_smaxloc1_4_r, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u32 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L134; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L135; $L134:r152154; call _gfortran_maxloc1_4_r4490 7 bra $L133; $L135: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r296,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L137; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L138; $L144: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L139; bra $L192; $L137: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error138: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r302,0; $L143: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L141; .loc 1 505 17 st.u64 [%r139],%r36; bra $L142; $L141: .loc 1 508 12 st.u64 [%r139],%r302; $L142: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L143; bra $L144; $L192: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L146; bra $L145; $L139: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r301,0; $L149: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L147; .loc 1 513 17 st.u64 [%r43],%r41; bra $L148; $L147: .loc 1 517 12 st.u64 [%r43],%r301; $L148: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L149; bra $L193; $L145: .loc 1 524 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L151; add.u64 %r295,%frame,120; $L171: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L152; $L151: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L153; bra $L194; $L155: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L152: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L155; bra $L151; $L153: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L133; $L194: .loc 1 547 2496246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L159: .loc 1 573 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L156; bra $L157; $L146: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L158; .loc 1 552 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error158: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L159; .loc 1 559 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L157; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L161: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L160; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error160: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L161; $L156: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ %r268 bra 
$L162; shl.b64 %r266,%r296,3; bra $L163; $L162: mov.u64 %r266,8; $L163: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L164: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L164; $L157: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,2; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 583 13 mov.u32 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L170: st.u32 [%r98],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L170; .loc 1 596 7 @ ! %r294 bra $L195; bra $L133; $L169: .loc 1 591 13 st.u64 [%r99],%r297; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L168; bra $L133; $L195: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; .loc 1 591 13 mov.u64 %r297,0; $L168: .loc 1 600 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,2; add.u64 %r98,%r98,%r291; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L169; .loc 1 591 13 mov.u64 %r80,0; bra $L170; $L193: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r293,%r98,0; @ ! 
%r293 bra $L146; bra $L171; $L133: .loc 1 605 1 ret; } maxloc1_8_r4.o/_gfortran_maxloc1_8_r4 .visible .func _gfortran_maxloc1_8_rfortran/generated/maxloc1_8_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_8_r4 .visible .func _gfortran_mmaxloc1_8_r_gfortran_smaxloc1_8_r4 .visible .func _gfortran_smaxloc1_8_rVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_maxloc1_8_r4 .visible .func _gfortran_maxloc1_8_r4480pred %r221; .reg .pred %r222; .reg .u64u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .pred %r232; .reg .predu64 %r280; .reg .u64 %r282; .reg .u64pred64predu64 %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64pred %r345; .reg .u64u64 %r353; .reg .pred %r354; .reg .pred %r355; .reg .u64 %r356; .reg .predpred %r361; .reg .pred %r362; .reg .predmov.u64 %r182,%ar0; mov.u64 %r183,%ar1; mov.u64 %r184,%ar2; mov.u32 %r185,%ar3; .loc 1 58 10 ld.s8 %r22,[%r183+28]; .loc 1 58 38 add.u32 %r186,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r112,%r186; .loc 1 59 10 ld.u64 %r25,[%r184]; .loc 1 59 7 add.u64 %r113,%r25,-1; .loc 1 61 7 shr.u64 %r188,%r113,63; set.u32.lt.s64 %r190,%r112,%r113; neg.s32 %r191,%r190; cvt.u16.u64 %r194,%r188; cvt.u16.u32 %r195,%r191; or.b16 %r193,%r194,%r195; .loc 1 61 6 cvt.u32.u16 %r196,%r193; cvt.u16.u8 %r197,%r196; setp.eq.u16 %r198,%r197,0; @ %r198 bra $L2; .loc 1 63 7 cvt.u32.u32 %r201,%r22; cvt.s64.s8 %r200,%r201; st.u64 [%stack+8],%r20019_gfortran_runtime_errorr204,%r113,%r113; add.u64 %r205,%r204,%r113; shl.b64 %r206,%r205,3; add.u64 %r207,%r183,%r206; ld.u64 %r31,[%r207+56]; ld.u64 %r33,[%r207+48]; .loc 1 71 9 ld.u64 %r116,[%r207+40]; .loc 1 73 3 setp.ne.u64 %r221,%r113,0; @ %r221 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r222,%r112,%r113; @ %r222 bra $L4; bra $L56; $L3: add.u64 %r99,%r183,40; add.u64 %r130,%frame,120; add.u64 %r181,%frame,240; add.u64 %r223,%r183,16; add.u64 %r225,%r25,%r25; add.u64 %r226,%r225,%r25; shl.b64 %r227,%r226,3; add.u64 %r72,%r223,%r227; .loc 1 79 12 mov.u64 %r372,0; $L8: .loc 1 75 18 ld.u64 %r228,[%r99]; st.u64 
[%r130],%r228; .loc 1 76 19 ld.u64 %r230,[%r99+16]; add.u64 %r229,%r230,1; ld.u64 %r231,[%r99+8]; sub.u64 %r38,%r229,%r231; .loc 1 78 10 setp.lt.s64 %r232,%r38,0; @ %r232 bra $L6; .loc 1 76 17 st.u64 [%r181],%r38; bra $L7; $L6: .loc 1 79 12 st.u64 [%r181],%r372; $L7: .loc 1 73 3 add.u64 %r99,%r99,24; add.u64 %r130,%r130,8; add.u64 %r181,%r181,8; setp.ne.u64 %r234,%r72,%r99; @ %r234 bra $L8; bra $L9; $L56: .loc 1 90 6 ld.u64 %r235,[%r182]; setp.eq.u64 %r236,%r235,0; @ ! %r236 bra $L11; bra $L10; $L4: add.u64 %r238,%r25,%r25; add.u64 %r239,%r238,%r25; shl.b64 %r240,%r239,3; add.u64 %r241,%r240,40; add.u64 %r136,%r183,%r241; shl.b64 %r242,%r25,3; add.u64 %r108,%r242,-8; add.u64 %r360,%frame,120; add.u64 %r110,%r360,%r108; add.u64 %r359,%frame,240; add.u64 %r104,%r359,%r108; cvt.u32.u32 %r246,%r22; cvt.s64.s8 %r245,%r246; add.u64 %r248,%r245,%r245; add.u64 %r249,%r248,%r245; shl.b64 %r250,%r249,3; add.u64 %r251,%r183,40; add.u64 %r58,%r250,%r251; .loc 1 87 12 mov.u64 %r371,0; $L14: .loc 1 83 18 ld.u64 %r252,[%r136]; st.u64 [%r110],%r252; .loc 1 84 19 ld.u64 %r254,[%r136+16]; add.u64 %r253,%r254,1; ld.u64 %r255,[%r136+8]; sub.u64 %r46,%r253,%r255; .loc 1 86 10 setp.lt.s64 %r256,%r46,0; @ %r256 bra $L12; .loc 1 84 17 st.u64 [%r104],%r46; bra $L13; $L12: .loc 1 87 12 st.u64 [%r104],%r371; $L13: .loc 1 81 3 add.u64 %r136,%r136,24; add.u64 %r110,%r110,8; add.u64 %r104,%r104,8; setp.ne.u64 %r258,%r58,%r136; @ %r258 bra $L14; bra $L57; $L10: .loc 1 94 7 setp.le.s64 %r259,%r112,0; @ %r259 bra $L16; add.u64 %r359,%frame,240; $L42: add.u64 %r153,%r182,40; mov.u64 %r146,%r359; cvt.u32.u32 %r261,%r22; cvt.s64.s8 %r260,%r261; add.u64 %r263,%r260,%r260; add.u64 %r264,%r263,%r260; shl.b64 %r265,%r264,3; add.u64 %r266,%r182,16; add.u64 %r138,%r265,%r266; .loc 1 97 10 mov.u64 %r103,1; .loc 1 101 4 mov.u64 %r289,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r267,0; st.u64 [%r182+8],%r267; .loc 1 106 28 cvt.u16.u32 %r270,%r22; add.u16 %r269,%r270,-1; cvt.u32.u16 %r271,%r269; st.u8 [%r182+28],%r271; .loc 1 108 20 add.u32 %r272,%r22,-2; cvt.s64.s32 %r56,%r272; add.u64 %r274,%r56,%r56; add.u64 %r275,%r274,%r56; shl.b64 %r276,%r275,3; add.u64 %r277,%r182,%r276; .loc 1 108 67 shl.b64 %r279,%r56,3; add.u64 %r280,%frame,%r279; .loc 1 108 59 ld.u64 %r282,[%r277+40]; ld.u64 %r283,[%r280+240]; mul.lo.u64 %r118,%r282,%r283; .loc 1 110 29 mov.u64 %r28511r285; call (%value_in),_gfortrani_xmallocarray286,[%value_in]; } .loc 1 110 27 st.u64 [%r182],%r286; .loc 1 111 10 setp.eq.u64 %r288,%r118,0; @ ! 
%r288 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r103,%r52,%r103; $L17: .loc 1 101 4 st.u64 [%r153+8],%r289; ld.u64 %r52,[%r146]; add.u64 %r290,%r52,-1; st.u64 [%r153+16],%r290; st.u64 [%r153],%r103; .loc 1 94 7 add.u64 %r153,%r153,24; add.u64 %r146,%r146,8; setp.ne.u64 %r291,%r138,%r153; @ %r291 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r182+48],%r118; mov.u64 %r293,-1; st.u64 [%r182+56],%r293; mov.u64 %r294,1; st.u64 [%r182+40],%r294; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r63,[%r182+28]; .loc 1 121 10 setp.eq.u64 %r295,%r63,%r112; @ %r295 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r112; st.u64 [%stack],%r63;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r298,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r299,[%r298+36]; setp.eq.u32 %r300,%r299,0; @ %r300 bra $L21; .loc 1 128 2 add.u64 %r359,%frame,240; cvta.const.u64 %r304,$LC2r359303304; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r306,%r112,0; @ %r306 bra $L24; $L27: .loc 1 68 9 add.u64 %r307,%r31,1; .loc 1 68 7 sub.u64 %r114,%r307,%r33; max.s64 %r132,%r114,0; .loc 1 140 8 ld.u64 %r126,[%r183]; .loc 1 141 8 ld.u64 %r127,[%r182]; .loc 1 168 42 shl.b64 %r71,%r116,2; ld.u64 %r48,[%frame+360]; .loc 1 194 22 ld.u64 %r73,[%frame+120]; .loc 1 194 12 shl.b64 %r75,%r73,2; .loc 1 195 22 ld.u64 %r76,[%frame]; .loc 1 195 12 shl.b64 %r78,%r76,3; setp.gt.s64 %r361,%r114,0; setp.le.s64 %r362,%r112,1; setp.eq.u32 %r364,%r185,0; .loc 1 159 10 mov.u64 %r365,0; cvt.u32.u32 %r366,%r22; cvt.s64.s8 %r367,%r366; add.u64 %r368,%r367,-1; add.u64 %r369,%frame,120; add.u64 %r370,%frame,240; bra $L25; $L24: add.u64 %r166,%frame,360; add.u64 %r165,%r182,40; mov.u64 %r163,%frame; cvt.u32.u32 %r309,%r22; cvt.s64.s8 %r308,%r309; add.u64 %r154,%r308,-1; .loc 1 132 3 mov.u64 %r128,0; add.u64 %r359,%frame,240; .loc 1 134 16 mov.u64 %r310,%r128; $L26: st.u64 [%r166],%r310; .loc 1 135 18 ld.u64 %r311,[%r165]; st.u64 [%r163],%r311; .loc 1 136 17 shl.b64 %r313,%r128,3; add.u64 %r314,%r359,%r313; .loc 1 136 10 ld.u64 %r315,[%r314]; setp.le.s64 %r316,%r315,0; @ %r316 bra $L1; .loc 1 132 26 add.u64 %r128,%r128,1; .loc 1 132 3 add.u64 %r166,%r166,8; add.u64 %r165,%r165,24; add.u64 %r163,%r163,8; setp.ne.u64 %r317,%r128,%r154; @ %r317 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r361 bra $L43; .loc 1 159 10 st.u64 [%r127],%r365; bra $L29; $L43: mov.u64 %r121,%r126; .loc 1 168 19 mov.u64 %r105,0; $L28: .loc 1 170 7 ld.f32 %r106,[%r121]; mov.u64 %r137,%r105; .loc 1 168 34 add.u64 %r105,%r105,1; .loc 1 170 6 setp.ge.f32 %r320,%r106,0fff800000; @ ! 
%r320 bra $L58; .loc 1 180 6 setp.gt.s64 %r321,%r132,%r137; @ %r321 bra $L44; bra $L33; $L58: .loc 1 168 42 add.u64 %r121,%r121,%r71; .loc 1 168 12 setp.gt.s64 %r322,%r132,%r105; @ %r322 bra $L28; .loc 1 157 9 mov.u64 %r105,1; bra $L33; $L44: .loc 1 170 7 mov.f32 %r180,%r106; $L32: .loc 1 182 7 @ %r364 bra $L34; set.u32.le.f32 %r325,%r106,%r180; neg.s32 %r326,%r325; cvt.u32.u32 %r324,%r326; cvt.u32.u8 %r120,%r324; bra $L35; $L34: set.u32.lt.f32 %r328,%r106,%r180; neg.s32 %r329,%r328; cvt.u32.u32 %r327,%r329; cvt.u32.u8 %r120,%r327; bra $L35; $L59: .loc 1 182 14 ld.f32 %r180,[%r121]; bra $L32; $L33: .loc 1 189 12 st.u64 [%r127],%r105; $L29: .loc 1 193 15 add.u64 %r48,%r48,1; .loc 1 194 12 add.u64 %r126,%r126,%r75; .loc 1 195 12 add.u64 %r127,%r127,%r78; .loc 1 197 32 ld.u64 %r66,[%frame+240]; .loc 1 197 13 setp.ne.u64 %r331,%r66,%r48; @ %r331 bra $L25; .loc 1 204 23 mul.lo.u64 %r332,%r48,%r73; .loc 1 204 9 shl.b64 %r333,%r332,2; sub.u64 %r123,%r126,%r333; .loc 1 205 23 mul.lo.u64 %r334,%r48,%r76; .loc 1 205 9 shl.b64 %r335,%r334,3; sub.u64 %r124,%r127,%r335; .loc 1 207 7 @ %r362 bra $L1; add.u64 %r175,%frame,368; mov.u64 %r174,8; .loc 1 206 5 mov.u64 %r125,1; bra $L38; $L39: .loc 1 201 13 st.u64 [%r175],%r365; .loc 1 204 23 mul.lo.u64 %r341,%r90,%r89; .loc 1 204 9 shl.b64 %r342,%r341,2; sub.u64 %r123,%r126,%r342; .loc 1 205 23 mul.lo.u64 %r343,%r93,%r89; .loc 1 205 9 shl.b64 %r344,%r343,3; sub.u64 %r124,%r127,%r344; .loc 1 206 5 add.u64 %r125,%r125,1; .loc 1 207 7 add.u64 %r175,%r175,8; add.u64 %r174,%r174,8; setp.eq.u64 %r345,%r125,%r368; @ %r345 bra $L1; $L38: .loc 1 215 16 ld.u64 %r346,[%r175]; add.u64 %r89,%r346,1; st.u64 [%r175],%r89; .loc 1 216 23 add.u64 %r348,%r369,%r174; ld.u64 %r90,[%r348]; .loc 1 216 13 shl.b64 %r349,%r90,2; add.u64 %r126,%r123,%r349; .loc 1 217 23 add.u64 %r350,%frame,%r174; ld.u64 %r93,[%r350]; .loc 1 217 13 shl.b64 %r351,%r93,3; add.u64 %r127,%r124,%r351; .loc 1 197 32 add.u64 %r353,%r370,%r174; ld.u64 %r96,[%r353]; .loc 1 197 13 setp.eq.u64 %r354,%r89,%r96; @ %r354 bra $L39; .loc 1 201 13 mov.u64 %r48,0; bra $L25; $L35: .loc 1 185 14 add.u64 %r137,%r137,1; .loc 1 182 6 setp.ne.u32 %r355,%r120,0; .loc 1 184 14 selp.f32 %r106,%r180,%r106,%r355; selp.u64 %r105,%r137,%r105,%r355; .loc 1 180 31 add.u64 %r121,%r121,%r71; .loc 1 180 6 setp.eq.u64 %r330,%r132,%r137; @ ! %r330 bra $L59; bra $L33; $L57: .loc 1 90 6 ld.u64 %r356,[%r182]; setp.eq.u64 %r357,%r356,0; @ ! 
%r357 bra $L11; bra $L42; $L1:_gfortran_mmaxloc1_8_r4 .visible .func _gfortran_mmaxloc1_8_r, .param .u32 %in_ar4) {608pred %r218; .reg .u32 %r223; .reg .u6464u64pred %r373; .reg .u64 %r374; .reg .u64 %r376; .reg .u32 %r377; .reg .pred %r378; .reg .u64 %r381; .reg .u64 %r382; .reg .u64 %r386; .reg .pred %r388; .reg .pred %r389; .reg .u64 %r390; .reg .u32 %r391; .reg .u64pred %r398; .reg .pred %r399; .reg .pred %r401; .reg .pred %r402; .reg .pred %r403; .reg .pred %r404; .reg .pred %r407; .reg .pred %r408; .reg .pred %r410; .reg .pred %r411; .reg .pred.reg .u64 %r421; .reg .u64u64pred %r445; .reg .u64 %r446; .reg .predmov.u64 %r213,%ar0; mov.u64 %r214,%ar1; mov.u64 %r215,%ar2; mov.u64 %r216,%ar3; mov.u32 %r217,%ar4; .loc 1 251 6 setp.ne.u64 %r218,%r216,0; @ %r218 bra $L611r214215217; call _gfortran_maxloc1_8_r47 bra $L60; $L61: .loc 1 261 10 ld.u64 %r22,[%r215]; .loc 1 261 7 add.u64 %r152,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r214+28]; .loc 1 262 38 add.u32 %r223,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r153,%r223; .loc 1 265 7 shr.u64 %r225,%r152,63; set.u32.gt.s64 %r227,%r152,%r153; neg.s32 %r228,%r227; cvt.u16.u64 %r231,%r225; cvt.u16.u32 %r232,%r228; or.b16 %r230,%r231,%r232; .loc 1 265 6 cvt.u32.u16 %r233,%r230; cvt.u16.u8 %r234,%r233; setp.eq.u16 %r235,%r234,0; @ %r235 bra $L63; .loc 1 267 7 cvt.u32.u32 %r238,%r23; cvt.s64.s8 %r237,%r238; st.u64 [%stack+8],%r237; st.u64 [%stack],%r22; cvta.const.u64 %r236236_gfortran_runtime_error63: .loc 1 272 9 add.u64 %r442,%r152,%r152; add.u64 %r444,%r442,%r152; shl.b64 %r243,%r444,3; add.u64 %r244,%r214,%r243; ld.u64 %r247,[%r244+56]; add.u64 %r246,%r247,1; .loc 1 272 7 ld.u64 %r254,[%r244+48]; sub.u64 %r154,%r246,%r254; .loc 1 273 6 setp.le.s64 %r255,%r154,0; @ %r255 bra $L60; .loc 1 276 9 ld.u64 %r172,[%r216]; .loc 1 278 15 ld.u64 %r34,[%r216+16]; .loc 1 280 22 cvt.u32.u64 %r35,%r34; .loc 1 280 53 add.u32 %r256,%r35,-4; and.b32 %r257,%r256,-5; set.u32.eq.u32 %r259,%r257,0; neg.s32 %r260,%r259; .loc 1 280 22 add.u32 %r261,%r35,-1; set.u32.le.u32 %r263,%r261,1; neg.s32 %r264,%r263; .loc 1 280 6 cvt.u16.u32 %r266,%r260; cvt.u16.u32 %r267,%r264; or.b16 %r265,%r266,%r267; cvt.u32.u16 %r268,%r265; cvt.u16.u8 %r269,%r268; setp.ne.u16 %r270,%r269,0; @ %r270 bra $L65; .loc 1 282 7 setp.ne.u32 %r272,%r35,16; @ %r272 bra $L66; $L65: .loc 1 289 9 shl.b64 %r276,%r444,3; add.u64 %r277,%r214,%r276; ld.u64 %r157,[%r277+40]; .loc 1 290 12 add.u64 %r283,%r216,%r276; ld.u64 %r38,[%r283+40]; .loc 1 292 3 setp.ne.u64 %r285,%r152,0; @ %r285 bra $L67; $L73: .loc 1 302 3 setp.lt.s64 %r286,%r152,%r153; @ %r286 bra $L68; bra $L134; $L66: .loc 1 287 5_gfortran_runtime_error67: add.u64 %r205,%r214,40; add.u64 %r70,%r216,40; add.u64 %r289,%r214,16; add.u64 %r291,%r22,%r22; add.u64 %r292,%r291,%r22; shl.b64 %r293,%r292,3; add.u64 %r188,%r289,%r293; .loc 1 292 3 mov.u64 %r92,0; add.u64 %r441,%frame,240; add.u64 %r439,%frame,360; .loc 1 299 12 mov.u64 %r455,%r92; $L72: .loc 1 294 18 add.u64 %r295,%r441,%r92; ld.u64 %r296,[%r205]; st.u64 [%r295],%r296; .loc 1 295 18 add.u64 %r297,%frame,%r92; .loc 1 295 20 ld.u64 %r299,[%r70]; mul.lo.u64 %r298,%r299,%r34; .loc 1 295 18 st.u64 [%r297],%r298; .loc 1 296 19 ld.u64 %r301,[%r205+16]; add.u64 %r300,%r301,1; ld.u64 %r302,[%r205+8]; sub.u64 %r49,%r300,%r302; .loc 1 298 10 setp.lt.s64 %r303,%r49,0; @ %r303 bra $L70; .loc 1 296 17 add.u64 %r305,%r439,%r92; st.u64 [%r305],%r49; bra $L71; $L70: .loc 1 299 12 add.u64 %r307,%r439,%r92; st.u64 [%r307],%r455; $L71: .loc 1 292 3 add.u64 %r205,%r205,24; add.u64 %r70,%r70,24; add.u64 %r92,%r92,8; 
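// --- annotation, not part of the original object code ---
// The $L72 loop just above is the per-dimension descriptor setup of the masked
// variant _gfortran_mmaxloc1_8_r4: it copies the source array strides, converts
// the MASK strides to byte strides (stride times the mask element size), and
// computes each extent as ubound - lbound + 1, storing 0 when that value would
// be negative. A rough C sketch of that setup, with names paraphrased from the
// .loc references into generated/maxloc1_8_r4.c (an assumption, not text
// extracted from this object):
//   for (n = 0; n < dim; n++) {
//     sstride[n] = GFC_DESCRIPTOR_STRIDE (array, n);
//     mstride[n] = GFC_DESCRIPTOR_STRIDE_BYTES (mask, n);
//     extent[n]  = GFC_DESCRIPTOR_EXTENT (array, n);
//     if (extent[n] < 0)
//       extent[n] = 0;
//   }
// --- end annotation ---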
setp.ne.u64 %r309,%r188,%r205; @ %r309 bra $L72; bra $L73; $L134: .loc 1 312 6 ld.u64 %r310,[%r213]; setp.eq.u64 %r311,%r310,0; @ ! %r311 bra $L75; bra $L74; $L68: add.u64 %r313,%r22,%r22; add.u64 %r314,%r313,%r22; shl.b64 %r315,%r314,3; add.u64 %r123,%r315,40; add.u64 %r128,%r214,%r123; add.u64 %r107,%r216,%r123; shl.b64 %r316,%r22,3; add.u64 %r78,%r316,-8; cvt.u32.u32 %r318,%r23; cvt.s64.s8 %r317,%r318; shl.b64 %r319,%r317,3; add.u64 %r180,%r319,-8; add.u64 %r441,%frame,240; add.u64 %r439,%frame,360; .loc 1 309 12 mov.u64 %r454,0; $L78: .loc 1 304 18 add.u64 %r321,%r441,%r78; ld.u64 %r322,[%r128]; st.u64 [%r321],%r322; .loc 1 305 18 add.u64 %r323,%frame,%r78; .loc 1 305 20 ld.u64 %r325,[%r107]; mul.lo.u64 %r324,%r325,%r34; .loc 1 305 18 st.u64 [%r323],%r324; .loc 1 306 19 ld.u64 %r327,[%r128+16]; add.u64 %r326,%r327,1; ld.u64 %r328,[%r128+8]; sub.u64 %r61,%r326,%r328; .loc 1 308 10 setp.lt.s64 %r329,%r61,0; @ %r329 bra $L76; .loc 1 306 17 add.u64 %r331,%r439,%r78; st.u64 [%r331],%r61; bra $L77; $L76: .loc 1 309 12 add.u64 %r333,%r439,%r78; st.u64 [%r333],%r454; $L77: .loc 1 302 3 add.u64 %r128,%r128,24; add.u64 %r107,%r107,24; add.u64 %r78,%r78,8; setp.ne.u64 %r335,%r78,%r180; @ %r335 bra $L78; bra $L135; $L74: .loc 1 316 7 setp.eq.u64 %r336,%r153,0; @ %r336 bra $L80; add.u64 %r439,%frame,360; $L107: add.u64 %r144,%r213,40; mov.u64 %r139,%r439; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; add.u64 %r340,%r337,%r337; add.u64 %r341,%r340,%r337; shl.b64 %r342,%r341,3; add.u64 %r343,%r213,16; add.u64 %r129,%r342,%r343; .loc 1 319 10 mov.u64 %r132,1; .loc 1 323 4 mov.u64 %r362,0; bra $L81; $L80: .loc 1 327 20 add.u32 %r344,%r23,-2; cvt.s64.s32 %r68,%r344; add.u64 %r346,%r68,%r68; add.u64 %r347,%r346,%r68; shl.b64 %r348,%r347,3; add.u64 %r349,%r213,%r348; .loc 1 327 67 shl.b64 %r351,%r68,3; add.u64 %r352,%frame,%r351; .loc 1 327 59 ld.u64 %r354,[%r349+40]; ld.u64 %r355,[%r352+360]; mul.lo.u64 %r161,%r354,%r355; .loc 1 329 24 mov.u64 %r356,0; st.u64 [%r213+8],%r356; .loc 1 330 28 cvt.u16.u32 %r359,%r23; add.u16 %r358,%r359,-1; cvt.u32.u16 %r360,%r358; st.u8 [%r213+28],%r360; .loc 1 332 10 setp.eq.u64 %r361,%r161,0; @ %r361 bra $L82; bra $L136; $L84: .loc 1 321 47 mul.lo.u64 %r132,%r65,%r132; $L81: .loc 1 323 4 st.u64 [%r144+8],%r362; ld.u64 %r65,[%r139]; add.u64 %r363,%r65,-1; st.u64 [%r144+16],%r363; st.u64 [%r144],%r132; .loc 1 316 7 add.u64 %r144,%r144,24; add.u64 %r139,%r139,8; setp.ne.u64 %r364,%r129,%r144; @ %r364 bra $L84; bra $L80; $L82: .loc 1 335 4 st.u64 [%r213+48],%r161; mov.u64 %r366,-1; st.u64 [%r213+56],%r366; mov.u64 %r367,1; st.u64 [%r213+40],%r367; .loc 1 336 4 bra $L60; $L136: .loc 1 339 24 mov.u64 %r369369; call (%value_in),_gfortrani_xmallocarray70,[%value_in]; } .loc 1 339 22 st.u64 [%r213],%r370; bra $L85; $L75: .loc 1 344 19 ld.s8 %r372,[%r213+28]; .loc 1 344 10 setp.eq.u64 %r373,%r372,%r153; @ %r373 bra $L86; .loc 1 345 2 cvta.const.u64 %r374374_gfortran_runtime_error86: .loc 1 347 11 cvta.global.u64 %r376,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r377,[%r376+36]; setp.eq.u32 %r378,%r377,0; @ %r378 bra $L85; .loc 1 349 4 add.u64 %r439,%frame,360; cvta.const.u64 %r382,$LC2382; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r386,$LC386382; call _gfortrani_bounds_equal_extents$L85: .loc 1 356 3 setp.ne.u64 %r388,%r153,0; @ %r388 bra $L87; $L90: .loc 1 364 8 ld.u64 %r173,[%r213]; .loc 1 365 8 ld.u64 %r171,[%r214]; .loc 1 367 9 setp.ne.u64 %r389,%r171,0; @ %r389 bra $L88; bra $L60; $L87: add.u64 %r193,%frame,480; add.u64 %r187,%r213,40; add.u64 
%r181,%frame,120; cvt.u32.u32 %r391,%r23; cvt.s64.s8 %r390,%r391; add.u64 %r145,%r390,-1; .loc 1 356 3 mov.u64 %r174,0; add.u64 %r439,%frame,360; .loc 1 358 16 mov.u64 %r392,%r174; $L89: st.u64 [%r193],%r392; .loc 1 359 18 ld.u64 %r393,[%r187]; st.u64 [%r181],%r393; .loc 1 360 17 shl.b64 %r395,%r174,3; add.u64 %r396,%r439,%r395; .loc 1 360 10 ld.u64 %r397,[%r396]; setp.le.s64 %r398,%r397,0; @ %r398 bra $L60; .loc 1 356 26 add.u64 %r174,%r174,1; .loc 1 356 3 add.u64 %r193,%r193,8; add.u64 %r187,%r187,24; add.u64 %r181,%r181,8; setp.ne.u64 %r399,%r145,%r174; @ %r399 bra $L89; bra $L90; $L88: .loc 1 290 12 mul.lo.u64 %r40,%r38,%r34; .loc 1 386 32 shl.b64 %r82,%r157,2; ld.u64 %r185,[%frame+480]; .loc 1 430 22 ld.u64 %r93,[%frame+240]; .loc 1 430 12 shl.b64 %r95,%r93,2; .loc 1 431 23 ld.u64 %r96,[%frame]; .loc 1 432 22 ld.u64 %r98,[%frame+120]; .loc 1 432 12 shl.b64 %r100,%r98,3; setp.le.s64 %r445,%r153,1; .loc 1 408 9 setp.eq.u32 %r447,%r217,0; cvt.u32.u32 %r448,%r23; cvt.s64.s8 %r449,%r448; add.u64 %r450,%r449,-1; add.u64 %r451,%frame,240; add.u64 %r452,%frame,360; add.u64 %r453,%frame,120; $L108: .loc 1 356 3 mov.u64 %r166,%r172; mov.u64 %r165,%r171; .loc 1 383 16 mov.u64 %r142,0; .loc 1 386 9 mov.u64 %r138,%r142; $L96: .loc 1 389 7 ld.s8 %r210,[%r166]; mov.u64 %r182,%r138; .loc 1 393 17 add.u64 %r138,%r138,1; .loc 1 389 6 setp.eq.u32 %r440,%r210,0; @ %r440 bra $L91; .loc 1 392 10 setp.eq.u64 %r401,%r142,0; .loc 1 393 17 selp.u64 %r142,%r138,%r142,%r401; .loc 1 394 11 ld.f32 %r140,[%r165]; .loc 1 394 10 setp.ge.f32 %r402,%r140,0fff800000; @ ! %r402 bra $L91; .loc 1 404 9 setp.le.s64 %r403,%r154,%r182; @ ! %r403 bra $L137; bra $L94; $L91: .loc 1 386 32 add.u64 %r165,%r165,%r82; .loc 1 386 47 add.u64 %r166,%r166,%r40; .loc 1 386 2 setp.ne.u64 %r404,%r154,%r138; @ %r404 bra $L96; bra $L94; $L137: .loc 1 408 9 @ %r447 bra $L97; $L100: .loc 1 414 16 add.u64 %r182,%r182,1; .loc 1 411 8 @ %r440 bra $L98; .loc 1 411 18 ld.f32 %r85,[%r165]; .loc 1 411 15 setp.ge.f32 %r407,%r85,%r140; .loc 1 413 16 selp.f32 %r140,%r85,%r140,%r407; selp.u64 %r138,%r182,%r138,%r407; $L98: .loc 1 409 33 add.u64 %r165,%r165,%r82; .loc 1 409 48 add.u64 %r166,%r166,%r40; .loc 1 409 8 setp.eq.u64 %r408,%r154,%r182; @ %r408 bra $L110; .loc 1 411 9 ld.s8 %r210,[%r166]; setp.eq.u32 %r440,%r210,0; bra $L100; $L97: .loc 1 423 16 add.u64 %r182,%r182,1; .loc 1 420 8 @ %r440 bra $L101; .loc 1 420 18 ld.f32 %r90,[%r165]; .loc 1 420 15 setp.gt.f32 %r410,%r90,%r140; .loc 1 422 16 selp.f32 %r140,%r90,%r140,%r410; selp.u64 %r138,%r182,%r138,%r410; $L101: .loc 1 418 33 add.u64 %r165,%r165,%r82; .loc 1 418 48 add.u64 %r166,%r166,%r40; .loc 1 418 8 setp.eq.u64 %r411,%r154,%r182; @ %r411 bra $L111; .loc 1 420 9 ld.s8 %r210,[%r166]; setp.eq.u32 %r440,%r210,0; bra $L97; $L110: mov.u64 %r142,%r138; bra $L94; $L111: mov.u64 %r142,%r138; $L94: .loc 1 426 8 st.u64 [%r173],%r142; .loc 1 429 15 add.u64 %r185,%r185,1; .loc 1 430 12 add.u64 %r171,%r171,%r95; .loc 1 431 13 add.u64 %r172,%r172,%r96; .loc 1 432 12 add.u64 %r173,%r173,%r100; .loc 1 434 32 ld.u64 %r164,[%frame+360]; .loc 1 434 13 setp.ne.u64 %r412,%r185,%r164; @ %r412 bra $L108; .loc 1 441 23 mul.lo.u64 %r102,%r93,%r185; .loc 1 442 24 mul.lo.u64 %r413,%r96,%r185; .loc 1 442 10 sub.u64 %r168,%r172,%r413; .loc 1 443 23 mul.lo.u64 %r414,%r98,%r185; .loc 1 443 9 shl.b64 %r415,%r414,3; sub.u64 %r169,%r173,%r415; .loc 1 445 7 @ %r445 bra $L60; add.u64 %r203,%frame,488; mov.u64 %r202,8; .loc 1 444 5 mov.u64 %r170,1; .loc 1 438 13 mov.u64 %r446,0; bra $L105; $L106: st.u64 [%r203],%r446; .loc 1 441 
23 mul.lo.u64 %r102,%r114,%r113; .loc 1 442 24 mul.lo.u64 %r421,%r116,%r113; .loc 1 442 10 sub.u64 %r168,%r172,%r421; .loc 1 443 23 mul.lo.u64 %r422,%r118,%r113; .loc 1 443 9 shl.b64 %r423,%r422,3; sub.u64 %r169,%r173,%r423; .loc 1 444 5 add.u64 %r170,%r170,1; .loc 1 445 7 add.u64 %r203,%r203,8; add.u64 %r202,%r202,8; setp.eq.u64 %r424,%r170,%r450; @ %r424 bra $L60; $L105: .loc 1 453 16 ld.u64 %r425,[%r203]; add.u64 %r113,%r425,1; st.u64 [%r203],%r113; .loc 1 454 23 add.u64 %r427,%r451,%r202; ld.u64 %r114,[%r427]; .loc 1 454 13 sub.u64 %r428,%r114,%r102; shl.b64 %r429,%r428,2; add.u64 %r171,%r171,%r429; .loc 1 455 24 add.u64 %r430,%frame,%r202; ld.u64 %r116,[%r430]; .loc 1 455 14 add.u64 %r172,%r168,%r116; .loc 1 456 23 add.u64 %r432,%r453,%r202; ld.u64 %r118,[%r432]; .loc 1 456 13 shl.b64 %r433,%r118,3; add.u64 %r173,%r169,%r433; .loc 1 434 32 add.u64 %r435,%r452,%r202; ld.u64 %r122,[%r435]; .loc 1 434 13 setp.eq.u64 %r436,%r113,%r122; @ %r436 bra $L106; .loc 1 438 13 mov.u64 %r185,0; bra $L108; $L135: .loc 1 312 6 ld.u64 %r437,[%r213]; setp.eq.u64 %r438,%r437,0; @ ! %r438 bra $L75; bra $L107; $L60_gfortran_smaxloc1_8_r4 .visible .func _gfortran_smaxloc1_8_r, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L139; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L140; $L139:r152154; call _gfortran_maxloc1_8_r4490 7 bra $L138; $L140: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r296,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L142; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L143; $L149: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L144; bra $L197; $L142: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error143: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r302,0; $L148: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L146; .loc 1 505 17 st.u64 [%r139],%r36; bra $L147; $L146: .loc 1 508 12 st.u64 [%r139],%r302; $L147: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L148; bra $L149; $L197: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L151; bra $L150; $L144: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r301,0; $L154: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L152; .loc 1 513 17 st.u64 [%r43],%r41; bra $L153; $L152: .loc 1 517 12 st.u64 [%r43],%r301; $L153: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L154; bra $L198; $L150: .loc 1 524 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L156; add.u64 %r295,%frame,120; $L176: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L157; $L156: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L158; bra $L199; $L160: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L157: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L160; bra $L156; $L158: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L138; $L199: .loc 1 547 24 mov.u64 %r24696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L164: .loc 1 573 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L161; bra $L162; $L151: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L163; .loc 1 552 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error163: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L164; .loc 1 559 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L162; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L166: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L165; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error165: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L166; $L161: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ 
%r268 bra $L167; shl.b64 %r266,%r296,3; bra $L168; $L167: mov.u64 %r266,8; $L168: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L169: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L169; $L162: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,3; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 583 13 mov.u64 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L175: st.u64 [%r98],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L175; .loc 1 596 7 @ ! %r294 bra $L200; bra $L138; $L174: .loc 1 591 13 st.u64 [%r99],%r280; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L173; bra $L138; $L200: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; $L173: .loc 1 600 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,3; add.u64 %r98,%r98,%r291; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L174; .loc 1 591 13 mov.u64 %r80,0; bra $L175; $L198: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r293,%r98,0; @ ! %r293 bra $L151; bra $L176; $L138: .loc 1 605 1 ret; } maxloc1_16_r4.o/_gfortran_maxloc1_16_r4 .visible .func _gfortran_maxloc1_16_rfortran/generated/maxloc1_16_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_16_r4 .visible .func _gfortran_mmaxloc1_16_r_gfortran_smaxloc1_16_r4 .visible .func _gfortran_smaxloc1_16_rVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,76,79,67,0 
},105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_maxloc1_16_r4 .visible .func _gfortran_maxloc1_16_r4480predu64 %r258; .reg .u64pred %r265; .reg .pred64pred %r304; .reg .u64 %r305; .reg .u64u64 %r318; .reg .u64u64 %r324; .reg .u64predpred %r334; .reg .pred %r336; .reg .u64 %r340; .reg .u64 %r343; .reg .u64 %r346; .reg .u64 %r349; .reg .u32 %r351; .reg .u64 %r355; .reg .u64 %r357; .reg .u64 %r360; .reg .u32 %r362; .reg .u64 %r366; .reg .pred %r367; .reg .pred %r369; .reg .pred %r370; .reg .u64 %r373; .reg .u64 %r376; .reg .u32pred %r391; .reg .u64 %r397; .reg .u6464mov.u64 %r191,%ar0; mov.u64 %r192,%ar1; mov.u64 %r193,%ar2; mov.u32 %r194,%ar3; .loc 1 58 10 ld.s8 %r22,[%r192+28]; .loc 1 58 38 add.u32 %r195,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r118,%r195; .loc 1 59 10 ld.u64 %r25,[%r193]; .loc 1 59 7 add.u64 %r119,%r25,-1; .loc 1 61 7 shr.u64 %r197,%r119,63; set.u32.lt.s64 %r199,%r118,%r119; neg.s32 %r200,%r199; cvt.u16.u64 %r203,%r197; cvt.u16.u32 %r204,%r200; or.b16 %r202,%r203,%r204; .loc 1 61 6 cvt.u32.u16 %r205,%r202; cvt.u16.u8 %r206,%r205; setp.eq.u16 %r207,%r206,0; @ %r207 bra $L2; .loc 1 63 7 cvt.u32.u32 %r210,%r22; cvt.s64.s8 %r209,%r210; st.u64 [%stack+8],%r209_gfortran_runtime_errorr213,%r119,%r119; add.u64 %r214,%r213,%r119; shl.b64 %r215,%r214,3; add.u64 %r216,%r192,%r215; ld.u64 %r30,[%r216+56]; ld.u64 %r32,[%r216+48]; .loc 1 71 9 ld.u64 %r122,[%r216+40]; .loc 1 73 3 setp.ne.u64 %r230,%r119,0; @ %r230 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r231,%r118,%r119; @ %r231 bra $L4; bra $L57; $L3: add.u64 %r181,%r192,40; add.u64 %r69,%frame,120; add.u64 %r97,%frame,240; add.u64 %r232,%r192,16; add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r67,%r232,%r236; .loc 1 79 12 mov.u64 %r447,0; $L8: .loc 1 75 18 ld.u64 %r237,[%r181]; st.u64 [%r69],%r237; .loc 1 76 19 ld.u64 %r239,[%r181+16]; add.u64 %r238,%r239,1; ld.u64 %r240,[%r181+8]; sub.u64 %r37,%r238,%r240; .loc 1 78 10 setp.lt.s64 %r241,%r37,0; @ %r241 bra $L6; .loc 1 76 17 st.u64 [%r97],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 [%r97],%r447; $L7: .loc 1 73 3 add.u64 %r181,%r181,24; add.u64 %r69,%r69,8; add.u64 %r97,%r97,8; setp.ne.u64 %r243,%r67,%r181; @ %r243 bra $L8; bra $L9; $L57: .loc 1 90 6 ld.u64 %r244,[%r191]; setp.eq.u64 %r245,%r244,0; @ ! 
%r245 bra $L11; bra $L10; $L4: add.u64 %r247,%r25,%r25; add.u64 %r248,%r247,%r25; shl.b64 %r249,%r248,3; add.u64 %r250,%r249,40; add.u64 %r110,%r192,%r250; shl.b64 %r251,%r25,3; add.u64 %r99,%r251,-8; add.u64 %r435,%frame,120; add.u64 %r101,%r435,%r99; add.u64 %r434,%frame,240; add.u64 %r96,%r434,%r99; add.u64 %r254,%r192,40; cvt.u32.u32 %r256,%r22; cvt.s64.s8 %r255,%r256; add.u64 %r258,%r255,%r255; add.u64 %r259,%r258,%r255; shl.b64 %r260,%r259,3; add.u64 %r184,%r254,%r260; .loc 1 87 12 mov.u64 %r446,0; $L14: .loc 1 83 18 ld.u64 %r261,[%r110]; st.u64 [%r101],%r261; .loc 1 84 19 ld.u64 %r263,[%r110+16]; add.u64 %r262,%r263,1; ld.u64 %r264,[%r110+8]; sub.u64 %r44,%r262,%r264; .loc 1 86 10 setp.lt.s64 %r265,%r44,0; @ %r265 bra $L12; .loc 1 84 17 st.u64 [%r96],%r44; bra $L13; $L12: .loc 1 87 12 st.u64 [%r96],%r446; $L13: .loc 1 81 3 add.u64 %r110,%r110,24; add.u64 %r101,%r101,8; add.u64 %r96,%r96,8; setp.ne.u64 %r267,%r110,%r184; @ %r267 bra $L14; bra $L58; $L10: .loc 1 94 7 setp.le.s64 %r268,%r118,0; @ %r268 bra $L16; add.u64 %r434,%frame,240; $L43: add.u64 %r146,%r191,40; mov.u64 %r143,%r434; cvt.u32.u32 %r270,%r22; cvt.s64.s8 %r269,%r270; add.u64 %r272,%r269,%r269; add.u64 %r273,%r272,%r269; shl.b64 %r274,%r273,3; add.u64 %r275,%r191,16; add.u64 %r111,%r274,%r275; .loc 1 97 10 mov.u64 %r103,1; .loc 1 101 4 mov.u64 %r298,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r276,0; st.u64 [%r191+8],%r276; .loc 1 106 28 cvt.u16.u32 %r279,%r22; add.u16 %r278,%r279,-1; cvt.u32.u16 %r280,%r278; st.u8 [%r191+28],%r280; .loc 1 108 20 add.u32 %r281,%r22,-2; cvt.s64.s32 %r53,%r281; add.u64 %r283,%r53,%r53; add.u64 %r284,%r283,%r53; shl.b64 %r285,%r284,3; add.u64 %r286,%r191,%r285; .loc 1 108 67 shl.b64 %r288,%r53,3; add.u64 %r289,%frame,%r288; .loc 1 108 59 ld.u64 %r291,[%r286+40]; ld.u64 %r292,[%r289+240]; mul.lo.u64 %r124,%r291,%r292; .loc 1 110 29 mov.u64 %r294,16call (%value_in),_gfortrani_xmallocarray295,[%value_in]; } .loc 1 110 27 st.u64 [%r191],%r295; .loc 1 111 10 setp.eq.u64 %r297,%r124,0; @ ! 
%r297 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r103,%r49,%r103; $L17: .loc 1 101 4 st.u64 [%r146+8],%r298; ld.u64 %r49,[%r143]; add.u64 %r299,%r49,-1; st.u64 [%r146+16],%r299; st.u64 [%r146],%r103; .loc 1 94 7 add.u64 %r146,%r146,24; add.u64 %r143,%r143,8; setp.ne.u64 %r300,%r111,%r146; @ %r300 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r191+48],%r124; mov.u64 %r302,-1; st.u64 [%r191+56],%r302; mov.u64 %r303,1; st.u64 [%r191+40],%r303; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r59,[%r191+28]; .loc 1 121 10 setp.eq.u64 %r304,%r59,%r118; @ %r304 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r59;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r307,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r308,[%r307+36]; setp.eq.u32 %r309,%r308,0; @ %r309 bra $L21; .loc 1 128 2 add.u64 %r434,%frame,240; cvta.const.u64 %r313,$LC2r4312313; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r315,%r118,0; @ %r315 bra $L24; $L27: .loc 1 68 9 add.u64 %r316,%r30,1; .loc 1 68 7 sub.u64 %r120,%r316,%r32; max.s64 %r64,%r120,0; .loc 1 140 8 ld.u64 %r133,[%r192]; .loc 1 141 8 ld.u64 %r134,[%r191]; .loc 1 168 42 shl.b64 %r66,%r122,2; ld.u64 %r142,[%frame+360]; .loc 1 194 22 ld.u64 %r71,[%frame+120]; .loc 1 194 12 shl.b64 %r72,%r71,2; .loc 1 195 22 ld.u64 %r74,[%frame]; .loc 1 195 12 shl.b64 %r75,%r74,4; .loc 1 197 32 ld.u64 %r61,[%frame+240]; .loc 1 204 23 mul.lo.u64 %r317,%r61,%r71; .loc 1 204 9 shl.b64 %r318,%r317,2; neg.s64 %r154,%r318; .loc 1 205 23 mul.lo.u64 %r319,%r61,%r74; .loc 1 205 9 shl.b64 %r320,%r319,4; neg.s64 %r140,%r320; setp.gt.s64 %r436,%r120,0; setp.le.s64 %r437,%r118,1; add.u64 %r439,%r64,-1; setp.eq.u32 %r440,%r194,0; cvt.u32.u32 %r441,%r22; cvt.s64.s8 %r442,%r441; add.u64 %r443,%r442,-1; add.u64 %r444,%frame,120; add.u64 %r445,%frame,240; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r191,40; mov.u64 %r162,%frame; cvt.u32.u32 %r322,%r22; cvt.s64.s8 %r321,%r322; add.u64 %r148,%r321,-1; .loc 1 132 3 mov.u64 %r135,0; add.u64 %r434,%frame,240; .loc 1 134 16 mov.u64 %r323,%r135; $L26: st.u64 [%r165],%r323; .loc 1 135 18 ld.u64 %r324,[%r164]; st.u64 [%r162],%r324; .loc 1 136 17 shl.b64 %r326,%r135,3; add.u64 %r327,%r434,%r326; .loc 1 136 10 ld.u64 %r328,[%r327]; setp.le.s64 %r329,%r328,0; @ %r329 bra $L1; .loc 1 132 26 add.u64 %r135,%r135,1; .loc 1 132 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r330,%r135,%r148; @ %r330 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r436 bra $L44; .loc 1 159 10 mov.u64 %r332,0; st.u64 [%r134],%r332; st.u64 [%r134+8],%r332; bra $L29; $L44: mov.u64 %r126,%r133; .loc 1 168 19 mov.u64 %r145,0; $L28: .loc 1 170 7 ld.f32 %r108,[%r126]; mov.u64 %r127,%r145; .loc 1 168 34 add.u64 %r145,%r145,1; .loc 1 170 6 setp.ge.f32 %r334,%r108,0fff800000; @ ! 
%r334 bra $L59; .loc 1 173 14 mov.u64 %r413,%r145; shr.s64 %r414,%r413,63; .loc 1 180 6 setp.le.s64 %r336,%r64,%r127; @ %r336 bra $L32; add.u64 %r417,%r127,1; shr.s64 %r418,%r417,63; sub.u64 %r340,%r439,%r127; shr.s64 %r343,%r127,63; add.u64 %r346,%r127,2; set.u32.lt.u64 %r351,%r346,%r127; cvt.s64.s32 %r349,%r351; sub.u64 %r355,%r343,%r349; add.u64 %r357,%r340,%r346; set.u32.lt.u64 %r362,%r357,%r340; cvt.s64.s32 %r360,%r362; sub.u64 %r366,%r355,%r360; .loc 1 170 7 mov.f32 %r185,%r108; .loc 1 180 6 mov.u64 %r429,1; bra $L33; $L59: .loc 1 168 42 add.u64 %r126,%r126,%r66; .loc 1 168 12 setp.gt.s64 %r367,%r64,%r145; @ %r367 bra $L28; .loc 1 157 9 mov.u64 %r413,1; mov.u64 %r414,0; bra $L32; $L33: .loc 1 182 7 @ %r440 bra $L34; .loc 1 182 6 setp.le.f32 %r369,%r108,%r185; @ %r369 bra $L35; bra $L36; $L34: setp.lt.f32 %r370,%r108,%r185; @ ! %r370 bra $L36; $L35: .loc 1 185 14 mov.u64 %r413,%r417; mov.u64 %r414,%r418; .loc 1 184 14 mov.f32 %r108,%r185; $L36: .loc 1 180 31 add.u64 %r126,%r126,%r66; .loc 1 180 6 add.u64 %r373,%r417,%r429; set.u32.lt.u64 %r378,%r373,%r417; cvt.s64.s32 %r376,%r378; mov.u64 %r417,%r373; sub.u64 %r418,%r418,%r376; setp.ne.u64 %r385,%r357,%r417; @ %r385 bra $L46; setp.ne.u64 %r388,%r366,%r418; @ ! %r388 bra $L32; $L46: .loc 1 182 14 ld.f32 %r185,[%r126]; bra $L33; $L32: .loc 1 189 12 st.u64 [%r134],%r413; st.u64 [%r134+8],%r414; $L29: .loc 1 193 15 add.u64 %r142,%r142,1; .loc 1 194 12 add.u64 %r133,%r133,%r72; .loc 1 195 12 add.u64 %r134,%r134,%r75; .loc 1 197 13 setp.ne.u64 %r391,%r61,%r142; @ %r391 bra $L25; .loc 1 204 9 add.u64 %r130,%r133,%r154; .loc 1 205 9 add.u64 %r131,%r134,%r140; .loc 1 207 7 @ %r437 bra $L1; add.u64 %r117,%frame,368; mov.u64 %r180,8; .loc 1 206 5 mov.u64 %r132,1; .loc 1 201 13 mov.u64 %r438,0; bra $L41; $L42: st.u64 [%r117],%r438; .loc 1 204 23 mul.lo.u64 %r397,%r89,%r88; .loc 1 204 9 shl.b64 %r398,%r397,2; sub.u64 %r130,%r133,%r398; .loc 1 205 23 mul.lo.u64 %r399,%r92,%r88; .loc 1 205 9 shl.b64 %r400,%r399,4; sub.u64 %r131,%r134,%r400; .loc 1 206 5 add.u64 %r132,%r132,1; .loc 1 207 7 add.u64 %r117,%r117,8; add.u64 %r180,%r180,8; setp.eq.u64 %r401,%r132,%r443; @ %r401 bra $L1; $L41: .loc 1 215 16 ld.u64 %r402,[%r117]; add.u64 %r88,%r402,1; st.u64 [%r117],%r88; .loc 1 216 23 add.u64 %r404,%r444,%r180; ld.u64 %r89,[%r404]; .loc 1 216 13 shl.b64 %r405,%r89,2; add.u64 %r133,%r130,%r405; .loc 1 217 23 add.u64 %r406,%frame,%r180; ld.u64 %r92,[%r406]; .loc 1 217 13 shl.b64 %r407,%r92,4; add.u64 %r134,%r131,%r407; .loc 1 197 32 add.u64 %r409,%r445,%r180; ld.u64 %r95,[%r409]; .loc 1 197 13 setp.eq.u64 %r410,%r88,%r95; @ %r410 bra $L42; .loc 1 201 13 mov.u64 %r142,0; bra $L25; $L58: .loc 1 90 6 ld.u64 %r411,[%r191]; setp.eq.u64 %r412,%r411,0; @ ! 
%r412 bra $L11; bra $L43; $L1:_gfortran_mmaxloc1_16_r4 .visible .func _gfortran_mmaxloc1_16_r, .param .u32 %in_ar4) {608pred %r239; .reg .u32 %r244; .reg .u64 %r246; .reg .u32 %r248; .reg .u32 %r249; .reg .u16 %r251; .reg .u16 %r252; .reg .u16 %r253; .reg .u32u64 %r268; .reg .u64 %r275; .reg .pred %r276; .reg .u32 %r277; .reg .u32 %r278; .reg .u32 %r280; .reg .u32pred %r291; .reg .pred %r293; .reg .u64predu64 %r312; .reg .u64u64 %r317; .reg .u64 %r318; .reg .u64u64pred %r356; .reg .pred %r357; .reg .u64 %r358; .reg .u32pred %r410; .reg .u64pred %r419; .reg .predpred %r427; .reg .predu64 %r454; .reg .u64 %r457; .reg .u32 %r459; .reg .u64 %r463; .reg .u64 %r465; .reg .u64pred %r541; .reg .predu64u64 %r565; .reg .u64 %r567; .reg .pred %r568; .reg .u64 %r569; .reg .pred %r570; .reg .u64 %r571; .reg .u64 %r572; .reg .u64 %r573; .reg .u64.reg .u64 %r633; .reg .u64 %r634; mov.u64 %r234,%ar0; mov.u64 %r235,%ar1; mov.u64 %r236,%ar2; mov.u64 %r237,%ar3; mov.u32 %r238,%ar4; .loc 1 251 6 setp.ne.u64 %r239,%r237,0; @ %r239 bra $L61; .loc 1 254 7r2236238; call _gfortran_maxloc1_16_r47 bra $L60; $L61: .loc 1 261 10 ld.u64 %r22,[%r236]; .loc 1 261 7 add.u64 %r152,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r235+28]; .loc 1 262 38 add.u32 %r244,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r153,%r244; .loc 1 265 7 shr.u64 %r246,%r152,63; set.u32.gt.s64 %r248,%r152,%r153; neg.s32 %r249,%r248; cvt.u16.u64 %r252,%r246; cvt.u16.u32 %r253,%r249; or.b16 %r251,%r252,%r253; .loc 1 265 6 cvt.u32.u16 %r254,%r251; cvt.u16.u8 %r255,%r254; setp.eq.u16 %r256,%r255,0; @ %r256 bra $L63; .loc 1 267 7 cvt.u32.u32 %r259,%r23; cvt.s64.s8 %r258,%r259; st.u64 [%stack+8],%r258; st.u64 [%stack],%r22; cvta.const.u64 %r25_gfortran_runtime_error63: .loc 1 272 9 add.u64 %r621,%r152,%r152; add.u64 %r617,%r621,%r152; shl.b64 %r264,%r617,3; add.u64 %r265,%r235,%r264; ld.u64 %r268,[%r265+56]; add.u64 %r267,%r268,1; .loc 1 272 7 ld.u64 %r275,[%r265+48]; sub.u64 %r154,%r267,%r275; .loc 1 273 6 setp.le.s64 %r276,%r154,0; @ %r276 bra $L60; .loc 1 276 9 ld.u64 %r174,[%r237]; .loc 1 278 15 ld.u64 %r33,[%r237+16]; .loc 1 280 22 cvt.u32.u64 %r34,%r33; .loc 1 280 53 add.u32 %r277,%r34,-4; and.b32 %r278,%r277,-5; set.u32.eq.u32 %r280,%r278,0; neg.s32 %r281,%r280; .loc 1 280 22 add.u32 %r282,%r34,-1; set.u32.le.u32 %r284,%r282,1; neg.s32 %r285,%r284; .loc 1 280 6 cvt.u16.u32 %r287,%r281; cvt.u16.u32 %r288,%r285; or.b16 %r286,%r287,%r288; cvt.u32.u16 %r289,%r286; cvt.u16.u8 %r290,%r289; setp.ne.u16 %r291,%r290,0; @ %r291 bra $L65; .loc 1 282 7 setp.ne.u32 %r293,%r34,16; @ %r293 bra $L66; $L65: .loc 1 289 9 shl.b64 %r297,%r617,3; add.u64 %r298,%r235,%r297; ld.u64 %r157,[%r298+40]; .loc 1 290 12 add.u64 %r304,%r237,%r297; ld.u64 %r40,[%r304+40]; .loc 1 292 3 setp.ne.u64 %r306,%r152,0; @ %r306 bra $L67; $L73: .loc 1 302 3 setp.lt.s64 %r307,%r152,%r153; @ %r307 bra $L68; bra $L135; $L66: .loc 1 287 5 cvta.const.u64 %r30830_gfortran_runtime_error67: add.u64 %r91,%r235,40; add.u64 %r196,%r237,40; add.u64 %r311,%r22,%r22; add.u64 %r312,%r311,%r22; shl.b64 %r313,%r312,3; add.u64 %r314,%r235,16; add.u64 %r226,%r313,%r314; .loc 1 292 3 mov.u64 %r107,0; add.u64 %r619,%frame,240; add.u64 %r623,%frame,360; .loc 1 299 12 mov.u64 %r634,%r107; $L72: .loc 1 294 18 add.u64 %r316,%r619,%r107; ld.u64 %r317,[%r91]; st.u64 [%r316],%r317; .loc 1 295 18 add.u64 %r318,%frame,%r107; .loc 1 295 20 ld.u64 %r320,[%r196]; mul.lo.u64 %r319,%r320,%r33; .loc 1 295 18 st.u64 [%r318],%r319; .loc 1 296 19 ld.u64 %r322,[%r91+16]; add.u64 %r321,%r322,1; ld.u64 %r323,[%r91+8]; sub.u64 %r52,%r321,%r323; 
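// --- annotation, not part of the original object code ---
// Same descriptor setup as in the _8_ variant, here inside _gfortran_mmaxloc1_16_r4:
// source strides are copied, MASK strides are scaled to byte strides, and each
// extent is computed as ubound - lbound + 1 with negative extents clamped to 0.
// Because the result kind is INTEGER(16), the located index is carried further
// down as a pair of 64-bit registers (e.g. %r571/%r572), stored with two st.u64
// instructions, and the destination stride scaling uses shl.b64 ...,4 (16-byte
// elements) where the _8_ variant uses shl.b64 ...,3.
// --- end annotation ---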
.loc 1 298 10 setp.lt.s64 %r324,%r52,0; @ %r324 bra $L70; .loc 1 296 17 add.u64 %r326,%r623,%r107; st.u64 [%r326],%r52; bra $L71; $L70: .loc 1 299 12 add.u64 %r328,%r623,%r107; st.u64 [%r328],%r634; $L71: .loc 1 292 3 add.u64 %r91,%r91,24; add.u64 %r196,%r196,24; add.u64 %r107,%r107,8; setp.ne.u64 %r330,%r91,%r226; @ %r330 bra $L72; bra $L73; $L135: .loc 1 312 6 ld.u64 %r331,[%r234]; setp.eq.u64 %r332,%r331,0; @ ! %r332 bra $L75; bra $L74; $L68: add.u64 %r334,%r22,%r22; add.u64 %r335,%r334,%r22; shl.b64 %r336,%r335,3; add.u64 %r180,%r336,40; add.u64 %r57,%r235,%r180; add.u64 %r182,%r237,%r180; shl.b64 %r337,%r22,3; add.u64 %r74,%r337,-8; cvt.u32.u32 %r339,%r23; cvt.s64.s8 %r338,%r339; shl.b64 %r340,%r338,3; add.u64 %r88,%r340,-8; add.u64 %r619,%frame,240; add.u64 %r623,%frame,360; .loc 1 309 12 mov.u64 %r633,0; $L78: .loc 1 304 18 add.u64 %r342,%r619,%r74; ld.u64 %r343,[%r57]; st.u64 [%r342],%r343; .loc 1 305 18 add.u64 %r344,%frame,%r74; .loc 1 305 20 ld.u64 %r346,[%r182]; mul.lo.u64 %r345,%r346,%r33; .loc 1 305 18 st.u64 [%r344],%r345; .loc 1 306 19 ld.u64 %r348,[%r57+16]; add.u64 %r347,%r348,1; ld.u64 %r349,[%r57+8]; sub.u64 %r64,%r347,%r349; .loc 1 308 10 setp.lt.s64 %r350,%r64,0; @ %r350 bra $L76; .loc 1 306 17 add.u64 %r352,%r623,%r74; st.u64 [%r352],%r64; bra $L77; $L76: .loc 1 309 12 add.u64 %r354,%r623,%r74; st.u64 [%r354],%r633; $L77: .loc 1 302 3 add.u64 %r57,%r57,24; add.u64 %r182,%r182,24; add.u64 %r74,%r74,8; setp.ne.u64 %r356,%r74,%r88; @ %r356 bra $L78; bra $L136; $L74: .loc 1 316 7 setp.eq.u64 %r357,%r153,0; @ %r357 bra $L80; add.u64 %r623,%frame,360; $L110: add.u64 %r133,%r234,40; mov.u64 %r131,%r623; cvt.u32.u32 %r359,%r23; cvt.s64.s8 %r358,%r359; add.u64 %r361,%r358,%r358; add.u64 %r362,%r361,%r358; shl.b64 %r363,%r362,3; add.u64 %r364,%r234,16; add.u64 %r60,%r363,%r364; .loc 1 319 10 mov.u64 %r136,1; .loc 1 323 4 mov.u64 %r383,0; bra $L81; $L80: .loc 1 327 20 add.u32 %r365,%r23,-2; cvt.s64.s32 %r72,%r365; add.u64 %r367,%r72,%r72; add.u64 %r368,%r367,%r72; shl.b64 %r369,%r368,3; add.u64 %r370,%r234,%r369; .loc 1 327 67 shl.b64 %r372,%r72,3; add.u64 %r373,%frame,%r372; .loc 1 327 59 ld.u64 %r375,[%r370+40]; ld.u64 %r376,[%r373+360]; mul.lo.u64 %r161,%r375,%r376; .loc 1 329 24 mov.u64 %r377,0; st.u64 [%r234+8],%r377; .loc 1 330 28 cvt.u16.u32 %r380,%r23; add.u16 %r379,%r380,-1; cvt.u32.u16 %r381,%r379; st.u8 [%r234+28],%r381; .loc 1 332 10 setp.eq.u64 %r382,%r161,0; @ %r382 bra $L82; bra $L137; $L84: .loc 1 321 47 mul.lo.u64 %r136,%r69,%r136; $L81: .loc 1 323 4 st.u64 [%r133+8],%r383; ld.u64 %r69,[%r131]; add.u64 %r384,%r69,-1; st.u64 [%r133+16],%r384; st.u64 [%r133],%r136; .loc 1 316 7 add.u64 %r133,%r133,24; add.u64 %r131,%r131,8; setp.ne.u64 %r385,%r60,%r133; @ %r385 bra $L84; bra $L80; $L82: .loc 1 335 4 st.u64 [%r234+48],%r161; mov.u64 %r387,-1; st.u64 [%r234+56],%r387; mov.u64 %r388,1; st.u64 [%r234+40],%r388; .loc 1 336 4 bra $L60; $L137: .loc 1 339 24 mov.u64 %r390,16390; call (%value_in),_gfortrani_xmallocarray91,[%value_in]; } .loc 1 339 22 st.u64 [%r234],%r391; bra $L85; $L75: .loc 1 344 19 ld.s8 %r393,[%r234+28]; .loc 1 344 10 setp.eq.u64 %r394,%r393,%r153; @ %r394 bra $L86; .loc 1 345 2 cvta.const.u64 %r3955_gfortran_runtime_error86: .loc 1 347 11 cvta.global.u64 %r397,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r398,[%r397+36]; setp.eq.u32 %r399,%r398,0; @ %r399 bra $L85; .loc 1 349 4 add.u64 %r623,%frame,360; cvta.const.u64 %r403,$LC2;r62302403; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r407,$LC207403; call 
_gfortrani_bounds_equal_extents$L85: .loc 1 356 3 setp.ne.u64 %r409,%r153,0; @ %r409 bra $L87; $L90: .loc 1 364 8 ld.u64 %r175,[%r234]; .loc 1 365 8 ld.u64 %r173,[%r235]; .loc 1 367 9 setp.ne.u64 %r410,%r173,0; @ %r410 bra $L88; bra $L60; $L87: add.u64 %r149,%frame,480; add.u64 %r148,%r234,40; add.u64 %r144,%frame,120; cvt.u32.u32 %r412,%r23; cvt.s64.s8 %r411,%r412; add.u64 %r134,%r411,-1; .loc 1 356 3 mov.u64 %r176,0; add.u64 %r623,%frame,360; .loc 1 358 16 mov.u64 %r413,%r176; $L89: st.u64 [%r149],%r413; .loc 1 359 18 ld.u64 %r414,[%r148]; st.u64 [%r144],%r414; .loc 1 360 17 shl.b64 %r416,%r176,3; add.u64 %r417,%r623,%r416; .loc 1 360 10 ld.u64 %r418,[%r417]; setp.le.s64 %r419,%r418,0; @ %r419 bra $L60; .loc 1 356 26 add.u64 %r176,%r176,1; .loc 1 356 3 add.u64 %r149,%r149,8; add.u64 %r148,%r148,24; add.u64 %r144,%r144,8; setp.ne.u64 %r420,%r134,%r176; @ %r420 bra $L89; bra $L90; $L88: .loc 1 290 12 mul.lo.u64 %r43,%r40,%r33; .loc 1 386 32 shl.b64 %r87,%r157,2; ld.u64 %r38,[%frame+480]; .loc 1 430 22 ld.u64 %r100,[%frame+240]; .loc 1 430 12 shl.b64 %r101,%r100,2; .loc 1 431 23 ld.u64 %r103,[%frame]; .loc 1 432 22 ld.u64 %r105,[%frame+120]; .loc 1 432 12 shl.b64 %r106,%r105,4; .loc 1 434 32 ld.u64 %r166,[%frame+360]; .loc 1 441 23 mul.lo.u64 %r37,%r166,%r100; .loc 1 442 24 mul.lo.u64 %r421,%r166,%r103; .loc 1 442 10 neg.s64 %r85,%r421; .loc 1 443 23 mul.lo.u64 %r422,%r166,%r105; .loc 1 443 9 shl.b64 %r423,%r422,4; neg.s64 %r93,%r423; setp.le.s64 %r618,%r153,1; .loc 1 356 3 mov.u64 %r626,1; .loc 1 408 9 setp.ne.u32 %r627,%r238,0; add.u64 %r628,%r154,-1; cvt.u32.u32 %r629,%r23; cvt.s64.s8 %r630,%r629; add.u64 %r631,%r630,-1; add.u64 %r632,%frame,240; $L111: .loc 1 356 3 mov.u64 %r169,%r174; mov.u64 %r168,%r173; mov.u64 %r573,%r626; mov.u64 %r574,0; .loc 1 383 17 mov.u64 %r571,%r574; mov.u64 %r572,%r574; .loc 1 386 9 mov.u64 %r200,%r574; $L96: .loc 1 389 7 ld.s8 %r225,[%r169]; mov.u64 %r231,%r200; add.u64 %r200,%r200,1; .loc 1 389 6 setp.eq.u32 %r620,%r225,0; @ %r620 bra $L91; .loc 1 392 10 or.b64 %r425,%r571,%r572; setp.ne.u64 %r427,%r425,0; .loc 1 393 17 selp.u64 %r571,%r571,%r573,%r427; selp.u64 %r572,%r572,%r574,%r427; .loc 1 394 11 ld.f32 %r145,[%r168]; .loc 1 394 10 setp.ge.f32 %r428,%r145,0fff800000; @ ! %r428 bra $L91; .loc 1 398 11 shr.s64 %r429,%r200,63; .loc 1 404 9 setp.le.s64 %r430,%r154,%r231; @ ! 
%r430 bra $L138; bra $L94; $L91: .loc 1 386 32 add.u64 %r168,%r168,%r87; .loc 1 386 47 add.u64 %r169,%r169,%r43; .loc 1 386 2 add.u64 %r433,%r573,%r626; set.u32.lt.u64 %r438,%r433,%r573; cvt.s64.s32 %r436,%r438; mov.u64 %r573,%r433; sub.u64 %r574,%r574,%r436; setp.ne.u64 %r443,%r154,%r200; @ %r443 bra $L96; bra $L94; $L138: .loc 1 408 9 @ %r627 bra $L97; add.u64 %r579,%r231,1; shr.s64 %r580,%r579,63; sub.u64 %r448,%r628,%r231; shr.s64 %r451,%r231,63; add.u64 %r454,%r231,2; set.u32.lt.u64 %r459,%r454,%r231; cvt.s64.s32 %r457,%r459; sub.u64 %r463,%r451,%r457; add.u64 %r465,%r448,%r454; set.u32.lt.u64 %r470,%r465,%r448; cvt.s64.s32 %r468,%r470; sub.u64 %r474,%r463,%r468; .loc 1 398 11 mov.u64 %r571,%r200; mov.u64 %r572,%r429; bra $L98; $L97: add.u64 %r583,%r231,1; shr.s64 %r584,%r583,63; sub.u64 %r478,%r628,%r231; shr.s64 %r481,%r231,63; add.u64 %r484,%r231,2; set.u32.lt.u64 %r489,%r484,%r231; cvt.s64.s32 %r487,%r489; sub.u64 %r493,%r481,%r487; add.u64 %r495,%r478,%r484; set.u32.lt.u64 %r500,%r495,%r478; cvt.s64.s32 %r498,%r500; sub.u64 %r504,%r493,%r498; mov.u64 %r571,%r200; mov.u64 %r572,%r429; $L102: .loc 1 411 8 @ %r620 bra $L99; .loc 1 411 18 ld.f32 %r90,[%r168]; .loc 1 411 15 setp.ge.f32 %r506,%r90,%r145; .loc 1 414 16 selp.u64 %r571,%r583,%r571,%r506; selp.u64 %r572,%r584,%r572,%r506; selp.f32 %r145,%r90,%r145,%r506; $L99: .loc 1 409 33 add.u64 %r168,%r168,%r87; .loc 1 409 48 add.u64 %r169,%r169,%r43; .loc 1 409 8 add.u64 %r509,%r583,%r626; set.u32.lt.u64 %r514,%r509,%r583; cvt.s64.s32 %r512,%r514; mov.u64 %r583,%r509; sub.u64 %r584,%r584,%r512; setp.ne.u64 %r521,%r495,%r583; @ %r521 bra $L114; setp.ne.u64 %r524,%r504,%r584; @ ! %r524 bra $L94; $L114: .loc 1 411 9 ld.s8 %r225,[%r169]; setp.eq.u32 %r620,%r225,0; bra $L102; $L98: .loc 1 420 8 @ %r620 bra $L103; .loc 1 420 18 ld.f32 %r94,[%r168]; .loc 1 420 15 setp.gt.f32 %r526,%r94,%r145; .loc 1 423 16 selp.u64 %r571,%r579,%r571,%r526; selp.u64 %r572,%r580,%r572,%r526; selp.f32 %r145,%r94,%r145,%r526; $L103: .loc 1 418 33 add.u64 %r168,%r168,%r87; .loc 1 418 48 add.u64 %r169,%r169,%r43; .loc 1 418 8 add.u64 %r529,%r579,%r626; set.u32.lt.u64 %r534,%r529,%r579; cvt.s64.s32 %r532,%r534; mov.u64 %r579,%r529; sub.u64 %r580,%r580,%r532; setp.ne.u64 %r541,%r465,%r579; @ %r541 bra $L116; setp.ne.u64 %r544,%r474,%r580; @ ! 
%r544 bra $L94; $L116: .loc 1 420 9 ld.s8 %r225,[%r169]; setp.eq.u32 %r620,%r225,0; bra $L98; $L94: .loc 1 426 8 st.u64 [%r175],%r571; st.u64 [%r175+8],%r572; .loc 1 429 15 add.u64 %r38,%r38,1; .loc 1 430 12 add.u64 %r173,%r173,%r101; .loc 1 431 13 add.u64 %r174,%r174,%r103; .loc 1 432 12 add.u64 %r175,%r175,%r106; .loc 1 434 13 setp.ne.u64 %r547,%r38,%r166; @ %r547 bra $L111; .loc 1 442 10 add.u64 %r170,%r174,%r85; .loc 1 443 9 add.u64 %r171,%r175,%r93; .loc 1 445 7 @ %r618 bra $L60; add.u64 %r151,%frame,488; .loc 1 441 23 mov.u64 %r110,%r37; .loc 1 445 7 mov.u64 %r219,8; .loc 1 444 5 mov.u64 %r172,1; add.u64 %r623,%frame,360; add.u64 %r622,%frame,120; .loc 1 438 13 mov.u64 %r624,0; bra $L108; $L109: st.u64 [%r151],%r624; .loc 1 441 23 mul.lo.u64 %r110,%r123,%r122; .loc 1 442 24 mul.lo.u64 %r553,%r125,%r122; .loc 1 442 10 sub.u64 %r170,%r174,%r553; .loc 1 443 23 mul.lo.u64 %r554,%r127,%r122; .loc 1 443 9 shl.b64 %r555,%r554,4; sub.u64 %r171,%r175,%r555; .loc 1 444 5 add.u64 %r172,%r172,1; .loc 1 445 7 add.u64 %r151,%r151,8; add.u64 %r219,%r219,8; setp.eq.u64 %r556,%r172,%r631; @ %r556 bra $L60; $L108: .loc 1 453 16 ld.u64 %r557,[%r151]; add.u64 %r122,%r557,1; st.u64 [%r151],%r122; .loc 1 454 23 add.u64 %r559,%r632,%r219; ld.u64 %r123,[%r559]; .loc 1 454 13 sub.u64 %r560,%r123,%r110; shl.b64 %r561,%r560,2; add.u64 %r173,%r173,%r561; .loc 1 455 24 add.u64 %r562,%frame,%r219; ld.u64 %r125,[%r562]; .loc 1 455 14 add.u64 %r174,%r170,%r125; .loc 1 456 23 add.u64 %r564,%r622,%r219; ld.u64 %r127,[%r564]; .loc 1 456 13 shl.b64 %r565,%r127,4; add.u64 %r175,%r171,%r565; .loc 1 434 32 add.u64 %r567,%r623,%r219; ld.u64 %r130,[%r567]; .loc 1 434 13 setp.eq.u64 %r568,%r122,%r130; @ %r568 bra $L109; .loc 1 438 13 mov.u64 %r38,0; bra $L111; $L136: .loc 1 312 6 ld.u64 %r569,[%r234]; setp.eq.u64 %r570,%r569,0; @ ! 
%r570 bra $L75; bra $L110; $L60_gfortran_smaxloc1_16_r4 .visible .func _gfortran_smaxloc1_16_r, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L140; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L141; $L140:r152154; call _gfortran_maxloc1_16_r4490 7 bra $L139; $L141: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r297,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L143; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L144; $L150: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L145; bra $L198; $L143: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error144: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r303,0; $L149: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L147; .loc 1 505 17 st.u64 [%r139],%r36; bra $L148; $L147: .loc 1 508 12 st.u64 [%r139],%r303; $L148: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L149; bra $L150; $L198: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L152; bra $L151; $L145: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r302,0; $L155: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L153; .loc 1 513 17 st.u64 [%r43],%r41; bra $L154; $L153: .loc 1 517 12 st.u64 [%r43],%r302; $L154: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L155; bra $L199; $L151: .loc 1 524 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L157; add.u64 %r296,%frame,120; $L177: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L158; $L157: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L159; bra $L200; $L161: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L158: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L161; bra $L157; $L159: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L139; $L200: .loc 1 547 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L165: .loc 1 573 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L162; bra $L163; $L152: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L164; .loc 1 552 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error164: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L165; .loc 1 559 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L163; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L167: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L166; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error166: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L167; $L162: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L168; shl.b64 %r266,%r297,3; bra $L169; $L168: mov.u64 %r266,8; $L169: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L170: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L170; $L163: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,4; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 583 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L176: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L176; .loc 1 596 7 @ ! %r295 bra $L201; bra $L139; $L175: .loc 1 591 13 st.u64 [%r99],%r280; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L174; bra $L139; $L201: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; $L174: .loc 1 600 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L175; .loc 1 591 13 mov.u64 %r80,0; bra $L176; $L199: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r294,%r98,0; @ ! 
%r294 bra $L152; bra $L177; $L139: .loc 1 605 1 ret; } maxloc1_4_r8.o/_gfortran_maxloc1_4_r8 .visible .func _gfortran_maxloc1_4_r8.file 1 "../../../libgfortran/generated/maxloc1_4_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc1_4_r8 .visible .func _gfortran_mmaxloc1_4_r8_gfortran_smaxloc1_4_r8 .visible .func _gfortran_smaxloc1_4_r8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_maxloc1_4_r8 .visible .func _gfortran_maxloc1_4_r8480100; .reg .u64predpredpred %r235; .reg .predpredu64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64 %r283; .reg .u64predu64u32 %r316; .reg .u64 %r317; .reg .u64pred %r323; .reg .pred %r324; .reg .u32 %r326; .reg .predpred %r330; .reg .predpred %r345; .reg .u64mov.u64 %r185,%ar0; mov.u64 %r186,%ar1; mov.u64 %r187,%ar2; mov.u32 %r188,%ar3; .loc 1 58 10 ld.s8 %r22,[%r186+28]; .loc 1 58 38 add.u32 %r189,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r112,%r189; .loc 1 59 10 ld.u64 %r25,[%r187]; .loc 1 59 7 add.u64 %r113,%r25,-1; .loc 1 61 7 shr.u64 %r191,%r113,63; set.u32.lt.s64 %r193,%r112,%r113; neg.s32 %r194,%r193; cvt.u16.u64 %r197,%r191; cvt.u16.u32 %r198,%r194; or.b16 %r196,%r197,%r198; .loc 1 61 6 cvt.u32.u16 %r199,%r196; cvt.u16.u8 %r200,%r199; setp.eq.u16 %r201,%r200,0; @ %r201 bra $L2; .loc 1 63 7 cvt.u32.u32 %r204,%r22; cvt.s64.s8 %r203,%r204; st.u64 [%stack+8],%r20302202_gfortran_runtime_errorr207,%r113,%r113; add.u64 %r208,%r207,%r113; shl.b64 %r209,%r208,3; add.u64 %r210,%r186,%r209; ld.u64 %r30,[%r210+56]; ld.u64 %r32,[%r210+48]; .loc 1 71 9 ld.u64 %r116,[%r210+40]; .loc 1 73 3 setp.ne.u64 %r224,%r113,0; @ %r224 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r225,%r112,%r113; @ %r225 bra $L4; bra $L55; $L3: add.u64 %r184,%r186,40; add.u64 %r181,%frame,120; add.u64 %r47,%frame,240; add.u64 %r226,%r186,16; add.u64 %r228,%r25,%r25; add.u64 %r229,%r228,%r25; shl.b64 %r230,%r229,3; add.u64 %r148,%r226,%r230; .loc 1 79 12 mov.u64 %r371,0; $L8: .loc 1 75 18 ld.u64 %r231,[%r184]; st.u64 [%r181],%r231; .loc 1 76 19 ld.u64 %r233,[%r184+16]; 
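// --- annotation, not part of the original object code ---
// From this point the PTX belongs to maxloc1_4_r8.o: MAXLOC over REAL(8) data
// with an INTEGER(4) result. The reduction loop further down first skips leading
// NaNs by testing *src >= -infinity (the 0dfff0000000000000 literal), then scans
// for the maximum, with the BACK argument selecting ">=" versus ">" comparisons.
// A rough C sketch of that unmasked inner loop, with hypothetical names
// paraphrased from the .loc references into generated/maxloc1_4_r8.c (an
// assumption, not text extracted from this object):
//   maxval = -INFINITY;
//   result = 1;
//   for (n = 0; n < len; n++, src += delta)          /* skip leading NaNs */
//     if (*src >= maxval) { maxval = *src; result = n + 1; break; }
//   for (; n < len; n++, src += delta)
//     if (back ? *src >= maxval : *src > maxval) { maxval = *src; result = n + 1; }
//   *dest = result;
// --- end annotation ---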
add.u64 %r232,%r233,1; ld.u64 %r234,[%r184+8]; sub.u64 %r37,%r232,%r234; .loc 1 78 10 setp.lt.s64 %r235,%r37,0; @ %r235 bra $L6; .loc 1 76 17 st.u64 [%r47],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 [%r47],%r371; $L7: .loc 1 73 3 add.u64 %r184,%r184,24; add.u64 %r181,%r181,8; add.u64 %r47,%r47,8; setp.ne.u64 %r237,%r148,%r184; @ %r237 bra $L8; bra $L9; $L55: .loc 1 90 6 ld.u64 %r238,[%r185]; setp.eq.u64 %r239,%r238,0; @ ! %r239 bra $L11; bra $L10; $L4: add.u64 %r241,%r25,%r25; add.u64 %r242,%r241,%r25; shl.b64 %r243,%r242,3; add.u64 %r244,%r243,40; add.u64 %r130,%r186,%r244; shl.b64 %r245,%r25,3; add.u64 %r102,%r245,-8; add.u64 %r359,%frame,120; add.u64 %r106,%r359,%r102; add.u64 %r358,%frame,240; add.u64 %r98,%r358,%r102; cvt.u32.u32 %r249,%r22; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r254,%r186,40; add.u64 %r127,%r253,%r254; .loc 1 87 12 mov.u64 %r370,0; $L14: .loc 1 83 18 ld.u64 %r255,[%r130]; st.u64 [%r106],%r255; .loc 1 84 19 ld.u64 %r257,[%r130+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r130+8]; sub.u64 %r44,%r256,%r258; .loc 1 86 10 setp.lt.s64 %r259,%r44,0; @ %r259 bra $L12; .loc 1 84 17 st.u64 [%r98],%r44; bra $L13; $L12: .loc 1 87 12 st.u64 [%r98],%r370; $L13: .loc 1 81 3 add.u64 %r130,%r130,24; add.u64 %r106,%r106,8; add.u64 %r98,%r98,8; setp.ne.u64 %r261,%r127,%r130; @ %r261 bra $L14; bra $L56; $L10: .loc 1 94 7 setp.le.s64 %r262,%r112,0; @ %r262 bra $L16; add.u64 %r358,%frame,240; $L42: add.u64 %r155,%r185,40; mov.u64 %r146,%r358; add.u64 %r263,%r185,16; cvt.u32.u32 %r265,%r22; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r134,%r263,%r269; .loc 1 97 10 mov.u64 %r100,1; .loc 1 101 4 mov.u64 %r292,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r270,0; st.u64 [%r185+8],%r270; .loc 1 106 28 cvt.u16.u32 %r273,%r22; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r185+28],%r274; .loc 1 108 20 add.u32 %r275,%r22,-2; cvt.s64.s32 %r53,%r275; add.u64 %r277,%r53,%r53; add.u64 %r278,%r277,%r53; shl.b64 %r279,%r278,3; add.u64 %r280,%r185,%r279; .loc 1 108 67 shl.b64 %r282,%r53,3; add.u64 %r283,%frame,%r282; .loc 1 108 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r118,%r285,%r286; .loc 1 110 29288; call (%value_in),_gfortrani_xmallocarray289,[%value_in]; } .loc 1 110 27 st.u64 [%r185],%r289; .loc 1 111 10 setp.eq.u64 %r291,%r118,0; @ ! 
%r291 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r100,%r49,%r100; $L17: .loc 1 101 4 st.u64 [%r155+8],%r292; ld.u64 %r49,[%r146]; add.u64 %r293,%r49,-1; st.u64 [%r155+16],%r293; st.u64 [%r155],%r100; .loc 1 94 7 add.u64 %r155,%r155,24; add.u64 %r146,%r146,8; setp.ne.u64 %r294,%r134,%r155; @ %r294 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r185+48],%r118; mov.u64 %r296,-1; st.u64 [%r185+56],%r296; mov.u64 %r297,1; st.u64 [%r185+40],%r297; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r59,[%r185+28]; .loc 1 121 10 setp.eq.u64 %r298,%r59,%r112; @ %r298 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r112; st.u64 [%stack],%r59;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r301,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r302,[%r301+36]; setp.eq.u32 %r303,%r302,0; @ %r303 bra $L21; .loc 1 128 2 add.u64 %r358,%frame,240; cvta.const.u64 %r307,$LC2r35306307; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r309,%r112,0; @ %r309 bra $L24; $L27: .loc 1 68 9 add.u64 %r310,%r30,1; .loc 1 68 7 sub.u64 %r114,%r310,%r32; max.s64 %r131,%r114,0; .loc 1 140 8 ld.u64 %r124,[%r186]; .loc 1 141 8 ld.u64 %r125,[%r185]; .loc 1 168 42 shl.b64 %r66,%r116,3; ld.u64 %r135,[%frame+360]; .loc 1 194 22 ld.u64 %r70,[%frame+120]; .loc 1 194 12 shl.b64 %r71,%r70,3; .loc 1 195 22 ld.u64 %r73,[%frame]; .loc 1 195 12 shl.b64 %r74,%r73,2; .loc 1 197 32 ld.u64 %r61,[%frame+240]; .loc 1 204 23 mul.lo.u64 %r311,%r61,%r70; .loc 1 204 9 shl.b64 %r312,%r311,3; neg.s64 %r149,%r312; .loc 1 205 23 mul.lo.u64 %r313,%r61,%r73; .loc 1 205 9 shl.b64 %r314,%r313,2; neg.s64 %r132,%r314; setp.gt.s64 %r360,%r114,0; setp.le.s64 %r361,%r112,1; .loc 1 157 9 mov.u32 %r363,1; setp.eq.u32 %r364,%r188,0; cvt.u32.u32 %r365,%r22; cvt.s64.s8 %r366,%r365; add.u64 %r367,%r366,-1; add.u64 %r368,%frame,120; add.u64 %r369,%frame,240; bra $L25; $L24: add.u64 %r168,%frame,360; add.u64 %r167,%r185,40; mov.u64 %r165,%frame; cvt.u32.u32 %r316,%r22; cvt.s64.s8 %r315,%r316; add.u64 %r156,%r315,-1; .loc 1 132 3 mov.u64 %r126,0; add.u64 %r358,%frame,240; .loc 1 134 16 mov.u64 %r317,%r126; $L26: st.u64 [%r168],%r317; .loc 1 135 18 ld.u64 %r318,[%r167]; st.u64 [%r165],%r318; .loc 1 136 17 shl.b64 %r320,%r126,3; add.u64 %r321,%r358,%r320; .loc 1 136 10 ld.u64 %r322,[%r321]; setp.le.s64 %r323,%r322,0; @ %r323 bra $L1; .loc 1 132 26 add.u64 %r126,%r126,1; .loc 1 132 3 add.u64 %r168,%r168,8; add.u64 %r167,%r167,24; add.u64 %r165,%r165,8; setp.ne.u64 %r324,%r126,%r156; @ %r324 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r360 bra $L43; .loc 1 159 10 mov.u32 %r326,0; st.u32 [%r125],%r326; bra $L29; $L43: mov.u64 %r120,%r124; .loc 1 168 19 mov.u64 %r119,0; $L28: .loc 1 170 7 ld.f64 %r105,[%r120]; .loc 1 170 6 setp.ge.f64 %r327,%r105,0dfff0000000000000; @ ! %r327 bra $L57; .loc 1 173 14 cvt.u32.u64 %r328,%r119; add.u32 %r103,%r328,1; .loc 1 180 6 setp.le.s64 %r329,%r131,%r119; @ %r329 bra $L32; add.u64 %r171,%r119,1; .loc 1 170 7 mov.f64 %r183,%r105; bra $L33; $L57: .loc 1 168 34 add.u64 %r119,%r119,1; .loc 1 168 42 add.u64 %r120,%r120,%r66; .loc 1 168 12 setp.lt.s64 %r330,%r119,%r131; @ %r330 bra $L28; .loc 1 157 9 mov.u32 %r103,%r363; bra $L32; $L33: .loc 1 182 7 @ %r364 bra $L34; .loc 1 182 6 setp.le.f64 %r332,%r105,%r183; @ %r332 bra $L35; bra $L36; $L34: setp.lt.f64 %r333,%r105,%r183; @ ! 
%r333 bra $L36; $L35: .loc 1 185 14 cvt.u32.u64 %r103,%r171; .loc 1 184 14 mov.f64 %r105,%r183; $L36: .loc 1 180 31 add.u64 %r120,%r120,%r66; .loc 1 180 6 add.u64 %r172,%r171,1; setp.eq.u64 %r334,%r131,%r171; @ %r334 bra $L32; .loc 1 182 14 ld.f64 %r183,[%r120]; mov.u64 %r171,%r172; bra $L33; $L32: .loc 1 189 12 st.u32 [%r125],%r103; $L29: .loc 1 193 15 add.u64 %r135,%r135,1; .loc 1 194 12 add.u64 %r124,%r124,%r71; .loc 1 195 12 add.u64 %r125,%r125,%r74; .loc 1 197 13 setp.ne.u64 %r335,%r61,%r135; @ %r335 bra $L25; .loc 1 204 9 add.u64 %r121,%r124,%r149; .loc 1 205 9 add.u64 %r122,%r125,%r132; .loc 1 207 7 @ %r361 bra $L1; add.u64 %r137,%frame,368; mov.u64 %r180,8; .loc 1 206 5 mov.u64 %r123,1; .loc 1 201 13 mov.u64 %r362,0; bra $L40; $L41: st.u64 [%r137],%r362; .loc 1 204 23 mul.lo.u64 %r341,%r87,%r86; .loc 1 204 9 shl.b64 %r342,%r341,3; sub.u64 %r121,%r124,%r342; .loc 1 205 23 mul.lo.u64 %r343,%r90,%r86; .loc 1 205 9 shl.b64 %r344,%r343,2; sub.u64 %r122,%r125,%r344; .loc 1 206 5 add.u64 %r123,%r123,1; .loc 1 207 7 add.u64 %r137,%r137,8; add.u64 %r180,%r180,8; setp.eq.u64 %r345,%r123,%r367; @ %r345 bra $L1; $L40: .loc 1 215 16 ld.u64 %r346,[%r137]; add.u64 %r86,%r346,1; st.u64 [%r137],%r86; .loc 1 216 23 add.u64 %r348,%r368,%r180; ld.u64 %r87,[%r348]; .loc 1 216 13 shl.b64 %r349,%r87,3; add.u64 %r124,%r121,%r349; .loc 1 217 23 add.u64 %r350,%frame,%r180; ld.u64 %r90,[%r350]; .loc 1 217 13 shl.b64 %r351,%r90,2; add.u64 %r125,%r122,%r351; .loc 1 197 32 add.u64 %r353,%r369,%r180; ld.u64 %r94,[%r353]; .loc 1 197 13 setp.eq.u64 %r354,%r86,%r94; @ %r354 bra $L41; .loc 1 201 13 mov.u64 %r135,0; bra $L25; $L56: .loc 1 90 6 ld.u64 %r355,[%r185]; setp.eq.u64 %r356,%r355,0; @ ! %r356 bra $L11; bra $L42; $L1:_gfortran_mmaxloc1_4_r8 .visible .func _gfortran_mmaxloc1_4_r8, .param .u32 %in_ar4) {608u32 %r267; .reg .u32 %r269; .reg .u32 %r270; .reg .u32 %r271; .reg .u32 %r273; .reg .u32predu64 %r293; .reg .predu64 %r302; .reg .u64 %r303; .reg .u64 %r305; .reg .u64 %r306; .reg .u64u64 %r327; .reg .u32u64 %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64pred %r339; .reg .u64 %r341; .reg .u64 %r343; .reg .pred %r345; .reg .predpred %r383; .reg .u64 %r384; .reg .u64 %r386; .reg .u32 %r387; .reg .pred %r388; .reg .u64 %r391; .reg .u64pred %r409; .reg .u64 %r410; .reg .u64 %r411; .reg .u64 %r412; .reg .pred %r414; .reg .u32pred %r418; .reg .pred %r419; .reg .pred %r422; .reg .pred %r423; .reg .pred %r425; .reg .pred %r426; .reg .predu64predmov.u64 %r223,%ar0; mov.u64 %r224,%ar1; mov.u64 %r225,%ar2; mov.u64 %r226,%ar3; mov.u32 %r227,%ar4; .loc 1 251 6 setp.ne.u64 %r228,%r226,0; @ %r228 bra $L227; call _gfortran_maxloc1_4_r87 bra $L58; $L59: .loc 1 261 10 ld.u64 %r22,[%r225]; .loc 1 261 7 add.u64 %r159,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r224+28]; .loc 1 262 38 add.u32 %r233,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r160,%r233; .loc 1 265 7 shr.u64 %r235,%r159,63; set.u32.gt.s64 %r237,%r159,%r160; neg.s32 %r238,%r237; cvt.u16.u64 %r241,%r235; cvt.u16.u32 %r242,%r238; or.b16 %r240,%r241,%r242; .loc 1 265 6 cvt.u32.u16 %r243,%r240; cvt.u16.u8 %r244,%r243; setp.eq.u16 %r245,%r244,0; @ %r245 bra $L61; .loc 1 267 7 cvt.u32.u32 %r248,%r23; cvt.s64.s8 %r247,%r248; st.u64 [%stack+8],%r247; st.u64 [%stack],%r22; cvta.const.u64 %r246246_gfortran_runtime_error61: .loc 1 272 9 add.u64 %r456,%r159,%r159; add.u64 %r457,%r456,%r159; shl.b64 %r253,%r457,3; add.u64 %r254,%r224,%r253; ld.u64 %r257,[%r254+56]; add.u64 %r256,%r257,1; .loc 1 272 7 ld.u64 %r264,[%r254+48]; sub.u64 %r161,%r256,%r264; .loc 1 273 6 
setp.le.s64 %r265,%r161,0; @ %r265 bra $L58; .loc 1 276 9 ld.u64 %r180,[%r226]; .loc 1 278 15 ld.u64 %r34,[%r226+16]; .loc 1 280 22 cvt.u32.u64 %r35,%r34; .loc 1 280 53 add.u32 %r266,%r35,-4; and.b32 %r267,%r266,-5; set.u32.eq.u32 %r269,%r267,0; neg.s32 %r270,%r269; .loc 1 280 22 add.u32 %r271,%r35,-1; set.u32.le.u32 %r273,%r271,1; neg.s32 %r274,%r273; .loc 1 280 6 cvt.u16.u32 %r276,%r270; cvt.u16.u32 %r277,%r274; or.b16 %r275,%r276,%r277; cvt.u32.u16 %r278,%r275; cvt.u16.u8 %r279,%r278; setp.ne.u16 %r280,%r279,0; @ %r280 bra $L63; .loc 1 282 7 setp.ne.u32 %r282,%r35,16; @ %r282 bra $L64; $L63: .loc 1 289 9 shl.b64 %r286,%r457,3; add.u64 %r287,%r224,%r286; ld.u64 %r164,[%r287+40]; .loc 1 290 12 add.u64 %r293,%r226,%r286; ld.u64 %r41,[%r293+40]; .loc 1 292 3 setp.ne.u64 %r295,%r159,0; @ %r295 bra $L65; $L71: .loc 1 302 3 setp.lt.s64 %r296,%r159,%r160; @ %r296 bra $L66; bra $L129; $L64: .loc 1 287 5 cvta.const.u64 %r2977_gfortran_runtime_error65: add.u64 %r91,%r224,40; add.u64 %r40,%r226,40; add.u64 %r300,%r22,%r22; add.u64 %r301,%r300,%r22; shl.b64 %r302,%r301,3; add.u64 %r303,%r224,16; add.u64 %r186,%r302,%r303; .loc 1 292 3 mov.u64 %r148,0; add.u64 %r453,%frame,240; add.u64 %r452,%frame,360; .loc 1 299 12 mov.u64 %r467,%r148; $L70: .loc 1 294 18 add.u64 %r305,%r453,%r148; ld.u64 %r306,[%r91]; st.u64 [%r305],%r306; .loc 1 295 18 add.u64 %r307,%frame,%r148; .loc 1 295 20 ld.u64 %r309,[%r40]; mul.lo.u64 %r308,%r309,%r34; .loc 1 295 18 st.u64 [%r307],%r308; .loc 1 296 19 ld.u64 %r311,[%r91+16]; add.u64 %r310,%r311,1; ld.u64 %r312,[%r91+8]; sub.u64 %r52,%r310,%r312; .loc 1 298 10 setp.lt.s64 %r313,%r52,0; @ %r313 bra $L68; .loc 1 296 17 add.u64 %r315,%r452,%r148; st.u64 [%r315],%r52; bra $L69; $L68: .loc 1 299 12 add.u64 %r317,%r452,%r148; st.u64 [%r317],%r467; $L69: .loc 1 292 3 add.u64 %r91,%r91,24; add.u64 %r40,%r40,24; add.u64 %r148,%r148,8; setp.ne.u64 %r319,%r186,%r91; @ %r319 bra $L70; bra $L71; $L129: .loc 1 312 6 ld.u64 %r320,[%r223]; setp.eq.u64 %r321,%r320,0; @ ! 
%r321 bra $L73; bra $L72; $L66: add.u64 %r323,%r22,%r22; add.u64 %r324,%r323,%r22; shl.b64 %r325,%r324,3; add.u64 %r116,%r325,40; add.u64 %r133,%r224,%r116; add.u64 %r87,%r226,%r116; shl.b64 %r326,%r22,3; add.u64 %r157,%r326,-8; cvt.u32.u32 %r328,%r23; cvt.s64.s8 %r327,%r328; shl.b64 %r329,%r327,3; add.u64 %r96,%r329,-8; add.u64 %r453,%frame,240; add.u64 %r452,%frame,360; .loc 1 309 12 mov.u64 %r466,0; $L76: .loc 1 304 18 add.u64 %r331,%r453,%r157; ld.u64 %r332,[%r133]; st.u64 [%r331],%r332; .loc 1 305 18 add.u64 %r333,%frame,%r157; .loc 1 305 20 ld.u64 %r335,[%r87]; mul.lo.u64 %r334,%r335,%r34; .loc 1 305 18 st.u64 [%r333],%r334; .loc 1 306 19 ld.u64 %r337,[%r133+16]; add.u64 %r336,%r337,1; ld.u64 %r338,[%r133+8]; sub.u64 %r61,%r336,%r338; .loc 1 308 10 setp.lt.s64 %r339,%r61,0; @ %r339 bra $L74; .loc 1 306 17 add.u64 %r341,%r452,%r157; st.u64 [%r341],%r61; bra $L75; $L74: .loc 1 309 12 add.u64 %r343,%r452,%r157; st.u64 [%r343],%r466; $L75: .loc 1 302 3 add.u64 %r133,%r133,24; add.u64 %r87,%r87,24; add.u64 %r157,%r157,8; setp.ne.u64 %r345,%r96,%r157; @ %r345 bra $L76; bra $L130; $L72: .loc 1 316 7 setp.eq.u64 %r346,%r160,0; @ %r346 bra $L78; add.u64 %r452,%frame,360; $L106: add.u64 %r146,%r223,40; mov.u64 %r143,%r452; cvt.u32.u32 %r348,%r23; cvt.s64.s8 %r347,%r348; add.u64 %r350,%r347,%r347; add.u64 %r351,%r350,%r347; shl.b64 %r352,%r351,3; add.u64 %r353,%r223,16; add.u64 %r134,%r352,%r353; .loc 1 319 10 mov.u64 %r139,1; .loc 1 323 4 mov.u64 %r372,0; bra $L79; $L78: .loc 1 327 20 add.u32 %r354,%r23,-2; cvt.s64.s32 %r71,%r354; add.u64 %r356,%r71,%r71; add.u64 %r357,%r356,%r71; shl.b64 %r358,%r357,3; add.u64 %r359,%r223,%r358; .loc 1 327 67 shl.b64 %r361,%r71,3; add.u64 %r362,%frame,%r361; .loc 1 327 59 ld.u64 %r364,[%r359+40]; ld.u64 %r365,[%r362+360]; mul.lo.u64 %r168,%r364,%r365; .loc 1 329 24 mov.u64 %r366,0; st.u64 [%r223+8],%r366; .loc 1 330 28 cvt.u16.u32 %r369,%r23; add.u16 %r368,%r369,-1; cvt.u32.u16 %r370,%r368; st.u8 [%r223+28],%r370; .loc 1 332 10 setp.eq.u64 %r371,%r168,0; @ %r371 bra $L80; bra $L131; $L82: .loc 1 321 47 mul.lo.u64 %r139,%r68,%r139; $L79: .loc 1 323 4 st.u64 [%r146+8],%r372; ld.u64 %r68,[%r143]; add.u64 %r373,%r68,-1; st.u64 [%r146+16],%r373; st.u64 [%r146],%r139; .loc 1 316 7 add.u64 %r146,%r146,24; add.u64 %r143,%r143,8; setp.ne.u64 %r374,%r134,%r146; @ %r374 bra $L82; bra $L78; $L80: .loc 1 335 4 st.u64 [%r223+48],%r168; mov.u64 %r376,-1; st.u64 [%r223+56],%r376; mov.u64 %r377,1; st.u64 [%r223+40],%r377; .loc 1 336 4 bra $L58; $L131: .loc 1 339 24call (%value_in),_gfortrani_xmallocarray80,[%value_in]; } .loc 1 339 22 st.u64 [%r223],%r380; bra $L83; $L73: .loc 1 344 19 ld.s8 %r382,[%r223+28]; .loc 1 344 10 setp.eq.u64 %r383,%r382,%r160; @ %r383 bra $L84; .loc 1 345 2 cvta.const.u64 %r3843stack; call _gfortran_runtime_error84: .loc 1 347 11 cvta.global.u64 %r386,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r387,[%r386+36]; setp.eq.u32 %r388,%r387,0; @ %r388 bra $L83; .loc 1 349 4 add.u64 %r452,%frame,360; cvta.const.u64 %r392,$LC239392; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r396,$LC396392; call _gfortrani_bounds_equal_extents$L83: .loc 1 356 3 setp.ne.u64 %r398,%r160,0; @ %r398 bra $L85; $L88: .loc 1 364 8 ld.u64 %r181,[%r223]; .loc 1 365 8 ld.u64 %r179,[%r224]; .loc 1 367 9 setp.ne.u64 %r399,%r179,0; @ %r399 bra $L86; bra $L58; $L85: add.u64 %r199,%frame,480; add.u64 %r195,%r223,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r401,%r23; cvt.s64.s8 %r400,%r401; add.u64 %r149,%r400,-1; .loc 1 356 3 mov.u64 %r182,0; add.u64 %r452,%frame,360; 
.loc 1 358 16 mov.u64 %r402,%r182; $L87: st.u64 [%r199],%r402; .loc 1 359 18 ld.u64 %r403,[%r195]; st.u64 [%r187],%r403; .loc 1 360 17 shl.b64 %r405,%r182,3; add.u64 %r406,%r452,%r405; .loc 1 360 10 ld.u64 %r407,[%r406]; setp.le.s64 %r408,%r407,0; @ %r408 bra $L58; .loc 1 356 26 add.u64 %r182,%r182,1; .loc 1 356 3 add.u64 %r199,%r199,8; add.u64 %r195,%r195,24; add.u64 %r187,%r187,8; setp.ne.u64 %r409,%r149,%r182; @ %r409 bra $L87; bra $L88; $L86: .loc 1 290 12 mul.lo.u64 %r43,%r41,%r34; .loc 1 386 32 shl.b64 %r86,%r164,3; ld.u64 %r39,[%frame+480]; .loc 1 430 22 ld.u64 %r100,[%frame+240]; .loc 1 430 12 shl.b64 %r101,%r100,3; .loc 1 431 23 ld.u64 %r103,[%frame]; .loc 1 432 22 ld.u64 %r105,[%frame+120]; .loc 1 432 12 shl.b64 %r106,%r105,2; .loc 1 434 32 ld.u64 %r172,[%frame+360]; .loc 1 441 23 mul.lo.u64 %r38,%r172,%r100; .loc 1 442 24 mul.lo.u64 %r410,%r172,%r103; .loc 1 442 10 neg.s64 %r94,%r410; .loc 1 443 23 mul.lo.u64 %r411,%r172,%r105; .loc 1 443 9 shl.b64 %r412,%r411,2; neg.s64 %r140,%r412; setp.le.s64 %r451,%r160,1; .loc 1 408 9 setp.ne.u32 %r459,%r227,0; cvt.u32.u32 %r460,%r23; cvt.s64.s8 %r461,%r460; add.u64 %r462,%r461,-1; add.u64 %r463,%frame,240; add.u64 %r464,%frame,360; add.u64 %r465,%frame,120; $L107: .loc 1 356 3 mov.u64 %r174,%r180; mov.u64 %r173,%r179; .loc 1 383 16 mov.u32 %r147,0; .loc 1 386 9 mov.u64 %r171,0; $L94: .loc 1 389 7 ld.s8 %r222,[%r174]; .loc 1 389 6 setp.eq.u32 %r454,%r222,0; @ %r454 bra $L89; .loc 1 392 10 setp.ne.u32 %r414,%r147,0; @ %r414 bra $L90; cvt.u32.u64 %r415,%r171; add.u32 %r147,%r415,1; $L90: .loc 1 394 11 ld.f64 %r145,[%r173]; .loc 1 394 10 setp.ge.f64 %r416,%r145,0dfff0000000000000; @ ! %r416 bra $L89; .loc 1 398 11 cvt.u32.u64 %r417,%r171; add.u32 %r170,%r417,1; .loc 1 404 9 setp.le.s64 %r418,%r161,%r171; @ ! %r418 bra $L132; bra $L92; $L89: .loc 1 386 24 add.u64 %r171,%r171,1; .loc 1 386 32 add.u64 %r173,%r173,%r86; .loc 1 386 47 add.u64 %r174,%r174,%r43; .loc 1 386 2 setp.ne.u64 %r419,%r161,%r171; @ %r419 bra $L94; bra $L92; $L132: .loc 1 408 9 @ %r459 bra $L95; add.u64 %r205,%r171,1; .loc 1 398 11 mov.u32 %r147,%r170; bra $L96; $L95: add.u64 %r209,%r171,1; mov.u32 %r147,%r170; $L99: .loc 1 411 8 @ %r454 bra $L97; .loc 1 411 18 ld.f64 %r89,[%r173]; .loc 1 411 15 setp.ge.f64 %r422,%r89,%r145; @ ! %r422 bra $L97; .loc 1 414 16 cvt.u32.u64 %r147,%r209; .loc 1 413 16 mov.f64 %r145,%r89; $L97: .loc 1 409 33 add.u64 %r173,%r173,%r86; .loc 1 409 48 add.u64 %r174,%r174,%r43; .loc 1 409 8 add.u64 %r210,%r209,1; setp.eq.u64 %r423,%r161,%r209; @ %r423 bra $L92; .loc 1 411 9 ld.s8 %r222,[%r174]; mov.u64 %r209,%r210; setp.eq.u32 %r454,%r222,0; bra $L99; $L96: .loc 1 420 8 @ %r454 bra $L100; .loc 1 420 18 ld.f64 %r93,[%r173]; .loc 1 420 15 setp.gt.f64 %r425,%r93,%r145; @ ! 
%r425 bra $L100; .loc 1 423 16 cvt.u32.u64 %r147,%r205; .loc 1 422 16 mov.f64 %r145,%r93; $L100: .loc 1 418 33 add.u64 %r173,%r173,%r86; .loc 1 418 48 add.u64 %r174,%r174,%r43; .loc 1 418 8 add.u64 %r206,%r205,1; setp.eq.u64 %r426,%r161,%r205; @ %r426 bra $L92; .loc 1 420 9 ld.s8 %r222,[%r174]; mov.u64 %r205,%r206; setp.eq.u32 %r454,%r222,0; bra $L96; $L92: .loc 1 426 8 st.u32 [%r181],%r147; .loc 1 429 15 add.u64 %r39,%r39,1; .loc 1 430 12 add.u64 %r179,%r179,%r101; .loc 1 431 13 add.u64 %r180,%r180,%r103; .loc 1 432 12 add.u64 %r181,%r181,%r106; .loc 1 434 13 setp.ne.u64 %r427,%r39,%r172; @ %r427 bra $L107; .loc 1 442 10 add.u64 %r176,%r180,%r94; .loc 1 443 9 add.u64 %r177,%r181,%r140; .loc 1 445 7 @ %r451 bra $L58; add.u64 %r158,%frame,488; .loc 1 441 23 mov.u64 %r110,%r38; .loc 1 445 7 mov.u64 %r219,8; .loc 1 444 5 mov.u64 %r178,1; .loc 1 438 13 mov.u64 %r458,0; bra $L104; $L105: st.u64 [%r158],%r458; .loc 1 441 23 mul.lo.u64 %r110,%r124,%r123; .loc 1 442 24 mul.lo.u64 %r433,%r126,%r123; .loc 1 442 10 sub.u64 %r176,%r180,%r433; .loc 1 443 23 mul.lo.u64 %r434,%r128,%r123; .loc 1 443 9 shl.b64 %r435,%r434,2; sub.u64 %r177,%r181,%r435; .loc 1 444 5 add.u64 %r178,%r178,1; .loc 1 445 7 add.u64 %r158,%r158,8; add.u64 %r219,%r219,8; setp.eq.u64 %r436,%r178,%r462; @ %r436 bra $L58; $L104: .loc 1 453 16 ld.u64 %r437,[%r158]; add.u64 %r123,%r437,1; st.u64 [%r158],%r123; .loc 1 454 23 add.u64 %r439,%r463,%r219; ld.u64 %r124,[%r439]; .loc 1 454 13 sub.u64 %r440,%r124,%r110; shl.b64 %r441,%r440,3; add.u64 %r179,%r179,%r441; .loc 1 455 24 add.u64 %r442,%frame,%r219; ld.u64 %r126,[%r442]; .loc 1 455 14 add.u64 %r180,%r176,%r126; .loc 1 456 23 add.u64 %r444,%r465,%r219; ld.u64 %r128,[%r444]; .loc 1 456 13 shl.b64 %r445,%r128,2; add.u64 %r181,%r177,%r445; .loc 1 434 32 add.u64 %r447,%r464,%r219; ld.u64 %r131,[%r447]; .loc 1 434 13 setp.eq.u64 %r448,%r123,%r131; @ %r448 bra $L105; .loc 1 438 13 mov.u64 %r39,0; bra $L107; $L130: .loc 1 312 6 ld.u64 %r449,[%r223]; setp.eq.u64 %r450,%r449,0; @ ! 
%r450 bra $L73; bra $L106; $L5_gfortran_smaxloc1_4_r8 .visible .func _gfortran_smaxloc1_4_r8, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u32 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L134; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L135; $L134:r152154; call _gfortran_maxloc1_4_r8490 7 bra $L133; $L135: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r296,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L137; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L138; $L144: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L139; bra $L192; $L137: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error138: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r302,0; $L143: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L141; .loc 1 505 17 st.u64 [%r139],%r36; bra $L142; $L141: .loc 1 508 12 st.u64 [%r139],%r302; $L142: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L143; bra $L144; $L192: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L146; bra $L145; $L139: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r301,0; $L149: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L147; .loc 1 513 17 st.u64 [%r43],%r41; bra $L148; $L147: .loc 1 517 12 st.u64 [%r43],%r301; $L148: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L149; bra $L193; $L145: .loc 1 524 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L151; add.u64 %r295,%frame,120; $L171: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L152; $L151: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L153; bra $L194; $L155: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L152: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L155; bra $L151; $L153: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L133; $L194: .loc 1 547 2496246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L159: .loc 1 573 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L156; bra $L157; $L146: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L158; .loc 1 552 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error158: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L159; .loc 1 559 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L157; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L161: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L160; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error160: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L161; $L156: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ %r268 bra 
$L162; shl.b64 %r266,%r296,3; bra $L163; $L162: mov.u64 %r266,8; $L163: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L164: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L164; $L157: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,2; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 583 13 mov.u32 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L170: st.u32 [%r98],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L170; .loc 1 596 7 @ ! %r294 bra $L195; bra $L133; $L169: .loc 1 591 13 st.u64 [%r99],%r297; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L168; bra $L133; $L195: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; .loc 1 591 13 mov.u64 %r297,0; $L168: .loc 1 600 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,2; add.u64 %r98,%r98,%r291; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L169; .loc 1 591 13 mov.u64 %r80,0; bra $L170; $L193: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r293,%r98,0; @ ! 
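The three entry points listed above follow libgfortran's usual naming for array intrinsics: _gfortran_maxloc1_4_r8 is MAXLOC reduced along one dimension over real(8) data with an integer(4) result, _gfortran_mmaxloc1_4_r8 adds an array MASK, and _gfortran_smaxloc1_4_r8 takes a scalar mask. As a rough illustration only (not the libgfortran source, which operates on descriptor-based arrays of arbitrary rank), here is a minimal C sketch of the unmasked reduction over a column-major matrix; the function and parameter names are invented for the example:

/* Sketch: MAXLOC along dim=1 of an m-by-n column-major matrix of doubles,
 * returning 1-based row indices as 32-bit integers (one per column).
 * Illustrative only; not the libgfortran implementation. */
#include <stddef.h>
#include <stdint.h>

static void maxloc_dim1_sketch(int32_t *restrict loc, const double *restrict a,
                               size_t m, size_t n)
{
    for (size_t j = 0; j < n; ++j) {
        const double *col = a + j * m;        /* column j, Fortran-style layout */
        size_t best = 0;
        for (size_t i = 1; i < m; ++i)
            if (col[i] > col[best])           /* strict '>' keeps the first maximum */
                best = i;
        loc[j] = m ? (int32_t)(best + 1) : 0; /* Fortran MAXLOC is 1-based; 0 for an empty extent */
    }
}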
// maxloc1_8_r8.o  (.file 1 "../../../libgfortran/generated/maxloc1_8_r8.c")
// GLOBAL FUNCTION DECLs (.visible .func): _gfortran_maxloc1_8_r8, _gfortran_mmaxloc1_8_r8, _gfortran_smaxloc1_8_r8
// extern decls: _gfortrani_compile_options (.global .align 8 .u64); _gfortran_runtime_error,
//   _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents (.extern .func)
// string constants: same MAXLOC runtime-error messages as in maxloc1_4_r8.o above
// [PTX bodies of _gfortran_maxloc1_8_r8, _gfortran_mmaxloc1_8_r8 and _gfortran_smaxloc1_8_r8]
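The masked variants (_gfortran_mmaxloc1_8_r8 here, _gfortran_mmaxloc1_4_r8 above) take an additional logical MASK array of the same shape; the recovered error strings ("MASK argumen…", "Funny sized logical array…") belong to its argument checks. A hedged sketch of that masked reduction, again with invented names and a plain unsigned char standing in for the Fortran logical kind:

/* Sketch: MAXLOC along dim=1 with a MASK; positions where the mask is false are
 * ignored, and a column with no true mask entry yields 0.
 * Illustrative only; not the libgfortran implementation. */
#include <stddef.h>
#include <stdint.h>

static void mmaxloc_dim1_sketch(int32_t *restrict loc, const double *restrict a,
                                const unsigned char *restrict mask,
                                size_t m, size_t n)
{
    for (size_t j = 0; j < n; ++j) {
        const double *col = a + j * m;
        const unsigned char *mcol = mask + j * m;
        size_t best = 0;
        int found = 0;
        for (size_t i = 0; i < m; ++i)
            if (mcol[i] && (!found || col[i] > col[best])) { /* first true element seeds the search */
                best = i;
                found = 1;
            }
        loc[j] = found ? (int32_t)(best + 1) : 0;
    }
}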
// maxloc1_16_r8.o  (.file 1 "../../../libgfortran/generated/maxloc1_16_r8.c")
// GLOBAL FUNCTION DECLs (.visible .func): _gfortran_maxloc1_16_r8, _gfortran_mmaxloc1_16_r8, _gfortran_smaxloc1_16_r8
// extern decls: _gfortrani_compile_options (.global .align 8 .u64); _gfortran_runtime_error,
//   _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents (.extern .func)
// string constants: same MAXLOC runtime-error messages as in maxloc1_4_r8.o above
// [PTX body of _gfortran_maxloc1_16_r8 begins; the listing continues below]
%r297 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r103,%r49,%r103; $L17: .loc 1 101 4 st.u64 [%r146+8],%r298; ld.u64 %r49,[%r143]; add.u64 %r299,%r49,-1; st.u64 [%r146+16],%r299; st.u64 [%r146],%r103; .loc 1 94 7 add.u64 %r146,%r146,24; add.u64 %r143,%r143,8; setp.ne.u64 %r300,%r111,%r146; @ %r300 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r191+48],%r124; mov.u64 %r302,-1; st.u64 [%r191+56],%r302; mov.u64 %r303,1; st.u64 [%r191+40],%r303; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r59,[%r191+28]; .loc 1 121 10 setp.eq.u64 %r304,%r59,%r118; @ %r304 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r59;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r307,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r308,[%r307+36]; setp.eq.u32 %r309,%r308,0; @ %r309 bra $L21; .loc 1 128 2 add.u64 %r434,%frame,240; cvta.const.u64 %r313,$LC2r4312313; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r315,%r118,0; @ %r315 bra $L24; $L27: .loc 1 68 9 add.u64 %r316,%r30,1; .loc 1 68 7 sub.u64 %r120,%r316,%r32; max.s64 %r64,%r120,0; .loc 1 140 8 ld.u64 %r133,[%r192]; .loc 1 141 8 ld.u64 %r134,[%r191]; .loc 1 168 42 shl.b64 %r66,%r122,3; ld.u64 %r142,[%frame+360]; .loc 1 194 22 ld.u64 %r71,[%frame+120]; .loc 1 194 12 shl.b64 %r72,%r71,3; .loc 1 195 22 ld.u64 %r74,[%frame]; .loc 1 195 12 shl.b64 %r75,%r74,4; .loc 1 197 32 ld.u64 %r61,[%frame+240]; .loc 1 204 23 mul.lo.u64 %r317,%r61,%r71; .loc 1 204 9 shl.b64 %r318,%r317,3; neg.s64 %r154,%r318; .loc 1 205 23 mul.lo.u64 %r319,%r61,%r74; .loc 1 205 9 shl.b64 %r320,%r319,4; neg.s64 %r140,%r320; setp.gt.s64 %r436,%r120,0; setp.le.s64 %r437,%r118,1; add.u64 %r439,%r64,-1; setp.eq.u32 %r440,%r194,0; cvt.u32.u32 %r441,%r22; cvt.s64.s8 %r442,%r441; add.u64 %r443,%r442,-1; add.u64 %r444,%frame,120; add.u64 %r445,%frame,240; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r191,40; mov.u64 %r162,%frame; cvt.u32.u32 %r322,%r22; cvt.s64.s8 %r321,%r322; add.u64 %r148,%r321,-1; .loc 1 132 3 mov.u64 %r135,0; add.u64 %r434,%frame,240; .loc 1 134 16 mov.u64 %r323,%r135; $L26: st.u64 [%r165],%r323; .loc 1 135 18 ld.u64 %r324,[%r164]; st.u64 [%r162],%r324; .loc 1 136 17 shl.b64 %r326,%r135,3; add.u64 %r327,%r434,%r326; .loc 1 136 10 ld.u64 %r328,[%r327]; setp.le.s64 %r329,%r328,0; @ %r329 bra $L1; .loc 1 132 26 add.u64 %r135,%r135,1; .loc 1 132 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r330,%r135,%r148; @ %r330 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r436 bra $L44; .loc 1 159 10 mov.u64 %r332,0; st.u64 [%r134],%r332; st.u64 [%r134+8],%r332; bra $L29; $L44: mov.u64 %r126,%r133; .loc 1 168 19 mov.u64 %r145,0; $L28: .loc 1 170 7 ld.f64 %r108,[%r126]; mov.u64 %r127,%r145; .loc 1 168 34 add.u64 %r145,%r145,1; .loc 1 170 6 setp.ge.f64 %r334,%r108,0dfff0000000000000; @ ! 
%r334 bra $L59; .loc 1 173 14 mov.u64 %r413,%r145; shr.s64 %r414,%r413,63; .loc 1 180 6 setp.le.s64 %r336,%r64,%r127; @ %r336 bra $L32; add.u64 %r417,%r127,1; shr.s64 %r418,%r417,63; sub.u64 %r340,%r439,%r127; shr.s64 %r343,%r127,63; add.u64 %r346,%r127,2; set.u32.lt.u64 %r351,%r346,%r127; cvt.s64.s32 %r349,%r351; sub.u64 %r355,%r343,%r349; add.u64 %r357,%r340,%r346; set.u32.lt.u64 %r362,%r357,%r340; cvt.s64.s32 %r360,%r362; sub.u64 %r366,%r355,%r360; .loc 1 170 7 mov.f64 %r185,%r108; .loc 1 180 6 mov.u64 %r429,1; bra $L33; $L59: .loc 1 168 42 add.u64 %r126,%r126,%r66; .loc 1 168 12 setp.gt.s64 %r367,%r64,%r145; @ %r367 bra $L28; .loc 1 157 9 mov.u64 %r413,1; mov.u64 %r414,0; bra $L32; $L33: .loc 1 182 7 @ %r440 bra $L34; .loc 1 182 6 setp.le.f64 %r369,%r108,%r185; @ %r369 bra $L35; bra $L36; $L34: setp.lt.f64 %r370,%r108,%r185; @ ! %r370 bra $L36; $L35: .loc 1 185 14 mov.u64 %r413,%r417; mov.u64 %r414,%r418; .loc 1 184 14 mov.f64 %r108,%r185; $L36: .loc 1 180 31 add.u64 %r126,%r126,%r66; .loc 1 180 6 add.u64 %r373,%r417,%r429; set.u32.lt.u64 %r378,%r373,%r417; cvt.s64.s32 %r376,%r378; mov.u64 %r417,%r373; sub.u64 %r418,%r418,%r376; setp.ne.u64 %r385,%r357,%r417; @ %r385 bra $L46; setp.ne.u64 %r388,%r366,%r418; @ ! %r388 bra $L32; $L46: .loc 1 182 14 ld.f64 %r185,[%r126]; bra $L33; $L32: .loc 1 189 12 st.u64 [%r134],%r413; st.u64 [%r134+8],%r414; $L29: .loc 1 193 15 add.u64 %r142,%r142,1; .loc 1 194 12 add.u64 %r133,%r133,%r72; .loc 1 195 12 add.u64 %r134,%r134,%r75; .loc 1 197 13 setp.ne.u64 %r391,%r61,%r142; @ %r391 bra $L25; .loc 1 204 9 add.u64 %r130,%r133,%r154; .loc 1 205 9 add.u64 %r131,%r134,%r140; .loc 1 207 7 @ %r437 bra $L1; add.u64 %r117,%frame,368; mov.u64 %r180,8; .loc 1 206 5 mov.u64 %r132,1; .loc 1 201 13 mov.u64 %r438,0; bra $L41; $L42: st.u64 [%r117],%r438; .loc 1 204 23 mul.lo.u64 %r397,%r89,%r88; .loc 1 204 9 shl.b64 %r398,%r397,3; sub.u64 %r130,%r133,%r398; .loc 1 205 23 mul.lo.u64 %r399,%r92,%r88; .loc 1 205 9 shl.b64 %r400,%r399,4; sub.u64 %r131,%r134,%r400; .loc 1 206 5 add.u64 %r132,%r132,1; .loc 1 207 7 add.u64 %r117,%r117,8; add.u64 %r180,%r180,8; setp.eq.u64 %r401,%r132,%r443; @ %r401 bra $L1; $L41: .loc 1 215 16 ld.u64 %r402,[%r117]; add.u64 %r88,%r402,1; st.u64 [%r117],%r88; .loc 1 216 23 add.u64 %r404,%r444,%r180; ld.u64 %r89,[%r404]; .loc 1 216 13 shl.b64 %r405,%r89,3; add.u64 %r133,%r130,%r405; .loc 1 217 23 add.u64 %r406,%frame,%r180; ld.u64 %r92,[%r406]; .loc 1 217 13 shl.b64 %r407,%r92,4; add.u64 %r134,%r131,%r407; .loc 1 197 32 add.u64 %r409,%r445,%r180; ld.u64 %r95,[%r409]; .loc 1 197 13 setp.eq.u64 %r410,%r88,%r95; @ %r410 bra $L42; .loc 1 201 13 mov.u64 %r142,0; bra $L25; $L58: .loc 1 90 6 ld.u64 %r411,[%r191]; setp.eq.u64 %r412,%r411,0; @ ! 
%r412 bra $L11; bra $L43; $L1:_gfortran_mmaxloc1_16_r8 .visible .func _gfortran_mmaxloc1_16_r8, .param .u32 %in_ar4) {608pred %r239; .reg .u32 %r244; .reg .u64 %r246; .reg .u32 %r248; .reg .u32 %r249; .reg .u16 %r251; .reg .u16 %r252; .reg .u16 %r253; .reg .u32u64 %r268; .reg .u64 %r275; .reg .pred %r276; .reg .u32 %r277; .reg .u32 %r278; .reg .u32 %r280; .reg .u32pred %r291; .reg .pred %r293; .reg .u64predu64 %r312; .reg .u64u64 %r317; .reg .u64 %r318; .reg .u64u64pred %r356; .reg .pred %r357; .reg .u64 %r358; .reg .u32pred %r410; .reg .u64pred %r419; .reg .predpred %r427; .reg .predu64 %r454; .reg .u64 %r457; .reg .u32 %r459; .reg .u64 %r463; .reg .u64 %r465; .reg .u64pred %r541; .reg .predu64u64 %r565; .reg .u64 %r567; .reg .pred %r568; .reg .u64 %r569; .reg .pred %r570; .reg .u64 %r571; .reg .u64 %r572; .reg .u64 %r573; .reg .u64.reg .u64 %r633; .reg .u64 %r634; mov.u64 %r234,%ar0; mov.u64 %r235,%ar1; mov.u64 %r236,%ar2; mov.u64 %r237,%ar3; mov.u32 %r238,%ar4; .loc 1 251 6 setp.ne.u64 %r239,%r237,0; @ %r239 bra $L61; .loc 1 254 7r2236238; call _gfortran_maxloc1_16_r87 bra $L60; $L61: .loc 1 261 10 ld.u64 %r22,[%r236]; .loc 1 261 7 add.u64 %r152,%r22,-1; .loc 1 262 10 ld.s8 %r23,[%r235+28]; .loc 1 262 38 add.u32 %r244,%r23,-1; .loc 1 262 8 cvt.s64.s32 %r153,%r244; .loc 1 265 7 shr.u64 %r246,%r152,63; set.u32.gt.s64 %r248,%r152,%r153; neg.s32 %r249,%r248; cvt.u16.u64 %r252,%r246; cvt.u16.u32 %r253,%r249; or.b16 %r251,%r252,%r253; .loc 1 265 6 cvt.u32.u16 %r254,%r251; cvt.u16.u8 %r255,%r254; setp.eq.u16 %r256,%r255,0; @ %r256 bra $L63; .loc 1 267 7 cvt.u32.u32 %r259,%r23; cvt.s64.s8 %r258,%r259; st.u64 [%stack+8],%r258; st.u64 [%stack],%r22; cvta.const.u64 %r25_gfortran_runtime_error63: .loc 1 272 9 add.u64 %r621,%r152,%r152; add.u64 %r617,%r621,%r152; shl.b64 %r264,%r617,3; add.u64 %r265,%r235,%r264; ld.u64 %r268,[%r265+56]; add.u64 %r267,%r268,1; .loc 1 272 7 ld.u64 %r275,[%r265+48]; sub.u64 %r154,%r267,%r275; .loc 1 273 6 setp.le.s64 %r276,%r154,0; @ %r276 bra $L60; .loc 1 276 9 ld.u64 %r174,[%r237]; .loc 1 278 15 ld.u64 %r33,[%r237+16]; .loc 1 280 22 cvt.u32.u64 %r34,%r33; .loc 1 280 53 add.u32 %r277,%r34,-4; and.b32 %r278,%r277,-5; set.u32.eq.u32 %r280,%r278,0; neg.s32 %r281,%r280; .loc 1 280 22 add.u32 %r282,%r34,-1; set.u32.le.u32 %r284,%r282,1; neg.s32 %r285,%r284; .loc 1 280 6 cvt.u16.u32 %r287,%r281; cvt.u16.u32 %r288,%r285; or.b16 %r286,%r287,%r288; cvt.u32.u16 %r289,%r286; cvt.u16.u8 %r290,%r289; setp.ne.u16 %r291,%r290,0; @ %r291 bra $L65; .loc 1 282 7 setp.ne.u32 %r293,%r34,16; @ %r293 bra $L66; $L65: .loc 1 289 9 shl.b64 %r297,%r617,3; add.u64 %r298,%r235,%r297; ld.u64 %r157,[%r298+40]; .loc 1 290 12 add.u64 %r304,%r237,%r297; ld.u64 %r40,[%r304+40]; .loc 1 292 3 setp.ne.u64 %r306,%r152,0; @ %r306 bra $L67; $L73: .loc 1 302 3 setp.lt.s64 %r307,%r152,%r153; @ %r307 bra $L68; bra $L135; $L66: .loc 1 287 5 cvta.const.u64 %r30830_gfortran_runtime_error67: add.u64 %r91,%r235,40; add.u64 %r196,%r237,40; add.u64 %r311,%r22,%r22; add.u64 %r312,%r311,%r22; shl.b64 %r313,%r312,3; add.u64 %r314,%r235,16; add.u64 %r226,%r313,%r314; .loc 1 292 3 mov.u64 %r107,0; add.u64 %r619,%frame,240; add.u64 %r623,%frame,360; .loc 1 299 12 mov.u64 %r634,%r107; $L72: .loc 1 294 18 add.u64 %r316,%r619,%r107; ld.u64 %r317,[%r91]; st.u64 [%r316],%r317; .loc 1 295 18 add.u64 %r318,%frame,%r107; .loc 1 295 20 ld.u64 %r320,[%r196]; mul.lo.u64 %r319,%r320,%r33; .loc 1 295 18 st.u64 [%r318],%r319; .loc 1 296 19 ld.u64 %r322,[%r91+16]; add.u64 %r321,%r322,1; ld.u64 %r323,[%r91+8]; sub.u64 %r52,%r321,%r323; 
.loc 1 298 10 setp.lt.s64 %r324,%r52,0; @ %r324 bra $L70; .loc 1 296 17 add.u64 %r326,%r623,%r107; st.u64 [%r326],%r52; bra $L71; $L70: .loc 1 299 12 add.u64 %r328,%r623,%r107; st.u64 [%r328],%r634; $L71: .loc 1 292 3 add.u64 %r91,%r91,24; add.u64 %r196,%r196,24; add.u64 %r107,%r107,8; setp.ne.u64 %r330,%r91,%r226; @ %r330 bra $L72; bra $L73; $L135: .loc 1 312 6 ld.u64 %r331,[%r234]; setp.eq.u64 %r332,%r331,0; @ ! %r332 bra $L75; bra $L74; $L68: add.u64 %r334,%r22,%r22; add.u64 %r335,%r334,%r22; shl.b64 %r336,%r335,3; add.u64 %r180,%r336,40; add.u64 %r57,%r235,%r180; add.u64 %r182,%r237,%r180; shl.b64 %r337,%r22,3; add.u64 %r74,%r337,-8; cvt.u32.u32 %r339,%r23; cvt.s64.s8 %r338,%r339; shl.b64 %r340,%r338,3; add.u64 %r88,%r340,-8; add.u64 %r619,%frame,240; add.u64 %r623,%frame,360; .loc 1 309 12 mov.u64 %r633,0; $L78: .loc 1 304 18 add.u64 %r342,%r619,%r74; ld.u64 %r343,[%r57]; st.u64 [%r342],%r343; .loc 1 305 18 add.u64 %r344,%frame,%r74; .loc 1 305 20 ld.u64 %r346,[%r182]; mul.lo.u64 %r345,%r346,%r33; .loc 1 305 18 st.u64 [%r344],%r345; .loc 1 306 19 ld.u64 %r348,[%r57+16]; add.u64 %r347,%r348,1; ld.u64 %r349,[%r57+8]; sub.u64 %r64,%r347,%r349; .loc 1 308 10 setp.lt.s64 %r350,%r64,0; @ %r350 bra $L76; .loc 1 306 17 add.u64 %r352,%r623,%r74; st.u64 [%r352],%r64; bra $L77; $L76: .loc 1 309 12 add.u64 %r354,%r623,%r74; st.u64 [%r354],%r633; $L77: .loc 1 302 3 add.u64 %r57,%r57,24; add.u64 %r182,%r182,24; add.u64 %r74,%r74,8; setp.ne.u64 %r356,%r74,%r88; @ %r356 bra $L78; bra $L136; $L74: .loc 1 316 7 setp.eq.u64 %r357,%r153,0; @ %r357 bra $L80; add.u64 %r623,%frame,360; $L110: add.u64 %r133,%r234,40; mov.u64 %r131,%r623; cvt.u32.u32 %r359,%r23; cvt.s64.s8 %r358,%r359; add.u64 %r361,%r358,%r358; add.u64 %r362,%r361,%r358; shl.b64 %r363,%r362,3; add.u64 %r364,%r234,16; add.u64 %r60,%r363,%r364; .loc 1 319 10 mov.u64 %r136,1; .loc 1 323 4 mov.u64 %r383,0; bra $L81; $L80: .loc 1 327 20 add.u32 %r365,%r23,-2; cvt.s64.s32 %r72,%r365; add.u64 %r367,%r72,%r72; add.u64 %r368,%r367,%r72; shl.b64 %r369,%r368,3; add.u64 %r370,%r234,%r369; .loc 1 327 67 shl.b64 %r372,%r72,3; add.u64 %r373,%frame,%r372; .loc 1 327 59 ld.u64 %r375,[%r370+40]; ld.u64 %r376,[%r373+360]; mul.lo.u64 %r161,%r375,%r376; .loc 1 329 24 mov.u64 %r377,0; st.u64 [%r234+8],%r377; .loc 1 330 28 cvt.u16.u32 %r380,%r23; add.u16 %r379,%r380,-1; cvt.u32.u16 %r381,%r379; st.u8 [%r234+28],%r381; .loc 1 332 10 setp.eq.u64 %r382,%r161,0; @ %r382 bra $L82; bra $L137; $L84: .loc 1 321 47 mul.lo.u64 %r136,%r69,%r136; $L81: .loc 1 323 4 st.u64 [%r133+8],%r383; ld.u64 %r69,[%r131]; add.u64 %r384,%r69,-1; st.u64 [%r133+16],%r384; st.u64 [%r133],%r136; .loc 1 316 7 add.u64 %r133,%r133,24; add.u64 %r131,%r131,8; setp.ne.u64 %r385,%r60,%r133; @ %r385 bra $L84; bra $L80; $L82: .loc 1 335 4 st.u64 [%r234+48],%r161; mov.u64 %r387,-1; st.u64 [%r234+56],%r387; mov.u64 %r388,1; st.u64 [%r234+40],%r388; .loc 1 336 4 bra $L60; $L137: .loc 1 339 24 mov.u64 %r390,16390; call (%value_in),_gfortrani_xmallocarray91,[%value_in]; } .loc 1 339 22 st.u64 [%r234],%r391; bra $L85; $L75: .loc 1 344 19 ld.s8 %r393,[%r234+28]; .loc 1 344 10 setp.eq.u64 %r394,%r393,%r153; @ %r394 bra $L86; .loc 1 345 2 cvta.const.u64 %r3955_gfortran_runtime_error86: .loc 1 347 11 cvta.global.u64 %r397,_gfortrani_compile_options; .loc 1 347 10 ld.u32 %r398,[%r397+36]; setp.eq.u32 %r399,%r398,0; @ %r399 bra $L85; .loc 1 349 4 add.u64 %r623,%frame,360; cvta.const.u64 %r403,$LC2;r62302403; call _gfortrani_bounds_ifunction_return4 cvta.const.u64 %r407,$LC207403; call 
_gfortrani_bounds_equal_extents$L85: .loc 1 356 3 setp.ne.u64 %r409,%r153,0; @ %r409 bra $L87; $L90: .loc 1 364 8 ld.u64 %r175,[%r234]; .loc 1 365 8 ld.u64 %r173,[%r235]; .loc 1 367 9 setp.ne.u64 %r410,%r173,0; @ %r410 bra $L88; bra $L60; $L87: add.u64 %r149,%frame,480; add.u64 %r148,%r234,40; add.u64 %r144,%frame,120; cvt.u32.u32 %r412,%r23; cvt.s64.s8 %r411,%r412; add.u64 %r134,%r411,-1; .loc 1 356 3 mov.u64 %r176,0; add.u64 %r623,%frame,360; .loc 1 358 16 mov.u64 %r413,%r176; $L89: st.u64 [%r149],%r413; .loc 1 359 18 ld.u64 %r414,[%r148]; st.u64 [%r144],%r414; .loc 1 360 17 shl.b64 %r416,%r176,3; add.u64 %r417,%r623,%r416; .loc 1 360 10 ld.u64 %r418,[%r417]; setp.le.s64 %r419,%r418,0; @ %r419 bra $L60; .loc 1 356 26 add.u64 %r176,%r176,1; .loc 1 356 3 add.u64 %r149,%r149,8; add.u64 %r148,%r148,24; add.u64 %r144,%r144,8; setp.ne.u64 %r420,%r134,%r176; @ %r420 bra $L89; bra $L90; $L88: .loc 1 290 12 mul.lo.u64 %r43,%r40,%r33; .loc 1 386 32 shl.b64 %r87,%r157,3; ld.u64 %r38,[%frame+480]; .loc 1 430 22 ld.u64 %r100,[%frame+240]; .loc 1 430 12 shl.b64 %r101,%r100,3; .loc 1 431 23 ld.u64 %r103,[%frame]; .loc 1 432 22 ld.u64 %r105,[%frame+120]; .loc 1 432 12 shl.b64 %r106,%r105,4; .loc 1 434 32 ld.u64 %r166,[%frame+360]; .loc 1 441 23 mul.lo.u64 %r37,%r166,%r100; .loc 1 442 24 mul.lo.u64 %r421,%r166,%r103; .loc 1 442 10 neg.s64 %r85,%r421; .loc 1 443 23 mul.lo.u64 %r422,%r166,%r105; .loc 1 443 9 shl.b64 %r423,%r422,4; neg.s64 %r93,%r423; setp.le.s64 %r618,%r153,1; .loc 1 356 3 mov.u64 %r626,1; .loc 1 408 9 setp.ne.u32 %r627,%r238,0; add.u64 %r628,%r154,-1; cvt.u32.u32 %r629,%r23; cvt.s64.s8 %r630,%r629; add.u64 %r631,%r630,-1; add.u64 %r632,%frame,240; $L111: .loc 1 356 3 mov.u64 %r169,%r174; mov.u64 %r168,%r173; mov.u64 %r573,%r626; mov.u64 %r574,0; .loc 1 383 17 mov.u64 %r571,%r574; mov.u64 %r572,%r574; .loc 1 386 9 mov.u64 %r200,%r574; $L96: .loc 1 389 7 ld.s8 %r225,[%r169]; mov.u64 %r231,%r200; add.u64 %r200,%r200,1; .loc 1 389 6 setp.eq.u32 %r620,%r225,0; @ %r620 bra $L91; .loc 1 392 10 or.b64 %r425,%r571,%r572; setp.ne.u64 %r427,%r425,0; .loc 1 393 17 selp.u64 %r571,%r571,%r573,%r427; selp.u64 %r572,%r572,%r574,%r427; .loc 1 394 11 ld.f64 %r145,[%r168]; .loc 1 394 10 setp.ge.f64 %r428,%r145,0dfff0000000000000; @ ! %r428 bra $L91; .loc 1 398 11 shr.s64 %r429,%r200,63; .loc 1 404 9 setp.le.s64 %r430,%r154,%r231; @ ! 
%r430 bra $L138; bra $L94; $L91: .loc 1 386 32 add.u64 %r168,%r168,%r87; .loc 1 386 47 add.u64 %r169,%r169,%r43; .loc 1 386 2 add.u64 %r433,%r573,%r626; set.u32.lt.u64 %r438,%r433,%r573; cvt.s64.s32 %r436,%r438; mov.u64 %r573,%r433; sub.u64 %r574,%r574,%r436; setp.ne.u64 %r443,%r154,%r200; @ %r443 bra $L96; bra $L94; $L138: .loc 1 408 9 @ %r627 bra $L97; add.u64 %r579,%r231,1; shr.s64 %r580,%r579,63; sub.u64 %r448,%r628,%r231; shr.s64 %r451,%r231,63; add.u64 %r454,%r231,2; set.u32.lt.u64 %r459,%r454,%r231; cvt.s64.s32 %r457,%r459; sub.u64 %r463,%r451,%r457; add.u64 %r465,%r448,%r454; set.u32.lt.u64 %r470,%r465,%r448; cvt.s64.s32 %r468,%r470; sub.u64 %r474,%r463,%r468; .loc 1 398 11 mov.u64 %r571,%r200; mov.u64 %r572,%r429; bra $L98; $L97: add.u64 %r583,%r231,1; shr.s64 %r584,%r583,63; sub.u64 %r478,%r628,%r231; shr.s64 %r481,%r231,63; add.u64 %r484,%r231,2; set.u32.lt.u64 %r489,%r484,%r231; cvt.s64.s32 %r487,%r489; sub.u64 %r493,%r481,%r487; add.u64 %r495,%r478,%r484; set.u32.lt.u64 %r500,%r495,%r478; cvt.s64.s32 %r498,%r500; sub.u64 %r504,%r493,%r498; mov.u64 %r571,%r200; mov.u64 %r572,%r429; $L102: .loc 1 411 8 @ %r620 bra $L99; .loc 1 411 18 ld.f64 %r90,[%r168]; .loc 1 411 15 setp.ge.f64 %r506,%r90,%r145; .loc 1 414 16 selp.u64 %r571,%r583,%r571,%r506; selp.u64 %r572,%r584,%r572,%r506; selp.f64 %r145,%r90,%r145,%r506; $L99: .loc 1 409 33 add.u64 %r168,%r168,%r87; .loc 1 409 48 add.u64 %r169,%r169,%r43; .loc 1 409 8 add.u64 %r509,%r583,%r626; set.u32.lt.u64 %r514,%r509,%r583; cvt.s64.s32 %r512,%r514; mov.u64 %r583,%r509; sub.u64 %r584,%r584,%r512; setp.ne.u64 %r521,%r495,%r583; @ %r521 bra $L114; setp.ne.u64 %r524,%r504,%r584; @ ! %r524 bra $L94; $L114: .loc 1 411 9 ld.s8 %r225,[%r169]; setp.eq.u32 %r620,%r225,0; bra $L102; $L98: .loc 1 420 8 @ %r620 bra $L103; .loc 1 420 18 ld.f64 %r94,[%r168]; .loc 1 420 15 setp.gt.f64 %r526,%r94,%r145; .loc 1 423 16 selp.u64 %r571,%r579,%r571,%r526; selp.u64 %r572,%r580,%r572,%r526; selp.f64 %r145,%r94,%r145,%r526; $L103: .loc 1 418 33 add.u64 %r168,%r168,%r87; .loc 1 418 48 add.u64 %r169,%r169,%r43; .loc 1 418 8 add.u64 %r529,%r579,%r626; set.u32.lt.u64 %r534,%r529,%r579; cvt.s64.s32 %r532,%r534; mov.u64 %r579,%r529; sub.u64 %r580,%r580,%r532; setp.ne.u64 %r541,%r465,%r579; @ %r541 bra $L116; setp.ne.u64 %r544,%r474,%r580; @ ! 
%r544 bra $L94; $L116: .loc 1 420 9 ld.s8 %r225,[%r169]; setp.eq.u32 %r620,%r225,0; bra $L98; $L94: .loc 1 426 8 st.u64 [%r175],%r571; st.u64 [%r175+8],%r572; .loc 1 429 15 add.u64 %r38,%r38,1; .loc 1 430 12 add.u64 %r173,%r173,%r101; .loc 1 431 13 add.u64 %r174,%r174,%r103; .loc 1 432 12 add.u64 %r175,%r175,%r106; .loc 1 434 13 setp.ne.u64 %r547,%r38,%r166; @ %r547 bra $L111; .loc 1 442 10 add.u64 %r170,%r174,%r85; .loc 1 443 9 add.u64 %r171,%r175,%r93; .loc 1 445 7 @ %r618 bra $L60; add.u64 %r151,%frame,488; .loc 1 441 23 mov.u64 %r110,%r37; .loc 1 445 7 mov.u64 %r219,8; .loc 1 444 5 mov.u64 %r172,1; add.u64 %r623,%frame,360; add.u64 %r622,%frame,120; .loc 1 438 13 mov.u64 %r624,0; bra $L108; $L109: st.u64 [%r151],%r624; .loc 1 441 23 mul.lo.u64 %r110,%r123,%r122; .loc 1 442 24 mul.lo.u64 %r553,%r125,%r122; .loc 1 442 10 sub.u64 %r170,%r174,%r553; .loc 1 443 23 mul.lo.u64 %r554,%r127,%r122; .loc 1 443 9 shl.b64 %r555,%r554,4; sub.u64 %r171,%r175,%r555; .loc 1 444 5 add.u64 %r172,%r172,1; .loc 1 445 7 add.u64 %r151,%r151,8; add.u64 %r219,%r219,8; setp.eq.u64 %r556,%r172,%r631; @ %r556 bra $L60; $L108: .loc 1 453 16 ld.u64 %r557,[%r151]; add.u64 %r122,%r557,1; st.u64 [%r151],%r122; .loc 1 454 23 add.u64 %r559,%r632,%r219; ld.u64 %r123,[%r559]; .loc 1 454 13 sub.u64 %r560,%r123,%r110; shl.b64 %r561,%r560,3; add.u64 %r173,%r173,%r561; .loc 1 455 24 add.u64 %r562,%frame,%r219; ld.u64 %r125,[%r562]; .loc 1 455 14 add.u64 %r174,%r170,%r125; .loc 1 456 23 add.u64 %r564,%r622,%r219; ld.u64 %r127,[%r564]; .loc 1 456 13 shl.b64 %r565,%r127,4; add.u64 %r175,%r171,%r565; .loc 1 434 32 add.u64 %r567,%r623,%r219; ld.u64 %r130,[%r567]; .loc 1 434 13 setp.eq.u64 %r568,%r122,%r130; @ %r568 bra $L109; .loc 1 438 13 mov.u64 %r38,0; bra $L111; $L136: .loc 1 312 6 ld.u64 %r569,[%r234]; setp.eq.u64 %r570,%r569,0; @ ! 
%r570 bra $L75; bra $L110; $L60_gfortran_smaxloc1_16_r8 .visible .func _gfortran_smaxloc1_16_r8, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 483 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L140; .loc 1 483 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L141; $L140:r152154; call _gfortran_maxloc1_16_r8490 7 bra $L139; $L141: .loc 1 493 10 ld.u64 %r23,[%r152]; .loc 1 493 7 add.u64 %r88,%r23,-1; .loc 1 494 10 ld.s8 %r24,[%r151+28]; .loc 1 494 38 add.u32 %r26,%r24,-1; .loc 1 494 8 cvt.s64.s32 %r297,%r26; .loc 1 496 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 496 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L143; .loc 1 503 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L144; $L150: .loc 1 511 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L145; bra $L198; $L143: .loc 1 498 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error144: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 508 12 mov.u64 %r303,0; $L149: .loc 1 505 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 507 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L147; .loc 1 505 17 st.u64 [%r139],%r36; bra $L148; $L147: .loc 1 508 12 st.u64 [%r139],%r303; $L148: .loc 1 503 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L149; bra $L150; $L198: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L152; bra $L151; $L145: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 517 12 mov.u64 %r302,0; $L155: .loc 1 514 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 516 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L153; .loc 1 513 17 st.u64 [%r43],%r41; bra $L154; $L153: .loc 1 517 12 st.u64 [%r43],%r302; $L154: .loc 1 511 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L155; bra $L199; $L151: .loc 1 524 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L157; add.u64 %r296,%frame,120; $L177: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 527 10 mov.u64 %r81,1; .loc 1 531 4 mov.u64 %r239,0; bra $L158; $L157: .loc 1 535 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 536 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 538 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 538 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 538 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 540 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L159; bra $L200; $L161: .loc 1 529 48 mul.lo.u64 %r81,%r46,%r81; $L158: .loc 1 531 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 524 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L161; bra $L157; $L159: .loc 1 543 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 544 4 bra $L139; $L200: .loc 1 547 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 547 22 st.u64 [%r150],%r98; $L165: .loc 1 573 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L162; bra $L163; $L152: .loc 1 551 19 ld.s8 %r55,[%r150+28]; .loc 1 551 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L164; .loc 1 552 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error164: .loc 1 557 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 557 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L165; .loc 1 559 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L163; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 559 10 mov.u64 %r104,0; $L167: .loc 1 563 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 563 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 564 18 ld.u64 %r63,[%r31]; .loc 1 559 25 add.u64 %r104,%r104,1; .loc 1 564 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L166; .loc 1 565 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error166: .loc 1 559 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L167; $L162: .loc 1 575 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L168; shl.b64 %r266,%r297,3; bra $L169; $L168: mov.u64 %r266,8; $L169: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L170: .loc 1 576 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 573 26 add.u64 %r100,%r100,1; .loc 1 573 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L170; $L163: ld.u64 %r80,[%frame+240]; .loc 1 585 22 ld.u64 %r68,[%frame]; .loc 1 585 12 shl.b64 %r69,%r68,4; .loc 1 587 32 ld.u64 %r59,[%frame+120]; .loc 1 594 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 583 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L176: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 584 15 add.u64 %r80,%r80,1; .loc 1 585 12 add.u64 %r98,%r98,%r69; .loc 1 587 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L176; .loc 1 596 7 @ ! %r295 bra $L201; bra $L139; $L175: .loc 1 591 13 st.u64 [%r99],%r280; .loc 1 594 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 595 5 add.u64 %r97,%r97,1; .loc 1 596 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L174; bra $L139; $L201: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 594 23 mov.u64 %r72,%r44; .loc 1 595 5 mov.u64 %r97,1; $L174: .loc 1 600 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 601 23 ld.u64 %r75,[%r124]; .loc 1 601 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 587 32 ld.u64 %r77,[%r123]; .loc 1 587 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L175; .loc 1 591 13 mov.u64 %r80,0; bra $L176; $L199: .loc 1 520 15 ld.u64 %r98,[%r150]; .loc 1 520 6 setp.eq.u64 %r294,%r98,0; @ ! 
%r294 bra $L152; bra $L177; $L139: .loc 1 605 1 ret; } maxloc1_4_r10.o/16228021991 maxloc1_8_r16.o/162280219maxval_i1.o/_gfortran_maxval_i1 .visible .func _gfortran_maxval_fortran/generated/maxval_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxval_i1 .visible .func _gfortran_mmaxval_i1gfortran_smaxval_i1 .visible .func _gfortran_smaxval_i1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,86,65,76114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,86,65,76,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,86,65,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_maxval_i1 .visible .func _gfortran_maxval_480predpred %r209; .reg .pred %r210; .reg .u64 %r211; .reg .u64 %r213; .reg .u64pred %r220; .reg .pred %r222; .reg .u64pred %r247; .reg .u64 %r248; .reg .u32 %r249; .reg .u64 %r251; .reg .u64 %r252; .reg .u64 %r253; .reg .u64 %r254; .reg .u64 %r255; .reg .u16 %r257; .reg .u16predpred %r288; .reg .u64 %r291; .reg .u64 %r292; .reg .predpred %r304; .reg .pred64pred %r335; .reg .u64 %r336; .reg .u64mov.u64 %r171,%ar0; mov.u64 %r172,%ar1; mov.u64 %r173,%ar2; .loc 1 55 10 ld.s8 %r22,[%r172+28]; .loc 1 55 38 add.u32 %r174,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r174; .loc 1 56 10 ld.u64 %r25,[%r173]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r176,%r104,63; set.u32.lt.s64 %r178,%r103,%r104; neg.s32 %r179,%r178; cvt.u16.u64 %r182,%r176; cvt.u16.u32 %r183,%r179; or.b16 %r181,%r182,%r183; .loc 1 58 6 cvt.u32.u16 %r184,%r181; cvt.u16.u8 %r185,%r184; setp.eq.u16 %r186,%r185,0; @ %r186 bra $L2; .loc 1 60 7 cvt.u32.u32 %r189,%r22; cvt.s64.s8 %r188,%r189; st.u64 [%stack+8],%r18818187_gfortran_runtime_errorr192,%r104,%r104; add.u64 %r193,%r192,%r104; shl.b64 %r194,%r193,3; add.u64 %r195,%r172,%r194; ld.u64 %r31,[%r195+56]; ld.u64 %r33,[%r195+48]; .loc 1 68 9 ld.u64 %r107,[%r195+40]; .loc 1 70 3 setp.ne.u64 %r209,%r104,0; @ %r209 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r210,%r103,%r104; @ %r210 bra $L4; bra $L46; $L3: add.u64 %r168,%r172,40; add.u64 %r165,%frame,120; add.u64 %r49,%frame,240; add.u64 %r211,%r172,16; add.u64 %r213,%r25,%r25; add.u64 %r214,%r213,%r25; shl.b64 %r215,%r214,3; add.u64 %r139,%r211,%r215; .loc 1 76 12 mov.u64 %r349,0; $L8: .loc 1 72 18 
ld.u64 %r216,[%r168]; st.u64 [%r165],%r216; .loc 1 73 19 ld.u64 %r218,[%r168+16]; add.u64 %r217,%r218,1; ld.u64 %r219,[%r168+8]; sub.u64 %r38,%r217,%r219; .loc 1 75 10 setp.lt.s64 %r220,%r38,0; @ %r220 bra $L6; .loc 1 73 17 st.u64 [%r49],%r38; bra $L7; $L6: .loc 1 76 12 st.u64 [%r49],%r349; $L7: .loc 1 70 3 add.u64 %r168,%r168,24; add.u64 %r165,%r165,8; add.u64 %r49,%r49,8; setp.ne.u64 %r222,%r139,%r168; @ %r222 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r223,[%r171]; setp.eq.u64 %r224,%r223,0; @ ! %r224 bra $L11; bra $L10; $L4: add.u64 %r226,%r25,%r25; add.u64 %r227,%r226,%r25; shl.b64 %r228,%r227,3; add.u64 %r229,%r228,40; add.u64 %r99,%r172,%r229; shl.b64 %r230,%r25,3; add.u64 %r91,%r230,-8; add.u64 %r336,%frame,120; add.u64 %r93,%r336,%r91; add.u64 %r337,%frame,240; add.u64 %r78,%r337,%r91; cvt.u32.u32 %r234,%r22; cvt.s64.s8 %r233,%r234; add.u64 %r236,%r233,%r233; add.u64 %r237,%r236,%r233172,40; add.u64 %r121,%r238,%r239; .loc 1 84 12 mov.u64 %r348,0; $L14: .loc 1 80 18 ld.u64 %r240,[%r99]; st.u64 [%r93],%r240; .loc 1 81 19 ld.u64 %r242,[%r99+16]; add.u64 %r241,%r242,1; ld.u64 %r243,[%r99+8]; sub.u64 %r46,%r241,%r243; .loc 1 83 10 setp.lt.s64 %r244,%r46,0; @ %r244 bra $L12; .loc 1 81 17 st.u64 [%r78],%r46; bra $L13; $L12: .loc 1 84 12 st.u64 [%r78],%r348; $L13: .loc 1 78 3 add.u64 %r99,%r99,24; add.u64 %r93,%r93,8; add.u64 %r78,%r78,8; setp.ne.u64 %r246,%r99,%r121; @ %r246 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r247,%r103,0; @ %r247 bra $L16; add.u64 %r337,%frame,240; $L35: add.u64 %r137,%r171,40; mov.u64 %r132,%r337; cvt.u32.u32 %r249,%r22; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r254,%r171,16; add.u64 %r100,%r253,%r254; .loc 1 94 10 mov.u64 %r95,1; .loc 1 98 4 mov.u64 %r277,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r255,0; st.u64 [%r171+8],%r255; .loc 1 103 28 cvt.u16.u32 %r258,%r22; add.u16 %r257,%r258,-1; cvt.u32.u16 %r259,%r257; st.u8 [%r171+28],%r259; .loc 1 105 20 add.u32 %r260,%r22,-2; cvt.s64.s32 %r55,%r260; add.u64 %r262,%r55,%r55; add.u64 %r263,%r262,%r55; shl.b64 %r264,%r263,3; add.u64 %r265,%r171,%r264; .loc 1 105 67 shl.b64 %r267,%r55,3; add.u64 %r268,%frame,%r267; .loc 1 105 59 ld.u64 %r270,[%r265+40]; ld.u64 %r271,[%r268+240]; mul.lo.u64 %r109,%r270,%r271; .loc 1 107 29 mov.u64 %r27273; call (%value_in),_gfortrani_xmallocarray274,[%value_in]; } .loc 1 107 27 st.u64 [%r171],%r274; .loc 1 108 10 setp.eq.u64 %r276,%r109,0; @ ! 
%r276 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r95,%r51,%r95; $L17: .loc 1 98 4 st.u64 [%r137+8],%r277; ld.u64 %r51,[%r132]; add.u64 %r278,%r51,-1; st.u64 [%r137+16],%r278; st.u64 [%r137],%r95; .loc 1 91 7 add.u64 %r137,%r137,24; add.u64 %r132,%r132,8; setp.ne.u64 %r279,%r100,%r137; @ %r279 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r171+48],%r109; mov.u64 %r281,-1; st.u64 [%r171+56],%r281; st.u64 [%r171+40],%r273; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r61,[%r171+28]; .loc 1 118 10 setp.eq.u64 %r283,%r61,%r103; @ %r283 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r61;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r286,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r287,[%r286+36]; setp.eq.u32 %r288,%r287,0; @ %r288 bra $L21; .loc 1 125 2 add.u64 %r337,%frame,240; cvta.const.u64 %r292,$LC2r329292; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r294,%r103,0; @ %r294 bra $L24; $L27: .loc 1 65 9 add.u64 %r295,%r31,1; .loc 1 65 7 sub.u64 %r105,%r295,%r33; max.s64 %r41,%r105,0; .loc 1 137 8 ld.u64 %r117,[%r172]; .loc 1 138 8 ld.u64 %r118,[%r171]; ld.u64 %r48,[%frame+360]; .loc 1 180 22 ld.u64 %r70,[%frame+120]; .loc 1 181 22 ld.u64 %r72,[%frame]; setp.gt.s64 %r338,%r105,0; setp.le.s64 %r339,%r103,1; cvt.u32.u32 %r341,%r22; cvt.s64.s8 %r342,%r341; add.u64 %r343,%r342,-1; add.u64 %r344,%frame,120; add.u64 %r345,%frame,240; .loc 1 187 13 mov.u64 %r346,0; bra $L25; $L24: add.u64 %r157,%frame,360; add.u64 %r156,%r171,40; mov.u64 %r154,%frame; cvt.u32.u32 %r297,%r22; cvt.s64.s8 %r296,%r297; add.u64 %r145,%r296,-1; .loc 1 129 3 mov.u64 %r119,0; add.u64 %r337,%frame,240; .loc 1 131 16 mov.u64 %r298,%r119; $L26: st.u64 [%r157],%r298; .loc 1 132 18 ld.u64 %r299,[%r156]; st.u64 [%r154],%r299; .loc 1 133 17 shl.b64 %r301,%r119,3; add.u64 %r302,%r337,%r301; .loc 1 133 10 ld.u64 %r303,[%r302]; setp.le.s64 %r304,%r303,0; @ %r304 bra $L1; .loc 1 129 26 add.u64 %r119,%r119,1; .loc 1 129 3 add.u64 %r157,%r157,8; add.u64 %r156,%r156,24; add.u64 %r154,%r154,8; setp.ne.u64 %r305,%r119,%r145; @ %r305 bra $L26; bra $L27; $L25: .loc 1 153 5 @ %r338 bra $L28; .loc 1 154 10 mov.u32 %r307,-128; st.u8 [%r118],%r307; bra $L29; $L28: .loc 1 158 36 mov.u64 %r112,%r117; .loc 1 151 9 mov.u32 %r44,-128; .loc 1 158 13 mov.u64 %r111,0; $L30: ld.s8 %r310,[%r112]; cvt.u16.u32 %r309,%r310; cvt.u16.u32 %r312,%r44; max.s16 %r311,%r309,%r312; cvt.u32.u16 %r313,%r311; cvt.s32.s8 %r44,%r313; .loc 1 158 28 add.u64 %r111,%r111,1; .loc 1 158 36 add.u64 %r112,%r112,%r107; .loc 1 158 6 setp.gt.s64 %r314,%r41,%r111; @ %r314 bra $L30; .loc 1 175 12 cvt.u32.u32 %r315,%r44; st.u8 [%r118],%r315; $L29: .loc 1 179 15 add.u64 %r48,%r48,1; .loc 1 180 12 add.u64 %r117,%r117,%r70; .loc 1 181 12 add.u64 %r118,%r118,%r72; .loc 1 183 32 ld.u64 %r126,[%frame+240]; .loc 1 183 13 setp.ne.u64 %r316,%r48,%r126; @ %r316 bra $L25; .loc 1 190 23 mul.lo.u64 %r317,%r70,%r48; .loc 1 190 9 sub.u64 %r114,%r117,%r317; .loc 1 191 23 mul.lo.u64 %r318,%r72,%r48; .loc 1 191 9 sub.u64 %r115,%r118,%r318; .loc 1 193 7 @ %r339 bra $L1; add.u64 %r143,%frame,368; mov.u64 %r102,8; .loc 1 192 5 mov.u64 %r116,1; bra $L33; $L34: .loc 1 187 13 st.u64 [%r143],%r346; .loc 1 190 23 mul.lo.u64 %r324,%r84,%r83; .loc 1 190 9 sub.u64 %r114,%r117,%r324; .loc 1 191 23 mul.lo.u64 %r325,%r86,%r83; .loc 1 191 9 sub.u64 %r115,%r118,%r325; .loc 1 192 5 add.u64 %r116,%r116,1; .loc 1 193 7 add.u64 %r143,%r143,8; add.u64 %r102,%r102,8; setp.eq.u64 %r326,%r116,%r343; @ %r326 bra $L1; $L33: .loc 1 201 16 ld.u64 
%r327,[%r143]; add.u64 %r83,%r327,1; st.u64 [%r143],%r83; .loc 1 202 23 add.u64 %r329,%r344,%r102; ld.u64 %r84,[%r329]; .loc 1 202 13 add.u64 %r117,%r114,%r84; .loc 1 203 23 add.u64 %r330,%frame,%r102; ld.u64 %r86,[%r330]; .loc 1 203 13 add.u64 %r118,%r115,%r86; .loc 1 183 32 add.u64 %r332,%r345,%r102; ld.u64 %r88,[%r332]; .loc 1 183 13 setp.eq.u64 %r333,%r83,%r88; @ %r333 bra $L34; .loc 1 187 13 mov.u64 %r48,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r334,[%r171]; setp.eq.u64 %r335,%r334,0; @ ! %r335 bra $L11; bra $L35; $L_gfortran_mmaxval_i1 .visible .func _gfortran_mmaxval_60899pred %r207; .reg .u32 %r211; .reg .u64 %r213; .reg .u32 %r215; .reg .u32 %r216; .reg .u16u16 %r255; .reg .u32 %r256; .reg .u16 %r257; .reg .pred %r258; .reg .predpred %r274; .reg .u64 %r275; .reg .u64 %r277; .reg .u64 %r279; .reg .u64 %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64pred %r297; .reg .u64 %r298; .reg .predu64 %r305; .reg .u32 %r306; .reg .u64u64 %r313; .reg .u64pred %r323; .reg .pred %r324; .reg .u64 %r325; .reg .u32 %r326; .reg .u64u64 %r331; .reg .u32 %r332; .reg .u64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u64u16 %r347; .reg .u32 %r348; .reg .pred %r349; .reg .u64 %r350; .reg .u64 %r351; .reg .predu16 %r388; .reg .u32 %r389; .reg .predpred %r397; .reg .u32u64 %r437; mov.u64 %r203,%ar0; mov.u64 %r204,%ar1; mov.u64 %r205,%ar2; mov.u64 %r206,%ar3; .loc 1 237 6 setp.ne.u64 %r207,%r206,0; @ %r207 bra $L490r204205; call _gfortran_maxva.loc 1 244 7 bra $L48; $L49: .loc 1 247 10 ld.u64 %r22,[%r205]; .loc 1 247 7 add.u64 %r131,%r22,-1; .loc 1 248 10 ld.s8 %r23,[%r204+28]; .loc 1 248 38 add.u32 %r211,%r23,-1; .loc 1 248 8 cvt.s64.s32 %r132,%r211; .loc 1 251 7 shr.u64 %r213,%r131,63; set.u32.gt.s64 %r215,%r131,%r132; neg.s32 %r216,%r215; cvt.u16.u64 %r219,%r213; cvt.u16.u32 %r220,%r216; or.b16 %r218,%r219,%r220; .loc 1 251 6 cvt.u32.u16 %r221,%r218; cvt.u16.u8 %r222,%r221; setp.eq.u16 %r223,%r222,0; @ %r223 bra $L51; .loc 1 253 7 cvt.u32.u32 %r226,%r23; cvt.s64.s8 %r225,%r226; st.u64 [%stack+8],%r225; st.u64 [%stack],%r22; cvta.const.u64 %r2242stack; call _gfortran_runtime_error1: .loc 1 258 9 add.u64 %r426,%r131,%r131; add.u64 %r425,%r426,%r131; shl.b64 %r231,%r425,3; add.u64 %r232,%r204,%r231; ld.u64 %r235,[%r232+56]; add.u64 %r234,%r235,1; .loc 1 258 7 ld.u64 %r242,[%r232+48]; sub.u64 %r133,%r234,%r242; .loc 1 259 6 setp.le.s64 %r243,%r133,0; @ %r243 bra $L48; .loc 1 262 9 ld.u64 %r145,[%r206]; .loc 1 264 15 ld.u64 %r34,[%r206+16]; .loc 1 266 22 cvt.u32.u64 %r35,%r34; .loc 1 266 53 add.u32 %r244,%r35,-4; and.b32 %r245,%r244,-5; set.u32.eq.u32 %r247,%r245,0; neg.s32 %r248,%r247; .loc 1 266 22 add.u32 %r249,%r35,-1; set.u32.le.u32 %r251,%r249,1; neg.s32 %r252,%r251; .loc 1 266 6 cvt.u16.u32 %r254,%r248; cvt.u16.u32 %r255,%r252; or.b16 %r253,%r254,%r255; cvt.u32.u16 %r256,%r253; cvt.u16.u8 %r257,%r256; setp.ne.u16 %r258,%r257,0; @ %r258 bra $L53; .loc 1 268 7 setp.ne.u32 %r260,%r35,16; @ %r260 bra $L54; $L53: .loc 1 275 9 shl.b64 %r264,%r425,3; add.u64 %r265,%r204,%r264; ld.u64 %r136,[%r265+40]; .loc 1 276 12 add.u64 %r271,%r206,%r264; ld.u64 %r42,[%r271+40]; .loc 1 278 3 setp.ne.u64 %r273,%r131,0; @ %r273 bra $L55; $L61: .loc 1 288 3 setp.lt.s64 %r274,%r131,%r132; @ %r274 bra $L56; bra $L105; $L54: .loc 1 273 5 cvta.const.u64 %r275275_gfortran_runtime_error5: add.u64 %r187,%r204,40; add.u64 %r189,%r206,40; add.u64 %r277,%r204,16; add.u64 %r279,%r22,%r22; add.u64 %r280,%r279,%r22; shl.b64 %r281,%r280,3; add.u64 %r200,%r277,%r281; .loc 1 278 3 mov.u64 %r191,0; add.u64 %r423,%frame,240; 
add.u64 %r422,%frame,360; .loc 1 285 12 mov.u64 %r437,%r191; $L60: .loc 1 280 18 add.u64 %r283,%r423,%r191; ld.u64 %r284,[%r187]; st.u64 [%r283],%r284; .loc 1 281 18 add.u64 %r285,%frame,%r191; .loc 1 281 20 ld.u64 %r287,[%r189]; mul.lo.u64 %r286,%r287,%r34; .loc 1 281 18 st.u64 [%r285],%r286; .loc 1 282 19 ld.u64 %r289,[%r187+16]; add.u64 %r288,%r289,1; ld.u64 %r290,[%r187+8]; sub.u64 %r53,%r288,%r290; .loc 1 284 10 setp.lt.s64 %r291,%r53,0; @ %r291 bra $L58; .loc 1 282 17 add.u64 %r293,%r422,%r191; st.u64 [%r293],%r53; bra $L59; $L58: .loc 1 285 12 add.u64 %r295,%r422,%r191; st.u64 [%r295],%r437; $L59: .loc 1 278 3 add.u64 %r187,%r187,24; add.u64 %r189,%r189,24; add.u64 %r191,%r191,8; setp.ne.u64 %r297,%r187,%r200; @ %r297 bra $L60; bra $L61; $L105: .loc 1 298 6 ld.u64 %r298,[%r203]; setp.eq.u64 %r299,%r298,0; @ ! %r299 bra $L63; bra $L62; $L56: add.u64 %r301,%r22,%r22; add.u64 %r302,%r301,%r22; shl.b64 %r303,%r302,3; add.u64 %r152,%r303,40; add.u64 %r55,%r204,%r152; add.u64 %r154,%r206,%r152; shl.b64 %r304,%r22,3; add.u64 %r99,%r304,-8; cvt.u32.u32 %r306,%r23; cvt.s64.s8 %r305,%r306; shl.b64 %r307,%r305,3; add.u64 %r184,%r307,-8; add.u64 %r423,%frame,240; add.u64 %r422,%frame,360; .loc 1 295 12 mov.u64 %r436,0; $L66: .loc 1 290 18 add.u64 %r309,%r423,%r99; ld.u64 %r310,[%r55]; st.u64 [%r309],%r310; .loc 1 291 18 add.u64 %r311,%frame,%r99; .loc 1 291 20 ld.u64 %r313,[%r154]; mul.lo.u64 %r312,%r313,%r34; .loc 1 291 18 st.u64 [%r311],%r312; .loc 1 292 19 ld.u64 %r315,[%r55+16]; add.u64 %r314,%r315,1; ld.u64 %r316,[%r55+8]; sub.u64 %r63,%r314,%r316; .loc 1 294 10 setp.lt.s64 %r317,%r63,0; @ %r317 bra $L64; .loc 1 292 17 add.u64 %r319,%r422,%r99; st.u64 [%r319],%r63; bra $L65; $L64: .loc 1 295 12 add.u64 %r321,%r422,%r99; st.u64 [%r321],%r436; $L65: .loc 1 288 3 add.u64 %r55,%r55,24; add.u64 %r154,%r154,24; add.u64 %r99,%r99,8; setp.ne.u64 %r323,%r99,%r184; @ %r323 bra $L66; bra $L106; $L62: .loc 1 302 7 setp.eq.u64 %r324,%r132,0; @ %r324 bra $L68; add.u64 %r422,%frame,360; $L86: add.u64 %r123,%r203,40; mov.u64 %r117,%r422; cvt.u32.u32 %r326,%r23; cvt.s64.s8 %r325,%r326; add.u64 %r328,%r325,%r325; add.u64 %r329,%r328,%r325; shl.b64 %r330,%r329,3; add.u64 %r331,%r203,16; add.u64 %r71,%r330,%r331; .loc 1 305 10 mov.u64 %r119,1; .loc 1 309 4 mov.u64 %r350,0; bra $L69; $L68: .loc 1 313 20 add.u32 %r332,%r23,-2; cvt.s64.s32 %r69,%r332; add.u64 %r334,%r69,%r69; add.u64 %r335,%r334,%r69; shl.b64 %r336,%r335,3; add.u64 %r337,%r203,%r336; .loc 1 313 67 shl.b64 %r339,%r69,3; add.u64 %r340,%frame,%r339; .loc 1 313 59 ld.u64 %r342,[%r337+40]; ld.u64 %r343,[%r340+360]; mul.lo.u64 %r139,%r342,%r343; .loc 1 315 24 mov.u64 %r344,0; st.u64 [%r203+8],%r344; .loc 1 316 28 cvt.u16.u32 %r347,%r23; add.u16 %r346,%r347,-1; cvt.u32.u16 %r348,%r346; st.u8 [%r203+28],%r348; .loc 1 318 10 setp.eq.u64 %r349,%r139,0; @ %r349 bra $L70; bra $L107; $L72: .loc 1 307 47 mul.lo.u64 %r119,%r66,%r119; $L69: .loc 1 309 4 st.u64 [%r123+8],%r350; ld.u64 %r66,[%r117]; add.u64 %r351,%r66,-1; st.u64 [%r123+16],%r351; st.u64 [%r123],%r119; .loc 1 302 7 add.u64 %r123,%r123,24; add.u64 %r117,%r117,8; setp.ne.u64 %r352,%r71,%r123; @ %r352 bra $L72; bra $L68; $L70: .loc 1 321 4 st.u64 [%r203+48],%r139; mov.u64 %r354,-1; st.u64 [%r203+56],%r354; mov.u64 %r355,1; st.u64 [%r203+40],%r355; .loc 1 322 4 bra $L48; $L107: .loc 1 325 24 mov.u64 %r3571357; call (%value_in),_gfortrani_xmallocarray58,[%value_in]; } .loc 1 325 22 st.u64 [%r203],%r358; bra $L73; $L63: .loc 1 330 19 ld.s8 %r360,[%r203+28]; .loc 1 330 10 setp.eq.u64 %r361,%r360,%r132; @ 
%r361 bra $L74; .loc 1 331 2 cvta.const.u64 %r362362_gfortran_runtime_error74: .loc 1 333 11 cvta.global.u64 %r364,_gfortrani_compile_options; .loc 1 333 10 ld.u32 %r365,[%r364+36]; setp.eq.u32 %r366,%r365,0; @ %r366 bra $L73; .loc 1 335 4 add.u64 %r422,%frame,360; cvta.const.u64 %r370,$LC2r422369370; call _gfortrani_bounds_ifunction_return337 4 cvta.const.u64 %r374,$LC6; {374370; call _gfortrani_bounds_equal_extents$L73: .loc 1 342 3 setp.ne.u64 %r376,%r132,0; @ %r376 bra $L75; $L78: .loc 1 350 8 ld.u64 %r146,[%r203]; .loc 1 351 8 ld.u64 %r144,[%r204]; .loc 1 353 9 setp.ne.u64 %r377,%r144,0; @ %r377 bra $L76; bra $L48; $L75: add.u64 %r170,%frame,480; add.u64 %r167,%r203,40; add.u64 %r160,%frame,120; cvt.u32.u32 %r379,%r23; cvt.s64.s8 %r378,%r379; add.u64 %r124,%r378,-1; .loc 1 342 3 mov.u64 %r151,0; add.u64 %r422,%frame,360; .loc 1 344 16 mov.u64 %r380,%r151; $L77: st.u64 [%r170],%r380; .loc 1 345 18 ld.u64 %r381,[%r167]; st.u64 [%r160],%r381; .loc 1 346 17 shl.b64 %r383,%r151,3; add.u64 %r384,%r422,%r383; .loc 1 346 10 ld.u64 %r385,[%r384]; setp.le.s64 %r386,%r385,0; @ %r386 bra $L48; .loc 1 342 26 add.u64 %r151,%r151,1; .loc 1 342 3 add.u64 %r170,%r170,8; add.u64 %r167,%r167,24; add.u64 %r160,%r160,8; setp.ne.u64 %r387,%r124,%r151; @ %r387 bra $L77; bra $L78; $L76: .loc 1 276 12 mul.lo.u64 %r44,%r42,%r34; ld.u64 %r173,[%frame+480]; .loc 1 401 22 ld.u64 %r85,[%frame+240]; .loc 1 402 23 ld.u64 %r87,[%frame]; .loc 1 403 22 ld.u64 %r89,[%frame+120]; setp.le.s64 %r427,%r132,1; .loc 1 365 9 mov.u32 %r429,-128; cvt.u32.u32 %r430,%r23; cvt.s64.s8 %r431,%r430; add.u64 %r432,%r431,-1; add.u64 %r433,%frame,240; add.u64 %r434,%frame,360; add.u64 %r435,%frame,120; $L87: .loc 1 342 3 mov.u64 %r150,%r145; mov.u64 %r149,%r144; .loc 1 365 9 mov.u32 %r121,%r429; .loc 1 370 9 mov.u64 %r148,0; $L80: .loc 1 394 6 ld.s8 %r389,[%r150]; cvt.u16.u32 %r388,%r389; setp.eq.u16 %r390,%r388,0; @ %r390 bra $L79; ld.s8 %r393,[%r149]; cvt.u16.u32 %r392,%r393; cvt.u16.u32 %r395,%r121; max.s16 %r394,%r392,%r395; cvt.u32.u16 %r396,%r394; cvt.s32.s8 %r121,%r396; $L79: .loc 1 370 24 add.u64 %r148,%r148,1; .loc 1 370 32 add.u64 %r149,%r149,%r136; .loc 1 370 47 add.u64 %r150,%r150,%r44; .loc 1 370 2 setp.ne.u64 %r397,%r133,%r148; @ %r397 bra $L80; .loc 1 397 8 cvt.u32.u32 %r398,%r121; st.u8 [%r146],%r398; .loc 1 400 15 add.u64 %r173,%r173,1; .loc 1 401 12 add.u64 %r144,%r144,%r85; .loc 1 402 13 add.u64 %r145,%r145,%r87; .loc 1 403 12 add.u64 %r146,%r146,%r89; .loc 1 405 32 ld.u64 %r79,[%frame+360]; .loc 1 405 13 setp.ne.u64 %r399,%r79,%r173; @ %r399 bra $L87; .loc 1 412 23 mul.lo.u64 %r93,%r173,%r85; .loc 1 413 24 mul.lo.u64 %r400,%r173,%r87; .loc 1 413 10 sub.u64 %r141,%r145,%r400; .loc 1 414 23 mul.lo.u64 %r401,%r173,%r89; .loc 1 414 9 sub.u64 %r142,%r146,%r401; .loc 1 416 7 @ ! 
%r427 bra $L108; bra $L48; $L85: .loc 1 409 13 st.u64 [%r40],%r428; .loc 1 412 23 mul.lo.u64 %r93,%r105,%r104; .loc 1 413 24 mul.lo.u64 %r404,%r107,%r104; .loc 1 413 10 sub.u64 %r141,%r145,%r404; .loc 1 414 23 mul.lo.u64 %r405,%r109,%r104; .loc 1 414 9 sub.u64 %r142,%r146,%r405; .loc 1 415 5 add.u64 %r143,%r143,1; .loc 1 416 7 add.u64 %r40,%r40,8; add.u64 %r164,%r164,8; setp.ne.u64 %r406,%r143,%r432; @ %r406 bra $L84; bra $L48; $L108: add.u64 %r40,%frame,488; mov.u64 %r164,8; .loc 1 415 5 mov.u64 %r143,1; .loc 1 409 13 mov.u64 %r428,0; $L84: .loc 1 424 16 ld.u64 %r410,[%r40]; add.u64 %r104,%r410,1; st.u64 [%r40],%r104; .loc 1 425 23 add.u64 %r412,%r433,%r164; ld.u64 %r105,[%r412]; .loc 1 425 13 sub.u64 %r413,%r105,%r93; add.u64 %r144,%r144,%r413; .loc 1 426 24 add.u64 %r414,%frame,%r164; ld.u64 %r107,[%r414]; .loc 1 426 14 add.u64 %r145,%r141,%r107; .loc 1 427 23 add.u64 %r416,%r435,%r164; ld.u64 %r109,[%r416]; .loc 1 427 13 add.u64 %r146,%r142,%r109; .loc 1 405 32 add.u64 %r418,%r434,%r164; ld.u64 %r112,[%r418]; .loc 1 405 13 setp.eq.u64 %r419,%r104,%r112; @ %r419 bra $L85; .loc 1 409 13 mov.u64 %r173,0; bra $L87; $L106: .loc 1 298 6 ld.u64 %r420,[%r203]; setp.eq.u64 %r421,%r420,0; @ ! %r421 bra $L63; bra $L86; $L48:_gfortran_smaxval_i1 .visible .func _gfortran_smaxval_68predpred %r184; .reg .predpred %r206; .reg .predu32 %r211; .reg .u64 %r213; .reg .u64u32 %r222; .reg .u64pred %r264; .reg .u64 %r266; .reg .u32 %r270; .reg .u64 %r272; .reg .u64 %r274; .reg .predpredu64 %r297; mov.u64 %r148,%ar0; mov.u64 %r149,%ar1; mov.u64 %r150,%ar2; mov.u64 %r151,%ar3; .loc 1 454 6 setp.eq.u64 %r152,%r151,0; @ %r152 bra $L110; .loc 1 454 20 ld.u32 %r153,[%r151]; setp.eq.u32 %r154,%r153,0; @ %r154 bra $L111; $L110: .loc 1 459 7gfortran_maxva.loc 1 461 7 bra $L109; $L111: .loc 1 464 10 ld.u64 %r23,[%r150]; .loc 1 464 7 add.u64 %r86,%r23,-1; .loc 1 465 10 ld.s8 %r24,[%r149+28]; .loc 1 465 38 add.u32 %r26,%r24,-1; .loc 1 465 8 cvt.s64.s32 %r291,%r26; .loc 1 467 7 shr.u64 %r159,%r86,63; set.u32.gt.s64 %r161,%r86,%r291; neg.s32 %r162,%r161; cvt.u16.u64 %r165,%r159; cvt.u16.u32 %r166,%r162; or.b16 %r164,%r165,%r166; .loc 1 467 6 cvt.u32.u16 %r167,%r164; cvt.u16.u8 %r168,%r167; setp.ne.u16 %r169,%r168,0; @ %r169 bra $L113; .loc 1 474 3 setp.ne.u64 %r170,%r86,0; @ %r170 bra $L114; $L120: .loc 1 482 3 setp.lt.s64 %r171,%r86,%r291; @ %r171 bra $L115; bra $L168; $L113: .loc 1 469 7 cvt.u32.u32 %r174,%r24; cvt.s64.s8 %r173,%r174; st.u64 [%stack+8],%r173172172_gfortran_runtime_error114: add.u64 %r135,%r149,48; add.u64 %r137,%frame,120; add.u64 %r138,%r149,56; add.u64 %r176,%r149,24; add.u64 %r178,%r23,%r23; add.u64 %r179,%r178,%r23; shl.b64 %r180,%r179,3; add.u64 %r144,%r176,%r180; .loc 1 479 12 mov.u64 %r297,0; $L119: .loc 1 476 19 ld.u64 %r182,[%r138]; add.u64 %r181,%r182,1; ld.u64 %r183,[%r135]; sub.u64 %r36,%r181,%r183; .loc 1 478 10 setp.le.s64 %r184,%r36,0; @ %r184 bra $L117; .loc 1 476 17 st.u64 [%r137],%r36; bra $L118; $L117: .loc 1 479 12 st.u64 [%r137],%r297; $L118: .loc 1 474 3 add.u64 %r135,%r135,24; add.u64 %r137,%r137,8; add.u64 %r138,%r138,24; setp.ne.u64 %r186,%r135,%r144; @ %r186 bra $L119; bra $L120; $L168: .loc 1 491 15 ld.u64 %r96,[%r148]; .loc 1 491 6 setp.eq.u64 %r187,%r96,0; @ ! 
[Extraction-damaged nvptx PTX from the libgfortran static archive. This span carries the epilogue of the scalar-mask MAXVAL routine for the preceding (apparently 8-bit) integer kind: its result loop seeds the accumulator with -128, stores bytes with st.u8, and returns at source line 576. It is followed by the header of the archive member maxval_i2.o: .visible .func declarations for _gfortran_maxval_i2, _gfortran_mmaxval_i2 and _gfortran_smaxval_i2 (generated from "fortran/generated/maxval_i2.c"), and .extern declarations for _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents. The member's .const string table then begins as comma-separated byte values; the recoverable messages are "... incorrect in MAXVAL intrinsic: is %ld, should be between 1 and %ld" and "Rank of return array incorrect in MAXVAL intrinsic: is %ld, should be %ld".]
[The remaining string constants for maxval_i2.o decode to "MAXVAL", "return valu...", "Funny sized logical array", "rank of return array incorrect in MAXVAL intri...", "MASK argume..." and "Incorrect extent in return value of MAXVAL intrinsic in dimension %ld: is %ld, should be %ld". The PTX bodies of _gfortran_maxval_i2, _gfortran_mmaxval_i2 and _gfortran_smaxval_i2 follow; their register declaration lists and several call sequences are corrupted, but the visible structure is: validate the DIM and result-rank arguments and call _gfortran_runtime_error on failure; allocate the result array through _gfortrani_xmallocarray when it is unallocated; when the bounds-checking flag in _gfortrani_compile_options is set, verify extents via _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents; then run the strided reduction loops, seeding the accumulator with -32768 and updating it with max.s16. The masked variant _gfortran_mmaxval_i2 additionally rejects oddly sized masks ("Funny sized logical array") and tests each mask byte with ld.s8 before the max update, while _gfortran_smaxval_i2 forwards to _gfortran_maxval_i2 when its scalar mask is absent or true.]
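The damaged PTX is hard to follow, so here is a rough C analogue of the reduction it implements, for orientation only. This is a sketch under simplifying assumptions: the name i16_maxval, the flat-array signature and the GFC_INT2_SENTINEL macro are hypothetical, whereas the real _gfortran_maxval_i2 family operates on array descriptors with strides, an optional DIM argument and a logical MASK array.

#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Illustrative analogue only -- not libgfortran's actual interface. */
#define GFC_INT2_SENTINEL INT16_MIN        /* the -32768 seed visible in the PTX above */

static int16_t i16_maxval(const int16_t *a, size_t n, const unsigned char *mask)
{
    int16_t result = GFC_INT2_SENTINEL;    /* empty or fully masked input keeps the sentinel */
    for (size_t i = 0; i < n; i++) {
        if (mask && !mask[i])
            continue;                      /* mask test, as in _gfortran_mmaxval_i2's ld.s8 + branch */
        if (a[i] > result)
            result = a[i];                 /* the max.s16 update in the inner loop */
    }
    return result;
}

int main(void)
{
    int16_t v[] = { 3, -7, 42, 5 };
    printf("%d\n", i16_maxval(v, sizeof v / sizeof v[0], NULL));   /* prints 42 */
    return 0;
}

The sentinel initialization is why an empty or fully masked input yields the most negative INTEGER(KIND=2) value, matching the -32768 seeds visible in the generated code.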
[End of _gfortran_smaxval_i2 (ret at source line 576), then the archive member maxval_i4.o: the same pattern of .visible .func declarations (_gfortran_maxval_i4, _gfortran_mmaxval_i4, _gfortran_smaxval_i4, generated from "fortran/generated/maxval_i4.c"), the same .extern symbols, and the same byte-encoded MAXVAL error-message strings, followed by the extraction-damaged PTX bodies of the three routines. They mirror the i2 variants but operate on 32-bit elements: strides are scaled with shl.b64 ...,2, the accumulator is seeded with -2147483648 and updated with max.s32, and the masked and scalar-mask variants follow the same control flow as their i2 counterparts.]
%r291 bra $L122; bra $L147; $L109: .loc 1 576 1 ret; } maxval_i8.o/_gfortran_maxval_i8 .visible .func _gfortran_maxval_fortran/generated/maxval_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxval_i8 .visible .func _gfortran_mmaxval_i8gfortran_smaxval_i8 .visible .func _gfortran_smaxval_i8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,86,65,76114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,86,65,76,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,86,65,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_maxval_i8 .visible .func _gfortran_maxval_48097pred %r218; .reg .predpred %r231; .reg .u64 %r232; .reg .predu64 %r250; .reg .u64 %r251; .reg .u64 %r252; .reg .pred %r253; .reg .predu64u16u64 %r274; .reg .u64pred %r303; .reg .u64 %r304; .reg .u64 %r305; .reg .u32 %r306; .reg .u64u64 %r311; .reg .u64 %r312; .reg .pred %r313; .reg .predu64predmov.u64 %r180,%ar0; mov.u64 %r181,%ar1; mov.u64 %r182,%ar2; .loc 1 55 10 ld.s8 %r22,[%r181+28]; .loc 1 55 38 add.u32 %r183,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r108,%r183; .loc 1 56 10 ld.u64 %r25,[%r182]; .loc 1 56 7 add.u64 %r109,%r25,-1; .loc 1 58 7 shr.u64 %r185,%r109,63; set.u32.lt.s64 %r187,%r108,%r109; neg.s32 %r188,%r187; cvt.u16.u64 %r191,%r185; cvt.u16.u32 %r192,%r188; or.b16 %r190,%r191,%r192; .loc 1 58 6 cvt.u32.u16 %r193,%r190; cvt.u16.u8 %r194,%r193; setp.eq.u16 %r195,%r194,0; @ %r195 bra $L2; .loc 1 60 7 cvt.u32.u32 %r198,%r22; cvt.s64.s8 %r197,%r198; st.u64 [%stack+8],%r1971966_gfortran_runtime_errorr201,%r109,%r109; add.u64 %r202,%r201,%r109; shl.b64 %r203,%r202,3; add.u64 %r204,%r181,%r203; ld.u64 %r30,[%r204+56]; ld.u64 %r32,[%r204+48]; .loc 1 68 9 ld.u64 %r112,[%r204+40]; .loc 1 70 3 setp.ne.u64 %r218,%r109,0; @ %r218 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r219,%r108,%r109; @ %r219 bra $L4; bra $L46; $L3: add.u64 %r63,%r181,40; add.u64 %r48,%frame,120; add.u64 %r97,%frame,240; add.u64 %r220,%r181,16; add.u64 %r222,%r25,%r25; add.u64 %r223,%r222,%r25; shl.b64 %r224,%r223,3; add.u64 %r175,%r220,%r224; .loc 1 76 12 mov.u64 %r358,0; $L8: .loc 1 72 18 ld.u64 %r225,[%r63]; st.u64 [%r48],%r225; .loc 1 73 19 ld.u64 %r227,[%r63+16]; add.u64 %r226,%r227,1; ld.u64 %r228,[%r63+8]; sub.u64 %r37,%r226,%r228; .loc 1 
75 10 setp.lt.s64 %r229,%r37,0; @ %r229 bra $L6; .loc 1 73 17 st.u64 [%r97],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r97],%r358; $L7: .loc 1 70 3 add.u64 %r63,%r63,24; add.u64 %r48,%r48,8; add.u64 %r97,%r97,8; setp.ne.u64 %r231,%r63,%r175; @ %r231 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r232,[%r180]; setp.eq.u64 %r233,%r232,0; @ ! %r233 bra $L11; bra $L10; $L4: add.u64 %r235,%r25,%r25; add.u64 %r236,%r235,%r25; shl.b64 %r237,%r236,3; add.u64 %r238,%r237,40; add.u64 %r136,%r181,%r238; shl.b64 %r239,%r25,3; add.u64 %r106,%r239,-8; add.u64 %r345,%frame,120; add.u64 %r129,%r345,%r106; add.u64 %r346,%frame,240; add.u64 %r104,%r346,%r106; cvt.u32.u32 %r243,%r22; cvt.s64.s8 %r242,%r243; add.u64 %r245,%r242,%r242; add.u64 %r246,%r245,%r242; shl.b64 %r247,%r246,3; add.u64 %r248,%r181,40; add.u64 %r82,%r247,%r248; .loc 1 84 12 mov.u64 %r357,0; $L14: .loc 1 80 18 ld.u64 %r249,[%r136]; st.u64 [%r129],%r249; .loc 1 81 19 ld.u64 %r251,[%r136+16]; add.u64 %r250,%r251,1; ld.u64 %r252,[%r136+8]; sub.u64 %r45,%r250,%r252; .loc 1 83 10 setp.lt.s64 %r253,%r45,0; @ %r253 bra $L12; .loc 1 81 17 st.u64 [%r104],%r45; bra $L13; $L12: .loc 1 84 12 st.u64 [%r104],%r357; $L13: .loc 1 78 3 add.u64 %r136,%r136,24; add.u64 %r129,%r129,8; add.u64 %r104,%r104,8; setp.ne.u64 %r255,%r82,%r136; @ %r255 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r256,%r108,0; @ %r256 bra $L16; add.u64 %r346,%frame,240; $L34: add.u64 %r152,%r180,40; mov.u64 %r150,%r346; add.u64 %r257,%r180,16; cvt.u32.u32 %r259,%r22; cvt.s64.s8 %r258,%r259; add.u64 %r261,%r258,%r258; add.u64 %r262,%r261,%r258; shl.b64 %r263,%r262,3; add.u64 %r139,%r257,%r263; .loc 1 94 10 mov.u64 %r103,1; .loc 1 98 4 mov.u64 %r286,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r264,0; st.u64 [%r180+8],%r264; .loc 1 103 28 cvt.u16.u32 %r267,%r22; add.u16 %r266,%r267,-1; cvt.u32.u16 %r268,%r266; st.u8 [%r180+28],%r268; .loc 1 105 20 add.u32 %r269,%r22,-2; cvt.s64.s32 %r54,%r269; add.u64 %r271,%r54,%r54; add.u64 %r272,%r271,%r54; shl.b64 %r273,%r272,3; add.u64 %r274,%r180,%r273; .loc 1 105 67 shl.b64 %r276,%r54,3; add.u64 %r277,%frame,%r276; .loc 1 105 59 ld.u64 %r279,[%r274+40]; ld.u64 %r280,[%r277+240]; mul.lo.u64 %r114,%r279,%r280; .loc 1 107 29 mov.u64 %r2811282; call (%value_in),_gfortrani_xmallocarray283,[%value_in]; } .loc 1 107 27 st.u64 [%r180],%r283; .loc 1 108 10 setp.eq.u64 %r285,%r114,0; @ ! 
%r285 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r103,%r50,%r103; $L17: .loc 1 98 4 st.u64 [%r152+8],%r286; ld.u64 %r50,[%r150]; add.u64 %r287,%r50,-1; st.u64 [%r152+16],%r287; st.u64 [%r152],%r103; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r288,%r139,%r152; @ %r288 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r180+48],%r114; mov.u64 %r290,-1; st.u64 [%r180+56],%r290; mov.u64 %r291,1; st.u64 [%r180+40],%r291; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r60,[%r180+28]; .loc 1 118 10 setp.eq.u64 %r292,%r60,%r108; @ %r292 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r108; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r295,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r296,[%r295+36]; setp.eq.u32 %r297,%r296,0; @ %r297 bra $L21; .loc 1 125 2 add.u64 %r346,%frame,240; cvta.const.u64 %r301,$LC2r346300301; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r303,%r108,0; @ %r303 bra $L24; $L27: .loc 1 65 9 add.u64 %r304,%r30,1; .loc 1 65 7 sub.u64 %r110,%r304,%r32; max.s64 %r40,%r110,0; .loc 1 137 8 ld.u64 %r121,[%r181]; .loc 1 138 8 ld.u64 %r122,[%r180]; .loc 1 158 36 shl.b64 %r68,%r112,3; ld.u64 %r47,[%frame+360]; .loc 1 180 22 ld.u64 %r69,[%frame+120]; .loc 1 180 12 shl.b64 %r71,%r69,3; .loc 1 181 22 ld.u64 %r72,[%frame]; .loc 1 181 12 shl.b64 %r74,%r72,3; setp.gt.s64 %r347,%r110,0; setp.le.s64 %r348,%r108,1; cvt.u32.u32 %r350,%r22; cvt.s64.s8 %r351,%r350; add.u64 %r352,%r351,-1; add.u64 %r353,%frame,120; add.u64 %r354,%frame,240; .loc 1 187 13 mov.u64 %r355,0; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r180,40; mov.u64 %r162,%frame; cvt.u32.u32 %r306,%r22; cvt.s64.s8 %r305,%r306; add.u64 %r153,%r305,-1; .loc 1 129 3 mov.u64 %r123,0; add.u64 %r346,%frame,240; .loc 1 131 16 mov.u64 %r307,%r123; $L26: st.u64 [%r165],%r307; .loc 1 132 18 ld.u64 %r308,[%r164]; st.u64 [%r162],%r308; .loc 1 133 17 shl.b64 %r310,%r123,3; add.u64 %r311,%r346,%r310; .loc 1 133 10 ld.u64 %r312,[%r311]; setp.le.s64 %r313,%r312,0; @ %r313 bra $L1; .loc 1 129 26 add.u64 %r123,%r123,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r314,%r123,%r153; @ %r314 bra $L26; bra $L27; $L25: .loc 1 153 5 @ %r347 bra $L35; .loc 1 154 10 mov.u64 %r316,-9223372036854775808; st.u64 [%r122],%r316; bra $L29; $L35: mov.u64 %r116,%r121; .loc 1 151 9 mov.u64 %r43,-9223372036854775808; .loc 1 158 13 mov.u64 %r115,0; $L28: ld.u64 %r317,[%r116]; max.s64 %r43,%r43,%r317; .loc 1 158 28 add.u64 %r115,%r115,1; .loc 1 158 36 add.u64 %r116,%r116,%r68; .loc 1 158 6 setp.gt.s64 %r318,%r40,%r115; @ %r318 bra $L28; .loc 1 175 12 st.u64 [%r122],%r43; $L29: .loc 1 179 15 add.u64 %r47,%r47,1; .loc 1 180 12 add.u64 %r121,%r121,%r71; .loc 1 181 12 add.u64 %r122,%r122,%r74; .loc 1 183 32 ld.u64 %r127,[%frame+240]; .loc 1 183 13 setp.ne.u64 %r319,%r47,%r127; @ %r319 bra $L25; .loc 1 190 23 mul.lo.u64 %r320,%r69,%r47; .loc 1 190 9 shl.b64 %r321,%r320,3; sub.u64 %r118,%r121,%r321; .loc 1 191 23 mul.lo.u64 %r322,%r72,%r47; .loc 1 191 9 shl.b64 %r323,%r322,3; sub.u64 %r119,%r122,%r323; .loc 1 193 7 @ %r348 bra $L1; add.u64 %r144,%frame,368; mov.u64 %r107,8; .loc 1 192 5 mov.u64 %r120,1; bra $L32; $L33: .loc 1 187 13 st.u64 [%r144],%r355; .loc 1 190 23 mul.lo.u64 %r329,%r89,%r88; .loc 1 190 9 shl.b64 %r330,%r329,3; sub.u64 %r118,%r121,%r330; .loc 1 191 23 mul.lo.u64 %r331,%r92,%r88; .loc 1 191 9 shl.b64 %r332,%r331,3; sub.u64 %r119,%r122,%r332; .loc 1 192 5 add.u64 %r120,%r120,1; .loc 1 193 7 add.u64 
%r144,%r144,8; add.u64 %r107,%r107,8; setp.eq.u64 %r333,%r120,%r352; @ %r333 bra $L1; $L32: .loc 1 201 16 ld.u64 %r334,[%r144]; add.u64 %r88,%r334,1; st.u64 [%r144],%r88; .loc 1 202 23 add.u64 %r336,%r353,%r107; ld.u64 %r89,[%r336]; .loc 1 202 13 shl.b64 %r337,%r89,3; add.u64 %r121,%r118,%r337; .loc 1 203 23 add.u64 %r338,%frame,%r107; ld.u64 %r92,[%r338]; .loc 1 203 13 shl.b64 %r339,%r92,3; add.u64 %r122,%r119,%r339; .loc 1 183 32 add.u64 %r341,%r354,%r107; ld.u64 %r95,[%r341]; .loc 1 183 13 setp.eq.u64 %r342,%r88,%r95; @ %r342 bra $L33; .loc 1 187 13 mov.u64 %r47,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r343,[%r180]; setp.eq.u64 %r344,%r343,0; @ ! %r344 bra $L11; bra $L34; $L_gfortran_mmaxval_i8 .visible .func _gfortran_mmaxval_608101; .reg .u64 %r105; .reg .u64predpred %r280; .reg .pred %r281; .reg .u64 %r282; .reg .u64 %r284; .reg .u64predu64 %r318; .reg .u64u64u64 %r364; .reg .u64 %r365; .reg .u64 %r367; .reg .predu64pred %r393; .reg .predpredmov.u64 %r210,%ar0; mov.u64 %r211,%ar1; mov.u64 %r212,%ar2; mov.u64 %r213,%ar3; .loc 1 237 6911212; call _gfortran_maxva.loc 1 244 7 bra $L48; $L49: .loc 1 247 10 ld.u64 %r22,[%r212]; .loc 1 247 7 add.u64 %r140,%r22,-1; .loc 1 248 10 ld.s8 %r23,[%r211+28]; .loc 1 248 38 add.u32 %r218,%r23,-1; .loc 1 248 8 cvt.s64.s32 %r141,%r218; .loc 1 251 7 shr.u64 %r220,%r140,63; set.u32.gt.s64 %r222,%r140,%r141; neg.s32 %r223,%r222; cvt.u16.u64 %r226,%r220; cvt.u16.u32 %r227,%r223; or.b16 %r225,%r226,%r227; .loc 1 251 6 cvt.u32.u16 %r228,%r225; cvt.u16.u8 %r229,%r228; setp.eq.u16 %r230,%r229,0; @ %r230 bra $L51; .loc 1 253 7 cvt.u32.u32 %r233,%r23; cvt.s64.s8 %r232,%r233; st.u64 [%stack+8],%r232; st.u64 [%stack],%r22; cvta.const.u64 %r2312stack; call _gfortran_runtime_error1: .loc 1 258 9 add.u64 %r431,%r140,%r140; add.u64 %r430,%r431,%r140; shl.b64 %r238,%r430,3; add.u64 %r239,%r211,%r238; ld.u64 %r242,[%r239+56]; add.u64 %r241,%r242,1; .loc 1 258 7 ld.u64 %r249,[%r239+48]; sub.u64 %r142,%r241,%r249; .loc 1 259 6 setp.le.s64 %r250,%r142,0; @ %r250 bra $L48; .loc 1 262 9 ld.u64 %r155,[%r213]; .loc 1 264 15 ld.u64 %r35,[%r213+16]; .loc 1 266 22 cvt.u32.u64 %r36,%r35; .loc 1 266 53 add.u32 %r251,%r36,-4; and.b32 %r252,%r251,-5; set.u32.eq.u32 %r254,%r252,0; neg.s32 %r255,%r254; .loc 1 266 22 add.u32 %r256,%r36,-1; set.u32.le.u32 %r258,%r256,1; neg.s32 %r259,%r258; .loc 1 266 6 cvt.u16.u32 %r261,%r255; cvt.u16.u32 %r262,%r259; or.b16 %r260,%r261,%r262; cvt.u32.u16 %r263,%r260; cvt.u16.u8 %r264,%r263; setp.ne.u16 %r265,%r264,0; @ %r265 bra $L53; .loc 1 268 7 setp.ne.u32 %r267,%r36,16; @ %r267 bra $L54; $L53: .loc 1 275 9 shl.b64 %r271,%r430,3; add.u64 %r272,%r211,%r271; ld.u64 %r145,[%r272+40]; .loc 1 276 12 add.u64 %r278,%r213,%r271; ld.u64 %r43,[%r278+40]; .loc 1 278 3 setp.ne.u64 %r280,%r140,0; @ %r280 bra $L55; $L61: .loc 1 288 3 setp.lt.s64 %r281,%r140,%r141; @ %r281 bra $L56; bra $L105; $L54: .loc 1 273 5 cvta.const.u64 %r282282_gfortran_runtime_error5: add.u64 %r194,%r211,40; add.u64 %r196,%r213,40; add.u64 %r284,%r211,16; add.u64 %r286,%r22,%r22; add.u64 %r287,%r286,%r22; shl.b64 %r288,%r287,3; add.u64 %r207,%r284,%r288; .loc 1 278 3 mov.u64 %r198,0; add.u64 %r428,%frame,240; add.u64 %r427,%frame,360; .loc 1 285 12 mov.u64 %r442,%r198; $L60: .loc 1 280 18 add.u64 %r290,%r428,%r198; ld.u64 %r291,[%r194]; st.u64 [%r290],%r291; .loc 1 281 18 add.u64 %r292,%frame,%r198; .loc 1 281 20 ld.u64 %r294,[%r196]; mul.lo.u64 %r293,%r294,%r35; .loc 1 281 18 st.u64 [%r292],%r293; .loc 1 282 19 ld.u64 %r296,[%r194+16]; add.u64 %r295,%r296,1; ld.u64 %r297,[%r194+8]; sub.u64 
%r55,%r295,%r297; .loc 1 284 10 setp.lt.s64 %r298,%r55,0; @ %r298 bra $L58; .loc 1 282 17 add.u64 %r300,%r427,%r198; st.u64 [%r300],%r55; bra $L59; $L58: .loc 1 285 12 add.u64 %r302,%r427,%r198; st.u64 [%r302],%r442; $L59: .loc 1 278 3 add.u64 %r194,%r194,24; add.u64 %r196,%r196,24; add.u64 %r198,%r198,8; setp.ne.u64 %r304,%r194,%r207; @ %r304 bra $L60; bra $L61; $L105: .loc 1 298 6 ld.u64 %r305,[%r210]; setp.eq.u64 %r306,%r305,0; @ ! %r306 bra $L63; bra $L62; $L56: add.u64 %r308,%r22,%r22; add.u64 %r309,%r308,%r22; shl.b64 %r310,%r309,3; add.u64 %r58,%r310,40; add.u64 %r99,%r211,%r58; add.u64 %r138,%r213,%r58; shl.b64 %r311,%r22,3; add.u64 %r192,%r311,-8; cvt.u32.u32 %r313,%r23; cvt.s64.s8 %r312,%r313; shl.b64 %r314,%r312,3; add.u64 %r39,%r314,-8; add.u64 %r428,%frame,240; add.u64 %r427,%frame,360; .loc 1 295 12 mov.u64 %r441,0; $L66: .loc 1 290 18 add.u64 %r316,%r428,%r192; ld.u64 %r317,[%r99]; st.u64 [%r316],%r317; .loc 1 291 18 add.u64 %r318,%frame,%r192; .loc 1 291 20 ld.u64 %r320,[%r138]; mul.lo.u64 %r319,%r320,%r35; .loc 1 291 18 st.u64 [%r318],%r319; .loc 1 292 19 ld.u64 %r322,[%r99+16]; add.u64 %r321,%r322,1; ld.u64 %r323,[%r99+8]; sub.u64 %r66,%r321,%r323; .loc 1 294 10 setp.lt.s64 %r324,%r66,0; @ %r324 bra $L64; .loc 1 292 17 add.u64 %r326,%r427,%r192; st.u64 [%r326],%r66; bra $L65; $L64: .loc 1 295 12 add.u64 %r328,%r427,%r192; st.u64 [%r328],%r441; $L65: .loc 1 288 3 add.u64 %r99,%r99,24; add.u64 %r138,%r138,24; add.u64 %r192,%r192,8; setp.ne.u64 %r330,%r39,%r192; @ %r330 bra $L66; bra $L106; $L62: .loc 1 302 7 setp.eq.u64 %r331,%r141,0; @ %r331 bra $L68; add.u64 %r427,%frame,360; $L86: add.u64 %r136,%r210,40; mov.u64 %r134,%r427; cvt.u32.u32 %r333,%r23; cvt.s64.s8 %r332,%r333; add.u64 %r335,%r332,%r332; add.u64 %r336,%r335,%r332; shl.b64 %r337,%r336,3; add.u64 %r338,%r210,16; add.u64 %r105,%r337,%r338; .loc 1 305 10 mov.u64 %r130,1; .loc 1 309 4 mov.u64 %r357,0; bra $L69; $L68: .loc 1 313 20 add.u32 %r339,%r23,-2; cvt.s64.s32 %r73,%r339; add.u64 %r341,%r73,%r73; add.u64 %r342,%r341,%r73; shl.b64 %r343,%r342,3; add.u64 %r344,%r210,%r343; .loc 1 313 67 shl.b64 %r346,%r73,3; add.u64 %r347,%frame,%r346; .loc 1 313 59 ld.u64 %r349,[%r344+40]; ld.u64 %r350,[%r347+360]; mul.lo.u64 %r149,%r349,%r350; .loc 1 315 24 mov.u64 %r351,0; st.u64 [%r210+8],%r351; .loc 1 316 28 cvt.u16.u32 %r354,%r23; add.u16 %r353,%r354,-1; cvt.u32.u16 %r355,%r353; st.u8 [%r210+28],%r355; .loc 1 318 10 setp.eq.u64 %r356,%r149,0; @ %r356 bra $L70; bra $L107; $L72: .loc 1 307 47 mul.lo.u64 %r130,%r70,%r130; $L69: .loc 1 309 4 st.u64 [%r136+8],%r357; ld.u64 %r70,[%r134]; add.u64 %r358,%r70,-1; st.u64 [%r136+16],%r358; st.u64 [%r136],%r130; .loc 1 302 7 add.u64 %r136,%r136,24; add.u64 %r134,%r134,8; setp.ne.u64 %r359,%r105,%r136; @ %r359 bra $L72; bra $L68; $L70: .loc 1 321 4 st.u64 [%r210+48],%r149; mov.u64 %r361,-1; st.u64 [%r210+56],%r361; mov.u64 %r362,1; st.u64 [%r210+40],%r362; .loc 1 322 4 bra $L48; $L107: .loc 1 325 24 mov.u64 %r364364; call (%value_in),_gfortrani_xmallocarray65,[%value_in]; } .loc 1 325 22 st.u64 [%r210],%r365; bra $L73; $L63: .loc 1 330 19 ld.s8 %r367,[%r210+28]; .loc 1 330 10 setp.eq.u64 %r368,%r367,%r141; @ %r368 bra $L74; .loc 1 331 2 cvta.const.u64 %r36stack; call _gfortran_runtime_error74: .loc 1 333 11 cvta.global.u64 %r371,_gfortrani_compile_options; .loc 1 333 10 ld.u32 %r372,[%r371+36]; setp.eq.u32 %r373,%r372,0; @ %r373 bra $L73; .loc 1 335 4 add.u64 %r427,%frame,360; cvta.const.u64 %r377,$LC2r427376377; call _gfortrani_bounds_ifunction_return337 4 cvta.const.u64 
%r381,$LC1r2113377; call _gfortrani_bounds_equal_extents$L73: .loc 1 342 3 setp.ne.u64 %r383,%r141,0; @ %r383 bra $L75; $L78: .loc 1 350 8 ld.u64 %r156,[%r210]; .loc 1 351 8 ld.u64 %r154,[%r211]; .loc 1 353 9 setp.ne.u64 %r384,%r154,0; @ %r384 bra $L76; bra $L48; $L75: add.u64 %r183,%frame,480; add.u64 %r180,%r210,40; add.u64 %r177,%frame,120; cvt.u32.u32 %r386,%r23; cvt.s64.s8 %r385,%r386; add.u64 %r137,%r385,-1; .loc 1 342 3 mov.u64 %r161,0; add.u64 %r427,%frame,360; .loc 1 344 16 mov.u64 %r387,%r161; $L77: st.u64 [%r183],%r387; .loc 1 345 18 ld.u64 %r388,[%r180]; st.u64 [%r177],%r388; .loc 1 346 17 shl.b64 %r390,%r161,3; add.u64 %r391,%r427,%r390; .loc 1 346 10 ld.u64 %r392,[%r391]; setp.le.s64 %r393,%r392,0; @ %r393 bra $L48; .loc 1 342 26 add.u64 %r161,%r161,1; .loc 1 342 3 add.u64 %r183,%r183,8; add.u64 %r180,%r180,24; add.u64 %r177,%r177,8; setp.ne.u64 %r394,%r137,%r161; @ %r394 bra $L77; bra $L78; $L76: .loc 1 276 12 mul.lo.u64 %r46,%r43,%r35; .loc 1 370 32 shl.b64 %r89,%r145,3; ld.u64 %r29,[%frame+480]; .loc 1 401 22 ld.u64 %r91,[%frame+240]; .loc 1 401 12 shl.b64 %r93,%r91,3; .loc 1 402 23 ld.u64 %r94,[%frame]; .loc 1 403 22 ld.u64 %r96,[%frame+120]; .loc 1 403 12 shl.b64 %r98,%r96,3; setp.le.s64 %r432,%r141,1; cvt.u32.u32 %r434,%r23; cvt.s64.s8 %r435,%r434; add.u64 %r436,%r435,-1; add.u64 %r437,%frame,240; add.u64 %r438,%frame,360; add.u64 %r439,%frame,120; .loc 1 409 13 mov.u64 %r440,0; $L87: .loc 1 342 3 mov.u64 %r160,%r155; mov.u64 %r159,%r154; .loc 1 365 9 mov.u64 %r132,-9223372036854775808; .loc 1 370 9 mov.u64 %r158,0; $L80: .loc 1 394 6 ld.s8 %r396,[%r160]; cvt.u16.u32 %r395,%r396; setp.eq.u16 %r397,%r395,0; @ %r397 bra $L79; ld.u64 %r398,[%r159]; max.s64 %r132,%r132,%r398; $L79: .loc 1 370 24 add.u64 %r158,%r158,1; .loc 1 370 32 add.u64 %r159,%r159,%r89; .loc 1 370 47 add.u64 %r160,%r160,%r46; .loc 1 370 2 setp.ne.u64 %r399,%r142,%r158; @ %r399 bra $L80; .loc 1 397 8 st.u64 [%r156],%r132; .loc 1 400 15 add.u64 %r29,%r29,1; .loc 1 401 12 add.u64 %r154,%r154,%r93; .loc 1 402 13 add.u64 %r155,%r155,%r94; .loc 1 403 12 add.u64 %r156,%r156,%r98; .loc 1 405 32 ld.u64 %r83,[%frame+360]; .loc 1 405 13 setp.ne.u64 %r400,%r83,%r29; @ %r400 bra $L87; .loc 1 412 23 mul.lo.u64 %r101,%r29,%r91; .loc 1 413 24 mul.lo.u64 %r401,%r29,%r94; .loc 1 413 10 sub.u64 %r151,%r155,%r401; .loc 1 414 23 mul.lo.u64 %r402,%r29,%r96; .loc 1 414 9 shl.b64 %r403,%r402,3; sub.u64 %r152,%r156,%r403; .loc 1 416 7 @ ! 
%r432 bra $L108; bra $L48; $L85: .loc 1 409 13 st.u64 [%r41],%r440; .loc 1 412 23 mul.lo.u64 %r101,%r113,%r112; .loc 1 413 24 mul.lo.u64 %r406,%r115,%r112; .loc 1 413 10 sub.u64 %r151,%r155,%r406; .loc 1 414 23 mul.lo.u64 %r407,%r117,%r112; .loc 1 414 9 shl.b64 %r408,%r407,3; sub.u64 %r152,%r156,%r408; .loc 1 415 5 add.u64 %r153,%r153,1; .loc 1 416 7 add.u64 %r41,%r41,8; add.u64 %r30,%r30,8; setp.ne.u64 %r409,%r153,%r436; @ %r409 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; mov.u64 %r30,8; .loc 1 415 5 mov.u64 %r153,1; $L84: .loc 1 424 16 ld.u64 %r413,[%r41]; add.u64 %r112,%r413,1; st.u64 [%r41],%r112; .loc 1 425 23 add.u64 %r415,%r437,%r30; ld.u64 %r113,[%r415]; .loc 1 425 13 sub.u64 %r416,%r113,%r101; shl.b64 %r417,%r416,3; add.u64 %r154,%r154,%r417; .loc 1 426 24 add.u64 %r418,%frame,%r30; ld.u64 %r115,[%r418]; .loc 1 426 14 add.u64 %r155,%r151,%r115; .loc 1 427 23 add.u64 %r420,%r439,%r30; ld.u64 %r117,[%r420]; .loc 1 427 13 shl.b64 %r421,%r117,3; add.u64 %r156,%r152,%r421; .loc 1 405 32 add.u64 %r423,%r438,%r30; ld.u64 %r120,[%r423]; .loc 1 405 13 setp.eq.u64 %r424,%r112,%r120; @ %r424 bra $L85; .loc 1 409 13 mov.u64 %r29,0; bra $L87; $L106: .loc 1 298 6 ld.u64 %r425,[%r210]; setp.eq.u64 %r426,%r425,0; @ ! %r426 bra $L63; bra $L86; $L48:_gfortran_smaxval_i8 .visible .func _gfortran_smaxval_68u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64predu64mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 454 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 454 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_maxva.loc 1 461 7 bra $L109; $L111: .loc 1 464 10 ld.u64 %r23,[%r152]; .loc 1 464 7 add.u64 %r88,%r23,-1; .loc 1 465 10 ld.s8 %r24,[%r151+28]; .loc 1 465 38 add.u32 %r26,%r24,-1; .loc 1 465 8 cvt.s64.s32 %r294,%r26; .loc 1 467 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r294; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 467 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 474 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 482 3 setp.lt.s64 %r173,%r88,%r294; @ %r173 bra $L115; bra $L168; $L113: .loc 1 469 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 479 12 mov.u64 %r300,0; $L119: .loc 1 476 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 478 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 476 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 479 12 st.u64 [%r139],%r300; $L118: .loc 1 474 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r293,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r293,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 488 12 mov.u64 %r299,0; $L125: .loc 1 485 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 487 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 484 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 488 12 st.u64 [%r43],%r299; $L124: .loc 1 482 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 495 7 setp.eq.u64 %r211,%r294,0; @ %r211 bra $L127; add.u64 %r293,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r293; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 498 10 mov.u64 %r81,1; .loc 1 502 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 506 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 507 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 509 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 509 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 509 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 511 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 500 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 502 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 495 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 514 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 515 4 bra $L109; $L170: .loc 1 518 24 mov.u64 %r24496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 518 22 st.u64 [%r150],%r98; $L135: .loc 1 544 3 setp.ne.u64 %r247,%r294,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 522 19 ld.s8 %r55,[%r150+28]; .loc 1 522 10 setp.eq.u64 %r248,%r55,%r294; @ %r248 bra $L134; .loc 1 523 2 st.u64 [%stack+8],%r294; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 528 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 528 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 530 4 setp.eq.u64 %r254,%r294,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 530 10 mov.u64 %r104,0; $L137: .loc 1 534 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 534 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 535 18 ld.u64 %r63,[%r31]; .loc 1 530 25 add.u64 %r104,%r104,1; .loc 1 535 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 536 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 530 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: 
.loc 1 546 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r294,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 547 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 544 26 add.u64 %r100,%r100,1; .loc 1 544 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r294,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 556 22 ld.u64 %r68,[%frame]; .loc 1 556 12 shl.b64 %r69,%r68,3; .loc 1 558 32 ld.u64 %r59,[%frame+120]; .loc 1 565 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r292,%r294,1; .loc 1 554 13 mov.u64 %r278,-9223372036854775808; cvt.u32.u32 %r296,%r24; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; $L146: st.u64 [%r98],%r278; .loc 1 555 15 add.u64 %r80,%r80,1; .loc 1 556 12 add.u64 %r98,%r98,%r69; .loc 1 558 13 setp.ne.u64 %r279,%r59,%r80; @ %r279 bra $L146; .loc 1 567 7 @ ! %r292 bra $L171; bra $L109; $L145: .loc 1 562 13 st.u64 [%r99],%r295; .loc 1 565 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 566 5 add.u64 %r97,%r97,1; .loc 1 567 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r282,%r97,%r298; @ %r282 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 565 23 mov.u64 %r72,%r44; .loc 1 566 5 mov.u64 %r97,1; .loc 1 562 13 mov.u64 %r295,0; $L144: .loc 1 571 16 ld.u64 %r287,[%r99]; add.u64 %r74,%r287,1; st.u64 [%r99],%r74; .loc 1 572 23 ld.u64 %r75,[%r124]; .loc 1 572 13 sub.u64 %r288,%r75,%r72; shl.b64 %r289,%r288,3; add.u64 %r98,%r98,%r289; .loc 1 558 32 ld.u64 %r77,[%r123]; .loc 1 558 13 setp.eq.u64 %r290,%r74,%r77; @ %r290 bra $L145; .loc 1 562 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r291,%r98,0; @ ! 
%r291 bra $L122; bra $L147; $L109: .loc 1 576 1 ret; } maxval_i16.o/_gfortran_maxval_i16 .visible .func _gfortran_maxval_i16fortran/generated/maxval_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxval_i16 .visible .func _gfortran_mmaxval_i16gfortran_smaxval_i16 .visible .func _gfortran_smaxval_i16VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,65,88,86,65,76114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,65,88,86,65,76,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,65,88,86,65,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_maxval_i16 .visible .func _gfortran_maxval_i1648099pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64u64pred %r317; .reg .u64 %r320; .reg .pred %r324; .reg .pred %r327; .reg .pred %r330; .reg .predu64 %r359; .reg .u64predu64 %r370; .reg .u64 %r372; .reg .u64 %r373; mov.u64 %r179,%ar0; mov.u64 %r180,%ar1; mov.u64 %r181,%ar2; .loc 1 55 10 ld.s8 %r22,[%r180+28]; .loc 1 55 38 add.u32 %r182,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r182; .loc 1 56 10 ld.u64 %r25,[%r181]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r184,%r104,63; set.u32.lt.s64 %r186,%r103,%r104; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 58 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 60 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r104,%r104; add.u64 %r201,%r200,%r104; shl.b64 %r202,%r201,3; add.u64 %r203,%r180,%r202; ld.u64 %r30,[%r203+56]; ld.u64 %r32,[%r203+48]; .loc 1 68 9 ld.u64 %r107,[%r203+40]; .loc 1 70 3 setp.ne.u64 %r217,%r104,0; @ %r217 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r218,%r103,%r104; @ %r218 bra $L4; bra $L48; $L3: add.u64 %r78,%r180,40; add.u64 %r59,%frame,120; add.u64 %r91,%frame,240; add.u64 %r219,%r180,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r174,%r219,%r223; .loc 1 76 12 mov.u64 %r373,0; $L8: .loc 1 72 18 ld.u64 %r224,[%r78]; st.u64 [%r59],%r224; .loc 1 73 19 ld.u64 %r226,[%r78+16]; add.u64 %r225,%r226,1; ld.u64 %r227,[%r78+8]; sub.u64 %r37,%r225,%r227; .loc 1 75 10 
setp.lt.s64 %r228,%r37,0; @ %r228 bra $L6; .loc 1 73 17 st.u64 [%r91],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r91],%r373; $L7: .loc 1 70 3 add.u64 %r78,%r78,24; add.u64 %r59,%r59,8; add.u64 %r91,%r91,8; setp.ne.u64 %r230,%r78,%r174; @ %r230 bra $L8; bra $L9; $L48: .loc 1 87 6 ld.u64 %r231,[%r179]; setp.eq.u64 %r232,%r231,0; @ ! %r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r143,%r180,%r237; shl.b64 %r238,%r25,3; add.u64 %r125,%r238,-8; add.u64 %r361,%frame,120; add.u64 %r128,%r361,%r125; add.u64 %r360,%frame,240; add.u64 %r121,%r360,%r125; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r180,40; add.u64 %r95,%r246,%r247; .loc 1 84 12 mov.u64 %r372,0; $L14: .loc 1 80 18 ld.u64 %r248,[%r143]; st.u64 [%r128],%r248; .loc 1 81 19 ld.u64 %r250,[%r143+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r143+8]; sub.u64 %r45,%r249,%r251; .loc 1 83 10 setp.lt.s64 %r252,%r45,0; @ %r252 bra $L12; .loc 1 81 17 st.u64 [%r121],%r45; bra $L13; $L12: .loc 1 84 12 st.u64 [%r121],%r372; $L13: .loc 1 78 3 add.u64 %r143,%r143,24; add.u64 %r128,%r128,8; add.u64 %r121,%r121,8; setp.ne.u64 %r254,%r95,%r143; @ %r254 bra $L14; bra $L49; $L10: .loc 1 91 7 setp.le.s64 %r255,%r103,0; @ %r255 bra $L16; add.u64 %r360,%frame,240; $L36: add.u64 %r152,%r179,40; mov.u64 %r150,%r360; add.u64 %r256,%r179,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r144,%r256,%r262; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r263,0; st.u64 [%r179+8],%r263; .loc 1 103 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r179+28],%r267; .loc 1 105 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r52,%r268; add.u64 %r270,%r52,%r52; add.u64 %r271,%r270,%r52; shl.b64 %r272,%r271,3; add.u64 %r273,%r179,%r272; .loc 1 105 67 shl.b64 %r275,%r52,3; add.u64 %r276,%frame,%r275; .loc 1 105 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r109,%r278,%r279; .loc 1 107 29 mov.u64 %r281,16281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 107 27 st.u64 [%r179],%r282; .loc 1 108 10 setp.eq.u64 %r284,%r109,0; @ ! 
%r284 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r48,%r99; $L17: .loc 1 98 4 st.u64 [%r152+8],%r285; ld.u64 %r48,[%r150]; add.u64 %r286,%r48,-1; st.u64 [%r152+16],%r286; st.u64 [%r152],%r99; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r287,%r144,%r152; @ %r287 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r179+48],%r109; mov.u64 %r289,-1; st.u64 [%r179+56],%r289; mov.u64 %r290,1; st.u64 [%r179+40],%r290; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r179+28]; .loc 1 118 10 setp.eq.u64 %r291,%r58,%r103; @ %r291 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r294,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r295,[%r294+36]; setp.eq.u32 %r296,%r295,0; @ %r296 bra $L21; .loc 1 125 2 add.u64 %r360,%frame,240; cvta.const.u64 %r300,$LC2r360299300; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r302,%r103,0; @ %r302 bra $L24; $L27: .loc 1 65 9 add.u64 %r303,%r30,1; .loc 1 65 7 sub.u64 %r105,%r303,%r32; max.s64 %r40,%r105,0; .loc 1 137 8 ld.u64 %r116,[%r180]; .loc 1 138 8 ld.u64 %r117,[%r179]; .loc 1 158 36 shl.b64 %r66,%r107,4; ld.u64 %r138,[%frame+360]; .loc 1 180 22 ld.u64 %r68,[%frame+120]; .loc 1 180 12 shl.b64 %r69,%r68,4; .loc 1 181 22 ld.u64 %r71,[%frame]; .loc 1 181 12 shl.b64 %r72,%r71,4; .loc 1 183 32 ld.u64 %r123,[%frame+240]; .loc 1 190 23 mul.lo.u64 %r304,%r68,%r123; .loc 1 190 9 shl.b64 %r305,%r304,4; neg.s64 %r127,%r305; .loc 1 191 23 mul.lo.u64 %r306,%r71,%r123; .loc 1 191 9 shl.b64 %r307,%r306,4; neg.s64 %r130,%r307; setp.gt.s64 %r362,%r105,0; setp.le.s64 %r363,%r103,1; .loc 1 151 9 mov.u64 %r365,0; cvt.u32.u32 %r366,%r22; cvt.s64.s8 %r367,%r366; add.u64 %r368,%r367,-1; add.u64 %r369,%frame,120; add.u64 %r370,%frame,240; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r179,40; mov.u64 %r162,%frame; cvt.u32.u32 %r309,%r22; cvt.s64.s8 %r308,%r309; add.u64 %r153,%r308,-1; .loc 1 129 3 mov.u64 %r118,0; add.u64 %r360,%frame,240; .loc 1 131 16 mov.u64 %r310,%r118; $L26: st.u64 [%r165],%r310; .loc 1 132 18 ld.u64 %r311,[%r164]; st.u64 [%r162],%r311; .loc 1 133 17 shl.b64 %r313,%r118,3; add.u64 %r314,%r360,%r313; .loc 1 133 10 ld.u64 %r315,[%r314]; setp.le.s64 %r316,%r315,0; @ %r316 bra $L1; .loc 1 129 26 add.u64 %r118,%r118,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r317,%r118,%r153; @ %r317 bra $L26; bra $L27; $L25: .loc 1 153 5 @ %r362 bra $L37; .loc 1 154 10 st.u64 [%r117],%r365; mov.u64 %r320,-9223372036854775808; st.u64 [%r117+8],%r320; bra $L29; $L37: mov.u64 %r112,%r116; .loc 1 151 9 mov.u64 %r356,%r365; mov.u64 %r357,-9223372036854775808; .loc 1 158 13 mov.u64 %r111,0; $L28: ld.u64 %r358,[%r112]; ld.u64 %r359,[%r112+8]; setp.gt.s64 %r324,%r359,%r357; @ %r324 bra $L31; setp.ne.u64 %r327,%r359,%r357; @ %r327 bra $L30; setp.gt.u64 %r330,%r358,%r356; @ ! 
%r330 bra $L30; $L31: mov.u64 %r356,%r358; mov.u64 %r357,%r359; $L30: .loc 1 158 28 add.u64 %r111,%r111,1; .loc 1 158 36 add.u64 %r112,%r112,%r66; .loc 1 158 6 setp.gt.s64 %r331,%r40,%r111; @ %r331 bra $L28; .loc 1 175 12 st.u64 [%r117],%r356; st.u64 [%r117+8],%r357; $L29: .loc 1 179 15 add.u64 %r138,%r138,1; .loc 1 180 12 add.u64 %r116,%r116,%r69; .loc 1 181 12 add.u64 %r117,%r117,%r72; .loc 1 183 13 setp.ne.u64 %r334,%r138,%r123; @ %r334 bra $L25; .loc 1 190 9 add.u64 %r113,%r116,%r127; .loc 1 191 9 add.u64 %r114,%r117,%r130; .loc 1 193 7 @ %r363 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r140,8; .loc 1 192 5 mov.u64 %r115,1; bra $L34; $L35: .loc 1 187 13 st.u64 [%r122],%r365; .loc 1 190 23 mul.lo.u64 %r340,%r85,%r84; .loc 1 190 9 shl.b64 %r341,%r340,4; sub.u64 %r113,%r116,%r341; .loc 1 191 23 mul.lo.u64 %r342,%r88,%r84; .loc 1 191 9 shl.b64 %r343,%r342,4; sub.u64 %r114,%r117,%r343; .loc 1 192 5 add.u64 %r115,%r115,1; .loc 1 193 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r344,%r115,%r368; @ %r344 bra $L1; $L34: .loc 1 201 16 ld.u64 %r345,[%r122]; add.u64 %r84,%r345,1; st.u64 [%r122],%r84; .loc 1 202 23 add.u64 %r347,%r369,%r140; ld.u64 %r85,[%r347]; .loc 1 202 13 shl.b64 %r348,%r85,4; add.u64 %r116,%r113,%r348; .loc 1 203 23 add.u64 %r349,%frame,%r140; ld.u64 %r88,[%r349]; .loc 1 203 13 shl.b64 %r350,%r88,4; add.u64 %r117,%r114,%r350; .loc 1 183 32 add.u64 %r352,%r370,%r140; ld.u64 %r92,[%r352]; .loc 1 183 13 setp.eq.u64 %r353,%r84,%r92; @ %r353 bra $L35; .loc 1 187 13 mov.u64 %r138,0; bra $L25; $L49: .loc 1 87 6 ld.u64 %r354,[%r179]; setp.eq.u64 %r355,%r354,0; @ ! %r355 bra $L11; bra $L36; $L_gfortran_mmaxval_i16 .visible .func _gfortran_mmaxval_i16608u64 %r166; .reg .u64u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .predpred %r399; .reg .pred %r403; .reg .pred %r406; .reg .predmov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 237 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $Lgfortran_maxval_i16.loc 1 244 7 bra $L50; $L51: .loc 1 247 10 ld.u64 %r22,[%r211]; .loc 1 247 7 add.u64 %r136,%r22,-1; .loc 1 248 10 ld.s8 %r23,[%r210+28]; .loc 1 248 38 add.u32 %r217,%r23,-1; .loc 1 248 8 cvt.s64.s32 %r137,%r217; .loc 1 251 7 shr.u64 %r219,%r136,63; set.u32.gt.s64 %r221,%r136,%r137; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 251 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L53; .loc 1 253 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error3: .loc 1 258 9 add.u64 %r443,%r136,%r136; add.u64 %r446,%r443,%r136; shl.b64 %r237,%r446,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 258 7 ld.u64 %r248,[%r238+48]; sub.u64 %r138,%r240,%r248; .loc 1 259 6 setp.le.s64 %r249,%r138,0; @ %r249 bra $L50; .loc 1 262 9 ld.u64 %r151,[%r212]; .loc 1 264 15 ld.u64 %r34,[%r212+16]; .loc 1 266 22 cvt.u32.u64 %r35,%r34; .loc 1 266 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 266 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 266 6 
cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L55; .loc 1 268 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L56; $L55: .loc 1 275 9 shl.b64 %r270,%r446,3; add.u64 %r271,%r210,%r270; ld.u64 %r141,[%r271+40]; .loc 1 276 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 278 3 setp.ne.u64 %r279,%r136,0; @ %r279 bra $L57; $L63: .loc 1 288 3 setp.lt.s64 %r280,%r136,%r137; @ %r280 bra $L58; bra $L109; $L56: .loc 1 273 5 cvta.const.u64 %r281281_gfortran_runtime_error7: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; add.u64 %r285,%r22,%r22; add.u64 %r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 278 3 mov.u64 %r91,0; add.u64 %r444,%frame,240; add.u64 %r442,%frame,360; .loc 1 285 12 mov.u64 %r456,%r91; $L62: .loc 1 280 18 add.u64 %r289,%r444,%r91; ld.u64 %r290,[%r40]; st.u64 [%r289],%r290; .loc 1 281 18 add.u64 %r291,%frame,%r91; .loc 1 281 20 ld.u64 %r293,[%r179]; mul.lo.u64 %r292,%r293,%r34; .loc 1 281 18 st.u64 [%r291],%r292; .loc 1 282 19 ld.u64 %r295,[%r40+16]; add.u64 %r294,%r295,1; ld.u64 %r296,[%r40+8]; sub.u64 %r54,%r294,%r296; .loc 1 284 10 setp.lt.s64 %r297,%r54,0; @ %r297 bra $L60; .loc 1 282 17 add.u64 %r299,%r442,%r91; st.u64 [%r299],%r54; bra $L61; $L60: .loc 1 285 12 add.u64 %r301,%r442,%r91; st.u64 [%r301],%r456; $L61: .loc 1 278 3 add.u64 %r40,%r40,24; add.u64 %r179,%r179,24; add.u64 %r91,%r91,8; setp.ne.u64 %r303,%r40,%r206; @ %r303 bra $L62; bra $L63; $L109: .loc 1 298 6 ld.u64 %r304,[%r209]; setp.eq.u64 %r305,%r304,0; @ ! %r305 bra $L65; bra $L64; $L58: add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r126,%r309,40; add.u64 %r132,%r210,%r126; add.u64 %r124,%r212,%r126; shl.b64 %r310,%r22,3; add.u64 %r102,%r310,-8; cvt.u32.u32 %r312,%r23; cvt.s64.s8 %r311,%r312; shl.b64 %r313,%r311,3; add.u64 %r68,%r313,-8; add.u64 %r444,%frame,240; add.u64 %r442,%frame,360; .loc 1 295 12 mov.u64 %r455,0; $L68: .loc 1 290 18 add.u64 %r315,%r444,%r102; ld.u64 %r316,[%r132]; st.u64 [%r315],%r316; .loc 1 291 18 add.u64 %r317,%frame,%r102; .loc 1 291 20 ld.u64 %r319,[%r124]; mul.lo.u64 %r318,%r319,%r34; .loc 1 291 18 st.u64 [%r317],%r318; .loc 1 292 19 ld.u64 %r321,[%r132+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r132+8]; sub.u64 %r65,%r320,%r322; .loc 1 294 10 setp.lt.s64 %r323,%r65,0; @ %r323 bra $L66; .loc 1 292 17 add.u64 %r325,%r442,%r102; st.u64 [%r325],%r65; bra $L67; $L66: .loc 1 295 12 add.u64 %r327,%r442,%r102; st.u64 [%r327],%r455; $L67: .loc 1 288 3 add.u64 %r132,%r132,24; add.u64 %r124,%r124,24; add.u64 %r102,%r102,8; setp.ne.u64 %r329,%r68,%r102; @ %r329 bra $L68; bra $L110; $L64: .loc 1 302 7 setp.eq.u64 %r330,%r137,0; @ %r330 bra $L70; add.u64 %r442,%frame,360; $L90: add.u64 %r172,%r209,40; mov.u64 %r164,%r442; cvt.u32.u32 %r332,%r23; cvt.s64.s8 %r331,%r332; add.u64 %r334,%r331,%r331; add.u64 %r335,%r334,%r331; shl.b64 %r336,%r335,3; add.u64 %r337,%r209,16; add.u64 %r133,%r336,%r337; .loc 1 305 10 mov.u64 %r128,1; .loc 1 309 4 mov.u64 %r356,0; bra $L71; $L70: .loc 1 313 20 add.u32 %r338,%r23,-2; cvt.s64.s32 %r74,%r338; add.u64 %r340,%r74,%r74; add.u64 %r341,%r340,%r74; shl.b64 %r342,%r341,3; add.u64 %r343,%r209,%r342; .loc 1 313 67 shl.b64 %r345,%r74,3; add.u64 %r346,%frame,%r345; .loc 1 313 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r145,%r348,%r349; .loc 1 315 24 mov.u64 %r350,0; st.u64 [%r209+8],%r350; .loc 1 316 28 cvt.u16.u32 %r353,%r23; add.u16 
%r352,%r353,-1; cvt.u32.u16 %r354,%r352; st.u8 [%r209+28],%r354; .loc 1 318 10 setp.eq.u64 %r355,%r145,0; @ %r355 bra $L72; bra $L111; $L74: .loc 1 307 47 mul.lo.u64 %r128,%r71,%r128; $L71: .loc 1 309 4 st.u64 [%r172+8],%r356; ld.u64 %r71,[%r164]; add.u64 %r357,%r71,-1; st.u64 [%r172+16],%r357; st.u64 [%r172],%r128; .loc 1 302 7 add.u64 %r172,%r172,24; add.u64 %r164,%r164,8; setp.ne.u64 %r358,%r133,%r172; @ %r358 bra $L74; bra $L70; $L72: .loc 1 321 4 st.u64 [%r209+48],%r145; mov.u64 %r360,-1; st.u64 [%r209+56],%r360; mov.u64 %r361,1; st.u64 [%r209+40],%r361; .loc 1 322 4 bra $L50; $L111: .loc 1 325 24 mov.u64 %r363,16363; call (%value_in),_gfortrani_xmallocarray64,[%value_in]; } .loc 1 325 22 st.u64 [%r209],%r364; bra $L75; $L65: .loc 1 330 19 ld.s8 %r366,[%r209+28]; .loc 1 330 10 setp.eq.u64 %r367,%r366,%r137; @ %r367 bra $L76; .loc 1 331 2 cvta.const.u64 %r36836_gfortran_runtime_error76: .loc 1 333 11 cvta.global.u64 %r370,_gfortrani_compile_options; .loc 1 333 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L75; .loc 1 335 4 add.u64 %r442,%frame,360; cvta.const.u64 %r376,$LC2r44237376; call _gfortrani_bounds_ifunction_return337 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L75: .loc 1 342 3 setp.ne.u64 %r382,%r137,0; @ %r382 bra $L77; $L80: .loc 1 350 8 ld.u64 %r152,[%r209]; .loc 1 351 8 ld.u64 %r150,[%r210]; .loc 1 353 9 setp.ne.u64 %r383,%r150,0; @ %r383 bra $L78; bra $L50; $L77: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 342 3 mov.u64 %r157,0; add.u64 %r442,%frame,360; .loc 1 344 16 mov.u64 %r386,%r157; $L79: st.u64 [%r190],%r386; .loc 1 345 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 346 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r442,%r389; .loc 1 346 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L50; .loc 1 342 26 add.u64 %r157,%r157,1; .loc 1 342 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L79; bra $L80; $L78: .loc 1 276 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 370 32 shl.b64 %r90,%r141,4; ld.u64 %r39,[%frame+480]; .loc 1 401 22 ld.u64 %r93,[%frame+240]; .loc 1 401 12 shl.b64 %r94,%r93,4; .loc 1 402 23 ld.u64 %r96,[%frame]; .loc 1 403 22 ld.u64 %r98,[%frame+120]; .loc 1 403 12 shl.b64 %r99,%r98,4; .loc 1 405 32 ld.u64 %r84,[%frame+360]; .loc 1 412 23 mul.lo.u64 %r38,%r84,%r93; .loc 1 413 24 mul.lo.u64 %r394,%r84,%r96; .loc 1 413 10 neg.s64 %r163,%r394; .loc 1 414 23 mul.lo.u64 %r395,%r84,%r98; .loc 1 414 9 shl.b64 %r396,%r395,4; neg.s64 %r165,%r396; setp.le.s64 %r441,%r137,1; .loc 1 365 9 mov.u64 %r448,0; cvt.u32.u32 %r449,%r23; cvt.s64.s8 %r450,%r449; add.u64 %r451,%r450,-1; add.u64 %r452,%frame,240; add.u64 %r453,%frame,360; add.u64 %r454,%frame,120; $L91: .loc 1 342 3 mov.u64 %r156,%r151; mov.u64 %r155,%r150; .loc 1 365 9 mov.u64 %r437,%r448; mov.u64 %r438,-9223372036854775808; .loc 1 370 9 mov.u64 %r154,0; $L84: .loc 1 394 6 ld.s8 %r398,[%r156]; cvt.u16.u32 %r397,%r398; setp.eq.u16 %r399,%r397,0; @ %r399 bra $L81; ld.u64 %r439,[%r155]; ld.u64 %r440,[%r155+8]; setp.gt.s64 %r403,%r440,%r438; @ %r403 bra $L83; setp.ne.u64 %r406,%r440,%r438; @ %r406 bra $L81; setp.gt.u64 %r409,%r439,%r437; @ ! 
%r409 bra $L81; $L83: mov.u64 %r437,%r439; mov.u64 %r438,%r440; $L81: .loc 1 370 24 add.u64 %r154,%r154,1; .loc 1 370 32 add.u64 %r155,%r155,%r90; .loc 1 370 47 add.u64 %r156,%r156,%r45; .loc 1 370 2 setp.ne.u64 %r410,%r138,%r154; @ %r410 bra $L84; .loc 1 397 8 st.u64 [%r152],%r437; st.u64 [%r152+8],%r438; .loc 1 400 15 add.u64 %r39,%r39,1; .loc 1 401 12 add.u64 %r150,%r150,%r94; .loc 1 402 13 add.u64 %r151,%r151,%r96; .loc 1 403 12 add.u64 %r152,%r152,%r99; .loc 1 405 13 setp.ne.u64 %r413,%r84,%r39; @ %r413 bra $L91; .loc 1 413 10 add.u64 %r147,%r151,%r163; .loc 1 414 9 add.u64 %r148,%r152,%r165; .loc 1 416 7 @ ! %r441 bra $L112; bra $L50; $L89: .loc 1 409 13 st.u64 [%r41],%r448; .loc 1 412 23 mul.lo.u64 %r101,%r113,%r112; .loc 1 413 24 mul.lo.u64 %r416,%r115,%r112; .loc 1 413 10 sub.u64 %r147,%r151,%r416; .loc 1 414 23 mul.lo.u64 %r417,%r117,%r112; .loc 1 414 9 shl.b64 %r418,%r417,4; sub.u64 %r148,%r152,%r418; .loc 1 415 5 add.u64 %r149,%r149,1; .loc 1 416 7 add.u64 %r41,%r41,8; add.u64 %r166,%r166,8; setp.ne.u64 %r419,%r149,%r451; @ %r419 bra $L88; bra $L50; $L112: add.u64 %r41,%frame,488; .loc 1 412 23 mov.u64 %r101,%r38; .loc 1 416 7 mov.u64 %r166,8; .loc 1 415 5 mov.u64 %r149,1; $L88: .loc 1 424 16 ld.u64 %r423,[%r41]; add.u64 %r112,%r423,1; st.u64 [%r41],%r112; .loc 1 425 23 add.u64 %r425,%r452,%r166; ld.u64 %r113,[%r425]; .loc 1 425 13 sub.u64 %r426,%r113,%r101; shl.b64 %r427,%r426,4; add.u64 %r150,%r150,%r427; .loc 1 426 24 add.u64 %r428,%frame,%r166; ld.u64 %r115,[%r428]; .loc 1 426 14 add.u64 %r151,%r147,%r115; .loc 1 427 23 add.u64 %r430,%r454,%r166; ld.u64 %r117,[%r430]; .loc 1 427 13 shl.b64 %r431,%r117,4; add.u64 %r152,%r148,%r431; .loc 1 405 32 add.u64 %r433,%r453,%r166; ld.u64 %r121,[%r433]; .loc 1 405 13 setp.eq.u64 %r434,%r112,%r121; @ %r434 bra $L89; .loc 1 409 13 mov.u64 %r39,0; bra $L91; $L110: .loc 1 298 6 ld.u64 %r435,[%r209]; setp.eq.u64 %r436,%r435,0; @ ! 
%r436 bra $L65; bra $L90; $L50:_gfortran_smaxval_i16 .visible .func _gfortran_smaxval_i1668u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64u64 %r279; .reg .pred %r280; .reg .predpred %r292; .reg .predu32u64 %r301; mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 454 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L114; .loc 1 454 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L115; $L114:r152; call _gfortran_maxval_i16.loc 1 461 7 bra $L113; $L115: .loc 1 464 10 ld.u64 %r23,[%r152]; .loc 1 464 7 add.u64 %r88,%r23,-1; .loc 1 465 10 ld.s8 %r24,[%r151+28]; .loc 1 465 38 add.u32 %r26,%r24,-1; .loc 1 465 8 cvt.s64.s32 %r295,%r26; .loc 1 467 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r295; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 467 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L117; .loc 1 474 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L118; $L124: .loc 1 482 3 setp.lt.s64 %r173,%r88,%r295; @ %r173 bra $L119; bra $L172; $L117: .loc 1 469 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error118: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 479 12 mov.u64 %r301,0; $L123: .loc 1 476 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 478 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L121; .loc 1 476 17 st.u64 [%r139],%r36; bra $L122; $L121: .loc 1 479 12 st.u64 [%r139],%r301; $L122: .loc 1 474 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L123; bra $L124; $L172: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L126; bra $L125; $L119: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r294,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r294,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 488 12 mov.u64 %r300,0; $L129: .loc 1 485 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 487 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L127; .loc 1 484 17 st.u64 [%r43],%r41; bra $L128; $L127: .loc 1 488 12 st.u64 [%r43],%r300; $L128: .loc 1 482 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L129; bra $L173; $L125: .loc 1 495 7 setp.eq.u64 %r211,%r295,0; @ %r211 bra $L131; add.u64 %r294,%frame,120; $L151: add.u64 %r111,%r150,40; mov.u64 %r107,%r294; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 498 10 mov.u64 %r81,1; .loc 1 502 4 mov.u64 %r237,0; bra $L132; $L131: .loc 1 506 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 507 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 509 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 509 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 509 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 511 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L133; bra $L174; $L135: .loc 1 500 48 mul.lo.u64 %r81,%r46,%r81; $L132: .loc 1 502 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 495 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L135; bra $L131; $L133: .loc 1 514 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 515 4 bra $L113; $L174: .loc 1 518 24 mov.u64 %r244,1696244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 518 22 st.u64 [%r150],%r98; $L139: .loc 1 544 3 setp.ne.u64 %r247,%r295,0; @ %r247 bra $L136; bra $L137; $L126: .loc 1 522 19 ld.s8 %r55,[%r150+28]; .loc 1 522 10 setp.eq.u64 %r248,%r55,%r295; @ %r248 bra $L138; .loc 1 523 2 st.u64 [%stack+8],%r295; st.u64 [%stack],%r55;_gfortran_runtime_error138: .loc 1 528 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 528 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L139; .loc 1 530 4 setp.eq.u64 %r254,%r295,0; @ %r254 bra $L137; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 530 10 mov.u64 %r104,0; $L141: .loc 1 534 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 534 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 535 18 ld.u64 %r63,[%r31]; .loc 1 530 25 add.u64 %r104,%r104,1; .loc 1 535 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L140; .loc 1 536 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error140: .loc 1 530 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L141; 
$L136: .loc 1 546 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L142; shl.b64 %r264,%r295,3; bra $L143; $L142: mov.u64 %r264,8; $L143: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L144: .loc 1 547 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 544 26 add.u64 %r100,%r100,1; .loc 1 544 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r295,%r100; @ %r277 bra $L144; $L137: ld.u64 %r80,[%frame+240]; .loc 1 556 22 ld.u64 %r68,[%frame]; .loc 1 556 12 shl.b64 %r69,%r68,4; .loc 1 558 32 ld.u64 %r59,[%frame+120]; .loc 1 565 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r293,%r295,1; .loc 1 554 13 mov.u64 %r278,0; cvt.u32.u32 %r297,%r24; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; $L150: st.u64 [%r98],%r278; mov.u64 %r279,-9223372036854775808; st.u64 [%r98+8],%r279; .loc 1 555 15 add.u64 %r80,%r80,1; .loc 1 556 12 add.u64 %r98,%r98,%r69; .loc 1 558 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L150; .loc 1 567 7 @ ! %r293 bra $L175; bra $L113; $L149: .loc 1 562 13 st.u64 [%r99],%r278; .loc 1 565 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 566 5 add.u64 %r97,%r97,1; .loc 1 567 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r299; @ %r283 bra $L148; bra $L113; $L175: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 565 23 mov.u64 %r72,%r44; .loc 1 566 5 mov.u64 %r97,1; $L148: .loc 1 571 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 572 23 ld.u64 %r75,[%r124]; .loc 1 572 13 sub.u64 %r289,%r75,%r72; shl.b64 %r290,%r289,4; add.u64 %r98,%r98,%r290; .loc 1 558 32 ld.u64 %r77,[%r123]; .loc 1 558 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L149; .loc 1 562 13 mov.u64 %r80,0; bra $L150; $L173: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r292,%r98,0; @ ! 
[Recovered archive member: maxval_r4.o — nvptx PTX assembly from libgfortran. The extraction is corrupted (register declarations, call sequences and string constants are run together), so only the symbol-level information below is kept; the span also opens with the closing "ret" of the previous member's last function.]

Source file reference: "fortran/generated/maxval_r4.c" (leading path characters truncated in the extraction).
Defined entry points (.visible .func): _gfortran_maxval_r4, _gfortran_mmaxval_r4, _gfortran_smaxval_r4.
External declarations: _gfortrani_compile_options (.extern .global .align 8 .u64), _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents (all .extern .func).
Diagnostic strings decoded from the embedded byte arrays (fragments, as recovered): "... incorrect in MAXVAL intrinsic: is %ld, should be between 1 and %...", "rank of return array incorrect in MAXVAL intrinsic: is %ld, should be %l...", "MAXVAL", "return valu...", "Funny sized logical array", "MASK argumen...", "Incorrect extent in return value of MAXVAL intrinsic in dimension %ld: is %ld, should be %...".
The function bodies walk the array-descriptor strides, call _gfortrani_xmallocarray to allocate the result array, call _gfortran_runtime_error and the bounds-checking helpers on argument errors, and perform the reduction with 32-bit float operations (ld.f32/st.f32, setp.*.f32, selp.f32) using the constants 0fff7fffff (-FLT_MAX), 0fff800000 (-infinity) and 0f7fc00000 (quiet NaN).
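For orientation, here is a minimal C sketch of the unmasked reduction kernel the recovered PTX appears to implement, assuming it follows the pattern of libgfortran's generated maxval_r4.c: initialize to -infinity, skip leading NaNs, return NaN only if every element is NaN, and return -FLT_MAX for an empty section. The helper name maxval_r4_section and the single-stride signature are illustrative, not the library's actual interface.

#include <float.h>
#include <math.h>
#include <stddef.h>

/* Hedged sketch: reduce one section of `len` floats spaced `stride`
   elements apart, the way the recovered PTX appears to do it. */
static float maxval_r4_section (const float *src, size_t len, ptrdiff_t stride)
{
  float result = -INFINITY;           /* sentinel seen in the PTX: 0fff800000 */
  size_t n = 0;

  if (len == 0)
    return -FLT_MAX;                  /* empty section: 0fff7fffff in the PTX */

  /* Pass 1: skip leading NaNs.  A NaN compares false against >=,
     so the loop only stops at the first non-NaN element. */
  for (; n < len; n++, src += stride)
    if (*src >= result)
      break;

  if (n == len)
    return nanf ("");                 /* all elements were NaN: 0f7fc00000 */

  /* Pass 2: ordinary maximum over the remaining elements. */
  for (; n < len; n++, src += stride)
    if (*src > result)
      result = *src;

  return result;
}

The two-loop shape is what shows up in the PTX as the setp.ge.f32 ... 0fff800000 pre-scan followed by the setp.lt.f32/selp.f32 maximum loop.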
[Recovered archive member: maxval_r8.o — the REAL(8) counterpart of the member above, corrupted in the same way. It defines _gfortran_maxval_r8, _gfortran_mmaxval_r8 and _gfortran_smaxval_r8, references "fortran/generated/maxval_r8.c", and declares the same external helpers and MAXVAL diagnostic strings. The code is structurally identical to the REAL(4) member but uses 64-bit float operations, scales element offsets by 8 bytes instead of 4, and uses the sentinels 0dffefffffffffffff (-DBL_MAX), 0dfff0000000000000 (-infinity) and 0d7ff8000000000000 (quiet NaN).]
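The masked entry points (_gfortran_mmaxval_r4 / _gfortran_mmaxval_r8) additionally walk a logical-mask array with its own stride and keep an extra flag so that an all-false mask and an all-NaN masked section give different results. A hedged sketch of that logic, again with an illustrative name (mmaxval_r8_section) and a simplified signed-char mask in place of the library's real mask handling:

#include <float.h>
#include <math.h>
#include <stddef.h>

static double mmaxval_r8_section (const double *src, const signed char *mask,
                                  size_t len, ptrdiff_t sstride, ptrdiff_t mstride)
{
  double result;
  size_t n = 0;
  int saw_masked_element = 0;

  /* Pass 1: find the first masked, non-NaN element, remembering whether
     the mask selected anything at all. */
  for (; n < len; n++, src += sstride, mask += mstride)
    if (*mask)
      {
        if (*src >= -INFINITY)        /* false only when *src is NaN */
          break;
        saw_masked_element = 1;
      }

  if (n == len)
    /* Either every masked element was NaN, or the mask selected nothing. */
    return saw_masked_element ? nan ("") : -DBL_MAX;

  /* Pass 2: ordinary maximum over the remaining masked elements,
     starting with the element found above. */
  result = -INFINITY;
  for (; n < len; n++, src += sstride, mask += mstride)
    if (*mask && *src > result)
      result = *src;

  return result;
}

In the PTX, this flag is the predicate feeding the selp.f64 that chooses between 0d7ff8000000000000 (NaN) and 0dffefffffffffffff (-DBL_MAX) after the first scan.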
%r291 bra $L134; bra $L159; $L121: .loc 1 576 1 ret; } maxval_r10.o/ minloc0_4_i1.o/_gfortran_minloc0_4_i1 .visible .func _gfortran_minloc0_4_i1libgfortran/generated/minloc0_4_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_4_i1 .visible .func _gfortran_mminloc0_4_i1_gfortran_sminloc0_4_i1 .visible .func _gfortran_sminloc0_4_i1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_4_i1 .visible .func _gfortran_minloc0_4_i1stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[368u64 %r124; .reg .u32 %r125; .reg .u64u16 %r167; .reg .u16 %r168; .reg .predmov.u32 %r116,%ar2; .loc 1 50 10 ld.s8 %r22,[%r115+28]; .loc 1 50 8 cvt.u32.u32 %r117,%r22; cvt.s64.s8 %r69,%r117; .loc 1 51 6 setp.gt.s64 %r118,%r69,0; @ %r118 bra $L2; .loc 1 52 5 cvta.const.u64 %r1111_gfortran_runtime_errorr23,[%r114]; .loc 1 54 6 setp.ne.u64 %r121,%r23,0; @ %r121 bra $L3; .loc 1 56 7 st.u64 [%r114+48],%r23; add.u64 %r123,%r69,-1; st.u64 [%r114+56],%r123; mov.u64 %r124,1; st.u64 [%r114+40],%r124; .loc 1 57 28 cvt.u32.u64 %r125,%r124; st.u8 [%r114+28],%r125; .loc 1 58 24 st.u64 [%r114+8],%r23; .loc 1 59 29128; call (%value_in),_gfortrani_xmallocarray131,[%value_in]; } mov.u64 %r23,%r131; .loc 1 59 27 st.u64 [%r114],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r133,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r134,[%r133+36]; setp.eq.u32 %r135,%r134,0; @ %r135 bra $L4; .loc 1 64r115138; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r114]; $L4: .loc 1 68 11 ld.u64 %r70,[%r114+40]; add.u64 %r38,%r115,40; mov.u64 %r43,%frame; add.u64 %r34,%frame,120; .loc 1 70 10 mov.u64 %r96,0; add.u64 %r189,%frame,240; .loc 1 74 16 mov.u64 %r146,%r96; bra $L7; $L22: mov.u64 %r96,%r75; $L7: .loc 1 72 18 ld.u64 %r139,[%r38]; st.u64 [%r43],%r139; .loc 1 73 19 ld.u64 %r141,[%r38+16]; add.u64 %r140,%r141,1; ld.u64 %r142,[%r38+8]; sub.u64 %r31,%r140,%r142; .loc 1 73 17 st.u64 [%r34],%r31; .loc 1 74 16 shl.b64 %r144,%r96,3; add.u64 %r145,%r189,%r144; st.u64 [%r145],%r146; .loc 1 75 10 setp.gt.s64 %r147,%r31,0; @ %r147 bra $L5; shl.b64 %r81,%r70,2; mov.u64 %r83,%r23; .loc 1 78 11 mov.u64 %r76,0; .loc 1 79 24 cvt.u32.u64 %r148,%r76; $L6: st.u32 [%r83],%r148; .loc 1 78 27 add.u64 %r76,%r76,1; .loc 1 78 4 add.u64 %r83,%r83,%r81; setp.ne.u64 %r149,%r69,%r76; @ %r149 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r75,%r96,1; .loc 1 70 3 add.u64 %r38,%r38,24; add.u64 %r43,%r43,8; add.u64 %r34,%r34,8; setp.ne.u64 %r150,%r69,%r75; @ %r150 bra $L22; .loc 1 84 8 ld.u64 %r61,[%r115]; shl.b64 %r78,%r70,2; mov.u64 %r89,%r23; .loc 1 87 10 mov.u64 %r74,0; .loc 1 88 23 mov.u32 %r151,1; $L8: st.u32 [%r89],%r151; mov.u64 %r97,%r74; .loc 1 87 26 add.u64 %r74,%r74,1; .loc 1 87 3 add.u64 %r89,%r89,%r78; setp.ne.u64 %r152,%r96,%r97; @ %r152 bra $L8; .loc 1 101 9 setp.ne.u64 %r153,%r61,0; @ ! 
%r153 bra $L1; .loc 1 149 19 ld.u64 %r47,[%frame]; .loc 1 151 34 ld.u64 %r50,[%frame+120]; cvt.u32.u32 %r155,%r22; cvt.s64.s8 %r154,%r155; shl.b64 %r156,%r154,3; add.u64 %r100,%r189,%r156; .loc 1 99 12 mov.u32 %r64,127; setp.eq.u32 %r188,%r116,0; add.u64 %r190,%frame,248; .loc 1 157 13 mov.u64 %r193,0; .loc 1 174 32 add.u64 %r194,%frame,120; $L21: .loc 1 151 21 ld.u64 %r108,[%frame+240]; .loc 1 126 10 @ %r188 bra $L23; mov.u64 %r39,%r108; mov.u64 %r59,%r61; $L14: .loc 1 129 10 ld.s8 %r32,[%r59]; .loc 1 129 9 cvt.u16.u32 %r159,%r32; cvt.u16.u32 %r160,%r64; setp.gt.s16 %r161,%r159,%r160; @ %r161 bra $L12; mov.u64 %r41,%r190; mov.u64 %r62,%r23; mov.u64 %r103,%r39; $L13: .loc 1 133 34 cvt.u32.u64 %r164,%r103; add.u32 %r163,%r164,1; .loc 1 133 23 st.u32 [%r62],%r163; .loc 1 132 3 add.u64 %r62,%r62,%r78; setp.eq.u64 %r165,%r41,%r100; @ %r165 bra $L24; .loc 1 133 30 ld.u64 %r103,[%r41]; add.u64 %r41,%r41,8; bra $L13; $L24: mov.u32 %r64,%r32; $L12: .loc 1 135 11 add.u64 %r59,%r59,%r47; .loc 1 137 9 add.u64 %r39,%r39,1; .loc 1 137 2 st.u64 [%frame+240],%r39; setp.ne.u64 %r166,%r39,%r50; @ %r166 bra $L14; bra $L34; $L23: mov.u64 %r49,%r108; mov.u64 %r60,%r61; $L11: .loc 1 141 10 ld.s8 %r40,[%r60]; .loc 1 141 9 cvt.u16.u32 %r167,%r40; cvt.u16.u32 %r168,%r64; setp.ge.s16 %r169,%r167,%r168; @ %r169 bra $L16; mov.u64 %r106,%r190; mov.u64 %r107,%r23; mov.u64 %r42,%r49; $L17: .loc 1 145 34 cvt.u32.u64 %r172,%r42; add.u32 %r171,%r172,1; .loc 1 145 23 st.u32 [%r107],%r171; .loc 1 144 3 add.u64 %r107,%r107,%r78; setp.eq.u64 %r173,%r100,%r106; @ %r173 bra $L25; .loc 1 145 30 ld.u64 %r42,[%r106]; add.u64 %r106,%r106,8; bra $L17; $L25: mov.u32 %r64,%r40; $L16: .loc 1 149 9 add.u64 %r60,%r60,%r47; .loc 1 151 14 add.u64 %r49,%r49,1; .loc 1 151 7 st.u64 [%frame+240],%r49; setp.ne.u64 %r174,%r49,%r50; @ %r174 bra $L11; .loc 1 149 9 sub.u64 %r175,%r50,%r108; mad.lo.u64 %r61,%r175,%r47,%r61; bra $L18; $L34: .loc 1 135 11 sub.u64 %r177,%r50,%r108; mad.lo.u64 %r61,%r177,%r47,%r61; $L18: mov.u64 %r77,%r190; .loc 1 151 34 mov.u64 %r58,%r50; mov.u64 %r56,%r47; .loc 1 152 9 mov.u64 %r63,0; $L20: .loc 1 157 13 st.u64 [%r77+-8],%r193; .loc 1 160 23 mul.lo.u64 %r53,%r56,%r58; mov.u64 %r98,%r63; .loc 1 161 5 add.u64 %r63,%r63,1; .loc 1 162 7 setp.eq.u64 %r181,%r98,%r96; @ %r181 bra $L1; .loc 1 170 16 ld.u64 %r182,[%r77]; add.u64 %r55,%r182,1; st.u64 [%r77],%r55; shl.b64 %r67,%r63,3; .loc 1 171 23 add.u64 %r183,%frame,%r67; ld.u64 %r56,[%r183]; .loc 1 171 13 sub.u64 %r184,%r56,%r53; add.u64 %r61,%r61,%r184; .loc 1 174 32 add.u64 %r186,%r194,%r67; ld.u64 %r58,[%r186]; .loc 1 174 7 add.u64 %r77,%r77,8; setp.eq.u64 %r187,%r55,%r58; @ %r187 bra $L20; bra $L21; $L_gfortran_mminloc0_4_i1 .visible .func _gfortran_mminloc0_4_i1reg .u64 %stack; mov.u64 %stack,0480998pred %r168; .reg .predpredpredpredpred %r239; .reg .pred %r240; .reg .u32predpred %r295; mov.u64 %r164,%ar0; mov.u64 %r165,%ar1; mov.u64 %r166,%ar2; mov.u32 %r167,%ar3; .loc 1 202 6 setp.ne.u64 %r168,%r166,0; @ %r168 bra $L361617; call _gfortran_minloc0_4_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r165+28]; .loc 1 209 6 setp.gt.s32 %r172,%r22,0; @ %r172 bra $L38; .loc 1 210 5 cvta.const.u64 %r173173_gfortran_runtime_error%r175,[%r164]; setp.ne.u64 %r176,%r175,0; @ %r176 bra $L39; .loc 1 214 7 st.u64 [%r164+48],%r175; add.u32 %r178,%r22,-1; cvt.s64.s32 %r179,%r178; st.u64 [%r164+56],%r179; mov.u64 %r180,1; st.u64 [%r164+40],%r180; .loc 1 215 28 cvt.u32.u64 %r181,%r180; st.u8 [%r164+28],%r181; .loc 1 216 24 st.u64 [%r164+8],%r175; .loc 1 217 29 cvt.u32.u32 
%r186,%r22; cvt.s64.s8 %r185,%r18call (%value_in),_gfortrani_xmallocarray187,[%value_in]; } .loc 1 217 27 st.u64 [%r164],%r187; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r189,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r190,[%r189+36]; setp.eq.u32 %r191,%r190,0164r194; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19716r197194; call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r166+16]; .loc 1 233 9 ld.u64 %r96,[%r166]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r199,%r32,-4; and.b32 %r200,%r199,-5; set.u32.eq.u32 %r202,%r200,0; neg.s32 %r203,%r202; .loc 1 235 22 add.u32 %r204,%r32,-1; set.u32.le.u32 %r206,%r204,1; neg.s32 %r207,%r206; .loc 1 235 6 cvt.u16.u32 %r209,%r203; cvt.u16.u32 %r210,%r207; or.b16 %r208,%r209,%r210; cvt.u32.u16 %r211,%r208; cvt.u16.u8 %r212,%r211; setp.ne.u16 %r213,%r212,0; @ %r213 bra $L41; .loc 1 237 7 setp.ne.u32 %r215,%r32,16; @ %r215 bra $L42; $L41: .loc 1 244 11 ld.u64 %r108,[%r164+40]; .loc 1 245 8 ld.u64 %r109,[%r164]; .loc 1 246 17 cvt.u32.u32 %r216,%r22; cvt.s64.s8 %r67,%r216; add.u64 %r52,%r165,40; add.u64 %r285,%frame,120; mov.u64 %r116,%r285; add.u64 %r141,%r166,40; mov.u64 %r163,%frame; add.u64 %r160,%frame,240; add.u64 %r157,%frame,360; mov.u64 %r158,%r157; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r225,%r29; bra $L43; $L42: .loc 1 242 5 cvta.const.u64 %r217,$LC3stack; call _gfortran_runtime_error64: mov.u64 %r29,%r113; $L43: .loc 1 248 18 ld.u64 %r219,[%r52]; st.u64 [%r116],%r219; .loc 1 249 20 ld.u64 %r221,[%r141]; mul.lo.u64 %r220,%r221,%r31; .loc 1 249 18 st.u64 [%r163],%r220; .loc 1 250 19 ld.u64 %r223,[%r52+16]; add.u64 %r222,%r223,1; ld.u64 %r224,[%r52+8]; sub.u64 %r44,%r222,%r224; .loc 1 250 17 st.u64 [%r160],%r44; .loc 1 251 16 st.u64 [%r158],%r225; .loc 1 252 10 setp.gt.s64 %r226,%r44,0; @ %r226 bra $L44; shl.b64 %r147,%r108,2; mov.u64 %r136,%r109; .loc 1 255 11 mov.u64 %r114,0; .loc 1 256 24 cvt.u32.u64 %r227,%r114; $L45: st.u32 [%r136],%r227; .loc 1 255 27 add.u64 %r114,%r114,1; .loc 1 255 4 add.u64 %r136,%r136,%r147; setp.ne.u64 %r228,%r67,%r114; @ %r228 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r113,%r29,1; .loc 1 246 3 add.u64 %r52,%r52,24; add.u64 %r116,%r116,8; add.u64 %r141,%r141,24; add.u64 %r163,%r163,8; add.u64 %r160,%r160,8; add.u64 %r158,%r158,8; setp.ne.u64 %r229,%r67,%r113; @ %r229 bra $L64; .loc 1 261 8 ld.u64 %r93,[%r165]; shl.b64 %r57,%r108,2; mov.u64 %r60,%r109; .loc 1 264 10 mov.u64 %r112,0; .loc 1 265 23 cvt.u32.u64 %r230,%r112; $L47: st.u32 [%r60],%r230; mov.u64 %r143,%r112; .loc 1 264 26 add.u64 %r112,%r112,1; .loc 1 264 3 add.u64 %r60,%r60,%r57; setp.ne.u64 %r231,%r29,%r143; @ %r231 bra $L47; .loc 1 276 9 setp.eq.u64 %r232,%r93,0; @ %r232 bra $L35; .loc 1 331 19 ld.u64 %r73,[%frame+120]; .loc 1 332 20 ld.u64 %r75,[%frame]; .loc 1 334 34 ld.u64 %r78,[%frame+240]; ld.u64 %r161,[%frame+360]; cvt.u32.u32 %r234,%r22; cvt.s64.s8 %r233,%r234; shl.b64 %r235,%r233,3; add.u64 %r125,%r235,%r157; .loc 1 274 12 mov.u32 %r101,127; setp.eq.u32 %r287,1,0; add.u64 %r286,%frame,368; .loc 1 340 13 mov.u64 %r292,0; .loc 1 359 32 add.u64 %r293,%frame,240; .loc 1 308 12 setp.eq.u32 %r295,%r167,0; $L63: .loc 1 280 10 @ %r287 bra $L48; mov.u64 %r53,%r161; mov.u32 %r94,0; mov.u32 %r289,1; $L53: mov.u64 %r45,%r93; .loc 1 300 13 add.u64 %r93,%r45,%r73; .loc 1 284 11 ld.s8 %r238,[%r96]; cvt.u16.u32 %r237,%r238; setp.eq.u16 %r239,%r237,0; @ %r239 bra $L49; setp.eq.u32 %r240,%r94,0; selp.u64 %r161,%r161,%r53,%r240; .loc 1 294 16 ld.s8 %r101,[%r45]; mov.u64 
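// The sequence emitted for line 235 of mminloc0_4_i1.c above validates the logical
// mask's element size: (size-4) & ~4 == 0 accepts 4 and 8, (size-1) <= 1 accepts
// 1 and 2, and only a final size != 16 comparison still branches to the
// _gfortran_runtime_error call (evidently the "Funny sized logical array" message).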
%r135,%r286; mov.u64 %r148,%r109; mov.u64 %r159,%r161; $L52: .loc 1 296 33 cvt.u32.u64 %r243,%r159; add.u32 %r242,%r243,1; .loc 1 296 22 st.u32 [%r148],%r242; .loc 1 295 9 add.u64 %r148,%r148,%r57; setp.eq.u64 %r244,%r125,%r135; @ %r244 bra $L65; .loc 1 296 29 ld.u64 %r159,[%r135]; add.u64 %r135,%r135,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r96,%r96,%r75; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r94,%r289; .loc 1 303 4 setp.ne.u64 %r245,%r53,%r78; @ %r245 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r295 bra $L66; mov.u64 %r64,%r161; mov.u64 %r91,%r93; $L58: .loc 1 311 11 ld.s8 %r248,[%r96]; cvt.u16.u32 %r247,%r248; setp.eq.u16 %r249,%r247,0; @ %r249 bra $L56; .loc 1 311 12 ld.s8 %r58,[%r91]; .loc 1 311 11 cvt.u16.u32 %r250,%r58; cvt.u16.u32 %r251,%r101; setp.gt.s16 %r252,%r250,%r251; @ %r252 bra $L56; mov.u64 %r103,%r286; mov.u64 %r117,%r109; mov.u64 %r154,%r64; .loc 1 314 18 mov.u64 %r144,%r292; $L57: .loc 1 315 36 cvt.u32.u64 %r255,%r154; add.u32 %r254,%r255,1; .loc 1 315 25 st.u32 [%r117],%r254; .loc 1 314 34 add.u64 %r111,%r144,1; .loc 1 314 11 add.u64 %r117,%r117,%r57; setp.le.s64 %r256,%r29,%r144; @ %r256 bra $L67; .loc 1 315 32 ld.u64 %r154,[%r103]; add.u64 %r103,%r103,8; mov.u64 %r144,%r111; bra $L57; $L67: mov.u32 %r101,%r58; $L56: .loc 1 317 8 add.u64 %r91,%r91,%r73; .loc 1 319 13 add.u64 %r64,%r64,1; .loc 1 319 6 st.u64 [%frame+360],%r64; setp.ne.u64 %r257,%r64,%r78; @ %r257 bra $L58; .loc 1 317 8 sub.u64 %r258,%r78,%r161; mad.lo.u64 %r93,%r258,%r73,%r93; bra $L59; $L66: mov.u64 %r77,%r161; mov.u64 %r95,%r96; mov.u64 %r92,%r93; $L55: .loc 1 323 11 ld.s8 %r261,[%r95]; cvt.u16.u32 %r260,%r261; setp.eq.u16 %r262,%r260,0; @ %r262 bra $L60; .loc 1 323 12 ld.s8 %r68,[%r92]; .loc 1 323 11 cvt.u16.u32 %r263,%r68; cvt.u16.u32 %r264,%r101; setp.ge.s16 %r265,%r263,%r264; @ %r265 bra $L60; mov.u64 %r90,%r286; mov.u64 %r98,%r109; mov.u64 %r149,%r77; .loc 1 326 12 mov.u64 %r145,%r292; $L61: .loc 1 327 36 cvt.u32.u64 %r268,%r149; add.u32 %r267,%r268,1; .loc 1 327 25 st.u32 [%r98],%r267; .loc 1 326 28 add.u64 %r110,%r145,1; .loc 1 326 5 add.u64 %r98,%r98,%r57; setp.le.s64 %r269,%r29,%r145; @ %r269 bra $L68; .loc 1 327 32 ld.u64 %r149,[%r90]; add.u64 %r90,%r90,8; mov.u64 %r145,%r110; bra $L61; $L68: mov.u32 %r101,%r68; $L60: .loc 1 331 9 add.u64 %r92,%r92,%r73; .loc 1 332 10 add.u64 %r95,%r95,%r75; .loc 1 334 14 add.u64 %r77,%r77,1; .loc 1 334 7 st.u64 [%frame+360],%r77; setp.ne.u64 %r270,%r77,%r78; @ %r270 bra $L55; sub.u64 %r131,%r78,%r161; .loc 1 331 9 mad.lo.u64 %r93,%r73,%r131,%r93; .loc 1 332 10 mad.lo.u64 %r96,%r75,%r131,%r96; bra $L59; $L89: st.u64 [%frame+360],%r78; $L59: mov.u64 %r123,%r286; mov.u64 %r87,%r75; .loc 1 334 34 mov.u64 %r89,%r78; mov.u64 %r85,%r73; mov.u64 %r122,8; .loc 1 335 9 mov.u64 %r99,0; $L62: .loc 1 340 13 st.u64 [%r123+-8],%r292; .loc 1 343 23 mul.lo.u64 %r80,%r85,%r89; .loc 1 344 24 mul.lo.u64 %r82,%r89,%r87; mov.u64 %r97,%r99; .loc 1 345 5 add.u64 %r99,%r99,1; .loc 1 346 7 setp.eq.u64 %r275,%r29,%r97; @ %r275 bra $L35; .loc 1 354 16 ld.u64 %r276,[%r123]; add.u64 %r84,%r276,1; st.u64 [%r123],%r84; .loc 1 355 23 add.u64 %r278,%r285,%r122; ld.u64 %r85,[%r278]; .loc 1 355 13 sub.u64 %r279,%r85,%r80; add.u64 %r93,%r93,%r279; .loc 1 356 24 add.u64 %r280,%frame,%r122; ld.u64 %r87,[%r280]; .loc 1 356 14 sub.u64 %r281,%r87,%r82; add.u64 %r96,%r96,%r281; .loc 1 359 32 add.u64 %r283,%r293,%r122; ld.u64 %r89,[%r283]; .loc 1 359 7 add.u64 %r123,%r123,8; add.u64 %r122,%r122,8; setp.eq.u64 %r284,%r84,%r89; @ %r284 bra $L62; ld.u64 %r161,[%frame+360]; bra $L63; $L65: mov.u64 
%r93,%r45; setp.eq.u32 %r287,1,1; bra $L48; $L35: .loc 1 _gfortran_sminloc0_4_i1 .visible .func _gfortran_sminloc0_4_i1u32pred.reg .u64u32 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 378 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 380 7 {_gfortran_minloc0_4_.loc 1 381 7 bra $L90; $L92: .loc 1 384 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29 mov.u64 %r56call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36; .loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,2; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 cvt.u32.u64 %r68,%r32; $L97: st.u32 [%r29],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L97; $L90: .loc 1 406 1 ret; } minloc0_8_i1.o/_gfortran_minloc0_8_i1 .visible .func _gfortran_minloc0_8_i1libgfortran/generated/minloc0_8_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_8_i1 .visible .func _gfortran_mminloc0_8_i1_gfortran_sminloc0_8_i1 .visible .func _gfortran_sminloc0_8_i1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_8_i1 .visible .func _gfortran_minloc0_8_i1stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[368pred %r145; .reg .pred %r146; .reg .u64 %r147; .reg .pred %r148; .reg .predu64 %r152; .reg .u16 %r155; .reg .u16predpredu64 %r188; mov.u64 %r110,%ar0; mov.u64 %r111,%ar1; mov.u32 %r112,%ar2; .loc 1 50 10 ld.s8 %r22,[%r111+28]; .loc 1 50 8 cvt.u32.u32 %r113,%r22; cvt.s64.s8 %r67,%r113; .loc 1 51 6 setp.gt.s64 %r114,%r67,0; @ %r114 bra $L2; .loc 1 52 5 cvta.const.u64 %r115115_gfortran_runtime_errorr23,[%r110]; .loc 1 54 6 setp.ne.u64 %r117,%r23,0; @ %r117 bra $L3; .loc 1 56 7 st.u64 [%r110+48],%r23; add.u64 %r119,%r67,-1; st.u64 [%r110+56],%r119; mov.u64 %r120,1; st.u64 [%r110+40],%r120; .loc 1 57 28 cvt.u32.u64 %r121,%r120; st.u8 [%r110+28],%r121; .loc 1 58 24 st.u64 [%r110+8],%r23; .loc 1 59 29 mov.u64 %r16call (%value_in),_gfortrani_xmallocarray127,[%value_in]; } mov.u64 %r23,%r127; .loc 1 59 27 st.u64 [%r110],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r129,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r130,[%r129+36]; setp.eq.u32 %r131,%r130,0; @ %r131 bra $L4; .loc 1 64110134; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r110]; $L4: .loc 1 68 11 ld.u64 %r68,[%r110+40]; add.u64 %r38,%r111,40; mov.u64 %r102,%frame; add.u64 
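// Archive member minloc0_8_i1.o (libgfortran/generated/minloc0_8_i1.c): the same three
// entry points for a GFC_INTEGER_1 source, but the result indices are 8-byte integers,
// so the index stores are st.u64 and the result stride is scaled with shl.b64 ...,3
// instead of the st.u32 / shl.b64 ...,2 pattern used by the *_4_i1 routines.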
%r103,%frame,120; .loc 1 70 10 mov.u64 %r94,0; add.u64 %r183,%frame,240; .loc 1 74 16 mov.u64 %r142,%r94; bra $L7; $L22: mov.u64 %r94,%r74; $L7: .loc 1 72 18 ld.u64 %r135,[%r38]; st.u64 [%r102],%r135; .loc 1 73 19 ld.u64 %r137,[%r38+16]; add.u64 %r136,%r137,1; ld.u64 %r138,[%r38+8]; sub.u64 %r31,%r136,%r138; .loc 1 73 17 st.u64 [%r103],%r31; .loc 1 74 16 shl.b64 %r140,%r94,3; add.u64 %r141,%r183,%r140; st.u64 [%r141],%r142; .loc 1 75 10 setp.gt.s64 %r143,%r31,0; @ %r143 bra $L5; shl.b64 %r76,%r68,3; mov.u64 %r82,%r23; .loc 1 78 11 mov.u64 %r75,0; .loc 1 79 24 mov.u64 %r144,%r75; $L6: st.u64 [%r82],%r144; .loc 1 78 27 add.u64 %r75,%r75,1; .loc 1 78 4 add.u64 %r82,%r82,%r76; setp.ne.u64 %r145,%r67,%r75; @ %r145 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r74,%r94,1; .loc 1 70 3 add.u64 %r38,%r38,24; add.u64 %r102,%r102,8; add.u64 %r103,%r103,8; setp.ne.u64 %r146,%r67,%r74; @ %r146 bra $L22; .loc 1 84 8 ld.u64 %r58,[%r111]; shl.b64 %r41,%r68,3; mov.u64 %r37,%r23; .loc 1 87 10 mov.u64 %r73,0; .loc 1 88 23 mov.u64 %r147,1; $L8: st.u64 [%r37],%r147; mov.u64 %r95,%r73; .loc 1 87 26 add.u64 %r73,%r73,1; .loc 1 87 3 add.u64 %r37,%r37,%r41; setp.ne.u64 %r148,%r94,%r95; @ %r148 bra $L8; .loc 1 101 9 setp.ne.u64 %r149,%r58,0; @ ! %r149 bra $L1; .loc 1 149 19 ld.u64 %r43,[%frame]; .loc 1 151 34 ld.u64 %r48,[%frame+120]; cvt.u32.u32 %r151,%r22; cvt.s64.s8 %r150,%r151; shl.b64 %r152,%r150,3; add.u64 %r69,%r183,%r152; .loc 1 99 12 mov.u32 %r63,127; setp.eq.u32 %r182,%r112,0; add.u64 %r184,%frame,248; .loc 1 157 13 mov.u64 %r187,0; .loc 1 174 32 add.u64 %r188,%frame,120; $L21: .loc 1 151 21 ld.u64 %r100,[%frame+240]; .loc 1 126 10 @ %r182 bra $L23; mov.u64 %r39,%r100; mov.u64 %r56,%r58; $L14: .loc 1 129 10 ld.s8 %r32,[%r56]; .loc 1 129 9 cvt.u16.u32 %r155,%r32; cvt.u16.u32 %r156,%r63; setp.gt.s16 %r157,%r155,%r156; @ %r157 bra $L12; mov.u64 %r71,%r184; mov.u64 %r33,%r23; mov.u64 %r85,%r39; $L13: .loc 1 133 34 add.u64 %r159,%r85,1; .loc 1 133 23 st.u64 [%r33],%r159; .loc 1 132 3 add.u64 %r33,%r33,%r41; setp.eq.u64 %r160,%r69,%r71; @ %r160 bra $L24; .loc 1 133 30 ld.u64 %r85,[%r71]; add.u64 %r71,%r71,8; bra $L13; $L24: mov.u32 %r63,%r32; $L12: .loc 1 135 11 add.u64 %r56,%r56,%r43; .loc 1 137 9 add.u64 %r39,%r39,1; .loc 1 137 2 st.u64 [%frame+240],%r39; setp.ne.u64 %r161,%r39,%r48; @ %r161 bra $L14; bra $L34; $L23: mov.u64 %r46,%r100; mov.u64 %r57,%r58; $L11: .loc 1 141 10 ld.s8 %r40,[%r57]; .loc 1 141 9 cvt.u16.u32 %r162,%r40; cvt.u16.u32 %r163,%r63; setp.ge.s16 %r164,%r162,%r163; @ %r164 bra $L16; mov.u64 %r80,%r184; mov.u64 %r98,%r23; mov.u64 %r81,%r46; $L17: .loc 1 145 34 add.u64 %r166,%r81,1; .loc 1 145 23 st.u64 [%r98],%r166; .loc 1 144 3 add.u64 %r98,%r98,%r41; setp.eq.u64 %r167,%r69,%r80; @ %r167 bra $L25; .loc 1 145 30 ld.u64 %r81,[%r80]; add.u64 %r80,%r80,8; bra $L17; $L25: mov.u32 %r63,%r40; $L16: .loc 1 149 9 add.u64 %r57,%r57,%r43; .loc 1 151 14 add.u64 %r46,%r46,1; .loc 1 151 7 st.u64 [%frame+240],%r46; setp.ne.u64 %r168,%r46,%r48; @ %r168 bra $L11; .loc 1 149 9 sub.u64 %r169,%r48,%r100; mad.lo.u64 %r58,%r169,%r43,%r58; bra $L18; $L34: .loc 1 135 11 sub.u64 %r171,%r48,%r100; mad.lo.u64 %r58,%r171,%r43,%r58; $L18: mov.u64 %r66,%r184; .loc 1 151 34 mov.u64 %r55,%r48; mov.u64 %r53,%r43; .loc 1 152 9 mov.u64 %r62,0; $L20: .loc 1 157 13 st.u64 [%r66+-8],%r187; .loc 1 160 23 mul.lo.u64 %r50,%r53,%r55; mov.u64 %r96,%r62; .loc 1 161 5 add.u64 %r62,%r62,1; .loc 1 162 7 setp.eq.u64 %r175,%r96,%r94; @ %r175 bra $L1; .loc 1 170 16 ld.u64 %r176,[%r66]; add.u64 %r52,%r176,1; st.u64 [%r66],%r52; shl.b64 
%r61,%r62,3; .loc 1 171 23 add.u64 %r177,%frame,%r61; ld.u64 %r53,[%r177]; .loc 1 171 13 sub.u64 %r178,%r53,%r50; add.u64 %r58,%r58,%r178; .loc 1 174 32 add.u64 %r180,%r188,%r61; ld.u64 %r55,[%r180]; .loc 1 174 7 add.u64 %r66,%r66,8; setp.eq.u64 %r181,%r52,%r55; @ %r181 bra $L20; bra $L21; $L_gfortran_mminloc0_8_i1 .visible .func _gfortran_mminloc0_8_i1reg .u64 %stack; mov.u64 %stack,0480u329u64pred %r185; .reg .u64 %r188; .reg .u64u32 %r196; .reg .u32 %r197; .reg .u32u16pred %r223; .reg .u64 %r224; .reg .predu32 %r232; .reg .pred %r233; .reg .pred %r234; .reg .u64 %r236; .reg .predpred %r248; .reg .pred %r249; .reg .u64 %r250; .reg .u16 %r252; .reg .u32 %r253; .reg .predu64 %r274; .reg .pred %r275; .reg .u64mov.u64 %r158,%ar0; mov.u64 %r159,%ar1; mov.u64 %r160,%ar2; mov.u32 %r161,%ar3; .loc 1 202 6 setp.ne.u64 %r162,%r160,0; @ %r162 bra $L3615r159161; call _gfortran_minloc0_8_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r159+28]; .loc 1 209 6 setp.gt.s32 %r166,%r22,0; @ %r166 bra $L38; .loc 1 210 5 cvta.const.u64 %r16167_gfortran_runtime_error%r169,[%r1539; .loc 1 214 7 st.u64 [%r158+48],%r169; add.u32 %r172,%r22,-1; cvt.s64.s32 %r173,%r172; st.u64 [%r158+56],%r173; mov.u64 %r174,1; st.u64 [%r158+40],%r174; .loc 1 215 28 cvt.u32.u64 %r175,%r174; st.u8 [%r158+28],%r175; .loc 1 216 24 st.u64 [%r158+8],%r169; .loc 1 217 29 cvt.u32.u32 %r180,%r22; cvt.s64.s8 %r179,%r180; mov.u64 %r178call (%value_in),_gfortrani_xmallocarray181,[%value_in]; } .loc 1 217 27 st.u64 [%r158],%r181; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r183,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r184,[%r183+36]; setp.eq.u32 %r185,%r184,0r159188; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r191,$LC2; {15919188; call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r160+16]; .loc 1 233 9 ld.u64 %r94,[%r160]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r193,%r32,-4; and.b32 %r194,%r193,-5; set.u32.eq.u32 %r196,%r194,0; neg.s32 %r197,%r196; .loc 1 235 22 add.u32 %r198,%r32,-1; set.u32.le.u32 %r200,%r198,1; neg.s32 %r201,%r200; .loc 1 235 6 cvt.u16.u32 %r203,%r197; cvt.u16.u32 %r204,%r201; or.b16 %r202,%r203,%r204; cvt.u32.u16 %r205,%r202; cvt.u16.u8 %r206,%r205; setp.ne.u16 %r207,%r206,0; @ %r207 bra $L41; .loc 1 237 7 setp.ne.u32 %r209,%r32,16; @ %r209 bra $L42; $L41: .loc 1 244 11 ld.u64 %r107,[%r158+40]; .loc 1 245 8 ld.u64 %r108,[%r158]; .loc 1 246 17 cvt.u32.u32 %r210,%r22; cvt.s64.s8 %r67,%r210; add.u64 %r151,%r159,40; add.u64 %r278,%frame,120; mov.u64 %r147,%r278; add.u64 %r145,%r160,40; mov.u64 %r144,%frame; add.u64 %r69,%frame,240; add.u64 %r63,%frame,360; mov.u64 %r130,%r63; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r219,%r29; bra $L43; $L42: .loc 1 242 5_gfortran_runtime_error64: mov.u64 %r29,%r112; $L43: .loc 1 248 18 ld.u64 %r213,[%r151]; st.u64 [%r147],%r213; .loc 1 249 20 ld.u64 %r215,[%r145]; mul.lo.u64 %r214,%r215,%r31; .loc 1 249 18 st.u64 [%r144],%r214; .loc 1 250 19 ld.u64 %r217,[%r151+16]; add.u64 %r216,%r217,1; ld.u64 %r218,[%r151+8]; sub.u64 %r44,%r216,%r218; .loc 1 250 17 st.u64 [%r69],%r44; .loc 1 251 16 st.u64 [%r130],%r219; .loc 1 252 10 setp.gt.s64 %r220,%r44,0; @ %r220 bra $L44; shl.b64 %r135,%r107,3; mov.u64 %r126,%r108; .loc 1 255 11 mov.u64 %r113,0; .loc 1 256 24 mov.u64 %r221,%r113; $L45: st.u64 [%r126],%r221; .loc 1 255 27 add.u64 %r113,%r113,1; .loc 1 255 4 add.u64 %r126,%r126,%r135; setp.ne.u64 %r222,%r67,%r113; @ %r222 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r112,%r29,1; .loc 1 246 3 add.u64 
%r151,%r151,24; add.u64 %r147,%r147,8; add.u64 %r145,%r145,24; add.u64 %r144,%r144,8; add.u64 %r69,%r69,8; add.u64 %r130,%r130,8; setp.ne.u64 %r223,%r67,%r112; @ %r223 bra $L64; .loc 1 261 8 ld.u64 %r90,[%r159]; shl.b64 %r156,%r107,3; mov.u64 %r154,%r108; .loc 1 264 10 mov.u64 %r111,0; .loc 1 265 23 mov.u64 %r224,%r111; $L47: st.u64 [%r154],%r224; mov.u64 %r138,%r111; .loc 1 264 26 add.u64 %r111,%r111,1; .loc 1 264 3 add.u64 %r154,%r154,%r156; setp.ne.u64 %r225,%r29,%r138; @ %r225 bra $L47; .loc 1 276 9 setp.eq.u64 %r226,%r90,0; @ %r226 bra $L35; .loc 1 331 19 ld.u64 %r71,[%frame+120]; .loc 1 332 20 ld.u64 %r73,[%frame]; .loc 1 334 34 ld.u64 %r76,[%frame+240]; ld.u64 %r153,[%frame+360]; cvt.u32.u32 %r228,%r22; cvt.s64.s8 %r227,%r228; shl.b64 %r229,%r227,3; add.u64 %r116,%r229,%r63; .loc 1 274 12 mov.u32 %r99,127; setp.eq.u32 %r277,1,0; add.u64 %r276,%frame,368; .loc 1 340 13 mov.u64 %r283,0; .loc 1 359 32 add.u64 %r284,%frame,240; .loc 1 308 12 setp.eq.u32 %r286,%r161,0; $L63: .loc 1 280 10 @ %r277 bra $L48; mov.u64 %r53,%r153; mov.u32 %r91,0; mov.u32 %r280,1; $L53: mov.u64 %r45,%r90; .loc 1 300 13 add.u64 %r90,%r45,%r71; .loc 1 284 11 ld.s8 %r232,[%r940; @ %r233 bra $L49; setp.eq.u32 %r234,%r91,0; selp.u64 %r153,%r153,%r53,%r234; .loc 1 294 16 ld.s8 %r99,[%r45]; mov.u64 %r125,%r276; mov.u64 %r136,%r108; mov.u64 %r149,%r153; $L52: .loc 1 296 33 add.u64 %r236,%r149,1; .loc 1 296 22 st.u64 [%r136],%r236; .loc 1 295 9 add.u64 %r136,%r136,%r156; setp.eq.u64 %r237,%r116,%r125; @ %r237 bra $L65; .loc 1 296 29 ld.u64 %r149,[%r125]; add.u64 %r125,%r125,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r94,%r94,%r73; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r91,%r280; .loc 1 303 4 setp.ne.u64 %r238,%r53,%r76; @ %r238 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r286 bra $L66; mov.u64 %r65,%r153; mov.u64 %r88,%r90; $L58: .loc 1 311 11 ld.s8 %r241,[%r94]; cvt.u16.u32 %r240,%r241; setp.eq.u16 %r242,%r240,0; @ %r242 bra $L56; .loc 1 311 12 ld.s8 %r60,[%r88]; .loc 1 311 11 cvt.u16.u32 %r243,%r60; cvt.u16.u32 %r244,%r99; setp.gt.s16 %r245,%r243,%r244; @ %r245 bra $L56; mov.u64 %r61,%r276; mov.u64 %r92,%r108; mov.u64 %r146,%r65; .loc 1 314 18 mov.u64 %r139,%r283; $L57: .loc 1 315 36 add.u64 %r247,%r146,1; .loc 1 315 25 st.u64 [%r92],%r247; .loc 1 314 34 add.u64 %r110,%r139,1; .loc 1 314 11 add.u64 %r92,%r92,%r156; setp.le.s64 %r248,%r29,%r139; @ %r248 bra $L67; .loc 1 315 32 ld.u64 %r146,[%r61]; add.u64 %r61,%r61,8; mov.u64 %r139,%r110; bra $L57; $L67: mov.u32 %r99,%r60; $L56: .loc 1 317 8 add.u64 %r88,%r88,%r71; .loc 1 319 13 add.u64 %r65,%r65,1; .loc 1 319 6 st.u64 [%frame+360],%r65; setp.ne.u64 %r249,%r65,%r76; @ %r249 bra $L58; .loc 1 317 8 sub.u64 %r250,%r76,%r153; mad.lo.u64 %r90,%r250,%r71,%r90; bra $L59; $L66: mov.u64 %r75,%r153; mov.u64 %r93,%r94; mov.u64 %r89,%r90; $L55: .loc 1 323 11 ld.s8 %r253,[%r93]; cvt.u16.u32 %r252,%r253; setp.eq.u16 %r254,%r252,0; @ %r254 bra $L60; .loc 1 323 12 ld.s8 %r68,[%r89]; .loc 1 323 11 cvt.u16.u32 %r255,%r68; cvt.u16.u32 %r256,%r99; setp.ge.s16 %r257,%r255,%r256; @ %r257 bra $L60; mov.u64 %r114,%r276; mov.u64 %r56,%r108; mov.u64 %r142,%r75; .loc 1 326 12 mov.u64 %r140,%r283; $L61: .loc 1 327 36 add.u64 %r259,%r142,1; .loc 1 327 25 st.u64 [%r56],%r259; .loc 1 326 28 add.u64 %r109,%r140,1; .loc 1 326 5 add.u64 %r56,%r56,%r156; setp.le.s64 %r260,%r29,%r140; @ %r260 bra $L68; .loc 1 327 32 ld.u64 %r142,[%r114]; add.u64 %r114,%r114,8; mov.u64 %r140,%r109; bra $L61; $L68: mov.u32 %r99,%r68; $L60: .loc 1 331 9 add.u64 %r89,%r89,%r71; .loc 1 332 10 add.u64 %r93,%r93,%r73; .loc 
1 334 14 add.u64 %r75,%r75,1; .loc 1 334 7 st.u64 [%frame+360],%r75; setp.ne.u64 %r261,%r75,%r76; @ %r261 bra $L55; sub.u64 %r122,%r76,%r153; .loc 1 331 9 mad.lo.u64 %r90,%r71,%r122,%r90; .loc 1 332 10 mad.lo.u64 %r94,%r73,%r122,%r94; bra $L59; $L89: st.u64 [%frame+360],%r76; $L59: mov.u64 %r102,%r276; mov.u64 %r85,%r73; .loc 1 334 34 mov.u64 %r87,%r76; mov.u64 %r83,%r71; mov.u64 %r101,8; .loc 1 335 9 mov.u64 %r97,0; $L62: .loc 1 340 13 st.u64 [%r102+-8],%r283; .loc 1 343 23 mul.lo.u64 %r78,%r83,%r87; .loc 1 344 24 mul.lo.u64 %r80,%r87,%r85; mov.u64 %r54,%r97; .loc 1 345 5 add.u64 %r97,%r97,1; .loc 1 346 7 setp.eq.u64 %r266,%r29,%r54; @ %r266 bra $L35; .loc 1 354 16 ld.u64 %r267,[%r102]; add.u64 %r82,%r267,1; st.u64 [%r102],%r82; .loc 1 355 23 add.u64 %r269,%r278,%r101; ld.u64 %r83,[%r269]; .loc 1 355 13 sub.u64 %r270,%r83,%r78; add.u64 %r90,%r90,%r270; .loc 1 356 24 add.u64 %r271,%frame,%r101; ld.u64 %r85,[%r271]; .loc 1 356 14 sub.u64 %r272,%r85,%r80; add.u64 %r94,%r94,%r272; .loc 1 359 32 add.u64 %r274,%r284,%r101; ld.u64 %r87,[%r274]; .loc 1 359 7 add.u64 %r102,%r102,8; add.u64 %r101,%r101,8; setp.eq.u64 %r275,%r82,%r87; @ %r275 bra $L62; ld.u64 %r153,[%frame+360]; bra $L63; $L65: mov.u64 %r90,%r45; setp.eq.u32 %r277,1,1; bra $L48; $L35: .loc 1 _gfortran_sminloc0_8_i1 .visible .func _gfortran_sminloc0_8_i1u32pred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 378 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 380 7 {_gfortran_minloc0_8_.loc 1 381 7 bra $L90; $L92: .loc 1 384 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29 mov.u64 call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36; .loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,3; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 mov.u64 %r68,%r32; $L97: st.u64 [%r29],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L97; $L90: .loc 1 406 1 ret; } minloc0_16_i1.o/_gfortran_minloc0_16_i1 .visible .func _gfortran_minloc0_16_i1libgfortran/generated/minloc0_16_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_16_i1 .visible .func _gfortran_mminloc0_16_i1_gfortran_sminloc0_16_i1 .visible .func _gfortran_sminloc0_16_i1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_16_i1 .visible .func _gfortran_minloc0_16_i1stack; mov.u64 %stack,0; .local 
.align 16 .b8 %frame_ar[36832pred %r166; .reg .u16 %r167; .reg .u16 %r168; .reg .predmov.u64 %r112,%ar0; mov.u64 %r113,%ar1; mov.u32 %r114,%ar2; .loc 1 50 10 ld.s8 %r22,[%r113+28]; .loc 1 50 8 cvt.u32.u32 %r115,%r22; cvt.s64.s8 %r69,%r115; .loc 1 51 6 setp.gt.s64 %r116,%r69,0; @ %r116 bra $L2; .loc 1 52 5 cvta.const.u64 %r11117_gfortran_runtime_errorr23,[%r112]; .loc 1 54 6 setp.ne.u64 %r119,%r23,0; @ %r119 bra $L3; .loc 1 56 7 st.u64 [%r112+48],%r23; add.u64 %r121,%r69,-1; st.u64 [%r112+56],%r121; mov.u64 %r122,1; st.u64 [%r112+40],%r122; .loc 1 57 28 cvt.u32.u64 %r123,%r122; st.u8 [%r112+28],%r123; .loc 1 58 24 st.u64 [%r112+8],%r23; .loc 1 59 29 mov.u64 %r126,16126; call (%value_in),_gfortrani_xmallocarray129,[%value_in]; } mov.u64 %r23,%r129; .loc 1 59 27 st.u64 [%r112],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r131,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r132,[%r131+36]; setp.eq.u32 %r133,%r132,0; @ %r133 bra $L4; .loc 1 64r113136; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r112]; $L4: .loc 1 68 11 ld.u64 %r70,[%r112+40]; add.u64 %r44,%r113,40; mov.u64 %r39,%frame; add.u64 %r105,%frame,120; .loc 1 70 10 mov.u64 %r97,0; add.u64 %r189,%frame,240; .loc 1 74 16 mov.u64 %r144,%r97; bra $L7; $L22: mov.u64 %r97,%r76; $L7: .loc 1 72 18 ld.u64 %r137,[%r44]; st.u64 [%r39],%r137; .loc 1 73 19 ld.u64 %r139,[%r44+16]; add.u64 %r138,%r139,1; ld.u64 %r140,[%r44+8]; sub.u64 %r32,%r138,%r140; .loc 1 73 17 st.u64 [%r105],%r32; .loc 1 74 16 shl.b64 %r142,%r97,3; add.u64 %r143,%r189,%r142; st.u64 [%r143],%r144; .loc 1 75 10 setp.gt.s64 %r145,%r32,0; @ %r145 bra $L5; shl.b64 %r83,%r70,4; mov.u64 %r85,%r23; .loc 1 78 11 mov.u64 %r77,0; .loc 1 79 24 mov.u64 %r146,%r77; $L6: st.u64 [%r85],%r146; st.u64 [%r85+8],%r146; .loc 1 78 27 add.u64 %r77,%r77,1; .loc 1 78 4 add.u64 %r85,%r85,%r83; setp.ne.u64 %r148,%r69,%r77; @ %r148 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r76,%r97,1; .loc 1 70 3 add.u64 %r44,%r44,24; add.u64 %r39,%r39,8; add.u64 %r105,%r105,8; setp.ne.u64 %r149,%r69,%r76; @ %r149 bra $L22; .loc 1 84 8 ld.u64 %r60,[%r113]; shl.b64 %r37,%r70,4; mov.u64 %r82,%r23; .loc 1 87 10 mov.u64 %r75,0; .loc 1 88 23 mov.u64 %r150,1; mov.u64 %r151,%r75; $L8: st.u64 [%r82],%r150; st.u64 [%r82+8],%r151; mov.u64 %r98,%r75; .loc 1 87 26 add.u64 %r75,%r75,1; .loc 1 87 3 add.u64 %r82,%r82,%r37; setp.ne.u64 %r152,%r97,%r98; @ %r152 bra $L8; .loc 1 101 9 setp.ne.u64 %r153,%r60,0; @ ! 
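// Archive member minloc0_16_i1.o (libgfortran/generated/minloc0_16_i1.c): here the
// result indices are 16-byte integers, so each index store is a pair of st.u64 to
// [..] and [..+8] with the high half produced by shr.s64 ...,63 sign extension, and
// the result stride is scaled with shl.b64 ...,4.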
%r153 bra $L1; .loc 1 149 19 ld.u64 %r47,[%frame]; .loc 1 151 34 ld.u64 %r50,[%frame+120]; cvt.u32.u32 %r155,%r22; cvt.s64.s8 %r154,%r155; shl.b64 %r156,%r154,3; add.u64 %r86,%r189,%r156; .loc 1 99 12 mov.u32 %r64,127; setp.eq.u32 %r188,%r114,0; add.u64 %r190,%frame,248; .loc 1 157 13 mov.u64 %r193,0; .loc 1 174 32 add.u64 %r194,%frame,120; $L21: .loc 1 151 21 ld.u64 %r104,[%frame+240]; .loc 1 126 10 @ %r188 bra $L23; mov.u64 %r40,%r104; mov.u64 %r58,%r60; $L14: .loc 1 129 10 ld.s8 %r33,[%r58]; .loc 1 129 9 cvt.u16.u32 %r159,%r33; cvt.u16.u32 %r160,%r64; setp.gt.s16 %r161,%r159,%r160; @ %r161 bra $L12; mov.u64 %r27,%r190; mov.u64 %r61,%r23; mov.u64 %r100,%r40; $L13: .loc 1 133 34 add.u64 %r163,%r100,1; st.u64 [%r61],%r163; shr.s64 %r164,%r163,63; st.u64 [%r61+8],%r164; .loc 1 132 3 add.u64 %r61,%r61,%r37; setp.eq.u64 %r165,%r27,%r86; @ %r165 bra $L24; .loc 1 133 30 ld.u64 %r100,[%r27]; add.u64 %r27,%r27,8; bra $L13; $L24: mov.u32 %r64,%r33; $L12: .loc 1 135 11 add.u64 %r58,%r58,%r47; .loc 1 137 9 add.u64 %r40,%r40,1; .loc 1 137 2 st.u64 [%frame+240],%r40; setp.ne.u64 %r166,%r40,%r50; @ %r166 bra $L14; bra $L34; $L23: mov.u64 %r49,%r104; mov.u64 %r59,%r60; $L11: .loc 1 141 10 ld.s8 %r42,[%r59]; .loc 1 141 9 cvt.u16.u32 %r167,%r42; cvt.u16.u32 %r168,%r64; setp.ge.s16 %r169,%r167,%r168; @ %r169 bra $L16; mov.u64 %r101,%r190; mov.u64 %r102,%r23; mov.u64 %r81,%r49; $L17: .loc 1 145 34 add.u64 %r171,%r81,1; st.u64 [%r102],%r171; shr.s64 %r172,%r171,63; st.u64 [%r102+8],%r172; .loc 1 144 3 add.u64 %r102,%r102,%r37; setp.eq.u64 %r173,%r86,%r101; @ %r173 bra $L25; .loc 1 145 30 ld.u64 %r81,[%r101]; add.u64 %r101,%r101,8; bra $L17; $L25: mov.u32 %r64,%r42; $L16: .loc 1 149 9 add.u64 %r59,%r59,%r47; .loc 1 151 14 add.u64 %r49,%r49,1; .loc 1 151 7 st.u64 [%frame+240],%r49; setp.ne.u64 %r174,%r49,%r50; @ %r174 bra $L11; .loc 1 149 9 sub.u64 %r175,%r50,%r104; mad.lo.u64 %r60,%r175,%r47,%r60; bra $L18; $L34: .loc 1 135 11 sub.u64 %r177,%r50,%r104; mad.lo.u64 %r60,%r177,%r47,%r60; $L18: mov.u64 %r74,%r190; .loc 1 151 34 mov.u64 %r57,%r50; mov.u64 %r55,%r47; .loc 1 152 9 mov.u64 %r63,0; $L20: .loc 1 157 13 st.u64 [%r74+-8],%r193; .loc 1 160 23 mul.lo.u64 %r52,%r55,%r57; mov.u64 %r99,%r63; .loc 1 161 5 add.u64 %r63,%r63,1; .loc 1 162 7 setp.eq.u64 %r181,%r99,%r97; @ %r181 bra $L1; .loc 1 170 16 ld.u64 %r182,[%r74]; add.u64 %r54,%r182,1; st.u64 [%r74],%r54; shl.b64 %r66,%r63,3; .loc 1 171 23 add.u64 %r183,%frame,%r66; ld.u64 %r55,[%r183]; .loc 1 171 13 sub.u64 %r184,%r55,%r52; add.u64 %r60,%r60,%r184; .loc 1 174 32 add.u64 %r186,%r194,%r66; ld.u64 %r57,[%r186]; .loc 1 174 7 add.u64 %r74,%r74,8; setp.eq.u64 %r187,%r54,%r57; @ %r187 bra $L20; bra $L21; $L_gfortran_mminloc0_16_i1 .visible .func _gfortran_mminloc0_16_i1reg .u64 %stack; mov.u64 %stack,0480u64u64 %r170; .reg .u64 %r172; .reg .predpred %r188; .reg .u64 %r191; .reg .u64 %r194; .reg .u32 %r196; .reg .u32 %r197; .reg .u32u16 %r206; .reg .u16u32 %r213; .reg .u64pred %r223; .reg .u64pred %r239; .reg .u64 %r241; .reg .u64 %r242; .reg .pred %r243; .reg .predpred %r269; .reg .pred %r274; .reg .u64u64 %r280; .reg .u64 %r282; .reg .predu32 %r288; .reg .u64 %r291; .reg .u64 %r292; .reg .pred %r294; mov.u64 %r161,%ar0; mov.u64 %r162,%ar1; mov.u64 %r163,%ar2; mov.u32 %r164,%ar3; .loc 1 202 6 setp.ne.u64 %r165,%r163,0; @ %r165 bra $L36gfortran_minloc0_16_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r162+28]; .loc 1 209 6 setp.gt.s32 %r169,%r22,0; @ %r169 bra $L38; .loc 1 210 5 cvta.const.u64 %r170170_gfortran_runtime_error%r172,[%r161]; setp.ne.u64 
%r173,%r172,0; @ %r173 bra $L39; .loc 1 214 7 st.u64 [%r161+48],%r172; add.u32 %r175,%r22,-1; cvt.s64.s32 %r176,%r175; st.u64 [%r161+56],%r176; mov.u64 %r177,1; st.u64 [%r161+40],%r177; .loc 1 215 28 cvt.u32.u64 %r178,%r177; st.u8 [%r161+28],%r178; .loc 1 216 24 st.u64 [%r161+8],%r172; .loc 1 217 29 cvt.u32.u32 %r183,%r22; cvt.s64.s8 %r182,%r183; mov.u64 %r181,16call (%value_in),_gfortrani_xmallocarray184,[%value_in]; } .loc 1 217 27 st.u64 [%r161],%r184; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r186,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r187,[%r186+36]; setp.eq.u32 %r188,%r187,0161_gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r194163194191; call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r163+16]; .loc 1 233 9 ld.u64 %r93,[%r163]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r196,%r32,-4; and.b32 %r197,%r196,-5; set.u32.eq.u32 %r199,%r197,0; neg.s32 %r200,%r199; .loc 1 235 22 add.u32 %r201,%r32,-1; set.u32.le.u32 %r203,%r201,1; neg.s32 %r204,%r203; .loc 1 235 6 cvt.u16.u32 %r206,%r200; cvt.u16.u32 %r207,%r204; or.b16 %r205,%r206,%r207; cvt.u32.u16 %r208,%r205; cvt.u16.u8 %r209,%r20837 7 setp.ne.u32 %r212,%r32,16; @ %r212 bra $L42; $L41: .loc 1 244 11 ld.u64 %r105,[%r161+40]; .loc 1 245 8 ld.u64 %r106,[%r161]; .loc 1 246 17 cvt.u32.u32 %r213,%r22; cvt.s64.s8 %r64,%r213; add.u64 %r159,%r162,40; add.u64 %r285,%frame,120; mov.u64 %r157,%r285; add.u64 %r156,%r163,40; mov.u64 %r152,%frame; add.u64 %r153,%frame,240; add.u64 %r149,%frame,360; mov.u64 %r148,%r149; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r222,%r29; bra $L43; $L42: .loc 1 242 5_gfortran_runtime_error64: mov.u64 %r29,%r110; $L43: .loc 1 248 18 ld.u64 %r216,[%r159]; st.u64 [%r157],%r216; .loc 1 249 20 ld.u64 %r218,[%r156]; mul.lo.u64 %r217,%r218,%r31; .loc 1 249 18 st.u64 [%r152],%r217; .loc 1 250 19 ld.u64 %r220,[%r159+16]; add.u64 %r219,%r220,1; ld.u64 %r221,[%r159+8]; sub.u64 %r44,%r219,%r221; .loc 1 250 17 st.u64 [%r153],%r44; .loc 1 251 16 st.u64 [%r148],%r222; .loc 1 252 10 setp.gt.s64 %r223,%r44,0; @ %r223 bra $L44; shl.b64 %r139,%r105,4; mov.u64 %r130,%r106; .loc 1 255 11 mov.u64 %r111,0; .loc 1 256 24 mov.u64 %r224,%r111; $L45: st.u64 [%r130],%r224; st.u64 [%r130+8],%r224; .loc 1 255 27 add.u64 %r111,%r111,1; .loc 1 255 4 add.u64 %r130,%r130,%r139; setp.ne.u64 %r226,%r64,%r111; @ %r226 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r110,%r29,1; .loc 1 246 3 add.u64 %r159,%r159,24; add.u64 %r157,%r157,8; add.u64 %r156,%r156,24; add.u64 %r152,%r152,8; add.u64 %r153,%r153,8; add.u64 %r148,%r148,8; setp.ne.u64 %r227,%r64,%r110; @ %r227 bra $L64; .loc 1 261 8 ld.u64 %r90,[%r162]; shl.b64 %r113,%r105,4; mov.u64 %r51,%r106; .loc 1 264 10 mov.u64 %r109,0; .loc 1 265 23 mov.u64 %r228,%r109; $L47: st.u64 [%r51],%r228; st.u64 [%r51+8],%r228; mov.u64 %r141,%r109; .loc 1 264 26 add.u64 %r109,%r109,1; .loc 1 264 3 add.u64 %r51,%r51,%r113; setp.ne.u64 %r230,%r29,%r141; @ %r230 bra $L47; .loc 1 276 9 setp.eq.u64 %r231,%r90,0; @ %r231 bra $L35; .loc 1 331 19 ld.u64 %r70,[%frame+120]; .loc 1 332 20 ld.u64 %r72,[%frame]; .loc 1 334 34 ld.u64 %r75,[%frame+240]; ld.u64 %r158,[%frame+360]; cvt.u32.u32 %r233,%r22; cvt.s64.s8 %r232,%r233; shl.b64 %r234,%r232,3; add.u64 %r120,%r234,%r149; .loc 1 274 12 mov.u32 %r96,127; setp.eq.u32 %r284,1,0; add.u64 %r286,%frame,368; .loc 1 340 13 mov.u64 %r291,0; .loc 1 359 32 add.u64 %r292,%frame,240; .loc 1 308 12 setp.eq.u32 %r294,%r164,0; $L63: .loc 1 280 10 @ %r284 bra $L48; mov.u64 %r52,%r158; mov.u32 %r91,0; mov.u32 
%r288,1; $L53: mov.u64 %r45,%r90; .loc 1 300 13 add.u64 %r90,%r45,%r70; .loc 1 284 11 ld.s8 %r237,[%r93]; cvt.u16.u32 %r236,%r237; setp.eq.u16 %r238,%r236,0; @ %r238 bra $L49; setp.eq.u32 %r239,%r91,0; selp.u64 %r158,%r158,%r52,%r239; .loc 1 294 16 ld.s8 %r96,[%r45]; mov.u64 %r129,%r286; mov.u64 %r140,%r106; mov.u64 %r155,%r158; $L52: .loc 1 296 33 add.u64 %r241,%r155,1; st.u64 [%r140],%r241; shr.s64 %r242,%r241,63; st.u64 [%r140+8],%r242; .loc 1 295 9 add.u64 %r140,%r140,%r113; setp.eq.u64 %r243,%r120,%r129; @ %r243 bra $L65; .loc 1 296 29 ld.u64 %r155,[%r129]; add.u64 %r129,%r129,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r93,%r93,%r72; .loc 1 303 11 add.u64 %r52,%r52,1; mov.u32 %r91,%r288; .loc 1 303 4 setp.ne.u64 %r244,%r52,%r75; @ %r244 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r294 bra $L66; mov.u64 %r61,%r158; mov.u64 %r88,%r90; $L58: .loc 1 311 11 ld.s8 %r247,[%r93]; cvt.u16.u32 %r246,%r247; setp.eq.u16 %r248,%r246,0; @ %r248 bra $L56; .loc 1 311 12 ld.s8 %r57,[%r88]; .loc 1 311 11 cvt.u16.u32 %r249,%r57; cvt.u16.u32 %r250,%r96; setp.gt.s16 %r251,%r249,%r250; @ %r251 bra $L56; mov.u64 %r94,%r286; mov.u64 %r97,%r106; mov.u64 %r151,%r61; .loc 1 314 18 mov.u64 %r142,%r291; $L57: .loc 1 315 36 add.u64 %r253,%r151,1; st.u64 [%r97],%r253; shr.s64 %r254,%r253,63; st.u64 [%r97+8],%r254; .loc 1 314 34 add.u64 %r108,%r142,1; .loc 1 314 11 add.u64 %r97,%r97,%r113; setp.le.s64 %r255,%r29,%r142; @ %r255 bra $L67; .loc 1 315 32 ld.u64 %r151,[%r94]; add.u64 %r94,%r94,8; mov.u64 %r142,%r108; bra $L57; $L67: mov.u32 %r96,%r57; $L56: .loc 1 317 8 add.u64 %r88,%r88,%r70; .loc 1 319 13 add.u64 %r61,%r61,1; .loc 1 319 6 st.u64 [%frame+360],%r61; setp.ne.u64 %r256,%r61,%r75; @ %r256 bra $L58; .loc 1 317 8 sub.u64 %r257,%r75,%r158; mad.lo.u64 %r90,%r257,%r70,%r90; bra $L59; $L66: mov.u64 %r74,%r158; mov.u64 %r92,%r93; mov.u64 %r89,%r90; $L55: .loc 1 323 11 ld.s8 %r260,[%r92]; cvt.u16.u32 %r259,%r260; setp.eq.u16 %r261,%r259,0; @ %r261 bra $L60; .loc 1 323 12 ld.s8 %r65,[%r89]; .loc 1 323 11 cvt.u16.u32 %r262,%r65; cvt.u16.u32 %r263,%r96; setp.ge.s16 %r264,%r262,%r263; @ %r264 bra $L60; mov.u64 %r56,%r286; mov.u64 %r66,%r106; mov.u64 %r147,%r74; .loc 1 326 12 mov.u64 %r143,%r291; $L61: .loc 1 327 36 add.u64 %r266,%r147,1; st.u64 [%r66],%r266; shr.s64 %r267,%r266,63; st.u64 [%r66+8],%r267; .loc 1 326 28 add.u64 %r107,%r143,1; .loc 1 326 5 add.u64 %r66,%r66,%r113; setp.le.s64 %r268,%r29,%r143; @ %r268 bra $L68; .loc 1 327 32 ld.u64 %r147,[%r56]; add.u64 %r56,%r56,8; mov.u64 %r143,%r107; bra $L61; $L68: mov.u32 %r96,%r65; $L60: .loc 1 331 9 add.u64 %r89,%r89,%r70; .loc 1 332 10 add.u64 %r92,%r92,%r72; .loc 1 334 14 add.u64 %r74,%r74,1; .loc 1 334 7 st.u64 [%frame+360],%r74; setp.ne.u64 %r269,%r74,%r75; @ %r269 bra $L55; sub.u64 %r126,%r75,%r158; .loc 1 331 9 mad.lo.u64 %r90,%r70,%r126,%r90; .loc 1 332 10 mad.lo.u64 %r93,%r72,%r126,%r93; bra $L59; $L89: st.u64 [%frame+360],%r75; $L59: mov.u64 %r118,%r286; mov.u64 %r84,%r72; .loc 1 334 34 mov.u64 %r86,%r75; mov.u64 %r82,%r70; mov.u64 %r117,8; .loc 1 335 9 mov.u64 %r95,0; $L62: .loc 1 340 13 st.u64 [%r118+-8],%r291; .loc 1 343 23 mul.lo.u64 %r77,%r82,%r86; .loc 1 344 24 mul.lo.u64 %r79,%r86,%r84; mov.u64 %r62,%r95; .loc 1 345 5 add.u64 %r95,%r95,1; .loc 1 346 7 setp.eq.u64 %r274,%r29,%r62; @ %r274 bra $L35; .loc 1 354 16 ld.u64 %r275,[%r118]; add.u64 %r81,%r275,1; st.u64 [%r118],%r81; .loc 1 355 23 add.u64 %r277,%r285,%r117; ld.u64 %r82,[%r277]; .loc 1 355 13 sub.u64 %r278,%r82,%r77; add.u64 %r90,%r90,%r278; .loc 1 356 24 add.u64 %r279,%frame,%r117; ld.u64 
%r84,[%r279]; .loc 1 356 14 sub.u64 %r280,%r84,%r79; add.u64 %r93,%r93,%r280; .loc 1 359 32 add.u64 %r282,%r292,%r117; ld.u64 %r86,[%r282]; .loc 1 359 7 add.u64 %r118,%r118,8; add.u64 %r117,%r117,8; setp.eq.u64 %r283,%r81,%r86; @ %r283 bra $L62; ld.u64 %r158,[%frame+360]; bra $L63; $L65: mov.u64 %r90,%r45; setp.eq.u32 %r284,1,1; bra $L48; $L35: .loc 1 _gfortran_sminloc0_16_i1 .visible .func _gfortran_sminloc0_16_i1u32pred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 378 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 380 7 {_gfortran_minloc0_16_.loc 1 381 7 bra $L90; $L92: .loc 1 384 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36; .loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,4; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 mov.u64 %r68,%r32; $L97: st.u64 [%r29],%r68; st.u64 [%r29+8],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r70,%r30,%r32; @ %r70 bra $L97; $L90: .loc 1 406 1 ret; } minloc0_4_i2.o/_gfortran_minloc0_4_i2 .visible .func _gfortran_minloc0_4_i2libgfortran/generated/minloc0_4_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_4_i2 .visible .func _gfortran_mminloc0_4_i2_gfortran_sminloc0_4_i2 .visible .func _gfortran_sminloc0_4_i2VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_4_i2 .visible .func _gfortran_minloc0_4_i2stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[368u64predpredpred %r168; .reg .pred %r169; .reg .u16 %r170; .reg .u16 %r171; .reg .pred %r172; .reg .u32 %r174; .reg .u32predu64 %r198; mov.u64 %r116,%ar0; mov.u64 %r117,%ar1; mov.u32 %r118,%ar2; .loc 1 50 10 ld.s8 %r22,[%r117+28]; .loc 1 50 8 cvt.u32.u32 %r119,%r22; cvt.s64.s8 %r72,%r119; .loc 1 51 6 setp.gt.s64 %r120,%r72,0; @ %r120 bra $L2; .loc 1 52 5 cvta.const.u64 %r121121_gfortran_runtime_errorr23,[%r116]; .loc 1 54 6 setp.ne.u64 %r123,%r23,0; @ %r123 bra $L3; .loc 1 56 7 st.u64 [%r116+48],%r23; add.u64 %r125,%r72,-1; st.u64 [%r116+56],%r125; mov.u64 %r126,1; st.u64 [%r116+40],%r126; .loc 1 57 28 cvt.u32.u64 %r127,%r126; st.u8 [%r116+28],%r127; .loc 1 58 24 st.u64 [%r116+8],%r23; .loc 1 59 29 mov.u64 %r130,4;r130; call (%value_in),_gfortrani_xmallocarray133,[%value_in]; } mov.u64 %r23,%r133; .loc 1 59 27 st.u64 [%r116],%r23; bra $L4; $L3: .loc 1 63 11 
cvta.global.u64 %r135,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r136,[%r135+36]; setp.eq.u32 %r137,%r136,0; @ %r137 bra $L4; .loc 1 64r140; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r116]; $L4: .loc 1 68 11 ld.u64 %r73,[%r116+40]; add.u64 %r39,%r117,40; mov.u64 %r45,%frame; add.u64 %r35,%frame,120; .loc 1 70 10 mov.u64 %r99,0; add.u64 %r193,%frame,240; .loc 1 74 16 mov.u64 %r148,%r99; bra $L7; $L22: mov.u64 %r99,%r79; $L7: .loc 1 72 18 ld.u64 %r141,[%r39]; st.u64 [%r45],%r141; .loc 1 73 19 ld.u64 %r143,[%r39+16]; add.u64 %r142,%r143,1; ld.u64 %r144,[%r39+8]; sub.u64 %r31,%r142,%r144; .loc 1 73 17 st.u64 [%r35],%r31; .loc 1 74 16 shl.b64 %r146,%r99,3; add.u64 %r147,%r193,%r146; st.u64 [%r147],%r148; .loc 1 75 10 setp.gt.s64 %r149,%r31,0; @ %r149 bra $L5; shl.b64 %r85,%r73,2; mov.u64 %r87,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 cvt.u32.u64 %r150,%r80; $L6: st.u32 [%r87],%r150; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r87,%r87,%r85; setp.ne.u64 %r151,%r72,%r80; @ %r151 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r99,1; .loc 1 70 3 add.u64 %r39,%r39,24; add.u64 %r45,%r45,8; add.u64 %r35,%r35,8; setp.ne.u64 %r152,%r72,%r79; @ %r152 bra $L22; .loc 1 84 8 ld.u64 %r64,[%r117]; shl.b64 %r82,%r73,2; mov.u64 %r92,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u32 %r153,1; $L8: st.u32 [%r92],%r153; mov.u64 %r32,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r92,%r92,%r82; setp.ne.u64 %r154,%r99,%r32; @ %r154 bra $L8; .loc 1 101 9 setp.ne.u64 %r155,%r64,0; @ ! %r155 bra $L1; .loc 1 149 19 ld.u64 %r49,[%frame]; .loc 1 149 9 add.u64 %r51,%r49,%r49; .loc 1 151 34 ld.u64 %r53,[%frame+120]; cvt.u32.u32 %r158,%r22; cvt.s64.s8 %r157,%r158; shl.b64 %r159,%r157,3; add.u64 %r102,%r193,%r159; .loc 1 99 12 mov.u32 %r67,32767; setp.eq.u32 %r192,%r118,0; add.u64 %r194,%frame,248; .loc 1 157 13 mov.u64 %r197,0; .loc 1 174 32 add.u64 %r198,%frame,120; $L21: .loc 1 151 21 ld.u64 %r110,[%frame+240]; .loc 1 126 10 @ %r192 bra $L23; mov.u64 %r40,%r110; mov.u64 %r62,%r64; $L14: .loc 1 129 10 ld.s16 %r33,[%r62]; .loc 1 129 9 cvt.u16.u32 %r162,%r33; cvt.u16.u32 %r163,%r67; setp.gt.s16 %r164,%r162,%r163; @ %r164 bra $L12; mov.u64 %r43,%r194; mov.u64 %r65,%r23; mov.u64 %r105,%r40; $L13: .loc 1 133 34 cvt.u32.u64 %r167,%r105; add.u32 %r166,%r167,1; .loc 1 133 23 st.u32 [%r65],%r166; .loc 1 132 3 add.u64 %r65,%r65,%r82; setp.eq.u64 %r168,%r43,%r102; @ %r168 bra $L24; .loc 1 133 30 ld.u64 %r105,[%r43]; add.u64 %r43,%r43,8; bra $L13; $L24: mov.u32 %r67,%r33; $L12: .loc 1 135 11 add.u64 %r62,%r62,%r51; .loc 1 137 9 add.u64 %r40,%r40,1; .loc 1 137 2 st.u64 [%frame+240],%r40; setp.ne.u64 %r169,%r40,%r53; @ %r169 bra $L14; bra $L34; $L23: mov.u64 %r52,%r110; mov.u64 %r63,%r64; $L11: .loc 1 141 10 ld.s16 %r42,[%r63]; .loc 1 141 9 cvt.u16.u32 %r170,%r42; cvt.u16.u32 %r171,%r67; setp.ge.s16 %r172,%r170,%r171; @ %r172 bra $L16; mov.u64 %r108,%r194; mov.u64 %r109,%r23; mov.u64 %r44,%r52; $L17: .loc 1 145 34 cvt.u32.u64 %r175,%r44; add.u32 %r174,%r175,1; .loc 1 145 23 st.u32 [%r109],%r174; .loc 1 144 3 add.u64 %r109,%r109,%r82; setp.eq.u64 %r176,%r102,%r108; @ %r176 bra $L25; .loc 1 145 30 ld.u64 %r44,[%r108]; add.u64 %r108,%r108,8; bra $L17; $L25: mov.u32 %r67,%r42; $L16: .loc 1 149 9 add.u64 %r63,%r63,%r51; .loc 1 151 14 add.u64 %r52,%r52,1; .loc 1 151 7 st.u64 [%frame+240],%r52; setp.ne.u64 %r177,%r52,%r53; @ %r177 bra $L11; .loc 1 149 9 sub.u64 %r178,%r53,%r110; mad.lo.u64 %r64,%r178,%r51,%r64; bra $L18; $L34: .loc 1 135 11 sub.u64 %r180,%r53,%r110; mad.lo.u64 
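// Archive member minloc0_4_i2.o (libgfortran/generated/minloc0_4_i2.c): the source
// elements are now 16-bit (ld.s16), so the running minimum is seeded with 32767
// rather than 127 and the source stride is doubled (add.u64 %r51,%r49,%r49) before
// stepping the base pointer; the result indices are 4-byte integers again.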
%r64,%r180,%r51,%r64; $L18: mov.u64 %r81,%r194; .loc 1 151 34 mov.u64 %r61,%r53; mov.u64 %r59,%r49; .loc 1 152 9 mov.u64 %r66,0; $L20: .loc 1 157 13 st.u64 [%r81+-8],%r197; .loc 1 160 23 mul.lo.u64 %r56,%r59,%r61; mov.u64 %r100,%r66; .loc 1 161 5 add.u64 %r66,%r66,1; .loc 1 162 7 setp.eq.u64 %r184,%r100,%r99; @ %r184 bra $L1; .loc 1 170 16 ld.u64 %r185,[%r81]; add.u64 %r58,%r185,1; st.u64 [%r81],%r58; shl.b64 %r70,%r66,3; .loc 1 171 23 add.u64 %r186,%frame,%r70; ld.u64 %r59,[%r186]; .loc 1 171 13 sub.u64 %r187,%r59,%r56; add.u64 %r188,%r187,%r187; add.u64 %r64,%r64,%r188; .loc 1 174 32 add.u64 %r190,%r198,%r70; ld.u64 %r61,[%r190]; .loc 1 174 7 add.u64 %r81,%r81,8; setp.eq.u64 %r191,%r58,%r61; @ %r191 bra $L20; bra $L21; $L_gfortran_mminloc0_4_i2 .visible .func _gfortran_mminloc0_4_i2reg .u64 %stack; mov.u64 %stack,0480predpred %r193; .reg .u64 %r196; .reg .u6464pred %r228; .reg .u32 %r229; .reg .predpredu64predu64 %r279; .reg .u64 %r281; .reg .u64 %r282; .reg .u64pred %r299; mov.u64 %r166,%ar0; mov.u64 %r167,%ar1; mov.u64 %r168,%ar2; mov.u32 %r169,%ar3; .loc 1 202 6 setp.ne.u64 %r170,%r168,0; @ %r170 bra $L36gfortran_minloc0_4_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r167+28]; .loc 1 209 6 setp.gt.s32 %r174,%r22,0; @ %r174 bra $L38; .loc 1 210 5 cvta.const.u64 %r175175_gfortran_runtime_error%r177,[%r166]; setp.ne.u64 %r178,%r177,0; @ %r178 bra $L39; .loc 1 214 7 st.u64 [%r166+48],%r177; add.u32 %r180,%r22,-1; cvt.s64.s32 %r181,%r180; st.u64 [%r166+56],%r181; mov.u64 %r182,1; st.u64 [%r166+40],%r182; .loc 1 215 28 cvt.u32.u64 %r183,%r182; st.u8 [%r166+28],%r183; .loc 1 216 24 st.u64 [%r166+8],%r177; .loc 1 217 29 cvt.u32.u32 %r188,%r22; cvt.s64.s8 %r187,%r188call (%value_in),_gfortrani_xmallocarray189,[%value_in]; } .loc 1 217 27 st.u64 [%r166],%r189; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r191,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r192,[%r191+36];r167196; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r168+16]; .loc 1 233 9 ld.u64 %r96,[%r168]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r201,%r32,-4; and.b32 %r202,%r201,-5; set.u32.eq.u32 %r204,%r202,0; neg.s32 %r205,%r204; .loc 1 235 22 add.u32 %r206,%r32,-1; set.u32.le.u32 %r208,%r206,1; neg.s32 %r209,%r208; .loc 1 235 6 cvt.u16.u32 %r211,%r205; cvt.u16.u32 %r212,%r209; or.b16 %r210,%r211,%r212; cvt.u32.u16 %r213,%r210; cvt.u16.u8 %r214,%r213; setp.ne.u16 %r215,%r214,0; @ %r215 bra $L41; .loc 1 237 7 setp.ne.u32 %r217,%r32,16; @ %r217 bra $L42; $L41: .loc 1 244 11 ld.u64 %r109,[%r166+40]; .loc 1 245 8 ld.u64 %r110,[%r166]; .loc 1 246 17 cvt.u32.u32 %r218,%r22; cvt.s64.s8 %r67,%r218; add.u64 %r52,%r167,40; add.u64 %r289,%frame,120; mov.u64 %r118,%r289; add.u64 %r105,%r168,40; mov.u64 %r165,%frame; add.u64 %r162,%frame,240; add.u64 %r159,%frame,360; mov.u64 %r160,%r159; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r227,%r29; bra $L43; $L42: .loc 1 242 5_gfortran_runtime_error64: mov.u64 %r29,%r115; $L43: .loc 1 248 18 ld.u64 %r221,[%r52]; st.u64 [%r118],%r221; .loc 1 249 20 ld.u64 %r223,[%r105]; mul.lo.u64 %r222,%r223,%r31; .loc 1 249 18 st.u64 [%r165],%r222; .loc 1 250 19 ld.u64 %r225,[%r52+16]; add.u64 %r224,%r225,1; ld.u64 %r226,[%r52+8]; sub.u64 %r44,%r224,%r226; .loc 1 250 17 st.u64 [%r162],%r44; .loc 1 251 16 st.u64 [%r160],%r227; .loc 1 252 10 setp.gt.s64 %r228,%r44,0; @ %r228 bra $L44; shl.b64 %r149,%r109,2; mov.u64 %r138,%r110; .loc 1 255 11 mov.u64 %r116,0; .loc 1 256 24 
cvt.u32.u64 %r229,%r116; $L45: st.u32 [%r138],%r229; .loc 1 255 27 add.u64 %r116,%r116,1; .loc 1 255 4 add.u64 %r138,%r138,%r149; setp.ne.u64 %r230,%r67,%r116; @ %r230 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r115,%r29,1; .loc 1 246 3 add.u64 %r52,%r52,24; add.u64 %r118,%r118,8; add.u64 %r105,%r105,24; add.u64 %r165,%r165,8; add.u64 %r162,%r162,8; add.u64 %r160,%r160,8; setp.ne.u64 %r231,%r67,%r115; @ %r231 bra $L64; .loc 1 261 8 ld.u64 %r94,[%r167]; shl.b64 %r57,%r109,2; mov.u64 %r60,%r110; .loc 1 264 10 mov.u64 %r114,0; .loc 1 265 23 cvt.u32.u64 %r232,%r114; $L47: st.u32 [%r60],%r232; mov.u64 %r145,%r114; .loc 1 264 26 add.u64 %r114,%r114,1; .loc 1 264 3 add.u64 %r60,%r60,%r57; setp.ne.u64 %r233,%r29,%r145; @ %r233 bra $L47; .loc 1 276 9 setp.eq.u64 %r234,%r94,0; @ %r234 bra $L35; .loc 1 331 19 ld.u64 %r73,[%frame+120]; .loc 1 331 9 add.u64 %r75,%r73,%r73; .loc 1 332 20 ld.u64 %r76,[%frame]; .loc 1 334 34 ld.u64 %r79,[%frame+240]; ld.u64 %r163,[%frame+360]; cvt.u32.u32 %r237,%r22; cvt.s64.s8 %r236,%r237; shl.b64 %r238,%r236,3; add.u64 %r127,%r238,%r159; .loc 1 274 12 mov.u32 %r100,32767; setp.eq.u32 %r291,1,0; add.u64 %r290,%frame,368; .loc 1 340 13 mov.u64 %r296,0; .loc 1 359 32 add.u64 %r297,%frame,240; .loc 1 308 12 setp.eq.u32 %r299,%r169,0; $L63: .loc 1 280 10 @ %r291 bra $L48; mov.u64 %r53,%r163; mov.u32 %r46,0; mov.u32 %r293,1; $L53: mov.u64 %r45,%r94; .loc 1 300 13 add.u64 %r94,%r45,%r75; .loc 1 284 11 ld.s8 %r241,[%r96]; cvt.u16.u32 %r240,%r241; setp.eq.u16 %r242,%r240,0; @ %r242 bra $L49; setp.eq.u32 %r243,%r46,0; selp.u64 %r163,%r163,%r53,%r243; .loc 1 294 16 ld.s16 %r100,[%r45]; mov.u64 %r137,%r290; mov.u64 %r150,%r110; mov.u64 %r161,%r163; $L52: .loc 1 296 33 cvt.u32.u64 %r246,%r161; add.u32 %r245,%r246,1; .loc 1 296 22 st.u32 [%r150],%r245; .loc 1 295 9 add.u64 %r150,%r150,%r57; setp.eq.u64 %r247,%r127,%r137; @ %r247 bra $L65; .loc 1 296 29 ld.u64 %r161,[%r137]; add.u64 %r137,%r137,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r96,%r96,%r76; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r46,%r293; .loc 1 303 4 setp.ne.u64 %r248,%r53,%r79; @ %r248 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r299 bra $L66; mov.u64 %r64,%r163; mov.u64 %r92,%r94; $L58: .loc 1 311 11 ld.s8 %r251,[%r96]; cvt.u16.u32 %r250,%r251; setp.eq.u16 %r252,%r250,0; @ %r252 bra $L56; .loc 1 311 12 ld.s16 %r58,[%r92]; .loc 1 311 11 cvt.u16.u32 %r253,%r58; cvt.u16.u32 %r254,%r100; setp.gt.s16 %r255,%r253,%r254; @ %r255 bra $L56; mov.u64 %r103,%r290; mov.u64 %r119,%r110; mov.u64 %r156,%r64; .loc 1 314 18 mov.u64 %r146,%r296; $L57: .loc 1 315 36 cvt.u32.u64 %r258,%r156; add.u32 %r257,%r258,1; .loc 1 315 25 st.u32 [%r119],%r257; .loc 1 314 34 add.u64 %r113,%r146,1; .loc 1 314 11 add.u64 %r119,%r119,%r57; setp.le.s64 %r259,%r29,%r146; @ %r259 bra $L67; .loc 1 315 32 ld.u64 %r156,[%r103]; add.u64 %r103,%r103,8; mov.u64 %r146,%r113; bra $L57; $L67: mov.u32 %r100,%r58; $L56: .loc 1 317 8 add.u64 %r92,%r92,%r75; .loc 1 319 13 add.u64 %r64,%r64,1; .loc 1 319 6 st.u64 [%frame+360],%r64; setp.ne.u64 %r260,%r64,%r79; @ %r260 bra $L58; .loc 1 317 8 sub.u64 %r261,%r79,%r163; mad.lo.u64 %r94,%r261,%r75,%r94; bra $L59; $L66: mov.u64 %r78,%r163; mov.u64 %r95,%r96; mov.u64 %r93,%r94; $L55: .loc 1 323 11 ld.s8 %r264,[%r95]; cvt.u16.u32 %r263,%r264; setp.eq.u16 %r265,%r263,0; @ %r265 bra $L60; .loc 1 323 12 ld.s16 %r68,[%r93]; .loc 1 323 11 cvt.u16.u32 %r266,%r68; cvt.u16.u32 %r267,%r100; setp.ge.s16 %r268,%r266,%r267; @ %r268 bra $L60; mov.u64 %r91,%r290; mov.u64 %r97,%r110; mov.u64 %r151,%r78; .loc 1 326 12 mov.u64 
%r147,%r296; $L61: .loc 1 327 36 cvt.u32.u64 %r271,%r151; add.u32 %r270,%r271,1; .loc 1 327 25 st.u32 [%r97],%r270; .loc 1 326 28 add.u64 %r112,%r147,1; .loc 1 326 5 add.u64 %r97,%r97,%r57; setp.le.s64 %r272,%r29,%r147; @ %r272 bra $L68; .loc 1 327 32 ld.u64 %r151,[%r91]; add.u64 %r91,%r91,8; mov.u64 %r147,%r112; bra $L61; $L68: mov.u32 %r100,%r68; $L60: .loc 1 331 9 add.u64 %r93,%r93,%r75; .loc 1 332 10 add.u64 %r95,%r95,%r76; .loc 1 334 14 add.u64 %r78,%r78,1; .loc 1 334 7 st.u64 [%frame+360],%r78; setp.ne.u64 %r273,%r78,%r79; @ %r273 bra $L55; sub.u64 %r133,%r79,%r163; .loc 1 331 9 mad.lo.u64 %r94,%r75,%r133,%r94; .loc 1 332 10 mad.lo.u64 %r96,%r76,%r133,%r96; bra $L59; $L89: st.u64 [%frame+360],%r79; $L59: mov.u64 %r125,%r290; mov.u64 %r88,%r76; .loc 1 334 34 mov.u64 %r90,%r79; mov.u64 %r86,%r73; mov.u64 %r124,8; .loc 1 335 9 mov.u64 %r98,0; $L62: .loc 1 340 13 st.u64 [%r125+-8],%r296; .loc 1 343 23 mul.lo.u64 %r81,%r86,%r90; .loc 1 344 24 mul.lo.u64 %r83,%r90,%r88; mov.u64 %r104,%r98; .loc 1 345 5 add.u64 %r98,%r98,1; .loc 1 346 7 setp.eq.u64 %r278,%r29,%r104; @ %r278 bra $L35; .loc 1 354 16 ld.u64 %r279,[%r125]; add.u64 %r85,%r279,1; st.u64 [%r125],%r85; .loc 1 355 23 add.u64 %r281,%r289,%r124; ld.u64 %r86,[%r281]; .loc 1 355 13 sub.u64 %r282,%r86,%r81; add.u64 %r283,%r282,%r282; add.u64 %r94,%r94,%r283; .loc 1 356 24 add.u64 %r284,%frame,%r124; ld.u64 %r88,[%r284]; .loc 1 356 14 sub.u64 %r285,%r88,%r83; add.u64 %r96,%r96,%r285; .loc 1 359 32 add.u64 %r287,%r297,%r124; ld.u64 %r90,[%r287]; .loc 1 359 7 add.u64 %r125,%r125,8; add.u64 %r124,%r124,8; setp.eq.u64 %r288,%r85,%r90; @ %r288 bra $L62; ld.u64 %r163,[%frame+360]; bra $L63; $L65: mov.u64 %r94,%r45; setp.eq.u32 %r291,1,1; bra $L48; $L35: .loc 1 _gfortran_sminloc0_4_i2 .visible .func _gfortran_sminloc0_4_i2u32pred.reg .u64u32 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 378 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 380 7 {_gfortran_minloc0_4_.loc 1 381 7 bra $L90; $L92: .loc 1 384 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29 mov.u64 %r56call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36; .loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,2; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 cvt.u32.u64 %r68,%r32; $L97: st.u32 [%r29],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L97; $L90: .loc 1 406 1 ret; } minloc0_8_i2.o/_gfortran_minloc0_8_i2 .visible .func _gfortran_minloc0_8_i2libgfortran/generated/minloc0_8_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_8_i2 .visible .func _gfortran_mminloc0_8_i2_gfortran_sminloc0_8_i2 .visible .func _gfortran_sminloc0_8_i2VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: 
_gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_8_i2 .visible .func _gfortran_minloc0_8_i2stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[36832predpredpredpredmov.u64 %r112,%ar0; mov.u64 %r113,%ar1; mov.u32 %r114,%ar2; .loc 1 50 10 ld.s8 %r22,[%r113+28]; .loc 1 50 8 cvt.u32.u32 %r115,%r22; cvt.s64.s8 %r71,%r115; .loc 1 51 6 setp.gt.s64 %r116,%r71,0; @ %r116 bra $L2; .loc 1 52 5 cvta.const.u64 %r11117_gfortran_runtime_errorr23,[%r112]; .loc 1 54 6 setp.ne.u64 %r119,%r23,0; @ %r119 bra $L3; .loc 1 56 7 st.u64 [%r112+48],%r23; add.u64 %r121,%r71,-1; st.u64 [%r112+56],%r121; mov.u64 %r122,1; st.u64 [%r112+40],%r122; .loc 1 57 28 cvt.u32.u64 %r123,%r122; st.u8 [%r112+28],%r123; .loc 1 58 24 st.u64 [%r112+8],%r23; .loc 1 59 29 mov.u64 %r171call (%value_in),_gfortrani_xmallocarray129,[%value_in]; } mov.u64 %r23,%r129; .loc 1 59 27 st.u64 [%r112],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r131,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r132,[%r131+36]; setp.eq.u32 %r133,%r132,0; @ %r133 bra $L4; .loc 1 64r113136; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r112]; $L4: .loc 1 68 11 ld.u64 %r72,[%r112+40]; add.u64 %r36,%r113,40; mov.u64 %r104,%frame; add.u64 %r105,%frame,120; .loc 1 70 10 mov.u64 %r98,0; add.u64 %r187,%frame,240; .loc 1 74 16 mov.u64 %r144,%r98; bra $L7; $L22: mov.u64 %r98,%r79; $L7: .loc 1 72 18 ld.u64 %r137,[%r36]; st.u64 [%r104],%r137; .loc 1 73 19 ld.u64 %r139,[%r36+16]; add.u64 %r138,%r139,1; ld.u64 %r140,[%r36+8]; sub.u64 %r31,%r138,%r140; .loc 1 73 17 st.u64 [%r105],%r31; .loc 1 74 16 shl.b64 %r142,%r98,3; add.u64 %r143,%r187,%r142; st.u64 [%r143],%r144; .loc 1 75 10 setp.gt.s64 %r145,%r31,0; @ %r145 bra $L5; shl.b64 %r81,%r72,3; mov.u64 %r87,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 mov.u64 %r146,%r80; $L6: st.u64 [%r87],%r146; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r87,%r87,%r81; setp.ne.u64 %r147,%r71,%r80; @ %r147 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r98,1; .loc 1 70 3 add.u64 %r36,%r36,24; add.u64 %r104,%r104,8; add.u64 %r105,%r105,8; setp.ne.u64 %r148,%r71,%r79; @ %r148 bra $L22; .loc 1 84 8 ld.u64 %r62,[%r113]; shl.b64 %r84,%r72,3; mov.u64 %r39,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u64 %r149,1; $L8: st.u64 [%r39],%r149; mov.u64 %r32,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r39,%r39,%r84; setp.ne.u64 %r150,%r98,%r32; @ %r150 bra $L8; .loc 1 101 9 setp.ne.u64 %r151,%r62,0; @ ! 
%r151 bra $L1; .loc 1 149 19 ld.u64 %r45,[%frame]; .loc 1 149 9 add.u64 %r47,%r45,%r45; .loc 1 151 34 ld.u64 %r51,[%frame+120]; cvt.u32.u32 %r154,%r22; cvt.s64.s8 %r153,%r154; shl.b64 %r155,%r153,3; add.u64 %r73,%r187,%r155; .loc 1 99 12 mov.u32 %r67,32767; setp.eq.u32 %r186,%r114,0; add.u64 %r188,%frame,248; .loc 1 157 13 mov.u64 %r191,0; .loc 1 174 32 add.u64 %r192,%frame,120; $L21: .loc 1 151 21 ld.u64 %r103,[%frame+240]; .loc 1 126 10 @ %r186 bra $L23; mov.u64 %r40,%r103; mov.u64 %r60,%r62; $L14: .loc 1 129 10 ld.s16 %r33,[%r60]; .loc 1 129 9 cvt.u16.u32 %r158,%r33; cvt.u16.u32 %r159,%r67; setp.gt.s16 %r160,%r158,%r159; @ %r160 bra $L12; mov.u64 %r76,%r188; mov.u64 %r34,%r23; mov.u64 %r90,%r40; $L13: .loc 1 133 34 add.u64 %r162,%r90,1; .loc 1 133 23 st.u64 [%r34],%r162; .loc 1 132 3 add.u64 %r34,%r34,%r84; setp.eq.u64 %r163,%r73,%r76; @ %r163 bra $L24; .loc 1 133 30 ld.u64 %r90,[%r76]; add.u64 %r76,%r76,8; bra $L13; $L24: mov.u32 %r67,%r33; $L12: .loc 1 135 11 add.u64 %r60,%r60,%r47; .loc 1 137 9 add.u64 %r40,%r40,1; .loc 1 137 2 st.u64 [%frame+240],%r40; setp.ne.u64 %r164,%r40,%r51; @ %r164 bra $L14; bra $L34; $L23: mov.u64 %r49,%r103; mov.u64 %r61,%r62; $L11: .loc 1 141 10 ld.s16 %r42,[%r61]; .loc 1 141 9 cvt.u16.u32 %r165,%r42; cvt.u16.u32 %r166,%r67; setp.ge.s16 %r167,%r165,%r166; @ %r167 bra $L16; mov.u64 %r85,%r188; mov.u64 %r101,%r23; mov.u64 %r86,%r49; $L17: .loc 1 145 34 add.u64 %r169,%r86,1; .loc 1 145 23 st.u64 [%r101],%r169; .loc 1 144 3 add.u64 %r101,%r101,%r84; setp.eq.u64 %r170,%r73,%r85; @ %r170 bra $L25; .loc 1 145 30 ld.u64 %r86,[%r85]; add.u64 %r85,%r85,8; bra $L17; $L25: mov.u32 %r67,%r42; $L16: .loc 1 149 9 add.u64 %r61,%r61,%r47; .loc 1 151 14 add.u64 %r49,%r49,1; .loc 1 151 7 st.u64 [%frame+240],%r49; setp.ne.u64 %r171,%r49,%r51; @ %r171 bra $L11; .loc 1 149 9 sub.u64 %r172,%r51,%r103; mad.lo.u64 %r62,%r172,%r47,%r62; bra $L18; $L34: .loc 1 135 11 sub.u64 %r174,%r51,%r103; mad.lo.u64 %r62,%r174,%r47,%r62; $L18: mov.u64 %r70,%r188; .loc 1 151 34 mov.u64 %r59,%r51; mov.u64 %r57,%r45; .loc 1 152 9 mov.u64 %r66,0; $L20: .loc 1 157 13 st.u64 [%r70+-8],%r191; .loc 1 160 23 mul.lo.u64 %r54,%r57,%r59; mov.u64 %r99,%r66; .loc 1 161 5 add.u64 %r66,%r66,1; .loc 1 162 7 setp.eq.u64 %r178,%r99,%r98; @ %r178 bra $L1; .loc 1 170 16 ld.u64 %r179,[%r70]; add.u64 %r56,%r179,1; st.u64 [%r70],%r56; shl.b64 %r65,%r66,3; .loc 1 171 23 add.u64 %r180,%frame,%r65; ld.u64 %r57,[%r180]; .loc 1 171 13 sub.u64 %r181,%r57,%r54; add.u64 %r182,%r181,%r181; add.u64 %r62,%r62,%r182; .loc 1 174 32 add.u64 %r184,%r192,%r65; ld.u64 %r59,[%r184]; .loc 1 174 7 add.u64 %r70,%r70,8; setp.eq.u64 %r185,%r56,%r59; @ %r185 bra $L20; bra $L21; $L_gfortran_mminloc0_8_i2 .visible .func _gfortran_mminloc0_8_i2reg .u64 %stack; mov.u64 %stack,0480pred %r187; .reg .u64 %r190; .reg .u64u64pred %r222; .reg .u64 %r223; .reg .predu32 %r231; .reg .u64 %r232; .reg .u16 %r234; .reg .u32 %r235; .reg .pred %r236; .reg .predpred %r251; .reg .pred %r252; .reg .u64 %r253; .reg .u16 %r255; .reg .u32predpred %r281; .reg .u64 %r282; .reg .u32 %r284; .reg .u64 %r287; .reg .u64 %r288; .reg .pred %r290; mov.u64 %r160,%ar0; mov.u64 %r161,%ar1; mov.u64 %r162,%ar2; mov.u32 %r163,%ar3; .loc 1 202 6 setp.ne.u64 %r164,%r162,0; @ %r164 bra $L36; .loc 1 204 7 {1163; call _gfortran_minloc0_8_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r161+28]; .loc 1 209 6 setp.gt.s32 %r168,%r22,0; @ %r168 bra $L38; .loc 1 210 5 cvta.const.u64 %r16stack; call _gfortran_runtime_error%r171,[%r160]; setp.ne.u64 %r172,%r171,0; @ %r172 bra $L39; 
.loc 1 214 7 st.u64 [%r160+48],%r171; add.u32 %r174,%r22,-1; cvt.s64.s32 %r175,%r174; st.u64 [%r160+56],%r175; mov.u64 %r176,1; st.u64 [%r160+40],%r176; .loc 1 215 28 cvt.u32.u64 %r177,%r176; st.u8 [%r160+28],%r177; .loc 1 216 24 st.u64 [%r160+8],%r171; .loc 1 217 29 cvt.u32.u32 %r182,%r22; cvt.s64.s8 %r181,%r182; mov.u64 %r180call (%value_in),_gfortrani_xmallocarray183,[%value_in]; } .loc 1 217 27 st.u64 [%r160],%r183; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r185,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r186,[%r185+36]; setp.eq.u32 %r187,%r186,0; @ %r187 bra $L40; .loc 1 224 4 cvta.const.u64 %r190,$LC1; {161190; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19316161193190; call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r162+16]; .loc 1 233 9 ld.u64 %r96,[%r162]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r195,%r32,-4; and.b32 %r196,%r195,-5; set.u32.eq.u32 %r198,%r196,0; neg.s32 %r199,%r198; .loc 1 235 22 add.u32 %r200,%r32,-1; set.u32.le.u32 %r202,%r200,1; neg.s32 %r203,%r202; .loc 1 235 6 cvt.u16.u32 %r205,%r199; cvt.u16.u32 %r206,%r203; or.b16 %r204,%r205,%r206; cvt.u32.u16 %r207,%r204; cvt.u16.u8 %r208,%r20741; .loc 1 237 7 setp.ne.u32 %r211,%r32,16; @ %r211 bra $L42; $L41: .loc 1 244 11 ld.u64 %r110,[%r160+40]; .loc 1 245 8 ld.u64 %r111,[%r160]; .loc 1 246 17 cvt.u32.u32 %r212,%r22; cvt.s64.s8 %r68,%r212; add.u64 %r153,%r161,40; add.u64 %r282,%frame,120; mov.u64 %r149,%r282; add.u64 %r147,%r162,40; mov.u64 %r146,%frame; add.u64 %r71,%frame,240; add.u64 %r64,%frame,360; mov.u64 %r47,%r64; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r221,%r29; bra $L43; $L42: .loc 1 242 5_gfortran_runtime_error64: mov.u64 %r29,%r116; $L43: .loc 1 248 18 ld.u64 %r215,[%r153]; st.u64 [%r149],%r215; .loc 1 249 20 ld.u64 %r217,[%r147]; mul.lo.u64 %r216,%r217,%r31; .loc 1 249 18 st.u64 [%r146],%r216; .loc 1 250 19 ld.u64 %r219,[%r153+16]; add.u64 %r218,%r219,1; ld.u64 %r220,[%r153+8]; sub.u64 %r44,%r218,%r220; .loc 1 250 17 st.u64 [%r71],%r44; .loc 1 251 16 st.u64 [%r47],%r221; .loc 1 252 10 setp.gt.s64 %r222,%r44,0; @ %r222 bra $L44; shl.b64 %r138,%r110,3; mov.u64 %r130,%r111; .loc 1 255 11 mov.u64 %r117,0; .loc 1 256 24 mov.u64 %r223,%r117; $L45: st.u64 [%r130],%r223; .loc 1 255 27 add.u64 %r117,%r117,1; .loc 1 255 4 add.u64 %r130,%r130,%r138; setp.ne.u64 %r224,%r68,%r117; @ %r224 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r116,%r29,1; .loc 1 246 3 add.u64 %r153,%r153,24; add.u64 %r149,%r149,8; add.u64 %r147,%r147,24; add.u64 %r146,%r146,8; add.u64 %r71,%r71,8; add.u64 %r47,%r47,8; setp.ne.u64 %r225,%r68,%r116; @ %r225 bra $L64; .loc 1 261 8 ld.u64 %r93,[%r161]; shl.b64 %r158,%r110,3; mov.u64 %r156,%r111; .loc 1 264 10 mov.u64 %r115,0; .loc 1 265 23 mov.u64 %r226,%r115; $L47: st.u64 [%r156],%r226; mov.u64 %r141,%r115; .loc 1 264 26 add.u64 %r115,%r115,1; .loc 1 264 3 add.u64 %r156,%r156,%r158; setp.ne.u64 %r227,%r29,%r141; @ %r227 bra $L47; .loc 1 276 9 setp.eq.u64 %r228,%r93,0; @ %r228 bra $L35; .loc 1 331 19 ld.u64 %r73,[%frame+120]; .loc 1 331 9 add.u64 %r75,%r73,%r73; .loc 1 332 20 ld.u64 %r76,[%frame]; .loc 1 334 34 ld.u64 %r79,[%frame+240]; ld.u64 %r155,[%frame+360]; cvt.u32.u32 %r231,%r22; cvt.s64.s8 %r230,%r231; shl.b64 %r232,%r230,3; add.u64 %r120,%r232,%r64; .loc 1 274 12 mov.u32 %r101,32767; setp.eq.u32 %r281,1,0; add.u64 %r280,%frame,368; .loc 1 340 13 mov.u64 %r287,0; .loc 1 359 32 add.u64 %r288,%frame,240; .loc 1 308 12 setp.eq.u32 %r290,%r163,0; $L63: .loc 1 280 10 @ %r281 bra $L48; mov.u64 
%r56,%r155; mov.u32 %r46,0; mov.u32 %r284,1; $L53: mov.u64 %r45,%r93; .loc 1 300 13 add.u64 %r93,%r45,%r75; .loc 1 284 11 ld.s8 %r235,[%r96]; cvt.u16.u32 %r234,%r235; setp.eq.u16 %r236,%r234,0; @ %r236 bra $L49; setp.eq.u32 %r237,%r46,0; selp.u64 %r155,%r155,%r56,%r237; .loc 1 294 16 ld.s16 %r101,[%r45]; mov.u64 %r129,%r280; mov.u64 %r139,%r111; mov.u64 %r151,%r155; $L52: .loc 1 296 33 add.u64 %r239,%r151,1; .loc 1 296 22 st.u64 [%r139],%r239; .loc 1 295 9 add.u64 %r139,%r139,%r158; setp.eq.u64 %r240,%r120,%r129; @ %r240 bra $L65; .loc 1 296 29 ld.u64 %r151,[%r129]; add.u64 %r129,%r129,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r96,%r96,%r76; .loc 1 303 11 add.u64 %r56,%r56,1; mov.u32 %r46,%r284; .loc 1 303 4 setp.ne.u64 %r241,%r56,%r79; @ %r241 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r290 bra $L66; mov.u64 %r66,%r155; mov.u64 %r91,%r93; $L58: .loc 1 311 11 ld.s8 %r244,[%r96]; cvt.u16.u32 %r243,%r244; setp.eq.u16 %r245,%r243,0; @ %r245 bra $L56; .loc 1 311 12 ld.s16 %r61,[%r91]; .loc 1 311 11 cvt.u16.u32 %r246,%r61; cvt.u16.u32 %r247,%r101; setp.gt.s16 %r248,%r246,%r247; @ %r248 bra $L56; mov.u64 %r62,%r280; mov.u64 %r94,%r111; mov.u64 %r148,%r66; .loc 1 314 18 mov.u64 %r142,%r287; $L57: .loc 1 315 36 add.u64 %r250,%r148,1; .loc 1 315 25 st.u64 [%r94],%r250; .loc 1 314 34 add.u64 %r114,%r142,1; .loc 1 314 11 add.u64 %r94,%r94,%r158; setp.le.s64 %r251,%r29,%r142; @ %r251 bra $L67; .loc 1 315 32 ld.u64 %r148,[%r62]; add.u64 %r62,%r62,8; mov.u64 %r142,%r114; bra $L57; $L67: mov.u32 %r101,%r61; $L56: .loc 1 317 8 add.u64 %r91,%r91,%r75; .loc 1 319 13 add.u64 %r66,%r66,1; .loc 1 319 6 st.u64 [%frame+360],%r66; setp.ne.u64 %r252,%r66,%r79; @ %r252 bra $L58; .loc 1 317 8 sub.u64 %r253,%r79,%r155; mad.lo.u64 %r93,%r253,%r75,%r93; bra $L59; $L66: mov.u64 %r78,%r155; mov.u64 %r95,%r96; mov.u64 %r92,%r93; $L55: .loc 1 323 11 ld.s8 %r256,[%r95]; cvt.u16.u32 %r255,%r256; setp.eq.u16 %r257,%r255,0; @ %r257 bra $L60; .loc 1 323 12 ld.s16 %r69,[%r92]; .loc 1 323 11 cvt.u16.u32 %r258,%r69; cvt.u16.u32 %r259,%r101; setp.ge.s16 %r260,%r258,%r259; @ %r260 bra $L60; mov.u64 %r118,%r280; mov.u64 %r57,%r111; mov.u64 %r144,%r78; .loc 1 326 12 mov.u64 %r143,%r287; $L61: .loc 1 327 36 add.u64 %r262,%r144,1; .loc 1 327 25 st.u64 [%r57],%r262; .loc 1 326 28 add.u64 %r113,%r143,1; .loc 1 326 5 add.u64 %r57,%r57,%r158; setp.le.s64 %r263,%r29,%r143; @ %r263 bra $L68; .loc 1 327 32 ld.u64 %r144,[%r118]; add.u64 %r118,%r118,8; mov.u64 %r143,%r113; bra $L61; $L68: mov.u32 %r101,%r69; $L60: .loc 1 331 9 add.u64 %r92,%r92,%r75; .loc 1 332 10 add.u64 %r95,%r95,%r76; .loc 1 334 14 add.u64 %r78,%r78,1; .loc 1 334 7 st.u64 [%frame+360],%r78; setp.ne.u64 %r264,%r78,%r79; @ %r264 bra $L55; sub.u64 %r126,%r79,%r155; .loc 1 331 9 mad.lo.u64 %r93,%r75,%r126,%r93; .loc 1 332 10 mad.lo.u64 %r96,%r76,%r126,%r96; bra $L59; $L89: st.u64 [%frame+360],%r79; $L59: mov.u64 %r105,%r280; mov.u64 %r88,%r76; .loc 1 334 34 mov.u64 %r90,%r79; mov.u64 %r86,%r73; mov.u64 %r104,8; .loc 1 335 9 mov.u64 %r99,0; $L62: .loc 1 340 13 st.u64 [%r105+-8],%r287; .loc 1 343 23 mul.lo.u64 %r81,%r86,%r90; .loc 1 344 24 mul.lo.u64 %r83,%r90,%r88; mov.u64 %r70,%r99; .loc 1 345 5 add.u64 %r99,%r99,1; .loc 1 346 7 setp.eq.u64 %r269,%r29,%r70; @ %r269 bra $L35; .loc 1 354 16 ld.u64 %r270,[%r105]; add.u64 %r85,%r270,1; st.u64 [%r105],%r85; .loc 1 355 23 add.u64 %r272,%r282,%r104; ld.u64 %r86,[%r272]; .loc 1 355 13 sub.u64 %r273,%r86,%r81; add.u64 %r274,%r273,%r273; add.u64 %r93,%r93,%r274; .loc 1 356 24 add.u64 %r275,%frame,%r104; ld.u64 %r88,[%r275]; .loc 1 356 14 
sub.u64 %r276,%r88,%r83; add.u64 %r96,%r96,%r276; .loc 1 359 32 add.u64 %r278,%r288,%r104; ld.u64 %r90,[%r278]; .loc 1 359 7 add.u64 %r105,%r105,8; add.u64 %r104,%r104,8; setp.eq.u64 %r279,%r85,%r90; @ %r279 bra $L62; ld.u64 %r155,[%frame+360]; bra $L63; $L65: mov.u64 %r93,%r45; setp.eq.u32 %r281,1,1; bra $L48; $L35: .loc 1 _gfortran_sminloc0_8_i2 .visible .func _gfortran_sminloc0_8_i2u32pred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 378 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 380 7 {_gfortran_minloc0_8_.loc 1 381 7 bra $L90; $L92: .loc 1 384 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29 mov.u64 call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36; .loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,3; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 mov.u64 %r68,%r32; $L97: st.u64 [%r29],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L97; $L90: .loc 1 406 1 ret; } minloc0_16_i2.o/_gfortran_minloc0_16_i2 .visible .func _gfortran_minloc0_16_i2libgfortran/generated/minloc0_16_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_16_i2 .visible .func _gfortran_mminloc0_16_i2_gfortran_sminloc0_16_i2 .visible .func _gfortran_sminloc0_16_i2VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_16_i2 .visible .func _gfortran_minloc0_16_i2stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[3683264pred %r168; .reg .pred %r169; .reg .u16 %r170; .reg .u16 %r171; .reg .pred.reg .u64 %r180; .reg .predmov.u32 %r116,%ar2; .loc 1 50 10 ld.s8 %r22,[%r115+28]; .loc 1 50 8 cvt.u32.u32 %r117,%r22; cvt.s64.s8 %r71,%r117; .loc 1 51 6 setp.gt.s64 %r118,%r71,0; @ %r118 bra $L2; .loc 1 52 5 cvta.const.u64 %r1111_gfortran_runtime_errorr23,[%r114]; .loc 1 54 6 setp.ne.u64 %r121,%r23,0; @ %r121 bra $L3; .loc 1 56 7 st.u64 [%r114+48],%r23; add.u64 %r123,%r71,-1; st.u64 [%r114+56],%r123; mov.u64 %r124,1; st.u64 [%r114+40],%r124; .loc 1 57 28 cvt.u32.u64 %r125,%r124; st.u8 [%r114+28],%r125; .loc 1 58 24 st.u64 [%r114+8],%r23; .loc 1 59 29 mov.u64 %r128,1671call (%value_in),_gfortrani_xmallocarray131,[%value_in]; } mov.u64 %r23,%r131; .loc 1 59 27 st.u64 [%r114],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r133,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r134,[%r133+36]; setp.eq.u32 %r135,%r134,0; @ %r135 
bra $L4; .loc 1 64r115138; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r114]; $L4: .loc 1 68 11 ld.u64 %r72,[%r114+40]; add.u64 %r44,%r115,40; mov.u64 %r35,%frame; add.u64 %r52,%frame,120; .loc 1 70 10 mov.u64 %r100,0; add.u64 %r193,%frame,240; .loc 1 74 16 mov.u64 %r146,%r100; bra $L7; $L22: mov.u64 %r100,%r79; $L7: .loc 1 72 18 ld.u64 %r139,[%r44]; st.u64 [%r35],%r139; .loc 1 73 19 ld.u64 %r141,[%r44+16]; add.u64 %r140,%r141,1; ld.u64 %r142,[%r44+8]; sub.u64 %r32,%r140,%r142; .loc 1 73 17 st.u64 [%r52],%r32; .loc 1 74 16 shl.b64 %r144,%r100,3; add.u64 %r145,%r193,%r144; st.u64 [%r145],%r146; .loc 1 75 10 setp.gt.s64 %r147,%r32,0; @ %r147 bra $L5; shl.b64 %r86,%r72,4; mov.u64 %r88,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 mov.u64 %r148,%r80; $L6: st.u64 [%r88],%r148; st.u64 [%r88+8],%r148; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r88,%r88,%r86; setp.ne.u64 %r150,%r71,%r80; @ %r150 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r100,1; .loc 1 70 3 add.u64 %r44,%r44,24; add.u64 %r35,%r35,8; add.u64 %r52,%r52,8; setp.ne.u64 %r151,%r71,%r79; @ %r151 bra $L22; .loc 1 84 8 ld.u64 %r62,[%r115]; shl.b64 %r38,%r72,4; mov.u64 %r85,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u64 %r152,1; mov.u64 %r153,%r78; $L8: st.u64 [%r85],%r152; st.u64 [%r85+8],%r153; mov.u64 %r33,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r85,%r85,%r38; setp.ne.u64 %r154,%r100,%r33; @ %r154 bra $L8; .loc 1 101 9 setp.ne.u64 %r155,%r62,0; @ ! %r155 bra $L1; .loc 1 149 19 ld.u64 %r47,[%frame]; .loc 1 149 9 add.u64 %r49,%r47,%r47; .loc 1 151 34 ld.u64 %r51,[%frame+120]; cvt.u32.u32 %r158,%r22; cvt.s64.s8 %r157,%r158; shl.b64 %r159,%r157,3; add.u64 %r89,%r193,%r159; .loc 1 99 12 mov.u32 %r66,32767; setp.eq.u32 %r192,%r116,0; add.u64 %r194,%frame,248; .loc 1 157 13 mov.u64 %r197,0; .loc 1 174 32 add.u64 %r198,%frame,120; $L21: .loc 1 151 21 ld.u64 %r107,[%frame+240]; .loc 1 126 10 @ %r192 bra $L23; mov.u64 %r41,%r107; mov.u64 %r60,%r62; $L14: .loc 1 129 10 ld.s16 %r34,[%r60]; .loc 1 129 9 cvt.u16.u32 %r162,%r34; cvt.u16.u32 %r163,%r66; setp.gt.s16 %r164,%r162,%r163; @ %r164 bra $L12; mov.u64 %r27,%r194; mov.u64 %r63,%r23; mov.u64 %r103,%r41; $L13: .loc 1 133 34 add.u64 %r166,%r103,1; st.u64 [%r63],%r166; shr.s64 %r167,%r166,63; st.u64 [%r63+8],%r167; .loc 1 132 3 add.u64 %r63,%r63,%r38; setp.eq.u64 %r168,%r27,%r89; @ %r168 bra $L24; .loc 1 133 30 ld.u64 %r103,[%r27]; add.u64 %r27,%r27,8; bra $L13; $L24: mov.u32 %r66,%r34; $L12: .loc 1 135 11 add.u64 %r60,%r60,%r49; .loc 1 137 9 add.u64 %r41,%r41,1; .loc 1 137 2 st.u64 [%frame+240],%r41; setp.ne.u64 %r169,%r41,%r51; @ %r169 bra $L14; bra $L34; $L23: mov.u64 %r50,%r107; mov.u64 %r61,%r62; $L11: .loc 1 141 10 ld.s16 %r42,[%r61]; .loc 1 141 9 cvt.u16.u32 %r170,%r42; cvt.u16.u32 %r171,%r66; setp.ge.s16 %r172,%r170,%r171; @ %r172 bra $L16; mov.u64 %r104,%r194; mov.u64 %r105,%r23; mov.u64 %r84,%r50; $L17: .loc 1 145 34 add.u64 %r174,%r84,1; st.u64 [%r105],%r174; shr.s64 %r175,%r174,63; st.u64 [%r105+8],%r175; .loc 1 144 3 add.u64 %r105,%r105,%r38; setp.eq.u64 %r176,%r89,%r104; @ %r176 bra $L25; .loc 1 145 30 ld.u64 %r84,[%r104]; add.u64 %r104,%r104,8; bra $L17; $L25: mov.u32 %r66,%r42; $L16: .loc 1 149 9 add.u64 %r61,%r61,%r49; .loc 1 151 14 add.u64 %r50,%r50,1; .loc 1 151 7 st.u64 [%frame+240],%r50; setp.ne.u64 %r177,%r50,%r51; @ %r177 bra $L11; .loc 1 149 9 sub.u64 %r178,%r51,%r107; mad.lo.u64 %r62,%r178,%r49,%r62; bra $L18; $L34: .loc 1 135 11 sub.u64 %r180,%r51,%r107; mad.lo.u64 %r62,%r180,%r49,%r62; $L18: mov.u64 
%r77,%r194; .loc 1 151 34 mov.u64 %r59,%r51; mov.u64 %r57,%r47; .loc 1 152 9 mov.u64 %r65,0; $L20: .loc 1 157 13 st.u64 [%r77+-8],%r197; .loc 1 160 23 mul.lo.u64 %r54,%r57,%r59; mov.u64 %r101,%r65; .loc 1 161 5 add.u64 %r65,%r65,1; .loc 1 162 7 setp.eq.u64 %r184,%r101,%r100; @ %r184 bra $L1; .loc 1 170 16 ld.u64 %r185,[%r77]; add.u64 %r56,%r185,1; st.u64 [%r77],%r56; shl.b64 %r68,%r65,3; .loc 1 171 23 add.u64 %r186,%frame,%r68; ld.u64 %r57,[%r186]; .loc 1 171 13 sub.u64 %r187,%r57,%r54; add.u64 %r188,%r187,%r187; add.u64 %r62,%r62,%r188; .loc 1 174 32 add.u64 %r190,%r198,%r68; ld.u64 %r59,[%r190]; .loc 1 174 7 add.u64 %r77,%r77,8; setp.eq.u64 %r191,%r56,%r59; @ %r191 bra $L20; bra $L21; $L_gfortran_mminloc0_16_i2 .visible .func _gfortran_mminloc0_16_i2reg .u64 %stack; mov.u64 %stack,0480u64 %r62; .reg .u64 %r64; .reg .u32106; .reg .u64 %r107predpredpredu16predpred %r246; .reg .pred %r247; .reg .u16 %r249; .reg .u32 %r250; .reg .predpredpredu64 %r280; .reg .u64 %r281; .reg .u64predpred %r298; mov.u64 %r163,%ar0; mov.u64 %r164,%ar1; mov.u64 %r165,%ar2; mov.u32 %r166,%ar3; .loc 1 202 6 setp.ne.u64 %r167,%r165,0; @ %r167 bra $L3616r1166; call _gfortran_minloc0_16_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r164+28]; .loc 1 209 6 setp.gt.s32 %r171,%r22,0; @ %r171 bra $L38; .loc 1 210 5 cvta.const.u64 %r172172_gfortran_runtime_error%r174,[%r163]; setp.ne.u64 %r175,%r174,0; @ %r175 bra $L39; .loc 1 214 7 st.u64 [%r163+48],%r174; add.u32 %r177,%r22,-1; cvt.s64.s32 %r178,%r177; st.u64 [%r163+56],%r178; mov.u64 %r179,1; st.u64 [%r163+40],%r179; .loc 1 215 28 cvt.u32.u64 %r180,%r179; st.u8 [%r163+28],%r180; .loc 1 216 24 st.u64 [%r163+8],%r174; .loc 1 217 29 cvt.u32.u32 %r185,%r22; cvt.s64.s8 %r184,%r185; mov.u64 %r183,16call (%value_in),_gfortrani_xmallocarray186,[%value_in]; } .loc 1 217 27 st.u64 [%r163],%r186; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r188,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r189,[%r188+36]; setp.eq.u32 %r190,%r189,0r164193; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r196193; call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r165+16]; .loc 1 233 9 ld.u64 %r93,[%r165]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r198,%r32,-4; and.b32 %r199,%r198,-5; set.u32.eq.u32 %r201,%r199,0; neg.s32 %r202,%r201; .loc 1 235 22 add.u32 %r203,%r32,-1; set.u32.le.u32 %r205,%r203,1; neg.s32 %r206,%r205; .loc 1 235 6 cvt.u16.u32 %r208,%r202; cvt.u16.u32 %r209,%r206; or.b16 %r207,%r208,%r209;41; .loc 1 237 7 setp.ne.u32 %r214,%r32,16; @ %r214 bra $L42; $L41: .loc 1 244 11 ld.u64 %r106,[%r163+40]; .loc 1 245 8 ld.u64 %r107,[%r163]; .loc 1 246 17 cvt.u32.u32 %r215,%r22; cvt.s64.s8 %r64,%r215; add.u64 %r161,%r164,40; add.u64 %r289,%frame,120; mov.u64 %r159,%r289; add.u64 %r158,%r165,40; mov.u64 %r154,%frame; add.u64 %r155,%frame,240; add.u64 %r151,%frame,360; mov.u64 %r150,%r151; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r224,%r29; bra $L43; $L42: .loc 1 242 5_gfortran_runtime_error64: mov.u64 %r29,%r112; $L43: .loc 1 248 18 ld.u64 %r218,[%r161]; st.u64 [%r159],%r218; .loc 1 249 20 ld.u64 %r220,[%r158]; mul.lo.u64 %r219,%r220,%r31; .loc 1 249 18 st.u64 [%r154],%r219; .loc 1 250 19 ld.u64 %r222,[%r161+16]; add.u64 %r221,%r222,1; ld.u64 %r223,[%r161+8]; sub.u64 %r44,%r221,%r223; .loc 1 250 17 st.u64 [%r155],%r44; .loc 1 251 16 st.u64 [%r150],%r224; .loc 1 252 10 setp.gt.s64 %r225,%r44,0; @ %r225 bra $L44; shl.b64 %r141,%r106,4; mov.u64 %r132,%r107; .loc 1 255 11 mov.u64 %r113,0; .loc 1 256 24 mov.u64 
%r226,%r113; $L45: st.u64 [%r132],%r226; st.u64 [%r132+8],%r226; .loc 1 255 27 add.u64 %r113,%r113,1; .loc 1 255 4 add.u64 %r132,%r132,%r141; setp.ne.u64 %r228,%r64,%r113; @ %r228 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r112,%r29,1; .loc 1 246 3 add.u64 %r161,%r161,24; add.u64 %r159,%r159,8; add.u64 %r158,%r158,24; add.u64 %r154,%r154,8; add.u64 %r155,%r155,8; add.u64 %r150,%r150,8; setp.ne.u64 %r229,%r64,%r112; @ %r229 bra $L64; .loc 1 261 8 ld.u64 %r91,[%r164]; shl.b64 %r115,%r106,4; mov.u64 %r52,%r107; .loc 1 264 10 mov.u64 %r111,0; .loc 1 265 23 mov.u64 %r230,%r111; $L47: st.u64 [%r52],%r230; st.u64 [%r52+8],%r230; mov.u64 %r143,%r111; .loc 1 264 26 add.u64 %r111,%r111,1; .loc 1 264 3 add.u64 %r52,%r52,%r115; setp.ne.u64 %r232,%r29,%r143; @ %r232 bra $L47; .loc 1 276 9 setp.eq.u64 %r233,%r91,0; @ %r233 bra $L35; .loc 1 331 19 ld.u64 %r70,[%frame+120]; .loc 1 331 9 add.u64 %r72,%r70,%r70; .loc 1 332 20 ld.u64 %r73,[%frame]; .loc 1 334 34 ld.u64 %r76,[%frame+240]; ld.u64 %r160,[%frame+360]; cvt.u32.u32 %r236,%r22; cvt.s64.s8 %r235,%r236; shl.b64 %r237,%r235,3; add.u64 %r122,%r237,%r151; .loc 1 274 12 mov.u32 %r97,32767; setp.eq.u32 %r288,1,0; add.u64 %r290,%frame,368; .loc 1 340 13 mov.u64 %r295,0; .loc 1 359 32 add.u64 %r296,%frame,240; .loc 1 308 12 setp.eq.u32 %r298,%r166,0; $L63: .loc 1 280 10 @ %r288 bra $L48; mov.u64 %r53,%r160; mov.u32 %r46,0; mov.u32 %r292,1; $L53: mov.u64 %r45,%r91; .loc 1 300 13 add.u64 %r91,%r45,%r72; .loc 1 284 11 ld.s8 %r240,[%r93]; cvt.u16.u32 %r239,%r240; setp.eq.u16 %r241,%r239,0; @ %r241 bra $L49; setp.eq.u32 %r242,%r46,0; selp.u64 %r160,%r160,%r53,%r242; .loc 1 294 16 ld.s16 %r97,[%r45]; mov.u64 %r131,%r290; mov.u64 %r142,%r107; mov.u64 %r157,%r160; $L52: .loc 1 296 33 add.u64 %r244,%r157,1; st.u64 [%r142],%r244; shr.s64 %r245,%r244,63; st.u64 [%r142+8],%r245; .loc 1 295 9 add.u64 %r142,%r142,%r115; setp.eq.u64 %r246,%r122,%r131; @ %r246 bra $L65; .loc 1 296 29 ld.u64 %r157,[%r131]; add.u64 %r131,%r131,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r93,%r93,%r73; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r46,%r292; .loc 1 303 4 setp.ne.u64 %r247,%r53,%r76; @ %r247 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r298 bra $L66; mov.u64 %r62,%r160; mov.u64 %r89,%r91; $L58: .loc 1 311 11 ld.s8 %r250,[%r93]; cvt.u16.u32 %r249,%r250; setp.eq.u16 %r251,%r249,0; @ %r251 bra $L56; .loc 1 311 12 ld.s16 %r58,[%r89]; .loc 1 311 11 cvt.u16.u32 %r252,%r58; cvt.u16.u32 %r253,%r97; setp.gt.s16 %r254,%r252,%r253; @ %r254 bra $L56; mov.u64 %r94,%r290; mov.u64 %r98,%r107; mov.u64 %r153,%r62; .loc 1 314 18 mov.u64 %r144,%r295; $L57: .loc 1 315 36 add.u64 %r256,%r153,1; st.u64 [%r98],%r256; shr.s64 %r257,%r256,63; st.u64 [%r98+8],%r257; .loc 1 314 34 add.u64 %r110,%r144,1; .loc 1 314 11 add.u64 %r98,%r98,%r115; setp.le.s64 %r258,%r29,%r144; @ %r258 bra $L67; .loc 1 315 32 ld.u64 %r153,[%r94]; add.u64 %r94,%r94,8; mov.u64 %r144,%r110; bra $L57; $L67: mov.u32 %r97,%r58; $L56: .loc 1 317 8 add.u64 %r89,%r89,%r72; .loc 1 319 13 add.u64 %r62,%r62,1; .loc 1 319 6 st.u64 [%frame+360],%r62; setp.ne.u64 %r259,%r62,%r76; @ %r259 bra $L58; .loc 1 317 8 sub.u64 %r260,%r76,%r160; mad.lo.u64 %r91,%r260,%r72,%r91; bra $L59; $L66: mov.u64 %r75,%r160; mov.u64 %r92,%r93; mov.u64 %r90,%r91; $L55: .loc 1 323 11 ld.s8 %r263,[%r92]; cvt.u16.u32 %r262,%r263; setp.eq.u16 %r264,%r262,0; @ %r264 bra $L60; .loc 1 323 12 ld.s16 %r65,[%r90]; .loc 1 323 11 cvt.u16.u32 %r265,%r65; cvt.u16.u32 %r266,%r97; setp.ge.s16 %r267,%r265,%r266; @ %r267 bra $L60; mov.u64 %r57,%r290; mov.u64 %r66,%r107; mov.u64 
%r149,%r75; .loc 1 326 12 mov.u64 %r145,%r295; $L61: .loc 1 327 36 add.u64 %r269,%r149,1; st.u64 [%r66],%r269; shr.s64 %r270,%r269,63; st.u64 [%r66+8],%r270; .loc 1 326 28 add.u64 %r109,%r145,1; .loc 1 326 5 add.u64 %r66,%r66,%r115; setp.le.s64 %r271,%r29,%r145; @ %r271 bra $L68; .loc 1 327 32 ld.u64 %r149,[%r57]; add.u64 %r57,%r57,8; mov.u64 %r145,%r109; bra $L61; $L68: mov.u32 %r97,%r65; $L60: .loc 1 331 9 add.u64 %r90,%r90,%r72; .loc 1 332 10 add.u64 %r92,%r92,%r73; .loc 1 334 14 add.u64 %r75,%r75,1; .loc 1 334 7 st.u64 [%frame+360],%r75; setp.ne.u64 %r272,%r75,%r76; @ %r272 bra $L55; sub.u64 %r128,%r76,%r160; .loc 1 331 9 mad.lo.u64 %r91,%r72,%r128,%r91; .loc 1 332 10 mad.lo.u64 %r93,%r73,%r128,%r93; bra $L59; $L89: st.u64 [%frame+360],%r76; $L59: mov.u64 %r120,%r290; mov.u64 %r85,%r73; .loc 1 334 34 mov.u64 %r87,%r76; mov.u64 %r83,%r70; mov.u64 %r119,8; .loc 1 335 9 mov.u64 %r96,0; $L62: .loc 1 340 13 st.u64 [%r120+-8],%r295; .loc 1 343 23 mul.lo.u64 %r78,%r83,%r87; .loc 1 344 24 mul.lo.u64 %r80,%r87,%r85; mov.u64 %r95,%r96; .loc 1 345 5 add.u64 %r96,%r96,1; .loc 1 346 7 setp.eq.u64 %r277,%r29,%r95; @ %r277 bra $L35; .loc 1 354 16 ld.u64 %r278,[%r120]; add.u64 %r82,%r278,1; st.u64 [%r120],%r82; .loc 1 355 23 add.u64 %r280,%r289,%r119; ld.u64 %r83,[%r280]; .loc 1 355 13 sub.u64 %r281,%r83,%r78; add.u64 %r282,%r281,%r281; add.u64 %r91,%r91,%r282; .loc 1 356 24 add.u64 %r283,%frame,%r119; ld.u64 %r85,[%r283]; .loc 1 356 14 sub.u64 %r284,%r85,%r80; add.u64 %r93,%r93,%r284; .loc 1 359 32 add.u64 %r286,%r296,%r119; ld.u64 %r87,[%r286]; .loc 1 359 7 add.u64 %r120,%r120,8; add.u64 %r119,%r119,8; setp.eq.u64 %r287,%r82,%r87; @ %r287 bra $L62; ld.u64 %r160,[%frame+360]; bra $L63; $L65: mov.u64 %r91,%r45; setp.eq.u32 %r288,1,1; bra $L48; $L35: .loc 1 _gfortran_sminloc0_16_i2 .visible .func _gfortran_sminloc0_16_i2u32pred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 378 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 380 7 {_gfortran_minloc0_16_.loc 1 381 7 bra $L90; $L92: .loc 1 384 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36; .loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,4; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 mov.u64 %r68,%r32; $L97: st.u64 [%r29],%r68; st.u64 [%r29+8],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r70,%r30,%r32; @ %r70 bra $L97; $L90: .loc 1 406 1 ret; } minloc0_4_i4.o/_gfortran_minloc0_4_i4 .visible .func _gfortran_minloc0_4_i4libgfortran/generated/minloc0_4_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_4_i4 .visible .func _gfortran_mminloc0_4_i4_gfortran_sminloc0_4_i4 .visible .func _gfortran_sminloc0_4_i4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // 
BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_4_i4 .visible .func _gfortran_minloc0_4_i4stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[368u64predpred %r154; .reg .predpredpredmov.u64 %r116,%ar0; mov.u64 %r117,%ar1; mov.u32 %r118,%ar2; .loc 1 50 10 ld.s8 %r22,[%r117+28]; .loc 1 50 8 cvt.u32.u32 %r119,%r22; cvt.s64.s8 %r72,%r119; .loc 1 51 6 setp.gt.s64 %r120,%r72,0; @ %r120 bra $L2; .loc 1 52 5 cvta.const.u64 %r121121_gfortran_runtime_errorr23,[%r116]; .loc 1 54 6 setp.ne.u64 %r123,%r23,0; @ %r123 bra $L3; .loc 1 56 7 st.u64 [%r116+48],%r23; add.u64 %r125,%r72,-1; st.u64 [%r116+56],%r125; mov.u64 %r126,1; st.u64 [%r116+40],%r126; .loc 1 57 28 cvt.u32.u64 %r127,%r126; st.u8 [%r116+28],%r127; .loc 1 58 24 st.u64 [%r116+8],%r23; .loc 1 59 29 mov.u64 %r130,4;r130; call (%value_in),_gfortrani_xmallocarray133,[%value_in]; } mov.u64 %r23,%r133; .loc 1 59 27 st.u64 [%r116],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r135,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r136,[%r135+36]; setp.eq.u32 %r137,%r136,0; @ %r137 bra $L4; .loc 1 64r140; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r116]; $L4: .loc 1 68 11 ld.u64 %r73,[%r116+40]; add.u64 %r39,%r117,40; mov.u64 %r45,%frame; add.u64 %r35,%frame,120; .loc 1 70 10 mov.u64 %r99,0; add.u64 %r188,%frame,240; .loc 1 74 16 mov.u64 %r148,%r99; bra $L7; $L22: mov.u64 %r99,%r79; $L7: .loc 1 72 18 ld.u64 %r141,[%r39]; st.u64 [%r45],%r141; .loc 1 73 19 ld.u64 %r143,[%r39+16]; add.u64 %r142,%r143,1; ld.u64 %r144,[%r39+8]; sub.u64 %r31,%r142,%r144; .loc 1 73 17 st.u64 [%r35],%r31; .loc 1 74 16 shl.b64 %r146,%r99,3; add.u64 %r147,%r188,%r146; st.u64 [%r147],%r148; .loc 1 75 10 setp.gt.s64 %r149,%r31,0; @ %r149 bra $L5; shl.b64 %r85,%r73,2; mov.u64 %r87,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 cvt.u32.u64 %r150,%r80; $L6: st.u32 [%r87],%r150; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r87,%r87,%r85; setp.ne.u64 %r151,%r72,%r80; @ %r151 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r99,1; .loc 1 70 3 add.u64 %r39,%r39,24; add.u64 %r45,%r45,8; add.u64 %r35,%r35,8; setp.ne.u64 %r152,%r72,%r79; @ %r152 bra $L22; .loc 1 84 8 ld.u64 %r64,[%r117]; shl.b64 %r82,%r73,2; mov.u64 %r92,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u32 %r153,1; $L8: st.u32 [%r92],%r153; mov.u64 %r32,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r92,%r92,%r82; setp.ne.u64 %r154,%r99,%r32; @ %r154 bra $L8; .loc 1 101 9 setp.ne.u64 %r155,%r64,0; @ ! 
%r155 bra $L1; .loc 1 149 19 ld.u64 %r49,[%frame]; .loc 1 149 9 shl.b64 %r51,%r49,2; .loc 1 151 34 ld.u64 %r53,[%frame+120]; cvt.u32.u32 %r157,%r22; cvt.s64.s8 %r156,%r157; shl.b64 %r158,%r156,3; add.u64 %r102,%r188,%r158; .loc 1 99 12 mov.u32 %r67,2147483647; setp.eq.u32 %r187,%r118,0; add.u64 %r189,%frame,248; .loc 1 157 13 mov.u64 %r192,0; .loc 1 174 32 add.u64 %r193,%frame,120; $L21: .loc 1 151 21 ld.u64 %r110,[%frame+240]; .loc 1 126 10 @ %r187 bra $L23; mov.u64 %r40,%r110; mov.u64 %r62,%r64; $L14: .loc 1 129 10 ld.u32 %r33,[%r62]; .loc 1 129 9 setp.gt.s32 %r161,%r33,%r67; @ %r161 bra $L12; mov.u64 %r43,%r189; mov.u64 %r65,%r23; mov.u64 %r105,%r40; $L13: .loc 1 133 34 cvt.u32.u64 %r164,%r105; add.u32 %r163,%r164,1; .loc 1 133 23 st.u32 [%r65],%r163; .loc 1 132 3 add.u64 %r65,%r65,%r82; setp.eq.u64 %r165,%r43,%r102; @ %r165 bra $L24; .loc 1 133 30 ld.u64 %r105,[%r43]; add.u64 %r43,%r43,8; bra $L13; $L24: mov.u32 %r67,%r33; $L12: .loc 1 135 11 add.u64 %r62,%r62,%r51; .loc 1 137 9 add.u64 %r40,%r40,1; .loc 1 137 2 st.u64 [%frame+240],%r40; setp.ne.u64 %r166,%r40,%r53; @ %r166 bra $L14; bra $L34; $L23: mov.u64 %r52,%r110; mov.u64 %r63,%r64; $L11: .loc 1 141 10 ld.u32 %r42,[%r63]; .loc 1 141 9 setp.ge.s32 %r167,%r42,%r67; @ %r167 bra $L16; mov.u64 %r108,%r189; mov.u64 %r109,%r23; mov.u64 %r44,%r52; $L17: .loc 1 145 34 cvt.u32.u64 %r170,%r44; add.u32 %r169,%r170,1; .loc 1 145 23 st.u32 [%r109],%r169; .loc 1 144 3 add.u64 %r109,%r109,%r82; setp.eq.u64 %r171,%r102,%r108; @ %r171 bra $L25; .loc 1 145 30 ld.u64 %r44,[%r108]; add.u64 %r108,%r108,8; bra $L17; $L25: mov.u32 %r67,%r42; $L16: .loc 1 149 9 add.u64 %r63,%r63,%r51; .loc 1 151 14 add.u64 %r52,%r52,1; .loc 1 151 7 st.u64 [%frame+240],%r52; setp.ne.u64 %r172,%r52,%r53; @ %r172 bra $L11; .loc 1 149 9 sub.u64 %r173,%r53,%r110; mad.lo.u64 %r64,%r173,%r51,%r64; bra $L18; $L34: .loc 1 135 11 sub.u64 %r175,%r53,%r110; mad.lo.u64 %r64,%r175,%r51,%r64; $L18: mov.u64 %r81,%r189; .loc 1 151 34 mov.u64 %r61,%r53; mov.u64 %r59,%r49; .loc 1 152 9 mov.u64 %r66,0; $L20: .loc 1 157 13 st.u64 [%r81+-8],%r192; .loc 1 160 23 mul.lo.u64 %r56,%r59,%r61; mov.u64 %r100,%r66; .loc 1 161 5 add.u64 %r66,%r66,1; .loc 1 162 7 setp.eq.u64 %r179,%r100,%r99; @ %r179 bra $L1; .loc 1 170 16 ld.u64 %r180,[%r81]; add.u64 %r58,%r180,1; st.u64 [%r81],%r58; shl.b64 %r70,%r66,3; .loc 1 171 23 add.u64 %r181,%frame,%r70; ld.u64 %r59,[%r181]; .loc 1 171 13 sub.u64 %r182,%r59,%r56; shl.b64 %r183,%r182,2; add.u64 %r64,%r64,%r183; .loc 1 174 32 add.u64 %r185,%r193,%r70; ld.u64 %r61,[%r185]; .loc 1 174 7 add.u64 %r81,%r81,8; setp.eq.u64 %r186,%r58,%r61; @ %r186 bra $L20; bra $L21; $L_gfortran_mminloc0_4_i4 .visible .func _gfortran_mminloc0_4_i4reg .u64 %stack; mov.u64 %stack,0480predpred %r193; .reg .u64 %r196; .reg .u6464pred %r228; .reg .u32 %r229; .reg .predpred %r233; .reg .pred %r234; .reg .u64u16u64u64 %r292; .reg .pred %r294; mov.u64 %r166,%ar0; mov.u64 %r167,%ar1; mov.u64 %r168,%ar2; mov.u32 %r169,%ar3; .loc 1 202 6 setp.ne.u64 %r170,%r168,0; @ %r170 bra $L36gfortran_minloc0_4_.loc 1 205 7 bra $L35; $L36: .loc 1 208 10 ld.s8 %r22,[%r167+28]; .loc 1 209 6 setp.gt.s32 %r174,%r22,0; @ %r174 bra $L38; .loc 1 210 5 cvta.const.u64 %r175175_gfortran_runtime_error%r177,[%r166]; setp.ne.u64 %r178,%r177,0; @ %r178 bra $L39; .loc 1 214 7 st.u64 [%r166+48],%r177; add.u32 %r180,%r22,-1; cvt.s64.s32 %r181,%r180; st.u64 [%r166+56],%r181; mov.u64 %r182,1; st.u64 [%r166+40],%r182; .loc 1 215 28 cvt.u32.u64 %r183,%r182; st.u8 [%r166+28],%r183; .loc 1 216 24 st.u64 [%r166+8],%r177; .loc 1 217 
29 cvt.u32.u32 %r188,%r22; cvt.s64.s8 %r187,%r188call (%value_in),_gfortrani_xmallocarray189,[%value_in]; } .loc 1 217 27 st.u64 [%r166],%r189; bra $L40; $L39: .loc 1 221 11 cvta.global.u64 %r191,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r192,[%r191+36];r167196; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19call _gfortrani_bounds_equal_extents$L40: .loc 1 231 15 ld.u64 %r31,[%r168+16]; .loc 1 233 9 ld.u64 %r96,[%r168]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r201,%r32,-4; and.b32 %r202,%r201,-5; set.u32.eq.u32 %r204,%r202,0; neg.s32 %r205,%r204; .loc 1 235 22 add.u32 %r206,%r32,-1; set.u32.le.u32 %r208,%r206,1; neg.s32 %r209,%r208; .loc 1 235 6 cvt.u16.u32 %r211,%r205; cvt.u16.u32 %r212,%r209; or.b16 %r210,%r211,%r212; cvt.u32.u16 %r213,%r210; cvt.u16.u8 %r214,%r213; setp.ne.u16 %r215,%r214,0; @ %r215 bra $L41; .loc 1 237 7 setp.ne.u32 %r217,%r32,16; @ %r217 bra $L42; $L41: .loc 1 244 11 ld.u64 %r109,[%r166+40]; .loc 1 245 8 ld.u64 %r110,[%r166]; .loc 1 246 17 cvt.u32.u32 %r218,%r22; cvt.s64.s8 %r67,%r218; add.u64 %r52,%r167,40; add.u64 %r286,%frame,120; mov.u64 %r118,%r286; add.u64 %r105,%r168,40; mov.u64 %r165,%frame; add.u64 %r162,%frame,240; add.u64 %r159,%frame,360; mov.u64 %r160,%r159; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r227,%r29; bra $L43; $L42: .loc 1 242 5_gfortran_runtime_error64: mov.u64 %r29,%r115; $L43: .loc 1 248 18 ld.u64 %r221,[%r52]; st.u64 [%r118],%r221; .loc 1 249 20 ld.u64 %r223,[%r105]; mul.lo.u64 %r222,%r223,%r31; .loc 1 249 18 st.u64 [%r165],%r222; .loc 1 250 19 ld.u64 %r225,[%r52+16]; add.u64 %r224,%r225,1; ld.u64 %r226,[%r52+8]; sub.u64 %r44,%r224,%r226; .loc 1 250 17 st.u64 [%r162],%r44; .loc 1 251 16 st.u64 [%r160],%r227; .loc 1 252 10 setp.gt.s64 %r228,%r44,0; @ %r228 bra $L44; shl.b64 %r149,%r109,2; mov.u64 %r138,%r110; .loc 1 255 11 mov.u64 %r116,0; .loc 1 256 24 cvt.u32.u64 %r229,%r116; $L45: st.u32 [%r138],%r229; .loc 1 255 27 add.u64 %r116,%r116,1; .loc 1 255 4 add.u64 %r138,%r138,%r149; setp.ne.u64 %r230,%r67,%r116; @ %r230 bra $L45; bra $L35; $L44: .loc 1 246 26 add.u64 %r115,%r29,1; .loc 1 246 3 add.u64 %r52,%r52,24; add.u64 %r118,%r118,8; add.u64 %r105,%r105,24; add.u64 %r165,%r165,8; add.u64 %r162,%r162,8; add.u64 %r160,%r160,8; setp.ne.u64 %r231,%r67,%r115; @ %r231 bra $L64; .loc 1 261 8 ld.u64 %r94,[%r167]; shl.b64 %r57,%r109,2; mov.u64 %r60,%r110; .loc 1 264 10 mov.u64 %r114,0; .loc 1 265 23 cvt.u32.u64 %r232,%r114; $L47: st.u32 [%r60],%r232; mov.u64 %r145,%r114; .loc 1 264 26 add.u64 %r114,%r114,1; .loc 1 264 3 add.u64 %r60,%r60,%r57; setp.ne.u64 %r233,%r29,%r145; @ %r233 bra $L47; .loc 1 276 9 setp.eq.u64 %r234,%r94,0; @ %r234 bra $L35; .loc 1 331 19 ld.u64 %r73,[%frame+120]; .loc 1 331 9 shl.b64 %r75,%r73,2; .loc 1 332 20 ld.u64 %r76,[%frame]; .loc 1 334 34 ld.u64 %r79,[%frame+240]; ld.u64 %r163,[%frame+360]; cvt.u32.u32 %r236,%r22; cvt.s64.s8 %r235,%r236; shl.b64 %r237,%r235,3; add.u64 %r127,%r237,%r159; .loc 1 274 12 mov.u32 %r100,2147483647; setp.eq.u32 %r285,1,0; add.u64 %r284,%frame,368; .loc 1 340 13 mov.u64 %r291,0; .loc 1 359 32 add.u64 %r292,%frame,240; .loc 1 308 12 setp.eq.u32 %r294,%r169,0; $L63: .loc 1 280 10 @ %r285 bra $L48; mov.u64 %r53,%r163; mov.u32 %r46,0; mov.u32 %r288,1; $L53: mov.u64 %r45,%r94; .loc 1 300 13 add.u64 %r94,%r45,%r75; .loc 1 284 11 ld.s8 %r240,[%r96]; cvt.u16.u32 %r239,%r240; setp.eq.u16 %r241,%r239,0; @ %r241 bra $L49; setp.eq.u32 %r242,%r46,0; selp.u64 %r163,%r163,%r53,%r242; .loc 1 294 16 ld.u32 %r100,[%r45]; mov.u64 %r137,%r284; mov.u64 
%r150,%r110; mov.u64 %r161,%r163; $L52: .loc 1 296 33 cvt.u32.u64 %r245,%r161; add.u32 %r244,%r245,1; .loc 1 296 22 st.u32 [%r150],%r244; .loc 1 295 9 add.u64 %r150,%r150,%r57; setp.eq.u64 %r246,%r127,%r137; @ %r246 bra $L65; .loc 1 296 29 ld.u64 %r161,[%r137]; add.u64 %r137,%r137,8; bra $L52; $L49: .loc 1 301 14 add.u64 %r96,%r96,%r76; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r46,%r288; .loc 1 303 4 setp.ne.u64 %r247,%r53,%r79; @ %r247 bra $L53; bra $L89; $L48: .loc 1 308 12 @ %r294 bra $L66; mov.u64 %r64,%r163; mov.u64 %r92,%r94; $L58: .loc 1 311 11 ld.s8 %r250,[%r96]; cvt.u16.u32 %r249,%r250; setp.eq.u16 %r251,%r249,0; @ %r251 bra $L56; .loc 1 311 12 ld.u32 %r58,[%r92]; .loc 1 311 11 setp.gt.s32 %r252,%r58,%r100; @ %r252 bra $L56; mov.u64 %r103,%r284; mov.u64 %r119,%r110; mov.u64 %r156,%r64; .loc 1 314 18 mov.u64 %r146,%r291; $L57: .loc 1 315 36 cvt.u32.u64 %r255,%r156; add.u32 %r254,%r255,1; .loc 1 315 25 st.u32 [%r119],%r254; .loc 1 314 34 add.u64 %r113,%r146,1; .loc 1 314 11 add.u64 %r119,%r119,%r57; setp.le.s64 %r256,%r29,%r146; @ %r256 bra $L67; .loc 1 315 32 ld.u64 %r156,[%r103]; add.u64 %r103,%r103,8; mov.u64 %r146,%r113; bra $L57; $L67: mov.u32 %r100,%r58; $L56: .loc 1 317 8 add.u64 %r92,%r92,%r75; .loc 1 319 13 add.u64 %r64,%r64,1; .loc 1 319 6 st.u64 [%frame+360],%r64; setp.ne.u64 %r257,%r64,%r79; @ %r257 bra $L58; .loc 1 317 8 sub.u64 %r258,%r79,%r163; mad.lo.u64 %r94,%r258,%r75,%r94; bra $L59; $L66: mov.u64 %r78,%r163; mov.u64 %r95,%r96; mov.u64 %r93,%r94; $L55: .loc 1 323 11 ld.s8 %r261,[%r95]; cvt.u16.u32 %r260,%r261; setp.eq.u16 %r262,%r260,0; @ %r262 bra $L60; .loc 1 323 12 ld.u32 %r68,[%r93]; .loc 1 323 11 setp.ge.s32 %r263,%r68,%r100; @ %r263 bra $L60; mov.u64 %r91,%r284; mov.u64 %r97,%r110; mov.u64 %r151,%r78; .loc 1 326 12 mov.u64 %r147,%r291; $L61: .loc 1 327 36 cvt.u32.u64 %r266,%r151; add.u32 %r265,%r266,1; .loc 1 327 25 st.u32 [%r97],%r265; .loc 1 326 28 add.u64 %r112,%r147,1; .loc 1 326 5 add.u64 %r97,%r97,%r57; setp.le.s64 %r267,%r29,%r147; @ %r267 bra $L68; .loc 1 327 32 ld.u64 %r151,[%r91]; add.u64 %r91,%r91,8; mov.u64 %r147,%r112; bra $L61; $L68: mov.u32 %r100,%r68; $L60: .loc 1 331 9 add.u64 %r93,%r93,%r75; .loc 1 332 10 add.u64 %r95,%r95,%r76; .loc 1 334 14 add.u64 %r78,%r78,1; .loc 1 334 7 st.u64 [%frame+360],%r78; setp.ne.u64 %r268,%r78,%r79; @ %r268 bra $L55; sub.u64 %r133,%r79,%r163; .loc 1 331 9 mad.lo.u64 %r94,%r75,%r133,%r94; .loc 1 332 10 mad.lo.u64 %r96,%r76,%r133,%r96; bra $L59; $L89: st.u64 [%frame+360],%r79; $L59: mov.u64 %r125,%r284; mov.u64 %r88,%r76; .loc 1 334 34 mov.u64 %r90,%r79; mov.u64 %r86,%r73; mov.u64 %r124,8; .loc 1 335 9 mov.u64 %r98,0; $L62: .loc 1 340 13 st.u64 [%r125+-8],%r291; .loc 1 343 23 mul.lo.u64 %r81,%r86,%r90; .loc 1 344 24 mul.lo.u64 %r83,%r90,%r88; mov.u64 %r104,%r98; .loc 1 345 5 add.u64 %r98,%r98,1; .loc 1 346 7 setp.eq.u64 %r273,%r29,%r104; @ %r273 bra $L35; .loc 1 354 16 ld.u64 %r274,[%r125]; add.u64 %r85,%r274,1; st.u64 [%r125],%r85; .loc 1 355 23 add.u64 %r276,%r286,%r124; ld.u64 %r86,[%r276]; .loc 1 355 13 sub.u64 %r277,%r86,%r81; shl.b64 %r278,%r277,2; add.u64 %r94,%r94,%r278; .loc 1 356 24 add.u64 %r279,%frame,%r124; ld.u64 %r88,[%r279]; .loc 1 356 14 sub.u64 %r280,%r88,%r83; add.u64 %r96,%r96,%r280; .loc 1 359 32 add.u64 %r282,%r292,%r124; ld.u64 %r90,[%r282]; .loc 1 359 7 add.u64 %r125,%r125,8; add.u64 %r124,%r124,8; setp.eq.u64 %r283,%r85,%r90; @ %r283 bra $L62; ld.u64 %r163,[%frame+360]; bra $L63; $L65: mov.u64 %r94,%r45; setp.eq.u32 %r285,1,1; bra $L48; $L35: .loc 1 _gfortran_sminloc0_4_i4 .visible 
.func _gfortran_sminloc0_4_i4u32pred.reg .u64u32 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 378 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 380 7 {_gfortran_minloc0_4_.loc 1 381 7 bra $L90; $L92: .loc 1 384 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29 mov.u64 %r56call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36; .loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,2; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 cvt.u32.u64 %r68,%r32; $L97: st.u32 [%r29],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L97; $L90: .loc 1 406 1 ret; } minloc0_8_i4.o/_gfortran_minloc0_8_i4 .visible .func _gfortran_minloc0_8_i4libgfortran/generated/minloc0_8_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_8_i4 .visible .func _gfortran_mminloc0_8_i4_gfortran_sminloc0_8_i4 .visible .func _gfortran_sminloc0_8_i4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_8_i4 .visible .func _gfortran_minloc0_8_i4stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[36832predpred %r180; .reg .predmov.u64 %r112,%ar0; mov.u64 %r113,%ar1; mov.u32 %r114,%ar2; .loc 1 50 10 ld.s8 %r22,[%r113+28]; .loc 1 50 8 cvt.u32.u32 %r115,%r22; cvt.s64.s8 %r71,%r115; .loc 1 51 6 setp.gt.s64 %r116,%r71,0; @ %r116 bra $L2; .loc 1 52 5 cvta.const.u64 %r11117_gfortran_runtime_errorr23,[%r112]; .loc 1 54 6 setp.ne.u64 %r119,%r23,0; @ %r119 bra $L3; .loc 1 56 7 st.u64 [%r112+48],%r23; add.u64 %r121,%r71,-1; st.u64 [%r112+56],%r121; mov.u64 %r122,1; st.u64 [%r112+40],%r122; .loc 1 57 28 cvt.u32.u64 %r123,%r122; st.u8 [%r112+28],%r123; .loc 1 58 24 st.u64 [%r112+8],%r23; .loc 1 59 29 mov.u64 %r171call (%value_in),_gfortrani_xmallocarray129,[%value_in]; } mov.u64 %r23,%r129; .loc 1 59 27 st.u64 [%r112],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r131,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r132,[%r131+36]; setp.eq.u32 %r133,%r132,0; @ %r133 bra $L4; .loc 1 64r113136; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r112]; $L4: .loc 1 68 11 ld.u64 %r72,[%r112+40]; add.u64 %r36,%r113,40; mov.u64 %r104,%frame; add.u64 %r105,%frame,120; .loc 1 70 10 mov.u64 %r98,0; add.u64 %r182,%frame,240; .loc 1 74 16 mov.u64 %r144,%r98; bra $L7; $L22: mov.u64 %r98,%r79; $L7: .loc 1 72 18 ld.u64 %r137,[%r36]; st.u64 [%r104],%r137; .loc 1 
73 19 ld.u64 %r139,[%r36+16]; add.u64 %r138,%r139,1; ld.u64 %r140,[%r36+8]; sub.u64 %r31,%r138,%r140; .loc 1 73 17 st.u64 [%r105],%r31; .loc 1 74 16 shl.b64 %r142,%r98,3; add.u64 %r143,%r182,%r142; st.u64 [%r143],%r144; .loc 1 75 10 setp.gt.s64 %r145,%r31,0; @ %r145 bra $L5; shl.b64 %r81,%r72,3; mov.u64 %r87,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 mov.u64 %r146,%r80; $L6: st.u64 [%r87],%r146; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r87,%r87,%r81; setp.ne.u64 %r147,%r71,%r80; @ %r147 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r98,1; .loc 1 70 3 add.u64 %r36,%r36,24; add.u64 %r104,%r104,8; add.u64 %r105,%r105,8; setp.ne.u64 %r148,%r71,%r79; @ %r148 bra $L22; .loc 1 84 8 ld.u64 %r62,[%r113]; shl.b64 %r84,%r72,3; mov.u64 %r39,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u64 %r149,1; $L8: st.u64 [%r39],%r149; mov.u64 %r32,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r39,%r39,%r84; setp.ne.u64 %r150,%r98,%r32; @ %r150 bra $L8; .loc 1 101 9 setp.ne.u64 %r151,%r62,0; @ ! %r151 bra $L1; .loc 1 149 19 ld.u64 %r45,[%frame]; .loc 1 149 9 shl.b64 %r47,%r45,2; .loc 1 151 34 ld.u64 %r51,[%frame+120]; cvt.u32.u32 %r153,%r22; cvt.s64.s8 %r152,%r153; shl.b64 %r154,%r152,3; add.u64 %r73,%r182,%r154; .loc 1 99 12 mov.u32 %r67,2147483647; setp.eq.u32 %r181,%r114,0; add.u64 %r183,%frame,248; .loc 1 157 13 mov.u64 %r186,0; .loc 1 174 32 add.u64 %r187,%frame,120; $L21: .loc 1 151 21 ld.u64 %r103,[%frame+240]; .loc 1 126 10 @ %r181 bra $L23; mov.u64 %r40,%r103; mov.u64 %r60,%r62; $L14: .loc 1 129 10 ld.u32 %r33,[%r60]; .loc 1 129 9 setp.gt.s32 %r157,%r33,%r67; @ %r157 bra $L12; mov.u64 %r76,%r183; mov.u64 %r34,%r23; mov.u64 %r90,%r40; $L13: .loc 1 133 34 add.u64 %r159,%r90,1; .loc 1 133 23 st.u64 [%r34],%r159; .loc 1 132 3 add.u64 %r34,%r34,%r84; setp.eq.u64 %r160,%r73,%r76; @ %r160 bra $L24; .loc 1 133 30 ld.u64 %r90,[%r76]; add.u64 %r76,%r76,8; bra $L13; $L24: mov.u32 %r67,%r33; $L12: .loc 1 135 11 add.u64 %r60,%r60,%r47; .loc 1 137 9 add.u64 %r40,%r40,1; .loc 1 137 2 st.u64 [%frame+240],%r40; setp.ne.u64 %r161,%r40,%r51; @ %r161 bra $L14; bra $L34; $L23: mov.u64 %r49,%r103; mov.u64 %r61,%r62; $L11: .loc 1 141 10 ld.u32 %r42,[%r61]; .loc 1 141 9 setp.ge.s32 %r162,%r42,%r67; @ %r162 bra $L16; mov.u64 %r85,%r183; mov.u64 %r101,%r23; mov.u64 %r86,%r49; $L17: .loc 1 145 34 add.u64 %r164,%r86,1; .loc 1 145 23 st.u64 [%r101],%r164; .loc 1 144 3 add.u64 %r101,%r101,%r84; setp.eq.u64 %r165,%r73,%r85; @ %r165 bra $L25; .loc 1 145 30 ld.u64 %r86,[%r85]; add.u64 %r85,%r85,8; bra $L17; $L25: mov.u32 %r67,%r42; $L16: .loc 1 149 9 add.u64 %r61,%r61,%r47; .loc 1 151 14 add.u64 %r49,%r49,1; .loc 1 151 7 st.u64 [%frame+240],%r49; setp.ne.u64 %r166,%r49,%r51; @ %r166 bra $L11; .loc 1 149 9 sub.u64 %r167,%r51,%r103; mad.lo.u64 %r62,%r167,%r47,%r62; bra $L18; $L34: .loc 1 135 11 sub.u64 %r169,%r51,%r103; mad.lo.u64 %r62,%r169,%r47,%r62; $L18: mov.u64 %r70,%r183; .loc 1 151 34 mov.u64 %r59,%r51; mov.u64 %r57,%r45; .loc 1 152 9 mov.u64 %r66,0; $L20: .loc 1 157 13 st.u64 [%r70+-8],%r186; .loc 1 160 23 mul.lo.u64 %r54,%r57,%r59; mov.u64 %r99,%r66; .loc 1 161 5 add.u64 %r66,%r66,1; .loc 1 162 7 setp.eq.u64 %r173,%r99,%r98; @ %r173 bra $L1; .loc 1 170 16 ld.u64 %r174,[%r70]; add.u64 %r56,%r174,1; st.u64 [%r70],%r56; shl.b64 %r65,%r66,3; .loc 1 171 23 add.u64 %r175,%frame,%r65; ld.u64 %r57,[%r175]; .loc 1 171 13 sub.u64 %r176,%r57,%r54; shl.b64 %r177,%r176,2; add.u64 %r62,%r62,%r177; .loc 1 174 32 add.u64 %r179,%r187,%r65; ld.u64 %r59,[%r179]; .loc 1 174 7 add.u64 %r70,%r70,8; 
// minloc0_8_i4.o (remainder): epilogue of _gfortran_minloc0_8_i4, then the full bodies of
// _gfortran_mminloc0_8_i4 (array MASK variant; it calls _gfortran_minloc0_8_i4 when the mask
// descriptor argument is null) and _gfortran_sminloc0_8_i4 (scalar MASK variant).
// [PTX instruction bodies: damaged in extraction, not recoverable from this dump]

minloc0_16_i4.o/
// "libgfortran/generated/minloc0_16_i4.c"
// BEGIN GLOBAL FUNCTION DECL: _gfortran_minloc0_16_i4  .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_16_i4 .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_sminloc0_16_i4 .visible .func
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarray .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents .extern .func
// String constants (decoded from the byte arrays): "Rank of array needs to be > 0",
// "MIN…" / "…ASK argumen…" (evidently "MINLOC" and "MASK argument"), "…unny sized logical a…"
// (evidently "Funny sized logical array")
// _gfortran_minloc0_16_i4, _gfortran_mminloc0_16_i4 and _gfortran_sminloc0_16_i4:
// [PTX instruction bodies: damaged in extraction, not recoverable from this dump]

minloc0_4_i8.o/
// "libgfortran/generated/minloc0_4_i8.c"
// BEGIN GLOBAL FUNCTION DECL: _gfortran_minloc0_4_i8  .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_4_i8 .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_sminloc0_4_i8 .visible .func
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarray .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents .extern .func
// String constants: same three diagnostics as above ("Rank of array needs to be > 0",
// "MINLOC" / "MASK argument", "Funny sized logical array")
// _gfortran_minloc0_4_i8, _gfortran_mminloc0_4_i8 and _gfortran_sminloc0_4_i8:
// [PTX instruction bodies: damaged in extraction, not recoverable from this dump]

minloc0_8_i8.o/
// "libgfortran/generated/minloc0_8_i8.c"
// BEGIN GLOBAL FUNCTION DECL: _gfortran_minloc0_8_i8  .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_8_i8 .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_sminloc0_8_i8 .visible .func
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarray .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents .extern .func
// String constants: same three diagnostics as above
// _gfortran_minloc0_8_i8, _gfortran_mminloc0_8_i8 and _gfortran_sminloc0_8_i8:
// [PTX instruction bodies: damaged in extraction, not recoverable from this dump]

minloc0_16_i8.o/
// "libgfortran/generated/minloc0_16_i8.c"
// BEGIN GLOBAL FUNCTION DECL: _gfortran_minloc0_16_i8  .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_16_i8 .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_sminloc0_16_i8 .visible .func
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarray .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents .extern .func
// String constants: same three diagnostics as above
// _gfortran_minloc0_16_i8 and _gfortran_mminloc0_16_i8:
// [PTX instruction bodies: damaged in extraction, not recoverable from this dump]
1 331 9 mad.lo.u64 %r91,%r72,%r128,%r91; .loc 1 332 10 mad.lo.u64 %r93,%r73,%r128,%r93; bra $L59; $L89: st.u64 [%frame+360],%r76; $L59: mov.u64 %r120,%r285; mov.u64 %r85,%r73; .loc 1 334 34 mov.u64 %r87,%r76; mov.u64 %r83,%r70; mov.u64 %r119,8; .loc 1 335 9 mov.u64 %r96,0; $L62: .loc 1 340 13 st.u64 [%r120+-8],%r290; .loc 1 343 23 mul.lo.u64 %r78,%r83,%r87; .loc 1 344 24 mul.lo.u64 %r80,%r87,%r85; mov.u64 %r95,%r96; .loc 1 345 5 add.u64 %r96,%r96,1; .loc 1 346 7 setp.eq.u64 %r272,%r29,%r95; @ %r272 bra $L35; .loc 1 354 16 ld.u64 %r273,[%r120]; add.u64 %r82,%r273,1; st.u64 [%r120],%r82; .loc 1 355 23 add.u64 %r275,%r284,%r119; ld.u64 %r83,[%r275]; .loc 1 355 13 sub.u64 %r276,%r83,%r78; shl.b64 %r277,%r276,3; add.u64 %r91,%r91,%r277; .loc 1 356 24 add.u64 %r278,%frame,%r119; ld.u64 %r85,[%r278]; .loc 1 356 14 sub.u64 %r279,%r85,%r80; add.u64 %r93,%r93,%r279; .loc 1 359 32 add.u64 %r281,%r291,%r119; ld.u64 %r87,[%r281]; .loc 1 359 7 add.u64 %r120,%r120,8; add.u64 %r119,%r119,8; setp.eq.u64 %r282,%r82,%r87; @ %r282 bra $L62; ld.u64 %r160,[%frame+360]; bra $L63; $L65: mov.u64 %r91,%r45; setp.eq.u32 %r283,1,1; bra $L48; $L35: .loc 1 _gfortran_sminloc0_16_i8 .visible .func _gfortran_sminloc0_16_i8u32pred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L91; .loc 1 378 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L92; $L91: .loc 1 380 7 {_gfortran_minloc0_16_.loc 1 381 7 bra $L90; $L92: .loc 1 384 8 ld.s8 %r30,[%r36+28]_gfortran_runtime_error94: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L95; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L96; $L95: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36; .loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L96: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,4; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 mov.u64 %r68,%r32; $L97: st.u64 [%r29],%r68; st.u64 [%r29+8],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r70,%r30,%r32; @ %r70 bra $L97; $L90: .loc 1 406 1 ret; } minloc0_4_i16.o/_gfortran_minloc0_4_i16 .visible .func _gfortran_minloc0_4_i16libgfortran/generated/minloc0_4_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_4_i16 .visible .func _gfortran_mminloc0_4_i16_gfortran_sminloc0_4_i16 .visible .func _gfortran_sminloc0_4_i16VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_4_i16 .visible .func _gfortran_minloc0_4_i16stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[368predpred %r154; .reg .predpred %r163; .reg .pred %r166; .reg .predpredpredpredu64 
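// minloc0_4_i16.o: nvptx PTX generated from libgfortran/generated/minloc0_4_i16.c,
// i.e. the MINLOC intrinsic returning kind=4 integer positions over a kind=16 integer array.
// Entry points in this member: _gfortran_minloc0_4_i16 (no mask),
// _gfortran_mminloc0_4_i16 (logical MASK array) and _gfortran_sminloc0_4_i16 (scalar MASK),
// with externs _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray,
// _gfortrani_bounds_iforeach_return and _gfortrani_bounds_equal_extents.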
%r191; .reg .predpred64u64 %r211; .reg .u64 %r214; .reg .u64 %r215; mov.u64 %r116,%ar0; mov.u64 %r117,%ar1; mov.u32 %r118,%ar2; .loc 1 50 10 ld.s8 %r22,[%r117+28]; .loc 1 50 8 cvt.u32.u32 %r119,%r22; cvt.s64.s8 %r72,%r119; .loc 1 51 6 setp.gt.s64 %r120,%r72,0; @ %r120 bra $L2; .loc 1 52 5 cvta.const.u64 %r121121_gfortran_runtime_errorr23,[%r116]; .loc 1 54 6 setp.ne.u64 %r123,%r23,0; @ %r123 bra $L3; .loc 1 56 7 st.u64 [%r116+48],%r23; add.u64 %r125,%r72,-1; st.u64 [%r116+56],%r125; mov.u64 %r126,1; st.u64 [%r116+40],%r126; .loc 1 57 28 cvt.u32.u64 %r127,%r126; st.u8 [%r116+28],%r127; .loc 1 58 24 st.u64 [%r116+8],%r23; .loc 1 59 29 mov.u64 %r130,4;r130; call (%value_in),_gfortrani_xmallocarray133,[%value_in]; } mov.u64 %r23,%r133; .loc 1 59 27 st.u64 [%r116],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r135,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r136,[%r135+36]; setp.eq.u32 %r137,%r136,0; @ %r137 bra $L4; .loc 1 64r140; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r116]; $L4: .loc 1 68 11 ld.u64 %r73,[%r116+40]; add.u64 %r39,%r117,40; mov.u64 %r45,%frame; add.u64 %r35,%frame,120; .loc 1 70 10 mov.u64 %r99,0; add.u64 %r210,%frame,240; .loc 1 74 16 mov.u64 %r148,%r99; bra $L7; $L24: mov.u64 %r99,%r79; $L7: .loc 1 72 18 ld.u64 %r141,[%r39]; st.u64 [%r45],%r141; .loc 1 73 19 ld.u64 %r143,[%r39+16]; add.u64 %r142,%r143,1; ld.u64 %r144,[%r39+8]; sub.u64 %r31,%r142,%r144; .loc 1 73 17 st.u64 [%r35],%r31; .loc 1 74 16 shl.b64 %r146,%r99,3; add.u64 %r147,%r210,%r146; st.u64 [%r147],%r148; .loc 1 75 10 setp.gt.s64 %r149,%r31,0; @ %r149 bra $L5; shl.b64 %r85,%r73,2; mov.u64 %r87,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 cvt.u32.u64 %r150,%r80; $L6: st.u32 [%r87],%r150; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r87,%r87,%r85; setp.ne.u64 %r151,%r72,%r80; @ %r151 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r99,1; .loc 1 70 3 add.u64 %r39,%r39,24; add.u64 %r45,%r45,8; add.u64 %r35,%r35,8; setp.ne.u64 %r152,%r72,%r79; @ %r152 bra $L24; .loc 1 84 8 ld.u64 %r64,[%r117]; shl.b64 %r82,%r73,2; mov.u64 %r92,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u32 %r153,1; $L8: st.u32 [%r92],%r153; mov.u64 %r32,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r92,%r92,%r82; setp.ne.u64 %r154,%r99,%r32; @ %r154 bra $L8; .loc 1 101 9 setp.ne.u64 %r155,%r64,0; @ ! 
%r155 bra $L1; .loc 1 149 19 ld.u64 %r49,[%frame]; .loc 1 149 9 shl.b64 %r51,%r49,4; .loc 1 151 34 ld.u64 %r53,[%frame+120]; cvt.u32.u32 %r157,%r22; cvt.s64.s8 %r156,%r157; shl.b64 %r158,%r156,3; add.u64 %r102,%r210,%r158; .loc 1 99 12 mov.u64 %r207,-1; mov.u64 %r208,9223372036854775807; setp.eq.u32 %r209,%r118,0; add.u64 %r211,%frame,248; .loc 1 157 13 mov.u64 %r214,0; .loc 1 174 32 add.u64 %r215,%frame,120; $L23: .loc 1 151 21 ld.u64 %r110,[%frame+240]; .loc 1 126 10 @ %r209 bra $L25; mov.u64 %r40,%r110; mov.u64 %r62,%r64; $L15: .loc 1 129 10 ld.u64 %r203,[%r62]; ld.u64 %r204,[%r62+8]; .loc 1 129 9 setp.gt.s64 %r163,%r204,%r208; @ %r163 bra $L12; setp.ne.u64 %r166,%r204,%r208; @ %r166 bra $L28; setp.gt.u64 %r169,%r203,%r207; @ %r169 bra $L12; $L28: mov.u64 %r43,%r211; mov.u64 %r65,%r23; mov.u64 %r105,%r40; $L14: .loc 1 133 34 cvt.u32.u64 %r172,%r105; add.u32 %r171,%r172,1; .loc 1 133 23 st.u32 [%r65],%r171; .loc 1 132 3 add.u64 %r65,%r65,%r82; setp.eq.u64 %r173,%r43,%r102; @ %r173 bra $L26; .loc 1 133 30 ld.u64 %r105,[%r43]; add.u64 %r43,%r43,8; bra $L14; $L26: mov.u64 %r207,%r203; mov.u64 %r208,%r204; $L12: .loc 1 135 11 add.u64 %r62,%r62,%r51; .loc 1 137 9 add.u64 %r40,%r40,1; .loc 1 137 2 st.u64 [%frame+240],%r40; setp.ne.u64 %r174,%r40,%r53; @ %r174 bra $L15; bra $L38; $L25: mov.u64 %r52,%r110; mov.u64 %r63,%r64; $L11: .loc 1 141 10 ld.u64 %r205,[%r63]; ld.u64 %r206,[%r63+8]; .loc 1 141 9 setp.gt.s64 %r177,%r208,%r206; @ %r177 bra $L29; setp.ne.u64 %r180,%r208,%r206; @ %r180 bra $L17; setp.gt.u64 %r183,%r207,%r205; @ ! %r183 bra $L17; $L29: mov.u64 %r108,%r211; mov.u64 %r109,%r23; mov.u64 %r44,%r52; $L19: .loc 1 145 34 cvt.u32.u64 %r186,%r44; add.u32 %r185,%r186,1; .loc 1 145 23 st.u32 [%r109],%r185; .loc 1 144 3 add.u64 %r109,%r109,%r82; setp.eq.u64 %r187,%r102,%r108; @ %r187 bra $L27; .loc 1 145 30 ld.u64 %r44,[%r108]; add.u64 %r108,%r108,8; bra $L19; $L27: mov.u64 %r207,%r205; mov.u64 %r208,%r206; $L17: .loc 1 149 9 add.u64 %r63,%r63,%r51; .loc 1 151 14 add.u64 %r52,%r52,1; .loc 1 151 7 st.u64 [%frame+240],%r52; setp.ne.u64 %r188,%r52,%r53; @ %r188 bra $L11; .loc 1 149 9 sub.u64 %r189,%r53,%r110; mad.lo.u64 %r64,%r189,%r51,%r64; bra $L20; $L38: .loc 1 135 11 sub.u64 %r191,%r53,%r110; mad.lo.u64 %r64,%r191,%r51,%r64; $L20: mov.u64 %r81,%r211; .loc 1 151 34 mov.u64 %r61,%r53; mov.u64 %r59,%r49; .loc 1 152 9 mov.u64 %r66,0; $L22: .loc 1 157 13 st.u64 [%r81+-8],%r214; .loc 1 160 23 mul.lo.u64 %r56,%r59,%r61; mov.u64 %r100,%r66; .loc 1 161 5 add.u64 %r66,%r66,1; .loc 1 162 7 setp.eq.u64 %r195,%r100,%r99; @ %r195 bra $L1; .loc 1 170 16 ld.u64 %r196,[%r81]; add.u64 %r58,%r196,1; st.u64 [%r81],%r58; shl.b64 %r70,%r66,3; .loc 1 171 23 add.u64 %r197,%frame,%r70; ld.u64 %r59,[%r197]; .loc 1 171 13 sub.u64 %r198,%r59,%r56; shl.b64 %r199,%r198,4; add.u64 %r64,%r64,%r199; .loc 1 174 32 add.u64 %r201,%r215,%r70; ld.u64 %r61,[%r201]; .loc 1 174 7 add.u64 %r81,%r81,8; setp.eq.u64 %r202,%r58,%r61; @ %r202 bra $L22; bra $L23; $L_gfortran_mminloc0_4_i16 .visible .func _gfortran_mminloc0_4_i16reg .u64 %stack; mov.u64 %stack,0480predpred %r193; .reg .u64 %r196; .reg .u6464pred %r228; .reg .u32 %r229; .reg .predpred %r233; .reg .pred %r234; .reg .u64u16predpred %r265; .reg .u64 %r266; .reg .u16 %r268; .reg .u32 %r269; .reg .pred %r270; .reg .predpredu64 %r313; .reg .u64 %r314; .reg .pred %r316; mov.u64 %r166,%ar0; mov.u64 %r167,%ar1; mov.u64 %r168,%ar2; mov.u32 %r169,%ar3; .loc 1 202 6 setp.ne.u64 %r170,%r168,0; @ %r170 bra $L40gfortran_minloc0_4_i16.loc 1 205 7 bra $L39; $L40: .loc 1 208 10 ld.s8 
%r22,[%r167+28]; .loc 1 209 6 setp.gt.s32 %r174,%r22,0; @ %r174 bra $L42; .loc 1 210 5 cvta.const.u64 %r175175_gfortran_runtime_error42: .loc 1 212 6 ld.u64 %r177,[%r166]; setp.ne.u64 %r178,%r177,0; @ %r178 bra $L43; .loc 1 214 7 st.u64 [%r166+48],%r177; add.u32 %r180,%r22,-1; cvt.s64.s32 %r181,%r180; st.u64 [%r166+56],%r181; mov.u64 %r182,1; st.u64 [%r166+40],%r182; .loc 1 215 28 cvt.u32.u64 %r183,%r182; st.u8 [%r166+28],%r183; .loc 1 216 24 st.u64 [%r166+8],%r177; .loc 1 217 29 cvt.u32.u32 %r188,%r22; cvt.s64.s8 %r187,%r188call (%value_in),_gfortrani_xmallocarray189,[%value_in]; } .loc 1 217 27 st.u64 [%r166],%r189; bra $L44; $L43: .loc 1 221 11 cvta.global.u64 %r191,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r192,[%r191+36];r167196; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19call _gfortrani_bounds_equal_extents$L44: .loc 1 231 15 ld.u64 %r31,[%r168+16]; .loc 1 233 9 ld.u64 %r96,[%r168]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r201,%r32,-4; and.b32 %r202,%r201,-5; set.u32.eq.u32 %r204,%r202,0; neg.s32 %r205,%r204; .loc 1 235 22 add.u32 %r206,%r32,-1; set.u32.le.u32 %r208,%r206,1; neg.s32 %r209,%r208; .loc 1 235 6 cvt.u16.u32 %r211,%r205; cvt.u16.u32 %r212,%r209; or.b16 %r210,%r211,%r212; cvt.u32.u16 %r213,%r210; cvt.u16.u8 %r214,%r213; setp.ne.u16 %r215,%r214,0; @ %r215 bra $L45; .loc 1 237 7 setp.ne.u32 %r217,%r32,16; @ %r217 bra $L46; $L45: .loc 1 244 11 ld.u64 %r109,[%r166+40]; .loc 1 245 8 ld.u64 %r110,[%r166]; .loc 1 246 17 cvt.u32.u32 %r218,%r22; cvt.s64.s8 %r67,%r218; add.u64 %r52,%r167,40; add.u64 %r307,%frame,120; mov.u64 %r118,%r307; add.u64 %r105,%r168,40; mov.u64 %r165,%frame; add.u64 %r162,%frame,240; add.u64 %r159,%frame,360; mov.u64 %r160,%r159; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r227,%r29; bra $L47; $L46: .loc 1 242 5_gfortran_runtime_error70: mov.u64 %r29,%r115; $L47: .loc 1 248 18 ld.u64 %r221,[%r52]; st.u64 [%r118],%r221; .loc 1 249 20 ld.u64 %r223,[%r105]; mul.lo.u64 %r222,%r223,%r31; .loc 1 249 18 st.u64 [%r165],%r222; .loc 1 250 19 ld.u64 %r225,[%r52+16]; add.u64 %r224,%r225,1; ld.u64 %r226,[%r52+8]; sub.u64 %r44,%r224,%r226; .loc 1 250 17 st.u64 [%r162],%r44; .loc 1 251 16 st.u64 [%r160],%r227; .loc 1 252 10 setp.gt.s64 %r228,%r44,0; @ %r228 bra $L48; shl.b64 %r149,%r109,2; mov.u64 %r138,%r110; .loc 1 255 11 mov.u64 %r116,0; .loc 1 256 24 cvt.u32.u64 %r229,%r116; $L49: st.u32 [%r138],%r229; .loc 1 255 27 add.u64 %r116,%r116,1; .loc 1 255 4 add.u64 %r138,%r138,%r149; setp.ne.u64 %r230,%r67,%r116; @ %r230 bra $L49; bra $L39; $L48: .loc 1 246 26 add.u64 %r115,%r29,1; .loc 1 246 3 add.u64 %r52,%r52,24; add.u64 %r118,%r118,8; add.u64 %r105,%r105,24; add.u64 %r165,%r165,8; add.u64 %r162,%r162,8; add.u64 %r160,%r160,8; setp.ne.u64 %r231,%r67,%r115; @ %r231 bra $L70; .loc 1 261 8 ld.u64 %r94,[%r167]; shl.b64 %r57,%r109,2; mov.u64 %r60,%r110; .loc 1 264 10 mov.u64 %r114,0; .loc 1 265 23 cvt.u32.u64 %r232,%r114; $L51: st.u32 [%r60],%r232; mov.u64 %r145,%r114; .loc 1 264 26 add.u64 %r114,%r114,1; .loc 1 264 3 add.u64 %r60,%r60,%r57; setp.ne.u64 %r233,%r29,%r145; @ %r233 bra $L51; .loc 1 276 9 setp.eq.u64 %r234,%r94,0; @ %r234 bra $L39; .loc 1 331 19 ld.u64 %r73,[%frame+120]; .loc 1 331 9 shl.b64 %r75,%r73,4; .loc 1 332 20 ld.u64 %r76,[%frame]; .loc 1 334 34 ld.u64 %r79,[%frame+240]; ld.u64 %r163,[%frame+360]; cvt.u32.u32 %r236,%r22; cvt.s64.s8 %r235,%r236; shl.b64 %r237,%r235,3; add.u64 %r127,%r237,%r159; .loc 1 274 12 mov.u64 %r304,-1; mov.u64 %r305,9223372036854775807; setp.eq.u32 %r306,1,0; add.u64 
%r308,%frame,368; .loc 1 340 13 mov.u64 %r313,0; .loc 1 359 32 add.u64 %r314,%frame,240; .loc 1 308 12 setp.eq.u32 %r316,%r169,0; $L69: .loc 1 280 10 @ %r306 bra $L52; mov.u64 %r53,%r163; mov.u32 %r46,0; mov.u32 %r310,1; $L57: mov.u64 %r45,%r94; .loc 1 300 13 add.u64 %r94,%r45,%r75; .loc 1 284 11 ld.s8 %r240,[%r96]; cvt.u16.u32 %r239,%r240; setp.eq.u16 %r241,%r239,0; @ %r241 bra $L53; setp.eq.u32 %r242,%r46,0; selp.u64 %r163,%r163,%r53,%r242; .loc 1 294 16 ld.u64 %r304,[%r45]; ld.u64 %r305,[%r45+8]; mov.u64 %r137,%r308; mov.u64 %r150,%r110; mov.u64 %r161,%r163; $L56: .loc 1 296 33 cvt.u32.u64 %r245,%r161; add.u32 %r244,%r245,1; .loc 1 296 22 st.u32 [%r150],%r244; .loc 1 295 9 add.u64 %r150,%r150,%r57; setp.eq.u64 %r246,%r127,%r137; @ %r246 bra $L71; .loc 1 296 29 ld.u64 %r161,[%r137]; add.u64 %r137,%r137,8; bra $L56; $L53: .loc 1 301 14 add.u64 %r96,%r96,%r76; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r46,%r310; .loc 1 303 4 setp.ne.u64 %r247,%r53,%r79; @ %r247 bra $L57; bra $L97; $L52: .loc 1 308 12 @ %r316 bra $L72; mov.u64 %r64,%r163; mov.u64 %r92,%r94; $L63: .loc 1 311 11 ld.s8 %r250,[%r96]; cvt.u16.u32 %r249,%r250; setp.eq.u16 %r251,%r249,0; @ %r251 bra $L60; .loc 1 311 12 ld.u64 %r300,[%r92]; ld.u64 %r301,[%r92+8]; .loc 1 311 11 setp.gt.s64 %r254,%r301,%r305; @ %r254 bra $L60; setp.ne.u64 %r257,%r301,%r305; @ %r257 bra $L75; setp.gt.u64 %r260,%r300,%r304; @ %r260 bra $L60; $L75: mov.u64 %r103,%r308; mov.u64 %r119,%r110; mov.u64 %r156,%r64; .loc 1 314 18 mov.u64 %r146,%r313; $L62: .loc 1 315 36 cvt.u32.u64 %r263,%r156; add.u32 %r262,%r263,1; .loc 1 315 25 st.u32 [%r119],%r262; .loc 1 314 34 add.u64 %r113,%r146,1; .loc 1 314 11 add.u64 %r119,%r119,%r57; setp.le.s64 %r264,%r29,%r146; @ %r264 bra $L73; .loc 1 315 32 ld.u64 %r156,[%r103]; add.u64 %r103,%r103,8; mov.u64 %r146,%r113; bra $L62; $L73: mov.u64 %r304,%r300; mov.u64 %r305,%r301; $L60: .loc 1 317 8 add.u64 %r92,%r92,%r75; .loc 1 319 13 add.u64 %r64,%r64,1; .loc 1 319 6 st.u64 [%frame+360],%r64; setp.ne.u64 %r265,%r64,%r79; @ %r265 bra $L63; .loc 1 317 8 sub.u64 %r266,%r79,%r163; mad.lo.u64 %r94,%r266,%r75,%r94; bra $L64; $L72: mov.u64 %r78,%r163; mov.u64 %r95,%r96; mov.u64 %r93,%r94; $L59: .loc 1 323 11 ld.s8 %r269,[%r95]; cvt.u16.u32 %r268,%r269; setp.eq.u16 %r270,%r268,0; @ %r270 bra $L65; .loc 1 323 12 ld.u64 %r302,[%r93]; ld.u64 %r303,[%r93+8]; .loc 1 323 11 setp.gt.s64 %r273,%r305,%r303; @ %r273 bra $L76; setp.ne.u64 %r276,%r305,%r303; @ %r276 bra $L65; setp.gt.u64 %r279,%r304,%r302; @ ! 
%r279 bra $L65; $L76: mov.u64 %r91,%r308; mov.u64 %r97,%r110; mov.u64 %r151,%r78; .loc 1 326 12 mov.u64 %r147,%r313; $L67: .loc 1 327 36 cvt.u32.u64 %r282,%r151; add.u32 %r281,%r282,1; .loc 1 327 25 st.u32 [%r97],%r281; .loc 1 326 28 add.u64 %r112,%r147,1; .loc 1 326 5 add.u64 %r97,%r97,%r57; setp.le.s64 %r283,%r29,%r147; @ %r283 bra $L74; .loc 1 327 32 ld.u64 %r151,[%r91]; add.u64 %r91,%r91,8; mov.u64 %r147,%r112; bra $L67; $L74: mov.u64 %r304,%r302; mov.u64 %r305,%r303; $L65: .loc 1 331 9 add.u64 %r93,%r93,%r75; .loc 1 332 10 add.u64 %r95,%r95,%r76; .loc 1 334 14 add.u64 %r78,%r78,1; .loc 1 334 7 st.u64 [%frame+360],%r78; setp.ne.u64 %r284,%r78,%r79; @ %r284 bra $L59; sub.u64 %r133,%r79,%r163; .loc 1 331 9 mad.lo.u64 %r94,%r75,%r133,%r94; .loc 1 332 10 mad.lo.u64 %r96,%r76,%r133,%r96; bra $L64; $L97: st.u64 [%frame+360],%r79; $L64: mov.u64 %r125,%r308; mov.u64 %r88,%r76; .loc 1 334 34 mov.u64 %r90,%r79; mov.u64 %r86,%r73; mov.u64 %r124,8; .loc 1 335 9 mov.u64 %r98,0; $L68: .loc 1 340 13 st.u64 [%r125+-8],%r313; .loc 1 343 23 mul.lo.u64 %r81,%r86,%r90; .loc 1 344 24 mul.lo.u64 %r83,%r90,%r88; mov.u64 %r104,%r98; .loc 1 345 5 add.u64 %r98,%r98,1; .loc 1 346 7 setp.eq.u64 %r289,%r29,%r104; @ %r289 bra $L39; .loc 1 354 16 ld.u64 %r290,[%r125]; add.u64 %r85,%r290,1; st.u64 [%r125],%r85; .loc 1 355 23 add.u64 %r292,%r307,%r124; ld.u64 %r86,[%r292]; .loc 1 355 13 sub.u64 %r293,%r86,%r81; shl.b64 %r294,%r293,4; add.u64 %r94,%r94,%r294; .loc 1 356 24 add.u64 %r295,%frame,%r124; ld.u64 %r88,[%r295]; .loc 1 356 14 sub.u64 %r296,%r88,%r83; add.u64 %r96,%r96,%r296; .loc 1 359 32 add.u64 %r298,%r314,%r124; ld.u64 %r90,[%r298]; .loc 1 359 7 add.u64 %r125,%r125,8; add.u64 %r124,%r124,8; setp.eq.u64 %r299,%r85,%r90; @ %r299 bra $L68; ld.u64 %r163,[%frame+360]; bra $L69; $L71: mov.u64 %r94,%r45; setp.eq.u32 %r306,1,1; bra $L52; $L39: .loc 1 _gfortran_sminloc0_4_i16 .visible .func _gfortran_sminloc0_4_i16u32pred.reg .u64u32 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L99; .loc 1 378 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L100; $L99: .loc 1 380 7 {_gfortran_minloc0_4_i16.loc 1 381 7 bra $L98; $L100: .loc 1 384 8 ld.s8 %r30,[%r36+28]; .loc 1 386 6 setp.gt.s64 %r46,%r30,0; @ %r46 bra $L102_gfortran_runtime_error102: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L103; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29 mov.u64 %r56call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L104; $L103: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36.loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L104: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,2; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 cvt.u32.u64 %r68,%r32; $L105: st.u32 [%r29],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L105; $L98: .loc 1 406 1 ret; } minloc0_8_i16.o/_gfortran_minloc0_8_i16 .visible .func _gfortran_minloc0_8_i16libgfortran/generated/minloc0_8_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_8_i16 .visible .func _gfortran_mminloc0_8_i16_gfortran_sminloc0_8_i16 .visible .func _gfortran_sminloc0_8_i16VAR DECL: _gfortrani_compile_options .extern 
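// minloc0_8_i16.o: nvptx PTX generated from libgfortran/generated/minloc0_8_i16.c,
// MINLOC with kind=8 integer result positions over a kind=16 integer array.
// Entry points: _gfortran_minloc0_8_i16, _gfortran_mminloc0_8_i16 (array MASK) and
// _gfortran_sminloc0_8_i16 (scalar MASK); the extern declarations and error-string
// constants mirror those of the other minloc0 members.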
.global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_8_i16 .visible .func _gfortran_minloc0_8_i16stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[36832predpredpred %r169; .reg .pred %r172; .reg .predpredpredpredpredmov.u64 %r112,%ar0; mov.u64 %r113,%ar1; mov.u32 %r114,%ar2; .loc 1 50 10 ld.s8 %r22,[%r113+28]; .loc 1 50 8 cvt.u32.u32 %r115,%r22; cvt.s64.s8 %r71,%r115; .loc 1 51 6 setp.gt.s64 %r116,%r71,0; @ %r116 bra $L2; .loc 1 52 5 cvta.const.u64 %r11117_gfortran_runtime_errorr23,[%r112]; .loc 1 54 6 setp.ne.u64 %r119,%r23,0; @ %r119 bra $L3; .loc 1 56 7 st.u64 [%r112+48],%r23; add.u64 %r121,%r71,-1; st.u64 [%r112+56],%r121; mov.u64 %r122,1; st.u64 [%r112+40],%r122; .loc 1 57 28 cvt.u32.u64 %r123,%r122; st.u8 [%r112+28],%r123; .loc 1 58 24 st.u64 [%r112+8],%r23; .loc 1 59 29 mov.u64 %r171call (%value_in),_gfortrani_xmallocarray129,[%value_in]; } mov.u64 %r23,%r129; .loc 1 59 27 st.u64 [%r112],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r131,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r132,[%r131+36]; setp.eq.u32 %r133,%r132,0; @ %r133 bra $L4; .loc 1 64r113136; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r112]; $L4: .loc 1 68 11 ld.u64 %r72,[%r112+40]; add.u64 %r36,%r113,40; mov.u64 %r104,%frame; add.u64 %r105,%frame,120; .loc 1 70 10 mov.u64 %r98,0; add.u64 %r204,%frame,240; .loc 1 74 16 mov.u64 %r144,%r98; bra $L7; $L24: mov.u64 %r98,%r79; $L7: .loc 1 72 18 ld.u64 %r137,[%r36]; st.u64 [%r104],%r137; .loc 1 73 19 ld.u64 %r139,[%r36+16]; add.u64 %r138,%r139,1; ld.u64 %r140,[%r36+8]; sub.u64 %r31,%r138,%r140; .loc 1 73 17 st.u64 [%r105],%r31; .loc 1 74 16 shl.b64 %r142,%r98,3; add.u64 %r143,%r204,%r142; st.u64 [%r143],%r144; .loc 1 75 10 setp.gt.s64 %r145,%r31,0; @ %r145 bra $L5; shl.b64 %r81,%r72,3; mov.u64 %r87,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 mov.u64 %r146,%r80; $L6: st.u64 [%r87],%r146; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r87,%r87,%r81; setp.ne.u64 %r147,%r71,%r80; @ %r147 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r98,1; .loc 1 70 3 add.u64 %r36,%r36,24; add.u64 %r104,%r104,8; add.u64 %r105,%r105,8; setp.ne.u64 %r148,%r71,%r79; @ %r148 bra $L24; .loc 1 84 8 ld.u64 %r62,[%r113]; shl.b64 %r84,%r72,3; mov.u64 %r39,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u64 %r149,1; $L8: st.u64 [%r39],%r149; mov.u64 %r32,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r39,%r39,%r84; setp.ne.u64 %r150,%r98,%r32; @ %r150 bra $L8; .loc 1 101 9 setp.ne.u64 %r151,%r62,0; @ ! 
%r151 bra $L1; .loc 1 149 19 ld.u64 %r45,[%frame]; .loc 1 149 9 shl.b64 %r47,%r45,4; .loc 1 151 34 ld.u64 %r51,[%frame+120]; cvt.u32.u32 %r153,%r22; cvt.s64.s8 %r152,%r153; shl.b64 %r154,%r152,3; add.u64 %r73,%r204,%r154; .loc 1 99 12 mov.u64 %r201,-1; mov.u64 %r202,9223372036854775807; setp.eq.u32 %r203,%r114,0; add.u64 %r205,%frame,248; .loc 1 157 13 mov.u64 %r208,0; .loc 1 174 32 add.u64 %r209,%frame,120; $L23: .loc 1 151 21 ld.u64 %r103,[%frame+240]; .loc 1 126 10 @ %r203 bra $L25; mov.u64 %r40,%r103; mov.u64 %r60,%r62; $L15: .loc 1 129 10 ld.u64 %r197,[%r60]; ld.u64 %r198,[%r60+8]; .loc 1 129 9 setp.gt.s64 %r159,%r198,%r202; @ %r159 bra $L12; setp.ne.u64 %r162,%r198,%r202; @ %r162 bra $L28; setp.gt.u64 %r165,%r197,%r201; @ %r165 bra $L12; $L28: mov.u64 %r76,%r205; mov.u64 %r34,%r23; mov.u64 %r90,%r40; $L14: .loc 1 133 34 add.u64 %r167,%r90,1; .loc 1 133 23 st.u64 [%r34],%r167; .loc 1 132 3 add.u64 %r34,%r34,%r84; setp.eq.u64 %r168,%r73,%r76; @ %r168 bra $L26; .loc 1 133 30 ld.u64 %r90,[%r76]; add.u64 %r76,%r76,8; bra $L14; $L26: mov.u64 %r201,%r197; mov.u64 %r202,%r198; $L12: .loc 1 135 11 add.u64 %r60,%r60,%r47; .loc 1 137 9 add.u64 %r40,%r40,1; .loc 1 137 2 st.u64 [%frame+240],%r40; setp.ne.u64 %r169,%r40,%r51; @ %r169 bra $L15; bra $L38; $L25: mov.u64 %r49,%r103; mov.u64 %r61,%r62; $L11: .loc 1 141 10 ld.u64 %r199,[%r61]; ld.u64 %r200,[%r61+8]; .loc 1 141 9 setp.gt.s64 %r172,%r202,%r200; @ %r172 bra $L29; setp.ne.u64 %r175,%r202,%r200; @ %r175 bra $L17; setp.gt.u64 %r178,%r201,%r199; @ ! %r178 bra $L17; $L29: mov.u64 %r85,%r205; mov.u64 %r101,%r23; mov.u64 %r86,%r49; $L19: .loc 1 145 34 add.u64 %r180,%r86,1; .loc 1 145 23 st.u64 [%r101],%r180; .loc 1 144 3 add.u64 %r101,%r101,%r84; setp.eq.u64 %r181,%r73,%r85; @ %r181 bra $L27; .loc 1 145 30 ld.u64 %r86,[%r85]; add.u64 %r85,%r85,8; bra $L19; $L27: mov.u64 %r201,%r199; mov.u64 %r202,%r200; $L17: .loc 1 149 9 add.u64 %r61,%r61,%r47; .loc 1 151 14 add.u64 %r49,%r49,1; .loc 1 151 7 st.u64 [%frame+240],%r49; setp.ne.u64 %r182,%r49,%r51; @ %r182 bra $L11; .loc 1 149 9 sub.u64 %r183,%r51,%r103; mad.lo.u64 %r62,%r183,%r47,%r62; bra $L20; $L38: .loc 1 135 11 sub.u64 %r185,%r51,%r103; mad.lo.u64 %r62,%r185,%r47,%r62; $L20: mov.u64 %r70,%r205; .loc 1 151 34 mov.u64 %r59,%r51; mov.u64 %r57,%r45; .loc 1 152 9 mov.u64 %r66,0; $L22: .loc 1 157 13 st.u64 [%r70+-8],%r208; .loc 1 160 23 mul.lo.u64 %r54,%r57,%r59; mov.u64 %r99,%r66; .loc 1 161 5 add.u64 %r66,%r66,1; .loc 1 162 7 setp.eq.u64 %r189,%r99,%r98; @ %r189 bra $L1; .loc 1 170 16 ld.u64 %r190,[%r70]; add.u64 %r56,%r190,1; st.u64 [%r70],%r56; shl.b64 %r65,%r66,3; .loc 1 171 23 add.u64 %r191,%frame,%r65; ld.u64 %r57,[%r191]; .loc 1 171 13 sub.u64 %r192,%r57,%r54; shl.b64 %r193,%r192,4; add.u64 %r62,%r62,%r193; .loc 1 174 32 add.u64 %r195,%r209,%r65; ld.u64 %r59,[%r195]; .loc 1 174 7 add.u64 %r70,%r70,8; setp.eq.u64 %r196,%r56,%r59; @ %r196 bra $L22; bra $L23; $L_gfortran_mminloc0_8_i16 .visible .func _gfortran_mminloc0_8_i16reg .u64 %stack; mov.u64 %stack,04809pred %r187; .reg .u64 %r190; .reg .u64u64pred %r222; .reg .u64 %r223; .reg .predu32 %r230; .reg .u64 %r231; .reg .u16 %r233; .reg .u32 %r234; .reg .pred %r235; .reg .pred %r236; .reg .u64 %r238; .reg .pred %r239; .reg .pred %r240; .reg .u16 %r242; .reg .u32 %r243; .reg .pred %r244; .reg .pred %r247; .reg .pred %r250; .reg .predu32 %r261; .reg .pred %r262; .reg .pred %r265; .reg .pred %r268; .reg .pred %r271; .reg .u64 %r273; .reg .pred %r274; .reg .predu64 %r283; .reg .u64u64 %r305; .reg .pred %r307; mov.u64 %r160,%ar0; mov.u64 
%r161,%ar1; mov.u64 %r162,%ar2; mov.u32 %r163,%ar3; .loc 1 202 6 setp.ne.u64 %r164,%r162,0; @ %r164 bra $L40; .loc 1 204 7 {1163; call _gfortran_minloc0_8_i16.loc 1 205 7 bra $L39; $L40: .loc 1 208 10 ld.s8 %r22,[%r161+28]; .loc 1 209 6 setp.gt.s32 %r168,%r22,0; @ %r168 bra $L42; .loc 1 210 5 cvta.const.u64 %r16stack; call _gfortran_runtime_error42: .loc 1 212 6 ld.u64 %r171,[%r160]; setp.ne.u64 %r172,%r171,0; @ %r172 bra $L43; .loc 1 214 7 st.u64 [%r160+48],%r171; add.u32 %r174,%r22,-1; cvt.s64.s32 %r175,%r174; st.u64 [%r160+56],%r175; mov.u64 %r176,1; st.u64 [%r160+40],%r176; .loc 1 215 28 cvt.u32.u64 %r177,%r176; st.u8 [%r160+28],%r177; .loc 1 216 24 st.u64 [%r160+8],%r171; .loc 1 217 29 cvt.u32.u32 %r182,%r22; cvt.s64.s8 %r181,%r182; mov.u64 %r180call (%value_in),_gfortrani_xmallocarray183,[%value_in]; } .loc 1 217 27 st.u64 [%r160],%r183; bra $L44; $L43: .loc 1 221 11 cvta.global.u64 %r185,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r186,[%r185+36]; setp.eq.u32 %r187,%r186,0; @ %r187 bra $L44; .loc 1 224 4 cvta.const.u64 %r190,$LC1; {161190; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19316161193190; call _gfortrani_bounds_equal_extents$L44: .loc 1 231 15 ld.u64 %r31,[%r162+16]; .loc 1 233 9 ld.u64 %r96,[%r162]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r195,%r32,-4; and.b32 %r196,%r195,-5; set.u32.eq.u32 %r198,%r196,0; neg.s32 %r199,%r198; .loc 1 235 22 add.u32 %r200,%r32,-1; set.u32.le.u32 %r202,%r200,1; neg.s32 %r203,%r202; .loc 1 235 6 cvt.u16.u32 %r205,%r199; cvt.u16.u32 %r206,%r203; or.b16 %r204,%r205,%r206; cvt.u32.u16 %r207,%r204; cvt.u16.u8 %r208,%r207; .loc 1 237 7 setp.ne.u32 %r211,%r32,16; @ %r211 bra $L46; $L45: .loc 1 244 11 ld.u64 %r110,[%r160+40]; .loc 1 245 8 ld.u64 %r111,[%r160]; .loc 1 246 17 cvt.u32.u32 %r212,%r22; cvt.s64.s8 %r68,%r212; add.u64 %r153,%r161,40; add.u64 %r298,%frame,120; mov.u64 %r149,%r298; add.u64 %r147,%r162,40; mov.u64 %r146,%frame; add.u64 %r71,%frame,240; add.u64 %r64,%frame,360; mov.u64 %r47,%r64; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r221,%r29; bra $L47; $L46: .loc 1 242 5_gfortran_runtime_error70: mov.u64 %r29,%r116; $L47: .loc 1 248 18 ld.u64 %r215,[%r153]; st.u64 [%r149],%r215; .loc 1 249 20 ld.u64 %r217,[%r147]; mul.lo.u64 %r216,%r217,%r31; .loc 1 249 18 st.u64 [%r146],%r216; .loc 1 250 19 ld.u64 %r219,[%r153+16]; add.u64 %r218,%r219,1; ld.u64 %r220,[%r153+8]; sub.u64 %r44,%r218,%r220; .loc 1 250 17 st.u64 [%r71],%r44; .loc 1 251 16 st.u64 [%r47],%r221; .loc 1 252 10 setp.gt.s64 %r222,%r44,0; @ %r222 bra $L48; shl.b64 %r138,%r110,3; mov.u64 %r130,%r111; .loc 1 255 11 mov.u64 %r117,0; .loc 1 256 24 mov.u64 %r223,%r117; $L49: st.u64 [%r130],%r223; .loc 1 255 27 add.u64 %r117,%r117,1; .loc 1 255 4 add.u64 %r130,%r130,%r138; setp.ne.u64 %r224,%r68,%r117; @ %r224 bra $L49; bra $L39; $L48: .loc 1 246 26 add.u64 %r116,%r29,1; .loc 1 246 3 add.u64 %r153,%r153,24; add.u64 %r149,%r149,8; add.u64 %r147,%r147,24; add.u64 %r146,%r146,8; add.u64 %r71,%r71,8; add.u64 %r47,%r47,8; setp.ne.u64 %r225,%r68,%r116; @ %r225 bra $L70; .loc 1 261 8 ld.u64 %r93,[%r161]; shl.b64 %r158,%r110,3; mov.u64 %r156,%r111; .loc 1 264 10 mov.u64 %r115,0; .loc 1 265 23 mov.u64 %r226,%r115; $L51: st.u64 [%r156],%r226; mov.u64 %r141,%r115; .loc 1 264 26 add.u64 %r115,%r115,1; .loc 1 264 3 add.u64 %r156,%r156,%r158; setp.ne.u64 %r227,%r29,%r141; @ %r227 bra $L51; .loc 1 276 9 setp.eq.u64 %r228,%r93,0; @ %r228 bra $L39; .loc 1 331 19 ld.u64 %r73,[%frame+120]; .loc 1 331 9 shl.b64 %r75,%r73,4; .loc 1 332 20 ld.u64 
%r76,[%frame]; .loc 1 334 34 ld.u64 %r79,[%frame+240]; ld.u64 %r155,[%frame+360]; cvt.u32.u32 %r230,%r22; cvt.s64.s8 %r229,%r230; shl.b64 %r231,%r229,3; add.u64 %r120,%r231,%r64; .loc 1 274 12 mov.u64 %r295,-1; mov.u64 %r296,9223372036854775807; setp.eq.u32 %r297,1,0; add.u64 %r299,%frame,368; .loc 1 340 13 mov.u64 %r304,0; .loc 1 359 32 add.u64 %r305,%frame,240; .loc 1 308 12 setp.eq.u32 %r307,%r163,0; $L69: .loc 1 280 10 @ %r297 bra $L52; mov.u64 %r56,%r155; mov.u32 %r46,0; mov.u32 %r301,1; $L57: mov.u64 %r45,%r93; .loc 1 300 13 add.u64 %r93,%r45,%r75; .loc 1 284 11 ld.s8 %r234,[%r96]; cvt.u16.u32 %r233,%r234; setp.eq.u16 %r235,%r233,0; @ %r235 bra $L53; setp.eq.u32 %r236,%r46,0; selp.u64 %r155,%r155,%r56,%r236; .loc 1 294 16 ld.u64 %r295,[%r45]; ld.u64 %r296,[%r45+8]; mov.u64 %r129,%r299; mov.u64 %r139,%r111; mov.u64 %r151,%r155; $L56: .loc 1 296 33 add.u64 %r238,%r151,1; .loc 1 296 22 st.u64 [%r139],%r238; .loc 1 295 9 add.u64 %r139,%r139,%r158; setp.eq.u64 %r239,%r120,%r129; @ %r239 bra $L71; .loc 1 296 29 ld.u64 %r151,[%r129]; add.u64 %r129,%r129,8; bra $L56; $L53: .loc 1 301 14 add.u64 %r96,%r96,%r76; .loc 1 303 11 add.u64 %r56,%r56,1; mov.u32 %r46,%r301; .loc 1 303 4 setp.ne.u64 %r240,%r56,%r79; @ %r240 bra $L57; bra $L97; $L52: .loc 1 308 12 @ %r307 bra $L72; mov.u64 %r66,%r155; mov.u64 %r91,%r93; $L63: .loc 1 311 11 ld.s8 %r243,[%r96]; cvt.u16.u32 %r242,%r243; setp.eq.u16 %r244,%r242,0; @ %r244 bra $L60; .loc 1 311 12 ld.u64 %r291,[%r91]; ld.u64 %r292,[%r91+8]; .loc 1 311 11 setp.gt.s64 %r247,%r292,%r296; @ %r247 bra $L60; setp.ne.u64 %r250,%r292,%r296; @ %r250 bra $L75; setp.gt.u64 %r253,%r291,%r295; @ %r253 bra $L60; $L75: mov.u64 %r62,%r299; mov.u64 %r94,%r111; mov.u64 %r148,%r66; .loc 1 314 18 mov.u64 %r142,%r304; $L62: .loc 1 315 36 add.u64 %r255,%r148,1; .loc 1 315 25 st.u64 [%r94],%r255; .loc 1 314 34 add.u64 %r114,%r142,1; .loc 1 314 11 add.u64 %r94,%r94,%r158; setp.le.s64 %r256,%r29,%r142; @ %r256 bra $L73; .loc 1 315 32 ld.u64 %r148,[%r62]; add.u64 %r62,%r62,8; mov.u64 %r142,%r114; bra $L62; $L73: mov.u64 %r295,%r291; mov.u64 %r296,%r292; $L60: .loc 1 317 8 add.u64 %r91,%r91,%r75; .loc 1 319 13 add.u64 %r66,%r66,1; .loc 1 319 6 st.u64 [%frame+360],%r66; setp.ne.u64 %r257,%r66,%r79; @ %r257 bra $L63; .loc 1 317 8 sub.u64 %r258,%r79,%r155; mad.lo.u64 %r93,%r258,%r75,%r93; bra $L64; $L72: mov.u64 %r78,%r155; mov.u64 %r95,%r96; mov.u64 %r92,%r93; $L59: .loc 1 323 11 ld.s8 %r261,[%r95]; cvt.u16.u32 %r260,%r261; setp.eq.u16 %r262,%r260,0; @ %r262 bra $L65; .loc 1 323 12 ld.u64 %r293,[%r92]; ld.u64 %r294,[%r92+8]; .loc 1 323 11 setp.gt.s64 %r265,%r296,%r294; @ %r265 bra $L76; setp.ne.u64 %r268,%r296,%r294; @ %r268 bra $L65; setp.gt.u64 %r271,%r295,%r293; @ ! 
%r271 bra $L65; $L76: mov.u64 %r118,%r299; mov.u64 %r57,%r111; mov.u64 %r144,%r78; .loc 1 326 12 mov.u64 %r143,%r304; $L67: .loc 1 327 36 add.u64 %r273,%r144,1; .loc 1 327 25 st.u64 [%r57],%r273; .loc 1 326 28 add.u64 %r113,%r143,1; .loc 1 326 5 add.u64 %r57,%r57,%r158; setp.le.s64 %r274,%r29,%r143; @ %r274 bra $L74; .loc 1 327 32 ld.u64 %r144,[%r118]; add.u64 %r118,%r118,8; mov.u64 %r143,%r113; bra $L67; $L74: mov.u64 %r295,%r293; mov.u64 %r296,%r294; $L65: .loc 1 331 9 add.u64 %r92,%r92,%r75; .loc 1 332 10 add.u64 %r95,%r95,%r76; .loc 1 334 14 add.u64 %r78,%r78,1; .loc 1 334 7 st.u64 [%frame+360],%r78; setp.ne.u64 %r275,%r78,%r79; @ %r275 bra $L59; sub.u64 %r126,%r79,%r155; .loc 1 331 9 mad.lo.u64 %r93,%r75,%r126,%r93; .loc 1 332 10 mad.lo.u64 %r96,%r76,%r126,%r96; bra $L64; $L97: st.u64 [%frame+360],%r79; $L64: mov.u64 %r105,%r299; mov.u64 %r88,%r76; .loc 1 334 34 mov.u64 %r90,%r79; mov.u64 %r86,%r73; mov.u64 %r104,8; .loc 1 335 9 mov.u64 %r99,0; $L68: .loc 1 340 13 st.u64 [%r105+-8],%r304; .loc 1 343 23 mul.lo.u64 %r81,%r86,%r90; .loc 1 344 24 mul.lo.u64 %r83,%r90,%r88; mov.u64 %r70,%r99; .loc 1 345 5 add.u64 %r99,%r99,1; .loc 1 346 7 setp.eq.u64 %r280,%r29,%r70; @ %r280 bra $L39; .loc 1 354 16 ld.u64 %r281,[%r105]; add.u64 %r85,%r281,1; st.u64 [%r105],%r85; .loc 1 355 23 add.u64 %r283,%r298,%r104; ld.u64 %r86,[%r283]; .loc 1 355 13 sub.u64 %r284,%r86,%r81; shl.b64 %r285,%r284,4; add.u64 %r93,%r93,%r285; .loc 1 356 24 add.u64 %r286,%frame,%r104; ld.u64 %r88,[%r286]; .loc 1 356 14 sub.u64 %r287,%r88,%r83; add.u64 %r96,%r96,%r287; .loc 1 359 32 add.u64 %r289,%r305,%r104; ld.u64 %r90,[%r289]; .loc 1 359 7 add.u64 %r105,%r105,8; add.u64 %r104,%r104,8; setp.eq.u64 %r290,%r85,%r90; @ %r290 bra $L68; ld.u64 %r155,[%frame+360]; bra $L69; $L71: mov.u64 %r93,%r45; setp.eq.u32 %r297,1,1; bra $L52; $L39: .loc 1 _gfortran_sminloc0_8_i16 .visible .func _gfortran_sminloc0_8_i16u32pred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L99; .loc 1 378 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L100; $L99: .loc 1 380 7 {_gfortran_minloc0_8_i16.loc 1 381 7 bra $L98; $L100: .loc 1 384 8 ld.s8 %r30,[%r36+28]; .loc 1 386 6 setp.gt.s64 %r46,%r30,0; @ %r46 bra $L102_gfortran_runtime_error102: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L103; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29 mov.u64 call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L104; $L103: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36.loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L104: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,3; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 mov.u64 %r68,%r32; $L105: st.u64 [%r29],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L105; $L98: .loc 1 406 1 ret; } /127 _gfortran_minloc0_16_i16 .visible .func _gfortran_minloc0_16_i16libgfortran/generated/minloc0_16_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_16_i16 .visible .func _gfortran_mminloc0_16_i16_gfortran_sminloc0_16_i16 .visible .func _gfortran_sminloc0_16_i16VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 
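// minloc0_16_i16.o (referenced here via the archive long-name entry /127): nvptx PTX
// generated from libgfortran/generated/minloc0_16_i16.c, MINLOC with kind=16 integer
// result positions over a kind=16 integer array, again with plain (minloc0),
// masked (mminloc0) and scalar-mask (sminloc0) entry points.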
_gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_16_i16 .visible .func _gfortran_minloc0_16_i16stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[3683264u32 %r157; .reg .u64 %r158; .reg .pred %r163; .reg .pred %r166; .reg .predpredpredpredu64 %r191; .reg .predpred64mov.u32 %r116,%ar2; .loc 1 50 10 ld.s8 %r22,[%r115+28]; .loc 1 50 8 cvt.u32.u32 %r117,%r22; cvt.s64.s8 %r71,%r117; .loc 1 51 6 setp.gt.s64 %r118,%r71,0; @ %r118 bra $L2; .loc 1 52 5 cvta.const.u64 %r1111_gfortran_runtime_errorr23,[%r114]; .loc 1 54 6 setp.ne.u64 %r121,%r23,0; @ %r121 bra $L3; .loc 1 56 7 st.u64 [%r114+48],%r23; add.u64 %r123,%r71,-1; st.u64 [%r114+56],%r123; mov.u64 %r124,1; st.u64 [%r114+40],%r124; .loc 1 57 28 cvt.u32.u64 %r125,%r124; st.u8 [%r114+28],%r125; .loc 1 58 24 st.u64 [%r114+8],%r23; .loc 1 59 29 mov.u64 %r128,1671call (%value_in),_gfortrani_xmallocarray131,[%value_in]; } mov.u64 %r23,%r131; .loc 1 59 27 st.u64 [%r114],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r133,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r134,[%r133+36]; setp.eq.u32 %r135,%r134,0; @ %r135 bra $L4; .loc 1 64r115138; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r114]; $L4: .loc 1 68 11 ld.u64 %r72,[%r114+40]; add.u64 %r44,%r115,40; mov.u64 %r35,%frame; add.u64 %r52,%frame,120; .loc 1 70 10 mov.u64 %r100,0; add.u64 %r210,%frame,240; .loc 1 74 16 mov.u64 %r146,%r100; bra $L7; $L24: mov.u64 %r100,%r79; $L7: .loc 1 72 18 ld.u64 %r139,[%r44]; st.u64 [%r35],%r139; .loc 1 73 19 ld.u64 %r141,[%r44+16]; add.u64 %r140,%r141,1; ld.u64 %r142,[%r44+8]; sub.u64 %r32,%r140,%r142; .loc 1 73 17 st.u64 [%r52],%r32; .loc 1 74 16 shl.b64 %r144,%r100,3; add.u64 %r145,%r210,%r144; st.u64 [%r145],%r146; .loc 1 75 10 setp.gt.s64 %r147,%r32,0; @ %r147 bra $L5; shl.b64 %r86,%r72,4; mov.u64 %r88,%r23; .loc 1 78 11 mov.u64 %r80,0; .loc 1 79 24 mov.u64 %r148,%r80; $L6: st.u64 [%r88],%r148; st.u64 [%r88+8],%r148; .loc 1 78 27 add.u64 %r80,%r80,1; .loc 1 78 4 add.u64 %r88,%r88,%r86; setp.ne.u64 %r150,%r71,%r80; @ %r150 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r79,%r100,1; .loc 1 70 3 add.u64 %r44,%r44,24; add.u64 %r35,%r35,8; add.u64 %r52,%r52,8; setp.ne.u64 %r151,%r71,%r79; @ %r151 bra $L24; .loc 1 84 8 ld.u64 %r62,[%r115]; shl.b64 %r38,%r72,4; mov.u64 %r85,%r23; .loc 1 87 10 mov.u64 %r78,0; .loc 1 88 23 mov.u64 %r152,1; mov.u64 %r153,%r78; $L8: st.u64 [%r85],%r152; st.u64 [%r85+8],%r153; mov.u64 %r33,%r78; .loc 1 87 26 add.u64 %r78,%r78,1; .loc 1 87 3 add.u64 %r85,%r85,%r38; setp.ne.u64 %r154,%r100,%r33; @ %r154 bra $L8; .loc 1 101 9 setp.ne.u64 %r155,%r62,0; @ ! 
%r155 bra $L1; .loc 1 149 19 ld.u64 %r47,[%frame]; .loc 1 149 9 shl.b64 %r49,%r47,4; .loc 1 151 34 ld.u64 %r51,[%frame+120]; cvt.u32.u32 %r157,%r22; cvt.s64.s8 %r156,%r157; shl.b64 %r158,%r156,3; add.u64 %r89,%r210,%r158; .loc 1 99 12 mov.u64 %r207,-1; mov.u64 %r208,9223372036854775807; setp.eq.u32 %r209,%r116,0; add.u64 %r211,%frame,248; .loc 1 157 13 mov.u64 %r214,0; .loc 1 174 32 add.u64 %r215,%frame,120; $L23: .loc 1 151 21 ld.u64 %r107,[%frame+240]; .loc 1 126 10 @ %r209 bra $L25; mov.u64 %r41,%r107; mov.u64 %r60,%r62; $L15: .loc 1 129 10 ld.u64 %r203,[%r60]; ld.u64 %r204,[%r60+8]; .loc 1 129 9 setp.gt.s64 %r163,%r204,%r208; @ %r163 bra $L12; setp.ne.u64 %r166,%r204,%r208; @ %r166 bra $L28; setp.gt.u64 %r169,%r203,%r207; @ %r169 bra $L12; $L28: mov.u64 %r27,%r211; mov.u64 %r63,%r23; mov.u64 %r103,%r41; $L14: .loc 1 133 34 add.u64 %r171,%r103,1; st.u64 [%r63],%r171; shr.s64 %r172,%r171,63; st.u64 [%r63+8],%r172; .loc 1 132 3 add.u64 %r63,%r63,%r38; setp.eq.u64 %r173,%r27,%r89; @ %r173 bra $L26; .loc 1 133 30 ld.u64 %r103,[%r27]; add.u64 %r27,%r27,8; bra $L14; $L26: mov.u64 %r207,%r203; mov.u64 %r208,%r204; $L12: .loc 1 135 11 add.u64 %r60,%r60,%r49; .loc 1 137 9 add.u64 %r41,%r41,1; .loc 1 137 2 st.u64 [%frame+240],%r41; setp.ne.u64 %r174,%r41,%r51; @ %r174 bra $L15; bra $L38; $L25: mov.u64 %r50,%r107; mov.u64 %r61,%r62; $L11: .loc 1 141 10 ld.u64 %r205,[%r61]; ld.u64 %r206,[%r61+8]; .loc 1 141 9 setp.gt.s64 %r177,%r208,%r206; @ %r177 bra $L29; setp.ne.u64 %r180,%r208,%r206; @ %r180 bra $L17; setp.gt.u64 %r183,%r207,%r205; @ ! %r183 bra $L17; $L29: mov.u64 %r104,%r211; mov.u64 %r105,%r23; mov.u64 %r84,%r50; $L19: .loc 1 145 34 add.u64 %r185,%r84,1; st.u64 [%r105],%r185; shr.s64 %r186,%r185,63; st.u64 [%r105+8],%r186; .loc 1 144 3 add.u64 %r105,%r105,%r38; setp.eq.u64 %r187,%r89,%r104; @ %r187 bra $L27; .loc 1 145 30 ld.u64 %r84,[%r104]; add.u64 %r104,%r104,8; bra $L19; $L27: mov.u64 %r207,%r205; mov.u64 %r208,%r206; $L17: .loc 1 149 9 add.u64 %r61,%r61,%r49; .loc 1 151 14 add.u64 %r50,%r50,1; .loc 1 151 7 st.u64 [%frame+240],%r50; setp.ne.u64 %r188,%r50,%r51; @ %r188 bra $L11; .loc 1 149 9 sub.u64 %r189,%r51,%r107; mad.lo.u64 %r62,%r189,%r49,%r62; bra $L20; $L38: .loc 1 135 11 sub.u64 %r191,%r51,%r107; mad.lo.u64 %r62,%r191,%r49,%r62; $L20: mov.u64 %r77,%r211; .loc 1 151 34 mov.u64 %r59,%r51; mov.u64 %r57,%r47; .loc 1 152 9 mov.u64 %r65,0; $L22: .loc 1 157 13 st.u64 [%r77+-8],%r214; .loc 1 160 23 mul.lo.u64 %r54,%r57,%r59; mov.u64 %r101,%r65; .loc 1 161 5 add.u64 %r65,%r65,1; .loc 1 162 7 setp.eq.u64 %r195,%r101,%r100; @ %r195 bra $L1; .loc 1 170 16 ld.u64 %r196,[%r77]; add.u64 %r56,%r196,1; st.u64 [%r77],%r56; shl.b64 %r68,%r65,3; .loc 1 171 23 add.u64 %r197,%frame,%r68; ld.u64 %r57,[%r197]; .loc 1 171 13 sub.u64 %r198,%r57,%r54; shl.b64 %r199,%r198,4; add.u64 %r62,%r62,%r199; .loc 1 174 32 add.u64 %r201,%r215,%r68; ld.u64 %r59,[%r201]; .loc 1 174 7 add.u64 %r77,%r77,8; setp.eq.u64 %r202,%r56,%r59; @ %r202 bra $L22; bra $L23; $L_gfortran_mminloc0_16_i16 .visible .func _gfortran_mminloc0_16_i16reg .u64 %stack; mov.u64 %stack,04808u64u32 %r198; .reg .u32pred %r272; .reg .predpredu64u64 %r302; .reg .u64mov.u64 %r163,%ar0; mov.u64 %r164,%ar1; mov.u64 %r165,%ar2; mov.u32 %r166,%ar3; .loc 1 202 6 setp.ne.u64 %r167,%r165,0; @ %r167 bra $L4016r1166; call _gfortran_minloc0_16_i16.loc 1 205 7 bra $L39; $L40: .loc 1 208 10 ld.s8 %r22,[%r164+28]; .loc 1 209 6 setp.gt.s32 %r171,%r22,0; @ %r171 bra $L42; .loc 1 210 5 cvta.const.u64 %r172172_gfortran_runtime_error42: .loc 1 212 6 ld.u64 
%r174,[%r163]; setp.ne.u64 %r175,%r174,0; @ %r175 bra $L43; .loc 1 214 7 st.u64 [%r163+48],%r174; add.u32 %r177,%r22,-1; cvt.s64.s32 %r178,%r177; st.u64 [%r163+56],%r178; mov.u64 %r179,1; st.u64 [%r163+40],%r179; .loc 1 215 28 cvt.u32.u64 %r180,%r179; st.u8 [%r163+28],%r180; .loc 1 216 24 st.u64 [%r163+8],%r174; .loc 1 217 29 cvt.u32.u32 %r185,%r22; cvt.s64.s8 %r184,%r185; mov.u64 %r183,16call (%value_in),_gfortrani_xmallocarray186,[%value_in]; } .loc 1 217 27 st.u64 [%r163],%r186; bra $L44; $L43: .loc 1 221 11 cvta.global.u64 %r188,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r189,[%r188+36]; setp.eq.u32 %r190,%r189,0r164193; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r196193; call _gfortrani_bounds_equal_extents$L44: .loc 1 231 15 ld.u64 %r31,[%r165+16]; .loc 1 233 9 ld.u64 %r93,[%r165]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r198,%r32,-4; and.b32 %r199,%r198,-5; set.u32.eq.u32 %r201,%r199,0; neg.s32 %r202,%r201; .loc 1 235 22 add.u32 %r203,%r32,-1; set.u32.le.u32 %r205,%r203,1; neg.s32 %r206,%r205; .loc 1 235 6 cvt.u16.u32 %r208,%r202; cvt.u16.u32 %r209,%r206; or.b16 %r207,%r208,%r209;45; .loc 1 237 7 setp.ne.u32 %r214,%r32,16; @ %r214 bra $L46; $L45: .loc 1 244 11 ld.u64 %r106,[%r163+40]; .loc 1 245 8 ld.u64 %r107,[%r163]; .loc 1 246 17 cvt.u32.u32 %r215,%r22; cvt.s64.s8 %r64,%r215; add.u64 %r161,%r164,40; add.u64 %r306,%frame,120; mov.u64 %r159,%r306; add.u64 %r158,%r165,40; mov.u64 %r154,%frame; add.u64 %r155,%frame,240; add.u64 %r151,%frame,360; mov.u64 %r150,%r151; .loc 1 246 10 mov.u64 %r29,0; .loc 1 251 16 mov.u64 %r224,%r29; bra $L47; $L46: .loc 1 242 5_gfortran_runtime_error70: mov.u64 %r29,%r112; $L47: .loc 1 248 18 ld.u64 %r218,[%r161]; st.u64 [%r159],%r218; .loc 1 249 20 ld.u64 %r220,[%r158]; mul.lo.u64 %r219,%r220,%r31; .loc 1 249 18 st.u64 [%r154],%r219; .loc 1 250 19 ld.u64 %r222,[%r161+16]; add.u64 %r221,%r222,1; ld.u64 %r223,[%r161+8]; sub.u64 %r44,%r221,%r223; .loc 1 250 17 st.u64 [%r155],%r44; .loc 1 251 16 st.u64 [%r150],%r224; .loc 1 252 10 setp.gt.s64 %r225,%r44,0; @ %r225 bra $L48; shl.b64 %r141,%r106,4; mov.u64 %r132,%r107; .loc 1 255 11 mov.u64 %r113,0; .loc 1 256 24 mov.u64 %r226,%r113; $L49: st.u64 [%r132],%r226; st.u64 [%r132+8],%r226; .loc 1 255 27 add.u64 %r113,%r113,1; .loc 1 255 4 add.u64 %r132,%r132,%r141; setp.ne.u64 %r228,%r64,%r113; @ %r228 bra $L49; bra $L39; $L48: .loc 1 246 26 add.u64 %r112,%r29,1; .loc 1 246 3 add.u64 %r161,%r161,24; add.u64 %r159,%r159,8; add.u64 %r158,%r158,24; add.u64 %r154,%r154,8; add.u64 %r155,%r155,8; add.u64 %r150,%r150,8; setp.ne.u64 %r229,%r64,%r112; @ %r229 bra $L70; .loc 1 261 8 ld.u64 %r91,[%r164]; shl.b64 %r115,%r106,4; mov.u64 %r52,%r107; .loc 1 264 10 mov.u64 %r111,0; .loc 1 265 23 mov.u64 %r230,%r111; $L51: st.u64 [%r52],%r230; st.u64 [%r52+8],%r230; mov.u64 %r143,%r111; .loc 1 264 26 add.u64 %r111,%r111,1; .loc 1 264 3 add.u64 %r52,%r52,%r115; setp.ne.u64 %r232,%r29,%r143; @ %r232 bra $L51; .loc 1 276 9 setp.eq.u64 %r233,%r91,0; @ %r233 bra $L39; .loc 1 331 19 ld.u64 %r70,[%frame+120]; .loc 1 331 9 shl.b64 %r72,%r70,4; .loc 1 332 20 ld.u64 %r73,[%frame]; .loc 1 334 34 ld.u64 %r76,[%frame+240]; ld.u64 %r160,[%frame+360]; cvt.u32.u32 %r235,%r22; cvt.s64.s8 %r234,%r235; shl.b64 %r236,%r234,3; add.u64 %r122,%r236,%r151; .loc 1 274 12 mov.u64 %r303,-1; mov.u64 %r304,9223372036854775807; setp.eq.u32 %r307,1,0; add.u64 %r305,%frame,368; .loc 1 340 13 mov.u64 %r312,0; .loc 1 359 32 add.u64 %r313,%frame,240; .loc 1 308 12 setp.eq.u32 %r315,%r166,0; $L69: .loc 1 280 10 @ 
%r307 bra $L52; mov.u64 %r53,%r160; mov.u32 %r46,0; mov.u32 %r309,1; $L57: mov.u64 %r45,%r91; .loc 1 300 13 add.u64 %r91,%r45,%r72; .loc 1 284 11 ld.s8 %r239,[%r93]; cvt.u16.u32 %r238,%r239; setp.eq.u16 %r240,%r238,0; @ %r240 bra $L53; setp.eq.u32 %r241,%r46,0; selp.u64 %r160,%r160,%r53,%r241; .loc 1 294 16 ld.u64 %r303,[%r45]; ld.u64 %r304,[%r45+8]; mov.u64 %r131,%r305; mov.u64 %r142,%r107; mov.u64 %r157,%r160; $L56: .loc 1 296 33 add.u64 %r243,%r157,1; st.u64 [%r142],%r243; shr.s64 %r244,%r243,63; st.u64 [%r142+8],%r244; .loc 1 295 9 add.u64 %r142,%r142,%r115; setp.eq.u64 %r245,%r122,%r131; @ %r245 bra $L71; .loc 1 296 29 ld.u64 %r157,[%r131]; add.u64 %r131,%r131,8; bra $L56; $L53: .loc 1 301 14 add.u64 %r93,%r93,%r73; .loc 1 303 11 add.u64 %r53,%r53,1; mov.u32 %r46,%r309; .loc 1 303 4 setp.ne.u64 %r246,%r53,%r76; @ %r246 bra $L57; bra $L97; $L52: .loc 1 308 12 @ %r315 bra $L72; mov.u64 %r62,%r160; mov.u64 %r89,%r91; $L63: .loc 1 311 11 ld.s8 %r249,[%r93]; cvt.u16.u32 %r248,%r249; setp.eq.u16 %r250,%r248,0; @ %r250 bra $L60; .loc 1 311 12 ld.u64 %r299,[%r89]; ld.u64 %r300,[%r89+8]; .loc 1 311 11 setp.gt.s64 %r253,%r300,%r304; @ %r253 bra $L60; setp.ne.u64 %r256,%r300,%r304; @ %r256 bra $L75; setp.gt.u64 %r259,%r299,%r303; @ %r259 bra $L60; $L75: mov.u64 %r94,%r305; mov.u64 %r98,%r107; mov.u64 %r153,%r62; .loc 1 314 18 mov.u64 %r144,%r312; $L62: .loc 1 315 36 add.u64 %r261,%r153,1; st.u64 [%r98],%r261; shr.s64 %r262,%r261,63; st.u64 [%r98+8],%r262; .loc 1 314 34 add.u64 %r110,%r144,1; .loc 1 314 11 add.u64 %r98,%r98,%r115; setp.le.s64 %r263,%r29,%r144; @ %r263 bra $L73; .loc 1 315 32 ld.u64 %r153,[%r94]; add.u64 %r94,%r94,8; mov.u64 %r144,%r110; bra $L62; $L73: mov.u64 %r303,%r299; mov.u64 %r304,%r300; $L60: .loc 1 317 8 add.u64 %r89,%r89,%r72; .loc 1 319 13 add.u64 %r62,%r62,1; .loc 1 319 6 st.u64 [%frame+360],%r62; setp.ne.u64 %r264,%r62,%r76; @ %r264 bra $L63; .loc 1 317 8 sub.u64 %r265,%r76,%r160; mad.lo.u64 %r91,%r265,%r72,%r91; bra $L64; $L72: mov.u64 %r75,%r160; mov.u64 %r92,%r93; mov.u64 %r90,%r91; $L59: .loc 1 323 11 ld.s8 %r268,[%r92]; cvt.u16.u32 %r267,%r268; setp.eq.u16 %r269,%r267,0; @ %r269 bra $L65; .loc 1 323 12 ld.u64 %r301,[%r90]; ld.u64 %r302,[%r90+8]; .loc 1 323 11 setp.gt.s64 %r272,%r304,%r302; @ %r272 bra $L76; setp.ne.u64 %r275,%r304,%r302; @ %r275 bra $L65; setp.gt.u64 %r278,%r303,%r301; @ ! 
%r278 bra $L65; $L76: mov.u64 %r57,%r305; mov.u64 %r66,%r107; mov.u64 %r149,%r75; .loc 1 326 12 mov.u64 %r145,%r312; $L67: .loc 1 327 36 add.u64 %r280,%r149,1; st.u64 [%r66],%r280; shr.s64 %r281,%r280,63; st.u64 [%r66+8],%r281; .loc 1 326 28 add.u64 %r109,%r145,1; .loc 1 326 5 add.u64 %r66,%r66,%r115; setp.le.s64 %r282,%r29,%r145; @ %r282 bra $L74; .loc 1 327 32 ld.u64 %r149,[%r57]; add.u64 %r57,%r57,8; mov.u64 %r145,%r109; bra $L67; $L74: mov.u64 %r303,%r301; mov.u64 %r304,%r302; $L65: .loc 1 331 9 add.u64 %r90,%r90,%r72; .loc 1 332 10 add.u64 %r92,%r92,%r73; .loc 1 334 14 add.u64 %r75,%r75,1; .loc 1 334 7 st.u64 [%frame+360],%r75; setp.ne.u64 %r283,%r75,%r76; @ %r283 bra $L59; sub.u64 %r128,%r76,%r160; .loc 1 331 9 mad.lo.u64 %r91,%r72,%r128,%r91; .loc 1 332 10 mad.lo.u64 %r93,%r73,%r128,%r93; bra $L64; $L97: st.u64 [%frame+360],%r76; $L64: mov.u64 %r120,%r305; mov.u64 %r85,%r73; .loc 1 334 34 mov.u64 %r87,%r76; mov.u64 %r83,%r70; mov.u64 %r119,8; .loc 1 335 9 mov.u64 %r96,0; $L68: .loc 1 340 13 st.u64 [%r120+-8],%r312; .loc 1 343 23 mul.lo.u64 %r78,%r83,%r87; .loc 1 344 24 mul.lo.u64 %r80,%r87,%r85; mov.u64 %r95,%r96; .loc 1 345 5 add.u64 %r96,%r96,1; .loc 1 346 7 setp.eq.u64 %r288,%r29,%r95; @ %r288 bra $L39; .loc 1 354 16 ld.u64 %r289,[%r120]; add.u64 %r82,%r289,1; st.u64 [%r120],%r82; .loc 1 355 23 add.u64 %r291,%r306,%r119; ld.u64 %r83,[%r291]; .loc 1 355 13 sub.u64 %r292,%r83,%r78; shl.b64 %r293,%r292,4; add.u64 %r91,%r91,%r293; .loc 1 356 24 add.u64 %r294,%frame,%r119; ld.u64 %r85,[%r294]; .loc 1 356 14 sub.u64 %r295,%r85,%r80; add.u64 %r93,%r93,%r295; .loc 1 359 32 add.u64 %r297,%r313,%r119; ld.u64 %r87,[%r297]; .loc 1 359 7 add.u64 %r120,%r120,8; add.u64 %r119,%r119,8; setp.eq.u64 %r298,%r82,%r87; @ %r298 bra $L68; ld.u64 %r160,[%frame+360]; bra $L69; $L71: mov.u64 %r91,%r45; setp.eq.u32 %r307,1,1; bra $L52; $L39: .loc 1 _gfortran_sminloc0_16_i16 .visible .func _gfortran_sminloc0_16_i16u32pred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L99; .loc 1 378 20 ld.u32 %r40,[%r37]; setp.eq.u32 %r41,%r40,0; @ %r41 bra $L100; $L99: .loc 1 380 7 {_gfortran_minloc0_16_i16.loc 1 381 7 bra $L98; $L100: .loc 1 384 8 ld.s8 %r30,[%r36+28]; .loc 1 386 6 setp.gt.s64 %r46,%r30,0; @ %r46 bra $L102_gfortran_runtime_error102: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L103; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L104; $L103: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36.loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L104: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,4; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 mov.u64 %r68,%r32; $L105: st.u64 [%r29],%r68; st.u64 [%r29+8],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r70,%r30,%r32; @ %r70 bra $L105; $L98: .loc 1 406 1 ret; } minloc0_4_r4.o/_gfortran_minloc0_4_r4 .visible .func _gfortran_minloc0_4_libgfortran/generated/minloc0_4_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_4_r4 .visible .func _gfortran_mminloc0_4_r4_gfortran_sminloc0_4_r4 .visible .func _gfortran_sminloc0_4_r4VAR DECL: _gfortrani_compile_options .extern 
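// minloc0_4_r4.o: nvptx PTX generated from libgfortran/generated/minloc0_4_r4.c, the first
// real-typed member in this run: MINLOC with kind=4 integer result positions over a REAL(4)
// array. The f32 comparisons start from the constant 0f7f800000 (+infinity) as the initial
// minimum. Entry points: _gfortran_minloc0_4_r4, _gfortran_mminloc0_4_r4 and
// _gfortran_sminloc0_4_r4.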
.global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_4_r4 .visible .func _gfortran_minloc0_4_stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[368.reg .u64predpredpredu32 %r176; .reg .pred %r177; .reg .pred %r178; .reg .pred %r180; .reg .u32 %r182; .reg .u32 %r183; .reg .pred64pred %r208; .reg .u32 %r210; .reg .u64 %r211; .reg .u64 %r212; .reg .pred50 10 ld.s8 %r22,[%r128+28]; .loc 1 50 8 cvt.u32.u32 %r130,%r22; cvt.s64.s8 %r79,%r130; .loc 1 51 6 setp.gt.s64 %r131,%r79,0; @ %r131 bra $L2; .loc 1 52 5 cvta.const.u64 %r1321stack; call _gfortran_runtime_errorr23,[%r127]; .loc 1 54 6 setp.ne.u64 %r134,%r23,0; @ %r134 bra $L3; .loc 1 56 7 st.u64 [%r127+48],%r23; add.u64 %r136,%r79,-1; st.u64 [%r127+56],%r136; mov.u64 %r137,1; st.u64 [%r127+40],%r137; .loc 1 57 28 cvt.u32.u64 %r138,%r137; st.u8 [%r127+28],%r138; .loc 1 58 24 st.u64 [%r127+8],%r23; .loc 1 59 2979call (%value_in),_gfortrani_xmallocarray4,[%value_in]; } mov.u64 %r23,%r144; .loc 1 59 27 st.u64 [%r127],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r146,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r147,[%r146+36]; setp.eq.u32 %r148,%r147,0; @ %r148 bra $L4; .loc 1 64 2r12151; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r127]; $L4: .loc 1 68 11 ld.u64 %r80,[%r127+40]; add.u64 %r44,%r128,40; mov.u64 %r37,%frame; add.u64 %r47,%frame,120; .loc 1 70 10 mov.u64 %r103,0; add.u64 %r206,%frame,240; .loc 1 74 16 mov.u64 %r159,%r103; bra $L7; $L32: mov.u64 %r103,%r88; $L7: .loc 1 72 18 ld.u64 %r152,[%r44]; st.u64 [%r37],%r152; .loc 1 73 19 ld.u64 %r154,[%r44+16]; add.u64 %r153,%r154,1; ld.u64 %r155,[%r44+8]; sub.u64 %r31,%r153,%r155; .loc 1 73 17 st.u64 [%r47],%r31; .loc 1 74 16 shl.b64 %r157,%r103,3; add.u64 %r158,%r206,%r157; st.u64 [%r158],%r159; .loc 1 75 10 setp.gt.s64 %r160,%r31,0; @ %r160 bra $L5; shl.b64 %r122,%r80,2; mov.u64 %r101,%r23; .loc 1 78 11 mov.u64 %r89,0; .loc 1 79 24 cvt.u32.u64 %r161,%r89; $L6: st.u32 [%r101],%r161; .loc 1 78 27 add.u64 %r89,%r89,1; .loc 1 78 4 add.u64 %r101,%r101,%r122; setp.ne.u64 %r162,%r79,%r89; @ %r162 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r88,%r103,1; .loc 1 70 3 add.u64 %r44,%r44,24; add.u64 %r37,%r37,8; add.u64 %r47,%r47,8; setp.ne.u64 %r163,%r79,%r88; @ %r163 bra $L32; .loc 1 84 8 ld.u64 %r67,[%r128]; shl.b64 %r108,%r80,2; mov.u64 %r106,%r23; .loc 1 87 10 mov.u64 %r87,0; .loc 1 88 23 mov.u32 %r164,1; $L8: st.u32 [%r106],%r164; mov.u64 %r104,%r87; .loc 1 87 26 add.u64 %r87,%r87,1; .loc 1 87 3 add.u64 %r106,%r106,%r108; setp.ne.u64 %r165,%r103,%r104; @ %r165 bra $L8; .loc 1 101 9 setp.eq.u64 %r166,%r67,0; @ %r166 bra $L1; .loc 1 149 19 ld.u64 %r51,[%frame]; .loc 1 149 9 shl.b64 %r53,%r51,2; .loc 1 151 34 ld.u64 %r56,[%frame+120]; ld.u64 %r123,[%frame+240]; .loc 1 141 10 ld.f32 %r125,[%r67]; cvt.u32.u32 %r168,%r22; cvt.s64.s8 %r167,%r168; shl.b64 %r169,%r167,3; add.u64 %r113,%r206,%r169; .loc 1 97 12 mov.f32 %r73,0f7f800000; setp.eq.u32 %r208,1,0; add.u64 %r207,%frame,248; .loc 1 157 13 mov.u64 %r211,0; .loc 1 174 32 add.u64 
%r212,%frame,120; .loc 1 126 10 setp.eq.u32 %r214,%r129,0; $L30: .loc 1 106 10 @ %r208 bra $L10; mov.u64 %r38,%r123; mov.u32 %r85,0; .loc 1 110 12 mov.u32 %r210,1; $L17: .loc 1 118 13 add.u64 %r126,%r67,%r53; .loc 1 110 11 setp.ge.f32 %r172,%r73,%r125; @ ! %r172 bra $L51; setp.eq.u32 %r173,%r85,0; selp.u64 %r123,%r123,%r38,%r173; mov.u64 %r90,%r207; mov.u64 %r95,%r23; mov.u64 %r119,%r123; $L15: .loc 1 115 36 cvt.u32.u64 %r176,%r119; add.u32 %r175,%r176,1; .loc 1 115 25 st.u32 [%r95],%r175; .loc 1 114 5 add.u64 %r95,%r95,%r108; setp.eq.u64 %r177,%r90,%r113; @ %r177 bra $L33; .loc 1 115 32 ld.u64 %r119,[%r90]; add.u64 %r90,%r90,8; bra $L15; $L51: .loc 1 120 11 add.u64 %r38,%r38,1; .loc 1 120 4 setp.eq.u64 %r178,%r38,%r56; @ %r178 bra $L16; .loc 1 110 12 ld.f32 %r125,[%r126]; mov.u64 %r67,%r126; mov.u32 %r85,%r210; bra $L17; $L10: .loc 1 126 10 @ %r214 bra $L34; mov.u64 %r46,%r123; mov.u64 %r65,%r67; $L24: .loc 1 129 9 setp.ge.f32 %r180,%r73,%r125; @ ! %r180 bra $L19; mov.u64 %r84,%r207; mov.u64 %r39,%r23; mov.u64 %r116,%r46; $L21: .loc 1 133 34 cvt.u32.u64 %r183,%r116; add.u32 %r182,%r183,1; .loc 1 133 23 st.u32 [%r39],%r182; .loc 1 132 3 add.u64 %r39,%r39,%r108; setp.eq.u64 %r184,%r84,%r113; @ %r184 bra $L35; .loc 1 133 30 ld.u64 %r116,[%r84]; add.u64 %r84,%r84,8; bra $L21; $L35: .loc 1 131 10 mov.f32 %r73,%r125; $L19: .loc 1 135 11 add.u64 %r65,%r65,%r53; .loc 1 137 9 add.u64 %r46,%r46,1; .loc 1 137 2 st.u64 [%frame+240],%r46; setp.ne.u64 %r185,%r46,%r56; @ %r185 bra $L22; .loc 1 135 11 sub.u64 %r186,%r56,%r123; mad.lo.u64 %r67,%r186,%r53,%r67; bra $L23; $L22: .loc 1 129 10 ld.f32 %r125,[%r65]; bra $L24; $L34: mov.u64 %r55,%r123; mov.u64 %r66,%r67; $L18: .loc 1 141 9 setp.gt.f32 %r188,%r73,%r125; @ ! %r188 bra $L25; mov.u64 %r117,%r207; mov.u64 %r120,%r23; mov.u64 %r110,%r55; $L27: .loc 1 145 34 cvt.u32.u64 %r191,%r110; add.u32 %r190,%r191,1; .loc 1 145 23 st.u32 [%r120],%r190; .loc 1 144 3 add.u64 %r120,%r120,%r108; setp.eq.u64 %r192,%r113,%r117; @ %r192 bra $L36; .loc 1 145 30 ld.u64 %r110,[%r117]; add.u64 %r117,%r117,8; bra $L27; $L36: .loc 1 143 10 mov.f32 %r73,%r125; $L25: .loc 1 149 9 add.u64 %r66,%r66,%r53; .loc 1 151 14 add.u64 %r55,%r55,1; .loc 1 151 7 st.u64 [%frame+240],%r55; setp.ne.u64 %r193,%r55,%r56; @ %r193 bra $L28; .loc 1 149 9 sub.u64 %r194,%r56,%r123; mad.lo.u64 %r67,%r194,%r53,%r67; bra $L23; $L28: .loc 1 141 10 ld.f32 %r125,[%r66]; bra $L18; $L16: st.u64 [%frame+240],%r56; .loc 1 118 13 mov.u64 %r67,%r126; $L23: mov.u64 %r76,%r207; .loc 1 151 34 mov.u64 %r63,%r56; mov.u64 %r61,%r51; .loc 1 152 9 mov.u64 %r71,0; $L29: .loc 1 157 13 st.u64 [%r76+-8],%r211; .loc 1 160 23 mul.lo.u64 %r58,%r61,%r63; mov.u64 %r105,%r71; .loc 1 161 5 add.u64 %r71,%r71,1; .loc 1 162 7 setp.eq.u64 %r198,%r105,%r103; @ %r198 bra $L1; .loc 1 170 16 ld.u64 %r199,[%r76]; add.u64 %r60,%r199,1; st.u64 [%r76],%r60; shl.b64 %r69,%r71,3; .loc 1 171 23 add.u64 %r200,%frame,%r69; ld.u64 %r61,[%r200]; .loc 1 171 13 sub.u64 %r201,%r61,%r58; shl.b64 %r202,%r201,2; add.u64 %r67,%r67,%r202; .loc 1 174 32 add.u64 %r204,%r212,%r69; ld.u64 %r63,[%r204]; .loc 1 174 7 add.u64 %r76,%r76,8; setp.eq.u64 %r205,%r60,%r63; @ %r205 bra $L29; ld.u64 %r123,[%frame+240]; .loc 1 141 10 ld.f32 %r125,[%r67]; bra $L30; $L33: mov.f32 %r73,%r125; setp.eq.u32 %r208,1,1; bra $L10; $L_gfortran_mminloc0_4_r4 .visible .func _gfortran_mminloc0_4_r4reg .u64 %stack; mov.u64 %stack,0480pred %r172; .reg .predpredpred %r195; .reg .u64 %r198; .reg .u64u32 %r211; .reg .u16pred %r217; .reg .predpredu16 %r257; .reg .u32 %r258; .reg .pred %r259; 
.reg .pred %r260; .reg .u32 %r262; .reg .u32 %r263; .reg .pred %r264; .reg .pred %r265; .reg .u64 %r266; .reg .u16 %r268; .reg .u32 %r269; .reg .pred %r270; .reg .pred %r271; .reg .u32 %r273; .reg .u32 %r274; .reg .pred %r275; .reg .pred %r276; .reg .pred %r281; .reg .u64 %r282; .reg .u64pred %r294; .reg .u64 %r296; .reg .u64 %r297; .reg .pred %r299; mov.u64 %r168,%ar0; mov.u64 %r169,%ar1; mov.u64 %r170,%ar2; mov.u32 %r171,%ar3; .loc 1 202 6 setp.ne.u64 %r172,%r170,0; @ %r172 bra $Lcall _gfortran_minloc0_4_r4.loc 1 205 7 bra $L52; $L53: .loc 1 208 10 ld.s8 %r22,[%r169+28]; .loc 1 209 6 setp.gt.s32 %r176,%r22,0; @ %r176 bra $L55; .loc 1 210 5 cvta.const.u64 %r177_gfortran_runtime_error5: .loc 1 212 6 ld.u64 %r179,[%r168]; setp.ne.u64 %r180,%r179,0; @ %r180 bra $L56; .loc 1 214 7 st.u64 [%r168+48],%r179; add.u32 %r182,%r22,-1; cvt.s64.s32 %r183,%r182; st.u64 [%r168+56],%r183; mov.u64 %r184,1; st.u64 [%r168+40],%r184; .loc 1 215 28 cvt.u32.u64 %r185,%r184; st.u8 [%r168+28],%r185; .loc 1 216 24 st.u64 [%r168+8],%r179; .loc 1 217 29 cvt.u32.u32 %r190,%r22; cvt.s64.s8 %r189,%r1call (%value_in),_gfortrani_xmallocarray191,[%value_in]; } .loc 1 217 27 st.u64 [%r168],%r191; bra $L57; $L56: .loc 1 221 11 cvta.global.u64 %r193,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r194,[%r193+36]; setp.eq.u32 %r195,%r194,0r169198; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r2011716920198; call _gfortrani_bounds_equal_extents$L57: .loc 1 231 15 ld.u64 %r31,[%r170+16]; .loc 1 233 9 ld.u64 %r99,[%r170]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r203,%r32,-4; and.b32 %r204,%r203,-5;.loc 1 235 22 add.u32 %r208,%r32,-1; set.u32.le.u32 %r210,%r208,1; neg.s32 %r211,%r210; .loc 1 235 6 cvt.u16.u32 %r213,%r207; cvt.u16.u32 %r214,%r211; or.b16 %r212,%r213,%r214; cvt.u32.u16 %r215,%r212; cvt.u16.u8 %r216,%r215; setp.ne.u16 %r217,%r216,0; @ %r217 bra $L58; .loc 1 237 7 setp.ne.u32 %r219,%r32,16; @ %r219 bra $L59; $L58: .loc 1 244 11 ld.u64 %r115,[%r168+40]; .loc 1 245 8 ld.u64 %r116,[%r168]; .loc 1 246 17 cvt.u32.u32 %r220,%r22; cvt.s64.s8 %r139,%r220; add.u64 %r166,%r169,40; add.u64 %r292,%frame,120; mov.u64 %r163,%r292; add.u64 %r160,%r170,40; mov.u64 %r159,%frame; add.u64 %r156,%frame,240; add.u64 %r152,%frame,360; mov.u64 %r154,%r152; .loc 1 246 10 mov.u64 %r122,0; .loc 1 251 16 mov.u64 %r229,%r122; bra $L60; $L59: .loc 1 242 5_gfortran_runtime_error248 18 ld.u64 %r223,[%r166]; st.u64 [%r163],%r223; .loc 1 249 20 ld.u64 %r225,[%r160]; mul.lo.u64 %r224,%r225,%r31; .loc 1 249 18 st.u64 [%r159],%r224; .loc 1 250 19 ld.u64 %r227,[%r166+16]; add.u64 %r226,%r227,1; ld.u64 %r228,[%r166+8]; sub.u64 %r45,%r226,%r228; .loc 1 250 17 st.u64 [%r156],%r45; .loc 1 251 16 st.u64 [%r154],%r229; .loc 1 252 10 setp.gt.s64 %r230,%r45,0; @ %r230 bra $L61; shl.b64 %r136,%r115,2; mov.u64 %r138,%r116; .loc 1 255 11 mov.u64 %r123,0; .loc 1 256 24 cvt.u32.u64 %r231,%r123; $L62: st.u32 [%r138],%r231; .loc 1 255 27 add.u64 %r123,%r123,1; .loc 1 255 4 add.u64 %r138,%r138,%r136; setp.ne.u64 %r232,%r123,%r139; @ %r232 bra $L62; bra $L52; $L61: .loc 1 246 26 add.u64 %r122,%r122,1; .loc 1 246 3 add.u64 %r166,%r166,24; add.u64 %r163,%r163,8; add.u64 %r160,%r160,24; add.u64 %r159,%r159,8; add.u64 %r156,%r156,8; add.u64 %r154,%r154,8; setp.ne.u64 %r233,%r122,%r139; @ %r233 bra $L60; .loc 1 261 8 ld.u64 %r96,[%r169]; shl.b64 %r126,%r115,2; mov.u64 %r124,%r116; .loc 1 264 10 mov.u64 %r121,0; .loc 1 265 23 cvt.u32.u64 %r234,%r121; $L64: st.u32 [%r124],%r234; .loc 1 264 26 add.u64 %r121,%r121,1; .loc 1 264 3 
add.u64 %r124,%r124,%r126; setp.ne.u64 %r235,%r121,%r139; @ %r235 bra $L64; .loc 1 276 9 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L52; .loc 1 331 19 ld.u64 %r74,[%frame+120]; .loc 1 331 9 shl.b64 %r76,%r74,2; .loc 1 332 20 ld.u64 %r77,[%frame]; .loc 1 334 34 ld.u64 %r81,[%frame+240]; .loc 1 334 21 ld.u64 %r164,[%frame+360]; cvt.u32.u32 %r238,%r22; cvt.s64.s8 %r237,%r238; shl.b64 %r239,%r237,3; add.u64 %r80,%r239,%r152; .loc 1 272 12 mov.f32 %r104,0f7f800000; setp.eq.u32 %r294,1,0; add.u64 %r293,%frame,368; .loc 1 340 13 mov.u64 %r296,0; .loc 1 359 32 add.u64 %r297,%frame,240; .loc 1 308 12 setp.eq.u32 %r299,%r171,0; $L83: .loc 1 280 10 @ %r294 bra $L65; $L72: .loc 1 284 11 ld.s8 %r242,[%r99]; cvt.u16.u32 %r241,%r242; setp.eq.u16 %r243,%r241,0; @ %r243 bra $L66; .loc 1 287 8 ld.u32 %r244,[%r116]; setp.ne.u32 %r245,%r244,0; @ %r245 bra $L67; mov.u64 %r101,%r293; mov.u64 %r103,%r116; mov.u64 %r157,%r164; $L68: .loc 1 289 38 cvt.u32.u64 %r248,%r157; add.u32 %r247,%r248,1; .loc 1 289 27 st.u32 [%r103],%r247; .loc 1 288 7 add.u64 %r103,%r103,%r126; setp.eq.u64 %r249,%r80,%r101; @ %r249 bra $L67; .loc 1 289 34 ld.u64 %r157,[%r101]; add.u64 %r101,%r101,8; bra $L68; $L67: .loc 1 290 9 ld.f32 %r52,[%r96]; .loc 1 290 8 setp.le.f32 %r250,%r52,%r104; @ ! %r250 bra $L66; mov.u64 %r129,%r293; mov.u64 %r130,%r116; mov.u64 %r161,%r164; $L71: .loc 1 296 33 cvt.u32.u64 %r253,%r161; add.u32 %r252,%r253,1; .loc 1 296 22 st.u32 [%r130],%r252; .loc 1 295 9 add.u64 %r130,%r130,%r126; setp.eq.u64 %r254,%r80,%r129; @ %r254 bra $L84; .loc 1 296 29 ld.u64 %r161,[%r129]; add.u64 %r129,%r129,8; bra $L71; $L66: .loc 1 300 13 add.u64 %r96,%r96,%r76; .loc 1 301 14 add.u64 %r99,%r99,%r77; .loc 1 303 11 add.u64 %r164,%r164,1; .loc 1 303 4 st.u64 [%frame+360],%r164; setp.ne.u64 %r255,%r164,%r81; @ %r255 bra $L72; bra $L73; $L65: .loc 1 308 12 @ %r299 bra $L85; mov.u64 %r66,%r164; mov.u64 %r94,%r96; $L78: .loc 1 311 11 ld.s8 %r258,[%r99]; cvt.u16.u32 %r257,%r258; setp.eq.u16 %r259,%r257,0; @ %r259 bra $L75; .loc 1 311 12 ld.f32 %r61,[%r94]; .loc 1 311 11 setp.le.f32 %r260,%r61,%r104; @ ! %r260 bra $L75; mov.u64 %r62,%r293; mov.u64 %r69,%r116; mov.u64 %r155,%r66; .loc 1 314 18 mov.u64 %r120,0; $L77: .loc 1 315 36 cvt.u32.u64 %r263,%r155; add.u32 %r262,%r263,1; .loc 1 315 25 st.u32 [%r69],%r262; .loc 1 314 34 add.u64 %r120,%r120,1; .loc 1 314 11 add.u64 %r69,%r69,%r126; setp.ge.s64 %r264,%r120,%r139; @ %r264 bra $L86; .loc 1 315 32 ld.u64 %r155,[%r62]; add.u64 %r62,%r62,8; bra $L77; $L86: .loc 1 313 18 mov.f32 %r104,%r61; $L75: .loc 1 317 8 add.u64 %r94,%r94,%r76; .loc 1 319 13 add.u64 %r66,%r66,1; .loc 1 319 6 st.u64 [%frame+360],%r66; setp.ne.u64 %r265,%r66,%r81; @ %r265 bra $L78; .loc 1 317 8 sub.u64 %r266,%r81,%r164; mad.lo.u64 %r96,%r266,%r76,%r96; bra $L73; $L85: mov.u64 %r79,%r164; mov.u64 %r98,%r99; mov.u64 %r95,%r96; $L74: .loc 1 323 11 ld.s8 %r269,[%r98]; cvt.u16.u32 %r268,%r269; setp.eq.u16 %r270,%r268,0; @ %r270 bra $L79; .loc 1 323 12 ld.f32 %r68,[%r95]; .loc 1 323 11 setp.lt.f32 %r271,%r68,%r104; @ ! 
%r271 bra $L79; mov.u64 %r46,%r293; mov.u64 %r56,%r116; mov.u64 %r151,%r79; .loc 1 326 12 mov.u64 %r118,0; $L81: .loc 1 327 36 cvt.u32.u64 %r274,%r151; add.u32 %r273,%r274,1; .loc 1 327 25 st.u32 [%r56],%r273; .loc 1 326 28 add.u64 %r118,%r118,1; .loc 1 326 5 add.u64 %r56,%r56,%r126; setp.ge.s64 %r275,%r118,%r139; @ %r275 bra $L87; .loc 1 327 32 ld.u64 %r151,[%r46]; add.u64 %r46,%r46,8; bra $L81; $L87: .loc 1 325 12 mov.f32 %r104,%r68; $L79: .loc 1 331 9 add.u64 %r95,%r95,%r76; .loc 1 332 10 add.u64 %r98,%r98,%r77; .loc 1 334 14 add.u64 %r79,%r79,1; .loc 1 334 7 st.u64 [%frame+360],%r79; setp.ne.u64 %r276,%r79,%r81; @ %r276 bra $L74; sub.u64 %r143,%r81,%r164; .loc 1 331 9 mad.lo.u64 %r96,%r76,%r143,%r96; .loc 1 332 10 mad.lo.u64 %r99,%r77,%r143,%r99; $L73: mov.u64 %r119,%r293; .loc 1 325 12 mov.u64 %r90,%r77; .loc 1 334 34 mov.u64 %r92,%r81; mov.u64 %r88,%r74; mov.u64 %r110,8; .loc 1 335 9 mov.u64 %r102,0; $L82: .loc 1 340 13 st.u64 [%r119+-8],%r296; .loc 1 343 23 mul.lo.u64 %r83,%r88,%r92; .loc 1 344 24 mul.lo.u64 %r85,%r92,%r90; .loc 1 345 5 add.u64 %r102,%r102,1; .loc 1 346 7 setp.eq.u64 %r281,%r102,%r139; @ %r281 bra $L52; .loc 1 354 16 ld.u64 %r282,[%r119]; add.u64 %r87,%r282,1; st.u64 [%r119],%r87; .loc 1 355 23 add.u64 %r284,%r292,%r110; ld.u64 %r88,[%r284]; .loc 1 355 13 sub.u64 %r285,%r88,%r83; shl.b64 %r286,%r285,2; add.u64 %r96,%r96,%r286; .loc 1 356 24 add.u64 %r287,%frame,%r110; ld.u64 %r90,[%r287]; .loc 1 356 14 sub.u64 %r288,%r90,%r85; add.u64 %r99,%r99,%r288; .loc 1 359 32 add.u64 %r290,%r297,%r110; ld.u64 %r92,[%r290]; .loc 1 359 7 add.u64 %r119,%r119,8; add.u64 %r110,%r110,8; setp.eq.u64 %r291,%r87,%r92; @ %r291 bra $L82; .loc 1 334 21 ld.u64 %r164,[%frame+360]; bra $L83; $L84: .loc 1 290 9 mov.f32 %r104,%r52; setp.eq.u32 %r294,1,1; bra $L65; $L52: .loc 1 _gfortran_sminloc0_4_r4 .visible .func _gfortran_sminloc0_4_r4u32pred.reg .u64u32 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L114; .loc 1 378 20 ld.u32 %r40,[%r37380 7 {_gfortran_minloc0_4_r4.loc 1 381 7 bra $L113; $L115: .loc 1 384 8 ld.s8 %r30,[%r36+28]; .loc 1 386 6 setp.gt.s64 %r46,%r30,0; @ %r46 bra $L1_gfortran_runtime_error117: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L118; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29 mov.u64 %r56call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L119; $L118: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36.loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L119: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,2; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 cvt.u32.u64 %r68,%r32; $L120: st.u32 [%r29],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L120; $L113: .loc 1 406 1 ret; } minloc0_8_r4.o/_gfortran_minloc0_8_r4 .visible .func _gfortran_minloc0_8_libgfortran/generated/minloc0_8_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_8_r4 .visible .func _gfortran_mminloc0_8_r4_gfortran_sminloc0_8_r4 .visible .func _gfortran_sminloc0_8_r4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func 
_gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_8_r4 .visible .func _gfortran_minloc0_8_stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[368pred.reg .u64 %r140; .reg .u32 %r141; .reg .predpred %r157; .reg .u64pred %r170; .reg .pred %r171; .reg .pred %r173; .reg .u64pred %r177; .reg .u64 %r178; .reg .predpredpred %r196; .reg .predpred %r205; mov.u64 %r121,%ar0; mov.u64 %r122,%ar1; mov.u32 %r123,%ar2; .loc 1 50 10 ld.s8 %r22,[%r122+28]; .loc 1 50 8 cvt.u32.u32 %r124,%r22; cvt.s64.s8 %r84,%r124; .loc 1 51 6 setp.gt.s64 %r125,%r84,0; @ %r125 bra $L2; .loc 1 52 5 cvta.const.u64 %r126,$LCstack; call _gfortran_runtime_errorr23,[%r121]; .loc 1 54 6 setp.ne.u64 %r128,%r23,0; @ %r128 bra $L3; .loc 1 56 7 st.u64 [%r121+48],%r23; add.u64 %r130,%r84,-1; st.u64 [%r121+56],%r130; mov.u64 %r131,1; st.u64 [%r121+40],%r131; .loc 1 57 28 cvt.u32.u64 %r132,%r131; st.u8 [%r121+28],%r132; .loc 1 58 24 st.u64 [%r121+8],%r23; .loc 1 59 29 mov.u64 %r135135; call (%value_in),_gfortrani_xmallocarray138,[%value_in]; } mov.u64 %r23,%r138; .loc 1 59 27 st.u64 [%r121],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r140,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r141,[%r140+36]; setp.eq.u32 %r142,%r141,0; @ %r142 bra $L4; .loc 1 64r122145; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r121]; $L4: .loc 1 68 11 ld.u64 %r85,[%r121+40]; add.u64 %r33,%r122,40; mov.u64 %r53,%frame; add.u64 %r45,%frame,120; .loc 1 70 10 mov.u64 %r106,0; add.u64 %r198,%frame,240; .loc 1 74 16 mov.u64 %r153,%r106; bra $L7; $L32: mov.u64 %r106,%r91; $L7: .loc 1 72 18 ld.u64 %r146,[%r33]; st.u64 [%r53],%r146; .loc 1 73 19 ld.u64 %r148,[%r33+16]; add.u64 %r147,%r148,1; ld.u64 %r149,[%r33+8]; sub.u64 %r32,%r147,%r149; .loc 1 73 17 st.u64 [%r45],%r32; .loc 1 74 16 shl.b64 %r151,%r106,3; add.u64 %r152,%r198,%r151; st.u64 [%r152],%r153; .loc 1 75 10 setp.gt.s64 %r154,%r32,0; @ %r154 bra $L5; shl.b64 %r114,%r85,3; mov.u64 %r104,%r23; .loc 1 78 11 mov.u64 %r92,0; .loc 1 79 24 mov.u64 %r155,%r92; $L6: st.u64 [%r104],%r155; .loc 1 78 27 add.u64 %r92,%r92,1; .loc 1 78 4 add.u64 %r104,%r104,%r114; setp.ne.u64 %r156,%r84,%r92; @ %r156 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r91,%r106,1; .loc 1 70 3 add.u64 %r33,%r33,24; add.u64 %r53,%r53,8; add.u64 %r45,%r45,8; setp.ne.u64 %r157,%r84,%r91; @ %r157 bra $L32; .loc 1 84 8 ld.u64 %r73,[%r122]; shl.b64 %r44,%r85,3; mov.u64 %r55,%r23; .loc 1 87 10 mov.u64 %r90,0; .loc 1 88 23 mov.u64 %r158,1; $L8: st.u64 [%r55],%r158; mov.u64 %r107,%r90; .loc 1 87 26 add.u64 %r90,%r90,1; .loc 1 87 3 add.u64 %r55,%r55,%r44; setp.ne.u64 %r159,%r106,%r107; @ %r159 bra $L8; .loc 1 101 9 setp.eq.u64 %r160,%r73,0; @ %r160 bra $L1; .loc 1 149 19 ld.u64 %r57,[%frame]; .loc 1 149 9 shl.b64 %r59,%r57,2; .loc 1 151 34 ld.u64 %r61,[%frame+120]; ld.u64 %r115,[%frame+240]; .loc 1 141 10 ld.f32 %r119,[%r73]; cvt.u32.u32 %r162,%r22; cvt.s64.s8 %r161,%r162; shl.b64 %r163,%r161,3; add.u64 %r95,%r198,%r163; .loc 1 97 12 mov.f32 %r78,0f7f800000; setp.eq.u32 %r197,1,0; add.u64 %r199,%frame,248; .loc 1 157 13 mov.u64 %r202,0; .loc 1 174 32 add.u64 %r203,%frame,120; 
.loc 1 126 10 setp.eq.u32 %r205,%r123,0; $L30: .loc 1 106 10 @ %r197 bra $L10; mov.u64 %r40,%r115; mov.u32 %r89,0; .loc 1 110 12 mov.u32 %r201,1; $L17: .loc 1 118 13 add.u64 %r120,%r73,%r59; .loc 1 110 11 setp.ge.f32 %r166,%r78,%r119; @ ! %r166 bra $L51; setp.eq.u32 %r167,%r89,0; selp.u64 %r115,%r115,%r40,%r167; mov.u64 %r77,%r199; mov.u64 %r97,%r23; mov.u64 %r111,%r115; $L15: .loc 1 115 36 add.u64 %r169,%r111,1; .loc 1 115 25 st.u64 [%r97],%r169; .loc 1 114 5 add.u64 %r97,%r97,%r44; setp.eq.u64 %r170,%r77,%r95; @ %r170 bra $L33; .loc 1 115 32 ld.u64 %r111,[%r77]; add.u64 %r77,%r77,8; bra $L15; $L51: .loc 1 120 11 add.u64 %r40,%r40,1; .loc 1 120 4 setp.eq.u64 %r171,%r40,%r61; @ %r171 bra $L16; .loc 1 110 12 ld.f32 %r119,[%r120]; mov.u64 %r73,%r120; mov.u32 %r89,%r201; bra $L17; $L10: .loc 1 126 10 @ %r205 bra $L34; mov.u64 %r52,%r115; mov.u64 %r71,%r73; $L24: .loc 1 129 9 setp.ge.f32 %r173,%r78,%r119; @ ! %r173 bra $L19; mov.u64 %r110,%r199; mov.u64 %r112,%r23; mov.u64 %r109,%r52; $L21: .loc 1 133 34 add.u64 %r175,%r109,1; .loc 1 133 23 st.u64 [%r112],%r175; .loc 1 132 3 add.u64 %r112,%r112,%r44; setp.eq.u64 %r176,%r95,%r110; @ %r176 bra $L35; .loc 1 133 30 ld.u64 %r109,[%r110]; add.u64 %r110,%r110,8; bra $L21; $L35: .loc 1 131 10 mov.f32 %r78,%r119; $L19: .loc 1 135 11 add.u64 %r71,%r71,%r59; .loc 1 137 9 add.u64 %r52,%r52,1; .loc 1 137 2 st.u64 [%frame+240],%r52; setp.ne.u64 %r177,%r52,%r61; @ %r177 bra $L22; .loc 1 135 11 sub.u64 %r178,%r61,%r115; mad.lo.u64 %r73,%r178,%r59,%r73; bra $L23; $L22: .loc 1 129 10 ld.f32 %r119,[%r71]; bra $L24; $L34: mov.u64 %r60,%r115; mov.u64 %r72,%r73; $L18: .loc 1 141 9 setp.gt.f32 %r180,%r78,%r119; @ ! %r180 bra $L25; mov.u64 %r50,%r199; mov.u64 %r47,%r23; mov.u64 %r93,%r60; $L27: .loc 1 145 34 add.u64 %r182,%r93,1; .loc 1 145 23 st.u64 [%r47],%r182; .loc 1 144 3 add.u64 %r47,%r47,%r44; setp.eq.u64 %r183,%r50,%r95; @ %r183 bra $L36; .loc 1 145 30 ld.u64 %r93,[%r50]; add.u64 %r50,%r50,8; bra $L27; $L36: .loc 1 143 10 mov.f32 %r78,%r119; $L25: .loc 1 149 9 add.u64 %r72,%r72,%r59; .loc 1 151 14 add.u64 %r60,%r60,1; .loc 1 151 7 st.u64 [%frame+240],%r60; setp.ne.u64 %r184,%r60,%r61; @ %r184 bra $L28; .loc 1 149 9 sub.u64 %r185,%r61,%r115; mad.lo.u64 %r73,%r185,%r59,%r73; bra $L23; $L28: .loc 1 141 10 ld.f32 %r119,[%r72]; bra $L18; $L16: st.u64 [%frame+240],%r61; .loc 1 118 13 mov.u64 %r73,%r120; $L23: mov.u64 %r42,%r199; .loc 1 151 34 mov.u64 %r70,%r61; mov.u64 %r68,%r57; .loc 1 152 9 mov.u64 %r76,0; $L29: .loc 1 157 13 st.u64 [%r42+-8],%r202; .loc 1 160 23 mul.lo.u64 %r65,%r68,%r70; mov.u64 %r108,%r76; .loc 1 161 5 add.u64 %r76,%r76,1; .loc 1 162 7 setp.eq.u64 %r189,%r108,%r106; @ %r189 bra $L1; .loc 1 170 16 ld.u64 %r190,[%r42]; add.u64 %r67,%r190,1; st.u64 [%r42],%r67; shl.b64 %r118,%r76,3; .loc 1 171 23 add.u64 %r191,%frame,%r118; ld.u64 %r68,[%r191]; .loc 1 171 13 sub.u64 %r192,%r68,%r65; shl.b64 %r193,%r192,2; add.u64 %r73,%r73,%r193; .loc 1 174 32 add.u64 %r195,%r203,%r118; ld.u64 %r70,[%r195]; .loc 1 174 7 add.u64 %r42,%r42,8; setp.eq.u64 %r196,%r67,%r70; @ %r196 bra $L29; ld.u64 %r115,[%frame+240]; .loc 1 141 10 ld.f32 %r119,[%r73]; bra $L30; $L33: mov.f32 %r78,%r119; setp.eq.u32 %r197,1,1; bra $L10; $L_gfortran_mminloc0_8_r4 .visible .func _gfortran_mminloc0_8_r4reg .u64 %stack; mov.u64 %stack,0480pred %r187; .reg .u64 %r190; .reg .u64u64pred %r222; .reg .u64 %r223; .reg .predu32 %r230; .reg .u64 %r231; .reg .u16 %r233; .reg .u32 %r234; .reg .pred %r235; .reg .u64 %r236; .reg .predpred %r250; .reg .u64 %r252; .reg .predu16 %r257; .reg .u32 %r258; 
.reg .pred %r259; .reg .pred %r260; .reg .u64 %r262; .reg .pred %r263; .reg .predpred %r282; .reg .u64 %r284; .reg .u64 %r285; .reg .pred %r287; mov.u64 %r160,%ar0; mov.u64 %r161,%ar1; mov.u64 %r162,%ar2; mov.u32 %r163,%ar3; .loc 1 202 6 setp.ne.u64 %r164,%r162,0; @ %r164 bra $L53; .loc 1 204 7 {1163; call _gfortran_minloc0_8_r4.loc 1 205 7 bra $L52; $L53: .loc 1 208 10 ld.s8 %r22,[%r161+28]; .loc 1 209 6 setp.gt.s32 %r168,%r22,0; @ %r168 bra $L55; .loc 1 210 5 cvta.const.u64 %r16stack; call _gfortran_runtime_error5: .loc 1 212 6 ld.u64 %r171,[%r160]; setp.ne.u64 %r172,%r171,0; @ %r172 bra $L56; .loc 1 214 7 st.u64 [%r160+48],%r171; add.u32 %r174,%r22,-1; cvt.s64.s32 %r175,%r174; st.u64 [%r160+56],%r175; mov.u64 %r176,1; st.u64 [%r160+40],%r176; .loc 1 215 28 cvt.u32.u64 %r177,%r176; st.u8 [%r160+28],%r177; .loc 1 216 24 st.u64 [%r160+8],%r171; .loc 1 217 29 cvt.u32.u32 %r182,%r22; cvt.s64.s8 %r181,%r182; mov.u64 %r180call (%value_in),_gfortrani_xmallocarray183,[%value_in]; } .loc 1 217 27 st.u64 [%r160],%r183; bra $L57; $L56: .loc 1 221 11 cvta.global.u64 %r185,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r186,[%r185+36]; setp.eq.u32 %r187,%r186,0; @ %r187 bra $L57; .loc 1 224 4 cvta.const.u64 %r190,$LC1; {161190; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19316161193190; call _gfortrani_bounds_equal_extents$L57: .loc 1 231 15 ld.u64 %r31,[%r162+16]; .loc 1 233 9 ld.u64 %r101,[%r162]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r195,%r32,-4; and.b32 %r196,%r195,-5; set.u32.eq.u32 %r198,%r196,0; neg.s32 %r199,%r198; .loc 1 235 22 add.u32 %r200,%r32,-1; set.u32.le.u32 %r202,%r200,1; neg.s32 %r203,%r202; .loc 1 235 6 cvt.u16.u32 %r205,%r199; cvt.u16.u32 %r206,%r203; or.b16 %r204,%r205,%r206; cvt.u32.u16 %r207,%r204; cvt.u16.u8 %r208,%r20758; .loc 1 237 7 setp.ne.u32 %r211,%r32,16; @ %r211 bra $L59; $L58: .loc 1 244 11 ld.u64 %r116,[%r160+40]; .loc 1 245 8 ld.u64 %r117,[%r160]; .loc 1 246 17 cvt.u32.u32 %r212,%r22; cvt.s64.s8 %r137,%r212; add.u64 %r52,%r161,40; add.u64 %r280,%frame,120; mov.u64 %r62,%r280; add.u64 %r67,%r162,40; mov.u64 %r70,%frame; add.u64 %r57,%frame,240; add.u64 %r59,%frame,360; mov.u64 %r58,%r59; .loc 1 246 10 mov.u64 %r122,0; .loc 1 251 16 mov.u64 %r221,%r122; bra $L60; $L59: .loc 1 242 5_gfortran_runtime_error248 18 ld.u64 %r215,[%r52]; st.u64 [%r62],%r215; .loc 1 249 20 ld.u64 %r217,[%r67]; mul.lo.u64 %r216,%r217,%r31; .loc 1 249 18 st.u64 [%r70],%r216; .loc 1 250 19 ld.u64 %r219,[%r52+16]; add.u64 %r218,%r219,1; ld.u64 %r220,[%r52+8]; sub.u64 %r45,%r218,%r220; .loc 1 250 17 st.u64 [%r57],%r45; .loc 1 251 16 st.u64 [%r58],%r221; .loc 1 252 10 setp.gt.s64 %r222,%r45,0; @ %r222 bra $L61; shl.b64 %r130,%r116,3; mov.u64 %r132,%r117; .loc 1 255 11 mov.u64 %r123,0; .loc 1 256 24 mov.u64 %r223,%r123; $L62: st.u64 [%r132],%r223; .loc 1 255 27 add.u64 %r123,%r123,1; .loc 1 255 4 add.u64 %r132,%r132,%r130; setp.ne.u64 %r224,%r123,%r137; @ %r224 bra $L62; bra $L52; $L61: .loc 1 246 26 add.u64 %r122,%r122,1; .loc 1 246 3 add.u64 %r52,%r52,24; add.u64 %r62,%r62,8; add.u64 %r67,%r67,24; add.u64 %r70,%r70,8; add.u64 %r57,%r57,8; add.u64 %r58,%r58,8; setp.ne.u64 %r225,%r122,%r137; @ %r225 bra $L60; .loc 1 261 8 ld.u64 %r98,[%r161]; shl.b64 %r134,%r116,3; mov.u64 %r153,%r117; .loc 1 264 10 mov.u64 %r121,0; .loc 1 265 23 mov.u64 %r226,%r121; $L64: st.u64 [%r153],%r226; .loc 1 264 26 add.u64 %r121,%r121,1; .loc 1 264 3 add.u64 %r153,%r153,%r134; setp.ne.u64 %r227,%r121,%r137; @ %r227 bra $L64; .loc 1 276 9 setp.eq.u64 %r228,%r98,0; @ %r228 
bra $L52; .loc 1 331 19 ld.u64 %r77,[%frame+120]; .loc 1 331 9 shl.b64 %r79,%r77,2; .loc 1 332 20 ld.u64 %r80,[%frame]; .loc 1 334 34 ld.u64 %r84,[%frame+240]; .loc 1 334 21 ld.u64 %r156,[%frame+360]; cvt.u32.u32 %r230,%r22; cvt.s64.s8 %r229,%r230; shl.b64 %r231,%r229,3; add.u64 %r48,%r231,%r59; .loc 1 272 12 mov.f32 %r108,0f7f800000; setp.eq.u32 %r282,1,0; add.u64 %r281,%frame,368; .loc 1 340 13 mov.u64 %r284,0; .loc 1 359 32 add.u64 %r285,%frame,240; .loc 1 308 12 setp.eq.u32 %r287,%r163,0; $L83: .loc 1 280 10 @ %r282 bra $L65; $L72: .loc 1 284 11 ld.s8 %r234,[%r101]; cvt.u16.u32 %r233,%r234; setp.eq.u16 %r235,%r233,0; @ %r235 bra $L66; .loc 1 287 8 ld.u64 %r236,[%r117]; setp.ne.u64 %r237,%r236,0; @ %r237 bra $L67; mov.u64 %r66,%r281; mov.u64 %r76,%r117; mov.u64 %r150,%r156; $L68: .loc 1 289 38 add.u64 %r239,%r150,1; .loc 1 289 27 st.u64 [%r76],%r239; .loc 1 288 7 add.u64 %r76,%r76,%r134; setp.eq.u64 %r240,%r48,%r66; @ %r240 bra $L67; .loc 1 289 34 ld.u64 %r150,[%r66]; add.u64 %r66,%r66,8; bra $L68; $L67: .loc 1 290 9 ld.f32 %r54,[%r98]; .loc 1 290 8 setp.le.f32 %r241,%r54,%r108; @ ! %r241 bra $L66; mov.u64 %r112,%r281; mov.u64 %r125,%r117; mov.u64 %r154,%r156; $L71: .loc 1 296 33 add.u64 %r243,%r154,1; .loc 1 296 22 st.u64 [%r125],%r243; .loc 1 295 9 add.u64 %r125,%r125,%r134; setp.eq.u64 %r244,%r48,%r112; @ %r244 bra $L84; .loc 1 296 29 ld.u64 %r154,[%r112]; add.u64 %r112,%r112,8; bra $L71; $L66: .loc 1 300 13 add.u64 %r98,%r98,%r79; .loc 1 301 14 add.u64 %r101,%r101,%r80; .loc 1 303 11 add.u64 %r156,%r156,1; .loc 1 303 4 st.u64 [%frame+360],%r156; setp.ne.u64 %r245,%r156,%r84; @ %r245 bra $L72; bra $L73; $L65: .loc 1 308 12 @ %r287 bra $L85; mov.u64 %r72,%r156; mov.u64 %r96,%r98; $L78: .loc 1 311 11 ld.s8 %r248,[%r101]; cvt.u16.u32 %r247,%r248; setp.eq.u16 %r249,%r247,0; @ %r249 bra $L75; .loc 1 311 12 ld.f32 %r65,[%r96]; .loc 1 311 11 setp.le.f32 %r250,%r65,%r108; @ ! %r250 bra $L75; mov.u64 %r124,%r281; mov.u64 %r36,%r117; mov.u64 %r136,%r72; .loc 1 314 18 mov.u64 %r120,0; $L77: .loc 1 315 36 add.u64 %r252,%r136,1; .loc 1 315 25 st.u64 [%r36],%r252; .loc 1 314 34 add.u64 %r120,%r120,1; .loc 1 314 11 add.u64 %r36,%r36,%r134; setp.ge.s64 %r253,%r120,%r137; @ %r253 bra $L86; .loc 1 315 32 ld.u64 %r136,[%r124]; add.u64 %r124,%r124,8; bra $L77; $L86: .loc 1 313 18 mov.f32 %r108,%r65; $L75: .loc 1 317 8 add.u64 %r96,%r96,%r79; .loc 1 319 13 add.u64 %r72,%r72,1; .loc 1 319 6 st.u64 [%frame+360],%r72; setp.ne.u64 %r254,%r72,%r84; @ %r254 bra $L78; .loc 1 317 8 sub.u64 %r255,%r84,%r156; mad.lo.u64 %r98,%r255,%r79,%r98; bra $L73; $L85: mov.u64 %r83,%r156; mov.u64 %r100,%r101; mov.u64 %r97,%r98; $L74: .loc 1 323 11 ld.s8 %r258,[%r100]; cvt.u16.u32 %r257,%r258; setp.eq.u16 %r259,%r257,0; @ %r259 bra $L79; .loc 1 323 12 ld.f32 %r74,[%r97]; .loc 1 323 11 setp.lt.f32 %r260,%r74,%r108; @ ! 
%r260 bra $L79; mov.u64 %r155,%r281; mov.u64 %r159,%r117; mov.u64 %r127,%r83; .loc 1 326 12 mov.u64 %r119,0; $L81: .loc 1 327 36 add.u64 %r262,%r127,1; .loc 1 327 25 st.u64 [%r159],%r262; .loc 1 326 28 add.u64 %r119,%r119,1; .loc 1 326 5 add.u64 %r159,%r159,%r134; setp.ge.s64 %r263,%r119,%r137; @ %r263 bra $L87; .loc 1 327 32 ld.u64 %r127,[%r155]; add.u64 %r155,%r155,8; bra $L81; $L87: .loc 1 325 12 mov.f32 %r108,%r74; $L79: .loc 1 331 9 add.u64 %r97,%r97,%r79; .loc 1 332 10 add.u64 %r100,%r100,%r80; .loc 1 334 14 add.u64 %r83,%r83,1; .loc 1 334 7 st.u64 [%frame+360],%r83; setp.ne.u64 %r264,%r83,%r84; @ %r264 bra $L74; sub.u64 %r141,%r84,%r156; .loc 1 331 9 mad.lo.u64 %r98,%r79,%r141,%r98; .loc 1 332 10 mad.lo.u64 %r101,%r80,%r141,%r101; $L73: mov.u64 %r107,%r281; .loc 1 325 12 mov.u64 %r93,%r80; .loc 1 334 34 mov.u64 %r95,%r84; mov.u64 %r91,%r77; mov.u64 %r105,8; .loc 1 335 9 mov.u64 %r106,0; $L82: .loc 1 340 13 st.u64 [%r107+-8],%r284; .loc 1 343 23 mul.lo.u64 %r86,%r91,%r95; .loc 1 344 24 mul.lo.u64 %r88,%r95,%r93; .loc 1 345 5 add.u64 %r106,%r106,1; .loc 1 346 7 setp.eq.u64 %r269,%r106,%r137; @ %r269 bra $L52; .loc 1 354 16 ld.u64 %r270,[%r107]; add.u64 %r90,%r270,1; st.u64 [%r107],%r90; .loc 1 355 23 add.u64 %r272,%r280,%r105; ld.u64 %r91,[%r272]; .loc 1 355 13 sub.u64 %r273,%r91,%r86; shl.b64 %r274,%r273,2; add.u64 %r98,%r98,%r274; .loc 1 356 24 add.u64 %r275,%frame,%r105; ld.u64 %r93,[%r275]; .loc 1 356 14 sub.u64 %r276,%r93,%r88; add.u64 %r101,%r101,%r276; .loc 1 359 32 add.u64 %r278,%r285,%r105; ld.u64 %r95,[%r278]; .loc 1 359 7 add.u64 %r107,%r107,8; add.u64 %r105,%r105,8; setp.eq.u64 %r279,%r90,%r95; @ %r279 bra $L82; .loc 1 334 21 ld.u64 %r156,[%frame+360]; bra $L83; $L84: .loc 1 290 9 mov.f32 %r108,%r54; setp.eq.u32 %r282,1,1; bra $L65; $L52: .loc 1 _gfortran_sminloc0_8_r4 .visible .func _gfortran_sminloc0_8_r4u32pred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L114; .loc 1 378 20 ld.u32 %r40,[%r37380 7 {_gfortran_minloc0_8_r4.loc 1 381 7 bra $L113; $L115: .loc 1 384 8 ld.s8 %r30,[%r36+28]; .loc 1 386 6 setp.gt.s64 %r46,%r30,0; @ %r46 bra $L1_gfortran_runtime_error117: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L118; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29 mov.u64 call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L119; $L118: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36.loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L119: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,3; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 mov.u64 %r68,%r32; $L120: st.u64 [%r29],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r69,%r30,%r32; @ %r69 bra $L120; $L113: .loc 1 406 1 ret; } minloc0_16_r4.o/_gfortran_minloc0_16_r4 .visible .func _gfortran_minloc0_16_libgfortran/generated/minloc0_16_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_16_r4 .visible .func _gfortran_mminloc0_16_r4_gfortran_sminloc0_16_r4 .visible .func _gfortran_sminloc0_16_r4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func 
_gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_16_r4 .visible .func _gfortran_minloc0_16_stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[368105; .reg .u64 %r108predpred %r160; .reg .predpredu32 %r167; .reg .u64 %r168; .reg .pred %r171; .reg .pred %r172; .reg .u64 %r174; .reg .u64 %r175; .reg .pred %r176; .reg .pred %r177; .reg .predu64 %r193; .reg .predmov.u64 %r124,%ar0; mov.u64 %r125,%ar1; mov.u32 %r126,%ar2; .loc 1 50 10 ld.s8 %r22,[%r125+28]; .loc 1 50 8 cvt.u32.u32 %r127,%r22; cvt.s64.s8 %r79,%r127; .loc 1 51 6 setp.gt.s64 %r128,%r79,0; @ %r128 bra $L2; .loc 1 52 5 cvta.const.u64 %r112_gfortran_runtime_errorr23,[%r124]; .loc 1 54 6 setp.ne.u64 %r131,%r23,0; @ %r131 bra $L3; .loc 1 56 7 st.u64 [%r124+48],%r23; add.u64 %r133,%r79,-1; st.u64 [%r124+56],%r133; mov.u64 %r134,1; st.u64 [%r124+40],%r134; .loc 1 57 28 cvt.u32.u64 %r135,%r134; st.u8 [%r124+28],%r135; .loc 1 58 24 st.u64 [%r124+8],%r23; .loc 1 59 29 mov.u64 %r138,1679call (%value_in),_gfortrani_xmallocarray1,[%value_in]; } mov.u64 %r23,%r141; .loc 1 59 27 st.u64 [%r124],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r143,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r144,[%r143+36]; setp.eq.u32 %r145,%r144,0; @ %r145 bra $L4; .loc 1 64r148; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r124]; $L4: .loc 1 68 11 ld.u64 %r80,[%r124+40]; add.u64 %r51,%r125,40; mov.u64 %r42,%frame; add.u64 %r76,%frame,120; .loc 1 70 10 mov.u64 %r108,0; add.u64 %r205,%frame,240; .loc 1 74 16 mov.u64 %r156,%r108; bra $L7; $L32: mov.u64 %r108,%r88; $L7: .loc 1 72 18 ld.u64 %r149,[%r51]; st.u64 [%r42],%r149; .loc 1 73 19 ld.u64 %r151,[%r51+16]; add.u64 %r150,%r151,1; ld.u64 %r152,[%r51+8]; sub.u64 %r31,%r150,%r152; .loc 1 73 17 st.u64 [%r76],%r31; .loc 1 74 16 shl.b64 %r154,%r108,3; add.u64 %r155,%r205,%r154; st.u64 [%r155],%r156; .loc 1 75 10 setp.gt.s64 %r157,%r31,0; @ %r157 bra $L5; shl.b64 %r120,%r80,4; mov.u64 %r105,%r23; .loc 1 78 11 mov.u64 %r89,0; .loc 1 79 24 mov.u64 %r158,%r89; $L6: st.u64 [%r105],%r158; st.u64 [%r105+8],%r158; .loc 1 78 27 add.u64 %r89,%r89,1; .loc 1 78 4 add.u64 %r105,%r105,%r120; setp.ne.u64 %r160,%r79,%r89; @ %r160 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r88,%r108,1; .loc 1 70 3 add.u64 %r51,%r51,24; add.u64 %r42,%r42,8; add.u64 %r76,%r76,8; setp.ne.u64 %r161,%r79,%r88; @ %r161 bra $L32; .loc 1 84 8 ld.u64 %r69,[%r125]; shl.b64 %r35,%r80,4; mov.u64 %r48,%r23; .loc 1 87 10 mov.u64 %r87,0; .loc 1 88 23 mov.u64 %r162,1; mov.u64 %r163,%r87; $L8: st.u64 [%r48],%r162; st.u64 [%r48+8],%r163; mov.u64 %r109,%r87; .loc 1 87 26 add.u64 %r87,%r87,1; .loc 1 87 3 add.u64 %r48,%r48,%r35; setp.ne.u64 %r164,%r108,%r109; @ %r164 bra $L8; .loc 1 101 9 setp.eq.u64 %r165,%r69,0; @ %r165 bra $L1; .loc 1 149 19 ld.u64 %r54,[%frame]; .loc 1 149 9 shl.b64 %r56,%r54,2; .loc 1 151 34 ld.u64 %r58,[%frame+120]; ld.u64 %r121,[%frame+240]; .loc 1 141 10 ld.f32 %r122,[%r69]; cvt.u32.u32 %r167,%r22; cvt.s64.s8 %r166,%r167; shl.b64 %r168,%r166,3; add.u64 %r46,%r205,%r168; .loc 1 97 12 mov.f32 %r74,0f7f800000; setp.eq.u32 %r207,1,0; add.u64 %r206,%frame,248; .loc 1 157 
13 mov.u64 %r210,0; .loc 1 174 32 add.u64 %r211,%frame,120; .loc 1 126 10 setp.eq.u32 %r213,%r126,0; $L30: .loc 1 106 10 @ %r207 bra $L10; mov.u64 %r37,%r121; mov.u32 %r85,0; .loc 1 110 12 mov.u32 %r209,1; $L17: .loc 1 118 13 add.u64 %r123,%r69,%r56; .loc 1 110 11 setp.ge.f32 %r171,%r74,%r122; @ ! %r171 bra $L51; setp.eq.u32 %r172,%r85,0; selp.u64 %r121,%r121,%r37,%r172; mov.u64 %r78,%r206; mov.u64 %r94,%r23; mov.u64 %r117,%r121; $L15: .loc 1 115 36 add.u64 %r174,%r117,1; st.u64 [%r94],%r174; shr.s64 %r175,%r174,63; st.u64 [%r94+8],%r175; .loc 1 114 5 add.u64 %r94,%r94,%r35; setp.eq.u64 %r176,%r46,%r78; @ %r176 bra $L33; .loc 1 115 32 ld.u64 %r117,[%r78]; add.u64 %r78,%r78,8; bra $L15; $L51: .loc 1 120 11 add.u64 %r37,%r37,1; .loc 1 120 4 setp.eq.u64 %r177,%r37,%r58; @ %r177 bra $L16; .loc 1 110 12 ld.f32 %r122,[%r123]; mov.u64 %r69,%r123; mov.u32 %r85,%r209; bra $L17; $L10: .loc 1 126 10 @ %r213 bra $L34; mov.u64 %r49,%r121; mov.u64 %r67,%r69; $L24: .loc 1 129 9 setp.ge.f32 %r179,%r74,%r122; @ ! %r179 bra $L19; mov.u64 %r119,%r206; mov.u64 %r70,%r23; mov.u64 %r114,%r49; $L21: .loc 1 133 34 add.u64 %r181,%r114,1; st.u64 [%r70],%r181; shr.s64 %r182,%r181,63; st.u64 [%r70+8],%r182; .loc 1 132 3 add.u64 %r70,%r70,%r35; setp.eq.u64 %r183,%r46,%r119; @ %r183 bra $L35; .loc 1 133 30 ld.u64 %r114,[%r119]; add.u64 %r119,%r119,8; bra $L21; $L35: .loc 1 131 10 mov.f32 %r74,%r122; $L19: .loc 1 135 11 add.u64 %r67,%r67,%r56; .loc 1 137 9 add.u64 %r49,%r49,1; .loc 1 137 2 st.u64 [%frame+240],%r49; setp.ne.u64 %r184,%r49,%r58; @ %r184 bra $L22; .loc 1 135 11 sub.u64 %r185,%r58,%r121; mad.lo.u64 %r69,%r185,%r56,%r69; bra $L23; $L22: .loc 1 129 10 ld.f32 %r122,[%r67]; bra $L24; $L34: mov.u64 %r57,%r121; mov.u64 %r68,%r69; $L18: .loc 1 141 9 setp.gt.f32 %r187,%r74,%r122; @ ! 
%r187 bra $L25; mov.u64 %r112,%r206; mov.u64 %r116,%r23; mov.u64 %r111,%r57; $L27: .loc 1 145 34 add.u64 %r189,%r111,1; st.u64 [%r116],%r189; shr.s64 %r190,%r189,63; st.u64 [%r116+8],%r190; .loc 1 144 3 add.u64 %r116,%r116,%r35; setp.eq.u64 %r191,%r46,%r112; @ %r191 bra $L36; .loc 1 145 30 ld.u64 %r111,[%r112]; add.u64 %r112,%r112,8; bra $L27; $L36: .loc 1 143 10 mov.f32 %r74,%r122; $L25: .loc 1 149 9 add.u64 %r68,%r68,%r56; .loc 1 151 14 add.u64 %r57,%r57,1; .loc 1 151 7 st.u64 [%frame+240],%r57; setp.ne.u64 %r192,%r57,%r58; @ %r192 bra $L28; .loc 1 149 9 sub.u64 %r193,%r58,%r121; mad.lo.u64 %r69,%r193,%r56,%r69; bra $L23; $L28: .loc 1 141 10 ld.f32 %r122,[%r68]; bra $L18; $L16: st.u64 [%frame+240],%r58; .loc 1 118 13 mov.u64 %r69,%r123; $L23: mov.u64 %r71,%r206; .loc 1 151 34 mov.u64 %r65,%r58; mov.u64 %r63,%r54; .loc 1 152 9 mov.u64 %r73,0; $L29: .loc 1 157 13 st.u64 [%r71+-8],%r210; .loc 1 160 23 mul.lo.u64 %r60,%r63,%r65; mov.u64 %r110,%r73; .loc 1 161 5 add.u64 %r73,%r73,1; .loc 1 162 7 setp.eq.u64 %r197,%r110,%r108; @ %r197 bra $L1; .loc 1 170 16 ld.u64 %r198,[%r71]; add.u64 %r62,%r198,1; st.u64 [%r71],%r62; shl.b64 %r40,%r73,3; .loc 1 171 23 add.u64 %r199,%frame,%r40; ld.u64 %r63,[%r199]; .loc 1 171 13 sub.u64 %r200,%r63,%r60; shl.b64 %r201,%r200,2; add.u64 %r69,%r69,%r201; .loc 1 174 32 add.u64 %r203,%r211,%r40; ld.u64 %r65,[%r203]; .loc 1 174 7 add.u64 %r71,%r71,8; setp.eq.u64 %r204,%r62,%r65; @ %r204 bra $L29; ld.u64 %r121,[%frame+240]; .loc 1 141 10 ld.f32 %r122,[%r69]; bra $L30; $L33: mov.f32 %r74,%r122; setp.eq.u32 %r207,1,1; bra $L10; $L_gfortran_mminloc0_16_r4 .visible .func _gfortran_mminloc0_16_r4reg .u64 %stack; mov.u64 %stack,0480pred %r168; .reg .predpredpred %r226; .reg .u64 %r227; .reg .pred %r229; .reg .predpred %r234; .reg .u64u16pred %r248; .reg .pred %r249; .reg .u64 %r251; .reg .u64 %r252; .reg .pred %r253; .reg .pred %r254; .reg .u16 %r256; .reg .u32predpred %r274; .reg .predu64 %r283; .reg .u64u64 %r287; .reg .u64 %r289; .reg .pred %r290; .reg .u64 %r291; .reg .u64 %r292; .reg .predpred %r298; mov.u64 %r164,%ar0; mov.u64 %r165,%ar1; mov.u64 %r166,%ar2; mov.u32 %r167,%ar3; .loc 1 202 6 setp.ne.u64 %r168,%r166,0; @ %r168 bra $L1617; call _gfortran_minloc0_16_r4.loc 1 205 7 bra $L52; $L53: .loc 1 208 10 ld.s8 %r22,[%r165+28]; .loc 1 209 6 setp.gt.s32 %r172,%r22,0; @ %r172 bra $L55; .loc 1 210 5 cvta.const.u64 %r173173_gfortran_runtime_error5: .loc 1 212 6 ld.u64 %r175,[%r164]; setp.ne.u64 %r176,%r175,0; @ %r176 bra $L56; .loc 1 214 7 st.u64 [%r164+48],%r175; add.u32 %r178,%r22,-1; cvt.s64.s32 %r179,%r178; st.u64 [%r164+56],%r179; mov.u64 %r180,1; st.u64 [%r164+40],%r180; .loc 1 215 28 cvt.u32.u64 %r181,%r180; st.u8 [%r164+28],%r181; .loc 1 216 24 st.u64 [%r164+8],%r175; .loc 1 217 29 cvt.u32.u32 %r186,%r22; cvt.s64.s8 %r185,%r186; mov.u64 %r184,16call (%value_in),_gfortrani_xmallocarray187,[%value_in]; } .loc 1 217 27 st.u64 [%r164],%r187; bra $L57; $L56: .loc 1 221 11 cvta.global.u64 %r189,_gfortrani_compile_options; .loc 1 221 10 ld.u32 %r190,[%r189+36]; setp.eq.u32 %r191,%r190,0164r194; call _gfortrani_bounds_iforeach_return.loc 1 226 4 cvta.const.u64 %r19716r197194; call _gfortrani_bounds_equal_extents$L57: .loc 1 231 15 ld.u64 %r31,[%r166+16]; .loc 1 233 9 ld.u64 %r104,[%r166]; .loc 1 235 22 cvt.u32.u64 %r32,%r31; .loc 1 235 53 add.u32 %r199,%r32,-4; and.b32 %r200,%r199,-5; set.u32.eq.u32 %r202,%r200,0; neg.s32 %r203,%r202; .loc 1 235 22 add.u32 %r204,%r32,-1; set.u32.le.u32 %r206,%r204,1; neg.s32 %r207,%r206; .loc 1 235 6 cvt.u16.u32 %r209,%r203; 
cvt.u16.u32 %r210,%r207; or.b16 %r208,%r209,%r210; cvt.u32.u16 %r211,%r208; cvt.u16.u8 %r212,%r211; setp.ne.u16 %r213,%r212,0; @ %r213 bra $L58; .loc 1 237 7 setp.ne.u32 %r215,%r32,16; @ %r215 bra $L59; $L58: .loc 1 244 11 ld.u64 %r119,[%r164+40]; .loc 1 245 8 ld.u64 %r120,[%r164]; .loc 1 246 17 cvt.u32.u32 %r216,%r22; cvt.s64.s8 %r137,%r216; add.u64 %r153,%r165,40; add.u64 %r291,%frame,120; mov.u64 %r150,%r291; add.u64 %r151,%r166,40; mov.u64 %r65,%frame; add.u64 %r62,%frame,240; add.u64 %r48,%frame,360; mov.u64 %r54,%r48; .loc 1 246 10 mov.u64 %r126,0; .loc 1 251 16 mov.u64 %r225,%r126; bra $L60; $L59: .loc 1 242 5 cvta.const.u64 %r217,$LC3stack; call _gfortran_runtime_error248 18 ld.u64 %r219,[%r153]; st.u64 [%r150],%r219; .loc 1 249 20 ld.u64 %r221,[%r151]; mul.lo.u64 %r220,%r221,%r31; .loc 1 249 18 st.u64 [%r65],%r220; .loc 1 250 19 ld.u64 %r223,[%r153+16]; add.u64 %r222,%r223,1; ld.u64 %r224,[%r153+8]; sub.u64 %r47,%r222,%r224; .loc 1 250 17 st.u64 [%r62],%r47; .loc 1 251 16 st.u64 [%r54],%r225; .loc 1 252 10 setp.gt.s64 %r226,%r47,0; @ %r226 bra $L61; shl.b64 %r132,%r119,4; mov.u64 %r136,%r120; .loc 1 255 11 mov.u64 %r127,0; .loc 1 256 24 mov.u64 %r227,%r127; $L62: st.u64 [%r136],%r227; st.u64 [%r136+8],%r227; .loc 1 255 27 add.u64 %r127,%r127,1; .loc 1 255 4 add.u64 %r136,%r136,%r132; setp.ne.u64 %r229,%r127,%r137; @ %r229 bra $L62; bra $L52; $L61: .loc 1 246 26 add.u64 %r126,%r126,1; .loc 1 246 3 add.u64 %r153,%r153,24; add.u64 %r150,%r150,8; add.u64 %r151,%r151,24; add.u64 %r65,%r65,8; add.u64 %r62,%r62,8; add.u64 %r54,%r54,8; setp.ne.u64 %r230,%r126,%r137; @ %r230 bra $L60; .loc 1 261 8 ld.u64 %r102,[%r165]; shl.b64 %r159,%r119,4; mov.u64 %r157,%r120; .loc 1 264 10 mov.u64 %r125,0; .loc 1 265 23 mov.u64 %r231,%r125; $L64: st.u64 [%r157],%r231; st.u64 [%r157+8],%r231; .loc 1 264 26 add.u64 %r125,%r125,1; .loc 1 264 3 add.u64 %r157,%r157,%r159; setp.ne.u64 %r233,%r125,%r137; @ %r233 bra $L64; .loc 1 276 9 setp.eq.u64 %r234,%r102,0; @ %r234 bra $L52; .loc 1 331 19 ld.u64 %r82,[%frame+120]; .loc 1 331 9 shl.b64 %r84,%r82,2; .loc 1 332 20 ld.u64 %r85,[%frame]; .loc 1 334 34 ld.u64 %r87,[%frame+240]; .loc 1 334 21 ld.u64 %r161,[%frame+360]; cvt.u32.u32 %r236,%r22; cvt.s64.s8 %r235,%r236; shl.b64 %r237,%r235,3; add.u64 %r68,%r237,%r48; .loc 1 272 12 mov.f32 %r110,0f7f800000; setp.eq.u32 %r293,1,0; add.u64 %r292,%frame,368; .loc 1 340 13 mov.u64 %r295,0; .loc 1 359 32 add.u64 %r296,%frame,240; .loc 1 308 12 setp.eq.u32 %r298,%r167,0; $L83: .loc 1 280 10 @ %r293 bra $L65; $L72: .loc 1 284 11 ld.s8 %r240,[%r104]; cvt.u16.u32 %r239,%r240; setp.eq.u16 %r241,%r239,0; @ %r241 bra $L66; .loc 1 287 8 ld.u64 %r242,[%r120]; ld.u64 %r243,[%r120+8]; or.b64 %r242,%r242,%r243; setp.ne.u64 %r244,%r242,0; @ %r244 bra $L67; mov.u64 %r99,%r292; mov.u64 %r105,%r120; mov.u64 %r152,%r161; $L68: .loc 1 289 38 add.u64 %r246,%r152,1; st.u64 [%r105],%r246; shr.s64 %r247,%r246,63; st.u64 [%r105+8],%r247; .loc 1 288 7 add.u64 %r105,%r105,%r159; setp.eq.u64 %r248,%r68,%r99; @ %r248 bra $L67; .loc 1 289 34 ld.u64 %r152,[%r99]; add.u64 %r99,%r99,8; bra $L68; $L67: .loc 1 290 9 ld.f32 %r55,[%r102]; .loc 1 290 8 setp.le.f32 %r249,%r55,%r110; @ ! 
%r249 bra $L66; mov.u64 %r129,%r292; mov.u64 %r130,%r120; mov.u64 %r156,%r161; $L71: .loc 1 296 33 add.u64 %r251,%r156,1; st.u64 [%r130],%r251; shr.s64 %r252,%r251,63; st.u64 [%r130+8],%r252; .loc 1 295 9 add.u64 %r130,%r130,%r159; setp.eq.u64 %r253,%r68,%r129; @ %r253 bra $L84; .loc 1 296 29 ld.u64 %r156,[%r129]; add.u64 %r129,%r129,8; bra $L71; $L66: .loc 1 300 13 add.u64 %r102,%r102,%r84; .loc 1 301 14 add.u64 %r104,%r104,%r85; .loc 1 303 11 add.u64 %r161,%r161,1; .loc 1 303 4 st.u64 [%frame+360],%r161; setp.ne.u64 %r254,%r161,%r87; @ %r254 bra $L72; bra $L73; $L65: .loc 1 308 12 @ %r298 bra $L85; mov.u64 %r72,%r161; mov.u64 %r100,%r102; $L78: .loc 1 311 11 ld.s8 %r257,[%r104]; cvt.u16.u32 %r256,%r257; setp.eq.u16 %r258,%r256,0; @ %r258 bra $L75; .loc 1 311 12 ld.f32 %r66,[%r100]; .loc 1 311 11 setp.le.f32 %r259,%r66,%r110; @ ! %r259 bra $L75; mov.u64 %r49,%r292; mov.u64 %r64,%r120; mov.u64 %r149,%r72; .loc 1 314 18 mov.u64 %r123,0; $L77: .loc 1 315 36 add.u64 %r261,%r149,1; st.u64 [%r64],%r261; shr.s64 %r262,%r261,63; st.u64 [%r64+8],%r262; .loc 1 314 34 add.u64 %r123,%r123,1; .loc 1 314 11 add.u64 %r64,%r64,%r159; setp.ge.s64 %r263,%r123,%r137; @ %r263 bra $L86; .loc 1 315 32 ld.u64 %r149,[%r49]; add.u64 %r49,%r49,8; bra $L77; $L86: .loc 1 313 18 mov.f32 %r110,%r66; $L75: .loc 1 317 8 add.u64 %r100,%r100,%r84; .loc 1 319 13 add.u64 %r72,%r72,1; .loc 1 319 6 st.u64 [%frame+360],%r72; setp.ne.u64 %r264,%r72,%r87; @ %r264 bra $L78; .loc 1 317 8 sub.u64 %r265,%r87,%r161; mad.lo.u64 %r102,%r265,%r84,%r102; bra $L73; $L85: mov.u64 %r73,%r161; mov.u64 %r103,%r104; mov.u64 %r101,%r102; $L74: .loc 1 323 11 ld.s8 %r268,[%r103]; cvt.u16.u32 %r267,%r268; setp.eq.u16 %r269,%r267,0; @ %r269 bra $L79; .loc 1 323 12 ld.f32 %r77,[%r101]; .loc 1 323 11 setp.lt.f32 %r270,%r77,%r110; @ ! 
%r270 bra $L79; mov.u64 %r128,%r292; mov.u64 %r36,%r120; mov.u64 %r135,%r73; .loc 1 326 12 mov.u64 %r122,0; $L81: .loc 1 327 36 add.u64 %r272,%r135,1; st.u64 [%r36],%r272; shr.s64 %r273,%r272,63; st.u64 [%r36+8],%r273; .loc 1 326 28 add.u64 %r122,%r122,1; .loc 1 326 5 add.u64 %r36,%r36,%r159; setp.ge.s64 %r274,%r122,%r137; @ %r274 bra $L87; .loc 1 327 32 ld.u64 %r135,[%r128]; add.u64 %r128,%r128,8; bra $L81; $L87: .loc 1 325 12 mov.f32 %r110,%r77; $L79: .loc 1 331 9 add.u64 %r101,%r101,%r84; .loc 1 332 10 add.u64 %r103,%r103,%r85; .loc 1 334 14 add.u64 %r73,%r73,1; .loc 1 334 7 st.u64 [%frame+360],%r73; setp.ne.u64 %r275,%r73,%r87; @ %r275 bra $L74; sub.u64 %r141,%r87,%r161; .loc 1 331 9 mad.lo.u64 %r102,%r84,%r141,%r102; .loc 1 332 10 mad.lo.u64 %r104,%r85,%r141,%r104; $L73: mov.u64 %r114,%r292; .loc 1 325 12 mov.u64 %r96,%r85; .loc 1 334 34 mov.u64 %r98,%r87; mov.u64 %r94,%r82; mov.u64 %r113,8; .loc 1 335 9 mov.u64 %r108,0; $L82: .loc 1 340 13 st.u64 [%r114+-8],%r295; .loc 1 343 23 mul.lo.u64 %r89,%r94,%r98; .loc 1 344 24 mul.lo.u64 %r91,%r98,%r96; .loc 1 345 5 add.u64 %r108,%r108,1; .loc 1 346 7 setp.eq.u64 %r280,%r108,%r137; @ %r280 bra $L52; .loc 1 354 16 ld.u64 %r281,[%r114]; add.u64 %r93,%r281,1; st.u64 [%r114],%r93; .loc 1 355 23 add.u64 %r283,%r291,%r113; ld.u64 %r94,[%r283]; .loc 1 355 13 sub.u64 %r284,%r94,%r89; shl.b64 %r285,%r284,2; add.u64 %r102,%r102,%r285; .loc 1 356 24 add.u64 %r286,%frame,%r113; ld.u64 %r96,[%r286]; .loc 1 356 14 sub.u64 %r287,%r96,%r91; add.u64 %r104,%r104,%r287; .loc 1 359 32 add.u64 %r289,%r296,%r113; ld.u64 %r98,[%r289]; .loc 1 359 7 add.u64 %r114,%r114,8; add.u64 %r113,%r113,8; setp.eq.u64 %r290,%r93,%r98; @ %r290 bra $L82; .loc 1 334 21 ld.u64 %r161,[%frame+360]; bra $L83; $L84: .loc 1 290 9 mov.f32 %r110,%r55; setp.eq.u32 %r293,1,1; bra $L65; $L52: .loc 1 _gfortran_sminloc0_16_r4 .visible .func _gfortran_sminloc0_16_r4u32pred.reg .u64u64 %r68; .reg .predar2; mov.u32 %r38,%ar3; .loc 1 378 6 setp.eq.u64 %r39,%r37,0; @ %r39 bra $L114; .loc 1 378 20 ld.u32 %r40,[%r37380 7 {_gfortran_minloc0_16_r4.loc 1 381 7 bra $L113; $L115: .loc 1 384 8 ld.s8 %r30,[%r36+28]; .loc 1 386 6 setp.gt.s64 %r46,%r30,0; @ %r46 bra $L1_gfortran_runtime_error117: .loc 1 389 15 ld.u64 %r24,[%r35]; .loc 1 389 6 setp.ne.u64 %r49,%r24,0; @ %r49 bra $L118; .loc 1 391 7 st.u64 [%r35+48],%r24; add.u64 %r51,%r30,-1; st.u64 [%r35+56],%r51; mov.u64 %r52,1; st.u64 [%r35+40],%r52; .loc 1 392 28 cvt.u32.u64 %r53,%r52; st.u8 [%r35+28],%r53; .loc 1 393 24 st.u64 [%r35+8],%r24; .loc 1 394 29call (%value_in),_gfortrani_xmallocarray94 27 st.u64 [%r35],%r24; bra $L119; $L118: .loc 1 396 12 cvta.global.u64 %r61,_gfortrani_compile_options; .loc 1 396 11 ld.u32 %r62,[%r61+36.loc 1 398 8 cvta.const.u64 %r66,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 403 8 ld.u64 %r24,[%r35]; $L119: ld.u64 %r67,[%r35+40]; shl.b64 %r34,%r67,4; mov.u64 %r29,%r24; .loc 1 404 10 mov.u64 %r32,0; .loc 1 405 23 mov.u64 %r68,%r32; $L120: st.u64 [%r29],%r68; st.u64 [%r29+8],%r68; .loc 1 404 24 add.u64 %r32,%r32,1; .loc 1 404 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r70,%r30,%r32; @ %r70 bra $L120; $L113: .loc 1 406 1 ret; } minloc0_4_r8.o/_gfortran_minloc0_4_r8 .visible .func _gfortran_minloc0_4_libgfortran/generated/minloc0_4_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_4_r8 .visible .func _gfortran_mminloc0_4_r8_gfortran_sminloc0_4_r8 .visible .func _gfortran_sminloc0_4_r8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func 
_gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,62,32,487] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_4_r8 .visible .func _gfortran_minloc0_4_stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[368.reg .u64predpredpredu32 %r176; .reg .pred %r177; .reg .pred %r178; .reg .pred %r180; .reg .u32 %r182; .reg .u32 %r183; .reg .pred64pred %r208; .reg .u32 %r210; .reg .u64 %r211; .reg .u64 %r212; .reg .pred50 10 ld.s8 %r22,[%r128+28]; .loc 1 50 8 cvt.u32.u32 %r130,%r22; cvt.s64.s8 %r79,%r130; .loc 1 51 6 setp.gt.s64 %r131,%r79,0; @ %r131 bra $L2; .loc 1 52 5 cvta.const.u64 %r1321stack; call _gfortran_runtime_errorr23,[%r127]; .loc 1 54 6 setp.ne.u64 %r134,%r23,0; @ %r134 bra $L3; .loc 1 56 7 st.u64 [%r127+48],%r23; add.u64 %r136,%r79,-1; st.u64 [%r127+56],%r136; mov.u64 %r137,1; st.u64 [%r127+40],%r137; .loc 1 57 28 cvt.u32.u64 %r138,%r137; st.u8 [%r127+28],%r138; .loc 1 58 24 st.u64 [%r127+8],%r23; .loc 1 59 2979call (%value_in),_gfortrani_xmallocarray4,[%value_in]; } mov.u64 %r23,%r144; .loc 1 59 27 st.u64 [%r127],%r23; bra $L4; $L3: .loc 1 63 11 cvta.global.u64 %r146,_gfortrani_compile_options; .loc 1 63 10 ld.u32 %r147,[%r146+36]; setp.eq.u32 %r148,%r147,0; @ %r148 bra $L4; .loc 1 64 2r12151; call _gfortrani_bounds_iforeach_return.loc 1 69 8 ld.u64 %r23,[%r127]; $L4: .loc 1 68 11 ld.u64 %r80,[%r127+40]; add.u64 %r44,%r128,40; mov.u64 %r37,%frame; add.u64 %r47,%frame,120; .loc 1 70 10 mov.u64 %r103,0; add.u64 %r206,%frame,240; .loc 1 74 16 mov.u64 %r159,%r103; bra $L7; $L32: mov.u64 %r103,%r88; $L7: .loc 1 72 18 ld.u64 %r152,[%r44]; st.u64 [%r37],%r152; .loc 1 73 19 ld.u64 %r154,[%r44+16]; add.u64 %r153,%r154,1; ld.u64 %r155,[%r44+8]; sub.u64 %r31,%r153,%r155; .loc 1 73 17 st.u64 [%r47],%r31; .loc 1 74 16 shl.b64 %r157,%r103,3; add.u64 %r158,%r206,%r157; st.u64 [%r158],%r159; .loc 1 75 10 setp.gt.s64 %r160,%r31,0; @ %r160 bra $L5; shl.b64 %r122,%r80,2; mov.u64 %r101,%r23; .loc 1 78 11 mov.u64 %r89,0; .loc 1 79 24 cvt.u32.u64 %r161,%r89; $L6: st.u32 [%r101],%r161; .loc 1 78 27 add.u64 %r89,%r89,1; .loc 1 78 4 add.u64 %r101,%r101,%r122; setp.ne.u64 %r162,%r79,%r89; @ %r162 bra $L6; bra $L1; $L5: .loc 1 70 26 add.u64 %r88,%r103,1; .loc 1 70 3 add.u64 %r44,%r44,24; add.u64 %r37,%r37,8; add.u64 %r47,%r47,8; setp.ne.u64 %r163,%r79,%r88; @ %r163 bra $L32; .loc 1 84 8 ld.u64 %r67,[%r128]; shl.b64 %r108,%r80,2; mov.u64 %r106,%r23; .loc 1 87 10 mov.u64 %r87,0; .loc 1 88 23 mov.u32 %r164,1; $L8: st.u32 [%r106],%r164; mov.u64 %r104,%r87; .loc 1 87 26 add.u64 %r87,%r87,1; .loc 1 87 3 add.u64 %r106,%r106,%r108; setp.ne.u64 %r165,%r103,%r104; @ %r165 bra $L8; .loc 1 101 9 setp.eq.u64 %r166,%r67,0; @ %r166 bra $L1; .loc 1 149 19 ld.u64 %r51,[%frame]; .loc 1 149 9 shl.b64 %r53,%r51,3; .loc 1 151 34 ld.u64 %r56,[%frame+120]; ld.u64 %r123,[%frame+240]; .loc 1 141 10 ld.f64 %r125,[%r67]; cvt.u32.u32 %r168,%r22; cvt.s64.s8 %r167,%r168; shl.b64 %r169,%r167,3; add.u64 %r113,%r206,%r169; .loc 1 97 12 mov.f64 %r73,0d7ff0000000000000; setp.eq.u32 %r208,1,0; add.u64 %r207,%frame,248; .loc 1 157 13 mov.u64 %r211,0; .loc 1 174 32 add.u64 %r212,%frame,120; .loc 1 126 10 setp.eq.u32 %r214,%r129,0; $L30: .loc 1 106 10 @ 
[PTX assembly payload, heavily garbled in extraction; only the structure is recoverable. This span holds the remainder of minloc0_4_r8.o: the tail of _gfortran_minloc0_4_r8 and the bodies of _gfortran_mminloc0_4_r8 (array MASK variant) and _gfortran_sminloc0_4_r8 (scalar-mask variant), generated from libgfortran/generated/minloc0_4_r8.c for the nvptx target. The code walks the array descriptor dimension by dimension, seeds the running minimum with +Inf (mov.f64 ...,0d7ff0000000000000), stores 1-based positions into the integer(4) result with st.u32, and calls the externs _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return and _gfortrani_bounds_equal_extents for error reporting, result allocation and bounds checking. The span ends with the archive header and global declarations of minloc0_8_r8.o (source libgfortran/generated/minloc0_8_r8.c).]
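For orientation, the scan these minloc0 routines implement can be sketched in plain C. This is a simplified reading of the control flow described above, not the libgfortran source: the function name and the contiguous rank-1 layout are assumptions, and the two-phase NaN handling present in the original PTX (a setp.le.f64 pass followed by a setp.lt.f64 pass) is omitted.

#include <math.h>
#include <stddef.h>

/* Hypothetical, simplified minloc0-style scan: return the 1-based position
   of the first minimum of a contiguous double array, or 0 for a zero-sized
   array (the Fortran MINLOC convention). */
static ptrdiff_t minloc0_sketch (const double *base, ptrdiff_t n)
{
  double minval = INFINITY;   /* the PTX seeds with 0d7ff0000000000000 */
  ptrdiff_t pos = 0;
  for (ptrdiff_t i = 0; i < n; i++)
    if (base[i] < minval)
      {
        minval = base[i];
        pos = i + 1;          /* Fortran positions are 1-based */
      }
  return pos;
}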
[Garbled payload resumes: minloc0_8_r8.o. Recoverable content: the embedded runtime error strings (byte arrays decoding to "Rank of array needs to be > 0", "MINLOC", a "MASK argument ..." message and a "Funny sized logical array ..." message), then the PTX bodies of _gfortran_minloc0_8_r8, _gfortran_mminloc0_8_r8 and _gfortran_sminloc0_8_r8 from libgfortran/generated/minloc0_8_r8.c. They mirror the _4_ routines above, except that result positions are stored as 8-byte integers (st.u64 rather than st.u32). The span ends with the archive header and global declarations of minloc0_16_r8.o (source libgfortran/generated/minloc0_16_r8.c).]
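The mminloc0 entry points additionally take a MASK array: in the original PTX each comparison is preceded by an ld.s8 of the corresponding mask byte, so only elements whose mask is non-zero participate. The sminloc0 wrappers take a scalar mask instead and appear to call the unmasked routine when it is true, zero-filling the result otherwise. A hedged C sketch of the masked scan, under the same simplifying assumptions as minloc0_sketch above:

#include <math.h>
#include <stddef.h>

/* Hypothetical sketch of the masked scan behind mminloc0: identical to the
   unmasked scan except that an element is considered only when its logical
   mask byte is non-zero. */
static ptrdiff_t mminloc0_sketch (const double *base, const signed char *mask,
                                  ptrdiff_t n)
{
  double minval = INFINITY;
  ptrdiff_t pos = 0;
  for (ptrdiff_t i = 0; i < n; i++)
    if (mask[i] != 0 && base[i] < minval)
      {
        minval = base[i];
        pos = i + 1;
      }
  return pos;
}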
[Garbled payload resumes: minloc0_16_r8.o, with the same error strings and the PTX bodies of _gfortran_minloc0_16_r8, _gfortran_mminloc0_16_r8 and _gfortran_sminloc0_16_r8 from libgfortran/generated/minloc0_16_r8.c; here each result position is written as a 16-byte integer (paired st.u64 stores, the high word produced by a shr.s64 ...,63 sign extension). The archive then carries member headers for minloc0_4_r10.o and minloc0_8_r16.o (timestamps beginning 1622802195, no function bodies visible in this extract) and opens minloc1_4_i1.o, whose header declares _gfortran_minloc1_4_i1, _gfortran_mminloc1_4_i1 and _gfortran_sminloc1_4_i1 (source libgfortran/generated/minloc1_4_i1.c).]
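Unlike the minloc0 family, these minloc1 routines take a DIM argument and reduce along a single dimension, writing one 1-based position per element of the reduced shape (the PTX below seeds its running minimum with 127, the largest integer(1) value, and stores 0 for an empty extent). A rough C sketch of that reduction for a Fortran-order (column-major) n-by-m integer(1) array along its first dimension, with hypothetical names and no descriptor or MASK handling:

#include <stddef.h>

/* Hypothetical sketch of a minloc1-style reduction over dimension 1 of a
   column-major n-by-m array of signed chars: result[j] receives the 1-based
   row of the first minimum in column j, or 0 when n == 0. */
static void minloc1_dim1_sketch (const signed char *a, ptrdiff_t n,
                                 ptrdiff_t m, int *result)
{
  for (ptrdiff_t j = 0; j < m; j++)
    {
      signed char minval = 0;
      int pos = 0;
      for (ptrdiff_t i = 0; i < n; i++)
        {
          signed char v = a[j * n + i];
          if (pos == 0 || v < minval)
            {
              minval = v;
              pos = (int) (i + 1);
            }
        }
      result[j] = pos;
    }
}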
[Garbled payload continues: minloc1_4_i1.o. Recoverable content: extern declarations for _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents; error strings decoding to "rank of return array incorrect in MINLOC intrinsic: is %ld, should be %ld", "MINLOC", "... incorrect in MINLOC intrinsic: is %ld, should be between 1 and %ld", "Funny sized logical array ...", a "MASK argument ..." message and "Incorrect extent in return value of MINLOC intrinsic in dimension %ld: is %ld, should be %ld"; then the PTX body of _gfortran_minloc1_4_i1 (a reduction of an integer(1) array along one dimension into integer(4) positions, seeding the running minimum with 127 and zero-filling the result when the reduced extent is empty) and the opening of the _gfortran_mminloc1_4_i1 body, which continues below.]
%r450 bra $L69; bra $L99; $L54_gfortran_sminloc1_4_i1 .visible .func _gfortran_sminloc1_4_, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u32 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L124; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L125; $L124:r152154; call _gfortran_minloc1_4_500 7 bra $L123; $L125: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r296,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L127; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L128; $L134: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L129; bra $L182; $L127: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error128: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r302,0; $L133: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L131; .loc 1 515 17 st.u64 [%r139],%r36; bra $L132; $L131: .loc 1 518 12 st.u64 [%r139],%r302; $L132: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L133; bra $L134; $L182: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L136; bra $L135; $L129: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r301,0; $L139: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L137; .loc 1 523 17 st.u64 [%r43],%r41; bra $L138; $L137: .loc 1 527 12 st.u64 [%r43],%r301; $L138: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L139; bra $L183; $L135: .loc 1 534 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L141; add.u64 %r295,%frame,120; $L161: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L142; $L141: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L143; bra $L184; $L145: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L142: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L145; bra $L141; $L143: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L123; $L184: .loc 1 557 2496246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L149: .loc 1 583 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L146; bra $L147; $L136: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L148; .loc 1 562 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error148: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L149; .loc 1 569 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L147; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L151: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L150; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error150: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L151; $L146: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ %r268 bra 
$L152; shl.b64 %r266,%r296,3; bra $L153; $L152: mov.u64 %r266,8; $L153: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L154: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L154; $L147: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,2; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 593 13 mov.u32 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L160: st.u32 [%r98],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L160; .loc 1 606 7 @ ! %r294 bra $L185; bra $L123; $L159: .loc 1 601 13 st.u64 [%r99],%r297; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L158; bra $L123; $L185: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; .loc 1 601 13 mov.u64 %r297,0; $L158: .loc 1 610 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,2; add.u64 %r98,%r98,%r291; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L159; .loc 1 601 13 mov.u64 %r80,0; bra $L160; $L183: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r293,%r98,0; @ ! 
$L123: .loc 1 615 1 ret; }
minloc1_8_i1.o/
.file 1 "../../../libgfortran/generated/minloc1_8_i1.c"
// BEGIN GLOBAL FUNCTION DECL: _gfortran_minloc1_8_i1 .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_8_i1 .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_sminloc1_8_i1 .visible .func
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarray .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents .extern .func
// String constants (.b8 byte arrays; decoded, truncated where the payload cuts off):
//   "rank of return array incorrect in MINLOC intrinsic: is %ld, should be %l..."
//   "MINLOC"
//   "...incorrect in MINLOC intrinsic: is %ld, should be between 1 and %l..."
//   "Funny sized logical array"
//   "rank of return array incorrect in MINLOC intri..."
//   "MASK argumen..."
//   "Incorrect extent in return value of MINLOC intrinsic in dimension %ld: is %ld, should be %l..."
//   "return val..."
// PTX bodies of _gfortran_minloc1_8_i1, _gfortran_mminloc1_8_i1 and
// _gfortran_sminloc1_8_i1 (.loc references into the generated source; register
// declaration lists and scattered instruction runs are missing from this payload).
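The three entry points per element kind follow libgfortran's naming pattern: _gfortran_minloc1_N_i1 reduces along one dimension, the m-variant takes a LOGICAL mask array, and the s-variant takes a scalar mask and either delegates to the unmasked routine or zero-fills the result (the delegation calls to _gfortran_minloc1_... are visible in the listing above). As a rough illustration of what the unmasked reduction computes, here is a minimal C sketch for a rank-2 int8_t array; the names minloc_dim1, nrow, ncol and so on are illustrative assumptions, not libgfortran's internal API, which operates on array descriptors of arbitrary rank.

```c
/* Minimal sketch (not the libgfortran source): a MINLOC reduction along
 * the first dimension of a rank-2 int8_t array, column-major as in Fortran. */
#include <stdint.h>
#include <stdio.h>

/* For each column j, store in result[j] the 1-based row index of the
 * smallest element (first occurrence wins), or 0 if the extent is empty. */
static void minloc_dim1(const int8_t *a, int64_t nrow, int64_t ncol,
                        int64_t *result)
{
    for (int64_t j = 0; j < ncol; j++) {
        int64_t pos = 0;
        int8_t best = INT8_MAX;
        for (int64_t i = 0; i < nrow; i++) {
            int8_t v = a[j * nrow + i];   /* column-major element (i, j) */
            if (pos == 0 || v < best) {
                best = v;
                pos = i + 1;              /* Fortran indices are 1-based */
            }
        }
        result[j] = pos;
    }
}

int main(void)
{
    const int8_t a[2 * 3] = { 5, -1,  7, 7,  0, -3 };   /* 2 rows, 3 columns */
    int64_t loc[3];
    minloc_dim1(a, 2, 3, loc);
    for (int j = 0; j < 3; j++)
        printf("column %d: minloc = %lld\n", j + 1, (long long)loc[j]);
    return 0;
}
```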
minloc1_16_i1.o/
.file 1 "../../../libgfortran/generated/minloc1_16_i1.c"
// BEGIN GLOBAL FUNCTION DECL: _gfortran_minloc1_16_i1 .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_16_i1 .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_sminloc1_16_i1 .visible .func
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarray .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents .extern .func
// String constants: the same decoded MINLOC diagnostic texts as in minloc1_8_i1.o above.
// PTX body of _gfortran_minloc1_16_i1 (result elements are 16 bytes wide: each
// index is stored as two 64-bit words and the element stride uses shl.b64 ...,4).
// PTX body of _gfortran_mminloc1_16_i1 (continues):
%r479 bra $L89; $L102: .loc 1 421 9 ld.s8 %r227,[%r165]; setp.eq.u32 %r600,%r227,0; bra $L93; $L90: add.u64 %r563,%r229,1; shr.s64 %r564,%r563,63; sub.u64 %r483,%r606,%r229; shr.s64 %r486,%r229,63; add.u64 %r489,%r229,2; set.u32.lt.u64 %r494,%r489,%r229; cvt.s64.s32 %r492,%r494; sub.u64 %r498,%r486,%r492; add.u64 %r500,%r483,%r489; set.u32.lt.u64 %r505,%r500,%r483; cvt.s64.s32 %r503,%r505; sub.u64 %r509,%r498,%r503; .loc 1 428 10 mov.u64 %r593,1; $L116: .loc 1 430 10 ld.s8 %r511,[%r165]; cvt.u16.u32 %r510,%r511; setp.eq.u16 %r512,%r510,0; @ %r512 bra $L94; .loc 1 430 20 ld.s8 %r95,[%r164]; .loc 1 430 17 cvt.u16.u32 %r513,%r95; cvt.u16.u32 %r514,%r143; setp.ge.s16 %r515,%r513,%r514; .loc 1 433 11 selp.u64 %r559,%r559,%r563,%r515; selp.u64 %r560,%r560,%r564,%r515; selp.u32 %r143,%r143,%r95,%r515; $L94: .loc 1 428 35 add.u64 %r164,%r164,%r155; .loc 1 428 50 add.u64 %r165,%r165,%r44; .loc 1 428 10 add.u64 %r518,%r563,%r593; set.u32.lt.u64 %r523,%r518,%r563; cvt.s64.s32 %r521,%r523; mov.u64 %r563,%r518; sub.u64 %r564,%r564,%r521; setp.ne.u64 %r530,%r500,%r563; @ %r530 bra $L116; setp.ne.u64 %r533,%r509,%r564; @ %r533 bra $L116; $L89: .loc 1 436 8 st.u64 [%r171],%r559; st.u64 [%r171+8],%r560; .loc 1 439 15 add.u64 %r188,%r188,1; .loc 1 440 12 add.u64 %r169,%r169,%r101; .loc 1 441 13 add.u64 %r170,%r170,%r103; .loc 1 442 12 add.u64 %r171,%r171,%r106; .loc 1 444 13 setp.ne.u64 %r536,%r29,%r188; @ %r536 bra $L101; .loc 1 452 10 add.u64 %r166,%r170,%r92; .loc 1 453 9 add.u64 %r167,%r171,%r182; .loc 1 455 7 @ %r597 bra $L54; add.u64 %r220,%frame,488; .loc 1 451 23 mov.u64 %r110,%r39; .loc 1 455 7 mov.u64 %r219,8; .loc 1 454 5 mov.u64 %r168,1; add.u64 %r601,%frame,120; .loc 1 448 13 mov.u64 %r604,0; bra $L98; $L99: st.u64 [%r220],%r604; .loc 1 451 23 mul.lo.u64 %r110,%r122,%r121; .loc 1 452 24 mul.lo.u64 %r542,%r124,%r121; .loc 1 452 10 sub.u64 %r166,%r170,%r542; .loc 1 453 23 mul.lo.u64 %r543,%r126,%r121; .loc 1 453 9 shl.b64 %r544,%r543,4; sub.u64 %r167,%r171,%r544; .loc 1 454 5 add.u64 %r168,%r168,1; .loc 1 455 7 add.u64 %r220,%r220,8; add.u64 %r219,%r219,8; setp.eq.u64 %r545,%r168,%r609; @ %r545 bra $L54; $L98: .loc 1 463 16 ld.u64 %r546,[%r220]; add.u64 %r121,%r546,1; st.u64 [%r220],%r121; .loc 1 464 23 add.u64 %r548,%r610,%r219; ld.u64 %r122,[%r548]; .loc 1 464 13 sub.u64 %r549,%r122,%r110; add.u64 %r169,%r169,%r549; .loc 1 465 24 add.u64 %r550,%frame,%r219; ld.u64 %r124,[%r550]; .loc 1 465 14 add.u64 %r170,%r166,%r124; .loc 1 466 23 add.u64 %r552,%r601,%r219; ld.u64 %r126,[%r552]; .loc 1 466 13 shl.b64 %r553,%r126,4; add.u64 %r171,%r167,%r553; .loc 1 444 32 add.u64 %r555,%r611,%r219; ld.u64 %r129,[%r555]; .loc 1 444 13 setp.eq.u64 %r556,%r121,%r129; @ %r556 bra $L99; .loc 1 448 13 mov.u64 %r188,0; bra $L101; $L122: .loc 1 322 6 ld.u64 %r557,[%r232]; setp.eq.u64 %r558,%r557,0; @ ! 
%r558 bra $L69; bra $L100; $L54_gfortran_sminloc1_16_i1 .visible .func _gfortran_sminloc1_16_, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L126; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L127; $L126:r152154; call _gfortran_minloc1_16_500 7 bra $L125; $L127: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r297,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L129; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L130; $L136: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L131; bra $L184; $L129: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error130: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r303,0; $L135: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L133; .loc 1 515 17 st.u64 [%r139],%r36; bra $L134; $L133: .loc 1 518 12 st.u64 [%r139],%r303; $L134: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L135; bra $L136; $L184: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L138; bra $L137; $L131: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r302,0; $L141: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L139; .loc 1 523 17 st.u64 [%r43],%r41; bra $L140; $L139: .loc 1 527 12 st.u64 [%r43],%r302; $L140: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L141; bra $L185; $L137: .loc 1 534 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L143; add.u64 %r296,%frame,120; $L163: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L144; $L143: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L145; bra $L186; $L147: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L144: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L147; bra $L143; $L145: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L125; $L186: .loc 1 557 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L151: .loc 1 583 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L148; bra $L149; $L138: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L150; .loc 1 562 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error150: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L151; .loc 1 569 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L149; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L153: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L152; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error152: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L153; $L148: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L154; shl.b64 %r266,%r297,3; bra $L155; $L154: mov.u64 %r266,8; $L155: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L156: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L156; $L149: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,4; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 593 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L162: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L162; .loc 1 606 7 @ ! %r295 bra $L187; bra $L125; $L161: .loc 1 601 13 st.u64 [%r99],%r280; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L160; bra $L125; $L187: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; $L160: .loc 1 610 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L161; .loc 1 601 13 mov.u64 %r80,0; bra $L162; $L185: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r294,%r98,0; @ ! 
%r294 bra $L138; bra $L163; $L125: .loc 1 615 1 ret; } minloc1_4_i2.o/_gfortran_minloc1_4_i2 .visible .func _gfortran_minloc1_4_i2.file 1 "../../../libgfortran/generated/minloc1_4_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_4_i2 .visible .func _gfortran_mminloc1_4__gfortran_sminloc1_4_i2 .visible .func _gfortran_sminloc1_4_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_4_i2 .visible .func _gfortran_minloc1_4_i2480predpred %r238; .reg .predpred %r262; .reg .pred %r264; .reg .pred %r265; .reg .u64u64predu64 %r318; .reg .u64pred %r328; .reg .pred %r329; .reg .u32 %r331; .reg .u16 %r333; .reg .u16 %r334; .reg .pred %r335; .reg .u32 %r336; .reg .pred %r337; .reg .u16 %r338; .reg .u16mov.u64 %r188,%ar0; mov.u64 %r189,%ar1; mov.u64 %r190,%ar2; mov.u32 %r191,%ar3; .loc 1 58 10 ld.s8 %r22,[%r189+28]; .loc 1 58 38 add.u32 %r192,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r118,%r192; .loc 1 59 10 ld.u64 %r25,[%r190]; .loc 1 59 7 add.u64 %r119,%r25,-1; .loc 1 61 7 shr.u64 %r194,%r119,63; set.u32.lt.s64 %r196,%r118,%r119; neg.s32 %r197,%r196; cvt.u16.u64 %r200,%r194; cvt.u16.u32 %r201,%r197; or.b16 %r199,%r200,%r201; .loc 1 61 6 cvt.u32.u16 %r202,%r199; cvt.u16.u8 %r203,%r202; setp.eq.u16 %r204,%r203,0; @ %r204 bra $L2; .loc 1 63 7 cvt.u32.u32 %r207,%r22; cvt.s64.s8 %r206,%r207; st.u64 [%stack+8],%r20605205_gfortran_runtime_errorr210,%r119,%r119; add.u64 %r211,%r210,%r119; shl.b64 %r212,%r211,3; add.u64 %r213,%r189,%r212; ld.u64 %r30,[%r213+56]; ld.u64 %r32,[%r213+48]; .loc 1 71 9 ld.u64 %r122,[%r213+40]; .loc 1 73 3 setp.ne.u64 %r227,%r119,0; @ %r227 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r228,%r118,%r119; @ %r228 bra $L4; bra $L52; $L3: add.u64 %r138,%r189,40; add.u64 %r186,%frame,120; add.u64 %r185,%frame,240; add.u64 %r229,%r189,16; add.u64 %r231,%r25,%r25; add.u64 %r232,%r231,%r25; shl.b64 %r233,%r232,3; add.u64 %r145,%r229,%r233; .loc 1 79 12 mov.u64 %r378,0; $L8: .loc 1 75 18 ld.u64 %r234,[%r138]; st.u64 [%r186],%r234; .loc 1 76 19 ld.u64 %r236,[%r138+16]; add.u64 
%r235,%r236,1; ld.u64 %r237,[%r138+8]; sub.u64 %r37,%r235,%r237; .loc 1 78 10 setp.lt.s64 %r238,%r37,0; @ %r238 bra $L6; .loc 1 76 17 st.u64 [%r185],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 [%r185],%r378; $L7: .loc 1 73 3 add.u64 %r138,%r138,24; add.u64 %r186,%r186,8; add.u64 %r185,%r185,8; setp.ne.u64 %r240,%r138,%r145; @ %r240 bra $L8; bra $L9; $L52: .loc 1 90 6 ld.u64 %r241,[%r188]; setp.eq.u64 %r242,%r241,0; @ ! %r242 bra $L11; bra $L10; $L4: add.u64 %r244,%r25,%r25; add.u64 %r245,%r244,%r25; shl.b64 %r246,%r245,3; add.u64 %r247,%r246,40; add.u64 %r114,%r189,%r247; shl.b64 %r248,%r25,3; add.u64 %r102,%r248,-8; add.u64 %r366,%frame,120; add.u64 %r107,%r366,%r102; add.u64 %r365,%frame,240; add.u64 %r100,%r365,%r102; cvt.u32.u32 %r252,%r22; cvt.s64.s8 %r251,%r252; add.u64 %r254,%r251,%r251; add.u64 %r255,%r254,%r251; shl.b64 %r256,%r255,3; add.u64 %r257,%r189,40; add.u64 %r143,%r256,%r257; .loc 1 87 12 mov.u64 %r377,0; $L14: .loc 1 83 18 ld.u64 %r258,[%r114]; st.u64 [%r107],%r258; .loc 1 84 19 ld.u64 %r260,[%r114+16]; add.u64 %r259,%r260,1; ld.u64 %r261,[%r114+8]; sub.u64 %r43,%r259,%r261; .loc 1 86 10 setp.lt.s64 %r262,%r43,0; @ %r262 bra $L12; .loc 1 84 17 st.u64 [%r100],%r43; bra $L13; $L12: .loc 1 87 12 st.u64 [%r100],%r377; $L13: .loc 1 81 3 add.u64 %r114,%r114,24; add.u64 %r107,%r107,8; add.u64 %r100,%r100,8; setp.ne.u64 %r264,%r114,%r143; @ %r264 bra $L14; bra $L53; $L10: .loc 1 94 7 setp.le.s64 %r265,%r118,0; @ %r265 bra $L16; add.u64 %r365,%frame,240; $L39: add.u64 %r155,%r188,40; mov.u64 %r151,%r365; cvt.u32.u32 %r267,%r22; cvt.s64.s8 %r266,%r267; add.u64 %r269,%r266,%r266; add.u64 %r270,%r269,%r266; shl.b64 %r271,%r270,3; add.u64 %r272,%r188,16; add.u64 %r115,%r271,%r272; .loc 1 97 10 mov.u64 %r105,1; .loc 1 101 4 mov.u64 %r295,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r273,0; st.u64 [%r188+8],%r273; .loc 1 106 28 cvt.u16.u32 %r276,%r22; add.u16 %r275,%r276,-1; cvt.u32.u16 %r277,%r275; st.u8 [%r188+28],%r277; .loc 1 108 20 add.u32 %r278,%r22,-2; cvt.s64.s32 %r52,%r278; add.u64 %r280,%r52,%r52; add.u64 %r281,%r280,%r52; shl.b64 %r282,%r281,3; add.u64 %r283,%r188,%r282; .loc 1 108 67 shl.b64 %r285,%r52,3; add.u64 %r286,%frame,%r285; .loc 1 108 59 ld.u64 %r288,[%r283+40]; ld.u64 %r289,[%r286+240]; mul.lo.u64 %r124,%r288,%r289; .loc 1 110 29291; call (%value_in),_gfortrani_xmallocarray292,[%value_in]; } .loc 1 110 27 st.u64 [%r188],%r292; .loc 1 111 10 setp.eq.u64 %r294,%r124,0; @ ! 
%r294 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r105,%r48,%r105; $L17: .loc 1 101 4 st.u64 [%r155+8],%r295; ld.u64 %r48,[%r151]; add.u64 %r296,%r48,-1; st.u64 [%r155+16],%r296; st.u64 [%r155],%r105; .loc 1 94 7 add.u64 %r155,%r155,24; add.u64 %r151,%r151,8; setp.ne.u64 %r297,%r115,%r155; @ %r297 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r188+48],%r124; mov.u64 %r299,-1; st.u64 [%r188+56],%r299; mov.u64 %r300,1; st.u64 [%r188+40],%r300; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r59,[%r188+28]; .loc 1 121 10 setp.eq.u64 %r301,%r59,%r118; @ %r301 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r59;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r304,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r305,[%r304+36]; setp.eq.u32 %r306,%r305,0; @ %r306 bra $L21; .loc 1 128 2 add.u64 %r365,%frame,240; cvta.const.u64 %r310,$LC2r309310; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r312,%r118,0; @ %r312 bra $L24; $L27: .loc 1 68 9 add.u64 %r313,%r30,1; .loc 1 68 7 sub.u64 %r120,%r313,%r32; max.s64 %r86,%r120,0; .loc 1 140 8 ld.u64 %r134,[%r189]; .loc 1 141 8 ld.u64 %r135,[%r188]; .loc 1 190 33 add.u64 %r75,%r122,%r122; ld.u64 %r153,[%frame+360]; .loc 1 204 22 ld.u64 %r77,[%frame+120]; .loc 1 204 12 add.u64 %r78,%r77,%r77; .loc 1 205 22 ld.u64 %r80,[%frame]; .loc 1 205 12 shl.b64 %r81,%r80,2; .loc 1 207 32 ld.u64 %r68,[%frame+240]; .loc 1 214 23 mul.lo.u64 %r316,%r68,%r77; .loc 1 214 9 add.u64 %r317,%r316,%r316; neg.s64 %r142,%r317; .loc 1 215 23 mul.lo.u64 %r318,%r68,%r80; .loc 1 215 9 shl.b64 %r319,%r318,2; neg.s64 %r140,%r319; setp.gt.s64 %r367,%r120,0; setp.le.s64 %r368,%r118,1; .loc 1 180 9 setp.eq.u32 %r370,%r191,0; .loc 1 155 9 mov.u32 %r371,32767; cvt.u32.u32 %r372,%r22; cvt.s64.s8 %r373,%r372; add.u64 %r374,%r373,-1; add.u64 %r375,%frame,120; add.u64 %r376,%frame,240; bra $L25; $L24: add.u64 %r171,%frame,360; add.u64 %r170,%r188,40; mov.u64 %r168,%frame; cvt.u32.u32 %r321,%r22; cvt.s64.s8 %r320,%r321; add.u64 %r159,%r320,-1; .loc 1 132 3 mov.u64 %r136,0; add.u64 %r365,%frame,240; .loc 1 134 16 mov.u64 %r322,%r136; $L26: st.u64 [%r171],%r322; .loc 1 135 18 ld.u64 %r323,[%r170]; st.u64 [%r168],%r323; .loc 1 136 17 shl.b64 %r325,%r136,3; add.u64 %r326,%r365,%r325; .loc 1 136 10 ld.u64 %r327,[%r326]; setp.le.s64 %r328,%r327,0; @ %r328 bra $L1; .loc 1 132 26 add.u64 %r136,%r136,1; .loc 1 132 3 add.u64 %r171,%r171,8; add.u64 %r170,%r170,24; add.u64 %r168,%r168,8; setp.ne.u64 %r329,%r136,%r159; @ %r329 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r367 bra $L28; .loc 1 159 10 mov.u32 %r331,0; st.u32 [%r135],%r331; bra $L29; $L28: .loc 1 180 9 @ %r370 bra $L40; mov.u64 %r129,%r134; .loc 1 155 9 mov.u32 %r108,%r371; .loc 1 157 9 mov.u32 %r106,1; .loc 1 178 8 mov.u64 %r128,0; $L32: .loc 1 183 9 ld.s16 %r67,[%r129]; .loc 1 183 8 cvt.u16.u32 %r333,%r67; cvt.u16.u32 %r334,%r108; setp.gt.s16 %r335,%r333,%r334; @ %r335 bra $L31; cvt.u32.u64 %r336,%r128; add.u32 %r106,%r336,1; mov.u32 %r108,%r67; $L31: .loc 1 181 25 add.u64 %r128,%r128,1; .loc 1 181 33 add.u64 %r129,%r129,%r75; .loc 1 181 8 setp.gt.s64 %r337,%r86,%r128; @ %r337 bra $L32; bra $L33; $L40: mov.u64 %r126,%r134; .loc 1 155 9 mov.u32 %r110,%r371; .loc 1 157 9 mov.u32 %r106,1; .loc 1 178 8 mov.u64 %r104,0; $L30: .loc 1 192 9 ld.s16 %r72,[%r126]; .loc 1 192 8 cvt.u16.u32 %r338,%r72; cvt.u16.u32 %r339,%r110; setp.ge.s16 %r340,%r338,%r339; @ %r340 bra $L34; cvt.u32.u64 %r341,%r104; add.u32 %r106,%r341,1; mov.u32 %r110,%r72; $L34: .loc 1 190 25 add.u64 %r104,%r104,1; .loc 1 190 33 
add.u64 %r126,%r126,%r75; .loc 1 190 8 setp.gt.s64 %r342,%r86,%r104; @ %r342 bra $L30; $L33: .loc 1 199 12 st.u32 [%r135],%r106; $L29: .loc 1 203 15 add.u64 %r153,%r153,1; .loc 1 204 12 add.u64 %r134,%r134,%r78; .loc 1 205 12 add.u64 %r135,%r135,%r81; .loc 1 207 13 setp.ne.u64 %r343,%r68,%r153; @ %r343 bra $L25; .loc 1 214 9 add.u64 %r131,%r134,%r142; .loc 1 215 9 add.u64 %r132,%r135,%r140; .loc 1 217 7 @ %r368 bra $L1; add.u64 %r156,%frame,368; mov.u64 %r117,8; .loc 1 216 5 mov.u64 %r133,1; .loc 1 211 13 mov.u64 %r369,0; bra $L37; $L38: st.u64 [%r156],%r369; .loc 1 214 23 mul.lo.u64 %r349,%r93,%r92; .loc 1 214 9 add.u64 %r350,%r349,%r349; sub.u64 %r131,%r134,%r350; .loc 1 215 23 mul.lo.u64 %r351,%r96,%r92; .loc 1 215 9 shl.b64 %r352,%r351,2; sub.u64 %r132,%r135,%r352; .loc 1 216 5 add.u64 %r133,%r133,1; .loc 1 217 7 add.u64 %r156,%r156,8; add.u64 %r117,%r117,8; setp.eq.u64 %r353,%r133,%r374; @ %r353 bra $L1; $L37: .loc 1 225 16 ld.u64 %r354,[%r156]; add.u64 %r92,%r354,1; st.u64 [%r156],%r92; .loc 1 226 23 add.u64 %r356,%r375,%r117; ld.u64 %r93,[%r356]; .loc 1 226 13 add.u64 %r357,%r93,%r93; add.u64 %r134,%r131,%r357; .loc 1 227 23 add.u64 %r358,%frame,%r117; ld.u64 %r96,[%r358]; .loc 1 227 13 shl.b64 %r359,%r96,2; add.u64 %r135,%r132,%r359; .loc 1 207 32 add.u64 %r361,%r376,%r117; ld.u64 %r99,[%r361]; .loc 1 207 13 setp.eq.u64 %r362,%r92,%r99; @ %r362 bra $L38; .loc 1 211 13 mov.u64 %r153,0; bra $L25; $L53: .loc 1 90 6 ld.u64 %r363,[%r188]; setp.eq.u64 %r364,%r363,0; @ ! %r364 bra $L11; bra $L39; $L1:_gfortran_mminloc1_4_i2 .visible .func _gfortran_mminloc1_4_, .param .u32 %in_ar4) {60898u16 %r238; .reg .u16u64 %r303; .reg .u64 %r304; .reg .u64 %r305; .reg .u64 %r306; .reg .u64pred %r317; .reg .u64 %r318; .reg .pred %r319; .reg .u64u32u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64u64predu32 %r385; .reg .predu64 %r394; .reg .predu64pred %r407; .reg .u64 %r410; .reg .u64 %r411; .reg .u64 %r412; .reg .u32 %r414; .reg .pred %r416; .reg .pred %r417; .reg .predpred %r423; .reg .u16 %r424; .reg .u32 %r425; .reg .predpred %r429; .reg .pred %r430; .reg .predpred %r461; .reg .u64 %r462; .reg .pred.reg .u64 %r471; mov.u64 %r221,%ar0; mov.u64 %r222,%ar1; mov.u64 %r223,%ar2; mov.u64 %r224,%ar3; mov.u32 %r225,%ar4; .loc 1 261 6 setp.ne.u64 %r226,%r224,0; @ %r226 bra $L223225; call _gfortran_minloc1_4_268 7 bra $L54; $L55: .loc 1 271 10 ld.u64 %r22,[%r223]; .loc 1 271 7 add.u64 %r151,%r22,-1; .loc 1 272 10 ld.s8 %r23,[%r222+28]; .loc 1 272 38 add.u32 %r231,%r23,-1; .loc 1 272 8 cvt.s64.s32 %r152,%r231; .loc 1 275 7 shr.u64 %r233,%r151,63; set.u32.gt.s64 %r235,%r151,%r152; neg.s32 %r236,%r235; cvt.u16.u64 %r239,%r233; cvt.u16.u32 %r240,%r236; or.b16 %r238,%r239,%r240; .loc 1 275 6 cvt.u32.u16 %r241,%r238; cvt.u16.u8 %r242,%r241; setp.eq.u16 %r243,%r242,0; @ %r243 bra $L57; .loc 1 277 7 cvt.u32.u32 %r246,%r23; cvt.s64.s8 %r245,%r246; st.u64 [%stack+8],%r245; st.u64 [%stack],%r22; cvta.const.u64 %r244244_gfortran_runtime_error7: .loc 1 282 9 add.u64 %r456,%r151,%r151; add.u64 %r457,%r456,%r151; shl.b64 %r251,%r457,3; add.u64 %r252,%r222,%r251; ld.u64 %r255,[%r252+56]; add.u64 %r254,%r255,1; .loc 1 282 7 ld.u64 %r262,[%r252+48]; sub.u64 %r153,%r254,%r262; .loc 1 283 6 setp.le.s64 %r263,%r153,0; @ %r263 bra $L54; .loc 1 286 9 ld.u64 %r169,[%r224]; .loc 1 288 15 ld.u64 %r34,[%r224+16]; .loc 1 290 22 cvt.u32.u64 %r35,%r34; .loc 1 290 53 add.u32 %r264,%r35,-4; and.b32 %r265,%r264,-5; set.u32.eq.u32 %r267,%r265,0; neg.s32 %r268,%r267; .loc 1 290 22 add.u32 %r269,%r35,-1; set.u32.le.u32 %r271,%r269,1; 
neg.s32 %r272,%r271; .loc 1 290 6 cvt.u16.u32 %r274,%r268; cvt.u16.u32 %r275,%r272; or.loc 1 292 7 setp.ne.u32 %r280,%r35,16; @ %r280 bra $L60; $L59: .loc 1 299 9 shl.b64 %r284,%r457,3; add.u64 %r285,%r222,%r284; ld.u64 %r156,[%r285+40]; .loc 1 300 12 add.u64 %r291,%r224,%r284; ld.u64 %r41,[%r291+40]; .loc 1 302 3 setp.ne.u64 %r293,%r151,0; @ %r293 bra $L61; $L67: .loc 1 312 3 setp.lt.s64 %r294,%r151,%r152; @ %r294 bra $L62; bra $L119; $L60: .loc 1 297 5 cvta.const.u64 %r2955_gfortran_runtime_error61: add.u64 %r96,%r222,40; add.u64 %r68,%r224,40; add.u64 %r298,%r22,%r22; add.u64 %r299,%r298,%r22; shl.b64 %r300,%r299,3; add.u64 %r301,%r222,16; add.u64 %r39,%r300,%r301; .loc 1 302 3 mov.u64 %r139,0; add.u64 %r460,%frame,240; add.u64 %r459,%frame,360; .loc 1 309 12 mov.u64 %r471,%r139; $L66: .loc 1 304 18 add.u64 %r303,%r460,%r139; ld.u64 %r304,[%r96]; st.u64 [%r303],%r304; .loc 1 305 18 add.u64 %r305,%frame,%r139; .loc 1 305 20 ld.u64 %r307,[%r68]; mul.lo.u64 %r306,%r307,%r34; .loc 1 305 18 st.u64 [%r305],%r306; .loc 1 306 19 ld.u64 %r309,[%r96+16]; add.u64 %r308,%r309,1; ld.u64 %r310,[%r96+8]; sub.u64 %r53,%r308,%r310; .loc 1 308 10 setp.lt.s64 %r311,%r53,0; @ %r311 bra $L64; .loc 1 306 17 add.u64 %r313,%r459,%r139; st.u64 [%r313],%r53; bra $L65; $L64: .loc 1 309 12 add.u64 %r315,%r459,%r139; st.u64 [%r315],%r471; $L65: .loc 1 302 3 add.u64 %r96,%r96,24; add.u64 %r68,%r68,24; add.u64 %r139,%r139,8; setp.ne.u64 %r317,%r39,%r96; @ %r317 bra $L66; bra $L67; $L119: .loc 1 322 6 ld.u64 %r318,[%r221]; setp.eq.u64 %r319,%r318,0; @ ! %r319 bra $L69; bra $L68; $L62: add.u64 %r321,%r22,%r22; add.u64 %r322,%r321,%r22; shl.b64 %r323,%r322,3; add.u64 %r130,%r323,40; add.u64 %r134,%r222,%r130; add.u64 %r111,%r224,%r130; shl.b64 %r324,%r22,3; add.u64 %r58,%r324,-8; cvt.u32.u32 %r326,%r23; cvt.s64.s8 %r325,%r326; shl.b64 %r327,%r325,3; add.u64 %r69,%r327,-8; add.u64 %r460,%frame,240; add.u64 %r459,%frame,360; .loc 1 319 12 mov.u64 %r470,0; $L72: .loc 1 314 18 add.u64 %r329,%r460,%r58; ld.u64 %r330,[%r134]; st.u64 [%r329],%r330; .loc 1 315 18 add.u64 %r331,%frame,%r58; .loc 1 315 20 ld.u64 %r333,[%r111]; mul.lo.u64 %r332,%r333,%r34; .loc 1 315 18 st.u64 [%r331],%r332; .loc 1 316 19 ld.u64 %r335,[%r134+16]; add.u64 %r334,%r335,1; ld.u64 %r336,[%r134+8]; sub.u64 %r65,%r334,%r336; .loc 1 318 10 setp.lt.s64 %r337,%r65,0; @ %r337 bra $L70; .loc 1 316 17 add.u64 %r339,%r459,%r58; st.u64 [%r339],%r65; bra $L71; $L70: .loc 1 319 12 add.u64 %r341,%r459,%r58; st.u64 [%r341],%r470; $L71: .loc 1 312 3 add.u64 %r134,%r134,24; add.u64 %r111,%r111,24; add.u64 %r58,%r58,8; setp.ne.u64 %r343,%r69,%r58; @ %r343 bra $L72; bra $L120; $L68: .loc 1 326 7 setp.eq.u64 %r344,%r152,0; @ %r344 bra $L74; add.u64 %r459,%frame,360; $L99: add.u64 %r148,%r221,40; mov.u64 %r146,%r459; cvt.u32.u32 %r346,%r23; cvt.s64.s8 %r345,%r346; add.u64 %r348,%r345,%r345; add.u64 %r349,%r348,%r345; shl.b64 %r350,%r349,3; add.u64 %r351,%r221,16; add.u64 %r135,%r350,%r351; .loc 1 329 10 mov.u64 %r136,1; .loc 1 333 4 mov.u64 %r370,0; bra $L75; $L74: .loc 1 337 20 add.u32 %r352,%r23,-2; cvt.s64.s32 %r76,%r352; add.u64 %r354,%r76,%r76; add.u64 %r355,%r354,%r76; shl.b64 %r356,%r355,3; add.u64 %r357,%r221,%r356; .loc 1 337 67 shl.b64 %r359,%r76,3; add.u64 %r360,%frame,%r359; .loc 1 337 59 ld.u64 %r362,[%r357+40]; ld.u64 %r363,[%r360+360]; mul.lo.u64 %r160,%r362,%r363; .loc 1 339 24 mov.u64 %r364,0; st.u64 [%r221+8],%r364; .loc 1 340 28 cvt.u16.u32 %r367,%r23; add.u16 %r366,%r367,-1; cvt.u32.u16 %r368,%r366; st.u8 [%r221+28],%r368; .loc 1 342 10 setp.eq.u64 
%r369,%r160,0; @ %r369 bra $L76; bra $L121; $L78: .loc 1 331 47 mul.lo.u64 %r136,%r73,%r136; $L75: .loc 1 333 4 st.u64 [%r148+8],%r370; ld.u64 %r73,[%r146]; add.u64 %r371,%r73,-1; st.u64 [%r148+16],%r371; st.u64 [%r148],%r136; .loc 1 326 7 add.u64 %r148,%r148,24; add.u64 %r146,%r146,8; setp.ne.u64 %r372,%r135,%r148; @ %r372 bra $L78; bra $L74; $L76: .loc 1 345 4 st.u64 [%r221+48],%r160; mov.u64 %r374,-1; st.u64 [%r221+56],%r374; mov.u64 %r375,1; st.u64 [%r221+40],%r375; .loc 1 346 4 bra $L54; $L121: .loc 1 349 24call (%value_in),_gfortrani_xmallocarray78,[%value_in]; } .loc 1 349 22 st.u64 [%r221],%r378; bra $L79; $L69: .loc 1 354 19 ld.s8 %r380,[%r221+28]; .loc 1 354 10 setp.eq.u64 %r381,%r380,%r152; @ %r381 bra $L80; .loc 1 355 2 cvta.const.u64 %r382,$LC5; {stack; call _gfortran_runtime_error80: .loc 1 357 11 cvta.global.u64 %r384,_gfortrani_compile_options; .loc 1 357 10 ld.u32 %r385,[%r384+36]; setp.eq.u32 %r386,%r385,0; @ %r386 bra $L79; .loc 1 359 4 add.u64 %r459,%frame,360; cvta.const.u64 %r390,$LC2r459389390; call _gfortrani_bounds_ifunction_return361 4 cvta.const.u64 %r394,$LC222394390; call _gfortrani_bounds_equal_extents$L79: .loc 1 366 3 setp.ne.u64 %r396,%r152,0; @ %r396 bra $L81; $L84: .loc 1 374 8 ld.u64 %r170,[%r221]; .loc 1 375 8 ld.u64 %r168,[%r222]; .loc 1 377 9 setp.ne.u64 %r397,%r168,0; @ %r397 bra $L82; bra $L54; $L81: add.u64 %r198,%frame,480; add.u64 %r197,%r221,40; add.u64 %r195,%frame,120; cvt.u32.u32 %r399,%r23; cvt.s64.s8 %r398,%r399; add.u64 %r164,%r398,-1; .loc 1 366 3 mov.u64 %r171,0; add.u64 %r459,%frame,360; .loc 1 368 16 mov.u64 %r400,%r171; $L83: st.u64 [%r198],%r400; .loc 1 369 18 ld.u64 %r401,[%r197]; st.u64 [%r195],%r401; .loc 1 370 17 shl.b64 %r403,%r171,3; add.u64 %r404,%r459,%r403; .loc 1 370 10 ld.u64 %r405,[%r404]; setp.le.s64 %r406,%r405,0; @ %r406 bra $L54; .loc 1 366 26 add.u64 %r171,%r171,1; .loc 1 366 3 add.u64 %r198,%r198,8; add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; setp.ne.u64 %r407,%r164,%r171; @ %r407 bra $L83; bra $L84; $L82: .loc 1 300 12 mul.lo.u64 %r44,%r41,%r34; .loc 1 396 32 add.u64 %r91,%r156,%r156; ld.u64 %r187,[%frame+480]; .loc 1 440 22 ld.u64 %r101,[%frame+240]; .loc 1 440 12 add.u64 %r102,%r101,%r101; .loc 1 441 23 ld.u64 %r104,[%frame]; .loc 1 442 22 ld.u64 %r106,[%frame+120]; .loc 1 442 12 shl.b64 %r107,%r106,2; .loc 1 444 32 ld.u64 %r29,[%frame+360]; .loc 1 451 23 mul.lo.u64 %r38,%r29,%r101; .loc 1 452 24 mul.lo.u64 %r410,%r29,%r104; .loc 1 452 10 neg.s64 %r137,%r410; .loc 1 453 23 mul.lo.u64 %r411,%r29,%r106; .loc 1 453 9 shl.b64 %r412,%r411,2; neg.s64 %r188,%r412; setp.le.s64 %r458,%r152,1; .loc 1 418 9 setp.ne.u32 %r463,%r225,0; cvt.u32.u32 %r464,%r23; cvt.s64.s8 %r465,%r464; add.u64 %r466,%r465,-1; add.u64 %r467,%frame,240; add.u64 %r468,%frame,360; add.u64 %r469,%frame,120; $L100: .loc 1 366 3 mov.u64 %r163,%r169; mov.u64 %r162,%r168; .loc 1 396 9 mov.u64 %r161,0; $L88: .loc 1 399 7 ld.s8 %r220,[%r163]; .loc 1 399 6 setp.eq.u32 %r461,%r220,0; @ %r461 bra $L85; .loc 1 407 11 ld.s16 %r145,[%r162]; .loc 1 408 11 cvt.u32.u64 %r414,%r161; add.u32 %r143,%r414,1; .loc 1 418 9 @ %r463 bra $L86; bra $L122; $L85: .loc 1 396 24 add.u64 %r161,%r161,1; .loc 1 396 32 add.u64 %r162,%r162,%r91; .loc 1 396 47 add.u64 %r163,%r163,%r44; .loc 1 396 2 setp.ne.u64 %r416,%r153,%r161; @ %r416 bra $L88; .loc 1 395 9 mov.u32 %r143,%r220; bra $L89; $L122: .loc 1 428 10 setp.gt.s64 %r417,%r153,%r161; @ %r417 bra $L90; bra $L89; $L86: .loc 1 419 8 setp.le.s64 %r418,%r153,%r161; @ %r418 bra $L89; add.u64 %r205,%r161,1; $L92: .loc 1 421 8 @ 
%r461 bra $L91; .loc 1 421 18 ld.s16 %r93,[%r162]; .loc 1 421 15 cvt.u16.u32 %r420,%r93; cvt.u16.u32 %r421,%r145; setp.gt.s16 %r422,%r420,%r421; @ %r422 bra $L91; .loc 1 424 16 cvt.u32.u64 %r143,%r205; mov.u32 %r145,%r93; $L91: .loc 1 419 33 add.u64 %r162,%r162,%r91; .loc 1 419 48 add.u64 %r163,%r163,%r44; .loc 1 419 8 add.u64 %r206,%r205,1; setp.eq.u64 %r423,%r153,%r205; @ %r423 bra $L89; .loc 1 421 9 ld.s8 %r220,[%r163]; mov.u64 %r205,%r206; setp.eq.u32 %r461,%r220,0; bra $L92; $L90: add.u64 %r201,%r161,1; bra $L94; $L101: mov.u64 %r201,%r202; $L94: .loc 1 430 10 ld.s8 %r425,[%r163]; cvt.u16.u32 %r424,%r425; setp.eq.u16 %r426,%r424,0; @ %r426 bra $L93; .loc 1 430 20 ld.s16 %r98,[%r162]; .loc 1 430 17 cvt.u16.u32 %r427,%r98; cvt.u16.u32 %r428,%r145; setp.ge.s16 %r429,%r427,%r428; @ %r429 bra $L93; .loc 1 433 11 cvt.u32.u64 %r143,%r201; mov.u32 %r145,%r98; $L93: .loc 1 428 35 add.u64 %r162,%r162,%r91; .loc 1 428 50 add.u64 %r163,%r163,%r44; .loc 1 428 10 add.u64 %r202,%r201,1; setp.ne.u64 %r430,%r153,%r201; @ %r430 bra $L101; $L89: .loc 1 436 8 st.u32 [%r170],%r143; .loc 1 439 15 add.u64 %r187,%r187,1; .loc 1 440 12 add.u64 %r168,%r168,%r102; .loc 1 441 13 add.u64 %r169,%r169,%r104; .loc 1 442 12 add.u64 %r170,%r170,%r107; .loc 1 444 13 setp.ne.u64 %r431,%r29,%r187; @ %r431 bra $L100; .loc 1 452 10 add.u64 %r165,%r169,%r137; .loc 1 453 9 add.u64 %r166,%r170,%r188; .loc 1 455 7 @ %r458 bra $L54; add.u64 %r150,%frame,488; .loc 1 451 23 mov.u64 %r110,%r38; .loc 1 455 7 mov.u64 %r215,8; .loc 1 454 5 mov.u64 %r167,1; .loc 1 448 13 mov.u64 %r462,0; bra $L97; $L98: st.u64 [%r150],%r462; .loc 1 451 23 mul.lo.u64 %r110,%r123,%r122; .loc 1 452 24 mul.lo.u64 %r437,%r125,%r122; .loc 1 452 10 sub.u64 %r165,%r169,%r437; .loc 1 453 23 mul.lo.u64 %r438,%r127,%r122; .loc 1 453 9 shl.b64 %r439,%r438,2; sub.u64 %r166,%r170,%r439; .loc 1 454 5 add.u64 %r167,%r167,1; .loc 1 455 7 add.u64 %r150,%r150,8; add.u64 %r215,%r215,8; setp.eq.u64 %r440,%r167,%r466; @ %r440 bra $L54; $L97: .loc 1 463 16 ld.u64 %r441,[%r150]; add.u64 %r122,%r441,1; st.u64 [%r150],%r122; .loc 1 464 23 add.u64 %r443,%r467,%r215; ld.u64 %r123,[%r443]; .loc 1 464 13 sub.u64 %r444,%r123,%r110; add.u64 %r445,%r444,%r444; add.u64 %r168,%r168,%r445; .loc 1 465 24 add.u64 %r446,%frame,%r215; ld.u64 %r125,[%r446]; .loc 1 465 14 add.u64 %r169,%r165,%r125; .loc 1 466 23 add.u64 %r448,%r469,%r215; ld.u64 %r127,[%r448]; .loc 1 466 13 shl.b64 %r449,%r127,2; add.u64 %r170,%r166,%r449; .loc 1 444 32 add.u64 %r451,%r468,%r215; ld.u64 %r131,[%r451]; .loc 1 444 13 setp.eq.u64 %r452,%r122,%r131; @ %r452 bra $L98; .loc 1 448 13 mov.u64 %r187,0; bra $L100; $L120: .loc 1 322 6 ld.u64 %r453,[%r221]; setp.eq.u64 %r454,%r453,0; @ ! 
%r454 bra $L69; bra $L99; $L54_gfortran_sminloc1_4_i2 .visible .func _gfortran_sminloc1_4_, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u32 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L124; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L125; $L124:r152154; call _gfortran_minloc1_4_500 7 bra $L123; $L125: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r296,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L127; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L128; $L134: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L129; bra $L182; $L127: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error128: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r302,0; $L133: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L131; .loc 1 515 17 st.u64 [%r139],%r36; bra $L132; $L131: .loc 1 518 12 st.u64 [%r139],%r302; $L132: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L133; bra $L134; $L182: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L136; bra $L135; $L129: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r301,0; $L139: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L137; .loc 1 523 17 st.u64 [%r43],%r41; bra $L138; $L137: .loc 1 527 12 st.u64 [%r43],%r301; $L138: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L139; bra $L183; $L135: .loc 1 534 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L141; add.u64 %r295,%frame,120; $L161: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L142; $L141: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L143; bra $L184; $L145: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L142: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L145; bra $L141; $L143: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L123; $L184: .loc 1 557 2496246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L149: .loc 1 583 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L146; bra $L147; $L136: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L148; .loc 1 562 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error148: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L149; .loc 1 569 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L147; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L151: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L150; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error150: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L151; $L146: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ %r268 bra 
$L152; shl.b64 %r266,%r296,3; bra $L153; $L152: mov.u64 %r266,8; $L153: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L154: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L154; $L147: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,2; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 593 13 mov.u32 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L160: st.u32 [%r98],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L160; .loc 1 606 7 @ ! %r294 bra $L185; bra $L123; $L159: .loc 1 601 13 st.u64 [%r99],%r297; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L158; bra $L123; $L185: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; .loc 1 601 13 mov.u64 %r297,0; $L158: .loc 1 610 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,2; add.u64 %r98,%r98,%r291; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L159; .loc 1 601 13 mov.u64 %r80,0; bra $L160; $L183: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r293,%r98,0; @ ! 
%r293 bra $L136; bra $L161; $L123: .loc 1 615 1 ret; } minloc1_8_i2.o/_gfortran_minloc1_8_i2 .visible .func _gfortran_minloc1_8_i2.file 1 "../../../libgfortran/generated/minloc1_8_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_8_i2 .visible .func _gfortran_mminloc1_8__gfortran_sminloc1_8_i2 .visible .func _gfortran_sminloc1_8_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_8_i2 .visible .func _gfortran_minloc1_8_i2480predpredpred %r235; .reg .predu64 %r266; .reg .u64 %r267; .reg .u64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64 %r283; .reg .u64pred32u64pred %r328; .reg .pred %r329; .reg .u16 %r330; .reg .u16 %r331; .reg .predu64 %r335; .reg .u64mov.u64 %r185,%ar0; mov.u64 %r186,%ar1; mov.u64 %r187,%ar2; mov.u32 %r188,%ar3; .loc 1 58 10 ld.s8 %r22,[%r186+28]; .loc 1 58 38 add.u32 %r189,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r120,%r189; .loc 1 59 10 ld.u64 %r25,[%r187]; .loc 1 59 7 add.u64 %r121,%r25,-1; .loc 1 61 7 shr.u64 %r191,%r121,63; set.u32.lt.s64 %r193,%r120,%r121; neg.s32 %r194,%r193; cvt.u16.u64 %r197,%r191; cvt.u16.u32 %r198,%r194; or.b16 %r196,%r197,%r198; .loc 1 61 6 cvt.u32.u16 %r199,%r196; cvt.u16.u8 %r200,%r199; setp.eq.u16 %r201,%r200,0; @ %r201 bra $L2; .loc 1 63 7 cvt.u32.u32 %r204,%r22; cvt.s64.s8 %r203,%r204; st.u64 [%stack+8],%r20302202_gfortran_runtime_errorr207,%r121,%r121; add.u64 %r208,%r207,%r121; shl.b64 %r209,%r208,3; add.u64 %r210,%r186,%r209; ld.u64 %r30,[%r210+56]; ld.u64 %r32,[%r210+48]; .loc 1 71 9 ld.u64 %r124,[%r210+40]; .loc 1 73 3 setp.ne.u64 %r224,%r121,0; @ %r224 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r225,%r120,%r121; @ %r225 bra $L4; bra $L54; $L3: add.u64 %r47,%r186,40; add.u64 %r119,%frame,120; add.u64 %r137,%frame,240; add.u64 %r226,%r186,16; add.u64 %r228,%r25,%r25; add.u64 %r229,%r228,%r25; shl.b64 %r230,%r229,3; add.u64 %r46,%r226,%r230; .loc 1 79 12 mov.u64 %r373,0; $L8: .loc 1 75 18 ld.u64 %r231,[%r47]; st.u64 [%r119],%r231; .loc 1 76 19 ld.u64 %r233,[%r47+16]; add.u64 %r232,%r233,1; ld.u64 
%r234,[%r47+8]; sub.u64 %r37,%r232,%r234; .loc 1 78 10 setp.lt.s64 %r235,%r37,0; @ %r235 bra $L6; .loc 1 76 17 st.u64 [%r137],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 [%r137],%r373; $L7: .loc 1 73 3 add.u64 %r47,%r47,24; add.u64 %r119,%r119,8; add.u64 %r137,%r137,8; setp.ne.u64 %r237,%r46,%r47; @ %r237 bra $L8; bra $L9; $L54: .loc 1 90 6 ld.u64 %r238,[%r185]; setp.eq.u64 %r239,%r238,0; @ ! %r239 bra $L11; bra $L10; $L4: add.u64 %r241,%r25,%r25; add.u64 %r242,%r241,%r25; shl.b64 %r243,%r242,3; add.u64 %r244,%r243,40; add.u64 %r118,%r186,%r244; shl.b64 %r245,%r25,3; add.u64 %r108,%r245,-8; add.u64 %r361,%frame,120; add.u64 %r112,%r361,%r108; add.u64 %r360,%frame,240; add.u64 %r104,%r360,%r108; cvt.u32.u32 %r249,%r22; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r254,%r186,40; add.u64 %r64,%r253,%r254; .loc 1 87 12 mov.u64 %r372,0; $L14: .loc 1 83 18 ld.u64 %r255,[%r118]; st.u64 [%r112],%r255; .loc 1 84 19 ld.u64 %r257,[%r118+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r118+8]; sub.u64 %r43,%r256,%r258; .loc 1 86 10 setp.lt.s64 %r259,%r43,0; @ %r259 bra $L12; .loc 1 84 17 st.u64 [%r104],%r43; bra $L13; $L12: .loc 1 87 12 st.u64 [%r104],%r372; $L13: .loc 1 81 3 add.u64 %r118,%r118,24; add.u64 %r112,%r112,8; add.u64 %r104,%r104,8; setp.ne.u64 %r261,%r64,%r118; @ %r261 bra $L14; bra $L55; $L10: .loc 1 94 7 setp.le.s64 %r262,%r120,0; @ %r262 bra $L16; add.u64 %r360,%frame,240; $L39: add.u64 %r156,%r185,40; mov.u64 %r153,%r360; cvt.u32.u32 %r264,%r22; cvt.s64.s8 %r263,%r264; add.u64 %r266,%r263,%r263; add.u64 %r267,%r266,%r263; shl.b64 %r268,%r267,3; add.u64 %r269,%r185,16; add.u64 %r130,%r268,%r269; .loc 1 97 10 mov.u64 %r107,1; .loc 1 101 4 mov.u64 %r292,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r270,0; st.u64 [%r185+8],%r270; .loc 1 106 28 cvt.u16.u32 %r273,%r22; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r185+28],%r274; .loc 1 108 20 add.u32 %r275,%r22,-2; cvt.s64.s32 %r54,%r275; add.u64 %r277,%r54,%r54; add.u64 %r278,%r277,%r54; shl.b64 %r279,%r278,3; add.u64 %r280,%r185,%r279; .loc 1 108 67 shl.b64 %r282,%r54,3; add.u64 %r283,%frame,%r282; .loc 1 108 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r126,%r285,%r286; .loc 1 110 29 mov.u64 %r288288; call (%value_in),_gfortrani_xmallocarray289,[%value_in]; } .loc 1 110 27 st.u64 [%r185],%r289; .loc 1 111 10 setp.eq.u64 %r291,%r126,0; @ ! 
%r291 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r107,%r50,%r107; $L17: .loc 1 101 4 st.u64 [%r156+8],%r292; ld.u64 %r50,[%r153]; add.u64 %r293,%r50,-1; st.u64 [%r156+16],%r293; st.u64 [%r156],%r107; .loc 1 94 7 add.u64 %r156,%r156,24; add.u64 %r153,%r153,8; setp.ne.u64 %r294,%r130,%r156; @ %r294 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r185+48],%r126; mov.u64 %r296,-1; st.u64 [%r185+56],%r296; mov.u64 %r297,1; st.u64 [%r185+40],%r297; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r60,[%r185+28]; .loc 1 121 10 setp.eq.u64 %r298,%r60,%r120; @ %r298 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r120; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r301,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r302,[%r301+36]; setp.eq.u32 %r303,%r302,0; @ %r303 bra $L21; .loc 1 128 2 add.u64 %r360,%frame,240; cvta.const.u64 %r307,$LC2r360306307; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r309,%r120,0; @ %r309 bra $L24; $L27: .loc 1 68 9 add.u64 %r310,%r30,1; .loc 1 68 7 sub.u64 %r122,%r310,%r32; max.s64 %r87,%r122,0; .loc 1 140 8 ld.u64 %r134,[%r186]; .loc 1 141 8 ld.u64 %r135,[%r185]; .loc 1 190 33 add.u64 %r75,%r124,%r124; ld.u64 %r45,[%frame+360]; .loc 1 204 22 ld.u64 %r76,[%frame+120]; .loc 1 204 12 add.u64 %r78,%r76,%r76; .loc 1 205 22 ld.u64 %r79,[%frame]; .loc 1 205 12 shl.b64 %r81,%r79,3; setp.gt.s64 %r362,%r122,0; setp.le.s64 %r363,%r120,1; .loc 1 180 9 setp.eq.u32 %r365,%r188,0; .loc 1 159 10 mov.u64 %r366,0; cvt.u32.u32 %r367,%r22; cvt.s64.s8 %r368,%r367; add.u64 %r369,%r368,-1; add.u64 %r370,%frame,120; add.u64 %r371,%frame,240; bra $L25; $L24: add.u64 %r170,%frame,360; add.u64 %r169,%r185,40; mov.u64 %r167,%frame; cvt.u32.u32 %r314,%r22; cvt.s64.s8 %r313,%r314; add.u64 %r158,%r313,-1; .loc 1 132 3 mov.u64 %r136,0; add.u64 %r360,%frame,240; .loc 1 134 16 mov.u64 %r315,%r136; $L26: st.u64 [%r170],%r315; .loc 1 135 18 ld.u64 %r316,[%r169]; st.u64 [%r167],%r316; .loc 1 136 17 shl.b64 %r318,%r136,3; add.u64 %r319,%r360,%r318; .loc 1 136 10 ld.u64 %r320,[%r319]; setp.le.s64 %r321,%r320,0; @ %r321 bra $L1; .loc 1 132 26 add.u64 %r136,%r136,1; .loc 1 132 3 add.u64 %r170,%r170,8; add.u64 %r169,%r169,24; add.u64 %r167,%r167,8; setp.ne.u64 %r322,%r136,%r158; @ %r322 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r362 bra $L28; .loc 1 159 10 st.u64 [%r135],%r366; bra $L29; $L28: .loc 1 180 9 @ %r365 bra $L40; mov.u64 %r129,%r134; .loc 1 155 9 mov.u32 %r111,32767; .loc 1 157 9 mov.u64 %r110,1; .loc 1 178 8 mov.u64 %r152,0; $L32: .loc 1 183 9 ld.s16 %r67,[%r129]; .loc 1 186 16 add.u64 %r152,%r152,1; .loc 1 183 8 cvt.u16.u32 %r326,%r67; cvt.u16.u32 %r327,%r111; setp.le.s16 %r328,%r326,%r327; selp.u32 %r111,%r67,%r111,%r328; selp.u64 %r110,%r152,%r110,%r328; .loc 1 181 33 add.u64 %r129,%r129,%r75; .loc 1 181 8 setp.gt.s64 %r329,%r87,%r152; @ %r329 bra $L32; bra $L33; $L40: mov.u64 %r127,%r134; .loc 1 155 9 mov.u32 %r113,32767; .loc 1 157 9 mov.u64 %r110,1; .loc 1 178 8 mov.u64 %r106,0; $L30: .loc 1 192 9 ld.s16 %r72,[%r127]; .loc 1 195 16 add.u64 %r106,%r106,1; .loc 1 192 8 cvt.u16.u32 %r330,%r72; cvt.u16.u32 %r331,%r113; setp.lt.s16 %r332,%r330,%r331; selp.u32 %r113,%r72,%r113,%r332; selp.u64 %r110,%r106,%r110,%r332; .loc 1 190 33 add.u64 %r127,%r127,%r75; .loc 1 190 8 setp.gt.s64 %r333,%r87,%r106; @ %r333 bra $L30; $L33: .loc 1 199 12 st.u64 [%r135],%r110; $L29: .loc 1 203 15 add.u64 %r45,%r45,1; .loc 1 204 12 add.u64 %r134,%r134,%r78; .loc 1 205 12 add.u64 %r135,%r135,%r81; .loc 1 207 32 ld.u64 %r68,[%frame+240]; .loc 1 207 13 
setp.ne.u64 %r334,%r68,%r45; @ %r334 bra $L25; .loc 1 214 23 mul.lo.u64 %r335,%r45,%r76; .loc 1 214 9 add.u64 %r336,%r335,%r335; sub.u64 %r131,%r134,%r336; .loc 1 215 23 mul.lo.u64 %r337,%r45,%r79; .loc 1 215 9 shl.b64 %r338,%r337,3; sub.u64 %r132,%r135,%r338; .loc 1 217 7 @ %r363 bra $L1; add.u64 %r157,%frame,368; mov.u64 %r178,8; .loc 1 216 5 mov.u64 %r133,1; bra $L37; $L38: .loc 1 211 13 st.u64 [%r157],%r366; .loc 1 214 23 mul.lo.u64 %r344,%r94,%r93; .loc 1 214 9 add.u64 %r345,%r344,%r344; sub.u64 %r131,%r134,%r345; .loc 1 215 23 mul.lo.u64 %r346,%r97,%r93; .loc 1 215 9 shl.b64 %r347,%r346,3; sub.u64 %r132,%r135,%r347; .loc 1 216 5 add.u64 %r133,%r133,1; .loc 1 217 7 add.u64 %r157,%r157,8; add.u64 %r178,%r178,8; setp.eq.u64 %r348,%r133,%r369; @ %r348 bra $L1; $L37: .loc 1 225 16 ld.u64 %r349,[%r157]; add.u64 %r93,%r349,1; st.u64 [%r157],%r93; .loc 1 226 23 add.u64 %r351,%r370,%r178; ld.u64 %r94,[%r351]; .loc 1 226 13 add.u64 %r352,%r94,%r94; add.u64 %r134,%r131,%r352; .loc 1 227 23 add.u64 %r353,%frame,%r178; ld.u64 %r97,[%r353]; .loc 1 227 13 shl.b64 %r354,%r97,3; add.u64 %r135,%r132,%r354; .loc 1 207 32 add.u64 %r356,%r371,%r178; ld.u64 %r100,[%r356]; .loc 1 207 13 setp.eq.u64 %r357,%r93,%r100; @ %r357 bra $L38; .loc 1 211 13 mov.u64 %r45,0; bra $L25; $L55: .loc 1 90 6 ld.u64 %r358,[%r185]; setp.eq.u64 %r359,%r358,0; @ ! %r359 bra $L11; bra $L39; $L1:_gfortran_mminloc1_8_i2 .visible .func _gfortran_mminloc1_8_, .param .u32 %in_ar4) {608pred %r218; .reg .u32 %r223; .reg .u6464u64pred %r373; .reg .u64 %r374; .reg .u64 %r376; .reg .u32 %r377; .reg .pred %r378; .reg .u64 %r381; .reg .u64 %r382; .reg .u64 %r386; .reg .pred %r388; .reg .pred %r389; .reg .u64 %r390; .reg .u32 %r391; .reg .u64pred %r398; .reg .pred %r399; .reg .pred %r404; .reg .pred16u32 %r413; .reg .predpred %r417; .reg .pred %r418; .reg .predpredpredu64 %r460; .reg .u64 %r461; .reg .u64 %r462; mov.u64 %r213,%ar0; mov.u64 %r214,%ar1; mov.u64 %r215,%ar2; mov.u64 %r216,%ar3; mov.u32 %r217,%ar4; .loc 1 261 6 setp.ne.u64 %r218,%r216,0; @ %r218 bra $L1r214215217; call _gfortran_minloc1_8_268 7 bra $L56; $L57: .loc 1 271 10 ld.u64 %r22,[%r215]; .loc 1 271 7 add.u64 %r146,%r22,-1; .loc 1 272 10 ld.s8 %r23,[%r214+28]; .loc 1 272 38 add.u32 %r223,%r23,-1; .loc 1 272 8 cvt.s64.s32 %r147,%r223; .loc 1 275 7 shr.u64 %r225,%r146,63; set.u32.gt.s64 %r227,%r146,%r147; neg.s32 %r228,%r227; cvt.u16.u64 %r231,%r225; cvt.u16.u32 %r232,%r228; or.b16 %r230,%r231,%r232; .loc 1 275 6 cvt.u32.u16 %r233,%r230; cvt.u16.u8 %r234,%r233; setp.eq.u16 %r235,%r234,0; @ %r235 bra $L59; .loc 1 277 7 cvt.u32.u32 %r238,%r23; cvt.s64.s8 %r237,%r238; st.u64 [%stack+8],%r237; st.u64 [%stack],%r22; cvta.const.u64 %r236236_gfortran_runtime_error9: .loc 1 282 9 add.u64 %r452,%r146,%r146; add.u64 %r451,%r452,%r146; shl.b64 %r243,%r451,3; add.u64 %r244,%r214,%r243; ld.u64 %r247,[%r244+56]; add.u64 %r246,%r247,1; .loc 1 282 7 ld.u64 %r254,[%r244+48]; sub.u64 %r148,%r246,%r254; .loc 1 283 6 setp.le.s64 %r255,%r148,0; @ %r255 bra $L56; .loc 1 286 9 ld.u64 %r166,[%r216]; .loc 1 288 15 ld.u64 %r34,[%r216+16]; .loc 1 290 22 cvt.u32.u64 %r35,%r34; .loc 1 290 53 add.u32 %r256,%r35,-4; and.b32 %r257,%r256,-5; set.u32.eq.u32 %r259,%r257,0; neg.s32 %r260,%r259; .loc 1 290 22 add.u32 %r261,%r35,-1; set.u32.le.u32 %r263,%r261,1; neg.s32 %r264,%r263; .loc 1 290 6 cvt.u16.u32 %r266,%r260; cvt.u16.u32 %r267,%r264; or.b16 %r265,%r266,%r267; cvt.u32.u16 %r268,%r265; cvt.u16.u8 %r269,%r268; setp.ne.u16 %r270,%r269,0; @ %r270 bra $L61; .loc 1 292 7 setp.ne.u32 %r272,%r35,16; @ %r272 bra 
$L62; $L61: .loc 1 299 9 shl.b64 %r276,%r451,3; add.u64 %r277,%r214,%r276; ld.u64 %r151,[%r277+40]; .loc 1 300 12 add.u64 %r283,%r216,%r276; ld.u64 %r39,[%r283+40]; .loc 1 302 3 setp.ne.u64 %r285,%r146,0; @ %r285 bra $L63; $L69: .loc 1 312 3 setp.lt.s64 %r286,%r146,%r147; @ %r286 bra $L64; bra $L123; $L62: .loc 1 297 5_gfortran_runtime_error63: add.u64 %r205,%r214,40; add.u64 %r67,%r216,40; add.u64 %r289,%r214,16; add.u64 %r291,%r22,%r22; add.u64 %r292,%r291,%r22; shl.b64 %r293,%r292,3; add.u64 %r210,%r289,%r293; .loc 1 302 3 mov.u64 %r91,0; add.u64 %r449,%frame,240; add.u64 %r447,%frame,360; .loc 1 309 12 mov.u64 %r462,%r91; $L68: .loc 1 304 18 add.u64 %r295,%r449,%r91; ld.u64 %r296,[%r205]; st.u64 [%r295],%r296; .loc 1 305 18 add.u64 %r297,%frame,%r91; .loc 1 305 20 ld.u64 %r299,[%r67]; mul.lo.u64 %r298,%r299,%r34; .loc 1 305 18 st.u64 [%r297],%r298; .loc 1 306 19 ld.u64 %r301,[%r205+16]; add.u64 %r300,%r301,1; ld.u64 %r302,[%r205+8]; sub.u64 %r51,%r300,%r302; .loc 1 308 10 setp.lt.s64 %r303,%r51,0; @ %r303 bra $L66; .loc 1 306 17 add.u64 %r305,%r447,%r91; st.u64 [%r305],%r51; bra $L67; $L66: .loc 1 309 12 add.u64 %r307,%r447,%r91; st.u64 [%r307],%r462; $L67: .loc 1 302 3 add.u64 %r205,%r205,24; add.u64 %r67,%r67,24; add.u64 %r91,%r91,8; setp.ne.u64 %r309,%r205,%r210; @ %r309 bra $L68; bra $L69; $L123: .loc 1 322 6 ld.u64 %r310,[%r213]; setp.eq.u64 %r311,%r310,0; @ ! %r311 bra $L71; bra $L70; $L64: add.u64 %r313,%r22,%r22; add.u64 %r314,%r313,%r22; shl.b64 %r315,%r314,3; add.u64 %r128,%r315,40; add.u64 %r133,%r214,%r128; add.u64 %r124,%r216,%r128; shl.b64 %r316,%r22,3; add.u64 %r83,%r316,-8; cvt.u32.u32 %r318,%r23; cvt.s64.s8 %r317,%r318; shl.b64 %r319,%r317,3; add.u64 %r209,%r319,-8; add.u64 %r449,%frame,240; add.u64 %r447,%frame,360; .loc 1 319 12 mov.u64 %r461,0; $L74: .loc 1 314 18 add.u64 %r321,%r449,%r83; ld.u64 %r322,[%r133]; st.u64 [%r321],%r322; .loc 1 315 18 add.u64 %r323,%frame,%r83; .loc 1 315 20 ld.u64 %r325,[%r124]; mul.lo.u64 %r324,%r325,%r34; .loc 1 315 18 st.u64 [%r323],%r324; .loc 1 316 19 ld.u64 %r327,[%r133+16]; add.u64 %r326,%r327,1; ld.u64 %r328,[%r133+8]; sub.u64 %r64,%r326,%r328; .loc 1 318 10 setp.lt.s64 %r329,%r64,0; @ %r329 bra $L72; .loc 1 316 17 add.u64 %r331,%r447,%r83; st.u64 [%r331],%r64; bra $L73; $L72: .loc 1 319 12 add.u64 %r333,%r447,%r83; st.u64 [%r333],%r461; $L73: .loc 1 312 3 add.u64 %r133,%r133,24; add.u64 %r124,%r124,24; add.u64 %r83,%r83,8; setp.ne.u64 %r335,%r83,%r209; @ %r335 bra $L74; bra $L124; $L70: .loc 1 326 7 setp.eq.u64 %r336,%r147,0; @ %r336 bra $L76; add.u64 %r447,%frame,360; $L100: add.u64 %r161,%r213,40; mov.u64 %r143,%r447; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; add.u64 %r340,%r337,%r337; add.u64 %r341,%r340,%r337; shl.b64 %r342,%r341,3; add.u64 %r343,%r213,16; add.u64 %r135,%r342,%r343; .loc 1 329 10 mov.u64 %r131,1; .loc 1 333 4 mov.u64 %r362,0; bra $L77; $L76: .loc 1 337 20 add.u32 %r344,%r23,-2; cvt.s64.s32 %r73,%r344; add.u64 %r346,%r73,%r73; add.u64 %r347,%r346,%r73; shl.b64 %r348,%r347,3; add.u64 %r349,%r213,%r348; .loc 1 337 67 shl.b64 %r351,%r73,3; add.u64 %r352,%frame,%r351; .loc 1 337 59 ld.u64 %r354,[%r349+40]; ld.u64 %r355,[%r352+360]; mul.lo.u64 %r155,%r354,%r355; .loc 1 339 24 mov.u64 %r356,0; st.u64 [%r213+8],%r356; .loc 1 340 28 cvt.u16.u32 %r359,%r23; add.u16 %r358,%r359,-1; cvt.u32.u16 %r360,%r358; st.u8 [%r213+28],%r360; .loc 1 342 10 setp.eq.u64 %r361,%r155,0; @ %r361 bra $L78; bra $L125; $L80: .loc 1 331 47 mul.lo.u64 %r131,%r70,%r131; $L77: .loc 1 333 4 st.u64 [%r161+8],%r362; ld.u64 %r70,[%r143]; 
add.u64 %r363,%r70,-1; st.u64 [%r161+16],%r363; st.u64 [%r161],%r131; .loc 1 326 7 add.u64 %r161,%r161,24; add.u64 %r143,%r143,8; setp.ne.u64 %r364,%r135,%r161; @ %r364 bra $L80; bra $L76; $L78: .loc 1 345 4 st.u64 [%r213+48],%r155; mov.u64 %r366,-1; st.u64 [%r213+56],%r366; mov.u64 %r367,1; st.u64 [%r213+40],%r367; .loc 1 346 4 bra $L56; $L125: .loc 1 349 24 mov.u64 %r369369; call (%value_in),_gfortrani_xmallocarray70,[%value_in]; } .loc 1 349 22 st.u64 [%r213],%r370; bra $L81; $L71: .loc 1 354 19 ld.s8 %r372,[%r213+28]; .loc 1 354 10 setp.eq.u64 %r373,%r372,%r147; @ %r373 bra $L82; .loc 1 355 2 cvta.const.u64 %r374374_gfortran_runtime_error82: .loc 1 357 11 cvta.global.u64 %r376,_gfortrani_compile_options; .loc 1 357 10 ld.u32 %r377,[%r376+36]; setp.eq.u32 %r378,%r377,0; @ %r378 bra $L81; .loc 1 359 4 add.u64 %r447,%frame,360; cvta.const.u64 %r382,$LC2r4473382; call _gfortrani_bounds_ifunction_return361 4 cvta.const.u64 %r386,$LC386382; call _gfortrani_bounds_equal_extents$L81: .loc 1 366 3 setp.ne.u64 %r388,%r147,0; @ %r388 bra $L83; $L86: .loc 1 374 8 ld.u64 %r167,[%r213]; .loc 1 375 8 ld.u64 %r165,[%r214]; .loc 1 377 9 setp.ne.u64 %r389,%r165,0; @ %r389 bra $L84; bra $L56; $L83: add.u64 %r194,%frame,480; add.u64 %r193,%r213,40; add.u64 %r188,%frame,120; cvt.u32.u32 %r391,%r23; cvt.s64.s8 %r390,%r391; add.u64 %r174,%r390,-1; .loc 1 366 3 mov.u64 %r168,0; add.u64 %r447,%frame,360; .loc 1 368 16 mov.u64 %r392,%r168; $L85: st.u64 [%r194],%r392; .loc 1 369 18 ld.u64 %r393,[%r193]; st.u64 [%r188],%r393; .loc 1 370 17 shl.b64 %r395,%r168,3; add.u64 %r396,%r447,%r395; .loc 1 370 10 ld.u64 %r397,[%r396]; setp.le.s64 %r398,%r397,0; @ %r398 bra $L56; .loc 1 366 26 add.u64 %r168,%r168,1; .loc 1 366 3 add.u64 %r194,%r194,8; add.u64 %r193,%r193,24; add.u64 %r188,%r188,8; setp.ne.u64 %r399,%r168,%r174; @ %r399 bra $L85; bra $L86; $L84: .loc 1 300 12 mul.lo.u64 %r42,%r39,%r34; .loc 1 396 32 add.u64 %r87,%r151,%r151; ld.u64 %r68,[%frame+480]; .loc 1 440 22 ld.u64 %r97,[%frame+240]; .loc 1 440 12 add.u64 %r99,%r97,%r97; .loc 1 441 23 ld.u64 %r100,[%frame]; .loc 1 442 22 ld.u64 %r102,[%frame+120]; .loc 1 442 12 shl.b64 %r104,%r102,3; setp.le.s64 %r446,%r147,1; .loc 1 418 9 setp.ne.u32 %r454,%r217,0; cvt.u32.u32 %r455,%r23; cvt.s64.s8 %r456,%r455; add.u64 %r457,%r456,-1; add.u64 %r458,%frame,240; add.u64 %r459,%frame,360; add.u64 %r460,%frame,120; $L101: .loc 1 366 3 mov.u64 %r160,%r166; mov.u64 %r159,%r165; .loc 1 396 9 mov.u64 %r137,0; $L90: .loc 1 399 7 ld.s8 %r208,[%r160]; mov.u64 %r189,%r137; .loc 1 396 24 add.u64 %r137,%r137,1; .loc 1 399 6 setp.eq.u32 %r448,%r208,0; @ %r448 bra $L87; .loc 1 407 11 ld.s16 %r141,[%r159]; .loc 1 418 9 @ %r454 bra $L88; bra $L126; $L87: .loc 1 396 32 add.u64 %r159,%r159,%r87; .loc 1 396 47 add.u64 %r160,%r160,%r42; .loc 1 396 2 setp.ne.u64 %r404,%r148,%r137; @ %r404 bra $L90; .loc 1 395 9 mov.u64 %r137,0; bra $L91; $L126: .loc 1 428 10 setp.gt.s64 %r405,%r148,%r189; @ %r405 bra $L92; bra $L91; $L88: .loc 1 419 8 setp.le.s64 %r406,%r148,%r189; @ %r406 bra $L91; $L94: .loc 1 424 16 add.u64 %r189,%r189,1; .loc 1 421 8 @ %r448 bra $L93; .loc 1 421 18 ld.s16 %r88,[%r159]; .loc 1 421 15 cvt.u16.u32 %r408,%r88; cvt.u16.u32 %r409,%r141; setp.le.s16 %r410,%r408,%r409; selp.u32 %r141,%r88,%r141,%r410; selp.u64 %r137,%r189,%r137,%r410; $L93: .loc 1 419 33 add.u64 %r159,%r159,%r87; .loc 1 419 48 add.u64 %r160,%r160,%r42; .loc 1 419 8 setp.eq.u64 %r411,%r148,%r189; @ %r411 bra $L91; .loc 1 421 9 ld.s8 %r208,[%r160]; setp.eq.u32 %r448,%r208,0; bra $L94; $L92: .loc 1 433 11 add.u64 
%r189,%r189,1; .loc 1 430 10 ld.s8 %r413,[%r160]; cvt.u16.u32 %r412,%r413; setp.eq.u16 %r414,%r412,0; @ %r414 bra $L95; .loc 1 430 20 ld.s16 %r93,[%r159]; .loc 1 430 17 cvt.u16.u32 %r415,%r93; cvt.u16.u32 %r416,%r141; setp.lt.s16 %r417,%r415,%r416; selp.u32 %r141,%r93,%r141,%r417; selp.u64 %r137,%r189,%r137,%r417; $L95: .loc 1 428 35 add.u64 %r159,%r159,%r87; .loc 1 428 50 add.u64 %r160,%r160,%r42; .loc 1 428 10 setp.ne.u64 %r418,%r148,%r189; @ %r418 bra $L92; $L91: .loc 1 436 8 st.u64 [%r167],%r137; .loc 1 439 15 add.u64 %r68,%r68,1; .loc 1 440 12 add.u64 %r165,%r165,%r99; .loc 1 441 13 add.u64 %r166,%r166,%r100; .loc 1 442 12 add.u64 %r167,%r167,%r104; .loc 1 444 32 ld.u64 %r29,[%frame+360]; .loc 1 444 13 setp.ne.u64 %r419,%r29,%r68; @ %r419 bra $L101; .loc 1 451 23 mul.lo.u64 %r106,%r68,%r97; .loc 1 452 24 mul.lo.u64 %r420,%r68,%r100; .loc 1 452 10 sub.u64 %r162,%r166,%r420; .loc 1 453 23 mul.lo.u64 %r421,%r68,%r102; .loc 1 453 9 shl.b64 %r422,%r421,3; sub.u64 %r163,%r167,%r422; .loc 1 455 7 @ %r446 bra $L56; add.u64 %r204,%frame,488; mov.u64 %r203,8; .loc 1 454 5 mov.u64 %r164,1; .loc 1 448 13 mov.u64 %r453,0; bra $L98; $L99: st.u64 [%r204],%r453; .loc 1 451 23 mul.lo.u64 %r106,%r117,%r116; .loc 1 452 24 mul.lo.u64 %r428,%r119,%r116; .loc 1 452 10 sub.u64 %r162,%r166,%r428; .loc 1 453 23 mul.lo.u64 %r429,%r121,%r116; .loc 1 453 9 shl.b64 %r430,%r429,3; sub.u64 %r163,%r167,%r430; .loc 1 454 5 add.u64 %r164,%r164,1; .loc 1 455 7 add.u64 %r204,%r204,8; add.u64 %r203,%r203,8; setp.eq.u64 %r431,%r164,%r457; @ %r431 bra $L56; $L98: .loc 1 463 16 ld.u64 %r432,[%r204]; add.u64 %r116,%r432,1; st.u64 [%r204],%r116; .loc 1 464 23 add.u64 %r434,%r458,%r203; ld.u64 %r117,[%r434]; .loc 1 464 13 sub.u64 %r435,%r117,%r106; add.u64 %r436,%r435,%r435; add.u64 %r165,%r165,%r436; .loc 1 465 24 add.u64 %r437,%frame,%r203; ld.u64 %r119,[%r437]; .loc 1 465 14 add.u64 %r166,%r162,%r119; .loc 1 466 23 add.u64 %r439,%r460,%r203; ld.u64 %r121,[%r439]; .loc 1 466 13 shl.b64 %r440,%r121,3; add.u64 %r167,%r163,%r440; .loc 1 444 32 add.u64 %r442,%r459,%r203; ld.u64 %r125,[%r442]; .loc 1 444 13 setp.eq.u64 %r443,%r116,%r125; @ %r443 bra $L99; .loc 1 448 13 mov.u64 %r68,0; bra $L101; $L124: .loc 1 322 6 ld.u64 %r444,[%r213]; setp.eq.u64 %r445,%r444,0; @ ! 
%r445 bra $L71; bra $L100; $L56_gfortran_sminloc1_8_i2 .visible .func _gfortran_sminloc1_8_, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L128; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L129; $L128:r152154; call _gfortran_minloc1_8_500 7 bra $L127; $L129: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r296,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L131; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L132; $L138: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L133; bra $L186; $L131: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error132: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r302,0; $L137: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L135; .loc 1 515 17 st.u64 [%r139],%r36; bra $L136; $L135: .loc 1 518 12 st.u64 [%r139],%r302; $L136: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L137; bra $L138; $L186: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L140; bra $L139; $L133: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r301,0; $L143: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L141; .loc 1 523 17 st.u64 [%r43],%r41; bra $L142; $L141: .loc 1 527 12 st.u64 [%r43],%r301; $L142: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L143; bra $L187; $L139: .loc 1 534 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L145; add.u64 %r295,%frame,120; $L165: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L146; $L145: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L147; bra $L188; $L149: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L146: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L149; bra $L145; $L147: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L127; $L188: .loc 1 557 24 mov.u64 %r24696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L153: .loc 1 583 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L150; bra $L151; $L140: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L152; .loc 1 562 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error152: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L153; .loc 1 569 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L151; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L155: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L154; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error154: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L155; $L150: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ 
%r268 bra $L156; shl.b64 %r266,%r296,3; bra $L157; $L156: mov.u64 %r266,8; $L157: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L158: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L158; $L151: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,3; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 593 13 mov.u64 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L164: st.u64 [%r98],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L164; .loc 1 606 7 @ ! %r294 bra $L189; bra $L127; $L163: .loc 1 601 13 st.u64 [%r99],%r280; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L162; bra $L127; $L189: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; $L162: .loc 1 610 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,3; add.u64 %r98,%r98,%r291; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L163; .loc 1 601 13 mov.u64 %r80,0; bra $L164; $L187: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r293,%r98,0; @ ! %r293 bra $L140; bra $L165; $L127: .loc 1 615 1 ret; } minloc1_16_i2.o/_gfortran_minloc1_16_i2 .visible .func _gfortran_minloc1_16_i2.file 1 "../../../libgfortran/generated/minloc1_16_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_16_i2 .visible .func _gfortran_mminloc1_16__gfortran_sminloc1_16_i2 .visible .func _gfortran_sminloc1_16_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 
},105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_16_i2 .visible .func _gfortran_minloc1_16_i2480pred %r229; .reg .predpred %r240; .reg .predu16 %r278; .reg .u32 %r279; .reg .u32 %r280; .reg .u64 %r282; .reg .u64predpredu6432u64 %r328; .reg .u64 %r329; .reg .pred %r330; .reg .pred %r331; .reg .u64 %r333; .reg .u16 %r336; .reg .u16 %r337; .reg .predpredu64 %r399; .reg .u64 %r400; .reg .u64.loc 1 58 10 ld.s8 %r22,[%r191+28]; .loc 1 58 38 add.u32 %r194,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r118,%r194; .loc 1 59 10 ld.u64 %r25,[%r192]; .loc 1 59 7 add.u64 %r119,%r25,-1; .loc 1 61 7 shr.u64 %r196,%r119,63; set.u32.lt.s64 %r198,%r118,%r119; neg.s32 %r199,%r198; cvt.u16.u64 %r202,%r196;61%r206 bra $L2; .loc 1 63 7 cvt.u32.u32 %r209,%r22; cvt.s64.s8 %r208,%r209; st.u64 [%stack+8],%r208007_gfortran_runtime_errorr212,%r119,%r119; add.u64 %r213,%r212,%r119; shl.b64 %r214,%r213,3; add.u64 %r215,%r191,%r214; ld.u64 %r31,[%r215+56]; ld.u64 %r33,[%r215+48]; .loc 1 71 9 ld.u64 %r122,[%r215+40]; .loc 1 73 3 setp.ne.u64 %r229,%r119,0; @ %r229 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r230,%r118,%r119; @ %r230 bra $L4; bra $L52; $L3: add.u64 %r188,%r191,40; add.u64 %r184,%frame,120; add.u64 %r47,%frame,240; add.u64 %r231,%r191,16; add.u64 %r233,%r25,%r25; add.u64 %r234,%r233,%r25; shl.b64 %r235,%r234,3; add.u64 %r186,%r231,%r235; .loc 1 79 12 mov.u64 %r423,0; $L8: .loc 1 75 18 ld.u64 %r236,[%r188]; st.u64 [%r184],%r236; .loc 1 76 19 ld.u64 %r238,[%r188+16]; add.u64 %r237,%r238,1; ld.u64 %r239,[%r188+8]; sub.u64 %r38,%r237,%r239; .loc 1 78 10 setp.lt.s64 %r240,%r38,0; @ %r240 bra $L6; .loc 1 76 17 st.u64 [%r47],%r38; bra $L7; $L6: .loc 1 79 12 st.u64 [%r47],%r423; $L7: .loc 1 73 3 add.u64 %r188,%r188,24; add.u64 %r184,%r184,8; add.u64 %r47,%r47,8; setp.ne.u64 %r242,%r186,%r188; @ %r242 bra $L8; bra $L9; $L52: .loc 1 90 6 ld.u64 %r243,[%r190]; setp.eq.u64 %r244,%r243,0; @ ! 
%r244 bra $L11; bra $L10; $L4: add.u64 %r246,%r25,%r25; add.u64 %r247,%r246,%r25; shl.b64 %r248,%r247,3; add.u64 %r249,%r248,40; add.u64 %r110,%r191,%r249; shl.b64 %r250,%r25,3; add.u64 %r98,%r250,-8; add.u64 %r411,%frame,120; add.u64 %r100,%r411,%r98; add.u64 %r410,%frame,240; add.u64 %r96,%r410,%r98; cvt.u32.u32 %r254,%r22; cvt.s64.s8 %r253,%r254; add.u64 %r256,%r253,%r253; add.u64 %r257,%r256,%r253; shl.b64 %r258,%r257,3; add.u64 %r259,%r191,40; add.u64 %r135,%r258,%r259; .loc 1 87 12 mov.u64 %r422,0; $L14: .loc 1 83 18 ld.u64 %r260,[%r110]; st.u64 [%r100],%r260; .loc 1 84 19 ld.u64 %r262,[%r110+16]; add.u64 %r261,%r262,1; ld.u64 %r263,[%r110+8]; sub.u64 %r44,%r261,%r263; .loc 1 86 10 setp.lt.s64 %r264,%r44,0; @ %r264 bra $L12; .loc 1 84 17 st.u64 [%r96],%r44; bra $L13; $L12: .loc 1 87 12 st.u64 [%r96],%r422; $L13: .loc 1 81 3 add.u64 %r110,%r110,24; add.u64 %r100,%r100,8; add.u64 %r96,%r96,8; setp.ne.u64 %r266,%r110,%r135; @ %r266 bra $L14; bra $L53; $L10: .loc 1 94 7 setp.le.s64 %r267,%r118,0; @ %r267 bra $L16; add.u64 %r410,%frame,240; $L39: add.u64 %r146,%r190,40; mov.u64 %r143,%r410; cvt.u32.u32 %r269,%r22; cvt.s64.s8 %r268,%r269; add.u64 %r271,%r268,%r268; add.u64 %r272,%r271,%r268; shl.b64 %r273,%r272,3; add.u64 %r274,%r190,16; add.u64 %r111,%r273,%r274; .loc 1 97 10 mov.u64 %r102,1; .loc 1 101 4 mov.u64 %r297,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r275,0; st.u64 [%r190+8],%r275; .loc 1 106 28 cvt.u16.u32 %r278,%r22; add.u16 %r277,%r278,-1; cvt.u32.u16 %r279,%r277; st.u8 [%r190+28],%r279; .loc 1 108 20 add.u32 %r280,%r22,-2; cvt.s64.s32 %r53,%r280; add.u64 %r282,%r53,%r53; add.u64 %r283,%r282,%r53; shl.b64 %r284,%r283,3; add.u64 %r285,%r190,%r284; .loc 1 108 67 shl.b64 %r287,%r53,3; add.u64 %r288,%frame,%r287; .loc 1 108 59 ld.u64 %r290,[%r285+40]; ld.u64 %r291,[%r288+240]; mul.lo.u64 %r124,%r290,%r291; .loc 1 110 29 mov.u64 %r293,16293; call (%value_in),_gfortrani_xmallocarray294,[%value_in]; } .loc 1 110 27 st.u64 [%r190],%r294; .loc 1 111 10 setp.eq.u64 %r296,%r124,0; @ ! 
%r296 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r102,%r49,%r102; $L17: .loc 1 101 4 st.u64 [%r146+8],%r297; ld.u64 %r49,[%r143]; add.u64 %r298,%r49,-1; st.u64 [%r146+16],%r298; st.u64 [%r146],%r102; .loc 1 94 7 add.u64 %r146,%r146,24; add.u64 %r143,%r143,8; setp.ne.u64 %r299,%r111,%r146; @ %r299 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r190+48],%r124; mov.u64 %r301,-1; st.u64 [%r190+56],%r301; mov.u64 %r302,1; st.u64 [%r190+40],%r302; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r59,[%r190+28]; .loc 1 121 10 setp.eq.u64 %r303,%r59,%r118; @ %r303 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r59;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r306,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r307,[%r306+36]; setp.eq.u32 %r308,%r307,0; @ %r308 bra $L21; .loc 1 128 2 add.u64 %r410,%frame,240; cvta.const.u64 %r312,$LC2; cvta.const.u64 %r311,$LC3;_gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r314,%r118,0; @ %r314 bra $L24; $L27: .loc 1 68 9 add.u64 %r315,%r31,1; .loc 1 68 7 sub.u64 %r120,%r315,%r33; max.s64 %r82,%r120,0; .loc 1 140 8 ld.u64 %r131,[%r191]; .loc 1 141 8 ld.u64 %r132,[%r190]; .loc 1 190 33 add.u64 %r71,%r122,%r122; ld.u64 %r141,[%frame+360]; .loc 1 204 22 ld.u64 %r73,[%frame+120]; .loc 1 204 12 add.u64 %r74,%r73,%r73; .loc 1 205 22 ld.u64 %r76,[%frame]; .loc 1 205 12 shl.b64 %r77,%r76,4; .loc 1 207 32 ld.u64 %r66,[%frame+240]; .loc 1 214 23 mul.lo.u64 %r318,%r66,%r73; .loc 1 214 9 add.u64 %r319,%r318,%r318; neg.s64 %r140,%r319; .loc 1 215 23 mul.lo.u64 %r320,%r66,%r76; .loc 1 215 9 shl.b64 %r321,%r320,4; neg.s64 %r145,%r321; setp.gt.s64 %r412,%r120,0; setp.le.s64 %r413,%r118,1; .loc 1 180 9 setp.eq.u32 %r415,%r193,0; mov.u64 %r416,1; cvt.u32.u32 %r417,%r22; cvt.s64.s8 %r418,%r417; add.u64 %r419,%r418,-1; add.u64 %r420,%frame,120; add.u64 %r421,%frame,240; bra $L25; $L24: add.u64 %r168,%frame,360; add.u64 %r167,%r190,40; mov.u64 %r165,%frame; cvt.u32.u32 %r323,%r22; cvt.s64.s8 %r322,%r323; add.u64 %r148,%r322,-1; .loc 1 132 3 mov.u64 %r133,0; add.u64 %r410,%frame,240; .loc 1 134 16 mov.u64 %r324,%r133; $L26: st.u64 [%r168],%r324; .loc 1 135 18 ld.u64 %r325,[%r167]; st.u64 [%r165],%r325; .loc 1 136 17 shl.b64 %r327,%r133,3; add.u64 %r328,%r410,%r327; .loc 1 136 10 ld.u64 %r329,[%r328]; setp.le.s64 %r330,%r329,0; @ %r330 bra $L1; .loc 1 132 26 add.u64 %r133,%r133,1; .loc 1 132 3 add.u64 %r168,%r168,8; add.u64 %r167,%r167,24; add.u64 %r165,%r165,8; setp.ne.u64 %r331,%r133,%r148; @ %r331 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r412 bra $L28; .loc 1 159 10 mov.u64 %r333,0; st.u64 [%r132],%r333; st.u64 [%r132+8],%r333; bra $L29; $L28: .loc 1 180 9 @ %r415 bra $L40; mov.u64 %r127,%r131; mov.u64 %r400,%r416; mov.u64 %r401,0; .loc 1 155 9 mov.u32 %r108,32767; .loc 1 157 9 mov.u64 %r396,1; mov.u64 %r397,%r401; $L32: .loc 1 183 9 ld.s16 %r65,[%r127]; .loc 1 183 8 cvt.u16.u32 %r336,%r65; cvt.u16.u32 %r337,%r108; setp.gt.s16 %r338,%r336,%r337; .loc 1 186 16 selp.u64 %r396,%r396,%r400,%r338; selp.u64 %r397,%r397,%r401,%r338; selp.u32 %r108,%r108,%r65,%r338; .loc 1 181 33 add.u64 %r127,%r127,%r71; .loc 1 181 8 add.u64 %r341,%r400,%r416; set.u32.lt.u64 %r346,%r341,%r400; cvt.s64.s32 %r344,%r346; mov.u64 %r400,%r341; sub.u64 %r401,%r401,%r344; add.u64 %r351,%r400,-1; setp.gt.s64 %r353,%r82,%r351; @ %r353 bra $L32; bra $L33; $L40: mov.u64 %r101,%r131; mov.u64 %r398,%r416; mov.u64 %r399,0; .loc 1 155 9 mov.u32 %r109,32767; .loc 1 157 9 mov.u64 %r396,1; mov.u64 %r397,%r399; $L30: .loc 1 192 9 ld.s16 %r69,[%r101]; .loc 1 192 8 
cvt.u16.u32 %r354,%r69; cvt.u16.u32 %r355,%r109; setp.ge.s16 %r356,%r354,%r355; .loc 1 195 16 selp.u64 %r396,%r396,%r398,%r356; selp.u64 %r397,%r397,%r399,%r356; selp.u32 %r109,%r109,%r69,%r356; .loc 1 190 33 add.u64 %r101,%r101,%r71; .loc 1 190 8 add.u64 %r359,%r398,%r416; set.u32.lt.u64 %r364,%r359,%r398; cvt.s64.s32 %r362,%r364; mov.u64 %r398,%r359; sub.u64 %r399,%r399,%r362; add.u64 %r369,%r398,-1; setp.gt.s64 %r371,%r82,%r369; @ %r371 bra $L30; $L33: .loc 1 199 12 st.u64 [%r132],%r396; st.u64 [%r132+8],%r397; $L29: .loc 1 203 15 add.u64 %r141,%r141,1; .loc 1 204 12 add.u64 %r131,%r131,%r74; .loc 1 205 12 add.u64 %r132,%r132,%r77; .loc 1 207 13 setp.ne.u64 %r374,%r66,%r141; @ %r374 bra $L25; .loc 1 214 9 add.u64 %r128,%r131,%r140; .loc 1 215 9 add.u64 %r129,%r132,%r145; .loc 1 217 7 @ %r413 bra $L1; add.u64 %r157,%frame,368; mov.u64 %r117,8; .loc 1 216 5 mov.u64 %r130,1; .loc 1 211 13 mov.u64 %r414,0; bra $L37; $L38: st.u64 [%r157],%r414; .loc 1 214 23 mul.lo.u64 %r380,%r89,%r88; .loc 1 214 9 add.u64 %r381,%r380,%r380; sub.u64 %r128,%r131,%r381; .loc 1 215 23 mul.lo.u64 %r382,%r92,%r88; .loc 1 215 9 shl.b64 %r383,%r382,4; sub.u64 %r129,%r132,%r383; .loc 1 216 5 add.u64 %r130,%r130,1; .loc 1 217 7 add.u64 %r157,%r157,8; add.u64 %r117,%r117,8; setp.eq.u64 %r384,%r130,%r419; @ %r384 bra $L1; $L37: .loc 1 225 16 ld.u64 %r385,[%r157]; add.u64 %r88,%r385,1; st.u64 [%r157],%r88; .loc 1 226 23 add.u64 %r387,%r420,%r117; ld.u64 %r89,[%r387]; .loc 1 226 13 add.u64 %r388,%r89,%r89; add.u64 %r131,%r128,%r388; .loc 1 227 23 add.u64 %r389,%frame,%r117; ld.u64 %r92,[%r389]; .loc 1 227 13 shl.b64 %r390,%r92,4; add.u64 %r132,%r129,%r390; .loc 1 207 32 add.u64 %r392,%r421,%r117; ld.u64 %r95,[%r392]; .loc 1 207 13 setp.eq.u64 %r393,%r88,%r95; @ %r393 bra $L38; .loc 1 211 13 mov.u64 %r141,0; bra $L25; $L53: .loc 1 90 6 ld.u64 %r394,[%r190]; setp.eq.u64 %r395,%r394,0; @ ! 
%r395 bra $L11; bra $L39; $L1:_gfortran_mminloc1_16_i2 .visible .func _gfortran_mminloc1_16_, .param .u32 %in_ar4) {60897u64u64 %r267; .reg .u64 %r274; .reg .pred %r275; .reg .u32 %r276; .reg .u32 %r277; .reg .u32u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .pred %r329; .reg .u64 %r330; .reg .pred %r331; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32 %r338; .reg .u64 %r339; .reg .u64 %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64pred %r355; .reg .pred64predpred %r408; .reg .pred %r409; .reg .u64 %r410; .reg .u32.reg .u64 %r417; .reg .pred %r418; .reg .predpred %r428; .reg .pred %r429; .reg .pred %r430; .reg .u64.reg .u32 %r456; .reg .u64 %r460; .reg .u16 %r462; .reg .u16 %r463; .reg .predpred %r479; .reg .pred %r482; .reg .u64 %r486; .reg .u64 %r489; .reg .u64 %r492; .reg .u64 %r495; .reg .u32u64 %r547; .reg .pred %r548; .reg .u64 %r549; .reg .u64 %r551; .reg .u64u64 %r557; .reg .u64u64 %r572; .reg .u64 %r583; .reg .u64 %r597; .reg .predpred %r609; .reg .u64 %r610; .reg .u32 %r611; .reg .u64 %r612; .reg .u64 %r613; .reg .u64 %r614; .reg .u64 %r615; .reg .u64 %r616; .reg .u64 %r617; mov.u64 %r233,%ar0; mov.u64 %r234,%ar1; mov.u64 %r235,%ar2; mov.u64 %r236,%ar3; mov.u32 %r237,%ar4; .loc 1 261 6 setp.ne.u64 %r238,%r236,0; @ %r238 bra $L3r2235237; call _gfortran_minloc1_16_268 7 bra $L54; $L55: .loc 1 271 10 ld.u64 %r22,[%r235]; .loc 1 271 7 add.u64 %r153,%r22,-1; .loc 1 272 10 ld.s8 %r23,[%r234+28]; .loc 1 272 38 add.u32 %r243,%r23,-1; .loc 1 272 8 cvt.s64.s32 %r154,%r243; .loc 1 275 7 shr.u64 %r245,%r153,63; set.u32.gt.s64 %r247,%r153,%r154; neg.s32 %r248,%r247; cvt.u16.u64 %r251,%r245; cvt.u16.u32 %r252,%r248; or.b16 %r250,%r251,%r252; .loc 1 275 6 cvt.u32.u16 %r253,%r250; cvt.u16.u8 %r254,%r253; setp.eq.u16 %r255,%r254,0; @ %r255 bra $L57; .loc 1 277 7 cvt.u32.u32 %r258,%r23; cvt.s64.s8 %r257,%r258; st.u64 [%stack+8],%r257; st.u64 [%stack],%r22; cvta.const.u64 %r22stack; call _gfortran_runtime_error7: .loc 1 282 9 add.u64 %r602,%r153,%r153; add.u64 %r607,%r602,%r153; shl.b64 %r263,%r607,3; add.u64 %r264,%r234,%r263; ld.u64 %r267,[%r264+56]; add.u64 %r266,%r267,1; .loc 1 282 7 ld.u64 %r274,[%r264+48]; sub.u64 %r155,%r266,%r274; .loc 1 283 6 setp.le.s64 %r275,%r155,0; @ %r275 bra $L54; .loc 1 286 9 ld.u64 %r173,[%r236]; .loc 1 288 15 ld.u64 %r35,[%r236+16]; .loc 1 290 22 cvt.u32.u64 %r36,%r35; .loc 1 290 53 add.u32 %r276,%r36,-4; and.b32 %r277,%r276,-5; set.u32.eq.u32 %r279,%r277,0; neg.s32 %r280,%r279; .loc 1 290 22 add.u32 %r281,%r36,-1; set.u32.le.u32 %r283,%r281,1; neg.s32 %r284,%r283; .loc 1 290 6 cvt.u16.u32 %r286,%r280; cvt.u16.u32 %r287,%r284; or.b16 %r285,%r286,%r287;59; .loc 1 292 7 setp.ne.u32 %r292,%r36,16; @ %r292 bra $L60; $L59: .loc 1 299 9 shl.b64 %r296,%r607,3; add.u64 %r297,%r234,%r296; ld.u64 %r158,[%r297+40]; .loc 1 300 12 add.u64 %r303,%r236,%r296; ld.u64 %r41,[%r303+40]; .loc 1 302 3 setp.ne.u64 %r305,%r153,0; @ %r305 bra $L61; $L67: .loc 1 312 3 setp.lt.s64 %r306,%r153,%r154; @ %r306 bra $L62; bra $L121; $L60: .loc 1 297 5 cvta.const.u64 %r307307_gfortran_runtime_error61: add.u64 %r164,%r234,40; add.u64 %r150,%r236,40; add.u64 %r309,%r234,16; add.u64 %r311,%r22,%r22; add.u64 %r312,%r311,%r22; shl.b64 %r313,%r312,3; add.u64 %r223,%r309,%r313; .loc 1 302 3 mov.u64 %r53,0; add.u64 %r603,%frame,240; add.u64 %r606,%frame,360; .loc 1 309 12 mov.u64 %r617,%r53; $L66: .loc 1 304 18 add.u64 %r315,%r603,%r53; ld.u64 %r316,[%r164]; st.u64 [%r315],%r316; .loc 1 305 18 add.u64 
%r317,%frame,%r53; .loc 1 305 20 ld.u64 %r319,[%r150]; mul.lo.u64 %r318,%r319,%r35; .loc 1 305 18 st.u64 [%r317],%r318; .loc 1 306 19 ld.u64 %r321,[%r164+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r164+8]; sub.u64 %r52,%r320,%r322; .loc 1 308 10 setp.lt.s64 %r323,%r52,0; @ %r323 bra $L64; .loc 1 306 17 add.u64 %r325,%r606,%r53; st.u64 [%r325],%r52; bra $L65; $L64: .loc 1 309 12 add.u64 %r327,%r606,%r53; st.u64 [%r327],%r617; $L65: .loc 1 302 3 add.u64 %r164,%r164,24; add.u64 %r150,%r150,24; add.u64 %r53,%r53,8; setp.ne.u64 %r329,%r164,%r223; @ %r329 bra $L66; bra $L67; $L121: .loc 1 322 6 ld.u64 %r330,[%r233]; setp.eq.u64 %r331,%r330,0; @ ! %r331 bra $L69; bra $L68; $L62: add.u64 %r333,%r22,%r22; add.u64 %r334,%r333,%r22; shl.b64 %r335,%r334,3; add.u64 %r178,%r335,40; add.u64 %r59,%r234,%r178; add.u64 %r180,%r236,%r178; shl.b64 %r336,%r22,3; add.u64 %r91,%r336,-8; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; shl.b64 %r339,%r337,3; add.u64 %r97,%r339,-8; add.u64 %r603,%frame,240; add.u64 %r606,%frame,360; .loc 1 319 12 mov.u64 %r616,0; $L72: .loc 1 314 18 add.u64 %r341,%r603,%r91; ld.u64 %r342,[%r59]; st.u64 [%r341],%r342; .loc 1 315 18 add.u64 %r343,%frame,%r91; .loc 1 315 20 ld.u64 %r345,[%r180]; mul.lo.u64 %r344,%r345,%r35; .loc 1 315 18 st.u64 [%r343],%r344; .loc 1 316 19 ld.u64 %r347,[%r59+16]; add.u64 %r346,%r347,1; ld.u64 %r348,[%r59+8]; sub.u64 %r66,%r346,%r348; .loc 1 318 10 setp.lt.s64 %r349,%r66,0; @ %r349 bra $L70; .loc 1 316 17 add.u64 %r351,%r606,%r91; st.u64 [%r351],%r66; bra $L71; $L70: .loc 1 319 12 add.u64 %r353,%r606,%r91; st.u64 [%r353],%r616; $L71: .loc 1 312 3 add.u64 %r59,%r59,24; add.u64 %r180,%r180,24; add.u64 %r91,%r91,8; setp.ne.u64 %r355,%r91,%r97; @ %r355 bra $L72; bra $L122; $L68: .loc 1 326 7 setp.eq.u64 %r356,%r154,0; @ %r356 bra $L74; add.u64 %r606,%frame,360; $L100: add.u64 %r137,%r233,40; mov.u64 %r134,%r606; cvt.u32.u32 %r358,%r23; cvt.s64.s8 %r357,%r358; add.u64 %r360,%r357,%r357; add.u64 %r361,%r360,%r357; shl.b64 %r362,%r361,3; add.u64 %r363,%r233,16; add.u64 %r64,%r362,%r363; .loc 1 329 10 mov.u64 %r136,1; .loc 1 333 4 mov.u64 %r382,0; bra $L75; $L74: .loc 1 337 20 add.u32 %r364,%r23,-2; cvt.s64.s32 %r75,%r364; add.u64 %r366,%r75,%r75; add.u64 %r367,%r366,%r75; shl.b64 %r368,%r367,3; add.u64 %r369,%r233,%r368; .loc 1 337 67 shl.b64 %r371,%r75,3; add.u64 %r372,%frame,%r371; .loc 1 337 59 ld.u64 %r374,[%r369+40]; ld.u64 %r375,[%r372+360]; mul.lo.u64 %r162,%r374,%r375; .loc 1 339 24 mov.u64 %r376,0; st.u64 [%r233+8],%r376; .loc 1 340 28 cvt.u16.u32 %r379,%r23; add.u16 %r378,%r379,-1; cvt.u32.u16 %r380,%r378; st.u8 [%r233+28],%r380; .loc 1 342 10 setp.eq.u64 %r381,%r162,0; @ %r381 bra $L76; bra $L123; $L78: .loc 1 331 47 mul.lo.u64 %r136,%r72,%r136; $L75: .loc 1 333 4 st.u64 [%r137+8],%r382; ld.u64 %r72,[%r134]; add.u64 %r383,%r72,-1; st.u64 [%r137+16],%r383; st.u64 [%r137],%r136; .loc 1 326 7 add.u64 %r137,%r137,24; add.u64 %r134,%r134,8; setp.ne.u64 %r384,%r64,%r137; @ %r384 bra $L78; bra $L74; $L76: .loc 1 345 4 st.u64 [%r233+48],%r162; mov.u64 %r386,-1; st.u64 [%r233+56],%r386; mov.u64 %r387,1; st.u64 [%r233+40],%r387; .loc 1 346 4 bra $L54; $L123: .loc 1 349 24 mov.u64 %r389,16389; call (%value_in),_gfortrani_xmallocarray90,[%value_in]; } .loc 1 349 22 st.u64 [%r233],%r390; bra $L79; $L69: .loc 1 354 19 ld.s8 %r392,[%r233+28]; .loc 1 354 10 setp.eq.u64 %r393,%r392,%r154; @ %r393 bra $L80; .loc 1 355 2 cvta.const.u64 %r3944_gfortran_runtime_error80: .loc 1 357 11 cvta.global.u64 %r396,_gfortrani_compile_options; .loc 1 357 10 ld.u32 
%r397,[%r396+36]; setp.eq.u32 %r398,%r397,0; @ %r398 bra $L79; .loc 1 359 4 add.u64 %r606,%frame,360; cvta.const.u64 %r402,$LC24002; call _gfortrani_bounds_ifunction_return361 4 cvta.const.u64 %r406,$LC406402; call _gfortrani_bounds_equal_extents$L79: .loc 1 366 3 setp.ne.u64 %r408,%r154,0; @ %r408 bra $L81; $L84: .loc 1 374 8 ld.u64 %r174,[%r233]; .loc 1 375 8 ld.u64 %r172,[%r234]; .loc 1 377 9 setp.ne.u64 %r409,%r172,0; @ %r409 bra $L82; bra $L54; $L81: add.u64 %r188,%frame,480; add.u64 %r181,%r233,40; add.u64 %r149,%frame,120; cvt.u32.u32 %r411,%r23; cvt.s64.s8 %r410,%r411; add.u64 %r138,%r410,-1; .loc 1 366 3 mov.u64 %r175,0; add.u64 %r606,%frame,360; .loc 1 368 16 mov.u64 %r412,%r175; $L83: st.u64 [%r188],%r412; .loc 1 369 18 ld.u64 %r413,[%r181]; st.u64 [%r149],%r413; .loc 1 370 17 shl.b64 %r415,%r175,3; add.u64 %r416,%r606,%r415; .loc 1 370 10 ld.u64 %r417,[%r416]; setp.le.s64 %r418,%r417,0; @ %r418 bra $L54; .loc 1 366 26 add.u64 %r175,%r175,1; .loc 1 366 3 add.u64 %r188,%r188,8; add.u64 %r181,%r181,24; add.u64 %r149,%r149,8; setp.ne.u64 %r419,%r138,%r175; @ %r419 bra $L83; bra $L84; $L82: .loc 1 300 12 mul.lo.u64 %r43,%r41,%r35; .loc 1 396 32 add.u64 %r88,%r158,%r158; ld.u64 %r189,[%frame+480]; .loc 1 440 22 ld.u64 %r102,[%frame+240]; .loc 1 440 12 add.u64 %r103,%r102,%r102; .loc 1 441 23 ld.u64 %r105,[%frame]; .loc 1 442 22 ld.u64 %r107,[%frame+120]; .loc 1 442 12 shl.b64 %r108,%r107,4; .loc 1 444 32 ld.u64 %r29,[%frame+360]; .loc 1 451 23 mul.lo.u64 %r39,%r29,%r102; .loc 1 452 24 mul.lo.u64 %r422,%r29,%r105; .loc 1 452 10 neg.s64 %r92,%r422; .loc 1 453 23 mul.lo.u64 %r423,%r29,%r107; .loc 1 453 9 shl.b64 %r424,%r423,4; neg.s64 %r185,%r424; setp.le.s64 %r601,%r154,1; .loc 1 418 9 setp.ne.u32 %r609,%r237,0; add.u64 %r610,%r155,-1; cvt.u32.u32 %r611,%r23; cvt.s64.s8 %r612,%r611; add.u64 %r613,%r612,-1; add.u64 %r614,%frame,240; add.u64 %r615,%frame,360; $L101: .loc 1 366 3 mov.u64 %r168,%r173; mov.u64 %r167,%r172; .loc 1 396 9 mov.u64 %r197,0; $L88: .loc 1 399 7 ld.s8 %r224,[%r168]; mov.u64 %r230,%r197; .loc 1 396 24 add.u64 %r197,%r197,1; .loc 1 399 6 setp.eq.u32 %r604,%r224,0; @ %r604 bra $L85; .loc 1 407 11 ld.s16 %r145,[%r167]; .loc 1 408 11 mov.u64 %r563,%r197; shr.s64 %r564,%r563,63; .loc 1 418 9 @ %r609 bra $L86; bra $L124; $L85: .loc 1 396 32 add.u64 %r167,%r167,%r88; .loc 1 396 47 add.u64 %r168,%r168,%r43; .loc 1 396 2 setp.ne.u64 %r428,%r155,%r197; @ %r428 bra $L88; .loc 1 395 9 mov.u64 %r563,0; mov.u64 %r564,%r563; bra $L89; $L124: .loc 1 428 10 setp.gt.s64 %r429,%r155,%r230; @ %r429 bra $L90; bra $L89; $L86: .loc 1 419 8 setp.le.s64 %r430,%r155,%r230; @ %r430 bra $L89; add.u64 %r571,%r230,1; shr.s64 %r572,%r571,63; sub.u64 %r434,%r610,%r230; shr.s64 %r437,%r230,63; add.u64 %r440,%r230,2; set.u32.lt.u64 %r445,%r440,%r230; cvt.s64.s32 %r443,%r445; sub.u64 %r449,%r437,%r443; add.u64 %r451,%r434,%r440; set.u32.lt.u64 %r456,%r451,%r434; cvt.s64.s32 %r454,%r456; sub.u64 %r460,%r449,%r454; mov.u64 %r583,1; $L93: .loc 1 421 8 @ %r604 bra $L91; .loc 1 421 18 ld.s16 %r89,[%r167]; .loc 1 421 15 cvt.u16.u32 %r462,%r89; cvt.u16.u32 %r463,%r145; setp.gt.s16 %r464,%r462,%r463; .loc 1 424 16 selp.u64 %r563,%r563,%r571,%r464; selp.u64 %r564,%r564,%r572,%r464; selp.u32 %r145,%r145,%r89,%r464; $L91: .loc 1 419 33 add.u64 %r167,%r167,%r88; .loc 1 419 48 add.u64 %r168,%r168,%r43; .loc 1 419 8 add.u64 %r467,%r571,%r583; set.u32.lt.u64 %r472,%r467,%r571; cvt.s64.s32 %r470,%r472; mov.u64 %r571,%r467; sub.u64 %r572,%r572,%r470; setp.ne.u64 %r479,%r451,%r571; @ %r479 bra $L102; setp.ne.u64 
%r482,%r460,%r572; @ ! %r482 bra $L89; $L102: .loc 1 421 9 ld.s8 %r224,[%r168]; setp.eq.u32 %r604,%r224,0; bra $L93; $L90: add.u64 %r567,%r230,1; shr.s64 %r568,%r567,63; sub.u64 %r486,%r610,%r230; shr.s64 %r489,%r230,63; add.u64 %r492,%r230,2; set.u32.lt.u64 %r497,%r492,%r230; cvt.s64.s32 %r495,%r497; sub.u64 %r501,%r489,%r495; add.u64 %r503,%r486,%r492; set.u32.lt.u64 %r508,%r503,%r486; cvt.s64.s32 %r506,%r508; sub.u64 %r512,%r501,%r506; .loc 1 428 10 mov.u64 %r597,1; $L116: .loc 1 430 10 ld.s8 %r514,[%r168]; cvt.u16.u32 %r513,%r514; setp.eq.u16 %r515,%r513,0; @ %r515 bra $L94; .loc 1 430 20 ld.s16 %r94,[%r167]; .loc 1 430 17 cvt.u16.u32 %r516,%r94; cvt.u16.u32 %r517,%r145; setp.ge.s16 %r518,%r516,%r517; .loc 1 433 11 selp.u64 %r563,%r563,%r567,%r518; selp.u64 %r564,%r564,%r568,%r518; selp.u32 %r145,%r145,%r94,%r518; $L94: .loc 1 428 35 add.u64 %r167,%r167,%r88; .loc 1 428 50 add.u64 %r168,%r168,%r43; .loc 1 428 10 add.u64 %r521,%r567,%r597; set.u32.lt.u64 %r526,%r521,%r567; cvt.s64.s32 %r524,%r526; mov.u64 %r567,%r521; sub.u64 %r568,%r568,%r524; setp.ne.u64 %r533,%r503,%r567; @ %r533 bra $L116; setp.ne.u64 %r536,%r512,%r568; @ %r536 bra $L116; $L89: .loc 1 436 8 st.u64 [%r174],%r563; st.u64 [%r174+8],%r564; .loc 1 439 15 add.u64 %r189,%r189,1; .loc 1 440 12 add.u64 %r172,%r172,%r103; .loc 1 441 13 add.u64 %r173,%r173,%r105; .loc 1 442 12 add.u64 %r174,%r174,%r108; .loc 1 444 13 setp.ne.u64 %r539,%r29,%r189; @ %r539 bra $L101; .loc 1 452 10 add.u64 %r169,%r173,%r92; .loc 1 453 9 add.u64 %r170,%r174,%r185; .loc 1 455 7 @ %r601 bra $L54; add.u64 %r219,%frame,488; .loc 1 451 23 mov.u64 %r112,%r39; .loc 1 455 7 mov.u64 %r218,8; .loc 1 454 5 mov.u64 %r171,1; add.u64 %r605,%frame,120; .loc 1 448 13 mov.u64 %r608,0; bra $L98; $L99: st.u64 [%r219],%r608; .loc 1 451 23 mul.lo.u64 %r112,%r125,%r124; .loc 1 452 24 mul.lo.u64 %r545,%r127,%r124; .loc 1 452 10 sub.u64 %r169,%r173,%r545; .loc 1 453 23 mul.lo.u64 %r546,%r129,%r124; .loc 1 453 9 shl.b64 %r547,%r546,4; sub.u64 %r170,%r174,%r547; .loc 1 454 5 add.u64 %r171,%r171,1; .loc 1 455 7 add.u64 %r219,%r219,8; add.u64 %r218,%r218,8; setp.eq.u64 %r548,%r171,%r613; @ %r548 bra $L54; $L98: .loc 1 463 16 ld.u64 %r549,[%r219]; add.u64 %r124,%r549,1; st.u64 [%r219],%r124; .loc 1 464 23 add.u64 %r551,%r614,%r218; ld.u64 %r125,[%r551]; .loc 1 464 13 sub.u64 %r552,%r125,%r112; add.u64 %r553,%r552,%r552; add.u64 %r172,%r172,%r553; .loc 1 465 24 add.u64 %r554,%frame,%r218; ld.u64 %r127,[%r554]; .loc 1 465 14 add.u64 %r173,%r169,%r127; .loc 1 466 23 add.u64 %r556,%r605,%r218; ld.u64 %r129,[%r556]; .loc 1 466 13 shl.b64 %r557,%r129,4; add.u64 %r174,%r170,%r557; .loc 1 444 32 add.u64 %r559,%r615,%r218; ld.u64 %r133,[%r559]; .loc 1 444 13 setp.eq.u64 %r560,%r124,%r133; @ %r560 bra $L99; .loc 1 448 13 mov.u64 %r189,0; bra $L101; $L122: .loc 1 322 6 ld.u64 %r561,[%r233]; setp.eq.u64 %r562,%r561,0; @ ! 
%r562 bra $L69; bra $L100; $L54_gfortran_sminloc1_16_i2 .visible .func _gfortran_sminloc1_16_, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L126; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L127; $L126:r152154; call _gfortran_minloc1_16_500 7 bra $L125; $L127: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r297,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L129; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L130; $L136: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L131; bra $L184; $L129: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error130: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r303,0; $L135: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L133; .loc 1 515 17 st.u64 [%r139],%r36; bra $L134; $L133: .loc 1 518 12 st.u64 [%r139],%r303; $L134: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L135; bra $L136; $L184: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L138; bra $L137; $L131: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r302,0; $L141: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L139; .loc 1 523 17 st.u64 [%r43],%r41; bra $L140; $L139: .loc 1 527 12 st.u64 [%r43],%r302; $L140: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L141; bra $L185; $L137: .loc 1 534 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L143; add.u64 %r296,%frame,120; $L163: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L144; $L143: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L145; bra $L186; $L147: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L144: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L147; bra $L143; $L145: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L125; $L186: .loc 1 557 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L151: .loc 1 583 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L148; bra $L149; $L138: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L150; .loc 1 562 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error150: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L151; .loc 1 569 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L149; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L153: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L152; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error152: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L153; $L148: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L154; shl.b64 %r266,%r297,3; bra $L155; $L154: mov.u64 %r266,8; $L155: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L156: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L156; $L149: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,4; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 593 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L162: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L162; .loc 1 606 7 @ ! %r295 bra $L187; bra $L125; $L161: .loc 1 601 13 st.u64 [%r99],%r280; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L160; bra $L125; $L187: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; $L160: .loc 1 610 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L161; .loc 1 601 13 mov.u64 %r80,0; bra $L162; $L185: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r294,%r98,0; @ ! 
%r294 bra $L138; bra $L163; $L125: .loc 1 615 1 ret; } minloc1_4_i4.o/_gfortran_minloc1_4_i4 .visible .func _gfortran_minloc1_4_fortran/generated/minloc1_4_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_4_i4 .visible .func _gfortran_mminloc1_4__gfortran_sminloc1_4_i4 .visible .func _gfortran_sminloc1_4_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_4_i4 .visible .func _gfortran_minloc1_4_i4480predpred %r238; .reg .predpred %r262; .reg .pred %r264; .reg .pred %r265; .reg .u64u64predu64 %r318; .reg .u32u64 %r325; .reg .pred %r326; .reg .pred %r327; .reg .u32 %r329; .reg .predu32 %r335; .reg .pred %r336; .reg .predu64u64predmov.u64 %r188,%ar0; mov.u64 %r189,%ar1; mov.u64 %r190,%ar2; mov.u32 %r191,%ar3; .loc 1 58 10 ld.s8 %r22,[%r189+28]; .loc 1 58 38 add.u32 %r192,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r118,%r192; .loc 1 59 10 ld.u64 %r25,[%r190]; .loc 1 59 7 add.u64 %r119,%r25,-1; .loc 1 61 7 shr.u64 %r194,%r119,63; set.u32.lt.s64 %r196,%r118,%r119; neg.s32 %r197,%r196; cvt.u16.u64 %r200,%r194; cvt.u16.u32 %r201,%r197; or.b16 %r199,%r200,%r201; .loc 1 61 6 cvt.u32.u16 %r202,%r199; cvt.u16.u8 %r203,%r202; setp.eq.u16 %r204,%r203,0; @ %r204 bra $L2; .loc 1 63 7 cvt.u32.u32 %r207,%r22; cvt.s64.s8 %r206,%r207; st.u64 [%stack+8],%r20605205_gfortran_runtime_errorr210,%r119,%r119; add.u64 %r211,%r210,%r119; shl.b64 %r212,%r211,3; add.u64 %r213,%r189,%r212; ld.u64 %r30,[%r213+56]; ld.u64 %r32,[%r213+48]; .loc 1 71 9 ld.u64 %r122,[%r213+40]; .loc 1 73 3 setp.ne.u64 %r227,%r119,0; @ %r227 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r228,%r118,%r119; @ %r228 bra $L4; bra $L52; $L3: add.u64 %r138,%r189,40; add.u64 %r186,%frame,120; add.u64 %r185,%frame,240; add.u64 %r229,%r189,16; add.u64 %r231,%r25,%r25; add.u64 %r232,%r231,%r25; shl.b64 %r233,%r232,3; add.u64 %r145,%r229,%r233; .loc 1 79 12 mov.u64 %r372,0; $L8: .loc 1 75 18 ld.u64 %r234,[%r138]; st.u64 [%r186],%r234; .loc 1 76 19 ld.u64 %r236,[%r138+16]; add.u64 %r235,%r236,1; ld.u64 %r237,[%r138+8]; sub.u64 
%r37,%r235,%r237; .loc 1 78 10 setp.lt.s64 %r238,%r37,0; @ %r238 bra $L6; .loc 1 76 17 st.u64 [%r185],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 [%r185],%r372; $L7: .loc 1 73 3 add.u64 %r138,%r138,24; add.u64 %r186,%r186,8; add.u64 %r185,%r185,8; setp.ne.u64 %r240,%r138,%r145; @ %r240 bra $L8; bra $L9; $L52: .loc 1 90 6 ld.u64 %r241,[%r188]; setp.eq.u64 %r242,%r241,0; @ ! %r242 bra $L11; bra $L10; $L4: add.u64 %r244,%r25,%r25; add.u64 %r245,%r244,%r25; shl.b64 %r246,%r245,3; add.u64 %r247,%r246,40; add.u64 %r114,%r189,%r247; shl.b64 %r248,%r25,3; add.u64 %r102,%r248,-8; add.u64 %r362,%frame,120; add.u64 %r107,%r362,%r102; add.u64 %r361,%frame,240; add.u64 %r100,%r361,%r102; cvt.u32.u32 %r252,%r22; cvt.s64.s8 %r251,%r252; add.u64 %r254,%r251,%r251; add.u64 %r255,%r254,%r251; shl.b64 %r256,%r255,3; add.u64 %r257,%r189,40; add.u64 %r143,%r256,%r257; .loc 1 87 12 mov.u64 %r371,0; $L14: .loc 1 83 18 ld.u64 %r258,[%r114]; st.u64 [%r107],%r258; .loc 1 84 19 ld.u64 %r260,[%r114+16]; add.u64 %r259,%r260,1; ld.u64 %r261,[%r114+8]; sub.u64 %r43,%r259,%r261; .loc 1 86 10 setp.lt.s64 %r262,%r43,0; @ %r262 bra $L12; .loc 1 84 17 st.u64 [%r100],%r43; bra $L13; $L12: .loc 1 87 12 st.u64 [%r100],%r371; $L13: .loc 1 81 3 add.u64 %r114,%r114,24; add.u64 %r107,%r107,8; add.u64 %r100,%r100,8; setp.ne.u64 %r264,%r114,%r143; @ %r264 bra $L14; bra $L53; $L10: .loc 1 94 7 setp.le.s64 %r265,%r118,0; @ %r265 bra $L16; add.u64 %r361,%frame,240; $L39: add.u64 %r155,%r188,40; mov.u64 %r151,%r361; cvt.u32.u32 %r267,%r22; cvt.s64.s8 %r266,%r267; add.u64 %r269,%r266,%r266; add.u64 %r270,%r269,%r266; shl.b64 %r271,%r270,3; add.u64 %r272,%r188,16; add.u64 %r115,%r271,%r272; .loc 1 97 10 mov.u64 %r105,1; .loc 1 101 4 mov.u64 %r295,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r273,0; st.u64 [%r188+8],%r273; .loc 1 106 28 cvt.u16.u32 %r276,%r22; add.u16 %r275,%r276,-1; cvt.u32.u16 %r277,%r275; st.u8 [%r188+28],%r277; .loc 1 108 20 add.u32 %r278,%r22,-2; cvt.s64.s32 %r52,%r278; add.u64 %r280,%r52,%r52; add.u64 %r281,%r280,%r52; shl.b64 %r282,%r281,3; add.u64 %r283,%r188,%r282; .loc 1 108 67 shl.b64 %r285,%r52,3; add.u64 %r286,%frame,%r285; .loc 1 108 59 ld.u64 %r288,[%r283+40]; ld.u64 %r289,[%r286+240]; mul.lo.u64 %r124,%r288,%r289; .loc 1 110 29291; call (%value_in),_gfortrani_xmallocarray292,[%value_in]; } .loc 1 110 27 st.u64 [%r188],%r292; .loc 1 111 10 setp.eq.u64 %r294,%r124,0; @ ! 
[PTX payload, summarized: the remainder of this dump is extraction-garbled nvptx PTX assembly from the libgfortran offload archive shipped in this package. The instruction streams are damaged (register declarations fused together, truncated call sequences and function headers), so the bodies are not reproduced; the recoverable content is listed below.

- Archive members visible here: the tail of the kind-4 MINLOC object, followed by minloc1_8_i4.o (source reference "fortran/generated/minloc1_8_i4.c") and minloc1_16_i4.o (source reference "minloc1_16_i4.c").
- Entry points defined or declared: _gfortran_minloc1_4_i4, _gfortran_mminloc1_4_i4, _gfortran_sminloc1_4_i4; _gfortran_minloc1_8_i4, _gfortran_mminloc1_8_i4, _gfortran_sminloc1_8_i4; _gfortran_minloc1_16_i4, _gfortran_mminloc1_16_i4, _gfortran_sminloc1_16_i4. Each is a .visible .func taking array-descriptor pointers plus a final .u32 parameter, and each body carries .loc line references into the generated C source.
- External symbols referenced by every member: _gfortrani_compile_options (.extern .global .align 8 .u64), _gfortran_runtime_error (.extern .func), _gfortrani_xmallocarray (.extern .func), _gfortrani_bounds_ifunction_return (.extern .func), _gfortrani_bounds_equal_extents (.extern .func).
- Constant string data (decoded from the byte arrays; several are cut off where the dump is damaged and are left truncated here): "rank of return array incorrect in MINLOC intrinsic: is %ld, should be %l", "MINLOC", "...incorrect in MINLOC intrinsic: is %ld, should be between 1 and %l", "Funny sized logical array", "MASK argumen", "Incorrect extent in return value of MINLOC intrinsic in dimension %ld: is %ld, should be %l", "return val".]
.loc 1 306 19 ld.u64 %r321,[%r164+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r164+8]; sub.u64 %r52,%r320,%r322; .loc 1 308 10 setp.lt.s64 %r323,%r52,0; @ %r323 bra $L64; .loc 1 306 17 add.u64 %r325,%r600,%r53; st.u64 [%r325],%r52; bra $L65; $L64: .loc 1 309 12 add.u64 %r327,%r600,%r53; st.u64 [%r327],%r611; $L65: .loc 1 302 3 add.u64 %r164,%r164,24; add.u64 %r150,%r150,24; add.u64 %r53,%r53,8; setp.ne.u64 %r329,%r164,%r223; @ %r329 bra $L66; bra $L67; $L121: .loc 1 322 6 ld.u64 %r330,[%r233]; setp.eq.u64 %r331,%r330,0; @ ! %r331 bra $L69; bra $L68; $L62: add.u64 %r333,%r22,%r22; add.u64 %r334,%r333,%r22; shl.b64 %r335,%r334,3; add.u64 %r178,%r335,40; add.u64 %r59,%r234,%r178; add.u64 %r180,%r236,%r178; shl.b64 %r336,%r22,3; add.u64 %r91,%r336,-8; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; shl.b64 %r339,%r337,3; add.u64 %r97,%r339,-8; add.u64 %r597,%frame,240; add.u64 %r600,%frame,360; .loc 1 319 12 mov.u64 %r610,0; $L72: .loc 1 314 18 add.u64 %r341,%r597,%r91; ld.u64 %r342,[%r59]; st.u64 [%r341],%r342; .loc 1 315 18 add.u64 %r343,%frame,%r91; .loc 1 315 20 ld.u64 %r345,[%r180]; mul.lo.u64 %r344,%r345,%r35; .loc 1 315 18 st.u64 [%r343],%r344; .loc 1 316 19 ld.u64 %r347,[%r59+16]; add.u64 %r346,%r347,1; ld.u64 %r348,[%r59+8]; sub.u64 %r66,%r346,%r348; .loc 1 318 10 setp.lt.s64 %r349,%r66,0; @ %r349 bra $L70; .loc 1 316 17 add.u64 %r351,%r600,%r91; st.u64 [%r351],%r66; bra $L71; $L70: .loc 1 319 12 add.u64 %r353,%r600,%r91; st.u64 [%r353],%r610; $L71: .loc 1 312 3 add.u64 %r59,%r59,24; add.u64 %r180,%r180,24; add.u64 %r91,%r91,8; setp.ne.u64 %r355,%r91,%r97; @ %r355 bra $L72; bra $L122; $L68: .loc 1 326 7 setp.eq.u64 %r356,%r154,0; @ %r356 bra $L74; add.u64 %r600,%frame,360; $L100: add.u64 %r137,%r233,40; mov.u64 %r134,%r600; cvt.u32.u32 %r358,%r23; cvt.s64.s8 %r357,%r358; add.u64 %r360,%r357,%r357; add.u64 %r361,%r360,%r357; shl.b64 %r362,%r361,3; add.u64 %r363,%r233,16; add.u64 %r64,%r362,%r363; .loc 1 329 10 mov.u64 %r136,1; .loc 1 333 4 mov.u64 %r382,0; bra $L75; $L74: .loc 1 337 20 add.u32 %r364,%r23,-2; cvt.s64.s32 %r75,%r364; add.u64 %r366,%r75,%r75; add.u64 %r367,%r366,%r75; shl.b64 %r368,%r367,3; add.u64 %r369,%r233,%r368; .loc 1 337 67 shl.b64 %r371,%r75,3; add.u64 %r372,%frame,%r371; .loc 1 337 59 ld.u64 %r374,[%r369+40]; ld.u64 %r375,[%r372+360]; mul.lo.u64 %r162,%r374,%r375; .loc 1 339 24 mov.u64 %r376,0; st.u64 [%r233+8],%r376; .loc 1 340 28 cvt.u16.u32 %r379,%r23; add.u16 %r378,%r379,-1; cvt.u32.u16 %r380,%r378; st.u8 [%r233+28],%r380; .loc 1 342 10 setp.eq.u64 %r381,%r162,0; @ %r381 bra $L76; bra $L123; $L78: .loc 1 331 47 mul.lo.u64 %r136,%r72,%r136; $L75: .loc 1 333 4 st.u64 [%r137+8],%r382; ld.u64 %r72,[%r134]; add.u64 %r383,%r72,-1; st.u64 [%r137+16],%r383; st.u64 [%r137],%r136; .loc 1 326 7 add.u64 %r137,%r137,24; add.u64 %r134,%r134,8; setp.ne.u64 %r384,%r64,%r137; @ %r384 bra $L78; bra $L74; $L76: .loc 1 345 4 st.u64 [%r233+48],%r162; mov.u64 %r386,-1; st.u64 [%r233+56],%r386; mov.u64 %r387,1; st.u64 [%r233+40],%r387; .loc 1 346 4 bra $L54; $L123: .loc 1 349 24 mov.u64 %r389,16389; call (%value_in),_gfortrani_xmallocarray90,[%value_in]; } .loc 1 349 22 st.u64 [%r233],%r390; bra $L79; $L69: .loc 1 354 19 ld.s8 %r392,[%r233+28]; .loc 1 354 10 setp.eq.u64 %r393,%r392,%r154; @ %r393 bra $L80; .loc 1 355 2 cvta.const.u64 %r3944_gfortran_runtime_error80: .loc 1 357 11 cvta.global.u64 %r396,_gfortrani_compile_options; .loc 1 357 10 ld.u32 %r397,[%r396+36]; setp.eq.u32 %r398,%r397,0; @ %r398 bra $L79; .loc 1 359 4 add.u64 %r600,%frame,360; cvta.const.u64 %r402,$LC2r600002; 
call _gfortrani_bounds_ifunction_return361 4 cvta.const.u64 %r406,$LC406402; call _gfortrani_bounds_equal_extents$L79: .loc 1 366 3 setp.ne.u64 %r408,%r154,0; @ %r408 bra $L81; $L84: .loc 1 374 8 ld.u64 %r174,[%r233]; .loc 1 375 8 ld.u64 %r172,[%r234]; .loc 1 377 9 setp.ne.u64 %r409,%r172,0; @ %r409 bra $L82; bra $L54; $L81: add.u64 %r188,%frame,480; add.u64 %r181,%r233,40; add.u64 %r149,%frame,120; cvt.u32.u32 %r411,%r23; cvt.s64.s8 %r410,%r411; add.u64 %r138,%r410,-1; .loc 1 366 3 mov.u64 %r175,0; add.u64 %r600,%frame,360; .loc 1 368 16 mov.u64 %r412,%r175; $L83: st.u64 [%r188],%r412; .loc 1 369 18 ld.u64 %r413,[%r181]; st.u64 [%r149],%r413; .loc 1 370 17 shl.b64 %r415,%r175,3; add.u64 %r416,%r600,%r415; .loc 1 370 10 ld.u64 %r417,[%r416]; setp.le.s64 %r418,%r417,0; @ %r418 bra $L54; .loc 1 366 26 add.u64 %r175,%r175,1; .loc 1 366 3 add.u64 %r188,%r188,8; add.u64 %r181,%r181,24; add.u64 %r149,%r149,8; setp.ne.u64 %r419,%r138,%r175; @ %r419 bra $L83; bra $L84; $L82: .loc 1 300 12 mul.lo.u64 %r43,%r41,%r35; .loc 1 396 32 shl.b64 %r88,%r158,2; ld.u64 %r189,[%frame+480]; .loc 1 440 22 ld.u64 %r102,[%frame+240]; .loc 1 440 12 shl.b64 %r103,%r102,2; .loc 1 441 23 ld.u64 %r105,[%frame]; .loc 1 442 22 ld.u64 %r107,[%frame+120]; .loc 1 442 12 shl.b64 %r108,%r107,4; .loc 1 444 32 ld.u64 %r29,[%frame+360]; .loc 1 451 23 mul.lo.u64 %r39,%r29,%r102; .loc 1 452 24 mul.lo.u64 %r420,%r29,%r105; .loc 1 452 10 neg.s64 %r92,%r420; .loc 1 453 23 mul.lo.u64 %r421,%r29,%r107; .loc 1 453 9 shl.b64 %r422,%r421,4; neg.s64 %r185,%r422; setp.le.s64 %r595,%r154,1; .loc 1 418 9 setp.ne.u32 %r603,%r237,0; add.u64 %r604,%r155,-1; cvt.u32.u32 %r605,%r23; cvt.s64.s8 %r606,%r605; add.u64 %r607,%r606,-1; add.u64 %r608,%frame,240; add.u64 %r609,%frame,360; $L101: .loc 1 366 3 mov.u64 %r168,%r173; mov.u64 %r167,%r172; .loc 1 396 9 mov.u64 %r197,0; $L88: .loc 1 399 7 ld.s8 %r224,[%r168]; mov.u64 %r230,%r197; .loc 1 396 24 add.u64 %r197,%r197,1; .loc 1 399 6 setp.eq.u32 %r598,%r224,0; @ %r598 bra $L85; .loc 1 407 11 ld.u32 %r145,[%r167]; .loc 1 408 11 mov.u64 %r557,%r197; shr.s64 %r558,%r557,63; .loc 1 418 9 @ %r603 bra $L86; bra $L124; $L85: .loc 1 396 32 add.u64 %r167,%r167,%r88; .loc 1 396 47 add.u64 %r168,%r168,%r43; .loc 1 396 2 setp.ne.u64 %r426,%r155,%r197; @ %r426 bra $L88; .loc 1 395 9 mov.u64 %r557,0; mov.u64 %r558,%r557; bra $L89; $L124: .loc 1 428 10 setp.gt.s64 %r427,%r155,%r230; @ %r427 bra $L90; bra $L89; $L86: .loc 1 419 8 setp.le.s64 %r428,%r155,%r230; @ %r428 bra $L89; add.u64 %r565,%r230,1; shr.s64 %r566,%r565,63; sub.u64 %r432,%r604,%r230; shr.s64 %r435,%r230,63; add.u64 %r438,%r230,2; set.u32.lt.u64 %r443,%r438,%r230; cvt.s64.s32 %r441,%r443; sub.u64 %r447,%r435,%r441; add.u64 %r449,%r432,%r438; set.u32.lt.u64 %r454,%r449,%r432; cvt.s64.s32 %r452,%r454; sub.u64 %r458,%r447,%r452; mov.u64 %r577,1; $L93: .loc 1 421 8 @ %r598 bra $L91; .loc 1 421 18 ld.u32 %r89,[%r167]; .loc 1 421 15 setp.gt.s32 %r460,%r89,%r145; .loc 1 424 16 selp.u64 %r557,%r557,%r565,%r460; selp.u64 %r558,%r558,%r566,%r460; selp.u32 %r145,%r145,%r89,%r460; $L91: .loc 1 419 33 add.u64 %r167,%r167,%r88; .loc 1 419 48 add.u64 %r168,%r168,%r43; .loc 1 419 8 add.u64 %r463,%r565,%r577; set.u32.lt.u64 %r468,%r463,%r565; cvt.s64.s32 %r466,%r468; mov.u64 %r565,%r463; sub.u64 %r566,%r566,%r466; setp.ne.u64 %r475,%r449,%r565; @ %r475 bra $L102; setp.ne.u64 %r478,%r458,%r566; @ ! 
%r478 bra $L89; $L102: .loc 1 421 9 ld.s8 %r224,[%r168]; setp.eq.u32 %r598,%r224,0; bra $L93; $L90: add.u64 %r561,%r230,1; shr.s64 %r562,%r561,63; sub.u64 %r482,%r604,%r230; shr.s64 %r485,%r230,63; add.u64 %r488,%r230,2; set.u32.lt.u64 %r493,%r488,%r230; cvt.s64.s32 %r491,%r493; sub.u64 %r497,%r485,%r491; add.u64 %r499,%r482,%r488; set.u32.lt.u64 %r504,%r499,%r482; cvt.s64.s32 %r502,%r504; sub.u64 %r508,%r497,%r502; .loc 1 428 10 mov.u64 %r591,1; $L116: .loc 1 430 10 ld.s8 %r510,[%r168]; cvt.u16.u32 %r509,%r510; setp.eq.u16 %r511,%r509,0; @ %r511 bra $L94; .loc 1 430 20 ld.u32 %r94,[%r167]; .loc 1 430 17 setp.ge.s32 %r512,%r94,%r145; .loc 1 433 11 selp.u64 %r557,%r557,%r561,%r512; selp.u64 %r558,%r558,%r562,%r512; selp.u32 %r145,%r145,%r94,%r512; $L94: .loc 1 428 35 add.u64 %r167,%r167,%r88; .loc 1 428 50 add.u64 %r168,%r168,%r43; .loc 1 428 10 add.u64 %r515,%r561,%r591; set.u32.lt.u64 %r520,%r515,%r561; cvt.s64.s32 %r518,%r520; mov.u64 %r561,%r515; sub.u64 %r562,%r562,%r518; setp.ne.u64 %r527,%r499,%r561; @ %r527 bra $L116; setp.ne.u64 %r530,%r508,%r562; @ %r530 bra $L116; $L89: .loc 1 436 8 st.u64 [%r174],%r557; st.u64 [%r174+8],%r558; .loc 1 439 15 add.u64 %r189,%r189,1; .loc 1 440 12 add.u64 %r172,%r172,%r103; .loc 1 441 13 add.u64 %r173,%r173,%r105; .loc 1 442 12 add.u64 %r174,%r174,%r108; .loc 1 444 13 setp.ne.u64 %r533,%r29,%r189; @ %r533 bra $L101; .loc 1 452 10 add.u64 %r169,%r173,%r92; .loc 1 453 9 add.u64 %r170,%r174,%r185; .loc 1 455 7 @ %r595 bra $L54; add.u64 %r219,%frame,488; .loc 1 451 23 mov.u64 %r112,%r39; .loc 1 455 7 mov.u64 %r218,8; .loc 1 454 5 mov.u64 %r171,1; add.u64 %r599,%frame,120; .loc 1 448 13 mov.u64 %r602,0; bra $L98; $L99: st.u64 [%r219],%r602; .loc 1 451 23 mul.lo.u64 %r112,%r125,%r124; .loc 1 452 24 mul.lo.u64 %r539,%r127,%r124; .loc 1 452 10 sub.u64 %r169,%r173,%r539; .loc 1 453 23 mul.lo.u64 %r540,%r129,%r124; .loc 1 453 9 shl.b64 %r541,%r540,4; sub.u64 %r170,%r174,%r541; .loc 1 454 5 add.u64 %r171,%r171,1; .loc 1 455 7 add.u64 %r219,%r219,8; add.u64 %r218,%r218,8; setp.eq.u64 %r542,%r171,%r607; @ %r542 bra $L54; $L98: .loc 1 463 16 ld.u64 %r543,[%r219]; add.u64 %r124,%r543,1; st.u64 [%r219],%r124; .loc 1 464 23 add.u64 %r545,%r608,%r218; ld.u64 %r125,[%r545]; .loc 1 464 13 sub.u64 %r546,%r125,%r112; shl.b64 %r547,%r546,2; add.u64 %r172,%r172,%r547; .loc 1 465 24 add.u64 %r548,%frame,%r218; ld.u64 %r127,[%r548]; .loc 1 465 14 add.u64 %r173,%r169,%r127; .loc 1 466 23 add.u64 %r550,%r599,%r218; ld.u64 %r129,[%r550]; .loc 1 466 13 shl.b64 %r551,%r129,4; add.u64 %r174,%r170,%r551; .loc 1 444 32 add.u64 %r553,%r609,%r218; ld.u64 %r133,[%r553]; .loc 1 444 13 setp.eq.u64 %r554,%r124,%r133; @ %r554 bra $L99; .loc 1 448 13 mov.u64 %r189,0; bra $L101; $L122: .loc 1 322 6 ld.u64 %r555,[%r233]; setp.eq.u64 %r556,%r555,0; @ ! 
%r556 bra $L69; bra $L100; $L54_gfortran_sminloc1_16_i4 .visible .func _gfortran_sminloc1_16_, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L126; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L127; $L126:r152154; call _gfortran_minloc1_16_500 7 bra $L125; $L127: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r297,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L129; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L130; $L136: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L131; bra $L184; $L129: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error130: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r303,0; $L135: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L133; .loc 1 515 17 st.u64 [%r139],%r36; bra $L134; $L133: .loc 1 518 12 st.u64 [%r139],%r303; $L134: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L135; bra $L136; $L184: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L138; bra $L137; $L131: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r302,0; $L141: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L139; .loc 1 523 17 st.u64 [%r43],%r41; bra $L140; $L139: .loc 1 527 12 st.u64 [%r43],%r302; $L140: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L141; bra $L185; $L137: .loc 1 534 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L143; add.u64 %r296,%frame,120; $L163: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L144; $L143: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L145; bra $L186; $L147: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L144: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L147; bra $L143; $L145: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L125; $L186: .loc 1 557 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L151: .loc 1 583 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L148; bra $L149; $L138: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L150; .loc 1 562 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error150: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L151; .loc 1 569 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L149; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L153: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L152; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error152: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L153; $L148: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L154; shl.b64 %r266,%r297,3; bra $L155; $L154: mov.u64 %r266,8; $L155: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L156: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L156; $L149: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,4; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 593 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L162: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L162; .loc 1 606 7 @ ! %r295 bra $L187; bra $L125; $L161: .loc 1 601 13 st.u64 [%r99],%r280; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L160; bra $L125; $L187: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; $L160: .loc 1 610 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L161; .loc 1 601 13 mov.u64 %r80,0; bra $L162; $L185: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r294,%r98,0; @ ! 
%r294 bra $L138; bra $L163; $L125: .loc 1 615 1 ret; } minloc1_4_i8.o/_gfortran_minloc1_4_i8 .visible .func _gfortran_minloc1_4_i8.file 1 "../../../libgfortran/generated/minloc1_4_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_4_i8 .visible .func _gfortran_mminloc1_4__gfortran_sminloc1_4_i8 .visible .func _gfortran_sminloc1_4_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_4_i8 .visible .func _gfortran_minloc1_4_i8480predu64 %r258; .reg .u64pred %r262; .reg .pred %r264; .reg .pred %r265; .reg .u64u64predu64 %r318; .reg .u32u64 %r325; .reg .pred %r326; .reg .pred %r327; .reg .u32 %r329; .reg .predu32 %r335; .reg .pred %r336; .reg .predu64u64predmov.u64 %r188,%ar0; mov.u64 %r189,%ar1; mov.u64 %r190,%ar2; mov.u32 %r191,%ar3; .loc 1 58 10 ld.s8 %r22,[%r189+28]; .loc 1 58 38 add.u32 %r192,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r118,%r192; .loc 1 59 10 ld.u64 %r25,[%r190]; .loc 1 59 7 add.u64 %r119,%r25,-1; .loc 1 61 7 shr.u64 %r194,%r119,63; set.u32.lt.s64 %r196,%r118,%r119; neg.s32 %r197,%r196; cvt.u16.u64 %r200,%r194; cvt.u16.u32 %r201,%r197; or.b16 %r199,%r200,%r201; .loc 1 61 6 cvt.u32.u16 %r202,%r199; cvt.u16.u8 %r203,%r202; setp.eq.u16 %r204,%r203,0; @ %r204 bra $L2; .loc 1 63 7 cvt.u32.u32 %r207,%r22; cvt.s64.s8 %r206,%r207; st.u64 [%stack+8],%r20605205_gfortran_runtime_errorr210,%r119,%r119; add.u64 %r211,%r210,%r119; shl.b64 %r212,%r211,3; add.u64 %r213,%r189,%r212; ld.u64 %r30,[%r213+56]; ld.u64 %r32,[%r213+48]; .loc 1 71 9 ld.u64 %r122,[%r213+40]; .loc 1 73 3 setp.ne.u64 %r227,%r119,0; @ %r227 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r228,%r118,%r119; @ %r228 bra $L4; bra $L52; $L3: add.u64 %r138,%r189,40; add.u64 %r186,%frame,120; add.u64 %r185,%frame,240; add.u64 %r229,%r189,16; add.u64 %r231,%r25,%r25; add.u64 %r232,%r231,%r25; shl.b64 %r233,%r232,3; add.u64 %r145,%r229,%r233; .loc 1 79 12 mov.u64 %r372,0; $L8: .loc 1 75 18 ld.u64 %r234,[%r138]; st.u64 [%r186],%r234; .loc 1 76 19 ld.u64 %r236,[%r138+16]; add.u64 %r235,%r236,1; ld.u64 %r237,[%r138+8]; 
sub.u64 %r37,%r235,%r237; .loc 1 78 10 setp.lt.s64 %r238,%r37,0; @ %r238 bra $L6; .loc 1 76 17 st.u64 [%r185],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 [%r185],%r372; $L7: .loc 1 73 3 add.u64 %r138,%r138,24; add.u64 %r186,%r186,8; add.u64 %r185,%r185,8; setp.ne.u64 %r240,%r138,%r145; @ %r240 bra $L8; bra $L9; $L52: .loc 1 90 6 ld.u64 %r241,[%r188]; setp.eq.u64 %r242,%r241,0; @ ! %r242 bra $L11; bra $L10; $L4: add.u64 %r244,%r25,%r25; add.u64 %r245,%r244,%r25; shl.b64 %r246,%r245,3; add.u64 %r247,%r246,40; add.u64 %r114,%r189,%r247; shl.b64 %r248,%r25,3; add.u64 %r102,%r248,-8; add.u64 %r362,%frame,120; add.u64 %r107,%r362,%r102; add.u64 %r361,%frame,240; add.u64 %r100,%r361,%r102; cvt.u32.u32 %r252,%r22; cvt.s64.s8 %r251,%r252; add.u64 %r254,%r251,%r251; add.u64 %r255,%r254,%r251; shl.b64 %r256,%r255,3; add.u64 %r257,%r189,40; add.u64 %r143,%r256,%r257; .loc 1 87 12 mov.u64 %r371,0; $L14: .loc 1 83 18 ld.u64 %r258,[%r114]; st.u64 [%r107],%r258; .loc 1 84 19 ld.u64 %r260,[%r114+16]; add.u64 %r259,%r260,1; ld.u64 %r261,[%r114+8]; sub.u64 %r43,%r259,%r261; .loc 1 86 10 setp.lt.s64 %r262,%r43,0; @ %r262 bra $L12; .loc 1 84 17 st.u64 [%r100],%r43; bra $L13; $L12: .loc 1 87 12 st.u64 [%r100],%r371; $L13: .loc 1 81 3 add.u64 %r114,%r114,24; add.u64 %r107,%r107,8; add.u64 %r100,%r100,8; setp.ne.u64 %r264,%r114,%r143; @ %r264 bra $L14; bra $L53; $L10: .loc 1 94 7 setp.le.s64 %r265,%r118,0; @ %r265 bra $L16; add.u64 %r361,%frame,240; $L39: add.u64 %r155,%r188,40; mov.u64 %r151,%r361; cvt.u32.u32 %r267,%r22; cvt.s64.s8 %r266,%r267; add.u64 %r269,%r266,%r266; add.u64 %r270,%r269,%r266; shl.b64 %r271,%r270,3; add.u64 %r272,%r188,16; add.u64 %r115,%r271,%r272; .loc 1 97 10 mov.u64 %r105,1; .loc 1 101 4 mov.u64 %r295,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r273,0; st.u64 [%r188+8],%r273; .loc 1 106 28 cvt.u16.u32 %r276,%r22; add.u16 %r275,%r276,-1; cvt.u32.u16 %r277,%r275; st.u8 [%r188+28],%r277; .loc 1 108 20 add.u32 %r278,%r22,-2; cvt.s64.s32 %r52,%r278; add.u64 %r280,%r52,%r52; add.u64 %r281,%r280,%r52; shl.b64 %r282,%r281,3; add.u64 %r283,%r188,%r282; .loc 1 108 67 shl.b64 %r285,%r52,3; add.u64 %r286,%frame,%r285; .loc 1 108 59 ld.u64 %r288,[%r283+40]; ld.u64 %r289,[%r286+240]; mul.lo.u64 %r124,%r288,%r289; .loc 1 110 29291; call (%value_in),_gfortrani_xmallocarray292,[%value_in]; } .loc 1 110 27 st.u64 [%r188],%r292; .loc 1 111 10 setp.eq.u64 %r294,%r124,0; @ ! 
%r294 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r105,%r48,%r105; $L17: .loc 1 101 4 st.u64 [%r155+8],%r295; ld.u64 %r48,[%r151]; add.u64 %r296,%r48,-1; st.u64 [%r155+16],%r296; st.u64 [%r155],%r105; .loc 1 94 7 add.u64 %r155,%r155,24; add.u64 %r151,%r151,8; setp.ne.u64 %r297,%r115,%r155; @ %r297 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r188+48],%r124; mov.u64 %r299,-1; st.u64 [%r188+56],%r299; mov.u64 %r300,1; st.u64 [%r188+40],%r300; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r59,[%r188+28]; .loc 1 121 10 setp.eq.u64 %r301,%r59,%r118; @ %r301 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r59;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r304,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r305,[%r304+36]; setp.eq.u32 %r306,%r305,0; @ %r306 bra $L21; .loc 1 128 2 add.u64 %r361,%frame,240; cvta.const.u64 %r310,$LC2r361309310; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r312,%r118,0; @ %r312 bra $L24; $L27: .loc 1 68 9 add.u64 %r313,%r30,1; .loc 1 68 7 sub.u64 %r120,%r313,%r32; max.s64 %r86,%r120,0; .loc 1 140 8 ld.u64 %r134,[%r189]; .loc 1 141 8 ld.u64 %r135,[%r188]; .loc 1 190 33 shl.b64 %r75,%r122,3; ld.u64 %r153,[%frame+360]; .loc 1 204 22 ld.u64 %r77,[%frame+120]; .loc 1 204 12 shl.b64 %r78,%r77,3; .loc 1 205 22 ld.u64 %r80,[%frame]; .loc 1 205 12 shl.b64 %r81,%r80,2; .loc 1 207 32 ld.u64 %r68,[%frame+240]; .loc 1 214 23 mul.lo.u64 %r314,%r68,%r77; .loc 1 214 9 shl.b64 %r315,%r314,3; neg.s64 %r142,%r315; .loc 1 215 23 mul.lo.u64 %r316,%r68,%r80; .loc 1 215 9 shl.b64 %r317,%r316,2; neg.s64 %r140,%r317; setp.gt.s64 %r359,%r120,0; setp.le.s64 %r360,%r118,1; .loc 1 180 9 setp.eq.u32 %r364,%r191,0; .loc 1 155 9 mov.u64 %r365,9223372036854775807; cvt.u32.u32 %r366,%r22; cvt.s64.s8 %r367,%r366; add.u64 %r368,%r367,-1; add.u64 %r369,%frame,120; add.u64 %r370,%frame,240; bra $L25; $L24: add.u64 %r171,%frame,360; add.u64 %r170,%r188,40; mov.u64 %r168,%frame; cvt.u32.u32 %r319,%r22; cvt.s64.s8 %r318,%r319; add.u64 %r159,%r318,-1; .loc 1 132 3 mov.u64 %r136,0; add.u64 %r361,%frame,240; .loc 1 134 16 mov.u64 %r320,%r136; $L26: st.u64 [%r171],%r320; .loc 1 135 18 ld.u64 %r321,[%r170]; st.u64 [%r168],%r321; .loc 1 136 17 shl.b64 %r323,%r136,3; add.u64 %r324,%r361,%r323; .loc 1 136 10 ld.u64 %r325,[%r324]; setp.le.s64 %r326,%r325,0; @ %r326 bra $L1; .loc 1 132 26 add.u64 %r136,%r136,1; .loc 1 132 3 add.u64 %r171,%r171,8; add.u64 %r170,%r170,24; add.u64 %r168,%r168,8; setp.ne.u64 %r327,%r136,%r159; @ %r327 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r359 bra $L28; .loc 1 159 10 mov.u32 %r329,0; st.u32 [%r135],%r329; bra $L29; $L28: .loc 1 180 9 @ %r364 bra $L40; mov.u64 %r129,%r134; .loc 1 155 9 mov.u64 %r108,%r365; .loc 1 157 9 mov.u32 %r106,1; .loc 1 178 8 mov.u64 %r128,0; $L32: .loc 1 183 9 ld.u64 %r67,[%r129]; .loc 1 183 8 setp.gt.s64 %r331,%r67,%r108; @ %r331 bra $L31; cvt.u32.u64 %r332,%r128; add.u32 %r106,%r332,1; mov.u64 %r108,%r67; $L31: .loc 1 181 25 add.u64 %r128,%r128,1; .loc 1 181 33 add.u64 %r129,%r129,%r75; .loc 1 181 8 setp.gt.s64 %r333,%r86,%r128; @ %r333 bra $L32; bra $L33; $L40: mov.u64 %r126,%r134; .loc 1 155 9 mov.u64 %r110,%r365; .loc 1 157 9 mov.u32 %r106,1; .loc 1 178 8 mov.u64 %r104,0; $L30: .loc 1 192 9 ld.u64 %r72,[%r126]; .loc 1 192 8 setp.ge.s64 %r334,%r72,%r110; @ %r334 bra $L34; cvt.u32.u64 %r335,%r104; add.u32 %r106,%r335,1; mov.u64 %r110,%r72; $L34: .loc 1 190 25 add.u64 %r104,%r104,1; .loc 1 190 33 add.u64 %r126,%r126,%r75; .loc 1 190 8 setp.gt.s64 %r336,%r86,%r104; @ %r336 bra $L30; $L33: 
.loc 1 199 12 st.u32 [%r135],%r106; $L29: .loc 1 203 15 add.u64 %r153,%r153,1; .loc 1 204 12 add.u64 %r134,%r134,%r78; .loc 1 205 12 add.u64 %r135,%r135,%r81; .loc 1 207 13 setp.ne.u64 %r337,%r68,%r153; @ %r337 bra $L25; .loc 1 214 9 add.u64 %r131,%r134,%r142; .loc 1 215 9 add.u64 %r132,%r135,%r140; .loc 1 217 7 @ %r360 bra $L1; add.u64 %r156,%frame,368; mov.u64 %r117,8; .loc 1 216 5 mov.u64 %r133,1; .loc 1 211 13 mov.u64 %r363,0; bra $L37; $L38: st.u64 [%r156],%r363; .loc 1 214 23 mul.lo.u64 %r343,%r93,%r92; .loc 1 214 9 shl.b64 %r344,%r343,3; sub.u64 %r131,%r134,%r344; .loc 1 215 23 mul.lo.u64 %r345,%r96,%r92; .loc 1 215 9 shl.b64 %r346,%r345,2; sub.u64 %r132,%r135,%r346; .loc 1 216 5 add.u64 %r133,%r133,1; .loc 1 217 7 add.u64 %r156,%r156,8; add.u64 %r117,%r117,8; setp.eq.u64 %r347,%r133,%r368; @ %r347 bra $L1; $L37: .loc 1 225 16 ld.u64 %r348,[%r156]; add.u64 %r92,%r348,1; st.u64 [%r156],%r92; .loc 1 226 23 add.u64 %r350,%r369,%r117; ld.u64 %r93,[%r350]; .loc 1 226 13 shl.b64 %r351,%r93,3; add.u64 %r134,%r131,%r351; .loc 1 227 23 add.u64 %r352,%frame,%r117; ld.u64 %r96,[%r352]; .loc 1 227 13 shl.b64 %r353,%r96,2; add.u64 %r135,%r132,%r353; .loc 1 207 32 add.u64 %r355,%r370,%r117; ld.u64 %r99,[%r355]; .loc 1 207 13 setp.eq.u64 %r356,%r92,%r99; @ %r356 bra $L38; .loc 1 211 13 mov.u64 %r153,0; bra $L25; $L53: .loc 1 90 6 ld.u64 %r357,[%r188]; setp.eq.u64 %r358,%r357,0; @ ! %r358 bra $L11; bra $L39; $L1:_gfortran_mminloc1_4_i8 .visible .func _gfortran_mminloc1_4_, .param .u32 %in_ar4) {60898u16 %r238; .reg .u16u64 %r303; .reg .u64 %r304; .reg .u64 %r305; .reg .u64 %r306; .reg .u64pred %r317; .reg .u64 %r318; .reg .pred %r319; .reg .u64u32u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64u64predu32 %r385; .reg .predu64 %r394; .reg .predu64predu64 %r410; .reg .u32 %r412; .reg .pred %r414; .reg .pred %r415; .reg .pred %r416; .reg .predpred %r423; .reg .predpredpred %r446; .reg .u64 %r447; .reg .predmov.u64 %r221,%ar0; mov.u64 %r222,%ar1; mov.u64 %r223,%ar2; mov.u64 %r224,%ar3; mov.u32 %r225,%ar4; .loc 1 261 6 setp.ne.u64 %r226,%r224,0; @ %r226 bra $L223225; call _gfortran_minloc1_4_268 7 bra $L54; $L55: .loc 1 271 10 ld.u64 %r22,[%r223]; .loc 1 271 7 add.u64 %r151,%r22,-1; .loc 1 272 10 ld.s8 %r23,[%r222+28]; .loc 1 272 38 add.u32 %r231,%r23,-1; .loc 1 272 8 cvt.s64.s32 %r152,%r231; .loc 1 275 7 shr.u64 %r233,%r151,63; set.u32.gt.s64 %r235,%r151,%r152; neg.s32 %r236,%r235; cvt.u16.u64 %r239,%r233; cvt.u16.u32 %r240,%r236; or.b16 %r238,%r239,%r240; .loc 1 275 6 cvt.u32.u16 %r241,%r238; cvt.u16.u8 %r242,%r241; setp.eq.u16 %r243,%r242,0; @ %r243 bra $L57; .loc 1 277 7 cvt.u32.u32 %r246,%r23; cvt.s64.s8 %r245,%r246; st.u64 [%stack+8],%r245; st.u64 [%stack],%r22; cvta.const.u64 %r244244_gfortran_runtime_error7: .loc 1 282 9 add.u64 %r449,%r151,%r151; add.u64 %r455,%r449,%r151; shl.b64 %r251,%r455,3; add.u64 %r252,%r222,%r251; ld.u64 %r255,[%r252+56]; add.u64 %r254,%r255,1; .loc 1 282 7 ld.u64 %r262,[%r252+48]; sub.u64 %r153,%r254,%r262; .loc 1 283 6 setp.le.s64 %r263,%r153,0; @ %r263 bra $L54; .loc 1 286 9 ld.u64 %r169,[%r224]; .loc 1 288 15 ld.u64 %r34,[%r224+16]; .loc 1 290 22 cvt.u32.u64 %r35,%r34; .loc 1 290 53 add.u32 %r264,%r35,-4; and.b32 %r265,%r264,-5; set.u32.eq.u32 %r267,%r265,0; neg.s32 %r268,%r267; .loc 1 290 22 add.u32 %r269,%r35,-1; set.u32.le.u32 %r271,%r269,1; neg.s32 %r272,%r271; .loc 1 290 6 cvt.u16.u32 %r274,%r268; cvt.u16.u32 %r275,%r272; or.loc 1 292 7 setp.ne.u32 %r280,%r35,16; @ %r280 bra $L60; $L59: .loc 1 299 9 shl.b64 %r284,%r455,3; add.u64 %r285,%r222,%r284; ld.u64 
%r156,[%r285+40]; .loc 1 300 12 add.u64 %r291,%r224,%r284; ld.u64 %r41,[%r291+40]; .loc 1 302 3 setp.ne.u64 %r293,%r151,0; @ %r293 bra $L61; $L67: .loc 1 312 3 setp.lt.s64 %r294,%r151,%r152; @ %r294 bra $L62; bra $L119; $L60: .loc 1 297 5 cvta.const.u64 %r2955_gfortran_runtime_error61: add.u64 %r96,%r222,40; add.u64 %r68,%r224,40; add.u64 %r298,%r22,%r22; add.u64 %r299,%r298,%r22; shl.b64 %r300,%r299,3; add.u64 %r301,%r222,16; add.u64 %r39,%r300,%r301; .loc 1 302 3 mov.u64 %r139,0; add.u64 %r452,%frame,240; add.u64 %r451,%frame,360; .loc 1 309 12 mov.u64 %r465,%r139; $L66: .loc 1 304 18 add.u64 %r303,%r452,%r139; ld.u64 %r304,[%r96]; st.u64 [%r303],%r304; .loc 1 305 18 add.u64 %r305,%frame,%r139; .loc 1 305 20 ld.u64 %r307,[%r68]; mul.lo.u64 %r306,%r307,%r34; .loc 1 305 18 st.u64 [%r305],%r306; .loc 1 306 19 ld.u64 %r309,[%r96+16]; add.u64 %r308,%r309,1; ld.u64 %r310,[%r96+8]; sub.u64 %r53,%r308,%r310; .loc 1 308 10 setp.lt.s64 %r311,%r53,0; @ %r311 bra $L64; .loc 1 306 17 add.u64 %r313,%r451,%r139; st.u64 [%r313],%r53; bra $L65; $L64: .loc 1 309 12 add.u64 %r315,%r451,%r139; st.u64 [%r315],%r465; $L65: .loc 1 302 3 add.u64 %r96,%r96,24; add.u64 %r68,%r68,24; add.u64 %r139,%r139,8; setp.ne.u64 %r317,%r39,%r96; @ %r317 bra $L66; bra $L67; $L119: .loc 1 322 6 ld.u64 %r318,[%r221]; setp.eq.u64 %r319,%r318,0; @ ! %r319 bra $L69; bra $L68; $L62: add.u64 %r321,%r22,%r22; add.u64 %r322,%r321,%r22; shl.b64 %r323,%r322,3; add.u64 %r130,%r323,40; add.u64 %r134,%r222,%r130; add.u64 %r111,%r224,%r130; shl.b64 %r324,%r22,3; add.u64 %r58,%r324,-8; cvt.u32.u32 %r326,%r23; cvt.s64.s8 %r325,%r326; shl.b64 %r327,%r325,3; add.u64 %r69,%r327,-8; add.u64 %r452,%frame,240; add.u64 %r451,%frame,360; .loc 1 319 12 mov.u64 %r464,0; $L72: .loc 1 314 18 add.u64 %r329,%r452,%r58; ld.u64 %r330,[%r134]; st.u64 [%r329],%r330; .loc 1 315 18 add.u64 %r331,%frame,%r58; .loc 1 315 20 ld.u64 %r333,[%r111]; mul.lo.u64 %r332,%r333,%r34; .loc 1 315 18 st.u64 [%r331],%r332; .loc 1 316 19 ld.u64 %r335,[%r134+16]; add.u64 %r334,%r335,1; ld.u64 %r336,[%r134+8]; sub.u64 %r65,%r334,%r336; .loc 1 318 10 setp.lt.s64 %r337,%r65,0; @ %r337 bra $L70; .loc 1 316 17 add.u64 %r339,%r451,%r58; st.u64 [%r339],%r65; bra $L71; $L70: .loc 1 319 12 add.u64 %r341,%r451,%r58; st.u64 [%r341],%r464; $L71: .loc 1 312 3 add.u64 %r134,%r134,24; add.u64 %r111,%r111,24; add.u64 %r58,%r58,8; setp.ne.u64 %r343,%r69,%r58; @ %r343 bra $L72; bra $L120; $L68: .loc 1 326 7 setp.eq.u64 %r344,%r152,0; @ %r344 bra $L74; add.u64 %r451,%frame,360; $L99: add.u64 %r148,%r221,40; mov.u64 %r146,%r451; cvt.u32.u32 %r346,%r23; cvt.s64.s8 %r345,%r346; add.u64 %r348,%r345,%r345; add.u64 %r349,%r348,%r345; shl.b64 %r350,%r349,3; add.u64 %r351,%r221,16; add.u64 %r135,%r350,%r351; .loc 1 329 10 mov.u64 %r136,1; .loc 1 333 4 mov.u64 %r370,0; bra $L75; $L74: .loc 1 337 20 add.u32 %r352,%r23,-2; cvt.s64.s32 %r76,%r352; add.u64 %r354,%r76,%r76; add.u64 %r355,%r354,%r76; shl.b64 %r356,%r355,3; add.u64 %r357,%r221,%r356; .loc 1 337 67 shl.b64 %r359,%r76,3; add.u64 %r360,%frame,%r359; .loc 1 337 59 ld.u64 %r362,[%r357+40]; ld.u64 %r363,[%r360+360]; mul.lo.u64 %r160,%r362,%r363; .loc 1 339 24 mov.u64 %r364,0; st.u64 [%r221+8],%r364; .loc 1 340 28 cvt.u16.u32 %r367,%r23; add.u16 %r366,%r367,-1; cvt.u32.u16 %r368,%r366; st.u8 [%r221+28],%r368; .loc 1 342 10 setp.eq.u64 %r369,%r160,0; @ %r369 bra $L76; bra $L121; $L78: .loc 1 331 47 mul.lo.u64 %r136,%r73,%r136; $L75: .loc 1 333 4 st.u64 [%r148+8],%r370; ld.u64 %r73,[%r146]; add.u64 %r371,%r73,-1; st.u64 [%r148+16],%r371; st.u64 
[%r148],%r136; .loc 1 326 7 add.u64 %r148,%r148,24; add.u64 %r146,%r146,8; setp.ne.u64 %r372,%r135,%r148; @ %r372 bra $L78; bra $L74; $L76: .loc 1 345 4 st.u64 [%r221+48],%r160; mov.u64 %r374,-1; st.u64 [%r221+56],%r374; mov.u64 %r375,1; st.u64 [%r221+40],%r375; .loc 1 346 4 bra $L54; $L121: .loc 1 349 24call (%value_in),_gfortrani_xmallocarray78,[%value_in]; } .loc 1 349 22 st.u64 [%r221],%r378; bra $L79; $L69: .loc 1 354 19 ld.s8 %r380,[%r221+28]; .loc 1 354 10 setp.eq.u64 %r381,%r380,%r152; @ %r381 bra $L80; .loc 1 355 2 cvta.const.u64 %r382,$LC5; {stack; call _gfortran_runtime_error80: .loc 1 357 11 cvta.global.u64 %r384,_gfortrani_compile_options; .loc 1 357 10 ld.u32 %r385,[%r384+36]; setp.eq.u32 %r386,%r385,0; @ %r386 bra $L79; .loc 1 359 4 add.u64 %r451,%frame,360; cvta.const.u64 %r390,$LC2390; call _gfortrani_bounds_ifunction_return361 4 cvta.const.u64 %r394,$LC222394390; call _gfortrani_bounds_equal_extents$L79: .loc 1 366 3 setp.ne.u64 %r396,%r152,0; @ %r396 bra $L81; $L84: .loc 1 374 8 ld.u64 %r170,[%r221]; .loc 1 375 8 ld.u64 %r168,[%r222]; .loc 1 377 9 setp.ne.u64 %r397,%r168,0; @ %r397 bra $L82; bra $L54; $L81: add.u64 %r198,%frame,480; add.u64 %r197,%r221,40; add.u64 %r195,%frame,120; cvt.u32.u32 %r399,%r23; cvt.s64.s8 %r398,%r399; add.u64 %r164,%r398,-1; .loc 1 366 3 mov.u64 %r171,0; add.u64 %r451,%frame,360; .loc 1 368 16 mov.u64 %r400,%r171; $L83: st.u64 [%r198],%r400; .loc 1 369 18 ld.u64 %r401,[%r197]; st.u64 [%r195],%r401; .loc 1 370 17 shl.b64 %r403,%r171,3; add.u64 %r404,%r451,%r403; .loc 1 370 10 ld.u64 %r405,[%r404]; setp.le.s64 %r406,%r405,0; @ %r406 bra $L54; .loc 1 366 26 add.u64 %r171,%r171,1; .loc 1 366 3 add.u64 %r198,%r198,8; add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; setp.ne.u64 %r407,%r164,%r171; @ %r407 bra $L83; bra $L84; $L82: .loc 1 300 12 mul.lo.u64 %r44,%r41,%r34; .loc 1 396 32 shl.b64 %r91,%r156,3; ld.u64 %r187,[%frame+480]; .loc 1 440 22 ld.u64 %r101,[%frame+240]; .loc 1 440 12 shl.b64 %r102,%r101,3; .loc 1 441 23 ld.u64 %r104,[%frame]; .loc 1 442 22 ld.u64 %r106,[%frame+120]; .loc 1 442 12 shl.b64 %r107,%r106,2; .loc 1 444 32 ld.u64 %r29,[%frame+360]; .loc 1 451 23 mul.lo.u64 %r38,%r29,%r101; .loc 1 452 24 mul.lo.u64 %r408,%r29,%r104; .loc 1 452 10 neg.s64 %r137,%r408; .loc 1 453 23 mul.lo.u64 %r409,%r29,%r106; .loc 1 453 9 shl.b64 %r410,%r409,2; neg.s64 %r188,%r410; setp.le.s64 %r450,%r152,1; .loc 1 418 9 setp.ne.u32 %r457,%r225,0; cvt.u32.u32 %r458,%r23; cvt.s64.s8 %r459,%r458; add.u64 %r460,%r459,-1; add.u64 %r461,%frame,240; add.u64 %r462,%frame,360; add.u64 %r463,%frame,120; $L100: .loc 1 366 3 mov.u64 %r163,%r169; mov.u64 %r162,%r168; .loc 1 396 9 mov.u64 %r161,0; $L88: .loc 1 399 7 ld.s8 %r220,[%r163]; .loc 1 399 6 setp.eq.u32 %r453,%r220,0; @ %r453 bra $L85; .loc 1 407 11 ld.u64 %r145,[%r162]; .loc 1 408 11 cvt.u32.u64 %r412,%r161; add.u32 %r143,%r412,1; .loc 1 418 9 @ %r457 bra $L86; bra $L122; $L85: .loc 1 396 24 add.u64 %r161,%r161,1; .loc 1 396 32 add.u64 %r162,%r162,%r91; .loc 1 396 47 add.u64 %r163,%r163,%r44; .loc 1 396 2 setp.ne.u64 %r414,%r153,%r161; @ %r414 bra $L88; .loc 1 395 9 mov.u32 %r143,%r220; bra $L89; $L122: .loc 1 428 10 setp.gt.s64 %r415,%r153,%r161; @ %r415 bra $L90; bra $L89; $L86: .loc 1 419 8 setp.le.s64 %r416,%r153,%r161; @ %r416 bra $L89; add.u64 %r205,%r161,1; $L92: .loc 1 421 8 @ %r453 bra $L91; .loc 1 421 18 ld.u64 %r93,[%r162]; .loc 1 421 15 setp.gt.s64 %r418,%r93,%r145; @ %r418 bra $L91; .loc 1 424 16 cvt.u32.u64 %r143,%r205; mov.u64 %r145,%r93; $L91: .loc 1 419 33 add.u64 %r162,%r162,%r91; .loc 1 419 
48 add.u64 %r163,%r163,%r44; .loc 1 419 8 add.u64 %r206,%r205,1; setp.eq.u64 %r419,%r153,%r205; @ %r419 bra $L89; .loc 1 421 9 ld.s8 %r220,[%r163]; mov.u64 %r205,%r206; setp.eq.u32 %r453,%r220,0; bra $L92; $L90: add.u64 %r201,%r161,1; bra $L94; $L101: mov.u64 %r201,%r202; $L94: .loc 1 430 10 ld.s8 %r421,[%r163]; cvt.u16.u32 %r420,%r421; setp.eq.u16 %r422,%r420,0; @ %r422 bra $L93; .loc 1 430 20 ld.u64 %r98,[%r162]; .loc 1 430 17 setp.ge.s64 %r423,%r98,%r145; @ %r423 bra $L93; .loc 1 433 11 cvt.u32.u64 %r143,%r201; mov.u64 %r145,%r98; $L93: .loc 1 428 35 add.u64 %r162,%r162,%r91; .loc 1 428 50 add.u64 %r163,%r163,%r44; .loc 1 428 10 add.u64 %r202,%r201,1; setp.ne.u64 %r424,%r153,%r201; @ %r424 bra $L101; $L89: .loc 1 436 8 st.u32 [%r170],%r143; .loc 1 439 15 add.u64 %r187,%r187,1; .loc 1 440 12 add.u64 %r168,%r168,%r102; .loc 1 441 13 add.u64 %r169,%r169,%r104; .loc 1 442 12 add.u64 %r170,%r170,%r107; .loc 1 444 13 setp.ne.u64 %r425,%r29,%r187; @ %r425 bra $L100; .loc 1 452 10 add.u64 %r165,%r169,%r137; .loc 1 453 9 add.u64 %r166,%r170,%r188; .loc 1 455 7 @ %r450 bra $L54; add.u64 %r150,%frame,488; .loc 1 451 23 mov.u64 %r110,%r38; .loc 1 455 7 mov.u64 %r215,8; .loc 1 454 5 mov.u64 %r167,1; .loc 1 448 13 mov.u64 %r456,0; bra $L97; $L98: st.u64 [%r150],%r456; .loc 1 451 23 mul.lo.u64 %r110,%r123,%r122; .loc 1 452 24 mul.lo.u64 %r431,%r125,%r122; .loc 1 452 10 sub.u64 %r165,%r169,%r431; .loc 1 453 23 mul.lo.u64 %r432,%r127,%r122; .loc 1 453 9 shl.b64 %r433,%r432,2; sub.u64 %r166,%r170,%r433; .loc 1 454 5 add.u64 %r167,%r167,1; .loc 1 455 7 add.u64 %r150,%r150,8; add.u64 %r215,%r215,8; setp.eq.u64 %r434,%r167,%r460; @ %r434 bra $L54; $L97: .loc 1 463 16 ld.u64 %r435,[%r150]; add.u64 %r122,%r435,1; st.u64 [%r150],%r122; .loc 1 464 23 add.u64 %r437,%r461,%r215; ld.u64 %r123,[%r437]; .loc 1 464 13 sub.u64 %r438,%r123,%r110; shl.b64 %r439,%r438,3; add.u64 %r168,%r168,%r439; .loc 1 465 24 add.u64 %r440,%frame,%r215; ld.u64 %r125,[%r440]; .loc 1 465 14 add.u64 %r169,%r165,%r125; .loc 1 466 23 add.u64 %r442,%r463,%r215; ld.u64 %r127,[%r442]; .loc 1 466 13 shl.b64 %r443,%r127,2; add.u64 %r170,%r166,%r443; .loc 1 444 32 add.u64 %r445,%r462,%r215; ld.u64 %r131,[%r445]; .loc 1 444 13 setp.eq.u64 %r446,%r122,%r131; @ %r446 bra $L98; .loc 1 448 13 mov.u64 %r187,0; bra $L100; $L120: .loc 1 322 6 ld.u64 %r447,[%r221]; setp.eq.u64 %r448,%r447,0; @ ! 
%r448 bra $L69; bra $L99; $L54_gfortran_sminloc1_4_i8 .visible .func _gfortran_sminloc1_4_, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u32 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L124; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L125; $L124:r152154; call _gfortran_minloc1_4_500 7 bra $L123; $L125: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r296,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L127; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L128; $L134: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L129; bra $L182; $L127: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error128: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r302,0; $L133: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L131; .loc 1 515 17 st.u64 [%r139],%r36; bra $L132; $L131: .loc 1 518 12 st.u64 [%r139],%r302; $L132: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L133; bra $L134; $L182: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L136; bra $L135; $L129: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r301,0; $L139: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L137; .loc 1 523 17 st.u64 [%r43],%r41; bra $L138; $L137: .loc 1 527 12 st.u64 [%r43],%r301; $L138: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L139; bra $L183; $L135: .loc 1 534 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L141; add.u64 %r295,%frame,120; $L161: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L142; $L141: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L143; bra $L184; $L145: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L142: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L145; bra $L141; $L143: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L123; $L184: .loc 1 557 2496246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L149: .loc 1 583 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L146; bra $L147; $L136: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L148; .loc 1 562 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error148: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L149; .loc 1 569 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L147; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L151: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L150; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error150: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L151; $L146: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ %r268 bra 
$L152; shl.b64 %r266,%r296,3; bra $L153; $L152: mov.u64 %r266,8; $L153: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L154: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L154; $L147: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,2; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 593 13 mov.u32 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L160: st.u32 [%r98],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L160; .loc 1 606 7 @ ! %r294 bra $L185; bra $L123; $L159: .loc 1 601 13 st.u64 [%r99],%r297; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L158; bra $L123; $L185: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; .loc 1 601 13 mov.u64 %r297,0; $L158: .loc 1 610 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,2; add.u64 %r98,%r98,%r291; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L159; .loc 1 601 13 mov.u64 %r80,0; bra $L160; $L183: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r293,%r98,0; @ ! 
%r293 bra $L136; bra $L161; $L123: .loc 1 615 1 ret; } minloc1_8_i8.o/_gfortran_minloc1_8_i8 .visible .func _gfortran_minloc1_8_i8.file 1 "../../../libgfortran/generated/minloc1_8_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_8_i8 .visible .func _gfortran_mminloc1_8__gfortran_sminloc1_8_i8 .visible .func _gfortran_sminloc1_8_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_8_i8 .visible .func _gfortran_minloc1_8_i8480predpredpred %r235; .reg .predu64 %r266; .reg .u64 %r267; .reg .u64u16u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64 %r283; .reg .u64predu64u64predpred %r326; .reg .predu64 %r338; .reg .u64pred %r353; .reg .pred %r354; .reg .pred64mov.u64 %r185,%ar0; mov.u64 %r186,%ar1; mov.u64 %r187,%ar2; mov.u32 %r188,%ar3; .loc 1 58 10 ld.s8 %r22,[%r186+28]; .loc 1 58 38 add.u32 %r189,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r120,%r189; .loc 1 59 10 ld.u64 %r25,[%r187]; .loc 1 59 7 add.u64 %r121,%r25,-1; .loc 1 61 7 shr.u64 %r191,%r121,63; set.u32.lt.s64 %r193,%r120,%r121; neg.s32 %r194,%r193; cvt.u16.u64 %r197,%r191; cvt.u16.u32 %r198,%r194; or.b16 %r196,%r197,%r198; .loc 1 61 6 cvt.u32.u16 %r199,%r196; cvt.u16.u8 %r200,%r199; setp.eq.u16 %r201,%r200,0; @ %r201 bra $L2; .loc 1 63 7 cvt.u32.u32 %r204,%r22; cvt.s64.s8 %r203,%r204; st.u64 [%stack+8],%r20302202_gfortran_runtime_errorr207,%r121,%r121; add.u64 %r208,%r207,%r121; shl.b64 %r209,%r208,3; add.u64 %r210,%r186,%r209; ld.u64 %r30,[%r210+56]; ld.u64 %r32,[%r210+48]; .loc 1 71 9 ld.u64 %r124,[%r210+40]; .loc 1 73 3 setp.ne.u64 %r224,%r121,0; @ %r224 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r225,%r120,%r121; @ %r225 bra $L4; bra $L54; $L3: add.u64 %r47,%r186,40; add.u64 %r119,%frame,120; add.u64 %r137,%frame,240; add.u64 %r226,%r186,16; add.u64 %r228,%r25,%r25; add.u64 %r229,%r228,%r25; shl.b64 %r230,%r229,3; add.u64 %r46,%r226,%r230; .loc 1 79 12 mov.u64 %r367,0; $L8: .loc 1 75 18 ld.u64 %r231,[%r47]; st.u64 [%r119],%r231; .loc 1 76 19 ld.u64 %r233,[%r47+16]; add.u64 %r232,%r233,1; ld.u64 
%r234,[%r47+8]; sub.u64 %r37,%r232,%r234; .loc 1 78 10 setp.lt.s64 %r235,%r37,0; @ %r235 bra $L6; .loc 1 76 17 st.u64 [%r137],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 [%r137],%r367; $L7: .loc 1 73 3 add.u64 %r47,%r47,24; add.u64 %r119,%r119,8; add.u64 %r137,%r137,8; setp.ne.u64 %r237,%r46,%r47; @ %r237 bra $L8; bra $L9; $L54: .loc 1 90 6 ld.u64 %r238,[%r185]; setp.eq.u64 %r239,%r238,0; @ ! %r239 bra $L11; bra $L10; $L4: add.u64 %r241,%r25,%r25; add.u64 %r242,%r241,%r25; shl.b64 %r243,%r242,3; add.u64 %r244,%r243,40; add.u64 %r118,%r186,%r244; shl.b64 %r245,%r25,3; add.u64 %r108,%r245,-8; add.u64 %r357,%frame,120; add.u64 %r112,%r357,%r108; add.u64 %r356,%frame,240; add.u64 %r104,%r356,%r108; cvt.u32.u32 %r249,%r22; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r254,%r186,40; add.u64 %r64,%r253,%r254; .loc 1 87 12 mov.u64 %r366,0; $L14: .loc 1 83 18 ld.u64 %r255,[%r118]; st.u64 [%r112],%r255; .loc 1 84 19 ld.u64 %r257,[%r118+16]; add.u64 %r256,%r257,1; ld.u64 %r258,[%r118+8]; sub.u64 %r43,%r256,%r258; .loc 1 86 10 setp.lt.s64 %r259,%r43,0; @ %r259 bra $L12; .loc 1 84 17 st.u64 [%r104],%r43; bra $L13; $L12: .loc 1 87 12 st.u64 [%r104],%r366; $L13: .loc 1 81 3 add.u64 %r118,%r118,24; add.u64 %r112,%r112,8; add.u64 %r104,%r104,8; setp.ne.u64 %r261,%r64,%r118; @ %r261 bra $L14; bra $L55; $L10: .loc 1 94 7 setp.le.s64 %r262,%r120,0; @ %r262 bra $L16; add.u64 %r356,%frame,240; $L39: add.u64 %r156,%r185,40; mov.u64 %r153,%r356; cvt.u32.u32 %r264,%r22; cvt.s64.s8 %r263,%r264; add.u64 %r266,%r263,%r263; add.u64 %r267,%r266,%r263; shl.b64 %r268,%r267,3; add.u64 %r269,%r185,16; add.u64 %r130,%r268,%r269; .loc 1 97 10 mov.u64 %r107,1; .loc 1 101 4 mov.u64 %r292,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r270,0; st.u64 [%r185+8],%r270; .loc 1 106 28 cvt.u16.u32 %r273,%r22; add.u16 %r272,%r273,-1; cvt.u32.u16 %r274,%r272; st.u8 [%r185+28],%r274; .loc 1 108 20 add.u32 %r275,%r22,-2; cvt.s64.s32 %r54,%r275; add.u64 %r277,%r54,%r54; add.u64 %r278,%r277,%r54; shl.b64 %r279,%r278,3; add.u64 %r280,%r185,%r279; .loc 1 108 67 shl.b64 %r282,%r54,3; add.u64 %r283,%frame,%r282; .loc 1 108 59 ld.u64 %r285,[%r280+40]; ld.u64 %r286,[%r283+240]; mul.lo.u64 %r126,%r285,%r286; .loc 1 110 29 mov.u64 %r288288; call (%value_in),_gfortrani_xmallocarray289,[%value_in]; } .loc 1 110 27 st.u64 [%r185],%r289; .loc 1 111 10 setp.eq.u64 %r291,%r126,0; @ ! 
%r291 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r107,%r50,%r107; $L17: .loc 1 101 4 st.u64 [%r156+8],%r292; ld.u64 %r50,[%r153]; add.u64 %r293,%r50,-1; st.u64 [%r156+16],%r293; st.u64 [%r156],%r107; .loc 1 94 7 add.u64 %r156,%r156,24; add.u64 %r153,%r153,8; setp.ne.u64 %r294,%r130,%r156; @ %r294 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r185+48],%r126; mov.u64 %r296,-1; st.u64 [%r185+56],%r296; mov.u64 %r297,1; st.u64 [%r185+40],%r297; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r60,[%r185+28]; .loc 1 121 10 setp.eq.u64 %r298,%r60,%r120; @ %r298 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r120; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r301,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r302,[%r301+36]; setp.eq.u32 %r303,%r302,0; @ %r303 bra $L21; .loc 1 128 2 add.u64 %r356,%frame,240; cvta.const.u64 %r307,$LC2r356306307; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r309,%r120,0; @ %r309 bra $L24; $L27: .loc 1 68 9 add.u64 %r310,%r30,1; .loc 1 68 7 sub.u64 %r122,%r310,%r32; max.s64 %r87,%r122,0; .loc 1 140 8 ld.u64 %r134,[%r186]; .loc 1 141 8 ld.u64 %r135,[%r185]; .loc 1 190 33 shl.b64 %r75,%r124,3; ld.u64 %r45,[%frame+360]; .loc 1 204 22 ld.u64 %r76,[%frame+120]; .loc 1 204 12 shl.b64 %r78,%r76,3; .loc 1 205 22 ld.u64 %r79,[%frame]; .loc 1 205 12 shl.b64 %r81,%r79,3; setp.gt.s64 %r354,%r122,0; setp.le.s64 %r355,%r120,1; .loc 1 180 9 setp.eq.u32 %r359,%r188,0; .loc 1 159 10 mov.u64 %r360,0; cvt.u32.u32 %r361,%r22; cvt.s64.s8 %r362,%r361; add.u64 %r363,%r362,-1; add.u64 %r364,%frame,120; add.u64 %r365,%frame,240; bra $L25; $L24: add.u64 %r170,%frame,360; add.u64 %r169,%r185,40; mov.u64 %r167,%frame; cvt.u32.u32 %r312,%r22; cvt.s64.s8 %r311,%r312; add.u64 %r158,%r311,-1; .loc 1 132 3 mov.u64 %r136,0; add.u64 %r356,%frame,240; .loc 1 134 16 mov.u64 %r313,%r136; $L26: st.u64 [%r170],%r313; .loc 1 135 18 ld.u64 %r314,[%r169]; st.u64 [%r167],%r314; .loc 1 136 17 shl.b64 %r316,%r136,3; add.u64 %r317,%r356,%r316; .loc 1 136 10 ld.u64 %r318,[%r317]; setp.le.s64 %r319,%r318,0; @ %r319 bra $L1; .loc 1 132 26 add.u64 %r136,%r136,1; .loc 1 132 3 add.u64 %r170,%r170,8; add.u64 %r169,%r169,24; add.u64 %r167,%r167,8; setp.ne.u64 %r320,%r136,%r158; @ %r320 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r354 bra $L28; .loc 1 159 10 st.u64 [%r135],%r360; bra $L29; $L28: .loc 1 180 9 @ %r359 bra $L40; mov.u64 %r129,%r134; .loc 1 155 9 mov.u64 %r111,9223372036854775807; .loc 1 157 9 mov.u64 %r110,1; .loc 1 178 8 mov.u64 %r152,0; $L32: .loc 1 183 9 ld.u64 %r67,[%r129]; .loc 1 186 16 add.u64 %r152,%r152,1; .loc 1 183 8 setp.le.s64 %r324,%r67,%r111; selp.u64 %r111,%r67,%r111,%r324; selp.u64 %r110,%r152,%r110,%r324; .loc 1 181 33 add.u64 %r129,%r129,%r75; .loc 1 181 8 setp.gt.s64 %r325,%r87,%r152; @ %r325 bra $L32; bra $L33; $L40: mov.u64 %r127,%r134; .loc 1 155 9 mov.u64 %r113,9223372036854775807; .loc 1 157 9 mov.u64 %r110,1; .loc 1 178 8 mov.u64 %r106,0; $L30: .loc 1 192 9 ld.u64 %r72,[%r127]; .loc 1 195 16 add.u64 %r106,%r106,1; .loc 1 192 8 setp.lt.s64 %r326,%r72,%r113; selp.u64 %r113,%r72,%r113,%r326; selp.u64 %r110,%r106,%r110,%r326; .loc 1 190 33 add.u64 %r127,%r127,%r75; .loc 1 190 8 setp.gt.s64 %r327,%r87,%r106; @ %r327 bra $L30; $L33: .loc 1 199 12 st.u64 [%r135],%r110; $L29: .loc 1 203 15 add.u64 %r45,%r45,1; .loc 1 204 12 add.u64 %r134,%r134,%r78; .loc 1 205 12 add.u64 %r135,%r135,%r81; .loc 1 207 32 ld.u64 %r68,[%frame+240]; .loc 1 207 13 setp.ne.u64 %r328,%r68,%r45; @ %r328 bra $L25; .loc 1 214 23 mul.lo.u64 
%r329,%r45,%r76; .loc 1 214 9 shl.b64 %r330,%r329,3; sub.u64 %r131,%r134,%r330; .loc 1 215 23 mul.lo.u64 %r331,%r45,%r79; .loc 1 215 9 shl.b64 %r332,%r331,3; sub.u64 %r132,%r135,%r332; .loc 1 217 7 @ %r355 bra $L1; add.u64 %r157,%frame,368; mov.u64 %r178,8; .loc 1 216 5 mov.u64 %r133,1; bra $L37; $L38: .loc 1 211 13 st.u64 [%r157],%r360; .loc 1 214 23 mul.lo.u64 %r338,%r94,%r93; .loc 1 214 9 shl.b64 %r339,%r338,3; sub.u64 %r131,%r134,%r339; .loc 1 215 23 mul.lo.u64 %r340,%r97,%r93; .loc 1 215 9 shl.b64 %r341,%r340,3; sub.u64 %r132,%r135,%r341; .loc 1 216 5 add.u64 %r133,%r133,1; .loc 1 217 7 add.u64 %r157,%r157,8; add.u64 %r178,%r178,8; setp.eq.u64 %r342,%r133,%r363; @ %r342 bra $L1; $L37: .loc 1 225 16 ld.u64 %r343,[%r157]; add.u64 %r93,%r343,1; st.u64 [%r157],%r93; .loc 1 226 23 add.u64 %r345,%r364,%r178; ld.u64 %r94,[%r345]; .loc 1 226 13 shl.b64 %r346,%r94,3; add.u64 %r134,%r131,%r346; .loc 1 227 23 add.u64 %r347,%frame,%r178; ld.u64 %r97,[%r347]; .loc 1 227 13 shl.b64 %r348,%r97,3; add.u64 %r135,%r132,%r348; .loc 1 207 32 add.u64 %r350,%r365,%r178; ld.u64 %r100,[%r350]; .loc 1 207 13 setp.eq.u64 %r351,%r93,%r100; @ %r351 bra $L38; .loc 1 211 13 mov.u64 %r45,0; bra $L25; $L55: .loc 1 90 6 ld.u64 %r352,[%r185]; setp.eq.u64 %r353,%r352,0; @ ! %r353 bra $L11; bra $L39; $L1:_gfortran_mminloc1_8_i8 .visible .func _gfortran_mminloc1_8_, .param .u32 %in_ar4) {608pred %r218; .reg .u32 %r223; .reg .u6464u64pred %r373; .reg .u64 %r374; .reg .u64 %r376; .reg .u32 %r377; .reg .pred %r378; .reg .u64 %r381; .reg .u64 %r382; .reg .u64 %r386; .reg .pred %r388; .reg .pred %r389; .reg .u64 %r390; .reg .u32 %r391; .reg .u64pred %r398; .reg .pred %r399; .reg .pred %r402; .reg .pred %r403; .reg .pred %r404; .reg .pred %r406; .reg .pred %r407; .reg .u16 %r408; .reg .u32 %r409; .reg .pred %r410; .reg .pred %r411; .reg .pred %r412; .reg .pred.reg .u64 %r416; .reg .u64predpredmov.u64 %r213,%ar0; mov.u64 %r214,%ar1; mov.u64 %r215,%ar2; mov.u64 %r216,%ar3; mov.u32 %r217,%ar4; .loc 1 261 6 setp.ne.u64 %r218,%r216,0; @ %r218 bra $L1r214215217; call _gfortran_minloc1_8_268 7 bra $L56; $L57: .loc 1 271 10 ld.u64 %r22,[%r215]; .loc 1 271 7 add.u64 %r146,%r22,-1; .loc 1 272 10 ld.s8 %r23,[%r214+28]; .loc 1 272 38 add.u32 %r223,%r23,-1; .loc 1 272 8 cvt.s64.s32 %r147,%r223; .loc 1 275 7 shr.u64 %r225,%r146,63; set.u32.gt.s64 %r227,%r146,%r147; neg.s32 %r228,%r227; cvt.u16.u64 %r231,%r225; cvt.u16.u32 %r232,%r228; or.b16 %r230,%r231,%r232; .loc 1 275 6 cvt.u32.u16 %r233,%r230; cvt.u16.u8 %r234,%r233; setp.eq.u16 %r235,%r234,0; @ %r235 bra $L59; .loc 1 277 7 cvt.u32.u32 %r238,%r23; cvt.s64.s8 %r237,%r238; st.u64 [%stack+8],%r237; st.u64 [%stack],%r22; cvta.const.u64 %r236236_gfortran_runtime_error9: .loc 1 282 9 add.u64 %r445,%r146,%r146; add.u64 %r444,%r445,%r146; shl.b64 %r243,%r444,3; add.u64 %r244,%r214,%r243; ld.u64 %r247,[%r244+56]; add.u64 %r246,%r247,1; .loc 1 282 7 ld.u64 %r254,[%r244+48]; sub.u64 %r148,%r246,%r254; .loc 1 283 6 setp.le.s64 %r255,%r148,0; @ %r255 bra $L56; .loc 1 286 9 ld.u64 %r166,[%r216]; .loc 1 288 15 ld.u64 %r34,[%r216+16]; .loc 1 290 22 cvt.u32.u64 %r35,%r34; .loc 1 290 53 add.u32 %r256,%r35,-4; and.b32 %r257,%r256,-5; set.u32.eq.u32 %r259,%r257,0; neg.s32 %r260,%r259; .loc 1 290 22 add.u32 %r261,%r35,-1; set.u32.le.u32 %r263,%r261,1; neg.s32 %r264,%r263; .loc 1 290 6 cvt.u16.u32 %r266,%r260; cvt.u16.u32 %r267,%r264; or.b16 %r265,%r266,%r267; cvt.u32.u16 %r268,%r265; cvt.u16.u8 %r269,%r268; setp.ne.u16 %r270,%r269,0; @ %r270 bra $L61; .loc 1 292 7 setp.ne.u32 %r272,%r35,16; @ %r272 bra 
$L62; $L61: .loc 1 299 9 shl.b64 %r276,%r444,3; add.u64 %r277,%r214,%r276; ld.u64 %r151,[%r277+40]; .loc 1 300 12 add.u64 %r283,%r216,%r276; ld.u64 %r39,[%r283+40]; .loc 1 302 3 setp.ne.u64 %r285,%r146,0; @ %r285 bra $L63; $L69: .loc 1 312 3 setp.lt.s64 %r286,%r146,%r147; @ %r286 bra $L64; bra $L123; $L62: .loc 1 297 5_gfortran_runtime_error63: add.u64 %r205,%r214,40; add.u64 %r67,%r216,40; add.u64 %r289,%r214,16; add.u64 %r291,%r22,%r22; add.u64 %r292,%r291,%r22; shl.b64 %r293,%r292,3; add.u64 %r210,%r289,%r293; .loc 1 302 3 mov.u64 %r91,0; add.u64 %r442,%frame,240; add.u64 %r440,%frame,360; .loc 1 309 12 mov.u64 %r456,%r91; $L68: .loc 1 304 18 add.u64 %r295,%r442,%r91; ld.u64 %r296,[%r205]; st.u64 [%r295],%r296; .loc 1 305 18 add.u64 %r297,%frame,%r91; .loc 1 305 20 ld.u64 %r299,[%r67]; mul.lo.u64 %r298,%r299,%r34; .loc 1 305 18 st.u64 [%r297],%r298; .loc 1 306 19 ld.u64 %r301,[%r205+16]; add.u64 %r300,%r301,1; ld.u64 %r302,[%r205+8]; sub.u64 %r51,%r300,%r302; .loc 1 308 10 setp.lt.s64 %r303,%r51,0; @ %r303 bra $L66; .loc 1 306 17 add.u64 %r305,%r440,%r91; st.u64 [%r305],%r51; bra $L67; $L66: .loc 1 309 12 add.u64 %r307,%r440,%r91; st.u64 [%r307],%r456; $L67: .loc 1 302 3 add.u64 %r205,%r205,24; add.u64 %r67,%r67,24; add.u64 %r91,%r91,8; setp.ne.u64 %r309,%r205,%r210; @ %r309 bra $L68; bra $L69; $L123: .loc 1 322 6 ld.u64 %r310,[%r213]; setp.eq.u64 %r311,%r310,0; @ ! %r311 bra $L71; bra $L70; $L64: add.u64 %r313,%r22,%r22; add.u64 %r314,%r313,%r22; shl.b64 %r315,%r314,3; add.u64 %r128,%r315,40; add.u64 %r133,%r214,%r128; add.u64 %r124,%r216,%r128; shl.b64 %r316,%r22,3; add.u64 %r83,%r316,-8; cvt.u32.u32 %r318,%r23; cvt.s64.s8 %r317,%r318; shl.b64 %r319,%r317,3; add.u64 %r209,%r319,-8; add.u64 %r442,%frame,240; add.u64 %r440,%frame,360; .loc 1 319 12 mov.u64 %r455,0; $L74: .loc 1 314 18 add.u64 %r321,%r442,%r83; ld.u64 %r322,[%r133]; st.u64 [%r321],%r322; .loc 1 315 18 add.u64 %r323,%frame,%r83; .loc 1 315 20 ld.u64 %r325,[%r124]; mul.lo.u64 %r324,%r325,%r34; .loc 1 315 18 st.u64 [%r323],%r324; .loc 1 316 19 ld.u64 %r327,[%r133+16]; add.u64 %r326,%r327,1; ld.u64 %r328,[%r133+8]; sub.u64 %r64,%r326,%r328; .loc 1 318 10 setp.lt.s64 %r329,%r64,0; @ %r329 bra $L72; .loc 1 316 17 add.u64 %r331,%r440,%r83; st.u64 [%r331],%r64; bra $L73; $L72: .loc 1 319 12 add.u64 %r333,%r440,%r83; st.u64 [%r333],%r455; $L73: .loc 1 312 3 add.u64 %r133,%r133,24; add.u64 %r124,%r124,24; add.u64 %r83,%r83,8; setp.ne.u64 %r335,%r83,%r209; @ %r335 bra $L74; bra $L124; $L70: .loc 1 326 7 setp.eq.u64 %r336,%r147,0; @ %r336 bra $L76; add.u64 %r440,%frame,360; $L100: add.u64 %r161,%r213,40; mov.u64 %r143,%r440; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; add.u64 %r340,%r337,%r337; add.u64 %r341,%r340,%r337; shl.b64 %r342,%r341,3; add.u64 %r343,%r213,16; add.u64 %r135,%r342,%r343; .loc 1 329 10 mov.u64 %r131,1; .loc 1 333 4 mov.u64 %r362,0; bra $L77; $L76: .loc 1 337 20 add.u32 %r344,%r23,-2; cvt.s64.s32 %r73,%r344; add.u64 %r346,%r73,%r73; add.u64 %r347,%r346,%r73; shl.b64 %r348,%r347,3; add.u64 %r349,%r213,%r348; .loc 1 337 67 shl.b64 %r351,%r73,3; add.u64 %r352,%frame,%r351; .loc 1 337 59 ld.u64 %r354,[%r349+40]; ld.u64 %r355,[%r352+360]; mul.lo.u64 %r155,%r354,%r355; .loc 1 339 24 mov.u64 %r356,0; st.u64 [%r213+8],%r356; .loc 1 340 28 cvt.u16.u32 %r359,%r23; add.u16 %r358,%r359,-1; cvt.u32.u16 %r360,%r358; st.u8 [%r213+28],%r360; .loc 1 342 10 setp.eq.u64 %r361,%r155,0; @ %r361 bra $L78; bra $L125; $L80: .loc 1 331 47 mul.lo.u64 %r131,%r70,%r131; $L77: .loc 1 333 4 st.u64 [%r161+8],%r362; ld.u64 %r70,[%r143]; 
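/* The loop around the stores at .loc 1 333 just above fills in the descriptor of
   the return array when the caller passes it in unallocated: each output
   dimension gets a lower bound of 0, an upper bound of extent-1, and a stride
   equal to the running product of the preceding extents (the mul.lo.u64 at $L80).
   A minimal C sketch of that setup; the struct layout and names are illustrative,
   not libgfortran's exact descriptor types:

     #include <stddef.h>

     typedef struct { ptrdiff_t stride, lbound, ubound; } dim_t;

     static void setup_result_dims (dim_t *dim, const ptrdiff_t *extent, int rank)
     {
       ptrdiff_t str = 1;                  /. stride of the first dimension ./
       for (int n = 0; n < rank; n++)
         {
           dim[n].lbound = 0;              // st.u64 [...+8],0
           dim[n].ubound = extent[n] - 1;  // st.u64 [...+16],extent-1
           dim[n].stride = str;            // st.u64 [...],str
           str *= extent[n];               // running product of extents
         }
     }
*/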
add.u64 %r363,%r70,-1; st.u64 [%r161+16],%r363; st.u64 [%r161],%r131; .loc 1 326 7 add.u64 %r161,%r161,24; add.u64 %r143,%r143,8; setp.ne.u64 %r364,%r135,%r161; @ %r364 bra $L80; bra $L76; $L78: .loc 1 345 4 st.u64 [%r213+48],%r155; mov.u64 %r366,-1; st.u64 [%r213+56],%r366; mov.u64 %r367,1; st.u64 [%r213+40],%r367; .loc 1 346 4 bra $L56; $L125: .loc 1 349 24 mov.u64 %r369369; call (%value_in),_gfortrani_xmallocarray70,[%value_in]; } .loc 1 349 22 st.u64 [%r213],%r370; bra $L81; $L71: .loc 1 354 19 ld.s8 %r372,[%r213+28]; .loc 1 354 10 setp.eq.u64 %r373,%r372,%r147; @ %r373 bra $L82; .loc 1 355 2 cvta.const.u64 %r374374_gfortran_runtime_error82: .loc 1 357 11 cvta.global.u64 %r376,_gfortrani_compile_options; .loc 1 357 10 ld.u32 %r377,[%r376+36]; setp.eq.u32 %r378,%r377,0; @ %r378 bra $L81; .loc 1 359 4 add.u64 %r440,%frame,360; cvta.const.u64 %r382,$LC2r43382; call _gfortrani_bounds_ifunction_return361 4 cvta.const.u64 %r386,$LC386382; call _gfortrani_bounds_equal_extents$L81: .loc 1 366 3 setp.ne.u64 %r388,%r147,0; @ %r388 bra $L83; $L86: .loc 1 374 8 ld.u64 %r167,[%r213]; .loc 1 375 8 ld.u64 %r165,[%r214]; .loc 1 377 9 setp.ne.u64 %r389,%r165,0; @ %r389 bra $L84; bra $L56; $L83: add.u64 %r194,%frame,480; add.u64 %r193,%r213,40; add.u64 %r188,%frame,120; cvt.u32.u32 %r391,%r23; cvt.s64.s8 %r390,%r391; add.u64 %r174,%r390,-1; .loc 1 366 3 mov.u64 %r168,0; add.u64 %r440,%frame,360; .loc 1 368 16 mov.u64 %r392,%r168; $L85: st.u64 [%r194],%r392; .loc 1 369 18 ld.u64 %r393,[%r193]; st.u64 [%r188],%r393; .loc 1 370 17 shl.b64 %r395,%r168,3; add.u64 %r396,%r440,%r395; .loc 1 370 10 ld.u64 %r397,[%r396]; setp.le.s64 %r398,%r397,0; @ %r398 bra $L56; .loc 1 366 26 add.u64 %r168,%r168,1; .loc 1 366 3 add.u64 %r194,%r194,8; add.u64 %r193,%r193,24; add.u64 %r188,%r188,8; setp.ne.u64 %r399,%r168,%r174; @ %r399 bra $L85; bra $L86; $L84: .loc 1 300 12 mul.lo.u64 %r42,%r39,%r34; .loc 1 396 32 shl.b64 %r87,%r151,3; ld.u64 %r68,[%frame+480]; .loc 1 440 22 ld.u64 %r97,[%frame+240]; .loc 1 440 12 shl.b64 %r99,%r97,3; .loc 1 441 23 ld.u64 %r100,[%frame]; .loc 1 442 22 ld.u64 %r102,[%frame+120]; .loc 1 442 12 shl.b64 %r104,%r102,3; setp.le.s64 %r446,%r147,1; .loc 1 418 9 setp.ne.u32 %r448,%r217,0; cvt.u32.u32 %r449,%r23; cvt.s64.s8 %r450,%r449; add.u64 %r451,%r450,-1; add.u64 %r452,%frame,240; add.u64 %r453,%frame,360; add.u64 %r454,%frame,120; $L101: .loc 1 366 3 mov.u64 %r160,%r166; mov.u64 %r159,%r165; .loc 1 396 9 mov.u64 %r137,0; $L90: .loc 1 399 7 ld.s8 %r208,[%r160]; mov.u64 %r189,%r137; .loc 1 396 24 add.u64 %r137,%r137,1; .loc 1 399 6 setp.eq.u32 %r441,%r208,0; @ %r441 bra $L87; .loc 1 407 11 ld.u64 %r141,[%r159]; .loc 1 418 9 @ %r448 bra $L88; bra $L126; $L87: .loc 1 396 32 add.u64 %r159,%r159,%r87; .loc 1 396 47 add.u64 %r160,%r160,%r42; .loc 1 396 2 setp.ne.u64 %r402,%r148,%r137; @ %r402 bra $L90; .loc 1 395 9 mov.u64 %r137,0; bra $L91; $L126: .loc 1 428 10 setp.gt.s64 %r403,%r148,%r189; @ %r403 bra $L92; bra $L91; $L88: .loc 1 419 8 setp.le.s64 %r404,%r148,%r189; @ %r404 bra $L91; $L94: .loc 1 424 16 add.u64 %r189,%r189,1; .loc 1 421 8 @ %r441 bra $L93; .loc 1 421 18 ld.u64 %r88,[%r159]; .loc 1 421 15 setp.le.s64 %r406,%r88,%r141; selp.u64 %r141,%r88,%r141,%r406; selp.u64 %r137,%r189,%r137,%r406; $L93: .loc 1 419 33 add.u64 %r159,%r159,%r87; .loc 1 419 48 add.u64 %r160,%r160,%r42; .loc 1 419 8 setp.eq.u64 %r407,%r148,%r189; @ %r407 bra $L91; .loc 1 421 9 ld.s8 %r208,[%r160]; setp.eq.u32 %r441,%r208,0; bra $L94; $L92: .loc 1 433 11 add.u64 %r189,%r189,1; .loc 1 430 10 ld.s8 %r409,[%r160]; 
cvt.u16.u32 %r408,%r409; setp.eq.u16 %r410,%r408,0; @ %r410 bra $L95; .loc 1 430 20 ld.u64 %r93,[%r159]; .loc 1 430 17 setp.lt.s64 %r411,%r93,%r141; selp.u64 %r141,%r93,%r141,%r411; selp.u64 %r137,%r189,%r137,%r411; $L95: .loc 1 428 35 add.u64 %r159,%r159,%r87; .loc 1 428 50 add.u64 %r160,%r160,%r42; .loc 1 428 10 setp.ne.u64 %r412,%r148,%r189; @ %r412 bra $L92; $L91: .loc 1 436 8 st.u64 [%r167],%r137; .loc 1 439 15 add.u64 %r68,%r68,1; .loc 1 440 12 add.u64 %r165,%r165,%r99; .loc 1 441 13 add.u64 %r166,%r166,%r100; .loc 1 442 12 add.u64 %r167,%r167,%r104; .loc 1 444 32 ld.u64 %r29,[%frame+360]; .loc 1 444 13 setp.ne.u64 %r413,%r29,%r68; @ %r413 bra $L101; .loc 1 451 23 mul.lo.u64 %r106,%r68,%r97; .loc 1 452 24 mul.lo.u64 %r414,%r68,%r100; .loc 1 452 10 sub.u64 %r162,%r166,%r414; .loc 1 453 23 mul.lo.u64 %r415,%r68,%r102; .loc 1 453 9 shl.b64 %r416,%r415,3; sub.u64 %r163,%r167,%r416; .loc 1 455 7 @ %r446 bra $L56; add.u64 %r204,%frame,488; mov.u64 %r203,8; .loc 1 454 5 mov.u64 %r164,1; .loc 1 448 13 mov.u64 %r447,0; bra $L98; $L99: st.u64 [%r204],%r447; .loc 1 451 23 mul.lo.u64 %r106,%r117,%r116; .loc 1 452 24 mul.lo.u64 %r422,%r119,%r116; .loc 1 452 10 sub.u64 %r162,%r166,%r422; .loc 1 453 23 mul.lo.u64 %r423,%r121,%r116; .loc 1 453 9 shl.b64 %r424,%r423,3; sub.u64 %r163,%r167,%r424; .loc 1 454 5 add.u64 %r164,%r164,1; .loc 1 455 7 add.u64 %r204,%r204,8; add.u64 %r203,%r203,8; setp.eq.u64 %r425,%r164,%r451; @ %r425 bra $L56; $L98: .loc 1 463 16 ld.u64 %r426,[%r204]; add.u64 %r116,%r426,1; st.u64 [%r204],%r116; .loc 1 464 23 add.u64 %r428,%r452,%r203; ld.u64 %r117,[%r428]; .loc 1 464 13 sub.u64 %r429,%r117,%r106; shl.b64 %r430,%r429,3; add.u64 %r165,%r165,%r430; .loc 1 465 24 add.u64 %r431,%frame,%r203; ld.u64 %r119,[%r431]; .loc 1 465 14 add.u64 %r166,%r162,%r119; .loc 1 466 23 add.u64 %r433,%r454,%r203; ld.u64 %r121,[%r433]; .loc 1 466 13 shl.b64 %r434,%r121,3; add.u64 %r167,%r163,%r434; .loc 1 444 32 add.u64 %r436,%r453,%r203; ld.u64 %r125,[%r436]; .loc 1 444 13 setp.eq.u64 %r437,%r116,%r125; @ %r437 bra $L99; .loc 1 448 13 mov.u64 %r68,0; bra $L101; $L124: .loc 1 322 6 ld.u64 %r438,[%r213]; setp.eq.u64 %r439,%r438,0; @ ! 
%r439 bra $L71; bra $L100; $L56_gfortran_sminloc1_8_i8 .visible .func _gfortran_sminloc1_8_, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L128; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L129; $L128:r152154; call _gfortran_minloc1_8_500 7 bra $L127; $L129: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r296,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L131; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L132; $L138: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L133; bra $L186; $L131: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error132: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r302,0; $L137: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L135; .loc 1 515 17 st.u64 [%r139],%r36; bra $L136; $L135: .loc 1 518 12 st.u64 [%r139],%r302; $L136: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L137; bra $L138; $L186: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L140; bra $L139; $L133: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r301,0; $L143: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L141; .loc 1 523 17 st.u64 [%r43],%r41; bra $L142; $L141: .loc 1 527 12 st.u64 [%r43],%r301; $L142: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L143; bra $L187; $L139: .loc 1 534 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L145; add.u64 %r295,%frame,120; $L165: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L146; $L145: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L147; bra $L188; $L149: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L146: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L149; bra $L145; $L147: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L127; $L188: .loc 1 557 24 mov.u64 %r24696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L153: .loc 1 583 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L150; bra $L151; $L140: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L152; .loc 1 562 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error152: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L153; .loc 1 569 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L151; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L155: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L154; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error154: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L155; $L150: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ 
%r268 bra $L156; shl.b64 %r266,%r296,3; bra $L157; $L156: mov.u64 %r266,8; $L157: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L158: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L158; $L151: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,3; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 593 13 mov.u64 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L164: st.u64 [%r98],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L164; .loc 1 606 7 @ ! %r294 bra $L189; bra $L127; $L163: .loc 1 601 13 st.u64 [%r99],%r280; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L162; bra $L127; $L189: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; $L162: .loc 1 610 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,3; add.u64 %r98,%r98,%r291; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L163; .loc 1 601 13 mov.u64 %r80,0; bra $L164; $L187: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r293,%r98,0; @ ! %r293 bra $L140; bra $L165; $L127: .loc 1 615 1 ret; } minloc1_16_i8.o/_gfortran_minloc1_16_i8 .visible .func _gfortran_minloc1_16_i8.file 1 "../../../libgfortran/generated/minloc1_16_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_16_i8 .visible .func _gfortran_mminloc1_16__gfortran_sminloc1_16_i8 .visible .func _gfortran_sminloc1_16_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 
},105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_16_i8 .visible .func _gfortran_minloc1_16_i8480u64 %r59; .reg .u64pred %r229; .reg .predpred %r240; .reg .predu16 %r278; .reg .u32 %r279; .reg .u32 %r280; .reg .u64 %r282; .reg .u64predu64 %r318; .reg .u64pred %r328; .reg .pred %r329; .reg .u64 %r331; .reg .pred %r334; .reg .u64 %r337; .reg .u64 %r340; .reg .u32 %r342; .reg .u64 %r347; .reg .pred %r349; .reg .predpred %r365; .reg .pred %r368; .reg .u64 %r374; .reg .u64 %r375; .reg .u64u64pred %r389; .reg .u64pred %r407; .reg .u64 %r408; .reg .pred %r409; .reg .u64 %r410; .reg .u32.loc 1 58 10 ld.s8 %r22,[%r191+28]; .loc 1 58 38 add.u32 %r194,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r118,%r194; .loc 1 59 10 ld.u64 %r25,[%r192]; .loc 1 59 7 add.u64 %r119,%r25,-1; .loc 1 61 7 shr.u64 %r196,%r119,63; set.u32.lt.s64 %r198,%r118,%r119; neg.s32 %r199,%r198; cvt.u16.u64 %r202,%r196;61%r206 bra $L2; .loc 1 63 7 cvt.u32.u32 %r209,%r22; cvt.s64.s8 %r208,%r209; st.u64 [%stack+8],%r208007_gfortran_runtime_errorr212,%r119,%r119; add.u64 %r213,%r212,%r119; shl.b64 %r214,%r213,3; add.u64 %r215,%r191,%r214; ld.u64 %r31,[%r215+56]; ld.u64 %r33,[%r215+48]; .loc 1 71 9 ld.u64 %r122,[%r215+40]; .loc 1 73 3 setp.ne.u64 %r229,%r119,0; @ %r229 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r230,%r118,%r119; @ %r230 bra $L4; bra $L52; $L3: add.u64 %r188,%r191,40; add.u64 %r184,%frame,120; add.u64 %r47,%frame,240; add.u64 %r231,%r191,16; add.u64 %r233,%r25,%r25; add.u64 %r234,%r233,%r25; shl.b64 %r235,%r234,3; add.u64 %r186,%r231,%r235; .loc 1 79 12 mov.u64 %r417,0; $L8: .loc 1 75 18 ld.u64 %r236,[%r188]; st.u64 [%r184],%r236; .loc 1 76 19 ld.u64 %r238,[%r188+16]; add.u64 %r237,%r238,1; ld.u64 %r239,[%r188+8]; sub.u64 %r38,%r237,%r239; .loc 1 78 10 setp.lt.s64 %r240,%r38,0; @ %r240 bra $L6; .loc 1 76 17 st.u64 [%r47],%r38; bra $L7; $L6: .loc 1 79 12 st.u64 [%r47],%r417; $L7: .loc 1 73 3 add.u64 %r188,%r188,24; add.u64 %r184,%r184,8; add.u64 %r47,%r47,8; setp.ne.u64 %r242,%r186,%r188; @ %r242 bra $L8; bra $L9; $L52: .loc 1 90 6 ld.u64 %r243,[%r190]; setp.eq.u64 %r244,%r243,0; @ ! 
%r244 bra $L11; bra $L10; $L4: add.u64 %r246,%r25,%r25; add.u64 %r247,%r246,%r25; shl.b64 %r248,%r247,3; add.u64 %r249,%r248,40; add.u64 %r110,%r191,%r249; shl.b64 %r250,%r25,3; add.u64 %r98,%r250,-8; add.u64 %r405,%frame,120; add.u64 %r100,%r405,%r98; add.u64 %r404,%frame,240; add.u64 %r96,%r404,%r98; cvt.u32.u32 %r254,%r22; cvt.s64.s8 %r253,%r254; add.u64 %r256,%r253,%r253; add.u64 %r257,%r256,%r253; shl.b64 %r258,%r257,3; add.u64 %r259,%r191,40; add.u64 %r135,%r258,%r259; .loc 1 87 12 mov.u64 %r416,0; $L14: .loc 1 83 18 ld.u64 %r260,[%r110]; st.u64 [%r100],%r260; .loc 1 84 19 ld.u64 %r262,[%r110+16]; add.u64 %r261,%r262,1; ld.u64 %r263,[%r110+8]; sub.u64 %r44,%r261,%r263; .loc 1 86 10 setp.lt.s64 %r264,%r44,0; @ %r264 bra $L12; .loc 1 84 17 st.u64 [%r96],%r44; bra $L13; $L12: .loc 1 87 12 st.u64 [%r96],%r416; $L13: .loc 1 81 3 add.u64 %r110,%r110,24; add.u64 %r100,%r100,8; add.u64 %r96,%r96,8; setp.ne.u64 %r266,%r110,%r135; @ %r266 bra $L14; bra $L53; $L10: .loc 1 94 7 setp.le.s64 %r267,%r118,0; @ %r267 bra $L16; add.u64 %r404,%frame,240; $L39: add.u64 %r146,%r190,40; mov.u64 %r143,%r404; cvt.u32.u32 %r269,%r22; cvt.s64.s8 %r268,%r269; add.u64 %r271,%r268,%r268; add.u64 %r272,%r271,%r268; shl.b64 %r273,%r272,3; add.u64 %r274,%r190,16; add.u64 %r111,%r273,%r274; .loc 1 97 10 mov.u64 %r102,1; .loc 1 101 4 mov.u64 %r297,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r275,0; st.u64 [%r190+8],%r275; .loc 1 106 28 cvt.u16.u32 %r278,%r22; add.u16 %r277,%r278,-1; cvt.u32.u16 %r279,%r277; st.u8 [%r190+28],%r279; .loc 1 108 20 add.u32 %r280,%r22,-2; cvt.s64.s32 %r53,%r280; add.u64 %r282,%r53,%r53; add.u64 %r283,%r282,%r53; shl.b64 %r284,%r283,3; add.u64 %r285,%r190,%r284; .loc 1 108 67 shl.b64 %r287,%r53,3; add.u64 %r288,%frame,%r287; .loc 1 108 59 ld.u64 %r290,[%r285+40]; ld.u64 %r291,[%r288+240]; mul.lo.u64 %r124,%r290,%r291; .loc 1 110 29 mov.u64 %r293,16293; call (%value_in),_gfortrani_xmallocarray294,[%value_in]; } .loc 1 110 27 st.u64 [%r190],%r294; .loc 1 111 10 setp.eq.u64 %r296,%r124,0; @ ! 
%r296 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r102,%r49,%r102; $L17: .loc 1 101 4 st.u64 [%r146+8],%r297; ld.u64 %r49,[%r143]; add.u64 %r298,%r49,-1; st.u64 [%r146+16],%r298; st.u64 [%r146],%r102; .loc 1 94 7 add.u64 %r146,%r146,24; add.u64 %r143,%r143,8; setp.ne.u64 %r299,%r111,%r146; @ %r299 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r190+48],%r124; mov.u64 %r301,-1; st.u64 [%r190+56],%r301; mov.u64 %r302,1; st.u64 [%r190+40],%r302; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r59,[%r190+28]; .loc 1 121 10 setp.eq.u64 %r303,%r59,%r118; @ %r303 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r59;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r306,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r307,[%r306+36]; setp.eq.u32 %r308,%r307,0; @ %r308 bra $L21; .loc 1 128 2 add.u64 %r404,%frame,240; cvta.const.u64 %r312,$LC2; cvta.const.u64 %r311,$LC3;_gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r314,%r118,0; @ %r314 bra $L24; $L27: .loc 1 68 9 add.u64 %r315,%r31,1; .loc 1 68 7 sub.u64 %r120,%r315,%r33; max.s64 %r82,%r120,0; .loc 1 140 8 ld.u64 %r131,[%r191]; .loc 1 141 8 ld.u64 %r132,[%r190]; .loc 1 190 33 shl.b64 %r71,%r122,3; ld.u64 %r141,[%frame+360]; .loc 1 204 22 ld.u64 %r73,[%frame+120]; .loc 1 204 12 shl.b64 %r74,%r73,3; .loc 1 205 22 ld.u64 %r76,[%frame]; .loc 1 205 12 shl.b64 %r77,%r76,4; .loc 1 207 32 ld.u64 %r66,[%frame+240]; .loc 1 214 23 mul.lo.u64 %r316,%r66,%r73; .loc 1 214 9 shl.b64 %r317,%r316,3; neg.s64 %r140,%r317; .loc 1 215 23 mul.lo.u64 %r318,%r66,%r76; .loc 1 215 9 shl.b64 %r319,%r318,4; neg.s64 %r145,%r319; setp.gt.s64 %r406,%r120,0; setp.le.s64 %r407,%r118,1; .loc 1 180 9 setp.eq.u32 %r409,%r193,0; mov.u64 %r410,1; cvt.u32.u32 %r411,%r22; cvt.s64.s8 %r412,%r411; add.u64 %r413,%r412,-1; add.u64 %r414,%frame,120; add.u64 %r415,%frame,240; bra $L25; $L24: add.u64 %r168,%frame,360; add.u64 %r167,%r190,40; mov.u64 %r165,%frame; cvt.u32.u32 %r321,%r22; cvt.s64.s8 %r320,%r321; add.u64 %r148,%r320,-1; .loc 1 132 3 mov.u64 %r133,0; add.u64 %r404,%frame,240; .loc 1 134 16 mov.u64 %r322,%r133; $L26: st.u64 [%r168],%r322; .loc 1 135 18 ld.u64 %r323,[%r167]; st.u64 [%r165],%r323; .loc 1 136 17 shl.b64 %r325,%r133,3; add.u64 %r326,%r404,%r325; .loc 1 136 10 ld.u64 %r327,[%r326]; setp.le.s64 %r328,%r327,0; @ %r328 bra $L1; .loc 1 132 26 add.u64 %r133,%r133,1; .loc 1 132 3 add.u64 %r168,%r168,8; add.u64 %r167,%r167,24; add.u64 %r165,%r165,8; setp.ne.u64 %r329,%r133,%r148; @ %r329 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r406 bra $L28; .loc 1 159 10 mov.u64 %r331,0; st.u64 [%r132],%r331; st.u64 [%r132+8],%r331; bra $L29; $L28: .loc 1 180 9 @ %r409 bra $L40; mov.u64 %r127,%r131; mov.u64 %r394,%r410; mov.u64 %r395,0; .loc 1 155 9 mov.u64 %r108,9223372036854775807; .loc 1 157 9 mov.u64 %r390,1; mov.u64 %r391,%r395; $L32: .loc 1 183 9 ld.u64 %r65,[%r127]; .loc 1 183 8 setp.gt.s64 %r334,%r65,%r108; .loc 1 186 16 selp.u64 %r390,%r390,%r394,%r334; selp.u64 %r391,%r391,%r395,%r334; selp.u64 %r108,%r108,%r65,%r334; .loc 1 181 33 add.u64 %r127,%r127,%r71; .loc 1 181 8 add.u64 %r337,%r394,%r410; set.u32.lt.u64 %r342,%r337,%r394; cvt.s64.s32 %r340,%r342; mov.u64 %r394,%r337; sub.u64 %r395,%r395,%r340; add.u64 %r347,%r394,-1; setp.gt.s64 %r349,%r82,%r347; @ %r349 bra $L32; bra $L33; $L40: mov.u64 %r101,%r131; mov.u64 %r392,%r410; mov.u64 %r393,0; .loc 1 155 9 mov.u64 %r109,9223372036854775807; .loc 1 157 9 mov.u64 %r390,1; mov.u64 %r391,%r393; $L30: .loc 1 192 9 ld.u64 %r69,[%r101]; .loc 1 192 8 setp.ge.s64 %r350,%r69,%r109; 
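/* The setp.ge.s64 just above and the three selp.u64 that follow are the core of
   the MINLOC reduction along the chosen dimension: walk the elements at a fixed
   stride, keep the smallest value seen so far and the 1-based position where it
   first occurred.  (The twin loop at $L32 a little earlier differs only in
   accepting ties, i.e. the BACK=.true. case, and this _16 variant carries the
   position in two 64-bit halves, hence the paired selects.)  A minimal C sketch
   of the inner loop for 64-bit source elements as in the _i8 variants; the
   function name and parameters are illustrative, not libgfortran's interface:

     #include <stdint.h>
     #include <stddef.h>

     // 1-based position of the first minimum of len elements spaced delta apart
     static int64_t minloc_dim (const int64_t *src, ptrdiff_t delta, ptrdiff_t len)
     {
       if (len <= 0)
         return 0;                   // empty extent: MINLOC is 0
       int64_t minval = INT64_MAX;   // cf. mov.u64 ...,9223372036854775807 above
       int64_t result = 1;
       for (ptrdiff_t n = 0; n < len; n++, src += delta)
         if (*src < minval)          // strictly smaller, so ties keep the first hit
           {
             minval = *src;
             result = n + 1;
           }
       return result;
     }
*/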
.loc 1 195 16 selp.u64 %r390,%r390,%r392,%r350; selp.u64 %r391,%r391,%r393,%r350; selp.u64 %r109,%r109,%r69,%r350; .loc 1 190 33 add.u64 %r101,%r101,%r71; .loc 1 190 8 add.u64 %r353,%r392,%r410; set.u32.lt.u64 %r358,%r353,%r392; cvt.s64.s32 %r356,%r358; mov.u64 %r392,%r353; sub.u64 %r393,%r393,%r356; add.u64 %r363,%r392,-1; setp.gt.s64 %r365,%r82,%r363; @ %r365 bra $L30; $L33: .loc 1 199 12 st.u64 [%r132],%r390; st.u64 [%r132+8],%r391; $L29: .loc 1 203 15 add.u64 %r141,%r141,1; .loc 1 204 12 add.u64 %r131,%r131,%r74; .loc 1 205 12 add.u64 %r132,%r132,%r77; .loc 1 207 13 setp.ne.u64 %r368,%r66,%r141; @ %r368 bra $L25; .loc 1 214 9 add.u64 %r128,%r131,%r140; .loc 1 215 9 add.u64 %r129,%r132,%r145; .loc 1 217 7 @ %r407 bra $L1; add.u64 %r157,%frame,368; mov.u64 %r117,8; .loc 1 216 5 mov.u64 %r130,1; .loc 1 211 13 mov.u64 %r408,0; bra $L37; $L38: st.u64 [%r157],%r408; .loc 1 214 23 mul.lo.u64 %r374,%r89,%r88; .loc 1 214 9 shl.b64 %r375,%r374,3; sub.u64 %r128,%r131,%r375; .loc 1 215 23 mul.lo.u64 %r376,%r92,%r88; .loc 1 215 9 shl.b64 %r377,%r376,4; sub.u64 %r129,%r132,%r377; .loc 1 216 5 add.u64 %r130,%r130,1; .loc 1 217 7 add.u64 %r157,%r157,8; add.u64 %r117,%r117,8; setp.eq.u64 %r378,%r130,%r413; @ %r378 bra $L1; $L37: .loc 1 225 16 ld.u64 %r379,[%r157]; add.u64 %r88,%r379,1; st.u64 [%r157],%r88; .loc 1 226 23 add.u64 %r381,%r414,%r117; ld.u64 %r89,[%r381]; .loc 1 226 13 shl.b64 %r382,%r89,3; add.u64 %r131,%r128,%r382; .loc 1 227 23 add.u64 %r383,%frame,%r117; ld.u64 %r92,[%r383]; .loc 1 227 13 shl.b64 %r384,%r92,4; add.u64 %r132,%r129,%r384; .loc 1 207 32 add.u64 %r386,%r415,%r117; ld.u64 %r95,[%r386]; .loc 1 207 13 setp.eq.u64 %r387,%r88,%r95; @ %r387 bra $L38; .loc 1 211 13 mov.u64 %r141,0; bra $L25; $L53: .loc 1 90 6 ld.u64 %r388,[%r190]; setp.eq.u64 %r389,%r388,0; @ ! 
%r389 bra $L11; bra $L39; $L1:_gfortran_mminloc1_16_i8 .visible .func _gfortran_mminloc1_16_, .param .u32 %in_ar4) {60897u64u64 %r267; .reg .u64 %r274; .reg .pred %r275; .reg .u32 %r276; .reg .u32 %r277; .reg .u32u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .pred %r329; .reg .u64 %r330; .reg .pred %r331; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32 %r338; .reg .u64 %r339; .reg .u64 %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64pred %r355; .reg .pred64predpred %r408; .reg .pred %r409; .reg .u64 %r410; .reg .u32.reg .u64 %r417; .reg .pred %r418; .reg .predpred %r428; .reg .u64 %r432; .reg .u64pred %r460; .reg .u64 %r463; .reg .u64 %r466; .reg .u32 %r468; .reg .pred %r475; .reg .pred.reg .u32 %r504; .reg .u64 %r508; .reg .u16 %r509; .reg .u32 %r510; .reg .pred %r511; .reg .pred %r512; .reg .u64 %r515; .reg .u64 %r518; .reg .u32 %r520; .reg .pred %r527; .reg .pred %r530; .reg .pred %r533; .reg .u64pred %r542; .reg .u64 %r543; .reg .u64pred.reg .u64 %r610; .reg .u64 %r611; mov.u64 %r233,%ar0; mov.u64 %r234,%ar1; mov.u64 %r235,%ar2; mov.u64 %r236,%ar3; mov.u32 %r237,%ar4; .loc 1 261 6 setp.ne.u64 %r238,%r236,0; @ %r238 bra $L3r2235237; call _gfortran_minloc1_16_268 7 bra $L54; $L55: .loc 1 271 10 ld.u64 %r22,[%r235]; .loc 1 271 7 add.u64 %r153,%r22,-1; .loc 1 272 10 ld.s8 %r23,[%r234+28]; .loc 1 272 38 add.u32 %r243,%r23,-1; .loc 1 272 8 cvt.s64.s32 %r154,%r243; .loc 1 275 7 shr.u64 %r245,%r153,63; set.u32.gt.s64 %r247,%r153,%r154; neg.s32 %r248,%r247; cvt.u16.u64 %r251,%r245; cvt.u16.u32 %r252,%r248; or.b16 %r250,%r251,%r252; .loc 1 275 6 cvt.u32.u16 %r253,%r250; cvt.u16.u8 %r254,%r253; setp.eq.u16 %r255,%r254,0; @ %r255 bra $L57; .loc 1 277 7 cvt.u32.u32 %r258,%r23; cvt.s64.s8 %r257,%r258; st.u64 [%stack+8],%r257; st.u64 [%stack],%r22; cvta.const.u64 %r22stack; call _gfortran_runtime_error7: .loc 1 282 9 add.u64 %r596,%r153,%r153; add.u64 %r601,%r596,%r153; shl.b64 %r263,%r601,3; add.u64 %r264,%r234,%r263; ld.u64 %r267,[%r264+56]; add.u64 %r266,%r267,1; .loc 1 282 7 ld.u64 %r274,[%r264+48]; sub.u64 %r155,%r266,%r274; .loc 1 283 6 setp.le.s64 %r275,%r155,0; @ %r275 bra $L54; .loc 1 286 9 ld.u64 %r173,[%r236]; .loc 1 288 15 ld.u64 %r35,[%r236+16]; .loc 1 290 22 cvt.u32.u64 %r36,%r35; .loc 1 290 53 add.u32 %r276,%r36,-4; and.b32 %r277,%r276,-5; set.u32.eq.u32 %r279,%r277,0; neg.s32 %r280,%r279; .loc 1 290 22 add.u32 %r281,%r36,-1; set.u32.le.u32 %r283,%r281,1; neg.s32 %r284,%r283; .loc 1 290 6 cvt.u16.u32 %r286,%r280; cvt.u16.u32 %r287,%r284; or.b16 %r285,%r286,%r287;59; .loc 1 292 7 setp.ne.u32 %r292,%r36,16; @ %r292 bra $L60; $L59: .loc 1 299 9 shl.b64 %r296,%r601,3; add.u64 %r297,%r234,%r296; ld.u64 %r158,[%r297+40]; .loc 1 300 12 add.u64 %r303,%r236,%r296; ld.u64 %r41,[%r303+40]; .loc 1 302 3 setp.ne.u64 %r305,%r153,0; @ %r305 bra $L61; $L67: .loc 1 312 3 setp.lt.s64 %r306,%r153,%r154; @ %r306 bra $L62; bra $L121; $L60: .loc 1 297 5 cvta.const.u64 %r307307_gfortran_runtime_error61: add.u64 %r164,%r234,40; add.u64 %r150,%r236,40; add.u64 %r309,%r234,16; add.u64 %r311,%r22,%r22; add.u64 %r312,%r311,%r22; shl.b64 %r313,%r312,3; add.u64 %r223,%r309,%r313; .loc 1 302 3 mov.u64 %r53,0; add.u64 %r597,%frame,240; add.u64 %r600,%frame,360; .loc 1 309 12 mov.u64 %r611,%r53; $L66: .loc 1 304 18 add.u64 %r315,%r597,%r53; ld.u64 %r316,[%r164]; st.u64 [%r315],%r316; .loc 1 305 18 add.u64 %r317,%frame,%r53; .loc 1 305 20 ld.u64 %r319,[%r150]; mul.lo.u64 %r318,%r319,%r35; .loc 1 305 18 st.u64 [%r317],%r318; 
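/* The surrounding loop (.loc 1 304-309) gathers, for every dimension that is not
   being reduced, the source stride, the mask stride scaled by the mask element
   size (the mul.lo.u64 by %r35), and the extent clamped to zero when negative;
   those arrays drive the outer iteration of the masked reduction.  What each
   _gfortran_mminloc1_* entry point then computes per output element is roughly
   the following; types and names are an illustrative sketch, not the real code:

     #include <stdint.h>
     #include <stddef.h>

     // masked MINLOC along one dimension: only elements whose mask byte is
     // nonzero participate; an all-false mask yields 0, per the Fortran standard
     static int64_t mminloc_dim (const int64_t *src, ptrdiff_t delta,
                                 const unsigned char *mask, ptrdiff_t mdelta,
                                 ptrdiff_t len)
     {
       int64_t minval = INT64_MAX;
       int64_t result = 0;
       for (ptrdiff_t n = 0; n < len; n++, src += delta, mask += mdelta)
         if (*mask && *src < minval)
           {
             minval = *src;
             result = n + 1;
           }
       return result;
     }
*/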
.loc 1 306 19 ld.u64 %r321,[%r164+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r164+8]; sub.u64 %r52,%r320,%r322; .loc 1 308 10 setp.lt.s64 %r323,%r52,0; @ %r323 bra $L64; .loc 1 306 17 add.u64 %r325,%r600,%r53; st.u64 [%r325],%r52; bra $L65; $L64: .loc 1 309 12 add.u64 %r327,%r600,%r53; st.u64 [%r327],%r611; $L65: .loc 1 302 3 add.u64 %r164,%r164,24; add.u64 %r150,%r150,24; add.u64 %r53,%r53,8; setp.ne.u64 %r329,%r164,%r223; @ %r329 bra $L66; bra $L67; $L121: .loc 1 322 6 ld.u64 %r330,[%r233]; setp.eq.u64 %r331,%r330,0; @ ! %r331 bra $L69; bra $L68; $L62: add.u64 %r333,%r22,%r22; add.u64 %r334,%r333,%r22; shl.b64 %r335,%r334,3; add.u64 %r178,%r335,40; add.u64 %r59,%r234,%r178; add.u64 %r180,%r236,%r178; shl.b64 %r336,%r22,3; add.u64 %r91,%r336,-8; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; shl.b64 %r339,%r337,3; add.u64 %r97,%r339,-8; add.u64 %r597,%frame,240; add.u64 %r600,%frame,360; .loc 1 319 12 mov.u64 %r610,0; $L72: .loc 1 314 18 add.u64 %r341,%r597,%r91; ld.u64 %r342,[%r59]; st.u64 [%r341],%r342; .loc 1 315 18 add.u64 %r343,%frame,%r91; .loc 1 315 20 ld.u64 %r345,[%r180]; mul.lo.u64 %r344,%r345,%r35; .loc 1 315 18 st.u64 [%r343],%r344; .loc 1 316 19 ld.u64 %r347,[%r59+16]; add.u64 %r346,%r347,1; ld.u64 %r348,[%r59+8]; sub.u64 %r66,%r346,%r348; .loc 1 318 10 setp.lt.s64 %r349,%r66,0; @ %r349 bra $L70; .loc 1 316 17 add.u64 %r351,%r600,%r91; st.u64 [%r351],%r66; bra $L71; $L70: .loc 1 319 12 add.u64 %r353,%r600,%r91; st.u64 [%r353],%r610; $L71: .loc 1 312 3 add.u64 %r59,%r59,24; add.u64 %r180,%r180,24; add.u64 %r91,%r91,8; setp.ne.u64 %r355,%r91,%r97; @ %r355 bra $L72; bra $L122; $L68: .loc 1 326 7 setp.eq.u64 %r356,%r154,0; @ %r356 bra $L74; add.u64 %r600,%frame,360; $L100: add.u64 %r137,%r233,40; mov.u64 %r134,%r600; cvt.u32.u32 %r358,%r23; cvt.s64.s8 %r357,%r358; add.u64 %r360,%r357,%r357; add.u64 %r361,%r360,%r357; shl.b64 %r362,%r361,3; add.u64 %r363,%r233,16; add.u64 %r64,%r362,%r363; .loc 1 329 10 mov.u64 %r136,1; .loc 1 333 4 mov.u64 %r382,0; bra $L75; $L74: .loc 1 337 20 add.u32 %r364,%r23,-2; cvt.s64.s32 %r75,%r364; add.u64 %r366,%r75,%r75; add.u64 %r367,%r366,%r75; shl.b64 %r368,%r367,3; add.u64 %r369,%r233,%r368; .loc 1 337 67 shl.b64 %r371,%r75,3; add.u64 %r372,%frame,%r371; .loc 1 337 59 ld.u64 %r374,[%r369+40]; ld.u64 %r375,[%r372+360]; mul.lo.u64 %r162,%r374,%r375; .loc 1 339 24 mov.u64 %r376,0; st.u64 [%r233+8],%r376; .loc 1 340 28 cvt.u16.u32 %r379,%r23; add.u16 %r378,%r379,-1; cvt.u32.u16 %r380,%r378; st.u8 [%r233+28],%r380; .loc 1 342 10 setp.eq.u64 %r381,%r162,0; @ %r381 bra $L76; bra $L123; $L78: .loc 1 331 47 mul.lo.u64 %r136,%r72,%r136; $L75: .loc 1 333 4 st.u64 [%r137+8],%r382; ld.u64 %r72,[%r134]; add.u64 %r383,%r72,-1; st.u64 [%r137+16],%r383; st.u64 [%r137],%r136; .loc 1 326 7 add.u64 %r137,%r137,24; add.u64 %r134,%r134,8; setp.ne.u64 %r384,%r64,%r137; @ %r384 bra $L78; bra $L74; $L76: .loc 1 345 4 st.u64 [%r233+48],%r162; mov.u64 %r386,-1; st.u64 [%r233+56],%r386; mov.u64 %r387,1; st.u64 [%r233+40],%r387; .loc 1 346 4 bra $L54; $L123: .loc 1 349 24 mov.u64 %r389,16389; call (%value_in),_gfortrani_xmallocarray90,[%value_in]; } .loc 1 349 22 st.u64 [%r233],%r390; bra $L79; $L69: .loc 1 354 19 ld.s8 %r392,[%r233+28]; .loc 1 354 10 setp.eq.u64 %r393,%r392,%r154; @ %r393 bra $L80; .loc 1 355 2 cvta.const.u64 %r3944_gfortran_runtime_error80: .loc 1 357 11 cvta.global.u64 %r396,_gfortrani_compile_options; .loc 1 357 10 ld.u32 %r397,[%r396+36]; setp.eq.u32 %r398,%r397,0; @ %r398 bra $L79; .loc 1 359 4 add.u64 %r600,%frame,360; cvta.const.u64 %r402,$LC2r600002; 
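/* Note on the two calls just below: _gfortrani_bounds_ifunction_return and
   _gfortrani_bounds_equal_extents are reached only when a flag word in
   _gfortrani_compile_options (tested a few instructions earlier) is nonzero,
   i.e. when the program was compiled with run-time bounds checking.  They verify
   the shape of a caller-supplied return array and, apparently, of the MASK
   argument, reporting failures through _gfortran_runtime_error with the messages
   encoded in the string table at the head of this object ("rank of return array
   incorrect in MINLOC intrinsic: is %ld, should be %ld", "Incorrect extent in
   return value of MINLOC intrinsic in dimension %ld: ...", "MASK argument ...").
   This is a descriptive reading of the control flow only; the exact helper
   signatures are not reproduced here. */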
call _gfortrani_bounds_ifunction_return361 4 cvta.const.u64 %r406,$LC406402; call _gfortrani_bounds_equal_extents$L79: .loc 1 366 3 setp.ne.u64 %r408,%r154,0; @ %r408 bra $L81; $L84: .loc 1 374 8 ld.u64 %r174,[%r233]; .loc 1 375 8 ld.u64 %r172,[%r234]; .loc 1 377 9 setp.ne.u64 %r409,%r172,0; @ %r409 bra $L82; bra $L54; $L81: add.u64 %r188,%frame,480; add.u64 %r181,%r233,40; add.u64 %r149,%frame,120; cvt.u32.u32 %r411,%r23; cvt.s64.s8 %r410,%r411; add.u64 %r138,%r410,-1; .loc 1 366 3 mov.u64 %r175,0; add.u64 %r600,%frame,360; .loc 1 368 16 mov.u64 %r412,%r175; $L83: st.u64 [%r188],%r412; .loc 1 369 18 ld.u64 %r413,[%r181]; st.u64 [%r149],%r413; .loc 1 370 17 shl.b64 %r415,%r175,3; add.u64 %r416,%r600,%r415; .loc 1 370 10 ld.u64 %r417,[%r416]; setp.le.s64 %r418,%r417,0; @ %r418 bra $L54; .loc 1 366 26 add.u64 %r175,%r175,1; .loc 1 366 3 add.u64 %r188,%r188,8; add.u64 %r181,%r181,24; add.u64 %r149,%r149,8; setp.ne.u64 %r419,%r138,%r175; @ %r419 bra $L83; bra $L84; $L82: .loc 1 300 12 mul.lo.u64 %r43,%r41,%r35; .loc 1 396 32 shl.b64 %r88,%r158,3; ld.u64 %r189,[%frame+480]; .loc 1 440 22 ld.u64 %r102,[%frame+240]; .loc 1 440 12 shl.b64 %r103,%r102,3; .loc 1 441 23 ld.u64 %r105,[%frame]; .loc 1 442 22 ld.u64 %r107,[%frame+120]; .loc 1 442 12 shl.b64 %r108,%r107,4; .loc 1 444 32 ld.u64 %r29,[%frame+360]; .loc 1 451 23 mul.lo.u64 %r39,%r29,%r102; .loc 1 452 24 mul.lo.u64 %r420,%r29,%r105; .loc 1 452 10 neg.s64 %r92,%r420; .loc 1 453 23 mul.lo.u64 %r421,%r29,%r107; .loc 1 453 9 shl.b64 %r422,%r421,4; neg.s64 %r185,%r422; setp.le.s64 %r595,%r154,1; .loc 1 418 9 setp.ne.u32 %r603,%r237,0; add.u64 %r604,%r155,-1; cvt.u32.u32 %r605,%r23; cvt.s64.s8 %r606,%r605; add.u64 %r607,%r606,-1; add.u64 %r608,%frame,240; add.u64 %r609,%frame,360; $L101: .loc 1 366 3 mov.u64 %r168,%r173; mov.u64 %r167,%r172; .loc 1 396 9 mov.u64 %r197,0; $L88: .loc 1 399 7 ld.s8 %r224,[%r168]; mov.u64 %r230,%r197; .loc 1 396 24 add.u64 %r197,%r197,1; .loc 1 399 6 setp.eq.u32 %r598,%r224,0; @ %r598 bra $L85; .loc 1 407 11 ld.u64 %r145,[%r167]; .loc 1 408 11 mov.u64 %r557,%r197; shr.s64 %r558,%r557,63; .loc 1 418 9 @ %r603 bra $L86; bra $L124; $L85: .loc 1 396 32 add.u64 %r167,%r167,%r88; .loc 1 396 47 add.u64 %r168,%r168,%r43; .loc 1 396 2 setp.ne.u64 %r426,%r155,%r197; @ %r426 bra $L88; .loc 1 395 9 mov.u64 %r557,0; mov.u64 %r558,%r557; bra $L89; $L124: .loc 1 428 10 setp.gt.s64 %r427,%r155,%r230; @ %r427 bra $L90; bra $L89; $L86: .loc 1 419 8 setp.le.s64 %r428,%r155,%r230; @ %r428 bra $L89; add.u64 %r565,%r230,1; shr.s64 %r566,%r565,63; sub.u64 %r432,%r604,%r230; shr.s64 %r435,%r230,63; add.u64 %r438,%r230,2; set.u32.lt.u64 %r443,%r438,%r230; cvt.s64.s32 %r441,%r443; sub.u64 %r447,%r435,%r441; add.u64 %r449,%r432,%r438; set.u32.lt.u64 %r454,%r449,%r432; cvt.s64.s32 %r452,%r454; sub.u64 %r458,%r447,%r452; mov.u64 %r577,1; $L93: .loc 1 421 8 @ %r598 bra $L91; .loc 1 421 18 ld.u64 %r89,[%r167]; .loc 1 421 15 setp.gt.s64 %r460,%r89,%r145; .loc 1 424 16 selp.u64 %r557,%r557,%r565,%r460; selp.u64 %r558,%r558,%r566,%r460; selp.u64 %r145,%r145,%r89,%r460; $L91: .loc 1 419 33 add.u64 %r167,%r167,%r88; .loc 1 419 48 add.u64 %r168,%r168,%r43; .loc 1 419 8 add.u64 %r463,%r565,%r577; set.u32.lt.u64 %r468,%r463,%r565; cvt.s64.s32 %r466,%r468; mov.u64 %r565,%r463; sub.u64 %r566,%r566,%r466; setp.ne.u64 %r475,%r449,%r565; @ %r475 bra $L102; setp.ne.u64 %r478,%r458,%r566; @ ! 
%r478 bra $L89; $L102: .loc 1 421 9 ld.s8 %r224,[%r168]; setp.eq.u32 %r598,%r224,0; bra $L93; $L90: add.u64 %r561,%r230,1; shr.s64 %r562,%r561,63; sub.u64 %r482,%r604,%r230; shr.s64 %r485,%r230,63; add.u64 %r488,%r230,2; set.u32.lt.u64 %r493,%r488,%r230; cvt.s64.s32 %r491,%r493; sub.u64 %r497,%r485,%r491; add.u64 %r499,%r482,%r488; set.u32.lt.u64 %r504,%r499,%r482; cvt.s64.s32 %r502,%r504; sub.u64 %r508,%r497,%r502; .loc 1 428 10 mov.u64 %r591,1; $L116: .loc 1 430 10 ld.s8 %r510,[%r168]; cvt.u16.u32 %r509,%r510; setp.eq.u16 %r511,%r509,0; @ %r511 bra $L94; .loc 1 430 20 ld.u64 %r94,[%r167]; .loc 1 430 17 setp.ge.s64 %r512,%r94,%r145; .loc 1 433 11 selp.u64 %r557,%r557,%r561,%r512; selp.u64 %r558,%r558,%r562,%r512; selp.u64 %r145,%r145,%r94,%r512; $L94: .loc 1 428 35 add.u64 %r167,%r167,%r88; .loc 1 428 50 add.u64 %r168,%r168,%r43; .loc 1 428 10 add.u64 %r515,%r561,%r591; set.u32.lt.u64 %r520,%r515,%r561; cvt.s64.s32 %r518,%r520; mov.u64 %r561,%r515; sub.u64 %r562,%r562,%r518; setp.ne.u64 %r527,%r499,%r561; @ %r527 bra $L116; setp.ne.u64 %r530,%r508,%r562; @ %r530 bra $L116; $L89: .loc 1 436 8 st.u64 [%r174],%r557; st.u64 [%r174+8],%r558; .loc 1 439 15 add.u64 %r189,%r189,1; .loc 1 440 12 add.u64 %r172,%r172,%r103; .loc 1 441 13 add.u64 %r173,%r173,%r105; .loc 1 442 12 add.u64 %r174,%r174,%r108; .loc 1 444 13 setp.ne.u64 %r533,%r29,%r189; @ %r533 bra $L101; .loc 1 452 10 add.u64 %r169,%r173,%r92; .loc 1 453 9 add.u64 %r170,%r174,%r185; .loc 1 455 7 @ %r595 bra $L54; add.u64 %r219,%frame,488; .loc 1 451 23 mov.u64 %r112,%r39; .loc 1 455 7 mov.u64 %r218,8; .loc 1 454 5 mov.u64 %r171,1; add.u64 %r599,%frame,120; .loc 1 448 13 mov.u64 %r602,0; bra $L98; $L99: st.u64 [%r219],%r602; .loc 1 451 23 mul.lo.u64 %r112,%r125,%r124; .loc 1 452 24 mul.lo.u64 %r539,%r127,%r124; .loc 1 452 10 sub.u64 %r169,%r173,%r539; .loc 1 453 23 mul.lo.u64 %r540,%r129,%r124; .loc 1 453 9 shl.b64 %r541,%r540,4; sub.u64 %r170,%r174,%r541; .loc 1 454 5 add.u64 %r171,%r171,1; .loc 1 455 7 add.u64 %r219,%r219,8; add.u64 %r218,%r218,8; setp.eq.u64 %r542,%r171,%r607; @ %r542 bra $L54; $L98: .loc 1 463 16 ld.u64 %r543,[%r219]; add.u64 %r124,%r543,1; st.u64 [%r219],%r124; .loc 1 464 23 add.u64 %r545,%r608,%r218; ld.u64 %r125,[%r545]; .loc 1 464 13 sub.u64 %r546,%r125,%r112; shl.b64 %r547,%r546,3; add.u64 %r172,%r172,%r547; .loc 1 465 24 add.u64 %r548,%frame,%r218; ld.u64 %r127,[%r548]; .loc 1 465 14 add.u64 %r173,%r169,%r127; .loc 1 466 23 add.u64 %r550,%r599,%r218; ld.u64 %r129,[%r550]; .loc 1 466 13 shl.b64 %r551,%r129,4; add.u64 %r174,%r170,%r551; .loc 1 444 32 add.u64 %r553,%r609,%r218; ld.u64 %r133,[%r553]; .loc 1 444 13 setp.eq.u64 %r554,%r124,%r133; @ %r554 bra $L99; .loc 1 448 13 mov.u64 %r189,0; bra $L101; $L122: .loc 1 322 6 ld.u64 %r555,[%r233]; setp.eq.u64 %r556,%r555,0; @ ! 
%r556 bra $L69; bra $L100; $L54_gfortran_sminloc1_16_i8 .visible .func _gfortran_sminloc1_16_, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r282; .reg .predpred %r293; .reg .pred %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L126; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L127; $L126:r152154; call _gfortran_minloc1_16_500 7 bra $L125; $L127: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r297,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r297; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L129; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L130; $L136: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r297; @ %r175 bra $L131; bra $L184; $L129: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error130: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r303,0; $L135: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L133; .loc 1 515 17 st.u64 [%r139],%r36; bra $L134; $L133: .loc 1 518 12 st.u64 [%r139],%r303; $L134: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L135; bra $L136; $L184: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L138; bra $L137; $L131: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r296,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r296,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r302,0; $L141: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L139; .loc 1 523 17 st.u64 [%r43],%r41; bra $L140; $L139: .loc 1 527 12 st.u64 [%r43],%r302; $L140: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L141; bra $L185; $L137: .loc 1 534 7 setp.eq.u64 %r213,%r297,0; @ %r213 bra $L143; add.u64 %r296,%frame,120; $L163: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L144; $L143: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L145; bra $L186; $L147: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L144: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L147; bra $L143; $L145: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L125; $L186: .loc 1 557 24 mov.u64 %r246,1696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L151: .loc 1 583 3 setp.ne.u64 %r249,%r297,0; @ %r249 bra $L148; bra $L149; $L138: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r297; @ %r250 bra $L150; .loc 1 562 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error150: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L151; .loc 1 569 4 setp.eq.u64 %r256,%r297,0; @ %r256 bra $L149; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L153: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L152; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error152: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L153; $L148: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 
%r268,%r267,1; @ %r268 bra $L154; shl.b64 %r266,%r297,3; bra $L155; $L154: mov.u64 %r266,8; $L155: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L156: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r297,%r100; @ %r279 bra $L156; $L149: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,4; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r295,%r297,1; .loc 1 593 13 mov.u64 %r280,0; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; $L162: st.u64 [%r98],%r280; st.u64 [%r98+8],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r282,%r59,%r80; @ %r282 bra $L162; .loc 1 606 7 @ ! %r295 bra $L187; bra $L125; $L161: .loc 1 601 13 st.u64 [%r99],%r280; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r285,%r97,%r301; @ %r285 bra $L160; bra $L125; $L187: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; $L160: .loc 1 610 16 ld.u64 %r290,[%r99]; add.u64 %r74,%r290,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r291,%r75,%r72; shl.b64 %r292,%r291,4; add.u64 %r98,%r98,%r292; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r293,%r74,%r77; @ %r293 bra $L161; .loc 1 601 13 mov.u64 %r80,0; bra $L162; $L185: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r294,%r98,0; @ ! 
%r294 bra $L138; bra $L163; $L125: .loc 1 615 1 ret; } minloc1_4_i16.o/_gfortran_minloc1_4_i16 .visible .func _gfortran_minloc1_4_i16.file 1 "../../../libgfortran/generated/minloc1_4_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_4_i16 .visible .func _gfortran_mminloc1_4_i16_gfortran_sminloc1_4_i16 .visible .func _gfortran_sminloc1_4_i16VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_4_i16 .visible .func _gfortran_minloc1_4_i16480predu64 %r258; .reg .u64pred %r262; .reg .pred %r264; .reg .pred %r265; .reg .u64u64predu64 %r318; .reg .u32u64 %r325; .reg .pred %r326; .reg .pred %r327; .reg .u32predpredpred %r352; .reg .predu64 %r371; .reg .predpredu32u64 %r396; mov.u64 %r188,%ar0; mov.u64 %r189,%ar1; mov.u64 %r190,%ar2; mov.u32 %r191,%ar3; .loc 1 58 10 ld.s8 %r22,[%r189+28]; .loc 1 58 38 add.u32 %r192,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r118,%r192; .loc 1 59 10 ld.u64 %r25,[%r190]; .loc 1 59 7 add.u64 %r119,%r25,-1; .loc 1 61 7 shr.u64 %r194,%r119,63; set.u32.lt.s64 %r196,%r118,%r119; neg.s32 %r197,%r196; cvt.u16.u64 %r200,%r194; cvt.u16.u32 %r201,%r197; or.b16 %r199,%r200,%r201; .loc 1 61 6 cvt.u32.u16 %r202,%r199; cvt.u16.u8 %r203,%r202; setp.eq.u16 %r204,%r203,0; @ %r204 bra $L2; .loc 1 63 7 cvt.u32.u32 %r207,%r22; cvt.s64.s8 %r206,%r207; st.u64 [%stack+8],%r20605205_gfortran_runtime_errorr210,%r119,%r119; add.u64 %r211,%r210,%r119; shl.b64 %r212,%r211,3; add.u64 %r213,%r189,%r212; ld.u64 %r30,[%r213+56]; ld.u64 %r32,[%r213+48]; .loc 1 71 9 ld.u64 %r122,[%r213+40]; .loc 1 73 3 setp.ne.u64 %r227,%r119,0; @ %r227 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r228,%r118,%r119; @ %r228 bra $L4; bra $L56; $L3: add.u64 %r138,%r189,40; add.u64 %r186,%frame,120; add.u64 %r185,%frame,240; add.u64 %r229,%r189,16; add.u64 %r231,%r25,%r25; add.u64 %r232,%r231,%r25; shl.b64 %r233,%r232,3; add.u64 %r145,%r229,%r233; .loc 1 79 12 mov.u64 %r396,0; $L8: .loc 1 75 18 ld.u64 %r234,[%r138]; st.u64 [%r186],%r234; .loc 1 76 19 ld.u64 %r236,[%r138+16]; add.u64 %r235,%r236,1; ld.u64 
// (_gfortran_minloc1_4_i16, continued, source lines 90-128: when the return
//  descriptor has no data pointer its shape and strides are set up and the
//  data is allocated through _gfortrani_xmallocarray; otherwise the rank of
//  the supplied return array is checked with _gfortran_runtime_error and,
//  if bounds checking is enabled in _gfortrani_compile_options,
//  _gfortrani_bounds_ifunction_return is called.)
// (reduction proper, source lines 132-227: an outer loop visits every element
//  of the result while an inner loop scans the reduced dimension; the running
//  minimum is a 128-bit value kept in two 64-bit registers initialised to
//  0x7fffffffffffffff / 0xffffffffffffffff (high/low), the BACK flag in %ar3
//  selects between the two scan variants at lines 180-199, and the winning
//  1-based position is stored with st.u32, matching the kind-4 index type of
//  the return array.)
$L1: ret; }
// BEGIN FUNCTION DEF: _gfortran_mminloc1_4_i16 (.visible .func; args %ar0-%ar3 .u64, %ar4 .u32)
// (prologue, source line 261: when the MASK argument in %ar3 is null the
//  function simply calls _gfortran_minloc1_4_i16 and returns)
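The inner scan summarised above keeps the running 128-bit minimum split across two 64-bit registers and, since nvptx has no 128-bit setp, compares candidates in two steps: a signed compare of the high halves, then an unsigned compare of the low halves when the high halves are equal. A minimal C illustration of that decomposition (names are illustrative):

#include <stdint.h>
#include <stdbool.h>

/* True when the signed 128-bit value (a_hi:a_lo) is greater than (b_hi:b_lo). */
static bool i128_gt(int64_t a_hi, uint64_t a_lo, int64_t b_hi, uint64_t b_lo)
{
    if (a_hi != b_hi)
        return a_hi > b_hi;   /* signed compare on the high halves   */
    return a_lo > b_lo;        /* unsigned compare on the low halves */
}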
// (_gfortran_mminloc1_4_i16 body, source lines 268-466: computes the extent of
//  the reduced dimension and returns early when it is not positive; accepts
//  MASK element kinds 1, 2, 4, 8 and 16, anything else raising the
//  "Funny sized logical array" runtime error; copies the strides and extents
//  of both ARRAY and MASK into the %frame arrays; allocates the return array
//  through _gfortrani_xmallocarray when it has no data, or validates its rank
//  and, with bounds checking enabled, calls _gfortrani_bounds_ifunction_return
//  for the return value and _gfortrani_bounds_equal_extents for the MASK
//  argument; the masked scan then skips leading false mask entries, seeds the
//  128-bit minimum with the first selected element, continues with the
//  BACK-dependent comparison, and stores each winning position with st.u32.)
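In the masked variant summarised above only elements whose MASK entry is true take part, and a run in which nothing is selected stores position 0. A simplified, flat sketch of one such run follows; the real code walks descriptors, supports several logical kinds and the BACK argument, none of which is modelled here, and mminloc_run with its parameters is an illustrative name.

#include <stdint.h>
#include <stddef.h>
#include <stdbool.h>

typedef __int128 elem_t;   /* 16-byte integer elements ("i16") */

/* Scan `extent` elements; `stride` and `mstride` are the element strides of
 * the data and of the mask.  Returns the 1-based position of the smallest
 * selected element, or 0 when the mask selects nothing. */
static int32_t mminloc_run(const elem_t *base, ptrdiff_t stride,
                           const bool *mask, ptrdiff_t mstride,
                           ptrdiff_t extent)
{
    int32_t pos = 0;
    elem_t best = 0;
    ptrdiff_t i = 0;

    /* Skip leading .false. entries; the first selected element seeds `best`. */
    for (; i < extent; i++) {
        if (mask[i * mstride]) {
            best = base[i * stride];
            pos = (int32_t)(i + 1);
            i++;
            break;
        }
    }
    /* Continue the scan, still honouring the mask. */
    for (; i < extent; i++) {
        if (mask[i * mstride] && base[i * stride] < best) {
            best = base[i * stride];
            pos = (int32_t)(i + 1);
        }
    }
    return pos;
}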
// BEGIN FUNCTION DEF: _gfortran_sminloc1_4_i16 (.visible .func; args %ar0-%ar3 .u64, %ar4 .u32)
// .local .align 16 .b8 %frame_ar[368]
// (body, source lines 493-615: a null or true scalar MASK forwards to
//  _gfortran_minloc1_4_i16; otherwise DIM is validated with
//  _gfortran_runtime_error, the reduced extents are gathered, the return
//  array is allocated with _gfortrani_xmallocarray when it has no data, its
//  rank and, under bounds checking, its per-dimension extents are verified,
//  and the result is filled with zero index values through st.u32 before
//  returning at line 615.)
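The scalar-MASK wrapper summarised above does no scanning of its own: a null or true mask argument makes it behave exactly like the unmasked routine, while a false mask only produces a zero-filled result of the reduced shape. Sketched here with placeholder types; the descriptor layout and the real entry-point signatures are not reproduced.

#include <stdint.h>
#include <stddef.h>

/* Placeholder descriptor types; the real libgfortran array descriptors are
 * not reproduced here. */
typedef struct result_desc result_desc;
typedef struct source_desc source_desc;

/* Illustrative stand-ins for the two code paths visible in the listing:
 * the unmasked reduction and the zero-fill of the reduced-shape result. */
void minloc1_unmasked(result_desc *ret, source_desc *src,
                      const ptrdiff_t *pdim, int back);
void zero_fill_result(result_desc *ret, source_desc *src,
                      const ptrdiff_t *pdim);

void sminloc1_sketch(result_desc *ret, source_desc *src, const ptrdiff_t *pdim,
                     const int32_t *mask, int back)
{
    if (mask == NULL || *mask) {
        /* MASK absent or scalar .true.: identical to plain MINLOC(..., DIM=). */
        minloc1_unmasked(ret, src, pdim, back);
        return;
    }
    /* Scalar .false. MASK: nothing is selected, so every stored index is 0. */
    zero_fill_result(ret, src, pdim);
}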
ret; }

minloc1_8_i16.o/
.file 1 "../../../libgfortran/generated/minloc1_8_i16.c"
// BEGIN GLOBAL FUNCTION DECL: _gfortran_minloc1_8_i16   .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_8_i16  .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_sminloc1_8_i16  .visible .func
// (same .extern declarations and the same decoded MINLOC diagnostic strings
//  as in minloc1_4_i16.o above)
// BEGIN FUNCTION DEF: _gfortran_minloc1_8_i16 (.visible .func; args %ar0-%ar2 .u64, %ar3 .u32)
// (identical structure to _gfortran_minloc1_4_i16: DIM validation, stride and
//  extent setup, optional allocation through _gfortrani_xmallocarray, optional
//  _gfortrani_bounds_ifunction_return check, then the odometer-driven
//  reduction over the 128-bit elements; the difference is the kind-8 index
//  result, stored with st.u64 and stepped with 8-byte result strides.)
$L1: ret; }
// BEGIN FUNCTION DEF: _gfortran_mminloc1_8_i16 (.visible .func; args %ar0-%ar3 .u64, %ar4 .u32)
// (prologue, source line 261: a null MASK argument forwards to
//  _gfortran_minloc1_8_i16 and returns)
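Both the unmasked and the masked reductions wrap their inner scan in the same outer machinery: a set of per-dimension counters is stepped like an odometer over the non-reduced dimensions while the source and result pointers are advanced by the matching strides (the regions around source lines 203-227 and 439-466 in the functions above). A compact sketch of that traversal, with illustrative names:

#include <stddef.h>

/* Advance an odometer of `rank` counters over the given extents, stepping the
 * source and destination offsets by per-dimension strides.  Returns 0 once
 * every combination has been visited. */
static int odometer_step(ptrdiff_t count[], const ptrdiff_t extent[],
                         const ptrdiff_t sstride[], const ptrdiff_t dstride[],
                         int rank, ptrdiff_t *soff, ptrdiff_t *doff)
{
    for (int n = 0; n < rank; n++) {
        count[n]++;
        *soff += sstride[n];
        *doff += dstride[n];
        if (count[n] != extent[n])
            return 1;               /* this digit did not wrap: keep going */
        /* Digit wrapped: reset it, back the offsets out, carry into the next. */
        count[n] = 0;
        *soff -= sstride[n] * extent[n];
        *doff -= dstride[n] * extent[n];
    }
    return 0;                        /* all digits wrapped: traversal finished */
}

Each time the odometer finishes a full carry chain, every element of the result array has been visited exactly once.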
// (_gfortran_mminloc1_8_i16 body: the same masked reduction as in
//  _gfortran_mminloc1_4_i16 — MASK kind check, stride and extent setup,
//  optional allocation and bounds checks, odometer over the result, masked
//  scan of the reduced dimension — but storing kind-8 positions with st.u64.)
ret; }
// BEGIN FUNCTION DEF: _gfortran_sminloc1_8_i16 (.visible .func; args %ar0-%ar3 .u64, %ar4 .u32)
// .local .align 16 .b8 %frame_ar[368]
// (body: the same scalar-MASK wrapper as _gfortran_sminloc1_4_i16 — a null or
//  true mask forwards to _gfortran_minloc1_8_i16, a false mask allocates the
//  result if needed, verifies its rank and extents, and zero-fills it — with
//  the zeros stored through st.u64 and 8-byte result strides.)
%r268 bra $L160; shl.b64 %r266,%r296,3; bra $L161; $L160: mov.u64 %r266,8; $L161: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L162: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L162; $L155: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,3; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 593 13 mov.u64 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L168: st.u64 [%r98],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L168; .loc 1 606 7 @ ! %r294 bra $L193; bra $L131; $L167: .loc 1 601 13 st.u64 [%r99],%r280; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L166; bra $L131; $L193: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; $L166: .loc 1 610 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,3; add.u64 %r98,%r98,%r291; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L167; .loc 1 601 13 mov.u64 %r80,0; bra $L168; $L191: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r293,%r98,0; @ ! %r293 bra $L144; bra $L169; $L131: .loc 1 615 1 ret; } /181 _gfortran_minloc1_16_i16 .visible .func _gfortran_minloc1_16_i16.file 1 "../../../libgfortran/generated/minloc1_16_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_16_i16 .visible .func _gfortran_mminloc1_16_i16_gfortran_sminloc1_16_i16 .visible .func _gfortran_sminloc1_16_i16VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 
_gfortran_minloc1_16_i16 (source lines ~58-230): the unmasked INTEGER(16) reduction kernel. It validates the DIM argument against the array rank (raising _gfortran_runtime_error on mismatch), copies extents and strides into the frame, allocates the return array through _gfortrani_xmallocarray when needed, verifies its shape through _gfortrani_bounds_ifunction_return when bounds checking is enabled in _gfortrani_compile_options, then scans each vector along DIM keeping the running 128-bit minimum in a pair of 64-bit registers (signed compare on the high word, unsigned on the low word) and stores the 1-based position of the minimum; two loop variants keep either the first or the last occurrence, selected by the BACK argument.

_gfortran_mminloc1_16_i16 (source lines ~261-470): the masked variant. A null MASK descriptor simply forwards to _gfortran_minloc1_16_i16; otherwise the kernel checks the mask element size (kinds 1, 2, 4, 8 and 16 are accepted, anything else raises the "Funny sized logical array" error), checks extents with _gfortrani_bounds_equal_extents and _gfortrani_bounds_ifunction_return when bounds checking is on, and performs the same reduction while stepping a mask pointer alongside the data so that only elements with a true mask entry are candidates.

_gfortran_sminloc1_16_i16 (source lines ~493-615): the scalar-MASK wrapper. A null mask pointer or a true mask value tail-calls _gfortran_minloc1_16_i16; a false mask only sets up (and, if necessary, allocates) the return array and zero-fills it before returning.
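What these kernels compute can be stated far more compactly than the generated PTX. Below is a hedged conceptual sketch in C of a minloc1-style reduction; the name minloc_dim1, the 2-D restriction and the types are illustrative assumptions, not libgfortran's interface. The real kernels additionally handle arbitrary rank, strides, the BACK argument and the descriptor and allocation bookkeeping summarised above.

    /* Hedged conceptual sketch, NOT libgfortran's implementation.  For a
       column-major rows x cols array (rows >= 1) reduced along the first
       dimension, result[j] receives the 1-based row index of the smallest
       element of column j (first occurrence, i.e. BACK=.false.). */
    #include <stddef.h>
    #include <stdint.h>

    static void minloc_dim1(const int64_t *a, size_t rows, size_t cols,
                            int64_t *result)
    {
        for (size_t j = 0; j < cols; ++j) {
            int64_t best   = a[j * rows];   /* column j starts at a + j*rows */
            size_t  best_i = 0;
            for (size_t i = 1; i < rows; ++i) {
                int64_t v = a[j * rows + i];
                if (v < best) {             /* strict <, so the first minimum wins */
                    best   = v;
                    best_i = i;
                }
            }
            result[j] = (int64_t)(best_i + 1);  /* Fortran positions are 1-based */
        }
    }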
minloc1_4_r4.o -- PTX module compiled from ../../../libgfortran/generated/minloc1_4_r4.c, laid out like the module above: visible functions _gfortran_minloc1_4_r4, _gfortran_mminloc1_4_r4 and _gfortran_sminloc1_4_r4, the same external declarations (_gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents) and the same MINLOC error-message string constants.

_gfortran_minloc1_4_r4: the REAL(4) reduction with an INTEGER(4) result. Control flow mirrors the integer kernel, but element loads are ld.f32, comparisons are setp.lt.f32 / setp.ge.f32 (again with first-occurrence and BACK variants of the loop), leading elements that fail the v <= +Inf test (0f7f800000), i.e. NaNs, are skipped before a first candidate is taken, and the 1-based position is stored as a 32-bit integer (st.u32).

_gfortran_mminloc1_4_r4: the masked REAL(4) variant, again validating the MASK kind, checking extents through _gfortrani_bounds_equal_extents and _gfortrani_bounds_ifunction_return when bounds checking is on, and stepping the mask alongside the data during the reduction.

_gfortran_sminloc1_4_r4: the scalar-MASK wrapper for REAL(4); a null or true mask forwards to _gfortran_minloc1_4_r4, a false mask sets up the return array and zero-fills it.
%r293 bra $L152; bra $L177; $L139: .loc 1 615 1 ret; } minloc1_8_r4.o/_gfortran_minloc1_8_r4 .visible .func _gfortran_minloc1_8_rfortran/generated/minloc1_8_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_8_r4 .visible .func _gfortran_mminloc1_8_r_gfortran_sminloc1_8_r4 .visible .func _gfortran_sminloc1_8_rVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_8_r4 .visible .func _gfortran_minloc1_8_r4480pred %r221; .reg .pred %r222; .reg .u64u64 %r229; .reg .u64 %r230; .reg .u64 %r231; .reg .pred %r232; .reg .predu64 %r280; .reg .u64 %r282; .reg .u64pred64pred %r317; .reg .predpred %r323; .reg .predpred %r326; .reg .pred %r327; .reg .predu64 %r333; .reg .u64pred %r352; .reg .u64 %r353; .reg .predmov.u64 %r182,%ar0; mov.u64 %r183,%ar1; mov.u64 %r184,%ar2; mov.u32 %r185,%ar3; .loc 1 58 10 ld.s8 %r22,[%r183+28]; .loc 1 58 38 add.u32 %r186,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r126,%r186; .loc 1 59 10 ld.u64 %r25,[%r184]; .loc 1 59 7 add.u64 %r127,%r25,-1; .loc 1 61 7 shr.u64 %r188,%r127,63; set.u32.lt.s64 %r190,%r126,%r127; neg.s32 %r191,%r190; cvt.u16.u64 %r194,%r188; cvt.u16.u32 %r195,%r191; or.b16 %r193,%r194,%r195; .loc 1 61 6 cvt.u32.u16 %r196,%r193; cvt.u16.u8 %r197,%r196; setp.eq.u16 %r198,%r197,0; @ %r198 bra $L2; .loc 1 63 7 cvt.u32.u32 %r201,%r22; cvt.s64.s8 %r200,%r201; st.u64 [%stack+8],%r20019_gfortran_runtime_errorr204,%r127,%r127; add.u64 %r205,%r204,%r127; shl.b64 %r206,%r205,3; add.u64 %r207,%r183,%r206; ld.u64 %r31,[%r207+56]; ld.u64 %r33,[%r207+48]; .loc 1 71 9 ld.u64 %r130,[%r207+40]; .loc 1 73 3 setp.ne.u64 %r221,%r127,0; @ %r221 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r222,%r126,%r127; @ %r222 bra $L4; bra $L62; $L3: add.u64 %r125,%r183,40; add.u64 %r144,%frame,120; add.u64 %r181,%frame,240; add.u64 %r223,%r183,16; add.u64 %r225,%r25,%r25; add.u64 %r226,%r225,%r25; shl.b64 %r227,%r226,3; add.u64 %r75,%r223,%r227; .loc 1 79 12 mov.u64 %r368,0; $L8: .loc 1 75 18 ld.u64 %r228,[%r125]; st.u64 [%r144],%r228; .loc 1 76 19 ld.u64 %r230,[%r125+16]; 
add.u64 %r229,%r230,1; ld.u64 %r231,[%r125+8]; sub.u64 %r38,%r229,%r231; .loc 1 78 10 setp.lt.s64 %r232,%r38,0; @ %r232 bra $L6; .loc 1 76 17 st.u64 [%r181],%r38; bra $L7; $L6: .loc 1 79 12 st.u64 [%r181],%r368; $L7: .loc 1 73 3 add.u64 %r125,%r125,24; add.u64 %r144,%r144,8; add.u64 %r181,%r181,8; setp.ne.u64 %r234,%r75,%r125; @ %r234 bra $L8; bra $L9; $L62: .loc 1 90 6 ld.u64 %r235,[%r182]; setp.eq.u64 %r236,%r235,0; @ ! %r236 bra $L11; bra $L10; $L4: add.u64 %r238,%r25,%r25; add.u64 %r239,%r238,%r25; shl.b64 %r240,%r239,3; add.u64 %r241,%r240,40; add.u64 %r121,%r183,%r241; shl.b64 %r242,%r25,3; add.u64 %r111,%r242,-8; add.u64 %r356,%frame,120; add.u64 %r115,%r356,%r111; add.u64 %r355,%frame,240; add.u64 %r109,%r355,%r111; cvt.u32.u32 %r246,%r22; cvt.s64.s8 %r245,%r246; add.u64 %r248,%r245,%r245; add.u64 %r249,%r248,%r245; shl.b64 %r250,%r249,3; add.u64 %r251,%r183,40; add.u64 %r49,%r250,%r251; .loc 1 87 12 mov.u64 %r367,0; $L14: .loc 1 83 18 ld.u64 %r252,[%r121]; st.u64 [%r115],%r252; .loc 1 84 19 ld.u64 %r254,[%r121+16]; add.u64 %r253,%r254,1; ld.u64 %r255,[%r121+8]; sub.u64 %r45,%r253,%r255; .loc 1 86 10 setp.lt.s64 %r256,%r45,0; @ %r256 bra $L12; .loc 1 84 17 st.u64 [%r109],%r45; bra $L13; $L12: .loc 1 87 12 st.u64 [%r109],%r367; $L13: .loc 1 81 3 add.u64 %r121,%r121,24; add.u64 %r115,%r115,8; add.u64 %r109,%r109,8; setp.ne.u64 %r258,%r49,%r121; @ %r258 bra $L14; bra $L63; $L10: .loc 1 94 7 setp.le.s64 %r259,%r126,0; @ %r259 bra $L16; add.u64 %r355,%frame,240; $L44: add.u64 %r150,%r182,40; mov.u64 %r148,%r355; cvt.u32.u32 %r261,%r22; cvt.s64.s8 %r260,%r261; add.u64 %r263,%r260,%r260; add.u64 %r264,%r263,%r260; shl.b64 %r265,%r264,3; add.u64 %r266,%r182,16; add.u64 %r123,%r265,%r266; .loc 1 97 10 mov.u64 %r114,1; .loc 1 101 4 mov.u64 %r289,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r267,0; st.u64 [%r182+8],%r267; .loc 1 106 28 cvt.u16.u32 %r270,%r22; add.u16 %r269,%r270,-1; cvt.u32.u16 %r271,%r269; st.u8 [%r182+28],%r271; .loc 1 108 20 add.u32 %r272,%r22,-2; cvt.s64.s32 %r56,%r272; add.u64 %r274,%r56,%r56; add.u64 %r275,%r274,%r56; shl.b64 %r276,%r275,3; add.u64 %r277,%r182,%r276; .loc 1 108 67 shl.b64 %r279,%r56,3; add.u64 %r280,%frame,%r279; .loc 1 108 59 ld.u64 %r282,[%r277+40]; ld.u64 %r283,[%r280+240]; mul.lo.u64 %r132,%r282,%r283; .loc 1 110 29 mov.u64 %r2851285; call (%value_in),_gfortrani_xmallocarray286,[%value_in]; } .loc 1 110 27 st.u64 [%r182],%r286; .loc 1 111 10 setp.eq.u64 %r288,%r132,0; @ ! 
%r288 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r114,%r52,%r114; $L17: .loc 1 101 4 st.u64 [%r150+8],%r289; ld.u64 %r52,[%r148]; add.u64 %r290,%r52,-1; st.u64 [%r150+16],%r290; st.u64 [%r150],%r114; .loc 1 94 7 add.u64 %r150,%r150,24; add.u64 %r148,%r148,8; setp.ne.u64 %r291,%r123,%r150; @ %r291 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r182+48],%r132; mov.u64 %r293,-1; st.u64 [%r182+56],%r293; mov.u64 %r294,1; st.u64 [%r182+40],%r294; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r62,[%r182+28]; .loc 1 121 10 setp.eq.u64 %r295,%r62,%r126; @ %r295 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r126; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r298,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r299,[%r298+36]; setp.eq.u32 %r300,%r299,0; @ %r300 bra $L21; .loc 1 128 2 add.u64 %r355,%frame,240; cvta.const.u64 %r304,$LC2call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r306,%r126,0; @ %r306 bra $L24; $L27: .loc 1 68 9 add.u64 %r307,%r31,1; .loc 1 68 7 sub.u64 %r128,%r307,%r33; max.s64 %r145,%r128,0; .loc 1 140 8 ld.u64 %r140,[%r183]; .loc 1 141 8 ld.u64 %r141,[%r182]; .loc 1 168 40 shl.b64 %r72,%r130,2; ld.u64 %r47,[%frame+360]; .loc 1 204 22 ld.u64 %r82,[%frame+120]; .loc 1 204 12 shl.b64 %r84,%r82,2; .loc 1 205 22 ld.u64 %r85,[%frame]; .loc 1 205 12 shl.b64 %r87,%r85,3; setp.gt.s64 %r357,%r128,0; setp.le.s64 %r358,%r126,1; .loc 1 180 9 setp.ne.u32 %r360,%r185,0; .loc 1 159 10 mov.u64 %r361,0; cvt.u32.u32 %r362,%r22; cvt.s64.s8 %r363,%r362; add.u64 %r364,%r363,-1; add.u64 %r365,%frame,120; add.u64 %r366,%frame,240; bra $L25; $L24: add.u64 %r168,%frame,360; add.u64 %r167,%r182,40; mov.u64 %r165,%frame; cvt.u32.u32 %r309,%r22; cvt.s64.s8 %r308,%r309; add.u64 %r154,%r308,-1; .loc 1 132 3 mov.u64 %r142,0; add.u64 %r355,%frame,240; .loc 1 134 16 mov.u64 %r310,%r142; $L26: st.u64 [%r168],%r310; .loc 1 135 18 ld.u64 %r311,[%r167]; st.u64 [%r165],%r311; .loc 1 136 17 shl.b64 %r313,%r142,3; add.u64 %r314,%r355,%r313; .loc 1 136 10 ld.u64 %r315,[%r314]; setp.le.s64 %r316,%r315,0; @ %r316 bra $L1; .loc 1 132 26 add.u64 %r142,%r142,1; .loc 1 132 3 add.u64 %r168,%r168,8; add.u64 %r167,%r167,24; add.u64 %r165,%r165,8; setp.ne.u64 %r317,%r142,%r154; @ %r317 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r357 bra $L45; .loc 1 159 10 st.u64 [%r141],%r361; bra $L29; $L45: mov.u64 %r133,%r140; .loc 1 168 17 mov.u64 %r118,0; $L28: .loc 1 170 7 ld.f32 %r113,[%r133]; mov.u64 %r151,%r118; .loc 1 168 32 add.u64 %r118,%r118,1; .loc 1 170 6 setp.le.f32 %r320,%r113,0f7f800000; @ ! 
%r320 bra $L64; .loc 1 180 9 @ %r360 bra $L32; .loc 1 190 8 setp.gt.s64 %r322,%r145,%r151; @ %r322 bra $L33; bra $L34; $L32: .loc 1 181 8 setp.gt.s64 %r323,%r145,%r151; @ %r323 bra $L46; bra $L34; $L64: .loc 1 168 40 add.u64 %r133,%r133,%r72; .loc 1 168 10 setp.gt.s64 %r324,%r145,%r118; @ %r324 bra $L28; .loc 1 157 9 mov.u64 %r118,1; bra $L34; $L46: .loc 1 170 7 mov.f32 %r180,%r113; $L35: .loc 1 186 16 add.u64 %r151,%r151,1; .loc 1 183 8 setp.ge.f32 %r325,%r113,%r180; .loc 1 185 16 selp.f32 %r113,%r180,%r113,%r325; selp.u64 %r118,%r151,%r118,%r325; .loc 1 181 33 add.u64 %r133,%r133,%r72; .loc 1 181 8 setp.eq.u64 %r326,%r145,%r151; @ %r326 bra $L34; .loc 1 183 9 ld.f32 %r180,[%r133]; bra $L35; $L33: .loc 1 192 9 ld.f32 %r77,[%r133]; .loc 1 195 16 add.u64 %r151,%r151,1; .loc 1 192 8 setp.lt.f32 %r327,%r77,%r113; .loc 1 194 16 selp.f32 %r113,%r77,%r113,%r327; selp.u64 %r118,%r151,%r118,%r327; .loc 1 190 33 add.u64 %r133,%r133,%r72; .loc 1 190 8 setp.ne.u64 %r328,%r145,%r151; @ %r328 bra $L33; $L34: .loc 1 199 12 st.u64 [%r141],%r118; $L29: .loc 1 203 15 add.u64 %r47,%r47,1; .loc 1 204 12 add.u64 %r140,%r140,%r84; .loc 1 205 12 add.u64 %r141,%r141,%r87; .loc 1 207 32 ld.u64 %r122,[%frame+240]; .loc 1 207 13 setp.ne.u64 %r329,%r47,%r122; @ %r329 bra $L25; .loc 1 214 23 mul.lo.u64 %r330,%r82,%r47; .loc 1 214 9 shl.b64 %r331,%r330,2; sub.u64 %r137,%r140,%r331; .loc 1 215 23 mul.lo.u64 %r332,%r85,%r47; .loc 1 215 9 shl.b64 %r333,%r332,3; sub.u64 %r138,%r141,%r333; .loc 1 217 7 @ %r358 bra $L1; add.u64 %r177,%frame,368; mov.u64 %r176,8; .loc 1 216 5 mov.u64 %r139,1; bra $L42; $L43: .loc 1 211 13 st.u64 [%r177],%r361; .loc 1 214 23 mul.lo.u64 %r339,%r100,%r99; .loc 1 214 9 shl.b64 %r340,%r339,2; sub.u64 %r137,%r140,%r340; .loc 1 215 23 mul.lo.u64 %r341,%r103,%r99; .loc 1 215 9 shl.b64 %r342,%r341,3; sub.u64 %r138,%r141,%r342; .loc 1 216 5 add.u64 %r139,%r139,1; .loc 1 217 7 add.u64 %r177,%r177,8; add.u64 %r176,%r176,8; setp.eq.u64 %r343,%r139,%r364; @ %r343 bra $L1; $L42: .loc 1 225 16 ld.u64 %r344,[%r177]; add.u64 %r99,%r344,1; st.u64 [%r177],%r99; .loc 1 226 23 add.u64 %r346,%r365,%r176; ld.u64 %r100,[%r346]; .loc 1 226 13 shl.b64 %r347,%r100,2; add.u64 %r140,%r137,%r347; .loc 1 227 23 add.u64 %r348,%frame,%r176; ld.u64 %r103,[%r348]; .loc 1 227 13 shl.b64 %r349,%r103,3; add.u64 %r141,%r138,%r349; .loc 1 207 32 add.u64 %r351,%r366,%r176; ld.u64 %r107,[%r351]; .loc 1 207 13 setp.eq.u64 %r352,%r99,%r107; @ %r352 bra $L43; .loc 1 211 13 mov.u64 %r47,0; bra $L25; $L63: .loc 1 90 6 ld.u64 %r353,[%r182]; setp.eq.u64 %r354,%r353,0; @ ! 
%r354 bra $L11; bra $L44; $L1:_gfortran_mminloc1_8_r4 .visible .func _gfortran_mminloc1_8_r, .param .u32 %in_ar4) {608pred %r218; .reg .u32 %r223; .reg .u6464u64pred %r373; .reg .u64 %r374; .reg .u64 %r376; .reg .u32 %r377; .reg .pred %r378; .reg .u64 %r381; .reg .u64 %r382; .reg .u64 %r386; .reg .pred %r388; .reg .pred %r389; .reg .u64 %r390; .reg .u32 %r391; .reg .u64pred %r398; .reg .pred %r399; .reg .pred %r401; .reg .pred %r402; .reg .pred %r403; .reg .pred %r404; .reg .pred %r407; .reg .pred %r408; .reg .pred %r410; .reg .pred %r411; .reg .pred.reg .u64 %r421; .reg .u64u64pred %r445; .reg .u64 %r446; .reg .predmov.u64 %r213,%ar0; mov.u64 %r214,%ar1; mov.u64 %r215,%ar2; mov.u64 %r216,%ar3; mov.u32 %r217,%ar4; .loc 1 261 6 setp.ne.u64 %r218,%r216,0; @ %r218 bra $L661r214215217; call _gfortran_minloc1_8_r4268 7 bra $L65; $L66: .loc 1 271 10 ld.u64 %r22,[%r215]; .loc 1 271 7 add.u64 %r152,%r22,-1; .loc 1 272 10 ld.s8 %r23,[%r214+28]; .loc 1 272 38 add.u32 %r223,%r23,-1; .loc 1 272 8 cvt.s64.s32 %r153,%r223; .loc 1 275 7 shr.u64 %r225,%r152,63; set.u32.gt.s64 %r227,%r152,%r153; neg.s32 %r228,%r227; cvt.u16.u64 %r231,%r225; cvt.u16.u32 %r232,%r228; or.b16 %r230,%r231,%r232; .loc 1 275 6 cvt.u32.u16 %r233,%r230; cvt.u16.u8 %r234,%r233; setp.eq.u16 %r235,%r234,0; @ %r235 bra $L68; .loc 1 277 7 cvt.u32.u32 %r238,%r23; cvt.s64.s8 %r237,%r238; st.u64 [%stack+8],%r237; st.u64 [%stack],%r22; cvta.const.u64 %r236236_gfortran_runtime_error68: .loc 1 282 9 add.u64 %r442,%r152,%r152; add.u64 %r444,%r442,%r152; shl.b64 %r243,%r444,3; add.u64 %r244,%r214,%r243; ld.u64 %r247,[%r244+56]; add.u64 %r246,%r247,1; .loc 1 282 7 ld.u64 %r254,[%r244+48]; sub.u64 %r154,%r246,%r254; .loc 1 283 6 setp.le.s64 %r255,%r154,0; @ %r255 bra $L65; .loc 1 286 9 ld.u64 %r172,[%r216]; .loc 1 288 15 ld.u64 %r34,[%r216+16]; .loc 1 290 22 cvt.u32.u64 %r35,%r34; .loc 1 290 53 add.u32 %r256,%r35,-4; and.b32 %r257,%r256,-5; set.u32.eq.u32 %r259,%r257,0; neg.s32 %r260,%r259; .loc 1 290 22 add.u32 %r261,%r35,-1; set.u32.le.u32 %r263,%r261,1; neg.s32 %r264,%r263; .loc 1 290 6 cvt.u16.u32 %r266,%r260; cvt.u16.u32 %r267,%r264; or.b16 %r265,%r266,%r267; cvt.u32.u16 %r268,%r265; cvt.u16.u8 %r269,%r268; setp.ne.u16 %r270,%r269,0; @ %r270 bra $L70; .loc 1 292 7 setp.ne.u32 %r272,%r35,16; @ %r272 bra $L71; $L70: .loc 1 299 9 shl.b64 %r276,%r444,3; add.u64 %r277,%r214,%r276; ld.u64 %r157,[%r277+40]; .loc 1 300 12 add.u64 %r283,%r216,%r276; ld.u64 %r38,[%r283+40]; .loc 1 302 3 setp.ne.u64 %r285,%r152,0; @ %r285 bra $L72; $L78: .loc 1 312 3 setp.lt.s64 %r286,%r152,%r153; @ %r286 bra $L73; bra $L139; $L71: .loc 1 297 5_gfortran_runtime_error72: add.u64 %r205,%r214,40; add.u64 %r70,%r216,40; add.u64 %r289,%r214,16; add.u64 %r291,%r22,%r22; add.u64 %r292,%r291,%r22; shl.b64 %r293,%r292,3; add.u64 %r188,%r289,%r293; .loc 1 302 3 mov.u64 %r92,0; add.u64 %r441,%frame,240; add.u64 %r439,%frame,360; .loc 1 309 12 mov.u64 %r455,%r92; $L77: .loc 1 304 18 add.u64 %r295,%r441,%r92; ld.u64 %r296,[%r205]; st.u64 [%r295],%r296; .loc 1 305 18 add.u64 %r297,%frame,%r92; .loc 1 305 20 ld.u64 %r299,[%r70]; mul.lo.u64 %r298,%r299,%r34; .loc 1 305 18 st.u64 [%r297],%r298; .loc 1 306 19 ld.u64 %r301,[%r205+16]; add.u64 %r300,%r301,1; ld.u64 %r302,[%r205+8]; sub.u64 %r49,%r300,%r302; .loc 1 308 10 setp.lt.s64 %r303,%r49,0; @ %r303 bra $L75; .loc 1 306 17 add.u64 %r305,%r439,%r92; st.u64 [%r305],%r49; bra $L76; $L75: .loc 1 309 12 add.u64 %r307,%r439,%r92; st.u64 [%r307],%r455; $L76: .loc 1 302 3 add.u64 %r205,%r205,24; add.u64 %r70,%r70,24; add.u64 %r92,%r92,8; 
setp.ne.u64 %r309,%r188,%r205; @ %r309 bra $L77; bra $L78; $L139: .loc 1 322 6 ld.u64 %r310,[%r213]; setp.eq.u64 %r311,%r310,0; @ ! %r311 bra $L80; bra $L79; $L73: add.u64 %r313,%r22,%r22; add.u64 %r314,%r313,%r22; shl.b64 %r315,%r314,3; add.u64 %r123,%r315,40; add.u64 %r128,%r214,%r123; add.u64 %r107,%r216,%r123; shl.b64 %r316,%r22,3; add.u64 %r78,%r316,-8; cvt.u32.u32 %r318,%r23; cvt.s64.s8 %r317,%r318; shl.b64 %r319,%r317,3; add.u64 %r180,%r319,-8; add.u64 %r441,%frame,240; add.u64 %r439,%frame,360; .loc 1 319 12 mov.u64 %r454,0; $L83: .loc 1 314 18 add.u64 %r321,%r441,%r78; ld.u64 %r322,[%r128]; st.u64 [%r321],%r322; .loc 1 315 18 add.u64 %r323,%frame,%r78; .loc 1 315 20 ld.u64 %r325,[%r107]; mul.lo.u64 %r324,%r325,%r34; .loc 1 315 18 st.u64 [%r323],%r324; .loc 1 316 19 ld.u64 %r327,[%r128+16]; add.u64 %r326,%r327,1; ld.u64 %r328,[%r128+8]; sub.u64 %r61,%r326,%r328; .loc 1 318 10 setp.lt.s64 %r329,%r61,0; @ %r329 bra $L81; .loc 1 316 17 add.u64 %r331,%r439,%r78; st.u64 [%r331],%r61; bra $L82; $L81: .loc 1 319 12 add.u64 %r333,%r439,%r78; st.u64 [%r333],%r454; $L82: .loc 1 312 3 add.u64 %r128,%r128,24; add.u64 %r107,%r107,24; add.u64 %r78,%r78,8; setp.ne.u64 %r335,%r78,%r180; @ %r335 bra $L83; bra $L140; $L79: .loc 1 326 7 setp.eq.u64 %r336,%r153,0; @ %r336 bra $L85; add.u64 %r439,%frame,360; $L112: add.u64 %r144,%r213,40; mov.u64 %r139,%r439; cvt.u32.u32 %r338,%r23; cvt.s64.s8 %r337,%r338; add.u64 %r340,%r337,%r337; add.u64 %r341,%r340,%r337; shl.b64 %r342,%r341,3; add.u64 %r343,%r213,16; add.u64 %r129,%r342,%r343; .loc 1 329 10 mov.u64 %r132,1; .loc 1 333 4 mov.u64 %r362,0; bra $L86; $L85: .loc 1 337 20 add.u32 %r344,%r23,-2; cvt.s64.s32 %r68,%r344; add.u64 %r346,%r68,%r68; add.u64 %r347,%r346,%r68; shl.b64 %r348,%r347,3; add.u64 %r349,%r213,%r348; .loc 1 337 67 shl.b64 %r351,%r68,3; add.u64 %r352,%frame,%r351; .loc 1 337 59 ld.u64 %r354,[%r349+40]; ld.u64 %r355,[%r352+360]; mul.lo.u64 %r161,%r354,%r355; .loc 1 339 24 mov.u64 %r356,0; st.u64 [%r213+8],%r356; .loc 1 340 28 cvt.u16.u32 %r359,%r23; add.u16 %r358,%r359,-1; cvt.u32.u16 %r360,%r358; st.u8 [%r213+28],%r360; .loc 1 342 10 setp.eq.u64 %r361,%r161,0; @ %r361 bra $L87; bra $L141; $L89: .loc 1 331 47 mul.lo.u64 %r132,%r65,%r132; $L86: .loc 1 333 4 st.u64 [%r144+8],%r362; ld.u64 %r65,[%r139]; add.u64 %r363,%r65,-1; st.u64 [%r144+16],%r363; st.u64 [%r144],%r132; .loc 1 326 7 add.u64 %r144,%r144,24; add.u64 %r139,%r139,8; setp.ne.u64 %r364,%r129,%r144; @ %r364 bra $L89; bra $L85; $L87: .loc 1 345 4 st.u64 [%r213+48],%r161; mov.u64 %r366,-1; st.u64 [%r213+56],%r366; mov.u64 %r367,1; st.u64 [%r213+40],%r367; .loc 1 346 4 bra $L65; $L141: .loc 1 349 24 mov.u64 %r369369; call (%value_in),_gfortrani_xmallocarray70,[%value_in]; } .loc 1 349 22 st.u64 [%r213],%r370; bra $L90; $L80: .loc 1 354 19 ld.s8 %r372,[%r213+28]; .loc 1 354 10 setp.eq.u64 %r373,%r372,%r153; @ %r373 bra $L91; .loc 1 355 2 cvta.const.u64 %r374374_gfortran_runtime_error91: .loc 1 357 11 cvta.global.u64 %r376,_gfortrani_compile_options; .loc 1 357 10 ld.u32 %r377,[%r376+36]; setp.eq.u32 %r378,%r377,0; @ %r378 bra $L90; .loc 1 359 4 add.u64 %r439,%frame,360; cvta.const.u64 %r382,$LC2382; call _gfortrani_bounds_ifunction_return361 4 cvta.const.u64 %r386,$LC386382; call _gfortrani_bounds_equal_extents$L90: .loc 1 366 3 setp.ne.u64 %r388,%r153,0; @ %r388 bra $L92; $L95: .loc 1 374 8 ld.u64 %r173,[%r213]; .loc 1 375 8 ld.u64 %r171,[%r214]; .loc 1 377 9 setp.ne.u64 %r389,%r171,0; @ %r389 bra $L93; bra $L65; $L92: add.u64 %r193,%frame,480; add.u64 %r187,%r213,40; add.u64 
%r181,%frame,120; cvt.u32.u32 %r391,%r23; cvt.s64.s8 %r390,%r391; add.u64 %r145,%r390,-1; .loc 1 366 3 mov.u64 %r174,0; add.u64 %r439,%frame,360; .loc 1 368 16 mov.u64 %r392,%r174; $L94: st.u64 [%r193],%r392; .loc 1 369 18 ld.u64 %r393,[%r187]; st.u64 [%r181],%r393; .loc 1 370 17 shl.b64 %r395,%r174,3; add.u64 %r396,%r439,%r395; .loc 1 370 10 ld.u64 %r397,[%r396]; setp.le.s64 %r398,%r397,0; @ %r398 bra $L65; .loc 1 366 26 add.u64 %r174,%r174,1; .loc 1 366 3 add.u64 %r193,%r193,8; add.u64 %r187,%r187,24; add.u64 %r181,%r181,8; setp.ne.u64 %r399,%r145,%r174; @ %r399 bra $L94; bra $L95; $L93: .loc 1 300 12 mul.lo.u64 %r40,%r38,%r34; .loc 1 396 32 shl.b64 %r82,%r157,2; ld.u64 %r185,[%frame+480]; .loc 1 440 22 ld.u64 %r93,[%frame+240]; .loc 1 440 12 shl.b64 %r95,%r93,2; .loc 1 441 23 ld.u64 %r96,[%frame]; .loc 1 442 22 ld.u64 %r98,[%frame+120]; .loc 1 442 12 shl.b64 %r100,%r98,3; setp.le.s64 %r445,%r153,1; .loc 1 418 9 setp.eq.u32 %r447,%r217,0; cvt.u32.u32 %r448,%r23; cvt.s64.s8 %r449,%r448; add.u64 %r450,%r449,-1; add.u64 %r451,%frame,240; add.u64 %r452,%frame,360; add.u64 %r453,%frame,120; $L113: .loc 1 366 3 mov.u64 %r166,%r172; mov.u64 %r165,%r171; .loc 1 393 16 mov.u64 %r142,0; .loc 1 396 9 mov.u64 %r138,%r142; $L101: .loc 1 399 7 ld.s8 %r210,[%r166]; mov.u64 %r182,%r138; .loc 1 403 17 add.u64 %r138,%r138,1; .loc 1 399 6 setp.eq.u32 %r440,%r210,0; @ %r440 bra $L96; .loc 1 402 10 setp.eq.u64 %r401,%r142,0; .loc 1 403 17 selp.u64 %r142,%r138,%r142,%r401; .loc 1 404 11 ld.f32 %r140,[%r165]; .loc 1 404 10 setp.le.f32 %r402,%r140,0f7f800000; @ ! %r402 bra $L96; .loc 1 414 9 setp.le.s64 %r403,%r154,%r182; @ ! %r403 bra $L142; bra $L99; $L96: .loc 1 396 32 add.u64 %r165,%r165,%r82; .loc 1 396 47 add.u64 %r166,%r166,%r40; .loc 1 396 2 setp.ne.u64 %r404,%r154,%r138; @ %r404 bra $L101; bra $L99; $L142: .loc 1 418 9 @ %r447 bra $L102; $L105: .loc 1 424 16 add.u64 %r182,%r182,1; .loc 1 421 8 @ %r440 bra $L103; .loc 1 421 18 ld.f32 %r85,[%r165]; .loc 1 421 15 setp.le.f32 %r407,%r85,%r140; .loc 1 423 16 selp.f32 %r140,%r85,%r140,%r407; selp.u64 %r138,%r182,%r138,%r407; $L103: .loc 1 419 33 add.u64 %r165,%r165,%r82; .loc 1 419 48 add.u64 %r166,%r166,%r40; .loc 1 419 8 setp.eq.u64 %r408,%r154,%r182; @ %r408 bra $L115; .loc 1 421 9 ld.s8 %r210,[%r166]; setp.eq.u32 %r440,%r210,0; bra $L105; $L102: .loc 1 433 11 add.u64 %r182,%r182,1; .loc 1 430 10 @ %r440 bra $L106; .loc 1 430 20 ld.f32 %r90,[%r165]; .loc 1 430 17 setp.lt.f32 %r410,%r90,%r140; .loc 1 432 18 selp.f32 %r140,%r90,%r140,%r410; selp.u64 %r138,%r182,%r138,%r410; $L106: .loc 1 428 35 add.u64 %r165,%r165,%r82; .loc 1 428 50 add.u64 %r166,%r166,%r40; .loc 1 428 10 setp.eq.u64 %r411,%r154,%r182; @ %r411 bra $L116; .loc 1 430 11 ld.s8 %r210,[%r166]; setp.eq.u32 %r440,%r210,0; bra $L102; $L115: mov.u64 %r142,%r138; bra $L99; $L116: mov.u64 %r142,%r138; $L99: .loc 1 436 8 st.u64 [%r173],%r142; .loc 1 439 15 add.u64 %r185,%r185,1; .loc 1 440 12 add.u64 %r171,%r171,%r95; .loc 1 441 13 add.u64 %r172,%r172,%r96; .loc 1 442 12 add.u64 %r173,%r173,%r100; .loc 1 444 32 ld.u64 %r164,[%frame+360]; .loc 1 444 13 setp.ne.u64 %r412,%r185,%r164; @ %r412 bra $L113; .loc 1 451 23 mul.lo.u64 %r102,%r93,%r185; .loc 1 452 24 mul.lo.u64 %r413,%r96,%r185; .loc 1 452 10 sub.u64 %r168,%r172,%r413; .loc 1 453 23 mul.lo.u64 %r414,%r98,%r185; .loc 1 453 9 shl.b64 %r415,%r414,3; sub.u64 %r169,%r173,%r415; .loc 1 455 7 @ %r445 bra $L65; add.u64 %r203,%frame,488; mov.u64 %r202,8; .loc 1 454 5 mov.u64 %r170,1; .loc 1 448 13 mov.u64 %r446,0; bra $L110; $L111: st.u64 [%r203],%r446; 
.loc 1 451 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 452 24 mul.lo.u64 %r421,%r116,%r113; .loc 1 452 10 sub.u64 %r168,%r172,%r421; .loc 1 453 23 mul.lo.u64 %r422,%r118,%r113; .loc 1 453 9 shl.b64 %r423,%r422,3; sub.u64 %r169,%r173,%r423; .loc 1 454 5 add.u64 %r170,%r170,1; .loc 1 455 7 add.u64 %r203,%r203,8; add.u64 %r202,%r202,8; setp.eq.u64 %r424,%r170,%r450; @ %r424 bra $L65; $L110: .loc 1 463 16 ld.u64 %r425,[%r203]; add.u64 %r113,%r425,1; st.u64 [%r203],%r113; .loc 1 464 23 add.u64 %r427,%r451,%r202; ld.u64 %r114,[%r427]; .loc 1 464 13 sub.u64 %r428,%r114,%r102; shl.b64 %r429,%r428,2; add.u64 %r171,%r171,%r429; .loc 1 465 24 add.u64 %r430,%frame,%r202; ld.u64 %r116,[%r430]; .loc 1 465 14 add.u64 %r172,%r168,%r116; .loc 1 466 23 add.u64 %r432,%r453,%r202; ld.u64 %r118,[%r432]; .loc 1 466 13 shl.b64 %r433,%r118,3; add.u64 %r173,%r169,%r433; .loc 1 444 32 add.u64 %r435,%r452,%r202; ld.u64 %r122,[%r435]; .loc 1 444 13 setp.eq.u64 %r436,%r113,%r122; @ %r436 bra $L111; .loc 1 448 13 mov.u64 %r185,0; bra $L113; $L140: .loc 1 322 6 ld.u64 %r437,[%r213]; setp.eq.u64 %r438,%r437,0; @ ! %r438 bra $L80; bra $L112; $L65_gfortran_sminloc1_8_r4 .visible .func _gfortran_sminloc1_8_r, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L144; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L145; $L144:r152154; call _gfortran_minloc1_8_r4500 7 bra $L143; $L145: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r296,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L147; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L148; $L154: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L149; bra $L202; $L147: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error148: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r302,0; $L153: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L151; .loc 1 515 17 st.u64 [%r139],%r36; bra $L152; $L151: .loc 1 518 12 st.u64 [%r139],%r302; $L152: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L153; bra $L154; $L202: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L156; bra $L155; $L149: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r301,0; $L159: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L157; .loc 1 523 17 st.u64 [%r43],%r41; bra $L158; $L157: .loc 1 527 12 st.u64 [%r43],%r301; $L158: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L159; bra $L203; $L155: .loc 1 534 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L161; add.u64 %r295,%frame,120; $L181: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L162; $L161: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L163; bra $L204; $L165: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L162: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L165; bra $L161; $L163: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L143; $L204: .loc 1 557 24 mov.u64 %r24696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L169: .loc 1 583 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L166; bra $L167; $L156: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L168; .loc 1 562 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error168: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L169; .loc 1 569 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L167; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L171: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L170; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error170: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L171; $L166: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ 
%r268 bra $L172; shl.b64 %r266,%r296,3; bra $L173; $L172: mov.u64 %r266,8; $L173: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L174: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L174; $L167: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,3; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 593 13 mov.u64 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L180: st.u64 [%r98],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L180; .loc 1 606 7 @ ! %r294 bra $L205; bra $L143; $L179: .loc 1 601 13 st.u64 [%r99],%r280; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L178; bra $L143; $L205: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; $L178: .loc 1 610 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,3; add.u64 %r98,%r98,%r291; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L179; .loc 1 601 13 mov.u64 %r80,0; bra $L180; $L203: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r293,%r98,0; @ ! %r293 bra $L156; bra $L181; $L143: .loc 1 615 1 ret; } minloc1_16_r4.o/_gfortran_minloc1_16_r4 .visible .func _gfortran_minloc1_16_rfortran/generated/minloc1_16_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_16_r4 .visible .func _gfortran_mminloc1_16_r_gfortran_sminloc1_16_r4 .visible .func _gfortran_sminloc1_16_rVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 
},105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_minloc1_16_r4 .visible .func _gfortran_minloc1_16_r4480u64 %r116; .reg .f32u16 %r213; .reg .u16 %r214; .reg .u16 %r215; .reg .u32 %r216; .reg .u16 %r217; .reg .predpredu64 %r265; .reg .u32predu64 %r283; .reg .u64pred64u32 %r333; .reg .u64 %r334; .reg .u64pred %r340; .reg .pred %r341; .reg .u64 %r343; .reg .pred %r345; .reg .pred %r348; .reg .predpredu32 %r438; .reg .pred %r445; .reg .pred %r448; .reg .predu64 %r460; .reg .predpred %r470; .reg .u64 %r471; .reg .predu64 %r507; .reg .pred %r511; .reg .pred %r512; .reg .u64 %r513; .reg .u64 %r514; .reg .u64 %r515; .reg .pred %r516; .reg .u64 %r517; .reg .u32 %r518; .reg .u64 %r519; .reg .u64 %r520; .reg .u64 %r521; .reg .u64 %r522; .reg .u64 %r523; .reg .u64 %r524; mov.u64 %r202,%ar0; mov.u64 %r203,%ar1; mov.u64 %r204,%ar2; mov.u32 %r205,%ar3; .loc 1 58 10 ld.s8 %r22,[%r203+28]; .loc 1 58 38 add.u32 %r206,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r130,%r206; .loc 1 59 10 ld.u64 %r25,[%r204]; .loc 1 59 7 add.u64 %r131,%r25,-1; .loc 1 61 7 shr.u64 %r208,%r131,63; set.u32.lt.s64 %r210,%r130,%r131; neg.s32 %r211,%r210; cvt.u16.u64 %r214,%r208; cvt.u16.u32 %r215,%r211; or.b16 %r213,%r214,%r215; .loc 1 61 6 cvt.u32.u16 %r216,%r213; cvt.u16.u8 %r217,%r216; setp.eq.u16 %r218,%r217,0; @ %r218 bra $L2; .loc 1 63 7 cvt.u32.u32 %r221,%r22; cvt.s64.s8 %r220,%r221; st.u64 [%stack+8],%r22011_gfortran_runtime_errorr224,%r131,%r131; add.u64 %r225,%r224,%r131; shl.b64 %r226,%r225,3; add.u64 %r227,%r203,%r226; ld.u64 %r30,[%r227+56]; ld.u64 %r32,[%r227+48]; .loc 1 71 9 ld.u64 %r134,[%r227+40]; .loc 1 73 3 setp.ne.u64 %r241,%r131,0; @ %r241 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r242,%r130,%r131; @ %r242 bra $L4; bra $L63; $L3: add.u64 %r76,%r203,40; add.u64 %r38,%frame,120; add.u64 %r164,%frame,240; add.u64 %r244,%r25,%r25; add.u64 %r245,%r244,%r25; shl.b64 %r246,%r245,3; add.u64 %r247,%r203,16; add.u64 %r43,%r246,%r247; .loc 1 79 12 mov.u64 %r524,0; $L8: .loc 1 75 18 ld.u64 %r248,[%r76]; st.u64 [%r38],%r248; .loc 1 76 19 ld.u64 %r250,[%r76+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r76+8]; sub.u64 %r37,%r249,%r251; .loc 1 78 10 setp.lt.s64 %r252,%r37,0; @ %r252 bra $L6; .loc 1 76 17 st.u64 [%r164],%r37; bra $L7; $L6: .loc 1 79 12 st.u64 [%r164],%r524; $L7: .loc 1 73 3 add.u64 %r76,%r76,24; add.u64 %r38,%r38,8; add.u64 %r164,%r164,8; setp.ne.u64 %r254,%r43,%r76; @ %r254 bra $L8; bra $L9; $L63: .loc 1 90 6 ld.u64 %r255,[%r202]; setp.eq.u64 %r256,%r255,0; @ ! 
%r256 bra $L11; bra $L10; $L4: add.u64 %r258,%r25,%r25; add.u64 %r259,%r258,%r25; shl.b64 %r260,%r259,3; add.u64 %r261,%r260,40; add.u64 %r72,%r203,%r261; shl.b64 %r262,%r25,3; add.u64 %r197,%r262,-8; add.u64 %r514,%frame,120; add.u64 %r128,%r514,%r197; add.u64 %r513,%frame,240; add.u64 %r195,%r513,%r197; cvt.u32.u32 %r266,%r22; cvt.s64.s8 %r265,%r266; add.u64 %r268,%r265,%r265; add.u64 %r269,%r268,%r265; shl.b64 %r270,%r269,3; add.u64 %r271,%r203,40; add.u64 %r64,%r270,%r271; .loc 1 87 12 mov.u64 %r523,0; $L14: .loc 1 83 18 ld.u64 %r272,[%r72]; st.u64 [%r128],%r272; .loc 1 84 19 ld.u64 %r274,[%r72+16]; add.u64 %r273,%r274,1; ld.u64 %r275,[%r72+8]; sub.u64 %r45,%r273,%r275; .loc 1 86 10 setp.lt.s64 %r276,%r45,0; @ %r276 bra $L12; .loc 1 84 17 st.u64 [%r195],%r45; bra $L13; $L12: .loc 1 87 12 st.u64 [%r195],%r523; $L13: .loc 1 81 3 add.u64 %r72,%r72,24; add.u64 %r128,%r128,8; add.u64 %r195,%r195,8; setp.ne.u64 %r278,%r64,%r72; @ %r278 bra $L14; bra $L64; $L10: .loc 1 94 7 setp.le.s64 %r279,%r130,0; @ %r279 bra $L16; add.u64 %r513,%frame,240; $L46: add.u64 %r113,%r202,40; mov.u64 %r110,%r513; cvt.u32.u32 %r281,%r22; cvt.s64.s8 %r280,%r281; add.u64 %r283,%r280,%r280; add.u64 %r284,%r283,%r280; shl.b64 %r285,%r284,3; add.u64 %r286,%r202,16; add.u64 %r86,%r285,%r286; .loc 1 97 10 mov.u64 %r111,1; .loc 1 101 4 mov.u64 %r309,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r287,0; st.u64 [%r202+8],%r287; .loc 1 106 28 cvt.u16.u32 %r290,%r22; add.u16 %r289,%r290,-1; cvt.u32.u16 %r291,%r289; st.u8 [%r202+28],%r291; .loc 1 108 20 add.u32 %r292,%r22,-2; cvt.s64.s32 %r53,%r292; add.u64 %r294,%r53,%r53; add.u64 %r295,%r294,%r53; shl.b64 %r296,%r295,3; add.u64 %r297,%r202,%r296; .loc 1 108 67 shl.b64 %r299,%r53,3; add.u64 %r300,%frame,%r299; .loc 1 108 59 ld.u64 %r302,[%r297+40]; ld.u64 %r303,[%r300+240]; mul.lo.u64 %r136,%r302,%r303; .loc 1 110 29 mov.u64 %r305,16305; call (%value_in),_gfortrani_xmallocarray06,[%value_in]; } .loc 1 110 27 st.u64 [%r202],%r306; .loc 1 111 10 setp.eq.u64 %r308,%r136,0; @ ! 
%r308 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r111,%r49,%r111; $L17: .loc 1 101 4 st.u64 [%r113+8],%r309; ld.u64 %r49,[%r110]; add.u64 %r310,%r49,-1; st.u64 [%r113+16],%r310; st.u64 [%r113],%r111; .loc 1 94 7 add.u64 %r113,%r113,24; add.u64 %r110,%r110,8; setp.ne.u64 %r311,%r86,%r113; @ %r311 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r202+48],%r136; mov.u64 %r313,-1; st.u64 [%r202+56],%r313; mov.u64 %r314,1; st.u64 [%r202+40],%r314; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r60,[%r202+28]; .loc 1 121 10 setp.eq.u64 %r315,%r60,%r130; @ %r315 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r130; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r318,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r319,[%r318+36]; setp.eq.u32 %r320,%r319,0; @ %r320 bra $L21; .loc 1 128 2 add.u64 %r513,%frame,240; cvta.const.u64 %r324,$LC2r513324; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r326,%r130,0; @ %r326 bra $L24; $L27: .loc 1 68 9 add.u64 %r327,%r30,1; .loc 1 68 7 sub.u64 %r132,%r327,%r32; max.s64 %r68,%r132,0; .loc 1 140 8 ld.u64 %r147,[%r203]; .loc 1 141 8 ld.u64 %r148,[%r202]; .loc 1 168 40 shl.b64 %r70,%r134,2; ld.u64 %r161,[%frame+360]; .loc 1 204 22 ld.u64 %r81,[%frame+120]; .loc 1 204 12 shl.b64 %r82,%r81,2; .loc 1 205 22 ld.u64 %r84,[%frame]; .loc 1 205 12 shl.b64 %r85,%r84,4; .loc 1 207 32 ld.u64 %r122,[%frame+240]; .loc 1 214 23 mul.lo.u64 %r328,%r81,%r122; .loc 1 214 9 shl.b64 %r329,%r328,2; neg.s64 %r166,%r329; .loc 1 215 23 mul.lo.u64 %r330,%r84,%r122; .loc 1 215 9 shl.b64 %r331,%r330,4; neg.s64 %r71,%r331; setp.gt.s64 %r511,%r132,0; setp.le.s64 %r512,%r130,1; .loc 1 180 9 setp.ne.u32 %r516,%r205,0; add.u64 %r517,%r68,-1; cvt.u32.u32 %r518,%r22; cvt.s64.s8 %r519,%r518; add.u64 %r520,%r519,-1; add.u64 %r521,%frame,120; add.u64 %r522,%frame,240; bra $L25; $L24: add.u64 %r158,%frame,360; add.u64 %r157,%r202,40; mov.u64 %r153,%frame; cvt.u32.u32 %r333,%r22; cvt.s64.s8 %r332,%r333; add.u64 %r116,%r332,-1; .loc 1 132 3 mov.u64 %r149,0; add.u64 %r513,%frame,240; .loc 1 134 16 mov.u64 %r334,%r149; $L26: st.u64 [%r158],%r334; .loc 1 135 18 ld.u64 %r335,[%r157]; st.u64 [%r153],%r335; .loc 1 136 17 shl.b64 %r337,%r149,3; add.u64 %r338,%r513,%r337; .loc 1 136 10 ld.u64 %r339,[%r338]; setp.le.s64 %r340,%r339,0; @ %r340 bra $L1; .loc 1 132 26 add.u64 %r149,%r149,1; .loc 1 132 3 add.u64 %r158,%r158,8; add.u64 %r157,%r157,24; add.u64 %r153,%r153,8; setp.ne.u64 %r341,%r116,%r149; @ %r341 bra $L26; bra $L27; $L25: .loc 1 158 5 @ %r511 bra $L47; .loc 1 159 10 mov.u64 %r343,0; st.u64 [%r148],%r343; st.u64 [%r148+8],%r343; bra $L29; $L47: mov.u64 %r138,%r147; .loc 1 168 17 mov.u64 %r156,0; $L28: .loc 1 170 7 ld.f32 %r119,[%r138]; mov.u64 %r199,%r156; .loc 1 168 32 add.u64 %r156,%r156,1; .loc 1 170 6 setp.le.f32 %r345,%r119,0f7f800000; @ ! 
%r345 bra $L65; .loc 1 173 14 mov.u64 %r473,%r156; shr.s64 %r474,%r473,63; .loc 1 180 9 @ %r516 bra $L32; .loc 1 190 8 setp.gt.s64 %r348,%r68,%r199; @ %r348 bra $L33; bra $L34; $L32: .loc 1 181 8 setp.le.s64 %r349,%r68,%r199; @ %r349 bra $L34; add.u64 %r481,%r199,1; shr.s64 %r482,%r481,63; sub.u64 %r353,%r517,%r199; shr.s64 %r356,%r199,63; add.u64 %r359,%r199,2; set.u32.lt.u64 %r364,%r359,%r199; cvt.s64.s32 %r362,%r364; sub.u64 %r368,%r356,%r362; add.u64 %r370,%r353,%r359; set.u32.lt.u64 %r375,%r370,%r353; cvt.s64.s32 %r373,%r375; sub.u64 %r379,%r368,%r373; .loc 1 170 7 mov.f32 %r194,%r119; .loc 1 181 8 mov.u64 %r493,1; bra $L35; $L65: .loc 1 168 40 add.u64 %r138,%r138,%r70; .loc 1 168 10 setp.gt.s64 %r380,%r68,%r156; @ %r380 bra $L28; .loc 1 157 9 mov.u64 %r473,1; mov.u64 %r474,0; bra $L34; $L35: .loc 1 183 8 setp.ge.f32 %r381,%r119,%r194; .loc 1 186 16 selp.u64 %r473,%r481,%r473,%r381; selp.u64 %r474,%r482,%r474,%r381; selp.f32 %r119,%r194,%r119,%r381; .loc 1 181 33 add.u64 %r138,%r138,%r70; .loc 1 181 8 add.u64 %r384,%r481,%r493; set.u32.lt.u64 %r389,%r384,%r481; cvt.s64.s32 %r387,%r389; mov.u64 %r481,%r384; sub.u64 %r482,%r482,%r387; setp.ne.u64 %r396,%r370,%r481; @ %r396 bra $L50; setp.ne.u64 %r399,%r379,%r482; @ ! %r399 bra $L34; $L50: .loc 1 183 9 ld.f32 %r194,[%r138]; bra $L35; $L33: add.u64 %r477,%r199,1; shr.s64 %r478,%r477,63; sub.u64 %r403,%r517,%r199; shr.s64 %r406,%r199,63; add.u64 %r409,%r199,2; set.u32.lt.u64 %r414,%r409,%r199; cvt.s64.s32 %r412,%r414; sub.u64 %r418,%r406,%r412; add.u64 %r420,%r403,%r409; set.u32.lt.u64 %r425,%r420,%r403; cvt.s64.s32 %r423,%r425; sub.u64 %r429,%r418,%r423; .loc 1 190 8 mov.u64 %r507,1; $L60: .loc 1 192 9 ld.f32 %r75,[%r138]; .loc 1 192 8 setp.lt.f32 %r430,%r75,%r119; .loc 1 195 16 selp.u64 %r473,%r477,%r473,%r430; selp.u64 %r474,%r478,%r474,%r430; selp.f32 %r119,%r75,%r119,%r430; .loc 1 190 33 add.u64 %r138,%r138,%r70; .loc 1 190 8 add.u64 %r433,%r477,%r507; set.u32.lt.u64 %r438,%r433,%r477; cvt.s64.s32 %r436,%r438; mov.u64 %r477,%r433; sub.u64 %r478,%r478,%r436; setp.ne.u64 %r445,%r420,%r477; @ %r445 bra $L60; setp.ne.u64 %r448,%r429,%r478; @ %r448 bra $L60; $L34: .loc 1 199 12 st.u64 [%r148],%r473; st.u64 [%r148+8],%r474; $L29: .loc 1 203 15 add.u64 %r161,%r161,1; .loc 1 204 12 add.u64 %r147,%r147,%r82; .loc 1 205 12 add.u64 %r148,%r148,%r85; .loc 1 207 13 setp.ne.u64 %r451,%r161,%r122; @ %r451 bra $L25; .loc 1 214 9 add.u64 %r144,%r147,%r166; .loc 1 215 9 add.u64 %r145,%r148,%r71; .loc 1 217 7 @ %r512 bra $L1; add.u64 %r129,%frame,368; mov.u64 %r190,8; .loc 1 216 5 mov.u64 %r146,1; .loc 1 211 13 mov.u64 %r515,0; bra $L44; $L45: st.u64 [%r129],%r515; .loc 1 214 23 mul.lo.u64 %r457,%r97,%r96; .loc 1 214 9 shl.b64 %r458,%r457,2; sub.u64 %r144,%r147,%r458; .loc 1 215 23 mul.lo.u64 %r459,%r100,%r96; .loc 1 215 9 shl.b64 %r460,%r459,4; sub.u64 %r145,%r148,%r460; .loc 1 216 5 add.u64 %r146,%r146,1; .loc 1 217 7 add.u64 %r129,%r129,8; add.u64 %r190,%r190,8; setp.eq.u64 %r461,%r146,%r520; @ %r461 bra $L1; $L44: .loc 1 225 16 ld.u64 %r462,[%r129]; add.u64 %r96,%r462,1; st.u64 [%r129],%r96; .loc 1 226 23 add.u64 %r464,%r521,%r190; ld.u64 %r97,[%r464]; .loc 1 226 13 shl.b64 %r465,%r97,2; add.u64 %r147,%r144,%r465; .loc 1 227 23 add.u64 %r466,%frame,%r190; ld.u64 %r100,[%r466]; .loc 1 227 13 shl.b64 %r467,%r100,4; add.u64 %r148,%r145,%r467; .loc 1 207 32 add.u64 %r469,%r522,%r190; ld.u64 %r103,[%r469]; .loc 1 207 13 setp.eq.u64 %r470,%r96,%r103; @ %r470 bra $L45; .loc 1 211 13 mov.u64 %r161,0; bra $L25; $L64: .loc 1 90 6 ld.u64 %r471,[%r202]; 
setp.eq.u64 %r472,%r471,0; @ ! %r472 bra $L11; bra $L46; $L1:_gfortran_mminloc1_16_r4 .visible .func _gfortran_mminloc1_16_r, .param .u32 %in_ar4) {608pred %r239; .reg .u32 %r244; .reg .u64 %r246; .reg .u32 %r248; .reg .u32 %r249; .reg .u16 %r251; .reg .u16 %r252; .reg .u16 %r253; .reg .u32u64 %r268; .reg .u64 %r275; .reg .pred %r276; .reg .u32 %r277; .reg .u32 %r278; .reg .u32 %r280; .reg .u32pred %r291; .reg .pred %r293; .reg .u64predu64 %r312; .reg .u64u64 %r317; .reg .u64 %r318; .reg .u64u64pred %r356; .reg .pred %r357; .reg .u64 %r358; .reg .u32pred %r410; .reg .u64pred %r419; .reg .predpred %r427; .reg .predu64 %r454; .reg .u64 %r457; .reg .u32 %r459; .reg .u64 %r463; .reg .u64 %r465; .reg .u64pred %r541; .reg .predu64u64 %r565; .reg .u64 %r567; .reg .pred %r568; .reg .u64 %r569; .reg .pred %r570; .reg .u64 %r571; .reg .u64 %r572; .reg .u64 %r573; .reg .u64.reg .u64 %r633; .reg .u64 %r634; mov.u64 %r234,%ar0; mov.u64 %r235,%ar1; mov.u64 %r236,%ar2; mov.u64 %r237,%ar3; mov.u32 %r238,%ar4; .loc 1 261 6 setp.ne.u64 %r239,%r237,0; @ %r239 bra $L67; .loc 1 264 7r2236238; call _gfortran_minloc1_16_r4268 7 bra $L66; $L67: .loc 1 271 10 ld.u64 %r22,[%r236]; .loc 1 271 7 add.u64 %r152,%r22,-1; .loc 1 272 10 ld.s8 %r23,[%r235+28]; .loc 1 272 38 add.u32 %r244,%r23,-1; .loc 1 272 8 cvt.s64.s32 %r153,%r244; .loc 1 275 7 shr.u64 %r246,%r152,63; set.u32.gt.s64 %r248,%r152,%r153; neg.s32 %r249,%r248; cvt.u16.u64 %r252,%r246; cvt.u16.u32 %r253,%r249; or.b16 %r251,%r252,%r253; .loc 1 275 6 cvt.u32.u16 %r254,%r251; cvt.u16.u8 %r255,%r254; setp.eq.u16 %r256,%r255,0; @ %r256 bra $L69; .loc 1 277 7 cvt.u32.u32 %r259,%r23; cvt.s64.s8 %r258,%r259; st.u64 [%stack+8],%r258; st.u64 [%stack],%r22; cvta.const.u64 %r25_gfortran_runtime_error69: .loc 1 282 9 add.u64 %r621,%r152,%r152; add.u64 %r617,%r621,%r152; shl.b64 %r264,%r617,3; add.u64 %r265,%r235,%r264; ld.u64 %r268,[%r265+56]; add.u64 %r267,%r268,1; .loc 1 282 7 ld.u64 %r275,[%r265+48]; sub.u64 %r154,%r267,%r275; .loc 1 283 6 setp.le.s64 %r276,%r154,0; @ %r276 bra $L66; .loc 1 286 9 ld.u64 %r174,[%r237]; .loc 1 288 15 ld.u64 %r33,[%r237+16]; .loc 1 290 22 cvt.u32.u64 %r34,%r33; .loc 1 290 53 add.u32 %r277,%r34,-4; and.b32 %r278,%r277,-5; set.u32.eq.u32 %r280,%r278,0; neg.s32 %r281,%r280; .loc 1 290 22 add.u32 %r282,%r34,-1; set.u32.le.u32 %r284,%r282,1; neg.s32 %r285,%r284; .loc 1 290 6 cvt.u16.u32 %r287,%r281; cvt.u16.u32 %r288,%r285; or.b16 %r286,%r287,%r288; cvt.u32.u16 %r289,%r286; cvt.u16.u8 %r290,%r289; setp.ne.u16 %r291,%r290,0; @ %r291 bra $L71; .loc 1 292 7 setp.ne.u32 %r293,%r34,16; @ %r293 bra $L72; $L71: .loc 1 299 9 shl.b64 %r297,%r617,3; add.u64 %r298,%r235,%r297; ld.u64 %r157,[%r298+40]; .loc 1 300 12 add.u64 %r304,%r237,%r297; ld.u64 %r40,[%r304+40]; .loc 1 302 3 setp.ne.u64 %r306,%r152,0; @ %r306 bra $L73; $L79: .loc 1 312 3 setp.lt.s64 %r307,%r152,%r153; @ %r307 bra $L74; bra $L141; $L72: .loc 1 297 5 cvta.const.u64 %r30830_gfortran_runtime_error73: add.u64 %r91,%r235,40; add.u64 %r196,%r237,40; add.u64 %r311,%r22,%r22; add.u64 %r312,%r311,%r22; shl.b64 %r313,%r312,3; add.u64 %r314,%r235,16; add.u64 %r226,%r313,%r314; .loc 1 302 3 mov.u64 %r107,0; add.u64 %r619,%frame,240; add.u64 %r623,%frame,360; .loc 1 309 12 mov.u64 %r634,%r107; $L78: .loc 1 304 18 add.u64 %r316,%r619,%r107; ld.u64 %r317,[%r91]; st.u64 [%r316],%r317; .loc 1 305 18 add.u64 %r318,%frame,%r107; .loc 1 305 20 ld.u64 %r320,[%r196]; mul.lo.u64 %r319,%r320,%r33; .loc 1 305 18 st.u64 [%r318],%r319; .loc 1 306 19 ld.u64 %r322,[%r91+16]; add.u64 %r321,%r322,1; ld.u64 
[garbled nvptx PTX payload: remainder of archive member minloc1_16_r4.o — the tail of the masked kernel _gfortran_mminloc1_16_r4 and the scalar-mask wrapper _gfortran_sminloc1_16_r4 (MINLOC of a REAL(4) array along one dimension, INTEGER(16) result). The bodies are machine-generated loop, descriptor-setup and bounds-checking code; the wrapper is visible forwarding to _gfortran_minloc1_16_r4 when its scalar MASK argument is true and filling the result with zeros otherwise.]
[garbled nvptx PTX payload: archive member minloc1_4_r8.o, compiled from ../../../libgfortran/generated/minloc1_4_r8.c — _gfortran_minloc1_4_r8, _gfortran_mminloc1_4_r8 and _gfortran_sminloc1_4_r8 (MINLOC of a REAL(8) array along one dimension, INTEGER(4) result), together with extern declarations for _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents, and the runtime error strings stored as byte-array constants, which decode to: "rank of return array incorrect in MINLOC intrinsic: is %ld, should be %ld", "MINLOC", "... incorrect in MINLOC intrinsic: is %ld, should be between 1 and %ld", "Funny sized logical array", "MASK argument", "Incorrect extent in return value of MINLOC intrinsic in dimension %ld: is %ld, should be %ld", "return value".]
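For orientation, every kernel in these members implements the same core reduction: find the 1-based position of the minimum of a REAL array along one dimension, writing 0 when the extent is zero-sized. The sketch below is a minimal, hypothetical 1-D contiguous version of that reduction in C (the source language of the generated libgfortran code); it deliberately omits the strided array descriptors, NaN handling and bounds checks that the real _gfortran_minloc1_* routines carry, and the name minloc1_sketch is illustrative only, not a libgfortran symbol.

/* minloc1_sketch: illustrative only, not the libgfortran implementation.
 * A 1-D, contiguous, unmasked form of the reduction performed by the
 * _gfortran_minloc1_*_r* kernels above. */
long minloc1_sketch(const double *base, long n)
{
    long pos = 0;              /* MINLOC yields 0 for a zero-sized extent */
    double minval = 0.0;
    for (long i = 0; i < n; i++) {
        if (pos == 0 || base[i] < minval) {
            minval = base[i];
            pos = i + 1;       /* Fortran positions are 1-based */
        }
    }
    return pos;                /* first occurrence of the minimum */
}

Called on {3.0, -1.5, 2.0, -1.5} with n = 4 this returns 2, the first occurrence of the minimum, which is the behaviour MINLOC requires.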
[garbled nvptx PTX payload: archive member minloc1_8_r8.o, compiled from ../../../libgfortran/generated/minloc1_8_r8.c — _gfortran_minloc1_8_r8 and _gfortran_mminloc1_8_r8 (MINLOC of a REAL(8) array along one dimension, INTEGER(8) result), with the same extern declarations and decoded error strings as the previous member; the dump breaks off inside the body of _gfortran_mminloc1_8_r8.]
[%r203],%r446; .loc 1 451 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 452 24 mul.lo.u64 %r421,%r116,%r113; .loc 1 452 10 sub.u64 %r168,%r172,%r421; .loc 1 453 23 mul.lo.u64 %r422,%r118,%r113; .loc 1 453 9 shl.b64 %r423,%r422,3; sub.u64 %r169,%r173,%r423; .loc 1 454 5 add.u64 %r170,%r170,1; .loc 1 455 7 add.u64 %r203,%r203,8; add.u64 %r202,%r202,8; setp.eq.u64 %r424,%r170,%r450; @ %r424 bra $L65; $L110: .loc 1 463 16 ld.u64 %r425,[%r203]; add.u64 %r113,%r425,1; st.u64 [%r203],%r113; .loc 1 464 23 add.u64 %r427,%r451,%r202; ld.u64 %r114,[%r427]; .loc 1 464 13 sub.u64 %r428,%r114,%r102; shl.b64 %r429,%r428,3; add.u64 %r171,%r171,%r429; .loc 1 465 24 add.u64 %r430,%frame,%r202; ld.u64 %r116,[%r430]; .loc 1 465 14 add.u64 %r172,%r168,%r116; .loc 1 466 23 add.u64 %r432,%r453,%r202; ld.u64 %r118,[%r432]; .loc 1 466 13 shl.b64 %r433,%r118,3; add.u64 %r173,%r169,%r433; .loc 1 444 32 add.u64 %r435,%r452,%r202; ld.u64 %r122,[%r435]; .loc 1 444 13 setp.eq.u64 %r436,%r113,%r122; @ %r436 bra $L111; .loc 1 448 13 mov.u64 %r185,0; bra $L113; $L140: .loc 1 322 6 ld.u64 %r437,[%r213]; setp.eq.u64 %r438,%r437,0; @ ! %r438 bra $L80; bra $L112; $L65_gfortran_sminloc1_8_r8 .visible .func _gfortran_sminloc1_8_r8, .param .u32 %in_ar4) {local .align 16 .b8 %frame_ar[368u64 %r107; .reg .u643264predpredpredpred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .u64 %r253; .reg .u32u32 %r258; .reg .u64predpred %r279; .reg .u64 %r280; .reg .pred %r281; .reg .predpred %r292; .reg .pred %r293; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; mov.u32 %r154,%ar4; .loc 1 493 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L144; .loc 1 493 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L145; $L144:r152154; call _gfortran_minloc1_8_r8500 7 bra $L143; $L145: .loc 1 503 10 ld.u64 %r23,[%r152]; .loc 1 503 7 add.u64 %r88,%r23,-1; .loc 1 504 10 ld.s8 %r24,[%r151+28]; .loc 1 504 38 add.u32 %r26,%r24,-1; .loc 1 504 8 cvt.s64.s32 %r296,%r26; .loc 1 506 7 shr.u64 %r163,%r88,63; set.u32.gt.s64 %r165,%r88,%r296; neg.s32 %r166,%r165; cvt.u16.u64 %r169,%r163; cvt.u16.u32 %r170,%r166; or.b16 %r168,%r169,%r170; .loc 1 506 6 cvt.u32.u16 %r171,%r168; cvt.u16.u8 %r172,%r171; setp.ne.u16 %r173,%r172,0; @ %r173 bra $L147; .loc 1 513 3 setp.ne.u64 %r174,%r88,0; @ %r174 bra $L148; $L154: .loc 1 521 3 setp.lt.s64 %r175,%r88,%r296; @ %r175 bra $L149; bra $L202; $L147: .loc 1 508 7 cvt.u32.u32 %r178,%r24; cvt.s64.s8 %r177,%r178; st.u64 [%stack+8],%r1771761stack; call _gfortran_runtime_error148: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r180,%r151,24; add.u64 %r182,%r23,%r23; add.u64 %r183,%r182,%r23; shl.b64 %r184,%r183,3; add.u64 %r146,%r180,%r184; .loc 1 518 12 mov.u64 %r302,0; $L153: .loc 1 515 19 ld.u64 %r186,[%r140]; add.u64 %r185,%r186,1; ld.u64 %r187,[%r137]; sub.u64 %r36,%r185,%r187; .loc 1 517 10 setp.le.s64 %r188,%r36,0; @ %r188 bra $L151; .loc 1 515 17 st.u64 [%r139],%r36; bra $L152; $L151: .loc 1 518 12 st.u64 [%r139],%r302; $L152: .loc 1 513 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r190,%r137,%r146; @ %r190 bra $L153; bra $L154; $L202: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r191,%r98,0; @ ! 
%r191 bra $L156; bra $L155; $L149: add.u64 %r193,%r23,%r23; add.u64 %r194,%r193,%r23195,48; add.u64 %r127,%r151,%r196; add.u64 %r295,%frame,120; shl.b64 %r198,%r23,3; add.u64 %r199,%r198,-8; add.u64 %r43,%r295,%r199; add.u64 %r200,%r151,48; cvt.u32.u32 %r202,%r24; cvt.s64.s8 %r201,%r202; add.u64 %r204,%r201,%r201; add.u64 %r205,%r204,%r201; shl.b64 %r206,%r205,3; add.u64 %r133,%r200,%r206; .loc 1 527 12 mov.u64 %r301,0; $L159: .loc 1 524 2 ld.u64 %r208,[%r127+8]; add.u64 %r207,%r208,1; ld.u64 %r209,[%r127]; sub.u64 %r41,%r207,%r209; .loc 1 526 10 setp.le.s64 %r210,%r41,0; @ %r210 bra $L157; .loc 1 523 17 st.u64 [%r43],%r41; bra $L158; $L157: .loc 1 527 12 st.u64 [%r43],%r301; $L158: .loc 1 521 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r212,%r127,%r133; @ %r212 bra $L159; bra $L203; $L155: .loc 1 534 7 setp.eq.u64 %r213,%r296,0; @ %r213 bra $L161; add.u64 %r295,%frame,120; $L181: add.u64 %r111,%r150,40; mov.u64 %r107,%r295; cvt.u32.u32 %r215,%r24; cvt.s64.s8 %r214,%r215; add.u64 %r217,%r214,%r214; add.u64 %r218,%r217,%r214; shl.b64 %r219,%r218,3; add.u64 %r220,%r150,16; add.u64 %r79,%r219,%r220; .loc 1 537 10 mov.u64 %r81,1; .loc 1 541 4 mov.u64 %r239,0; bra $L162; $L161: .loc 1 545 24 mov.u64 %r221,0; st.u64 [%r150+8],%r221; .loc 1 546 28 cvt.u16.u32 %r224,%r24; add.u16 %r223,%r224,-1; cvt.u32.u16 %r225,%r223; st.u8 [%r150+28],%r225; .loc 1 548 20 add.u32 %r226,%r24,-2; cvt.s64.s32 %r50,%r226; add.u64 %r228,%r50,%r50; add.u64 %r229,%r228,%r50; shl.b64 %r230,%r229,3; add.u64 %r231,%r150,%r230; .loc 1 548 67 shl.b64 %r233,%r50,3; add.u64 %r234,%frame,%r233; .loc 1 548 59 ld.u64 %r236,[%r231+40]; ld.u64 %r237,[%r234+120]; mul.lo.u64 %r96,%r236,%r237; .loc 1 550 10 setp.eq.u64 %r238,%r96,0; @ %r238 bra $L163; bra $L204; $L165: .loc 1 539 48 mul.lo.u64 %r81,%r46,%r81; $L162: .loc 1 541 4 st.u64 [%r111+8],%r239; ld.u64 %r46,[%r107]; add.u64 %r240,%r46,-1; st.u64 [%r111+16],%r240; st.u64 [%r111],%r81; .loc 1 534 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r241,%r79,%r111; @ %r241 bra $L165; bra $L161; $L163: .loc 1 553 4 st.u64 [%r150+48],%r96; mov.u64 %r243,-1; st.u64 [%r150+56],%r243; mov.u64 %r244,1; st.u64 [%r150+40],%r244; .loc 1 554 4 bra $L143; $L204: .loc 1 557 24 mov.u64 %r24696246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } mov.u64 %r98,%r247; .loc 1 557 22 st.u64 [%r150],%r98; $L169: .loc 1 583 3 setp.ne.u64 %r249,%r296,0; @ %r249 bra $L166; bra $L167; $L156: .loc 1 561 19 ld.s8 %r55,[%r150+28]; .loc 1 561 10 setp.eq.u64 %r250,%r55,%r296; @ %r250 bra $L168; .loc 1 562 2 st.u64 [%stack+8],%r296; st.u64 [%stack],%r55;_gfortran_runtime_error168: .loc 1 567 11 cvta.global.u64 %r253,_gfortrani_compile_options; .loc 1 567 10 ld.u32 %r254,[%r253+36]; setp.eq.u32 %r255,%r254,0; @ %r255 bra $L169; .loc 1 569 4 setp.eq.u64 %r256,%r296,0; @ %r256 bra $L167; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r258,%r24; cvt.s64.s8 %r257,%r258; add.u64 %r129,%r257,-1; .loc 1 569 10 mov.u64 %r104,0; $L171: .loc 1 573 21 ld.u64 %r260,[%r78+8]; add.u64 %r259,%r260,1; .loc 1 573 19 ld.u64 %r261,[%r78]; sub.u64 %r93,%r259,%r261; .loc 1 574 18 ld.u64 %r63,[%r31]; .loc 1 569 25 add.u64 %r104,%r104,1; .loc 1 574 11 setp.eq.u64 %r262,%r63,%r93; @ %r262 bra $L170; .loc 1 575 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104;_gfortran_runtime_error170: .loc 1 569 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r265,%r129,%r104; @ %r265 bra $L171; $L166: .loc 1 585 16 cvt.u16.u32 %r267,%r24; setp.le.s16 %r268,%r267,1; @ 
%r268 bra $L172; shl.b64 %r266,%r296,3; bra $L173; $L172: mov.u64 %r266,8; $L173: add.u64 %r270,%frame,240; mov.u32 %r27426676,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L174: .loc 1 586 18 ld.u64 %r278,[%r116]; st.u64 [%r114],%r278; .loc 1 583 26 add.u64 %r100,%r100,1; .loc 1 583 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r279,%r296,%r100; @ %r279 bra $L174; $L167: ld.u64 %r80,[%frame+240]; .loc 1 595 22 ld.u64 %r68,[%frame]; .loc 1 595 12 shl.b64 %r69,%r68,3; .loc 1 597 32 ld.u64 %r59,[%frame+120]; .loc 1 604 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r294,%r296,1; .loc 1 593 13 mov.u64 %r280,0; cvt.u32.u32 %r298,%r24; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; $L180: st.u64 [%r98],%r280; .loc 1 594 15 add.u64 %r80,%r80,1; .loc 1 595 12 add.u64 %r98,%r98,%r69; .loc 1 597 13 setp.ne.u64 %r281,%r59,%r80; @ %r281 bra $L180; .loc 1 606 7 @ ! %r294 bra $L205; bra $L143; $L179: .loc 1 601 13 st.u64 [%r99],%r280; .loc 1 604 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 605 5 add.u64 %r97,%r97,1; .loc 1 606 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r284,%r97,%r300; @ %r284 bra $L178; bra $L143; $L205: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 604 23 mov.u64 %r72,%r44; .loc 1 605 5 mov.u64 %r97,1; $L178: .loc 1 610 16 ld.u64 %r289,[%r99]; add.u64 %r74,%r289,1; st.u64 [%r99],%r74; .loc 1 611 23 ld.u64 %r75,[%r124]; .loc 1 611 13 sub.u64 %r290,%r75,%r72; shl.b64 %r291,%r290,3; add.u64 %r98,%r98,%r291; .loc 1 597 32 ld.u64 %r77,[%r123]; .loc 1 597 13 setp.eq.u64 %r292,%r74,%r77; @ %r292 bra $L179; .loc 1 601 13 mov.u64 %r80,0; bra $L180; $L203: .loc 1 530 15 ld.u64 %r98,[%r150]; .loc 1 530 6 setp.eq.u64 %r293,%r98,0; @ ! %r293 bra $L156; bra $L181; $L143: .loc 1 615 1 ret; } minloc1_16_r8.o/_gfortran_minloc1_16_r8 .visible .func _gfortran_minloc1_16_r8.file 1 "../../../libgfortran/generated/minloc1_16_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_16_r8 .visible .func _gfortran_mminloc1_16_r8_gfortran_sminloc1_16_r8 .visible .func _gfortran_sminloc1_16_r8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents74] = {114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,76,79,67,0 
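For orientation, here is a minimal C sketch of the reduction these minloc1_* entry points implement: walk one rank-1 section of the array and record the 1-based position of its smallest element (0 when the extent is empty). It is an illustration only; the descriptor plumbing, the NaN-skipping prologue visible in the PTX, and the masked variants are omitted, and the names are placeholders rather than libgfortran's.

```c
#include <stdint.h>
#include <stdio.h>

/* Illustrative only: 1-based position of the minimum of `len` doubles
 * spaced `stride` elements apart; 0 for an empty section.              */
static int64_t minloc1_r8_sketch(const double *base, int64_t len, int64_t stride)
{
    int64_t pos = 0;
    double  minval = 0.0;

    for (int64_t i = 0; i < len; i++) {
        double v = base[i * stride];
        if (pos == 0 || v < minval) {   /* strict '<' keeps the first minimum */
            minval = v;
            pos = i + 1;                /* Fortran positions are 1-based      */
        }
    }
    return pos;
}

int main(void)
{
    double a[] = { 3.0, 1.5, 2.0, 1.5 };
    printf("%lld\n", (long long)minloc1_r8_sketch(a, 4, 1));  /* prints 2 */
    return 0;
}
```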
Remaining constant strings for minloc1_16_r8.o (decoded, truncated where bytes are missing): "…incorrect in MINLOC intrinsic: is %ld, should be between 1 and %l…", "Funny sized logical array", "rank of return array incorrect in MINLOC intri…", "MASK argume…", "Incorrect extent in return value of MINLOC intrinsic in dimension %ld: is %ld, should be %l…", "return val…".
- _gfortran_minloc1_16_r8: same structure as the _8_r8 routine, but the result is INTEGER(16): result strides are scaled by 16 bytes (shl.b64 …,4), each position is stored as two 64-bit words, and the running index is kept in a 64-bit register pair whose increments propagate carries explicitly (set.u32.lt.u64 on the low-word sum adjusts the high word).
- _gfortran_mminloc1_16_r8: the MASK=array variant, with the same rank/extent checks, _gfortrani_xmallocarray allocation, and bounds calls as the _8_r8 version, again carrying the 128-bit index as a register pair.
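The _16_ variants return INTEGER(16) positions, and the PTX keeps that 128-bit index in a pair of 64-bit registers, detecting carry with set.u32.lt.u64 on the low-word sum. Below is a small C sketch of the same carry-propagating add; the struct is illustrative only, since the generated C presumably uses the compiler's 128-bit integer type, which the nvptx back end lowers to such register pairs.

```c
#include <stdint.h>
#include <stdio.h>

/* Illustrative two-word integer, mirroring the register pair in the PTX. */
typedef struct { uint64_t lo; uint64_t hi; } u128_sketch;

static u128_sketch u128_add(u128_sketch x, uint64_t n)
{
    u128_sketch r;
    r.lo = x.lo + n;
    r.hi = x.hi + (r.lo < x.lo);   /* carry out iff the low word wrapped */
    return r;
}

int main(void)
{
    u128_sketch i = { UINT64_MAX, 0 };
    i = u128_add(i, 1);            /* wraps the low word, carries into hi */
    printf("%llu %llu\n", (unsigned long long)i.hi, (unsigned long long)i.lo);
    return 0;
}
```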
- _gfortran_sminloc1_16_r8: the scalar-MASK variant for the INTEGER(16) result. A null mask pointer or a true mask value calls _gfortran_minloc1_16_r8 and returns; a false mask checks the return array's rank and extents, allocates it through _gfortrani_xmallocarray if needed, and zero-fills both 64-bit words of every result element.
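The scalar-mask entry point's dispatch is still legible in the PTX: a null mask pointer or a true mask value routes to the unmasked routine, while a false mask zero-fills the result. A hedged C sketch of that shape follows; the types, names, and result descriptor are placeholders, not libgfortran's ABI.

```c
#include <stdint.h>
#include <string.h>

/* Placeholder result descriptor, for illustration only. */
typedef struct { int64_t *data; int64_t nelems; } result_sketch;

/* Stand-in for the unmasked reduction (see the earlier sketch). */
static void minloc1_unmasked_sketch(result_sketch *ret) { (void)ret; }

/* Dispatch shape of the sminloc1_* wrappers as read from the PTX. */
static void sminloc1_sketch(result_sketch *ret, const uint32_t *mask)
{
    if (mask == NULL || *mask != 0) {
        minloc1_unmasked_sketch(ret);   /* behaves like MASK=.TRUE. */
        return;
    }
    /* MASK=.FALSE.: every returned position is 0. */
    memset(ret->data, 0, (size_t)ret->nelems * sizeof *ret->data);
}

int main(void)
{
    int64_t out[3] = { 7, 7, 7 };
    result_sketch r = { out, 3 };
    uint32_t mask_false = 0;
    sminloc1_sketch(&r, &mask_false);   /* zero-fills out[] */
    return (int)out[0];                 /* 0 */
}
```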
[Archive members minloc1_4_r10.o and minloc1_8_r16.o — only the ar member headers (timestamp 1622802197) are present at this point in the dump.]

[Archive member minval_i1.o — generated from ../../../libgfortran/generated/minval_i1.c.] Declarations: _gfortran_minval_i1, _gfortran_mminval_i1, _gfortran_sminval_i1, plus the same extern runtime symbols as above. Constant strings (decoded, truncated where bytes are missing): "…incorrect in MINVAL intrinsic: is %ld, should be between 1 and %l…", "rank of return array incorrect in MINVAL intrinsic: is %ld, should be %l…", "MINVAL", "return valu…", "Funny sized logical array", "MASK argume…", "Incorrect extent in return value of MINVAL intrinsic in dimension %ld: is %ld, should be %l…".
- _gfortran_minval_i1: MINVAL along a dimension for an INTEGER(1) array. The reduction starts the running minimum at 127 (stored directly for an empty extent), reduces with min.s16, and stores the low byte of the result.
- _gfortran_mminval_i1: the MASK=array variant; the dump ends inside its prologue (descriptor checks and stride setup).
add.u64 %r422,%frame,360; .loc 1 285 12 mov.u64 %r437,%r191; $L60: .loc 1 280 18 add.u64 %r283,%r423,%r191; ld.u64 %r284,[%r187]; st.u64 [%r283],%r284; .loc 1 281 18 add.u64 %r285,%frame,%r191; .loc 1 281 20 ld.u64 %r287,[%r189]; mul.lo.u64 %r286,%r287,%r34; .loc 1 281 18 st.u64 [%r285],%r286; .loc 1 282 19 ld.u64 %r289,[%r187+16]; add.u64 %r288,%r289,1; ld.u64 %r290,[%r187+8]; sub.u64 %r53,%r288,%r290; .loc 1 284 10 setp.lt.s64 %r291,%r53,0; @ %r291 bra $L58; .loc 1 282 17 add.u64 %r293,%r422,%r191; st.u64 [%r293],%r53; bra $L59; $L58: .loc 1 285 12 add.u64 %r295,%r422,%r191; st.u64 [%r295],%r437; $L59: .loc 1 278 3 add.u64 %r187,%r187,24; add.u64 %r189,%r189,24; add.u64 %r191,%r191,8; setp.ne.u64 %r297,%r187,%r200; @ %r297 bra $L60; bra $L61; $L105: .loc 1 298 6 ld.u64 %r298,[%r203]; setp.eq.u64 %r299,%r298,0; @ ! %r299 bra $L63; bra $L62; $L56: add.u64 %r301,%r22,%r22; add.u64 %r302,%r301,%r22; shl.b64 %r303,%r302,3; add.u64 %r152,%r303,40; add.u64 %r55,%r204,%r152; add.u64 %r154,%r206,%r152; shl.b64 %r304,%r22,3; add.u64 %r99,%r304,-8; cvt.u32.u32 %r306,%r23; cvt.s64.s8 %r305,%r306; shl.b64 %r307,%r305,3; add.u64 %r184,%r307,-8; add.u64 %r423,%frame,240; add.u64 %r422,%frame,360; .loc 1 295 12 mov.u64 %r436,0; $L66: .loc 1 290 18 add.u64 %r309,%r423,%r99; ld.u64 %r310,[%r55]; st.u64 [%r309],%r310; .loc 1 291 18 add.u64 %r311,%frame,%r99; .loc 1 291 20 ld.u64 %r313,[%r154]; mul.lo.u64 %r312,%r313,%r34; .loc 1 291 18 st.u64 [%r311],%r312; .loc 1 292 19 ld.u64 %r315,[%r55+16]; add.u64 %r314,%r315,1; ld.u64 %r316,[%r55+8]; sub.u64 %r63,%r314,%r316; .loc 1 294 10 setp.lt.s64 %r317,%r63,0; @ %r317 bra $L64; .loc 1 292 17 add.u64 %r319,%r422,%r99; st.u64 [%r319],%r63; bra $L65; $L64: .loc 1 295 12 add.u64 %r321,%r422,%r99; st.u64 [%r321],%r436; $L65: .loc 1 288 3 add.u64 %r55,%r55,24; add.u64 %r154,%r154,24; add.u64 %r99,%r99,8; setp.ne.u64 %r323,%r99,%r184; @ %r323 bra $L66; bra $L106; $L62: .loc 1 302 7 setp.eq.u64 %r324,%r132,0; @ %r324 bra $L68; add.u64 %r422,%frame,360; $L86: add.u64 %r123,%r203,40; mov.u64 %r117,%r422; cvt.u32.u32 %r326,%r23; cvt.s64.s8 %r325,%r326; add.u64 %r328,%r325,%r325; add.u64 %r329,%r328,%r325; shl.b64 %r330,%r329,3; add.u64 %r331,%r203,16; add.u64 %r71,%r330,%r331; .loc 1 305 10 mov.u64 %r119,1; .loc 1 309 4 mov.u64 %r350,0; bra $L69; $L68: .loc 1 313 20 add.u32 %r332,%r23,-2; cvt.s64.s32 %r69,%r332; add.u64 %r334,%r69,%r69; add.u64 %r335,%r334,%r69; shl.b64 %r336,%r335,3; add.u64 %r337,%r203,%r336; .loc 1 313 67 shl.b64 %r339,%r69,3; add.u64 %r340,%frame,%r339; .loc 1 313 59 ld.u64 %r342,[%r337+40]; ld.u64 %r343,[%r340+360]; mul.lo.u64 %r139,%r342,%r343; .loc 1 315 24 mov.u64 %r344,0; st.u64 [%r203+8],%r344; .loc 1 316 28 cvt.u16.u32 %r347,%r23; add.u16 %r346,%r347,-1; cvt.u32.u16 %r348,%r346; st.u8 [%r203+28],%r348; .loc 1 318 10 setp.eq.u64 %r349,%r139,0; @ %r349 bra $L70; bra $L107; $L72: .loc 1 307 47 mul.lo.u64 %r119,%r66,%r119; $L69: .loc 1 309 4 st.u64 [%r123+8],%r350; ld.u64 %r66,[%r117]; add.u64 %r351,%r66,-1; st.u64 [%r123+16],%r351; st.u64 [%r123],%r119; .loc 1 302 7 add.u64 %r123,%r123,24; add.u64 %r117,%r117,8; setp.ne.u64 %r352,%r71,%r123; @ %r352 bra $L72; bra $L68; $L70: .loc 1 321 4 st.u64 [%r203+48],%r139; mov.u64 %r354,-1; st.u64 [%r203+56],%r354; mov.u64 %r355,1; st.u64 [%r203+40],%r355; .loc 1 322 4 bra $L48; $L107: .loc 1 325 24 mov.u64 %r3571357; call (%value_in),_gfortrani_xmallocarray58,[%value_in]; } .loc 1 325 22 st.u64 [%r203],%r358; bra $L73; $L63: .loc 1 330 19 ld.s8 %r360,[%r203+28]; .loc 1 330 10 setp.eq.u64 %r361,%r360,%r132; @ 
%r361 bra $L74; .loc 1 331 2 cvta.const.u64 %r362362_gfortran_runtime_error74: .loc 1 333 11 cvta.global.u64 %r364,_gfortrani_compile_options; .loc 1 333 10 ld.u32 %r365,[%r364+36]; setp.eq.u32 %r366,%r365,0; @ %r366 bra $L73; .loc 1 335 4 add.u64 %r422,%frame,360; cvta.const.u64 %r370,$LC2r422369370; call _gfortrani_bounds_ifunction_return337 4 cvta.const.u64 %r374,$LC6; {374370; call _gfortrani_bounds_equal_extents$L73: .loc 1 342 3 setp.ne.u64 %r376,%r132,0; @ %r376 bra $L75; $L78: .loc 1 350 8 ld.u64 %r146,[%r203]; .loc 1 351 8 ld.u64 %r144,[%r204]; .loc 1 353 9 setp.ne.u64 %r377,%r144,0; @ %r377 bra $L76; bra $L48; $L75: add.u64 %r170,%frame,480; add.u64 %r167,%r203,40; add.u64 %r160,%frame,120; cvt.u32.u32 %r379,%r23; cvt.s64.s8 %r378,%r379; add.u64 %r124,%r378,-1; .loc 1 342 3 mov.u64 %r151,0; add.u64 %r422,%frame,360; .loc 1 344 16 mov.u64 %r380,%r151; $L77: st.u64 [%r170],%r380; .loc 1 345 18 ld.u64 %r381,[%r167]; st.u64 [%r160],%r381; .loc 1 346 17 shl.b64 %r383,%r151,3; add.u64 %r384,%r422,%r383; .loc 1 346 10 ld.u64 %r385,[%r384]; setp.le.s64 %r386,%r385,0; @ %r386 bra $L48; .loc 1 342 26 add.u64 %r151,%r151,1; .loc 1 342 3 add.u64 %r170,%r170,8; add.u64 %r167,%r167,24; add.u64 %r160,%r160,8; setp.ne.u64 %r387,%r124,%r151; @ %r387 bra $L77; bra $L78; $L76: .loc 1 276 12 mul.lo.u64 %r44,%r42,%r34; ld.u64 %r173,[%frame+480]; .loc 1 401 22 ld.u64 %r85,[%frame+240]; .loc 1 402 23 ld.u64 %r87,[%frame]; .loc 1 403 22 ld.u64 %r89,[%frame+120]; setp.le.s64 %r427,%r132,1; .loc 1 365 9 mov.u32 %r429,127; cvt.u32.u32 %r430,%r23; cvt.s64.s8 %r431,%r430; add.u64 %r432,%r431,-1; add.u64 %r433,%frame,240; add.u64 %r434,%frame,360; add.u64 %r435,%frame,120; $L87: .loc 1 342 3 mov.u64 %r150,%r145; mov.u64 %r149,%r144; .loc 1 365 9 mov.u32 %r121,%r429; .loc 1 370 9 mov.u64 %r148,0; $L80: .loc 1 394 6 ld.s8 %r389,[%r150]; cvt.u16.u32 %r388,%r389; setp.eq.u16 %r390,%r388,0; @ %r390 bra $L79; ld.s8 %r393,[%r149]; cvt.u16.u32 %r392,%r393; cvt.u16.u32 %r395,%r121; min.s16 %r394,%r392,%r395; cvt.u32.u16 %r396,%r394; cvt.s32.s8 %r121,%r396; $L79: .loc 1 370 24 add.u64 %r148,%r148,1; .loc 1 370 32 add.u64 %r149,%r149,%r136; .loc 1 370 47 add.u64 %r150,%r150,%r44; .loc 1 370 2 setp.ne.u64 %r397,%r133,%r148; @ %r397 bra $L80; .loc 1 397 8 cvt.u32.u32 %r398,%r121; st.u8 [%r146],%r398; .loc 1 400 15 add.u64 %r173,%r173,1; .loc 1 401 12 add.u64 %r144,%r144,%r85; .loc 1 402 13 add.u64 %r145,%r145,%r87; .loc 1 403 12 add.u64 %r146,%r146,%r89; .loc 1 405 32 ld.u64 %r79,[%frame+360]; .loc 1 405 13 setp.ne.u64 %r399,%r79,%r173; @ %r399 bra $L87; .loc 1 412 23 mul.lo.u64 %r93,%r173,%r85; .loc 1 413 24 mul.lo.u64 %r400,%r173,%r87; .loc 1 413 10 sub.u64 %r141,%r145,%r400; .loc 1 414 23 mul.lo.u64 %r401,%r173,%r89; .loc 1 414 9 sub.u64 %r142,%r146,%r401; .loc 1 416 7 @ ! 
%r427 bra $L108; bra $L48; $L85: .loc 1 409 13 st.u64 [%r40],%r428; .loc 1 412 23 mul.lo.u64 %r93,%r105,%r104; .loc 1 413 24 mul.lo.u64 %r404,%r107,%r104; .loc 1 413 10 sub.u64 %r141,%r145,%r404; .loc 1 414 23 mul.lo.u64 %r405,%r109,%r104; .loc 1 414 9 sub.u64 %r142,%r146,%r405; .loc 1 415 5 add.u64 %r143,%r143,1; .loc 1 416 7 add.u64 %r40,%r40,8; add.u64 %r164,%r164,8; setp.ne.u64 %r406,%r143,%r432; @ %r406 bra $L84; bra $L48; $L108: add.u64 %r40,%frame,488; mov.u64 %r164,8; .loc 1 415 5 mov.u64 %r143,1; .loc 1 409 13 mov.u64 %r428,0; $L84: .loc 1 424 16 ld.u64 %r410,[%r40]; add.u64 %r104,%r410,1; st.u64 [%r40],%r104; .loc 1 425 23 add.u64 %r412,%r433,%r164; ld.u64 %r105,[%r412]; .loc 1 425 13 sub.u64 %r413,%r105,%r93; add.u64 %r144,%r144,%r413; .loc 1 426 24 add.u64 %r414,%frame,%r164; ld.u64 %r107,[%r414]; .loc 1 426 14 add.u64 %r145,%r141,%r107; .loc 1 427 23 add.u64 %r416,%r435,%r164; ld.u64 %r109,[%r416]; .loc 1 427 13 add.u64 %r146,%r142,%r109; .loc 1 405 32 add.u64 %r418,%r434,%r164; ld.u64 %r112,[%r418]; .loc 1 405 13 setp.eq.u64 %r419,%r104,%r112; @ %r419 bra $L85; .loc 1 409 13 mov.u64 %r173,0; bra $L87; $L106: .loc 1 298 6 ld.u64 %r420,[%r203]; setp.eq.u64 %r421,%r420,0; @ ! %r421 bra $L63; bra $L86; $L48:_gfortran_sminval_i1 .visible .func _gfortran_sminval_68predpred %r184; .reg .predpred %r206; .reg .predu32 %r211; .reg .u64 %r213; .reg .u64u32 %r222; .reg .u64pred %r264; .reg .u64 %r266; .reg .u32 %r270; .reg .u64 %r272; .reg .u64 %r274; .reg .predpredu64 %r297; mov.u64 %r148,%ar0; mov.u64 %r149,%ar1; mov.u64 %r150,%ar2; mov.u64 %r151,%ar3; .loc 1 454 6 setp.eq.u64 %r152,%r151,0; @ %r152 bra $L110; .loc 1 454 20 ld.u32 %r153,[%r151]; setp.eq.u32 %r154,%r153,0; @ %r154 bra $L111; $L110: .loc 1 459 7gfortran_minva.loc 1 461 7 bra $L109; $L111: .loc 1 464 10 ld.u64 %r23,[%r150]; .loc 1 464 7 add.u64 %r86,%r23,-1; .loc 1 465 10 ld.s8 %r24,[%r149+28]; .loc 1 465 38 add.u32 %r26,%r24,-1; .loc 1 465 8 cvt.s64.s32 %r291,%r26; .loc 1 467 7 shr.u64 %r159,%r86,63; set.u32.gt.s64 %r161,%r86,%r291; neg.s32 %r162,%r161; cvt.u16.u64 %r165,%r159; cvt.u16.u32 %r166,%r162; or.b16 %r164,%r165,%r166; .loc 1 467 6 cvt.u32.u16 %r167,%r164; cvt.u16.u8 %r168,%r167; setp.ne.u16 %r169,%r168,0; @ %r169 bra $L113; .loc 1 474 3 setp.ne.u64 %r170,%r86,0; @ %r170 bra $L114; $L120: .loc 1 482 3 setp.lt.s64 %r171,%r86,%r291; @ %r171 bra $L115; bra $L168; $L113: .loc 1 469 7 cvt.u32.u32 %r174,%r24; cvt.s64.s8 %r173,%r174; st.u64 [%stack+8],%r173172172_gfortran_runtime_error114: add.u64 %r135,%r149,48; add.u64 %r137,%frame,120; add.u64 %r138,%r149,56; add.u64 %r176,%r149,24; add.u64 %r178,%r23,%r23; add.u64 %r179,%r178,%r23; shl.b64 %r180,%r179,3; add.u64 %r144,%r176,%r180; .loc 1 479 12 mov.u64 %r297,0; $L119: .loc 1 476 19 ld.u64 %r182,[%r138]; add.u64 %r181,%r182,1; ld.u64 %r183,[%r135]; sub.u64 %r36,%r181,%r183; .loc 1 478 10 setp.le.s64 %r184,%r36,0; @ %r184 bra $L117; .loc 1 476 17 st.u64 [%r137],%r36; bra $L118; $L117: .loc 1 479 12 st.u64 [%r137],%r297; $L118: .loc 1 474 3 add.u64 %r135,%r135,24; add.u64 %r137,%r137,8; add.u64 %r138,%r138,24; setp.ne.u64 %r186,%r135,%r144; @ %r186 bra $L119; bra $L120; $L168: .loc 1 491 15 ld.u64 %r96,[%r148]; .loc 1 491 6 setp.eq.u64 %r187,%r96,0; @ ! 
%r187 bra $L122; bra $L121; $L115: add.u64 %r189,%r23,%r23; add.u64 %r190,%r189,%r23; shl.b64 %r191,%r190,3; add.u64 %r192,%r191,48; add.u64 %r57,%r149,%r192; add.u64 %r290,%frame,120; shl.b64 %r194,%r23,3; add.u64 %r195,%r194,-8; add.u64 %r67,%r290,%r195; add.u64 %r196,%r149,48; cvt.u32.u32 %r198,%r24; cvt.s64.s8 %r197,%r198; add.u64 %r200,%r197,%r197; add.u64 %r201,%r200,%r197; shl.b64 %r202,%r201,3; add.u64 %r131,%r196,%r202; .loc 1 488 12 mov.u64 %r296,0; $L125: .loc 1 485 2 ld.u64 %r204,[%r57+8]; add.u64 %r203,%r204,1; ld.u64 %r205,[%r57]; sub.u64 %r41,%r203,%r205; .loc 1 487 10 setp.le.s64 %r206,%r41,0; @ %r206 bra $L123; .loc 1 484 17 st.u64 [%r67],%r41; bra $L124; $L123: .loc 1 488 12 st.u64 [%r67],%r296; $L124: .loc 1 482 3 add.u64 %r57,%r57,24; add.u64 %r67,%r67,8; setp.ne.u64 %r208,%r57,%r131; @ %r208 bra $L125; bra $L169; $L121: .loc 1 495 7 setp.eq.u64 %r209,%r291,0; @ %r209 bra $L127; add.u64 %r290,%frame,120; $L147: add.u64 %r102,%r148,40; mov.u64 %r84,%r290; cvt.u32.u32 %r211,%r24; cvt.s64.s8 %r210,%r211; add.u64 %r213,%r210,%r210; add.u64 %r214,%r213,%r210; shl.b64 %r215,%r214,3; add.u64 %r216,%r148,16; add.u64 %r60,%r215,%r216; .loc 1 498 10 mov.u64 %r82,1; .loc 1 502 4 mov.u64 %r235,0; bra $L128; $L127: .loc 1 506 24 mov.u64 %r217,0; st.u64 [%r148+8],%r217; .loc 1 507 28 cvt.u16.u32 %r220,%r24; add.u16 %r219,%r220,-1; cvt.u32.u16 %r221,%r219; st.u8 [%r148+28],%r221; .loc 1 509 20 add.u32 %r222,%r24,-2; cvt.s64.s32 %r50,%r222; add.u64 %r224,%r50,%r50; add.u64 %r225,%r224,%r50; shl.b64 %r226,%r225,3; add.u64 %r227,%r148,%r226; .loc 1 509 67 shl.b64 %r229,%r50,3; add.u64 %r230,%frame,%r229; .loc 1 509 59 ld.u64 %r232,[%r227+40]; ld.u64 %r233,[%r230+120]; mul.lo.u64 %r93,%r232,%r233; .loc 1 511 10 setp.eq.u64 %r234,%r93,0; @ %r234 bra $L129; bra $L170; $L131: .loc 1 500 48 mul.lo.u64 %r82,%r46,%r82; $L128: .loc 1 502 4 st.u64 [%r102+8],%r235; ld.u64 %r46,[%r84]; add.u64 %r236,%r46,-1; st.u64 [%r102+16],%r236; st.u64 [%r102],%r82; .loc 1 495 7 add.u64 %r102,%r102,24;131; bra $L127; $L129: .loc 1 514 4 st.u64 [%r148+48],%r93; mov.u64 %r239,-1; st.u64 [%r148+56],%r239; mov.u64 %r240,1; st.u64 [%r148+40],%r240; .loc 1 515 4 bra $L109; $L170: .loc 1 518 24 mov.u64 %r242(%value_in),_gfortrani_xmallocarray3,[%value_in]; } mov.u64 %r96,%r243; .loc 1 518 22 st.u64 [%r148],%r96; $L135: .loc 1 544 3 setp.ne.u64 %r245,%r291,0; @ %r245 bra $L132; bra $L133; $L122: .loc 1 522 19 ld.s8 %r55,[%r148+28]; .loc 1 522 10 setp.eq.u64 %r246,%r55,%r291; @ %r246 bra $L134; .loc 1 523 2 st.u64 [%stack+8],%r291; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 528 11 cvta.global.u64 %r249,_gfortrani_compile_options; .loc 1 528 10 ld.u32 %r250,[%r249+36]; setp.eq.u32 %r251,%r250,0; @ %r251 bra $L135; .loc 1 530 4 setp.eq.u64 %r252,%r291,0; @ %r252 bra $L133; add.u64 %r31,%r148,48; add.u64 %r125,%frame,120; cvt.u32.u32 %r254,%r24; cvt.s64.s8 %r253,%r254; add.u64 %r92,%r253,-1; .loc 1 530 10 mov.u64 %r101,0; $L137: .loc 1 534 21 ld.u64 %r256,[%r31+8]; add.u64 %r255,%r256,1; .loc 1 534 19 ld.u64 %r257,[%r31]; sub.u64 %r91,%r255,%r257; .loc 1 535 18 ld.u64 %r64,[%r125]; .loc 1 530 25 add.u64 %r101,%r101,1; .loc 1 535 11 setp.eq.u64 %r258,%r64,%r91; @ %r258 bra $L136; .loc 1 536 3 st.u64 [%stack+16],%r64; st.u64 [%stack+8],%r91; st.u64 [%stack],%r101; cvta.const.u64 %r22_gfortran_runtime_error136: .loc 1 530 4 add.u64 %r31,%r31,24; add.u64 %r125,%r125,8; setp.ne.u64 %r261,%r92,%r101; @ %r261 bra $L137; $L132: .loc 1 546 16 cvt.u16.u32 %r263,%r24; setp.le.s16 %r264,%r263,1; @ %r264 bra $L138; shl.b64 
%r262,%r291,3; bra $L139; $L138: mov.u64 %r262,8; $L139: add.u64 %r266,%frame,240; mov.u32 %r27026272,[%value_in]; } add.u64 %r111,%r148,40; mov.u64 %r109,%frame; mov.u64 %r98,0; $L140: .loc 1 547 18 ld.u64 %r274,[%r111]; st.u64 [%r109],%r274; .loc 1 544 26 add.u64 %r98,%r98,1; .loc 1 544 3 add.u64 %r111,%r111,24; add.u64 %r109,%r109,8; setp.gt.s64 %r275,%r291,%r98; @ %r275 bra $L140; $L133: ld.u64 %r103,[%frame+240]; .loc 1 556 22 ld.u64 %r69,[%frame]; .loc 1 558 32 ld.u64 %r59,[%frame+120]; .loc 1 565 23 mul.lo.u64 %r81,%r59,%r69; setp.le.s64 %r289,%r291,1; .loc 1 554 13 mov.u32 %r276,127; cvt.u32.u32 %r293,%r24; cvt.s64.s8 %r294,%r293; add.u64 %r295,%r294,-1; $L146: st.u8 [%r96],%r276; .loc 1 555 15 add.u64 %r103,%r103,1; .loc 1 556 12 add.u64 %r96,%r96,%r69; .loc 1 558 13 setp.ne.u64 %r277,%r59,%r103; @ %r277 bra $L146; .loc 1 567 7 @ ! %r289 bra $L171; bra $L109; $L145: .loc 1 562 13 st.u64 [%r97],%r292; .loc 1 565 23 mul.lo.u64 %r71,%r74,%r73; .loc 1 566 5 add.u64 %r95,%r95,1; .loc 1 567 7 add.u64 %r97,%r97,8; add.u64 %r119,%r119,8; add.u64 %r118,%r118,8; setp.ne.u64 %r280,%r95,%r295; @ %r280 bra $L144; bra $L109; $L171: add.u64 %r97,%frame,248; add.u64 %r119,%frame,8; add.u64 %r118,%frame,128; .loc 1 565 23 mov.u64 %r71,%r81; .loc 1 566 5 mov.u64 %r95,1; .loc 1 562 13 mov.u64 %r292,0; $L144: .loc 1 571 16 ld.u64 %r285,[%r97]; add.u64 %r73,%r285,1; st.u64 [%r97],%r73; .loc 1 572 23 ld.u64 %r74,[%r119]; .loc 1 572 13 sub.u64 %r286,%r74,%r71; add.u64 %r96,%r96,%r286; .loc 1 558 32 ld.u64 %r77,[%r118]; .loc 1 558 13 setp.eq.u64 %r287,%r73,%r77; @ %r287 bra $L145; .loc 1 562 13 mov.u64 %r103,0; bra $L146; $L169: .loc 1 491 15 ld.u64 %r96,[%r148]; .loc 1 491 6 setp.eq.u64 %r288,%r96,0; @ ! %r288 bra $L122; bra $L147; $L109: .loc 1 576 1 ret; } minval_i2.o/_gfortran_minval_i2 .visible .func _gfortran_minval_fortran/generated/minval_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminval_i2 .visible .func _gfortran_mminval_i2gfortran_sminval_i2 .visible .func _gfortran_sminval_i2VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = 
{77,73,78,86,65,76114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,86,65,76,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,86,65,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_minval_i2 .visible .func _gfortran_minval_48099pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64predmov.u64 %r179,%ar0; mov.u64 %r180,%ar1; mov.u64 %r181,%ar2; .loc 1 55 10 ld.s8 %r22,[%r180+28]; .loc 1 55 38 add.u32 %r182,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r182; .loc 1 56 10 ld.u64 %r25,[%r181]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r184,%r104,63; set.u32.lt.s64 %r186,%r103,%r104; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 58 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 60 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r104,%r104; add.u64 %r201,%r200,%r104; shl.b64 %r202,%r201,3; add.u64 %r203,%r180,%r202; ld.u64 %r30,[%r203+56]; ld.u64 %r32,[%r203+48]; .loc 1 68 9 ld.u64 %r107,[%r203+40]; .loc 1 70 3 setp.ne.u64 %r217,%r104,0; @ %r217 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r218,%r103,%r104; @ %r218 bra $L4; bra $L46; $L3: add.u64 %r78,%r180,40; add.u64 %r59,%frame,120; add.u64 %r91,%frame,240; add.u64 %r219,%r180,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r174,%r219,%r223; .loc 1 76 12 mov.u64 %r363,0; $L8: .loc 1 72 18 ld.u64 %r224,[%r78]; st.u64 [%r59],%r224; .loc 1 73 19 ld.u64 %r226,[%r78+16]; add.u64 %r225,%r226,1; ld.u64 %r227,[%r78+8]; sub.u64 %r37,%r225,%r227; .loc 1 75 10 setp.lt.s64 %r228,%r37,0; @ %r228 bra $L6; .loc 1 73 17 st.u64 [%r91],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r91],%r363; $L7: .loc 1 70 3 add.u64 %r78,%r78,24; add.u64 %r59,%r59,8; add.u64 %r91,%r91,8; setp.ne.u64 %r230,%r78,%r174; @ %r230 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r231,[%r179]; setp.eq.u64 %r232,%r231,0; @ ! 
%r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r143,%r180,%r237; shl.b64 %r238,%r25,3; add.u64 %r125,%r238,-8; add.u64 %r351,%frame,120; add.u64 %r128,%r351,%r125; add.u64 %r350,%frame,240; add.u64 %r121,%r350,%r125; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r180,40; add.u64 %r95,%r246,%r247; .loc 1 84 12 mov.u64 %r362,0; $L14: .loc 1 80 18 ld.u64 %r248,[%r143]; st.u64 [%r128],%r248; .loc 1 81 19 ld.u64 %r250,[%r143+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r143+8]; sub.u64 %r45,%r249,%r251; .loc 1 83 10 setp.lt.s64 %r252,%r45,0; @ %r252 bra $L12; .loc 1 81 17 st.u64 [%r121],%r45; bra $L13; $L12: .loc 1 84 12 st.u64 [%r121],%r362; $L13: .loc 1 78 3 add.u64 %r143,%r143,24; add.u64 %r128,%r128,8; add.u64 %r121,%r121,8; setp.ne.u64 %r254,%r95,%r143; @ %r254 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r255,%r103,0; @ %r255 bra $L16; add.u64 %r350,%frame,240; $L34: add.u64 %r152,%r179,40; mov.u64 %r150,%r350; add.u64 %r256,%r179,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r144,%r256,%r262; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r263,0; st.u64 [%r179+8],%r263; .loc 1 103 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r179+28],%r267; .loc 1 105 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r52,%r268; add.u64 %r270,%r52,%r52; add.u64 %r271,%r270,%r52; shl.b64 %r272,%r271,3; add.u64 %r273,%r179,%r272; .loc 1 105 67 shl.b64 %r275,%r52,3; add.u64 %r276,%frame,%r275; .loc 1 105 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r109,%r278,%r279; .loc 1 107 29 mov.u64 %r281,281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 107 27 st.u64 [%r179],%r282; .loc 1 108 10 setp.eq.u64 %r284,%r109,0; @ ! 
%r284 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r48,%r99; $L17: .loc 1 98 4 st.u64 [%r152+8],%r285; ld.u64 %r48,[%r150]; add.u64 %r286,%r48,-1; st.u64 [%r152+16],%r286; st.u64 [%r152],%r99; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r287,%r144,%r152; @ %r287 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r179+48],%r109; mov.u64 %r289,-1; st.u64 [%r179+56],%r289; mov.u64 %r290,1; st.u64 [%r179+40],%r290; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r179+28]; .loc 1 118 10 setp.eq.u64 %r291,%r58,%r103; @ %r291 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r294,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r295,[%r294+36]; setp.eq.u32 %r296,%r295,0; @ %r296 bra $L21; .loc 1 125 2 add.u64 %r350,%frame,240; cvta.const.u64 %r300,$LC2r350299300; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r302,%r103,0; @ %r302 bra $L24; $L27: .loc 1 65 9 add.u64 %r303,%r30,1; .loc 1 65 7 sub.u64 %r105,%r303,%r32; max.s64 %r40,%r105,0; .loc 1 137 8 ld.u64 %r116,[%r180]; .loc 1 138 8 ld.u64 %r117,[%r179]; .loc 1 158 36 add.u64 %r66,%r107,%r107; ld.u64 %r138,[%frame+360]; .loc 1 180 22 ld.u64 %r68,[%frame+120]; .loc 1 180 12 add.u64 %r69,%r68,%r68; .loc 1 181 22 ld.u64 %r71,[%frame]; .loc 1 181 12 add.u64 %r72,%r71,%r71; .loc 1 183 32 ld.u64 %r123,[%frame+240]; .loc 1 190 23 mul.lo.u64 %r307,%r68,%r123; .loc 1 190 9 add.u64 %r308,%r307,%r307; neg.s64 %r127,%r308; .loc 1 191 23 mul.lo.u64 %r309,%r71,%r123; .loc 1 191 9 add.u64 %r310,%r309,%r309; neg.s64 %r130,%r310; setp.gt.s64 %r352,%r105,0; setp.le.s64 %r353,%r103,1; cvt.u32.u32 %r355,%r22; cvt.s64.s8 %r356,%r355; add.u64 %r357,%r356,-1; add.u64 %r358,%frame,120; add.u64 %r359,%frame,240; .loc 1 187 13 mov.u64 %r360,0; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r179,40; mov.u64 %r162,%frame; cvt.u32.u32 %r312,%r22; cvt.s64.s8 %r311,%r312; add.u64 %r153,%r311,-1; .loc 1 129 3 mov.u64 %r118,0; add.u64 %r350,%frame,240; .loc 1 131 16 mov.u64 %r313,%r118; $L26: st.u64 [%r165],%r313; .loc 1 132 18 ld.u64 %r314,[%r164]; st.u64 [%r162],%r314; .loc 1 133 17 shl.b64 %r316,%r118,3; add.u64 %r317,%r350,%r316; .loc 1 133 10 ld.u64 %r318,[%r317]; setp.le.s64 %r319,%r318,0; @ %r319 bra $L1; .loc 1 129 26 add.u64 %r118,%r118,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r320,%r118,%r153; @ %r320 bra $L26; bra $L27; $L25: .loc 1 153 5 @ %r352 bra $L35; .loc 1 154 10 mov.u16 %r322,32767; st.u16 [%r117],%r322; bra $L29; $L35: mov.u64 %r112,%r116; .loc 1 151 9 mov.u32 %r43,32767; .loc 1 158 13 mov.u64 %r111,0; $L28: ld.u16 %r324,[%r112]; cvt.u16.u32 %r325,%r43; min.s16 %r323,%r324,%r325; cvt.s32.s16 %r43,%r323; .loc 1 158 28 add.u64 %r111,%r111,1; .loc 1 158 36 add.u64 %r112,%r112,%r66; .loc 1 158 6 setp.gt.s64 %r326,%r40,%r111; @ %r326 bra $L28; .loc 1 175 12 cvt.u16.u32 %r327,%r43; st.u16 [%r117],%r327; $L29: .loc 1 179 15 add.u64 %r138,%r138,1; .loc 1 180 12 add.u64 %r116,%r116,%r69; .loc 1 181 12 add.u64 %r117,%r117,%r72; .loc 1 183 13 setp.ne.u64 %r328,%r138,%r123; @ %r328 bra $L25; .loc 1 190 9 add.u64 %r113,%r116,%r127; .loc 1 191 9 add.u64 %r114,%r117,%r130; .loc 1 193 7 @ %r353 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r140,8; .loc 1 192 5 mov.u64 %r115,1; bra $L32; $L33: .loc 1 187 13 st.u64 [%r122],%r360; .loc 1 190 23 mul.lo.u64 %r334,%r85,%r84; .loc 1 190 9 add.u64 %r335,%r334,%r334; sub.u64 %r113,%r116,%r335; .loc 1 191 23 mul.lo.u64 
%r336,%r88,%r84; .loc 1 191 9 add.u64 %r337,%r336,%r336; sub.u64 %r114,%r117,%r337; .loc 1 192 5 add.u64 %r115,%r115,1; .loc 1 193 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r338,%r115,%r357; @ %r338 bra $L1; $L32: .loc 1 201 16 ld.u64 %r339,[%r122]; add.u64 %r84,%r339,1; st.u64 [%r122],%r84; .loc 1 202 23 add.u64 %r341,%r358,%r140; ld.u64 %r85,[%r341]; .loc 1 202 13 add.u64 %r342,%r85,%r85; add.u64 %r116,%r113,%r342; .loc 1 203 23 add.u64 %r343,%frame,%r140; ld.u64 %r88,[%r343]; .loc 1 203 13 add.u64 %r344,%r88,%r88; add.u64 %r117,%r114,%r344; .loc 1 183 32 add.u64 %r346,%r359,%r140; ld.u64 %r92,[%r346]; .loc 1 183 13 setp.eq.u64 %r347,%r84,%r92; @ %r347 bra $L33; .loc 1 187 13 mov.u64 %r138,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r348,[%r179]; setp.eq.u64 %r349,%r348,0; @ ! %r349 bra $L11; bra $L34; $L_gfortran_mminval_i2 .visible .func _gfortran_mminval_608u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .pred %r392; .reg .pred %r393; .reg .u64u64 %r421; .reg .u64predmov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 237 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $L49gfortran_minva.loc 1 244 7 bra $L48; $L49: .loc 1 247 10 ld.u64 %r22,[%r211]; .loc 1 247 7 add.u64 %r136,%r22,-1; .loc 1 248 10 ld.s8 %r23,[%r210+28]; .loc 1 248 38 add.u32 %r217,%r23,-1; .loc 1 248 8 cvt.s64.s32 %r137,%r217; .loc 1 251 7 shr.u64 %r219,%r136,63; set.u32.gt.s64 %r221,%r136,%r137; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 251 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L51; .loc 1 253 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error1: .loc 1 258 9 add.u64 %r435,%r136,%r136; add.u64 %r436,%r435,%r136; shl.b64 %r237,%r436,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 258 7 ld.u64 %r248,[%r238+48]; sub.u64 %r138,%r240,%r248; .loc 1 259 6 setp.le.s64 %r249,%r138,0; @ %r249 bra $L48; .loc 1 262 9 ld.u64 %r151,[%r212]; .loc 1 264 15 ld.u64 %r34,[%r212+16]; .loc 1 266 22 cvt.u32.u64 %r35,%r34; .loc 1 266 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 266 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 266 6 cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L53; .loc 1 268 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L54; $L53: .loc 1 275 9 shl.b64 %r270,%r436,3; add.u64 %r271,%r210,%r270; ld.u64 %r141,[%r271+40]; .loc 1 276 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 278 3 setp.ne.u64 %r279,%r136,0; @ %r279 bra $L55; $L61: .loc 1 288 3 setp.lt.s64 %r280,%r136,%r137; @ %r280 bra $L56; bra $L105; $L54: .loc 1 273 5 cvta.const.u64 %r281281_gfortran_runtime_error5: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; add.u64 %r285,%r22,%r22; add.u64 %r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 278 3 mov.u64 %r91,0; add.u64 %r433,%frame,240; add.u64 
%r432,%frame,360; .loc 1 285 12 mov.u64 %r447,%r91; $L60: .loc 1 280 18 add.u64 %r289,%r433,%r91; ld.u64 %r290,[%r40]; st.u64 [%r289],%r290; .loc 1 281 18 add.u64 %r291,%frame,%r91; .loc 1 281 20 ld.u64 %r293,[%r179]; mul.lo.u64 %r292,%r293,%r34; .loc 1 281 18 st.u64 [%r291],%r292; .loc 1 282 19 ld.u64 %r295,[%r40+16]; add.u64 %r294,%r295,1; ld.u64 %r296,[%r40+8]; sub.u64 %r54,%r294,%r296; .loc 1 284 10 setp.lt.s64 %r297,%r54,0; @ %r297 bra $L58; .loc 1 282 17 add.u64 %r299,%r432,%r91; st.u64 [%r299],%r54; bra $L59; $L58: .loc 1 285 12 add.u64 %r301,%r432,%r91; st.u64 [%r301],%r447; $L59: .loc 1 278 3 add.u64 %r40,%r40,24; add.u64 %r179,%r179,24; add.u64 %r91,%r91,8; setp.ne.u64 %r303,%r40,%r206; @ %r303 bra $L60; bra $L61; $L105: .loc 1 298 6 ld.u64 %r304,[%r209]; setp.eq.u64 %r305,%r304,0; @ ! %r305 bra $L63; bra $L62; $L56: add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r126,%r309,40; add.u64 %r132,%r210,%r126; add.u64 %r124,%r212,%r126; shl.b64 %r310,%r22,3; add.u64 %r102,%r310,-8; cvt.u32.u32 %r312,%r23; cvt.s64.s8 %r311,%r312; shl.b64 %r313,%r311,3; add.u64 %r68,%r313,-8; add.u64 %r433,%frame,240; add.u64 %r432,%frame,360; .loc 1 295 12 mov.u64 %r446,0; $L66: .loc 1 290 18 add.u64 %r315,%r433,%r102; ld.u64 %r316,[%r132]; st.u64 [%r315],%r316; .loc 1 291 18 add.u64 %r317,%frame,%r102; .loc 1 291 20 ld.u64 %r319,[%r124]; mul.lo.u64 %r318,%r319,%r34; .loc 1 291 18 st.u64 [%r317],%r318; .loc 1 292 19 ld.u64 %r321,[%r132+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r132+8]; sub.u64 %r65,%r320,%r322; .loc 1 294 10 setp.lt.s64 %r323,%r65,0; @ %r323 bra $L64; .loc 1 292 17 add.u64 %r325,%r432,%r102; st.u64 [%r325],%r65; bra $L65; $L64: .loc 1 295 12 add.u64 %r327,%r432,%r102; st.u64 [%r327],%r446; $L65: .loc 1 288 3 add.u64 %r132,%r132,24; add.u64 %r124,%r124,24; add.u64 %r102,%r102,8; setp.ne.u64 %r329,%r68,%r102; @ %r329 bra $L66; bra $L106; $L62: .loc 1 302 7 setp.eq.u64 %r330,%r137,0; @ %r330 bra $L68; add.u64 %r432,%frame,360; $L86: add.u64 %r172,%r209,40; mov.u64 %r164,%r432; cvt.u32.u32 %r332,%r23; cvt.s64.s8 %r331,%r332; add.u64 %r334,%r331,%r331; add.u64 %r335,%r334,%r331; shl.b64 %r336,%r335,3; add.u64 %r337,%r209,16; add.u64 %r133,%r336,%r337; .loc 1 305 10 mov.u64 %r128,1; .loc 1 309 4 mov.u64 %r356,0; bra $L69; $L68: .loc 1 313 20 add.u32 %r338,%r23,-2; cvt.s64.s32 %r74,%r338; add.u64 %r340,%r74,%r74; add.u64 %r341,%r340,%r74; shl.b64 %r342,%r341,3; add.u64 %r343,%r209,%r342; .loc 1 313 67 shl.b64 %r345,%r74,3; add.u64 %r346,%frame,%r345; .loc 1 313 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r145,%r348,%r349; .loc 1 315 24 mov.u64 %r350,0; st.u64 [%r209+8],%r350; .loc 1 316 28 cvt.u16.u32 %r353,%r23; add.u16 %r352,%r353,-1; cvt.u32.u16 %r354,%r352; st.u8 [%r209+28],%r354; .loc 1 318 10 setp.eq.u64 %r355,%r145,0; @ %r355 bra $L70; bra $L107; $L72: .loc 1 307 47 mul.lo.u64 %r128,%r71,%r128; $L69: .loc 1 309 4 st.u64 [%r172+8],%r356; ld.u64 %r71,[%r164]; add.u64 %r357,%r71,-1; st.u64 [%r172+16],%r357; st.u64 [%r172],%r128; .loc 1 302 7 add.u64 %r172,%r172,24; add.u64 %r164,%r164,8; setp.ne.u64 %r358,%r133,%r172; @ %r358 bra $L72; bra $L68; $L70: .loc 1 321 4 st.u64 [%r209+48],%r145; mov.u64 %r360,-1; st.u64 [%r209+56],%r360; mov.u64 %r361,1; st.u64 [%r209+40],%r361; .loc 1 322 4 bra $L48; $L107: .loc 1 325 24 mov.u64 %r363,363; call (%value_in),_gfortrani_xmallocarray64,[%value_in]; } .loc 1 325 22 st.u64 [%r209],%r364; bra $L73; $L63: .loc 1 330 19 ld.s8 %r366,[%r209+28]; .loc 1 330 10 setp.eq.u64 %r367,%r366,%r137; @ %r367 
bra $L74; .loc 1 331 2 cvta.const.u64 %r36836_gfortran_runtime_error74: .loc 1 333 11 cvta.global.u64 %r370,_gfortrani_compile_options; .loc 1 333 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L73; .loc 1 335 4 add.u64 %r432,%frame,360; cvta.const.u64 %r376,$LC2r437376; call _gfortrani_bounds_ifunction_return337 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L73: .loc 1 342 3 setp.ne.u64 %r382,%r137,0; @ %r382 bra $L75; $L78: .loc 1 350 8 ld.u64 %r152,[%r209]; .loc 1 351 8 ld.u64 %r150,[%r210]; .loc 1 353 9 setp.ne.u64 %r383,%r150,0; @ %r383 bra $L76; bra $L48; $L75: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 342 3 mov.u64 %r157,0; add.u64 %r432,%frame,360; .loc 1 344 16 mov.u64 %r386,%r157; $L77: st.u64 [%r190],%r386; .loc 1 345 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 346 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r432,%r389; .loc 1 346 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L48; .loc 1 342 26 add.u64 %r157,%r157,1; .loc 1 342 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L77; bra $L78; $L76: .loc 1 276 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 370 32 add.u64 %r90,%r141,%r141; ld.u64 %r39,[%frame+480]; .loc 1 401 22 ld.u64 %r93,[%frame+240]; .loc 1 401 12 add.u64 %r94,%r93,%r93; .loc 1 402 23 ld.u64 %r96,[%frame]; .loc 1 403 22 ld.u64 %r98,[%frame+120]; .loc 1 403 12 add.u64 %r99,%r98,%r98; .loc 1 405 32 ld.u64 %r84,[%frame+360]; .loc 1 412 23 mul.lo.u64 %r38,%r84,%r93; .loc 1 413 24 mul.lo.u64 %r397,%r84,%r96; .loc 1 413 10 neg.s64 %r163,%r397; .loc 1 414 23 mul.lo.u64 %r398,%r84,%r98; .loc 1 414 9 add.u64 %r399,%r398,%r398; neg.s64 %r165,%r399; setp.le.s64 %r437,%r137,1; .loc 1 365 9 mov.u32 %r439,32767; cvt.u32.u32 %r440,%r23; cvt.s64.s8 %r441,%r440; add.u64 %r442,%r441,-1; add.u64 %r443,%frame,240; add.u64 %r444,%frame,360; add.u64 %r445,%frame,120; $L87: .loc 1 342 3 mov.u64 %r156,%r151; mov.u64 %r155,%r150; .loc 1 365 9 mov.u32 %r129,%r439; .loc 1 370 9 mov.u64 %r154,0; $L80: .loc 1 394 6 ld.s8 %r401,[%r156]; cvt.u16.u32 %r400,%r401; setp.eq.u16 %r402,%r400,0; @ %r402 bra $L79; ld.u16 %r404,[%r155]; cvt.u16.u32 %r405,%r129; min.s16 %r403,%r404,%r405; cvt.s32.s16 %r129,%r403; $L79: .loc 1 370 24 add.u64 %r154,%r154,1; .loc 1 370 32 add.u64 %r155,%r155,%r90; .loc 1 370 47 add.u64 %r156,%r156,%r45; .loc 1 370 2 setp.ne.u64 %r406,%r138,%r154; @ %r406 bra $L80; .loc 1 397 8 cvt.u16.u32 %r407,%r129; st.u16 [%r152],%r407; .loc 1 400 15 add.u64 %r39,%r39,1; .loc 1 401 12 add.u64 %r150,%r150,%r94; .loc 1 402 13 add.u64 %r151,%r151,%r96; .loc 1 403 12 add.u64 %r152,%r152,%r99; .loc 1 405 13 setp.ne.u64 %r408,%r84,%r39; @ %r408 bra $L87; .loc 1 413 10 add.u64 %r147,%r151,%r163; .loc 1 414 9 add.u64 %r148,%r152,%r165; .loc 1 416 7 @ ! 
%r437 bra $L108; bra $L48; $L85: .loc 1 409 13 st.u64 [%r41],%r438; .loc 1 412 23 mul.lo.u64 %r101,%r113,%r112; .loc 1 413 24 mul.lo.u64 %r411,%r115,%r112; .loc 1 413 10 sub.u64 %r147,%r151,%r411; .loc 1 414 23 mul.lo.u64 %r412,%r117,%r112; .loc 1 414 9 add.u64 %r413,%r412,%r412; sub.u64 %r148,%r152,%r413; .loc 1 415 5 add.u64 %r149,%r149,1; .loc 1 416 7 add.u64 %r41,%r41,8; add.u64 %r166,%r166,8; setp.ne.u64 %r414,%r149,%r442; @ %r414 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 412 23 mov.u64 %r101,%r38; .loc 1 416 7 mov.u64 %r166,8; .loc 1 415 5 mov.u64 %r149,1; .loc 1 409 13 mov.u64 %r438,0; $L84: .loc 1 424 16 ld.u64 %r418,[%r41]; add.u64 %r112,%r418,1; st.u64 [%r41],%r112; .loc 1 425 23 add.u64 %r420,%r443,%r166; ld.u64 %r113,[%r420]; .loc 1 425 13 sub.u64 %r421,%r113,%r101; add.u64 %r422,%r421,%r421; add.u64 %r150,%r150,%r422; .loc 1 426 24 add.u64 %r423,%frame,%r166; ld.u64 %r115,[%r423]; .loc 1 426 14 add.u64 %r151,%r147,%r115; .loc 1 427 23 add.u64 %r425,%r445,%r166; ld.u64 %r117,[%r425]; .loc 1 427 13 add.u64 %r426,%r117,%r117; add.u64 %r152,%r148,%r426; .loc 1 405 32 add.u64 %r428,%r444,%r166; ld.u64 %r121,[%r428]; .loc 1 405 13 setp.eq.u64 %r429,%r112,%r121; @ %r429 bra $L85; .loc 1 409 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 298 6 ld.u64 %r430,[%r209]; setp.eq.u64 %r431,%r430,0; @ ! %r431 bra $L63; bra $L86; $L48:_gfortran_sminval_i2 .visible .func _gfortran_sminval_68u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64predpred %r292; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 454 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 454 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_minva.loc 1 461 7 bra $L109; $L111: .loc 1 464 10 ld.u64 %r23,[%r152]; .loc 1 464 7 add.u64 %r88,%r23,-1; .loc 1 465 10 ld.s8 %r24,[%r151+28]; .loc 1 465 38 add.u32 %r26,%r24,-1; .loc 1 465 8 cvt.s64.s32 %r295,%r26; .loc 1 467 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r295; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 467 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 474 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 482 3 setp.lt.s64 %r173,%r88,%r295; @ %r173 bra $L115; bra $L168; $L113: .loc 1 469 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 479 12 mov.u64 %r301,0; $L119: .loc 1 476 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 478 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 476 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 479 12 st.u64 [%r139],%r301; $L118: .loc 1 474 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r294,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r294,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 488 12 mov.u64 %r300,0; $L125: .loc 1 485 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 487 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 484 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 488 12 st.u64 [%r43],%r300; $L124: .loc 1 482 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 495 7 setp.eq.u64 %r211,%r295,0; @ %r211 bra $L127; add.u64 %r294,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r294; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 498 10 mov.u64 %r81,1; .loc 1 502 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 506 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 507 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 509 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 509 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 509 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 511 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 500 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 502 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 495 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 514 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 515 4 bra $L109; $L170: .loc 1 518 24 mov.u64 %r244,4; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 518 22 st.u64 [%r150],%r98; $L135: .loc 1 544 3 setp.ne.u64 %r247,%r295,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 522 19 ld.s8 %r55,[%r150+28]; .loc 1 522 10 setp.eq.u64 %r248,%r55,%r295; @ %r248 bra $L134; .loc 1 523 2 st.u64 [%stack+8],%r295; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 528 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 528 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 530 4 setp.eq.u64 %r254,%r295,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 530 10 mov.u64 %r104,0; $L137: .loc 1 534 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 534 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 535 18 ld.u64 %r63,[%r31]; .loc 1 530 25 add.u64 %r104,%r104,1; .loc 1 535 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 536 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 530 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: .loc 
1 546 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r295,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 547 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 544 26 add.u64 %r100,%r100,1; .loc 1 544 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r295,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 556 22 ld.u64 %r68,[%frame]; .loc 1 556 12 add.u64 %r69,%r68,%r68; .loc 1 558 32 ld.u64 %r59,[%frame+120]; .loc 1 565 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r293,%r295,1; .loc 1 554 13 mov.u16 %r279,32767; cvt.u32.u32 %r297,%r24; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; $L146: st.u16 [%r98],%r279; .loc 1 555 15 add.u64 %r80,%r80,1; .loc 1 556 12 add.u64 %r98,%r98,%r69; .loc 1 558 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L146; .loc 1 567 7 @ ! %r293 bra $L171; bra $L109; $L145: .loc 1 562 13 st.u64 [%r99],%r296; .loc 1 565 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 566 5 add.u64 %r97,%r97,1; .loc 1 567 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r299; @ %r283 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 565 23 mov.u64 %r72,%r44; .loc 1 566 5 mov.u64 %r97,1; .loc 1 562 13 mov.u64 %r296,0; $L144: .loc 1 571 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 572 23 ld.u64 %r75,[%r124]; .loc 1 572 13 sub.u64 %r289,%r75,%r72; add.u64 %r290,%r289,%r289; add.u64 %r98,%r98,%r290; .loc 1 558 32 ld.u64 %r77,[%r123]; .loc 1 558 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L145; .loc 1 562 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r292,%r98,0; @ ! 
%r292 bra $L122; bra $L147; $L109: .loc 1 576 1 ret; } minval_i4.o/_gfortran_minval_i4 .visible .func _gfortran_minval_fortran/generated/minval_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminval_i4 .visible .func _gfortran_mminval_i4gfortran_sminval_i4 .visible .func _gfortran_sminval_i4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,86,65,76114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,86,65,76,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,86,65,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_minval_i4 .visible .func _gfortran_minval_48099pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64u64predpredu64u64predmov.u64 %r179,%ar0; mov.u64 %r180,%ar1; mov.u64 %r181,%ar2; .loc 1 55 10 ld.s8 %r22,[%r180+28]; .loc 1 55 38 add.u32 %r182,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r182; .loc 1 56 10 ld.u64 %r25,[%r181]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r184,%r104,63; set.u32.lt.s64 %r186,%r103,%r104; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 58 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 60 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r104,%r104; add.u64 %r201,%r200,%r104; shl.b64 %r202,%r201,3; add.u64 %r203,%r180,%r202; ld.u64 %r30,[%r203+56]; ld.u64 %r32,[%r203+48]; .loc 1 68 9 ld.u64 %r107,[%r203+40]; .loc 1 70 3 setp.ne.u64 %r217,%r104,0; @ %r217 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r218,%r103,%r104; @ %r218 bra $L4; bra $L46; $L3: add.u64 %r78,%r180,40; add.u64 %r59,%frame,120; add.u64 %r91,%frame,240; add.u64 %r219,%r180,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r174,%r219,%r223; .loc 1 76 12 mov.u64 %r357,0; $L8: .loc 1 72 18 ld.u64 %r224,[%r78]; st.u64 [%r59],%r224; .loc 1 73 19 ld.u64 %r226,[%r78+16]; add.u64 %r225,%r226,1; ld.u64 %r227,[%r78+8]; sub.u64 %r37,%r225,%r227; .loc 1 75 10 setp.lt.s64 %r228,%r37,0; @ %r228 bra $L6; .loc 1 73 17 st.u64 [%r91],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r91],%r357; $L7: .loc 1 70 3 add.u64 %r78,%r78,24; 
add.u64 %r59,%r59,8; add.u64 %r91,%r91,8; setp.ne.u64 %r230,%r78,%r174; @ %r230 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r231,[%r179]; setp.eq.u64 %r232,%r231,0; @ ! %r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r143,%r180,%r237; shl.b64 %r238,%r25,3; add.u64 %r125,%r238,-8; add.u64 %r344,%frame,120; add.u64 %r128,%r344,%r125; add.u64 %r345,%frame,240; add.u64 %r121,%r345,%r125; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r180,40; add.u64 %r95,%r246,%r247; .loc 1 84 12 mov.u64 %r356,0; $L14: .loc 1 80 18 ld.u64 %r248,[%r143]; st.u64 [%r128],%r248; .loc 1 81 19 ld.u64 %r250,[%r143+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r143+8]; sub.u64 %r45,%r249,%r251; .loc 1 83 10 setp.lt.s64 %r252,%r45,0; @ %r252 bra $L12; .loc 1 81 17 st.u64 [%r121],%r45; bra $L13; $L12: .loc 1 84 12 st.u64 [%r121],%r356; $L13: .loc 1 78 3 add.u64 %r143,%r143,24; add.u64 %r128,%r128,8; add.u64 %r121,%r121,8; setp.ne.u64 %r254,%r95,%r143; @ %r254 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r255,%r103,0; @ %r255 bra $L16; add.u64 %r345,%frame,240; $L34: add.u64 %r152,%r179,40; mov.u64 %r150,%r345; add.u64 %r256,%r179,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r144,%r256,%r262; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r263,0; st.u64 [%r179+8],%r263; .loc 1 103 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r179+28],%r267; .loc 1 105 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r52,%r268; add.u64 %r270,%r52,%r52; add.u64 %r271,%r270,%r52; shl.b64 %r272,%r271,3; add.u64 %r273,%r179,%r272; .loc 1 105 67 shl.b64 %r275,%r52,3; add.u64 %r276,%frame,%r275; .loc 1 105 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r109,%r278,%r279; .loc 1 107 29281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 107 27 st.u64 [%r179],%r282; .loc 1 108 10 setp.eq.u64 %r284,%r109,0; @ ! 
[minval_i4.o, NVPTX listing (continued): bodies of _gfortran_minval_i4, _gfortran_mminval_i4 and _gfortran_sminval_i4. The inner loops scan the reduced dimension with min.s32, starting from the identity value 2147483647 (INT32_MAX); the masked variant guards each candidate with an ld.s8 load of the logical mask before the minimum update. The surrounding code sets up the array descriptor strides and extents, allocates the result array through _gfortrani_xmallocarray, and reports bad arguments through _gfortran_runtime_error, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents.]
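As an orientation aid, the i4 reduction condensed above amounts to a per-column minimum scan seeded with INT32_MAX. The C sketch below is illustrative only and assumes a flat array with a simple element stride; the function name minval_i4_1d is invented here, and the generated minval_i4.c in libgfortran actually walks full array descriptors with per-dimension strides and extents.

#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Minimum of `len` elements spaced `stride` elements apart,
 * mirroring the min.s32 loop in the listing above. */
int32_t minval_i4_1d(const int32_t *src, ptrdiff_t stride, ptrdiff_t len)
{
    int32_t result = INT32_MAX;            /* identity value, 2147483647 */
    for (ptrdiff_t n = 0; n < len; n++)
        if (src[n * stride] < result)
            result = src[n * stride];
    return result;                          /* stays INT32_MAX for an empty extent */
}

int main(void)
{
    int32_t a[] = { 7, 3, 9, 3, 12 };
    printf("%d\n", minval_i4_1d(a, 1, 5));  /* prints 3 */
    return 0;
}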
[End of _gfortran_sminval_i4, then minval_i8.o (source file reference ending in "fortran/generated/minval_i8.c"): declarations of _gfortran_minval_i8, _gfortran_mminval_i8 and _gfortran_sminval_i8; extern declarations of _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents; and ASCII-encoded string constants holding the MINVAL runtime messages (out-of-range DIM argument, incorrect rank or extent of the return array, "Funny sized logical array", and the MASK / return-value argument labels). The i8 function bodies mirror the i4 ones with 64-bit arithmetic: min.s64 reduction loops seeded with 9223372036854775807 (INT64_MAX), and an ld.s8 mask guard in _gfortran_mminval_i8.]
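The masked entry point (_gfortran_mminval_i8 above) runs the same scan but consults a byte-sized logical mask before each candidate, as the ld.s8 guard around min.s64 suggests. A minimal sketch under the same simplifying assumptions as before (hypothetical name mminval_i8_1d, flat arrays standing in for the real descriptor handling):

#include <stddef.h>
#include <stdint.h>

/* 64-bit masked minimum: elements whose mask byte is zero are skipped. */
int64_t mminval_i8_1d(const int64_t *src, ptrdiff_t src_stride,
                      const signed char *mask, ptrdiff_t mask_stride,
                      ptrdiff_t len)
{
    int64_t result = INT64_MAX;          /* 9223372036854775807, as in the listing */
    for (ptrdiff_t n = 0; n < len; n++)
    {
        if (mask[n * mask_stride])       /* logical mask, read one byte per element here */
        {
            int64_t v = src[n * src_stride];
            if (v < result)
                result = v;
        }
    }
    return result;
}

If every mask byte is zero, the identity INT64_MAX is returned, matching the path in the listing that stores the identity directly when the scan selects nothing.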
[End of _gfortran_sminval_i8, then minval_i16.o (source file reference ending in "fortran/generated/minval_i16.c"): declarations of _gfortran_minval_i16, _gfortran_mminval_i16 and _gfortran_sminval_i16, the same extern declarations and MINVAL error strings as the i4/i8 objects, and the bodies of _gfortran_minval_i16 and _gfortran_mminval_i16. Because the element type is 128 bits wide, the running minimum is kept as a pair of 64-bit words initialised to -1 (low) and 9223372036854775807 (high); each candidate is compared with a setp.gt.s64 test on the high words followed by a setp.gt.u64 test on the low words, and the masked variant adds the usual ld.s8 mask guard.]
%r436 bra $L65; bra $L90; $L50:_gfortran_sminval_i16 .visible .func _gfortran_sminval_i1668u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64u64 %r279; .reg .pred %r280; .reg .predpred %r292; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 454 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L114; .loc 1 454 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L115; $L114:r152; call _gfortran_minval_i16.loc 1 461 7 bra $L113; $L115: .loc 1 464 10 ld.u64 %r23,[%r152]; .loc 1 464 7 add.u64 %r88,%r23,-1; .loc 1 465 10 ld.s8 %r24,[%r151+28]; .loc 1 465 38 add.u32 %r26,%r24,-1; .loc 1 465 8 cvt.s64.s32 %r295,%r26; .loc 1 467 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r295; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 467 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L117; .loc 1 474 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L118; $L124: .loc 1 482 3 setp.lt.s64 %r173,%r88,%r295; @ %r173 bra $L119; bra $L172; $L117: .loc 1 469 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error118: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 479 12 mov.u64 %r301,0; $L123: .loc 1 476 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 478 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L121; .loc 1 476 17 st.u64 [%r139],%r36; bra $L122; $L121: .loc 1 479 12 st.u64 [%r139],%r301; $L122: .loc 1 474 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L123; bra $L124; $L172: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L126; bra $L125; $L119: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r294,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r294,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 488 12 mov.u64 %r300,0; $L129: .loc 1 485 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 487 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L127; .loc 1 484 17 st.u64 [%r43],%r41; bra $L128; $L127: .loc 1 488 12 st.u64 [%r43],%r300; $L128: .loc 1 482 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L129; bra $L173; $L125: .loc 1 495 7 setp.eq.u64 %r211,%r295,0; @ %r211 bra $L131; add.u64 %r294,%frame,120; $L151: add.u64 %r111,%r150,40; mov.u64 %r107,%r294; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 498 10 mov.u64 %r81,1; .loc 1 502 4 mov.u64 %r237,0; bra $L132; $L131: .loc 1 506 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 507 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 509 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 509 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 509 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 511 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L133; bra $L174; $L135: .loc 1 500 48 mul.lo.u64 %r81,%r46,%r81; $L132: .loc 1 502 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 495 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L135; bra $L131; $L133: .loc 1 514 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 515 4 bra $L113; $L174: .loc 1 518 24 mov.u64 %r244,1696244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 518 22 st.u64 [%r150],%r98; $L139: .loc 1 544 3 setp.ne.u64 %r247,%r295,0; @ %r247 bra $L136; bra $L137; $L126: .loc 1 522 19 ld.s8 %r55,[%r150+28]; .loc 1 522 10 setp.eq.u64 %r248,%r55,%r295; @ %r248 bra $L138; .loc 1 523 2 st.u64 [%stack+8],%r295; st.u64 [%stack],%r55;_gfortran_runtime_error138: .loc 1 528 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 528 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L139; .loc 1 530 4 setp.eq.u64 %r254,%r295,0; @ %r254 bra $L137; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 530 10 mov.u64 %r104,0; $L141: .loc 1 534 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 534 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 535 18 ld.u64 %r63,[%r31]; .loc 1 530 25 add.u64 %r104,%r104,1; .loc 1 535 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L140; .loc 1 536 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error140: .loc 1 530 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L141; 
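// What follows is the tail of _gfortran_sminval_i16, the MINVAL variant that
// takes a scalar MASK: when the mask is true it simply calls
// _gfortran_minval_i16 (the call is visible near $L114 above), and when it is
// false the $L150 loop fills every result element with HUGE(INTEGER(16)),
// stored as two 64-bit halves -- low = -1 (all ones) and
// high = 9223372036854775807 (0x7fffffffffffffff).  The .loc directives refer
// to the generated libgfortran source for this object (minval_i16.c, by
// analogy with the minval_r4.c / minval_r8.c file references further below).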
$L136: .loc 1 546 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L142; shl.b64 %r264,%r295,3; bra $L143; $L142: mov.u64 %r264,8; $L143: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L144: .loc 1 547 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 544 26 add.u64 %r100,%r100,1; .loc 1 544 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r295,%r100; @ %r277 bra $L144; $L137: ld.u64 %r80,[%frame+240]; .loc 1 556 22 ld.u64 %r68,[%frame]; .loc 1 556 12 shl.b64 %r69,%r68,4; .loc 1 558 32 ld.u64 %r59,[%frame+120]; .loc 1 565 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r293,%r295,1; .loc 1 554 13 mov.u64 %r278,-1; cvt.u32.u32 %r297,%r24; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; $L150: st.u64 [%r98],%r278; mov.u64 %r279,9223372036854775807; st.u64 [%r98+8],%r279; .loc 1 555 15 add.u64 %r80,%r80,1; .loc 1 556 12 add.u64 %r98,%r98,%r69; .loc 1 558 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L150; .loc 1 567 7 @ ! %r293 bra $L175; bra $L113; $L149: .loc 1 562 13 st.u64 [%r99],%r296; .loc 1 565 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 566 5 add.u64 %r97,%r97,1; .loc 1 567 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r299; @ %r283 bra $L148; bra $L113; $L175: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 565 23 mov.u64 %r72,%r44; .loc 1 566 5 mov.u64 %r97,1; .loc 1 562 13 mov.u64 %r296,0; $L148: .loc 1 571 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 572 23 ld.u64 %r75,[%r124]; .loc 1 572 13 sub.u64 %r289,%r75,%r72; shl.b64 %r290,%r289,4; add.u64 %r98,%r98,%r290; .loc 1 558 32 ld.u64 %r77,[%r123]; .loc 1 558 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L149; .loc 1 562 13 mov.u64 %r80,0; bra $L150; $L173: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r292,%r98,0; @ ! 
%r292 bra $L126; bra $L151; $L113: .loc 1 576 1 ret; } minval_r4.o/_gfortran_minval_r4 .visible .func _gfortran_minval_r4fortran/generated/minval_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminval_r4 .visible .func _gfortran_mminval_r4gfortran_sminval_r4 .visible .func _gfortran_sminval_r4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,86,65,76114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,86,65,76,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,86,65,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_minval_r4 .visible .func _gfortran_minval_r4480u64 %r56; .reg .u64 %r62; .reg .u64 %r65; .reg .u64u64u32 %r194; .reg .u16 %r195; .reg .predu64 %r227; .reg .u64 %r228; .reg .u64 %r229; .reg .predpredu32u64 %r250; .reg .u64 %r251; .reg .u64 %r252; .reg .u64 %r253; .reg .predu32u16u64pred %r319; .reg .f32 %r321; .reg .pred %r322; .reg .pred %r323; .reg .pred %r324; .reg .pred %r325; .reg .pred %r326; .reg .u64 %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64u64 %r357; .reg .u64 %r358; .reg .u64 %r360; .reg .u64 %r361; mov.u64 %r181,%ar0; mov.u64 %r182,%ar1; mov.u64 %r183,%ar2; .loc 1 55 10 ld.s8 %r22,[%r182+28]; .loc 1 55 38 add.u32 %r184,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r109,%r184; .loc 1 56 10 ld.u64 %r25,[%r183]; .loc 1 56 7 add.u64 %r110,%r25,-1; .loc 1 58 7 shr.u64 %r186,%r110,63; set.u32.lt.s64 %r188,%r109,%r110; neg.s32 %r189,%r188; cvt.u16.u64 %r192,%r186; cvt.u16.u32 %r193,%r189; or.b16 %r191,%r192,%r193; .loc 1 58 6 cvt.u32.u16 %r194,%r191; cvt.u16.u8 %r195,%r194; setp.eq.u16 %r196,%r195,0; @ %r196 bra $L2; .loc 1 60 7 cvt.u32.u32 %r199,%r22; cvt.s64.s8 %r198,%r199; st.u64 [%stack+8],%r19819197_gfortran_runtime_errorr202,%r110,%r110; add.u64 %r203,%r202,%r110; shl.b64 %r204,%r203,3; add.u64 %r205,%r182,%r204; ld.u64 %r31,[%r205+56]; ld.u64 %r33,[%r205+48]; .loc 1 68 9 ld.u64 %r113,[%r205+40]; .loc 1 70 3 setp.ne.u64 %r219,%r110,0; @ %r219 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r220,%r109,%r110; @ %r220 bra $L4; bra $L52; $L3: add.u64 %r29,%r182,40; add.u64 %r126,%frame,120; add.u64 %r180,%frame,240; add.u64 %r221,%r182,16; add.u64 %r223,%r25,%r25; add.u64 %r224,%r223,%r25; shl.b64 %r225,%r224,3; add.u64 %r71,%r221,%r225; .loc 1 76 12 mov.u64 
%r361,0; $L8: .loc 1 72 18 ld.u64 %r226,[%r29]; st.u64 [%r126],%r226; .loc 1 73 19 ld.u64 %r228,[%r29+16]; add.u64 %r227,%r228,1; ld.u64 %r229,[%r29+8]; sub.u64 %r38,%r227,%r229; .loc 1 75 10 setp.lt.s64 %r230,%r38,0; @ %r230 bra $L6; .loc 1 73 17 st.u64 [%r180],%r38; bra $L7; $L6: .loc 1 76 12 st.u64 [%r180],%r361; $L7: .loc 1 70 3 add.u64 %r29,%r29,24; add.u64 %r126,%r126,8; add.u64 %r180,%r180,8; setp.ne.u64 %r232,%r71,%r29; @ %r232 bra $L8; bra $L9; $L52: .loc 1 87 6 ld.u64 %r233,[%r181]; setp.eq.u64 %r234,%r233,0; @ ! %r234 bra $L11; bra $L10; $L4: add.u64 %r236,%r25,%r25; add.u64 %r237,%r236,%r25238,40; add.u64 %r144,%r182,%r239; shl.b64 %r240,%r25,3; add.u64 %r107,%r240,-8; add.u64 %r349,%frame,120; add.u64 %r128,%r349,%r107; add.u64 %r348,%frame,240; add.u64 %r105,%r348,%r107; cvt.u32.u32 %r244,%r22; cvt.s64.s8 %r243,%r244; add.u64 %r246,%r243,%r243; add.u64 %r247,%r246,%r243; shl.b64 %r248,%r247,3; add.u64 %r249,%r182,40; add.u64 %r65,%r248,%r249; .loc 1 84 12 mov.u64 %r360,0; $L14: .loc 1 80 18 ld.u64 %r250,[%r144]; st.u64 [%r128],%r250; .loc 1 81 19 ld.u64 %r252,[%r144+16]; add.u64 %r251,%r252,1; ld.u64 %r253,[%r144+8]; sub.u64 %r45,%r251,%r253; .loc 1 83 10 setp.lt.s64 %r254,%r45,0; @ %r254 bra $L12; .loc 1 81 17 st.u64 [%r105],%r45; bra $L13; $L12: .loc 1 84 12 st.u64 [%r105],%r360; $L13: .loc 1 78 3 add.u64 %r144,%r144,24; add.u64 %r128,%r128,8; add.u64 %r105,%r105,8; setp.ne.u64 %r256,%r65,%r144; @ %r256 bra $L14; bra $L53; $L10: .loc 1 91 7 setp.le.s64 %r257,%r109,0; @ %r257 bra $L16; add.u64 %r348,%frame,240; $L38: add.u64 %r153,%r181,40; mov.u64 %r151,%r348; add.u64 %r258,%r181,16; cvt.u32.u32 %r260,%r22; cvt.s64.s8 %r259,%r260; add.u64 %r262,%r259,%r259; add.u64 %r263,%r262,%r259; shl.b64 %r264,%r263,3; add.u64 %r145,%r258,%r264; .loc 1 94 10 mov.u64 %r103,1; .loc 1 98 4 mov.u64 %r287,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r265,0; st.u64 [%r181+8],%r265; .loc 1 103 28 cvt.u16.u32 %r268,%r22; add.u16 %r267,%r268,-1; cvt.u32.u16 %r269,%r267; st.u8 [%r181+28],%r269; .loc 1 105 20 add.u32 %r270,%r22,-2; cvt.s64.s32 %r56,%r270; add.u64 %r272,%r56,%r56; add.u64 %r273,%r272,%r56; shl.b64 %r274,%r273,3; add.u64 %r275,%r181,%r274; .loc 1 105 67 shl.b64 %r277,%r56,3; add.u64 %r278,%frame,%r277; .loc 1 105 59 ld.u64 %r280,[%r275+40]; ld.u64 %r281,[%r278+240]; mul.lo.u64 %r115,%r280,%r281; .loc 1 107 29283; call (%value_in),_gfortrani_xmallocarray284,[%value_in]; } .loc 1 107 27 st.u64 [%r181],%r284; .loc 1 108 10 setp.eq.u64 %r286,%r115,0; @ ! 
%r286 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r103,%r52,%r103; $L17: .loc 1 98 4 st.u64 [%r153+8],%r287; ld.u64 %r52,[%r151]; add.u64 %r288,%r52,-1; st.u64 [%r153+16],%r288; st.u64 [%r153],%r103; .loc 1 91 7 add.u64 %r153,%r153,24; add.u64 %r151,%r151,8; setp.ne.u64 %r289,%r145,%r153; @ %r289 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r181+48],%r115; mov.u64 %r291,-1; st.u64 [%r181+56],%r291; mov.u64 %r292,1; st.u64 [%r181+40],%r292; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r62,[%r181+28]; .loc 1 118 10 setp.eq.u64 %r293,%r62,%r109; @ %r293 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r109; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r296,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r297,[%r296+36]; setp.eq.u32 %r298,%r297,0; @ %r298 bra $L21; .loc 1 125 2 add.u64 %r348,%frame,240; cvta.const.u64 %r302,$LC2r3430302; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r304,%r109,0; @ %r304 bra $L24; $L27: .loc 1 65 9 add.u64 %r305,%r31,1; .loc 1 65 7 sub.u64 %r111,%r305,%r33; max.s64 %r46,%r111,0; .loc 1 137 8 ld.u64 %r122,[%r182]; .loc 1 138 8 ld.u64 %r123,[%r181]; .loc 1 158 36 shl.b64 %r69,%r113,2; ld.u64 %r137,[%frame+360]; .loc 1 180 22 ld.u64 %r73,[%frame+120]; .loc 1 180 12 shl.b64 %r74,%r73,2; .loc 1 181 22 ld.u64 %r76,[%frame]; .loc 1 181 12 shl.b64 %r77,%r76,2; .loc 1 183 32 ld.u64 %r142,[%frame+240]; .loc 1 190 23 mul.lo.u64 %r306,%r73,%r142; .loc 1 190 9 shl.b64 %r307,%r306,2; neg.s64 %r143,%r307; .loc 1 191 23 mul.lo.u64 %r308,%r76,%r142; .loc 1 191 9 shl.b64 %r309,%r308,2; neg.s64 %r70,%r309; setp.gt.s64 %r350,%r111,0; setp.le.s64 %r351,%r109,1; cvt.u32.u32 %r353,%r22; cvt.s64.s8 %r354,%r353; add.u64 %r355,%r354,-1; add.u64 %r356,%frame,120; add.u64 %r357,%frame,240; .loc 1 187 13 mov.u64 %r358,0; bra $L25; $L24: add.u64 %r166,%frame,360; add.u64 %r165,%r181,40; mov.u64 %r163,%frame; cvt.u32.u32 %r311,%r22; cvt.s64.s8 %r310,%r311; add.u64 %r154,%r310,-1; .loc 1 129 3 mov.u64 %r124,0; add.u64 %r348,%frame,240; .loc 1 131 16 mov.u64 %r312,%r124; $L26: st.u64 [%r166],%r312; .loc 1 132 18 ld.u64 %r313,[%r165]; st.u64 [%r163],%r313; .loc 1 133 17 shl.b64 %r315,%r124,3; add.u64 %r316,%r348,%r315; .loc 1 133 10 ld.u64 %r317,[%r316]; setp.le.s64 %r318,%r317,0; @ %r318 bra $L1; .loc 1 129 26 add.u64 %r124,%r124,1; .loc 1 129 3 add.u64 %r166,%r166,8; add.u64 %r165,%r165,24; add.u64 %r163,%r163,8; setp.ne.u64 %r319,%r124,%r154; @ %r319 bra $L26; bra $L27; $L25: .loc 1 153 5 @ %r350 bra $L39; .loc 1 154 10 mov.f32 %r321,0f7f7fffff; st.f32 [%r123],%r321; bra $L29; $L39: mov.u64 %r139,%r122; .loc 1 158 13 mov.u64 %r129,0; $L28: .loc 1 163 7 ld.f32 %r179,[%r139]; mov.u64 %r117,%r129; .loc 1 158 28 add.u64 %r129,%r129,1; mov.u64 %r118,%r139; .loc 1 158 36 add.u64 %r139,%r139,%r69; .loc 1 163 6 setp.le.f32 %r322,%r179,0f7f800000; @ %r322 bra $L40; .loc 1 158 6 setp.gt.s64 %r323,%r46,%r129; @ %r323 bra $L28; .loc 1 167 15 mov.f32 %r104,0f7fc00000; bra $L31; $L40: .loc 1 149 9 mov.f32 %r104,0f7f800000; $L30: .loc 1 171 6 setp.gt.f32 %r324,%r104,%r179; .loc 1 172 12 selp.f32 %r104,%r179,%r104,%r324; .loc 1 168 28 add.u64 %r117,%r117,1; .loc 1 168 36 add.u64 %r118,%r118,%r69; .loc 1 168 11 setp.le.s64 %r325,%r46,%r117; @ %r325 bra $L31; .loc 1 171 7 ld.f32 %r179,[%r118]; bra $L30; $L31: .loc 1 175 12 st.f32 [%r123],%r104; $L29: .loc 1 179 15 add.u64 %r137,%r137,1; .loc 1 180 12 add.u64 %r122,%r122,%r74; .loc 1 181 12 add.u64 %r123,%r123,%r77; .loc 1 183 13 setp.ne.u64 %r326,%r137,%r142; @ %r326 bra $L25; .loc 1 190 9 
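// The $L25/$L28/$L30 blocks above form the unmasked inner loop of
// _gfortran_minval_r4: 0f7f7fffff is FLT_MAX (stored as the result for a
// zero-length extent, .loc 154), 0f7f800000 is +infinity and 0f7fc00000 is a
// quiet NaN.  The scan at $L28 looks for the first non-NaN element via
// setp.le.f32 against +infinity (false only for NaN); if every element is NaN
// the NaN at .loc 167 is the result, otherwise $L30 keeps the running minimum
// with setp.gt.f32 / selp.f32 and $L31 stores it (.loc 175).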
add.u64 %r119,%r122,%r143; .loc 1 191 9 add.u64 %r120,%r123,%r70; .loc 1 193 7 @ %r351 bra $L1; add.u64 %r132,%frame,368; mov.u64 %r174,8; .loc 1 192 5 mov.u64 %r121,1; bra $L36; $L37: .loc 1 187 13 st.u64 [%r132],%r358; .loc 1 190 23 mul.lo.u64 %r332,%r90,%r89; .loc 1 190 9 shl.b64 %r333,%r332,2; sub.u64 %r119,%r122,%r333; .loc 1 191 23 mul.lo.u64 %r334,%r93,%r89; .loc 1 191 9 shl.b64 %r335,%r334,2; sub.u64 %r120,%r123,%r335; .loc 1 192 5 add.u64 %r121,%r121,1; .loc 1 193 7 add.u64 %r132,%r132,8; add.u64 %r174,%r174,8; setp.eq.u64 %r336,%r121,%r355; @ %r336 bra $L1; $L36: .loc 1 201 16 ld.u64 %r337,[%r132]; add.u64 %r89,%r337,1; st.u64 [%r132],%r89; .loc 1 202 23 add.u64 %r339,%r356,%r174; ld.u64 %r90,[%r339]; .loc 1 202 13 shl.b64 %r340,%r90,2; add.u64 %r122,%r119,%r340; .loc 1 203 23 add.u64 %r341,%frame,%r174; ld.u64 %r93,[%r341]; .loc 1 203 13 shl.b64 %r342,%r93,2; add.u64 %r123,%r120,%r342; .loc 1 183 32 add.u64 %r344,%r357,%r174; ld.u64 %r96,[%r344]; .loc 1 183 13 setp.eq.u64 %r345,%r89,%r96; @ %r345 bra $L37; .loc 1 187 13 mov.u64 %r137,0; bra $L25; $L53: .loc 1 87 6 ld.u64 %r346,[%r181]; setp.eq.u64 %r347,%r346,0; @ ! %r347 bra $L11; bra $L38; $L_gfortran_mminval_r4 .visible .func _gfortran_mminval_r608pred %r215; .reg .u32 %r219; .reg .u64 %r221; .reg .u32 %r223; .reg .u32 %r224; .reg .u16 %r226; .reg .u16 %r227; .reg .u16u16 %r263; .reg .u32pred %r282; .reg .u64 %r283; .reg .u64u64u6464pred %r325; .reg .u64 %r327; .reg .u64 %r329; .reg .pred %r331; .reg .pred %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u32u64pred %r357; .reg .u64 %r358; .reg .u64 %r359; .reg .predu64 %r392; .reg .u64 %r393; .reg .pred %r394; .reg .predpred %r403; .reg .pred %r405; .reg .predpredu64 %r422; .reg .u64 %r424; .reg .u64 %r425; .reg .u64 %r427; .reg .pred %r428; .reg .pred %r429; .reg .u64 %r430; .reg .pred %r431; .reg .u64 %r433; .reg .predu64 %r449; mov.u64 %r211,%ar0; mov.u64 %r212,%ar1; mov.u64 %r213,%ar2; mov.u64 %r214,%ar3; .loc 1 237 6 setp.ne.u64 %r215,%r214,0; @ %r215 bra $Lgfortran_minval_r4.loc 1 244 7 bra $L54; $L55: .loc 1 247 10 ld.u64 %r22,[%r213]; .loc 1 247 7 add.u64 %r142,%r22,-1; .loc 1 248 10 ld.s8 %r23,[%r212+28]; .loc 1 248 38 add.u32 %r219,%r23,-1; .loc 1 248 8 cvt.s64.s32 %r143,%r219; .loc 1 251 7 shr.u64 %r221,%r142,63; set.u32.gt.s64 %r223,%r142,%r143; neg.s32 %r224,%r223; cvt.u16.u64 %r227,%r221; cvt.u16.u32 %r228,%r224; or.b16 %r226,%r227,%r228; .loc 1 251 6 cvt.u32.u16 %r229,%r226; cvt.u16.u8 %r230,%r229; setp.eq.u16 %r231,%r230,0; @ %r231 bra $L57; .loc 1 253 7 cvt.u32.u32 %r234,%r23; cvt.s64.s8 %r233,%r234; st.u64 [%stack+8],%r233; st.u64 [%stack],%r22; cvta.const.u64 %r2322stack; call _gfortran_runtime_error7: .loc 1 258 9 add.u64 %r435,%r142,%r142; add.u64 %r433,%r435,%r142; shl.b64 %r239,%r433,3; add.u64 %r240,%r212,%r239; ld.u64 %r243,[%r240+56]; add.u64 %r242,%r243,1; .loc 1 258 7 ld.u64 %r250,[%r240+48]; sub.u64 %r144,%r242,%r250; .loc 1 259 6 setp.le.s64 %r251,%r144,0; @ %r251 bra $L54; .loc 1 262 9 ld.u64 %r159,[%r214]; .loc 1 264 15 ld.u64 %r33,[%r214+16]; .loc 1 266 22 cvt.u32.u64 %r34,%r33; .loc 1 266 53 add.u32 %r252,%r34,-4; and.b32 %r253,%r252,-5; set.u32.eq.u32 %r255,%r253,0; neg.s32 %r256,%r255; .loc 1 266 22 add.u32 %r257,%r34,-1; set.u32.le.u32 %r259,%r257,1; neg.s32 %r260,%r259; .loc 1 266 6 cvt.u16.u32 %r262,%r256; cvt.u16.u32 %r263,%r260; or.b16 %r261,%r262,%r263; cvt.u32.u16 %r264,%r261; cvt.u16.u8 %r265,%r264; setp.ne.u16 %r266,%r265,0; @ %r266 bra $L59; .loc 1 268 7 setp.ne.u32 %r268,%r34,16; @ %r268 bra $L60; $L59: .loc 1 275 9 shl.b64 
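// _gfortran_mminval_r4: the code ending at the branch above validates the
// element size of the MASK descriptor before the masked reduction.
// (size-4) & ~4 == 0 accepts sizes 4 and 8, size-1 <= 1 accepts 1 and 2, and
// the final compare against 16 accepts LOGICAL(16); any other size falls
// through to the _gfortran_runtime_error call at $L60 (the "Funny sized
// logical array" message in the flattened string constants above).  $L59 then
// loads the source and mask strides for the reduced dimension.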
%r272,%r433,3; add.u64 %r273,%r212,%r272; ld.u64 %r147,[%r273+40]; .loc 1 276 12 add.u64 %r279,%r214,%r272; ld.u64 %r40,[%r279+40]; .loc 1 278 3 setp.ne.u64 %r281,%r142,0; @ %r281 bra $L61; $L67: .loc 1 288 3 setp.lt.s64 %r282,%r142,%r143; @ %r282 bra $L62; bra $L118; $L60: .loc 1 273 5 cvta.const.u64 %r283283_gfortran_runtime_error61: add.u64 %r53,%r212,40; add.u64 %r163,%r214,40; add.u64 %r286,%r22,%r22; add.u64 %r287,%r286,%r22; shl.b64 %r288,%r287,3; add.u64 %r289,%r212,16; add.u64 %r39,%r288,%r289; .loc 1 278 3 mov.u64 %r209,0; add.u64 %r438,%frame,240; add.u64 %r436,%frame,360; .loc 1 285 12 mov.u64 %r449,%r209; $L66: .loc 1 280 18 add.u64 %r291,%r438,%r209; ld.u64 %r292,[%r53]; st.u64 [%r291],%r292; .loc 1 281 18 add.u64 %r293,%frame,%r209; .loc 1 281 20 ld.u64 %r295,[%r163]; mul.lo.u64 %r294,%r295,%r33; .loc 1 281 18 st.u64 [%r293],%r294; .loc 1 282 19 ld.u64 %r297,[%r53+16]; add.u64 %r296,%r297,1; ld.u64 %r298,[%r53+8]; sub.u64 %r51,%r296,%r298; .loc 1 284 10 setp.lt.s64 %r299,%r51,0; @ %r299 bra $L64; .loc 1 282 17 add.u64 %r301,%r436,%r209; st.u64 [%r301],%r51; bra $L65; $L64: .loc 1 285 12 add.u64 %r303,%r436,%r209; st.u64 [%r303],%r449; $L65: .loc 1 278 3 add.u64 %r53,%r53,24; add.u64 %r163,%r163,24; add.u64 %r209,%r209,8; setp.ne.u64 %r305,%r39,%r53; @ %r305 bra $L66; bra $L67; $L118: .loc 1 298 6 ld.u64 %r306,[%r211]; setp.eq.u64 %r307,%r306,0; @ ! %r307 bra $L69; bra $L68; $L62: add.u64 %r309,%r22,%r22; add.u64 %r310,%r309,%r22; shl.b64 %r311,%r310,3; add.u64 %r167,%r311,40; add.u64 %r171,%r212,%r167; add.u64 %r165,%r214,%r167; shl.b64 %r312,%r22,3; add.u64 %r139,%r312,-8; cvt.u32.u32 %r314,%r23; cvt.s64.s8 %r313,%r314; shl.b64 %r315,%r313,3; add.u64 %r124,%r315,-8; add.u64 %r438,%frame,240; add.u64 %r436,%frame,360; .loc 1 295 12 mov.u64 %r448,0; $L72: .loc 1 290 18 add.u64 %r317,%r438,%r139; ld.u64 %r318,[%r171]; st.u64 [%r317],%r318; .loc 1 291 18 add.u64 %r319,%frame,%r139; .loc 1 291 20 ld.u64 %r321,[%r165]; mul.lo.u64 %r320,%r321,%r33; .loc 1 291 18 st.u64 [%r319],%r320; .loc 1 292 19 ld.u64 %r323,[%r171+16]; add.u64 %r322,%r323,1; ld.u64 %r324,[%r171+8]; sub.u64 %r63,%r322,%r324; .loc 1 294 10 setp.lt.s64 %r325,%r63,0; @ %r325 bra $L70; .loc 1 292 17 add.u64 %r327,%r436,%r139; st.u64 [%r327],%r63; bra $L71; $L70: .loc 1 295 12 add.u64 %r329,%r436,%r139; st.u64 [%r329],%r448; $L71: .loc 1 288 3 add.u64 %r171,%r171,24; add.u64 %r165,%r165,24; add.u64 %r139,%r139,8; setp.ne.u64 %r331,%r124,%r139; @ %r331 bra $L72; bra $L119; $L68: .loc 1 302 7 setp.eq.u64 %r332,%r143,0; @ %r332 bra $L74; add.u64 %r436,%frame,360; $L96: add.u64 %r185,%r211,40; mov.u64 %r183,%r436; add.u64 %r333,%r211,16; cvt.u32.u32 %r335,%r23; cvt.s64.s8 %r334,%r335; add.u64 %r337,%r334,%r334; add.u64 %r338,%r337,%r334; shl.b64 %r339,%r338,3; add.u64 %r172,%r333,%r339; .loc 1 305 10 mov.u64 %r131,1; .loc 1 309 4 mov.u64 %r358,0; bra $L75; $L74: .loc 1 313 20 add.u32 %r340,%r23,-2; cvt.s64.s32 %r73,%r340; add.u64 %r342,%r73,%r73; add.u64 %r343,%r342,%r73; shl.b64 %r344,%r343,3; add.u64 %r345,%r211,%r344; .loc 1 313 67 shl.b64 %r347,%r73,3; add.u64 %r348,%frame,%r347; .loc 1 313 59 ld.u64 %r350,[%r345+40]; ld.u64 %r351,[%r348+360]; mul.lo.u64 %r151,%r350,%r351; .loc 1 315 24 mov.u64 %r352,0; st.u64 [%r211+8],%r352; .loc 1 316 28 cvt.u16.u32 %r355,%r23; add.u16 %r354,%r355,-1; cvt.u32.u16 %r356,%r354; st.u8 [%r211+28],%r356; .loc 1 318 10 setp.eq.u64 %r357,%r151,0; @ %r357 bra $L76; bra $L120; $L78: .loc 1 307 47 mul.lo.u64 %r131,%r70,%r131; $L75: .loc 1 309 4 st.u64 [%r185+8],%r358; ld.u64 %r70,[%r183]; 
add.u64 %r359,%r70,-1; st.u64 [%r185+16],%r359; st.u64 [%r185],%r131; .loc 1 302 7 add.u64 %r185,%r185,24; add.u64 %r183,%r183,8; setp.ne.u64 %r360,%r172,%r185; @ %r360 bra $L78; bra $L74; $L76: .loc 1 321 4 st.u64 [%r211+48],%r151; mov.u64 %r362,-1; st.u64 [%r211+56],%r362; mov.u64 %r363,1; st.u64 [%r211+40],%r363; .loc 1 322 4 bra $L54; $L120: .loc 1 325 24365; call (%value_in),_gfortrani_xmallocarray66,[%value_in]; } .loc 1 325 22 st.u64 [%r211],%r366; bra $L79; $L69: .loc 1 330 19 ld.s8 %r368,[%r211+28]; .loc 1 330 10 setp.eq.u64 %r369,%r368,%r143; @ %r369 bra $L80; .loc 1 331 2 cvta.const.u64 %r370370_gfortran_runtime_error80: .loc 1 333 11 cvta.global.u64 %r372,_gfortrani_compile_options; .loc 1 333 10 ld.u32 %r373,[%r372+36]; setp.eq.u32 %r374,%r373,0; @ %r374 bra $L79; .loc 1 335 4 add.u64 %r436,%frame,360; cvta.const.u64 %r378,$LC2r4r377378; call _gfortrani_bounds_ifunction_return337 4 cvta.const.u64 %r382,$LC1212382378; call _gfortrani_bounds_equal_extents$L79: .loc 1 342 3 setp.ne.u64 %r384,%r143,0; @ %r384 bra $L81; $L84: .loc 1 350 8 ld.u64 %r160,[%r211]; .loc 1 351 8 ld.u64 %r158,[%r212]; .loc 1 353 9 setp.ne.u64 %r385,%r158,0; @ %r385 bra $L82; bra $L54; $L81: add.u64 %r198,%frame,480; add.u64 %r197,%r211,40; add.u64 %r195,%frame,120; cvt.u32.u32 %r387,%r23; cvt.s64.s8 %r386,%r387; add.u64 %r186,%r386,-1; .loc 1 342 3 mov.u64 %r161,0; add.u64 %r436,%frame,360; .loc 1 344 16 mov.u64 %r388,%r161; $L83: st.u64 [%r198],%r388; .loc 1 345 18 ld.u64 %r389,[%r197]; st.u64 [%r195],%r389; .loc 1 346 17 shl.b64 %r391,%r161,3; add.u64 %r392,%r436,%r391; .loc 1 346 10 ld.u64 %r393,[%r392]; setp.le.s64 %r394,%r393,0; @ %r394 bra $L54; .loc 1 342 26 add.u64 %r161,%r161,1; .loc 1 342 3 add.u64 %r198,%r198,8; add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; setp.ne.u64 %r395,%r161,%r186; @ %r395 bra $L83; bra $L84; $L82: .loc 1 276 12 mul.lo.u64 %r42,%r40,%r33; .loc 1 370 32 shl.b64 %r87,%r147,2; ld.u64 %r38,[%frame+480]; .loc 1 401 22 ld.u64 %r95,[%frame+240]; .loc 1 401 12 shl.b64 %r96,%r95,2; .loc 1 402 23 ld.u64 %r98,[%frame]; .loc 1 403 22 ld.u64 %r100,[%frame+120]; .loc 1 403 12 shl.b64 %r101,%r100,2; .loc 1 405 32 ld.u64 %r59,[%frame+360]; .loc 1 412 23 mul.lo.u64 %r37,%r95,%r59; .loc 1 413 24 mul.lo.u64 %r396,%r98,%r59; .loc 1 413 10 neg.s64 %r89,%r396; .loc 1 414 23 mul.lo.u64 %r397,%r100,%r59; .loc 1 414 9 shl.b64 %r398,%r397,2; neg.s64 %r132,%r398; setp.le.s64 %r434,%r143,1; .loc 1 377 19 mov.u32 %r440,1; cvt.u32.u32 %r442,%r23; cvt.s64.s8 %r443,%r442; add.u64 %r444,%r443,-1; add.u64 %r445,%frame,240; add.u64 %r446,%frame,360; add.u64 %r447,%frame,120; $L97: .loc 1 342 3 mov.u64 %r154,%r159; mov.u64 %r153,%r158; .loc 1 368 6 mov.u32 %r137,0; .loc 1 370 9 mov.u64 %r152,0; $L87: .loc 1 374 7 ld.s8 %r210,[%r154]; .loc 1 374 6 setp.eq.u32 %r437,%r210,0; @ %r437 bra $L85; .loc 1 378 10 ld.f32 %r400,[%r153]; setp.le.f32 %r401,%r400,0f7f800000; @ %r401 bra $L86; .loc 1 377 19 mov.u32 %r137,%r440; $L85: .loc 1 370 24 add.u64 %r152,%r152,1; .loc 1 370 32 add.u64 %r153,%r153,%r87; .loc 1 370 47 add.u64 %r154,%r154,%r42; .loc 1 370 2 setp.ne.u64 %r402,%r144,%r152; @ %r402 bra $L87; .loc 1 386 47 setp.ne.u32 %r403,%r137,0; selp.f32 %r136,0f7fc00000,0f7f7fffff,%r403; bra $L88; $L91: .loc 1 394 6 @ %r437 bra $L89; .loc 1 394 16 ld.f32 %r91,[%r153]; .loc 1 394 13 setp.lt.f32 %r405,%r91,%r136; .loc 1 395 12 selp.f32 %r136,%r91,%r136,%r405; $L89: .loc 1 391 28 add.u64 %r152,%r152,1; .loc 1 391 36 add.u64 %r153,%r153,%r87; .loc 1 391 51 add.u64 %r154,%r154,%r42; .loc 1 391 11 setp.eq.u64 
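// The $L87 and $L91 loops around this point are the masked reduction of
// _gfortran_mminval_r4.  A C sketch of what they compute per output element,
// following the shape of the generated libgfortran source that the .loc line
// numbers (368-397) point at; the function and variable names below are
// illustrative and were not recovered from this object file:
//
//   static GFC_REAL_4
//   masked_min_1 (const GFC_REAL_4 *src, index_type delta,
//                 const GFC_LOGICAL_1 *msrc, index_type mdelta, index_type len)
//   {
//     GFC_REAL_4 result = GFC_REAL_4_INFINITY;       /* 0f7f800000 */
//     int non_empty_p = 0;
//     index_type n;
//     for (n = 0; n < len; n++, src += delta, msrc += mdelta)
//       if (*msrc)                                    /* $L87: mask byte test */
//         {
//           non_empty_p = 1;
//           if (*src <= result)                       /* first non-NaN element */
//             break;
//         }
//     if (n >= len)                                   /* nothing usable found */
//       return non_empty_p ? GFC_REAL_4_QUIET_NAN     /* 0f7fc00000 */
//                          : GFC_REAL_4_HUGE;         /* 0f7f7fffff */
//     for (; n < len; n++, src += delta, msrc += mdelta)
//       if (*msrc && *src < result)                   /* $L91: setp.lt.f32/selp.f32 */
//         result = *src;
//     return result;
//   }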
%r406,%r144,%r152; @ %r406 bra $L88; .loc 1 394 7 ld.s8 %r210,[%r154]; setp.eq.u32 %r437,%r210,0; bra $L91; $L99: .loc 1 386 47 mov.f32 %r136,0f7fc00000; $L88: .loc 1 397 8 st.f32 [%r160],%r136; .loc 1 400 15 add.u64 %r38,%r38,1; .loc 1 401 12 add.u64 %r158,%r158,%r96; .loc 1 402 13 add.u64 %r159,%r159,%r98; .loc 1 403 12 add.u64 %r160,%r160,%r101; .loc 1 405 13 setp.ne.u64 %r407,%r59,%r38; @ %r407 bra $L97; .loc 1 413 10 add.u64 %r155,%r159,%r89; .loc 1 414 9 add.u64 %r156,%r160,%r132; .loc 1 416 7 @ %r434 bra $L54; add.u64 %r169,%frame,488; .loc 1 412 23 mov.u64 %r103,%r37; .loc 1 416 7 mov.u64 %r141,8; .loc 1 415 5 mov.u64 %r157,1; .loc 1 409 13 mov.u64 %r439,0; bra $L94; $L95: st.u64 [%r169],%r439; .loc 1 412 23 mul.lo.u64 %r103,%r114,%r113; .loc 1 413 24 mul.lo.u64 %r413,%r116,%r113; .loc 1 413 10 sub.u64 %r155,%r159,%r413; .loc 1 414 23 mul.lo.u64 %r414,%r118,%r113; .loc 1 414 9 shl.b64 %r415,%r414,2; sub.u64 %r156,%r160,%r415; .loc 1 415 5 add.u64 %r157,%r157,1; .loc 1 416 7 add.u64 %r169,%r169,8; add.u64 %r141,%r141,8; setp.eq.u64 %r416,%r157,%r444; @ %r416 bra $L54; $L94: .loc 1 424 16 ld.u64 %r417,[%r169]; add.u64 %r113,%r417,1; st.u64 [%r169],%r113; .loc 1 425 23 add.u64 %r419,%r445,%r141; ld.u64 %r114,[%r419]; .loc 1 425 13 sub.u64 %r420,%r114,%r103; shl.b64 %r421,%r420,2; add.u64 %r158,%r158,%r421; .loc 1 426 24 add.u64 %r422,%frame,%r141; ld.u64 %r116,[%r422]; .loc 1 426 14 add.u64 %r159,%r155,%r116; .loc 1 427 23 add.u64 %r424,%r447,%r141; ld.u64 %r118,[%r424]; .loc 1 427 13 shl.b64 %r425,%r118,2; add.u64 %r160,%r156,%r425; .loc 1 405 32 add.u64 %r427,%r446,%r141; ld.u64 %r121,[%r427]; .loc 1 405 13 setp.eq.u64 %r428,%r113,%r121; @ %r428 bra $L95; .loc 1 409 13 mov.u64 %r38,0; bra $L97; $L86: .loc 1 383 9 setp.le.s64 %r429,%r144,%r152; @ %r429 bra $L99; mov.f32 %r136,0f7f800000; bra $L91; $L119: .loc 1 298 6 ld.u64 %r430,[%r211]; setp.eq.u64 %r431,%r430,0; @ ! 
%r431 bra $L69; bra $L96; $L54:_gfortran_sminval_r4 .visible .func _gfortran_sminval_r68u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64 %r276; .reg .predpredu64mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 454 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L122; .loc 1 454 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L123; r152; call _gfortran_minval_r4.loc 1 461 7 bra $L121; $L123: .loc 1 464 10 ld.u64 %r23,[%r152]; .loc 1 464 7 add.u64 %r88,%r23,-1; .loc 1 465 10 ld.s8 %r24,[%r151+28]; .loc 1 465 38 add.u32 %r26,%r24,-1; .loc 1 465 8 cvt.s64.s32 %r294,%r26; .loc 1 467 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r294; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 467 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L125; .loc 1 474 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L126; $L132: .loc 1 482 3 setp.lt.s64 %r173,%r88,%r294; @ %r173 bra $L127; bra $L180; $L125: .loc 1 469 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error126: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 479 12 mov.u64 %r300,0; $L131: .loc 1 476 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 478 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L129; .loc 1 476 17 st.u64 [%r139],%r36; bra $L130; $L129: .loc 1 479 12 st.u64 [%r139],%r300; $L130: .loc 1 474 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L131; bra $L132; $L180: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L134; bra $L133; $L127: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r293,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r293,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 488 12 mov.u64 %r299,0; $L137: .loc 1 485 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 487 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L135; .loc 1 484 17 st.u64 [%r43],%r41; bra $L136; $L135: .loc 1 488 12 st.u64 [%r43],%r299; $L136: .loc 1 482 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L137; bra $L181; $L133: .loc 1 495 7 setp.eq.u64 %r211,%r294,0; @ %r211 bra $L139; add.u64 %r293,%frame,120; $L159: add.u64 %r111,%r150,40; mov.u64 %r107,%r293; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 498 10 mov.u64 %r81,1; .loc 1 502 4 mov.u64 %r237,0; bra $L140; $L139: .loc 1 506 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 507 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 509 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 509 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 509 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 511 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L141; bra $L182; $L143: .loc 1 500 48 mul.lo.u64 %r81,%r46,%r81; $L140: .loc 1 502 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 495 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L143; bra $L139; $L141: .loc 1 514 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 515 4 bra $L121; $L182: .loc 1 518 2496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 518 22 st.u64 [%r150],%r98; $L147: .loc 1 544 3 setp.ne.u64 %r247,%r294,0; @ %r247 bra $L144; bra $L145; $L134: .loc 1 522 19 ld.s8 %r55,[%r150+28]; .loc 1 522 10 setp.eq.u64 %r248,%r55,%r294; @ %r248 bra $L146; .loc 1 523 2 st.u64 [%stack+8],%r294; st.u64 [%stack],%r55;_gfortran_runtime_error146: .loc 1 528 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 528 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L147; .loc 1 530 4 setp.eq.u64 %r254,%r294,0; @ %r254 bra $L145; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 530 10 mov.u64 %r104,0; $L149: .loc 1 534 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 534 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 535 18 ld.u64 %r63,[%r31]; .loc 1 530 25 add.u64 %r104,%r104,1; .loc 1 535 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L148; .loc 1 536 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error148: .loc 1 530 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L149; $L144: .loc 1 546 16 
cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L150; shl.b64 %r264,%r294,3; bra $L151; $L150: mov.u64 %r264,8; $L151: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L152: .loc 1 547 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 544 26 add.u64 %r100,%r100,1; .loc 1 544 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r294,%r100; @ %r277 bra $L152; $L145: ld.u64 %r80,[%frame+240]; .loc 1 556 22 ld.u64 %r68,[%frame]; .loc 1 556 12 shl.b64 %r69,%r68,2; .loc 1 558 32 ld.u64 %r59,[%frame+120]; .loc 1 565 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r292,%r294,1; .loc 1 554 13 mov.f32 %r278,0f7f7fffff; cvt.u32.u32 %r296,%r24; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; $L158: st.f32 [%r98],%r278; .loc 1 555 15 add.u64 %r80,%r80,1; .loc 1 556 12 add.u64 %r98,%r98,%r69; .loc 1 558 13 setp.ne.u64 %r279,%r59,%r80; @ %r279 bra $L158; .loc 1 567 7 @ ! %r292 bra $L183; bra $L121; $L157: .loc 1 562 13 st.u64 [%r99],%r295; .loc 1 565 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 566 5 add.u64 %r97,%r97,1; .loc 1 567 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r282,%r97,%r298; @ %r282 bra $L156; bra $L121; $L183: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 565 23 mov.u64 %r72,%r44; .loc 1 566 5 mov.u64 %r97,1; .loc 1 562 13 mov.u64 %r295,0; $L156: .loc 1 571 16 ld.u64 %r287,[%r99]; add.u64 %r74,%r287,1; st.u64 [%r99],%r74; .loc 1 572 23 ld.u64 %r75,[%r124]; .loc 1 572 13 sub.u64 %r288,%r75,%r72; shl.b64 %r289,%r288,2; add.u64 %r98,%r98,%r289; .loc 1 558 32 ld.u64 %r77,[%r123]; .loc 1 558 13 setp.eq.u64 %r290,%r74,%r77; @ %r290 bra $L157; .loc 1 562 13 mov.u64 %r80,0; bra $L158; $L181: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r291,%r98,0; @ ! 
%r291 bra $L134; bra $L159; $L121: .loc 1 576 1 ret; } minval_r8.o/_gfortran_minval_r8 .visible .func _gfortran_minval_r8fortran/generated/minval_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminval_r8 .visible .func _gfortran_mminval_r8gfortran_sminval_r8 .visible .func _gfortran_sminval_r8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,86,65,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {77,73,78,86,65,76114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,86,65,76,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,86,65,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_minval_r8 .visible .func _gfortran_minval_r8480u64 %r56; .reg .u64 %r62; .reg .u64 %r65; .reg .u64u32 %r194; .reg .u16 %r195; .reg .predu64 %r227; .reg .u64 %r228; .reg .u64 %r229; .reg .predpredu32u64 %r250; .reg .u64 %r251; .reg .u64 %r252; .reg .u64 %r253; .reg .predu32u16u64pred %r319; .reg .f64 %r321; .reg .pred %r322; .reg .pred %r323; .reg .pred %r324; .reg .pred %r325; .reg .pred %r326; .reg .u64 %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64u64 %r357; .reg .u64 %r358; .reg .u64 %r360; .reg .u64 %r361; mov.u64 %r181,%ar0; mov.u64 %r182,%ar1; mov.u64 %r183,%ar2; .loc 1 55 10 ld.s8 %r22,[%r182+28]; .loc 1 55 38 add.u32 %r184,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r109,%r184; .loc 1 56 10 ld.u64 %r25,[%r183]; .loc 1 56 7 add.u64 %r110,%r25,-1; .loc 1 58 7 shr.u64 %r186,%r110,63; set.u32.lt.s64 %r188,%r109,%r110; neg.s32 %r189,%r188; cvt.u16.u64 %r192,%r186; cvt.u16.u32 %r193,%r189; or.b16 %r191,%r192,%r193; .loc 1 58 6 cvt.u32.u16 %r194,%r191; cvt.u16.u8 %r195,%r194; setp.eq.u16 %r196,%r195,0; @ %r196 bra $L2; .loc 1 60 7 cvt.u32.u32 %r199,%r22; cvt.s64.s8 %r198,%r199; st.u64 [%stack+8],%r19819197_gfortran_runtime_errorr202,%r110,%r110; add.u64 %r203,%r202,%r110; shl.b64 %r204,%r203,3; add.u64 %r205,%r182,%r204; ld.u64 %r31,[%r205+56]; ld.u64 %r33,[%r205+48]; .loc 1 68 9 ld.u64 %r113,[%r205+40]; .loc 1 70 3 setp.ne.u64 %r219,%r110,0; @ %r219 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r220,%r109,%r110; @ %r220 bra $L4; bra $L52; $L3: add.u64 %r29,%r182,40; add.u64 %r126,%frame,120; add.u64 %r180,%frame,240; add.u64 %r221,%r182,16; add.u64 %r223,%r25,%r25; add.u64 %r224,%r223,%r25; shl.b64 %r225,%r224,3; add.u64 %r71,%r221,%r225; .loc 1 76 12 mov.u64 
%r361,0; $L8: .loc 1 72 18 ld.u64 %r226,[%r29]; st.u64 [%r126],%r226; .loc 1 73 19 ld.u64 %r228,[%r29+16]; add.u64 %r227,%r228,1; ld.u64 %r229,[%r29+8]; sub.u64 %r38,%r227,%r229; .loc 1 75 10 setp.lt.s64 %r230,%r38,0; @ %r230 bra $L6; .loc 1 73 17 st.u64 [%r180],%r38; bra $L7; $L6: .loc 1 76 12 st.u64 [%r180],%r361; $L7: .loc 1 70 3 add.u64 %r29,%r29,24; add.u64 %r126,%r126,8; add.u64 %r180,%r180,8; setp.ne.u64 %r232,%r71,%r29; @ %r232 bra $L8; bra $L9; $L52: .loc 1 87 6 ld.u64 %r233,[%r181]; setp.eq.u64 %r234,%r233,0; @ ! %r234 bra $L11; bra $L10; $L4: add.u64 %r236,%r25,%r25; add.u64 %r237,%r236,%r25238,40; add.u64 %r144,%r182,%r239; shl.b64 %r240,%r25,3; add.u64 %r107,%r240,-8; add.u64 %r349,%frame,120; add.u64 %r128,%r349,%r107; add.u64 %r348,%frame,240; add.u64 %r105,%r348,%r107; cvt.u32.u32 %r244,%r22; cvt.s64.s8 %r243,%r244; add.u64 %r246,%r243,%r243; add.u64 %r247,%r246,%r243; shl.b64 %r248,%r247,3; add.u64 %r249,%r182,40; add.u64 %r65,%r248,%r249; .loc 1 84 12 mov.u64 %r360,0; $L14: .loc 1 80 18 ld.u64 %r250,[%r144]; st.u64 [%r128],%r250; .loc 1 81 19 ld.u64 %r252,[%r144+16]; add.u64 %r251,%r252,1; ld.u64 %r253,[%r144+8]; sub.u64 %r45,%r251,%r253; .loc 1 83 10 setp.lt.s64 %r254,%r45,0; @ %r254 bra $L12; .loc 1 81 17 st.u64 [%r105],%r45; bra $L13; $L12: .loc 1 84 12 st.u64 [%r105],%r360; $L13: .loc 1 78 3 add.u64 %r144,%r144,24; add.u64 %r128,%r128,8; add.u64 %r105,%r105,8; setp.ne.u64 %r256,%r65,%r144; @ %r256 bra $L14; bra $L53; $L10: .loc 1 91 7 setp.le.s64 %r257,%r109,0; @ %r257 bra $L16; add.u64 %r348,%frame,240; $L38: add.u64 %r153,%r181,40; mov.u64 %r151,%r348; add.u64 %r258,%r181,16; cvt.u32.u32 %r260,%r22; cvt.s64.s8 %r259,%r260; add.u64 %r262,%r259,%r259; add.u64 %r263,%r262,%r259; shl.b64 %r264,%r263,3; add.u64 %r145,%r258,%r264; .loc 1 94 10 mov.u64 %r103,1; .loc 1 98 4 mov.u64 %r287,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r265,0; st.u64 [%r181+8],%r265; .loc 1 103 28 cvt.u16.u32 %r268,%r22; add.u16 %r267,%r268,-1; cvt.u32.u16 %r269,%r267; st.u8 [%r181+28],%r269; .loc 1 105 20 add.u32 %r270,%r22,-2; cvt.s64.s32 %r56,%r270; add.u64 %r272,%r56,%r56; add.u64 %r273,%r272,%r56; shl.b64 %r274,%r273,3; add.u64 %r275,%r181,%r274; .loc 1 105 67 shl.b64 %r277,%r56,3; add.u64 %r278,%frame,%r277; .loc 1 105 59 ld.u64 %r280,[%r275+40]; ld.u64 %r281,[%r278+240]; mul.lo.u64 %r115,%r280,%r281; .loc 1 107 29 mov.u64 %r283283; call (%value_in),_gfortrani_xmallocarray284,[%value_in]; } .loc 1 107 27 st.u64 [%r181],%r284; .loc 1 108 10 setp.eq.u64 %r286,%r115,0; @ ! 
%r286 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r103,%r52,%r103; $L17: .loc 1 98 4 st.u64 [%r153+8],%r287; ld.u64 %r52,[%r151]; add.u64 %r288,%r52,-1; st.u64 [%r153+16],%r288; st.u64 [%r153],%r103; .loc 1 91 7 add.u64 %r153,%r153,24; add.u64 %r151,%r151,8; setp.ne.u64 %r289,%r145,%r153; @ %r289 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r181+48],%r115; mov.u64 %r291,-1; st.u64 [%r181+56],%r291; mov.u64 %r292,1; st.u64 [%r181+40],%r292; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r62,[%r181+28]; .loc 1 118 10 setp.eq.u64 %r293,%r62,%r109; @ %r293 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r109; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r296,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r297,[%r296+36]; setp.eq.u32 %r298,%r297,0; @ %r298 bra $L21; .loc 1 125 2 add.u64 %r348,%frame,240; cvta.const.u64 %r302,$LC2r3430302; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r304,%r109,0; @ %r304 bra $L24; $L27: .loc 1 65 9 add.u64 %r305,%r31,1; .loc 1 65 7 sub.u64 %r111,%r305,%r33; max.s64 %r46,%r111,0; .loc 1 137 8 ld.u64 %r122,[%r182]; .loc 1 138 8 ld.u64 %r123,[%r181]; .loc 1 158 36 shl.b64 %r69,%r113,3; ld.u64 %r137,[%frame+360]; .loc 1 180 22 ld.u64 %r73,[%frame+120]; .loc 1 180 12 shl.b64 %r74,%r73,3; .loc 1 181 22 ld.u64 %r76,[%frame]; .loc 1 181 12 shl.b64 %r77,%r76,3; .loc 1 183 32 ld.u64 %r142,[%frame+240]; .loc 1 190 23 mul.lo.u64 %r306,%r73,%r142; .loc 1 190 9 shl.b64 %r307,%r306,3; neg.s64 %r143,%r307; .loc 1 191 23 mul.lo.u64 %r308,%r76,%r142; .loc 1 191 9 shl.b64 %r309,%r308,3; neg.s64 %r70,%r309; setp.gt.s64 %r350,%r111,0; setp.le.s64 %r351,%r109,1; cvt.u32.u32 %r353,%r22; cvt.s64.s8 %r354,%r353; add.u64 %r355,%r354,-1; add.u64 %r356,%frame,120; add.u64 %r357,%frame,240; .loc 1 187 13 mov.u64 %r358,0; bra $L25; $L24: add.u64 %r166,%frame,360; add.u64 %r165,%r181,40; mov.u64 %r163,%frame; cvt.u32.u32 %r311,%r22; cvt.s64.s8 %r310,%r311; add.u64 %r154,%r310,-1; .loc 1 129 3 mov.u64 %r124,0; add.u64 %r348,%frame,240; .loc 1 131 16 mov.u64 %r312,%r124; $L26: st.u64 [%r166],%r312; .loc 1 132 18 ld.u64 %r313,[%r165]; st.u64 [%r163],%r313; .loc 1 133 17 shl.b64 %r315,%r124,3; add.u64 %r316,%r348,%r315; .loc 1 133 10 ld.u64 %r317,[%r316]; setp.le.s64 %r318,%r317,0; @ %r318 bra $L1; .loc 1 129 26 add.u64 %r124,%r124,1; .loc 1 129 3 add.u64 %r166,%r166,8; add.u64 %r165,%r165,24; add.u64 %r163,%r163,8; setp.ne.u64 %r319,%r124,%r154; @ %r319 bra $L26; bra $L27; $L25: .loc 1 153 5 @ %r350 bra $L39; .loc 1 154 10 mov.f64 %r321,0d7fefffffffffffff; st.f64 [%r123],%r321; bra $L29; $L39: mov.u64 %r139,%r122; .loc 1 158 13 mov.u64 %r129,0; $L28: .loc 1 163 7 ld.f64 %r179,[%r139]; mov.u64 %r117,%r129; .loc 1 158 28 add.u64 %r129,%r129,1; mov.u64 %r118,%r139; .loc 1 158 36 add.u64 %r139,%r139,%r69; .loc 1 163 6 setp.le.f64 %r322,%r179,0d7ff0000000000000; @ %r322 bra $L40; .loc 1 158 6 setp.gt.s64 %r323,%r46,%r129; @ %r323 bra $L28; .loc 1 167 15 mov.f64 %r104,0d7ff8000000000000; bra $L31; $L40: .loc 1 149 9 mov.f64 %r104,0d7ff0000000000000; $L30: .loc 1 171 6 setp.gt.f64 %r324,%r104,%r179; .loc 1 172 12 selp.f64 %r104,%r179,%r104,%r324; .loc 1 168 28 add.u64 %r117,%r117,1; .loc 1 168 36 add.u64 %r118,%r118,%r69; .loc 1 168 11 setp.le.s64 %r325,%r46,%r117; @ %r325 bra $L31; .loc 1 171 7 ld.f64 %r179,[%r118]; bra $L30; $L31: .loc 1 175 12 st.f64 [%r123],%r104; $L29: .loc 1 179 15 add.u64 %r137,%r137,1; .loc 1 180 12 add.u64 %r122,%r122,%r74; .loc 1 181 12 add.u64 %r123,%r123,%r77; .loc 1 183 13 setp.ne.u64 %r326,%r137,%r142; 
@ %r326 bra $L25; .loc 1 190 9 add.u64 %r119,%r122,%r143; .loc 1 191 9 add.u64 %r120,%r123,%r70; .loc 1 193 7 @ %r351 bra $L1; add.u64 %r132,%frame,368; mov.u64 %r174,8; .loc 1 192 5 mov.u64 %r121,1; bra $L36; $L37: .loc 1 187 13 st.u64 [%r132],%r358; .loc 1 190 23 mul.lo.u64 %r332,%r90,%r89; .loc 1 190 9 shl.b64 %r333,%r332,3; sub.u64 %r119,%r122,%r333; .loc 1 191 23 mul.lo.u64 %r334,%r93,%r89; .loc 1 191 9 shl.b64 %r335,%r334,3; sub.u64 %r120,%r123,%r335; .loc 1 192 5 add.u64 %r121,%r121,1; .loc 1 193 7 add.u64 %r132,%r132,8; add.u64 %r174,%r174,8; setp.eq.u64 %r336,%r121,%r355; @ %r336 bra $L1; $L36: .loc 1 201 16 ld.u64 %r337,[%r132]; add.u64 %r89,%r337,1; st.u64 [%r132],%r89; .loc 1 202 23 add.u64 %r339,%r356,%r174; ld.u64 %r90,[%r339]; .loc 1 202 13 shl.b64 %r340,%r90,3; add.u64 %r122,%r119,%r340; .loc 1 203 23 add.u64 %r341,%frame,%r174; ld.u64 %r93,[%r341]; .loc 1 203 13 shl.b64 %r342,%r93,3; add.u64 %r123,%r120,%r342; .loc 1 183 32 add.u64 %r344,%r357,%r174; ld.u64 %r96,[%r344]; .loc 1 183 13 setp.eq.u64 %r345,%r89,%r96; @ %r345 bra $L37; .loc 1 187 13 mov.u64 %r137,0; bra $L25; $L53: .loc 1 87 6 ld.u64 %r346,[%r181]; setp.eq.u64 %r347,%r346,0; @ ! %r347 bra $L11; bra $L38; $L_gfortran_mminval_r8 .visible .func _gfortran_mminval_r8608pred %r215; .reg .u32 %r219; .reg .u64 %r221; .reg .u32 %r223; .reg .u32 %r224; .reg .u16 %r226; .reg .u16 %r227; .reg .u16u16 %r263; .reg .u32pred %r282; .reg .u64 %r283; .reg .u64u64u6464pred %r325; .reg .u64 %r327; .reg .u64 %r329; .reg .pred %r331; .reg .pred %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u32u64pred %r357; .reg .u64 %r358; .reg .u64 %r359; .reg .predu64 %r392; .reg .u64 %r393; .reg .pred %r394; .reg .predpred %r403; .reg .pred %r405; .reg .predpredu64 %r422; .reg .u64 %r424; .reg .u64 %r425; .reg .u64 %r427; .reg .pred %r428; .reg .pred %r429; .reg .u64 %r430; .reg .pred %r431; .reg .u64 %r433; .reg .predu64 %r449; mov.u64 %r211,%ar0; mov.u64 %r212,%ar1; mov.u64 %r213,%ar2; mov.u64 %r214,%ar3; .loc 1 237 6 setp.ne.u64 %r215,%r214,0; @ %r215 bra $Lgfortran_minval_r8.loc 1 244 7 bra $L54; $L55: .loc 1 247 10 ld.u64 %r22,[%r213]; .loc 1 247 7 add.u64 %r142,%r22,-1; .loc 1 248 10 ld.s8 %r23,[%r212+28]; .loc 1 248 38 add.u32 %r219,%r23,-1; .loc 1 248 8 cvt.s64.s32 %r143,%r219; .loc 1 251 7 shr.u64 %r221,%r142,63; set.u32.gt.s64 %r223,%r142,%r143; neg.s32 %r224,%r223; cvt.u16.u64 %r227,%r221; cvt.u16.u32 %r228,%r224; or.b16 %r226,%r227,%r228; .loc 1 251 6 cvt.u32.u16 %r229,%r226; cvt.u16.u8 %r230,%r229; setp.eq.u16 %r231,%r230,0; @ %r231 bra $L57; .loc 1 253 7 cvt.u32.u32 %r234,%r23; cvt.s64.s8 %r233,%r234; st.u64 [%stack+8],%r233; st.u64 [%stack],%r22; cvta.const.u64 %r2322stack; call _gfortran_runtime_error7: .loc 1 258 9 add.u64 %r435,%r142,%r142; add.u64 %r433,%r435,%r142; shl.b64 %r239,%r433,3; add.u64 %r240,%r212,%r239; ld.u64 %r243,[%r240+56]; add.u64 %r242,%r243,1; .loc 1 258 7 ld.u64 %r250,[%r240+48]; sub.u64 %r144,%r242,%r250; .loc 1 259 6 setp.le.s64 %r251,%r144,0; @ %r251 bra $L54; .loc 1 262 9 ld.u64 %r159,[%r214]; .loc 1 264 15 ld.u64 %r33,[%r214+16]; .loc 1 266 22 cvt.u32.u64 %r34,%r33; .loc 1 266 53 add.u32 %r252,%r34,-4; and.b32 %r253,%r252,-5; set.u32.eq.u32 %r255,%r253,0; neg.s32 %r256,%r255; .loc 1 266 22 add.u32 %r257,%r34,-1; set.u32.le.u32 %r259,%r257,1; neg.s32 %r260,%r259; .loc 1 266 6 cvt.u16.u32 %r262,%r256; cvt.u16.u32 %r263,%r260; or.b16 %r261,%r262,%r263; cvt.u32.u16 %r264,%r261; cvt.u16.u8 %r265,%r264; setp.ne.u16 %r266,%r265,0; @ %r266 bra $L59; .loc 1 268 7 setp.ne.u32 %r268,%r34,16; @ %r268 bra $L60; 
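// $L59 below begins the masked reduction of _gfortran_mminval_r8.  It mirrors
// the REAL(4) variant above with 8-byte elements (the stride scaling uses
// shl.b64 ...,3 instead of ...,2) and the double-precision constants
// 0d7fefffffffffffff (DBL_MAX), 0d7ff0000000000000 (+infinity) and
// 0d7ff8000000000000 (quiet NaN).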
$L59: .loc 1 275 9 shl.b64 %r272,%r433,3; add.u64 %r273,%r212,%r272; ld.u64 %r147,[%r273+40]; .loc 1 276 12 add.u64 %r279,%r214,%r272; ld.u64 %r40,[%r279+40]; .loc 1 278 3 setp.ne.u64 %r281,%r142,0; @ %r281 bra $L61; $L67: .loc 1 288 3 setp.lt.s64 %r282,%r142,%r143; @ %r282 bra $L62; bra $L118; $L60: .loc 1 273 5 cvta.const.u64 %r283283_gfortran_runtime_error61: add.u64 %r53,%r212,40; add.u64 %r163,%r214,40; add.u64 %r286,%r22,%r22; add.u64 %r287,%r286,%r22; shl.b64 %r288,%r287,3; add.u64 %r289,%r212,16; add.u64 %r39,%r288,%r289; .loc 1 278 3 mov.u64 %r209,0; add.u64 %r438,%frame,240; add.u64 %r436,%frame,360; .loc 1 285 12 mov.u64 %r449,%r209; $L66: .loc 1 280 18 add.u64 %r291,%r438,%r209; ld.u64 %r292,[%r53]; st.u64 [%r291],%r292; .loc 1 281 18 add.u64 %r293,%frame,%r209; .loc 1 281 20 ld.u64 %r295,[%r163]; mul.lo.u64 %r294,%r295,%r33; .loc 1 281 18 st.u64 [%r293],%r294; .loc 1 282 19 ld.u64 %r297,[%r53+16]; add.u64 %r296,%r297,1; ld.u64 %r298,[%r53+8]; sub.u64 %r51,%r296,%r298; .loc 1 284 10 setp.lt.s64 %r299,%r51,0; @ %r299 bra $L64; .loc 1 282 17 add.u64 %r301,%r436,%r209; st.u64 [%r301],%r51; bra $L65; $L64: .loc 1 285 12 add.u64 %r303,%r436,%r209; st.u64 [%r303],%r449; $L65: .loc 1 278 3 add.u64 %r53,%r53,24; add.u64 %r163,%r163,24; add.u64 %r209,%r209,8; setp.ne.u64 %r305,%r39,%r53; @ %r305 bra $L66; bra $L67; $L118: .loc 1 298 6 ld.u64 %r306,[%r211]; setp.eq.u64 %r307,%r306,0; @ ! %r307 bra $L69; bra $L68; $L62: add.u64 %r309,%r22,%r22; add.u64 %r310,%r309,%r22; shl.b64 %r311,%r310,3; add.u64 %r167,%r311,40; add.u64 %r171,%r212,%r167; add.u64 %r165,%r214,%r167; shl.b64 %r312,%r22,3; add.u64 %r139,%r312,-8; cvt.u32.u32 %r314,%r23; cvt.s64.s8 %r313,%r314; shl.b64 %r315,%r313,3; add.u64 %r124,%r315,-8; add.u64 %r438,%frame,240; add.u64 %r436,%frame,360; .loc 1 295 12 mov.u64 %r448,0; $L72: .loc 1 290 18 add.u64 %r317,%r438,%r139; ld.u64 %r318,[%r171]; st.u64 [%r317],%r318; .loc 1 291 18 add.u64 %r319,%frame,%r139; .loc 1 291 20 ld.u64 %r321,[%r165]; mul.lo.u64 %r320,%r321,%r33; .loc 1 291 18 st.u64 [%r319],%r320; .loc 1 292 19 ld.u64 %r323,[%r171+16]; add.u64 %r322,%r323,1; ld.u64 %r324,[%r171+8]; sub.u64 %r63,%r322,%r324; .loc 1 294 10 setp.lt.s64 %r325,%r63,0; @ %r325 bra $L70; .loc 1 292 17 add.u64 %r327,%r436,%r139; st.u64 [%r327],%r63; bra $L71; $L70: .loc 1 295 12 add.u64 %r329,%r436,%r139; st.u64 [%r329],%r448; $L71: .loc 1 288 3 add.u64 %r171,%r171,24; add.u64 %r165,%r165,24; add.u64 %r139,%r139,8; setp.ne.u64 %r331,%r124,%r139; @ %r331 bra $L72; bra $L119; $L68: .loc 1 302 7 setp.eq.u64 %r332,%r143,0; @ %r332 bra $L74; add.u64 %r436,%frame,360; $L96: add.u64 %r185,%r211,40; mov.u64 %r183,%r436; add.u64 %r333,%r211,16; cvt.u32.u32 %r335,%r23; cvt.s64.s8 %r334,%r335; add.u64 %r337,%r334,%r334; add.u64 %r338,%r337,%r334; shl.b64 %r339,%r338,3; add.u64 %r172,%r333,%r339; .loc 1 305 10 mov.u64 %r131,1; .loc 1 309 4 mov.u64 %r358,0; bra $L75; $L74: .loc 1 313 20 add.u32 %r340,%r23,-2; cvt.s64.s32 %r73,%r340; add.u64 %r342,%r73,%r73; add.u64 %r343,%r342,%r73; shl.b64 %r344,%r343,3; add.u64 %r345,%r211,%r344; .loc 1 313 67 shl.b64 %r347,%r73,3; add.u64 %r348,%frame,%r347; .loc 1 313 59 ld.u64 %r350,[%r345+40]; ld.u64 %r351,[%r348+360]; mul.lo.u64 %r151,%r350,%r351; .loc 1 315 24 mov.u64 %r352,0; st.u64 [%r211+8],%r352; .loc 1 316 28 cvt.u16.u32 %r355,%r23; add.u16 %r354,%r355,-1; cvt.u32.u16 %r356,%r354; st.u8 [%r211+28],%r356; .loc 1 318 10 setp.eq.u64 %r357,%r151,0; @ %r357 bra $L76; bra $L120; $L78: .loc 1 307 47 mul.lo.u64 %r131,%r70,%r131; $L75: .loc 1 309 4 st.u64 
[%r185+8],%r358; ld.u64 %r70,[%r183]; add.u64 %r359,%r70,-1; st.u64 [%r185+16],%r359; st.u64 [%r185],%r131; .loc 1 302 7 add.u64 %r185,%r185,24; add.u64 %r183,%r183,8; setp.ne.u64 %r360,%r172,%r185; @ %r360 bra $L78; bra $L74; $L76: .loc 1 321 4 st.u64 [%r211+48],%r151; mov.u64 %r362,-1; st.u64 [%r211+56],%r362; mov.u64 %r363,1; st.u64 [%r211+40],%r363; .loc 1 322 4 bra $L54; $L120: .loc 1 325 24 mov.u64 %r365365; call (%value_in),_gfortrani_xmallocarray66,[%value_in]; } .loc 1 325 22 st.u64 [%r211],%r366; bra $L79; $L69: .loc 1 330 19 ld.s8 %r368,[%r211+28]; .loc 1 330 10 setp.eq.u64 %r369,%r368,%r143; @ %r369 bra $L80; .loc 1 331 2 cvta.const.u64 %r370370_gfortran_runtime_error80: .loc 1 333 11 cvta.global.u64 %r372,_gfortrani_compile_options; .loc 1 333 10 ld.u32 %r373,[%r372+36]; setp.eq.u32 %r374,%r373,0; @ %r374 bra $L79; .loc 1 335 4 add.u64 %r436,%frame,360; cvta.const.u64 %r378,$LC2r4r377378; call _gfortrani_bounds_ifunction_return337 4 cvta.const.u64 %r382,$LC1212382378; call _gfortrani_bounds_equal_extents$L79: .loc 1 342 3 setp.ne.u64 %r384,%r143,0; @ %r384 bra $L81; $L84: .loc 1 350 8 ld.u64 %r160,[%r211]; .loc 1 351 8 ld.u64 %r158,[%r212]; .loc 1 353 9 setp.ne.u64 %r385,%r158,0; @ %r385 bra $L82; bra $L54; $L81: add.u64 %r198,%frame,480; add.u64 %r197,%r211,40; add.u64 %r195,%frame,120; cvt.u32.u32 %r387,%r23; cvt.s64.s8 %r386,%r387; add.u64 %r186,%r386,-1; .loc 1 342 3 mov.u64 %r161,0; add.u64 %r436,%frame,360; .loc 1 344 16 mov.u64 %r388,%r161; $L83: st.u64 [%r198],%r388; .loc 1 345 18 ld.u64 %r389,[%r197]; st.u64 [%r195],%r389; .loc 1 346 17 shl.b64 %r391,%r161,3; add.u64 %r392,%r436,%r391; .loc 1 346 10 ld.u64 %r393,[%r392]; setp.le.s64 %r394,%r393,0; @ %r394 bra $L54; .loc 1 342 26 add.u64 %r161,%r161,1; .loc 1 342 3 add.u64 %r198,%r198,8; add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; setp.ne.u64 %r395,%r161,%r186; @ %r395 bra $L83; bra $L84; $L82: .loc 1 276 12 mul.lo.u64 %r42,%r40,%r33; .loc 1 370 32 shl.b64 %r87,%r147,3; ld.u64 %r38,[%frame+480]; .loc 1 401 22 ld.u64 %r95,[%frame+240]; .loc 1 401 12 shl.b64 %r96,%r95,3; .loc 1 402 23 ld.u64 %r98,[%frame]; .loc 1 403 22 ld.u64 %r100,[%frame+120]; .loc 1 403 12 shl.b64 %r101,%r100,3; .loc 1 405 32 ld.u64 %r59,[%frame+360]; .loc 1 412 23 mul.lo.u64 %r37,%r95,%r59; .loc 1 413 24 mul.lo.u64 %r396,%r98,%r59; .loc 1 413 10 neg.s64 %r89,%r396; .loc 1 414 23 mul.lo.u64 %r397,%r100,%r59; .loc 1 414 9 shl.b64 %r398,%r397,3; neg.s64 %r132,%r398; setp.le.s64 %r434,%r143,1; .loc 1 377 19 mov.u32 %r440,1; cvt.u32.u32 %r442,%r23; cvt.s64.s8 %r443,%r442; add.u64 %r444,%r443,-1; add.u64 %r445,%frame,240; add.u64 %r446,%frame,360; add.u64 %r447,%frame,120; $L97: .loc 1 342 3 mov.u64 %r154,%r159; mov.u64 %r153,%r158; .loc 1 368 6 mov.u32 %r137,0; .loc 1 370 9 mov.u64 %r152,0; $L87: .loc 1 374 7 ld.s8 %r210,[%r154]; .loc 1 374 6 setp.eq.u32 %r437,%r210,0; @ %r437 bra $L85; .loc 1 378 10 ld.f64 %r400,[%r153]; setp.le.f64 %r401,%r400,0d7ff0000000000000; @ %r401 bra $L86; .loc 1 377 19 mov.u32 %r137,%r440; $L85: .loc 1 370 24 add.u64 %r152,%r152,1; .loc 1 370 32 add.u64 %r153,%r153,%r87; .loc 1 370 47 add.u64 %r154,%r154,%r42; .loc 1 370 2 setp.ne.u64 %r402,%r144,%r152; @ %r402 bra $L87; .loc 1 386 47 setp.ne.u32 %r403,%r137,0; selp.f64 %r136,0d7ff8000000000000,0d7fefffffffffffff,%r403; bra $L88; $L91: .loc 1 394 6 @ %r437 bra $L89; .loc 1 394 16 ld.f64 %r91,[%r153]; .loc 1 394 13 setp.lt.f64 %r405,%r91,%r136; .loc 1 395 12 selp.f64 %r136,%r91,%r136,%r405; $L89: .loc 1 391 28 add.u64 %r152,%r152,1; .loc 1 391 36 add.u64 %r153,%r153,%r87; 
.loc 1 391 51 add.u64 %r154,%r154,%r42; .loc 1 391 11 setp.eq.u64 %r406,%r144,%r152; @ %r406 bra $L88; .loc 1 394 7 ld.s8 %r210,[%r154]; setp.eq.u32 %r437,%r210,0; bra $L91; $L99: .loc 1 386 47 mov.f64 %r136,0d7ff8000000000000; $L88: .loc 1 397 8 st.f64 [%r160],%r136; .loc 1 400 15 add.u64 %r38,%r38,1; .loc 1 401 12 add.u64 %r158,%r158,%r96; .loc 1 402 13 add.u64 %r159,%r159,%r98; .loc 1 403 12 add.u64 %r160,%r160,%r101; .loc 1 405 13 setp.ne.u64 %r407,%r59,%r38; @ %r407 bra $L97; .loc 1 413 10 add.u64 %r155,%r159,%r89; .loc 1 414 9 add.u64 %r156,%r160,%r132; .loc 1 416 7 @ %r434 bra $L54; add.u64 %r169,%frame,488; .loc 1 412 23 mov.u64 %r103,%r37; .loc 1 416 7 mov.u64 %r141,8; .loc 1 415 5 mov.u64 %r157,1; .loc 1 409 13 mov.u64 %r439,0; bra $L94; $L95: st.u64 [%r169],%r439; .loc 1 412 23 mul.lo.u64 %r103,%r114,%r113; .loc 1 413 24 mul.lo.u64 %r413,%r116,%r113; .loc 1 413 10 sub.u64 %r155,%r159,%r413; .loc 1 414 23 mul.lo.u64 %r414,%r118,%r113; .loc 1 414 9 shl.b64 %r415,%r414,3; sub.u64 %r156,%r160,%r415; .loc 1 415 5 add.u64 %r157,%r157,1; .loc 1 416 7 add.u64 %r169,%r169,8; add.u64 %r141,%r141,8; setp.eq.u64 %r416,%r157,%r444; @ %r416 bra $L54; $L94: .loc 1 424 16 ld.u64 %r417,[%r169]; add.u64 %r113,%r417,1; st.u64 [%r169],%r113; .loc 1 425 23 add.u64 %r419,%r445,%r141; ld.u64 %r114,[%r419]; .loc 1 425 13 sub.u64 %r420,%r114,%r103; shl.b64 %r421,%r420,3; add.u64 %r158,%r158,%r421; .loc 1 426 24 add.u64 %r422,%frame,%r141; ld.u64 %r116,[%r422]; .loc 1 426 14 add.u64 %r159,%r155,%r116; .loc 1 427 23 add.u64 %r424,%r447,%r141; ld.u64 %r118,[%r424]; .loc 1 427 13 shl.b64 %r425,%r118,3; add.u64 %r160,%r156,%r425; .loc 1 405 32 add.u64 %r427,%r446,%r141; ld.u64 %r121,[%r427]; .loc 1 405 13 setp.eq.u64 %r428,%r113,%r121; @ %r428 bra $L95; .loc 1 409 13 mov.u64 %r38,0; bra $L97; $L86: .loc 1 383 9 setp.le.s64 %r429,%r144,%r152; @ %r429 bra $L99; mov.f64 %r136,0d7ff0000000000000; bra $L91; $L119: .loc 1 298 6 ld.u64 %r430,[%r211]; setp.eq.u64 %r431,%r430,0; @ ! 
%r431 bra $L69; bra $L96; $L54:_gfortran_sminval_r8 .visible .func _gfortran_sminval_r868u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64 %r276; .reg .predpredu64mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 454 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L122; .loc 1 454 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L123; r152; call _gfortran_minval_r8.loc 1 461 7 bra $L121; $L123: .loc 1 464 10 ld.u64 %r23,[%r152]; .loc 1 464 7 add.u64 %r88,%r23,-1; .loc 1 465 10 ld.s8 %r24,[%r151+28]; .loc 1 465 38 add.u32 %r26,%r24,-1; .loc 1 465 8 cvt.s64.s32 %r294,%r26; .loc 1 467 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r294; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 467 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L125; .loc 1 474 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L126; $L132: .loc 1 482 3 setp.lt.s64 %r173,%r88,%r294; @ %r173 bra $L127; bra $L180; $L125: .loc 1 469 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error126: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 479 12 mov.u64 %r300,0; $L131: .loc 1 476 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 478 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L129; .loc 1 476 17 st.u64 [%r139],%r36; bra $L130; $L129: .loc 1 479 12 st.u64 [%r139],%r300; $L130: .loc 1 474 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L131; bra $L132; $L180: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L134; bra $L133; $L127: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r293,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r293,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 488 12 mov.u64 %r299,0; $L137: .loc 1 485 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 487 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L135; .loc 1 484 17 st.u64 [%r43],%r41; bra $L136; $L135: .loc 1 488 12 st.u64 [%r43],%r299; $L136: .loc 1 482 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L137; bra $L181; $L133: .loc 1 495 7 setp.eq.u64 %r211,%r294,0; @ %r211 bra $L139; add.u64 %r293,%frame,120; $L159: add.u64 %r111,%r150,40; mov.u64 %r107,%r293; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 498 10 mov.u64 %r81,1; .loc 1 502 4 mov.u64 %r237,0; bra $L140; $L139: .loc 1 506 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 507 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 509 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 509 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 509 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 511 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L141; bra $L182; $L143: .loc 1 500 48 mul.lo.u64 %r81,%r46,%r81; $L140: .loc 1 502 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 495 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L143; bra $L139; $L141: .loc 1 514 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 515 4 bra $L121; $L182: .loc 1 518 24 mov.u64 %r24496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 518 22 st.u64 [%r150],%r98; $L147: .loc 1 544 3 setp.ne.u64 %r247,%r294,0; @ %r247 bra $L144; bra $L145; $L134: .loc 1 522 19 ld.s8 %r55,[%r150+28]; .loc 1 522 10 setp.eq.u64 %r248,%r55,%r294; @ %r248 bra $L146; .loc 1 523 2 st.u64 [%stack+8],%r294; st.u64 [%stack],%r55;_gfortran_runtime_error146: .loc 1 528 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 528 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L147; .loc 1 530 4 setp.eq.u64 %r254,%r294,0; @ %r254 bra $L145; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 530 10 mov.u64 %r104,0; $L149: .loc 1 534 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 534 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 535 18 ld.u64 %r63,[%r31]; .loc 1 530 25 add.u64 %r104,%r104,1; .loc 1 535 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L148; .loc 1 536 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error148: .loc 1 530 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L149; $L144: 
.loc 1 546 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L150; shl.b64 %r264,%r294,3; bra $L151; $L150: mov.u64 %r264,8; $L151: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L152: .loc 1 547 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 544 26 add.u64 %r100,%r100,1; .loc 1 544 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r294,%r100; @ %r277 bra $L152; $L145: ld.u64 %r80,[%frame+240]; .loc 1 556 22 ld.u64 %r68,[%frame]; .loc 1 556 12 shl.b64 %r69,%r68,3; .loc 1 558 32 ld.u64 %r59,[%frame+120]; .loc 1 565 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r292,%r294,1; .loc 1 554 13 mov.f64 %r278,0d7fefffffffffffff; cvt.u32.u32 %r296,%r24; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; $L158: st.f64 [%r98],%r278; .loc 1 555 15 add.u64 %r80,%r80,1; .loc 1 556 12 add.u64 %r98,%r98,%r69; .loc 1 558 13 setp.ne.u64 %r279,%r59,%r80; @ %r279 bra $L158; .loc 1 567 7 @ ! %r292 bra $L183; bra $L121; $L157: .loc 1 562 13 st.u64 [%r99],%r295; .loc 1 565 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 566 5 add.u64 %r97,%r97,1; .loc 1 567 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r282,%r97,%r298; @ %r282 bra $L156; bra $L121; $L183: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 565 23 mov.u64 %r72,%r44; .loc 1 566 5 mov.u64 %r97,1; .loc 1 562 13 mov.u64 %r295,0; $L156: .loc 1 571 16 ld.u64 %r287,[%r99]; add.u64 %r74,%r287,1; st.u64 [%r99],%r74; .loc 1 572 23 ld.u64 %r75,[%r124]; .loc 1 572 13 sub.u64 %r288,%r75,%r72; shl.b64 %r289,%r288,3; add.u64 %r98,%r98,%r289; .loc 1 558 32 ld.u64 %r77,[%r123]; .loc 1 558 13 setp.eq.u64 %r290,%r74,%r77; @ %r290 bra $L157; .loc 1 562 13 mov.u64 %r80,0; bra $L158; $L181: .loc 1 491 15 ld.u64 %r98,[%r150]; .loc 1 491 6 setp.eq.u64 %r291,%r98,0; @ ! 
%r291 bra $L134; bra $L159; $L121: .loc 1 576 1 ret; } minval_r10.o/ product_i1.o/_gfortran_product_i1 .visible .func _gfortran_product_fortran/generated/product_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mproduct_i1 .visible .func _gfortran_mproduct_i1gfortran_sproduct_i1 .visible .func _gfortran_sproduct_i1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,108] = {80,82,79,68,85,67,84114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_product_i1 .visible .func _gfortran_product_480predu64 %r229; .reg .u64u64pred %r247; .reg .pred %r249; .reg .predu64u64 %r334; .reg .u64 %r336; .reg .predpred %r343; .reg .u32u64 %r353; .reg .u16 %r354; mov.u64 %r174,%ar0; mov.u64 %r175,%ar1; mov.u64 %r176,%ar2; .loc 1 55 10 ld.s8 %r22,[%r175+28]; .loc 1 55 38 add.u32 %r177,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r107,%r177; .loc 1 56 10 ld.u64 %r25,[%r176]; .loc 1 56 7 add.u64 %r108,%r25,-1; .loc 1 58 7 shr.u64 %r179,%r108,63; set.u32.lt.s64 %r181,%r107,%r108; neg.s32 %r182,%r181; cvt.u16.u64 %r185,%r179; cvt.u16.u32 %r186,%r182; or.b16 %r184,%r185,%r186; .loc 1 58 6 cvt.u32.u16 %r187,%r184; cvt.u16.u8 %r188,%r187; setp.eq.u16 %r189,%r188,0; @ %r189 bra $L2; .loc 1 60 7 cvt.u32.u32 %r192,%r22; cvt.s64.s8 %r191,%r192; st.u64 [%stack+8],%r19190,$LC0;stack; call _gfortran_runtime_errorr195,%r108,%r108; add.u64 %r196,%r195,%r108; shl.b64 %r197,%r196,3; add.u64 %r198,%r175,%r197; ld.u64 %r30,[%r198+56]; ld.u64 %r32,[%r198+48]; .loc 1 68 9 ld.u64 %r111,[%r198+40]; .loc 1 70 3 setp.ne.u64 %r212,%r108,0; @ %r212 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r213,%r107,%r108; @ %r213 bra $L4; bra $L46; $L3: add.u64 %r125,%r175,40; add.u64 %r172,%frame,120; add.u64 %r170,%frame,240; add.u64 %r214,%r175,16; add.u64 %r216,%r25,%r25; add.u64 %r217,%r216,%r25; shl.b64 %r218,%r217,3; add.u64 %r79,%r214,%r218; .loc 1 76 12 mov.u64 %r353,0; $L8: .loc 1 72 18 ld.u64 %r219,[%r125]; st.u64 [%r172],%r219; .loc 1 73 19 ld.u64 %r221,[%r125+16]; add.u64 %r220,%r221,1; ld.u64 %r222,[%r125+8]; sub.u64 %r37,%r220,%r222; .loc 1 75 10 setp.lt.s64 %r223,%r37,0; @ %r223 bra $L6; .loc 1 73 17 st.u64 
[%r170],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r170],%r353; $L7: .loc 1 70 3 add.u64 %r125,%r125,24; add.u64 %r172,%r172,8; add.u64 %r170,%r170,8; setp.ne.u64 %r225,%r79,%r125; @ %r225 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r226,[%r174]; setp.eq.u64 %r227,%r226,0; @ ! %r227 bra $L11; bra $L10; $L4: add.u64 %r229,%r25,%r25; add.u64 %r230,%r229,%r25; shl.b64 %r231,%r230,3; add.u64 %r232,%r231,40; add.u64 %r117,%r175,%r232; shl.b64 %r233,%r25,3; add.u64 %r97,%r233,-8; add.u64 %r340,%frame,120; add.u64 %r100,%r340,%r97; add.u64 %r341,%frame,240; add.u64 %r95,%r341,%r97; cvt.u32.u32 %r237,%r22; cvt.s64.s8 %r236,%r237; add.u64 %r239,%r236,%r236; add.u64 %r240,%r239,%r236; shl.b64 %r241,%r240,3; add.u64 %r242,%r175,40; add.u64 %r94,%r241,%r242; .loc 1 84 12 mov.u64 %r352,0; $L14: .loc 1 80 18 ld.u64 %r243,[%r117]; st.u64 [%r100],%r243; .loc 1 81 19 ld.u64 %r245,[%r117+16]; add.u64 %r244,%r245,1; ld.u64 %r246,[%r117+8]; sub.u64 %r44,%r244,%r246; .loc 1 83 10 setp.lt.s64 %r247,%r44,0; @ %r247 bra $L12; .loc 1 81 17 st.u64 [%r95],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r95],%r352; $L13: .loc 1 78 3 add.u64 %r117,%r117,24; add.u64 %r100,%r100,8; add.u64 %r95,%r95,8; setp.ne.u64 %r249,%r94,%r117; @ %r249 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r250,%r107,0; @ %r250 bra $L16; add.u64 %r341,%frame,240; $L35: add.u64 %r147,%r174,40; mov.u64 %r138,%r341; cvt.u32.u32 %r252,%r22; cvt.s64.s8 %r251,%r252; add.u64 %r254,%r251,%r251; add.u64 %r255,%r254,%r251; shl.b64 %r256,%r255,3; add.u64 %r257,%r174,16; add.u64 %r126,%r256,%r257; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r280,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r258,0; st.u64 [%r174+8],%r258; .loc 1 103 28 cvt.u16.u32 %r261,%r22; add.u16 %r260,%r261,-1; cvt.u32.u16 %r262,%r260; st.u8 [%r174+28],%r262; .loc 1 105 20 add.u32 %r263,%r22,-2; cvt.s64.s32 %r55,%r263; add.u64 %r265,%r55,%r55; add.u64 %r266,%r265,%r55; shl.b64 %r267,%r266,3; add.u64 %r268,%r174,%r267; .loc 1 105 67 shl.b64 %r270,%r55,3; add.u64 %r271,%frame,%r270; .loc 1 105 59 ld.u64 %r273,[%r268+40]; ld.u64 %r274,[%r271+240]; mul.lo.u64 %r113,%r273,%r274; .loc 1 107 29 mov.u64 %r27611(%value_in),_gfortrani_xmallocarray277,[%value_in]; } .loc 1 107 27 st.u64 [%r174],%r277; .loc 1 108 10 setp.eq.u64 %r279,%r113,0; @ ! 
%r279 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r51,%r99; $L17: .loc 1 98 4 st.u64 [%r147+8],%r280; ld.u64 %r51,[%r138]; add.u64 %r281,%r51,-1; st.u64 [%r147+16],%r281; st.u64 [%r147],%r99; .loc 1 91 7 add.u64 %r147,%r147,24; add.u64 %r138,%r138,8; setp.ne.u64 %r282,%r126,%r147; @ %r282 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r174+48],%r113; mov.u64 %r284,-1; st.u64 [%r174+56],%r284; st.u64 [%r174+40],%r276; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r62,[%r174+28]; .loc 1 118 10 setp.eq.u64 %r286,%r62,%r107; @ %r286 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r107;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r289,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r290,[%r289+36]; setp.eq.u32 %r291,%r290,0; @ %r291 bra $L21; .loc 1 125 2 add.u64 %r341,%frame,240; cvta.const.u64 %r295,$LC2r34129495; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r297,%r107,0; @ %r297 bra $L24; $L27: .loc 1 65 9 add.u64 %r298,%r30,1; .loc 1 65 7 sub.u64 %r109,%r298,%r32; max.s64 %r45,%r109,0; .loc 1 137 8 ld.u64 %r121,[%r175]; .loc 1 138 8 ld.u64 %r122,[%r174]; ld.u64 %r47,[%frame+360]; .loc 1 166 22 ld.u64 %r74,[%frame+120]; .loc 1 167 22 ld.u64 %r76,[%frame]; setp.gt.s64 %r342,%r109,0; setp.le.s64 %r343,%r107,1; cvt.u32.u32 %r345,%r22; cvt.s64.s8 %r346,%r345; add.u64 %r347,%r346,-1; add.u64 %r348,%frame,120; add.u64 %r349,%frame,240; .loc 1 173 13 mov.u64 %r350,0; bra $L25; $L24: add.u64 %r160,%frame,360; add.u64 %r159,%r174,40; mov.u64 %r157,%frame; cvt.u32.u32 %r300,%r22; cvt.s64.s8 %r299,%r300; add.u64 %r148,%r299,-1; .loc 1 129 3 mov.u64 %r123,0; add.u64 %r341,%frame,240; .loc 1 131 16 mov.u64 %r301,%r123; $L26: st.u64 [%r160],%r301; .loc 1 132 18 ld.u64 %r302,[%r159]; st.u64 [%r157],%r302; .loc 1 133 17 shl.b64 %r304,%r123,3; add.u64 %r305,%r341,%r304; .loc 1 133 10 ld.u64 %r306,[%r305]; setp.le.s64 %r307,%r306,0; @ %r307 bra $L1; .loc 1 129 26 add.u64 %r123,%r123,1; .loc 1 129 3 add.u64 %r160,%r160,8; add.u64 %r159,%r159,24; add.u64 %r157,%r157,8; setp.ne.u64 %r308,%r123,%r148; @ %r308 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r342 bra $L28; .loc 1 150 10 mov.u32 %r310,1; st.u8 [%r122],%r310; bra $L29; $L28: .loc 1 154 36 mov.u64 %r116,%r121; .loc 1 148 10 mov.u32 %r114,1; .loc 1 154 13 mov.u64 %r115,0; $L30: .loc 1 158 10 ld.u8 %r354,[%r116]; mov.u16 %r314,%r354; cvt.u16.u32 %r315,%r114; mul.lo.u16 %r313,%r314,%r315; cvt.u32.u16 %r316,%r313; cvt.s32.s8 %r114,%r316; .loc 1 154 28 add.u64 %r115,%r115,1; .loc 1 154 36 add.u64 %r116,%r116,%r111; .loc 1 154 6 setp.gt.s64 %r318,%r45,%r115; @ %r318 bra $L30; .loc 1 161 12 cvt.u32.u32 %r319,%r114; st.u8 [%r122],%r319; $L29: .loc 1 165 15 add.u64 %r47,%r47,1; .loc 1 166 12 add.u64 %r121,%r121,%r74; .loc 1 167 12 add.u64 %r122,%r122,%r76; .loc 1 169 32 ld.u64 %r128,[%frame+240]; .loc 1 169 13 setp.ne.u64 %r320,%r47,%r128; @ %r320 bra $L25; .loc 1 176 23 mul.lo.u64 %r321,%r74,%r47; .loc 1 176 9 sub.u64 %r118,%r121,%r321; .loc 1 177 23 mul.lo.u64 %r322,%r76,%r47; .loc 1 177 9 sub.u64 %r119,%r122,%r322; .loc 1 179 7 @ %r343 bra $L1; add.u64 %r143,%frame,368; mov.u64 %r106,8; .loc 1 178 5 mov.u64 %r120,1; bra $L33; $L34: .loc 1 173 13 st.u64 [%r143],%r350; .loc 1 176 23 mul.lo.u64 %r328,%r89,%r88; .loc 1 176 9 sub.u64 %r118,%r121,%r328; .loc 1 177 23 mul.lo.u64 %r329,%r91,%r88; .loc 1 177 9 sub.u64 %r119,%r122,%r329; .loc 1 178 5 add.u64 %r120,%r120,1; .loc 1 179 7 add.u64 %r143,%r143,8; add.u64 %r106,%r106,8; setp.eq.u64 %r330,%r120,%r347; @ %r330 bra $L1; $L33: .loc 1 187 16 ld.u64 %r331,[%r143]; 
add.u64 %r88,%r331,1; st.u64 [%r143],%r88; .loc 1 188 23 add.u64 %r333,%r348,%r106; ld.u64 %r89,[%r333]; .loc 1 188 13 add.u64 %r121,%r118,%r89; .loc 1 189 23 add.u64 %r334,%frame,%r106; ld.u64 %r91,[%r334]; .loc 1 189 13 add.u64 %r122,%r119,%r91; .loc 1 169 32 add.u64 %r336,%r349,%r106; ld.u64 %r93,[%r336]; .loc 1 169 13 setp.eq.u64 %r337,%r88,%r93; @ %r337 bra $L34; .loc 1 173 13 mov.u64 %r47,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r338,[%r174]; setp.eq.u64 %r339,%r338,0; @ ! %r339 bra $L11; bra $L35; $L1: .loc 1 19_gfortran_mproduct_i1 .visible .func _gfortran_mproduct_608pred %r210; .reg .u32 %r214; .reg .u64 %r216; .reg .u32 %r218; .reg .u32 %r219; .reg .u16 %r221; .reg .u16 %r222; .reg .u16 %r223; .reg .u32 %r224; .reg .u16 %r225; .reg .predu32 %r259; .reg .u16u64 %r280; .reg .u64u64predu64u64 %r333; .reg .u64 %r334; .reg .u32predpredu64 %r377; .reg .pred %r379; .reg .pred %r380; .reg .u64 %r381; .reg .u32pred %r389; .reg .predu32 %r399; .reg .predpredmov.u64 %r206,%ar0; mov.u64 %r207,%ar1; mov.u64 %r208,%ar2; mov.u64 %r209,%ar3; .loc 1 223 6 setp.ne.u64 %r210,%r209,0; @ %r210 bra $L49; .loc 1 228 7 {_gfortran_product_.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r208]; .loc 1 233 7 add.u64 %r135,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r207+28]; .loc 1 234 38 add.u32 %r214,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r136,%r214; .loc 1 237 7 shr.u64 %r216,%r135,63; set.u32.gt.s64 %r218,%r135,%r136; neg.s32 %r219,%r218; cvt.u16.u64 %r222,%r216; cvt.u16.u32 %r223,%r219; or.b16 %r221,%r222,%r223; .loc 1 237 6 cvt.u32.u16 %r224,%r221; cvt.u16.u8 %r225,%r224; setp.eq.u16 %r226,%r225,0; @ %r226 bra $L51; .loc 1 239 7 cvt.u32.u32 %r229,%r23; cvt.s64.s8 %r228,%r229; st.u64 [%stack+8],%r228; st.u64 [%stack],%r22; cvta.const.u64 %r227_gfortran_runtime_error1: .loc 1 244 9 add.u64 %r429,%r135,%r135; add.u64 %r430,%r429,%r135; shl.b64 %r234,%r430,3; add.u64 %r235,%r207,%r234; ld.u64 %r238,[%r235+56]; add.u64 %r237,%r238,1; .loc 1 244 7 ld.u64 %r245,[%r235+48]; sub.u64 %r137,%r237,%r245; .loc 1 245 6 setp.le.s64 %r246,%r137,0; @ %r246 bra $L48; .loc 1 248 9 ld.u64 %r149,[%r209]; .loc 1 250 15 ld.u64 %r34,[%r209+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r247,%r35,-4; and.b32 %r248,%r247,-5; set.u32.eq.u32 %r250,%r248,0; neg.s32 %r251,%r250; .loc 1 252 22 add.u32 %r252,%r35,-1; set.u32.le.u32 %r254,%r252,1; neg.s32 %r255,%r254; .loc 1 252 6 cvt.u16.u32 %r257,%r251; cvt.u16.u32 %r258,%r255; or.b16 %r256,%r257,%r258; cvt.u32.u16 %r259,%r256; cvt.u16.u8 %r260,%r259; setp.ne.u16 %r261,%r260,0; @ %r261 bra $L53; .loc 1 254 7 setp.ne.u32 %r263,%r35,16; @ %r263 bra $L54; $L53: .loc 1 261 9 shl.b64 %r267,%r430,3; add.u64 %r268,%r207,%r267; ld.u64 %r140,[%r268+40]; .loc 1 262 12 add.u64 %r274,%r209,%r267; ld.u64 %r42,[%r274+40]; .loc 1 264 3 setp.ne.u64 %r276,%r135,0; @ %r276 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r277,%r135,%r136; @ %r277 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r27827_gfortran_runtime_error5: add.u64 %r190,%r207,40; add.u64 %r192,%r209,40; add.u64 %r280,%r207,16; add.u64 %r282,%r22,%r22; add.u64 %r283,%r282,%r22; shl.b64 %r284,%r283,3; add.u64 %r203,%r280,%r284; .loc 1 264 3 mov.u64 %r194,0; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 271 12 mov.u64 %r441,%r194; $L60: .loc 1 266 18 add.u64 %r286,%r427,%r194; ld.u64 %r287,[%r190]; st.u64 [%r286],%r287; .loc 1 267 18 add.u64 %r288,%frame,%r194; .loc 1 267 20 ld.u64 %r290,[%r192]; mul.lo.u64 %r289,%r290,%r34; .loc 1 267 18 st.u64 [%r288],%r289; .loc 1 268 19 ld.u64 %r292,[%r190+16]; add.u64 
%r291,%r292,1; ld.u64 %r293,[%r190+8]; sub.u64 %r53,%r291,%r293; .loc 1 270 10 setp.lt.s64 %r294,%r53,0; @ %r294 bra $L58; .loc 1 268 17 add.u64 %r296,%r426,%r194; st.u64 [%r296],%r53; bra $L59; $L58: .loc 1 271 12 add.u64 %r298,%r426,%r194; st.u64 [%r298],%r441; $L59: .loc 1 264 3 add.u64 %r190,%r190,24; add.u64 %r192,%r192,24; add.u64 %r194,%r194,8; setp.ne.u64 %r300,%r190,%r203; @ %r300 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r301,[%r206]; setp.eq.u64 %r302,%r301,0; @ ! %r302 bra $L63; bra $L62; $L56: add.u64 %r304,%r22,%r22; add.u64 %r305,%r304,%r22; shl.b64 %r306,%r305,3; add.u64 %r155,%r306,40; add.u64 %r58,%r207,%r155; add.u64 %r157,%r209,%r155; shl.b64 %r307,%r22,3; add.u64 %r103,%r307,-8; cvt.u32.u32 %r309,%r23; cvt.s64.s8 %r308,%r309; shl.b64 %r310,%r308,3; add.u64 %r187,%r310,-8; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 281 12 mov.u64 %r440,0; $L66: .loc 1 276 18 add.u64 %r312,%r427,%r103; ld.u64 %r313,[%r58]; st.u64 [%r312],%r313; .loc 1 277 18 add.u64 %r314,%frame,%r103; .loc 1 277 20 ld.u64 %r316,[%r157]; mul.lo.u64 %r315,%r316,%r34; .loc 1 277 18 st.u64 [%r314],%r315; .loc 1 278 19 ld.u64 %r318,[%r58+16]; add.u64 %r317,%r318,1; ld.u64 %r319,[%r58+8]; sub.u64 %r64,%r317,%r319; .loc 1 280 10 setp.lt.s64 %r320,%r64,0; @ %r320 bra $L64; .loc 1 278 17 add.u64 %r322,%r426,%r103; st.u64 [%r322],%r64; bra $L65; $L64: .loc 1 281 12 add.u64 %r324,%r426,%r103; st.u64 [%r324],%r440; $L65: .loc 1 274 3 add.u64 %r58,%r58,24; add.u64 %r157,%r157,24; add.u64 %r103,%r103,8; setp.ne.u64 %r326,%r103,%r187; @ %r326 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r327,%r136,0; @ %r327 bra $L68; add.u64 %r426,%frame,360; $L86: add.u64 %r127,%r206,40; mov.u64 %r121,%r426; cvt.u32.u32 %r329,%r23; cvt.s64.s8 %r328,%r329; add.u64 %r331,%r328,%r328; add.u64 %r332,%r331,%r328; shl.b64 %r333,%r332,3; add.u64 %r334,%r206,16; add.u64 %r72,%r333,%r334; .loc 1 291 10 mov.u64 %r123,1; .loc 1 295 4 mov.u64 %r353,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r335,%r23,-2; cvt.s64.s32 %r70,%r335; add.u64 %r337,%r70,%r70; add.u64 %r338,%r337,%r70; shl.b64 %r339,%r338,3; add.u64 %r340,%r206,%r339; .loc 1 299 67 shl.b64 %r342,%r70,3; add.u64 %r343,%frame,%r342; .loc 1 299 59 ld.u64 %r345,[%r340+40]; ld.u64 %r346,[%r343+360]; mul.lo.u64 %r143,%r345,%r346; .loc 1 301 24 mov.u64 %r347,0; st.u64 [%r206+8],%r347; .loc 1 302 28 cvt.u16.u32 %r350,%r23; add.u16 %r349,%r350,-1; cvt.u32.u16 %r351,%r349; st.u8 [%r206+28],%r351; .loc 1 304 10 setp.eq.u64 %r352,%r143,0; @ %r352 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r123,%r67,%r123; $L69: .loc 1 295 4 st.u64 [%r127+8],%r353; ld.u64 %r67,[%r121]; add.u64 %r354,%r67,-1; st.u64 [%r127+16],%r354; st.u64 [%r127],%r123; .loc 1 288 7 add.u64 %r127,%r127,24; add.u64 %r121,%r121,8; setp.ne.u64 %r355,%r72,%r127; @ %r355 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r206+48],%r143; mov.u64 %r357,-1; st.u64 [%r206+56],%r357; mov.u64 %r358,1; st.u64 [%r206+40],%r358; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r360360; call (%value_in),_gfortrani_xmallocarray61,[%value_in]; } .loc 1 311 22 st.u64 [%r206],%r361; bra $L73; $L63: .loc 1 316 19 ld.s8 %r363,[%r206+28]; .loc 1 316 10 setp.eq.u64 %r364,%r363,%r136; @ %r364 bra $L74; .loc 1 317 2 cvta.const.u64 %r365365_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r367,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r368,[%r367+36]; setp.eq.u32 %r369,%r368,0; @ %r369 bra $L73; .loc 1 321 4 add.u64 %r426,%frame,360; cvta.const.u64 %r373,$LC2; cvta.const.u64 %r372,$LC3; {372373; call 
_gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r377,$LC377373; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r379,%r136,0; @ %r379 bra $L75; $L78: .loc 1 336 8 ld.u64 %r150,[%r206]; .loc 1 337 8 ld.u64 %r148,[%r207]; .loc 1 339 9 setp.ne.u64 %r380,%r148,0; @ %r380 bra $L76; bra $L48; $L75: add.u64 %r173,%frame,480; add.u64 %r170,%r206,40; add.u64 %r163,%frame,120; cvt.u32.u32 %r382,%r23; cvt.s64.s8 %r381,%r382; add.u64 %r128,%r381,-1; .loc 1 328 3 mov.u64 %r154,0; add.u64 %r426,%frame,360; .loc 1 330 16 mov.u64 %r383,%r154; $L77: st.u64 [%r173],%r383; .loc 1 331 18 ld.u64 %r384,[%r170]; st.u64 [%r163],%r384; .loc 1 332 17 shl.b64 %r386,%r154,3; add.u64 %r387,%r426,%r386; .loc 1 332 10 ld.u64 %r388,[%r387]; setp.le.s64 %r389,%r388,0; @ %r389 bra $L48; .loc 1 328 26 add.u64 %r154,%r154,1; .loc 1 328 3 add.u64 %r173,%r173,8; add.u64 %r170,%r170,24; add.u64 %r163,%r163,8; setp.ne.u64 %r390,%r128,%r154; @ %r390 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r44,%r42,%r34; ld.u64 %r176,[%frame+480]; .loc 1 359 22 ld.u64 %r89,[%frame+240]; .loc 1 360 23 ld.u64 %r91,[%frame]; .loc 1 361 22 ld.u64 %r93,[%frame+120]; setp.le.s64 %r431,%r136,1; .loc 1 348 10 mov.u32 %r433,1; cvt.u32.u32 %r434,%r23; cvt.s64.s8 %r435,%r434; add.u64 %r436,%r435,-1; add.u64 %r437,%frame,240; add.u64 %r438,%frame,360; add.u64 %r439,%frame,120; $L87: .loc 1 328 3 mov.u64 %r153,%r149; mov.u64 %r152,%r148; .loc 1 348 10 mov.u32 %r125,%r433; .loc 1 349 9 mov.u64 %r151,0; $L80: .loc 1 352 6 ld.s8 %r392,[%r153]; cvt.u16.u32 %r391,%r392; setp.eq.u16 %r393,%r391,0; @ %r393 bra $L79; .loc 1 353 12 ld.u8 %r442,[%r152]; mov.u16 %r397,%r442; cvt.u16.u32 %r398,%r125; mul.lo.u16 %r396,%r397,%r398; cvt.u32.u16 %r399,%r396; cvt.s32.s8 %r125,%r399; $L79: .loc 1 349 24 add.u64 %r151,%r151,1; .loc 1 349 32 add.u64 %r152,%r152,%r140; .loc 1 349 47 add.u64 %r153,%r153,%r44; .loc 1 349 2 setp.ne.u64 %r401,%r137,%r151; @ %r401 bra $L80; .loc 1 355 8 cvt.u32.u32 %r402,%r125; st.u8 [%r150],%r402; .loc 1 358 15 add.u64 %r176,%r176,1; .loc 1 359 12 add.u64 %r148,%r148,%r89; .loc 1 360 13 add.u64 %r149,%r149,%r91; .loc 1 361 12 add.u64 %r150,%r150,%r93; .loc 1 363 32 ld.u64 %r80,[%frame+360]; .loc 1 363 13 setp.ne.u64 %r403,%r80,%r176; @ %r403 bra $L87; .loc 1 370 23 mul.lo.u64 %r97,%r176,%r89; .loc 1 371 24 mul.lo.u64 %r404,%r176,%r91; .loc 1 371 10 sub.u64 %r145,%r149,%r404; .loc 1 372 23 mul.lo.u64 %r405,%r176,%r93; .loc 1 372 9 sub.u64 %r146,%r150,%r405; .loc 1 374 7 @ ! 
%r431 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r40],%r432; .loc 1 370 23 mul.lo.u64 %r97,%r109,%r108; .loc 1 371 24 mul.lo.u64 %r408,%r111,%r108; .loc 1 371 10 sub.u64 %r145,%r149,%r408; .loc 1 372 23 mul.lo.u64 %r409,%r113,%r108; .loc 1 372 9 sub.u64 %r146,%r150,%r409; .loc 1 373 5 add.u64 %r147,%r147,1; .loc 1 374 7 add.u64 %r40,%r40,8; add.u64 %r167,%r167,8; setp.ne.u64 %r410,%r147,%r436; @ %r410 bra $L84; bra $L48; $L108: add.u64 %r40,%frame,488; mov.u64 %r167,8; .loc 1 373 5 mov.u64 %r147,1; .loc 1 367 13 mov.u64 %r432,0; $L84: .loc 1 382 16 ld.u64 %r414,[%r40]; add.u64 %r108,%r414,1; st.u64 [%r40],%r108; .loc 1 383 23 add.u64 %r416,%r437,%r167; ld.u64 %r109,[%r416]; .loc 1 383 13 sub.u64 %r417,%r109,%r97; add.u64 %r148,%r148,%r417; .loc 1 384 24 add.u64 %r418,%frame,%r167; ld.u64 %r111,[%r418]; .loc 1 384 14 add.u64 %r149,%r145,%r111; .loc 1 385 23 add.u64 %r420,%r439,%r167; ld.u64 %r113,[%r420]; .loc 1 385 13 add.u64 %r150,%r146,%r113; .loc 1 363 32 add.u64 %r422,%r438,%r167; ld.u64 %r116,[%r422]; .loc 1 363 13 setp.eq.u64 %r423,%r108,%r116; @ %r423 bra $L85; .loc 1 367 13 mov.u64 %r176,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r424,[%r206]; setp.eq.u64 %r425,%r424,0; @ ! %r425 bra $L63; bra $L86; $L48:_gfortran_sproduct_i1 .visible .func _gfortran_sproduct_68predpred %r184; .reg .predpred %r206; .reg .predu32 %r211; .reg .u64 %r213; .reg .u64u32 %r222; .reg .u64pred %r264; .reg .u64 %r266; .reg .u32 %r270; .reg .u64 %r272; .reg .u64 %r274; .reg .predpredu64 %r297; mov.u64 %r148,%ar0; mov.u64 %r149,%ar1; mov.u64 %r150,%ar2; mov.u64 %r151,%ar3; .loc 1 412 6 setp.eq.u64 %r152,%r151,0; @ %r152 bra $L110; .loc 1 412 20 ld.u32 %r153,[%r151]; setp.eq.u32 %r154,%r153,0; @ %r154 bra $L111; $L110: .loc 1 417 7gfortran_product_.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r150]; .loc 1 422 7 add.u64 %r86,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r149+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r291,%r26; .loc 1 425 7 shr.u64 %r159,%r86,63; set.u32.gt.s64 %r161,%r86,%r291; neg.s32 %r162,%r161; cvt.u16.u64 %r165,%r159; cvt.u16.u32 %r166,%r162; or.b16 %r164,%r165,%r166; .loc 1 425 6 cvt.u32.u16 %r167,%r164; cvt.u16.u8 %r168,%r167; setp.ne.u16 %r169,%r168,0; @ %r169 bra $L113; .loc 1 432 3 setp.ne.u64 %r170,%r86,0; @ %r170 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r171,%r86,%r291; @ %r171 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r174,%r24; cvt.s64.s8 %r173,%r174; st.u64 [%stack+8],%r173172172_gfortran_runtime_error114: add.u64 %r135,%r149,48; add.u64 %r137,%frame,120; add.u64 %r138,%r149,56; add.u64 %r176,%r149,24; add.u64 %r178,%r23,%r23; add.u64 %r179,%r178,%r23; shl.b64 %r180,%r179,3; add.u64 %r144,%r176,%r180; .loc 1 437 12 mov.u64 %r297,0; $L119: .loc 1 434 19 ld.u64 %r182,[%r138]; add.u64 %r181,%r182,1; ld.u64 %r183,[%r135]; sub.u64 %r36,%r181,%r183; .loc 1 436 10 setp.le.s64 %r184,%r36,0; @ %r184 bra $L117; .loc 1 434 17 st.u64 [%r137],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r137],%r297; $L118: .loc 1 432 3 add.u64 %r135,%r135,24; add.u64 %r137,%r137,8; add.u64 %r138,%r138,24; setp.ne.u64 %r186,%r135,%r144; @ %r186 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r96,[%r148]; .loc 1 449 6 setp.eq.u64 %r187,%r96,0; @ ! 
%r187 bra $L122; bra $L121; $L115: add.u64 %r189,%r23,%r23; add.u64 %r190,%r189,%r23; shl.b64 %r191,%r190,3; add.u64 %r192,%r191,48; add.u64 %r57,%r149,%r192; add.u64 %r290,%frame,120; shl.b64 %r194,%r23,3; add.u64 %r195,%r194,-8; add.u64 %r67,%r290,%r195; add.u64 %r196,%r149,48; cvt.u32.u32 %r198,%r24; cvt.s64.s8 %r197,%r198; add.u64 %r200,%r197,%r197; add.u64 %r201,%r200,%r197; shl.b64 %r202,%r201,3; add.u64 %r131,%r196,%r202; .loc 1 446 12 mov.u64 %r296,0; $L125: .loc 1 443 2 ld.u64 %r204,[%r57+8]; add.u64 %r203,%r204,1; ld.u64 %r205,[%r57]; sub.u64 %r41,%r203,%r205; .loc 1 445 10 setp.le.s64 %r206,%r41,0; @ %r206 bra $L123; .loc 1 442 17 st.u64 [%r67],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r67],%r296; $L124: .loc 1 440 3 add.u64 %r57,%r57,24; add.u64 %r67,%r67,8; setp.ne.u64 %r208,%r57,%r131; @ %r208 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r209,%r291,0; @ %r209 bra $L127; add.u64 %r290,%frame,120; $L147: add.u64 %r102,%r148,40; mov.u64 %r84,%r290; cvt.u32.u32 %r211,%r24; cvt.s64.s8 %r210,%r211; add.u64 %r213,%r210,%r210; add.u64 %r214,%r213,%r210; shl.b64 %r215,%r214,3; add.u64 %r216,%r148,16; add.u64 %r60,%r215,%r216; .loc 1 456 10 mov.u64 %r82,1; .loc 1 460 4 mov.u64 %r235,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r217,0; st.u64 [%r148+8],%r217; .loc 1 465 28 cvt.u16.u32 %r220,%r24; add.u16 %r219,%r220,-1; cvt.u32.u16 %r221,%r219; st.u8 [%r148+28],%r221; .loc 1 467 20 add.u32 %r222,%r24,-2; cvt.s64.s32 %r50,%r222; add.u64 %r224,%r50,%r50; add.u64 %r225,%r224,%r50; shl.b64 %r226,%r225,3; add.u64 %r227,%r148,%r226; .loc 1 467 67 shl.b64 %r229,%r50,3; add.u64 %r230,%frame,%r229; .loc 1 467 59 ld.u64 %r232,[%r227+40]; ld.u64 %r233,[%r230+120]; mul.lo.u64 %r93,%r232,%r233; .loc 1 469 10 setp.eq.u64 %r234,%r93,0; @ %r234 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r82,%r46,%r82; $L128: .loc 1 460 4 st.u64 [%r102+8],%r235; ld.u64 %r46,[%r84]; add.u64 %r236,%r46,-1; st.u64 [%r102+16],%r236; st.u64 [%r102],%r82; .loc 1 453 7 add.u64 %r102,%r102,24;131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r148+48],%r93; mov.u64 %r239,-1; st.u64 [%r148+56],%r239; mov.u64 %r240,1; st.u64 [%r148+40],%r240; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r242(%value_in),_gfortrani_xmallocarray3,[%value_in]; } mov.u64 %r96,%r243; .loc 1 476 22 st.u64 [%r148],%r96; $L135: .loc 1 502 3 setp.ne.u64 %r245,%r291,0; @ %r245 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r148+28]; .loc 1 480 10 setp.eq.u64 %r246,%r55,%r291; @ %r246 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r291; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r249,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r250,[%r249+36]; setp.eq.u32 %r251,%r250,0; @ %r251 bra $L135; .loc 1 488 4 setp.eq.u64 %r252,%r291,0; @ %r252 bra $L133; add.u64 %r31,%r148,48; add.u64 %r125,%frame,120; cvt.u32.u32 %r254,%r24; cvt.s64.s8 %r253,%r254; add.u64 %r92,%r253,-1; .loc 1 488 10 mov.u64 %r101,0; $L137: .loc 1 492 21 ld.u64 %r256,[%r31+8]; add.u64 %r255,%r256,1; .loc 1 492 19 ld.u64 %r257,[%r31]; sub.u64 %r91,%r255,%r257; .loc 1 493 18 ld.u64 %r64,[%r125]; .loc 1 488 25 add.u64 %r101,%r101,1; .loc 1 493 11 setp.eq.u64 %r258,%r64,%r91; @ %r258 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r64; st.u64 [%stack+8],%r91; st.u64 [%stack],%r101; cvta.const.u64 %r22_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r31,%r31,24; add.u64 %r125,%r125,8; setp.ne.u64 %r261,%r92,%r101; @ %r261 bra $L137; $L132: .loc 1 504 16 cvt.u16.u32 %r263,%r24; setp.le.s16 %r264,%r263,1; @ %r264 bra $L138; shl.b64 
%r262,%r291,3; bra $L139; $L138: mov.u64 %r262,8; $L139: add.u64 %r266,%frame,240; mov.u32 %r27026272,[%value_in]; } add.u64 %r111,%r148,40; mov.u64 %r109,%frame; mov.u64 %r98,0; $L140: .loc 1 505 18 ld.u64 %r274,[%r111]; st.u64 [%r109],%r274; .loc 1 502 26 add.u64 %r98,%r98,1; .loc 1 502 3 add.u64 %r111,%r111,24; add.u64 %r109,%r109,8; setp.gt.s64 %r275,%r291,%r98; @ %r275 bra $L140; $L133: ld.u64 %r103,[%frame+240]; .loc 1 514 22 ld.u64 %r69,[%frame]; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r81,%r59,%r69; setp.le.s64 %r289,%r291,1; .loc 1 512 13 mov.u32 %r276,1; cvt.u32.u32 %r293,%r24; cvt.s64.s8 %r294,%r293; add.u64 %r295,%r294,-1; $L146: st.u8 [%r96],%r276; .loc 1 513 15 add.u64 %r103,%r103,1; .loc 1 514 12 add.u64 %r96,%r96,%r69; .loc 1 516 13 setp.ne.u64 %r277,%r59,%r103; @ %r277 bra $L146; .loc 1 525 7 @ ! %r289 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r97],%r292; .loc 1 523 23 mul.lo.u64 %r71,%r74,%r73; .loc 1 524 5 add.u64 %r95,%r95,1; .loc 1 525 7 add.u64 %r97,%r97,8; add.u64 %r119,%r119,8; add.u64 %r118,%r118,8; setp.ne.u64 %r280,%r95,%r295; @ %r280 bra $L144; bra $L109; $L171: add.u64 %r97,%frame,248; add.u64 %r119,%frame,8; add.u64 %r118,%frame,128; .loc 1 523 23 mov.u64 %r71,%r81; .loc 1 524 5 mov.u64 %r95,1; .loc 1 520 13 mov.u64 %r292,0; $L144: .loc 1 529 16 ld.u64 %r285,[%r97]; add.u64 %r73,%r285,1; st.u64 [%r97],%r73; .loc 1 530 23 ld.u64 %r74,[%r119]; .loc 1 530 13 sub.u64 %r286,%r74,%r71; add.u64 %r96,%r96,%r286; .loc 1 516 32 ld.u64 %r77,[%r118]; .loc 1 516 13 setp.eq.u64 %r287,%r73,%r77; @ %r287 bra $L145; .loc 1 520 13 mov.u64 %r103,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r96,[%r148]; .loc 1 449 6 setp.eq.u64 %r288,%r96,0; @ ! %r288 bra $L122; bra $L147; $L109: .loc 1 534 1 ret; } product_i2.o/_gfortran_product_i2 .visible .func _gfortran_product_fortran/generated/product_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mproduct_i2 .visible .func _gfortran_mproduct_i2gfortran_sproduct_i2 .visible .func _gfortran_sproduct_i2VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,108] = 
{80,82,79,68,85,67,84114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_product_i2 .visible .func _gfortran_product_480predu64u64 %r229; .reg .u64 %r230; .reg .predpred %r255; .reg .predu32u16u64predu64pred %r322; .reg .predpred %r330; .reg .u16 %r331; .reg .predpred %r357; .reg .u32 %r359; .reg .u64u64 %r366; .reg .u64 %r367; mov.u64 %r182,%ar0; mov.u64 %r183,%ar1; mov.u64 %r184,%ar2; .loc 1 55 10 ld.s8 %r22,[%r183+28]; .loc 1 55 38 add.u32 %r185,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r107,%r185; .loc 1 56 10 ld.u64 %r25,[%r184]; .loc 1 56 7 add.u64 %r108,%r25,-1; .loc 1 58 7 shr.u64 %r187,%r108,63; set.u32.lt.s64 %r189,%r107,%r108; neg.s32 %r190,%r189; cvt.u16.u64 %r193,%r187; cvt.u16.u32 %r194,%r190; or.b16 %r192,%r193,%r194; .loc 1 58 6 cvt.u32.u16 %r195,%r192; cvt.u16.u8 %r196,%r195; setp.eq.u16 %r197,%r196,0; @ %r197 bra $L2; .loc 1 60 7 cvt.u32.u32 %r200,%r22; cvt.s64.s8 %r199,%r200; st.u64 [%stack+8],%r199_gfortran_runtime_errorr203,%r108,%r108; add.u64 %r204,%r203,%r108; shl.b64 %r205,%r204,3; add.u64 %r206,%r183,%r205; ld.u64 %r31,[%r206+56]; ld.u64 %r33,[%r206+48]; .loc 1 68 9 ld.u64 %r111,[%r206+40]; .loc 1 70 3 setp.ne.u64 %r220,%r108,0; @ %r220 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r221,%r107,%r108; @ %r221 bra $L4; bra $L46; $L3: add.u64 %r97,%r183,40; add.u64 %r61,%frame,120; add.u64 %r59,%frame,240; add.u64 %r222,%r183,16; add.u64 %r224,%r25,%r25; add.u64 %r225,%r224,%r25; shl.b64 %r226,%r225,3; add.u64 %r180,%r222,%r226; .loc 1 76 12 mov.u64 %r367,0; $L8: .loc 1 72 18 ld.u64 %r227,[%r97]; st.u64 [%r61],%r227; .loc 1 73 19 ld.u64 %r229,[%r97+16]; add.u64 %r228,%r229,1; ld.u64 %r230,[%r97+8]; sub.u64 %r38,%r228,%r230; .loc 1 75 10 setp.lt.s64 %r231,%r38,0; @ %r231 bra $L6; .loc 1 73 17 st.u64 [%r59],%r38; bra $L7; $L6: .loc 1 76 12 st.u64 [%r59],%r367; $L7: .loc 1 70 3 add.u64 %r97,%r97,24; add.u64 %r61,%r61,8; add.u64 %r59,%r59,8; setp.ne.u64 %r233,%r97,%r180; @ %r233 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r234,[%r182]; setp.eq.u64 %r235,%r234,0; @ ! 
%r235 bra $L11; bra $L10; $L4: add.u64 %r237,%r25,%r25; add.u64 %r238,%r237,%r25; shl.b64 %r239,%r238,3; add.u64 %r240,%r239,40; add.u64 %r150,%r183,%r240; shl.b64 %r241,%r25,3; add.u64 %r133,%r241,-8; add.u64 %r355,%frame,120; add.u64 %r138,%r355,%r133; add.u64 %r354,%frame,240; add.u64 %r130,%r354,%r133; cvt.u32.u32 %r245,%r22; cvt.s64.s8 %r244,%r245; add.u64 %r247,%r244,%r244; add.u64 %r248,%r247,%r244; shl.b64 %r249,%r248,3; add.u64 %r250,%r183,40; add.u64 %r100,%r249,%r250; .loc 1 84 12 mov.u64 %r366,0; $L14: .loc 1 80 18 ld.u64 %r251,[%r150]; st.u64 [%r138],%r251; .loc 1 81 19 ld.u64 %r253,[%r150+16]; add.u64 %r252,%r253,1; ld.u64 %r254,[%r150+8]; sub.u64 %r44,%r252,%r254; .loc 1 83 10 setp.lt.s64 %r255,%r44,0; @ %r255 bra $L12; .loc 1 81 17 st.u64 [%r130],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r130],%r366; $L13: .loc 1 78 3 add.u64 %r150,%r150,24; add.u64 %r138,%r138,8; add.u64 %r130,%r130,8; setp.ne.u64 %r257,%r100,%r150; @ %r257 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r258,%r107,0; @ %r258 bra $L16; add.u64 %r354,%frame,240; $L34: add.u64 %r159,%r182,40; mov.u64 %r157,%r354; add.u64 %r259,%r182,16; cvt.u32.u32 %r261,%r22; cvt.s64.s8 %r260,%r261; add.u64 %r263,%r260,%r260; add.u64 %r264,%r263,%r260; shl.b64 %r265,%r264,3; add.u64 %r151,%r259,%r265; .loc 1 94 10 mov.u64 %r101,1; .loc 1 98 4 mov.u64 %r288,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r266,0; st.u64 [%r182+8],%r266; .loc 1 103 28 cvt.u16.u32 %r269,%r22; add.u16 %r268,%r269,-1; cvt.u32.u16 %r270,%r268; st.u8 [%r182+28],%r270; .loc 1 105 20 add.u32 %r271,%r22,-2; cvt.s64.s32 %r52,%r271; add.u64 %r273,%r52,%r52; add.u64 %r274,%r273,%r52; shl.b64 %r275,%r274,3; add.u64 %r276,%r182,%r275; .loc 1 105 67 shl.b64 %r278,%r52,3; add.u64 %r279,%frame,%r278; .loc 1 105 59 ld.u64 %r281,[%r276+40]; ld.u64 %r282,[%r279+240]; mul.lo.u64 %r113,%r281,%r282; .loc 1 107 29 mov.u64 %r284,11(%value_in),_gfortrani_xmallocarray285,[%value_in]; } .loc 1 107 27 st.u64 [%r182],%r285; .loc 1 108 10 setp.eq.u64 %r287,%r113,0; @ ! 
%r287 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r101,%r48,%r101; $L17: .loc 1 98 4 st.u64 [%r159+8],%r288; ld.u64 %r48,[%r157]; add.u64 %r289,%r48,-1; st.u64 [%r159+16],%r289; st.u64 [%r159],%r101; .loc 1 91 7 add.u64 %r159,%r159,24; add.u64 %r157,%r157,8; setp.ne.u64 %r290,%r151,%r159; @ %r290 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r182+48],%r113; mov.u64 %r292,-1; st.u64 [%r182+56],%r292; mov.u64 %r293,1; st.u64 [%r182+40],%r293; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r182+28]; .loc 1 118 10 setp.eq.u64 %r294,%r58,%r107; @ %r294 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r107; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r297,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r298,[%r297+36]; setp.eq.u32 %r299,%r298,0; @ %r299 bra $L21; .loc 1 125 2 add.u64 %r354,%frame,240; cvta.const.u64 %r303,$LC2r354302303; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r305,%r107,0; @ %r305 bra $L24; $L27: .loc 1 65 9 add.u64 %r306,%r31,1; .loc 1 65 7 sub.u64 %r109,%r306,%r33; max.s64 %r45,%r109,0; .loc 1 137 8 ld.u64 %r121,[%r183]; .loc 1 138 8 ld.u64 %r122,[%r182]; .loc 1 154 36 add.u64 %r69,%r111,%r111; ld.u64 %r142,[%frame+360]; .loc 1 166 22 ld.u64 %r71,[%frame+120]; .loc 1 166 12 add.u64 %r72,%r71,%r71; .loc 1 167 22 ld.u64 %r74,[%frame]; .loc 1 167 12 add.u64 %r75,%r74,%r74; .loc 1 169 32 ld.u64 %r129,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r310,%r71,%r129; .loc 1 176 9 add.u64 %r311,%r310,%r310; neg.s64 %r132,%r311; .loc 1 177 23 mul.lo.u64 %r312,%r74,%r129; .loc 1 177 9 add.u64 %r313,%r312,%r312; neg.s64 %r135,%r313; setp.gt.s64 %r356,%r109,0; setp.le.s64 %r357,%r107,1; cvt.u32.u32 %r359,%r22; cvt.s64.s8 %r360,%r359; add.u64 %r361,%r360,-1; add.u64 %r362,%frame,120; add.u64 %r363,%frame,240; .loc 1 173 13 mov.u64 %r364,0; bra $L25; $L24: add.u64 %r172,%frame,360; add.u64 %r171,%r182,40; mov.u64 %r169,%frame; cvt.u32.u32 %r315,%r22; cvt.s64.s8 %r314,%r315; add.u64 %r160,%r314,-1; .loc 1 129 3 mov.u64 %r123,0; add.u64 %r354,%frame,240; .loc 1 131 16 mov.u64 %r316,%r123; $L26: st.u64 [%r172],%r316; .loc 1 132 18 ld.u64 %r317,[%r171]; st.u64 [%r169],%r317; .loc 1 133 17 shl.b64 %r319,%r123,3; add.u64 %r320,%r354,%r319; .loc 1 133 10 ld.u64 %r321,[%r320]; setp.le.s64 %r322,%r321,0; @ %r322 bra $L1; .loc 1 129 26 add.u64 %r123,%r123,1; .loc 1 129 3 add.u64 %r172,%r172,8; add.u64 %r171,%r171,24; add.u64 %r169,%r169,8; setp.ne.u64 %r323,%r123,%r160; @ %r323 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r356 bra $L35; .loc 1 150 10 mov.u16 %r325,1; st.u16 [%r122],%r325; bra $L29; $L35: mov.u64 %r116,%r121; .loc 1 148 10 mov.u32 %r114,1; .loc 1 154 13 mov.u64 %r115,0; $L28: .loc 1 158 10 ld.u16 %r327,[%r116]; cvt.u16.u32 %r328,%r114; mul.lo.u16 %r326,%r327,%r328; cvt.s32.s16 %r114,%r326; .loc 1 154 28 add.u64 %r115,%r115,1; .loc 1 154 36 add.u64 %r116,%r116,%r69; .loc 1 154 6 setp.gt.s64 %r330,%r45,%r115; @ %r330 bra $L28; .loc 1 161 12 cvt.u16.u32 %r331,%r114; st.u16 [%r122],%r331; $L29: .loc 1 165 15 add.u64 %r142,%r142,1; .loc 1 166 12 add.u64 %r121,%r121,%r72; .loc 1 167 12 add.u64 %r122,%r122,%r75; .loc 1 169 13 setp.ne.u64 %r332,%r142,%r129; @ %r332 bra $L25; .loc 1 176 9 add.u64 %r118,%r121,%r132; .loc 1 177 9 add.u64 %r119,%r122,%r135; .loc 1 179 7 @ %r357 bra $L1; add.u64 %r128,%frame,368; mov.u64 %r144,8; .loc 1 178 5 mov.u64 %r120,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r128],%r364; .loc 1 176 23 mul.lo.u64 %r338,%r88,%r87; .loc 1 176 9 add.u64 %r339,%r338,%r338; sub.u64 %r118,%r121,%r339; .loc 1 177 23 
[PTX assembly, product_i2.o continued: the tail of the _gfortran_product_i2 reduction loops (.loc references around product_i2.c lines 169-189), followed by the bodies of _gfortran_mproduct_i2 and _gfortran_sproduct_i2. _gfortran_mproduct_i2 (masked product; .loc references product_i2.c lines 223-385) performs the DIM/rank argument checks that call _gfortran_runtime_error, the optional bounds checks via _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents when _gfortrani_compile_options enables them, allocates the return array with _gfortrani_xmallocarray, and runs the masked multiply loop (ld.s8 of the mask byte, conditional mul.lo.u16). _gfortran_sproduct_i2 (scalar-mask variant; .loc references product_i2.c lines 412-534) dispatches to the unmasked _gfortran_product_ entry when the mask pointer is null or points at a true value, and otherwise fills the result array with the multiplicative identity 1. The register declarations, call targets and labels in this span are fused together by the binary extraction and are not valid PTX as printed.]
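For readers skimming the dump, the masked reduction noted in the placeholder above (the ld.s8 mask test followed by a conditional mul.lo.u16, .loc references around product_i2.c lines 348-355) amounts to a product over only those elements whose mask byte is non-zero. A minimal C sketch of that single step follows; the flattened pointer-plus-stride signature and the names masked_product_i2, sstride and mstride are illustrative, not the library's actual descriptor-based interface.

#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

/* Masked product of one "column": multiply only the elements whose
   corresponding mask byte is non-zero, stepping the source and mask
   arrays by their own strides. */
static int16_t masked_product_i2(const int16_t *src, ptrdiff_t sstride,
                                 const int8_t *mask, ptrdiff_t mstride,
                                 ptrdiff_t len)
{
    int16_t result = 1;                               /* multiplicative identity */
    for (ptrdiff_t n = 0; n < len; n++, src += sstride, mask += mstride)
        if (*mask)                                    /* only true elements contribute */
            result *= *src;
    return result;
}

int main(void)
{
    int16_t a[5] = {2, 3, 4, 5, 6};
    int8_t  m[5] = {1, 0, 1, 1, 0};
    printf("%d\n", masked_product_i2(a, 1, m, 1, 5)); /* prints 40 (2*4*5) */
    return 0;
}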
[_gfortran_sproduct_i2 returns (.loc 1 534 1 ret) and the product_i2.o member ends. Next archive member: product_i4.o (generated from fortran/generated/product_i4.c). Its header declares the visible functions _gfortran_product_i4, _gfortran_mproduct_i4 and _gfortran_sproduct_i4, the extern global _gfortrani_compile_options, and the extern functions _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents. The constant byte arrays in this region decode to the libgfortran runtime error strings ("... incorrect in PRODUCT intrinsic: is %ld, should be between 1 and %...", "...ank of return array incorrect in PRODUCT intrinsic: is %ld, should be %...", "PRODUCT", "return valu...", "Funny sized logical array", "MASK argume...", "Incorrect extent in return value of PRODUCT intrinsic in dimension %ld: is %ld, should be %..."). The PTX bodies of the three _i4 functions follow; they mirror the _i2 variants but use 4-byte element strides (shl.b64 ...,2) and 32-bit accumulation (mul.lo.u32) in the reduction loops ($L28/$L29 on the unmasked path, $L80 on the masked path). As in the previous member, register declarations, string literals and call sites in this span are garbled by the binary extraction.]
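The unmasked reduction in _gfortran_product_i4 is the tight $L28/$L29 loop mentioned in the placeholder above (ld.u32, mul.lo.u32, pointer bump by the element stride; .loc references product_i4.c lines 154-161). As a rough, self-contained C equivalent of one such column reduction, with illustrative names and a flattened pointer-plus-stride view rather than the real array descriptors:

#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

/* Product of `len` 32-bit elements spaced `delta` apart: the reduction one
   iteration of the outer loops performs before storing the result and
   advancing the base and destination pointers. */
static int32_t product_column_i4(const int32_t *src, ptrdiff_t delta, ptrdiff_t len)
{
    int32_t result = 1;
    for (ptrdiff_t n = 0; n < len; n++, src += delta)
        result *= *src;               /* the mul.lo.u32 in the $L28 loop */
    return result;
}

int main(void)
{
    int32_t col[4] = {1, 2, 3, 4};
    printf("%d\n", (int)product_column_i4(col, 1, 4));   /* prints 24 */
    return 0;
}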
[_gfortran_sproduct_i4 returns (.loc 1 534 1 ret) and product_i4.o ends. Next archive member: product_i8.o (generated from fortran/generated/product_i8.c), with the same layout: visible functions _gfortran_product_i8, _gfortran_mproduct_i8 and _gfortran_sproduct_i8, the same extern declarations (_gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents), the same PRODUCT error strings, and PTX bodies that use 8-byte element strides (shl.b64 ...,3) and 64-bit accumulation (mul.lo.u64). The dump continues inside the _gfortran_sproduct_i8 body; as in the earlier members, the register declarations and call targets shown here are mangled by the extraction.]
.loc 1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r294,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r294,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,3; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r292,%r294,1; .loc 1 512 13 mov.u64 %r278,1; cvt.u32.u32 %r296,%r24; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; $L146: st.u64 [%r98],%r278; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r279,%r59,%r80; @ %r279 bra $L146; .loc 1 525 7 @ ! %r292 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r295; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r282,%r97,%r298; @ %r282 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; .loc 1 520 13 mov.u64 %r295,0; $L144: .loc 1 529 16 ld.u64 %r287,[%r99]; add.u64 %r74,%r287,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r288,%r75,%r72; shl.b64 %r289,%r288,3; add.u64 %r98,%r98,%r289; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r290,%r74,%r77; @ %r290 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r291,%r98,0; @ ! 
%r291 bra $L122; bra $L147; $L109: .loc 1 534 1 ret; } product_i16.o/_gfortran_product_i16 .visible .func _gfortran_product_i16fortran/generated/product_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mproduct_i16 .visible .func _gfortran_mproduct_i16gfortran_sproduct_i16 .visible .func _gfortran_sproduct_i16VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_retur_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,108] = {80,82,79,68,85,67,84114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_product_i16 .visible .func _gfortran_product_i168u64 %r52; .reg .u64 %r58; .reg .u64 %r61; .reg .u649pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64u64pred %r317; .reg .u64pred %r332; .reg .predmov.u64 %r179,%ar0; mov.u64 %r180,%ar1; mov.u64 %r181,%ar2; .loc 1 55 10 ld.s8 %r22,[%r180+28]; .loc 1 55 38 add.u32 %r182,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r182; .loc 1 56 10 ld.u64 %r25,[%r181]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r184,%r104,63; set.u32.lt.s64 %r186,%r103,%r104; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 58 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 60 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r104,%r104; add.u64 %r201,%r200,%r104; shl.b64 %r202,%r201,3; add.u64 %r203,%r180,%r202; ld.u64 %r30,[%r203+56]; ld.u64 %r32,[%r203+48]; .loc 1 68 9 ld.u64 %r107,[%r203+40]; .loc 1 70 3 setp.ne.u64 %r217,%r104,0; @ %r217 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r218,%r103,%r104; @ %r218 bra $L4; bra $L46; $L3: add.u64 %r61,%r180,40; add.u64 %r40,%frame,120; add.u64 %r91,%frame,240; add.u64 %r219,%r180,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r174,%r219,%r223; .loc 1 76 12 mov.u64 %r378,0; $L8: .loc 1 72 18 ld.u64 %r224,[%r61]; st.u64 [%r40],%r224; .loc 1 73 19 ld.u64 %r226,[%r61+16]; add.u64 %r225,%r226,1; ld.u64 %r227,[%r61+8]; sub.u64 %r37,%r225,%r227; .loc 1 75 10 setp.lt.s64 %r228,%r37,0; @ %r228 bra $L6; .loc 1 73 
17 st.u64 [%r91],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r91],%r378; $L7: .loc 1 70 3 add.u64 %r61,%r61,24; add.u64 %r40,%r40,8; add.u64 %r91,%r91,8; setp.ne.u64 %r230,%r61,%r174; @ %r230 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r231,[%r179]; setp.eq.u64 %r232,%r231,0; @ ! %r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r143,%r180,%r237; shl.b64 %r238,%r25,3; add.u64 %r125,%r238,-8; add.u64 %r361,%frame,120; add.u64 %r128,%r361,%r125; add.u64 %r366,%frame,240; add.u64 %r121,%r366,%r125; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r180,40; add.u64 %r95,%r246,%r247; .loc 1 84 12 mov.u64 %r377,0; $L14: .loc 1 80 18 ld.u64 %r248,[%r143]; st.u64 [%r128],%r248; .loc 1 81 19 ld.u64 %r250,[%r143+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r143+8]; sub.u64 %r44,%r249,%r251; .loc 1 83 10 setp.lt.s64 %r252,%r44,0; @ %r252 bra $L12; .loc 1 81 17 st.u64 [%r121],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r121],%r377; $L13: .loc 1 78 3 add.u64 %r143,%r143,24; add.u64 %r128,%r128,8; add.u64 %r121,%r121,8; setp.ne.u64 %r254,%r95,%r143; @ %r254 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r255,%r103,0; @ %r255 bra $L16; add.u64 %r366,%frame,240; $L34: add.u64 %r152,%r179,40; mov.u64 %r150,%r366; add.u64 %r256,%r179,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r144,%r256,%r262; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r263,0; st.u64 [%r179+8],%r263; .loc 1 103 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r179+28],%r267; .loc 1 105 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r52,%r268; add.u64 %r270,%r52,%r52; add.u64 %r271,%r270,%r52; shl.b64 %r272,%r271,3; add.u64 %r273,%r179,%r272; .loc 1 105 67 shl.b64 %r275,%r52,3; add.u64 %r276,%frame,%r275; .loc 1 105 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r109,%r278,%r279; .loc 1 107 29 mov.u64 %r281,16281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 107 27 st.u64 [%r179],%r282; .loc 1 108 10 setp.eq.u64 %r284,%r109,0; @ ! 
%r284 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r48,%r99; $L17: .loc 1 98 4 st.u64 [%r152+8],%r285; ld.u64 %r48,[%r150]; add.u64 %r286,%r48,-1; st.u64 [%r152+16],%r286; st.u64 [%r152],%r99; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r287,%r144,%r152; @ %r287 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r179+48],%r109; mov.u64 %r289,-1; st.u64 [%r179+56],%r289; mov.u64 %r290,1; st.u64 [%r179+40],%r290; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r179+28]; .loc 1 118 10 setp.eq.u64 %r291,%r58,%r103; @ %r291 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r294,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r295,[%r294+36]; setp.eq.u32 %r296,%r295,0; @ %r296 bra $L21; .loc 1 125 2 add.u64 %r366,%frame,240; cvta.const.u64 %r300,$LC2r366299300; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r302,%r103,0; @ %r302 bra $L24; $L27: .loc 1 65 9 add.u64 %r303,%r30,1; .loc 1 65 7 sub.u64 %r105,%r303,%r32; max.s64 %r45,%r105,0; .loc 1 137 8 ld.u64 %r116,[%r180]; .loc 1 138 8 ld.u64 %r117,[%r179]; .loc 1 154 36 shl.b64 %r66,%r107,4; ld.u64 %r138,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 shl.b64 %r69,%r68,4; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 shl.b64 %r72,%r71,4; .loc 1 169 32 ld.u64 %r123,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r304,%r68,%r123; .loc 1 176 9 shl.b64 %r305,%r304,4; neg.s64 %r127,%r305; .loc 1 177 23 mul.lo.u64 %r306,%r71,%r123; .loc 1 177 9 shl.b64 %r307,%r306,4; neg.s64 %r130,%r307; setp.gt.s64 %r363,%r105,0; setp.le.s64 %r364,%r103,1; add.u64 %r369,%frame,480; add.u64 %r370,%frame,496; add.u64 %r371,%frame,512; cvt.u32.u32 %r372,%r22; cvt.s64.s8 %r373,%r372; add.u64 %r374,%r373,-1; add.u64 %r375,%frame,120; add.u64 %r376,%frame,240; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r179,40; mov.u64 %r162,%frame; cvt.u32.u32 %r309,%r22; cvt.s64.s8 %r308,%r309; add.u64 %r153,%r308,-1; .loc 1 129 3 mov.u64 %r118,0; add.u64 %r366,%frame,240; .loc 1 131 16 mov.u64 %r310,%r118; $L26: st.u64 [%r165],%r310; .loc 1 132 18 ld.u64 %r311,[%r164]; st.u64 [%r162],%r311; .loc 1 133 17 shl.b64 %r313,%r118,3; add.u64 %r314,%r366,%r313; .loc 1 133 10 ld.u64 %r315,[%r314]; setp.le.s64 %r316,%r315,0; @ %r316 bra $L1; .loc 1 129 26 add.u64 %r118,%r118,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r317,%r118,%r153; @ %r317 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r363 bra $L35; .loc 1 150 10 mov.u64 %r319,1; st.u64 [%r117],%r319; mov.u64 %r320,0; st.u64 [%r117+8],%r320; bra $L29; $L35: mov.u64 %r112,%r116; .loc 1 148 10 mov.u64 %r357,1; mov.u64 %r358,0; .loc 1 154 13 mov.u64 %r111,%r358; $L28: .loc 1 158 10 st.u64 [%frame+496],%r357; st.u64 [%frame+504],%r358; ld.u64 %r327,[%r112]; st.u64 [%frame+512],%r327; ld.u64 %r328,[%r112+8]; st.u64 [%frame+520],%r328480]; ld.u64 %r358,[%frame+488]; .loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r66; .loc 1 154 6 setp.gt.s64 %r332,%r45,%r111; @ %r332 bra $L28; .loc 1 161 12 st.u64 [%r117],%r357; st.u64 [%r117+8],%r358; $L29: .loc 1 165 15 add.u64 %r138,%r138,1; .loc 1 166 12 add.u64 %r116,%r116,%r69; .loc 1 167 12 add.u64 %r117,%r117,%r72; .loc 1 169 13 setp.ne.u64 %r335,%r138,%r123; @ %r335 bra $L25; .loc 1 176 9 add.u64 %r113,%r116,%r127; .loc 1 177 9 add.u64 %r114,%r117,%r130; .loc 1 179 7 @ %r364 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r115,1; .loc 
1 173 13 mov.u64 %r368,0; bra $L32; $L33: st.u64 [%r122],%r368; .loc 1 176 23 mul.lo.u64 %r341,%r85,%r84; .loc 1 176 9 shl.b64 %r342,%r341,4; sub.u64 %r113,%r116,%r342; .loc 1 177 23 mul.lo.u64 %r343,%r88,%r84; .loc 1 177 9 shl.b64 %r344,%r343,4; sub.u64 %r114,%r117,%r344; .loc 1 178 5 add.u64 %r115,%r115,1; .loc 1 179 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r345,%r115,%r374; @ %r345 bra $L1; $L32: .loc 1 187 16 ld.u64 %r346,[%r122]; add.u64 %r84,%r346,1; st.u64 [%r122],%r84; .loc 1 188 23 add.u64 %r348,%r375,%r140; ld.u64 %r85,[%r348]; .loc 1 188 13 shl.b64 %r349,%r85,4; add.u64 %r116,%r113,%r349; .loc 1 189 23 add.u64 %r350,%frame,%r140; ld.u64 %r88,[%r350]; .loc 1 189 13 shl.b64 %r351,%r88,4; add.u64 %r117,%r114,%r351; .loc 1 169 32 add.u64 %r353,%r376,%r140; ld.u64 %r92,[%r353]; .loc 1 169 13 setp.eq.u64 %r354,%r84,%r92; @ %r354 bra $L33; .loc 1 173 13 mov.u64 %r138,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r355,[%r179]; setp.eq.u64 %r356,%r355,0; @ ! %r356 bra $L11; bra $L34; $L1: .loc 1 19_gfortran_mproduct_i16 .visible .func _gfortran_mproduct_i16656u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .predpredpred %r411; .reg .predpred %r420; .reg .u64u64 %r432; .reg .u64 %r434; .reg .pred %r435; .reg .u64 %r436; .reg .predpred.reg .u32 %r452; .reg .u64 %r453; .reg .u64 %r454; .reg .u64mov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 223 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $L49gfortran_product_i16.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r211]; .loc 1 233 7 add.u64 %r137,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r210+28]; .loc 1 234 38 add.u32 %r217,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r138,%r217; .loc 1 237 7 shr.u64 %r219,%r137,63; set.u32.gt.s64 %r221,%r137,%r138; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 237 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L51; .loc 1 239 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r443,%r137,%r137; add.u64 %r446,%r443,%r137; shl.b64 %r237,%r446,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 244 7 ld.u64 %r248,[%r238+48]; sub.u64 %r139,%r240,%r248; .loc 1 245 6 setp.le.s64 %r249,%r139,0; @ %r249 bra $L48; .loc 1 248 9 ld.u64 %r152,[%r212]; .loc 1 250 15 ld.u64 %r34,[%r212+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 252 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 252 6 cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L53; .loc 1 254 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L54; $L53: .loc 1 261 9 shl.b64 %r270,%r446,3; add.u64 %r271,%r210,%r270; ld.u64 %r142,[%r271+40]; .loc 1 262 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 264 3 setp.ne.u64 %r279,%r137,0; @ %r279 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r280,%r137,%r138; @ %r280 
bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r281281_gfortran_runtime_error5: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; add.u64 %r285,%r22,%r22; add.u64 %r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 264 3 mov.u64 %r92,0; add.u64 %r444,%frame,240; add.u64 %r442,%frame,360; .loc 1 271 12 mov.u64 %r459,%r92; $L60: .loc 1 266 18 add.u64 %r289,%r444,%r92; ld.u64 %r290,[%r40]; st.u64 [%r289],%r290; .loc 1 267 18 add.u64 %r291,%frame,%r92; .loc 1 267 20 ld.u64 %r293,[%r179]; mul.lo.u64 %r292,%r293,%r34; .loc 1 267 18 st.u64 [%r291],%r292; .loc 1 268 19 ld.u64 %r295,[%r40+16]; add.u64 %r294,%r295,1; ld.u64 %r296,[%r40+8]; sub.u64 %r54,%r294,%r296; .loc 1 270 10 setp.lt.s64 %r297,%r54,0; @ %r297 bra $L58; .loc 1 268 17 add.u64 %r299,%r442,%r92; st.u64 [%r299],%r54; bra $L59; $L58: .loc 1 271 12 add.u64 %r301,%r442,%r92; st.u64 [%r301],%r459; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r179,%r179,24; add.u64 %r92,%r92,8; setp.ne.u64 %r303,%r40,%r206; @ %r303 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r304,[%r209]; setp.eq.u64 %r305,%r304,0; @ ! %r305 bra $L63; bra $L62; $L56: add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r127,%r309,40; add.u64 %r133,%r210,%r127; add.u64 %r125,%r212,%r127; shl.b64 %r310,%r22,3; add.u64 %r103,%r310,-8; cvt.u32.u32 %r312,%r23; cvt.s64.s8 %r311,%r312; shl.b64 %r313,%r311,3; add.u64 %r69,%r313,-8; add.u64 %r444,%frame,240; add.u64 %r442,%frame,360; .loc 1 281 12 mov.u64 %r458,0; $L66: .loc 1 276 18 add.u64 %r315,%r444,%r103; ld.u64 %r316,[%r133]; st.u64 [%r315],%r316; .loc 1 277 18 add.u64 %r317,%frame,%r103; .loc 1 277 20 ld.u64 %r319,[%r125]; mul.lo.u64 %r318,%r319,%r34; .loc 1 277 18 st.u64 [%r317],%r318; .loc 1 278 19 ld.u64 %r321,[%r133+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r133+8]; sub.u64 %r66,%r320,%r322; .loc 1 280 10 setp.lt.s64 %r323,%r66,0; @ %r323 bra $L64; .loc 1 278 17 add.u64 %r325,%r442,%r103; st.u64 [%r325],%r66; bra $L65; $L64: .loc 1 281 12 add.u64 %r327,%r442,%r103; st.u64 [%r327],%r458; $L65: .loc 1 274 3 add.u64 %r133,%r133,24; add.u64 %r125,%r125,24; add.u64 %r103,%r103,8; setp.ne.u64 %r329,%r69,%r103; @ %r329 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r330,%r138,0; @ %r330 bra $L68; add.u64 %r442,%frame,360; $L86: add.u64 %r172,%r209,40; mov.u64 %r164,%r442; cvt.u32.u32 %r332,%r23; cvt.s64.s8 %r331,%r332; add.u64 %r334,%r331,%r331; add.u64 %r335,%r334,%r331; shl.b64 %r336,%r335,3; add.u64 %r337,%r209,16; add.u64 %r134,%r336,%r337; .loc 1 291 10 mov.u64 %r129,1; .loc 1 295 4 mov.u64 %r356,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r338,%r23,-2; cvt.s64.s32 %r75,%r338; add.u64 %r340,%r75,%r75; add.u64 %r341,%r340,%r75; shl.b64 %r342,%r341,3; add.u64 %r343,%r209,%r342; .loc 1 299 67 shl.b64 %r345,%r75,3; add.u64 %r346,%frame,%r345; .loc 1 299 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r146,%r348,%r349; .loc 1 301 24 mov.u64 %r350,0; st.u64 [%r209+8],%r350; .loc 1 302 28 cvt.u16.u32 %r353,%r23; add.u16 %r352,%r353,-1; cvt.u32.u16 %r354,%r352; st.u8 [%r209+28],%r354; .loc 1 304 10 setp.eq.u64 %r355,%r146,0; @ %r355 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r129,%r72,%r129; $L69: .loc 1 295 4 st.u64 [%r172+8],%r356; ld.u64 %r72,[%r164]; add.u64 %r357,%r72,-1; st.u64 [%r172+16],%r357; st.u64 [%r172],%r129; .loc 1 288 7 add.u64 %r172,%r172,24; add.u64 %r164,%r164,8; setp.ne.u64 %r358,%r134,%r172; @ %r358 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r209+48],%r146; mov.u64 %r360,-1; st.u64 
[%r209+56],%r360; mov.u64 %r361,1; st.u64 [%r209+40],%r361; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r363,16363; call (%value_in),_gfortrani_xmallocarray64,[%value_in]; } .loc 1 311 22 st.u64 [%r209],%r364; bra $L73; $L63: .loc 1 316 19 ld.s8 %r366,[%r209+28]; .loc 1 316 10 setp.eq.u64 %r367,%r366,%r138; @ %r367 bra $L74; .loc 1 317 2 cvta.const.u64 %r36836_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r370,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L73; .loc 1 321 4 add.u64 %r442,%frame,360; cvta.const.u64 %r376,$LC2r44237376; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r382,%r138,0; @ %r382 bra $L75; $L78: .loc 1 336 8 ld.u64 %r153,[%r209]; .loc 1 337 8 ld.u64 %r151,[%r210]; .loc 1 339 9 setp.ne.u64 %r383,%r151,0; @ %r383 bra $L76; bra $L48; $L75: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r442,%frame,360; .loc 1 330 16 mov.u64 %r386,%r157; $L77: st.u64 [%r190],%r386; .loc 1 331 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 332 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r442,%r389; .loc 1 332 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 shl.b64 %r91,%r142,4; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r94,[%frame+240]; .loc 1 359 12 shl.b64 %r95,%r94,4; .loc 1 360 23 ld.u64 %r97,[%frame]; .loc 1 361 22 ld.u64 %r99,[%frame+120]; .loc 1 361 12 shl.b64 %r100,%r99,4; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r94; .loc 1 371 24 mul.lo.u64 %r394,%r85,%r97; .loc 1 371 10 neg.s64 %r163,%r394; .loc 1 372 23 mul.lo.u64 %r395,%r85,%r99; .loc 1 372 9 shl.b64 %r396,%r395,4; neg.s64 %r165,%r396; setp.le.s64 %r447,%r138,1; .loc 1 353 12 add.u64 %r449,%frame,608; add.u64 %r450,%frame,624; add.u64 %r451,%frame,640; cvt.u32.u32 %r452,%r23; cvt.s64.s8 %r453,%r452; add.u64 %r454,%r453,-1; add.u64 %r455,%frame,240; add.u64 %r456,%frame,360; add.u64 %r457,%frame,120; $L87: .loc 1 328 3 mov.u64 %r156,%r152; mov.u64 %r155,%r151; .loc 1 348 10 mov.u64 %r438,1; mov.u64 %r439,0; .loc 1 349 9 mov.u64 %r154,%r439; $L80: .loc 1 352 6 ld.s8 %r398,[%r156]; cvt.u16.u32 %r397,%r398; setp.eq.u16 %r399,%r397,0; @ %r399 bra $L79; .loc 1 353 12 st.u64 [%frame+624],%r438; st.u64 [%frame+632],%r439; ld.u64 %r406,[%r155]; st.u64 [%frame+640],%r406; ld.u64 %r407,[%r155+8]; st.u64 [%frame+648],%r40739,[%frame+616]; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r91; .loc 1 349 47 add.u64 %r156,%r156,%r45; .loc 1 349 2 setp.ne.u64 %r411,%r139,%r154; @ %r411 bra $L80; .loc 1 355 8 st.u64 [%r153],%r438; st.u64 [%r153+8],%r439; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r151,%r151,%r95; .loc 1 360 13 add.u64 %r152,%r152,%r97; .loc 1 361 12 add.u64 %r153,%r153,%r100; .loc 1 363 13 setp.ne.u64 %r414,%r85,%r39; @ %r414 bra $L87; .loc 1 371 10 add.u64 %r148,%r152,%r163; .loc 1 372 9 add.u64 %r149,%r153,%r165; .loc 1 374 7 @ ! 
%r447 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r448; .loc 1 370 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 371 24 mul.lo.u64 %r417,%r116,%r113; .loc 1 371 10 sub.u64 %r148,%r152,%r417; .loc 1 372 23 mul.lo.u64 %r418,%r118,%r113; .loc 1 372 9 shl.b64 %r419,%r418,4; sub.u64 %r149,%r153,%r419; .loc 1 373 5 add.u64 %r150,%r150,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r166,%r166,8; setp.ne.u64 %r420,%r150,%r454; @ %r420 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 370 23 mov.u64 %r102,%r38; .loc 1 374 7 mov.u64 %r166,8; .loc 1 373 5 mov.u64 %r150,1; .loc 1 367 13 mov.u64 %r448,0; $L84: .loc 1 382 16 ld.u64 %r424,[%r41]; add.u64 %r113,%r424,1; st.u64 [%r41],%r113; .loc 1 383 23 add.u64 %r426,%r455,%r166; ld.u64 %r114,[%r426]; .loc 1 383 13 sub.u64 %r427,%r114,%r102; shl.b64 %r428,%r427,4; add.u64 %r151,%r151,%r428; .loc 1 384 24 add.u64 %r429,%frame,%r166; ld.u64 %r116,[%r429]; .loc 1 384 14 add.u64 %r152,%r148,%r116; .loc 1 385 23 add.u64 %r431,%r457,%r166; ld.u64 %r118,[%r431]; .loc 1 385 13 shl.b64 %r432,%r118,4; add.u64 %r153,%r149,%r432; .loc 1 363 32 add.u64 %r434,%r456,%r166; ld.u64 %r122,[%r434]; .loc 1 363 13 setp.eq.u64 %r435,%r113,%r122; @ %r435 bra $L85; .loc 1 367 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r436,[%r209]; setp.eq.u64 %r437,%r436,0; @ ! %r437 bra $L63; bra $L86; $L48:_gfortran_sproduct_i16 .visible .func _gfortran_sproduct_i1668u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64u64 %r279; .reg .pred %r280; .reg .predpred %r292; .reg .predu32u64 %r301; mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_product_i16,(%out_arg1,%out_arg2,%out_arg3); } .loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r295,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r295; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r295; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r301,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r301; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 
setp.eq.u64 %r189,%r98,0; @ ! %r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r294,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r294,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r300,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r300; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r295,0; @ %r211 bra $L127; add.u64 %r294,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r294; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r244,1696244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r295,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r295; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r295; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r295,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 
%r263,%r129,%r104; @ %r263 bra $L137; $L132: .loc 1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r295,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r295,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,4; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r293,%r295,1; .loc 1 512 13 mov.u64 %r278,1; cvt.u32.u32 %r297,%r24; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; $L146: st.u64 [%r98],%r278; mov.u64 %r279,0; st.u64 [%r98+8],%r279; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L146; .loc 1 525 7 @ ! %r293 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r279; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r299; @ %r283 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; $L144: .loc 1 529 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r289,%r75,%r72; shl.b64 %r290,%r289,4; add.u64 %r98,%r98,%r290; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r292,%r98,0; @ ! 
%r292 bra $L122; bra $L147; $L109: .loc 1 534 1 ret; } product_r4.o/_gfortran_product_r4 .visible .func _gfortran_product_r4fortran/generated/product_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mproduct_r4 .visible .func _gfortran_mproduct_r4gfortran_sproduct_r4 .visible .func _gfortran_sproduct_r4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,108] = {80,82,79,68,85,67,84114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_product_r4 .visible .func _gfortran_product_r4480u64 %r52; .reg .u64 %r58; .reg .u64 %r61; .reg .u649pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64u64pred %r317; .reg .f32 %r319; .reg .f32pred %r332; .reg .u64 %r333; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u64 %r338; .reg .u64 %r340; .reg .predmov.u64 %r179,%ar0; mov.u64 %r180,%ar1; mov.u64 %r181,%ar2; .loc 1 55 10 ld.s8 %r22,[%r180+28]; .loc 1 55 38 add.u32 %r182,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r182; .loc 1 56 10 ld.u64 %r25,[%r181]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r184,%r104,63; set.u32.lt.s64 %r186,%r103,%r104; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 58 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 60 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r104,%r104; add.u64 %r201,%r200,%r104; shl.b64 %r202,%r201,3; add.u64 %r203,%r180,%r202; ld.u64 %r30,[%r203+56]; ld.u64 %r32,[%r203+48]; .loc 1 68 9 ld.u64 %r107,[%r203+40]; .loc 1 70 3 setp.ne.u64 %r217,%r104,0; @ %r217 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r218,%r103,%r104; @ %r218 bra $L4; bra $L46; $L3: add.u64 %r61,%r180,40; add.u64 %r40,%frame,120; add.u64 %r91,%frame,240; add.u64 %r219,%r180,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r174,%r219,%r223; .loc 1 76 12 mov.u64 %r357,0; $L8: .loc 1 72 18 ld.u64 %r224,[%r61]; st.u64 [%r40],%r224; .loc 1 73 19 ld.u64 %r226,[%r61+16]; add.u64 %r225,%r226,1; 
ld.u64 %r227,[%r61+8]; sub.u64 %r37,%r225,%r227; .loc 1 75 10 setp.lt.s64 %r228,%r37,0; @ %r228 bra $L6; .loc 1 73 17 st.u64 [%r91],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r91],%r357; $L7: .loc 1 70 3 add.u64 %r61,%r61,24; add.u64 %r40,%r40,8; add.u64 %r91,%r91,8; setp.ne.u64 %r230,%r61,%r174; @ %r230 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r231,[%r179]; setp.eq.u64 %r232,%r231,0; @ ! %r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r143,%r180,%r237; shl.b64 %r238,%r25,3; add.u64 %r125,%r238,-8; add.u64 %r344,%frame,120; add.u64 %r128,%r344,%r125; add.u64 %r345,%frame,240; add.u64 %r121,%r345,%r125; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r180,40; add.u64 %r95,%r246,%r247; .loc 1 84 12 mov.u64 %r356,0; $L14: .loc 1 80 18 ld.u64 %r248,[%r143]; st.u64 [%r128],%r248; .loc 1 81 19 ld.u64 %r250,[%r143+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r143+8]; sub.u64 %r44,%r249,%r251; .loc 1 83 10 setp.lt.s64 %r252,%r44,0; @ %r252 bra $L12; .loc 1 81 17 st.u64 [%r121],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r121],%r356; $L13: .loc 1 78 3 add.u64 %r143,%r143,24; add.u64 %r128,%r128,8; add.u64 %r121,%r121,8; setp.ne.u64 %r254,%r95,%r143; @ %r254 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r255,%r103,0; @ %r255 bra $L16; add.u64 %r345,%frame,240; $L34: add.u64 %r152,%r179,40; mov.u64 %r150,%r345; add.u64 %r256,%r179,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r144,%r256,%r262; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r263,0; st.u64 [%r179+8],%r263; .loc 1 103 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r179+28],%r267; .loc 1 105 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r52,%r268; add.u64 %r270,%r52,%r52; add.u64 %r271,%r270,%r52; shl.b64 %r272,%r271,3; add.u64 %r273,%r179,%r272; .loc 1 105 67 shl.b64 %r275,%r52,3; add.u64 %r276,%frame,%r275; .loc 1 105 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r109,%r278,%r279; .loc 1 107 29281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 107 27 st.u64 [%r179],%r282; .loc 1 108 10 setp.eq.u64 %r284,%r109,0; @ ! 
%r284 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r48,%r99; $L17: .loc 1 98 4 st.u64 [%r152+8],%r285; ld.u64 %r48,[%r150]; add.u64 %r286,%r48,-1; st.u64 [%r152+16],%r286; st.u64 [%r152],%r99; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r287,%r144,%r152; @ %r287 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r179+48],%r109; mov.u64 %r289,-1; st.u64 [%r179+56],%r289; mov.u64 %r290,1; st.u64 [%r179+40],%r290; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r179+28]; .loc 1 118 10 setp.eq.u64 %r291,%r58,%r103; @ %r291 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r294,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r295,[%r294+36]; setp.eq.u32 %r296,%r295,0; @ %r296 bra $L21; .loc 1 125 2 add.u64 %r345,%frame,240; cvta.const.u64 %r300,$LC2r345299300; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r302,%r103,0; @ %r302 bra $L24; $L27: .loc 1 65 9 add.u64 %r303,%r30,1; .loc 1 65 7 sub.u64 %r105,%r303,%r32; max.s64 %r45,%r105,0; .loc 1 137 8 ld.u64 %r116,[%r180]; .loc 1 138 8 ld.u64 %r117,[%r179]; .loc 1 154 36 shl.b64 %r66,%r107,2; ld.u64 %r138,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 shl.b64 %r69,%r68,2; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 shl.b64 %r72,%r71,2; .loc 1 169 32 ld.u64 %r123,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r304,%r68,%r123; .loc 1 176 9 shl.b64 %r305,%r304,2; neg.s64 %r127,%r305; .loc 1 177 23 mul.lo.u64 %r306,%r71,%r123; .loc 1 177 9 shl.b64 %r307,%r306,2; neg.s64 %r130,%r307; setp.gt.s64 %r346,%r105,0; setp.le.s64 %r347,%r103,1; cvt.u32.u32 %r349,%r22; cvt.s64.s8 %r350,%r349; add.u64 %r351,%r350,-1; add.u64 %r352,%frame,120; add.u64 %r353,%frame,240; .loc 1 173 13 mov.u64 %r354,0; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r179,40; mov.u64 %r162,%frame; cvt.u32.u32 %r309,%r22; cvt.s64.s8 %r308,%r309; add.u64 %r153,%r308,-1; .loc 1 129 3 mov.u64 %r118,0; add.u64 %r345,%frame,240; .loc 1 131 16 mov.u64 %r310,%r118; $L26: st.u64 [%r165],%r310; .loc 1 132 18 ld.u64 %r311,[%r164]; st.u64 [%r162],%r311; .loc 1 133 17 shl.b64 %r313,%r118,3; add.u64 %r314,%r345,%r313; .loc 1 133 10 ld.u64 %r315,[%r314]; setp.le.s64 %r316,%r315,0; @ %r316 bra $L1; .loc 1 129 26 add.u64 %r118,%r118,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r317,%r118,%r153; @ %r317 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r346 bra $L35; .loc 1 150 10 mov.f32 %r319,0f3f800000; st.f32 [%r117],%r319; bra $L29; $L35: mov.u64 %r112,%r116; .loc 1 148 10 mov.f32 %r110,0f3f800000; .loc 1 154 13 mov.u64 %r111,0; $L28: .loc 1 158 10 ld.f32 %r320,[%r112]; mul.f32 %r110,%r110,%r320; .loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r66; .loc 1 154 6 setp.gt.s64 %r321,%r45,%r111; @ %r321 bra $L28; .loc 1 161 12 st.f32 [%r117],%r110; $L29: .loc 1 165 15 add.u64 %r138,%r138,1; .loc 1 166 12 add.u64 %r116,%r116,%r69; .loc 1 167 12 add.u64 %r117,%r117,%r72; .loc 1 169 13 setp.ne.u64 %r322,%r138,%r123; @ %r322 bra $L25; .loc 1 176 9 add.u64 %r113,%r116,%r127; .loc 1 177 9 add.u64 %r114,%r117,%r130; .loc 1 179 7 @ %r347 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r115,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r122],%r354; .loc 1 176 23 mul.lo.u64 %r328,%r85,%r84; .loc 1 176 9 shl.b64 %r329,%r328,2; sub.u64 %r113,%r116,%r329; .loc 1 177 23 mul.lo.u64 %r330,%r88,%r84; .loc 1 177 9 shl.b64 %r331,%r330,2; sub.u64 
%r114,%r117,%r331; .loc 1 178 5 add.u64 %r115,%r115,1; .loc 1 179 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r332,%r115,%r351; @ %r332 bra $L1; $L32: .loc 1 187 16 ld.u64 %r333,[%r122]; add.u64 %r84,%r333,1; st.u64 [%r122],%r84; .loc 1 188 23 add.u64 %r335,%r352,%r140; ld.u64 %r85,[%r335]; .loc 1 188 13 shl.b64 %r336,%r85,2; add.u64 %r116,%r113,%r336; .loc 1 189 23 add.u64 %r337,%frame,%r140; ld.u64 %r88,[%r337]; .loc 1 189 13 shl.b64 %r338,%r88,2; add.u64 %r117,%r114,%r338; .loc 1 169 32 add.u64 %r340,%r353,%r140; ld.u64 %r92,[%r340]; .loc 1 169 13 setp.eq.u64 %r341,%r84,%r92; @ %r341 bra $L33; .loc 1 173 13 mov.u64 %r138,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r342,[%r179]; setp.eq.u64 %r343,%r342,0; @ ! %r343 bra $L11; bra $L34; $L1: .loc 1 19_gfortran_mproduct_r4 .visible .func _gfortran_mproduct_r608u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .predpred %r399; .reg .f32 %r400; .reg .pred %r401; .reg .predmov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 223 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $L49gfortran_product_r4.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r211]; .loc 1 233 7 add.u64 %r137,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r210+28]; .loc 1 234 38 add.u32 %r217,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r138,%r217; .loc 1 237 7 shr.u64 %r219,%r137,63; set.u32.gt.s64 %r221,%r137,%r138; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 237 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L51; .loc 1 239 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r429,%r137,%r137; add.u64 %r430,%r429,%r137; shl.b64 %r237,%r430,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 244 7 ld.u64 %r248,[%r238+48]; sub.u64 %r139,%r240,%r248; .loc 1 245 6 setp.le.s64 %r249,%r139,0; @ %r249 bra $L48; .loc 1 248 9 ld.u64 %r152,[%r212]; .loc 1 250 15 ld.u64 %r34,[%r212+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 252 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 252 6 cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L53; .loc 1 254 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L54; $L53: .loc 1 261 9 shl.b64 %r270,%r430,3; add.u64 %r271,%r210,%r270; ld.u64 %r142,[%r271+40]; .loc 1 262 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 264 3 setp.ne.u64 %r279,%r137,0; @ %r279 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r280,%r137,%r138; @ %r280 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r281281_gfortran_runtime_error5: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; add.u64 %r285,%r22,%r22; add.u64 %r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 264 3 mov.u64 %r92,0; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 271 12 mov.u64 %r441,%r92; $L60: 
[Extracted binary payload: nvptx PTX assembly from the cross-compiled libgfortran objects product_r4.o, product_r8.o and product_c4.o (source files libgfortran/generated/product_r4.c, product_r8.c and product_c4.c). The text is interleaved with unrecoverable binary bytes; the recoverable contents are:

- the tail of the product_r4 kernels (_gfortran_mproduct_r4, _gfortran_sproduct_r4) followed by the full kernels _gfortran_product_r8, _gfortran_mproduct_r8, _gfortran_sproduct_r8, _gfortran_product_c4, _gfortran_mproduct_c4 and _gfortran_sproduct_c4;
- external declarations for _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents;
- constant string tables holding the runtime error messages, including "... incorrect in PRODUCT intrinsic: is %ld, should be between 1 and %...", "rank of return array incorrect in PRODUCT intrinsic: is %ld, should be %...", "Funny sized logical array", "MASK argument ..." and "Incorrect extent in return value of PRODUCT intrinsic in dimension %ld: is %ld, should be %...".]
.loc 1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r297,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r297,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,3; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; ld.const.f32 %r293,[$LC4]; ld.const.f32 %r294,[$LC4+4]; setp.le.s64 %r295,%r297,1; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; .loc 1 520 13 mov.u64 %r302,0; $L146: .loc 1 512 13 st.f32 [%r98],%r293; st.f32 [%r98+4],%r294; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L146; .loc 1 525 7 @ ! %r295 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r302; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r301; @ %r283 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; $L144: .loc 1 529 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r289,%r75,%r72; shl.b64 %r290,%r289,3; add.u64 %r98,%r98,%r290; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r292,%r98,0; @ ! 
%r292 bra $L122; bra $L147; $L109: .loc 1 534 1 ret; } product_c8.o/_gfortran_product_c8 .visible .func _gfortran_product_c8fortran/generated/product_c8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mproduct_c8 .visible .func _gfortran_mproduct_c8gfortran_sproduct_c8 .visible .func _gfortran_sproduct_c8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,108] = {80,82,79,68,85,67,84114,101,116,117,114,110,32,118,97,108,117,10164 $LC4[2] = {460718241880001740870,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,9977,65,83,75,32,97,114,103,117,109,101,110,116,0 }99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,82,79,68,85,67,84,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_product_c8 .visible .func _gfortran_product_c8480predpredu64 %r231; .reg .u64 %r232; .reg .pred %r233; .reg .pred %r235; .reg .u64 %r236; .reg .pred %r237; .reg .u64u32 %r263; .reg .u64 %r265; .reg .u64 %r266; .reg .u64 %r267; .reg .u64 %r268; .reg .u16 %r270; .reg .u16 %r271; .reg .u32 %r272; .reg .u32 %r273; .reg .u64 %r275; .reg .u64u64pred64pred %r328; .reg .pred %r329; .reg .u64 %r335; .reg .u64u64 %r359; .reg .u64mov.u64 %r184,%ar0; mov.u64 %r185,%ar1; mov.u64 %r186,%ar2; .loc 1 55 10 ld.s8 %r22,[%r185+28]; .loc 1 55 38 add.u32 %r187,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r104,%r187; .loc 1 56 10 ld.u64 %r25,[%r186]; .loc 1 56 7 add.u64 %r105,%r25,-1; .loc 1 58 7 shr.u64 %r189,%r105,63; set.u32.lt.s64 %r191,%r104,%r105; neg.s32 %r192,%r191; cvt.u16.u64 %r195,%r189; cvt.u16.u32 %r196,%r192; or.b16 %r194,%r195,%r196; .loc 1 58 6 cvt.u32.u16 %r197,%r194; cvt.u16.u8 %r198,%r197; setp.eq.u16 %r199,%r198,0; @ %r199 bra $L2; .loc 1 60 7 cvt.u32.u32 %r202,%r22; cvt.s64.s8 %r201,%r202; st.u64 [%stack+8],%r2000200_gfortran_runtime_errorr205,%r105,%r105; add.u64 %r206,%r205,%r105; shl.b64 %r207,%r206,3; add.u64 %r208,%r185,%r207; ld.u64 %r30,[%r208+56]; ld.u64 %r32,[%r208+48]; .loc 1 68 9 ld.u64 %r108,[%r208+40]; .loc 1 70 3 setp.ne.u64 %r222,%r105,0; @ %r222 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r223,%r104,%r105; @ %r223 bra $L4; bra $L46; $L3: add.u64 %r102,%r185,40; add.u64 %r98,%frame,120; add.u64 %r97,%frame,240; add.u64 %r224,%r185,16; add.u64 %r226,%r25,%r25; add.u64 %r227,%r226,%r25; shl.b64 %r228,%r227,3; add.u64 %r103,%r224,%r228; .loc 1 76 12 mov.u64 %r364,0; $L8: .loc 1 72 18 ld.u64 %r229,[%r102]; 
st.u64 [%r98],%r229; .loc 1 73 19 ld.u64 %r231,[%r102+16]; add.u64 %r230,%r231,1; ld.u64 %r232,[%r102+8]; sub.u64 %r37,%r230,%r232; .loc 1 75 10 setp.lt.s64 %r233,%r37,0; @ %r233 bra $L6; .loc 1 73 17 st.u64 [%r97],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r97],%r364; $L7: .loc 1 70 3 add.u64 %r102,%r102,24; add.u64 %r98,%r98,8; add.u64 %r97,%r97,8; setp.ne.u64 %r235,%r102,%r103; @ %r235 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r236,[%r184]; setp.eq.u64 %r237,%r236,0; @ ! %r237 bra $L11; bra $L10; $L4: add.u64 %r239,%r25,%r25; add.u64 %r240,%r239,%r25; shl.b64 %r241,%r240,3; add.u64 %r242,%r241,40; add.u64 %r154,%r185,%r242; shl.b64 %r243,%r25,3; add.u64 %r147,%r243,-8; add.u64 %r352,%frame,120; add.u64 %r149,%r352,%r147; add.u64 %r351,%frame,240; add.u64 %r145,%r351,%r147; cvt.u32.u32 %r247,%r22; cvt.s64.s8 %r246,%r247; add.u64 %r249,%r246,%r246; add.u64 %r250,%r249,%r246; shl.b64 %r251,%r250,3; add.u64 %r252,%r185,40; add.u64 %r128,%r251,%r252; .loc 1 84 12 mov.u64 %r363,0; $L14: .loc 1 80 18 ld.u64 %r253,[%r154]; st.u64 [%r149],%r253; .loc 1 81 19 ld.u64 %r255,[%r154+16]; add.u64 %r254,%r255,1; ld.u64 %r256,[%r154+8]; sub.u64 %r44,%r254,%r256; .loc 1 83 10 setp.lt.s64 %r257,%r44,0; @ %r257 bra $L12; .loc 1 81 17 st.u64 [%r145],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r145],%r363; $L13: .loc 1 78 3 add.u64 %r154,%r154,24; add.u64 %r149,%r149,8; add.u64 %r145,%r145,8; setp.ne.u64 %r259,%r128,%r154; @ %r259 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r260,%r104,0; @ %r260 bra $L16; add.u64 %r351,%frame,240; $L34: add.u64 %r163,%r184,40; mov.u64 %r161,%r351; add.u64 %r261,%r184,16; cvt.u32.u32 %r263,%r22; cvt.s64.s8 %r262,%r263; add.u64 %r265,%r262,%r262; add.u64 %r266,%r265,%r262; shl.b64 %r267,%r266,3; add.u64 %r155,%r261,%r267; .loc 1 94 10 mov.u64 %r100,1; .loc 1 98 4 mov.u64 %r290,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r268,0; st.u64 [%r184+8],%r268; .loc 1 103 28 cvt.u16.u32 %r271,%r22; add.u16 %r270,%r271,-1; cvt.u32.u16 %r272,%r270; st.u8 [%r184+28],%r272; .loc 1 105 20 add.u32 %r273,%r22,-2; cvt.s64.s32 %r53,%r273; add.u64 %r275,%r53,%r53; add.u64 %r276,%r275,%r53; shl.b64 %r277,%r276,3; add.u64 %r278,%r184,%r277; .loc 1 105 67 shl.b64 %r280,%r53,3; add.u64 %r281,%frame,%r280; .loc 1 105 59 ld.u64 %r283,[%r278+40]; ld.u64 %r284,[%r281+240]; mul.lo.u64 %r110,%r283,%r284; .loc 1 107 29 mov.u64 %r286,1607 27 st.u64 [%r184],%r287; .loc 1 108 10 setp.eq.u64 %r289,%r110,0; @ ! 
%r289 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r100,%r49,%r100; $L17: .loc 1 98 4 st.u64 [%r163+8],%r290; ld.u64 %r49,[%r161]; add.u64 %r291,%r49,-1; st.u64 [%r163+16],%r291; st.u64 [%r163],%r100; .loc 1 91 7 add.u64 %r163,%r163,24; add.u64 %r161,%r161,8; setp.ne.u64 %r292,%r155,%r163; @ %r292 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r184+48],%r110; mov.u64 %r294,-1; st.u64 [%r184+56],%r294; mov.u64 %r295,1; st.u64 [%r184+40],%r295; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r60,[%r184+28]; .loc 1 118 10 setp.eq.u64 %r296,%r60,%r104; @ %r296 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r104; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r299,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r300,[%r299+36]; setp.eq.u32 %r301,%r300,0; @ %r301 bra $L21; .loc 1 125 2 add.u64 %r351,%frame,240; cvta.const.u64 %r305,$LC2r351304305; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r307,%r104,0; @ %r307 bra $L24; $L27: .loc 1 65 9 add.u64 %r308,%r30,1; .loc 1 65 7 sub.u64 %r106,%r308,%r32; max.s64 %r45,%r106,0; .loc 1 137 8 ld.u64 %r117,[%r185]; .loc 1 138 8 ld.u64 %r118,[%r184]; .loc 1 154 36 shl.b64 %r68,%r108,4; ld.u64 %r111,[%frame+360]; .loc 1 166 22 ld.u64 %r70,[%frame+120]; .loc 1 166 12 shl.b64 %r71,%r70,4; .loc 1 167 22 ld.u64 %r73,[%frame]; .loc 1 167 12 shl.b64 %r74,%r73,4; .loc 1 169 32 ld.u64 %r124,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r309,%r70,%r124; .loc 1 176 9 shl.b64 %r310,%r309,4; neg.s64 %r129,%r310; .loc 1 177 23 mul.lo.u64 %r311,%r73,%r124; .loc 1 177 9 shl.b64 %r312,%r311,4; neg.s64 %r133,%r312; setp.gt.s64 %r353,%r106,0; setp.le.s64 %r354,%r104,1; .loc 1 150 10 ld.const.f64 %r356,[$LC4]; ld.const.f64 %r357,[$LC4+8]; cvt.u32.u32 %r358,%r22; cvt.s64.s8 %r359,%r358; add.u64 %r360,%r359,-1; add.u64 %r361,%frame,120; add.u64 %r362,%frame,240; bra $L25; $L24: add.u64 %r176,%frame,360; add.u64 %r175,%r184,40; mov.u64 %r173,%frame; cvt.u32.u32 %r314,%r22; cvt.s64.s8 %r313,%r314; add.u64 %r164,%r313,-1; .loc 1 129 3 mov.u64 %r119,0; add.u64 %r351,%frame,240; .loc 1 131 16 mov.u64 %r315,%r119; $L26: st.u64 [%r176],%r315; .loc 1 132 18 ld.u64 %r316,[%r175]; st.u64 [%r173],%r316; .loc 1 133 17 shl.b64 %r318,%r119,3; add.u64 %r319,%r351,%r318; .loc 1 133 10 ld.u64 %r320,[%r319]; setp.le.s64 %r321,%r320,0; @ %r321 bra $L1; .loc 1 129 26 add.u64 %r119,%r119,1; .loc 1 129 3 add.u64 %r176,%r176,8; add.u64 %r175,%r175,24; add.u64 %r173,%r173,8; setp.ne.u64 %r322,%r119,%r164; @ %r322 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r353 bra $L35; .loc 1 150 10 st.f64 [%r118],%r356; st.f64 [%r118+8],%r357; bra $L29; $L35: mov.u64 %r113,%r117; mov.f64 %r95,0d0000000000000000; mov.f64 %r96,0d3ff0000000000000; .loc 1 154 13 mov.u64 %r112,0; $L28: ld.f64 %r126,[%r113]; ld.f64 %r122,[%r113+8]; .loc 1 158 10 mul.f64 %r99,%r96,%r122; mul.f64 %r326,%r95,%r122; neg.f64 %r327,%r326; fma.rn.f64 %r96,%r96,%r126,%r327; fma.rn.f64 %r95,%r95,%r126,%r99; .loc 1 154 28 add.u64 %r112,%r112,1; .loc 1 154 36 add.u64 %r113,%r113,%r68; .loc 1 154 6 setp.gt.s64 %r328,%r45,%r112; @ %r328 bra $L28; .loc 1 161 12 st.f64 [%r118],%r96; st.f64 [%r118+8],%r95; $L29: .loc 1 165 15 add.u64 %r111,%r111,1; .loc 1 166 12 add.u64 %r117,%r117,%r71; .loc 1 167 12 add.u64 %r118,%r118,%r74; .loc 1 169 13 setp.ne.u64 %r329,%r111,%r124; @ %r329 bra $L25; .loc 1 176 9 add.u64 %r114,%r117,%r129; .loc 1 177 9 add.u64 %r115,%r118,%r133; .loc 1 179 7 @ %r354 bra $L1; add.u64 %r123,%frame,368; mov.u64 %r141,8; .loc 1 178 5 mov.u64 %r116,1; .loc 1 173 13 mov.u64 %r355,0; 
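(Annotation, not part of the package payload.) The `$L28` loop above, inside `_gfortran_product_c8` per the `.loc` references to fortran/generated/product_c8.c, keeps a running complex(kind=8) product in a (re, im) register pair and folds in one strided element per iteration via two `mul.f64` and two `fma.rn.f64` instructions. The C fragment below is an illustrative sketch only — not the libgfortran source — and all names (`complex_product_step`, `base`, `stride`, `n`) are hypothetical.

```c
#include <stddef.h>

/* Illustrative sketch (hypothetical names), mirroring the fma-based loop
   above: multiply the running product (re, im) by each element (x_re, x_im)
   along a strided walk, i.e. re' = re*x_re - im*x_im, im' = re*x_im + im*x_re. */
static void complex_product_step(const double *base, size_t stride, size_t n,
                                 double *out_re, double *out_im)
{
  double re = 1.0, im = 0.0;            /* product starts at (1, 0), as the
                                           0d3ff0.../0d0000... moves set up */
  for (size_t i = 0; i < n; i++)
    {
      double x_re = base[2 * i * stride];
      double x_im = base[2 * i * stride + 1];
      double t = re * x_im;             /* mirrors mul.f64 %r99  */
      double u = im * x_im;             /* mirrors mul.f64 %r326 */
      re = re * x_re - u;               /* mirrors fma.rn.f64 %r96 */
      im = im * x_re + t;               /* mirrors fma.rn.f64 %r95 */
    }
  *out_re = re;
  *out_im = im;
}
```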
bra $L32; $L33: st.u64 [%r123],%r355; .loc 1 176 23 mul.lo.u64 %r335,%r88,%r87; .loc 1 176 9 shl.b64 %r336,%r335,4; sub.u64 %r114,%r117,%r336; .loc 1 177 23 mul.lo.u64 %r337,%r91,%r87; .loc 1 177 9 shl.b64 %r338,%r337,4; sub.u64 %r115,%r118,%r338; .loc 1 178 5 add.u64 %r116,%r116,1; .loc 1 179 7 add.u64 %r123,%r123,8; add.u64 %r141,%r141,8; setp.eq.u64 %r339,%r116,%r360; @ %r339 bra $L1; $L32: .loc 1 187 16 ld.u64 %r340,[%r123]; add.u64 %r87,%r340,1; st.u64 [%r123],%r87; .loc 1 188 23 add.u64 %r342,%r361,%r141; ld.u64 %r88,[%r342]; .loc 1 188 13 shl.b64 %r343,%r88,4; add.u64 %r117,%r114,%r343; .loc 1 189 23 add.u64 %r344,%frame,%r141; ld.u64 %r91,[%r344]; .loc 1 189 13 shl.b64 %r345,%r91,4; add.u64 %r118,%r115,%r345; .loc 1 169 32 add.u64 %r347,%r362,%r141; ld.u64 %r94,[%r347]; .loc 1 169 13 setp.eq.u64 %r348,%r87,%r94; @ %r348 bra $L33; .loc 1 173 13 mov.u64 %r111,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r349,[%r184]; setp.eq.u64 %r350,%r349,0; @ ! %r350 bra $L11; bra $L34; $L1: .loc 1 19_gfortran_mproduct_c8 .visible .func _gfortran_mproduct_c8608pred %r218; .reg .u32 %r222; .reg .u64 %r224; .reg .u32 %r226; .reg .u32 %r227; .reg .u16 %r229; .reg .u16u16 %r233; .reg .pred %r234; .reg .u64u64predu6464pred %r335; .reg .u64 %r336; .reg .u32 %r337; .reg .u64u64 %r342; .reg .u32 %r343; .reg .u64pred %r372; .reg .u64 %r373; .reg .u64 %r375; .reg .u32 %r376; .reg .predu32pred %r404; .reg .f64 %r405; .reg .f64 %r406; .reg .predu64 %r413; .reg .predu64predmov.u64 %r216,%ar2; mov.u64 %r217,%ar3; .loc 1 223 6 setp.ne.u64 %r218,%r217,0; @ %r218 bra $L491215216; call _gfortran_product_c8.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r216]; .loc 1 233 7 add.u64 %r134,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r215+28]; .loc 1 234 38 add.u32 %r222,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r135,%r222; .loc 1 237 7 shr.u64 %r224,%r134,63; set.u32.gt.s64 %r226,%r134,%r135; neg.s32 %r227,%r226; cvt.u16.u64 %r230,%r224; cvt.u16.u32 %r231,%r227; or.b16 %r229,%r230,%r231; .loc 1 237 6 cvt.u32.u16 %r232,%r229; cvt.u16.u8 %r233,%r232; setp.eq.u16 %r234,%r233,0; @ %r234 bra $L51; .loc 1 239 7 cvt.u32.u32 %r237,%r23; cvt.s64.s8 %r236,%r237; st.u64 [%stack+8],%r236; st.u64 [%stack],%r22; cvta.const.u64 %r235235_gfortran_runtime_error1: .loc 1 244 9 add.u64 %r434,%r134,%r134; add.u64 %r436,%r434,%r134; shl.b64 %r242,%r436,3; add.u64 %r243,%r215,%r242; ld.u64 %r246,[%r243+56]; add.u64 %r245,%r246,1; .loc 1 244 7 ld.u64 %r253,[%r243+48]; sub.u64 %r136,%r245,%r253; .loc 1 245 6 setp.le.s64 %r254,%r136,0; @ %r254 bra $L48; .loc 1 248 9 ld.u64 %r149,[%r217]; .loc 1 250 15 ld.u64 %r34,[%r217+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r255,%r35,-4; and.b32 %r256,%r255,-5; set.u32.eq.u32 %r258,%r256,0; neg.s32 %r259,%r258; .loc 1 252 22 add.u32 %r260,%r35,-1; set.u32.le.u32 %r262,%r260,1; neg.s32 %r263,%r262; .loc 1 252 6 cvt.u16.u32 %r265,%r259; cvt.u16.u32 %r266,%r263; or.loc 1 254 7 setp.ne.u32 %r271,%r35,16; @ %r271 bra $L54; $L53: .loc 1 261 9 shl.b64 %r275,%r436,3; add.u64 %r276,%r215,%r275; ld.u64 %r139,[%r276+40]; .loc 1 262 12 add.u64 %r282,%r217,%r275; ld.u64 %r41,[%r282+40]; .loc 1 264 3 setp.ne.u64 %r284,%r134,0; @ %r284 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r285,%r134,%r135; @ %r285 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r286286_gfortran_runtime_error5: add.u64 %r40,%r215,40; add.u64 %r128,%r217,40; add.u64 %r288,%r215,16; add.u64 %r290,%r22,%r22; add.u64 %r291,%r290,%r22; shl.b64 %r292,%r291,3; add.u64 %r211,%r288,%r292; .loc 1 264 3 mov.u64 %r88,0; add.u64 
%r433,%frame,240; add.u64 %r432,%frame,360; .loc 1 271 12 mov.u64 %r447,%r88; $L60: .loc 1 266 18 add.u64 %r294,%r433,%r88; ld.u64 %r295,[%r40]; st.u64 [%r294],%r295; .loc 1 267 18 add.u64 %r296,%frame,%r88; .loc 1 267 20 ld.u64 %r298,[%r128]; mul.lo.u64 %r297,%r298,%r34; .loc 1 267 18 st.u64 [%r296],%r297; .loc 1 268 19 ld.u64 %r300,[%r40+16]; add.u64 %r299,%r300,1; ld.u64 %r301,[%r40+8]; sub.u64 %r52,%r299,%r301; .loc 1 270 10 setp.lt.s64 %r302,%r52,0; @ %r302 bra $L58; .loc 1 268 17 add.u64 %r304,%r432,%r88; st.u64 [%r304],%r52; bra $L59; $L58: .loc 1 271 12 add.u64 %r306,%r432,%r88; st.u64 [%r306],%r447; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r128,%r128,24; add.u64 %r88,%r88,8; setp.ne.u64 %r308,%r40,%r211; @ %r308 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r309,[%r214]; setp.eq.u64 %r310,%r309,0; @ ! %r310 bra $L63; bra $L62; $L56: add.u64 %r312,%r22,%r22; add.u64 %r313,%r312,%r22; shl.b64 %r314,%r313,3; add.u64 %r123,%r314,40; add.u64 %r132,%r215,%r123; add.u64 %r121,%r217,%r123; shl.b64 %r315,%r22,3; add.u64 %r133,%r315,-8; cvt.u32.u32 %r317,%r23; cvt.s64.s8 %r316,%r317; shl.b64 %r318,%r316,3; add.u64 %r67,%r318,-8; add.u64 %r433,%frame,240; add.u64 %r432,%frame,360; .loc 1 281 12 mov.u64 %r446,0; $L66: .loc 1 276 18 add.u64 %r320,%r433,%r133; ld.u64 %r321,[%r132]; st.u64 [%r320],%r321; .loc 1 277 18 add.u64 %r322,%frame,%r133; .loc 1 277 20 ld.u64 %r324,[%r121]; mul.lo.u64 %r323,%r324,%r34; .loc 1 277 18 st.u64 [%r322],%r323; .loc 1 278 19 ld.u64 %r326,[%r132+16]; add.u64 %r325,%r326,1; ld.u64 %r327,[%r132+8]; sub.u64 %r63,%r325,%r327; .loc 1 280 10 setp.lt.s64 %r328,%r63,0; @ %r328 bra $L64; .loc 1 278 17 add.u64 %r330,%r432,%r133; st.u64 [%r330],%r63; bra $L65; $L64: .loc 1 281 12 add.u64 %r332,%r432,%r133; st.u64 [%r332],%r446; $L65: .loc 1 274 3 add.u64 %r132,%r132,24; add.u64 %r121,%r121,24; add.u64 %r133,%r133,8; setp.ne.u64 %r334,%r67,%r133; @ %r334 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r335,%r135,0; @ %r335 bra $L68; add.u64 %r432,%frame,360; $L86: add.u64 %r181,%r214,40; mov.u64 %r179,%r432; cvt.u32.u32 %r337,%r23; cvt.s64.s8 %r336,%r337; add.u64 %r339,%r336,%r336; add.u64 %r340,%r339,%r336; shl.b64 %r341,%r340,3; add.u64 %r342,%r214,16; add.u64 %r144,%r341,%r342; .loc 1 291 10 mov.u64 %r125,1; .loc 1 295 4 mov.u64 %r361,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r343,%r23,-2; cvt.s64.s32 %r72,%r343; add.u64 %r345,%r72,%r72; add.u64 %r346,%r345,%r72; shl.b64 %r347,%r346,3; add.u64 %r348,%r214,%r347; .loc 1 299 67 shl.b64 %r350,%r72,3; add.u64 %r351,%frame,%r350; .loc 1 299 59 ld.u64 %r353,[%r348+40]; ld.u64 %r354,[%r351+360]; mul.lo.u64 %r143,%r353,%r354; .loc 1 301 24 mov.u64 %r355,0; st.u64 [%r214+8],%r355; .loc 1 302 28 cvt.u16.u32 %r358,%r23; add.u16 %r357,%r358,-1; cvt.u32.u16 %r359,%r357; st.u8 [%r214+28],%r359; .loc 1 304 10 setp.eq.u64 %r360,%r143,0; @ %r360 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r125,%r69,%r125; $L69: .loc 1 295 4 st.u64 [%r181+8],%r361; ld.u64 %r69,[%r179]; add.u64 %r362,%r69,-1; st.u64 [%r181+16],%r362; st.u64 [%r181],%r125; .loc 1 288 7 add.u64 %r181,%r181,24; add.u64 %r179,%r179,8; setp.ne.u64 %r363,%r144,%r181; @ %r363 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r214+48],%r143; mov.u64 %r365,-1; st.u64 [%r214+56],%r365; mov.u64 %r366,1; st.u64 [%r214+40],%r366; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r368,16368; call (%value_in),_gfortrani_xmallocarray69,[%value_in]; } .loc 1 311 22 st.u64 [%r214],%r369; bra $L73; $L63: .loc 1 316 19 ld.s8 %r371,[%r214+28]; .loc 1 316 10 setp.eq.u64 
%r372,%r371,%r135; @ %r372 bra $L74; .loc 1 317 2 cvta.const.u64 %r373,$LC373_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r375,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r376,[%r375+36]; setp.eq.u32 %r377,%r376,0; @ %r377 bra $L73; .loc 1 321 4 add.u64 %r432,%frame,360; cvta.const.u64 %r381,$LC2r4380381; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r385,$LC738381; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r387,%r135,0; @ %r387 bra $L75; $L78: .loc 1 336 8 ld.u64 %r150,[%r214]; .loc 1 337 8 ld.u64 %r148,[%r215]; .loc 1 339 9 setp.ne.u64 %r388,%r148,0; @ %r388 bra $L76; bra $L48; $L75: add.u64 %r194,%frame,480; add.u64 %r193,%r214,40; add.u64 %r191,%frame,120; cvt.u32.u32 %r390,%r23; cvt.s64.s8 %r389,%r390; add.u64 %r182,%r389,-1; .loc 1 328 3 mov.u64 %r155,0; add.u64 %r432,%frame,360; .loc 1 330 16 mov.u64 %r391,%r155; $L77: st.u64 [%r194],%r391; .loc 1 331 18 ld.u64 %r392,[%r193]; st.u64 [%r191],%r392; .loc 1 332 17 shl.b64 %r394,%r155,3; add.u64 %r395,%r432,%r394; .loc 1 332 10 ld.u64 %r396,[%r395]; setp.le.s64 %r397,%r396,0; @ %r397 bra $L48; .loc 1 328 26 add.u64 %r155,%r155,1; .loc 1 328 3 add.u64 %r194,%r194,8; add.u64 %r193,%r193,24; add.u64 %r191,%r191,8; setp.ne.u64 %r398,%r155,%r182; @ %r398 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r43,%r41,%r34; .loc 1 349 32 shl.b64 %r87,%r139,4; ld.u64 %r151,[%frame+480]; .loc 1 359 22 ld.u64 %r90,[%frame+240]; .loc 1 359 12 shl.b64 %r91,%r90,4; .loc 1 360 23 ld.u64 %r93,[%frame]; .loc 1 361 22 ld.u64 %r95,[%frame+120]; .loc 1 361 12 shl.b64 %r96,%r95,4; .loc 1 363 32 ld.u64 %r81,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r81,%r90; .loc 1 371 24 mul.lo.u64 %r399,%r81,%r93; .loc 1 371 10 neg.s64 %r161,%r399; .loc 1 372 23 mul.lo.u64 %r400,%r81,%r95; .loc 1 372 9 shl.b64 %r401,%r400,4; neg.s64 %r165,%r401; setp.le.s64 %r437,%r135,1; cvt.u32.u32 %r439,%r23; cvt.s64.s8 %r440,%r439; add.u64 %r441,%r440,-1; add.u64 %r442,%frame,240; add.u64 %r443,%frame,360; add.u64 %r444,%frame,120; .loc 1 367 13 mov.u64 %r445,0; $L87: .loc 1 328 3 mov.u64 %r154,%r149; mov.u64 %r153,%r148; mov.f64 %r119,0d0000000000000000; mov.f64 %r120,0d3ff0000000000000; .loc 1 349 9 mov.u64 %r152,0; $L80: .loc 1 352 6 ld.s8 %r403,[%r154]; cvt.u16.u32 %r402,%r403; setp.eq.u16 %r404,%r402,0; @ %r404 bra $L79; ld.f64 %r163,[%r153]; ld.f64 %r170,[%r153+8]; .loc 1 353 12 mul.f64 %r174,%r120,%r170; mul.f64 %r405,%r119,%r170; neg.f64 %r406,%r405; fma.rn.f64 %r120,%r120,%r163,%r406; fma.rn.f64 %r119,%r119,%r163,%r174; $L79: .loc 1 349 24 add.u64 %r152,%r152,1; .loc 1 349 32 add.u64 %r153,%r153,%r87; .loc 1 349 47 add.u64 %r154,%r154,%r43; .loc 1 349 2 setp.ne.u64 %r407,%r136,%r152; @ %r407 bra $L80; .loc 1 355 8 st.f64 [%r150],%r120; st.f64 [%r150+8],%r119; .loc 1 358 15 add.u64 %r151,%r151,1; .loc 1 359 12 add.u64 %r148,%r148,%r91; .loc 1 360 13 add.u64 %r149,%r149,%r93; .loc 1 361 12 add.u64 %r150,%r150,%r96; .loc 1 363 13 setp.ne.u64 %r408,%r81,%r151; @ %r408 bra $L87; .loc 1 371 10 add.u64 %r145,%r149,%r161; .loc 1 372 9 add.u64 %r146,%r150,%r165; .loc 1 374 7 @ ! 
%r437 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r127],%r445; .loc 1 370 23 mul.lo.u64 %r99,%r110,%r109; .loc 1 371 24 mul.lo.u64 %r411,%r112,%r109; .loc 1 371 10 sub.u64 %r145,%r149,%r411; .loc 1 372 23 mul.lo.u64 %r412,%r114,%r109; .loc 1 372 9 shl.b64 %r413,%r412,4; sub.u64 %r146,%r150,%r413; .loc 1 373 5 add.u64 %r147,%r147,1; .loc 1 374 7 add.u64 %r127,%r127,8; add.u64 %r178,%r178,8; setp.ne.u64 %r414,%r147,%r441; @ %r414 bra $L84; bra $L48; $L108: add.u64 %r127,%frame,488; .loc 1 370 23 mov.u64 %r99,%r38; .loc 1 374 7 mov.u64 %r178,8; .loc 1 373 5 mov.u64 %r147,1; $L84: .loc 1 382 16 ld.u64 %r418,[%r127]; add.u64 %r109,%r418,1; st.u64 [%r127],%r109; .loc 1 383 23 add.u64 %r420,%r442,%r178; ld.u64 %r110,[%r420]; .loc 1 383 13 sub.u64 %r421,%r110,%r99; shl.b64 %r422,%r421,4; add.u64 %r148,%r148,%r422; .loc 1 384 24 add.u64 %r423,%frame,%r178; ld.u64 %r112,[%r423]; .loc 1 384 14 add.u64 %r149,%r145,%r112; .loc 1 385 23 add.u64 %r425,%r444,%r178; ld.u64 %r114,[%r425]; .loc 1 385 13 shl.b64 %r426,%r114,4; add.u64 %r150,%r146,%r426; .loc 1 363 32 add.u64 %r428,%r443,%r178; ld.u64 %r117,[%r428]; .loc 1 363 13 setp.eq.u64 %r429,%r109,%r117; @ %r429 bra $L85; .loc 1 367 13 mov.u64 %r151,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r430,[%r214]; setp.eq.u64 %r431,%r430,0; @ ! %r431 bra $L63; bra $L86; $L48:_gfortran_sproduct_c8 .visible .func _gfortran_sproduct_c868u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64 %r276; .reg .pred %r277; .reg .pred %r280; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_product_c8.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r297,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r297; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r297; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r304,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r304; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r296,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r296,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r303,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r303; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r297,0; @ %r211 bra $L127; add.u64 %r296,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r244,1696244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r297,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r297; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r297,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; 
$L132: .loc 1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r297,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r297,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,4; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; ld.const.f64 %r293,[$LC4]; ld.const.f64 %r294,[$LC4+8]; setp.le.s64 %r295,%r297,1; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; .loc 1 520 13 mov.u64 %r302,0; $L146: .loc 1 512 13 st.f64 [%r98],%r293; st.f64 [%r98+8],%r294; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L146; .loc 1 525 7 @ ! %r295 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r302; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r301; @ %r283 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; $L144: .loc 1 529 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r289,%r75,%r72; shl.b64 %r290,%r289,4; add.u64 %r98,%r98,%r290; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r292,%r98,0; @ ! 
%r292 bra $L122; bra $L147; $L109: .loc 1 53sum_i1.o/_gfortran_sum_i1 .visible .func _gfortran_sum_fortran/generated/sum_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_msum_i1 .visible .func _gfortran_msum_i1gfortran_ssum_i1 .visible .func _gfortran_ssum_i1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10] = {83,85,77114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,85,77,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_sum_i1 .visible .func _gfortran_sum_480predu64 %r229; .reg .u64u64pred %r247; .reg .pred %r249; .reg .predu64u64 %r334; .reg .u64 %r336; .reg .predpred %r343; .reg .u32u64 %r353; .reg .u16 %r354; mov.u64 %r174,%ar0; mov.u64 %r175,%ar1; mov.u64 %r176,%ar2; .loc 1 55 10 ld.s8 %r22,[%r175+28]; .loc 1 55 38 add.u32 %r177,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r107,%r177; .loc 1 56 10 ld.u64 %r25,[%r176]; .loc 1 56 7 add.u64 %r108,%r25,-1; .loc 1 58 7 shr.u64 %r179,%r108,63; set.u32.lt.s64 %r181,%r107,%r108; neg.s32 %r182,%r181; cvt.u16.u64 %r185,%r179; cvt.u16.u32 %r186,%r182; or.b16 %r184,%r185,%r186; .loc 1 58 6 cvt.u32.u16 %r187,%r184; cvt.u16.u8 %r188,%r187; setp.eq.u16 %r189,%r188,0; @ %r189 bra $L2; .loc 1 60 7 cvt.u32.u32 %r192,%r22; cvt.s64.s8 %r191,%r192; st.u64 [%stack+8],%r19190,$LC0;stack; call _gfortran_runtime_errorr195,%r108,%r108; add.u64 %r196,%r195,%r108; shl.b64 %r197,%r196,3; add.u64 %r198,%r175,%r197; ld.u64 %r30,[%r198+56]; ld.u64 %r32,[%r198+48]; .loc 1 68 9 ld.u64 %r111,[%r198+40]; .loc 1 70 3 setp.ne.u64 %r212,%r108,0; @ %r212 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r213,%r107,%r108; @ %r213 bra $L4; bra $L46; $L3: add.u64 %r125,%r175,40; add.u64 %r172,%frame,120; add.u64 %r170,%frame,240; add.u64 %r214,%r175,16; add.u64 %r216,%r25,%r25; add.u64 %r217,%r216,%r25; shl.b64 %r218,%r217,3; add.u64 %r79,%r214,%r218; .loc 1 76 12 mov.u64 %r353,0; $L8: .loc 1 72 18 ld.u64 %r219,[%r125]; st.u64 [%r172],%r219; .loc 1 73 19 ld.u64 %r221,[%r125+16]; add.u64 %r220,%r221,1; ld.u64 %r222,[%r125+8]; sub.u64 %r37,%r220,%r222; .loc 1 75 10 setp.lt.s64 %r223,%r37,0; @ %r223 bra $L6; .loc 1 73 17 st.u64 [%r170],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r170],%r353; $L7: .loc 1 70 3 add.u64 %r125,%r125,24; add.u64 %r172,%r172,8; add.u64 
%r170,%r170,8; setp.ne.u64 %r225,%r79,%r125; @ %r225 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r226,[%r174]; setp.eq.u64 %r227,%r226,0; @ ! %r227 bra $L11; bra $L10; $L4: add.u64 %r229,%r25,%r25; add.u64 %r230,%r229,%r25; shl.b64 %r231,%r230,3; add.u64 %r232,%r231,40; add.u64 %r117,%r175,%r232; shl.b64 %r233,%r25,3; add.u64 %r97,%r233,-8; add.u64 %r340,%frame,120; add.u64 %r100,%r340,%r97; add.u64 %r341,%frame,240; add.u64 %r95,%r341,%r97; cvt.u32.u32 %r237,%r22; cvt.s64.s8 %r236,%r237; add.u64 %r239,%r236,%r236; add.u64 %r240,%r239,%r236; shl.b64 %r241,%r240,3; add.u64 %r242,%r175,40; add.u64 %r94,%r241,%r242; .loc 1 84 12 mov.u64 %r352,0; $L14: .loc 1 80 18 ld.u64 %r243,[%r117]; st.u64 [%r100],%r243; .loc 1 81 19 ld.u64 %r245,[%r117+16]; add.u64 %r244,%r245,1; ld.u64 %r246,[%r117+8]; sub.u64 %r44,%r244,%r246; .loc 1 83 10 setp.lt.s64 %r247,%r44,0; @ %r247 bra $L12; .loc 1 81 17 st.u64 [%r95],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r95],%r352; $L13: .loc 1 78 3 add.u64 %r117,%r117,24; add.u64 %r100,%r100,8; add.u64 %r95,%r95,8; setp.ne.u64 %r249,%r94,%r117; @ %r249 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r250,%r107,0; @ %r250 bra $L16; add.u64 %r341,%frame,240; $L35: add.u64 %r147,%r174,40; mov.u64 %r138,%r341; cvt.u32.u32 %r252,%r22; cvt.s64.s8 %r251,%r252; add.u64 %r254,%r251,%r251; add.u64 %r255,%r254,%r251; shl.b64 %r256,%r255,3; add.u64 %r257,%r174,16; add.u64 %r126,%r256,%r257; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r280,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r258,0; st.u64 [%r174+8],%r258; .loc 1 103 28 cvt.u16.u32 %r261,%r22; add.u16 %r260,%r261,-1; cvt.u32.u16 %r262,%r260; st.u8 [%r174+28],%r262; .loc 1 105 20 add.u32 %r263,%r22,-2; cvt.s64.s32 %r55,%r263; add.u64 %r265,%r55,%r55; add.u64 %r266,%r265,%r55; shl.b64 %r267,%r266,3; add.u64 %r268,%r174,%r267; .loc 1 105 67 shl.b64 %r270,%r55,3; add.u64 %r271,%frame,%r270; .loc 1 105 59 ld.u64 %r273,[%r268+40]; ld.u64 %r274,[%r271+240]; mul.lo.u64 %r113,%r273,%r274; .loc 1 107 29 mov.u64 %r27611(%value_in),_gfortrani_xmallocarray277,[%value_in]; } .loc 1 107 27 st.u64 [%r174],%r277; .loc 1 108 10 setp.eq.u64 %r279,%r113,0; @ ! 
%r279 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r51,%r99; $L17: .loc 1 98 4 st.u64 [%r147+8],%r280; ld.u64 %r51,[%r138]; add.u64 %r281,%r51,-1; st.u64 [%r147+16],%r281; st.u64 [%r147],%r99; .loc 1 91 7 add.u64 %r147,%r147,24; add.u64 %r138,%r138,8; setp.ne.u64 %r282,%r126,%r147; @ %r282 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r174+48],%r113; mov.u64 %r284,-1; st.u64 [%r174+56],%r284; st.u64 [%r174+40],%r276; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r62,[%r174+28]; .loc 1 118 10 setp.eq.u64 %r286,%r62,%r107; @ %r286 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r107;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r289,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r290,[%r289+36]; setp.eq.u32 %r291,%r290,0; @ %r291 bra $L21; .loc 1 125 2 add.u64 %r341,%frame,240; cvta.const.u64 %r295,$LC2r34129495; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r297,%r107,0; @ %r297 bra $L24; $L27: .loc 1 65 9 add.u64 %r298,%r30,1; .loc 1 65 7 sub.u64 %r109,%r298,%r32; max.s64 %r45,%r109,0; .loc 1 137 8 ld.u64 %r121,[%r175]; .loc 1 138 8 ld.u64 %r122,[%r174]; ld.u64 %r47,[%frame+360]; .loc 1 166 22 ld.u64 %r74,[%frame+120]; .loc 1 167 22 ld.u64 %r76,[%frame]; setp.gt.s64 %r342,%r109,0; setp.le.s64 %r343,%r107,1; cvt.u32.u32 %r345,%r22; cvt.s64.s8 %r346,%r345; add.u64 %r347,%r346,-1; add.u64 %r348,%frame,120; add.u64 %r349,%frame,240; .loc 1 173 13 mov.u64 %r350,0; bra $L25; $L24: add.u64 %r160,%frame,360; add.u64 %r159,%r174,40; mov.u64 %r157,%frame; cvt.u32.u32 %r300,%r22; cvt.s64.s8 %r299,%r300; add.u64 %r148,%r299,-1; .loc 1 129 3 mov.u64 %r123,0; add.u64 %r341,%frame,240; .loc 1 131 16 mov.u64 %r301,%r123; $L26: st.u64 [%r160],%r301; .loc 1 132 18 ld.u64 %r302,[%r159]; st.u64 [%r157],%r302; .loc 1 133 17 shl.b64 %r304,%r123,3; add.u64 %r305,%r341,%r304; .loc 1 133 10 ld.u64 %r306,[%r305]; setp.le.s64 %r307,%r306,0; @ %r307 bra $L1; .loc 1 129 26 add.u64 %r123,%r123,1; .loc 1 129 3 add.u64 %r160,%r160,8; add.u64 %r159,%r159,24; add.u64 %r157,%r157,8; setp.ne.u64 %r308,%r123,%r148; @ %r308 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r342 bra $L28; .loc 1 150 10 mov.u32 %r310,0; st.u8 [%r122],%r310; bra $L29; $L28: .loc 1 154 36 mov.u64 %r116,%r121; .loc 1 148 10 mov.u32 %r114,0; .loc 1 154 13 mov.u64 %r115,0; $L30: .loc 1 158 10 ld.u8 %r354,[%r116]; mov.u16 %r314,%r354; cvt.u16.u32 %r315,%r114; add.u16 %r313,%r314,%r315; cvt.u32.u16 %r316,%r313; cvt.s32.s8 %r114,%r316; .loc 1 154 28 add.u64 %r115,%r115,1; .loc 1 154 36 add.u64 %r116,%r116,%r111; .loc 1 154 6 setp.gt.s64 %r318,%r45,%r115; @ %r318 bra $L30; .loc 1 161 12 cvt.u32.u32 %r319,%r114; st.u8 [%r122],%r319; $L29: .loc 1 165 15 add.u64 %r47,%r47,1; .loc 1 166 12 add.u64 %r121,%r121,%r74; .loc 1 167 12 add.u64 %r122,%r122,%r76; .loc 1 169 32 ld.u64 %r128,[%frame+240]; .loc 1 169 13 setp.ne.u64 %r320,%r47,%r128; @ %r320 bra $L25; .loc 1 176 23 mul.lo.u64 %r321,%r74,%r47; .loc 1 176 9 sub.u64 %r118,%r121,%r321; .loc 1 177 23 mul.lo.u64 %r322,%r76,%r47; .loc 1 177 9 sub.u64 %r119,%r122,%r322; .loc 1 179 7 @ %r343 bra $L1; add.u64 %r143,%frame,368; mov.u64 %r106,8; .loc 1 178 5 mov.u64 %r120,1; bra $L33; $L34: .loc 1 173 13 st.u64 [%r143],%r350; .loc 1 176 23 mul.lo.u64 %r328,%r89,%r88; .loc 1 176 9 sub.u64 %r118,%r121,%r328; .loc 1 177 23 mul.lo.u64 %r329,%r91,%r88; .loc 1 177 9 sub.u64 %r119,%r122,%r329; .loc 1 178 5 add.u64 %r120,%r120,1; .loc 1 179 7 add.u64 %r143,%r143,8; add.u64 %r106,%r106,8; setp.eq.u64 %r330,%r120,%r347; @ %r330 bra $L1; $L33: .loc 1 187 16 ld.u64 %r331,[%r143]; 
add.u64 %r88,%r331,1; st.u64 [%r143],%r88; .loc 1 188 23 add.u64 %r333,%r348,%r106; ld.u64 %r89,[%r333]; .loc 1 188 13 add.u64 %r121,%r118,%r89; .loc 1 189 23 add.u64 %r334,%frame,%r106; ld.u64 %r91,[%r334]; .loc 1 189 13 add.u64 %r122,%r119,%r91; .loc 1 169 32 add.u64 %r336,%r349,%r106; ld.u64 %r93,[%r336]; .loc 1 169 13 setp.eq.u64 %r337,%r88,%r93; @ %r337 bra $L34; .loc 1 173 13 mov.u64 %r47,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r338,[%r174]; setp.eq.u64 %r339,%r338,0; @ ! %r339 bra $L11; bra $L35; $L1: .loc 1 19_gfortran_msum_i1 .visible .func _gfortran_msum_608pred %r210; .reg .u32 %r214; .reg .u64 %r216; .reg .u32 %r218; .reg .u32 %r219; .reg .u16 %r221; .reg .u16 %r222; .reg .u16 %r223; .reg .u32 %r224; .reg .u16 %r225; .reg .predu32 %r259; .reg .u16u64 %r280; .reg .u64u64predu64u64 %r333; .reg .u64 %r334; .reg .u32predpredu64 %r377; .reg .pred %r379; .reg .pred %r380; .reg .u64 %r381; .reg .u32pred %r389; .reg .predu32 %r399; .reg .predpredmov.u64 %r206,%ar0; mov.u64 %r207,%ar1; mov.u64 %r208,%ar2; mov.u64 %r209,%ar3; .loc 1 223 6 setp.ne.u64 %r210,%r209,0; @ %r210 bra $L49; .loc 1 228 7 {_gfortran_sum_.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r208]; .loc 1 233 7 add.u64 %r135,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r207+28]; .loc 1 234 38 add.u32 %r214,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r136,%r214; .loc 1 237 7 shr.u64 %r216,%r135,63; set.u32.gt.s64 %r218,%r135,%r136; neg.s32 %r219,%r218; cvt.u16.u64 %r222,%r216; cvt.u16.u32 %r223,%r219; or.b16 %r221,%r222,%r223; .loc 1 237 6 cvt.u32.u16 %r224,%r221; cvt.u16.u8 %r225,%r224; setp.eq.u16 %r226,%r225,0; @ %r226 bra $L51; .loc 1 239 7 cvt.u32.u32 %r229,%r23; cvt.s64.s8 %r228,%r229; st.u64 [%stack+8],%r228; st.u64 [%stack],%r22; cvta.const.u64 %r227_gfortran_runtime_error1: .loc 1 244 9 add.u64 %r429,%r135,%r135; add.u64 %r430,%r429,%r135; shl.b64 %r234,%r430,3; add.u64 %r235,%r207,%r234; ld.u64 %r238,[%r235+56]; add.u64 %r237,%r238,1; .loc 1 244 7 ld.u64 %r245,[%r235+48]; sub.u64 %r137,%r237,%r245; .loc 1 245 6 setp.le.s64 %r246,%r137,0; @ %r246 bra $L48; .loc 1 248 9 ld.u64 %r149,[%r209]; .loc 1 250 15 ld.u64 %r34,[%r209+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r247,%r35,-4; and.b32 %r248,%r247,-5; set.u32.eq.u32 %r250,%r248,0; neg.s32 %r251,%r250; .loc 1 252 22 add.u32 %r252,%r35,-1; set.u32.le.u32 %r254,%r252,1; neg.s32 %r255,%r254; .loc 1 252 6 cvt.u16.u32 %r257,%r251; cvt.u16.u32 %r258,%r255; or.b16 %r256,%r257,%r258; cvt.u32.u16 %r259,%r256; cvt.u16.u8 %r260,%r259; setp.ne.u16 %r261,%r260,0; @ %r261 bra $L53; .loc 1 254 7 setp.ne.u32 %r263,%r35,16; @ %r263 bra $L54; $L53: .loc 1 261 9 shl.b64 %r267,%r430,3; add.u64 %r268,%r207,%r267; ld.u64 %r140,[%r268+40]; .loc 1 262 12 add.u64 %r274,%r209,%r267; ld.u64 %r42,[%r274+40]; .loc 1 264 3 setp.ne.u64 %r276,%r135,0; @ %r276 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r277,%r135,%r136; @ %r277 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r27827_gfortran_runtime_error5: add.u64 %r190,%r207,40; add.u64 %r192,%r209,40; add.u64 %r280,%r207,16; add.u64 %r282,%r22,%r22; add.u64 %r283,%r282,%r22; shl.b64 %r284,%r283,3; add.u64 %r203,%r280,%r284; .loc 1 264 3 mov.u64 %r194,0; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 271 12 mov.u64 %r441,%r194; $L60: .loc 1 266 18 add.u64 %r286,%r427,%r194; ld.u64 %r287,[%r190]; st.u64 [%r286],%r287; .loc 1 267 18 add.u64 %r288,%frame,%r194; .loc 1 267 20 ld.u64 %r290,[%r192]; mul.lo.u64 %r289,%r290,%r34; .loc 1 267 18 st.u64 [%r288],%r289; .loc 1 268 19 ld.u64 %r292,[%r190+16]; add.u64 
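(Annotation, not part of the package payload.) The surrounding PTX belongs to the `_gfortran_sum_i1` / `_gfortran_msum_i1` pair: the inner loops load one 1-byte element per iteration, add it into an 8-bit accumulator (`add.u16` followed by `cvt.s32.s8`, so the sum wraps at 8 bits), and in the masked variant first test a mask byte. The sketch below is illustrative only, not the libgfortran source; all identifiers are hypothetical, and it folds the masked and unmasked cases into one helper for brevity.

```c
#include <stddef.h>
#include <stdint.h>

/* Illustrative sketch (hypothetical names) of the sum_i1 inner loop above:
   an int8 accumulator that wraps on overflow, with an optional per-element
   mask byte as in the msum_i1 variant.                                    */
static int8_t sum_i1_step(const int8_t *src, size_t stride, size_t n,
                          const int8_t *mask, size_t mask_stride)
{
  int8_t acc = 0;
  for (size_t i = 0; i < n; i++)
    if (mask == NULL || mask[i * mask_stride])      /* ld.s8 + setp.eq test */
      acc = (int8_t) (acc + src[i * stride]);       /* add.u16 + cvt.s32.s8 */
  return acc;
}
```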
%r291,%r292,1; ld.u64 %r293,[%r190+8]; sub.u64 %r53,%r291,%r293; .loc 1 270 10 setp.lt.s64 %r294,%r53,0; @ %r294 bra $L58; .loc 1 268 17 add.u64 %r296,%r426,%r194; st.u64 [%r296],%r53; bra $L59; $L58: .loc 1 271 12 add.u64 %r298,%r426,%r194; st.u64 [%r298],%r441; $L59: .loc 1 264 3 add.u64 %r190,%r190,24; add.u64 %r192,%r192,24; add.u64 %r194,%r194,8; setp.ne.u64 %r300,%r190,%r203; @ %r300 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r301,[%r206]; setp.eq.u64 %r302,%r301,0; @ ! %r302 bra $L63; bra $L62; $L56: add.u64 %r304,%r22,%r22; add.u64 %r305,%r304,%r22; shl.b64 %r306,%r305,3; add.u64 %r155,%r306,40; add.u64 %r58,%r207,%r155; add.u64 %r157,%r209,%r155; shl.b64 %r307,%r22,3; add.u64 %r103,%r307,-8; cvt.u32.u32 %r309,%r23; cvt.s64.s8 %r308,%r309; shl.b64 %r310,%r308,3; add.u64 %r187,%r310,-8; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 281 12 mov.u64 %r440,0; $L66: .loc 1 276 18 add.u64 %r312,%r427,%r103; ld.u64 %r313,[%r58]; st.u64 [%r312],%r313; .loc 1 277 18 add.u64 %r314,%frame,%r103; .loc 1 277 20 ld.u64 %r316,[%r157]; mul.lo.u64 %r315,%r316,%r34; .loc 1 277 18 st.u64 [%r314],%r315; .loc 1 278 19 ld.u64 %r318,[%r58+16]; add.u64 %r317,%r318,1; ld.u64 %r319,[%r58+8]; sub.u64 %r64,%r317,%r319; .loc 1 280 10 setp.lt.s64 %r320,%r64,0; @ %r320 bra $L64; .loc 1 278 17 add.u64 %r322,%r426,%r103; st.u64 [%r322],%r64; bra $L65; $L64: .loc 1 281 12 add.u64 %r324,%r426,%r103; st.u64 [%r324],%r440; $L65: .loc 1 274 3 add.u64 %r58,%r58,24; add.u64 %r157,%r157,24; add.u64 %r103,%r103,8; setp.ne.u64 %r326,%r103,%r187; @ %r326 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r327,%r136,0; @ %r327 bra $L68; add.u64 %r426,%frame,360; $L86: add.u64 %r127,%r206,40; mov.u64 %r121,%r426; cvt.u32.u32 %r329,%r23; cvt.s64.s8 %r328,%r329; add.u64 %r331,%r328,%r328; add.u64 %r332,%r331,%r328; shl.b64 %r333,%r332,3; add.u64 %r334,%r206,16; add.u64 %r72,%r333,%r334; .loc 1 291 10 mov.u64 %r123,1; .loc 1 295 4 mov.u64 %r353,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r335,%r23,-2; cvt.s64.s32 %r70,%r335; add.u64 %r337,%r70,%r70; add.u64 %r338,%r337,%r70; shl.b64 %r339,%r338,3; add.u64 %r340,%r206,%r339; .loc 1 299 67 shl.b64 %r342,%r70,3; add.u64 %r343,%frame,%r342; .loc 1 299 59 ld.u64 %r345,[%r340+40]; ld.u64 %r346,[%r343+360]; mul.lo.u64 %r143,%r345,%r346; .loc 1 301 24 mov.u64 %r347,0; st.u64 [%r206+8],%r347; .loc 1 302 28 cvt.u16.u32 %r350,%r23; add.u16 %r349,%r350,-1; cvt.u32.u16 %r351,%r349; st.u8 [%r206+28],%r351; .loc 1 304 10 setp.eq.u64 %r352,%r143,0; @ %r352 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r123,%r67,%r123; $L69: .loc 1 295 4 st.u64 [%r127+8],%r353; ld.u64 %r67,[%r121]; add.u64 %r354,%r67,-1; st.u64 [%r127+16],%r354; st.u64 [%r127],%r123; .loc 1 288 7 add.u64 %r127,%r127,24; add.u64 %r121,%r121,8; setp.ne.u64 %r355,%r72,%r127; @ %r355 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r206+48],%r143; mov.u64 %r357,-1; st.u64 [%r206+56],%r357; mov.u64 %r358,1; st.u64 [%r206+40],%r358; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r360360; call (%value_in),_gfortrani_xmallocarray61,[%value_in]; } .loc 1 311 22 st.u64 [%r206],%r361; bra $L73; $L63: .loc 1 316 19 ld.s8 %r363,[%r206+28]; .loc 1 316 10 setp.eq.u64 %r364,%r363,%r136; @ %r364 bra $L74; .loc 1 317 2 cvta.const.u64 %r365365_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r367,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r368,[%r367+36]; setp.eq.u32 %r369,%r368,0; @ %r369 bra $L73; .loc 1 321 4 add.u64 %r426,%frame,360; cvta.const.u64 %r373,$LC2; cvta.const.u64 %r372,$LC3; {372373; call 
[PTX assembly, damaged in extraction: remainder of the sum_i1.o module. The recoverable content is the tail of _gfortran_msum_i1 (the masked reduction loops annotated with .loc lines 328 through 385 of the generated source, including calls to _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents) and most of _gfortran_ssum_i1, the variant that takes a scalar MASK argument, with its zero-fill path and calls to _gfortrani_xmallocarray and _gfortran_runtime_error. The register declarations and several operands in this span are corrupted, so it is summarized here rather than reproduced verbatim.]
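As context for the _gfortran_ssum_i1 body summarized above: judging from the argument test at the top of that function (a null check on the fourth parameter followed by a load and compare against zero), the scalar-MASK entry point dispatches to the ordinary SUM when the mask is absent or true, and produces an all-zero result otherwise. The following is a minimal, self-contained C sketch of that dispatch only; the names sum1 and ssum1, the flat-array shape, and the widened return type are invented for illustration and are not libgfortran's internal API.

    #include <stddef.h>
    #include <stdio.h>

    typedef signed char elem_t;                 /* integer(kind=1) element */

    /* Sum n elements read with a given stride. */
    static long sum1(const elem_t *a, size_t n, size_t stride)
    {
        long s = 0;
        for (size_t i = 0; i < n; i++)
            s += a[i * stride];
        return s;
    }

    /* Scalar-mask wrapper: absent or true mask -> normal SUM,
       present-and-false mask -> the result is simply zero. */
    static long ssum1(const elem_t *a, size_t n, size_t stride, const int *mask)
    {
        if (mask == NULL || *mask)
            return sum1(a, n, stride);
        return 0;
    }

    int main(void)
    {
        elem_t a[5] = { 1, 2, 3, 4, 5 };
        int f = 0;
        printf("%ld %ld\n", ssum1(a, 5, 1, NULL), ssum1(a, 5, 1, &f)); /* prints: 15 0 */
        return 0;
    }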
[PTX assembly, damaged in extraction: the end of _gfortran_ssum_i1, followed by the sum_i2.o module generated from fortran/generated/sum_i2.c. Recoverable declarations: .visible .func entries for _gfortran_sum_i2, _gfortran_msum_i2 and _gfortran_ssum_i2, plus .extern references to _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents. The embedded string constants decode to the usual SUM diagnostics, e.g. "... incorrect in SUM intrinsic: is %ld, should be between 1 and %l...", "rank of return array incorrect in SUM intrinsic: is %ld, should be %l...", "Funny sized logical array", "MASK argume...", and "Incorrect extent in return value of SUM intrinsic in dimension %ld: is %ld, should be %...". The bodies of the three functions follow, but their register declarations and call sequences are too damaged to reconstruct verbatim.]
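The _gfortran_sum_i2 body summarized above is dominated by a pair of nested loops (its .loc annotations reference source lines roughly 148 through 169 of the generated file): an outer loop that walks the elements of the result array via per-dimension strides, and an inner loop that accumulates one line of the source array along the reduced dimension. Below is a minimal C sketch of that reduction shape; the name sum_along_dim, the explicit stride parameters, and the widened accumulator are invented for this sketch and are not the library's code.

    #include <stddef.h>
    #include <stdio.h>

    typedef short elem_t;                       /* integer(kind=2) element */

    /* dst[i] = sum over j of src[i*keep_stride + j*red_stride] */
    static void sum_along_dim(elem_t *dst, const elem_t *src,
                              size_t nkeep,   size_t keep_stride,
                              size_t nreduce, size_t red_stride)
    {
        for (size_t i = 0; i < nkeep; i++) {
            long acc = 0;                       /* accumulator, widened here for the sketch */
            const elem_t *p = src + i * keep_stride;
            for (size_t j = 0; j < nreduce; j++)
                acc += p[j * red_stride];
            dst[i] = (elem_t) acc;
        }
    }

    int main(void)
    {
        /* 2x3 array, row-major: rows {1,2,3} and {4,5,6} */
        elem_t a[6] = { 1, 2, 3, 4, 5, 6 };
        elem_t r[2];
        sum_along_dim(r, a, 2, 3, 3, 1);        /* reduce along each row */
        printf("%d %d\n", r[0], r[1]);          /* prints: 6 15 */
        return 0;
    }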
%r292 bra $L122; bra $L147; $L109: .loc 1 534 1 ret; } sum_i4.o/_gfortran_sum_i4 .visible .func _gfortran_sum_fortran/generated/sum_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_msum_i4 .visible .func _gfortran_msum_i4gfortran_ssum_i4 .visible .func _gfortran_ssum_i4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10] = {83,85,77114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,85,77,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_sum_i4 .visible .func _gfortran_sum_480u64 %r52; .reg .u64 %r58; .reg .u64 %r61; .reg .u649pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64u64predpredu64u64predmov.u64 %r179,%ar0; mov.u64 %r180,%ar1; mov.u64 %r181,%ar2; .loc 1 55 10 ld.s8 %r22,[%r180+28]; .loc 1 55 38 add.u32 %r182,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r182; .loc 1 56 10 ld.u64 %r25,[%r181]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r184,%r104,63; set.u32.lt.s64 %r186,%r103,%r104; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 58 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 60 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r104,%r104; add.u64 %r201,%r200,%r104; shl.b64 %r202,%r201,3; add.u64 %r203,%r180,%r202; ld.u64 %r30,[%r203+56]; ld.u64 %r32,[%r203+48]; .loc 1 68 9 ld.u64 %r107,[%r203+40]; .loc 1 70 3 setp.ne.u64 %r217,%r104,0; @ %r217 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r218,%r103,%r104; @ %r218 bra $L4; bra $L46; $L3: add.u64 %r61,%r180,40; add.u64 %r40,%frame,120; add.u64 %r91,%frame,240; add.u64 %r219,%r180,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r174,%r219,%r223; .loc 1 76 12 mov.u64 %r357,0; $L8: .loc 1 72 18 ld.u64 %r224,[%r61]; st.u64 [%r40],%r224; .loc 1 73 19 ld.u64 %r226,[%r61+16]; add.u64 %r225,%r226,1; ld.u64 %r227,[%r61+8]; sub.u64 %r37,%r225,%r227; .loc 1 75 10 setp.lt.s64 %r228,%r37,0; @ %r228 bra $L6; .loc 1 73 17 st.u64 [%r91],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r91],%r357; $L7: .loc 1 70 3 add.u64 %r61,%r61,24; add.u64 %r40,%r40,8; add.u64 
%r91,%r91,8; setp.ne.u64 %r230,%r61,%r174; @ %r230 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r231,[%r179]; setp.eq.u64 %r232,%r231,0; @ ! %r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r143,%r180,%r237; shl.b64 %r238,%r25,3; add.u64 %r125,%r238,-8; add.u64 %r344,%frame,120; add.u64 %r128,%r344,%r125; add.u64 %r345,%frame,240; add.u64 %r121,%r345,%r125; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r180,40; add.u64 %r95,%r246,%r247; .loc 1 84 12 mov.u64 %r356,0; $L14: .loc 1 80 18 ld.u64 %r248,[%r143]; st.u64 [%r128],%r248; .loc 1 81 19 ld.u64 %r250,[%r143+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r143+8]; sub.u64 %r44,%r249,%r251; .loc 1 83 10 setp.lt.s64 %r252,%r44,0; @ %r252 bra $L12; .loc 1 81 17 st.u64 [%r121],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r121],%r356; $L13: .loc 1 78 3 add.u64 %r143,%r143,24; add.u64 %r128,%r128,8; add.u64 %r121,%r121,8; setp.ne.u64 %r254,%r95,%r143; @ %r254 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r255,%r103,0; @ %r255 bra $L16; add.u64 %r345,%frame,240; $L34: add.u64 %r152,%r179,40; mov.u64 %r150,%r345; add.u64 %r256,%r179,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r144,%r256,%r262; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r263,0; st.u64 [%r179+8],%r263; .loc 1 103 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r179+28],%r267; .loc 1 105 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r52,%r268; add.u64 %r270,%r52,%r52; add.u64 %r271,%r270,%r52; shl.b64 %r272,%r271,3; add.u64 %r273,%r179,%r272; .loc 1 105 67 shl.b64 %r275,%r52,3; add.u64 %r276,%frame,%r275; .loc 1 105 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r109,%r278,%r279; .loc 1 107 29281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 107 27 st.u64 [%r179],%r282; .loc 1 108 10 setp.eq.u64 %r284,%r109,0; @ ! 
%r284 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r48,%r99; $L17: .loc 1 98 4 st.u64 [%r152+8],%r285; ld.u64 %r48,[%r150]; add.u64 %r286,%r48,-1; st.u64 [%r152+16],%r286; st.u64 [%r152],%r99; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r287,%r144,%r152; @ %r287 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r179+48],%r109; mov.u64 %r289,-1; st.u64 [%r179+56],%r289; mov.u64 %r290,1; st.u64 [%r179+40],%r290; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r179+28]; .loc 1 118 10 setp.eq.u64 %r291,%r58,%r103; @ %r291 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r294,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r295,[%r294+36]; setp.eq.u32 %r296,%r295,0; @ %r296 bra $L21; .loc 1 125 2 add.u64 %r345,%frame,240; cvta.const.u64 %r300,$LC2r345299300; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r302,%r103,0; @ %r302 bra $L24; $L27: .loc 1 65 9 add.u64 %r303,%r30,1; .loc 1 65 7 sub.u64 %r105,%r303,%r32; max.s64 %r45,%r105,0; .loc 1 137 8 ld.u64 %r116,[%r180]; .loc 1 138 8 ld.u64 %r117,[%r179]; .loc 1 154 36 shl.b64 %r66,%r107,2; ld.u64 %r138,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 shl.b64 %r69,%r68,2; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 shl.b64 %r72,%r71,2; .loc 1 169 32 ld.u64 %r123,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r304,%r68,%r123; .loc 1 176 9 shl.b64 %r305,%r304,2; neg.s64 %r127,%r305; .loc 1 177 23 mul.lo.u64 %r306,%r71,%r123; .loc 1 177 9 shl.b64 %r307,%r306,2; neg.s64 %r130,%r307; setp.gt.s64 %r346,%r105,0; setp.le.s64 %r347,%r103,1; cvt.u32.u32 %r349,%r22; cvt.s64.s8 %r350,%r349; add.u64 %r351,%r350,-1; add.u64 %r352,%frame,120; add.u64 %r353,%frame,240; .loc 1 173 13 mov.u64 %r354,0; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r179,40; mov.u64 %r162,%frame; cvt.u32.u32 %r309,%r22; cvt.s64.s8 %r308,%r309; add.u64 %r153,%r308,-1; .loc 1 129 3 mov.u64 %r118,0; add.u64 %r345,%frame,240; .loc 1 131 16 mov.u64 %r310,%r118; $L26: st.u64 [%r165],%r310; .loc 1 132 18 ld.u64 %r311,[%r164]; st.u64 [%r162],%r311; .loc 1 133 17 shl.b64 %r313,%r118,3; add.u64 %r314,%r345,%r313; .loc 1 133 10 ld.u64 %r315,[%r314]; setp.le.s64 %r316,%r315,0; @ %r316 bra $L1; .loc 1 129 26 add.u64 %r118,%r118,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r317,%r118,%r153; @ %r317 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r346 bra $L35; .loc 1 150 10 mov.u32 %r319,0; st.u32 [%r117],%r319; bra $L29; $L35: mov.u64 %r112,%r116; .loc 1 148 10 mov.u32 %r110,0; .loc 1 154 13 mov.u64 %r111,0; $L28: .loc 1 158 10 ld.u32 %r320,[%r112]; add.u32 %r110,%r110,%r320; .loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r66; .loc 1 154 6 setp.gt.s64 %r321,%r45,%r111; @ %r321 bra $L28; .loc 1 161 12 st.u32 [%r117],%r110; $L29: .loc 1 165 15 add.u64 %r138,%r138,1; .loc 1 166 12 add.u64 %r116,%r116,%r69; .loc 1 167 12 add.u64 %r117,%r117,%r72; .loc 1 169 13 setp.ne.u64 %r322,%r138,%r123; @ %r322 bra $L25; .loc 1 176 9 add.u64 %r113,%r116,%r127; .loc 1 177 9 add.u64 %r114,%r117,%r130; .loc 1 179 7 @ %r347 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r115,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r122],%r354; .loc 1 176 23 mul.lo.u64 %r328,%r85,%r84; .loc 1 176 9 shl.b64 %r329,%r328,2; sub.u64 %r113,%r116,%r329; .loc 1 177 23 mul.lo.u64 %r330,%r88,%r84; .loc 1 177 9 shl.b64 %r331,%r330,2; sub.u64 %r114,%r117,%r331; .loc 1 178 
5 add.u64 %r115,%r115,1; .loc 1 179 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r332,%r115,%r351; @ %r332 bra $L1; $L32: .loc 1 187 16 ld.u64 %r333,[%r122]; add.u64 %r84,%r333,1; st.u64 [%r122],%r84; .loc 1 188 23 add.u64 %r335,%r352,%r140; ld.u64 %r85,[%r335]; .loc 1 188 13 shl.b64 %r336,%r85,2; add.u64 %r116,%r113,%r336; .loc 1 189 23 add.u64 %r337,%frame,%r140; ld.u64 %r88,[%r337]; .loc 1 189 13 shl.b64 %r338,%r88,2; add.u64 %r117,%r114,%r338; .loc 1 169 32 add.u64 %r340,%r353,%r140; ld.u64 %r92,[%r340]; .loc 1 169 13 setp.eq.u64 %r341,%r84,%r92; @ %r341 bra $L33; .loc 1 173 13 mov.u64 %r138,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r342,[%r179]; setp.eq.u64 %r343,%r342,0; @ ! %r343 bra $L11; bra $L34; $L1: .loc 1 19_gfortran_msum_i4 .visible .func _gfortran_msum_608u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .predpred %r399; .reg .u32 %r400; .reg .pred %r401; .reg .predmov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 223 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $L49gfortran_sum_.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r211]; .loc 1 233 7 add.u64 %r137,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r210+28]; .loc 1 234 38 add.u32 %r217,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r138,%r217; .loc 1 237 7 shr.u64 %r219,%r137,63; set.u32.gt.s64 %r221,%r137,%r138; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 237 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L51; .loc 1 239 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r429,%r137,%r137; add.u64 %r430,%r429,%r137; shl.b64 %r237,%r430,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 244 7 ld.u64 %r248,[%r238+48]; sub.u64 %r139,%r240,%r248; .loc 1 245 6 setp.le.s64 %r249,%r139,0; @ %r249 bra $L48; .loc 1 248 9 ld.u64 %r152,[%r212]; .loc 1 250 15 ld.u64 %r34,[%r212+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 252 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 252 6 cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L53; .loc 1 254 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L54; $L53: .loc 1 261 9 shl.b64 %r270,%r430,3; add.u64 %r271,%r210,%r270; ld.u64 %r142,[%r271+40]; .loc 1 262 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 264 3 setp.ne.u64 %r279,%r137,0; @ %r279 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r280,%r137,%r138; @ %r280 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r281281_gfortran_runtime_error5: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; add.u64 %r285,%r22,%r22; add.u64 %r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 264 3 mov.u64 %r92,0; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 271 12 mov.u64 %r441,%r92; $L60: .loc 1 266 18 add.u64 %r289,%r427,%r92; ld.u64 
%r290,[%r40]; st.u64 [%r289],%r290; .loc 1 267 18 add.u64 %r291,%frame,%r92; .loc 1 267 20 ld.u64 %r293,[%r179]; mul.lo.u64 %r292,%r293,%r34; .loc 1 267 18 st.u64 [%r291],%r292; .loc 1 268 19 ld.u64 %r295,[%r40+16]; add.u64 %r294,%r295,1; ld.u64 %r296,[%r40+8]; sub.u64 %r54,%r294,%r296; .loc 1 270 10 setp.lt.s64 %r297,%r54,0; @ %r297 bra $L58; .loc 1 268 17 add.u64 %r299,%r426,%r92; st.u64 [%r299],%r54; bra $L59; $L58: .loc 1 271 12 add.u64 %r301,%r426,%r92; st.u64 [%r301],%r441; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r179,%r179,24; add.u64 %r92,%r92,8; setp.ne.u64 %r303,%r40,%r206; @ %r303 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r304,[%r209]; setp.eq.u64 %r305,%r304,0; @ ! %r305 bra $L63; bra $L62; $L56: add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r127,%r309,40; add.u64 %r133,%r210,%r127; add.u64 %r125,%r212,%r127; shl.b64 %r310,%r22,3; add.u64 %r103,%r310,-8; cvt.u32.u32 %r312,%r23; cvt.s64.s8 %r311,%r312; shl.b64 %r313,%r311,3; add.u64 %r69,%r313,-8; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 281 12 mov.u64 %r440,0; $L66: .loc 1 276 18 add.u64 %r315,%r427,%r103; ld.u64 %r316,[%r133]; st.u64 [%r315],%r316; .loc 1 277 18 add.u64 %r317,%frame,%r103; .loc 1 277 20 ld.u64 %r319,[%r125]; mul.lo.u64 %r318,%r319,%r34; .loc 1 277 18 st.u64 [%r317],%r318; .loc 1 278 19 ld.u64 %r321,[%r133+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r133+8]; sub.u64 %r66,%r320,%r322; .loc 1 280 10 setp.lt.s64 %r323,%r66,0; @ %r323 bra $L64; .loc 1 278 17 add.u64 %r325,%r426,%r103; st.u64 [%r325],%r66; bra $L65; $L64: .loc 1 281 12 add.u64 %r327,%r426,%r103; st.u64 [%r327],%r440; $L65: .loc 1 274 3 add.u64 %r133,%r133,24; add.u64 %r125,%r125,24; add.u64 %r103,%r103,8; setp.ne.u64 %r329,%r69,%r103; @ %r329 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r330,%r138,0; @ %r330 bra $L68; add.u64 %r426,%frame,360; $L86: add.u64 %r172,%r209,40; mov.u64 %r164,%r426; cvt.u32.u32 %r332,%r23; cvt.s64.s8 %r331,%r332; add.u64 %r334,%r331,%r331; add.u64 %r335,%r334,%r331; shl.b64 %r336,%r335,3; add.u64 %r337,%r209,16; add.u64 %r134,%r336,%r337; .loc 1 291 10 mov.u64 %r129,1; .loc 1 295 4 mov.u64 %r356,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r338,%r23,-2; cvt.s64.s32 %r75,%r338; add.u64 %r340,%r75,%r75; add.u64 %r341,%r340,%r75; shl.b64 %r342,%r341,3; add.u64 %r343,%r209,%r342; .loc 1 299 67 shl.b64 %r345,%r75,3; add.u64 %r346,%frame,%r345; .loc 1 299 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r146,%r348,%r349; .loc 1 301 24 mov.u64 %r350,0; st.u64 [%r209+8],%r350; .loc 1 302 28 cvt.u16.u32 %r353,%r23; add.u16 %r352,%r353,-1; cvt.u32.u16 %r354,%r352; st.u8 [%r209+28],%r354; .loc 1 304 10 setp.eq.u64 %r355,%r146,0; @ %r355 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r129,%r72,%r129; $L69: .loc 1 295 4 st.u64 [%r172+8],%r356; ld.u64 %r72,[%r164]; add.u64 %r357,%r72,-1; st.u64 [%r172+16],%r357; st.u64 [%r172],%r129; .loc 1 288 7 add.u64 %r172,%r172,24; add.u64 %r164,%r164,8; setp.ne.u64 %r358,%r134,%r172; @ %r358 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r209+48],%r146; mov.u64 %r360,-1; st.u64 [%r209+56],%r360; mov.u64 %r361,1; st.u64 [%r209+40],%r361; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24363; call (%value_in),_gfortrani_xmallocarray64,[%value_in]; } .loc 1 311 22 st.u64 [%r209],%r364; bra $L73; $L63: .loc 1 316 19 ld.s8 %r366,[%r209+28]; .loc 1 316 10 setp.eq.u64 %r367,%r366,%r138; @ %r367 bra $L74; .loc 1 317 2 cvta.const.u64 %r36836_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 
%r370,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L73; .loc 1 321 4 add.u64 %r426,%frame,360; cvta.const.u64 %r376,$LC2r42637376; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r382,%r138,0; @ %r382 bra $L75; $L78: .loc 1 336 8 ld.u64 %r153,[%r209]; .loc 1 337 8 ld.u64 %r151,[%r210]; .loc 1 339 9 setp.ne.u64 %r383,%r151,0; @ %r383 bra $L76; bra $L48; $L75: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r426,%frame,360; .loc 1 330 16 mov.u64 %r386,%r157; $L77: st.u64 [%r190],%r386; .loc 1 331 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 332 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r426,%r389; .loc 1 332 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 shl.b64 %r91,%r142,2; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r94,[%frame+240]; .loc 1 359 12 shl.b64 %r95,%r94,2; .loc 1 360 23 ld.u64 %r97,[%frame]; .loc 1 361 22 ld.u64 %r99,[%frame+120]; .loc 1 361 12 shl.b64 %r100,%r99,2; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r94; .loc 1 371 24 mul.lo.u64 %r394,%r85,%r97; .loc 1 371 10 neg.s64 %r163,%r394; .loc 1 372 23 mul.lo.u64 %r395,%r85,%r99; .loc 1 372 9 shl.b64 %r396,%r395,2; neg.s64 %r165,%r396; setp.le.s64 %r431,%r138,1; cvt.u32.u32 %r433,%r23; cvt.s64.s8 %r434,%r433; add.u64 %r435,%r434,-1; add.u64 %r436,%frame,240; add.u64 %r437,%frame,360; add.u64 %r438,%frame,120; .loc 1 367 13 mov.u64 %r439,0; $L87: .loc 1 328 3 mov.u64 %r156,%r152; mov.u64 %r155,%r151; .loc 1 348 10 mov.u32 %r130,0; .loc 1 349 9 mov.u64 %r154,0; $L80: .loc 1 352 6 ld.s8 %r398,[%r156]; cvt.u16.u32 %r397,%r398; setp.eq.u16 %r399,%r397,0; @ %r399 bra $L79; .loc 1 353 12 ld.u32 %r400,[%r155]; add.u32 %r130,%r130,%r400; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r91; .loc 1 349 47 add.u64 %r156,%r156,%r45; .loc 1 349 2 setp.ne.u64 %r401,%r139,%r154; @ %r401 bra $L80; .loc 1 355 8 st.u32 [%r153],%r130; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r151,%r151,%r95; .loc 1 360 13 add.u64 %r152,%r152,%r97; .loc 1 361 12 add.u64 %r153,%r153,%r100; .loc 1 363 13 setp.ne.u64 %r402,%r85,%r39; @ %r402 bra $L87; .loc 1 371 10 add.u64 %r148,%r152,%r163; .loc 1 372 9 add.u64 %r149,%r153,%r165; .loc 1 374 7 @ ! 
[Extraction-damaged PTX, end of libgfortran's sum_i4.o. Recoverable content: the tail of the masked SUM for integer(4) (_gfortran_msum_i4) — the counter/stride update loops around labels $L84, $L85, $L87 and $L108 that step the result, source and mask base pointers between slices — followed by the scalar-masked entry point _gfortran_ssum_i4. Its visible control flow: when the mask pointer is null or points to a true value it calls the full reduction (the callee name is truncated to "_gfortran_sum_" in the dump) and returns; otherwise it validates DIM and the return rank, reports errors through _gfortran_runtime_error, allocates the return data with _gfortrani_xmallocarray when needed, runs the bounds checks gated by _gfortrani_compile_options, and fills the result with zeros.]
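The dispatch logic of the scalar-masked entry point can be read directly from the branches above. The following is only a minimal C sketch of that logic under simplified, illustrative types; the real libgfortran descriptors and the _gfortran_sum_i4 prototype are not reproduced here.

#include <stdio.h>

/* Illustrative stand-ins only; not the real libgfortran array descriptor. */
typedef struct { long *base; long nelems; } vec_t;

/* Placeholder for the full reduction (_gfortran_sum_i4 in the PTX above). */
static void sum_full(vec_t *ret, const vec_t *array)
{
    long s = 0;
    for (long i = 0; i < array->nelems; i++)
        s += array->base[i];
    ret->base[0] = s;
}

/* Sketch of the ssum dispatch seen in the PTX: a null or true scalar
   mask falls through to the full sum; a false mask zero-fills the result. */
static void ssum_sketch(vec_t *ret, const vec_t *array, const int *mask)
{
    if (mask == NULL || *mask) {
        sum_full(ret, array);
        return;
    }
    for (long i = 0; i < ret->nelems; i++)
        ret->base[i] = 0;
}

int main(void)
{
    long data[4] = {1, 2, 3, 4}, out[1] = {0};
    vec_t a = { data, 4 }, r = { out, 1 };
    int f = 0;
    ssum_sketch(&r, &a, NULL); printf("mask absent: %ld\n", out[0]);
    ssum_sketch(&r, &a, &f);   printf("mask false:  %ld\n", out[0]);
    return 0;
}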
[_gfortran_ssum_i4 returns at source line 534, closing sum_i4.o. Next object: sum_i8.o (from libgfortran/generated/sum_i8.c), opening with global declarations for _gfortran_sum_i8, _gfortran_msum_i8 and _gfortran_ssum_i8, extern declarations for _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents, and .const byte arrays holding the runtime error strings. The decodable fragments read "… incorrect in SUM intrinsic: is %ld, should be between 1 and %ld", "rank of return array incorrect in SUM intrinsic: is %ld, should be %ld", "SUM…", "return valu…", "Funny sized logical array", "MASK argume…" and "Incorrect extent in return value of SUM intrinsic in dimension %ld: is %ld, should be %…" (the leading words of several strings are lost in the dump). The body of _gfortran_sum_i8 follows: DIM/rank validation, per-dimension extent and stride computation for the source, construction of the reduced-rank return descriptor (a running product of the kept extents forms the strides), allocation of its data via _gfortrani_xmallocarray, and the runtime bounds checks when they are enabled.]
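The stride-product loop and the stores of bound/stride triples visible above correspond to building the reduced-rank return descriptor before allocating its data. The following is a rough, self-contained C sketch of that setup under simplified, illustrative types; the real GFC descriptor layout and the _gfortrani_xmallocarray signature differ, and calloc stands in for the allocator only so the sketch compiles on its own.

#include <stdlib.h>

/* Illustrative descriptor; not the real libgfortran GFC descriptor. */
typedef struct { long lbound, ubound, stride; } dim_t;
typedef struct { void *base; int rank; dim_t dim[7]; } desc_t;

/* Sketch of the return-array setup: the result drops the reduced
   dimension, each stride is the running product of the kept extents,
   and the data block is sized from the product of all kept extents. */
void setup_return(desc_t *ret, const long *extent, int out_rank,
                  size_t elem_size)
{
    long str = 1;
    for (int n = 0; n < out_rank; n++) {
        ret->dim[n].lbound = 0;
        ret->dim[n].ubound = extent[n] - 1;
        ret->dim[n].stride = str;
        str *= extent[n];          /* cumulative product, as in the PTX loop */
    }
    ret->rank = out_rank;
    /* The PTX calls _gfortrani_xmallocarray(nelems, size); calloc is a
       stand-in here purely to keep the sketch self-contained. */
    ret->base = calloc((size_t) (str > 0 ? str : 1), elem_size);
}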
[_gfortran_sum_i8, continued: the reduction itself — an outer walk over the result elements and, at label $L28, the inner loop that accumulates len source elements spaced by the reduced dimension's stride, followed by the counter/stride update loops and the return. The fragment then runs into the start of the masked variant _gfortran_msum_i8 (register declarations and prologue).]
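The accumulation loop at $L28 (load, add, advance by the stride, repeat) is the core of the generated SUM kernel. A minimal C sketch of that inner loop, with hypothetical names and the stride expressed in elements rather than the byte offsets used in the PTX:

/* Sketch of the strided reduction visible at $L28: sum `len` elements of
   `src` spaced `delta` apart, where `delta` is the stride of the reduced
   dimension. */
long long reduce_dim(const long long *src, long len, long delta)
{
    long long result = 0;
    for (long n = 0; n < len; n++) {
        result += *src;
        src += delta;
    }
    return result;
}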
[_gfortran_msum_i8, the masked SUM for integer(8): it falls back to the unmasked _gfortran_sum_i8 when no mask descriptor is passed, validates DIM, checks that the mask kind is 1, 2, 4, 8 or 16 (anything else reports "Funny sized logical array" through _gfortran_runtime_error), scales the mask strides by the kind, and builds or validates the return array (allocation through _gfortrani_xmallocarray; rank and extent checks through _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents when bounds checking is enabled in _gfortrani_compile_options). The masked accumulation then runs: the inner loop at $L80 loads one mask byte per element and adds the corresponding source element only when that byte is non-zero, while the outer loops at $L84/$L85/$L87 advance the per-dimension counters and base pointers. The fragment ends where _gfortran_ssum_i8 begins.]
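The masked inner loop above tests the low byte of each mask element and only accumulates when it is non-zero. A small C sketch of that loop with hypothetical names; the step arguments play the role of the (already kind-scaled) strides in the PTX:

/* Sketch of the msum inner loop: `mbase` walks the logical mask with its
   own stride (only the low byte of each mask element is tested, matching
   the ld.s8 in the PTX), `src` walks the data with the array stride. */
long long reduce_dim_masked(const long long *src, long src_step,
                            const signed char *mbase, long mask_step,
                            long len)
{
    long long result = 0;
    for (long n = 0; n < len; n++) {
        if (*mbase)            /* ld.s8 + branch in the PTX */
            result += *src;
        src += src_step;
        mbase += mask_step;
    }
    return result;
}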
[_gfortran_ssum_i8: the scalar-masked wrapper, structurally identical to _gfortran_ssum_i4 — call the full _gfortran_sum_i8 when the mask pointer is null or true, otherwise build the return descriptor, allocate it if necessary, run the optional bounds checks and zero-fill the result; it returns at source line 534, closing sum_i8.o. Next object: sum_i16.o (from libgfortran/generated/sum_i16.c), with the same set of function declarations (_gfortran_sum_i16, _gfortran_msum_i16, _gfortran_ssum_i16), extern references and error strings, followed by the start of _gfortran_sum_i16: argument validation, extent/stride setup and return-array allocation as in the 8-byte variant.]
[_gfortran_sum_i16, continued: the same descriptor setup and reduction structure as the i4/i8 kinds, but with 16-byte elements — strides are shifted left by 4 instead of 3, and the accumulation at $L28 is carried out in two 64-bit words with an explicit carry (add.u64 of the low halves, set.u32.lt.u64 to detect wrap-around, carry folded into the high-half sum). The function returns and the fragment runs into the start of _gfortran_msum_i16.]
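Since no native 128-bit integer add is used here, the integer(16) accumulation is split into two 64-bit words with an explicit carry, as the add.u64 / set.u32.lt.u64 / sub.u64 sequence above shows. A short C sketch of the same idiom with unsigned 64-bit halves (names are illustrative):

#include <stdint.h>

/* Two-word accumulator mirroring the PTX carry idiom for integer(16):
   add the low 64-bit halves, detect wrap-around with an unsigned compare,
   and fold the carry into the high half. */
typedef struct { uint64_t lo; uint64_t hi; } u128_t;

void add_u128(u128_t *acc, u128_t x)
{
    uint64_t lo = acc->lo + x.lo;
    uint64_t carry = (lo < acc->lo);   /* set.u32.lt.u64 in the PTX */
    acc->hi = acc->hi + x.hi + carry;
    acc->lo = lo;
}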
[_gfortran_msum_i16: the masked SUM for integer(16), with the same null-mask fallback to _gfortran_sum_i16, mask-kind validation, bounds checks and allocation as _gfortran_msum_i8; the masked inner loop at $L80 uses the same two-word carry accumulation as the unmasked i16 reduction. The fragment ends where _gfortran_ssum_i16 begins.]
[_gfortran_ssum_i16: the scalar-masked dispatch to _gfortran_sum_i16 or a zero-fill of the 16-byte result elements (both 64-bit words of each element are stored), returning at source line 534 and closing sum_i16.o. Next object: sum_r4.o (from libgfortran/generated/sum_r4.c), with declarations for _gfortran_sum_r4, _gfortran_msum_r4 and _gfortran_ssum_r4, the same extern references and error strings, and the opening of _gfortran_sum_r4, the real(4) reduction; the PTX for this function is truncated at the end of this fragment.]
.loc 1 73 17 st.u64 [%r91],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r91],%r357; $L7: .loc 1 70 3 add.u64 %r61,%r61,24; add.u64 %r40,%r40,8; add.u64 %r91,%r91,8; setp.ne.u64 %r230,%r61,%r174; @ %r230 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r231,[%r179]; setp.eq.u64 %r232,%r231,0; @ ! %r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r143,%r180,%r237; shl.b64 %r238,%r25,3; add.u64 %r125,%r238,-8; add.u64 %r344,%frame,120; add.u64 %r128,%r344,%r125; add.u64 %r345,%frame,240; add.u64 %r121,%r345,%r125; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r180,40; add.u64 %r95,%r246,%r247; .loc 1 84 12 mov.u64 %r356,0; $L14: .loc 1 80 18 ld.u64 %r248,[%r143]; st.u64 [%r128],%r248; .loc 1 81 19 ld.u64 %r250,[%r143+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r143+8]; sub.u64 %r44,%r249,%r251; .loc 1 83 10 setp.lt.s64 %r252,%r44,0; @ %r252 bra $L12; .loc 1 81 17 st.u64 [%r121],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r121],%r356; $L13: .loc 1 78 3 add.u64 %r143,%r143,24; add.u64 %r128,%r128,8; add.u64 %r121,%r121,8; setp.ne.u64 %r254,%r95,%r143; @ %r254 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r255,%r103,0; @ %r255 bra $L16; add.u64 %r345,%frame,240; $L34: add.u64 %r152,%r179,40; mov.u64 %r150,%r345; add.u64 %r256,%r179,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r144,%r256,%r262; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r263,0; st.u64 [%r179+8],%r263; .loc 1 103 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r179+28],%r267; .loc 1 105 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r52,%r268; add.u64 %r270,%r52,%r52; add.u64 %r271,%r270,%r52; shl.b64 %r272,%r271,3; add.u64 %r273,%r179,%r272; .loc 1 105 67 shl.b64 %r275,%r52,3; add.u64 %r276,%frame,%r275; .loc 1 105 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r109,%r278,%r279; .loc 1 107 29281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 107 27 st.u64 [%r179],%r282; .loc 1 108 10 setp.eq.u64 %r284,%r109,0; @ ! 
%r284 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r48,%r99; $L17: .loc 1 98 4 st.u64 [%r152+8],%r285; ld.u64 %r48,[%r150]; add.u64 %r286,%r48,-1; st.u64 [%r152+16],%r286; st.u64 [%r152],%r99; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r287,%r144,%r152; @ %r287 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r179+48],%r109; mov.u64 %r289,-1; st.u64 [%r179+56],%r289; mov.u64 %r290,1; st.u64 [%r179+40],%r290; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r179+28]; .loc 1 118 10 setp.eq.u64 %r291,%r58,%r103; @ %r291 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r294,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r295,[%r294+36]; setp.eq.u32 %r296,%r295,0; @ %r296 bra $L21; .loc 1 125 2 add.u64 %r345,%frame,240; cvta.const.u64 %r300,$LC2r345299300; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r302,%r103,0; @ %r302 bra $L24; $L27: .loc 1 65 9 add.u64 %r303,%r30,1; .loc 1 65 7 sub.u64 %r105,%r303,%r32; max.s64 %r45,%r105,0; .loc 1 137 8 ld.u64 %r116,[%r180]; .loc 1 138 8 ld.u64 %r117,[%r179]; .loc 1 154 36 shl.b64 %r66,%r107,2; ld.u64 %r138,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 shl.b64 %r69,%r68,2; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 shl.b64 %r72,%r71,2; .loc 1 169 32 ld.u64 %r123,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r304,%r68,%r123; .loc 1 176 9 shl.b64 %r305,%r304,2; neg.s64 %r127,%r305; .loc 1 177 23 mul.lo.u64 %r306,%r71,%r123; .loc 1 177 9 shl.b64 %r307,%r306,2; neg.s64 %r130,%r307; setp.gt.s64 %r346,%r105,0; setp.le.s64 %r347,%r103,1; cvt.u32.u32 %r349,%r22; cvt.s64.s8 %r350,%r349; add.u64 %r351,%r350,-1; add.u64 %r352,%frame,120; add.u64 %r353,%frame,240; .loc 1 173 13 mov.u64 %r354,0; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r179,40; mov.u64 %r162,%frame; cvt.u32.u32 %r309,%r22; cvt.s64.s8 %r308,%r309; add.u64 %r153,%r308,-1; .loc 1 129 3 mov.u64 %r118,0; add.u64 %r345,%frame,240; .loc 1 131 16 mov.u64 %r310,%r118; $L26: st.u64 [%r165],%r310; .loc 1 132 18 ld.u64 %r311,[%r164]; st.u64 [%r162],%r311; .loc 1 133 17 shl.b64 %r313,%r118,3; add.u64 %r314,%r345,%r313; .loc 1 133 10 ld.u64 %r315,[%r314]; setp.le.s64 %r316,%r315,0; @ %r316 bra $L1; .loc 1 129 26 add.u64 %r118,%r118,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r317,%r118,%r153; @ %r317 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r346 bra $L35; .loc 1 150 10 mov.f32 %r319,0f00000000; st.f32 [%r117],%r319; bra $L29; $L35: mov.u64 %r112,%r116; .loc 1 148 10 mov.f32 %r110,0f00000000; .loc 1 154 13 mov.u64 %r111,0; $L28: .loc 1 158 10 ld.f32 %r320,[%r112]; add.f32 %r110,%r110,%r320; .loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r66; .loc 1 154 6 setp.gt.s64 %r321,%r45,%r111; @ %r321 bra $L28; .loc 1 161 12 st.f32 [%r117],%r110; $L29: .loc 1 165 15 add.u64 %r138,%r138,1; .loc 1 166 12 add.u64 %r116,%r116,%r69; .loc 1 167 12 add.u64 %r117,%r117,%r72; .loc 1 169 13 setp.ne.u64 %r322,%r138,%r123; @ %r322 bra $L25; .loc 1 176 9 add.u64 %r113,%r116,%r127; .loc 1 177 9 add.u64 %r114,%r117,%r130; .loc 1 179 7 @ %r347 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r115,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r122],%r354; .loc 1 176 23 mul.lo.u64 %r328,%r85,%r84; .loc 1 176 9 shl.b64 %r329,%r328,2; sub.u64 %r113,%r116,%r329; .loc 1 177 23 mul.lo.u64 %r330,%r88,%r84; .loc 1 177 9 shl.b64 %r331,%r330,2; sub.u64 
%r114,%r117,%r331; .loc 1 178 5 add.u64 %r115,%r115,1; .loc 1 179 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r332,%r115,%r351; @ %r332 bra $L1; $L32: .loc 1 187 16 ld.u64 %r333,[%r122]; add.u64 %r84,%r333,1; st.u64 [%r122],%r84; .loc 1 188 23 add.u64 %r335,%r352,%r140; ld.u64 %r85,[%r335]; .loc 1 188 13 shl.b64 %r336,%r85,2; add.u64 %r116,%r113,%r336; .loc 1 189 23 add.u64 %r337,%frame,%r140; ld.u64 %r88,[%r337]; .loc 1 189 13 shl.b64 %r338,%r88,2; add.u64 %r117,%r114,%r338; .loc 1 169 32 add.u64 %r340,%r353,%r140; ld.u64 %r92,[%r340]; .loc 1 169 13 setp.eq.u64 %r341,%r84,%r92; @ %r341 bra $L33; .loc 1 173 13 mov.u64 %r138,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r342,[%r179]; setp.eq.u64 %r343,%r342,0; @ ! %r343 bra $L11; bra $L34; $L1: .loc 1 19_gfortran_msum_r4 .visible .func _gfortran_msum_r608u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .predpred %r399; .reg .f32 %r400; .reg .pred %r401; .reg .predmov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 223 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $L49gfortran_sum_r4.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r211]; .loc 1 233 7 add.u64 %r137,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r210+28]; .loc 1 234 38 add.u32 %r217,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r138,%r217; .loc 1 237 7 shr.u64 %r219,%r137,63; set.u32.gt.s64 %r221,%r137,%r138; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 237 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L51; .loc 1 239 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r429,%r137,%r137; add.u64 %r430,%r429,%r137; shl.b64 %r237,%r430,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 244 7 ld.u64 %r248,[%r238+48]; sub.u64 %r139,%r240,%r248; .loc 1 245 6 setp.le.s64 %r249,%r139,0; @ %r249 bra $L48; .loc 1 248 9 ld.u64 %r152,[%r212]; .loc 1 250 15 ld.u64 %r34,[%r212+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 252 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 252 6 cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L53; .loc 1 254 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L54; $L53: .loc 1 261 9 shl.b64 %r270,%r430,3; add.u64 %r271,%r210,%r270; ld.u64 %r142,[%r271+40]; .loc 1 262 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 264 3 setp.ne.u64 %r279,%r137,0; @ %r279 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r280,%r137,%r138; @ %r280 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r281281_gfortran_runtime_error5: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; add.u64 %r285,%r22,%r22; add.u64 %r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 264 3 mov.u64 %r92,0; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 271 12 mov.u64 %r441,%r92; $L60: .loc 1 266 18 
add.u64 %r289,%r427,%r92; ld.u64 %r290,[%r40]; st.u64 [%r289],%r290; .loc 1 267 18 add.u64 %r291,%frame,%r92; .loc 1 267 20 ld.u64 %r293,[%r179]; mul.lo.u64 %r292,%r293,%r34; .loc 1 267 18 st.u64 [%r291],%r292; .loc 1 268 19 ld.u64 %r295,[%r40+16]; add.u64 %r294,%r295,1; ld.u64 %r296,[%r40+8]; sub.u64 %r54,%r294,%r296; .loc 1 270 10 setp.lt.s64 %r297,%r54,0; @ %r297 bra $L58; .loc 1 268 17 add.u64 %r299,%r426,%r92; st.u64 [%r299],%r54; bra $L59; $L58: .loc 1 271 12 add.u64 %r301,%r426,%r92; st.u64 [%r301],%r441; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r179,%r179,24; add.u64 %r92,%r92,8; setp.ne.u64 %r303,%r40,%r206; @ %r303 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r304,[%r209]; setp.eq.u64 %r305,%r304,0; @ ! %r305 bra $L63; bra $L62; $L56: add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r127,%r309,40; add.u64 %r133,%r210,%r127; add.u64 %r125,%r212,%r127; shl.b64 %r310,%r22,3; add.u64 %r103,%r310,-8; cvt.u32.u32 %r312,%r23; cvt.s64.s8 %r311,%r312; shl.b64 %r313,%r311,3; add.u64 %r69,%r313,-8; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 281 12 mov.u64 %r440,0; $L66: .loc 1 276 18 add.u64 %r315,%r427,%r103; ld.u64 %r316,[%r133]; st.u64 [%r315],%r316; .loc 1 277 18 add.u64 %r317,%frame,%r103; .loc 1 277 20 ld.u64 %r319,[%r125]; mul.lo.u64 %r318,%r319,%r34; .loc 1 277 18 st.u64 [%r317],%r318; .loc 1 278 19 ld.u64 %r321,[%r133+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r133+8]; sub.u64 %r66,%r320,%r322; .loc 1 280 10 setp.lt.s64 %r323,%r66,0; @ %r323 bra $L64; .loc 1 278 17 add.u64 %r325,%r426,%r103; st.u64 [%r325],%r66; bra $L65; $L64: .loc 1 281 12 add.u64 %r327,%r426,%r103; st.u64 [%r327],%r440; $L65: .loc 1 274 3 add.u64 %r133,%r133,24; add.u64 %r125,%r125,24; add.u64 %r103,%r103,8; setp.ne.u64 %r329,%r69,%r103; @ %r329 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r330,%r138,0; @ %r330 bra $L68; add.u64 %r426,%frame,360; $L86: add.u64 %r172,%r209,40; mov.u64 %r164,%r426; cvt.u32.u32 %r332,%r23; cvt.s64.s8 %r331,%r332; add.u64 %r334,%r331,%r331; add.u64 %r335,%r334,%r331; shl.b64 %r336,%r335,3; add.u64 %r337,%r209,16; add.u64 %r134,%r336,%r337; .loc 1 291 10 mov.u64 %r129,1; .loc 1 295 4 mov.u64 %r356,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r338,%r23,-2; cvt.s64.s32 %r75,%r338; add.u64 %r340,%r75,%r75; add.u64 %r341,%r340,%r75; shl.b64 %r342,%r341,3; add.u64 %r343,%r209,%r342; .loc 1 299 67 shl.b64 %r345,%r75,3; add.u64 %r346,%frame,%r345; .loc 1 299 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r146,%r348,%r349; .loc 1 301 24 mov.u64 %r350,0; st.u64 [%r209+8],%r350; .loc 1 302 28 cvt.u16.u32 %r353,%r23; add.u16 %r352,%r353,-1; cvt.u32.u16 %r354,%r352; st.u8 [%r209+28],%r354; .loc 1 304 10 setp.eq.u64 %r355,%r146,0; @ %r355 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r129,%r72,%r129; $L69: .loc 1 295 4 st.u64 [%r172+8],%r356; ld.u64 %r72,[%r164]; add.u64 %r357,%r72,-1; st.u64 [%r172+16],%r357; st.u64 [%r172],%r129; .loc 1 288 7 add.u64 %r172,%r172,24; add.u64 %r164,%r164,8; setp.ne.u64 %r358,%r134,%r172; @ %r358 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r209+48],%r146; mov.u64 %r360,-1; st.u64 [%r209+56],%r360; mov.u64 %r361,1; st.u64 [%r209+40],%r361; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24363; call (%value_in),_gfortrani_xmallocarray64,[%value_in]; } .loc 1 311 22 st.u64 [%r209],%r364; bra $L73; $L63: .loc 1 316 19 ld.s8 %r366,[%r209+28]; .loc 1 316 10 setp.eq.u64 %r367,%r366,%r138; @ %r367 bra $L74; .loc 1 317 2 cvta.const.u64 %r36836_gfortran_runtime_error74: .loc 1 319 11 
cvta.global.u64 %r370,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L73; .loc 1 321 4 add.u64 %r426,%frame,360; cvta.const.u64 %r376,$LC2r42637376; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r382,%r138,0; @ %r382 bra $L75; $L78: .loc 1 336 8 ld.u64 %r153,[%r209]; .loc 1 337 8 ld.u64 %r151,[%r210]; .loc 1 339 9 setp.ne.u64 %r383,%r151,0; @ %r383 bra $L76; bra $L48; $L75: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r426,%frame,360; .loc 1 330 16 mov.u64 %r386,%r157; $L77: st.u64 [%r190],%r386; .loc 1 331 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 332 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r426,%r389; .loc 1 332 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 shl.b64 %r91,%r142,2; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r94,[%frame+240]; .loc 1 359 12 shl.b64 %r95,%r94,2; .loc 1 360 23 ld.u64 %r97,[%frame]; .loc 1 361 22 ld.u64 %r99,[%frame+120]; .loc 1 361 12 shl.b64 %r100,%r99,2; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r94; .loc 1 371 24 mul.lo.u64 %r394,%r85,%r97; .loc 1 371 10 neg.s64 %r163,%r394; .loc 1 372 23 mul.lo.u64 %r395,%r85,%r99; .loc 1 372 9 shl.b64 %r396,%r395,2; neg.s64 %r165,%r396; setp.le.s64 %r431,%r138,1; cvt.u32.u32 %r433,%r23; cvt.s64.s8 %r434,%r433; add.u64 %r435,%r434,-1; add.u64 %r436,%frame,240; add.u64 %r437,%frame,360; add.u64 %r438,%frame,120; .loc 1 367 13 mov.u64 %r439,0; $L87: .loc 1 328 3 mov.u64 %r156,%r152; mov.u64 %r155,%r151; .loc 1 348 10 mov.f32 %r130,0f00000000; .loc 1 349 9 mov.u64 %r154,0; $L80: .loc 1 352 6 ld.s8 %r398,[%r156]; cvt.u16.u32 %r397,%r398; setp.eq.u16 %r399,%r397,0; @ %r399 bra $L79; .loc 1 353 12 ld.f32 %r400,[%r155]; add.f32 %r130,%r130,%r400; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r91; .loc 1 349 47 add.u64 %r156,%r156,%r45; .loc 1 349 2 setp.ne.u64 %r401,%r139,%r154; @ %r401 bra $L80; .loc 1 355 8 st.f32 [%r153],%r130; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r151,%r151,%r95; .loc 1 360 13 add.u64 %r152,%r152,%r97; .loc 1 361 12 add.u64 %r153,%r153,%r100; .loc 1 363 13 setp.ne.u64 %r402,%r85,%r39; @ %r402 bra $L87; .loc 1 371 10 add.u64 %r148,%r152,%r163; .loc 1 372 9 add.u64 %r149,%r153,%r165; .loc 1 374 7 @ ! 
%r431 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r439; .loc 1 370 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 371 24 mul.lo.u64 %r405,%r116,%r113; .loc 1 371 10 sub.u64 %r148,%r152,%r405; .loc 1 372 23 mul.lo.u64 %r406,%r118,%r113; .loc 1 372 9 shl.b64 %r407,%r406,2; sub.u64 %r149,%r153,%r407; .loc 1 373 5 add.u64 %r150,%r150,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r166,%r166,8; setp.ne.u64 %r408,%r150,%r435; @ %r408 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 370 23 mov.u64 %r102,%r38; .loc 1 374 7 mov.u64 %r166,8; .loc 1 373 5 mov.u64 %r150,1; $L84: .loc 1 382 16 ld.u64 %r412,[%r41]; add.u64 %r113,%r412,1; st.u64 [%r41],%r113; .loc 1 383 23 add.u64 %r414,%r436,%r166; ld.u64 %r114,[%r414]; .loc 1 383 13 sub.u64 %r415,%r114,%r102; shl.b64 %r416,%r415,2; add.u64 %r151,%r151,%r416; .loc 1 384 24 add.u64 %r417,%frame,%r166; ld.u64 %r116,[%r417]; .loc 1 384 14 add.u64 %r152,%r148,%r116; .loc 1 385 23 add.u64 %r419,%r438,%r166; ld.u64 %r118,[%r419]; .loc 1 385 13 shl.b64 %r420,%r118,2; add.u64 %r153,%r149,%r420; .loc 1 363 32 add.u64 %r422,%r437,%r166; ld.u64 %r122,[%r422]; .loc 1 363 13 setp.eq.u64 %r423,%r113,%r122; @ %r423 bra $L85; .loc 1 367 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r424,[%r209]; setp.eq.u64 %r425,%r424,0; @ ! %r425 bra $L63; bra $L86; $L48:_gfortran_ssum_r4 .visible .func _gfortran_ssum_r68u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64 %r276; .reg .predpredu64mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_sum_r4.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r294,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r294; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r294; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r300,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r300; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r293,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r293,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r299,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r299; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r294,0; @ %r211 bra $L127; add.u64 %r293,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r293; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 2496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r294,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r294; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r294; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r294,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: .loc 1 504 16 
cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r294,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r294,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,2; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r292,%r294,1; .loc 1 512 13 mov.f32 %r278,0f00000000; cvt.u32.u32 %r296,%r24; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; $L146: st.f32 [%r98],%r278; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r279,%r59,%r80; @ %r279 bra $L146; .loc 1 525 7 @ ! %r292 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r295; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r282,%r97,%r298; @ %r282 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; .loc 1 520 13 mov.u64 %r295,0; $L144: .loc 1 529 16 ld.u64 %r287,[%r99]; add.u64 %r74,%r287,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r288,%r75,%r72; shl.b64 %r289,%r288,2; add.u64 %r98,%r98,%r289; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r290,%r74,%r77; @ %r290 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r291,%r98,0; @ ! 
%r291 bra $L122; bra $L147; $L109: .loc 1 534 1 ret; } sum_r8.o/_gfortran_sum_r8 .visible .func _gfortran_sum_r8fortran/generated/sum_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_msum_r8 .visible .func _gfortran_msum_r8gfortran_ssum_r8 .visible .func _gfortran_ssum_r8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10] = {83,85,77114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,85,77,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_sum_r8 .visible .func _gfortran_sum_r8480u64 %r52; .reg .u64 %r58; .reg .u64 %r61; .reg .u649pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64u64pred %r317; .reg .f64 %r319; .reg .f64pred %r332; .reg .u64 %r333; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u64 %r338; .reg .u64 %r340; .reg .predmov.u64 %r179,%ar0; mov.u64 %r180,%ar1; mov.u64 %r181,%ar2; .loc 1 55 10 ld.s8 %r22,[%r180+28]; .loc 1 55 38 add.u32 %r182,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r182; .loc 1 56 10 ld.u64 %r25,[%r181]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r184,%r104,63; set.u32.lt.s64 %r186,%r103,%r104; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 58 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 60 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r104,%r104; add.u64 %r201,%r200,%r104; shl.b64 %r202,%r201,3; add.u64 %r203,%r180,%r202; ld.u64 %r30,[%r203+56]; ld.u64 %r32,[%r203+48]; .loc 1 68 9 ld.u64 %r107,[%r203+40]; .loc 1 70 3 setp.ne.u64 %r217,%r104,0; @ %r217 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r218,%r103,%r104; @ %r218 bra $L4; bra $L46; $L3: add.u64 %r61,%r180,40; add.u64 %r40,%frame,120; add.u64 %r91,%frame,240; add.u64 %r219,%r180,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r174,%r219,%r223; .loc 1 76 12 mov.u64 %r357,0; $L8: .loc 1 72 18 ld.u64 %r224,[%r61]; st.u64 [%r40],%r224; .loc 1 73 19 ld.u64 %r226,[%r61+16]; add.u64 %r225,%r226,1; ld.u64 %r227,[%r61+8]; sub.u64 %r37,%r225,%r227; .loc 1 75 10 setp.lt.s64 %r228,%r37,0; @ %r228 bra $L6; 
.loc 1 73 17 st.u64 [%r91],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r91],%r357; $L7: .loc 1 70 3 add.u64 %r61,%r61,24; add.u64 %r40,%r40,8; add.u64 %r91,%r91,8; setp.ne.u64 %r230,%r61,%r174; @ %r230 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r231,[%r179]; setp.eq.u64 %r232,%r231,0; @ ! %r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r143,%r180,%r237; shl.b64 %r238,%r25,3; add.u64 %r125,%r238,-8; add.u64 %r344,%frame,120; add.u64 %r128,%r344,%r125; add.u64 %r345,%frame,240; add.u64 %r121,%r345,%r125; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r180,40; add.u64 %r95,%r246,%r247; .loc 1 84 12 mov.u64 %r356,0; $L14: .loc 1 80 18 ld.u64 %r248,[%r143]; st.u64 [%r128],%r248; .loc 1 81 19 ld.u64 %r250,[%r143+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r143+8]; sub.u64 %r44,%r249,%r251; .loc 1 83 10 setp.lt.s64 %r252,%r44,0; @ %r252 bra $L12; .loc 1 81 17 st.u64 [%r121],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r121],%r356; $L13: .loc 1 78 3 add.u64 %r143,%r143,24; add.u64 %r128,%r128,8; add.u64 %r121,%r121,8; setp.ne.u64 %r254,%r95,%r143; @ %r254 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r255,%r103,0; @ %r255 bra $L16; add.u64 %r345,%frame,240; $L34: add.u64 %r152,%r179,40; mov.u64 %r150,%r345; add.u64 %r256,%r179,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r144,%r256,%r262; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r263,0; st.u64 [%r179+8],%r263; .loc 1 103 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r179+28],%r267; .loc 1 105 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r52,%r268; add.u64 %r270,%r52,%r52; add.u64 %r271,%r270,%r52; shl.b64 %r272,%r271,3; add.u64 %r273,%r179,%r272; .loc 1 105 67 shl.b64 %r275,%r52,3; add.u64 %r276,%frame,%r275; .loc 1 105 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r109,%r278,%r279; .loc 1 107 29 mov.u64 %r281281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 107 27 st.u64 [%r179],%r282; .loc 1 108 10 setp.eq.u64 %r284,%r109,0; @ ! 
%r284 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r48,%r99; $L17: .loc 1 98 4 st.u64 [%r152+8],%r285; ld.u64 %r48,[%r150]; add.u64 %r286,%r48,-1; st.u64 [%r152+16],%r286; st.u64 [%r152],%r99; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r287,%r144,%r152; @ %r287 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r179+48],%r109; mov.u64 %r289,-1; st.u64 [%r179+56],%r289; mov.u64 %r290,1; st.u64 [%r179+40],%r290; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r179+28]; .loc 1 118 10 setp.eq.u64 %r291,%r58,%r103; @ %r291 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r294,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r295,[%r294+36]; setp.eq.u32 %r296,%r295,0; @ %r296 bra $L21; .loc 1 125 2 add.u64 %r345,%frame,240; cvta.const.u64 %r300,$LC2r345299300; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r302,%r103,0; @ %r302 bra $L24; $L27: .loc 1 65 9 add.u64 %r303,%r30,1; .loc 1 65 7 sub.u64 %r105,%r303,%r32; max.s64 %r45,%r105,0; .loc 1 137 8 ld.u64 %r116,[%r180]; .loc 1 138 8 ld.u64 %r117,[%r179]; .loc 1 154 36 shl.b64 %r66,%r107,3; ld.u64 %r138,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 shl.b64 %r69,%r68,3; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 shl.b64 %r72,%r71,3; .loc 1 169 32 ld.u64 %r123,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r304,%r68,%r123; .loc 1 176 9 shl.b64 %r305,%r304,3; neg.s64 %r127,%r305; .loc 1 177 23 mul.lo.u64 %r306,%r71,%r123; .loc 1 177 9 shl.b64 %r307,%r306,3; neg.s64 %r130,%r307; setp.gt.s64 %r346,%r105,0; setp.le.s64 %r347,%r103,1; cvt.u32.u32 %r349,%r22; cvt.s64.s8 %r350,%r349; add.u64 %r351,%r350,-1; add.u64 %r352,%frame,120; add.u64 %r353,%frame,240; .loc 1 173 13 mov.u64 %r354,0; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r179,40; mov.u64 %r162,%frame; cvt.u32.u32 %r309,%r22; cvt.s64.s8 %r308,%r309; add.u64 %r153,%r308,-1; .loc 1 129 3 mov.u64 %r118,0; add.u64 %r345,%frame,240; .loc 1 131 16 mov.u64 %r310,%r118; $L26: st.u64 [%r165],%r310; .loc 1 132 18 ld.u64 %r311,[%r164]; st.u64 [%r162],%r311; .loc 1 133 17 shl.b64 %r313,%r118,3; add.u64 %r314,%r345,%r313; .loc 1 133 10 ld.u64 %r315,[%r314]; setp.le.s64 %r316,%r315,0; @ %r316 bra $L1; .loc 1 129 26 add.u64 %r118,%r118,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r317,%r118,%r153; @ %r317 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r346 bra $L35; .loc 1 150 10 mov.f64 %r319,0d0000000000000000; st.f64 [%r117],%r319; bra $L29; $L35: mov.u64 %r112,%r116; .loc 1 148 10 mov.f64 %r110,0d0000000000000000; .loc 1 154 13 mov.u64 %r111,0; $L28: .loc 1 158 10 ld.f64 %r320,[%r112]; add.f64 %r110,%r110,%r320; .loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r66; .loc 1 154 6 setp.gt.s64 %r321,%r45,%r111; @ %r321 bra $L28; .loc 1 161 12 st.f64 [%r117],%r110; $L29: .loc 1 165 15 add.u64 %r138,%r138,1; .loc 1 166 12 add.u64 %r116,%r116,%r69; .loc 1 167 12 add.u64 %r117,%r117,%r72; .loc 1 169 13 setp.ne.u64 %r322,%r138,%r123; @ %r322 bra $L25; .loc 1 176 9 add.u64 %r113,%r116,%r127; .loc 1 177 9 add.u64 %r114,%r117,%r130; .loc 1 179 7 @ %r347 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r115,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r122],%r354; .loc 1 176 23 mul.lo.u64 %r328,%r85,%r84; .loc 1 176 9 shl.b64 %r329,%r328,3; sub.u64 %r113,%r116,%r329; .loc 1 177 23 mul.lo.u64 %r330,%r88,%r84; .loc 1 177 9 shl.b64 %r331,%r330,3; 
sub.u64 %r114,%r117,%r331; .loc 1 178 5 add.u64 %r115,%r115,1; .loc 1 179 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r332,%r115,%r351; @ %r332 bra $L1; $L32: .loc 1 187 16 ld.u64 %r333,[%r122]; add.u64 %r84,%r333,1; st.u64 [%r122],%r84; .loc 1 188 23 add.u64 %r335,%r352,%r140; ld.u64 %r85,[%r335]; .loc 1 188 13 shl.b64 %r336,%r85,3; add.u64 %r116,%r113,%r336; .loc 1 189 23 add.u64 %r337,%frame,%r140; ld.u64 %r88,[%r337]; .loc 1 189 13 shl.b64 %r338,%r88,3; add.u64 %r117,%r114,%r338; .loc 1 169 32 add.u64 %r340,%r353,%r140; ld.u64 %r92,[%r340]; .loc 1 169 13 setp.eq.u64 %r341,%r84,%r92; @ %r341 bra $L33; .loc 1 173 13 mov.u64 %r138,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r342,[%r179]; setp.eq.u64 %r343,%r342,0; @ ! %r343 bra $L11; bra $L34; $L1: .loc 1 19_gfortran_msum_r8 .visible .func _gfortran_msum_r8608u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .predpred %r399; .reg .f64 %r400; .reg .pred %r401; .reg .predmov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 223 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $L49gfortran_sum_r8.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r211]; .loc 1 233 7 add.u64 %r137,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r210+28]; .loc 1 234 38 add.u32 %r217,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r138,%r217; .loc 1 237 7 shr.u64 %r219,%r137,63; set.u32.gt.s64 %r221,%r137,%r138; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 237 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L51; .loc 1 239 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r429,%r137,%r137; add.u64 %r430,%r429,%r137; shl.b64 %r237,%r430,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 244 7 ld.u64 %r248,[%r238+48]; sub.u64 %r139,%r240,%r248; .loc 1 245 6 setp.le.s64 %r249,%r139,0; @ %r249 bra $L48; .loc 1 248 9 ld.u64 %r152,[%r212]; .loc 1 250 15 ld.u64 %r34,[%r212+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 252 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 252 6 cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L53; .loc 1 254 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L54; $L53: .loc 1 261 9 shl.b64 %r270,%r430,3; add.u64 %r271,%r210,%r270; ld.u64 %r142,[%r271+40]; .loc 1 262 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 264 3 setp.ne.u64 %r279,%r137,0; @ %r279 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r280,%r137,%r138; @ %r280 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r281281_gfortran_runtime_error5: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; add.u64 %r285,%r22,%r22; add.u64 %r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 264 3 mov.u64 %r92,0; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 271 12 mov.u64 %r441,%r92; $L60: .loc 1 
266 18 add.u64 %r289,%r427,%r92; ld.u64 %r290,[%r40]; st.u64 [%r289],%r290; .loc 1 267 18 add.u64 %r291,%frame,%r92; .loc 1 267 20 ld.u64 %r293,[%r179]; mul.lo.u64 %r292,%r293,%r34; .loc 1 267 18 st.u64 [%r291],%r292; .loc 1 268 19 ld.u64 %r295,[%r40+16]; add.u64 %r294,%r295,1; ld.u64 %r296,[%r40+8]; sub.u64 %r54,%r294,%r296; .loc 1 270 10 setp.lt.s64 %r297,%r54,0; @ %r297 bra $L58; .loc 1 268 17 add.u64 %r299,%r426,%r92; st.u64 [%r299],%r54; bra $L59; $L58: .loc 1 271 12 add.u64 %r301,%r426,%r92; st.u64 [%r301],%r441; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r179,%r179,24; add.u64 %r92,%r92,8; setp.ne.u64 %r303,%r40,%r206; @ %r303 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r304,[%r209]; setp.eq.u64 %r305,%r304,0; @ ! %r305 bra $L63; bra $L62; $L56: add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r127,%r309,40; add.u64 %r133,%r210,%r127; add.u64 %r125,%r212,%r127; shl.b64 %r310,%r22,3; add.u64 %r103,%r310,-8; cvt.u32.u32 %r312,%r23; cvt.s64.s8 %r311,%r312; shl.b64 %r313,%r311,3; add.u64 %r69,%r313,-8; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 281 12 mov.u64 %r440,0; $L66: .loc 1 276 18 add.u64 %r315,%r427,%r103; ld.u64 %r316,[%r133]; st.u64 [%r315],%r316; .loc 1 277 18 add.u64 %r317,%frame,%r103; .loc 1 277 20 ld.u64 %r319,[%r125]; mul.lo.u64 %r318,%r319,%r34; .loc 1 277 18 st.u64 [%r317],%r318; .loc 1 278 19 ld.u64 %r321,[%r133+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r133+8]; sub.u64 %r66,%r320,%r322; .loc 1 280 10 setp.lt.s64 %r323,%r66,0; @ %r323 bra $L64; .loc 1 278 17 add.u64 %r325,%r426,%r103; st.u64 [%r325],%r66; bra $L65; $L64: .loc 1 281 12 add.u64 %r327,%r426,%r103; st.u64 [%r327],%r440; $L65: .loc 1 274 3 add.u64 %r133,%r133,24; add.u64 %r125,%r125,24; add.u64 %r103,%r103,8; setp.ne.u64 %r329,%r69,%r103; @ %r329 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r330,%r138,0; @ %r330 bra $L68; add.u64 %r426,%frame,360; $L86: add.u64 %r172,%r209,40; mov.u64 %r164,%r426; cvt.u32.u32 %r332,%r23; cvt.s64.s8 %r331,%r332; add.u64 %r334,%r331,%r331; add.u64 %r335,%r334,%r331; shl.b64 %r336,%r335,3; add.u64 %r337,%r209,16; add.u64 %r134,%r336,%r337; .loc 1 291 10 mov.u64 %r129,1; .loc 1 295 4 mov.u64 %r356,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r338,%r23,-2; cvt.s64.s32 %r75,%r338; add.u64 %r340,%r75,%r75; add.u64 %r341,%r340,%r75; shl.b64 %r342,%r341,3; add.u64 %r343,%r209,%r342; .loc 1 299 67 shl.b64 %r345,%r75,3; add.u64 %r346,%frame,%r345; .loc 1 299 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r146,%r348,%r349; .loc 1 301 24 mov.u64 %r350,0; st.u64 [%r209+8],%r350; .loc 1 302 28 cvt.u16.u32 %r353,%r23; add.u16 %r352,%r353,-1; cvt.u32.u16 %r354,%r352; st.u8 [%r209+28],%r354; .loc 1 304 10 setp.eq.u64 %r355,%r146,0; @ %r355 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r129,%r72,%r129; $L69: .loc 1 295 4 st.u64 [%r172+8],%r356; ld.u64 %r72,[%r164]; add.u64 %r357,%r72,-1; st.u64 [%r172+16],%r357; st.u64 [%r172],%r129; .loc 1 288 7 add.u64 %r172,%r172,24; add.u64 %r164,%r164,8; setp.ne.u64 %r358,%r134,%r172; @ %r358 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r209+48],%r146; mov.u64 %r360,-1; st.u64 [%r209+56],%r360; mov.u64 %r361,1; st.u64 [%r209+40],%r361; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r363363; call (%value_in),_gfortrani_xmallocarray64,[%value_in]; } .loc 1 311 22 st.u64 [%r209],%r364; bra $L73; $L63: .loc 1 316 19 ld.s8 %r366,[%r209+28]; .loc 1 316 10 setp.eq.u64 %r367,%r366,%r138; @ %r367 bra $L74; .loc 1 317 2 cvta.const.u64 
%r36836_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r370,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L73; .loc 1 321 4 add.u64 %r426,%frame,360; cvta.const.u64 %r376,$LC2r42637376; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r382,%r138,0; @ %r382 bra $L75; $L78: .loc 1 336 8 ld.u64 %r153,[%r209]; .loc 1 337 8 ld.u64 %r151,[%r210]; .loc 1 339 9 setp.ne.u64 %r383,%r151,0; @ %r383 bra $L76; bra $L48; $L75: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r426,%frame,360; .loc 1 330 16 mov.u64 %r386,%r157; $L77: st.u64 [%r190],%r386; .loc 1 331 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 332 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r426,%r389; .loc 1 332 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 shl.b64 %r91,%r142,3; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r94,[%frame+240]; .loc 1 359 12 shl.b64 %r95,%r94,3; .loc 1 360 23 ld.u64 %r97,[%frame]; .loc 1 361 22 ld.u64 %r99,[%frame+120]; .loc 1 361 12 shl.b64 %r100,%r99,3; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r94; .loc 1 371 24 mul.lo.u64 %r394,%r85,%r97; .loc 1 371 10 neg.s64 %r163,%r394; .loc 1 372 23 mul.lo.u64 %r395,%r85,%r99; .loc 1 372 9 shl.b64 %r396,%r395,3; neg.s64 %r165,%r396; setp.le.s64 %r431,%r138,1; cvt.u32.u32 %r433,%r23; cvt.s64.s8 %r434,%r433; add.u64 %r435,%r434,-1; add.u64 %r436,%frame,240; add.u64 %r437,%frame,360; add.u64 %r438,%frame,120; .loc 1 367 13 mov.u64 %r439,0; $L87: .loc 1 328 3 mov.u64 %r156,%r152; mov.u64 %r155,%r151; .loc 1 348 10 mov.f64 %r130,0d0000000000000000; .loc 1 349 9 mov.u64 %r154,0; $L80: .loc 1 352 6 ld.s8 %r398,[%r156]; cvt.u16.u32 %r397,%r398; setp.eq.u16 %r399,%r397,0; @ %r399 bra $L79; .loc 1 353 12 ld.f64 %r400,[%r155]; add.f64 %r130,%r130,%r400; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r91; .loc 1 349 47 add.u64 %r156,%r156,%r45; .loc 1 349 2 setp.ne.u64 %r401,%r139,%r154; @ %r401 bra $L80; .loc 1 355 8 st.f64 [%r153],%r130; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r151,%r151,%r95; .loc 1 360 13 add.u64 %r152,%r152,%r97; .loc 1 361 12 add.u64 %r153,%r153,%r100; .loc 1 363 13 setp.ne.u64 %r402,%r85,%r39; @ %r402 bra $L87; .loc 1 371 10 add.u64 %r148,%r152,%r163; .loc 1 372 9 add.u64 %r149,%r153,%r165; .loc 1 374 7 @ ! 
%r431 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r439; .loc 1 370 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 371 24 mul.lo.u64 %r405,%r116,%r113; .loc 1 371 10 sub.u64 %r148,%r152,%r405; .loc 1 372 23 mul.lo.u64 %r406,%r118,%r113; .loc 1 372 9 shl.b64 %r407,%r406,3; sub.u64 %r149,%r153,%r407; .loc 1 373 5 add.u64 %r150,%r150,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r166,%r166,8; setp.ne.u64 %r408,%r150,%r435; @ %r408 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 370 23 mov.u64 %r102,%r38; .loc 1 374 7 mov.u64 %r166,8; .loc 1 373 5 mov.u64 %r150,1; $L84: .loc 1 382 16 ld.u64 %r412,[%r41]; add.u64 %r113,%r412,1; st.u64 [%r41],%r113; .loc 1 383 23 add.u64 %r414,%r436,%r166; ld.u64 %r114,[%r414]; .loc 1 383 13 sub.u64 %r415,%r114,%r102; shl.b64 %r416,%r415,3; add.u64 %r151,%r151,%r416; .loc 1 384 24 add.u64 %r417,%frame,%r166; ld.u64 %r116,[%r417]; .loc 1 384 14 add.u64 %r152,%r148,%r116; .loc 1 385 23 add.u64 %r419,%r438,%r166; ld.u64 %r118,[%r419]; .loc 1 385 13 shl.b64 %r420,%r118,3; add.u64 %r153,%r149,%r420; .loc 1 363 32 add.u64 %r422,%r437,%r166; ld.u64 %r122,[%r422]; .loc 1 363 13 setp.eq.u64 %r423,%r113,%r122; @ %r423 bra $L85; .loc 1 367 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r424,[%r209]; setp.eq.u64 %r425,%r424,0; @ ! %r425 bra $L63; bra $L86; $L48:_gfortran_ssum_r8 .visible .func _gfortran_ssum_r868u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64 %r276; .reg .predpredu64mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_sum_r8.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r294,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r294; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r294; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r300,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r300; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r293,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r293,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r299,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r299; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r294,0; @ %r211 bra $L127; add.u64 %r293,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r293; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r24496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r294,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r294; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r294; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r294,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: 
.loc 1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r294,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r294,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,3; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r292,%r294,1; .loc 1 512 13 mov.f64 %r278,0d0000000000000000; cvt.u32.u32 %r296,%r24; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; $L146: st.f64 [%r98],%r278; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r279,%r59,%r80; @ %r279 bra $L146; .loc 1 525 7 @ ! %r292 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r295; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r282,%r97,%r298; @ %r282 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; .loc 1 520 13 mov.u64 %r295,0; $L144: .loc 1 529 16 ld.u64 %r287,[%r99]; add.u64 %r74,%r287,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r288,%r75,%r72; shl.b64 %r289,%r288,3; add.u64 %r98,%r98,%r289; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r290,%r74,%r77; @ %r290 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r291,%r98,0; @ ! 
%r291 bra $L122; bra $L147; $L109: .loc 1 534 1 ret; } sum_r10.o/ sum_c4.o/_gfortran_sum_c4 .visible .func _gfortran_sum_c4fortran/generated/sum_c4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_msum_c4 .visible .func _gfortran_msum_c4gfortran_ssum_c4 .visible .func _gfortran_ssum_c4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10] = {83,85,77114,101,116,117,114,110,32,118,97,108,117,10132 $LC4[2] = {70,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,9977,65,83,75,32,97,114,103,117,109,101,110,116,0 }99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,85,77,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_sum_c4 .visible .func _gfortran_sum_c4480u64predu64 %r227; .reg .u64 %r228; .reg .u64 %r229; .reg .predpredu32u64 %r250; .reg .u64 %r251; .reg .u64 %r252; .reg .u64 %r253; .reg .predu32u16u64pred %r326; .reg .u64 %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64u64 %r352; .reg .f32 %r353; .reg .f32 %r354; .reg .u32 %r355; .reg .u64 %r356; .reg .u64 %r357; .reg .u64 %r358; .reg .u64 %r359; .reg .u64 %r360; .reg .u64 %r361; mov.u64 %r181,%ar0; mov.u64 %r182,%ar1; mov.u64 %r183,%ar2; .loc 1 55 10 ld.s8 %r22,[%r182+28]; .loc 1 55 38 add.u32 %r184,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r104,%r184; .loc 1 56 10 ld.u64 %r25,[%r183]; .loc 1 56 7 add.u64 %r105,%r25,-1; .loc 1 58 7 shr.u64 %r186,%r105,63; set.u32.lt.s64 %r188,%r104,%r105; neg.s32 %r189,%r188; cvt.u16.u64 %r192,%r186; cvt.u16.u32 %r193,%r189; or.b16 %r191,%r192,%r193; .loc 1 58 6 cvt.u32.u16 %r194,%r191; cvt.u16.u8 %r195,%r194; setp.eq.u16 %r196,%r195,0; @ %r196 bra $L2; .loc 1 60 7 cvt.u32.u32 %r199,%r22; cvt.s64.s8 %r198,%r199; st.u64 [%stack+8],%r19819197_gfortran_runtime_errorr202,%r105,%r105; add.u64 %r203,%r202,%r105; shl.b64 %r204,%r203,3; add.u64 %r205,%r182,%r204; ld.u64 %r30,[%r205+56]; ld.u64 %r32,[%r205+48]; .loc 1 68 9 ld.u64 %r108,[%r205+40]; .loc 1 70 3 setp.ne.u64 %r219,%r105,0; @ %r219 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r220,%r104,%r105; @ %r220 bra $L4; bra $L46; $L3: add.u64 %r91,%r182,40; add.u64 %r60,%frame,120; add.u64 %r47,%frame,240; add.u64 %r221,%r182,16; add.u64 %r223,%r25,%r25; add.u64 %r224,%r223,%r25; shl.b64 %r225,%r224,3; add.u64 %r178,%r221,%r225; .loc 1 76 12 mov.u64 %r361,0; $L8: .loc 1 72 18 ld.u64 %r226,[%r91]; st.u64 [%r60],%r226; .loc 1 73 19 ld.u64 %r228,[%r91+16]; add.u64 %r227,%r228,1; ld.u64 
%r229,[%r91+8]; sub.u64 %r37,%r227,%r229; .loc 1 75 10 setp.lt.s64 %r230,%r37,0; @ %r230 bra $L6; .loc 1 73 17 st.u64 [%r47],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r47],%r361; $L7: .loc 1 70 3 add.u64 %r91,%r91,24; add.u64 %r60,%r60,8; add.u64 %r47,%r47,8; setp.ne.u64 %r232,%r91,%r178; @ %r232 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r233,[%r181]; setp.eq.u64 %r234,%r233,0; @ ! %r234 bra $L11; bra $L10; $L4: add.u64 %r236,%r25,%r25; add.u64 %r237,%r236,%r25238,40; add.u64 %r147,%r182,%r239; shl.b64 %r240,%r25,3; add.u64 %r134,%r240,-8; add.u64 %r349,%frame,120; add.u64 %r142,%r349,%r134; add.u64 %r348,%frame,240; add.u64 %r130,%r348,%r134; cvt.u32.u32 %r244,%r22; cvt.s64.s8 %r243,%r244; add.u64 %r246,%r243,%r243; add.u64 %r247,%r246,%r243; shl.b64 %r248,%r247,3; add.u64 %r249,%r182,40; add.u64 %r95,%r248,%r249; .loc 1 84 12 mov.u64 %r360,0; $L14: .loc 1 80 18 ld.u64 %r250,[%r147]; st.u64 [%r142],%r250; .loc 1 81 19 ld.u64 %r252,[%r147+16]; add.u64 %r251,%r252,1; ld.u64 %r253,[%r147+8]; sub.u64 %r44,%r251,%r253; .loc 1 83 10 setp.lt.s64 %r254,%r44,0; @ %r254 bra $L12; .loc 1 81 17 st.u64 [%r130],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r130],%r360; $L13: .loc 1 78 3 add.u64 %r147,%r147,24; add.u64 %r142,%r142,8; add.u64 %r130,%r130,8; setp.ne.u64 %r256,%r95,%r147; @ %r256 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r257,%r104,0; @ %r257 bra $L16; add.u64 %r348,%frame,240; $L34: add.u64 %r156,%r181,40; mov.u64 %r154,%r348; add.u64 %r258,%r181,16; cvt.u32.u32 %r260,%r22; cvt.s64.s8 %r259,%r260; add.u64 %r262,%r259,%r259; add.u64 %r263,%r262,%r259; shl.b64 %r264,%r263,3; add.u64 %r148,%r258,%r264; .loc 1 94 10 mov.u64 %r96,1; .loc 1 98 4 mov.u64 %r287,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r265,0; st.u64 [%r181+8],%r265; .loc 1 103 28 cvt.u16.u32 %r268,%r22; add.u16 %r267,%r268,-1; cvt.u32.u16 %r269,%r267; st.u8 [%r181+28],%r269; .loc 1 105 20 add.u32 %r270,%r22,-2; cvt.s64.s32 %r53,%r270; add.u64 %r272,%r53,%r53; add.u64 %r273,%r272,%r53; shl.b64 %r274,%r273,3; add.u64 %r275,%r181,%r274; .loc 1 105 67 shl.b64 %r277,%r53,3; add.u64 %r278,%frame,%r277; .loc 1 105 59 ld.u64 %r280,[%r275+40]; ld.u64 %r281,[%r278+240]; mul.lo.u64 %r110,%r280,%r281; .loc 1 107 29 mov.u64 %r28311283; call (%value_in),_gfortrani_xmallocarray284,[%value_in]; } .loc 1 107 27 st.u64 [%r181],%r284; .loc 1 108 10 setp.eq.u64 %r286,%r110,0; @ ! 
%r286 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r96,%r49,%r96; $L17: .loc 1 98 4 st.u64 [%r156+8],%r287; ld.u64 %r49,[%r154]; add.u64 %r288,%r49,-1; st.u64 [%r156+16],%r288; st.u64 [%r156],%r96; .loc 1 91 7 add.u64 %r156,%r156,24; add.u64 %r154,%r154,8; setp.ne.u64 %r289,%r148,%r156; @ %r289 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r181+48],%r110; mov.u64 %r291,-1; st.u64 [%r181+56],%r291; mov.u64 %r292,1; st.u64 [%r181+40],%r292; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r59,[%r181+28]; .loc 1 118 10 setp.eq.u64 %r293,%r59,%r104; @ %r293 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r104; st.u64 [%stack],%r59;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r296,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r297,[%r296+36]; setp.eq.u32 %r298,%r297,0; @ %r298 bra $L21; .loc 1 125 2 add.u64 %r348,%frame,240; cvta.const.u64 %r302,$LC2r3430302; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r304,%r104,0; @ %r304 bra $L24; $L27: .loc 1 65 9 add.u64 %r305,%r30,1; .loc 1 65 7 sub.u64 %r106,%r305,%r32; max.s64 %r45,%r106,0; .loc 1 137 8 ld.u64 %r118,[%r182]; .loc 1 138 8 ld.u64 %r119,[%r181]; .loc 1 154 36 shl.b64 %r66,%r108,3; ld.u64 %r111,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 shl.b64 %r69,%r68,3; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 shl.b64 %r72,%r71,3; .loc 1 169 32 ld.u64 %r126,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r306,%r68,%r126; .loc 1 176 9 shl.b64 %r307,%r306,3; neg.s64 %r129,%r307; .loc 1 177 23 mul.lo.u64 %r308,%r71,%r126; .loc 1 177 9 shl.b64 %r309,%r308,3; neg.s64 %r131,%r309; setp.gt.s64 %r350,%r106,0; setp.le.s64 %r351,%r104,1; .loc 1 150 10 ld.const.f32 %r353,[$LC4]; ld.const.f32 %r354,[$LC4+4]; cvt.u32.u32 %r355,%r22; cvt.s64.s8 %r356,%r355; add.u64 %r357,%r356,-1; add.u64 %r358,%frame,120; add.u64 %r359,%frame,240; bra $L25; $L24: add.u64 %r169,%frame,360; add.u64 %r168,%r181,40; mov.u64 %r166,%frame; cvt.u32.u32 %r311,%r22; cvt.s64.s8 %r310,%r311; add.u64 %r157,%r310,-1; .loc 1 129 3 mov.u64 %r120,0; add.u64 %r348,%frame,240; .loc 1 131 16 mov.u64 %r312,%r120; $L26: st.u64 [%r169],%r312; .loc 1 132 18 ld.u64 %r313,[%r168]; st.u64 [%r166],%r313; .loc 1 133 17 shl.b64 %r315,%r120,3; add.u64 %r316,%r348,%r315; .loc 1 133 10 ld.u64 %r317,[%r316]; setp.le.s64 %r318,%r317,0; @ %r318 bra $L1; .loc 1 129 26 add.u64 %r120,%r120,1; .loc 1 129 3 add.u64 %r169,%r169,8; add.u64 %r168,%r168,24; add.u64 %r166,%r166,8; setp.ne.u64 %r319,%r120,%r157; @ %r319 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r350 bra $L35; .loc 1 150 10 st.f32 [%r119],%r353; st.f32 [%r119+4],%r354; bra $L29; $L35: mov.u64 %r113,%r118; mov.f32 %r40,0f00000000; mov.f32 %r102,%r40; .loc 1 154 13 mov.u64 %r112,0; $L28: .loc 1 158 10 ld.f32 %r323,[%r113]; add.f32 %r102,%r102,%r323; ld.f32 %r324,[%r113+4]; add.f32 %r40,%r40,%r324; .loc 1 154 28 add.u64 %r112,%r112,1; .loc 1 154 36 add.u64 %r113,%r113,%r66; .loc 1 154 6 setp.gt.s64 %r325,%r45,%r112; @ %r325 bra $L28; .loc 1 161 12 st.f32 [%r119],%r102; st.f32 [%r119+4],%r40; $L29: .loc 1 165 15 add.u64 %r111,%r111,1; .loc 1 166 12 add.u64 %r118,%r118,%r69; .loc 1 167 12 add.u64 %r119,%r119,%r72; .loc 1 169 13 setp.ne.u64 %r326,%r111,%r126; @ %r326 bra $L25; .loc 1 176 9 add.u64 %r115,%r118,%r129; .loc 1 177 9 add.u64 %r116,%r119,%r131; .loc 1 179 7 @ %r351 bra $L1; add.u64 %r125,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r117,1; .loc 1 173 13 mov.u64 %r352,0; bra $L32; $L33: st.u64 [%r125],%r352; .loc 1 176 23 mul.lo.u64 %r332,%r85,%r84; .loc 1 176 9 shl.b64 
%r333,%r332,3; sub.u64 %r115,%r118,%r333; .loc 1 177 23 mul.lo.u64 %r334,%r88,%r84; .loc 1 177 9 shl.b64 %r335,%r334,3; sub.u64 %r116,%r119,%r335; .loc 1 178 5 add.u64 %r117,%r117,1; .loc 1 179 7 add.u64 %r125,%r125,8; add.u64 %r140,%r140,8; setp.eq.u64 %r336,%r117,%r357; @ %r336 bra $L1; $L32: .loc 1 187 16 ld.u64 %r337,[%r125]; add.u64 %r84,%r337,1; st.u64 [%r125],%r84; .loc 1 188 23 add.u64 %r339,%r358,%r140; ld.u64 %r85,[%r339]; .loc 1 188 13 shl.b64 %r340,%r85,3; add.u64 %r118,%r115,%r340; .loc 1 189 23 add.u64 %r341,%frame,%r140; ld.u64 %r88,[%r341]; .loc 1 189 13 shl.b64 %r342,%r88,3; add.u64 %r119,%r116,%r342; .loc 1 169 32 add.u64 %r344,%r359,%r140; ld.u64 %r92,[%r344]; .loc 1 169 13 setp.eq.u64 %r345,%r84,%r92; @ %r345 bra $L33; .loc 1 173 13 mov.u64 %r111,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r346,[%r181]; setp.eq.u64 %r347,%r346,0; @ ! %r347 bra $L11; bra $L34; $L1: .loc 1 19_gfortran_msum_c4 .visible .func _gfortran_msum_c60898pred %r215; .reg .u32 %r219; .reg .u64 %r221; .reg .u32 %r223; .reg .u32 %r224; .reg .u16 %r226; .reg .u16 %r227; .reg .u16u16 %r263; .reg .u32pred %r282; .reg .u64 %r283; .reg .u64u64u6464pred %r325; .reg .u64 %r327; .reg .u64 %r329; .reg .pred %r331; .reg .pred %r332; .reg .u64 %r333; .reg .u32 %r334; .reg .u64u64pred %r357; .reg .u64 %r358; .reg .u64 %r359; .reg .predu64 %r392; .reg .u64 %r393; .reg .pred %r394; .reg .predpred %r404; .reg .pred %r405; .reg .u64pred %r426; .reg .u64 %r427; .reg .predpred.reg .u64mov.u64 %r211,%ar0; mov.u64 %r212,%ar1; mov.u64 %r213,%ar2; mov.u64 %r214,%ar3; .loc 1 223 6 setp.ne.u64 %r215,%r214,0; @ %r215 bra $L49gfortran_sum_c4.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r213]; .loc 1 233 7 add.u64 %r136,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r212+28]; .loc 1 234 38 add.u32 %r219,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r137,%r219; .loc 1 237 7 shr.u64 %r221,%r136,63; set.u32.gt.s64 %r223,%r136,%r137; neg.s32 %r224,%r223; cvt.u16.u64 %r227,%r221; cvt.u16.u32 %r228,%r224; or.b16 %r226,%r227,%r228; .loc 1 237 6 cvt.u32.u16 %r229,%r226; cvt.u16.u8 %r230,%r229; setp.eq.u16 %r231,%r230,0; @ %r231 bra $L51; .loc 1 239 7 cvt.u32.u32 %r234,%r23; cvt.s64.s8 %r233,%r234; st.u64 [%stack+8],%r233; st.u64 [%stack],%r22; cvta.const.u64 %r2322stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r432,%r136,%r136; add.u64 %r433,%r432,%r136; shl.b64 %r239,%r433,3; add.u64 %r240,%r212,%r239; ld.u64 %r243,[%r240+56]; add.u64 %r242,%r243,1; .loc 1 244 7 ld.u64 %r250,[%r240+48]; sub.u64 %r138,%r242,%r250; .loc 1 245 6 setp.le.s64 %r251,%r138,0; @ %r251 bra $L48; .loc 1 248 9 ld.u64 %r151,[%r214]; .loc 1 250 15 ld.u64 %r34,[%r214+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r252,%r35,-4; and.b32 %r253,%r252,-5; set.u32.eq.u32 %r255,%r253,0; neg.s32 %r256,%r255; .loc 1 252 22 add.u32 %r257,%r35,-1; set.u32.le.u32 %r259,%r257,1; neg.s32 %r260,%r259; .loc 1 252 6 cvt.u16.u32 %r262,%r256; cvt.u16.u32 %r263,%r260; or.b16 %r261,%r262,%r263; cvt.u32.u16 %r264,%r261; cvt.u16.u8 %r265,%r264; setp.ne.u16 %r266,%r265,0; @ %r266 bra $L53; .loc 1 254 7 setp.ne.u32 %r268,%r35,16; @ %r268 bra $L54; $L53: .loc 1 261 9 shl.b64 %r272,%r433,3; add.u64 %r273,%r212,%r272; ld.u64 %r141,[%r273+40]; .loc 1 262 12 add.u64 %r279,%r214,%r272; ld.u64 %r41,[%r279+40]; .loc 1 264 3 setp.ne.u64 %r281,%r136,0; @ %r281 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r282,%r136,%r137; @ %r282 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r283283_gfortran_runtime_error5: add.u64 %r40,%r212,40; add.u64 %r129,%r214,40; add.u64 %r285,%r212,16; 
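// The $L60 and $L66 loops that follow copy each dimension's stride and extent out of the
// source and MASK descriptors into frame-local arrays (presumably the sstride/mstride/extent
// temporaries of msum_c4.c, cf. the .loc references to lines 264-281), multiplying the mask
// stride by the mask kind and clamping a negative extent to zero before the reduction begins.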
add.u64 %r287,%r22,%r22; add.u64 %r288,%r287,%r22; shl.b64 %r289,%r288,3; add.u64 %r208,%r285,%r289; .loc 1 264 3 mov.u64 %r87,0; add.u64 %r430,%frame,240; add.u64 %r429,%frame,360; .loc 1 271 12 mov.u64 %r444,%r87; $L60: .loc 1 266 18 add.u64 %r291,%r430,%r87; ld.u64 %r292,[%r40]; st.u64 [%r291],%r292; .loc 1 267 18 add.u64 %r293,%frame,%r87; .loc 1 267 20 ld.u64 %r295,[%r129]; mul.lo.u64 %r294,%r295,%r34; .loc 1 267 18 st.u64 [%r293],%r294; .loc 1 268 19 ld.u64 %r297,[%r40+16]; add.u64 %r296,%r297,1; ld.u64 %r298,[%r40+8]; sub.u64 %r52,%r296,%r298; .loc 1 270 10 setp.lt.s64 %r299,%r52,0; @ %r299 bra $L58; .loc 1 268 17 add.u64 %r301,%r429,%r87; st.u64 [%r301],%r52; bra $L59; $L58: .loc 1 271 12 add.u64 %r303,%r429,%r87; st.u64 [%r303],%r444; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r129,%r129,24; add.u64 %r87,%r87,8; setp.ne.u64 %r305,%r40,%r208; @ %r305 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r306,[%r211]; setp.eq.u64 %r307,%r306,0; @ ! %r307 bra $L63; bra $L62; $L56: add.u64 %r309,%r22,%r22; add.u64 %r310,%r309,%r22; shl.b64 %r311,%r310,3; add.u64 %r122,%r311,40; add.u64 %r131,%r212,%r122; add.u64 %r117,%r214,%r122; shl.b64 %r312,%r22,3; add.u64 %r135,%r312,-8; cvt.u32.u32 %r314,%r23; cvt.s64.s8 %r313,%r314; shl.b64 %r315,%r313,3; add.u64 %r66,%r315,-8; add.u64 %r430,%frame,240; add.u64 %r429,%frame,360; .loc 1 281 12 mov.u64 %r443,0; $L66: .loc 1 276 18 add.u64 %r317,%r430,%r135; ld.u64 %r318,[%r131]; st.u64 [%r317],%r318; .loc 1 277 18 add.u64 %r319,%frame,%r135; .loc 1 277 20 ld.u64 %r321,[%r117]; mul.lo.u64 %r320,%r321,%r34; .loc 1 277 18 st.u64 [%r319],%r320; .loc 1 278 19 ld.u64 %r323,[%r131+16]; add.u64 %r322,%r323,1; ld.u64 %r324,[%r131+8]; sub.u64 %r62,%r322,%r324; .loc 1 280 10 setp.lt.s64 %r325,%r62,0; @ %r325 bra $L64; .loc 1 278 17 add.u64 %r327,%r429,%r135; st.u64 [%r327],%r62; bra $L65; $L64: .loc 1 281 12 add.u64 %r329,%r429,%r135; st.u64 [%r329],%r443; $L65: .loc 1 274 3 add.u64 %r131,%r131,24; add.u64 %r117,%r117,24; add.u64 %r135,%r135,8; setp.ne.u64 %r331,%r66,%r135; @ %r331 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r332,%r137,0; @ %r332 bra $L68; add.u64 %r429,%frame,360; $L86: add.u64 %r179,%r211,40; mov.u64 %r171,%r429; cvt.u32.u32 %r334,%r23; cvt.s64.s8 %r333,%r334; add.u64 %r336,%r333,%r333; add.u64 %r337,%r336,%r333; shl.b64 %r338,%r337,3; add.u64 %r339,%r211,16; add.u64 %r132,%r338,%r339; .loc 1 291 10 mov.u64 %r126,1; .loc 1 295 4 mov.u64 %r358,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r340,%r23,-2; cvt.s64.s32 %r71,%r340; add.u64 %r342,%r71,%r71; add.u64 %r343,%r342,%r71; shl.b64 %r344,%r343,3; add.u64 %r345,%r211,%r344; .loc 1 299 67 shl.b64 %r347,%r71,3; add.u64 %r348,%frame,%r347; .loc 1 299 59 ld.u64 %r350,[%r345+40]; ld.u64 %r351,[%r348+360]; mul.lo.u64 %r145,%r350,%r351; .loc 1 301 24 mov.u64 %r352,0; st.u64 [%r211+8],%r352; .loc 1 302 28 cvt.u16.u32 %r355,%r23; add.u16 %r354,%r355,-1; cvt.u32.u16 %r356,%r354; st.u8 [%r211+28],%r356; .loc 1 304 10 setp.eq.u64 %r357,%r145,0; @ %r357 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r126,%r68,%r126; $L69: .loc 1 295 4 st.u64 [%r179+8],%r358; ld.u64 %r68,[%r171]; add.u64 %r359,%r68,-1; st.u64 [%r179+16],%r359; st.u64 [%r179],%r126; .loc 1 288 7 add.u64 %r179,%r179,24; add.u64 %r171,%r171,8; setp.ne.u64 %r360,%r132,%r179; @ %r360 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r211+48],%r145; mov.u64 %r362,-1; st.u64 [%r211+56],%r362; mov.u64 %r363,1; st.u64 [%r211+40],%r363; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r365365; call 
(%value_in),_gfortrani_xmallocarray66,[%value_in]; } .loc 1 311 22 st.u64 [%r211],%r366; bra $L73; $L63: .loc 1 316 19 ld.s8 %r368,[%r211+28]; .loc 1 316 10 setp.eq.u64 %r369,%r368,%r137; @ %r369 bra $L74; .loc 1 317 2 cvta.const.u64 %r370,$LC370_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r372,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r373,[%r372+36]; setp.eq.u32 %r374,%r373,0; @ %r374 bra $L73; .loc 1 321 4 add.u64 %r429,%frame,360; cvta.const.u64 %r378,$LC2377378; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r38221212382378; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r384,%r137,0; @ %r384 bra $L75; $L78: .loc 1 336 8 ld.u64 %r152,[%r211]; .loc 1 337 8 ld.u64 %r150,[%r212]; .loc 1 339 9 setp.ne.u64 %r385,%r150,0; @ %r385 bra $L76; bra $L48; $L75: add.u64 %r192,%frame,480; add.u64 %r191,%r211,40; add.u64 %r189,%frame,120; cvt.u32.u32 %r387,%r23; cvt.s64.s8 %r386,%r387; add.u64 %r180,%r386,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r429,%frame,360; .loc 1 330 16 mov.u64 %r388,%r157; $L77: st.u64 [%r192],%r388; .loc 1 331 18 ld.u64 %r389,[%r191]; st.u64 [%r189],%r389; .loc 1 332 17 shl.b64 %r391,%r157,3; add.u64 %r392,%r429,%r391; .loc 1 332 10 ld.u64 %r393,[%r392]; setp.le.s64 %r394,%r393,0; @ %r394 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r192,%r192,8; add.u64 %r191,%r191,24; add.u64 %r189,%r189,8; setp.ne.u64 %r395,%r157,%r180; @ %r395 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r43,%r41,%r34; .loc 1 349 32 shl.b64 %r86,%r141,3; ld.u64 %r153,[%frame+480]; .loc 1 359 22 ld.u64 %r89,[%frame+240]; .loc 1 359 12 shl.b64 %r90,%r89,3; .loc 1 360 23 ld.u64 %r92,[%frame]; .loc 1 361 22 ld.u64 %r94,[%frame+120]; .loc 1 361 12 shl.b64 %r95,%r94,3; .loc 1 363 32 ld.u64 %r80,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r80,%r89; .loc 1 371 24 mul.lo.u64 %r396,%r80,%r92; .loc 1 371 10 neg.s64 %r162,%r396; .loc 1 372 23 mul.lo.u64 %r397,%r80,%r94; .loc 1 372 9 shl.b64 %r398,%r397,3; neg.s64 %r165,%r398; setp.le.s64 %r434,%r137,1; cvt.u32.u32 %r436,%r23; cvt.s64.s8 %r437,%r436; add.u64 %r438,%r437,-1; add.u64 %r439,%frame,240; add.u64 %r440,%frame,360; add.u64 %r441,%frame,120; .loc 1 367 13 mov.u64 %r442,0; $L87: .loc 1 328 3 mov.u64 %r156,%r151; mov.u64 %r155,%r150; mov.f32 %r118,0f00000000; mov.f32 %r119,%r118; .loc 1 349 9 mov.u64 %r154,0; $L80: .loc 1 352 6 ld.s8 %r400,[%r156]; cvt.u16.u32 %r399,%r400; setp.eq.u16 %r401,%r399,0; @ %r401 bra $L79; .loc 1 353 12 ld.f32 %r402,[%r155]; add.f32 %r119,%r119,%r402; ld.f32 %r403,[%r155+4]; add.f32 %r118,%r118,%r403; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r86; .loc 1 349 47 add.u64 %r156,%r156,%r43; .loc 1 349 2 setp.ne.u64 %r404,%r138,%r154; @ %r404 bra $L80; .loc 1 355 8 st.f32 [%r152],%r119; st.f32 [%r152+4],%r118; .loc 1 358 15 add.u64 %r153,%r153,1; .loc 1 359 12 add.u64 %r150,%r150,%r90; .loc 1 360 13 add.u64 %r151,%r151,%r92; .loc 1 361 12 add.u64 %r152,%r152,%r95; .loc 1 363 13 setp.ne.u64 %r405,%r80,%r153; @ %r405 bra $L87; .loc 1 371 10 add.u64 %r147,%r151,%r162; .loc 1 372 9 add.u64 %r148,%r152,%r165; .loc 1 374 7 @ ! 
%r434 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r128],%r442; .loc 1 370 23 mul.lo.u64 %r98,%r109,%r108; .loc 1 371 24 mul.lo.u64 %r408,%r111,%r108; .loc 1 371 10 sub.u64 %r147,%r151,%r408; .loc 1 372 23 mul.lo.u64 %r409,%r113,%r108; .loc 1 372 9 shl.b64 %r410,%r409,3; sub.u64 %r148,%r152,%r410; .loc 1 373 5 add.u64 %r149,%r149,1; .loc 1 374 7 add.u64 %r128,%r128,8; add.u64 %r178,%r178,8; setp.ne.u64 %r411,%r149,%r438; @ %r411 bra $L84; bra $L48; $L108: add.u64 %r128,%frame,488; .loc 1 370 23 mov.u64 %r98,%r38; .loc 1 374 7 mov.u64 %r178,8; .loc 1 373 5 mov.u64 %r149,1; $L84: .loc 1 382 16 ld.u64 %r415,[%r128]; add.u64 %r108,%r415,1; st.u64 [%r128],%r108; .loc 1 383 23 add.u64 %r417,%r439,%r178; ld.u64 %r109,[%r417]; .loc 1 383 13 sub.u64 %r418,%r109,%r98; shl.b64 %r419,%r418,3; add.u64 %r150,%r150,%r419; .loc 1 384 24 add.u64 %r420,%frame,%r178; ld.u64 %r111,[%r420]; .loc 1 384 14 add.u64 %r151,%r147,%r111; .loc 1 385 23 add.u64 %r422,%r441,%r178; ld.u64 %r113,[%r422]; .loc 1 385 13 shl.b64 %r423,%r113,3; add.u64 %r152,%r148,%r423; .loc 1 363 32 add.u64 %r425,%r440,%r178; ld.u64 %r116,[%r425]; .loc 1 363 13 setp.eq.u64 %r426,%r108,%r116; @ %r426 bra $L85; .loc 1 367 13 mov.u64 %r153,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r427,[%r211]; setp.eq.u64 %r428,%r427,0; @ ! %r428 bra $L63; bra $L86; $L48:_gfortran_ssum_c4 .visible .func _gfortran_ssum_c68u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64 %r276; .reg .pred %r277; .reg .pred %r280; .reg .predpred %r292; .reg .f32 %r293; .reg .f32 %r294; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_sum_c4.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r297,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r297; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r297; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r304,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r304; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r296,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r296,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r303,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r303; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r297,0; @ %r211 bra $L127; add.u64 %r296,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r24496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r297,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r297; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r297,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: 
.loc 1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r297,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r297,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,3; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; ld.const.f32 %r293,[$LC4]; ld.const.f32 %r294,[$LC4+4]; setp.le.s64 %r295,%r297,1; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; .loc 1 520 13 mov.u64 %r302,0; $L146: .loc 1 512 13 st.f32 [%r98],%r293; st.f32 [%r98+4],%r294; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L146; .loc 1 525 7 @ ! %r295 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r302; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r301; @ %r283 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; $L144: .loc 1 529 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r289,%r75,%r72; shl.b64 %r290,%r289,3; add.u64 %r98,%r98,%r290; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r292,%r98,0; @ ! 
%r292 bra $L122; bra $L147; $L109: .loc 1 534 1 ret; } sum_c8.o/_gfortran_sum_c8 .visible .func _gfortran_sum_c8fortran/generated/sum_c8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_msum_c8 .visible .func _gfortran_msum_c8gfortran_ssum_c8 .visible .func _gfortran_ssum_c8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10] = {83,85,77114,101,116,117,114,110,32,118,97,108,117,10164 $LC4[2] = {70,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,83,85,77,32,105,110,116,114,105,110,115,105,9977,65,83,75,32,97,114,103,117,109,101,110,116,0 }99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,85,77,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_sum_c8 .visible .func _gfortran_sum_c8480u64predu64 %r227; .reg .u64 %r228; .reg .u64 %r229; .reg .predpredu32u64 %r250; .reg .u64 %r251; .reg .u64 %r252; .reg .u64 %r253; .reg .predu32u16u64pred %r326; .reg .u64 %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64u64 %r352; .reg .f64 %r353; .reg .f64 %r354; .reg .u32 %r355; .reg .u64 %r356; .reg .u64 %r357; .reg .u64 %r358; .reg .u64 %r359; .reg .u64 %r360; .reg .u64 %r361; mov.u64 %r181,%ar0; mov.u64 %r182,%ar1; mov.u64 %r183,%ar2; .loc 1 55 10 ld.s8 %r22,[%r182+28]; .loc 1 55 38 add.u32 %r184,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r104,%r184; .loc 1 56 10 ld.u64 %r25,[%r183]; .loc 1 56 7 add.u64 %r105,%r25,-1; .loc 1 58 7 shr.u64 %r186,%r105,63; set.u32.lt.s64 %r188,%r104,%r105; neg.s32 %r189,%r188; cvt.u16.u64 %r192,%r186; cvt.u16.u32 %r193,%r189; or.b16 %r191,%r192,%r193; .loc 1 58 6 cvt.u32.u16 %r194,%r191; cvt.u16.u8 %r195,%r194; setp.eq.u16 %r196,%r195,0; @ %r196 bra $L2; .loc 1 60 7 cvt.u32.u32 %r199,%r22; cvt.s64.s8 %r198,%r199; st.u64 [%stack+8],%r19819197_gfortran_runtime_errorr202,%r105,%r105; add.u64 %r203,%r202,%r105; shl.b64 %r204,%r203,3; add.u64 %r205,%r182,%r204; ld.u64 %r30,[%r205+56]; ld.u64 %r32,[%r205+48]; .loc 1 68 9 ld.u64 %r108,[%r205+40]; .loc 1 70 3 setp.ne.u64 %r219,%r105,0; @ %r219 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r220,%r104,%r105; @ %r220 bra $L4; bra $L46; $L3: add.u64 %r91,%r182,40; add.u64 %r60,%frame,120; add.u64 %r47,%frame,240; add.u64 %r221,%r182,16; add.u64 %r223,%r25,%r25; add.u64 %r224,%r223,%r25; shl.b64 %r225,%r224,3; add.u64 %r178,%r221,%r225; .loc 1 76 12 mov.u64 %r361,0; $L8: .loc 1 72 18 ld.u64 %r226,[%r91]; st.u64 [%r60],%r226; .loc 1 73 19 ld.u64 %r228,[%r91+16]; add.u64 %r227,%r228,1; ld.u64 %r229,[%r91+8]; sub.u64 
%r37,%r227,%r229; .loc 1 75 10 setp.lt.s64 %r230,%r37,0; @ %r230 bra $L6; .loc 1 73 17 st.u64 [%r47],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r47],%r361; $L7: .loc 1 70 3 add.u64 %r91,%r91,24; add.u64 %r60,%r60,8; add.u64 %r47,%r47,8; setp.ne.u64 %r232,%r91,%r178; @ %r232 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r233,[%r181]; setp.eq.u64 %r234,%r233,0; @ ! %r234 bra $L11; bra $L10; $L4: add.u64 %r236,%r25,%r25; add.u64 %r237,%r236,%r25238,40; add.u64 %r147,%r182,%r239; shl.b64 %r240,%r25,3; add.u64 %r134,%r240,-8; add.u64 %r349,%frame,120; add.u64 %r142,%r349,%r134; add.u64 %r348,%frame,240; add.u64 %r130,%r348,%r134; cvt.u32.u32 %r244,%r22; cvt.s64.s8 %r243,%r244; add.u64 %r246,%r243,%r243; add.u64 %r247,%r246,%r243; shl.b64 %r248,%r247,3; add.u64 %r249,%r182,40; add.u64 %r95,%r248,%r249; .loc 1 84 12 mov.u64 %r360,0; $L14: .loc 1 80 18 ld.u64 %r250,[%r147]; st.u64 [%r142],%r250; .loc 1 81 19 ld.u64 %r252,[%r147+16]; add.u64 %r251,%r252,1; ld.u64 %r253,[%r147+8]; sub.u64 %r44,%r251,%r253; .loc 1 83 10 setp.lt.s64 %r254,%r44,0; @ %r254 bra $L12; .loc 1 81 17 st.u64 [%r130],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r130],%r360; $L13: .loc 1 78 3 add.u64 %r147,%r147,24; add.u64 %r142,%r142,8; add.u64 %r130,%r130,8; setp.ne.u64 %r256,%r95,%r147; @ %r256 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r257,%r104,0; @ %r257 bra $L16; add.u64 %r348,%frame,240; $L34: add.u64 %r156,%r181,40; mov.u64 %r154,%r348; add.u64 %r258,%r181,16; cvt.u32.u32 %r260,%r22; cvt.s64.s8 %r259,%r260; add.u64 %r262,%r259,%r259; add.u64 %r263,%r262,%r259; shl.b64 %r264,%r263,3; add.u64 %r148,%r258,%r264; .loc 1 94 10 mov.u64 %r96,1; .loc 1 98 4 mov.u64 %r287,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r265,0; st.u64 [%r181+8],%r265; .loc 1 103 28 cvt.u16.u32 %r268,%r22; add.u16 %r267,%r268,-1; cvt.u32.u16 %r269,%r267; st.u8 [%r181+28],%r269; .loc 1 105 20 add.u32 %r270,%r22,-2; cvt.s64.s32 %r53,%r270; add.u64 %r272,%r53,%r53; add.u64 %r273,%r272,%r53; shl.b64 %r274,%r273,3; add.u64 %r275,%r181,%r274; .loc 1 105 67 shl.b64 %r277,%r53,3; add.u64 %r278,%frame,%r277; .loc 1 105 59 ld.u64 %r280,[%r275+40]; ld.u64 %r281,[%r278+240]; mul.lo.u64 %r110,%r280,%r281; .loc 1 107 29 mov.u64 %r283,1607 27 st.u64 [%r181],%r284; .loc 1 108 10 setp.eq.u64 %r286,%r110,0; @ ! 
%r286 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r96,%r49,%r96; $L17: .loc 1 98 4 st.u64 [%r156+8],%r287; ld.u64 %r49,[%r154]; add.u64 %r288,%r49,-1; st.u64 [%r156+16],%r288; st.u64 [%r156],%r96; .loc 1 91 7 add.u64 %r156,%r156,24; add.u64 %r154,%r154,8; setp.ne.u64 %r289,%r148,%r156; @ %r289 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r181+48],%r110; mov.u64 %r291,-1; st.u64 [%r181+56],%r291; mov.u64 %r292,1; st.u64 [%r181+40],%r292; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r59,[%r181+28]; .loc 1 118 10 setp.eq.u64 %r293,%r59,%r104; @ %r293 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r104; st.u64 [%stack],%r59;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r296,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r297,[%r296+36]; setp.eq.u32 %r298,%r297,0; @ %r298 bra $L21; .loc 1 125 2 add.u64 %r348,%frame,240; cvta.const.u64 %r302,$LC2r3430302; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r304,%r104,0; @ %r304 bra $L24; $L27: .loc 1 65 9 add.u64 %r305,%r30,1; .loc 1 65 7 sub.u64 %r106,%r305,%r32; max.s64 %r45,%r106,0; .loc 1 137 8 ld.u64 %r118,[%r182]; .loc 1 138 8 ld.u64 %r119,[%r181]; .loc 1 154 36 shl.b64 %r66,%r108,4; ld.u64 %r111,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 shl.b64 %r69,%r68,4; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 shl.b64 %r72,%r71,4; .loc 1 169 32 ld.u64 %r126,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r306,%r68,%r126; .loc 1 176 9 shl.b64 %r307,%r306,4; neg.s64 %r129,%r307; .loc 1 177 23 mul.lo.u64 %r308,%r71,%r126; .loc 1 177 9 shl.b64 %r309,%r308,4; neg.s64 %r131,%r309; setp.gt.s64 %r350,%r106,0; setp.le.s64 %r351,%r104,1; .loc 1 150 10 ld.const.f64 %r353,[$LC4]; ld.const.f64 %r354,[$LC4+8]; cvt.u32.u32 %r355,%r22; cvt.s64.s8 %r356,%r355; add.u64 %r357,%r356,-1; add.u64 %r358,%frame,120; add.u64 %r359,%frame,240; bra $L25; $L24: add.u64 %r169,%frame,360; add.u64 %r168,%r181,40; mov.u64 %r166,%frame; cvt.u32.u32 %r311,%r22; cvt.s64.s8 %r310,%r311; add.u64 %r157,%r310,-1; .loc 1 129 3 mov.u64 %r120,0; add.u64 %r348,%frame,240; .loc 1 131 16 mov.u64 %r312,%r120; $L26: st.u64 [%r169],%r312; .loc 1 132 18 ld.u64 %r313,[%r168]; st.u64 [%r166],%r313; .loc 1 133 17 shl.b64 %r315,%r120,3; add.u64 %r316,%r348,%r315; .loc 1 133 10 ld.u64 %r317,[%r316]; setp.le.s64 %r318,%r317,0; @ %r318 bra $L1; .loc 1 129 26 add.u64 %r120,%r120,1; .loc 1 129 3 add.u64 %r169,%r169,8; add.u64 %r168,%r168,24; add.u64 %r166,%r166,8; setp.ne.u64 %r319,%r120,%r157; @ %r319 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r350 bra $L35; .loc 1 150 10 st.f64 [%r119],%r353; st.f64 [%r119+8],%r354; bra $L29; $L35: mov.u64 %r113,%r118; mov.f64 %r40,0d0000000000000000; mov.f64 %r102,%r40; .loc 1 154 13 mov.u64 %r112,0; $L28: .loc 1 158 10 ld.f64 %r323,[%r113]; add.f64 %r102,%r102,%r323; ld.f64 %r324,[%r113+8]; add.f64 %r40,%r40,%r324; .loc 1 154 28 add.u64 %r112,%r112,1; .loc 1 154 36 add.u64 %r113,%r113,%r66; .loc 1 154 6 setp.gt.s64 %r325,%r45,%r112; @ %r325 bra $L28; .loc 1 161 12 st.f64 [%r119],%r102; st.f64 [%r119+8],%r40; $L29: .loc 1 165 15 add.u64 %r111,%r111,1; .loc 1 166 12 add.u64 %r118,%r118,%r69; .loc 1 167 12 add.u64 %r119,%r119,%r72; .loc 1 169 13 setp.ne.u64 %r326,%r111,%r126; @ %r326 bra $L25; .loc 1 176 9 add.u64 %r115,%r118,%r129; .loc 1 177 9 add.u64 %r116,%r119,%r131; .loc 1 179 7 @ %r351 bra $L1; add.u64 %r125,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r117,1; .loc 1 173 13 mov.u64 %r352,0; bra $L32; $L33: st.u64 [%r125],%r352; .loc 1 176 23 mul.lo.u64 %r332,%r85,%r84; .loc 1 176 9 shl.b64 
%r333,%r332,4; sub.u64 %r115,%r118,%r333; .loc 1 177 23 mul.lo.u64 %r334,%r88,%r84; .loc 1 177 9 shl.b64 %r335,%r334,4; sub.u64 %r116,%r119,%r335; .loc 1 178 5 add.u64 %r117,%r117,1; .loc 1 179 7 add.u64 %r125,%r125,8; add.u64 %r140,%r140,8; setp.eq.u64 %r336,%r117,%r357; @ %r336 bra $L1; $L32: .loc 1 187 16 ld.u64 %r337,[%r125]; add.u64 %r84,%r337,1; st.u64 [%r125],%r84; .loc 1 188 23 add.u64 %r339,%r358,%r140; ld.u64 %r85,[%r339]; .loc 1 188 13 shl.b64 %r340,%r85,4; add.u64 %r118,%r115,%r340; .loc 1 189 23 add.u64 %r341,%frame,%r140; ld.u64 %r88,[%r341]; .loc 1 189 13 shl.b64 %r342,%r88,4; add.u64 %r119,%r116,%r342; .loc 1 169 32 add.u64 %r344,%r359,%r140; ld.u64 %r92,[%r344]; .loc 1 169 13 setp.eq.u64 %r345,%r84,%r92; @ %r345 bra $L33; .loc 1 173 13 mov.u64 %r111,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r346,[%r181]; setp.eq.u64 %r347,%r346,0; @ ! %r347 bra $L11; bra $L34; $L1: .loc 1 19_gfortran_msum_c8 .visible .func _gfortran_msum_c860898pred %r215; .reg .u32 %r219; .reg .u64 %r221; .reg .u32 %r223; .reg .u32 %r224; .reg .u16 %r226; .reg .u16 %r227; .reg .u16u16 %r263; .reg .u32pred %r282; .reg .u64 %r283; .reg .u64u64u6464pred %r325; .reg .u64 %r327; .reg .u64 %r329; .reg .pred %r331; .reg .pred %r332; .reg .u64 %r333; .reg .u32 %r334; .reg .u64u64pred %r357; .reg .u64 %r358; .reg .u64 %r359; .reg .predu64 %r392; .reg .u64 %r393; .reg .pred %r394; .reg .predpred %r404; .reg .pred %r405; .reg .u64pred %r426; .reg .u64 %r427; .reg .predpred.reg .u64mov.u64 %r211,%ar0; mov.u64 %r212,%ar1; mov.u64 %r213,%ar2; mov.u64 %r214,%ar3; .loc 1 223 6 setp.ne.u64 %r215,%r214,0; @ %r215 bra $L49gfortran_sum_c8.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r213]; .loc 1 233 7 add.u64 %r136,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r212+28]; .loc 1 234 38 add.u32 %r219,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r137,%r219; .loc 1 237 7 shr.u64 %r221,%r136,63; set.u32.gt.s64 %r223,%r136,%r137; neg.s32 %r224,%r223; cvt.u16.u64 %r227,%r221; cvt.u16.u32 %r228,%r224; or.b16 %r226,%r227,%r228; .loc 1 237 6 cvt.u32.u16 %r229,%r226; cvt.u16.u8 %r230,%r229; setp.eq.u16 %r231,%r230,0; @ %r231 bra $L51; .loc 1 239 7 cvt.u32.u32 %r234,%r23; cvt.s64.s8 %r233,%r234; st.u64 [%stack+8],%r233; st.u64 [%stack],%r22; cvta.const.u64 %r2322stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r432,%r136,%r136; add.u64 %r433,%r432,%r136; shl.b64 %r239,%r433,3; add.u64 %r240,%r212,%r239; ld.u64 %r243,[%r240+56]; add.u64 %r242,%r243,1; .loc 1 244 7 ld.u64 %r250,[%r240+48]; sub.u64 %r138,%r242,%r250; .loc 1 245 6 setp.le.s64 %r251,%r138,0; @ %r251 bra $L48; .loc 1 248 9 ld.u64 %r151,[%r214]; .loc 1 250 15 ld.u64 %r34,[%r214+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r252,%r35,-4; and.b32 %r253,%r252,-5; set.u32.eq.u32 %r255,%r253,0; neg.s32 %r256,%r255; .loc 1 252 22 add.u32 %r257,%r35,-1; set.u32.le.u32 %r259,%r257,1; neg.s32 %r260,%r259; .loc 1 252 6 cvt.u16.u32 %r262,%r256; cvt.u16.u32 %r263,%r260; or.b16 %r261,%r262,%r263; cvt.u32.u16 %r264,%r261; cvt.u16.u8 %r265,%r264; setp.ne.u16 %r266,%r265,0; @ %r266 bra $L53; .loc 1 254 7 setp.ne.u32 %r268,%r35,16; @ %r268 bra $L54; $L53: .loc 1 261 9 shl.b64 %r272,%r433,3; add.u64 %r273,%r212,%r272; ld.u64 %r141,[%r273+40]; .loc 1 262 12 add.u64 %r279,%r214,%r272; ld.u64 %r41,[%r279+40]; .loc 1 264 3 setp.ne.u64 %r281,%r136,0; @ %r281 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r282,%r136,%r137; @ %r282 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r283283_gfortran_runtime_error5: add.u64 %r40,%r212,40; add.u64 %r129,%r214,40; add.u64 %r285,%r212,16; 
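// This is the COMPLEX(KIND=8) counterpart of _gfortran_msum_c4 above; the control flow is
// identical, but stride-to-byte conversions use shl.b64 ...,4 (16-byte elements) instead of
// ...,3, and the accumulation at .loc 353 performs .f64 adds on a real/imaginary pair
// initialised from 0d0000000000000000 rather than the .f32 adds of the _c4 body.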
add.u64 %r287,%r22,%r22; add.u64 %r288,%r287,%r22; shl.b64 %r289,%r288,3; add.u64 %r208,%r285,%r289; .loc 1 264 3 mov.u64 %r87,0; add.u64 %r430,%frame,240; add.u64 %r429,%frame,360; .loc 1 271 12 mov.u64 %r444,%r87; $L60: .loc 1 266 18 add.u64 %r291,%r430,%r87; ld.u64 %r292,[%r40]; st.u64 [%r291],%r292; .loc 1 267 18 add.u64 %r293,%frame,%r87; .loc 1 267 20 ld.u64 %r295,[%r129]; mul.lo.u64 %r294,%r295,%r34; .loc 1 267 18 st.u64 [%r293],%r294; .loc 1 268 19 ld.u64 %r297,[%r40+16]; add.u64 %r296,%r297,1; ld.u64 %r298,[%r40+8]; sub.u64 %r52,%r296,%r298; .loc 1 270 10 setp.lt.s64 %r299,%r52,0; @ %r299 bra $L58; .loc 1 268 17 add.u64 %r301,%r429,%r87; st.u64 [%r301],%r52; bra $L59; $L58: .loc 1 271 12 add.u64 %r303,%r429,%r87; st.u64 [%r303],%r444; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r129,%r129,24; add.u64 %r87,%r87,8; setp.ne.u64 %r305,%r40,%r208; @ %r305 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r306,[%r211]; setp.eq.u64 %r307,%r306,0; @ ! %r307 bra $L63; bra $L62; $L56: add.u64 %r309,%r22,%r22; add.u64 %r310,%r309,%r22; shl.b64 %r311,%r310,3; add.u64 %r122,%r311,40; add.u64 %r131,%r212,%r122; add.u64 %r117,%r214,%r122; shl.b64 %r312,%r22,3; add.u64 %r135,%r312,-8; cvt.u32.u32 %r314,%r23; cvt.s64.s8 %r313,%r314; shl.b64 %r315,%r313,3; add.u64 %r66,%r315,-8; add.u64 %r430,%frame,240; add.u64 %r429,%frame,360; .loc 1 281 12 mov.u64 %r443,0; $L66: .loc 1 276 18 add.u64 %r317,%r430,%r135; ld.u64 %r318,[%r131]; st.u64 [%r317],%r318; .loc 1 277 18 add.u64 %r319,%frame,%r135; .loc 1 277 20 ld.u64 %r321,[%r117]; mul.lo.u64 %r320,%r321,%r34; .loc 1 277 18 st.u64 [%r319],%r320; .loc 1 278 19 ld.u64 %r323,[%r131+16]; add.u64 %r322,%r323,1; ld.u64 %r324,[%r131+8]; sub.u64 %r62,%r322,%r324; .loc 1 280 10 setp.lt.s64 %r325,%r62,0; @ %r325 bra $L64; .loc 1 278 17 add.u64 %r327,%r429,%r135; st.u64 [%r327],%r62; bra $L65; $L64: .loc 1 281 12 add.u64 %r329,%r429,%r135; st.u64 [%r329],%r443; $L65: .loc 1 274 3 add.u64 %r131,%r131,24; add.u64 %r117,%r117,24; add.u64 %r135,%r135,8; setp.ne.u64 %r331,%r66,%r135; @ %r331 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r332,%r137,0; @ %r332 bra $L68; add.u64 %r429,%frame,360; $L86: add.u64 %r179,%r211,40; mov.u64 %r171,%r429; cvt.u32.u32 %r334,%r23; cvt.s64.s8 %r333,%r334; add.u64 %r336,%r333,%r333; add.u64 %r337,%r336,%r333; shl.b64 %r338,%r337,3; add.u64 %r339,%r211,16; add.u64 %r132,%r338,%r339; .loc 1 291 10 mov.u64 %r126,1; .loc 1 295 4 mov.u64 %r358,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r340,%r23,-2; cvt.s64.s32 %r71,%r340; add.u64 %r342,%r71,%r71; add.u64 %r343,%r342,%r71; shl.b64 %r344,%r343,3; add.u64 %r345,%r211,%r344; .loc 1 299 67 shl.b64 %r347,%r71,3; add.u64 %r348,%frame,%r347; .loc 1 299 59 ld.u64 %r350,[%r345+40]; ld.u64 %r351,[%r348+360]; mul.lo.u64 %r145,%r350,%r351; .loc 1 301 24 mov.u64 %r352,0; st.u64 [%r211+8],%r352; .loc 1 302 28 cvt.u16.u32 %r355,%r23; add.u16 %r354,%r355,-1; cvt.u32.u16 %r356,%r354; st.u8 [%r211+28],%r356; .loc 1 304 10 setp.eq.u64 %r357,%r145,0; @ %r357 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r126,%r68,%r126; $L69: .loc 1 295 4 st.u64 [%r179+8],%r358; ld.u64 %r68,[%r171]; add.u64 %r359,%r68,-1; st.u64 [%r179+16],%r359; st.u64 [%r179],%r126; .loc 1 288 7 add.u64 %r179,%r179,24; add.u64 %r171,%r171,8; setp.ne.u64 %r360,%r132,%r179; @ %r360 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r211+48],%r145; mov.u64 %r362,-1; st.u64 [%r211+56],%r362; mov.u64 %r363,1; st.u64 [%r211+40],%r363; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r365,16365; call 
(%value_in),_gfortrani_xmallocarray66,[%value_in]; } .loc 1 311 22 st.u64 [%r211],%r366; bra $L73; $L63: .loc 1 316 19 ld.s8 %r368,[%r211+28]; .loc 1 316 10 setp.eq.u64 %r369,%r368,%r137; @ %r369 bra $L74; .loc 1 317 2 cvta.const.u64 %r370,$LC370_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r372,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r373,[%r372+36]; setp.eq.u32 %r374,%r373,0; @ %r374 bra $L73; .loc 1 321 4 add.u64 %r429,%frame,360; cvta.const.u64 %r378,$LC2377378; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r38221212382378; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r384,%r137,0; @ %r384 bra $L75; $L78: .loc 1 336 8 ld.u64 %r152,[%r211]; .loc 1 337 8 ld.u64 %r150,[%r212]; .loc 1 339 9 setp.ne.u64 %r385,%r150,0; @ %r385 bra $L76; bra $L48; $L75: add.u64 %r192,%frame,480; add.u64 %r191,%r211,40; add.u64 %r189,%frame,120; cvt.u32.u32 %r387,%r23; cvt.s64.s8 %r386,%r387; add.u64 %r180,%r386,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r429,%frame,360; .loc 1 330 16 mov.u64 %r388,%r157; $L77: st.u64 [%r192],%r388; .loc 1 331 18 ld.u64 %r389,[%r191]; st.u64 [%r189],%r389; .loc 1 332 17 shl.b64 %r391,%r157,3; add.u64 %r392,%r429,%r391; .loc 1 332 10 ld.u64 %r393,[%r392]; setp.le.s64 %r394,%r393,0; @ %r394 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r192,%r192,8; add.u64 %r191,%r191,24; add.u64 %r189,%r189,8; setp.ne.u64 %r395,%r157,%r180; @ %r395 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r43,%r41,%r34; .loc 1 349 32 shl.b64 %r86,%r141,4; ld.u64 %r153,[%frame+480]; .loc 1 359 22 ld.u64 %r89,[%frame+240]; .loc 1 359 12 shl.b64 %r90,%r89,4; .loc 1 360 23 ld.u64 %r92,[%frame]; .loc 1 361 22 ld.u64 %r94,[%frame+120]; .loc 1 361 12 shl.b64 %r95,%r94,4; .loc 1 363 32 ld.u64 %r80,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r80,%r89; .loc 1 371 24 mul.lo.u64 %r396,%r80,%r92; .loc 1 371 10 neg.s64 %r162,%r396; .loc 1 372 23 mul.lo.u64 %r397,%r80,%r94; .loc 1 372 9 shl.b64 %r398,%r397,4; neg.s64 %r165,%r398; setp.le.s64 %r434,%r137,1; cvt.u32.u32 %r436,%r23; cvt.s64.s8 %r437,%r436; add.u64 %r438,%r437,-1; add.u64 %r439,%frame,240; add.u64 %r440,%frame,360; add.u64 %r441,%frame,120; .loc 1 367 13 mov.u64 %r442,0; $L87: .loc 1 328 3 mov.u64 %r156,%r151; mov.u64 %r155,%r150; mov.f64 %r118,0d0000000000000000; mov.f64 %r119,%r118; .loc 1 349 9 mov.u64 %r154,0; $L80: .loc 1 352 6 ld.s8 %r400,[%r156]; cvt.u16.u32 %r399,%r400; setp.eq.u16 %r401,%r399,0; @ %r401 bra $L79; .loc 1 353 12 ld.f64 %r402,[%r155]; add.f64 %r119,%r119,%r402; ld.f64 %r403,[%r155+8]; add.f64 %r118,%r118,%r403; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r86; .loc 1 349 47 add.u64 %r156,%r156,%r43; .loc 1 349 2 setp.ne.u64 %r404,%r138,%r154; @ %r404 bra $L80; .loc 1 355 8 st.f64 [%r152],%r119; st.f64 [%r152+8],%r118; .loc 1 358 15 add.u64 %r153,%r153,1; .loc 1 359 12 add.u64 %r150,%r150,%r90; .loc 1 360 13 add.u64 %r151,%r151,%r92; .loc 1 361 12 add.u64 %r152,%r152,%r95; .loc 1 363 13 setp.ne.u64 %r405,%r80,%r153; @ %r405 bra $L87; .loc 1 371 10 add.u64 %r147,%r151,%r162; .loc 1 372 9 add.u64 %r148,%r152,%r165; .loc 1 374 7 @ ! 
%r434 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r128],%r442; .loc 1 370 23 mul.lo.u64 %r98,%r109,%r108; .loc 1 371 24 mul.lo.u64 %r408,%r111,%r108; .loc 1 371 10 sub.u64 %r147,%r151,%r408; .loc 1 372 23 mul.lo.u64 %r409,%r113,%r108; .loc 1 372 9 shl.b64 %r410,%r409,4; sub.u64 %r148,%r152,%r410; .loc 1 373 5 add.u64 %r149,%r149,1; .loc 1 374 7 add.u64 %r128,%r128,8; add.u64 %r178,%r178,8; setp.ne.u64 %r411,%r149,%r438; @ %r411 bra $L84; bra $L48; $L108: add.u64 %r128,%frame,488; .loc 1 370 23 mov.u64 %r98,%r38; .loc 1 374 7 mov.u64 %r178,8; .loc 1 373 5 mov.u64 %r149,1; $L84: .loc 1 382 16 ld.u64 %r415,[%r128]; add.u64 %r108,%r415,1; st.u64 [%r128],%r108; .loc 1 383 23 add.u64 %r417,%r439,%r178; ld.u64 %r109,[%r417]; .loc 1 383 13 sub.u64 %r418,%r109,%r98; shl.b64 %r419,%r418,4; add.u64 %r150,%r150,%r419; .loc 1 384 24 add.u64 %r420,%frame,%r178; ld.u64 %r111,[%r420]; .loc 1 384 14 add.u64 %r151,%r147,%r111; .loc 1 385 23 add.u64 %r422,%r441,%r178; ld.u64 %r113,[%r422]; .loc 1 385 13 shl.b64 %r423,%r113,4; add.u64 %r152,%r148,%r423; .loc 1 363 32 add.u64 %r425,%r440,%r178; ld.u64 %r116,[%r425]; .loc 1 363 13 setp.eq.u64 %r426,%r108,%r116; @ %r426 bra $L85; .loc 1 367 13 mov.u64 %r153,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r427,[%r211]; setp.eq.u64 %r428,%r427,0; @ ! %r428 bra $L63; bra $L86; $L48:_gfortran_ssum_c8 .visible .func _gfortran_ssum_c868u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64 %r276; .reg .pred %r277; .reg .pred %r280; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_sum_c8.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r297,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r297; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r297; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r304,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r304; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r296,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r296,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r303,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r303; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r297,0; @ %r211 bra $L127; add.u64 %r296,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r296; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r244,1696244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r297,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r297; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r297; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r297,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; 
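// The $L137 loop just completed is the bounds-checking path of the scalar-MASK variant
// _gfortran_ssum_c8: when compile_options bounds checking is enabled it compares each extent
// of the caller-supplied result descriptor against the computed extent and calls
// _gfortran_runtime_error with the "Incorrect extent in return value of SUM intrinsic in
// dimension %ld" message from the string table above on any mismatch.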
$L132: .loc 1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r297,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r297,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,4; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; ld.const.f64 %r293,[$LC4]; ld.const.f64 %r294,[$LC4+8]; setp.le.s64 %r295,%r297,1; cvt.u32.u32 %r299,%r24; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; .loc 1 520 13 mov.u64 %r302,0; $L146: .loc 1 512 13 st.f64 [%r98],%r293; st.f64 [%r98+8],%r294; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L146; .loc 1 525 7 @ ! %r295 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r302; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r301; @ %r283 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; $L144: .loc 1 529 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r289,%r75,%r72; shl.b64 %r290,%r289,4; add.u64 %r98,%r98,%r290; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r292,%r98,0; @ ! 
%r292 bra $L122; bra $L147; $L109: .loc 1 534 1 ret; } sum_c10.o/ bessel_r4.o/_gfortran_bessel_jn_r4 .visible .func _gfortran_bessel_jn_r4, .param .f32 %in_ar3); .file 1 "../../../libgfortran/generated/bessel_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_bessel_yn_r4 .visible .func _gfortran_bessel_yn_r4_gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime_jnf .extern .func (.param .f32 %value_out) jnfCL: ynf .extern .func (.param .f32 %value_out) ynf (.param .u32 %in_ar0, .param .f32 %in_ar1)99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,66,69,83,83,69,76,95,74,78,32,40,37,108,100,32,118,115,46,32,37,108,100,41_gfortran_bessel_jn_r4 .visible .func _gfortran_bessel_jn_r4predpredpred %r115; .reg .f32 %r116; .reg .pred %r117; .reg .f32 %r118; .reg .predpredpred %r147; mov.u64 %r84,%ar0; mov.u32 %r85,%ar1; mov.u32 %r86,%ar2; mov.f32 %r87,%ar3; .loc 1 52 6 ld.u64 %r88,[%r84]; setp.ne.u64 %r89,%r88,0; @ %r89 bra $L2; .loc 1 54 33 setp.lt.s32 %r90,%r86,%r85; @ %r90 bra $L10; .loc 1 54 37 sub.u32 %r91,%r86,%r85; .loc 1 54 40 add.u32 %r92,%r91,1; .loc 1 54 33 cvt.s64.s32 %r63,%r92; .loc 1 55 7 add.u64 %r78,%r63,-1; bra $L3; $L10: mov.u64 %r78,-1; .loc 1 54 33 mov.u64 %r63,%r88; $L3: .loc 1 55 7 mov.u64 %r93,0; st.u64 [%r84+48],%r93; st.u64 [%r84+56],%r78; mov.u64 %r94,1; st.u64 [%r84+40],%r94; .loc 1 56 24 mov.u64 %r96r96; call (%value_in),_gfortrani_xmallocarray.loc 1 56 22 st.u64 [%r84],%r97; .loc 1 57 19 st.u64 [%r84+8],%r93; $L2: .loc 1 60 6 setp.ge.s32 %r100,%r86,%r85; @ ! %r100 bra $L1; .loc 1 54 37 sub.u32 %r83,%r86,%r85; .loc 1 63 7 cvta.global.u64 %r101,_gfortrani_compile_options; .loc 1 63 6 ld.u32 %r102,[%r101+36]; setp.eq.u32 %r103,%r102,0; @ %r103 bra $L5; .loc 1 64 10 ld.u64 %r105,[%r84+56]; add.u64 %r104,%r105,1; ld.u64 %r106,[%r84+48]; sub.u64 %r33,%r104,%r106; .loc 1 64 48 add.u32 %r107,%r83,133,%r108; @ %r109 bra $L5; .loc 1 65 5 st.u64 [%stack+8],%r33; .loc 1 66 22 cvt.s64.s32 %r111,%r86; .loc 1 65 5 cvt.s64.s32 %r112,%r85; sub.u64 %r113,%r111,%r112; st.u64 [%stack],%r113; cvta.const.u64 %r110110_gfortran_runtime_errorr66,[%r84+40]; .loc 1 71 6 setp.neu.f32 %r115,%r87,0f00000000; @ %r115 bra $L6; .loc 1 73 10 ld.u64 %r41,[%r84]; .loc 1 73 25 mov.f32 %r116,0f3f800000; st.f32 [%r41],%r116; .loc 1 74 7 setp.eq.u32 %r117,%r83,0; @ %r117 bra $L1; shl.b64 %r74,%r66,2; add.u64 %r65,%r41,%r74; add.u32 %r69,%r83,1; .loc 1 74 14 mov.u32 %r68,1; .loc 1 75 34 mov.f32 %r118,0f00000000; $L8: st.f32 [%r65],%r118; .loc 1 74 32 add.u32 %r68,%r68,1; .loc 1 74 7 add.u64 %r65,%r65,%r74; setp.ne.u32 %r119,%r68,%r69; @ %r119 bra $L8; bra $L1; $L6: .loc 1 79 jn122,[%value_in]; } mov.f32 %r40,%r122; .loc 1 80 21 cvt.s64.s32 %r123,%r83; .loc 1 80 25 mul.lo.u64 %r124,%r123,%r66; .loc 1 80 34 ld.u64 %r125,[%r84]; shl.b64 %r126,%r124,2; add.u64 %r127,%r125,%r126; st.f32 [%r127],%r40; .loc 1 82 6 setp.eq.u32 %r128,%r86,%r85; @ %r128 bra $L1; .loc 1 85 11 add.u32 %r131,%r86,-131(%value_in),jn132,[%value_in]; } mov.f32 %r29,%r132; .loc 1 86 6 ld.u64 %r50,[%r84]; .loc 1 86 24 add.u32 %r51,%r83,-1; cvt.s64.s32 %r133,%r51; .loc 1 86 27 mul.lo.u64 %r53,%r133,%r66; .loc 1 86 36 shl.b64 %r134,%r53,2; add.u64 %r135,%r50,%r134; st.f32 [%r135],%r29; .loc 1 88 10 add.u32 %r136,%r85,1; .loc 1 88 6 setp.eq.u32 %r137,%r136,%r86; @ %r137 bra $L1; .loc 1 91 9 mov.f32 %r138,0f40000000; 
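// 0f40000000 is 2.0f: the code below computes 2/x and fills the remaining orders of
// BESSEL_JN by the backward recurrence of bessel_r4.c lines 93-95,
// J(n-1,x) = (2*n/x) * J(n,x) - J(n+1,x) (the fma.rn.f32 in the $L9 loop),
// stepping the store pointer down one stride per iteration after the two highest
// orders were obtained from jnf().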
div.rn.f32 %r67,%r138,%r87; .loc 1 93 3 setp.le.s32 %r139,%r83,1; @ %r139 bra $L1; add.u32 %r64,%r51,%r85; neg.s64 %r141,%r66; shl.b64 %r142,%r141,2; sub.u64 %r143,%r53,%r66; shl.b64 %r144,%r143,2; add.u64 %r61,%r50,%r144; $L9: neg.f32 %r27,%r40; mov.f32 %r40,%r29; .loc 1 95 40 cvt.rn.f32.s32 %r145,%r64; mul.f32 %r146,%r145,%r67; .loc 1 95 59 fma.rn.f32 %r29,%r29,%r146,%r27; .loc 1 95 32 st.f32 [%r61],%r29; .loc 1 93 3 add.u32 %r64,%r64,-1; add.u64 %r61,%r61,%r142; setp.ne.u32 %r147,%r64,%r85; @ %r147 bra $L9; _gfortran_bessel_yn_r4 .visible .func _gfortran_bessel_yn_64f32.reg .u6464pred %r109; .reg .f32predpred %r127; .reg .f32 %r128; mov.u64 %r74,%ar0; mov.u32 %r75,%ar1; mov.u32 %r76,%ar2; mov.f32 %r77,%ar3; .loc 1 119 6 ld.u64 %r78,[%r74]; setp.ne.u64 %r79,%r78,0; @ %r79 bra $L20; .loc 1 121 33 setp.lt.s32 %r80,%r76,%r75; @ %r80 bra $L30; .loc 1 121 37 sub.u32 %r81,%r76,%r75; .loc 1 121 40 add.u32 %r82,%r81,1; .loc 1 121 33 cvt.s64.s32 %r65,%r82; .loc 1 122 7 add.u64 %r70,%r65,-1; bra $L21; $L30: mov.u64 %r70,-1; .loc 1 121 33 mov.u64 %r65,%r78; $L21: .loc 1 122 7 mov.u64 %r83,0; st.u64 [%r74+48],%r83; st.u64 [%r74+56],%r70; mov.u64 %r84,1; st.u64 [%r74+40],%r84; .loc 1 123 24 mov.u64 %r866586; call (%value_in),_gfortrani_xmallocarray.loc 1 123 22 st.u64 [%r74],%r87; .loc 1 124 19 st.u64 [%r74+8],%r83; $L20: .loc 1 127 6 setp.ge.s32 %r90,%r76,%r75; @ ! %r90 bra $L19; .loc 1 130 7 cvta.global.u64 %r91,_gfortrani_compile_options; .loc 1 130 6 ld.u32 %r92,[%r91+36]; setp.eq.u32 %r93,%r92,0; @ %r93 bra $L23; .loc 1 131 10 ld.u64 %r95,[%r74+56]; add.u64 %r94,%r95,1; ld.u64 %r96,[%r74+48]; sub.u64 %r34,%r94,%r96; .loc 1 131 45 sub.u32 %r97,%r76,%r75; .loc 1 131 48 add.u32 %r98,%r97,1; cvt.s64.s32 %r99,%r98; .loc 1 131 7 setp.eq.u64 %r100,%r34,%r99; @ %r100 bra $L23; .loc 1 132 5 st.u64 [%stack+8],%r34; .loc 1 133 22 cvt.s64.s32 %r102,%r76; .loc 1 132 5 cvt.s64.s32 %r103,%r75; sub.u64 %r104,%r102,%r103; st.u64 [%stack],%r104; cvta.const.u64 %r101,$LC0; {_gfortran_runtime_error23: .loc 1 136 10 ld.u64 %r68,[%r74+40]; .loc 1 138 6 setp.neu.f32 %r106,%r77,0f00000000; @ %r106 bra $L24; shl.b64 %r64,%r68,2; ld.u64 %r67,[%r74]; .loc 1 140 26 sub.u32 %r107,%r76,%r75; add.u32 %r50,%r107,1; .loc 1 140 14 mov.u32 %r42,0; .loc 1 142 34 mov.f32 %r108,0fff800000; $L25: st.f32 [%r67],%r108; .loc 1 140 32 add.u32 %r42,%r42,1; .loc 1 140 7 add.u64 %r67,%r67,%r64; setp.ne.u32 %r109,%r50,%r42; @ %r109 bra $L25; bra $L19; $L24: .loc 1 149 yn112,[%value_in]; } mov.f32 %r62,%r112; .loc 1 150 21 ld.u64 %r113,[%r74]; st.f32 [%r113],%r62; .loc 1 152 6 setp.eq.u32 %r114,%r76,%r75; @ %r114 bra $L19; .loc 1 155 11 add.u32 %r45,%r75,yn117,[%value_in]; } mov.f32 %r63,%r117; .loc 1 156 17 shl.b64 %r48,%r68,2; ld.u64 %r118,[%r74]; add.u64 %r49,%r118,%r48; .loc 1 156 28 st.f32 [%r49],%r63; .loc 1 158 6 setp.eq.u32 %r119,%r45,%r76; @ %r119 bra $L19; .loc 1 161 9 mov.f32 %r120,0f40000000; div.rn.f32 %r69,%r120,%r77; .loc 1 163 23 sub.u32 %r51,%r76,%r75; .loc 1 163 3 setp.eq.u32 %r121,%r51,1; @ %r121 bra $L19; add.u64 %r41,%r49,%r48; mov.u32 %r72,%r45; add.u32 %r25,%r51,%r75; .loc 1 168 29 mov.f32 %r128,0fff800000; $L29: .loc 1 166 10 setp.neu.f32 %r122,%r63,0fff800000; @ %r122 bra $L27; .loc 1 168 29 st.f32 [%r41],%r128; bra $L28; $L27: .loc 1 173 37 cvt.rn.f32.s32 %r124,%r72; mul.f32 %r125,%r124,%r69; .loc 1 173 56 neg.f32 %r126,%r62; fma.rn.f32 %r58,%r125,%r63,%r126; .loc 1 173 29 st.f32 [%r41],%r58; mov.f32 %r62,%r63; .loc 1 175 10 mov.f32 %r63,%r58; $L28: .loc 1 163 3 add.u64 %r41,%r41,%r48; add.u32 %r72,%r72,1; 
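// Editorial sketch (hedged): the $L9 loop of _gfortran_bessel_jn_r4 above
// (.loc 1 93-95: cvt.rn.f32.s32 / mul.f32 %r146,%r145,%r67 / fma.rn.f32 %r29,%r29,%r146,%r27)
// implements the downward three-term recurrence J_{n-1}(x) = (2n/x)*J_n(x) - J_{n+1}(x),
// seeded from jnf(n2,x) and jnf(n2-1,x). The C helper below is illustrative only: the
// name bessel_jn_fill_down and its flat-array interface are assumptions, not the
// libgfortran descriptor ABI; only the recurrence itself is taken from the code above.
#include <math.h>

/* Fill ret[0..n2-n1] with J_n1(x) .. J_n2(x), assuming n2 > n1 and x != 0.
   jnf() is the single-precision Bessel function of the first kind from libm.  */
static void bessel_jn_fill_down (float *ret, int n1, int n2, float x)
{
  float x2rev = 2.0f / x;                 /* div.rn.f32 %r67,%r138,%r87 above        */
  float last1 = jnf (n2, x);              /* highest order straight from libm        */
  float last2 = jnf (n2 - 1, x);
  ret[n2 - n1] = last1;
  ret[n2 - n1 - 1] = last2;
  for (int i = n2 - n1 - 2; i >= 0; i--)  /* walk down towards order n1              */
    {
      /* J_{i+n1}(x) = (2*(i+1+n1)/x) * J_{i+1+n1}(x) - J_{i+2+n1}(x), the fma above. */
      ret[i] = x2rev * (float) (i + 1 + n1) * last2 - last1;
      last1 = last2;
      last2 = ret[i];
    }
}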
setp.ne.u32 %r127,%r25,%r72; @ %r127 bra $L29; $L19: .loc 1 178 1 ret; } bessel_r8.o/_gfortran_bessel_jn_r8 .visible .func _gfortran_bessel_jn_r8, .param .f64 %in_ar3); .file 1 "../../../libgfortran/generated/bessel_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_bessel_yn_r8 .visible .func _gfortran_bessel_yn_r8gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime_jn .extern .func (.param .f64 %value_out) jnCL: yn .extern .func (.param .f64 %value_out) yn (.param .u32 %in_ar0, .param .f64 %in_ar1)99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,66,69,83,83,69,76,95,74,78,32,40,37,108,100,32,118,115,46,32,37,108,100,41_gfortran_bessel_jn_r8 .visible .func _gfortran_bessel_jn_r8reg .f64 %r27; .reg .f64 %r29; .reg .u64predpredpred %r115; .reg .f64 %r116; .reg .pred %r117; .reg .f64 %r118; .reg .predpredpred %r147; mov.u64 %r84,%ar0; mov.u32 %r85,%ar1; mov.u32 %r86,%ar2; mov.f64 %r87,%ar3; .loc 1 52 6 ld.u64 %r88,[%r84]; setp.ne.u64 %r89,%r88,0; @ %r89 bra $L2; .loc 1 54 33 setp.lt.s32 %r90,%r86,%r85; @ %r90 bra $L10; .loc 1 54 37 sub.u32 %r91,%r86,%r85; .loc 1 54 40 add.u32 %r92,%r91,1; .loc 1 54 33 cvt.s64.s32 %r63,%r92; .loc 1 55 7 add.u64 %r78,%r63,-1; bra $L3; $L10: mov.u64 %r78,-1; .loc 1 54 33 mov.u64 %r63,%r88; $L3: .loc 1 55 7 mov.u64 %r93,0; st.u64 [%r84+48],%r93; st.u64 [%r84+56],%r78; mov.u64 %r94,1; st.u64 [%r84+40],%r94; .loc 1 56 24 mov.u6r96; call (%value_in),_gfortrani_xmallocarray.loc 1 56 22 st.u64 [%r84],%r97; .loc 1 57 19 st.u64 [%r84+8],%r93; $L2: .loc 1 60 6 setp.ge.s32 %r100,%r86,%r85; @ ! %r100 bra $L1; .loc 1 54 37 sub.u32 %r83,%r86,%r85; .loc 1 63 7 cvta.global.u64 %r101,_gfortrani_compile_options; .loc 1 63 6 ld.u32 %r102,[%r101+36]; setp.eq.u32 %r103,%r102,0; @ %r103 bra $L5; .loc 1 64 10 ld.u64 %r105,[%r84+56]; add.u64 %r104,%r105,1; ld.u64 %r106,[%r84+48]; sub.u64 %r33,%r104,%r106; .loc 1 64 48 add.u32 %r107,%r83,133,%r108; @ %r109 bra $L5; .loc 1 65 5 st.u64 [%stack+8],%r33; .loc 1 66 22 cvt.s64.s32 %r111,%r86; .loc 1 65 5 cvt.s64.s32 %r112,%r85; sub.u64 %r113,%r111,%r112; st.u64 [%stack],%r113; cvta.const.u64 %r110110_gfortran_runtime_errorr66,[%r84+40]; .loc 1 71 6 setp.neu.f64 %r115,%r87,0d0000000000000000; @ %r115 bra $L6; .loc 1 73 10 ld.u64 %r41,[%r84]; .loc 1 73 25 mov.f64 %r116,0d3ff0000000000000; st.f64 [%r41],%r116; .loc 1 74 7 setp.eq.u32 %r117,%r83,0; @ %r117 bra $L1; shl.b64 %r74,%r66,3; add.u64 %r65,%r41,%r74; add.u32 %r69,%r83,1; .loc 1 74 14 mov.u32 %r68,1; .loc 1 75 34 mov.f64 %r118,0d0000000000000000; $L8: st.f64 [%r65],%r118; .loc 1 74 32 add.u32 %r68,%r68,1; .loc 1 74 7 add.u64 %r65,%r65,%r74; setp.ne.u32 %r119,%r68,%r69; @ %r119 bra $L8; bra $L1; $L6: .loc 1 79 1186(%value_in),jn,(%out_arg1,%out_arg2); ld.param.f64 %r122,[%value_in]; } mov.f64 %r40,%r122; .loc 1 80 21 cvt.s64.s32 %r123,%r83; .loc 1 80 25 mul.lo.u64 %r124,%r123,%r66; .loc 1 80 34 ld.u64 %r125,[%r84]; shl.b64 %r126,%r124,3; add.u64 %r127,%r125,%r126; st.f64 [%r127],%r40; .loc 1 82 6 setp.eq.u32 %r128,%r86,%r85; @ %r128 bra $L1; .loc 1 85 11 add.u32 %r131,%r86,-1131(%value_in),jn,(%out_arg1,%out_arg2); ld.param.f64 %r132,[%value_in]; } mov.f64 %r29,%r132; .loc 1 86 6 ld.u64 %r50,[%r84]; .loc 1 86 24 add.u32 %r51,%r83,-1; cvt.s64.s32 %r133,%r51; .loc 1 86 27 mul.lo.u64 %r53,%r133,%r66; .loc 1 86 36 shl.b64 %r134,%r53,3; add.u64 %r135,%r50,%r134; 
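// Editorial sketch (hedged): the $L27/$L29 loop of _gfortran_bessel_yn_r4 above
// (.loc 1 166-175) runs the recurrence upward, Y_{n+1}(x) = (2n/x)*Y_n(x) - Y_{n-1}(x),
// seeded from ynf(n1,x) and ynf(n1+1,x); the setp.neu.f32 test against 0fff800000 (-inf)
// propagates -infinity once an earlier order has overflowed. As above, the helper name
// and flat-array interface below are assumptions made for illustration.
#include <math.h>

/* Fill ret[0..n2-n1] with Y_n1(x) .. Y_n2(x), assuming n2 > n1 + 1 and x != 0.
   ynf() is the single-precision Bessel function of the second kind from libm.  */
static void bessel_yn_fill_up (float *ret, int n1, int n2, float x)
{
  float x2rev = 2.0f / x;                      /* div.rn.f32 %r69,%r120,%r77 above    */
  float last1 = ynf (n1, x);                   /* two lowest orders straight from libm */
  float last2 = ynf (n1 + 1, x);
  ret[0] = last1;
  ret[1] = last2;
  for (int i = 2; i <= n2 - n1; i++)
    {
      if (last2 == -INFINITY)                  /* setp.neu.f32 %r122,%r63,0fff800000  */
        ret[i] = -INFINITY;                    /* overflow propagates to higher orders */
      else
        {
          /* Y_{i+n1}(x) = (2*(i-1+n1)/x) * Y_{i-1+n1}(x) - Y_{i-2+n1}(x), the fma above. */
          ret[i] = x2rev * (float) (i - 1 + n1) * last2 - last1;
          last1 = last2;
          last2 = ret[i];
        }
    }
}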
st.f64 [%r135],%r29; .loc 1 88 10 add.u32 %r136,%r85,1; .loc 1 88 6 setp.eq.u32 %r137,%r136,%r86; @ %r137 bra $L1; .loc 1 91 9 mov.f64 %r138,0d4000000000000000; div.rn.f64 %r67,%r138,%r87; .loc 1 93 3 setp.le.s32 %r139,%r83,1; @ %r139 bra $L1; add.u32 %r64,%r51,%r85; neg.s64 %r141,%r66; shl.b64 %r142,%r141,3; sub.u64 %r143,%r53,%r66; shl.b64 %r144,%r143,3; add.u64 %r61,%r50,%r144; $L9: neg.f64 %r27,%r40; mov.f64 %r40,%r29; .loc 1 95 40 cvt.rn.f64.s32 %r145,%r64; mul.f64 %r146,%r145,%r67; .loc 1 95 59 fma.rn.f64 %r29,%r29,%r146,%r27; .loc 1 95 32 st.f64 [%r61],%r29; .loc 1 93 3 add.u32 %r64,%r64,-1; add.u64 %r61,%r61,%r142; setp.ne.u32 %r147,%r64,%r85; @ %r147 bra $L9; _gfortran_bessel_yn_r8 .visible .func _gfortran_bessel_yn_u32 %r76; .reg .f64 %r77; .reg .u6464pred %r109; .reg .f64predpred %r127; .reg .f64 %r128; mov.u64 %r74,%ar0; mov.u32 %r75,%ar1; mov.u32 %r76,%ar2; mov.f64 %r77,%ar3; .loc 1 119 6 ld.u64 %r78,[%r74]; setp.ne.u64 %r79,%r78,0; @ %r79 bra $L20; .loc 1 121 33 setp.lt.s32 %r80,%r76,%r75; @ %r80 bra $L30; .loc 1 121 37 sub.u32 %r81,%r76,%r75; .loc 1 121 40 add.u32 %r82,%r81,1; .loc 1 121 33 cvt.s64.s32 %r65,%r82; .loc 1 122 7 add.u64 %r70,%r65,-1; bra $L21; $L30: mov.u64 %r70,-1; .loc 1 121 33 mov.u64 %r65,%r78; $L21: .loc 1 122 7 mov.u64 %r83,0; st.u64 [%r74+48],%r83; st.u64 [%r74+56],%r70; mov.u64 %r84,1; st.u64 [%r74+40],%r84; .loc 1 123 24 mov.u64 6586; call (%value_in),_gfortrani_xmallocarray.loc 1 123 22 st.u64 [%r74],%r87; .loc 1 124 19 st.u64 [%r74+8],%r83; $L20: .loc 1 127 6 setp.ge.s32 %r90,%r76,%r75; @ ! %r90 bra $L19; .loc 1 130 7 cvta.global.u64 %r91,_gfortrani_compile_options; .loc 1 130 6 ld.u32 %r92,[%r91+36]; setp.eq.u32 %r93,%r92,0; @ %r93 bra $L23; .loc 1 131 10 ld.u64 %r95,[%r74+56]; add.u64 %r94,%r95,1; ld.u64 %r96,[%r74+48]; sub.u64 %r34,%r94,%r96; .loc 1 131 45 sub.u32 %r97,%r76,%r75; .loc 1 131 48 add.u32 %r98,%r97,1; cvt.s64.s32 %r99,%r98; .loc 1 131 7 setp.eq.u64 %r100,%r34,%r99; @ %r100 bra $L23; .loc 1 132 5 st.u64 [%stack+8],%r34; .loc 1 133 22 cvt.s64.s32 %r102,%r76; .loc 1 132 5 cvt.s64.s32 %r103,%r75; sub.u64 %r104,%r102,%r103; st.u64 [%stack],%r104; cvta.const.u64 %r101,$LC0; {_gfortran_runtime_error23: .loc 1 136 10 ld.u64 %r68,[%r74+40]; .loc 1 138 6 setp.neu.f64 %r106,%r77,0d0000000000000000; @ %r106 bra $L24; shl.b64 %r64,%r68,3; ld.u64 %r67,[%r74]; .loc 1 140 26 sub.u32 %r107,%r76,%r75; add.u32 %r50,%r107,1; .loc 1 140 14 mov.u32 %r42,0; .loc 1 142 34 mov.f64 %r108,0dfff0000000000000; $L25: st.f64 [%r67],%r108; .loc 1 140 32 add.u32 %r42,%r42,1; .loc 1 140 7 add.u64 %r67,%r67,%r64; setp.ne.u32 %r109,%r50,%r42; @ %r109 bra $L25; bra $L19; $L24: .loc 1 149 1175(%value_in),yn,(%out_arg1,%out_arg2); ld.param.f64 %r112,[%value_in]; } mov.f64 %r62,%r112; .loc 1 150 21 ld.u64 %r113,[%r74]; st.f64 [%r113],%r62; .loc 1 152 6 setp.eq.u32 %r114,%r76,%r75; @ %r114 bra $L19; .loc 1 155 11 add.u32 %r45,%r75,145(%value_in),yn,(%out_arg1,%out_arg2); ld.param.f64 %r117,[%value_in]; } mov.f64 %r63,%r117; .loc 1 156 17 shl.b64 %r48,%r68,3; ld.u64 %r118,[%r74]; add.u64 %r49,%r118,%r48; .loc 1 156 28 st.f64 [%r49],%r63; .loc 1 158 6 setp.eq.u32 %r119,%r45,%r76; @ %r119 bra $L19; .loc 1 161 9 mov.f64 %r120,0d4000000000000000; div.rn.f64 %r69,%r120,%r77; .loc 1 163 23 sub.u32 %r51,%r76,%r75; .loc 1 163 3 setp.eq.u32 %r121,%r51,1; @ %r121 bra $L19; add.u64 %r41,%r49,%r48; mov.u32 %r72,%r45; add.u32 %r25,%r51,%r75; .loc 1 168 29 mov.f64 %r128,0dfff0000000000000; $L29: .loc 1 166 10 setp.neu.f64 %r122,%r63,0dfff0000000000000; @ %r122 bra $L27; .loc 1 168 29 
st.f64 [%r41],%r128; bra $L28; $L27: .loc 1 173 37 cvt.rn.f64.s32 %r124,%r72; mul.f64 %r125,%r124,%r69; .loc 1 173 56 neg.f64 %r126,%r62; fma.rn.f64 %r58,%r125,%r63,%r126; .loc 1 173 29 st.f64 [%r41],%r58; mov.f64 %r62,%r63; .loc 1 175 10 mov.f64 %r63,%r58; $L28: .loc 1 163 3 add.u64 %r41,%r41,%r48; add.u32 %r72,%r72,1; setp.ne.u32 %r127,%r25,%r72; @ %r127 bra $L29; $L19: .loc 1 178 1 ret; } bessel_r10.o/ iall_i1.o/fortran/generated/iall_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_miall_i1 .visible .func _gfortran_miall_i1gfortran_siall_i1 .visible .func _gfortran_siall_i1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,76,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,76,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,105] = {73,65,76,76114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,76,76,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,73,65,76,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_iall_i1 .visible .func _gfortran_iall_480predpred %r209; .reg .pred %r210; .reg .u64 %r211; .reg .u64 %r213; .reg .u64pred %r220; .reg .pred %r222; .reg .u64pred %r247; .reg .u64 %r248; .reg .u32 %r249; .reg .u64 %r251; .reg .u64 %r252; .reg .u64 %r253; .reg .u64 %r254; .reg .u64 %r255; .reg .u16 %r257; .reg .u16predpred %r288; .reg .u64 %r291; .reg .u64 %r292; .reg .predpred %r304; .reg .predpredpred %r332; .reg .u64 %r333; .reg .pred %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .pred %r337; .reg .pred %r338; .reg .u32 %r340; .reg .u64 %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .u64 %r345; .reg .u64 %r347; .reg .u64 %r348; .reg .u16 %r349; mov.u64 %r171,%ar0; mov.u64 %r172,%ar1; mov.u64 %r173,%ar2; .loc 1 55 10 ld.s8 %r22,[%r172+28]; .loc 1 55 38 add.u32 %r174,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r174; .loc 1 56 10 ld.u64 %r25,[%r173]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r176,%r104,63; set.u32.lt.s64 %r178,%r103,%r104; neg.s32 %r179,%r178; cvt.u16.u64 %r182,%r176; cvt.u16.u32 %r183,%r179; or.b16 %r181,%r182,%r183; .loc 1 58 6 cvt.u32.u16 %r184,%r181; cvt.u16.u8 %r185,%r184; setp.eq.u16 %r186,%r185,0; @ %r186 bra $L2; .loc 1 60 7 cvt.u32.u32 %r189,%r22; cvt.s64.s8 %r188,%r189; st.u64 [%stack+8],%r18818187_gfortran_runtime_errorr192,%r104,%r104; add.u64 %r193,%r192,%r104; shl.b64 %r194,%r193,3; add.u64 %r195,%r172,%r194; ld.u64 %r31,[%r195+56]; ld.u64 
%r33,[%r195+48]; .loc 1 68 9 ld.u64 %r107,[%r195+40]; .loc 1 70 3 setp.ne.u64 %r209,%r104,0; @ %r209 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r210,%r103,%r104; @ %r210 bra $L4; bra $L46; $L3: add.u64 %r168,%r172,40; add.u64 %r165,%frame,120; add.u64 %r48,%frame,240; add.u64 %r211,%r172,16; add.u64 %r213,%r25,%r25; add.u64 %r214,%r213,%r25; shl.b64 %r215,%r214,3; add.u64 %r139,%r211,%r215; .loc 1 76 12 mov.u64 %r348,0; $L8: .loc 1 72 18 ld.u64 %r216,[%r168]; st.u64 [%r165],%r216; .loc 1 73 19 ld.u64 %r218,[%r168+16]; add.u64 %r217,%r218,1; ld.u64 %r219,[%r168+8]; sub.u64 %r38,%r217,%r219; .loc 1 75 10 setp.lt.s64 %r220,%r38,0; @ %r220 bra $L6; .loc 1 73 17 st.u64 [%r48],%r38; bra $L7; $L6: .loc 1 76 12 st.u64 [%r48],%r348; $L7: .loc 1 70 3 add.u64 %r168,%r168,24; add.u64 %r165,%r165,8; add.u64 %r48,%r48,8; setp.ne.u64 %r222,%r139,%r168; @ %r222 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r223,[%r171]; setp.eq.u64 %r224,%r223,0; @ ! %r224 bra $L11; bra $L10; $L4: add.u64 %r226,%r25,%r25; add.u64 %r227,%r226,%r25; shl.b64 %r228,%r227,3; add.u64 %r229,%r228,40; add.u64 %r99,%r172,%r229; shl.b64 %r230,%r25,3; add.u64 %r91,%r230,-8; add.u64 %r335,%frame,120; add.u64 %r93,%r335,%r91; add.u64 %r336,%frame,240; add.u64 %r77,%r336,%r91; cvt.u32.u32 %r234,%r22; cvt.s64.s8 %r233,%r234; add.u64 %r236,%r233,%r233; add.u64 %r237,%r236,%r233172,40; add.u64 %r121,%r238,%r239; .loc 1 84 12 mov.u64 %r347,0; $L14: .loc 1 80 18 ld.u64 %r240,[%r99]; st.u64 [%r93],%r240; .loc 1 81 19 ld.u64 %r242,[%r99+16]; add.u64 %r241,%r242,1; ld.u64 %r243,[%r99+8]; sub.u64 %r44,%r241,%r243; .loc 1 83 10 setp.lt.s64 %r244,%r44,0; @ %r244 bra $L12; .loc 1 81 17 st.u64 [%r77],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r77],%r347; $L13: .loc 1 78 3 add.u64 %r99,%r99,24; add.u64 %r93,%r93,8; add.u64 %r77,%r77,8; setp.ne.u64 %r246,%r99,%r121; @ %r246 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r247,%r103,0; @ %r247 bra $L16; add.u64 %r336,%frame,240; $L35: add.u64 %r137,%r171,40; mov.u64 %r132,%r336; cvt.u32.u32 %r249,%r22; cvt.s64.s8 %r248,%r249; add.u64 %r251,%r248,%r248; add.u64 %r252,%r251,%r248; shl.b64 %r253,%r252,3; add.u64 %r254,%r171,16; add.u64 %r100,%r253,%r254; .loc 1 94 10 mov.u64 %r95,1; .loc 1 98 4 mov.u64 %r277,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r255,0; st.u64 [%r171+8],%r255; .loc 1 103 28 cvt.u16.u32 %r258,%r22; add.u16 %r257,%r258,-1; cvt.u32.u16 %r259,%r257; st.u8 [%r171+28],%r259; .loc 1 105 20 add.u32 %r260,%r22,-2; cvt.s64.s32 %r54,%r260; add.u64 %r262,%r54,%r54; add.u64 %r263,%r262,%r54; shl.b64 %r264,%r263,3; add.u64 %r265,%r171,%r264; .loc 1 105 67 shl.b64 %r267,%r54,3; add.u64 %r268,%frame,%r267; .loc 1 105 59 ld.u64 %r270,[%r265+40]; ld.u64 %r271,[%r268+240]; mul.lo.u64 %r109,%r270,%r271; .loc 1 107 29 mov.u64 %r27273; call (%value_in),_gfortrani_xmallocarray274,[%value_in]; } .loc 1 107 27 st.u64 [%r171],%r274; .loc 1 108 10 setp.eq.u64 %r276,%r109,0; @ ! 
%r276 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r95,%r50,%r95; $L17: .loc 1 98 4 st.u64 [%r137+8],%r277; ld.u64 %r50,[%r132]; add.u64 %r278,%r50,-1; st.u64 [%r137+16],%r278; st.u64 [%r137],%r95; .loc 1 91 7 add.u64 %r137,%r137,24; add.u64 %r132,%r132,8; setp.ne.u64 %r279,%r100,%r137; @ %r279 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r171+48],%r109; mov.u64 %r281,-1; st.u64 [%r171+56],%r281; st.u64 [%r171+40],%r273; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r61,[%r171+28]; .loc 1 118 10 setp.eq.u64 %r283,%r61,%r103; @ %r283 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r61;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r286,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r287,[%r286+36]; setp.eq.u32 %r288,%r287,0; @ %r288 bra $L21; .loc 1 125 2 add.u64 %r336,%frame,240; cvta.const.u64 %r292,$LC2r3r29292; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r294,%r103,0; @ %r294 bra $L24; $L27: .loc 1 65 9 add.u64 %r295,%r31,1; .loc 1 65 7 sub.u64 %r105,%r295,%r33; max.s64 %r45,%r105,0; .loc 1 137 8 ld.u64 %r117,[%r172]; .loc 1 138 8 ld.u64 %r118,[%r171]; ld.u64 %r47,[%frame+360]; .loc 1 166 22 ld.u64 %r69,[%frame+120]; .loc 1 167 22 ld.u64 %r71,[%frame]; setp.gt.s64 %r337,%r105,0; setp.le.s64 %r338,%r103,1; cvt.u32.u32 %r340,%r22; cvt.s64.s8 %r341,%r340; add.u64 %r342,%r341,-1; add.u64 %r343,%frame,120; add.u64 %r344,%frame,240; .loc 1 173 13 mov.u64 %r345,0; bra $L25; $L24: add.u64 %r157,%frame,360; add.u64 %r156,%r171,40; mov.u64 %r154,%frame; cvt.u32.u32 %r297,%r22; cvt.s64.s8 %r296,%r297; add.u64 %r145,%r296,-1; .loc 1 129 3 mov.u64 %r119,0; add.u64 %r336,%frame,240; .loc 1 131 16 mov.u64 %r298,%r119; $L26: st.u64 [%r157],%r298; .loc 1 132 18 ld.u64 %r299,[%r156]; st.u64 [%r154],%r299; .loc 1 133 17 shl.b64 %r301,%r119,3; add.u64 %r302,%r336,%r301; .loc 1 133 10 ld.u64 %r303,[%r302]; setp.le.s64 %r304,%r303,0; @ %r304 bra $L1; .loc 1 129 26 add.u64 %r119,%r119,1; .loc 1 129 3 add.u64 %r157,%r157,8; add.u64 %r156,%r156,24; add.u64 %r154,%r154,8; setp.ne.u64 %r305,%r119,%r145; @ %r305 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r337 bra $L28; .loc 1 150 10 mov.u32 %r307,0; st.u8 [%r118],%r307; bra $L29; $L28: .loc 1 154 36 mov.u64 %r112,%r117; .loc 1 148 10 mov.u32 %r110,-1; .loc 1 154 13 mov.u64 %r111,0; $L30: .loc 1 158 10 ld.u8 %r349,[%r112]; mov.loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r107; .loc 1 154 6 setp.gt.s64 %r313,%r45,%r111; @ %r313 bra $L30; .loc 1 161 12 cvt.u32.u32 %r314,%r110; st.u8 [%r118],%r314; $L29: .loc 1 165 15 add.u64 %r47,%r47,1; .loc 1 166 12 add.u64 %r117,%r117,%r69; .loc 1 167 12 add.u64 %r118,%r118,%r71; .loc 1 169 32 ld.u64 %r126,[%frame+240]; .loc 1 169 13 setp.ne.u64 %r315,%r47,%r126; @ %r315 bra $L25; .loc 1 176 23 mul.lo.u64 %r316,%r69,%r47; .loc 1 176 9 sub.u64 %r114,%r117,%r316; .loc 1 177 23 mul.lo.u64 %r317,%r71,%r47; .loc 1 177 9 sub.u64 %r115,%r118,%r317; .loc 1 179 7 @ %r338 bra $L1; add.u64 %r143,%frame,368; mov.u64 %r102,8; .loc 1 178 5 mov.u64 %r116,1; bra $L33; $L34: .loc 1 173 13 st.u64 [%r143],%r345; .loc 1 176 23 mul.lo.u64 %r323,%r84,%r83; .loc 1 176 9 sub.u64 %r114,%r117,%r323; .loc 1 177 23 mul.lo.u64 %r324,%r86,%r83; .loc 1 177 9 sub.u64 %r115,%r118,%r324; .loc 1 178 5 add.u64 %r116,%r116,1; .loc 1 179 7 add.u64 %r143,%r143,8; add.u64 %r102,%r102,8; setp.eq.u64 %r325,%r116,%r342; @ %r325 bra $L1; $L33: .loc 1 187 16 ld.u64 %r326,[%r143]; add.u64 %r83,%r326,1; st.u64 [%r143],%r83; .loc 1 188 23 add.u64 %r328,%r343,%r102; ld.u64 
%r84,[%r328]; .loc 1 188 13 add.u64 %r117,%r114,%r84; .loc 1 189 23 add.u64 %r329,%frame,%r102; ld.u64 %r86,[%r329]; .loc 1 189 13 add.u64 %r118,%r115,%r86; .loc 1 169 32 add.u64 %r331,%r344,%r102; ld.u64 %r88,[%r331]; .loc 1 169 13 setp.eq.u64 %r332,%r83,%r88; @ %r332 bra $L34; .loc 1 173 13 mov.u64 %r47,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r333,[%r171]; setp.eq.u64 %r334,%r333,0; @ ! %r334 bra $L11; bra $L35; $L1: .loc 1 19_gfortran_miall_i1 .visible .func _gfortran_miall_608u32 %r215; .reg .u32 %r216; .reg .u16u16 %r255; .reg .u32 %r256; .reg .u16 %r257; .reg .pred %r258; .reg .predpred %r274; .reg .u64 %r275; .reg .u64 %r277; .reg .u64 %r279; .reg .u64 %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64pred %r297; .reg .u64 %r298; .reg .predu64 %r305; .reg .u32 %r306; .reg .u64u64 %r313; .reg .u64pred %r323; .reg .pred %r324; .reg .u64 %r325; .reg .u32 %r326; .reg .u64u64 %r331; .reg .u32 %r332; .reg .u64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u64u16 %r347; .reg .u32 %r348; .reg .pred %r349; .reg .u64 %r350; .reg .u64 %r351; .reg .predu16 %r388; .reg .u32 %r389; .reg .predu64 %r436; .reg .u16 %r437; mov.u64 %r203,%ar0; mov.u64 %r204,%ar1; mov.u64 %r205,%ar2; mov.u64 %r206,%ar3; .loc 1 223 6 setp.ne.u64 %r207,%r206,0; @ %r207 bra $L490r204205; call _gfortran_iall_i1,(%out_arg1,%out_arg2,%out_arg3); } .loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r205]; .loc 1 233 7 add.u64 %r132,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r204+28]; .loc 1 234 38 add.u32 %r211,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r133,%r211; .loc 1 237 7 shr.u64 %r213,%r132,63; set.u32.gt.s64 %r215,%r132,%r133; neg.s32 %r216,%r215; cvt.u16.u64 %r219,%r213; cvt.u16.u32 %r220,%r216; or.b16 %r218,%r219,%r220; .loc 1 237 6 cvt.u32.u16 %r221,%r218; cvt.u16.u8 %r222,%r221; setp.eq.u16 %r223,%r222,0; @ %r223 bra $L51; .loc 1 239 7 cvt.u32.u32 %r226,%r23; cvt.s64.s8 %r225,%r226; st.u64 [%stack+8],%r225; st.u64 [%stack],%r22; cvta.const.u64 %r2242stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r425,%r132,%r132; add.u64 %r424,%r425,%r132; shl.b64 %r231,%r424,3; add.u64 %r232,%r204,%r231; ld.u64 %r235,[%r232+56]; add.u64 %r234,%r235,1; .loc 1 244 7 ld.u64 %r242,[%r232+48]; sub.u64 %r134,%r234,%r242; .loc 1 245 6 setp.le.s64 %r243,%r134,0; @ %r243 bra $L48; .loc 1 248 9 ld.u64 %r146,[%r206]; .loc 1 250 15 ld.u64 %r34,[%r206+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r244,%r35,-4; and.b32 %r245,%r244,-5; set.u32.eq.u32 %r247,%r245,0; neg.s32 %r248,%r247; .loc 1 252 22 add.u32 %r249,%r35,-1; set.u32.le.u32 %r251,%r249,1; neg.s32 %r252,%r251; .loc 1 252 6 cvt.u16.u32 %r254,%r248; cvt.u16.u32 %r255,%r252; or.b16 %r253,%r254,%r255; cvt.u32.u16 %r256,%r253; cvt.u16.u8 %r257,%r256; setp.ne.u16 %r258,%r257,0; @ %r258 bra $L53; .loc 1 254 7 setp.ne.u32 %r260,%r35,16; @ %r260 bra $L54; $L53: .loc 1 261 9 shl.b64 %r264,%r424,3; add.u64 %r265,%r204,%r264; ld.u64 %r137,[%r265+40]; .loc 1 262 12 add.u64 %r271,%r206,%r264; ld.u64 %r42,[%r271+40]; .loc 1 264 3 setp.ne.u64 %r273,%r132,0; @ %r273 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r274,%r132,%r133; @ %r274 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r275275_gfortran_runtime_error5: add.u64 %r187,%r204,40; add.u64 %r189,%r206,40; add.u64 %r277,%r204,16; add.u64 %r279,%r22,%r22; add.u64 %r280,%r279,%r22; shl.b64 %r281,%r280,3; add.u64 %r200,%r277,%r281; .loc 1 264 3 mov.u64 %r191,0; add.u64 %r422,%frame,240; add.u64 %r421,%frame,360; .loc 1 271 12 mov.u64 %r436,%r191; $L60: .loc 1 266 18 add.u64 %r283,%r422,%r191; ld.u64 
%r284,[%r187]; st.u64 [%r283],%r284; .loc 1 267 18 add.u64 %r285,%frame,%r191; .loc 1 267 20 ld.u64 %r287,[%r189]; mul.lo.u64 %r286,%r287,%r34; .loc 1 267 18 st.u64 [%r285],%r286; .loc 1 268 19 ld.u64 %r289,[%r187+16]; add.u64 %r288,%r289,1; ld.u64 %r290,[%r187+8]; sub.u64 %r53,%r288,%r290; .loc 1 270 10 setp.lt.s64 %r291,%r53,0; @ %r291 bra $L58; .loc 1 268 17 add.u64 %r293,%r421,%r191; st.u64 [%r293],%r53; bra $L59; $L58: .loc 1 271 12 add.u64 %r295,%r421,%r191; st.u64 [%r295],%r436; $L59: .loc 1 264 3 add.u64 %r187,%r187,24; add.u64 %r189,%r189,24; add.u64 %r191,%r191,8; setp.ne.u64 %r297,%r187,%r200; @ %r297 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r298,[%r203]; setp.eq.u64 %r299,%r298,0; @ ! %r299 bra $L63; bra $L62; $L56: add.u64 %r301,%r22,%r22; add.u64 %r302,%r301,%r22; shl.b64 %r303,%r302,3; add.u64 %r152,%r303,40; add.u64 %r58,%r204,%r152; add.u64 %r154,%r206,%r152; shl.b64 %r304,%r22,3; add.u64 %r100,%r304,-8; cvt.u32.u32 %r306,%r23; cvt.s64.s8 %r305,%r306; shl.b64 %r307,%r305,3; add.u64 %r184,%r307,-8; add.u64 %r422,%frame,240; add.u64 %r421,%frame,360; .loc 1 281 12 mov.u64 %r435,0; $L66: .loc 1 276 18 add.u64 %r309,%r422,%r100; ld.u64 %r310,[%r58]; st.u64 [%r309],%r310; .loc 1 277 18 add.u64 %r311,%frame,%r100; .loc 1 277 20 ld.u64 %r313,[%r154]; mul.lo.u64 %r312,%r313,%r34; .loc 1 277 18 st.u64 [%r311],%r312; .loc 1 278 19 ld.u64 %r315,[%r58+16]; add.u64 %r314,%r315,1; ld.u64 %r316,[%r58+8]; sub.u64 %r64,%r314,%r316; .loc 1 280 10 setp.lt.s64 %r317,%r64,0; @ %r317 bra $L64; .loc 1 278 17 add.u64 %r319,%r421,%r100; st.u64 [%r319],%r64; bra $L65; $L64: .loc 1 281 12 add.u64 %r321,%r421,%r100; st.u64 [%r321],%r435; $L65: .loc 1 274 3 add.u64 %r58,%r58,24; add.u64 %r154,%r154,24; add.u64 %r100,%r100,8; setp.ne.u64 %r323,%r100,%r184; @ %r323 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r324,%r133,0; @ %r324 bra $L68; add.u64 %r421,%frame,360; $L86: add.u64 %r124,%r203,40; mov.u64 %r118,%r421; cvt.u32.u32 %r326,%r23; cvt.s64.s8 %r325,%r326; add.u64 %r328,%r325,%r325; add.u64 %r329,%r328,%r325; shl.b64 %r330,%r329,3; add.u64 %r331,%r203,16; add.u64 %r72,%r330,%r331; .loc 1 291 10 mov.u64 %r120,1; .loc 1 295 4 mov.u64 %r350,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r332,%r23,-2; cvt.s64.s32 %r70,%r332; add.u64 %r334,%r70,%r70; add.u64 %r335,%r334,%r70; shl.b64 %r336,%r335,3; add.u64 %r337,%r203,%r336; .loc 1 299 67 shl.b64 %r339,%r70,3; add.u64 %r340,%frame,%r339; .loc 1 299 59 ld.u64 %r342,[%r337+40]; ld.u64 %r343,[%r340+360]; mul.lo.u64 %r140,%r342,%r343; .loc 1 301 24 mov.u64 %r344,0; st.u64 [%r203+8],%r344; .loc 1 302 28 cvt.u16.u32 %r347,%r23; add.u16 %r346,%r347,-1; cvt.u32.u16 %r348,%r346; st.u8 [%r203+28],%r348; .loc 1 304 10 setp.eq.u64 %r349,%r140,0; @ %r349 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r120,%r67,%r120; $L69: .loc 1 295 4 st.u64 [%r124+8],%r350; ld.u64 %r67,[%r118]; add.u64 %r351,%r67,-1; st.u64 [%r124+16],%r351; st.u64 [%r124],%r120; .loc 1 288 7 add.u64 %r124,%r124,24; add.u64 %r118,%r118,8; setp.ne.u64 %r352,%r72,%r124; @ %r352 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r203+48],%r140; mov.u64 %r354,-1; st.u64 [%r203+56],%r354; mov.u64 %r355,1; st.u64 [%r203+40],%r355; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r35714357; call (%value_in),_gfortrani_xmallocarray58,[%value_in]; } .loc 1 311 22 st.u64 [%r203],%r358; bra $L73; $L63: .loc 1 316 19 ld.s8 %r360,[%r203+28]; .loc 1 316 10 setp.eq.u64 %r361,%r360,%r133; @ %r361 bra $L74; .loc 1 317 2 cvta.const.u64 %r362362_gfortran_runtime_error74: .loc 1 319 11 
cvta.global.u64 %r364,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r365,[%r364+36]; setp.eq.u32 %r366,%r365,0; @ %r366 bra $L73; .loc 1 321 4 add.u64 %r421,%frame,360; cvta.const.u64 %r370,$LC2369370; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r374,$LC6; {374370; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r376,%r133,0; @ %r376 bra $L75; $L78: .loc 1 336 8 ld.u64 %r147,[%r203]; .loc 1 337 8 ld.u64 %r145,[%r204]; .loc 1 339 9 setp.ne.u64 %r377,%r145,0; @ %r377 bra $L76; bra $L48; $L75: add.u64 %r170,%frame,480; add.u64 %r167,%r203,40; add.u64 %r160,%frame,120; cvt.u32.u32 %r379,%r23; cvt.s64.s8 %r378,%r379; add.u64 %r125,%r378,-1; .loc 1 328 3 mov.u64 %r151,0; add.u64 %r421,%frame,360; .loc 1 330 16 mov.u64 %r380,%r151; $L77: st.u64 [%r170],%r380; .loc 1 331 18 ld.u64 %r381,[%r167]; st.u64 [%r160],%r381; .loc 1 332 17 shl.b64 %r383,%r151,3; add.u64 %r384,%r421,%r383; .loc 1 332 10 ld.u64 %r385,[%r384]; setp.le.s64 %r386,%r385,0; @ %r386 bra $L48; .loc 1 328 26 add.u64 %r151,%r151,1; .loc 1 328 3 add.u64 %r170,%r170,8; add.u64 %r167,%r167,24; add.u64 %r160,%r160,8; setp.ne.u64 %r387,%r125,%r151; @ %r387 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r44,%r42,%r34; ld.u64 %r173,[%frame+480]; .loc 1 359 22 ld.u64 %r86,[%frame+240]; .loc 1 360 23 ld.u64 %r88,[%frame]; .loc 1 361 22 ld.u64 %r90,[%frame+120]; setp.le.s64 %r426,%r133,1; .loc 1 348 10 mov.u32 %r428,-1; cvt.u32.u32 %r429,%r23; cvt.s64.s8 %r430,%r429; add.u64 %r431,%r430,-1; add.u64 %r432,%frame,240; add.u64 %r433,%frame,360; add.u64 %r434,%frame,120; $L87: .loc 1 328 3 mov.u64 %r150,%r146; mov.u64 %r149,%r145; .loc 1 348 10 mov.u32 %r122,%r428; .loc 1 349 9 mov.u64 %r148,0; $L80: .loc 1 352 6 ld.s8 %r389,[%r150]; cvt.u16.u32 %r388,%r389; setp.eq.u16 %r390,%r388,0; @ %r390 bra $L79; .loc 1 353 12 ld.u8 %r437,[%r149]; mov$L79: .loc 1 349 24 add.u64 %r148,%r148,1; .loc 1 349 32 add.u64 %r149,%r149,%r137; .loc 1 349 47 add.u64 %r150,%r150,%r44; .loc 1 349 2 setp.ne.u64 %r396,%r134,%r148; @ %r396 bra $L80; .loc 1 355 8 cvt.u32.u32 %r397,%r122; st.u8 [%r147],%r397; .loc 1 358 15 add.u64 %r173,%r173,1; .loc 1 359 12 add.u64 %r145,%r145,%r86; .loc 1 360 13 add.u64 %r146,%r146,%r88; .loc 1 361 12 add.u64 %r147,%r147,%r90; .loc 1 363 32 ld.u64 %r80,[%frame+360]; .loc 1 363 13 setp.ne.u64 %r398,%r80,%r173; @ %r398 bra $L87; .loc 1 370 23 mul.lo.u64 %r94,%r173,%r86; .loc 1 371 24 mul.lo.u64 %r399,%r173,%r88; .loc 1 371 10 sub.u64 %r142,%r146,%r399; .loc 1 372 23 mul.lo.u64 %r400,%r173,%r90; .loc 1 372 9 sub.u64 %r143,%r147,%r400; .loc 1 374 7 @ ! 
%r426 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r40],%r427; .loc 1 370 23 mul.lo.u64 %r94,%r106,%r105; .loc 1 371 24 mul.lo.u64 %r403,%r108,%r105; .loc 1 371 10 sub.u64 %r142,%r146,%r403; .loc 1 372 23 mul.lo.u64 %r404,%r110,%r105; .loc 1 372 9 sub.u64 %r143,%r147,%r404; .loc 1 373 5 add.u64 %r144,%r144,1; .loc 1 374 7 add.u64 %r40,%r40,8; add.u64 %r164,%r164,8; setp.ne.u64 %r405,%r144,%r431; @ %r405 bra $L84; bra $L48; $L108: add.u64 %r40,%frame,488; mov.u64 %r164,8; .loc 1 373 5 mov.u64 %r144,1; .loc 1 367 13 mov.u64 %r427,0; $L84: .loc 1 382 16 ld.u64 %r409,[%r40]; add.u64 %r105,%r409,1; st.u64 [%r40],%r105; .loc 1 383 23 add.u64 %r411,%r432,%r164; ld.u64 %r106,[%r411]; .loc 1 383 13 sub.u64 %r412,%r106,%r94; add.u64 %r145,%r145,%r412; .loc 1 384 24 add.u64 %r413,%frame,%r164; ld.u64 %r108,[%r413]; .loc 1 384 14 add.u64 %r146,%r142,%r108; .loc 1 385 23 add.u64 %r415,%r434,%r164; ld.u64 %r110,[%r415]; .loc 1 385 13 add.u64 %r147,%r143,%r110; .loc 1 363 32 add.u64 %r417,%r433,%r164; ld.u64 %r113,[%r417]; .loc 1 363 13 setp.eq.u64 %r418,%r105,%r113; @ %r418 bra $L85; .loc 1 367 13 mov.u64 %r173,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r419,[%r203]; setp.eq.u64 %r420,%r419,0; @ ! %r420 bra $L63; bra $L86; $L48:_gfortran_siall_i1 .visible .func _gfortran_siall_68predpred %r184; .reg .predpred %r206; .reg .predu32 %r211; .reg .u64 %r213; .reg .u64u32 %r222; .reg .u64pred %r264; .reg .u64 %r266; .reg .u32 %r270; .reg .u64 %r272; .reg .u64 %r274; .reg .predpredu64 %r297; mov.u64 %r148,%ar0; mov.u64 %r149,%ar1; mov.u64 %r150,%ar2; mov.u64 %r151,%ar3; .loc 1 412 6 setp.eq.u64 %r152,%r151,0; @ %r152 bra $L110; .loc 1 412 20 ld.u32 %r153,[%r151]; setp.eq.u32 %r154,%r153,0; @ %r154 bra $L111; $L110: .loc 1 417 7gfortran_ial.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r150]; .loc 1 422 7 add.u64 %r86,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r149+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r291,%r26; .loc 1 425 7 shr.u64 %r159,%r86,63; set.u32.gt.s64 %r161,%r86,%r291; neg.s32 %r162,%r161; cvt.u16.u64 %r165,%r159; cvt.u16.u32 %r166,%r162; or.b16 %r164,%r165,%r166; .loc 1 425 6 cvt.u32.u16 %r167,%r164; cvt.u16.u8 %r168,%r167; setp.ne.u16 %r169,%r168,0; @ %r169 bra $L113; .loc 1 432 3 setp.ne.u64 %r170,%r86,0; @ %r170 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r171,%r86,%r291; @ %r171 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r174,%r24; cvt.s64.s8 %r173,%r174; st.u64 [%stack+8],%r173172172_gfortran_runtime_error114: add.u64 %r135,%r149,48; add.u64 %r137,%frame,120; add.u64 %r138,%r149,56; add.u64 %r176,%r149,24; add.u64 %r178,%r23,%r23; add.u64 %r179,%r178,%r23; shl.b64 %r180,%r179,3; add.u64 %r144,%r176,%r180; .loc 1 437 12 mov.u64 %r297,0; $L119: .loc 1 434 19 ld.u64 %r182,[%r138]; add.u64 %r181,%r182,1; ld.u64 %r183,[%r135]; sub.u64 %r36,%r181,%r183; .loc 1 436 10 setp.le.s64 %r184,%r36,0; @ %r184 bra $L117; .loc 1 434 17 st.u64 [%r137],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r137],%r297; $L118: .loc 1 432 3 add.u64 %r135,%r135,24; add.u64 %r137,%r137,8; add.u64 %r138,%r138,24; setp.ne.u64 %r186,%r135,%r144; @ %r186 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r96,[%r148]; .loc 1 449 6 setp.eq.u64 %r187,%r96,0; @ ! 
%r187 bra $L122; bra $L121; $L115: add.u64 %r189,%r23,%r23; add.u64 %r190,%r189,%r23; shl.b64 %r191,%r190,3; add.u64 %r192,%r191,48; add.u64 %r57,%r149,%r192; add.u64 %r290,%frame,120; shl.b64 %r194,%r23,3; add.u64 %r195,%r194,-8; add.u64 %r67,%r290,%r195; add.u64 %r196,%r149,48; cvt.u32.u32 %r198,%r24; cvt.s64.s8 %r197,%r198; add.u64 %r200,%r197,%r197; add.u64 %r201,%r200,%r197; shl.b64 %r202,%r201,3; add.u64 %r131,%r196,%r202; .loc 1 446 12 mov.u64 %r296,0; $L125: .loc 1 443 2 ld.u64 %r204,[%r57+8]; add.u64 %r203,%r204,1; ld.u64 %r205,[%r57]; sub.u64 %r41,%r203,%r205; .loc 1 445 10 setp.le.s64 %r206,%r41,0; @ %r206 bra $L123; .loc 1 442 17 st.u64 [%r67],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r67],%r296; $L124: .loc 1 440 3 add.u64 %r57,%r57,24; add.u64 %r67,%r67,8; setp.ne.u64 %r208,%r57,%r131; @ %r208 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r209,%r291,0; @ %r209 bra $L127; add.u64 %r290,%frame,120; $L147: add.u64 %r102,%r148,40; mov.u64 %r84,%r290; cvt.u32.u32 %r211,%r24; cvt.s64.s8 %r210,%r211; add.u64 %r213,%r210,%r210; add.u64 %r214,%r213,%r210; shl.b64 %r215,%r214,3; add.u64 %r216,%r148,16; add.u64 %r60,%r215,%r216; .loc 1 456 10 mov.u64 %r82,1; .loc 1 460 4 mov.u64 %r235,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r217,0; st.u64 [%r148+8],%r217; .loc 1 465 28 cvt.u16.u32 %r220,%r24; add.u16 %r219,%r220,-1; cvt.u32.u16 %r221,%r219; st.u8 [%r148+28],%r221; .loc 1 467 20 add.u32 %r222,%r24,-2; cvt.s64.s32 %r50,%r222; add.u64 %r224,%r50,%r50; add.u64 %r225,%r224,%r50; shl.b64 %r226,%r225,3; add.u64 %r227,%r148,%r226; .loc 1 467 67 shl.b64 %r229,%r50,3; add.u64 %r230,%frame,%r229; .loc 1 467 59 ld.u64 %r232,[%r227+40]; ld.u64 %r233,[%r230+120]; mul.lo.u64 %r93,%r232,%r233; .loc 1 469 10 setp.eq.u64 %r234,%r93,0; @ %r234 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r82,%r46,%r82; $L128: .loc 1 460 4 st.u64 [%r102+8],%r235; ld.u64 %r46,[%r84]; add.u64 %r236,%r46,-1; st.u64 [%r102+16],%r236; st.u64 [%r102],%r82; .loc 1 453 7 add.u64 %r102,%r102,24;131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r148+48],%r93; mov.u64 %r239,-1; st.u64 [%r148+56],%r239; mov.u64 %r240,1; st.u64 [%r148+40],%r240; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r242(%value_in),_gfortrani_xmallocarray3,[%value_in]; } mov.u64 %r96,%r243; .loc 1 476 22 st.u64 [%r148],%r96; $L135: .loc 1 502 3 setp.ne.u64 %r245,%r291,0; @ %r245 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r148+28]; .loc 1 480 10 setp.eq.u64 %r246,%r55,%r291; @ %r246 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r291; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r249,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r250,[%r249+36]; setp.eq.u32 %r251,%r250,0; @ %r251 bra $L135; .loc 1 488 4 setp.eq.u64 %r252,%r291,0; @ %r252 bra $L133; add.u64 %r31,%r148,48; add.u64 %r125,%frame,120; cvt.u32.u32 %r254,%r24; cvt.s64.s8 %r253,%r254; add.u64 %r92,%r253,-1; .loc 1 488 10 mov.u64 %r101,0; $L137: .loc 1 492 21 ld.u64 %r256,[%r31+8]; add.u64 %r255,%r256,1; .loc 1 492 19 ld.u64 %r257,[%r31]; sub.u64 %r91,%r255,%r257; .loc 1 493 18 ld.u64 %r64,[%r125]; .loc 1 488 25 add.u64 %r101,%r101,1; .loc 1 493 11 setp.eq.u64 %r258,%r64,%r91; @ %r258 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r64; st.u64 [%stack+8],%r91; st.u64 [%stack],%r101; cvta.const.u64 %r22_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r31,%r31,24; add.u64 %r125,%r125,8; setp.ne.u64 %r261,%r92,%r101; @ %r261 bra $L137; $L132: .loc 1 504 16 cvt.u16.u32 %r263,%r24; setp.le.s16 %r264,%r263,1; @ %r264 bra $L138; shl.b64 
%r262,%r291,3; bra $L139; $L138: mov.u64 %r262,8; $L139: add.u64 %r266,%frame,240; mov.u32 %r27026272,[%value_in]; } add.u64 %r111,%r148,40; mov.u64 %r109,%frame; mov.u64 %r98,0; $L140: .loc 1 505 18 ld.u64 %r274,[%r111]; st.u64 [%r109],%r274; .loc 1 502 26 add.u64 %r98,%r98,1; .loc 1 502 3 add.u64 %r111,%r111,24; add.u64 %r109,%r109,8; setp.gt.s64 %r275,%r291,%r98; @ %r275 bra $L140; $L133: ld.u64 %r103,[%frame+240]; .loc 1 514 22 ld.u64 %r69,[%frame]; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r81,%r59,%r69; setp.le.s64 %r289,%r291,1; .loc 1 512 13 mov.u32 %r276,0; cvt.u32.u32 %r293,%r24; cvt.s64.s8 %r294,%r293; add.u64 %r295,%r294,-1; $L146: st.u8 [%r96],%r276; .loc 1 513 15 add.u64 %r103,%r103,1; .loc 1 514 12 add.u64 %r96,%r96,%r69; .loc 1 516 13 setp.ne.u64 %r277,%r59,%r103; @ %r277 bra $L146; .loc 1 525 7 @ ! %r289 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r97],%r292; .loc 1 523 23 mul.lo.u64 %r71,%r74,%r73; .loc 1 524 5 add.u64 %r95,%r95,1; .loc 1 525 7 add.u64 %r97,%r97,8; add.u64 %r119,%r119,8; add.u64 %r118,%r118,8; setp.ne.u64 %r280,%r95,%r295; @ %r280 bra $L144; bra $L109; $L171: add.u64 %r97,%frame,248; add.u64 %r119,%frame,8; add.u64 %r118,%frame,128; .loc 1 523 23 mov.u64 %r71,%r81; .loc 1 524 5 mov.u64 %r95,1; .loc 1 520 13 mov.u64 %r292,0; $L144: .loc 1 529 16 ld.u64 %r285,[%r97]; add.u64 %r73,%r285,1; st.u64 [%r97],%r73; .loc 1 530 23 ld.u64 %r74,[%r119]; .loc 1 530 13 sub.u64 %r286,%r74,%r71; add.u64 %r96,%r96,%r286; .loc 1 516 32 ld.u64 %r77,[%r118]; .loc 1 516 13 setp.eq.u64 %r287,%r73,%r77; @ %r287 bra $L145; .loc 1 520 13 mov.u64 %r103,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r96,[%r148]; .loc 1 449 6 setp.eq.u64 %r288,%r96,0; @ ! %r288 bra $L122; bra $L147; $L109: .loc 1 534 1 ret; } iall_i2.o/fortran/generated/iall_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_miall_i2 .visible .func _gfortran_miall_i2gfortran_siall_i2 .visible .func _gfortran_siall_i2VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,76,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,76,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,105] = {73,65,76,76114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,76,76,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,73,65,76,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_iall_i2 .visible .func _gfortran_iall_480u64 %r52; .reg 
.u64 %r58; .reg .u64 %r61; .reg .u649pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64predmov.u64 %r179,%ar0; mov.u64 %r180,%ar1; mov.u64 %r181,%ar2; .loc 1 55 10 ld.s8 %r22,[%r180+28]; .loc 1 55 38 add.u32 %r182,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r182; .loc 1 56 10 ld.u64 %r25,[%r181]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r184,%r104,63; set.u32.lt.s64 %r186,%r103,%r104; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 58 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 60 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r104,%r104; add.u64 %r201,%r200,%r104; shl.b64 %r202,%r201,3; add.u64 %r203,%r180,%r202; ld.u64 %r30,[%r203+56]; ld.u64 %r32,[%r203+48]; .loc 1 68 9 ld.u64 %r107,[%r203+40]; .loc 1 70 3 setp.ne.u64 %r217,%r104,0; @ %r217 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r218,%r103,%r104; @ %r218 bra $L4; bra $L46; $L3: add.u64 %r61,%r180,40; add.u64 %r40,%frame,120; add.u64 %r91,%frame,240; add.u64 %r219,%r180,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r174,%r219,%r223; .loc 1 76 12 mov.u64 %r363,0; $L8: .loc 1 72 18 ld.u64 %r224,[%r61]; st.u64 [%r40],%r224; .loc 1 73 19 ld.u64 %r226,[%r61+16]; add.u64 %r225,%r226,1; ld.u64 %r227,[%r61+8]; sub.u64 %r37,%r225,%r227; .loc 1 75 10 setp.lt.s64 %r228,%r37,0; @ %r228 bra $L6; .loc 1 73 17 st.u64 [%r91],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r91],%r363; $L7: .loc 1 70 3 add.u64 %r61,%r61,24; add.u64 %r40,%r40,8; add.u64 %r91,%r91,8; setp.ne.u64 %r230,%r61,%r174; @ %r230 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r231,[%r179]; setp.eq.u64 %r232,%r231,0; @ ! 
%r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r143,%r180,%r237; shl.b64 %r238,%r25,3; add.u64 %r125,%r238,-8; add.u64 %r351,%frame,120; add.u64 %r128,%r351,%r125; add.u64 %r350,%frame,240; add.u64 %r121,%r350,%r125; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r180,40; add.u64 %r95,%r246,%r247; .loc 1 84 12 mov.u64 %r362,0; $L14: .loc 1 80 18 ld.u64 %r248,[%r143]; st.u64 [%r128],%r248; .loc 1 81 19 ld.u64 %r250,[%r143+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r143+8]; sub.u64 %r44,%r249,%r251; .loc 1 83 10 setp.lt.s64 %r252,%r44,0; @ %r252 bra $L12; .loc 1 81 17 st.u64 [%r121],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r121],%r362; $L13: .loc 1 78 3 add.u64 %r143,%r143,24; add.u64 %r128,%r128,8; add.u64 %r121,%r121,8; setp.ne.u64 %r254,%r95,%r143; @ %r254 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r255,%r103,0; @ %r255 bra $L16; add.u64 %r350,%frame,240; $L34: add.u64 %r152,%r179,40; mov.u64 %r150,%r350; add.u64 %r256,%r179,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r144,%r256,%r262; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r263,0; st.u64 [%r179+8],%r263; .loc 1 103 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r179+28],%r267; .loc 1 105 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r52,%r268; add.u64 %r270,%r52,%r52; add.u64 %r271,%r270,%r52; shl.b64 %r272,%r271,3; add.u64 %r273,%r179,%r272; .loc 1 105 67 shl.b64 %r275,%r52,3; add.u64 %r276,%frame,%r275; .loc 1 105 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r109,%r278,%r279; .loc 1 107 29 mov.u64 %r281,281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 107 27 st.u64 [%r179],%r282; .loc 1 108 10 setp.eq.u64 %r284,%r109,0; @ ! 
%r284 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r48,%r99; $L17: .loc 1 98 4 st.u64 [%r152+8],%r285; ld.u64 %r48,[%r150]; add.u64 %r286,%r48,-1; st.u64 [%r152+16],%r286; st.u64 [%r152],%r99; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r287,%r144,%r152; @ %r287 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r179+48],%r109; mov.u64 %r289,-1; st.u64 [%r179+56],%r289; mov.u64 %r290,1; st.u64 [%r179+40],%r290; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r179+28]; .loc 1 118 10 setp.eq.u64 %r291,%r58,%r103; @ %r291 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r294,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r295,[%r294+36]; setp.eq.u32 %r296,%r295,0; @ %r296 bra $L21; .loc 1 125 2 add.u64 %r350,%frame,240; cvta.const.u64 %r300,$LC2r350299300; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r302,%r103,0; @ %r302 bra $L24; $L27: .loc 1 65 9 add.u64 %r303,%r30,1; .loc 1 65 7 sub.u64 %r105,%r303,%r32; max.s64 %r45,%r105,0; .loc 1 137 8 ld.u64 %r116,[%r180]; .loc 1 138 8 ld.u64 %r117,[%r179]; .loc 1 154 36 add.u64 %r66,%r107,%r107; ld.u64 %r138,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 add.u64 %r69,%r68,%r68; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 add.u64 %r72,%r71,%r71; .loc 1 169 32 ld.u64 %r123,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r307,%r68,%r123; .loc 1 176 9 add.u64 %r308,%r307,%r307; neg.s64 %r127,%r308; .loc 1 177 23 mul.lo.u64 %r309,%r71,%r123; .loc 1 177 9 add.u64 %r310,%r309,%r309; neg.s64 %r130,%r310; setp.gt.s64 %r352,%r105,0; setp.le.s64 %r353,%r103,1; cvt.u32.u32 %r355,%r22; cvt.s64.s8 %r356,%r355; add.u64 %r357,%r356,-1; add.u64 %r358,%frame,120; add.u64 %r359,%frame,240; .loc 1 173 13 mov.u64 %r360,0; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r179,40; mov.u64 %r162,%frame; cvt.u32.u32 %r312,%r22; cvt.s64.s8 %r311,%r312; add.u64 %r153,%r311,-1; .loc 1 129 3 mov.u64 %r118,0; add.u64 %r350,%frame,240; .loc 1 131 16 mov.u64 %r313,%r118; $L26: st.u64 [%r165],%r313; .loc 1 132 18 ld.u64 %r314,[%r164]; st.u64 [%r162],%r314; .loc 1 133 17 shl.b64 %r316,%r118,3; add.u64 %r317,%r350,%r316; .loc 1 133 10 ld.u64 %r318,[%r317]; setp.le.s64 %r319,%r318,0; @ %r319 bra $L1; .loc 1 129 26 add.u64 %r118,%r118,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r320,%r118,%r153; @ %r320 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r352 bra $L35; .loc 1 150 10 mov.u16 %r322,0; st.u16 [%r117],%r322; bra $L29; $L35: mov.u64 %r112,%r116; .loc 1 148 10 mov.u32 %r110,-1; .loc 1 154 13 mov.u64 %r111,0; $L28: .loc 1 158 10 ld.u16 %r324,[%r112]; cvt.u16.u32 %r325,%r110; and.b16 %r323,%r324,%r325; cvt.s32.s16 %r110,%r323; .loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r66; .loc 1 154 6 setp.gt.s64 %r326,%r45,%r111; @ %r326 bra $L28; .loc 1 161 12 cvt.u16.u32 %r327,%r110; st.u16 [%r117],%r327; $L29: .loc 1 165 15 add.u64 %r138,%r138,1; .loc 1 166 12 add.u64 %r116,%r116,%r69; .loc 1 167 12 add.u64 %r117,%r117,%r72; .loc 1 169 13 setp.ne.u64 %r328,%r138,%r123; @ %r328 bra $L25; .loc 1 176 9 add.u64 %r113,%r116,%r127; .loc 1 177 9 add.u64 %r114,%r117,%r130; .loc 1 179 7 @ %r353 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r115,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r122],%r360; .loc 1 176 23 mul.lo.u64 %r334,%r85,%r84; .loc 1 176 9 add.u64 %r335,%r334,%r334; sub.u64 %r113,%r116,%r335; .loc 1 177 23 
mul.lo.u64 %r336,%r88,%r84; .loc 1 177 9 add.u64 %r337,%r336,%r336; sub.u64 %r114,%r117,%r337; .loc 1 178 5 add.u64 %r115,%r115,1; .loc 1 179 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r338,%r115,%r357; @ %r338 bra $L1; $L32: .loc 1 187 16 ld.u64 %r339,[%r122]; add.u64 %r84,%r339,1; st.u64 [%r122],%r84; .loc 1 188 23 add.u64 %r341,%r358,%r140; ld.u64 %r85,[%r341]; .loc 1 188 13 add.u64 %r342,%r85,%r85; add.u64 %r116,%r113,%r342; .loc 1 189 23 add.u64 %r343,%frame,%r140; ld.u64 %r88,[%r343]; .loc 1 189 13 add.u64 %r344,%r88,%r88; add.u64 %r117,%r114,%r344; .loc 1 169 32 add.u64 %r346,%r359,%r140; ld.u64 %r92,[%r346]; .loc 1 169 13 setp.eq.u64 %r347,%r84,%r92; @ %r347 bra $L33; .loc 1 173 13 mov.u64 %r138,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r348,[%r179]; setp.eq.u64 %r349,%r348,0; @ ! %r349 bra $L11; bra $L34; $L1: .loc 1 19_gfortran_miall_i2 .visible .func _gfortran_miall_608u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .pred %r392; .reg .pred %r393; .reg .u64u64 %r421; .reg .u64predmov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 223 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $L49gfortran_ial.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r211]; .loc 1 233 7 add.u64 %r137,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r210+28]; .loc 1 234 38 add.u32 %r217,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r138,%r217; .loc 1 237 7 shr.u64 %r219,%r137,63; set.u32.gt.s64 %r221,%r137,%r138; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 237 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L51; .loc 1 239 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r435,%r137,%r137; add.u64 %r436,%r435,%r137; shl.b64 %r237,%r436,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 244 7 ld.u64 %r248,[%r238+48]; sub.u64 %r139,%r240,%r248; .loc 1 245 6 setp.le.s64 %r249,%r139,0; @ %r249 bra $L48; .loc 1 248 9 ld.u64 %r152,[%r212]; .loc 1 250 15 ld.u64 %r34,[%r212+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 252 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 252 6 cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L53; .loc 1 254 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L54; $L53: .loc 1 261 9 shl.b64 %r270,%r436,3; add.u64 %r271,%r210,%r270; ld.u64 %r142,[%r271+40]; .loc 1 262 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 264 3 setp.ne.u64 %r279,%r137,0; @ %r279 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r280,%r137,%r138; @ %r280 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r281281_gfortran_runtime_error5: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; add.u64 %r285,%r22,%r22; add.u64 %r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 264 3 mov.u64 %r92,0; add.u64 %r433,%frame,240; 
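// Editorial note (hedged): the decimal byte arrays in the iall_i1.o / iall_i2.o headers
// above appear to decode to fragments of the runtime messages
// "...incorrect in IALL intrinsic: is %ld, should be between 1 and %ld",
// "rank of return array incorrect in IALL intrinsic: is %ld, should be %ld",
// "Funny sized logical array", and
// "Incorrect extent in return value of IALL intrinsic in dimension %ld: is %ld, should be %ld",
// with leading characters lost in extraction.
// Editorial sketch (hedged): the inner reduction loops above, $L30 in _gfortran_iall_i1
// (.loc 1 154-161, accumulator seeded with mov.u32 %r110,-1) and $L80 in _gfortran_miall_i1
// (.loc 1 352-353, mask test via ld.s8), accumulate the bitwise AND of the elements along
// the reduced dimension; the and.b16 visible in the _i2 variants confirms the operation.
// The C below shows only those inner loops; the surrounding descriptor, bounds-checking,
// and allocation logic is omitted, and the helper names and strides-in-elements interface
// are assumptions for illustration.
#include <stdint.h>
#include <stddef.h>

/* Unmasked IALL reduction: bitwise AND of `len' elements taken with `stride'.   */
static int8_t iall_reduce_i1 (const int8_t *src, ptrdiff_t len, ptrdiff_t stride)
{
  int8_t result = (int8_t) -1;            /* start from all-ones (mov.u32 %r110,-1) */
  for (ptrdiff_t i = 0; i < len; i++)
    result &= src[i * stride];            /* load / AND in the $L30 loop            */
  return result;
}

/* Masked MIALL reduction: elements whose logical mask entry is false are skipped. */
static int16_t miall_reduce_i2 (const int16_t *src, const int8_t *mask,
                                ptrdiff_t len, ptrdiff_t sstride, ptrdiff_t mstride)
{
  int16_t result = (int16_t) -1;
  for (ptrdiff_t i = 0; i < len; i++)
    if (mask[i * mstride])                /* ld.s8 test in the $L80 loop            */
      result &= src[i * sstride];         /* and.b16 in the _i2 loop body           */
  return result;
}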
add.u64 %r432,%frame,360; .loc 1 271 12 mov.u64 %r447,%r92; $L60: .loc 1 266 18 add.u64 %r289,%r433,%r92; ld.u64 %r290,[%r40]; st.u64 [%r289],%r290; .loc 1 267 18 add.u64 %r291,%frame,%r92; .loc 1 267 20 ld.u64 %r293,[%r179]; mul.lo.u64 %r292,%r293,%r34; .loc 1 267 18 st.u64 [%r291],%r292; .loc 1 268 19 ld.u64 %r295,[%r40+16]; add.u64 %r294,%r295,1; ld.u64 %r296,[%r40+8]; sub.u64 %r54,%r294,%r296; .loc 1 270 10 setp.lt.s64 %r297,%r54,0; @ %r297 bra $L58; .loc 1 268 17 add.u64 %r299,%r432,%r92; st.u64 [%r299],%r54; bra $L59; $L58: .loc 1 271 12 add.u64 %r301,%r432,%r92; st.u64 [%r301],%r447; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r179,%r179,24; add.u64 %r92,%r92,8; setp.ne.u64 %r303,%r40,%r206; @ %r303 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r304,[%r209]; setp.eq.u64 %r305,%r304,0; @ ! %r305 bra $L63; bra $L62; $L56: add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r127,%r309,40; add.u64 %r133,%r210,%r127; add.u64 %r125,%r212,%r127; shl.b64 %r310,%r22,3; add.u64 %r103,%r310,-8; cvt.u32.u32 %r312,%r23; cvt.s64.s8 %r311,%r312; shl.b64 %r313,%r311,3; add.u64 %r69,%r313,-8; add.u64 %r433,%frame,240; add.u64 %r432,%frame,360; .loc 1 281 12 mov.u64 %r446,0; $L66: .loc 1 276 18 add.u64 %r315,%r433,%r103; ld.u64 %r316,[%r133]; st.u64 [%r315],%r316; .loc 1 277 18 add.u64 %r317,%frame,%r103; .loc 1 277 20 ld.u64 %r319,[%r125]; mul.lo.u64 %r318,%r319,%r34; .loc 1 277 18 st.u64 [%r317],%r318; .loc 1 278 19 ld.u64 %r321,[%r133+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r133+8]; sub.u64 %r66,%r320,%r322; .loc 1 280 10 setp.lt.s64 %r323,%r66,0; @ %r323 bra $L64; .loc 1 278 17 add.u64 %r325,%r432,%r103; st.u64 [%r325],%r66; bra $L65; $L64: .loc 1 281 12 add.u64 %r327,%r432,%r103; st.u64 [%r327],%r446; $L65: .loc 1 274 3 add.u64 %r133,%r133,24; add.u64 %r125,%r125,24; add.u64 %r103,%r103,8; setp.ne.u64 %r329,%r69,%r103; @ %r329 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r330,%r138,0; @ %r330 bra $L68; add.u64 %r432,%frame,360; $L86: add.u64 %r172,%r209,40; mov.u64 %r164,%r432; cvt.u32.u32 %r332,%r23; cvt.s64.s8 %r331,%r332; add.u64 %r334,%r331,%r331; add.u64 %r335,%r334,%r331; shl.b64 %r336,%r335,3; add.u64 %r337,%r209,16; add.u64 %r134,%r336,%r337; .loc 1 291 10 mov.u64 %r129,1; .loc 1 295 4 mov.u64 %r356,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r338,%r23,-2; cvt.s64.s32 %r75,%r338; add.u64 %r340,%r75,%r75; add.u64 %r341,%r340,%r75; shl.b64 %r342,%r341,3; add.u64 %r343,%r209,%r342; .loc 1 299 67 shl.b64 %r345,%r75,3; add.u64 %r346,%frame,%r345; .loc 1 299 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r146,%r348,%r349; .loc 1 301 24 mov.u64 %r350,0; st.u64 [%r209+8],%r350; .loc 1 302 28 cvt.u16.u32 %r353,%r23; add.u16 %r352,%r353,-1; cvt.u32.u16 %r354,%r352; st.u8 [%r209+28],%r354; .loc 1 304 10 setp.eq.u64 %r355,%r146,0; @ %r355 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r129,%r72,%r129; $L69: .loc 1 295 4 st.u64 [%r172+8],%r356; ld.u64 %r72,[%r164]; add.u64 %r357,%r72,-1; st.u64 [%r172+16],%r357; st.u64 [%r172],%r129; .loc 1 288 7 add.u64 %r172,%r172,24; add.u64 %r164,%r164,8; setp.ne.u64 %r358,%r134,%r172; @ %r358 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r209+48],%r146; mov.u64 %r360,-1; st.u64 [%r209+56],%r360; mov.u64 %r361,1; st.u64 [%r209+40],%r361; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r363,363; call (%value_in),_gfortrani_xmallocarray64,[%value_in]; } .loc 1 311 22 st.u64 [%r209],%r364; bra $L73; $L63: .loc 1 316 19 ld.s8 %r366,[%r209+28]; .loc 1 316 10 setp.eq.u64 %r367,%r366,%r138; 
@ %r367 bra $L74; .loc 1 317 2 cvta.const.u64 %r36836_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r370,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L73; .loc 1 321 4 add.u64 %r432,%frame,360; cvta.const.u64 %r376,$LC2r437376; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r382,%r138,0; @ %r382 bra $L75; $L78: .loc 1 336 8 ld.u64 %r153,[%r209]; .loc 1 337 8 ld.u64 %r151,[%r210]; .loc 1 339 9 setp.ne.u64 %r383,%r151,0; @ %r383 bra $L76; bra $L48; $L75: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r432,%frame,360; .loc 1 330 16 mov.u64 %r386,%r157; $L77: st.u64 [%r190],%r386; .loc 1 331 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 332 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r432,%r389; .loc 1 332 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 add.u64 %r91,%r142,%r142; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r94,[%frame+240]; .loc 1 359 12 add.u64 %r95,%r94,%r94; .loc 1 360 23 ld.u64 %r97,[%frame]; .loc 1 361 22 ld.u64 %r99,[%frame+120]; .loc 1 361 12 add.u64 %r100,%r99,%r99; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r94; .loc 1 371 24 mul.lo.u64 %r397,%r85,%r97; .loc 1 371 10 neg.s64 %r163,%r397; .loc 1 372 23 mul.lo.u64 %r398,%r85,%r99; .loc 1 372 9 add.u64 %r399,%r398,%r398; neg.s64 %r165,%r399; setp.le.s64 %r437,%r138,1; .loc 1 348 10 mov.u32 %r439,-1; cvt.u32.u32 %r440,%r23; cvt.s64.s8 %r441,%r440; add.u64 %r442,%r441,-1; add.u64 %r443,%frame,240; add.u64 %r444,%frame,360; add.u64 %r445,%frame,120; $L87: .loc 1 328 3 mov.u64 %r156,%r152; mov.u64 %r155,%r151; .loc 1 348 10 mov.u32 %r130,%r439; .loc 1 349 9 mov.u64 %r154,0; $L80: .loc 1 352 6 ld.s8 %r401,[%r156]; cvt.u16.u32 %r400,%r401; setp.eq.u16 %r402,%r400,0; @ %r402 bra $L79; .loc 1 353 12 ld.u16 %r404,[%r155]; cvt.u16.u32 %r405,%r130; and.b16 %r403,%r404,%r405; cvt.s32.s16 %r130,%r403; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r91; .loc 1 349 47 add.u64 %r156,%r156,%r45; .loc 1 349 2 setp.ne.u64 %r406,%r139,%r154; @ %r406 bra $L80; .loc 1 355 8 cvt.u16.u32 %r407,%r130; st.u16 [%r153],%r407; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r151,%r151,%r95; .loc 1 360 13 add.u64 %r152,%r152,%r97; .loc 1 361 12 add.u64 %r153,%r153,%r100; .loc 1 363 13 setp.ne.u64 %r408,%r85,%r39; @ %r408 bra $L87; .loc 1 371 10 add.u64 %r148,%r152,%r163; .loc 1 372 9 add.u64 %r149,%r153,%r165; .loc 1 374 7 @ ! 
%r437 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r438; .loc 1 370 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 371 24 mul.lo.u64 %r411,%r116,%r113; .loc 1 371 10 sub.u64 %r148,%r152,%r411; .loc 1 372 23 mul.lo.u64 %r412,%r118,%r113; .loc 1 372 9 add.u64 %r413,%r412,%r412; sub.u64 %r149,%r153,%r413; .loc 1 373 5 add.u64 %r150,%r150,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r166,%r166,8; setp.ne.u64 %r414,%r150,%r442; @ %r414 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 370 23 mov.u64 %r102,%r38; .loc 1 374 7 mov.u64 %r166,8; .loc 1 373 5 mov.u64 %r150,1; .loc 1 367 13 mov.u64 %r438,0; $L84: .loc 1 382 16 ld.u64 %r418,[%r41]; add.u64 %r113,%r418,1; st.u64 [%r41],%r113; .loc 1 383 23 add.u64 %r420,%r443,%r166; ld.u64 %r114,[%r420]; .loc 1 383 13 sub.u64 %r421,%r114,%r102; add.u64 %r422,%r421,%r421; add.u64 %r151,%r151,%r422; .loc 1 384 24 add.u64 %r423,%frame,%r166; ld.u64 %r116,[%r423]; .loc 1 384 14 add.u64 %r152,%r148,%r116; .loc 1 385 23 add.u64 %r425,%r445,%r166; ld.u64 %r118,[%r425]; .loc 1 385 13 add.u64 %r426,%r118,%r118; add.u64 %r153,%r149,%r426; .loc 1 363 32 add.u64 %r428,%r444,%r166; ld.u64 %r122,[%r428]; .loc 1 363 13 setp.eq.u64 %r429,%r113,%r122; @ %r429 bra $L85; .loc 1 367 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r430,[%r209]; setp.eq.u64 %r431,%r430,0; @ ! %r431 bra $L63; bra $L86; $L48:_gfortran_siall_i2 .visible .func _gfortran_siall_68u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64predpred %r292; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_ial.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r295,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r295; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r295; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r301,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r301; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r294,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r294,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r300,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r300; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r295,0; @ %r211 bra $L127; add.u64 %r294,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r294; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r244,4; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r295,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r295; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r295; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r295,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: .loc 
1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r295,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r295,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 add.u64 %r69,%r68,%r68; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r293,%r295,1; .loc 1 512 13 mov.u16 %r279,0; cvt.u32.u32 %r297,%r24; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; $L146: st.u16 [%r98],%r279; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L146; .loc 1 525 7 @ ! %r293 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r296; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r299; @ %r283 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; .loc 1 520 13 mov.u64 %r296,0; $L144: .loc 1 529 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r289,%r75,%r72; add.u64 %r290,%r289,%r289; add.u64 %r98,%r98,%r290; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r292,%r98,0; @ ! 
%r292 bra $L122; bra $L147; $L109: .loc 1 534 1 ret; } iall_i4.o/fortran/generated/iall_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_miall_i4 .visible .func _gfortran_miall_i4gfortran_siall_i4 .visible .func _gfortran_siall_i4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,76,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,76,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,105] = {73,65,76,76114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,76,76,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,73,65,76,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_iall_i4 .visible .func _gfortran_iall_480u64 %r52; .reg .u64 %r58; .reg .u64 %r61; .reg .u649pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64u64predpredu64u64predmov.u64 %r179,%ar0; mov.u64 %r180,%ar1; mov.u64 %r181,%ar2; .loc 1 55 10 ld.s8 %r22,[%r180+28]; .loc 1 55 38 add.u32 %r182,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r182; .loc 1 56 10 ld.u64 %r25,[%r181]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r184,%r104,63; set.u32.lt.s64 %r186,%r103,%r104; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 58 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 60 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r104,%r104; add.u64 %r201,%r200,%r104; shl.b64 %r202,%r201,3; add.u64 %r203,%r180,%r202; ld.u64 %r30,[%r203+56]; ld.u64 %r32,[%r203+48]; .loc 1 68 9 ld.u64 %r107,[%r203+40]; .loc 1 70 3 setp.ne.u64 %r217,%r104,0; @ %r217 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r218,%r103,%r104; @ %r218 bra $L4; bra $L46; $L3: add.u64 %r61,%r180,40; add.u64 %r40,%frame,120; add.u64 %r91,%frame,240; add.u64 %r219,%r180,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r174,%r219,%r223; .loc 1 76 12 mov.u64 %r357,0; $L8: .loc 1 72 18 ld.u64 %r224,[%r61]; st.u64 [%r40],%r224; .loc 1 73 19 ld.u64 %r226,[%r61+16]; add.u64 %r225,%r226,1; ld.u64 %r227,[%r61+8]; sub.u64 %r37,%r225,%r227; .loc 1 75 10 setp.lt.s64 %r228,%r37,0; @ %r228 bra $L6; .loc 1 73 17 st.u64 [%r91],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r91],%r357; $L7: .loc 1 70 3 add.u64 %r61,%r61,24; add.u64 %r40,%r40,8; add.u64 %r91,%r91,8; 
setp.ne.u64 %r230,%r61,%r174; @ %r230 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r231,[%r179]; setp.eq.u64 %r232,%r231,0; @ ! %r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r143,%r180,%r237; shl.b64 %r238,%r25,3; add.u64 %r125,%r238,-8; add.u64 %r344,%frame,120; add.u64 %r128,%r344,%r125; add.u64 %r345,%frame,240; add.u64 %r121,%r345,%r125; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r180,40; add.u64 %r95,%r246,%r247; .loc 1 84 12 mov.u64 %r356,0; $L14: .loc 1 80 18 ld.u64 %r248,[%r143]; st.u64 [%r128],%r248; .loc 1 81 19 ld.u64 %r250,[%r143+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r143+8]; sub.u64 %r44,%r249,%r251; .loc 1 83 10 setp.lt.s64 %r252,%r44,0; @ %r252 bra $L12; .loc 1 81 17 st.u64 [%r121],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r121],%r356; $L13: .loc 1 78 3 add.u64 %r143,%r143,24; add.u64 %r128,%r128,8; add.u64 %r121,%r121,8; setp.ne.u64 %r254,%r95,%r143; @ %r254 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r255,%r103,0; @ %r255 bra $L16; add.u64 %r345,%frame,240; $L34: add.u64 %r152,%r179,40; mov.u64 %r150,%r345; add.u64 %r256,%r179,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r144,%r256,%r262; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r263,0; st.u64 [%r179+8],%r263; .loc 1 103 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r179+28],%r267; .loc 1 105 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r52,%r268; add.u64 %r270,%r52,%r52; add.u64 %r271,%r270,%r52; shl.b64 %r272,%r271,3; add.u64 %r273,%r179,%r272; .loc 1 105 67 shl.b64 %r275,%r52,3; add.u64 %r276,%frame,%r275; .loc 1 105 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r109,%r278,%r279; .loc 1 107 29281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 107 27 st.u64 [%r179],%r282; .loc 1 108 10 setp.eq.u64 %r284,%r109,0; @ ! 
%r284 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r48,%r99; $L17: .loc 1 98 4 st.u64 [%r152+8],%r285; ld.u64 %r48,[%r150]; add.u64 %r286,%r48,-1; st.u64 [%r152+16],%r286; st.u64 [%r152],%r99; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r287,%r144,%r152; @ %r287 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r179+48],%r109; mov.u64 %r289,-1; st.u64 [%r179+56],%r289; mov.u64 %r290,1; st.u64 [%r179+40],%r290; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r179+28]; .loc 1 118 10 setp.eq.u64 %r291,%r58,%r103; @ %r291 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r294,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r295,[%r294+36]; setp.eq.u32 %r296,%r295,0; @ %r296 bra $L21; .loc 1 125 2 add.u64 %r345,%frame,240; cvta.const.u64 %r300,$LC2r345299300; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r302,%r103,0; @ %r302 bra $L24; $L27: .loc 1 65 9 add.u64 %r303,%r30,1; .loc 1 65 7 sub.u64 %r105,%r303,%r32; max.s64 %r45,%r105,0; .loc 1 137 8 ld.u64 %r116,[%r180]; .loc 1 138 8 ld.u64 %r117,[%r179]; .loc 1 154 36 shl.b64 %r66,%r107,2; ld.u64 %r138,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 shl.b64 %r69,%r68,2; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 shl.b64 %r72,%r71,2; .loc 1 169 32 ld.u64 %r123,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r304,%r68,%r123; .loc 1 176 9 shl.b64 %r305,%r304,2; neg.s64 %r127,%r305; .loc 1 177 23 mul.lo.u64 %r306,%r71,%r123; .loc 1 177 9 shl.b64 %r307,%r306,2; neg.s64 %r130,%r307; setp.gt.s64 %r346,%r105,0; setp.le.s64 %r347,%r103,1; cvt.u32.u32 %r349,%r22; cvt.s64.s8 %r350,%r349; add.u64 %r351,%r350,-1; add.u64 %r352,%frame,120; add.u64 %r353,%frame,240; .loc 1 173 13 mov.u64 %r354,0; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r179,40; mov.u64 %r162,%frame; cvt.u32.u32 %r309,%r22; cvt.s64.s8 %r308,%r309; add.u64 %r153,%r308,-1; .loc 1 129 3 mov.u64 %r118,0; add.u64 %r345,%frame,240; .loc 1 131 16 mov.u64 %r310,%r118; $L26: st.u64 [%r165],%r310; .loc 1 132 18 ld.u64 %r311,[%r164]; st.u64 [%r162],%r311; .loc 1 133 17 shl.b64 %r313,%r118,3; add.u64 %r314,%r345,%r313; .loc 1 133 10 ld.u64 %r315,[%r314]; setp.le.s64 %r316,%r315,0; @ %r316 bra $L1; .loc 1 129 26 add.u64 %r118,%r118,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r317,%r118,%r153; @ %r317 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r346 bra $L35; .loc 1 150 10 mov.u32 %r319,0; st.u32 [%r117],%r319; bra $L29; $L35: mov.u64 %r112,%r116; .loc 1 148 10 mov.u32 %r110,-1; .loc 1 154 13 mov.u64 %r111,0; $L28: .loc 1 158 10 ld.u32 %r320,[%r112]; and.b32 %r110,%r110,%r320; .loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r66; .loc 1 154 6 setp.gt.s64 %r321,%r45,%r111; @ %r321 bra $L28; .loc 1 161 12 st.u32 [%r117],%r110; $L29: .loc 1 165 15 add.u64 %r138,%r138,1; .loc 1 166 12 add.u64 %r116,%r116,%r69; .loc 1 167 12 add.u64 %r117,%r117,%r72; .loc 1 169 13 setp.ne.u64 %r322,%r138,%r123; @ %r322 bra $L25; .loc 1 176 9 add.u64 %r113,%r116,%r127; .loc 1 177 9 add.u64 %r114,%r117,%r130; .loc 1 179 7 @ %r347 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r115,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r122],%r354; .loc 1 176 23 mul.lo.u64 %r328,%r85,%r84; .loc 1 176 9 shl.b64 %r329,%r328,2; sub.u64 %r113,%r116,%r329; .loc 1 177 23 mul.lo.u64 %r330,%r88,%r84; .loc 1 177 9 shl.b64 %r331,%r330,2; sub.u64 %r114,%r117,%r331; .loc 1 178 
5 add.u64 %r115,%r115,1; .loc 1 179 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r332,%r115,%r351; @ %r332 bra $L1; $L32: .loc 1 187 16 ld.u64 %r333,[%r122]; add.u64 %r84,%r333,1; st.u64 [%r122],%r84; .loc 1 188 23 add.u64 %r335,%r352,%r140; ld.u64 %r85,[%r335]; .loc 1 188 13 shl.b64 %r336,%r85,2; add.u64 %r116,%r113,%r336; .loc 1 189 23 add.u64 %r337,%frame,%r140; ld.u64 %r88,[%r337]; .loc 1 189 13 shl.b64 %r338,%r88,2; add.u64 %r117,%r114,%r338; .loc 1 169 32 add.u64 %r340,%r353,%r140; ld.u64 %r92,[%r340]; .loc 1 169 13 setp.eq.u64 %r341,%r84,%r92; @ %r341 bra $L33; .loc 1 173 13 mov.u64 %r138,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r342,[%r179]; setp.eq.u64 %r343,%r342,0; @ ! %r343 bra $L11; bra $L34; $L1: .loc 1 19_gfortran_miall_i4 .visible .func _gfortran_miall_608u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .predpred %r399; .reg .u32 %r400; .reg .pred %r401; .reg .predmov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 223 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $L49gfortran_ial.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r211]; .loc 1 233 7 add.u64 %r137,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r210+28]; .loc 1 234 38 add.u32 %r217,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r138,%r217; .loc 1 237 7 shr.u64 %r219,%r137,63; set.u32.gt.s64 %r221,%r137,%r138; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 237 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L51; .loc 1 239 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r429,%r137,%r137; add.u64 %r430,%r429,%r137; shl.b64 %r237,%r430,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 244 7 ld.u64 %r248,[%r238+48]; sub.u64 %r139,%r240,%r248; .loc 1 245 6 setp.le.s64 %r249,%r139,0; @ %r249 bra $L48; .loc 1 248 9 ld.u64 %r152,[%r212]; .loc 1 250 15 ld.u64 %r34,[%r212+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 252 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 252 6 cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L53; .loc 1 254 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L54; $L53: .loc 1 261 9 shl.b64 %r270,%r430,3; add.u64 %r271,%r210,%r270; ld.u64 %r142,[%r271+40]; .loc 1 262 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 264 3 setp.ne.u64 %r279,%r137,0; @ %r279 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r280,%r137,%r138; @ %r280 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r281281_gfortran_runtime_error5: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; add.u64 %r285,%r22,%r22; add.u64 %r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 264 3 mov.u64 %r92,0; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 271 12 mov.u64 %r441,%r92; $L60: .loc 1 266 18 add.u64 %r289,%r427,%r92; ld.u64 
%r290,[%r40]; st.u64 [%r289],%r290; .loc 1 267 18 add.u64 %r291,%frame,%r92; .loc 1 267 20 ld.u64 %r293,[%r179]; mul.lo.u64 %r292,%r293,%r34; .loc 1 267 18 st.u64 [%r291],%r292; .loc 1 268 19 ld.u64 %r295,[%r40+16]; add.u64 %r294,%r295,1; ld.u64 %r296,[%r40+8]; sub.u64 %r54,%r294,%r296; .loc 1 270 10 setp.lt.s64 %r297,%r54,0; @ %r297 bra $L58; .loc 1 268 17 add.u64 %r299,%r426,%r92; st.u64 [%r299],%r54; bra $L59; $L58: .loc 1 271 12 add.u64 %r301,%r426,%r92; st.u64 [%r301],%r441; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r179,%r179,24; add.u64 %r92,%r92,8; setp.ne.u64 %r303,%r40,%r206; @ %r303 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r304,[%r209]; setp.eq.u64 %r305,%r304,0; @ ! %r305 bra $L63; bra $L62; $L56: add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r127,%r309,40; add.u64 %r133,%r210,%r127; add.u64 %r125,%r212,%r127; shl.b64 %r310,%r22,3; add.u64 %r103,%r310,-8; cvt.u32.u32 %r312,%r23; cvt.s64.s8 %r311,%r312; shl.b64 %r313,%r311,3; add.u64 %r69,%r313,-8; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 281 12 mov.u64 %r440,0; $L66: .loc 1 276 18 add.u64 %r315,%r427,%r103; ld.u64 %r316,[%r133]; st.u64 [%r315],%r316; .loc 1 277 18 add.u64 %r317,%frame,%r103; .loc 1 277 20 ld.u64 %r319,[%r125]; mul.lo.u64 %r318,%r319,%r34; .loc 1 277 18 st.u64 [%r317],%r318; .loc 1 278 19 ld.u64 %r321,[%r133+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r133+8]; sub.u64 %r66,%r320,%r322; .loc 1 280 10 setp.lt.s64 %r323,%r66,0; @ %r323 bra $L64; .loc 1 278 17 add.u64 %r325,%r426,%r103; st.u64 [%r325],%r66; bra $L65; $L64: .loc 1 281 12 add.u64 %r327,%r426,%r103; st.u64 [%r327],%r440; $L65: .loc 1 274 3 add.u64 %r133,%r133,24; add.u64 %r125,%r125,24; add.u64 %r103,%r103,8; setp.ne.u64 %r329,%r69,%r103; @ %r329 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r330,%r138,0; @ %r330 bra $L68; add.u64 %r426,%frame,360; $L86: add.u64 %r172,%r209,40; mov.u64 %r164,%r426; cvt.u32.u32 %r332,%r23; cvt.s64.s8 %r331,%r332; add.u64 %r334,%r331,%r331; add.u64 %r335,%r334,%r331; shl.b64 %r336,%r335,3; add.u64 %r337,%r209,16; add.u64 %r134,%r336,%r337; .loc 1 291 10 mov.u64 %r129,1; .loc 1 295 4 mov.u64 %r356,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r338,%r23,-2; cvt.s64.s32 %r75,%r338; add.u64 %r340,%r75,%r75; add.u64 %r341,%r340,%r75; shl.b64 %r342,%r341,3; add.u64 %r343,%r209,%r342; .loc 1 299 67 shl.b64 %r345,%r75,3; add.u64 %r346,%frame,%r345; .loc 1 299 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r146,%r348,%r349; .loc 1 301 24 mov.u64 %r350,0; st.u64 [%r209+8],%r350; .loc 1 302 28 cvt.u16.u32 %r353,%r23; add.u16 %r352,%r353,-1; cvt.u32.u16 %r354,%r352; st.u8 [%r209+28],%r354; .loc 1 304 10 setp.eq.u64 %r355,%r146,0; @ %r355 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r129,%r72,%r129; $L69: .loc 1 295 4 st.u64 [%r172+8],%r356; ld.u64 %r72,[%r164]; add.u64 %r357,%r72,-1; st.u64 [%r172+16],%r357; st.u64 [%r172],%r129; .loc 1 288 7 add.u64 %r172,%r172,24; add.u64 %r164,%r164,8; setp.ne.u64 %r358,%r134,%r172; @ %r358 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r209+48],%r146; mov.u64 %r360,-1; st.u64 [%r209+56],%r360; mov.u64 %r361,1; st.u64 [%r209+40],%r361; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24363; call (%value_in),_gfortrani_xmallocarray64,[%value_in]; } .loc 1 311 22 st.u64 [%r209],%r364; bra $L73; $L63: .loc 1 316 19 ld.s8 %r366,[%r209+28]; .loc 1 316 10 setp.eq.u64 %r367,%r366,%r138; @ %r367 bra $L74; .loc 1 317 2 cvta.const.u64 %r36836_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 
%r370,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L73; .loc 1 321 4 add.u64 %r426,%frame,360; cvta.const.u64 %r376,$LC2r42637376; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r382,%r138,0; @ %r382 bra $L75; $L78: .loc 1 336 8 ld.u64 %r153,[%r209]; .loc 1 337 8 ld.u64 %r151,[%r210]; .loc 1 339 9 setp.ne.u64 %r383,%r151,0; @ %r383 bra $L76; bra $L48; $L75: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r426,%frame,360; .loc 1 330 16 mov.u64 %r386,%r157; $L77: st.u64 [%r190],%r386; .loc 1 331 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 332 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r426,%r389; .loc 1 332 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 shl.b64 %r91,%r142,2; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r94,[%frame+240]; .loc 1 359 12 shl.b64 %r95,%r94,2; .loc 1 360 23 ld.u64 %r97,[%frame]; .loc 1 361 22 ld.u64 %r99,[%frame+120]; .loc 1 361 12 shl.b64 %r100,%r99,2; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r94; .loc 1 371 24 mul.lo.u64 %r394,%r85,%r97; .loc 1 371 10 neg.s64 %r163,%r394; .loc 1 372 23 mul.lo.u64 %r395,%r85,%r99; .loc 1 372 9 shl.b64 %r396,%r395,2; neg.s64 %r165,%r396; setp.le.s64 %r431,%r138,1; cvt.u32.u32 %r433,%r23; cvt.s64.s8 %r434,%r433; add.u64 %r435,%r434,-1; add.u64 %r436,%frame,240; add.u64 %r437,%frame,360; add.u64 %r438,%frame,120; .loc 1 367 13 mov.u64 %r439,0; $L87: .loc 1 328 3 mov.u64 %r156,%r152; mov.u64 %r155,%r151; .loc 1 348 10 mov.u32 %r130,-1; .loc 1 349 9 mov.u64 %r154,0; $L80: .loc 1 352 6 ld.s8 %r398,[%r156]; cvt.u16.u32 %r397,%r398; setp.eq.u16 %r399,%r397,0; @ %r399 bra $L79; .loc 1 353 12 ld.u32 %r400,[%r155]; and.b32 %r130,%r130,%r400; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r91; .loc 1 349 47 add.u64 %r156,%r156,%r45; .loc 1 349 2 setp.ne.u64 %r401,%r139,%r154; @ %r401 bra $L80; .loc 1 355 8 st.u32 [%r153],%r130; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r151,%r151,%r95; .loc 1 360 13 add.u64 %r152,%r152,%r97; .loc 1 361 12 add.u64 %r153,%r153,%r100; .loc 1 363 13 setp.ne.u64 %r402,%r85,%r39; @ %r402 bra $L87; .loc 1 371 10 add.u64 %r148,%r152,%r163; .loc 1 372 9 add.u64 %r149,%r153,%r165; .loc 1 374 7 @ ! 
%r431 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r439; .loc 1 370 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 371 24 mul.lo.u64 %r405,%r116,%r113; .loc 1 371 10 sub.u64 %r148,%r152,%r405; .loc 1 372 23 mul.lo.u64 %r406,%r118,%r113; .loc 1 372 9 shl.b64 %r407,%r406,2; sub.u64 %r149,%r153,%r407; .loc 1 373 5 add.u64 %r150,%r150,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r166,%r166,8; setp.ne.u64 %r408,%r150,%r435; @ %r408 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 370 23 mov.u64 %r102,%r38; .loc 1 374 7 mov.u64 %r166,8; .loc 1 373 5 mov.u64 %r150,1; $L84: .loc 1 382 16 ld.u64 %r412,[%r41]; add.u64 %r113,%r412,1; st.u64 [%r41],%r113; .loc 1 383 23 add.u64 %r414,%r436,%r166; ld.u64 %r114,[%r414]; .loc 1 383 13 sub.u64 %r415,%r114,%r102; shl.b64 %r416,%r415,2; add.u64 %r151,%r151,%r416; .loc 1 384 24 add.u64 %r417,%frame,%r166; ld.u64 %r116,[%r417]; .loc 1 384 14 add.u64 %r152,%r148,%r116; .loc 1 385 23 add.u64 %r419,%r438,%r166; ld.u64 %r118,[%r419]; .loc 1 385 13 shl.b64 %r420,%r118,2; add.u64 %r153,%r149,%r420; .loc 1 363 32 add.u64 %r422,%r437,%r166; ld.u64 %r122,[%r422]; .loc 1 363 13 setp.eq.u64 %r423,%r113,%r122; @ %r423 bra $L85; .loc 1 367 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r424,[%r209]; setp.eq.u64 %r425,%r424,0; @ ! %r425 bra $L63; bra $L86; $L48:_gfortran_siall_i4 .visible .func _gfortran_siall_68u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64predu64mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_ial.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r294,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r294; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r294; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r300,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r300; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r293,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r293,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r299,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r299; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r294,0; @ %r211 bra $L127; add.u64 %r293,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r293; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 2496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r294,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r294; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r294; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r294,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: .loc 1 504 16 
cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r294,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r294,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,2; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r292,%r294,1; .loc 1 512 13 mov.u32 %r278,0; cvt.u32.u32 %r296,%r24; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; $L146: st.u32 [%r98],%r278; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r279,%r59,%r80; @ %r279 bra $L146; .loc 1 525 7 @ ! %r292 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r295; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r282,%r97,%r298; @ %r282 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; .loc 1 520 13 mov.u64 %r295,0; $L144: .loc 1 529 16 ld.u64 %r287,[%r99]; add.u64 %r74,%r287,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r288,%r75,%r72; shl.b64 %r289,%r288,2; add.u64 %r98,%r98,%r289; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r290,%r74,%r77; @ %r290 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r291,%r98,0; @ ! 
%r291 bra $L122; bra $L147; $L109: .loc 1 534 1 ret; } iall_i8.o/fortran/generated/iall_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_miall_i8 .visible .func _gfortran_miall_i8gfortran_siall_i8 .visible .func _gfortran_siall_i8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,76,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,76,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,105] = {73,65,76,76114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,76,76,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,73,65,76,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_iall_i8 .visible .func _gfortran_iall_48096pred %r218; .reg .predpred %r231; .reg .u64 %r232; .reg .predu64 %r250; .reg .u64 %r251; .reg .u64 %r252; .reg .pred %r253; .reg .predu64u16u64 %r274; .reg .u64pred %r303; .reg .u64 %r304; .reg .u64 %r305; .reg .u32 %r306; .reg .u64u64 %r311; .reg .u64 %r312; .reg .pred %r313; .reg .predu64pred %r347; .reg .predu64 %r358; mov.u64 %r180,%ar0; mov.u64 %r181,%ar1; mov.u64 %r182,%ar2; .loc 1 55 10 ld.s8 %r22,[%r181+28]; .loc 1 55 38 add.u32 %r183,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r107,%r183; .loc 1 56 10 ld.u64 %r25,[%r182]; .loc 1 56 7 add.u64 %r108,%r25,-1; .loc 1 58 7 shr.u64 %r185,%r108,63; set.u32.lt.s64 %r187,%r107,%r108; neg.s32 %r188,%r187; cvt.u16.u64 %r191,%r185; cvt.u16.u32 %r192,%r188; or.b16 %r190,%r191,%r192; .loc 1 58 6 cvt.u32.u16 %r193,%r190; cvt.u16.u8 %r194,%r193; setp.eq.u16 %r195,%r194,0; @ %r195 bra $L2; .loc 1 60 7 cvt.u32.u32 %r198,%r22; cvt.s64.s8 %r197,%r198; st.u64 [%stack+8],%r1971966_gfortran_runtime_errorr201,%r108,%r108; add.u64 %r202,%r201,%r108; shl.b64 %r203,%r202,3; add.u64 %r204,%r181,%r203; ld.u64 %r30,[%r204+56]; ld.u64 %r32,[%r204+48]; .loc 1 68 9 ld.u64 %r111,[%r204+40]; .loc 1 70 3 setp.ne.u64 %r218,%r108,0; @ %r218 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r219,%r107,%r108; @ %r219 bra $L4; bra $L46; $L3: add.u64 %r61,%r181,40; add.u64 %r40,%frame,120; add.u64 %r96,%frame,240; add.u64 %r220,%r181,16; add.u64 %r222,%r25,%r25; add.u64 %r223,%r222,%r25; shl.b64 %r224,%r223,3; add.u64 %r175,%r220,%r224; .loc 1 76 12 mov.u64 %r358,0; $L8: .loc 1 72 18 ld.u64 %r225,[%r61]; st.u64 [%r40],%r225; .loc 1 73 19 ld.u64 %r227,[%r61+16]; add.u64 %r226,%r227,1; ld.u64 %r228,[%r61+8]; sub.u64 %r37,%r226,%r228; .loc 1 75 10 setp.lt.s64 %r229,%r37,0; @ %r229 bra $L6; .loc 1 73 17 st.u64 
[%r96],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r96],%r358; $L7: .loc 1 70 3 add.u64 %r61,%r61,24; add.u64 %r40,%r40,8; add.u64 %r96,%r96,8; setp.ne.u64 %r231,%r61,%r175; @ %r231 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r232,[%r180]; setp.eq.u64 %r233,%r232,0; @ ! %r233 bra $L11; bra $L10; $L4: add.u64 %r235,%r25,%r25; add.u64 %r236,%r235,%r25; shl.b64 %r237,%r236,3; add.u64 %r238,%r237,40; add.u64 %r136,%r181,%r238; shl.b64 %r239,%r25,3; add.u64 %r105,%r239,-8; add.u64 %r345,%frame,120; add.u64 %r129,%r345,%r105; add.u64 %r346,%frame,240; add.u64 %r103,%r346,%r105; cvt.u32.u32 %r243,%r22; cvt.s64.s8 %r242,%r243; add.u64 %r245,%r242,%r242; add.u64 %r246,%r245,%r242; shl.b64 %r247,%r246,3; add.u64 %r248,%r181,40; add.u64 %r80,%r247,%r248; .loc 1 84 12 mov.u64 %r357,0; $L14: .loc 1 80 18 ld.u64 %r249,[%r136]; st.u64 [%r129],%r249; .loc 1 81 19 ld.u64 %r251,[%r136+16]; add.u64 %r250,%r251,1; ld.u64 %r252,[%r136+8]; sub.u64 %r44,%r250,%r252; .loc 1 83 10 setp.lt.s64 %r253,%r44,0; @ %r253 bra $L12; .loc 1 81 17 st.u64 [%r103],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r103],%r357; $L13: .loc 1 78 3 add.u64 %r136,%r136,24; add.u64 %r129,%r129,8; add.u64 %r103,%r103,8; setp.ne.u64 %r255,%r80,%r136; @ %r255 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r256,%r107,0; @ %r256 bra $L16; add.u64 %r346,%frame,240; $L34: add.u64 %r152,%r180,40; mov.u64 %r150,%r346; add.u64 %r257,%r180,16; cvt.u32.u32 %r259,%r22; cvt.s64.s8 %r258,%r259; add.u64 %r261,%r258,%r258; add.u64 %r262,%r261,%r258; shl.b64 %r263,%r262,3; add.u64 %r139,%r257,%r263; .loc 1 94 10 mov.u64 %r102,1; .loc 1 98 4 mov.u64 %r286,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r264,0; st.u64 [%r180+8],%r264; .loc 1 103 28 cvt.u16.u32 %r267,%r22; add.u16 %r266,%r267,-1; cvt.u32.u16 %r268,%r266; st.u8 [%r180+28],%r268; .loc 1 105 20 add.u32 %r269,%r22,-2; cvt.s64.s32 %r53,%r269; add.u64 %r271,%r53,%r53; add.u64 %r272,%r271,%r53; shl.b64 %r273,%r272,3; add.u64 %r274,%r180,%r273; .loc 1 105 67 shl.b64 %r276,%r53,3; add.u64 %r277,%frame,%r276; .loc 1 105 59 ld.u64 %r279,[%r274+40]; ld.u64 %r280,[%r277+240]; mul.lo.u64 %r113,%r279,%r280; .loc 1 107 29 mov.u64 %r2811(%value_in),_gfortrani_xmallocarray283,[%value_in]; } .loc 1 107 27 st.u64 [%r180],%r283; .loc 1 108 10 setp.eq.u64 %r285,%r113,0; @ ! 
%r285 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r102,%r49,%r102; $L17: .loc 1 98 4 st.u64 [%r152+8],%r286; ld.u64 %r49,[%r150]; add.u64 %r287,%r49,-1; st.u64 [%r152+16],%r287; st.u64 [%r152],%r102; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r288,%r139,%r152; @ %r288 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r180+48],%r113; mov.u64 %r290,-1; st.u64 [%r180+56],%r290; mov.u64 %r291,1; st.u64 [%r180+40],%r291; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r60,[%r180+28]; .loc 1 118 10 setp.eq.u64 %r292,%r60,%r107; @ %r292 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r107; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r295,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r296,[%r295+36]; setp.eq.u32 %r297,%r296,0; @ %r297 bra $L21; .loc 1 125 2 add.u64 %r346,%frame,240; cvta.const.u64 %r301,$LC2r346300301; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r303,%r107,0; @ %r303 bra $L24; $L27: .loc 1 65 9 add.u64 %r304,%r30,1; .loc 1 65 7 sub.u64 %r109,%r304,%r32; max.s64 %r45,%r109,0; .loc 1 137 8 ld.u64 %r121,[%r181]; .loc 1 138 8 ld.u64 %r122,[%r180]; .loc 1 154 36 shl.b64 %r68,%r111,3; ld.u64 %r47,[%frame+360]; .loc 1 166 22 ld.u64 %r69,[%frame+120]; .loc 1 166 12 shl.b64 %r71,%r69,3; .loc 1 167 22 ld.u64 %r72,[%frame]; .loc 1 167 12 shl.b64 %r74,%r72,3; setp.gt.s64 %r347,%r109,0; setp.le.s64 %r348,%r107,1; .loc 1 150 10 mov.u64 %r350,0; cvt.u32.u32 %r351,%r22; cvt.s64.s8 %r352,%r351; add.u64 %r353,%r352,-1; add.u64 %r354,%frame,120; add.u64 %r355,%frame,240; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r180,40; mov.u64 %r162,%frame; cvt.u32.u32 %r306,%r22; cvt.s64.s8 %r305,%r306; add.u64 %r153,%r305,-1; .loc 1 129 3 mov.u64 %r123,0; add.u64 %r346,%frame,240; .loc 1 131 16 mov.u64 %r307,%r123; $L26: st.u64 [%r165],%r307; .loc 1 132 18 ld.u64 %r308,[%r164]; st.u64 [%r162],%r308; .loc 1 133 17 shl.b64 %r310,%r123,3; add.u64 %r311,%r346,%r310; .loc 1 133 10 ld.u64 %r312,[%r311]; setp.le.s64 %r313,%r312,0; @ %r313 bra $L1; .loc 1 129 26 add.u64 %r123,%r123,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r314,%r123,%r153; @ %r314 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r347 bra $L35; .loc 1 150 10 st.u64 [%r122],%r350; bra $L29; $L35: mov.u64 %r116,%r121; .loc 1 148 10 mov.u64 %r114,-1; .loc 1 154 13 mov.u64 %r115,0; $L28: .loc 1 158 10 ld.u64 %r317,[%r116]; and.b64 %r114,%r114,%r317; .loc 1 154 28 add.u64 %r115,%r115,1; .loc 1 154 36 add.u64 %r116,%r116,%r68; .loc 1 154 6 setp.gt.s64 %r318,%r45,%r115; @ %r318 bra $L28; .loc 1 161 12 st.u64 [%r122],%r114; $L29: .loc 1 165 15 add.u64 %r47,%r47,1; .loc 1 166 12 add.u64 %r121,%r121,%r71; .loc 1 167 12 add.u64 %r122,%r122,%r74; .loc 1 169 32 ld.u64 %r127,[%frame+240]; .loc 1 169 13 setp.ne.u64 %r319,%r47,%r127; @ %r319 bra $L25; .loc 1 176 23 mul.lo.u64 %r320,%r69,%r47; .loc 1 176 9 shl.b64 %r321,%r320,3; sub.u64 %r118,%r121,%r321; .loc 1 177 23 mul.lo.u64 %r322,%r72,%r47; .loc 1 177 9 shl.b64 %r323,%r322,3; sub.u64 %r119,%r122,%r323; .loc 1 179 7 @ %r348 bra $L1; add.u64 %r144,%frame,368; mov.u64 %r106,8; .loc 1 178 5 mov.u64 %r120,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r144],%r350; .loc 1 176 23 mul.lo.u64 %r329,%r88,%r87; .loc 1 176 9 shl.b64 %r330,%r329,3; sub.u64 %r118,%r121,%r330; .loc 1 177 23 mul.lo.u64 %r331,%r91,%r87; .loc 1 177 9 shl.b64 %r332,%r331,3; sub.u64 %r119,%r122,%r332; .loc 1 178 5 add.u64 %r120,%r120,1; .loc 1 179 7 add.u64 %r144,%r144,8; add.u64 %r106,%r106,8; 
setp.eq.u64 %r333,%r120,%r353; @ %r333 bra $L1; $L32: .loc 1 187 16 ld.u64 %r334,[%r144]; add.u64 %r87,%r334,1; st.u64 [%r144],%r87; .loc 1 188 23 add.u64 %r336,%r354,%r106; ld.u64 %r88,[%r336]; .loc 1 188 13 shl.b64 %r337,%r88,3; add.u64 %r121,%r118,%r337; .loc 1 189 23 add.u64 %r338,%frame,%r106; ld.u64 %r91,[%r338]; .loc 1 189 13 shl.b64 %r339,%r91,3; add.u64 %r122,%r119,%r339; .loc 1 169 32 add.u64 %r341,%r355,%r106; ld.u64 %r94,[%r341]; .loc 1 169 13 setp.eq.u64 %r342,%r87,%r94; @ %r342 bra $L33; .loc 1 173 13 mov.u64 %r47,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r343,[%r180]; setp.eq.u64 %r344,%r343,0; @ ! %r344 bra $L11; bra $L34; $L1: .loc 1 19_gfortran_miall_i8 .visible .func _gfortran_miall_608predpred %r280; .reg .pred %r281; .reg .u64 %r282; .reg .u64 %r284; .reg .u64predu64 %r318; .reg .u64u64u64 %r364; .reg .u64 %r365; .reg .u64 %r367; .reg .predu64pred %r393; .reg .predpredmov.u64 %r210,%ar0; mov.u64 %r211,%ar1; mov.u64 %r212,%ar2; mov.u64 %r213,%ar3; .loc 1 223 6911212; call _gfortran_iall_i8,(%out_arg1,%out_arg2,%out_arg3); } .loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r212]; .loc 1 233 7 add.u64 %r141,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r211+28]; .loc 1 234 38 add.u32 %r218,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r142,%r218; .loc 1 237 7 shr.u64 %r220,%r141,63; set.u32.gt.s64 %r222,%r141,%r142; neg.s32 %r223,%r222; cvt.u16.u64 %r226,%r220; cvt.u16.u32 %r227,%r223; or.b16 %r225,%r226,%r227; .loc 1 237 6 cvt.u32.u16 %r228,%r225; cvt.u16.u8 %r229,%r228; setp.eq.u16 %r230,%r229,0; @ %r230 bra $L51; .loc 1 239 7 cvt.u32.u32 %r233,%r23; cvt.s64.s8 %r232,%r233; st.u64 [%stack+8],%r232; st.u64 [%stack],%r22; cvta.const.u64 %r2312stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r431,%r141,%r141; add.u64 %r430,%r431,%r141; shl.b64 %r238,%r430,3; add.u64 %r239,%r211,%r238; ld.u64 %r242,[%r239+56]; add.u64 %r241,%r242,1; .loc 1 244 7 ld.u64 %r249,[%r239+48]; sub.u64 %r143,%r241,%r249; .loc 1 245 6 setp.le.s64 %r250,%r143,0; @ %r250 bra $L48; .loc 1 248 9 ld.u64 %r156,[%r213]; .loc 1 250 15 ld.u64 %r35,[%r213+16]; .loc 1 252 22 cvt.u32.u64 %r36,%r35; .loc 1 252 53 add.u32 %r251,%r36,-4; and.b32 %r252,%r251,-5; set.u32.eq.u32 %r254,%r252,0; neg.s32 %r255,%r254; .loc 1 252 22 add.u32 %r256,%r36,-1; set.u32.le.u32 %r258,%r256,1; neg.s32 %r259,%r258; .loc 1 252 6 cvt.u16.u32 %r261,%r255; cvt.u16.u32 %r262,%r259; or.b16 %r260,%r261,%r262; cvt.u32.u16 %r263,%r260; cvt.u16.u8 %r264,%r263; setp.ne.u16 %r265,%r264,0; @ %r265 bra $L53; .loc 1 254 7 setp.ne.u32 %r267,%r36,16; @ %r267 bra $L54; $L53: .loc 1 261 9 shl.b64 %r271,%r430,3; add.u64 %r272,%r211,%r271; ld.u64 %r146,[%r272+40]; .loc 1 262 12 add.u64 %r278,%r213,%r271; ld.u64 %r43,[%r278+40]; .loc 1 264 3 setp.ne.u64 %r280,%r141,0; @ %r280 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r281,%r141,%r142; @ %r281 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r282282_gfortran_runtime_error5: add.u64 %r194,%r211,40; add.u64 %r196,%r213,40; add.u64 %r284,%r211,16; add.u64 %r286,%r22,%r22; add.u64 %r287,%r286,%r22; shl.b64 %r288,%r287,3; add.u64 %r207,%r284,%r288; .loc 1 264 3 mov.u64 %r198,0; add.u64 %r428,%frame,240; add.u64 %r427,%frame,360; .loc 1 271 12 mov.u64 %r442,%r198; $L60: .loc 1 266 18 add.u64 %r290,%r428,%r198; ld.u64 %r291,[%r194]; st.u64 [%r290],%r291; .loc 1 267 18 add.u64 %r292,%frame,%r198; .loc 1 267 20 ld.u64 %r294,[%r196]; mul.lo.u64 %r293,%r294,%r35; .loc 1 267 18 st.u64 [%r292],%r293; .loc 1 268 19 ld.u64 %r296,[%r194+16]; add.u64 %r295,%r296,1; ld.u64 %r297,[%r194+8]; sub.u64 %r55,%r295,%r297; .loc 
1 270 10 setp.lt.s64 %r298,%r55,0; @ %r298 bra $L58; .loc 1 268 17 add.u64 %r300,%r427,%r198; st.u64 [%r300],%r55; bra $L59; $L58: .loc 1 271 12 add.u64 %r302,%r427,%r198; st.u64 [%r302],%r442; $L59: .loc 1 264 3 add.u64 %r194,%r194,24; add.u64 %r196,%r196,24; add.u64 %r198,%r198,8; setp.ne.u64 %r304,%r194,%r207; @ %r304 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r305,[%r210]; setp.eq.u64 %r306,%r305,0; @ ! %r306 bra $L63; bra $L62; $L56: add.u64 %r308,%r22,%r22; add.u64 %r309,%r308,%r22; shl.b64 %r310,%r309,3; add.u64 %r61,%r310,40; add.u64 %r100,%r211,%r61; add.u64 %r139,%r213,%r61; shl.b64 %r311,%r22,3; add.u64 %r192,%r311,-8; cvt.u32.u32 %r313,%r23; cvt.s64.s8 %r312,%r313; shl.b64 %r314,%r312,3; add.u64 %r39,%r314,-8; add.u64 %r428,%frame,240; add.u64 %r427,%frame,360; .loc 1 281 12 mov.u64 %r441,0; $L66: .loc 1 276 18 add.u64 %r316,%r428,%r192; ld.u64 %r317,[%r100]; st.u64 [%r316],%r317; .loc 1 277 18 add.u64 %r318,%frame,%r192; .loc 1 277 20 ld.u64 %r320,[%r139]; mul.lo.u64 %r319,%r320,%r35; .loc 1 277 18 st.u64 [%r318],%r319; .loc 1 278 19 ld.u64 %r322,[%r100+16]; add.u64 %r321,%r322,1; ld.u64 %r323,[%r100+8]; sub.u64 %r67,%r321,%r323; .loc 1 280 10 setp.lt.s64 %r324,%r67,0; @ %r324 bra $L64; .loc 1 278 17 add.u64 %r326,%r427,%r192; st.u64 [%r326],%r67; bra $L65; $L64: .loc 1 281 12 add.u64 %r328,%r427,%r192; st.u64 [%r328],%r441; $L65: .loc 1 274 3 add.u64 %r100,%r100,24; add.u64 %r139,%r139,24; add.u64 %r192,%r192,8; setp.ne.u64 %r330,%r39,%r192; @ %r330 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r331,%r142,0; @ %r331 bra $L68; add.u64 %r427,%frame,360; $L86: add.u64 %r137,%r210,40; mov.u64 %r135,%r427; cvt.u32.u32 %r333,%r23; cvt.s64.s8 %r332,%r333; add.u64 %r335,%r332,%r332; add.u64 %r336,%r335,%r332; shl.b64 %r337,%r336,3; add.u64 %r338,%r210,16; add.u64 %r106,%r337,%r338; .loc 1 291 10 mov.u64 %r131,1; .loc 1 295 4 mov.u64 %r357,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r339,%r23,-2; cvt.s64.s32 %r74,%r339; add.u64 %r341,%r74,%r74; add.u64 %r342,%r341,%r74; shl.b64 %r343,%r342,3; add.u64 %r344,%r210,%r343; .loc 1 299 67 shl.b64 %r346,%r74,3; add.u64 %r347,%frame,%r346; .loc 1 299 59 ld.u64 %r349,[%r344+40]; ld.u64 %r350,[%r347+360]; mul.lo.u64 %r150,%r349,%r350; .loc 1 301 24 mov.u64 %r351,0; st.u64 [%r210+8],%r351; .loc 1 302 28 cvt.u16.u32 %r354,%r23; add.u16 %r353,%r354,-1; cvt.u32.u16 %r355,%r353; st.u8 [%r210+28],%r355; .loc 1 304 10 setp.eq.u64 %r356,%r150,0; @ %r356 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r131,%r71,%r131; $L69: .loc 1 295 4 st.u64 [%r137+8],%r357; ld.u64 %r71,[%r135]; add.u64 %r358,%r71,-1; st.u64 [%r137+16],%r358; st.u64 [%r137],%r131; .loc 1 288 7 add.u64 %r137,%r137,24; add.u64 %r135,%r135,8; setp.ne.u64 %r359,%r106,%r137; @ %r359 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r210+48],%r150; mov.u64 %r361,-1; st.u64 [%r210+56],%r361; mov.u64 %r362,1; st.u64 [%r210+40],%r362; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r364364; call (%value_in),_gfortrani_xmallocarray65,[%value_in]; } .loc 1 311 22 st.u64 [%r210],%r365; bra $L73; $L63: .loc 1 316 19 ld.s8 %r367,[%r210+28]; .loc 1 316 10 setp.eq.u64 %r368,%r367,%r142; @ %r368 bra $L74; .loc 1 317 2 cvta.const.u64 %r36stack; call _gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r371,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r372,[%r371+36]; setp.eq.u32 %r373,%r372,0; @ %r373 bra $L73; .loc 1 321 4 add.u64 %r427,%frame,360; cvta.const.u64 %r377,$LC2r427376377; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r381,$LC1r2113377; call 
[NVPTX assembly payload (iall_i8.o), garbled in extraction and not reproduced. Recoverable content: the tail of the masked reduction _gfortran_miall_i8, whose inner loop initializes a 64-bit accumulator to -1 and ANDs in each element whose mask byte is non-zero, with array-descriptor stride bookkeeping and diagnostics routed through _gfortrani_bounds_equal_extents and _gfortrani_bounds_ifunction_return; and the scalar-mask wrapper _gfortran_siall_i8, which forwards to _gfortran_iall_i8 when the MASK argument is absent or true, and otherwise allocates the result via _gfortrani_xmallocarray, performs the bounds checks gated on _gfortrani_compile_options, and fills the result with zero.]
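The masked inner loop summarized above reduces with bitwise AND from an all-ones accumulator, skipping elements whose mask byte is zero. A minimal C sketch of that loop, flattened to one dimension with hypothetical names (the actual generated code in fortran/generated/iall_i8.c iterates over full array descriptors with per-dimension strides):

    #include <stdint.h>
    #include <stdio.h>
    #include <stddef.h>

    /* Hypothetical flattened form of the masked IALL inner loop: a
       bitwise-AND reduction over len elements, skipping entries whose
       mask byte is zero.  The accumulator starts at all-ones, the
       identity of IAND, as in the PTX (mov.u64 %rN,-1 / and.b64). */
    int64_t iall_masked_1d(const int64_t *src, ptrdiff_t sstride,
                           const signed char *mask, ptrdiff_t mstride,
                           size_t len)
    {
        int64_t result = -1;                 /* all bits set */
        for (size_t n = 0; n < len; n++)
            if (mask[n * mstride])           /* ld.s8 + predicate guard */
                result &= src[n * sstride];
        return result;
    }

    int main(void)
    {
        int64_t a[4] = { 0xF0F0, 0xFF00, 0x0FF0, 0x1234 };
        signed char m[4] = { 1, 1, 1, 0 };   /* last element masked out */
        printf("%lld\n", (long long)iall_masked_1d(a, 1, m, 1, 4));
        return 0;
    }

The strides are element counts here for readability; the PTX works in byte offsets (the shl.b64 by 3 seen throughout the i8 code).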
[NVPTX assembly payload (iall_i16.o, compiled from fortran/generated/iall_i16.c), garbled in extraction and not reproduced. It opens with the closing ret of _gfortran_siall_i8, then declares _gfortran_iall_i16, _gfortran_miall_i16 and _gfortran_siall_i16 together with extern references to _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents, plus the IALL diagnostic strings stored as byte arrays ("... incorrect in IALL intrinsic: is %ld, should be between 1 and %...", "rank of return array incorrect in IALL intrinsic: is %ld, should be ...", "Funny sized logical array", "Incorrect extent in return value of IALL intrinsic in dimension %ld: is %ld, should be ..."). The three function bodies follow; each 128-bit element is reduced as a pair of 64-bit words (paired and.b64 on [%r] and [%r+8]) with a 16-byte element stride.]
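For the integer(kind=16) variants the element stride is 16 bytes and each element is combined as two 64-bit halves, as noted above. A rough C illustration of that split, assuming a simple two-word layout (the real code works through array descriptors and strides, not a flat array):

    #include <stdint.h>
    #include <stddef.h>

    /* Hypothetical pairwise AND-reduction of 128-bit elements stored as
       two 64-bit words, mirroring the paired and.b64 instructions and
       16-byte stride in the iall_i16 code summarized above. */
    typedef struct { uint64_t lo, hi; } u128_words;

    u128_words iall_u128(const u128_words *src, size_t len)
    {
        u128_words acc = { UINT64_MAX, UINT64_MAX };  /* identity: all bits set */
        for (size_t n = 0; n < len; n++) {
            acc.lo &= src[n].lo;
            acc.hi &= src[n].hi;
        }
        return acc;
    }

Splitting the 128-bit AND into two independent 64-bit ANDs is exact because the operation has no carries between words.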
[NVPTX assembly payload (iany_i1.o, compiled from fortran/generated/iany_i1.c), garbled in extraction and not reproduced. It opens with the closing ret of _gfortran_siall_i16, then declares and defines _gfortran_iany_i1, _gfortran_miany_i1 and _gfortran_siany_i1, referencing the same runtime helpers (_gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents) and carrying the matching IANY diagnostic strings ("... incorrect in IANY intrinsic: is %ld, should be between 1 and %...", "rank of return array incorrect in IANY intrinsic ...", "Funny sized logical array", "Incorrect extent in return value of IANY intrinsic in dimension %ld ..."). The reduction accumulates 1-byte elements with bitwise OR starting from 0; the masked variant guards each OR on a mask byte, and the scalar-mask wrapper either forwards to _gfortran_iany_i1 or fills the allocated result, mirroring the IALL code above.]
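iany_i1.o is the same reduction template instantiated for IANY: identity 0 and bitwise OR instead of all-ones and AND. A hedged one-dimensional C sketch of that inner loop, with an optional mask to cover both the plain and masked variants (names are illustrative, not the library's):

    #include <stdint.h>
    #include <stddef.h>

    /* Hypothetical inner loop of an IANY-style reduction over 1-byte
       integers: bitwise OR with identity 0, optionally masked, matching
       the or / mov 0 accumulator pattern in the iany_i1 code above. */
    int8_t iany_i1_1d(const int8_t *src, ptrdiff_t stride,
                      const signed char *mask, size_t len)
    {
        int8_t result = 0;                   /* identity of IOR */
        for (size_t n = 0; n < len; n++)
            if (mask == NULL || mask[n])     /* unmasked, or mask byte set */
                result |= src[n * stride];
        return result;
    }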
%r262,%r291,3; bra $L139; $L138: mov.u64 %r262,8; $L139: add.u64 %r266,%frame,240; mov.u32 %r27026272,[%value_in]; } add.u64 %r111,%r148,40; mov.u64 %r109,%frame; mov.u64 %r98,0; $L140: .loc 1 505 18 ld.u64 %r274,[%r111]; st.u64 [%r109],%r274; .loc 1 502 26 add.u64 %r98,%r98,1; .loc 1 502 3 add.u64 %r111,%r111,24; add.u64 %r109,%r109,8; setp.gt.s64 %r275,%r291,%r98; @ %r275 bra $L140; $L133: ld.u64 %r103,[%frame+240]; .loc 1 514 22 ld.u64 %r69,[%frame]; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r81,%r59,%r69; setp.le.s64 %r289,%r291,1; .loc 1 512 13 mov.u32 %r276,0; cvt.u32.u32 %r293,%r24; cvt.s64.s8 %r294,%r293; add.u64 %r295,%r294,-1; $L146: st.u8 [%r96],%r276; .loc 1 513 15 add.u64 %r103,%r103,1; .loc 1 514 12 add.u64 %r96,%r96,%r69; .loc 1 516 13 setp.ne.u64 %r277,%r59,%r103; @ %r277 bra $L146; .loc 1 525 7 @ ! %r289 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r97],%r292; .loc 1 523 23 mul.lo.u64 %r71,%r74,%r73; .loc 1 524 5 add.u64 %r95,%r95,1; .loc 1 525 7 add.u64 %r97,%r97,8; add.u64 %r119,%r119,8; add.u64 %r118,%r118,8; setp.ne.u64 %r280,%r95,%r295; @ %r280 bra $L144; bra $L109; $L171: add.u64 %r97,%frame,248; add.u64 %r119,%frame,8; add.u64 %r118,%frame,128; .loc 1 523 23 mov.u64 %r71,%r81; .loc 1 524 5 mov.u64 %r95,1; .loc 1 520 13 mov.u64 %r292,0; $L144: .loc 1 529 16 ld.u64 %r285,[%r97]; add.u64 %r73,%r285,1; st.u64 [%r97],%r73; .loc 1 530 23 ld.u64 %r74,[%r119]; .loc 1 530 13 sub.u64 %r286,%r74,%r71; add.u64 %r96,%r96,%r286; .loc 1 516 32 ld.u64 %r77,[%r118]; .loc 1 516 13 setp.eq.u64 %r287,%r73,%r77; @ %r287 bra $L145; .loc 1 520 13 mov.u64 %r103,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r96,[%r148]; .loc 1 449 6 setp.eq.u64 %r288,%r96,0; @ ! %r288 bra $L122; bra $L147; $L109: .loc 1 534 1 ret; } iany_i2.o/fortran/generated/iany_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_miany_i2 .visible .func _gfortran_miany_i2gfortran_siany_i2 .visible .func _gfortran_siany_i2VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,78,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,78,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,105] = {73,65,78,89114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,78,89,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,73,65,78,89,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_iany_i2 .visible .func _gfortran_iany_480u64 %r52; .reg 
.u64 %r58; .reg .u64 %r61; .reg .u649pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64predmov.u64 %r179,%ar0; mov.u64 %r180,%ar1; mov.u64 %r181,%ar2; .loc 1 55 10 ld.s8 %r22,[%r180+28]; .loc 1 55 38 add.u32 %r182,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r182; .loc 1 56 10 ld.u64 %r25,[%r181]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r184,%r104,63; set.u32.lt.s64 %r186,%r103,%r104; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 58 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 60 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r104,%r104; add.u64 %r201,%r200,%r104; shl.b64 %r202,%r201,3; add.u64 %r203,%r180,%r202; ld.u64 %r30,[%r203+56]; ld.u64 %r32,[%r203+48]; .loc 1 68 9 ld.u64 %r107,[%r203+40]; .loc 1 70 3 setp.ne.u64 %r217,%r104,0; @ %r217 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r218,%r103,%r104; @ %r218 bra $L4; bra $L46; $L3: add.u64 %r61,%r180,40; add.u64 %r40,%frame,120; add.u64 %r91,%frame,240; add.u64 %r219,%r180,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r174,%r219,%r223; .loc 1 76 12 mov.u64 %r363,0; $L8: .loc 1 72 18 ld.u64 %r224,[%r61]; st.u64 [%r40],%r224; .loc 1 73 19 ld.u64 %r226,[%r61+16]; add.u64 %r225,%r226,1; ld.u64 %r227,[%r61+8]; sub.u64 %r37,%r225,%r227; .loc 1 75 10 setp.lt.s64 %r228,%r37,0; @ %r228 bra $L6; .loc 1 73 17 st.u64 [%r91],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r91],%r363; $L7: .loc 1 70 3 add.u64 %r61,%r61,24; add.u64 %r40,%r40,8; add.u64 %r91,%r91,8; setp.ne.u64 %r230,%r61,%r174; @ %r230 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r231,[%r179]; setp.eq.u64 %r232,%r231,0; @ ! 
%r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r143,%r180,%r237; shl.b64 %r238,%r25,3; add.u64 %r125,%r238,-8; add.u64 %r351,%frame,120; add.u64 %r128,%r351,%r125; add.u64 %r350,%frame,240; add.u64 %r121,%r350,%r125; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r180,40; add.u64 %r95,%r246,%r247; .loc 1 84 12 mov.u64 %r362,0; $L14: .loc 1 80 18 ld.u64 %r248,[%r143]; st.u64 [%r128],%r248; .loc 1 81 19 ld.u64 %r250,[%r143+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r143+8]; sub.u64 %r44,%r249,%r251; .loc 1 83 10 setp.lt.s64 %r252,%r44,0; @ %r252 bra $L12; .loc 1 81 17 st.u64 [%r121],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r121],%r362; $L13: .loc 1 78 3 add.u64 %r143,%r143,24; add.u64 %r128,%r128,8; add.u64 %r121,%r121,8; setp.ne.u64 %r254,%r95,%r143; @ %r254 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r255,%r103,0; @ %r255 bra $L16; add.u64 %r350,%frame,240; $L34: add.u64 %r152,%r179,40; mov.u64 %r150,%r350; add.u64 %r256,%r179,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r144,%r256,%r262; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r263,0; st.u64 [%r179+8],%r263; .loc 1 103 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r179+28],%r267; .loc 1 105 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r52,%r268; add.u64 %r270,%r52,%r52; add.u64 %r271,%r270,%r52; shl.b64 %r272,%r271,3; add.u64 %r273,%r179,%r272; .loc 1 105 67 shl.b64 %r275,%r52,3; add.u64 %r276,%frame,%r275; .loc 1 105 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r109,%r278,%r279; .loc 1 107 29 mov.u64 %r281,281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 107 27 st.u64 [%r179],%r282; .loc 1 108 10 setp.eq.u64 %r284,%r109,0; @ ! 
%r284 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r48,%r99; $L17: .loc 1 98 4 st.u64 [%r152+8],%r285; ld.u64 %r48,[%r150]; add.u64 %r286,%r48,-1; st.u64 [%r152+16],%r286; st.u64 [%r152],%r99; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r287,%r144,%r152; @ %r287 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r179+48],%r109; mov.u64 %r289,-1; st.u64 [%r179+56],%r289; mov.u64 %r290,1; st.u64 [%r179+40],%r290; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r179+28]; .loc 1 118 10 setp.eq.u64 %r291,%r58,%r103; @ %r291 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r294,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r295,[%r294+36]; setp.eq.u32 %r296,%r295,0; @ %r296 bra $L21; .loc 1 125 2 add.u64 %r350,%frame,240; cvta.const.u64 %r300,$LC2r350299300; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r302,%r103,0; @ %r302 bra $L24; $L27: .loc 1 65 9 add.u64 %r303,%r30,1; .loc 1 65 7 sub.u64 %r105,%r303,%r32; max.s64 %r45,%r105,0; .loc 1 137 8 ld.u64 %r116,[%r180]; .loc 1 138 8 ld.u64 %r117,[%r179]; .loc 1 154 36 add.u64 %r66,%r107,%r107; ld.u64 %r138,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 add.u64 %r69,%r68,%r68; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 add.u64 %r72,%r71,%r71; .loc 1 169 32 ld.u64 %r123,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r307,%r68,%r123; .loc 1 176 9 add.u64 %r308,%r307,%r307; neg.s64 %r127,%r308; .loc 1 177 23 mul.lo.u64 %r309,%r71,%r123; .loc 1 177 9 add.u64 %r310,%r309,%r309; neg.s64 %r130,%r310; setp.gt.s64 %r352,%r105,0; setp.le.s64 %r353,%r103,1; cvt.u32.u32 %r355,%r22; cvt.s64.s8 %r356,%r355; add.u64 %r357,%r356,-1; add.u64 %r358,%frame,120; add.u64 %r359,%frame,240; .loc 1 173 13 mov.u64 %r360,0; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r179,40; mov.u64 %r162,%frame; cvt.u32.u32 %r312,%r22; cvt.s64.s8 %r311,%r312; add.u64 %r153,%r311,-1; .loc 1 129 3 mov.u64 %r118,0; add.u64 %r350,%frame,240; .loc 1 131 16 mov.u64 %r313,%r118; $L26: st.u64 [%r165],%r313; .loc 1 132 18 ld.u64 %r314,[%r164]; st.u64 [%r162],%r314; .loc 1 133 17 shl.b64 %r316,%r118,3; add.u64 %r317,%r350,%r316; .loc 1 133 10 ld.u64 %r318,[%r317]; setp.le.s64 %r319,%r318,0; @ %r319 bra $L1; .loc 1 129 26 add.u64 %r118,%r118,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r320,%r118,%r153; @ %r320 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r352 bra $L35; .loc 1 150 10 mov.u16 %r322,0; st.u16 [%r117],%r322; bra $L29; $L35: mov.u64 %r112,%r116; .loc 1 148 10 mov.u32 %r110,0; .loc 1 154 13 mov.u64 %r111,0; $L28: .loc 1 158 10 ld.u16 %r324,[%r112]; cvt.u16.u32 %r325,%r110; or.b16 %r323,%r324,%r325; cvt.s32.s16 %r110,%r323; .loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r66; .loc 1 154 6 setp.gt.s64 %r326,%r45,%r111; @ %r326 bra $L28; .loc 1 161 12 cvt.u16.u32 %r327,%r110; st.u16 [%r117],%r327; $L29: .loc 1 165 15 add.u64 %r138,%r138,1; .loc 1 166 12 add.u64 %r116,%r116,%r69; .loc 1 167 12 add.u64 %r117,%r117,%r72; .loc 1 169 13 setp.ne.u64 %r328,%r138,%r123; @ %r328 bra $L25; .loc 1 176 9 add.u64 %r113,%r116,%r127; .loc 1 177 9 add.u64 %r114,%r117,%r130; .loc 1 179 7 @ %r353 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r115,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r122],%r360; .loc 1 176 23 mul.lo.u64 %r334,%r85,%r84; .loc 1 176 9 add.u64 %r335,%r334,%r334; sub.u64 %r113,%r116,%r335; .loc 1 177 23 
mul.lo.u64 %r336,%r88,%r84; .loc 1 177 9 add.u64 %r337,%r336,%r336; sub.u64 %r114,%r117,%r337; .loc 1 178 5 add.u64 %r115,%r115,1; .loc 1 179 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r338,%r115,%r357; @ %r338 bra $L1; $L32: .loc 1 187 16 ld.u64 %r339,[%r122]; add.u64 %r84,%r339,1; st.u64 [%r122],%r84; .loc 1 188 23 add.u64 %r341,%r358,%r140; ld.u64 %r85,[%r341]; .loc 1 188 13 add.u64 %r342,%r85,%r85; add.u64 %r116,%r113,%r342; .loc 1 189 23 add.u64 %r343,%frame,%r140; ld.u64 %r88,[%r343]; .loc 1 189 13 add.u64 %r344,%r88,%r88; add.u64 %r117,%r114,%r344; .loc 1 169 32 add.u64 %r346,%r359,%r140; ld.u64 %r92,[%r346]; .loc 1 169 13 setp.eq.u64 %r347,%r84,%r92; @ %r347 bra $L33; .loc 1 173 13 mov.u64 %r138,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r348,[%r179]; setp.eq.u64 %r349,%r348,0; @ ! %r349 bra $L11; bra $L34; $L1: .loc 1 19_gfortran_miany_i2 .visible .func _gfortran_miany_608u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .pred %r392; .reg .pred %r393; .reg .u64u64 %r421; .reg .u64predmov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 223 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $L49gfortran_iany_.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r211]; .loc 1 233 7 add.u64 %r137,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r210+28]; .loc 1 234 38 add.u32 %r217,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r138,%r217; .loc 1 237 7 shr.u64 %r219,%r137,63; set.u32.gt.s64 %r221,%r137,%r138; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 237 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L51; .loc 1 239 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r435,%r137,%r137; add.u64 %r436,%r435,%r137; shl.b64 %r237,%r436,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 244 7 ld.u64 %r248,[%r238+48]; sub.u64 %r139,%r240,%r248; .loc 1 245 6 setp.le.s64 %r249,%r139,0; @ %r249 bra $L48; .loc 1 248 9 ld.u64 %r152,[%r212]; .loc 1 250 15 ld.u64 %r34,[%r212+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 252 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 252 6 cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L53; .loc 1 254 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L54; $L53: .loc 1 261 9 shl.b64 %r270,%r436,3; add.u64 %r271,%r210,%r270; ld.u64 %r142,[%r271+40]; .loc 1 262 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 264 3 setp.ne.u64 %r279,%r137,0; @ %r279 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r280,%r137,%r138; @ %r280 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r281281_gfortran_runtime_error5: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; add.u64 %r285,%r22,%r22; add.u64 %r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 264 3 mov.u64 %r92,0; add.u64 
%r433,%frame,240; add.u64 %r432,%frame,360; .loc 1 271 12 mov.u64 %r447,%r92; $L60: .loc 1 266 18 add.u64 %r289,%r433,%r92; ld.u64 %r290,[%r40]; st.u64 [%r289],%r290; .loc 1 267 18 add.u64 %r291,%frame,%r92; .loc 1 267 20 ld.u64 %r293,[%r179]; mul.lo.u64 %r292,%r293,%r34; .loc 1 267 18 st.u64 [%r291],%r292; .loc 1 268 19 ld.u64 %r295,[%r40+16]; add.u64 %r294,%r295,1; ld.u64 %r296,[%r40+8]; sub.u64 %r54,%r294,%r296; .loc 1 270 10 setp.lt.s64 %r297,%r54,0; @ %r297 bra $L58; .loc 1 268 17 add.u64 %r299,%r432,%r92; st.u64 [%r299],%r54; bra $L59; $L58: .loc 1 271 12 add.u64 %r301,%r432,%r92; st.u64 [%r301],%r447; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r179,%r179,24; add.u64 %r92,%r92,8; setp.ne.u64 %r303,%r40,%r206; @ %r303 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r304,[%r209]; setp.eq.u64 %r305,%r304,0; @ ! %r305 bra $L63; bra $L62; $L56: add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r127,%r309,40; add.u64 %r133,%r210,%r127; add.u64 %r125,%r212,%r127; shl.b64 %r310,%r22,3; add.u64 %r103,%r310,-8; cvt.u32.u32 %r312,%r23; cvt.s64.s8 %r311,%r312; shl.b64 %r313,%r311,3; add.u64 %r69,%r313,-8; add.u64 %r433,%frame,240; add.u64 %r432,%frame,360; .loc 1 281 12 mov.u64 %r446,0; $L66: .loc 1 276 18 add.u64 %r315,%r433,%r103; ld.u64 %r316,[%r133]; st.u64 [%r315],%r316; .loc 1 277 18 add.u64 %r317,%frame,%r103; .loc 1 277 20 ld.u64 %r319,[%r125]; mul.lo.u64 %r318,%r319,%r34; .loc 1 277 18 st.u64 [%r317],%r318; .loc 1 278 19 ld.u64 %r321,[%r133+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r133+8]; sub.u64 %r66,%r320,%r322; .loc 1 280 10 setp.lt.s64 %r323,%r66,0; @ %r323 bra $L64; .loc 1 278 17 add.u64 %r325,%r432,%r103; st.u64 [%r325],%r66; bra $L65; $L64: .loc 1 281 12 add.u64 %r327,%r432,%r103; st.u64 [%r327],%r446; $L65: .loc 1 274 3 add.u64 %r133,%r133,24; add.u64 %r125,%r125,24; add.u64 %r103,%r103,8; setp.ne.u64 %r329,%r69,%r103; @ %r329 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r330,%r138,0; @ %r330 bra $L68; add.u64 %r432,%frame,360; $L86: add.u64 %r172,%r209,40; mov.u64 %r164,%r432; cvt.u32.u32 %r332,%r23; cvt.s64.s8 %r331,%r332; add.u64 %r334,%r331,%r331; add.u64 %r335,%r334,%r331; shl.b64 %r336,%r335,3; add.u64 %r337,%r209,16; add.u64 %r134,%r336,%r337; .loc 1 291 10 mov.u64 %r129,1; .loc 1 295 4 mov.u64 %r356,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r338,%r23,-2; cvt.s64.s32 %r75,%r338; add.u64 %r340,%r75,%r75; add.u64 %r341,%r340,%r75; shl.b64 %r342,%r341,3; add.u64 %r343,%r209,%r342; .loc 1 299 67 shl.b64 %r345,%r75,3; add.u64 %r346,%frame,%r345; .loc 1 299 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r146,%r348,%r349; .loc 1 301 24 mov.u64 %r350,0; st.u64 [%r209+8],%r350; .loc 1 302 28 cvt.u16.u32 %r353,%r23; add.u16 %r352,%r353,-1; cvt.u32.u16 %r354,%r352; st.u8 [%r209+28],%r354; .loc 1 304 10 setp.eq.u64 %r355,%r146,0; @ %r355 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r129,%r72,%r129; $L69: .loc 1 295 4 st.u64 [%r172+8],%r356; ld.u64 %r72,[%r164]; add.u64 %r357,%r72,-1; st.u64 [%r172+16],%r357; st.u64 [%r172],%r129; .loc 1 288 7 add.u64 %r172,%r172,24; add.u64 %r164,%r164,8; setp.ne.u64 %r358,%r134,%r172; @ %r358 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r209+48],%r146; mov.u64 %r360,-1; st.u64 [%r209+56],%r360; mov.u64 %r361,1; st.u64 [%r209+40],%r361; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r363,363; call (%value_in),_gfortrani_xmallocarray64,[%value_in]; } .loc 1 311 22 st.u64 [%r209],%r364; bra $L73; $L63: .loc 1 316 19 ld.s8 %r366,[%r209+28]; .loc 1 316 10 setp.eq.u64 
%r367,%r366,%r138; @ %r367 bra $L74; .loc 1 317 2 cvta.const.u64 %r36836_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r370,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L73; .loc 1 321 4 add.u64 %r432,%frame,360; cvta.const.u64 %r376,$LC2r437376; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r382,%r138,0; @ %r382 bra $L75; $L78: .loc 1 336 8 ld.u64 %r153,[%r209]; .loc 1 337 8 ld.u64 %r151,[%r210]; .loc 1 339 9 setp.ne.u64 %r383,%r151,0; @ %r383 bra $L76; bra $L48; $L75: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r432,%frame,360; .loc 1 330 16 mov.u64 %r386,%r157; $L77: st.u64 [%r190],%r386; .loc 1 331 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 332 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r432,%r389; .loc 1 332 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 add.u64 %r91,%r142,%r142; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r94,[%frame+240]; .loc 1 359 12 add.u64 %r95,%r94,%r94; .loc 1 360 23 ld.u64 %r97,[%frame]; .loc 1 361 22 ld.u64 %r99,[%frame+120]; .loc 1 361 12 add.u64 %r100,%r99,%r99; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r94; .loc 1 371 24 mul.lo.u64 %r397,%r85,%r97; .loc 1 371 10 neg.s64 %r163,%r397; .loc 1 372 23 mul.lo.u64 %r398,%r85,%r99; .loc 1 372 9 add.u64 %r399,%r398,%r398; neg.s64 %r165,%r399; setp.le.s64 %r437,%r138,1; .loc 1 348 10 mov.u32 %r439,0; cvt.u32.u32 %r440,%r23; cvt.s64.s8 %r441,%r440; add.u64 %r442,%r441,-1; add.u64 %r443,%frame,240; add.u64 %r444,%frame,360; add.u64 %r445,%frame,120; $L87: .loc 1 328 3 mov.u64 %r156,%r152; mov.u64 %r155,%r151; .loc 1 348 10 mov.u32 %r130,%r439; .loc 1 349 9 mov.u64 %r154,0; $L80: .loc 1 352 6 ld.s8 %r401,[%r156]; cvt.u16.u32 %r400,%r401; setp.eq.u16 %r402,%r400,0; @ %r402 bra $L79; .loc 1 353 12 ld.u16 %r404,[%r155]; cvt.u16.u32 %r405,%r130; or.b16 %r403,%r404,%r405; cvt.s32.s16 %r130,%r403; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r91; .loc 1 349 47 add.u64 %r156,%r156,%r45; .loc 1 349 2 setp.ne.u64 %r406,%r139,%r154; @ %r406 bra $L80; .loc 1 355 8 cvt.u16.u32 %r407,%r130; st.u16 [%r153],%r407; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r151,%r151,%r95; .loc 1 360 13 add.u64 %r152,%r152,%r97; .loc 1 361 12 add.u64 %r153,%r153,%r100; .loc 1 363 13 setp.ne.u64 %r408,%r85,%r39; @ %r408 bra $L87; .loc 1 371 10 add.u64 %r148,%r152,%r163; .loc 1 372 9 add.u64 %r149,%r153,%r165; .loc 1 374 7 @ ! 
%r437 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r438; .loc 1 370 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 371 24 mul.lo.u64 %r411,%r116,%r113; .loc 1 371 10 sub.u64 %r148,%r152,%r411; .loc 1 372 23 mul.lo.u64 %r412,%r118,%r113; .loc 1 372 9 add.u64 %r413,%r412,%r412; sub.u64 %r149,%r153,%r413; .loc 1 373 5 add.u64 %r150,%r150,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r166,%r166,8; setp.ne.u64 %r414,%r150,%r442; @ %r414 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 370 23 mov.u64 %r102,%r38; .loc 1 374 7 mov.u64 %r166,8; .loc 1 373 5 mov.u64 %r150,1; .loc 1 367 13 mov.u64 %r438,0; $L84: .loc 1 382 16 ld.u64 %r418,[%r41]; add.u64 %r113,%r418,1; st.u64 [%r41],%r113; .loc 1 383 23 add.u64 %r420,%r443,%r166; ld.u64 %r114,[%r420]; .loc 1 383 13 sub.u64 %r421,%r114,%r102; add.u64 %r422,%r421,%r421; add.u64 %r151,%r151,%r422; .loc 1 384 24 add.u64 %r423,%frame,%r166; ld.u64 %r116,[%r423]; .loc 1 384 14 add.u64 %r152,%r148,%r116; .loc 1 385 23 add.u64 %r425,%r445,%r166; ld.u64 %r118,[%r425]; .loc 1 385 13 add.u64 %r426,%r118,%r118; add.u64 %r153,%r149,%r426; .loc 1 363 32 add.u64 %r428,%r444,%r166; ld.u64 %r122,[%r428]; .loc 1 363 13 setp.eq.u64 %r429,%r113,%r122; @ %r429 bra $L85; .loc 1 367 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r430,[%r209]; setp.eq.u64 %r431,%r430,0; @ ! %r431 bra $L63; bra $L86; $L48:_gfortran_siany_i2 .visible .func _gfortran_siany_68u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64predpred %r292; .reg .predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_iany_.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r295,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r295; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r295; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r301,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r301; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r294,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r294,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r300,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r300; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r295,0; @ %r211 bra $L127; add.u64 %r294,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r294; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r244,4; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r295,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r295; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r295; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r295,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: .loc 
1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r295,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r295,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 add.u64 %r69,%r68,%r68; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r293,%r295,1; .loc 1 512 13 mov.u16 %r279,0; cvt.u32.u32 %r297,%r24; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; $L146: st.u16 [%r98],%r279; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L146; .loc 1 525 7 @ ! %r293 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r296; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r299; @ %r283 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; .loc 1 520 13 mov.u64 %r296,0; $L144: .loc 1 529 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r289,%r75,%r72; add.u64 %r290,%r289,%r289; add.u64 %r98,%r98,%r290; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r292,%r98,0; @ ! 
%r292 bra $L122; bra $L147; $L109: .loc 1 534 1 ret; } iany_i4.o/fortran/generated/iany_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_miany_i4 .visible .func _gfortran_miany_i4gfortran_siany_i4 .visible .func _gfortran_siany_i4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,78,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,78,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,105] = {73,65,78,89114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,65,78,89,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,73,65,78,89,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_iany_i4 .visible .func _gfortran_iany_480u64 %r52; .reg .u64 %r58; .reg .u64 %r61; .reg .u649pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64u64predpredu64u64predmov.u64 %r179,%ar0; mov.u64 %r180,%ar1; mov.u64 %r181,%ar2; .loc 1 55 10 ld.s8 %r22,[%r180+28]; .loc 1 55 38 add.u32 %r182,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r182; .loc 1 56 10 ld.u64 %r25,[%r181]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r184,%r104,63; set.u32.lt.s64 %r186,%r103,%r104; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 58 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 60 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r104,%r104; add.u64 %r201,%r200,%r104; shl.b64 %r202,%r201,3; add.u64 %r203,%r180,%r202; ld.u64 %r30,[%r203+56]; ld.u64 %r32,[%r203+48]; .loc 1 68 9 ld.u64 %r107,[%r203+40]; .loc 1 70 3 setp.ne.u64 %r217,%r104,0; @ %r217 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r218,%r103,%r104; @ %r218 bra $L4; bra $L46; $L3: add.u64 %r61,%r180,40; add.u64 %r40,%frame,120; add.u64 %r91,%frame,240; add.u64 %r219,%r180,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r174,%r219,%r223; .loc 1 76 12 mov.u64 %r357,0; $L8: .loc 1 72 18 ld.u64 %r224,[%r61]; st.u64 [%r40],%r224; .loc 1 73 19 ld.u64 %r226,[%r61+16]; add.u64 %r225,%r226,1; ld.u64 %r227,[%r61+8]; sub.u64 %r37,%r225,%r227; .loc 1 75 10 setp.lt.s64 %r228,%r37,0; @ %r228 bra $L6; .loc 1 73 17 st.u64 [%r91],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r91],%r357; $L7: .loc 1 70 3 add.u64 %r61,%r61,24; add.u64 %r40,%r40,8; add.u64 %r91,%r91,8; 
setp.ne.u64 %r230,%r61,%r174; @ %r230 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r231,[%r179]; setp.eq.u64 %r232,%r231,0; @ ! %r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r143,%r180,%r237; shl.b64 %r238,%r25,3; add.u64 %r125,%r238,-8; add.u64 %r344,%frame,120; add.u64 %r128,%r344,%r125; add.u64 %r345,%frame,240; add.u64 %r121,%r345,%r125; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r180,40; add.u64 %r95,%r246,%r247; .loc 1 84 12 mov.u64 %r356,0; $L14: .loc 1 80 18 ld.u64 %r248,[%r143]; st.u64 [%r128],%r248; .loc 1 81 19 ld.u64 %r250,[%r143+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r143+8]; sub.u64 %r44,%r249,%r251; .loc 1 83 10 setp.lt.s64 %r252,%r44,0; @ %r252 bra $L12; .loc 1 81 17 st.u64 [%r121],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r121],%r356; $L13: .loc 1 78 3 add.u64 %r143,%r143,24; add.u64 %r128,%r128,8; add.u64 %r121,%r121,8; setp.ne.u64 %r254,%r95,%r143; @ %r254 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r255,%r103,0; @ %r255 bra $L16; add.u64 %r345,%frame,240; $L34: add.u64 %r152,%r179,40; mov.u64 %r150,%r345; add.u64 %r256,%r179,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r144,%r256,%r262; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r263,0; st.u64 [%r179+8],%r263; .loc 1 103 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r179+28],%r267; .loc 1 105 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r52,%r268; add.u64 %r270,%r52,%r52; add.u64 %r271,%r270,%r52; shl.b64 %r272,%r271,3; add.u64 %r273,%r179,%r272; .loc 1 105 67 shl.b64 %r275,%r52,3; add.u64 %r276,%frame,%r275; .loc 1 105 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r109,%r278,%r279; .loc 1 107 29281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 107 27 st.u64 [%r179],%r282; .loc 1 108 10 setp.eq.u64 %r284,%r109,0; @ ! 
%r284 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r48,%r99; $L17: .loc 1 98 4 st.u64 [%r152+8],%r285; ld.u64 %r48,[%r150]; add.u64 %r286,%r48,-1; st.u64 [%r152+16],%r286; st.u64 [%r152],%r99; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r287,%r144,%r152; @ %r287 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r179+48],%r109; mov.u64 %r289,-1; st.u64 [%r179+56],%r289; mov.u64 %r290,1; st.u64 [%r179+40],%r290; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r179+28]; .loc 1 118 10 setp.eq.u64 %r291,%r58,%r103; @ %r291 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r294,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r295,[%r294+36]; setp.eq.u32 %r296,%r295,0; @ %r296 bra $L21; .loc 1 125 2 add.u64 %r345,%frame,240; cvta.const.u64 %r300,$LC2r345299300; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r302,%r103,0; @ %r302 bra $L24; $L27: .loc 1 65 9 add.u64 %r303,%r30,1; .loc 1 65 7 sub.u64 %r105,%r303,%r32; max.s64 %r45,%r105,0; .loc 1 137 8 ld.u64 %r116,[%r180]; .loc 1 138 8 ld.u64 %r117,[%r179]; .loc 1 154 36 shl.b64 %r66,%r107,2; ld.u64 %r138,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 shl.b64 %r69,%r68,2; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 shl.b64 %r72,%r71,2; .loc 1 169 32 ld.u64 %r123,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r304,%r68,%r123; .loc 1 176 9 shl.b64 %r305,%r304,2; neg.s64 %r127,%r305; .loc 1 177 23 mul.lo.u64 %r306,%r71,%r123; .loc 1 177 9 shl.b64 %r307,%r306,2; neg.s64 %r130,%r307; setp.gt.s64 %r346,%r105,0; setp.le.s64 %r347,%r103,1; cvt.u32.u32 %r349,%r22; cvt.s64.s8 %r350,%r349; add.u64 %r351,%r350,-1; add.u64 %r352,%frame,120; add.u64 %r353,%frame,240; .loc 1 173 13 mov.u64 %r354,0; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r179,40; mov.u64 %r162,%frame; cvt.u32.u32 %r309,%r22; cvt.s64.s8 %r308,%r309; add.u64 %r153,%r308,-1; .loc 1 129 3 mov.u64 %r118,0; add.u64 %r345,%frame,240; .loc 1 131 16 mov.u64 %r310,%r118; $L26: st.u64 [%r165],%r310; .loc 1 132 18 ld.u64 %r311,[%r164]; st.u64 [%r162],%r311; .loc 1 133 17 shl.b64 %r313,%r118,3; add.u64 %r314,%r345,%r313; .loc 1 133 10 ld.u64 %r315,[%r314]; setp.le.s64 %r316,%r315,0; @ %r316 bra $L1; .loc 1 129 26 add.u64 %r118,%r118,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r317,%r118,%r153; @ %r317 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r346 bra $L35; .loc 1 150 10 mov.u32 %r319,0; st.u32 [%r117],%r319; bra $L29; $L35: mov.u64 %r112,%r116; .loc 1 148 10 mov.u32 %r110,0; .loc 1 154 13 mov.u64 %r111,0; $L28: .loc 1 158 10 ld.u32 %r320,[%r112]; or.b32 %r110,%r110,%r320; .loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r66; .loc 1 154 6 setp.gt.s64 %r321,%r45,%r111; @ %r321 bra $L28; .loc 1 161 12 st.u32 [%r117],%r110; $L29: .loc 1 165 15 add.u64 %r138,%r138,1; .loc 1 166 12 add.u64 %r116,%r116,%r69; .loc 1 167 12 add.u64 %r117,%r117,%r72; .loc 1 169 13 setp.ne.u64 %r322,%r138,%r123; @ %r322 bra $L25; .loc 1 176 9 add.u64 %r113,%r116,%r127; .loc 1 177 9 add.u64 %r114,%r117,%r130; .loc 1 179 7 @ %r347 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r115,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r122],%r354; .loc 1 176 23 mul.lo.u64 %r328,%r85,%r84; .loc 1 176 9 shl.b64 %r329,%r328,2; sub.u64 %r113,%r116,%r329; .loc 1 177 23 mul.lo.u64 %r330,%r88,%r84; .loc 1 177 9 shl.b64 %r331,%r330,2; sub.u64 %r114,%r117,%r331; .loc 1 178 5 
add.u64 %r115,%r115,1; .loc 1 179 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r332,%r115,%r351; @ %r332 bra $L1; $L32: .loc 1 187 16 ld.u64 %r333,[%r122]; add.u64 %r84,%r333,1; st.u64 [%r122],%r84; .loc 1 188 23 add.u64 %r335,%r352,%r140; ld.u64 %r85,[%r335]; .loc 1 188 13 shl.b64 %r336,%r85,2; add.u64 %r116,%r113,%r336; .loc 1 189 23 add.u64 %r337,%frame,%r140; ld.u64 %r88,[%r337]; .loc 1 189 13 shl.b64 %r338,%r88,2; add.u64 %r117,%r114,%r338; .loc 1 169 32 add.u64 %r340,%r353,%r140; ld.u64 %r92,[%r340]; .loc 1 169 13 setp.eq.u64 %r341,%r84,%r92; @ %r341 bra $L33; .loc 1 173 13 mov.u64 %r138,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r342,[%r179]; setp.eq.u64 %r343,%r342,0; @ ! %r343 bra $L11; bra $L34; $L1: .loc 1 19_gfortran_miany_i4 .visible .func _gfortran_miany_608u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .predpred %r399; .reg .u32 %r400; .reg .pred %r401; .reg .predmov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 223 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $L49gfortran_iany_.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r211]; .loc 1 233 7 add.u64 %r137,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r210+28]; .loc 1 234 38 add.u32 %r217,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r138,%r217; .loc 1 237 7 shr.u64 %r219,%r137,63; set.u32.gt.s64 %r221,%r137,%r138; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 237 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L51; .loc 1 239 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r429,%r137,%r137; add.u64 %r430,%r429,%r137; shl.b64 %r237,%r430,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 244 7 ld.u64 %r248,[%r238+48]; sub.u64 %r139,%r240,%r248; .loc 1 245 6 setp.le.s64 %r249,%r139,0; @ %r249 bra $L48; .loc 1 248 9 ld.u64 %r152,[%r212]; .loc 1 250 15 ld.u64 %r34,[%r212+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 252 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 252 6 cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L53; .loc 1 254 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L54; $L53: .loc 1 261 9 shl.b64 %r270,%r430,3; add.u64 %r271,%r210,%r270; ld.u64 %r142,[%r271+40]; .loc 1 262 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 264 3 setp.ne.u64 %r279,%r137,0; @ %r279 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r280,%r137,%r138; @ %r280 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r281281_gfortran_runtime_error5: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; add.u64 %r285,%r22,%r22; add.u64 %r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 264 3 mov.u64 %r92,0; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 271 12 mov.u64 %r441,%r92; $L60: .loc 1 266 18 add.u64 %r289,%r427,%r92; ld.u64 
%r290,[%r40]; st.u64 [%r289],%r290; .loc 1 267 18 add.u64 %r291,%frame,%r92; .loc 1 267 20 ld.u64 %r293,[%r179]; mul.lo.u64 %r292,%r293,%r34; .loc 1 267 18 st.u64 [%r291],%r292; .loc 1 268 19 ld.u64 %r295,[%r40+16]; add.u64 %r294,%r295,1; ld.u64 %r296,[%r40+8]; sub.u64 %r54,%r294,%r296; .loc 1 270 10 setp.lt.s64 %r297,%r54,0; @ %r297 bra $L58; .loc 1 268 17 add.u64 %r299,%r426,%r92; st.u64 [%r299],%r54; bra $L59; $L58: .loc 1 271 12 add.u64 %r301,%r426,%r92; st.u64 [%r301],%r441; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r179,%r179,24; add.u64 %r92,%r92,8; setp.ne.u64 %r303,%r40,%r206; @ %r303 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r304,[%r209]; setp.eq.u64 %r305,%r304,0; @ ! %r305 bra $L63; bra $L62; $L56: add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r127,%r309,40; add.u64 %r133,%r210,%r127; add.u64 %r125,%r212,%r127; shl.b64 %r310,%r22,3; add.u64 %r103,%r310,-8; cvt.u32.u32 %r312,%r23; cvt.s64.s8 %r311,%r312; shl.b64 %r313,%r311,3; add.u64 %r69,%r313,-8; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 281 12 mov.u64 %r440,0; $L66: .loc 1 276 18 add.u64 %r315,%r427,%r103; ld.u64 %r316,[%r133]; st.u64 [%r315],%r316; .loc 1 277 18 add.u64 %r317,%frame,%r103; .loc 1 277 20 ld.u64 %r319,[%r125]; mul.lo.u64 %r318,%r319,%r34; .loc 1 277 18 st.u64 [%r317],%r318; .loc 1 278 19 ld.u64 %r321,[%r133+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r133+8]; sub.u64 %r66,%r320,%r322; .loc 1 280 10 setp.lt.s64 %r323,%r66,0; @ %r323 bra $L64; .loc 1 278 17 add.u64 %r325,%r426,%r103; st.u64 [%r325],%r66; bra $L65; $L64: .loc 1 281 12 add.u64 %r327,%r426,%r103; st.u64 [%r327],%r440; $L65: .loc 1 274 3 add.u64 %r133,%r133,24; add.u64 %r125,%r125,24; add.u64 %r103,%r103,8; setp.ne.u64 %r329,%r69,%r103; @ %r329 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r330,%r138,0; @ %r330 bra $L68; add.u64 %r426,%frame,360; $L86: add.u64 %r172,%r209,40; mov.u64 %r164,%r426; cvt.u32.u32 %r332,%r23; cvt.s64.s8 %r331,%r332; add.u64 %r334,%r331,%r331; add.u64 %r335,%r334,%r331; shl.b64 %r336,%r335,3; add.u64 %r337,%r209,16; add.u64 %r134,%r336,%r337; .loc 1 291 10 mov.u64 %r129,1; .loc 1 295 4 mov.u64 %r356,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r338,%r23,-2; cvt.s64.s32 %r75,%r338; add.u64 %r340,%r75,%r75; add.u64 %r341,%r340,%r75; shl.b64 %r342,%r341,3; add.u64 %r343,%r209,%r342; .loc 1 299 67 shl.b64 %r345,%r75,3; add.u64 %r346,%frame,%r345; .loc 1 299 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r146,%r348,%r349; .loc 1 301 24 mov.u64 %r350,0; st.u64 [%r209+8],%r350; .loc 1 302 28 cvt.u16.u32 %r353,%r23; add.u16 %r352,%r353,-1; cvt.u32.u16 %r354,%r352; st.u8 [%r209+28],%r354; .loc 1 304 10 setp.eq.u64 %r355,%r146,0; @ %r355 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r129,%r72,%r129; $L69: .loc 1 295 4 st.u64 [%r172+8],%r356; ld.u64 %r72,[%r164]; add.u64 %r357,%r72,-1; st.u64 [%r172+16],%r357; st.u64 [%r172],%r129; .loc 1 288 7 add.u64 %r172,%r172,24; add.u64 %r164,%r164,8; setp.ne.u64 %r358,%r134,%r172; @ %r358 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r209+48],%r146; mov.u64 %r360,-1; st.u64 [%r209+56],%r360; mov.u64 %r361,1; st.u64 [%r209+40],%r361; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24363; call (%value_in),_gfortrani_xmallocarray64,[%value_in]; } .loc 1 311 22 st.u64 [%r209],%r364; bra $L73; $L63: .loc 1 316 19 ld.s8 %r366,[%r209+28]; .loc 1 316 10 setp.eq.u64 %r367,%r366,%r138; @ %r367 bra $L74; .loc 1 317 2 cvta.const.u64 %r36836_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 
%r370,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L73; .loc 1 321 4 add.u64 %r426,%frame,360; cvta.const.u64 %r376,$LC2r42637376; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r382,%r138,0; @ %r382 bra $L75; $L78: .loc 1 336 8 ld.u64 %r153,[%r209]; .loc 1 337 8 ld.u64 %r151,[%r210]; .loc 1 339 9 setp.ne.u64 %r383,%r151,0; @ %r383 bra $L76; bra $L48; $L75: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r426,%frame,360; .loc 1 330 16 mov.u64 %r386,%r157; $L77: st.u64 [%r190],%r386; .loc 1 331 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 332 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r426,%r389; .loc 1 332 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 shl.b64 %r91,%r142,2; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r94,[%frame+240]; .loc 1 359 12 shl.b64 %r95,%r94,2; .loc 1 360 23 ld.u64 %r97,[%frame]; .loc 1 361 22 ld.u64 %r99,[%frame+120]; .loc 1 361 12 shl.b64 %r100,%r99,2; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r94; .loc 1 371 24 mul.lo.u64 %r394,%r85,%r97; .loc 1 371 10 neg.s64 %r163,%r394; .loc 1 372 23 mul.lo.u64 %r395,%r85,%r99; .loc 1 372 9 shl.b64 %r396,%r395,2; neg.s64 %r165,%r396; setp.le.s64 %r431,%r138,1; cvt.u32.u32 %r433,%r23; cvt.s64.s8 %r434,%r433; add.u64 %r435,%r434,-1; add.u64 %r436,%frame,240; add.u64 %r437,%frame,360; add.u64 %r438,%frame,120; .loc 1 367 13 mov.u64 %r439,0; $L87: .loc 1 328 3 mov.u64 %r156,%r152; mov.u64 %r155,%r151; .loc 1 348 10 mov.u32 %r130,0; .loc 1 349 9 mov.u64 %r154,0; $L80: .loc 1 352 6 ld.s8 %r398,[%r156]; cvt.u16.u32 %r397,%r398; setp.eq.u16 %r399,%r397,0; @ %r399 bra $L79; .loc 1 353 12 ld.u32 %r400,[%r155]; or.b32 %r130,%r130,%r400; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r91; .loc 1 349 47 add.u64 %r156,%r156,%r45; .loc 1 349 2 setp.ne.u64 %r401,%r139,%r154; @ %r401 bra $L80; .loc 1 355 8 st.u32 [%r153],%r130; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r151,%r151,%r95; .loc 1 360 13 add.u64 %r152,%r152,%r97; .loc 1 361 12 add.u64 %r153,%r153,%r100; .loc 1 363 13 setp.ne.u64 %r402,%r85,%r39; @ %r402 bra $L87; .loc 1 371 10 add.u64 %r148,%r152,%r163; .loc 1 372 9 add.u64 %r149,%r153,%r165; .loc 1 374 7 @ ! 
%r431 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r439; .loc 1 370 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 371 24 mul.lo.u64 %r405,%r116,%r113; .loc 1 371 10 sub.u64 %r148,%r152,%r405; .loc 1 372 23 mul.lo.u64 %r406,%r118,%r113; .loc 1 372 9 shl.b64 %r407,%r406,2; sub.u64 %r149,%r153,%r407; .loc 1 373 5 add.u64 %r150,%r150,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r166,%r166,8; setp.ne.u64 %r408,%r150,%r435; @ %r408 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 370 23 mov.u64 %r102,%r38; .loc 1 374 7 mov.u64 %r166,8; .loc 1 373 5 mov.u64 %r150,1; $L84: .loc 1 382 16 ld.u64 %r412,[%r41]; add.u64 %r113,%r412,1; st.u64 [%r41],%r113; .loc 1 383 23 add.u64 %r414,%r436,%r166; ld.u64 %r114,[%r414]; .loc 1 383 13 sub.u64 %r415,%r114,%r102; shl.b64 %r416,%r415,2; add.u64 %r151,%r151,%r416; .loc 1 384 24 add.u64 %r417,%frame,%r166; ld.u64 %r116,[%r417]; .loc 1 384 14 add.u64 %r152,%r148,%r116; .loc 1 385 23 add.u64 %r419,%r438,%r166; ld.u64 %r118,[%r419]; .loc 1 385 13 shl.b64 %r420,%r118,2; add.u64 %r153,%r149,%r420; .loc 1 363 32 add.u64 %r422,%r437,%r166; ld.u64 %r122,[%r422]; .loc 1 363 13 setp.eq.u64 %r423,%r113,%r122; @ %r423 bra $L85; .loc 1 367 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r424,[%r209]; setp.eq.u64 %r425,%r424,0; @ ! %r425 bra $L63; bra $L86; $L48:_gfortran_siany_i4 .visible .func _gfortran_siany_68u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64predu64mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_iany_.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r294,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r294; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r294; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r300,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r300; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
[Closing portion of the preceding object's scalar-MASK reduction entry point (_gfortran_siany_*): bounds diagnostics issued through _gfortran_runtime_error, result allocation through _gfortrani_xmallocarray, and the strided zero-fill and stepping loops, ending in ret.]

[nvptx object iany_i8.o, compiled from "fortran/generated/iany_i8.c". Visible functions: _gfortran_iany_i8, _gfortran_miany_i8, _gfortran_siany_i8. External symbols: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents. The embedded byte-array constants decode to fragments of the IANY run-time diagnostics: "...incorrect in IANY intrinsic: is %ld, should be between 1 and %l...", "...ank of return array incorrect in IANY intrinsic: is %ld, should be %l...", "Incorrect extent in return value of IANY intrinsic in dimension %ld: is %ld, should be %...", plus the fragments "IANY", "return value", "Funny sized logical array", and "MASK argumen...".]

[Opening of the _gfortran_iany_i8 body: arguments %ar0..%ar2 (result descriptor, source descriptor, pointer to DIM), DIM and rank validation with _gfortran_runtime_error, and the per-dimension extent and stride setup.]
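For orientation, IANY reduces an integer array with bitwise OR along the dimension named by DIM, which is what the loops in this object compute. Below is a minimal C sketch under simplifying assumptions: a contiguous column-major rank-2 array addressed by plain extents. The generated iany_i8.c actually walks GFC array descriptors with per-dimension strides and also allocates and bounds-checks the result, none of which is shown, and the function name and parameters below are illustrative only.

#include <stdint.h>
#include <stddef.h>

/* Illustrative sketch only: bitwise-OR reduction of a rank-2 column-major
   array along one dimension, in the spirit of IANY(array, dim).  */
static void
iany2_i64 (const int64_t *a, size_t n0, size_t n1, int dim, int64_t *result)
{
  if (dim == 1)
    /* Reduce over the first extent; the result has n1 elements.  */
    for (size_t j = 0; j < n1; j++)
      {
        int64_t acc = 0;
        for (size_t i = 0; i < n0; i++)
          acc |= a[i + j * n0];
        result[j] = acc;
      }
  else
    /* Reduce over the second extent; the result has n0 elements.  */
    for (size_t i = 0; i < n0; i++)
      {
        int64_t acc = 0;
        for (size_t j = 0; j < n1; j++)
          acc |= a[i + j * n0];
        result[i] = acc;
      }
}

The or.b64 accumulation and the pointer stepping visible in the PTX correspond to the inner and outer loops of such a reduction, generalized to arbitrary rank and stride.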
[Remainder of the _gfortran_iany_i8 body: construction of the result descriptor (extents, strides, offset), allocation through _gfortrani_xmallocarray, bounds checking through _gfortrani_bounds_ifunction_return when enabled in _gfortrani_compile_options, and the reduction loops that accumulate the elements along DIM with or.b64 before stepping the source and destination pointers across the remaining dimensions.]

[_gfortran_miany_i8, the MASK-array variant: forwards to _gfortran_iany_i8 when the MASK argument is null, rejects unsupported logical kinds with the "Funny sized logical array" diagnostic, checks the mask extents with _gfortrani_bounds_equal_extents, and performs the same or.b64 reduction with each element guarded by a byte test of the corresponding mask entry.]

[_gfortran_siany_i8, the scalar-MASK variant: calls _gfortran_iany_i8 when the mask pointer is null or the mask value is true; otherwise it validates or builds the result array and zero-fills it.]
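The masked form differs from the unmasked one only in that each accumulation is conditional on the matching logical element. A hedged sketch along the same lines as above, again with illustrative names, a fixed reduction dimension, and a one-byte logical kind.

#include <stdint.h>
#include <stddef.h>

/* Illustrative sketch only: MASK-array variant of the OR reduction
   (cf. _gfortran_miany_i8), reducing a rank-2 column-major array along
   its first dimension.  */
static void
miany2_i64 (const int64_t *a, const unsigned char *mask,
            size_t n0, size_t n1, int64_t *result)
{
  for (size_t j = 0; j < n1; j++)
    {
      int64_t acc = 0;
      for (size_t i = 0; i < n0; i++)
        if (mask[i + j * n0])       /* only masked-in elements contribute */
          acc |= a[i + j * n0];
      result[j] = acc;
    }
}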
[End of the _gfortran_siany_i8 body (the zero-fill loop over the result extents and the final ret), followed by the nvptx object iany_i16.o, compiled from "fortran/generated/iany_i16.c". Visible functions: _gfortran_iany_i16, _gfortran_miany_i16, _gfortran_siany_i16, with the same external symbols and the same IANY diagnostic strings as iany_i8.o. The three bodies mirror the _i8 variants, except that every 16-byte element is carried as a pair of 64-bit words: strides are scaled by 16 rather than 8, and each accumulation step issues two ld.u64 and two or.b64 instructions, one for the low half and one for the high half.]
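Splitting the 128-bit elements this way is exact because bitwise OR has no carries between bit positions. A small illustrative sketch of the split; the struct and function names are made up here.

#include <stdint.h>
#include <stddef.h>

/* Illustrative sketch only: 128-bit OR accumulation carried as two
   independent 64-bit halves, as the _i16 PTX bodies do.  */
struct u128_parts { uint64_t lo, hi; };

static struct u128_parts
or_reduce_u128 (const struct u128_parts *a, size_t n)
{
  struct u128_parts acc = { 0, 0 };
  for (size_t i = 0; i < n; i++)
    {
      acc.lo |= a[i].lo;
      acc.hi |= a[i].hi;
    }
  return acc;
}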
[End of the _gfortran_siany_i16 body, followed by the nvptx object iparity_i1.o, compiled from "fortran/generated/iparity_i1.c". Visible functions: _gfortran_iparity_i1, _gfortran_miparity_i1, _gfortran_siparity_i1. External symbols: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents. The embedded byte-array constants decode to fragments of the IPARITY run-time diagnostics: "...incorrect in IPARITY intrinsic: is %ld, should be between 1 and %l...", "...ank of return array incorrect in IPARITY intrinsic: is %ld, should be %l...", "Incorrect extent in return value of IPARITY intrinsic in dimension %ld: is %ld, should be %...", plus the fragments "IPARITY", "return value", "Funny sized logical array", and "MASK argumen...".]

[Opening of the _gfortran_iparity_i1 body: DIM and rank validation, extent and stride setup, result allocation through _gfortrani_xmallocarray, and the start of the byte-wide reduction loop.]
.loc 1 188 13 add.u64 %r117,%r114,%r84; .loc 1 189 23 add.u64 %r329,%frame,%r102; ld.u64 %r86,[%r329]; .loc 1 189 13 add.u64 %r118,%r115,%r86; .loc 1 169 32 add.u64 %r331,%r344,%r102; ld.u64 %r88,[%r331]; .loc 1 169 13 setp.eq.u64 %r332,%r83,%r88; @ %r332 bra $L34; .loc 1 173 13 mov.u64 %r47,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r333,[%r171]; setp.eq.u64 %r334,%r333,0; @ ! %r334 bra $L11; bra $L35; $L1: .loc 1 19_gfortran_miparity_i1 .visible .func _gfortran_miparity_608u32 %r215; .reg .u32 %r216; .reg .u16u16 %r255; .reg .u32 %r256; .reg .u16 %r257; .reg .pred %r258; .reg .predpred %r274; .reg .u64 %r275; .reg .u64 %r277; .reg .u64 %r279; .reg .u64 %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64pred %r297; .reg .u64 %r298; .reg .predu64 %r305; .reg .u32 %r306; .reg .u64u64 %r313; .reg .u64pred %r323; .reg .pred %r324; .reg .u64 %r325; .reg .u32 %r326; .reg .u64u64 %r331; .reg .u32 %r332; .reg .u64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u64u16 %r347; .reg .u32 %r348; .reg .pred %r349; .reg .u64 %r350; .reg .u64 %r351; .reg .predu16 %r388; .reg .u32 %r389; .reg .predu64 %r436; .reg .u16 %r437; mov.u64 %r203,%ar0; mov.u64 %r204,%ar1; mov.u64 %r205,%ar2; mov.u64 %r206,%ar3; .loc 1 223 6 setp.ne.u64 %r207,%r206,0; @ %r207 bra $L490r204205; call _gfortran_iparity_.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r205]; .loc 1 233 7 add.u64 %r132,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r204+28]; .loc 1 234 38 add.u32 %r211,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r133,%r211; .loc 1 237 7 shr.u64 %r213,%r132,63; set.u32.gt.s64 %r215,%r132,%r133; neg.s32 %r216,%r215; cvt.u16.u64 %r219,%r213; cvt.u16.u32 %r220,%r216; or.b16 %r218,%r219,%r220; .loc 1 237 6 cvt.u32.u16 %r221,%r218; cvt.u16.u8 %r222,%r221; setp.eq.u16 %r223,%r222,0; @ %r223 bra $L51; .loc 1 239 7 cvt.u32.u32 %r226,%r23; cvt.s64.s8 %r225,%r226; st.u64 [%stack+8],%r225; st.u64 [%stack],%r22; cvta.const.u64 %r2242stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r425,%r132,%r132; add.u64 %r424,%r425,%r132; shl.b64 %r231,%r424,3; add.u64 %r232,%r204,%r231; ld.u64 %r235,[%r232+56]; add.u64 %r234,%r235,1; .loc 1 244 7 ld.u64 %r242,[%r232+48]; sub.u64 %r134,%r234,%r242; .loc 1 245 6 setp.le.s64 %r243,%r134,0; @ %r243 bra $L48; .loc 1 248 9 ld.u64 %r146,[%r206]; .loc 1 250 15 ld.u64 %r34,[%r206+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r244,%r35,-4; and.b32 %r245,%r244,-5; set.u32.eq.u32 %r247,%r245,0; neg.s32 %r248,%r247; .loc 1 252 22 add.u32 %r249,%r35,-1; set.u32.le.u32 %r251,%r249,1; neg.s32 %r252,%r251; .loc 1 252 6 cvt.u16.u32 %r254,%r248; cvt.u16.u32 %r255,%r252; or.b16 %r253,%r254,%r255; cvt.u32.u16 %r256,%r253; cvt.u16.u8 %r257,%r256; setp.ne.u16 %r258,%r257,0; @ %r258 bra $L53; .loc 1 254 7 setp.ne.u32 %r260,%r35,16; @ %r260 bra $L54; $L53: .loc 1 261 9 shl.b64 %r264,%r424,3; add.u64 %r265,%r204,%r264; ld.u64 %r137,[%r265+40]; .loc 1 262 12 add.u64 %r271,%r206,%r264; ld.u64 %r42,[%r271+40]; .loc 1 264 3 setp.ne.u64 %r273,%r132,0; @ %r273 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r274,%r132,%r133; @ %r274 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r275275_gfortran_runtime_error5: add.u64 %r187,%r204,40; add.u64 %r189,%r206,40; add.u64 %r277,%r204,16; add.u64 %r279,%r22,%r22; add.u64 %r280,%r279,%r22; shl.b64 %r281,%r280,3; add.u64 %r200,%r277,%r281; .loc 1 264 3 mov.u64 %r191,0; add.u64 %r422,%frame,240; add.u64 %r421,%frame,360; .loc 1 271 12 mov.u64 %r436,%r191; $L60: .loc 1 266 18 add.u64 %r283,%r422,%r191; ld.u64 %r284,[%r187]; st.u64 [%r283],%r284; .loc 1 
267 18 add.u64 %r285,%frame,%r191; .loc 1 267 20 ld.u64 %r287,[%r189]; mul.lo.u64 %r286,%r287,%r34; .loc 1 267 18 st.u64 [%r285],%r286; .loc 1 268 19 ld.u64 %r289,[%r187+16]; add.u64 %r288,%r289,1; ld.u64 %r290,[%r187+8]; sub.u64 %r53,%r288,%r290; .loc 1 270 10 setp.lt.s64 %r291,%r53,0; @ %r291 bra $L58; .loc 1 268 17 add.u64 %r293,%r421,%r191; st.u64 [%r293],%r53; bra $L59; $L58: .loc 1 271 12 add.u64 %r295,%r421,%r191; st.u64 [%r295],%r436; $L59: .loc 1 264 3 add.u64 %r187,%r187,24; add.u64 %r189,%r189,24; add.u64 %r191,%r191,8; setp.ne.u64 %r297,%r187,%r200; @ %r297 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r298,[%r203]; setp.eq.u64 %r299,%r298,0; @ ! %r299 bra $L63; bra $L62; $L56: add.u64 %r301,%r22,%r22; add.u64 %r302,%r301,%r22; shl.b64 %r303,%r302,3; add.u64 %r152,%r303,40; add.u64 %r58,%r204,%r152; add.u64 %r154,%r206,%r152; shl.b64 %r304,%r22,3; add.u64 %r100,%r304,-8; cvt.u32.u32 %r306,%r23; cvt.s64.s8 %r305,%r306; shl.b64 %r307,%r305,3; add.u64 %r184,%r307,-8; add.u64 %r422,%frame,240; add.u64 %r421,%frame,360; .loc 1 281 12 mov.u64 %r435,0; $L66: .loc 1 276 18 add.u64 %r309,%r422,%r100; ld.u64 %r310,[%r58]; st.u64 [%r309],%r310; .loc 1 277 18 add.u64 %r311,%frame,%r100; .loc 1 277 20 ld.u64 %r313,[%r154]; mul.lo.u64 %r312,%r313,%r34; .loc 1 277 18 st.u64 [%r311],%r312; .loc 1 278 19 ld.u64 %r315,[%r58+16]; add.u64 %r314,%r315,1; ld.u64 %r316,[%r58+8]; sub.u64 %r64,%r314,%r316; .loc 1 280 10 setp.lt.s64 %r317,%r64,0; @ %r317 bra $L64; .loc 1 278 17 add.u64 %r319,%r421,%r100; st.u64 [%r319],%r64; bra $L65; $L64: .loc 1 281 12 add.u64 %r321,%r421,%r100; st.u64 [%r321],%r435; $L65: .loc 1 274 3 add.u64 %r58,%r58,24; add.u64 %r154,%r154,24; add.u64 %r100,%r100,8; setp.ne.u64 %r323,%r100,%r184; @ %r323 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r324,%r133,0; @ %r324 bra $L68; add.u64 %r421,%frame,360; $L86: add.u64 %r124,%r203,40; mov.u64 %r118,%r421; cvt.u32.u32 %r326,%r23; cvt.s64.s8 %r325,%r326; add.u64 %r328,%r325,%r325; add.u64 %r329,%r328,%r325; shl.b64 %r330,%r329,3; add.u64 %r331,%r203,16; add.u64 %r72,%r330,%r331; .loc 1 291 10 mov.u64 %r120,1; .loc 1 295 4 mov.u64 %r350,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r332,%r23,-2; cvt.s64.s32 %r70,%r332; add.u64 %r334,%r70,%r70; add.u64 %r335,%r334,%r70; shl.b64 %r336,%r335,3; add.u64 %r337,%r203,%r336; .loc 1 299 67 shl.b64 %r339,%r70,3; add.u64 %r340,%frame,%r339; .loc 1 299 59 ld.u64 %r342,[%r337+40]; ld.u64 %r343,[%r340+360]; mul.lo.u64 %r140,%r342,%r343; .loc 1 301 24 mov.u64 %r344,0; st.u64 [%r203+8],%r344; .loc 1 302 28 cvt.u16.u32 %r347,%r23; add.u16 %r346,%r347,-1; cvt.u32.u16 %r348,%r346; st.u8 [%r203+28],%r348; .loc 1 304 10 setp.eq.u64 %r349,%r140,0; @ %r349 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r120,%r67,%r120; $L69: .loc 1 295 4 st.u64 [%r124+8],%r350; ld.u64 %r67,[%r118]; add.u64 %r351,%r67,-1; st.u64 [%r124+16],%r351; st.u64 [%r124],%r120; .loc 1 288 7 add.u64 %r124,%r124,24; add.u64 %r118,%r118,8; setp.ne.u64 %r352,%r72,%r124; @ %r352 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r203+48],%r140; mov.u64 %r354,-1; st.u64 [%r203+56],%r354; mov.u64 %r355,1; st.u64 [%r203+40],%r355; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r35714357; call (%value_in),_gfortrani_xmallocarray58,[%value_in]; } .loc 1 311 22 st.u64 [%r203],%r358; bra $L73; $L63: .loc 1 316 19 ld.s8 %r360,[%r203+28]; .loc 1 316 10 setp.eq.u64 %r361,%r360,%r133; @ %r361 bra $L74; .loc 1 317 2 cvta.const.u64 %r362362_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r364,_gfortrani_compile_options; .loc 1 
319 10 ld.u32 %r365,[%r364+36]; setp.eq.u32 %r366,%r365,0; @ %r366 bra $L73; .loc 1 321 4 add.u64 %r421,%frame,360; cvta.const.u64 %r370,$LC2369370; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r374,$LC6; {374370; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r376,%r133,0; @ %r376 bra $L75; $L78: .loc 1 336 8 ld.u64 %r147,[%r203]; .loc 1 337 8 ld.u64 %r145,[%r204]; .loc 1 339 9 setp.ne.u64 %r377,%r145,0; @ %r377 bra $L76; bra $L48; $L75: add.u64 %r170,%frame,480; add.u64 %r167,%r203,40; add.u64 %r160,%frame,120; cvt.u32.u32 %r379,%r23; cvt.s64.s8 %r378,%r379; add.u64 %r125,%r378,-1; .loc 1 328 3 mov.u64 %r151,0; add.u64 %r421,%frame,360; .loc 1 330 16 mov.u64 %r380,%r151; $L77: st.u64 [%r170],%r380; .loc 1 331 18 ld.u64 %r381,[%r167]; st.u64 [%r160],%r381; .loc 1 332 17 shl.b64 %r383,%r151,3; add.u64 %r384,%r421,%r383; .loc 1 332 10 ld.u64 %r385,[%r384]; setp.le.s64 %r386,%r385,0; @ %r386 bra $L48; .loc 1 328 26 add.u64 %r151,%r151,1; .loc 1 328 3 add.u64 %r170,%r170,8; add.u64 %r167,%r167,24; add.u64 %r160,%r160,8; setp.ne.u64 %r387,%r125,%r151; @ %r387 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r44,%r42,%r34; ld.u64 %r173,[%frame+480]; .loc 1 359 22 ld.u64 %r86,[%frame+240]; .loc 1 360 23 ld.u64 %r88,[%frame]; .loc 1 361 22 ld.u64 %r90,[%frame+120]; setp.le.s64 %r426,%r133,1; .loc 1 348 10 mov.u32 %r428,0; cvt.u32.u32 %r429,%r23; cvt.s64.s8 %r430,%r429; add.u64 %r431,%r430,-1; add.u64 %r432,%frame,240; add.u64 %r433,%frame,360; add.u64 %r434,%frame,120; $L87: .loc 1 328 3 mov.u64 %r150,%r146; mov.u64 %r149,%r145; .loc 1 348 10 mov.u32 %r122,%r428; .loc 1 349 9 mov.u64 %r148,0; $L80: .loc 1 352 6 ld.s8 %r389,[%r150]; cvt.u16.u32 %r388,%r389; setp.eq.u16 %r390,%r388,0; @ %r390 bra $L79; .loc 1 353 12 ld.u8 %r437,[%r149]; mov$L79: .loc 1 349 24 add.u64 %r148,%r148,1; .loc 1 349 32 add.u64 %r149,%r149,%r137; .loc 1 349 47 add.u64 %r150,%r150,%r44; .loc 1 349 2 setp.ne.u64 %r396,%r134,%r148; @ %r396 bra $L80; .loc 1 355 8 cvt.u32.u32 %r397,%r122; st.u8 [%r147],%r397; .loc 1 358 15 add.u64 %r173,%r173,1; .loc 1 359 12 add.u64 %r145,%r145,%r86; .loc 1 360 13 add.u64 %r146,%r146,%r88; .loc 1 361 12 add.u64 %r147,%r147,%r90; .loc 1 363 32 ld.u64 %r80,[%frame+360]; .loc 1 363 13 setp.ne.u64 %r398,%r80,%r173; @ %r398 bra $L87; .loc 1 370 23 mul.lo.u64 %r94,%r173,%r86; .loc 1 371 24 mul.lo.u64 %r399,%r173,%r88; .loc 1 371 10 sub.u64 %r142,%r146,%r399; .loc 1 372 23 mul.lo.u64 %r400,%r173,%r90; .loc 1 372 9 sub.u64 %r143,%r147,%r400; .loc 1 374 7 @ ! 
%r426 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r40],%r427; .loc 1 370 23 mul.lo.u64 %r94,%r106,%r105; .loc 1 371 24 mul.lo.u64 %r403,%r108,%r105; .loc 1 371 10 sub.u64 %r142,%r146,%r403; .loc 1 372 23 mul.lo.u64 %r404,%r110,%r105; .loc 1 372 9 sub.u64 %r143,%r147,%r404; .loc 1 373 5 add.u64 %r144,%r144,1; .loc 1 374 7 add.u64 %r40,%r40,8; add.u64 %r164,%r164,8; setp.ne.u64 %r405,%r144,%r431; @ %r405 bra $L84; bra $L48; $L108: add.u64 %r40,%frame,488; mov.u64 %r164,8; .loc 1 373 5 mov.u64 %r144,1; .loc 1 367 13 mov.u64 %r427,0; $L84: .loc 1 382 16 ld.u64 %r409,[%r40]; add.u64 %r105,%r409,1; st.u64 [%r40],%r105; .loc 1 383 23 add.u64 %r411,%r432,%r164; ld.u64 %r106,[%r411]; .loc 1 383 13 sub.u64 %r412,%r106,%r94; add.u64 %r145,%r145,%r412; .loc 1 384 24 add.u64 %r413,%frame,%r164; ld.u64 %r108,[%r413]; .loc 1 384 14 add.u64 %r146,%r142,%r108; .loc 1 385 23 add.u64 %r415,%r434,%r164; ld.u64 %r110,[%r415]; .loc 1 385 13 add.u64 %r147,%r143,%r110; .loc 1 363 32 add.u64 %r417,%r433,%r164; ld.u64 %r113,[%r417]; .loc 1 363 13 setp.eq.u64 %r418,%r105,%r113; @ %r418 bra $L85; .loc 1 367 13 mov.u64 %r173,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r419,[%r203]; setp.eq.u64 %r420,%r419,0; @ ! %r420 bra $L63; bra $L86; $L48:_gfortran_siparity_i1 .visible .func _gfortran_siparity_68predpred %r184; .reg .predpred %r206; .reg .predu32 %r211; .reg .u64 %r213; .reg .u64u32 %r222; .reg .u64pred %r264; .reg .u64 %r266; .reg .u32 %r270; .reg .u64 %r272; .reg .u64 %r274; .reg .predpredu64 %r297; mov.u64 %r148,%ar0; mov.u64 %r149,%ar1; mov.u64 %r150,%ar2; mov.u64 %r151,%ar3; .loc 1 412 6 setp.eq.u64 %r152,%r151,0; @ %r152 bra $L110; .loc 1 412 20 ld.u32 %r153,[%r151]; setp.eq.u32 %r154,%r153,0; @ %r154 bra $L111; $L110: .loc 1 417 7gfortran_iparity_.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r150]; .loc 1 422 7 add.u64 %r86,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r149+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r291,%r26; .loc 1 425 7 shr.u64 %r159,%r86,63; set.u32.gt.s64 %r161,%r86,%r291; neg.s32 %r162,%r161; cvt.u16.u64 %r165,%r159; cvt.u16.u32 %r166,%r162; or.b16 %r164,%r165,%r166; .loc 1 425 6 cvt.u32.u16 %r167,%r164; cvt.u16.u8 %r168,%r167; setp.ne.u16 %r169,%r168,0; @ %r169 bra $L113; .loc 1 432 3 setp.ne.u64 %r170,%r86,0; @ %r170 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r171,%r86,%r291; @ %r171 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r174,%r24; cvt.s64.s8 %r173,%r174; st.u64 [%stack+8],%r173172172_gfortran_runtime_error114: add.u64 %r135,%r149,48; add.u64 %r137,%frame,120; add.u64 %r138,%r149,56; add.u64 %r176,%r149,24; add.u64 %r178,%r23,%r23; add.u64 %r179,%r178,%r23; shl.b64 %r180,%r179,3; add.u64 %r144,%r176,%r180; .loc 1 437 12 mov.u64 %r297,0; $L119: .loc 1 434 19 ld.u64 %r182,[%r138]; add.u64 %r181,%r182,1; ld.u64 %r183,[%r135]; sub.u64 %r36,%r181,%r183; .loc 1 436 10 setp.le.s64 %r184,%r36,0; @ %r184 bra $L117; .loc 1 434 17 st.u64 [%r137],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r137],%r297; $L118: .loc 1 432 3 add.u64 %r135,%r135,24; add.u64 %r137,%r137,8; add.u64 %r138,%r138,24; setp.ne.u64 %r186,%r135,%r144; @ %r186 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r96,[%r148]; .loc 1 449 6 setp.eq.u64 %r187,%r96,0; @ ! 
%r187 bra $L122; bra $L121; $L115: add.u64 %r189,%r23,%r23; add.u64 %r190,%r189,%r23; shl.b64 %r191,%r190,3; add.u64 %r192,%r191,48; add.u64 %r57,%r149,%r192; add.u64 %r290,%frame,120; shl.b64 %r194,%r23,3; add.u64 %r195,%r194,-8; add.u64 %r67,%r290,%r195; add.u64 %r196,%r149,48; cvt.u32.u32 %r198,%r24; cvt.s64.s8 %r197,%r198; add.u64 %r200,%r197,%r197; add.u64 %r201,%r200,%r197; shl.b64 %r202,%r201,3; add.u64 %r131,%r196,%r202; .loc 1 446 12 mov.u64 %r296,0; $L125: .loc 1 443 2 ld.u64 %r204,[%r57+8]; add.u64 %r203,%r204,1; ld.u64 %r205,[%r57]; sub.u64 %r41,%r203,%r205; .loc 1 445 10 setp.le.s64 %r206,%r41,0; @ %r206 bra $L123; .loc 1 442 17 st.u64 [%r67],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r67],%r296; $L124: .loc 1 440 3 add.u64 %r57,%r57,24; add.u64 %r67,%r67,8; setp.ne.u64 %r208,%r57,%r131; @ %r208 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r209,%r291,0; @ %r209 bra $L127; add.u64 %r290,%frame,120; $L147: add.u64 %r102,%r148,40; mov.u64 %r84,%r290; cvt.u32.u32 %r211,%r24; cvt.s64.s8 %r210,%r211; add.u64 %r213,%r210,%r210; add.u64 %r214,%r213,%r210; shl.b64 %r215,%r214,3; add.u64 %r216,%r148,16; add.u64 %r60,%r215,%r216; .loc 1 456 10 mov.u64 %r82,1; .loc 1 460 4 mov.u64 %r235,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r217,0; st.u64 [%r148+8],%r217; .loc 1 465 28 cvt.u16.u32 %r220,%r24; add.u16 %r219,%r220,-1; cvt.u32.u16 %r221,%r219; st.u8 [%r148+28],%r221; .loc 1 467 20 add.u32 %r222,%r24,-2; cvt.s64.s32 %r50,%r222; add.u64 %r224,%r50,%r50; add.u64 %r225,%r224,%r50; shl.b64 %r226,%r225,3; add.u64 %r227,%r148,%r226; .loc 1 467 67 shl.b64 %r229,%r50,3; add.u64 %r230,%frame,%r229; .loc 1 467 59 ld.u64 %r232,[%r227+40]; ld.u64 %r233,[%r230+120]; mul.lo.u64 %r93,%r232,%r233; .loc 1 469 10 setp.eq.u64 %r234,%r93,0; @ %r234 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r82,%r46,%r82; $L128: .loc 1 460 4 st.u64 [%r102+8],%r235; ld.u64 %r46,[%r84]; add.u64 %r236,%r46,-1; st.u64 [%r102+16],%r236; st.u64 [%r102],%r82; .loc 1 453 7 add.u64 %r102,%r102,24;131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r148+48],%r93; mov.u64 %r239,-1; st.u64 [%r148+56],%r239; mov.u64 %r240,1; st.u64 [%r148+40],%r240; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r242(%value_in),_gfortrani_xmallocarray3,[%value_in]; } mov.u64 %r96,%r243; .loc 1 476 22 st.u64 [%r148],%r96; $L135: .loc 1 502 3 setp.ne.u64 %r245,%r291,0; @ %r245 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r148+28]; .loc 1 480 10 setp.eq.u64 %r246,%r55,%r291; @ %r246 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r291; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r249,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r250,[%r249+36]; setp.eq.u32 %r251,%r250,0; @ %r251 bra $L135; .loc 1 488 4 setp.eq.u64 %r252,%r291,0; @ %r252 bra $L133; add.u64 %r31,%r148,48; add.u64 %r125,%frame,120; cvt.u32.u32 %r254,%r24; cvt.s64.s8 %r253,%r254; add.u64 %r92,%r253,-1; .loc 1 488 10 mov.u64 %r101,0; $L137: .loc 1 492 21 ld.u64 %r256,[%r31+8]; add.u64 %r255,%r256,1; .loc 1 492 19 ld.u64 %r257,[%r31]; sub.u64 %r91,%r255,%r257; .loc 1 493 18 ld.u64 %r64,[%r125]; .loc 1 488 25 add.u64 %r101,%r101,1; .loc 1 493 11 setp.eq.u64 %r258,%r64,%r91; @ %r258 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r64; st.u64 [%stack+8],%r91; st.u64 [%stack],%r101; cvta.const.u64 %r22_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r31,%r31,24; add.u64 %r125,%r125,8; setp.ne.u64 %r261,%r92,%r101; @ %r261 bra $L137; $L132: .loc 1 504 16 cvt.u16.u32 %r263,%r24; setp.le.s16 %r264,%r263,1; @ %r264 bra $L138; shl.b64 
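// Reading of the PTX above (hedged, based on the visible call and loop patterns): each member
// exposes three entry points, an unmasked reduction (_gfortran_iparity_iN), a variant taking a
// logical MASK array (_gfortran_miparity_iN), and a scalar-mask variant (_gfortran_siparity_iN)
// that delegates to the unmasked one when the mask pointer is null or true (the early
// "call _gfortran_iparity_" near $L110). The inner loop at $L30 is a plain IEOR (bitwise XOR)
// reduction along one dimension. Below is a minimal C sketch of that loop; the name and
// signature are illustrative, not the libgfortran descriptor-based ABI.
#include <stdint.h>
#include <stddef.h>

/* Hedged sketch: XOR-reduce 'len' elements spaced 'delta' elements apart,
   mirroring the ld / xor / add loop at label $L30 in the PTX above.  */
static int8_t
iparity_core_i1 (const int8_t *src, size_t len, ptrdiff_t delta)
{
  int8_t result = 0;                  /* identity element of XOR                    */
  for (size_t n = 0; n < len; n++)
    {
      result ^= *src;                 /* xor, widened to 16/32-bit temps in the PTX */
      src += delta;                   /* add.u64 %r112,%r112,%r107                  */
    }
  return result;
}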
// [truncated tail of _gfortran_siparity_i1 omitted]

// archive member: iparity_i2.o  (compiled from fortran/generated/iparity_i2.c)
// BEGIN GLOBAL FUNCTION DECL: _gfortran_iparity_i2   (.visible .func)
// BEGIN GLOBAL FUNCTION DECL: _gfortran_miparity_i2  (.visible .func)
// BEGIN GLOBAL FUNCTION DECL: _gfortran_siparity_i2  (.visible .func)
// extern declarations and decoded string constants are identical to the iparity_i1.o member above
// [garbled PTX instruction streams of the three i2 functions omitted; the i2 variants use
//  16-bit element accesses (ld.u16/st.u16, xor.b16) and double the byte strides where the
//  i1 variants used 8-bit accesses]
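// The masked entry point in the PTX above guards each XOR with a load of the logical MASK
// element (the ld.s8 / setp.eq.u16 / xor.b16 sequence between $L80 and $L79). A hedged C
// sketch of that inner loop for the 16-bit variant; types, name and element-unit strides are
// illustrative, the real code walks raw byte strides taken from array descriptors.
#include <stdint.h>
#include <stddef.h>

/* Hedged sketch: XOR-reduce only the elements whose mask entry is non-zero,
   matching the masked loop in _gfortran_miparity_i2 above.  */
static int16_t
miparity_core_i2 (const int16_t *src, const int8_t *mask,
                  size_t len, ptrdiff_t sstride, ptrdiff_t mstride)
{
  int16_t result = 0;
  for (size_t n = 0; n < len; n++)
    {
      if (*mask)                      /* ld.s8 [%r156] / setp.eq.u16: test MASK element */
        result ^= *src;               /* xor.b16 %r403 in the PTX                       */
      src  += sstride;                /* add.u64 %r155,%r155,%r91                       */
      mask += mstride;                /* add.u64 %r156,%r156,%r45                       */
    }
  return result;
}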
// [truncated tail of _gfortran_siparity_i2 omitted]

// archive member: iparity_i4.o  (compiled from fortran/generated/iparity_i4.c)
// BEGIN GLOBAL FUNCTION DECL: _gfortran_iparity_i4   (.visible .func)
// BEGIN GLOBAL FUNCTION DECL: _gfortran_miparity_i4  (.visible .func)
// BEGIN GLOBAL FUNCTION DECL: _gfortran_siparity_i4  (.visible .func)
// extern declarations and decoded string constants are identical to the i1/i2 members above
// [garbled PTX for _gfortran_iparity_i4 begins here; the 32-bit variant uses ld.u32/st.u32,
//  xor.b32, and strides scaled with shl.b64 ...,2; the member's instruction stream continues
//  beyond this point]
5 add.u64 %r115,%r115,1; .loc 1 179 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r332,%r115,%r351; @ %r332 bra $L1; $L32: .loc 1 187 16 ld.u64 %r333,[%r122]; add.u64 %r84,%r333,1; st.u64 [%r122],%r84; .loc 1 188 23 add.u64 %r335,%r352,%r140; ld.u64 %r85,[%r335]; .loc 1 188 13 shl.b64 %r336,%r85,2; add.u64 %r116,%r113,%r336; .loc 1 189 23 add.u64 %r337,%frame,%r140; ld.u64 %r88,[%r337]; .loc 1 189 13 shl.b64 %r338,%r88,2; add.u64 %r117,%r114,%r338; .loc 1 169 32 add.u64 %r340,%r353,%r140; ld.u64 %r92,[%r340]; .loc 1 169 13 setp.eq.u64 %r341,%r84,%r92; @ %r341 bra $L33; .loc 1 173 13 mov.u64 %r138,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r342,[%r179]; setp.eq.u64 %r343,%r342,0; @ ! %r343 bra $L11; bra $L34; $L1: .loc 1 19_gfortran_miparity_i4 .visible .func _gfortran_miparity_608u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .predpred %r399; .reg .u32 %r400; .reg .pred %r401; .reg .predmov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 223 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $L49gfortran_iparity_.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r211]; .loc 1 233 7 add.u64 %r137,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r210+28]; .loc 1 234 38 add.u32 %r217,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r138,%r217; .loc 1 237 7 shr.u64 %r219,%r137,63; set.u32.gt.s64 %r221,%r137,%r138; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 237 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L51; .loc 1 239 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r429,%r137,%r137; add.u64 %r430,%r429,%r137; shl.b64 %r237,%r430,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 244 7 ld.u64 %r248,[%r238+48]; sub.u64 %r139,%r240,%r248; .loc 1 245 6 setp.le.s64 %r249,%r139,0; @ %r249 bra $L48; .loc 1 248 9 ld.u64 %r152,[%r212]; .loc 1 250 15 ld.u64 %r34,[%r212+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 252 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 252 6 cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L53; .loc 1 254 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L54; $L53: .loc 1 261 9 shl.b64 %r270,%r430,3; add.u64 %r271,%r210,%r270; ld.u64 %r142,[%r271+40]; .loc 1 262 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 264 3 setp.ne.u64 %r279,%r137,0; @ %r279 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r280,%r137,%r138; @ %r280 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r281281_gfortran_runtime_error5: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; add.u64 %r285,%r22,%r22; add.u64 %r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 264 3 mov.u64 %r92,0; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 271 12 mov.u64 %r441,%r92; $L60: .loc 1 266 18 add.u64 
%r289,%r427,%r92; ld.u64 %r290,[%r40]; st.u64 [%r289],%r290; .loc 1 267 18 add.u64 %r291,%frame,%r92; .loc 1 267 20 ld.u64 %r293,[%r179]; mul.lo.u64 %r292,%r293,%r34; .loc 1 267 18 st.u64 [%r291],%r292; .loc 1 268 19 ld.u64 %r295,[%r40+16]; add.u64 %r294,%r295,1; ld.u64 %r296,[%r40+8]; sub.u64 %r54,%r294,%r296; .loc 1 270 10 setp.lt.s64 %r297,%r54,0; @ %r297 bra $L58; .loc 1 268 17 add.u64 %r299,%r426,%r92; st.u64 [%r299],%r54; bra $L59; $L58: .loc 1 271 12 add.u64 %r301,%r426,%r92; st.u64 [%r301],%r441; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r179,%r179,24; add.u64 %r92,%r92,8; setp.ne.u64 %r303,%r40,%r206; @ %r303 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r304,[%r209]; setp.eq.u64 %r305,%r304,0; @ ! %r305 bra $L63; bra $L62; $L56: add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r127,%r309,40; add.u64 %r133,%r210,%r127; add.u64 %r125,%r212,%r127; shl.b64 %r310,%r22,3; add.u64 %r103,%r310,-8; cvt.u32.u32 %r312,%r23; cvt.s64.s8 %r311,%r312; shl.b64 %r313,%r311,3; add.u64 %r69,%r313,-8; add.u64 %r427,%frame,240; add.u64 %r426,%frame,360; .loc 1 281 12 mov.u64 %r440,0; $L66: .loc 1 276 18 add.u64 %r315,%r427,%r103; ld.u64 %r316,[%r133]; st.u64 [%r315],%r316; .loc 1 277 18 add.u64 %r317,%frame,%r103; .loc 1 277 20 ld.u64 %r319,[%r125]; mul.lo.u64 %r318,%r319,%r34; .loc 1 277 18 st.u64 [%r317],%r318; .loc 1 278 19 ld.u64 %r321,[%r133+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r133+8]; sub.u64 %r66,%r320,%r322; .loc 1 280 10 setp.lt.s64 %r323,%r66,0; @ %r323 bra $L64; .loc 1 278 17 add.u64 %r325,%r426,%r103; st.u64 [%r325],%r66; bra $L65; $L64: .loc 1 281 12 add.u64 %r327,%r426,%r103; st.u64 [%r327],%r440; $L65: .loc 1 274 3 add.u64 %r133,%r133,24; add.u64 %r125,%r125,24; add.u64 %r103,%r103,8; setp.ne.u64 %r329,%r69,%r103; @ %r329 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r330,%r138,0; @ %r330 bra $L68; add.u64 %r426,%frame,360; $L86: add.u64 %r172,%r209,40; mov.u64 %r164,%r426; cvt.u32.u32 %r332,%r23; cvt.s64.s8 %r331,%r332; add.u64 %r334,%r331,%r331; add.u64 %r335,%r334,%r331; shl.b64 %r336,%r335,3; add.u64 %r337,%r209,16; add.u64 %r134,%r336,%r337; .loc 1 291 10 mov.u64 %r129,1; .loc 1 295 4 mov.u64 %r356,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r338,%r23,-2; cvt.s64.s32 %r75,%r338; add.u64 %r340,%r75,%r75; add.u64 %r341,%r340,%r75; shl.b64 %r342,%r341,3; add.u64 %r343,%r209,%r342; .loc 1 299 67 shl.b64 %r345,%r75,3; add.u64 %r346,%frame,%r345; .loc 1 299 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r146,%r348,%r349; .loc 1 301 24 mov.u64 %r350,0; st.u64 [%r209+8],%r350; .loc 1 302 28 cvt.u16.u32 %r353,%r23; add.u16 %r352,%r353,-1; cvt.u32.u16 %r354,%r352; st.u8 [%r209+28],%r354; .loc 1 304 10 setp.eq.u64 %r355,%r146,0; @ %r355 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r129,%r72,%r129; $L69: .loc 1 295 4 st.u64 [%r172+8],%r356; ld.u64 %r72,[%r164]; add.u64 %r357,%r72,-1; st.u64 [%r172+16],%r357; st.u64 [%r172],%r129; .loc 1 288 7 add.u64 %r172,%r172,24; add.u64 %r164,%r164,8; setp.ne.u64 %r358,%r134,%r172; @ %r358 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r209+48],%r146; mov.u64 %r360,-1; st.u64 [%r209+56],%r360; mov.u64 %r361,1; st.u64 [%r209+40],%r361; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24363; call (%value_in),_gfortrani_xmallocarray64,[%value_in]; } .loc 1 311 22 st.u64 [%r209],%r364; bra $L73; $L63: .loc 1 316 19 ld.s8 %r366,[%r209+28]; .loc 1 316 10 setp.eq.u64 %r367,%r366,%r138; @ %r367 bra $L74; .loc 1 317 2 cvta.const.u64 %r36836_gfortran_runtime_error74: .loc 1 319 11 
cvta.global.u64 %r370,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L73; .loc 1 321 4 add.u64 %r426,%frame,360; cvta.const.u64 %r376,$LC2r42637376; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r382,%r138,0; @ %r382 bra $L75; $L78: .loc 1 336 8 ld.u64 %r153,[%r209]; .loc 1 337 8 ld.u64 %r151,[%r210]; .loc 1 339 9 setp.ne.u64 %r383,%r151,0; @ %r383 bra $L76; bra $L48; $L75: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r426,%frame,360; .loc 1 330 16 mov.u64 %r386,%r157; $L77: st.u64 [%r190],%r386; .loc 1 331 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 332 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r426,%r389; .loc 1 332 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 shl.b64 %r91,%r142,2; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r94,[%frame+240]; .loc 1 359 12 shl.b64 %r95,%r94,2; .loc 1 360 23 ld.u64 %r97,[%frame]; .loc 1 361 22 ld.u64 %r99,[%frame+120]; .loc 1 361 12 shl.b64 %r100,%r99,2; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r94; .loc 1 371 24 mul.lo.u64 %r394,%r85,%r97; .loc 1 371 10 neg.s64 %r163,%r394; .loc 1 372 23 mul.lo.u64 %r395,%r85,%r99; .loc 1 372 9 shl.b64 %r396,%r395,2; neg.s64 %r165,%r396; setp.le.s64 %r431,%r138,1; cvt.u32.u32 %r433,%r23; cvt.s64.s8 %r434,%r433; add.u64 %r435,%r434,-1; add.u64 %r436,%frame,240; add.u64 %r437,%frame,360; add.u64 %r438,%frame,120; .loc 1 367 13 mov.u64 %r439,0; $L87: .loc 1 328 3 mov.u64 %r156,%r152; mov.u64 %r155,%r151; .loc 1 348 10 mov.u32 %r130,0; .loc 1 349 9 mov.u64 %r154,0; $L80: .loc 1 352 6 ld.s8 %r398,[%r156]; cvt.u16.u32 %r397,%r398; setp.eq.u16 %r399,%r397,0; @ %r399 bra $L79; .loc 1 353 12 ld.u32 %r400,[%r155]; xor.b32 %r130,%r130,%r400; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r91; .loc 1 349 47 add.u64 %r156,%r156,%r45; .loc 1 349 2 setp.ne.u64 %r401,%r139,%r154; @ %r401 bra $L80; .loc 1 355 8 st.u32 [%r153],%r130; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r151,%r151,%r95; .loc 1 360 13 add.u64 %r152,%r152,%r97; .loc 1 361 12 add.u64 %r153,%r153,%r100; .loc 1 363 13 setp.ne.u64 %r402,%r85,%r39; @ %r402 bra $L87; .loc 1 371 10 add.u64 %r148,%r152,%r163; .loc 1 372 9 add.u64 %r149,%r153,%r165; .loc 1 374 7 @ ! 
%r431 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r439; .loc 1 370 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 371 24 mul.lo.u64 %r405,%r116,%r113; .loc 1 371 10 sub.u64 %r148,%r152,%r405; .loc 1 372 23 mul.lo.u64 %r406,%r118,%r113; .loc 1 372 9 shl.b64 %r407,%r406,2; sub.u64 %r149,%r153,%r407; .loc 1 373 5 add.u64 %r150,%r150,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r166,%r166,8; setp.ne.u64 %r408,%r150,%r435; @ %r408 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 370 23 mov.u64 %r102,%r38; .loc 1 374 7 mov.u64 %r166,8; .loc 1 373 5 mov.u64 %r150,1; $L84: .loc 1 382 16 ld.u64 %r412,[%r41]; add.u64 %r113,%r412,1; st.u64 [%r41],%r113; .loc 1 383 23 add.u64 %r414,%r436,%r166; ld.u64 %r114,[%r414]; .loc 1 383 13 sub.u64 %r415,%r114,%r102; shl.b64 %r416,%r415,2; add.u64 %r151,%r151,%r416; .loc 1 384 24 add.u64 %r417,%frame,%r166; ld.u64 %r116,[%r417]; .loc 1 384 14 add.u64 %r152,%r148,%r116; .loc 1 385 23 add.u64 %r419,%r438,%r166; ld.u64 %r118,[%r419]; .loc 1 385 13 shl.b64 %r420,%r118,2; add.u64 %r153,%r149,%r420; .loc 1 363 32 add.u64 %r422,%r437,%r166; ld.u64 %r122,[%r422]; .loc 1 363 13 setp.eq.u64 %r423,%r113,%r122; @ %r423 bra $L85; .loc 1 367 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r424,[%r209]; setp.eq.u64 %r425,%r424,0; @ ! %r425 bra $L63; bra $L86; $L48:_gfortran_siparity_i4 .visible .func _gfortran_siparity_68u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64predu64mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_iparity_.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r294,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r294; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r294; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r300,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r300; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r293,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r293,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r299,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r299; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r294,0; @ %r211 bra $L127; add.u64 %r293,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r293; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 2496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r294,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r294; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r294; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r294,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: .loc 1 504 16 
cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r294,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r294,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,2; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r292,%r294,1; .loc 1 512 13 mov.u32 %r278,0; cvt.u32.u32 %r296,%r24; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; $L146: st.u32 [%r98],%r278; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r279,%r59,%r80; @ %r279 bra $L146; .loc 1 525 7 @ ! %r292 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r295; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r282,%r97,%r298; @ %r282 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; .loc 1 520 13 mov.u64 %r295,0; $L144: .loc 1 529 16 ld.u64 %r287,[%r99]; add.u64 %r74,%r287,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r288,%r75,%r72; shl.b64 %r289,%r288,2; add.u64 %r98,%r98,%r289; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r290,%r74,%r77; @ %r290 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r291,%r98,0; @ ! 
%r291 bra $L122; bra $L147; $L109: .loc 1 534 1 ret; } iparity_i8.o/_gfortran_iparity_i8 .visible .func _gfortran_iparity_fortran/generated/iparity_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_miparity_i8 .visible .func _gfortran_miparity_i8gfortran_siparity_i8 .visible .func _gfortran_siparity_i8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,73,80,65,82,73,84,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,80,65,82,73,84,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,108] = {73,80,65,82,73,84,89114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,80,65,82,73,84,89,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,73,80,65,82,73,84,89,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_iparity_i8 .visible .func _gfortran_iparity_48096pred %r218; .reg .predpred %r231; .reg .u64 %r232; .reg .predu64 %r250; .reg .u64 %r251; .reg .u64 %r252; .reg .pred %r253; .reg .predu64u16u64 %r274; .reg .u64pred %r303; .reg .u64 %r304; .reg .u64 %r305; .reg .u32 %r306; .reg .u64u64 %r311; .reg .u64 %r312; .reg .pred %r313; .reg .predu64pred %r347; .reg .predu64 %r358; mov.u64 %r180,%ar0; mov.u64 %r181,%ar1; mov.u64 %r182,%ar2; .loc 1 55 10 ld.s8 %r22,[%r181+28]; .loc 1 55 38 add.u32 %r183,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r107,%r183; .loc 1 56 10 ld.u64 %r25,[%r182]; .loc 1 56 7 add.u64 %r108,%r25,-1; .loc 1 58 7 shr.u64 %r185,%r108,63; set.u32.lt.s64 %r187,%r107,%r108; neg.s32 %r188,%r187; cvt.u16.u64 %r191,%r185; cvt.u16.u32 %r192,%r188; or.b16 %r190,%r191,%r192; .loc 1 58 6 cvt.u32.u16 %r193,%r190; cvt.u16.u8 %r194,%r193; setp.eq.u16 %r195,%r194,0; @ %r195 bra $L2; .loc 1 60 7 cvt.u32.u32 %r198,%r22; cvt.s64.s8 %r197,%r198; st.u64 [%stack+8],%r1971966_gfortran_runtime_errorr201,%r108,%r108; add.u64 %r202,%r201,%r108; shl.b64 %r203,%r202,3; add.u64 %r204,%r181,%r203; ld.u64 %r30,[%r204+56]; ld.u64 %r32,[%r204+48]; .loc 1 68 9 ld.u64 %r111,[%r204+40]; .loc 1 70 3 setp.ne.u64 %r218,%r108,0; @ %r218 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r219,%r107,%r108; @ %r219 bra $L4; bra $L46; $L3: add.u64 %r61,%r181,40; add.u64 %r40,%frame,120; add.u64 %r96,%frame,240; add.u64 %r220,%r181,16; add.u64 %r222,%r25,%r25; add.u64 %r223,%r222,%r25; shl.b64 %r224,%r223,3; add.u64 %r175,%r220,%r224; .loc 1 76 12 mov.u64 %r358,0; $L8: .loc 1 72 18 ld.u64 %r225,[%r61]; st.u64 [%r40],%r225; .loc 1 73 19 ld.u64 %r227,[%r61+16]; add.u64 %r226,%r227,1; 
ld.u64 %r228,[%r61+8]; sub.u64 %r37,%r226,%r228; .loc 1 75 10 setp.lt.s64 %r229,%r37,0; @ %r229 bra $L6; .loc 1 73 17 st.u64 [%r96],%r37; bra $L7; $L6: .loc 1 76 12 st.u64 [%r96],%r358; $L7: .loc 1 70 3 add.u64 %r61,%r61,24; add.u64 %r40,%r40,8; add.u64 %r96,%r96,8; setp.ne.u64 %r231,%r61,%r175; @ %r231 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r232,[%r180]; setp.eq.u64 %r233,%r232,0; @ ! %r233 bra $L11; bra $L10; $L4: add.u64 %r235,%r25,%r25; add.u64 %r236,%r235,%r25; shl.b64 %r237,%r236,3; add.u64 %r238,%r237,40; add.u64 %r136,%r181,%r238; shl.b64 %r239,%r25,3; add.u64 %r105,%r239,-8; add.u64 %r345,%frame,120; add.u64 %r129,%r345,%r105; add.u64 %r346,%frame,240; add.u64 %r103,%r346,%r105; cvt.u32.u32 %r243,%r22; cvt.s64.s8 %r242,%r243; add.u64 %r245,%r242,%r242; add.u64 %r246,%r245,%r242; shl.b64 %r247,%r246,3; add.u64 %r248,%r181,40; add.u64 %r80,%r247,%r248; .loc 1 84 12 mov.u64 %r357,0; $L14: .loc 1 80 18 ld.u64 %r249,[%r136]; st.u64 [%r129],%r249; .loc 1 81 19 ld.u64 %r251,[%r136+16]; add.u64 %r250,%r251,1; ld.u64 %r252,[%r136+8]; sub.u64 %r44,%r250,%r252; .loc 1 83 10 setp.lt.s64 %r253,%r44,0; @ %r253 bra $L12; .loc 1 81 17 st.u64 [%r103],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r103],%r357; $L13: .loc 1 78 3 add.u64 %r136,%r136,24; add.u64 %r129,%r129,8; add.u64 %r103,%r103,8; setp.ne.u64 %r255,%r80,%r136; @ %r255 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r256,%r107,0; @ %r256 bra $L16; add.u64 %r346,%frame,240; $L34: add.u64 %r152,%r180,40; mov.u64 %r150,%r346; add.u64 %r257,%r180,16; cvt.u32.u32 %r259,%r22; cvt.s64.s8 %r258,%r259; add.u64 %r261,%r258,%r258; add.u64 %r262,%r261,%r258; shl.b64 %r263,%r262,3; add.u64 %r139,%r257,%r263; .loc 1 94 10 mov.u64 %r102,1; .loc 1 98 4 mov.u64 %r286,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r264,0; st.u64 [%r180+8],%r264; .loc 1 103 28 cvt.u16.u32 %r267,%r22; add.u16 %r266,%r267,-1; cvt.u32.u16 %r268,%r266; st.u8 [%r180+28],%r268; .loc 1 105 20 add.u32 %r269,%r22,-2; cvt.s64.s32 %r53,%r269; add.u64 %r271,%r53,%r53; add.u64 %r272,%r271,%r53; shl.b64 %r273,%r272,3; add.u64 %r274,%r180,%r273; .loc 1 105 67 shl.b64 %r276,%r53,3; add.u64 %r277,%frame,%r276; .loc 1 105 59 ld.u64 %r279,[%r274+40]; ld.u64 %r280,[%r277+240]; mul.lo.u64 %r113,%r279,%r280; .loc 1 107 29 mov.u64 %r2811(%value_in),_gfortrani_xmallocarray283,[%value_in]; } .loc 1 107 27 st.u64 [%r180],%r283; .loc 1 108 10 setp.eq.u64 %r285,%r113,0; @ ! 
%r285 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r102,%r49,%r102; $L17: .loc 1 98 4 st.u64 [%r152+8],%r286; ld.u64 %r49,[%r150]; add.u64 %r287,%r49,-1; st.u64 [%r152+16],%r287; st.u64 [%r152],%r102; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r288,%r139,%r152; @ %r288 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r180+48],%r113; mov.u64 %r290,-1; st.u64 [%r180+56],%r290; mov.u64 %r291,1; st.u64 [%r180+40],%r291; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r60,[%r180+28]; .loc 1 118 10 setp.eq.u64 %r292,%r60,%r107; @ %r292 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r107; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r295,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r296,[%r295+36]; setp.eq.u32 %r297,%r296,0; @ %r297 bra $L21; .loc 1 125 2 add.u64 %r346,%frame,240; cvta.const.u64 %r301,$LC2r346300301; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r303,%r107,0; @ %r303 bra $L24; $L27: .loc 1 65 9 add.u64 %r304,%r30,1; .loc 1 65 7 sub.u64 %r109,%r304,%r32; max.s64 %r45,%r109,0; .loc 1 137 8 ld.u64 %r121,[%r181]; .loc 1 138 8 ld.u64 %r122,[%r180]; .loc 1 154 36 shl.b64 %r68,%r111,3; ld.u64 %r47,[%frame+360]; .loc 1 166 22 ld.u64 %r69,[%frame+120]; .loc 1 166 12 shl.b64 %r71,%r69,3; .loc 1 167 22 ld.u64 %r72,[%frame]; .loc 1 167 12 shl.b64 %r74,%r72,3; setp.gt.s64 %r347,%r109,0; setp.le.s64 %r348,%r107,1; .loc 1 150 10 mov.u64 %r350,0; cvt.u32.u32 %r351,%r22; cvt.s64.s8 %r352,%r351; add.u64 %r353,%r352,-1; add.u64 %r354,%frame,120; add.u64 %r355,%frame,240; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r180,40; mov.u64 %r162,%frame; cvt.u32.u32 %r306,%r22; cvt.s64.s8 %r305,%r306; add.u64 %r153,%r305,-1; .loc 1 129 3 mov.u64 %r123,0; add.u64 %r346,%frame,240; .loc 1 131 16 mov.u64 %r307,%r123; $L26: st.u64 [%r165],%r307; .loc 1 132 18 ld.u64 %r308,[%r164]; st.u64 [%r162],%r308; .loc 1 133 17 shl.b64 %r310,%r123,3; add.u64 %r311,%r346,%r310; .loc 1 133 10 ld.u64 %r312,[%r311]; setp.le.s64 %r313,%r312,0; @ %r313 bra $L1; .loc 1 129 26 add.u64 %r123,%r123,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r314,%r123,%r153; @ %r314 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r347 bra $L35; .loc 1 150 10 st.u64 [%r122],%r350; bra $L29; $L35: mov.u64 %r116,%r121; .loc 1 148 10 mov.u64 %r114,0; .loc 1 154 13 mov.u64 %r115,%r114; $L28: .loc 1 158 10 ld.u64 %r317,[%r116]; xor.b64 %r114,%r114,%r317; .loc 1 154 28 add.u64 %r115,%r115,1; .loc 1 154 36 add.u64 %r116,%r116,%r68; .loc 1 154 6 setp.gt.s64 %r318,%r45,%r115; @ %r318 bra $L28; .loc 1 161 12 st.u64 [%r122],%r114; $L29: .loc 1 165 15 add.u64 %r47,%r47,1; .loc 1 166 12 add.u64 %r121,%r121,%r71; .loc 1 167 12 add.u64 %r122,%r122,%r74; .loc 1 169 32 ld.u64 %r127,[%frame+240]; .loc 1 169 13 setp.ne.u64 %r319,%r47,%r127; @ %r319 bra $L25; .loc 1 176 23 mul.lo.u64 %r320,%r69,%r47; .loc 1 176 9 shl.b64 %r321,%r320,3; sub.u64 %r118,%r121,%r321; .loc 1 177 23 mul.lo.u64 %r322,%r72,%r47; .loc 1 177 9 shl.b64 %r323,%r322,3; sub.u64 %r119,%r122,%r323; .loc 1 179 7 @ %r348 bra $L1; add.u64 %r144,%frame,368; mov.u64 %r106,8; .loc 1 178 5 mov.u64 %r120,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r144],%r350; .loc 1 176 23 mul.lo.u64 %r329,%r88,%r87; .loc 1 176 9 shl.b64 %r330,%r329,3; sub.u64 %r118,%r121,%r330; .loc 1 177 23 mul.lo.u64 %r331,%r91,%r87; .loc 1 177 9 shl.b64 %r332,%r331,3; sub.u64 %r119,%r122,%r332; .loc 1 178 5 add.u64 %r120,%r120,1; .loc 1 179 7 add.u64 %r144,%r144,8; add.u64 %r106,%r106,8; 
setp.eq.u64 %r333,%r120,%r353; @ %r333 bra $L1; $L32: .loc 1 187 16 ld.u64 %r334,[%r144]; add.u64 %r87,%r334,1; st.u64 [%r144],%r87; .loc 1 188 23 add.u64 %r336,%r354,%r106; ld.u64 %r88,[%r336]; .loc 1 188 13 shl.b64 %r337,%r88,3; add.u64 %r121,%r118,%r337; .loc 1 189 23 add.u64 %r338,%frame,%r106; ld.u64 %r91,[%r338]; .loc 1 189 13 shl.b64 %r339,%r91,3; add.u64 %r122,%r119,%r339; .loc 1 169 32 add.u64 %r341,%r355,%r106; ld.u64 %r94,[%r341]; .loc 1 169 13 setp.eq.u64 %r342,%r87,%r94; @ %r342 bra $L33; .loc 1 173 13 mov.u64 %r47,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r343,[%r180]; setp.eq.u64 %r344,%r343,0; @ ! %r344 bra $L11; bra $L34; $L1: .loc 1 19_gfortran_miparity_i8 .visible .func _gfortran_miparity_608predpred %r280; .reg .pred %r281; .reg .u64 %r282; .reg .u64 %r284; .reg .u64predu64 %r318; .reg .u64u64u64 %r364; .reg .u64 %r365; .reg .u64 %r367; .reg .predu64pred %r393; .reg .predpredmov.u64 %r210,%ar0; mov.u64 %r211,%ar1; mov.u64 %r212,%ar2; mov.u64 %r213,%ar3; .loc 1 223 6911212; call _gfortran_iparity_.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r212]; .loc 1 233 7 add.u64 %r141,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r211+28]; .loc 1 234 38 add.u32 %r218,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r142,%r218; .loc 1 237 7 shr.u64 %r220,%r141,63; set.u32.gt.s64 %r222,%r141,%r142; neg.s32 %r223,%r222; cvt.u16.u64 %r226,%r220; cvt.u16.u32 %r227,%r223; or.b16 %r225,%r226,%r227; .loc 1 237 6 cvt.u32.u16 %r228,%r225; cvt.u16.u8 %r229,%r228; setp.eq.u16 %r230,%r229,0; @ %r230 bra $L51; .loc 1 239 7 cvt.u32.u32 %r233,%r23; cvt.s64.s8 %r232,%r233; st.u64 [%stack+8],%r232; st.u64 [%stack],%r22; cvta.const.u64 %r2312stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r431,%r141,%r141; add.u64 %r430,%r431,%r141; shl.b64 %r238,%r430,3; add.u64 %r239,%r211,%r238; ld.u64 %r242,[%r239+56]; add.u64 %r241,%r242,1; .loc 1 244 7 ld.u64 %r249,[%r239+48]; sub.u64 %r143,%r241,%r249; .loc 1 245 6 setp.le.s64 %r250,%r143,0; @ %r250 bra $L48; .loc 1 248 9 ld.u64 %r156,[%r213]; .loc 1 250 15 ld.u64 %r35,[%r213+16]; .loc 1 252 22 cvt.u32.u64 %r36,%r35; .loc 1 252 53 add.u32 %r251,%r36,-4; and.b32 %r252,%r251,-5; set.u32.eq.u32 %r254,%r252,0; neg.s32 %r255,%r254; .loc 1 252 22 add.u32 %r256,%r36,-1; set.u32.le.u32 %r258,%r256,1; neg.s32 %r259,%r258; .loc 1 252 6 cvt.u16.u32 %r261,%r255; cvt.u16.u32 %r262,%r259; or.b16 %r260,%r261,%r262; cvt.u32.u16 %r263,%r260; cvt.u16.u8 %r264,%r263; setp.ne.u16 %r265,%r264,0; @ %r265 bra $L53; .loc 1 254 7 setp.ne.u32 %r267,%r36,16; @ %r267 bra $L54; $L53: .loc 1 261 9 shl.b64 %r271,%r430,3; add.u64 %r272,%r211,%r271; ld.u64 %r146,[%r272+40]; .loc 1 262 12 add.u64 %r278,%r213,%r271; ld.u64 %r43,[%r278+40]; .loc 1 264 3 setp.ne.u64 %r280,%r141,0; @ %r280 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r281,%r141,%r142; @ %r281 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r282282_gfortran_runtime_error5: add.u64 %r194,%r211,40; add.u64 %r196,%r213,40; add.u64 %r284,%r211,16; add.u64 %r286,%r22,%r22; add.u64 %r287,%r286,%r22; shl.b64 %r288,%r287,3; add.u64 %r207,%r284,%r288; .loc 1 264 3 mov.u64 %r198,0; add.u64 %r428,%frame,240; add.u64 %r427,%frame,360; .loc 1 271 12 mov.u64 %r442,%r198; $L60: .loc 1 266 18 add.u64 %r290,%r428,%r198; ld.u64 %r291,[%r194]; st.u64 [%r290],%r291; .loc 1 267 18 add.u64 %r292,%frame,%r198; .loc 1 267 20 ld.u64 %r294,[%r196]; mul.lo.u64 %r293,%r294,%r35; .loc 1 267 18 st.u64 [%r292],%r293; .loc 1 268 19 ld.u64 %r296,[%r194+16]; add.u64 %r295,%r296,1; ld.u64 %r297,[%r194+8]; sub.u64 %r55,%r295,%r297; .loc 1 270 10 setp.lt.s64 
%r298,%r55,0; @ %r298 bra $L58; .loc 1 268 17 add.u64 %r300,%r427,%r198; st.u64 [%r300],%r55; bra $L59; $L58: .loc 1 271 12 add.u64 %r302,%r427,%r198; st.u64 [%r302],%r442; $L59: .loc 1 264 3 add.u64 %r194,%r194,24; add.u64 %r196,%r196,24; add.u64 %r198,%r198,8; setp.ne.u64 %r304,%r194,%r207; @ %r304 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r305,[%r210]; setp.eq.u64 %r306,%r305,0; @ ! %r306 bra $L63; bra $L62; $L56: add.u64 %r308,%r22,%r22; add.u64 %r309,%r308,%r22; shl.b64 %r310,%r309,3; add.u64 %r61,%r310,40; add.u64 %r100,%r211,%r61; add.u64 %r139,%r213,%r61; shl.b64 %r311,%r22,3; add.u64 %r192,%r311,-8; cvt.u32.u32 %r313,%r23; cvt.s64.s8 %r312,%r313; shl.b64 %r314,%r312,3; add.u64 %r39,%r314,-8; add.u64 %r428,%frame,240; add.u64 %r427,%frame,360; .loc 1 281 12 mov.u64 %r441,0; $L66: .loc 1 276 18 add.u64 %r316,%r428,%r192; ld.u64 %r317,[%r100]; st.u64 [%r316],%r317; .loc 1 277 18 add.u64 %r318,%frame,%r192; .loc 1 277 20 ld.u64 %r320,[%r139]; mul.lo.u64 %r319,%r320,%r35; .loc 1 277 18 st.u64 [%r318],%r319; .loc 1 278 19 ld.u64 %r322,[%r100+16]; add.u64 %r321,%r322,1; ld.u64 %r323,[%r100+8]; sub.u64 %r67,%r321,%r323; .loc 1 280 10 setp.lt.s64 %r324,%r67,0; @ %r324 bra $L64; .loc 1 278 17 add.u64 %r326,%r427,%r192; st.u64 [%r326],%r67; bra $L65; $L64: .loc 1 281 12 add.u64 %r328,%r427,%r192; st.u64 [%r328],%r441; $L65: .loc 1 274 3 add.u64 %r100,%r100,24; add.u64 %r139,%r139,24; add.u64 %r192,%r192,8; setp.ne.u64 %r330,%r39,%r192; @ %r330 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r331,%r142,0; @ %r331 bra $L68; add.u64 %r427,%frame,360; $L86: add.u64 %r137,%r210,40; mov.u64 %r135,%r427; cvt.u32.u32 %r333,%r23; cvt.s64.s8 %r332,%r333; add.u64 %r335,%r332,%r332; add.u64 %r336,%r335,%r332; shl.b64 %r337,%r336,3; add.u64 %r338,%r210,16; add.u64 %r106,%r337,%r338; .loc 1 291 10 mov.u64 %r131,1; .loc 1 295 4 mov.u64 %r357,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r339,%r23,-2; cvt.s64.s32 %r74,%r339; add.u64 %r341,%r74,%r74; add.u64 %r342,%r341,%r74; shl.b64 %r343,%r342,3; add.u64 %r344,%r210,%r343; .loc 1 299 67 shl.b64 %r346,%r74,3; add.u64 %r347,%frame,%r346; .loc 1 299 59 ld.u64 %r349,[%r344+40]; ld.u64 %r350,[%r347+360]; mul.lo.u64 %r150,%r349,%r350; .loc 1 301 24 mov.u64 %r351,0; st.u64 [%r210+8],%r351; .loc 1 302 28 cvt.u16.u32 %r354,%r23; add.u16 %r353,%r354,-1; cvt.u32.u16 %r355,%r353; st.u8 [%r210+28],%r355; .loc 1 304 10 setp.eq.u64 %r356,%r150,0; @ %r356 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r131,%r71,%r131; $L69: .loc 1 295 4 st.u64 [%r137+8],%r357; ld.u64 %r71,[%r135]; add.u64 %r358,%r71,-1; st.u64 [%r137+16],%r358; st.u64 [%r137],%r131; .loc 1 288 7 add.u64 %r137,%r137,24; add.u64 %r135,%r135,8; setp.ne.u64 %r359,%r106,%r137; @ %r359 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r210+48],%r150; mov.u64 %r361,-1; st.u64 [%r210+56],%r361; mov.u64 %r362,1; st.u64 [%r210+40],%r362; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r364364; call (%value_in),_gfortrani_xmallocarray65,[%value_in]; } .loc 1 311 22 st.u64 [%r210],%r365; bra $L73; $L63: .loc 1 316 19 ld.s8 %r367,[%r210+28]; .loc 1 316 10 setp.eq.u64 %r368,%r367,%r142; @ %r368 bra $L74; .loc 1 317 2 cvta.const.u64 %r36stack; call _gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r371,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r372,[%r371+36]; setp.eq.u32 %r373,%r372,0; @ %r373 bra $L73; .loc 1 321 4 add.u64 %r427,%frame,360; cvta.const.u64 %r377,$LC2r427376377; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r381,$LC1r2113377; call 
_gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r383,%r142,0; @ %r383 bra $L75; $L78: .loc 1 336 8 ld.u64 %r157,[%r210]; .loc 1 337 8 ld.u64 %r155,[%r211]; .loc 1 339 9 setp.ne.u64 %r384,%r155,0; @ %r384 bra $L76; bra $L48; $L75: add.u64 %r183,%frame,480; add.u64 %r180,%r210,40; add.u64 %r177,%frame,120; cvt.u32.u32 %r386,%r23; cvt.s64.s8 %r385,%r386; add.u64 %r138,%r385,-1; .loc 1 328 3 mov.u64 %r161,0; add.u64 %r427,%frame,360; .loc 1 330 16 mov.u64 %r387,%r161; $L77: st.u64 [%r183],%r387; .loc 1 331 18 ld.u64 %r388,[%r180]; st.u64 [%r177],%r388; .loc 1 332 17 shl.b64 %r390,%r161,3; add.u64 %r391,%r427,%r390; .loc 1 332 10 ld.u64 %r392,[%r391]; setp.le.s64 %r393,%r392,0; @ %r393 bra $L48; .loc 1 328 26 add.u64 %r161,%r161,1; .loc 1 328 3 add.u64 %r183,%r183,8; add.u64 %r180,%r180,24; add.u64 %r177,%r177,8; setp.ne.u64 %r394,%r138,%r161; @ %r394 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r46,%r43,%r35; .loc 1 349 32 shl.b64 %r90,%r146,3; ld.u64 %r29,[%frame+480]; .loc 1 359 22 ld.u64 %r92,[%frame+240]; .loc 1 359 12 shl.b64 %r94,%r92,3; .loc 1 360 23 ld.u64 %r95,[%frame]; .loc 1 361 22 ld.u64 %r97,[%frame+120]; .loc 1 361 12 shl.b64 %r99,%r97,3; setp.le.s64 %r432,%r142,1; cvt.u32.u32 %r434,%r23; cvt.s64.s8 %r435,%r434; add.u64 %r436,%r435,-1; add.u64 %r437,%frame,240; add.u64 %r438,%frame,360; add.u64 %r439,%frame,120; .loc 1 367 13 mov.u64 %r440,0; $L87: .loc 1 328 3 mov.u64 %r160,%r156; mov.u64 %r159,%r155; .loc 1 348 10 mov.u64 %r133,0; .loc 1 349 9 mov.u64 %r158,%r133; $L80: .loc 1 352 6 ld.s8 %r396,[%r160]; cvt.u16.u32 %r395,%r396; setp.eq.u16 %r397,%r395,0; @ %r397 bra $L79; .loc 1 353 12 ld.u64 %r398,[%r159]; xor.b64 %r133,%r133,%r398; $L79: .loc 1 349 24 add.u64 %r158,%r158,1; .loc 1 349 32 add.u64 %r159,%r159,%r90; .loc 1 349 47 add.u64 %r160,%r160,%r46; .loc 1 349 2 setp.ne.u64 %r399,%r143,%r158; @ %r399 bra $L80; .loc 1 355 8 st.u64 [%r157],%r133; .loc 1 358 15 add.u64 %r29,%r29,1; .loc 1 359 12 add.u64 %r155,%r155,%r94; .loc 1 360 13 add.u64 %r156,%r156,%r95; .loc 1 361 12 add.u64 %r157,%r157,%r99; .loc 1 363 32 ld.u64 %r84,[%frame+360]; .loc 1 363 13 setp.ne.u64 %r400,%r84,%r29; @ %r400 bra $L87; .loc 1 370 23 mul.lo.u64 %r102,%r29,%r92; .loc 1 371 24 mul.lo.u64 %r401,%r29,%r95; .loc 1 371 10 sub.u64 %r152,%r156,%r401; .loc 1 372 23 mul.lo.u64 %r402,%r29,%r97; .loc 1 372 9 shl.b64 %r403,%r402,3; sub.u64 %r153,%r157,%r403; .loc 1 374 7 @ ! 
%r432 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r440; .loc 1 370 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 371 24 mul.lo.u64 %r406,%r116,%r113; .loc 1 371 10 sub.u64 %r152,%r156,%r406; .loc 1 372 23 mul.lo.u64 %r407,%r118,%r113; .loc 1 372 9 shl.b64 %r408,%r407,3; sub.u64 %r153,%r157,%r408; .loc 1 373 5 add.u64 %r154,%r154,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r30,%r30,8; setp.ne.u64 %r409,%r154,%r436; @ %r409 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; mov.u64 %r30,8; .loc 1 373 5 mov.u64 %r154,1; $L84: .loc 1 382 16 ld.u64 %r413,[%r41]; add.u64 %r113,%r413,1; st.u64 [%r41],%r113; .loc 1 383 23 add.u64 %r415,%r437,%r30; ld.u64 %r114,[%r415]; .loc 1 383 13 sub.u64 %r416,%r114,%r102; shl.b64 %r417,%r416,3; add.u64 %r155,%r155,%r417; .loc 1 384 24 add.u64 %r418,%frame,%r30; ld.u64 %r116,[%r418]; .loc 1 384 14 add.u64 %r156,%r152,%r116; .loc 1 385 23 add.u64 %r420,%r439,%r30; ld.u64 %r118,[%r420]; .loc 1 385 13 shl.b64 %r421,%r118,3; add.u64 %r157,%r153,%r421; .loc 1 363 32 add.u64 %r423,%r438,%r30; ld.u64 %r121,[%r423]; .loc 1 363 13 setp.eq.u64 %r424,%r113,%r121; @ %r424 bra $L85; .loc 1 367 13 mov.u64 %r29,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r425,[%r210]; setp.eq.u64 %r426,%r425,0; @ ! %r426 bra $L63; bra $L86; $L48:_gfortran_siparity_i8 .visible .func _gfortran_siparity_68u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64predmov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_iparity_.loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r294,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r294; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r294; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r300,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r300; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r293,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r293,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r299,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r299; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r294,0; @ %r211 bra $L127; add.u64 %r293,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r293; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r24496244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r294,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r294; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r294; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r294,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; $L132: 
.loc 1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r294,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r294,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,3; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r292,%r294,1; .loc 1 512 13 mov.u64 %r278,0; cvt.u32.u32 %r296,%r24; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; $L146: st.u64 [%r98],%r278; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r279,%r59,%r80; @ %r279 bra $L146; .loc 1 525 7 @ ! %r292 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r278; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r282,%r97,%r298; @ %r282 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; $L144: .loc 1 529 16 ld.u64 %r287,[%r99]; add.u64 %r74,%r287,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r288,%r75,%r72; shl.b64 %r289,%r288,3; add.u64 %r98,%r98,%r289; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r290,%r74,%r77; @ %r290 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r291,%r98,0; @ ! 
%r291 bra $L122; bra $L147; $L109: .loc 1 534 1 ret; } iparity_i16.o/_gfortran_iparity_i16 .visible .func _gfortran_iparity_i16fortran/generated/iparity_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_miparity_i16 .visible .func _gfortran_miparity_i16gfortran_siparity_i16 .visible .func _gfortran_siparity_i16VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents,105,110,99,111,114,114,101,99,116,32,105,110,32,73,80,65,82,73,84,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,80,65,82,73,84,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,108] = {73,80,65,82,73,84,89114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,73,80,65,82,73,84,89,32,105,110,116,114,105,11077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,73,80,65,82,73,84,89,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_iparity_i16 .visible .func _gfortran_iparity_i16480u64 %r52; .reg .u64 %r58; .reg .u64 %r61; .reg .u649pred %r217; .reg .pred64u64 %r251; .reg .predu64 %r256; .reg .u64 %r257; .reg .u32u32 %r268; .reg .u64u6464predu64 %r370; mov.u64 %r179,%ar0; mov.u64 %r180,%ar1; mov.u64 %r181,%ar2; .loc 1 55 10 ld.s8 %r22,[%r180+28]; .loc 1 55 38 add.u32 %r182,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r182; .loc 1 56 10 ld.u64 %r25,[%r181]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r184,%r104,63; set.u32.lt.s64 %r186,%r103,%r104; neg.s32 %r187,%r186; cvt.u16.u64 %r190,%r184; cvt.u16.u32 %r191,%r187; or.b16 %r189,%r190,%r191; .loc 1 58 6 cvt.u32.u16 %r192,%r189; cvt.u16.u8 %r193,%r192; setp.eq.u16 %r194,%r193,0; @ %r194 bra $L2; .loc 1 60 7 cvt.u32.u32 %r197,%r22; cvt.s64.s8 %r196,%r197; st.u64 [%stack+8],%r1961955_gfortran_runtime_errorr200,%r104,%r104; add.u64 %r201,%r200,%r104; shl.b64 %r202,%r201,3; add.u64 %r203,%r180,%r202; ld.u64 %r30,[%r203+56]; ld.u64 %r32,[%r203+48]; .loc 1 68 9 ld.u64 %r107,[%r203+40]; .loc 1 70 3 setp.ne.u64 %r217,%r104,0; @ %r217 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r218,%r103,%r104; @ %r218 bra $L4; bra $L46; $L3: add.u64 %r61,%r180,40; add.u64 %r40,%frame,120; add.u64 %r91,%frame,240; add.u64 %r219,%r180,16; add.u64 %r221,%r25,%r25; add.u64 %r222,%r221,%r25; shl.b64 %r223,%r222,3; add.u64 %r174,%r219,%r223; .loc 1 76 12 mov.u64 %r370,0; $L8: .loc 1 72 18 ld.u64 %r224,[%r61]; st.u64 [%r40],%r224; .loc 1 73 19 ld.u64 %r226,[%r61+16]; add.u64 %r225,%r226,1; ld.u64 %r227,[%r61+8]; sub.u64 %r37,%r225,%r227; .loc 1 75 10 setp.lt.s64 %r228,%r37,0; @ %r228 bra $L6; .loc 1 73 17 st.u64 [%r91],%r37; 
bra $L7; $L6: .loc 1 76 12 st.u64 [%r91],%r370; $L7: .loc 1 70 3 add.u64 %r61,%r61,24; add.u64 %r40,%r40,8; add.u64 %r91,%r91,8; setp.ne.u64 %r230,%r61,%r174; @ %r230 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r231,[%r179]; setp.eq.u64 %r232,%r231,0; @ ! %r232 bra $L11; bra $L10; $L4: add.u64 %r234,%r25,%r25; add.u64 %r235,%r234,%r25; shl.b64 %r236,%r235,3; add.u64 %r237,%r236,40; add.u64 %r143,%r180,%r237; shl.b64 %r238,%r25,3; add.u64 %r125,%r238,-8; add.u64 %r358,%frame,120; add.u64 %r128,%r358,%r125; add.u64 %r357,%frame,240; add.u64 %r121,%r357,%r125; cvt.u32.u32 %r242,%r22; cvt.s64.s8 %r241,%r242; add.u64 %r244,%r241,%r241; add.u64 %r245,%r244,%r241; shl.b64 %r246,%r245,3; add.u64 %r247,%r180,40; add.u64 %r95,%r246,%r247; .loc 1 84 12 mov.u64 %r369,0; $L14: .loc 1 80 18 ld.u64 %r248,[%r143]; st.u64 [%r128],%r248; .loc 1 81 19 ld.u64 %r250,[%r143+16]; add.u64 %r249,%r250,1; ld.u64 %r251,[%r143+8]; sub.u64 %r44,%r249,%r251; .loc 1 83 10 setp.lt.s64 %r252,%r44,0; @ %r252 bra $L12; .loc 1 81 17 st.u64 [%r121],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r121],%r369; $L13: .loc 1 78 3 add.u64 %r143,%r143,24; add.u64 %r128,%r128,8; add.u64 %r121,%r121,8; setp.ne.u64 %r254,%r95,%r143; @ %r254 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r255,%r103,0; @ %r255 bra $L16; add.u64 %r357,%frame,240; $L34: add.u64 %r152,%r179,40; mov.u64 %r150,%r357; add.u64 %r256,%r179,16; cvt.u32.u32 %r258,%r22; cvt.s64.s8 %r257,%r258; add.u64 %r260,%r257,%r257; add.u64 %r261,%r260,%r257; shl.b64 %r262,%r261,3; add.u64 %r144,%r256,%r262; .loc 1 94 10 mov.u64 %r99,1; .loc 1 98 4 mov.u64 %r285,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r263,0; st.u64 [%r179+8],%r263; .loc 1 103 28 cvt.u16.u32 %r266,%r22; add.u16 %r265,%r266,-1; cvt.u32.u16 %r267,%r265; st.u8 [%r179+28],%r267; .loc 1 105 20 add.u32 %r268,%r22,-2; cvt.s64.s32 %r52,%r268; add.u64 %r270,%r52,%r52; add.u64 %r271,%r270,%r52; shl.b64 %r272,%r271,3; add.u64 %r273,%r179,%r272; .loc 1 105 67 shl.b64 %r275,%r52,3; add.u64 %r276,%frame,%r275; .loc 1 105 59 ld.u64 %r278,[%r273+40]; ld.u64 %r279,[%r276+240]; mul.lo.u64 %r109,%r278,%r279; .loc 1 107 29 mov.u64 %r281,16281; call (%value_in),_gfortrani_xmallocarray282,[%value_in]; } .loc 1 107 27 st.u64 [%r179],%r282; .loc 1 108 10 setp.eq.u64 %r284,%r109,0; @ ! 
%r284 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r99,%r48,%r99; $L17: .loc 1 98 4 st.u64 [%r152+8],%r285; ld.u64 %r48,[%r150]; add.u64 %r286,%r48,-1; st.u64 [%r152+16],%r286; st.u64 [%r152],%r99; .loc 1 91 7 add.u64 %r152,%r152,24; add.u64 %r150,%r150,8; setp.ne.u64 %r287,%r144,%r152; @ %r287 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r179+48],%r109; mov.u64 %r289,-1; st.u64 [%r179+56],%r289; mov.u64 %r290,1; st.u64 [%r179+40],%r290; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r58,[%r179+28]; .loc 1 118 10 setp.eq.u64 %r291,%r58,%r103; @ %r291 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r58;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r294,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r295,[%r294+36]; setp.eq.u32 %r296,%r295,0; @ %r296 bra $L21; .loc 1 125 2 add.u64 %r357,%frame,240; cvta.const.u64 %r300,$LC2r357299300; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r302,%r103,0; @ %r302 bra $L24; $L27: .loc 1 65 9 add.u64 %r303,%r30,1; .loc 1 65 7 sub.u64 %r105,%r303,%r32; max.s64 %r45,%r105,0; .loc 1 137 8 ld.u64 %r116,[%r180]; .loc 1 138 8 ld.u64 %r117,[%r179]; .loc 1 154 36 shl.b64 %r66,%r107,4; ld.u64 %r138,[%frame+360]; .loc 1 166 22 ld.u64 %r68,[%frame+120]; .loc 1 166 12 shl.b64 %r69,%r68,4; .loc 1 167 22 ld.u64 %r71,[%frame]; .loc 1 167 12 shl.b64 %r72,%r71,4; .loc 1 169 32 ld.u64 %r123,[%frame+240]; .loc 1 176 23 mul.lo.u64 %r304,%r68,%r123; .loc 1 176 9 shl.b64 %r305,%r304,4; neg.s64 %r127,%r305; .loc 1 177 23 mul.lo.u64 %r306,%r71,%r123; .loc 1 177 9 shl.b64 %r307,%r306,4; neg.s64 %r130,%r307; setp.gt.s64 %r359,%r105,0; setp.le.s64 %r360,%r103,1; .loc 1 148 10 mov.u64 %r362,0; cvt.u32.u32 %r364,%r22; cvt.s64.s8 %r365,%r364; add.u64 %r366,%r365,-1; add.u64 %r367,%frame,120; add.u64 %r368,%frame,240; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r179,40; mov.u64 %r162,%frame; cvt.u32.u32 %r309,%r22; cvt.s64.s8 %r308,%r309; add.u64 %r153,%r308,-1; .loc 1 129 3 mov.u64 %r118,0; add.u64 %r357,%frame,240; .loc 1 131 16 mov.u64 %r310,%r118; $L26: st.u64 [%r165],%r310; .loc 1 132 18 ld.u64 %r311,[%r164]; st.u64 [%r162],%r311; .loc 1 133 17 shl.b64 %r313,%r118,3; add.u64 %r314,%r357,%r313; .loc 1 133 10 ld.u64 %r315,[%r314]; setp.le.s64 %r316,%r315,0; @ %r316 bra $L1; .loc 1 129 26 add.u64 %r118,%r118,1; .loc 1 129 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r317,%r118,%r153; @ %r317 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r359 bra $L35; .loc 1 150 10 st.u64 [%r117],%r362; st.u64 [%r117+8],%r362; bra $L29; $L35: mov.u64 %r112,%r116; .loc 1 148 10 mov.u64 %r353,%r362; mov.u64 %r354,%r362; .loc 1 154 13 mov.u64 %r111,0; $L28: .loc 1 158 10 ld.u64 %r323,[%r112]; ld.u64 %r326,[%r112+8]; xor.b64 %r353,%r323,%r353; xor.b64 %r354,%r326,%r354; .loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r66; .loc 1 154 6 setp.gt.s64 %r328,%r45,%r111; @ %r328 bra $L28; .loc 1 161 12 st.u64 [%r117],%r353; st.u64 [%r117+8],%r354; $L29: .loc 1 165 15 add.u64 %r138,%r138,1; .loc 1 166 12 add.u64 %r116,%r116,%r69; .loc 1 167 12 add.u64 %r117,%r117,%r72; .loc 1 169 13 setp.ne.u64 %r331,%r138,%r123; @ %r331 bra $L25; .loc 1 176 9 add.u64 %r113,%r116,%r127; .loc 1 177 9 add.u64 %r114,%r117,%r130; .loc 1 179 7 @ %r360 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r140,8; .loc 1 178 5 mov.u64 %r115,1; bra $L32; $L33: .loc 1 173 13 st.u64 [%r122],%r362; .loc 1 176 23 mul.lo.u64 %r337,%r85,%r84; .loc 1 176 9 shl.b64 %r338,%r337,4; sub.u64 %r113,%r116,%r338; .loc 1 
177 23 mul.lo.u64 %r339,%r88,%r84; .loc 1 177 9 shl.b64 %r340,%r339,4; sub.u64 %r114,%r117,%r340; .loc 1 178 5 add.u64 %r115,%r115,1; .loc 1 179 7 add.u64 %r122,%r122,8; add.u64 %r140,%r140,8; setp.eq.u64 %r341,%r115,%r366; @ %r341 bra $L1; $L32: .loc 1 187 16 ld.u64 %r342,[%r122]; add.u64 %r84,%r342,1; st.u64 [%r122],%r84; .loc 1 188 23 add.u64 %r344,%r367,%r140; ld.u64 %r85,[%r344]; .loc 1 188 13 shl.b64 %r345,%r85,4; add.u64 %r116,%r113,%r345; .loc 1 189 23 add.u64 %r346,%frame,%r140; ld.u64 %r88,[%r346]; .loc 1 189 13 shl.b64 %r347,%r88,4; add.u64 %r117,%r114,%r347; .loc 1 169 32 add.u64 %r349,%r368,%r140; ld.u64 %r92,[%r349]; .loc 1 169 13 setp.eq.u64 %r350,%r84,%r92; @ %r350 bra $L33; .loc 1 173 13 mov.u64 %r138,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r351,[%r179]; setp.eq.u64 %r352,%r351,0; @ ! %r352 bra $L11; bra $L34; $L1: .loc 1 19_gfortran_miparity_i16 .visible .func _gfortran_miparity_i16608u64u16 %r263; .reg .pred %r264; .reg .pred %r266; .reg .u64 %r270; .reg .u64 %r271; .reg .u64 %r277; .reg .pred %r279; .reg .pred %r280; .reg .u64 %r281; .reg .u64 %r283; .reg .u64u64u64 %r318; .reg .u64pred %r323; .reg .u64 %r325; .reg .u64 %r327; .reg .predu64 %r334; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .u32u64 %r342; .reg .u64pred %r358; .reg .u64predu64 %r391; .reg .predpred %r399; .reg .u64 %r402; .reg .u64 %r405; .reg .pred %r407; .reg .predu64 %r435; .reg .u64 %r438; .reg .u64 %r439; .reg .u64 %r440; .reg .u64 %r442; .reg .predmov.u64 %r211,%ar2; mov.u64 %r212,%ar3; .loc 1 223 6 setp.ne.u64 %r213,%r212,0; @ %r213 bra $L49gfortran_iparity_i16.loc 1 230 7 bra $L48; $L49: .loc 1 233 10 ld.u64 %r22,[%r211]; .loc 1 233 7 add.u64 %r137,%r22,-1; .loc 1 234 10 ld.s8 %r23,[%r210+28]; .loc 1 234 38 add.u32 %r217,%r23,-1; .loc 1 234 8 cvt.s64.s32 %r138,%r217; .loc 1 237 7 shr.u64 %r219,%r137,63; set.u32.gt.s64 %r221,%r137,%r138; neg.s32 %r222,%r221; cvt.u16.u64 %r225,%r219; cvt.u16.u32 %r226,%r222; or.b16 %r224,%r225,%r226; .loc 1 237 6 cvt.u32.u16 %r227,%r224; cvt.u16.u8 %r228,%r227; setp.eq.u16 %r229,%r228,0; @ %r229 bra $L51; .loc 1 239 7 cvt.u32.u32 %r232,%r23; cvt.s64.s8 %r231,%r232; st.u64 [%stack+8],%r231; st.u64 [%stack],%r22; cvta.const.u64 %r2302stack; call _gfortran_runtime_error1: .loc 1 244 9 add.u64 %r440,%r137,%r137; add.u64 %r442,%r440,%r137; shl.b64 %r237,%r442,3; add.u64 %r238,%r210,%r237; ld.u64 %r241,[%r238+56]; add.u64 %r240,%r241,1; .loc 1 244 7 ld.u64 %r248,[%r238+48]; sub.u64 %r139,%r240,%r248; .loc 1 245 6 setp.le.s64 %r249,%r139,0; @ %r249 bra $L48; .loc 1 248 9 ld.u64 %r152,[%r212]; .loc 1 250 15 ld.u64 %r34,[%r212+16]; .loc 1 252 22 cvt.u32.u64 %r35,%r34; .loc 1 252 53 add.u32 %r250,%r35,-4; and.b32 %r251,%r250,-5; set.u32.eq.u32 %r253,%r251,0; neg.s32 %r254,%r253; .loc 1 252 22 add.u32 %r255,%r35,-1; set.u32.le.u32 %r257,%r255,1; neg.s32 %r258,%r257; .loc 1 252 6 cvt.u16.u32 %r260,%r254; cvt.u16.u32 %r261,%r258; or.b16 %r259,%r260,%r261; cvt.u32.u16 %r262,%r259; cvt.u16.u8 %r263,%r262; setp.ne.u16 %r264,%r263,0; @ %r264 bra $L53; .loc 1 254 7 setp.ne.u32 %r266,%r35,16; @ %r266 bra $L54; $L53: .loc 1 261 9 shl.b64 %r270,%r442,3; add.u64 %r271,%r210,%r270; ld.u64 %r142,[%r271+40]; .loc 1 262 12 add.u64 %r277,%r212,%r270; ld.u64 %r42,[%r277+40]; .loc 1 264 3 setp.ne.u64 %r279,%r137,0; @ %r279 bra $L55; $L61: .loc 1 274 3 setp.lt.s64 %r280,%r137,%r138; @ %r280 bra $L56; bra $L105; $L54: .loc 1 259 5 cvta.const.u64 %r281281_gfortran_runtime_error5: add.u64 %r40,%r210,40; add.u64 %r179,%r212,40; add.u64 %r283,%r210,16; add.u64 %r285,%r22,%r22; add.u64 
%r286,%r285,%r22; shl.b64 %r287,%r286,3; add.u64 %r206,%r283,%r287; .loc 1 264 3 mov.u64 %r92,0; add.u64 %r439,%frame,240; add.u64 %r438,%frame,360; .loc 1 271 12 mov.u64 %r453,%r92; $L60: .loc 1 266 18 add.u64 %r289,%r439,%r92; ld.u64 %r290,[%r40]; st.u64 [%r289],%r290; .loc 1 267 18 add.u64 %r291,%frame,%r92; .loc 1 267 20 ld.u64 %r293,[%r179]; mul.lo.u64 %r292,%r293,%r34; .loc 1 267 18 st.u64 [%r291],%r292; .loc 1 268 19 ld.u64 %r295,[%r40+16]; add.u64 %r294,%r295,1; ld.u64 %r296,[%r40+8]; sub.u64 %r54,%r294,%r296; .loc 1 270 10 setp.lt.s64 %r297,%r54,0; @ %r297 bra $L58; .loc 1 268 17 add.u64 %r299,%r438,%r92; st.u64 [%r299],%r54; bra $L59; $L58: .loc 1 271 12 add.u64 %r301,%r438,%r92; st.u64 [%r301],%r453; $L59: .loc 1 264 3 add.u64 %r40,%r40,24; add.u64 %r179,%r179,24; add.u64 %r92,%r92,8; setp.ne.u64 %r303,%r40,%r206; @ %r303 bra $L60; bra $L61; $L105: .loc 1 284 6 ld.u64 %r304,[%r209]; setp.eq.u64 %r305,%r304,0; @ ! %r305 bra $L63; bra $L62; $L56: add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r127,%r309,40; add.u64 %r133,%r210,%r127; add.u64 %r125,%r212,%r127; shl.b64 %r310,%r22,3; add.u64 %r103,%r310,-8; cvt.u32.u32 %r312,%r23; cvt.s64.s8 %r311,%r312; shl.b64 %r313,%r311,3; add.u64 %r69,%r313,-8; add.u64 %r439,%frame,240; add.u64 %r438,%frame,360; .loc 1 281 12 mov.u64 %r452,0; $L66: .loc 1 276 18 add.u64 %r315,%r439,%r103; ld.u64 %r316,[%r133]; st.u64 [%r315],%r316; .loc 1 277 18 add.u64 %r317,%frame,%r103; .loc 1 277 20 ld.u64 %r319,[%r125]; mul.lo.u64 %r318,%r319,%r34; .loc 1 277 18 st.u64 [%r317],%r318; .loc 1 278 19 ld.u64 %r321,[%r133+16]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r133+8]; sub.u64 %r66,%r320,%r322; .loc 1 280 10 setp.lt.s64 %r323,%r66,0; @ %r323 bra $L64; .loc 1 278 17 add.u64 %r325,%r438,%r103; st.u64 [%r325],%r66; bra $L65; $L64: .loc 1 281 12 add.u64 %r327,%r438,%r103; st.u64 [%r327],%r452; $L65: .loc 1 274 3 add.u64 %r133,%r133,24; add.u64 %r125,%r125,24; add.u64 %r103,%r103,8; setp.ne.u64 %r329,%r69,%r103; @ %r329 bra $L66; bra $L106; $L62: .loc 1 288 7 setp.eq.u64 %r330,%r138,0; @ %r330 bra $L68; add.u64 %r438,%frame,360; $L86: add.u64 %r172,%r209,40; mov.u64 %r164,%r438; cvt.u32.u32 %r332,%r23; cvt.s64.s8 %r331,%r332; add.u64 %r334,%r331,%r331; add.u64 %r335,%r334,%r331; shl.b64 %r336,%r335,3; add.u64 %r337,%r209,16; add.u64 %r134,%r336,%r337; .loc 1 291 10 mov.u64 %r129,1; .loc 1 295 4 mov.u64 %r356,0; bra $L69; $L68: .loc 1 299 20 add.u32 %r338,%r23,-2; cvt.s64.s32 %r75,%r338; add.u64 %r340,%r75,%r75; add.u64 %r341,%r340,%r75; shl.b64 %r342,%r341,3; add.u64 %r343,%r209,%r342; .loc 1 299 67 shl.b64 %r345,%r75,3; add.u64 %r346,%frame,%r345; .loc 1 299 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r146,%r348,%r349; .loc 1 301 24 mov.u64 %r350,0; st.u64 [%r209+8],%r350; .loc 1 302 28 cvt.u16.u32 %r353,%r23; add.u16 %r352,%r353,-1; cvt.u32.u16 %r354,%r352; st.u8 [%r209+28],%r354; .loc 1 304 10 setp.eq.u64 %r355,%r146,0; @ %r355 bra $L70; bra $L107; $L72: .loc 1 293 47 mul.lo.u64 %r129,%r72,%r129; $L69: .loc 1 295 4 st.u64 [%r172+8],%r356; ld.u64 %r72,[%r164]; add.u64 %r357,%r72,-1; st.u64 [%r172+16],%r357; st.u64 [%r172],%r129; .loc 1 288 7 add.u64 %r172,%r172,24; add.u64 %r164,%r164,8; setp.ne.u64 %r358,%r134,%r172; @ %r358 bra $L72; bra $L68; $L70: .loc 1 307 4 st.u64 [%r209+48],%r146; mov.u64 %r360,-1; st.u64 [%r209+56],%r360; mov.u64 %r361,1; st.u64 [%r209+40],%r361; .loc 1 308 4 bra $L48; $L107: .loc 1 311 24 mov.u64 %r363,16363; call (%value_in),_gfortrani_xmallocarray64,[%value_in]; } .loc 1 311 
22 st.u64 [%r209],%r364; bra $L73; $L63: .loc 1 316 19 ld.s8 %r366,[%r209+28]; .loc 1 316 10 setp.eq.u64 %r367,%r366,%r138; @ %r367 bra $L74; .loc 1 317 2 cvta.const.u64 %r36836_gfortran_runtime_error74: .loc 1 319 11 cvta.global.u64 %r370,_gfortrani_compile_options; .loc 1 319 10 ld.u32 %r371,[%r370+36]; setp.eq.u32 %r372,%r371,0; @ %r372 bra $L73; .loc 1 321 4 add.u64 %r438,%frame,360; cvta.const.u64 %r376,$LC2r437376; call _gfortrani_bounds_ifunction_return323 4 cvta.const.u64 %r380,$LC1210380376; call _gfortrani_bounds_equal_extents$L73: .loc 1 328 3 setp.ne.u64 %r382,%r138,0; @ %r382 bra $L75; $L78: .loc 1 336 8 ld.u64 %r153,[%r209]; .loc 1 337 8 ld.u64 %r151,[%r210]; .loc 1 339 9 setp.ne.u64 %r383,%r151,0; @ %r383 bra $L76; bra $L48; $L75: add.u64 %r190,%frame,480; add.u64 %r189,%r209,40; add.u64 %r187,%frame,120; cvt.u32.u32 %r385,%r23; cvt.s64.s8 %r384,%r385; add.u64 %r175,%r384,-1; .loc 1 328 3 mov.u64 %r157,0; add.u64 %r438,%frame,360; .loc 1 330 16 mov.u64 %r386,%r157; $L77: st.u64 [%r190],%r386; .loc 1 331 18 ld.u64 %r387,[%r189]; st.u64 [%r187],%r387; .loc 1 332 17 shl.b64 %r389,%r157,3; add.u64 %r390,%r438,%r389; .loc 1 332 10 ld.u64 %r391,[%r390]; setp.le.s64 %r392,%r391,0; @ %r392 bra $L48; .loc 1 328 26 add.u64 %r157,%r157,1; .loc 1 328 3 add.u64 %r190,%r190,8; add.u64 %r189,%r189,24; add.u64 %r187,%r187,8; setp.ne.u64 %r393,%r157,%r175; @ %r393 bra $L77; bra $L78; $L76: .loc 1 262 12 mul.lo.u64 %r45,%r42,%r34; .loc 1 349 32 shl.b64 %r91,%r142,4; ld.u64 %r39,[%frame+480]; .loc 1 359 22 ld.u64 %r94,[%frame+240]; .loc 1 359 12 shl.b64 %r95,%r94,4; .loc 1 360 23 ld.u64 %r97,[%frame]; .loc 1 361 22 ld.u64 %r99,[%frame+120]; .loc 1 361 12 shl.b64 %r100,%r99,4; .loc 1 363 32 ld.u64 %r85,[%frame+360]; .loc 1 370 23 mul.lo.u64 %r38,%r85,%r94; .loc 1 371 24 mul.lo.u64 %r394,%r85,%r97; .loc 1 371 10 neg.s64 %r163,%r394; .loc 1 372 23 mul.lo.u64 %r395,%r85,%r99; .loc 1 372 9 shl.b64 %r396,%r395,4; neg.s64 %r165,%r396; setp.le.s64 %r443,%r138,1; .loc 1 348 10 mov.u64 %r445,0; cvt.u32.u32 %r446,%r23; cvt.s64.s8 %r447,%r446; add.u64 %r448,%r447,-1; add.u64 %r449,%frame,240; add.u64 %r450,%frame,360; add.u64 %r451,%frame,120; $L87: .loc 1 328 3 mov.u64 %r156,%r152; mov.u64 %r155,%r151; .loc 1 348 10 mov.u64 %r434,%r445; mov.u64 %r435,0; .loc 1 349 9 mov.u64 %r154,%r435; $L80: .loc 1 352 6 ld.s8 %r398,[%r156]; cvt.u16.u32 %r397,%r398; setp.eq.u16 %r399,%r397,0; @ %r399 bra $L79; .loc 1 353 12 ld.u64 %r402,[%r155]; ld.u64 %r405,[%r155+8]; xor.b64 %r434,%r402,%r434; xor.b64 %r435,%r405,%r435; $L79: .loc 1 349 24 add.u64 %r154,%r154,1; .loc 1 349 32 add.u64 %r155,%r155,%r91; .loc 1 349 47 add.u64 %r156,%r156,%r45; .loc 1 349 2 setp.ne.u64 %r407,%r139,%r154; @ %r407 bra $L80; .loc 1 355 8 st.u64 [%r153],%r434; st.u64 [%r153+8],%r435; .loc 1 358 15 add.u64 %r39,%r39,1; .loc 1 359 12 add.u64 %r151,%r151,%r95; .loc 1 360 13 add.u64 %r152,%r152,%r97; .loc 1 361 12 add.u64 %r153,%r153,%r100; .loc 1 363 13 setp.ne.u64 %r410,%r85,%r39; @ %r410 bra $L87; .loc 1 371 10 add.u64 %r148,%r152,%r163; .loc 1 372 9 add.u64 %r149,%r153,%r165; .loc 1 374 7 @ ! 
%r443 bra $L108; bra $L48; $L85: .loc 1 367 13 st.u64 [%r41],%r445; .loc 1 370 23 mul.lo.u64 %r102,%r114,%r113; .loc 1 371 24 mul.lo.u64 %r413,%r116,%r113; .loc 1 371 10 sub.u64 %r148,%r152,%r413; .loc 1 372 23 mul.lo.u64 %r414,%r118,%r113; .loc 1 372 9 shl.b64 %r415,%r414,4; sub.u64 %r149,%r153,%r415; .loc 1 373 5 add.u64 %r150,%r150,1; .loc 1 374 7 add.u64 %r41,%r41,8; add.u64 %r166,%r166,8; setp.ne.u64 %r416,%r150,%r448; @ %r416 bra $L84; bra $L48; $L108: add.u64 %r41,%frame,488; .loc 1 370 23 mov.u64 %r102,%r38; .loc 1 374 7 mov.u64 %r166,8; .loc 1 373 5 mov.u64 %r150,1; $L84: .loc 1 382 16 ld.u64 %r420,[%r41]; add.u64 %r113,%r420,1; st.u64 [%r41],%r113; .loc 1 383 23 add.u64 %r422,%r449,%r166; ld.u64 %r114,[%r422]; .loc 1 383 13 sub.u64 %r423,%r114,%r102; shl.b64 %r424,%r423,4; add.u64 %r151,%r151,%r424; .loc 1 384 24 add.u64 %r425,%frame,%r166; ld.u64 %r116,[%r425]; .loc 1 384 14 add.u64 %r152,%r148,%r116; .loc 1 385 23 add.u64 %r427,%r451,%r166; ld.u64 %r118,[%r427]; .loc 1 385 13 shl.b64 %r428,%r118,4; add.u64 %r153,%r149,%r428; .loc 1 363 32 add.u64 %r430,%r450,%r166; ld.u64 %r122,[%r430]; .loc 1 363 13 setp.eq.u64 %r431,%r113,%r122; @ %r431 bra $L85; .loc 1 367 13 mov.u64 %r39,0; bra $L87; $L106: .loc 1 284 6 ld.u64 %r432,[%r209]; setp.eq.u64 %r433,%r432,0; @ ! %r433 bra $L63; bra $L86; $L48:_gfortran_siparity_i16 .visible .func _gfortran_siparity_i1668u64 %r107; .reg .u64u16 %r168; .reg .u32 %r169; .reg .u16 %r170; .reg .pred %r171; .reg .predpred %r186; .reg .predu64predu64u16 %r221; .reg .u16 %r222; .reg .u32pred %r239; .reg .u64predu32 %r252; .reg .predu64 %r264; .reg .u16 %r265; .reg .pred %r266; .reg .u64 %r268; .reg .u32 %r272; .reg .u64 %r274; .reg .u64predpred %r292; .reg .predu32u64 %r301; mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 412 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L110; .loc 1 412 20 ld.u32 %r155,[%r153]; setp.eq.u32 %r156,%r155,0; @ %r156 bra $L111; $L110:r152; call _gfortran_iparity_i16,(%out_arg1,%out_arg2,%out_arg3); } .loc 1 419 7 bra $L109; $L111: .loc 1 422 10 ld.u64 %r23,[%r152]; .loc 1 422 7 add.u64 %r88,%r23,-1; .loc 1 423 10 ld.s8 %r24,[%r151+28]; .loc 1 423 38 add.u32 %r26,%r24,-1; .loc 1 423 8 cvt.s64.s32 %r295,%r26; .loc 1 425 7 shr.u64 %r161,%r88,63; set.u32.gt.s64 %r163,%r88,%r295; neg.s32 %r164,%r163; cvt.u16.u64 %r167,%r161; cvt.u16.u32 %r168,%r164; or.b16 %r166,%r167,%r168; .loc 1 425 6 cvt.u32.u16 %r169,%r166; cvt.u16.u8 %r170,%r169; setp.ne.u16 %r171,%r170,0; @ %r171 bra $L113; .loc 1 432 3 setp.ne.u64 %r172,%r88,0; @ %r172 bra $L114; $L120: .loc 1 440 3 setp.lt.s64 %r173,%r88,%r295; @ %r173 bra $L115; bra $L168; $L113: .loc 1 427 7 cvt.u32.u32 %r176,%r24; cvt.s64.s8 %r175,%r176; st.u64 [%stack+8],%r175174174_gfortran_runtime_error114: add.u64 %r137,%r151,48; add.u64 %r139,%frame,120; add.u64 %r140,%r151,56; add.u64 %r178,%r151,24; add.u64 %r180,%r23,%r23; add.u64 %r181,%r180,%r23; shl.b64 %r182,%r181,3; add.u64 %r146,%r178,%r182; .loc 1 437 12 mov.u64 %r301,0; $L119: .loc 1 434 19 ld.u64 %r184,[%r140]; add.u64 %r183,%r184,1; ld.u64 %r185,[%r137]; sub.u64 %r36,%r183,%r185; .loc 1 436 10 setp.le.s64 %r186,%r36,0; @ %r186 bra $L117; .loc 1 434 17 st.u64 [%r139],%r36; bra $L118; $L117: .loc 1 437 12 st.u64 [%r139],%r301; $L118: .loc 1 432 3 add.u64 %r137,%r137,24; add.u64 %r139,%r139,8; add.u64 %r140,%r140,24; setp.ne.u64 %r188,%r137,%r146; @ %r188 bra $L119; bra $L120; $L168: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r189,%r98,0; @ ! 
%r189 bra $L122; bra $L121; $L115: add.u64 %r191,%r23,%r23; add.u64 %r192,%r191,%r23; shl.b64 %r193,%r192,3; add.u64 %r194,%r193,48; add.u64 %r127,%r151,%r194; add.u64 %r294,%frame,120; shl.b64 %r196,%r23,3; add.u64 %r197,%r196,-8; add.u64 %r43,%r294,%r197; add.u64 %r198,%r151,48; cvt.u32.u32 %r200,%r24; cvt.s64.s8 %r199,%r200; add.u64 %r202,%r199,%r199; add.u64 %r203,%r202,%r199; shl.b64 %r204,%r203,3; add.u64 %r133,%r198,%r204; .loc 1 446 12 mov.u64 %r300,0; $L125: .loc 1 443 2 ld.u64 %r206,[%r127+8]; add.u64 %r205,%r206,1; ld.u64 %r207,[%r127]; sub.u64 %r41,%r205,%r207; .loc 1 445 10 setp.le.s64 %r208,%r41,0; @ %r208 bra $L123; .loc 1 442 17 st.u64 [%r43],%r41; bra $L124; $L123: .loc 1 446 12 st.u64 [%r43],%r300; $L124: .loc 1 440 3 add.u64 %r127,%r127,24; add.u64 %r43,%r43,8; setp.ne.u64 %r210,%r127,%r133; @ %r210 bra $L125; bra $L169; $L121: .loc 1 453 7 setp.eq.u64 %r211,%r295,0; @ %r211 bra $L127; add.u64 %r294,%frame,120; $L147: add.u64 %r111,%r150,40; mov.u64 %r107,%r294; cvt.u32.u32 %r213,%r24; cvt.s64.s8 %r212,%r213; add.u64 %r215,%r212,%r212; add.u64 %r216,%r215,%r212; shl.b64 %r217,%r216,3; add.u64 %r218,%r150,16; add.u64 %r79,%r217,%r218; .loc 1 456 10 mov.u64 %r81,1; .loc 1 460 4 mov.u64 %r237,0; bra $L128; $L127: .loc 1 464 24 mov.u64 %r219,0; st.u64 [%r150+8],%r219; .loc 1 465 28 cvt.u16.u32 %r222,%r24; add.u16 %r221,%r222,-1; cvt.u32.u16 %r223,%r221; st.u8 [%r150+28],%r223; .loc 1 467 20 add.u32 %r224,%r24,-2; cvt.s64.s32 %r50,%r224; add.u64 %r226,%r50,%r50; add.u64 %r227,%r226,%r50; shl.b64 %r228,%r227,3; add.u64 %r229,%r150,%r228; .loc 1 467 67 shl.b64 %r231,%r50,3; add.u64 %r232,%frame,%r231; .loc 1 467 59 ld.u64 %r234,[%r229+40]; ld.u64 %r235,[%r232+120]; mul.lo.u64 %r96,%r234,%r235; .loc 1 469 10 setp.eq.u64 %r236,%r96,0; @ %r236 bra $L129; bra $L170; $L131: .loc 1 458 48 mul.lo.u64 %r81,%r46,%r81; $L128: .loc 1 460 4 st.u64 [%r111+8],%r237; ld.u64 %r46,[%r107]; add.u64 %r238,%r46,-1; st.u64 [%r111+16],%r238; st.u64 [%r111],%r81; .loc 1 453 7 add.u64 %r111,%r111,24; add.u64 %r107,%r107,8; setp.ne.u64 %r239,%r79,%r111; @ %r239 bra $L131; bra $L127; $L129: .loc 1 472 4 st.u64 [%r150+48],%r96; mov.u64 %r241,-1; st.u64 [%r150+56],%r241; mov.u64 %r242,1; st.u64 [%r150+40],%r242; .loc 1 473 4 bra $L109; $L170: .loc 1 476 24 mov.u64 %r244,1696244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } mov.u64 %r98,%r245; .loc 1 476 22 st.u64 [%r150],%r98; $L135: .loc 1 502 3 setp.ne.u64 %r247,%r295,0; @ %r247 bra $L132; bra $L133; $L122: .loc 1 480 19 ld.s8 %r55,[%r150+28]; .loc 1 480 10 setp.eq.u64 %r248,%r55,%r295; @ %r248 bra $L134; .loc 1 481 2 st.u64 [%stack+8],%r295; st.u64 [%stack],%r55;_gfortran_runtime_error134: .loc 1 486 11 cvta.global.u64 %r251,_gfortrani_compile_options; .loc 1 486 10 ld.u32 %r252,[%r251+36]; setp.eq.u32 %r253,%r252,0; @ %r253 bra $L135; .loc 1 488 4 setp.eq.u64 %r254,%r295,0; @ %r254 bra $L133; add.u64 %r78,%r150,48; add.u64 %r31,%frame,120; cvt.u32.u32 %r256,%r24; cvt.s64.s8 %r255,%r256; add.u64 %r129,%r255,-1; .loc 1 488 10 mov.u64 %r104,0; $L137: .loc 1 492 21 ld.u64 %r258,[%r78+8]; add.u64 %r257,%r258,1; .loc 1 492 19 ld.u64 %r259,[%r78]; sub.u64 %r93,%r257,%r259; .loc 1 493 18 ld.u64 %r63,[%r31]; .loc 1 488 25 add.u64 %r104,%r104,1; .loc 1 493 11 setp.eq.u64 %r260,%r63,%r93; @ %r260 bra $L136; .loc 1 494 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r261261_gfortran_runtime_error136: .loc 1 488 4 add.u64 %r78,%r78,24; add.u64 %r31,%r31,8; setp.ne.u64 %r263,%r129,%r104; @ %r263 bra $L137; 
$L132: .loc 1 504 16 cvt.u16.u32 %r265,%r24; setp.le.s16 %r266,%r265,1; @ %r266 bra $L138; shl.b64 %r264,%r295,3; bra $L139; $L138: mov.u64 %r264,8; $L139: add.u64 %r268,%frame,240; mov.u32 %r2726722674,[%value_in]; } add.u64 %r116,%r150,40; mov.u64 %r114,%frame; mov.u64 %r100,0; $L140: .loc 1 505 18 ld.u64 %r276,[%r116]; st.u64 [%r114],%r276; .loc 1 502 26 add.u64 %r100,%r100,1; .loc 1 502 3 add.u64 %r116,%r116,24; add.u64 %r114,%r114,8; setp.gt.s64 %r277,%r295,%r100; @ %r277 bra $L140; $L133: ld.u64 %r80,[%frame+240]; .loc 1 514 22 ld.u64 %r68,[%frame]; .loc 1 514 12 shl.b64 %r69,%r68,4; .loc 1 516 32 ld.u64 %r59,[%frame+120]; .loc 1 523 23 mul.lo.u64 %r44,%r59,%r68; setp.le.s64 %r293,%r295,1; .loc 1 512 13 mov.u64 %r278,0; cvt.u32.u32 %r297,%r24; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; $L146: st.u64 [%r98],%r278; st.u64 [%r98+8],%r278; .loc 1 513 15 add.u64 %r80,%r80,1; .loc 1 514 12 add.u64 %r98,%r98,%r69; .loc 1 516 13 setp.ne.u64 %r280,%r59,%r80; @ %r280 bra $L146; .loc 1 525 7 @ ! %r293 bra $L171; bra $L109; $L145: .loc 1 520 13 st.u64 [%r99],%r278; .loc 1 523 23 mul.lo.u64 %r72,%r75,%r74; .loc 1 524 5 add.u64 %r97,%r97,1; .loc 1 525 7 add.u64 %r99,%r99,8; add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.ne.u64 %r283,%r97,%r299; @ %r283 bra $L144; bra $L109; $L171: add.u64 %r99,%frame,248; add.u64 %r124,%frame,8; add.u64 %r123,%frame,128; .loc 1 523 23 mov.u64 %r72,%r44; .loc 1 524 5 mov.u64 %r97,1; $L144: .loc 1 529 16 ld.u64 %r288,[%r99]; add.u64 %r74,%r288,1; st.u64 [%r99],%r74; .loc 1 530 23 ld.u64 %r75,[%r124]; .loc 1 530 13 sub.u64 %r289,%r75,%r72; shl.b64 %r290,%r289,4; add.u64 %r98,%r98,%r290; .loc 1 516 32 ld.u64 %r77,[%r123]; .loc 1 516 13 setp.eq.u64 %r291,%r74,%r77; @ %r291 bra $L145; .loc 1 520 13 mov.u64 %r80,0; bra $L146; $L169: .loc 1 449 15 ld.u64 %r98,[%r150]; .loc 1 449 6 setp.eq.u64 %r292,%r98,0; @ ! 
%r292 bra $L122; bra $L147; $L109: .loc 1 534 1 ret; } norm2_r4.o/_gfortran_norm2_r4 .visible .func _gfortran_norm2_r4fortran/generated/norm2_r4.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_retursqrtf .extern .func (.param .f32 %value_out) sqrtf,105,110,99,111,114,114,101,99,116,32,105,110,32,78,79,82,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,78,79,82,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,105] = {78,79,82,77114,101,116,117,114,110,32,118,97,108,11_gfortran_norm2_r4 .visible .func _gfortran_norm2_r4480101; .reg .u64 %r103; .reg .f32pred %r225; .reg .predpred %r236; .reg .pred %r238; .reg .u64u64pred %r263; .reg .u64 %r264; .reg .u32 %r265; .reg .u64 %r267; .reg .u64 %r268; .reg .u64 %r269; .reg .u64 %r270; .reg .u64 %r271; .reg .u16 %r273; .reg .u16u32u64 %r317; .reg .u64predpred %r326; .reg .pred %r327; .reg .f32 %r329; .reg .f32 %r330; .reg .pred %r331; .reg .u64 %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64pred %r345; .reg .u64mov.u64 %r187,%ar0; mov.u64 %r188,%ar1; mov.u64 %r189,%ar2; .loc 1 58 10 ld.s8 %r22,[%r188+28]; .loc 1 58 38 add.u32 %r190,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r113,%r190; .loc 1 59 10 ld.u64 %r25,[%r189]; .loc 1 59 7 add.u64 %r114,%r25,-1; .loc 1 61 7 shr.u64 %r192,%r114,63; set.u32.lt.s64 %r194,%r113,%r114; neg.s32 %r195,%r194; cvt.u16.u64 %r198,%r192; cvt.u16.u32 %r199,%r195; or.b16 %r197,%r198,%r199; .loc 1 61 6 cvt.u32.u16 %r200,%r197; cvt.u16.u8 %r201,%r200; setp.eq.u16 %r202,%r201,0; @ %r202 bra $L2; .loc 1 63 7 cvt.u32.u32 %r205,%r22; cvt.s64.s8 %r204,%r205; st.u64 [%stack+8],%r20403203_gfortran_runtime_errorr208,%r114,%r114; add.u64 %r209,%r208,%r114; shl.b64 %r210,%r209,3; add.u64 %r211,%r188,%r210; ld.u64 %r31,[%r211+56]; ld.u64 %r33,[%r211+48]; .loc 1 71 9 ld.u64 %r117,[%r211+40]; .loc 1 73 3 setp.ne.u64 %r225,%r114,0; @ %r225 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r226,%r113,%r114; @ %r226 bra $L4; bra $L54; $L3: add.u64 %r112,%r188,40; add.u64 %r133,%frame,120; add.u64 %r185,%frame,240; add.u64 %r227,%r188,16; add.u64 %r229,%r25,%r25; add.u64 %r230,%r229,%r25; shl.b64 %r231,%r230,3; add.u64 %r75,%r227,%r231; .loc 1 79 12 mov.u64 %r368,0; $L8: .loc 1 75 18 ld.u64 %r232,[%r112]; st.u64 [%r133],%r232; .loc 1 76 19 ld.u64 %r234,[%r112+16]; add.u64 %r233,%r234,1; ld.u64 %r235,[%r112+8]; sub.u64 %r38,%r233,%r235; .loc 1 78 10 setp.lt.s64 %r236,%r38,0; @ %r236 bra $L6; .loc 1 76 17 st.u64 [%r185],%r38; bra $L7; $L6: .loc 1 79 12 st.u64 [%r185],%r368; $L7: .loc 1 73 3 add.u64 %r112,%r112,24; add.u64 %r133,%r133,8; add.u64 %r185,%r185,8; setp.ne.u64 %r238,%r75,%r112; @ %r238 bra $L8; bra $L9; $L54: .loc 1 90 6 ld.u64 %r239,[%r187]; setp.eq.u64 %r240,%r239,0; @ ! 
%r240 bra $L11; bra $L10; $L4: add.u64 %r242,%r25,%r25; add.u64 %r243,%r242,%r25; shl.b64 %r244,%r243,3; add.u64 %r245,%r244,40; add.u64 %r111,%r188,%r245; shl.b64 %r246,%r25,3; add.u64 %r101,%r246,-8; add.u64 %r358,%frame,120; add.u64 %r106,%r358,%r101; add.u64 %r357,%frame,240; add.u64 %r99,%r357,%r101; cvt.u32.u32 %r250,%r22; cvt.s64.s8 %r249,%r250; add.u64 %r252,%r249,%r249; add.u64 %r253,%r252,%r249; shl.b64 %r254,%r253,3; add.u64 %r255,%r188,40; add.u64 %r48,%r254,%r255; .loc 1 87 12 mov.u64 %r367,0; $L14: .loc 1 83 18 ld.u64 %r256,[%r111]; st.u64 [%r106],%r256; .loc 1 84 19 ld.u64 %r258,[%r111+16]; add.u64 %r257,%r258,1; ld.u64 %r259,[%r111+8]; sub.u64 %r45,%r257,%r259; .loc 1 86 10 setp.lt.s64 %r260,%r45,0; @ %r260 bra $L12; .loc 1 84 17 st.u64 [%r99],%r45; bra $L13; $L12: .loc 1 87 12 st.u64 [%r99],%r367; $L13: .loc 1 81 3 add.u64 %r111,%r111,24; add.u64 %r106,%r106,8; add.u64 %r99,%r99,8; setp.ne.u64 %r262,%r48,%r111; @ %r262 bra $L14; bra $L55; $L10: .loc 1 94 7 setp.le.s64 %r263,%r113,0; @ %r263 bra $L16; add.u64 %r357,%frame,240; $L40: add.u64 %r151,%r187,40; mov.u64 %r148,%r357; cvt.u32.u32 %r265,%r22; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r270,%r187,16; add.u64 %r134,%r269,%r270; .loc 1 97 10 mov.u64 %r103,1; .loc 1 101 4 mov.u64 %r293,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r271,0; st.u64 [%r187+8],%r271; .loc 1 106 28 cvt.u16.u32 %r274,%r22; add.u16 %r273,%r274,-1; cvt.u32.u16 %r275,%r273; st.u8 [%r187+28],%r275; .loc 1 108 20 add.u32 %r276,%r22,-2; cvt.s64.s32 %r56,%r276; add.u64 %r278,%r56,%r56; add.u64 %r279,%r278,%r56; shl.b64 %r280,%r279,3; add.u64 %r281,%r187,%r280; .loc 1 108 67 shl.b64 %r283,%r56,3; add.u64 %r284,%frame,%r283; .loc 1 108 59 ld.u64 %r286,[%r281+40]; ld.u64 %r287,[%r284+240]; mul.lo.u64 %r119,%r286,%r287; .loc 1 110 29289; call (%value_in),_gfortrani_xmallocarray290,[%value_in]; } .loc 1 110 27 st.u64 [%r187],%r290; .loc 1 111 10 setp.eq.u64 %r292,%r119,0; @ ! 
%r292 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r103,%r52,%r103; $L17: .loc 1 101 4 st.u64 [%r151+8],%r293; ld.u64 %r52,[%r148]; add.u64 %r294,%r52,-1; st.u64 [%r151+16],%r294; st.u64 [%r151],%r103; .loc 1 94 7 add.u64 %r151,%r151,24; add.u64 %r148,%r148,8; setp.ne.u64 %r295,%r134,%r151; @ %r295 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r187+48],%r119; mov.u64 %r297,-1; st.u64 [%r187+56],%r297; mov.u64 %r298,1; st.u64 [%r187+40],%r298; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r62,[%r187+28]; .loc 1 121 10 setp.eq.u64 %r299,%r62,%r113; @ %r299 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r113; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r302,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r303,[%r302+36]; setp.eq.u32 %r304,%r303,0; @ %r304 bra $L21; .loc 1 128 2 add.u64 %r357,%frame,240; cvta.const.u64 %r308,$LC2r357307308; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r310,%r113,0; @ %r310 bra $L24; $L28: .loc 1 68 9 add.u64 %r311,%r31,1; .loc 1 68 7 sub.u64 %r115,%r311,%r33; max.s64 %r46,%r115,0; .loc 1 140 8 ld.u64 %r129,[%r188]; .loc 1 141 8 ld.u64 %r130,[%r187]; .loc 1 159 36 shl.b64 %r73,%r117,2; .loc 1 186 22 ld.u64 %r177,[%frame+120]; .loc 1 186 12 shl.b64 %r180,%r177,2; .loc 1 187 22 ld.u64 %r181,[%frame]; .loc 1 187 12 shl.b64 %r184,%r181,2; .loc 1 185 12 ld.u64 %r186,[%frame+360]; setp.gt.s64 %r359,%r115,0; setp.le.s64 %r360,%r113,1; cvt.u32.u32 %r362,%r22; cvt.s64.s8 %r363,%r362; add.u64 %r364,%r363,-1; add.u64 %r365,%frame,120; add.u64 %r366,%frame,240; bra $L25; $L24: add.u64 %r168,%frame,360; add.u64 %r167,%r187,40; mov.u64 %r165,%frame; cvt.u32.u32 %r313,%r22; cvt.s64.s8 %r312,%r313; add.u64 %r154,%r312,-1; .loc 1 132 3 mov.u64 %r131,0; add.u64 %r357,%frame,240; .loc 1 134 16 mov.u64 %r314,%r131; $L27: st.u64 [%r168],%r314; .loc 1 135 18 ld.u64 %r315,[%r167]; st.u64 [%r165],%r315; .loc 1 136 17 shl.b64 %r317,%r131,3; add.u64 %r318,%r357,%r317; .loc 1 136 10 ld.u64 %r319,[%r318]; setp.le.s64 %r320,%r319,0; @ %r320 bra $L1; .loc 1 132 26 add.u64 %r131,%r131,1; .loc 1 132 3 add.u64 %r168,%r168,8; add.u64 %r167,%r167,24; add.u64 %r165,%r165,8; setp.ne.u64 %r321,%r131,%r154; @ %r321 bra $L27; bra $L28; $L25: .loc 1 154 5 @ %r359 bra $L41; .loc 1 155 10 mov.f32 %r323,0f00000000; st.f32 [%r130],%r323; bra $L30; $L41: mov.u64 %r125,%r129; .loc 1 153 8 mov.f32 %r105,0f3f800000; .loc 1 152 9 mov.f32 %r104,0f00000000; .loc 1 159 13 mov.u64 %r124,0; $L29: .loc 1 163 8 ld.f32 %r69,[%r125]; .loc 1 163 7 setp.eq.f32 %r324,%r69,0f00000000; @ %r324 bra $L31; .loc 1 166 13 abs.f32 %r121,%r69; .loc 1 167 11 setp.gt.f32 %r325,%r121,%r105; @ ! %r325 bra $L56; .loc 1 169 9 div.rn.f32 %r123,%r105,%r121; .loc 1 170 25 mul.f32 %r71,%r123,%r104; .loc 1 170 12 fma.rn.f32 %r104,%r71,%r123,0f3f800000; .loc 1 171 11 mov.f32 %r105,%r121; bra $L31; $L56: .loc 1 175 9 div.rn.f32 %r122,%r69,%r105; .loc 1 176 12 fma.rn.f32 %r104,%r122,%r122,%r104; $L31: .loc 1 159 28 add.u64 %r124,%r124,1; .loc 1 159 36 add.u64 %r125,%r125,%r73; .loc 1 159 6 setp.gt.s64 %r326,%r46,%r124; @ %r326 bra $L29; setp.geu.f32 %r327,%r104,0f00000000; @ ! 
%r327 bra $L57; .loc 1 180 23 sqrt.rn.f32 %r74,%r104; bra $L36; $L57: { .param .f32 %value_in; .param .f32 %out_arg1; st.param.f32 [%out_arg1],%r104; call (%value_in),sqrtf,(%out_arg1); ld.param.f32 %r329,[%value_in]; } mov.f32 %r74,%r329; $L36: .loc 1 180 13 mul.f32 %r330,%r105,%r74; .loc 1 181 12 st.f32 [%r130],%r330; $L30: .loc 1 185 15 add.u64 %r186,%r186,1; .loc 1 186 12 add.u64 %r129,%r129,%r180; .loc 1 187 12 add.u64 %r130,%r130,%r184; .loc 1 189 32 ld.u64 %r141,[%frame+240]; .loc 1 189 13 setp.ne.u64 %r331,%r186,%r141; @ %r331 bra $L25; .loc 1 196 23 mul.lo.u64 %r332,%r186,%r177; .loc 1 196 9 shl.b64 %r333,%r332,2; sub.u64 %r126,%r129,%r333; .loc 1 197 23 mul.lo.u64 %r334,%r186,%r181; .loc 1 197 9 shl.b64 %r335,%r334,2; sub.u64 %r127,%r130,%r335; .loc 1 199 7 @ %r360 bra $L1; add.u64 %r63,%frame,368; mov.u64 %r159,8; .loc 1 198 5 mov.u64 %r128,1; .loc 1 193 13 mov.u64 %r361,0; bra $L38; $L39: st.u64 [%r63],%r361; .loc 1 196 23 mul.lo.u64 %r341,%r90,%r89; .loc 1 196 9 shl.b64 %r342,%r341,2; sub.u64 %r126,%r129,%r342; .loc 1 197 23 mul.lo.u64 %r343,%r93,%r89; .loc 1 197 9 shl.b64 %r344,%r343,2; sub.u64 %r127,%r130,%r344; .loc 1 198 5 add.u64 %r128,%r128,1; .loc 1 199 7 add.u64 %r63,%r63,8; add.u64 %r159,%r159,8; setp.eq.u64 %r345,%r128,%r364; @ %r345 bra $L1; $L38: .loc 1 207 16 ld.u64 %r346,[%r63]; add.u64 %r89,%r346,1; st.u64 [%r63],%r89; .loc 1 208 23 add.u64 %r348,%r365,%r159; ld.u64 %r90,[%r348]; .loc 1 208 13 shl.b64 %r349,%r90,2; add.u64 %r129,%r126,%r349; .loc 1 209 23 add.u64 %r350,%frame,%r159; ld.u64 %r93,[%r350]; .loc 1 209 13 shl.b64 %r351,%r93,2; add.u64 %r130,%r127,%r351; .loc 1 189 32 add.u64 %r353,%r366,%r159; ld.u64 %r96,[%r353]; .loc 1 189 13 setp.eq.u64 %r354,%r89,%r96; @ %r354 bra $L39; mov.u64 %r186,0; bra $L25; $L55: .loc 1 90 6 ld.u64 %r355,[%r187]; setp.eq.u64 %r356,%r355,0; @ ! 
%r356 bra $L11; bra $L40; $L1: .loc 1 213 1 ret; } norm2_r8.o/_gfortran_norm2_r8 .visible .func _gfortran_norm2_r8fortran/generated/norm2_r8.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_retursqrt .extern .func (.param .f64 %value_out) sqrt,105,110,99,111,114,114,101,99,116,32,105,110,32,78,79,82,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,78,79,82,77,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,105] = {78,79,82,77114,101,116,117,114,110,32,118,97,108,11_gfortran_norm2_r8 .visible .func _gfortran_norm2_r8480101; .reg .u64pred %r225; .reg .predpred %r236; .reg .pred %r238; .reg .u64u64pred %r263; .reg .u64 %r264; .reg .u32 %r265; .reg .u64 %r267; .reg .u64 %r268; .reg .u64 %r269; .reg .u64 %r270; .reg .u64 %r271; .reg .u16 %r273; .reg .u16u32u64 %r317; .reg .u64predpred %r326; .reg .pred %r327; .reg .f64 %r329; .reg .f64 %r330; .reg .pred %r331; .reg .u64 %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64pred %r345; .reg .u64mov.u64 %r187,%ar0; mov.u64 %r188,%ar1; mov.u64 %r189,%ar2; .loc 1 58 10 ld.s8 %r22,[%r188+28]; .loc 1 58 38 add.u32 %r190,%r22,-1; .loc 1 58 8 cvt.s64.s32 %r113,%r190; .loc 1 59 10 ld.u64 %r25,[%r189]; .loc 1 59 7 add.u64 %r114,%r25,-1; .loc 1 61 7 shr.u64 %r192,%r114,63; set.u32.lt.s64 %r194,%r113,%r114; neg.s32 %r195,%r194; cvt.u16.u64 %r198,%r192; cvt.u16.u32 %r199,%r195; or.b16 %r197,%r198,%r199; .loc 1 61 6 cvt.u32.u16 %r200,%r197; cvt.u16.u8 %r201,%r200; setp.eq.u16 %r202,%r201,0; @ %r202 bra $L2; .loc 1 63 7 cvt.u32.u32 %r205,%r22; cvt.s64.s8 %r204,%r205; st.u64 [%stack+8],%r20403203_gfortran_runtime_errorr208,%r114,%r114; add.u64 %r209,%r208,%r114; shl.b64 %r210,%r209,3; add.u64 %r211,%r188,%r210; ld.u64 %r31,[%r211+56]; ld.u64 %r33,[%r211+48]; .loc 1 71 9 ld.u64 %r117,[%r211+40]; .loc 1 73 3 setp.ne.u64 %r225,%r114,0; @ %r225 bra $L3; $L9: .loc 1 81 3 setp.gt.s64 %r226,%r113,%r114; @ %r226 bra $L4; bra $L54; $L3: add.u64 %r112,%r188,40; add.u64 %r133,%frame,120; add.u64 %r185,%frame,240; add.u64 %r227,%r188,16; add.u64 %r229,%r25,%r25; add.u64 %r230,%r229,%r25; shl.b64 %r231,%r230,3; add.u64 %r75,%r227,%r231; .loc 1 79 12 mov.u64 %r368,0; $L8: .loc 1 75 18 ld.u64 %r232,[%r112]; st.u64 [%r133],%r232; .loc 1 76 19 ld.u64 %r234,[%r112+16]; add.u64 %r233,%r234,1; ld.u64 %r235,[%r112+8]; sub.u64 %r38,%r233,%r235; .loc 1 78 10 setp.lt.s64 %r236,%r38,0; @ %r236 bra $L6; .loc 1 76 17 st.u64 [%r185],%r38; bra $L7; $L6: .loc 1 79 12 st.u64 [%r185],%r368; $L7: .loc 1 73 3 add.u64 %r112,%r112,24; add.u64 %r133,%r133,8; add.u64 %r185,%r185,8; setp.ne.u64 %r238,%r75,%r112; @ %r238 bra $L8; bra $L9; $L54: .loc 1 90 6 ld.u64 %r239,[%r187]; setp.eq.u64 %r240,%r239,0; @ ! 
%r240 bra $L11; bra $L10; $L4: add.u64 %r242,%r25,%r25; add.u64 %r243,%r242,%r25; shl.b64 %r244,%r243,3; add.u64 %r245,%r244,40; add.u64 %r111,%r188,%r245; shl.b64 %r246,%r25,3; add.u64 %r101,%r246,-8; add.u64 %r358,%frame,120; add.u64 %r106,%r358,%r101; add.u64 %r357,%frame,240; add.u64 %r99,%r357,%r101; cvt.u32.u32 %r250,%r22; cvt.s64.s8 %r249,%r250; add.u64 %r252,%r249,%r249; add.u64 %r253,%r252,%r249; shl.b64 %r254,%r253,3; add.u64 %r255,%r188,40; add.u64 %r48,%r254,%r255; .loc 1 87 12 mov.u64 %r367,0; $L14: .loc 1 83 18 ld.u64 %r256,[%r111]; st.u64 [%r106],%r256; .loc 1 84 19 ld.u64 %r258,[%r111+16]; add.u64 %r257,%r258,1; ld.u64 %r259,[%r111+8]; sub.u64 %r45,%r257,%r259; .loc 1 86 10 setp.lt.s64 %r260,%r45,0; @ %r260 bra $L12; .loc 1 84 17 st.u64 [%r99],%r45; bra $L13; $L12: .loc 1 87 12 st.u64 [%r99],%r367; $L13: .loc 1 81 3 add.u64 %r111,%r111,24; add.u64 %r106,%r106,8; add.u64 %r99,%r99,8; setp.ne.u64 %r262,%r48,%r111; @ %r262 bra $L14; bra $L55; $L10: .loc 1 94 7 setp.le.s64 %r263,%r113,0; @ %r263 bra $L16; add.u64 %r357,%frame,240; $L40: add.u64 %r151,%r187,40; mov.u64 %r148,%r357; cvt.u32.u32 %r265,%r22; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r270,%r187,16; add.u64 %r134,%r269,%r270; .loc 1 97 10 mov.u64 %r103,1; .loc 1 101 4 mov.u64 %r293,0; bra $L17; $L16: .loc 1 105 24 mov.u64 %r271,0; st.u64 [%r187+8],%r271; .loc 1 106 28 cvt.u16.u32 %r274,%r22; add.u16 %r273,%r274,-1; cvt.u32.u16 %r275,%r273; st.u8 [%r187+28],%r275; .loc 1 108 20 add.u32 %r276,%r22,-2; cvt.s64.s32 %r56,%r276; add.u64 %r278,%r56,%r56; add.u64 %r279,%r278,%r56; shl.b64 %r280,%r279,3; add.u64 %r281,%r187,%r280; .loc 1 108 67 shl.b64 %r283,%r56,3; add.u64 %r284,%frame,%r283; .loc 1 108 59 ld.u64 %r286,[%r281+40]; ld.u64 %r287,[%r284+240]; mul.lo.u64 %r119,%r286,%r287; .loc 1 110 29 mov.u64 %r289289; call (%value_in),_gfortrani_xmallocarray290,[%value_in]; } .loc 1 110 27 st.u64 [%r187],%r290; .loc 1 111 10 setp.eq.u64 %r292,%r119,0; @ ! 
%r292 bra $L21; bra $L18; $L20: .loc 1 99 48 mul.lo.u64 %r103,%r52,%r103; $L17: .loc 1 101 4 st.u64 [%r151+8],%r293; ld.u64 %r52,[%r148]; add.u64 %r294,%r52,-1; st.u64 [%r151+16],%r294; st.u64 [%r151],%r103; .loc 1 94 7 add.u64 %r151,%r151,24; add.u64 %r148,%r148,8; setp.ne.u64 %r295,%r134,%r151; @ %r295 bra $L20; bra $L16; $L18: .loc 1 114 4 st.u64 [%r187+48],%r119; mov.u64 %r297,-1; st.u64 [%r187+56],%r297; mov.u64 %r298,1; st.u64 [%r187+40],%r298; .loc 1 115 4 bra $L1; $L11: .loc 1 121 19 ld.s8 %r62,[%r187+28]; .loc 1 121 10 setp.eq.u64 %r299,%r62,%r113; @ %r299 bra $L23; .loc 1 122 2 st.u64 [%stack+8],%r113; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 127 11 cvta.global.u64 %r302,_gfortrani_compile_options; .loc 1 127 10 ld.u32 %r303,[%r302+36]; setp.eq.u32 %r304,%r303,0; @ %r304 bra $L21; .loc 1 128 2 add.u64 %r357,%frame,240; cvta.const.u64 %r308,$LC2r357307308; call _gfortrani_bounds_ifunction_return$L21: .loc 1 132 3 setp.gt.s64 %r310,%r113,0; @ %r310 bra $L24; $L28: .loc 1 68 9 add.u64 %r311,%r31,1; .loc 1 68 7 sub.u64 %r115,%r311,%r33; max.s64 %r46,%r115,0; .loc 1 140 8 ld.u64 %r129,[%r188]; .loc 1 141 8 ld.u64 %r130,[%r187]; .loc 1 159 36 shl.b64 %r73,%r117,3; .loc 1 186 22 ld.u64 %r177,[%frame+120]; .loc 1 186 12 shl.b64 %r180,%r177,3; .loc 1 187 22 ld.u64 %r181,[%frame]; .loc 1 187 12 shl.b64 %r184,%r181,3; .loc 1 185 12 ld.u64 %r186,[%frame+360]; setp.gt.s64 %r359,%r115,0; setp.le.s64 %r360,%r113,1; cvt.u32.u32 %r362,%r22; cvt.s64.s8 %r363,%r362; add.u64 %r364,%r363,-1; add.u64 %r365,%frame,120; add.u64 %r366,%frame,240; bra $L25; $L24: add.u64 %r168,%frame,360; add.u64 %r167,%r187,40; mov.u64 %r165,%frame; cvt.u32.u32 %r313,%r22; cvt.s64.s8 %r312,%r313; add.u64 %r154,%r312,-1; .loc 1 132 3 mov.u64 %r131,0; add.u64 %r357,%frame,240; .loc 1 134 16 mov.u64 %r314,%r131; $L27: st.u64 [%r168],%r314; .loc 1 135 18 ld.u64 %r315,[%r167]; st.u64 [%r165],%r315; .loc 1 136 17 shl.b64 %r317,%r131,3; add.u64 %r318,%r357,%r317; .loc 1 136 10 ld.u64 %r319,[%r318]; setp.le.s64 %r320,%r319,0; @ %r320 bra $L1; .loc 1 132 26 add.u64 %r131,%r131,1; .loc 1 132 3 add.u64 %r168,%r168,8; add.u64 %r167,%r167,24; add.u64 %r165,%r165,8; setp.ne.u64 %r321,%r131,%r154; @ %r321 bra $L27; bra $L28; $L25: .loc 1 154 5 @ %r359 bra $L41; .loc 1 155 10 mov.f64 %r323,0d0000000000000000; st.f64 [%r130],%r323; bra $L30; $L41: mov.u64 %r125,%r129; .loc 1 153 8 mov.f64 %r105,0d3ff0000000000000; .loc 1 152 9 mov.f64 %r104,0d0000000000000000; .loc 1 159 13 mov.u64 %r124,0; $L29: .loc 1 163 8 ld.f64 %r69,[%r125]; .loc 1 163 7 setp.eq.f64 %r324,%r69,0d0000000000000000; @ %r324 bra $L31; .loc 1 166 13 abs.f64 %r121,%r69; .loc 1 167 11 setp.gt.f64 %r325,%r121,%r105; @ ! %r325 bra $L56; .loc 1 169 9 div.rn.f64 %r123,%r105,%r121; .loc 1 170 25 mul.f64 %r71,%r123,%r104; .loc 1 170 12 fma.rn.f64 %r104,%r71,%r123,0d3ff0000000000000; .loc 1 171 11 mov.f64 %r105,%r121; bra $L31; $L56: .loc 1 175 9 div.rn.f64 %r122,%r69,%r105; .loc 1 176 12 fma.rn.f64 %r104,%r122,%r122,%r104; $L31: .loc 1 159 28 add.u64 %r124,%r124,1; .loc 1 159 36 add.u64 %r125,%r125,%r73; .loc 1 159 6 setp.gt.s64 %r326,%r46,%r124; @ %r326 bra $L29; setp.geu.f64 %r327,%r104,0d0000000000000000; @ ! 
%r327 bra $L57; .loc 1 180 23 sqrt.rn.f64 %r74,%r104; bra $L36; $L57: { .param .f64 %value_in; .param .f64 %out_arg1; st.param.f64 [%out_arg1],%r104; call (%value_in),sqrt,(%out_arg1); ld.param.f64 %r329,[%value_in]; } mov.f64 %r74,%r329; $L36: .loc 1 180 13 mul.f64 %r330,%r105,%r74; .loc 1 181 12 st.f64 [%r130],%r330; $L30: .loc 1 185 15 add.u64 %r186,%r186,1; .loc 1 186 12 add.u64 %r129,%r129,%r180; .loc 1 187 12 add.u64 %r130,%r130,%r184; .loc 1 189 32 ld.u64 %r141,[%frame+240]; .loc 1 189 13 setp.ne.u64 %r331,%r186,%r141; @ %r331 bra $L25; .loc 1 196 23 mul.lo.u64 %r332,%r186,%r177; .loc 1 196 9 shl.b64 %r333,%r332,3; sub.u64 %r126,%r129,%r333; .loc 1 197 23 mul.lo.u64 %r334,%r186,%r181; .loc 1 197 9 shl.b64 %r335,%r334,3; sub.u64 %r127,%r130,%r335; .loc 1 199 7 @ %r360 bra $L1; add.u64 %r63,%frame,368; mov.u64 %r159,8; .loc 1 198 5 mov.u64 %r128,1; .loc 1 193 13 mov.u64 %r361,0; bra $L38; $L39: st.u64 [%r63],%r361; .loc 1 196 23 mul.lo.u64 %r341,%r90,%r89; .loc 1 196 9 shl.b64 %r342,%r341,3; sub.u64 %r126,%r129,%r342; .loc 1 197 23 mul.lo.u64 %r343,%r93,%r89; .loc 1 197 9 shl.b64 %r344,%r343,3; sub.u64 %r127,%r130,%r344; .loc 1 198 5 add.u64 %r128,%r128,1; .loc 1 199 7 add.u64 %r63,%r63,8; add.u64 %r159,%r159,8; setp.eq.u64 %r345,%r128,%r364; @ %r345 bra $L1; $L38: .loc 1 207 16 ld.u64 %r346,[%r63]; add.u64 %r89,%r346,1; st.u64 [%r63],%r89; .loc 1 208 23 add.u64 %r348,%r365,%r159; ld.u64 %r90,[%r348]; .loc 1 208 13 shl.b64 %r349,%r90,3; add.u64 %r129,%r126,%r349; .loc 1 209 23 add.u64 %r350,%frame,%r159; ld.u64 %r93,[%r350]; .loc 1 209 13 shl.b64 %r351,%r93,3; add.u64 %r130,%r127,%r351; .loc 1 189 32 add.u64 %r353,%r366,%r159; ld.u64 %r96,[%r353]; .loc 1 189 13 setp.eq.u64 %r354,%r89,%r96; @ %r354 bra $L39; mov.u64 %r186,0; bra $L25; $L55: .loc 1 90 6 ld.u64 %r355,[%r187]; setp.eq.u64 %r356,%r355,0; @ ! 
%r356 bra $L11; bra $L40; $L1: .loc 1 213 1 ret; } norm2_r10.o/ parity_l1.o/_gfortran_parity_l1 .visible .func _gfortran_parity_l1fortran/generated/parity_l1.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_return,105,110,99,111,114,114,101,99,116,32,105,110,32,80,65,82,73,84,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,65,82,73,84,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {80,65,82,73,84,89114,101,116,117,114,110,32,118,97,108,11_gfortran_parity_l1 .visible .func _gfortran_parity_l14806u64 %r212; .reg .u64pred %r245; .reg .predu64predpredu64 %r332; .reg .u64pred %r340; .reg .pred %r341; .reg .u32 %r343; .reg .u64u64 %r350; .reg .u64 %r351; mov.u64 %r172,%ar0; mov.u64 %r173,%ar1; mov.u64 %r174,%ar2; .loc 1 55 10 ld.s8 %r22,[%r173+28]; .loc 1 55 38 add.u32 %r175,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r103,%r175; .loc 1 56 10 ld.u64 %r25,[%r174]; .loc 1 56 7 add.u64 %r104,%r25,-1; .loc 1 58 7 shr.u64 %r177,%r104,63; set.u32.lt.s64 %r179,%r103,%r104; neg.s32 %r180,%r179; cvt.u16.u64 %r183,%r177; cvt.u16.u32 %r184,%r180; or.b16 %r182,%r183,%r184; .loc 1 58 6 cvt.u32.u16 %r185,%r182; cvt.u16.u8 %r186,%r185; setp.eq.u16 %r187,%r186,0; @ %r187 bra $L2; .loc 1 60 7 cvt.u32.u32 %r190,%r22; cvt.s64.s8 %r189,%r190; st.u64 [%stack+8],%r189_gfortran_runtime_errorr193,%r104,%r104; add.u64 %r194,%r193,%r10173,%r195; ld.u64 %r31,[%r196+56]; ld.u64 %r33,[%r196+48]; .loc 1 68 9 ld.u64 %r107,[%r196+40]; .loc 1 70 3 setp.ne.u64 %r210,%r104,0; @ %r210 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r211,%r103,%r104; @ %r211 bra $L4; bra $L46; $L3: add.u64 %r123,%r173,40; add.u64 %r169,%frame,120; add.u64 %r168,%frame,240; add.u64 %r212,%r173,16; add.u64 %r214,%r25,%r25; add.u64 %r215,%r214,%r25; shl.b64 %r216,%r215,3; add.u64 %r131,%r212,%r216; .loc 1 76 12 mov.u64 %r351,0; $L8: .loc 1 72 18 ld.u64 %r217,[%r123]; st.u64 [%r169],%r217; .loc 1 73 19 ld.u64 %r219,[%r123+16]; add.u64 %r218,%r219,1; ld.u64 %r220,[%r123+8]; sub.u64 %r38,%r218,%r220; .loc 1 75 10 setp.lt.s64 %r221,%r38,0; @ %r221 bra $L6; .loc 1 73 17 st.u64 [%r168],%r38; bra $L7; $L6: .loc 1 76 12 st.u64 [%r168],%r351; $L7: .loc 1 70 3 add.u64 %r123,%r123,24; add.u64 %r169,%r169,8; add.u64 %r168,%r168,8; setp.ne.u64 %r223,%r131,%r123; @ %r223 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 %r224,[%r172]; setp.eq.u64 %r225,%r224,0; @ ! 
%r225 bra $L11; bra $L10; $L4: add.u64 %r227,%r25,%r25; add.u64 %r228,%r227,%r25; shl.b64 %r229,%r228,3; add.u64 %r230,%r229,40; add.u64 %r101,%r173,%r230; shl.b64 %r231,%r25,3; add.u64 %r93,%r231,-8; add.u64 %r338,%frame,120; add.u64 %r95,%r338,%r93; add.u64 %r339,%frame,240; add.u64 %r90,%r339,%r93; cvt.u32.u32 %r235,%r22; cvt.s64.s8 %r234,%r235; add.u64 %r237,%r234,%r234; add.u64 %r238,%r237,%r234; shl.b64 %r239,%r238,3; add.u64 %r240,%r173,40; add.u64 %r120,%r239,%r240; .loc 1 84 12 mov.u64 %r350,0; $L14: .loc 1 80 18 ld.u64 %r241,[%r101]; st.u64 [%r95],%r241; .loc 1 81 19 ld.u64 %r243,[%r101+16]; add.u64 %r242,%r243,1; ld.u64 %r244,[%r101+8]; sub.u64 %r45,%r242,%r244; .loc 1 83 10 setp.lt.s64 %r245,%r45,0; @ %r245 bra $L12; .loc 1 81 17 st.u64 [%r90],%r45; bra $L13; $L12: .loc 1 84 12 st.u64 [%r90],%r350; $L13: .loc 1 78 3 add.u64 %r101,%r101,24; add.u64 %r95,%r95,8; add.u64 %r90,%r90,8; setp.ne.u64 %r247,%r101,%r120; @ %r247 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r248,%r103,0; @ %r248 bra $L16; add.u64 %r339,%frame,240; $L35: add.u64 %r147,%r172,40; mov.u64 %r135,%r339; cvt.u32.u32 %r250,%r22; cvt.s64.s8 %r249,%r250; add.u64 %r252,%r249,%r249; add.u64 %r253,%r252,%r249; shl.b64 %r254,%r253,3; add.u64 %r255,%r172,16; add.u64 %r113,%r254,%r255; .loc 1 94 10 mov.u64 %r96,1; .loc 1 98 4 mov.u64 %r278,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r256,0; st.u64 [%r172+8],%r256; .loc 1 103 28 cvt.u16.u32 %r259,%r22; add.u16 %r258,%r259,-1; cvt.u32.u16 %r260,%r258; st.u8 [%r172+28],%r260; .loc 1 105 20 add.u32 %r261,%r22,-2; cvt.s64.s32 %r56,%r261; add.u64 %r263,%r56,%r56; add.u64 %r264,%r263,%r56; shl.b64 %r265,%r264,3; add.u64 %r266,%r172,%r265; .loc 1 105 67 shl.b64 %r268,%r56,3; add.u64 %r269,%frame,%r268; .loc 1 105 59 ld.u64 %r271,[%r266+40]; ld.u64 %r272,[%r269+240]; mul.lo.u64 %r109,%r271,%r272; .loc 1 107 29 mov.u64 %r274274; call (%value_in),_gfortrani_xmallocarray275,[%value_in]; } .loc 1 107 27 st.u64 [%r172],%r275; .loc 1 108 10 setp.eq.u64 %r277,%r109,0; @ ! 
%r277 bra $L21; bra $L18; $L20: .loc 1 96 48 mul.lo.u64 %r96,%r52,%r96; $L17: .loc 1 98 4 st.u64 [%r147+8],%r278; ld.u64 %r52,[%r135]; add.u64 %r279,%r52,-1; st.u64 [%r147+16],%r279; st.u64 [%r147],%r96; .loc 1 91 7 add.u64 %r147,%r147,24; add.u64 %r135,%r135,8; setp.ne.u64 %r280,%r113,%r147; @ %r280 bra $L20; bra $L16; $L18: .loc 1 111 4 st.u64 [%r172+48],%r109; mov.u64 %r282,-1; st.u64 [%r172+56],%r282; st.u64 [%r172+40],%r274; .loc 1 112 4 bra $L1; $L11: .loc 1 118 19 ld.s8 %r62,[%r172+28]; .loc 1 118 10 setp.eq.u64 %r284,%r62,%r103; @ %r284 bra $L23; .loc 1 119 2 st.u64 [%stack+8],%r103; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 124 11 cvta.global.u64 %r287,_gfortrani_compile_options; .loc 1 124 10 ld.u32 %r288,[%r287+36]; setp.eq.u32 %r289,%r288,0; @ %r289 bra $L21; .loc 1 125 2 add.u64 %r339,%frame,240; cvta.const.u64 %r293,$LC2r329293; call _gfortrani_bounds_ifunction_return$L21: .loc 1 129 3 setp.gt.s64 %r295,%r103,0; @ %r295 bra $L24; $L27: .loc 1 65 9 add.u64 %r296,%r31,1; .loc 1 65 7 sub.u64 %r105,%r296,%r33; max.s64 %r46,%r105,0; .loc 1 137 8 ld.u64 %r117,[%r173]; .loc 1 138 8 ld.u64 %r118,[%r172]; ld.u64 %r48,[%frame+360]; .loc 1 166 22 ld.u64 %r71,[%frame+120]; .loc 1 167 22 ld.u64 %r73,[%frame]; setp.gt.s64 %r340,%r105,0; setp.le.s64 %r341,%r103,1; cvt.u32.u32 %r343,%r22; cvt.s64.s8 %r344,%r343; add.u64 %r345,%r344,-1; add.u64 %r346,%frame,120; add.u64 %r347,%frame,240; .loc 1 173 13 mov.u64 %r348,0; bra $L25; $L24: add.u64 %r160,%frame,360; add.u64 %r159,%r172,40; mov.u64 %r157,%frame; cvt.u32.u32 %r298,%r22; cvt.s64.s8 %r297,%r298; add.u64 %r148,%r297,-1; .loc 1 129 3 mov.u64 %r119,0; add.u64 %r339,%frame,240; .loc 1 131 16 mov.u64 %r299,%r119; $L26: st.u64 [%r160],%r299; .loc 1 132 18 ld.u64 %r300,[%r159]; st.u64 [%r157],%r300; .loc 1 133 17 shl.b64 %r302,%r119,3; add.u64 %r303,%r339,%r302; .loc 1 133 10 ld.u64 %r304,[%r303]; setp.le.s64 %r305,%r304,0; @ %r305 bra $L1; .loc 1 129 26 add.u64 %r119,%r119,1; .loc 1 129 3 add.u64 %r160,%r160,8; add.u64 %r159,%r159,24; add.u64 %r157,%r157,8; setp.ne.u64 %r306,%r119,%r148; @ %r306 bra $L26; bra $L27; $L25: .loc 1 149 5 @ %r340 bra $L28; .loc 1 150 10 mov.u32 %r308,0; st.u8 [%r118],%r308; bra $L29; $L28: .loc 1 154 36 mov.u64 %r112,%r117; .loc 1 148 10 mov.u32 %r110,0; .loc 1 154 13 mov.u64 %r111,0; $L30: .loc 1 158 19 ld.s8 %r311,[%r112]; cvt.u16.u32 %r310,%r311; cvt.u16.u32 %r312,%r110; set.u32.ne.u16 %r313,%r310,%r312; neg.s32 %r314,%r313; cvt.u32.u32 %r309,%r314; .loc 1 158 10 cvt.s32.s8 %r110,%r309; .loc 1 154 28 add.u64 %r111,%r111,1; .loc 1 154 36 add.u64 %r112,%r112,%r107; .loc 1 154 6 setp.gt.s64 %r316,%r46,%r111; @ %r316 bra $L30; .loc 1 161 12 cvt.u32.u32 %r317,%r110; st.u8 [%r118],%r317; $L29: .loc 1 165 15 add.u64 %r48,%r48,1; .loc 1 166 12 add.u64 %r117,%r117,%r71; .loc 1 167 12 add.u64 %r118,%r118,%r73; .loc 1 169 32 ld.u64 %r126,[%frame+240]; .loc 1 169 13 setp.ne.u64 %r318,%r48,%r126; @ %r318 bra $L25; .loc 1 176 23 mul.lo.u64 %r319,%r71,%r48; .loc 1 176 9 sub.u64 %r114,%r117,%r319; .loc 1 177 23 mul.lo.u64 %r320,%r73,%r48; .loc 1 177 9 sub.u64 %r115,%r118,%r320; .loc 1 179 7 @ %r341 bra $L1; add.u64 %r143,%frame,368; mov.u64 %r102,8; .loc 1 178 5 mov.u64 %r116,1; bra $L33; $L34: .loc 1 173 13 st.u64 [%r143],%r348; .loc 1 176 23 mul.lo.u64 %r326,%r85,%r84; .loc 1 176 9 sub.u64 %r114,%r117,%r326; .loc 1 177 23 mul.lo.u64 %r327,%r87,%r84; .loc 1 177 9 sub.u64 %r115,%r118,%r327; .loc 1 178 5 add.u64 %r116,%r116,1; .loc 1 179 7 add.u64 %r143,%r143,8; add.u64 %r102,%r102,8; setp.eq.u64 %r328,%r116,%r345; 
@ %r328 bra $L1; $L33: .loc 1 187 16 ld.u64 %r329,[%r143]; add.u64 %r84,%r329,1; st.u64 [%r143],%r84; .loc 1 188 23 add.u64 %r331,%r346,%r102; ld.u64 %r85,[%r331]; .loc 1 188 13 add.u64 %r117,%r114,%r85; .loc 1 189 23 add.u64 %r332,%frame,%r102; ld.u64 %r87,[%r332]; .loc 1 189 13 add.u64 %r118,%r115,%r87; .loc 1 169 32 add.u64 %r334,%r347,%r102; ld.u64 %r89,[%r334]; .loc 1 169 13 setp.eq.u64 %r335,%r84,%r89; @ %r335 bra $L34; .loc 1 173 13 mov.u64 %r48,0; bra $L25; $L47: .loc 1 87 6 ld.u64 %r336,[%r172]; setp.eq.u64 %r337,%r336,0; @ ! %r337 bra $L11; bra $L35; $L1: .loc 1 193 1 ret; } parity_l2.o/_gfortran_parity_l2 .visible .func _gfortran_parity_l2fortran/generated/parity_l2.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_return,105,110,99,111,114,114,101,99,116,32,105,110,32,80,65,82,73,84,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,10814,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,80,65,82,73,84,89,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,107] = {80,65,82,73,84,89114,101,116,117,114,110,32,118,97,108,11_gfortran_parity_l2 .visible .func _gfortran_parity_l2480pred %r218; .reg .predpred %r231; .reg .u64 %r232; .reg .predu64 %r250; .reg .u64 %r251; .reg .u64 %r252; .reg .pred %r253; .reg .predu64u16u64 %r274; .reg .u64pred %r303; .reg .u64 %r304; .reg .u64u32u64 %r317; .reg .u64pred %r357; .reg .u32 %r359; .reg .u64u64 %r366; .reg .u64 %r367; mov.u64 %r180,%ar0; mov.u64 %r181,%ar1; mov.u64 %r182,%ar2; .loc 1 55 10 ld.s8 %r22,[%r181+28]; .loc 1 55 38 add.u32 %r183,%r22,-1; .loc 1 55 8 cvt.s64.s32 %r105,%r183; .loc 1 56 10 ld.u64 %r25,[%r182]; .loc 1 56 7 add.u64 %r106,%r25,-1; .loc 1 58 7 shr.u64 %r185,%r106,63; set.u32.lt.s64 %r187,%r105,%r106; neg.s32 %r188,%r187; cvt.u16.u64 %r191,%r185; cvt.u16.u32 %r192,%r188; or.b16 %r190,%r191,%r192; .loc 1 58 6 cvt.u32.u16 %r193,%r190; cvt.u16.u8 %r194,%r193; setp.eq.u16 %r195,%r194,0; @ %r195 bra $L2; .loc 1 60 7 cvt.u32.u32 %r198,%r22; cvt.s64.s8 %r197,%r198; st.u64 [%stack+8],%r1971966_gfortran_runtime_errorr201,%r106,%r106; add.u64 %r202,%r201,%r106; shl.b64 %r203,%r202,3; add.u64 %r204,%r181,%r203; ld.u64 %r31,[%r204+56]; ld.u64 %r33,[%r204+48]; .loc 1 68 9 ld.u64 %r109,[%r204+40]; .loc 1 70 3 setp.ne.u64 %r218,%r106,0; @ %r218 bra $L3; $L9: .loc 1 78 3 setp.gt.s64 %r219,%r105,%r106; @ %r219 bra $L4; bra $L46; $L3: add.u64 %r93,%r181,40; add.u64 %r59,%frame,120; add.u64 %r29,%frame,240; add.u64 %r220,%r181,16; add.u64 %r222,%r25,%r25; add.u64 %r223,%r222,%r25; shl.b64 %r224,%r223,3; add.u64 %r177,%r220,%r224; .loc 1 76 12 mov.u64 %r367,0; $L8: .loc 1 72 18 ld.u64 %r225,[%r93]; st.u64 [%r59],%r225; .loc 1 73 19 ld.u64 %r227,[%r93+16]; add.u64 %r226,%r227,1; ld.u64 %r228,[%r93+8]; sub.u64 %r38,%r226,%r228; .loc 1 75 10 setp.lt.s64 %r229,%r38,0; @ %r229 bra $L6; .loc 1 73 17 st.u64 [%r29],%r38; bra $L7; $L6: .loc 1 76 12 st.u64 [%r29],%r367; $L7: .loc 1 70 3 add.u64 %r93,%r93,24; add.u64 %r59,%r59,8; add.u64 %r29,%r29,8; setp.ne.u64 %r231,%r93,%r177; @ %r231 bra $L8; bra $L9; $L46: .loc 1 87 6 ld.u64 
%r232,[%r180]; setp.eq.u64 %r233,%r232,0; @ ! %r233 bra $L11; bra $L10; $L4: add.u64 %r235,%r25,%r25; add.u64 %r236,%r235,%r25; shl.b64 %r237,%r236,3; add.u64 %r238,%r237,40; add.u64 %r146,%r181,%r238; shl.b64 %r239,%r25,3; add.u64 %r128,%r239,-8; add.u64 %r355,%frame,120; add.u64 %r131,%r355,%r128; add.u64 %r354,%frame,240; add.u64 %r126,%r354,%r128; cvt.u32.u32 %r243,%r22; cvt.s64.s8 %r242,%r243; add.u64 %r245,%r242,%r242; add.u64 %r246,%r245,%r242; shl.b64 %r247,%r246,3; add.u64 %r248,%r181,40; add.u64 %r96,%r247,%r248; .loc 1 84 12 mov.u64 %r366,0; $L14: .loc 1 80 18 ld.u64 %r249,[%r146]; st.u64 [%r131],%r249; .loc 1 81 19 ld.u64 %r251,[%r146+16]; add.u64 %r250,%r251,1; ld.u64 %r252,[%r146+8]; sub.u64 %r44,%r250,%r252; .loc 1 83 10 setp.lt.s64 %r253,%r44,0; @ %r253 bra $L12; .loc 1 81 17 st.u64 [%r126],%r44; bra $L13; $L12: .loc 1 84 12 st.u64 [%r126],%r366; $L13: .loc 1 78 3 add.u64 %r146,%r146,24; add.u64 %r131,%r131,8; add.u64 %r126,%r126,8; setp.ne.u64 %r255,%r96,%r146; @ %r255 bra $L14; bra $L47; $L10: .loc 1 91 7 setp.le.s64 %r256,%r105,0; @ %r256 bra $L16; add.u64 %r354,%frame,240; $L34: add.u64 %r155,%r180,40; mov.u64 %r153,%r354; add.u64 %r257,%r180,16; cvt.u32.u32 %r259,%r22; cvt.s64.s8 %r258,%r259; add.u64 %r261,%r258,%r258; add.u64 %r262,%r261,%r258; shl.b64 %r263,%r262,3; add.u64 %r147,%r257,%r263; .loc 1 94 10 mov.u64 %r98,1; .loc 1 98 4 mov.u64 %r286,0; bra $L17; $L16: .loc 1 102 24 mov.u64 %r264,0; st.u64 [%r180+8],%r264; .loc 1 103 28 cvt.u16.u32 %r267,%r22; add.u16 %r266,%r267,-1; cvt.u32.u16 %r268,%r266; st.u8 [%r180+28],%r268; .loc 1 105 20 add.u32 %r269,%r22,-2; cvt.s64.s32 %r52,%r269; add.u64 %r271,%r52,%r52; add.u64 %r272,%r271,%r52; shl.b64 %r273,%r272,3; add.u64 %r274,%r180,%r273; .loc 1 105 67 shl.b64 %r276,%r52,3; add.u64 %r277,%frame,%r276; .loc 1 105 59 ld.u64 %r279,[%r274+40]; ld.u64 %r280,[%r277+240]; mul.lo.u64 %r111,%r279,%r280; .loc 1 107 29 mov.u64 %r28call (%value_in),_gfortrani_xmallocarray283,[%value_in]; } .loc 1 107 27 st.u64 [%r180],%r283; .loc 1 108 10 setp.eq.u64 %r285,%r111,0; @ ! 
[PTX continues: remainder of the preceding _gfortran_parity_l* kernel — the per-dimension reduction loops, the guard that reads _gfortrani_compile_options and conditionally calls _gfortrani_bounds_ifunction_return, the return-array allocation through _gfortrani_xmallocarray, and the final ret. The instruction stream in this region is binary-extracted and only partially recoverable, so it is not reproduced verbatim.]
[Archive member parity_l4.o follows.]
[Archive member parity_l4.o — _gfortran_parity_l4, generated from libgfortran/generated/parity_l4.c.
  Extern declarations: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return.
  Local string constants (stored as ASCII byte lists in the dump) decode to the runtime error messages "...incorrect in PARITY intrinsic: is %ld, should be between 1 and %ld" and "...of return array incorrect in PARITY intrinsic: is %ld, should be %ld", plus the fragments "PARITY" and "return val...".
  .visible .func _gfortran_parity_l4: register declarations, descriptor/argument unpacking, DIM validation, extent and stride setup, return-array allocation via _gfortrani_xmallocarray, the optional bounds check, and the 4-byte-logical reduction loops, ending in ret. The full instruction stream is binary-extracted and too damaged to reproduce verbatim.]
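For orientation only: the parity_l* kernels in this archive are the compiled form of the Fortran PARITY intrinsic reduction — PARITY(mask [, dim]) is true exactly when an odd number of elements along the reduced dimension are true. A minimal C sketch of the scalar reduction performed by the inner loop, over a hypothetical flat array instead of libgfortran's array descriptors (all names here are illustrative, not the library's):

    #include <stdbool.h>
    #include <stddef.h>

    /* Reduce one "column" of n elements spaced `stride` apart.
       The result flips each time a true element is seen, so it
       ends up true exactly when an odd number of elements are true. */
    static bool parity_reduce(const bool *base, size_t n, size_t stride)
    {
        bool result = false;
        for (size_t i = 0; i < n; i++)
            result = (result != base[i * stride]);   /* logical XOR */
        return result;
    }

The generated PTX realizes the same result-!= -value update with set.ne/neg/cvt sequences, one specialization per logical kind (4-, 8-, and 16-byte elements).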
[Archive member parity_l8.o — _gfortran_parity_l8, generated from libgfortran/generated/parity_l8.c. Same extern declarations, error-message constants, and overall structure as _gfortran_parity_l4, specialized for 8-byte logical elements (element strides scaled by shl.b64 ...,3). Body binary-extracted; not reproduced verbatim. Ends in ret.]
[Archive member parity_l16.o — _gfortran_parity_l16, generated from libgfortran/generated/parity_l16.c. The 16-byte-logical specialization: strides scaled by shl.b64 ...,4, and the inner loop compares both 64-bit halves of each element (xor.b64/or.b64 before the set.ne). Ends in ret.]
[Archive member matmul_i1.o follows.]
[Archive member matmul_i1.o — _gfortran_matmul_i1, generated from libgfortran/generated/matmul_i1.c.
  Extern declarations: _gfortrani_compile_options, _gfortran_size0, _gfortrani_xmallocarray, _gfortran_runtime_error.
  String constants decode to the assertion text "GFC_DESCRIPTOR_RANK (a) == 2 || GFC_DESCRIPTOR_RANK (b) == 2", part of the source path (...tran/generated/matmul_i1.c), the runtime error messages "Array bound mismatch for dimension 1 of array (%ld/%ld)", "Array bound mismatch for dimension 2 of array (%ld/%ld)", "Incorrect extent in argument B in MATMUL intrinsic in dimension 1: is %ld, should be %ld", and the assertion "gemm != NULL".
  .visible .func _gfortran_matmul_i1 takes six parameters (three 64-bit descriptor pointers, two 32-bit values, one 64-bit function pointer) and opens with an unusually large block of .reg declarations (virtual registers up to roughly %r2420) before any code; that register list is not reproduced here.]
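For orientation: before entering its own blocked loops, the matmul body that follows tests whether the product may be handed off to an external GEMM routine. In the PTX this shows up as flag-bit tests on the fourth argument, unit-stride checks on the descriptors, and a comparison of (float)m*k*n against the cube of the fifth argument. A hedged C sketch of that decision; m, n, k, contiguous, try_blas, blas_limit are stand-in names for the values the PTX reads from its arguments and descriptors, not the library's identifiers:

    #include <stdbool.h>

    /* Decide whether the product is big enough, and the operands
       contiguous enough, to be worth an external GEMM call. */
    static bool want_blas(long m, long n, long k,
                          bool contiguous, int try_blas, int blas_limit)
    {
        if (!try_blas || !contiguous)
            return false;
        float work  = (float) m * (float) k * (float) n;
        float limit = (float) blas_limit;
        return work > limit * limit * limit;   /* matches the mul.f32 chain */
    }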
[Body of _gfortran_matmul_i1 (truncated at the end of this fragment): it first asserts that at least one operand has rank 2, reporting the GFC_DESCRIPTOR_RANK message otherwise; if the result descriptor has no data pointer it fills in the result bounds and allocates storage via _gfortran_size0 and _gfortrani_xmallocarray; when bounds checking is enabled (read from _gfortrani_compile_options) it cross-checks the operand extents against each other and against the result, raising the "Array bound mismatch" / "Incorrect extent" errors above. It then chooses between an external GEMM call and its own loops: if the flag argument requests it, the strides are unit, and (float)m*k*n exceeds the cube of the limit argument, it asserts gemm != NULL and calls through the function-pointer argument; otherwise it falls into the blocked multiply loop nest, which this fragment cuts off mid-stream.]
%r1062,%r1061,%r1059; .loc 1 2733 11 sub.u64 %r371,%r1757,%r1062; .loc 1 2735 5 @ %r1598 bra $L46; $L55: .loc 1 2757 8 @ ! %r1585 bra $L48; bra $L47; $L46: .loc 1 2751 28 shl.b64 %r135,%r1757,8; .loc 1 2737 17 add.u64 %r603,%r371,%r591; add.u64 %r1066,%r1639,%r585; add.u64 %r320,%r1066,%r598; add.u64 %r112,%r307,%r585; mov.u64 %r77,%r584; mov.u64 %r317,%r585; mov.u64 %r322,0; setp.lt.s64 %r1579,%r591,%r603; setp.gt.s64 %r1586,%r1757,%r371; not.b64 %r1615,%r591; add.u64 %r1616,%r1615,%r603; and.b64 %r1617,%r1616,-2; add.u64 %r1618,%r1617,2; add.u64 %r1619,%r1757,%r591; add.u64 %r1620,%r360,%r1619; bra $L54; $L139: mov.u64 %r322,%r321; $L54: .loc 1 2738 9 @ %r1579 bra $L49; $L53: .loc 1 2749 12 @ %r1586 bra $L50; bra $L51; $L49: add.u64 %r340,%r1007,%r322; add.u64 %r323,%r1618,%r320; .loc 1 2738 9 mov.u64 %r341,%r320; add.u64 %r1687,%r1618,-2; shr.u64 %r1685,%r1687,1; add.u64 %r1688,%r1685,1; and.b64 %r1689,%r1688,3; setp.eq.u64 %r1692,%r1689,0; @ %r1692 bra $L52; setp.eq.u64 %r1691,%r1689,1; @ %r1691 bra $L296; setp.eq.u64 %r1690,%r1689,2; @ %r1690 bra $L297; bra $L358; $L52: .loc 1 2740 49 ld.u8 %r1075,[%r341]; st.u8 [%r340],%r1075; .loc 1 2743 7 add.u64 %r1076,%r112,%r341; add.u64 %r1077,%r1076,%r77; .loc 1 2742 49 ld.u8 %r1078,[%r1077]; st.u8 [%r340+1],%r1078; .loc 1 2744 49 ld.u8 %r1079,[%r341+1]; st.u8 [%r340+256],%r1079; .loc 1 2747 7 sub.u64 %r1080,%r341,%r317; add.u64 %r1081,%r1080,%r112; .loc 1 2746 49 ld.u8 %r1082,[%r1081+1]; st.u8 [%r340+257],%r1082; .loc 1 2738 9 add.u64 %r1693,%r341,2; add.u64 %r1694,%r340,512; .loc 1 2740 49 ld.u8 %r1760,[%r1693]; st.u8 [%r1694],%r1760; .loc 1 2743 7 add.u64 %r1761,%r112,%r1693; add.u64 %r1762,%r1761,%r77; .loc 1 2742 49 ld.u8 %r1763,[%r1762]; st.u8 [%r1694+1],%r1763; .loc 1 2744 49 ld.u8 %r1764,[%r1693+1]; st.u8 [%r1694+256],%r1764; .loc 1 2747 7 sub.u64 %r1765,%r1693,%r317; add.u64 %r1766,%r1765,%r112; .loc 1 2746 49 ld.u8 %r1767,[%r1766+1]; st.u8 [%r1694+257],%r1767; .loc 1 2738 9 add.u64 %r1768,%r341,4; add.u64 %r1769,%r340,1024; .loc 1 2740 49 ld.u8 %r1771,[%r1768]; st.u8 [%r1769],%r1771; .loc 1 2743 7 add.u64 %r1772,%r112,%r1768; add.u64 %r1773,%r1772,%r77; .loc 1 2742 49 ld.u8 %r1774,[%r1773]; st.u8 [%r1769+1],%r1774; .loc 1 2744 49 ld.u8 %r1775,[%r1768+1]; st.u8 [%r1769+256],%r1775; .loc 1 2747 7 sub.u64 %r1776,%r1768,%r317; add.u64 %r1777,%r1776,%r112; .loc 1 2746 49 ld.u8 %r1778,[%r1777+1]; st.u8 [%r1769+257],%r1778; .loc 1 2738 9 add.u64 %r1779,%r341,6; add.u64 %r1780,%r340,1536; .loc 1 2740 49 ld.u8 %r1782,[%r1779]; st.u8 [%r1780],%r1782; .loc 1 2743 7 add.u64 %r1783,%r112,%r1779; add.u64 %r1784,%r1783,%r77; .loc 1 2742 49 ld.u8 %r1785,[%r1784]; st.u8 [%r1780+1],%r1785; .loc 1 2744 49 ld.u8 %r1786,[%r1779+1]; st.u8 [%r1780+256],%r1786; .loc 1 2747 7 sub.u64 %r1787,%r1779,%r317; add.u64 %r1788,%r1787,%r112; .loc 1 2746 49 ld.u8 %r1789,[%r1788+1]; st.u8 [%r1780+257],%r1789; .loc 1 2738 9 add.u64 %r341,%r341,8; add.u64 %r340,%r340,2048; setp.ne.u64 %r1790,%r323,%r341; @ %r1790 bra $L52; bra $L53; $L50: .loc 1 2751 41 add.u64 %r1085,%r135,%r322; add.u64 %r1086,%r1007,%r1085; .loc 1 2752 10 add.u64 %r1088,%r1620,%r317; .loc 1 2754 10 add.u64 %r1092,%r1620,%r112; .loc 1 2751 41 ld.u8 %r1093,[%r1092+-1]; cvt.u32.u16 %r2419,%r1093; mov.u32 %r1095,%r2419; shl.b32 %r1094,%r1095,8; ld.u8 %r1096,[%r1088+-1]; cvt.u16.u32 %r1098,%r1094; or.b16 %r1097,%r1096,%r1098; st.u16 [%r1086+-256],%r1097; $L51: .loc 1 2735 5 add.u64 %r321,%r322,2; add.u64 %r320,%r320,%r319; add.u64 %r317,%r317,%r319; add.u64 %r112,%r112,%r319; add.u64 %r77,%r77,%r1026; 
setp.ne.u64 %r1099,%r1623,%r322; @ %r1099 bra $L139; bra $L55; $L48: .loc 1 2767 25 shr.u64 %r1102,%r1593,62; add.u64 %r1103,%r1757,%r1102; and.b64 %r1104,%r1103,3; sub.u64 %r1105,%r1104,%r1102; .loc 1 2767 11 sub.u64 %r372,%r1757,%r1105; .loc 1 2769 5 @ %r1631 bra $L56; bra $L57; $L47: .loc 1 2759 17 add.u64 %r1791,%r1757,%r591; .loc 1 2760 9 setp.le.s64 %r1107,%r1791,%r591; @ %r1107 bra $L48; add.u64 %r1109,%r1639,%r430; add.u64 %r432,%r1109,%r598; add.u64 %r1110,%r1756,-1; add.u64 %r431,%r1007,%r1110; add.u64 %r1111,%r430,%r360; add.u64 %r342,%r1111,%r1791; sub.u64 %r1678,%r342,%r432; and.b64 %r1679,%r1678,3; setp.eq.u64 %r1682,%r1679,0; @ %r1682 bra $L58; setp.eq.u64 %r1681,%r1679,1; @ %r1681 bra $L298; setp.eq.u64 %r1680,%r1679,2; @ %r1680 bra $L299; bra $L359; $L58: .loc 1 2762 43 ld.u8 %r1112,[%r432]; st.u8 [%r431],%r1112; ld.u8 %r1794,[%r432+1]; st.u8 [%r431+256],%r1794; ld.u8 %r1798,[%r432+2]; st.u8 [%r431+512],%r1798; ld.u8 %r1802,[%r432+3]; st.u8 [%r431+768],%r1802; .loc 1 2760 9 add.u64 %r432,%r432,4; add.u64 %r431,%r431,1024; setp.ne.u64 %r1803,%r342,%r432; @ %r1803 bra $L58; bra $L48; $L57: .loc 1 2871 8 @ ! %r1584 bra $L60; bra $L59; $L56: .loc 1 2876 14 add.u64 %r608,%r372,%r591; .loc 1 2876 9 add.u64 %r609,%r608,-1; sub.u64 %r463,%r54,%r308; add.u64 %r458,%r1578,%r463; add.u64 %r454,%r358,%r56; sub.u64 %r450,%r113,%r626; shl.b64 %r502,%r372,8; mov.u64 %r448,%r113; mov.u64 %r452,%r50; mov.u64 %r457,%r456; mov.u64 %r459,%r54; mov.u64 %r461,%r57; mov.u64 %r387,%r70; setp.le.s64 %r1580,%r591,%r609; setp.gt.s64 %r1587,%r1757,%r372; add.u64 %r1611,%r129,%r369; $L68: .loc 1 2772 9 @ %r1580 bra $L61; $L67: .loc 1 2843 12 @ ! %r1587 bra $L63; bra $L62; $L61: add.u64 %r521,%r1611,%r1756; .loc 1 2904 13 mov.u64 %r389,%r591; mov.u64 %r474,1024; mov.u64 %r475,768; mov.u64 %r476,512; mov.u64 %r477,256; sub.u64 %r1610,%r521,%r457; $L66: add.u64 %r471,%r454,%r389; .loc 1 2774 10 ld.s8 %r390,[%r471]; .loc 1 2775 10 ld.s8 %r391,[%r471+1]; add.u64 %r469,%r452,%r389; .loc 1 2776 10 ld.s8 %r392,[%r469]; .loc 1 2777 10 ld.s8 %r393,[%r469+1]; add.u64 %r467,%r450,%r389; .loc 1 2778 10 ld.s8 %r394,[%r467]; .loc 1 2779 10 ld.s8 %r395,[%r467+1]; add.u64 %r465,%r448,%r389; .loc 1 2780 10 ld.s8 %r396,[%r465]; .loc 1 2781 10 ld.s8 %r397,[%r465+1]; .loc 1 2782 10 ld.s8 %r398,[%r471+2]; .loc 1 2783 10 ld.s8 %r399,[%r471+3]; .loc 1 2784 10 ld.s8 %r400,[%r469+2]; .loc 1 2785 10 ld.s8 %r401,[%r469+3]; .loc 1 2786 10 ld.s8 %r402,[%r467+2]; .loc 1 2787 10 ld.s8 %r403,[%r467+3]; .loc 1 2788 10 ld.s8 %r404,[%r465+2]; .loc 1 2789 10 ld.s8 %r405,[%r465+3]; .loc 1 2791 6 @ %r1626 bra $L64; sub.u64 %r493,%r457,%r1756; add.u64 %r491,%r1007,%r477; $L65: sub.u64 %r488,%r491,%r476; .loc 1 2793 19 add.u64 %r1121,%r488,%r477; .loc 1 2794 11 ld.u8 %r1123,[%r1121]; ld.u8 %r1124,[%r493]; .loc 1 2793 14 cvt.u16.u32 %r1129,%r390; mad.lo.u16 %r1127,%r1123,%r1124,%r1129; cvt.u32.u16 %r1130,%r1127; cvt.s32.s8 %r390,%r1130; .loc 1 2796 11 ld.u8 %r1134,[%r491]; .loc 1 2795 14 cvt.u16.u32 %r1139,%r391; mad.lo.u16 %r1137,%r1124,%r1134,%r1139; cvt.u32.u16 %r1140,%r1137; cvt.s32.s8 %r391,%r1140; add.u64 %r483,%r1610,%r493; .loc 1 2798 14 add.u64 %r1143,%r483,%r463; .loc 1 2798 11 ld.u8 %r1146,[%r1143]; .loc 1 2797 14 cvt.u16.u32 %r1151,%r392; mad.lo.u16 %r1149,%r1123,%r1146,%r1151; cvt.u32.u16 %r1152,%r1149; cvt.s32.s8 %r392,%r1152; .loc 1 2799 14 cvt.u16.u32 %r1161,%r393; mad.lo.u16 %r1159,%r1134,%r1146,%r1161; cvt.u32.u16 %r1162,%r1159; cvt.s32.s8 %r393,%r1162; .loc 1 2802 14 add.u64 %r1164,%r483,%r459; .loc 1 2802 11 ld.u8 
%r1167,[%r1164]; .loc 1 2801 14 cvt.u16.u32 %r1172,%r394; mad.lo.u16 %r1170,%r1123,%r1167,%r1172; cvt.u32.u16 %r1173,%r1170; cvt.s32.s8 %r394,%r1173; .loc 1 2803 14 cvt.u16.u32 %r1182,%r395; mad.lo.u16 %r1180,%r1134,%r1167,%r1182; cvt.u32.u16 %r1183,%r1180; cvt.s32.s8 %r395,%r1183; .loc 1 2806 14 add.u64 %r1185,%r483,%r458; .loc 1 2806 11 ld.u8 %r1188,[%r1185]; .loc 1 2805 14 cvt.u16.u32 %r1193,%r396; mad.lo.u16 %r1191,%r1123,%r1188,%r1193; cvt.u32.u16 %r1194,%r1191; cvt.s32.s8 %r396,%r1194; .loc 1 2807 14 cvt.u16.u32 %r1203,%r397; mad.lo.u16 %r1201,%r1134,%r1188,%r1203; cvt.u32.u16 %r1204,%r1201; cvt.s32.s8 %r397,%r1204; .loc 1 2809 19 add.u64 %r1206,%r488,%r475; .loc 1 2810 11 ld.u8 %r1209,[%r1206]; .loc 1 2809 14 cvt.u16.u32 %r1214,%r398; mad.lo.u16 %r1212,%r1124,%r1209,%r1214; cvt.u32.u16 %r1215,%r1212; cvt.s32.s8 %r398,%r1215; .loc 1 2811 19 add.u64 %r1217,%r488,%r474; .loc 1 2812 11 ld.u8 %r1220,[%r1217]; .loc 1 2811 14 cvt.u16.u32 %r1225,%r399; mad.lo.u16 %r1223,%r1124,%r1220,%r1225; cvt.u32.u16 %r1226,%r1223; cvt.s32.s8 %r399,%r1226; .loc 1 2813 14 cvt.u16.u32 %r1235,%r400; mad.lo.u16 %r1233,%r1146,%r1209,%r1235; cvt.u32.u16 %r1236,%r1233; cvt.s32.s8 %r400,%r1236; .loc 1 2815 14 cvt.u16.u32 %r1245,%r401; mad.lo.u16 %r1243,%r1146,%r1220,%r1245; cvt.u32.u16 %r1246,%r1243; cvt.s32.s8 %r401,%r1246; .loc 1 2817 14 cvt.u16.u32 %r1255,%r402; mad.lo.u16 %r1253,%r1167,%r1209,%r1255; cvt.u32.u16 %r1256,%r1253; cvt.s32.s8 %r402,%r1256; .loc 1 2819 14 cvt.u16.u32 %r1265,%r403; mad.lo.u16 %r1263,%r1167,%r1220,%r1265; cvt.u32.u16 %r1266,%r1263; cvt.s32.s8 %r403,%r1266; .loc 1 2821 14 cvt.u16.u32 %r1275,%r404; mad.lo.u16 %r1273,%r1188,%r1209,%r1275; cvt.u32.u16 %r1276,%r1273; cvt.s32.s8 %r404,%r1276; .loc 1 2823 14 cvt.u16.u32 %r1285,%r405; mad.lo.u16 %r1283,%r1188,%r1220,%r1285; cvt.u32.u16 %r1286,%r1283; cvt.s32.s8 %r405,%r1286; .loc 1 2791 6 add.u64 %r493,%r493,1; add.u64 %r491,%r491,1; setp.ne.u64 %r1288,%r457,%r493; @ %r1288 bra $L65; $L64: .loc 1 2826 24 cvt.u32.u32 %r1289,%r390; st.u8 [%r471],%r1289; .loc 1 2827 28 cvt.u32.u32 %r1290,%r391; st.u8 [%r471+1],%r1290; .loc 1 2828 30 cvt.u32.u32 %r1291,%r392; st.u8 [%r469],%r1291; .loc 1 2829 34 cvt.u32.u32 %r1292,%r393; st.u8 [%r469+1],%r1292; .loc 1 2830 30 cvt.u32.u32 %r1293,%r394; st.u8 [%r467],%r1293; .loc 1 2831 34 cvt.u32.u32 %r1294,%r395; st.u8 [%r467+1],%r1294; .loc 1 2832 30 cvt.u32.u32 %r1295,%r396; st.u8 [%r465],%r1295; .loc 1 2833 34 cvt.u32.u32 %r1296,%r397; st.u8 [%r465+1],%r1296; .loc 1 2834 28 cvt.u32.u32 %r1297,%r398; st.u8 [%r471+2],%r1297; .loc 1 2835 28 cvt.u32.u32 %r1298,%r399; st.u8 [%r471+3],%r1298; .loc 1 2836 34 cvt.u32.u32 %r1299,%r400; st.u8 [%r469+2],%r1299; .loc 1 2837 34 cvt.u32.u32 %r1300,%r401; st.u8 [%r469+3],%r1300; .loc 1 2838 34 cvt.u32.u32 %r1301,%r402; st.u8 [%r467+2],%r1301; .loc 1 2839 34 cvt.u32.u32 %r1302,%r403; st.u8 [%r467+3],%r1302; .loc 1 2840 34 cvt.u32.u32 %r1303,%r404; st.u8 [%r465+2],%r1303; .loc 1 2841 34 cvt.u32.u32 %r1304,%r405; st.u8 [%r465+3],%r1304; .loc 1 2772 33 add.u64 %r389,%r389,4; .loc 1 2772 9 add.u64 %r477,%r477,1024; add.u64 %r476,%r476,1024; add.u64 %r475,%r475,1024; add.u64 %r474,%r474,1024; setp.le.s64 %r1305,%r389,%r609; @ %r1305 bra $L66; bra $L67; $L63: .loc 1 2769 29 add.u64 %r387,%r387,4; .loc 1 2769 5 add.u64 %r463,%r463,%r462; add.u64 %r461,%r461,%r1029; add.u64 %r459,%r459,%r462; add.u64 %r458,%r458,%r462; add.u64 %r457,%r457,%r462; add.u64 %r454,%r454,%r453; add.u64 %r452,%r452,%r453; add.u64 %r450,%r450,%r453; add.u64 %r448,%r448,%r453; setp.le.s64 
%r1306,%r387,%r1628; @ %r1306 bra $L68; bra $L57; $L62: add.u64 %r508,%r454,%r608; add.u64 %r507,%r452,%r608; add.u64 %r505,%r450,%r608; add.u64 %r504,%r448,%r608; .loc 1 2904 13 mov.u64 %r503,%r502; add.u64 %r1804,%r1757,%r591; .loc 1 2846 6 add.u64 %r1366,%r1804,%r454; $L71: .loc 1 2848 14 ld.s8 %r383,[%r508]; .loc 1 2849 14 ld.s8 %r384,[%r507]; .loc 1 2850 14 ld.s8 %r385,[%r505]; .loc 1 2851 14 ld.s8 %r386,[%r504]; .loc 1 2853 10 @ %r1626 bra $L69; add.u64 %r522,%r1007,%r503; sub.u64 %r520,%r457,%r1756; and.b64 %r1672,%r1756,1; setp.eq.u64 %r1673,%r1672,0; @ ! %r1673 bra $L360; $L70: .loc 1 2856 13 ld.u8 %r2420,[%r520]; mov.u16 %r1312,%r2420; ld.u8 %r1313,[%r522]; .loc 1 2855 11 cvt.u16.u32 %r1318,%r383; mad.lo.u16 %r1316,%r1312,%r1313,%r1318; cvt.u32.u16 %r1319,%r1316; cvt.s32.s8 %r1805,%r1319; .loc 1 2858 16 add.u64 %r1321,%r463,%r520; add.u64 %r1322,%r1321,%r461; .loc 1 2858 13 ld.u8 %r2420,[%r1322]; mov.u16 %r1325,%r2420; .loc 1 2857 11 cvt.u16.u32 %r1331,%r384; mad.lo.u16 %r1329,%r1325,%r1313,%r1331; cvt.u32.u16 %r1332,%r1329; cvt.s32.s8 %r1806,%r1332; .loc 1 2860 16 add.u64 %r1334,%r459,%r520; add.u64 %r1335,%r1334,%r461; .loc 1 2860 13 ld.u8 %r2420,[%r1335]; mov.u16 %r1338,%r2420; .loc 1 2859 11 cvt.u16.u32 %r1344,%r385; mad.lo.u16 %r1342,%r1338,%r1313,%r1344; cvt.u32.u16 %r1345,%r1342; cvt.s32.s8 %r1807,%r1345; .loc 1 2862 16 add.u64 %r1347,%r458,%r520; add.u64 %r1348,%r1347,%r461; .loc 1 2862 13 ld.u8 %r2420,[%r1348]; mov.u16 %r1351,%r2420; .loc 1 2861 11 cvt.u16.u32 %r1357,%r386; mad.lo.u16 %r1355,%r1351,%r1313,%r1357; cvt.u32.u16 %r1358,%r1355; cvt.s32.s8 %r1808,%r1358; .loc 1 2853 10 add.u64 %r1675,%r520,1; .loc 1 2856 13 ld.u8 %r2420,[%r1675]; mov.u16 %r1813,%r2420; ld.u8 %r1814,[%r522+1]; .loc 1 2855 11 cvt.u16.u32 %r1816,%r1805; mad.lo.u16 %r1817,%r1813,%r1814,%r1816; cvt.u32.u16 %r1818,%r1817; cvt.s32.s8 %r383,%r1818; .loc 1 2858 16 add.u64 %r1819,%r463,%r1675; add.u64 %r1820,%r1819,%r461; .loc 1 2858 13 ld.u8 %r2420,[%r1820]; mov.u16 %r1822,%r2420; .loc 1 2857 11 cvt.u16.u32 %r1824,%r1806; mad.lo.u16 %r1825,%r1822,%r1814,%r1824; cvt.u32.u16 %r1826,%r1825; cvt.s32.s8 %r384,%r1826; .loc 1 2860 16 add.u64 %r1827,%r459,%r1675; add.u64 %r1828,%r1827,%r461; .loc 1 2860 13 ld.u8 %r2420,[%r1828]; mov.u16 %r1830,%r2420; .loc 1 2859 11 cvt.u16.u32 %r1832,%r1807; mad.lo.u16 %r1833,%r1830,%r1814,%r1832; cvt.u32.u16 %r1834,%r1833; cvt.s32.s8 %r385,%r1834; .loc 1 2862 16 add.u64 %r1835,%r458,%r1675; add.u64 %r1836,%r1835,%r461; .loc 1 2862 13 ld.u8 %r2420,[%r1836]; mov.u16 %r1838,%r2420; .loc 1 2861 11 cvt.u16.u32 %r1840,%r1808; mad.lo.u16 %r1841,%r1838,%r1814,%r1840; cvt.u32.u16 %r1842,%r1841; cvt.s32.s8 %r386,%r1842; .loc 1 2853 10 add.u64 %r522,%r522,2; add.u64 %r520,%r520,2; setp.ne.u64 %r1843,%r457,%r520; @ %r1843 bra $L70; $L69: .loc 1 2864 28 cvt.u32.u32 %r1361,%r383; st.u8 [%r508],%r1361; .loc 1 2865 34 cvt.u32.u32 %r1362,%r384; st.u8 [%r507],%r1362; .loc 1 2866 34 cvt.u32.u32 %r1363,%r385; st.u8 [%r505],%r1363; .loc 1 2867 34 cvt.u32.u32 %r1364,%r386; st.u8 [%r504],%r1364; .loc 1 2846 6 add.u64 %r508,%r508,1; add.u64 %r507,%r507,1; add.u64 %r505,%r505,1; add.u64 %r504,%r504,1; add.u64 %r503,%r503,256; setp.ne.u64 %r1367,%r1366,%r508; @ %r1367 bra $L71; bra $L63; $L60: .loc 1 2727 8 add.u64 %r597,%r598,256; setp.ne.u64 %r1368,%r1638,%r598; @ %r1368 bra $L140; bra $L73; $L59: .loc 1 2874 9 @ %r1634 bra $L60; .loc 1 2759 17 add.u64 %r614,%r1757,%r591; .loc 1 2876 14 add.u64 %r615,%r372,%r591; .loc 1 2876 9 add.u64 %r616,%r615,-1; shl.b64 %r554,%r372,8; mov.u64 %r527,%r526; 
mov.u64 %r529,%r528; mov.u64 %r531,%r578; mov.u64 %r374,%r600; setp.gt.s64 %r1844,%r1757,%r372; setp.le.s64 %r1581,%r591,%r616; $L81: .loc 1 2877 6 @ %r1581 bra $L74; $L80: .loc 1 2901 6 @ %r1844 bra $L75; bra $L76; $L74: .loc 1 2904 13 mov.u64 %r378,%r591; mov.u64 %r536,1024; mov.u64 %r537,768; mov.u64 %r538,512; mov.u64 %r539,256; $L79: add.u64 %r533,%r527,%r378; .loc 1 2879 14 ld.s8 %r379,[%r533]; .loc 1 2880 14 ld.s8 %r380,[%r533+1]; .loc 1 2881 14 ld.s8 %r381,[%r533+2]; .loc 1 2882 14 ld.s8 %r382,[%r533+3]; .loc 1 2884 10 @ %r1626 bra $L77; add.u64 %r547,%r1007,%r539; mov.u64 %r548,%r531; sub.u64 %r1664,%r529,%r531; and.b64 %r1665,%r1664,1; setp.eq.u64 %r1666,%r1665,0; @ ! %r1666 bra $L361; $L78: sub.u64 %r545,%r547,%r538; .loc 1 2886 16 add.u64 %r1374,%r545,%r539; .loc 1 2887 13 ld.u8 %r2420,[%r1374]; mov.u16 %r1377,%r2420; ld.u8 %r1378,[%r548]; .loc 1 2886 11 cvt.u16.u32 %r1383,%r379; mad.lo.u16 %r1381,%r1377,%r1378,%r1383; cvt.u32.u16 %r1384,%r1381; cvt.s32.s8 %r1845,%r1384; .loc 1 2889 13 ld.u8 %r2420,[%r547]; mov.u16 %r1388,%r2420; .loc 1 2888 11 cvt.u16.u32 %r1394,%r380; mad.lo.u16 %r1392,%r1388,%r1378,%r1394; cvt.u32.u16 %r1395,%r1392; cvt.s32.s8 %r1846,%r1395; .loc 1 2890 16 add.u64 %r1397,%r545,%r537; .loc 1 2891 13 ld.u8 %r2420,[%r1397]; mov.u16 %r1400,%r2420; .loc 1 2890 11 cvt.u16.u32 %r1406,%r381; mad.lo.u16 %r1404,%r1400,%r1378,%r1406; cvt.u32.u16 %r1407,%r1404; cvt.s32.s8 %r1847,%r1407; .loc 1 2892 16 add.u64 %r1409,%r545,%r536; .loc 1 2893 13 ld.u8 %r2420,[%r1409]; mov.u16 %r1412,%r2420; .loc 1 2892 11 cvt.u16.u32 %r1418,%r382; mad.lo.u16 %r1416,%r1412,%r1378,%r1418; cvt.u32.u16 %r1419,%r1416; cvt.s32.s8 %r1848,%r1419; .loc 1 2884 10 add.u64 %r1668,%r547,1; sub.u64 %r1851,%r1668,%r538; .loc 1 2886 16 add.u64 %r1853,%r1851,%r539; .loc 1 2887 13 ld.u8 %r2420,[%r1853]; mov.u16 %r1855,%r2420; ld.u8 %r1856,[%r548+1]; .loc 1 2886 11 cvt.u16.u32 %r1858,%r1845; mad.lo.u16 %r1859,%r1855,%r1856,%r1858; cvt.u32.u16 %r1860,%r1859; cvt.s32.s8 %r379,%r1860; .loc 1 2889 13 ld.u8 %r2420,[%r1668]; mov.u16 %r1862,%r2420; .loc 1 2888 11 cvt.u16.u32 %r1864,%r1846; mad.lo.u16 %r1865,%r1862,%r1856,%r1864; cvt.u32.u16 %r1866,%r1865; cvt.s32.s8 %r380,%r1866; .loc 1 2890 16 add.u64 %r1867,%r1851,%r537; .loc 1 2891 13 ld.u8 %r2420,[%r1867]; mov.u16 %r1869,%r2420; .loc 1 2890 11 cvt.u16.u32 %r1871,%r1847; mad.lo.u16 %r1872,%r1869,%r1856,%r1871; cvt.u32.u16 %r1873,%r1872; cvt.s32.s8 %r381,%r1873; .loc 1 2892 16 add.u64 %r1874,%r1851,%r536; .loc 1 2893 13 ld.u8 %r2420,[%r1874]; mov.u16 %r1876,%r2420; .loc 1 2892 11 cvt.u16.u32 %r1878,%r1848; mad.lo.u16 %r1879,%r1876,%r1856,%r1878; cvt.u32.u16 %r1880,%r1879; cvt.s32.s8 %r382,%r1880; .loc 1 2884 10 add.u64 %r548,%r548,2; add.u64 %r547,%r547,2; setp.ne.u64 %r1881,%r529,%r548; @ %r1881 bra $L78; $L77: .loc 1 2895 28 cvt.u32.u32 %r1422,%r379; st.u8 [%r533],%r1422; .loc 1 2896 32 cvt.u32.u32 %r1423,%r380; st.u8 [%r533+1],%r1423; .loc 1 2897 32 cvt.u32.u32 %r1424,%r381; st.u8 [%r533+2],%r1424; .loc 1 2898 32 cvt.u32.u32 %r1425,%r382; st.u8 [%r533+3],%r1425; .loc 1 2877 30 add.u64 %r378,%r378,4; .loc 1 2877 6 add.u64 %r539,%r539,1024; add.u64 %r538,%r538,1024; add.u64 %r537,%r537,1024; add.u64 %r536,%r536,1024; setp.le.s64 %r1426,%r378,%r616; @ %r1426 bra $L79; bra $L80; $L76: .loc 1 2874 39 add.u64 %r374,%r374,1; .loc 1 2874 9 add.u64 %r531,%r531,%r308; add.u64 %r529,%r529,%r308; add.u64 %r527,%r527,%r626; setp.ne.u64 %r1427,%r1633,%r374; @ %r1427 bra $L81; bra $L60; $L75: add.u64 %r556,%r527,%r615; .loc 1 2904 13 mov.u64 %r555,%r554; add.u64 
%r1577,%r527,%r614; $L84: .loc 1 2903 14 ld.s8 %r376,[%r556]; .loc 1 2905 10 @ %r1626 bra $L82; add.u64 %r406,%r1007,%r555; mov.u64 %r559,%r531; sub.u64 %r1655,%r529,%r531; and.b64 %r1656,%r1655,3; setp.eq.u64 %r1659,%r1656,0; @ %r1659 bra $L83; setp.eq.u64 %r1658,%r1656,1; @ %r1658 bra $L300; setp.eq.u64 %r1657,%r1656,2; @ %r1657 bra $L301; bra $L362; $L83: .loc 1 2908 13 ld.u8 %r2420,[%r406]; mov.u16 %r1433,%r2420; ld.u8 %r2420,[%r559]; mov.u16 %r1434,%r2420; .loc 1 2907 11 cvt.u16.u32 %r1439,%r376; mad.lo.u16 %r1437,%r1433,%r1434,%r1439; cvt.u32.u16 %r1440,%r1437; cvt.s32.s8 %r1882,%r1440; .loc 1 2908 13 ld.u8 %r2420,[%r406+1]; mov.u16 %r1887,%r2420; ld.u8 %r2420,[%r559+1]; mov.u16 %r1888,%r2420; .loc 1 2907 11 cvt.u16.u32 %r1890,%r1882; mad.lo.u16 %r1891,%r1887,%r1888,%r1890; cvt.u32.u16 %r1892,%r1891; cvt.s32.s8 %r1893,%r1892; .loc 1 2908 13 ld.u8 %r2420,[%r406+2]; mov.u16 %r1899,%r2420; ld.u8 %r2420,[%r559+2]; mov.u16 %r1900,%r2420; .loc 1 2907 11 cvt.u16.u32 %r1902,%r1893; mad.lo.u16 %r1903,%r1899,%r1900,%r1902; cvt.u32.u16 %r1904,%r1903; cvt.s32.s8 %r1905,%r1904; .loc 1 2908 13 ld.u8 %r2420,[%r406+3]; mov.u16 %r1911,%r2420; ld.u8 %r2420,[%r559+3]; mov.u16 %r1912,%r2420; .loc 1 2907 11 cvt.u16.u32 %r1914,%r1905; mad.lo.u16 %r1915,%r1911,%r1912,%r1914; cvt.u32.u16 %r1916,%r1915; cvt.s32.s8 %r376,%r1916; .loc 1 2905 10 add.u64 %r406,%r406,4; add.u64 %r559,%r559,4; setp.ne.u64 %r1917,%r529,%r559; @ %r1917 bra $L83; $L82: .loc 1 2910 28 cvt.u32.u32 %r1443,%r376; st.u8 [%r556],%r1443; .loc 1 2901 6 add.u64 %r556,%r556,1; add.u64 %r555,%r555,256; setp.ne.u64 %r1445,%r1577,%r556; @ %r1445 bra $L84; bra $L76; $L20: .loc 1 2958 11 cvt.u16.u32 %r1446,%r265; setp.eq.u16 %r1447,%r1446,1; @ ! %r1447 bra $L363; bra $L85; $L31: .loc 1 2920 38 set.u32.eq.u64 %r1449,%r307,1; neg.s32 %r1450,%r1449; .loc 1 2920 26 cvt.u16.u32 %r1452,%r1450; and.b16 %r1451,%r1452,%r964; .loc 1 2920 11 cvt.u32.u16 %r1454,%r1451; cvt.u16.u8 %r1455,%r1454; setp.eq.u16 %r1456,%r1455,0; @ %r1456 bra $L20; .loc 1 2920 43 setp.ne.u64 %r1457,%r625,1; @ %r1457 bra $L20; $L121: .loc 1 2922 10 cvt.u16.u32 %r1458,%r265; setp.ne.u16 %r1459,%r1458,1; @ %r1459 bra $L87; .loc 1 2948 4 setp.gt.s64 %r1460,%r314,0; @ %r1460 bra $L88; bra $L1; $L87: .loc 1 2929 4 setp.le.s64 %r1461,%r314,0; @ %r1461 bra $L1; mov.u64 %r649,%r356; mov.u64 %r651,%r355; .loc 1 2929 11 mov.u64 %r410,0; setp.gt.s64 %r1918,%r313,0; setp.le.s64 %r1643,%r266,0; .loc 1 2936 7 cvt.u32.u64 %r1644,%r410; $L91: .loc 1 2933 8 @ %r1918 bra $L90; $L95: .loc 1 2929 29 add.u64 %r410,%r410,1; .loc 1 2929 4 add.u64 %r649,%r649,%r626; add.u64 %r651,%r651,%r308; setp.ne.u64 %r1463,%r314,%r410; @ %r1463 bra $L91; bra $L1; $L90: add.u64 %r644,%r266,%r354; .loc 1 2933 8 mov.u64 %r641,%r649; mov.u64 %r642,0; add.u64 %r1596,%r313,%r649; $L94: .loc 1 2937 5 @ %r1643 bra $L141; add.u64 %r310,%r354,%r642; mov.u64 %r231,%r651; .loc 1 2936 7 mov.u32 %r411,%r1644; sub.u64 %r1705,%r644,%r310; and.b64 %r1706,%r1705,3; setp.eq.u64 %r1709,%r1706,0; @ %r1709 bra $L93; setp.eq.u64 %r1708,%r1706,1; @ %r1708 bra $L302; setp.eq.u64 %r1707,%r1706,2; @ %r1707 bra $L303; bra $L364; $L93: .loc 1 2938 23 ld.u8 %r2420,[%r310]; mov.u16 %r1468,%r2420; ld.u8 %r2420,[%r231]; mov.u16 %r1469,%r2420; .loc 1 2938 9 cvt.u16.u32 %r1474,%r411; mad.lo.u16 %r1472,%r1468,%r1469,%r1474; cvt.u32.u16 %r1475,%r1472; cvt.s32.s8 %r1919,%r1475; .loc 1 2938 23 ld.u8 %r2420,[%r310+1]; mov.u16 %r1924,%r2420; ld.u8 %r2420,[%r231+1]; mov.u16 %r1925,%r2420; .loc 1 2938 9 cvt.u16.u32 %r1927,%r1919; mad.lo.u16 
%r1928,%r1924,%r1925,%r1927; cvt.u32.u16 %r1929,%r1928; cvt.s32.s8 %r1930,%r1929; .loc 1 2938 23 ld.u8 %r2420,[%r310+2]; mov.u16 %r1936,%r2420; ld.u8 %r2420,[%r231+2]; mov.u16 %r1937,%r2420; .loc 1 2938 9 cvt.u16.u32 %r1939,%r1930; mad.lo.u16 %r1940,%r1936,%r1937,%r1939; cvt.u32.u16 %r1941,%r1940; cvt.s32.s8 %r1942,%r1941; .loc 1 2938 23 ld.u8 %r2420,[%r310+3]; mov.u16 %r1948,%r2420; ld.u8 %r2420,[%r231+3]; mov.u16 %r1949,%r2420; .loc 1 2938 9 cvt.u16.u32 %r1951,%r1942; mad.lo.u16 %r1952,%r1948,%r1949,%r1951; cvt.u32.u16 %r1953,%r1952; cvt.s32.s8 %r411,%r1953; .loc 1 2937 5 add.u64 %r310,%r310,4; add.u64 %r231,%r231,4; setp.ne.u64 %r1954,%r310,%r644; @ %r1954 bra $L93; bra $L92; $L141: .loc 1 2936 7 mov.u32 %r411,%r1644; $L92: .loc 1 2939 15 cvt.u32.u32 %r1478,%r411; st.u8 [%r641],%r1478; .loc 1 2933 8 add.u64 %r641,%r641,1; add.u64 %r642,%r642,%r377; add.u64 %r644,%r644,%r377; setp.ne.u64 %r1480,%r641,%r1596; @ %r1480 bra $L94; bra $L95; $L88: mov.u64 %r661,%r356; mov.u64 %r663,%r355; .loc 1 2948 11 mov.u64 %r408,0; setp.le.s64 %r1955,%r266,0; .loc 1 2951 10 cvt.u32.u64 %r1646,%r408; $L98: .loc 1 2952 8 @ %r1955 bra $L142; mov.u64 %r654,%r354; mov.u64 %r656,%r663; .loc 1 2951 10 mov.u32 %r409,%r1646; add.u64 %r1594,%r266,%r663; and.b64 %r1715,%r266,3; setp.eq.u64 %r1718,%r1715,0; @ %r1718 bra $L97; setp.eq.u64 %r1717,%r1715,1; @ %r1717 bra $L304; setp.eq.u64 %r1716,%r1715,2; @ %r1716 bra $L305; bra $L365; $L97: .loc 1 2953 26 ld.u8 %r2420,[%r654]; mov.u16 %r1485,%r2420; ld.u8 %r2420,[%r656]; mov.u16 %r1486,%r2420; .loc 1 2953 5 cvt.u16.u32 %r1491,%r409; mad.lo.u16 %r1489,%r1485,%r1486,%r1491; cvt.u32.u16 %r1492,%r1489; cvt.s32.s8 %r1956,%r1492; .loc 1 2952 8 add.u64 %r1957,%r654,%r377; .loc 1 2953 26 ld.u8 %r2420,[%r1957]; mov.u16 %r1961,%r2420; ld.u8 %r2420,[%r656+1]; mov.u16 %r1962,%r2420; .loc 1 2953 5 cvt.u16.u32 %r1964,%r1956; mad.lo.u16 %r1965,%r1961,%r1962,%r1964; cvt.u32.u16 %r1966,%r1965; cvt.s32.s8 %r1967,%r1966; .loc 1 2952 8 add.u64 %r1968,%r1957,%r377; .loc 1 2953 26 ld.u8 %r2420,[%r1968]; mov.u16 %r1973,%r2420; ld.u8 %r2420,[%r656+2]; mov.u16 %r1974,%r2420; .loc 1 2953 5 cvt.u16.u32 %r1976,%r1967; mad.lo.u16 %r1977,%r1973,%r1974,%r1976; cvt.u32.u16 %r1978,%r1977; cvt.s32.s8 %r1979,%r1978; .loc 1 2952 8 add.u64 %r1980,%r1968,%r377; .loc 1 2953 26 ld.u8 %r2420,[%r1980]; mov.u16 %r1985,%r2420; ld.u8 %r2420,[%r656+3]; mov.u16 %r1986,%r2420; .loc 1 2953 5 cvt.u16.u32 %r1988,%r1979; mad.lo.u16 %r1989,%r1985,%r1986,%r1988; cvt.u32.u16 %r1990,%r1989; cvt.s32.s8 %r409,%r1990; .loc 1 2952 8 add.u64 %r654,%r1980,%r377; add.u64 %r656,%r656,4; setp.ne.u64 %r1991,%r656,%r1594; @ %r1991 bra $L97; bra $L96; $L142: .loc 1 2951 10 mov.u32 %r409,%r1646; $L96: .loc 1 2954 25 cvt.u32.u32 %r1496,%r409; st.u8 [%r661],%r1496; .loc 1 2948 29 add.u64 %r408,%r408,1; .loc 1 2948 4 add.u64 %r661,%r661,%r626; add.u64 %r663,%r663,%r308; setp.ne.u64 %r1497,%r314,%r408; @ %r1497 bra $L98; bra $L1; $L85: .loc 1 2963 7 setp.le.s64 %r1498,%r314,0; @ %r1498 bra $L1; mov.u64 %r675,%r356; mov.u64 %r677,0; .loc 1 2963 14 mov.u64 %r422,%r677; setp.le.s64 %r1992,%r266,0; .loc 1 2966 6 cvt.u32.u64 %r1647,%r677; $L101: .loc 1 2967 4 @ %r1992 bra $L143; mov.u64 %r668,%r354; add.u64 %r670,%r355,%r677; .loc 1 2966 6 mov.u32 %r423,%r1647; .loc 1 2967 11 mov.u64 %r424,0; and.b64 %r1723,%r266,3; setp.eq.u64 %r1726,%r1723,0; @ %r1726 bra $L100; setp.eq.u64 %r1725,%r1723,1; @ %r1725 bra $L306; setp.eq.u64 %r1724,%r1723,2; @ %r1724 bra $L307; bra $L366; $L100: .loc 1 2968 29 ld.u8 %r2420,[%r668]; mov.u16 %r1503,%r2420; ld.u8 
%r2420,[%r670]; mov.u16 %r1504,%r2420; .loc 1 2968 8 cvt.u16.u32 %r1509,%r423; mad.lo.u16 %r1507,%r1503,%r1504,%r1509; cvt.u32.u16 %r1510,%r1507; cvt.s32.s8 %r1993,%r1510; .loc 1 2967 4 add.u64 %r1995,%r668,%r377; add.u64 %r1996,%r670,%r625; .loc 1 2968 29 ld.u8 %r2420,[%r1995]; mov.u16 %r1999,%r2420; ld.u8 %r2420,[%r1996]; mov.u16 %r2000,%r2420; .loc 1 2968 8 cvt.u16.u32 %r2002,%r1993; mad.lo.u16 %r2003,%r1999,%r2000,%r2002; cvt.u32.u16 %r2004,%r2003; cvt.s32.s8 %r2005,%r2004; .loc 1 2967 4 add.u64 %r2007,%r1995,%r377; add.u64 %r2008,%r1996,%r625; .loc 1 2968 29 ld.u8 %r2420,[%r2007]; mov.u16 %r2012,%r2420; ld.u8 %r2420,[%r2008]; mov.u16 %r2013,%r2420; .loc 1 2968 8 cvt.u16.u32 %r2015,%r2005; mad.lo.u16 %r2016,%r2012,%r2013,%r2015; cvt.u32.u16 %r2017,%r2016; cvt.s32.s8 %r2018,%r2017; .loc 1 2967 4 add.u64 %r2020,%r2007,%r377; add.u64 %r2021,%r2008,%r625; .loc 1 2968 29 ld.u8 %r2420,[%r2020]; mov.u16 %r2025,%r2420; ld.u8 %r2420,[%r2021]; mov.u16 %r2026,%r2420; .loc 1 2968 8 cvt.u16.u32 %r2028,%r2018; mad.lo.u16 %r2029,%r2025,%r2026,%r2028; cvt.u32.u16 %r2030,%r2029; cvt.s32.s8 %r423,%r2030; .loc 1 2967 28 add.u64 %r424,%r424,4; .loc 1 2967 4 add.u64 %r668,%r2020,%r377; add.u64 %r670,%r2021,%r625; setp.ne.u64 %r2031,%r266,%r424; @ %r2031 bra $L100; bra $L99; $L143: .loc 1 2966 6 mov.u32 %r423,%r1647; $L99: .loc 1 2969 21 cvt.u32.u32 %r1513,%r423; st.u8 [%r675],%r1513; .loc 1 2963 32 add.u64 %r422,%r422,1; .loc 1 2963 7 add.u64 %r675,%r675,%r336; add.u64 %r677,%r677,%r308; setp.ne.u64 %r1514,%r314,%r422; @ %r1514 bra $L101; bra $L1; $L363: .loc 1 2972 11 setp.gt.s64 %r1515,%r307,%r377; @ %r1515 bra $L102; .loc 1 2993 7 setp.gt.s64 %r1516,%r314,0; @ %r1516 bra $L103; bra $L1; $L102: .loc 1 2974 7 setp.gt.s64 %r1517,%r314,0; @ %r1517 bra $L104; bra $L1; $L108: add.u64 %r700,%r356,%r703; .loc 1 2975 9 mov.u64 %r420,0; and.b64 %r1739,%r313,3; setp.eq.u64 %r1742,%r1739,0; @ %r1742 bra $L105; setp.eq.u64 %r1741,%r1739,1; @ %r1741 bra $L308; setp.eq.u64 %r1740,%r1739,2; @ %r1740 bra $L309; bra $L367; $L105: .loc 1 2976 34 st.u8 [%r700],%r1648; .loc 1 2975 2 add.u64 %r2033,%r700,%r336; .loc 1 2976 34 st.u8 [%r2033],%r1648; .loc 1 2975 2 add.u64 %r2035,%r2033,%r336; .loc 1 2976 34 st.u8 [%r2035],%r1648; .loc 1 2975 2 add.u64 %r2038,%r2035,%r336; .loc 1 2976 34 st.u8 [%r2038],%r1648; .loc 1 2975 27 add.u64 %r420,%r420,4; .loc 1 2975 2 add.u64 %r700,%r2038,%r336; setp.ne.u64 %r2040,%r313,%r420; @ %r2040 bra $L105; $L109: .loc 1 2974 32 add.u64 %r419,%r435,1; .loc 1 2974 7 add.u64 %r703,%r703,%r626; setp.ne.u64 %r1520,%r314,%r419; @ %r1520 bra $L144; mov.u64 %r696,%r356; mov.u64 %r697,0; mov.u64 %r436,%r697; setp.le.s64 %r2041,%r266,0; bra $L107; $L104: mov.u64 %r703,0; .loc 1 2974 14 mov.u64 %r435,%r703; setp.gt.s64 %r2042,%r313,0; .loc 1 2976 34 cvt.u32.u64 %r1648,%r703; bra $L106; $L144: mov.u64 %r435,%r419; $L106: .loc 1 2975 2 @ %r2042 bra $L108; bra $L109; $L112: add.u64 %r684,%r354,%r690; .loc 1 2980 4 mov.u64 %r681,%r696; .loc 1 2980 11 mov.u64 %r418,0; and.b64 %r1731,%r313,3; setp.eq.u64 %r1734,%r1731,0; @ %r1734 bra $L110; setp.eq.u64 %r1733,%r1731,1; @ %r1733 bra $L310; setp.eq.u64 %r1732,%r1731,2; @ %r1732 bra $L311; bra $L368; $L110: .loc 1 2983 37 ld.u8 %r2420,[%r684]; mov.u16 %r1525,%r2420; ld.u8 %r2420,[%r692]; mov.u16 %r1526,%r2420; .loc 1 2982 36 ld.u8 %r2420,[%r681]; mov.u16 %r1531,%r2420; mad.lo.u16 %r1529,%r1525,%r1526,%r1531; cvt.u32.u16 %r1532,%r1529; st.u8 [%r681],%r1532; .loc 1 2980 4 add.u64 %r2044,%r681,%r336; add.u64 %r2045,%r684,%r377; .loc 1 2983 37 ld.u8 %r2420,[%r2045]; 
mov.u16 %r2048,%r2420; ld.u8 %r2420,[%r692]; mov.u16 %r2049,%r2420; .loc 1 2982 36 ld.u8 %r2420,[%r2044]; mov.u16 %r2052,%r2420; mad.lo.u16 %r2053,%r2048,%r2049,%r2052; cvt.u32.u16 %r2054,%r2053; st.u8 [%r2044],%r2054; .loc 1 2980 4 add.u64 %r2056,%r2044,%r336; add.u64 %r2057,%r2045,%r377; .loc 1 2983 37 ld.u8 %r2420,[%r2057]; mov.u16 %r2061,%r2420; ld.u8 %r2420,[%r692]; mov.u16 %r2062,%r2420; .loc 1 2982 36 ld.u8 %r2420,[%r2056]; mov.u16 %r2065,%r2420; mad.lo.u16 %r2066,%r2061,%r2062,%r2065; cvt.u32.u16 %r2067,%r2066; st.u8 [%r2056],%r2067; .loc 1 2980 4 add.u64 %r2069,%r2056,%r336; add.u64 %r2070,%r2057,%r377; .loc 1 2983 37 ld.u8 %r2420,[%r2070]; mov.u16 %r2074,%r2420; ld.u8 %r2420,[%r692]; mov.u16 %r2075,%r2420; .loc 1 2982 36 ld.u8 %r2420,[%r2069]; mov.u16 %r2078,%r2420; mad.lo.u16 %r2079,%r2074,%r2075,%r2078; cvt.u32.u16 %r2080,%r2079; st.u8 [%r2069],%r2080; .loc 1 2980 29 add.u64 %r418,%r418,4; .loc 1 2980 4 add.u64 %r681,%r2069,%r336; add.u64 %r684,%r2070,%r377; setp.ne.u64 %r2081,%r313,%r418; @ %r2081 bra $L110; $L113: .loc 1 2979 26 add.u64 %r417,%r417,1; .loc 1 2979 2 add.u64 %r690,%r690,%r307; add.u64 %r692,%r692,%r625; setp.eq.u64 %r1534,%r266,%r417; @ %r1534 bra $L111; $L114: .loc 1 2980 4 @ %r2042 bra $L112; bra $L113; $L111: .loc 1 2978 32 add.u64 %r416,%r436,1; .loc 1 2978 7 add.u64 %r696,%r696,%r626; add.u64 %r697,%r697,%r308; setp.eq.u64 %r1536,%r435,%r436; @ %r1536 bra $L1; mov.u64 %r436,%r416; $L107: .loc 1 2979 2 @ %r2041 bra $L111; add.u64 %r692,%r355,%r697; mov.u64 %r690,0; .loc 1 2979 9 mov.u64 %r417,%r690; bra $L114; $L103: mov.u64 %r721,%r355; .loc 1 2993 7 mov.u64 %r719,0; .loc 1 2993 14 mov.u64 %r412,%r719; setp.gt.s64 %r2082,%r313,0; setp.le.s64 %r1650,%r266,0; .loc 1 3000 10 cvt.u32.u64 %r1651,%r719; $L116: .loc 1 2997 4 @ %r2082 bra $L115; $L120: .loc 1 2993 32 add.u64 %r412,%r412,1; .loc 1 2993 7 add.u64 %r719,%r719,%r626; add.u64 %r721,%r721,%r308; setp.ne.u64 %r1539,%r314,%r412; @ %r1539 bra $L116; bra $L1; $L115: add.u64 %r713,%r356,%r719; .loc 1 2997 4 mov.u64 %r715,0; .loc 1 2997 11 mov.u64 %r413,%r715; $L119: .loc 1 3001 8 @ %r1650 bra $L145; add.u64 %r706,%r354,%r715; mov.u64 %r709,%r721; .loc 1 3000 10 mov.u32 %r414,%r1651; .loc 1 3001 15 mov.u64 %r415,0; and.b64 %r1747,%r266,3; setp.eq.u64 %r1750,%r1747,0; @ %r1750 bra $L118; setp.eq.u64 %r1749,%r1747,1; @ %r1749 bra $L312; setp.eq.u64 %r1748,%r1747,2; @ %r1748 bra $L313; bra $L369; $L118: .loc 1 3002 28 ld.u8 %r2420,[%r706]; mov.u16 %r1544,%r2420; ld.u8 %r2420,[%r709]; mov.u16 %r1545,%r2420; .loc 1 3002 5 cvt.u16.u32 %r1550,%r414; mad.lo.u16 %r1548,%r1544,%r1545,%r1550; cvt.u32.u16 %r1551,%r1548; cvt.s32.s8 %r2084,%r1551; .loc 1 3001 8 add.u64 %r2086,%r706,%r307; add.u64 %r2087,%r709,%r625; .loc 1 3002 28 ld.u8 %r2420,[%r2086]; mov.u16 %r2090,%r2420; ld.u8 %r2420,[%r2087]; mov.u16 %r2091,%r2420; .loc 1 3002 5 cvt.u16.u32 %r2093,%r2084; mad.lo.u16 %r2094,%r2090,%r2091,%r2093; cvt.u32.u16 %r2095,%r2094; cvt.s32.s8 %r2096,%r2095; .loc 1 3001 8 add.u64 %r2098,%r2086,%r307; add.u64 %r2099,%r2087,%r625; .loc 1 3002 28 ld.u8 %r2420,[%r2098]; mov.u16 %r2103,%r2420; ld.u8 %r2420,[%r2099]; mov.u16 %r2104,%r2420; .loc 1 3002 5 cvt.u16.u32 %r2106,%r2096; mad.lo.u16 %r2107,%r2103,%r2104,%r2106; cvt.u32.u16 %r2108,%r2107; cvt.s32.s8 %r2109,%r2108; .loc 1 3001 8 add.u64 %r2111,%r2098,%r307; add.u64 %r2112,%r2099,%r625; .loc 1 3002 28 ld.u8 %r2420,[%r2111]; mov.u16 %r2116,%r2420; ld.u8 %r2420,[%r2112]; mov.u16 %r2117,%r2420; .loc 1 3002 5 cvt.u16.u32 %r2119,%r2109; mad.lo.u16 %r2120,%r2116,%r2117,%r2119; 
cvt.u32.u16 %r2121,%r2120; cvt.s32.s8 %r414,%r2121; .loc 1 3001 32 add.u64 %r415,%r415,4; .loc 1 3001 8 add.u64 %r706,%r2111,%r307; add.u64 %r709,%r2112,%r625; setp.ne.u64 %r2122,%r266,%r415; @ %r2122 bra $L118; bra $L117; $L145: .loc 1 3000 10 mov.u32 %r414,%r1651; $L117: .loc 1 3003 27 cvt.u32.u32 %r1554,%r414; st.u8 [%r713],%r1554; .loc 1 2997 29 add.u64 %r413,%r413,1; .loc 1 2997 4 add.u64 %r713,%r713,%r336; add.u64 %r715,%r715,%r377; setp.ne.u64 %r1555,%r313,%r413; @ %r1555 bra $L119; bra $L120; $L32: .loc 1 2920 11 setp.eq.u64 %r1556,%r307,1; @ %r1556 bra $L121; bra $L20; $L2: .loc 1 2476 6 ld.u64 %r1557,[%r724]; setp.ne.u64 %r1558,%r1557,0; @ %r1558 bra $L122; .loc 1 2485 4 ld.u64 %r1560,[%r725+56]; add.u64 %r1559,%r1560,1; ld.u64 %r1561,[%r725+48]; sub.u64 %r630,%r1559,%r1561; .loc 1 2483 15 ld.s8 %r1563,[%r726+28]; cvt.u16.u32 %r1562,%r1563; setp.eq.u16 %r1564,%r1562,1; @ %r1564 bra $L123; bra $L124; $L126: .loc 1 2560 16 ld.u64 %r377,[%r725+40]; .loc 1 2548 27 mov.u64 %r336,%r626; bra $L125; $L14: ld.u64 %r626,[%r724+40]; .loc 1 2543 6 ld.s8 %r1566,[%r724+28]; cvt.u16.u32 %r1565,%r1566; setp.eq.u16 %r1567,%r1565,1; @ ! %r1567 bra $L127; bra $L126; $L11: .loc 1 2560 16 ld.u64 %r377,[%r725+40]; mov.u32 %r265,%r288; .loc 1 2548 27 mov.u64 %r626,%r336; .loc 1 2563 14 mov.u64 %r313,1; .loc 1 2561 16 mov.u64 %r307,%r313; bra $L12; $L5: .loc 1 2485 4 ld.u64 %r1569,[%r725+56]; add.u64 %r1568,%r1569,1; ld.u64 %r1570,[%r725+48]; sub.u64 %r630,%r1568,%r1570; bra $L124; $L128: .loc 1 2509 17 ld.u64 %r1572,[%r724+56]; add.u64 %r1571,%r1572,1; .loc 1 2509 15 ld.u64 %r1573,[%r724+48]; sub.u64 %r266,%r1571,%r1573; ld.s8 %r1591,[%r726+28]; bra $L9; $L122: .loc 1 2502 12 cvta.global.u64 %r1574,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r1575,[%r1574+36]; setp.ne.u32 %r1576,%r1575,0; @ ! 
%r1576 bra $L8; bra $L128; $L1: .loc 1 3007 1 ret; $L362: .loc 1 2908 13 ld.u8 %r2420,[%r406]; mov.u16 %r2125,%r2420; ld.u8 %r2420,[%r531]; mov.u16 %r2126,%r2420; .loc 1 2907 11 cvt.u16.u32 %r2128,%r376; mad.lo.u16 %r2129,%r2125,%r2126,%r2128; cvt.u32.u16 %r2130,%r2129; cvt.s32.s8 %r376,%r2130; .loc 1 2905 10 add.u64 %r406,%r406,1; add.u64 %r559,%r531,1; $L301: .loc 1 2908 13 ld.u8 %r2420,[%r406]; mov.u16 %r2134,%r2420; ld.u8 %r2420,[%r559]; mov.u16 %r2135,%r2420; .loc 1 2907 11 cvt.u16.u32 %r2137,%r376; mad.lo.u16 %r2138,%r2134,%r2135,%r2137; cvt.u32.u16 %r2139,%r2138; cvt.s32.s8 %r376,%r2139; .loc 1 2905 10 add.u64 %r406,%r406,1; add.u64 %r559,%r559,1; $L300: .loc 1 2908 13 ld.u8 %r2420,[%r406]; mov.u16 %r2143,%r2420; ld.u8 %r2420,[%r559]; mov.u16 %r2144,%r2420; .loc 1 2907 11 cvt.u16.u32 %r2146,%r376; mad.lo.u16 %r2147,%r2143,%r2144,%r2146; cvt.u32.u16 %r2148,%r2147; cvt.s32.s8 %r376,%r2148; .loc 1 2905 10 add.u64 %r406,%r406,1; add.u64 %r559,%r559,1; setp.ne.u64 %r2149,%r529,%r559; @ %r2149 bra $L83; bra $L82; $L361: sub.u64 %r2150,%r547,%r538; .loc 1 2886 16 add.u64 %r2152,%r2150,%r539; .loc 1 2887 13 ld.u8 %r2420,[%r2152]; mov.u16 %r2154,%r2420; ld.u8 %r2155,[%r531]; .loc 1 2886 11 cvt.u16.u32 %r2157,%r379; mad.lo.u16 %r2158,%r2154,%r2155,%r2157; cvt.u32.u16 %r2159,%r2158; cvt.s32.s8 %r379,%r2159; .loc 1 2889 13 ld.u8 %r2420,[%r547]; mov.u16 %r2161,%r2420; .loc 1 2888 11 cvt.u16.u32 %r2163,%r380; mad.lo.u16 %r2164,%r2161,%r2155,%r2163; cvt.u32.u16 %r2165,%r2164; cvt.s32.s8 %r380,%r2165; .loc 1 2890 16 add.u64 %r2166,%r2150,%r537; .loc 1 2891 13 ld.u8 %r2420,[%r2166]; mov.u16 %r2168,%r2420; .loc 1 2890 11 cvt.u16.u32 %r2170,%r381; mad.lo.u16 %r2171,%r2168,%r2155,%r2170; cvt.u32.u16 %r2172,%r2171; cvt.s32.s8 %r381,%r2172; .loc 1 2892 16 add.u64 %r2173,%r2150,%r536; .loc 1 2893 13 ld.u8 %r2420,[%r2173]; mov.u16 %r2175,%r2420; .loc 1 2892 11 cvt.u16.u32 %r2177,%r382; mad.lo.u16 %r2178,%r2175,%r2155,%r2177; cvt.u32.u16 %r2179,%r2178; cvt.s32.s8 %r382,%r2179; .loc 1 2884 10 add.u64 %r548,%r531,1; add.u64 %r547,%r547,1; setp.ne.u64 %r2180,%r529,%r548; @ %r2180 bra $L78; bra $L77; $L360: .loc 1 2856 13 ld.u8 %r2420,[%r520]; mov.u16 %r2183,%r2420; ld.u8 %r2184,[%r522]; .loc 1 2855 11 cvt.u16.u32 %r2186,%r383; mad.lo.u16 %r2187,%r2183,%r2184,%r2186; cvt.u32.u16 %r2188,%r2187; cvt.s32.s8 %r383,%r2188; .loc 1 2858 16 add.u64 %r2189,%r463,%r520; add.u64 %r2190,%r2189,%r461; .loc 1 2858 13 ld.u8 %r2420,[%r2190]; mov.u16 %r2192,%r2420; .loc 1 2857 11 cvt.u16.u32 %r2194,%r384; mad.lo.u16 %r2195,%r2192,%r2184,%r2194; cvt.u32.u16 %r2196,%r2195; cvt.s32.s8 %r384,%r2196; .loc 1 2860 16 add.u64 %r2197,%r459,%r520; add.u64 %r2198,%r2197,%r461; .loc 1 2860 13 ld.u8 %r2420,[%r2198]; mov.u16 %r2200,%r2420; .loc 1 2859 11 cvt.u16.u32 %r2202,%r385; mad.lo.u16 %r2203,%r2200,%r2184,%r2202; cvt.u32.u16 %r2204,%r2203; cvt.s32.s8 %r385,%r2204; .loc 1 2862 16 add.u64 %r2205,%r458,%r520; add.u64 %r2206,%r2205,%r461; .loc 1 2862 13 ld.u8 %r2420,[%r2206]; mov.u16 %r2208,%r2420; .loc 1 2861 11 cvt.u16.u32 %r2210,%r386; mad.lo.u16 %r2211,%r2208,%r2184,%r2210; cvt.u32.u16 %r2212,%r2211; cvt.s32.s8 %r386,%r2212; .loc 1 2853 10 add.u64 %r522,%r522,1; add.u64 %r520,%r520,1; setp.ne.u64 %r2213,%r457,%r520; @ %r2213 bra $L70; bra $L69; $L359: .loc 1 2762 43 ld.u8 %r2214,[%r432]; st.u8 [%r431],%r2214; .loc 1 2760 9 add.u64 %r432,%r432,1; add.u64 %r431,%r431,256; $L299: .loc 1 2762 43 ld.u8 %r2216,[%r432]; st.u8 [%r431],%r2216; .loc 1 2760 9 add.u64 %r432,%r432,1; add.u64 %r431,%r431,256; $L298: .loc 1 2762 43 ld.u8 
%r2218,[%r432]; st.u8 [%r431],%r2218; .loc 1 2760 9 add.u64 %r432,%r432,1; add.u64 %r431,%r431,256; setp.ne.u64 %r2219,%r342,%r432; @ %r2219 bra $L58; bra $L48; $L358: .loc 1 2740 49 ld.u8 %r2220,[%r320]; st.u8 [%r340],%r2220; .loc 1 2743 7 add.u64 %r2221,%r112,%r320; add.u64 %r2222,%r2221,%r77; .loc 1 2742 49 ld.u8 %r2223,[%r2222]; st.u8 [%r340+1],%r2223; .loc 1 2744 49 ld.u8 %r2224,[%r320+1]; st.u8 [%r340+256],%r2224; .loc 1 2747 7 sub.u64 %r2225,%r320,%r317; add.u64 %r2226,%r2225,%r112; .loc 1 2746 49 ld.u8 %r2227,[%r2226+1]; st.u8 [%r340+257],%r2227; .loc 1 2738 9 add.u64 %r341,%r320,2; add.u64 %r340,%r340,512; $L297: .loc 1 2740 49 ld.u8 %r2229,[%r341]; st.u8 [%r340],%r2229; .loc 1 2743 7 add.u64 %r2230,%r112,%r341; add.u64 %r2231,%r2230,%r77; .loc 1 2742 49 ld.u8 %r2232,[%r2231]; st.u8 [%r340+1],%r2232; .loc 1 2744 49 ld.u8 %r2233,[%r341+1]; st.u8 [%r340+256],%r2233; .loc 1 2747 7 sub.u64 %r2234,%r341,%r317; add.u64 %r2235,%r2234,%r112; .loc 1 2746 49 ld.u8 %r2236,[%r2235+1]; st.u8 [%r340+257],%r2236; .loc 1 2738 9 add.u64 %r341,%r341,2; add.u64 %r340,%r340,512; $L296: .loc 1 2740 49 ld.u8 %r2238,[%r341]; st.u8 [%r340],%r2238; .loc 1 2743 7 add.u64 %r2239,%r112,%r341; add.u64 %r2240,%r2239,%r77; .loc 1 2742 49 ld.u8 %r2241,[%r2240]; st.u8 [%r340+1],%r2241; .loc 1 2744 49 ld.u8 %r2242,[%r341+1]; st.u8 [%r340+256],%r2242; .loc 1 2747 7 sub.u64 %r2243,%r341,%r317; add.u64 %r2244,%r2243,%r112; .loc 1 2746 49 ld.u8 %r2245,[%r2244+1]; st.u8 [%r340+257],%r2245; .loc 1 2738 9 add.u64 %r341,%r341,2; add.u64 %r340,%r340,512; setp.ne.u64 %r2246,%r323,%r341; @ %r2246 bra $L52; bra $L53; $L356: .loc 1 2688 2 @ ! %r1590 bra $L370; .loc 1 2689 22 mov.u32 %r224747313248,[%value_in]; } $L370: .loc 1 2687 24 mov.u64 %r407,2; .loc 1 2687 7 add.u64 %r141,%r141,%r626; $L295: .loc 1 2688 2 @ ! %r1590 bra $L371; .loc 1 2689 22 mov.u32 %r225050313251,[%value_in]; } $L371: .loc 1 2687 24 add.u64 %r407,%r407,1; .loc 1 2687 7 add.u64 %r141,%r141,%r626; $L294: .loc 1 2688 2 @ ! %r1590 bra $L372; .loc 1 2689 22 mov.u32 %r225353313254,[%value_in]; } $L372: .loc 1 2687 24 add.u64 %r407,%r407,1; .loc 1 2687 7 add.u64 %r141,%r141,%r626; setp.eq.u64 %r2255,%r345,%r407; @ ! %r2255 bra $L34; bra $L33; $L232: .loc 1 2689 22 mov.u32 %r225652256313257,[%value_in]; } $L357: .loc 1 2687 7 add.u64 %r2259,%r1753,%r626; .loc 1 2688 2 @ ! %r1590 bra $L373; .loc 1 2689 22 mov.u32 %r226122592261313262,[%value_in]; } $L373: .loc 1 2687 7 add.u64 %r2264,%r2259,%r626; .loc 1 2688 2 @ ! %r1590 bra $L374; .loc 1 2689 22 mov.u32 %r226622642266313267,[%value_in]; } $L374: .loc 1 2687 24 add.u64 %r407,%r1702,3; .loc 1 2687 7 add.u64 %r141,%r2264,%r626; setp.eq.u64 %r2268,%r345,%r407; @ ! 
%r2268 bra $L34; bra $L33; $L364: .loc 1 2938 23 ld.u8 %r2420,[%r310]; mov.u16 %r2271,%r2420; ld.u8 %r2420,[%r651]; mov.u16 %r2272,%r2420; mul.lo.u16 %r2273,%r2271,%r2272; .loc 1 2938 9 cvt.u32.u16 %r2276,%r2273; cvt.s32.s8 %r411,%r2276; .loc 1 2937 5 add.u64 %r310,%r310,1; add.u64 %r231,%r651,1; $L303: .loc 1 2938 23 ld.u8 %r2420,[%r310]; mov.u16 %r2280,%r2420; ld.u8 %r2420,[%r231]; mov.u16 %r2281,%r2420; .loc 1 2938 9 cvt.u16.u32 %r2283,%r411; mad.lo.u16 %r2284,%r2280,%r2281,%r2283; cvt.u32.u16 %r2285,%r2284; cvt.s32.s8 %r411,%r2285; .loc 1 2937 5 add.u64 %r310,%r310,1; add.u64 %r231,%r231,1; $L302: .loc 1 2938 23 ld.u8 %r2420,[%r310]; mov.u16 %r2289,%r2420; ld.u8 %r2420,[%r231]; mov.u16 %r2290,%r2420; .loc 1 2938 9 cvt.u16.u32 %r2292,%r411; mad.lo.u16 %r2293,%r2289,%r2290,%r2292; cvt.u32.u16 %r2294,%r2293; cvt.s32.s8 %r411,%r2294; .loc 1 2937 5 add.u64 %r310,%r310,1; add.u64 %r231,%r231,1; setp.ne.u64 %r2295,%r310,%r644; @ %r2295 bra $L93; bra $L92; $L365: .loc 1 2953 26 ld.u8 %r2420,[%r354]; mov.u16 %r2298,%r2420; ld.u8 %r2420,[%r663]; mov.u16 %r2299,%r2420; mul.lo.u16 %r2300,%r2298,%r2299; .loc 1 2953 5 cvt.u32.u16 %r2303,%r2300; cvt.s32.s8 %r409,%r2303; .loc 1 2952 8 add.u64 %r654,%r354,%r377; add.u64 %r656,%r663,1; $L305: .loc 1 2953 26 ld.u8 %r2420,[%r654]; mov.u16 %r2307,%r2420; ld.u8 %r2420,[%r656]; mov.u16 %r2308,%r2420; .loc 1 2953 5 cvt.u16.u32 %r2310,%r409; mad.lo.u16 %r2311,%r2307,%r2308,%r2310; cvt.u32.u16 %r2312,%r2311; cvt.s32.s8 %r409,%r2312; .loc 1 2952 8 add.u64 %r654,%r654,%r377; add.u64 %r656,%r656,1; $L304: .loc 1 2953 26 ld.u8 %r2420,[%r654]; mov.u16 %r2316,%r2420; ld.u8 %r2420,[%r656]; mov.u16 %r2317,%r2420; .loc 1 2953 5 cvt.u16.u32 %r2319,%r409; mad.lo.u16 %r2320,%r2316,%r2317,%r2319; cvt.u32.u16 %r2321,%r2320; cvt.s32.s8 %r409,%r2321; .loc 1 2952 8 add.u64 %r654,%r654,%r377; add.u64 %r656,%r656,1; setp.ne.u64 %r2322,%r656,%r1594; @ %r2322 bra $L97; bra $L96; $L366: .loc 1 2968 29 ld.u8 %r2420,[%r354]; mov.u16 %r2325,%r2420; ld.u8 %r2420,[%r670]; mov.u16 %r2326,%r2420; mul.lo.u16 %r2327,%r2325,%r2326; .loc 1 2968 8 cvt.u32.u16 %r2330,%r2327; cvt.s32.s8 %r423,%r2330; .loc 1 2967 28 mov.u64 %r424,1; .loc 1 2967 4 add.u64 %r668,%r354,%r377; add.u64 %r670,%r670,%r625; $L307: .loc 1 2968 29 ld.u8 %r2420,[%r668]; mov.u16 %r2334,%r2420; ld.u8 %r2420,[%r670]; mov.u16 %r2335,%r2420; .loc 1 2968 8 cvt.u16.u32 %r2337,%r423; mad.lo.u16 %r2338,%r2334,%r2335,%r2337; cvt.u32.u16 %r2339,%r2338; cvt.s32.s8 %r423,%r2339; .loc 1 2967 28 add.u64 %r424,%r424,1; .loc 1 2967 4 add.u64 %r668,%r668,%r377; add.u64 %r670,%r670,%r625; $L306: .loc 1 2968 29 ld.u8 %r2420,[%r668]; mov.u16 %r2343,%r2420; ld.u8 %r2420,[%r670]; mov.u16 %r2344,%r2420; .loc 1 2968 8 cvt.u16.u32 %r2346,%r423; mad.lo.u16 %r2347,%r2343,%r2344,%r2346; cvt.u32.u16 %r2348,%r2347; cvt.s32.s8 %r423,%r2348; .loc 1 2967 28 add.u64 %r424,%r424,1; .loc 1 2967 4 add.u64 %r668,%r668,%r377; add.u64 %r670,%r670,%r625; setp.ne.u64 %r2349,%r266,%r424; @ %r2349 bra $L100; bra $L99; $L368: .loc 1 2983 37 ld.u8 %r2420,[%r684]; mov.u16 %r2352,%r2420; ld.u8 %r2420,[%r692]; mov.u16 %r2353,%r2420; .loc 1 2982 36 ld.u8 %r2420,[%r696]; mov.u16 %r2356,%r2420; mad.lo.u16 %r2357,%r2352,%r2353,%r2356; cvt.u32.u16 %r2358,%r2357; st.u8 [%r696],%r2358; .loc 1 2980 29 mov.u64 %r418,1; .loc 1 2980 4 add.u64 %r681,%r696,%r336; add.u64 %r684,%r684,%r377; $L311: .loc 1 2983 37 ld.u8 %r2420,[%r684]; mov.u16 %r2362,%r2420; ld.u8 %r2420,[%r692]; mov.u16 %r2363,%r2420; .loc 1 2982 36 ld.u8 %r2420,[%r681]; mov.u16 %r2366,%r2420; mad.lo.u16 
[End of the preceding kernel; archive member matmul_i2.o begins. Recoverable header information: .visible .func _gfortran_matmul_i2 (forward declaration and definition), .file 1 "../../../libgfortran/generated/matmul_i2.c", external declarations for _gfortrani_compile_options, _gfortran_size0, _gfortrani_xmallocarray, and _gfortran_runtime_error, and constant strings reading "matmul_i2", "GFC_DESCRIPTOR_RANK (a) == 2 || GFC_DESCRIPTOR_RANK (b) == 2", "Array bound mismatch for dimension 1 of array (%ld/%ld)", "Array bound mismatch for dimension 2 of array (%ld/%ld)", "Incorrect extent in argument B in MATMUL intrinsic in dimension 1: is %ld, should be %ld", and "gemm != NULL". The definition's virtual-register declarations (.reg .u64/.u32/.u16/.pred entries, from %r148 upward) follow, resuming at: .reg .u32
%r850; .reg .u64 %r851; .reg .u64 %r852; .reg .u64 %r853; .reg .pred %r854; .reg .pred %r856; .reg .u64 %r859; .reg .u64 %r861; .reg .u64 %r862; .reg .u64 %r863; .reg .u64 %r865; .reg .u64 %r866; .reg .u64 %r867; .reg .u64 %r868; .reg .u64 %r869; .reg .u64 %r872; .reg .u64 %r874; .reg .u64 %r875; .reg .u64 %r877; .reg .u64 %r879; .reg .u64 %r880; .reg .u64 %r882; .reg .u64 %r883; .reg .u64 %r884; .reg .u64 %r885; .reg .u64 %r886; .reg .u32 %r887; .reg .pred %r888; .reg .u64 %r889; .reg .u64 %r890; .reg .u64 %r891; .reg .pred %r893; .reg .u64 %r894; .reg .u64 %r895; .reg .u64 %r896; .reg .pred %r897; .reg .u64 %r898; .reg .u64 %r899; .reg .u64 %r900; .reg .u16 %r901; .reg .pred %r902; .reg .u64 %r903; .reg .u64 %r905; .reg .u64 %r906; .reg .u64 %r907; .reg .u16 %r908; .reg .pred %r910; .reg .pred %r911; .reg .u64 %r912; .reg .pred %r914; .reg .u64 %r915; .reg .u64 %r917; .reg .u64 %r918; .reg .u64 %r919; .reg .u64 %r920; .reg .u64 %r921; .reg .u64 %r922; .reg .pred %r923; .reg .u64 %r924; .reg .u64 %r926; .reg .u64 %r927; .reg .u64 %r928; .reg .u16 %r929; .reg .u32 %r930; .reg .pred %r931; .reg .u16 %r932; .reg .pred %r933; .reg .u64 %r934; .reg .u64 %r935; .reg .u64 %r936; .reg .u64 %r937; .reg .u64 %r938; .reg .u64 %r939; .reg .pred %r940; .reg .u32 %r942; .reg .u32 %r943; .reg .u32 %r945; .reg .u32 %r946; .reg .u16 %r947; .reg .u16 %r948; .reg .u16 %r949; .reg .u32 %r950; .reg .u16 %r951; .reg .pred %r952; .reg .u64 %r953; .reg .u16 %r955; .reg .pred %r956; .reg .u64 %r957; .reg .u64 %r958; .reg .u64 %r959; .reg .u32 %r960; .reg .u32 %r961; .reg .u32 %r962; .reg .u32 %r963; .reg .u32 %r964; .reg .u32 %r965; .reg .u32 %r967; .reg .u32 %r968; .reg .u16 %r969; .reg .u16 %r970; .reg .u16 %r971; .reg .u32 %r972; .reg .u16 %r973; .reg .pred %r974; .reg .u32 %r976; .reg .u32 %r977; .reg .u16 %r978; .reg .u16 %r979; .reg .u16 %r980; .reg .u32 %r981; .reg .u16 %r982; .reg .pred %r983; .reg .u32 %r985; .reg .u32 %r986; .reg .u32 %r988; .reg .u32 %r989; .reg .u16 %r990; .reg .u16 %r991; .reg .u16 %r992; .reg .u32 %r993; .reg .u16 %r994; .reg .pred %r995; .reg .f32 %r996; .reg .f32 %r997; .reg .f32 %r998; .reg .f32 %r999; .reg .f32 %r1000; .reg .f32 %r1001; .reg .f32 %r1002; .reg .pred %r1003; .reg .u16 %r1004; .reg .u16 %r1005; .reg .pred %r1006; .reg .pred %r1007; .reg .u32 %r1009; .reg .u32 %r1010; .reg .u32 %r1012; .reg .u32 %r1013; .reg .u16 %r1014; .reg .u16 %r1015; .reg .u16 %r1016; .reg .u32 %r1017; .reg .u16 %r1018; .reg .pred %r1019; .reg .u32 %r1021; .reg .u32 %r1022; .reg .u32 %r1024; .reg .u32 %r1025; .reg .u16 %r1026; .reg .u16 %r1027; .reg .u16 %r1028; .reg .u32 %r1029; .reg .u16 %r1030; .reg .pred %r1031; .reg .u32 %r1033; .reg .u32 %r1034; .reg .u32 %r1036; .reg .u32 %r1037; .reg .u16 %r1038; .reg .u16 %r1039; .reg .u16 %r1040; .reg .u32 %r1041; .reg .u16 %r1042; .reg .pred %r1043; .reg .pred %r1044; .reg .u64 %r1045; .reg .u32 %r1046; .reg .u64 %r1047; .reg .u64 %r1048; .reg .u32 %r1049; .reg .pred %r1050; .reg .pred %r1051; .reg .u32 %r1052; .reg .pred %r1053; .reg .pred %r1054; .reg .u32 %r1069; .reg .u64 %r1070; .reg .u64 %r1071; .reg .u64 %r1072; .reg .u64 %r1073; .reg .u64 %r1074; .reg .u64 %r1075; .reg .u64 %r1076; .reg .u16 %r1077; .reg .u16 %r1078; .reg .u16 %r1079; .reg .u32 %r1080; .reg .u16 %r1081; .reg .pred %r1082; .reg .pred %r1083; .reg .u16 %r1084; .reg .pred %r1085; .reg .u64 %r1086; .reg .pred %r1088; .reg .u32 %r1093; .reg .u64 %r1095; .reg .u32 %r1100; .reg .u32 %r1101; .reg .u32 %r1103; .reg .u32 %r1104; .reg .u16 %r1105; .reg .u16 %r1106; .reg .u16 %r1107; 
.reg .u32 %r1109; .reg .u32 %r1110; .reg .u16 %r1112; .reg .u16 %r1114; .reg .u32 %r1115; .reg .u16 %r1116; .reg .pred %r1117; .reg .pred %r1118; .reg .u64 %r1119; .reg .u64 %r1121; .reg .u64 %r1122; .reg .u64 %r1123; .reg .pred %r1125; .reg .u64 %r1127; .reg .u64 %r1128; .reg .u64 %r1129; .reg .u64 %r1130; .reg .u64 %r1132; .reg .u64 %r1133; .reg .u64 %r1137; .reg .u64 %r1139; .reg .u64 %r1140; .reg .u64 %r1143; .reg .u64 %r1146; .reg .u64 %r1148; .reg .u64 %r1149; .reg .u64 %r1150; .reg .u64 %r1151; .reg .u64 %r1152; .reg .pred %r1154; .reg .u64 %r1160; .reg .u64 %r1161; .reg .u64 %r1162; .reg .u64 %r1163; .reg .pred %r1165; .reg .u64 %r1166; .reg .u64 %r1167; .reg .u64 %r1168; .reg .u64 %r1169; .reg .u64 %r1171; .reg .u64 %r1172; .reg .u64 %r1173; .reg .u64 %r1174; .reg .u64 %r1175; .reg .u64 %r1176; .reg .u64 %r1177; .reg .u64 %r1178; .reg .u64 %r1179; .reg .u64 %r1183; .reg .u64 %r1184; .reg .u64 %r1185; .reg .u64 %r1186; .reg .u64 %r1189; .reg .u64 %r1190; .reg .u64 %r1191; .reg .u64 %r1192; .reg .u64 %r1194; .reg .u64 %r1196; .reg .u64 %r1201; .reg .u64 %r1204; .reg .u64 %r1210; .reg .u64 %r1211; .reg .u64 %r1213; .reg .u64 %r1214; .reg .u16 %r1215; .reg .u16 %r1216; .reg .u64 %r1217; .reg .u64 %r1218; .reg .u16 %r1219; .reg .u16 %r1220; .reg .u64 %r1226; .reg .u64 %r1227; .reg .u32 %r1228; .reg .u32 %r1229; .reg .u32 %r1230; .reg .u32 %r1231; .reg .pred %r1232; .reg .u64 %r1235; .reg .u64 %r1236; .reg .u64 %r1237; .reg .u64 %r1238; .reg .u64 %r1240; .reg .pred %r1241; .reg .u64 %r1242; .reg .u64 %r1243; .reg .u16 %r1244; .reg .u64 %r1247; .reg .u64 %r1248; .reg .u64 %r1249; .reg .u64 %r1250; .reg .u64 %r1251; .reg .u64 %r1252; .reg .u64 %r1253; .reg .u64 %r1254; .reg .u64 %r1257; .reg .u64 %r1258; .reg .u64 %r1259; .reg .u64 %r1260; .reg .u64 %r1266; .reg .u16 %r1268; .reg .u16 %r1269; .reg .u16 %r1270; .reg .u16 %r1271; .reg .u64 %r1273; .reg .u16 %r1276; .reg .u16 %r1277; .reg .u16 %r1278; .reg .u64 %r1280; .reg .u16 %r1283; .reg .u16 %r1284; .reg .u16 %r1285; .reg .u16 %r1290; .reg .u16 %r1291; .reg .u64 %r1293; .reg .u16 %r1296; .reg .u16 %r1297; .reg .u16 %r1298; .reg .u16 %r1303; .reg .u16 %r1304; .reg .u64 %r1306; .reg .u16 %r1309; .reg .u16 %r1310; .reg .u16 %r1311; .reg .u16 %r1316; .reg .u16 %r1317; .reg .u64 %r1319; .reg .u16 %r1322; .reg .u16 %r1323; .reg .u16 %r1324; .reg .u64 %r1326; .reg .u16 %r1329; .reg .u16 %r1330; .reg .u16 %r1331; .reg .u16 %r1336; .reg .u16 %r1337; .reg .u16 %r1342; .reg .u16 %r1343; .reg .u16 %r1348; .reg .u16 %r1349; .reg .u16 %r1354; .reg .u16 %r1355; .reg .u16 %r1360; .reg .u16 %r1361; .reg .u16 %r1366; .reg .u16 %r1367; .reg .pred %r1369; .reg .u16 %r1370; .reg .u16 %r1371; .reg .u16 %r1372; .reg .u16 %r1373; .reg .u16 %r1374; .reg .u16 %r1375; .reg .u16 %r1376; .reg .u16 %r1377; .reg .u16 %r1378; .reg .u16 %r1379; .reg .u16 %r1380; .reg .u16 %r1381; .reg .u16 %r1382; .reg .u16 %r1383; .reg .u16 %r1384; .reg .u16 %r1385; .reg .pred %r1386; .reg .pred %r1387; .reg .u64 %r1391; .reg .u16 %r1393; .reg .u16 %r1394; .reg .u16 %r1396; .reg .u16 %r1399; .reg .u16 %r1402; .reg .u16 %r1405; .reg .u16 %r1408; .reg .u16 %r1411; .reg .u16 %r1414; .reg .u16 %r1417; .reg .u16 %r1418; .reg .u16 %r1419; .reg .u16 %r1420; .reg .pred %r1421; .reg .pred %r1422; .reg .pred %r1423; .reg .u64 %r1424; .reg .u64 %r1425; .reg .u64 %r1426; .reg .u64 %r1430; .reg .u64 %r1434; .reg .u64 %r1435; .reg .u64 %r1438; .reg .u16 %r1440; .reg .u16 %r1441; .reg .u16 %r1443; .reg .u16 %r1446; .reg .u16 %r1449; .reg .u64 %r1451; .reg .u16 %r1453; .reg .u16 %r1456; .reg .u64 
%r1458; .reg .u16 %r1460; .reg .u16 %r1463; .reg .u16 %r1466; .reg .u16 %r1467; .reg .u16 %r1468; .reg .u16 %r1469; .reg .pred %r1470; .reg .pred %r1471; .reg .u64 %r1473; .reg .u64 %r1474; .reg .u64 %r1475; .reg .u64 %r1477; .reg .u16 %r1479; .reg .u16 %r1480; .reg .u16 %r1482; .reg .u16 %r1485; .reg .pred %r1486; .reg .u16 %r1487; .reg .pred %r1488; .reg .u32 %r1490; .reg .u32 %r1491; .reg .u16 %r1492; .reg .u16 %r1493; .reg .u32 %r1495; .reg .u16 %r1496; .reg .pred %r1497; .reg .pred %r1498; .reg .u16 %r1499; .reg .pred %r1500; .reg .pred %r1501; .reg .pred %r1502; .reg .u64 %r1504; .reg .pred %r1506; .reg .u64 %r1507; .reg .u64 %r1508; .reg .u64 %r1509; .reg .u64 %r1510; .reg .u64 %r1512; .reg .u16 %r1514; .reg .u16 %r1515; .reg .u16 %r1517; .reg .u16 %r1520; .reg .pred %r1521; .reg .u64 %r1525; .reg .u64 %r1526; .reg .u64 %r1527; .reg .u16 %r1529; .reg .u16 %r1530; .reg .u16 %r1532; .reg .u16 %r1535; .reg .pred %r1536; .reg .pred %r1537; .reg .u64 %r1542; .reg .u16 %r1544; .reg .u16 %r1545; .reg .u16 %r1547; .reg .u16 %r1550; .reg .pred %r1551; .reg .pred %r1552; .reg .pred %r1553; .reg .pred %r1554; .reg .u64 %r1555; .reg .pred %r1558; .reg .u64 %r1563; .reg .u16 %r1565; .reg .u16 %r1566; .reg .u16 %r1567; .reg .u16 %r1568; .reg .pred %r1570; .reg .pred %r1572; .reg .u64 %r1574; .reg .u64 %r1575; .reg .pred %r1580; .reg .u64 %r1581; .reg .u64 %r1582; .reg .u64 %r1584; .reg .u16 %r1586; .reg .u16 %r1587; .reg .u16 %r1589; .reg .u16 %r1592; .reg .pred %r1593; .reg .pred %r1594; .reg .u64 %r1595; .reg .pred %r1596; .reg .u64 %r1597; .reg .u64 %r1598; .reg .u64 %r1599; .reg .u16 %r1600; .reg .u32 %r1601; .reg .pred %r1602; .reg .u16 %r1603; .reg .u32 %r1604; .reg .pred %r1605; .reg .u64 %r1606; .reg .u64 %r1607; .reg .u64 %r1608; .reg .u64 %r1609; .reg .u64 %r1610; .reg .u64 %r1611; .reg .u64 %r1612; .reg .u32 %r1613; .reg .pred %r1614; .reg .pred %r1615; .reg .pred %r1619; .reg .pred %r1621; .reg .pred %r1622; .reg .u64 %r1623; .reg .pred %r1624; .reg .pred %r1625; .reg .pred %r1626; .reg .pred %r1627; .reg .pred %r1629; .reg .pred %r1631; .reg .pred %r1632; .reg .u64 %r1633; .reg .u32 %r1634; .reg .u64 %r1641; .reg .u64 %r1642; .reg .u64 %r1643; .reg .u64 %r1644; .reg .u64 %r1649; .reg .u64 %r1650; .reg .u64 %r1651; .reg .u64 %r1652; .reg .u64 %r1653; .reg .u64 %r1655; .reg .u64 %r1656; .reg .u64 %r1657; .reg .u64 %r1658; .reg .u64 %r1659; .reg .pred %r1660; .reg .u64 %r1662; .reg .u64 %r1666; .reg .u64 %r1668; .reg .u64 %r1669; .reg .pred %r1670; .reg .u64 %r1671; .reg .u64 %r1672; .reg .pred %r1677; .reg .u32 %r1678; .reg .u32 %r1680; .reg .u32 %r1681; .reg .u16 %r1682; .reg .pred %r1684; .reg .u32 %r1685; .reg .u64 %r1687; .reg .u64 %r1688; .reg .u64 %r1689; .reg .u64 %r1690; .reg .u64 %r1691; .reg .pred %r1692; .reg .pred %r1693; .reg .pred %r1694; .reg .u64 %r1697; .reg .u64 %r1698; .reg .u64 %r1699; .reg .u64 %r1701; .reg .pred %r1702; .reg .u64 %r1704; .reg .u64 %r1705; .reg .u64 %r1706; .reg .u64 %r1707; .reg .u64 %r1709; .reg .pred %r1710; .reg .u64 %r1716; .reg .u64 %r1717; .reg .u64 %r1718; .reg .u64 %r1719; .reg .u64 %r1720; .reg .pred %r1721; .reg .pred %r1722; .reg .pred %r1723; .reg .u64 %r1726; .reg .u64 %r1727; .reg .u64 %r1728; .reg .u64 %r1729; .reg .u64 %r1730; .reg .pred %r1731; .reg .pred %r1732; .reg .pred %r1733; .reg .u64 %r1734; .reg .u64 %r1735; .reg .u64 %r1739; .reg .pred %r1740; .reg .pred %r1741; .reg .pred %r1742; .reg .u64 %r1743; .reg .u64 %r1744; .reg .u64 %r1745; .reg .u64 %r1746; .reg .u64 %r1747; .reg .u64 %r1748; .reg .pred %r1749; .reg .pred 
%r1750; .reg .pred %r1751; .reg .u64 %r1754; .reg .u64 %r1755; .reg .u64 %r1756; .reg .u64 %r1757; .reg .u64 %r1758; .reg .pred %r1759; .reg .pred %r1760; .reg .pred %r1761; .reg .u64 %r1766; .reg .pred %r1767; .reg .pred %r1768; .reg .pred %r1769; .reg .u64 %r1774; .reg .pred %r1775; .reg .pred %r1776; .reg .pred %r1777; .reg .u64 %r1782; .reg .pred %r1783; .reg .pred %r1784; .reg .pred %r1785; .reg .u64 %r1790; .reg .pred %r1791; .reg .pred %r1792; .reg .pred %r1793; .reg .u64 %r1796; .reg .u64 %r1797; .reg .u64 %r1799; .reg .u64 %r1800; .reg .u64 %r1803; .reg .u64 %r1804; .reg .u16 %r1805; .reg .u16 %r1806; .reg .u64 %r1807; .reg .u64 %r1808; .reg .u16 %r1809; .reg .u16 %r1810; .reg .u64 %r1811; .reg .u64 %r1812; .reg .u64 %r1814; .reg .u64 %r1815; .reg .u16 %r1816; .reg .u16 %r1817; .reg .u64 %r1818; .reg .u64 %r1819; .reg .u16 %r1820; .reg .u16 %r1821; .reg .u64 %r1822; .reg .u64 %r1823; .reg .u64 %r1825; .reg .u64 %r1826; .reg .u16 %r1827; .reg .u16 %r1828; .reg .u64 %r1829; .reg .u64 %r1830; .reg .u16 %r1831; .reg .u16 %r1832; .reg .pred %r1833; .reg .u16 %r1836; .reg .u16 %r1840; .reg .u16 %r1844; .reg .pred %r1845; .reg .u16 %r1856; .reg .u16 %r1857; .reg .u16 %r1859; .reg .u16 %r1860; .reg .u16 %r1861; .reg .u16 %r1863; .reg .u16 %r1864; .reg .u16 %r1865; .reg .u16 %r1867; .reg .u16 %r1868; .reg .u16 %r1869; .reg .u16 %r1871; .reg .u16 %r1872; .reg .pred %r1873; .reg .pred %r1874; .reg .u64 %r1881; .reg .u64 %r1883; .reg .u16 %r1884; .reg .u16 %r1885; .reg .u16 %r1887; .reg .u16 %r1888; .reg .u16 %r1889; .reg .u16 %r1891; .reg .u16 %r1892; .reg .u64 %r1893; .reg .u16 %r1894; .reg .u16 %r1896; .reg .u16 %r1897; .reg .u64 %r1898; .reg .u16 %r1899; .reg .u16 %r1901; .reg .u16 %r1902; .reg .pred %r1903; .reg .u16 %r1907; .reg .u16 %r1908; .reg .u16 %r1910; .reg .u16 %r1916; .reg .u16 %r1917; .reg .u16 %r1919; .reg .u16 %r1925; .reg .u16 %r1926; .reg .u16 %r1928; .reg .u16 %r1929; .reg .pred %r1930; .reg .pred %r1931; .reg .u16 %r1935; .reg .u16 %r1936; .reg .u16 %r1938; .reg .u16 %r1944; .reg .u16 %r1945; .reg .u16 %r1947; .reg .u16 %r1953; .reg .u16 %r1954; .reg .u16 %r1956; .reg .u16 %r1957; .reg .pred %r1958; .reg .pred %r1959; .reg .u64 %r1961; .reg .u16 %r1963; .reg .u16 %r1964; .reg .u16 %r1966; .reg .u64 %r1969; .reg .u16 %r1972; .reg .u16 %r1973; .reg .u16 %r1975; .reg .u64 %r1978; .reg .u16 %r1981; .reg .u16 %r1982; .reg .u16 %r1984; .reg .u16 %r1985; .reg .pred %r1986; .reg .pred %r1987; .reg .u64 %r1990; .reg .u64 %r1991; .reg .u16 %r1992; .reg .u16 %r1993; .reg .u16 %r1995; .reg .u64 %r1999; .reg .u64 %r2000; .reg .u16 %r2002; .reg .u16 %r2003; .reg .u16 %r2005; .reg .u64 %r2009; .reg .u64 %r2010; .reg .u16 %r2012; .reg .u16 %r2013; .reg .u16 %r2015; .reg .u16 %r2016; .reg .pred %r2017; .reg .u64 %r2019; .reg .u64 %r2021; .reg .u64 %r2024; .reg .pred %r2026; .reg .pred %r2027; .reg .pred %r2028; .reg .u64 %r2030; .reg .u64 %r2031; .reg .u16 %r2032; .reg .u16 %r2033; .reg .u16 %r2035; .reg .u16 %r2036; .reg .u64 %r2038; .reg .u64 %r2039; .reg .u16 %r2041; .reg .u16 %r2042; .reg .u16 %r2044; .reg .u16 %r2045; .reg .u64 %r2047; .reg .u64 %r2048; .reg .u16 %r2050; .reg .u16 %r2051; .reg .u16 %r2053; .reg .u16 %r2054; .reg .pred %r2055; .reg .pred %r2056; .reg .u64 %r2060; .reg .u64 %r2061; .reg .u16 %r2062; .reg .u16 %r2063; .reg .u16 %r2065; .reg .u64 %r2069; .reg .u64 %r2070; .reg .u16 %r2072; .reg .u16 %r2073; .reg .u16 %r2075; .reg .u64 %r2079; .reg .u64 %r2080; .reg .u16 %r2082; .reg .u16 %r2083; .reg .u16 %r2085; .reg .u16 %r2086; .reg .pred %r2087; .reg .u16 %r2088; 
.reg .u16 %r2089; .reg .u16 %r2091; .reg .u16 %r2092; .reg .u16 %r2094; .reg .u16 %r2095; .reg .u16 %r2097; .reg .u16 %r2098; .reg .u16 %r2100; .reg .u16 %r2101; .reg .u16 %r2103; .reg .u16 %r2104; .reg .pred %r2105; .reg .u64 %r2106; .reg .u64 %r2108; .reg .u16 %r2109; .reg .u16 %r2110; .reg .u16 %r2112; .reg .u16 %r2113; .reg .u16 %r2114; .reg .u16 %r2116; .reg .u16 %r2117; .reg .u64 %r2118; .reg .u16 %r2119; .reg .u16 %r2121; .reg .u16 %r2122; .reg .u64 %r2123; .reg .u16 %r2124; .reg .u16 %r2126; .reg .u16 %r2127; .reg .pred %r2128; .reg .u16 %r2130; .reg .u16 %r2131; .reg .u16 %r2133; .reg .u16 %r2134; .reg .u16 %r2135; .reg .u16 %r2137; .reg .u16 %r2138; .reg .u16 %r2139; .reg .u16 %r2141; .reg .u16 %r2142; .reg .u16 %r2143; .reg .u16 %r2145; .reg .u16 %r2146; .reg .pred %r2147; .reg .u16 %r2148; .reg .u16 %r2150; .reg .u16 %r2152; .reg .pred %r2153; .reg .u64 %r2154; .reg .u64 %r2155; .reg .u16 %r2156; .reg .u16 %r2157; .reg .u64 %r2158; .reg .u64 %r2159; .reg .u16 %r2160; .reg .u16 %r2161; .reg .u64 %r2163; .reg .u64 %r2164; .reg .u16 %r2165; .reg .u16 %r2166; .reg .u64 %r2167; .reg .u64 %r2168; .reg .u16 %r2169; .reg .u16 %r2170; .reg .u64 %r2172; .reg .u64 %r2173; .reg .u16 %r2174; .reg .u16 %r2175; .reg .u64 %r2176; .reg .u64 %r2177; .reg .u16 %r2178; .reg .u16 %r2179; .reg .pred %r2180; .reg .u32 %r2181; .reg .u64 %r2182; .reg .u32 %r2184; .reg .u64 %r2185; .reg .u32 %r2187; .reg .u64 %r2188; .reg .pred %r2189; .reg .u32 %r2190; .reg .u64 %r2191; .reg .u64 %r2193; .reg .u32 %r2195; .reg .u64 %r2196; .reg .u64 %r2198; .reg .u32 %r2200; .reg .u64 %r2201; .reg .pred %r2202; .reg .u16 %r2203; .reg .u16 %r2204; .reg .u16 %r2205; .reg .u16 %r2209; .reg .u16 %r2210; .reg .u16 %r2212; .reg .u16 %r2213; .reg .u16 %r2215; .reg .u16 %r2216; .reg .u16 %r2218; .reg .u16 %r2219; .reg .pred %r2220; .reg .u16 %r2221; .reg .u16 %r2222; .reg .u16 %r2223; .reg .u16 %r2227; .reg .u16 %r2228; .reg .u16 %r2230; .reg .u16 %r2231; .reg .u16 %r2233; .reg .u16 %r2234; .reg .u16 %r2236; .reg .u16 %r2237; .reg .pred %r2238; .reg .u16 %r2239; .reg .u16 %r2240; .reg .u16 %r2241; .reg .u16 %r2245; .reg .u16 %r2246; .reg .u16 %r2248; .reg .u16 %r2249; .reg .u16 %r2251; .reg .u16 %r2252; .reg .u16 %r2254; .reg .u16 %r2255; .reg .pred %r2256; .reg .u16 %r2257; .reg .u16 %r2258; .reg .u16 %r2260; .reg .u16 %r2261; .reg .u16 %r2263; .reg .u16 %r2264; .reg .u16 %r2266; .reg .u16 %r2267; .reg .u16 %r2269; .reg .u16 %r2270; .reg .u16 %r2272; .reg .u16 %r2273; .reg .pred %r2274; .reg .pred %r2277; .reg .u16 %r2278; .reg .u16 %r2279; .reg .u16 %r2280; .reg .u16 %r2284; .reg .u16 %r2285; .reg .u16 %r2287; .reg .u16 %r2288; .reg .u16 %r2290; .reg .u16 %r2291; .reg .u16 %r2293; .reg .u16 %r2294; .reg .pred %r2295; mov.u64 %r838,%ar0; mov.u64 %r839,%ar1; mov.u64 %r840,%ar2; mov.u32 %r841,%ar3; mov.u32 %r842,%ar4; mov.u64 %r843,%ar5; .loc 1 2462 3 ld.s8 %r583,[%r839+28]; cvt.u16.u32 %r844,%r583; setp.eq.u16 %r845,%r844,2; @ %r845 bra $L2; ld.s8 %r1634,[%r840+28]; cvt.u16.u32 %r846,%r1634; setp.eq.u16 %r848,%r846,2; @ %r848 bra $L3; cvta.const.u64 %r852,$LC3; cvta.const.u64 %r851,__func__$0; mov.u32 %r850,2462; cvta.const.u64 %r848583: .loc 1 2476 6 ld.u64 %r853,[%r838]; setp.ne.u64 %r854,%r853,0; @ %r854 bra $L4; .loc 1 2478 10 setp.ne.u16 %r856,%r844,1; @ %r856 bra $L5; .loc 1 2480 4 st.u64 [%r838+48],%r853; ld.u64 %r859,[%r840+80]; ld.u64 %r861,[%r840+72]; sub.u64 %r862,%r859,%r861; st.u64 [%r838+56],%r862; mov.u64 %r863,1; st.u64 [%r838+40],%r863; bra $L6; $L123: .loc 1 2485 4 st.u64 [%r838+48],%r1595; add.u64 
%r865,%r650,-1; st.u64 [%r838+56],%r865; mov.u64 %r866,1; st.u64 [%r838+40],%r866; bra $L6; $L124: .loc 1 2490 4 mov.u64 %r867,0; st.u64 [%r838+48],%r867; add.u64 %r868,%r650,-1; st.u64 [%r838+56],%r868; mov.u64 %r869,1; st.u64 [%r838+40],%r869; .loc 1 2493 11 st.u64 [%r838+72],%r867; ld.u64 %r872,[%r840+80]; ld.u64 %r874,[%r840+72]; sub.u64 %r875,%r872,%r874; st.u64 [%r838+80],%r875; st.u64 [%r838+64],%r650; $L6: .loc 1 2499 1838; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r877,[%value_in]; } .loc 1 2499 4 mov.u64 %r879,879; call (%value_in),_gfortrani_xmallocarray880,[%value_in]; } .loc 1 2499 2 st.u64 [%r838],%r880; .loc 1 2500 24 mov.u64 %r882,0; st.u64 [%r838+8],%r882; .loc 1 2557 7 ld.s8 %r583,[%r839+28]; .loc 1 2564 15 ld.u64 %r884,[%r839+56]; add.u64 %r883,%r884,1; .loc 1 2564 13 ld.u64 %r885,[%r839+48]; sub.u64 %r272,%r883,%r885; bra $L7; $L4: .loc 1 2502 12 cvta.global.u64 %r886,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r887,[%r886+36]; setp.eq.u32 %r888,%r887,0; @ %r888 bra $L8; .loc 1 2509 17 ld.u64 %r890,[%r838+56]; add.u64 %r889,%r890,1; .loc 1 2509 15 ld.u64 %r891,[%r838+48]; sub.u64 %r272,%r889,%r891; .loc 1 2506 10 setp.ne.u16 %r893,%r844,1; @ %r893 bra $L9; .loc 1 2508 17 ld.u64 %r895,[%r840+80]; add.u64 %r894,%r895,1; .loc 1 2508 15 ld.u64 %r896,[%r840+72]; sub.u64 %r346,%r894,%r896; .loc 1 2510 7 setp.ne.u64 %r897,%r346,%r272; @ %r897 bra $L10; .loc 1 2564 15 ld.u64 %r899,[%r839+56]; add.u64 %r898,%r899,1; .loc 1 2564 13 ld.u64 %r900,[%r839+48]; sub.u64 %r272,%r898,%r900; .loc 1 2543 7 ld.s8 %r443,[%r838+28]; .loc 1 2548 27 ld.u64 %r712,[%r838+40]; .loc 1 2543 6 cvt.u16.u32 %r901,%r443; setp.eq.u16 %r902,%r901,1; @ %r902 bra $L11; .loc 1 2553 16 ld.u64 %r581,[%r838+64]; .loc 1 2560 16 ld.u64 %r646,[%r839+40]; .loc 1 2563 14 mov.u64 %r320,1; .loc 1 2561 16 mov.u64 %r313,%r320; bra $L12; $L10: .loc 1 2511 6 st.u64 [%stack+8],%r346; st.u64 [%stack],%r272; cvta.const.u64 %r903903_gfortran_runtime_error2485 4 ld.u64 %r906,[%r839+56]; add.u64 %r905,%r906,1; ld.u64 %r907,[%r839+48]; sub.u64 %r655,%r905,%r907; .loc 1 2515 15 cvt.u16.u32 %r908,%r1634; setp.ne.u16 %r910,%r908,1; @ %r910 bra $L13; .loc 1 2519 7 setp.eq.u64 %r911,%r655,%r272; @ %r911 bra $L14; .loc 1 2520 6 st.u64 [%stack+8],%r655; st.u64 [%stack],%r272; cvta.const.u64 %r912912_gfortran_runtime_error13: .loc 1 2528 7 setp.eq.u64 %r914,%r655,%r272; @ %r914 bra $L15; .loc 1 2529 6 st.u64 [%stack+8],%r655; st.u64 [%stack],%r272; cvta.const.u64 %r915915_gfortran_runtime_error15: .loc 1 2533 17 ld.u64 %r918,[%r840+80]; add.u64 %r917,%r918,1; .loc 1 2533 15 ld.u64 %r919,[%r840+72]; sub.u64 %r343,%r917,%r919; .loc 1 2534 17 ld.u64 %r921,[%r838+80]; add.u64 %r920,%r921,1; .loc 1 2534 15 ld.u64 %r922,[%r838+72]; sub.u64 %r344,%r920,%r922; .loc 1 2535 7 setp.eq.u64 %r923,%r343,%r344; @ %r923 bra $L14; .loc 1 2536 6 st.u64 [%stack+8],%r343; st.u64 [%stack],%r344; cvta.const.u64 %r924,$LC9stack; call _gfortran_runtime_error2564 15 ld.u64 %r927,[%r839+56]; add.u64 %r926,%r927,1; .loc 1 2564 13 ld.u64 %r928,[%r839+48]; sub.u64 %r272,%r926,%r928; $L7: .loc 1 2548 27 ld.u64 %r581,[%r838+40]; .loc 1 2543 6 ld.s8 %r930,[%r838+28]; cvt.u16.u32 %r929,%r930; setp.eq.u16 %r931,%r929,1; @ %r931 bra $L130; $L127: .loc 1 2553 16 mov.u64 %r712,%r581; ld.u64 %r581,[%r838+64]; bra $L16; $L130: .loc 1 2548 27 mov.u64 %r712,%r581; $L16: .loc 1 2560 16 ld.u64 %r646,[%r839+40]; .loc 1 2557 6 cvt.u16.u32 %r932,%r583; setp.eq.u16 %r933,%r932,1; @ %r933 bra $L131; $L125: .loc 1 2569 16 ld.u64 %r313,[%r839+64]; .loc 1 2571 
15 ld.u64 %r935,[%r839+80]; add.u64 %r934,%r935,1; .loc 1 2571 13 ld.u64 %r936,[%r839+72]; mov.u64 %r320,%r272; sub.u64 %r272,%r934,%r936; bra $L12; $L131: .loc 1 2563 14 mov.u64 %r320,1; .loc 1 2561 16 mov.u64 %r313,%r320; $L12: .loc 1 2575 16 ld.u64 %r938,[%r840+56]; add.u64 %r937,%r938,1; ld.u64 %r939,[%r840+48]; sub.u64 %r75,%r937,%r939; .loc 1 2575 6 setp.eq.u64 %r940,%r75,%r272; @ %r940 bra $L17; .loc 1 2577 10 set.u32.gt.s64 %r942,%r272,0; neg.s32 %r943,%r942; .loc 1 2577 21 set.u32.gt.s64 %r945,%r75,0; neg.s32 %r946,%r945; cvt.u16.u32 %r948,%r943; cvt.u16.u32 %r949,%r946; or.b16 %r947,%r948,%r949; cvt.u32.u16 %r950,%r947; cvt.u16.u8 %r951,%r950; setp.eq.u16 %r952,%r951,0; @ %r952 bra $L17; .loc 1 2578 2 st.u64 [%stack+8],%r2729_gfortran_runtime_error17: .loc 1 2583 7 ld.s8 %r83,[%r840+28]; .loc 1 2586 16 ld.u64 %r645,[%r840+40]; .loc 1 2583 6 cvt.u16.u32 %r955,%r83; setp.eq.u16 %r956,%r955,1; @ %r956 bra $L132; .loc 1 2597 16 ld.u64 %r314,[%r840+64]; .loc 1 2598 16 ld.u64 %r958,[%r840+80]; add.u64 %r957,%r958,1; .loc 1 2598 14 ld.u64 %r959,[%r840+72]; sub.u64 %r321,%r957,%r959; bra $L18; $L132: .loc 1 2592 14 mov.u64 %r321,1; .loc 1 2591 16 mov.u64 %r314,256; $L18: .loc 1 2601 9 ld.u64 %r349,[%r839]; .loc 1 2602 9 ld.u64 %r350,[%r840]; .loc 1 2603 8 ld.u64 %r351,[%r838]; .loc 1 2612 28 set.u32.eq.u64 %r961,%r712,1; neg.s32 %r962,%r961; cvt.u32.u32 %r960,%r962; cvt.u32.u8 %r88,%r960; .loc 1 2612 46 set.u32.eq.u64 %r964,%r646,1; neg.s32 %r965,%r964; cvt.u32.u32 %r963,%r965; cvt.u32.u8 %r644,%r963; .loc 1 2612 7 set.u32.ne.u32 %r967,%r841,0; neg.s32 %r968,%r967; .loc 1 2612 16 cvt.u16.u32 %r970,%r968; cvt.u16.u32 %r971,%r88; and.b16 %r969,%r970,%r971; .loc 1 2612 6 cvt.u32.u16 %r972,%r969; cvt.u16.u8 %r973,%r972; setp.eq.u16 %r974,%r973,0; @ %r974 bra $L19; .loc 1 2612 63 set.u32.eq.u64 %r976,%r313,1; neg.s32 %r977,%r976; .loc 1 2612 51 cvt.u16.u32 %r979,%r977; cvt.u16.u32 %r980,%r644; or.b16 %r978,%r979,%r980; .loc 1 2612 33 cvt.u32.u16 %r981,%r978; cvt.u16.u8 %r982,%r981; setp.eq.u16 %r983,%r982,0; @ %r983 bra $L20; .loc 1 2613 20 set.u32.eq.u64 %r985,%r645,1; neg.s32 %r986,%r985; .loc 1 2613 37 set.u32.eq.u64 %r988,%r314,1; neg.s32 %r989,%r988; .loc 1 2613 25 cvt.u16.u32 %r991,%r986; cvt.u16.u32 %r992,%r989; or.b16 %r990,%r991,%r992; .loc 1 2613 7 cvt.u32.u16 %r993,%r990; cvt.u16.u8 %r994,%r993; setp.eq.u16 %r995,%r994,0; @ %r995 bra $L20; .loc 1 2615 13 cvt.rn.f32.s32 %r99,%r842; .loc 1 2614 12 cvt.rn.f32.s64 %r996,%r320; .loc 1 2614 50 cvt.rn.f32.s64 %r997,%r272; .loc 1 2614 47 mul.f32 %r998,%r996,%r997; .loc 1 2614 31 cvt.rn.f32.s64 %r999,%r321; .loc 1 2614 47 mul.f32 %r1000,%r998,%r999; .loc 1 2615 13 mul.f32 %r1001,%r99,%r99; mul.f32 %r1002,%r1001,%r99; .loc 1 2614 7 setp.gt.f32 %r1003,%r1000,%r1002; @ ! 
%r1003 bra $L19; .loc 1 2617 17 cvt.u32.u64 %r102,%r320; st.u32 [%frame+20],%r102; .loc 1 2617 29 cvt.u32.u64 %r103,%r321; st.u32 [%frame+16],%r103; .loc 1 2617 41 cvt.u32.u64 %r104,%r272; st.u32 [%frame+12],%r104; .loc 1 2617 52 cvt.u32.u64 %r105,%r581; st.u32 [%frame+8],%r105; .loc 1 2618 27 mov.u16 %r1004,1; st.u16 [%frame+26],%r1004; .loc 1 2618 36 mov.u16 %r1005,0; st.u16 [%frame+24],%r1005; .loc 1 2619 17 setp.ne.u64 %r1006,%r646,1; @ %r1006 bra $L22; cvt.u32.u64 %r331,%r313; bra $L23; $L22: cvt.u32.u64 %r331,%r646; $L23: st.u32 [%frame+4],%r331; .loc 1 2620 3 setp.ne.u64 %r1007,%r645,1; @ %r1007 bra $L24; cvt.u32.u64 %r332,%r314; bra $L25; $L24: cvt.u32.u64 %r332,%r645; $L25: st.u32 [%frame],%r332; .loc 1 2622 15 set.u32.gt.s32 %r1009,%r331,0; neg.s32 %r1010,%r1009; .loc 1 2622 26 set.u32.gt.s32 %r1012,%r332,0; neg.s32 %r1013,%r1012; .loc 1 2622 19 cvt.u16.u32 %r1015,%r1010; cvt.u16.u32 %r1016,%r1013; and.b16 %r1014,%r1015,%r1016; .loc 1 2622 10 cvt.u32.u16 %r1017,%r1014; cvt.u16.u8 %r1018,%r1017; setp.eq.u16 %r1019,%r1018,0; @ %r1019 bra $L19; .loc 1 2622 37 set.u32.gt.s32 %r1021,%r105,0; neg.s32 %r1022,%r1021; .loc 1 2622 46 set.u32.gt.s32 %r1024,%r102,1; neg.s32 %r1025,%r1024; .loc 1 2622 41 cvt.u16.u32 %r1027,%r1022; cvt.u16.u32 %r1028,%r1025; and.b16 %r1026,%r1027,%r1028; cvt.u32.u16 %r1029,%r1026; cvt.u16.u8 %r1030,%r1029; setp.eq.u16 %r1031,%r1030,0; @ %r1031 bra $L19; .loc 1 2622 55 set.u32.gt.s32 %r1033,%r103,1; neg.s32 %r1034,%r1033; .loc 1 2622 64 set.u32.gt.s32 %r1036,%r104,1; neg.s32 %r1037,%r1036; .loc 1 2622 59 cvt.u16.u32 %r1039,%r1034; cvt.u16.u32 %r1040,%r1037; and.b16 %r1038,%r1039,%r1040; cvt.u32.u16 %r1041,%r1038; cvt.u16.u8 %r1042,%r1041; setp.eq.u16 %r1043,%r1042,0; @ %r1043 bra $L19; .loc 1 2624 4 setp.ne.u64 %r1044,%r843,0; @ %r1044 bra $L27; cvta.const.u64 %r1048,$LC8; cvta.const.u64 %r1047,__func__$0; mov.u32 %r1046,2624; cvta.const.u64 %r1045104561004827: .loc 1 2626 17 and.b32 %r1049,%r841,2; .loc 1 2626 7 setp.ne.u32 %r1050,%r1049,0; @ %r1050 bra $L133; .loc 1 2629 35 setp.eq.u64 %r1051,%r646,1; @ %r1051 bra $L134; cvta.const.u64 %r322,$LC2; bra $L28; $L133: .loc 1 2627 13 cvta.const.u64 %r322,$LC1; bra $L28; $L134: .loc 1 2629 35 cvta.const.u64 %r322,$LC0; $L28: .loc 1 2631 17 and.b32 %r1052,%r841,4; .loc 1 2631 7 setp.ne.u32 %r1053,%r1052,0; @ %r1053 bra $L135; .loc 1 2634 35 setp.eq.u64 %r1054,%r645,1; @ %r1054 bra $L136; cvta.const.u64 %r323,$LC2; bra $L29; $L135: .loc 1 2632 13 cvta.const.u64 %r323,$LC1; bra $L29; $L136: .loc 1 2634 35 cvta.const.u64 %r323,$LC0; $L29: .loc 1 2636 4 add.u64 %r1070,%frame,8; add.u64 %r1071,%frame,24; add.u64 %r1072,%frame,4; add.u64 %r1073,%frame,26; add.u64 %r1074,%frame,12; add.u64 %r1075,%frame,16; add.u64 %r1076,%frame,20; mov.u32 %r1069,1;,.param .u64 %in_ar3,.param .u64 %in_ar4,.param .u64 %in_ar5,.param .u64 %in_ar6,.param .u64 %in_ar7,.param .u64 %in_ar8,.param .u64 %in_ar9,.param .u64 %in_ar10,.param .u64 %in_ar11,.param .u64 %in_ar12,.param .u32 %in_ar13,10761010741073.param .u64 %out_arg8; st.param.u64 [%out_arg8],%r1072; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r350; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%frame; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r1071; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r351; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r1070; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r1069; .param .u32 %out_arg15; st.param.u32 [%out_arg15],%r1069; call 
%r843,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14,%out_arg15),$LCT0; } bra $L1; $L19: .loc 1 2643 21 cvt.u16.u32 %r1078,%r88; cvt.u16.u32 %r1079,%r644; and.b16 %r1077,%r1078,%r1079; .loc 1 2643 6 cvt.u32.u16 %r1080,%r1077; cvt.u16.u8 %r1081,%r1080; setp.eq.u16 %r1082,%r1081,0; @ %r1082 bra $L31; .loc 1 2643 38 setp.ne.u64 %r1083,%r645,1; @ %r1083 bra $L20; .loc 1 2644 7 cvt.u16.u32 %r1084,%r83; setp.eq.u16 %r1085,%r1084,1; @ %r1085 bra $L32; .loc 1 2677 16 add.u64 %r1086,%r581,1; .loc 1 2678 9 add.u64 %r122,%r1086,%r1086; sub.u64 %r353,%r351,%r122; .loc 1 2687 7 setp.le.s64 %r1088,%r321,0; @ %r1088 bra $L33; add.u64 %r711,%r122,-2; mov.u64 %r710,%r351; add.u64 %r715,%r321,1; .loc 1 2689 22 add.u64 %r578,%r320,%r320; .loc 1 2687 13 mov.u64 %r402,%r645; setp.gt.s64 %r1631,%r320,0; and.b64 %r1739,%r321,3; setp.eq.u64 %r1742,%r1739,0; @ %r1742 bra $L34; setp.eq.u64 %r1741,%r1739,1; @ %r1741 bra $L294; setp.eq.u64 %r1740,%r1739,2; @ %r1740 bra $L295; bra $L356; $L35: .loc 1 2689 22 mov.u32 %r100935781095,[%value_in]; } $L36: .loc 1 2687 24 add.u64 %r1743,%r402,1; .loc 1 2687 7 add.u64 %r1796,%r710,%r711; .loc 1 2688 2 @ %r1631 bra $L232; bra $L357; $L34: @ %r1631 bra $L35; bra $L36; $L33: .loc 1 2692 28 set.u32.eq.u64 %r1100,%r272,0; neg.s32 %r1101,%r1100; .loc 1 2692 13 set.u32.eq.u64 %r1103,%r320,0; neg.s32 %r1104,%r1103; .loc 1 2692 28 cvt.u16.u32 %r1106,%r1101; cvt.u16.u32 %r1107,%r1104; or.b16 %r1105,%r1106,%r1107; .loc 1 2692 23 set.u32.eq.u64 %r1109,%r321,0; neg.s32 %r1110,%r1109; .loc 1 2692 28 cvt.u16.u32 %r1114,%r1110; or.b16 %r1112,%r1105,%r1114; cvt.u32.u16 %r1115,%r1112; cvt.u16.u8 %r1116,%r1115; setp.eq.u16 %r1117,%r1116,0; @ ! %r1117 bra $L1; .loc 1 2697 10 setp.eq.u64 %r1118,%r313,1; selp.u64 %r330,%r581,%r313,%r1118; .loc 1 2702 21 shl.b64 %r1119,%r330,8; .loc 1 2702 14 add.u64 %r358,%r1119,%r314; .loc 1 2706 12 min.s64 %r1121,%r358,65536; add.u64 %r1122,%r1121,%r11211221123,[%value_in]; } .loc 1 2710 7 setp.gt.s64 %r1125,%r321,0; @ %r1125 bra $L39; $L42: .loc 1 2911232918 7 bra $L1; $L39: .loc 1 2680 16 add.u64 %r1127,%r313,1; .loc 1 2681 9 add.u64 %r1128,%r1127,%r1127; sub.u64 %r355,%r349,%r1128; .loc 1 2683 16 add.u64 %r1129,%r314,1; .loc 1 2684 9 add.u64 %r1130,%r1129,%r1129; shl.b64 %r687,%r314,9; shl.b64 %r689,%r314,10; shl.b64 %r699,%r314,2; sub.u64 %r690,%r350,%r1130; add.u64 %r688,%r699,%r690; shl.b64 %r693,%r581,9; add.u64 %r694,%r122,-2; shl.b64 %r695,%r581,2; neg.s64 %r1132,%r314; shl.b64 %r1133,%r1132,10; shl.b64 %r701,%r581,10; mov.u64 %r700,%r351; add.u64 %r1137,%r321,-1; and.b64 %r706,%r1137,-512; shl.b64 %r663,%r313,8; shl.b64 %r665,%r313,9; add.u64 %r666,%r1128,-2; neg.s64 %r1139,%r313; shl.b64 %r1140,%r1139,9; shl.b64 %r1143,%r1139,2; shl.b64 %r606,%r313,2; add.u64 %r709,%r1130,-2; shl.b64 %r459,%r314,3; shl.b64 %r454,%r581,3; shl.b64 %r1146,%r1132,3; shl.b64 %r696,%r1132,1; mov.u64 %r691,%r694; mov.u64 %r685,%r314; mov.u64 %r683,0; setp.gt.s64 %r1626,%r272,0; add.u64 %r1669,%r696,2; setp.gt.s64 %r1670,%r320,0; add.u64 %r1671,%r320,1; add.u64 %r1672,%r355,4; bra $L41; $L138: mov.u64 %r683,%r684; $L41: add.u64 %r708,%r683,1; .loc 1 2715 9 sub.u64 %r360,%r321,%r683; min.s64 %r1797,%r360,512; .loc 1 2716 24 shr.s64 %r1148,%r1797,63; shr.u64 %r1149,%r1148,62; add.u64 %r1150,%r1797,%r1149; and.b64 %r1151,%r1150,3; sub.u64 %r1152,%r1151,%r1149; .loc 1 2716 10 sub.u64 %r361,%r1797,%r1152; .loc 1 2718 4 @ %r1626 bra $L40; $L45: .loc 1 2710 7 add.u64 %r684,%r683,512; add.u64 %r685,%r685,%r687; add.u64 %r688,%r688,%r689; add.u64 
%r691,%r691,%r693; add.u64 %r695,%r695,%r693; add.u64 %r696,%r696,%r1133; add.u64 %r699,%r699,%r689; add.u64 %r700,%r700,%r701; setp.ne.u64 %r1154,%r683,%r706; @ %r1154 bra $L138; bra $L42; $L40: add.u64 %r660,%r1669,%r688; .loc 1 2768 13 add.u64 %r624,%r361,%r708; add.u64 %r440,%r699,%r709; add.u64 %r433,%r440,%r709; add.u64 %r679,%r688,%r709; add.u64 %r680,%r679,%r709; mul.lo.u64 %r541,%r624,%r314; mad.lo.u64 %r676,%r624,%r709,%r690; mul.lo.u64 %r535,%r624,%r581; mov.u64 %r667,%r1143; mov.u64 %r664,%r666; mov.u64 %r661,%r313; .loc 1 2718 12 mov.u64 %r364,%r645; add.u64 %r1623,%r272,1; .loc 1 2768 8 add.u64 %r1662,%r624,-1; sub.u64 %r1666,%r691,%r581; add.u64 %r1668,%r581,%r691; $L44: .loc 1 2723 13 sub.u64 %r362,%r1623,%r364; min.s64 %r1799,%r362,256; .loc 1 2724 28 shr.u64 %r1160,%r1799,63; add.u64 %r1161,%r1799,%r1160; and.b64 %r1162,%r1161,1; sub.u64 %r1163,%r1162,%r1160; .loc 1 2724 14 sub.u64 %r363,%r1799,%r1163; .loc 1 2727 8 @ %r1670 bra $L43; $L73: .loc 1 2718 30 add.u64 %r364,%r364,256; .loc 1 2718 4 add.u64 %r660,%r660,512; add.u64 %r661,%r661,%r663; add.u64 %r664,%r664,%r665; add.u64 %r667,%r667,%r1140; setp.ge.s64 %r1165,%r272,%r364; @ %r1165 bra $L44; bra $L45; $L43: .loc 1 2763 26 add.u64 %r1166,%r364,-1; add.u64 %r1167,%r1166,%r1799; .loc 1 2763 31 mul.lo.u64 %r421,%r1167,%r313; .loc 1 2734 13 add.u64 %r623,%r363,%r364; .loc 1 2734 8 add.u64 %r625,%r623,-1; add.u64 %r1168,%r421,1; add.u64 %r1169,%r1168,%r1168; add.u64 %r442,%r355,%r1169; add.u64 %r682,%r664,%r666; add.u64 %r681,%r666,%r667; shl.b64 %r1171,%r1799,1; add.u64 %r1172,%r1171,-2; add.u64 %r82,%r1123,%r1172; add.u64 %r1173,%r364,%r685; add.u64 %r1174,%r1173,%r1799; add.u64 %r1175,%r1174,%r1174; add.u64 %r458,%r1175,%r690; add.u64 %r1176,%r364,%r364; add.u64 %r329,%r1176,%r688; add.u64 %r327,%r1176,%r679; add.u64 %r325,%r1176,%r680; add.u64 %r677,%r676,%r1176; add.u64 %r1177,%r541,%r364; add.u64 %r1178,%r1177,%r1799; add.u64 %r1179,%r1178,%r1178; add.u64 %r539,%r1179,%r690; .loc 1 2768 8 mov.u64 %r68,%r700; .loc 1 2727 16 mov.u64 %r378,%r645; setp.le.s64 %r1615,%r364,%r625; setp.gt.s64 %r1625,%r1799,%r363; setp.ge.s64 %r1632,%r1662,%r708; setp.gt.s64 %r1624,%r1797,%r361; not.b64 %r1655,%r364; add.u64 %r1656,%r1655,%r623; and.b64 %r1657,%r1656,-2; .loc 1 2904 18 add.u64 %r1658,%r1799,%r364; .loc 1 2904 13 add.u64 %r1659,%r1658,-1; setp.gt.s64 %r1660,%r364,%r1659; $L72: .loc 1 2732 10 sub.u64 %r365,%r1671,%r378; min.s64 %r1800,%r365,256; .loc 1 2733 25 shr.s64 %r1633,%r1800,63; shr.u64 %r1183,%r1800,63; add.u64 %r1184,%r1800,%r1183; and.b64 %r1185,%r1184,1; sub.u64 %r1186,%r1185,%r1183; .loc 1 2733 11 sub.u64 %r366,%r1800,%r1186; .loc 1 2735 5 @ %r1615 bra $L46; $L55: .loc 1 2757 8 @ ! 
%r1625 bra $L48; bra $L47; $L46: .loc 1 2737 17 add.u64 %r627,%r366,%r378; .loc 1 2737 12 add.u64 %r628,%r627,-1; add.u64 %r1189,%r661,-1; add.u64 %r1190,%r1189,%r378; add.u64 %r1191,%r1190,%r1800; add.u64 %r1192,%r1191,%r1191; add.u64 %r609,%r355,%r1192; shl.b64 %r1194,%r1800,9; add.u64 %r1196,%r1194,-512; add.u64 %r602,%r1123,%r1196; add.u64 %r593,%r661,%r313; shl.b64 %r1201,%r1800,1; sub.u64 %r67,%r666,%r1201; mov.u64 %r594,%r667; mov.u64 %r595,%r664; mov.u64 %r607,0; setp.le.s64 %r1619,%r378,%r628; setp.gt.s64 %r1627,%r1800,%r366; add.u64 %r1649,%r67,2; not.b64 %r1650,%r378; add.u64 %r1651,%r1650,%r627; and.b64 %r1652,%r1651,-2; add.u64 %r1653,%r1652,%r378; bra $L54; $L139: mov.u64 %r607,%r608; $L54: .loc 1 2738 9 @ %r1619 bra $L49; $L53: .loc 1 2749 12 @ %r1627 bra $L50; bra $L51; $L49: add.u64 %r1204,%r607,%r607; add.u64 %r38,%r1123,%r1204; add.u64 %r637,%r1649,%r609; add.u64 %r1210,%r1653,%r593; add.u64 %r1211,%r1210,%r1210; add.u64 %r610,%r1211,%r1672; sub.u64 %r1727,%r610,%r637; add.u64 %r1728,%r1727,-4; shr.u64 %r1726,%r1728,2; add.u64 %r1729,%r1726,1; and.b64 %r1730,%r1729,3; setp.eq.u64 %r1733,%r1730,0; @ %r1733 bra $L52; setp.eq.u64 %r1732,%r1730,1; @ %r1732 bra $L296; setp.eq.u64 %r1731,%r1730,2; @ %r1731 bra $L297; bra $L358; $L52: .loc 1 2741 7 add.u64 %r1213,%r595,%r637; add.u64 %r1214,%r1213,%r594; .loc 1 2740 49 ld.u16 %r1215,[%r1214]; st.u16 [%r38],%r1215; .loc 1 2742 49 ld.u16 %r1216,[%r637]; st.u16 [%r38+2],%r1216; .loc 1 2745 7 add.u64 %r1217,%r594,%r637; add.u64 %r1218,%r1217,%r595; .loc 1 2744 49 ld.u16 %r1219,[%r1218+2]; st.u16 [%r38+512],%r1219; .loc 1 2746 49 ld.u16 %r1220,[%r637+2]; st.u16 [%r38+514],%r1220; .loc 1 2738 9 add.u64 %r1734,%r38,1024; add.u64 %r1735,%r637,4; .loc 1 2741 7 add.u64 %r1803,%r595,%r1735; add.u64 %r1804,%r1803,%r594; .loc 1 2740 49 ld.u16 %r1805,[%r1804]; st.u16 [%r1734],%r1805; .loc 1 2742 49 ld.u16 %r1806,[%r1735]; st.u16 [%r1734+2],%r1806; .loc 1 2745 7 add.u64 %r1807,%r594,%r1735; add.u64 %r1808,%r1807,%r595; .loc 1 2744 49 ld.u16 %r1809,[%r1808+2]; st.u16 [%r1734+512],%r1809; .loc 1 2746 49 ld.u16 %r1810,[%r1735+2]; st.u16 [%r1734+514],%r1810; .loc 1 2738 9 add.u64 %r1811,%r38,2048; add.u64 %r1812,%r637,8; .loc 1 2741 7 add.u64 %r1814,%r595,%r1812; add.u64 %r1815,%r1814,%r594; .loc 1 2740 49 ld.u16 %r1816,[%r1815]; st.u16 [%r1811],%r1816; .loc 1 2742 49 ld.u16 %r1817,[%r1812]; st.u16 [%r1811+2],%r1817; .loc 1 2745 7 add.u64 %r1818,%r594,%r1812; add.u64 %r1819,%r1818,%r595; .loc 1 2744 49 ld.u16 %r1820,[%r1819+2]; st.u16 [%r1811+512],%r1820; .loc 1 2746 49 ld.u16 %r1821,[%r1812+2]; st.u16 [%r1811+514],%r1821; .loc 1 2738 9 add.u64 %r1822,%r38,3072; add.u64 %r1823,%r637,12; .loc 1 2741 7 add.u64 %r1825,%r595,%r1823; add.u64 %r1826,%r1825,%r594; .loc 1 2740 49 ld.u16 %r1827,[%r1826]; st.u16 [%r1822],%r1827; .loc 1 2742 49 ld.u16 %r1828,[%r1823]; st.u16 [%r1822+2],%r1828; .loc 1 2745 7 add.u64 %r1829,%r594,%r1823; add.u64 %r1830,%r1829,%r595; .loc 1 2744 49 ld.u16 %r1831,[%r1830+2]; st.u16 [%r1822+512],%r1831; .loc 1 2746 49 ld.u16 %r1832,[%r1823+2]; st.u16 [%r1822+514],%r1832; .loc 1 2738 9 add.u64 %r38,%r38,4096; add.u64 %r637,%r637,16; setp.ne.u64 %r1833,%r610,%r637; @ %r1833 bra $L52; bra $L53; $L50: .loc 1 2754 10 add.u64 %r1226,%r609,%r681; add.u64 %r1227,%r1226,%r682; .loc 1 2751 41 ld.u16 %r1228,[%r1227]; shl.b32 %r1229,%r1228,16; ld.u16 %r1230,[%r609]; or.b32 %r1231,%r1230,%r1229; st.u32 [%r602],%r1231; $L51: .loc 1 2735 5 add.u64 %r608,%r607,2; add.u64 %r609,%r609,%r606; add.u64 %r602,%r602,4; add.u64 %r595,%r595,%r606; 
add.u64 %r594,%r594,%r1143; add.u64 %r593,%r593,%r666; setp.ne.u64 %r1232,%r1657,%r607; @ %r1232 bra $L139; bra $L55; $L48: .loc 1 2767 25 shr.u64 %r1235,%r1633,62; add.u64 %r1236,%r1800,%r1235; and.b64 %r1237,%r1236,3; sub.u64 %r1238,%r1237,%r1235; .loc 1 2767 11 sub.u64 %r369,%r1800,%r1238; .loc 1 2769 5 @ %r1632 bra $L56; bra $L57; $L47: .loc 1 2759 17 add.u64 %r148,%r1800,%r378; .loc 1 2759 12 add.u64 %r1240,%r148,-1; .loc 1 2760 9 setp.lt.s64 %r1241,%r1240,%r378; @ %r1241 bra $L48; add.u64 %r1242,%r148,%r421; add.u64 %r1243,%r1242,%r1242; add.u64 %r39,%r355,%r1243; mov.u64 %r100,%r82; mov.u64 %r120,%r442; sub.u64 %r1717,%r39,%r442; add.u64 %r1718,%r1717,-2; shr.u64 %r1716,%r1718,1; add.u64 %r1719,%r1716,1; and.b64 %r1720,%r1719,3; setp.eq.u64 %r1723,%r1720,0; @ %r1723 bra $L58; setp.eq.u64 %r1722,%r1720,1; @ %r1722 bra $L298; setp.eq.u64 %r1721,%r1720,2; @ %r1721 bra $L299; bra $L359; $L58: .loc 1 2762 43 ld.u16 %r1244,[%r120]; st.u16 [%r100],%r1244; ld.u16 %r1836,[%r120+2]; st.u16 [%r100+512],%r1836; ld.u16 %r1840,[%r120+4]; st.u16 [%r100+1024],%r1840; ld.u16 %r1844,[%r120+6]; st.u16 [%r100+1536],%r1844; .loc 1 2760 9 add.u64 %r120,%r120,8; add.u64 %r100,%r100,2048; setp.ne.u64 %r1845,%r39,%r120; @ %r1845 bra $L58; bra $L48; $L57: .loc 1 2871 8 @ ! %r1624 bra $L60; bra $L59; $L56: .loc 1 2876 14 add.u64 %r631,%r369,%r378; .loc 1 2876 9 add.u64 %r632,%r631,-1; add.u64 %r452,%r68,%r694; add.u64 %r451,%r452,%r694; add.u64 %r449,%r451,%r694; not.b64 %r1247,%r378; add.u64 %r1248,%r1247,%r631; shr.u64 %r1249,%r1248,2; add.u64 %r1250,%r1249,1; shl.b64 %r465,%r1250,3; add.u64 %r1251,%r1666,%r631; add.u64 %r1252,%r1251,%r1251; add.u64 %r424,%r1252,%r353; add.u64 %r1253,%r631,%r691; add.u64 %r1254,%r1253,%r1253; add.u64 %r420,%r1254,%r353; add.u64 %r1257,%r1668,%r631; add.u64 %r1258,%r1257,%r1257; add.u64 %r336,%r1258,%r353; add.u64 %r1259,%r631,%r695; add.u64 %r1260,%r1259,%r1259; add.u64 %r333,%r1260,%r353; shl.b64 %r506,%r369,8; add.u64 %r429,%r1800,%r1800; mov.u64 %r434,%r433; mov.u64 %r444,%r440; mov.u64 %r445,%r699; mov.u64 %r447,%r696; mov.u64 %r455,%r68; mov.u64 %r460,%r458; mov.u64 %r383,%r708; mov.u64 %r431,0; mov.u64 %r432,%r431; setp.le.s64 %r1621,%r378,%r632; setp.gt.s64 %r1629,%r1800,%r369; $L68: .loc 1 2772 9 @ %r1621 bra $L61; $L67: .loc 1 2843 12 @ ! 
%r1629 bra $L63; bra $L62; $L61: .loc 1 2904 13 mov.u64 %r478,0; mov.u64 %r479,2048; mov.u64 %r480,1536; mov.u64 %r481,1024; mov.u64 %r482,-512; $L66: add.u64 %r476,%r455,%r478; .loc 1 2774 10 ld.s16 %r384,[%r476]; .loc 1 2775 10 ld.s16 %r385,[%r476+2]; add.u64 %r474,%r452,%r478; .loc 1 2776 10 ld.s16 %r386,[%r474]; .loc 1 2777 10 ld.s16 %r387,[%r474+2]; add.u64 %r472,%r451,%r478; .loc 1 2778 10 ld.s16 %r388,[%r472]; .loc 1 2779 10 ld.s16 %r389,[%r472+2]; add.u64 %r470,%r449,%r478; .loc 1 2780 10 ld.s16 %r390,[%r470]; .loc 1 2781 10 ld.s16 %r391,[%r470+2]; .loc 1 2782 10 ld.s16 %r392,[%r476+4]; .loc 1 2783 10 ld.s16 %r393,[%r476+6]; .loc 1 2784 10 ld.s16 %r394,[%r474+4]; .loc 1 2785 10 ld.s16 %r395,[%r474+6]; .loc 1 2786 10 ld.s16 %r396,[%r472+4]; .loc 1 2787 10 ld.s16 %r397,[%r472+6]; .loc 1 2788 10 ld.s16 %r398,[%r470+4]; .loc 1 2789 10 ld.s16 %r399,[%r470+6]; .loc 1 2791 6 @ %r1660 bra $L64; shl.b64 %r1266,%r478,8; add.u64 %r498,%r1123,%r1266; add.u64 %r496,%r431,%r660; $L65: .loc 1 2794 11 ld.u16 %r1268,[%r498]; ld.u16 %r1269,[%r496]; .loc 1 2793 14 cvt.u16.u32 %r1271,%r384; mad.lo.u16 %r1270,%r1268,%r1269,%r1271; cvt.s32.s16 %r384,%r1270; add.u64 %r491,%r482,%r498; .loc 1 2795 19 add.u64 %r1273,%r491,%r481; .loc 1 2796 11 ld.u16 %r1276,[%r1273]; .loc 1 2795 14 cvt.u16.u32 %r1278,%r385; mad.lo.u16 %r1277,%r1269,%r1276,%r1278; cvt.s32.s16 %r385,%r1277; add.u64 %r488,%r447,%r496; .loc 1 2798 14 add.u64 %r1280,%r488,%r445; .loc 1 2798 11 ld.u16 %r1283,[%r1280]; .loc 1 2797 14 cvt.u16.u32 %r1285,%r386; mad.lo.u16 %r1284,%r1268,%r1283,%r1285; cvt.s32.s16 %r386,%r1284; .loc 1 2799 14 cvt.u16.u32 %r1291,%r387; mad.lo.u16 %r1290,%r1276,%r1283,%r1291; cvt.s32.s16 %r387,%r1290; .loc 1 2802 14 add.u64 %r1293,%r488,%r444; .loc 1 2802 11 ld.u16 %r1296,[%r1293]; .loc 1 2801 14 cvt.u16.u32 %r1298,%r388; mad.lo.u16 %r1297,%r1268,%r1296,%r1298; cvt.s32.s16 %r388,%r1297; .loc 1 2803 14 cvt.u16.u32 %r1304,%r389; mad.lo.u16 %r1303,%r1276,%r1296,%r1304; cvt.s32.s16 %r389,%r1303; .loc 1 2806 14 add.u64 %r1306,%r488,%r434; .loc 1 2806 11 ld.u16 %r1309,[%r1306]; .loc 1 2805 14 cvt.u16.u32 %r1311,%r390; mad.lo.u16 %r1310,%r1268,%r1309,%r1311; cvt.s32.s16 %r390,%r1310; .loc 1 2807 14 cvt.u16.u32 %r1317,%r391; mad.lo.u16 %r1316,%r1276,%r1309,%r1317; cvt.s32.s16 %r391,%r1316; .loc 1 2809 19 add.u64 %r1319,%r491,%r480; .loc 1 2810 11 ld.u16 %r1322,[%r1319]; .loc 1 2809 14 cvt.u16.u32 %r1324,%r392; mad.lo.u16 %r1323,%r1269,%r1322,%r1324; cvt.s32.s16 %r392,%r1323; .loc 1 2811 19 add.u64 %r1326,%r491,%r479; .loc 1 2812 11 ld.u16 %r1329,[%r1326]; .loc 1 2811 14 cvt.u16.u32 %r1331,%r393; mad.lo.u16 %r1330,%r1269,%r1329,%r1331; cvt.s32.s16 %r393,%r1330; .loc 1 2813 14 cvt.u16.u32 %r1337,%r394; mad.lo.u16 %r1336,%r1283,%r1322,%r1337; cvt.s32.s16 %r394,%r1336; .loc 1 2815 14 cvt.u16.u32 %r1343,%r395; mad.lo.u16 %r1342,%r1283,%r1329,%r1343; cvt.s32.s16 %r395,%r1342; .loc 1 2817 14 cvt.u16.u32 %r1349,%r396; mad.lo.u16 %r1348,%r1296,%r1322,%r1349; cvt.s32.s16 %r396,%r1348; .loc 1 2819 14 cvt.u16.u32 %r1355,%r397; mad.lo.u16 %r1354,%r1296,%r1329,%r1355; cvt.s32.s16 %r397,%r1354; .loc 1 2821 14 cvt.u16.u32 %r1361,%r398; mad.lo.u16 %r1360,%r1309,%r1322,%r1361; cvt.s32.s16 %r398,%r1360; .loc 1 2823 14 cvt.u16.u32 %r1367,%r399; mad.lo.u16 %r1366,%r1309,%r1329,%r1367; cvt.s32.s16 %r399,%r1366; .loc 1 2791 6 add.u64 %r498,%r498,2; add.u64 %r496,%r496,2; setp.ne.u64 %r1369,%r460,%r496; @ %r1369 bra $L65; $L64: .loc 1 2826 24 cvt.u16.u32 %r1370,%r384; st.u16 [%r476],%r1370; .loc 1 2827 28 cvt.u16.u32 %r1371,%r385; st.u16 
[%r476+2],%r1371; .loc 1 2828 30 cvt.u16.u32 %r1372,%r386; st.u16 [%r474],%r1372; .loc 1 2829 34 cvt.u16.u32 %r1373,%r387; st.u16 [%r474+2],%r1373; .loc 1 2830 30 cvt.u16.u32 %r1374,%r388; st.u16 [%r472],%r1374; .loc 1 2831 34 cvt.u16.u32 %r1375,%r389; st.u16 [%r472+2],%r1375; .loc 1 2832 30 cvt.u16.u32 %r1376,%r390; st.u16 [%r470],%r1376; .loc 1 2833 34 cvt.u16.u32 %r1377,%r391; st.u16 [%r470+2],%r1377; .loc 1 2834 28 cvt.u16.u32 %r1378,%r392; st.u16 [%r476+4],%r1378; .loc 1 2835 28 cvt.u16.u32 %r1379,%r393; st.u16 [%r476+6],%r1379; .loc 1 2836 34 cvt.u16.u32 %r1380,%r394; st.u16 [%r474+4],%r1380; .loc 1 2837 34 cvt.u16.u32 %r1381,%r395; st.u16 [%r474+6],%r1381; .loc 1 2838 34 cvt.u16.u32 %r1382,%r396; st.u16 [%r472+4],%r1382; .loc 1 2839 34 cvt.u16.u32 %r1383,%r397; st.u16 [%r472+6],%r1383; .loc 1 2840 34 cvt.u16.u32 %r1384,%r398; st.u16 [%r470+4],%r1384; .loc 1 2841 34 cvt.u16.u32 %r1385,%r399; st.u16 [%r470+6],%r1385; .loc 1 2772 9 add.u64 %r482,%r482,-2048; add.u64 %r481,%r481,2048; add.u64 %r480,%r480,2048; add.u64 %r479,%r479,2048; add.u64 %r478,%r478,8; setp.ne.u64 %r1386,%r465,%r478; @ %r1386 bra $L66; bra $L67; $L63: .loc 1 2769 29 add.u64 %r383,%r383,4; .loc 1 2769 5 add.u64 %r460,%r460,%r459; add.u64 %r455,%r455,%r454; add.u64 %r452,%r452,%r454; add.u64 %r451,%r451,%r454; add.u64 %r449,%r449,%r454; add.u64 %r447,%r447,%r1146; add.u64 %r445,%r445,%r459; add.u64 %r444,%r444,%r459; add.u64 %r434,%r434,%r459; add.u64 %r432,%r432,%r454; add.u64 %r431,%r431,%r459; setp.le.s64 %r1387,%r383,%r1662; @ %r1387 bra $L68; bra $L57; $L62: add.u64 %r511,%r424,%r432; add.u64 %r510,%r420,%r432; add.u64 %r509,%r336,%r432; add.u64 %r508,%r333,%r432; add.u64 %r328,%r329,%r431; add.u64 %r326,%r327,%r431; add.u64 %r324,%r325,%r431; add.u64 %r428,%r429,%r455; .loc 1 2904 13 mov.u64 %r507,%r506; $L71: .loc 1 2848 14 ld.s16 %r379,[%r511]; .loc 1 2849 14 ld.s16 %r380,[%r510]; .loc 1 2850 14 ld.s16 %r381,[%r509]; .loc 1 2851 14 ld.s16 %r382,[%r508]; .loc 1 2853 10 @ %r1660 bra $L69; add.u64 %r1391,%r507,%r507; add.u64 %r525,%r1123,%r1391; add.u64 %r523,%r431,%r660; mov.u64 %r518,%r324; mov.u64 %r520,%r326; mov.u64 %r521,%r328; sub.u64 %r1706,%r460,%r523; add.u64 %r1707,%r1706,-2; shr.u64 %r1705,%r1707,1; and.b64 %r1709,%r1705,1; setp.ne.u64 %r1710,%r1709,0; @ ! 
%r1710 bra $L360; $L70: .loc 1 2856 13 ld.u16 %r1393,[%r523]; ld.u16 %r1394,[%r525]; .loc 1 2855 11 cvt.u16.u32 %r1396,%r379; .loc 1 2858 13 ld.u16 %r1399,[%r521]; .loc 1 2857 11 cvt.u16.u32 %r1402,%r380; .loc 1 2860 13 ld.u16 %r1405,[%r520]; .loc 1 2859 11 cvt.u16.u32 %r1408,%r381; .loc 1 2862 13 ld.u16 %r1411,[%r518]; .loc 1 2861 11 cvt.u16.u32 %r1414,%r382; .loc 1 2856 13 ld.u16 %r1856,[%r523+2]; ld.u16 %r1857,[%r525+2]; .loc 1 2855 11 mad.lo.u16 %r1859,%r1393,%r1394,%r1396; mad.lo.u16 %r1860,%r1856,%r1857,%r1859; cvt.s32.s16 %r379,%r1860; .loc 1 2858 13 ld.u16 %r1861,[%r521+2]; .loc 1 2857 11 mad.lo.u16 %r1863,%r1399,%r1394,%r1402; mad.lo.u16 %r1864,%r1861,%r1857,%r1863; cvt.s32.s16 %r380,%r1864; .loc 1 2860 13 ld.u16 %r1865,[%r520+2]; .loc 1 2859 11 mad.lo.u16 %r1867,%r1405,%r1394,%r1408; mad.lo.u16 %r1868,%r1865,%r1857,%r1867; cvt.s32.s16 %r381,%r1868; .loc 1 2862 13 ld.u16 %r1869,[%r518+2]; .loc 1 2861 11 mad.lo.u16 %r1871,%r1411,%r1394,%r1414; mad.lo.u16 %r1872,%r1869,%r1857,%r1871; cvt.s32.s16 %r382,%r1872; .loc 1 2853 10 add.u64 %r525,%r525,4; add.u64 %r523,%r523,4; add.u64 %r521,%r521,4; add.u64 %r520,%r520,4; add.u64 %r518,%r518,4; setp.ne.u64 %r1873,%r460,%r523; @ %r1873 bra $L70; $L69: .loc 1 2864 28 cvt.u16.u32 %r1417,%r379; st.u16 [%r511],%r1417; .loc 1 2865 34 cvt.u16.u32 %r1418,%r380; st.u16 [%r510],%r1418; .loc 1 2866 34 cvt.u16.u32 %r1419,%r381; st.u16 [%r509],%r1419; .loc 1 2867 34 cvt.u16.u32 %r1420,%r382; st.u16 [%r508],%r1420; .loc 1 2846 6 add.u64 %r511,%r511,2; add.u64 %r510,%r510,2; add.u64 %r509,%r509,2; add.u64 %r508,%r508,2; add.u64 %r507,%r507,256; setp.ne.u64 %r1421,%r428,%r511; @ %r1421 bra $L71; bra $L63; $L60: .loc 1 2727 34 add.u64 %r378,%r378,256; .loc 1 2727 8 add.u64 %r68,%r68,512; add.u64 %r442,%r442,512; setp.ge.s64 %r1422,%r320,%r378; @ %r1422 bra $L72; bra $L73; $L59: .loc 1 2873 17 add.u64 %r239,%r1797,%r708; .loc 1 2874 9 setp.le.s64 %r1423,%r239,%r624; @ %r1423 bra $L60; .loc 1 2876 14 add.u64 %r638,%r369,%r378; .loc 1 2876 9 add.u64 %r639,%r638,-1; add.u64 %r536,%r378,%r535; add.u64 %r1424,%r378,%r378; sub.u64 %r1425,%r353,%r1424; add.u64 %r1426,%r638,%r638; add.u64 %r528,%r1425,%r1426; shl.b64 %r570,%r369,8; mov.u64 %r540,%r539; mov.u64 %r542,%r677; mov.u64 %r372,%r624; setp.gt.s64 %r1874,%r1800,%r369; setp.le.s64 %r1622,%r378,%r639; not.b64 %r1641,%r378; add.u64 %r1642,%r1641,%r638; and.b64 %r1643,%r1642,-4; add.u64 %r1644,%r353,8; $L81: .loc 1 2877 6 @ %r1622 bra $L74; $L80: .loc 1 2901 6 @ %r1874 bra $L75; bra $L76; $L74: add.u64 %r1430,%r536,%r536; add.u64 %r554,%r353,%r1430; add.u64 %r1434,%r1643,%r536; add.u64 %r1435,%r1434,%r1434; add.u64 %r544,%r1435,%r1644; .loc 1 2904 13 mov.u64 %r550,2048; mov.u64 %r551,1536; mov.u64 %r552,512; mov.u64 %r553,-1024; $L79: .loc 1 2879 14 ld.s16 %r374,[%r554]; .loc 1 2880 14 ld.s16 %r375,[%r554+2]; .loc 1 2881 14 ld.s16 %r376,[%r554+4]; .loc 1 2882 14 ld.s16 %r377,[%r554+6]; .loc 1 2884 10 @ %r1660 bra $L77; add.u64 %r563,%r1123,%r552; mov.u64 %r565,%r542; sub.u64 %r1698,%r540,%r542; add.u64 %r1699,%r1698,-2; shr.u64 %r1697,%r1699,1; and.b64 %r1701,%r1697,1; setp.ne.u64 %r1702,%r1701,0; @ ! 
%r1702 bra $L361; $L78: add.u64 %r560,%r553,%r563; .loc 1 2886 16 add.u64 %r1438,%r560,%r552; .loc 1 2887 13 ld.u16 %r1440,[%r1438]; ld.u16 %r1441,[%r565]; .loc 1 2886 11 cvt.u16.u32 %r1443,%r374; .loc 1 2889 13 ld.u16 %r1446,[%r563]; .loc 1 2888 11 cvt.u16.u32 %r1449,%r375; .loc 1 2890 16 add.u64 %r1451,%r560,%r551; .loc 1 2891 13 ld.u16 %r1453,[%r1451]; .loc 1 2890 11 cvt.u16.u32 %r1456,%r376; .loc 1 2892 16 add.u64 %r1458,%r560,%r550; .loc 1 2893 13 ld.u16 %r1460,[%r1458]; .loc 1 2892 11 cvt.u16.u32 %r1463,%r377; .loc 1 2884 10 add.u64 %r1704,%r563,2; add.u64 %r1881,%r553,%r1704; .loc 1 2886 16 add.u64 %r1883,%r1881,%r552; .loc 1 2887 13 ld.u16 %r1884,[%r1883]; ld.u16 %r1885,[%r565+2]; .loc 1 2886 11 mad.lo.u16 %r1887,%r1440,%r1441,%r1443; mad.lo.u16 %r1888,%r1884,%r1885,%r1887; cvt.s32.s16 %r374,%r1888; .loc 1 2889 13 ld.u16 %r1889,[%r1704]; .loc 1 2888 11 mad.lo.u16 %r1891,%r1446,%r1441,%r1449; mad.lo.u16 %r1892,%r1889,%r1885,%r1891; cvt.s32.s16 %r375,%r1892; .loc 1 2890 16 add.u64 %r1893,%r1881,%r551; .loc 1 2891 13 ld.u16 %r1894,[%r1893]; .loc 1 2890 11 mad.lo.u16 %r1896,%r1453,%r1441,%r1456; mad.lo.u16 %r1897,%r1894,%r1885,%r1896; cvt.s32.s16 %r376,%r1897; .loc 1 2892 16 add.u64 %r1898,%r1881,%r550; .loc 1 2893 13 ld.u16 %r1899,[%r1898]; .loc 1 2892 11 mad.lo.u16 %r1901,%r1460,%r1441,%r1463; mad.lo.u16 %r1902,%r1899,%r1885,%r1901; cvt.s32.s16 %r377,%r1902; .loc 1 2884 10 add.u64 %r565,%r565,4; add.u64 %r563,%r563,4; setp.ne.u64 %r1903,%r540,%r565; @ %r1903 bra $L78; $L77: .loc 1 2895 28 cvt.u16.u32 %r1466,%r374; st.u16 [%r554],%r1466; .loc 1 2896 32 cvt.u16.u32 %r1467,%r375; st.u16 [%r554+2],%r1467; .loc 1 2897 32 cvt.u16.u32 %r1468,%r376; st.u16 [%r554+4],%r1468; .loc 1 2898 32 cvt.u16.u32 %r1469,%r377; st.u16 [%r554+6],%r1469; .loc 1 2877 6 add.u64 %r554,%r554,8; add.u64 %r553,%r553,-2048; add.u64 %r552,%r552,2048; add.u64 %r551,%r551,2048; add.u64 %r550,%r550,2048; setp.ne.u64 %r1470,%r544,%r554; @ %r1470 bra $L79; bra $L80; $L76: .loc 1 2874 39 add.u64 %r372,%r372,1; .loc 1 2874 9 add.u64 %r542,%r542,%r709; add.u64 %r540,%r540,%r709; add.u64 %r536,%r536,%r581; setp.ne.u64 %r1471,%r239,%r372; @ %r1471 bra $L81; bra $L60; $L75: add.u64 %r1473,%r536,%r536; add.u64 %r572,%r1473,%r528; add.u64 %r1474,%r1800,%r536; add.u64 %r1475,%r1474,%r1474; add.u64 %r532,%r353,%r1475; .loc 1 2904 13 mov.u64 %r571,%r570; $L84: .loc 1 2903 14 ld.s16 %r373,[%r572]; .loc 1 2905 10 @ %r1660 bra $L82; add.u64 %r1477,%r571,%r571; add.u64 %r401,%r1123,%r1477; mov.u64 %r576,%r542; sub.u64 %r1688,%r540,%r542; add.u64 %r1689,%r1688,-2; shr.u64 %r1687,%r1689,1; add.u64 %r1690,%r1687,1; and.b64 %r1691,%r1690,3; setp.eq.u64 %r1694,%r1691,0; @ %r1694 bra $L83; setp.eq.u64 %r1693,%r1691,1; @ %r1693 bra $L300; setp.eq.u64 %r1692,%r1691,2; @ %r1692 bra $L301; bra $L362; $L83: .loc 1 2908 13 ld.u16 %r1479,[%r401]; ld.u16 %r1480,[%r576]; .loc 1 2907 11 cvt.u16.u32 %r1482,%r373; .loc 1 2908 13 ld.u16 %r1907,[%r401+2]; ld.u16 %r1908,[%r576+2]; .loc 1 2907 11 mad.lo.u16 %r1910,%r1479,%r1480,%r1482; .loc 1 2908 13 ld.u16 %r1916,[%r401+4]; ld.u16 %r1917,[%r576+4]; .loc 1 2907 11 mad.lo.u16 %r1919,%r1907,%r1908,%r1910; .loc 1 2908 13 ld.u16 %r1925,[%r401+6]; ld.u16 %r1926,[%r576+6]; .loc 1 2907 11 mad.lo.u16 %r1928,%r1916,%r1917,%r1919; mad.lo.u16 %r1929,%r1925,%r1926,%r1928; cvt.s32.s16 %r373,%r1929; .loc 1 2905 10 add.u64 %r401,%r401,8; add.u64 %r576,%r576,8; setp.ne.u64 %r1930,%r540,%r576; @ %r1930 bra $L83; $L82: .loc 1 2910 28 cvt.u16.u32 %r1485,%r373; st.u16 [%r572],%r1485; .loc 1 2901 6 add.u64 %r572,%r572,2; 
add.u64 %r571,%r571,256; setp.ne.u64 %r1486,%r532,%r572; @ %r1486 bra $L84; bra $L76; $L20: .loc 1 2958 11 cvt.u16.u32 %r1487,%r583; setp.eq.u16 %r1488,%r1487,1; @ ! %r1488 bra $L363; bra $L85; $L31: .loc 1 2920 38 set.u32.eq.u64 %r1490,%r313,1; neg.s32 %r1491,%r1490; .loc 1 2920 26 cvt.u16.u32 %r1493,%r1491; and.b16 %r1492,%r1493,%r1078; .loc 1 2920 11 cvt.u32.u16 %r1495,%r1492; cvt.u16.u8 %r1496,%r1495; setp.eq.u16 %r1497,%r1496,0; @ %r1497 bra $L20; .loc 1 2920 43 setp.ne.u64 %r1498,%r645,1; @ %r1498 bra $L20; $L121: .loc 1 2922 10 cvt.u16.u32 %r1499,%r583; setp.ne.u16 %r1500,%r1499,1; @ %r1500 bra $L87; .loc 1 2948 4 setp.gt.s64 %r1501,%r321,0; @ %r1501 bra $L88; bra $L1; $L87: .loc 1 2929 4 setp.le.s64 %r1502,%r321,0; @ %r1502 bra $L1; add.u64 %r724,%r646,%r646; add.u64 %r1504,%r272,%r272; add.u64 %r725,%r349,%r1504; mov.u64 %r734,0; mov.u64 %r732,%r734; .loc 1 2929 11 mov.u64 %r405,%r734; setp.gt.s64 %r1931,%r320,0; setp.le.s64 %r1677,%r272,0; .loc 1 2936 7 cvt.u32.u64 %r1678,%r734; $L91: .loc 1 2933 8 @ %r1931 bra $L90; $L95: .loc 1 2929 29 add.u64 %r405,%r405,1; .loc 1 2929 4 add.u64 %r732,%r732,%r581; add.u64 %r734,%r734,%r314; setp.ne.u64 %r1506,%r321,%r405; @ %r1506 bra $L91; bra $L1; $L90: add.u64 %r1507,%r732,%r732; add.u64 %r720,%r1507,%r351; add.u64 %r1508,%r734,%r734; add.u64 %r743,%r350,%r1508; add.u64 %r1509,%r320,%r732; add.u64 %r1510,%r1509,%r1509; add.u64 %r738,%r1510,%r351; .loc 1 2933 8 mov.u64 %r722,%r725; mov.u64 %r721,0; $L94: .loc 1 2937 5 @ %r1677 bra $L140; add.u64 %r1512,%r721,%r721; add.u64 %r716,%r349,%r1512; mov.u64 %r717,%r743; .loc 1 2936 7 mov.u32 %r406,%r1678; sub.u64 %r1745,%r722,%r716; add.u64 %r1746,%r1745,-2; shr.u64 %r1744,%r1746,1; add.u64 %r1747,%r1744,1; and.b64 %r1748,%r1747,3; setp.eq.u64 %r1751,%r1748,0; @ %r1751 bra $L93; setp.eq.u64 %r1750,%r1748,1; @ %r1750 bra $L302; setp.eq.u64 %r1749,%r1748,2; @ %r1749 bra $L303; bra $L364; $L93: .loc 1 2938 23 ld.u16 %r1514,[%r716]; ld.u16 %r1515,[%r717]; .loc 1 2938 9 cvt.u16.u32 %r1517,%r406; .loc 1 2938 23 ld.u16 %r1935,[%r716+2]; ld.u16 %r1936,[%r717+2]; .loc 1 2938 9 mad.lo.u16 %r1938,%r1514,%r1515,%r1517; .loc 1 2938 23 ld.u16 %r1944,[%r716+4]; ld.u16 %r1945,[%r717+4]; .loc 1 2938 9 mad.lo.u16 %r1947,%r1935,%r1936,%r1938; .loc 1 2938 23 ld.u16 %r1953,[%r716+6]; ld.u16 %r1954,[%r717+6]; .loc 1 2938 9 mad.lo.u16 %r1956,%r1944,%r1945,%r1947; mad.lo.u16 %r1957,%r1953,%r1954,%r1956; cvt.s32.s16 %r406,%r1957; .loc 1 2937 5 add.u64 %r716,%r716,8; add.u64 %r717,%r717,8; setp.ne.u64 %r1958,%r716,%r722; @ %r1958 bra $L93; bra $L92; $L140: .loc 1 2936 7 mov.u32 %r406,%r1678; $L92: .loc 1 2939 15 cvt.u16.u32 %r1520,%r406; st.u16 [%r720],%r1520; .loc 1 2933 8 add.u64 %r720,%r720,2; add.u64 %r721,%r721,%r646; add.u64 %r722,%r722,%r724; setp.ne.u64 %r1521,%r720,%r738; @ %r1521 bra $L94; bra $L95; $L88: add.u64 %r754,%r581,%r581; mov.u64 %r752,%r351; add.u64 %r746,%r646,%r646; .loc 1 2948 4 mov.u64 %r755,0; .loc 1 2948 11 mov.u64 %r403,%r755; setp.le.s64 %r1959,%r272,0; .loc 1 2951 10 cvt.u32.u64 %r1680,%r755; $L98: .loc 1 2952 8 @ %r1959 bra $L141; mov.u64 %r744,%r349; add.u64 %r1525,%r755,%r755; add.u64 %r747,%r1525,%r350; add.u64 %r1526,%r272,%r755; add.u64 %r1527,%r1526,%r1526; add.u64 %r763,%r1527,%r350; .loc 1 2951 10 mov.u32 %r404,%r1680; sub.u64 %r1755,%r763,%r747; add.u64 %r1756,%r1755,-2; shr.u64 %r1754,%r1756,1; add.u64 %r1757,%r1754,1; and.b64 %r1758,%r1757,3; setp.eq.u64 %r1761,%r1758,0; @ %r1761 bra $L97; setp.eq.u64 %r1760,%r1758,1; @ %r1760 bra $L304; setp.eq.u64 %r1759,%r1758,2; @ %r1759 bra 
$L305; bra $L365; $L97: .loc 1 2953 26 ld.u16 %r1529,[%r744]; ld.u16 %r1530,[%r747]; .loc 1 2953 5 cvt.u16.u32 %r1532,%r404; .loc 1 2952 8 add.u64 %r1961,%r744,%r746; .loc 1 2953 26 ld.u16 %r1963,[%r1961]; ld.u16 %r1964,[%r747+2]; .loc 1 2953 5 mad.lo.u16 %r1966,%r1529,%r1530,%r1532; .loc 1 2952 8 add.u64 %r1969,%r1961,%r746; .loc 1 2953 26 ld.u16 %r1972,[%r1969]; ld.u16 %r1973,[%r747+4]; .loc 1 2953 5 mad.lo.u16 %r1975,%r1963,%r1964,%r1966; .loc 1 2952 8 add.u64 %r1978,%r1969,%r746; .loc 1 2953 26 ld.u16 %r1981,[%r1978]; ld.u16 %r1982,[%r747+6]; .loc 1 2953 5 mad.lo.u16 %r1984,%r1972,%r1973,%r1975; mad.lo.u16 %r1985,%r1981,%r1982,%r1984; cvt.s32.s16 %r404,%r1985; .loc 1 2952 8 add.u64 %r744,%r1978,%r746; add.u64 %r747,%r747,8; setp.ne.u64 %r1986,%r747,%r763; @ %r1986 bra $L97; bra $L96; $L141: .loc 1 2951 10 mov.u32 %r404,%r1680; $L96: .loc 1 2954 25 cvt.u16.u32 %r1535,%r404; st.u16 [%r752],%r1535; .loc 1 2948 29 add.u64 %r403,%r403,1; .loc 1 2948 4 add.u64 %r752,%r752,%r754; add.u64 %r755,%r755,%r314; setp.ne.u64 %r1536,%r321,%r403; @ %r1536 bra $L98; bra $L1; $L85: .loc 1 2963 7 setp.le.s64 %r1537,%r321,0; @ %r1537 bra $L1; add.u64 %r774,%r712,%r712; mov.u64 %r772,%r351; add.u64 %r766,%r646,%r646; add.u64 %r769,%r645,%r645; mov.u64 %r775,0; .loc 1 2963 14 mov.u64 %r416,%r775; setp.le.s64 %r1987,%r272,0; .loc 1 2966 6 cvt.u32.u64 %r1681,%r775; $L101: .loc 1 2967 4 @ %r1987 bra $L142; mov.u64 %r764,%r349; add.u64 %r1542,%r775,%r775; add.u64 %r767,%r350,%r1542; .loc 1 2966 6 mov.u32 %r417,%r1681; .loc 1 2967 11 mov.u64 %r418,0; and.b64 %r1766,%r272,3; setp.eq.u64 %r1769,%r1766,0; @ %r1769 bra $L100; setp.eq.u64 %r1768,%r1766,1; @ %r1768 bra $L306; setp.eq.u64 %r1767,%r1766,2; @ %r1767 bra $L307; bra $L366; $L100: .loc 1 2968 29 ld.u16 %r1544,[%r764]; ld.u16 %r1545,[%r767]; .loc 1 2968 8 cvt.u16.u32 %r1547,%r417; .loc 1 2967 4 add.u64 %r1990,%r764,%r766; add.u64 %r1991,%r767,%r769; .loc 1 2968 29 ld.u16 %r1992,[%r1990]; ld.u16 %r1993,[%r1991]; .loc 1 2968 8 mad.lo.u16 %r1995,%r1544,%r1545,%r1547; .loc 1 2967 4 add.u64 %r1999,%r1990,%r766; add.u64 %r2000,%r1991,%r769; .loc 1 2968 29 ld.u16 %r2002,[%r1999]; ld.u16 %r2003,[%r2000]; .loc 1 2968 8 mad.lo.u16 %r2005,%r1992,%r1993,%r1995; .loc 1 2967 4 add.u64 %r2009,%r1999,%r766; add.u64 %r2010,%r2000,%r769; .loc 1 2968 29 ld.u16 %r2012,[%r2009]; ld.u16 %r2013,[%r2010]; .loc 1 2968 8 mad.lo.u16 %r2015,%r2002,%r2003,%r2005; mad.lo.u16 %r2016,%r2012,%r2013,%r2015; cvt.s32.s16 %r417,%r2016; .loc 1 2967 28 add.u64 %r418,%r418,4; .loc 1 2967 4 add.u64 %r764,%r2009,%r766; add.u64 %r767,%r2010,%r769; setp.ne.u64 %r2017,%r272,%r418; @ %r2017 bra $L100; bra $L99; $L142: .loc 1 2966 6 mov.u32 %r417,%r1681; $L99: .loc 1 2969 21 cvt.u16.u32 %r1550,%r417; st.u16 [%r772],%r1550; .loc 1 2963 32 add.u64 %r416,%r416,1; .loc 1 2963 7 add.u64 %r772,%r772,%r774; add.u64 %r775,%r775,%r314; setp.ne.u64 %r1551,%r321,%r416; @ %r1551 bra $L101; bra $L1; $L363: .loc 1 2972 11 setp.gt.s64 %r1552,%r313,%r646; @ %r1552 bra $L102; .loc 1 2993 7 setp.gt.s64 %r1553,%r321,0; @ %r1553 bra $L103; bra $L1; $L102: .loc 1 2974 7 setp.gt.s64 %r1554,%r321,0; @ %r1554 bra $L104; bra $L1; $L108: add.u64 %r1555,%r808,%r808; add.u64 %r804,%r351,%r1555; .loc 1 2975 9 mov.u64 %r415,0; and.b64 %r1782,%r320,3; setp.eq.u64 %r1785,%r1782,0; @ %r1785 bra $L105; setp.eq.u64 %r1784,%r1782,1; @ %r1784 bra $L308; setp.eq.u64 %r1783,%r1782,2; @ %r1783 bra $L309; bra $L367; $L105: .loc 1 2976 34 st.u16 [%r804],%r1682; .loc 1 2975 2 add.u64 %r2019,%r804,%r806; .loc 1 2976 34 st.u16 [%r2019],%r1682; .loc 
1 2975 2 add.u64 %r2021,%r2019,%r806; .loc 1 2976 34 st.u16 [%r2021],%r1682; .loc 1 2975 2 add.u64 %r2024,%r2021,%r806; .loc 1 2976 34 st.u16 [%r2024],%r1682; .loc 1 2975 27 add.u64 %r415,%r415,4; .loc 1 2975 2 add.u64 %r804,%r2024,%r806; setp.ne.u64 %r2026,%r320,%r415; @ %r2026 bra $L105; $L109: .loc 1 2974 32 add.u64 %r414,%r436,1; .loc 1 2974 7 add.u64 %r808,%r808,%r581; setp.ne.u64 %r1558,%r321,%r414; @ %r1558 bra $L143; add.u64 %r792,%r645,%r645; add.u64 %r783,%r646,%r646; mov.u64 %r797,0; mov.u64 %r796,%r797; mov.u64 %r437,%r797; setp.le.s64 %r2027,%r272,0; bra $L107; $L104: add.u64 %r806,%r712,%r712; mov.u64 %r808,0; .loc 1 2974 14 mov.u64 %r436,%r808; setp.gt.s64 %r2028,%r320,0; .loc 1 2976 34 cvt.u16.u64 %r1682,%r808; bra $L106; $L143: mov.u64 %r436,%r414; $L106: .loc 1 2975 2 @ %r2028 bra $L108; bra $L109; $L112: add.u64 %r1563,%r788,%r788; add.u64 %r781,%r349,%r1563; .loc 1 2980 4 mov.u64 %r780,%r801; .loc 1 2980 11 mov.u64 %r413,0; and.b64 %r1774,%r320,3; setp.eq.u64 %r1777,%r1774,0; @ %r1777 bra $L110; setp.eq.u64 %r1776,%r1774,1; @ %r1776 bra $L310; setp.eq.u64 %r1775,%r1774,2; @ %r1775 bra $L311; bra $L368; $L110: .loc 1 2983 37 ld.u16 %r1565,[%r781]; ld.u16 %r1566,[%r790]; .loc 1 2982 36 ld.u16 %r1568,[%r780]; mad.lo.u16 %r1567,%r1565,%r1566,%r1568; st.u16 [%r780],%r1567; .loc 1 2980 4 add.u64 %r2030,%r780,%r806; add.u64 %r2031,%r781,%r783; .loc 1 2983 37 ld.u16 %r2032,[%r2031]; ld.u16 %r2033,[%r790]; .loc 1 2982 36 ld.u16 %r2035,[%r2030]; mad.lo.u16 %r2036,%r2032,%r2033,%r2035; st.u16 [%r2030],%r2036; .loc 1 2980 4 add.u64 %r2038,%r2030,%r806; add.u64 %r2039,%r2031,%r783; .loc 1 2983 37 ld.u16 %r2041,[%r2039]; ld.u16 %r2042,[%r790]; .loc 1 2982 36 ld.u16 %r2044,[%r2038]; mad.lo.u16 %r2045,%r2041,%r2042,%r2044; st.u16 [%r2038],%r2045; .loc 1 2980 4 add.u64 %r2047,%r2038,%r806; add.u64 %r2048,%r2039,%r783; .loc 1 2983 37 ld.u16 %r2050,[%r2048]; ld.u16 %r2051,[%r790]; .loc 1 2982 36 ld.u16 %r2053,[%r2047]; mad.lo.u16 %r2054,%r2050,%r2051,%r2053; st.u16 [%r2047],%r2054; .loc 1 2980 29 add.u64 %r413,%r413,4; .loc 1 2980 4 add.u64 %r780,%r2047,%r806; add.u64 %r781,%r2048,%r783; setp.ne.u64 %r2055,%r320,%r413; @ %r2055 bra $L110; $L113: .loc 1 2979 26 add.u64 %r412,%r412,1; .loc 1 2979 2 add.u64 %r788,%r788,%r313; add.u64 %r790,%r790,%r792; setp.eq.u64 %r1570,%r272,%r412; @ %r1570 bra $L111; $L114: .loc 1 2980 4 @ %r2028 bra $L112; bra $L113; $L111: .loc 1 2978 32 add.u64 %r411,%r437,1; .loc 1 2978 7 add.u64 %r796,%r796,%r581; add.u64 %r797,%r797,%r314; setp.eq.u64 %r1572,%r436,%r437; @ %r1572 bra $L1; mov.u64 %r437,%r411; $L107: .loc 1 2979 2 @ %r2027 bra $L111; add.u64 %r1574,%r797,%r797; add.u64 %r790,%r350,%r1574; add.u64 %r1575,%r796,%r796; add.u64 %r801,%r351,%r1575; mov.u64 %r788,0; .loc 1 2979 9 mov.u64 %r412,%r788; bra $L114; $L103: add.u64 %r823,%r712,%r712; add.u64 %r814,%r313,%r313; add.u64 %r818,%r645,%r645; .loc 1 2993 7 mov.u64 %r831,0; mov.u64 %r829,%r831; .loc 1 2993 14 mov.u64 %r407,%r831; setp.gt.s64 %r2056,%r320,0; setp.le.s64 %r1684,%r272,0; .loc 1 3000 10 cvt.u32.u64 %r1685,%r831; $L116: .loc 1 2997 4 @ %r2056 bra $L115; $L120: .loc 1 2993 32 add.u64 %r407,%r407,1; .loc 1 2993 7 add.u64 %r829,%r829,%r581; add.u64 %r831,%r831,%r314; setp.ne.u64 %r1580,%r321,%r407; @ %r1580 bra $L116; bra $L1; $L115: add.u64 %r1581,%r829,%r829; add.u64 %r821,%r351,%r1581; add.u64 %r1582,%r831,%r831; add.u64 %r837,%r350,%r1582; .loc 1 2997 4 mov.u64 %r824,0; .loc 1 2997 11 mov.u64 %r408,%r824; $L119: .loc 1 3001 8 @ %r1684 bra $L144; add.u64 %r1584,%r824,%r824; add.u64 
%r812,%r349,%r1584; mov.u64 %r816,%r837; .loc 1 3000 10 mov.u32 %r409,%r1685; .loc 1 3001 15 mov.u64 %r410,0; and.b64 %r1790,%r272,3; setp.eq.u64 %r1793,%r1790,0; @ %r1793 bra $L118; setp.eq.u64 %r1792,%r1790,1; @ %r1792 bra $L312; setp.eq.u64 %r1791,%r1790,2; @ %r1791 bra $L313; bra $L369; $L118: .loc 1 3002 28 ld.u16 %r1586,[%r812]; ld.u16 %r1587,[%r816]; .loc 1 3002 5 cvt.u16.u32 %r1589,%r409; .loc 1 3001 8 add.u64 %r2060,%r812,%r814; add.u64 %r2061,%r816,%r818; .loc 1 3002 28 ld.u16 %r2062,[%r2060]; ld.u16 %r2063,[%r2061]; .loc 1 3002 5 mad.lo.u16 %r2065,%r1586,%r1587,%r1589; .loc 1 3001 8 add.u64 %r2069,%r2060,%r814; add.u64 %r2070,%r2061,%r818; .loc 1 3002 28 ld.u16 %r2072,[%r2069]; ld.u16 %r2073,[%r2070]; .loc 1 3002 5 mad.lo.u16 %r2075,%r2062,%r2063,%r2065; .loc 1 3001 8 add.u64 %r2079,%r2069,%r814; add.u64 %r2080,%r2070,%r818; .loc 1 3002 28 ld.u16 %r2082,[%r2079]; ld.u16 %r2083,[%r2080]; .loc 1 3002 5 mad.lo.u16 %r2085,%r2072,%r2073,%r2075; mad.lo.u16 %r2086,%r2082,%r2083,%r2085; cvt.s32.s16 %r409,%r2086; .loc 1 3001 32 add.u64 %r410,%r410,4; .loc 1 3001 8 add.u64 %r812,%r2079,%r814; add.u64 %r816,%r2080,%r818; setp.ne.u64 %r2087,%r272,%r410; @ %r2087 bra $L118; bra $L117; $L144: .loc 1 3000 10 mov.u32 %r409,%r1685; $L117: .loc 1 3003 27 cvt.u16.u32 %r1592,%r409; st.u16 [%r821],%r1592; .loc 1 2997 29 add.u64 %r408,%r408,1; .loc 1 2997 4 add.u64 %r821,%r821,%r823; add.u64 %r824,%r824,%r646; setp.ne.u64 %r1593,%r320,%r408; @ %r1593 bra $L119; bra $L120; $L32: .loc 1 2920 11 setp.eq.u64 %r1594,%r313,1; @ %r1594 bra $L121; bra $L20; $L2: .loc 1 2476 6 ld.u64 %r1595,[%r838]; setp.ne.u64 %r1596,%r1595,0; @ %r1596 bra $L122; .loc 1 2485 4 ld.u64 %r1598,[%r839+56]; add.u64 %r1597,%r1598,1; ld.u64 %r1599,[%r839+48]; sub.u64 %r650,%r1597,%r1599; .loc 1 2483 15 ld.s8 %r1601,[%r840+28]; cvt.u16.u32 %r1600,%r1601; setp.eq.u16 %r1602,%r1600,1; @ %r1602 bra $L123; bra $L124; $L126: .loc 1 2560 16 ld.u64 %r646,[%r839+40]; .loc 1 2548 27 mov.u64 %r712,%r581; bra $L125; $L14: ld.u64 %r581,[%r838+40]; .loc 1 2543 6 ld.s8 %r1604,[%r838+28]; cvt.u16.u32 %r1603,%r1604; setp.eq.u16 %r1605,%r1603,1; @ ! %r1605 bra $L127; bra $L126; $L11: .loc 1 2560 16 ld.u64 %r646,[%r839+40]; mov.u32 %r583,%r443; .loc 1 2548 27 mov.u64 %r581,%r712; .loc 1 2563 14 mov.u64 %r320,1; .loc 1 2561 16 mov.u64 %r313,%r320; bra $L12; $L5: .loc 1 2485 4 ld.u64 %r1607,[%r839+56]; add.u64 %r1606,%r1607,1; ld.u64 %r1608,[%r839+48]; sub.u64 %r650,%r1606,%r1608; bra $L124; $L128: .loc 1 2509 17 ld.u64 %r1610,[%r838+56]; add.u64 %r1609,%r1610,1; .loc 1 2509 15 ld.u64 %r1611,[%r838+48]; sub.u64 %r272,%r1609,%r1611; ld.s8 %r1634,[%r840+28]; bra $L9; $L122: .loc 1 2502 12 cvta.global.u64 %r1612,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r1613,[%r1612+36]; setp.ne.u32 %r1614,%r1613,0; @ ! 
%r1614 bra $L8; bra $L128; $L1: .loc 1 3007 1 ret; $L362: .loc 1 2908 13 ld.u16 %r2088,[%r401]; ld.u16 %r2089,[%r542]; .loc 1 2907 11 cvt.u16.u32 %r2091,%r373; mad.lo.u16 %r2092,%r2088,%r2089,%r2091; cvt.s32.s16 %r373,%r2092; .loc 1 2905 10 add.u64 %r401,%r401,2; add.u64 %r576,%r542,2; $L301: .loc 1 2908 13 ld.u16 %r2094,[%r401]; ld.u16 %r2095,[%r576]; .loc 1 2907 11 cvt.u16.u32 %r2097,%r373; mad.lo.u16 %r2098,%r2094,%r2095,%r2097; cvt.s32.s16 %r373,%r2098; .loc 1 2905 10 add.u64 %r401,%r401,2; add.u64 %r576,%r576,2; $L300: .loc 1 2908 13 ld.u16 %r2100,[%r401]; ld.u16 %r2101,[%r576]; .loc 1 2907 11 cvt.u16.u32 %r2103,%r373; mad.lo.u16 %r2104,%r2100,%r2101,%r2103; cvt.s32.s16 %r373,%r2104; .loc 1 2905 10 add.u64 %r401,%r401,2; add.u64 %r576,%r576,2; setp.ne.u64 %r2105,%r540,%r576; @ %r2105 bra $L83; bra $L82; $L361: add.u64 %r2106,%r553,%r563; .loc 1 2886 16 add.u64 %r2108,%r2106,%r552; .loc 1 2887 13 ld.u16 %r2109,[%r2108]; ld.u16 %r2110,[%r542]; .loc 1 2886 11 cvt.u16.u32 %r2112,%r374; mad.lo.u16 %r2113,%r2109,%r2110,%r2112; cvt.s32.s16 %r374,%r2113; .loc 1 2889 13 ld.u16 %r2114,[%r563]; .loc 1 2888 11 cvt.u16.u32 %r2116,%r375; mad.lo.u16 %r2117,%r2114,%r2110,%r2116; cvt.s32.s16 %r375,%r2117; .loc 1 2890 16 add.u64 %r2118,%r2106,%r551; .loc 1 2891 13 ld.u16 %r2119,[%r2118]; .loc 1 2890 11 cvt.u16.u32 %r2121,%r376; mad.lo.u16 %r2122,%r2119,%r2110,%r2121; cvt.s32.s16 %r376,%r2122; .loc 1 2892 16 add.u64 %r2123,%r2106,%r550; .loc 1 2893 13 ld.u16 %r2124,[%r2123]; .loc 1 2892 11 cvt.u16.u32 %r2126,%r377; mad.lo.u16 %r2127,%r2124,%r2110,%r2126; cvt.s32.s16 %r377,%r2127; .loc 1 2884 10 add.u64 %r565,%r542,2; add.u64 %r563,%r563,2; setp.ne.u64 %r2128,%r540,%r565; @ %r2128 bra $L78; bra $L77; $L360: .loc 1 2856 13 ld.u16 %r2130,[%r523]; ld.u16 %r2131,[%r525]; .loc 1 2855 11 cvt.u16.u32 %r2133,%r379; mad.lo.u16 %r2134,%r2130,%r2131,%r2133; cvt.s32.s16 %r379,%r2134; .loc 1 2858 13 ld.u16 %r2135,[%r328]; .loc 1 2857 11 cvt.u16.u32 %r2137,%r380; mad.lo.u16 %r2138,%r2135,%r2131,%r2137; cvt.s32.s16 %r380,%r2138; .loc 1 2860 13 ld.u16 %r2139,[%r326]; .loc 1 2859 11 cvt.u16.u32 %r2141,%r381; mad.lo.u16 %r2142,%r2139,%r2131,%r2141; cvt.s32.s16 %r381,%r2142; .loc 1 2862 13 ld.u16 %r2143,[%r324]; .loc 1 2861 11 cvt.u16.u32 %r2145,%r382; mad.lo.u16 %r2146,%r2143,%r2131,%r2145; cvt.s32.s16 %r382,%r2146; .loc 1 2853 10 add.u64 %r525,%r525,2; add.u64 %r523,%r523,2; add.u64 %r521,%r328,2; add.u64 %r520,%r326,2; add.u64 %r518,%r324,2; setp.ne.u64 %r2147,%r460,%r523; @ %r2147 bra $L70; bra $L69; $L359: .loc 1 2762 43 ld.u16 %r2148,[%r442]; st.u16 [%r82],%r2148; .loc 1 2760 9 add.u64 %r120,%r442,2; add.u64 %r100,%r82,512; $L299: .loc 1 2762 43 ld.u16 %r2150,[%r120]; st.u16 [%r100],%r2150; .loc 1 2760 9 add.u64 %r120,%r120,2; add.u64 %r100,%r100,512; $L298: .loc 1 2762 43 ld.u16 %r2152,[%r120]; st.u16 [%r100],%r2152; .loc 1 2760 9 add.u64 %r120,%r120,2; add.u64 %r100,%r100,512; setp.ne.u64 %r2153,%r39,%r120; @ %r2153 bra $L58; bra $L48; $L358: .loc 1 2741 7 add.u64 %r2154,%r595,%r637; add.u64 %r2155,%r2154,%r594; .loc 1 2740 49 ld.u16 %r2156,[%r2155]; st.u16 [%r38],%r2156; .loc 1 2742 49 ld.u16 %r2157,[%r637]; st.u16 [%r38+2],%r2157; .loc 1 2745 7 add.u64 %r2158,%r594,%r637; add.u64 %r2159,%r2158,%r595; .loc 1 2744 49 ld.u16 %r2160,[%r2159+2]; st.u16 [%r38+512],%r2160; .loc 1 2746 49 ld.u16 %r2161,[%r637+2]; st.u16 [%r38+514],%r2161; .loc 1 2738 9 add.u64 %r38,%r38,1024; add.u64 %r637,%r637,4; $L297: .loc 1 2741 7 add.u64 %r2163,%r595,%r637; add.u64 %r2164,%r2163,%r594; .loc 1 2740 49 ld.u16 %r2165,[%r2164]; st.u16 
[%r38],%r2165; .loc 1 2742 49 ld.u16 %r2166,[%r637]; st.u16 [%r38+2],%r2166; .loc 1 2745 7 add.u64 %r2167,%r594,%r637; add.u64 %r2168,%r2167,%r595; .loc 1 2744 49 ld.u16 %r2169,[%r2168+2]; st.u16 [%r38+512],%r2169; .loc 1 2746 49 ld.u16 %r2170,[%r637+2]; st.u16 [%r38+514],%r2170; .loc 1 2738 9 add.u64 %r38,%r38,1024; add.u64 %r637,%r637,4; $L296: .loc 1 2741 7 add.u64 %r2172,%r595,%r637; add.u64 %r2173,%r2172,%r594; .loc 1 2740 49 ld.u16 %r2174,[%r2173]; st.u16 [%r38],%r2174; .loc 1 2742 49 ld.u16 %r2175,[%r637]; st.u16 [%r38+2],%r2175; .loc 1 2745 7 add.u64 %r2176,%r594,%r637; add.u64 %r2177,%r2176,%r595; .loc 1 2744 49 ld.u16 %r2178,[%r2177+2]; st.u16 [%r38+512],%r2178; .loc 1 2746 49 ld.u16 %r2179,[%r637+2]; st.u16 [%r38+514],%r2179; .loc 1 2738 9 add.u64 %r38,%r38,1024; add.u64 %r637,%r637,4; setp.ne.u64 %r2180,%r610,%r637; @ %r2180 bra $L52; bra $L53; $L356: .loc 1 2688 2 @ ! %r1631 bra $L370; .loc 1 2689 22 mov.u32 %r21812181578182,[%value_in]; } $L370: .loc 1 2687 24 mov.u64 %r402,2; .loc 1 2687 7 add.u64 %r710,%r351,%r711; $L295: .loc 1 2688 2 @ ! %r1631 bra $L371; .loc 1 2689 22 mov.u32 %r21842184578185,[%value_in]; } $L371: .loc 1 2687 24 add.u64 %r402,%r402,1; .loc 1 2687 7 add.u64 %r710,%r710,%r711; $L294: .loc 1 2688 2 @ ! %r1631 bra $L372; .loc 1 2689 22 mov.u32 %r21872187578188,[%value_in]; } $L372: .loc 1 2687 24 add.u64 %r402,%r402,1; .loc 1 2687 7 add.u64 %r710,%r710,%r711; setp.eq.u64 %r2189,%r402,%r715; @ ! %r2189 bra $L34; bra $L33; $L232: .loc 1 2689 22 mov.u32 %r2190962190578191,[%value_in]; } $L357: .loc 1 2687 7 add.u64 %r2193,%r1796,%r711; .loc 1 2688 2 @ ! %r1631 bra $L373; .loc 1 2689 22 mov.u32 %r212192195578196,[%value_in]; } $L373: .loc 1 2687 7 add.u64 %r2198,%r2193,%r711; .loc 1 2688 2 @ ! %r1631 bra $L374; .loc 1 2689 22 mov.u32 %r2200210578201,[%value_in]; } $L374: .loc 1 2687 24 add.u64 %r402,%r1743,3; .loc 1 2687 7 add.u64 %r710,%r2198,%r711; setp.eq.u64 %r2202,%r402,%r715; @ ! 
%r2202 bra $L34; bra $L33; $L364: .loc 1 2938 23 ld.u16 %r2203,[%r716]; ld.u16 %r2204,[%r743]; mul.lo.u16 %r2205,%r2203,%r2204; .loc 1 2938 9 cvt.s32.s16 %r406,%r2205; .loc 1 2937 5 add.u64 %r716,%r716,2; add.u64 %r717,%r743,2; $L303: .loc 1 2938 23 ld.u16 %r2209,[%r716]; ld.u16 %r2210,[%r717]; .loc 1 2938 9 cvt.u16.u32 %r2212,%r406; mad.lo.u16 %r2213,%r2209,%r2210,%r2212; cvt.s32.s16 %r406,%r2213; .loc 1 2937 5 add.u64 %r716,%r716,2; add.u64 %r717,%r717,2; $L302: .loc 1 2938 23 ld.u16 %r2215,[%r716]; ld.u16 %r2216,[%r717]; .loc 1 2938 9 cvt.u16.u32 %r2218,%r406; mad.lo.u16 %r2219,%r2215,%r2216,%r2218; cvt.s32.s16 %r406,%r2219; .loc 1 2937 5 add.u64 %r716,%r716,2; add.u64 %r717,%r717,2; setp.ne.u64 %r2220,%r716,%r722; @ %r2220 bra $L93; bra $L92; $L365: .loc 1 2953 26 ld.u16 %r2221,[%r349]; ld.u16 %r2222,[%r747]; mul.lo.u16 %r2223,%r2221,%r2222; .loc 1 2953 5 cvt.s32.s16 %r404,%r2223; .loc 1 2952 8 add.u64 %r744,%r349,%r746; add.u64 %r747,%r747,2; $L305: .loc 1 2953 26 ld.u16 %r2227,[%r744]; ld.u16 %r2228,[%r747]; .loc 1 2953 5 cvt.u16.u32 %r2230,%r404; mad.lo.u16 %r2231,%r2227,%r2228,%r2230; cvt.s32.s16 %r404,%r2231; .loc 1 2952 8 add.u64 %r744,%r744,%r746; add.u64 %r747,%r747,2; $L304: .loc 1 2953 26 ld.u16 %r2233,[%r744]; ld.u16 %r2234,[%r747]; .loc 1 2953 5 cvt.u16.u32 %r2236,%r404; mad.lo.u16 %r2237,%r2233,%r2234,%r2236; cvt.s32.s16 %r404,%r2237; .loc 1 2952 8 add.u64 %r744,%r744,%r746; add.u64 %r747,%r747,2; setp.ne.u64 %r2238,%r747,%r763; @ %r2238 bra $L97; bra $L96; $L366: .loc 1 2968 29 ld.u16 %r2239,[%r349]; ld.u16 %r2240,[%r767]; mul.lo.u16 %r2241,%r2239,%r2240; .loc 1 2968 8 cvt.s32.s16 %r417,%r2241; .loc 1 2967 28 mov.u64 %r418,1; .loc 1 2967 4 add.u64 %r764,%r349,%r766; add.u64 %r767,%r767,%r769; $L307: .loc 1 2968 29 ld.u16 %r2245,[%r764]; ld.u16 %r2246,[%r767]; .loc 1 2968 8 cvt.u16.u32 %r2248,%r417; mad.lo.u16 %r2249,%r2245,%r2246,%r2248; cvt.s32.s16 %r417,%r2249; .loc 1 2967 28 add.u64 %r418,%r418,1; .loc 1 2967 4 add.u64 %r764,%r764,%r766; add.u64 %r767,%r767,%r769; $L306: .loc 1 2968 29 ld.u16 %r2251,[%r764]; ld.u16 %r2252,[%r767]; .loc 1 2968 8 cvt.u16.u32 %r2254,%r417; mad.lo.u16 %r2255,%r2251,%r2252,%r2254; cvt.s32.s16 %r417,%r2255; .loc 1 2967 28 add.u64 %r418,%r418,1; .loc 1 2967 4 add.u64 %r764,%r764,%r766; add.u64 %r767,%r767,%r769; setp.ne.u64 %r2256,%r272,%r418; @ %r2256 bra $L100; bra $L99; $L368: .loc 1 2983 37 ld.u16 %r2257,[%r781]; ld.u16 %r2258,[%r790]; .loc 1 2982 36 ld.u16 %r2260,[%r801]; mad.lo.u16 %r2261,%r2257,%r2258,%r2260; st.u16 [%r801],%r2261; .loc 1 2980 29 mov.u64 %r413,1; .loc 1 2980 4 add.u64 %r780,%r801,%r806; add.u64 %r781,%r781,%r783; $L311: .loc 1 2983 37 ld.u16 %r2263,[%r781]; ld.u16 %r2264,[%r790]; .loc 1 2982 36 ld.u16 %r2266,[%r780]; mad.lo.u16 %r2267,%r2263,%r2264,%r2266; st.u16 [%r780],%r2267; .loc 1 2980 29 add.u64 %r413,%r413,1; .loc 1 2980 4 add.u64 %r780,%r780,%r806; add.u64 %r781,%r781,%r783; $L310: .loc 1 2983 37 ld.u16 %r2269,[%r781]; ld.u16 %r2270,[%r790]; .loc 1 2982 36 ld.u16 %r2272,[%r780]; mad.lo.u16 %r2273,%r2269,%r2270,%r2272; st.u16 [%r780],%r2273; .loc 1 2980 29 add.u64 %r413,%r413,1; .loc 1 2980 4 add.u64 %r780,%r780,%r806; add.u64 %r781,%r781,%r783; setp.ne.u64 %r2274,%r320,%r413; @ %r2274 bra $L110; bra $L113; $L367: .loc 1 2976 34 st.u16 [%r804],%r1682; .loc 1 2975 27 mov.u64 %r415,1; .loc 1 2975 2 add.u64 %r804,%r804,%r806; $L309: .loc 1 2976 34 st.u16 [%r804],%r1682; .loc 1 2975 27 add.u64 %r415,%r415,1; .loc 1 2975 2 add.u64 %r804,%r804,%r806; $L308: .loc 1 2976 34 st.u16 [%r804],%r1682; .loc 1 2975 27 add.u64 
%r415,%r415,1; .loc 1 2975 2 add.u64 %r804,%r804,%r806; setp.ne.u64 %r2277,%r320,%r415; @ %r2277 bra $L105; bra $L109; $L369: .loc 1 3002 28 ld.u16 %r2278,[%r812]; ld.u16 %r2279,[%r837]; mul.lo.u16 %r2280,%r2278,%r2279; .loc 1 3002 5 cvt.s32.s16 %r409,%r2280; .loc 1 3001 32 mov.u64 %r410,1; .loc 1 3001 8 add.u64 %r812,%r812,%r814; add.u64 %r816,%r837,%r818; $L313: .loc 1 3002 28 ld.u16 %r2284,[%r812]; ld.u16 %r2285,[%r816]; .loc 1 3002 5 cvt.u16.u32 %r2287,%r409; mad.lo.u16 %r2288,%r2284,%r2285,%r2287; cvt.s32.s16 %r409,%r2288; .loc 1 3001 32 add.u64 %r410,%r410,1; .loc 1 3001 8 add.u64 %r812,%r812,%r814; add.u64 %r816,%r816,%r818; $L312: .loc 1 3002 28 ld.u16 %r2290,[%r812]; ld.u16 %r2291,[%r816]; .loc 1 3002 5 cvt.u16.u32 %r2293,%r409; mad.lo.u16 %r2294,%r2290,%r2291,%r2293; cvt.s32.s16 %r409,%r2294; .loc 1 3001 32 add.u64 %r410,%r410,1; .loc 1 3001 8 add.u64 %r812,%r812,%r814; add.u64 %r816,%r816,%r818; setp.ne.u64 %r2295,%r272,%r410; @ %r2295 bra $L118; bra $L117; }
matmul_i4.o/
// (object-file boundary; a function prototype that followed is truncated — only its tail survives: ", .param .u32 %in_ar4, .param .u64 %in_ar5);")
.file 1 "../../../libgfortran/generated/matmul_i4.c"
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options
.extern .global .align 8 .u64 _gfortrani_compile_options[6];
// BEGIN GLOBAL FUNCTION DECL: _gfortran_size0
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarray
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error
// (.extern .func declarations truncated in extraction)
// (.const string data truncated; the surviving byte values decode to:)
// __func__$0: "matmul_i4" (leading byte lost)
// "GFC_DESCRIPTOR_RANK (a) == 2 || GFC_DESCRIPTOR_RANK (b) == " (tail truncated)
// "tran/generated/matmul_i4." (fragment of the source path)
// "Array bound mismatch for dimension 1 of array (%ld/%ld)"
// "Array bound mismatch for dimension 2 of array (%ld/%ld)"
// "Incorrect extent in argument B in MATMUL intrinsic in dimension 1: is %ld, should be %ld"
// "gemm != NUL" (tail truncated)
// (function definition; header, parameter list and opening brace garbled in extraction; register declarations follow, several entries fused)
.visible .func _gfortran_matmul_i4
.reg .u64 %r279; .reg .u32 %r280; .reg .u32u64 %r301; .reg .u32 %r302; .reg .u64u64u32 %r313; .reg .u64u64u64 %r374; .reg .u64u64 %r390; .reg .u64 %r392; .reg .u64u64u64 %r409; .reg .u64u64.reg .u64 %r530; .reg .u64 %r531; .reg .u64 %r532; .reg .u64 %r533; .reg .u64u64 %r544; .reg .u64 %r550; .reg .u64 %r553; .reg .u64 %r558; .reg .u64 %r559; .reg .u64 %r569; .reg .u64 %r571; .reg .u64 %r572; .reg .u64 %r573; .reg .u64 %r575; .reg .u64 %r576; .reg .u64 %r577; .reg .u64u64 %r590; .reg .u64 %r591; .reg .u64 %r592; .reg .u64 %r593; .reg .u64 %r594; .reg .u64u64.reg .u64 %r619; .reg .u64 %r621; .reg .u64 %r622; .reg .u64 %r623; .reg .u64 %r626; .reg .u64 %r627; .reg .u64 %r628; .reg .u64 %r631; .reg .u64 %r632; .reg .u64 %r633; .reg .u64 %r635; .reg .u64 %r636; .reg .u64 %r642; .reg .u64 %r644; .reg .u64 %r645; .reg .u64 %r649; .reg .u64 %r654; .reg .u64 %r655; .reg .u64 %r657; .reg .u64 %r658; .reg .u64 %r663; .reg .u64 %r665; .reg .u64 %r666; .reg .u64 %r674; .reg .u64 %r675; .reg .u64 %r677; .reg .u64 %r678; .reg .u64
%r680; .reg .u64 %r683; .reg .u64 %r685; .reg .u64 %r686; .reg .u64 %r691; .reg .u64 %r692; .reg .u64 %r694; .reg .u64 %r699; .reg .u64 %r701; .reg .u64 %r703; .reg .u64 %r707; .reg .u64 %r708; .reg .u64 %r712; .reg .u64 %r715; .reg .u64 %r717; .reg .u64 %r719; .reg .u64 %r723; .reg .u64 %r725; .reg .u64 %r727; .reg .u64 %r729; .reg .u64 %r732; .reg .u64 %r734; .reg .u64 %r736; .reg .u64 %r741; .reg .u64 %r743; .reg .u64 %r749; .reg .u64 %r750; .reg .u64 %r751; .reg .u64 %r752; .reg .u32 %r753; .reg .u32 %r754; .reg .u64 %r755; .reg .u16 %r756; .reg .pred %r757; .reg .u16 %r758; .reg .pred %r760; .reg .u64 %r761; .reg .u32 %r762; .reg .u64 %r763; .reg .u64 %r764; .reg .u64 %r765; .reg .pred %r766; .reg .pred %r768; .reg .u64 %r771; .reg .u64 %r773; .reg .u64 %r774; .reg .u64 %r775; .reg .u64 %r777; .reg .u64 %r778; .reg .u64 %r779; .reg .u64 %r780; .reg .u64 %r781; .reg .u64 %r784; .reg .u64 %r786; .reg .u64 %r787; .reg .u64 %r789; .reg .u64 %r791; .reg .u64 %r792; .reg .u64 %r794; .reg .u64 %r795; .reg .u64 %r796; .reg .u64 %r797; .reg .u64 %r798; .reg .u32 %r799; .reg .pred %r800; .reg .u64 %r801; .reg .u64 %r802; .reg .u64 %r803; .reg .pred %r805; .reg .u64 %r806; .reg .u64 %r807; .reg .u64 %r808; .reg .pred %r809; .reg .u64 %r810; .reg .u64 %r811; .reg .u64 %r812; .reg .u16 %r813; .reg .pred %r814; .reg .u64 %r815; .reg .u64 %r817; .reg .u64 %r818; .reg .u64 %r819; .reg .u16 %r820; .reg .pred %r822; .reg .pred %r823; .reg .u64 %r824; .reg .pred %r826; .reg .u64 %r827; .reg .u64 %r829; .reg .u64 %r830; .reg .u64 %r831; .reg .u64 %r832; .reg .u64 %r833; .reg .u64 %r834; .reg .pred %r835; .reg .u64 %r836; .reg .u64 %r838; .reg .u64 %r839; .reg .u64 %r840; .reg .u16 %r841; .reg .u32 %r842; .reg .pred %r843; .reg .u16 %r844; .reg .pred %r845; .reg .u64 %r846; .reg .u64 %r847; .reg .u64 %r848; .reg .u64 %r849; .reg .u64 %r850; .reg .u64 %r851; .reg .pred %r852; .reg .u32 %r854; .reg .u32 %r855; .reg .u32 %r857; .reg .u32 %r858; .reg .u16 %r859; .reg .u16 %r860; .reg .u16 %r861; .reg .u32 %r862; .reg .u16 %r863; .reg .pred %r864; .reg .u64 %r865; .reg .u16 %r867; .reg .pred %r868; .reg .u64 %r869; .reg .u64 %r870; .reg .u64 %r871; .reg .u32 %r872; .reg .u32 %r873; .reg .u32 %r874; .reg .u32 %r875; .reg .u32 %r876; .reg .u32 %r877; .reg .u32 %r879; .reg .u32 %r880; .reg .u16 %r881; .reg .u16 %r882; .reg .u16 %r883; .reg .u32 %r884; .reg .u16 %r885; .reg .pred %r886; .reg .u32 %r888; .reg .u32 %r889; .reg .u16 %r890; .reg .u16 %r891; .reg .u16 %r892; .reg .u32 %r893; .reg .u16 %r894; .reg .pred %r895; .reg .u32 %r897; .reg .u32 %r898; .reg .u32 %r900; .reg .u32 %r901; .reg .u16 %r902; .reg .u16 %r903; .reg .u16 %r904; .reg .u32 %r905; .reg .u16 %r906; .reg .pred %r907; .reg .f32 %r908; .reg .f32 %r909; .reg .f32 %r910; .reg .f32 %r911; .reg .f32 %r912; .reg .f32 %r913; .reg .f32 %r914; .reg .pred %r915; .reg .u32 %r916; .reg .u32 %r917; .reg .pred %r918; .reg .pred %r919; .reg .u32 %r921; .reg .u32 %r922; .reg .u32 %r924; .reg .u32 %r925; .reg .u16 %r926; .reg .u16 %r927; .reg .u16 %r928; .reg .u32 %r929; .reg .u16 %r930; .reg .pred %r931; .reg .u32 %r933; .reg .u32 %r934; .reg .u32 %r936; .reg .u32 %r937; .reg .u16 %r938; .reg .u16 %r939; .reg .u16 %r940; .reg .u32 %r941; .reg .u16 %r942; .reg .pred %r943; .reg .u32 %r945; .reg .u32 %r946; .reg .u32 %r948; .reg .u32 %r949; .reg .u16 %r950; .reg .u16 %r951; .reg .u16 %r952; .reg .u32 %r953; .reg .u16 %r954; .reg .pred %r955; .reg .pred %r956; .reg .u64 %r957; .reg .u32 %r958; .reg .u64 %r959; .reg .u64 %r960; .reg .u32 %r961; .reg .pred %r962; 
.reg .pred %r963; .reg .u32 %r964; .reg .pred %r965; .reg .pred %r966; .reg .u32 %r981; .reg .u64 %r982; .reg .u64 %r983; .reg .u64 %r984; .reg .u64 %r985; .reg .u64 %r986; .reg .u64 %r987; .reg .u64 %r988; .reg .u16 %r989; .reg .u16 %r990; .reg .u16 %r991; .reg .u32 %r992; .reg .u16 %r993; .reg .pred %r994; .reg .pred %r995; .reg .u16 %r996; .reg .pred %r997; .reg .u64 %r998; .reg .pred %r999; .reg .u32 %r1003; .reg .u64 %r1005; .reg .u32 %r1010; .reg .u32 %r1011; .reg .u32 %r1013; .reg .u32 %r1014; .reg .u16 %r1015; .reg .u16 %r1016; .reg .u16 %r1017; .reg .u32 %r1019; .reg .u32 %r1020; .reg .u16 %r1022; .reg .u16 %r1024; .reg .u32 %r1025; .reg .u16 %r1026; .reg .pred %r1027; .reg .pred %r1028; .reg .u64 %r1029; .reg .u64 %r1031; .reg .u64 %r1032; .reg .u64 %r1033; .reg .pred %r1035; .reg .u64 %r1037; .reg .u64 %r1038; .reg .u64 %r1041; .reg .u64 %r1042; .reg .u64 %r1046; .reg .u64 %r1047; .reg .u64 %r1048; .reg .u64 %r1050; .reg .u64 %r1051; .reg .u64 %r1054; .reg .u64 %r1057; .reg .u64 %r1058; .reg .u64 %r1061; .reg .u64 %r1063; .reg .u64 %r1064; .reg .u64 %r1065; .reg .u64 %r1066; .reg .u64 %r1067; .reg .pred %r1069; .reg .u64 %r1075; .reg .u64 %r1076; .reg .u64 %r1077; .reg .u64 %r1078; .reg .pred %r1080; .reg .u64 %r1081; .reg .u64 %r1082; .reg .u64 %r1083; .reg .u64 %r1084; .reg .u64 %r1085; .reg .u64 %r1087; .reg .u64 %r1088; .reg .u64 %r1089; .reg .u64 %r1090; .reg .u64 %r1091; .reg .u64 %r1092; .reg .u64 %r1093; .reg .u64 %r1094; .reg .u64 %r1098; .reg .u64 %r1099; .reg .u64 %r1100; .reg .u64 %r1101; .reg .u64 %r1105; .reg .u64 %r1107; .reg .u64 %r1110; .reg .u64 %r1113; .reg .u64 %r1118; .reg .u64 %r1119; .reg .u64 %r1121; .reg .u64 %r1122; .reg .u32 %r1123; .reg .u32 %r1124; .reg .u64 %r1125; .reg .u64 %r1126; .reg .u32 %r1127; .reg .u32 %r1128; .reg .v2 .u32 %r1130; .reg .u64 %r1131; .reg .u64 %r1132; .reg .u64 %r1137; .reg .u64 %r1138; .reg .u64 %r1139; .reg .u64 %r1141; .reg .u64 %r1142; .reg .pred %r1143; .reg .u64 %r1146; .reg .u64 %r1147; .reg .u64 %r1148; .reg .u64 %r1149; .reg .u64 %r1151; .reg .pred %r1152; .reg .u64 %r1153; .reg .u64 %r1154; .reg .u32 %r1155; .reg .u64 %r1158; .reg .u64 %r1159; .reg .u64 %r1160; .reg .u64 %r1161; .reg .u64 %r1162; .reg .u64 %r1163; .reg .u64 %r1164; .reg .u64 %r1165; .reg .u64 %r1166; .reg .u64 %r1168; .reg .u64 %r1169; .reg .u64 %r1170; .reg .u64 %r1171; .reg .u64 %r1176; .reg .u64 %r1179; .reg .u64 %r1182; .reg .u64 %r1185; .reg .u64 %r1188; .reg .u64 %r1190; .reg .pred %r1198; .reg .pred %r1199; .reg .pred %r1200; .reg .u64 %r1204; .reg .u32 %r1206; .reg .u32 %r1208; .reg .u32 %r1210; .reg .u32 %r1212; .reg .pred %r1214; .reg .pred %r1215; .reg .pred %r1216; .reg .u64 %r1217; .reg .u64 %r1218; .reg .u64 %r1220; .reg .u64 %r1221; .reg .u64 %r1222; .reg .u64 %r1223; .reg .u64 %r1231; .reg .u64 %r1232; .reg .u64 %r1233; .reg .u64 %r1235; .reg .u32 %r1237; .reg .u32 %r1239; .reg .u64 %r1240; .reg .u32 %r1242; .reg .u64 %r1243; .reg .u32 %r1245; .reg .pred %r1247; .reg .pred %r1248; .reg .u64 %r1251; .reg .u32 %r1253; .reg .u32 %r1254; .reg .pred %r1256; .reg .u16 %r1257; .reg .pred %r1258; .reg .u32 %r1260; .reg .u32 %r1261; .reg .u16 %r1262; .reg .u16 %r1263; .reg .u32 %r1265; .reg .u16 %r1266; .reg .pred %r1267; .reg .pred %r1268; .reg .u16 %r1269; .reg .pred %r1270; .reg .pred %r1271; .reg .pred %r1272; .reg .u64 %r1273; .reg .u64 %r1274; .reg .pred %r1276; .reg .u64 %r1277; .reg .u64 %r1278; .reg .u64 %r1280; .reg .u32 %r1282; .reg .u32 %r1283; .reg .pred %r1285; .reg .u64 %r1287; .reg .u64 %r1288; .reg .u64 %r1289; .reg .u32 
%r1291; .reg .u32 %r1292; .reg .pred %r1294; .reg .pred %r1295; .reg .u64 %r1297; .reg .u32 %r1299; .reg .u32 %r1300; .reg .pred %r1302; .reg .pred %r1303; .reg .pred %r1304; .reg .pred %r1305; .reg .u64 %r1306; .reg .pred %r1309; .reg .u64 %r1311; .reg .u32 %r1313; .reg .u32 %r1314; .reg .u32 %r1315; .reg .u32 %r1316; .reg .pred %r1318; .reg .pred %r1320; .reg .u64 %r1322; .reg .u64 %r1323; .reg .pred %r1325; .reg .u64 %r1326; .reg .u64 %r1327; .reg .u64 %r1329; .reg .u32 %r1331; .reg .u32 %r1332; .reg .pred %r1334; .reg .pred %r1335; .reg .u64 %r1336; .reg .pred %r1337; .reg .u64 %r1338; .reg .u64 %r1339; .reg .u64 %r1340; .reg .u16 %r1341; .reg .u32 %r1342; .reg .pred %r1343; .reg .u16 %r1344; .reg .u32 %r1345; .reg .pred %r1346; .reg .u64 %r1347; .reg .u64 %r1348; .reg .u64 %r1349; .reg .u64 %r1350; .reg .u64 %r1351; .reg .u64 %r1352; .reg .u64 %r1353; .reg .u32 %r1354; .reg .pred %r1355; .reg .pred %r1357; .reg .pred %r1358; .reg .pred %r1359; .reg .pred %r1360; .reg .pred %r1361; .reg .u64 %r1363; .reg .u32 %r1364; .reg .u64 %r1365; .reg .pred %r1366; .reg .pred %r1370; .reg .pred %r1372; .reg .pred %r1373; .reg .pred %r1374; .reg .u64 %r1382; .reg .u64 %r1383; .reg .u64 %r1384; .reg .u64 %r1385; .reg .u64 %r1390; .reg .u64 %r1391; .reg .u64 %r1392; .reg .u64 %r1393; .reg .u64 %r1395; .reg .u64 %r1396; .reg .u64 %r1397; .reg .u64 %r1398; .reg .u64 %r1399; .reg .pred %r1400; .reg .u64 %r1402; .reg .pred %r1404; .reg .u64 %r1407; .reg .u64 %r1408; .reg .u64 %r1409; .reg .pred %r1410; .reg .u64 %r1411; .reg .u64 %r1412; .reg .pred %r1416; .reg .u32 %r1417; .reg .pred %r1418; .reg .u64 %r1419; .reg .u64 %r1420; .reg .u64 %r1421; .reg .u64 %r1422; .reg .u64 %r1423; .reg .pred %r1424; .reg .pred %r1425; .reg .pred %r1426; .reg .u64 %r1429; .reg .u64 %r1430; .reg .u64 %r1431; .reg .u64 %r1433; .reg .pred %r1434; .reg .u64 %r1436; .reg .u64 %r1437; .reg .u64 %r1438; .reg .u64 %r1439; .reg .u64 %r1440; .reg .u64 %r1441; .reg .pred %r1442; .reg .pred %r1443; .reg .pred %r1444; .reg .u64 %r1450; .reg .u64 %r1451; .reg .u64 %r1452; .reg .u64 %r1453; .reg .u64 %r1454; .reg .pred %r1455; .reg .pred %r1456; .reg .pred %r1457; .reg .u64 %r1460; .reg .u64 %r1461; .reg .u64 %r1462; .reg .u64 %r1463; .reg .u64 %r1464; .reg .pred %r1465; .reg .pred %r1466; .reg .pred %r1467; .reg .u64 %r1468; .reg .u64 %r1469; .reg .u64 %r1473; .reg .pred %r1474; .reg .pred %r1475; .reg .pred %r1476; .reg .u64 %r1477; .reg .u64 %r1478; .reg .u64 %r1479; .reg .u64 %r1480; .reg .u64 %r1481; .reg .u64 %r1482; .reg .pred %r1483; .reg .pred %r1484; .reg .pred %r1485; .reg .u64 %r1488; .reg .u64 %r1489; .reg .u64 %r1490; .reg .u64 %r1491; .reg .u64 %r1492; .reg .pred %r1493; .reg .pred %r1494; .reg .pred %r1495; .reg .u64 %r1500; .reg .pred %r1501; .reg .pred %r1502; .reg .pred %r1503; .reg .u64 %r1508; .reg .pred %r1509; .reg .pred %r1510; .reg .pred %r1511; .reg .u64 %r1516; .reg .pred %r1517; .reg .pred %r1518; .reg .pred %r1519; .reg .u64 %r1524; .reg .pred %r1525; .reg .pred %r1526; .reg .pred %r1527; .reg .u64 %r1530; .reg .u64 %r1531; .reg .u64 %r1533; .reg .u64 %r1534; .reg .u64 %r1537; .reg .u64 %r1538; .reg .u32 %r1539; .reg .u32 %r1540; .reg .u64 %r1541; .reg .u64 %r1542; .reg .u32 %r1543; .reg .u32 %r1544; .reg .u64 %r1545; .reg .u64 %r1546; .reg .u64 %r1548; .reg .u64 %r1549; .reg .u32 %r1550; .reg .u32 %r1551; .reg .u64 %r1552; .reg .u64 %r1553; .reg .u32 %r1554; .reg .u32 %r1555; .reg .u64 %r1556; .reg .u64 %r1557; .reg .u64 %r1559; .reg .u64 %r1560; .reg .u32 %r1561; .reg .u32 %r1562; .reg .u64 %r1563; .reg 
.u64 %r1564; .reg .u32 %r1565; .reg .u32 %r1566; .reg .pred %r1567; .reg .u32 %r1570; .reg .u32 %r1574; .reg .u32 %r1578; .reg .pred %r1579; .reg .u32 %r1580; .reg .u32 %r1581; .reg .u32 %r1582; .reg .u32 %r1583; .reg .u32 %r1589; .reg .u32 %r1590; .reg .u32 %r1592; .reg .u32 %r1593; .reg .u32 %r1595; .reg .u32 %r1596; .reg .u32 %r1598; .reg .u32 %r1599; .reg .u32 %r1601; .reg .u32 %r1608; .reg .u32 %r1609; .reg .u32 %r1611; .reg .u32 %r1612; .reg .u32 %r1614; .reg .u32 %r1615; .reg .u32 %r1617; .reg .u32 %r1618; .reg .u32 %r1620; .reg .u32 %r1627; .reg .u32 %r1628; .reg .u32 %r1630; .reg .u32 %r1632; .reg .u32 %r1634; .reg .pred %r1636; .reg .pred %r1637; .reg .u32 %r1638; .reg .u32 %r1639; .reg .u32 %r1640; .reg .u32 %r1641; .reg .u64 %r1644; .reg .u32 %r1645; .reg .u64 %r1646; .reg .u32 %r1647; .reg .u32 %r1649; .reg .u64 %r1651; .reg .u32 %r1652; .reg .u64 %r1654; .reg .u32 %r1655; .reg .pred %r1657; .reg .u32 %r1658; .reg .u32 %r1661; .reg .u32 %r1662; .reg .u32 %r1664; .reg .u32 %r1668; .reg .u32 %r1669; .reg .u32 %r1671; .reg .u32 %r1675; .reg .u32 %r1676; .reg .pred %r1678; .reg .pred %r1679; .reg .u32 %r1680; .reg .u32 %r1683; .reg .u32 %r1684; .reg .u32 %r1686; .reg .u32 %r1690; .reg .u32 %r1691; .reg .u32 %r1693; .reg .u32 %r1697; .reg .u32 %r1698; .reg .pred %r1700; .reg .pred %r1701; .reg .u32 %r1702; .reg .u64 %r1703; .reg .u32 %r1705; .reg .u32 %r1706; .reg .u32 %r1708; .reg .u64 %r1709; .reg .u32 %r1712; .reg .u32 %r1713; .reg .u32 %r1715; .reg .u64 %r1716; .reg .u32 %r1719; .reg .u32 %r1720; .reg .pred %r1722; .reg .pred %r1723; .reg .u32 %r1724; .reg .u64 %r1726; .reg .u64 %r1727; .reg .u32 %r1728; .reg .u32 %r1729; .reg .u32 %r1731; .reg .u64 %r1733; .reg .u64 %r1734; .reg .u32 %r1736; .reg .u32 %r1737; .reg .u32 %r1739; .reg .u64 %r1741; .reg .u64 %r1742; .reg .u32 %r1744; .reg .u32 %r1745; .reg .pred %r1747; .reg .u64 %r1749; .reg .u64 %r1751; .reg .u64 %r1754; .reg .pred %r1756; .reg .pred %r1757; .reg .pred %r1758; .reg .u64 %r1760; .reg .u64 %r1761; .reg .u32 %r1762; .reg .u32 %r1763; .reg .u32 %r1765; .reg .u32 %r1766; .reg .u64 %r1768; .reg .u64 %r1769; .reg .u32 %r1771; .reg .u32 %r1772; .reg .u32 %r1774; .reg .u32 %r1775; .reg .u64 %r1777; .reg .u64 %r1778; .reg .u32 %r1780; .reg .u32 %r1781; .reg .u32 %r1783; .reg .u32 %r1784; .reg .pred %r1785; .reg .pred %r1786; .reg .u32 %r1788; .reg .u64 %r1790; .reg .u64 %r1791; .reg .u32 %r1792; .reg .u32 %r1793; .reg .u32 %r1795; .reg .u64 %r1797; .reg .u64 %r1798; .reg .u32 %r1800; .reg .u32 %r1801; .reg .u32 %r1803; .reg .u64 %r1805; .reg .u64 %r1806; .reg .u32 %r1808; .reg .u32 %r1809; .reg .pred %r1811; .reg .u32 %r1812; .reg .u32 %r1813; .reg .u32 %r1816; .reg .u32 %r1817; .reg .u32 %r1820; .reg .u32 %r1821; .reg .pred %r1823; .reg .u64 %r1824; .reg .u32 %r1825; .reg .u64 %r1826; .reg .u32 %r1827; .reg .u32 %r1829; .reg .u64 %r1831; .reg .u32 %r1832; .reg .u64 %r1834; .reg .u32 %r1835; .reg .pred %r1837; .reg .u32 %r1838; .reg .u32 %r1839; .reg .u32 %r1841; .reg .u32 %r1843; .reg .u32 %r1845; .reg .u32 %r1848; .reg .u32 %r1849; .reg .u32 %r1851; .reg .u32 %r1853; .reg .u32 %r1855; .reg .u32 %r1858; .reg .u32 %r1859; .reg .u32 %r1861; .reg .u32 %r1863; .reg .u32 %r1865; .reg .pred %r1867; .reg .u32 %r1868; .reg .u32 %r1870; .reg .u32 %r1872; .reg .pred %r1873; .reg .u64 %r1874; .reg .u64 %r1875; .reg .u32 %r1876; .reg .u32 %r1877; .reg .u64 %r1878; .reg .u64 %r1879; .reg .u32 %r1880; .reg .u32 %r1881; .reg .u64 %r1883; .reg .u64 %r1884; .reg .u32 %r1885; .reg .u32 %r1886; .reg .u64 %r1887; .reg .u64 %r1888; .reg .u32 
%r1889; .reg .u32 %r1890; .reg .u64 %r1892; .reg .u64 %r1893; .reg .u32 %r1894; .reg .u32 %r1895; .reg .u64 %r1896; .reg .u64 %r1897; .reg .u32 %r1898; .reg .u32 %r1899; .reg .pred %r1900; .reg .u32 %r1901; .reg .u64 %r1902; .reg .u32 %r1904; .reg .u64 %r1905; .reg .u32 %r1907; .reg .u64 %r1908; .reg .pred %r1909; .reg .u32 %r1910; .reg .u64 %r1911; .reg .u64 %r1913; .reg .u32 %r1915; .reg .u64 %r1916; .reg .u64 %r1918; .reg .u32 %r1920; .reg .u64 %r1921; .reg .pred %r1922; .reg .u32 %r1923; .reg .u32 %r1924; .reg .u32 %r1927; .reg .u32 %r1928; .reg .u32 %r1931; .reg .u32 %r1932; .reg .pred %r1934; .reg .u32 %r1935; .reg .u32 %r1936; .reg .u32 %r1939; .reg .u32 %r1940; .reg .u32 %r1943; .reg .u32 %r1944; .reg .pred %r1946; .reg .u32 %r1947; .reg .u32 %r1948; .reg .u32 %r1951; .reg .u32 %r1952; .reg .u32 %r1955; .reg .u32 %r1956; .reg .pred %r1958; .reg .u32 %r1959; .reg .u32 %r1960; .reg .u32 %r1962; .reg .u32 %r1963; .reg .u32 %r1965; .reg .u32 %r1966; .reg .u32 %r1968; .reg .u32 %r1969; .reg .u32 %r1971; .reg .u32 %r1972; .reg .u32 %r1974; .reg .u32 %r1975; .reg .pred %r1976; .reg .pred %r1979; .reg .u32 %r1980; .reg .u32 %r1981; .reg .u32 %r1984; .reg .u32 %r1985; .reg .u32 %r1988; .reg .u32 %r1989; .reg .pred %r1991; mov.u64 %r750,%ar0; mov.u64 %r751,%ar1; mov.u64 %r752,%ar2; mov.u32 %r753,%ar3; mov.u32 %r754,%ar4; mov.u64 %r755,%ar5; .loc 1 2462 3 ld.s8 %r447,[%r751+28]; cvt.u16.u32 %r756,%r447; setp.eq.u16 %r757,%r756,2; @ %r757 bra $L2; ld.s8 %r1364,[%r752+28]; cvt.u16.u32 %r758,%r1364; setp.eq.u16 %r760,%r758,2; @ %r760 bra $L3; cvta.const.u64 %r764,$LC3; cvta.const.u64 %r763,__func__$0; mov.u32 %r762,2462; cvta.const.u64 %r76177627633: .loc 1 2476 6 ld.u64 %r765,[%r750]; setp.ne.u64 %r766,%r765,0; @ %r766 bra $L4; .loc 1 2478 10 setp.ne.u16 %r768,%r756,1; @ %r768 bra $L5; .loc 1 2480 4 st.u64 [%r750+48],%r765; ld.u64 %r771,[%r752+80]; ld.u64 %r773,[%r752+72]; sub.u64 %r774,%r771,%r773; st.u64 [%r750+56],%r774; mov.u64 %r775,1; st.u64 [%r750+40],%r775; bra $L6; $L123: .loc 1 2485 4 st.u64 [%r750+48],%r1336; add.u64 %r777,%r509,-1; st.u64 [%r750+56],%r777; mov.u64 %r778,1; st.u64 [%r750+40],%r778; bra $L6; $L124: .loc 1 2490 4 mov.u64 %r779,0; st.u64 [%r750+48],%r779; add.u64 %r780,%r509,-1; st.u64 [%r750+56],%r780; mov.u64 %r781,1; st.u64 [%r750+40],%r781; .loc 1 2493 11 st.u64 [%r750+72],%r779; ld.u64 %r784,[%r752+80]; ld.u64 %r786,[%r752+72]; sub.u64 %r787,%r784,%r786; st.u64 [%r750+80],%r787; st.u64 [%r750+64],%r509; $L6: .loc 1 2499 1750; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r789,[%value_in]; } .loc 1 2499 4791; call (%value_in),_gfortrani_xmallocarray792,[%value_in]; } .loc 1 2499 2 st.u64 [%r750],%r792; .loc 1 2500 24 mov.u64 %r794,0; st.u64 [%r750+8],%r794; .loc 1 2557 7 ld.s8 %r447,[%r751+28]; .loc 1 2564 15 ld.u64 %r796,[%r751+56]; add.u64 %r795,%r796,1; .loc 1 2564 13 ld.u64 %r797,[%r751+48]; sub.u64 %r457,%r795,%r797; bra $L7; $L4: .loc 1 2502 12 cvta.global.u64 %r798,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r799,[%r798+36]; setp.eq.u32 %r800,%r799,0; @ %r800 bra $L8; .loc 1 2509 17 ld.u64 %r802,[%r750+56]; add.u64 %r801,%r802,1; .loc 1 2509 15 ld.u64 %r803,[%r750+48]; sub.u64 %r457,%r801,%r803; .loc 1 2506 10 setp.ne.u16 %r805,%r756,1; @ %r805 bra $L9; .loc 1 2508 17 ld.u64 %r807,[%r752+80]; add.u64 %r806,%r807,1; .loc 1 2508 15 ld.u64 %r808,[%r752+72]; sub.u64 %r240,%r806,%r808; .loc 1 2510 7 setp.ne.u64 %r809,%r240,%r457; @ %r809 bra $L10; .loc 1 2564 15 ld.u64 %r811,[%r751+56]; add.u64 %r810,%r811,1; .loc 1 2564 13 ld.u64 
%r812,[%r751+48]; sub.u64 %r457,%r810,%r812; .loc 1 2543 7 ld.s8 %r332,[%r750+28]; .loc 1 2548 27 ld.u64 %r445,[%r750+40]; .loc 1 2543 6 cvt.u16.u32 %r813,%r332; setp.eq.u16 %r814,%r813,1; @ %r814 bra $L11; .loc 1 2553 16 ld.u64 %r505,[%r750+64]; .loc 1 2560 16 ld.u64 %r504,[%r751+40]; .loc 1 2563 14 mov.u64 %r215,1; .loc 1 2561 16 mov.u64 %r208,%r215; bra $L12; $L10: .loc 1 2511 6 st.u64 [%stack+8],%r240; st.u64 [%stack],%r457; cvta.const.u64 %r815815_gfortran_runtime_error2485 4 ld.u64 %r818,[%r751+56]; add.u64 %r817,%r818,1; ld.u64 %r819,[%r751+48]; sub.u64 %r513,%r817,%r819; .loc 1 2515 15 cvt.u16.u32 %r820,%r1364; setp.ne.u16 %r822,%r820,1; @ %r822 bra $L13; .loc 1 2519 7 setp.eq.u64 %r823,%r513,%r457; @ %r823 bra $L14; .loc 1 2520 6 st.u64 [%stack+8],%r513; st.u64 [%stack],%r457; cvta.const.u64 %r8248stack; call _gfortran_runtime_error13: .loc 1 2528 7 setp.eq.u64 %r826,%r513,%r457; @ %r826 bra $L15; .loc 1 2529 6 st.u64 [%stack+8],%r513; st.u64 [%stack],%r457; cvta.const.u64 %r827827_gfortran_runtime_error15: .loc 1 2533 17 ld.u64 %r830,[%r752+80]; add.u64 %r829,%r830,1; .loc 1 2533 15 ld.u64 %r831,[%r752+72]; sub.u64 %r237,%r829,%r831; .loc 1 2534 17 ld.u64 %r833,[%r750+80]; add.u64 %r832,%r833,1; .loc 1 2534 15 ld.u64 %r834,[%r750+72]; sub.u64 %r238,%r832,%r834; .loc 1 2535 7 setp.eq.u64 %r835,%r237,%r238; @ %r835 bra $L14; .loc 1 2536 6 st.u64 [%stack+8],%r237; st.u64 [%stack],%r238; cvta.const.u64 %r836,$LC836_gfortran_runtime_error2564 15 ld.u64 %r839,[%r751+56]; add.u64 %r838,%r839,1; .loc 1 2564 13 ld.u64 %r840,[%r751+48]; sub.u64 %r457,%r838,%r840; $L7: .loc 1 2548 27 ld.u64 %r505,[%r750+40]; .loc 1 2543 6 ld.s8 %r842,[%r750+28]; cvt.u16.u32 %r841,%r842; setp.eq.u16 %r843,%r841,1; @ %r843 bra $L130; $L127: .loc 1 2553 16 mov.u64 %r445,%r505; ld.u64 %r505,[%r750+64]; bra $L16; $L130: .loc 1 2548 27 mov.u64 %r445,%r505; $L16: .loc 1 2560 16 ld.u64 %r504,[%r751+40]; .loc 1 2557 6 cvt.u16.u32 %r844,%r447; setp.eq.u16 %r845,%r844,1; @ %r845 bra $L131; $L125: .loc 1 2569 16 ld.u64 %r208,[%r751+64]; .loc 1 2571 15 ld.u64 %r847,[%r751+80]; add.u64 %r846,%r847,1; .loc 1 2571 13 ld.u64 %r848,[%r751+72]; mov.u64 %r215,%r457; sub.u64 %r457,%r846,%r848; bra $L12; $L131: .loc 1 2563 14 mov.u64 %r215,1; .loc 1 2561 16 mov.u64 %r208,%r215; $L12: .loc 1 2575 16 ld.u64 %r850,[%r752+56]; add.u64 %r849,%r850,1; ld.u64 %r851,[%r752+48]; sub.u64 %r63,%r849,%r851; .loc 1 2575 6 setp.eq.u64 %r852,%r63,%r457; @ %r852 bra $L17; .loc 1 2577 10 set.u32.gt.s64 %r854,%r457,0; neg.s32 %r855,%r854; .loc 1 2577 21 set.u32.gt.s64 %r857,%r63,0; neg.s32 %r858,%r857; cvt.u16.u32 %r860,%r855; cvt.u16.u32 %r861,%r858; or.b16 %r859,%r860,%r861; cvt.u32.u16 %r862,%r859; cvt.u16.u8 %r863,%r862; setp.eq.u16 %r864,%r863,0; @ %r864 bra $L17; .loc 1 2578 2 st.u64 [%stack+8],%r457; st.u64 [%stack],%r63; cvta.const.u64 %r865865_gfortran_runtime_error17: .loc 1 2583 7 ld.s8 %r65,[%r752+28]; .loc 1 2586 16 ld.u64 %r503,[%r752+40]; .loc 1 2583 6 cvt.u16.u32 %r867,%r65; setp.eq.u16 %r868,%r867,1; @ %r868 bra $L132; .loc 1 2597 16 ld.u64 %r209,[%r752+64]; .loc 1 2598 16 ld.u64 %r870,[%r752+80]; add.u64 %r869,%r870,1; .loc 1 2598 14 ld.u64 %r871,[%r752+72]; sub.u64 %r216,%r869,%r871; bra $L18; $L132: .loc 1 2592 14 mov.u64 %r216,1; .loc 1 2591 16 mov.u64 %r209,256; $L18: .loc 1 2601 9 ld.u64 %r244,[%r751]; .loc 1 2602 9 ld.u64 %r245,[%r752]; .loc 1 2603 8 ld.u64 %r246,[%r750]; .loc 1 2612 28 set.u32.eq.u64 %r873,%r445,1; neg.s32 %r874,%r873; cvt.u32.u32 %r872,%r874; cvt.u32.u8 %r70,%r872; .loc 1 2612 46 set.u32.eq.u64 
%r876,%r504,1; neg.s32 %r877,%r876; cvt.u32.u32 %r875,%r877; cvt.u32.u8 %r502,%r875; .loc 1 2612 7 set.u32.ne.u32 %r879,%r753,0; neg.s32 %r880,%r879; .loc 1 2612 16 cvt.u16.u32 %r882,%r880; cvt.u16.u32 %r883,%r70; and.b16 %r881,%r882,%r883; .loc 1 2612 6 cvt.u32.u16 %r884,%r881; cvt.u16.u8 %r885,%r884; setp.eq.u16 %r886,%r885,0; @ %r886 bra $L19; .loc 1 2612 63 set.u32.eq.u64 %r888,%r208,1; neg.s32 %r889,%r888; .loc 1 2612 51 cvt.u16.u32 %r891,%r889; cvt.u16.u32 %r892,%r502; or.b16 %r890,%r891,%r892; .loc 1 2612 33 cvt.u32.u16 %r893,%r890; cvt.u16.u8 %r894,%r893; setp.eq.u16 %r895,%r894,0; @ %r895 bra $L20; .loc 1 2613 20 set.u32.eq.u64 %r897,%r503,1; neg.s32 %r898,%r897; .loc 1 2613 37 set.u32.eq.u64 %r900,%r209,1; neg.s32 %r901,%r900; .loc 1 2613 25 cvt.u16.u32 %r903,%r898; cvt.u16.u32 %r904,%r901; or.b16 %r902,%r903,%r904; .loc 1 2613 7 cvt.u32.u16 %r905,%r902; cvt.u16.u8 %r906,%r905; setp.eq.u16 %r907,%r906,0; @ %r907 bra $L20; .loc 1 2615 13 cvt.rn.f32.s32 %r81,%r754; .loc 1 2614 12 cvt.rn.f32.s64 %r908,%r215; .loc 1 2614 50 cvt.rn.f32.s64 %r909,%r457; .loc 1 2614 47 mul.f32 %r910,%r908,%r909; .loc 1 2614 31 cvt.rn.f32.s64 %r911,%r216; .loc 1 2614 47 mul.f32 %r912,%r910,%r911; .loc 1 2615 13 mul.f32 %r913,%r81,%r81; mul.f32 %r914,%r913,%r81; .loc 1 2614 7 setp.gt.f32 %r915,%r912,%r914; @ ! %r915 bra $L19; .loc 1 2617 17 cvt.u32.u64 %r84,%r215; st.u32 [%frame+28],%r84; .loc 1 2617 29 cvt.u32.u64 %r85,%r216; st.u32 [%frame+24],%r85; .loc 1 2617 41 cvt.u32.u64 %r86,%r457; st.u32 [%frame+20],%r86; .loc 1 2617 52 cvt.u32.u64 %r87,%r505; st.u32 [%frame+16],%r87; .loc 1 2618 27 mov.u32 %r916,1; st.u32 [%frame+12],%r916; .loc 1 2618 36 mov.u32 %r917,0; st.u32 [%frame+8],%r917; .loc 1 2619 17 setp.ne.u64 %r918,%r504,1; @ %r918 bra $L22; cvt.u32.u64 %r229,%r208; bra $L23; $L22: cvt.u32.u64 %r229,%r504; $L23: st.u32 [%frame+4],%r229; .loc 1 2620 3 setp.ne.u64 %r919,%r503,1; @ %r919 bra $L24; cvt.u32.u64 %r230,%r209; bra $L25; $L24: cvt.u32.u64 %r230,%r503; $L25: st.u32 [%frame],%r230; .loc 1 2622 15 set.u32.gt.s32 %r921,%r229,0; neg.s32 %r922,%r921; .loc 1 2622 26 set.u32.gt.s32 %r924,%r230,0; neg.s32 %r925,%r924; .loc 1 2622 19 cvt.u16.u32 %r927,%r922; cvt.u16.u32 %r928,%r925; and.b16 %r926,%r927,%r928; .loc 1 2622 10 cvt.u32.u16 %r929,%r926; cvt.u16.u8 %r930,%r929; setp.eq.u16 %r931,%r930,0; @ %r931 bra $L19; .loc 1 2622 37 set.u32.gt.s32 %r933,%r87,0; neg.s32 %r934,%r933; .loc 1 2622 46 set.u32.gt.s32 %r936,%r84,1; neg.s32 %r937,%r936; .loc 1 2622 41 cvt.u16.u32 %r939,%r934; cvt.u16.u32 %r940,%r937; and.b16 %r938,%r939,%r940; cvt.u32.u16 %r941,%r938; cvt.u16.u8 %r942,%r941; setp.eq.u16 %r943,%r942,0; @ %r943 bra $L19; .loc 1 2622 55 set.u32.gt.s32 %r945,%r85,1; neg.s32 %r946,%r945; .loc 1 2622 64 set.u32.gt.s32 %r948,%r86,1; neg.s32 %r949,%r948; .loc 1 2622 59 cvt.u16.u32 %r951,%r946; cvt.u16.u32 %r952,%r949; and.b16 %r950,%r951,%r952; cvt.u32.u16 %r953,%r950; cvt.u16.u8 %r954,%r953; setp.eq.u16 %r955,%r954,0; @ %r955 bra $L19; .loc 1 2624 4 setp.ne.u64 %r956,%r755,0; @ %r956 bra $L27; cvta.const.u64 %r960,$LC8; cvta.const.u64 %r959,__func__$0; mov.u32 %r958,2624; cvta.const.u64 %r9959027: .loc 1 2626 17 and.b32 %r961,%r753,2; .loc 1 2626 7 setp.ne.u32 %r962,%r961,0; @ %r962 bra $L133; .loc 1 2629 35 setp.eq.u64 %r963,%r504,1; @ %r963 bra $L134; cvta.const.u64 %r217,$LC2; bra $L28; $L133: .loc 1 2627 13 cvta.const.u64 %r217,$LC1; bra $L28; $L134: .loc 1 2629 35 cvta.const.u64 %r217,$LC0; $L28: .loc 1 2631 17 and.b32 %r964,%r753,4; .loc 1 2631 7 setp.ne.u32 %r965,%r964,0; @ %r965 bra $L135; 
.loc 1 2634 35 setp.eq.u64 %r966,%r503,1; @ %r966 bra $L136; cvta.const.u64 %r218,$LC2; bra $L29; $L135: .loc 1 2632 13 cvta.const.u64 %r218,$LC1; bra $L29; $L136: .loc 1 2634 35 cvta.const.u64 %r218,$LC0; $L29: .loc 1 2636 4 add.u64 %r982,%frame,16; add.u64 %r983,%frame,8; add.u64 %r984,%frame,4; add.u64 %r985,%frame,12; add.u64 %r986,%frame,20; add.u64 %r987,%frame,24; add.u64 %r988,%frame,28; mov.u32 %r981,1;,.param .u64 %in_ar3,.param .u64 %in_ar4,.param .u64 %in_ar5,.param .u64 %in_ar6,.param .u64 %in_ar7,.param .u64 %in_ar8,.param .u64 %in_ar9,.param .u64 %in_ar10,.param .u64 %in_ar11,.param .u64 %in_ar12,.param .u32 %in_ar13,.param .u32 %in_ar14);988987986244; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r984; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r245; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%frame; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r983; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r246; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r982; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r981; .param .u32 %out_arg15; st.param.u32 [%out_arg15],%r981; call %r755,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14,%out_arg15),$LCT0; } bra $L1; $L19: .loc 1 2643 21 cvt.u16.u32 %r990,%r70; cvt.u16.u32 %r991,%r502; and.b16 %r989,%r990,%r991; .loc 1 2643 6 cvt.u32.u16 %r992,%r989; cvt.u16.u8 %r993,%r992; setp.eq.u16 %r994,%r993,0; @ %r994 bra $L31; .loc 1 2643 38 setp.ne.u64 %r995,%r503,1; @ %r995 bra $L20; .loc 1 2644 7 cvt.u16.u32 %r996,%r65; setp.eq.u16 %r997,%r996,1; @ %r997 bra $L32; .loc 1 2677 16 add.u64 %r998,%r505,1; .loc 1 2678 9 shl.b64 %r103,%r998,2; sub.u64 %r248,%r246,%r103; .loc 1 2687 7 setp.le.s64 %r999,%r216,0; @ %r999 bra $L33; add.u64 %r623,%r103,-4; mov.u64 %r622,%r246; add.u64 %r626,%r216,1; .loc 1 2689 22 shl.b64 %r442,%r215,2; .loc 1 2687 13 mov.u64 %r298,%r503; setp.gt.s64 %r1361,%r215,0; and.b64 %r1473,%r216,3; setp.eq.u64 %r1476,%r1473,0; @ %r1476 bra $L34; setp.eq.u64 %r1475,%r1473,1; @ %r1475 bra $L300; setp.eq.u64 %r1474,%r1473,2; @ %r1474 bra $L301; bra $L366; $L35: .loc 1 2689 22 mov.u32 %r100310031005,[%value_in]; } $L36: .loc 1 2687 24 add.u64 %r1477,%r298,1; .loc 1 2687 7 add.u64 %r1530,%r622,%r623; .loc 1 2688 2 @ %r1361 bra $L238; bra $L367; $L34: @ %r1361 bra $L35; bra $L36; $L33: .loc 1 2692 28 set.u32.eq.u64 %r1010,%r457,0; neg.s32 %r1011,%r1010; .loc 1 2692 13 set.u32.eq.u64 %r1013,%r215,0; neg.s32 %r1014,%r1013; .loc 1 2692 28 cvt.u16.u32 %r1016,%r1011; cvt.u16.u32 %r1017,%r1014; or.b16 %r1015,%r1016,%r1017; .loc 1 2692 23 set.u32.eq.u64 %r1019,%r216,0; neg.s32 %r1020,%r1019; .loc 1 2692 28 cvt.u16.u32 %r1024,%r1020; or.b16 %r1022,%r1015,%r1024; cvt.u32.u16 %r1025,%r1022; cvt.u16.u8 %r1026,%r1025; setp.eq.u16 %r1027,%r1026,0; @ ! 
%r1027 bra $L1; .loc 1 2697 10 setp.eq.u64 %r1028,%r208,1; selp.u64 %r227,%r505,%r208,%r1028; .loc 1 2702 21 shl.b64 %r1029,%r227,8; .loc 1 2702 14 add.u64 %r253,%r1029,%r209; .loc 1 2706 12 min.s64 %r1031,%r253,65536; shl.b64 %r1032,%r1031,10321033,[%value_in]; } .loc 1 2710 7 setp.gt.s64 %r1035,%r216,0; @ %r1035 bra $L39; $L42: .loc 1 2910332918 7 bra $L1; $L39: .loc 1 2680 16 add.u64 %r1037,%r208,1; .loc 1 2681 9 shl.b64 %r106,%r1037,2; sub.u64 %r250,%r244,%r106; .loc 1 2683 16 add.u64 %r1038,%r209,1; .loc 1 2684 9 shl.b64 %r109,%r1038,2; shl.b64 %r596,%r209,9; shl.b64 %r598,%r209,11; shl.b64 %r608,%r209,3; sub.u64 %r599,%r245,%r109; add.u64 %r597,%r608,%r599; shl.b64 %r602,%r505,9; add.u64 %r603,%r505,%r505; add.u64 %r605,%r103,-4; neg.s64 %r1041,%r209; shl.b64 %r1042,%r1041,11; shl.b64 %r606,%r1041,2; shl.b64 %r610,%r505,11; shl.b64 %r1046,%r505,3; add.u64 %r1047,%r1046,4; add.u64 %r609,%r248,%r1047; add.u64 %r1048,%r216,-1; and.b64 %r618,%r1048,-512; add.u64 %r619,%r599,%r606; shl.b64 %r575,%r208,8; shl.b64 %r577,%r208,10; add.u64 %r578,%r106,-4; neg.s64 %r1050,%r208; shl.b64 %r1051,%r1050,10; shl.b64 %r1054,%r1050,3; shl.b64 %r540,%r208,3; add.u64 %r541,%r208,%r208; add.u64 %r231,%r109,-4; shl.b64 %r225,%r209,4; shl.b64 %r220,%r505,4; neg.s64 %r1057,%r505; shl.b64 %r1058,%r1057,2; shl.b64 %r1061,%r1041,4; mov.u64 %r604,%r605; mov.u64 %r600,%r603; mov.u64 %r594,%r209; mov.u64 %r592,0; setp.gt.s64 %r1374,%r457,0; add.u64 %r1409,%r619,4; setp.gt.s64 %r1410,%r215,0; add.u64 %r1411,%r215,1; add.u64 %r1412,%r250,8; bra $L41; $L138: mov.u64 %r592,%r593; $L41: add.u64 %r621,%r592,1; .loc 1 2715 9 sub.u64 %r255,%r216,%r592; min.s64 %r1531,%r255,512; .loc 1 2716 24 shr.s64 %r1063,%r1531,63; shr.u64 %r1064,%r1063,62; add.u64 %r1065,%r1531,%r1064; and.b64 %r1066,%r1065,3; sub.u64 %r1067,%r1066,%r1064; .loc 1 2716 10 sub.u64 %r256,%r1531,%r1067; .loc 1 2718 4 @ %r1374 bra $L40; $L45: .loc 1 2710 7 add.u64 %r593,%r592,512; add.u64 %r594,%r594,%r596; add.u64 %r597,%r597,%r598; add.u64 %r600,%r600,%r602; add.u64 %r604,%r604,%r602; add.u64 %r606,%r606,%r1042; add.u64 %r608,%r608,%r598; add.u64 %r609,%r609,%r610; setp.ne.u64 %r1069,%r592,%r618; @ %r1069 bra $L138; bra $L42; $L40: add.u64 %r572,%r1409,%r608; .loc 1 2768 13 add.u64 %r482,%r256,%r621; add.u64 %r478,%r231,%r608; add.u64 %r476,%r231,%r478; add.u64 %r590,%r231,%r597; add.u64 %r591,%r231,%r590; mul.lo.u64 %r406,%r482,%r505; mul.lo.u64 %r399,%r482,%r209; mad.lo.u64 %r587,%r482,%r231,%r599; mov.u64 %r579,%r1054; mov.u64 %r576,%r578; mov.u64 %r573,%r208; .loc 1 2718 12 mov.u64 %r259,%r503; add.u64 %r1365,%r457,1; .loc 1 2768 8 add.u64 %r1402,%r482,-1; setp.ge.s64 %r1404,%r1402,%r621; sub.u64 %r1407,%r600,%r505; add.u64 %r1408,%r1407,%r603; $L44: .loc 1 2723 13 sub.u64 %r257,%r1365,%r259; min.s64 %r1533,%r257,256; .loc 1 2724 28 shr.u64 %r1075,%r1533,63; add.u64 %r1076,%r1533,%r1075; and.b64 %r1077,%r1076,1; sub.u64 %r1078,%r1077,%r1075; .loc 1 2724 14 sub.u64 %r258,%r1533,%r1078; .loc 1 2727 8 @ %r1410 bra $L43; $L73: .loc 1 2718 30 add.u64 %r259,%r259,256; .loc 1 2718 4 add.u64 %r572,%r572,1024; add.u64 %r573,%r573,%r575; add.u64 %r576,%r576,%r577; add.u64 %r579,%r579,%r1051; setp.ge.s64 %r1080,%r457,%r259; @ %r1080 bra $L44; bra $L45; $L43: .loc 1 2763 26 add.u64 %r1081,%r259,-1; add.u64 %r1082,%r1081,%r1533; .loc 1 2763 31 mul.lo.u64 %r316,%r1082,%r208; .loc 1 2734 13 add.u64 %r481,%r258,%r259; .loc 1 2734 8 add.u64 %r483,%r481,-1; add.u64 %r1083,%r316,1; shl.b64 %r1084,%r1083,2; add.u64 %r559,%r250,%r1084; add.u64 %r569,%r573,%r208; 
sub.u64 %r1085,%r569,%r316; shl.b64 %r571,%r1085,2; shl.b64 %r1087,%r1533,2; add.u64 %r1088,%r1087,-4; add.u64 %r100,%r1033,%r1088; add.u64 %r1089,%r259,%r594; add.u64 %r1090,%r1089,%r1533; shl.b64 %r1091,%r1090,2; add.u64 %r224,%r1091,%r599; shl.b64 %r589,%r259,2; add.u64 %r48,%r589,%r597; add.u64 %r30,%r589,%r590; add.u64 %r242,%r589,%r591; add.u64 %r588,%r587,%r589; add.u64 %r1092,%r399,%r259; add.u64 %r1093,%r1092,%r1533; shl.b64 %r1094,%r1093,2; add.u64 %r397,%r1094,%r599; .loc 1 2768 8 mov.u64 %r558,%r609; .loc 1 2727 16 mov.u64 %r273,%r503; setp.le.s64 %r1366,%r259,%r483; setp.gt.s64 %r1358,%r1533,%r258; setp.gt.s64 %r1357,%r1531,%r256; not.b64 %r1395,%r259; add.u64 %r1396,%r1395,%r481; and.b64 %r1397,%r1396,-2; .loc 1 2904 18 add.u64 %r1398,%r1533,%r259; .loc 1 2904 13 add.u64 %r1399,%r1398,-1; setp.gt.s64 %r1400,%r259,%r1399; $L72: .loc 1 2732 10 sub.u64 %r260,%r1411,%r273; min.s64 %r1534,%r260,256; .loc 1 2733 25 shr.s64 %r1363,%r1534,63; shr.u64 %r1098,%r1534,63; add.u64 %r1099,%r1534,%r1098; and.b64 %r1100,%r1099,1; sub.u64 %r1101,%r1100,%r1098; .loc 1 2733 11 sub.u64 %r261,%r1534,%r1101; .loc 1 2735 5 @ %r1366 bra $L46; $L55: .loc 1 2757 8 @ ! %r1358 bra $L48; bra $L47; $L46: .loc 1 2737 17 add.u64 %r485,%r261,%r273; .loc 1 2737 12 add.u64 %r486,%r485,-1; shl.b64 %r1105,%r1534,10; add.u64 %r1107,%r1105,-1024; add.u64 %r533,%r1033,%r1107; add.u64 %r539,%r559,%r571; add.u64 %r1110,%r1534,%r273; shl.b64 %r550,%r1110,2; shl.b64 %r553,%r1534,2; mov.u64 %r544,%r569; mov.u64 %r543,%r579; mov.u64 %r542,%r576; mov.u64 %r531,0; setp.le.s64 %r1370,%r273,%r486; setp.gt.s64 %r1359,%r1534,%r261; not.b64 %r1390,%r273; add.u64 %r1391,%r1390,%r485; and.b64 %r1392,%r1391,-2; add.u64 %r1393,%r1392,%r273; bra $L54; $L139: mov.u64 %r531,%r532; $L54: .loc 1 2738 9 @ %r1370 bra $L49; $L53: .loc 1 2749 12 @ %r1359 bra $L50; bra $L51; $L49: shl.b64 %r1113,%r531,2; add.u64 %r315,%r1033,%r1113; add.u64 %r1118,%r1393,%r544; shl.b64 %r1119,%r1118,2; add.u64 %r530,%r1119,%r1412; .loc 1 2738 9 mov.u64 %r160,%r539; sub.u64 %r1461,%r530,%r539; add.u64 %r1462,%r1461,-8; shr.u64 %r1460,%r1462,3; add.u64 %r1463,%r1460,1; and.b64 %r1464,%r1463,3; setp.eq.u64 %r1467,%r1464,0; @ %r1467 bra $L52; setp.eq.u64 %r1466,%r1464,1; @ %r1466 bra $L302; setp.eq.u64 %r1465,%r1464,2; @ %r1465 bra $L303; bra $L368; $L52: .loc 1 2741 7 add.u64 %r1121,%r160,%r542; add.u64 %r1122,%r1121,%r543; .loc 1 2740 49 ld.u32 %r1123,[%r1122]; st.u32 [%r315],%r1123; .loc 1 2742 49 ld.u32 %r1124,[%r160]; st.u32 [%r315+4],%r1124; .loc 1 2745 7 add.u64 %r1125,%r160,%r543; add.u64 %r1126,%r1125,%r542; .loc 1 2744 49 ld.u32 %r1127,[%r1126+4]; st.u32 [%r315+1024],%r1127; .loc 1 2746 49 ld.u32 %r1128,[%r160+4]; st.u32 [%r315+1028],%r1128; .loc 1 2738 9 add.u64 %r1468,%r315,2048; add.u64 %r1469,%r160,8; .loc 1 2741 7 add.u64 %r1537,%r1469,%r542; add.u64 %r1538,%r1537,%r543; .loc 1 2740 49 ld.u32 %r1539,[%r1538]; st.u32 [%r1468],%r1539; .loc 1 2742 49 ld.u32 %r1540,[%r1469]; st.u32 [%r1468+4],%r1540; .loc 1 2745 7 add.u64 %r1541,%r1469,%r543; add.u64 %r1542,%r1541,%r542; .loc 1 2744 49 ld.u32 %r1543,[%r1542+4]; st.u32 [%r1468+1024],%r1543; .loc 1 2746 49 ld.u32 %r1544,[%r1469+4]; st.u32 [%r1468+1028],%r1544; .loc 1 2738 9 add.u64 %r1545,%r315,4096; add.u64 %r1546,%r160,16; .loc 1 2741 7 add.u64 %r1548,%r1546,%r542; add.u64 %r1549,%r1548,%r543; .loc 1 2740 49 ld.u32 %r1550,[%r1549]; st.u32 [%r1545],%r1550; .loc 1 2742 49 ld.u32 %r1551,[%r1546]; st.u32 [%r1545+4],%r1551; .loc 1 2745 7 add.u64 %r1552,%r1546,%r543; add.u64 %r1553,%r1552,%r542; .loc 1 2744 
49 ld.u32 %r1554,[%r1553+4]; st.u32 [%r1545+1024],%r1554; .loc 1 2746 49 ld.u32 %r1555,[%r1546+4]; st.u32 [%r1545+1028],%r1555; .loc 1 2738 9 add.u64 %r1556,%r315,6144; add.u64 %r1557,%r160,24; .loc 1 2741 7 add.u64 %r1559,%r1557,%r542; add.u64 %r1560,%r1559,%r543; .loc 1 2740 49 ld.u32 %r1561,[%r1560]; st.u32 [%r1556],%r1561; .loc 1 2742 49 ld.u32 %r1562,[%r1557]; st.u32 [%r1556+4],%r1562; .loc 1 2745 7 add.u64 %r1563,%r1557,%r543; add.u64 %r1564,%r1563,%r542; .loc 1 2744 49 ld.u32 %r1565,[%r1564+4]; st.u32 [%r1556+1024],%r1565; .loc 1 2746 49 ld.u32 %r1566,[%r1557+4]; st.u32 [%r1556+1028],%r1566; .loc 1 2738 9 add.u64 %r315,%r315,8192; add.u64 %r160,%r160,32; setp.ne.u64 %r1567,%r160,%r530; @ %r1567 bra $L52; bra $L53; $L50: .loc 1 2752 10 add.u64 %r1131,%r542,%r550; add.u64 %r1132,%r250,%r1131; .loc 1 2754 10 add.u64 %r1137,%r539,%r553; .loc 1 2751 41 ld.u32 %r1138,[%r1137+-4]; shl.b64 %r1139,%r1138,32; ld.u32 %r1141,[%r1132+-4]; or.b64 %r1142,%r1141,%r1139; mov.b64 %r1130,%r1142; st.v2.u32 [%r533],%r1130; $L51: .loc 1 2735 5 add.u64 %r532,%r531,2; add.u64 %r533,%r533,8; add.u64 %r539,%r539,%r540; add.u64 %r542,%r542,%r540; add.u64 %r543,%r543,%r1054; add.u64 %r544,%r544,%r541; setp.ne.u64 %r1143,%r531,%r1397; @ %r1143 bra $L139; bra $L55; $L48: .loc 1 2767 25 shr.u64 %r1146,%r1363,62; add.u64 %r1147,%r1534,%r1146; and.b64 %r1148,%r1147,3; sub.u64 %r1149,%r1148,%r1146; .loc 1 2767 11 sub.u64 %r263,%r1534,%r1149; .loc 1 2769 5 @ %r1404 bra $L56; bra $L57; $L47: .loc 1 2759 17 add.u64 %r130,%r1534,%r273; .loc 1 2759 12 add.u64 %r1151,%r130,-1; .loc 1 2760 9 setp.lt.s64 %r1152,%r1151,%r273; @ %r1152 bra $L48; add.u64 %r1153,%r130,%r316; shl.b64 %r1154,%r1153,2; add.u64 %r228,%r250,%r1154; mov.u64 %r331,%r100; mov.u64 %r56,%r559; sub.u64 %r1451,%r228,%r559; add.u64 %r1452,%r1451,-4; shr.u64 %r1450,%r1452,2; add.u64 %r1453,%r1450,1; and.b64 %r1454,%r1453,3; setp.eq.u64 %r1457,%r1454,0; @ %r1457 bra $L58; setp.eq.u64 %r1456,%r1454,1; @ %r1456 bra $L304; setp.eq.u64 %r1455,%r1454,2; @ %r1455 bra $L305; bra $L369; $L58: .loc 1 2762 43 ld.u32 %r1155,[%r56]; st.u32 [%r331],%r1155; ld.u32 %r1570,[%r56+4]; st.u32 [%r331+1024],%r1570; ld.u32 %r1574,[%r56+8]; st.u32 [%r331+2048],%r1574; ld.u32 %r1578,[%r56+12]; st.u32 [%r331+3072],%r1578; .loc 1 2760 9 add.u64 %r56,%r56,16; add.u64 %r331,%r331,4096; setp.ne.u64 %r1579,%r228,%r56; @ %r1579 bra $L58; bra $L48; $L57: .loc 1 2871 8 @ ! %r1357 bra $L60; bra $L59; $L56: .loc 1 2876 14 add.u64 %r489,%r263,%r273; .loc 1 2876 9 add.u64 %r490,%r489,-1; add.u64 %r221,%r558,%r1058; add.u64 %r82,%r558,%r605; shl.b64 %r1158,%r505,2; add.u64 %r39,%r1158,%r82; not.b64 %r1159,%r273; add.u64 %r1160,%r1159,%r489; shr.u64 %r1161,%r1160,2; add.u64 %r1162,%r1161,1; shl.b64 %r320,%r1162,4; add.u64 %r1163,%r1407,%r489; shl.b64 %r1164,%r1163,2; add.u64 %r467,%r1164,%r248; add.u64 %r1165,%r489,%r600; shl.b64 %r1166,%r1165,2; add.u64 %r464,%r1166,%r248; add.u64 %r1168,%r1408,%r489; shl.b64 %r1169,%r1168,2; add.u64 %r462,%r1169,%r248; add.u64 %r1170,%r489,%r604; shl.b64 %r1171,%r1170,2; add.u64 %r458,%r1171,%r248; shl.b64 %r369,%r263,8; shl.b64 %r471,%r1534,2; mov.u64 %r477,%r476; mov.u64 %r479,%r478; mov.u64 %r480,%r608; mov.u64 %r488,%r606; mov.u64 %r219,%r558; mov.u64 %r226,%r224; mov.u64 %r279,%r621; mov.u64 %r474,0; mov.u64 %r475,%r474; setp.le.s64 %r1372,%r273,%r490; setp.gt.s64 %r1360,%r1534,%r263; $L68: .loc 1 2772 9 @ %r1372 bra $L61; $L67: .loc 1 2843 12 @ ! 
%r1360 bra $L63; bra $L62; $L61: .loc 1 2904 13 mov.u64 %r341,0; mov.u64 %r342,4096; mov.u64 %r343,3072; mov.u64 %r344,1024; mov.u64 %r345,-2048; $L66: add.u64 %r339,%r221,%r341; .loc 1 2774 10 ld.u32 %r280,[%r339]; .loc 1 2775 10 ld.u32 %r281,[%r339+4]; add.u64 %r337,%r219,%r341; .loc 1 2776 10 ld.u32 %r282,[%r337]; .loc 1 2777 10 ld.u32 %r283,[%r337+4]; add.u64 %r335,%r82,%r341; .loc 1 2778 10 ld.u32 %r284,[%r335]; .loc 1 2779 10 ld.u32 %r285,[%r335+4]; add.u64 %r333,%r39,%r341; .loc 1 2780 10 ld.u32 %r286,[%r333]; .loc 1 2781 10 ld.u32 %r287,[%r333+4]; .loc 1 2782 10 ld.u32 %r288,[%r339+8]; .loc 1 2783 10 ld.u32 %r289,[%r339+12]; .loc 1 2784 10 ld.u32 %r290,[%r337+8]; .loc 1 2785 10 ld.u32 %r291,[%r337+12]; .loc 1 2786 10 ld.u32 %r292,[%r335+8]; .loc 1 2787 10 ld.u32 %r293,[%r335+12]; .loc 1 2788 10 ld.u32 %r294,[%r333+8]; .loc 1 2789 10 ld.u32 %r295,[%r333+12]; .loc 1 2791 6 @ %r1400 bra $L64; add.u64 %r361,%r474,%r572; add.u64 %r359,%r1033,%r344; $L65: add.u64 %r356,%r345,%r359; .loc 1 2793 19 add.u64 %r1176,%r356,%r344; ld.u32 %r135,[%r1176]; .loc 1 2794 14 ld.u32 %r136,[%r361]; .loc 1 2793 14 mad.lo.u32 %r280,%r135,%r136,%r280; .loc 1 2795 19 ld.u32 %r138,[%r359]; .loc 1 2795 14 mad.lo.u32 %r281,%r136,%r138,%r281; add.u64 %r351,%r361,%r488; .loc 1 2798 14 add.u64 %r1179,%r351,%r480; ld.u32 %r140,[%r1179]; .loc 1 2797 14 mad.lo.u32 %r282,%r135,%r140,%r282; .loc 1 2799 14 mad.lo.u32 %r283,%r138,%r140,%r283; .loc 1 2802 14 add.u64 %r1182,%r351,%r479; ld.u32 %r143,[%r1182]; .loc 1 2801 14 mad.lo.u32 %r284,%r135,%r143,%r284; .loc 1 2803 14 mad.lo.u32 %r285,%r138,%r143,%r285; .loc 1 2806 14 add.u64 %r1185,%r351,%r477; ld.u32 %r146,[%r1185]; .loc 1 2805 14 mad.lo.u32 %r286,%r135,%r146,%r286; .loc 1 2807 14 mad.lo.u32 %r287,%r138,%r146,%r287; .loc 1 2809 19 add.u64 %r1188,%r356,%r343; ld.u32 %r149,[%r1188]; .loc 1 2809 14 mad.lo.u32 %r288,%r136,%r149,%r288; .loc 1 2811 19 add.u64 %r1190,%r356,%r342; ld.u32 %r151,[%r1190]; .loc 1 2811 14 mad.lo.u32 %r289,%r136,%r151,%r289; .loc 1 2813 14 mad.lo.u32 %r290,%r140,%r149,%r290; .loc 1 2815 14 mad.lo.u32 %r291,%r140,%r151,%r291; .loc 1 2817 14 mad.lo.u32 %r292,%r143,%r149,%r292; .loc 1 2819 14 mad.lo.u32 %r293,%r143,%r151,%r293; .loc 1 2821 14 mad.lo.u32 %r294,%r146,%r149,%r294; .loc 1 2823 14 mad.lo.u32 %r295,%r146,%r151,%r295; .loc 1 2791 6 add.u64 %r361,%r361,4; add.u64 %r359,%r359,4; setp.ne.u64 %r1198,%r226,%r361; @ %r1198 bra $L65; $L64: .loc 1 2826 24 st.u32 [%r339],%r280; .loc 1 2827 28 st.u32 [%r339+4],%r281; .loc 1 2828 30 st.u32 [%r337],%r282; .loc 1 2829 34 st.u32 [%r337+4],%r283; .loc 1 2830 30 st.u32 [%r335],%r284; .loc 1 2831 34 st.u32 [%r335+4],%r285; .loc 1 2832 30 st.u32 [%r333],%r286; .loc 1 2833 34 st.u32 [%r333+4],%r287; .loc 1 2834 28 st.u32 [%r339+8],%r288; .loc 1 2835 28 st.u32 [%r339+12],%r289; .loc 1 2836 34 st.u32 [%r337+8],%r290; .loc 1 2837 34 st.u32 [%r337+12],%r291; .loc 1 2838 34 st.u32 [%r335+8],%r292; .loc 1 2839 34 st.u32 [%r335+12],%r293; .loc 1 2840 34 st.u32 [%r333+8],%r294; .loc 1 2841 34 st.u32 [%r333+12],%r295; .loc 1 2772 9 add.u64 %r345,%r345,-4096; add.u64 %r344,%r344,4096; add.u64 %r343,%r343,4096; add.u64 %r342,%r342,4096; add.u64 %r341,%r341,16; setp.ne.u64 %r1199,%r320,%r341; @ %r1199 bra $L66; bra $L67; $L63: .loc 1 2769 29 add.u64 %r279,%r279,4; .loc 1 2769 5 add.u64 %r226,%r226,%r225; add.u64 %r221,%r221,%r220; add.u64 %r219,%r219,%r220; add.u64 %r82,%r82,%r220; add.u64 %r39,%r39,%r220; add.u64 %r488,%r488,%r1061; add.u64 %r480,%r480,%r225; add.u64 %r479,%r479,%r225; add.u64 %r477,%r477,%r225; 
add.u64 %r475,%r475,%r220; add.u64 %r474,%r474,%r225; setp.le.s64 %r1200,%r279,%r1402; @ %r1200 bra $L68; bra $L57; $L62: add.u64 %r374,%r467,%r475; add.u64 %r373,%r464,%r475; add.u64 %r372,%r462,%r475; add.u64 %r371,%r458,%r475; add.u64 %r239,%r48,%r474; add.u64 %r32,%r30,%r474; add.u64 %r207,%r242,%r474; add.u64 %r472,%r221,%r471; .loc 1 2904 13 mov.u64 %r370,%r369; $L71: .loc 1 2848 14 ld.u32 %r275,[%r374]; .loc 1 2849 14 ld.u32 %r276,[%r373]; .loc 1 2850 14 ld.u32 %r277,[%r372]; .loc 1 2851 14 ld.u32 %r278,[%r371]; .loc 1 2853 10 @ %r1400 bra $L69; shl.b64 %r1204,%r370,2; add.u64 %r389,%r1033,%r1204; add.u64 %r387,%r474,%r572; mov.u64 %r381,%r207; mov.u64 %r383,%r32; mov.u64 %r385,%r239; sub.u64 %r1438,%r226,%r387; add.u64 %r1439,%r1438,-4; shr.u64 %r1437,%r1439,2; add.u64 %r1440,%r1437,1; and.b64 %r1441,%r1440,3; setp.eq.u64 %r1444,%r1441,0; @ %r1444 bra $L70; setp.eq.u64 %r1443,%r1441,1; @ %r1443 bra $L306; setp.eq.u64 %r1442,%r1441,2; @ %r1442 bra $L307; bra $L370; $L70: .loc 1 2855 16 ld.u32 %r161,[%r389]; .loc 1 2856 13 ld.u32 %r1206,[%r387]; .loc 1 2855 11 mad.lo.u32 %r1580,%r161,%r1206,%r275; .loc 1 2858 13 ld.u32 %r1208,[%r385]; .loc 1 2857 11 mad.lo.u32 %r1581,%r161,%r1208,%r276; .loc 1 2860 13 ld.u32 %r1210,[%r383]; .loc 1 2859 11 mad.lo.u32 %r1582,%r161,%r1210,%r277; .loc 1 2862 13 ld.u32 %r1212,[%r381]; .loc 1 2861 11 mad.lo.u32 %r1583,%r161,%r1212,%r278; .loc 1 2855 16 ld.u32 %r1589,[%r389+4]; .loc 1 2856 13 ld.u32 %r1590,[%r387+4]; .loc 1 2855 11 mad.lo.u32 %r1592,%r1589,%r1590,%r1580; .loc 1 2858 13 ld.u32 %r1593,[%r385+4]; .loc 1 2857 11 mad.lo.u32 %r1595,%r1589,%r1593,%r1581; .loc 1 2860 13 ld.u32 %r1596,[%r383+4]; .loc 1 2859 11 mad.lo.u32 %r1598,%r1589,%r1596,%r1582; .loc 1 2862 13 ld.u32 %r1599,[%r381+4]; .loc 1 2861 11 mad.lo.u32 %r1601,%r1589,%r1599,%r1583; .loc 1 2855 16 ld.u32 %r1608,[%r389+8]; .loc 1 2856 13 ld.u32 %r1609,[%r387+8]; .loc 1 2855 11 mad.lo.u32 %r1611,%r1608,%r1609,%r1592; .loc 1 2858 13 ld.u32 %r1612,[%r385+8]; .loc 1 2857 11 mad.lo.u32 %r1614,%r1608,%r1612,%r1595; .loc 1 2860 13 ld.u32 %r1615,[%r383+8]; .loc 1 2859 11 mad.lo.u32 %r1617,%r1608,%r1615,%r1598; .loc 1 2862 13 ld.u32 %r1618,[%r381+8]; .loc 1 2861 11 mad.lo.u32 %r1620,%r1608,%r1618,%r1601; .loc 1 2855 16 ld.u32 %r1627,[%r389+12]; .loc 1 2856 13 ld.u32 %r1628,[%r387+12]; .loc 1 2855 11 mad.lo.u32 %r275,%r1627,%r1628,%r1611; .loc 1 2858 13 ld.u32 %r1630,[%r385+12]; .loc 1 2857 11 mad.lo.u32 %r276,%r1627,%r1630,%r1614; .loc 1 2860 13 ld.u32 %r1632,[%r383+12]; .loc 1 2859 11 mad.lo.u32 %r277,%r1627,%r1632,%r1617; .loc 1 2862 13 ld.u32 %r1634,[%r381+12]; .loc 1 2861 11 mad.lo.u32 %r278,%r1627,%r1634,%r1620; .loc 1 2853 10 add.u64 %r389,%r389,16; add.u64 %r387,%r387,16; add.u64 %r385,%r385,16; add.u64 %r383,%r383,16; add.u64 %r381,%r381,16; setp.ne.u64 %r1636,%r226,%r387; @ %r1636 bra $L70; $L69: .loc 1 2864 28 st.u32 [%r374],%r275; .loc 1 2865 34 st.u32 [%r373],%r276; .loc 1 2866 34 st.u32 [%r372],%r277; .loc 1 2867 34 st.u32 [%r371],%r278; .loc 1 2846 6 add.u64 %r374,%r374,4; add.u64 %r373,%r373,4; add.u64 %r372,%r372,4; add.u64 %r371,%r371,4; add.u64 %r370,%r370,256; setp.ne.u64 %r1214,%r374,%r472; @ %r1214 bra $L71; bra $L63; $L60: .loc 1 2727 34 add.u64 %r273,%r273,256; .loc 1 2727 8 add.u64 %r558,%r558,1024; add.u64 %r559,%r559,1024; setp.ge.s64 %r1215,%r215,%r273; @ %r1215 bra $L72; bra $L73; $L59: .loc 1 2873 17 add.u64 %r170,%r1531,%r621; .loc 1 2874 9 setp.le.s64 %r1216,%r170,%r482; @ %r1216 bra $L60; .loc 1 2876 14 add.u64 %r496,%r263,%r273; .loc 1 2876 9 add.u64 %r497,%r496,-1; 
.loc 1 2759 17 add.u64 %r405,%r1534,%r273; add.u64 %r1217,%r405,%r406; shl.b64 %r1218,%r1217,2; add.u64 %r407,%r1218,%r248; neg.s64 %r1220,%r405; shl.b64 %r1221,%r1220,2; shl.b64 %r1222,%r273,2; add.u64 %r390,%r1222,%r1221; shl.b64 %r1223,%r496,2; add.u64 %r392,%r1223,%r1221; shl.b64 %r434,%r263,8; .loc 1 2876 9 mov.u64 %r398,%r397; mov.u64 %r400,%r588; mov.u64 %r401,%r406; mov.u64 %r266,%r482; setp.gt.s64 %r1637,%r1534,%r263; setp.le.s64 %r1373,%r273,%r497; not.b64 %r1382,%r273; add.u64 %r1383,%r1382,%r496; and.b64 %r1384,%r1383,-4; add.u64 %r1385,%r1384,%r273; $L81: .loc 1 2877 6 @ %r1373 bra $L74; $L80: .loc 1 2901 6 @ %r1637 bra $L75; bra $L76; $L74: add.u64 %r419,%r390,%r407; add.u64 %r1231,%r1385,%r401; shl.b64 %r1232,%r1231,2; add.u64 %r1233,%r248,16; add.u64 %r409,%r1232,%r1233; .loc 1 2904 13 mov.u64 %r415,4096; mov.u64 %r416,3072; mov.u64 %r417,1024; mov.u64 %r418,-2048; $L79: .loc 1 2879 14 ld.u32 %r269,[%r419]; .loc 1 2880 14 ld.u32 %r270,[%r419+4]; .loc 1 2881 14 ld.u32 %r271,[%r419+8]; .loc 1 2882 14 ld.u32 %r272,[%r419+12]; .loc 1 2884 10 @ %r1400 bra $L77; add.u64 %r428,%r1033,%r417; mov.u64 %r429,%r400; sub.u64 %r1430,%r398,%r400; add.u64 %r1431,%r1430,-4; shr.u64 %r1429,%r1431,2; and.b64 %r1433,%r1429,1; setp.ne.u64 %r1434,%r1433,0; @ ! %r1434 bra $L371; $L78: add.u64 %r425,%r418,%r428; .loc 1 2887 16 ld.u32 %r172,[%r429]; .loc 1 2886 16 add.u64 %r1235,%r425,%r417; .loc 1 2887 13 ld.u32 %r1237,[%r1235]; .loc 1 2886 11 mad.lo.u32 %r1638,%r172,%r1237,%r269; .loc 1 2889 13 ld.u32 %r1239,[%r428]; .loc 1 2888 11 mad.lo.u32 %r1639,%r172,%r1239,%r270; .loc 1 2890 16 add.u64 %r1240,%r425,%r416; .loc 1 2891 13 ld.u32 %r1242,[%r1240]; .loc 1 2890 11 mad.lo.u32 %r1640,%r172,%r1242,%r271; .loc 1 2892 16 add.u64 %r1243,%r425,%r415; .loc 1 2893 13 ld.u32 %r1245,[%r1243]; .loc 1 2892 11 mad.lo.u32 %r1641,%r172,%r1245,%r272; .loc 1 2884 10 add.u64 %r1436,%r428,4; add.u64 %r1644,%r418,%r1436; .loc 1 2887 16 ld.u32 %r1645,[%r429+4]; .loc 1 2886 16 add.u64 %r1646,%r1644,%r417; .loc 1 2887 13 ld.u32 %r1647,[%r1646]; .loc 1 2886 11 mad.lo.u32 %r269,%r1645,%r1647,%r1638; .loc 1 2889 13 ld.u32 %r1649,[%r1436]; .loc 1 2888 11 mad.lo.u32 %r270,%r1645,%r1649,%r1639; .loc 1 2890 16 add.u64 %r1651,%r1644,%r416; .loc 1 2891 13 ld.u32 %r1652,[%r1651]; .loc 1 2890 11 mad.lo.u32 %r271,%r1645,%r1652,%r1640; .loc 1 2892 16 add.u64 %r1654,%r1644,%r415; .loc 1 2893 13 ld.u32 %r1655,[%r1654]; .loc 1 2892 11 mad.lo.u32 %r272,%r1645,%r1655,%r1641; .loc 1 2884 10 add.u64 %r429,%r429,8; add.u64 %r428,%r428,8; setp.ne.u64 %r1657,%r398,%r429; @ %r1657 bra $L78; $L77: .loc 1 2895 28 st.u32 [%r419],%r269; .loc 1 2896 32 st.u32 [%r419+4],%r270; .loc 1 2897 32 st.u32 [%r419+8],%r271; .loc 1 2898 32 st.u32 [%r419+12],%r272; .loc 1 2877 6 add.u64 %r419,%r419,16; add.u64 %r418,%r418,-4096; add.u64 %r417,%r417,4096; add.u64 %r416,%r416,4096; add.u64 %r415,%r415,4096; setp.ne.u64 %r1247,%r409,%r419; @ %r1247 bra $L79; bra $L80; $L76: .loc 1 2874 39 add.u64 %r266,%r266,1; .loc 1 2874 9 add.u64 %r407,%r407,%r605; add.u64 %r401,%r401,%r505; add.u64 %r400,%r400,%r231; add.u64 %r398,%r398,%r231; setp.ne.u64 %r1248,%r170,%r266; @ %r1248 bra $L81; bra $L60; $L75: add.u64 %r436,%r392,%r407; .loc 1 2904 13 mov.u64 %r435,%r434; $L84: .loc 1 2903 14 ld.u32 %r268,[%r436]; .loc 1 2905 10 @ %r1400 bra $L82; shl.b64 %r1251,%r435,2; add.u64 %r297,%r1033,%r1251; mov.u64 %r440,%r400; sub.u64 %r1420,%r398,%r400; add.u64 %r1421,%r1420,-4; shr.u64 %r1419,%r1421,2; add.u64 %r1422,%r1419,1; and.b64 %r1423,%r1422,3; setp.eq.u64 %r1426,%r1423,0; @ 
%r1426 bra $L83; setp.eq.u64 %r1425,%r1423,1; @ %r1425 bra $L308; setp.eq.u64 %r1424,%r1423,2; @ %r1424 bra $L309; bra $L372; $L83: .loc 1 2908 13 ld.u32 %r1253,[%r297]; ld.u32 %r1254,[%r440]; .loc 1 2907 11 mad.lo.u32 %r1658,%r1253,%r1254,%r268; .loc 1 2908 13 ld.u32 %r1661,[%r297+4]; ld.u32 %r1662,[%r440+4]; .loc 1 2907 11 mad.lo.u32 %r1664,%r1661,%r1662,%r1658; .loc 1 2908 13 ld.u32 %r1668,[%r297+8]; ld.u32 %r1669,[%r440+8]; .loc 1 2907 11 mad.lo.u32 %r1671,%r1668,%r1669,%r1664; .loc 1 2908 13 ld.u32 %r1675,[%r297+12]; ld.u32 %r1676,[%r440+12]; .loc 1 2907 11 mad.lo.u32 %r268,%r1675,%r1676,%r1671; .loc 1 2905 10 add.u64 %r297,%r297,16; add.u64 %r440,%r440,16; setp.ne.u64 %r1678,%r398,%r440; @ %r1678 bra $L83; $L82: .loc 1 2910 28 st.u32 [%r436],%r268; .loc 1 2901 6 add.u64 %r436,%r436,4; add.u64 %r435,%r435,256; setp.ne.u64 %r1256,%r407,%r436; @ %r1256 bra $L84; bra $L76; $L20: .loc 1 2958 11 cvt.u16.u32 %r1257,%r447; setp.eq.u16 %r1258,%r1257,1; @ ! %r1258 bra $L373; bra $L85; $L31: .loc 1 2920 38 set.u32.eq.u64 %r1260,%r208,1; neg.s32 %r1261,%r1260; .loc 1 2920 26 cvt.u16.u32 %r1263,%r1261; and.b16 %r1262,%r1263,%r990; .loc 1 2920 11 cvt.u32.u16 %r1265,%r1262; cvt.u16.u8 %r1266,%r1265; setp.eq.u16 %r1267,%r1266,0; @ %r1267 bra $L20; .loc 1 2920 43 setp.ne.u64 %r1268,%r503,1; @ %r1268 bra $L20; $L121: .loc 1 2922 10 cvt.u16.u32 %r1269,%r447; setp.ne.u16 %r1270,%r1269,1; @ %r1270 bra $L87; .loc 1 2948 4 setp.gt.s64 %r1271,%r216,0; @ %r1271 bra $L88; bra $L1; $L87: .loc 1 2929 4 setp.le.s64 %r1272,%r216,0; @ %r1272 bra $L1; shl.b64 %r645,%r505,2; shl.b64 %r1273,%r215,2; add.u64 %r644,%r246,%r1273; shl.b64 %r635,%r504,2; shl.b64 %r1274,%r457,2; add.u64 %r636,%r244,%r1274; mov.u64 %r649,0; mov.u64 %r642,%r649; .loc 1 2929 11 mov.u64 %r301,%r649; setp.gt.s64 %r1679,%r215,0; setp.le.s64 %r1416,%r457,0; $L91: .loc 1 2933 8 @ %r1679 bra $L90; $L95: .loc 1 2929 29 add.u64 %r301,%r301,1; .loc 1 2929 4 add.u64 %r642,%r642,%r505; add.u64 %r644,%r644,%r645; add.u64 %r649,%r649,%r209; setp.ne.u64 %r1276,%r216,%r301; @ %r1276 bra $L91; bra $L1; $L90: shl.b64 %r1277,%r642,2; add.u64 %r631,%r246,%r1277; shl.b64 %r1278,%r649,2; add.u64 %r654,%r245,%r1278; .loc 1 2933 8 mov.u64 %r633,%r636; mov.u64 %r632,0; $L94: .loc 1 2937 5 @ %r1416 bra $L140; shl.b64 %r1280,%r632,2; add.u64 %r627,%r244,%r1280; mov.u64 %r628,%r654; .loc 1 2936 7 mov.u32 %r302,0; sub.u64 %r1479,%r633,%r627; add.u64 %r1480,%r1479,-4; shr.u64 %r1478,%r1480,2; add.u64 %r1481,%r1478,1; and.b64 %r1482,%r1481,3; setp.eq.u64 %r1485,%r1482,0; @ %r1485 bra $L93; setp.eq.u64 %r1484,%r1482,1; @ %r1484 bra $L310; setp.eq.u64 %r1483,%r1482,2; @ %r1483 bra $L311; bra $L374; $L93: .loc 1 2938 23 ld.u32 %r1282,[%r627]; ld.u32 %r1283,[%r628]; .loc 1 2938 9 mad.lo.u32 %r1680,%r1282,%r1283,%r302; .loc 1 2938 23 ld.u32 %r1683,[%r627+4]; ld.u32 %r1684,[%r628+4]; .loc 1 2938 9 mad.lo.u32 %r1686,%r1683,%r1684,%r1680; .loc 1 2938 23 ld.u32 %r1690,[%r627+8]; ld.u32 %r1691,[%r628+8]; .loc 1 2938 9 mad.lo.u32 %r1693,%r1690,%r1691,%r1686; .loc 1 2938 23 ld.u32 %r1697,[%r627+12]; ld.u32 %r1698,[%r628+12]; .loc 1 2938 9 mad.lo.u32 %r302,%r1697,%r1698,%r1693; .loc 1 2937 5 add.u64 %r627,%r627,16; add.u64 %r628,%r628,16; setp.ne.u64 %r1700,%r627,%r633; @ %r1700 bra $L93; bra $L92; $L140: .loc 1 2936 7 mov.u32 %r302,0; $L92: .loc 1 2939 15 st.u32 [%r631],%r302; .loc 1 2933 8 add.u64 %r631,%r631,4; add.u64 %r632,%r632,%r504; add.u64 %r633,%r633,%r635; setp.ne.u64 %r1285,%r631,%r644; @ %r1285 bra $L94; bra $L95; $L88: shl.b64 %r665,%r505,2; mov.u64 %r663,%r246; shl.b64 
%r657,%r504,2; .loc 1 2948 4 mov.u64 %r666,0; .loc 1 2948 11 mov.u64 %r299,%r666; setp.le.s64 %r1701,%r457,0; $L98: .loc 1 2952 8 @ %r1701 bra $L141; mov.u64 %r655,%r244; shl.b64 %r1287,%r666,2; add.u64 %r658,%r1287,%r245; add.u64 %r1288,%r457,%r666; shl.b64 %r1289,%r1288,2; add.u64 %r674,%r1289,%r245; .loc 1 2951 10 mov.u32 %r300,0; sub.u64 %r1489,%r674,%r658; add.u64 %r1490,%r1489,-4; shr.u64 %r1488,%r1490,2; add.u64 %r1491,%r1488,1; and.b64 %r1492,%r1491,3; setp.eq.u64 %r1495,%r1492,0; @ %r1495 bra $L97; setp.eq.u64 %r1494,%r1492,1; @ %r1494 bra $L312; setp.eq.u64 %r1493,%r1492,2; @ %r1493 bra $L313; bra $L375; $L97: .loc 1 2953 26 ld.u32 %r1291,[%r655]; ld.u32 %r1292,[%r658]; .loc 1 2953 5 mad.lo.u32 %r1702,%r1291,%r1292,%r300; .loc 1 2952 8 add.u64 %r1703,%r655,%r657; .loc 1 2953 26 ld.u32 %r1705,[%r1703]; ld.u32 %r1706,[%r658+4]; .loc 1 2953 5 mad.lo.u32 %r1708,%r1705,%r1706,%r1702; .loc 1 2952 8 add.u64 %r1709,%r1703,%r657; .loc 1 2953 26 ld.u32 %r1712,[%r1709]; ld.u32 %r1713,[%r658+8]; .loc 1 2953 5 mad.lo.u32 %r1715,%r1712,%r1713,%r1708; .loc 1 2952 8 add.u64 %r1716,%r1709,%r657; .loc 1 2953 26 ld.u32 %r1719,[%r1716]; ld.u32 %r1720,[%r658+12]; .loc 1 2953 5 mad.lo.u32 %r300,%r1719,%r1720,%r1715; .loc 1 2952 8 add.u64 %r655,%r1716,%r657; add.u64 %r658,%r658,16; setp.ne.u64 %r1722,%r658,%r674; @ %r1722 bra $L97; bra $L96; $L141: .loc 1 2951 10 mov.u32 %r300,0; $L96: .loc 1 2954 25 st.u32 [%r663],%r300; .loc 1 2948 29 add.u64 %r299,%r299,1; .loc 1 2948 4 add.u64 %r663,%r663,%r665; add.u64 %r666,%r666,%r209; setp.ne.u64 %r1294,%r216,%r299; @ %r1294 bra $L98; bra $L1; $L85: .loc 1 2963 7 setp.le.s64 %r1295,%r216,0; @ %r1295 bra $L1; shl.b64 %r685,%r445,2; mov.u64 %r683,%r246; shl.b64 %r677,%r504,2; shl.b64 %r680,%r503,2; mov.u64 %r686,0; .loc 1 2963 14 mov.u64 %r312,%r686; setp.le.s64 %r1723,%r457,0; $L101: .loc 1 2967 4 @ %r1723 bra $L142; mov.u64 %r675,%r244; shl.b64 %r1297,%r686,2; add.u64 %r678,%r245,%r1297; .loc 1 2966 6 mov.u32 %r313,0; .loc 1 2967 11 mov.u64 %r314,0; and.b64 %r1500,%r457,3; setp.eq.u64 %r1503,%r1500,0; @ %r1503 bra $L100; setp.eq.u64 %r1502,%r1500,1; @ %r1502 bra $L314; setp.eq.u64 %r1501,%r1500,2; @ %r1501 bra $L315; bra $L376; $L100: .loc 1 2968 29 ld.u32 %r1299,[%r675]; ld.u32 %r1300,[%r678]; .loc 1 2968 8 mad.lo.u32 %r1724,%r1299,%r1300,%r313; .loc 1 2967 4 add.u64 %r1726,%r675,%r677; add.u64 %r1727,%r678,%r680; .loc 1 2968 29 ld.u32 %r1728,[%r1726]; ld.u32 %r1729,[%r1727]; .loc 1 2968 8 mad.lo.u32 %r1731,%r1728,%r1729,%r1724; .loc 1 2967 4 add.u64 %r1733,%r1726,%r677; add.u64 %r1734,%r1727,%r680; .loc 1 2968 29 ld.u32 %r1736,[%r1733]; ld.u32 %r1737,[%r1734]; .loc 1 2968 8 mad.lo.u32 %r1739,%r1736,%r1737,%r1731; .loc 1 2967 4 add.u64 %r1741,%r1733,%r677; add.u64 %r1742,%r1734,%r680; .loc 1 2968 29 ld.u32 %r1744,[%r1741]; ld.u32 %r1745,[%r1742]; .loc 1 2968 8 mad.lo.u32 %r313,%r1744,%r1745,%r1739; .loc 1 2967 28 add.u64 %r314,%r314,4; .loc 1 2967 4 add.u64 %r675,%r1741,%r677; add.u64 %r678,%r1742,%r680; setp.ne.u64 %r1747,%r457,%r314; @ %r1747 bra $L100; bra $L99; $L142: .loc 1 2966 6 mov.u32 %r313,0; $L99: .loc 1 2969 21 st.u32 [%r683],%r313; .loc 1 2963 32 add.u64 %r312,%r312,1; .loc 1 2963 7 add.u64 %r683,%r683,%r685; add.u64 %r686,%r686,%r209; setp.ne.u64 %r1302,%r216,%r312; @ %r1302 bra $L101; bra $L1; $L373: .loc 1 2972 11 setp.gt.s64 %r1303,%r208,%r504; @ %r1303 bra $L102; .loc 1 2993 7 setp.gt.s64 %r1304,%r216,0; @ %r1304 bra $L103; bra $L1; $L102: .loc 1 2974 7 setp.gt.s64 %r1305,%r216,0; @ %r1305 bra $L104; bra $L1; $L108: shl.b64 %r1306,%r719,2; 
add.u64 %r715,%r246,%r1306; .loc 1 2975 9 mov.u64 %r311,0; and.b64 %r1516,%r215,3; setp.eq.u64 %r1519,%r1516,0; @ %r1519 bra $L105; setp.eq.u64 %r1518,%r1516,1; @ %r1518 bra $L316; setp.eq.u64 %r1517,%r1516,2; @ %r1517 bra $L317; bra $L377; $L105: .loc 1 2976 34 st.u32 [%r715],%r1417; .loc 1 2975 2 add.u64 %r1749,%r715,%r717; .loc 1 2976 34 st.u32 [%r1749],%r1417; .loc 1 2975 2 add.u64 %r1751,%r1749,%r717; .loc 1 2976 34 st.u32 [%r1751],%r1417; .loc 1 2975 2 add.u64 %r1754,%r1751,%r717; .loc 1 2976 34 st.u32 [%r1754],%r1417; .loc 1 2975 27 add.u64 %r311,%r311,4; .loc 1 2975 2 add.u64 %r715,%r1754,%r717; setp.ne.u64 %r1756,%r215,%r311; @ %r1756 bra $L105; $L109: .loc 1 2974 32 add.u64 %r310,%r326,1; .loc 1 2974 7 add.u64 %r719,%r719,%r505; setp.ne.u64 %r1309,%r216,%r310; @ %r1309 bra $L143; shl.b64 %r703,%r503,2; shl.b64 %r694,%r504,2; mov.u64 %r708,0; mov.u64 %r707,%r708; mov.u64 %r327,%r708; setp.le.s64 %r1757,%r457,0; bra $L107; $L104: shl.b64 %r717,%r445,2; mov.u64 %r719,0; .loc 1 2974 14 mov.u64 %r326,%r719; setp.gt.s64 %r1758,%r215,0; .loc 1 2976 34 cvt.u32.u64 %r1417,%r719; bra $L106; $L143: mov.u64 %r326,%r310; $L106: .loc 1 2975 2 @ %r1758 bra $L108; bra $L109; $L112: shl.b64 %r1311,%r699,2; add.u64 %r692,%r244,%r1311; .loc 1 2980 4 mov.u64 %r691,%r712; .loc 1 2980 11 mov.u64 %r309,0; and.b64 %r1508,%r215,3; setp.eq.u64 %r1511,%r1508,0; @ %r1511 bra $L110; setp.eq.u64 %r1510,%r1508,1; @ %r1510 bra $L318; setp.eq.u64 %r1509,%r1508,2; @ %r1509 bra $L319; bra $L378; $L110: .loc 1 2983 37 ld.u32 %r1313,[%r692]; ld.u32 %r1314,[%r701]; .loc 1 2982 36 ld.u32 %r1316,[%r691]; mad.lo.u32 %r1315,%r1313,%r1314,%r1316; st.u32 [%r691],%r1315; .loc 1 2980 4 add.u64 %r1760,%r691,%r717; add.u64 %r1761,%r692,%r694; .loc 1 2983 37 ld.u32 %r1762,[%r1761]; ld.u32 %r1763,[%r701]; .loc 1 2982 36 ld.u32 %r1765,[%r1760]; mad.lo.u32 %r1766,%r1762,%r1763,%r1765; st.u32 [%r1760],%r1766; .loc 1 2980 4 add.u64 %r1768,%r1760,%r717; add.u64 %r1769,%r1761,%r694; .loc 1 2983 37 ld.u32 %r1771,[%r1769]; ld.u32 %r1772,[%r701]; .loc 1 2982 36 ld.u32 %r1774,[%r1768]; mad.lo.u32 %r1775,%r1771,%r1772,%r1774; st.u32 [%r1768],%r1775; .loc 1 2980 4 add.u64 %r1777,%r1768,%r717; add.u64 %r1778,%r1769,%r694; .loc 1 2983 37 ld.u32 %r1780,[%r1778]; ld.u32 %r1781,[%r701]; .loc 1 2982 36 ld.u32 %r1783,[%r1777]; mad.lo.u32 %r1784,%r1780,%r1781,%r1783; st.u32 [%r1777],%r1784; .loc 1 2980 29 add.u64 %r309,%r309,4; .loc 1 2980 4 add.u64 %r691,%r1777,%r717; add.u64 %r692,%r1778,%r694; setp.ne.u64 %r1785,%r215,%r309; @ %r1785 bra $L110; $L113: .loc 1 2979 26 add.u64 %r308,%r308,1; .loc 1 2979 2 add.u64 %r699,%r699,%r208; add.u64 %r701,%r701,%r703; setp.eq.u64 %r1318,%r457,%r308; @ %r1318 bra $L111; $L114: .loc 1 2980 4 @ %r1758 bra $L112; bra $L113; $L111: .loc 1 2978 32 add.u64 %r307,%r327,1; .loc 1 2978 7 add.u64 %r707,%r707,%r505; add.u64 %r708,%r708,%r209; setp.eq.u64 %r1320,%r326,%r327; @ %r1320 bra $L1; mov.u64 %r327,%r307; $L107: .loc 1 2979 2 @ %r1757 bra $L111; shl.b64 %r1322,%r708,2; add.u64 %r701,%r245,%r1322; shl.b64 %r1323,%r707,2; add.u64 %r712,%r246,%r1323; mov.u64 %r699,0; .loc 1 2979 9 mov.u64 %r308,%r699; bra $L114; $L103: shl.b64 %r734,%r445,2; shl.b64 %r725,%r208,2; shl.b64 %r729,%r503,2; .loc 1 2993 7 mov.u64 %r743,0; mov.u64 %r741,%r743; .loc 1 2993 14 mov.u64 %r303,%r743; setp.gt.s64 %r1786,%r215,0; setp.le.s64 %r1418,%r457,0; $L116: .loc 1 2997 4 @ %r1786 bra $L115; $L120: .loc 1 2993 32 add.u64 %r303,%r303,1; .loc 1 2993 7 add.u64 %r741,%r741,%r505; add.u64 %r743,%r743,%r209; setp.ne.u64 %r1325,%r216,%r303; @ 
%r1325 bra $L116; bra $L1; $L115: shl.b64 %r1326,%r741,2; add.u64 %r732,%r246,%r1326; shl.b64 %r1327,%r743,2; add.u64 %r749,%r245,%r1327; .loc 1 2997 4 mov.u64 %r736,0; .loc 1 2997 11 mov.u64 %r304,%r736; $L119: .loc 1 3001 8 @ %r1418 bra $L144; shl.b64 %r1329,%r736,2; add.u64 %r723,%r244,%r1329; mov.u64 %r727,%r749; .loc 1 3000 10 mov.u32 %r305,0; .loc 1 3001 15 mov.u64 %r306,0; and.b64 %r1524,%r457,3; setp.eq.u64 %r1527,%r1524,0; @ %r1527 bra $L118; setp.eq.u64 %r1526,%r1524,1; @ %r1526 bra $L320; setp.eq.u64 %r1525,%r1524,2; @ %r1525 bra $L321; bra $L379; $L118: .loc 1 3002 28 ld.u32 %r1331,[%r723]; ld.u32 %r1332,[%r727]; .loc 1 3002 5 mad.lo.u32 %r1788,%r1331,%r1332,%r305; .loc 1 3001 8 add.u64 %r1790,%r723,%r725; add.u64 %r1791,%r727,%r729; .loc 1 3002 28 ld.u32 %r1792,[%r1790]; ld.u32 %r1793,[%r1791]; .loc 1 3002 5 mad.lo.u32 %r1795,%r1792,%r1793,%r1788; .loc 1 3001 8 add.u64 %r1797,%r1790,%r725; add.u64 %r1798,%r1791,%r729; .loc 1 3002 28 ld.u32 %r1800,[%r1797]; ld.u32 %r1801,[%r1798]; .loc 1 3002 5 mad.lo.u32 %r1803,%r1800,%r1801,%r1795; .loc 1 3001 8 add.u64 %r1805,%r1797,%r725; add.u64 %r1806,%r1798,%r729; .loc 1 3002 28 ld.u32 %r1808,[%r1805]; ld.u32 %r1809,[%r1806]; .loc 1 3002 5 mad.lo.u32 %r305,%r1808,%r1809,%r1803; .loc 1 3001 32 add.u64 %r306,%r306,4; .loc 1 3001 8 add.u64 %r723,%r1805,%r725; add.u64 %r727,%r1806,%r729; setp.ne.u64 %r1811,%r457,%r306; @ %r1811 bra $L118; bra $L117; $L144: .loc 1 3000 10 mov.u32 %r305,0; $L117: .loc 1 3003 27 st.u32 [%r732],%r305; .loc 1 2997 29 add.u64 %r304,%r304,1; .loc 1 2997 4 add.u64 %r732,%r732,%r734; add.u64 %r736,%r736,%r504; setp.ne.u64 %r1334,%r215,%r304; @ %r1334 bra $L119; bra $L120; $L32: .loc 1 2920 11 setp.eq.u64 %r1335,%r208,1; @ %r1335 bra $L121; bra $L20; $L2: .loc 1 2476 6 ld.u64 %r1336,[%r750]; setp.ne.u64 %r1337,%r1336,0; @ %r1337 bra $L122; .loc 1 2485 4 ld.u64 %r1339,[%r751+56]; add.u64 %r1338,%r1339,1; ld.u64 %r1340,[%r751+48]; sub.u64 %r509,%r1338,%r1340; .loc 1 2483 15 ld.s8 %r1342,[%r752+28]; cvt.u16.u32 %r1341,%r1342; setp.eq.u16 %r1343,%r1341,1; @ %r1343 bra $L123; bra $L124; $L126: .loc 1 2560 16 ld.u64 %r504,[%r751+40]; .loc 1 2548 27 mov.u64 %r445,%r505; bra $L125; $L14: ld.u64 %r505,[%r750+40]; .loc 1 2543 6 ld.s8 %r1345,[%r750+28]; cvt.u16.u32 %r1344,%r1345; setp.eq.u16 %r1346,%r1344,1; @ ! %r1346 bra $L127; bra $L126; $L11: .loc 1 2560 16 ld.u64 %r504,[%r751+40]; mov.u32 %r447,%r332; .loc 1 2548 27 mov.u64 %r505,%r445; .loc 1 2563 14 mov.u64 %r215,1; .loc 1 2561 16 mov.u64 %r208,%r215; bra $L12; $L5: .loc 1 2485 4 ld.u64 %r1348,[%r751+56]; add.u64 %r1347,%r1348,1; ld.u64 %r1349,[%r751+48]; sub.u64 %r509,%r1347,%r1349; bra $L124; $L128: .loc 1 2509 17 ld.u64 %r1351,[%r750+56]; add.u64 %r1350,%r1351,1; .loc 1 2509 15 ld.u64 %r1352,[%r750+48]; sub.u64 %r457,%r1350,%r1352; ld.s8 %r1364,[%r752+28]; bra $L9; $L122: .loc 1 2502 12 cvta.global.u64 %r1353,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r1354,[%r1353+36]; setp.ne.u32 %r1355,%r1354,0; @ ! 
%r1355 bra $L8; bra $L128; $L1: .loc 1 3007 1 ret; $L372: .loc 1 2908 13 ld.u32 %r1812,[%r297]; ld.u32 %r1813,[%r400]; .loc 1 2907 11 mad.lo.u32 %r268,%r1812,%r1813,%r268; .loc 1 2905 10 add.u64 %r297,%r297,4; add.u64 %r440,%r400,4; $L309: .loc 1 2908 13 ld.u32 %r1816,[%r297]; ld.u32 %r1817,[%r440]; .loc 1 2907 11 mad.lo.u32 %r268,%r1816,%r1817,%r268; .loc 1 2905 10 add.u64 %r297,%r297,4; add.u64 %r440,%r440,4; $L308: .loc 1 2908 13 ld.u32 %r1820,[%r297]; ld.u32 %r1821,[%r440]; .loc 1 2907 11 mad.lo.u32 %r268,%r1820,%r1821,%r268; .loc 1 2905 10 add.u64 %r297,%r297,4; add.u64 %r440,%r440,4; setp.ne.u64 %r1823,%r398,%r440; @ %r1823 bra $L83; bra $L82; $L371: add.u64 %r1824,%r418,%r428; .loc 1 2887 16 ld.u32 %r1825,[%r400]; .loc 1 2886 16 add.u64 %r1826,%r1824,%r417; .loc 1 2887 13 ld.u32 %r1827,[%r1826]; .loc 1 2886 11 mad.lo.u32 %r269,%r1825,%r1827,%r269; .loc 1 2889 13 ld.u32 %r1829,[%r428]; .loc 1 2888 11 mad.lo.u32 %r270,%r1825,%r1829,%r270; .loc 1 2890 16 add.u64 %r1831,%r1824,%r416; .loc 1 2891 13 ld.u32 %r1832,[%r1831]; .loc 1 2890 11 mad.lo.u32 %r271,%r1825,%r1832,%r271; .loc 1 2892 16 add.u64 %r1834,%r1824,%r415; .loc 1 2893 13 ld.u32 %r1835,[%r1834]; .loc 1 2892 11 mad.lo.u32 %r272,%r1825,%r1835,%r272; .loc 1 2884 10 add.u64 %r429,%r400,4; add.u64 %r428,%r428,4; setp.ne.u64 %r1837,%r398,%r429; @ %r1837 bra $L78; bra $L77; $L370: .loc 1 2855 16 ld.u32 %r1838,[%r389]; .loc 1 2856 13 ld.u32 %r1839,[%r387]; .loc 1 2855 11 mad.lo.u32 %r275,%r1838,%r1839,%r275; .loc 1 2858 13 ld.u32 %r1841,[%r239]; .loc 1 2857 11 mad.lo.u32 %r276,%r1838,%r1841,%r276; .loc 1 2860 13 ld.u32 %r1843,[%r32]; .loc 1 2859 11 mad.lo.u32 %r277,%r1838,%r1843,%r277; .loc 1 2862 13 ld.u32 %r1845,[%r207]; .loc 1 2861 11 mad.lo.u32 %r278,%r1838,%r1845,%r278; .loc 1 2853 10 add.u64 %r389,%r389,4; add.u64 %r387,%r387,4; add.u64 %r385,%r239,4; add.u64 %r383,%r32,4; add.u64 %r381,%r207,4; $L307: .loc 1 2855 16 ld.u32 %r1848,[%r389]; .loc 1 2856 13 ld.u32 %r1849,[%r387]; .loc 1 2855 11 mad.lo.u32 %r275,%r1848,%r1849,%r275; .loc 1 2858 13 ld.u32 %r1851,[%r385]; .loc 1 2857 11 mad.lo.u32 %r276,%r1848,%r1851,%r276; .loc 1 2860 13 ld.u32 %r1853,[%r383]; .loc 1 2859 11 mad.lo.u32 %r277,%r1848,%r1853,%r277; .loc 1 2862 13 ld.u32 %r1855,[%r381]; .loc 1 2861 11 mad.lo.u32 %r278,%r1848,%r1855,%r278; .loc 1 2853 10 add.u64 %r389,%r389,4; add.u64 %r387,%r387,4; add.u64 %r385,%r385,4; add.u64 %r383,%r383,4; add.u64 %r381,%r381,4; $L306: .loc 1 2855 16 ld.u32 %r1858,[%r389]; .loc 1 2856 13 ld.u32 %r1859,[%r387]; .loc 1 2855 11 mad.lo.u32 %r275,%r1858,%r1859,%r275; .loc 1 2858 13 ld.u32 %r1861,[%r385]; .loc 1 2857 11 mad.lo.u32 %r276,%r1858,%r1861,%r276; .loc 1 2860 13 ld.u32 %r1863,[%r383]; .loc 1 2859 11 mad.lo.u32 %r277,%r1858,%r1863,%r277; .loc 1 2862 13 ld.u32 %r1865,[%r381]; .loc 1 2861 11 mad.lo.u32 %r278,%r1858,%r1865,%r278; .loc 1 2853 10 add.u64 %r389,%r389,4; add.u64 %r387,%r387,4; add.u64 %r385,%r385,4; add.u64 %r383,%r383,4; add.u64 %r381,%r381,4; setp.ne.u64 %r1867,%r226,%r387; @ %r1867 bra $L70; bra $L69; $L369: .loc 1 2762 43 ld.u32 %r1868,[%r559]; st.u32 [%r100],%r1868; .loc 1 2760 9 add.u64 %r56,%r559,4; add.u64 %r331,%r100,1024; $L305: .loc 1 2762 43 ld.u32 %r1870,[%r56]; st.u32 [%r331],%r1870; .loc 1 2760 9 add.u64 %r56,%r56,4; add.u64 %r331,%r331,1024; $L304: .loc 1 2762 43 ld.u32 %r1872,[%r56]; st.u32 [%r331],%r1872; .loc 1 2760 9 add.u64 %r56,%r56,4; add.u64 %r331,%r331,1024; setp.ne.u64 %r1873,%r228,%r56; @ %r1873 bra $L58; bra $L48; $L368: .loc 1 2741 7 add.u64 %r1874,%r539,%r542; add.u64 %r1875,%r1874,%r543; 
.loc 1 2740 49 ld.u32 %r1876,[%r1875]; st.u32 [%r315],%r1876; .loc 1 2742 49 ld.u32 %r1877,[%r539]; st.u32 [%r315+4],%r1877; .loc 1 2745 7 add.u64 %r1878,%r539,%r543; add.u64 %r1879,%r1878,%r542; .loc 1 2744 49 ld.u32 %r1880,[%r1879+4]; st.u32 [%r315+1024],%r1880; .loc 1 2746 49 ld.u32 %r1881,[%r539+4]; st.u32 [%r315+1028],%r1881; .loc 1 2738 9 add.u64 %r315,%r315,2048; add.u64 %r160,%r539,8; $L303: .loc 1 2741 7 add.u64 %r1883,%r160,%r542; add.u64 %r1884,%r1883,%r543; .loc 1 2740 49 ld.u32 %r1885,[%r1884]; st.u32 [%r315],%r1885; .loc 1 2742 49 ld.u32 %r1886,[%r160]; st.u32 [%r315+4],%r1886; .loc 1 2745 7 add.u64 %r1887,%r160,%r543; add.u64 %r1888,%r1887,%r542; .loc 1 2744 49 ld.u32 %r1889,[%r1888+4]; st.u32 [%r315+1024],%r1889; .loc 1 2746 49 ld.u32 %r1890,[%r160+4]; st.u32 [%r315+1028],%r1890; .loc 1 2738 9 add.u64 %r315,%r315,2048; add.u64 %r160,%r160,8; $L302: .loc 1 2741 7 add.u64 %r1892,%r160,%r542; add.u64 %r1893,%r1892,%r543; .loc 1 2740 49 ld.u32 %r1894,[%r1893]; st.u32 [%r315],%r1894; .loc 1 2742 49 ld.u32 %r1895,[%r160]; st.u32 [%r315+4],%r1895; .loc 1 2745 7 add.u64 %r1896,%r160,%r543; add.u64 %r1897,%r1896,%r542; .loc 1 2744 49 ld.u32 %r1898,[%r1897+4]; st.u32 [%r315+1024],%r1898; .loc 1 2746 49 ld.u32 %r1899,[%r160+4]; st.u32 [%r315+1028],%r1899; .loc 1 2738 9 add.u64 %r315,%r315,2048; add.u64 %r160,%r160,8; setp.ne.u64 %r1900,%r160,%r530; @ %r1900 bra $L52; bra $L53; $L366: .loc 1 2688 2 @ ! %r1361 bra $L380; .loc 1 2689 22 mov.u32 %r19019011902,[%value_in]; } $L380: .loc 1 2687 24 mov.u64 %r298,2; .loc 1 2687 7 add.u64 %r622,%r246,%r623; $L301: .loc 1 2688 2 @ ! %r1361 bra $L381; .loc 1 2689 22 mov.u32 %r190419041905,[%value_in]; } $L381: .loc 1 2687 24 add.u64 %r298,%r298,1; .loc 1 2687 7 add.u64 %r622,%r622,%r623; $L300: .loc 1 2688 2 @ ! %r1361 bra $L382; .loc 1 2689 22 mov.u32 %r190719071908,[%value_in]; } $L382: .loc 1 2687 24 add.u64 %r298,%r298,1; .loc 1 2687 7 add.u64 %r622,%r622,%r623; setp.eq.u64 %r1909,%r298,%r626; @ ! %r1909 bra $L34; bra $L33; $L238: .loc 1 2689 22 mov.u32 %r1910153911911,[%value_in]; } $L367: .loc 1 2687 7 add.u64 %r1913,%r1530,%r623; .loc 1 2688 2 @ ! %r1361 bra $L383; .loc 1 2689 22 mov.u32 %r191519119151916,[%value_in]; } $L383: .loc 1 2687 7 add.u64 %r1918,%r1913,%r623; .loc 1 2688 2 @ ! %r1361 bra $L384; .loc 1 2689 22 mov.u32 %r192019121921,[%value_in]; } $L384: .loc 1 2687 24 add.u64 %r298,%r1477,3; .loc 1 2687 7 add.u64 %r622,%r1918,%r623; setp.eq.u64 %r1922,%r298,%r626; @ ! 
%r1922 bra $L34; bra $L33; $L374: .loc 1 2938 23 ld.u32 %r1923,[%r627]; ld.u32 %r1924,[%r654]; mul.lo.u32 %r302,%r1923,%r1924; .loc 1 2937 5 add.u64 %r627,%r627,4; add.u64 %r628,%r654,4; $L311: .loc 1 2938 23 ld.u32 %r1927,[%r627]; ld.u32 %r1928,[%r628]; .loc 1 2938 9 mad.lo.u32 %r302,%r1927,%r1928,%r302; .loc 1 2937 5 add.u64 %r627,%r627,4; add.u64 %r628,%r628,4; $L310: .loc 1 2938 23 ld.u32 %r1931,[%r627]; ld.u32 %r1932,[%r628]; .loc 1 2938 9 mad.lo.u32 %r302,%r1931,%r1932,%r302; .loc 1 2937 5 add.u64 %r627,%r627,4; add.u64 %r628,%r628,4; setp.ne.u64 %r1934,%r627,%r633; @ %r1934 bra $L93; bra $L92; $L375: .loc 1 2953 26 ld.u32 %r1935,[%r244]; ld.u32 %r1936,[%r658]; mul.lo.u32 %r300,%r1935,%r1936; .loc 1 2952 8 add.u64 %r655,%r244,%r657; add.u64 %r658,%r658,4; $L313: .loc 1 2953 26 ld.u32 %r1939,[%r655]; ld.u32 %r1940,[%r658]; .loc 1 2953 5 mad.lo.u32 %r300,%r1939,%r1940,%r300; .loc 1 2952 8 add.u64 %r655,%r655,%r657; add.u64 %r658,%r658,4; $L312: .loc 1 2953 26 ld.u32 %r1943,[%r655]; ld.u32 %r1944,[%r658]; .loc 1 2953 5 mad.lo.u32 %r300,%r1943,%r1944,%r300; .loc 1 2952 8 add.u64 %r655,%r655,%r657; add.u64 %r658,%r658,4; setp.ne.u64 %r1946,%r658,%r674; @ %r1946 bra $L97; bra $L96; $L376: .loc 1 2968 29 ld.u32 %r1947,[%r244]; ld.u32 %r1948,[%r678]; mul.lo.u32 %r313,%r1947,%r1948; .loc 1 2967 28 mov.u64 %r314,1; .loc 1 2967 4 add.u64 %r675,%r244,%r677; add.u64 %r678,%r678,%r680; $L315: .loc 1 2968 29 ld.u32 %r1951,[%r675]; ld.u32 %r1952,[%r678]; .loc 1 2968 8 mad.lo.u32 %r313,%r1951,%r1952,%r313; .loc 1 2967 28 add.u64 %r314,%r314,1; .loc 1 2967 4 add.u64 %r675,%r675,%r677; add.u64 %r678,%r678,%r680; $L314: .loc 1 2968 29 ld.u32 %r1955,[%r675]; ld.u32 %r1956,[%r678]; .loc 1 2968 8 mad.lo.u32 %r313,%r1955,%r1956,%r313; .loc 1 2967 28 add.u64 %r314,%r314,1; .loc 1 2967 4 add.u64 %r675,%r675,%r677; add.u64 %r678,%r678,%r680; setp.ne.u64 %r1958,%r457,%r314; @ %r1958 bra $L100; bra $L99; $L378: .loc 1 2983 37 ld.u32 %r1959,[%r692]; ld.u32 %r1960,[%r701]; .loc 1 2982 36 ld.u32 %r1962,[%r712]; mad.lo.u32 %r1963,%r1959,%r1960,%r1962; st.u32 [%r712],%r1963; .loc 1 2980 29 mov.u64 %r309,1; .loc 1 2980 4 add.u64 %r691,%r712,%r717; add.u64 %r692,%r692,%r694; $L319: .loc 1 2983 37 ld.u32 %r1965,[%r692]; ld.u32 %r1966,[%r701]; .loc 1 2982 36 ld.u32 %r1968,[%r691]; mad.lo.u32 %r1969,%r1965,%r1966,%r1968; st.u32 [%r691],%r1969; .loc 1 2980 29 add.u64 %r309,%r309,1; .loc 1 2980 4 add.u64 %r691,%r691,%r717; add.u64 %r692,%r692,%r694; $L318: .loc 1 2983 37 ld.u32 %r1971,[%r692]; ld.u32 %r1972,[%r701]; .loc 1 2982 36 ld.u32 %r1974,[%r691]; mad.lo.u32 %r1975,%r1971,%r1972,%r1974; st.u32 [%r691],%r1975; .loc 1 2980 29 add.u64 %r309,%r309,1; .loc 1 2980 4 add.u64 %r691,%r691,%r717; add.u64 %r692,%r692,%r694; setp.ne.u64 %r1976,%r215,%r309; @ %r1976 bra $L110; bra $L113; $L377: .loc 1 2976 34 st.u32 [%r715],%r1417; .loc 1 2975 27 mov.u64 %r311,1; .loc 1 2975 2 add.u64 %r715,%r715,%r717; $L317: .loc 1 2976 34 st.u32 [%r715],%r1417; .loc 1 2975 27 add.u64 %r311,%r311,1; .loc 1 2975 2 add.u64 %r715,%r715,%r717; $L316: .loc 1 2976 34 st.u32 [%r715],%r1417; .loc 1 2975 27 add.u64 %r311,%r311,1; .loc 1 2975 2 add.u64 %r715,%r715,%r717; setp.ne.u64 %r1979,%r215,%r311; @ %r1979 bra $L105; bra $L109; $L379: .loc 1 3002 28 ld.u32 %r1980,[%r723]; ld.u32 %r1981,[%r749]; mul.lo.u32 %r305,%r1980,%r1981; .loc 1 3001 32 mov.u64 %r306,1; .loc 1 3001 8 add.u64 %r723,%r723,%r725; add.u64 %r727,%r749,%r729; $L321: .loc 1 3002 28 ld.u32 %r1984,[%r723]; ld.u32 %r1985,[%r727]; .loc 1 3002 5 mad.lo.u32 %r305,%r1984,%r1985,%r305; .loc 1 
3001 32 add.u64 %r306,%r306,1; .loc 1 3001 8 add.u64 %r723,%r723,%r725; add.u64 %r727,%r727,%r729; $L320: .loc 1 3002 28 ld.u32 %r1988,[%r723]; ld.u32 %r1989,[%r727]; .loc 1 3002 5 mad.lo.u32 %r305,%r1988,%r1989,%r305; .loc 1 3001 32 add.u64 %r306,%r306,1; .loc 1 3001 8 add.u64 %r723,%r723,%r725; add.u64 %r727,%r727,%r729; setp.ne.u64 %r1991,%r457,%r306; @ %r1991 bra $L118; bra $L117; } matmul_i8.o/_gfortran_matmul_i8 .visible .func _gfortran_matmul_, .param .u32 %in_ar4, .param .u64 %in_ar5); .file 1 "../../../libgfortran/generated/matmul_i8.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]FUNCTION DECL: _gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime_97,116,109,117,108,95,105,56,0 }2] = {782] = {672] = {8471,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,97,41,32,61,61,32,50,32,124,124,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,98,41,32,61,61,3243116,114,97,110,47,103,101,110,101,114,97,116,101,100,47,109,97,116,109,117,108,95,105,56,465,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,49,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3265,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,50,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3273,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,97,114,103,117,109,101,110,116,32,66,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,49,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }103,101,109,109,32,33,61,32,78,85,76,7_gfortran_matmul_i8 .visible .func _gfortran_matmul_8964u64 %r280; .reg .u64 %r281; .reg .u64 %r282; .reg .u64 %r283; .reg .u64u64u64u64 %r305; .reg .u64 %r306; .reg .u64u64u32 %r331; .reg .u64 %r332; .reg .u64u64 %r374; .reg .u64u64 %r390; .reg .u64 %r392; .reg .u64u64u64 %r409; .reg .u64u64u64 %r530; .reg .u64 %r536; .reg .u64 %r537; .reg .u64 %r538; .reg .u64u64 %r555; .reg .u64u64 %r571; .reg .u64 %r573; .reg .u64 %r574; .reg .u64 %r575; .reg .u64 %r576; .reg .u64 %r577; .reg .u64 %r585; .reg .u64 %r586; .reg .u64 %r587; .reg .u64 %r588; .reg .u64 %r589; .reg .u64 %r590; .reg .u64 %r591; .reg .u64 %r592; .reg .u64 %r594; .reg .u64u64.reg .u64 %r619; .reg .u64 %r620; .reg .u64 %r621; .reg .u64 %r624; .reg .u64 %r625; .reg .u64 %r626; .reg .u64 %r629; .reg .u64 %r630; .reg .u64 %r631; .reg .u64 %r633; .reg .u64 %r634; .reg .u64 %r640; .reg .u64 %r642; .reg .u64 %r643; .reg .u64 %r647; .reg .u64 %r652; .reg .u64 %r653; .reg .u64 %r655; .reg .u64 %r656; .reg .u64 %r661; .reg .u64 %r663; .reg .u64 %r664; .reg .u64 %r672; .reg .u64 %r673; .reg .u64 %r675; .reg .u64 %r676; .reg .u64 %r678; .reg .u64 %r681; .reg .u64 %r683; .reg .u64u64 %r699; .reg .u64 %r701; .reg .u64 %r705; .reg .u64 %r706; .reg .u64 %r710; .reg .u64 %r713; .reg .u64 %r715; .reg .u64 %r717; .reg .u64 %r721; .reg .u64 %r723; .reg .u64 %r725; .reg .u64 %r727; .reg .u64 %r730; .reg .u64 %r732; .reg .u64 %r734; .reg .u64 %r739; .reg .u64 %r741; .reg .u64 %r747; .reg .u64 %r748; .reg .u64 %r749; .reg .u64 %r750; .reg .u32 %r751; .reg .u32 %r752; .reg .u64 %r753; .reg .u16 %r754; .reg .pred %r755; .reg .u16 %r756; .reg .pred %r758; .reg 
.u64 %r759; .reg .u32 %r760; .reg .u64 %r761; .reg .u64 %r762; .reg .u64 %r763; .reg .pred %r764; .reg .pred %r766; .reg .u64 %r769; .reg .u64 %r771; .reg .u64 %r772; .reg .u64 %r773; .reg .u64 %r775; .reg .u64 %r776; .reg .u64 %r777; .reg .u64 %r778; .reg .u64 %r779; .reg .u64 %r782; .reg .u64 %r784; .reg .u64 %r785; .reg .u64 %r787; .reg .u64 %r789; .reg .u64 %r790; .reg .u64 %r792; .reg .u64 %r793; .reg .u64 %r794; .reg .u64 %r795; .reg .u64 %r796; .reg .u32 %r797; .reg .pred %r798; .reg .u64 %r799; .reg .u64 %r800; .reg .u64 %r801; .reg .pred %r803; .reg .u64 %r804; .reg .u64 %r805; .reg .u64 %r806; .reg .pred %r807; .reg .u64 %r808; .reg .u64 %r809; .reg .u64 %r810; .reg .u16 %r811; .reg .pred %r812; .reg .u64 %r813; .reg .u64 %r815; .reg .u64 %r816; .reg .u64 %r817; .reg .u16 %r818; .reg .pred %r820; .reg .pred %r821; .reg .u64 %r822; .reg .pred %r824; .reg .u64 %r825; .reg .u64 %r827; .reg .u64 %r828; .reg .u64 %r829; .reg .u64 %r830; .reg .u64 %r831; .reg .u64 %r832; .reg .pred %r833; .reg .u64 %r834; .reg .u64 %r836; .reg .u64 %r837; .reg .u64 %r838; .reg .u16 %r839; .reg .u32 %r840; .reg .pred %r841; .reg .u16 %r842; .reg .pred %r843; .reg .u64 %r844; .reg .u64 %r845; .reg .u64 %r846; .reg .u64 %r847; .reg .u64 %r848; .reg .u64 %r849; .reg .pred %r850; .reg .u32 %r852; .reg .u32 %r853; .reg .u32 %r855; .reg .u32 %r856; .reg .u16 %r857; .reg .u16 %r858; .reg .u16 %r859; .reg .u32 %r860; .reg .u16 %r861; .reg .pred %r862; .reg .u64 %r863; .reg .u16 %r865; .reg .pred %r866; .reg .u64 %r867; .reg .u64 %r868; .reg .u64 %r869; .reg .u32 %r870; .reg .u32 %r871; .reg .u32 %r872; .reg .u32 %r873; .reg .u32 %r874; .reg .u32 %r875; .reg .u32 %r877; .reg .u32 %r878; .reg .u16 %r879; .reg .u16 %r880; .reg .u16 %r881; .reg .u32 %r882; .reg .u16 %r883; .reg .pred %r884; .reg .u32 %r886; .reg .u32 %r887; .reg .u16 %r888; .reg .u16 %r889; .reg .u16 %r890; .reg .u32 %r891; .reg .u16 %r892; .reg .pred %r893; .reg .u32 %r895; .reg .u32 %r896; .reg .u32 %r898; .reg .u32 %r899; .reg .u16 %r900; .reg .u16 %r901; .reg .u16 %r902; .reg .u32 %r903; .reg .u16 %r904; .reg .pred %r905; .reg .f32 %r906; .reg .f32 %r907; .reg .f32 %r908; .reg .f32 %r909; .reg .f32 %r910; .reg .f32 %r911; .reg .f32 %r912; .reg .pred %r913; .reg .u64 %r914; .reg .u64 %r915; .reg .pred %r916; .reg .pred %r917; .reg .u32 %r919; .reg .u32 %r920; .reg .u32 %r922; .reg .u32 %r923; .reg .u16 %r924; .reg .u16 %r925; .reg .u16 %r926; .reg .u32 %r927; .reg .u16 %r928; .reg .pred %r929; .reg .u32 %r931; .reg .u32 %r932; .reg .u32 %r934; .reg .u32 %r935; .reg .u16 %r936; .reg .u16 %r937; .reg .u16 %r938; .reg .u32 %r939; .reg .u16 %r940; .reg .pred %r941; .reg .u32 %r943; .reg .u32 %r944; .reg .u32 %r946; .reg .u32 %r947; .reg .u16 %r948; .reg .u16 %r949; .reg .u16 %r950; .reg .u32 %r951; .reg .u16 %r952; .reg .pred %r953; .reg .pred %r954; .reg .u64 %r955; .reg .u32 %r956; .reg .u64 %r957; .reg .u64 %r958; .reg .u32 %r959; .reg .pred %r960; .reg .pred %r961; .reg .u32 %r962; .reg .pred %r963; .reg .pred %r964; .reg .u32 %r979; .reg .u64 %r980; .reg .u64 %r981; .reg .u64 %r982; .reg .u64 %r983; .reg .u64 %r984; .reg .u64 %r985; .reg .u64 %r986; .reg .u16 %r987; .reg .u16 %r988; .reg .u16 %r989; .reg .u32 %r990; .reg .u16 %r991; .reg .pred %r992; .reg .pred %r993; .reg .u16 %r994; .reg .pred %r995; .reg .u64 %r996; .reg .pred %r997; .reg .u32 %r1001; .reg .u64 %r1003; .reg .u32 %r1008; .reg .u32 %r1009; .reg .u32 %r1011; .reg .u32 %r1012; .reg .u16 %r1013; .reg .u16 %r1014; .reg .u16 %r1015; .reg .u32 %r1017; .reg .u32 %r1018; .reg .u16 
%r1020; .reg .u16 %r1022; .reg .u32 %r1023; .reg .u16 %r1024; .reg .pred %r1025; .reg .pred %r1026; .reg .u64 %r1027; .reg .u64 %r1029; .reg .u64 %r1030; .reg .u64 %r1031; .reg .pred %r1033; .reg .u64 %r1035; .reg .u64 %r1036; .reg .u64 %r1037; .reg .u64 %r1040; .reg .u64 %r1041; .reg .u64 %r1045; .reg .u64 %r1046; .reg .u64 %r1047; .reg .u64 %r1049; .reg .u64 %r1050; .reg .u64 %r1053; .reg .u64 %r1056; .reg .u64 %r1057; .reg .u64 %r1060; .reg .u64 %r1062; .reg .u64 %r1063; .reg .u64 %r1064; .reg .u64 %r1065; .reg .u64 %r1066; .reg .pred %r1068; .reg .u64 %r1074; .reg .u64 %r1075; .reg .u64 %r1076; .reg .u64 %r1077; .reg .pred %r1079; .reg .u64 %r1080; .reg .u64 %r1081; .reg .u64 %r1082; .reg .u64 %r1083; .reg .u64 %r1084; .reg .u64 %r1086; .reg .u64 %r1087; .reg .u64 %r1088; .reg .u64 %r1089; .reg .u64 %r1090; .reg .u64 %r1091; .reg .u64 %r1092; .reg .u64 %r1093; .reg .u64 %r1097; .reg .u64 %r1098; .reg .u64 %r1099; .reg .u64 %r1100; .reg .u64 %r1104; .reg .u64 %r1106; .reg .u64 %r1109; .reg .u64 %r1112; .reg .u64 %r1117; .reg .u64 %r1118; .reg .u64 %r1120; .reg .u64 %r1121; .reg .u64 %r1122; .reg .u64 %r1123; .reg .u64 %r1124; .reg .u64 %r1125; .reg .u64 %r1126; .reg .u64 %r1127; .reg .u64 %r1129; .reg .u64 %r1130; .reg .u64 %r1131; .reg .u64 %r1132; .reg .u64 %r1133; .reg .pred %r1134; .reg .u64 %r1137; .reg .u64 %r1138; .reg .u64 %r1139; .reg .u64 %r1140; .reg .u64 %r1142; .reg .pred %r1143; .reg .u64 %r1144; .reg .u64 %r1145; .reg .u64 %r1146; .reg .u64 %r1149; .reg .u64 %r1150; .reg .u64 %r1151; .reg .u64 %r1152; .reg .u64 %r1153; .reg .u64 %r1154; .reg .u64 %r1155; .reg .u64 %r1156; .reg .u64 %r1158; .reg .u64 %r1159; .reg .u64 %r1160; .reg .u64 %r1161; .reg .u64 %r1166; .reg .u64 %r1169; .reg .u64 %r1172; .reg .u64 %r1175; .reg .u64 %r1178; .reg .u64 %r1180; .reg .pred %r1188; .reg .pred %r1189; .reg .pred %r1190; .reg .u64 %r1194; .reg .u64 %r1196; .reg .u64 %r1198; .reg .u64 %r1200; .reg .u64 %r1202; .reg .pred %r1204; .reg .pred %r1205; .reg .pred %r1206; .reg .u64 %r1207; .reg .u64 %r1208; .reg .u64 %r1210; .reg .u64 %r1211; .reg .u64 %r1212; .reg .u64 %r1213; .reg .u64 %r1221; .reg .u64 %r1222; .reg .u64 %r1223; .reg .u64 %r1225; .reg .u64 %r1227; .reg .u64 %r1229; .reg .u64 %r1230; .reg .u64 %r1232; .reg .u64 %r1233; .reg .u64 %r1235; .reg .pred %r1237; .reg .pred %r1238; .reg .u64 %r1241; .reg .u64 %r1243; .reg .u64 %r1244; .reg .pred %r1246; .reg .u16 %r1247; .reg .pred %r1248; .reg .u32 %r1250; .reg .u32 %r1251; .reg .u16 %r1252; .reg .u16 %r1253; .reg .u32 %r1255; .reg .u16 %r1256; .reg .pred %r1257; .reg .pred %r1258; .reg .u16 %r1259; .reg .pred %r1260; .reg .pred %r1261; .reg .pred %r1262; .reg .u64 %r1263; .reg .u64 %r1264; .reg .pred %r1266; .reg .u64 %r1267; .reg .u64 %r1268; .reg .u64 %r1270; .reg .u64 %r1272; .reg .u64 %r1273; .reg .pred %r1275; .reg .u64 %r1277; .reg .u64 %r1278; .reg .u64 %r1279; .reg .u64 %r1281; .reg .u64 %r1282; .reg .pred %r1284; .reg .pred %r1285; .reg .u64 %r1287; .reg .u64 %r1289; .reg .u64 %r1290; .reg .pred %r1292; .reg .pred %r1293; .reg .pred %r1294; .reg .pred %r1295; .reg .u64 %r1296; .reg .pred %r1299; .reg .u64 %r1301; .reg .u64 %r1303; .reg .u64 %r1304; .reg .u64 %r1305; .reg .u64 %r1306; .reg .pred %r1308; .reg .pred %r1310; .reg .u64 %r1312; .reg .u64 %r1313; .reg .pred %r1315; .reg .u64 %r1316; .reg .u64 %r1317; .reg .u64 %r1319; .reg .u64 %r1321; .reg .u64 %r1322; .reg .pred %r1324; .reg .pred %r1325; .reg .u64 %r1326; .reg .pred %r1327; .reg .u64 %r1328; .reg .u64 %r1329; .reg .u64 %r1330; .reg .u16 %r1331; .reg .u32 %r1332; 
.reg .pred %r1333; .reg .u16 %r1334; .reg .u32 %r1335; .reg .pred %r1336; .reg .u64 %r1337; .reg .u64 %r1338; .reg .u64 %r1339; .reg .u64 %r1340; .reg .u64 %r1341; .reg .u64 %r1342; .reg .u64 %r1343; .reg .u32 %r1344; .reg .pred %r1345; .reg .pred %r1347; .reg .pred %r1348; .reg .pred %r1349; .reg .pred %r1350; .reg .pred %r1351; .reg .u64 %r1353; .reg .u32 %r1354; .reg .u64 %r1355; .reg .pred %r1356; .reg .pred %r1361; .reg .pred %r1362; .reg .pred %r1363; .reg .pred %r1365; .reg .u64 %r1372; .reg .u64 %r1373; .reg .u64 %r1374; .reg .u64 %r1375; .reg .u64 %r1380; .reg .u64 %r1381; .reg .u64 %r1382; .reg .u64 %r1383; .reg .u64 %r1385; .reg .u64 %r1386; .reg .u64 %r1387; .reg .u64 %r1388; .reg .u64 %r1389; .reg .pred %r1390; .reg .u64 %r1392; .reg .pred %r1394; .reg .u64 %r1397; .reg .u64 %r1398; .reg .u64 %r1399; .reg .pred %r1400; .reg .u64 %r1401; .reg .u64 %r1402; .reg .pred %r1406; .reg .u64 %r1407; .reg .pred %r1408; .reg .u64 %r1409; .reg .u64 %r1410; .reg .u64 %r1411; .reg .u64 %r1412; .reg .u64 %r1413; .reg .pred %r1414; .reg .pred %r1415; .reg .pred %r1416; .reg .u64 %r1419; .reg .u64 %r1420; .reg .u64 %r1421; .reg .u64 %r1423; .reg .pred %r1424; .reg .u64 %r1426; .reg .u64 %r1427; .reg .u64 %r1428; .reg .u64 %r1429; .reg .u64 %r1430; .reg .u64 %r1431; .reg .pred %r1432; .reg .pred %r1433; .reg .pred %r1434; .reg .u64 %r1440; .reg .u64 %r1441; .reg .u64 %r1442; .reg .u64 %r1443; .reg .u64 %r1444; .reg .pred %r1445; .reg .pred %r1446; .reg .pred %r1447; .reg .u64 %r1450; .reg .u64 %r1451; .reg .u64 %r1452; .reg .u64 %r1453; .reg .u64 %r1454; .reg .pred %r1455; .reg .pred %r1456; .reg .pred %r1457; .reg .u64 %r1458; .reg .u64 %r1459; .reg .u64 %r1463; .reg .pred %r1464; .reg .pred %r1465; .reg .pred %r1466; .reg .u64 %r1467; .reg .u64 %r1468; .reg .u64 %r1469; .reg .u64 %r1470; .reg .u64 %r1471; .reg .u64 %r1472; .reg .pred %r1473; .reg .pred %r1474; .reg .pred %r1475; .reg .u64 %r1478; .reg .u64 %r1479; .reg .u64 %r1480; .reg .u64 %r1481; .reg .u64 %r1482; .reg .pred %r1483; .reg .pred %r1484; .reg .pred %r1485; .reg .u64 %r1490; .reg .pred %r1491; .reg .pred %r1492; .reg .pred %r1493; .reg .u64 %r1498; .reg .pred %r1499; .reg .pred %r1500; .reg .pred %r1501; .reg .u64 %r1506; .reg .pred %r1507; .reg .pred %r1508; .reg .pred %r1509; .reg .u64 %r1514; .reg .pred %r1515; .reg .pred %r1516; .reg .pred %r1517; .reg .u64 %r1520; .reg .u64 %r1521; .reg .u64 %r1523; .reg .u64 %r1524; .reg .u64 %r1527; .reg .u64 %r1528; .reg .u64 %r1529; .reg .u64 %r1530; .reg .u64 %r1531; .reg .u64 %r1532; .reg .u64 %r1533; .reg .u64 %r1534; .reg .u64 %r1535; .reg .u64 %r1536; .reg .u64 %r1538; .reg .u64 %r1539; .reg .u64 %r1540; .reg .u64 %r1541; .reg .u64 %r1542; .reg .u64 %r1543; .reg .u64 %r1544; .reg .u64 %r1545; .reg .u64 %r1546; .reg .u64 %r1547; .reg .u64 %r1549; .reg .u64 %r1550; .reg .u64 %r1551; .reg .u64 %r1552; .reg .u64 %r1553; .reg .u64 %r1554; .reg .u64 %r1555; .reg .u64 %r1556; .reg .pred %r1557; .reg .u64 %r1560; .reg .u64 %r1564; .reg .u64 %r1568; .reg .pred %r1569; .reg .u64 %r1570; .reg .u64 %r1571; .reg .u64 %r1572; .reg .u64 %r1573; .reg .u64 %r1579; .reg .u64 %r1580; .reg .u64 %r1582; .reg .u64 %r1583; .reg .u64 %r1585; .reg .u64 %r1586; .reg .u64 %r1588; .reg .u64 %r1589; .reg .u64 %r1591; .reg .u64 %r1598; .reg .u64 %r1599; .reg .u64 %r1601; .reg .u64 %r1602; .reg .u64 %r1604; .reg .u64 %r1605; .reg .u64 %r1607; .reg .u64 %r1608; .reg .u64 %r1610; .reg .u64 %r1617; .reg .u64 %r1618; .reg .u64 %r1620; .reg .u64 %r1622; .reg .u64 %r1624; .reg .pred %r1626; .reg .pred %r1627; .reg 
.u64 %r1628; .reg .u64 %r1629; .reg .u64 %r1630; .reg .u64 %r1631; .reg .u64 %r1634; .reg .u64 %r1635; .reg .u64 %r1636; .reg .u64 %r1637; .reg .u64 %r1639; .reg .u64 %r1641; .reg .u64 %r1642; .reg .u64 %r1644; .reg .u64 %r1645; .reg .pred %r1647; .reg .u64 %r1648; .reg .u64 %r1651; .reg .u64 %r1652; .reg .u64 %r1654; .reg .u64 %r1658; .reg .u64 %r1659; .reg .u64 %r1661; .reg .u64 %r1665; .reg .u64 %r1666; .reg .pred %r1668; .reg .pred %r1669; .reg .u64 %r1670; .reg .u64 %r1673; .reg .u64 %r1674; .reg .u64 %r1676; .reg .u64 %r1680; .reg .u64 %r1681; .reg .u64 %r1683; .reg .u64 %r1687; .reg .u64 %r1688; .reg .pred %r1690; .reg .pred %r1691; .reg .u64 %r1692; .reg .u64 %r1693; .reg .u64 %r1695; .reg .u64 %r1696; .reg .u64 %r1698; .reg .u64 %r1699; .reg .u64 %r1702; .reg .u64 %r1703; .reg .u64 %r1705; .reg .u64 %r1706; .reg .u64 %r1709; .reg .u64 %r1710; .reg .pred %r1712; .reg .pred %r1713; .reg .u64 %r1714; .reg .u64 %r1716; .reg .u64 %r1717; .reg .u64 %r1718; .reg .u64 %r1719; .reg .u64 %r1721; .reg .u64 %r1723; .reg .u64 %r1724; .reg .u64 %r1726; .reg .u64 %r1727; .reg .u64 %r1729; .reg .u64 %r1731; .reg .u64 %r1732; .reg .u64 %r1734; .reg .u64 %r1735; .reg .pred %r1737; .reg .u64 %r1739; .reg .u64 %r1741; .reg .u64 %r1744; .reg .pred %r1746; .reg .pred %r1747; .reg .pred %r1748; .reg .u64 %r1750; .reg .u64 %r1751; .reg .u64 %r1752; .reg .u64 %r1753; .reg .u64 %r1755; .reg .u64 %r1756; .reg .u64 %r1758; .reg .u64 %r1759; .reg .u64 %r1761; .reg .u64 %r1762; .reg .u64 %r1764; .reg .u64 %r1765; .reg .u64 %r1767; .reg .u64 %r1768; .reg .u64 %r1770; .reg .u64 %r1771; .reg .u64 %r1773; .reg .u64 %r1774; .reg .pred %r1775; .reg .pred %r1776; .reg .u64 %r1778; .reg .u64 %r1780; .reg .u64 %r1781; .reg .u64 %r1782; .reg .u64 %r1783; .reg .u64 %r1785; .reg .u64 %r1787; .reg .u64 %r1788; .reg .u64 %r1790; .reg .u64 %r1791; .reg .u64 %r1793; .reg .u64 %r1795; .reg .u64 %r1796; .reg .u64 %r1798; .reg .u64 %r1799; .reg .pred %r1801; .reg .u64 %r1802; .reg .u64 %r1803; .reg .u64 %r1806; .reg .u64 %r1807; .reg .u64 %r1810; .reg .u64 %r1811; .reg .pred %r1813; .reg .u64 %r1814; .reg .u64 %r1815; .reg .u64 %r1816; .reg .u64 %r1817; .reg .u64 %r1819; .reg .u64 %r1821; .reg .u64 %r1822; .reg .u64 %r1824; .reg .u64 %r1825; .reg .pred %r1827; .reg .u64 %r1828; .reg .u64 %r1829; .reg .u64 %r1831; .reg .u64 %r1833; .reg .u64 %r1835; .reg .u64 %r1838; .reg .u64 %r1839; .reg .u64 %r1841; .reg .u64 %r1843; .reg .u64 %r1845; .reg .u64 %r1848; .reg .u64 %r1849; .reg .u64 %r1851; .reg .u64 %r1853; .reg .u64 %r1855; .reg .pred %r1857; .reg .u64 %r1858; .reg .u64 %r1860; .reg .u64 %r1862; .reg .pred %r1863; .reg .u64 %r1864; .reg .u64 %r1865; .reg .u64 %r1866; .reg .u64 %r1867; .reg .u64 %r1868; .reg .u64 %r1869; .reg .u64 %r1870; .reg .u64 %r1871; .reg .u64 %r1873; .reg .u64 %r1874; .reg .u64 %r1875; .reg .u64 %r1876; .reg .u64 %r1877; .reg .u64 %r1878; .reg .u64 %r1879; .reg .u64 %r1880; .reg .u64 %r1882; .reg .u64 %r1883; .reg .u64 %r1884; .reg .u64 %r1885; .reg .u64 %r1886; .reg .u64 %r1887; .reg .u64 %r1888; .reg .u64 %r1889; .reg .pred %r1890; .reg .u32 %r1891; .reg .u64 %r1892; .reg .u32 %r1894; .reg .u64 %r1895; .reg .u32 %r1897; .reg .u64 %r1898; .reg .pred %r1899; .reg .u32 %r1900; .reg .u64 %r1901; .reg .u64 %r1903; .reg .u32 %r1905; .reg .u64 %r1906; .reg .u64 %r1908; .reg .u32 %r1910; .reg .u64 %r1911; .reg .pred %r1912; .reg .u64 %r1913; .reg .u64 %r1914; .reg .u64 %r1917; .reg .u64 %r1918; .reg .u64 %r1921; .reg .u64 %r1922; .reg .pred %r1924; .reg .u64 %r1925; .reg .u64 %r1926; .reg .u64 %r1929; .reg .u64 
%r1930; .reg .u64 %r1933; .reg .u64 %r1934; .reg .pred %r1936; .reg .u64 %r1937; .reg .u64 %r1938; .reg .u64 %r1941; .reg .u64 %r1942; .reg .u64 %r1945; .reg .u64 %r1946; .reg .pred %r1948; .reg .u64 %r1949; .reg .u64 %r1950; .reg .u64 %r1952; .reg .u64 %r1953; .reg .u64 %r1955; .reg .u64 %r1956; .reg .u64 %r1958; .reg .u64 %r1959; .reg .u64 %r1961; .reg .u64 %r1962; .reg .u64 %r1964; .reg .u64 %r1965; .reg .pred %r1966; .reg .pred %r1969; .reg .u64 %r1970; .reg .u64 %r1971; .reg .u64 %r1974; .reg .u64 %r1975; .reg .u64 %r1978; .reg .u64 %r1979; .reg .pred %r1981; mov.u64 %r748,%ar0; mov.u64 %r749,%ar1; mov.u64 %r750,%ar2; mov.u32 %r751,%ar3; mov.u32 %r752,%ar4; mov.u64 %r753,%ar5; .loc 1 2462 3 ld.s8 %r446,[%r749+28]; cvt.u16.u32 %r754,%r446; setp.eq.u16 %r755,%r754,2; @ %r755 bra $L2; ld.s8 %r1354,[%r750+28]; cvt.u16.u32 %r756,%r1354; setp.eq.u16 %r758,%r756,2; @ %r758 bra $L3; cvta.const.u64 %r762,$LC3; cvta.const.u64 %r761,__func__$0; mov.u32 %r760,2462; cvta.const.u64 %r7767623: .loc 1 2476 6 ld.u64 %r763,[%r748]; setp.ne.u64 %r764,%r763,0; @ %r764 bra $L4; .loc 1 2478 10 setp.ne.u16 %r766,%r754,1; @ %r766 bra $L5; .loc 1 2480 4 st.u64 [%r748+48],%r763; ld.u64 %r769,[%r750+80]; ld.u64 %r771,[%r750+72]; sub.u64 %r772,%r769,%r771; st.u64 [%r748+56],%r772; mov.u64 %r773,1; st.u64 [%r748+40],%r773; bra $L6; $L123: .loc 1 2485 4 st.u64 [%r748+48],%r1326; add.u64 %r775,%r508,-1; st.u64 [%r748+56],%r775; mov.u64 %r776,1; st.u64 [%r748+40],%r776; bra $L6; $L124: .loc 1 2490 4 mov.u64 %r777,0; st.u64 [%r748+48],%r777; add.u64 %r778,%r508,-1; st.u64 [%r748+56],%r778; mov.u64 %r779,1; st.u64 [%r748+40],%r779; .loc 1 2493 11 st.u64 [%r748+72],%r777; ld.u64 %r782,[%r750+80]; ld.u64 %r784,[%r750+72]; sub.u64 %r785,%r782,%r784; st.u64 [%r748+80],%r785; st.u64 [%r748+64],%r508; $L6: .loc 1 2499 1748; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r787,[%value_in]; } .loc 1 2499 4 mov.u64 %r789789; call (%value_in),_gfortrani_xmallocarray790,[%value_in]; } .loc 1 2499 2 st.u64 [%r748],%r790; .loc 1 2500 24 mov.u64 %r792,0; st.u64 [%r748+8],%r792; .loc 1 2557 7 ld.s8 %r446,[%r749+28]; .loc 1 2564 15 ld.u64 %r794,[%r749+56]; add.u64 %r793,%r794,1; .loc 1 2564 13 ld.u64 %r795,[%r749+48]; sub.u64 %r456,%r793,%r795; bra $L7; $L4: .loc 1 2502 12 cvta.global.u64 %r796,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r797,[%r796+36]; setp.eq.u32 %r798,%r797,0; @ %r798 bra $L8; .loc 1 2509 17 ld.u64 %r800,[%r748+56]; add.u64 %r799,%r800,1; .loc 1 2509 15 ld.u64 %r801,[%r748+48]; sub.u64 %r456,%r799,%r801; .loc 1 2506 10 setp.ne.u16 %r803,%r754,1; @ %r803 bra $L9; .loc 1 2508 17 ld.u64 %r805,[%r750+80]; add.u64 %r804,%r805,1; .loc 1 2508 15 ld.u64 %r806,[%r750+72]; sub.u64 %r238,%r804,%r806; .loc 1 2510 7 setp.ne.u64 %r807,%r238,%r456; @ %r807 bra $L10; .loc 1 2564 15 ld.u64 %r809,[%r749+56]; add.u64 %r808,%r809,1; .loc 1 2564 13 ld.u64 %r810,[%r749+48]; sub.u64 %r456,%r808,%r810; .loc 1 2543 7 ld.s8 %r331,[%r748+28]; .loc 1 2548 27 ld.u64 %r444,[%r748+40]; .loc 1 2543 6 cvt.u16.u32 %r811,%r331; setp.eq.u16 %r812,%r811,1; @ %r812 bra $L11; .loc 1 2553 16 ld.u64 %r504,[%r748+64]; .loc 1 2560 16 ld.u64 %r503,[%r749+40]; .loc 1 2563 14 mov.u64 %r216,1; .loc 1 2561 16 mov.u64 %r208,%r216; bra $L12; $L10: .loc 1 2511 6 st.u64 [%stack+8],%r238; st.u64 [%stack],%r456; cvta.const.u64 %r813813_gfortran_runtime_error2485 4 ld.u64 %r816,[%r749+56]; add.u64 %r815,%r816,1; ld.u64 %r817,[%r749+48]; sub.u64 %r512,%r815,%r817; .loc 1 2515 15 cvt.u16.u32 %r818,%r1354; setp.ne.u16 %r820,%r818,1; @ %r820 bra $L13; 
.loc 1 2519 7 setp.eq.u64 %r821,%r512,%r456; @ %r821 bra $L14; .loc 1 2520 6 st.u64 [%stack+8],%r512; st.u64 [%stack],%r456; cvta.const.u64 %r8228stack; call _gfortran_runtime_error13: .loc 1 2528 7 setp.eq.u64 %r824,%r512,%r456; @ %r824 bra $L15; .loc 1 2529 6 st.u64 [%stack+8],%r512; st.u64 [%stack],%r456; cvta.const.u64 %r825825_gfortran_runtime_error15: .loc 1 2533 17 ld.u64 %r828,[%r750+80]; add.u64 %r827,%r828,1; .loc 1 2533 15 ld.u64 %r829,[%r750+72]; sub.u64 %r236,%r827,%r829; .loc 1 2534 17 ld.u64 %r831,[%r748+80]; add.u64 %r830,%r831,1; .loc 1 2534 15 ld.u64 %r832,[%r748+72]; sub.u64 %r237,%r830,%r832; .loc 1 2535 7 setp.eq.u64 %r833,%r236,%r237; @ %r833 bra $L14; .loc 1 2536 6 st.u64 [%stack+8],%r236; st.u64 [%stack],%r237; cvta.const.u64 %r834,$LC8stack; call _gfortran_runtime_error2564 15 ld.u64 %r837,[%r749+56]; add.u64 %r836,%r837,1; .loc 1 2564 13 ld.u64 %r838,[%r749+48]; sub.u64 %r456,%r836,%r838; $L7: .loc 1 2548 27 ld.u64 %r504,[%r748+40]; .loc 1 2543 6 ld.s8 %r840,[%r748+28]; cvt.u16.u32 %r839,%r840; setp.eq.u16 %r841,%r839,1; @ %r841 bra $L130; $L127: .loc 1 2553 16 mov.u64 %r444,%r504; ld.u64 %r504,[%r748+64]; bra $L16; $L130: .loc 1 2548 27 mov.u64 %r444,%r504; $L16: .loc 1 2560 16 ld.u64 %r503,[%r749+40]; .loc 1 2557 6 cvt.u16.u32 %r842,%r446; setp.eq.u16 %r843,%r842,1; @ %r843 bra $L131; $L125: .loc 1 2569 16 ld.u64 %r208,[%r749+64]; .loc 1 2571 15 ld.u64 %r845,[%r749+80]; add.u64 %r844,%r845,1; .loc 1 2571 13 ld.u64 %r846,[%r749+72]; mov.u64 %r216,%r456; sub.u64 %r456,%r844,%r846; bra $L12; $L131: .loc 1 2563 14 mov.u64 %r216,1; .loc 1 2561 16 mov.u64 %r208,%r216; $L12: .loc 1 2575 16 ld.u64 %r848,[%r750+56]; add.u64 %r847,%r848,1; ld.u64 %r849,[%r750+48]; sub.u64 %r64,%r847,%r849; .loc 1 2575 6 setp.eq.u64 %r850,%r64,%r456; @ %r850 bra $L17; .loc 1 2577 10 set.u32.gt.s64 %r852,%r456,0; neg.s32 %r853,%r852; .loc 1 2577 21 set.u32.gt.s64 %r855,%r64,0; neg.s32 %r856,%r855; cvt.u16.u32 %r858,%r853; cvt.u16.u32 %r859,%r856; or.b16 %r857,%r858,%r859; cvt.u32.u16 %r860,%r857; cvt.u16.u8 %r861,%r860; setp.eq.u16 %r862,%r861,0; @ %r862 bra $L17; .loc 1 2578 2 st.u64 [%stack+8],%r456; st.u64 [%stack],%r64;_gfortran_runtime_error17: .loc 1 2583 7 ld.s8 %r66,[%r750+28]; .loc 1 2586 16 ld.u64 %r502,[%r750+40]; .loc 1 2583 6 cvt.u16.u32 %r865,%r66; setp.eq.u16 %r866,%r865,1; @ %r866 bra $L132; .loc 1 2597 16 ld.u64 %r209,[%r750+64]; .loc 1 2598 16 ld.u64 %r868,[%r750+80]; add.u64 %r867,%r868,1; .loc 1 2598 14 ld.u64 %r869,[%r750+72]; sub.u64 %r217,%r867,%r869; bra $L18; $L132: .loc 1 2592 14 mov.u64 %r217,1; .loc 1 2591 16 mov.u64 %r209,256; $L18: .loc 1 2601 9 ld.u64 %r243,[%r749]; .loc 1 2602 9 ld.u64 %r244,[%r750]; .loc 1 2603 8 ld.u64 %r245,[%r748]; .loc 1 2612 28 set.u32.eq.u64 %r871,%r444,1; neg.s32 %r872,%r871; cvt.u32.u32 %r870,%r872; cvt.u32.u8 %r71,%r870; .loc 1 2612 46 set.u32.eq.u64 %r874,%r503,1; neg.s32 %r875,%r874; cvt.u32.u32 %r873,%r875; cvt.u32.u8 %r501,%r873; .loc 1 2612 7 set.u32.ne.u32 %r877,%r751,0; neg.s32 %r878,%r877; .loc 1 2612 16 cvt.u16.u32 %r880,%r878; cvt.u16.u32 %r881,%r71; and.b16 %r879,%r880,%r881; .loc 1 2612 6 cvt.u32.u16 %r882,%r879; cvt.u16.u8 %r883,%r882; setp.eq.u16 %r884,%r883,0; @ %r884 bra $L19; .loc 1 2612 63 set.u32.eq.u64 %r886,%r208,1; neg.s32 %r887,%r886; .loc 1 2612 51 cvt.u16.u32 %r889,%r887; cvt.u16.u32 %r890,%r501; or.b16 %r888,%r889,%r890; .loc 1 2612 33 cvt.u32.u16 %r891,%r888; cvt.u16.u8 %r892,%r891; setp.eq.u16 %r893,%r892,0; @ %r893 bra $L20; .loc 1 2613 20 set.u32.eq.u64 %r895,%r502,1; neg.s32 %r896,%r895; .loc 1 2613 37 
set.u32.eq.u64 %r898,%r209,1; neg.s32 %r899,%r898; .loc 1 2613 25 cvt.u16.u32 %r901,%r896; cvt.u16.u32 %r902,%r899; or.b16 %r900,%r901,%r902; .loc 1 2613 7 cvt.u32.u16 %r903,%r900; cvt.u16.u8 %r904,%r903; setp.eq.u16 %r905,%r904,0; @ %r905 bra $L20; .loc 1 2615 13 cvt.rn.f32.s32 %r83,%r752; .loc 1 2614 12 cvt.rn.f32.s64 %r906,%r216; .loc 1 2614 50 cvt.rn.f32.s64 %r907,%r456; .loc 1 2614 47 mul.f32 %r908,%r906,%r907; .loc 1 2614 31 cvt.rn.f32.s64 %r909,%r217; .loc 1 2614 47 mul.f32 %r910,%r908,%r909; .loc 1 2615 13 mul.f32 %r911,%r83,%r83; mul.f32 %r912,%r911,%r83; .loc 1 2614 7 setp.gt.f32 %r913,%r910,%r912; @ ! %r913 bra $L19; .loc 1 2617 17 cvt.u32.u64 %r85,%r216; st.u32 [%frame+36],%r85; .loc 1 2617 29 cvt.u32.u64 %r86,%r217; st.u32 [%frame+32],%r86; .loc 1 2617 41 cvt.u32.u64 %r87,%r456; st.u32 [%frame+28],%r87; .loc 1 2617 52 cvt.u32.u64 %r88,%r504; st.u32 [%frame+24],%r88; .loc 1 2618 27 mov.u64 %r914,1; st.u64 [%frame+8],%r914; .loc 1 2618 36 mov.u64 %r915,0; st.u64 [%frame],%r915; .loc 1 2619 17 setp.ne.u64 %r916,%r503,1; @ %r916 bra $L22; cvt.u32.u64 %r230,%r208; bra $L23; $L22: cvt.u32.u64 %r230,%r503; $L23: st.u32 [%frame+20],%r230; .loc 1 2620 3 setp.ne.u64 %r917,%r502,1; @ %r917 bra $L24; cvt.u32.u64 %r231,%r209; bra $L25; $L24: cvt.u32.u64 %r231,%r502; $L25: st.u32 [%frame+16],%r231; .loc 1 2622 15 set.u32.gt.s32 %r919,%r230,0; neg.s32 %r920,%r919; .loc 1 2622 26 set.u32.gt.s32 %r922,%r231,0; neg.s32 %r923,%r922; .loc 1 2622 19 cvt.u16.u32 %r925,%r920; cvt.u16.u32 %r926,%r923; and.b16 %r924,%r925,%r926; .loc 1 2622 10 cvt.u32.u16 %r927,%r924; cvt.u16.u8 %r928,%r927; setp.eq.u16 %r929,%r928,0; @ %r929 bra $L19; .loc 1 2622 37 set.u32.gt.s32 %r931,%r88,0; neg.s32 %r932,%r931; .loc 1 2622 46 set.u32.gt.s32 %r934,%r85,1; neg.s32 %r935,%r934; .loc 1 2622 41 cvt.u16.u32 %r937,%r932; cvt.u16.u32 %r938,%r935; and.b16 %r936,%r937,%r938; cvt.u32.u16 %r939,%r936; cvt.u16.u8 %r940,%r939; setp.eq.u16 %r941,%r940,0; @ %r941 bra $L19; .loc 1 2622 55 set.u32.gt.s32 %r943,%r86,1; neg.s32 %r944,%r943; .loc 1 2622 64 set.u32.gt.s32 %r946,%r87,1; neg.s32 %r947,%r946; .loc 1 2622 59 cvt.u16.u32 %r949,%r944; cvt.u16.u32 %r950,%r947; and.b16 %r948,%r949,%r950; cvt.u32.u16 %r951,%r948; cvt.u16.u8 %r952,%r951; setp.eq.u16 %r953,%r952,0; @ %r953 bra $L19; .loc 1 2624 4 setp.ne.u64 %r954,%r753,0; @ %r954 bra $L27; cvta.const.u64 %r958,$LC8; cvta.const.u64 %r957,__func__$0; mov.u32 %r956,2624; cvta.const.u64 %r9957927: .loc 1 2626 17 and.b32 %r959,%r751,2; .loc 1 2626 7 setp.ne.u32 %r960,%r959,0; @ %r960 bra $L133; .loc 1 2629 35 setp.eq.u64 %r961,%r503,1; @ %r961 bra $L134; cvta.const.u64 %r218,$LC2; bra $L28; $L133: .loc 1 2627 13 cvta.const.u64 %r218,$LC1; bra $L28; $L134: .loc 1 2629 35 cvta.const.u64 %r218,$LC0; $L28: .loc 1 2631 17 and.b32 %r962,%r751,4; .loc 1 2631 7 setp.ne.u32 %r963,%r962,0; @ %r963 bra $L135; .loc 1 2634 35 setp.eq.u64 %r964,%r502,1; @ %r964 bra $L136; cvta.const.u64 %r219,$LC2; bra $L29; $L135: .loc 1 2632 13 cvta.const.u64 %r219,$LC1; bra $L29; $L136: .loc 1 2634 35 cvta.const.u64 %r219,$LC0; $L29: .loc 1 2636 4 add.u64 %r980,%frame,24; add.u64 %r981,%frame,16; add.u64 %r982,%frame,20; add.u64 %r983,%frame,8; add.u64 %r984,%frame,28; add.u64 %r985,%frame,32; add.u64 %r986,%frame,36; mov.u32 %r979,1;,.param .u64 %in_ar3,.param .u64 %in_ar4,.param .u64 %in_ar5,.param .u64 %in_ar6,.param .u64 %in_ar7,.param .u64 %in_ar8,.param .u64 %in_ar9,.param .u64 %in_ar10,.param .u64 %in_ar11,.param .u64 %in_ar12,.param .u32 %in_ar13,.param .u32 %in_ar14);r219984243; .param .u64 %out_arg8; 
st.param.u64 [%out_arg8],%r982; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r244; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r981; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%frame; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r245; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r980; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r979; .param .u32 %out_arg15; st.param.u32 [%out_arg15],%r979; call %r753,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14,%out_arg15),$LCT0; } bra $L1; $L19: .loc 1 2643 21 cvt.u16.u32 %r988,%r71; cvt.u16.u32 %r989,%r501; and.b16 %r987,%r988,%r989; .loc 1 2643 6 cvt.u32.u16 %r990,%r987; cvt.u16.u8 %r991,%r990; setp.eq.u16 %r992,%r991,0; @ %r992 bra $L31; .loc 1 2643 38 setp.ne.u64 %r993,%r502,1; @ %r993 bra $L20; .loc 1 2644 7 cvt.u16.u32 %r994,%r66; setp.eq.u16 %r995,%r994,1; @ %r995 bra $L32; .loc 1 2677 16 add.u64 %r996,%r504,1; .loc 1 2678 9 shl.b64 %r104,%r996,3; sub.u64 %r247,%r245,%r104; .loc 1 2687 7 setp.le.s64 %r997,%r217,0; @ %r997 bra $L33; add.u64 %r621,%r104,-8; mov.u64 %r620,%r245; add.u64 %r624,%r217,1; .loc 1 2689 22 shl.b64 %r441,%r216,3; .loc 1 2687 13 mov.u64 %r296,%r502; setp.gt.s64 %r1351,%r216,0; and.b64 %r1463,%r217,3; setp.eq.u64 %r1466,%r1463,0; @ %r1466 bra $L34; setp.eq.u64 %r1465,%r1463,1; @ %r1465 bra $L300; setp.eq.u64 %r1464,%r1463,2; @ %r1464 bra $L301; bra $L366; $L35: .loc 1 2689 22 mov.u32 %r10010011003,[%value_in]; } $L36: .loc 1 2687 24 add.u64 %r1467,%r296,1; .loc 1 2687 7 add.u64 %r1520,%r620,%r621; .loc 1 2688 2 @ %r1351 bra $L238; bra $L367; $L34: @ %r1351 bra $L35; bra $L36; $L33: .loc 1 2692 28 set.u32.eq.u64 %r1008,%r456,0; neg.s32 %r1009,%r1008; .loc 1 2692 13 set.u32.eq.u64 %r1011,%r216,0; neg.s32 %r1012,%r1011; .loc 1 2692 28 cvt.u16.u32 %r1014,%r1009; cvt.u16.u32 %r1015,%r1012; or.b16 %r1013,%r1014,%r1015; .loc 1 2692 23 set.u32.eq.u64 %r1017,%r217,0; neg.s32 %r1018,%r1017; .loc 1 2692 28 cvt.u16.u32 %r1022,%r1018; or.b16 %r1020,%r1013,%r1022; cvt.u32.u16 %r1023,%r1020; cvt.u16.u8 %r1024,%r1023; setp.eq.u16 %r1025,%r1024,0; @ ! 
%r1025 bra $L1; .loc 1 2697 10 setp.eq.u64 %r1026,%r208,1; selp.u64 %r228,%r504,%r208,%r1026; .loc 1 2702 21 shl.b64 %r1027,%r228,8; .loc 1 2702 14 add.u64 %r252,%r1027,%r209; .loc 1 2706 12 min.s64 %r1029,%r252,65536; shl.b64 %r1030,%r1029031031,[%value_in]; } .loc 1 2710 7 setp.gt.s64 %r1033,%r217,0; @ %r1033 bra $L39; $L42: .loc 1 2910312918 7 bra $L1; $L39: .loc 1 2680 16 add.u64 %r1035,%r208,1; .loc 1 2681 9 shl.b64 %r107,%r1035,3; sub.u64 %r249,%r243,%r107; shl.b64 %r594,%r209,9; shl.b64 %r596,%r209,12; shl.b64 %r605,%r209,4; .loc 1 2683 16 add.u64 %r1036,%r209,1; .loc 1 2684 9 shl.b64 %r1037,%r1036,3; sub.u64 %r597,%r244,%r1037; add.u64 %r595,%r605,%r597; shl.b64 %r600,%r504,9; add.u64 %r601,%r504,%r504; shl.b64 %r602,%r504,2; neg.s64 %r1040,%r209; shl.b64 %r1041,%r1040,12; shl.b64 %r603,%r1040,3; shl.b64 %r607,%r504,12; shl.b64 %r1045,%r504,4; add.u64 %r1046,%r1045,8; add.u64 %r606,%r247,%r1046; add.u64 %r1047,%r217,-1; and.b64 %r615,%r1047,-512; add.u64 %r616,%r597,%r603; shl.b64 %r573,%r208,8; shl.b64 %r575,%r208,11; add.u64 %r576,%r107,-8; neg.s64 %r1049,%r208; shl.b64 %r1050,%r1049,11; shl.b64 %r1053,%r1049,4; shl.b64 %r537,%r208,4; add.u64 %r538,%r208,%r208; shl.b64 %r226,%r209,5; shl.b64 %r221,%r504,5; neg.s64 %r1056,%r504; shl.b64 %r1057,%r1056,3; shl.b64 %r566,%r504,3; shl.b64 %r1060,%r1040,5; shl.b64 %r619,%r209,3; mov.u64 %r598,%r601; mov.u64 %r592,%r209; mov.u64 %r590,0; setp.gt.s64 %r1363,%r456,0; add.u64 %r1399,%r616,8; setp.gt.s64 %r1400,%r216,0; add.u64 %r1401,%r216,1; add.u64 %r1402,%r249,16; bra $L41; $L138: mov.u64 %r590,%r591; $L41: add.u64 %r618,%r590,1; .loc 1 2715 9 sub.u64 %r254,%r217,%r590; min.s64 %r1521,%r254,512; .loc 1 2716 24 shr.s64 %r1062,%r1521,63; shr.u64 %r1063,%r1062,62; add.u64 %r1064,%r1521,%r1063; and.b64 %r1065,%r1064,3; sub.u64 %r1066,%r1065,%r1063; .loc 1 2716 10 sub.u64 %r255,%r1521,%r1066; .loc 1 2718 4 @ %r1363 bra $L40; $L45: .loc 1 2710 7 add.u64 %r591,%r590,512; add.u64 %r592,%r592,%r594; add.u64 %r595,%r595,%r596; add.u64 %r598,%r598,%r600; add.u64 %r602,%r602,%r600; add.u64 %r603,%r603,%r1041; add.u64 %r605,%r605,%r596; add.u64 %r606,%r606,%r607; setp.ne.u64 %r1068,%r590,%r615; @ %r1068 bra $L138; bra $L42; $L40: add.u64 %r570,%r1399,%r605; .loc 1 2768 13 add.u64 %r480,%r255,%r618; add.u64 %r477,%r605,%r619; add.u64 %r475,%r477,%r619; add.u64 %r588,%r595,%r619; add.u64 %r589,%r588,%r619; mul.lo.u64 %r406,%r480,%r504; mul.lo.u64 %r399,%r480,%r209; mad.lo.u64 %r585,%r480,%r619,%r597; mov.u64 %r577,%r1053; mov.u64 %r574,%r576; mov.u64 %r571,%r208; .loc 1 2718 12 mov.u64 %r258,%r502; add.u64 %r1355,%r456,1; .loc 1 2768 8 add.u64 %r1392,%r480,-1; setp.ge.s64 %r1394,%r1392,%r618; sub.u64 %r1397,%r598,%r504; add.u64 %r1398,%r1397,%r601; $L44: .loc 1 2723 13 sub.u64 %r256,%r1355,%r258; min.s64 %r1523,%r256,256; .loc 1 2724 28 shr.u64 %r1074,%r1523,63; add.u64 %r1075,%r1523,%r1074; and.b64 %r1076,%r1075,1; sub.u64 %r1077,%r1076,%r1074; .loc 1 2724 14 sub.u64 %r257,%r1523,%r1077; .loc 1 2727 8 @ %r1400 bra $L43; $L73: .loc 1 2718 30 add.u64 %r258,%r258,256; .loc 1 2718 4 add.u64 %r570,%r570,2048; add.u64 %r571,%r571,%r573; add.u64 %r574,%r574,%r575; add.u64 %r577,%r577,%r1050; setp.ge.s64 %r1079,%r456,%r258; @ %r1079 bra $L44; bra $L45; $L43: .loc 1 2763 26 add.u64 %r1080,%r258,-1; add.u64 %r1081,%r1080,%r1523; .loc 1 2763 31 mul.lo.u64 %r314,%r1081,%r208; .loc 1 2734 13 add.u64 %r479,%r257,%r258; .loc 1 2734 8 add.u64 %r481,%r479,-1; add.u64 %r1082,%r314,1; shl.b64 %r1083,%r1082,3; add.u64 %r556,%r249,%r1083; add.u64 %r567,%r571,%r208; 
sub.u64 %r1084,%r567,%r314; shl.b64 %r569,%r1084,3; shl.b64 %r1086,%r1523,3; add.u64 %r1087,%r1086,-8; add.u64 %r330,%r1031,%r1087; add.u64 %r1088,%r258,%r592; add.u64 %r1089,%r1088,%r1523; shl.b64 %r1090,%r1089,3; add.u64 %r225,%r1090,%r597; shl.b64 %r587,%r258,3; add.u64 %r48,%r587,%r595; add.u64 %r46,%r587,%r588; add.u64 %r30,%r587,%r589; add.u64 %r586,%r585,%r587; add.u64 %r1091,%r399,%r258; add.u64 %r1092,%r1091,%r1523; shl.b64 %r1093,%r1092,3; add.u64 %r397,%r1093,%r597; .loc 1 2768 8 mov.u64 %r555,%r606; .loc 1 2727 16 mov.u64 %r273,%r502; setp.le.s64 %r1356,%r258,%r481; setp.gt.s64 %r1348,%r1523,%r257; setp.gt.s64 %r1347,%r1521,%r255; not.b64 %r1385,%r258; add.u64 %r1386,%r1385,%r479; and.b64 %r1387,%r1386,-2; .loc 1 2904 18 add.u64 %r1388,%r1523,%r258; .loc 1 2904 13 add.u64 %r1389,%r1388,-1; setp.gt.s64 %r1390,%r258,%r1389; $L72: .loc 1 2732 10 sub.u64 %r259,%r1401,%r273; min.s64 %r1524,%r259,256; .loc 1 2733 25 shr.s64 %r1353,%r1524,63; shr.u64 %r1097,%r1524,63; add.u64 %r1098,%r1524,%r1097; and.b64 %r1099,%r1098,1; sub.u64 %r1100,%r1099,%r1097; .loc 1 2733 11 sub.u64 %r260,%r1524,%r1100; .loc 1 2735 5 @ %r1356 bra $L46; $L55: .loc 1 2757 8 @ ! %r1348 bra $L48; bra $L47; $L46: .loc 1 2737 17 add.u64 %r484,%r260,%r273; .loc 1 2737 12 add.u64 %r485,%r484,-1; shl.b64 %r1104,%r1524,11; add.u64 %r1106,%r1104,-2048; add.u64 %r530,%r1031,%r1106; add.u64 %r536,%r556,%r569; add.u64 %r1109,%r1524,%r273; shl.b64 %r547,%r1109,3; shl.b64 %r551,%r1524,3; mov.u64 %r541,%r567; mov.u64 %r540,%r577; mov.u64 %r539,%r574; mov.u64 %r528,0; setp.le.s64 %r1361,%r273,%r485; setp.gt.s64 %r1349,%r1524,%r260; not.b64 %r1380,%r273; add.u64 %r1381,%r1380,%r484; and.b64 %r1382,%r1381,-2; add.u64 %r1383,%r1382,%r273; bra $L54; $L139: mov.u64 %r528,%r529; $L54: .loc 1 2738 9 @ %r1361 bra $L49; $L53: .loc 1 2749 12 @ %r1349 bra $L50; bra $L51; $L49: shl.b64 %r1112,%r528,3; add.u64 %r229,%r1031,%r1112; add.u64 %r1117,%r1383,%r541; shl.b64 %r1118,%r1117,3; add.u64 %r527,%r1118,%r1402; .loc 1 2738 9 mov.u64 %r211,%r536; sub.u64 %r1451,%r527,%r536; add.u64 %r1452,%r1451,-16; shr.u64 %r1450,%r1452,4; add.u64 %r1453,%r1450,1; and.b64 %r1454,%r1453,3; setp.eq.u64 %r1457,%r1454,0; @ %r1457 bra $L52; setp.eq.u64 %r1456,%r1454,1; @ %r1456 bra $L302; setp.eq.u64 %r1455,%r1454,2; @ %r1455 bra $L303; bra $L368; $L52: .loc 1 2741 7 add.u64 %r1120,%r211,%r539; add.u64 %r1121,%r1120,%r540; .loc 1 2740 49 ld.u64 %r1122,[%r1121]; st.u64 [%r229],%r1122; .loc 1 2742 49 ld.u64 %r1123,[%r211]; st.u64 [%r229+8],%r1123; .loc 1 2745 7 add.u64 %r1124,%r211,%r540; add.u64 %r1125,%r1124,%r539; .loc 1 2744 49 ld.u64 %r1126,[%r1125+8]; st.u64 [%r229+2048],%r1126; .loc 1 2746 49 ld.u64 %r1127,[%r211+8]; st.u64 [%r229+2056],%r1127; .loc 1 2738 9 add.u64 %r1458,%r229,4096; add.u64 %r1459,%r211,16; .loc 1 2741 7 add.u64 %r1527,%r1459,%r539; add.u64 %r1528,%r1527,%r540; .loc 1 2740 49 ld.u64 %r1529,[%r1528]; st.u64 [%r1458],%r1529; .loc 1 2742 49 ld.u64 %r1530,[%r1459]; st.u64 [%r1458+8],%r1530; .loc 1 2745 7 add.u64 %r1531,%r1459,%r540; add.u64 %r1532,%r1531,%r539; .loc 1 2744 49 ld.u64 %r1533,[%r1532+8]; st.u64 [%r1458+2048],%r1533; .loc 1 2746 49 ld.u64 %r1534,[%r1459+8]; st.u64 [%r1458+2056],%r1534; .loc 1 2738 9 add.u64 %r1535,%r229,8192; add.u64 %r1536,%r211,32; .loc 1 2741 7 add.u64 %r1538,%r1536,%r539; add.u64 %r1539,%r1538,%r540; .loc 1 2740 49 ld.u64 %r1540,[%r1539]; st.u64 [%r1535],%r1540; .loc 1 2742 49 ld.u64 %r1541,[%r1536]; st.u64 [%r1535+8],%r1541; .loc 1 2745 7 add.u64 %r1542,%r1536,%r540; add.u64 %r1543,%r1542,%r539; .loc 1 
2744 49 ld.u64 %r1544,[%r1543+8]; st.u64 [%r1535+2048],%r1544; .loc 1 2746 49 ld.u64 %r1545,[%r1536+8]; st.u64 [%r1535+2056],%r1545; .loc 1 2738 9 add.u64 %r1546,%r229,12288; add.u64 %r1547,%r211,48; .loc 1 2741 7 add.u64 %r1549,%r1547,%r539; add.u64 %r1550,%r1549,%r540; .loc 1 2740 49 ld.u64 %r1551,[%r1550]; st.u64 [%r1546],%r1551; .loc 1 2742 49 ld.u64 %r1552,[%r1547]; st.u64 [%r1546+8],%r1552; .loc 1 2745 7 add.u64 %r1553,%r1547,%r540; add.u64 %r1554,%r1553,%r539; .loc 1 2744 49 ld.u64 %r1555,[%r1554+8]; st.u64 [%r1546+2048],%r1555; .loc 1 2746 49 ld.u64 %r1556,[%r1547+8]; st.u64 [%r1546+2056],%r1556; .loc 1 2738 9 add.u64 %r229,%r229,16384; add.u64 %r211,%r211,64; setp.ne.u64 %r1557,%r211,%r527; @ %r1557 bra $L52; bra $L53; $L50: .loc 1 2752 10 add.u64 %r1129,%r539,%r547; add.u64 %r1130,%r249,%r1129; .loc 1 2751 41 ld.u64 %r1131,[%r1130+-8]; st.u64 [%r530],%r1131; .loc 1 2754 10 add.u64 %r1132,%r536,%r551; .loc 1 2753 41 ld.u64 %r1133,[%r1132+-8]; st.u64 [%r530+8],%r1133; $L51: .loc 1 2735 5 add.u64 %r529,%r528,2; add.u64 %r530,%r530,16; add.u64 %r536,%r536,%r537; add.u64 %r539,%r539,%r537; add.u64 %r540,%r540,%r1053; add.u64 %r541,%r541,%r538; setp.ne.u64 %r1134,%r528,%r1387; @ %r1134 bra $L139; bra $L55; $L48: .loc 1 2767 25 shr.u64 %r1137,%r1353,62; add.u64 %r1138,%r1524,%r1137; and.b64 %r1139,%r1138,3; sub.u64 %r1140,%r1139,%r1137; .loc 1 2767 11 sub.u64 %r263,%r1524,%r1140; .loc 1 2769 5 @ %r1394 bra $L56; bra $L57; $L47: .loc 1 2759 17 add.u64 %r132,%r1524,%r273; .loc 1 2759 12 add.u64 %r1142,%r132,-1; .loc 1 2760 9 setp.lt.s64 %r1143,%r1142,%r273; @ %r1143 bra $L48; add.u64 %r1144,%r132,%r314; shl.b64 %r1145,%r1144,3; add.u64 %r136,%r249,%r1145; mov.u64 %r73,%r330; mov.u64 %r57,%r556; sub.u64 %r1441,%r136,%r556; add.u64 %r1442,%r1441,-8; shr.u64 %r1440,%r1442,3; add.u64 %r1443,%r1440,1; and.b64 %r1444,%r1443,3; setp.eq.u64 %r1447,%r1444,0; @ %r1447 bra $L58; setp.eq.u64 %r1446,%r1444,1; @ %r1446 bra $L304; setp.eq.u64 %r1445,%r1444,2; @ %r1445 bra $L305; bra $L369; $L58: .loc 1 2762 43 ld.u64 %r1146,[%r57]; st.u64 [%r73],%r1146; ld.u64 %r1560,[%r57+8]; st.u64 [%r73+2048],%r1560; ld.u64 %r1564,[%r57+16]; st.u64 [%r73+4096],%r1564; ld.u64 %r1568,[%r57+24]; st.u64 [%r73+6144],%r1568; .loc 1 2760 9 add.u64 %r57,%r57,32; add.u64 %r73,%r73,8192; setp.ne.u64 %r1569,%r57,%r136; @ %r1569 bra $L58; bra $L48; $L57: .loc 1 2871 8 @ ! %r1347 bra $L60; bra $L59; $L56: .loc 1 2876 14 add.u64 %r488,%r263,%r273; .loc 1 2876 9 add.u64 %r489,%r488,-1; add.u64 %r222,%r555,%r1057; add.u64 %r89,%r555,%r566; add.u64 %r42,%r566,%r89; not.b64 %r1149,%r273; add.u64 %r1150,%r1149,%r488; shr.u64 %r1151,%r1150,2; add.u64 %r1152,%r1151,1; shl.b64 %r318,%r1152,5; add.u64 %r1153,%r1397,%r488; shl.b64 %r1154,%r1153,3; add.u64 %r468,%r1154,%r247; add.u64 %r1155,%r488,%r598; shl.b64 %r1156,%r1155,3; add.u64 %r463,%r1156,%r247; add.u64 %r1158,%r1398,%r488; shl.b64 %r1159,%r1158,3; add.u64 %r460,%r1159,%r247; add.u64 %r1160,%r488,%r602; shl.b64 %r1161,%r1160,3; add.u64 %r457,%r1161,%r247; shl.b64 %r369,%r263,8; shl.b64 %r471,%r1524,3; mov.u64 %r476,%r475; mov.u64 %r478,%r477; mov.u64 %r483,%r605; mov.u64 %r486,%r603; mov.u64 %r220,%r555; mov.u64 %r227,%r225; mov.u64 %r278,%r618; mov.u64 %r473,0; mov.u64 %r474,%r473; setp.le.s64 %r1362,%r273,%r489; setp.gt.s64 %r1350,%r1524,%r263; $L68: .loc 1 2772 9 @ %r1362 bra $L61; $L67: .loc 1 2843 12 @ ! 
%r1350 bra $L63; bra $L62; $L61: .loc 1 2904 13 mov.u64 %r341,0; mov.u64 %r342,8192; mov.u64 %r343,6144; mov.u64 %r344,2048; mov.u64 %r345,-4096; $L66: add.u64 %r339,%r222,%r341; .loc 1 2774 10 ld.u64 %r279,[%r339]; .loc 1 2775 10 ld.u64 %r280,[%r339+8]; add.u64 %r337,%r220,%r341; .loc 1 2776 10 ld.u64 %r281,[%r337]; .loc 1 2777 10 ld.u64 %r282,[%r337+8]; add.u64 %r335,%r89,%r341; .loc 1 2778 10 ld.u64 %r283,[%r335]; .loc 1 2779 10 ld.u64 %r284,[%r335+8]; add.u64 %r333,%r42,%r341; .loc 1 2780 10 ld.u64 %r285,[%r333]; .loc 1 2781 10 ld.u64 %r286,[%r333+8]; .loc 1 2782 10 ld.u64 %r287,[%r339+16]; .loc 1 2783 10 ld.u64 %r288,[%r339+24]; .loc 1 2784 10 ld.u64 %r289,[%r337+16]; .loc 1 2785 10 ld.u64 %r290,[%r337+24]; .loc 1 2786 10 ld.u64 %r291,[%r335+16]; .loc 1 2787 10 ld.u64 %r292,[%r335+24]; .loc 1 2788 10 ld.u64 %r293,[%r333+16]; .loc 1 2789 10 ld.u64 %r294,[%r333+24]; .loc 1 2791 6 @ %r1390 bra $L64; add.u64 %r361,%r473,%r570; add.u64 %r359,%r1031,%r344; $L65: add.u64 %r356,%r345,%r359; .loc 1 2793 19 add.u64 %r1166,%r356,%r344; ld.u64 %r137,[%r1166]; .loc 1 2794 14 ld.u64 %r138,[%r361]; .loc 1 2793 14 mad.lo.u64 %r279,%r137,%r138,%r279; .loc 1 2795 19 ld.u64 %r140,[%r359]; .loc 1 2795 14 mad.lo.u64 %r280,%r138,%r140,%r280; add.u64 %r351,%r361,%r486; .loc 1 2798 14 add.u64 %r1169,%r351,%r483; ld.u64 %r142,[%r1169]; .loc 1 2797 14 mad.lo.u64 %r281,%r137,%r142,%r281; .loc 1 2799 14 mad.lo.u64 %r282,%r140,%r142,%r282; .loc 1 2802 14 add.u64 %r1172,%r351,%r478; ld.u64 %r145,[%r1172]; .loc 1 2801 14 mad.lo.u64 %r283,%r137,%r145,%r283; .loc 1 2803 14 mad.lo.u64 %r284,%r140,%r145,%r284; .loc 1 2806 14 add.u64 %r1175,%r351,%r476; ld.u64 %r148,[%r1175]; .loc 1 2805 14 mad.lo.u64 %r285,%r137,%r148,%r285; .loc 1 2807 14 mad.lo.u64 %r286,%r140,%r148,%r286; .loc 1 2809 19 add.u64 %r1178,%r356,%r343; ld.u64 %r151,[%r1178]; .loc 1 2809 14 mad.lo.u64 %r287,%r138,%r151,%r287; .loc 1 2811 19 add.u64 %r1180,%r356,%r342; ld.u64 %r153,[%r1180]; .loc 1 2811 14 mad.lo.u64 %r288,%r138,%r153,%r288; .loc 1 2813 14 mad.lo.u64 %r289,%r142,%r151,%r289; .loc 1 2815 14 mad.lo.u64 %r290,%r142,%r153,%r290; .loc 1 2817 14 mad.lo.u64 %r291,%r145,%r151,%r291; .loc 1 2819 14 mad.lo.u64 %r292,%r145,%r153,%r292; .loc 1 2821 14 mad.lo.u64 %r293,%r148,%r151,%r293; .loc 1 2823 14 mad.lo.u64 %r294,%r148,%r153,%r294; .loc 1 2791 6 add.u64 %r361,%r361,8; add.u64 %r359,%r359,8; setp.ne.u64 %r1188,%r227,%r361; @ %r1188 bra $L65; $L64: .loc 1 2826 24 st.u64 [%r339],%r279; .loc 1 2827 28 st.u64 [%r339+8],%r280; .loc 1 2828 30 st.u64 [%r337],%r281; .loc 1 2829 34 st.u64 [%r337+8],%r282; .loc 1 2830 30 st.u64 [%r335],%r283; .loc 1 2831 34 st.u64 [%r335+8],%r284; .loc 1 2832 30 st.u64 [%r333],%r285; .loc 1 2833 34 st.u64 [%r333+8],%r286; .loc 1 2834 28 st.u64 [%r339+16],%r287; .loc 1 2835 28 st.u64 [%r339+24],%r288; .loc 1 2836 34 st.u64 [%r337+16],%r289; .loc 1 2837 34 st.u64 [%r337+24],%r290; .loc 1 2838 34 st.u64 [%r335+16],%r291; .loc 1 2839 34 st.u64 [%r335+24],%r292; .loc 1 2840 34 st.u64 [%r333+16],%r293; .loc 1 2841 34 st.u64 [%r333+24],%r294; .loc 1 2772 9 add.u64 %r345,%r345,-8192; add.u64 %r344,%r344,8192; add.u64 %r343,%r343,8192; add.u64 %r342,%r342,8192; add.u64 %r341,%r341,32; setp.ne.u64 %r1189,%r318,%r341; @ %r1189 bra $L66; bra $L67; $L63: .loc 1 2769 29 add.u64 %r278,%r278,4; .loc 1 2769 5 add.u64 %r227,%r227,%r226; add.u64 %r222,%r222,%r221; add.u64 %r220,%r220,%r221; add.u64 %r89,%r89,%r221; add.u64 %r42,%r42,%r221; add.u64 %r486,%r486,%r1060; add.u64 %r483,%r483,%r226; add.u64 %r478,%r478,%r226; add.u64 
%r476,%r476,%r226; add.u64 %r474,%r474,%r221; add.u64 %r473,%r473,%r226; setp.le.s64 %r1190,%r278,%r1392; @ %r1190 bra $L68; bra $L57; $L62: add.u64 %r374,%r468,%r474; add.u64 %r373,%r463,%r474; add.u64 %r372,%r460,%r474; add.u64 %r371,%r457,%r474; add.u64 %r49,%r48,%r473; add.u64 %r239,%r46,%r473; add.u64 %r240,%r30,%r473; add.u64 %r472,%r222,%r471; .loc 1 2904 13 mov.u64 %r370,%r369; $L71: .loc 1 2848 14 ld.u64 %r274,[%r374]; .loc 1 2849 14 ld.u64 %r275,[%r373]; .loc 1 2850 14 ld.u64 %r276,[%r372]; .loc 1 2851 14 ld.u64 %r277,[%r371]; .loc 1 2853 10 @ %r1390 bra $L69; shl.b64 %r1194,%r370,3; add.u64 %r389,%r1031,%r1194; add.u64 %r387,%r473,%r570; mov.u64 %r381,%r240; mov.u64 %r383,%r239; mov.u64 %r385,%r49; sub.u64 %r1428,%r227,%r387; add.u64 %r1429,%r1428,-8; shr.u64 %r1427,%r1429,3; add.u64 %r1430,%r1427,1; and.b64 %r1431,%r1430,3; setp.eq.u64 %r1434,%r1431,0; @ %r1434 bra $L70; setp.eq.u64 %r1433,%r1431,1; @ %r1433 bra $L306; setp.eq.u64 %r1432,%r1431,2; @ %r1432 bra $L307; bra $L370; $L70: .loc 1 2855 16 ld.u64 %r162,[%r389]; .loc 1 2856 13 ld.u64 %r1196,[%r387]; .loc 1 2855 11 mad.lo.u64 %r1570,%r162,%r1196,%r274; .loc 1 2858 13 ld.u64 %r1198,[%r385]; .loc 1 2857 11 mad.lo.u64 %r1571,%r162,%r1198,%r275; .loc 1 2860 13 ld.u64 %r1200,[%r383]; .loc 1 2859 11 mad.lo.u64 %r1572,%r162,%r1200,%r276; .loc 1 2862 13 ld.u64 %r1202,[%r381]; .loc 1 2861 11 mad.lo.u64 %r1573,%r162,%r1202,%r277; .loc 1 2855 16 ld.u64 %r1579,[%r389+8]; .loc 1 2856 13 ld.u64 %r1580,[%r387+8]; .loc 1 2855 11 mad.lo.u64 %r1582,%r1579,%r1580,%r1570; .loc 1 2858 13 ld.u64 %r1583,[%r385+8]; .loc 1 2857 11 mad.lo.u64 %r1585,%r1579,%r1583,%r1571; .loc 1 2860 13 ld.u64 %r1586,[%r383+8]; .loc 1 2859 11 mad.lo.u64 %r1588,%r1579,%r1586,%r1572; .loc 1 2862 13 ld.u64 %r1589,[%r381+8]; .loc 1 2861 11 mad.lo.u64 %r1591,%r1579,%r1589,%r1573; .loc 1 2855 16 ld.u64 %r1598,[%r389+16]; .loc 1 2856 13 ld.u64 %r1599,[%r387+16]; .loc 1 2855 11 mad.lo.u64 %r1601,%r1598,%r1599,%r1582; .loc 1 2858 13 ld.u64 %r1602,[%r385+16]; .loc 1 2857 11 mad.lo.u64 %r1604,%r1598,%r1602,%r1585; .loc 1 2860 13 ld.u64 %r1605,[%r383+16]; .loc 1 2859 11 mad.lo.u64 %r1607,%r1598,%r1605,%r1588; .loc 1 2862 13 ld.u64 %r1608,[%r381+16]; .loc 1 2861 11 mad.lo.u64 %r1610,%r1598,%r1608,%r1591; .loc 1 2855 16 ld.u64 %r1617,[%r389+24]; .loc 1 2856 13 ld.u64 %r1618,[%r387+24]; .loc 1 2855 11 mad.lo.u64 %r274,%r1617,%r1618,%r1601; .loc 1 2858 13 ld.u64 %r1620,[%r385+24]; .loc 1 2857 11 mad.lo.u64 %r275,%r1617,%r1620,%r1604; .loc 1 2860 13 ld.u64 %r1622,[%r383+24]; .loc 1 2859 11 mad.lo.u64 %r276,%r1617,%r1622,%r1607; .loc 1 2862 13 ld.u64 %r1624,[%r381+24]; .loc 1 2861 11 mad.lo.u64 %r277,%r1617,%r1624,%r1610; .loc 1 2853 10 add.u64 %r389,%r389,32; add.u64 %r387,%r387,32; add.u64 %r385,%r385,32; add.u64 %r383,%r383,32; add.u64 %r381,%r381,32; setp.ne.u64 %r1626,%r227,%r387; @ %r1626 bra $L70; $L69: .loc 1 2864 28 st.u64 [%r374],%r274; .loc 1 2865 34 st.u64 [%r373],%r275; .loc 1 2866 34 st.u64 [%r372],%r276; .loc 1 2867 34 st.u64 [%r371],%r277; .loc 1 2846 6 add.u64 %r374,%r374,8; add.u64 %r373,%r373,8; add.u64 %r372,%r372,8; add.u64 %r371,%r371,8; add.u64 %r370,%r370,256; setp.ne.u64 %r1204,%r374,%r472; @ %r1204 bra $L71; bra $L63; $L60: .loc 1 2727 34 add.u64 %r273,%r273,256; .loc 1 2727 8 add.u64 %r555,%r555,2048; add.u64 %r556,%r556,2048; setp.ge.s64 %r1205,%r216,%r273; @ %r1205 bra $L72; bra $L73; $L59: .loc 1 2873 17 add.u64 %r171,%r1521,%r618; .loc 1 2874 9 setp.le.s64 %r1206,%r171,%r480; @ %r1206 bra $L60; .loc 1 2876 14 add.u64 %r495,%r263,%r273; .loc 1 2876 9 
add.u64 %r496,%r495,-1; .loc 1 2759 17 add.u64 %r405,%r1524,%r273; add.u64 %r1207,%r405,%r406; shl.b64 %r1208,%r1207,3; add.u64 %r407,%r1208,%r247; neg.s64 %r1210,%r405; shl.b64 %r1211,%r1210,3; shl.b64 %r1212,%r273,3; add.u64 %r390,%r1212,%r1211; shl.b64 %r1213,%r495,3; add.u64 %r392,%r1213,%r1211; shl.b64 %r434,%r263,8; .loc 1 2876 9 mov.u64 %r398,%r397; mov.u64 %r400,%r586; mov.u64 %r401,%r406; mov.u64 %r267,%r480; setp.gt.s64 %r1627,%r1524,%r263; setp.le.s64 %r1365,%r273,%r496; not.b64 %r1372,%r273; add.u64 %r1373,%r1372,%r495; and.b64 %r1374,%r1373,-4; add.u64 %r1375,%r1374,%r273; $L81: .loc 1 2877 6 @ %r1365 bra $L74; $L80: .loc 1 2901 6 @ %r1627 bra $L75; bra $L76; $L74: add.u64 %r419,%r390,%r407; add.u64 %r1221,%r1375,%r401; shl.b64 %r1222,%r1221,3; add.u64 %r1223,%r247,32; add.u64 %r409,%r1222,%r1223; .loc 1 2904 13 mov.u64 %r415,8192; mov.u64 %r416,6144; mov.u64 %r417,2048; mov.u64 %r418,-4096; $L79: .loc 1 2879 14 ld.u64 %r269,[%r419]; .loc 1 2880 14 ld.u64 %r270,[%r419+8]; .loc 1 2881 14 ld.u64 %r271,[%r419+16]; .loc 1 2882 14 ld.u64 %r272,[%r419+24]; .loc 1 2884 10 @ %r1390 bra $L77; add.u64 %r428,%r1031,%r417; mov.u64 %r429,%r400; sub.u64 %r1420,%r398,%r400; add.u64 %r1421,%r1420,-8; shr.u64 %r1419,%r1421,3; and.b64 %r1423,%r1419,1; setp.ne.u64 %r1424,%r1423,0; @ ! %r1424 bra $L371; $L78: add.u64 %r425,%r418,%r428; .loc 1 2887 16 ld.u64 %r173,[%r429]; .loc 1 2886 16 add.u64 %r1225,%r425,%r417; .loc 1 2887 13 ld.u64 %r1227,[%r1225]; .loc 1 2886 11 mad.lo.u64 %r1628,%r173,%r1227,%r269; .loc 1 2889 13 ld.u64 %r1229,[%r428]; .loc 1 2888 11 mad.lo.u64 %r1629,%r173,%r1229,%r270; .loc 1 2890 16 add.u64 %r1230,%r425,%r416; .loc 1 2891 13 ld.u64 %r1232,[%r1230]; .loc 1 2890 11 mad.lo.u64 %r1630,%r173,%r1232,%r271; .loc 1 2892 16 add.u64 %r1233,%r425,%r415; .loc 1 2893 13 ld.u64 %r1235,[%r1233]; .loc 1 2892 11 mad.lo.u64 %r1631,%r173,%r1235,%r272; .loc 1 2884 10 add.u64 %r1426,%r428,8; add.u64 %r1634,%r418,%r1426; .loc 1 2887 16 ld.u64 %r1635,[%r429+8]; .loc 1 2886 16 add.u64 %r1636,%r1634,%r417; .loc 1 2887 13 ld.u64 %r1637,[%r1636]; .loc 1 2886 11 mad.lo.u64 %r269,%r1635,%r1637,%r1628; .loc 1 2889 13 ld.u64 %r1639,[%r1426]; .loc 1 2888 11 mad.lo.u64 %r270,%r1635,%r1639,%r1629; .loc 1 2890 16 add.u64 %r1641,%r1634,%r416; .loc 1 2891 13 ld.u64 %r1642,[%r1641]; .loc 1 2890 11 mad.lo.u64 %r271,%r1635,%r1642,%r1630; .loc 1 2892 16 add.u64 %r1644,%r1634,%r415; .loc 1 2893 13 ld.u64 %r1645,[%r1644]; .loc 1 2892 11 mad.lo.u64 %r272,%r1635,%r1645,%r1631; .loc 1 2884 10 add.u64 %r429,%r429,16; add.u64 %r428,%r428,16; setp.ne.u64 %r1647,%r398,%r429; @ %r1647 bra $L78; $L77: .loc 1 2895 28 st.u64 [%r419],%r269; .loc 1 2896 32 st.u64 [%r419+8],%r270; .loc 1 2897 32 st.u64 [%r419+16],%r271; .loc 1 2898 32 st.u64 [%r419+24],%r272; .loc 1 2877 6 add.u64 %r419,%r419,32; add.u64 %r418,%r418,-8192; add.u64 %r417,%r417,8192; add.u64 %r416,%r416,8192; add.u64 %r415,%r415,8192; setp.ne.u64 %r1237,%r409,%r419; @ %r1237 bra $L79; bra $L80; $L76: .loc 1 2874 39 add.u64 %r267,%r267,1; .loc 1 2874 9 add.u64 %r407,%r407,%r566; add.u64 %r401,%r401,%r504; add.u64 %r400,%r400,%r619; add.u64 %r398,%r398,%r619; setp.ne.u64 %r1238,%r171,%r267; @ %r1238 bra $L81; bra $L60; $L75: add.u64 %r436,%r392,%r407; .loc 1 2904 13 mov.u64 %r435,%r434; $L84: .loc 1 2903 14 ld.u64 %r268,[%r436]; .loc 1 2905 10 @ %r1390 bra $L82; shl.b64 %r1241,%r435,3; add.u64 %r332,%r1031,%r1241; mov.u64 %r297,%r400; sub.u64 %r1410,%r398,%r400; add.u64 %r1411,%r1410,-8; shr.u64 %r1409,%r1411,3; add.u64 %r1412,%r1409,1; and.b64 %r1413,%r1412,3; 
setp.eq.u64 %r1416,%r1413,0; @ %r1416 bra $L83; setp.eq.u64 %r1415,%r1413,1; @ %r1415 bra $L308; setp.eq.u64 %r1414,%r1413,2; @ %r1414 bra $L309; bra $L372; $L83: .loc 1 2908 13 ld.u64 %r1243,[%r332]; ld.u64 %r1244,[%r297]; .loc 1 2907 11 mad.lo.u64 %r1648,%r1243,%r1244,%r268; .loc 1 2908 13 ld.u64 %r1651,[%r332+8]; ld.u64 %r1652,[%r297+8]; .loc 1 2907 11 mad.lo.u64 %r1654,%r1651,%r1652,%r1648; .loc 1 2908 13 ld.u64 %r1658,[%r332+16]; ld.u64 %r1659,[%r297+16]; .loc 1 2907 11 mad.lo.u64 %r1661,%r1658,%r1659,%r1654; .loc 1 2908 13 ld.u64 %r1665,[%r332+24]; ld.u64 %r1666,[%r297+24]; .loc 1 2907 11 mad.lo.u64 %r268,%r1665,%r1666,%r1661; .loc 1 2905 10 add.u64 %r332,%r332,32; add.u64 %r297,%r297,32; setp.ne.u64 %r1668,%r398,%r297; @ %r1668 bra $L83; $L82: .loc 1 2910 28 st.u64 [%r436],%r268; .loc 1 2901 6 add.u64 %r436,%r436,8; add.u64 %r435,%r435,256; setp.ne.u64 %r1246,%r407,%r436; @ %r1246 bra $L84; bra $L76; $L20: .loc 1 2958 11 cvt.u16.u32 %r1247,%r446; setp.eq.u16 %r1248,%r1247,1; @ ! %r1248 bra $L373; bra $L85; $L31: .loc 1 2920 38 set.u32.eq.u64 %r1250,%r208,1; neg.s32 %r1251,%r1250; .loc 1 2920 26 cvt.u16.u32 %r1253,%r1251; and.b16 %r1252,%r1253,%r988; .loc 1 2920 11 cvt.u32.u16 %r1255,%r1252; cvt.u16.u8 %r1256,%r1255; setp.eq.u16 %r1257,%r1256,0; @ %r1257 bra $L20; .loc 1 2920 43 setp.ne.u64 %r1258,%r502,1; @ %r1258 bra $L20; $L121: .loc 1 2922 10 cvt.u16.u32 %r1259,%r446; setp.ne.u16 %r1260,%r1259,1; @ %r1260 bra $L87; .loc 1 2948 4 setp.gt.s64 %r1261,%r217,0; @ %r1261 bra $L88; bra $L1; $L87: .loc 1 2929 4 setp.le.s64 %r1262,%r217,0; @ %r1262 bra $L1; shl.b64 %r643,%r504,3; shl.b64 %r1263,%r216,3; add.u64 %r642,%r245,%r1263; shl.b64 %r633,%r503,3; shl.b64 %r1264,%r456,3; add.u64 %r634,%r243,%r1264; mov.u64 %r647,0; mov.u64 %r640,%r647; .loc 1 2929 11 mov.u64 %r300,%r647; setp.gt.s64 %r1669,%r216,0; setp.le.s64 %r1406,%r456,0; $L91: .loc 1 2933 8 @ %r1669 bra $L90; $L95: .loc 1 2929 29 add.u64 %r300,%r300,1; .loc 1 2929 4 add.u64 %r640,%r640,%r504; add.u64 %r642,%r642,%r643; add.u64 %r647,%r647,%r209; setp.ne.u64 %r1266,%r217,%r300; @ %r1266 bra $L91; bra $L1; $L90: shl.b64 %r1267,%r640,3; add.u64 %r629,%r245,%r1267; shl.b64 %r1268,%r647,3; add.u64 %r652,%r244,%r1268; .loc 1 2933 8 mov.u64 %r631,%r634; mov.u64 %r630,0; $L94: .loc 1 2937 5 @ %r1406 bra $L140; shl.b64 %r1270,%r630,3; add.u64 %r625,%r243,%r1270; mov.u64 %r626,%r652; .loc 1 2936 7 mov.u64 %r301,0; sub.u64 %r1469,%r631,%r625; add.u64 %r1470,%r1469,-8; shr.u64 %r1468,%r1470,3; add.u64 %r1471,%r1468,1; and.b64 %r1472,%r1471,3; setp.eq.u64 %r1475,%r1472,0; @ %r1475 bra $L93; setp.eq.u64 %r1474,%r1472,1; @ %r1474 bra $L310; setp.eq.u64 %r1473,%r1472,2; @ %r1473 bra $L311; bra $L374; $L93: .loc 1 2938 23 ld.u64 %r1272,[%r625]; ld.u64 %r1273,[%r626]; .loc 1 2938 9 mad.lo.u64 %r1670,%r1272,%r1273,%r301; .loc 1 2938 23 ld.u64 %r1673,[%r625+8]; ld.u64 %r1674,[%r626+8]; .loc 1 2938 9 mad.lo.u64 %r1676,%r1673,%r1674,%r1670; .loc 1 2938 23 ld.u64 %r1680,[%r625+16]; ld.u64 %r1681,[%r626+16]; .loc 1 2938 9 mad.lo.u64 %r1683,%r1680,%r1681,%r1676; .loc 1 2938 23 ld.u64 %r1687,[%r625+24]; ld.u64 %r1688,[%r626+24]; .loc 1 2938 9 mad.lo.u64 %r301,%r1687,%r1688,%r1683; .loc 1 2937 5 add.u64 %r625,%r625,32; add.u64 %r626,%r626,32; setp.ne.u64 %r1690,%r625,%r631; @ %r1690 bra $L93; bra $L92; $L140: .loc 1 2936 7 mov.u64 %r301,0; $L92: .loc 1 2939 15 st.u64 [%r629],%r301; .loc 1 2933 8 add.u64 %r629,%r629,8; add.u64 %r630,%r630,%r503; add.u64 %r631,%r631,%r633; setp.ne.u64 %r1275,%r629,%r642; @ %r1275 bra $L94; bra $L95; $L88: shl.b64 
%r663,%r504,3; mov.u64 %r661,%r245; shl.b64 %r655,%r503,3; .loc 1 2948 4 mov.u64 %r664,0; .loc 1 2948 11 mov.u64 %r298,%r664; setp.le.s64 %r1691,%r456,0; $L98: .loc 1 2952 8 @ %r1691 bra $L141; mov.u64 %r653,%r243; shl.b64 %r1277,%r664,3; add.u64 %r656,%r1277,%r244; add.u64 %r1278,%r456,%r664; shl.b64 %r1279,%r1278,3; add.u64 %r672,%r1279,%r244; .loc 1 2951 10 mov.u64 %r299,0; sub.u64 %r1479,%r672,%r656; add.u64 %r1480,%r1479,-8; shr.u64 %r1478,%r1480,3; add.u64 %r1481,%r1478,1; and.b64 %r1482,%r1481,3; setp.eq.u64 %r1485,%r1482,0; @ %r1485 bra $L97; setp.eq.u64 %r1484,%r1482,1; @ %r1484 bra $L312; setp.eq.u64 %r1483,%r1482,2; @ %r1483 bra $L313; bra $L375; $L97: .loc 1 2953 26 ld.u64 %r1281,[%r653]; ld.u64 %r1282,[%r656]; .loc 1 2953 5 mad.lo.u64 %r1692,%r1281,%r1282,%r299; .loc 1 2952 8 add.u64 %r1693,%r653,%r655; .loc 1 2953 26 ld.u64 %r1695,[%r1693]; ld.u64 %r1696,[%r656+8]; .loc 1 2953 5 mad.lo.u64 %r1698,%r1695,%r1696,%r1692; .loc 1 2952 8 add.u64 %r1699,%r1693,%r655; .loc 1 2953 26 ld.u64 %r1702,[%r1699]; ld.u64 %r1703,[%r656+16]; .loc 1 2953 5 mad.lo.u64 %r1705,%r1702,%r1703,%r1698; .loc 1 2952 8 add.u64 %r1706,%r1699,%r655; .loc 1 2953 26 ld.u64 %r1709,[%r1706]; ld.u64 %r1710,[%r656+24]; .loc 1 2953 5 mad.lo.u64 %r299,%r1709,%r1710,%r1705; .loc 1 2952 8 add.u64 %r653,%r1706,%r655; add.u64 %r656,%r656,32; setp.ne.u64 %r1712,%r656,%r672; @ %r1712 bra $L97; bra $L96; $L141: .loc 1 2951 10 mov.u64 %r299,0; $L96: .loc 1 2954 25 st.u64 [%r661],%r299; .loc 1 2948 29 add.u64 %r298,%r298,1; .loc 1 2948 4 add.u64 %r661,%r661,%r663; add.u64 %r664,%r664,%r209; setp.ne.u64 %r1284,%r217,%r298; @ %r1284 bra $L98; bra $L1; $L85: .loc 1 2963 7 setp.le.s64 %r1285,%r217,0; @ %r1285 bra $L1; shl.b64 %r683,%r444,3; mov.u64 %r681,%r245; shl.b64 %r675,%r503,3; shl.b64 %r678,%r502,3; mov.u64 %r684,0; .loc 1 2963 14 mov.u64 %r311,%r684; setp.le.s64 %r1713,%r456,0; $L101: .loc 1 2967 4 @ %r1713 bra $L142; mov.u64 %r673,%r243; shl.b64 %r1287,%r684,3; add.u64 %r676,%r244,%r1287; .loc 1 2966 6 mov.u64 %r312,0; .loc 1 2967 11 mov.u64 %r313,%r312; and.b64 %r1490,%r456,3; setp.eq.u64 %r1493,%r1490,0; @ %r1493 bra $L100; setp.eq.u64 %r1492,%r1490,1; @ %r1492 bra $L314; setp.eq.u64 %r1491,%r1490,2; @ %r1491 bra $L315; bra $L376; $L100: .loc 1 2968 29 ld.u64 %r1289,[%r673]; ld.u64 %r1290,[%r676]; .loc 1 2968 8 mad.lo.u64 %r1714,%r1289,%r1290,%r312; .loc 1 2967 4 add.u64 %r1716,%r673,%r675; add.u64 %r1717,%r676,%r678; .loc 1 2968 29 ld.u64 %r1718,[%r1716]; ld.u64 %r1719,[%r1717]; .loc 1 2968 8 mad.lo.u64 %r1721,%r1718,%r1719,%r1714; .loc 1 2967 4 add.u64 %r1723,%r1716,%r675; add.u64 %r1724,%r1717,%r678; .loc 1 2968 29 ld.u64 %r1726,[%r1723]; ld.u64 %r1727,[%r1724]; .loc 1 2968 8 mad.lo.u64 %r1729,%r1726,%r1727,%r1721; .loc 1 2967 4 add.u64 %r1731,%r1723,%r675; add.u64 %r1732,%r1724,%r678; .loc 1 2968 29 ld.u64 %r1734,[%r1731]; ld.u64 %r1735,[%r1732]; .loc 1 2968 8 mad.lo.u64 %r312,%r1734,%r1735,%r1729; .loc 1 2967 28 add.u64 %r313,%r313,4; .loc 1 2967 4 add.u64 %r673,%r1731,%r675; add.u64 %r676,%r1732,%r678; setp.ne.u64 %r1737,%r456,%r313; @ %r1737 bra $L100; bra $L99; $L142: .loc 1 2966 6 mov.u64 %r312,0; $L99: .loc 1 2969 21 st.u64 [%r681],%r312; .loc 1 2963 32 add.u64 %r311,%r311,1; .loc 1 2963 7 add.u64 %r681,%r681,%r683; add.u64 %r684,%r684,%r209; setp.ne.u64 %r1292,%r217,%r311; @ %r1292 bra $L101; bra $L1; $L373: .loc 1 2972 11 setp.gt.s64 %r1293,%r208,%r503; @ %r1293 bra $L102; .loc 1 2993 7 setp.gt.s64 %r1294,%r217,0; @ %r1294 bra $L103; bra $L1; $L102: .loc 1 2974 7 setp.gt.s64 %r1295,%r217,0; @ %r1295 bra 
$L104; bra $L1; $L108: shl.b64 %r1296,%r717,3; add.u64 %r713,%r245,%r1296; .loc 1 2975 9 mov.u64 %r310,0; and.b64 %r1506,%r216,3; setp.eq.u64 %r1509,%r1506,0; @ %r1509 bra $L105; setp.eq.u64 %r1508,%r1506,1; @ %r1508 bra $L316; setp.eq.u64 %r1507,%r1506,2; @ %r1507 bra $L317; bra $L377; $L105: .loc 1 2976 34 st.u64 [%r713],%r1407; .loc 1 2975 2 add.u64 %r1739,%r713,%r715; .loc 1 2976 34 st.u64 [%r1739],%r1407; .loc 1 2975 2 add.u64 %r1741,%r1739,%r715; .loc 1 2976 34 st.u64 [%r1741],%r1407; .loc 1 2975 2 add.u64 %r1744,%r1741,%r715; .loc 1 2976 34 st.u64 [%r1744],%r1407; .loc 1 2975 27 add.u64 %r310,%r310,4; .loc 1 2975 2 add.u64 %r713,%r1744,%r715; setp.ne.u64 %r1746,%r216,%r310; @ %r1746 bra $L105; $L109: .loc 1 2974 32 add.u64 %r309,%r324,1; .loc 1 2974 7 add.u64 %r717,%r717,%r504; setp.ne.u64 %r1299,%r217,%r309; @ %r1299 bra $L143; shl.b64 %r701,%r502,3; shl.b64 %r692,%r503,3; mov.u64 %r706,0; mov.u64 %r705,%r706; mov.u64 %r325,%r706; setp.le.s64 %r1747,%r456,0; bra $L107; $L104: shl.b64 %r715,%r444,3; mov.u64 %r717,0; .loc 1 2974 14 mov.u64 %r324,%r717; setp.gt.s64 %r1748,%r216,0; .loc 1 2976 34 mov.u64 %r1407,%r717; bra $L106; $L143: mov.u64 %r324,%r309; $L106: .loc 1 2975 2 @ %r1748 bra $L108; bra $L109; $L112: shl.b64 %r1301,%r697,3; add.u64 %r690,%r243,%r1301; .loc 1 2980 4 mov.u64 %r689,%r710; .loc 1 2980 11 mov.u64 %r308,0; and.b64 %r1498,%r216,3; setp.eq.u64 %r1501,%r1498,0; @ %r1501 bra $L110; setp.eq.u64 %r1500,%r1498,1; @ %r1500 bra $L318; setp.eq.u64 %r1499,%r1498,2; @ %r1499 bra $L319; bra $L378; $L110: .loc 1 2983 37 ld.u64 %r1303,[%r690]; ld.u64 %r1304,[%r699]; .loc 1 2982 36 ld.u64 %r1306,[%r689]; mad.lo.u64 %r1305,%r1303,%r1304,%r1306; st.u64 [%r689],%r1305; .loc 1 2980 4 add.u64 %r1750,%r689,%r715; add.u64 %r1751,%r690,%r692; .loc 1 2983 37 ld.u64 %r1752,[%r1751]; ld.u64 %r1753,[%r699]; .loc 1 2982 36 ld.u64 %r1755,[%r1750]; mad.lo.u64 %r1756,%r1752,%r1753,%r1755; st.u64 [%r1750],%r1756; .loc 1 2980 4 add.u64 %r1758,%r1750,%r715; add.u64 %r1759,%r1751,%r692; .loc 1 2983 37 ld.u64 %r1761,[%r1759]; ld.u64 %r1762,[%r699]; .loc 1 2982 36 ld.u64 %r1764,[%r1758]; mad.lo.u64 %r1765,%r1761,%r1762,%r1764; st.u64 [%r1758],%r1765; .loc 1 2980 4 add.u64 %r1767,%r1758,%r715; add.u64 %r1768,%r1759,%r692; .loc 1 2983 37 ld.u64 %r1770,[%r1768]; ld.u64 %r1771,[%r699]; .loc 1 2982 36 ld.u64 %r1773,[%r1767]; mad.lo.u64 %r1774,%r1770,%r1771,%r1773; st.u64 [%r1767],%r1774; .loc 1 2980 29 add.u64 %r308,%r308,4; .loc 1 2980 4 add.u64 %r689,%r1767,%r715; add.u64 %r690,%r1768,%r692; setp.ne.u64 %r1775,%r216,%r308; @ %r1775 bra $L110; $L113: .loc 1 2979 26 add.u64 %r307,%r307,1; .loc 1 2979 2 add.u64 %r697,%r697,%r208; add.u64 %r699,%r699,%r701; setp.eq.u64 %r1308,%r456,%r307; @ %r1308 bra $L111; $L114: .loc 1 2980 4 @ %r1748 bra $L112; bra $L113; $L111: .loc 1 2978 32 add.u64 %r306,%r325,1; .loc 1 2978 7 add.u64 %r705,%r705,%r504; add.u64 %r706,%r706,%r209; setp.eq.u64 %r1310,%r324,%r325; @ %r1310 bra $L1; mov.u64 %r325,%r306; $L107: .loc 1 2979 2 @ %r1747 bra $L111; shl.b64 %r1312,%r706,3; add.u64 %r699,%r244,%r1312; shl.b64 %r1313,%r705,3; add.u64 %r710,%r245,%r1313; mov.u64 %r697,0; .loc 1 2979 9 mov.u64 %r307,%r697; bra $L114; $L103: shl.b64 %r732,%r444,3; shl.b64 %r723,%r208,3; shl.b64 %r727,%r502,3; .loc 1 2993 7 mov.u64 %r741,0; mov.u64 %r739,%r741; .loc 1 2993 14 mov.u64 %r302,%r741; setp.gt.s64 %r1776,%r216,0; setp.le.s64 %r1408,%r456,0; $L116: .loc 1 2997 4 @ %r1776 bra $L115; $L120: .loc 1 2993 32 add.u64 %r302,%r302,1; .loc 1 2993 7 add.u64 %r739,%r739,%r504; add.u64 
%r741,%r741,%r209; setp.ne.u64 %r1315,%r217,%r302; @ %r1315 bra $L116; bra $L1; $L115: shl.b64 %r1316,%r739,3; add.u64 %r730,%r245,%r1316; shl.b64 %r1317,%r741,3; add.u64 %r747,%r244,%r1317; .loc 1 2997 4 mov.u64 %r734,0; .loc 1 2997 11 mov.u64 %r303,%r734; $L119: .loc 1 3001 8 @ %r1408 bra $L144; shl.b64 %r1319,%r734,3; add.u64 %r721,%r243,%r1319; mov.u64 %r725,%r747; .loc 1 3000 10 mov.u64 %r304,0; .loc 1 3001 15 mov.u64 %r305,%r304; and.b64 %r1514,%r456,3; setp.eq.u64 %r1517,%r1514,0; @ %r1517 bra $L118; setp.eq.u64 %r1516,%r1514,1; @ %r1516 bra $L320; setp.eq.u64 %r1515,%r1514,2; @ %r1515 bra $L321; bra $L379; $L118: .loc 1 3002 28 ld.u64 %r1321,[%r721]; ld.u64 %r1322,[%r725]; .loc 1 3002 5 mad.lo.u64 %r1778,%r1321,%r1322,%r304; .loc 1 3001 8 add.u64 %r1780,%r721,%r723; add.u64 %r1781,%r725,%r727; .loc 1 3002 28 ld.u64 %r1782,[%r1780]; ld.u64 %r1783,[%r1781]; .loc 1 3002 5 mad.lo.u64 %r1785,%r1782,%r1783,%r1778; .loc 1 3001 8 add.u64 %r1787,%r1780,%r723; add.u64 %r1788,%r1781,%r727; .loc 1 3002 28 ld.u64 %r1790,[%r1787]; ld.u64 %r1791,[%r1788]; .loc 1 3002 5 mad.lo.u64 %r1793,%r1790,%r1791,%r1785; .loc 1 3001 8 add.u64 %r1795,%r1787,%r723; add.u64 %r1796,%r1788,%r727; .loc 1 3002 28 ld.u64 %r1798,[%r1795]; ld.u64 %r1799,[%r1796]; .loc 1 3002 5 mad.lo.u64 %r304,%r1798,%r1799,%r1793; .loc 1 3001 32 add.u64 %r305,%r305,4; .loc 1 3001 8 add.u64 %r721,%r1795,%r723; add.u64 %r725,%r1796,%r727; setp.ne.u64 %r1801,%r456,%r305; @ %r1801 bra $L118; bra $L117; $L144: .loc 1 3000 10 mov.u64 %r304,0; $L117: .loc 1 3003 27 st.u64 [%r730],%r304; .loc 1 2997 29 add.u64 %r303,%r303,1; .loc 1 2997 4 add.u64 %r730,%r730,%r732; add.u64 %r734,%r734,%r503; setp.ne.u64 %r1324,%r216,%r303; @ %r1324 bra $L119; bra $L120; $L32: .loc 1 2920 11 setp.eq.u64 %r1325,%r208,1; @ %r1325 bra $L121; bra $L20; $L2: .loc 1 2476 6 ld.u64 %r1326,[%r748]; setp.ne.u64 %r1327,%r1326,0; @ %r1327 bra $L122; .loc 1 2485 4 ld.u64 %r1329,[%r749+56]; add.u64 %r1328,%r1329,1; ld.u64 %r1330,[%r749+48]; sub.u64 %r508,%r1328,%r1330; .loc 1 2483 15 ld.s8 %r1332,[%r750+28]; cvt.u16.u32 %r1331,%r1332; setp.eq.u16 %r1333,%r1331,1; @ %r1333 bra $L123; bra $L124; $L126: .loc 1 2560 16 ld.u64 %r503,[%r749+40]; .loc 1 2548 27 mov.u64 %r444,%r504; bra $L125; $L14: ld.u64 %r504,[%r748+40]; .loc 1 2543 6 ld.s8 %r1335,[%r748+28]; cvt.u16.u32 %r1334,%r1335; setp.eq.u16 %r1336,%r1334,1; @ ! %r1336 bra $L127; bra $L126; $L11: .loc 1 2560 16 ld.u64 %r503,[%r749+40]; mov.u32 %r446,%r331; .loc 1 2548 27 mov.u64 %r504,%r444; .loc 1 2563 14 mov.u64 %r216,1; .loc 1 2561 16 mov.u64 %r208,%r216; bra $L12; $L5: .loc 1 2485 4 ld.u64 %r1338,[%r749+56]; add.u64 %r1337,%r1338,1; ld.u64 %r1339,[%r749+48]; sub.u64 %r508,%r1337,%r1339; bra $L124; $L128: .loc 1 2509 17 ld.u64 %r1341,[%r748+56]; add.u64 %r1340,%r1341,1; .loc 1 2509 15 ld.u64 %r1342,[%r748+48]; sub.u64 %r456,%r1340,%r1342; ld.s8 %r1354,[%r750+28]; bra $L9; $L122: .loc 1 2502 12 cvta.global.u64 %r1343,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r1344,[%r1343+36]; setp.ne.u32 %r1345,%r1344,0; @ ! 
%r1345 bra $L8; bra $L128; $L1: .loc 1 3007 1 ret; $L372: .loc 1 2908 13 ld.u64 %r1802,[%r332]; ld.u64 %r1803,[%r400]; .loc 1 2907 11 mad.lo.u64 %r268,%r1802,%r1803,%r268; .loc 1 2905 10 add.u64 %r332,%r332,8; add.u64 %r297,%r400,8; $L309: .loc 1 2908 13 ld.u64 %r1806,[%r332]; ld.u64 %r1807,[%r297]; .loc 1 2907 11 mad.lo.u64 %r268,%r1806,%r1807,%r268; .loc 1 2905 10 add.u64 %r332,%r332,8; add.u64 %r297,%r297,8; $L308: .loc 1 2908 13 ld.u64 %r1810,[%r332]; ld.u64 %r1811,[%r297]; .loc 1 2907 11 mad.lo.u64 %r268,%r1810,%r1811,%r268; .loc 1 2905 10 add.u64 %r332,%r332,8; add.u64 %r297,%r297,8; setp.ne.u64 %r1813,%r398,%r297; @ %r1813 bra $L83; bra $L82; $L371: add.u64 %r1814,%r418,%r428; .loc 1 2887 16 ld.u64 %r1815,[%r400]; .loc 1 2886 16 add.u64 %r1816,%r1814,%r417; .loc 1 2887 13 ld.u64 %r1817,[%r1816]; .loc 1 2886 11 mad.lo.u64 %r269,%r1815,%r1817,%r269; .loc 1 2889 13 ld.u64 %r1819,[%r428]; .loc 1 2888 11 mad.lo.u64 %r270,%r1815,%r1819,%r270; .loc 1 2890 16 add.u64 %r1821,%r1814,%r416; .loc 1 2891 13 ld.u64 %r1822,[%r1821]; .loc 1 2890 11 mad.lo.u64 %r271,%r1815,%r1822,%r271; .loc 1 2892 16 add.u64 %r1824,%r1814,%r415; .loc 1 2893 13 ld.u64 %r1825,[%r1824]; .loc 1 2892 11 mad.lo.u64 %r272,%r1815,%r1825,%r272; .loc 1 2884 10 add.u64 %r429,%r400,8; add.u64 %r428,%r428,8; setp.ne.u64 %r1827,%r398,%r429; @ %r1827 bra $L78; bra $L77; $L370: .loc 1 2855 16 ld.u64 %r1828,[%r389]; .loc 1 2856 13 ld.u64 %r1829,[%r387]; .loc 1 2855 11 mad.lo.u64 %r274,%r1828,%r1829,%r274; .loc 1 2858 13 ld.u64 %r1831,[%r49]; .loc 1 2857 11 mad.lo.u64 %r275,%r1828,%r1831,%r275; .loc 1 2860 13 ld.u64 %r1833,[%r239]; .loc 1 2859 11 mad.lo.u64 %r276,%r1828,%r1833,%r276; .loc 1 2862 13 ld.u64 %r1835,[%r240]; .loc 1 2861 11 mad.lo.u64 %r277,%r1828,%r1835,%r277; .loc 1 2853 10 add.u64 %r389,%r389,8; add.u64 %r387,%r387,8; add.u64 %r385,%r49,8; add.u64 %r383,%r239,8; add.u64 %r381,%r240,8; $L307: .loc 1 2855 16 ld.u64 %r1838,[%r389]; .loc 1 2856 13 ld.u64 %r1839,[%r387]; .loc 1 2855 11 mad.lo.u64 %r274,%r1838,%r1839,%r274; .loc 1 2858 13 ld.u64 %r1841,[%r385]; .loc 1 2857 11 mad.lo.u64 %r275,%r1838,%r1841,%r275; .loc 1 2860 13 ld.u64 %r1843,[%r383]; .loc 1 2859 11 mad.lo.u64 %r276,%r1838,%r1843,%r276; .loc 1 2862 13 ld.u64 %r1845,[%r381]; .loc 1 2861 11 mad.lo.u64 %r277,%r1838,%r1845,%r277; .loc 1 2853 10 add.u64 %r389,%r389,8; add.u64 %r387,%r387,8; add.u64 %r385,%r385,8; add.u64 %r383,%r383,8; add.u64 %r381,%r381,8; $L306: .loc 1 2855 16 ld.u64 %r1848,[%r389]; .loc 1 2856 13 ld.u64 %r1849,[%r387]; .loc 1 2855 11 mad.lo.u64 %r274,%r1848,%r1849,%r274; .loc 1 2858 13 ld.u64 %r1851,[%r385]; .loc 1 2857 11 mad.lo.u64 %r275,%r1848,%r1851,%r275; .loc 1 2860 13 ld.u64 %r1853,[%r383]; .loc 1 2859 11 mad.lo.u64 %r276,%r1848,%r1853,%r276; .loc 1 2862 13 ld.u64 %r1855,[%r381]; .loc 1 2861 11 mad.lo.u64 %r277,%r1848,%r1855,%r277; .loc 1 2853 10 add.u64 %r389,%r389,8; add.u64 %r387,%r387,8; add.u64 %r385,%r385,8; add.u64 %r383,%r383,8; add.u64 %r381,%r381,8; setp.ne.u64 %r1857,%r227,%r387; @ %r1857 bra $L70; bra $L69; $L369: .loc 1 2762 43 ld.u64 %r1858,[%r556]; st.u64 [%r330],%r1858; .loc 1 2760 9 add.u64 %r57,%r556,8; add.u64 %r73,%r330,2048; $L305: .loc 1 2762 43 ld.u64 %r1860,[%r57]; st.u64 [%r73],%r1860; .loc 1 2760 9 add.u64 %r57,%r57,8; add.u64 %r73,%r73,2048; $L304: .loc 1 2762 43 ld.u64 %r1862,[%r57]; st.u64 [%r73],%r1862; .loc 1 2760 9 add.u64 %r57,%r57,8; add.u64 %r73,%r73,2048; setp.ne.u64 %r1863,%r57,%r136; @ %r1863 bra $L58; bra $L48; $L368: .loc 1 2741 7 add.u64 %r1864,%r536,%r539; add.u64 %r1865,%r1864,%r540; .loc 1 
2740 49 ld.u64 %r1866,[%r1865]; st.u64 [%r229],%r1866; .loc 1 2742 49 ld.u64 %r1867,[%r536]; st.u64 [%r229+8],%r1867; .loc 1 2745 7 add.u64 %r1868,%r536,%r540; add.u64 %r1869,%r1868,%r539; .loc 1 2744 49 ld.u64 %r1870,[%r1869+8]; st.u64 [%r229+2048],%r1870; .loc 1 2746 49 ld.u64 %r1871,[%r536+8]; st.u64 [%r229+2056],%r1871; .loc 1 2738 9 add.u64 %r229,%r229,4096; add.u64 %r211,%r536,16; $L303: .loc 1 2741 7 add.u64 %r1873,%r211,%r539; add.u64 %r1874,%r1873,%r540; .loc 1 2740 49 ld.u64 %r1875,[%r1874]; st.u64 [%r229],%r1875; .loc 1 2742 49 ld.u64 %r1876,[%r211]; st.u64 [%r229+8],%r1876; .loc 1 2745 7 add.u64 %r1877,%r211,%r540; add.u64 %r1878,%r1877,%r539; .loc 1 2744 49 ld.u64 %r1879,[%r1878+8]; st.u64 [%r229+2048],%r1879; .loc 1 2746 49 ld.u64 %r1880,[%r211+8]; st.u64 [%r229+2056],%r1880; .loc 1 2738 9 add.u64 %r229,%r229,4096; add.u64 %r211,%r211,16; $L302: .loc 1 2741 7 add.u64 %r1882,%r211,%r539; add.u64 %r1883,%r1882,%r540; .loc 1 2740 49 ld.u64 %r1884,[%r1883]; st.u64 [%r229],%r1884; .loc 1 2742 49 ld.u64 %r1885,[%r211]; st.u64 [%r229+8],%r1885; .loc 1 2745 7 add.u64 %r1886,%r211,%r540; add.u64 %r1887,%r1886,%r539; .loc 1 2744 49 ld.u64 %r1888,[%r1887+8]; st.u64 [%r229+2048],%r1888; .loc 1 2746 49 ld.u64 %r1889,[%r211+8]; st.u64 [%r229+2056],%r1889; .loc 1 2738 9 add.u64 %r229,%r229,4096; add.u64 %r211,%r211,16; setp.ne.u64 %r1890,%r211,%r527; @ %r1890 bra $L52; bra $L53; $L366: .loc 1 2688 2 @ ! %r1351 bra $L380; .loc 1 2689 22 mov.u32 %r1881892,[%value_in]; } $L380: .loc 1 2687 24 mov.u64 %r296,2; .loc 1 2687 7 add.u64 %r620,%r245,%r621; $L301: .loc 1 2688 2 @ ! %r1351 bra $L381; .loc 1 2689 22 mov.u32 %r188941895,[%value_in]; } $L381: .loc 1 2687 24 add.u64 %r296,%r296,1; .loc 1 2687 7 add.u64 %r620,%r620,%r621; $L300: .loc 1 2688 2 @ ! %r1351 bra $L382; .loc 1 2689 22 mov.u32 %r1881898,[%value_in]; } $L382: .loc 1 2687 24 add.u64 %r296,%r296,1; .loc 1 2687 7 add.u64 %r620,%r620,%r621; setp.eq.u64 %r1899,%r296,%r624; @ ! %r1899 bra $L34; bra $L33; $L238: .loc 1 2689 22 mov.u32 %r19001529001901,[%value_in]; } $L367: .loc 1 2687 7 add.u64 %r1903,%r1520,%r621; .loc 1 2688 2 @ ! %r1351 bra $L383; .loc 1 2689 22 mov.u32 %r190519019051906,[%value_in]; } $L383: .loc 1 2687 7 add.u64 %r1908,%r1903,%r621; .loc 1 2688 2 @ ! %r1351 bra $L384; .loc 1 2689 22 mov.u32 %r19101919101911,[%value_in]; } $L384: .loc 1 2687 24 add.u64 %r296,%r1467,3; .loc 1 2687 7 add.u64 %r620,%r1908,%r621; setp.eq.u64 %r1912,%r296,%r624; @ ! 
%r1912 bra $L34; bra $L33; $L374: .loc 1 2938 23 ld.u64 %r1913,[%r625]; ld.u64 %r1914,[%r652]; mul.lo.u64 %r301,%r1913,%r1914; .loc 1 2937 5 add.u64 %r625,%r625,8; add.u64 %r626,%r652,8; $L311: .loc 1 2938 23 ld.u64 %r1917,[%r625]; ld.u64 %r1918,[%r626]; .loc 1 2938 9 mad.lo.u64 %r301,%r1917,%r1918,%r301; .loc 1 2937 5 add.u64 %r625,%r625,8; add.u64 %r626,%r626,8; $L310: .loc 1 2938 23 ld.u64 %r1921,[%r625]; ld.u64 %r1922,[%r626]; .loc 1 2938 9 mad.lo.u64 %r301,%r1921,%r1922,%r301; .loc 1 2937 5 add.u64 %r625,%r625,8; add.u64 %r626,%r626,8; setp.ne.u64 %r1924,%r625,%r631; @ %r1924 bra $L93; bra $L92; $L375: .loc 1 2953 26 ld.u64 %r1925,[%r243]; ld.u64 %r1926,[%r656]; mul.lo.u64 %r299,%r1925,%r1926; .loc 1 2952 8 add.u64 %r653,%r243,%r655; add.u64 %r656,%r656,8; $L313: .loc 1 2953 26 ld.u64 %r1929,[%r653]; ld.u64 %r1930,[%r656]; .loc 1 2953 5 mad.lo.u64 %r299,%r1929,%r1930,%r299; .loc 1 2952 8 add.u64 %r653,%r653,%r655; add.u64 %r656,%r656,8; $L312: .loc 1 2953 26 ld.u64 %r1933,[%r653]; ld.u64 %r1934,[%r656]; .loc 1 2953 5 mad.lo.u64 %r299,%r1933,%r1934,%r299; .loc 1 2952 8 add.u64 %r653,%r653,%r655; add.u64 %r656,%r656,8; setp.ne.u64 %r1936,%r656,%r672; @ %r1936 bra $L97; bra $L96; $L376: .loc 1 2968 29 ld.u64 %r1937,[%r243]; ld.u64 %r1938,[%r676]; mul.lo.u64 %r312,%r1937,%r1938; .loc 1 2967 28 mov.u64 %r313,1; .loc 1 2967 4 add.u64 %r673,%r243,%r675; add.u64 %r676,%r676,%r678; $L315: .loc 1 2968 29 ld.u64 %r1941,[%r673]; ld.u64 %r1942,[%r676]; .loc 1 2968 8 mad.lo.u64 %r312,%r1941,%r1942,%r312; .loc 1 2967 28 add.u64 %r313,%r313,1; .loc 1 2967 4 add.u64 %r673,%r673,%r675; add.u64 %r676,%r676,%r678; $L314: .loc 1 2968 29 ld.u64 %r1945,[%r673]; ld.u64 %r1946,[%r676]; .loc 1 2968 8 mad.lo.u64 %r312,%r1945,%r1946,%r312; .loc 1 2967 28 add.u64 %r313,%r313,1; .loc 1 2967 4 add.u64 %r673,%r673,%r675; add.u64 %r676,%r676,%r678; setp.ne.u64 %r1948,%r456,%r313; @ %r1948 bra $L100; bra $L99; $L378: .loc 1 2983 37 ld.u64 %r1949,[%r690]; ld.u64 %r1950,[%r699]; .loc 1 2982 36 ld.u64 %r1952,[%r710]; mad.lo.u64 %r1953,%r1949,%r1950,%r1952; st.u64 [%r710],%r1953; .loc 1 2980 29 mov.u64 %r308,1; .loc 1 2980 4 add.u64 %r689,%r710,%r715; add.u64 %r690,%r690,%r692; $L319: .loc 1 2983 37 ld.u64 %r1955,[%r690]; ld.u64 %r1956,[%r699]; .loc 1 2982 36 ld.u64 %r1958,[%r689]; mad.lo.u64 %r1959,%r1955,%r1956,%r1958; st.u64 [%r689],%r1959; .loc 1 2980 29 add.u64 %r308,%r308,1; .loc 1 2980 4 add.u64 %r689,%r689,%r715; add.u64 %r690,%r690,%r692; $L318: .loc 1 2983 37 ld.u64 %r1961,[%r690]; ld.u64 %r1962,[%r699]; .loc 1 2982 36 ld.u64 %r1964,[%r689]; mad.lo.u64 %r1965,%r1961,%r1962,%r1964; st.u64 [%r689],%r1965; .loc 1 2980 29 add.u64 %r308,%r308,1; .loc 1 2980 4 add.u64 %r689,%r689,%r715; add.u64 %r690,%r690,%r692; setp.ne.u64 %r1966,%r216,%r308; @ %r1966 bra $L110; bra $L113; $L377: .loc 1 2976 34 st.u64 [%r713],%r1407; .loc 1 2975 27 mov.u64 %r310,1; .loc 1 2975 2 add.u64 %r713,%r713,%r715; $L317: .loc 1 2976 34 st.u64 [%r713],%r1407; .loc 1 2975 27 add.u64 %r310,%r310,1; .loc 1 2975 2 add.u64 %r713,%r713,%r715; $L316: .loc 1 2976 34 st.u64 [%r713],%r1407; .loc 1 2975 27 add.u64 %r310,%r310,1; .loc 1 2975 2 add.u64 %r713,%r713,%r715; setp.ne.u64 %r1969,%r216,%r310; @ %r1969 bra $L105; bra $L109; $L379: .loc 1 3002 28 ld.u64 %r1970,[%r721]; ld.u64 %r1971,[%r747]; mul.lo.u64 %r304,%r1970,%r1971; .loc 1 3001 32 mov.u64 %r305,1; .loc 1 3001 8 add.u64 %r721,%r721,%r723; add.u64 %r725,%r747,%r727; $L321: .loc 1 3002 28 ld.u64 %r1974,[%r721]; ld.u64 %r1975,[%r725]; .loc 1 3002 5 mad.lo.u64 %r304,%r1974,%r1975,%r304; .loc 1 
3001 32 add.u64 %r305,%r305,1; .loc 1 3001 8 add.u64 %r721,%r721,%r723; add.u64 %r725,%r725,%r727; $L320: .loc 1 3002 28 ld.u64 %r1978,[%r721]; ld.u64 %r1979,[%r725]; .loc 1 3002 5 mad.lo.u64 %r304,%r1978,%r1979,%r304; .loc 1 3001 32 add.u64 %r305,%r305,1; .loc 1 3001 8 add.u64 %r721,%r721,%r723; add.u64 %r725,%r725,%r727; setp.ne.u64 %r1981,%r456,%r305; @ %r1981 bra $L118; bra $L117; } matmul_i16.o/_gfortran_matmul_i16 .visible .func _gfortran_matmul_i16, .param .u32 %in_ar4, .param .u64 %in_ar5); .file 1 "../../../libgfortran/generated/matmul_i16.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]FUNCTION DECL: _gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime_97,116,109,117,108,95,105,49,54,0 }2] = {782] = {672] = {8471,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,97,41,32,61,61,32,50,32,124,124,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,98,41,32,61,61,32116,114,97,110,47,103,101,110,101,114,97,116,101,100,47,109,97,116,109,117,108,95,105,49,54,465,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,49,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3265,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,50,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3273,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,97,114,103,117,109,101,110,116,32,66,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,49,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }103,101,109,109,32,33,61,32,78,85,76,7_gfortran_matmul_i16 .visible .func _gfortran_matmul_i16189u64 %r314; .reg .u64 %r316; .reg .u64u32 %r331; .reg .u64 %r332; .reg .u64u64 %r374; .reg .u64u64 %r390; .reg .u64 %r392; .reg .u64u64u64 %r409; .reg .u64u64u64 %r530; .reg .u64 %r536; .reg .u64 %r537; .reg .u64 %r538; .reg .u64u64 %r555; .reg .u64u64 %r571; .reg .u64 %r573; .reg .u64 %r574; .reg .u64 %r575; .reg .u64 %r576; .reg .u64 %r577; .reg .u64 %r585; .reg .u64 %r586; .reg .u64 %r587; .reg .u64 %r588; .reg .u64 %r589; .reg .u64 %r590; .reg .u64 %r591; .reg .u64 %r592; .reg .u64 %r594; .reg .u64u64.reg .u64 %r619; .reg .u64 %r620; .reg .u64 %r621; .reg .u64 %r624; .reg .u64 %r625; .reg .u64 %r626; .reg .u64 %r629; .reg .u64 %r630; .reg .u64 %r631; .reg .u64 %r633; .reg .u64 %r634; .reg .u64 %r640; .reg .u64 %r642; .reg .u64 %r643; .reg .u64 %r647; .reg .u64 %r652; .reg .u64 %r653; .reg .u64 %r655; .reg .u64 %r656; .reg .u64 %r661; .reg .u64 %r663; .reg .u64 %r664; .reg .u64 %r672; .reg .u64 %r673; .reg .u64 %r675; .reg .u64 %r676; .reg .u64 %r678; .reg .u64 %r681; .reg .u64 %r683; .reg .u64u64 %r699; .reg .u64 %r701; .reg .u64 %r705; .reg .u64 %r706; .reg .u64 %r710; .reg .u64 %r713; .reg .u64 %r715; .reg .u64 %r717; .reg .u64 %r721; .reg .u64 %r723; .reg .u64 %r725; .reg .u64 %r727; .reg .u64 %r730; .reg .u64 %r732; .reg .u64 %r734; .reg .u64 %r739; .reg .u64 %r741; .reg .u64 %r747; .reg .u64 %r748; .reg .u64 %r749; .reg .u64 %r750; .reg .u32 %r751; .reg .u32 %r752; .reg .u64 %r753; .reg .u16 %r754; .reg .pred %r755; .reg .u16 %r756; .reg .pred %r758; .reg .u64 %r759; .reg .u32 %r760; .reg .u64 %r761; .reg .u64 %r762; .reg 
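The archive member that begins above is the PTX for `_gfortran_matmul_i16`, libgfortran's implementation of the Fortran MATMUL intrinsic for integer(kind=16) operands, compiled from `../../../libgfortran/generated/matmul_i16.c` as the `.file` directive records. For orientation only, here is a minimal C sketch (not the library source, and not the entry point's real descriptor-based signature) of the plain strided triple loop that the generated matmul_<kind> routines fall back to when the blocked path is not taken; every identifier below (xcount, ycount, count, the *stride parameters, matmul_fallback) is illustrative, and GFC_INTEGER_16 is assumed to map to GCC's __int128 extension.

    /* Hedged sketch only -- illustrative names, not the libgfortran API. */
    #include <stdio.h>

    typedef __int128 GFC_INTEGER_16;   /* assumption: kind=16 maps to __int128 */

    static void
    matmul_fallback (GFC_INTEGER_16 *dest,
                     const GFC_INTEGER_16 *abase, const GFC_INTEGER_16 *bbase,
                     long xcount, long ycount, long count,
                     long axstride, long aystride,
                     long bxstride, long bystride,
                     long rxstride, long rystride)
    {
      for (long y = 0; y < ycount; y++)
        for (long x = 0; x < xcount; x++)
          {
            GFC_INTEGER_16 s = 0;
            /* dot product of row x of A with column y of B */
            for (long n = 0; n < count; n++)
              s += abase[x * axstride + n * aystride]
                   * bbase[n * bxstride + y * bystride];
            dest[x * rxstride + y * rystride] = s;
          }
    }

    int
    main (void)
    {
      /* 2x2 example in Fortran column-major layout (leading dimension 2). */
      GFC_INTEGER_16 a[4] = { 1, 2, 3, 4 };
      GFC_INTEGER_16 b[4] = { 5, 6, 7, 8 };
      GFC_INTEGER_16 c[4] = { 0 };
      matmul_fallback (c, a, b, 2, 2, 2, 1, 2, 1, 2, 1, 2);
      for (int i = 0; i < 4; i++)
        printf ("%lld\n", (long long) c[i]);
      return 0;
    }

The preceding function body in the dump, a sibling matmul_<kind> routine generated from the same template, performs the same computation but tiled into blocks so that the inner mad.lo.u64 accumulations reuse cached operands; the "gemm != NULL" assertion string in the constants above reflects the path where the routine instead hands the multiply to an external BLAS gemm.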
[nvptx PTX assembly payload: a long run of .reg declarations (%r748 through %r2707; .u64/.u32/.u16/.pred/.f32) followed by the body of a libgfortran runtime routine, with .loc directives referencing source lines 2462-3001 and calls to _gfortran_size0, _gfortrani_xmallocarray, and _gfortran_runtime_error. The embedded assembly text is damaged in this dump; only this description of its recoverable contents is kept.]
%r2409 bra $L289; $L118: .loc 1 3002 28 ld.u64 %r1980,[%r721]; st.u64 [%frame+80],%r1980; ld.u64 %r1981,[%r721+8]; st.u64 [%frame+88],%r1981; ld.u64 %r1984,[%r725]; st.u64 [%frame+96],%r1984; ld.u64 %r1985,[%r725+8]; st.u64 [%frame+104],%r1985233r232332221,[%frame+64]; ld.u64 %r2222,[%frame+72]; .loc 1 3002 5 add.u64 %r1990,%r2099,%r2221; set.u32.lt.u64 %r1995,%r1990,%r2099; cvt.s64.s32 %r1993,%r1995; add.u64 %r1996,%r2100,%r2222; sub.u64 %r1999,%r1996,%r1993; .loc 1 3001 8 add.u64 %r2573,%r721,%r723; add.u64 %r2574,%r725,%r727; .loc 1 3002 28 ld.u64 %r2575,[%r2573]; st.u64 [%frame+80],%r2575; ld.u64 %r2576,[%r2573+8]; st.u64 [%frame+88],%r2576; ld.u64 %r2577,[%r2574]; st.u64 [%frame+96],%r2577; ld.u64 %r2578,[%r2574+8]; st.u64 [%frame+104],%r2578233r232332579,[%frame+64]; ld.u64 %r2580,[%frame+72]; .loc 1 3002 5 add.u64 %r2581,%r1990,%r2579; set.u32.lt.u64 %r2582,%r2581,%r1990; cvt.s64.s32 %r2583,%r2582; add.u64 %r2584,%r1999,%r2580; mov.u64 %r2099,%r2581; sub.u64 %r2100,%r2584,%r2583; .loc 1 3001 32 add.u64 %r305,%r305,2; .loc 1 3001 8 add.u64 %r721,%r2573,%r723; add.u64 %r725,%r2574,%r727; setp.ne.u64 %r2586,%r456,%r305; @ %r2586 bra $L118; bra $L117; $L144: .loc 1 3000 10 mov.u64 %r2099,0; mov.u64 %r2100,%r2099; $L117: .loc 1 3003 27 st.u64 [%r730],%r2099; st.u64 [%r730+8],%r2100; .loc 1 2997 29 add.u64 %r303,%r303,1; .loc 1 2997 4 add.u64 %r730,%r730,%r732; add.u64 %r734,%r734,%r503; setp.ne.u64 %r2003,%r216,%r303; @ %r2003 bra $L119; bra $L120; $L32: .loc 1 2920 11 setp.eq.u64 %r2004,%r208,1; @ %r2004 bra $L121; bra $L20; $L2: .loc 1 2476 6 ld.u64 %r2005,[%r748]; setp.ne.u64 %r2006,%r2005,0; @ %r2006 bra $L122; .loc 1 2485 4 ld.u64 %r2008,[%r749+56]; add.u64 %r2007,%r2008,1; ld.u64 %r2009,[%r749+48]; sub.u64 %r508,%r2007,%r2009; .loc 1 2483 15 ld.s8 %r2011,[%r750+28]; cvt.u16.u32 %r2010,%r2011; setp.eq.u16 %r2012,%r2010,1; @ %r2012 bra $L123; bra $L124; $L126: .loc 1 2560 16 ld.u64 %r503,[%r749+40]; .loc 1 2548 27 mov.u64 %r444,%r504; bra $L125; $L14: ld.u64 %r504,[%r748+40]; .loc 1 2543 6 ld.s8 %r2014,[%r748+28]; cvt.u16.u32 %r2013,%r2014; setp.eq.u16 %r2015,%r2013,1; @ ! %r2015 bra $L127; bra $L126; $L11: .loc 1 2560 16 ld.u64 %r503,[%r749+40]; mov.u32 %r446,%r331; .loc 1 2548 27 mov.u64 %r504,%r444; .loc 1 2563 14 mov.u64 %r216,1; .loc 1 2561 16 mov.u64 %r208,%r216; bra $L12; $L5: .loc 1 2485 4 ld.u64 %r2017,[%r749+56]; add.u64 %r2016,%r2017,1; ld.u64 %r2018,[%r749+48]; sub.u64 %r508,%r2016,%r2018; bra $L124; $L128: .loc 1 2509 17 ld.u64 %r2020,[%r748+56]; add.u64 %r2019,%r2020,1; .loc 1 2509 15 ld.u64 %r2021,[%r748+48]; sub.u64 %r456,%r2019,%r2021; ld.s8 %r2232,[%r750+28]; bra $L9; $L122: .loc 1 2502 12 cvta.global.u64 %r2022,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r2023,[%r2022+36]; setp.ne.u32 %r2024,%r2023,0; @ ! 
%r2024 bra $L8; bra $L128; $L1: .loc 1 3007 1 ret; $L282: .loc 1 2908 13 ld.u64 %r2587,[%r332]; st.u64 [%frame+80],%r2587; ld.u64 %r2588,[%r332+8]; st.u64 [%frame+88],%r2588; ld.u64 %r2589,[%r400]; st.u64 [%frame+96],%r2589; ld.u64 %r2590,[%r400+8]; st.u64 [%frame+104],%r2592305230623072591,[%frame+64]; ld.u64 %r2592,[%frame+72]; .loc 1 2907 11 add.u64 %r2593,%r2045,%r2591; set.u32.lt.u64 %r2594,%r2593,%r2045; cvt.s64.s32 %r2595,%r2594; add.u64 %r2596,%r2046,%r2592; mov.u64 %r2045,%r2593; sub.u64 %r2046,%r2596,%r2595; .loc 1 2905 10 add.u64 %r332,%r332,16; add.u64 %r297,%r400,16; setp.ne.u64 %r2598,%r398,%r297; @ %r2598 bra $L83; bra $L82; $L281: .loc 1 2762 43 ld.u64 %r2599,[%r556]; st.u64 [%r330],%r2599; ld.u64 %r2600,[%r556+8]; st.u64 [%r330+8],%r2600; .loc 1 2760 9 add.u64 %r57,%r556,16; add.u64 %r73,%r330,4096; $L249: .loc 1 2762 43 ld.u64 %r2602,[%r57]; st.u64 [%r73],%r2602; ld.u64 %r2603,[%r57+8]; st.u64 [%r73+8],%r2603; .loc 1 2760 9 add.u64 %r57,%r57,16; add.u64 %r73,%r73,4096; $L248: .loc 1 2762 43 ld.u64 %r2605,[%r57]; st.u64 [%r73],%r2605; ld.u64 %r2606,[%r57+8]; st.u64 [%r73+8],%r2606; .loc 1 2760 9 add.u64 %r57,%r57,16; add.u64 %r73,%r73,4096; setp.ne.u64 %r2607,%r57,%r136; @ %r2607 bra $L58; bra $L48; $L280: .loc 1 2741 7 add.u64 %r2608,%r536,%r539; add.u64 %r2609,%r2608,%r540; .loc 1 2740 49 ld.u64 %r2610,[%r2609]; st.u64 [%r229],%r2610; ld.u64 %r2611,[%r2609+8]; st.u64 [%r229+8],%r2611; .loc 1 2742 49 ld.u64 %r2612,[%r536]; st.u64 [%r229+16],%r2612; ld.u64 %r2613,[%r536+8]; st.u64 [%r229+24],%r2613; .loc 1 2745 7 add.u64 %r2614,%r536,%r540; add.u64 %r2615,%r2614,%r539; .loc 1 2744 49 ld.u64 %r2616,[%r2615+16]; st.u64 [%r229+4096],%r2616; ld.u64 %r2617,[%r2615+24]; st.u64 [%r229+4104],%r2617; .loc 1 2746 49 ld.u64 %r2618,[%r536+16]; st.u64 [%r229+4112],%r2618; ld.u64 %r2619,[%r536+24]; st.u64 [%r229+4120],%r2619; .loc 1 2738 9 add.u64 %r229,%r229,8192; add.u64 %r211,%r536,32; setp.ne.u64 %r2620,%r211,%r527; @ %r2620 bra $L52; bra $L53; $L278: .loc 1 2688 2 @ ! %r2234 bra $L290; .loc 1 2689 22 mov.u32 %r2621262622,[%value_in]; } $L290: .loc 1 2687 24 mov.u64 %r296,2; .loc 1 2687 7 add.u64 %r620,%r245,%r621; $L247: .loc 1 2688 2 @ ! %r2234 bra $L291; .loc 1 2689 22 mov.u32 %r262426242625,[%value_in]; } $L291: .loc 1 2687 24 add.u64 %r296,%r296,1; .loc 1 2687 7 add.u64 %r620,%r620,%r621; $L246: .loc 1 2688 2 @ ! %r2234 bra $L292; .loc 1 2689 22 mov.u32 %r262726272628,[%value_in]; } $L292: .loc 1 2687 24 add.u64 %r296,%r296,1; .loc 1 2687 7 add.u64 %r620,%r620,%r621; setp.eq.u64 %r2629,%r296,%r624; @ ! %r2629 bra $L34; bra $L33; $L214: .loc 1 2689 22 mov.u32 %r2630241262631,[%value_in]; } $L279: .loc 1 2687 7 add.u64 %r2633,%r2412,%r621; .loc 1 2688 2 @ ! %r2234 bra $L293; .loc 1 2689 22 mov.u32 %r2635263262636,[%value_in]; } $L293: .loc 1 2687 7 add.u64 %r2638,%r2633,%r621; .loc 1 2688 2 @ ! %r2234 bra $L294; .loc 1 2689 22 mov.u32 %r264026362641,[%value_in]; } $L294: .loc 1 2687 24 add.u64 %r296,%r2369,3; .loc 1 2687 7 add.u64 %r620,%r2638,%r621; setp.eq.u64 %r2642,%r296,%r624; @ ! 
%r2642 bra $L34; bra $L33; $L284: .loc 1 2938 23 ld.u64 %r2643,[%r625]; st.u64 [%frame+80],%r2643; ld.u64 %r2644,[%r625+8]; st.u64 [%frame+88],%r2644; ld.u64 %r2645,[%r652]; st.u64 [%frame+96],%r2645; ld.u64 %r2646,[%r652+8]; st.u64 [%frame+104],%r26431r2314231.loc 1 2938 9 ld.u64 %r2097,[%frame+64]; ld.u64 %r2098,[%frame+72]; .loc 1 2937 5 add.u64 %r625,%r625,16; add.u64 %r626,%r652,16; setp.ne.u64 %r2654,%r625,%r631; @ %r2654 bra $L93; bra $L92; $L285: .loc 1 2953 26 ld.u64 %r2655,[%r243]; st.u64 [%frame+80],%r2655; ld.u64 %r2656,[%r243+8]; st.u64 [%frame+88],%r2656; ld.u64 %r2657,[%r656]; st.u64 [%frame+96],%r2657; ld.u64 %r2658,[%r656+8]; st.u64 [%frame+104],%r265823162317231.loc 1 2953 5 ld.u64 %r2095,[%frame+64]; ld.u64 %r2096,[%frame+72]; .loc 1 2952 8 add.u64 %r653,%r243,%r655; add.u64 %r656,%r656,16; setp.ne.u64 %r2666,%r656,%r672; @ %r2666 bra $L97; bra $L96; $L286: .loc 1 2968 29 ld.u64 %r2667,[%r243]; st.u64 [%frame+80],%r2667; ld.u64 %r2668,[%r243+8]; st.u64 [%frame+88],%r2668; ld.u64 %r2669,[%r676]; st.u64 [%frame+96],%r2669; ld.u64 %r2670,[%r676+8]; st.u64 [%frame+104],%r2672319232023.loc 1 2968 8 ld.u64 %r2101,[%frame+64]; ld.u64 %r2102,[%frame+72]; .loc 1 2967 28 mov.u64 %r313,1; .loc 1 2967 4 add.u64 %r673,%r243,%r675; add.u64 %r676,%r676,%r678; setp.ne.u64 %r2678,%r456,%r313; @ %r2678 bra $L100; bra $L99; $L288: .loc 1 2983 37 ld.u64 %r2679,[%r690]; st.u64 [%frame+80],%r2679; ld.u64 %r2680,[%r690+8]; st.u64 [%frame+88],%r2680; ld.u64 %r2681,[%r699]; st.u64 [%frame+96],%r2681; ld.u64 %r2682,[%r699+8]; st.u64 [%frame+104],%r268223252326232683,[%frame+64]; ld.u64 %r2684,[%frame+72]; .loc 1 2982 36 ld.u64 %r2685,[%r710]; ld.u64 %r2686,[%r710+8]; add.u64 %r2687,%r2685,%r2683; set.u32.lt.u64 %r2688,%r2687,%r2685; cvt.s64.s32 %r2689,%r2688; add.u64 %r2690,%r2686,%r2684; sub.u64 %r2691,%r2690,%r2689; st.u64 [%r710],%r2687; st.u64 [%r710+8],%r2691; .loc 1 2980 29 mov.u64 %r308,1; .loc 1 2980 4 add.u64 %r689,%r710,%r715; add.u64 %r690,%r690,%r692; setp.ne.u64 %r2692,%r216,%r308; @ %r2692 bra $L110; bra $L113; $L287: .loc 1 2976 34 st.u64 [%r713],%r2328; st.u64 [%r713+8],%r2328; .loc 1 2975 27 mov.u64 %r310,1; .loc 1 2975 2 add.u64 %r713,%r713,%r715; $L251: .loc 1 2976 34 st.u64 [%r713],%r2328; st.u64 [%r713+8],%r2328; .loc 1 2975 27 add.u64 %r310,%r310,1; .loc 1 2975 2 add.u64 %r713,%r713,%r715; $L250: .loc 1 2976 34 st.u64 [%r713],%r2328; st.u64 [%r713+8],%r2328; .loc 1 2975 27 add.u64 %r310,%r310,1; .loc 1 2975 2 add.u64 %r713,%r713,%r715; setp.ne.u64 %r2695,%r216,%r310; @ %r2695 bra $L105; bra $L109; $L289: .loc 1 3002 28 ld.u64 %r2696,[%r721]; st.u64 [%frame+80],%r2696; ld.u64 %r2697,[%r721+8]; st.u64 [%frame+88],%r2697; ld.u64 %r2698,[%r747]; st.u64 [%frame+96],%r2698; ld.u64 %r2699,[%r747+8]; st.u64 [%frame+104],%r269233r23233.loc 1 3002 5 ld.u64 %r2099,[%frame+64]; ld.u64 %r2100,[%frame+72]; .loc 1 3001 32 mov.u64 %r305,1; .loc 1 3001 8 add.u64 %r721,%r721,%r723; add.u64 %r725,%r747,%r727; setp.ne.u64 %r2707,%r456,%r305; @ %r2707 bra $L118; bra $L117; } matmul_r4.o/, .param .u32 %in_ar4, .param .u64 %in_ar5); .file 1 "../../../libgfortran/generated/matmul_r4.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]FUNCTION DECL: _gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime_97,116,109,117,108,95,114,52,0 }2] = {782] = {672] = 
{8471,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,97,41,32,61,61,32,50,32,124,124,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,98,41,32,61,61,3243116,114,97,110,47,103,101,110,101,114,97,116,101,100,47,109,97,116,109,117,108,95,114,52,465,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,49,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3265,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,50,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3273,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,97,114,103,117,109,101,110,116,32,66,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,49,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }103,101,109,109,32,33,61,32,78,85,76,7_gfortran_matmul_r4 .visible .func _gfortran_matmul_r89f32 %r240; .reg .f32 %r241; .reg .f32 %r242; .reg .f32 %r243; .reg .f32 %r244; .reg .u64 %r245; .reg .f32 %r246; .reg .f32 %r247; .reg .f32 %r248; .reg .f32 %r249; .reg .u64 %r250; .reg .f32 %r251; .reg .f32 %r252; .reg .f32 %r253; .reg .f32 %r254; .reg .f32 %r255; .reg .f32 %r256; .reg .f32 %r257; .reg .f32 %r258; .reg .f32 %r259; .reg .f32 %r260; .reg .f32 %r261; .reg .f32 %r262; .reg .f32 %r263; .reg .f32 %r264; .reg .f32 %r265; .reg .f32 %r266; .reg .u64f32u64 %r280; .reg .u64 %r281; .reg .u64 %r282; .reg .u64 %r283; .reg .f32u64 %r311; .reg .u64 %r313; .reg .u64u64 %r323; .reg .u64 %r328; .reg .u64 %r331; .reg .u64u64 %r391; .reg .u64 %r397; .reg .u64u64u32 %r473; .reg .u64 %r474; .reg .u64 %r475; .reg .u64 %r476; .reg .u64u64 %r511; .reg .u64 %r512; .reg .u64 %r513; .reg .u64u64u64u64 %r571; .reg .u64 %r572; .reg .u64 %r573; .reg .u64 %r574; .reg .u64 %r575; .reg .u64 %r577; .reg .u64u64 %r591; .reg .u64u64 %r602; .reg .u64 %r604; .reg .u64 %r605; .reg .u64 %r611; .reg .u64 %r613; .reg .u64 %r614; .reg .u64 %r618; .reg .u64 %r623; .reg .u64 %r624; .reg .u64 %r626; .reg .u64 %r627; .reg .u64 %r632; .reg .u64 %r634; .reg .u64 %r635; .reg .u64 %r643; .reg .u64 %r644; .reg .u64 %r646; .reg .u64 %r647; .reg .u64 %r649; .reg .u64 %r652; .reg .u64 %r654; .reg .u64 %r655; .reg .u64 %r660; .reg .u64 %r661; .reg .u64 %r663; .reg .u64 %r668; .reg .u64 %r670; .reg .u64 %r672; .reg .u64 %r676; .reg .u64 %r677; .reg .u64 %r681; .reg .u64 %r684; .reg .u64 %r686; .reg .u64 %r688; .reg .u64 %r692; .reg .u64 %r694; .reg .u64 %r696; .reg .u64 %r698; .reg .u64 %r701; .reg .u64 %r703; .reg .u64 %r705; .reg .u64 %r710; .reg .u64 %r712; .reg .u64 %r718; .reg .u64 %r719; .reg .u64 %r720; .reg .u64 %r721; .reg .u32 %r722; .reg .u32 %r723; .reg .u64 %r724; .reg .u16 %r725; .reg .pred %r726; .reg .u16 %r727; .reg .pred %r729; .reg .u64 %r730; .reg .u32 %r731; .reg .u64 %r732; .reg .u64 %r733; .reg .u64 %r734; .reg .pred %r735; .reg .pred %r737; .reg .u64 %r740; .reg .u64 %r742; .reg .u64 %r743; .reg .u64 %r744; .reg .u64 %r746; .reg .u64 %r747; .reg .u64 %r748; .reg .u64 %r749; .reg .u64 %r750; .reg .u64 %r753; .reg .u64 %r755; .reg .u64 %r756; .reg .u64 %r758; .reg .u64 %r760; .reg .u64 %r761; .reg .u64 %r763; .reg .u64 %r764; .reg .u64 %r765; .reg .u64 %r766; .reg .u64 %r767; .reg .u32 %r768; .reg .pred %r769; .reg .u64 %r770; .reg .u64 %r771; .reg .u64 %r772; .reg .pred %r774; .reg .u64 %r775; .reg .u64 %r776; .reg .u64 %r777; .reg .pred 
%r778; .reg .u64 %r779; .reg .u64 %r780; .reg .u64 %r781; .reg .u16 %r782; .reg .pred %r783; .reg .u64 %r784; .reg .u64 %r786; .reg .u64 %r787; .reg .u64 %r788; .reg .u16 %r789; .reg .pred %r791; .reg .pred %r792; .reg .u64 %r793; .reg .pred %r795; .reg .u64 %r796; .reg .u64 %r798; .reg .u64 %r799; .reg .u64 %r800; .reg .u64 %r801; .reg .u64 %r802; .reg .u64 %r803; .reg .pred %r804; .reg .u64 %r805; .reg .u64 %r807; .reg .u64 %r808; .reg .u64 %r809; .reg .u16 %r810; .reg .u32 %r811; .reg .pred %r812; .reg .u16 %r813; .reg .pred %r814; .reg .u64 %r815; .reg .u64 %r816; .reg .u64 %r817; .reg .u64 %r818; .reg .u64 %r819; .reg .u64 %r820; .reg .pred %r821; .reg .u32 %r823; .reg .u32 %r824; .reg .u32 %r826; .reg .u32 %r827; .reg .u16 %r828; .reg .u16 %r829; .reg .u16 %r830; .reg .u32 %r831; .reg .u16 %r832; .reg .pred %r833; .reg .u64 %r834; .reg .u16 %r836; .reg .pred %r837; .reg .u64 %r838; .reg .u64 %r839; .reg .u64 %r840; .reg .u32 %r841; .reg .u32 %r842; .reg .u32 %r843; .reg .u32 %r844; .reg .u32 %r845; .reg .u32 %r846; .reg .u32 %r848; .reg .u32 %r849; .reg .u16 %r850; .reg .u16 %r851; .reg .u16 %r852; .reg .u32 %r853; .reg .u16 %r854; .reg .pred %r855; .reg .u32 %r857; .reg .u32 %r858; .reg .u16 %r859; .reg .u16 %r860; .reg .u16 %r861; .reg .u32 %r862; .reg .u16 %r863; .reg .pred %r864; .reg .u32 %r866; .reg .u32 %r867; .reg .u32 %r869; .reg .u32 %r870; .reg .u16 %r871; .reg .u16 %r872; .reg .u16 %r873; .reg .u32 %r874; .reg .u16 %r875; .reg .pred %r876; .reg .f32 %r877; .reg .f32 %r878; .reg .f32 %r879; .reg .f32 %r880; .reg .f32 %r881; .reg .f32 %r882; .reg .f32 %r883; .reg .pred %r884; .reg .f32 %r885; .reg .f32 %r886; .reg .pred %r887; .reg .pred %r888; .reg .u32 %r890; .reg .u32 %r891; .reg .u32 %r893; .reg .u32 %r894; .reg .u16 %r895; .reg .u16 %r896; .reg .u16 %r897; .reg .u32 %r898; .reg .u16 %r899; .reg .pred %r900; .reg .u32 %r902; .reg .u32 %r903; .reg .u32 %r905; .reg .u32 %r906; .reg .u16 %r907; .reg .u16 %r908; .reg .u16 %r909; .reg .u32 %r910; .reg .u16 %r911; .reg .pred %r912; .reg .u32 %r914; .reg .u32 %r915; .reg .u32 %r917; .reg .u32 %r918; .reg .u16 %r919; .reg .u16 %r920; .reg .u16 %r921; .reg .u32 %r922; .reg .u16 %r923; .reg .pred %r924; .reg .pred %r925; .reg .u64 %r926; .reg .u32 %r927; .reg .u64 %r928; .reg .u64 %r929; .reg .u32 %r930; .reg .pred %r931; .reg .pred %r932; .reg .u32 %r933; .reg .pred %r934; .reg .pred %r935; .reg .u32 %r950; .reg .u64 %r951; .reg .u64 %r952; .reg .u64 %r953; .reg .u64 %r954; .reg .u64 %r955; .reg .u64 %r956; .reg .u64 %r957; .reg .u16 %r958; .reg .u16 %r959; .reg .u16 %r960; .reg .u32 %r961; .reg .u16 %r962; .reg .pred %r963; .reg .pred %r964; .reg .u16 %r965; .reg .pred %r966; .reg .u64 %r967; .reg .pred %r968; .reg .u32 %r972; .reg .u64 %r974; .reg .u32 %r979; .reg .u32 %r980; .reg .u32 %r982; .reg .u32 %r983; .reg .u16 %r984; .reg .u16 %r985; .reg .u16 %r986; .reg .u32 %r988; .reg .u32 %r989; .reg .u16 %r991; .reg .u16 %r993; .reg .u32 %r994; .reg .u16 %r995; .reg .pred %r996; .reg .pred %r997; .reg .u64 %r998; .reg .u64 %r1000; .reg .u64 %r1001; .reg .u64 %r1002; .reg .pred %r1004; .reg .u64 %r1006; .reg .u64 %r1007; .reg .u64 %r1010; .reg .u64 %r1011; .reg .u64 %r1015; .reg .u64 %r1016; .reg .u64 %r1017; .reg .u64 %r1019; .reg .u64 %r1020; .reg .u64 %r1023; .reg .u64 %r1026; .reg .u64 %r1027; .reg .u64 %r1030; .reg .u64 %r1032; .reg .u64 %r1033; .reg .u64 %r1034; .reg .u64 %r1035; .reg .u64 %r1036; .reg .pred %r1038; .reg .u64 %r1044; .reg .u64 %r1045; .reg .u64 %r1046; .reg .u64 %r1047; .reg .pred %r1049; .reg .u64 %r1050; 
.reg .u64 %r1051; .reg .u64 %r1052; .reg .u64 %r1053; .reg .u64 %r1054; .reg .u64 %r1056; .reg .u64 %r1057; .reg .u64 %r1058; .reg .u64 %r1059; .reg .u64 %r1060; .reg .u64 %r1061; .reg .u64 %r1062; .reg .u64 %r1063; .reg .u64 %r1067; .reg .u64 %r1068; .reg .u64 %r1069; .reg .u64 %r1070; .reg .u64 %r1074; .reg .u64 %r1076; .reg .u64 %r1079; .reg .u64 %r1082; .reg .u64 %r1087; .reg .u64 %r1088; .reg .u64 %r1090; .reg .u64 %r1091; .reg .f32 %r1092; .reg .f32 %r1093; .reg .u64 %r1094; .reg .u64 %r1095; .reg .f32 %r1096; .reg .f32 %r1097; .reg .u64 %r1099; .reg .u64 %r1100; .reg .f32 %r1101; .reg .u64 %r1102; .reg .f32 %r1103; .reg .pred %r1104; .reg .u64 %r1107; .reg .u64 %r1108; .reg .u64 %r1109; .reg .u64 %r1110; .reg .u64 %r1112; .reg .pred %r1113; .reg .u64 %r1114; .reg .u64 %r1115; .reg .f32 %r1116; .reg .u64 %r1119; .reg .u64 %r1120; .reg .u64 %r1121; .reg .u64 %r1122; .reg .u64 %r1123; .reg .u64 %r1124; .reg .u64 %r1125; .reg .u64 %r1126; .reg .u64 %r1127; .reg .u64 %r1129; .reg .u64 %r1130; .reg .u64 %r1131; .reg .u64 %r1132; .reg .u64 %r1137; .reg .u64 %r1138; .reg .u64 %r1139; .reg .u64 %r1140; .reg .u64 %r1141; .reg .u64 %r1142; .reg .pred %r1144; .reg .pred %r1145; .reg .u64 %r1149; .reg .f32 %r1150; .reg .f32 %r1151; .reg .f32 %r1152; .reg .f32 %r1153; .reg .pred %r1155; .reg .pred %r1156; .reg .pred %r1157; .reg .u64 %r1158; .reg .u64 %r1159; .reg .u64 %r1161; .reg .u64 %r1162; .reg .u64 %r1163; .reg .u64 %r1164; .reg .u64 %r1172; .reg .u64 %r1173; .reg .u64 %r1174; .reg .u64 %r1176; .reg .f32 %r1177; .reg .f32 %r1178; .reg .u64 %r1179; .reg .f32 %r1180; .reg .u64 %r1181; .reg .f32 %r1182; .reg .pred %r1184; .reg .pred %r1185; .reg .u64 %r1188; .reg .f32 %r1189; .reg .f32 %r1190; .reg .pred %r1192; .reg .u16 %r1193; .reg .pred %r1194; .reg .u32 %r1196; .reg .u32 %r1197; .reg .u16 %r1198; .reg .u16 %r1199; .reg .u32 %r1201; .reg .u16 %r1202; .reg .pred %r1203; .reg .pred %r1204; .reg .u16 %r1205; .reg .pred %r1206; .reg .pred %r1207; .reg .pred %r1208; .reg .u64 %r1209; .reg .u64 %r1210; .reg .pred %r1212; .reg .u64 %r1213; .reg .u64 %r1214; .reg .u64 %r1216; .reg .f32 %r1217; .reg .f32 %r1218; .reg .pred %r1220; .reg .u64 %r1222; .reg .u64 %r1223; .reg .u64 %r1224; .reg .f32 %r1225; .reg .f32 %r1226; .reg .pred %r1228; .reg .pred %r1229; .reg .u64 %r1231; .reg .f32 %r1232; .reg .f32 %r1233; .reg .pred %r1235; .reg .pred %r1236; .reg .pred %r1237; .reg .pred %r1238; .reg .u64 %r1239; .reg .pred %r1242; .reg .u64 %r1244; .reg .f32 %r1245; .reg .f32 %r1246; .reg .f32 %r1247; .reg .pred %r1249; .reg .pred %r1251; .reg .u64 %r1253; .reg .u64 %r1254; .reg .pred %r1256; .reg .u64 %r1257; .reg .u64 %r1258; .reg .u64 %r1260; .reg .f32 %r1261; .reg .f32 %r1262; .reg .pred %r1264; .reg .pred %r1265; .reg .u64 %r1266; .reg .pred %r1267; .reg .u64 %r1268; .reg .u64 %r1269; .reg .u64 %r1270; .reg .u16 %r1271; .reg .u32 %r1272; .reg .pred %r1273; .reg .u16 %r1274; .reg .u32 %r1275; .reg .pred %r1276; .reg .u64 %r1277; .reg .u64 %r1278; .reg .u64 %r1279; .reg .u64 %r1280; .reg .u64 %r1281; .reg .u64 %r1282; .reg .u64 %r1283; .reg .u32 %r1284; .reg .pred %r1285; .reg .u32 %r1286; .reg .u64 %r1287; .reg .pred %r1288; .reg .pred %r1293; .reg .pred %r1294; .reg .pred %r1295; .reg .pred %r1296; .reg .pred %r1299; .reg .pred %r1300; .reg .pred %r1301; .reg .pred %r1302; .reg .pred %r1303; .reg .u64 %r1305; .reg .u64 %r1312; .reg .u64 %r1313; .reg .u64 %r1314; .reg .u64 %r1315; .reg .u64 %r1320; .reg .u64 %r1321; .reg .u64 %r1322; .reg .u64 %r1323; .reg .u64 %r1325; .reg .u64 %r1326; .reg .u64 %r1327; 
.reg .u64 %r1328; .reg .u64 %r1329; .reg .pred %r1330; .reg .u64 %r1332; .reg .pred %r1334; .reg .u64 %r1337; .reg .u64 %r1338; .reg .u64 %r1339; .reg .pred %r1340; .reg .u64 %r1341; .reg .u64 %r1342; .reg .pred %r1346; .reg .f32 %r1347; .reg .pred %r1348; .reg .u64 %r1349; .reg .u64 %r1350; .reg .u64 %r1351; .reg .u64 %r1352; .reg .u64 %r1353; .reg .pred %r1354; .reg .pred %r1355; .reg .pred %r1356; .reg .u64 %r1359; .reg .u64 %r1360; .reg .u64 %r1361; .reg .u64 %r1362; .reg .u64 %r1363; .reg .pred %r1364; .reg .pred %r1365; .reg .pred %r1366; .reg .u64 %r1368; .reg .u64 %r1369; .reg .u64 %r1370; .reg .u64 %r1371; .reg .u64 %r1372; .reg .u64 %r1373; .reg .pred %r1374; .reg .pred %r1375; .reg .pred %r1376; .reg .u64 %r1382; .reg .u64 %r1383; .reg .u64 %r1384; .reg .u64 %r1386; .reg .pred %r1387; .reg .u64 %r1388; .reg .u64 %r1389; .reg .u64 %r1390; .reg .u64 %r1391; .reg .u64 %r1392; .reg .u64 %r1393; .reg .u64 %r1394; .reg .pred %r1395; .reg .pred %r1396; .reg .pred %r1397; .reg .u64 %r1400; .reg .u64 %r1401; .reg .u64 %r1402; .reg .u64 %r1403; .reg .u64 %r1404; .reg .pred %r1405; .reg .pred %r1406; .reg .pred %r1407; .reg .u64 %r1408; .reg .u64 %r1409; .reg .u64 %r1413; .reg .pred %r1414; .reg .pred %r1415; .reg .pred %r1416; .reg .u64 %r1417; .reg .u64 %r1418; .reg .u64 %r1419; .reg .u64 %r1420; .reg .u64 %r1421; .reg .u64 %r1422; .reg .pred %r1423; .reg .pred %r1424; .reg .pred %r1425; .reg .u64 %r1428; .reg .u64 %r1429; .reg .u64 %r1430; .reg .u64 %r1431; .reg .u64 %r1432; .reg .pred %r1433; .reg .pred %r1434; .reg .pred %r1435; .reg .u64 %r1440; .reg .pred %r1441; .reg .pred %r1442; .reg .pred %r1443; .reg .u64 %r1448; .reg .pred %r1449; .reg .pred %r1450; .reg .pred %r1451; .reg .u64 %r1456; .reg .pred %r1457; .reg .pred %r1458; .reg .pred %r1459; .reg .u64 %r1464; .reg .pred %r1465; .reg .pred %r1466; .reg .pred %r1467; .reg .u64 %r1470; .reg .u64 %r1471; .reg .u64 %r1473; .reg .u64 %r1474; .reg .u64 %r1477; .reg .u64 %r1478; .reg .f32 %r1479; .reg .f32 %r1480; .reg .u64 %r1481; .reg .u64 %r1482; .reg .f32 %r1483; .reg .f32 %r1484; .reg .u64 %r1485; .reg .u64 %r1486; .reg .u64 %r1488; .reg .u64 %r1489; .reg .f32 %r1490; .reg .f32 %r1491; .reg .u64 %r1492; .reg .u64 %r1493; .reg .f32 %r1494; .reg .f32 %r1495; .reg .u64 %r1496; .reg .u64 %r1497; .reg .u64 %r1499; .reg .u64 %r1500; .reg .f32 %r1501; .reg .f32 %r1502; .reg .u64 %r1503; .reg .u64 %r1504; .reg .f32 %r1505; .reg .f32 %r1506; .reg .pred %r1507; .reg .f32 %r1510; .reg .f32 %r1514; .reg .f32 %r1518; .reg .pred %r1519; .reg .f32 %r1520; .reg .f32 %r1521; .reg .f32 %r1522; .reg .f32 %r1523; .reg .f32 %r1524; .reg .f32 %r1525; .reg .f32 %r1526; .reg .f32 %r1527; .reg .f32 %r1528; .reg .f32 %r1529; .reg .f32 %r1530; .reg .f32 %r1531; .reg .f32 %r1532; .reg .f32 %r1533; .reg .f32 %r1534; .reg .f32 %r1535; .reg .u64 %r1538; .reg .u64 %r1539; .reg .f32 %r1540; .reg .f32 %r1541; .reg .f32 %r1542; .reg .u64 %r1543; .reg .u64 %r1544; .reg .f32 %r1545; .reg .u64 %r1546; .reg .f32 %r1547; .reg .u64 %r1548; .reg .f32 %r1549; .reg .u64 %r1550; .reg .f32 %r1551; .reg .u64 %r1552; .reg .f32 %r1553; .reg .pred %r1554; .reg .f32 %r1555; .reg .f32 %r1556; .reg .f32 %r1557; .reg .f32 %r1558; .reg .f32 %r1564; .reg .f32 %r1565; .reg .f32 %r1566; .reg .f32 %r1567; .reg .f32 %r1568; .reg .f32 %r1569; .reg .f32 %r1570; .reg .f32 %r1571; .reg .f32 %r1572; .reg .f32 %r1579; .reg .f32 %r1580; .reg .f32 %r1581; .reg .f32 %r1582; .reg .f32 %r1583; .reg .f32 %r1584; .reg .f32 %r1585; .reg .f32 %r1586; .reg .f32 %r1587; .reg .f32 %r1594; .reg .f32 %r1595; 
.reg .f32 %r1596; .reg .f32 %r1597; .reg .f32 %r1598; .reg .pred %r1599; .reg .pred %r1600; .reg .f32 %r1601; .reg .f32 %r1602; .reg .f32 %r1603; .reg .f32 %r1604; .reg .u64 %r1607; .reg .f32 %r1608; .reg .u64 %r1609; .reg .f32 %r1610; .reg .f32 %r1611; .reg .f32 %r1612; .reg .f32 %r1613; .reg .u64 %r1614; .reg .f32 %r1615; .reg .f32 %r1616; .reg .u64 %r1617; .reg .f32 %r1618; .reg .f32 %r1619; .reg .u64 %r1621; .reg .u64 %r1623; .reg .f32 %r1624; .reg .u64 %r1625; .reg .f32 %r1626; .reg .f32 %r1627; .reg .f32 %r1628; .reg .f32 %r1629; .reg .u64 %r1630; .reg .f32 %r1631; .reg .f32 %r1632; .reg .u64 %r1633; .reg .f32 %r1634; .reg .f32 %r1635; .reg .u64 %r1637; .reg .u64 %r1639; .reg .f32 %r1640; .reg .u64 %r1641; .reg .f32 %r1642; .reg .f32 %r1643; .reg .u64 %r1644; .reg .f32 %r1645; .reg .u64 %r1646; .reg .f32 %r1647; .reg .pred %r1648; .reg .f32 %r1649; .reg .f32 %r1652; .reg .f32 %r1653; .reg .f32 %r1654; .reg .f32 %r1658; .reg .f32 %r1659; .reg .f32 %r1660; .reg .f32 %r1664; .reg .f32 %r1665; .reg .pred %r1666; .reg .pred %r1667; .reg .f32 %r1668; .reg .f32 %r1671; .reg .f32 %r1672; .reg .f32 %r1673; .reg .f32 %r1677; .reg .f32 %r1678; .reg .f32 %r1679; .reg .f32 %r1683; .reg .f32 %r1684; .reg .pred %r1685; .reg .pred %r1686; .reg .f32 %r1687; .reg .u64 %r1688; .reg .f32 %r1690; .reg .f32 %r1691; .reg .f32 %r1692; .reg .u64 %r1693; .reg .f32 %r1696; .reg .f32 %r1697; .reg .f32 %r1698; .reg .u64 %r1699; .reg .f32 %r1702; .reg .f32 %r1703; .reg .pred %r1704; .reg .pred %r1705; .reg .f32 %r1706; .reg .u64 %r1708; .reg .u64 %r1709; .reg .f32 %r1710; .reg .f32 %r1711; .reg .f32 %r1712; .reg .u64 %r1714; .reg .u64 %r1715; .reg .f32 %r1717; .reg .f32 %r1718; .reg .f32 %r1719; .reg .u64 %r1721; .reg .u64 %r1722; .reg .f32 %r1724; .reg .f32 %r1725; .reg .pred %r1726; .reg .u64 %r1728; .reg .u64 %r1730; .reg .u64 %r1733; .reg .pred %r1735; .reg .pred %r1736; .reg .pred %r1737; .reg .u64 %r1739; .reg .u64 %r1740; .reg .f32 %r1741; .reg .f32 %r1742; .reg .f32 %r1743; .reg .f32 %r1744; .reg .u64 %r1746; .reg .u64 %r1747; .reg .f32 %r1749; .reg .f32 %r1750; .reg .f32 %r1751; .reg .f32 %r1752; .reg .u64 %r1754; .reg .u64 %r1755; .reg .f32 %r1757; .reg .f32 %r1758; .reg .f32 %r1759; .reg .f32 %r1760; .reg .pred %r1761; .reg .pred %r1762; .reg .f32 %r1764; .reg .u64 %r1766; .reg .u64 %r1767; .reg .f32 %r1768; .reg .f32 %r1769; .reg .f32 %r1770; .reg .u64 %r1772; .reg .u64 %r1773; .reg .f32 %r1775; .reg .f32 %r1776; .reg .f32 %r1777; .reg .u64 %r1779; .reg .u64 %r1780; .reg .f32 %r1782; .reg .f32 %r1783; .reg .pred %r1784; .reg .f32 %r1785; .reg .f32 %r1786; .reg .f32 %r1788; .reg .f32 %r1789; .reg .f32 %r1791; .reg .f32 %r1792; .reg .pred %r1793; .reg .u64 %r1794; .reg .f32 %r1795; .reg .u64 %r1796; .reg .f32 %r1797; .reg .f32 %r1798; .reg .u64 %r1799; .reg .f32 %r1800; .reg .u64 %r1801; .reg .f32 %r1802; .reg .u64 %r1804; .reg .f32 %r1805; .reg .u64 %r1806; .reg .f32 %r1807; .reg .f32 %r1808; .reg .u64 %r1809; .reg .f32 %r1810; .reg .u64 %r1811; .reg .f32 %r1812; .reg .u64 %r1814; .reg .f32 %r1815; .reg .u64 %r1816; .reg .f32 %r1817; .reg .f32 %r1818; .reg .u64 %r1819; .reg .f32 %r1820; .reg .u64 %r1821; .reg .f32 %r1822; .reg .pred %r1823; .reg .f32 %r1824; .reg .f32 %r1825; .reg .f32 %r1826; .reg .f32 %r1827; .reg .f32 %r1828; .reg .f32 %r1830; .reg .f32 %r1831; .reg .f32 %r1832; .reg .f32 %r1833; .reg .f32 %r1834; .reg .f32 %r1836; .reg .f32 %r1837; .reg .f32 %r1838; .reg .f32 %r1839; .reg .f32 %r1840; .reg .pred %r1841; .reg .u64 %r1842; .reg .u64 %r1843; .reg .f32 %r1844; .reg .f32 %r1845; .reg 
.f32 %r1846; .reg .u64 %r1847; .reg .u64 %r1848; .reg .f32 %r1849; .reg .u64 %r1850; .reg .f32 %r1851; .reg .u64 %r1852; .reg .f32 %r1853; .reg .u64 %r1854; .reg .f32 %r1855; .reg .u64 %r1856; .reg .f32 %r1857; .reg .pred %r1858; .reg .f32 %r1859; .reg .f32 %r1861; .reg .f32 %r1863; .reg .pred %r1864; .reg .u64 %r1865; .reg .u64 %r1866; .reg .f32 %r1867; .reg .f32 %r1868; .reg .u64 %r1869; .reg .u64 %r1870; .reg .f32 %r1871; .reg .f32 %r1872; .reg .u64 %r1874; .reg .u64 %r1875; .reg .f32 %r1876; .reg .f32 %r1877; .reg .u64 %r1878; .reg .u64 %r1879; .reg .f32 %r1880; .reg .f32 %r1881; .reg .u64 %r1883; .reg .u64 %r1884; .reg .f32 %r1885; .reg .f32 %r1886; .reg .u64 %r1887; .reg .u64 %r1888; .reg .f32 %r1889; .reg .f32 %r1890; .reg .pred %r1891; .reg .u32 %r1892; .reg .u64 %r1893; .reg .u32 %r1895; .reg .u64 %r1896; .reg .u32 %r1898; .reg .u64 %r1899; .reg .pred %r1900; .reg .u32 %r1901; .reg .u64 %r1902; .reg .u64 %r1904; .reg .u32 %r1906; .reg .u64 %r1907; .reg .u64 %r1909; .reg .u32 %r1911; .reg .u64 %r1912; .reg .pred %r1913; .reg .f32 %r1914; .reg .f32 %r1915; .reg .f32 %r1917; .reg .f32 %r1918; .reg .f32 %r1920; .reg .f32 %r1921; .reg .pred %r1922; .reg .f32 %r1923; .reg .f32 %r1924; .reg .f32 %r1926; .reg .f32 %r1927; .reg .f32 %r1929; .reg .f32 %r1930; .reg .pred %r1931; .reg .f32 %r1932; .reg .f32 %r1933; .reg .f32 %r1935; .reg .f32 %r1936; .reg .f32 %r1938; .reg .f32 %r1939; .reg .pred %r1940; .reg .f32 %r1941; .reg .f32 %r1942; .reg .f32 %r1943; .reg .f32 %r1944; .reg .f32 %r1946; .reg .f32 %r1947; .reg .f32 %r1948; .reg .f32 %r1949; .reg .f32 %r1951; .reg .f32 %r1952; .reg .f32 %r1953; .reg .f32 %r1954; .reg .pred %r1955; .reg .pred %r1958; .reg .f32 %r1959; .reg .f32 %r1960; .reg .f32 %r1962; .reg .f32 %r1963; .reg .f32 %r1965; .reg .f32 %r1966; .reg .pred %r1967; mov.u64 %r719,%ar0; mov.u64 %r720,%ar1; mov.u64 %r721,%ar2; mov.u32 %r722,%ar3; mov.u32 %r723,%ar4; mov.u64 %r724,%ar5; .loc 1 2462 3 ld.s8 %r418,[%r720+28]; cvt.u16.u32 %r725,%r418; setp.eq.u16 %r726,%r725,2; @ %r726 bra $L2; ld.s8 %r1286,[%r721+28]; cvt.u16.u32 %r727,%r1286; setp.eq.u16 %r729,%r727,2; @ %r729 bra $L3; cvta.const.u64 %r733,$LC3; cvta.const.u64 %r732,__func__$0; mov.u32 %r731,2462; cvta.const.u64 %r7307327333: .loc 1 2476 6 ld.u64 %r734,[%r719]; setp.ne.u64 %r735,%r734,0; @ %r735 bra $L4; .loc 1 2478 10 setp.ne.u16 %r737,%r725,1; @ %r737 bra $L5; .loc 1 2480 4 st.u64 [%r719+48],%r734; ld.u64 %r740,[%r721+80]; ld.u64 %r742,[%r721+72]; sub.u64 %r743,%r740,%r742; st.u64 [%r719+56],%r743; mov.u64 %r744,1; st.u64 [%r719+40],%r744; bra $L6; $L123: .loc 1 2485 4 st.u64 [%r719+48],%r1266; add.u64 %r746,%r480,-1; st.u64 [%r719+56],%r746; mov.u64 %r747,1; st.u64 [%r719+40],%r747; bra $L6; $L124: .loc 1 2490 4 mov.u64 %r748,0; st.u64 [%r719+48],%r748; add.u64 %r749,%r480,-1; st.u64 [%r719+56],%r749; mov.u64 %r750,1; st.u64 [%r719+40],%r750; .loc 1 2493 11 st.u64 [%r719+72],%r748; ld.u64 %r753,[%r721+80]; ld.u64 %r755,[%r721+72]; sub.u64 %r756,%r753,%r755; st.u64 [%r719+80],%r756; st.u64 [%r719+64],%r480; $L6: .loc 1 2499 1719; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r758,[%value_in]; } .loc 1 2499 4760; call (%value_in),_gfortrani_xmallocarray761,[%value_in]; } .loc 1 2499 2 st.u64 [%r719],%r761; .loc 1 2500 24 mov.u64 %r763,0; st.u64 [%r719+8],%r763; .loc 1 2557 7 ld.s8 %r418,[%r720+28]; .loc 1 2564 15 ld.u64 %r765,[%r720+56]; add.u64 %r764,%r765,1; .loc 1 2564 13 ld.u64 %r766,[%r720+48]; sub.u64 %r428,%r764,%r766; bra $L7; $L4: .loc 1 2502 12 cvta.global.u64 %r767,_gfortrani_compile_options; 
.loc 1 2502 11 ld.u32 %r768,[%r767+36]; setp.eq.u32 %r769,%r768,0; @ %r769 bra $L8; .loc 1 2509 17 ld.u64 %r771,[%r719+56]; add.u64 %r770,%r771,1; .loc 1 2509 15 ld.u64 %r772,[%r719+48]; sub.u64 %r428,%r770,%r772; .loc 1 2506 10 setp.ne.u16 %r774,%r725,1; @ %r774 bra $L9; .loc 1 2508 17 ld.u64 %r776,[%r721+80]; add.u64 %r775,%r776,1; .loc 1 2508 15 ld.u64 %r777,[%r721+72]; sub.u64 %r210,%r775,%r777; .loc 1 2510 7 setp.ne.u64 %r778,%r210,%r428; @ %r778 bra $L10; .loc 1 2564 15 ld.u64 %r780,[%r720+56]; add.u64 %r779,%r780,1; .loc 1 2564 13 ld.u64 %r781,[%r720+48]; sub.u64 %r428,%r779,%r781; .loc 1 2543 7 ld.s8 %r303,[%r719+28]; .loc 1 2548 27 ld.u64 %r416,[%r719+40]; .loc 1 2543 6 cvt.u16.u32 %r782,%r303; setp.eq.u16 %r783,%r782,1; @ %r783 bra $L11; .loc 1 2553 16 ld.u64 %r476,[%r719+64]; .loc 1 2560 16 ld.u64 %r475,[%r720+40]; .loc 1 2563 14 mov.u64 %r186,1; .loc 1 2561 16 mov.u64 %r178,%r186; bra $L12; $L10: .loc 1 2511 6 st.u64 [%stack+8],%r210; st.u64 [%stack],%r428; cvta.const.u64 %r7847stack; call _gfortran_runtime_error2485 4 ld.u64 %r787,[%r720+56]; add.u64 %r786,%r787,1; ld.u64 %r788,[%r720+48]; sub.u64 %r484,%r786,%r788; .loc 1 2515 15 cvt.u16.u32 %r789,%r1286; setp.ne.u16 %r791,%r789,1; @ %r791 bra $L13; .loc 1 2519 7 setp.eq.u64 %r792,%r484,%r428; @ %r792 bra $L14; .loc 1 2520 6 st.u64 [%stack+8],%r484; st.u64 [%stack],%r428; cvta.const.u64 %r7933_gfortran_runtime_error13: .loc 1 2528 7 setp.eq.u64 %r795,%r484,%r428; @ %r795 bra $L15; .loc 1 2529 6 st.u64 [%stack+8],%r484; st.u64 [%stack],%r428; cvta.const.u64 %r7966_gfortran_runtime_error15: .loc 1 2533 17 ld.u64 %r799,[%r721+80]; add.u64 %r798,%r799,1; .loc 1 2533 15 ld.u64 %r800,[%r721+72]; sub.u64 %r208,%r798,%r800; .loc 1 2534 17 ld.u64 %r802,[%r719+80]; add.u64 %r801,%r802,1; .loc 1 2534 15 ld.u64 %r803,[%r719+72]; sub.u64 %r209,%r801,%r803; .loc 1 2535 7 setp.eq.u64 %r804,%r208,%r209; @ %r804 bra $L14; .loc 1 2536 6 st.u64 [%stack+8],%r208; st.u64 [%stack],%r209; cvta.const.u64 %r805,$LC805_gfortran_runtime_error2564 15 ld.u64 %r808,[%r720+56]; add.u64 %r807,%r808,1; .loc 1 2564 13 ld.u64 %r809,[%r720+48]; sub.u64 %r428,%r807,%r809; $L7: .loc 1 2548 27 ld.u64 %r476,[%r719+40]; .loc 1 2543 6 ld.s8 %r811,[%r719+28]; cvt.u16.u32 %r810,%r811; setp.eq.u16 %r812,%r810,1; @ %r812 bra $L130; $L127: .loc 1 2553 16 mov.u64 %r416,%r476; ld.u64 %r476,[%r719+64]; bra $L16; $L130: .loc 1 2548 27 mov.u64 %r416,%r476; $L16: .loc 1 2560 16 ld.u64 %r475,[%r720+40]; .loc 1 2557 6 cvt.u16.u32 %r813,%r418; setp.eq.u16 %r814,%r813,1; @ %r814 bra $L131; $L125: .loc 1 2569 16 ld.u64 %r178,[%r720+64]; .loc 1 2571 15 ld.u64 %r816,[%r720+80]; add.u64 %r815,%r816,1; .loc 1 2571 13 ld.u64 %r817,[%r720+72]; mov.u64 %r186,%r428; sub.u64 %r428,%r815,%r817; bra $L12; $L131: .loc 1 2563 14 mov.u64 %r186,1; .loc 1 2561 16 mov.u64 %r178,%r186; $L12: .loc 1 2575 16 ld.u64 %r819,[%r721+56]; add.u64 %r818,%r819,1; ld.u64 %r820,[%r721+48]; sub.u64 %r64,%r818,%r820; .loc 1 2575 6 setp.eq.u64 %r821,%r64,%r428; @ %r821 bra $L17; .loc 1 2577 10 set.u32.gt.s64 %r823,%r428,0; neg.s32 %r824,%r823; .loc 1 2577 21 set.u32.gt.s64 %r826,%r64,0; neg.s32 %r827,%r826; cvt.u16.u32 %r829,%r824; cvt.u16.u32 %r830,%r827; or.b16 %r828,%r829,%r830; cvt.u32.u16 %r831,%r828; cvt.u16.u8 %r832,%r831; setp.eq.u16 %r833,%r832,0; @ %r833 bra $L17; .loc 1 2578 2 st.u64 [%stack+8],%r428; st.u64 [%stack],%r64; cvta.const.u64 %r8348stack; call _gfortran_runtime_error17: .loc 1 2583 7 ld.s8 %r66,[%r721+28]; .loc 1 2586 16 ld.u64 %r474,[%r721+40]; .loc 1 2583 6 cvt.u16.u32 %r836,%r66; 
setp.eq.u16 %r837,%r836,1; @ %r837 bra $L132; .loc 1 2597 16 ld.u64 %r179,[%r721+64]; .loc 1 2598 16 ld.u64 %r839,[%r721+80]; add.u64 %r838,%r839,1; .loc 1 2598 14 ld.u64 %r840,[%r721+72]; sub.u64 %r187,%r838,%r840; bra $L18; $L132: .loc 1 2592 14 mov.u64 %r187,1; .loc 1 2591 16 mov.u64 %r179,256; $L18: .loc 1 2601 9 ld.u64 %r215,[%r720]; .loc 1 2602 9 ld.u64 %r216,[%r721]; .loc 1 2603 8 ld.u64 %r217,[%r719]; .loc 1 2612 28 set.u32.eq.u64 %r842,%r416,1; neg.s32 %r843,%r842; cvt.u32.u32 %r841,%r843; cvt.u32.u8 %r71,%r841; .loc 1 2612 46 set.u32.eq.u64 %r845,%r475,1; neg.s32 %r846,%r845; cvt.u32.u32 %r844,%r846; cvt.u32.u8 %r473,%r844; .loc 1 2612 7 set.u32.ne.u32 %r848,%r722,0; neg.s32 %r849,%r848; .loc 1 2612 16 cvt.u16.u32 %r851,%r849; cvt.u16.u32 %r852,%r71; and.b16 %r850,%r851,%r852; .loc 1 2612 6 cvt.u32.u16 %r853,%r850; cvt.u16.u8 %r854,%r853; setp.eq.u16 %r855,%r854,0; @ %r855 bra $L19; .loc 1 2612 63 set.u32.eq.u64 %r857,%r178,1; neg.s32 %r858,%r857; .loc 1 2612 51 cvt.u16.u32 %r860,%r858; cvt.u16.u32 %r861,%r473; or.b16 %r859,%r860,%r861; .loc 1 2612 33 cvt.u32.u16 %r862,%r859; cvt.u16.u8 %r863,%r862; setp.eq.u16 %r864,%r863,0; @ %r864 bra $L20; .loc 1 2613 20 set.u32.eq.u64 %r866,%r474,1; neg.s32 %r867,%r866; .loc 1 2613 37 set.u32.eq.u64 %r869,%r179,1; neg.s32 %r870,%r869; .loc 1 2613 25 cvt.u16.u32 %r872,%r867; cvt.u16.u32 %r873,%r870; or.b16 %r871,%r872,%r873; .loc 1 2613 7 cvt.u32.u16 %r874,%r871; cvt.u16.u8 %r875,%r874; setp.eq.u16 %r876,%r875,0; @ %r876 bra $L20; .loc 1 2615 13 cvt.rn.f32.s32 %r83,%r723; .loc 1 2614 12 cvt.rn.f32.s64 %r877,%r186; .loc 1 2614 50 cvt.rn.f32.s64 %r878,%r428; .loc 1 2614 47 mul.f32 %r879,%r877,%r878; .loc 1 2614 31 cvt.rn.f32.s64 %r880,%r187; .loc 1 2614 47 mul.f32 %r881,%r879,%r880; .loc 1 2615 13 mul.f32 %r882,%r83,%r83; mul.f32 %r883,%r882,%r83; .loc 1 2614 7 setp.gt.f32 %r884,%r881,%r883; @ ! 
%r884 bra $L19; .loc 1 2617 17 cvt.u32.u64 %r85,%r186; st.u32 [%frame+28],%r85; .loc 1 2617 29 cvt.u32.u64 %r86,%r187; st.u32 [%frame+24],%r86; .loc 1 2617 41 cvt.u32.u64 %r87,%r428; st.u32 [%frame+20],%r87; .loc 1 2617 52 cvt.u32.u64 %r88,%r476; st.u32 [%frame+16],%r88; .loc 1 2618 24 mov.f32 %r885,0f3f800000; st.f32 [%frame+12],%r885; .loc 1 2618 33 mov.f32 %r886,0f00000000; st.f32 [%frame+8],%r886; .loc 1 2619 17 setp.ne.u64 %r887,%r475,1; @ %r887 bra $L22; cvt.u32.u64 %r200,%r178; bra $L23; $L22: cvt.u32.u64 %r200,%r475; $L23: st.u32 [%frame+4],%r200; .loc 1 2620 3 setp.ne.u64 %r888,%r474,1; @ %r888 bra $L24; cvt.u32.u64 %r201,%r179; bra $L25; $L24: cvt.u32.u64 %r201,%r474; $L25: st.u32 [%frame],%r201; .loc 1 2622 15 set.u32.gt.s32 %r890,%r200,0; neg.s32 %r891,%r890; .loc 1 2622 26 set.u32.gt.s32 %r893,%r201,0; neg.s32 %r894,%r893; .loc 1 2622 19 cvt.u16.u32 %r896,%r891; cvt.u16.u32 %r897,%r894; and.b16 %r895,%r896,%r897; .loc 1 2622 10 cvt.u32.u16 %r898,%r895; cvt.u16.u8 %r899,%r898; setp.eq.u16 %r900,%r899,0; @ %r900 bra $L19; .loc 1 2622 37 set.u32.gt.s32 %r902,%r88,0; neg.s32 %r903,%r902; .loc 1 2622 46 set.u32.gt.s32 %r905,%r85,1; neg.s32 %r906,%r905; .loc 1 2622 41 cvt.u16.u32 %r908,%r903; cvt.u16.u32 %r909,%r906; and.b16 %r907,%r908,%r909; cvt.u32.u16 %r910,%r907; cvt.u16.u8 %r911,%r910; setp.eq.u16 %r912,%r911,0; @ %r912 bra $L19; .loc 1 2622 55 set.u32.gt.s32 %r914,%r86,1; neg.s32 %r915,%r914; .loc 1 2622 64 set.u32.gt.s32 %r917,%r87,1; neg.s32 %r918,%r917; .loc 1 2622 59 cvt.u16.u32 %r920,%r915; cvt.u16.u32 %r921,%r918; and.b16 %r919,%r920,%r921; cvt.u32.u16 %r922,%r919; cvt.u16.u8 %r923,%r922; setp.eq.u16 %r924,%r923,0; @ %r924 bra $L19; .loc 1 2624 4 setp.ne.u64 %r925,%r724,0; @ %r925 bra $L27; cvta.const.u64 %r929,$LC8; cvta.const.u64 %r928,__func__$0; mov.u32 %r927,2624; cvta.const.u64 %r926992927: .loc 1 2626 17 and.b32 %r930,%r722,2; .loc 1 2626 7 setp.ne.u32 %r931,%r930,0; @ %r931 bra $L133; .loc 1 2629 35 setp.eq.u64 %r932,%r475,1; @ %r932 bra $L134; cvta.const.u64 %r188,$LC2; bra $L28; $L133: .loc 1 2627 13 cvta.const.u64 %r188,$LC1; bra $L28; $L134: .loc 1 2629 35 cvta.const.u64 %r188,$LC0; $L28: .loc 1 2631 17 and.b32 %r933,%r722,4; .loc 1 2631 7 setp.ne.u32 %r934,%r933,0; @ %r934 bra $L135; .loc 1 2634 35 setp.eq.u64 %r935,%r474,1; @ %r935 bra $L136; cvta.const.u64 %r189,$LC2; bra $L29; $L135: .loc 1 2632 13 cvta.const.u64 %r189,$LC1; bra $L29; $L136: .loc 1 2634 35 cvta.const.u64 %r189,$LC0; $L29: .loc 1 2636 4 add.u64 %r951,%frame,16; add.u64 %r952,%frame,8; add.u64 %r953,%frame,4; add.u64 %r954,%frame,12; add.u64 %r955,%frame,20; add.u64 %r956,%frame,24; add.u64 %r957,%frame,28; mov.u32 %r950,1;,.param .u64 %in_ar3,.param .u64 %in_ar4,.param .u64 %in_ar5,.param .u64 %in_ar6,.param .u64 %in_ar7,.param .u64 %in_ar8,.param .u64 %in_ar9,.param .u64 %in_ar10,.param .u64 %in_ar11,.param .u64 %in_ar12,.param .u32 %in_ar13,.param .u32 %in_ar1418r18995795955215; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r953; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r216; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%frame; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r952; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r217; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r951; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r950; .param .u32 %out_arg15; st.param.u32 [%out_arg15],%r950; call %r724,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14,%out_arg15),$LCT0; } bra $L1; $L19: .loc 1 2643 21 cvt.u16.u32 %r959,%r71; 
cvt.u16.u32 %r960,%r473; and.b16 %r958,%r959,%r960; .loc 1 2643 6 cvt.u32.u16 %r961,%r958; cvt.u16.u8 %r962,%r961; setp.eq.u16 %r963,%r962,0; @ %r963 bra $L31; .loc 1 2643 38 setp.ne.u64 %r964,%r474,1; @ %r964 bra $L20; .loc 1 2644 7 cvt.u16.u32 %r965,%r66; setp.eq.u16 %r966,%r965,1; @ %r966 bra $L32; .loc 1 2677 16 add.u64 %r967,%r476,1; .loc 1 2678 9 shl.b64 %r104,%r967,2; sub.u64 %r219,%r217,%r104; .loc 1 2687 7 setp.le.s64 %r968,%r187,0; @ %r968 bra $L33; add.u64 %r592,%r104,-4; mov.u64 %r591,%r217; add.u64 %r595,%r187,1; .loc 1 2689 22 shl.b64 %r413,%r186,2; .loc 1 2687 13 mov.u64 %r268,%r474; setp.gt.s64 %r1303,%r186,0; and.b64 %r1413,%r187,3; setp.eq.u64 %r1416,%r1413,0; @ %r1416 bra $L34; setp.eq.u64 %r1415,%r1413,1; @ %r1415 bra $L309; setp.eq.u64 %r1414,%r1413,2; @ %r1414 bra $L310; bra $L380; $L35: .loc 1 2689 22 mov.u32 %r9797213974,[%value_in]; } $L36: .loc 1 2687 24 add.u64 %r1417,%r268,1; .loc 1 2687 7 add.u64 %r1470,%r591,%r592; .loc 1 2688 2 @ %r1303 bra $L247; bra $L381; $L34: @ %r1303 bra $L35; bra $L36; $L33: .loc 1 2692 28 set.u32.eq.u64 %r979,%r428,0; neg.s32 %r980,%r979; .loc 1 2692 13 set.u32.eq.u64 %r982,%r186,0; neg.s32 %r983,%r982; .loc 1 2692 28 cvt.u16.u32 %r985,%r980; cvt.u16.u32 %r986,%r983; or.b16 %r984,%r985,%r986; .loc 1 2692 23 set.u32.eq.u64 %r988,%r187,0; neg.s32 %r989,%r988; .loc 1 2692 28 cvt.u16.u32 %r993,%r989; or.b16 %r991,%r984,%r993; cvt.u32.u16 %r994,%r991; cvt.u16.u8 %r995,%r994; setp.eq.u16 %r996,%r995,0; @ ! %r996 bra $L1; .loc 1 2697 10 setp.eq.u64 %r997,%r178,1; selp.u64 %r198,%r476,%r178,%r997; .loc 1 2702 21 shl.b64 %r998,%r198,8; .loc 1 2702 14 add.u64 %r224,%r998,%r179; .loc 1 2706 12 min.s64 %r1000,%r224,65536; shl.b64 %r1001,%r1000,10011002,[%value_in]; } .loc 1 2710 7 setp.gt.s64 %r1004,%r187,0; @ %r1004 bra $L39; $L42: .loc 1 2910022918 7 bra $L1; $L39: .loc 1 2680 16 add.u64 %r1006,%r178,1; .loc 1 2681 9 shl.b64 %r107,%r1006,2; sub.u64 %r221,%r215,%r107; .loc 1 2683 16 add.u64 %r1007,%r179,1; .loc 1 2684 9 shl.b64 %r110,%r1007,2; shl.b64 %r565,%r179,9; shl.b64 %r567,%r179,11; shl.b64 %r577,%r179,3; sub.u64 %r568,%r216,%r110; add.u64 %r566,%r577,%r568; shl.b64 %r571,%r476,9; add.u64 %r572,%r476,%r476; add.u64 %r574,%r104,-4; neg.s64 %r1010,%r179; shl.b64 %r1011,%r1010,11; shl.b64 %r575,%r1010,2; shl.b64 %r579,%r476,11; shl.b64 %r1015,%r476,3; add.u64 %r1016,%r1015,4; add.u64 %r578,%r219,%r1016; add.u64 %r1017,%r187,-1; and.b64 %r587,%r1017,-512; add.u64 %r588,%r568,%r575; shl.b64 %r544,%r178,8; shl.b64 %r546,%r178,10; add.u64 %r547,%r107,-4; neg.s64 %r1019,%r178; shl.b64 %r1020,%r1019,10; shl.b64 %r1023,%r1019,3; shl.b64 %r509,%r178,3; add.u64 %r510,%r178,%r178; add.u64 %r202,%r110,-4; shl.b64 %r196,%r179,4; shl.b64 %r191,%r476,4; neg.s64 %r1026,%r476; shl.b64 %r1027,%r1026,2; shl.b64 %r1030,%r1010,4; mov.u64 %r573,%r574; mov.u64 %r569,%r572; mov.u64 %r563,%r179; mov.u64 %r561,0; setp.gt.s64 %r1295,%r428,0; add.u64 %r1339,%r588,4; setp.gt.s64 %r1340,%r186,0; add.u64 %r1341,%r186,1; add.u64 %r1342,%r221,8; bra $L41; $L138: mov.u64 %r561,%r562; $L41: add.u64 %r590,%r561,1; .loc 1 2715 9 sub.u64 %r226,%r187,%r561; min.s64 %r1471,%r226,512; .loc 1 2716 24 shr.s64 %r1032,%r1471,63; shr.u64 %r1033,%r1032,62; add.u64 %r1034,%r1471,%r1033; and.b64 %r1035,%r1034,3; sub.u64 %r1036,%r1035,%r1033; .loc 1 2716 10 sub.u64 %r227,%r1471,%r1036; .loc 1 2718 4 @ %r1295 bra $L40; $L45: .loc 1 2710 7 add.u64 %r562,%r561,512; add.u64 %r563,%r563,%r565; add.u64 %r566,%r566,%r567; add.u64 %r569,%r569,%r571; add.u64 %r573,%r573,%r571; add.u64 
%r575,%r575,%r1011; add.u64 %r577,%r577,%r567; add.u64 %r578,%r578,%r579; setp.ne.u64 %r1038,%r561,%r587; @ %r1038 bra $L138; bra $L42; $L40: add.u64 %r541,%r1339,%r577; .loc 1 2768 13 add.u64 %r452,%r227,%r590; add.u64 %r449,%r202,%r577; add.u64 %r447,%r202,%r449; add.u64 %r559,%r202,%r566; add.u64 %r560,%r202,%r559; mul.lo.u64 %r378,%r452,%r476; mul.lo.u64 %r371,%r452,%r179; mad.lo.u64 %r556,%r452,%r202,%r568; mov.u64 %r548,%r1023; mov.u64 %r545,%r547; mov.u64 %r542,%r178; .loc 1 2718 12 mov.u64 %r230,%r474; add.u64 %r1287,%r428,1; .loc 1 2768 8 add.u64 %r1332,%r452,-1; setp.ge.s64 %r1334,%r1332,%r590; sub.u64 %r1337,%r569,%r476; add.u64 %r1338,%r1337,%r572; $L44: .loc 1 2723 13 sub.u64 %r228,%r1287,%r230; min.s64 %r1473,%r228,256; .loc 1 2724 28 shr.u64 %r1044,%r1473,63; add.u64 %r1045,%r1473,%r1044; and.b64 %r1046,%r1045,1; sub.u64 %r1047,%r1046,%r1044; .loc 1 2724 14 sub.u64 %r229,%r1473,%r1047; .loc 1 2727 8 @ %r1340 bra $L43; $L73: .loc 1 2718 30 add.u64 %r230,%r230,256; .loc 1 2718 4 add.u64 %r541,%r541,1024; add.u64 %r542,%r542,%r544; add.u64 %r545,%r545,%r546; add.u64 %r548,%r548,%r1020; setp.ge.s64 %r1049,%r428,%r230; @ %r1049 bra $L44; bra $L45; $L43: .loc 1 2763 26 add.u64 %r1050,%r230,-1; add.u64 %r1051,%r1050,%r1473; .loc 1 2763 31 mul.lo.u64 %r286,%r1051,%r178; .loc 1 2734 13 add.u64 %r451,%r229,%r230; .loc 1 2734 8 add.u64 %r453,%r451,-1; add.u64 %r1052,%r286,1; shl.b64 %r1053,%r1052,2; add.u64 %r528,%r221,%r1053; add.u64 %r538,%r542,%r178; sub.u64 %r1054,%r538,%r286; shl.b64 %r540,%r1054,2; shl.b64 %r1056,%r1473,2; add.u64 %r1057,%r1056,-4; add.u64 %r302,%r1002,%r1057; add.u64 %r1058,%r230,%r563; add.u64 %r1059,%r1058,%r1473; shl.b64 %r1060,%r1059,2; add.u64 %r195,%r1060,%r568; shl.b64 %r558,%r230,2; add.u64 %r48,%r558,%r566; add.u64 %r46,%r558,%r559; add.u64 %r30,%r558,%r560; add.u64 %r557,%r556,%r558; add.u64 %r1061,%r371,%r230; add.u64 %r1062,%r1061,%r1473; shl.b64 %r1063,%r1062,2; add.u64 %r369,%r1063,%r568; .loc 1 2768 8 mov.u64 %r527,%r578; .loc 1 2727 16 mov.u64 %r245,%r474; setp.le.s64 %r1288,%r230,%r453; setp.gt.s64 %r1300,%r1473,%r229; setp.gt.s64 %r1299,%r1471,%r227; not.b64 %r1325,%r230; add.u64 %r1326,%r1325,%r451; and.b64 %r1327,%r1326,-2; .loc 1 2904 18 add.u64 %r1328,%r1473,%r230; .loc 1 2904 13 add.u64 %r1329,%r1328,-1; setp.gt.s64 %r1330,%r230,%r1329; $L72: .loc 1 2732 10 sub.u64 %r231,%r1341,%r245; min.s64 %r1474,%r231,256; .loc 1 2733 25 shr.s64 %r1305,%r1474,63; shr.u64 %r1067,%r1474,63; add.u64 %r1068,%r1474,%r1067; and.b64 %r1069,%r1068,1; sub.u64 %r1070,%r1069,%r1067; .loc 1 2733 11 sub.u64 %r232,%r1474,%r1070; .loc 1 2735 5 @ %r1288 bra $L46; $L55: .loc 1 2757 8 @ ! 
%r1300 bra $L48; bra $L47; $L46: .loc 1 2737 17 add.u64 %r456,%r232,%r245; .loc 1 2737 12 add.u64 %r457,%r456,-1; shl.b64 %r1074,%r1474,10; add.u64 %r1076,%r1074,-1024; add.u64 %r502,%r1002,%r1076; add.u64 %r508,%r528,%r540; add.u64 %r1079,%r1474,%r245; shl.b64 %r519,%r1079,2; shl.b64 %r523,%r1474,2; mov.u64 %r513,%r538; mov.u64 %r512,%r548; mov.u64 %r511,%r545; mov.u64 %r500,0; setp.le.s64 %r1293,%r245,%r457; setp.gt.s64 %r1301,%r1474,%r232; not.b64 %r1320,%r245; add.u64 %r1321,%r1320,%r456; and.b64 %r1322,%r1321,-2; add.u64 %r1323,%r1322,%r245; bra $L54; $L139: mov.u64 %r500,%r501; $L54: .loc 1 2738 9 @ %r1293 bra $L49; $L53: .loc 1 2749 12 @ %r1301 bra $L50; bra $L51; $L49: shl.b64 %r1082,%r500,2; add.u64 %r199,%r1002,%r1082; add.u64 %r1087,%r1323,%r513; shl.b64 %r1088,%r1087,2; add.u64 %r499,%r1088,%r1342; .loc 1 2738 9 mov.u64 %r181,%r508; sub.u64 %r1401,%r499,%r508; add.u64 %r1402,%r1401,-8; shr.u64 %r1400,%r1402,3; add.u64 %r1403,%r1400,1; and.b64 %r1404,%r1403,3; setp.eq.u64 %r1407,%r1404,0; @ %r1407 bra $L52; setp.eq.u64 %r1406,%r1404,1; @ %r1406 bra $L311; setp.eq.u64 %r1405,%r1404,2; @ %r1405 bra $L312; bra $L382; $L52: .loc 1 2741 7 add.u64 %r1090,%r181,%r511; add.u64 %r1091,%r1090,%r512; .loc 1 2740 49 ld.f32 %r1092,[%r1091]; st.f32 [%r199],%r1092; .loc 1 2742 49 ld.f32 %r1093,[%r181]; st.f32 [%r199+4],%r1093; .loc 1 2745 7 add.u64 %r1094,%r181,%r512; add.u64 %r1095,%r1094,%r511; .loc 1 2744 49 ld.f32 %r1096,[%r1095+4]; st.f32 [%r199+1024],%r1096; .loc 1 2746 49 ld.f32 %r1097,[%r181+4]; st.f32 [%r199+1028],%r1097; .loc 1 2738 9 add.u64 %r1408,%r199,2048; add.u64 %r1409,%r181,8; .loc 1 2741 7 add.u64 %r1477,%r1409,%r511; add.u64 %r1478,%r1477,%r512; .loc 1 2740 49 ld.f32 %r1479,[%r1478]; st.f32 [%r1408],%r1479; .loc 1 2742 49 ld.f32 %r1480,[%r1409]; st.f32 [%r1408+4],%r1480; .loc 1 2745 7 add.u64 %r1481,%r1409,%r512; add.u64 %r1482,%r1481,%r511; .loc 1 2744 49 ld.f32 %r1483,[%r1482+4]; st.f32 [%r1408+1024],%r1483; .loc 1 2746 49 ld.f32 %r1484,[%r1409+4]; st.f32 [%r1408+1028],%r1484; .loc 1 2738 9 add.u64 %r1485,%r199,4096; add.u64 %r1486,%r181,16; .loc 1 2741 7 add.u64 %r1488,%r1486,%r511; add.u64 %r1489,%r1488,%r512; .loc 1 2740 49 ld.f32 %r1490,[%r1489]; st.f32 [%r1485],%r1490; .loc 1 2742 49 ld.f32 %r1491,[%r1486]; st.f32 [%r1485+4],%r1491; .loc 1 2745 7 add.u64 %r1492,%r1486,%r512; add.u64 %r1493,%r1492,%r511; .loc 1 2744 49 ld.f32 %r1494,[%r1493+4]; st.f32 [%r1485+1024],%r1494; .loc 1 2746 49 ld.f32 %r1495,[%r1486+4]; st.f32 [%r1485+1028],%r1495; .loc 1 2738 9 add.u64 %r1496,%r199,6144; add.u64 %r1497,%r181,24; .loc 1 2741 7 add.u64 %r1499,%r1497,%r511; add.u64 %r1500,%r1499,%r512; .loc 1 2740 49 ld.f32 %r1501,[%r1500]; st.f32 [%r1496],%r1501; .loc 1 2742 49 ld.f32 %r1502,[%r1497]; st.f32 [%r1496+4],%r1502; .loc 1 2745 7 add.u64 %r1503,%r1497,%r512; add.u64 %r1504,%r1503,%r511; .loc 1 2744 49 ld.f32 %r1505,[%r1504+4]; st.f32 [%r1496+1024],%r1505; .loc 1 2746 49 ld.f32 %r1506,[%r1497+4]; st.f32 [%r1496+1028],%r1506; .loc 1 2738 9 add.u64 %r199,%r199,8192; add.u64 %r181,%r181,32; setp.ne.u64 %r1507,%r181,%r499; @ %r1507 bra $L52; bra $L53; $L50: .loc 1 2752 10 add.u64 %r1099,%r511,%r519; add.u64 %r1100,%r221,%r1099; .loc 1 2751 41 ld.f32 %r1101,[%r1100+-4]; st.f32 [%r502],%r1101; .loc 1 2754 10 add.u64 %r1102,%r508,%r523; .loc 1 2753 41 ld.f32 %r1103,[%r1102+-4]; st.f32 [%r502+4],%r1103; $L51: .loc 1 2735 5 add.u64 %r501,%r500,2; add.u64 %r502,%r502,8; add.u64 %r508,%r508,%r509; add.u64 %r511,%r511,%r509; add.u64 %r512,%r512,%r1023; add.u64 %r513,%r513,%r510; setp.ne.u64 
%r1104,%r500,%r1327; @ %r1104 bra $L139; bra $L55; $L48: .loc 1 2767 25 shr.u64 %r1107,%r1305,62; add.u64 %r1108,%r1474,%r1107; and.b64 %r1109,%r1108,3; sub.u64 %r1110,%r1109,%r1107; .loc 1 2767 11 sub.u64 %r235,%r1474,%r1110; .loc 1 2769 5 @ %r1334 bra $L56; bra $L57; $L47: .loc 1 2759 17 add.u64 %r132,%r1474,%r245; .loc 1 2759 12 add.u64 %r1112,%r132,-1; .loc 1 2760 9 setp.lt.s64 %r1113,%r1112,%r245; @ %r1113 bra $L48; add.u64 %r1114,%r132,%r286; shl.b64 %r1115,%r1114,2; add.u64 %r136,%r221,%r1115; mov.u64 %r73,%r302; mov.u64 %r57,%r528; sub.u64 %r1391,%r136,%r528; add.u64 %r1392,%r1391,-4; shr.u64 %r1390,%r1392,2; add.u64 %r1393,%r1390,1; and.b64 %r1394,%r1393,3; setp.eq.u64 %r1397,%r1394,0; @ %r1397 bra $L58; setp.eq.u64 %r1396,%r1394,1; @ %r1396 bra $L313; setp.eq.u64 %r1395,%r1394,2; @ %r1395 bra $L314; bra $L383; $L58: .loc 1 2762 43 ld.f32 %r1116,[%r57]; st.f32 [%r73],%r1116; ld.f32 %r1510,[%r57+4]; st.f32 [%r73+1024],%r1510; ld.f32 %r1514,[%r57+8]; st.f32 [%r73+2048],%r1514; ld.f32 %r1518,[%r57+12]; st.f32 [%r73+3072],%r1518; .loc 1 2760 9 add.u64 %r57,%r57,16; add.u64 %r73,%r73,4096; setp.ne.u64 %r1519,%r57,%r136; @ %r1519 bra $L58; bra $L48; $L57: .loc 1 2871 8 @ ! %r1299 bra $L60; bra $L59; $L56: .loc 1 2876 14 add.u64 %r460,%r235,%r245; .loc 1 2876 9 add.u64 %r461,%r460,-1; add.u64 %r192,%r527,%r1027; add.u64 %r89,%r527,%r574; shl.b64 %r1119,%r476,2; add.u64 %r42,%r1119,%r89; not.b64 %r1120,%r245; add.u64 %r1121,%r1120,%r460; shr.u64 %r1122,%r1121,2; add.u64 %r1123,%r1122,1; shl.b64 %r290,%r1123,4; add.u64 %r1124,%r1337,%r460; shl.b64 %r1125,%r1124,2; add.u64 %r440,%r1125,%r219; add.u64 %r1126,%r460,%r569; shl.b64 %r1127,%r1126,2; add.u64 %r435,%r1127,%r219; add.u64 %r1129,%r1338,%r460; shl.b64 %r1130,%r1129,2; add.u64 %r432,%r1130,%r219; add.u64 %r1131,%r460,%r573; shl.b64 %r1132,%r1131,2; add.u64 %r429,%r1132,%r219; shl.b64 %r341,%r235,8; shl.b64 %r443,%r1474,2; mov.u64 %r448,%r447; mov.u64 %r450,%r449; mov.u64 %r455,%r577; mov.u64 %r458,%r575; mov.u64 %r190,%r527; mov.u64 %r197,%r195; mov.u64 %r250,%r590; mov.u64 %r445,0; mov.u64 %r446,%r445; setp.le.s64 %r1294,%r245,%r461; setp.gt.s64 %r1302,%r1474,%r235; $L68: .loc 1 2772 9 @ %r1294 bra $L61; $L67: .loc 1 2843 12 @ ! %r1302 bra $L63; bra $L62; $L61: .loc 1 2904 13 mov.u64 %r313,0; mov.u64 %r314,4096; mov.u64 %r315,3072; mov.u64 %r316,1024; mov.u64 %r317,-2048; $L66: add.u64 %r311,%r192,%r313; .loc 1 2774 10 ld.f32 %r251,[%r311]; .loc 1 2775 10 ld.f32 %r252,[%r311+4]; add.u64 %r309,%r190,%r313; .loc 1 2776 10 ld.f32 %r253,[%r309]; .loc 1 2777 10 ld.f32 %r254,[%r309+4]; add.u64 %r307,%r89,%r313; .loc 1 2778 10 ld.f32 %r255,[%r307]; .loc 1 2779 10 ld.f32 %r256,[%r307+4]; add.u64 %r305,%r42,%r313; .loc 1 2780 10 ld.f32 %r257,[%r305]; .loc 1 2781 10 ld.f32 %r258,[%r305+4]; .loc 1 2782 10 ld.f32 %r259,[%r311+8]; .loc 1 2783 10 ld.f32 %r260,[%r311+12]; .loc 1 2784 10 ld.f32 %r261,[%r309+8]; .loc 1 2785 10 ld.f32 %r262,[%r309+12]; .loc 1 2786 10 ld.f32 %r263,[%r307+8]; .loc 1 2787 10 ld.f32 %r264,[%r307+12]; .loc 1 2788 10 ld.f32 %r265,[%r305+8]; .loc 1 2789 10 ld.f32 %r266,[%r305+12]; .loc 1 2791 6 @ %r1330 bra $L64; add.u64 %r333,%r445,%r541; add.u64 %r331,%r1002,%r316; sub.u64 %r1383,%r197,%r333; add.u64 %r1384,%r1383,-4; shr.u64 %r1382,%r1384,2; and.b64 %r1386,%r1382,1; setp.ne.u64 %r1387,%r1386,0; @ ! 
%r1387 bra $L384; $L65: add.u64 %r328,%r317,%r331; .loc 1 2793 19 add.u64 %r1137,%r328,%r316; ld.f32 %r137,[%r1137]; .loc 1 2794 14 ld.f32 %r138,[%r333]; .loc 1 2793 14 fma.rn.f32 %r1520,%r137,%r138,%r251; .loc 1 2795 19 ld.f32 %r139,[%r331]; .loc 1 2795 14 fma.rn.f32 %r1521,%r138,%r139,%r252; add.u64 %r323,%r333,%r458; .loc 1 2798 14 add.u64 %r1138,%r323,%r455; ld.f32 %r140,[%r1138]; .loc 1 2797 14 fma.rn.f32 %r1522,%r137,%r140,%r253; .loc 1 2799 14 fma.rn.f32 %r1523,%r139,%r140,%r254; .loc 1 2802 14 add.u64 %r1139,%r323,%r450; ld.f32 %r141,[%r1139]; .loc 1 2801 14 fma.rn.f32 %r1524,%r137,%r141,%r255; .loc 1 2803 14 fma.rn.f32 %r1525,%r139,%r141,%r256; .loc 1 2806 14 add.u64 %r1140,%r323,%r448; ld.f32 %r142,[%r1140]; .loc 1 2805 14 fma.rn.f32 %r1526,%r137,%r142,%r257; .loc 1 2807 14 fma.rn.f32 %r1527,%r139,%r142,%r258; .loc 1 2809 19 add.u64 %r1141,%r328,%r315; ld.f32 %r143,[%r1141]; .loc 1 2809 14 fma.rn.f32 %r1528,%r138,%r143,%r259; .loc 1 2811 19 add.u64 %r1142,%r328,%r314; ld.f32 %r144,[%r1142]; .loc 1 2811 14 fma.rn.f32 %r1529,%r138,%r144,%r260; .loc 1 2813 14 fma.rn.f32 %r1530,%r140,%r143,%r261; .loc 1 2815 14 fma.rn.f32 %r1531,%r140,%r144,%r262; .loc 1 2817 14 fma.rn.f32 %r1532,%r141,%r143,%r263; .loc 1 2819 14 fma.rn.f32 %r1533,%r141,%r144,%r264; .loc 1 2821 14 fma.rn.f32 %r1534,%r142,%r143,%r265; .loc 1 2823 14 fma.rn.f32 %r1535,%r142,%r144,%r266; .loc 1 2791 6 add.u64 %r1388,%r333,4; add.u64 %r1389,%r331,4; add.u64 %r1538,%r317,%r1389; .loc 1 2793 19 add.u64 %r1539,%r1538,%r316; ld.f32 %r1540,[%r1539]; .loc 1 2794 14 ld.f32 %r1541,[%r1388]; .loc 1 2793 14 fma.rn.f32 %r251,%r1540,%r1541,%r1520; .loc 1 2795 19 ld.f32 %r1542,[%r1389]; .loc 1 2795 14 fma.rn.f32 %r252,%r1541,%r1542,%r1521; add.u64 %r1543,%r1388,%r458; .loc 1 2798 14 add.u64 %r1544,%r1543,%r455; ld.f32 %r1545,[%r1544]; .loc 1 2797 14 fma.rn.f32 %r253,%r1540,%r1545,%r1522; .loc 1 2799 14 fma.rn.f32 %r254,%r1542,%r1545,%r1523; .loc 1 2802 14 add.u64 %r1546,%r1543,%r450; ld.f32 %r1547,[%r1546]; .loc 1 2801 14 fma.rn.f32 %r255,%r1540,%r1547,%r1524; .loc 1 2803 14 fma.rn.f32 %r256,%r1542,%r1547,%r1525; .loc 1 2806 14 add.u64 %r1548,%r1543,%r448; ld.f32 %r1549,[%r1548]; .loc 1 2805 14 fma.rn.f32 %r257,%r1540,%r1549,%r1526; .loc 1 2807 14 fma.rn.f32 %r258,%r1542,%r1549,%r1527; .loc 1 2809 19 add.u64 %r1550,%r1538,%r315; ld.f32 %r1551,[%r1550]; .loc 1 2809 14 fma.rn.f32 %r259,%r1541,%r1551,%r1528; .loc 1 2811 19 add.u64 %r1552,%r1538,%r314; ld.f32 %r1553,[%r1552]; .loc 1 2811 14 fma.rn.f32 %r260,%r1541,%r1553,%r1529; .loc 1 2813 14 fma.rn.f32 %r261,%r1545,%r1551,%r1530; .loc 1 2815 14 fma.rn.f32 %r262,%r1545,%r1553,%r1531; .loc 1 2817 14 fma.rn.f32 %r263,%r1547,%r1551,%r1532; .loc 1 2819 14 fma.rn.f32 %r264,%r1547,%r1553,%r1533; .loc 1 2821 14 fma.rn.f32 %r265,%r1549,%r1551,%r1534; .loc 1 2823 14 fma.rn.f32 %r266,%r1549,%r1553,%r1535; .loc 1 2791 6 add.u64 %r333,%r333,8; add.u64 %r331,%r331,8; setp.ne.u64 %r1554,%r197,%r333; @ %r1554 bra $L65; $L64: .loc 1 2826 24 st.f32 [%r311],%r251; .loc 1 2827 28 st.f32 [%r311+4],%r252; .loc 1 2828 30 st.f32 [%r309],%r253; .loc 1 2829 34 st.f32 [%r309+4],%r254; .loc 1 2830 30 st.f32 [%r307],%r255; .loc 1 2831 34 st.f32 [%r307+4],%r256; .loc 1 2832 30 st.f32 [%r305],%r257; .loc 1 2833 34 st.f32 [%r305+4],%r258; .loc 1 2834 28 st.f32 [%r311+8],%r259; .loc 1 2835 28 st.f32 [%r311+12],%r260; .loc 1 2836 34 st.f32 [%r309+8],%r261; .loc 1 2837 34 st.f32 [%r309+12],%r262; .loc 1 2838 34 st.f32 [%r307+8],%r263; .loc 1 2839 34 st.f32 [%r307+12],%r264; .loc 1 2840 34 st.f32 [%r305+8],%r265; .loc 1 
2841 34 st.f32 [%r305+12],%r266; .loc 1 2772 9 add.u64 %r317,%r317,-4096; add.u64 %r316,%r316,4096; add.u64 %r315,%r315,4096; add.u64 %r314,%r314,4096; add.u64 %r313,%r313,16; setp.ne.u64 %r1144,%r290,%r313; @ %r1144 bra $L66; bra $L67; $L63: .loc 1 2769 29 add.u64 %r250,%r250,4; .loc 1 2769 5 add.u64 %r197,%r197,%r196; add.u64 %r192,%r192,%r191; add.u64 %r190,%r190,%r191; add.u64 %r89,%r89,%r191; add.u64 %r42,%r42,%r191; add.u64 %r458,%r458,%r1030; add.u64 %r455,%r455,%r196; add.u64 %r450,%r450,%r196; add.u64 %r448,%r448,%r196; add.u64 %r446,%r446,%r191; add.u64 %r445,%r445,%r196; setp.le.s64 %r1145,%r250,%r1332; @ %r1145 bra $L68; bra $L57; $L62: add.u64 %r346,%r440,%r446; add.u64 %r345,%r435,%r446; add.u64 %r344,%r432,%r446; add.u64 %r343,%r429,%r446; add.u64 %r49,%r48,%r445; add.u64 %r211,%r46,%r445; add.u64 %r212,%r30,%r445; add.u64 %r444,%r192,%r443; .loc 1 2904 13 mov.u64 %r342,%r341; $L71: .loc 1 2848 14 ld.f32 %r246,[%r346]; .loc 1 2849 14 ld.f32 %r247,[%r345]; .loc 1 2850 14 ld.f32 %r248,[%r344]; .loc 1 2851 14 ld.f32 %r249,[%r343]; .loc 1 2853 10 @ %r1330 bra $L69; shl.b64 %r1149,%r342,2; add.u64 %r361,%r1002,%r1149; add.u64 %r359,%r445,%r541; mov.u64 %r353,%r212; mov.u64 %r355,%r211; mov.u64 %r357,%r49; sub.u64 %r1370,%r197,%r359; add.u64 %r1371,%r1370,-4; shr.u64 %r1369,%r1371,2; add.u64 %r1372,%r1369,1; and.b64 %r1373,%r1372,3; setp.eq.u64 %r1376,%r1373,0; @ %r1376 bra $L70; setp.eq.u64 %r1375,%r1373,1; @ %r1375 bra $L315; setp.eq.u64 %r1374,%r1373,2; @ %r1374 bra $L316; bra $L385; $L70: .loc 1 2855 16 ld.f32 %r146,[%r361]; .loc 1 2855 11 ld.f32 %r1150,[%r359]; fma.rn.f32 %r1555,%r146,%r1150,%r246; .loc 1 2857 11 ld.f32 %r1151,[%r357]; fma.rn.f32 %r1556,%r146,%r1151,%r247; .loc 1 2859 11 ld.f32 %r1152,[%r355]; fma.rn.f32 %r1557,%r146,%r1152,%r248; .loc 1 2861 11 ld.f32 %r1153,[%r353]; fma.rn.f32 %r1558,%r146,%r1153,%r249; .loc 1 2855 16 ld.f32 %r1564,[%r361+4]; .loc 1 2855 11 ld.f32 %r1565,[%r359+4]; fma.rn.f32 %r1566,%r1564,%r1565,%r1555; .loc 1 2857 11 ld.f32 %r1567,[%r357+4]; fma.rn.f32 %r1568,%r1564,%r1567,%r1556; .loc 1 2859 11 ld.f32 %r1569,[%r355+4]; fma.rn.f32 %r1570,%r1564,%r1569,%r1557; .loc 1 2861 11 ld.f32 %r1571,[%r353+4]; fma.rn.f32 %r1572,%r1564,%r1571,%r1558; .loc 1 2855 16 ld.f32 %r1579,[%r361+8]; .loc 1 2855 11 ld.f32 %r1580,[%r359+8]; fma.rn.f32 %r1581,%r1579,%r1580,%r1566; .loc 1 2857 11 ld.f32 %r1582,[%r357+8]; fma.rn.f32 %r1583,%r1579,%r1582,%r1568; .loc 1 2859 11 ld.f32 %r1584,[%r355+8]; fma.rn.f32 %r1585,%r1579,%r1584,%r1570; .loc 1 2861 11 ld.f32 %r1586,[%r353+8]; fma.rn.f32 %r1587,%r1579,%r1586,%r1572; .loc 1 2855 16 ld.f32 %r1594,[%r361+12]; .loc 1 2855 11 ld.f32 %r1595,[%r359+12]; fma.rn.f32 %r246,%r1594,%r1595,%r1581; .loc 1 2857 11 ld.f32 %r1596,[%r357+12]; fma.rn.f32 %r247,%r1594,%r1596,%r1583; .loc 1 2859 11 ld.f32 %r1597,[%r355+12]; fma.rn.f32 %r248,%r1594,%r1597,%r1585; .loc 1 2861 11 ld.f32 %r1598,[%r353+12]; fma.rn.f32 %r249,%r1594,%r1598,%r1587; .loc 1 2853 10 add.u64 %r361,%r361,16; add.u64 %r359,%r359,16; add.u64 %r357,%r357,16; add.u64 %r355,%r355,16; add.u64 %r353,%r353,16; setp.ne.u64 %r1599,%r197,%r359; @ %r1599 bra $L70; $L69: .loc 1 2864 28 st.f32 [%r346],%r246; .loc 1 2865 34 st.f32 [%r345],%r247; .loc 1 2866 34 st.f32 [%r344],%r248; .loc 1 2867 34 st.f32 [%r343],%r249; .loc 1 2846 6 add.u64 %r346,%r346,4; add.u64 %r345,%r345,4; add.u64 %r344,%r344,4; add.u64 %r343,%r343,4; add.u64 %r342,%r342,256; setp.ne.u64 %r1155,%r346,%r444; @ %r1155 bra $L71; bra $L63; $L60: .loc 1 2727 34 add.u64 %r245,%r245,256; .loc 1 2727 8 add.u64 
%r527,%r527,1024; add.u64 %r528,%r528,1024; setp.ge.s64 %r1156,%r186,%r245; @ %r1156 bra $L72; bra $L73; $L59: .loc 1 2873 17 add.u64 %r151,%r1471,%r590; .loc 1 2874 9 setp.le.s64 %r1157,%r151,%r452; @ %r1157 bra $L60; .loc 1 2876 14 add.u64 %r467,%r235,%r245; .loc 1 2876 9 add.u64 %r468,%r467,-1; .loc 1 2759 17 add.u64 %r377,%r1474,%r245; add.u64 %r1158,%r377,%r378; shl.b64 %r1159,%r1158,2; add.u64 %r379,%r1159,%r219; neg.s64 %r1161,%r377; shl.b64 %r1162,%r1161,2; shl.b64 %r1163,%r245,2; add.u64 %r362,%r1163,%r1162; shl.b64 %r1164,%r467,2; add.u64 %r364,%r1164,%r1162; shl.b64 %r406,%r235,8; .loc 1 2876 9 mov.u64 %r370,%r369; mov.u64 %r372,%r557; mov.u64 %r373,%r378; mov.u64 %r239,%r452; setp.gt.s64 %r1600,%r1474,%r235; setp.le.s64 %r1296,%r245,%r468; not.b64 %r1312,%r245; add.u64 %r1313,%r1312,%r467; and.b64 %r1314,%r1313,-4; add.u64 %r1315,%r1314,%r245; $L81: .loc 1 2877 6 @ %r1296 bra $L74; $L80: .loc 1 2901 6 @ %r1600 bra $L75; bra $L76; $L74: add.u64 %r391,%r362,%r379; add.u64 %r1172,%r1315,%r373; shl.b64 %r1173,%r1172,2; add.u64 %r1174,%r219,16; add.u64 %r381,%r1173,%r1174; .loc 1 2904 13 mov.u64 %r387,4096; mov.u64 %r388,3072; mov.u64 %r389,1024; mov.u64 %r390,-2048; $L79: .loc 1 2879 14 ld.f32 %r241,[%r391]; .loc 1 2880 14 ld.f32 %r242,[%r391+4]; .loc 1 2881 14 ld.f32 %r243,[%r391+8]; .loc 1 2882 14 ld.f32 %r244,[%r391+12]; .loc 1 2884 10 @ %r1330 bra $L77; add.u64 %r400,%r1002,%r389; mov.u64 %r401,%r372; sub.u64 %r1360,%r370,%r372; add.u64 %r1361,%r1360,-4; shr.u64 %r1359,%r1361,2; add.u64 %r1362,%r1359,1; and.b64 %r1363,%r1362,3; setp.eq.u64 %r1366,%r1363,0; @ %r1366 bra $L78; setp.eq.u64 %r1365,%r1363,1; @ %r1365 bra $L317; setp.eq.u64 %r1364,%r1363,2; @ %r1364 bra $L318; bra $L386; $L78: add.u64 %r397,%r390,%r400; .loc 1 2887 16 ld.f32 %r153,[%r401]; .loc 1 2886 16 add.u64 %r1176,%r397,%r389; .loc 1 2886 11 ld.f32 %r1177,[%r1176]; fma.rn.f32 %r1601,%r1177,%r153,%r241; .loc 1 2888 11 ld.f32 %r1178,[%r400]; fma.rn.f32 %r1602,%r153,%r1178,%r242; .loc 1 2890 16 add.u64 %r1179,%r397,%r388; .loc 1 2890 11 ld.f32 %r1180,[%r1179]; fma.rn.f32 %r1603,%r153,%r1180,%r243; .loc 1 2892 16 add.u64 %r1181,%r397,%r387; .loc 1 2892 11 ld.f32 %r1182,[%r1181]; fma.rn.f32 %r1604,%r153,%r1182,%r244; .loc 1 2884 10 add.u64 %r1368,%r400,4; add.u64 %r1607,%r390,%r1368; .loc 1 2887 16 ld.f32 %r1608,[%r401+4]; .loc 1 2886 16 add.u64 %r1609,%r1607,%r389; .loc 1 2886 11 ld.f32 %r1610,[%r1609]; fma.rn.f32 %r1611,%r1610,%r1608,%r1601; .loc 1 2888 11 ld.f32 %r1612,[%r1368]; fma.rn.f32 %r1613,%r1608,%r1612,%r1602; .loc 1 2890 16 add.u64 %r1614,%r1607,%r388; .loc 1 2890 11 ld.f32 %r1615,[%r1614]; fma.rn.f32 %r1616,%r1608,%r1615,%r1603; .loc 1 2892 16 add.u64 %r1617,%r1607,%r387; .loc 1 2892 11 ld.f32 %r1618,[%r1617]; fma.rn.f32 %r1619,%r1608,%r1618,%r1604; .loc 1 2884 10 add.u64 %r1621,%r400,8; add.u64 %r1623,%r390,%r1621; .loc 1 2887 16 ld.f32 %r1624,[%r401+8]; .loc 1 2886 16 add.u64 %r1625,%r1623,%r389; .loc 1 2886 11 ld.f32 %r1626,[%r1625]; fma.rn.f32 %r1627,%r1626,%r1624,%r1611; .loc 1 2888 11 ld.f32 %r1628,[%r1621]; fma.rn.f32 %r1629,%r1624,%r1628,%r1613; .loc 1 2890 16 add.u64 %r1630,%r1623,%r388; .loc 1 2890 11 ld.f32 %r1631,[%r1630]; fma.rn.f32 %r1632,%r1624,%r1631,%r1616; .loc 1 2892 16 add.u64 %r1633,%r1623,%r387; .loc 1 2892 11 ld.f32 %r1634,[%r1633]; fma.rn.f32 %r1635,%r1624,%r1634,%r1619; .loc 1 2884 10 add.u64 %r1637,%r400,12; add.u64 %r1639,%r390,%r1637; .loc 1 2887 16 ld.f32 %r1640,[%r401+12]; .loc 1 2886 16 add.u64 %r1641,%r1639,%r389; .loc 1 2886 11 ld.f32 %r1642,[%r1641]; fma.rn.f32 
%r241,%r1642,%r1640,%r1627; .loc 1 2888 11 ld.f32 %r1643,[%r1637]; fma.rn.f32 %r242,%r1640,%r1643,%r1629; .loc 1 2890 16 add.u64 %r1644,%r1639,%r388; .loc 1 2890 11 ld.f32 %r1645,[%r1644]; fma.rn.f32 %r243,%r1640,%r1645,%r1632; .loc 1 2892 16 add.u64 %r1646,%r1639,%r387; .loc 1 2892 11 ld.f32 %r1647,[%r1646]; fma.rn.f32 %r244,%r1640,%r1647,%r1635; .loc 1 2884 10 add.u64 %r401,%r401,16; add.u64 %r400,%r400,16; setp.ne.u64 %r1648,%r370,%r401; @ %r1648 bra $L78; $L77: .loc 1 2895 28 st.f32 [%r391],%r241; .loc 1 2896 32 st.f32 [%r391+4],%r242; .loc 1 2897 32 st.f32 [%r391+8],%r243; .loc 1 2898 32 st.f32 [%r391+12],%r244; .loc 1 2877 6 add.u64 %r391,%r391,16; add.u64 %r390,%r390,-4096; add.u64 %r389,%r389,4096; add.u64 %r388,%r388,4096; add.u64 %r387,%r387,4096; setp.ne.u64 %r1184,%r381,%r391; @ %r1184 bra $L79; bra $L80; $L76: .loc 1 2874 39 add.u64 %r239,%r239,1; .loc 1 2874 9 add.u64 %r379,%r379,%r574; add.u64 %r373,%r373,%r476; add.u64 %r372,%r372,%r202; add.u64 %r370,%r370,%r202; setp.ne.u64 %r1185,%r151,%r239; @ %r1185 bra $L81; bra $L60; $L75: add.u64 %r408,%r364,%r379; .loc 1 2904 13 mov.u64 %r407,%r406; $L84: .loc 1 2903 14 ld.f32 %r240,[%r408]; .loc 1 2905 10 @ %r1330 bra $L82; shl.b64 %r1188,%r407,2; add.u64 %r304,%r1002,%r1188; mov.u64 %r269,%r372; sub.u64 %r1350,%r370,%r372; add.u64 %r1351,%r1350,-4; shr.u64 %r1349,%r1351,2; add.u64 %r1352,%r1349,1; and.b64 %r1353,%r1352,3; setp.eq.u64 %r1356,%r1353,0; @ %r1356 bra $L83; setp.eq.u64 %r1355,%r1353,1; @ %r1355 bra $L319; setp.eq.u64 %r1354,%r1353,2; @ %r1354 bra $L320; bra $L387; $L83: .loc 1 2907 11 ld.f32 %r1189,[%r304]; ld.f32 %r1190,[%r269]; fma.rn.f32 %r1649,%r1189,%r1190,%r240; ld.f32 %r1652,[%r304+4]; ld.f32 %r1653,[%r269+4]; fma.rn.f32 %r1654,%r1652,%r1653,%r1649; ld.f32 %r1658,[%r304+8]; ld.f32 %r1659,[%r269+8]; fma.rn.f32 %r1660,%r1658,%r1659,%r1654; ld.f32 %r1664,[%r304+12]; ld.f32 %r1665,[%r269+12]; fma.rn.f32 %r240,%r1664,%r1665,%r1660; .loc 1 2905 10 add.u64 %r304,%r304,16; add.u64 %r269,%r269,16; setp.ne.u64 %r1666,%r370,%r269; @ %r1666 bra $L83; $L82: .loc 1 2910 28 st.f32 [%r408],%r240; .loc 1 2901 6 add.u64 %r408,%r408,4; add.u64 %r407,%r407,256; setp.ne.u64 %r1192,%r379,%r408; @ %r1192 bra $L84; bra $L76; $L20: .loc 1 2958 11 cvt.u16.u32 %r1193,%r418; setp.eq.u16 %r1194,%r1193,1; @ ! 
%r1194 bra $L388; bra $L85; $L31: .loc 1 2920 38 set.u32.eq.u64 %r1196,%r178,1; neg.s32 %r1197,%r1196; .loc 1 2920 26 cvt.u16.u32 %r1199,%r1197; and.b16 %r1198,%r1199,%r959; .loc 1 2920 11 cvt.u32.u16 %r1201,%r1198; cvt.u16.u8 %r1202,%r1201; setp.eq.u16 %r1203,%r1202,0; @ %r1203 bra $L20; .loc 1 2920 43 setp.ne.u64 %r1204,%r474,1; @ %r1204 bra $L20; $L121: .loc 1 2922 10 cvt.u16.u32 %r1205,%r418; setp.ne.u16 %r1206,%r1205,1; @ %r1206 bra $L87; .loc 1 2948 4 setp.gt.s64 %r1207,%r187,0; @ %r1207 bra $L88; bra $L1; $L87: .loc 1 2929 4 setp.le.s64 %r1208,%r187,0; @ %r1208 bra $L1; shl.b64 %r614,%r476,2; shl.b64 %r1209,%r186,2; add.u64 %r613,%r217,%r1209; shl.b64 %r604,%r475,2; shl.b64 %r1210,%r428,2; add.u64 %r605,%r215,%r1210; mov.u64 %r618,0; mov.u64 %r611,%r618; .loc 1 2929 11 mov.u64 %r272,%r618; setp.gt.s64 %r1667,%r186,0; setp.le.s64 %r1346,%r428,0; $L91: .loc 1 2933 8 @ %r1667 bra $L90; $L95: .loc 1 2929 29 add.u64 %r272,%r272,1; .loc 1 2929 4 add.u64 %r611,%r611,%r476; add.u64 %r613,%r613,%r614; add.u64 %r618,%r618,%r179; setp.ne.u64 %r1212,%r187,%r272; @ %r1212 bra $L91; bra $L1; $L90: shl.b64 %r1213,%r611,2; add.u64 %r600,%r217,%r1213; shl.b64 %r1214,%r618,2; add.u64 %r623,%r216,%r1214; .loc 1 2933 8 mov.u64 %r602,%r605; mov.u64 %r601,0; $L94: .loc 1 2937 5 @ %r1346 bra $L140; shl.b64 %r1216,%r601,2; add.u64 %r596,%r215,%r1216; mov.u64 %r597,%r623; .loc 1 2936 7 mov.f32 %r273,0f00000000; sub.u64 %r1419,%r602,%r596; add.u64 %r1420,%r1419,-4; shr.u64 %r1418,%r1420,2; add.u64 %r1421,%r1418,1; and.b64 %r1422,%r1421,3; setp.eq.u64 %r1425,%r1422,0; @ %r1425 bra $L93; setp.eq.u64 %r1424,%r1422,1; @ %r1424 bra $L321; setp.eq.u64 %r1423,%r1422,2; @ %r1423 bra $L322; bra $L389; $L93: .loc 1 2938 9 ld.f32 %r1217,[%r596]; ld.f32 %r1218,[%r597]; fma.rn.f32 %r1668,%r1217,%r1218,%r273; ld.f32 %r1671,[%r596+4]; ld.f32 %r1672,[%r597+4]; fma.rn.f32 %r1673,%r1671,%r1672,%r1668; ld.f32 %r1677,[%r596+8]; ld.f32 %r1678,[%r597+8]; fma.rn.f32 %r1679,%r1677,%r1678,%r1673; ld.f32 %r1683,[%r596+12]; ld.f32 %r1684,[%r597+12]; fma.rn.f32 %r273,%r1683,%r1684,%r1679; .loc 1 2937 5 add.u64 %r596,%r596,16; add.u64 %r597,%r597,16; setp.ne.u64 %r1685,%r596,%r602; @ %r1685 bra $L93; bra $L92; $L140: .loc 1 2936 7 mov.f32 %r273,0f00000000; $L92: .loc 1 2939 15 st.f32 [%r600],%r273; .loc 1 2933 8 add.u64 %r600,%r600,4; add.u64 %r601,%r601,%r475; add.u64 %r602,%r602,%r604; setp.ne.u64 %r1220,%r600,%r613; @ %r1220 bra $L94; bra $L95; $L88: shl.b64 %r634,%r476,2; mov.u64 %r632,%r217; shl.b64 %r626,%r475,2; .loc 1 2948 4 mov.u64 %r635,0; .loc 1 2948 11 mov.u64 %r270,%r635; setp.le.s64 %r1686,%r428,0; $L98: .loc 1 2952 8 @ %r1686 bra $L141; mov.u64 %r624,%r215; shl.b64 %r1222,%r635,2; add.u64 %r627,%r1222,%r216; add.u64 %r1223,%r428,%r635; shl.b64 %r1224,%r1223,2; add.u64 %r643,%r1224,%r216; .loc 1 2951 10 mov.f32 %r271,0f00000000; sub.u64 %r1429,%r643,%r627; add.u64 %r1430,%r1429,-4; shr.u64 %r1428,%r1430,2; add.u64 %r1431,%r1428,1; and.b64 %r1432,%r1431,3; setp.eq.u64 %r1435,%r1432,0; @ %r1435 bra $L97; setp.eq.u64 %r1434,%r1432,1; @ %r1434 bra $L323; setp.eq.u64 %r1433,%r1432,2; @ %r1433 bra $L324; bra $L390; $L97: .loc 1 2953 5 ld.f32 %r1225,[%r624]; ld.f32 %r1226,[%r627]; fma.rn.f32 %r1687,%r1225,%r1226,%r271; .loc 1 2952 8 add.u64 %r1688,%r624,%r626; .loc 1 2953 5 ld.f32 %r1690,[%r1688]; ld.f32 %r1691,[%r627+4]; fma.rn.f32 %r1692,%r1690,%r1691,%r1687; .loc 1 2952 8 add.u64 %r1693,%r1688,%r626; .loc 1 2953 5 ld.f32 %r1696,[%r1693]; ld.f32 %r1697,[%r627+8]; fma.rn.f32 %r1698,%r1696,%r1697,%r1692; .loc 1 2952 8 add.u64 
%r1699,%r1693,%r626; .loc 1 2953 5 ld.f32 %r1702,[%r1699]; ld.f32 %r1703,[%r627+12]; fma.rn.f32 %r271,%r1702,%r1703,%r1698; .loc 1 2952 8 add.u64 %r624,%r1699,%r626; add.u64 %r627,%r627,16; setp.ne.u64 %r1704,%r627,%r643; @ %r1704 bra $L97; bra $L96; $L141: .loc 1 2951 10 mov.f32 %r271,0f00000000; $L96: .loc 1 2954 25 st.f32 [%r632],%r271; .loc 1 2948 29 add.u64 %r270,%r270,1; .loc 1 2948 4 add.u64 %r632,%r632,%r634; add.u64 %r635,%r635,%r179; setp.ne.u64 %r1228,%r187,%r270; @ %r1228 bra $L98; bra $L1; $L85: .loc 1 2963 7 setp.le.s64 %r1229,%r187,0; @ %r1229 bra $L1; shl.b64 %r654,%r416,2; mov.u64 %r652,%r217; shl.b64 %r646,%r475,2; shl.b64 %r649,%r474,2; mov.u64 %r655,0; .loc 1 2963 14 mov.u64 %r283,%r655; setp.le.s64 %r1705,%r428,0; $L101: .loc 1 2967 4 @ %r1705 bra $L142; mov.u64 %r644,%r215; shl.b64 %r1231,%r655,2; add.u64 %r647,%r216,%r1231; .loc 1 2966 6 mov.f32 %r284,0f00000000; .loc 1 2967 11 mov.u64 %r285,0; and.b64 %r1440,%r428,3; setp.eq.u64 %r1443,%r1440,0; @ %r1443 bra $L100; setp.eq.u64 %r1442,%r1440,1; @ %r1442 bra $L325; setp.eq.u64 %r1441,%r1440,2; @ %r1441 bra $L326; bra $L391; $L100: .loc 1 2968 8 ld.f32 %r1232,[%r644]; ld.f32 %r1233,[%r647]; fma.rn.f32 %r1706,%r1232,%r1233,%r284; .loc 1 2967 4 add.u64 %r1708,%r644,%r646; add.u64 %r1709,%r647,%r649; .loc 1 2968 8 ld.f32 %r1710,[%r1708]; ld.f32 %r1711,[%r1709]; fma.rn.f32 %r1712,%r1710,%r1711,%r1706; .loc 1 2967 4 add.u64 %r1714,%r1708,%r646; add.u64 %r1715,%r1709,%r649; .loc 1 2968 8 ld.f32 %r1717,[%r1714]; ld.f32 %r1718,[%r1715]; fma.rn.f32 %r1719,%r1717,%r1718,%r1712; .loc 1 2967 4 add.u64 %r1721,%r1714,%r646; add.u64 %r1722,%r1715,%r649; .loc 1 2968 8 ld.f32 %r1724,[%r1721]; ld.f32 %r1725,[%r1722]; fma.rn.f32 %r284,%r1724,%r1725,%r1719; .loc 1 2967 28 add.u64 %r285,%r285,4; .loc 1 2967 4 add.u64 %r644,%r1721,%r646; add.u64 %r647,%r1722,%r649; setp.ne.u64 %r1726,%r428,%r285; @ %r1726 bra $L100; bra $L99; $L142: .loc 1 2966 6 mov.f32 %r284,0f00000000; $L99: .loc 1 2969 21 st.f32 [%r652],%r284; .loc 1 2963 32 add.u64 %r283,%r283,1; .loc 1 2963 7 add.u64 %r652,%r652,%r654; add.u64 %r655,%r655,%r179; setp.ne.u64 %r1235,%r187,%r283; @ %r1235 bra $L101; bra $L1; $L388: .loc 1 2972 11 setp.gt.s64 %r1236,%r178,%r475; @ %r1236 bra $L102; .loc 1 2993 7 setp.gt.s64 %r1237,%r187,0; @ %r1237 bra $L103; bra $L1; $L102: .loc 1 2974 7 setp.gt.s64 %r1238,%r187,0; @ %r1238 bra $L104; bra $L1; $L108: shl.b64 %r1239,%r688,2; add.u64 %r684,%r217,%r1239; .loc 1 2975 9 mov.u64 %r282,0; and.b64 %r1456,%r186,3; setp.eq.u64 %r1459,%r1456,0; @ %r1459 bra $L105; setp.eq.u64 %r1458,%r1456,1; @ %r1458 bra $L327; setp.eq.u64 %r1457,%r1456,2; @ %r1457 bra $L328; bra $L392; $L105: .loc 1 2976 34 st.f32 [%r684],%r1347; .loc 1 2975 2 add.u64 %r1728,%r684,%r686; .loc 1 2976 34 st.f32 [%r1728],%r1347; .loc 1 2975 2 add.u64 %r1730,%r1728,%r686; .loc 1 2976 34 st.f32 [%r1730],%r1347; .loc 1 2975 2 add.u64 %r1733,%r1730,%r686; .loc 1 2976 34 st.f32 [%r1733],%r1347; .loc 1 2975 27 add.u64 %r282,%r282,4; .loc 1 2975 2 add.u64 %r684,%r1733,%r686; setp.ne.u64 %r1735,%r186,%r282; @ %r1735 bra $L105; $L109: .loc 1 2974 32 add.u64 %r281,%r296,1; .loc 1 2974 7 add.u64 %r688,%r688,%r476; setp.ne.u64 %r1242,%r187,%r281; @ %r1242 bra $L143; shl.b64 %r672,%r474,2; shl.b64 %r663,%r475,2; mov.u64 %r677,0; mov.u64 %r676,%r677; mov.u64 %r297,%r677; setp.le.s64 %r1736,%r428,0; bra $L107; $L104: shl.b64 %r686,%r416,2; mov.u64 %r688,0; .loc 1 2974 14 mov.u64 %r296,%r688; setp.gt.s64 %r1737,%r186,0; .loc 1 2976 34 mov.f32 %r1347,0f00000000; bra $L106; $L143: mov.u64 
%r296,%r281; $L106: .loc 1 2975 2 @ %r1737 bra $L108; bra $L109; $L112: shl.b64 %r1244,%r668,2; add.u64 %r661,%r215,%r1244; .loc 1 2980 4 mov.u64 %r660,%r681; .loc 1 2980 11 mov.u64 %r280,0; and.b64 %r1448,%r186,3; setp.eq.u64 %r1451,%r1448,0; @ %r1451 bra $L110; setp.eq.u64 %r1450,%r1448,1; @ %r1450 bra $L329; setp.eq.u64 %r1449,%r1448,2; @ %r1449 bra $L330; bra $L393; $L110: .loc 1 2982 36 ld.f32 %r1245,[%r661]; ld.f32 %r1246,[%r670]; ld.f32 %r1247,[%r660]; fma.rn.f32 %r175,%r1245,%r1246,%r1247; st.f32 [%r660],%r175; .loc 1 2980 4 add.u64 %r1739,%r660,%r686; add.u64 %r1740,%r661,%r663; .loc 1 2982 36 ld.f32 %r1741,[%r1740]; ld.f32 %r1742,[%r670]; ld.f32 %r1743,[%r1739]; fma.rn.f32 %r1744,%r1741,%r1742,%r1743; st.f32 [%r1739],%r1744; .loc 1 2980 4 add.u64 %r1746,%r1739,%r686; add.u64 %r1747,%r1740,%r663; .loc 1 2982 36 ld.f32 %r1749,[%r1747]; ld.f32 %r1750,[%r670]; ld.f32 %r1751,[%r1746]; fma.rn.f32 %r1752,%r1749,%r1750,%r1751; st.f32 [%r1746],%r1752; .loc 1 2980 4 add.u64 %r1754,%r1746,%r686; add.u64 %r1755,%r1747,%r663; .loc 1 2982 36 ld.f32 %r1757,[%r1755]; ld.f32 %r1758,[%r670]; ld.f32 %r1759,[%r1754]; fma.rn.f32 %r1760,%r1757,%r1758,%r1759; st.f32 [%r1754],%r1760; .loc 1 2980 29 add.u64 %r280,%r280,4; .loc 1 2980 4 add.u64 %r660,%r1754,%r686; add.u64 %r661,%r1755,%r663; setp.ne.u64 %r1761,%r186,%r280; @ %r1761 bra $L110; $L113: .loc 1 2979 26 add.u64 %r279,%r279,1; .loc 1 2979 2 add.u64 %r668,%r668,%r178; add.u64 %r670,%r670,%r672; setp.eq.u64 %r1249,%r428,%r279; @ %r1249 bra $L111; $L114: .loc 1 2980 4 @ %r1737 bra $L112; bra $L113; $L111: .loc 1 2978 32 add.u64 %r278,%r297,1; .loc 1 2978 7 add.u64 %r676,%r676,%r476; add.u64 %r677,%r677,%r179; setp.eq.u64 %r1251,%r296,%r297; @ %r1251 bra $L1; mov.u64 %r297,%r278; $L107: .loc 1 2979 2 @ %r1736 bra $L111; shl.b64 %r1253,%r677,2; add.u64 %r670,%r216,%r1253; shl.b64 %r1254,%r676,2; add.u64 %r681,%r217,%r1254; mov.u64 %r668,0; .loc 1 2979 9 mov.u64 %r279,%r668; bra $L114; $L103: shl.b64 %r703,%r416,2; shl.b64 %r694,%r178,2; shl.b64 %r698,%r474,2; .loc 1 2993 7 mov.u64 %r712,0; mov.u64 %r710,%r712; .loc 1 2993 14 mov.u64 %r274,%r712; setp.gt.s64 %r1762,%r186,0; setp.le.s64 %r1348,%r428,0; $L116: .loc 1 2997 4 @ %r1762 bra $L115; $L120: .loc 1 2993 32 add.u64 %r274,%r274,1; .loc 1 2993 7 add.u64 %r710,%r710,%r476; add.u64 %r712,%r712,%r179; setp.ne.u64 %r1256,%r187,%r274; @ %r1256 bra $L116; bra $L1; $L115: shl.b64 %r1257,%r710,2; add.u64 %r701,%r217,%r1257; shl.b64 %r1258,%r712,2; add.u64 %r718,%r216,%r1258; .loc 1 2997 4 mov.u64 %r705,0; .loc 1 2997 11 mov.u64 %r275,%r705; $L119: .loc 1 3001 8 @ %r1348 bra $L144; shl.b64 %r1260,%r705,2; add.u64 %r692,%r215,%r1260; mov.u64 %r696,%r718; .loc 1 3000 10 mov.f32 %r276,0f00000000; .loc 1 3001 15 mov.u64 %r277,0; and.b64 %r1464,%r428,3; setp.eq.u64 %r1467,%r1464,0; @ %r1467 bra $L118; setp.eq.u64 %r1466,%r1464,1; @ %r1466 bra $L331; setp.eq.u64 %r1465,%r1464,2; @ %r1465 bra $L332; bra $L394; $L118: .loc 1 3002 5 ld.f32 %r1261,[%r692]; ld.f32 %r1262,[%r696]; fma.rn.f32 %r1764,%r1261,%r1262,%r276; .loc 1 3001 8 add.u64 %r1766,%r692,%r694; add.u64 %r1767,%r696,%r698; .loc 1 3002 5 ld.f32 %r1768,[%r1766]; ld.f32 %r1769,[%r1767]; fma.rn.f32 %r1770,%r1768,%r1769,%r1764; .loc 1 3001 8 add.u64 %r1772,%r1766,%r694; add.u64 %r1773,%r1767,%r698; .loc 1 3002 5 ld.f32 %r1775,[%r1772]; ld.f32 %r1776,[%r1773]; fma.rn.f32 %r1777,%r1775,%r1776,%r1770; .loc 1 3001 8 add.u64 %r1779,%r1772,%r694; add.u64 %r1780,%r1773,%r698; .loc 1 3002 5 ld.f32 %r1782,[%r1779]; ld.f32 %r1783,[%r1780]; fma.rn.f32 
%r276,%r1782,%r1783,%r1777; .loc 1 3001 32 add.u64 %r277,%r277,4; .loc 1 3001 8 add.u64 %r692,%r1779,%r694; add.u64 %r696,%r1780,%r698; setp.ne.u64 %r1784,%r428,%r277; @ %r1784 bra $L118; bra $L117; $L144: .loc 1 3000 10 mov.f32 %r276,0f00000000; $L117: .loc 1 3003 27 st.f32 [%r701],%r276; .loc 1 2997 29 add.u64 %r275,%r275,1; .loc 1 2997 4 add.u64 %r701,%r701,%r703; add.u64 %r705,%r705,%r475; setp.ne.u64 %r1264,%r186,%r275; @ %r1264 bra $L119; bra $L120; $L32: .loc 1 2920 11 setp.eq.u64 %r1265,%r178,1; @ %r1265 bra $L121; bra $L20; $L2: .loc 1 2476 6 ld.u64 %r1266,[%r719]; setp.ne.u64 %r1267,%r1266,0; @ %r1267 bra $L122; .loc 1 2485 4 ld.u64 %r1269,[%r720+56]; add.u64 %r1268,%r1269,1; ld.u64 %r1270,[%r720+48]; sub.u64 %r480,%r1268,%r1270; .loc 1 2483 15 ld.s8 %r1272,[%r721+28]; cvt.u16.u32 %r1271,%r1272; setp.eq.u16 %r1273,%r1271,1; @ %r1273 bra $L123; bra $L124; $L126: .loc 1 2560 16 ld.u64 %r475,[%r720+40]; .loc 1 2548 27 mov.u64 %r416,%r476; bra $L125; $L14: ld.u64 %r476,[%r719+40]; .loc 1 2543 6 ld.s8 %r1275,[%r719+28]; cvt.u16.u32 %r1274,%r1275; setp.eq.u16 %r1276,%r1274,1; @ ! %r1276 bra $L127; bra $L126; $L11: .loc 1 2560 16 ld.u64 %r475,[%r720+40]; mov.u32 %r418,%r303; .loc 1 2548 27 mov.u64 %r476,%r416; .loc 1 2563 14 mov.u64 %r186,1; .loc 1 2561 16 mov.u64 %r178,%r186; bra $L12; $L5: .loc 1 2485 4 ld.u64 %r1278,[%r720+56]; add.u64 %r1277,%r1278,1; ld.u64 %r1279,[%r720+48]; sub.u64 %r480,%r1277,%r1279; bra $L124; $L128: .loc 1 2509 17 ld.u64 %r1281,[%r719+56]; add.u64 %r1280,%r1281,1; .loc 1 2509 15 ld.u64 %r1282,[%r719+48]; sub.u64 %r428,%r1280,%r1282; ld.s8 %r1286,[%r721+28]; bra $L9; $L122: .loc 1 2502 12 cvta.global.u64 %r1283,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r1284,[%r1283+36]; setp.ne.u32 %r1285,%r1284,0; @ ! %r1285 bra $L8; bra $L128; $L1: .loc 1 3007 1 ret; $L387: .loc 1 2907 11 ld.f32 %r1785,[%r304]; ld.f32 %r1786,[%r372]; fma.rn.f32 %r240,%r1785,%r1786,%r240; .loc 1 2905 10 add.u64 %r304,%r304,4; add.u64 %r269,%r372,4; $L320: .loc 1 2907 11 ld.f32 %r1788,[%r304]; ld.f32 %r1789,[%r269]; fma.rn.f32 %r240,%r1788,%r1789,%r240; .loc 1 2905 10 add.u64 %r304,%r304,4; add.u64 %r269,%r269,4; $L319: .loc 1 2907 11 ld.f32 %r1791,[%r304]; ld.f32 %r1792,[%r269]; fma.rn.f32 %r240,%r1791,%r1792,%r240; .loc 1 2905 10 add.u64 %r304,%r304,4; add.u64 %r269,%r269,4; setp.ne.u64 %r1793,%r370,%r269; @ %r1793 bra $L83; bra $L82; $L386: add.u64 %r1794,%r390,%r400; .loc 1 2887 16 ld.f32 %r1795,[%r372]; .loc 1 2886 16 add.u64 %r1796,%r1794,%r389; .loc 1 2886 11 ld.f32 %r1797,[%r1796]; fma.rn.f32 %r241,%r1797,%r1795,%r241; .loc 1 2888 11 ld.f32 %r1798,[%r400]; fma.rn.f32 %r242,%r1795,%r1798,%r242; .loc 1 2890 16 add.u64 %r1799,%r1794,%r388; .loc 1 2890 11 ld.f32 %r1800,[%r1799]; fma.rn.f32 %r243,%r1795,%r1800,%r243; .loc 1 2892 16 add.u64 %r1801,%r1794,%r387; .loc 1 2892 11 ld.f32 %r1802,[%r1801]; fma.rn.f32 %r244,%r1795,%r1802,%r244; .loc 1 2884 10 add.u64 %r401,%r372,4; add.u64 %r400,%r400,4; $L318: add.u64 %r1804,%r390,%r400; .loc 1 2887 16 ld.f32 %r1805,[%r401]; .loc 1 2886 16 add.u64 %r1806,%r1804,%r389; .loc 1 2886 11 ld.f32 %r1807,[%r1806]; fma.rn.f32 %r241,%r1807,%r1805,%r241; .loc 1 2888 11 ld.f32 %r1808,[%r400]; fma.rn.f32 %r242,%r1805,%r1808,%r242; .loc 1 2890 16 add.u64 %r1809,%r1804,%r388; .loc 1 2890 11 ld.f32 %r1810,[%r1809]; fma.rn.f32 %r243,%r1805,%r1810,%r243; .loc 1 2892 16 add.u64 %r1811,%r1804,%r387; .loc 1 2892 11 ld.f32 %r1812,[%r1811]; fma.rn.f32 %r244,%r1805,%r1812,%r244; .loc 1 2884 10 add.u64 %r401,%r401,4; add.u64 %r400,%r400,4; $L317: add.u64 
%r1814,%r390,%r400; .loc 1 2887 16 ld.f32 %r1815,[%r401]; .loc 1 2886 16 add.u64 %r1816,%r1814,%r389; .loc 1 2886 11 ld.f32 %r1817,[%r1816]; fma.rn.f32 %r241,%r1817,%r1815,%r241; .loc 1 2888 11 ld.f32 %r1818,[%r400]; fma.rn.f32 %r242,%r1815,%r1818,%r242; .loc 1 2890 16 add.u64 %r1819,%r1814,%r388; .loc 1 2890 11 ld.f32 %r1820,[%r1819]; fma.rn.f32 %r243,%r1815,%r1820,%r243; .loc 1 2892 16 add.u64 %r1821,%r1814,%r387; .loc 1 2892 11 ld.f32 %r1822,[%r1821]; fma.rn.f32 %r244,%r1815,%r1822,%r244; .loc 1 2884 10 add.u64 %r401,%r401,4; add.u64 %r400,%r400,4; setp.ne.u64 %r1823,%r370,%r401; @ %r1823 bra $L78; bra $L77; $L385: .loc 1 2855 16 ld.f32 %r1824,[%r361]; .loc 1 2855 11 ld.f32 %r1825,[%r359]; fma.rn.f32 %r246,%r1824,%r1825,%r246; .loc 1 2857 11 ld.f32 %r1826,[%r49]; fma.rn.f32 %r247,%r1824,%r1826,%r247; .loc 1 2859 11 ld.f32 %r1827,[%r211]; fma.rn.f32 %r248,%r1824,%r1827,%r248; .loc 1 2861 11 ld.f32 %r1828,[%r212]; fma.rn.f32 %r249,%r1824,%r1828,%r249; .loc 1 2853 10 add.u64 %r361,%r361,4; add.u64 %r359,%r359,4; add.u64 %r357,%r49,4; add.u64 %r355,%r211,4; add.u64 %r353,%r212,4; $L316: .loc 1 2855 16 ld.f32 %r1830,[%r361]; .loc 1 2855 11 ld.f32 %r1831,[%r359]; fma.rn.f32 %r246,%r1830,%r1831,%r246; .loc 1 2857 11 ld.f32 %r1832,[%r357]; fma.rn.f32 %r247,%r1830,%r1832,%r247; .loc 1 2859 11 ld.f32 %r1833,[%r355]; fma.rn.f32 %r248,%r1830,%r1833,%r248; .loc 1 2861 11 ld.f32 %r1834,[%r353]; fma.rn.f32 %r249,%r1830,%r1834,%r249; .loc 1 2853 10 add.u64 %r361,%r361,4; add.u64 %r359,%r359,4; add.u64 %r357,%r357,4; add.u64 %r355,%r355,4; add.u64 %r353,%r353,4; $L315: .loc 1 2855 16 ld.f32 %r1836,[%r361]; .loc 1 2855 11 ld.f32 %r1837,[%r359]; fma.rn.f32 %r246,%r1836,%r1837,%r246; .loc 1 2857 11 ld.f32 %r1838,[%r357]; fma.rn.f32 %r247,%r1836,%r1838,%r247; .loc 1 2859 11 ld.f32 %r1839,[%r355]; fma.rn.f32 %r248,%r1836,%r1839,%r248; .loc 1 2861 11 ld.f32 %r1840,[%r353]; fma.rn.f32 %r249,%r1836,%r1840,%r249; .loc 1 2853 10 add.u64 %r361,%r361,4; add.u64 %r359,%r359,4; add.u64 %r357,%r357,4; add.u64 %r355,%r355,4; add.u64 %r353,%r353,4; setp.ne.u64 %r1841,%r197,%r359; @ %r1841 bra $L70; bra $L69; $L384: add.u64 %r1842,%r317,%r331; .loc 1 2793 19 add.u64 %r1843,%r1842,%r316; ld.f32 %r1844,[%r1843]; .loc 1 2794 14 ld.f32 %r1845,[%r333]; .loc 1 2793 14 fma.rn.f32 %r251,%r1844,%r1845,%r251; .loc 1 2795 19 ld.f32 %r1846,[%r331]; .loc 1 2795 14 fma.rn.f32 %r252,%r1845,%r1846,%r252; add.u64 %r1847,%r333,%r458; .loc 1 2798 14 add.u64 %r1848,%r1847,%r455; ld.f32 %r1849,[%r1848]; .loc 1 2797 14 fma.rn.f32 %r253,%r1844,%r1849,%r253; .loc 1 2799 14 fma.rn.f32 %r254,%r1846,%r1849,%r254; .loc 1 2802 14 add.u64 %r1850,%r1847,%r450; ld.f32 %r1851,[%r1850]; .loc 1 2801 14 fma.rn.f32 %r255,%r1844,%r1851,%r255; .loc 1 2803 14 fma.rn.f32 %r256,%r1846,%r1851,%r256; .loc 1 2806 14 add.u64 %r1852,%r1847,%r448; ld.f32 %r1853,[%r1852]; .loc 1 2805 14 fma.rn.f32 %r257,%r1844,%r1853,%r257; .loc 1 2807 14 fma.rn.f32 %r258,%r1846,%r1853,%r258; .loc 1 2809 19 add.u64 %r1854,%r1842,%r315; ld.f32 %r1855,[%r1854]; .loc 1 2809 14 fma.rn.f32 %r259,%r1845,%r1855,%r259; .loc 1 2811 19 add.u64 %r1856,%r1842,%r314; ld.f32 %r1857,[%r1856]; .loc 1 2811 14 fma.rn.f32 %r260,%r1845,%r1857,%r260; .loc 1 2813 14 fma.rn.f32 %r261,%r1849,%r1855,%r261; .loc 1 2815 14 fma.rn.f32 %r262,%r1849,%r1857,%r262; .loc 1 2817 14 fma.rn.f32 %r263,%r1851,%r1855,%r263; .loc 1 2819 14 fma.rn.f32 %r264,%r1851,%r1857,%r264; .loc 1 2821 14 fma.rn.f32 %r265,%r1853,%r1855,%r265; .loc 1 2823 14 fma.rn.f32 %r266,%r1853,%r1857,%r266; .loc 1 2791 6 add.u64 %r333,%r333,4; add.u64 
%r331,%r331,4; setp.ne.u64 %r1858,%r197,%r333; @ %r1858 bra $L65; bra $L64; $L383: .loc 1 2762 43 ld.f32 %r1859,[%r528]; st.f32 [%r302],%r1859; .loc 1 2760 9 add.u64 %r57,%r528,4; add.u64 %r73,%r302,1024; $L314: .loc 1 2762 43 ld.f32 %r1861,[%r57]; st.f32 [%r73],%r1861; .loc 1 2760 9 add.u64 %r57,%r57,4; add.u64 %r73,%r73,1024; $L313: .loc 1 2762 43 ld.f32 %r1863,[%r57]; st.f32 [%r73],%r1863; .loc 1 2760 9 add.u64 %r57,%r57,4; add.u64 %r73,%r73,1024; setp.ne.u64 %r1864,%r57,%r136; @ %r1864 bra $L58; bra $L48; $L382: .loc 1 2741 7 add.u64 %r1865,%r508,%r511; add.u64 %r1866,%r1865,%r512; .loc 1 2740 49 ld.f32 %r1867,[%r1866]; st.f32 [%r199],%r1867; .loc 1 2742 49 ld.f32 %r1868,[%r508]; st.f32 [%r199+4],%r1868; .loc 1 2745 7 add.u64 %r1869,%r508,%r512; add.u64 %r1870,%r1869,%r511; .loc 1 2744 49 ld.f32 %r1871,[%r1870+4]; st.f32 [%r199+1024],%r1871; .loc 1 2746 49 ld.f32 %r1872,[%r508+4]; st.f32 [%r199+1028],%r1872; .loc 1 2738 9 add.u64 %r199,%r199,2048; add.u64 %r181,%r508,8; $L312: .loc 1 2741 7 add.u64 %r1874,%r181,%r511; add.u64 %r1875,%r1874,%r512; .loc 1 2740 49 ld.f32 %r1876,[%r1875]; st.f32 [%r199],%r1876; .loc 1 2742 49 ld.f32 %r1877,[%r181]; st.f32 [%r199+4],%r1877; .loc 1 2745 7 add.u64 %r1878,%r181,%r512; add.u64 %r1879,%r1878,%r511; .loc 1 2744 49 ld.f32 %r1880,[%r1879+4]; st.f32 [%r199+1024],%r1880; .loc 1 2746 49 ld.f32 %r1881,[%r181+4]; st.f32 [%r199+1028],%r1881; .loc 1 2738 9 add.u64 %r199,%r199,2048; add.u64 %r181,%r181,8; $L311: .loc 1 2741 7 add.u64 %r1883,%r181,%r511; add.u64 %r1884,%r1883,%r512; .loc 1 2740 49 ld.f32 %r1885,[%r1884]; st.f32 [%r199],%r1885; .loc 1 2742 49 ld.f32 %r1886,[%r181]; st.f32 [%r199+4],%r1886; .loc 1 2745 7 add.u64 %r1887,%r181,%r512; add.u64 %r1888,%r1887,%r511; .loc 1 2744 49 ld.f32 %r1889,[%r1888+4]; st.f32 [%r199+1024],%r1889; .loc 1 2746 49 ld.f32 %r1890,[%r181+4]; st.f32 [%r199+1028],%r1890; .loc 1 2738 9 add.u64 %r199,%r199,2048; add.u64 %r181,%r181,8; setp.ne.u64 %r1891,%r181,%r499; @ %r1891 bra $L52; bra $L53; $L380: .loc 1 2688 2 @ ! %r1303 bra $L395; .loc 1 2689 22 mov.u32 %r188921393,[%value_in]; } $L395: .loc 1 2687 24 mov.u64 %r268,2; .loc 1 2687 7 add.u64 %r591,%r217,%r592; $L310: .loc 1 2688 2 @ ! %r1303 bra $L396; .loc 1 2689 22 mov.u32 %r188951396,[%value_in]; } $L396: .loc 1 2687 24 add.u64 %r268,%r268,1; .loc 1 2687 7 add.u64 %r591,%r591,%r592; $L309: .loc 1 2688 2 @ ! %r1303 bra $L397; .loc 1 2689 22 mov.u32 %r18891399,[%value_in]; } $L397: .loc 1 2687 24 add.u64 %r268,%r268,1; .loc 1 2687 7 add.u64 %r591,%r591,%r592; setp.eq.u64 %r1900,%r268,%r595; @ ! %r1900 bra $L34; bra $L33; $L247: .loc 1 2689 22 mov.u32 %r190114901131902,[%value_in]; } $L381: .loc 1 2687 7 add.u64 %r1904,%r1470,%r592; .loc 1 2688 2 @ ! %r1303 bra $L398; .loc 1 2689 22 mov.u32 %r19061904906131907,[%value_in]; } $L398: .loc 1 2687 7 add.u64 %r1909,%r1904,%r592; .loc 1 2688 2 @ ! %r1303 bra $L399; .loc 1 2689 22 mov.u32 %r19111909911131912,[%value_in]; } $L399: .loc 1 2687 24 add.u64 %r268,%r1417,3; .loc 1 2687 7 add.u64 %r591,%r1909,%r592; setp.eq.u64 %r1913,%r268,%r595; @ ! 
%r1913 bra $L34; bra $L33; $L389: .loc 1 2938 9 ld.f32 %r1914,[%r596]; ld.f32 %r1915,[%r623]; fma.rn.f32 %r273,%r1914,%r1915,%r273; .loc 1 2937 5 add.u64 %r596,%r596,4; add.u64 %r597,%r623,4; $L322: .loc 1 2938 9 ld.f32 %r1917,[%r596]; ld.f32 %r1918,[%r597]; fma.rn.f32 %r273,%r1917,%r1918,%r273; .loc 1 2937 5 add.u64 %r596,%r596,4; add.u64 %r597,%r597,4; $L321: .loc 1 2938 9 ld.f32 %r1920,[%r596]; ld.f32 %r1921,[%r597]; fma.rn.f32 %r273,%r1920,%r1921,%r273; .loc 1 2937 5 add.u64 %r596,%r596,4; add.u64 %r597,%r597,4; setp.ne.u64 %r1922,%r596,%r602; @ %r1922 bra $L93; bra $L92; $L390: .loc 1 2953 5 ld.f32 %r1923,[%r215]; ld.f32 %r1924,[%r627]; fma.rn.f32 %r271,%r1923,%r1924,%r271; .loc 1 2952 8 add.u64 %r624,%r215,%r626; add.u64 %r627,%r627,4; $L324: .loc 1 2953 5 ld.f32 %r1926,[%r624]; ld.f32 %r1927,[%r627]; fma.rn.f32 %r271,%r1926,%r1927,%r271; .loc 1 2952 8 add.u64 %r624,%r624,%r626; add.u64 %r627,%r627,4; $L323: .loc 1 2953 5 ld.f32 %r1929,[%r624]; ld.f32 %r1930,[%r627]; fma.rn.f32 %r271,%r1929,%r1930,%r271; .loc 1 2952 8 add.u64 %r624,%r624,%r626; add.u64 %r627,%r627,4; setp.ne.u64 %r1931,%r627,%r643; @ %r1931 bra $L97; bra $L96; $L391: .loc 1 2968 8 ld.f32 %r1932,[%r215]; ld.f32 %r1933,[%r647]; fma.rn.f32 %r284,%r1932,%r1933,%r284; .loc 1 2967 28 mov.u64 %r285,1; .loc 1 2967 4 add.u64 %r644,%r215,%r646; add.u64 %r647,%r647,%r649; $L326: .loc 1 2968 8 ld.f32 %r1935,[%r644]; ld.f32 %r1936,[%r647]; fma.rn.f32 %r284,%r1935,%r1936,%r284; .loc 1 2967 28 add.u64 %r285,%r285,1; .loc 1 2967 4 add.u64 %r644,%r644,%r646; add.u64 %r647,%r647,%r649; $L325: .loc 1 2968 8 ld.f32 %r1938,[%r644]; ld.f32 %r1939,[%r647]; fma.rn.f32 %r284,%r1938,%r1939,%r284; .loc 1 2967 28 add.u64 %r285,%r285,1; .loc 1 2967 4 add.u64 %r644,%r644,%r646; add.u64 %r647,%r647,%r649; setp.ne.u64 %r1940,%r428,%r285; @ %r1940 bra $L100; bra $L99; $L393: .loc 1 2982 36 ld.f32 %r1941,[%r661]; ld.f32 %r1942,[%r670]; ld.f32 %r1943,[%r681]; fma.rn.f32 %r1944,%r1941,%r1942,%r1943; st.f32 [%r681],%r1944; .loc 1 2980 29 mov.u64 %r280,1; .loc 1 2980 4 add.u64 %r660,%r681,%r686; add.u64 %r661,%r661,%r663; $L330: .loc 1 2982 36 ld.f32 %r1946,[%r661]; ld.f32 %r1947,[%r670]; ld.f32 %r1948,[%r660]; fma.rn.f32 %r1949,%r1946,%r1947,%r1948; st.f32 [%r660],%r1949; .loc 1 2980 29 add.u64 %r280,%r280,1; .loc 1 2980 4 add.u64 %r660,%r660,%r686; add.u64 %r661,%r661,%r663; $L329: .loc 1 2982 36 ld.f32 %r1951,[%r661]; ld.f32 %r1952,[%r670]; ld.f32 %r1953,[%r660]; fma.rn.f32 %r1954,%r1951,%r1952,%r1953; st.f32 [%r660],%r1954; .loc 1 2980 29 add.u64 %r280,%r280,1; .loc 1 2980 4 add.u64 %r660,%r660,%r686; add.u64 %r661,%r661,%r663; setp.ne.u64 %r1955,%r186,%r280; @ %r1955 bra $L110; bra $L113; $L392: .loc 1 2976 34 st.f32 [%r684],%r1347; .loc 1 2975 27 mov.u64 %r282,1; .loc 1 2975 2 add.u64 %r684,%r684,%r686; $L328: .loc 1 2976 34 st.f32 [%r684],%r1347; .loc 1 2975 27 add.u64 %r282,%r282,1; .loc 1 2975 2 add.u64 %r684,%r684,%r686; $L327: .loc 1 2976 34 st.f32 [%r684],%r1347; .loc 1 2975 27 add.u64 %r282,%r282,1; .loc 1 2975 2 add.u64 %r684,%r684,%r686; setp.ne.u64 %r1958,%r186,%r282; @ %r1958 bra $L105; bra $L109; $L394: .loc 1 3002 5 ld.f32 %r1959,[%r692]; ld.f32 %r1960,[%r718]; fma.rn.f32 %r276,%r1959,%r1960,%r276; .loc 1 3001 32 mov.u64 %r277,1; .loc 1 3001 8 add.u64 %r692,%r692,%r694; add.u64 %r696,%r718,%r698; $L332: .loc 1 3002 5 ld.f32 %r1962,[%r692]; ld.f32 %r1963,[%r696]; fma.rn.f32 %r276,%r1962,%r1963,%r276; .loc 1 3001 32 add.u64 %r277,%r277,1; .loc 1 3001 8 add.u64 %r692,%r692,%r694; add.u64 %r696,%r696,%r698; $L331: .loc 1 3002 5 ld.f32 
%r1965,[%r692]; ld.f32 %r1966,[%r696]; fma.rn.f32 %r276,%r1965,%r1966,%r276; .loc 1 3001 32 add.u64 %r277,%r277,1; .loc 1 3001 8 add.u64 %r692,%r692,%r694; add.u64 %r696,%r696,%r698; setp.ne.u64 %r1967,%r428,%r277; @ %r1967 bra $L118; bra $L117; } matmul_r8.o/_gfortran_matmul_r8 .visible .func _gfortran_matmul_r8, .param .u32 %in_ar4, .param .u64 %in_ar5); .file 1 "../../../libgfortran/generated/matmul_r8.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]FUNCTION DECL: _gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime_97,116,109,117,108,95,114,56,0 }2] = {782] = {672] = {8471,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,97,41,32,61,61,32,50,32,124,124,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,98,41,32,61,61,3243116,114,97,110,47,103,101,110,101,114,97,116,101,100,47,109,97,116,109,117,108,95,114,56,465,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,49,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3265,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,50,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3273,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,97,114,103,117,109,101,110,116,32,66,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,49,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }103,101,109,109,32,33,61,32,78,85,76,7_gfortran_matmul_r8 .visible .func _gfortran_matmul_r889f64 %r240; .reg .f64 %r241; .reg .f64 %r242; .reg .u64 %r243; .reg .f64 %r244; .reg .f64f64 %r250; .reg .f64 %r251; .reg .f64 %r252; .reg .f64 %r253; .reg .f64 %r254; .reg .f64 %r255; .reg .f64 %r256; .reg .f64 %r257; .reg .f64 %r258; .reg .f64 %r259; .reg .f64 %r260; .reg .f64 %r261; .reg .f64 %r262; .reg .f64 %r263; .reg .f64 %r264; .reg .u64 %r266; .reg .u64 %r267; .reg .u64 %r268; .reg .f64 %r269; .reg .u64u64 %r275; .reg .u64u64 %r280; .reg .u64 %r281; .reg .f64u64 %r312; .reg .u64u64 %r321; .reg .u64 %r326; .reg .u64 %r329; .reg .u64 %r331; .reg .u64u64u64 %r414; .reg .u32 %r416; .reg .u64 %r426; .reg .u64 %r427; .reg .u64u64 %r500; .reg .u64u64 %r511; .reg .u64 %r517; .reg .u64 %r521; .reg .u64u64u64u64 %r571; .reg .u64 %r572; .reg .u64 %r573; .reg .u64 %r575; .reg .u64 %r576; .reg .u64 %r577; .reg .u64 %r585; .reg .u64 %r586; .reg .u64 %r588; .reg .u64 %r589; .reg .u64 %r590; .reg .u64 %r591; .reg .u64u64 %r603; .reg .u64 %r604; .reg .u64 %r610; .reg .u64 %r612; .reg .u64 %r613; .reg .u64 %r617; .reg .u64 %r622; .reg .u64 %r623; .reg .u64 %r625; .reg .u64 %r626; .reg .u64 %r631; .reg .u64 %r633; .reg .u64 %r634; .reg .u64 %r642; .reg .u64 %r643; .reg .u64 %r645; .reg .u64 %r646; .reg .u64 %r648; .reg .u64 %r651; .reg .u64 %r653; .reg .u64 %r654; .reg .u64 %r659; .reg .u64 %r660; .reg .u64 %r662; .reg .u64 %r667; .reg .u64 %r669; .reg .u64 %r671; .reg .u64 %r675; .reg .u64 %r676; .reg .u64 %r680; .reg .u64 %r683; .reg .u64 %r685; .reg .u64 %r687; .reg .u64 %r691; .reg .u64 %r693; .reg .u64 %r695; .reg .u64 %r697; .reg .u64 %r700; .reg .u64 %r702; .reg .u64 %r704; .reg .u64 %r709; .reg .u64 %r711; .reg .u64 %r717; .reg .u64 %r718; .reg .u64 %r719; .reg .u64 %r720; .reg .u32 
%r721; .reg .u32 %r722; .reg .u64 %r723; .reg .u16 %r724; .reg .pred %r725; .reg .u16 %r726; .reg .pred %r728; .reg .u64 %r729; .reg .u32 %r730; .reg .u64 %r731; .reg .u64 %r732; .reg .u64 %r733; .reg .pred %r734; .reg .pred %r736; .reg .u64 %r739; .reg .u64 %r741; .reg .u64 %r742; .reg .u64 %r743; .reg .u64 %r745; .reg .u64 %r746; .reg .u64 %r747; .reg .u64 %r748; .reg .u64 %r749; .reg .u64 %r752; .reg .u64 %r754; .reg .u64 %r755; .reg .u64 %r757; .reg .u64 %r759; .reg .u64 %r760; .reg .u64 %r762; .reg .u64 %r763; .reg .u64 %r764; .reg .u64 %r765; .reg .u64 %r766; .reg .u32 %r767; .reg .pred %r768; .reg .u64 %r769; .reg .u64 %r770; .reg .u64 %r771; .reg .pred %r773; .reg .u64 %r774; .reg .u64 %r775; .reg .u64 %r776; .reg .pred %r777; .reg .u64 %r778; .reg .u64 %r779; .reg .u64 %r780; .reg .u16 %r781; .reg .pred %r782; .reg .u64 %r783; .reg .u64 %r785; .reg .u64 %r786; .reg .u64 %r787; .reg .u16 %r788; .reg .pred %r790; .reg .pred %r791; .reg .u64 %r792; .reg .pred %r794; .reg .u64 %r795; .reg .u64 %r797; .reg .u64 %r798; .reg .u64 %r799; .reg .u64 %r800; .reg .u64 %r801; .reg .u64 %r802; .reg .pred %r803; .reg .u64 %r804; .reg .u64 %r806; .reg .u64 %r807; .reg .u64 %r808; .reg .u16 %r809; .reg .u32 %r810; .reg .pred %r811; .reg .u16 %r812; .reg .pred %r813; .reg .u64 %r814; .reg .u64 %r815; .reg .u64 %r816; .reg .u64 %r817; .reg .u64 %r818; .reg .u64 %r819; .reg .pred %r820; .reg .u32 %r822; .reg .u32 %r823; .reg .u32 %r825; .reg .u32 %r826; .reg .u16 %r827; .reg .u16 %r828; .reg .u16 %r829; .reg .u32 %r830; .reg .u16 %r831; .reg .pred %r832; .reg .u64 %r833; .reg .u16 %r835; .reg .pred %r836; .reg .u64 %r837; .reg .u64 %r838; .reg .u64 %r839; .reg .u32 %r840; .reg .u32 %r841; .reg .u32 %r842; .reg .u32 %r843; .reg .u32 %r844; .reg .u32 %r845; .reg .u32 %r847; .reg .u32 %r848; .reg .u16 %r849; .reg .u16 %r850; .reg .u16 %r851; .reg .u32 %r852; .reg .u16 %r853; .reg .pred %r854; .reg .u32 %r856; .reg .u32 %r857; .reg .u16 %r858; .reg .u16 %r859; .reg .u16 %r860; .reg .u32 %r861; .reg .u16 %r862; .reg .pred %r863; .reg .u32 %r865; .reg .u32 %r866; .reg .u32 %r868; .reg .u32 %r869; .reg .u16 %r870; .reg .u16 %r871; .reg .u16 %r872; .reg .u32 %r873; .reg .u16 %r874; .reg .pred %r875; .reg .f32 %r876; .reg .f32 %r877; .reg .f32 %r878; .reg .f32 %r879; .reg .f32 %r880; .reg .f32 %r881; .reg .f32 %r882; .reg .pred %r883; .reg .f64 %r884; .reg .f64 %r885; .reg .pred %r886; .reg .pred %r887; .reg .u32 %r889; .reg .u32 %r890; .reg .u32 %r892; .reg .u32 %r893; .reg .u16 %r894; .reg .u16 %r895; .reg .u16 %r896; .reg .u32 %r897; .reg .u16 %r898; .reg .pred %r899; .reg .u32 %r901; .reg .u32 %r902; .reg .u32 %r904; .reg .u32 %r905; .reg .u16 %r906; .reg .u16 %r907; .reg .u16 %r908; .reg .u32 %r909; .reg .u16 %r910; .reg .pred %r911; .reg .u32 %r913; .reg .u32 %r914; .reg .u32 %r916; .reg .u32 %r917; .reg .u16 %r918; .reg .u16 %r919; .reg .u16 %r920; .reg .u32 %r921; .reg .u16 %r922; .reg .pred %r923; .reg .pred %r924; .reg .u64 %r925; .reg .u32 %r926; .reg .u64 %r927; .reg .u64 %r928; .reg .u32 %r929; .reg .pred %r930; .reg .pred %r931; .reg .u32 %r932; .reg .pred %r933; .reg .pred %r934; .reg .u32 %r949; .reg .u64 %r950; .reg .u64 %r951; .reg .u64 %r952; .reg .u64 %r953; .reg .u64 %r954; .reg .u64 %r955; .reg .u64 %r956; .reg .u16 %r957; .reg .u16 %r958; .reg .u16 %r959; .reg .u32 %r960; .reg .u16 %r961; .reg .pred %r962; .reg .pred %r963; .reg .u16 %r964; .reg .pred %r965; .reg .u64 %r966; .reg .pred %r967; .reg .u32 %r971; .reg .u64 %r973; .reg .u32 %r978; .reg .u32 %r979; .reg .u32 %r981; .reg .u32 
%r982; .reg .u16 %r983; .reg .u16 %r984; .reg .u16 %r985; .reg .u32 %r987; .reg .u32 %r988; .reg .u16 %r990; .reg .u16 %r992; .reg .u32 %r993; .reg .u16 %r994; .reg .pred %r995; .reg .pred %r996; .reg .u64 %r997; .reg .u64 %r999; .reg .u64 %r1000; .reg .u64 %r1001; .reg .pred %r1003; .reg .u64 %r1005; .reg .u64 %r1006; .reg .u64 %r1007; .reg .u64 %r1010; .reg .u64 %r1011; .reg .u64 %r1015; .reg .u64 %r1016; .reg .u64 %r1017; .reg .u64 %r1019; .reg .u64 %r1020; .reg .u64 %r1023; .reg .u64 %r1026; .reg .u64 %r1027; .reg .u64 %r1030; .reg .u64 %r1032; .reg .u64 %r1033; .reg .u64 %r1034; .reg .u64 %r1035; .reg .u64 %r1036; .reg .pred %r1038; .reg .u64 %r1044; .reg .u64 %r1045; .reg .u64 %r1046; .reg .u64 %r1047; .reg .pred %r1049; .reg .u64 %r1050; .reg .u64 %r1051; .reg .u64 %r1052; .reg .u64 %r1053; .reg .u64 %r1054; .reg .u64 %r1056; .reg .u64 %r1057; .reg .u64 %r1058; .reg .u64 %r1059; .reg .u64 %r1060; .reg .u64 %r1061; .reg .u64 %r1062; .reg .u64 %r1063; .reg .u64 %r1067; .reg .u64 %r1068; .reg .u64 %r1069; .reg .u64 %r1070; .reg .u64 %r1074; .reg .u64 %r1076; .reg .u64 %r1079; .reg .u64 %r1082; .reg .u64 %r1087; .reg .u64 %r1088; .reg .u64 %r1090; .reg .u64 %r1091; .reg .f64 %r1092; .reg .f64 %r1093; .reg .u64 %r1094; .reg .u64 %r1095; .reg .f64 %r1096; .reg .f64 %r1097; .reg .u64 %r1099; .reg .u64 %r1100; .reg .f64 %r1101; .reg .u64 %r1102; .reg .f64 %r1103; .reg .pred %r1104; .reg .u64 %r1107; .reg .u64 %r1108; .reg .u64 %r1109; .reg .u64 %r1110; .reg .u64 %r1112; .reg .pred %r1113; .reg .u64 %r1114; .reg .u64 %r1115; .reg .f64 %r1116; .reg .u64 %r1119; .reg .u64 %r1120; .reg .u64 %r1121; .reg .u64 %r1122; .reg .u64 %r1123; .reg .u64 %r1124; .reg .u64 %r1125; .reg .u64 %r1126; .reg .u64 %r1128; .reg .u64 %r1129; .reg .u64 %r1130; .reg .u64 %r1131; .reg .u64 %r1136; .reg .u64 %r1137; .reg .u64 %r1138; .reg .u64 %r1139; .reg .u64 %r1140; .reg .u64 %r1141; .reg .pred %r1143; .reg .pred %r1144; .reg .u64 %r1148; .reg .f64 %r1149; .reg .f64 %r1150; .reg .f64 %r1151; .reg .f64 %r1152; .reg .pred %r1154; .reg .pred %r1155; .reg .pred %r1156; .reg .u64 %r1157; .reg .u64 %r1158; .reg .u64 %r1160; .reg .u64 %r1161; .reg .u64 %r1162; .reg .u64 %r1163; .reg .u64 %r1171; .reg .u64 %r1172; .reg .u64 %r1173; .reg .u64 %r1175; .reg .f64 %r1176; .reg .f64 %r1177; .reg .u64 %r1178; .reg .f64 %r1179; .reg .u64 %r1180; .reg .f64 %r1181; .reg .pred %r1183; .reg .pred %r1184; .reg .u64 %r1187; .reg .f64 %r1188; .reg .f64 %r1189; .reg .pred %r1191; .reg .u16 %r1192; .reg .pred %r1193; .reg .u32 %r1195; .reg .u32 %r1196; .reg .u16 %r1197; .reg .u16 %r1198; .reg .u32 %r1200; .reg .u16 %r1201; .reg .pred %r1202; .reg .pred %r1203; .reg .u16 %r1204; .reg .pred %r1205; .reg .pred %r1206; .reg .pred %r1207; .reg .u64 %r1208; .reg .u64 %r1209; .reg .pred %r1211; .reg .u64 %r1212; .reg .u64 %r1213; .reg .u64 %r1215; .reg .f64 %r1216; .reg .f64 %r1217; .reg .pred %r1219; .reg .u64 %r1221; .reg .u64 %r1222; .reg .u64 %r1223; .reg .f64 %r1224; .reg .f64 %r1225; .reg .pred %r1227; .reg .pred %r1228; .reg .u64 %r1230; .reg .f64 %r1231; .reg .f64 %r1232; .reg .pred %r1234; .reg .pred %r1235; .reg .pred %r1236; .reg .pred %r1237; .reg .u64 %r1238; .reg .pred %r1241; .reg .u64 %r1243; .reg .f64 %r1244; .reg .f64 %r1245; .reg .f64 %r1246; .reg .pred %r1248; .reg .pred %r1250; .reg .u64 %r1252; .reg .u64 %r1253; .reg .pred %r1255; .reg .u64 %r1256; .reg .u64 %r1257; .reg .u64 %r1259; .reg .f64 %r1260; .reg .f64 %r1261; .reg .pred %r1263; .reg .pred %r1264; .reg .u64 %r1265; .reg .pred %r1266; .reg .u64 %r1267; .reg .u64 
%r1268; .reg .u64 %r1269; .reg .u16 %r1270; .reg .u32 %r1271; .reg .pred %r1272; .reg .u16 %r1273; .reg .u32 %r1274; .reg .pred %r1275; .reg .u64 %r1276; .reg .u64 %r1277; .reg .u64 %r1278; .reg .u64 %r1279; .reg .u64 %r1280; .reg .u64 %r1281; .reg .u64 %r1282; .reg .u32 %r1283; .reg .pred %r1284; .reg .pred %r1285; .reg .u32 %r1286; .reg .u64 %r1287; .reg .pred %r1292; .reg .pred %r1293; .reg .pred %r1294; .reg .pred %r1295; .reg .pred %r1298; .reg .pred %r1299; .reg .pred %r1300; .reg .pred %r1301; .reg .pred %r1302; .reg .u64 %r1304; .reg .u64 %r1311; .reg .u64 %r1312; .reg .u64 %r1313; .reg .u64 %r1314; .reg .u64 %r1319; .reg .u64 %r1320; .reg .u64 %r1321; .reg .u64 %r1322; .reg .u64 %r1324; .reg .u64 %r1325; .reg .u64 %r1326; .reg .u64 %r1327; .reg .u64 %r1328; .reg .pred %r1329; .reg .u64 %r1331; .reg .pred %r1333; .reg .u64 %r1336; .reg .u64 %r1337; .reg .u64 %r1338; .reg .pred %r1339; .reg .u64 %r1340; .reg .u64 %r1341; .reg .pred %r1345; .reg .f64 %r1346; .reg .pred %r1347; .reg .u64 %r1348; .reg .u64 %r1349; .reg .u64 %r1350; .reg .u64 %r1351; .reg .u64 %r1352; .reg .pred %r1353; .reg .pred %r1354; .reg .pred %r1355; .reg .u64 %r1358; .reg .u64 %r1359; .reg .u64 %r1360; .reg .u64 %r1361; .reg .u64 %r1362; .reg .pred %r1363; .reg .pred %r1364; .reg .pred %r1365; .reg .u64 %r1367; .reg .u64 %r1368; .reg .u64 %r1369; .reg .u64 %r1370; .reg .u64 %r1371; .reg .u64 %r1372; .reg .pred %r1373; .reg .pred %r1374; .reg .pred %r1375; .reg .u64 %r1381; .reg .u64 %r1382; .reg .u64 %r1383; .reg .u64 %r1385; .reg .pred %r1386; .reg .u64 %r1387; .reg .u64 %r1388; .reg .u64 %r1389; .reg .u64 %r1390; .reg .u64 %r1391; .reg .u64 %r1392; .reg .u64 %r1393; .reg .pred %r1394; .reg .pred %r1395; .reg .pred %r1396; .reg .u64 %r1399; .reg .u64 %r1400; .reg .u64 %r1401; .reg .u64 %r1402; .reg .u64 %r1403; .reg .pred %r1404; .reg .pred %r1405; .reg .pred %r1406; .reg .u64 %r1407; .reg .u64 %r1408; .reg .u64 %r1412; .reg .pred %r1413; .reg .pred %r1414; .reg .pred %r1415; .reg .u64 %r1416; .reg .u64 %r1417; .reg .u64 %r1418; .reg .u64 %r1419; .reg .u64 %r1420; .reg .u64 %r1421; .reg .pred %r1422; .reg .pred %r1423; .reg .pred %r1424; .reg .u64 %r1427; .reg .u64 %r1428; .reg .u64 %r1429; .reg .u64 %r1430; .reg .u64 %r1431; .reg .pred %r1432; .reg .pred %r1433; .reg .pred %r1434; .reg .u64 %r1439; .reg .pred %r1440; .reg .pred %r1441; .reg .pred %r1442; .reg .u64 %r1447; .reg .pred %r1448; .reg .pred %r1449; .reg .pred %r1450; .reg .u64 %r1455; .reg .pred %r1456; .reg .pred %r1457; .reg .pred %r1458; .reg .u64 %r1463; .reg .pred %r1464; .reg .pred %r1465; .reg .pred %r1466; .reg .u64 %r1469; .reg .u64 %r1470; .reg .u64 %r1472; .reg .u64 %r1473; .reg .u64 %r1476; .reg .u64 %r1477; .reg .f64 %r1478; .reg .f64 %r1479; .reg .u64 %r1480; .reg .u64 %r1481; .reg .f64 %r1482; .reg .f64 %r1483; .reg .u64 %r1484; .reg .u64 %r1485; .reg .u64 %r1487; .reg .u64 %r1488; .reg .f64 %r1489; .reg .f64 %r1490; .reg .u64 %r1491; .reg .u64 %r1492; .reg .f64 %r1493; .reg .f64 %r1494; .reg .u64 %r1495; .reg .u64 %r1496; .reg .u64 %r1498; .reg .u64 %r1499; .reg .f64 %r1500; .reg .f64 %r1501; .reg .u64 %r1502; .reg .u64 %r1503; .reg .f64 %r1504; .reg .f64 %r1505; .reg .pred %r1506; .reg .f64 %r1509; .reg .f64 %r1513; .reg .f64 %r1517; .reg .pred %r1518; .reg .f64 %r1519; .reg .f64 %r1520; .reg .f64 %r1521; .reg .f64 %r1522; .reg .f64 %r1523; .reg .f64 %r1524; .reg .f64 %r1525; .reg .f64 %r1526; .reg .f64 %r1527; .reg .f64 %r1528; .reg .f64 %r1529; .reg .f64 %r1530; .reg .f64 %r1531; .reg .f64 %r1532; .reg .f64 %r1533; .reg .f64 
%r1534; .reg .u64 %r1537; .reg .u64 %r1538; .reg .f64 %r1539; .reg .f64 %r1540; .reg .f64 %r1541; .reg .u64 %r1542; .reg .u64 %r1543; .reg .f64 %r1544; .reg .u64 %r1545; .reg .f64 %r1546; .reg .u64 %r1547; .reg .f64 %r1548; .reg .u64 %r1549; .reg .f64 %r1550; .reg .u64 %r1551; .reg .f64 %r1552; .reg .pred %r1553; .reg .f64 %r1554; .reg .f64 %r1555; .reg .f64 %r1556; .reg .f64 %r1557; .reg .f64 %r1563; .reg .f64 %r1564; .reg .f64 %r1565; .reg .f64 %r1566; .reg .f64 %r1567; .reg .f64 %r1568; .reg .f64 %r1569; .reg .f64 %r1570; .reg .f64 %r1571; .reg .f64 %r1578; .reg .f64 %r1579; .reg .f64 %r1580; .reg .f64 %r1581; .reg .f64 %r1582; .reg .f64 %r1583; .reg .f64 %r1584; .reg .f64 %r1585; .reg .f64 %r1586; .reg .f64 %r1593; .reg .f64 %r1594; .reg .f64 %r1595; .reg .f64 %r1596; .reg .f64 %r1597; .reg .pred %r1598; .reg .pred %r1599; .reg .f64 %r1600; .reg .f64 %r1601; .reg .f64 %r1602; .reg .f64 %r1603; .reg .u64 %r1606; .reg .f64 %r1607; .reg .u64 %r1608; .reg .f64 %r1609; .reg .f64 %r1610; .reg .f64 %r1611; .reg .f64 %r1612; .reg .u64 %r1613; .reg .f64 %r1614; .reg .f64 %r1615; .reg .u64 %r1616; .reg .f64 %r1617; .reg .f64 %r1618; .reg .u64 %r1620; .reg .u64 %r1622; .reg .f64 %r1623; .reg .u64 %r1624; .reg .f64 %r1625; .reg .f64 %r1626; .reg .f64 %r1627; .reg .f64 %r1628; .reg .u64 %r1629; .reg .f64 %r1630; .reg .f64 %r1631; .reg .u64 %r1632; .reg .f64 %r1633; .reg .f64 %r1634; .reg .u64 %r1636; .reg .u64 %r1638; .reg .f64 %r1639; .reg .u64 %r1640; .reg .f64 %r1641; .reg .f64 %r1642; .reg .u64 %r1643; .reg .f64 %r1644; .reg .u64 %r1645; .reg .f64 %r1646; .reg .pred %r1647; .reg .f64 %r1648; .reg .f64 %r1651; .reg .f64 %r1652; .reg .f64 %r1653; .reg .f64 %r1657; .reg .f64 %r1658; .reg .f64 %r1659; .reg .f64 %r1663; .reg .f64 %r1664; .reg .pred %r1665; .reg .pred %r1666; .reg .f64 %r1667; .reg .f64 %r1670; .reg .f64 %r1671; .reg .f64 %r1672; .reg .f64 %r1676; .reg .f64 %r1677; .reg .f64 %r1678; .reg .f64 %r1682; .reg .f64 %r1683; .reg .pred %r1684; .reg .pred %r1685; .reg .f64 %r1686; .reg .u64 %r1687; .reg .f64 %r1689; .reg .f64 %r1690; .reg .f64 %r1691; .reg .u64 %r1692; .reg .f64 %r1695; .reg .f64 %r1696; .reg .f64 %r1697; .reg .u64 %r1698; .reg .f64 %r1701; .reg .f64 %r1702; .reg .pred %r1703; .reg .pred %r1704; .reg .f64 %r1705; .reg .u64 %r1707; .reg .u64 %r1708; .reg .f64 %r1709; .reg .f64 %r1710; .reg .f64 %r1711; .reg .u64 %r1713; .reg .u64 %r1714; .reg .f64 %r1716; .reg .f64 %r1717; .reg .f64 %r1718; .reg .u64 %r1720; .reg .u64 %r1721; .reg .f64 %r1723; .reg .f64 %r1724; .reg .pred %r1725; .reg .u64 %r1727; .reg .u64 %r1729; .reg .u64 %r1732; .reg .pred %r1734; .reg .pred %r1735; .reg .pred %r1736; .reg .u64 %r1738; .reg .u64 %r1739; .reg .f64 %r1740; .reg .f64 %r1741; .reg .f64 %r1742; .reg .f64 %r1743; .reg .u64 %r1745; .reg .u64 %r1746; .reg .f64 %r1748; .reg .f64 %r1749; .reg .f64 %r1750; .reg .f64 %r1751; .reg .u64 %r1753; .reg .u64 %r1754; .reg .f64 %r1756; .reg .f64 %r1757; .reg .f64 %r1758; .reg .f64 %r1759; .reg .pred %r1760; .reg .pred %r1761; .reg .f64 %r1763; .reg .u64 %r1765; .reg .u64 %r1766; .reg .f64 %r1767; .reg .f64 %r1768; .reg .f64 %r1769; .reg .u64 %r1771; .reg .u64 %r1772; .reg .f64 %r1774; .reg .f64 %r1775; .reg .f64 %r1776; .reg .u64 %r1778; .reg .u64 %r1779; .reg .f64 %r1781; .reg .f64 %r1782; .reg .pred %r1783; .reg .f64 %r1784; .reg .f64 %r1785; .reg .f64 %r1787; .reg .f64 %r1788; .reg .f64 %r1790; .reg .f64 %r1791; .reg .pred %r1792; .reg .u64 %r1793; .reg .f64 %r1794; .reg .u64 %r1795; .reg .f64 %r1796; .reg .f64 %r1797; .reg .u64 %r1798; .reg .f64 %r1799; 
.reg .u64 %r1800; .reg .f64 %r1801; .reg .u64 %r1803; .reg .f64 %r1804; .reg .u64 %r1805; .reg .f64 %r1806; .reg .f64 %r1807; .reg .u64 %r1808; .reg .f64 %r1809; .reg .u64 %r1810; .reg .f64 %r1811; .reg .u64 %r1813; .reg .f64 %r1814; .reg .u64 %r1815; .reg .f64 %r1816; .reg .f64 %r1817; .reg .u64 %r1818; .reg .f64 %r1819; .reg .u64 %r1820; .reg .f64 %r1821; .reg .pred %r1822; .reg .f64 %r1823; .reg .f64 %r1824; .reg .f64 %r1825; .reg .f64 %r1826; .reg .f64 %r1827; .reg .f64 %r1829; .reg .f64 %r1830; .reg .f64 %r1831; .reg .f64 %r1832; .reg .f64 %r1833; .reg .f64 %r1835; .reg .f64 %r1836; .reg .f64 %r1837; .reg .f64 %r1838; .reg .f64 %r1839; .reg .pred %r1840; .reg .u64 %r1841; .reg .u64 %r1842; .reg .f64 %r1843; .reg .f64 %r1844; .reg .f64 %r1845; .reg .u64 %r1846; .reg .u64 %r1847; .reg .f64 %r1848; .reg .u64 %r1849; .reg .f64 %r1850; .reg .u64 %r1851; .reg .f64 %r1852; .reg .u64 %r1853; .reg .f64 %r1854; .reg .u64 %r1855; .reg .f64 %r1856; .reg .pred %r1857; .reg .f64 %r1858; .reg .f64 %r1860; .reg .f64 %r1862; .reg .pred %r1863; .reg .u64 %r1864; .reg .u64 %r1865; .reg .f64 %r1866; .reg .f64 %r1867; .reg .u64 %r1868; .reg .u64 %r1869; .reg .f64 %r1870; .reg .f64 %r1871; .reg .u64 %r1873; .reg .u64 %r1874; .reg .f64 %r1875; .reg .f64 %r1876; .reg .u64 %r1877; .reg .u64 %r1878; .reg .f64 %r1879; .reg .f64 %r1880; .reg .u64 %r1882; .reg .u64 %r1883; .reg .f64 %r1884; .reg .f64 %r1885; .reg .u64 %r1886; .reg .u64 %r1887; .reg .f64 %r1888; .reg .f64 %r1889; .reg .pred %r1890; .reg .u32 %r1891; .reg .u64 %r1892; .reg .u32 %r1894; .reg .u64 %r1895; .reg .u32 %r1897; .reg .u64 %r1898; .reg .pred %r1899; .reg .u32 %r1900; .reg .u64 %r1901; .reg .u64 %r1903; .reg .u32 %r1905; .reg .u64 %r1906; .reg .u64 %r1908; .reg .u32 %r1910; .reg .u64 %r1911; .reg .pred %r1912; .reg .f64 %r1913; .reg .f64 %r1914; .reg .f64 %r1916; .reg .f64 %r1917; .reg .f64 %r1919; .reg .f64 %r1920; .reg .pred %r1921; .reg .f64 %r1922; .reg .f64 %r1923; .reg .f64 %r1925; .reg .f64 %r1926; .reg .f64 %r1928; .reg .f64 %r1929; .reg .pred %r1930; .reg .f64 %r1931; .reg .f64 %r1932; .reg .f64 %r1934; .reg .f64 %r1935; .reg .f64 %r1937; .reg .f64 %r1938; .reg .pred %r1939; .reg .f64 %r1940; .reg .f64 %r1941; .reg .f64 %r1942; .reg .f64 %r1943; .reg .f64 %r1945; .reg .f64 %r1946; .reg .f64 %r1947; .reg .f64 %r1948; .reg .f64 %r1950; .reg .f64 %r1951; .reg .f64 %r1952; .reg .f64 %r1953; .reg .pred %r1954; .reg .pred %r1957; .reg .f64 %r1958; .reg .f64 %r1959; .reg .f64 %r1961; .reg .f64 %r1962; .reg .f64 %r1964; .reg .f64 %r1965; .reg .pred %r1966; mov.u64 %r718,%ar0; mov.u64 %r719,%ar1; mov.u64 %r720,%ar2; mov.u32 %r721,%ar3; mov.u32 %r722,%ar4; mov.u64 %r723,%ar5; .loc 1 2462 3 ld.s8 %r416,[%r719+28]; cvt.u16.u32 %r724,%r416; setp.eq.u16 %r725,%r724,2; @ %r725 bra $L2; ld.s8 %r1286,[%r720+28]; cvt.u16.u32 %r726,%r1286; setp.eq.u16 %r728,%r726,2; @ %r728 bra $L3; cvta.const.u64 %r732,$LC3; cvta.const.u64 %r731,__func__$0; mov.u32 %r730,2462; cvta.const.u64 %r72737323: .loc 1 2476 6 ld.u64 %r733,[%r718]; setp.ne.u64 %r734,%r733,0; @ %r734 bra $L4; .loc 1 2478 10 setp.ne.u16 %r736,%r724,1; @ %r736 bra $L5; .loc 1 2480 4 st.u64 [%r718+48],%r733; ld.u64 %r739,[%r720+80]; ld.u64 %r741,[%r720+72]; sub.u64 %r742,%r739,%r741; st.u64 [%r718+56],%r742; mov.u64 %r743,1; st.u64 [%r718+40],%r743; bra $L6; $L123: .loc 1 2485 4 st.u64 [%r718+48],%r1265; add.u64 %r745,%r478,-1; st.u64 [%r718+56],%r745; mov.u64 %r746,1; st.u64 [%r718+40],%r746; bra $L6; $L124: .loc 1 2490 4 mov.u64 %r747,0; st.u64 [%r718+48],%r747; add.u64 %r748,%r478,-1; st.u64 
[%r718+56],%r748; mov.u64 %r749,1; st.u64 [%r718+40],%r749; .loc 1 2493 11 st.u64 [%r718+72],%r747; ld.u64 %r752,[%r720+80]; ld.u64 %r754,[%r720+72]; sub.u64 %r755,%r752,%r754; st.u64 [%r718+80],%r755; st.u64 [%r718+64],%r478; $L6: .loc 1 2499 1718; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r757,[%value_in]; } .loc 1 2499 4 mov.u64 %r759759; call (%value_in),_gfortrani_xmallocarray760,[%value_in]; } .loc 1 2499 2 st.u64 [%r718],%r760; .loc 1 2500 24 mov.u64 %r762,0; st.u64 [%r718+8],%r762; .loc 1 2557 7 ld.s8 %r416,[%r719+28]; .loc 1 2564 15 ld.u64 %r764,[%r719+56]; add.u64 %r763,%r764,1; .loc 1 2564 13 ld.u64 %r765,[%r719+48]; sub.u64 %r426,%r763,%r765; bra $L7; $L4: .loc 1 2502 12 cvta.global.u64 %r766,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r767,[%r766+36]; setp.eq.u32 %r768,%r767,0; @ %r768 bra $L8; .loc 1 2509 17 ld.u64 %r770,[%r718+56]; add.u64 %r769,%r770,1; .loc 1 2509 15 ld.u64 %r771,[%r718+48]; sub.u64 %r426,%r769,%r771; .loc 1 2506 10 setp.ne.u16 %r773,%r724,1; @ %r773 bra $L9; .loc 1 2508 17 ld.u64 %r775,[%r720+80]; add.u64 %r774,%r775,1; .loc 1 2508 15 ld.u64 %r776,[%r720+72]; sub.u64 %r208,%r774,%r776; .loc 1 2510 7 setp.ne.u64 %r777,%r208,%r426; @ %r777 bra $L10; .loc 1 2564 15 ld.u64 %r779,[%r719+56]; add.u64 %r778,%r779,1; .loc 1 2564 13 ld.u64 %r780,[%r719+48]; sub.u64 %r426,%r778,%r780; .loc 1 2543 7 ld.s8 %r301,[%r718+28]; .loc 1 2548 27 ld.u64 %r414,[%r718+40]; .loc 1 2543 6 cvt.u16.u32 %r781,%r301; setp.eq.u16 %r782,%r781,1; @ %r782 bra $L11; .loc 1 2553 16 ld.u64 %r474,[%r718+64]; .loc 1 2560 16 ld.u64 %r473,[%r719+40]; .loc 1 2563 14 mov.u64 %r186,1; .loc 1 2561 16 mov.u64 %r178,%r186; bra $L12; $L10: .loc 1 2511 6 st.u64 [%stack+8],%r208; st.u64 [%stack],%r426; cvta.const.u64 %r783783_gfortran_runtime_error2485 4 ld.u64 %r786,[%r719+56]; add.u64 %r785,%r786,1; ld.u64 %r787,[%r719+48]; sub.u64 %r482,%r785,%r787; .loc 1 2515 15 cvt.u16.u32 %r788,%r1286; setp.ne.u16 %r790,%r788,1; @ %r790 bra $L13; .loc 1 2519 7 setp.eq.u64 %r791,%r482,%r426; @ %r791 bra $L14; .loc 1 2520 6 st.u64 [%stack+8],%r482; st.u64 [%stack],%r426; cvta.const.u64 %r7922_gfortran_runtime_error13: .loc 1 2528 7 setp.eq.u64 %r794,%r482,%r426; @ %r794 bra $L15; .loc 1 2529 6 st.u64 [%stack+8],%r482; st.u64 [%stack],%r426; cvta.const.u64 %r7955_gfortran_runtime_error15: .loc 1 2533 17 ld.u64 %r798,[%r720+80]; add.u64 %r797,%r798,1; .loc 1 2533 15 ld.u64 %r799,[%r720+72]; sub.u64 %r206,%r797,%r799; .loc 1 2534 17 ld.u64 %r801,[%r718+80]; add.u64 %r800,%r801,1; .loc 1 2534 15 ld.u64 %r802,[%r718+72]; sub.u64 %r207,%r800,%r802; .loc 1 2535 7 setp.eq.u64 %r803,%r206,%r207; @ %r803 bra $L14; .loc 1 2536 6 st.u64 [%stack+8],%r206; st.u64 [%stack],%r207; cvta.const.u64 %r804,$LCstack; call _gfortran_runtime_error2564 15 ld.u64 %r807,[%r719+56]; add.u64 %r806,%r807,1; .loc 1 2564 13 ld.u64 %r808,[%r719+48]; sub.u64 %r426,%r806,%r808; $L7: .loc 1 2548 27 ld.u64 %r474,[%r718+40]; .loc 1 2543 6 ld.s8 %r810,[%r718+28]; cvt.u16.u32 %r809,%r810; setp.eq.u16 %r811,%r809,1; @ %r811 bra $L130; $L127: .loc 1 2553 16 mov.u64 %r414,%r474; ld.u64 %r474,[%r718+64]; bra $L16; $L130: .loc 1 2548 27 mov.u64 %r414,%r474; $L16: .loc 1 2560 16 ld.u64 %r473,[%r719+40]; .loc 1 2557 6 cvt.u16.u32 %r812,%r416; setp.eq.u16 %r813,%r812,1; @ %r813 bra $L131; $L125: .loc 1 2569 16 ld.u64 %r178,[%r719+64]; .loc 1 2571 15 ld.u64 %r815,[%r719+80]; add.u64 %r814,%r815,1; .loc 1 2571 13 ld.u64 %r816,[%r719+72]; mov.u64 %r186,%r426; sub.u64 %r426,%r814,%r816; bra $L12; $L131: .loc 1 2563 14 mov.u64 %r186,1; .loc 
1 2561 16 mov.u64 %r178,%r186; $L12: .loc 1 2575 16 ld.u64 %r818,[%r720+56]; add.u64 %r817,%r818,1; ld.u64 %r819,[%r720+48]; sub.u64 %r64,%r817,%r819; .loc 1 2575 6 setp.eq.u64 %r820,%r64,%r426; @ %r820 bra $L17; .loc 1 2577 10 set.u32.gt.s64 %r822,%r426,0; neg.s32 %r823,%r822; .loc 1 2577 21 set.u32.gt.s64 %r825,%r64,0; neg.s32 %r826,%r825; cvt.u16.u32 %r828,%r823; cvt.u16.u32 %r829,%r826; or.b16 %r827,%r828,%r829; cvt.u32.u16 %r830,%r827; cvt.u16.u8 %r831,%r830; setp.eq.u16 %r832,%r831,0; @ %r832 bra $L17; .loc 1 2578 2 st.u64 [%stack+8],%r426; st.u64 [%stack],%r64;_gfortran_runtime_error17: .loc 1 2583 7 ld.s8 %r66,[%r720+28]; .loc 1 2586 16 ld.u64 %r472,[%r720+40]; .loc 1 2583 6 cvt.u16.u32 %r835,%r66; setp.eq.u16 %r836,%r835,1; @ %r836 bra $L132; .loc 1 2597 16 ld.u64 %r179,[%r720+64]; .loc 1 2598 16 ld.u64 %r838,[%r720+80]; add.u64 %r837,%r838,1; .loc 1 2598 14 ld.u64 %r839,[%r720+72]; sub.u64 %r187,%r837,%r839; bra $L18; $L132: .loc 1 2592 14 mov.u64 %r187,1; .loc 1 2591 16 mov.u64 %r179,256; $L18: .loc 1 2601 9 ld.u64 %r213,[%r719]; .loc 1 2602 9 ld.u64 %r214,[%r720]; .loc 1 2603 8 ld.u64 %r215,[%r718]; .loc 1 2612 28 set.u32.eq.u64 %r841,%r414,1; neg.s32 %r842,%r841; cvt.u32.u32 %r840,%r842; cvt.u32.u8 %r71,%r840; .loc 1 2612 46 set.u32.eq.u64 %r844,%r473,1; neg.s32 %r845,%r844; cvt.u32.u32 %r843,%r845; cvt.u32.u8 %r471,%r843; .loc 1 2612 7 set.u32.ne.u32 %r847,%r721,0; neg.s32 %r848,%r847; .loc 1 2612 16 cvt.u16.u32 %r850,%r848; cvt.u16.u32 %r851,%r71; and.b16 %r849,%r850,%r851; .loc 1 2612 6 cvt.u32.u16 %r852,%r849; cvt.u16.u8 %r853,%r852; setp.eq.u16 %r854,%r853,0; @ %r854 bra $L19; .loc 1 2612 63 set.u32.eq.u64 %r856,%r178,1; neg.s32 %r857,%r856; .loc 1 2612 51 cvt.u16.u32 %r859,%r857; cvt.u16.u32 %r860,%r471; or.b16 %r858,%r859,%r860; .loc 1 2612 33 cvt.u32.u16 %r861,%r858; cvt.u16.u8 %r862,%r861; setp.eq.u16 %r863,%r862,0; @ %r863 bra $L20; .loc 1 2613 20 set.u32.eq.u64 %r865,%r472,1; neg.s32 %r866,%r865; .loc 1 2613 37 set.u32.eq.u64 %r868,%r179,1; neg.s32 %r869,%r868; .loc 1 2613 25 cvt.u16.u32 %r871,%r866; cvt.u16.u32 %r872,%r869; or.b16 %r870,%r871,%r872; .loc 1 2613 7 cvt.u32.u16 %r873,%r870; cvt.u16.u8 %r874,%r873; setp.eq.u16 %r875,%r874,0; @ %r875 bra $L20; .loc 1 2615 13 cvt.rn.f32.s32 %r83,%r722; .loc 1 2614 12 cvt.rn.f32.s64 %r876,%r186; .loc 1 2614 50 cvt.rn.f32.s64 %r877,%r426; .loc 1 2614 47 mul.f32 %r878,%r876,%r877; .loc 1 2614 31 cvt.rn.f32.s64 %r879,%r187; .loc 1 2614 47 mul.f32 %r880,%r878,%r879; .loc 1 2615 13 mul.f32 %r881,%r83,%r83; mul.f32 %r882,%r881,%r83; .loc 1 2614 7 setp.gt.f32 %r883,%r880,%r882; @ ! 
%r883 bra $L19; .loc 1 2617 17 cvt.u32.u64 %r85,%r186; st.u32 [%frame+36],%r85; .loc 1 2617 29 cvt.u32.u64 %r86,%r187; st.u32 [%frame+32],%r86; .loc 1 2617 41 cvt.u32.u64 %r87,%r426; st.u32 [%frame+28],%r87; .loc 1 2617 52 cvt.u32.u64 %r88,%r474; st.u32 [%frame+24],%r88; .loc 1 2618 24 mov.f64 %r884,0d3ff0000000000000; st.f64 [%frame+8],%r884; .loc 1 2618 33 mov.f64 %r885,0d0000000000000000; st.f64 [%frame],%r885; .loc 1 2619 17 setp.ne.u64 %r886,%r473,1; @ %r886 bra $L22; cvt.u32.u64 %r200,%r178; bra $L23; $L22: cvt.u32.u64 %r200,%r473; $L23: st.u32 [%frame+20],%r200; .loc 1 2620 3 setp.ne.u64 %r887,%r472,1; @ %r887 bra $L24; cvt.u32.u64 %r201,%r179; bra $L25; $L24: cvt.u32.u64 %r201,%r472; $L25: st.u32 [%frame+16],%r201; .loc 1 2622 15 set.u32.gt.s32 %r889,%r200,0; neg.s32 %r890,%r889; .loc 1 2622 26 set.u32.gt.s32 %r892,%r201,0; neg.s32 %r893,%r892; .loc 1 2622 19 cvt.u16.u32 %r895,%r890; cvt.u16.u32 %r896,%r893; and.b16 %r894,%r895,%r896; .loc 1 2622 10 cvt.u32.u16 %r897,%r894; cvt.u16.u8 %r898,%r897; setp.eq.u16 %r899,%r898,0; @ %r899 bra $L19; .loc 1 2622 37 set.u32.gt.s32 %r901,%r88,0; neg.s32 %r902,%r901; .loc 1 2622 46 set.u32.gt.s32 %r904,%r85,1; neg.s32 %r905,%r904; .loc 1 2622 41 cvt.u16.u32 %r907,%r902; cvt.u16.u32 %r908,%r905; and.b16 %r906,%r907,%r908; cvt.u32.u16 %r909,%r906; cvt.u16.u8 %r910,%r909; setp.eq.u16 %r911,%r910,0; @ %r911 bra $L19; .loc 1 2622 55 set.u32.gt.s32 %r913,%r86,1; neg.s32 %r914,%r913; .loc 1 2622 64 set.u32.gt.s32 %r916,%r87,1; neg.s32 %r917,%r916; .loc 1 2622 59 cvt.u16.u32 %r919,%r914; cvt.u16.u32 %r920,%r917; and.b16 %r918,%r919,%r920; cvt.u32.u16 %r921,%r918; cvt.u16.u8 %r922,%r921; setp.eq.u16 %r923,%r922,0; @ %r923 bra $L19; .loc 1 2624 4 setp.ne.u64 %r924,%r723,0; @ %r924 bra $L27; cvta.const.u64 %r928,$LC8; cvta.const.u64 %r927,__func__$0; mov.u32 %r926,2624; cvta.const.u64 %r925992827: .loc 1 2626 17 and.b32 %r929,%r721,2; .loc 1 2626 7 setp.ne.u32 %r930,%r929,0; @ %r930 bra $L133; .loc 1 2629 35 setp.eq.u64 %r931,%r473,1; @ %r931 bra $L134; cvta.const.u64 %r188,$LC2; bra $L28; $L133: .loc 1 2627 13 cvta.const.u64 %r188,$LC1; bra $L28; $L134: .loc 1 2629 35 cvta.const.u64 %r188,$LC0; $L28: .loc 1 2631 17 and.b32 %r932,%r721,4; .loc 1 2631 7 setp.ne.u32 %r933,%r932,0; @ %r933 bra $L135; .loc 1 2634 35 setp.eq.u64 %r934,%r472,1; @ %r934 bra $L136; cvta.const.u64 %r189,$LC2; bra $L29; $L135: .loc 1 2632 13 cvta.const.u64 %r189,$LC1; bra $L29; $L136: .loc 1 2634 35 cvta.const.u64 %r189,$LC0; $L29: .loc 1 2636 4 add.u64 %r950,%frame,24; add.u64 %r951,%frame,16; add.u64 %r952,%frame,20; add.u64 %r953,%frame,8; add.u64 %r954,%frame,28; add.u64 %r955,%frame,32; add.u64 %r956,%frame,36; mov.u32 %r949,1;,.param .u64 %in_ar3,.param .u64 %in_ar4,.param .u64 %in_ar5,.param .u64 %in_ar6,.param .u64 %in_ar7,.param .u64 %in_ar8,.param .u64 %in_ar9,.param .u64 %in_ar10,.param .u64 %in_ar11,.param .u64 %in_ar12,.param .u32 %in_ar13,.param .u32 %in_ar1418r189956955954213; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r952; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r214; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r951; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%frame; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r215; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r950; .param .u32 %out_arg14; st.param.u32 [%out_arg14],%r949; .param .u32 %out_arg15; st.param.u32 [%out_arg15],%r949; call %r723,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14,%out_arg15),$LCT0; } bra $L1; $L19: .loc 1 2643 21 
cvt.u16.u32 %r958,%r71; cvt.u16.u32 %r959,%r471; and.b16 %r957,%r958,%r959; .loc 1 2643 6 cvt.u32.u16 %r960,%r957; cvt.u16.u8 %r961,%r960; setp.eq.u16 %r962,%r961,0; @ %r962 bra $L31; .loc 1 2643 38 setp.ne.u64 %r963,%r472,1; @ %r963 bra $L20; .loc 1 2644 7 cvt.u16.u32 %r964,%r66; setp.eq.u16 %r965,%r964,1; @ %r965 bra $L32; .loc 1 2677 16 add.u64 %r966,%r474,1; .loc 1 2678 9 shl.b64 %r104,%r966,3; sub.u64 %r217,%r215,%r104; .loc 1 2687 7 setp.le.s64 %r967,%r187,0; @ %r967 bra $L33; add.u64 %r591,%r104,-8; mov.u64 %r590,%r215; add.u64 %r594,%r187,1; .loc 1 2689 22 shl.b64 %r411,%r186,3; .loc 1 2687 13 mov.u64 %r266,%r472; setp.gt.s64 %r1302,%r186,0; and.b64 %r1412,%r187,3; setp.eq.u64 %r1415,%r1412,0; @ %r1415 bra $L34; setp.eq.u64 %r1414,%r1412,1; @ %r1414 bra $L309; setp.eq.u64 %r1413,%r1412,2; @ %r1413 bra $L310; bra $L380; $L35: .loc 1 2689 22 mov.u32 %r971$L36: .loc 1 2687 24 add.u64 %r1416,%r266,1; .loc 1 2687 7 add.u64 %r1469,%r590,%r591; .loc 1 2688 2 @ %r1302 bra $L247; bra $L381; $L34: @ %r1302 bra $L35; bra $L36; $L33: .loc 1 2692 28 set.u32.eq.u64 %r978,%r426,0; neg.s32 %r979,%r978; .loc 1 2692 13 set.u32.eq.u64 %r981,%r186,0; neg.s32 %r982,%r981; .loc 1 2692 28 cvt.u16.u32 %r984,%r979; cvt.u16.u32 %r985,%r982; or.b16 %r983,%r984,%r985; .loc 1 2692 23 set.u32.eq.u64 %r987,%r187,0; neg.s32 %r988,%r987; .loc 1 2692 28 cvt.u16.u32 %r992,%r988; or.b16 %r990,%r983,%r992; cvt.u32.u16 %r993,%r990; cvt.u16.u8 %r994,%r993; setp.eq.u16 %r995,%r994,0; @ ! %r995 bra $L1; .loc 1 2697 10 setp.eq.u64 %r996,%r178,1; selp.u64 %r198,%r474,%r178,%r996; .loc 1 2702 21 shl.b64 %r997,%r198,8; .loc 1 2702 14 add.u64 %r222,%r997,%r179; .loc 1 2706 12 min.s64 %r999,%r222,65536; shl.b64 %r1000,%r9991001,[%value_in]; } .loc 1 2710 7 setp.gt.s64 %r1003,%r187,0; @ %r1003 bra $L39; $L42: .loc 1 2910012918 7 bra $L1; $L39: .loc 1 2680 16 add.u64 %r1005,%r178,1; .loc 1 2681 9 shl.b64 %r107,%r1005,3; sub.u64 %r219,%r213,%r107; shl.b64 %r564,%r179,9; shl.b64 %r566,%r179,12; shl.b64 %r575,%r179,4; .loc 1 2683 16 add.u64 %r1006,%r179,1; .loc 1 2684 9 shl.b64 %r1007,%r1006,3; sub.u64 %r567,%r214,%r1007; add.u64 %r565,%r575,%r567; shl.b64 %r570,%r474,9; add.u64 %r571,%r474,%r474; shl.b64 %r572,%r474,2; neg.s64 %r1010,%r179; shl.b64 %r1011,%r1010,12; shl.b64 %r573,%r1010,3; shl.b64 %r577,%r474,12; shl.b64 %r1015,%r474,4; add.u64 %r1016,%r1015,8; add.u64 %r576,%r217,%r1016; add.u64 %r1017,%r187,-1; and.b64 %r585,%r1017,-512; add.u64 %r586,%r567,%r573; shl.b64 %r543,%r178,8; shl.b64 %r545,%r178,11; add.u64 %r546,%r107,-8; neg.s64 %r1019,%r178; shl.b64 %r1020,%r1019,11; shl.b64 %r1023,%r1019,4; shl.b64 %r507,%r178,4; add.u64 %r508,%r178,%r178; shl.b64 %r196,%r179,5; shl.b64 %r191,%r474,5; neg.s64 %r1026,%r474; shl.b64 %r1027,%r1026,3; shl.b64 %r536,%r474,3; shl.b64 %r1030,%r1010,5; shl.b64 %r589,%r179,3; mov.u64 %r568,%r571; mov.u64 %r562,%r179; mov.u64 %r560,0; setp.gt.s64 %r1295,%r426,0; add.u64 %r1338,%r586,8; setp.gt.s64 %r1339,%r186,0; add.u64 %r1340,%r186,1; add.u64 %r1341,%r219,16; bra $L41; $L138: mov.u64 %r560,%r561; $L41: add.u64 %r588,%r560,1; .loc 1 2715 9 sub.u64 %r224,%r187,%r560; min.s64 %r1470,%r224,512; .loc 1 2716 24 shr.s64 %r1032,%r1470,63; shr.u64 %r1033,%r1032,62; add.u64 %r1034,%r1470,%r1033; and.b64 %r1035,%r1034,3; sub.u64 %r1036,%r1035,%r1033; .loc 1 2716 10 sub.u64 %r225,%r1470,%r1036; .loc 1 2718 4 @ %r1295 bra $L40; $L45: .loc 1 2710 7 add.u64 %r561,%r560,512; add.u64 %r562,%r562,%r564; add.u64 %r565,%r565,%r566; add.u64 %r568,%r568,%r570; add.u64 %r572,%r572,%r570; add.u64 
%r573,%r573,%r1011; add.u64 %r575,%r575,%r566; add.u64 %r576,%r576,%r577; setp.ne.u64 %r1038,%r560,%r585; @ %r1038 bra $L138; bra $L42; $L40: add.u64 %r540,%r1338,%r575; .loc 1 2768 13 add.u64 %r450,%r225,%r588; add.u64 %r447,%r575,%r589; add.u64 %r445,%r447,%r589; add.u64 %r558,%r565,%r589; add.u64 %r559,%r558,%r589; mul.lo.u64 %r376,%r450,%r474; mul.lo.u64 %r369,%r450,%r179; mad.lo.u64 %r555,%r450,%r589,%r567; mov.u64 %r547,%r1023; mov.u64 %r544,%r546; mov.u64 %r541,%r178; .loc 1 2718 12 mov.u64 %r228,%r472; add.u64 %r1287,%r426,1; .loc 1 2768 8 add.u64 %r1331,%r450,-1; setp.ge.s64 %r1333,%r1331,%r588; sub.u64 %r1336,%r568,%r474; add.u64 %r1337,%r1336,%r571; $L44: .loc 1 2723 13 sub.u64 %r226,%r1287,%r228; min.s64 %r1472,%r226,256; .loc 1 2724 28 shr.u64 %r1044,%r1472,63; add.u64 %r1045,%r1472,%r1044; and.b64 %r1046,%r1045,1; sub.u64 %r1047,%r1046,%r1044; .loc 1 2724 14 sub.u64 %r227,%r1472,%r1047; .loc 1 2727 8 @ %r1339 bra $L43; $L73: .loc 1 2718 30 add.u64 %r228,%r228,256; .loc 1 2718 4 add.u64 %r540,%r540,2048; add.u64 %r541,%r541,%r543; add.u64 %r544,%r544,%r545; add.u64 %r547,%r547,%r1020; setp.ge.s64 %r1049,%r426,%r228; @ %r1049 bra $L44; bra $L45; $L43: .loc 1 2763 26 add.u64 %r1050,%r228,-1; add.u64 %r1051,%r1050,%r1472; .loc 1 2763 31 mul.lo.u64 %r284,%r1051,%r178; .loc 1 2734 13 add.u64 %r449,%r227,%r228; .loc 1 2734 8 add.u64 %r451,%r449,-1; add.u64 %r1052,%r284,1; shl.b64 %r1053,%r1052,3; add.u64 %r526,%r219,%r1053; add.u64 %r537,%r541,%r178; sub.u64 %r1054,%r537,%r284; shl.b64 %r539,%r1054,3; shl.b64 %r1056,%r1472,3; add.u64 %r1057,%r1056,-8; add.u64 %r300,%r1001,%r1057; add.u64 %r1058,%r228,%r562; add.u64 %r1059,%r1058,%r1472; shl.b64 %r1060,%r1059,3; add.u64 %r195,%r1060,%r567; shl.b64 %r557,%r228,3; add.u64 %r48,%r557,%r565; add.u64 %r46,%r557,%r558; add.u64 %r30,%r557,%r559; add.u64 %r556,%r555,%r557; add.u64 %r1061,%r369,%r228; add.u64 %r1062,%r1061,%r1472; shl.b64 %r1063,%r1062,3; add.u64 %r367,%r1063,%r567; .loc 1 2768 8 mov.u64 %r525,%r576; .loc 1 2727 16 mov.u64 %r243,%r472; setp.le.s64 %r1285,%r228,%r451; setp.gt.s64 %r1299,%r1472,%r227; setp.gt.s64 %r1298,%r1470,%r225; not.b64 %r1324,%r228; add.u64 %r1325,%r1324,%r449; and.b64 %r1326,%r1325,-2; .loc 1 2904 18 add.u64 %r1327,%r1472,%r228; .loc 1 2904 13 add.u64 %r1328,%r1327,-1; setp.gt.s64 %r1329,%r228,%r1328; $L72: .loc 1 2732 10 sub.u64 %r229,%r1340,%r243; min.s64 %r1473,%r229,256; .loc 1 2733 25 shr.s64 %r1304,%r1473,63; shr.u64 %r1067,%r1473,63; add.u64 %r1068,%r1473,%r1067; and.b64 %r1069,%r1068,1; sub.u64 %r1070,%r1069,%r1067; .loc 1 2733 11 sub.u64 %r230,%r1473,%r1070; .loc 1 2735 5 @ %r1285 bra $L46; $L55: .loc 1 2757 8 @ ! 
%r1299 bra $L48; bra $L47; $L46: .loc 1 2737 17 add.u64 %r454,%r230,%r243; .loc 1 2737 12 add.u64 %r455,%r454,-1; shl.b64 %r1074,%r1473,11; add.u64 %r1076,%r1074,-2048; add.u64 %r500,%r1001,%r1076; add.u64 %r506,%r526,%r539; add.u64 %r1079,%r1473,%r243; shl.b64 %r517,%r1079,3; shl.b64 %r521,%r1473,3; mov.u64 %r511,%r537; mov.u64 %r510,%r547; mov.u64 %r509,%r544; mov.u64 %r498,0; setp.le.s64 %r1292,%r243,%r455; setp.gt.s64 %r1300,%r1473,%r230; not.b64 %r1319,%r243; add.u64 %r1320,%r1319,%r454; and.b64 %r1321,%r1320,-2; add.u64 %r1322,%r1321,%r243; bra $L54; $L139: mov.u64 %r498,%r499; $L54: .loc 1 2738 9 @ %r1292 bra $L49; $L53: .loc 1 2749 12 @ %r1300 bra $L50; bra $L51; $L49: shl.b64 %r1082,%r498,3; add.u64 %r199,%r1001,%r1082; add.u64 %r1087,%r1322,%r511; shl.b64 %r1088,%r1087,3; add.u64 %r497,%r1088,%r1341; .loc 1 2738 9 mov.u64 %r181,%r506; sub.u64 %r1400,%r497,%r506; add.u64 %r1401,%r1400,-16; shr.u64 %r1399,%r1401,4; add.u64 %r1402,%r1399,1; and.b64 %r1403,%r1402,3; setp.eq.u64 %r1406,%r1403,0; @ %r1406 bra $L52; setp.eq.u64 %r1405,%r1403,1; @ %r1405 bra $L311; setp.eq.u64 %r1404,%r1403,2; @ %r1404 bra $L312; bra $L382; $L52: .loc 1 2741 7 add.u64 %r1090,%r181,%r509; add.u64 %r1091,%r1090,%r510; .loc 1 2740 49 ld.f64 %r1092,[%r1091]; st.f64 [%r199],%r1092; .loc 1 2742 49 ld.f64 %r1093,[%r181]; st.f64 [%r199+8],%r1093; .loc 1 2745 7 add.u64 %r1094,%r181,%r510; add.u64 %r1095,%r1094,%r509; .loc 1 2744 49 ld.f64 %r1096,[%r1095+8]; st.f64 [%r199+2048],%r1096; .loc 1 2746 49 ld.f64 %r1097,[%r181+8]; st.f64 [%r199+2056],%r1097; .loc 1 2738 9 add.u64 %r1407,%r199,4096; add.u64 %r1408,%r181,16; .loc 1 2741 7 add.u64 %r1476,%r1408,%r509; add.u64 %r1477,%r1476,%r510; .loc 1 2740 49 ld.f64 %r1478,[%r1477]; st.f64 [%r1407],%r1478; .loc 1 2742 49 ld.f64 %r1479,[%r1408]; st.f64 [%r1407+8],%r1479; .loc 1 2745 7 add.u64 %r1480,%r1408,%r510; add.u64 %r1481,%r1480,%r509; .loc 1 2744 49 ld.f64 %r1482,[%r1481+8]; st.f64 [%r1407+2048],%r1482; .loc 1 2746 49 ld.f64 %r1483,[%r1408+8]; st.f64 [%r1407+2056],%r1483; .loc 1 2738 9 add.u64 %r1484,%r199,8192; add.u64 %r1485,%r181,32; .loc 1 2741 7 add.u64 %r1487,%r1485,%r509; add.u64 %r1488,%r1487,%r510; .loc 1 2740 49 ld.f64 %r1489,[%r1488]; st.f64 [%r1484],%r1489; .loc 1 2742 49 ld.f64 %r1490,[%r1485]; st.f64 [%r1484+8],%r1490; .loc 1 2745 7 add.u64 %r1491,%r1485,%r510; add.u64 %r1492,%r1491,%r509; .loc 1 2744 49 ld.f64 %r1493,[%r1492+8]; st.f64 [%r1484+2048],%r1493; .loc 1 2746 49 ld.f64 %r1494,[%r1485+8]; st.f64 [%r1484+2056],%r1494; .loc 1 2738 9 add.u64 %r1495,%r199,12288; add.u64 %r1496,%r181,48; .loc 1 2741 7 add.u64 %r1498,%r1496,%r509; add.u64 %r1499,%r1498,%r510; .loc 1 2740 49 ld.f64 %r1500,[%r1499]; st.f64 [%r1495],%r1500; .loc 1 2742 49 ld.f64 %r1501,[%r1496]; st.f64 [%r1495+8],%r1501; .loc 1 2745 7 add.u64 %r1502,%r1496,%r510; add.u64 %r1503,%r1502,%r509; .loc 1 2744 49 ld.f64 %r1504,[%r1503+8]; st.f64 [%r1495+2048],%r1504; .loc 1 2746 49 ld.f64 %r1505,[%r1496+8]; st.f64 [%r1495+2056],%r1505; .loc 1 2738 9 add.u64 %r199,%r199,16384; add.u64 %r181,%r181,64; setp.ne.u64 %r1506,%r181,%r497; @ %r1506 bra $L52; bra $L53; $L50: .loc 1 2752 10 add.u64 %r1099,%r509,%r517; add.u64 %r1100,%r219,%r1099; .loc 1 2751 41 ld.f64 %r1101,[%r1100+-8]; st.f64 [%r500],%r1101; .loc 1 2754 10 add.u64 %r1102,%r506,%r521; .loc 1 2753 41 ld.f64 %r1103,[%r1102+-8]; st.f64 [%r500+8],%r1103; $L51: .loc 1 2735 5 add.u64 %r499,%r498,2; add.u64 %r500,%r500,16; add.u64 %r506,%r506,%r507; add.u64 %r509,%r509,%r507; add.u64 %r510,%r510,%r1023; add.u64 %r511,%r511,%r508; 
setp.ne.u64 %r1104,%r498,%r1326; @ %r1104 bra $L139; bra $L55; $L48: .loc 1 2767 25 shr.u64 %r1107,%r1304,62; add.u64 %r1108,%r1473,%r1107; and.b64 %r1109,%r1108,3; sub.u64 %r1110,%r1109,%r1107; .loc 1 2767 11 sub.u64 %r233,%r1473,%r1110; .loc 1 2769 5 @ %r1333 bra $L56; bra $L57; $L47: .loc 1 2759 17 add.u64 %r132,%r1473,%r243; .loc 1 2759 12 add.u64 %r1112,%r132,-1; .loc 1 2760 9 setp.lt.s64 %r1113,%r1112,%r243; @ %r1113 bra $L48; add.u64 %r1114,%r132,%r284; shl.b64 %r1115,%r1114,3; add.u64 %r136,%r219,%r1115; mov.u64 %r73,%r300; mov.u64 %r57,%r526; sub.u64 %r1390,%r136,%r526; add.u64 %r1391,%r1390,-8; shr.u64 %r1389,%r1391,3; add.u64 %r1392,%r1389,1; and.b64 %r1393,%r1392,3; setp.eq.u64 %r1396,%r1393,0; @ %r1396 bra $L58; setp.eq.u64 %r1395,%r1393,1; @ %r1395 bra $L313; setp.eq.u64 %r1394,%r1393,2; @ %r1394 bra $L314; bra $L383; $L58: .loc 1 2762 43 ld.f64 %r1116,[%r57]; st.f64 [%r73],%r1116; ld.f64 %r1509,[%r57+8]; st.f64 [%r73+2048],%r1509; ld.f64 %r1513,[%r57+16]; st.f64 [%r73+4096],%r1513; ld.f64 %r1517,[%r57+24]; st.f64 [%r73+6144],%r1517; .loc 1 2760 9 add.u64 %r57,%r57,32; add.u64 %r73,%r73,8192; setp.ne.u64 %r1518,%r57,%r136; @ %r1518 bra $L58; bra $L48; $L57: .loc 1 2871 8 @ ! %r1298 bra $L60; bra $L59; $L56: .loc 1 2876 14 add.u64 %r458,%r233,%r243; .loc 1 2876 9 add.u64 %r459,%r458,-1; add.u64 %r192,%r525,%r1027; add.u64 %r89,%r525,%r536; add.u64 %r42,%r536,%r89; not.b64 %r1119,%r243; add.u64 %r1120,%r1119,%r458; shr.u64 %r1121,%r1120,2; add.u64 %r1122,%r1121,1; shl.b64 %r288,%r1122,5; add.u64 %r1123,%r1336,%r458; shl.b64 %r1124,%r1123,3; add.u64 %r438,%r1124,%r217; add.u64 %r1125,%r458,%r568; shl.b64 %r1126,%r1125,3; add.u64 %r433,%r1126,%r217; add.u64 %r1128,%r1337,%r458; shl.b64 %r1129,%r1128,3; add.u64 %r430,%r1129,%r217; add.u64 %r1130,%r458,%r572; shl.b64 %r1131,%r1130,3; add.u64 %r427,%r1131,%r217; shl.b64 %r339,%r233,8; shl.b64 %r441,%r1473,3; mov.u64 %r446,%r445; mov.u64 %r448,%r447; mov.u64 %r453,%r575; mov.u64 %r456,%r573; mov.u64 %r190,%r525; mov.u64 %r197,%r195; mov.u64 %r248,%r588; mov.u64 %r443,0; mov.u64 %r444,%r443; setp.le.s64 %r1293,%r243,%r459; setp.gt.s64 %r1301,%r1473,%r233; $L68: .loc 1 2772 9 @ %r1293 bra $L61; $L67: .loc 1 2843 12 @ ! %r1301 bra $L63; bra $L62; $L61: .loc 1 2904 13 mov.u64 %r311,0; mov.u64 %r312,8192; mov.u64 %r313,6144; mov.u64 %r314,2048; mov.u64 %r315,-4096; $L66: add.u64 %r309,%r192,%r311; .loc 1 2774 10 ld.f64 %r249,[%r309]; .loc 1 2775 10 ld.f64 %r250,[%r309+8]; add.u64 %r307,%r190,%r311; .loc 1 2776 10 ld.f64 %r251,[%r307]; .loc 1 2777 10 ld.f64 %r252,[%r307+8]; add.u64 %r305,%r89,%r311; .loc 1 2778 10 ld.f64 %r253,[%r305]; .loc 1 2779 10 ld.f64 %r254,[%r305+8]; add.u64 %r303,%r42,%r311; .loc 1 2780 10 ld.f64 %r255,[%r303]; .loc 1 2781 10 ld.f64 %r256,[%r303+8]; .loc 1 2782 10 ld.f64 %r257,[%r309+16]; .loc 1 2783 10 ld.f64 %r258,[%r309+24]; .loc 1 2784 10 ld.f64 %r259,[%r307+16]; .loc 1 2785 10 ld.f64 %r260,[%r307+24]; .loc 1 2786 10 ld.f64 %r261,[%r305+16]; .loc 1 2787 10 ld.f64 %r262,[%r305+24]; .loc 1 2788 10 ld.f64 %r263,[%r303+16]; .loc 1 2789 10 ld.f64 %r264,[%r303+24]; .loc 1 2791 6 @ %r1329 bra $L64; add.u64 %r331,%r443,%r540; add.u64 %r329,%r1001,%r314; sub.u64 %r1382,%r197,%r331; add.u64 %r1383,%r1382,-8; shr.u64 %r1381,%r1383,3; and.b64 %r1385,%r1381,1; setp.ne.u64 %r1386,%r1385,0; @ ! 
%r1386 bra $L384; $L65: add.u64 %r326,%r315,%r329; .loc 1 2793 19 add.u64 %r1136,%r326,%r314; ld.f64 %r137,[%r1136]; .loc 1 2794 14 ld.f64 %r138,[%r331]; .loc 1 2793 14 fma.rn.f64 %r1519,%r137,%r138,%r249; .loc 1 2795 19 ld.f64 %r139,[%r329]; .loc 1 2795 14 fma.rn.f64 %r1520,%r138,%r139,%r250; add.u64 %r321,%r331,%r456; .loc 1 2798 14 add.u64 %r1137,%r321,%r453; ld.f64 %r140,[%r1137]; .loc 1 2797 14 fma.rn.f64 %r1521,%r137,%r140,%r251; .loc 1 2799 14 fma.rn.f64 %r1522,%r139,%r140,%r252; .loc 1 2802 14 add.u64 %r1138,%r321,%r448; ld.f64 %r141,[%r1138]; .loc 1 2801 14 fma.rn.f64 %r1523,%r137,%r141,%r253; .loc 1 2803 14 fma.rn.f64 %r1524,%r139,%r141,%r254; .loc 1 2806 14 add.u64 %r1139,%r321,%r446; ld.f64 %r142,[%r1139]; .loc 1 2805 14 fma.rn.f64 %r1525,%r137,%r142,%r255; .loc 1 2807 14 fma.rn.f64 %r1526,%r139,%r142,%r256; .loc 1 2809 19 add.u64 %r1140,%r326,%r313; ld.f64 %r143,[%r1140]; .loc 1 2809 14 fma.rn.f64 %r1527,%r138,%r143,%r257; .loc 1 2811 19 add.u64 %r1141,%r326,%r312; ld.f64 %r144,[%r1141]; .loc 1 2811 14 fma.rn.f64 %r1528,%r138,%r144,%r258; .loc 1 2813 14 fma.rn.f64 %r1529,%r140,%r143,%r259; .loc 1 2815 14 fma.rn.f64 %r1530,%r140,%r144,%r260; .loc 1 2817 14 fma.rn.f64 %r1531,%r141,%r143,%r261; .loc 1 2819 14 fma.rn.f64 %r1532,%r141,%r144,%r262; .loc 1 2821 14 fma.rn.f64 %r1533,%r142,%r143,%r263; .loc 1 2823 14 fma.rn.f64 %r1534,%r142,%r144,%r264; .loc 1 2791 6 add.u64 %r1387,%r331,8; add.u64 %r1388,%r329,8; add.u64 %r1537,%r315,%r1388; .loc 1 2793 19 add.u64 %r1538,%r1537,%r314; ld.f64 %r1539,[%r1538]; .loc 1 2794 14 ld.f64 %r1540,[%r1387]; .loc 1 2793 14 fma.rn.f64 %r249,%r1539,%r1540,%r1519; .loc 1 2795 19 ld.f64 %r1541,[%r1388]; .loc 1 2795 14 fma.rn.f64 %r250,%r1540,%r1541,%r1520; add.u64 %r1542,%r1387,%r456; .loc 1 2798 14 add.u64 %r1543,%r1542,%r453; ld.f64 %r1544,[%r1543]; .loc 1 2797 14 fma.rn.f64 %r251,%r1539,%r1544,%r1521; .loc 1 2799 14 fma.rn.f64 %r252,%r1541,%r1544,%r1522; .loc 1 2802 14 add.u64 %r1545,%r1542,%r448; ld.f64 %r1546,[%r1545]; .loc 1 2801 14 fma.rn.f64 %r253,%r1539,%r1546,%r1523; .loc 1 2803 14 fma.rn.f64 %r254,%r1541,%r1546,%r1524; .loc 1 2806 14 add.u64 %r1547,%r1542,%r446; ld.f64 %r1548,[%r1547]; .loc 1 2805 14 fma.rn.f64 %r255,%r1539,%r1548,%r1525; .loc 1 2807 14 fma.rn.f64 %r256,%r1541,%r1548,%r1526; .loc 1 2809 19 add.u64 %r1549,%r1537,%r313; ld.f64 %r1550,[%r1549]; .loc 1 2809 14 fma.rn.f64 %r257,%r1540,%r1550,%r1527; .loc 1 2811 19 add.u64 %r1551,%r1537,%r312; ld.f64 %r1552,[%r1551]; .loc 1 2811 14 fma.rn.f64 %r258,%r1540,%r1552,%r1528; .loc 1 2813 14 fma.rn.f64 %r259,%r1544,%r1550,%r1529; .loc 1 2815 14 fma.rn.f64 %r260,%r1544,%r1552,%r1530; .loc 1 2817 14 fma.rn.f64 %r261,%r1546,%r1550,%r1531; .loc 1 2819 14 fma.rn.f64 %r262,%r1546,%r1552,%r1532; .loc 1 2821 14 fma.rn.f64 %r263,%r1548,%r1550,%r1533; .loc 1 2823 14 fma.rn.f64 %r264,%r1548,%r1552,%r1534; .loc 1 2791 6 add.u64 %r331,%r331,16; add.u64 %r329,%r329,16; setp.ne.u64 %r1553,%r197,%r331; @ %r1553 bra $L65; $L64: .loc 1 2826 24 st.f64 [%r309],%r249; .loc 1 2827 28 st.f64 [%r309+8],%r250; .loc 1 2828 30 st.f64 [%r307],%r251; .loc 1 2829 34 st.f64 [%r307+8],%r252; .loc 1 2830 30 st.f64 [%r305],%r253; .loc 1 2831 34 st.f64 [%r305+8],%r254; .loc 1 2832 30 st.f64 [%r303],%r255; .loc 1 2833 34 st.f64 [%r303+8],%r256; .loc 1 2834 28 st.f64 [%r309+16],%r257; .loc 1 2835 28 st.f64 [%r309+24],%r258; .loc 1 2836 34 st.f64 [%r307+16],%r259; .loc 1 2837 34 st.f64 [%r307+24],%r260; .loc 1 2838 34 st.f64 [%r305+16],%r261; .loc 1 2839 34 st.f64 [%r305+24],%r262; .loc 1 2840 34 st.f64 [%r303+16],%r263; 
.loc 1 2841 34 st.f64 [%r303+24],%r264; .loc 1 2772 9 add.u64 %r315,%r315,-8192; add.u64 %r314,%r314,8192; add.u64 %r313,%r313,8192; add.u64 %r312,%r312,8192; add.u64 %r311,%r311,32; setp.ne.u64 %r1143,%r288,%r311; @ %r1143 bra $L66; bra $L67; $L63: .loc 1 2769 29 add.u64 %r248,%r248,4; .loc 1 2769 5 add.u64 %r197,%r197,%r196; add.u64 %r192,%r192,%r191; add.u64 %r190,%r190,%r191; add.u64 %r89,%r89,%r191; add.u64 %r42,%r42,%r191; add.u64 %r456,%r456,%r1030; add.u64 %r453,%r453,%r196; add.u64 %r448,%r448,%r196; add.u64 %r446,%r446,%r196; add.u64 %r444,%r444,%r191; add.u64 %r443,%r443,%r196; setp.le.s64 %r1144,%r248,%r1331; @ %r1144 bra $L68; bra $L57; $L62: add.u64 %r344,%r438,%r444; add.u64 %r343,%r433,%r444; add.u64 %r342,%r430,%r444; add.u64 %r341,%r427,%r444; add.u64 %r49,%r48,%r443; add.u64 %r209,%r46,%r443; add.u64 %r210,%r30,%r443; add.u64 %r442,%r192,%r441; .loc 1 2904 13 mov.u64 %r340,%r339; $L71: .loc 1 2848 14 ld.f64 %r244,[%r344]; .loc 1 2849 14 ld.f64 %r245,[%r343]; .loc 1 2850 14 ld.f64 %r246,[%r342]; .loc 1 2851 14 ld.f64 %r247,[%r341]; .loc 1 2853 10 @ %r1329 bra $L69; shl.b64 %r1148,%r340,3; add.u64 %r359,%r1001,%r1148; add.u64 %r357,%r443,%r540; mov.u64 %r351,%r210; mov.u64 %r353,%r209; mov.u64 %r355,%r49; sub.u64 %r1369,%r197,%r357; add.u64 %r1370,%r1369,-8; shr.u64 %r1368,%r1370,3; add.u64 %r1371,%r1368,1; and.b64 %r1372,%r1371,3; setp.eq.u64 %r1375,%r1372,0; @ %r1375 bra $L70; setp.eq.u64 %r1374,%r1372,1; @ %r1374 bra $L315; setp.eq.u64 %r1373,%r1372,2; @ %r1373 bra $L316; bra $L385; $L70: .loc 1 2855 16 ld.f64 %r146,[%r359]; .loc 1 2855 11 ld.f64 %r1149,[%r357]; fma.rn.f64 %r1554,%r146,%r1149,%r244; .loc 1 2857 11 ld.f64 %r1150,[%r355]; fma.rn.f64 %r1555,%r146,%r1150,%r245; .loc 1 2859 11 ld.f64 %r1151,[%r353]; fma.rn.f64 %r1556,%r146,%r1151,%r246; .loc 1 2861 11 ld.f64 %r1152,[%r351]; fma.rn.f64 %r1557,%r146,%r1152,%r247; .loc 1 2855 16 ld.f64 %r1563,[%r359+8]; .loc 1 2855 11 ld.f64 %r1564,[%r357+8]; fma.rn.f64 %r1565,%r1563,%r1564,%r1554; .loc 1 2857 11 ld.f64 %r1566,[%r355+8]; fma.rn.f64 %r1567,%r1563,%r1566,%r1555; .loc 1 2859 11 ld.f64 %r1568,[%r353+8]; fma.rn.f64 %r1569,%r1563,%r1568,%r1556; .loc 1 2861 11 ld.f64 %r1570,[%r351+8]; fma.rn.f64 %r1571,%r1563,%r1570,%r1557; .loc 1 2855 16 ld.f64 %r1578,[%r359+16]; .loc 1 2855 11 ld.f64 %r1579,[%r357+16]; fma.rn.f64 %r1580,%r1578,%r1579,%r1565; .loc 1 2857 11 ld.f64 %r1581,[%r355+16]; fma.rn.f64 %r1582,%r1578,%r1581,%r1567; .loc 1 2859 11 ld.f64 %r1583,[%r353+16]; fma.rn.f64 %r1584,%r1578,%r1583,%r1569; .loc 1 2861 11 ld.f64 %r1585,[%r351+16]; fma.rn.f64 %r1586,%r1578,%r1585,%r1571; .loc 1 2855 16 ld.f64 %r1593,[%r359+24]; .loc 1 2855 11 ld.f64 %r1594,[%r357+24]; fma.rn.f64 %r244,%r1593,%r1594,%r1580; .loc 1 2857 11 ld.f64 %r1595,[%r355+24]; fma.rn.f64 %r245,%r1593,%r1595,%r1582; .loc 1 2859 11 ld.f64 %r1596,[%r353+24]; fma.rn.f64 %r246,%r1593,%r1596,%r1584; .loc 1 2861 11 ld.f64 %r1597,[%r351+24]; fma.rn.f64 %r247,%r1593,%r1597,%r1586; .loc 1 2853 10 add.u64 %r359,%r359,32; add.u64 %r357,%r357,32; add.u64 %r355,%r355,32; add.u64 %r353,%r353,32; add.u64 %r351,%r351,32; setp.ne.u64 %r1598,%r197,%r357; @ %r1598 bra $L70; $L69: .loc 1 2864 28 st.f64 [%r344],%r244; .loc 1 2865 34 st.f64 [%r343],%r245; .loc 1 2866 34 st.f64 [%r342],%r246; .loc 1 2867 34 st.f64 [%r341],%r247; .loc 1 2846 6 add.u64 %r344,%r344,8; add.u64 %r343,%r343,8; add.u64 %r342,%r342,8; add.u64 %r341,%r341,8; add.u64 %r340,%r340,256; setp.ne.u64 %r1154,%r344,%r442; @ %r1154 bra $L71; bra $L63; $L60: .loc 1 2727 34 add.u64 %r243,%r243,256; .loc 1 2727 8 
add.u64 %r525,%r525,2048; add.u64 %r526,%r526,2048; setp.ge.s64 %r1155,%r186,%r243; @ %r1155 bra $L72; bra $L73; $L59: .loc 1 2873 17 add.u64 %r151,%r1470,%r588; .loc 1 2874 9 setp.le.s64 %r1156,%r151,%r450; @ %r1156 bra $L60; .loc 1 2876 14 add.u64 %r465,%r233,%r243; .loc 1 2876 9 add.u64 %r466,%r465,-1; .loc 1 2759 17 add.u64 %r375,%r1473,%r243; add.u64 %r1157,%r375,%r376; shl.b64 %r1158,%r1157,3; add.u64 %r377,%r1158,%r217; neg.s64 %r1160,%r375; shl.b64 %r1161,%r1160,3; shl.b64 %r1162,%r243,3; add.u64 %r360,%r1162,%r1161; shl.b64 %r1163,%r465,3; add.u64 %r362,%r1163,%r1161; shl.b64 %r404,%r233,8; .loc 1 2876 9 mov.u64 %r368,%r367; mov.u64 %r370,%r556; mov.u64 %r371,%r376; mov.u64 %r237,%r450; setp.gt.s64 %r1599,%r1473,%r233; setp.le.s64 %r1294,%r243,%r466; not.b64 %r1311,%r243; add.u64 %r1312,%r1311,%r465; and.b64 %r1313,%r1312,-4; add.u64 %r1314,%r1313,%r243; $L81: .loc 1 2877 6 @ %r1294 bra $L74; $L80: .loc 1 2901 6 @ %r1599 bra $L75; bra $L76; $L74: add.u64 %r389,%r360,%r377; add.u64 %r1171,%r1314,%r371; shl.b64 %r1172,%r1171,3; add.u64 %r1173,%r217,32; add.u64 %r379,%r1172,%r1173; .loc 1 2904 13 mov.u64 %r385,8192; mov.u64 %r386,6144; mov.u64 %r387,2048; mov.u64 %r388,-4096; $L79: .loc 1 2879 14 ld.f64 %r239,[%r389]; .loc 1 2880 14 ld.f64 %r240,[%r389+8]; .loc 1 2881 14 ld.f64 %r241,[%r389+16]; .loc 1 2882 14 ld.f64 %r242,[%r389+24]; .loc 1 2884 10 @ %r1329 bra $L77; add.u64 %r398,%r1001,%r387; mov.u64 %r399,%r370; sub.u64 %r1359,%r368,%r370; add.u64 %r1360,%r1359,-8; shr.u64 %r1358,%r1360,3; add.u64 %r1361,%r1358,1; and.b64 %r1362,%r1361,3; setp.eq.u64 %r1365,%r1362,0; @ %r1365 bra $L78; setp.eq.u64 %r1364,%r1362,1; @ %r1364 bra $L317; setp.eq.u64 %r1363,%r1362,2; @ %r1363 bra $L318; bra $L386; $L78: add.u64 %r395,%r388,%r398; .loc 1 2887 16 ld.f64 %r153,[%r399]; .loc 1 2886 16 add.u64 %r1175,%r395,%r387; .loc 1 2886 11 ld.f64 %r1176,[%r1175]; fma.rn.f64 %r1600,%r1176,%r153,%r239; .loc 1 2888 11 ld.f64 %r1177,[%r398]; fma.rn.f64 %r1601,%r153,%r1177,%r240; .loc 1 2890 16 add.u64 %r1178,%r395,%r386; .loc 1 2890 11 ld.f64 %r1179,[%r1178]; fma.rn.f64 %r1602,%r153,%r1179,%r241; .loc 1 2892 16 add.u64 %r1180,%r395,%r385; .loc 1 2892 11 ld.f64 %r1181,[%r1180]; fma.rn.f64 %r1603,%r153,%r1181,%r242; .loc 1 2884 10 add.u64 %r1367,%r398,8; add.u64 %r1606,%r388,%r1367; .loc 1 2887 16 ld.f64 %r1607,[%r399+8]; .loc 1 2886 16 add.u64 %r1608,%r1606,%r387; .loc 1 2886 11 ld.f64 %r1609,[%r1608]; fma.rn.f64 %r1610,%r1609,%r1607,%r1600; .loc 1 2888 11 ld.f64 %r1611,[%r1367]; fma.rn.f64 %r1612,%r1607,%r1611,%r1601; .loc 1 2890 16 add.u64 %r1613,%r1606,%r386; .loc 1 2890 11 ld.f64 %r1614,[%r1613]; fma.rn.f64 %r1615,%r1607,%r1614,%r1602; .loc 1 2892 16 add.u64 %r1616,%r1606,%r385; .loc 1 2892 11 ld.f64 %r1617,[%r1616]; fma.rn.f64 %r1618,%r1607,%r1617,%r1603; .loc 1 2884 10 add.u64 %r1620,%r398,16; add.u64 %r1622,%r388,%r1620; .loc 1 2887 16 ld.f64 %r1623,[%r399+16]; .loc 1 2886 16 add.u64 %r1624,%r1622,%r387; .loc 1 2886 11 ld.f64 %r1625,[%r1624]; fma.rn.f64 %r1626,%r1625,%r1623,%r1610; .loc 1 2888 11 ld.f64 %r1627,[%r1620]; fma.rn.f64 %r1628,%r1623,%r1627,%r1612; .loc 1 2890 16 add.u64 %r1629,%r1622,%r386; .loc 1 2890 11 ld.f64 %r1630,[%r1629]; fma.rn.f64 %r1631,%r1623,%r1630,%r1615; .loc 1 2892 16 add.u64 %r1632,%r1622,%r385; .loc 1 2892 11 ld.f64 %r1633,[%r1632]; fma.rn.f64 %r1634,%r1623,%r1633,%r1618; .loc 1 2884 10 add.u64 %r1636,%r398,24; add.u64 %r1638,%r388,%r1636; .loc 1 2887 16 ld.f64 %r1639,[%r399+24]; .loc 1 2886 16 add.u64 %r1640,%r1638,%r387; .loc 1 2886 11 ld.f64 %r1641,[%r1640]; 
fma.rn.f64 %r239,%r1641,%r1639,%r1626; .loc 1 2888 11 ld.f64 %r1642,[%r1636]; fma.rn.f64 %r240,%r1639,%r1642,%r1628; .loc 1 2890 16 add.u64 %r1643,%r1638,%r386; .loc 1 2890 11 ld.f64 %r1644,[%r1643]; fma.rn.f64 %r241,%r1639,%r1644,%r1631; .loc 1 2892 16 add.u64 %r1645,%r1638,%r385; .loc 1 2892 11 ld.f64 %r1646,[%r1645]; fma.rn.f64 %r242,%r1639,%r1646,%r1634; .loc 1 2884 10 add.u64 %r399,%r399,32; add.u64 %r398,%r398,32; setp.ne.u64 %r1647,%r368,%r399; @ %r1647 bra $L78; $L77: .loc 1 2895 28 st.f64 [%r389],%r239; .loc 1 2896 32 st.f64 [%r389+8],%r240; .loc 1 2897 32 st.f64 [%r389+16],%r241; .loc 1 2898 32 st.f64 [%r389+24],%r242; .loc 1 2877 6 add.u64 %r389,%r389,32; add.u64 %r388,%r388,-8192; add.u64 %r387,%r387,8192; add.u64 %r386,%r386,8192; add.u64 %r385,%r385,8192; setp.ne.u64 %r1183,%r379,%r389; @ %r1183 bra $L79; bra $L80; $L76: .loc 1 2874 39 add.u64 %r237,%r237,1; .loc 1 2874 9 add.u64 %r377,%r377,%r536; add.u64 %r371,%r371,%r474; add.u64 %r370,%r370,%r589; add.u64 %r368,%r368,%r589; setp.ne.u64 %r1184,%r151,%r237; @ %r1184 bra $L81; bra $L60; $L75: add.u64 %r406,%r362,%r377; .loc 1 2904 13 mov.u64 %r405,%r404; $L84: .loc 1 2903 14 ld.f64 %r238,[%r406]; .loc 1 2905 10 @ %r1329 bra $L82; shl.b64 %r1187,%r405,3; add.u64 %r302,%r1001,%r1187; mov.u64 %r267,%r370; sub.u64 %r1349,%r368,%r370; add.u64 %r1350,%r1349,-8; shr.u64 %r1348,%r1350,3; add.u64 %r1351,%r1348,1; and.b64 %r1352,%r1351,3; setp.eq.u64 %r1355,%r1352,0; @ %r1355 bra $L83; setp.eq.u64 %r1354,%r1352,1; @ %r1354 bra $L319; setp.eq.u64 %r1353,%r1352,2; @ %r1353 bra $L320; bra $L387; $L83: .loc 1 2907 11 ld.f64 %r1188,[%r302]; ld.f64 %r1189,[%r267]; fma.rn.f64 %r1648,%r1188,%r1189,%r238; ld.f64 %r1651,[%r302+8]; ld.f64 %r1652,[%r267+8]; fma.rn.f64 %r1653,%r1651,%r1652,%r1648; ld.f64 %r1657,[%r302+16]; ld.f64 %r1658,[%r267+16]; fma.rn.f64 %r1659,%r1657,%r1658,%r1653; ld.f64 %r1663,[%r302+24]; ld.f64 %r1664,[%r267+24]; fma.rn.f64 %r238,%r1663,%r1664,%r1659; .loc 1 2905 10 add.u64 %r302,%r302,32; add.u64 %r267,%r267,32; setp.ne.u64 %r1665,%r368,%r267; @ %r1665 bra $L83; $L82: .loc 1 2910 28 st.f64 [%r406],%r238; .loc 1 2901 6 add.u64 %r406,%r406,8; add.u64 %r405,%r405,256; setp.ne.u64 %r1191,%r377,%r406; @ %r1191 bra $L84; bra $L76; $L20: .loc 1 2958 11 cvt.u16.u32 %r1192,%r416; setp.eq.u16 %r1193,%r1192,1; @ ! 
%r1193 bra $L388; bra $L85; $L31: .loc 1 2920 38 set.u32.eq.u64 %r1195,%r178,1; neg.s32 %r1196,%r1195; .loc 1 2920 26 cvt.u16.u32 %r1198,%r1196; and.b16 %r1197,%r1198,%r958; .loc 1 2920 11 cvt.u32.u16 %r1200,%r1197; cvt.u16.u8 %r1201,%r1200; setp.eq.u16 %r1202,%r1201,0; @ %r1202 bra $L20; .loc 1 2920 43 setp.ne.u64 %r1203,%r472,1; @ %r1203 bra $L20; $L121: .loc 1 2922 10 cvt.u16.u32 %r1204,%r416; setp.ne.u16 %r1205,%r1204,1; @ %r1205 bra $L87; .loc 1 2948 4 setp.gt.s64 %r1206,%r187,0; @ %r1206 bra $L88; bra $L1; $L87: .loc 1 2929 4 setp.le.s64 %r1207,%r187,0; @ %r1207 bra $L1; shl.b64 %r613,%r474,3; shl.b64 %r1208,%r186,3; add.u64 %r612,%r215,%r1208; shl.b64 %r603,%r473,3; shl.b64 %r1209,%r426,3; add.u64 %r604,%r213,%r1209; mov.u64 %r617,0; mov.u64 %r610,%r617; .loc 1 2929 11 mov.u64 %r270,%r617; setp.gt.s64 %r1666,%r186,0; setp.le.s64 %r1345,%r426,0; $L91: .loc 1 2933 8 @ %r1666 bra $L90; $L95: .loc 1 2929 29 add.u64 %r270,%r270,1; .loc 1 2929 4 add.u64 %r610,%r610,%r474; add.u64 %r612,%r612,%r613; add.u64 %r617,%r617,%r179; setp.ne.u64 %r1211,%r187,%r270; @ %r1211 bra $L91; bra $L1; $L90: shl.b64 %r1212,%r610,3; add.u64 %r599,%r215,%r1212; shl.b64 %r1213,%r617,3; add.u64 %r622,%r214,%r1213; .loc 1 2933 8 mov.u64 %r601,%r604; mov.u64 %r600,0; $L94: .loc 1 2937 5 @ %r1345 bra $L140; shl.b64 %r1215,%r600,3; add.u64 %r595,%r213,%r1215; mov.u64 %r596,%r622; .loc 1 2936 7 mov.f64 %r271,0d0000000000000000; sub.u64 %r1418,%r601,%r595; add.u64 %r1419,%r1418,-8; shr.u64 %r1417,%r1419,3; add.u64 %r1420,%r1417,1; and.b64 %r1421,%r1420,3; setp.eq.u64 %r1424,%r1421,0; @ %r1424 bra $L93; setp.eq.u64 %r1423,%r1421,1; @ %r1423 bra $L321; setp.eq.u64 %r1422,%r1421,2; @ %r1422 bra $L322; bra $L389; $L93: .loc 1 2938 9 ld.f64 %r1216,[%r595]; ld.f64 %r1217,[%r596]; fma.rn.f64 %r1667,%r1216,%r1217,%r271; ld.f64 %r1670,[%r595+8]; ld.f64 %r1671,[%r596+8]; fma.rn.f64 %r1672,%r1670,%r1671,%r1667; ld.f64 %r1676,[%r595+16]; ld.f64 %r1677,[%r596+16]; fma.rn.f64 %r1678,%r1676,%r1677,%r1672; ld.f64 %r1682,[%r595+24]; ld.f64 %r1683,[%r596+24]; fma.rn.f64 %r271,%r1682,%r1683,%r1678; .loc 1 2937 5 add.u64 %r595,%r595,32; add.u64 %r596,%r596,32; setp.ne.u64 %r1684,%r595,%r601; @ %r1684 bra $L93; bra $L92; $L140: .loc 1 2936 7 mov.f64 %r271,0d0000000000000000; $L92: .loc 1 2939 15 st.f64 [%r599],%r271; .loc 1 2933 8 add.u64 %r599,%r599,8; add.u64 %r600,%r600,%r473; add.u64 %r601,%r601,%r603; setp.ne.u64 %r1219,%r599,%r612; @ %r1219 bra $L94; bra $L95; $L88: shl.b64 %r633,%r474,3; mov.u64 %r631,%r215; shl.b64 %r625,%r473,3; .loc 1 2948 4 mov.u64 %r634,0; .loc 1 2948 11 mov.u64 %r268,%r634; setp.le.s64 %r1685,%r426,0; $L98: .loc 1 2952 8 @ %r1685 bra $L141; mov.u64 %r623,%r213; shl.b64 %r1221,%r634,3; add.u64 %r626,%r1221,%r214; add.u64 %r1222,%r426,%r634; shl.b64 %r1223,%r1222,3; add.u64 %r642,%r1223,%r214; .loc 1 2951 10 mov.f64 %r269,0d0000000000000000; sub.u64 %r1428,%r642,%r626; add.u64 %r1429,%r1428,-8; shr.u64 %r1427,%r1429,3; add.u64 %r1430,%r1427,1; and.b64 %r1431,%r1430,3; setp.eq.u64 %r1434,%r1431,0; @ %r1434 bra $L97; setp.eq.u64 %r1433,%r1431,1; @ %r1433 bra $L323; setp.eq.u64 %r1432,%r1431,2; @ %r1432 bra $L324; bra $L390; $L97: .loc 1 2953 5 ld.f64 %r1224,[%r623]; ld.f64 %r1225,[%r626]; fma.rn.f64 %r1686,%r1224,%r1225,%r269; .loc 1 2952 8 add.u64 %r1687,%r623,%r625; .loc 1 2953 5 ld.f64 %r1689,[%r1687]; ld.f64 %r1690,[%r626+8]; fma.rn.f64 %r1691,%r1689,%r1690,%r1686; .loc 1 2952 8 add.u64 %r1692,%r1687,%r625; .loc 1 2953 5 ld.f64 %r1695,[%r1692]; ld.f64 %r1696,[%r626+16]; fma.rn.f64 
%r1697,%r1695,%r1696,%r1691; .loc 1 2952 8 add.u64 %r1698,%r1692,%r625; .loc 1 2953 5 ld.f64 %r1701,[%r1698]; ld.f64 %r1702,[%r626+24]; fma.rn.f64 %r269,%r1701,%r1702,%r1697; .loc 1 2952 8 add.u64 %r623,%r1698,%r625; add.u64 %r626,%r626,32; setp.ne.u64 %r1703,%r626,%r642; @ %r1703 bra $L97; bra $L96; $L141: .loc 1 2951 10 mov.f64 %r269,0d0000000000000000; $L96: .loc 1 2954 25 st.f64 [%r631],%r269; .loc 1 2948 29 add.u64 %r268,%r268,1; .loc 1 2948 4 add.u64 %r631,%r631,%r633; add.u64 %r634,%r634,%r179; setp.ne.u64 %r1227,%r187,%r268; @ %r1227 bra $L98; bra $L1; $L85: .loc 1 2963 7 setp.le.s64 %r1228,%r187,0; @ %r1228 bra $L1; shl.b64 %r653,%r414,3; mov.u64 %r651,%r215; shl.b64 %r645,%r473,3; shl.b64 %r648,%r472,3; mov.u64 %r654,0; .loc 1 2963 14 mov.u64 %r281,%r654; setp.le.s64 %r1704,%r426,0; $L101: .loc 1 2967 4 @ %r1704 bra $L142; mov.u64 %r643,%r213; shl.b64 %r1230,%r654,3; add.u64 %r646,%r214,%r1230; .loc 1 2966 6 mov.f64 %r282,0d0000000000000000; .loc 1 2967 11 mov.u64 %r283,0; and.b64 %r1439,%r426,3; setp.eq.u64 %r1442,%r1439,0; @ %r1442 bra $L100; setp.eq.u64 %r1441,%r1439,1; @ %r1441 bra $L325; setp.eq.u64 %r1440,%r1439,2; @ %r1440 bra $L326; bra $L391; $L100: .loc 1 2968 8 ld.f64 %r1231,[%r643]; ld.f64 %r1232,[%r646]; fma.rn.f64 %r1705,%r1231,%r1232,%r282; .loc 1 2967 4 add.u64 %r1707,%r643,%r645; add.u64 %r1708,%r646,%r648; .loc 1 2968 8 ld.f64 %r1709,[%r1707]; ld.f64 %r1710,[%r1708]; fma.rn.f64 %r1711,%r1709,%r1710,%r1705; .loc 1 2967 4 add.u64 %r1713,%r1707,%r645; add.u64 %r1714,%r1708,%r648; .loc 1 2968 8 ld.f64 %r1716,[%r1713]; ld.f64 %r1717,[%r1714]; fma.rn.f64 %r1718,%r1716,%r1717,%r1711; .loc 1 2967 4 add.u64 %r1720,%r1713,%r645; add.u64 %r1721,%r1714,%r648; .loc 1 2968 8 ld.f64 %r1723,[%r1720]; ld.f64 %r1724,[%r1721]; fma.rn.f64 %r282,%r1723,%r1724,%r1718; .loc 1 2967 28 add.u64 %r283,%r283,4; .loc 1 2967 4 add.u64 %r643,%r1720,%r645; add.u64 %r646,%r1721,%r648; setp.ne.u64 %r1725,%r426,%r283; @ %r1725 bra $L100; bra $L99; $L142: .loc 1 2966 6 mov.f64 %r282,0d0000000000000000; $L99: .loc 1 2969 21 st.f64 [%r651],%r282; .loc 1 2963 32 add.u64 %r281,%r281,1; .loc 1 2963 7 add.u64 %r651,%r651,%r653; add.u64 %r654,%r654,%r179; setp.ne.u64 %r1234,%r187,%r281; @ %r1234 bra $L101; bra $L1; $L388: .loc 1 2972 11 setp.gt.s64 %r1235,%r178,%r473; @ %r1235 bra $L102; .loc 1 2993 7 setp.gt.s64 %r1236,%r187,0; @ %r1236 bra $L103; bra $L1; $L102: .loc 1 2974 7 setp.gt.s64 %r1237,%r187,0; @ %r1237 bra $L104; bra $L1; $L108: shl.b64 %r1238,%r687,3; add.u64 %r683,%r215,%r1238; .loc 1 2975 9 mov.u64 %r280,0; and.b64 %r1455,%r186,3; setp.eq.u64 %r1458,%r1455,0; @ %r1458 bra $L105; setp.eq.u64 %r1457,%r1455,1; @ %r1457 bra $L327; setp.eq.u64 %r1456,%r1455,2; @ %r1456 bra $L328; bra $L392; $L105: .loc 1 2976 34 st.f64 [%r683],%r1346; .loc 1 2975 2 add.u64 %r1727,%r683,%r685; .loc 1 2976 34 st.f64 [%r1727],%r1346; .loc 1 2975 2 add.u64 %r1729,%r1727,%r685; .loc 1 2976 34 st.f64 [%r1729],%r1346; .loc 1 2975 2 add.u64 %r1732,%r1729,%r685; .loc 1 2976 34 st.f64 [%r1732],%r1346; .loc 1 2975 27 add.u64 %r280,%r280,4; .loc 1 2975 2 add.u64 %r683,%r1732,%r685; setp.ne.u64 %r1734,%r186,%r280; @ %r1734 bra $L105; $L109: .loc 1 2974 32 add.u64 %r279,%r294,1; .loc 1 2974 7 add.u64 %r687,%r687,%r474; setp.ne.u64 %r1241,%r187,%r279; @ %r1241 bra $L143; shl.b64 %r671,%r472,3; shl.b64 %r662,%r473,3; mov.u64 %r676,0; mov.u64 %r675,%r676; mov.u64 %r295,%r676; setp.le.s64 %r1735,%r426,0; bra $L107; $L104: shl.b64 %r685,%r414,3; mov.u64 %r687,0; .loc 1 2974 14 mov.u64 %r294,%r687; setp.gt.s64 %r1736,%r186,0; .loc 
1 2976 34 mov.f64 %r1346,0d0000000000000000; bra $L106; $L143: mov.u64 %r294,%r279; $L106: .loc 1 2975 2 @ %r1736 bra $L108; bra $L109; $L112: shl.b64 %r1243,%r667,3; add.u64 %r660,%r213,%r1243; .loc 1 2980 4 mov.u64 %r659,%r680; .loc 1 2980 11 mov.u64 %r278,0; and.b64 %r1447,%r186,3; setp.eq.u64 %r1450,%r1447,0; @ %r1450 bra $L110; setp.eq.u64 %r1449,%r1447,1; @ %r1449 bra $L329; setp.eq.u64 %r1448,%r1447,2; @ %r1448 bra $L330; bra $L393; $L110: .loc 1 2982 36 ld.f64 %r1244,[%r660]; ld.f64 %r1245,[%r669]; ld.f64 %r1246,[%r659]; fma.rn.f64 %r175,%r1244,%r1245,%r1246; st.f64 [%r659],%r175; .loc 1 2980 4 add.u64 %r1738,%r659,%r685; add.u64 %r1739,%r660,%r662; .loc 1 2982 36 ld.f64 %r1740,[%r1739]; ld.f64 %r1741,[%r669]; ld.f64 %r1742,[%r1738]; fma.rn.f64 %r1743,%r1740,%r1741,%r1742; st.f64 [%r1738],%r1743; .loc 1 2980 4 add.u64 %r1745,%r1738,%r685; add.u64 %r1746,%r1739,%r662; .loc 1 2982 36 ld.f64 %r1748,[%r1746]; ld.f64 %r1749,[%r669]; ld.f64 %r1750,[%r1745]; fma.rn.f64 %r1751,%r1748,%r1749,%r1750; st.f64 [%r1745],%r1751; .loc 1 2980 4 add.u64 %r1753,%r1745,%r685; add.u64 %r1754,%r1746,%r662; .loc 1 2982 36 ld.f64 %r1756,[%r1754]; ld.f64 %r1757,[%r669]; ld.f64 %r1758,[%r1753]; fma.rn.f64 %r1759,%r1756,%r1757,%r1758; st.f64 [%r1753],%r1759; .loc 1 2980 29 add.u64 %r278,%r278,4; .loc 1 2980 4 add.u64 %r659,%r1753,%r685; add.u64 %r660,%r1754,%r662; setp.ne.u64 %r1760,%r186,%r278; @ %r1760 bra $L110; $L113: .loc 1 2979 26 add.u64 %r277,%r277,1; .loc 1 2979 2 add.u64 %r667,%r667,%r178; add.u64 %r669,%r669,%r671; setp.eq.u64 %r1248,%r426,%r277; @ %r1248 bra $L111; $L114: .loc 1 2980 4 @ %r1736 bra $L112; bra $L113; $L111: .loc 1 2978 32 add.u64 %r276,%r295,1; .loc 1 2978 7 add.u64 %r675,%r675,%r474; add.u64 %r676,%r676,%r179; setp.eq.u64 %r1250,%r294,%r295; @ %r1250 bra $L1; mov.u64 %r295,%r276; $L107: .loc 1 2979 2 @ %r1735 bra $L111; shl.b64 %r1252,%r676,3; add.u64 %r669,%r214,%r1252; shl.b64 %r1253,%r675,3; add.u64 %r680,%r215,%r1253; mov.u64 %r667,0; .loc 1 2979 9 mov.u64 %r277,%r667; bra $L114; $L103: shl.b64 %r702,%r414,3; shl.b64 %r693,%r178,3; shl.b64 %r697,%r472,3; .loc 1 2993 7 mov.u64 %r711,0; mov.u64 %r709,%r711; .loc 1 2993 14 mov.u64 %r272,%r711; setp.gt.s64 %r1761,%r186,0; setp.le.s64 %r1347,%r426,0; $L116: .loc 1 2997 4 @ %r1761 bra $L115; $L120: .loc 1 2993 32 add.u64 %r272,%r272,1; .loc 1 2993 7 add.u64 %r709,%r709,%r474; add.u64 %r711,%r711,%r179; setp.ne.u64 %r1255,%r187,%r272; @ %r1255 bra $L116; bra $L1; $L115: shl.b64 %r1256,%r709,3; add.u64 %r700,%r215,%r1256; shl.b64 %r1257,%r711,3; add.u64 %r717,%r214,%r1257; .loc 1 2997 4 mov.u64 %r704,0; .loc 1 2997 11 mov.u64 %r273,%r704; $L119: .loc 1 3001 8 @ %r1347 bra $L144; shl.b64 %r1259,%r704,3; add.u64 %r691,%r213,%r1259; mov.u64 %r695,%r717; .loc 1 3000 10 mov.f64 %r274,0d0000000000000000; .loc 1 3001 15 mov.u64 %r275,0; and.b64 %r1463,%r426,3; setp.eq.u64 %r1466,%r1463,0; @ %r1466 bra $L118; setp.eq.u64 %r1465,%r1463,1; @ %r1465 bra $L331; setp.eq.u64 %r1464,%r1463,2; @ %r1464 bra $L332; bra $L394; $L118: .loc 1 3002 5 ld.f64 %r1260,[%r691]; ld.f64 %r1261,[%r695]; fma.rn.f64 %r1763,%r1260,%r1261,%r274; .loc 1 3001 8 add.u64 %r1765,%r691,%r693; add.u64 %r1766,%r695,%r697; .loc 1 3002 5 ld.f64 %r1767,[%r1765]; ld.f64 %r1768,[%r1766]; fma.rn.f64 %r1769,%r1767,%r1768,%r1763; .loc 1 3001 8 add.u64 %r1771,%r1765,%r693; add.u64 %r1772,%r1766,%r697; .loc 1 3002 5 ld.f64 %r1774,[%r1771]; ld.f64 %r1775,[%r1772]; fma.rn.f64 %r1776,%r1774,%r1775,%r1769; .loc 1 3001 8 add.u64 %r1778,%r1771,%r693; add.u64 %r1779,%r1772,%r697; .loc 1 3002 
5 ld.f64 %r1781,[%r1778]; ld.f64 %r1782,[%r1779]; fma.rn.f64 %r274,%r1781,%r1782,%r1776; .loc 1 3001 32 add.u64 %r275,%r275,4; .loc 1 3001 8 add.u64 %r691,%r1778,%r693; add.u64 %r695,%r1779,%r697; setp.ne.u64 %r1783,%r426,%r275; @ %r1783 bra $L118; bra $L117; $L144: .loc 1 3000 10 mov.f64 %r274,0d0000000000000000; $L117: .loc 1 3003 27 st.f64 [%r700],%r274; .loc 1 2997 29 add.u64 %r273,%r273,1; .loc 1 2997 4 add.u64 %r700,%r700,%r702; add.u64 %r704,%r704,%r473; setp.ne.u64 %r1263,%r186,%r273; @ %r1263 bra $L119; bra $L120; $L32: .loc 1 2920 11 setp.eq.u64 %r1264,%r178,1; @ %r1264 bra $L121; bra $L20; $L2: .loc 1 2476 6 ld.u64 %r1265,[%r718]; setp.ne.u64 %r1266,%r1265,0; @ %r1266 bra $L122; .loc 1 2485 4 ld.u64 %r1268,[%r719+56]; add.u64 %r1267,%r1268,1; ld.u64 %r1269,[%r719+48]; sub.u64 %r478,%r1267,%r1269; .loc 1 2483 15 ld.s8 %r1271,[%r720+28]; cvt.u16.u32 %r1270,%r1271; setp.eq.u16 %r1272,%r1270,1; @ %r1272 bra $L123; bra $L124; $L126: .loc 1 2560 16 ld.u64 %r473,[%r719+40]; .loc 1 2548 27 mov.u64 %r414,%r474; bra $L125; $L14: ld.u64 %r474,[%r718+40]; .loc 1 2543 6 ld.s8 %r1274,[%r718+28]; cvt.u16.u32 %r1273,%r1274; setp.eq.u16 %r1275,%r1273,1; @ ! %r1275 bra $L127; bra $L126; $L11: .loc 1 2560 16 ld.u64 %r473,[%r719+40]; mov.u32 %r416,%r301; .loc 1 2548 27 mov.u64 %r474,%r414; .loc 1 2563 14 mov.u64 %r186,1; .loc 1 2561 16 mov.u64 %r178,%r186; bra $L12; $L5: .loc 1 2485 4 ld.u64 %r1277,[%r719+56]; add.u64 %r1276,%r1277,1; ld.u64 %r1278,[%r719+48]; sub.u64 %r478,%r1276,%r1278; bra $L124; $L128: .loc 1 2509 17 ld.u64 %r1280,[%r718+56]; add.u64 %r1279,%r1280,1; .loc 1 2509 15 ld.u64 %r1281,[%r718+48]; sub.u64 %r426,%r1279,%r1281; ld.s8 %r1286,[%r720+28]; bra $L9; $L122: .loc 1 2502 12 cvta.global.u64 %r1282,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r1283,[%r1282+36]; setp.ne.u32 %r1284,%r1283,0; @ ! 
%r1284 bra $L8; bra $L128; $L1: .loc 1 3007 1 ret; $L387: .loc 1 2907 11 ld.f64 %r1784,[%r302]; ld.f64 %r1785,[%r370]; fma.rn.f64 %r238,%r1784,%r1785,%r238; .loc 1 2905 10 add.u64 %r302,%r302,8; add.u64 %r267,%r370,8; $L320: .loc 1 2907 11 ld.f64 %r1787,[%r302]; ld.f64 %r1788,[%r267]; fma.rn.f64 %r238,%r1787,%r1788,%r238; .loc 1 2905 10 add.u64 %r302,%r302,8; add.u64 %r267,%r267,8; $L319: .loc 1 2907 11 ld.f64 %r1790,[%r302]; ld.f64 %r1791,[%r267]; fma.rn.f64 %r238,%r1790,%r1791,%r238; .loc 1 2905 10 add.u64 %r302,%r302,8; add.u64 %r267,%r267,8; setp.ne.u64 %r1792,%r368,%r267; @ %r1792 bra $L83; bra $L82; $L386: add.u64 %r1793,%r388,%r398; .loc 1 2887 16 ld.f64 %r1794,[%r370]; .loc 1 2886 16 add.u64 %r1795,%r1793,%r387; .loc 1 2886 11 ld.f64 %r1796,[%r1795]; fma.rn.f64 %r239,%r1796,%r1794,%r239; .loc 1 2888 11 ld.f64 %r1797,[%r398]; fma.rn.f64 %r240,%r1794,%r1797,%r240; .loc 1 2890 16 add.u64 %r1798,%r1793,%r386; .loc 1 2890 11 ld.f64 %r1799,[%r1798]; fma.rn.f64 %r241,%r1794,%r1799,%r241; .loc 1 2892 16 add.u64 %r1800,%r1793,%r385; .loc 1 2892 11 ld.f64 %r1801,[%r1800]; fma.rn.f64 %r242,%r1794,%r1801,%r242; .loc 1 2884 10 add.u64 %r399,%r370,8; add.u64 %r398,%r398,8; $L318: add.u64 %r1803,%r388,%r398; .loc 1 2887 16 ld.f64 %r1804,[%r399]; .loc 1 2886 16 add.u64 %r1805,%r1803,%r387; .loc 1 2886 11 ld.f64 %r1806,[%r1805]; fma.rn.f64 %r239,%r1806,%r1804,%r239; .loc 1 2888 11 ld.f64 %r1807,[%r398]; fma.rn.f64 %r240,%r1804,%r1807,%r240; .loc 1 2890 16 add.u64 %r1808,%r1803,%r386; .loc 1 2890 11 ld.f64 %r1809,[%r1808]; fma.rn.f64 %r241,%r1804,%r1809,%r241; .loc 1 2892 16 add.u64 %r1810,%r1803,%r385; .loc 1 2892 11 ld.f64 %r1811,[%r1810]; fma.rn.f64 %r242,%r1804,%r1811,%r242; .loc 1 2884 10 add.u64 %r399,%r399,8; add.u64 %r398,%r398,8; $L317: add.u64 %r1813,%r388,%r398; .loc 1 2887 16 ld.f64 %r1814,[%r399]; .loc 1 2886 16 add.u64 %r1815,%r1813,%r387; .loc 1 2886 11 ld.f64 %r1816,[%r1815]; fma.rn.f64 %r239,%r1816,%r1814,%r239; .loc 1 2888 11 ld.f64 %r1817,[%r398]; fma.rn.f64 %r240,%r1814,%r1817,%r240; .loc 1 2890 16 add.u64 %r1818,%r1813,%r386; .loc 1 2890 11 ld.f64 %r1819,[%r1818]; fma.rn.f64 %r241,%r1814,%r1819,%r241; .loc 1 2892 16 add.u64 %r1820,%r1813,%r385; .loc 1 2892 11 ld.f64 %r1821,[%r1820]; fma.rn.f64 %r242,%r1814,%r1821,%r242; .loc 1 2884 10 add.u64 %r399,%r399,8; add.u64 %r398,%r398,8; setp.ne.u64 %r1822,%r368,%r399; @ %r1822 bra $L78; bra $L77; $L385: .loc 1 2855 16 ld.f64 %r1823,[%r359]; .loc 1 2855 11 ld.f64 %r1824,[%r357]; fma.rn.f64 %r244,%r1823,%r1824,%r244; .loc 1 2857 11 ld.f64 %r1825,[%r49]; fma.rn.f64 %r245,%r1823,%r1825,%r245; .loc 1 2859 11 ld.f64 %r1826,[%r209]; fma.rn.f64 %r246,%r1823,%r1826,%r246; .loc 1 2861 11 ld.f64 %r1827,[%r210]; fma.rn.f64 %r247,%r1823,%r1827,%r247; .loc 1 2853 10 add.u64 %r359,%r359,8; add.u64 %r357,%r357,8; add.u64 %r355,%r49,8; add.u64 %r353,%r209,8; add.u64 %r351,%r210,8; $L316: .loc 1 2855 16 ld.f64 %r1829,[%r359]; .loc 1 2855 11 ld.f64 %r1830,[%r357]; fma.rn.f64 %r244,%r1829,%r1830,%r244; .loc 1 2857 11 ld.f64 %r1831,[%r355]; fma.rn.f64 %r245,%r1829,%r1831,%r245; .loc 1 2859 11 ld.f64 %r1832,[%r353]; fma.rn.f64 %r246,%r1829,%r1832,%r246; .loc 1 2861 11 ld.f64 %r1833,[%r351]; fma.rn.f64 %r247,%r1829,%r1833,%r247; .loc 1 2853 10 add.u64 %r359,%r359,8; add.u64 %r357,%r357,8; add.u64 %r355,%r355,8; add.u64 %r353,%r353,8; add.u64 %r351,%r351,8; $L315: .loc 1 2855 16 ld.f64 %r1835,[%r359]; .loc 1 2855 11 ld.f64 %r1836,[%r357]; fma.rn.f64 %r244,%r1835,%r1836,%r244; .loc 1 2857 11 ld.f64 %r1837,[%r355]; fma.rn.f64 %r245,%r1835,%r1837,%r245; .loc 1 
2859 11 ld.f64 %r1838,[%r353]; fma.rn.f64 %r246,%r1835,%r1838,%r246; .loc 1 2861 11 ld.f64 %r1839,[%r351]; fma.rn.f64 %r247,%r1835,%r1839,%r247; .loc 1 2853 10 add.u64 %r359,%r359,8; add.u64 %r357,%r357,8; add.u64 %r355,%r355,8; add.u64 %r353,%r353,8; add.u64 %r351,%r351,8; setp.ne.u64 %r1840,%r197,%r357; @ %r1840 bra $L70; bra $L69; $L384: add.u64 %r1841,%r315,%r329; .loc 1 2793 19 add.u64 %r1842,%r1841,%r314; ld.f64 %r1843,[%r1842]; .loc 1 2794 14 ld.f64 %r1844,[%r331]; .loc 1 2793 14 fma.rn.f64 %r249,%r1843,%r1844,%r249; .loc 1 2795 19 ld.f64 %r1845,[%r329]; .loc 1 2795 14 fma.rn.f64 %r250,%r1844,%r1845,%r250; add.u64 %r1846,%r331,%r456; .loc 1 2798 14 add.u64 %r1847,%r1846,%r453; ld.f64 %r1848,[%r1847]; .loc 1 2797 14 fma.rn.f64 %r251,%r1843,%r1848,%r251; .loc 1 2799 14 fma.rn.f64 %r252,%r1845,%r1848,%r252; .loc 1 2802 14 add.u64 %r1849,%r1846,%r448; ld.f64 %r1850,[%r1849]; .loc 1 2801 14 fma.rn.f64 %r253,%r1843,%r1850,%r253; .loc 1 2803 14 fma.rn.f64 %r254,%r1845,%r1850,%r254; .loc 1 2806 14 add.u64 %r1851,%r1846,%r446; ld.f64 %r1852,[%r1851]; .loc 1 2805 14 fma.rn.f64 %r255,%r1843,%r1852,%r255; .loc 1 2807 14 fma.rn.f64 %r256,%r1845,%r1852,%r256; .loc 1 2809 19 add.u64 %r1853,%r1841,%r313; ld.f64 %r1854,[%r1853]; .loc 1 2809 14 fma.rn.f64 %r257,%r1844,%r1854,%r257; .loc 1 2811 19 add.u64 %r1855,%r1841,%r312; ld.f64 %r1856,[%r1855]; .loc 1 2811 14 fma.rn.f64 %r258,%r1844,%r1856,%r258; .loc 1 2813 14 fma.rn.f64 %r259,%r1848,%r1854,%r259; .loc 1 2815 14 fma.rn.f64 %r260,%r1848,%r1856,%r260; .loc 1 2817 14 fma.rn.f64 %r261,%r1850,%r1854,%r261; .loc 1 2819 14 fma.rn.f64 %r262,%r1850,%r1856,%r262; .loc 1 2821 14 fma.rn.f64 %r263,%r1852,%r1854,%r263; .loc 1 2823 14 fma.rn.f64 %r264,%r1852,%r1856,%r264; .loc 1 2791 6 add.u64 %r331,%r331,8; add.u64 %r329,%r329,8; setp.ne.u64 %r1857,%r197,%r331; @ %r1857 bra $L65; bra $L64; $L383: .loc 1 2762 43 ld.f64 %r1858,[%r526]; st.f64 [%r300],%r1858; .loc 1 2760 9 add.u64 %r57,%r526,8; add.u64 %r73,%r300,2048; $L314: .loc 1 2762 43 ld.f64 %r1860,[%r57]; st.f64 [%r73],%r1860; .loc 1 2760 9 add.u64 %r57,%r57,8; add.u64 %r73,%r73,2048; $L313: .loc 1 2762 43 ld.f64 %r1862,[%r57]; st.f64 [%r73],%r1862; .loc 1 2760 9 add.u64 %r57,%r57,8; add.u64 %r73,%r73,2048; setp.ne.u64 %r1863,%r57,%r136; @ %r1863 bra $L58; bra $L48; $L382: .loc 1 2741 7 add.u64 %r1864,%r506,%r509; add.u64 %r1865,%r1864,%r510; .loc 1 2740 49 ld.f64 %r1866,[%r1865]; st.f64 [%r199],%r1866; .loc 1 2742 49 ld.f64 %r1867,[%r506]; st.f64 [%r199+8],%r1867; .loc 1 2745 7 add.u64 %r1868,%r506,%r510; add.u64 %r1869,%r1868,%r509; .loc 1 2744 49 ld.f64 %r1870,[%r1869+8]; st.f64 [%r199+2048],%r1870; .loc 1 2746 49 ld.f64 %r1871,[%r506+8]; st.f64 [%r199+2056],%r1871; .loc 1 2738 9 add.u64 %r199,%r199,4096; add.u64 %r181,%r506,16; $L312: .loc 1 2741 7 add.u64 %r1873,%r181,%r509; add.u64 %r1874,%r1873,%r510; .loc 1 2740 49 ld.f64 %r1875,[%r1874]; st.f64 [%r199],%r1875; .loc 1 2742 49 ld.f64 %r1876,[%r181]; st.f64 [%r199+8],%r1876; .loc 1 2745 7 add.u64 %r1877,%r181,%r510; add.u64 %r1878,%r1877,%r509; .loc 1 2744 49 ld.f64 %r1879,[%r1878+8]; st.f64 [%r199+2048],%r1879; .loc 1 2746 49 ld.f64 %r1880,[%r181+8]; st.f64 [%r199+2056],%r1880; .loc 1 2738 9 add.u64 %r199,%r199,4096; add.u64 %r181,%r181,16; $L311: .loc 1 2741 7 add.u64 %r1882,%r181,%r509; add.u64 %r1883,%r1882,%r510; .loc 1 2740 49 ld.f64 %r1884,[%r1883]; st.f64 [%r199],%r1884; .loc 1 2742 49 ld.f64 %r1885,[%r181]; st.f64 [%r199+8],%r1885; .loc 1 2745 7 add.u64 %r1886,%r181,%r510; add.u64 %r1887,%r1886,%r509; .loc 1 2744 49 ld.f64 
%r1888,[%r1887+8]; st.f64 [%r199+2048],%r1888; .loc 1 2746 49 ld.f64 %r1889,[%r181+8]; st.f64 [%r199+2056],%r1889; .loc 1 2738 9 add.u64 %r199,%r199,4096; add.u64 %r181,%r181,16; setp.ne.u64 %r1890,%r181,%r497; @ %r1890 bra $L52; bra $L53; $L380: .loc 1 2688 2 @ ! %r1302 bra $L395; .loc 1 2689 22 mov.u32 %r1881892,[%value_in]; } $L395: .loc 1 2687 24 mov.u64 %r266,2; .loc 1 2687 7 add.u64 %r590,%r215,%r591; $L310: .loc 1 2688 2 @ ! %r1302 bra $L396; .loc 1 2689 22 mov.u32 %r1881895,[%value_in]; } $L396: .loc 1 2687 24 add.u64 %r266,%r266,1; .loc 1 2687 7 add.u64 %r590,%r590,%r591; $L309: .loc 1 2688 2 @ ! %r1302 bra $L397; .loc 1 2689 22 mov.u32 %r1881898,[%value_in]; } $L397: .loc 1 2687 24 add.u64 %r266,%r266,1; .loc 1 2687 7 add.u64 %r590,%r590,%r591; setp.eq.u64 %r1899,%r266,%r594; @ ! %r1899 bra $L34; bra $L33; $L247: .loc 1 2689 22 mov.u32 %r190014699001901,[%value_in]; } $L381: .loc 1 2687 7 add.u64 %r1903,%r1469,%r591; .loc 1 2688 2 @ ! %r1302 bra $L398; .loc 1 2689 22 mov.u32 %r190519019051906,[%value_in]; } $L398: .loc 1 2687 7 add.u64 %r1908,%r1903,%r591; .loc 1 2688 2 @ ! %r1302 bra $L399; .loc 1 2689 22 mov.u32 %r19101919101911,[%value_in]; } $L399: .loc 1 2687 24 add.u64 %r266,%r1416,3; .loc 1 2687 7 add.u64 %r590,%r1908,%r591; setp.eq.u64 %r1912,%r266,%r594; @ ! %r1912 bra $L34; bra $L33; $L389: .loc 1 2938 9 ld.f64 %r1913,[%r595]; ld.f64 %r1914,[%r622]; fma.rn.f64 %r271,%r1913,%r1914,%r271; .loc 1 2937 5 add.u64 %r595,%r595,8; add.u64 %r596,%r622,8; $L322: .loc 1 2938 9 ld.f64 %r1916,[%r595]; ld.f64 %r1917,[%r596]; fma.rn.f64 %r271,%r1916,%r1917,%r271; .loc 1 2937 5 add.u64 %r595,%r595,8; add.u64 %r596,%r596,8; $L321: .loc 1 2938 9 ld.f64 %r1919,[%r595]; ld.f64 %r1920,[%r596]; fma.rn.f64 %r271,%r1919,%r1920,%r271; .loc 1 2937 5 add.u64 %r595,%r595,8; add.u64 %r596,%r596,8; setp.ne.u64 %r1921,%r595,%r601; @ %r1921 bra $L93; bra $L92; $L390: .loc 1 2953 5 ld.f64 %r1922,[%r213]; ld.f64 %r1923,[%r626]; fma.rn.f64 %r269,%r1922,%r1923,%r269; .loc 1 2952 8 add.u64 %r623,%r213,%r625; add.u64 %r626,%r626,8; $L324: .loc 1 2953 5 ld.f64 %r1925,[%r623]; ld.f64 %r1926,[%r626]; fma.rn.f64 %r269,%r1925,%r1926,%r269; .loc 1 2952 8 add.u64 %r623,%r623,%r625; add.u64 %r626,%r626,8; $L323: .loc 1 2953 5 ld.f64 %r1928,[%r623]; ld.f64 %r1929,[%r626]; fma.rn.f64 %r269,%r1928,%r1929,%r269; .loc 1 2952 8 add.u64 %r623,%r623,%r625; add.u64 %r626,%r626,8; setp.ne.u64 %r1930,%r626,%r642; @ %r1930 bra $L97; bra $L96; $L391: .loc 1 2968 8 ld.f64 %r1931,[%r213]; ld.f64 %r1932,[%r646]; fma.rn.f64 %r282,%r1931,%r1932,%r282; .loc 1 2967 28 mov.u64 %r283,1; .loc 1 2967 4 add.u64 %r643,%r213,%r645; add.u64 %r646,%r646,%r648; $L326: .loc 1 2968 8 ld.f64 %r1934,[%r643]; ld.f64 %r1935,[%r646]; fma.rn.f64 %r282,%r1934,%r1935,%r282; .loc 1 2967 28 add.u64 %r283,%r283,1; .loc 1 2967 4 add.u64 %r643,%r643,%r645; add.u64 %r646,%r646,%r648; $L325: .loc 1 2968 8 ld.f64 %r1937,[%r643]; ld.f64 %r1938,[%r646]; fma.rn.f64 %r282,%r1937,%r1938,%r282; .loc 1 2967 28 add.u64 %r283,%r283,1; .loc 1 2967 4 add.u64 %r643,%r643,%r645; add.u64 %r646,%r646,%r648; setp.ne.u64 %r1939,%r426,%r283; @ %r1939 bra $L100; bra $L99; $L393: .loc 1 2982 36 ld.f64 %r1940,[%r660]; ld.f64 %r1941,[%r669]; ld.f64 %r1942,[%r680]; fma.rn.f64 %r1943,%r1940,%r1941,%r1942; st.f64 [%r680],%r1943; .loc 1 2980 29 mov.u64 %r278,1; .loc 1 2980 4 add.u64 %r659,%r680,%r685; add.u64 %r660,%r660,%r662; $L330: .loc 1 2982 36 ld.f64 %r1945,[%r660]; ld.f64 %r1946,[%r669]; ld.f64 %r1947,[%r659]; fma.rn.f64 %r1948,%r1945,%r1946,%r1947; st.f64 [%r659],%r1948; .loc 1 
2980 29 add.u64 %r278,%r278,1; .loc 1 2980 4 add.u64 %r659,%r659,%r685; add.u64 %r660,%r660,%r662; $L329: .loc 1 2982 36 ld.f64 %r1950,[%r660]; ld.f64 %r1951,[%r669]; ld.f64 %r1952,[%r659]; fma.rn.f64 %r1953,%r1950,%r1951,%r1952; st.f64 [%r659],%r1953; .loc 1 2980 29 add.u64 %r278,%r278,1; .loc 1 2980 4 add.u64 %r659,%r659,%r685; add.u64 %r660,%r660,%r662; setp.ne.u64 %r1954,%r186,%r278; @ %r1954 bra $L110; bra $L113; $L392: .loc 1 2976 34 st.f64 [%r683],%r1346; .loc 1 2975 27 mov.u64 %r280,1; .loc 1 2975 2 add.u64 %r683,%r683,%r685; $L328: .loc 1 2976 34 st.f64 [%r683],%r1346; .loc 1 2975 27 add.u64 %r280,%r280,1; .loc 1 2975 2 add.u64 %r683,%r683,%r685; $L327: .loc 1 2976 34 st.f64 [%r683],%r1346; .loc 1 2975 27 add.u64 %r280,%r280,1; .loc 1 2975 2 add.u64 %r683,%r683,%r685; setp.ne.u64 %r1957,%r186,%r280; @ %r1957 bra $L105; bra $L109; $L394: .loc 1 3002 5 ld.f64 %r1958,[%r691]; ld.f64 %r1959,[%r717]; fma.rn.f64 %r274,%r1958,%r1959,%r274; .loc 1 3001 32 mov.u64 %r275,1; .loc 1 3001 8 add.u64 %r691,%r691,%r693; add.u64 %r695,%r717,%r697; $L332: .loc 1 3002 5 ld.f64 %r1961,[%r691]; ld.f64 %r1962,[%r695]; fma.rn.f64 %r274,%r1961,%r1962,%r274; .loc 1 3001 32 add.u64 %r275,%r275,1; .loc 1 3001 8 add.u64 %r691,%r691,%r693; add.u64 %r695,%r695,%r697; $L331: .loc 1 3002 5 ld.f64 %r1964,[%r691]; ld.f64 %r1965,[%r695]; fma.rn.f64 %r274,%r1964,%r1965,%r274; .loc 1 3001 32 add.u64 %r275,%r275,1; .loc 1 3001 8 add.u64 %r691,%r691,%r693; add.u64 %r695,%r695,%r697; setp.ne.u64 %r1966,%r426,%r275; @ %r1966 bra $L118; bra $L117; } matmul_r10.o/ matmul_c4.o/, .param .u32 %in_ar4, .param .u64 %in_ar5); .file 1 "../../../libgfortran/generated/matmul_c4.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]FUNCTION DECL: _gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime_VAR DEF: $LC9 .const .align 4 .u32 $LC9[2] = {__func__$0 .const .align 1 .u8 __func__$0[10] = {109,97,116,109,117,108,95,99,52,0 }2] = {782] = {672] = {8471,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,97,41,32,61,61,32,50,32,124,124,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,98,41,32,61,61,3243116,114,97,110,47,103,101,110,101,114,97,116,101,100,4765,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,49,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3265,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,50,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3273,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,97,114,103,117,109,101,110,116,32,66,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,49,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }; // BEGIN VAR DEF: $LC8 .const .align 4 .u32 $LC8[2] = {10653532_gfortran_matmul_c4 .visible .func _gfortran_matmul_cu64 %r200; .reg .f32 %r201; .reg .f32 %r202; .reg .f32 %r203; .reg .f32 %r205; .reg .f32 %r206; .reg .f32 %r207; .reg .f32 %r208; .reg .f32 %r209; .reg .f32 %r211; .reg .f32 %r212; .reg .f32 %r213; .reg .f32u64 %r257; .reg .u64 %r258; .reg .u64u64 %r267; .reg .u64 %r268; .reg .u64 %r269; .reg .f32 %r270; .reg .f32 %r271; .reg .f32 %r272; .reg .f32 %r273; .reg .f32 
%r274; .reg .f32 %r275; .reg .f32 %r276; .reg .f32 %r277; .reg .f32 %r278; .reg .f32 %r279; .reg .f32 %r280; .reg .f32 %r281; .reg .f32 %r282; .reg .f32 %r283; .reg .f32 %r284; .reg .f32f32 %r290; .reg .f32 %r296; .reg .u64 %r297; .reg .u64 %r298; .reg .f32 %r299; .reg .f32 %r300; .reg .f32 %r301; .reg .f32 %r302; .reg .f32 %r303; .reg .f32 %r304; .reg .f32 %r305; .reg .f32 %r306; .reg .f32 %r307; .reg .f32 %r308; .reg .f32 %r309; .reg .f32 %r310; .reg .f32 %r311; .reg .f32 %r312; .reg .f32 %r313; .reg .f32 %r314; .reg .f32 %r315; .reg .f32 %r317; .reg .f32 %r318; .reg .f32 %r319; .reg .f32 %r320; .reg .f32 %r321; .reg .f32 %r323; .reg .f32 %r324; .reg .f32 %r325; .reg .f32 %r326; .reg .f32 %r327; .reg .f32 %r329; .reg .f32 %r330; .reg .f32 %r331; .reg .f32 %r333; .reg .f32 %r334; .reg .f32 %r335; .reg .f32 %r336; .reg .f32 %r337; .reg .f32 %r339; .reg .f32 %r340; .reg .f32 %r341; .reg .f32 %r343; .reg .f32 %r344; .reg .f32 %r345; .reg .f32 %r346; .reg .f32 %r347; .reg .f32 %r349; .reg .f32 %r350; .reg .f32 %r351; .reg .f32 %r353; .reg .f32 %r354; .reg .f32 %r355; .reg .f32 %r356; .reg .f32 %r357; .reg .f32 %r359; .reg .f32 %r360; .reg .f32 %r361; .reg .f32 %r362; .reg .f32 %r363; .reg .f32 %r365; .reg .f32 %r366; .reg .f32 %r367; .reg .f32 %r369; .reg .f32 %r370; .reg .f32 %r371; .reg .f32 %r373; .reg .f32 %r374; .reg .f32 %r375; .reg .f32 %r377; .reg .f32 %r378; .reg .f32 %r379; .reg .f32 %r381; .reg .f32 %r382; .reg .f32 %r383; .reg .f32 %r385; .reg .f32 %r386; .reg .f32 %r387; .reg .f32 %r389; .reg .f32 %r390; .reg .f32 %r391; .reg .f32 %r392; .reg .f32 %r393; .reg .f32 %r394; .reg .f32 %r395; .reg .f32 %r396; .reg .f32 %r397; .reg .f32 %r398; .reg .f32 %r399; .reg .f32 %r400; .reg .f32 %r401; .reg .f32 %r402; .reg .f32 %r403; .reg .f32 %r405; .reg .f32 %r406; .reg .f32 %r407; .reg .f32 %r408; .reg .f32 %r409; .reg .f32 %r411; .reg .f32 %r412; .reg .f32 %r413; .reg .f32 %r414; .reg .f32 %r415; .reg .f32 %r417; .reg .f32 %r418; .reg .f32 %r419; .reg .f32 %r420; .reg .f32 %r421; .reg .f32 %r423; .reg .f32 %r424; .reg .f32 %r425; .reg .f32 %r426; .reg .f32 %r427; .reg .f32 %r428; .reg .f32 %r429; .reg .f32 %r430; .reg .f32 %r431; .reg .f32 %r432; .reg .f32 %r433; .reg .f32 %r434; .reg .f32 %r435; .reg .f32 %r436; .reg .f32 %r437; .reg .f32 %r439; .reg .f32 %r440; .reg .f32 %r441; .reg .f32 %r442; .reg .f32 %r443; .reg .f32 %r445; .reg .f32 %r446; .reg .f32 %r447; .reg .f32 %r448; .reg .f32 %r449; .reg .f32 %r451; .reg .f32 %r452; .reg .f32 %r453; .reg .f32 %r454; .reg .f32 %r455; .reg .f32 %r457; .reg .f32 %r458; .reg .f32 %r459; .reg .f32 %r460; .reg .f32 %r461; .reg .f32 %r462; .reg .f32 %r463; .reg .f32 %r464; .reg .f32 %r465; .reg .f32 %r467; .reg .f32 %r468; .reg .f32 %r469; .reg .u64 %r472; .reg .u64 %r483; .reg .f32 %r495; .reg .u64 %r496; .reg .u64 %r498; .reg .u64 %r500; .reg .u64 %r502; .reg .u64 %r504; .reg .u64u64 %r515; .reg .u64 %r517; .reg .u64 %r519; .reg .u64 %r521; .reg .u64 %r523; .reg .u64 %r524; .reg .u64 %r527; .reg .u64 %r530; .reg .u64 %r535; .reg .u64 %r539; .reg .u64 %r542; .reg .u64 %r543; .reg .u64 %r549; .reg .u64 %r552; .reg .u64 %r553; .reg .u64 %r555; .reg .u64u64 %r567; .reg .u64 %r569; .reg .u64 %r574; .reg .u64 %r575; .reg .u64 %r577; .reg .u64u64 %r585; .reg .u64 %r592; .reg .u64u64 %r604; .reg .u64 %r606; .reg .u64 %r608; .reg .u64 %r610; .reg .u64 %r615; .reg .u64 %r616; .reg .u64 %r617; .reg .u64 %r621; .reg .u64 %r623; .reg .u32 %r627; .reg .u64 %r637; .reg .u64 %r638; .reg .u64 %r639; .reg .u64 %r640; .reg .u64 %r641; .reg .u64 %r642; .reg .u64 
%r643; .reg .u64 %r644; .reg .u64 %r645; .reg .u64 %r647; .reg .u64 %r648; .reg .u64 %r649; .reg .u64 %r650; .reg .u64 %r651; .reg .u64 %r657; .reg .u64 %r658; .reg .u32 %r662; .reg .u64 %r663; .reg .u64 %r664; .reg .u64 %r668; .reg .u64 %r672; .reg .u64 %r676; .reg .u64 %r677; .reg .u64 %r678; .reg .u64 %r679; .reg .u64 %r682; .reg .u64 %r683; .reg .u64 %r692; .reg .u64 %r694; .reg .u64 %r695; .reg .u64 %r696; .reg .u64 %r706; .reg .u64 %r707; .reg .u64 %r708; .reg .u64 %r709; .reg .u64 %r710; .reg .u64 %r716; .reg .u64 %r721; .reg .u64 %r726; .reg .u64 %r736; .reg .u64 %r737; .reg .u64 %r747; .reg .u64 %r749; .reg .u64 %r750; .reg .u64 %r756; .reg .u64 %r757; .reg .u64 %r758; .reg .u64 %r759; .reg .u64 %r760; .reg .u64 %r761; .reg .u64 %r762; .reg .u64 %r763; .reg .u64 %r764; .reg .u64 %r765; .reg .u64 %r766; .reg .u64 %r767; .reg .u64 %r769; .reg .u64 %r770; .reg .u64 %r771; .reg .u64 %r772; .reg .u64 %r774; .reg .u64 %r775; .reg .u64 %r782; .reg .u64 %r783; .reg .u64 %r784; .reg .u64 %r785; .reg .u64 %r786; .reg .u64 %r793; .reg .u64 %r795; .reg .u64 %r797; .reg .u64 %r798; .reg .u64 %r799; .reg .u64 %r800; .reg .u64 %r802; .reg .u64 %r806; .reg .u64 %r807; .reg .u64 %r808; .reg .u64 %r809; .reg .u64 %r810; .reg .u64 %r815; .reg .u64 %r816; .reg .u64 %r818; .reg .u64 %r821; .reg .u64 %r822; .reg .u64 %r827; .reg .u64 %r829; .reg .u64 %r830; .reg .u64 %r834; .reg .u64 %r836; .reg .u64 %r840; .reg .u64 %r842; .reg .u64 %r843; .reg .u64 %r844; .reg .u64 %r845; .reg .u64 %r851; .reg .u64 %r853; .reg .u64 %r854; .reg .u64 %r861; .reg .u64 %r862; .reg .u64 %r864; .reg .u64 %r865; .reg .u64 %r867; .reg .u64 %r870; .reg .u64 %r872; .reg .u64 %r873; .reg .u64 %r878; .reg .u64 %r881; .reg .u64 %r883; .reg .u64 %r884; .reg .u64 %r889; .reg .u64 %r891; .reg .u64 %r893; .reg .u64 %r894; .reg .u64 %r900; .reg .u32 %r901; .reg .u64 %r902; .reg .u64 %r905; .reg .u64 %r908; .reg .u64 %r910; .reg .u64 %r912; .reg .u64 %r914; .reg .u64 %r918; .reg .u64 %r920; .reg .u64 %r922; .reg .u64 %r924; .reg .u64 %r927; .reg .u64 %r929; .reg .u64 %r931; .reg .u64 %r936; .reg .u64 %r938; .reg .u64 %r944; .reg .u64 %r945; .reg .u64 %r946; .reg .u64 %r947; .reg .u32 %r948; .reg .u32 %r949; .reg .u64 %r950; .reg .u16 %r951; .reg .pred %r952; .reg .u16 %r953; .reg .pred %r955; .reg .u64 %r956; .reg .u32 %r957; .reg .u64 %r958; .reg .u64 %r959; .reg .u64 %r960; .reg .pred %r961; .reg .pred %r963; .reg .u64 %r966; .reg .u64 %r968; .reg .u64 %r969; .reg .u64 %r970; .reg .u64 %r972; .reg .u64 %r973; .reg .u64 %r974; .reg .u64 %r975; .reg .u64 %r976; .reg .u64 %r979; .reg .u64 %r981; .reg .u64 %r982; .reg .u64 %r984; .reg .u64 %r986; .reg .u64 %r987; .reg .u64 %r989; .reg .u64 %r990; .reg .u64 %r991; .reg .u64 %r992; .reg .u64 %r993; .reg .u32 %r994; .reg .pred %r995; .reg .u64 %r996; .reg .u64 %r997; .reg .u64 %r998; .reg .pred %r1000; .reg .u64 %r1001; .reg .u64 %r1002; .reg .u64 %r1003; .reg .pred %r1004; .reg .u64 %r1005; .reg .u64 %r1006; .reg .u64 %r1007; .reg .u16 %r1008; .reg .pred %r1009; .reg .u64 %r1010; .reg .u64 %r1012; .reg .u64 %r1013; .reg .u64 %r1014; .reg .u16 %r1015; .reg .pred %r1017; .reg .pred %r1018; .reg .u64 %r1019; .reg .pred %r1021; .reg .u64 %r1022; .reg .u64 %r1024; .reg .u64 %r1025; .reg .u64 %r1026; .reg .u64 %r1027; .reg .u64 %r1028; .reg .u64 %r1029; .reg .pred %r1030; .reg .u64 %r1031; .reg .u64 %r1033; .reg .u64 %r1034; .reg .u64 %r1035; .reg .u16 %r1036; .reg .u32 %r1037; .reg .pred %r1038; .reg .u16 %r1039; .reg .pred %r1040; .reg .u64 %r1041; .reg .u64 %r1042; .reg .u64 %r1043; .reg .u64 
%r1044; .reg .u64 %r1045; .reg .u64 %r1046; .reg .pred %r1047; .reg .u32 %r1049; .reg .u32 %r1050; .reg .u32 %r1052; .reg .u32 %r1053; .reg .u16 %r1054; .reg .u16 %r1055; .reg .u16 %r1056; .reg .u32 %r1057; .reg .u16 %r1058; .reg .pred %r1059; .reg .u64 %r1060; .reg .u16 %r1062; .reg .pred %r1063; .reg .u64 %r1064; .reg .u64 %r1065; .reg .u64 %r1066; .reg .u32 %r1067; .reg .u32 %r1068; .reg .u32 %r1069; .reg .u32 %r1070; .reg .u32 %r1071; .reg .u32 %r1072; .reg .u32 %r1074; .reg .u32 %r1075; .reg .u16 %r1076; .reg .u16 %r1077; .reg .u16 %r1078; .reg .u32 %r1079; .reg .u16 %r1080; .reg .pred %r1081; .reg .u32 %r1083; .reg .u32 %r1084; .reg .u16 %r1085; .reg .u16 %r1086; .reg .u16 %r1087; .reg .u32 %r1088; .reg .u16 %r1089; .reg .pred %r1090; .reg .u32 %r1092; .reg .u32 %r1093; .reg .u32 %r1095; .reg .u32 %r1096; .reg .u16 %r1097; .reg .u16 %r1098; .reg .u16 %r1099; .reg .u32 %r1100; .reg .u16 %r1101; .reg .pred %r1102; .reg .f32 %r1103; .reg .f32 %r1104; .reg .f32 %r1105; .reg .f32 %r1106; .reg .f32 %r1107; .reg .f32 %r1108; .reg .f32 %r1109; .reg .pred %r1110; .reg .f32 %r1111; .reg .f32 %r1112; .reg .pred %r1115; .reg .pred %r1116; .reg .u32 %r1118; .reg .u32 %r1119; .reg .u32 %r1121; .reg .u32 %r1122; .reg .u16 %r1123; .reg .u16 %r1124; .reg .u16 %r1125; .reg .u32 %r1126; .reg .u16 %r1127; .reg .pred %r1128; .reg .u32 %r1130; .reg .u32 %r1131; .reg .u32 %r1133; .reg .u32 %r1134; .reg .u16 %r1135; .reg .u16 %r1136; .reg .u16 %r1137; .reg .u32 %r1138; .reg .u16 %r1139; .reg .pred %r1140; .reg .u32 %r1142; .reg .u32 %r1143; .reg .u32 %r1145; .reg .u32 %r1146; .reg .u16 %r1147; .reg .u16 %r1148; .reg .u16 %r1149; .reg .u32 %r1150; .reg .u16 %r1151; .reg .pred %r1152; .reg .pred %r1153; .reg .u64 %r1154; .reg .u32 %r1155; .reg .u64 %r1156; .reg .u64 %r1157; .reg .u32 %r1158; .reg .pred %r1159; .reg .pred %r1160; .reg .u32 %r1161; .reg .pred %r1162; .reg .pred %r1163; .reg .u32 %r1178; .reg .u64 %r1179; .reg .u64 %r1180; .reg .u64 %r1181; .reg .u64 %r1182; .reg .u64 %r1183; .reg .u64 %r1184; .reg .u64 %r1185; .reg .u16 %r1186; .reg .u16 %r1187; .reg .u16 %r1188; .reg .u32 %r1189; .reg .u16 %r1190; .reg .pred %r1191; .reg .pred %r1192; .reg .u16 %r1193; .reg .pred %r1194; .reg .u64 %r1195; .reg .pred %r1196; .reg .u32 %r1200; .reg .u64 %r1202; .reg .u32 %r1207; .reg .u32 %r1208; .reg .u32 %r1210; .reg .u32 %r1211; .reg .u16 %r1212; .reg .u16 %r1213; .reg .u16 %r1214; .reg .u32 %r1216; .reg .u32 %r1217; .reg .u16 %r1219; .reg .u16 %r1221; .reg .u32 %r1222; .reg .u16 %r1223; .reg .pred %r1224; .reg .pred %r1225; .reg .u64 %r1226; .reg .u64 %r1228; .reg .u64 %r1229; .reg .u64 %r1230; .reg .pred %r1232; .reg .u64 %r1234; .reg .u64 %r1235; .reg .u64 %r1236; .reg .u64 %r1238; .reg .u64 %r1239; .reg .u64 %r1244; .reg .u64 %r1245; .reg .u64 %r1246; .reg .u64 %r1247; .reg .u64 %r1251; .reg .u64 %r1253; .reg .u64 %r1254; .reg .u64 %r1255; .reg .u64 %r1256; .reg .u64 %r1257; .reg .pred %r1259; .reg .u64 %r1265; .reg .u64 %r1266; .reg .u64 %r1267; .reg .u64 %r1268; .reg .pred %r1270; .reg .u64 %r1271; .reg .u64 %r1272; .reg .u64 %r1273; .reg .u64 %r1274; .reg .u64 %r1275; .reg .u64 %r1276; .reg .u64 %r1277; .reg .u64 %r1278; .reg .u64 %r1279; .reg .u64 %r1283; .reg .u64 %r1284; .reg .u64 %r1285; .reg .u64 %r1286; .reg .u64 %r1289; .reg .u64 %r1290; .reg .u64 %r1291; .reg .u64 %r1292; .reg .u64 %r1293; .reg .u64 %r1295; .reg .u64 %r1296; .reg .u64 %r1297; .reg .u64 %r1298; .reg .u64 %r1299; .reg .u64 %r1300; .reg .u64 %r1304; .reg .u64 %r1305; .reg .u64 %r1308; .reg .u64 %r1313; .reg .u64 %r1314; .reg .f32 
%r1316; .reg .f32 %r1317; .reg .f32 %r1318; .reg .f32 %r1319; .reg .f32 %r1321; .reg .f32 %r1322; .reg .pred %r1323; .reg .u64 %r1326; .reg .u64 %r1327; .reg .u64 %r1328; .reg .u64 %r1329; .reg .u64 %r1331; .reg .pred %r1332; .reg .u64 %r1333; .reg .u64 %r1334; .reg .f32 %r1335; .reg .u64 %r1338; .reg .u64 %r1339; .reg .u64 %r1340; .reg .u64 %r1341; .reg .u64 %r1342; .reg .u64 %r1343; .reg .u64 %r1344; .reg .u64 %r1346; .reg .u64 %r1347; .reg .f32 %r1351; .reg .f32 %r1352; .reg .f32 %r1353; .reg .f32 %r1354; .reg .f32 %r1355; .reg .f32 %r1356; .reg .f32 %r1357; .reg .f32 %r1358; .reg .f32 %r1359; .reg .f32 %r1360; .reg .f32 %r1361; .reg .f32 %r1362; .reg .f32 %r1363; .reg .f32 %r1364; .reg .f32 %r1365; .reg .f32 %r1366; .reg .f32 %r1367; .reg .f32 %r1368; .reg .f32 %r1369; .reg .f32 %r1370; .reg .f32 %r1371; .reg .f32 %r1372; .reg .f32 %r1373; .reg .f32 %r1374; .reg .f32 %r1375; .reg .f32 %r1376; .reg .f32 %r1377; .reg .f32 %r1378; .reg .f32 %r1379; .reg .f32 %r1380; .reg .f32 %r1381; .reg .f32 %r1382; .reg .pred %r1383; .reg .pred %r1384; .reg .pred %r1385; .reg .u64 %r1387; .reg .u64 %r1388; .reg .u64 %r1389; .reg .u64 %r1390; .reg .u64 %r1391; .reg .u64 %r1392; .reg .f32 %r1394; .reg .f32 %r1395; .reg .f32 %r1396; .reg .f32 %r1397; .reg .f32 %r1398; .reg .f32 %r1399; .reg .f32 %r1400; .reg .f32 %r1401; .reg .pred %r1402; .reg .pred %r1403; .reg .pred %r1404; .reg .u64 %r1406; .reg .u64 %r1407; .reg .u64 %r1409; .reg .u64 %r1410; .reg .u64 %r1411; .reg .u64 %r1412; .reg .u64 %r1413; .reg .u64 %r1420; .reg .u64 %r1421; .reg .f32 %r1424; .reg .f32 %r1425; .reg .f32 %r1426; .reg .f32 %r1427; .reg .f32 %r1428; .reg .f32 %r1429; .reg .f32 %r1430; .reg .f32 %r1431; .reg .pred %r1432; .reg .pred %r1433; .reg .pred %r1434; .reg .f32 %r1437; .reg .f32 %r1438; .reg .pred %r1440; .reg .u16 %r1441; .reg .pred %r1442; .reg .u32 %r1444; .reg .u32 %r1445; .reg .u16 %r1446; .reg .u16 %r1447; .reg .u32 %r1449; .reg .u16 %r1450; .reg .pred %r1451; .reg .pred %r1452; .reg .u16 %r1453; .reg .pred %r1454; .reg .pred %r1455; .reg .pred %r1456; .reg .u64 %r1457; .reg .u64 %r1458; .reg .pred %r1460; .reg .u64 %r1461; .reg .f32 %r1463; .reg .f32 %r1464; .reg .pred %r1466; .reg .u64 %r1468; .reg .u64 %r1469; .reg .u64 %r1470; .reg .f32 %r1471; .reg .f32 %r1472; .reg .pred %r1474; .reg .pred %r1475; .reg .u64 %r1477; .reg .f32 %r1478; .reg .f32 %r1479; .reg .pred %r1481; .reg .pred %r1482; .reg .pred %r1483; .reg .pred %r1484; .reg .u64 %r1485; .reg .pred %r1489; .reg .u64 %r1491; .reg .f32 %r1492; .reg .f32 %r1493; .reg .f32 %r1494; .reg .pred %r1496; .reg .pred %r1498; .reg .pred %r1502; .reg .u64 %r1503; .reg .u64 %r1504; .reg .u64 %r1506; .reg .f32 %r1507; .reg .f32 %r1508; .reg .pred %r1510; .reg .pred %r1511; .reg .u64 %r1512; .reg .pred %r1513; .reg .u64 %r1514; .reg .u64 %r1515; .reg .u64 %r1516; .reg .u16 %r1517; .reg .u32 %r1518; .reg .pred %r1519; .reg .u16 %r1520; .reg .u32 %r1521; .reg .pred %r1522; .reg .u64 %r1523; .reg .u64 %r1524; .reg .u64 %r1525; .reg .u64 %r1526; .reg .u64 %r1527; .reg .u64 %r1528; .reg .u64 %r1529; .reg .u32 %r1530; .reg .pred %r1531; .reg .u64 %r1532; .reg .u64 %r1533; .reg .u64 %r1534; .reg .f32 %r1535; .reg .pred %r1536; .reg .pred %r1542; .reg .pred %r1543; .reg .pred %r1544; .reg .f32 %r1545; .reg .u64 %r1546; .reg .pred %r1547; .reg .pred %r1548; .reg .pred %r1549; .reg .pred %r1550; .reg .u32 %r1551; .reg .u64 %r1552; .reg .pred %r1553; .reg .pred %r1556; .reg .u64 %r1562; .reg .u64 %r1563; .reg .u64 %r1564; .reg .u64 %r1565; .reg .u64 %r1566; .reg .u64 %r1570; .reg 
.u64 %r1571; .reg .u64 %r1572; .reg .u64 %r1573; .reg .u64 %r1574; .reg .u64 %r1576; .reg .u64 %r1577; .reg .u64 %r1578; .reg .u64 %r1580; .reg .pred %r1581; .reg .u64 %r1583; .reg .pred %r1585; .reg .u64 %r1588; .reg .pred %r1589; .reg .u64 %r1590; .reg .pred %r1591; .reg .u64 %r1592; .reg .u64 %r1593; .reg .pred %r1597; .reg .u64 %r1598; .reg .f32 %r1599; .reg .f32 %r1600; .reg .pred %r1601; .reg .u64 %r1602; .reg .u64 %r1603; .reg .u64 %r1604; .reg .u64 %r1605; .reg .u64 %r1606; .reg .pred %r1607; .reg .pred %r1608; .reg .pred %r1609; .reg .u64 %r1611; .reg .u64 %r1613; .reg .u64 %r1614; .reg .u64 %r1615; .reg .u64 %r1616; .reg .u64 %r1617; .reg .pred %r1618; .reg .pred %r1619; .reg .pred %r1620; .reg .u64 %r1621; .reg .u64 %r1624; .reg .u64 %r1625; .reg .u64 %r1626; .reg .u64 %r1628; .reg .pred %r1629; .reg .u64 %r1630; .reg .u64 %r1631; .reg .u64 %r1632; .reg .u64 %r1636; .reg .pred %r1637; .reg .pred %r1638; .reg .pred %r1639; .reg .u64 %r1640; .reg .u64 %r1641; .reg .u64 %r1642; .reg .u64 %r1643; .reg .u64 %r1644; .reg .u64 %r1645; .reg .pred %r1646; .reg .pred %r1647; .reg .pred %r1648; .reg .u64 %r1653; .reg .u64 %r1654; .reg .u64 %r1655; .reg .u64 %r1656; .reg .u64 %r1657; .reg .pred %r1658; .reg .pred %r1659; .reg .pred %r1660; .reg .u64 %r1661; .reg .u64 %r1665; .reg .pred %r1666; .reg .pred %r1667; .reg .pred %r1668; .reg .u64 %r1673; .reg .pred %r1674; .reg .pred %r1675; .reg .pred %r1676; .reg .u64 %r1681; .reg .pred %r1682; .reg .pred %r1683; .reg .pred %r1684; .reg .u64 %r1689; .reg .pred %r1690; .reg .pred %r1691; .reg .pred %r1692; .reg .u64 %r1695; .reg .u64 %r1696; .reg .u64 %r1698; .reg .u64 %r1699; .reg .f32 %r1703; .reg .f32 %r1704; .reg .f32 %r1705; .reg .f32 %r1706; .reg .f32 %r1707; .reg .f32 %r1708; .reg .f32 %r1709; .reg .f32 %r1710; .reg .pred %r1711; .reg .f32 %r1715; .reg .f32 %r1716; .reg .u64 %r1717; .reg .f32 %r1721; .reg .f32 %r1722; .reg .u64 %r1723; .reg .f32 %r1727; .reg .f32 %r1728; .reg .pred %r1729; .reg .pred %r1730; .reg .f32 %r1731; .reg .f32 %r1732; .reg .f32 %r1736; .reg .f32 %r1737; .reg .f32 %r1738; .reg .f32 %r1739; .reg .f32 %r1740; .reg .f32 %r1741; .reg .f32 %r1742; .reg .f32 %r1743; .reg .f32 %r1744; .reg .f32 %r1745; .reg .f32 %r1746; .reg .u64 %r1748; .reg .f32 %r1751; .reg .f32 %r1752; .reg .f32 %r1753; .reg .f32 %r1754; .reg .f32 %r1755; .reg .f32 %r1756; .reg .f32 %r1757; .reg .f32 %r1758; .reg .f32 %r1759; .reg .f32 %r1760; .reg .f32 %r1761; .reg .u64 %r1763; .reg .f32 %r1766; .reg .f32 %r1767; .reg .f32 %r1768; .reg .f32 %r1769; .reg .f32 %r1770; .reg .f32 %r1771; .reg .f32 %r1772; .reg .f32 %r1773; .reg .f32 %r1774; .reg .pred %r1775; .reg .pred %r1776; .reg .f32 %r1777; .reg .f32 %r1778; .reg .f32 %r1783; .reg .f32 %r1784; .reg .f32 %r1785; .reg .f32 %r1786; .reg .f32 %r1787; .reg .f32 %r1788; .reg .f32 %r1789; .reg .f32 %r1790; .reg .f32 %r1791; .reg .f32 %r1792; .reg .f32 %r1793; .reg .f32 %r1799; .reg .f32 %r1800; .reg .f32 %r1801; .reg .f32 %r1802; .reg .f32 %r1803; .reg .f32 %r1804; .reg .f32 %r1805; .reg .f32 %r1806; .reg .f32 %r1807; .reg .f32 %r1808; .reg .f32 %r1809; .reg .f32 %r1815; .reg .f32 %r1816; .reg .f32 %r1817; .reg .f32 %r1818; .reg .f32 %r1819; .reg .f32 %r1820; .reg .f32 %r1821; .reg .f32 %r1822; .reg .f32 %r1823; .reg .pred %r1824; .reg .pred %r1825; .reg .f32 %r1826; .reg .f32 %r1827; .reg .u64 %r1828; .reg .u64 %r1830; .reg .f32 %r1831; .reg .f32 %r1832; .reg .f32 %r1833; .reg .f32 %r1834; .reg .f32 %r1835; .reg .f32 %r1836; .reg .f32 %r1837; .reg .f32 %r1838; .reg .f32 %r1839; .reg .f32 %r1840; .reg 
.f32 %r1841; .reg .u64 %r1842; .reg .u64 %r1843; .reg .u64 %r1844; .reg .f32 %r1846; .reg .f32 %r1847; .reg .f32 %r1848; .reg .f32 %r1849; .reg .f32 %r1850; .reg .f32 %r1851; .reg .f32 %r1852; .reg .f32 %r1853; .reg .f32 %r1854; .reg .f32 %r1855; .reg .f32 %r1856; .reg .u64 %r1857; .reg .u64 %r1858; .reg .u64 %r1859; .reg .f32 %r1861; .reg .f32 %r1862; .reg .f32 %r1863; .reg .f32 %r1864; .reg .f32 %r1865; .reg .f32 %r1866; .reg .f32 %r1867; .reg .f32 %r1868; .reg .f32 %r1869; .reg .pred %r1870; .reg .pred %r1871; .reg .f32 %r1872; .reg .f32 %r1873; .reg .u64 %r1875; .reg .u64 %r1876; .reg .f32 %r1877; .reg .f32 %r1878; .reg .f32 %r1879; .reg .f32 %r1880; .reg .f32 %r1881; .reg .f32 %r1882; .reg .f32 %r1883; .reg .f32 %r1884; .reg .f32 %r1885; .reg .f32 %r1886; .reg .f32 %r1887; .reg .u64 %r1889; .reg .u64 %r1890; .reg .f32 %r1892; .reg .f32 %r1893; .reg .f32 %r1894; .reg .f32 %r1895; .reg .f32 %r1896; .reg .f32 %r1897; .reg .f32 %r1898; .reg .f32 %r1899; .reg .f32 %r1900; .reg .f32 %r1901; .reg .f32 %r1902; .reg .u64 %r1904; .reg .u64 %r1905; .reg .f32 %r1907; .reg .f32 %r1908; .reg .f32 %r1909; .reg .f32 %r1910; .reg .f32 %r1911; .reg .f32 %r1912; .reg .f32 %r1913; .reg .f32 %r1914; .reg .f32 %r1915; .reg .pred %r1916; .reg .u64 %r1920; .reg .u64 %r1922; .reg .u64 %r1925; .reg .pred %r1927; .reg .pred %r1928; .reg .pred %r1929; .reg .u64 %r1931; .reg .u64 %r1932; .reg .u64 %r1933; .reg .f32 %r1934; .reg .f32 %r1935; .reg .f32 %r1936; .reg .f32 %r1937; .reg .f32 %r1938; .reg .f32 %r1939; .reg .f32 %r1940; .reg .f32 %r1941; .reg .f32 %r1942; .reg .f32 %r1943; .reg .f32 %r1944; .reg .f32 %r1945; .reg .u64 %r1947; .reg .u64 %r1948; .reg .u64 %r1949; .reg .f32 %r1951; .reg .f32 %r1952; .reg .f32 %r1953; .reg .f32 %r1954; .reg .f32 %r1955; .reg .f32 %r1956; .reg .f32 %r1957; .reg .f32 %r1958; .reg .f32 %r1959; .reg .f32 %r1960; .reg .f32 %r1961; .reg .f32 %r1962; .reg .u64 %r1964; .reg .u64 %r1965; .reg .u64 %r1966; .reg .f32 %r1968; .reg .f32 %r1969; .reg .f32 %r1970; .reg .f32 %r1971; .reg .f32 %r1972; .reg .f32 %r1973; .reg .f32 %r1974; .reg .f32 %r1975; .reg .f32 %r1976; .reg .f32 %r1977; .reg .f32 %r1978; .reg .f32 %r1979; .reg .pred %r1980; .reg .pred %r1981; .reg .f32 %r1983; .reg .f32 %r1984; .reg .u64 %r1986; .reg .u64 %r1987; .reg .f32 %r1988; .reg .f32 %r1989; .reg .f32 %r1990; .reg .f32 %r1991; .reg .f32 %r1992; .reg .f32 %r1993; .reg .f32 %r1994; .reg .f32 %r1995; .reg .f32 %r1996; .reg .f32 %r1997; .reg .f32 %r1998; .reg .u64 %r2000; .reg .u64 %r2001; .reg .f32 %r2003; .reg .f32 %r2004; .reg .f32 %r2005; .reg .f32 %r2006; .reg .f32 %r2007; .reg .f32 %r2008; .reg .f32 %r2009; .reg .f32 %r2010; .reg .f32 %r2011; .reg .f32 %r2012; .reg .f32 %r2013; .reg .u64 %r2015; .reg .u64 %r2016; .reg .f32 %r2018; .reg .f32 %r2019; .reg .f32 %r2020; .reg .f32 %r2021; .reg .f32 %r2022; .reg .f32 %r2023; .reg .f32 %r2024; .reg .f32 %r2025; .reg .f32 %r2026; .reg .pred %r2027; .reg .f32 %r2028; .reg .f32 %r2029; .reg .f32 %r2030; .reg .f32 %r2031; .reg .f32 %r2032; .reg .f32 %r2033; .reg .f32 %r2034; .reg .f32 %r2035; .reg .f32 %r2036; .reg .f32 %r2038; .reg .f32 %r2039; .reg .f32 %r2040; .reg .f32 %r2041; .reg .f32 %r2042; .reg .f32 %r2043; .reg .f32 %r2044; .reg .f32 %r2045; .reg .f32 %r2046; .reg .f32 %r2048; .reg .f32 %r2049; .reg .f32 %r2050; .reg .f32 %r2051; .reg .f32 %r2052; .reg .f32 %r2053; .reg .f32 %r2054; .reg .f32 %r2055; .reg .f32 %r2056; .reg .pred %r2057; .reg .f32 %r2058; .reg .f32 %r2059; .reg .f32 %r2061; .reg .f32 %r2062; .reg .f32 %r2064; .reg .f32 %r2065; .reg .pred %r2066; 
.reg .f32 %r2067; .reg .f32 %r2068; .reg .f32 %r2069; .reg .f32 %r2070; .reg .f32 %r2071; .reg .f32 %r2072; .reg .f32 %r2073; .reg .f32 %r2074; .reg .pred %r2075; .reg .u32 %r2076; .reg .u64 %r2077; .reg .u32 %r2079; .reg .u64 %r2080; .reg .u32 %r2082; .reg .u64 %r2083; .reg .pred %r2084; .reg .u32 %r2085; .reg .u64 %r2086; .reg .u64 %r2088; .reg .u32 %r2090; .reg .u64 %r2091; .reg .u64 %r2093; .reg .u32 %r2095; .reg .u64 %r2096; .reg .pred %r2097; .reg .f32 %r2098; .reg .f32 %r2099; .reg .f32 %r2100; .reg .f32 %r2101; .reg .f32 %r2102; .reg .f32 %r2103; .reg .f32 %r2104; .reg .f32 %r2108; .reg .f32 %r2109; .reg .f32 %r2110; .reg .f32 %r2111; .reg .f32 %r2112; .reg .f32 %r2113; .reg .f32 %r2114; .reg .f32 %r2115; .reg .f32 %r2116; .reg .f32 %r2118; .reg .f32 %r2119; .reg .f32 %r2120; .reg .f32 %r2121; .reg .f32 %r2122; .reg .f32 %r2123; .reg .f32 %r2124; .reg .f32 %r2125; .reg .f32 %r2126; .reg .pred %r2127; .reg .f32 %r2128; .reg .f32 %r2129; .reg .f32 %r2130; .reg .f32 %r2131; .reg .f32 %r2132; .reg .f32 %r2133; .reg .f32 %r2134; .reg .f32 %r2138; .reg .f32 %r2139; .reg .f32 %r2140; .reg .f32 %r2141; .reg .f32 %r2142; .reg .f32 %r2143; .reg .f32 %r2144; .reg .f32 %r2145; .reg .f32 %r2146; .reg .f32 %r2148; .reg .f32 %r2149; .reg .f32 %r2150; .reg .f32 %r2151; .reg .f32 %r2152; .reg .f32 %r2153; .reg .f32 %r2154; .reg .f32 %r2155; .reg .f32 %r2156; .reg .pred %r2157; .reg .f32 %r2158; .reg .f32 %r2159; .reg .f32 %r2160; .reg .f32 %r2161; .reg .f32 %r2162; .reg .f32 %r2163; .reg .f32 %r2164; .reg .f32 %r2168; .reg .f32 %r2169; .reg .f32 %r2170; .reg .f32 %r2171; .reg .f32 %r2172; .reg .f32 %r2173; .reg .f32 %r2174; .reg .f32 %r2175; .reg .f32 %r2176; .reg .f32 %r2178; .reg .f32 %r2179; .reg .f32 %r2180; .reg .f32 %r2181; .reg .f32 %r2182; .reg .f32 %r2183; .reg .f32 %r2184; .reg .f32 %r2185; .reg .f32 %r2186; .reg .pred %r2187; .reg .f32 %r2188; .reg .f32 %r2189; .reg .f32 %r2190; .reg .f32 %r2191; .reg .f32 %r2192; .reg .f32 %r2193; .reg .f32 %r2194; .reg .f32 %r2195; .reg .f32 %r2196; .reg .f32 %r2197; .reg .f32 %r2198; .reg .f32 %r2199; .reg .f32 %r2201; .reg .f32 %r2202; .reg .f32 %r2203; .reg .f32 %r2204; .reg .f32 %r2205; .reg .f32 %r2206; .reg .f32 %r2207; .reg .f32 %r2208; .reg .f32 %r2209; .reg .f32 %r2210; .reg .f32 %r2211; .reg .f32 %r2212; .reg .f32 %r2214; .reg .f32 %r2215; .reg .f32 %r2216; .reg .f32 %r2217; .reg .f32 %r2218; .reg .f32 %r2219; .reg .f32 %r2220; .reg .f32 %r2221; .reg .f32 %r2222; .reg .f32 %r2223; .reg .f32 %r2224; .reg .f32 %r2225; .reg .pred %r2226; .reg .pred %r2229; .reg .f32 %r2230; .reg .f32 %r2231; .reg .f32 %r2232; .reg .f32 %r2233; .reg .f32 %r2234; .reg .f32 %r2235; .reg .f32 %r2236; .reg .f32 %r2240; .reg .f32 %r2241; .reg .f32 %r2242; .reg .f32 %r2243; .reg .f32 %r2244; .reg .f32 %r2245; .reg .f32 %r2246; .reg .f32 %r2247; .reg .f32 %r2248; .reg .f32 %r2250; .reg .f32 %r2251; .reg .f32 %r2252; .reg .f32 %r2253; .reg .f32 %r2254; .reg .f32 %r2255; .reg .f32 %r2256; .reg .f32 %r2257; .reg .f32 %r2258; .reg .pred %r2259; mov.u64 %r945,%ar0; mov.u64 %r946,%ar1; mov.u64 %r947,%ar2; mov.u32 %r948,%ar3; mov.u32 %r949,%ar4; mov.u64 %r950,%ar5; .loc 1 2462 3 ld.s8 %r627,[%r946+28]; cvt.u16.u32 %r951,%r627; setp.eq.u16 %r952,%r951,2; @ %r952 bra $L2; ld.s8 %r1551,[%r947+28]; cvt.u16.u32 %r953,%r1551; setp.eq.u16 %r955,%r953,2; @ %r955 bra $L3; cvta.const.u64 %r959,$LC3; cvta.const.u64 %r958,__func__$0; mov.u32 %r957,2462; cvta.const.u64 %r995893: .loc 1 2476 6 ld.u64 %r960,[%r945]; setp.ne.u64 %r961,%r960,0; @ %r961 bra $L4; .loc 1 2478 10 setp.ne.u16 
%r963,%r951,1; @ %r963 bra $L5; .loc 1 2480 4 st.u64 [%r945+48],%r960; ld.u64 %r966,[%r947+80]; ld.u64 %r968,[%r947+72]; sub.u64 %r969,%r966,%r968; st.u64 [%r945+56],%r969; mov.u64 %r970,1; st.u64 [%r945+40],%r970; bra $L6; $L123: .loc 1 2485 4 st.u64 [%r945+48],%r1512; add.u64 %r972,%r668,-1; st.u64 [%r945+56],%r972; mov.u64 %r973,1; st.u64 [%r945+40],%r973; bra $L6; $L124: .loc 1 2490 4 mov.u64 %r974,0; st.u64 [%r945+48],%r974; add.u64 %r975,%r668,-1; st.u64 [%r945+56],%r975; mov.u64 %r976,1; st.u64 [%r945+40],%r976; .loc 1 2493 11 st.u64 [%r945+72],%r974; ld.u64 %r979,[%r947+80]; ld.u64 %r981,[%r947+72]; sub.u64 %r982,%r979,%r981; st.u64 [%r945+80],%r982; st.u64 [%r945+64],%r668; $L6: .loc 1 2499 1945; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r984,[%value_in]; } .loc 1 2499 4 mov.u64 %r9869986; call (%value_in),_gfortrani_xmallocarray987,[%value_in]; } .loc 1 2499 2 st.u64 [%r945],%r987; .loc 1 2500 24 mov.u64 %r989,0; st.u64 [%r945+8],%r989; .loc 1 2557 7 ld.s8 %r627,[%r946+28]; .loc 1 2564 15 ld.u64 %r991,[%r946+56]; add.u64 %r990,%r991,1; .loc 1 2564 13 ld.u64 %r992,[%r946+48]; sub.u64 %r638,%r990,%r992; bra $L7; $L4: .loc 1 2502 12 cvta.global.u64 %r993,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r994,[%r993+36]; setp.eq.u32 %r995,%r994,0; @ %r995 bra $L8; .loc 1 2509 17 ld.u64 %r997,[%r945+56]; add.u64 %r996,%r997,1; .loc 1 2509 15 ld.u64 %r998,[%r945+48]; sub.u64 %r638,%r996,%r998; .loc 1 2506 10 setp.ne.u16 %r1000,%r951,1; @ %r1000 bra $L9; .loc 1 2508 17 ld.u64 %r1002,[%r947+80]; add.u64 %r1001,%r1002,1; .loc 1 2508 15 ld.u64 %r1003,[%r947+72]; sub.u64 %r217,%r1001,%r1003; .loc 1 2510 7 setp.ne.u64 %r1004,%r217,%r638; @ %r1004 bra $L10; .loc 1 2564 15 ld.u64 %r1006,[%r946+56]; add.u64 %r1005,%r1006,1; .loc 1 2564 13 ld.u64 %r1007,[%r946+48]; sub.u64 %r638,%r1005,%r1007; .loc 1 2543 7 ld.s8 %r901,[%r945+28]; .loc 1 2548 27 ld.u64 %r844,[%r945+40]; .loc 1 2543 6 cvt.u16.u32 %r1008,%r901; setp.eq.u16 %r1009,%r1008,1; @ %r1009 bra $L11; .loc 1 2553 16 ld.u64 %r664,[%r945+64]; .loc 1 2560 16 ld.u64 %r298,[%r946+40]; .loc 1 2563 14 mov.u64 %r162,1; .loc 1 2561 16 mov.u64 %r150,%r162; bra $L12; $L10: .loc 1 2511 6 st.u64 [%stack+8],%r217; st.u64 [%stack],%r638; cvta.const.u64 %r10101010_gfortran_runtime_error2485 4 ld.u64 %r1013,[%r946+56]; add.u64 %r1012,%r1013,1; ld.u64 %r1014,[%r946+48]; sub.u64 %r672,%r1012,%r1014; .loc 1 2515 15 cvt.u16.u32 %r1015,%r1551; setp.ne.u16 %r1017,%r1015,1; @ %r1017 bra $L13; .loc 1 2519 7 setp.eq.u64 %r1018,%r672,%r638; @ %r1018 bra $L14; .loc 1 2520 6 st.u64 [%stack+8],%r672; st.u64 [%stack],%r638; cvta.const.u64 %r101101_gfortran_runtime_error13: .loc 1 2528 7 setp.eq.u64 %r1021,%r672,%r638; @ %r1021 bra $L15; .loc 1 2529 6 st.u64 [%stack+8],%r672; st.u64 [%stack],%r638; cvta.const.u64 %r102210stack; call _gfortran_runtime_error15: .loc 1 2533 17 ld.u64 %r1025,[%r947+80]; add.u64 %r1024,%r1025,1; .loc 1 2533 15 ld.u64 %r1026,[%r947+72]; sub.u64 %r215,%r1024,%r1026; .loc 1 2534 17 ld.u64 %r1028,[%r945+80]; add.u64 %r1027,%r1028,1; .loc 1 2534 15 ld.u64 %r1029,[%r945+72]; sub.u64 %r216,%r1027,%r1029; .loc 1 2535 7 setp.eq.u64 %r1030,%r215,%r216; @ %r1030 bra $L14; .loc 1 2536 6 st.u64 [%stack+8],%r215; st.u64 [%stack],%r216; cvta.const.u64 %r1031,$LC10stack; call _gfortran_runtime_error2564 15 ld.u64 %r1034,[%r946+56]; add.u64 %r1033,%r1034,1; .loc 1 2564 13 ld.u64 %r1035,[%r946+48]; sub.u64 %r638,%r1033,%r1035; $L7: .loc 1 2548 27 ld.u64 %r664,[%r945+40]; .loc 1 2543 6 ld.s8 %r1037,[%r945+28]; cvt.u16.u32 %r1036,%r1037; 
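// The call sequence just above (.loc 2499-2500 of matmul_c4.c) sizes and allocates the
// result descriptor for _gfortran_matmul_c4 when it arrives with a NULL data pointer:
// _gfortran_size0 computes the element count from the bounds previously stored at
// [%r945+40..+80], _gfortrani_xmallocarray allocates the buffer, the pointer is stored
// at descriptor offset 0 and the offset field at +8 is zeroed.  A rough C sketch of
// that step follows; struct layout and names here are assumptions for illustration,
// not the exact libgfortran source:
//
//   #include <stddef.h>
//   #include <stdlib.h>
//
//   typedef struct { void *base_addr; size_t offset; /* dtype, dims ... */ } desc_t;
//
//   static void alloc_result (desc_t *retarray, size_t elems, size_t elem_size)
//   {
//     if (retarray->base_addr == NULL)          /* PTX: ld.u64 %r960,[%r945]; setp */
//       {
//         /* result bounds were filled in first (stores at [%r945+40..+80]) */
//         retarray->base_addr = calloc (elems, elem_size); /* stand-in for xmallocarray */
//         retarray->offset = 0;                  /* PTX: st.u64 [%r945+8],0 */
//       }
//   }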
setp.eq.u16 %r1038,%r1036,1; @ %r1038 bra $L130; $L127: .loc 1 2553 16 mov.u64 %r844,%r664; ld.u64 %r664,[%r945+64]; bra $L16; $L130: .loc 1 2548 27 mov.u64 %r844,%r664; $L16: .loc 1 2560 16 ld.u64 %r298,[%r946+40]; .loc 1 2557 6 cvt.u16.u32 %r1039,%r627; setp.eq.u16 %r1040,%r1039,1; @ %r1040 bra $L131; $L125: .loc 1 2569 16 ld.u64 %r150,[%r946+64]; .loc 1 2571 15 ld.u64 %r1042,[%r946+80]; add.u64 %r1041,%r1042,1; .loc 1 2571 13 ld.u64 %r1043,[%r946+72]; mov.u64 %r162,%r638; sub.u64 %r638,%r1041,%r1043; bra $L12; $L131: .loc 1 2563 14 mov.u64 %r162,1; .loc 1 2561 16 mov.u64 %r150,%r162; $L12: .loc 1 2575 16 ld.u64 %r1045,[%r947+56]; add.u64 %r1044,%r1045,1; ld.u64 %r1046,[%r947+48]; sub.u64 %r65,%r1044,%r1046; .loc 1 2575 6 setp.eq.u64 %r1047,%r65,%r638; @ %r1047 bra $L17; .loc 1 2577 10 set.u32.gt.s64 %r1049,%r638,0; neg.s32 %r1050,%r1049; .loc 1 2577 21 set.u32.gt.s64 %r1052,%r65,0; neg.s32 %r1053,%r1052; cvt.u16.u32 %r1055,%r1050; cvt.u16.u32 %r1056,%r1053; or.b16 %r1054,%r1055,%r1056; cvt.u32.u16 %r1057,%r1054; cvt.u16.u8 %r1058,%r1057; setp.eq.u16 %r1059,%r1058,0; @ %r1059 bra $L17; .loc 1 2578 2 st.u64 [%stack+8],%r638; st.u64 [%stack],%r65; cvta.const.u64 %r10601060_gfortran_runtime_error17: .loc 1 2583 7 ld.s8 %r70,[%r947+28]; .loc 1 2586 16 ld.u64 %r663,[%r947+40]; .loc 1 2583 6 cvt.u16.u32 %r1062,%r70; setp.eq.u16 %r1063,%r1062,1; @ %r1063 bra $L132; .loc 1 2597 16 ld.u64 %r151,[%r947+64]; .loc 1 2598 16 ld.u64 %r1065,[%r947+80]; add.u64 %r1064,%r1065,1; .loc 1 2598 14 ld.u64 %r1066,[%r947+72]; sub.u64 %r163,%r1064,%r1066; bra $L18; $L132: .loc 1 2592 14 mov.u64 %r163,1; .loc 1 2591 16 mov.u64 %r151,256; $L18: .loc 1 2601 9 ld.u64 %r220,[%r946]; .loc 1 2602 9 ld.u64 %r221,[%r947]; .loc 1 2603 8 ld.u64 %r222,[%r945]; .loc 1 2612 28 set.u32.eq.u64 %r1068,%r844,1; neg.s32 %r1069,%r1068; cvt.u32.u32 %r1067,%r1069; cvt.u32.u8 %r75,%r1067; .loc 1 2612 46 set.u32.eq.u64 %r1071,%r298,1; neg.s32 %r1072,%r1071; cvt.u32.u32 %r1070,%r1072; cvt.u32.u8 %r662,%r1070; .loc 1 2612 7 set.u32.ne.u32 %r1074,%r948,0; neg.s32 %r1075,%r1074; .loc 1 2612 16 cvt.u16.u32 %r1077,%r1075; cvt.u16.u32 %r1078,%r75; and.b16 %r1076,%r1077,%r1078; .loc 1 2612 6 cvt.u32.u16 %r1079,%r1076; cvt.u16.u8 %r1080,%r1079; setp.eq.u16 %r1081,%r1080,0; @ %r1081 bra $L19; .loc 1 2612 63 set.u32.eq.u64 %r1083,%r150,1; neg.s32 %r1084,%r1083; .loc 1 2612 51 cvt.u16.u32 %r1086,%r1084; cvt.u16.u32 %r1087,%r662; or.b16 %r1085,%r1086,%r1087; .loc 1 2612 33 cvt.u32.u16 %r1088,%r1085; cvt.u16.u8 %r1089,%r1088; setp.eq.u16 %r1090,%r1089,0; @ %r1090 bra $L20; .loc 1 2613 20 set.u32.eq.u64 %r1092,%r663,1; neg.s32 %r1093,%r1092; .loc 1 2613 37 set.u32.eq.u64 %r1095,%r151,1; neg.s32 %r1096,%r1095; .loc 1 2613 25 cvt.u16.u32 %r1098,%r1093; cvt.u16.u32 %r1099,%r1096; or.b16 %r1097,%r1098,%r1099; .loc 1 2613 7 cvt.u32.u16 %r1100,%r1097; cvt.u16.u8 %r1101,%r1100; setp.eq.u16 %r1102,%r1101,0; @ %r1102 bra $L20; .loc 1 2615 13 cvt.rn.f32.s32 %r86,%r949; .loc 1 2614 12 cvt.rn.f32.s64 %r1103,%r162; .loc 1 2614 50 cvt.rn.f32.s64 %r1104,%r638; .loc 1 2614 47 mul.f32 %r1105,%r1103,%r1104; .loc 1 2614 31 cvt.rn.f32.s64 %r1106,%r163; .loc 1 2614 47 mul.f32 %r1107,%r1105,%r1106; .loc 1 2615 13 mul.f32 %r1108,%r86,%r86; mul.f32 %r1109,%r1108,%r86; .loc 1 2614 7 setp.gt.f32 %r1110,%r1107,%r1109; @ ! 
%r1110 bra $L19; .loc 1 2617 17 cvt.u32.u64 %r90,%r162; st.u32 [%frame+36],%r90; .loc 1 2617 29 cvt.u32.u64 %r91,%r163; st.u32 [%frame+32],%r91; .loc 1 2617 41 cvt.u32.u64 %r92,%r638; st.u32 [%frame+28],%r92; .loc 1 2617 52 cvt.u32.u64 %r93,%r664; st.u32 [%frame+24],%r93; .loc 1 2618 27 ld.const.f32 %r1111,[$LC8]; st.f32 [%frame+8],%r1111; ld.const.f32 %r1112,[$LC8+4]; st.f32 [%frame+12],%r1112; .loc 1 2618 36 ld.const.f32 %r1545,[$LC9]; st.f32 [%frame],%r1545; ld.const.f32 %r1535,[$LC9+4]; st.f32 [%frame+4],%r1535; .loc 1 2619 17 setp.ne.u64 %r1115,%r298,1; @ %r1115 bra $L22; cvt.u32.u64 %r198,%r150; bra $L23; $L22: cvt.u32.u64 %r198,%r298; $L23: st.u32 [%frame+20],%r198; .loc 1 2620 3 setp.ne.u64 %r1116,%r663,1; @ %r1116 bra $L24; cvt.u32.u64 %r199,%r151; bra $L25; $L24: cvt.u32.u64 %r199,%r663; $L25: st.u32 [%frame+16],%r199; .loc 1 2622 15 set.u32.gt.s32 %r1118,%r198,0; neg.s32 %r1119,%r1118; .loc 1 2622 26 set.u32.gt.s32 %r1121,%r199,0; neg.s32 %r1122,%r1121; .loc 1 2622 19 cvt.u16.u32 %r1124,%r1119; cvt.u16.u32 %r1125,%r1122; and.b16 %r1123,%r1124,%r1125; .loc 1 2622 10 cvt.u32.u16 %r1126,%r1123; cvt.u16.u8 %r1127,%r1126; setp.eq.u16 %r1128,%r1127,0; @ %r1128 bra $L19; .loc 1 2622 37 set.u32.gt.s32 %r1130,%r93,0; neg.s32 %r1131,%r1130; .loc 1 2622 46 set.u32.gt.s32 %r1133,%r90,1; neg.s32 %r1134,%r1133; .loc 1 2622 41 cvt.u16.u32 %r1136,%r1131; cvt.u16.u32 %r1137,%r1134; and.b16 %r1135,%r1136,%r1137; cvt.u32.u16 %r1138,%r1135; cvt.u16.u8 %r1139,%r1138; setp.eq.u16 %r1140,%r1139,0; @ %r1140 bra $L19; .loc 1 2622 55 set.u32.gt.s32 %r1142,%r91,1; neg.s32 %r1143,%r1142; .loc 1 2622 64 set.u32.gt.s32 %r1145,%r92,1; neg.s32 %r1146,%r1145; .loc 1 2622 59 cvt.u16.u32 %r1148,%r1143; cvt.u16.u32 %r1149,%r1146; and.b16 %r1147,%r1148,%r1149; cvt.u32.u16 %r1150,%r1147; cvt.u16.u8 %r1151,%r1150; setp.eq.u16 %r1152,%r1151,0; @ %r1152 bra $L19; .loc 1 2624 4 setp.ne.u64 %r1153,%r950,0; @ %r1153 bra $L27; cvta.const.u64 %r1157,$LC10; cvta.const.u64 %r1156,__func__$0; mov.u32 %r1155,2624; cvta.const.u64 %r11115415511561127: .loc 1 2626 17 and.b32 %r1158,%r948,2; .loc 1 2626 7 setp.ne.u32 %r1159,%r1158,0; @ %r1159 bra $L133; .loc 1 2629 35 setp.eq.u64 %r1160,%r298,1; @ %r1160 bra $L134; cvta.const.u64 %r164,$LC2; bra $L28; $L133: .loc 1 2627 13 cvta.const.u64 %r164,$LC1; bra $L28; $L134: .loc 1 2629 35 cvta.const.u64 %r164,$LC0; $L28: .loc 1 2631 17 and.b32 %r1161,%r948,4; .loc 1 2631 7 setp.ne.u32 %r1162,%r1161,0; @ %r1162 bra $L135; .loc 1 2634 35 setp.eq.u64 %r1163,%r663,1; @ %r1163 bra $L136; cvta.const.u64 %r165,$LC2; bra $L29; $L135: .loc 1 2632 13 cvta.const.u64 %r165,$LC1; bra $L29; $L136: .loc 1 2634 35 cvta.const.u64 %r165,$LC0; $L29: .loc 1 2636 4 add.u64 %r1179,%frame,24; add.u64 %r1180,%frame,16; add.u64 %r1181,%frame,20; add.u64 %r1182,%frame,8; add.u64 %r1183,%frame,28; add.u64 %r1184,%frame,32; add.u64 %r1185,%frame,36; mov.u32 %r1178,1;,.param .u64 %in_ar3,.param .u64 %in_ar4,.param .u64 %in_ar5,.param .u64 %in_ar6,.param .u64 %in_ar7,.param .u64 %in_ar8,.param .u64 %in_ar9,.param .u64 %in_ar10,.param .u64 %in_ar11,.param .u64 %in_ar12,.param .u32 %in_ar13,.param .u32 %in_ar14164r118118411831182220; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r1181; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r221; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r1180; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%frame; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r222; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r1179; .param .u32 %out_arg14; st.param.u32 
[%out_arg14],%r1178; .param .u32 %out_arg15; st.param.u32 [%out_arg15],%r1178; call %r950,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14,%out_arg15),$LCT0; } bra $L1; $L19: .loc 1 2643 21 cvt.u16.u32 %r1187,%r75; cvt.u16.u32 %r1188,%r662; and.b16 %r1186,%r1187,%r1188; .loc 1 2643 6 cvt.u32.u16 %r1189,%r1186; cvt.u16.u8 %r1190,%r1189; setp.eq.u16 %r1191,%r1190,0; @ %r1191 bra $L31; .loc 1 2643 38 setp.ne.u64 %r1192,%r663,1; @ %r1192 bra $L20; .loc 1 2644 7 cvt.u16.u32 %r1193,%r70; setp.eq.u16 %r1194,%r1193,1; @ %r1194 bra $L32; .loc 1 2677 16 add.u64 %r1195,%r664,1; .loc 1 2678 9 shl.b64 %r111,%r1195,3; sub.u64 %r224,%r222,%r111; .loc 1 2687 7 setp.le.s64 %r1196,%r163,0; @ %r1196 bra $L33; add.u64 %r802,%r111,-8; mov.u64 %r800,%r222; add.u64 %r806,%r163,1; .loc 1 2689 22 shl.b64 %r623,%r162,3; .loc 1 2687 13 mov.u64 %r255,%r663; setp.gt.s64 %r1556,%r162,0; and.b64 %r1636,%r163,3; setp.eq.u64 %r1639,%r1636,0; @ %r1639 bra $L34; setp.eq.u64 %r1638,%r1636,1; @ %r1638 bra $L282; setp.eq.u64 %r1637,%r1636,2; @ %r1637 bra $L283; bra $L338; $L35: .loc 1 2689 22 mov.u32 %r12002001202,[%value_in]; } $L36: .loc 1 2687 24 add.u64 %r1640,%r255,1; .loc 1 2687 7 add.u64 %r1695,%r800,%r802; .loc 1 2688 2 @ %r1556 bra $L220; bra $L339; $L34: @ %r1556 bra $L35; bra $L36; $L33: .loc 1 2692 28 set.u32.eq.u64 %r1207,%r638,0; neg.s32 %r1208,%r1207; .loc 1 2692 13 set.u32.eq.u64 %r1210,%r162,0; neg.s32 %r1211,%r1210; .loc 1 2692 28 cvt.u16.u32 %r1213,%r1208; cvt.u16.u32 %r1214,%r1211; or.b16 %r1212,%r1213,%r1214; .loc 1 2692 23 set.u32.eq.u64 %r1216,%r163,0; neg.s32 %r1217,%r1216; .loc 1 2692 28 cvt.u16.u32 %r1221,%r1217; or.b16 %r1219,%r1212,%r1221; cvt.u32.u16 %r1222,%r1219; cvt.u16.u8 %r1223,%r1222; setp.eq.u16 %r1224,%r1223,0; @ ! %r1224 bra $L1; .loc 1 2697 10 setp.eq.u64 %r1225,%r150,1; selp.u64 %r196,%r664,%r150,%r1225; .loc 1 2702 21 shl.b64 %r1226,%r196,8; .loc 1 2702 14 add.u64 %r229,%r1226,%r151; .loc 1 2706 12 min.s64 %r1228,%r229,65536; shl.b64 %r1229,%r12282291230,[%value_in]; } .loc 1 2710 7 setp.gt.s64 %r1232,%r163,0; @ %r1232 bra $L39; $L42: .loc 1 29302918 7 bra $L1; $L39: .loc 1 2680 16 add.u64 %r1234,%r150,1; .loc 1 2681 9 shl.b64 %r1235,%r1234,3; sub.u64 %r226,%r220,%r1235; .loc 1 2683 16 add.u64 %r1236,%r151,1; .loc 1 2684 9 shl.b64 %r117,%r1236,3; shl.b64 %r769,%r664,12; shl.b64 %r767,%r664,4; shl.b64 %r771,%r664,5; neg.s64 %r1238,%r664; shl.b64 %r1239,%r1238,12; shl.b64 %r772,%r1238,3; shl.b64 %r775,%r664,9; shl.b64 %r783,%r151,12; add.u64 %r784,%r117,-8; sub.u64 %r785,%r221,%r117; add.u64 %r782,%r784,%r785; add.u64 %r1244,%r151,%r151; add.u64 %r1245,%r1244,%r151; shl.b64 %r1246,%r1245,3; add.u64 %r786,%r1246,%r785; add.u64 %r1247,%r163,-1; and.b64 %r793,%r1247,-512; shl.b64 %r749,%r150,8; add.u64 %r750,%r150,%r150; add.u64 %r797,%r111,-8; sub.u64 %r798,%r224,%r797; shl.b64 %r710,%r150,4; shl.b64 %r1251,%r1238,5; shl.b64 %r67,%r151,5; mov.u64 %r774,%r664; mov.u64 %r770,%r771; mov.u64 %r765,0; setp.gt.s64 %r1553,%r638,0; add.u64 %r1590,%r798,8; setp.gt.s64 %r1591,%r162,0; add.u64 %r1592,%r162,1; add.u64 %r1593,%r226,16; bra $L41; $L138: mov.u64 %r765,%r766; $L41: add.u64 %r795,%r765,1; .loc 1 2715 9 sub.u64 %r231,%r163,%r765; min.s64 %r1696,%r231,512; .loc 1 2716 24 shr.s64 %r1253,%r1696,63; shr.u64 %r1254,%r1253,62; add.u64 %r1255,%r1696,%r1254; and.b64 %r1256,%r1255,3; sub.u64 %r1257,%r1256,%r1254; .loc 1 2716 10 sub.u64 %r232,%r1696,%r1257; .loc 1 2718 4 @ %r1553 bra $L40; $L45: .loc 1 2710 7 add.u64 %r766,%r765,512; add.u64 %r767,%r767,%r769; add.u64 
%r770,%r770,%r769; add.u64 %r772,%r772,%r1239; add.u64 %r774,%r774,%r775; add.u64 %r782,%r782,%r783; add.u64 %r786,%r786,%r783; setp.ne.u64 %r1259,%r765,%r793; @ %r1259 bra $L138; bra $L42; $L40: .loc 1 2768 13 add.u64 %r644,%r232,%r795; add.u64 %r799,%r1590,%r767; add.u64 %r760,%r782,%r784; add.u64 %r763,%r784,%r786; add.u64 %r259,%r767,%r797; mul.lo.u64 %r583,%r644,%r664; mul.lo.u64 %r577,%r644,%r151; mad.lo.u64 %r756,%r644,%r784,%r785; mov.u64 %r747,%r750; .loc 1 2718 12 mov.u64 %r235,%r663; add.u64 %r1552,%r638,1; .loc 1 2768 8 add.u64 %r1583,%r644,-1; setp.ge.s64 %r1585,%r1583,%r795; .loc 1 2873 17 add.u64 %r1588,%r1696,%r795; .loc 1 2874 9 setp.le.s64 %r1589,%r1588,%r644; $L44: .loc 1 2723 13 sub.u64 %r233,%r1552,%r235; min.s64 %r1698,%r233,256; .loc 1 2724 28 shr.u64 %r1265,%r1698,63; add.u64 %r1266,%r1698,%r1265; and.b64 %r1267,%r1266,1; sub.u64 %r1268,%r1267,%r1265; .loc 1 2724 14 sub.u64 %r234,%r1698,%r1268; .loc 1 2727 8 @ %r1591 bra $L43; $L73: .loc 1 2718 30 add.u64 %r235,%r235,256; .loc 1 2718 4 add.u64 %r747,%r747,%r749; setp.ge.s64 %r1270,%r638,%r235; @ %r1270 bra $L44; bra $L45; $L43: .loc 1 2763 26 add.u64 %r1271,%r235,-1; add.u64 %r1272,%r1271,%r1698; .loc 1 2763 31 mul.lo.u64 %r146,%r1272,%r150; .loc 1 2734 13 add.u64 %r643,%r234,%r235; .loc 1 2734 8 add.u64 %r645,%r643,-1; add.u64 %r1273,%r146,1; shl.b64 %r1274,%r1273,3; add.u64 %r737,%r226,%r1274; shl.b64 %r679,%r1698,3; add.u64 %r1275,%r679,-8; add.u64 %r678,%r1230,%r1275; add.u64 %r1276,%r679,-4; add.u64 %r683,%r1230,%r1276; shl.b64 %r758,%r235,3; add.u64 %r759,%r758,%r782; add.u64 %r761,%r758,%r760; add.u64 %r762,%r758,%r786; add.u64 %r764,%r758,%r763; .loc 1 2904 18 add.u64 %r1546,%r1698,%r235; add.u64 %r757,%r756,%r758; add.u64 %r1277,%r577,%r235; add.u64 %r1278,%r1277,%r1698; shl.b64 %r1279,%r1278,3; add.u64 %r574,%r1279,%r785; mov.u64 %r736,%r799; .loc 1 2727 16 mov.u64 %r248,%r663; setp.le.s64 %r1536,%r235,%r645; setp.gt.s64 %r1548,%r1698,%r234; setp.gt.s64 %r1547,%r1696,%r232; not.b64 %r1576,%r235; add.u64 %r1577,%r1576,%r643; and.b64 %r1578,%r1577,-2; .loc 1 2904 13 add.u64 %r1580,%r1546,-1; setp.gt.s64 %r1581,%r235,%r1580; $L72: .loc 1 2732 10 sub.u64 %r236,%r1592,%r248; min.s64 %r1699,%r236,256; .loc 1 2733 25 shr.s64 %r1533,%r1699,63; shr.u64 %r1283,%r1699,63; add.u64 %r1284,%r1699,%r1283; and.b64 %r1285,%r1284,1; sub.u64 %r1286,%r1285,%r1283; .loc 1 2733 11 sub.u64 %r237,%r1699,%r1286; .loc 1 2735 5 @ %r1536 bra $L46; $L55: .loc 1 2757 8 @ ! 
%r1548 bra $L48; bra $L47; $L46: .loc 1 2737 17 add.u64 %r647,%r237,%r248; .loc 1 2737 12 add.u64 %r648,%r647,-1; sub.u64 %r1289,%r747,%r150; add.u64 %r1290,%r1289,%r248; add.u64 %r1291,%r1290,%r1699; shl.b64 %r1292,%r1291,3; add.u64 %r1293,%r1292,-8; add.u64 %r709,%r226,%r1293; shl.b64 %r1295,%r1699,11; add.u64 %r1296,%r1295,-2048; add.u64 %r716,%r1230,%r1296; add.u64 %r1297,%r248,%r747; add.u64 %r1298,%r1297,%r1699; shl.b64 %r1299,%r1298,3; add.u64 %r1300,%r1299,-8; add.u64 %r721,%r226,%r1300; neg.s64 %r1304,%r1699; shl.b64 %r1305,%r1304,3; mov.u64 %r726,%r747; mov.u64 %r707,0; setp.le.s64 %r1542,%r248,%r648; setp.gt.s64 %r1549,%r1699,%r237; add.u64 %r1570,%r1305,8; not.b64 %r1571,%r248; add.u64 %r1572,%r1571,%r647; and.b64 %r1573,%r1572,-2; add.u64 %r1574,%r1573,%r248; bra $L54; $L139: mov.u64 %r707,%r708; $L54: .loc 1 2738 9 @ %r1542 bra $L49; $L53: .loc 1 2749 12 @ %r1549 bra $L50; bra $L51; $L49: add.u64 %r694,%r1570,%r709; shl.b64 %r1308,%r707,3; add.u64 %r695,%r1230,%r1308; add.u64 %r696,%r1570,%r721; add.u64 %r1313,%r1574,%r726; shl.b64 %r1314,%r1313,3; add.u64 %r706,%r1314,%r1593; sub.u64 %r1625,%r706,%r696; add.u64 %r1626,%r1625,-16; shr.u64 %r1624,%r1626,4; and.b64 %r1628,%r1624,1; setp.ne.u64 %r1629,%r1628,0; @ ! %r1629 bra $L340; $L52: ld.f32 %r192,[%r694+4]; .loc 1 2740 49 ld.f32 %r1316,[%r694]; st.f32 [%r695],%r1316; st.f32 [%r695+4],%r192; ld.f32 %r182,[%r696+4]; .loc 1 2742 49 ld.f32 %r1317,[%r696]; st.f32 [%r695+8],%r1317; st.f32 [%r695+12],%r182; ld.f32 %r180,[%r694+12]; .loc 1 2744 49 ld.f32 %r1318,[%r694+8]; st.f32 [%r695+2048],%r1318; st.f32 [%r695+2052],%r180; ld.f32 %r178,[%r696+12]; .loc 1 2746 49 ld.f32 %r1319,[%r696+8]; st.f32 [%r695+2056],%r1319; st.f32 [%r695+2060],%r178; .loc 1 2738 9 add.u64 %r1630,%r694,16; add.u64 %r1631,%r695,4096; add.u64 %r1632,%r696,16; ld.f32 %r1703,[%r1630+4]; .loc 1 2740 49 ld.f32 %r1704,[%r1630]; st.f32 [%r1631],%r1704; st.f32 [%r1631+4],%r1703; ld.f32 %r1705,[%r1632+4]; .loc 1 2742 49 ld.f32 %r1706,[%r1632]; st.f32 [%r1631+8],%r1706; st.f32 [%r1631+12],%r1705; ld.f32 %r1707,[%r1630+12]; .loc 1 2744 49 ld.f32 %r1708,[%r1630+8]; st.f32 [%r1631+2048],%r1708; st.f32 [%r1631+2052],%r1707; ld.f32 %r1709,[%r1632+12]; .loc 1 2746 49 ld.f32 %r1710,[%r1632+8]; st.f32 [%r1631+2056],%r1710; st.f32 [%r1631+2060],%r1709; .loc 1 2738 9 add.u64 %r694,%r694,32; add.u64 %r695,%r695,8192; add.u64 %r696,%r696,32; setp.ne.u64 %r1711,%r696,%r706; @ %r1711 bra $L52; bra $L53; $L50: ld.f32 %r174,[%r709+4]; .loc 1 2751 41 ld.f32 %r1321,[%r709]; st.f32 [%r716],%r1321; st.f32 [%r716+4],%r174; ld.f32 %r171,[%r721+4]; .loc 1 2753 41 ld.f32 %r1322,[%r721]; st.f32 [%r716+8],%r1322; st.f32 [%r716+12],%r171; $L51: .loc 1 2735 5 add.u64 %r708,%r707,2; add.u64 %r709,%r709,%r710; add.u64 %r716,%r716,16; add.u64 %r721,%r721,%r710; add.u64 %r726,%r726,%r750; setp.ne.u64 %r1323,%r707,%r1578; @ %r1323 bra $L139; bra $L55; $L48: .loc 1 2767 25 shr.u64 %r1326,%r1533,62; add.u64 %r1327,%r1699,%r1326; and.b64 %r1328,%r1327,3; sub.u64 %r1329,%r1328,%r1326; .loc 1 2767 11 sub.u64 %r240,%r1699,%r1329; .loc 1 2769 5 @ %r1585 bra $L56; bra $L57; $L47: .loc 1 2759 17 add.u64 %r131,%r1699,%r248; .loc 1 2759 12 add.u64 %r1331,%r131,-1; .loc 1 2760 9 setp.lt.s64 %r1332,%r1331,%r248; @ %r1332 bra $L48; add.u64 %r1333,%r131,%r146; shl.b64 %r1334,%r1333,3; add.u64 %r692,%r226,%r1334; mov.u64 %r682,%r683; mov.u64 %r677,%r678; mov.u64 %r676,%r737; sub.u64 %r1614,%r692,%r737; add.u64 %r1615,%r1614,-8; shr.u64 %r1613,%r1615,3; add.u64 %r1616,%r1613,1; and.b64 %r1617,%r1616,3; setp.eq.u64 
%r1620,%r1617,0; @ %r1620 bra $L58; setp.eq.u64 %r1619,%r1617,1; @ %r1619 bra $L284; setp.eq.u64 %r1618,%r1617,2; @ %r1618 bra $L285; bra $L341; $L58: ld.f32 %r168,[%r676+4]; .loc 1 2762 43 ld.f32 %r1335,[%r676]; st.f32 [%r677],%r1335; st.f32 [%r682],%r168; .loc 1 2760 9 add.u64 %r1621,%r676,8; ld.f32 %r1715,[%r1621+4]; .loc 1 2762 43 ld.f32 %r1716,[%r1621]; st.f32 [%r677+2048],%r1716; st.f32 [%r682+2048],%r1715; .loc 1 2760 9 add.u64 %r1717,%r676,16; ld.f32 %r1721,[%r1717+4]; .loc 1 2762 43 ld.f32 %r1722,[%r1717]; st.f32 [%r677+4096],%r1722; st.f32 [%r682+4096],%r1721; .loc 1 2760 9 add.u64 %r1723,%r676,24; ld.f32 %r1727,[%r1723+4]; .loc 1 2762 43 ld.f32 %r1728,[%r1723]; st.f32 [%r677+6144],%r1728; st.f32 [%r682+6144],%r1727; .loc 1 2760 9 add.u64 %r676,%r676,32; add.u64 %r677,%r677,8192; add.u64 %r682,%r682,8192; setp.ne.u64 %r1729,%r676,%r692; @ %r1729 bra $L58; bra $L48; $L57: .loc 1 2871 8 @ ! %r1547 bra $L60; bra $L59; $L56: .loc 1 2876 14 add.u64 %r650,%r240,%r248; .loc 1 2876 9 add.u64 %r651,%r650,-1; add.u64 %r1338,%r650,%r774; shl.b64 %r1339,%r1338,3; add.u64 %r140,%r224,%r1339; add.u64 %r639,%r736,%r797; add.u64 %r637,%r639,%r797; add.u64 %r51,%r637,%r797; not.b64 %r1340,%r248; add.u64 %r1341,%r1340,%r650; shr.u64 %r1342,%r1341,2; add.u64 %r1343,%r1342,1; shl.b64 %r244,%r1343,5; sub.u64 %r1344,%r650,%r248; add.u64 %r1346,%r1344,1; shl.b64 %r1347,%r1346,11; add.u64 %r552,%r1230,%r1347; shl.b64 %r238,%r1699,3; mov.u64 %r640,%r736; mov.u64 %r641,%r764; mov.u64 %r642,%r762; mov.u64 %r649,%r761; mov.u64 %r88,%r759; mov.u64 %r200,%r772; mov.u64 %r250,%r795; mov.u64 %r49,0; setp.le.s64 %r1543,%r248,%r651; setp.gt.s64 %r1550,%r1699,%r240; $L68: .loc 1 2772 9 @ %r1543 bra $L61; $L67: .loc 1 2843 12 @ ! %r1550 bra $L63; bra $L62; $L61: add.u64 %r506,%r1230,2048; add.u64 %r504,%r1230,4096; add.u64 %r502,%r1230,6144; add.u64 %r500,%r1230,8192; .loc 1 2904 13 mov.u64 %r498,0; $L66: add.u64 %r496,%r640,%r498; ld.f32 %r184,[%r496]; ld.f32 %r281,[%r496+4]; ld.f32 %r201,[%r496+8]; ld.f32 %r189,[%r496+12]; add.u64 %r483,%r639,%r498; ld.f32 %r282,[%r483]; ld.f32 %r45,[%r483+4]; ld.f32 %r278,[%r483+8]; ld.f32 %r270,[%r483+12]; add.u64 %r472,%r637,%r498; ld.f32 %r280,[%r472]; ld.f32 %r279,[%r472+4]; ld.f32 %r300,[%r472+8]; ld.f32 %r301,[%r472+12]; add.u64 %r286,%r51,%r498; ld.f32 %r296,[%r286]; ld.f32 %r299,[%r286+4]; ld.f32 %r302,[%r286+8]; ld.f32 %r303,[%r286+12]; ld.f32 %r272,[%r496+16]; ld.f32 %r273,[%r496+20]; ld.f32 %r284,[%r496+24]; ld.f32 %r283,[%r496+28]; ld.f32 %r290,[%r483+16]; ld.f32 %r274,[%r483+20]; ld.f32 %r276,[%r483+24]; ld.f32 %r275,[%r483+28]; ld.f32 %r304,[%r472+16]; ld.f32 %r305,[%r472+20]; ld.f32 %r308,[%r472+24]; ld.f32 %r309,[%r472+28]; ld.f32 %r306,[%r286+16]; ld.f32 %r307,[%r286+20]; ld.f32 %r310,[%r286+24]; ld.f32 %r311,[%r286+28]; .loc 1 2791 6 @ %r1581 bra $L64; mov.u64 %r523,0; $L65: add.u64 %r521,%r506,%r523; ld.f32 %r312,[%r521+-2048]; ld.f32 %r313,[%r521+-2044]; add.u64 %r519,%r88,%r523; ld.f32 %r314,[%r519]; ld.f32 %r315,[%r519+4]; .loc 1 2794 11 mul.f32 %r317,%r313,%r314; mul.f32 %r1351,%r313,%r315; neg.f32 %r1352,%r1351; fma.rn.f32 %r318,%r312,%r314,%r1352; fma.rn.f32 %r319,%r312,%r315,%r317; .loc 1 2793 14 add.f32 %r184,%r184,%r318; add.f32 %r281,%r281,%r319; add.u64 %r517,%r504,%r523; ld.f32 %r320,[%r517+-2048]; ld.f32 %r321,[%r517+-2044]; .loc 1 2796 11 mul.f32 %r323,%r315,%r320; mul.f32 %r1353,%r315,%r321; neg.f32 %r1354,%r1353; fma.rn.f32 %r324,%r314,%r320,%r1354; fma.rn.f32 %r325,%r314,%r321,%r323; .loc 1 2795 14 add.f32 %r201,%r201,%r324; add.f32 
%r189,%r189,%r325; add.u64 %r515,%r649,%r523; ld.f32 %r326,[%r515]; ld.f32 %r327,[%r515+4]; .loc 1 2798 11 mul.f32 %r329,%r313,%r326; mul.f32 %r1355,%r313,%r327; neg.f32 %r1356,%r1355; fma.rn.f32 %r330,%r312,%r326,%r1356; fma.rn.f32 %r331,%r312,%r327,%r329; .loc 1 2797 14 add.f32 %r282,%r282,%r330; add.f32 %r45,%r45,%r331; .loc 1 2800 11 mul.f32 %r333,%r321,%r326; mul.f32 %r1357,%r321,%r327; neg.f32 %r1358,%r1357; fma.rn.f32 %r334,%r320,%r326,%r1358; fma.rn.f32 %r335,%r320,%r327,%r333; .loc 1 2799 14 add.f32 %r278,%r278,%r334; add.f32 %r270,%r270,%r335; add.u64 %r513,%r642,%r523; ld.f32 %r336,[%r513]; ld.f32 %r337,[%r513+4]; .loc 1 2802 11 mul.f32 %r339,%r313,%r336; mul.f32 %r1359,%r313,%r337; neg.f32 %r1360,%r1359; fma.rn.f32 %r340,%r312,%r336,%r1360; fma.rn.f32 %r341,%r312,%r337,%r339; .loc 1 2801 14 add.f32 %r280,%r280,%r340; add.f32 %r279,%r279,%r341; .loc 1 2804 11 mul.f32 %r343,%r321,%r336; mul.f32 %r1361,%r321,%r337; neg.f32 %r1362,%r1361; fma.rn.f32 %r344,%r320,%r336,%r1362; fma.rn.f32 %r345,%r320,%r337,%r343; .loc 1 2803 14 add.f32 %r300,%r300,%r344; add.f32 %r301,%r301,%r345; add.u64 %r511,%r641,%r523; ld.f32 %r346,[%r511]; ld.f32 %r347,[%r511+4]; .loc 1 2806 11 mul.f32 %r349,%r313,%r346; mul.f32 %r1363,%r313,%r347; neg.f32 %r1364,%r1363; fma.rn.f32 %r350,%r312,%r346,%r1364; fma.rn.f32 %r351,%r312,%r347,%r349; .loc 1 2805 14 add.f32 %r296,%r296,%r350; add.f32 %r299,%r299,%r351; .loc 1 2808 11 mul.f32 %r353,%r321,%r346; mul.f32 %r1365,%r321,%r347; neg.f32 %r1366,%r1365; fma.rn.f32 %r354,%r320,%r346,%r1366; fma.rn.f32 %r355,%r320,%r347,%r353; .loc 1 2807 14 add.f32 %r302,%r302,%r354; add.f32 %r303,%r303,%r355; add.u64 %r509,%r502,%r523; ld.f32 %r356,[%r509+-2048]; ld.f32 %r357,[%r509+-2044]; .loc 1 2810 11 mul.f32 %r359,%r315,%r356; mul.f32 %r1367,%r315,%r357; neg.f32 %r1368,%r1367; fma.rn.f32 %r360,%r314,%r356,%r1368; fma.rn.f32 %r361,%r314,%r357,%r359; .loc 1 2809 14 add.f32 %r272,%r272,%r360; add.f32 %r273,%r273,%r361; add.u64 %r507,%r500,%r523; ld.f32 %r362,[%r507+-2048]; ld.f32 %r363,[%r507+-2044]; .loc 1 2812 11 mul.f32 %r365,%r315,%r362; mul.f32 %r1369,%r315,%r363; neg.f32 %r1370,%r1369; fma.rn.f32 %r366,%r314,%r362,%r1370; fma.rn.f32 %r367,%r314,%r363,%r365; .loc 1 2811 14 add.f32 %r284,%r284,%r366; add.f32 %r283,%r283,%r367; .loc 1 2814 11 mul.f32 %r369,%r327,%r356; mul.f32 %r1371,%r327,%r357; neg.f32 %r1372,%r1371; fma.rn.f32 %r370,%r326,%r356,%r1372; fma.rn.f32 %r371,%r326,%r357,%r369; .loc 1 2813 14 add.f32 %r290,%r290,%r370; add.f32 %r274,%r274,%r371; .loc 1 2816 11 mul.f32 %r373,%r327,%r362; mul.f32 %r1373,%r327,%r363; neg.f32 %r1374,%r1373; fma.rn.f32 %r374,%r326,%r362,%r1374; fma.rn.f32 %r375,%r326,%r363,%r373; .loc 1 2815 14 add.f32 %r276,%r276,%r374; add.f32 %r275,%r275,%r375; .loc 1 2818 11 mul.f32 %r377,%r337,%r356; mul.f32 %r1375,%r337,%r357; neg.f32 %r1376,%r1375; fma.rn.f32 %r378,%r336,%r356,%r1376; fma.rn.f32 %r379,%r336,%r357,%r377; .loc 1 2817 14 add.f32 %r304,%r304,%r378; add.f32 %r305,%r305,%r379; .loc 1 2820 11 mul.f32 %r381,%r337,%r362; mul.f32 %r1377,%r337,%r363; neg.f32 %r1378,%r1377; fma.rn.f32 %r382,%r336,%r362,%r1378; fma.rn.f32 %r383,%r336,%r363,%r381; .loc 1 2819 14 add.f32 %r308,%r308,%r382; add.f32 %r309,%r309,%r383; .loc 1 2822 11 mul.f32 %r385,%r347,%r356; mul.f32 %r1379,%r347,%r357; neg.f32 %r1380,%r1379; fma.rn.f32 %r386,%r346,%r356,%r1380; fma.rn.f32 %r387,%r346,%r357,%r385; .loc 1 2821 14 add.f32 %r306,%r306,%r386; add.f32 %r307,%r307,%r387; .loc 1 2824 11 mul.f32 %r389,%r347,%r362; mul.f32 %r1381,%r347,%r363; neg.f32 %r1382,%r1381; 
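// The mul/neg/fma.rn.f32 groups in this inner loop (.loc 2793-2824 of matmul_c4.c) are
// the single-precision complex multiply-accumulates of the blocked matmul kernel: each
// complex(4) value travels as a (real, imag) pair of .f32 registers, and every complex
// product is expanded into one negated cross term plus two FMAs before being added to
// the accumulator pair.  A short C sketch of one such update (the helper name is
// illustrative only, not taken from the source):
//
//   /* acc += a * b for complex floats, as one mul+neg+fma+fma group computes it */
//   static void cmac (float *acc_re, float *acc_im,
//                     float ar, float ai, float br, float bi)
//   {
//     *acc_re += ar * br - ai * bi;   /* fma.rn.f32 with -(ai*bi) as the addend */
//     *acc_im += ar * bi + ai * br;   /* fma.rn.f32 with  (ai*br) as the addend */
//   }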
fma.rn.f32 %r390,%r346,%r362,%r1382; fma.rn.f32 %r391,%r346,%r363,%r389; .loc 1 2823 14 add.f32 %r310,%r310,%r390; add.f32 %r311,%r311,%r391; .loc 1 2791 6 add.u64 %r523,%r523,8; setp.ne.u64 %r1383,%r523,%r679; @ %r1383 bra $L65; $L64: .loc 1 2826 24 st.f32 [%r496],%r184; st.f32 [%r496+4],%r281; .loc 1 2827 28 st.f32 [%r496+8],%r201; st.f32 [%r496+12],%r189; .loc 1 2828 30 st.f32 [%r483],%r282; st.f32 [%r483+4],%r45; .loc 1 2829 34 st.f32 [%r483+8],%r278; st.f32 [%r483+12],%r270; .loc 1 2830 30 st.f32 [%r472],%r280; st.f32 [%r472+4],%r279; .loc 1 2831 34 st.f32 [%r472+8],%r300; st.f32 [%r472+12],%r301; .loc 1 2832 30 st.f32 [%r286],%r296; st.f32 [%r286+4],%r299; .loc 1 2833 34 st.f32 [%r286+8],%r302; st.f32 [%r286+12],%r303; .loc 1 2834 28 st.f32 [%r496+16],%r272; st.f32 [%r496+20],%r273; .loc 1 2835 28 st.f32 [%r496+24],%r284; st.f32 [%r496+28],%r283; .loc 1 2836 34 st.f32 [%r483+16],%r290; st.f32 [%r483+20],%r274; .loc 1 2837 34 st.f32 [%r483+24],%r276; st.f32 [%r483+28],%r275; .loc 1 2838 34 st.f32 [%r472+16],%r304; st.f32 [%r472+20],%r305; .loc 1 2839 34 st.f32 [%r472+24],%r308; st.f32 [%r472+28],%r309; .loc 1 2840 34 st.f32 [%r286+16],%r306; st.f32 [%r286+20],%r307; .loc 1 2841 34 st.f32 [%r286+24],%r310; st.f32 [%r286+28],%r311; .loc 1 2772 9 add.u64 %r506,%r506,8192; add.u64 %r504,%r504,8192; add.u64 %r502,%r502,8192; add.u64 %r500,%r500,8192; add.u64 %r498,%r498,32; setp.ne.u64 %r1384,%r244,%r498; @ %r1384 bra $L66; bra $L67; $L63: .loc 1 2769 29 add.u64 %r250,%r250,4; .loc 1 2769 5 add.u64 %r200,%r200,%r1251; add.u64 %r140,%r140,%r771; add.u64 %r88,%r88,%r67; add.u64 %r649,%r649,%r67; add.u64 %r642,%r642,%r67; add.u64 %r641,%r641,%r67; add.u64 %r640,%r640,%r771; add.u64 %r639,%r639,%r771; add.u64 %r637,%r637,%r771; add.u64 %r51,%r51,%r771; add.u64 %r49,%r49,%r771; setp.le.s64 %r1385,%r250,%r1583; @ %r1385 bra $L68; bra $L57; $L62: add.u64 %r549,%r140,4; add.u64 %r542,%r49,%r259; add.u64 %r128,%r238,%r640; .loc 1 2904 13 mov.u64 %r553,%r552; mov.u64 %r555,%r140; add.u64 %r1532,%r49,%r767; add.u64 %r1534,%r49,%r770; $L71: ld.f32 %r392,[%r555]; ld.f32 %r393,[%r549]; add.u64 %r1387,%r1532,%r555; add.u64 %r543,%r1387,%r200; ld.f32 %r394,[%r543]; add.u64 %r1388,%r1532,%r549; add.u64 %r530,%r1388,%r200; ld.f32 %r395,[%r530]; add.u64 %r1389,%r542,%r555; add.u64 %r539,%r1389,%r200; ld.f32 %r396,[%r539]; add.u64 %r1390,%r542,%r549; add.u64 %r527,%r1390,%r200; ld.f32 %r397,[%r527]; add.u64 %r1391,%r1534,%r555; add.u64 %r535,%r1391,%r200; ld.f32 %r398,[%r535]; add.u64 %r1392,%r1534,%r549; add.u64 %r524,%r1392,%r200; ld.f32 %r399,[%r524]; .loc 1 2853 10 @ %r1581 bra $L69; mov.u64 %r566,0; $L70: add.u64 %r564,%r553,%r566; ld.f32 %r400,[%r564+-2048]; ld.f32 %r401,[%r564+-2044]; add.u64 %r562,%r88,%r566; ld.f32 %r402,[%r562]; ld.f32 %r403,[%r562+4]; .loc 1 2856 13 mul.f32 %r405,%r401,%r402; mul.f32 %r1394,%r401,%r403; neg.f32 %r1395,%r1394; fma.rn.f32 %r406,%r400,%r402,%r1395; fma.rn.f32 %r407,%r400,%r403,%r405; .loc 1 2855 11 add.f32 %r392,%r392,%r406; add.f32 %r393,%r393,%r407; add.u64 %r560,%r649,%r566; ld.f32 %r408,[%r560]; ld.f32 %r409,[%r560+4]; .loc 1 2858 13 mul.f32 %r411,%r401,%r408; mul.f32 %r1396,%r401,%r409; neg.f32 %r1397,%r1396; fma.rn.f32 %r412,%r400,%r408,%r1397; fma.rn.f32 %r413,%r400,%r409,%r411; .loc 1 2857 11 add.f32 %r394,%r394,%r412; add.f32 %r395,%r395,%r413; add.u64 %r558,%r642,%r566; ld.f32 %r414,[%r558]; ld.f32 %r415,[%r558+4]; .loc 1 2860 13 mul.f32 %r417,%r401,%r414; mul.f32 %r1398,%r401,%r415; neg.f32 %r1399,%r1398; fma.rn.f32 %r418,%r400,%r414,%r1399; fma.rn.f32 
%r419,%r400,%r415,%r417; .loc 1 2859 11 add.f32 %r396,%r396,%r418; add.f32 %r397,%r397,%r419; add.u64 %r556,%r641,%r566; ld.f32 %r420,[%r556]; ld.f32 %r421,[%r556+4]; .loc 1 2862 13 mul.f32 %r423,%r401,%r420; mul.f32 %r1400,%r401,%r421; neg.f32 %r1401,%r1400; fma.rn.f32 %r424,%r400,%r420,%r1401; fma.rn.f32 %r425,%r400,%r421,%r423; .loc 1 2861 11 add.f32 %r398,%r398,%r424; add.f32 %r399,%r399,%r425; .loc 1 2853 10 add.u64 %r566,%r566,8; setp.ne.u64 %r1402,%r566,%r679; @ %r1402 bra $L70; $L69: .loc 1 2864 28 st.f32 [%r555],%r392; st.f32 [%r549],%r393; .loc 1 2865 34 st.f32 [%r543],%r394; st.f32 [%r530],%r395; .loc 1 2866 34 st.f32 [%r539],%r396; st.f32 [%r527],%r397; .loc 1 2867 34 st.f32 [%r535],%r398; st.f32 [%r524],%r399; .loc 1 2846 6 add.u64 %r555,%r555,8; add.u64 %r553,%r553,2048; add.u64 %r549,%r549,8; setp.ne.u64 %r1403,%r128,%r555; @ %r1403 bra $L71; bra $L63; $L60: .loc 1 2727 34 add.u64 %r248,%r248,256; .loc 1 2727 8 add.u64 %r736,%r736,2048; add.u64 %r737,%r737,2048; setp.ge.s64 %r1404,%r162,%r248; @ %r1404 bra $L72; bra $L73; $L59: .loc 1 2874 9 @ %r1589 bra $L60; .loc 1 2876 14 add.u64 %r657,%r240,%r248; .loc 1 2876 9 add.u64 %r658,%r657,-1; .loc 1 2759 17 add.u64 %r582,%r1699,%r248; add.u64 %r1406,%r582,%r583; shl.b64 %r1407,%r1406,3; add.u64 %r584,%r224,%r1407; neg.s64 %r1409,%r582; shl.b64 %r1410,%r1409,3; shl.b64 %r1411,%r248,3; add.u64 %r567,%r1411,%r1410; shl.b64 %r1412,%r657,3; add.u64 %r569,%r1412,%r1410; shl.b64 %r1413,%r240,11; add.u64 %r615,%r1230,%r1413; .loc 1 2876 9 mov.u64 %r575,%r574; mov.u64 %r578,%r757; mov.u64 %r579,%r583; mov.u64 %r243,%r644; setp.gt.s64 %r1730,%r1699,%r240; setp.le.s64 %r1544,%r248,%r658; not.b64 %r1562,%r248; add.u64 %r1563,%r1562,%r657; and.b64 %r1564,%r1563,-4; add.u64 %r1565,%r1564,%r248; add.u64 %r1566,%r224,32; $L81: .loc 1 2877 6 @ %r1544 bra $L74; $L80: .loc 1 2901 6 @ %r1730 bra $L75; bra $L76; $L74: add.u64 %r599,%r567,%r584; add.u64 %r598,%r1230,2048; add.u64 %r596,%r1230,4096; add.u64 %r594,%r1230,6144; add.u64 %r592,%r1230,8192; add.u64 %r1420,%r1565,%r579; shl.b64 %r1421,%r1420,3; add.u64 %r585,%r1421,%r1566; $L79: ld.f32 %r426,[%r599]; ld.f32 %r427,[%r599+4]; ld.f32 %r428,[%r599+8]; ld.f32 %r429,[%r599+12]; ld.f32 %r430,[%r599+16]; ld.f32 %r431,[%r599+20]; ld.f32 %r432,[%r599+24]; ld.f32 %r433,[%r599+28]; .loc 1 2884 10 @ %r1581 bra $L77; mov.u64 %r610,0; $L78: add.u64 %r608,%r598,%r610; ld.f32 %r434,[%r608+-2048]; ld.f32 %r435,[%r608+-2044]; add.u64 %r606,%r578,%r610; ld.f32 %r436,[%r606]; ld.f32 %r437,[%r606+4]; .loc 1 2887 13 mul.f32 %r439,%r435,%r436; mul.f32 %r1424,%r435,%r437; neg.f32 %r1425,%r1424; fma.rn.f32 %r440,%r434,%r436,%r1425; fma.rn.f32 %r441,%r434,%r437,%r439; .loc 1 2886 11 add.f32 %r426,%r426,%r440; add.f32 %r427,%r427,%r441; add.u64 %r604,%r596,%r610; ld.f32 %r442,[%r604+-2048]; ld.f32 %r443,[%r604+-2044]; .loc 1 2889 13 mul.f32 %r445,%r437,%r442; mul.f32 %r1426,%r437,%r443; neg.f32 %r1427,%r1426; fma.rn.f32 %r446,%r436,%r442,%r1427; fma.rn.f32 %r447,%r436,%r443,%r445; .loc 1 2888 11 add.f32 %r428,%r428,%r446; add.f32 %r429,%r429,%r447; add.u64 %r602,%r594,%r610; ld.f32 %r448,[%r602+-2048]; ld.f32 %r449,[%r602+-2044]; .loc 1 2891 13 mul.f32 %r451,%r437,%r448; mul.f32 %r1428,%r437,%r449; neg.f32 %r1429,%r1428; fma.rn.f32 %r452,%r436,%r448,%r1429; fma.rn.f32 %r453,%r436,%r449,%r451; .loc 1 2890 11 add.f32 %r430,%r430,%r452; add.f32 %r431,%r431,%r453; add.u64 %r600,%r592,%r610; ld.f32 %r454,[%r600+-2048]; ld.f32 %r455,[%r600+-2044]; .loc 1 2893 13 mul.f32 %r457,%r437,%r454; mul.f32 %r1430,%r437,%r455; neg.f32 
%r1431,%r1430; fma.rn.f32 %r458,%r436,%r454,%r1431; fma.rn.f32 %r459,%r436,%r455,%r457; .loc 1 2892 11 add.f32 %r432,%r432,%r458; add.f32 %r433,%r433,%r459; .loc 1 2884 10 add.u64 %r610,%r610,8; setp.ne.u64 %r1432,%r610,%r679; @ %r1432 bra $L78; $L77: .loc 1 2895 28 st.f32 [%r599],%r426; st.f32 [%r599+4],%r427; .loc 1 2896 32 st.f32 [%r599+8],%r428; st.f32 [%r599+12],%r429; .loc 1 2897 32 st.f32 [%r599+16],%r430; st.f32 [%r599+20],%r431; .loc 1 2898 32 st.f32 [%r599+24],%r432; st.f32 [%r599+28],%r433; .loc 1 2877 6 add.u64 %r599,%r599,32; add.u64 %r598,%r598,8192; add.u64 %r596,%r596,8192; add.u64 %r594,%r594,8192; add.u64 %r592,%r592,8192; setp.ne.u64 %r1433,%r585,%r599; @ %r1433 bra $L79; bra $L80; $L76: .loc 1 2874 39 add.u64 %r243,%r243,1; .loc 1 2874 9 add.u64 %r584,%r584,%r797; add.u64 %r579,%r579,%r664; add.u64 %r578,%r578,%r784; add.u64 %r575,%r575,%r784; setp.ne.u64 %r1434,%r1588,%r243; @ %r1434 bra $L81; bra $L60; $L75: add.u64 %r617,%r569,%r584; .loc 1 2904 13 mov.u64 %r616,%r615; $L84: ld.f32 %r460,[%r617]; ld.f32 %r461,[%r617+4]; .loc 1 2905 10 @ %r1581 bra $L82; add.u64 %r621,%r616,4; mov.u64 %r256,%r578; mov.u64 %r297,%r616; sub.u64 %r1603,%r575,%r578; add.u64 %r1604,%r1603,-8; shr.u64 %r1602,%r1604,3; add.u64 %r1605,%r1602,1; and.b64 %r1606,%r1605,3; setp.eq.u64 %r1609,%r1606,0; @ %r1609 bra $L83; setp.eq.u64 %r1608,%r1606,1; @ %r1608 bra $L286; setp.eq.u64 %r1607,%r1606,2; @ %r1607 bra $L287; bra $L342; $L83: ld.f32 %r462,[%r297]; ld.f32 %r463,[%r621]; ld.f32 %r464,[%r256]; ld.f32 %r465,[%r256+4]; .loc 1 2908 13 mul.f32 %r467,%r463,%r464; mul.f32 %r1437,%r463,%r465; neg.f32 %r1438,%r1437; fma.rn.f32 %r468,%r462,%r464,%r1438; fma.rn.f32 %r469,%r462,%r465,%r467; .loc 1 2907 11 add.f32 %r1731,%r460,%r468; add.f32 %r1732,%r461,%r469; .loc 1 2905 10 add.u64 %r1611,%r256,8; ld.f32 %r1736,[%r297+8]; ld.f32 %r1737,[%r621+8]; ld.f32 %r1738,[%r1611]; ld.f32 %r1739,[%r1611+4]; .loc 1 2908 13 mul.f32 %r1740,%r1737,%r1738; mul.f32 %r1741,%r1737,%r1739; neg.f32 %r1742,%r1741; fma.rn.f32 %r1743,%r1736,%r1738,%r1742; fma.rn.f32 %r1744,%r1736,%r1739,%r1740; .loc 1 2907 11 add.f32 %r1745,%r1731,%r1743; add.f32 %r1746,%r1732,%r1744; .loc 1 2905 10 add.u64 %r1748,%r256,16; ld.f32 %r1751,[%r297+16]; ld.f32 %r1752,[%r621+16]; ld.f32 %r1753,[%r1748]; ld.f32 %r1754,[%r1748+4]; .loc 1 2908 13 mul.f32 %r1755,%r1752,%r1753; mul.f32 %r1756,%r1752,%r1754; neg.f32 %r1757,%r1756; fma.rn.f32 %r1758,%r1751,%r1753,%r1757; fma.rn.f32 %r1759,%r1751,%r1754,%r1755; .loc 1 2907 11 add.f32 %r1760,%r1745,%r1758; add.f32 %r1761,%r1746,%r1759; .loc 1 2905 10 add.u64 %r1763,%r256,24; ld.f32 %r1766,[%r297+24]; ld.f32 %r1767,[%r621+24]; ld.f32 %r1768,[%r1763]; ld.f32 %r1769,[%r1763+4]; .loc 1 2908 13 mul.f32 %r1770,%r1767,%r1768; mul.f32 %r1771,%r1767,%r1769; neg.f32 %r1772,%r1771; fma.rn.f32 %r1773,%r1766,%r1768,%r1772; fma.rn.f32 %r1774,%r1766,%r1769,%r1770; .loc 1 2907 11 add.f32 %r460,%r1760,%r1773; add.f32 %r461,%r1761,%r1774; .loc 1 2905 10 add.u64 %r297,%r297,32; add.u64 %r256,%r256,32; add.u64 %r621,%r621,32; setp.ne.u64 %r1775,%r256,%r575; @ %r1775 bra $L83; $L82: .loc 1 2910 28 st.f32 [%r617],%r460; st.f32 [%r617+4],%r461; .loc 1 2901 6 add.u64 %r617,%r617,8; add.u64 %r616,%r616,2048; setp.ne.u64 %r1440,%r584,%r617; @ %r1440 bra $L84; bra $L76; $L20: .loc 1 2958 11 cvt.u16.u32 %r1441,%r627; setp.eq.u16 %r1442,%r1441,1; @ ! 
%r1442 bra $L343; bra $L85; $L31: .loc 1 2920 38 set.u32.eq.u64 %r1444,%r150,1; neg.s32 %r1445,%r1444; .loc 1 2920 26 cvt.u16.u32 %r1447,%r1445; and.b16 %r1446,%r1447,%r1187; .loc 1 2920 11 cvt.u32.u16 %r1449,%r1446; cvt.u16.u8 %r1450,%r1449; setp.eq.u16 %r1451,%r1450,0; @ %r1451 bra $L20; .loc 1 2920 43 setp.ne.u64 %r1452,%r663,1; @ %r1452 bra $L20; $L121: .loc 1 2922 10 cvt.u16.u32 %r1453,%r627; setp.ne.u16 %r1454,%r1453,1; @ %r1454 bra $L87; .loc 1 2948 4 setp.gt.s64 %r1455,%r163,0; @ %r1455 bra $L88; bra $L1; $L87: .loc 1 2929 4 setp.le.s64 %r1456,%r163,0; @ %r1456 bra $L1; shl.b64 %r830,%r664,3; shl.b64 %r1457,%r162,3; add.u64 %r829,%r222,%r1457; shl.b64 %r836,%r151,3; mov.u64 %r834,%r221; shl.b64 %r818,%r298,3; shl.b64 %r1458,%r638,3; add.u64 %r822,%r220,%r1458; mov.u64 %r827,0; .loc 1 2929 11 mov.u64 %r258,%r827; setp.gt.s64 %r1776,%r162,0; setp.le.s64 %r1597,%r638,0; $L91: .loc 1 2933 8 @ %r1776 bra $L90; $L95: .loc 1 2929 29 add.u64 %r258,%r258,1; .loc 1 2929 4 add.u64 %r827,%r827,%r664; add.u64 %r829,%r829,%r830; add.u64 %r834,%r834,%r836; setp.ne.u64 %r1460,%r163,%r258; @ %r1460 bra $L91; bra $L1; $L90: shl.b64 %r1461,%r827,3; add.u64 %r815,%r222,%r1461; .loc 1 2933 8 mov.u64 %r821,%r822; mov.u64 %r816,%r220; $L94: .loc 1 2937 5 @ %r1597 bra $L140; add.u64 %r809,%r816,4; add.u64 %r810,%r834,4; mov.u64 %r808,%r834; mov.u64 %r807,%r816; mov.f32 %r108,0f00000000; mov.f32 %r103,%r108; sub.u64 %r1642,%r821,%r816; add.u64 %r1643,%r1642,-8; shr.u64 %r1641,%r1643,3; add.u64 %r1644,%r1641,1; and.b64 %r1645,%r1644,3; setp.eq.u64 %r1648,%r1645,0; @ %r1648 bra $L93; setp.eq.u64 %r1647,%r1645,1; @ %r1647 bra $L288; setp.eq.u64 %r1646,%r1645,2; @ %r1646 bra $L289; bra $L344; $L93: ld.f32 %r186,[%r807]; ld.f32 %r193,[%r809]; ld.f32 %r177,[%r808]; ld.f32 %r175,[%r810]; .loc 1 2938 23 mul.f32 %r44,%r177,%r193; mul.f32 %r1463,%r175,%r193; neg.f32 %r1464,%r1463; fma.rn.f32 %r43,%r177,%r186,%r1464; fma.rn.f32 %r101,%r175,%r186,%r44; .loc 1 2938 9 add.f32 %r1777,%r103,%r43; add.f32 %r1778,%r108,%r101; ld.f32 %r1783,[%r807+8]; ld.f32 %r1784,[%r809+8]; ld.f32 %r1785,[%r808+8]; ld.f32 %r1786,[%r810+8]; .loc 1 2938 23 mul.f32 %r1787,%r1785,%r1784; mul.f32 %r1788,%r1786,%r1784; neg.f32 %r1789,%r1788; fma.rn.f32 %r1790,%r1785,%r1783,%r1789; fma.rn.f32 %r1791,%r1786,%r1783,%r1787; .loc 1 2938 9 add.f32 %r1792,%r1777,%r1790; add.f32 %r1793,%r1778,%r1791; ld.f32 %r1799,[%r807+16]; ld.f32 %r1800,[%r809+16]; ld.f32 %r1801,[%r808+16]; ld.f32 %r1802,[%r810+16]; .loc 1 2938 23 mul.f32 %r1803,%r1801,%r1800; mul.f32 %r1804,%r1802,%r1800; neg.f32 %r1805,%r1804; fma.rn.f32 %r1806,%r1801,%r1799,%r1805; fma.rn.f32 %r1807,%r1802,%r1799,%r1803; .loc 1 2938 9 add.f32 %r1808,%r1792,%r1806; add.f32 %r1809,%r1793,%r1807; ld.f32 %r1815,[%r807+24]; ld.f32 %r1816,[%r809+24]; ld.f32 %r1817,[%r808+24]; ld.f32 %r1818,[%r810+24]; .loc 1 2938 23 mul.f32 %r1819,%r1817,%r1816; mul.f32 %r1820,%r1818,%r1816; neg.f32 %r1821,%r1820; fma.rn.f32 %r1822,%r1817,%r1815,%r1821; fma.rn.f32 %r1823,%r1818,%r1815,%r1819; .loc 1 2938 9 add.f32 %r103,%r1808,%r1822; add.f32 %r108,%r1809,%r1823; .loc 1 2937 5 add.u64 %r807,%r807,32; add.u64 %r808,%r808,32; add.u64 %r809,%r809,32; add.u64 %r810,%r810,32; setp.ne.u64 %r1824,%r807,%r821; @ %r1824 bra $L93; bra $L92; $L140: mov.f32 %r108,0f00000000; mov.f32 %r103,%r108; $L92: .loc 1 2939 15 st.f32 [%r815],%r103; st.f32 [%r815+4],%r108; .loc 1 2933 8 add.u64 %r815,%r815,8; add.u64 %r816,%r816,%r818; add.u64 %r821,%r821,%r818; setp.ne.u64 %r1466,%r815,%r829; @ %r1466 bra $L94; bra $L95; $L88: shl.b64 
%r853,%r664,3; mov.u64 %r851,%r222; shl.b64 %r842,%r298,3; .loc 1 2948 4 mov.u64 %r854,0; .loc 1 2948 11 mov.u64 %r257,%r854; setp.le.s64 %r1825,%r638,0; $L98: .loc 1 2952 8 @ %r1825 bra $L141; mov.u64 %r840,%r220; shl.b64 %r1468,%r854,3; add.u64 %r843,%r1468,%r221; add.u64 %r845,%r220,4; add.u64 %r1469,%r638,%r854; shl.b64 %r1470,%r1469,3; add.u64 %r861,%r1470,%r221; mov.f32 %r37,0f00000000; mov.f32 %r38,%r37; sub.u64 %r1654,%r861,%r843; add.u64 %r1655,%r1654,-8; shr.u64 %r1653,%r1655,3; add.u64 %r1656,%r1653,1; and.b64 %r1657,%r1656,3; setp.eq.u64 %r1660,%r1657,0; @ %r1660 bra $L97; setp.eq.u64 %r1659,%r1657,1; @ %r1659 bra $L290; setp.eq.u64 %r1658,%r1657,2; @ %r1658 bra $L291; bra $L345; $L97: ld.f32 %r187,[%r840]; ld.f32 %r194,[%r845]; ld.f32 %r172,[%r843]; ld.f32 %r169,[%r843+4]; .loc 1 2953 26 mul.f32 %r87,%r172,%r194; mul.f32 %r1471,%r169,%r194; neg.f32 %r1472,%r1471; fma.rn.f32 %r69,%r172,%r187,%r1472; fma.rn.f32 %r68,%r169,%r187,%r87; .loc 1 2953 5 add.f32 %r1826,%r38,%r69; add.f32 %r1827,%r37,%r68; .loc 1 2952 8 add.u64 %r1828,%r840,%r842; add.u64 %r1661,%r843,8; add.u64 %r1830,%r845,%r842; ld.f32 %r1831,[%r1828]; ld.f32 %r1832,[%r1830]; ld.f32 %r1833,[%r1661]; ld.f32 %r1834,[%r1661+4]; .loc 1 2953 26 mul.f32 %r1835,%r1833,%r1832; mul.f32 %r1836,%r1834,%r1832; neg.f32 %r1837,%r1836; fma.rn.f32 %r1838,%r1833,%r1831,%r1837; fma.rn.f32 %r1839,%r1834,%r1831,%r1835; .loc 1 2953 5 add.f32 %r1840,%r1826,%r1838; add.f32 %r1841,%r1827,%r1839; .loc 1 2952 8 add.u64 %r1842,%r1828,%r842; add.u64 %r1843,%r843,16; add.u64 %r1844,%r1830,%r842; ld.f32 %r1846,[%r1842]; ld.f32 %r1847,[%r1844]; ld.f32 %r1848,[%r1843]; ld.f32 %r1849,[%r1843+4]; .loc 1 2953 26 mul.f32 %r1850,%r1848,%r1847; mul.f32 %r1851,%r1849,%r1847; neg.f32 %r1852,%r1851; fma.rn.f32 %r1853,%r1848,%r1846,%r1852; fma.rn.f32 %r1854,%r1849,%r1846,%r1850; .loc 1 2953 5 add.f32 %r1855,%r1840,%r1853; add.f32 %r1856,%r1841,%r1854; .loc 1 2952 8 add.u64 %r1857,%r1842,%r842; add.u64 %r1858,%r843,24; add.u64 %r1859,%r1844,%r842; ld.f32 %r1861,[%r1857]; ld.f32 %r1862,[%r1859]; ld.f32 %r1863,[%r1858]; ld.f32 %r1864,[%r1858+4]; .loc 1 2953 26 mul.f32 %r1865,%r1863,%r1862; mul.f32 %r1866,%r1864,%r1862; neg.f32 %r1867,%r1866; fma.rn.f32 %r1868,%r1863,%r1861,%r1867; fma.rn.f32 %r1869,%r1864,%r1861,%r1865; .loc 1 2953 5 add.f32 %r38,%r1855,%r1868; add.f32 %r37,%r1856,%r1869; .loc 1 2952 8 add.u64 %r840,%r1857,%r842; add.u64 %r843,%r843,32; add.u64 %r845,%r1859,%r842; setp.ne.u64 %r1870,%r843,%r861; @ %r1870 bra $L97; bra $L96; $L141: mov.f32 %r37,0f00000000; mov.f32 %r38,%r37; $L96: .loc 1 2954 25 st.f32 [%r851],%r38; st.f32 [%r851+4],%r37; .loc 1 2948 29 add.u64 %r257,%r257,1; .loc 1 2948 4 add.u64 %r851,%r851,%r853; add.u64 %r854,%r854,%r151; setp.ne.u64 %r1474,%r163,%r257; @ %r1474 bra $L98; bra $L1; $L85: .loc 1 2963 7 setp.le.s64 %r1475,%r163,0; @ %r1475 bra $L1; shl.b64 %r872,%r844,3; mov.u64 %r870,%r222; shl.b64 %r864,%r298,3; shl.b64 %r867,%r663,3; mov.u64 %r873,0; .loc 1 2963 14 mov.u64 %r268,%r873; setp.le.s64 %r1871,%r638,0; $L101: .loc 1 2967 4 @ %r1871 bra $L142; mov.u64 %r862,%r220; shl.b64 %r1477,%r873,3; add.u64 %r865,%r221,%r1477; mov.f32 %r152,0f00000000; mov.f32 %r188,%r152; .loc 1 2967 11 mov.u64 %r269,0; and.b64 %r1665,%r638,3; setp.eq.u64 %r1668,%r1665,0; @ %r1668 bra $L100; setp.eq.u64 %r1667,%r1665,1; @ %r1667 bra $L292; setp.eq.u64 %r1666,%r1665,2; @ %r1666 bra $L293; bra $L346; $L100: ld.f32 %r277,[%r862]; ld.f32 %r206,[%r862+4]; ld.f32 %r195,[%r865]; ld.f32 %r205,[%r865+4]; .loc 1 2968 29 mul.f32 %r203,%r195,%r206; 
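// NOTE (annotation, not part of the compiler output): the "and.b64 ...,3" followed by
// three setp/bra tests (seen before $L83, $L93, $L97, $L100, $L105, $L110 and $L118)
// is the usual remainder dispatch for a loop unrolled by four: the trip count modulo
// four selects how many iterations to peel before falling into the 4-wide body.
// A rough C sketch, assuming a trip count n and an arbitrary loop body step():
//   switch (n & 3) {        /* peel 1..3 iterations, falling through */
//     case 3: step();
//     case 2: step();
//     case 1: step();
//     case 0: break;
//   }
//   for (size_t i = n & 3; i < n; i += 4)
//     { step(); step(); step(); step(); }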
mul.f32 %r1478,%r205,%r206; neg.f32 %r1479,%r1478; fma.rn.f32 %r191,%r195,%r277,%r1479; fma.rn.f32 %r202,%r205,%r277,%r203; .loc 1 2968 8 add.f32 %r1872,%r188,%r191; add.f32 %r1873,%r152,%r202; .loc 1 2967 4 add.u64 %r1875,%r862,%r864; add.u64 %r1876,%r865,%r867; ld.f32 %r1877,[%r1875]; ld.f32 %r1878,[%r1875+4]; ld.f32 %r1879,[%r1876]; ld.f32 %r1880,[%r1876+4]; .loc 1 2968 29 mul.f32 %r1881,%r1879,%r1878; mul.f32 %r1882,%r1880,%r1878; neg.f32 %r1883,%r1882; fma.rn.f32 %r1884,%r1879,%r1877,%r1883; fma.rn.f32 %r1885,%r1880,%r1877,%r1881; .loc 1 2968 8 add.f32 %r1886,%r1872,%r1884; add.f32 %r1887,%r1873,%r1885; .loc 1 2967 4 add.u64 %r1889,%r1875,%r864; add.u64 %r1890,%r1876,%r867; ld.f32 %r1892,[%r1889]; ld.f32 %r1893,[%r1889+4]; ld.f32 %r1894,[%r1890]; ld.f32 %r1895,[%r1890+4]; .loc 1 2968 29 mul.f32 %r1896,%r1894,%r1893; mul.f32 %r1897,%r1895,%r1893; neg.f32 %r1898,%r1897; fma.rn.f32 %r1899,%r1894,%r1892,%r1898; fma.rn.f32 %r1900,%r1895,%r1892,%r1896; .loc 1 2968 8 add.f32 %r1901,%r1886,%r1899; add.f32 %r1902,%r1887,%r1900; .loc 1 2967 4 add.u64 %r1904,%r1889,%r864; add.u64 %r1905,%r1890,%r867; ld.f32 %r1907,[%r1904]; ld.f32 %r1908,[%r1904+4]; ld.f32 %r1909,[%r1905]; ld.f32 %r1910,[%r1905+4]; .loc 1 2968 29 mul.f32 %r1911,%r1909,%r1908; mul.f32 %r1912,%r1910,%r1908; neg.f32 %r1913,%r1912; fma.rn.f32 %r1914,%r1909,%r1907,%r1913; fma.rn.f32 %r1915,%r1910,%r1907,%r1911; .loc 1 2968 8 add.f32 %r188,%r1901,%r1914; add.f32 %r152,%r1902,%r1915; .loc 1 2967 28 add.u64 %r269,%r269,4; .loc 1 2967 4 add.u64 %r862,%r1904,%r864; add.u64 %r865,%r1905,%r867; setp.ne.u64 %r1916,%r638,%r269; @ %r1916 bra $L100; bra $L99; $L142: mov.f32 %r152,0f00000000; mov.f32 %r188,%r152; $L99: .loc 1 2969 21 st.f32 [%r870],%r188; st.f32 [%r870+4],%r152; .loc 1 2963 32 add.u64 %r268,%r268,1; .loc 1 2963 7 add.u64 %r870,%r870,%r872; add.u64 %r873,%r873,%r151; setp.ne.u64 %r1481,%r163,%r268; @ %r1481 bra $L101; bra $L1; $L343: .loc 1 2972 11 setp.gt.s64 %r1482,%r150,%r298; @ %r1482 bra $L102; .loc 1 2993 7 setp.gt.s64 %r1483,%r163,0; @ %r1483 bra $L103; bra $L1; $L102: .loc 1 2974 7 setp.gt.s64 %r1484,%r163,0; @ %r1484 bra $L104; bra $L1; $L108: shl.b64 %r1485,%r914,3; add.u64 %r910,%r222,%r1485; .loc 1 2975 9 mov.u64 %r267,0; and.b64 %r1681,%r162,3; setp.eq.u64 %r1684,%r1681,0; @ %r1684 bra $L105; setp.eq.u64 %r1683,%r1681,1; @ %r1683 bra $L294; setp.eq.u64 %r1682,%r1681,2; @ %r1682 bra $L295; bra $L347; $L105: .loc 1 2976 34 st.f32 [%r910],%r1599; st.f32 [%r910+4],%r1600; .loc 1 2975 2 add.u64 %r1920,%r910,%r912; .loc 1 2976 34 st.f32 [%r1920],%r1599; st.f32 [%r1920+4],%r1600; .loc 1 2975 2 add.u64 %r1922,%r1920,%r912; .loc 1 2976 34 st.f32 [%r1922],%r1599; st.f32 [%r1922+4],%r1600; .loc 1 2975 2 add.u64 %r1925,%r1922,%r912; .loc 1 2976 34 st.f32 [%r1925],%r1599; st.f32 [%r1925+4],%r1600; .loc 1 2975 27 add.u64 %r267,%r267,4; .loc 1 2975 2 add.u64 %r910,%r1925,%r912; setp.ne.u64 %r1927,%r162,%r267; @ %r1927 bra $L105; $L109: .loc 1 2974 32 add.u64 %r266,%r288,1; .loc 1 2974 7 add.u64 %r914,%r914,%r664; setp.ne.u64 %r1489,%r163,%r266; @ %r1489 bra $L143; shl.b64 %r902,%r664,3; mov.u64 %r900,%r222; shl.b64 %r893,%r663,3; shl.b64 %r883,%r298,3; mov.u64 %r905,0; mov.u64 %r289,%r905; setp.le.s64 %r1928,%r638,0; add.u64 %r1598,%r221,4; bra $L107; $L104: shl.b64 %r912,%r844,3; mov.u64 %r914,0; .loc 1 2974 14 mov.u64 %r288,%r914; setp.gt.s64 %r1929,%r162,0; ld.const.f32 %r1599,[$LC9]; ld.const.f32 %r1600,[$LC9+4]; bra $L106; $L143: mov.u64 %r288,%r266; $L106: .loc 1 2975 2 @ %r1929 bra $L108; bra $L109; $L112: shl.b64 
%r1491,%r889,3; add.u64 %r881,%r220,%r1491; add.u64 %r884,%r900,4; .loc 1 2980 4 mov.u64 %r878,%r900; .loc 1 2980 11 mov.u64 %r265,0; and.b64 %r1673,%r162,3; setp.eq.u64 %r1676,%r1673,0; @ %r1676 bra $L110; setp.eq.u64 %r1675,%r1673,1; @ %r1675 bra $L296; setp.eq.u64 %r1674,%r1673,2; @ %r1674 bra $L297; bra $L348; $L110: ld.f32 %r209,[%r881]; ld.f32 %r153,[%r881+4]; ld.f32 %r208,[%r891]; ld.f32 %r156,[%r894]; .loc 1 2983 37 mul.f32 %r155,%r153,%r208; ld.f32 %r1492,[%r878]; fma.rn.f32 %r495,%r208,%r209,%r1492; fma.rn.f32 %r160,%r156,%r209,%r155; .loc 1 2982 36 neg.f32 %r1493,%r153; fma.rn.f32 %r207,%r1493,%r156,%r495; ld.f32 %r1494,[%r884]; add.f32 %r154,%r160,%r1494; st.f32 [%r878],%r207; st.f32 [%r884],%r154; .loc 1 2980 4 add.u64 %r1931,%r878,%r912; add.u64 %r1932,%r881,%r883; add.u64 %r1933,%r884,%r912; ld.f32 %r1934,[%r1932]; ld.f32 %r1935,[%r1932+4]; ld.f32 %r1936,[%r891]; ld.f32 %r1937,[%r894]; .loc 1 2983 37 mul.f32 %r1938,%r1935,%r1936; ld.f32 %r1939,[%r1931]; fma.rn.f32 %r1940,%r1936,%r1934,%r1939; fma.rn.f32 %r1941,%r1937,%r1934,%r1938; .loc 1 2982 36 neg.f32 %r1942,%r1935; fma.rn.f32 %r1943,%r1942,%r1937,%r1940; ld.f32 %r1944,[%r1933]; add.f32 %r1945,%r1941,%r1944; st.f32 [%r1931],%r1943; st.f32 [%r1933],%r1945; .loc 1 2980 4 add.u64 %r1947,%r1931,%r912; add.u64 %r1948,%r1932,%r883; add.u64 %r1949,%r1933,%r912; ld.f32 %r1951,[%r1948]; ld.f32 %r1952,[%r1948+4]; ld.f32 %r1953,[%r891]; ld.f32 %r1954,[%r894]; .loc 1 2983 37 mul.f32 %r1955,%r1952,%r1953; ld.f32 %r1956,[%r1947]; fma.rn.f32 %r1957,%r1953,%r1951,%r1956; fma.rn.f32 %r1958,%r1954,%r1951,%r1955; .loc 1 2982 36 neg.f32 %r1959,%r1952; fma.rn.f32 %r1960,%r1959,%r1954,%r1957; ld.f32 %r1961,[%r1949]; add.f32 %r1962,%r1958,%r1961; st.f32 [%r1947],%r1960; st.f32 [%r1949],%r1962; .loc 1 2980 4 add.u64 %r1964,%r1947,%r912; add.u64 %r1965,%r1948,%r883; add.u64 %r1966,%r1949,%r912; ld.f32 %r1968,[%r1965]; ld.f32 %r1969,[%r1965+4]; ld.f32 %r1970,[%r891]; ld.f32 %r1971,[%r894]; .loc 1 2983 37 mul.f32 %r1972,%r1969,%r1970; ld.f32 %r1973,[%r1964]; fma.rn.f32 %r1974,%r1970,%r1968,%r1973; fma.rn.f32 %r1975,%r1971,%r1968,%r1972; .loc 1 2982 36 neg.f32 %r1976,%r1969; fma.rn.f32 %r1977,%r1976,%r1971,%r1974; ld.f32 %r1978,[%r1966]; add.f32 %r1979,%r1975,%r1978; st.f32 [%r1964],%r1977; st.f32 [%r1966],%r1979; .loc 1 2980 29 add.u64 %r265,%r265,4; .loc 1 2980 4 add.u64 %r878,%r1964,%r912; add.u64 %r881,%r1965,%r883; add.u64 %r884,%r1966,%r912; setp.ne.u64 %r1980,%r162,%r265; @ %r1980 bra $L110; $L113: .loc 1 2979 26 add.u64 %r264,%r264,1; .loc 1 2979 2 add.u64 %r889,%r889,%r150; add.u64 %r891,%r891,%r893; add.u64 %r894,%r894,%r893; setp.eq.u64 %r1496,%r638,%r264; @ %r1496 bra $L111; $L114: .loc 1 2980 4 @ %r1929 bra $L112; bra $L113; $L111: .loc 1 2978 32 add.u64 %r263,%r289,1; .loc 1 2978 7 add.u64 %r900,%r900,%r902; add.u64 %r905,%r905,%r151; setp.eq.u64 %r1498,%r288,%r289; @ %r1498 bra $L1; mov.u64 %r289,%r263; $L107: .loc 1 2979 2 @ %r1928 bra $L111; shl.b64 %r908,%r905,3; add.u64 %r891,%r221,%r908; add.u64 %r894,%r1598,%r908; mov.u64 %r889,0; .loc 1 2979 9 mov.u64 %r264,%r889; bra $L114; $L103: shl.b64 %r929,%r844,3; shl.b64 %r920,%r150,3; shl.b64 %r924,%r663,3; .loc 1 2993 7 mov.u64 %r938,0; mov.u64 %r936,%r938; .loc 1 2993 14 mov.u64 %r260,%r938; setp.gt.s64 %r1981,%r162,0; setp.le.s64 %r1601,%r638,0; $L116: .loc 1 2997 4 @ %r1981 bra $L115; $L120: .loc 1 2993 32 add.u64 %r260,%r260,1; .loc 1 2993 7 add.u64 %r936,%r936,%r664; add.u64 %r938,%r938,%r151; setp.ne.u64 %r1502,%r163,%r260; @ %r1502 bra $L116; bra $L1; $L115: shl.b64 
%r1503,%r936,3; add.u64 %r927,%r222,%r1503; shl.b64 %r1504,%r938,3; add.u64 %r944,%r221,%r1504; .loc 1 2997 4 mov.u64 %r931,0; .loc 1 2997 11 mov.u64 %r261,%r931; $L119: .loc 1 3001 8 @ %r1601 bra $L144; shl.b64 %r1506,%r931,3; add.u64 %r918,%r220,%r1506; mov.u64 %r922,%r944; mov.f32 %r161,0f00000000; mov.f32 %r211,%r161; .loc 1 3001 15 mov.u64 %r262,0; and.b64 %r1689,%r638,3; setp.eq.u64 %r1692,%r1689,0; @ %r1692 bra $L118; setp.eq.u64 %r1691,%r1689,1; @ %r1691 bra $L298; setp.eq.u64 %r1690,%r1689,2; @ %r1690 bra $L299; bra $L349; $L118: ld.f32 %r285,[%r918]; ld.f32 %r214,[%r918+4]; ld.f32 %r159,[%r922]; ld.f32 %r213,[%r922+4]; .loc 1 3002 28 mul.f32 %r271,%r159,%r214; mul.f32 %r1507,%r213,%r214; neg.f32 %r1508,%r1507; fma.rn.f32 %r212,%r159,%r285,%r1508; fma.rn.f32 %r158,%r213,%r285,%r271; .loc 1 3002 5 add.f32 %r1983,%r211,%r212; add.f32 %r1984,%r161,%r158; .loc 1 3001 8 add.u64 %r1986,%r918,%r920; add.u64 %r1987,%r922,%r924; ld.f32 %r1988,[%r1986]; ld.f32 %r1989,[%r1986+4]; ld.f32 %r1990,[%r1987]; ld.f32 %r1991,[%r1987+4]; .loc 1 3002 28 mul.f32 %r1992,%r1990,%r1989; mul.f32 %r1993,%r1991,%r1989; neg.f32 %r1994,%r1993; fma.rn.f32 %r1995,%r1990,%r1988,%r1994; fma.rn.f32 %r1996,%r1991,%r1988,%r1992; .loc 1 3002 5 add.f32 %r1997,%r1983,%r1995; add.f32 %r1998,%r1984,%r1996; .loc 1 3001 8 add.u64 %r2000,%r1986,%r920; add.u64 %r2001,%r1987,%r924; ld.f32 %r2003,[%r2000]; ld.f32 %r2004,[%r2000+4]; ld.f32 %r2005,[%r2001]; ld.f32 %r2006,[%r2001+4]; .loc 1 3002 28 mul.f32 %r2007,%r2005,%r2004; mul.f32 %r2008,%r2006,%r2004; neg.f32 %r2009,%r2008; fma.rn.f32 %r2010,%r2005,%r2003,%r2009; fma.rn.f32 %r2011,%r2006,%r2003,%r2007; .loc 1 3002 5 add.f32 %r2012,%r1997,%r2010; add.f32 %r2013,%r1998,%r2011; .loc 1 3001 8 add.u64 %r2015,%r2000,%r920; add.u64 %r2016,%r2001,%r924; ld.f32 %r2018,[%r2015]; ld.f32 %r2019,[%r2015+4]; ld.f32 %r2020,[%r2016]; ld.f32 %r2021,[%r2016+4]; .loc 1 3002 28 mul.f32 %r2022,%r2020,%r2019; mul.f32 %r2023,%r2021,%r2019; neg.f32 %r2024,%r2023; fma.rn.f32 %r2025,%r2020,%r2018,%r2024; fma.rn.f32 %r2026,%r2021,%r2018,%r2022; .loc 1 3002 5 add.f32 %r211,%r2012,%r2025; add.f32 %r161,%r2013,%r2026; .loc 1 3001 32 add.u64 %r262,%r262,4; .loc 1 3001 8 add.u64 %r918,%r2015,%r920; add.u64 %r922,%r2016,%r924; setp.ne.u64 %r2027,%r638,%r262; @ %r2027 bra $L118; bra $L117; $L144: mov.f32 %r161,0f00000000; mov.f32 %r211,%r161; $L117: .loc 1 3003 27 st.f32 [%r927],%r211; st.f32 [%r927+4],%r161; .loc 1 2997 29 add.u64 %r261,%r261,1; .loc 1 2997 4 add.u64 %r927,%r927,%r929; add.u64 %r931,%r931,%r298; setp.ne.u64 %r1510,%r162,%r261; @ %r1510 bra $L119; bra $L120; $L32: .loc 1 2920 11 setp.eq.u64 %r1511,%r150,1; @ %r1511 bra $L121; bra $L20; $L2: .loc 1 2476 6 ld.u64 %r1512,[%r945]; setp.ne.u64 %r1513,%r1512,0; @ %r1513 bra $L122; .loc 1 2485 4 ld.u64 %r1515,[%r946+56]; add.u64 %r1514,%r1515,1; ld.u64 %r1516,[%r946+48]; sub.u64 %r668,%r1514,%r1516; .loc 1 2483 15 ld.s8 %r1518,[%r947+28]; cvt.u16.u32 %r1517,%r1518; setp.eq.u16 %r1519,%r1517,1; @ %r1519 bra $L123; bra $L124; $L126: .loc 1 2560 16 ld.u64 %r298,[%r946+40]; .loc 1 2548 27 mov.u64 %r844,%r664; bra $L125; $L14: ld.u64 %r664,[%r945+40]; .loc 1 2543 6 ld.s8 %r1521,[%r945+28]; cvt.u16.u32 %r1520,%r1521; setp.eq.u16 %r1522,%r1520,1; @ ! 
%r1522 bra $L127; bra $L126; $L11: .loc 1 2560 16 ld.u64 %r298,[%r946+40]; mov.u32 %r627,%r901; .loc 1 2548 27 mov.u64 %r664,%r844; .loc 1 2563 14 mov.u64 %r162,1; .loc 1 2561 16 mov.u64 %r150,%r162; bra $L12; $L5: .loc 1 2485 4 ld.u64 %r1524,[%r946+56]; add.u64 %r1523,%r1524,1; ld.u64 %r1525,[%r946+48]; sub.u64 %r668,%r1523,%r1525; bra $L124; $L128: .loc 1 2509 17 ld.u64 %r1527,[%r945+56]; add.u64 %r1526,%r1527,1; .loc 1 2509 15 ld.u64 %r1528,[%r945+48]; sub.u64 %r638,%r1526,%r1528; ld.s8 %r1551,[%r947+28]; bra $L9; $L122: .loc 1 2502 12 cvta.global.u64 %r1529,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r1530,[%r1529+36]; setp.ne.u32 %r1531,%r1530,0; @ ! %r1531 bra $L8; bra $L128; $L1: .loc 1 3007 1 ret; $L342: ld.f32 %r2028,[%r616]; ld.f32 %r2029,[%r621]; ld.f32 %r2030,[%r578]; ld.f32 %r2031,[%r578+4]; .loc 1 2908 13 mul.f32 %r2032,%r2029,%r2030; mul.f32 %r2033,%r2029,%r2031; neg.f32 %r2034,%r2033; fma.rn.f32 %r2035,%r2028,%r2030,%r2034; fma.rn.f32 %r2036,%r2028,%r2031,%r2032; .loc 1 2907 11 add.f32 %r460,%r460,%r2035; add.f32 %r461,%r461,%r2036; .loc 1 2905 10 add.u64 %r297,%r616,8; add.u64 %r256,%r578,8; add.u64 %r621,%r616,12; $L287: ld.f32 %r2038,[%r297]; ld.f32 %r2039,[%r621]; ld.f32 %r2040,[%r256]; ld.f32 %r2041,[%r256+4]; .loc 1 2908 13 mul.f32 %r2042,%r2039,%r2040; mul.f32 %r2043,%r2039,%r2041; neg.f32 %r2044,%r2043; fma.rn.f32 %r2045,%r2038,%r2040,%r2044; fma.rn.f32 %r2046,%r2038,%r2041,%r2042; .loc 1 2907 11 add.f32 %r460,%r460,%r2045; add.f32 %r461,%r461,%r2046; .loc 1 2905 10 add.u64 %r297,%r297,8; add.u64 %r256,%r256,8; add.u64 %r621,%r621,8; $L286: ld.f32 %r2048,[%r297]; ld.f32 %r2049,[%r621]; ld.f32 %r2050,[%r256]; ld.f32 %r2051,[%r256+4]; .loc 1 2908 13 mul.f32 %r2052,%r2049,%r2050; mul.f32 %r2053,%r2049,%r2051; neg.f32 %r2054,%r2053; fma.rn.f32 %r2055,%r2048,%r2050,%r2054; fma.rn.f32 %r2056,%r2048,%r2051,%r2052; .loc 1 2907 11 add.f32 %r460,%r460,%r2055; add.f32 %r461,%r461,%r2056; .loc 1 2905 10 add.u64 %r297,%r297,8; add.u64 %r256,%r256,8; add.u64 %r621,%r621,8; setp.ne.u64 %r2057,%r256,%r575; @ %r2057 bra $L83; bra $L82; $L341: ld.f32 %r2058,[%r737+4]; .loc 1 2762 43 ld.f32 %r2059,[%r737]; st.f32 [%r678],%r2059; st.f32 [%r683],%r2058; .loc 1 2760 9 add.u64 %r676,%r737,8; add.u64 %r677,%r678,2048; add.u64 %r682,%r683,2048; $L285: ld.f32 %r2061,[%r676+4]; .loc 1 2762 43 ld.f32 %r2062,[%r676]; st.f32 [%r677],%r2062; st.f32 [%r682],%r2061; .loc 1 2760 9 add.u64 %r676,%r676,8; add.u64 %r677,%r677,2048; add.u64 %r682,%r682,2048; $L284: ld.f32 %r2064,[%r676+4]; .loc 1 2762 43 ld.f32 %r2065,[%r676]; st.f32 [%r677],%r2065; st.f32 [%r682],%r2064; .loc 1 2760 9 add.u64 %r676,%r676,8; add.u64 %r677,%r677,2048; add.u64 %r682,%r682,2048; setp.ne.u64 %r2066,%r676,%r692; @ %r2066 bra $L58; bra $L48; $L340: ld.f32 %r2067,[%r694+4]; .loc 1 2740 49 ld.f32 %r2068,[%r694]; st.f32 [%r695],%r2068; st.f32 [%r695+4],%r2067; ld.f32 %r2069,[%r696+4]; .loc 1 2742 49 ld.f32 %r2070,[%r696]; st.f32 [%r695+8],%r2070; st.f32 [%r695+12],%r2069; ld.f32 %r2071,[%r694+12]; .loc 1 2744 49 ld.f32 %r2072,[%r694+8]; st.f32 [%r695+2048],%r2072; st.f32 [%r695+2052],%r2071; ld.f32 %r2073,[%r696+12]; .loc 1 2746 49 ld.f32 %r2074,[%r696+8]; st.f32 [%r695+2056],%r2074; st.f32 [%r695+2060],%r2073; .loc 1 2738 9 add.u64 %r694,%r694,16; add.u64 %r695,%r695,4096; add.u64 %r696,%r696,16; setp.ne.u64 %r2075,%r696,%r706; @ %r2075 bra $L52; bra $L53; $L338: .loc 1 2688 2 @ ! 
%r1556 bra $L350; .loc 1 2689 22 mov.u32 %r20762076077,[%value_in]; } $L350: .loc 1 2687 24 mov.u64 %r255,2; .loc 1 2687 7 add.u64 %r800,%r222,%r802; $L283: .loc 1 2688 2 @ ! %r1556 bra $L351; .loc 1 2689 22 mov.u32 %r20792079080,[%value_in]; } $L351: .loc 1 2687 24 add.u64 %r255,%r255,1; .loc 1 2687 7 add.u64 %r800,%r800,%r802; $L282: .loc 1 2688 2 @ ! %r1556 bra $L352; .loc 1 2689 22 mov.u32 %r2082082083,[%value_in]; } $L352: .loc 1 2687 24 add.u64 %r255,%r255,1; .loc 1 2687 7 add.u64 %r800,%r800,%r802; setp.eq.u64 %r2084,%r255,%r806; @ ! %r2084 bra $L34; bra $L33; $L220: .loc 1 2689 22 mov.u32 %r208516952085086,[%value_in]; } $L339: .loc 1 2687 7 add.u64 %r2088,%r1695,%r802; .loc 1 2688 2 @ ! %r1556 bra $L353; .loc 1 2689 22 mov.u32 %r20208090091,[%value_in]; } $L353: .loc 1 2687 7 add.u64 %r2093,%r2088,%r802; .loc 1 2688 2 @ ! %r1556 bra $L354; .loc 1 2689 22 mov.u32 %r202092095096,[%value_in]; } $L354: .loc 1 2687 24 add.u64 %r255,%r1640,3; .loc 1 2687 7 add.u64 %r800,%r2093,%r802; setp.eq.u64 %r2097,%r255,%r806; @ ! %r2097 bra $L34; bra $L33; $L344: ld.f32 %r2098,[%r816]; ld.f32 %r2099,[%r809]; ld.f32 %r2100,[%r834]; ld.f32 %r2101,[%r810]; .loc 1 2938 23 mul.f32 %r2102,%r2100,%r2099; mul.f32 %r2103,%r2101,%r2099; neg.f32 %r2104,%r2103; .loc 1 2938 9 fma.rn.f32 %r103,%r2100,%r2098,%r2104; fma.rn.f32 %r108,%r2101,%r2098,%r2102; .loc 1 2937 5 add.u64 %r807,%r816,8; add.u64 %r808,%r834,8; add.u64 %r809,%r816,12; add.u64 %r810,%r834,12; $L289: ld.f32 %r2108,[%r807]; ld.f32 %r2109,[%r809]; ld.f32 %r2110,[%r808]; ld.f32 %r2111,[%r810]; .loc 1 2938 23 mul.f32 %r2112,%r2110,%r2109; mul.f32 %r2113,%r2111,%r2109; neg.f32 %r2114,%r2113; fma.rn.f32 %r2115,%r2110,%r2108,%r2114; fma.rn.f32 %r2116,%r2111,%r2108,%r2112; .loc 1 2938 9 add.f32 %r103,%r103,%r2115; add.f32 %r108,%r108,%r2116; .loc 1 2937 5 add.u64 %r807,%r807,8; add.u64 %r808,%r808,8; add.u64 %r809,%r809,8; add.u64 %r810,%r810,8; $L288: ld.f32 %r2118,[%r807]; ld.f32 %r2119,[%r809]; ld.f32 %r2120,[%r808]; ld.f32 %r2121,[%r810]; .loc 1 2938 23 mul.f32 %r2122,%r2120,%r2119; mul.f32 %r2123,%r2121,%r2119; neg.f32 %r2124,%r2123; fma.rn.f32 %r2125,%r2120,%r2118,%r2124; fma.rn.f32 %r2126,%r2121,%r2118,%r2122; .loc 1 2938 9 add.f32 %r103,%r103,%r2125; add.f32 %r108,%r108,%r2126; .loc 1 2937 5 add.u64 %r807,%r807,8; add.u64 %r808,%r808,8; add.u64 %r809,%r809,8; add.u64 %r810,%r810,8; setp.ne.u64 %r2127,%r807,%r821; @ %r2127 bra $L93; bra $L92; $L345: ld.f32 %r2128,[%r220]; ld.f32 %r2129,[%r845]; ld.f32 %r2130,[%r843]; ld.f32 %r2131,[%r843+4]; .loc 1 2953 26 mul.f32 %r2132,%r2130,%r2129; mul.f32 %r2133,%r2131,%r2129; neg.f32 %r2134,%r2133; .loc 1 2953 5 fma.rn.f32 %r38,%r2130,%r2128,%r2134; fma.rn.f32 %r37,%r2131,%r2128,%r2132; .loc 1 2952 8 add.u64 %r840,%r220,%r842; add.u64 %r843,%r843,8; add.u64 %r845,%r845,%r842; $L291: ld.f32 %r2138,[%r840]; ld.f32 %r2139,[%r845]; ld.f32 %r2140,[%r843]; ld.f32 %r2141,[%r843+4]; .loc 1 2953 26 mul.f32 %r2142,%r2140,%r2139; mul.f32 %r2143,%r2141,%r2139; neg.f32 %r2144,%r2143; fma.rn.f32 %r2145,%r2140,%r2138,%r2144; fma.rn.f32 %r2146,%r2141,%r2138,%r2142; .loc 1 2953 5 add.f32 %r38,%r38,%r2145; add.f32 %r37,%r37,%r2146; .loc 1 2952 8 add.u64 %r840,%r840,%r842; add.u64 %r843,%r843,8; add.u64 %r845,%r845,%r842; $L290: ld.f32 %r2148,[%r840]; ld.f32 %r2149,[%r845]; ld.f32 %r2150,[%r843]; ld.f32 %r2151,[%r843+4]; .loc 1 2953 26 mul.f32 %r2152,%r2150,%r2149; mul.f32 %r2153,%r2151,%r2149; neg.f32 %r2154,%r2153; fma.rn.f32 %r2155,%r2150,%r2148,%r2154; fma.rn.f32 %r2156,%r2151,%r2148,%r2152; .loc 1 2953 5 add.f32 
%r38,%r38,%r2155; add.f32 %r37,%r37,%r2156; .loc 1 2952 8 add.u64 %r840,%r840,%r842; add.u64 %r843,%r843,8; add.u64 %r845,%r845,%r842; setp.ne.u64 %r2157,%r843,%r861; @ %r2157 bra $L97; bra $L96; $L346: ld.f32 %r2158,[%r220]; ld.f32 %r2159,[%r220+4]; ld.f32 %r2160,[%r865]; ld.f32 %r2161,[%r865+4]; .loc 1 2968 29 mul.f32 %r2162,%r2160,%r2159; mul.f32 %r2163,%r2161,%r2159; neg.f32 %r2164,%r2163; .loc 1 2968 8 fma.rn.f32 %r188,%r2160,%r2158,%r2164; fma.rn.f32 %r152,%r2161,%r2158,%r2162; .loc 1 2967 28 mov.u64 %r269,1; .loc 1 2967 4 add.u64 %r862,%r220,%r864; add.u64 %r865,%r865,%r867; $L293: ld.f32 %r2168,[%r862]; ld.f32 %r2169,[%r862+4]; ld.f32 %r2170,[%r865]; ld.f32 %r2171,[%r865+4]; .loc 1 2968 29 mul.f32 %r2172,%r2170,%r2169; mul.f32 %r2173,%r2171,%r2169; neg.f32 %r2174,%r2173; fma.rn.f32 %r2175,%r2170,%r2168,%r2174; fma.rn.f32 %r2176,%r2171,%r2168,%r2172; .loc 1 2968 8 add.f32 %r188,%r188,%r2175; add.f32 %r152,%r152,%r2176; .loc 1 2967 28 add.u64 %r269,%r269,1; .loc 1 2967 4 add.u64 %r862,%r862,%r864; add.u64 %r865,%r865,%r867; $L292: ld.f32 %r2178,[%r862]; ld.f32 %r2179,[%r862+4]; ld.f32 %r2180,[%r865]; ld.f32 %r2181,[%r865+4]; .loc 1 2968 29 mul.f32 %r2182,%r2180,%r2179; mul.f32 %r2183,%r2181,%r2179; neg.f32 %r2184,%r2183; fma.rn.f32 %r2185,%r2180,%r2178,%r2184; fma.rn.f32 %r2186,%r2181,%r2178,%r2182; .loc 1 2968 8 add.f32 %r188,%r188,%r2185; add.f32 %r152,%r152,%r2186; .loc 1 2967 28 add.u64 %r269,%r269,1; .loc 1 2967 4 add.u64 %r862,%r862,%r864; add.u64 %r865,%r865,%r867; setp.ne.u64 %r2187,%r638,%r269; @ %r2187 bra $L100; bra $L99; $L348: ld.f32 %r2188,[%r881]; ld.f32 %r2189,[%r881+4]; ld.f32 %r2190,[%r891]; ld.f32 %r2191,[%r894]; .loc 1 2983 37 mul.f32 %r2192,%r2189,%r2190; ld.f32 %r2193,[%r900]; fma.rn.f32 %r2194,%r2190,%r2188,%r2193; fma.rn.f32 %r2195,%r2191,%r2188,%r2192; .loc 1 2982 36 neg.f32 %r2196,%r2189; fma.rn.f32 %r2197,%r2196,%r2191,%r2194; ld.f32 %r2198,[%r884]; add.f32 %r2199,%r2195,%r2198; st.f32 [%r900],%r2197; st.f32 [%r884],%r2199; .loc 1 2980 29 mov.u64 %r265,1; .loc 1 2980 4 add.u64 %r878,%r900,%r912; add.u64 %r881,%r881,%r883; add.u64 %r884,%r884,%r912; $L297: ld.f32 %r2201,[%r881]; ld.f32 %r2202,[%r881+4]; ld.f32 %r2203,[%r891]; ld.f32 %r2204,[%r894]; .loc 1 2983 37 mul.f32 %r2205,%r2202,%r2203; ld.f32 %r2206,[%r878]; fma.rn.f32 %r2207,%r2203,%r2201,%r2206; fma.rn.f32 %r2208,%r2204,%r2201,%r2205; .loc 1 2982 36 neg.f32 %r2209,%r2202; fma.rn.f32 %r2210,%r2209,%r2204,%r2207; ld.f32 %r2211,[%r884]; add.f32 %r2212,%r2208,%r2211; st.f32 [%r878],%r2210; st.f32 [%r884],%r2212; .loc 1 2980 29 add.u64 %r265,%r265,1; .loc 1 2980 4 add.u64 %r878,%r878,%r912; add.u64 %r881,%r881,%r883; add.u64 %r884,%r884,%r912; $L296: ld.f32 %r2214,[%r881]; ld.f32 %r2215,[%r881+4]; ld.f32 %r2216,[%r891]; ld.f32 %r2217,[%r894]; .loc 1 2983 37 mul.f32 %r2218,%r2215,%r2216; ld.f32 %r2219,[%r878]; fma.rn.f32 %r2220,%r2216,%r2214,%r2219; fma.rn.f32 %r2221,%r2217,%r2214,%r2218; .loc 1 2982 36 neg.f32 %r2222,%r2215; fma.rn.f32 %r2223,%r2222,%r2217,%r2220; ld.f32 %r2224,[%r884]; add.f32 %r2225,%r2221,%r2224; st.f32 [%r878],%r2223; st.f32 [%r884],%r2225; .loc 1 2980 29 add.u64 %r265,%r265,1; .loc 1 2980 4 add.u64 %r878,%r878,%r912; add.u64 %r881,%r881,%r883; add.u64 %r884,%r884,%r912; setp.ne.u64 %r2226,%r162,%r265; @ %r2226 bra $L110; bra $L113; $L347: .loc 1 2976 34 st.f32 [%r910],%r1599; st.f32 [%r910+4],%r1600; .loc 1 2975 27 mov.u64 %r267,1; .loc 1 2975 2 add.u64 %r910,%r910,%r912; $L295: .loc 1 2976 34 st.f32 [%r910],%r1599; st.f32 [%r910+4],%r1600; .loc 1 2975 27 add.u64 %r267,%r267,1; .loc 
1 2975 2 add.u64 %r910,%r910,%r912; $L294: .loc 1 2976 34 st.f32 [%r910],%r1599; st.f32 [%r910+4],%r1600; .loc 1 2975 27 add.u64 %r267,%r267,1; .loc 1 2975 2 add.u64 %r910,%r910,%r912; setp.ne.u64 %r2229,%r162,%r267; @ %r2229 bra $L105; bra $L109; $L349: ld.f32 %r2230,[%r918]; ld.f32 %r2231,[%r918+4]; ld.f32 %r2232,[%r944]; ld.f32 %r2233,[%r944+4]; .loc 1 3002 28 mul.f32 %r2234,%r2232,%r2231; mul.f32 %r2235,%r2233,%r2231; neg.f32 %r2236,%r2235; .loc 1 3002 5 fma.rn.f32 %r211,%r2232,%r2230,%r2236; fma.rn.f32 %r161,%r2233,%r2230,%r2234; .loc 1 3001 32 mov.u64 %r262,1; .loc 1 3001 8 add.u64 %r918,%r918,%r920; add.u64 %r922,%r944,%r924; $L299: ld.f32 %r2240,[%r918]; ld.f32 %r2241,[%r918+4]; ld.f32 %r2242,[%r922]; ld.f32 %r2243,[%r922+4]; .loc 1 3002 28 mul.f32 %r2244,%r2242,%r2241; mul.f32 %r2245,%r2243,%r2241; neg.f32 %r2246,%r2245; fma.rn.f32 %r2247,%r2242,%r2240,%r2246; fma.rn.f32 %r2248,%r2243,%r2240,%r2244; .loc 1 3002 5 add.f32 %r211,%r211,%r2247; add.f32 %r161,%r161,%r2248; .loc 1 3001 32 add.u64 %r262,%r262,1; .loc 1 3001 8 add.u64 %r918,%r918,%r920; add.u64 %r922,%r922,%r924; $L298: ld.f32 %r2250,[%r918]; ld.f32 %r2251,[%r918+4]; ld.f32 %r2252,[%r922]; ld.f32 %r2253,[%r922+4]; .loc 1 3002 28 mul.f32 %r2254,%r2252,%r2251; mul.f32 %r2255,%r2253,%r2251; neg.f32 %r2256,%r2255; fma.rn.f32 %r2257,%r2252,%r2250,%r2256; fma.rn.f32 %r2258,%r2253,%r2250,%r2254; .loc 1 3002 5 add.f32 %r211,%r211,%r2257; add.f32 %r161,%r161,%r2258; .loc 1 3001 32 add.u64 %r262,%r262,1; .loc 1 3001 8 add.u64 %r918,%r918,%r920; add.u64 %r922,%r922,%r924; setp.ne.u64 %r2259,%r638,%r262; @ %r2259 bra $L118; bra $L117; } matmul_c8.o/_gfortran_matmul_c8 .visible .func _gfortran_matmul_c8, .param .u32 %in_ar4, .param .u64 %in_ar5); .file 1 "../../../libgfortran/generated/matmul_c8.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]FUNCTION DECL: _gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime_VAR DEF: $LC9 .const .align 8 .u64 $LC9[2] = {__func__$0 .const .align 1 .u8 __func__$0[10] = {109,97,116,109,117,108,95,99,56,0 }2] = {782] = {672] = {8471,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,97,41,32,61,61,32,50,32,124,124,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,98,41,32,61,61,3243116,114,97,110,47,103,101,110,101,114,97,116,101,100,4765,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,49,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3265,114,114,97,121,32,98,111,117,110,100,32,109,105,115,109,97,116,99,104,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,50,32,111,102,32,97,114,114,97,121,32,40,37,108,100,47,37,108,100,41,3273,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,97,114,103,117,109,101,110,116,32,66,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,49,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }; // BEGIN VAR DEF: $LC8 .const .align 8 .u64 $LC8[2] = {4607182418800017408_gfortran_matmul_c8 .visible .func _gfortran_matmul_c8f64 %r189; .reg .f64 %r191; .reg .f64 %r192; .reg .f64 %r193; .reg .f64u32 %r198; .reg .u32 %r199; .reg .u64 %r200; .reg .f64 %r201; .reg .f64 %r202; .reg .f64 %r203; .reg .f64 %r205; .reg .f64 %r206; .reg .f64 %r207; .reg .f64 %r208; 
.reg .f64 %r209; .reg .f64 %r211; .reg .f64 %r212; .reg .f64 %r213; .reg .f64u64 %r257; .reg .u64 %r258; .reg .u64u64 %r267; .reg .u64 %r268; .reg .u64 %r269; .reg .f64 %r270; .reg .f64 %r271; .reg .f64 %r272; .reg .f64 %r273; .reg .f64 %r274; .reg .f64 %r275; .reg .f64 %r276; .reg .f64 %r277; .reg .f64 %r278; .reg .f64 %r279; .reg .f64 %r280; .reg .f64 %r281; .reg .f64 %r282; .reg .f64 %r283; .reg .f64 %r284; .reg .f64f64 %r290; .reg .f64 %r296; .reg .u64 %r297; .reg .u64 %r298; .reg .f64 %r299; .reg .f64 %r300; .reg .f64 %r301; .reg .f64 %r302; .reg .f64 %r303; .reg .f64 %r304; .reg .f64 %r305; .reg .f64 %r306; .reg .f64 %r307; .reg .f64 %r308; .reg .f64 %r309; .reg .f64 %r310; .reg .f64 %r311; .reg .f64 %r312; .reg .f64 %r313; .reg .f64 %r314; .reg .f64 %r315; .reg .f64 %r317; .reg .f64 %r318; .reg .f64 %r319; .reg .f64 %r320; .reg .f64 %r321; .reg .f64 %r323; .reg .f64 %r324; .reg .f64 %r325; .reg .f64 %r326; .reg .f64 %r327; .reg .f64 %r329; .reg .f64 %r330; .reg .f64 %r331; .reg .f64 %r333; .reg .f64 %r334; .reg .f64 %r335; .reg .f64 %r336; .reg .f64 %r337; .reg .f64 %r339; .reg .f64 %r340; .reg .f64 %r341; .reg .f64 %r343; .reg .f64 %r344; .reg .f64 %r345; .reg .f64 %r346; .reg .f64 %r347; .reg .f64 %r349; .reg .f64 %r350; .reg .f64 %r351; .reg .f64 %r353; .reg .f64 %r354; .reg .f64 %r355; .reg .f64 %r356; .reg .f64 %r357; .reg .f64 %r359; .reg .f64 %r360; .reg .f64 %r361; .reg .f64 %r362; .reg .f64 %r363; .reg .f64 %r365; .reg .f64 %r366; .reg .f64 %r367; .reg .f64 %r369; .reg .f64 %r370; .reg .f64 %r371; .reg .f64 %r373; .reg .f64 %r374; .reg .f64 %r375; .reg .f64 %r377; .reg .f64 %r378; .reg .f64 %r379; .reg .f64 %r381; .reg .f64 %r382; .reg .f64 %r383; .reg .f64 %r385; .reg .f64 %r386; .reg .f64 %r387; .reg .f64 %r389; .reg .f64 %r390; .reg .f64 %r391; .reg .f64 %r392; .reg .f64 %r393; .reg .f64 %r394; .reg .f64 %r395; .reg .f64 %r396; .reg .f64 %r397; .reg .f64 %r398; .reg .f64 %r399; .reg .f64 %r400; .reg .f64 %r401; .reg .f64 %r402; .reg .f64 %r403; .reg .f64 %r405; .reg .f64 %r406; .reg .f64 %r407; .reg .f64 %r408; .reg .f64 %r409; .reg .f64 %r411; .reg .f64 %r412; .reg .f64 %r413; .reg .f64 %r414; .reg .f64 %r415; .reg .f64 %r417; .reg .f64 %r418; .reg .f64 %r419; .reg .f64 %r420; .reg .f64 %r421; .reg .f64 %r423; .reg .f64 %r424; .reg .f64 %r425; .reg .f64 %r426; .reg .f64 %r427; .reg .f64 %r428; .reg .f64 %r429; .reg .f64 %r430; .reg .f64 %r431; .reg .f64 %r432; .reg .f64 %r433; .reg .f64 %r434; .reg .f64 %r435; .reg .f64 %r436; .reg .f64 %r437; .reg .f64 %r439; .reg .f64 %r440; .reg .f64 %r441; .reg .f64 %r442; .reg .f64 %r443; .reg .f64 %r445; .reg .f64 %r446; .reg .f64 %r447; .reg .f64 %r448; .reg .f64 %r449; .reg .f64 %r451; .reg .f64 %r452; .reg .f64 %r453; .reg .f64 %r454; .reg .f64 %r455; .reg .f64 %r457; .reg .f64 %r458; .reg .f64 %r459; .reg .f64 %r460; .reg .f64 %r461; .reg .f64u64 %r509; .reg .u64 %r511; .reg .u64 %r513; .reg .u64 %r515; .reg .u64 %r517; .reg .u64 %r519; .reg .u64 %r521; .reg .u64 %r523; .reg .u64 %r524; .reg .u64 %r527; .reg .u64u64 %r549; .reg .u64 %r552; .reg .u64 %r553; .reg .u64 %r555; .reg .u64u64 %r567; .reg .u64 %r569; .reg .u64 %r574; .reg .u64 %r575; .reg .u64 %r577; .reg .u64u64 %r585; .reg .u64 %r592; .reg .u64u64 %r604; .reg .u64 %r606; .reg .u64 %r608; .reg .u64 %r610; .reg .u64 %r615; .reg .u64 %r616; .reg .u64 %r617; .reg .u64 %r621; .reg .u64 %r623; .reg .u32 %r627; .reg .u64 %r637; .reg .u64 %r638; .reg .u64 %r639; .reg .u64 %r640; .reg .u64 %r641; .reg .u64 %r642; .reg .u64 %r643; .reg .u64 %r644; .reg .u64 %r645; .reg .u64 
%r647; .reg .u64 %r648; .reg .u64 %r649; .reg .u64 %r650; .reg .u64 %r651; .reg .u64 %r657; .reg .u64 %r658; .reg .u32 %r662; .reg .u64 %r663; .reg .u64 %r664; .reg .u64 %r668; .reg .u64 %r672; .reg .u64 %r676; .reg .u64 %r677; .reg .u64 %r678; .reg .u64 %r679; .reg .u64 %r682; .reg .u64 %r683; .reg .u64 %r692; .reg .u64 %r694; .reg .u64 %r695; .reg .u64 %r696; .reg .u64 %r706; .reg .u64 %r707; .reg .u64 %r708; .reg .u64 %r709; .reg .u64 %r710; .reg .u64 %r716; .reg .u64 %r721; .reg .u64 %r726; .reg .u64 %r736; .reg .u64 %r737; .reg .u64 %r747; .reg .u64 %r749; .reg .u64 %r750; .reg .u64 %r756; .reg .u64 %r757; .reg .u64 %r758; .reg .u64 %r759; .reg .u64 %r760; .reg .u64 %r761; .reg .u64 %r762; .reg .u64 %r763; .reg .u64 %r764; .reg .u64 %r765; .reg .u64 %r766; .reg .u64 %r767; .reg .u64 %r769; .reg .u64 %r770; .reg .u64 %r771; .reg .u64 %r772; .reg .u64 %r774; .reg .u64 %r775; .reg .u64 %r782; .reg .u64 %r783; .reg .u64 %r784; .reg .u64 %r785; .reg .u64 %r786; .reg .u64 %r793; .reg .u64 %r795; .reg .u64 %r797; .reg .u64 %r798; .reg .u64 %r799; .reg .u64 %r800; .reg .u64 %r802; .reg .u64 %r806; .reg .u64 %r807; .reg .u64 %r808; .reg .u64 %r809; .reg .u64 %r810; .reg .u64 %r815; .reg .u64 %r816; .reg .u64 %r818; .reg .u64 %r821; .reg .u64 %r822; .reg .u64 %r827; .reg .u64 %r829; .reg .u64 %r830; .reg .u64 %r834; .reg .u64 %r836; .reg .u64 %r840; .reg .u64 %r842; .reg .u64 %r843; .reg .u64 %r844; .reg .u64 %r845; .reg .u64 %r851; .reg .u64 %r853; .reg .u64 %r854; .reg .u64 %r861; .reg .u64 %r862; .reg .u64 %r864; .reg .u64 %r865; .reg .u64 %r867; .reg .u64 %r870; .reg .u64 %r872; .reg .u64 %r873; .reg .u64 %r878; .reg .u64 %r881; .reg .u64 %r883; .reg .u64 %r884; .reg .u64 %r889; .reg .u64 %r891; .reg .u64 %r893; .reg .u64 %r894; .reg .u64 %r900; .reg .u32 %r901; .reg .u64 %r902; .reg .u64 %r905; .reg .u64 %r908; .reg .u64 %r910; .reg .u64 %r912; .reg .u64 %r914; .reg .u64 %r918; .reg .u64 %r920; .reg .u64 %r922; .reg .u64 %r924; .reg .u64 %r927; .reg .u64 %r929; .reg .u64 %r931; .reg .u64 %r936; .reg .u64 %r938; .reg .u64 %r944; .reg .u64 %r945; .reg .u64 %r946; .reg .u64 %r947; .reg .u32 %r948; .reg .u32 %r949; .reg .u64 %r950; .reg .u16 %r951; .reg .pred %r952; .reg .u16 %r953; .reg .pred %r955; .reg .u64 %r956; .reg .u32 %r957; .reg .u64 %r958; .reg .u64 %r959; .reg .u64 %r960; .reg .pred %r961; .reg .pred %r963; .reg .u64 %r966; .reg .u64 %r968; .reg .u64 %r969; .reg .u64 %r970; .reg .u64 %r972; .reg .u64 %r973; .reg .u64 %r974; .reg .u64 %r975; .reg .u64 %r976; .reg .u64 %r979; .reg .u64 %r981; .reg .u64 %r982; .reg .u64 %r984; .reg .u64 %r986; .reg .u64 %r987; .reg .u64 %r989; .reg .u64 %r990; .reg .u64 %r991; .reg .u64 %r992; .reg .u64 %r993; .reg .u32 %r994; .reg .pred %r995; .reg .u64 %r996; .reg .u64 %r997; .reg .u64 %r998; .reg .pred %r1000; .reg .u64 %r1001; .reg .u64 %r1002; .reg .u64 %r1003; .reg .pred %r1004; .reg .u64 %r1005; .reg .u64 %r1006; .reg .u64 %r1007; .reg .u16 %r1008; .reg .pred %r1009; .reg .u64 %r1010; .reg .u64 %r1012; .reg .u64 %r1013; .reg .u64 %r1014; .reg .u16 %r1015; .reg .pred %r1017; .reg .pred %r1018; .reg .u64 %r1019; .reg .pred %r1021; .reg .u64 %r1022; .reg .u64 %r1024; .reg .u64 %r1025; .reg .u64 %r1026; .reg .u64 %r1027; .reg .u64 %r1028; .reg .u64 %r1029; .reg .pred %r1030; .reg .u64 %r1031; .reg .u64 %r1033; .reg .u64 %r1034; .reg .u64 %r1035; .reg .u16 %r1036; .reg .u32 %r1037; .reg .pred %r1038; .reg .u16 %r1039; .reg .pred %r1040; .reg .u64 %r1041; .reg .u64 %r1042; .reg .u64 %r1043; .reg .u64 %r1044; .reg .u64 %r1045; .reg .u64 %r1046; .reg 
.pred %r1047; .reg .u32 %r1049; .reg .u32 %r1050; .reg .u32 %r1052; .reg .u32 %r1053; .reg .u16 %r1054; .reg .u16 %r1055; .reg .u16 %r1056; .reg .u32 %r1057; .reg .u16 %r1058; .reg .pred %r1059; .reg .u64 %r1060; .reg .u16 %r1062; .reg .pred %r1063; .reg .u64 %r1064; .reg .u64 %r1065; .reg .u64 %r1066; .reg .u32 %r1067; .reg .u32 %r1068; .reg .u32 %r1069; .reg .u32 %r1070; .reg .u32 %r1071; .reg .u32 %r1072; .reg .u32 %r1074; .reg .u32 %r1075; .reg .u16 %r1076; .reg .u16 %r1077; .reg .u16 %r1078; .reg .u32 %r1079; .reg .u16 %r1080; .reg .pred %r1081; .reg .u32 %r1083; .reg .u32 %r1084; .reg .u16 %r1085; .reg .u16 %r1086; .reg .u16 %r1087; .reg .u32 %r1088; .reg .u16 %r1089; .reg .pred %r1090; .reg .u32 %r1092; .reg .u32 %r1093; .reg .u32 %r1095; .reg .u32 %r1096; .reg .u16 %r1097; .reg .u16 %r1098; .reg .u16 %r1099; .reg .u32 %r1100; .reg .u16 %r1101; .reg .pred %r1102; .reg .f32 %r1103; .reg .f32 %r1104; .reg .f32 %r1105; .reg .f32 %r1106; .reg .f32 %r1107; .reg .f32 %r1108; .reg .f32 %r1109; .reg .pred %r1110; .reg .f64 %r1111; .reg .f64 %r1112; .reg .pred %r1115; .reg .pred %r1116; .reg .u32 %r1118; .reg .u32 %r1119; .reg .u32 %r1121; .reg .u32 %r1122; .reg .u16 %r1123; .reg .u16 %r1124; .reg .u16 %r1125; .reg .u32 %r1126; .reg .u16 %r1127; .reg .pred %r1128; .reg .u32 %r1130; .reg .u32 %r1131; .reg .u32 %r1133; .reg .u32 %r1134; .reg .u16 %r1135; .reg .u16 %r1136; .reg .u16 %r1137; .reg .u32 %r1138; .reg .u16 %r1139; .reg .pred %r1140; .reg .u32 %r1142; .reg .u32 %r1143; .reg .u32 %r1145; .reg .u32 %r1146; .reg .u16 %r1147; .reg .u16 %r1148; .reg .u16 %r1149; .reg .u32 %r1150; .reg .u16 %r1151; .reg .pred %r1152; .reg .pred %r1153; .reg .u64 %r1154; .reg .u32 %r1155; .reg .u64 %r1156; .reg .u64 %r1157; .reg .u32 %r1158; .reg .pred %r1159; .reg .pred %r1160; .reg .u32 %r1161; .reg .pred %r1162; .reg .pred %r1163; .reg .u32 %r1178; .reg .u64 %r1179; .reg .u64 %r1180; .reg .u64 %r1181; .reg .u64 %r1182; .reg .u64 %r1183; .reg .u64 %r1184; .reg .u64 %r1185; .reg .u16 %r1186; .reg .u16 %r1187; .reg .u16 %r1188; .reg .u32 %r1189; .reg .u16 %r1190; .reg .pred %r1191; .reg .pred %r1192; .reg .u16 %r1193; .reg .pred %r1194; .reg .u64 %r1195; .reg .pred %r1196; .reg .u32 %r1200; .reg .u64 %r1202; .reg .u32 %r1207; .reg .u32 %r1208; .reg .u32 %r1210; .reg .u32 %r1211; .reg .u16 %r1212; .reg .u16 %r1213; .reg .u16 %r1214; .reg .u32 %r1216; .reg .u32 %r1217; .reg .u16 %r1219; .reg .u16 %r1221; .reg .u32 %r1222; .reg .u16 %r1223; .reg .pred %r1224; .reg .pred %r1225; .reg .u64 %r1226; .reg .u64 %r1228; .reg .u64 %r1229; .reg .u64 %r1230; .reg .pred %r1232; .reg .u64 %r1234; .reg .u64 %r1235; .reg .u64 %r1236; .reg .u64 %r1238; .reg .u64 %r1239; .reg .u64 %r1244; .reg .u64 %r1245; .reg .u64 %r1246; .reg .u64 %r1247; .reg .u64 %r1251; .reg .u64 %r1253; .reg .u64 %r1254; .reg .u64 %r1255; .reg .u64 %r1256; .reg .u64 %r1257; .reg .pred %r1259; .reg .u64 %r1265; .reg .u64 %r1266; .reg .u64 %r1267; .reg .u64 %r1268; .reg .pred %r1270; .reg .u64 %r1271; .reg .u64 %r1272; .reg .u64 %r1273; .reg .u64 %r1274; .reg .u64 %r1275; .reg .u64 %r1276; .reg .u64 %r1277; .reg .u64 %r1278; .reg .u64 %r1279; .reg .u64 %r1283; .reg .u64 %r1284; .reg .u64 %r1285; .reg .u64 %r1286; .reg .u64 %r1289; .reg .u64 %r1290; .reg .u64 %r1291; .reg .u64 %r1292; .reg .u64 %r1293; .reg .u64 %r1295; .reg .u64 %r1296; .reg .u64 %r1297; .reg .u64 %r1298; .reg .u64 %r1299; .reg .u64 %r1300; .reg .u64 %r1304; .reg .u64 %r1305; .reg .u64 %r1308; .reg .u64 %r1313; .reg .u64 %r1314; .reg .f64 %r1316; .reg .f64 %r1317; .reg .f64 %r1318; .reg 
.f64 %r1319; .reg .f64 %r1321; .reg .f64 %r1322; .reg .pred %r1323; .reg .u64 %r1326; .reg .u64 %r1327; .reg .u64 %r1328; .reg .u64 %r1329; .reg .u64 %r1331; .reg .pred %r1332; .reg .u64 %r1333; .reg .u64 %r1334; .reg .f64 %r1335; .reg .u64 %r1338; .reg .u64 %r1339; .reg .u64 %r1340; .reg .u64 %r1341; .reg .u64 %r1342; .reg .u64 %r1343; .reg .u64 %r1344; .reg .u64 %r1346; .reg .u64 %r1347; .reg .f64 %r1351; .reg .f64 %r1352; .reg .f64 %r1353; .reg .f64 %r1354; .reg .f64 %r1355; .reg .f64 %r1356; .reg .f64 %r1357; .reg .f64 %r1358; .reg .f64 %r1359; .reg .f64 %r1360; .reg .f64 %r1361; .reg .f64 %r1362; .reg .f64 %r1363; .reg .f64 %r1364; .reg .f64 %r1365; .reg .f64 %r1366; .reg .f64 %r1367; .reg .f64 %r1368; .reg .f64 %r1369; .reg .f64 %r1370; .reg .f64 %r1371; .reg .f64 %r1372; .reg .f64 %r1373; .reg .f64 %r1374; .reg .f64 %r1375; .reg .f64 %r1376; .reg .f64 %r1377; .reg .f64 %r1378; .reg .f64 %r1379; .reg .f64 %r1380; .reg .f64 %r1381; .reg .f64 %r1382; .reg .pred %r1383; .reg .pred %r1384; .reg .pred %r1385; .reg .u64 %r1387; .reg .u64 %r1388; .reg .u64 %r1389; .reg .u64 %r1390; .reg .u64 %r1391; .reg .u64 %r1392; .reg .f64 %r1394; .reg .f64 %r1395; .reg .f64 %r1396; .reg .f64 %r1397; .reg .f64 %r1398; .reg .f64 %r1399; .reg .f64 %r1400; .reg .f64 %r1401; .reg .pred %r1402; .reg .pred %r1403; .reg .pred %r1404; .reg .u64 %r1406; .reg .u64 %r1407; .reg .u64 %r1409; .reg .u64 %r1410; .reg .u64 %r1411; .reg .u64 %r1412; .reg .u64 %r1413; .reg .u64 %r1420; .reg .u64 %r1421; .reg .f64 %r1424; .reg .f64 %r1425; .reg .f64 %r1426; .reg .f64 %r1427; .reg .f64 %r1428; .reg .f64 %r1429; .reg .f64 %r1430; .reg .f64 %r1431; .reg .pred %r1432; .reg .pred %r1433; .reg .pred %r1434; .reg .f64 %r1437; .reg .f64 %r1438; .reg .pred %r1440; .reg .u16 %r1441; .reg .pred %r1442; .reg .u32 %r1444; .reg .u32 %r1445; .reg .u16 %r1446; .reg .u16 %r1447; .reg .u32 %r1449; .reg .u16 %r1450; .reg .pred %r1451; .reg .pred %r1452; .reg .u16 %r1453; .reg .pred %r1454; .reg .pred %r1455; .reg .pred %r1456; .reg .u64 %r1457; .reg .u64 %r1458; .reg .pred %r1460; .reg .u64 %r1461; .reg .f64 %r1463; .reg .f64 %r1464; .reg .pred %r1466; .reg .u64 %r1468; .reg .u64 %r1469; .reg .u64 %r1470; .reg .f64 %r1471; .reg .f64 %r1472; .reg .pred %r1474; .reg .pred %r1475; .reg .u64 %r1477; .reg .f64 %r1478; .reg .f64 %r1479; .reg .pred %r1481; .reg .pred %r1482; .reg .pred %r1483; .reg .pred %r1484; .reg .u64 %r1485; .reg .pred %r1489; .reg .u64 %r1491; .reg .f64 %r1492; .reg .f64 %r1493; .reg .f64 %r1494; .reg .pred %r1496; .reg .pred %r1498; .reg .pred %r1502; .reg .u64 %r1503; .reg .u64 %r1504; .reg .u64 %r1506; .reg .f64 %r1507; .reg .f64 %r1508; .reg .pred %r1510; .reg .pred %r1511; .reg .u64 %r1512; .reg .pred %r1513; .reg .u64 %r1514; .reg .u64 %r1515; .reg .u64 %r1516; .reg .u16 %r1517; .reg .u32 %r1518; .reg .pred %r1519; .reg .u16 %r1520; .reg .u32 %r1521; .reg .pred %r1522; .reg .u64 %r1523; .reg .u64 %r1524; .reg .u64 %r1525; .reg .u64 %r1526; .reg .u64 %r1527; .reg .u64 %r1528; .reg .u64 %r1529; .reg .u32 %r1530; .reg .pred %r1531; .reg .u64 %r1532; .reg .u64 %r1533; .reg .u64 %r1534; .reg .f64 %r1535; .reg .pred %r1536; .reg .pred %r1542; .reg .pred %r1543; .reg .pred %r1544; .reg .f64 %r1545; .reg .u64 %r1546; .reg .pred %r1547; .reg .pred %r1548; .reg .pred %r1549; .reg .pred %r1550; .reg .u32 %r1551; .reg .u64 %r1552; .reg .pred %r1553; .reg .pred %r1556; .reg .u64 %r1562; .reg .u64 %r1563; .reg .u64 %r1564; .reg .u64 %r1565; .reg .u64 %r1566; .reg .u64 %r1570; .reg .u64 %r1571; .reg .u64 %r1572; .reg .u64 %r1573; 
.reg .u64 %r1574; .reg .u64 %r1576; .reg .u64 %r1577; .reg .u64 %r1578; .reg .u64 %r1580; .reg .pred %r1581; .reg .u64 %r1583; .reg .pred %r1585; .reg .u64 %r1588; .reg .pred %r1589; .reg .u64 %r1590; .reg .pred %r1591; .reg .u64 %r1592; .reg .u64 %r1593; .reg .pred %r1597; .reg .u64 %r1598; .reg .f64 %r1599; .reg .f64 %r1600; .reg .pred %r1601; .reg .u64 %r1602; .reg .u64 %r1603; .reg .u64 %r1604; .reg .u64 %r1605; .reg .u64 %r1606; .reg .pred %r1607; .reg .pred %r1608; .reg .pred %r1609; .reg .u64 %r1611; .reg .u64 %r1613; .reg .u64 %r1614; .reg .u64 %r1615; .reg .u64 %r1616; .reg .u64 %r1617; .reg .pred %r1618; .reg .pred %r1619; .reg .pred %r1620; .reg .u64 %r1621; .reg .u64 %r1624; .reg .u64 %r1625; .reg .u64 %r1626; .reg .u64 %r1628; .reg .pred %r1629; .reg .u64 %r1630; .reg .u64 %r1631; .reg .u64 %r1632; .reg .u64 %r1636; .reg .pred %r1637; .reg .pred %r1638; .reg .pred %r1639; .reg .u64 %r1640; .reg .u64 %r1641; .reg .u64 %r1642; .reg .u64 %r1643; .reg .u64 %r1644; .reg .u64 %r1645; .reg .pred %r1646; .reg .pred %r1647; .reg .pred %r1648; .reg .u64 %r1653; .reg .u64 %r1654; .reg .u64 %r1655; .reg .u64 %r1656; .reg .u64 %r1657; .reg .pred %r1658; .reg .pred %r1659; .reg .pred %r1660; .reg .u64 %r1661; .reg .u64 %r1665; .reg .pred %r1666; .reg .pred %r1667; .reg .pred %r1668; .reg .u64 %r1673; .reg .pred %r1674; .reg .pred %r1675; .reg .pred %r1676; .reg .u64 %r1681; .reg .pred %r1682; .reg .pred %r1683; .reg .pred %r1684; .reg .u64 %r1689; .reg .pred %r1690; .reg .pred %r1691; .reg .pred %r1692; .reg .u64 %r1695; .reg .u64 %r1696; .reg .u64 %r1698; .reg .u64 %r1699; .reg .f64 %r1703; .reg .f64 %r1704; .reg .f64 %r1705; .reg .f64 %r1706; .reg .f64 %r1707; .reg .f64 %r1708; .reg .f64 %r1709; .reg .f64 %r1710; .reg .pred %r1711; .reg .f64 %r1715; .reg .f64 %r1716; .reg .u64 %r1717; .reg .f64 %r1721; .reg .f64 %r1722; .reg .u64 %r1723; .reg .f64 %r1727; .reg .f64 %r1728; .reg .pred %r1729; .reg .pred %r1730; .reg .f64 %r1731; .reg .f64 %r1732; .reg .f64 %r1736; .reg .f64 %r1737; .reg .f64 %r1738; .reg .f64 %r1739; .reg .f64 %r1740; .reg .f64 %r1741; .reg .f64 %r1742; .reg .f64 %r1743; .reg .f64 %r1744; .reg .f64 %r1745; .reg .f64 %r1746; .reg .u64 %r1748; .reg .f64 %r1751; .reg .f64 %r1752; .reg .f64 %r1753; .reg .f64 %r1754; .reg .f64 %r1755; .reg .f64 %r1756; .reg .f64 %r1757; .reg .f64 %r1758; .reg .f64 %r1759; .reg .f64 %r1760; .reg .f64 %r1761; .reg .u64 %r1763; .reg .f64 %r1766; .reg .f64 %r1767; .reg .f64 %r1768; .reg .f64 %r1769; .reg .f64 %r1770; .reg .f64 %r1771; .reg .f64 %r1772; .reg .f64 %r1773; .reg .f64 %r1774; .reg .pred %r1775; .reg .pred %r1776; .reg .f64 %r1777; .reg .f64 %r1778; .reg .f64 %r1783; .reg .f64 %r1784; .reg .f64 %r1785; .reg .f64 %r1786; .reg .f64 %r1787; .reg .f64 %r1788; .reg .f64 %r1789; .reg .f64 %r1790; .reg .f64 %r1791; .reg .f64 %r1792; .reg .f64 %r1793; .reg .f64 %r1799; .reg .f64 %r1800; .reg .f64 %r1801; .reg .f64 %r1802; .reg .f64 %r1803; .reg .f64 %r1804; .reg .f64 %r1805; .reg .f64 %r1806; .reg .f64 %r1807; .reg .f64 %r1808; .reg .f64 %r1809; .reg .f64 %r1815; .reg .f64 %r1816; .reg .f64 %r1817; .reg .f64 %r1818; .reg .f64 %r1819; .reg .f64 %r1820; .reg .f64 %r1821; .reg .f64 %r1822; .reg .f64 %r1823; .reg .pred %r1824; .reg .pred %r1825; .reg .f64 %r1826; .reg .f64 %r1827; .reg .u64 %r1828; .reg .u64 %r1830; .reg .f64 %r1831; .reg .f64 %r1832; .reg .f64 %r1833; .reg .f64 %r1834; .reg .f64 %r1835; .reg .f64 %r1836; .reg .f64 %r1837; .reg .f64 %r1838; .reg .f64 %r1839; .reg .f64 %r1840; .reg .f64 %r1841; .reg .u64 %r1842; .reg .u64 %r1843; 
.reg .u64 %r1844; .reg .f64 %r1846; .reg .f64 %r1847; .reg .f64 %r1848; .reg .f64 %r1849; .reg .f64 %r1850; .reg .f64 %r1851; .reg .f64 %r1852; .reg .f64 %r1853; .reg .f64 %r1854; .reg .f64 %r1855; .reg .f64 %r1856; .reg .u64 %r1857; .reg .u64 %r1858; .reg .u64 %r1859; .reg .f64 %r1861; .reg .f64 %r1862; .reg .f64 %r1863; .reg .f64 %r1864; .reg .f64 %r1865; .reg .f64 %r1866; .reg .f64 %r1867; .reg .f64 %r1868; .reg .f64 %r1869; .reg .pred %r1870; .reg .pred %r1871; .reg .f64 %r1872; .reg .f64 %r1873; .reg .u64 %r1875; .reg .u64 %r1876; .reg .f64 %r1877; .reg .f64 %r1878; .reg .f64 %r1879; .reg .f64 %r1880; .reg .f64 %r1881; .reg .f64 %r1882; .reg .f64 %r1883; .reg .f64 %r1884; .reg .f64 %r1885; .reg .f64 %r1886; .reg .f64 %r1887; .reg .u64 %r1889; .reg .u64 %r1890; .reg .f64 %r1892; .reg .f64 %r1893; .reg .f64 %r1894; .reg .f64 %r1895; .reg .f64 %r1896; .reg .f64 %r1897; .reg .f64 %r1898; .reg .f64 %r1899; .reg .f64 %r1900; .reg .f64 %r1901; .reg .f64 %r1902; .reg .u64 %r1904; .reg .u64 %r1905; .reg .f64 %r1907; .reg .f64 %r1908; .reg .f64 %r1909; .reg .f64 %r1910; .reg .f64 %r1911; .reg .f64 %r1912; .reg .f64 %r1913; .reg .f64 %r1914; .reg .f64 %r1915; .reg .pred %r1916; .reg .u64 %r1920; .reg .u64 %r1922; .reg .u64 %r1925; .reg .pred %r1927; .reg .pred %r1928; .reg .pred %r1929; .reg .u64 %r1931; .reg .u64 %r1932; .reg .u64 %r1933; .reg .f64 %r1934; .reg .f64 %r1935; .reg .f64 %r1936; .reg .f64 %r1937; .reg .f64 %r1938; .reg .f64 %r1939; .reg .f64 %r1940; .reg .f64 %r1941; .reg .f64 %r1942; .reg .f64 %r1943; .reg .f64 %r1944; .reg .f64 %r1945; .reg .u64 %r1947; .reg .u64 %r1948; .reg .u64 %r1949; .reg .f64 %r1951; .reg .f64 %r1952; .reg .f64 %r1953; .reg .f64 %r1954; .reg .f64 %r1955; .reg .f64 %r1956; .reg .f64 %r1957; .reg .f64 %r1958; .reg .f64 %r1959; .reg .f64 %r1960; .reg .f64 %r1961; .reg .f64 %r1962; .reg .u64 %r1964; .reg .u64 %r1965; .reg .u64 %r1966; .reg .f64 %r1968; .reg .f64 %r1969; .reg .f64 %r1970; .reg .f64 %r1971; .reg .f64 %r1972; .reg .f64 %r1973; .reg .f64 %r1974; .reg .f64 %r1975; .reg .f64 %r1976; .reg .f64 %r1977; .reg .f64 %r1978; .reg .f64 %r1979; .reg .pred %r1980; .reg .pred %r1981; .reg .f64 %r1983; .reg .f64 %r1984; .reg .u64 %r1986; .reg .u64 %r1987; .reg .f64 %r1988; .reg .f64 %r1989; .reg .f64 %r1990; .reg .f64 %r1991; .reg .f64 %r1992; .reg .f64 %r1993; .reg .f64 %r1994; .reg .f64 %r1995; .reg .f64 %r1996; .reg .f64 %r1997; .reg .f64 %r1998; .reg .u64 %r2000; .reg .u64 %r2001; .reg .f64 %r2003; .reg .f64 %r2004; .reg .f64 %r2005; .reg .f64 %r2006; .reg .f64 %r2007; .reg .f64 %r2008; .reg .f64 %r2009; .reg .f64 %r2010; .reg .f64 %r2011; .reg .f64 %r2012; .reg .f64 %r2013; .reg .u64 %r2015; .reg .u64 %r2016; .reg .f64 %r2018; .reg .f64 %r2019; .reg .f64 %r2020; .reg .f64 %r2021; .reg .f64 %r2022; .reg .f64 %r2023; .reg .f64 %r2024; .reg .f64 %r2025; .reg .f64 %r2026; .reg .pred %r2027; .reg .f64 %r2028; .reg .f64 %r2029; .reg .f64 %r2030; .reg .f64 %r2031; .reg .f64 %r2032; .reg .f64 %r2033; .reg .f64 %r2034; .reg .f64 %r2035; .reg .f64 %r2036; .reg .f64 %r2038; .reg .f64 %r2039; .reg .f64 %r2040; .reg .f64 %r2041; .reg .f64 %r2042; .reg .f64 %r2043; .reg .f64 %r2044; .reg .f64 %r2045; .reg .f64 %r2046; .reg .f64 %r2048; .reg .f64 %r2049; .reg .f64 %r2050; .reg .f64 %r2051; .reg .f64 %r2052; .reg .f64 %r2053; .reg .f64 %r2054; .reg .f64 %r2055; .reg .f64 %r2056; .reg .pred %r2057; .reg .f64 %r2058; .reg .f64 %r2059; .reg .f64 %r2061; .reg .f64 %r2062; .reg .f64 %r2064; .reg .f64 %r2065; .reg .pred %r2066; .reg .f64 %r2067; .reg .f64 %r2068; .reg .f64 
%r2069; .reg .f64 %r2070; .reg .f64 %r2071; .reg .f64 %r2072; .reg .f64 %r2073; .reg .f64 %r2074; .reg .pred %r2075; .reg .u32 %r2076; .reg .u64 %r2077; .reg .u32 %r2079; .reg .u64 %r2080; .reg .u32 %r2082; .reg .u64 %r2083; .reg .pred %r2084; .reg .u32 %r2085; .reg .u64 %r2086; .reg .u64 %r2088; .reg .u32 %r2090; .reg .u64 %r2091; .reg .u64 %r2093; .reg .u32 %r2095; .reg .u64 %r2096; .reg .pred %r2097; .reg .f64 %r2098; .reg .f64 %r2099; .reg .f64 %r2100; .reg .f64 %r2101; .reg .f64 %r2102; .reg .f64 %r2103; .reg .f64 %r2104; .reg .f64 %r2108; .reg .f64 %r2109; .reg .f64 %r2110; .reg .f64 %r2111; .reg .f64 %r2112; .reg .f64 %r2113; .reg .f64 %r2114; .reg .f64 %r2115; .reg .f64 %r2116; .reg .f64 %r2118; .reg .f64 %r2119; .reg .f64 %r2120; .reg .f64 %r2121; .reg .f64 %r2122; .reg .f64 %r2123; .reg .f64 %r2124; .reg .f64 %r2125; .reg .f64 %r2126; .reg .pred %r2127; .reg .f64 %r2128; .reg .f64 %r2129; .reg .f64 %r2130; .reg .f64 %r2131; .reg .f64 %r2132; .reg .f64 %r2133; .reg .f64 %r2134; .reg .f64 %r2138; .reg .f64 %r2139; .reg .f64 %r2140; .reg .f64 %r2141; .reg .f64 %r2142; .reg .f64 %r2143; .reg .f64 %r2144; .reg .f64 %r2145; .reg .f64 %r2146; .reg .f64 %r2148; .reg .f64 %r2149; .reg .f64 %r2150; .reg .f64 %r2151; .reg .f64 %r2152; .reg .f64 %r2153; .reg .f64 %r2154; .reg .f64 %r2155; .reg .f64 %r2156; .reg .pred %r2157; .reg .f64 %r2158; .reg .f64 %r2159; .reg .f64 %r2160; .reg .f64 %r2161; .reg .f64 %r2162; .reg .f64 %r2163; .reg .f64 %r2164; .reg .f64 %r2168; .reg .f64 %r2169; .reg .f64 %r2170; .reg .f64 %r2171; .reg .f64 %r2172; .reg .f64 %r2173; .reg .f64 %r2174; .reg .f64 %r2175; .reg .f64 %r2176; .reg .f64 %r2178; .reg .f64 %r2179; .reg .f64 %r2180; .reg .f64 %r2181; .reg .f64 %r2182; .reg .f64 %r2183; .reg .f64 %r2184; .reg .f64 %r2185; .reg .f64 %r2186; .reg .pred %r2187; .reg .f64 %r2188; .reg .f64 %r2189; .reg .f64 %r2190; .reg .f64 %r2191; .reg .f64 %r2192; .reg .f64 %r2193; .reg .f64 %r2194; .reg .f64 %r2195; .reg .f64 %r2196; .reg .f64 %r2197; .reg .f64 %r2198; .reg .f64 %r2199; .reg .f64 %r2201; .reg .f64 %r2202; .reg .f64 %r2203; .reg .f64 %r2204; .reg .f64 %r2205; .reg .f64 %r2206; .reg .f64 %r2207; .reg .f64 %r2208; .reg .f64 %r2209; .reg .f64 %r2210; .reg .f64 %r2211; .reg .f64 %r2212; .reg .f64 %r2214; .reg .f64 %r2215; .reg .f64 %r2216; .reg .f64 %r2217; .reg .f64 %r2218; .reg .f64 %r2219; .reg .f64 %r2220; .reg .f64 %r2221; .reg .f64 %r2222; .reg .f64 %r2223; .reg .f64 %r2224; .reg .f64 %r2225; .reg .pred %r2226; .reg .pred %r2229; .reg .f64 %r2230; .reg .f64 %r2231; .reg .f64 %r2232; .reg .f64 %r2233; .reg .f64 %r2234; .reg .f64 %r2235; .reg .f64 %r2236; .reg .f64 %r2240; .reg .f64 %r2241; .reg .f64 %r2242; .reg .f64 %r2243; .reg .f64 %r2244; .reg .f64 %r2245; .reg .f64 %r2246; .reg .f64 %r2247; .reg .f64 %r2248; .reg .f64 %r2250; .reg .f64 %r2251; .reg .f64 %r2252; .reg .f64 %r2253; .reg .f64 %r2254; .reg .f64 %r2255; .reg .f64 %r2256; .reg .f64 %r2257; .reg .f64 %r2258; .reg .pred %r2259; mov.u64 %r945,%ar0; mov.u64 %r946,%ar1; mov.u64 %r947,%ar2; mov.u32 %r948,%ar3; mov.u32 %r949,%ar4; mov.u64 %r950,%ar5; .loc 1 2462 3 ld.s8 %r627,[%r946+28]; cvt.u16.u32 %r951,%r627; setp.eq.u16 %r952,%r951,2; @ %r952 bra $L2; ld.s8 %r1551,[%r947+28]; cvt.u16.u32 %r953,%r1551; setp.eq.u16 %r955,%r953,2; @ %r955 bra $L3; cvta.const.u64 %r959,$LC3; cvta.const.u64 %r958,__func__$0; mov.u32 %r957,2462; cvta.const.u64 %r995893: .loc 1 2476 6 ld.u64 %r960,[%r945]; setp.ne.u64 %r961,%r960,0; @ %r961 bra $L4; .loc 1 2478 10 setp.ne.u16 %r963,%r951,1; @ %r963 bra $L5; .loc 1 2480 4 
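// Assumption (inferred from the instructions around this point, not from any source comment):
// %ar0..%ar2 appear to be array-descriptor pointers, %ar3/%ar4 integer flags and a size limit,
// and %ar5 a function pointer that is later invoked indirectly through the $LCT0 prototype.
// The signed-byte loads at descriptor offset 28, compared against 1 and 2, look like rank
// checks that select the rank-1 versus rank-2 code paths of this matmul-style routine.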
st.u64 [%r945+48],%r960; ld.u64 %r966,[%r947+80]; ld.u64 %r968,[%r947+72]; sub.u64 %r969,%r966,%r968; st.u64 [%r945+56],%r969; mov.u64 %r970,1; st.u64 [%r945+40],%r970; bra $L6; $L123: .loc 1 2485 4 st.u64 [%r945+48],%r1512; add.u64 %r972,%r668,-1; st.u64 [%r945+56],%r972; mov.u64 %r973,1; st.u64 [%r945+40],%r973; bra $L6; $L124: .loc 1 2490 4 mov.u64 %r974,0; st.u64 [%r945+48],%r974; add.u64 %r975,%r668,-1; st.u64 [%r945+56],%r975; mov.u64 %r976,1; st.u64 [%r945+40],%r976; .loc 1 2493 11 st.u64 [%r945+72],%r974; ld.u64 %r979,[%r947+80]; ld.u64 %r981,[%r947+72]; sub.u64 %r982,%r979,%r981; st.u64 [%r945+80],%r982; st.u64 [%r945+64],%r668; $L6: .loc 1 2499 1945; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r984,[%value_in]; } .loc 1 2499 4 mov.u64 %r986,16986; call (%value_in),_gfortrani_xmallocarray987,[%value_in]; } .loc 1 2499 2 st.u64 [%r945],%r987; .loc 1 2500 24 mov.u64 %r989,0; st.u64 [%r945+8],%r989; .loc 1 2557 7 ld.s8 %r627,[%r946+28]; .loc 1 2564 15 ld.u64 %r991,[%r946+56]; add.u64 %r990,%r991,1; .loc 1 2564 13 ld.u64 %r992,[%r946+48]; sub.u64 %r638,%r990,%r992; bra $L7; $L4: .loc 1 2502 12 cvta.global.u64 %r993,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r994,[%r993+36]; setp.eq.u32 %r995,%r994,0; @ %r995 bra $L8; .loc 1 2509 17 ld.u64 %r997,[%r945+56]; add.u64 %r996,%r997,1; .loc 1 2509 15 ld.u64 %r998,[%r945+48]; sub.u64 %r638,%r996,%r998; .loc 1 2506 10 setp.ne.u16 %r1000,%r951,1; @ %r1000 bra $L9; .loc 1 2508 17 ld.u64 %r1002,[%r947+80]; add.u64 %r1001,%r1002,1; .loc 1 2508 15 ld.u64 %r1003,[%r947+72]; sub.u64 %r217,%r1001,%r1003; .loc 1 2510 7 setp.ne.u64 %r1004,%r217,%r638; @ %r1004 bra $L10; .loc 1 2564 15 ld.u64 %r1006,[%r946+56]; add.u64 %r1005,%r1006,1; .loc 1 2564 13 ld.u64 %r1007,[%r946+48]; sub.u64 %r638,%r1005,%r1007; .loc 1 2543 7 ld.s8 %r901,[%r945+28]; .loc 1 2548 27 ld.u64 %r844,[%r945+40]; .loc 1 2543 6 cvt.u16.u32 %r1008,%r901; setp.eq.u16 %r1009,%r1008,1; @ %r1009 bra $L11; .loc 1 2553 16 ld.u64 %r664,[%r945+64]; .loc 1 2560 16 ld.u64 %r298,[%r946+40]; .loc 1 2563 14 mov.u64 %r162,1; .loc 1 2561 16 mov.u64 %r150,%r162; bra $L12; $L10: .loc 1 2511 6 st.u64 [%stack+8],%r217; st.u64 [%stack],%r638; cvta.const.u64 %r10101010_gfortran_runtime_error2485 4 ld.u64 %r1013,[%r946+56]; add.u64 %r1012,%r1013,1; ld.u64 %r1014,[%r946+48]; sub.u64 %r672,%r1012,%r1014; .loc 1 2515 15 cvt.u16.u32 %r1015,%r1551; setp.ne.u16 %r1017,%r1015,1; @ %r1017 bra $L13; .loc 1 2519 7 setp.eq.u64 %r1018,%r672,%r638; @ %r1018 bra $L14; .loc 1 2520 6 st.u64 [%stack+8],%r672; st.u64 [%stack],%r638; cvta.const.u64 %r101101_gfortran_runtime_error13: .loc 1 2528 7 setp.eq.u64 %r1021,%r672,%r638; @ %r1021 bra $L15; .loc 1 2529 6 st.u64 [%stack+8],%r672; st.u64 [%stack],%r638; cvta.const.u64 %r102210stack; call _gfortran_runtime_error15: .loc 1 2533 17 ld.u64 %r1025,[%r947+80]; add.u64 %r1024,%r1025,1; .loc 1 2533 15 ld.u64 %r1026,[%r947+72]; sub.u64 %r215,%r1024,%r1026; .loc 1 2534 17 ld.u64 %r1028,[%r945+80]; add.u64 %r1027,%r1028,1; .loc 1 2534 15 ld.u64 %r1029,[%r945+72]; sub.u64 %r216,%r1027,%r1029; .loc 1 2535 7 setp.eq.u64 %r1030,%r215,%r216; @ %r1030 bra $L14; .loc 1 2536 6 st.u64 [%stack+8],%r215; st.u64 [%stack],%r216; cvta.const.u64 %r1031,$LC10stack; call _gfortran_runtime_error2564 15 ld.u64 %r1034,[%r946+56]; add.u64 %r1033,%r1034,1; .loc 1 2564 13 ld.u64 %r1035,[%r946+48]; sub.u64 %r638,%r1033,%r1035; $L7: .loc 1 2548 27 ld.u64 %r664,[%r945+40]; .loc 1 2543 6 ld.s8 %r1037,[%r945+28]; cvt.u16.u32 %r1036,%r1037; setp.eq.u16 %r1038,%r1036,1; @ %r1038 bra $L130; 
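// Assumption: when the data pointer at [%r945] is null, the preceding blocks fill in the
// return descriptor's bound/stride fields (offsets 40/48/56 and, for the rank-2 case,
// 64/72/80), then allocate the result through _gfortran_size0 followed by
// _gfortrani_xmallocarray, storing the new pointer back at [%r945] before joining at $L6/$L7.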
$L127: .loc 1 2553 16 mov.u64 %r844,%r664; ld.u64 %r664,[%r945+64]; bra $L16; $L130: .loc 1 2548 27 mov.u64 %r844,%r664; $L16: .loc 1 2560 16 ld.u64 %r298,[%r946+40]; .loc 1 2557 6 cvt.u16.u32 %r1039,%r627; setp.eq.u16 %r1040,%r1039,1; @ %r1040 bra $L131; $L125: .loc 1 2569 16 ld.u64 %r150,[%r946+64]; .loc 1 2571 15 ld.u64 %r1042,[%r946+80]; add.u64 %r1041,%r1042,1; .loc 1 2571 13 ld.u64 %r1043,[%r946+72]; mov.u64 %r162,%r638; sub.u64 %r638,%r1041,%r1043; bra $L12; $L131: .loc 1 2563 14 mov.u64 %r162,1; .loc 1 2561 16 mov.u64 %r150,%r162; $L12: .loc 1 2575 16 ld.u64 %r1045,[%r947+56]; add.u64 %r1044,%r1045,1; ld.u64 %r1046,[%r947+48]; sub.u64 %r65,%r1044,%r1046; .loc 1 2575 6 setp.eq.u64 %r1047,%r65,%r638; @ %r1047 bra $L17; .loc 1 2577 10 set.u32.gt.s64 %r1049,%r638,0; neg.s32 %r1050,%r1049; .loc 1 2577 21 set.u32.gt.s64 %r1052,%r65,0; neg.s32 %r1053,%r1052; cvt.u16.u32 %r1055,%r1050; cvt.u16.u32 %r1056,%r1053; or.b16 %r1054,%r1055,%r1056; cvt.u32.u16 %r1057,%r1054; cvt.u16.u8 %r1058,%r1057; setp.eq.u16 %r1059,%r1058,0; @ %r1059 bra $L17; .loc 1 2578 2 st.u64 [%stack+8],%r638; st.u64 [%stack],%r65; cvta.const.u64 %r10601060_gfortran_runtime_error17: .loc 1 2583 7 ld.s8 %r70,[%r947+28]; .loc 1 2586 16 ld.u64 %r663,[%r947+40]; .loc 1 2583 6 cvt.u16.u32 %r1062,%r70; setp.eq.u16 %r1063,%r1062,1; @ %r1063 bra $L132; .loc 1 2597 16 ld.u64 %r151,[%r947+64]; .loc 1 2598 16 ld.u64 %r1065,[%r947+80]; add.u64 %r1064,%r1065,1; .loc 1 2598 14 ld.u64 %r1066,[%r947+72]; sub.u64 %r163,%r1064,%r1066; bra $L18; $L132: .loc 1 2592 14 mov.u64 %r163,1; .loc 1 2591 16 mov.u64 %r151,256; $L18: .loc 1 2601 9 ld.u64 %r220,[%r946]; .loc 1 2602 9 ld.u64 %r221,[%r947]; .loc 1 2603 8 ld.u64 %r222,[%r945]; .loc 1 2612 28 set.u32.eq.u64 %r1068,%r844,1; neg.s32 %r1069,%r1068; cvt.u32.u32 %r1067,%r1069; cvt.u32.u8 %r75,%r1067; .loc 1 2612 46 set.u32.eq.u64 %r1071,%r298,1; neg.s32 %r1072,%r1071; cvt.u32.u32 %r1070,%r1072; cvt.u32.u8 %r662,%r1070; .loc 1 2612 7 set.u32.ne.u32 %r1074,%r948,0; neg.s32 %r1075,%r1074; .loc 1 2612 16 cvt.u16.u32 %r1077,%r1075; cvt.u16.u32 %r1078,%r75; and.b16 %r1076,%r1077,%r1078; .loc 1 2612 6 cvt.u32.u16 %r1079,%r1076; cvt.u16.u8 %r1080,%r1079; setp.eq.u16 %r1081,%r1080,0; @ %r1081 bra $L19; .loc 1 2612 63 set.u32.eq.u64 %r1083,%r150,1; neg.s32 %r1084,%r1083; .loc 1 2612 51 cvt.u16.u32 %r1086,%r1084; cvt.u16.u32 %r1087,%r662; or.b16 %r1085,%r1086,%r1087; .loc 1 2612 33 cvt.u32.u16 %r1088,%r1085; cvt.u16.u8 %r1089,%r1088; setp.eq.u16 %r1090,%r1089,0; @ %r1090 bra $L20; .loc 1 2613 20 set.u32.eq.u64 %r1092,%r663,1; neg.s32 %r1093,%r1092; .loc 1 2613 37 set.u32.eq.u64 %r1095,%r151,1; neg.s32 %r1096,%r1095; .loc 1 2613 25 cvt.u16.u32 %r1098,%r1093; cvt.u16.u32 %r1099,%r1096; or.b16 %r1097,%r1098,%r1099; .loc 1 2613 7 cvt.u32.u16 %r1100,%r1097; cvt.u16.u8 %r1101,%r1100; setp.eq.u16 %r1102,%r1101,0; @ %r1102 bra $L20; .loc 1 2615 13 cvt.rn.f32.s32 %r86,%r949; .loc 1 2614 12 cvt.rn.f32.s64 %r1103,%r162; .loc 1 2614 50 cvt.rn.f32.s64 %r1104,%r638; .loc 1 2614 47 mul.f32 %r1105,%r1103,%r1104; .loc 1 2614 31 cvt.rn.f32.s64 %r1106,%r163; .loc 1 2614 47 mul.f32 %r1107,%r1105,%r1106; .loc 1 2615 13 mul.f32 %r1108,%r86,%r86; mul.f32 %r1109,%r1108,%r86; .loc 1 2614 7 setp.gt.f32 %r1110,%r1107,%r1109; @ ! 
%r1110 bra $L19; .loc 1 2617 17 cvt.u32.u64 %r90,%r162; st.u32 [%frame+52],%r90; .loc 1 2617 29 cvt.u32.u64 %r91,%r163; st.u32 [%frame+48],%r91; .loc 1 2617 41 cvt.u32.u64 %r92,%r638; st.u32 [%frame+44],%r92; .loc 1 2617 52 cvt.u32.u64 %r93,%r664; st.u32 [%frame+40],%r93; .loc 1 2618 27 ld.const.f64 %r1111,[$LC8]; st.f64 [%frame+16],%r1111; ld.const.f64 %r1112,[$LC8+8]; st.f64 [%frame+24],%r1112; .loc 1 2618 36 ld.const.f64 %r1545,[$LC9]; st.f64 [%frame],%r1545; ld.const.f64 %r1535,[$LC9+8]; st.f64 [%frame+8],%r1535; .loc 1 2619 17 setp.ne.u64 %r1115,%r298,1; @ %r1115 bra $L22; cvt.u32.u64 %r198,%r150; bra $L23; $L22: cvt.u32.u64 %r198,%r298; $L23: st.u32 [%frame+36],%r198; .loc 1 2620 3 setp.ne.u64 %r1116,%r663,1; @ %r1116 bra $L24; cvt.u32.u64 %r199,%r151; bra $L25; $L24: cvt.u32.u64 %r199,%r663; $L25: st.u32 [%frame+32],%r199; .loc 1 2622 15 set.u32.gt.s32 %r1118,%r198,0; neg.s32 %r1119,%r1118; .loc 1 2622 26 set.u32.gt.s32 %r1121,%r199,0; neg.s32 %r1122,%r1121; .loc 1 2622 19 cvt.u16.u32 %r1124,%r1119; cvt.u16.u32 %r1125,%r1122; and.b16 %r1123,%r1124,%r1125; .loc 1 2622 10 cvt.u32.u16 %r1126,%r1123; cvt.u16.u8 %r1127,%r1126; setp.eq.u16 %r1128,%r1127,0; @ %r1128 bra $L19; .loc 1 2622 37 set.u32.gt.s32 %r1130,%r93,0; neg.s32 %r1131,%r1130; .loc 1 2622 46 set.u32.gt.s32 %r1133,%r90,1; neg.s32 %r1134,%r1133; .loc 1 2622 41 cvt.u16.u32 %r1136,%r1131; cvt.u16.u32 %r1137,%r1134; and.b16 %r1135,%r1136,%r1137; cvt.u32.u16 %r1138,%r1135; cvt.u16.u8 %r1139,%r1138; setp.eq.u16 %r1140,%r1139,0; @ %r1140 bra $L19; .loc 1 2622 55 set.u32.gt.s32 %r1142,%r91,1; neg.s32 %r1143,%r1142; .loc 1 2622 64 set.u32.gt.s32 %r1145,%r92,1; neg.s32 %r1146,%r1145; .loc 1 2622 59 cvt.u16.u32 %r1148,%r1143; cvt.u16.u32 %r1149,%r1146; and.b16 %r1147,%r1148,%r1149; cvt.u32.u16 %r1150,%r1147; cvt.u16.u8 %r1151,%r1150; setp.eq.u16 %r1152,%r1151,0; @ %r1152 bra $L19; .loc 1 2624 4 setp.ne.u64 %r1153,%r950,0; @ %r1153 bra $L27; cvta.const.u64 %r1157,$LC10; cvta.const.u64 %r1156,__func__$0; mov.u32 %r1155,2624; cvta.const.u64 %r11115415511561127: .loc 1 2626 17 and.b32 %r1158,%r948,2; .loc 1 2626 7 setp.ne.u32 %r1159,%r1158,0; @ %r1159 bra $L133; .loc 1 2629 35 setp.eq.u64 %r1160,%r298,1; @ %r1160 bra $L134; cvta.const.u64 %r164,$LC2; bra $L28; $L133: .loc 1 2627 13 cvta.const.u64 %r164,$LC1; bra $L28; $L134: .loc 1 2629 35 cvta.const.u64 %r164,$LC0; $L28: .loc 1 2631 17 and.b32 %r1161,%r948,4; .loc 1 2631 7 setp.ne.u32 %r1162,%r1161,0; @ %r1162 bra $L135; .loc 1 2634 35 setp.eq.u64 %r1163,%r663,1; @ %r1163 bra $L136; cvta.const.u64 %r165,$LC2; bra $L29; $L135: .loc 1 2632 13 cvta.const.u64 %r165,$LC1; bra $L29; $L136: .loc 1 2634 35 cvta.const.u64 %r165,$LC0; $L29: .loc 1 2636 4 add.u64 %r1179,%frame,40; add.u64 %r1180,%frame,32; add.u64 %r1181,%frame,36; add.u64 %r1182,%frame,16; add.u64 %r1183,%frame,44; add.u64 %r1184,%frame,48; add.u64 %r1185,%frame,52; mov.u32 %r1178,1;,.param .u64 %in_ar3,.param .u64 %in_ar4,.param .u64 %in_ar5,.param .u64 %in_ar6,.param .u64 %in_ar7,.param .u64 %in_ar8,.param .u64 %in_ar9,.param .u64 %in_ar10,.param .u64 %in_ar11,.param .u64 %in_ar12,.param .u32 %in_ar13,.param .u32 %in_ar14164r118118411831182220; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r1181; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r221; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r1180; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%frame; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r222; .param .u64 %out_arg13; st.param.u64 [%out_arg13],%r1179; .param .u32 %out_arg14; st.param.u32 
[%out_arg14],%r1178; .param .u32 %out_arg15; st.param.u32 [%out_arg15],%r1178; call %r950,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12,%out_arg13,%out_arg14,%out_arg15),$LCT0; } bra $L1; $L19: .loc 1 2643 21 cvt.u16.u32 %r1187,%r75; cvt.u16.u32 %r1188,%r662; and.b16 %r1186,%r1187,%r1188; .loc 1 2643 6 cvt.u32.u16 %r1189,%r1186; cvt.u16.u8 %r1190,%r1189; setp.eq.u16 %r1191,%r1190,0; @ %r1191 bra $L31; .loc 1 2643 38 setp.ne.u64 %r1192,%r663,1; @ %r1192 bra $L20; .loc 1 2644 7 cvt.u16.u32 %r1193,%r70; setp.eq.u16 %r1194,%r1193,1; @ %r1194 bra $L32; .loc 1 2677 16 add.u64 %r1195,%r664,1; .loc 1 2678 9 shl.b64 %r111,%r1195,4; sub.u64 %r224,%r222,%r111; .loc 1 2687 7 setp.le.s64 %r1196,%r163,0; @ %r1196 bra $L33; add.u64 %r802,%r111,-16; mov.u64 %r800,%r222; add.u64 %r806,%r163,1; .loc 1 2689 22 shl.b64 %r623,%r162,4; .loc 1 2687 13 mov.u64 %r255,%r663; setp.gt.s64 %r1556,%r162,0; and.b64 %r1636,%r163,3; setp.eq.u64 %r1639,%r1636,0; @ %r1639 bra $L34; setp.eq.u64 %r1638,%r1636,1; @ %r1638 bra $L282; setp.eq.u64 %r1637,%r1636,2; @ %r1637 bra $L283; bra $L338; $L35: .loc 1 2689 22 mov.u32 %r12002001202,[%value_in]; } $L36: .loc 1 2687 24 add.u64 %r1640,%r255,1; .loc 1 2687 7 add.u64 %r1695,%r800,%r802; .loc 1 2688 2 @ %r1556 bra $L220; bra $L339; $L34: @ %r1556 bra $L35; bra $L36; $L33: .loc 1 2692 28 set.u32.eq.u64 %r1207,%r638,0; neg.s32 %r1208,%r1207; .loc 1 2692 13 set.u32.eq.u64 %r1210,%r162,0; neg.s32 %r1211,%r1210; .loc 1 2692 28 cvt.u16.u32 %r1213,%r1208; cvt.u16.u32 %r1214,%r1211; or.b16 %r1212,%r1213,%r1214; .loc 1 2692 23 set.u32.eq.u64 %r1216,%r163,0; neg.s32 %r1217,%r1216; .loc 1 2692 28 cvt.u16.u32 %r1221,%r1217; or.b16 %r1219,%r1212,%r1221; cvt.u32.u16 %r1222,%r1219; cvt.u16.u8 %r1223,%r1222; setp.eq.u16 %r1224,%r1223,0; @ ! %r1224 bra $L1; .loc 1 2697 10 setp.eq.u64 %r1225,%r150,1; selp.u64 %r196,%r664,%r150,%r1225; .loc 1 2702 21 shl.b64 %r1226,%r196,8; .loc 1 2702 14 add.u64 %r229,%r1226,%r151; .loc 1 2706 12 min.s64 %r1228,%r229,65536; shl.b64 %r1229,%r12282291230,[%value_in]; } .loc 1 2710 7 setp.gt.s64 %r1232,%r163,0; @ %r1232 bra $L39; $L42: .loc 1 29302918 7 bra $L1; $L39: .loc 1 2680 16 add.u64 %r1234,%r150,1; .loc 1 2681 9 shl.b64 %r1235,%r1234,4; sub.u64 %r226,%r220,%r1235; .loc 1 2683 16 add.u64 %r1236,%r151,1; .loc 1 2684 9 shl.b64 %r117,%r1236,4; shl.b64 %r769,%r664,13; shl.b64 %r767,%r664,5; shl.b64 %r771,%r664,6; neg.s64 %r1238,%r664; shl.b64 %r1239,%r1238,13; shl.b64 %r772,%r1238,4; shl.b64 %r775,%r664,9; shl.b64 %r783,%r151,13; add.u64 %r784,%r117,-16; sub.u64 %r785,%r221,%r117; add.u64 %r782,%r784,%r785; add.u64 %r1244,%r151,%r151; add.u64 %r1245,%r1244,%r151; shl.b64 %r1246,%r1245,4; add.u64 %r786,%r1246,%r785; add.u64 %r1247,%r163,-1; and.b64 %r793,%r1247,-512; shl.b64 %r749,%r150,8; add.u64 %r750,%r150,%r150; add.u64 %r797,%r111,-16; sub.u64 %r798,%r224,%r797; shl.b64 %r710,%r150,5; shl.b64 %r1251,%r1238,6; shl.b64 %r67,%r151,6; mov.u64 %r774,%r664; mov.u64 %r770,%r771; mov.u64 %r765,0; setp.gt.s64 %r1553,%r638,0; add.u64 %r1590,%r798,16; setp.gt.s64 %r1591,%r162,0; add.u64 %r1592,%r162,1; add.u64 %r1593,%r226,32; bra $L41; $L138: mov.u64 %r765,%r766; $L41: add.u64 %r795,%r765,1; .loc 1 2715 9 sub.u64 %r231,%r163,%r765; min.s64 %r1696,%r231,512; .loc 1 2716 24 shr.s64 %r1253,%r1696,63; shr.u64 %r1254,%r1253,62; add.u64 %r1255,%r1696,%r1254; and.b64 %r1256,%r1255,3; sub.u64 %r1257,%r1256,%r1254; .loc 1 2716 10 sub.u64 %r232,%r1696,%r1257; .loc 1 2718 4 @ %r1553 bra $L40; $L45: .loc 1 2710 7 add.u64 %r766,%r765,512; add.u64 %r767,%r767,%r769; add.u64 
%r770,%r770,%r769; add.u64 %r772,%r772,%r1239; add.u64 %r774,%r774,%r775; add.u64 %r782,%r782,%r783; add.u64 %r786,%r786,%r783; setp.ne.u64 %r1259,%r765,%r793; @ %r1259 bra $L138; bra $L42; $L40: .loc 1 2768 13 add.u64 %r644,%r232,%r795; add.u64 %r799,%r1590,%r767; add.u64 %r760,%r782,%r784; add.u64 %r763,%r784,%r786; add.u64 %r259,%r767,%r797; mul.lo.u64 %r583,%r644,%r664; mul.lo.u64 %r577,%r644,%r151; mad.lo.u64 %r756,%r644,%r784,%r785; mov.u64 %r747,%r750; .loc 1 2718 12 mov.u64 %r235,%r663; add.u64 %r1552,%r638,1; .loc 1 2768 8 add.u64 %r1583,%r644,-1; setp.ge.s64 %r1585,%r1583,%r795; .loc 1 2873 17 add.u64 %r1588,%r1696,%r795; .loc 1 2874 9 setp.le.s64 %r1589,%r1588,%r644; $L44: .loc 1 2723 13 sub.u64 %r233,%r1552,%r235; min.s64 %r1698,%r233,256; .loc 1 2724 28 shr.u64 %r1265,%r1698,63; add.u64 %r1266,%r1698,%r1265; and.b64 %r1267,%r1266,1; sub.u64 %r1268,%r1267,%r1265; .loc 1 2724 14 sub.u64 %r234,%r1698,%r1268; .loc 1 2727 8 @ %r1591 bra $L43; $L73: .loc 1 2718 30 add.u64 %r235,%r235,256; .loc 1 2718 4 add.u64 %r747,%r747,%r749; setp.ge.s64 %r1270,%r638,%r235; @ %r1270 bra $L44; bra $L45; $L43: .loc 1 2763 26 add.u64 %r1271,%r235,-1; add.u64 %r1272,%r1271,%r1698; .loc 1 2763 31 mul.lo.u64 %r146,%r1272,%r150; .loc 1 2734 13 add.u64 %r643,%r234,%r235; .loc 1 2734 8 add.u64 %r645,%r643,-1; add.u64 %r1273,%r146,1; shl.b64 %r1274,%r1273,4; add.u64 %r737,%r226,%r1274; shl.b64 %r679,%r1698,4; add.u64 %r1275,%r679,-16; add.u64 %r678,%r1230,%r1275; add.u64 %r1276,%r679,-8; add.u64 %r683,%r1230,%r1276; shl.b64 %r758,%r235,4; add.u64 %r759,%r758,%r782; add.u64 %r761,%r758,%r760; add.u64 %r762,%r758,%r786; add.u64 %r764,%r758,%r763; .loc 1 2904 18 add.u64 %r1546,%r1698,%r235; add.u64 %r757,%r756,%r758; add.u64 %r1277,%r577,%r235; add.u64 %r1278,%r1277,%r1698; shl.b64 %r1279,%r1278,4; add.u64 %r574,%r1279,%r785; mov.u64 %r736,%r799; .loc 1 2727 16 mov.u64 %r248,%r663; setp.le.s64 %r1536,%r235,%r645; setp.gt.s64 %r1548,%r1698,%r234; setp.gt.s64 %r1547,%r1696,%r232; not.b64 %r1576,%r235; add.u64 %r1577,%r1576,%r643; and.b64 %r1578,%r1577,-2; .loc 1 2904 13 add.u64 %r1580,%r1546,-1; setp.gt.s64 %r1581,%r235,%r1580; $L72: .loc 1 2732 10 sub.u64 %r236,%r1592,%r248; min.s64 %r1699,%r236,256; .loc 1 2733 25 shr.s64 %r1533,%r1699,63; shr.u64 %r1283,%r1699,63; add.u64 %r1284,%r1699,%r1283; and.b64 %r1285,%r1284,1; sub.u64 %r1286,%r1285,%r1283; .loc 1 2733 11 sub.u64 %r237,%r1699,%r1286; .loc 1 2735 5 @ %r1536 bra $L46; $L55: .loc 1 2757 8 @ ! 
%r1548 bra $L48; bra $L47; $L46: .loc 1 2737 17 add.u64 %r647,%r237,%r248; .loc 1 2737 12 add.u64 %r648,%r647,-1; sub.u64 %r1289,%r747,%r150; add.u64 %r1290,%r1289,%r248; add.u64 %r1291,%r1290,%r1699; shl.b64 %r1292,%r1291,4; add.u64 %r1293,%r1292,-16; add.u64 %r709,%r226,%r1293; shl.b64 %r1295,%r1699,12; add.u64 %r1296,%r1295,-4096; add.u64 %r716,%r1230,%r1296; add.u64 %r1297,%r248,%r747; add.u64 %r1298,%r1297,%r1699; shl.b64 %r1299,%r1298,4; add.u64 %r1300,%r1299,-16; add.u64 %r721,%r226,%r1300; neg.s64 %r1304,%r1699; shl.b64 %r1305,%r1304,4; mov.u64 %r726,%r747; mov.u64 %r707,0; setp.le.s64 %r1542,%r248,%r648; setp.gt.s64 %r1549,%r1699,%r237; add.u64 %r1570,%r1305,16; not.b64 %r1571,%r248; add.u64 %r1572,%r1571,%r647; and.b64 %r1573,%r1572,-2; add.u64 %r1574,%r1573,%r248; bra $L54; $L139: mov.u64 %r707,%r708; $L54: .loc 1 2738 9 @ %r1542 bra $L49; $L53: .loc 1 2749 12 @ %r1549 bra $L50; bra $L51; $L49: add.u64 %r694,%r1570,%r709; shl.b64 %r1308,%r707,4; add.u64 %r695,%r1230,%r1308; add.u64 %r696,%r1570,%r721; add.u64 %r1313,%r1574,%r726; shl.b64 %r1314,%r1313,4; add.u64 %r706,%r1314,%r1593; sub.u64 %r1625,%r706,%r696; add.u64 %r1626,%r1625,-32; shr.u64 %r1624,%r1626,5; and.b64 %r1628,%r1624,1; setp.ne.u64 %r1629,%r1628,0; @ ! %r1629 bra $L340; $L52: ld.f64 %r192,[%r694+8]; .loc 1 2740 49 ld.f64 %r1316,[%r694]; st.f64 [%r695],%r1316; st.f64 [%r695+8],%r192; ld.f64 %r182,[%r696+8]; .loc 1 2742 49 ld.f64 %r1317,[%r696]; st.f64 [%r695+16],%r1317; st.f64 [%r695+24],%r182; ld.f64 %r180,[%r694+24]; .loc 1 2744 49 ld.f64 %r1318,[%r694+16]; st.f64 [%r695+4096],%r1318; st.f64 [%r695+4104],%r180; ld.f64 %r178,[%r696+24]; .loc 1 2746 49 ld.f64 %r1319,[%r696+16]; st.f64 [%r695+4112],%r1319; st.f64 [%r695+4120],%r178; .loc 1 2738 9 add.u64 %r1630,%r694,32; add.u64 %r1631,%r695,8192; add.u64 %r1632,%r696,32; ld.f64 %r1703,[%r1630+8]; .loc 1 2740 49 ld.f64 %r1704,[%r1630]; st.f64 [%r1631],%r1704; st.f64 [%r1631+8],%r1703; ld.f64 %r1705,[%r1632+8]; .loc 1 2742 49 ld.f64 %r1706,[%r1632]; st.f64 [%r1631+16],%r1706; st.f64 [%r1631+24],%r1705; ld.f64 %r1707,[%r1630+24]; .loc 1 2744 49 ld.f64 %r1708,[%r1630+16]; st.f64 [%r1631+4096],%r1708; st.f64 [%r1631+4104],%r1707; ld.f64 %r1709,[%r1632+24]; .loc 1 2746 49 ld.f64 %r1710,[%r1632+16]; st.f64 [%r1631+4112],%r1710; st.f64 [%r1631+4120],%r1709; .loc 1 2738 9 add.u64 %r694,%r694,64; add.u64 %r695,%r695,16384; add.u64 %r696,%r696,64; setp.ne.u64 %r1711,%r696,%r706; @ %r1711 bra $L52; bra $L53; $L50: ld.f64 %r174,[%r709+8]; .loc 1 2751 41 ld.f64 %r1321,[%r709]; st.f64 [%r716],%r1321; st.f64 [%r716+8],%r174; ld.f64 %r171,[%r721+8]; .loc 1 2753 41 ld.f64 %r1322,[%r721]; st.f64 [%r716+16],%r1322; st.f64 [%r716+24],%r171; $L51: .loc 1 2735 5 add.u64 %r708,%r707,2; add.u64 %r709,%r709,%r710; add.u64 %r716,%r716,32; add.u64 %r721,%r721,%r710; add.u64 %r726,%r726,%r750; setp.ne.u64 %r1323,%r707,%r1578; @ %r1323 bra $L139; bra $L55; $L48: .loc 1 2767 25 shr.u64 %r1326,%r1533,62; add.u64 %r1327,%r1699,%r1326; and.b64 %r1328,%r1327,3; sub.u64 %r1329,%r1328,%r1326; .loc 1 2767 11 sub.u64 %r240,%r1699,%r1329; .loc 1 2769 5 @ %r1585 bra $L56; bra $L57; $L47: .loc 1 2759 17 add.u64 %r131,%r1699,%r248; .loc 1 2759 12 add.u64 %r1331,%r131,-1; .loc 1 2760 9 setp.lt.s64 %r1332,%r1331,%r248; @ %r1332 bra $L48; add.u64 %r1333,%r131,%r146; shl.b64 %r1334,%r1333,4; add.u64 %r692,%r226,%r1334; mov.u64 %r682,%r683; mov.u64 %r677,%r678; mov.u64 %r676,%r737; sub.u64 %r1614,%r692,%r737; add.u64 %r1615,%r1614,-16; shr.u64 %r1613,%r1615,4; add.u64 %r1616,%r1613,1; and.b64 %r1617,%r1616,3; 
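// Assumption: the and.b64 ...,3 / compare-against-0,1,2 sequences here and below are
// remainder dispatch for loops that have been unrolled four times; the peeled copies of
// the loop bodies live at the $L28x and $L33x-$L35x labels near the end of the function.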
setp.eq.u64 %r1620,%r1617,0; @ %r1620 bra $L58; setp.eq.u64 %r1619,%r1617,1; @ %r1619 bra $L284; setp.eq.u64 %r1618,%r1617,2; @ %r1618 bra $L285; bra $L341; $L58: ld.f64 %r168,[%r676+8]; .loc 1 2762 43 ld.f64 %r1335,[%r676]; st.f64 [%r677],%r1335; st.f64 [%r682],%r168; .loc 1 2760 9 add.u64 %r1621,%r676,16; ld.f64 %r1715,[%r1621+8]; .loc 1 2762 43 ld.f64 %r1716,[%r1621]; st.f64 [%r677+4096],%r1716; st.f64 [%r682+4096],%r1715; .loc 1 2760 9 add.u64 %r1717,%r676,32; ld.f64 %r1721,[%r1717+8]; .loc 1 2762 43 ld.f64 %r1722,[%r1717]; st.f64 [%r677+8192],%r1722; st.f64 [%r682+8192],%r1721; .loc 1 2760 9 add.u64 %r1723,%r676,48; ld.f64 %r1727,[%r1723+8]; .loc 1 2762 43 ld.f64 %r1728,[%r1723]; st.f64 [%r677+12288],%r1728; st.f64 [%r682+12288],%r1727; .loc 1 2760 9 add.u64 %r676,%r676,64; add.u64 %r677,%r677,16384; add.u64 %r682,%r682,16384; setp.ne.u64 %r1729,%r676,%r692; @ %r1729 bra $L58; bra $L48; $L57: .loc 1 2871 8 @ ! %r1547 bra $L60; bra $L59; $L56: .loc 1 2876 14 add.u64 %r650,%r240,%r248; .loc 1 2876 9 add.u64 %r651,%r650,-1; add.u64 %r1338,%r650,%r774; shl.b64 %r1339,%r1338,4; add.u64 %r140,%r224,%r1339; add.u64 %r639,%r736,%r797; add.u64 %r637,%r639,%r797; add.u64 %r51,%r637,%r797; not.b64 %r1340,%r248; add.u64 %r1341,%r1340,%r650; shr.u64 %r1342,%r1341,2; add.u64 %r1343,%r1342,1; shl.b64 %r244,%r1343,6; sub.u64 %r1344,%r650,%r248; add.u64 %r1346,%r1344,1; shl.b64 %r1347,%r1346,12; add.u64 %r552,%r1230,%r1347; shl.b64 %r238,%r1699,4; mov.u64 %r640,%r736; mov.u64 %r641,%r764; mov.u64 %r642,%r762; mov.u64 %r649,%r761; mov.u64 %r88,%r759; mov.u64 %r200,%r772; mov.u64 %r250,%r795; mov.u64 %r49,0; setp.le.s64 %r1543,%r248,%r651; setp.gt.s64 %r1550,%r1699,%r240; $L68: .loc 1 2772 9 @ %r1543 bra $L61; $L67: .loc 1 2843 12 @ ! %r1550 bra $L63; bra $L62; $L61: add.u64 %r506,%r1230,4096; add.u64 %r504,%r1230,8192; add.u64 %r502,%r1230,12288; add.u64 %r500,%r1230,16384; .loc 1 2904 13 mov.u64 %r498,0; $L66: add.u64 %r496,%r640,%r498; ld.f64 %r184,[%r496]; ld.f64 %r281,[%r496+8]; ld.f64 %r201,[%r496+16]; ld.f64 %r189,[%r496+24]; add.u64 %r483,%r639,%r498; ld.f64 %r282,[%r483]; ld.f64 %r45,[%r483+8]; ld.f64 %r278,[%r483+16]; ld.f64 %r270,[%r483+24]; add.u64 %r472,%r637,%r498; ld.f64 %r280,[%r472]; ld.f64 %r279,[%r472+8]; ld.f64 %r300,[%r472+16]; ld.f64 %r301,[%r472+24]; add.u64 %r286,%r51,%r498; ld.f64 %r296,[%r286]; ld.f64 %r299,[%r286+8]; ld.f64 %r302,[%r286+16]; ld.f64 %r303,[%r286+24]; ld.f64 %r272,[%r496+32]; ld.f64 %r273,[%r496+40]; ld.f64 %r284,[%r496+48]; ld.f64 %r283,[%r496+56]; ld.f64 %r290,[%r483+32]; ld.f64 %r274,[%r483+40]; ld.f64 %r276,[%r483+48]; ld.f64 %r275,[%r483+56]; ld.f64 %r304,[%r472+32]; ld.f64 %r305,[%r472+40]; ld.f64 %r308,[%r472+48]; ld.f64 %r309,[%r472+56]; ld.f64 %r306,[%r286+32]; ld.f64 %r307,[%r286+40]; ld.f64 %r310,[%r286+48]; ld.f64 %r311,[%r286+56]; .loc 1 2791 6 @ %r1581 bra $L64; mov.u64 %r523,0; $L65: add.u64 %r521,%r506,%r523; ld.f64 %r312,[%r521+-4096]; ld.f64 %r313,[%r521+-4088]; add.u64 %r519,%r88,%r523; ld.f64 %r314,[%r519]; ld.f64 %r315,[%r519+8]; .loc 1 2794 11 mul.f64 %r317,%r313,%r314; mul.f64 %r1351,%r313,%r315; neg.f64 %r1352,%r1351; fma.rn.f64 %r318,%r312,%r314,%r1352; fma.rn.f64 %r319,%r312,%r315,%r317; .loc 1 2793 14 add.f64 %r184,%r184,%r318; add.f64 %r281,%r281,%r319; add.u64 %r517,%r504,%r523; ld.f64 %r320,[%r517+-4096]; ld.f64 %r321,[%r517+-4088]; .loc 1 2796 11 mul.f64 %r323,%r315,%r320; mul.f64 %r1353,%r315,%r321; neg.f64 %r1354,%r1353; fma.rn.f64 %r324,%r314,%r320,%r1354; fma.rn.f64 %r325,%r314,%r321,%r323; .loc 1 2795 14 add.f64 
%r201,%r201,%r324; add.f64 %r189,%r189,%r325; add.u64 %r515,%r649,%r523; ld.f64 %r326,[%r515]; ld.f64 %r327,[%r515+8]; .loc 1 2798 11 mul.f64 %r329,%r313,%r326; mul.f64 %r1355,%r313,%r327; neg.f64 %r1356,%r1355; fma.rn.f64 %r330,%r312,%r326,%r1356; fma.rn.f64 %r331,%r312,%r327,%r329; .loc 1 2797 14 add.f64 %r282,%r282,%r330; add.f64 %r45,%r45,%r331; .loc 1 2800 11 mul.f64 %r333,%r321,%r326; mul.f64 %r1357,%r321,%r327; neg.f64 %r1358,%r1357; fma.rn.f64 %r334,%r320,%r326,%r1358; fma.rn.f64 %r335,%r320,%r327,%r333; .loc 1 2799 14 add.f64 %r278,%r278,%r334; add.f64 %r270,%r270,%r335; add.u64 %r513,%r642,%r523; ld.f64 %r336,[%r513]; ld.f64 %r337,[%r513+8]; .loc 1 2802 11 mul.f64 %r339,%r313,%r336; mul.f64 %r1359,%r313,%r337; neg.f64 %r1360,%r1359; fma.rn.f64 %r340,%r312,%r336,%r1360; fma.rn.f64 %r341,%r312,%r337,%r339; .loc 1 2801 14 add.f64 %r280,%r280,%r340; add.f64 %r279,%r279,%r341; .loc 1 2804 11 mul.f64 %r343,%r321,%r336; mul.f64 %r1361,%r321,%r337; neg.f64 %r1362,%r1361; fma.rn.f64 %r344,%r320,%r336,%r1362; fma.rn.f64 %r345,%r320,%r337,%r343; .loc 1 2803 14 add.f64 %r300,%r300,%r344; add.f64 %r301,%r301,%r345; add.u64 %r511,%r641,%r523; ld.f64 %r346,[%r511]; ld.f64 %r347,[%r511+8]; .loc 1 2806 11 mul.f64 %r349,%r313,%r346; mul.f64 %r1363,%r313,%r347; neg.f64 %r1364,%r1363; fma.rn.f64 %r350,%r312,%r346,%r1364; fma.rn.f64 %r351,%r312,%r347,%r349; .loc 1 2805 14 add.f64 %r296,%r296,%r350; add.f64 %r299,%r299,%r351; .loc 1 2808 11 mul.f64 %r353,%r321,%r346; mul.f64 %r1365,%r321,%r347; neg.f64 %r1366,%r1365; fma.rn.f64 %r354,%r320,%r346,%r1366; fma.rn.f64 %r355,%r320,%r347,%r353; .loc 1 2807 14 add.f64 %r302,%r302,%r354; add.f64 %r303,%r303,%r355; add.u64 %r509,%r502,%r523; ld.f64 %r356,[%r509+-4096]; ld.f64 %r357,[%r509+-4088]; .loc 1 2810 11 mul.f64 %r359,%r315,%r356; mul.f64 %r1367,%r315,%r357; neg.f64 %r1368,%r1367; fma.rn.f64 %r360,%r314,%r356,%r1368; fma.rn.f64 %r361,%r314,%r357,%r359; .loc 1 2809 14 add.f64 %r272,%r272,%r360; add.f64 %r273,%r273,%r361; add.u64 %r507,%r500,%r523; ld.f64 %r362,[%r507+-4096]; ld.f64 %r363,[%r507+-4088]; .loc 1 2812 11 mul.f64 %r365,%r315,%r362; mul.f64 %r1369,%r315,%r363; neg.f64 %r1370,%r1369; fma.rn.f64 %r366,%r314,%r362,%r1370; fma.rn.f64 %r367,%r314,%r363,%r365; .loc 1 2811 14 add.f64 %r284,%r284,%r366; add.f64 %r283,%r283,%r367; .loc 1 2814 11 mul.f64 %r369,%r327,%r356; mul.f64 %r1371,%r327,%r357; neg.f64 %r1372,%r1371; fma.rn.f64 %r370,%r326,%r356,%r1372; fma.rn.f64 %r371,%r326,%r357,%r369; .loc 1 2813 14 add.f64 %r290,%r290,%r370; add.f64 %r274,%r274,%r371; .loc 1 2816 11 mul.f64 %r373,%r327,%r362; mul.f64 %r1373,%r327,%r363; neg.f64 %r1374,%r1373; fma.rn.f64 %r374,%r326,%r362,%r1374; fma.rn.f64 %r375,%r326,%r363,%r373; .loc 1 2815 14 add.f64 %r276,%r276,%r374; add.f64 %r275,%r275,%r375; .loc 1 2818 11 mul.f64 %r377,%r337,%r356; mul.f64 %r1375,%r337,%r357; neg.f64 %r1376,%r1375; fma.rn.f64 %r378,%r336,%r356,%r1376; fma.rn.f64 %r379,%r336,%r357,%r377; .loc 1 2817 14 add.f64 %r304,%r304,%r378; add.f64 %r305,%r305,%r379; .loc 1 2820 11 mul.f64 %r381,%r337,%r362; mul.f64 %r1377,%r337,%r363; neg.f64 %r1378,%r1377; fma.rn.f64 %r382,%r336,%r362,%r1378; fma.rn.f64 %r383,%r336,%r363,%r381; .loc 1 2819 14 add.f64 %r308,%r308,%r382; add.f64 %r309,%r309,%r383; .loc 1 2822 11 mul.f64 %r385,%r347,%r356; mul.f64 %r1379,%r347,%r357; neg.f64 %r1380,%r1379; fma.rn.f64 %r386,%r346,%r356,%r1380; fma.rn.f64 %r387,%r346,%r357,%r385; .loc 1 2821 14 add.f64 %r306,%r306,%r386; add.f64 %r307,%r307,%r387; .loc 1 2824 11 mul.f64 %r389,%r347,%r362; mul.f64 %r1381,%r347,%r363; 
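// The recurring mul/neg/fma.rn groups in this region are double-precision complex
// multiply-accumulates: each 16-byte element is an (re, im) pair of f64 values, and the
// accumulators are updated as re += a_re*b_re - a_im*b_im, im += a_re*b_im + a_im*b_re.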
neg.f64 %r1382,%r1381; fma.rn.f64 %r390,%r346,%r362,%r1382; fma.rn.f64 %r391,%r346,%r363,%r389; .loc 1 2823 14 add.f64 %r310,%r310,%r390; add.f64 %r311,%r311,%r391; .loc 1 2791 6 add.u64 %r523,%r523,16; setp.ne.u64 %r1383,%r523,%r679; @ %r1383 bra $L65; $L64: .loc 1 2826 24 st.f64 [%r496],%r184; st.f64 [%r496+8],%r281; .loc 1 2827 28 st.f64 [%r496+16],%r201; st.f64 [%r496+24],%r189; .loc 1 2828 30 st.f64 [%r483],%r282; st.f64 [%r483+8],%r45; .loc 1 2829 34 st.f64 [%r483+16],%r278; st.f64 [%r483+24],%r270; .loc 1 2830 30 st.f64 [%r472],%r280; st.f64 [%r472+8],%r279; .loc 1 2831 34 st.f64 [%r472+16],%r300; st.f64 [%r472+24],%r301; .loc 1 2832 30 st.f64 [%r286],%r296; st.f64 [%r286+8],%r299; .loc 1 2833 34 st.f64 [%r286+16],%r302; st.f64 [%r286+24],%r303; .loc 1 2834 28 st.f64 [%r496+32],%r272; st.f64 [%r496+40],%r273; .loc 1 2835 28 st.f64 [%r496+48],%r284; st.f64 [%r496+56],%r283; .loc 1 2836 34 st.f64 [%r483+32],%r290; st.f64 [%r483+40],%r274; .loc 1 2837 34 st.f64 [%r483+48],%r276; st.f64 [%r483+56],%r275; .loc 1 2838 34 st.f64 [%r472+32],%r304; st.f64 [%r472+40],%r305; .loc 1 2839 34 st.f64 [%r472+48],%r308; st.f64 [%r472+56],%r309; .loc 1 2840 34 st.f64 [%r286+32],%r306; st.f64 [%r286+40],%r307; .loc 1 2841 34 st.f64 [%r286+48],%r310; st.f64 [%r286+56],%r311; .loc 1 2772 9 add.u64 %r506,%r506,16384; add.u64 %r504,%r504,16384; add.u64 %r502,%r502,16384; add.u64 %r500,%r500,16384; add.u64 %r498,%r498,64; setp.ne.u64 %r1384,%r244,%r498; @ %r1384 bra $L66; bra $L67; $L63: .loc 1 2769 29 add.u64 %r250,%r250,4; .loc 1 2769 5 add.u64 %r200,%r200,%r1251; add.u64 %r140,%r140,%r771; add.u64 %r88,%r88,%r67; add.u64 %r649,%r649,%r67; add.u64 %r642,%r642,%r67; add.u64 %r641,%r641,%r67; add.u64 %r640,%r640,%r771; add.u64 %r639,%r639,%r771; add.u64 %r637,%r637,%r771; add.u64 %r51,%r51,%r771; add.u64 %r49,%r49,%r771; setp.le.s64 %r1385,%r250,%r1583; @ %r1385 bra $L68; bra $L57; $L62: add.u64 %r549,%r140,8; add.u64 %r542,%r49,%r259; add.u64 %r128,%r238,%r640; .loc 1 2904 13 mov.u64 %r553,%r552; mov.u64 %r555,%r140; add.u64 %r1532,%r49,%r767; add.u64 %r1534,%r49,%r770; $L71: ld.f64 %r392,[%r555]; ld.f64 %r393,[%r549]; add.u64 %r1387,%r1532,%r555; add.u64 %r543,%r1387,%r200; ld.f64 %r394,[%r543]; add.u64 %r1388,%r1532,%r549; add.u64 %r530,%r1388,%r200; ld.f64 %r395,[%r530]; add.u64 %r1389,%r542,%r555; add.u64 %r539,%r1389,%r200; ld.f64 %r396,[%r539]; add.u64 %r1390,%r542,%r549; add.u64 %r527,%r1390,%r200; ld.f64 %r397,[%r527]; add.u64 %r1391,%r1534,%r555; add.u64 %r535,%r1391,%r200; ld.f64 %r398,[%r535]; add.u64 %r1392,%r1534,%r549; add.u64 %r524,%r1392,%r200; ld.f64 %r399,[%r524]; .loc 1 2853 10 @ %r1581 bra $L69; mov.u64 %r566,0; $L70: add.u64 %r564,%r553,%r566; ld.f64 %r400,[%r564+-4096]; ld.f64 %r401,[%r564+-4088]; add.u64 %r562,%r88,%r566; ld.f64 %r402,[%r562]; ld.f64 %r403,[%r562+8]; .loc 1 2856 13 mul.f64 %r405,%r401,%r402; mul.f64 %r1394,%r401,%r403; neg.f64 %r1395,%r1394; fma.rn.f64 %r406,%r400,%r402,%r1395; fma.rn.f64 %r407,%r400,%r403,%r405; .loc 1 2855 11 add.f64 %r392,%r392,%r406; add.f64 %r393,%r393,%r407; add.u64 %r560,%r649,%r566; ld.f64 %r408,[%r560]; ld.f64 %r409,[%r560+8]; .loc 1 2858 13 mul.f64 %r411,%r401,%r408; mul.f64 %r1396,%r401,%r409; neg.f64 %r1397,%r1396; fma.rn.f64 %r412,%r400,%r408,%r1397; fma.rn.f64 %r413,%r400,%r409,%r411; .loc 1 2857 11 add.f64 %r394,%r394,%r412; add.f64 %r395,%r395,%r413; add.u64 %r558,%r642,%r566; ld.f64 %r414,[%r558]; ld.f64 %r415,[%r558+8]; .loc 1 2860 13 mul.f64 %r417,%r401,%r414; mul.f64 %r1398,%r401,%r415; neg.f64 %r1399,%r1398; fma.rn.f64 
%r418,%r400,%r414,%r1399; fma.rn.f64 %r419,%r400,%r415,%r417; .loc 1 2859 11 add.f64 %r396,%r396,%r418; add.f64 %r397,%r397,%r419; add.u64 %r556,%r641,%r566; ld.f64 %r420,[%r556]; ld.f64 %r421,[%r556+8]; .loc 1 2862 13 mul.f64 %r423,%r401,%r420; mul.f64 %r1400,%r401,%r421; neg.f64 %r1401,%r1400; fma.rn.f64 %r424,%r400,%r420,%r1401; fma.rn.f64 %r425,%r400,%r421,%r423; .loc 1 2861 11 add.f64 %r398,%r398,%r424; add.f64 %r399,%r399,%r425; .loc 1 2853 10 add.u64 %r566,%r566,16; setp.ne.u64 %r1402,%r566,%r679; @ %r1402 bra $L70; $L69: .loc 1 2864 28 st.f64 [%r555],%r392; st.f64 [%r549],%r393; .loc 1 2865 34 st.f64 [%r543],%r394; st.f64 [%r530],%r395; .loc 1 2866 34 st.f64 [%r539],%r396; st.f64 [%r527],%r397; .loc 1 2867 34 st.f64 [%r535],%r398; st.f64 [%r524],%r399; .loc 1 2846 6 add.u64 %r555,%r555,16; add.u64 %r553,%r553,4096; add.u64 %r549,%r549,16; setp.ne.u64 %r1403,%r128,%r555; @ %r1403 bra $L71; bra $L63; $L60: .loc 1 2727 34 add.u64 %r248,%r248,256; .loc 1 2727 8 add.u64 %r736,%r736,4096; add.u64 %r737,%r737,4096; setp.ge.s64 %r1404,%r162,%r248; @ %r1404 bra $L72; bra $L73; $L59: .loc 1 2874 9 @ %r1589 bra $L60; .loc 1 2876 14 add.u64 %r657,%r240,%r248; .loc 1 2876 9 add.u64 %r658,%r657,-1; .loc 1 2759 17 add.u64 %r582,%r1699,%r248; add.u64 %r1406,%r582,%r583; shl.b64 %r1407,%r1406,4; add.u64 %r584,%r224,%r1407; neg.s64 %r1409,%r582; shl.b64 %r1410,%r1409,4; shl.b64 %r1411,%r248,4; add.u64 %r567,%r1411,%r1410; shl.b64 %r1412,%r657,4; add.u64 %r569,%r1412,%r1410; shl.b64 %r1413,%r240,12; add.u64 %r615,%r1230,%r1413; .loc 1 2876 9 mov.u64 %r575,%r574; mov.u64 %r578,%r757; mov.u64 %r579,%r583; mov.u64 %r243,%r644; setp.gt.s64 %r1730,%r1699,%r240; setp.le.s64 %r1544,%r248,%r658; not.b64 %r1562,%r248; add.u64 %r1563,%r1562,%r657; and.b64 %r1564,%r1563,-4; add.u64 %r1565,%r1564,%r248; add.u64 %r1566,%r224,64; $L81: .loc 1 2877 6 @ %r1544 bra $L74; $L80: .loc 1 2901 6 @ %r1730 bra $L75; bra $L76; $L74: add.u64 %r599,%r567,%r584; add.u64 %r598,%r1230,4096; add.u64 %r596,%r1230,8192; add.u64 %r594,%r1230,12288; add.u64 %r592,%r1230,16384; add.u64 %r1420,%r1565,%r579; shl.b64 %r1421,%r1420,4; add.u64 %r585,%r1421,%r1566; $L79: ld.f64 %r426,[%r599]; ld.f64 %r427,[%r599+8]; ld.f64 %r428,[%r599+16]; ld.f64 %r429,[%r599+24]; ld.f64 %r430,[%r599+32]; ld.f64 %r431,[%r599+40]; ld.f64 %r432,[%r599+48]; ld.f64 %r433,[%r599+56]; .loc 1 2884 10 @ %r1581 bra $L77; mov.u64 %r610,0; $L78: add.u64 %r608,%r598,%r610; ld.f64 %r434,[%r608+-4096]; ld.f64 %r435,[%r608+-4088]; add.u64 %r606,%r578,%r610; ld.f64 %r436,[%r606]; ld.f64 %r437,[%r606+8]; .loc 1 2887 13 mul.f64 %r439,%r435,%r436; mul.f64 %r1424,%r435,%r437; neg.f64 %r1425,%r1424; fma.rn.f64 %r440,%r434,%r436,%r1425; fma.rn.f64 %r441,%r434,%r437,%r439; .loc 1 2886 11 add.f64 %r426,%r426,%r440; add.f64 %r427,%r427,%r441; add.u64 %r604,%r596,%r610; ld.f64 %r442,[%r604+-4096]; ld.f64 %r443,[%r604+-4088]; .loc 1 2889 13 mul.f64 %r445,%r437,%r442; mul.f64 %r1426,%r437,%r443; neg.f64 %r1427,%r1426; fma.rn.f64 %r446,%r436,%r442,%r1427; fma.rn.f64 %r447,%r436,%r443,%r445; .loc 1 2888 11 add.f64 %r428,%r428,%r446; add.f64 %r429,%r429,%r447; add.u64 %r602,%r594,%r610; ld.f64 %r448,[%r602+-4096]; ld.f64 %r449,[%r602+-4088]; .loc 1 2891 13 mul.f64 %r451,%r437,%r448; mul.f64 %r1428,%r437,%r449; neg.f64 %r1429,%r1428; fma.rn.f64 %r452,%r436,%r448,%r1429; fma.rn.f64 %r453,%r436,%r449,%r451; .loc 1 2890 11 add.f64 %r430,%r430,%r452; add.f64 %r431,%r431,%r453; add.u64 %r600,%r592,%r610; ld.f64 %r454,[%r600+-4096]; ld.f64 %r455,[%r600+-4088]; .loc 1 2893 13 mul.f64 
%r457,%r437,%r454; mul.f64 %r1430,%r437,%r455; neg.f64 %r1431,%r1430; fma.rn.f64 %r458,%r436,%r454,%r1431; fma.rn.f64 %r459,%r436,%r455,%r457; .loc 1 2892 11 add.f64 %r432,%r432,%r458; add.f64 %r433,%r433,%r459; .loc 1 2884 10 add.u64 %r610,%r610,16; setp.ne.u64 %r1432,%r610,%r679; @ %r1432 bra $L78; $L77: .loc 1 2895 28 st.f64 [%r599],%r426; st.f64 [%r599+8],%r427; .loc 1 2896 32 st.f64 [%r599+16],%r428; st.f64 [%r599+24],%r429; .loc 1 2897 32 st.f64 [%r599+32],%r430; st.f64 [%r599+40],%r431; .loc 1 2898 32 st.f64 [%r599+48],%r432; st.f64 [%r599+56],%r433; .loc 1 2877 6 add.u64 %r599,%r599,64; add.u64 %r598,%r598,16384; add.u64 %r596,%r596,16384; add.u64 %r594,%r594,16384; add.u64 %r592,%r592,16384; setp.ne.u64 %r1433,%r585,%r599; @ %r1433 bra $L79; bra $L80; $L76: .loc 1 2874 39 add.u64 %r243,%r243,1; .loc 1 2874 9 add.u64 %r584,%r584,%r797; add.u64 %r579,%r579,%r664; add.u64 %r578,%r578,%r784; add.u64 %r575,%r575,%r784; setp.ne.u64 %r1434,%r1588,%r243; @ %r1434 bra $L81; bra $L60; $L75: add.u64 %r617,%r569,%r584; .loc 1 2904 13 mov.u64 %r616,%r615; $L84: ld.f64 %r460,[%r617]; ld.f64 %r461,[%r617+8]; .loc 1 2905 10 @ %r1581 bra $L82; add.u64 %r621,%r616,8; mov.u64 %r256,%r578; mov.u64 %r297,%r616; sub.u64 %r1603,%r575,%r578; add.u64 %r1604,%r1603,-16; shr.u64 %r1602,%r1604,4; add.u64 %r1605,%r1602,1; and.b64 %r1606,%r1605,3; setp.eq.u64 %r1609,%r1606,0; @ %r1609 bra $L83; setp.eq.u64 %r1608,%r1606,1; @ %r1608 bra $L286; setp.eq.u64 %r1607,%r1606,2; @ %r1607 bra $L287; bra $L342; $L83: ld.f64 %r462,[%r297]; ld.f64 %r463,[%r621]; ld.f64 %r464,[%r256]; ld.f64 %r465,[%r256+8]; .loc 1 2908 13 mul.f64 %r467,%r463,%r464; mul.f64 %r1437,%r463,%r465; neg.f64 %r1438,%r1437; fma.rn.f64 %r468,%r462,%r464,%r1438; fma.rn.f64 %r469,%r462,%r465,%r467; .loc 1 2907 11 add.f64 %r1731,%r460,%r468; add.f64 %r1732,%r461,%r469; .loc 1 2905 10 add.u64 %r1611,%r256,16; ld.f64 %r1736,[%r297+16]; ld.f64 %r1737,[%r621+16]; ld.f64 %r1738,[%r1611]; ld.f64 %r1739,[%r1611+8]; .loc 1 2908 13 mul.f64 %r1740,%r1737,%r1738; mul.f64 %r1741,%r1737,%r1739; neg.f64 %r1742,%r1741; fma.rn.f64 %r1743,%r1736,%r1738,%r1742; fma.rn.f64 %r1744,%r1736,%r1739,%r1740; .loc 1 2907 11 add.f64 %r1745,%r1731,%r1743; add.f64 %r1746,%r1732,%r1744; .loc 1 2905 10 add.u64 %r1748,%r256,32; ld.f64 %r1751,[%r297+32]; ld.f64 %r1752,[%r621+32]; ld.f64 %r1753,[%r1748]; ld.f64 %r1754,[%r1748+8]; .loc 1 2908 13 mul.f64 %r1755,%r1752,%r1753; mul.f64 %r1756,%r1752,%r1754; neg.f64 %r1757,%r1756; fma.rn.f64 %r1758,%r1751,%r1753,%r1757; fma.rn.f64 %r1759,%r1751,%r1754,%r1755; .loc 1 2907 11 add.f64 %r1760,%r1745,%r1758; add.f64 %r1761,%r1746,%r1759; .loc 1 2905 10 add.u64 %r1763,%r256,48; ld.f64 %r1766,[%r297+48]; ld.f64 %r1767,[%r621+48]; ld.f64 %r1768,[%r1763]; ld.f64 %r1769,[%r1763+8]; .loc 1 2908 13 mul.f64 %r1770,%r1767,%r1768; mul.f64 %r1771,%r1767,%r1769; neg.f64 %r1772,%r1771; fma.rn.f64 %r1773,%r1766,%r1768,%r1772; fma.rn.f64 %r1774,%r1766,%r1769,%r1770; .loc 1 2907 11 add.f64 %r460,%r1760,%r1773; add.f64 %r461,%r1761,%r1774; .loc 1 2905 10 add.u64 %r297,%r297,64; add.u64 %r256,%r256,64; add.u64 %r621,%r621,64; setp.ne.u64 %r1775,%r256,%r575; @ %r1775 bra $L83; $L82: .loc 1 2910 28 st.f64 [%r617],%r460; st.f64 [%r617+8],%r461; .loc 1 2901 6 add.u64 %r617,%r617,16; add.u64 %r616,%r616,4096; setp.ne.u64 %r1440,%r584,%r617; @ %r1440 bra $L84; bra $L76; $L20: .loc 1 2958 11 cvt.u16.u32 %r1441,%r627; setp.eq.u16 %r1442,%r1441,1; @ ! 
%r1442 bra $L343; bra $L85; $L31: .loc 1 2920 38 set.u32.eq.u64 %r1444,%r150,1; neg.s32 %r1445,%r1444; .loc 1 2920 26 cvt.u16.u32 %r1447,%r1445; and.b16 %r1446,%r1447,%r1187; .loc 1 2920 11 cvt.u32.u16 %r1449,%r1446; cvt.u16.u8 %r1450,%r1449; setp.eq.u16 %r1451,%r1450,0; @ %r1451 bra $L20; .loc 1 2920 43 setp.ne.u64 %r1452,%r663,1; @ %r1452 bra $L20; $L121: .loc 1 2922 10 cvt.u16.u32 %r1453,%r627; setp.ne.u16 %r1454,%r1453,1; @ %r1454 bra $L87; .loc 1 2948 4 setp.gt.s64 %r1455,%r163,0; @ %r1455 bra $L88; bra $L1; $L87: .loc 1 2929 4 setp.le.s64 %r1456,%r163,0; @ %r1456 bra $L1; shl.b64 %r830,%r664,4; shl.b64 %r1457,%r162,4; add.u64 %r829,%r222,%r1457; shl.b64 %r836,%r151,4; mov.u64 %r834,%r221; shl.b64 %r818,%r298,4; shl.b64 %r1458,%r638,4; add.u64 %r822,%r220,%r1458; mov.u64 %r827,0; .loc 1 2929 11 mov.u64 %r258,%r827; setp.gt.s64 %r1776,%r162,0; setp.le.s64 %r1597,%r638,0; $L91: .loc 1 2933 8 @ %r1776 bra $L90; $L95: .loc 1 2929 29 add.u64 %r258,%r258,1; .loc 1 2929 4 add.u64 %r827,%r827,%r664; add.u64 %r829,%r829,%r830; add.u64 %r834,%r834,%r836; setp.ne.u64 %r1460,%r163,%r258; @ %r1460 bra $L91; bra $L1; $L90: shl.b64 %r1461,%r827,4; add.u64 %r815,%r222,%r1461; .loc 1 2933 8 mov.u64 %r821,%r822; mov.u64 %r816,%r220; $L94: .loc 1 2937 5 @ %r1597 bra $L140; add.u64 %r809,%r816,8; add.u64 %r810,%r834,8; mov.u64 %r808,%r834; mov.u64 %r807,%r816; mov.f64 %r108,0d0000000000000000; mov.f64 %r103,%r108; sub.u64 %r1642,%r821,%r816; add.u64 %r1643,%r1642,-16; shr.u64 %r1641,%r1643,4; add.u64 %r1644,%r1641,1; and.b64 %r1645,%r1644,3; setp.eq.u64 %r1648,%r1645,0; @ %r1648 bra $L93; setp.eq.u64 %r1647,%r1645,1; @ %r1647 bra $L288; setp.eq.u64 %r1646,%r1645,2; @ %r1646 bra $L289; bra $L344; $L93: ld.f64 %r186,[%r807]; ld.f64 %r193,[%r809]; ld.f64 %r177,[%r808]; ld.f64 %r175,[%r810]; .loc 1 2938 23 mul.f64 %r44,%r177,%r193; mul.f64 %r1463,%r175,%r193; neg.f64 %r1464,%r1463; fma.rn.f64 %r43,%r177,%r186,%r1464; fma.rn.f64 %r101,%r175,%r186,%r44; .loc 1 2938 9 add.f64 %r1777,%r103,%r43; add.f64 %r1778,%r108,%r101; ld.f64 %r1783,[%r807+16]; ld.f64 %r1784,[%r809+16]; ld.f64 %r1785,[%r808+16]; ld.f64 %r1786,[%r810+16]; .loc 1 2938 23 mul.f64 %r1787,%r1785,%r1784; mul.f64 %r1788,%r1786,%r1784; neg.f64 %r1789,%r1788; fma.rn.f64 %r1790,%r1785,%r1783,%r1789; fma.rn.f64 %r1791,%r1786,%r1783,%r1787; .loc 1 2938 9 add.f64 %r1792,%r1777,%r1790; add.f64 %r1793,%r1778,%r1791; ld.f64 %r1799,[%r807+32]; ld.f64 %r1800,[%r809+32]; ld.f64 %r1801,[%r808+32]; ld.f64 %r1802,[%r810+32]; .loc 1 2938 23 mul.f64 %r1803,%r1801,%r1800; mul.f64 %r1804,%r1802,%r1800; neg.f64 %r1805,%r1804; fma.rn.f64 %r1806,%r1801,%r1799,%r1805; fma.rn.f64 %r1807,%r1802,%r1799,%r1803; .loc 1 2938 9 add.f64 %r1808,%r1792,%r1806; add.f64 %r1809,%r1793,%r1807; ld.f64 %r1815,[%r807+48]; ld.f64 %r1816,[%r809+48]; ld.f64 %r1817,[%r808+48]; ld.f64 %r1818,[%r810+48]; .loc 1 2938 23 mul.f64 %r1819,%r1817,%r1816; mul.f64 %r1820,%r1818,%r1816; neg.f64 %r1821,%r1820; fma.rn.f64 %r1822,%r1817,%r1815,%r1821; fma.rn.f64 %r1823,%r1818,%r1815,%r1819; .loc 1 2938 9 add.f64 %r103,%r1808,%r1822; add.f64 %r108,%r1809,%r1823; .loc 1 2937 5 add.u64 %r807,%r807,64; add.u64 %r808,%r808,64; add.u64 %r809,%r809,64; add.u64 %r810,%r810,64; setp.ne.u64 %r1824,%r807,%r821; @ %r1824 bra $L93; bra $L92; $L140: mov.f64 %r108,0d0000000000000000; mov.f64 %r103,%r108; $L92: .loc 1 2939 15 st.f64 [%r815],%r103; st.f64 [%r815+8],%r108; .loc 1 2933 8 add.u64 %r815,%r815,16; add.u64 %r816,%r816,%r818; add.u64 %r821,%r821,%r818; setp.ne.u64 %r1466,%r815,%r829; @ %r1466 bra $L94; bra $L95; 
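// Assumption: the $L88 block that follows (source lines 2948-2954 per the .loc directives)
// is one of the vector-operand fallback paths; each outer iteration accumulates a complex
// dot product, unrolled four times, and stores the (re, im) result pair at [%r851].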
$L88: shl.b64 %r853,%r664,4; mov.u64 %r851,%r222; shl.b64 %r842,%r298,4; .loc 1 2948 4 mov.u64 %r854,0; .loc 1 2948 11 mov.u64 %r257,%r854; setp.le.s64 %r1825,%r638,0; $L98: .loc 1 2952 8 @ %r1825 bra $L141; mov.u64 %r840,%r220; shl.b64 %r1468,%r854,4; add.u64 %r843,%r1468,%r221; add.u64 %r845,%r220,8; add.u64 %r1469,%r638,%r854; shl.b64 %r1470,%r1469,4; add.u64 %r861,%r1470,%r221; mov.f64 %r37,0d0000000000000000; mov.f64 %r38,%r37; sub.u64 %r1654,%r861,%r843; add.u64 %r1655,%r1654,-16; shr.u64 %r1653,%r1655,4; add.u64 %r1656,%r1653,1; and.b64 %r1657,%r1656,3; setp.eq.u64 %r1660,%r1657,0; @ %r1660 bra $L97; setp.eq.u64 %r1659,%r1657,1; @ %r1659 bra $L290; setp.eq.u64 %r1658,%r1657,2; @ %r1658 bra $L291; bra $L345; $L97: ld.f64 %r187,[%r840]; ld.f64 %r194,[%r845]; ld.f64 %r172,[%r843]; ld.f64 %r169,[%r843+8]; .loc 1 2953 26 mul.f64 %r87,%r172,%r194; mul.f64 %r1471,%r169,%r194; neg.f64 %r1472,%r1471; fma.rn.f64 %r69,%r172,%r187,%r1472; fma.rn.f64 %r68,%r169,%r187,%r87; .loc 1 2953 5 add.f64 %r1826,%r38,%r69; add.f64 %r1827,%r37,%r68; .loc 1 2952 8 add.u64 %r1828,%r840,%r842; add.u64 %r1661,%r843,16; add.u64 %r1830,%r845,%r842; ld.f64 %r1831,[%r1828]; ld.f64 %r1832,[%r1830]; ld.f64 %r1833,[%r1661]; ld.f64 %r1834,[%r1661+8]; .loc 1 2953 26 mul.f64 %r1835,%r1833,%r1832; mul.f64 %r1836,%r1834,%r1832; neg.f64 %r1837,%r1836; fma.rn.f64 %r1838,%r1833,%r1831,%r1837; fma.rn.f64 %r1839,%r1834,%r1831,%r1835; .loc 1 2953 5 add.f64 %r1840,%r1826,%r1838; add.f64 %r1841,%r1827,%r1839; .loc 1 2952 8 add.u64 %r1842,%r1828,%r842; add.u64 %r1843,%r843,32; add.u64 %r1844,%r1830,%r842; ld.f64 %r1846,[%r1842]; ld.f64 %r1847,[%r1844]; ld.f64 %r1848,[%r1843]; ld.f64 %r1849,[%r1843+8]; .loc 1 2953 26 mul.f64 %r1850,%r1848,%r1847; mul.f64 %r1851,%r1849,%r1847; neg.f64 %r1852,%r1851; fma.rn.f64 %r1853,%r1848,%r1846,%r1852; fma.rn.f64 %r1854,%r1849,%r1846,%r1850; .loc 1 2953 5 add.f64 %r1855,%r1840,%r1853; add.f64 %r1856,%r1841,%r1854; .loc 1 2952 8 add.u64 %r1857,%r1842,%r842; add.u64 %r1858,%r843,48; add.u64 %r1859,%r1844,%r842; ld.f64 %r1861,[%r1857]; ld.f64 %r1862,[%r1859]; ld.f64 %r1863,[%r1858]; ld.f64 %r1864,[%r1858+8]; .loc 1 2953 26 mul.f64 %r1865,%r1863,%r1862; mul.f64 %r1866,%r1864,%r1862; neg.f64 %r1867,%r1866; fma.rn.f64 %r1868,%r1863,%r1861,%r1867; fma.rn.f64 %r1869,%r1864,%r1861,%r1865; .loc 1 2953 5 add.f64 %r38,%r1855,%r1868; add.f64 %r37,%r1856,%r1869; .loc 1 2952 8 add.u64 %r840,%r1857,%r842; add.u64 %r843,%r843,64; add.u64 %r845,%r1859,%r842; setp.ne.u64 %r1870,%r843,%r861; @ %r1870 bra $L97; bra $L96; $L141: mov.f64 %r37,0d0000000000000000; mov.f64 %r38,%r37; $L96: .loc 1 2954 25 st.f64 [%r851],%r38; st.f64 [%r851+8],%r37; .loc 1 2948 29 add.u64 %r257,%r257,1; .loc 1 2948 4 add.u64 %r851,%r851,%r853; add.u64 %r854,%r854,%r151; setp.ne.u64 %r1474,%r163,%r257; @ %r1474 bra $L98; bra $L1; $L85: .loc 1 2963 7 setp.le.s64 %r1475,%r163,0; @ %r1475 bra $L1; shl.b64 %r872,%r844,4; mov.u64 %r870,%r222; shl.b64 %r864,%r298,4; shl.b64 %r867,%r663,4; mov.u64 %r873,0; .loc 1 2963 14 mov.u64 %r268,%r873; setp.le.s64 %r1871,%r638,0; $L101: .loc 1 2967 4 @ %r1871 bra $L142; mov.u64 %r862,%r220; shl.b64 %r1477,%r873,4; add.u64 %r865,%r221,%r1477; mov.f64 %r152,0d0000000000000000; mov.f64 %r188,%r152; .loc 1 2967 11 mov.u64 %r269,0; and.b64 %r1665,%r638,3; setp.eq.u64 %r1668,%r1665,0; @ %r1668 bra $L100; setp.eq.u64 %r1667,%r1665,1; @ %r1667 bra $L292; setp.eq.u64 %r1666,%r1665,2; @ %r1666 bra $L293; bra $L346; $L100: ld.f64 %r277,[%r862]; ld.f64 %r206,[%r862+8]; ld.f64 %r195,[%r865]; ld.f64 %r205,[%r865+8]; .loc 1 
2968 29 mul.f64 %r203,%r195,%r206; mul.f64 %r1478,%r205,%r206; neg.f64 %r1479,%r1478; fma.rn.f64 %r191,%r195,%r277,%r1479; fma.rn.f64 %r202,%r205,%r277,%r203; .loc 1 2968 8 add.f64 %r1872,%r188,%r191; add.f64 %r1873,%r152,%r202; .loc 1 2967 4 add.u64 %r1875,%r862,%r864; add.u64 %r1876,%r865,%r867; ld.f64 %r1877,[%r1875]; ld.f64 %r1878,[%r1875+8]; ld.f64 %r1879,[%r1876]; ld.f64 %r1880,[%r1876+8]; .loc 1 2968 29 mul.f64 %r1881,%r1879,%r1878; mul.f64 %r1882,%r1880,%r1878; neg.f64 %r1883,%r1882; fma.rn.f64 %r1884,%r1879,%r1877,%r1883; fma.rn.f64 %r1885,%r1880,%r1877,%r1881; .loc 1 2968 8 add.f64 %r1886,%r1872,%r1884; add.f64 %r1887,%r1873,%r1885; .loc 1 2967 4 add.u64 %r1889,%r1875,%r864; add.u64 %r1890,%r1876,%r867; ld.f64 %r1892,[%r1889]; ld.f64 %r1893,[%r1889+8]; ld.f64 %r1894,[%r1890]; ld.f64 %r1895,[%r1890+8]; .loc 1 2968 29 mul.f64 %r1896,%r1894,%r1893; mul.f64 %r1897,%r1895,%r1893; neg.f64 %r1898,%r1897; fma.rn.f64 %r1899,%r1894,%r1892,%r1898; fma.rn.f64 %r1900,%r1895,%r1892,%r1896; .loc 1 2968 8 add.f64 %r1901,%r1886,%r1899; add.f64 %r1902,%r1887,%r1900; .loc 1 2967 4 add.u64 %r1904,%r1889,%r864; add.u64 %r1905,%r1890,%r867; ld.f64 %r1907,[%r1904]; ld.f64 %r1908,[%r1904+8]; ld.f64 %r1909,[%r1905]; ld.f64 %r1910,[%r1905+8]; .loc 1 2968 29 mul.f64 %r1911,%r1909,%r1908; mul.f64 %r1912,%r1910,%r1908; neg.f64 %r1913,%r1912; fma.rn.f64 %r1914,%r1909,%r1907,%r1913; fma.rn.f64 %r1915,%r1910,%r1907,%r1911; .loc 1 2968 8 add.f64 %r188,%r1901,%r1914; add.f64 %r152,%r1902,%r1915; .loc 1 2967 28 add.u64 %r269,%r269,4; .loc 1 2967 4 add.u64 %r862,%r1904,%r864; add.u64 %r865,%r1905,%r867; setp.ne.u64 %r1916,%r638,%r269; @ %r1916 bra $L100; bra $L99; $L142: mov.f64 %r152,0d0000000000000000; mov.f64 %r188,%r152; $L99: .loc 1 2969 21 st.f64 [%r870],%r188; st.f64 [%r870+8],%r152; .loc 1 2963 32 add.u64 %r268,%r268,1; .loc 1 2963 7 add.u64 %r870,%r870,%r872; add.u64 %r873,%r873,%r151; setp.ne.u64 %r1481,%r163,%r268; @ %r1481 bra $L101; bra $L1; $L343: .loc 1 2972 11 setp.gt.s64 %r1482,%r150,%r298; @ %r1482 bra $L102; .loc 1 2993 7 setp.gt.s64 %r1483,%r163,0; @ %r1483 bra $L103; bra $L1; $L102: .loc 1 2974 7 setp.gt.s64 %r1484,%r163,0; @ %r1484 bra $L104; bra $L1; $L108: shl.b64 %r1485,%r914,4; add.u64 %r910,%r222,%r1485; .loc 1 2975 9 mov.u64 %r267,0; and.b64 %r1681,%r162,3; setp.eq.u64 %r1684,%r1681,0; @ %r1684 bra $L105; setp.eq.u64 %r1683,%r1681,1; @ %r1683 bra $L294; setp.eq.u64 %r1682,%r1681,2; @ %r1682 bra $L295; bra $L347; $L105: .loc 1 2976 34 st.f64 [%r910],%r1599; st.f64 [%r910+8],%r1600; .loc 1 2975 2 add.u64 %r1920,%r910,%r912; .loc 1 2976 34 st.f64 [%r1920],%r1599; st.f64 [%r1920+8],%r1600; .loc 1 2975 2 add.u64 %r1922,%r1920,%r912; .loc 1 2976 34 st.f64 [%r1922],%r1599; st.f64 [%r1922+8],%r1600; .loc 1 2975 2 add.u64 %r1925,%r1922,%r912; .loc 1 2976 34 st.f64 [%r1925],%r1599; st.f64 [%r1925+8],%r1600; .loc 1 2975 27 add.u64 %r267,%r267,4; .loc 1 2975 2 add.u64 %r910,%r1925,%r912; setp.ne.u64 %r1927,%r162,%r267; @ %r1927 bra $L105; $L109: .loc 1 2974 32 add.u64 %r266,%r288,1; .loc 1 2974 7 add.u64 %r914,%r914,%r664; setp.ne.u64 %r1489,%r163,%r266; @ %r1489 bra $L143; shl.b64 %r902,%r664,4; mov.u64 %r900,%r222; shl.b64 %r893,%r663,4; shl.b64 %r883,%r298,4; mov.u64 %r905,0; mov.u64 %r289,%r905; setp.le.s64 %r1928,%r638,0; add.u64 %r1598,%r221,8; bra $L107; $L104: shl.b64 %r912,%r844,4; mov.u64 %r914,0; .loc 1 2974 14 mov.u64 %r288,%r914; setp.gt.s64 %r1929,%r162,0; ld.const.f64 %r1599,[$LC9]; ld.const.f64 %r1600,[$LC9+8]; bra $L106; $L143: mov.u64 %r288,%r266; $L106: .loc 1 2975 2 @ %r1929 bra 
$L108; bra $L109; $L112: shl.b64 %r1491,%r889,4; add.u64 %r881,%r220,%r1491; add.u64 %r884,%r900,8; .loc 1 2980 4 mov.u64 %r878,%r900; .loc 1 2980 11 mov.u64 %r265,0; and.b64 %r1673,%r162,3; setp.eq.u64 %r1676,%r1673,0; @ %r1676 bra $L110; setp.eq.u64 %r1675,%r1673,1; @ %r1675 bra $L296; setp.eq.u64 %r1674,%r1673,2; @ %r1674 bra $L297; bra $L348; $L110: ld.f64 %r209,[%r881]; ld.f64 %r153,[%r881+8]; ld.f64 %r208,[%r891]; ld.f64 %r156,[%r894]; .loc 1 2983 37 mul.f64 %r155,%r153,%r208; ld.f64 %r1492,[%r878]; fma.rn.f64 %r495,%r208,%r209,%r1492; fma.rn.f64 %r160,%r156,%r209,%r155; .loc 1 2982 36 neg.f64 %r1493,%r153; fma.rn.f64 %r207,%r1493,%r156,%r495; ld.f64 %r1494,[%r884]; add.f64 %r154,%r160,%r1494; st.f64 [%r878],%r207; st.f64 [%r884],%r154; .loc 1 2980 4 add.u64 %r1931,%r878,%r912; add.u64 %r1932,%r881,%r883; add.u64 %r1933,%r884,%r912; ld.f64 %r1934,[%r1932]; ld.f64 %r1935,[%r1932+8]; ld.f64 %r1936,[%r891]; ld.f64 %r1937,[%r894]; .loc 1 2983 37 mul.f64 %r1938,%r1935,%r1936; ld.f64 %r1939,[%r1931]; fma.rn.f64 %r1940,%r1936,%r1934,%r1939; fma.rn.f64 %r1941,%r1937,%r1934,%r1938; .loc 1 2982 36 neg.f64 %r1942,%r1935; fma.rn.f64 %r1943,%r1942,%r1937,%r1940; ld.f64 %r1944,[%r1933]; add.f64 %r1945,%r1941,%r1944; st.f64 [%r1931],%r1943; st.f64 [%r1933],%r1945; .loc 1 2980 4 add.u64 %r1947,%r1931,%r912; add.u64 %r1948,%r1932,%r883; add.u64 %r1949,%r1933,%r912; ld.f64 %r1951,[%r1948]; ld.f64 %r1952,[%r1948+8]; ld.f64 %r1953,[%r891]; ld.f64 %r1954,[%r894]; .loc 1 2983 37 mul.f64 %r1955,%r1952,%r1953; ld.f64 %r1956,[%r1947]; fma.rn.f64 %r1957,%r1953,%r1951,%r1956; fma.rn.f64 %r1958,%r1954,%r1951,%r1955; .loc 1 2982 36 neg.f64 %r1959,%r1952; fma.rn.f64 %r1960,%r1959,%r1954,%r1957; ld.f64 %r1961,[%r1949]; add.f64 %r1962,%r1958,%r1961; st.f64 [%r1947],%r1960; st.f64 [%r1949],%r1962; .loc 1 2980 4 add.u64 %r1964,%r1947,%r912; add.u64 %r1965,%r1948,%r883; add.u64 %r1966,%r1949,%r912; ld.f64 %r1968,[%r1965]; ld.f64 %r1969,[%r1965+8]; ld.f64 %r1970,[%r891]; ld.f64 %r1971,[%r894]; .loc 1 2983 37 mul.f64 %r1972,%r1969,%r1970; ld.f64 %r1973,[%r1964]; fma.rn.f64 %r1974,%r1970,%r1968,%r1973; fma.rn.f64 %r1975,%r1971,%r1968,%r1972; .loc 1 2982 36 neg.f64 %r1976,%r1969; fma.rn.f64 %r1977,%r1976,%r1971,%r1974; ld.f64 %r1978,[%r1966]; add.f64 %r1979,%r1975,%r1978; st.f64 [%r1964],%r1977; st.f64 [%r1966],%r1979; .loc 1 2980 29 add.u64 %r265,%r265,4; .loc 1 2980 4 add.u64 %r878,%r1964,%r912; add.u64 %r881,%r1965,%r883; add.u64 %r884,%r1966,%r912; setp.ne.u64 %r1980,%r162,%r265; @ %r1980 bra $L110; $L113: .loc 1 2979 26 add.u64 %r264,%r264,1; .loc 1 2979 2 add.u64 %r889,%r889,%r150; add.u64 %r891,%r891,%r893; add.u64 %r894,%r894,%r893; setp.eq.u64 %r1496,%r638,%r264; @ %r1496 bra $L111; $L114: .loc 1 2980 4 @ %r1929 bra $L112; bra $L113; $L111: .loc 1 2978 32 add.u64 %r263,%r289,1; .loc 1 2978 7 add.u64 %r900,%r900,%r902; add.u64 %r905,%r905,%r151; setp.eq.u64 %r1498,%r288,%r289; @ %r1498 bra $L1; mov.u64 %r289,%r263; $L107: .loc 1 2979 2 @ %r1928 bra $L111; shl.b64 %r908,%r905,4; add.u64 %r891,%r221,%r908; add.u64 %r894,%r1598,%r908; mov.u64 %r889,0; .loc 1 2979 9 mov.u64 %r264,%r889; bra $L114; $L103: shl.b64 %r929,%r844,4; shl.b64 %r920,%r150,4; shl.b64 %r924,%r663,4; .loc 1 2993 7 mov.u64 %r938,0; mov.u64 %r936,%r938; .loc 1 2993 14 mov.u64 %r260,%r938; setp.gt.s64 %r1981,%r162,0; setp.le.s64 %r1601,%r638,0; $L116: .loc 1 2997 4 @ %r1981 bra $L115; $L120: .loc 1 2993 32 add.u64 %r260,%r260,1; .loc 1 2993 7 add.u64 %r936,%r936,%r664; add.u64 %r938,%r938,%r151; setp.ne.u64 %r1502,%r163,%r260; @ %r1502 bra $L116; 
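// Assumption: $L115/$L118 below (source lines 2997-3003) are the generic strided loops:
// the inner reduction walks the operands by the element strides held in %r920 and %r924
// (both pre-scaled by 16 bytes), accumulates a complex sum, and stores it at [%r927]
// before the outer counters advance by %r298 and %r151.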
bra $L1; $L115: shl.b64 %r1503,%r936,4; add.u64 %r927,%r222,%r1503; shl.b64 %r1504,%r938,4; add.u64 %r944,%r221,%r1504; .loc 1 2997 4 mov.u64 %r931,0; .loc 1 2997 11 mov.u64 %r261,%r931; $L119: .loc 1 3001 8 @ %r1601 bra $L144; shl.b64 %r1506,%r931,4; add.u64 %r918,%r220,%r1506; mov.u64 %r922,%r944; mov.f64 %r161,0d0000000000000000; mov.f64 %r211,%r161; .loc 1 3001 15 mov.u64 %r262,0; and.b64 %r1689,%r638,3; setp.eq.u64 %r1692,%r1689,0; @ %r1692 bra $L118; setp.eq.u64 %r1691,%r1689,1; @ %r1691 bra $L298; setp.eq.u64 %r1690,%r1689,2; @ %r1690 bra $L299; bra $L349; $L118: ld.f64 %r285,[%r918]; ld.f64 %r214,[%r918+8]; ld.f64 %r159,[%r922]; ld.f64 %r213,[%r922+8]; .loc 1 3002 28 mul.f64 %r271,%r159,%r214; mul.f64 %r1507,%r213,%r214; neg.f64 %r1508,%r1507; fma.rn.f64 %r212,%r159,%r285,%r1508; fma.rn.f64 %r158,%r213,%r285,%r271; .loc 1 3002 5 add.f64 %r1983,%r211,%r212; add.f64 %r1984,%r161,%r158; .loc 1 3001 8 add.u64 %r1986,%r918,%r920; add.u64 %r1987,%r922,%r924; ld.f64 %r1988,[%r1986]; ld.f64 %r1989,[%r1986+8]; ld.f64 %r1990,[%r1987]; ld.f64 %r1991,[%r1987+8]; .loc 1 3002 28 mul.f64 %r1992,%r1990,%r1989; mul.f64 %r1993,%r1991,%r1989; neg.f64 %r1994,%r1993; fma.rn.f64 %r1995,%r1990,%r1988,%r1994; fma.rn.f64 %r1996,%r1991,%r1988,%r1992; .loc 1 3002 5 add.f64 %r1997,%r1983,%r1995; add.f64 %r1998,%r1984,%r1996; .loc 1 3001 8 add.u64 %r2000,%r1986,%r920; add.u64 %r2001,%r1987,%r924; ld.f64 %r2003,[%r2000]; ld.f64 %r2004,[%r2000+8]; ld.f64 %r2005,[%r2001]; ld.f64 %r2006,[%r2001+8]; .loc 1 3002 28 mul.f64 %r2007,%r2005,%r2004; mul.f64 %r2008,%r2006,%r2004; neg.f64 %r2009,%r2008; fma.rn.f64 %r2010,%r2005,%r2003,%r2009; fma.rn.f64 %r2011,%r2006,%r2003,%r2007; .loc 1 3002 5 add.f64 %r2012,%r1997,%r2010; add.f64 %r2013,%r1998,%r2011; .loc 1 3001 8 add.u64 %r2015,%r2000,%r920; add.u64 %r2016,%r2001,%r924; ld.f64 %r2018,[%r2015]; ld.f64 %r2019,[%r2015+8]; ld.f64 %r2020,[%r2016]; ld.f64 %r2021,[%r2016+8]; .loc 1 3002 28 mul.f64 %r2022,%r2020,%r2019; mul.f64 %r2023,%r2021,%r2019; neg.f64 %r2024,%r2023; fma.rn.f64 %r2025,%r2020,%r2018,%r2024; fma.rn.f64 %r2026,%r2021,%r2018,%r2022; .loc 1 3002 5 add.f64 %r211,%r2012,%r2025; add.f64 %r161,%r2013,%r2026; .loc 1 3001 32 add.u64 %r262,%r262,4; .loc 1 3001 8 add.u64 %r918,%r2015,%r920; add.u64 %r922,%r2016,%r924; setp.ne.u64 %r2027,%r638,%r262; @ %r2027 bra $L118; bra $L117; $L144: mov.f64 %r161,0d0000000000000000; mov.f64 %r211,%r161; $L117: .loc 1 3003 27 st.f64 [%r927],%r211; st.f64 [%r927+8],%r161; .loc 1 2997 29 add.u64 %r261,%r261,1; .loc 1 2997 4 add.u64 %r927,%r927,%r929; add.u64 %r931,%r931,%r298; setp.ne.u64 %r1510,%r162,%r261; @ %r1510 bra $L119; bra $L120; $L32: .loc 1 2920 11 setp.eq.u64 %r1511,%r150,1; @ %r1511 bra $L121; bra $L20; $L2: .loc 1 2476 6 ld.u64 %r1512,[%r945]; setp.ne.u64 %r1513,%r1512,0; @ %r1513 bra $L122; .loc 1 2485 4 ld.u64 %r1515,[%r946+56]; add.u64 %r1514,%r1515,1; ld.u64 %r1516,[%r946+48]; sub.u64 %r668,%r1514,%r1516; .loc 1 2483 15 ld.s8 %r1518,[%r947+28]; cvt.u16.u32 %r1517,%r1518; setp.eq.u16 %r1519,%r1517,1; @ %r1519 bra $L123; bra $L124; $L126: .loc 1 2560 16 ld.u64 %r298,[%r946+40]; .loc 1 2548 27 mov.u64 %r844,%r664; bra $L125; $L14: ld.u64 %r664,[%r945+40]; .loc 1 2543 6 ld.s8 %r1521,[%r945+28]; cvt.u16.u32 %r1520,%r1521; setp.eq.u16 %r1522,%r1520,1; @ ! 
%r1522 bra $L127; bra $L126; $L11: .loc 1 2560 16 ld.u64 %r298,[%r946+40]; mov.u32 %r627,%r901; .loc 1 2548 27 mov.u64 %r664,%r844; .loc 1 2563 14 mov.u64 %r162,1; .loc 1 2561 16 mov.u64 %r150,%r162; bra $L12; $L5: .loc 1 2485 4 ld.u64 %r1524,[%r946+56]; add.u64 %r1523,%r1524,1; ld.u64 %r1525,[%r946+48]; sub.u64 %r668,%r1523,%r1525; bra $L124; $L128: .loc 1 2509 17 ld.u64 %r1527,[%r945+56]; add.u64 %r1526,%r1527,1; .loc 1 2509 15 ld.u64 %r1528,[%r945+48]; sub.u64 %r638,%r1526,%r1528; ld.s8 %r1551,[%r947+28]; bra $L9; $L122: .loc 1 2502 12 cvta.global.u64 %r1529,_gfortrani_compile_options; .loc 1 2502 11 ld.u32 %r1530,[%r1529+36]; setp.ne.u32 %r1531,%r1530,0; @ ! %r1531 bra $L8; bra $L128; $L1: .loc 1 3007 1 ret; $L342: ld.f64 %r2028,[%r616]; ld.f64 %r2029,[%r621]; ld.f64 %r2030,[%r578]; ld.f64 %r2031,[%r578+8]; .loc 1 2908 13 mul.f64 %r2032,%r2029,%r2030; mul.f64 %r2033,%r2029,%r2031; neg.f64 %r2034,%r2033; fma.rn.f64 %r2035,%r2028,%r2030,%r2034; fma.rn.f64 %r2036,%r2028,%r2031,%r2032; .loc 1 2907 11 add.f64 %r460,%r460,%r2035; add.f64 %r461,%r461,%r2036; .loc 1 2905 10 add.u64 %r297,%r616,16; add.u64 %r256,%r578,16; add.u64 %r621,%r616,24; $L287: ld.f64 %r2038,[%r297]; ld.f64 %r2039,[%r621]; ld.f64 %r2040,[%r256]; ld.f64 %r2041,[%r256+8]; .loc 1 2908 13 mul.f64 %r2042,%r2039,%r2040; mul.f64 %r2043,%r2039,%r2041; neg.f64 %r2044,%r2043; fma.rn.f64 %r2045,%r2038,%r2040,%r2044; fma.rn.f64 %r2046,%r2038,%r2041,%r2042; .loc 1 2907 11 add.f64 %r460,%r460,%r2045; add.f64 %r461,%r461,%r2046; .loc 1 2905 10 add.u64 %r297,%r297,16; add.u64 %r256,%r256,16; add.u64 %r621,%r621,16; $L286: ld.f64 %r2048,[%r297]; ld.f64 %r2049,[%r621]; ld.f64 %r2050,[%r256]; ld.f64 %r2051,[%r256+8]; .loc 1 2908 13 mul.f64 %r2052,%r2049,%r2050; mul.f64 %r2053,%r2049,%r2051; neg.f64 %r2054,%r2053; fma.rn.f64 %r2055,%r2048,%r2050,%r2054; fma.rn.f64 %r2056,%r2048,%r2051,%r2052; .loc 1 2907 11 add.f64 %r460,%r460,%r2055; add.f64 %r461,%r461,%r2056; .loc 1 2905 10 add.u64 %r297,%r297,16; add.u64 %r256,%r256,16; add.u64 %r621,%r621,16; setp.ne.u64 %r2057,%r256,%r575; @ %r2057 bra $L83; bra $L82; $L341: ld.f64 %r2058,[%r737+8]; .loc 1 2762 43 ld.f64 %r2059,[%r737]; st.f64 [%r678],%r2059; st.f64 [%r683],%r2058; .loc 1 2760 9 add.u64 %r676,%r737,16; add.u64 %r677,%r678,4096; add.u64 %r682,%r683,4096; $L285: ld.f64 %r2061,[%r676+8]; .loc 1 2762 43 ld.f64 %r2062,[%r676]; st.f64 [%r677],%r2062; st.f64 [%r682],%r2061; .loc 1 2760 9 add.u64 %r676,%r676,16; add.u64 %r677,%r677,4096; add.u64 %r682,%r682,4096; $L284: ld.f64 %r2064,[%r676+8]; .loc 1 2762 43 ld.f64 %r2065,[%r676]; st.f64 [%r677],%r2065; st.f64 [%r682],%r2064; .loc 1 2760 9 add.u64 %r676,%r676,16; add.u64 %r677,%r677,4096; add.u64 %r682,%r682,4096; setp.ne.u64 %r2066,%r676,%r692; @ %r2066 bra $L58; bra $L48; $L340: ld.f64 %r2067,[%r694+8]; .loc 1 2740 49 ld.f64 %r2068,[%r694]; st.f64 [%r695],%r2068; st.f64 [%r695+8],%r2067; ld.f64 %r2069,[%r696+8]; .loc 1 2742 49 ld.f64 %r2070,[%r696]; st.f64 [%r695+16],%r2070; st.f64 [%r695+24],%r2069; ld.f64 %r2071,[%r694+24]; .loc 1 2744 49 ld.f64 %r2072,[%r694+16]; st.f64 [%r695+4096],%r2072; st.f64 [%r695+4104],%r2071; ld.f64 %r2073,[%r696+24]; .loc 1 2746 49 ld.f64 %r2074,[%r696+16]; st.f64 [%r695+4112],%r2074; st.f64 [%r695+4120],%r2073; .loc 1 2738 9 add.u64 %r694,%r694,32; add.u64 %r695,%r695,8192; add.u64 %r696,%r696,32; setp.ne.u64 %r2075,%r696,%r706; @ %r2075 bra $L52; bra $L53; $L338: .loc 1 2688 2 @ ! 
%r1556 bra $L350; .loc 1 2689 22 mov.u32 %r20762076077,[%value_in]; } $L350: .loc 1 2687 24 mov.u64 %r255,2; .loc 1 2687 7 add.u64 %r800,%r222,%r802; $L283: .loc 1 2688 2 @ ! %r1556 bra $L351; .loc 1 2689 22 mov.u32 %r20792079080,[%value_in]; } $L351: .loc 1 2687 24 add.u64 %r255,%r255,1; .loc 1 2687 7 add.u64 %r800,%r800,%r802; $L282: .loc 1 2688 2 @ ! %r1556 bra $L352; .loc 1 2689 22 mov.u32 %r2082082083,[%value_in]; } $L352: .loc 1 2687 24 add.u64 %r255,%r255,1; .loc 1 2687 7 add.u64 %r800,%r800,%r802; setp.eq.u64 %r2084,%r255,%r806; @ ! %r2084 bra $L34; bra $L33; $L220: .loc 1 2689 22 mov.u32 %r208516952085086,[%value_in]; } $L339: .loc 1 2687 7 add.u64 %r2088,%r1695,%r802; .loc 1 2688 2 @ ! %r1556 bra $L353; .loc 1 2689 22 mov.u32 %r20208090091,[%value_in]; } $L353: .loc 1 2687 7 add.u64 %r2093,%r2088,%r802; .loc 1 2688 2 @ ! %r1556 bra $L354; .loc 1 2689 22 mov.u32 %r202092095096,[%value_in]; } $L354: .loc 1 2687 24 add.u64 %r255,%r1640,3; .loc 1 2687 7 add.u64 %r800,%r2093,%r802; setp.eq.u64 %r2097,%r255,%r806; @ ! %r2097 bra $L34; bra $L33; $L344: ld.f64 %r2098,[%r816]; ld.f64 %r2099,[%r809]; ld.f64 %r2100,[%r834]; ld.f64 %r2101,[%r810]; .loc 1 2938 23 mul.f64 %r2102,%r2100,%r2099; mul.f64 %r2103,%r2101,%r2099; neg.f64 %r2104,%r2103; .loc 1 2938 9 fma.rn.f64 %r103,%r2100,%r2098,%r2104; fma.rn.f64 %r108,%r2101,%r2098,%r2102; .loc 1 2937 5 add.u64 %r807,%r816,16; add.u64 %r808,%r834,16; add.u64 %r809,%r816,24; add.u64 %r810,%r834,24; $L289: ld.f64 %r2108,[%r807]; ld.f64 %r2109,[%r809]; ld.f64 %r2110,[%r808]; ld.f64 %r2111,[%r810]; .loc 1 2938 23 mul.f64 %r2112,%r2110,%r2109; mul.f64 %r2113,%r2111,%r2109; neg.f64 %r2114,%r2113; fma.rn.f64 %r2115,%r2110,%r2108,%r2114; fma.rn.f64 %r2116,%r2111,%r2108,%r2112; .loc 1 2938 9 add.f64 %r103,%r103,%r2115; add.f64 %r108,%r108,%r2116; .loc 1 2937 5 add.u64 %r807,%r807,16; add.u64 %r808,%r808,16; add.u64 %r809,%r809,16; add.u64 %r810,%r810,16; $L288: ld.f64 %r2118,[%r807]; ld.f64 %r2119,[%r809]; ld.f64 %r2120,[%r808]; ld.f64 %r2121,[%r810]; .loc 1 2938 23 mul.f64 %r2122,%r2120,%r2119; mul.f64 %r2123,%r2121,%r2119; neg.f64 %r2124,%r2123; fma.rn.f64 %r2125,%r2120,%r2118,%r2124; fma.rn.f64 %r2126,%r2121,%r2118,%r2122; .loc 1 2938 9 add.f64 %r103,%r103,%r2125; add.f64 %r108,%r108,%r2126; .loc 1 2937 5 add.u64 %r807,%r807,16; add.u64 %r808,%r808,16; add.u64 %r809,%r809,16; add.u64 %r810,%r810,16; setp.ne.u64 %r2127,%r807,%r821; @ %r2127 bra $L93; bra $L92; $L345: ld.f64 %r2128,[%r220]; ld.f64 %r2129,[%r845]; ld.f64 %r2130,[%r843]; ld.f64 %r2131,[%r843+8]; .loc 1 2953 26 mul.f64 %r2132,%r2130,%r2129; mul.f64 %r2133,%r2131,%r2129; neg.f64 %r2134,%r2133; .loc 1 2953 5 fma.rn.f64 %r38,%r2130,%r2128,%r2134; fma.rn.f64 %r37,%r2131,%r2128,%r2132; .loc 1 2952 8 add.u64 %r840,%r220,%r842; add.u64 %r843,%r843,16; add.u64 %r845,%r845,%r842; $L291: ld.f64 %r2138,[%r840]; ld.f64 %r2139,[%r845]; ld.f64 %r2140,[%r843]; ld.f64 %r2141,[%r843+8]; .loc 1 2953 26 mul.f64 %r2142,%r2140,%r2139; mul.f64 %r2143,%r2141,%r2139; neg.f64 %r2144,%r2143; fma.rn.f64 %r2145,%r2140,%r2138,%r2144; fma.rn.f64 %r2146,%r2141,%r2138,%r2142; .loc 1 2953 5 add.f64 %r38,%r38,%r2145; add.f64 %r37,%r37,%r2146; .loc 1 2952 8 add.u64 %r840,%r840,%r842; add.u64 %r843,%r843,16; add.u64 %r845,%r845,%r842; $L290: ld.f64 %r2148,[%r840]; ld.f64 %r2149,[%r845]; ld.f64 %r2150,[%r843]; ld.f64 %r2151,[%r843+8]; .loc 1 2953 26 mul.f64 %r2152,%r2150,%r2149; mul.f64 %r2153,%r2151,%r2149; neg.f64 %r2154,%r2153; fma.rn.f64 %r2155,%r2150,%r2148,%r2154; fma.rn.f64 %r2156,%r2151,%r2148,%r2152; .loc 1 2953 5 
add.f64 %r38,%r38,%r2155; add.f64 %r37,%r37,%r2156; .loc 1 2952 8 add.u64 %r840,%r840,%r842; add.u64 %r843,%r843,16; add.u64 %r845,%r845,%r842; setp.ne.u64 %r2157,%r843,%r861; @ %r2157 bra $L97; bra $L96; $L346: ld.f64 %r2158,[%r220]; ld.f64 %r2159,[%r220+8]; ld.f64 %r2160,[%r865]; ld.f64 %r2161,[%r865+8]; .loc 1 2968 29 mul.f64 %r2162,%r2160,%r2159; mul.f64 %r2163,%r2161,%r2159; neg.f64 %r2164,%r2163; .loc 1 2968 8 fma.rn.f64 %r188,%r2160,%r2158,%r2164; fma.rn.f64 %r152,%r2161,%r2158,%r2162; .loc 1 2967 28 mov.u64 %r269,1; .loc 1 2967 4 add.u64 %r862,%r220,%r864; add.u64 %r865,%r865,%r867; $L293: ld.f64 %r2168,[%r862]; ld.f64 %r2169,[%r862+8]; ld.f64 %r2170,[%r865]; ld.f64 %r2171,[%r865+8]; .loc 1 2968 29 mul.f64 %r2172,%r2170,%r2169; mul.f64 %r2173,%r2171,%r2169; neg.f64 %r2174,%r2173; fma.rn.f64 %r2175,%r2170,%r2168,%r2174; fma.rn.f64 %r2176,%r2171,%r2168,%r2172; .loc 1 2968 8 add.f64 %r188,%r188,%r2175; add.f64 %r152,%r152,%r2176; .loc 1 2967 28 add.u64 %r269,%r269,1; .loc 1 2967 4 add.u64 %r862,%r862,%r864; add.u64 %r865,%r865,%r867; $L292: ld.f64 %r2178,[%r862]; ld.f64 %r2179,[%r862+8]; ld.f64 %r2180,[%r865]; ld.f64 %r2181,[%r865+8]; .loc 1 2968 29 mul.f64 %r2182,%r2180,%r2179; mul.f64 %r2183,%r2181,%r2179; neg.f64 %r2184,%r2183; fma.rn.f64 %r2185,%r2180,%r2178,%r2184; fma.rn.f64 %r2186,%r2181,%r2178,%r2182; .loc 1 2968 8 add.f64 %r188,%r188,%r2185; add.f64 %r152,%r152,%r2186; .loc 1 2967 28 add.u64 %r269,%r269,1; .loc 1 2967 4 add.u64 %r862,%r862,%r864; add.u64 %r865,%r865,%r867; setp.ne.u64 %r2187,%r638,%r269; @ %r2187 bra $L100; bra $L99; $L348: ld.f64 %r2188,[%r881]; ld.f64 %r2189,[%r881+8]; ld.f64 %r2190,[%r891]; ld.f64 %r2191,[%r894]; .loc 1 2983 37 mul.f64 %r2192,%r2189,%r2190; ld.f64 %r2193,[%r900]; fma.rn.f64 %r2194,%r2190,%r2188,%r2193; fma.rn.f64 %r2195,%r2191,%r2188,%r2192; .loc 1 2982 36 neg.f64 %r2196,%r2189; fma.rn.f64 %r2197,%r2196,%r2191,%r2194; ld.f64 %r2198,[%r884]; add.f64 %r2199,%r2195,%r2198; st.f64 [%r900],%r2197; st.f64 [%r884],%r2199; .loc 1 2980 29 mov.u64 %r265,1; .loc 1 2980 4 add.u64 %r878,%r900,%r912; add.u64 %r881,%r881,%r883; add.u64 %r884,%r884,%r912; $L297: ld.f64 %r2201,[%r881]; ld.f64 %r2202,[%r881+8]; ld.f64 %r2203,[%r891]; ld.f64 %r2204,[%r894]; .loc 1 2983 37 mul.f64 %r2205,%r2202,%r2203; ld.f64 %r2206,[%r878]; fma.rn.f64 %r2207,%r2203,%r2201,%r2206; fma.rn.f64 %r2208,%r2204,%r2201,%r2205; .loc 1 2982 36 neg.f64 %r2209,%r2202; fma.rn.f64 %r2210,%r2209,%r2204,%r2207; ld.f64 %r2211,[%r884]; add.f64 %r2212,%r2208,%r2211; st.f64 [%r878],%r2210; st.f64 [%r884],%r2212; .loc 1 2980 29 add.u64 %r265,%r265,1; .loc 1 2980 4 add.u64 %r878,%r878,%r912; add.u64 %r881,%r881,%r883; add.u64 %r884,%r884,%r912; $L296: ld.f64 %r2214,[%r881]; ld.f64 %r2215,[%r881+8]; ld.f64 %r2216,[%r891]; ld.f64 %r2217,[%r894]; .loc 1 2983 37 mul.f64 %r2218,%r2215,%r2216; ld.f64 %r2219,[%r878]; fma.rn.f64 %r2220,%r2216,%r2214,%r2219; fma.rn.f64 %r2221,%r2217,%r2214,%r2218; .loc 1 2982 36 neg.f64 %r2222,%r2215; fma.rn.f64 %r2223,%r2222,%r2217,%r2220; ld.f64 %r2224,[%r884]; add.f64 %r2225,%r2221,%r2224; st.f64 [%r878],%r2223; st.f64 [%r884],%r2225; .loc 1 2980 29 add.u64 %r265,%r265,1; .loc 1 2980 4 add.u64 %r878,%r878,%r912; add.u64 %r881,%r881,%r883; add.u64 %r884,%r884,%r912; setp.ne.u64 %r2226,%r162,%r265; @ %r2226 bra $L110; bra $L113; $L347: .loc 1 2976 34 st.f64 [%r910],%r1599; st.f64 [%r910+8],%r1600; .loc 1 2975 27 mov.u64 %r267,1; .loc 1 2975 2 add.u64 %r910,%r910,%r912; $L295: .loc 1 2976 34 st.f64 [%r910],%r1599; st.f64 [%r910+8],%r1600; .loc 1 2975 27 add.u64 
%r267,%r267,1; .loc 1 2975 2 add.u64 %r910,%r910,%r912; $L294: .loc 1 2976 34 st.f64 [%r910],%r1599; st.f64 [%r910+8],%r1600; .loc 1 2975 27 add.u64 %r267,%r267,1; .loc 1 2975 2 add.u64 %r910,%r910,%r912; setp.ne.u64 %r2229,%r162,%r267; @ %r2229 bra $L105; bra $L109; $L349: ld.f64 %r2230,[%r918]; ld.f64 %r2231,[%r918+8]; ld.f64 %r2232,[%r944]; ld.f64 %r2233,[%r944+8]; .loc 1 3002 28 mul.f64 %r2234,%r2232,%r2231; mul.f64 %r2235,%r2233,%r2231; neg.f64 %r2236,%r2235; .loc 1 3002 5 fma.rn.f64 %r211,%r2232,%r2230,%r2236; fma.rn.f64 %r161,%r2233,%r2230,%r2234; .loc 1 3001 32 mov.u64 %r262,1; .loc 1 3001 8 add.u64 %r918,%r918,%r920; add.u64 %r922,%r944,%r924; $L299: ld.f64 %r2240,[%r918]; ld.f64 %r2241,[%r918+8]; ld.f64 %r2242,[%r922]; ld.f64 %r2243,[%r922+8]; .loc 1 3002 28 mul.f64 %r2244,%r2242,%r2241; mul.f64 %r2245,%r2243,%r2241; neg.f64 %r2246,%r2245; fma.rn.f64 %r2247,%r2242,%r2240,%r2246; fma.rn.f64 %r2248,%r2243,%r2240,%r2244; .loc 1 3002 5 add.f64 %r211,%r211,%r2247; add.f64 %r161,%r161,%r2248; .loc 1 3001 32 add.u64 %r262,%r262,1; .loc 1 3001 8 add.u64 %r918,%r918,%r920; add.u64 %r922,%r922,%r924; $L298: ld.f64 %r2250,[%r918]; ld.f64 %r2251,[%r918+8]; ld.f64 %r2252,[%r922]; ld.f64 %r2253,[%r922+8]; .loc 1 3002 28 mul.f64 %r2254,%r2252,%r2251; mul.f64 %r2255,%r2253,%r2251; neg.f64 %r2256,%r2255; fma.rn.f64 %r2257,%r2252,%r2250,%r2256; fma.rn.f64 %r2258,%r2253,%r2250,%r2254; .loc 1 3002 5 add.f64 %r211,%r211,%r2257; add.f64 %r161,%r161,%r2258; .loc 1 3001 32 add.u64 %r262,%r262,1; .loc 1 3001 8 add.u64 %r918,%r918,%r920; add.u64 %r922,%r922,%r924; setp.ne.u64 %r2259,%r638,%r262; @ %r2259 bra $L118; bra $L117; } matmul_c10.o/ matmul_l4.o/_gfortran_matmul_l4 .visible .func _gfortran_matmul_l4fortran/generated/matmul_l4.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]FUNCTION DECL: _gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_97,116,109,117,108,95,108,5243116,114,97,110,47,103,101,110,101,114,97,116,101,100,47,109,97,116,109,117,108,95,108,52,46,9973,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,97,41,32,61,61,32,50,32,124,124,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,98,41,32,61,61,3273,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,50,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,12199,111,117,110,116,32,61,61,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,69,88,84,69,78,84,40,98,44,48,473,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,49,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_matmul_l4 .visible 
.func _gfortran_matmul_l4u64pred %r165; .reg .predpredpredpredpred32u64 %r271; .reg .u16 %r272; .reg .u32 %r273; .reg .pred %r274; .reg .u64 %r275; .reg .u64u64 %r280; .reg .u64pred %r310; .reg .predu64 %r327; .reg .u64pred %r334; .reg .predpred %r345; .reg .pred %r346; .reg .pred %r347; .reg .pred %r348; .reg .pred %r349; .reg .pred %r350; .reg .predpredpredpredpred %r392; .reg .u32 %r394; .reg .u16 %r395; .reg .predpredpredpredpredpredpred %r450; .reg .u64 %r451; .reg .u64 %r452; .reg .u32 %r455; .reg .u16 %r456; .reg .pred %r457; .reg .u32 %r458; .reg .u16 %r459; .reg .pred %r460; .reg .u64 %r461; .reg .u64 %r462; .reg .u32 %r465; .reg .u16 %r466; .reg .pred %r467; .reg .u32mov.u64 %r152,%ar0; mov.u64 %r153,%ar1; mov.u64 %r154,%ar2; .loc 1 64 3 ld.s8 %r22,[%r153+28]; cvt.u16.u32 %r155,%r22; setp.eq.u16 %r156,%r155,2; @ %r156 bra $L2; ld.s8 %r333,[%r154+28]; cvt.u16.u32 %r157,%r333; setp.eq.u16 %r159,%r157,2; @ %r159 bra $L3; cvta.const.u64 %r163,$LC0; cvta.const.u64 %r162,__func__$0; mov.u32 %r161,64; cvta.const.u64 %r160,$LC1; {16233: .loc 1 67 6 ld.u64 %r164,[%r1524; .loc 1 69 10 setp.ne.u16 %r167,%r155,1; @ %r167 bra $L5; .loc 1 71 4 st.u64 [%r152+48],%r164; ld.u64 %r170,[%r154+80]; ld.u64 %r172,[%r154+72]; sub.u64 %r173,%r170,%r172; st.u64 [%r152+56],%r173; mov.u64 %r174,1; st.u64 [%r152+40],%r174; bra $L6; $L28: .loc 1 76 4 st.u64 [%r152+48],%r316; add.u64 %r176,%r124,-1; st.u64 [%r152+56],%r176; mov.u64 %r177,1; st.u64 [%r152+40],%r177; bra $L6; $L29: .loc 1 81 4 mov.u64 %r178,0; st.u64 [%r152+48],%r178; add.u64 %r179,%r124,-1; st.u64 [%r152+56],%r179; mov.u64 %r180,1; st.u64 [%r152+40],%r180; .loc 1 84 11 st.u64 [%r152+72],%r178; ld.u64 %r183,[%r154+80]; ld.u64 %r185,[%r154+72]; sub.u64 %r186,%r183,%r185; st.u64 [%r152+80],%r186; st.u64 [%r152+64],%r124; $L6:152; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r188,[%value_in]; } .loc 1 90 4call (%value_in),_gfortrani_xmallocarray191,[%value_in]; } .loc 1 90 2 st.u64 [%r152],%r191; .loc 1 91 24 mov.u64 %r193,0; st.u64 [%r152+8],%r193; bra $L7; $L4: .loc 1 93 14 cvta.global.u64 %r194,_gfortrani_compile_options; .loc 1 93 13 ld.u32 %r195,[%r194+36]; setp.eq.u32 %r196,%r195,0; @ %r196 bra $L7; .loc 1 100 19 ld.u64 %r198,[%r152+56]; add.u64 %r197,%r198,1; .loc 1 100 17 ld.u64 %r199,[%r152+48]; sub.u64 %r123,%r197,%r199; .loc 1 97 5 setp.ne.u16 %r201,%r155,1; @ %r201 bra $L8; .loc 1 99 19 ld.u64 %r203,[%r154+80]; add.u64 %r202,%r203,1; .loc 1 99 17 ld.u64 %r204,[%r154+72]; sub.u64 %r107,%r202,%r204; .loc 1 101 9 setp.eq.u64 %r205,%r107,%r123; @ %r205 bra $L7; .loc 1 102 8 st.u64 [%stack+8],%r107; st.u64 [%stack],%r123; cvta.const.u64 %r206,$LC2; {stack; call _gfortran_runtime_error76 4 ld.u64 %r209,[%r153+56]; add.u64 %r208,%r209,1; ld.u64 %r210,[%r153+48]; sub.u64 %r148,%r208,%r210; .loc 1 106 10 cvt.u16.u32 %r211,%r333; setp.ne.u16 %r213,%r211,1; @ %r213 bra $L9; .loc 1 110 9 setp.eq.u64 %r214,%r123,%r148; @ %r214 bra $L7; .loc 1 111 8 st.u64 [%stack+8],%r148; st.u64 [%stack],%r123; cvta.const.u64 %r215215_gfortran_runtime_error119 9 setp.eq.u64 %r217,%r123,%r148; @ %r217 bra $L10; .loc 1 120 8 st.u64 [%stack+8],%r148; st.u64 [%stack],%r123;_gfortran_runtime_error10: .loc 1 125 19 ld.u64 %r221,[%r154+80]; add.u64 %r220,%r221,1; .loc 1 125 17 ld.u64 %r222,[%r154+72]; sub.u64 %r105,%r220,%r222; .loc 1 126 19 ld.u64 %r224,[%r152+80]; add.u64 %r223,%r224,1; .loc 1 126 17 ld.u64 %r225,[%r152+72]; sub.u64 %r106,%r223,%r225; .loc 1 127 9 setp.eq.u64 %r226,%r105,%r106; @ %r226 bra $L7; .loc 1 128 8 st.u64 [%stack+8],%r105; 
st.u64 [%stack],%r106; cvta.const.u64 %r227227_gfortran_runtime_error135 9 ld.u64 %r89,[%r153]; .loc 1 136 12 ld.u64 %r51,[%r153+16]; .loc 1 138 19 cvt.u32.u64 %r52,%r51; .loc 1 138 44 add.u32 %r229,%r52,-4; and.b32 %r230,%r229,-5; set.u32.eq.u32 %r232,%r230,0; neg.s32 %r233,%r232; .loc 1 138 19 add.u32 %r234,%r52,-1; set.u32.le.u32 %r236,%r234,1; neg.s32 %r237,%r236; .loc 1 138 6 cvt.u16.u32 %r239,%r233; cvt.u16.u32 %r240,%r237; or.b16 %r238,%r239,%r240; cvt.u32.u16 %r241,%r238; cvt.u16.u8 %r242,%r24111; .loc 1 140 6 setp.ne.u32 %r245,%r52,16; @ %r245 bra $L12; $L11: .loc 1 147 9 ld.u64 %r110,[%r154]; .loc 1 148 12 ld.u64 %r57,[%r154+16]; .loc 1 150 19 cvt.u32.u64 %r58,%r57; .loc 1 150 44 add.u32 %r246,%r58,-4; and.b32 %r247,%r246,-5; set.u32.eq.u32 %r249,%r247,0; neg.s32 %r250,%r249; .loc 1 150 19 add.u32 %r251,%r58,-1; set.u32.le.u32 %r253,%r251,1; neg.s32 %r254,%r253; .loc 1 150 6 cvt.u16.u32 %r256,%r250; cvt.u16.u32 %r257,%r254; or.b16 %r255,%r256,%r257; cvt.u32.u16 %r258,%r255; cvt.u16.u8 %r259,%r258; setp.ne.u16 %r260,%r259,0; @ %r260 bra $L13; bra $L122; $L12: .loc 1 145 5 cvta.const.u64 %r262,$LC5; mov.u64 %r261,call _gfortrani_internal_error122: .loc 1 152 6 setp.ne.u32 %r264,%r58,16; @ %r264 bra $L12; $L13: .loc 1 159 8 ld.u64 %r111,[%r152]; .loc 1 164 16 ld.u64 %r92,[%r152+40]; .loc 1 162 6 ld.s8 %r266,[%r152+28]; cvt.u16.u32 %r265,%r266; setp.eq.u16 %r267,%r265,1; @ %r267 bra $L31; .loc 1 170 16 ld.u64 %r93,[%r152+64]; bra $L15; $L31: .loc 1 165 16 mov.u64 %r93,%r92; $L15: .loc 1 177 17 ld.u64 %r268,[%r153+40]; mul.lo.u64 %r96,%r51,%r268; .loc 1 178 15 ld.u64 %r270,[%r153+56]; add.u64 %r269,%r270,1; .loc 1 178 13 ld.u64 %r271,[%r153+48]; sub.u64 %r94,%r269,%r271; .loc 1 175 6 ld.s8 %r273,[%r153+28]; cvt.u16.u32 %r272,%r273; setp.eq.u16 %r274,%r272,1; @ %r274 bra $L32; .loc 1 185 17 ld.u64 %r275,[%r153+64]; mul.lo.u64 %r99,%r51,%r275; .loc 1 186 15 ld.u64 %r277,[%r153+80]; add.u64 %r276,%r277,1; .loc 1 186 13 ld.u64 %r278,[%r153+72]; sub.u64 %r101,%r276,%r278; bra $L16; $L32: .loc 1 178 13 mov.u64 %r101,%r94; .loc 1 177 15 mov.u64 %r99,%r96; .loc 1 179 15 mov.u64 %r96,0; .loc 1 181 14 mov.u64 %r94,1; .loc 1 180 16 mov.u64 %r92,%r96; $L16: .loc 1 192 17 ld.u64 %r279,[%r154+40]; mul.lo.u64 %r131,%r57,%r279; .loc 1 193 7 ld.u64 %r281,[%r154+56]; add.u64 %r280,%r281,1; ld.u64 %r282,[%r154+48]; sub.u64 %r135,%r280,%r282; .loc 1 190 6 ld.s8 %r284,[%r154+28]; cvt.u16.u32 %r283,%r284; setp.ne.u16 %r285,%r283,1; @ %r285 bra $L17; .loc 1 193 7 setp.eq.u64 %r286,%r101,%r135; @ %r286 bra $L33; cvta.const.u64 %r290,$LC6; cvta.const.u64 %r289,__func__$0; mov.u32 %r288,193;r2289907: .loc 1 201 7 setp.eq.u64 %r291,%r101,%r135; @ %r291 bra $L19; cvta.const.u64 %r295,$LC6; cvta.const.u64 %r294,__func__$0; mov.u32 %r293,201;r293294959: .loc 1 202 17 ld.u64 %r296,[%r154+64]; mul.lo.u64 %r74,%r57,%r296; .loc 1 203 16 ld.u64 %r298,[%r154+80]; add.u64 %r297,%r298,1; .loc 1 203 14 ld.u64 %r299,[%r154+72]; sub.u64 %r91,%r297,%r299; .loc 1 206 3 setp.le.s64 %r300,%r91,0; @ %r300 bra $L1; mov.u64 %r122,%r93; bra $L18; $L33: mov.u64 %r74,0; mov.u64 %r122,%r74; .loc 1 196 14 mov.u64 %r91,1; $L18: .loc 1 228 16 shl.b64 %r83,%r92,2; .loc 1 231 24 mul.lo.u64 %r55,%r94,%r96; .loc 1 231 13 neg.s64 %r85,%r55; .loc 1 233 36 mul.lo.u64 %r301,%r92,%r94; .loc 1 233 24 sub.u64 %r302,%r122,%r301; .loc 1 233 12 shl.b64 %r87,%r302,2; mul.lo.u64 %r86,%r83,%r94; .loc 1 206 10 mov.u64 %r112,0; setp.gt.s64 %r334,%r94,0; setp.le.s64 %r337,%r101,0; .loc 1 215 17 cvt.u32.u64 %r338,%r112; .loc 1 221 25 mov.u32 %r339,1; bra 
$L21; $L34: .loc 1 208 7 mov.u64 %r116,%r111; mov.u64 %r117,%r89; .loc 1 208 14 mov.u64 %r118,0; $L25: .loc 1 215 17 st.u32 [%r116],%r338; .loc 1 217 11 @ %r337 bra $L22; mov.u64 %r114,%r110; mov.u64 %r113,%r117; .loc 1 217 18 mov.u64 %r115,0; and.b64 %r344,%r101,7; setp.eq.u64 %r351,%r344,0; @ %r351 bra $L24; setp.eq.u64 %r350,%r344,1; @ %r350 bra $L78; setp.eq.u64 %r349,%r344,2; @ %r349 bra $L79; setp.eq.u64 %r348,%r344,3; @ %r348 bra $L80; setp.eq.u64 %r347,%r344,4; @ %r347 bra $L81; setp.eq.u64 %r346,%r344,5; @ %r346 bra $L82; setp.eq.u64 %r345,%r344,6; @ %r345 bra $L83; bra $L123; $L24: .loc 1 219 18 ld.s8 %r306,[%r113]; cvt.u16.u32 %r305,%r306; setp.eq.u16 %r307,%r305,0; @ %r307 bra $L23; .loc 1 219 23 ld.s8 %r309,[%r114]; cvt.u16.u32 %r308,%r309; setp.eq.u16 %r310,%r308,0; @ %r310 bra $L23; $L125: .loc 1 221 25 st.u32 [%r116],%r339; .loc 1 222 19 bra $L22; $L23: .loc 1 224 18 add.u64 %r353,%r113,%r99; .loc 1 225 18 add.u64 %r354,%r114,%r131; .loc 1 217 35 add.u64 %r352,%r115,1; .loc 1 219 18 ld.s8 %r356,[%r353]; cvt.u16.u32bra $L124; $L22: .loc 1 228 16 add.u64 %r116,%r116,%r83; .loc 1 229 17 add.u64 %r117,%r117,%r96; .loc 1 208 32 add.u64 %r118,%r118,1; .loc 1 208 7 setp.ne.u64 %r313,%r94,%r118; @ %r313 bra $L25; .loc 1 229 17 add.u64 %r89,%r89,%r55; .loc 1 228 16 add.u64 %r111,%r111,%r86; $L26: .loc 1 231 13 add.u64 %r89,%r89,%r85; .loc 1 232 13 add.u64 %r110,%r110,%r74; .loc 1 233 12 add.u64 %r111,%r111,%r87; .loc 1 206 28 add.u64 %r112,%r112,1; .loc 1 206 3 setp.le.s64 %r314,%r91,%r112; @ %r314 bra $L1; $L21: .loc 1 208 7 @ %r334 bra $L34; bra $L26; $L2: .loc 1 67 6 ld.u64 %r316,[%r152]; setp.ne.u64 %r317,%r316,0; @ %r317 bra $L27; .loc 1 76 4 ld.u64 %r319,[%r153+56]; add.u64 %r318,%r319,1; ld.u64 %r320,[%r153+48]; sub.u64 %r124,%r318,%r320; .loc 1 74 15 ld.s8 %r322,[%r154+28]; cvt.u16.u32 %r321,%r322; setp.eq.u16 %r323,%r321,1; @ %r323 bra $L28; bra $L29; $L5: .loc 1 76 4 ld.u64 %r325,[%r153+56]; add.u64 %r324,%r325,1; ld.u64 %r326,[%r153+48]; sub.u64 %r124,%r324,%r326; bra $L29; $L30: .loc 1 100 19 ld.u64 %r328,[%r152+56]; add.u64 %r327,%r328,1; .loc 1 100 17 ld.u64 %r329,[%r152+48]; sub.u64 %r123,%r327,%r329; ld.s8 %r333,[%r154+28]; bra $L8; $L27: .loc 1 93 14 cvta.global.u64 %r330,_gfortrani_compile_options; .loc 1 93 13 ld.u32 %r331,[%r330+36]; setp.ne.u32 %r332,%r331,0; @ ! %r332 bra $L7; bra $L30; $L1: .loc 1 235 1 ret; $L123: .loc 1 219 18 ld.s8 %r359,[%r117]; cvt.u16.u32.loc 1 219 23 ld.s8 %r362,[%r110]; cvt.u16.u32 %r363,%r362; setp.eq.u16 %r364,%r363,0; @ ! %r364 bra $L125; $L43: .loc 1 224 18 add.u64 %r113,%r117,%r99; .loc 1 225 18 add.u64 %r114,%r110,%r131; .loc 1 217 35 mov.u64 %r115,1; $L83: .loc 1 219 18 ld.s8 %r366,[%r113]; cvt.u16.u32 %r367,%r366; setp.eq.u16 %r368,%r367,0; @ %r368 bra $L46; .loc 1 219 23 ld.s8 %r369,[%r114]; cvt.u16.u32 %r370,%r369; setp.eq.u16 %r371,%r370,0; @ ! %r371 bra $L125; $L46: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L82: .loc 1 219 18 ld.s8 %r373,[%r113]; cvt.u16.u32 %r374,%r373; setp.eq.u16 %r375,%r374,0; @ %r375 bra $L49; .loc 1 219 23 ld.s8 %r376,[%r114]; cvt.u16.u32 %r377,%r376; setp.eq.u16 %r378,%r377,0; @ ! %r378 bra $L125; $L49: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L81: .loc 1 219 18 ld.s8 %r380,[%r113]; cvt.u16.u32.loc 1 219 23 ld.s8 %r383,[%r114]; cvt.u16.u32 %r384,%r383; setp.eq.u16 %r385,%r384,0; @ ! 
%r385 bra $L125; $L52: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L80: .loc 1 219 18 ld.s8 %r387,[%r113]; cvt.u16.u32 %r388,%r387; setp.eq.u16 %r389,%r388,0; @ %r389 bra $L55; .loc 1 219 23 ld.s8 %r390,[%r114]; cvt.u16.u32 %r391,%r390; setp.eq.u16 %r392,%r391,0; @ ! %r392 bra $L125; $L55: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L79: .loc 1 219 18 ld.s8 %r394,[%r113]; cvt.u16.u32 %r395,%r394; setp.eq.u16 %r396,%r395,0; @ %r396 bra $L58; .loc 1 219 23 ld.s8 %r397,[%r114]; cvt.u16.u32 %r398,%r397; setp.eq.u16 %r399,%r398,0; @ ! %r399 bra $L125; $L58: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L78: .loc 1 219 18 ld.s8 %r401,[%r113]; cvt.u16.u32.loc 1 219 23 ld.s8 %r404,[%r114]; cvt.u16.u32 %r405,%r404; setp.eq.u16 %r406,%r405,0; @ ! %r406 bra $L125; $L61: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; .loc 1 217 11 setp.ne.u64 %r407,%r101,%r115; @ %r407 bra $L24; bra $L22; $L124: .loc 1 219 23 ld.s8 %r408,[%r354]; cvt.u16.u32 %r409,%r408; setp.eq.u16 %r410,%r409,0; @ ! %r410 bra $L125; $L64: .loc 1 224 18 add.u64 %r411,%r353,%r99; .loc 1 225 18 add.u64 %r412,%r354,%r131; .loc 1 219 18 ld.s8 %r415,[%r411]; cvt.u16.u32.loc 1 219 23 ld.s8 %r418,[%r412]; cvt.u16.u32 %r419,%r418; setp.eq.u16 %r420,%r419,0; @ ! %r420 bra $L125; $L66: .loc 1 224 18 add.u64 %r421,%r411,%r99; .loc 1 225 18 add.u64 %r422,%r412,%r131; .loc 1 219 18 ld.s8 %r425,[%r421]; cvt.u16.u32 %r426,%r425; setp.eq.u16 %r427,%r426,0; @ %r427 bra $L68; .loc 1 219 23 ld.s8 %r428,[%r422]; cvt.u16.u32 %r429,%r428; setp.eq.u16 %r430,%r429,0; @ ! %r430 bra $L125; $L68: .loc 1 224 18 add.u64 %r431,%r421,%r99; .loc 1 225 18 add.u64 %r432,%r422,%r131; .loc 1 219 18 ld.s8 %r435,[%r431]; cvt.u16.u32.loc 1 219 23 ld.s8 %r438,[%r432]; cvt.u16.u32 %r439,%r438; setp.eq.u16 %r440,%r439,0; @ ! %r440 bra $L125; $L70: .loc 1 224 18 add.u64 %r441,%r431,%r99; .loc 1 225 18 add.u64 %r442,%r432,%r131; .loc 1 219 18 ld.s8 %r445,[%r441]; cvt.u16.u32 %r446,%r445; setp.eq.u16 %r447,%r446,0; @ %r447 bra $L72; .loc 1 219 23 ld.s8 %r448,[%r442]; cvt.u16.u32 %r449,%r448; setp.eq.u16 %r450,%r449,0; @ ! %r450 bra $L125; $L72: .loc 1 224 18 add.u64 %r451,%r441,%r99; .loc 1 225 18 add.u64 %r452,%r442,%r131; .loc 1 219 18 ld.s8 %r455,[%r451]; cvt.u16.u32 %r456,%r455; setp.eq.u16 %r457,%r456,0; @ %r457 bra $L74; .loc 1 219 23 ld.s8 %r458,[%r452]; cvt.u16.u32 %r459,%r458; setp.eq.u16 %r460,%r459,0; @ ! %r460 bra $L125; $L74: .loc 1 224 18 add.u64 %r461,%r451,%r99; .loc 1 225 18 add.u64 %r462,%r452,%r131; .loc 1 219 18 ld.s8 %r465,[%r461]; cvt.u16.u32 %r466,%r465; setp.eq.u16 %r467,%r466,0; @ %r467 bra $L76; .loc 1 219 23 ld.s8 %r468,[%r462]; cvt.u16.u32 %r469,%r468; setp.eq.u16 %r470,%r469,0; @ ! 
%r470 bra $L125; $L76: .loc 1 224 18 add.u64 %r113,%r461,%r99; .loc 1 225 18 add.u64 %r114,%r462,%r131; .loc 1 217 35 add.u64 %r115,%r352,7; .loc 1 217 11 setp.ne.u64 %r471,%r101,%r115; @ %r471 bra $L24; bra $L22; } matmul_l8.o/_gfortran_matmul_l8 .visible .func _gfortran_matmul_l8fortran/generated/matmul_l8.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]FUNCTION DECL: _gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_97,116,109,117,108,95,108,5643116,114,97,110,47,103,101,110,101,114,97,116,101,100,47,109,97,116,109,117,108,95,108,56,46,9973,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,97,41,32,61,61,32,50,32,124,124,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,98,41,32,61,61,3273,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,50,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,12199,111,117,110,116,32,61,61,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,69,88,84,69,78,84,40,98,44,48,473,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,49,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_matmul_l8 .visible .func _gfortran_matmul_l8u64pred %r165; .reg .predpredpredpredpred32u64 %r271; .reg .u16 %r272; .reg .u32 %r273; .reg .pred %r274; .reg .u64 %r275; .reg .u64u64 %r280; .reg .u64pred %r310; .reg .predu64 %r327; .reg .u64pred %r345; .reg .pred %r346; .reg .pred %r347; .reg .pred %r348; .reg .predpredpredpredpredpred %r392; .reg .u32 %r394; .reg .u16 %r395; .reg .predpredpredpredpredpredpred %r450; .reg .u64 %r451; .reg .u64 %r452; .reg .u32 %r455; .reg .u16 %r456; .reg .pred %r457; .reg .u32 %r458; .reg .u16 %r459; .reg .pred %r460; .reg .u64 %r461; .reg .u64 %r462; .reg .u32 %r465; .reg .u16 %r466; .reg .pred %r467; .reg .u32mov.u64 %r152,%ar0; mov.u64 %r153,%ar1; mov.u64 %r154,%ar2; .loc 1 64 3 ld.s8 %r22,[%r153+28]; cvt.u16.u32 %r155,%r22; setp.eq.u16 %r156,%r155,2; @ %r156 bra $L2; ld.s8 %r333,[%r154+28]; cvt.u16.u32 %r157,%r333; setp.eq.u16 %r159,%r157,2; @ %r159 bra $L3; cvta.const.u64 %r163,$LC0; cvta.const.u64 %r162,__func__$0; mov.u32 %r161,64; cvta.const.u64 %r160,$LC1; {16233: .loc 1 67 6 ld.u64 %r164,[%r1524; .loc 1 69 10 setp.ne.u16 %r167,%r155,1; @ %r167 bra $L5; .loc 1 71 4 st.u64 [%r152+48],%r164; ld.u64 %r170,[%r154+80]; ld.u64 %r172,[%r154+72]; sub.u64 %r173,%r170,%r172; st.u64 [%r152+56],%r173; mov.u64 %r174,1; st.u64 [%r152+40],%r174; bra $L6; $L28: .loc 1 76 4 st.u64 [%r152+48],%r316; add.u64 %r176,%r124,-1; st.u64 [%r152+56],%r176; mov.u64 %r177,1; st.u64 [%r152+40],%r177; bra $L6; $L29: .loc 1 81 
4 mov.u64 %r178,0; st.u64 [%r152+48],%r178; add.u64 %r179,%r124,-1; st.u64 [%r152+56],%r179; mov.u64 %r180,1; st.u64 [%r152+40],%r180; .loc 1 84 11 st.u64 [%r152+72],%r178; ld.u64 %r183,[%r154+80]; ld.u64 %r185,[%r154+72]; sub.u64 %r186,%r183,%r185; st.u64 [%r152+80],%r186; st.u64 [%r152+64],%r124; $L6:152; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r188,[%value_in]; } .loc 1 90 4 mov.u64 %r19018r190; call (%value_in),_gfortrani_xmallocarray191,[%value_in]; } .loc 1 90 2 st.u64 [%r152],%r191; .loc 1 91 24 mov.u64 %r193,0; st.u64 [%r152+8],%r193; bra $L7; $L4: .loc 1 93 14 cvta.global.u64 %r194,_gfortrani_compile_options; .loc 1 93 13 ld.u32 %r195,[%r194+36]; setp.eq.u32 %r196,%r195,0; @ %r196 bra $L7; .loc 1 100 19 ld.u64 %r198,[%r152+56]; add.u64 %r197,%r198,1; .loc 1 100 17 ld.u64 %r199,[%r152+48]; sub.u64 %r123,%r197,%r199; .loc 1 97 5 setp.ne.u16 %r201,%r155,1; @ %r201 bra $L8; .loc 1 99 19 ld.u64 %r203,[%r154+80]; add.u64 %r202,%r203,1; .loc 1 99 17 ld.u64 %r204,[%r154+72]; sub.u64 %r107,%r202,%r204; .loc 1 101 9 setp.eq.u64 %r205,%r107,%r123; @ %r205 bra $L7; .loc 1 102 8 st.u64 [%stack+8],%r107; st.u64 [%stack],%r123; cvta.const.u64 %r206,$LC2; {stack; call _gfortran_runtime_error76 4 ld.u64 %r209,[%r153+56]; add.u64 %r208,%r209,1; ld.u64 %r210,[%r153+48]; sub.u64 %r148,%r208,%r210; .loc 1 106 10 cvt.u16.u32 %r211,%r333; setp.ne.u16 %r213,%r211,1; @ %r213 bra $L9; .loc 1 110 9 setp.eq.u64 %r214,%r123,%r148; @ %r214 bra $L7; .loc 1 111 8 st.u64 [%stack+8],%r148; st.u64 [%stack],%r123; cvta.const.u64 %r215215_gfortran_runtime_error119 9 setp.eq.u64 %r217,%r123,%r148; @ %r217 bra $L10; .loc 1 120 8 st.u64 [%stack+8],%r148; st.u64 [%stack],%r123;_gfortran_runtime_error10: .loc 1 125 19 ld.u64 %r221,[%r154+80]; add.u64 %r220,%r221,1; .loc 1 125 17 ld.u64 %r222,[%r154+72]; sub.u64 %r105,%r220,%r222; .loc 1 126 19 ld.u64 %r224,[%r152+80]; add.u64 %r223,%r224,1; .loc 1 126 17 ld.u64 %r225,[%r152+72]; sub.u64 %r106,%r223,%r225; .loc 1 127 9 setp.eq.u64 %r226,%r105,%r106; @ %r226 bra $L7; .loc 1 128 8 st.u64 [%stack+8],%r105; st.u64 [%stack],%r106; cvta.const.u64 %r227227_gfortran_runtime_error135 9 ld.u64 %r89,[%r153]; .loc 1 136 12 ld.u64 %r51,[%r153+16]; .loc 1 138 19 cvt.u32.u64 %r52,%r51; .loc 1 138 44 add.u32 %r229,%r52,-4; and.b32 %r230,%r229,-5; set.u32.eq.u32 %r232,%r230,0; neg.s32 %r233,%r232; .loc 1 138 19 add.u32 %r234,%r52,-1; set.u32.le.u32 %r236,%r234,1; neg.s32 %r237,%r236; .loc 1 138 6 cvt.u16.u32 %r239,%r233; cvt.u16.u32 %r240,%r237; or.b16 %r238,%r239,%r240; cvt.u32.u16 %r241,%r238; cvt.u16.u8 %r242,%r24111; .loc 1 140 6 setp.ne.u32 %r245,%r52,16; @ %r245 bra $L12; $L11: .loc 1 147 9 ld.u64 %r110,[%r154]; .loc 1 148 12 ld.u64 %r57,[%r154+16]; .loc 1 150 19 cvt.u32.u64 %r58,%r57; .loc 1 150 44 add.u32 %r246,%r58,-4; and.b32 %r247,%r246,-5; set.u32.eq.u32 %r249,%r247,0; neg.s32 %r250,%r249; .loc 1 150 19 add.u32 %r251,%r58,-1; set.u32.le.u32 %r253,%r251,1; neg.s32 %r254,%r253; .loc 1 150 6 cvt.u16.u32 %r256,%r250; cvt.u16.u32 %r257,%r254; or.b16 %r255,%r256,%r257; cvt.u32.u16 %r258,%r255; cvt.u16.u8 %r259,%r258; setp.ne.u16 %r260,%r259,0; @ %r260 bra $L13; bra $L122; $L12: .loc 1 145 5 cvta.const.u64 %r262,$LC5; mov.u64 %r261,call _gfortrani_internal_error122: .loc 1 152 6 setp.ne.u32 %r264,%r58,16; @ %r264 bra $L12; $L13: .loc 1 159 8 ld.u64 %r111,[%r152]; .loc 1 164 16 ld.u64 %r92,[%r152+40]; .loc 1 162 6 ld.s8 %r266,[%r152+28]; cvt.u16.u32 %r265,%r266; setp.eq.u16 %r267,%r265,1; @ %r267 bra $L31; .loc 1 170 16 ld.u64 %r93,[%r152+64]; bra $L15; $L31: .loc 1 
165 16 mov.u64 %r93,%r92; $L15: .loc 1 177 17 ld.u64 %r268,[%r153+40]; mul.lo.u64 %r96,%r51,%r268; .loc 1 178 15 ld.u64 %r270,[%r153+56]; add.u64 %r269,%r270,1; .loc 1 178 13 ld.u64 %r271,[%r153+48]; sub.u64 %r94,%r269,%r271; .loc 1 175 6 ld.s8 %r273,[%r153+28]; cvt.u16.u32 %r272,%r273; setp.eq.u16 %r274,%r272,1; @ %r274 bra $L32; .loc 1 185 17 ld.u64 %r275,[%r153+64]; mul.lo.u64 %r99,%r51,%r275; .loc 1 186 15 ld.u64 %r277,[%r153+80]; add.u64 %r276,%r277,1; .loc 1 186 13 ld.u64 %r278,[%r153+72]; sub.u64 %r101,%r276,%r278; bra $L16; $L32: .loc 1 178 13 mov.u64 %r101,%r94; .loc 1 177 15 mov.u64 %r99,%r96; .loc 1 179 15 mov.u64 %r96,0; .loc 1 181 14 mov.u64 %r94,1; .loc 1 180 16 mov.u64 %r92,%r96; $L16: .loc 1 192 17 ld.u64 %r279,[%r154+40]; mul.lo.u64 %r131,%r57,%r279; .loc 1 193 7 ld.u64 %r281,[%r154+56]; add.u64 %r280,%r281,1; ld.u64 %r282,[%r154+48]; sub.u64 %r135,%r280,%r282; .loc 1 190 6 ld.s8 %r284,[%r154+28]; cvt.u16.u32 %r283,%r284; setp.ne.u16 %r285,%r283,1; @ %r285 bra $L17; .loc 1 193 7 setp.eq.u64 %r286,%r101,%r135; @ %r286 bra $L33; cvta.const.u64 %r290,$LC6; cvta.const.u64 %r289,__func__$0; mov.u32 %r288,193;r2289907: .loc 1 201 7 setp.eq.u64 %r291,%r101,%r135; @ %r291 bra $L19; cvta.const.u64 %r295,$LC6; cvta.const.u64 %r294,__func__$0; mov.u32 %r293,201;r293294959: .loc 1 202 17 ld.u64 %r296,[%r154+64]; mul.lo.u64 %r74,%r57,%r296; .loc 1 203 16 ld.u64 %r298,[%r154+80]; add.u64 %r297,%r298,1; .loc 1 203 14 ld.u64 %r299,[%r154+72]; sub.u64 %r91,%r297,%r299; .loc 1 206 3 setp.le.s64 %r300,%r91,0; @ %r300 bra $L1; mov.u64 %r122,%r93; bra $L18; $L33: mov.u64 %r74,0; mov.u64 %r122,%r74; .loc 1 196 14 mov.u64 %r91,1; $L18: .loc 1 228 16 shl.b64 %r83,%r92,3; .loc 1 231 24 mul.lo.u64 %r55,%r94,%r96; .loc 1 231 13 neg.s64 %r85,%r55; .loc 1 233 36 mul.lo.u64 %r301,%r92,%r94; .loc 1 233 24 sub.u64 %r302,%r122,%r301; .loc 1 233 12 shl.b64 %r87,%r302,3; mul.lo.u64 %r86,%r83,%r94; .loc 1 206 10 mov.u64 %r112,0; setp.gt.s64 %r334,%r94,0; setp.le.s64 %r337,%r101,0; .loc 1 215 17 mov.u64 %r338,%r112; .loc 1 221 25 mov.u64 %r339,1; bra $L21; $L34: .loc 1 208 7 mov.u64 %r116,%r111; mov.u64 %r117,%r89; .loc 1 208 14 mov.u64 %r118,0; $L25: .loc 1 215 17 st.u64 [%r116],%r338; .loc 1 217 11 @ %r337 bra $L22; mov.u64 %r114,%r110; mov.u64 %r113,%r117; .loc 1 217 18 mov.u64 %r115,0; and.b64 %r344,%r101,7; setp.eq.u64 %r351,%r344,0; @ %r351 bra $L24; setp.eq.u64 %r350,%r344,1; @ %r350 bra $L78; setp.eq.u64 %r349,%r344,2; @ %r349 bra $L79; setp.eq.u64 %r348,%r344,3; @ %r348 bra $L80; setp.eq.u64 %r347,%r344,4; @ %r347 bra $L81; setp.eq.u64 %r346,%r344,5; @ %r346 bra $L82; setp.eq.u64 %r345,%r344,6; @ %r345 bra $L83; bra $L123; $L24: .loc 1 219 18 ld.s8 %r306,[%r113]; cvt.u16.u32 %r305,%r306; setp.eq.u16 %r307,%r305,0; @ %r307 bra $L23; .loc 1 219 23 ld.s8 %r309,[%r114]; cvt.u16.u32 %r308,%r309; setp.eq.u16 %r310,%r308,0; @ %r310 bra $L23; $L125: .loc 1 221 25 st.u64 [%r116],%r339; .loc 1 222 19 bra $L22; $L23: .loc 1 224 18 add.u64 %r353,%r113,%r99; .loc 1 225 18 add.u64 %r354,%r114,%r131; .loc 1 217 35 add.u64 %r352,%r115,1; .loc 1 219 18 ld.s8 %r356,[%r353]; cvt.u16.u32bra $L124; $L22: .loc 1 228 16 add.u64 %r116,%r116,%r83; .loc 1 229 17 add.u64 %r117,%r117,%r96; .loc 1 208 32 add.u64 %r118,%r118,1; .loc 1 208 7 setp.ne.u64 %r313,%r94,%r118; @ %r313 bra $L25; .loc 1 229 17 add.u64 %r89,%r89,%r55; .loc 1 228 16 add.u64 %r111,%r111,%r86; $L26: .loc 1 231 13 add.u64 %r89,%r89,%r85; .loc 1 232 13 add.u64 %r110,%r110,%r74; .loc 1 233 12 add.u64 %r111,%r111,%r87; .loc 1 206 28 add.u64 %r112,%r112,1; .loc 1 
206 3 setp.le.s64 %r314,%r91,%r112; @ %r314 bra $L1; $L21: .loc 1 208 7 @ %r334 bra $L34; bra $L26; $L2: .loc 1 67 6 ld.u64 %r316,[%r152]; setp.ne.u64 %r317,%r316,0; @ %r317 bra $L27; .loc 1 76 4 ld.u64 %r319,[%r153+56]; add.u64 %r318,%r319,1; ld.u64 %r320,[%r153+48]; sub.u64 %r124,%r318,%r320; .loc 1 74 15 ld.s8 %r322,[%r154+28]; cvt.u16.u32 %r321,%r322; setp.eq.u16 %r323,%r321,1; @ %r323 bra $L28; bra $L29; $L5: .loc 1 76 4 ld.u64 %r325,[%r153+56]; add.u64 %r324,%r325,1; ld.u64 %r326,[%r153+48]; sub.u64 %r124,%r324,%r326; bra $L29; $L30: .loc 1 100 19 ld.u64 %r328,[%r152+56]; add.u64 %r327,%r328,1; .loc 1 100 17 ld.u64 %r329,[%r152+48]; sub.u64 %r123,%r327,%r329; ld.s8 %r333,[%r154+28]; bra $L8; $L27: .loc 1 93 14 cvta.global.u64 %r330,_gfortrani_compile_options; .loc 1 93 13 ld.u32 %r331,[%r330+36]; setp.ne.u32 %r332,%r331,0; @ ! %r332 bra $L7; bra $L30; $L1: .loc 1 235 1 ret; $L123: .loc 1 219 18 ld.s8 %r359,[%r117]; cvt.u16.u32.loc 1 219 23 ld.s8 %r362,[%r110]; cvt.u16.u32 %r363,%r362; setp.eq.u16 %r364,%r363,0; @ ! %r364 bra $L125; $L43: .loc 1 224 18 add.u64 %r113,%r117,%r99; .loc 1 225 18 add.u64 %r114,%r110,%r131; .loc 1 217 35 mov.u64 %r115,1; $L83: .loc 1 219 18 ld.s8 %r366,[%r113]; cvt.u16.u32 %r367,%r366; setp.eq.u16 %r368,%r367,0; @ %r368 bra $L46; .loc 1 219 23 ld.s8 %r369,[%r114]; cvt.u16.u32 %r370,%r369; setp.eq.u16 %r371,%r370,0; @ ! %r371 bra $L125; $L46: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L82: .loc 1 219 18 ld.s8 %r373,[%r113]; cvt.u16.u32 %r374,%r373; setp.eq.u16 %r375,%r374,0; @ %r375 bra $L49; .loc 1 219 23 ld.s8 %r376,[%r114]; cvt.u16.u32 %r377,%r376; setp.eq.u16 %r378,%r377,0; @ ! %r378 bra $L125; $L49: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L81: .loc 1 219 18 ld.s8 %r380,[%r113]; cvt.u16.u32.loc 1 219 23 ld.s8 %r383,[%r114]; cvt.u16.u32 %r384,%r383; setp.eq.u16 %r385,%r384,0; @ ! %r385 bra $L125; $L52: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L80: .loc 1 219 18 ld.s8 %r387,[%r113]; cvt.u16.u32 %r388,%r387; setp.eq.u16 %r389,%r388,0; @ %r389 bra $L55; .loc 1 219 23 ld.s8 %r390,[%r114]; cvt.u16.u32 %r391,%r390; setp.eq.u16 %r392,%r391,0; @ ! %r392 bra $L125; $L55: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L79: .loc 1 219 18 ld.s8 %r394,[%r113]; cvt.u16.u32 %r395,%r394; setp.eq.u16 %r396,%r395,0; @ %r396 bra $L58; .loc 1 219 23 ld.s8 %r397,[%r114]; cvt.u16.u32 %r398,%r397; setp.eq.u16 %r399,%r398,0; @ ! %r399 bra $L125; $L58: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L78: .loc 1 219 18 ld.s8 %r401,[%r113]; cvt.u16.u32.loc 1 219 23 ld.s8 %r404,[%r114]; cvt.u16.u32 %r405,%r404; setp.eq.u16 %r406,%r405,0; @ ! %r406 bra $L125; $L61: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; .loc 1 217 11 setp.ne.u64 %r407,%r101,%r115; @ %r407 bra $L24; bra $L22; $L124: .loc 1 219 23 ld.s8 %r408,[%r354]; cvt.u16.u32 %r409,%r408; setp.eq.u16 %r410,%r409,0; @ ! %r410 bra $L125; $L64: .loc 1 224 18 add.u64 %r411,%r353,%r99; .loc 1 225 18 add.u64 %r412,%r354,%r131; .loc 1 219 18 ld.s8 %r415,[%r411]; cvt.u16.u32.loc 1 219 23 ld.s8 %r418,[%r412]; cvt.u16.u32 %r419,%r418; setp.eq.u16 %r420,%r419,0; @ ! 
%r420 bra $L125; $L66: .loc 1 224 18 add.u64 %r421,%r411,%r99; .loc 1 225 18 add.u64 %r422,%r412,%r131; .loc 1 219 18 ld.s8 %r425,[%r421]; cvt.u16.u32 %r426,%r425; setp.eq.u16 %r427,%r426,0; @ %r427 bra $L68; .loc 1 219 23 ld.s8 %r428,[%r422]; cvt.u16.u32 %r429,%r428; setp.eq.u16 %r430,%r429,0; @ ! %r430 bra $L125; $L68: .loc 1 224 18 add.u64 %r431,%r421,%r99; .loc 1 225 18 add.u64 %r432,%r422,%r131; .loc 1 219 18 ld.s8 %r435,[%r431]; cvt.u16.u32.loc 1 219 23 ld.s8 %r438,[%r432]; cvt.u16.u32 %r439,%r438; setp.eq.u16 %r440,%r439,0; @ ! %r440 bra $L125; $L70: .loc 1 224 18 add.u64 %r441,%r431,%r99; .loc 1 225 18 add.u64 %r442,%r432,%r131; .loc 1 219 18 ld.s8 %r445,[%r441]; cvt.u16.u32 %r446,%r445; setp.eq.u16 %r447,%r446,0; @ %r447 bra $L72; .loc 1 219 23 ld.s8 %r448,[%r442]; cvt.u16.u32 %r449,%r448; setp.eq.u16 %r450,%r449,0; @ ! %r450 bra $L125; $L72: .loc 1 224 18 add.u64 %r451,%r441,%r99; .loc 1 225 18 add.u64 %r452,%r442,%r131; .loc 1 219 18 ld.s8 %r455,[%r451]; cvt.u16.u32 %r456,%r455; setp.eq.u16 %r457,%r456,0; @ %r457 bra $L74; .loc 1 219 23 ld.s8 %r458,[%r452]; cvt.u16.u32 %r459,%r458; setp.eq.u16 %r460,%r459,0; @ ! %r460 bra $L125; $L74: .loc 1 224 18 add.u64 %r461,%r451,%r99; .loc 1 225 18 add.u64 %r462,%r452,%r131; .loc 1 219 18 ld.s8 %r465,[%r461]; cvt.u16.u32 %r466,%r465; setp.eq.u16 %r467,%r466,0; @ %r467 bra $L76; .loc 1 219 23 ld.s8 %r468,[%r462]; cvt.u16.u32 %r469,%r468; setp.eq.u16 %r470,%r469,0; @ ! %r470 bra $L125; $L76: .loc 1 224 18 add.u64 %r113,%r461,%r99; .loc 1 225 18 add.u64 %r114,%r462,%r131; .loc 1 217 35 add.u64 %r115,%r352,7; .loc 1 217 11 setp.ne.u64 %r471,%r101,%r115; @ %r471 bra $L24; bra $L22; } matmul_l16.o/_gfortran_matmul_l16 .visible .func _gfortran_matmul_l16fortran/generated/matmul_l16.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]FUNCTION DECL: _gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_internal_error .extern .func _gfortrani_internal_97,116,109,117,108,95,108,49,54116,114,97,110,47,103,101,110,101,114,97,116,101,100,47,109,97,116,109,117,108,95,108,49,54,46,9973,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,97,41,32,61,61,32,50,32,124,124,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,82,65,78,75,32,40,98,41,32,61,61,3273,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,50,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,12199,111,117,110,116,32,61,61,32,71,70,67,95,68,69,83,67,82,73,80,84,79,82,95,69,88,84,69,78,84,40,98,44,48,473,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,32,77,65,84,77,85,76,32,105,110,116,114,105,110,115,105,99,32,102,111,114,32,100,105,109,101,110,115,105,111,110,32,49,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortran_matmul_l16 
.visible .func _gfortran_matmul_l16u64pred %r165; .reg .predpredpredpredpred32u64 %r271; .reg .u16 %r272; .reg .u32 %r273; .reg .pred %r274; .reg .u64 %r275; .reg .u64u64 %r280; .reg .u64u64 %r302; .reg .u16 %r306; .reg .u32 %r307; .reg .pred %r308; .reg .u16 %r309; .reg .u32 %r310; .reg .pred %r311; .reg .predu64 %r346; .reg .pred %r347; .reg .pred %r348; .reg .predpred %r351; .reg .pred %r352; .reg .predpredpred %r373; .reg .u32 %r375; .reg .u16 %r376; .reg .predpredu16 %r400; .reg .predpred %r408; .reg .pred.reg .u16 %r418; .reg .predu32 %r427; .reg .u16 %r428; .reg .predpredpredpredpred %r472; .reg .pred %r473; mov.u64 %r152,%ar0; mov.u64 %r153,%ar1; mov.u64 %r154,%ar2; .loc 1 64 3 ld.s8 %r22,[%r153+28]; cvt.u16.u32 %r155,%r22; setp.eq.u16 %r156,%r155,2; @ %r156 bra $L2; ld.s8 %r335,[%r154+28]; cvt.u16.u32 %r157,%r335; setp.eq.u16 %r159,%r157,2; @ %r159 bra $L3; cvta.const.u64 %r163,$LC0; cvta.const.u64 %r162,__func__$0; mov.u32 %r161,64; cvta.const.u64 %r160,$LC1; {16233: .loc 1 67 6 ld.u64 %r164,[%r1524; .loc 1 69 10 setp.ne.u16 %r167,%r155,1; @ %r167 bra $L5; .loc 1 71 4 st.u64 [%r152+48],%r164; ld.u64 %r170,[%r154+80]; ld.u64 %r172,[%r154+72]; sub.u64 %r173,%r170,%r172; st.u64 [%r152+56],%r173; mov.u64 %r174,1; st.u64 [%r152+40],%r174; bra $L6; $L28: .loc 1 76 4 st.u64 [%r152+48],%r318; add.u64 %r176,%r124,-1; st.u64 [%r152+56],%r176; mov.u64 %r177,1; st.u64 [%r152+40],%r177; bra $L6; $L29: .loc 1 81 4 mov.u64 %r178,0; st.u64 [%r152+48],%r178; add.u64 %r179,%r124,-1; st.u64 [%r152+56],%r179; mov.u64 %r180,1; st.u64 [%r152+40],%r180; .loc 1 84 11 st.u64 [%r152+72],%r178; ld.u64 %r183,[%r154+80]; ld.u64 %r185,[%r154+72]; sub.u64 %r186,%r183,%r185; st.u64 [%r152+80],%r186; st.u64 [%r152+64],%r124; $L6:152; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r188,[%value_in]; } .loc 1 90 4 mov.u64 %r190,16call (%value_in),_gfortrani_xmallocarray191,[%value_in]; } .loc 1 90 2 st.u64 [%r152],%r191; .loc 1 91 24 mov.u64 %r193,0; st.u64 [%r152+8],%r193; bra $L7; $L4: .loc 1 93 14 cvta.global.u64 %r194,_gfortrani_compile_options; .loc 1 93 13 ld.u32 %r195,[%r194+36]; setp.eq.u32 %r196,%r195,0; @ %r196 bra $L7; .loc 1 100 19 ld.u64 %r198,[%r152+56]; add.u64 %r197,%r198,1; .loc 1 100 17 ld.u64 %r199,[%r152+48]; sub.u64 %r123,%r197,%r199; .loc 1 97 5 setp.ne.u16 %r201,%r155,1; @ %r201 bra $L8; .loc 1 99 19 ld.u64 %r203,[%r154+80]; add.u64 %r202,%r203,1; .loc 1 99 17 ld.u64 %r204,[%r154+72]; sub.u64 %r107,%r202,%r204; .loc 1 101 9 setp.eq.u64 %r205,%r107,%r123; @ %r205 bra $L7; .loc 1 102 8 st.u64 [%stack+8],%r107; st.u64 [%stack],%r123; cvta.const.u64 %r206,$LC2; {stack; call _gfortran_runtime_error76 4 ld.u64 %r209,[%r153+56]; add.u64 %r208,%r209,1; ld.u64 %r210,[%r153+48]; sub.u64 %r148,%r208,%r210; .loc 1 106 10 cvt.u16.u32 %r211,%r335; setp.ne.u16 %r213,%r211,1; @ %r213 bra $L9; .loc 1 110 9 setp.eq.u64 %r214,%r123,%r148; @ %r214 bra $L7; .loc 1 111 8 st.u64 [%stack+8],%r148; st.u64 [%stack],%r123; cvta.const.u64 %r215215_gfortran_runtime_error119 9 setp.eq.u64 %r217,%r123,%r148; @ %r217 bra $L10; .loc 1 120 8 st.u64 [%stack+8],%r148; st.u64 [%stack],%r123;_gfortran_runtime_error10: .loc 1 125 19 ld.u64 %r221,[%r154+80]; add.u64 %r220,%r221,1; .loc 1 125 17 ld.u64 %r222,[%r154+72]; sub.u64 %r105,%r220,%r222; .loc 1 126 19 ld.u64 %r224,[%r152+80]; add.u64 %r223,%r224,1; .loc 1 126 17 ld.u64 %r225,[%r152+72]; sub.u64 %r106,%r223,%r225; .loc 1 127 9 setp.eq.u64 %r226,%r105,%r106; @ %r226 bra $L7; .loc 1 128 8 st.u64 [%stack+8],%r105; st.u64 [%stack],%r106; cvta.const.u64 
%r227227_gfortran_runtime_error135 9 ld.u64 %r89,[%r153]; .loc 1 136 12 ld.u64 %r51,[%r153+16]; .loc 1 138 19 cvt.u32.u64 %r52,%r51; .loc 1 138 44 add.u32 %r229,%r52,-4; and.b32 %r230,%r229,-5; set.u32.eq.u32 %r232,%r230,0; neg.s32 %r233,%r232; .loc 1 138 19 add.u32 %r234,%r52,-1; set.u32.le.u32 %r236,%r234,1; neg.s32 %r237,%r236; .loc 1 138 6 cvt.u16.u32 %r239,%r233; cvt.u16.u32 %r240,%r237; or.b16 %r238,%r239,%r240; cvt.u32.u16 %r241,%r238; cvt.u16.u8 %r242,%r24111; .loc 1 140 6 setp.ne.u32 %r245,%r52,16; @ %r245 bra $L12; $L11: .loc 1 147 9 ld.u64 %r110,[%r154]; .loc 1 148 12 ld.u64 %r57,[%r154+16]; .loc 1 150 19 cvt.u32.u64 %r58,%r57; .loc 1 150 44 add.u32 %r246,%r58,-4; and.b32 %r247,%r246,-5; set.u32.eq.u32 %r249,%r247,0; neg.s32 %r250,%r249; .loc 1 150 19 add.u32 %r251,%r58,-1; set.u32.le.u32 %r253,%r251,1; neg.s32 %r254,%r253; .loc 1 150 6 cvt.u16.u32 %r256,%r250; cvt.u16.u32 %r257,%r254; or.b16 %r255,%r256,%r257; cvt.u32.u16 %r258,%r255; cvt.u16.u8 %r259,%r258; setp.ne.u16 %r260,%r259,0; @ %r260 bra $L13; bra $L122; $L12: .loc 1 145 5 cvta.const.u64 %r262,$LC5; mov.u64 %r261,call _gfortrani_internal_error122: .loc 1 152 6 setp.ne.u32 %r264,%r58,16; @ %r264 bra $L12; $L13: .loc 1 159 8 ld.u64 %r111,[%r152]; .loc 1 164 16 ld.u64 %r92,[%r152+40]; .loc 1 162 6 ld.s8 %r266,[%r152+28]; cvt.u16.u32 %r265,%r266; setp.eq.u16 %r267,%r265,1; @ %r267 bra $L31; .loc 1 170 16 ld.u64 %r93,[%r152+64]; bra $L15; $L31: .loc 1 165 16 mov.u64 %r93,%r92; $L15: .loc 1 177 17 ld.u64 %r268,[%r153+40]; mul.lo.u64 %r96,%r51,%r268; .loc 1 178 15 ld.u64 %r270,[%r153+56]; add.u64 %r269,%r270,1; .loc 1 178 13 ld.u64 %r271,[%r153+48]; sub.u64 %r94,%r269,%r271; .loc 1 175 6 ld.s8 %r273,[%r153+28]; cvt.u16.u32 %r272,%r273; setp.eq.u16 %r274,%r272,1; @ %r274 bra $L32; .loc 1 185 17 ld.u64 %r275,[%r153+64]; mul.lo.u64 %r99,%r51,%r275; .loc 1 186 15 ld.u64 %r277,[%r153+80]; add.u64 %r276,%r277,1; .loc 1 186 13 ld.u64 %r278,[%r153+72]; sub.u64 %r101,%r276,%r278; bra $L16; $L32: .loc 1 178 13 mov.u64 %r101,%r94; .loc 1 177 15 mov.u64 %r99,%r96; .loc 1 179 15 mov.u64 %r96,0; .loc 1 181 14 mov.u64 %r94,1; .loc 1 180 16 mov.u64 %r92,%r96; $L16: .loc 1 192 17 ld.u64 %r279,[%r154+40]; mul.lo.u64 %r131,%r57,%r279; .loc 1 193 7 ld.u64 %r281,[%r154+56]; add.u64 %r280,%r281,1; ld.u64 %r282,[%r154+48]; sub.u64 %r135,%r280,%r282; .loc 1 190 6 ld.s8 %r284,[%r154+28]; cvt.u16.u32 %r283,%r284; setp.ne.u16 %r285,%r283,1; @ %r285 bra $L17; .loc 1 193 7 setp.eq.u64 %r286,%r101,%r135; @ %r286 bra $L33; cvta.const.u64 %r290,$LC6; cvta.const.u64 %r289,__func__$0; mov.u32 %r288,193;r2289907: .loc 1 201 7 setp.eq.u64 %r291,%r101,%r135; @ %r291 bra $L19; cvta.const.u64 %r295,$LC6; cvta.const.u64 %r294,__func__$0; mov.u32 %r293,201;r293294959: .loc 1 202 17 ld.u64 %r296,[%r154+64]; mul.lo.u64 %r74,%r57,%r296; .loc 1 203 16 ld.u64 %r298,[%r154+80]; add.u64 %r297,%r298,1; .loc 1 203 14 ld.u64 %r299,[%r154+72]; sub.u64 %r91,%r297,%r299; .loc 1 206 3 setp.le.s64 %r300,%r91,0; @ %r300 bra $L1; mov.u64 %r122,%r93; bra $L18; $L33: mov.u64 %r74,0; mov.u64 %r122,%r74; .loc 1 196 14 mov.u64 %r91,1; $L18: .loc 1 228 16 shl.b64 %r83,%r92,4; .loc 1 231 24 mul.lo.u64 %r55,%r94,%r96; .loc 1 231 13 neg.s64 %r85,%r55; .loc 1 233 36 mul.lo.u64 %r301,%r92,%r94; .loc 1 233 24 sub.u64 %r302,%r122,%r301; .loc 1 233 12 shl.b64 %r87,%r302,4; mul.lo.u64 %r86,%r83,%r94; .loc 1 206 10 mov.u64 %r112,0; setp.gt.s64 %r336,%r94,0; setp.le.s64 %r339,%r101,0; .loc 1 215 17 mov.u64 %r340,%r112; .loc 1 221 25 mov.u64 %r341,1; bra $L21; $L34: .loc 1 208 7 mov.u64 %r116,%r111; 
mov.u64 %r117,%r89; .loc 1 208 14 mov.u64 %r118,0; $L25: .loc 1 215 17 st.u64 [%r116],%r340; st.u64 [%r116+8],%r340; .loc 1 217 11 @ %r339 bra $L22; mov.u64 %r114,%r110; mov.u64 %r113,%r117; .loc 1 217 18 mov.u64 %r115,0; and.b64 %r346,%r101,7; setp.eq.u64 %r353,%r346,0; @ %r353 bra $L24; setp.eq.u64 %r352,%r346,1; @ %r352 bra $L78; setp.eq.u64 %r351,%r346,2; @ %r351 bra $L79; setp.eq.u64 %r350,%r346,3; @ %r350 bra $L80; setp.eq.u64 %r349,%r346,4; @ %r349 bra $L81; setp.eq.u64 %r348,%r346,5; @ %r348 bra $L82; setp.eq.u64 %r347,%r346,6; @ %r347 bra $L83; bra $L123; $L24: .loc 1 219 18 ld.s8 %r307,[%r113]; cvt.u16.u32 %r306,%r307; setp.eq.u16 %r308,%r306,0; @ %r308 bra $L23; .loc 1 219 23 ld.s8 %r310,[%r114]; cvt.u16.u32 %r309,%r310; setp.eq.u16 %r311,%r309,0; @ %r311 bra $L23; $L125: .loc 1 221 25 st.u64 [%r116],%r341; st.u64 [%r116+8],%r340; .loc 1 222 19 bra $L22; $L23: .loc 1 224 18 add.u64 %r355,%r113,%r99; .loc 1 225 18 add.u64 %r356,%r114,%r131; .loc 1 217 35 add.u64 %r354,%r115,1; .loc 1 219 18 ld.s8 %r358,[%r355]; cvt.u16.u32bra $L124; $L22: .loc 1 228 16 add.u64 %r116,%r116,%r83; .loc 1 229 17 add.u64 %r117,%r117,%r96; .loc 1 208 32 add.u64 %r118,%r118,1; .loc 1 208 7 setp.ne.u64 %r315,%r94,%r118; @ %r315 bra $L25; .loc 1 229 17 add.u64 %r89,%r89,%r55; .loc 1 228 16 add.u64 %r111,%r111,%r86; $L26: .loc 1 231 13 add.u64 %r89,%r89,%r85; .loc 1 232 13 add.u64 %r110,%r110,%r74; .loc 1 233 12 add.u64 %r111,%r111,%r87; .loc 1 206 28 add.u64 %r112,%r112,1; .loc 1 206 3 setp.le.s64 %r316,%r91,%r112; @ %r316 bra $L1; $L21: .loc 1 208 7 @ %r336 bra $L34; bra $L26; $L2: .loc 1 67 6 ld.u64 %r318,[%r152]; setp.ne.u64 %r319,%r318,0; @ %r319 bra $L27; .loc 1 76 4 ld.u64 %r321,[%r153+56]; add.u64 %r320,%r321,1; ld.u64 %r322,[%r153+48]; sub.u64 %r124,%r320,%r322; .loc 1 74 15 ld.s8 %r324,[%r154+28]; cvt.u16.u32 %r323,%r324; setp.eq.u16 %r325,%r323,1; @ %r325 bra $L28; bra $L29; $L5: .loc 1 76 4 ld.u64 %r327,[%r153+56]; add.u64 %r326,%r327,1; ld.u64 %r328,[%r153+48]; sub.u64 %r124,%r326,%r328; bra $L29; $L30: .loc 1 100 19 ld.u64 %r330,[%r152+56]; add.u64 %r329,%r330,1; .loc 1 100 17 ld.u64 %r331,[%r152+48]; sub.u64 %r123,%r329,%r331; ld.s8 %r335,[%r154+28]; bra $L8; $L27: .loc 1 93 14 cvta.global.u64 %r332,_gfortrani_compile_options; .loc 1 93 13 ld.u32 %r333,[%r332+36]; setp.ne.u32 %r334,%r333,0; @ ! %r334 bra $L7; bra $L30; $L1: .loc 1 235 1 ret; $L123: .loc 1 219 18 ld.s8 %r361,[%r117]; cvt.u16.u32 %r362,%r361; setp.eq.u16 %r363,%r362,0; @ %r363 bra $L43; .loc 1 219 23 ld.s8 %r364,[%r110]; cvt.u16.u32 %r365,%r364; setp.eq.u16 %r366,%r365,0; @ ! %r366 bra $L125; $L43: .loc 1 224 18 add.u64 %r113,%r117,%r99; .loc 1 225 18 add.u64 %r114,%r110,%r131; .loc 1 217 35 mov.u64 %r115,1; $L83: .loc 1 219 18 ld.s8 %r368,[%r113]; cvt.u16.u32 %r369,%r368; setp.eq.u16 %r370,%r369,0; @ %r370 bra $L46; .loc 1 219 23 ld.s8 %r371,[%r114]; cvt.u16.u32 %r372,%r371; setp.eq.u16 %r373,%r372,0; @ ! %r373 bra $L125; $L46: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L82: .loc 1 219 18 ld.s8 %r375,[%r113]; cvt.u16.u32 %r376,%r375; setp.eq.u16 %r377,%r376,0; @ %r377 bra $L49; .loc 1 219 23 ld.s8 %r378,[%r114]; cvt.u16.u32 %r379,%r378; setp.eq.u16 %r380,%r379,0; @ ! %r380 bra $L125; $L49: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L81: .loc 1 219 18 ld.s8 %r382,[%r113]; cvt.u16.u32.loc 1 219 23 ld.s8 %r385,[%r114]; cvt.u16.u32 %r386,%r385; setp.eq.u16 %r387,%r386,0; @ ! 
%r387 bra $L125; $L52: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L80: .loc 1 219 18 ld.s8 %r389,[%r113]; cvt.u16.u32 %r390,%r389; setp.eq.u16 %r391,%r390,0; @ %r391 bra $L55; .loc 1 219 23 ld.s8 %r392,[%r114]; cvt.u16.u32! %r394 bra $L125; $L55: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L79: .loc 1 219 18 ld.s8 %r396,[%r113]; cvt.u16.u32 %r397,%r396; setp.eq.u16 %r398,%r397,0; @ %r398 bra $L58; .loc 1 219 23 ld.s8 %r399,[%r114]; cvt.u16.u32 %r400,%r399; setp.eq.u16 %r401,%r400,0; @ ! %r401 bra $L125; $L58: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; $L78: .loc 1 219 18 ld.s8 %r403,[%r113]; cvt.u16.u32 %r404,%r403; setp.eq.u16 %r405,%r404,0; @ %r405 bra $L61; .loc 1 219 23 ld.s8 %r406,[%r114]; cvt.u16.u32! %r408 bra $L125; $L61: .loc 1 224 18 add.u64 %r113,%r113,%r99; .loc 1 225 18 add.u64 %r114,%r114,%r131; .loc 1 217 35 add.u64 %r115,%r115,1; .loc 1 217 11 setp.ne.u64 %r409,%r101,%r115; @ %r409 bra $L24; bra $L22; $L124: .loc 1 219 23 ld.s8 %r410,[%r356]; cvt.u16.u32 %r411,%r410; setp.eq.u16 %r412,%r411,0; @ ! %r412 bra $L125; $L64: .loc 1 224 18 add.u64 %r413,%r355,%r99; .loc 1 225 18 add.u64 %r414,%r356,%r131; .loc 1 219 18 ld.s8 %r417,[%r413]; cvt.u16.u32 %r418,%r417; setp.eq.u16 %r419,%r418,0; @ %r419 bra $L66; .loc 1 219 23 ld.s8 %r420,[%r414]; cvt.u16.u32! %r422 bra $L125; $L66: .loc 1 224 18 add.u64 %r423,%r413,%r99; .loc 1 225 18 add.u64 %r424,%r414,%r131; .loc 1 219 18 ld.s8 %r427,[%r423]; cvt.u16.u32 %r428,%r427; setp.eq.u16 %r429,%r428,0; @ %r429 bra $L68; .loc 1 219 23 ld.s8 %r430,[%r424]; cvt.u16.u32 %r431,%r430; setp.eq.u16 %r432,%r431,0; @ ! %r432 bra $L125; $L68: .loc 1 224 18 add.u64 %r433,%r423,%r99; .loc 1 225 18 add.u64 %r434,%r424,%r131; .loc 1 219 18 ld.s8 %r437,[%r433]; cvt.u16.u32.loc 1 219 23 ld.s8 %r440,[%r434]; cvt.u16.u32 %r441,%r440; setp.eq.u16 %r442,%r441,0; @ ! %r442 bra $L125; $L70: .loc 1 224 18 add.u64 %r443,%r433,%r99; .loc 1 225 18 add.u64 %r444,%r434,%r131; .loc 1 219 18 ld.s8 %r447,[%r443]; cvt.u16.u32 %r448,%r447; setp.eq.u16 %r449,%r448,0; @ %r449 bra $L72; .loc 1 219 23 ld.s8 %r450,[%r444]; cvt.u16.u32 %r451,%r450; setp.eq.u16 %r452,%r451,0; @ ! %r452 bra $L125; $L72: .loc 1 224 18 add.u64 %r453,%r443,%r99; .loc 1 225 18 add.u64 %r454,%r444,%r131; .loc 1 219 18 ld.s8 %r457,[%r453]; cvt.u16.u32 %r458,%r457; setp.eq.u16 %r459,%r458,0; @ %r459 bra $L74; .loc 1 219 23 ld.s8 %r460,[%r454]; cvt.u16.u32 %r461,%r460; setp.eq.u16 %r462,%r461,0; @ ! %r462 bra $L125; $L74: .loc 1 224 18 add.u64 %r463,%r453,%r99; .loc 1 225 18 add.u64 %r464,%r454,%r131; .loc 1 219 18 ld.s8 %r467,[%r463]; cvt.u16.u32 %r468,%r467; setp.eq.u16 %r469,%r468,0; @ %r469 bra $L76; .loc 1 219 23 ld.s8 %r470,[%r464]; cvt.u16.u32 %r471,%r470; setp.eq.u16 %r472,%r471,0; @ ! 
%r472 bra $L125; $L76: .loc 1 224 18 add.u64 %r113,%r463,%r99; .loc 1 225 18 add.u64 %r114,%r464,%r131; .loc 1 217 35 add.u64 %r115,%r354,7; .loc 1 217 11 setp.ne.u64 %r473,%r101,%r115; @ %r473 bra $L24; bra $L22; } shape_gfortran_shape_1 .visible .func _gfortran_shape_1fortran/generated/shape_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarraygfortran_shape_1 .visible .func _gfortran_shape_1predpred %r74; .reg .u16 %r75; .reg .predpred %r8814 ld.s8 %r22,[%r56+28]; .loc 1 44 6 ld.u64 %r57,[%r55]; setp.ne.u64 %r58,%r57,0; @ %r58 bra $L2; .loc 1 46 7 st.u64 [%r55+48],%r57; add.u32 %r60,%r22,-1; cvt.s64.s32 %r61,%r60; st.u64 [%r55+56],%r61; mov.u64 %r62,1; st.u64 [%r55+40],%r62; .loc 1 47 19 st.u64 [%r55+8],%r57; .loc 1 48 24 cvt.u32.u32 %r67,%r22; cvt.s64.s8 %r66,%r662; call (%value_in),_gfortrani_xmallocarray.loc 1 48 22 st.u64 [%r55],%r68; $L2: .loc 1 53 7 ld.u64 %r71,[%r55+56]; add.u64 %r70,%r71,1; ld.u64 %r73,[%r55+48]; sub.u64 %r72,%r70,%r73; .loc 1 53 6 setp.le.s64 %r74,%r72,0; @ %r74 bra $L1; .loc 1 56 3 cvt.u16.u32 %r75,%r22; setp.le.s16 %r76,%r75,0; @ %r76 bra $L1; add.u64 %r40,%r56,48; ld.u64 %r51,[%r55+40]; ld.u64 %r52,[%r55]; add.u64 %r50,%r56,56; cvt.u32.u32 %r78,%r22; cvt.s64.s8 %r77,%r78; add.u64 %r80,%r77,%r77; add.u64 %r81,%r80,%r77; shl.b64 %r82,%r81,3; add.u64 %r38,%r82,%r40; $L4: .loc 1 58 16 ld.u64 %r86,[%r50]; add.u64 %r85,%r86,1; .loc 1 58 14 ld.u64 %r87,[%r40]; sub.u64 %r84,%r85,%r87; .loc 1 59 56 max.s64 %r84,%r84,0; .loc 1 59 34 st.u8 [%r52],%r84; .loc 1 56 3 add.u64 %r40,%r40,24; add.u64 %r52,%r52,%r51; add.u64 %r50,%r50,24; setp.ne.u64 %r88,%r38,%r40; @ %r88 bra $L4; $L1: .loc 1 61 1gfortran_shape_2 .visible .func _gfortran_shape_fortran/generated/shape_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarraygfortran_shape_2 .visible .func _gfortran_shape_predpredpred %r91; mov.u64 %r56,%ar0; mov.u64 %r57,%ar1; .loc 1 42 14 ld.s8 %r22,[%r57+28]; .loc 1 44 6 ld.u64 %r58,[%r56]; setp.ne.u64 %r59,%r58,0; @ %r59 bra $L2; .loc 1 46 7 st.u64 [%r56+48],%r58; add.u32 %r61,%r22,-1; cvt.s64.s32 %r62,%r61; st.u64 [%r56+56],%r62; mov.u64 %r63,1; st.u64 [%r56+40],%r63; .loc 1 47 19 st.u64 [%r56+8],%r58; .loc 1 48 24 cvt.u32.u32 %r68,%r22; cvt.s64.s8 %r67,%r68; mov.u64 %r66,2_gfortrani_xmallocarray.loc 1 48 22 st.u64 [%r56],%r69; $L2: .loc 1 53 7 ld.u64 %r72,[%r56+56]; add.u64 %r71,%r72,1; ld.u64 %r74,[%r56+48]; sub.u64 %r73,%r71,%r74; .loc 1 53 6 setp.le.s64 %r75,%r73,0; @ %r75 bra $L1; .loc 1 56 3 cvt.u16.u32 %r76,%r22; setp.le.s16 %r77,%r76,0; @ %r77 bra $L1; add.u64 %r40,%r57,48; ld.u64 %r78,[%r56+40]; add.u64 %r51,%r78,%r78; ld.u64 %r53,[%r56]; add.u64 %r50,%r57,56; cvt.u32.u32 %r81,%r22; cvt.s64.s8 %r80,%r81; add.u64 %r83,%r80,%r80; add.u64 %r84,%r83,%r80; shl.b64 %r85,%r84,3; add.u64 %r38,%r85,%r40; $L4: .loc 1 58 16 ld.u64 %r89,[%r50]; add.u64 %r88,%r89,1; .loc 1 58 14 ld.u64 %r90,[%r40]; sub.u64 %r87,%r88,%r90; .loc 1 59 56 max.s64 %r87,%r87,0; .loc 1 59 34 st.u16 [%r53],%r87; .loc 1 56 3 add.u64 %r40,%r40,24; add.u64 %r53,%r53,%r51; add.u64 %r50,%r50,24; setp.ne.u64 %r91,%r38,%r40; @ %r91 bra $L4; $L1: .loc 1 61 1 ret; } shape_gfortran_shape_4 .visible .func _gfortran_shape_4fortran/generated/shape_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarraygfortran_shape_4 .visible .func _gfortran_shape_4predpredpred %r90; mov.u64 %r56,%ar0; mov.u64 %r57,%ar1; .loc 1 42 14 ld.s8 %r22,[%r57+28]; .loc 1 44 6 ld.u64 %r58,[%r56]; setp.ne.u64 %r59,%r58,0; @ %r59 bra $L2; .loc 1 46 7 st.u64 
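// NOTE (editorial): the surrounding PTX is the nvptx build of the libgfortran SHAPE runtime
// (_gfortran_shape_1 / _gfortran_shape_2 / _gfortran_shape_4 and the wider kinds, generated from
// libgfortran/generated/shape_i<kind>.c). Judging from the .loc annotations, each entry point
// allocates the result through _gfortrani_xmallocarray when it is still unallocated and then
// stores, per dimension, the extent max(ubound - lbound + 1, 0). A minimal C sketch of that
// per-dimension loop follows; the descriptor layout below is a hypothetical simplification,
// not the real libgfortran types.

  typedef struct { long lbound, ubound; } dim_t;   /* hypothetical simplified descriptor */

  static void shape_sketch(long *ret, const dim_t *dims, int rank)
  {
      for (int n = 0; n < rank; n++) {
          long extent = dims[n].ubound + 1 - dims[n].lbound;  /* cf. .loc line 58: ub + 1 - lb */
          ret[n] = extent > 0 ? extent : 0;                   /* cf. .loc line 59: clamp at zero */
      }
  }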
[%r56+48],%r58; add.u32 %r61,%r22,-1; cvt.s64.s32 %r62,%r61; st.u64 [%r56+56],%r62; mov.u64 %r63,1; st.u64 [%r56+40],%r63; .loc 1 47 19 st.u64 [%r56+8],%r58; .loc 1 48 24 cvt.u32.u32 %r68,%r22; cvt.s64.s8 %r67,%r68; mov.u64 %r66,4_gfortrani_xmallocarray.loc 1 48 22 st.u64 [%r56],%r69; $L2: .loc 1 53 7 ld.u64 %r72,[%r56+56]; add.u64 %r71,%r72,1; ld.u64 %r74,[%r56+48]; sub.u64 %r73,%r71,%r74; .loc 1 53 6 setp.le.s64 %r75,%r73,0; @ %r75 bra $L1; .loc 1 56 3 cvt.u16.u32 %r76,%r22; setp.le.s16 %r77,%r76,0; @ %r77 bra $L1; add.u64 %r40,%r57,48; ld.u64 %r78,[%r56+40]; shl.b64 %r51,%r78,2; ld.u64 %r53,[%r56]; add.u64 %r50,%r57,56; cvt.u32.u32 %r80,%r22; cvt.s64.s8 %r79,%r80; add.u64 %r82,%r79,%r79; add.u64 %r83,%r82,%r79; shl.b64 %r84,%r83,3; add.u64 %r38,%r84,%r40; $L4: .loc 1 58 16 ld.u64 %r88,[%r50]; add.u64 %r87,%r88,1; .loc 1 58 14 ld.u64 %r89,[%r40]; sub.u64 %r86,%r87,%r89; .loc 1 59 56 max.s64 %r86,%r86,0; .loc 1 59 34 st.u32 [%r53],%r86; .loc 1 56 3 add.u64 %r40,%r40,24; add.u64 %r53,%r53,%r51; add.u64 %r50,%r50,24; setp.ne.u64 %r90,%r38,%r40; @ %r90 bra $L4; $L1: .loc 1 61 1gfortran_shape_8 .visible .func _gfortran_shape_8fortran/generated/shape_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarraygfortran_shape_8 .visible .func _gfortran_shape_8.reg .pred %r74; .reg .u16 %r75; .reg .predpred %r9014 ld.s8 %r22,[%r56+28]; .loc 1 44 6 ld.u64 %r57,[%r55]; setp.ne.u64 %r58,%r57,0; @ %r58 bra $L2; .loc 1 46 7 st.u64 [%r55+48],%r57; add.u32 %r60,%r22,-1; cvt.s64.s32 %r61,%r60; st.u64 [%r55+56],%r61; mov.u64 %r62,1; st.u64 [%r55+40],%r62; .loc 1 47 19 st.u64 [%r55+8],%r57; .loc 1 48 24 cvt.u32.u32 %r67,%r22; cvt.s64.s8 %r66,%r67; mov.u64 66; call (%value_in),_gfortrani_xmallocarray.loc 1 48 22 st.u64 [%r55],%r68; $L2: .loc 1 53 7 ld.u64 %r71,[%r55+56]; add.u64 %r70,%r71,1; ld.u64 %r73,[%r55+48]; sub.u64 %r72,%r70,%r73; .loc 1 53 6 setp.le.s64 %r74,%r72,0; @ %r74 bra $L1; .loc 1 56 3 cvt.u16.u32 %r75,%r22; setp.le.s16 %r76,%r75,0; @ %r76 bra $L1; add.u64 %r40,%r56,48; ld.u64 %r77,[%r55+40]; shl.b64 %r49,%r77,3; ld.u64 %r51,[%r55]; add.u64 %r48,%r56,56; cvt.u32.u32 %r79,%r22; cvt.s64.s8 %r78,%r79; add.u64 %r81,%r78,%r78; add.u64 %r82,%r81,%r78; shl.b64 %r83,%r82,3; add.u64 %r53,%r83,%r40; $L4: .loc 1 58 16 ld.u64 %r86,[%r48]; add.u64 %r85,%r86,1; .loc 1 58 14 ld.u64 %r88,[%r40]; sub.u64 %r87,%r85,%r88; .loc 1 59 56 max.s64 %r89,%r87,0; .loc 1 59 34 st.u64 [%r51],%r89; .loc 1 56 3 add.u64 %r40,%r40,24; add.u64 %r51,%r51,%r49; add.u64 %r48,%r48,24; setp.ne.u64 %r90,%r40,%r53; @ %r90 bra $L4; $L1: .loc 1 61 1 ret; } shape_i16.o/gfortran_shape_16 .visible .func _gfortran_shape_16fortran/generated/shape_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarraygfortran_shape_16 .visible .func _gfortran_shape_16predpredmov.u64 %r56,%ar0; mov.u64 %r57,%ar1; .loc 1 42 14 ld.s8 %r22,[%r57+28]; .loc 1 44 6 ld.u64 %r58,[%r56]; setp.ne.u64 %r59,%r58,0; @ %r59 bra $L2; .loc 1 46 7 st.u64 [%r56+48],%r58; add.u32 %r61,%r22,-1; cvt.s64.s32 %r62,%r61; st.u64 [%r56+56],%r62; mov.u64 %r63,1; st.u64 [%r56+40],%r63; .loc 1 47 19 st.u64 [%r56+8],%r58; .loc 1 48 24 cvt.u32.u32 %r68,%r22; cvt.s64.s8 %r67,%r68; mov.u64 %r66,_gfortrani_xmallocarray.loc 1 48 22 st.u64 [%r56],%r69; $L2: .loc 1 53 7 ld.u64 %r72,[%r56+56]; add.u64 %r71,%r72,1; ld.u64 %r74,[%r56+48]; sub.u64 %r73,%r71,%r74; .loc 1 53 6 setp.le.s64 %r75,%r73,0; @ %r75 bra $L1; .loc 1 56 3 cvt.u16.u32 %r76,%r22; setp.le.s16 %r77,%r76,0; @ %r77 bra $L1; add.u64 %r40,%r57,48; ld.u64 %r78,[%r56+40]; shl.b64 
%r51,%r78,4; ld.u64 %r53,[%r56]; add.u64 %r50,%r57,56; cvt.u32.u32 %r80,%r22; cvt.s64.s8 %r79,%r80; add.u64 %r82,%r79,%r79; add.u64 %r83,%r82,%r79; shl.b64 %r84,%r83,3; add.u64 %r38,%r84,%r40; $L4: .loc 1 58 16 ld.u64 %r88,[%r50]; add.u64 %r87,%r88,1; .loc 1 58 14 ld.u64 %r89,[%r40]; sub.u64 %r86,%r87,%r89; .loc 1 59 56 max.s64 %r86,%r86,0; st.u64 [%r53],%r86; shr.s64 %r90,%r86,63; st.u64 [%r53+8],%r90; .loc 1 56 3 add.u64 %r40,%r40,24; add.u64 %r53,%r53,%r51; add.u64 %r50,%r50,24; setp.ne.u64 %r91,%r38,%r40; @ %r91 bra $L4; $L1: .loc 1 61 1 ret; } eoshift1_4.o/BEGIN FUNCTION DECL: eoshift1 .func eoshift1); .file 1 "../../../libgfortran/generated/eoshift1_4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_eoshift1_4 .visible .func _gfortran_eoshift1_4_gfortran_eoshift1_4_char .visible .func _gfortran_eoshift1_4_charVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_bounds_reduced_extents .extern .func _gfortrani_bounds_reduced_extents)13] = {114,101,116,117,114,110,32,118,97,108,117,101] = {32,0 }8] = {69,79,83,72,73,70,8473,70,84,32,97,114,103,117,109,101,110,12] = {0,0 }; // BEGIN VAR DEF: space$0 .const .align 4 .u32 space$0[1] = {32 }; // BEGIN FUNCTION DEF: eoshift1 .func eoshift1) {local .align 16 .b8 %frame_ar[608predpredpredpred %r195; .reg .u32 %r197; .reg .predpredpredpred %r246; .reg .predu64 %r319; .reg .predu64u64 %r351; .reg .u64 %r352; .reg .u64 %r353; .reg .predpredmov.u64 %r165,%ar0; mov.u64 %r166,%ar1; mov.u64 %r167,%ar2; mov.u64 %r168,%ar3; mov.u64 %r169,%ar4; mov.u64 %r170,%ar5; mov.u64 %r171,%ar6; .loc 1 74 10 ld.u64 %r22,[%r166+16]; .loc 1 76 6 setp.eq.u64 %r172,%r169,0; @ %r172 bra $L51; .loc 1 77 11 ld.u32 %r173,[%r169]; add.u32 %r104,%r173,-1; bra $L2; $L51: .loc 1 79 11 cvt.u32.u64 %r104,%r169; $L2: .loc 1 81 13 mov.u64 %r174,1; st.u64 [%frame],%r174; .loc 1 82 12 mov.u64 %r175,0; st.u64 [%frame+120],%r175166; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r177,[%value_in]; } .loc 1 85 6 ld.u64 %r178,[%r165]; setp.ne.u64 %r179,%r178,0; @ %r179 bra $L3; .loc 1 87 19 st.u64 [%r165+8],%r178; .loc 1 88 7 ld.u64 %r181,[%r166+16]; st.u64 [%r165+16],%r181; ld.u64 %r182,[%r166+24]; st.u64 [%r165+24],%r182; .loc 1 89 34 ld.s8 %r162,[%r166+28]; .loc 1 89 7 cvt.u16.u32 %r183,%r162; setp.gt.s16 %r184,%r183,0; @ %r184 bra $L4; $L7: .loc 1 105 24; call (%value_in),_gfortrani_xmallocarray187,[%value_in]; } .loc 1 105 22 st.u64 [%r165],%r187; cvta.global.u64 %r360,_gfortrani_compile_options; bra $L5; $L6: .loc 1 93 9 ld.u64 %r190,[%r53+8]; ld.u64 %r192,[%r53]; .loc 1 98 17 mad.lo.u64 %r108,%r40,%r108,%r108; add.u64 %r53,%r53,24; .loc 1 93 7 sub.u64 %r40,%r190,%r192; $L49: add.u64 %r164,%r165,%r36; .loc 1 101 4 st.u64 [%r164+48],%r194; st.u64 [%r164+56],%r40; st.u64 [%r164+40],%r108; .loc 1 89 7 add.u64 %r36,%r36,24; setp.ne.u64 %r195,%r36,%r353; @ %r195 bra $L6; bra $L7; $L3: .loc 1 108 12 cvta.global.u64 %r360,_gfortrani_compile_options; .loc 1 108 11 ld.u32 %r197,[%r360+36]; setp.ne.u32 %r198,%r197,0; @ ! 
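// NOTE (editorial): the eoshift1_4.o section starting above declares the nvptx entry points
// _gfortran_eoshift1_4, _gfortran_eoshift1_4_char and _gfortran_eoshift1_4_char4
// (from libgfortran/generated/eoshift1_4.c) together with the externs they rely on:
// _gfortran_size0, _gfortrani_xmallocarray, _gfortrani_bounds_equal_extents and
// _gfortrani_bounds_reduced_extents. The loads from _gfortrani_compile_options+36 followed by
// branches to the two bounds helpers suggest the usual libgfortran pattern of validating the
// extents of RET and SHIFT only when bounds checking was requested at compile time; the field
// offset is an observation from this listing, not a documented ABI constant.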
%r198 bra $L9; .loc 1 110 7 cvta.const.u64 %r202,$LC0;r16620202; call _gfortrani_bounds_equal_extents$L5: .loc 1 114 6 ld.u32 %r204,[%r360+36]; setp.eq.u32 %r205,%r204,0; @ %r205 bra $L9; .loc 1 116 7 cvta.const.u64 %r210,$LC0; cvta.const.u64 %r20104210; call _gfortrani_bounds_reduced_extents); } $L9: .loc 1 120 6 setp.eq.u64 %r211,%r177,0; @ %r211 bra $L1; .loc 1 124 23 ld.s8 %r150,[%r166+28]; cvt.u32.u32 %r212,%r150; cvt.s64.s8 %r159,%r212; .loc 1 124 3 setp.gt.s64 %r213,%r159,0; @ %r213 bra $L12; .loc 1 70 7 mov.u64 %r98,0; .loc 1 71 11 mov.u64 %r90,%r98; .loc 1 72 11 mov.u64 %r83,%r98; bra $L13; $L12: .loc 1 126 15 cvt.s64.s32 %r114,%r104; .loc 1 128 21 ld.u64 %r84,[%r165+16]; .loc 1 131 21 ld.u64 %r154,[%r166+16]; mov.u64 %r93,40; .loc 1 123 5 mov.u64 %r100,0; .loc 1 70 7 mov.u64 %r98,%r100; .loc 1 124 12 mov.u64 %r138,%r100; .loc 1 71 11 mov.u64 %r90,%r100; .loc 1 72 11 mov.u64 %r83,%r100; .loc 1 138 20 mov.u64 %r373,%r100; $L18: add.u64 %r68,%r166,%r93; .loc 1 134 17 ld.u64 %r215,[%r68+16]; add.u64 %r214,%r215,1; .loc 1 134 15 ld.u64 %r216,[%r68+8]; sub.u64 %r99,%r214,%r216; .loc 1 128 21 add.u64 %r217,%r165,%r93; ld.u64 %r218,[%r217]; mul.lo.u64 %r107,%r84,%r218; .loc 1 131 21 ld.u64 %r219,[%r68]; mul.lo.u64 %r148,%r154,%r219; .loc 1 126 10 setp.ne.u64 %r220,%r114,%r138; @ %r220 bra $L14; .loc 1 129 14 setp.ne.u64 %r221,%r107,0; .loc 1 74 8 selp.u64 %r83,%r107,%r22,%r221; .loc 1 132 14 setp.ne.u64 %r222,%r148,0; .loc 1 74 8 selp.u64 %r90,%r148,%r22,%r222; .loc 1 134 15 mov.u64 %r98,%r99; bra $L17; $L14: .loc 1 138 20 shl.b64 %r223,%r100,3; add.u64 %r224,%frame,%r223; add.u64 %r225,%r224,120; st.u64 [%r225],%r373; .loc 1 139 21 st.u64 [%r224],%r99; .loc 1 140 22 st.u64 [%r224+480],%r107; .loc 1 141 22 st.u64 [%r224+360],%r148; .loc 1 143 24 add.u64 %r239,%r100,%r100; add.u64 %r240,%r239,%r100; shl.b64 %r241,%r240,3; add.u64 %r242,%r167,%r241; .loc 1 143 22 ld.u64 %r244,[%r242+40]; st.u64 [%r224+240],%r244; .loc 1 144 12 add.u64 %r100,%r100,1; $L17: .loc 1 124 55 add.u64 %r138,%r138,1; .loc 1 124 3 add.u64 %r93,%r93,24; setp.ne.u64 %r245,%r138,%r159; @ %r245 bra $L18; $L13: .loc 1 147 14 ld.u64 %r43,[%frame+360]; .loc 1 147 6 setp.ne.u64 %r246,%r43,0; .loc 1 74 8 selp.u64 %r43,%r43,%r22,%r246; .loc 1 149 14 ld.u64 %r37,[%frame+480]; .loc 1 149 6 setp.ne.u64 %r247,%r37,0; .loc 1 74 8 selp.u64 %r37,%r37,%r22,%r247; .loc 1 151 14 ld.u64 %r39,[%frame+240]; .loc 1 151 6 setp.ne.u64 %r248,%r39,0; .loc 1 152 16 selp.u64 %r39,%r39,1,%r248; .loc 1 158 8 ld.u64 %r137,[%r165]; .loc 1 159 8 ld.u64 %r50,[%r166]; .loc 1 160 8 ld.u64 %r51,[%r167]; .loc 1 162 9 setp.eq.u64 %r249,%r137,0; @ %r249 bra $L1; .loc 1 228 12 shl.b64 %r58,%r39,2; .loc 1 186 38 set.u32.eq.u64 %r251,%r22,%r83; neg.s32 %r252,%r251; .loc 1 186 19 set.u32.eq.u64 %r254,%r22,%r90; neg.s32 %r255,%r254; .loc 1 186 27 cvt.u16.u32 %r257,%r252; cvt.u16.u32 %r258,%r255; and.b16 %r256,%r257,%r258; cvt.u32.u16 %r259,%r256; cvt.u32.u8 %r96,%r259; .loc 1 231 32 ld.u64 %r139,[%frame]; .loc 1 238 30 mul.lo.u64 %r122,%r139,%r37; .loc 1 239 30 mul.lo.u64 %r260,%r139,%r43; .loc 1 239 16 neg.s64 %r144,%r260; .loc 1 240 23 mul.lo.u64 %r261,%r139,%r39; .loc 1 240 9 shl.b64 %r262,%r261,2; neg.s64 %r145,%r262; setp.ne.u32 %r361,%r96,0; setp.ne.u64 %r356,%r168,0; setp.le.s64 %r355,%r159,2; .loc 1 168 10 cvt.u32.u64 %r364,%r98; setp.eq.u64 %r365,%r171,1; .loc 1 219 8 setp.gt.s64 %r366,%r22,0; cvt.u32.u32 %r367,%r150; cvt.s64.s8 %r368,%r367; add.u64 %r369,%r368,-1; add.u64 %r370,%frame,480; add.u64 %r371,%frame,360; add.u64 %r372,%frame,240; $L50: .loc 
1 165 10 ld.u32 %r105,[%r51]; .loc 1 166 26 setp.lt.s32 %r263,%r105,0; @ %r263 bra $L23; cvt.s64.s32 %r110,%r105; bra $L24; $L23: .loc 1 166 28 neg.s32 %r264,%r105; .loc 1 166 26 cvt.s64.s32 %r110,%r264; $L24: .loc 1 166 10 setp.le.s64 %r265,%r110,%r98; @ %r265 bra $L25; .loc 1 168 10 mov.u32 %r105,%r364; mov.u32 %r106,%r364; bra $L26; $L25: .loc 1 172 8 abs.s32 %r106,%r105; $L26: .loc 1 176 29 cvt.s64.s32 %r95,%r106; .loc 1 174 10 setp.le.s32 %r266,%r105,0; @ %r266 bra $L27; .loc 1 176 15 mad.lo.u64 %r94,%r95,%r90,%r50; mov.u64 %r87,%r137; bra $L28; $L27: .loc 1 182 16 mad.lo.u64 %r87,%r95,%r83,%r137; mov.u64 %r94,%r50; $L28: .loc 1 194 24 sub.u64 %r102,%r98,%r95; .loc 1 186 10 @ %r361 bra $L29; .loc 1 194 4 setp.gt.s64 %r270,%r102,0; @ %r270 bra $L30; bra $L31; $L29: .loc 1 188 24 mul.lo.u64 %r131,%r102,%r22; .loc 1 189 4131277,[%value_in]; } .loc 1 190 9 add.u64 %r87,%r87,%r131; bra $L31; $L30: .loc 1 198 12 mov.u64 %r129,%r87; .loc 1 194 11 mov.u64 %r130,0; $L32:285,[%value_in]; } .loc 1 197 13 add.u64 %r129,%r129,%r83; .loc 1 198 12 add.u64 %r94,%r94,%r90; .loc 1 194 34 add.u64 %r130,%r130,1; .loc 1 194 4 setp.ne.u64 %r287,%r102,%r130; @ %r287 bra $L32; .loc 1 197 13 mad.lo.u64 %r87,%r102,%r83,%r87; $L31: .loc 1 201 10 setp.ge.s32 %r289,%r105,0; selp.u64 %r87,%r87,%r137,%r289; .loc 1 212 10 add.u64 %r133,%r95,-1; .loc 1 205 10 @ %r356 bra $L34; .loc 1 212 8 setp.ne.u64 %r291,%r95,0; @ %r291 bra $L45; $L38: .loc 1 226 12 add.u64 %r137,%r137,%r37; .loc 1 227 12 add.u64 %r50,%r50,%r43; .loc 1 228 12 add.u64 %r51,%r51,%r58; .loc 1 229 15 ld.u64 %r292,[%frame+120]; add.u64 %r60,%r292,1; .loc 1 231 13 setp.eq.u64 %r293,%r60,%r139; @ %r293 bra $L36; .loc 1 229 15 st.u64 [%frame+120],%r60; bra $L50; $L34: .loc 1 206 8 setp.eq.u64 %r294,%r95,0; @ %r294 bra $L38; $L39: .loc 1 208 61681,[%value_in]; } .loc 1 209 11 add.u64 %r87,%r87,%r83; .loc 1 206 10 add.u64 %r133,%r133,-1; .loc 1 206 8 setp.ne.u64 %r303,%r133,-1; @ %r303 bra $L39; bra $L38; $L45: .loc 1 216 9 @ %r365 bra $L40; .loc 1 219 8 @ %r366 bra $L41; bra $L42; $L40: .loc 1 217 8 ld.s8 %r310,[%r170]bra $L42; $L41: .loc 1 219 15 mov.u64 %r132,0; $L43: .loc 1 220 3 add.u64 %r312,%r87,%r132 add.u64 %r132,%r132,%r171; .loc 1 219 8 setp.gt.s64 %r321,%r22,%r132; @ %r321 bra $L43; $L42: .loc 1 222 11 add.u64 %r87,%r87,%r83; .loc 1 212 10 add.u64 %r133,%r133,-1; .loc 1 212 8 setp.ne.u64 %r322,%r133,-1; @ %r322 bra $L45; bra $L38; $L36: .loc 1 235 20 mov.u64 %r323,0; st.u64 [%frame+120],%r323; .loc 1 239 16 add.u64 %r134,%r50,%r144; .loc 1 240 9 add.u64 %r135,%r51,%r145; .loc 1 242 14 @ %r355 bra $L1; add.u64 %r157,%frame,128; .loc 1 238 30 mov.u64 %r63,%r122; .loc 1 242 14 mov.u64 %r156,8; .loc 1 241 12 mov.u64 %r136,1; bra $L47; $L48: .loc 1 235 20 st.u64 [%r157],%r323; .loc 1 238 30 mul.lo.u64 %r63,%r75,%r74; .loc 1 239 30 mul.lo.u64 %r329,%r77,%r74; .loc 1 239 16 sub.u64 %r134,%r50,%r329; .loc 1 240 23 mul.lo.u64 %r330,%r79,%r74; .loc 1 240 9 shl.b64 %r331,%r330,2; sub.u64 %r135,%r51,%r331; .loc 1 241 12 add.u64 %r136,%r136,1; .loc 1 242 14 add.u64 %r157,%r157,8; add.u64 %r156,%r156,8; setp.eq.u64 %r332,%r136,%r369; @ %r332 bra $L1; $L47: .loc 1 250 23 ld.u64 %r333,[%r157]; add.u64 %r74,%r333,1; st.u64 [%r157],%r74; .loc 1 251 30 add.u64 %r335,%r370,%r156; ld.u64 %r75,[%r335]; .loc 1 251 20 sub.u64 %r336,%r75,%r63; add.u64 %r137,%r137,%r336; .loc 1 252 30 add.u64 %r338,%r371,%r156; ld.u64 %r77,[%r338]; .loc 1 252 20 add.u64 %r50,%r134,%r77; .loc 1 253 23 add.u64 %r340,%r372,%r156; ld.u64 %r79,[%r340]; .loc 1 253 13 shl.b64 %r341,%r79,2; 
add.u64 %r51,%r135,%r341; .loc 1 231 32 add.u64 %r342,%frame,%r156; ld.u64 %r82,[%r342]; .loc 1 231 13 setp.eq.u64 %r343,%r74,%r82; @ %r343 bra $L48; bra $L50; $L4: .loc 1 93 9 ld.u64 %r345,[%r166+56]; ld.u64 %r347,[%r166+48]; .loc 1 93 7 sub.u64 %r40,%r345,%r347; add.u64 %r53,%r166,72; cvt.u32.u32 %r349,%r162; cvt.s64.s8 %r348,%r349; add.u64 %r351,%r348,%r348; add.u64 %r352,%r351,%r348; shl.b64 %r353,%r352,3; mov.u64 %r36,%r178; .loc 1 96 17 mov.u64 %r108,%r174; .loc 1 101 4 mov.u64 %r194,%r36; bra $L49; $L_gfortran_eoshift1_4 .visible .func _gfortran_eoshift1_4 {r22.loc 1 272 3 mov.u64 %r33,1; cvta.const.u64 %r32,$LC3eoshift127_gfortran_eoshift1_4_char .visible .func _gfortran_eoshift1_4_char.loc 1 295 3 mov.u64 %r36,1; cvta.const.u64 %r35,$LC4eoshift129_gfortran_eoshift1_4_char4 .visible .func _gfortran_eoshift1_4_char4.loc 1 319 3 mov.u64 %r36,4; cvta.const.u64 %r35,space$0eoshift1BEGIN FUNCTION DECL: eoshift1 .func eoshift1); .file 1 "../../../libgfortran/generated/eoshift1_8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_eoshift1_8 .visible .func _gfortran_eoshift1_8_gfortran_eoshift1_8_char .visible .func _gfortran_eoshift1_8_charVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_bounds_reduced_extents .extern .func _gfortrani_bounds_reduced_extents)13] = {114,101,116,117,114,110,32,118,97,108,117,101] = {32,0 }8] = {69,79,83,72,73,70,8473,70,84,32,97,114,103,117,109,101,110,12] = {0,0 }; // BEGIN VAR DEF: space$0 .const .align 4 .u32 space$0[1] = {32 }; // BEGIN FUNCTION DEF: eoshift1 .func eoshift1) {local .align 16 .b8 %frame_ar[608predpredpredpred %r195; .reg .u32 %r197; .reg .predpredpredpred %r246; .reg .predpred %r268; .reg .u64 %r275; .reg .u64 %r283; .reg .pred %r285; .reg .pred %r287; .reg .predpred %r301; .reg .u32u64 %r350; .reg .u64 %r351; .reg .u64 %r353; .reg .pred %r355; .reg .pred %r356; .reg .pred %r357; .reg .pred %r362; .reg .predu64 %r370; mov.u64 %r164,%ar0; mov.u64 %r165,%ar1; mov.u64 %r166,%ar2; mov.u64 %r167,%ar3; mov.u64 %r168,%ar4; mov.u64 %r169,%ar5; mov.u64 %r170,%ar6; .loc 1 74 10 ld.u64 %r22,[%r165+16]; .loc 1 76 6 setp.eq.u64 %r171,%r168,0; @ %r171 bra $L48; .loc 1 77 21 ld.u32 %r173,[%r168]; add.u32 %r105,%r173,-1; bra $L2; $L48: .loc 1 79 11 cvt.u32.u64 %r105,%r168; $L2: .loc 1 81 13 mov.u64 %r174,1; st.u64 [%frame],%r174; .loc 1 82 12 mov.u64 %r175,0; st.u64 [%frame+120],%r175165; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r177,[%value_in]; } .loc 1 85 6 ld.u64 %r178,[%r164]; setp.ne.u64 %r179,%r178,0; @ %r179 bra $L3; .loc 1 87 19 st.u64 [%r164+8],%r178; .loc 1 88 7 ld.u64 %r181,[%r165+16]; st.u64 [%r164+16],%r181; ld.u64 %r182,[%r165+24]; st.u64 [%r164+24],%r182; .loc 1 89 34 ld.s8 %r161,[%r165+28]; .loc 1 89 7 cvt.u16.u32 %r183,%r161; setp.gt.s16 %r184,%r183,0; @ %r184 bra $L4; $L7: .loc 1 105 24; call (%value_in),_gfortrani_xmallocarray187,[%value_in]; } .loc 1 105 22 st.u64 [%r164],%r187; cvta.global.u64 %r353,_gfortrani_compile_options; bra $L5; $L6: .loc 1 93 9 ld.u64 %r190,[%r55+8]; ld.u64 %r192,[%r55]; .loc 1 98 17 mad.lo.u64 %r108,%r42,%r108,%r108; add.u64 %r55,%r55,24; .loc 1 93 7 sub.u64 %r42,%r190,%r192; $L46: add.u64 %r163,%r164,%r38; .loc 1 101 4 st.u64 [%r163+48],%r194; st.u64 [%r163+56],%r42; st.u64 [%r163+40],%r108; .loc 1 89 7 add.u64 %r38,%r38,24; setp.ne.u64 %r195,%r38,%r351; @ %r195 
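// NOTE (editorial): the PTX above finishes the kind-4 EOSHIFT1 routines and the listing now
// repeats the same structure for _gfortran_eoshift1_8 (libgfortran/generated/eoshift1_8.c);
// the numeric suffix appears to track the integer kind of the SHIFT argument, and the _char /
// _char4 wrappers cover character payloads. Semantically these routines implement Fortran's
// end-off shift: elements shifted past the edge are dropped and the vacated positions take the
// boundary fill value. A hypothetical 1-D C sketch of that semantics follows (it is not the
// libgfortran code, which walks multi-dimensional descriptors and strides):

  /* result[i] = source[i + shift] when in range, otherwise the boundary fill value */
  static void eoshift_1d(int *dst, const int *src, long n, long shift, int boundary)
  {
      for (long i = 0; i < n; i++) {
          long j = i + shift;
          dst[i] = (j >= 0 && j < n) ? src[j] : boundary;
      }
  }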
bra $L6; bra $L7; $L3: .loc 1 108 12 cvta.global.u64 %r353,_gfortrani_compile_options; .loc 1 108 11 ld.u32 %r197,[%r353+36]; setp.ne.u32 %r198,%r197,0; @ ! %r198 bra $L9; .loc 1 110 7 cvta.const.u64 %r202,$LC0;164r20202; call _gfortrani_bounds_equal_extents$L5: .loc 1 114 6 ld.u32 %r204,[%r353+36]; setp.eq.u32 %r205,%r204,0; @ %r205 bra $L9; .loc 1 116 7 cvta.const.u64 %r210,$LC0; cvta.const.u64 %r200210; call _gfortrani_bounds_reduced_extents); } $L9: .loc 1 120 6 setp.eq.u64 %r211,%r177,0; @ %r211 bra $L1; .loc 1 124 23 ld.s8 %r151,[%r165+28]; cvt.u32.u32 %r212,%r151; cvt.s64.s8 %r149,%r212; .loc 1 124 3 setp.gt.s64 %r213,%r149,0; @ %r213 bra $L12; .loc 1 70 7 mov.u64 %r98,0; .loc 1 71 11 mov.u64 %r91,%r98; .loc 1 72 11 mov.u64 %r85,%r98; bra $L13; $L12: .loc 1 126 15 cvt.s64.s32 %r112,%r105; .loc 1 128 21 ld.u64 %r86,[%r164+16]; .loc 1 131 21 ld.u64 %r154,[%r165+16]; mov.u64 %r96,40; .loc 1 123 5 mov.u64 %r100,0; .loc 1 70 7 mov.u64 %r98,%r100; .loc 1 124 12 mov.u64 %r137,%r100; .loc 1 71 11 mov.u64 %r91,%r100; .loc 1 72 11 mov.u64 %r85,%r100; .loc 1 138 20 mov.u64 %r370,%r100; $L18: add.u64 %r70,%r165,%r96; .loc 1 134 17 ld.u64 %r215,[%r70+16]; add.u64 %r214,%r215,1; .loc 1 134 15 ld.u64 %r216,[%r70+8]; sub.u64 %r99,%r214,%r216; .loc 1 128 21 add.u64 %r217,%r164,%r96; ld.u64 %r218,[%r217]; mul.lo.u64 %r107,%r86,%r218; .loc 1 131 21 ld.u64 %r219,[%r70]; mul.lo.u64 %r147,%r154,%r219; .loc 1 126 10 setp.ne.u64 %r220,%r112,%r137; @ %r220 bra $L14; .loc 1 129 14 setp.ne.u64 %r221,%r107,0; .loc 1 74 8 selp.u64 %r85,%r107,%r22,%r221; .loc 1 132 14 setp.ne.u64 %r222,%r147,0; .loc 1 74 8 selp.u64 %r91,%r147,%r22,%r222; .loc 1 134 15 mov.u64 %r98,%r99; bra $L17; $L14: .loc 1 138 20 shl.b64 %r223,%r100,3; add.u64 %r224,%frame,%r223; add.u64 %r225,%r224,120; st.u64 [%r225],%r370; .loc 1 139 21 st.u64 [%r224],%r99; .loc 1 140 22 st.u64 [%r224+480],%r107; .loc 1 141 22 st.u64 [%r224+360],%r147; .loc 1 143 24 add.u64 %r239,%r100,%r100; add.u64 %r240,%r239,%r100; shl.b64 %r241,%r240,3; add.u64 %r242,%r166,%r241; .loc 1 143 22 ld.u64 %r244,[%r242+40]; st.u64 [%r224+240],%r244; .loc 1 144 12 add.u64 %r100,%r100,1; $L17: .loc 1 124 55 add.u64 %r137,%r137,1; .loc 1 124 3 add.u64 %r96,%r96,24; setp.ne.u64 %r245,%r137,%r149; @ %r245 bra $L18; $L13: .loc 1 147 14 ld.u64 %r45,[%frame+360]; .loc 1 147 6 setp.ne.u64 %r246,%r45,0; .loc 1 74 8 selp.u64 %r45,%r45,%r22,%r246; .loc 1 149 14 ld.u64 %r39,[%frame+480]; .loc 1 149 6 setp.ne.u64 %r247,%r39,0; .loc 1 74 8 selp.u64 %r39,%r39,%r22,%r247; .loc 1 151 14 ld.u64 %r41,[%frame+240]; .loc 1 151 6 setp.ne.u64 %r248,%r41,0; .loc 1 152 16 selp.u64 %r41,%r41,1,%r248; .loc 1 158 8 ld.u64 %r136,[%r164]; .loc 1 159 8 ld.u64 %r54,[%r165]; .loc 1 160 8 ld.u64 %r52,[%r166]; .loc 1 162 9 setp.eq.u64 %r249,%r136,0; @ %r249 bra $L1; .loc 1 228 12 shl.b64 %r60,%r41,3; .loc 1 186 38 set.u32.eq.u64 %r251,%r22,%r85; neg.s32 %r252,%r251; .loc 1 186 19 set.u32.eq.u64 %r254,%r22,%r91; neg.s32 %r255,%r254; .loc 1 186 27 cvt.u16.u32 %r257,%r252; cvt.u16.u32 %r258,%r255; and.b16 %r256,%r257,%r258; cvt.u32.u16 %r259,%r256; cvt.u32.u8 %r93,%r259; .loc 1 231 32 ld.u64 %r127,[%frame]; .loc 1 238 30 mul.lo.u64 %r120,%r127,%r39; .loc 1 239 30 mul.lo.u64 %r260,%r127,%r45; .loc 1 239 16 neg.s64 %r144,%r260; .loc 1 240 23 mul.lo.u64 %r261,%r127,%r41; .loc 1 240 9 shl.b64 %r262,%r261,3; neg.s64 %r145,%r262; setp.ne.u32 %r355,%r93,0; setp.ne.u64 %r357,%r167,0; setp.le.s64 %r356,%r149,2; setp.eq.u64 %r362,%r170,1; .loc 1 219 8 setp.gt.s64 %r363,%r22,0; cvt.u32.u32 %r364,%r151; cvt.s64.s8 
%r365,%r364; add.u64 %r366,%r365,-1; add.u64 %r367,%frame,480; add.u64 %r368,%frame,360; add.u64 %r369,%frame,240; $L47: .loc 1 165 10 ld.u64 %r106,[%r52]; .loc 1 166 26 abs.s64 %r47,%r106; .loc 1 166 10 setp.gt.s64 %r263,%r47,%r98; @ %r263 bra $L51; .loc 1 194 24 sub.u64 %r63,%r98,%r47; bra $L23; $L51: mov.u64 %r47,%r98; mov.u64 %r106,%r98; mov.u64 %r63,0; $L23: .loc 1 174 10 setp.le.s64 %r264,%r106,0; @ %r264 bra $L24; .loc 1 176 15 mad.lo.u64 %r94,%r47,%r91,%r54; mov.u64 %r89,%r136; bra $L25; $L24: .loc 1 182 16 mad.lo.u64 %r89,%r47,%r85,%r136; mov.u64 %r94,%r54; $L25: .loc 1 186 10 @ %r355 bra $L26; .loc 1 194 4 setp.ne.u64 %r268,%r63,0; @ %r268 bra $L27; bra $L28; $L26: .loc 1 188 24 mul.lo.u64 %r130,%r63,%r22; .loc 1 189 4130275,[%value_in]; } .loc 1 190 9 add.u64 %r89,%r89,%r130; bra $L28; $L27: .loc 1 198 12 mov.u64 %r128,%r89; .loc 1 194 11 mov.u64 %r129,0; $L29:1283,[%value_in]; } .loc 1 197 13 add.u64 %r128,%r128,%r85; .loc 1 198 12 add.u64 %r94,%r94,%r91; .loc 1 194 34 add.u64 %r129,%r129,1; .loc 1 194 4 setp.ne.u64 %r285,%r63,%r129; @ %r285 bra $L29; .loc 1 197 13 mad.lo.u64 %r89,%r63,%r85,%r89; $L28: .loc 1 201 10 setp.ge.s64 %r287,%r106,0; selp.u64 %r89,%r89,%r136,%r287; .loc 1 212 10 add.u64 %r132,%r47,-1; .loc 1 205 10 @ %r357 bra $L31; .loc 1 212 8 setp.ne.u64 %r289,%r47,0; @ %r289 bra $L42; $L35: .loc 1 226 12 add.u64 %r136,%r136,%r39; .loc 1 227 12 add.u64 %r54,%r54,%r45; .loc 1 228 12 add.u64 %r52,%r52,%r60; .loc 1 229 15 ld.u64 %r290,[%frame+120]; add.u64 %r62,%r290,1; .loc 1 231 13 setp.eq.u64 %r291,%r62,%r127; @ %r291 bra $L33; .loc 1 229 15 st.u64 [%frame+120],%r62; bra $L47; $L31: .loc 1 206 8 setp.eq.u64 %r292,%r47,0; @ %r292 bra $L35; $L36: .loc 1 208 6167299,[%value_in]; } .loc 1 209 11 add.u64 %r89,%r89,%r85; .loc 1 206 10 add.u64 %r132,%r132,-1; .loc 1 206 8 setp.ne.u64 %r301,%r132,-1; @ %r301 bra $L36; bra $L35; $L42: .loc 1 216 9 @ %r362 bra $L37; .loc 1 219 8 @ %r363 bra $L38; bra $L39; $L37: .loc 1 217 8 ld.s8 %r308,[%r169]89bra $L39; $L38: .loc 1 219 15 mov.u64 %r131,0; $L40: .loc 1 220 3 add.u64 %r310,%r89,%r131170317,[%value_in]; } .loc 1 219 32 add.u64 %r131,%r131,%r170; .loc 1 219 8 setp.gt.s64 %r319,%r22,%r131; @ %r319 bra $L40; $L39: .loc 1 222 11 add.u64 %r89,%r89,%r85; .loc 1 212 10 add.u64 %r132,%r132,-1; .loc 1 212 8 setp.ne.u64 %r320,%r132,-1; @ %r320 bra $L42; bra $L35; $L33: .loc 1 235 20 mov.u64 %r321,0; st.u64 [%frame+120],%r321; .loc 1 239 16 add.u64 %r133,%r54,%r144; .loc 1 240 9 add.u64 %r134,%r52,%r145; .loc 1 242 14 @ %r356 bra $L1; add.u64 %r157,%frame,128; .loc 1 238 30 mov.u64 %r65,%r120; .loc 1 242 14 mov.u64 %r156,8; .loc 1 241 12 mov.u64 %r135,1; bra $L44; $L45: .loc 1 235 20 st.u64 [%r157],%r321; .loc 1 238 30 mul.lo.u64 %r65,%r77,%r76; .loc 1 239 30 mul.lo.u64 %r327,%r79,%r76; .loc 1 239 16 sub.u64 %r133,%r54,%r327; .loc 1 240 23 mul.lo.u64 %r328,%r81,%r76; .loc 1 240 9 shl.b64 %r329,%r328,3; sub.u64 %r134,%r52,%r329; .loc 1 241 12 add.u64 %r135,%r135,1; .loc 1 242 14 add.u64 %r157,%r157,8; add.u64 %r156,%r156,8; setp.eq.u64 %r330,%r135,%r366; @ %r330 bra $L1; $L44: .loc 1 250 23 ld.u64 %r331,[%r157]; add.u64 %r76,%r331,1; st.u64 [%r157],%r76; .loc 1 251 30 add.u64 %r333,%r367,%r156; ld.u64 %r77,[%r333]; .loc 1 251 20 sub.u64 %r334,%r77,%r65; add.u64 %r136,%r136,%r334; .loc 1 252 30 add.u64 %r336,%r368,%r156; ld.u64 %r79,[%r336]; .loc 1 252 20 add.u64 %r54,%r133,%r79; .loc 1 253 23 add.u64 %r338,%r369,%r156; ld.u64 %r81,[%r338]; .loc 1 253 13 shl.b64 %r339,%r81,3; add.u64 %r52,%r134,%r339; .loc 1 231 32 add.u64 
%r340,%frame,%r156; ld.u64 %r84,[%r340]; .loc 1 231 13 setp.eq.u64 %r341,%r76,%r84; @ %r341 bra $L45; bra $L47; $L4: .loc 1 93 9 ld.u64 %r343,[%r165+56]; ld.u64 %r345,[%r165+48]; .loc 1 93 7 sub.u64 %r42,%r343,%r345; add.u64 %r55,%r165,72; cvt.u32.u32 %r347,%r161; cvt.s64.s8 %r346,%r347; add.u64 %r349,%r346,%r346; add.u64 %r350,%r349,%r346; shl.b64 %r351,%r350,3; mov.u64 %r38,%r178; .loc 1 96 17 mov.u64 %r108,%r174; .loc 1 101 4 mov.u64 %r194,%r38; bra $L46; $L_gfortran_eoshift1_8 .visible .func _gfortran_eoshift1_8 {r22.loc 1 272 3 mov.u64 %r33,1; cvta.const.u64 %r32,$LC3eoshift127_gfortran_eoshift1_8_char .visible .func _gfortran_eoshift1_8_char.loc 1 295 3 mov.u64 %r36,1; cvta.const.u64 %r35,$LC4eoshift129_gfortran_eoshift1_8_char4 .visible .func _gfortran_eoshift1_8_char4.loc 1 319 3 mov.u64 %r36,4; cvta.const.u64 %r35,space$0eoshift1BEGIN FUNCTION DECL: eoshift1 .func eoshift1); .file 1 "../../../libgfortran/generated/eoshift1_16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_eoshift1_16 .visible .func _gfortran_eoshift1_16_gfortran_eoshift1_16_char .visible .func _gfortran_eoshift1_16_char_gfortran_eoshift1_16_char4 .visible .func _gfortran_eoshift1_16_char4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_bounds_reduced_extents .extern .func _gfortrani_bounds_reduced_extents)13] = {114,101,116,117,114,110,32,118,97,108,117,101] = {32,0 }8] = {69,79,83,72,73,70,8473,70,84,32,97,114,103,117,109,101,110,12] = {0,0 }; // BEGIN VAR DEF: space$0 .const .align 4 .u32 space$0[1] = {32 }; // BEGIN FUNCTION DEF: eoshift1 .func eoshift1) {local .align 16 .b8 %frame_ar[608u64 %r200; .reg .pred %r201; .reg .u32 %r203; .reg .pred %r204; .reg .u64 %r207; .reg .u64 %r208; .reg .u32pred %r251; .reg .pred %r252; .reg .predu32u64 %r269; .reg .u64 %r271; .reg .u64pred %r297; .reg .pred %r300; .reg .u64 %r302; .reg .u64 %r305; .reg .u32 %r307; .reg .u64 %r308; .reg .pred %r313; .reg .pred %r315; .reg .pred %r317; .reg .pred %r322; .reg .pred %r324; .reg .pred %r326; .reg .u64 %r334; .reg .u64 %r342; .reg .pred %r344; .reg .pred %r361; .reg .predpred %r366; .reg .u64 %r373; .reg .pred %r375; .reg .u32u64 %r391; .reg .predu64pred.reg .pred %r445; .reg .pred %r446; .reg .predu64 %r460; .reg .u64 %r461; .reg .u64 %r462; mov.u64 %r171,%ar0; mov.u64 %r172,%ar1; mov.u64 %r173,%ar2; mov.u64 %r174,%ar3; mov.u64 %r175,%ar4; mov.u64 %r176,%ar5; mov.u64 %r177,%ar6; .loc 1 74 10 ld.u64 %r22,[%r172+16]; .loc 1 76 6 setp.eq.u64 %r178,%r175,0; @ %r178 bra $L52; .loc 1 77 21 ld.u32 %r179,[%r175]; add.u32 %r105,%r179,-1; bra $L2; $L52: .loc 1 79 11 cvt.u32.u64 %r105,%r175; $L2: .loc 1 81 13 mov.u64 %r180,1; st.u64 [%frame],%r180; .loc 1 82 12 mov.u64 %r181,0; st.u64 [%frame+120],%r181172; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r183,[%value_in]; } .loc 1 85 6 ld.u64 %r184,[%r171]; setp.ne.u64 %r185,%r184,0; @ %r185 bra $L3; .loc 1 87 19 st.u64 [%r171+8],%r184; .loc 1 88 7 ld.u64 %r187,[%r172+16]; st.u64 [%r171+16],%r187; ld.u64 %r188,[%r172+24]; st.u64 [%r171+24],%r188; .loc 1 89 34 ld.s8 %r169,[%r172+28]; .loc 1 89 7 cvt.u16.u32 %r189,%r169; setp.gt.s16 %r190,%r189,0; @ %r190 bra $L4; $L7: .loc 1 105 2418r22; call (%value_in),_gfortrani_xmallocarray193,[%value_in]; } .loc 1 105 22 st.u64 [%r171],%r193; cvta.global.u64 %r444,_gfortrani_compile_options; bra $L5; $L6: .loc 1 
93 9 ld.u64 %r196,[%r45+8]; ld.u64 %r198,[%r45]; .loc 1 98 17 mad.lo.u64 %r108,%r43,%r108,%r108; add.u64 %r45,%r45,24; .loc 1 93 7 sub.u64 %r43,%r196,%r198; $L50: add.u64 %r125,%r171,%r35; .loc 1 101 4 st.u64 [%r125+48],%r200; st.u64 [%r125+56],%r43; st.u64 [%r125+40],%r108; .loc 1 89 7 add.u64 %r35,%r35,24; setp.ne.u64 %r201,%r35,%r425; @ %r201 bra $L6; bra $L7; $L3: .loc 1 108 12 cvta.global.u64 %r444,_gfortrani_compile_options; .loc 1 108 11 ld.u32 %r203,[%r444+36]; setp.ne.u32 %r204,%r203,0; @ ! %r204 bra $L9; .loc 1 110 7 cvta.const.u64 %r208,$LC0;r17220708; call _gfortrani_bounds_equal_extents$L5: .loc 1 114 6 ld.u32 %r210,[%r444+36]; setp.eq.u32 %r211,%r210,0; @ %r211 bra $L9; .loc 1 116 7 cvta.const.u64 %r216,$LC0; cvta.const.u64 %r21517r17215216; call _gfortrani_bounds_reduced_extents); } $L9: .loc 1 120 6 setp.eq.u64 %r217,%r183,0; @ %r217 bra $L1; .loc 1 124 23 ld.s8 %r165,[%r172+28]; cvt.u32.u32 %r218,%r165; cvt.s64.s8 %r166,%r218; .loc 1 124 3 setp.gt.s64 %r219,%r166,0; @ %r219 bra $L12; .loc 1 70 7 mov.u64 %r100,0; .loc 1 71 11 mov.u64 %r94,%r100; .loc 1 72 11 mov.u64 %r86,%r100; bra $L13; $L12: .loc 1 126 15 cvt.s64.s32 %r104,%r105; .loc 1 128 21 ld.u64 %r150,[%r171+16]; .loc 1 131 21 ld.u64 %r164,[%r172+16]; mov.u64 %r89,40; .loc 1 123 5 mov.u64 %r101,0; .loc 1 70 7 mov.u64 %r100,%r101; .loc 1 124 12 mov.u64 %r138,%r101; .loc 1 71 11 mov.u64 %r94,%r101; .loc 1 72 11 mov.u64 %r86,%r101; .loc 1 138 20 mov.u64 %r462,%r101; $L18: add.u64 %r65,%r172,%r89; .loc 1 134 17 ld.u64 %r221,[%r65+16]; add.u64 %r220,%r221,1; .loc 1 134 15 ld.u64 %r222,[%r65+8]; sub.u64 %r167,%r220,%r222; .loc 1 128 21 add.u64 %r223,%r171,%r89; ld.u64 %r224,[%r223]; mul.lo.u64 %r156,%r150,%r224; .loc 1 131 21 ld.u64 %r225,[%r65]; mul.lo.u64 %r143,%r164,%r225; .loc 1 126 10 setp.ne.u64 %r226,%r104,%r138; @ %r226 bra $L14; .loc 1 129 14 setp.ne.u64 %r227,%r156,0; .loc 1 74 8 selp.u64 %r86,%r156,%r22,%r227; .loc 1 132 14 setp.ne.u64 %r228,%r143,0; .loc 1 74 8 selp.u64 %r94,%r143,%r22,%r228; .loc 1 134 15 mov.u64 %r100,%r167; bra $L17; $L14: .loc 1 138 20 shl.b64 %r229,%r101,3; add.u64 %r230,%frame,%r229; add.u64 %r231,%r230,120; st.u64 [%r231],%r462; .loc 1 139 21 st.u64 [%r230],%r167; .loc 1 140 22 st.u64 [%r230+480],%r156; .loc 1 141 22 st.u64 [%r230+360],%r143; .loc 1 143 24 add.u64 %r245,%r101,%r101; add.u64 %r246,%r245,%r101; shl.b64 %r247,%r246,3; add.u64 %r248,%r173,%r247; .loc 1 143 22 ld.u64 %r250,[%r248+40]; st.u64 [%r230+240],%r250; .loc 1 144 12 add.u64 %r101,%r101,1; $L17: .loc 1 124 55 add.u64 %r138,%r138,1; .loc 1 124 3 add.u64 %r89,%r89,24; setp.ne.u64 %r251,%r138,%r166; @ %r251 bra $L18; $L13: .loc 1 147 14 ld.u64 %r46,[%frame+360]; .loc 1 147 6 setp.ne.u64 %r252,%r46,0; .loc 1 74 8 selp.u64 %r46,%r46,%r22,%r252; .loc 1 149 14 ld.u64 %r40,[%frame+480]; .loc 1 149 6 setp.ne.u64 %r253,%r40,0; .loc 1 74 8 selp.u64 %r40,%r40,%r22,%r253; .loc 1 151 14 ld.u64 %r42,[%frame+240]; .loc 1 151 6 setp.ne.u64 %r254,%r42,0; .loc 1 152 16 selp.u64 %r42,%r42,1,%r254; .loc 1 158 8 ld.u64 %r137,[%r171]; .loc 1 159 8 ld.u64 %r53,[%r172]; .loc 1 160 8 ld.u64 %r54,[%r173]; .loc 1 162 9 setp.eq.u64 %r255,%r137,0; @ %r255 bra $L1; .loc 1 228 12 shl.b64 %r62,%r42,4; .loc 1 166 34 shr.s64 %r429,%r100,63; .loc 1 186 19 set.u32.eq.u64 %r258,%r22,%r94; neg.s32 %r259,%r258; .loc 1 186 38 set.u32.eq.u64 %r261,%r22,%r86; neg.s32 %r262,%r261; .loc 1 186 27 cvt.u16.u32 %r264,%r259; cvt.u16.u32 %r265,%r262; and.b16 %r263,%r264,%r265; cvt.u32.u16 %r266,%r263; cvt.u32.u8 %r127,%r266; .loc 1 231 32 ld.u64 %r115,[%frame]; .loc 1 
238 30 mul.lo.u64 %r60,%r115,%r40; .loc 1 239 30 mul.lo.u64 %r267,%r115,%r46; .loc 1 239 16 neg.s64 %r145,%r267; .loc 1 240 23 mul.lo.u64 %r268,%r115,%r42; .loc 1 240 9 shl.b64 %r269,%r268,4; neg.s64 %r142,%r269; setp.ne.u32 %r450,%r127,0; setp.ne.u64 %r446,%r174,0; setp.le.s64 %r445,%r166,2; setp.eq.u64 %r454,%r177,1; .loc 1 219 8 setp.gt.s64 %r455,%r22,0; cvt.u32.u32 %r456,%r165; cvt.s64.s8 %r457,%r456; add.u64 %r458,%r457,-1; add.u64 %r459,%frame,480; add.u64 %r460,%frame,360; add.u64 %r461,%frame,240; $L51: .loc 1 165 10 ld.u64 %r430,[%r54]; ld.u64 %r431,[%r54+8]; .loc 1 166 26 shr.s64 %r271,%r431,63; xor.b64 %r426,%r271,%r430; xor.b64 %r427,%r271,%r431; sub.u64 %r282,%r426,%r271; set.u32.gt.u64 %r287,%r282,%r426; cvt.s64.s32 %r285,%r287; sub.u64 %r288,%r427,%r271; add.u64 %r291,%r288,%r285; mov.u64 %r426,%r282; .loc 1 166 10 setp.gt.s64 %r294,%r291,%r429; @ %r294 bra $L55; setp.ne.u64 %r297,%r291,%r429; @ %r297 bra $L56; setp.gt.u64 %r300,%r282,%r100; @ %r300 bra $L55; $L56: .loc 1 203 9 mov.u64 %r93,%r282; .loc 1 194 24 sub.u64 %r302,%r100,%r93; set.u32.gt.u64 %r307,%r302,%r100; cvt.s64.s32 %r305,%r307; sub.u64 %r308,%r429,%r291; mov.u64 %r432,%r302; add.u64 %r433,%r308,%r305; bra $L23; $L55: mov.u64 %r93,%r100; mov.u64 %r426,%r100; mov.u64 %r430,%r100; mov.u64 %r431,%r429; mov.u64 %r432,0; mov.u64 %r433,%r432; $L23: .loc 1 174 10 setp.gt.s64 %r313,%r431,0; @ %r313 bra $L57; setp.ne.u64 %r315,%r431,0; @ %r315 bra $L25; setp.ne.u64 %r317,%r430,0; @ ! %r317 bra $L25; $L57: .loc 1 176 15 mad.lo.u64 %r96,%r94,%r426,%r53; mov.u64 %r91,%r137; bra $L27; $L25: .loc 1 182 16 mad.lo.u64 %r91,%r86,%r426,%r137; mov.u64 %r96,%r53; $L27: .loc 1 186 10 @ %r450 bra $L28; .loc 1 194 4 setp.gt.s64 %r322,%r433,0; @ %r322 bra $L29; setp.ne.u64 %r324,%r433,0; @ %r324 bra $L31; setp.ne.u64 %r326,%r432,0; @ %r326 bra $L29; bra $L31; $L28: .loc 1 188 11 mul.lo.u64 %r131,%r22,%r432; .loc 1 189 4.loc 1 190 9 add.u64 %r91,%r91,%r131; bra $L31; $L29: .loc 1 182 18 mov.u64 %r129,%r91; .loc 1 194 11 mov.u64 %r130,0; $L32:342,[%value_in]; } .loc 1 197 13 add.u64 %r129,%r129,%r86; .loc 1 198 12 add.u64 %r96,%r96,%r94; .loc 1 194 34 add.u64 %r130,%r130,1; .loc 1 194 4 setp.ne.u64 %r344,%r432,%r130; @ %r344 bra $L32; .loc 1 197 13 mad.lo.u64 %r91,%r432,%r86,%r91; $L31: .loc 1 201 10 setp.ge.s64 %r361,%r431,0; selp.u64 %r91,%r91,%r137,%r361; .loc 1 212 10 add.u64 %r133,%r93,-1; .loc 1 205 10 @ %r446 bra $L35; .loc 1 212 8 setp.ne.u64 %r363,%r93,0; @ %r363 bra $L46; $L39: .loc 1 226 12 add.u64 %r137,%r137,%r40; .loc 1 227 12 add.u64 %r53,%r53,%r46; .loc 1 228 12 add.u64 %r54,%r54,%r62; .loc 1 229 15 ld.u64 %r364,[%frame+120]; add.u64 %r64,%r364,1; .loc 1 231 13 setp.eq.u64 %r365,%r64,%r115; @ %r365 bra $L37; .loc 1 229 15 st.u64 [%frame+120],%r64; bra $L51; $L35: .loc 1 206 8 setp.eq.u64 %r366,%r93,0; @ %r366 bra $L39; $L40: .loc 1 208 617373,[%value_in]; } .loc 1 209 11 add.u64 %r91,%r91,%r86; .loc 1 206 10 add.u64 %r133,%r133,-1; .loc 1 206 8 setp.ne.u64 %r375,%r133,-1; @ %r375 bra $L40; bra $L39; $L46: .loc 1 216 9 @ %r454 bra $L41; .loc 1 219 8 @ %r455 bra $L42; bra $L43; $L41: .loc 1 217 8 ld.s8 %r382,[%r176]9382383,[%value_in]; } bra $L43; $L42: .loc 1 219 15 mov.u64 %r132,0; $L44: .loc 1 220 3 add.u64 %r384,%r91,%r132 add.u64 %r132,%r132,%r177; .loc 1 219 8 setp.gt.s64 %r393,%r22,%r132; @ %r393 bra $L44; $L43: .loc 1 222 11 add.u64 %r91,%r91,%r86; .loc 1 212 10 add.u64 %r133,%r133,-1; .loc 1 212 8 setp.ne.u64 %r394,%r133,-1; @ %r394 bra $L46; bra $L39; $L37: .loc 1 235 20 mov.u64 %r395,0; st.u64 
[%frame+120],%r395; .loc 1 239 16 add.u64 %r134,%r53,%r145; .loc 1 240 9 add.u64 %r135,%r54,%r142; .loc 1 242 14 @ %r445 bra $L1; add.u64 %r159,%frame,128; .loc 1 238 30 mov.u64 %r67,%r60; .loc 1 242 14 mov.u64 %r155,8; .loc 1 241 12 mov.u64 %r136,1; bra $L48; $L49: .loc 1 235 20 st.u64 [%r159],%r395; .loc 1 238 30 mul.lo.u64 %r67,%r78,%r77; .loc 1 239 30 mul.lo.u64 %r401,%r80,%r77; .loc 1 239 16 sub.u64 %r134,%r53,%r401; .loc 1 240 23 mul.lo.u64 %r402,%r82,%r77; .loc 1 240 9 shl.b64 %r403,%r402,4; sub.u64 %r135,%r54,%r403; .loc 1 241 12 add.u64 %r136,%r136,1; .loc 1 242 14 add.u64 %r159,%r159,8; add.u64 %r155,%r155,8; setp.eq.u64 %r404,%r458,%r136; @ %r404 bra $L1; $L48: .loc 1 250 23 ld.u64 %r405,[%r159]; add.u64 %r77,%r405,1; st.u64 [%r159],%r77; .loc 1 251 30 add.u64 %r407,%r459,%r155; ld.u64 %r78,[%r407]; .loc 1 251 20 sub.u64 %r408,%r78,%r67; add.u64 %r137,%r137,%r408; .loc 1 252 30 add.u64 %r410,%r460,%r155; ld.u64 %r80,[%r410]; .loc 1 252 20 add.u64 %r53,%r134,%r80; .loc 1 253 23 add.u64 %r412,%r461,%r155; ld.u64 %r82,[%r412]; .loc 1 253 13 shl.b64 %r413,%r82,4; add.u64 %r54,%r135,%r413; .loc 1 231 32 add.u64 %r414,%frame,%r155; ld.u64 %r85,[%r414]; .loc 1 231 13 setp.eq.u64 %r415,%r77,%r85; @ %r415 bra $L49; bra $L51; $L4: .loc 1 93 9 ld.u64 %r417,[%r172+56]; ld.u64 %r419,[%r172+48]; .loc 1 93 7 sub.u64 %r43,%r417,%r419; add.u64 %r45,%r172,72; cvt.u32.u32 %r421,%r169; cvt.s64.s8 %r420,%r421; add.u64 %r423,%r420,%r420; add.u64 %r424,%r423,%r420; shl.b64 %r425,%r424,3; mov.u64 %r35,%r184; .loc 1 96 17 mov.u64 %r108,%r180; .loc 1 101 4 mov.u64 %r200,%r35; bra $L50; $L_gfortran_eoshift1_16 .visible .func _gfortran_eoshift1_16 {r22.loc 1 272 3 mov.u64 %r33,1; cvta.const.u64 %r32,$LC3eoshift127_gfortran_eoshift1_16_char .visible .func _gfortran_eoshift1_16_char.loc 1 295 3 mov.u64 %r36,1; cvta.const.u64 %r35,$LC4eoshift129_gfortran_eoshift1_16_char4 .visible .func _gfortran_eoshift1_16_char4.loc 1 319 3 mov.u64 %r36,4; cvta.const.u64 %r35,space$0eoshift121 1 ret; } eoshift3_4.o/BEGIN FUNCTION DECL: eoshift3 .func eoshift3); .file 1 "../../../libgfortran/generated/eoshift3_4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_eoshift3_4 .visible .func _gfortran_eoshift3_4_gfortran_eoshift3_4_char .visible .func _gfortran_eoshift3_4_charVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_bounds_reduced_extents .extern .func _gfortrani_bounds_reduced_extents)13] = {114,101,116,117,114,110,32,118,97,108,117,101] = {32,0 }8] = {69,79,83,72,73,70,8473,70,84,32,97,114,103,117,109,101,110,12] = {0,0 }; // BEGIN VAR DEF: space$0 .const .align 4 .u32 space$0[1] = {32 }; // BEGIN FUNCTION DEF: eoshift3 .func eoshift3) {local .align 16 .b8 %frame_ar[720predpredpred %r271; .reg .u32predpred %r289; .reg .u64 %r296; .reg .u64u64 %r311; .reg .pred %r312; .reg .predpred %r341; .reg .u6464predu64 %r384; .reg .u64 %r385; .reg .u32 %r390; .reg .pred %r391; .reg .pred %r392; .reg .u64 %r393; .reg .u64 %r394; .reg .u64 %r395; .reg .u64 %r396; mov.u64 %r171,%ar0; mov.u64 %r172,%ar1; mov.u64 %r173,%ar2; mov.u64 %r174,%ar3; mov.u64 %r175,%ar4; mov.u64 %r176,%ar5; mov.u64 %r177172; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r179,[%value_in]; } .loc 1 79 10 ld.u64 %r22,[%r172+16]; .loc 1 81 6 setp.eq.u64 %r180,%r175,0; @ %r180 bra $L54; .loc 1 82 11 ld.u32 %r181,[%r175]; 
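// NOTE (editorial): eoshift3_4.o starts above with _gfortran_eoshift3_4 and its _char / _char4
// wrappers (libgfortran/generated/eoshift3_4.c). Compared with the eoshift1 entry points, the
// eoshift3 family also reads a descriptor passed as an extra argument (the conditional stride
// product stored in the %frame slot at +240), which matches the case where BOUNDARY is
// array-valued rather than a single fill element; this reading is inferred from the listing,
// not taken from the libgfortran sources. A hypothetical row-wise C sketch of that
// generalisation:

  /* per-row end-off shift where both the shift amount and the fill value vary by row */
  static void eoshift_rows(int *dst, const int *src, long rows, long cols,
                           const long *shift, const int *boundary)
  {
      for (long r = 0; r < rows; r++)
          for (long c = 0; c < cols; c++) {
              long j = c + shift[r];
              dst[r * cols + c] = (j >= 0 && j < cols) ? src[r * cols + j] : boundary[r];
          }
  }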
add.u32 %r101,%r181,-1; bra $L2; $L54: .loc 1 84 11 cvt.u32.u64 %r101,%r175; $L2: .loc 1 86 6 ld.u64 %r182,[%r171]; setp.ne.u64 %r183,%r182,0; @ %r183 bra $L3; .loc 1 88 2422; call (%value_in),_gfortrani_xmallocarray186,[%value_in]; } .loc 1 88 22 st.u64 [%r171],%r186; .loc 1 89 19 st.u64 [%r171+8],%r182; .loc 1 90 7 ld.u64 %r189,[%r172+16]; st.u64 [%r171+16],%r189; ld.u64 %r190,[%r172+24]; st.u64 [%r171+24],%r190; .loc 1 91 34 ld.s8 %r137,[%r172+28]; .loc 1 91 7 cvt.u16.u32 %r191,%r137; setp.gt.s16 %r192,%r191,0; @ %r192 bra $L4; $L7: .loc 1 107 2422; call (%value_in),_gfortrani_xmallocarray195,[%value_in]; } .loc 1 107 22 st.u64 [%r171],%r195; cvta.global.u64 %r380,_gfortrani_compile_options; bra $L5; $L6: .loc 1 95 9 ld.u64 %r198,[%r121+8]; ld.u64 %r200,[%r121]; .loc 1 100 17 mad.lo.u64 %r104,%r111,%r104,%r104; add.u64 %r121,%r121,24; .loc 1 95 7 sub.u64 %r111,%r198,%r200; $L52: add.u64 %r107,%r171,%r112; .loc 1 103 4 st.u64 [%r107+48],%r202; st.u64 [%r107+56],%r111; st.u64 [%r107+40],%r104; .loc 1 91 7 add.u64 %r112,%r112,24; setp.ne.u64 %r203,%r376,%r112; @ %r203 bra $L6; bra $L7; $L3: .loc 1 110 12 cvta.global.u64 %r380,_gfortrani_compile_options; .loc 1 110 11 ld.u32 %r205,[%r380+36]; setp.ne.u32 %r206,%r205,0; @ ! %r206 bra $L9; .loc 1 112 7 cvta.const.u64 %r210,$LC0;r17220910; call _gfortrani_bounds_equal_extents$L5: .loc 1 116 6 ld.u32 %r212,[%r380+36]; setp.eq.u32 %r213,%r212,0; @ %r213 bra $L9; .loc 1 118 7 cvta.const.u64 %r218,$LC0; cvta.const.u64 %r21717r1727218; call _gfortrani_bounds_reduced_extents); } $L9: .loc 1 122 6 setp.ne.u64 %r219,%r179,0; @ ! %r219 bra $L1; .loc 1 125 13 mov.u64 %r220,1; st.u64 [%frame],%r220; .loc 1 126 12 mov.u64 %r221,0; st.u64 [%frame+120],%r221; .loc 1 128 23 ld.s8 %r144,[%r172+28]; .loc 1 128 3 setp.le.s64 %r222,%r144,0; @ %r222 bra $L55; .loc 1 130 15 cvt.s64.s32 %r87,%r101; .loc 1 132 21 ld.u64 %r163,[%r171+16]; .loc 1 135 21 ld.u64 %r168,[%r172+16]; mov.u64 %r136,40; .loc 1 127 5 mov.u64 %r97,%r221; .loc 1 74 7 mov.u64 %r95,%r97; .loc 1 128 12 mov.u64 %r132,%r97; .loc 1 75 11 mov.u64 %r88,%r97; .loc 1 76 11 mov.u64 %r83,%r97; setp.eq.u64 %r377,%r174,0; .loc 1 142 20 mov.u64 %r396,%r97; $L18: add.u64 %r134,%r172,%r136; .loc 1 138 17 ld.u64 %r224,[%r134+16]; add.u64 %r223,%r224,1; .loc 1 138 15 ld.u64 %r225,[%r134+8]; sub.u64 %r160,%r223,%r225; .loc 1 132 21 add.u64 %r226,%r171,%r136; ld.u64 %r227,[%r226]; mul.lo.u64 %r165,%r163,%r227; .loc 1 135 21 ld.u64 %r228,[%r134]; mul.lo.u64 %r170,%r168,%r228; .loc 1 130 10 setp.ne.u64 %r229,%r87,%r132; @ %r229 bra $L13; .loc 1 133 14 setp.ne.u64 %r230,%r165,0; .loc 1 79 8 selp.u64 %r83,%r165,%r22,%r230; .loc 1 136 14 setp.ne.u64 %r231,%r170,0; .loc 1 79 8 selp.u64 %r88,%r170,%r22,%r231; .loc 1 138 15 mov.u64 %r95,%r160; bra $L16; $L13: .loc 1 142 20 shl.b64 %r382,%r97,3; add.u64 %r233,%frame,%r382; add.u64 %r234,%r233,120; st.u64 [%r234],%r396; .loc 1 143 21 st.u64 [%r233],%r160; .loc 1 144 22 st.u64 [%r233+600],%r165; .loc 1 145 22 st.u64 [%r233+480],%r170; .loc 1 147 24 add.u64 %r248,%r97,%r97; add.u64 %r381,%r248,%r97; shl.b64 %r250,%r381,3; add.u64 %r251,%r173,%r250; .loc 1 147 22 ld.u64 %r253,[%r251+40]; st.u64 [%r233+360],%r253; .loc 1 148 14 @ %r377 bra $L58; .loc 1 149 26 add.u64 %r259,%r174,%r250; ld.u64 %r261,[%r259+40]; ld.u64 %r262,[%r174+16]; mul.lo.u64 %r44,%r261,%r262; bra $L17; $L58: .loc 1 151 24 mov.u64 %r44,%r396; $L17: add.u64 %r264,%frame,%r382; st.u64 [%r264+240],%r44; .loc 1 152 12 add.u64 %r97,%r97,1; $L16: .loc 1 128 55 add.u64 %r132,%r132,1; .loc 1 128 3 add.u64 
%r136,%r136,24; setp.ne.u64 %r266,%r132,%r144; @ %r266 bra $L18; bra $L12; $L55: .loc 1 74 7 mov.u64 %r95,%r221; .loc 1 75 11 mov.u64 %r88,%r95; .loc 1 76 11 mov.u64 %r83,%r95; setp.eq.u64 %r377,%r174,0; $L12: .loc 1 155 14 ld.u64 %r45,[%frame+480]; .loc 1 155 6 setp.ne.u64 %r267,%r45,0; .loc 1 79 8 selp.u64 %r45,%r45,%r22,%r267; .loc 1 157 14 ld.u64 %r46,[%frame+600]; .loc 1 157 6 setp.ne.u64 %r268,%r46,0; .loc 1 79 8 selp.u64 %r46,%r46,%r22,%r268; .loc 1 159 14 ld.u64 %r36,[%frame+360]; .loc 1 159 6 setp.ne.u64 %r269,%r36,0; .loc 1 160 16 selp.u64 %r36,%r36,1,%r269; .loc 1 161 23 ld.u64 %r38,[%frame+240]; .loc 1 169 8 ld.u64 %r129,[%r171]; .loc 1 170 8 ld.u64 %r130,[%r172]; .loc 1 171 8 ld.u64 %r131,[%r173]; .loc 1 161 6 @ %r377 bra $L22; .loc 1 161 13 setp.ne.u64 %r271,%r38,0; .loc 1 79 8 selp.u64 %r38,%r38,%r22,%r271; st.u64 [%frame+240],%r38; .loc 1 173 10 ld.u64 %r174,[%r174]; bra $L22; $L50: .loc 1 244 12 shl.b64 %r58,%r36,2; .loc 1 201 19 set.u32.eq.u64 %r273,%r22,%r88; neg.s32 %r274,%r273; .loc 1 201 38 set.u32.eq.u64 %r276,%r22,%r83; neg.s32 %r277,%r276; .loc 1 201 27 cvt.u16.u32 %r279,%r274; cvt.u16.u32 %r280,%r277; and.b16 %r278,%r279,%r280; cvt.u32.u16 %r281,%r278; cvt.u32.u8 %r100,%r281; .loc 1 248 32 ld.u64 %r92,[%frame]; setp.ne.u32 %r383,%r100,0; add.u64 %r385,%r144,-1; add.u64 %r384,%frame,128; add.u64 %r379,%frame,240; .loc 1 183 10 cvt.u32.u64 %r390,%r95; setp.eq.u64 %r391,%r177,1; .loc 1 235 8 setp.gt.s64 %r392,%r22,0; .loc 1 269 30 add.u64 %r393,%frame,600; .loc 1 270 30 add.u64 %r394,%frame,480; .loc 1 271 23 add.u64 %r395,%frame,360; $L53: .loc 1 180 10 ld.u32 %r102,[%r131]; .loc 1 181 26 setp.lt.s32 %r282,%r102,0; @ %r282 bra $L24; cvt.s64.s32 %r106,%r102; bra $L25; $L24: .loc 1 181 28 neg.s32 %r283,%r102; .loc 1 181 26 cvt.s64.s32 %r106,%r283; $L25: .loc 1 181 10 setp.le.s64 %r284,%r106,%r95; @ %r284 bra $L26; .loc 1 183 10 mov.u32 %r102,%r390; mov.u32 %r103,%r390; bra $L27; $L26: .loc 1 187 8 abs.s32 %r103,%r102; $L27: .loc 1 191 29 cvt.s64.s32 %r156,%r103; .loc 1 189 10 setp.le.s32 %r285,%r102,0; @ %r285 bra $L28; .loc 1 191 15 mad.lo.u64 %r91,%r88,%r156,%r130; mov.u64 %r85,%r129; bra $L29; $L28: .loc 1 197 16 mad.lo.u64 %r85,%r83,%r156,%r129; mov.u64 %r91,%r130; $L29: .loc 1 209 24 sub.u64 %r155,%r95,%r156; .loc 1 201 10 @ %r383 bra $L30; .loc 1 209 4 setp.gt.s64 %r289,%r155,0; @ %r289 bra $L31; bra $L32; $L30: .loc 1 203 24 mul.lo.u64 %r124,%r22,%r155; .loc 1 204 44296,[%value_in]; } .loc 1 205 9 add.u64 %r85,%r85,%r124; bra $L32; $L31: .loc 1 213 12 mov.u64 %r122,%r85; .loc 1 209 11 mov.u64 %r123,0; $L33: .loc 1 211 84,[%value_in]; } .loc 1 212 13 add.u64 %r122,%r122,%r83; .loc 1 213 12 add.u64 %r91,%r91,%r88; .loc 1 209 34 add.u64 %r123,%r123,1; .loc 1 209 4 setp.ne.u64 %r306,%r123,%r155; @ %r306 bra $L33; .loc 1 212 13 mad.lo.u64 %r85,%r123,%r83,%r85; $L32: .loc 1 217 10 setp.ge.s32 %r308,%r102,0; selp.u64 %r85,%r85,%r129,%r308; .loc 1 228 10 add.u64 %r126,%r156,-1; .loc 1 221 10 setp.ne.u64 %r309,%r174,0; @ %r309 bra $L35; .loc 1 228 8 setp.ne.u64 %r310,%r156,0; @ %r310 bra $L46; $L39: .loc 1 242 12 add.u64 %r129,%r129,%r46; .loc 1 243 12 add.u64 %r130,%r130,%r45; .loc 1 244 12 add.u64 %r131,%r131,%r58; .loc 1 245 12 add.u64 %r174,%r174,%r38; .loc 1 246 15 ld.u64 %r311,[%frame+120]; add.u64 %r60,%r311,1; st.u64 [%frame+120],%r60; .loc 1 248 13 setp.eq.u64 %r312,%r60,%r92; @ %r312 bra $L37; bra $L53; $L35: .loc 1 222 8 setp.eq.u64 %r313,%r156,0; @ %r313 bra $L39; $L40: .loc 1 224 617320,[%value_in]; } .loc 1 225 11 add.u64 %r85,%r85,%r83; .loc 1 222 10 
add.u64 %r126,%r126,-1; .loc 1 222 8 setp.ne.u64 %r322,%r126,-1; @ %r322 bra $L40; bra $L39; $L46: .loc 1 232 9 @ %r391 bra $L41; .loc 1 235 8 @ %r392 bra $L42; bra $L43; $L41: .loc 1 233 8 ld.s8 %r329,[%r176]85bra $L43; $L42: .loc 1 235 15 mov.u64 %r125,0; $L44: .loc 1 236 3 add.u64 %r331,%r85,%rr177338,[%value_in]; } .loc 1 235 32 add.u64 %r125,%r125,%r177; .loc 1 235 8 setp.gt.s64 %r340,%r22,%r125; @ %r340 bra $L44; $L43: .loc 1 238 11 add.u64 %r85,%r85,%r83; .loc 1 228 10 add.u64 %r126,%r126,-1; .loc 1 228 8 setp.ne.u64 %r341,%r126,-1; @ %r341 bra $L46; bra $L39; $L37: mov.u64 %r79,%r384; .loc 1 260 24 mov.u64 %r77,%r36; mov.u64 %r75,%r45; mov.u64 %r73,%r46; mov.u64 %r82,%r92; mov.u64 %r110,8; .loc 1 247 9 mov.u64 %r128,0; .loc 1 252 20 mov.u64 %r343,%r128; $L49: st.u64 [%r79+-8],%r343; .loc 1 255 30 mul.lo.u64 %r62,%r73,%r82; .loc 1 256 30 mul.lo.u64 %r64,%r75,%r82; .loc 1 257 23 mul.lo.u64 %r66,%r77,%r82; .loc 1 258 26 shl.b64 %r345,%r128,3; add.u64 %r346,%r379,%r345; .loc 1 258 30 ld.u64 %r347,[%r346]; mul.lo.u64 %r70,%r82,%r347; .loc 1 259 12 add.u64 %r128,%r128,1; .loc 1 260 14 setp.lt.s64 %r348,%r128,%r385; @ ! %r348 bra $L1; .loc 1 268 23 ld.u64 %r349,[%r79]; add.u64 %r72,%r349,1; st.u64 [%r79],%r72; .loc 1 269 30 add.u64 %r351,%r393,%r110; ld.u64 %r73,[%r351]; .loc 1 269 20 sub.u64 %r352,%r73,%r62; add.u64 %r129,%r129,%r352; .loc 1 270 30 add.u64 %r354,%r394,%r110; ld.u64 %r75,[%r354]; .loc 1 270 20 sub.u64 %r355,%r75,%r64; add.u64 %r130,%r130,%r355; .loc 1 271 23 add.u64 %r357,%r395,%r110; ld.u64 %r77,[%r357]; .loc 1 271 13 sub.u64 %r358,%r77,%r66; shl.b64 %r359,%r358,2; add.u64 %r131,%r131,%r359; .loc 1 272 30 add.u64 %r361,%r379,%r110; .loc 1 272 20 ld.u64 %r363,[%r361]; sub.u64 %r362,%r363,%r70; add.u64 %r174,%r174,%r362; .loc 1 248 32 add.u64 %r364,%frame,%r110; ld.u64 %r82,[%r364]; .loc 1 248 13 add.u64 %r79,%r79,8; setp.ne.u64 %r365,%r72,%r82; @ %r365 bra $L53; add.u64 %r110,%r110,8; bra $L49; $L22: .loc 1 177 9 setp.ne.u64 %r366,%r129,0; @ %r366 bra $L50; bra $L1; $L4: .loc 1 95 9 ld.u64 %r368,[%r172+56]; ld.u64 %r370,[%r172+48]; .loc 1 95 7 sub.u64 %r111,%r368,%r370; add.u64 %r121,%r172,72; cvt.u32.u32 %r372,%r137; cvt.s64.s8 %r371,%r372; add.u64 %r374,%r371,%r371; add.u64 %r375,%r374,%r371; shl.b64 %r376,%r375,3; mov.u64 %r112,%r182; .loc 1 98 17 mov.u64 %r104,1; .loc 1 103 4 mov.u64 %r202,%r112; bra $L52; $L1: .loc 1 27_gfortran_eoshift3_4 .visible .func _gfortran_eoshift3_4 {r22.loc 1 292 3 mov.u64 %r33,1; cvta.const.u64 %r32,$LC3eoshift329_gfortran_eoshift3_4_char .visible .func _gfortran_eoshift3_4_char.loc 1 315 3 mov.u64 %r36,1; cvta.const.u64 %r35,$LC4eoshift31_gfortran_eoshift3_4_char4 .visible .func _gfortran_eoshift3_4_char4.loc 1 339 3 mov.u64 %r36,4; cvta.const.u64 %r35,space$0eoshift3BEGIN FUNCTION DECL: eoshift3 .func eoshift3); .file 1 "../../../libgfortran/generated/eoshift3_8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_eoshift3_8 .visible .func _gfortran_eoshift3_8_gfortran_eoshift3_8_char .visible .func _gfortran_eoshift3_8_charVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_bounds_reduced_extents .extern .func _gfortrani_bounds_reduced_extents)13] = {114,101,116,117,114,110,32,118,97,108,117,101] = {32,0 }8] = {69,79,83,72,73,70,8473,70,84,32,97,114,103,117,109,101,110,12] = {0,0 }; // BEGIN VAR DEF: space$0 .const 
// [PTX instruction bodies in this span are extraction-garbled and are not reconstructible; only the structural declarations below are recoverable.]

// BEGIN VAR DEF: space$0
.const .align 4 .u32 space$0[1] = {32 };
// BEGIN FUNCTION DEF: eoshift3   (eoshift3_8.o; calls _gfortran_size0, _gfortrani_xmallocarray, _gfortrani_bounds_equal_extents, _gfortrani_bounds_reduced_extents; reads _gfortrani_compile_options)
// BEGIN GLOBAL FUNCTION DEF: _gfortran_eoshift3_8 .visible .func
// BEGIN GLOBAL FUNCTION DEF: _gfortran_eoshift3_8_char .visible .func
// BEGIN GLOBAL FUNCTION DEF: _gfortran_eoshift3_8_char4 .visible .func

eoshift3_16.o/
.file 1 "../../../libgfortran/generated/eoshift3_16.c"
// BEGIN FUNCTION DECL: eoshift3 .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_eoshift3_16 .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_eoshift3_16_char .visible .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_eoshift3_16_char4 .visible .func
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64
// BEGIN GLOBAL FUNCTION DECL: _gfortran_size0, _gfortrani_xmallocarray, _gfortrani_bounds_equal_extents, _gfortrani_bounds_reduced_extents (.extern .func)
// string constants ($LC…): "return value", "EOSHIFT", "…IFT argument"
// BEGIN VAR DEF: space$0 .const .align 4 .u32 space$0[1] = {32 };
// BEGIN FUNCTION DEF: eoshift3   (body garbled in extraction)
// BEGIN GLOBAL FUNCTION DEF: _gfortran_eoshift3_16, _gfortran_eoshift3_16_char, _gfortran_eoshift3_16_char4 (.visible .func)

cshift1_4.o/
.file 1 "../../../libgfortran/generated/cshift1_4.c"
// BEGIN FUNCTION DECL: cshift1 .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_cshift1_4, _gfortran_cshift1_4_char, _gfortran_cshift1_4_char4 (.visible .func)
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error, _gfortran_size0, _gfortrani_xmallocarray, _gfortrani_bounds_equal_extents, _gfortrani_bounds_reduced_extents (.extern .func)
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_cshift1_4_i1, _i2, _i4, _i8, _r4, _r8, _c4, _c8 (.extern .func)
// string constants ($LC…): "return value", "…ument 'DIM' is out of range in call to 'CSHIFT'", "CSHIFT…argument"
// BEGIN FUNCTION DEF: cshift1   (body garbled in extraction)
// BEGIN GLOBAL FUNCTION DEF: _gfortran_cshift1_4, _gfortran_cshift1_4_char, _gfortran_cshift1_4_char4 (.visible .func)

cshift1_8.o/
.file 1 "../../../libgfortran/generated/cshift1_8.c"
// BEGIN FUNCTION DECL: cshift1 .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_cshift1_8, _gfortran_cshift1_8_char, _gfortran_cshift1_8_char4 (.visible .func)
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error, _gfortran_size0, _gfortrani_xmallocarray, _gfortrani_bounds_equal_extents, _gfortrani_bounds_reduced_extents (.extern .func)
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_cshift1_8_i1, _i2, _i4, _i8, _r4, _r8, _c4, _c8 (.extern .func)
// string constants ($LC…): as for cshift1_4.o
// BEGIN FUNCTION DEF: cshift1   (body garbled in extraction)
// BEGIN GLOBAL FUNCTION DEF: _gfortran_cshift1_8, _gfortran_cshift1_8_char, _gfortran_cshift1_8_char4 (.visible .func)

cshift1_16.o/
.file 1 "../../../libgfortran/generated/cshift1_16.c"
// BEGIN FUNCTION DECL: cshift1 .func
// BEGIN GLOBAL FUNCTION DECL: _gfortran_cshift1_16, _gfortran_cshift1_16_char, _gfortran_cshift1_16_char4 (.visible .func)
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error, _gfortran_size0, _gfortrani_xmallocarray, _gfortrani_bounds_equal_extents, _gfortrani_bounds_reduced_extents (.extern .func)
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_cshift1_16_i1, _i2, _i4, _i8, _r4, _r8, _c4, _c8 (.extern .func)
// BEGIN GLOBAL FUNCTION DECL: __modti3 (.extern .func)
// string constants ($LC…): as for cshift1_4.o
// BEGIN FUNCTION DEF: cshift1   (body garbled in extraction)
// BEGIN GLOBAL FUNCTION DEF: _gfortran_cshift1_16, _gfortran_cshift1_16_char, _gfortran_cshift1_16_char4 (.visible .func)

reshape_i4.o/
.file 1 "../../../libgfortran/generated/reshape_i4.c"
// BEGIN GLOBAL FUNCTION DECL: _gfortran_reshape_4 .visible .func
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64
// BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_reshape_packed (.extern .func)
// string constants ($LC…): "Incorrect size in SOURCE argument to RESHAPE intrinsic: is %ld, should be %ld",
//   "rank of return array incorrect in RESHAPE intri…", "…alue %ld in ORDER argument to RESHAPE intrinsi…",
//   "…ncorrect extent in return value of RESHAPE intrinsic in dimension %ld: is %ld, should be %l…",
//   "…out of range in ORDER argument to RESHAPE i…", "shape and target do not confor…"
// BEGIN GLOBAL FUNCTION DEF: _gfortran_reshape_4 .visible .func   (body garbled in extraction; continues beyond this span)
%r404,%r403,3; add.u64 %r405,%r257,%r404; ld.u64 %r75,[%r405+40]; .loc 1 221 18 st.u64 [%r46],%r75; .loc 1 222 20 ld.u64 %r414,[%r405+56]; add.u64 %r413,%r414,1; ld.u64 %r421,[%r405+48]; sub.u64 %r79,%r413,%r421; .loc 1 223 10 setp.lt.s64 %r422,%r79,0; @ %r422 bra $L25; .loc 1 222 18 st.u64 [%r91],%r79; bra $L26; $L25: .loc 1 224 20 st.u64 [%r91],%r400; mov.u64 %r79,%r400; $L26: .loc 1 226 35 shl.b64 %r424,%r156,3; add.u64 %r425,%frame,%r424; .loc 1 226 10 ld.u64 %r426,[%r425]; setp.eq.u64 %r427,%r426,%r79; @ %r427 bra $L27; .loc 1 227 9 cvta.const.u64 %r4284_gfortran_runtime_error27: .loc 1 229 10 setp.ne.u64 %r430,%r75,%r136; @ %r430 bra $L58; .loc 1 230 15 mul.lo.u64 %r136,%r136,%r79; bra $L28; $L58: .loc 1 232 15 mov.u64 %r136,0; $L28: .loc 1 233 10 setp.le.s64 %r431,%r79,0; @ %r431 bra $L1; .loc 1 212 37 add.u64 %r185,%r185,1; .loc 1 212 3 add.u64 %r91,%r91,8; add.u64 %r83,%r83,8; add.u64 %r46,%r46,8; setp.ne.u64 %r432,%r161,%r185; @ %r432 bra $L29; .loc 1 237 8 ld.s8 %r194,[%r258+28]; .loc 1 247 17 shl.b64 %r435,%r194,3; add.u64 %r589,%frame,720; mov.u32 %r440435442,[%value_in]; } add.u64 %r197,%r258,40; add.u64 %r588,%frame,480; mov.u64 %r195,%r588; add.u64 %r585,%frame,600; mov.u64 %r169,%r585; add.u64 %r445,%r194,%r194; add.u64 %r446,%r445,%r194; shl.b64 %r447,%r446,3; add.u64 %r134,%r447,%r197; .loc 1 243 9 mov.u64 %r143,1; .loc 1 253 15 mov.u64 %r602,0; .loc 1 252 11 cvt.u32.u64 %r603,%r143; $L33: .loc 1 248 20 ld.u64 %r86,[%r197]; .loc 1 248 18 st.u64 [%r195],%r86; .loc 1 249 20 ld.u64 %r450,[%r197+16]; add.u64 %r449,%r450,1; ld.u64 %r451,[%r197+8]; sub.u64 %r90,%r449,%r451; .loc 1 250 10 setp.le.s64 %r452,%r90,0; @ %r452 bra $L30; .loc 1 249 18 st.u64 [%r169],%r90; bra $L31; $L30: .loc 1 253 15 st.u64 [%r169],%r602; .loc 1 252 11 mov.u32 %r151,%r603; $L31: .loc 1 256 10 setp.ne.u64 %r454,%r86,%r143; @ %r454 bra $L59; .loc 1 257 15 ld.u64 %r455,[%r169]; mul.lo.u64 %r143,%r143,%r455; bra $L32; $L59: .loc 1 259 15 mov.u64 %r143,0; $L32: .loc 1 245 3 add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; add.u64 %r169,%r169,8; setp.ne.u64 %r456,%r134,%r197; @ %r456 bra $L33; .loc 1 267 34 ld.u64 %r82,[%r257]; .loc 1 267 68 ld.u64 %r183,[%r258]; .loc 1 262 13 set.u32.ne.u64 %r458,%r136,0; neg.s32 %r459,%r458; .loc 1 262 32 set.u32.ne.u64 %r461,%r146,0; neg.s32 %r462,%r461; cvt.u16.u32 %r464,%r459; cvt.u16.u32 %r465,%r462; and.b16 %r463,%r464,%r465; .loc 1 262 27 set.u32.ne.u64 %r467,%r143,0; neg.s32 %r468,%r467; .loc 1 262 32 cvt.u16.u32 %r472,%r468; and.b16 %r470,%r463,%r472; cvt.u32.u16 %r473,%r470; cvt.u16.u8 %r474,%r473; setp.eq.u16 %r475,%r474,0; @ %r475 bra $L34; .loc 1 264 13 shl.b64 %r96,%r136,2; .loc 1 265 13 shl.b64 %r98,%r143,2; .loc 1 266 13 shl.b64 %r100,%r146,2; .loc 1 267 7 {1831100; call _gfortrani_reshape_packed269 7 bra $L1; $L34: .loc 1 273 12 ld.u64 %r181,[%frame+840]; .loc 1 274 12 ld.u64 %r140,[%frame+480]; .loc 1 276 6 and.b32 %r482,%r151,%r149; setp.eq.u32 %r483,%r482,0; @ %r483 bra $L3636: .loc 1 279 6 setp.eq.u32 %r484,%r151,0; @ %r484 bra $L37; .loc 1 285 7 setp.le.s64 %r485,%r145,0; @ %r485 bra $L60; .loc 1 290 13 ld.u64 %r140,[%frame+120]; .loc 1 287 16 shl.b64 %r207,%r145,3; add.u64 %r487,%frame,36288 17 add.u64 %r497,%frame,2mov.u64 %r236,%r140; mov.u64 %r200,0; .loc 1 289 26 add.u64 %r601,%frame,120; $L38: .loc 1 289 17 add.u64 %r507,%r588,%r200; st.u64 [%r507],%r236; .loc 1 285 7 add.u64 %r200,%r200,8; setp.eq.u64 %r508,%r200,%r207; @ %r508 bra $L61; .loc 1 289 26 add.u64 %r510,%r601,%r200; ld.u64 %r236,[%r510]; bra $L38; $L50: .loc 1 297 15 ld.u32 
%r106,[%r183]; .loc 1 297 13 st.u32 [%r82],%r106; .loc 1 299 12 add.u64 %r82,%r82,%r108; .loc 1 300 11 add.u64 %r183,%r183,%r586; .loc 1 301 16 add.u64 %r157,%r157,1; .loc 1 302 16 add.u64 %r242,%r242,1; st.u64 [%frame+720],%r242; .loc 1 306 13 setp.ne.u64 %r512,%r157,%r222; @ %r512 bra $L39; .loc 1 315 14 @ %r600 bra $L40; add.u64 %r173,%frame,1088; add.u64 %r104,%frame,848; add.u64 %r170,%frame,968; .loc 1 313 30 mov.u64 %r115,%r138; .loc 1 310 21 mov.u64 %r592,0; bra $L41; $L42: st.u64 [%r173],%r592; .loc 1 313 30 mul.lo.u64 %r115,%r118,%r117; .loc 1 315 14 add.u64 %r173,%r173,8; add.u64 %r104,%r104,8; add.u64 %r170,%r170,8; setp.eq.u64 %r518,%r173,%r201; @ %r518 bra $L40; $L41: .loc 1 323 24 ld.u64 %r519,[%r173]; add.u64 %r117,%r519,1; st.u64 [%r173],%r117; .loc 1 324 30 ld.u64 %r118,[%r104]; .loc 1 324 20 sub.u64 %r520,%r118,%r115; shl.b64 %r521,%r520,2; add.u64 %r82,%r82,%r521; .loc 1 306 34 ld.u64 %r120,[%r170]; .loc 1 306 13 setp.eq.u64 %r522,%r117,%r120; @ %r522 bra $L42; .loc 1 329 13 setp.ne.u64 %r523,%r242,%r587; @ %r523 bra $L62; .loc 1 310 21 mov.u64 %r157,0; bra $L51; $L64: mov.u64 %r157,0; .loc 1 329 13 mov.u64 %r82,%r157; $L51: .loc 1 333 21 mov.u64 %r524,0; st.u64 [%frame+720],%r524; .loc 1 336 29 ld.u64 %r525,[%frame+480]; mul.lo.u64 %r124,%r587,%r525; .loc 1 338 14 setp.ne.u64 %r526,%r194,1; @ %r526 bra $L44; $L47: .loc 1 340 19 set.u32.ne.u64 %r528,%r168,0; neg.s32 %r529,%r528; .loc 1 340 24 cvt.u16.u32 %r534,%r529; and.b16 %r533,%r534,%r595; .loc 1 340 18 cvt.u32.u16 %r536,%r533; cvt.u16.u8 %r537,%r536; setp.ne.u16 %r538,%r537,0; @ %r538 bra $L45; .loc 1 354 19 mov.u64 %r183,%r147; bra $L46; $L49: .loc 1 333 21 st.u64 [%r217],%r524; .loc 1 336 29 mul.lo.u64 %r124,%r132,%r131; .loc 1 337 12 add.u64 %r182,%r182,1; .loc 1 338 14 add.u64 %r217,%r217,8; setp.eq.u64 %r540,%r194,%r182; @ ! 
%r540 bra $L48; bra $L47; $L45: .loc 1 345 19 @ %r596 bra $L63; .loc 1 347 3520550,[%value_in]; } .loc 1 348 3620560,[%value_in]; } .loc 1 349 3658r59920570,[%value_in]; } .loc 1 350 32 ld.u64 %r140,[%frame+480]; .loc 1 354 19 mov.u64 %r183,%r147; .loc 1 350 32 mov.u64 %r194,%r145; .loc 1 343 24 mov.u64 %r168,0; bra $L46; $L44: add.u64 %r217,%frame,728; .loc 1 337 12 mov.u64 %r182,1; $L48: .loc 1 359 24 ld.u64 %r573,[%r217]; add.u64 %r131,%r573,1; st.u64 [%r217],%r131; shl.b64 %r184,%r182,3; .loc 1 360 29 add.u64 %r575,%r588,%r184; ld.u64 %r132,[%r575]; .loc 1 360 19 sub.u64 %r576,%r132,%r124; shl.b64 %r577,%r576,2; add.u64 %r183,%r183,%r577; .loc 1 329 34 add.u64 %r579,%r585,%r184; ld.u64 %r135,[%r579]; .loc 1 329 13 setp.eq.u64 %r580,%r131,%r135; @ %r580 bra $L49; bra $L46; $L60: .loc 1 283 12 mov.u64 %r183,%r147; .loc 1 285 7 mov.u64 %r194,%r145; bra $L37; $L61: .loc 1 283 12 mov.u64 %r183,%r147; mov.u64 %r194,%r145; $L37: .loc 1 294 9 setp.eq.u64 %r581,%r82,0; @ %r581 bra $L1; .loc 1 299 12 shl.b64 %r108,%r181,2; ld.u64 %r157,[%frame+1080]; .loc 1 306 34 ld.u64 %r222,[%frame+960]; .loc 1 313 30 mul.lo.u64 %r138,%r222,%r181; .loc 1 329 34 ld.u64 %r587,[%frame+600]; .loc 1 302 13 ld.u64 %r242,[%frame+720]; add.u64 %r201,%r51,%r253; .loc 1 347 35 shl.b64 %r205,%r145,3; mov.u64 %r168,%r183; shl.b64 %r586,%r140,2; .loc 1 340 24 set.u32.ne.u64 %r593,%r260,0; neg.s32 %r594,%r593; cvt.u16.u32 %r595,%r594; .loc 1 345 19 setp.le.s64 %r596,%r145,0; .loc 1 347 35 add.u64 %r597,%frame,360; .loc 1 348 36 add.u64 %r598,%frame,240; .loc 1 349 36 add.u64 %r599,%frame,120; .loc 1 315 14 setp.eq.u64 %r600,%r161,1; bra $L50; $L63: .loc 1 354 19 mov.u64 %r183,%r147; .loc 1 345 19 mov.u64 %r194,%r145; .loc 1 343 24 mov.u64 %r168,0; $L46: .loc 1 294 9 setp.eq.u64 %r582,%r82,0; @ %r582 bra $L1; shl.b64 %r586,%r140,2; ld.u64 %r587,[%frame+600]; bra $L43; $L62: .loc 1 310 21 mov.u64 %r157,0; $L43: .loc 1 302 13 ld.u64 %r242,[%frame+720]; bra $L50; $L39: .loc 1 329 13 setp.eq.u64 %r583,%r242,%r587; @ %r583 bra $L51; bra $L50; $L40: setp.eq.u64 %r584,%r242,%r587; @ %r584 bra $L64; $L1: .loc 1 364 1 ret; } reshape_i8.o/_gfortran_reshape_8 .visible .func _gfortran_reshape_8.file 1 "../../../libgfortran/generated/reshape_i8.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray_gfortrani_reshape_packed .extern .func _gfortrani_reshape_packed73,110,99,111,114,114,101,99,116,32,115,105,122,101,32,105,110,32,83,79,85,82,67,69,32,97,114,103,117,109,101,110,116,32,116,111,32,82,69,83,72,65,80,69,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 
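For orientation, the first decoded message corresponds to the usual RESHAPE consistency rule: without PAD, SOURCE must supply at least as many elements as PRODUCT(SHAPE). The C sketch below illustrates only that check; it is not the libgfortran implementation (whose exact condition and descriptor layout differ), and the array_desc type and report_error helper are invented for the example.

#include <stdio.h>
#include <stdlib.h>

/* Hypothetical, simplified array descriptor: just a rank and per-dimension extents. */
typedef struct {
    int rank;
    long extent[7];
} array_desc;

/* Stand-in for a runtime error routine; the decoded messages show that the
   real _gfortran_runtime_error takes a printf-style format string. */
static void report_error(const char *fmt, long is, long should_be)
{
    fprintf(stderr, fmt, is, should_be);
    fputc('\n', stderr);
    exit(2);
}

/* Check that SOURCE provides enough elements for the requested SHAPE. */
static void check_reshape_source_size(const array_desc *source,
                                      const long *shape, int shape_len)
{
    long source_size = 1, result_size = 1;
    for (int d = 0; d < source->rank; d++)
        source_size *= source->extent[d];
    for (int d = 0; d < shape_len; d++)
        result_size *= shape[d];
    if (source_size < result_size)
        report_error("Incorrect size in SOURCE argument to RESHAPE intrinsic:"
                     " is %ld, should be %ld", source_size, result_size);
}

int main(void)
{
    array_desc src = { 2, { 3, 4 } };   /* 3 x 4 = 12 elements */
    long shape[] = { 2, 6 };            /* 2 x 6 = 12 elements: conforms */
    check_reshape_source_size(&src, shape, 2);
    puts("shapes conform");
    return 0;
}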
reshape_i8.o/_gfortran_reshape_8: .visible .func compiled from ../../../libgfortran/generated/reshape_i8.c, with the same external declarations and error-message table as reshape_i4.o. The surviving body is structurally identical to _gfortran_reshape_4 and differs only in the 8-byte element size: the copy loop uses ld.u64/st.u64 and the address strides are scaled by shl.b64 ...,3. The PTX is garbled in the same places and is not reproduced here.
reshape_i16.o/_gfortran_reshape_16: .visible .func compiled from ../../../libgfortran/generated/reshape_i16.c, with the same external declarations and error-message table as reshape_i4.o. The body again matches _gfortran_reshape_4 apart from the 16-byte element size: each element is moved as a pair of ld.u64/st.u64 accesses and the strides are scaled by shl.b64 ...,4. The garbled PTX is not reproduced here.
reshape_r4.o/_gfortran_reshape_r4: .visible .func compiled from ../../../libgfortran/generated/reshape_r4.c, with the same external declarations and error-message table as reshape_i4.o. The body matches _gfortran_reshape_4 except that the copy loop moves the 4-byte REAL elements with ld.f32/st.f32; the garbled PTX is likewise not reproduced here.
%r540 bra $L48; bra $L47;
[remaining blocks of _gfortran_reshape_r4: labels $L45, $L44, $L48, $L60, $L61, $L37, $L63, $L46, $L62, $L43, $L39 and $L40, covering the switch to the PAD source, the stride setup for the copy loop, and the loop-exit bookkeeping, ending with:]
$L1: .loc 1 364 1 ret; }
reshape_r8.o/_gfortran_reshape_r8
.visible .func _gfortran_reshape_r8;
.file 1 "../../../libgfortran/generated/reshape_r8.c"
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options
.extern .global .align 8 .u64 _gfortrani_compile_options;
.extern .func _gfortran_runtime_error;
.extern .func _gfortrani_xmallocarray;
.extern .func _gfortrani_reshape_packed;
[constant pool, stored in the dump as comma-separated byte values; decoded, with truncations kept: "Incorrect size in SOURCE argument to RESHAPE intrinsic: is %ld, should be %ld", "rank of return array incorrect in RESHAPE intri...", "...alue %ld in ORDER argument to RESHAPE intrinsi...", "...correct extent in return value of RESHAPE intrinsic in dimension %ld: is %ld, should be %l...", "...out of range in ORDER argument to RESHAPE i...", "shape and target do not confor..."]
.visible .func _gfortran_reshape_r8 {
.local .align 16 .b8 %frame_ar[1200];
[.reg declarations, truncated and run together in the dump]
mov.u64 %r257,%ar0; mov.u64 %r258,%ar1; mov.u64 %r259,%ar2; mov.u64 %r260,%ar3; mov.u64 %r261,%ar4; .loc 1 78 10 ld.u64 %r263,[%r259+56]; add.u64 %r262,%r263,1; .loc 1 78 8 ld.u64 %r264,[%r259+48]; sub.u64 %r161,%r262,%r264; .loc 1 83 15 ld.s8 %r265,[%r257+28]; .loc 1 83 6 setp.eq.u64 %r266,%r265,%r161; @ %r266 bra $L2; .loc 1 84 5 [cvta.const.u64 and the call to _gfortran_runtime_error; operands truncated in the dump] ld.u64 %r269,[%r259+40]; shl.b64 %r249,%r269,3; ld.u64 %r247,[%r259]; mov.u64 %r230,%frame; shl.b64 %r253,%r161,3; add.u64 %r254,%frame,%r253; .loc 1 90 44 mov.u64 %r250,%frame; .loc 1 86 15 mov.u32 %r151,0; .loc 1 93 23 mov.u64 %r607,0; .loc 1 94 14 mov.u32 %r608,1; $L5: .loc 1 90 39 ld.u64 %r32,[%r247]; .loc 1 91 10 setp.le.s64 %r270,%r32,0; @ %r270 bra $L3;
.loc 1 90 21 st.u64 [%r250],%r32; bra $L4; $L3: .loc 1 93 23 st.u64 [%r250],%r607; .loc 1 94 14 mov.u32 %r151,%r608; $L4: .loc 1 88 3 add.u64 %r247,%r247,%r249; add.u64 %r250,%r250,8; setp.ne.u64 %r272,%r250,%r254; @ %r272 bra $L5; .loc 1 98 6 ld.u64 %r273,[%r257]; setp.ne.u64 %r274,%r273,0; @ %r274 bra $L6; add.u64 %r113,%r257,40; add.u64 %r277,%r161,%r161; add.u64 %r278,%r277,%r161; shl.b64 %r279,%r278,3; add.u64 %r246,%r113,%r279; mov.u64 %r159,%r230; .loc 1 102 10 mov.u64 %r167,1; $L7: .loc 1 105 8 ld.u64 %r166,[%r159]; .loc 1 107 4 st.u64 [%r113+8],%r273; add.u64 %r281,%r166,-1; st.u64 [%r113+16],%r281; st.u64 [%r113],%r167; .loc 1 109 7 mul.lo.u64 %r167,%r167,%r166; .loc 1 103 7 add.u64 %r159,%r159,8; add.u64 %r113,%r113,24; setp.ne.u64 %r282,%r113,%r246; @ %r282 bra $L7; .loc 1 111 19 st.u64 [%r257+8],%r273; .loc 1 118 24 max.s64 %r286,%r167,0; mov.u64 %r285285; call (%value_in),_gfortrani_xmallocarray287,[%value_in]; } .loc 1 118 22 st.u64 [%r257],%r287; .loc 1 119 23 st.u8 [%r257+28],%r161; $L6: .loc 1 122 6 setp.ne.u32 %r289,%r151,0; @ %r289 bra $L1; .loc 1 125 6 setp.eq.u64 %r290,%r260,0; @ %r290 bra $L53; .loc 1 127 12 ld.s8 %r145,[%r260+28]; .loc 1 130 7 setp.le.s64 %r292,%r145,0; @ %r292 bra $L54; .loc 1 132 21 shl.b64 %r294,%r145,3; add.u64 %r295,%frame,3629301,[%value_in]; } add.u64 %r139,%r260,40; add.u64 %r144,%frame,120; add.u64 %r52,%frame,240; add.u64 %r305,%r145,%r145; add.u64 %r306,%r305,%r145; shl.b64 %r307,%r306,3; add.u64 %r111,%r139,%r307; .loc 1 129 14 mov.u32 %r149,%r151; .loc 1 128 13 mov.u64 %r146,1; .loc 1 138 19 mov.u64 %r605,0; .loc 1 137 15 cvt.u32.u64 %r606,%r146; $L14: .loc 1 133 24 ld.u64 %r39,[%r139]; .loc 1 133 22 st.u64 [%r144],%r39; .loc 1 134 24 ld.u64 %r309,[%r139+16]; add.u64 %r308,%r309,1; ld.u64 %r310,[%r139+8]; sub.u64 %r43,%r308,%r310; .loc 1 135 14 setp.le.s64 %r311,%r43,0; @ %r311 bra $L11; .loc 1 134 22 st.u64 [%r52],%r43; bra $L12; $L11: .loc 1 138 19 st.u64 [%r52],%r605; .loc 1 137 15 mov.u32 %r149,%r606; $L12: .loc 1 141 14 setp.ne.u64 %r313,%r39,%r146; @ %r313 bra $L55; .loc 1 142 19 ld.u64 %r314,[%r52]; mul.lo.u64 %r146,%r146,%r314; bra $L13; $L55: .loc 1 144 19 mov.u64 %r146,0; $L13: .loc 1 130 7 add.u64 %r139,%r139,24; add.u64 %r144,%r144,8; add.u64 %r52,%r52,8; setp.ne.u64 %r315,%r111,%r139; @ %r315 bra $L14; bra $L10; $L54: .loc 1 129 14 mov.u32 %r149,%r151; .loc 1 128 13 mov.u64 %r146,1; $L10: .loc 1 146 12 ld.u64 %r147,[%r260]; bra $L9; $L53: .loc 1 153 12 mov.u64 %r147,%r260; .loc 1 152 14 mov.u32 %r149,1; .loc 1 151 13 mov.u64 %r146,1; .loc 1 150 12 mov.u64 %r145,%r260; $L9: .loc 1 156 7 cvta.global.u64 %r316,_gfortrani_compile_options; .loc 1 156 6 ld.u32 %r317,[%r316+36]; setp.eq.u32 %r318,%r317,0; @ ! 
%r318 bra $L92; setp.eq.u64 %r590,%r261,0; bra $L15; $L92: add.u64 %r229,%r257,48; .loc 1 161 23 mov.u64 %r224,0; .loc 1 160 10 mov.u64 %r175,1; $L17: .loc 1 163 20 ld.u64 %r47,[%r230]; .loc 1 163 7 mul.lo.u64 %r175,%r175,%r47; .loc 1 164 17 ld.u64 %r320,[%r229+8]; add.u64 %r319,%r320,1; .loc 1 164 15 ld.u64 %r321,[%r229]; sub.u64 %r176,%r319,%r321; .loc 1 161 41 add.u64 %r224,%r224,1; .loc 1 165 7 setp.eq.u64 %r322,%r47,%r176; @ %r322 bra $L16; .loc 1 166 6 st.u64 [%stack+16],%r47; st.u64 [%stack+8],%r176; st.u64 [%stack],%r224;_gfortran_runtime_error16: .loc 1 161 7 add.u64 %r230,%r230,8; add.u64 %r229,%r229,24; setp.ne.u64 %r325,%r161,%r224; @ %r325 bra $L17; .loc 1 173 14 ld.s8 %r53,[%r258+28]; .loc 1 174 7 cvt.u16.u32 %r326,%r53; setp.le.s16 %r327,%r326,0; @ %r327 bra $L56; add.u64 %r193,%r258,48; add.u64 %r241,%r258,56; cvt.u32.u32 %r330,%r53; cvt.s64.s8 %r329,%r330; add.u64 %r332,%r329,%r329; add.u64 %r333,%r332,%r329; shl.b64 %r334,%r333,3; add.u64 %r234,%r193,%r334; .loc 1 172 21 mov.u64 %r155,1; $L19: .loc 1 177 9 ld.u64 %r337,[%r241]; add.u64 %r336,%r337,1; .loc 1 177 7 ld.u64 %r338,[%r193]; sub.u64 %r335,%r336,%r338; .loc 1 178 33 max.s64 %r335,%r335,0; .loc 1 178 18 mul.lo.u64 %r155,%r155,%r335; .loc 1 174 7 add.u64 %r193,%r193,24; add.u64 %r241,%r241,24; setp.ne.u64 %r339,%r234,%r193; @ %r339 bra $L19; bra $L18; $L56: .loc 1 172 21 mov.u64 %r155,1; $L18: .loc 1 181 10 setp.ge.s64 %r340,%r155,%r175; @ %r340 bra $L20; .loc 1 181 34 set.u32.eq.u64 %r342,%r260,0; neg.s32 %r343,%r342; .loc 1 181 39 cvt.u16.u32 %r346,%r149; cvt.u16.u32 %r349,%r343; or.b16 %r348,%r349,%r346; .loc 1 181 30 cvt.u32.u16 %r351,%r348; cvt.u16.u8 %r352,%r351; setp.eq.u16 %r353,%r352,0; @ %r353 bra $L20; .loc 1 182 2 st.u64 [%stack+8],%r175; st.u64 [%stack],%r155; cvta.const.u64 %r3_gfortran_runtime_error20: .loc 1 186 10 setp.eq.u64 %r590,%r261,0; @ %r590 bra $L15; .loc 1 192 14 shl.b64 %r357,%r161,2; add.u64 %r358,%frame,480; mov.u32 %r3635362357364,[%value_in]; } ld.u64 %r366,[%r261+40]; shl.b64 %r191,%r366,3; ld.u64 %r189,[%r261]; .loc 1 194 20 mov.u64 %r172,0; .loc 1 206 16 mov.u32 %r604,1; $L23: .loc 1 196 28 ld.u64 %r63,[%r189]; .loc 1 196 10 add.u64 %r171,%r63,-1; .loc 1 198 14 shr.u64 %r368,%r171,63; .loc 1 198 23 set.u32.le.s64 %r370,%r161,%r171; neg.s32 %r371,%r370; .loc 1 198 18 cvt.u16.u64 %r374,%r368; cvt.u16.u32 %r375,%r371; or.b16 %r373,%r374,%r375; .loc 1 198 11 cvt.u32.u16 %r376,%r373; cvt.u16.u8 %r377,%r376; setp.eq.u16 %r378,%r377,0; @ %r378 bra $L21; .loc 1 199 3 st.u64 [%stack],%r63_gfortran_runtime_errorshl.b64 %r381,%r171,2; add.u64 %r382,%frame,%r381; add.u64 %r383,%r382,480; .loc 1 202 11 ld.u32 %r384,[%r383]; setp.eq.u32 %r385,%r384,0; @ %r385 bra $L22; .loc 1 203 3 st.u64 [%stack],%r63; cvta.const.u64 %r386386_gfortran_runtime_error22: .loc 1 206 16 st.u32 [%r383],%r604; .loc 1 194 38 add.u64 %r172,%r172,1; .loc 1 194 4 add.u64 %r189,%r189,%r191; setp.ne.u64 %r392,%r161,%r172; @ %r392 bra $L23; $L15: add.u64 %r91,%frame,960; add.u64 %r51,%frame,1080; add.u64 %r46,%frame,840; .loc 1 194 20 mov.u64 %r83,%r51; mov.u64 %r185,0; mov.u64 %r136,1; .loc 1 220 17 mov.u64 %r400,%r185; $L29: .loc 1 215 10 @ %r590 bra $L57; .loc 1 216 34 ld.u64 %r395,[%r261+40]; mul.lo.u64 %r394,%r185,%r395; .loc 1 216 31 ld.u64 %r396,[%r261]; shl.b64 %r397,%r394,3; add.u64 %r398,%r396,%r397; .loc 1 216 13 ld.u64 %r399,[%r398]; add.u64 %r156,%r399,-1; bra $L24; $L57: mov.u64 %r156,%r185; $L24: .loc 1 220 17 st.u64 [%r83],%r400; .loc 1 221 20 add.u64 %r402,%r156,%r156; add.u64 %r403,%r402,%r156; shl.b64 
%r404,%r403,3; add.u64 %r405,%r257,%r404; ld.u64 %r75,[%r405+40]; .loc 1 221 18 st.u64 [%r46],%r75; .loc 1 222 20 ld.u64 %r414,[%r405+56]; add.u64 %r413,%r414,1; ld.u64 %r421,[%r405+48]; sub.u64 %r79,%r413,%r421; .loc 1 223 10 setp.lt.s64 %r422,%r79,0; @ %r422 bra $L25; .loc 1 222 18 st.u64 [%r91],%r79; bra $L26; $L25: .loc 1 224 20 st.u64 [%r91],%r400; mov.u64 %r79,%r400; $L26: .loc 1 226 35 shl.b64 %r424,%r156,3; add.u64 %r425,%frame,%r424; .loc 1 226 10 ld.u64 %r426,[%r425]; setp.eq.u64 %r427,%r426,%r79; @ %r427 bra $L27; .loc 1 227 9 cvta.const.u64 %r4284_gfortran_runtime_error27: .loc 1 229 10 setp.ne.u64 %r430,%r75,%r136; @ %r430 bra $L58; .loc 1 230 15 mul.lo.u64 %r136,%r136,%r79; bra $L28; $L58: .loc 1 232 15 mov.u64 %r136,0; $L28: .loc 1 233 10 setp.le.s64 %r431,%r79,0; @ %r431 bra $L1; .loc 1 212 37 add.u64 %r185,%r185,1; .loc 1 212 3 add.u64 %r91,%r91,8; add.u64 %r83,%r83,8; add.u64 %r46,%r46,8; setp.ne.u64 %r432,%r161,%r185; @ %r432 bra $L29; .loc 1 237 8 ld.s8 %r194,[%r258+28]; .loc 1 247 17 shl.b64 %r435,%r194,3; add.u64 %r589,%frame,720; mov.u32 %r440435442,[%value_in]; } add.u64 %r197,%r258,40; add.u64 %r588,%frame,480; mov.u64 %r195,%r588; add.u64 %r585,%frame,600; mov.u64 %r169,%r585; add.u64 %r445,%r194,%r194; add.u64 %r446,%r445,%r194; shl.b64 %r447,%r446,3; add.u64 %r134,%r447,%r197; .loc 1 243 9 mov.u64 %r143,1; .loc 1 253 15 mov.u64 %r602,0; .loc 1 252 11 cvt.u32.u64 %r603,%r143; $L33: .loc 1 248 20 ld.u64 %r86,[%r197]; .loc 1 248 18 st.u64 [%r195],%r86; .loc 1 249 20 ld.u64 %r450,[%r197+16]; add.u64 %r449,%r450,1; ld.u64 %r451,[%r197+8]; sub.u64 %r90,%r449,%r451; .loc 1 250 10 setp.le.s64 %r452,%r90,0; @ %r452 bra $L30; .loc 1 249 18 st.u64 [%r169],%r90; bra $L31; $L30: .loc 1 253 15 st.u64 [%r169],%r602; .loc 1 252 11 mov.u32 %r151,%r603; $L31: .loc 1 256 10 setp.ne.u64 %r454,%r86,%r143; @ %r454 bra $L59; .loc 1 257 15 ld.u64 %r455,[%r169]; mul.lo.u64 %r143,%r143,%r455; bra $L32; $L59: .loc 1 259 15 mov.u64 %r143,0; $L32: .loc 1 245 3 add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; add.u64 %r169,%r169,8; setp.ne.u64 %r456,%r134,%r197; @ %r456 bra $L33; .loc 1 267 34 ld.u64 %r82,[%r257]; .loc 1 267 68 ld.u64 %r183,[%r258]; .loc 1 262 13 set.u32.ne.u64 %r458,%r136,0; neg.s32 %r459,%r458; .loc 1 262 32 set.u32.ne.u64 %r461,%r146,0; neg.s32 %r462,%r461; cvt.u16.u32 %r464,%r459; cvt.u16.u32 %r465,%r462; and.b16 %r463,%r464,%r465; .loc 1 262 27 set.u32.ne.u64 %r467,%r143,0; neg.s32 %r468,%r467; .loc 1 262 32 cvt.u16.u32 %r472,%r468; and.b16 %r470,%r463,%r472; cvt.u32.u16 %r473,%r470; cvt.u16.u8 %r474,%r473; setp.eq.u16 %r475,%r474,0; @ %r475 bra $L34; .loc 1 264 13 shl.b64 %r96,%r136,3; .loc 1 265 13 shl.b64 %r98,%r143,3; .loc 1 266 13 shl.b64 %r100,%r146,3; .loc 1 267 7 {1831100; call _gfortrani_reshape_packed269 7 bra $L1; $L34: .loc 1 273 12 ld.u64 %r181,[%frame+840]; .loc 1 274 12 ld.u64 %r140,[%frame+480]; .loc 1 276 6 and.b32 %r482,%r151,%r149; setp.eq.u32 %r483,%r482,0; @ %r483 bra $L3636: .loc 1 279 6 setp.eq.u32 %r484,%r151,0; @ %r484 bra $L37; .loc 1 285 7 setp.le.s64 %r485,%r145,0; @ %r485 bra $L60; .loc 1 290 13 ld.u64 %r140,[%frame+120]; .loc 1 287 16 shl.b64 %r207,%r145,3; add.u64 %r487,%frame,36288 17 add.u64 %r497,%frame,2mov.u64 %r236,%r140; mov.u64 %r200,0; .loc 1 289 26 add.u64 %r601,%frame,120; $L38: .loc 1 289 17 add.u64 %r507,%r588,%r200; st.u64 [%r507],%r236; .loc 1 285 7 add.u64 %r200,%r200,8; setp.eq.u64 %r508,%r200,%r207; @ %r508 bra $L61; .loc 1 289 26 add.u64 %r510,%r601,%r200; ld.u64 %r236,[%r510]; bra $L38; $L50: .loc 1 297 15 ld.f64 
%r106,[%r183]; .loc 1 297 13 st.f64 [%r82],%r106; .loc 1 299 12 add.u64 %r82,%r82,%r108; .loc 1 300 11 add.u64 %r183,%r183,%r586; .loc 1 301 16 add.u64 %r157,%r157,1; .loc 1 302 16 add.u64 %r242,%r242,1; st.u64 [%frame+720],%r242; .loc 1 306 13 setp.ne.u64 %r512,%r157,%r222; @ %r512 bra $L39; .loc 1 315 14 @ %r600 bra $L40; add.u64 %r173,%frame,1088; add.u64 %r104,%frame,848; add.u64 %r170,%frame,968; .loc 1 313 30 mov.u64 %r115,%r138; .loc 1 310 21 mov.u64 %r592,0; bra $L41; $L42: st.u64 [%r173],%r592; .loc 1 313 30 mul.lo.u64 %r115,%r118,%r117; .loc 1 315 14 add.u64 %r173,%r173,8; add.u64 %r104,%r104,8; add.u64 %r170,%r170,8; setp.eq.u64 %r518,%r173,%r201; @ %r518 bra $L40; $L41: .loc 1 323 24 ld.u64 %r519,[%r173]; add.u64 %r117,%r519,1; st.u64 [%r173],%r117; .loc 1 324 30 ld.u64 %r118,[%r104]; .loc 1 324 20 sub.u64 %r520,%r118,%r115; shl.b64 %r521,%r520,3; add.u64 %r82,%r82,%r521; .loc 1 306 34 ld.u64 %r120,[%r170]; .loc 1 306 13 setp.eq.u64 %r522,%r117,%r120; @ %r522 bra $L42; .loc 1 329 13 setp.ne.u64 %r523,%r242,%r587; @ %r523 bra $L62; .loc 1 310 21 mov.u64 %r157,0; bra $L51; $L64: mov.u64 %r157,0; .loc 1 329 13 mov.u64 %r82,%r157; $L51: .loc 1 333 21 mov.u64 %r524,0; st.u64 [%frame+720],%r524; .loc 1 336 29 ld.u64 %r525,[%frame+480]; mul.lo.u64 %r124,%r587,%r525; .loc 1 338 14 setp.ne.u64 %r526,%r194,1; @ %r526 bra $L44; $L47: .loc 1 340 19 set.u32.ne.u64 %r528,%r168,0; neg.s32 %r529,%r528; .loc 1 340 24 cvt.u16.u32 %r534,%r529; and.b16 %r533,%r534,%r595; .loc 1 340 18 cvt.u32.u16 %r536,%r533; cvt.u16.u8 %r537,%r536; setp.ne.u16 %r538,%r537,0; @ %r538 bra $L45; .loc 1 354 19 mov.u64 %r183,%r147; bra $L46; $L49: .loc 1 333 21 st.u64 [%r217],%r524; .loc 1 336 29 mul.lo.u64 %r124,%r132,%r131; .loc 1 337 12 add.u64 %r182,%r182,1; .loc 1 338 14 add.u64 %r217,%r217,8; setp.eq.u64 %r540,%r194,%r182; @ ! 
%r540 bra $L48; bra $L47;
[remaining blocks of _gfortran_reshape_r8: the same label sequence as in _gfortran_reshape_r4 above ($L45 through $L40), ending with:]
$L1: .loc 1 364 1 ret; }
[truncated member-name fragment "reshape_r10" follows, then:]
reshape_c4.o/_gfortran_reshape_c4
.visible .func _gfortran_reshape_c4;
.file 1 "../../../libgfortran/generated/reshape_c4.c"
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options
.extern .global .align 8 .u64 _gfortrani_compile_options;
.extern .func _gfortran_runtime_error;
.extern .func _gfortrani_xmallocarray;
.extern .func _gfortrani_reshape_packed;
[constant pool: the same six RESHAPE diagnostic strings as in the reshape_r8.o member, again partially truncated in the dump]
.visible .func _gfortran_reshape_c4 {
.local .align 16 .b8 %frame_ar[1200];
[.reg declarations and the argument moves, truncated and run together in the dump]
.loc 1 78 10 ld.u64 %r264,[%r260+56]; add.u64 %r263,%r264,1; .loc 1 78 8 ld.u64 %r265,[%r260+48]; sub.u64 %r162,%r263,%r265; .loc 1 83 15 ld.s8 %r266,[%r258+28]; .loc 1 83 6 setp.eq.u64 %r267,%r266,%r162; @ %r267 bra $L2; .loc 1 84 [call to _gfortran_runtime_error; operands truncated in the dump] ld.u64 %r270,[%r260+40]; shl.b64 %r250,%r270,3; ld.u64 %r248,[%r260]; mov.u64 %r231,%frame; shl.b64 %r254,%r162,3; add.u64 %r255,%frame,%r254; .loc 1 90 44 mov.u64 %r251,%frame; .loc 1 86 15 mov.u32 %r152,0; .loc 1 93 23 mov.u64 %r608,0; .loc 1 94 14 mov.u32 %r609,1; $L5: .loc 1 90 39 ld.u64 %r32,[%r248]; .loc 1 91 10 setp.le.s64 %r271,%r32,0; @ %r271 bra $L3; .loc 1 90 21 st.u64 [%r251],%r32; bra $L4; $L3: .loc 1 93 23 st.u64 [%r251],%r608; .loc 1 94 14 mov.u32 %r152,%r609; $L4: .loc 1 88 3 add.u64 %r248,%r248,%r250; add.u64 %r251,%r251,8; setp.ne.u64 %r273,%r251,%r255; @ %r273 bra $L5; .loc 1 98 6 ld.u64 %r274,[%r258]; setp.ne.u64 %r275,%r274,0; @ %r275 bra $L6; add.u64 %r114,%r258,40; add.u64 %r278,%r162,%r162; add.u64 %r279,%r278,%r162; shl.b64 %r280,%r279,3; add.u64 %r247,%r114,%r280; mov.u64 %r160,%r231; .loc 1 102 10 mov.u64 %r168,1; $L7: .loc 1 105 8 ld.u64 %r167,[%r160]; .loc 1 107 4 st.u64
[%r114+8],%r274; add.u64 %r282,%r167,-1; st.u64 [%r114+16],%r282; st.u64 [%r114],%r168; .loc 1 109 7 mul.lo.u64 %r168,%r168,%r167; .loc 1 103 7 add.u64 %r160,%r160,8; add.u64 %r114,%r114,24; setp.ne.u64 %r283,%r114,%r247; @ %r283 bra $L7; .loc 1 111 19 st.u64 [%r258+8],%r274; .loc 1 118 24 max.s64 %r287,%r168,0; mov.u64 %r286call (%value_in),_gfortrani_xmallocarray288,[%value_in]; } .loc 1 118 22 st.u64 [%r258],%r288; .loc 1 119 23 st.u8 [%r258+28],%r162; $L6: .loc 1 122 6 setp.ne.u32 %r290,%r152,0; @ %r290 bra $L1; .loc 1 125 6 setp.eq.u64 %r291,%r261,0; @ %r291 bra $L53; .loc 1 127 12 ld.s8 %r146,[%r261+28]; .loc 1 130 7 setp.le.s64 %r293,%r146,0; @ %r293 bra $L54; .loc 1 132 21 shl.b64 %r295,%r146,3; add.u64 %r296,%frame,36295302,[%value_in]; } add.u64 %r140,%r261,40; add.u64 %r145,%frame,120; add.u64 %r52,%frame,240; add.u64 %r306,%r146,%r146; add.u64 %r307,%r306,%r146; shl.b64 %r308,%r307,3; add.u64 %r112,%r140,%r308; .loc 1 129 14 mov.u32 %r150,%r152; .loc 1 128 13 mov.u64 %r147,1; .loc 1 138 19 mov.u64 %r606,0; .loc 1 137 15 cvt.u32.u64 %r607,%r147; $L14: .loc 1 133 24 ld.u64 %r39,[%r140]; .loc 1 133 22 st.u64 [%r145],%r39; .loc 1 134 24 ld.u64 %r310,[%r140+16]; add.u64 %r309,%r310,1; ld.u64 %r311,[%r140+8]; sub.u64 %r43,%r309,%r311; .loc 1 135 14 setp.le.s64 %r312,%r43,0; @ %r312 bra $L11; .loc 1 134 22 st.u64 [%r52],%r43; bra $L12; $L11: .loc 1 138 19 st.u64 [%r52],%r606; .loc 1 137 15 mov.u32 %r150,%r607; $L12: .loc 1 141 14 setp.ne.u64 %r314,%r39,%r147; @ %r314 bra $L55; .loc 1 142 19 ld.u64 %r315,[%r52]; mul.lo.u64 %r147,%r147,%r315; bra $L13; $L55: .loc 1 144 19 mov.u64 %r147,0; $L13: .loc 1 130 7 add.u64 %r140,%r140,24; add.u64 %r145,%r145,8; add.u64 %r52,%r52,8; setp.ne.u64 %r316,%r112,%r140; @ %r316 bra $L14; bra $L10; $L54: .loc 1 129 14 mov.u32 %r150,%r152; .loc 1 128 13 mov.u64 %r147,1; $L10: .loc 1 146 12 ld.u64 %r148,[%r261]; bra $L9; $L53: .loc 1 153 12 mov.u64 %r148,%r261; .loc 1 152 14 mov.u32 %r150,1; .loc 1 151 13 mov.u64 %r147,1; .loc 1 150 12 mov.u64 %r146,%r261; $L9: .loc 1 156 7 cvta.global.u64 %r317,_gfortrani_compile_options; .loc 1 156 6 ld.u32 %r318,[%r317+36]; setp.eq.u32 %r319,%r318,0; @ ! 
%r319 bra $L92; setp.eq.u64 %r591,%r262,0; bra $L15; $L92: add.u64 %r230,%r258,48; .loc 1 161 23 mov.u64 %r225,0; .loc 1 160 10 mov.u64 %r176,1; $L17: .loc 1 163 20 ld.u64 %r47,[%r231]; .loc 1 163 7 mul.lo.u64 %r176,%r176,%r47; .loc 1 164 17 ld.u64 %r321,[%r230+8]; add.u64 %r320,%r321,1; .loc 1 164 15 ld.u64 %r322,[%r230]; sub.u64 %r177,%r320,%r322; .loc 1 161 41 add.u64 %r225,%r225,1; .loc 1 165 7 setp.eq.u64 %r323,%r47,%r177; @ %r323 bra $L16; .loc 1 166 6 st.u64 [%stack+16],%r47; st.u64 [%stack+8],%r177; st.u64 [%stack],%r225;_gfortran_runtime_error16: .loc 1 161 7 add.u64 %r231,%r231,8; add.u64 %r230,%r230,24; setp.ne.u64 %r326,%r162,%r225; @ %r326 bra $L17; .loc 1 173 14 ld.s8 %r53,[%r259+28]; .loc 1 174 7 cvt.u16.u32 %r327,%r53; setp.le.s16 %r328,%r327,0; @ %r328 bra $L56; add.u64 %r194,%r259,48; add.u64 %r242,%r259,56; cvt.u32.u32 %r331,%r53; cvt.s64.s8 %r330,%r331; add.u64 %r333,%r330,%r330; add.u64 %r334,%r333,%r330; shl.b64 %r335,%r334,3; add.u64 %r235,%r194,%r335; .loc 1 172 21 mov.u64 %r156,1; $L19: .loc 1 177 9 ld.u64 %r338,[%r242]; add.u64 %r337,%r338,1; .loc 1 177 7 ld.u64 %r339,[%r194]; sub.u64 %r336,%r337,%r339; .loc 1 178 33 max.s64 %r336,%r336,0; .loc 1 178 18 mul.lo.u64 %r156,%r156,%r336; .loc 1 174 7 add.u64 %r194,%r194,24; add.u64 %r242,%r242,24; setp.ne.u64 %r340,%r235,%r194; @ %r340 bra $L19; bra $L18; $L56: .loc 1 172 21 mov.u64 %r156,1; $L18: .loc 1 181 10 setp.ge.s64 %r341,%r156,%r176; @ %r341 bra $L20; .loc 1 181 34 set.u32.eq.u64 %r343,%r261,0; neg.s32 %r344,%r343; .loc 1 181 39 cvt.u16.u32 %r347,%r150; cvt.u16.u32 %r350,%r344; or.b16 %r349,%r350,%r347; .loc 1 181 30.loc 1 182 2 st.u64 [%stack+8],%r176; st.u64 [%stack],%r156; cvta.const.u64 %r33_gfortran_runtime_error20: .loc 1 186 10 setp.eq.u64 %r591,%r262,0; @ %r591 bra $L15; .loc 1 192 14 shl.b64 %r358,%r162,2; add.u64 %r359,%frame,480; mov.u32 %r36358365,[%value_in]; } ld.u64 %r367,[%r262+40]; shl.b64 %r192,%r367,3; ld.u64 %r190,[%r262]; .loc 1 194 20 mov.u64 %r173,0; .loc 1 206 16 mov.u32 %r605,1; $L23: .loc 1 196 28 ld.u64 %r63,[%r190]; .loc 1 196 10 add.u64 %r172,%r63,-1; .loc 1 198 14 shr.u64 %r369,%r172,63; .loc 1 198 23 set.u32.le.s64 %r371,%r162,%r172; neg.s32 %r372,%r371; .loc 1 198 18 cvt.u16.u64 %r375,%r369; cvt.u16.u32 %r376,%r372; or.b16 %r374,%r375,%r376; .loc 1 198 11 cvt.u32.u16 %r377,%r374; cvt.u16.u8 %r378,%r377; setp.eq.u16 %r379,%r378,0; @ %r379 bra $L21; .loc 1 199 3 st.u64 [%stack],%r63_gfortran_runtime_errorshl.b64 %r382,%r172,2; add.u64 %r383,%frame,%r382; add.u64 %r384,%r383,480; .loc 1 202 11 ld.u32 %r385,[%r384]; setp.eq.u32 %r386,%r385,0; @ %r386 bra $L22;_gfortran_runtime_error22: .loc 1 206 16 st.u32 [%r384],%r605; .loc 1 194 38 add.u64 %r173,%r173,1; .loc 1 194 4 add.u64 %r190,%r190,%r192; setp.ne.u64 %r393,%r162,%r173; @ %r393 bra $L23; $L15: add.u64 %r91,%frame,960; add.u64 %r51,%frame,1080; add.u64 %r46,%frame,840; .loc 1 194 20 mov.u64 %r83,%r51; mov.u64 %r186,0; mov.u64 %r137,1; .loc 1 220 17 mov.u64 %r401,%r186; $L29: .loc 1 215 10 @ %r591 bra $L57; .loc 1 216 34 ld.u64 %r396,[%r262+40]; mul.lo.u64 %r395,%r186,%r396; .loc 1 216 31 ld.u64 %r397,[%r262]; shl.b64 %r398,%r395,3; add.u64 %r399,%r397,%r398; .loc 1 216 13 ld.u64 %r400,[%r399]; add.u64 %r157,%r400,-1; bra $L24; $L57: mov.u64 %r157,%r186; $L24: .loc 1 220 17 st.u64 [%r83],%r401; .loc 1 221 20 add.u64 %r403,%r157,%r157; add.u64 %r404,%r403,%r157; shl.b64 %r405,%r404,3; add.u64 %r406,%r258,%r405; ld.u64 %r75,[%r406+40]; .loc 1 221 18 st.u64 [%r46],%r75; .loc 1 222 20 ld.u64 %r415,[%r406+56]; add.u64 %r414,%r415,1; 
ld.u64 %r422,[%r406+48]; sub.u64 %r79,%r414,%r422; .loc 1 223 10 setp.lt.s64 %r423,%r79,0; @ %r423 bra $L25; .loc 1 222 18 st.u64 [%r91],%r79; bra $L26; $L25: .loc 1 224 20 st.u64 [%r91],%r401; mov.u64 %r79,%r401; $L26: .loc 1 226 35 shl.b64 %r425,%r157,3; add.u64 %r426,%frame,%r425; .loc 1 226 10 ld.u64 %r427,[%r426]; setp.eq.u64 %r428,%r427,%r79; @ %r428 bra $L27; .loc 1 227 9 cvta.const.u64 %r4242_gfortran_runtime_error27: .loc 1 229 10 setp.ne.u64 %r431,%r75,%r137; @ %r431 bra $L58; .loc 1 230 15 mul.lo.u64 %r137,%r137,%r79; bra $L28; $L58: .loc 1 232 15 mov.u64 %r137,0; $L28: .loc 1 233 10 setp.le.s64 %r432,%r79,0; @ %r432 bra $L1; .loc 1 212 37 add.u64 %r186,%r186,1; .loc 1 212 3 add.u64 %r91,%r91,8; add.u64 %r83,%r83,8; add.u64 %r46,%r46,8; setp.ne.u64 %r433,%r162,%r186; @ %r433 bra $L29; .loc 1 237 8 ld.s8 %r195,[%r259+28]; .loc 1 247 17 shl.b64 %r436,%r195,3; add.u64 %r590,%frame,720; mov.u32 %r441436443,[%value_in]; } add.u64 %r198,%r259,40; add.u64 %r589,%frame,480; mov.u64 %r196,%r589; add.u64 %r586,%frame,600; mov.u64 %r170,%r586; add.u64 %r446,%r195,%r195; add.u64 %r447,%r446,%r195; shl.b64 %r448,%r447,3; add.u64 %r135,%r448,%r198; .loc 1 243 9 mov.u64 %r144,1; .loc 1 253 15 mov.u64 %r603,0; .loc 1 252 11 cvt.u32.u64 %r604,%r144; $L33: .loc 1 248 20 ld.u64 %r86,[%r198]; .loc 1 248 18 st.u64 [%r196],%r86; .loc 1 249 20 ld.u64 %r451,[%r198+16]; add.u64 %r450,%r451,1; ld.u64 %r452,[%r198+8]; sub.u64 %r90,%r450,%r452; .loc 1 250 10 setp.le.s64 %r453,%r90,0; @ %r453 bra $L30; .loc 1 249 18 st.u64 [%r170],%r90; bra $L31; $L30: .loc 1 253 15 st.u64 [%r170],%r603; .loc 1 252 11 mov.u32 %r152,%r604; $L31: .loc 1 256 10 setp.ne.u64 %r455,%r86,%r144; @ %r455 bra $L59; .loc 1 257 15 ld.u64 %r456,[%r170]; mul.lo.u64 %r144,%r144,%r456; bra $L32; $L59: .loc 1 259 15 mov.u64 %r144,0; $L32: .loc 1 245 3 add.u64 %r198,%r198,24; add.u64 %r196,%r196,8; add.u64 %r170,%r170,8; setp.ne.u64 %r457,%r135,%r198; @ %r457 bra $L33; .loc 1 267 34 ld.u64 %r82,[%r258]; .loc 1 267 68 ld.u64 %r184,[%r259]; .loc 1 262 13 set.u32.ne.u64 %r459,%r137,0; neg.s32 %r460,%r459; .loc 1 262 32 set.u32.ne.u64 %r462,%r147,0; neg.s32 %r463,%r462; cvt.u16.u32 %r465,%r460; cvt.u16.u32 %r466,%r463; and.b16 %r464,%r465,%r466; .loc 1 262 27 set.u32.ne.u64 %r468,%r144,0; neg.s32 %r469,%r468; .loc 1 262 32 cvt.u16.u32 %r473,%r469; and.b16 %r471,%r464,%r473; cvt.u32.u16 %r474,%r471; cvt.u16.u8 %r475,%r474; setp.eq.u16 %r476,%r475,0; @ %r476 bra $L34; .loc 1 264 13 shl.b64 %r96,%r137,3; .loc 1 265 13 shl.b64 %r98,%r144,3; .loc 1 266 13 shl.b64 %r100,%r147,3; .loc 1 267 7 {1841100; call _gfortrani_reshape_packed269 7 bra $L1; $L34: .loc 1 273 12 ld.u64 %r182,[%frame+840]; .loc 1 274 12 ld.u64 %r141,[%frame+480]; .loc 1 276 6 and.b32 %r483,%r152,%r150; setp.eq.u32 %r484,%r483,0; @ %r484 bra $L3636: .loc 1 279 6 setp.eq.u32 %r485,%r152,0; @ %r485 bra $L37; .loc 1 285 7 setp.le.s64 %r486,%r146,0; @ %r486 bra $L60; .loc 1 290 13 ld.u64 %r141,[%frame+120]; .loc 1 287 16 shl.b64 %r208,%r146,3; add.u64 %r488,%frame,368495,[%value_in]; } .loc 1 288 17 add.u64 %r498,%frame,2208505,[%value_in]; } mov.u64 %r237,%r141; mov.u64 %r201,0; .loc 1 289 26 add.u64 %r602,%frame,120; $L38: .loc 1 289 17 add.u64 %r508,%r589,%r201; st.u64 [%r508],%r237; .loc 1 285 7 add.u64 %r201,%r201,8; setp.eq.u64 %r509,%r201,%r208; @ %r509 bra $L61; .loc 1 289 26 add.u64 %r511,%r602,%r201; ld.u64 %r237,[%r511]; bra $L38; $L50: .loc 1 297 15 ld.f32 %r106,[%r184]; ld.f32 %r107,[%r184+4]; .loc 1 297 13 st.f32 [%r82],%r106; st.f32 [%r82+4],%r107; .loc 1 299 12 add.u64 
%r82,%r82,%r109; .loc 1 300 11 add.u64 %r184,%r184,%r587; .loc 1 301 16 add.u64 %r158,%r158,1; .loc 1 302 16 add.u64 %r243,%r243,1; st.u64 [%frame+720],%r243; .loc 1 306 13 setp.ne.u64 %r513,%r158,%r223; @ %r513 bra $L39; .loc 1 315 14 @ %r601 bra $L40; add.u64 %r174,%frame,1088; add.u64 %r104,%frame,848; add.u64 %r171,%frame,968; .loc 1 313 30 mov.u64 %r116,%r139; .loc 1 310 21 mov.u64 %r593,0; bra $L41; $L42: st.u64 [%r174],%r593; .loc 1 313 30 mul.lo.u64 %r116,%r119,%r118; .loc 1 315 14 add.u64 %r174,%r174,8; add.u64 %r104,%r104,8; add.u64 %r171,%r171,8; setp.eq.u64 %r519,%r174,%r202; @ %r519 bra $L40; $L41: .loc 1 323 24 ld.u64 %r520,[%r174]; add.u64 %r118,%r520,1; st.u64 [%r174],%r118; .loc 1 324 30 ld.u64 %r119,[%r104]; .loc 1 324 20 sub.u64 %r521,%r119,%r116; shl.b64 %r522,%r521,3; add.u64 %r82,%r82,%r522; .loc 1 306 34 ld.u64 %r121,[%r171]; .loc 1 306 13 setp.eq.u64 %r523,%r118,%r121; @ %r523 bra $L42; .loc 1 329 13 setp.ne.u64 %r524,%r243,%r588; @ %r524 bra $L62; .loc 1 310 21 mov.u64 %r158,0; bra $L51; $L64: mov.u64 %r158,0; .loc 1 329 13 mov.u64 %r82,%r158; $L51: .loc 1 333 21 mov.u64 %r525,0; st.u64 [%frame+720],%r525; .loc 1 336 29 ld.u64 %r526,[%frame+480]; mul.lo.u64 %r125,%r588,%r526; .loc 1 338 14 setp.ne.u64 %r527,%r195,1; @ %r527 bra $L44; $L47: .loc 1 340 19 set.u32.ne.u64 %r529,%r169,0; neg.s32 %r530,%r529; .loc 1 340 24 cvt.u16.u32 %r535,%r530; and.b16 %r534,%r535,%r596; .loc 1 340 18 cvt.u32.u16 %r537,%r534; cvt.u16.u8 %r538,%r53745; .loc 1 354 19 mov.u64 %r184,%r148; bra $L46; $L49: .loc 1 333 21 st.u64 [%r218],%r525; .loc 1 336 29 mul.lo.u64 %r125,%r133,%r132; .loc 1 337 12 add.u64 %r183,%r183,1; .loc 1 338 14 add.u64 %r218,%r218,8; setp.eq.u64 %r541,%r195,%r183; @ ! %r541 bra $L48; bra $L47; $L45: .loc 1 345 19 @ %r597 bra $L63; .loc 1 347 356551,[%value_in]; } .loc 1 348 36206561,[%value_in]; } .loc 1 349 36206571,[%value_in]; } .loc 1 350 32 ld.u64 %r141,[%frame+480]; .loc 1 354 19 mov.u64 %r184,%r148; .loc 1 350 32 mov.u64 %r195,%r146; .loc 1 343 24 mov.u64 %r169,0; bra $L46; $L44: add.u64 %r218,%frame,728; .loc 1 337 12 mov.u64 %r183,1; $L48: .loc 1 359 24 ld.u64 %r574,[%r218]; add.u64 %r132,%r574,1; st.u64 [%r218],%r132; shl.b64 %r185,%r183,3; .loc 1 360 29 add.u64 %r576,%r589,%r185; ld.u64 %r133,[%r576]; .loc 1 360 19 sub.u64 %r577,%r133,%r125; shl.b64 %r578,%r577,3; add.u64 %r184,%r184,%r578; .loc 1 329 34 add.u64 %r580,%r586,%r185; ld.u64 %r136,[%r580]; .loc 1 329 13 setp.eq.u64 %r581,%r132,%r136; @ %r581 bra $L49; bra $L46; $L60: .loc 1 283 12 mov.u64 %r184,%r148; .loc 1 285 7 mov.u64 %r195,%r146; bra $L37; $L61: .loc 1 283 12 mov.u64 %r184,%r148; mov.u64 %r195,%r146; $L37: .loc 1 294 9 setp.eq.u64 %r582,%r82,0; @ %r582 bra $L1; .loc 1 299 12 shl.b64 %r109,%r182,3; ld.u64 %r158,[%frame+1080]; .loc 1 306 34 ld.u64 %r223,[%frame+960]; .loc 1 313 30 mul.lo.u64 %r139,%r223,%r182; .loc 1 329 34 ld.u64 %r588,[%frame+600]; .loc 1 302 13 ld.u64 %r243,[%frame+720]; add.u64 %r202,%r51,%r254; .loc 1 347 35 shl.b64 %r206,%r146,3; mov.u64 %r169,%r184; shl.b64 %r587,%r141,3; .loc 1 340 24 set.u32.ne.u64 %r594,%r261,0; neg.s32 %r595,%r594; cvt.u16.u32 %r596,%r595; .loc 1 345 19 setp.le.s64 %r597,%r146,0; .loc 1 347 35 add.u64 %r598,%frame,360; .loc 1 348 36 add.u64 %r599,%frame,240; .loc 1 349 36 add.u64 %r600,%frame,120; .loc 1 315 14 setp.eq.u64 %r601,%r162,1; bra $L50; $L63: .loc 1 354 19 mov.u64 %r184,%r148; .loc 1 345 19 mov.u64 %r195,%r146; .loc 1 343 24 mov.u64 %r169,0; $L46: .loc 1 294 9 setp.eq.u64 %r583,%r82,0; @ %r583 bra $L1; shl.b64 %r587,%r141,3; ld.u64 
%r588,[%frame+600]; bra $L43; $L62: .loc 1 310 21 mov.u64 %r158,0; $L43: .loc 1 302 13 ld.u64 %r243,[%frame+720]; bra $L50; $L39: .loc 1 329 13 setp.eq.u64 %r584,%r243,%r588; @ %r584 bra $L51; bra $L50; $L40: setp.eq.u64 %r585,%r243,%r588; @ %r585 bra $L64; $L1: .loc 1 364 1 ret; }
reshape_c8.o/_gfortran_reshape_c8
.visible .func _gfortran_reshape_c8;
.file 1 "../../../libgfortran/generated/reshape_c8.c"
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options
.extern .global .align 8 .u64 _gfortrani_compile_options;
.extern .func _gfortran_runtime_error;
.extern .func _gfortrani_xmallocarray;
.extern .func _gfortrani_reshape_packed;
[constant pool: the same six RESHAPE diagnostic strings, again partially truncated in the dump]
.visible .func _gfortran_reshape_c8 {
.local .align 16 .b8 %frame_ar[1200];
[.reg declarations and the argument moves, truncated and run together in the dump]
.loc 1 78 10 ld.u64 %r264,[%r260+56]; add.u64 %r263,%r264,1; .loc 1 78 8 ld.u64 %r265,[%r260+48]; sub.u64 %r162,%r263,%r265; .loc 1 83 15 ld.s8 %r266,[%r258+28]; .loc 1 83 6 setp.eq.u64 %r267,%r266,%r162; @ %r267 bra $L2;
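_gfortran_reshape_c4 above and _gfortran_reshape_c8 below are the complex-kind builds of the same template; apart from register numbering they differ from the real-kind bodies only in the element size, 8 and 16 bytes, visible in the paired f32/f64 loads and stores at .loc 297 and in the shl.b64 stride scaling by 3 versus 4. All four bodies also consult _gfortrani_compile_options (the flag word loaded at offset 36); when it is set they run the bounds checks whose format strings sit in each member's constant pool, including a check that the ORDER argument is a permutation, implemented with the flag array the PTX keeps at %frame+480 (.loc 196-206, calling _gfortran_runtime_error on an out-of-range or repeated value). Below is a small C sketch of that permutation check, assuming a plain array of ORDER values; check_order and the error callback are hypothetical stand-ins, not the library's code.

/* Sketch of the ORDER-argument validation: every value must lie in 1..rank
   and occur exactly once.  The seen[] array plays the role of the per-
   dimension flags at %frame+480; rank is assumed to be at most 15 here. */
#include <stddef.h>

static void
check_order (const ptrdiff_t *order, ptrdiff_t rank,
             void (*error) (const char *msg, long val))
{
  int seen[15] = {0};

  for (ptrdiff_t i = 0; i < rank; i++)
    {
      ptrdiff_t v = order[i] - 1;     /* Fortran ORDER values are 1-based */
      if (v < 0 || v >= rank)
        error ("ORDER value out of range in RESHAPE intrinsic", (long) order[i]);
      else if (seen[v])
        error ("duplicate ORDER value in RESHAPE intrinsic", (long) order[i]);
      else
        seen[v] = 1;
    }
}

The body of _gfortran_reshape_c8 resumes below.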
.loc 1 84_gfortran_runtime_errorld.u64 %r270,[%r260+40]; shl.b64 %r250,%r270,3; ld.u64 %r248,[%r260]; mov.u64 %r231,%frame; shl.b64 %r254,%r162,3; add.u64 %r255,%frame,%r254; .loc 1 90 44 mov.u64 %r251,%frame; .loc 1 86 15 mov.u32 %r152,0; .loc 1 93 23 mov.u64 %r608,0; .loc 1 94 14 mov.u32 %r609,1; $L5: .loc 1 90 39 ld.u64 %r32,[%r248]; .loc 1 91 10 setp.le.s64 %r271,%r32,0; @ %r271 bra $L3; .loc 1 90 21 st.u64 [%r251],%r32; bra $L4; $L3: .loc 1 93 23 st.u64 [%r251],%r608; .loc 1 94 14 mov.u32 %r152,%r609; $L4: .loc 1 88 3 add.u64 %r248,%r248,%r250; add.u64 %r251,%r251,8; setp.ne.u64 %r273,%r251,%r255; @ %r273 bra $L5; .loc 1 98 6 ld.u64 %r274,[%r258]; setp.ne.u64 %r275,%r274,0; @ %r275 bra $L6; add.u64 %r114,%r258,40; add.u64 %r278,%r162,%r162; add.u64 %r279,%r278,%r162; shl.b64 %r280,%r279,3; add.u64 %r247,%r114,%r280; mov.u64 %r160,%r231; .loc 1 102 10 mov.u64 %r168,1; $L7: .loc 1 105 8 ld.u64 %r167,[%r160]; .loc 1 107 4 st.u64 [%r114+8],%r274; add.u64 %r282,%r167,-1; st.u64 [%r114+16],%r282; st.u64 [%r114],%r168; .loc 1 109 7 mul.lo.u64 %r168,%r168,%r167; .loc 1 103 7 add.u64 %r160,%r160,8; add.u64 %r114,%r114,24; setp.ne.u64 %r283,%r114,%r247; @ %r283 bra $L7; .loc 1 111 19 st.u64 [%r258+8],%r274; .loc 1 118 24 max.s64 %r287,%r168,0; mov.u64 %r286,16286; call (%value_in),_gfortrani_xmallocarray288,[%value_in]; } .loc 1 118 22 st.u64 [%r258],%r288; .loc 1 119 23 st.u8 [%r258+28],%r162; $L6: .loc 1 122 6 setp.ne.u32 %r290,%r152,0; @ %r290 bra $L1; .loc 1 125 6 setp.eq.u64 %r291,%r261,0; @ %r291 bra $L53; .loc 1 127 12 ld.s8 %r146,[%r261+28]; .loc 1 130 7 setp.le.s64 %r293,%r146,0; @ %r293 bra $L54; .loc 1 132 21 shl.b64 %r295,%r146,3; add.u64 %r296,%frame,36295302,[%value_in]; } add.u64 %r140,%r261,40; add.u64 %r145,%frame,120; add.u64 %r52,%frame,240; add.u64 %r306,%r146,%r146; add.u64 %r307,%r306,%r146; shl.b64 %r308,%r307,3; add.u64 %r112,%r140,%r308; .loc 1 129 14 mov.u32 %r150,%r152; .loc 1 128 13 mov.u64 %r147,1; .loc 1 138 19 mov.u64 %r606,0; .loc 1 137 15 cvt.u32.u64 %r607,%r147; $L14: .loc 1 133 24 ld.u64 %r39,[%r140]; .loc 1 133 22 st.u64 [%r145],%r39; .loc 1 134 24 ld.u64 %r310,[%r140+16]; add.u64 %r309,%r310,1; ld.u64 %r311,[%r140+8]; sub.u64 %r43,%r309,%r311; .loc 1 135 14 setp.le.s64 %r312,%r43,0; @ %r312 bra $L11; .loc 1 134 22 st.u64 [%r52],%r43; bra $L12; $L11: .loc 1 138 19 st.u64 [%r52],%r606; .loc 1 137 15 mov.u32 %r150,%r607; $L12: .loc 1 141 14 setp.ne.u64 %r314,%r39,%r147; @ %r314 bra $L55; .loc 1 142 19 ld.u64 %r315,[%r52]; mul.lo.u64 %r147,%r147,%r315; bra $L13; $L55: .loc 1 144 19 mov.u64 %r147,0; $L13: .loc 1 130 7 add.u64 %r140,%r140,24; add.u64 %r145,%r145,8; add.u64 %r52,%r52,8; setp.ne.u64 %r316,%r112,%r140; @ %r316 bra $L14; bra $L10; $L54: .loc 1 129 14 mov.u32 %r150,%r152; .loc 1 128 13 mov.u64 %r147,1; $L10: .loc 1 146 12 ld.u64 %r148,[%r261]; bra $L9; $L53: .loc 1 153 12 mov.u64 %r148,%r261; .loc 1 152 14 mov.u32 %r150,1; .loc 1 151 13 mov.u64 %r147,1; .loc 1 150 12 mov.u64 %r146,%r261; $L9: .loc 1 156 7 cvta.global.u64 %r317,_gfortrani_compile_options; .loc 1 156 6 ld.u32 %r318,[%r317+36]; setp.eq.u32 %r319,%r318,0; @ ! 
%r319 bra $L92; setp.eq.u64 %r591,%r262,0; bra $L15; $L92: add.u64 %r230,%r258,48; .loc 1 161 23 mov.u64 %r225,0; .loc 1 160 10 mov.u64 %r176,1; $L17: .loc 1 163 20 ld.u64 %r47,[%r231]; .loc 1 163 7 mul.lo.u64 %r176,%r176,%r47; .loc 1 164 17 ld.u64 %r321,[%r230+8]; add.u64 %r320,%r321,1; .loc 1 164 15 ld.u64 %r322,[%r230]; sub.u64 %r177,%r320,%r322; .loc 1 161 41 add.u64 %r225,%r225,1; .loc 1 165 7 setp.eq.u64 %r323,%r47,%r177; @ %r323 bra $L16; .loc 1 166 6 st.u64 [%stack+16],%r47; st.u64 [%stack+8],%r177; st.u64 [%stack],%r225;_gfortran_runtime_error16: .loc 1 161 7 add.u64 %r231,%r231,8; add.u64 %r230,%r230,24; setp.ne.u64 %r326,%r162,%r225; @ %r326 bra $L17; .loc 1 173 14 ld.s8 %r53,[%r259+28]; .loc 1 174 7 cvt.u16.u32 %r327,%r53; setp.le.s16 %r328,%r327,0; @ %r328 bra $L56; add.u64 %r194,%r259,48; add.u64 %r242,%r259,56; cvt.u32.u32 %r331,%r53; cvt.s64.s8 %r330,%r331; add.u64 %r333,%r330,%r330; add.u64 %r334,%r333,%r330; shl.b64 %r335,%r334,3; add.u64 %r235,%r194,%r335; .loc 1 172 21 mov.u64 %r156,1; $L19: .loc 1 177 9 ld.u64 %r338,[%r242]; add.u64 %r337,%r338,1; .loc 1 177 7 ld.u64 %r339,[%r194]; sub.u64 %r336,%r337,%r339; .loc 1 178 33 max.s64 %r336,%r336,0; .loc 1 178 18 mul.lo.u64 %r156,%r156,%r336; .loc 1 174 7 add.u64 %r194,%r194,24; add.u64 %r242,%r242,24; setp.ne.u64 %r340,%r235,%r194; @ %r340 bra $L19; bra $L18; $L56: .loc 1 172 21 mov.u64 %r156,1; $L18: .loc 1 181 10 setp.ge.s64 %r341,%r156,%r176; @ %r341 bra $L20; .loc 1 181 34 set.u32.eq.u64 %r343,%r261,0; neg.s32 %r344,%r343; .loc 1 181 39 cvt.u16.u32 %r347,%r150; cvt.u16.u32 %r350,%r344; or.b16 %r349,%r350,%r347; .loc 1 181 30.loc 1 182 2 st.u64 [%stack+8],%r176; st.u64 [%stack],%r156; cvta.const.u64 %r33_gfortran_runtime_error20: .loc 1 186 10 setp.eq.u64 %r591,%r262,0; @ %r591 bra $L15; .loc 1 192 14 shl.b64 %r358,%r162,2; add.u64 %r359,%frame,480; mov.u32 %r36358365,[%value_in]; } ld.u64 %r367,[%r262+40]; shl.b64 %r192,%r367,3; ld.u64 %r190,[%r262]; .loc 1 194 20 mov.u64 %r173,0; .loc 1 206 16 mov.u32 %r605,1; $L23: .loc 1 196 28 ld.u64 %r63,[%r190]; .loc 1 196 10 add.u64 %r172,%r63,-1; .loc 1 198 14 shr.u64 %r369,%r172,63; .loc 1 198 23 set.u32.le.s64 %r371,%r162,%r172; neg.s32 %r372,%r371; .loc 1 198 18 cvt.u16.u64 %r375,%r369; cvt.u16.u32 %r376,%r372; or.b16 %r374,%r375,%r376; .loc 1 198 11 cvt.u32.u16 %r377,%r374; cvt.u16.u8 %r378,%r377; setp.eq.u16 %r379,%r378,0; @ %r379 bra $L21; .loc 1 199 3 st.u64 [%stack],%r63_gfortran_runtime_errorshl.b64 %r382,%r172,2; add.u64 %r383,%frame,%r382; add.u64 %r384,%r383,480; .loc 1 202 11 ld.u32 %r385,[%r384]; setp.eq.u32 %r386,%r385,0; @ %r386 bra $L22;_gfortran_runtime_error22: .loc 1 206 16 st.u32 [%r384],%r605; .loc 1 194 38 add.u64 %r173,%r173,1; .loc 1 194 4 add.u64 %r190,%r190,%r192; setp.ne.u64 %r393,%r162,%r173; @ %r393 bra $L23; $L15: add.u64 %r91,%frame,960; add.u64 %r51,%frame,1080; add.u64 %r46,%frame,840; .loc 1 194 20 mov.u64 %r83,%r51; mov.u64 %r186,0; mov.u64 %r137,1; .loc 1 220 17 mov.u64 %r401,%r186; $L29: .loc 1 215 10 @ %r591 bra $L57; .loc 1 216 34 ld.u64 %r396,[%r262+40]; mul.lo.u64 %r395,%r186,%r396; .loc 1 216 31 ld.u64 %r397,[%r262]; shl.b64 %r398,%r395,3; add.u64 %r399,%r397,%r398; .loc 1 216 13 ld.u64 %r400,[%r399]; add.u64 %r157,%r400,-1; bra $L24; $L57: mov.u64 %r157,%r186; $L24: .loc 1 220 17 st.u64 [%r83],%r401; .loc 1 221 20 add.u64 %r403,%r157,%r157; add.u64 %r404,%r403,%r157; shl.b64 %r405,%r404,3; add.u64 %r406,%r258,%r405; ld.u64 %r75,[%r406+40]; .loc 1 221 18 st.u64 [%r46],%r75; .loc 1 222 20 ld.u64 %r415,[%r406+56]; add.u64 %r414,%r415,1; 
ld.u64 %r422,[%r406+48]; sub.u64 %r79,%r414,%r422; .loc 1 223 10 setp.lt.s64 %r423,%r79,0; @ %r423 bra $L25; .loc 1 222 18 st.u64 [%r91],%r79; bra $L26; $L25: .loc 1 224 20 st.u64 [%r91],%r401; mov.u64 %r79,%r401; $L26: .loc 1 226 35 shl.b64 %r425,%r157,3; add.u64 %r426,%frame,%r425; .loc 1 226 10 ld.u64 %r427,[%r426]; setp.eq.u64 %r428,%r427,%r79; @ %r428 bra $L27; .loc 1 227 9 cvta.const.u64 %r4242_gfortran_runtime_error27: .loc 1 229 10 setp.ne.u64 %r431,%r75,%r137; @ %r431 bra $L58; .loc 1 230 15 mul.lo.u64 %r137,%r137,%r79; bra $L28; $L58: .loc 1 232 15 mov.u64 %r137,0; $L28: .loc 1 233 10 setp.le.s64 %r432,%r79,0; @ %r432 bra $L1; .loc 1 212 37 add.u64 %r186,%r186,1; .loc 1 212 3 add.u64 %r91,%r91,8; add.u64 %r83,%r83,8; add.u64 %r46,%r46,8; setp.ne.u64 %r433,%r162,%r186; @ %r433 bra $L29; .loc 1 237 8 ld.s8 %r195,[%r259+28]; .loc 1 247 17 shl.b64 %r436,%r195,3; add.u64 %r590,%frame,720; mov.u32 %r441436443,[%value_in]; } add.u64 %r198,%r259,40; add.u64 %r589,%frame,480; mov.u64 %r196,%r589; add.u64 %r586,%frame,600; mov.u64 %r170,%r586; add.u64 %r446,%r195,%r195; add.u64 %r447,%r446,%r195; shl.b64 %r448,%r447,3; add.u64 %r135,%r448,%r198; .loc 1 243 9 mov.u64 %r144,1; .loc 1 253 15 mov.u64 %r603,0; .loc 1 252 11 cvt.u32.u64 %r604,%r144; $L33: .loc 1 248 20 ld.u64 %r86,[%r198]; .loc 1 248 18 st.u64 [%r196],%r86; .loc 1 249 20 ld.u64 %r451,[%r198+16]; add.u64 %r450,%r451,1; ld.u64 %r452,[%r198+8]; sub.u64 %r90,%r450,%r452; .loc 1 250 10 setp.le.s64 %r453,%r90,0; @ %r453 bra $L30; .loc 1 249 18 st.u64 [%r170],%r90; bra $L31; $L30: .loc 1 253 15 st.u64 [%r170],%r603; .loc 1 252 11 mov.u32 %r152,%r604; $L31: .loc 1 256 10 setp.ne.u64 %r455,%r86,%r144; @ %r455 bra $L59; .loc 1 257 15 ld.u64 %r456,[%r170]; mul.lo.u64 %r144,%r144,%r456; bra $L32; $L59: .loc 1 259 15 mov.u64 %r144,0; $L32: .loc 1 245 3 add.u64 %r198,%r198,24; add.u64 %r196,%r196,8; add.u64 %r170,%r170,8; setp.ne.u64 %r457,%r135,%r198; @ %r457 bra $L33; .loc 1 267 34 ld.u64 %r82,[%r258]; .loc 1 267 68 ld.u64 %r184,[%r259]; .loc 1 262 13 set.u32.ne.u64 %r459,%r137,0; neg.s32 %r460,%r459; .loc 1 262 32 set.u32.ne.u64 %r462,%r147,0; neg.s32 %r463,%r462; cvt.u16.u32 %r465,%r460; cvt.u16.u32 %r466,%r463; and.b16 %r464,%r465,%r466; .loc 1 262 27 set.u32.ne.u64 %r468,%r144,0; neg.s32 %r469,%r468; .loc 1 262 32 cvt.u16.u32 %r473,%r469; and.b16 %r471,%r464,%r473; cvt.u32.u16 %r474,%r471; cvt.u16.u8 %r475,%r474; setp.eq.u16 %r476,%r475,0; @ %r476 bra $L34; .loc 1 264 13 shl.b64 %r96,%r137,4; .loc 1 265 13 shl.b64 %r98,%r144,4; .loc 1 266 13 shl.b64 %r100,%r147,4; .loc 1 267 7 {1841100; call _gfortrani_reshape_packed269 7 bra $L1; $L34: .loc 1 273 12 ld.u64 %r182,[%frame+840]; .loc 1 274 12 ld.u64 %r141,[%frame+480]; .loc 1 276 6 and.b32 %r483,%r152,%r150; setp.eq.u32 %r484,%r483,0; @ %r484 bra $L3636: .loc 1 279 6 setp.eq.u32 %r485,%r152,0; @ %r485 bra $L37; .loc 1 285 7 setp.le.s64 %r486,%r146,0; @ %r486 bra $L60; .loc 1 290 13 ld.u64 %r141,[%frame+120]; .loc 1 287 16 shl.b64 %r208,%r146,3; add.u64 %r488,%frame,368495,[%value_in]; } .loc 1 288 17 add.u64 %r498,%frame,2208505,[%value_in]; } mov.u64 %r237,%r141; mov.u64 %r201,0; .loc 1 289 26 add.u64 %r602,%frame,120; $L38: .loc 1 289 17 add.u64 %r508,%r589,%r201; st.u64 [%r508],%r237; .loc 1 285 7 add.u64 %r201,%r201,8; setp.eq.u64 %r509,%r201,%r208; @ %r509 bra $L61; .loc 1 289 26 add.u64 %r511,%r602,%r201; ld.u64 %r237,[%r511]; bra $L38; $L50: .loc 1 297 15 ld.f64 %r106,[%r184]; ld.f64 %r107,[%r184+8]; .loc 1 297 13 st.f64 [%r82],%r106; st.f64 [%r82+8],%r107; .loc 1 299 12 add.u64 
%r82,%r82,%r109; .loc 1 300 11 add.u64 %r184,%r184,%r587; .loc 1 301 16 add.u64 %r158,%r158,1; .loc 1 302 16 add.u64 %r243,%r243,1; st.u64 [%frame+720],%r243; .loc 1 306 13 setp.ne.u64 %r513,%r158,%r223; @ %r513 bra $L39; .loc 1 315 14 @ %r601 bra $L40; add.u64 %r174,%frame,1088; add.u64 %r104,%frame,848; add.u64 %r171,%frame,968; .loc 1 313 30 mov.u64 %r116,%r139; .loc 1 310 21 mov.u64 %r593,0; bra $L41; $L42: st.u64 [%r174],%r593; .loc 1 313 30 mul.lo.u64 %r116,%r119,%r118; .loc 1 315 14 add.u64 %r174,%r174,8; add.u64 %r104,%r104,8; add.u64 %r171,%r171,8; setp.eq.u64 %r519,%r174,%r202; @ %r519 bra $L40; $L41: .loc 1 323 24 ld.u64 %r520,[%r174]; add.u64 %r118,%r520,1; st.u64 [%r174],%r118; .loc 1 324 30 ld.u64 %r119,[%r104]; .loc 1 324 20 sub.u64 %r521,%r119,%r116; shl.b64 %r522,%r521,4; add.u64 %r82,%r82,%r522; .loc 1 306 34 ld.u64 %r121,[%r171]; .loc 1 306 13 setp.eq.u64 %r523,%r118,%r121; @ %r523 bra $L42; .loc 1 329 13 setp.ne.u64 %r524,%r243,%r588; @ %r524 bra $L62; .loc 1 310 21 mov.u64 %r158,0; bra $L51; $L64: mov.u64 %r158,0; .loc 1 329 13 mov.u64 %r82,%r158; $L51: .loc 1 333 21 mov.u64 %r525,0; st.u64 [%frame+720],%r525; .loc 1 336 29 ld.u64 %r526,[%frame+480]; mul.lo.u64 %r125,%r588,%r526; .loc 1 338 14 setp.ne.u64 %r527,%r195,1; @ %r527 bra $L44; $L47: .loc 1 340 19 set.u32.ne.u64 %r529,%r169,0; neg.s32 %r530,%r529; .loc 1 340 24 cvt.u16.u32 %r535,%r530; and.b16 %r534,%r535,%r596; .loc 1 340 18 cvt.u32.u16 %r537,%r534; cvt.u16.u8 %r538,%r53745; .loc 1 354 19 mov.u64 %r184,%r148; bra $L46; $L49: .loc 1 333 21 st.u64 [%r218],%r525; .loc 1 336 29 mul.lo.u64 %r125,%r133,%r132; .loc 1 337 12 add.u64 %r183,%r183,1; .loc 1 338 14 add.u64 %r218,%r218,8; setp.eq.u64 %r541,%r195,%r183; @ ! %r541 bra $L48; bra $L47; $L45: .loc 1 345 19 @ %r597 bra $L63; .loc 1 347 356551,[%value_in]; } .loc 1 348 36206561,[%value_in]; } .loc 1 349 36206571,[%value_in]; } .loc 1 350 32 ld.u64 %r141,[%frame+480]; .loc 1 354 19 mov.u64 %r184,%r148; .loc 1 350 32 mov.u64 %r195,%r146; .loc 1 343 24 mov.u64 %r169,0; bra $L46; $L44: add.u64 %r218,%frame,728; .loc 1 337 12 mov.u64 %r183,1; $L48: .loc 1 359 24 ld.u64 %r574,[%r218]; add.u64 %r132,%r574,1; st.u64 [%r218],%r132; shl.b64 %r185,%r183,3; .loc 1 360 29 add.u64 %r576,%r589,%r185; ld.u64 %r133,[%r576]; .loc 1 360 19 sub.u64 %r577,%r133,%r125; shl.b64 %r578,%r577,4; add.u64 %r184,%r184,%r578; .loc 1 329 34 add.u64 %r580,%r586,%r185; ld.u64 %r136,[%r580]; .loc 1 329 13 setp.eq.u64 %r581,%r132,%r136; @ %r581 bra $L49; bra $L46; $L60: .loc 1 283 12 mov.u64 %r184,%r148; .loc 1 285 7 mov.u64 %r195,%r146; bra $L37; $L61: .loc 1 283 12 mov.u64 %r184,%r148; mov.u64 %r195,%r146; $L37: .loc 1 294 9 setp.eq.u64 %r582,%r82,0; @ %r582 bra $L1; .loc 1 299 12 shl.b64 %r109,%r182,4; ld.u64 %r158,[%frame+1080]; .loc 1 306 34 ld.u64 %r223,[%frame+960]; .loc 1 313 30 mul.lo.u64 %r139,%r223,%r182; .loc 1 329 34 ld.u64 %r588,[%frame+600]; .loc 1 302 13 ld.u64 %r243,[%frame+720]; add.u64 %r202,%r51,%r254; .loc 1 347 35 shl.b64 %r206,%r146,3; mov.u64 %r169,%r184; shl.b64 %r587,%r141,4; .loc 1 340 24 set.u32.ne.u64 %r594,%r261,0; neg.s32 %r595,%r594; cvt.u16.u32 %r596,%r595; .loc 1 345 19 setp.le.s64 %r597,%r146,0; .loc 1 347 35 add.u64 %r598,%frame,360; .loc 1 348 36 add.u64 %r599,%frame,240; .loc 1 349 36 add.u64 %r600,%frame,120; .loc 1 315 14 setp.eq.u64 %r601,%r162,1; bra $L50; $L63: .loc 1 354 19 mov.u64 %r184,%r148; .loc 1 345 19 mov.u64 %r195,%r146; .loc 1 343 24 mov.u64 %r169,0; $L46: .loc 1 294 9 setp.eq.u64 %r583,%r82,0; @ %r583 bra $L1; shl.b64 %r587,%r141,4; ld.u64 
%r588,[%frame+600]; bra $L43; $L62: .loc 1 310 21 mov.u64 %r158,0; $L43: .loc 1 302 13 ld.u64 %r243,[%frame+720]; bra $L50; $L39: .loc 1 329 13 setp.eq.u64 %r584,%r243,%r588; @ %r584 bra $L51; bra $L50; $L40: setp.eq.u64 %r585,%r243,%r588; @ %r585 bra $L64; $L1: .loc 1 36in_pack_i1.o/gfortrani_internal_pack_1gfortrani_internal_pack_1fortran/generated/in_pack_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarraygfortrani_internal_pack_1gfortrani_internal_pack_116 .b8 %frame_ar[368u32u64 %r95; .reg .u64 %r96; .reg .pred %r97; .reg .pred %r99mov.u64 %r71,%ar0; .loc 1 51 7 ld.s8 %r50,[%r71+28]; .loc 1 54 3 setp.gt.s64 %r72,%r50,0; @ %r72 bra $L2; $L4: .loc 1 73 18 ld.u64 %r70,[%r71]; bra $L1; $L2: add.u64 %r42,%frame,240; add.u64 %r40,%r71,40; mov.u64 %r32,%frame; add.u64 %r28,%frame,120; .loc 1 54 19 mov.u64 %r65,0; .loc 1 53 10 mov.u32 %r45,1; .loc 1 52 9 mov.u64 %r52,1; .loc 1 56 16 mov.u64 %r73,%r65; bra $L6; $L11: mov.u64 %r65,%r53; $L6: st.u64 [%r42],%r73; .loc 1 57 19 ld.u64 %r23,[%r40]; .loc 1 57 17 st.u64 [%r32],%r23; .loc 1 58 19 ld.u64 %r75,[%r40+16]; add.u64 %r74,%r75,1; ld.u64 %r76,[%r40+8]; sub.u64 %r27,%r74,%r76; .loc 1 58 17 st.u64 [%r28],%r27; .loc 1 59 10 setp.le.s64 %r77,%r27,0; @ %r77 bra $L4; .loc 1 66 10 setp.eq.u64 %r78,%r23,%r52; .loc 1 67 16 selp.u32 %r45,%r45,0,%r78; .loc 1 69 13 mul.lo.u64 %r52,%r52,%r27; .loc 1 54 36 add.u64 %r53,%r65,1; .loc 1 54 3 add.u64 %r42,%r42,8; add.u64 %r40,%r40,24; add.u64 %r32,%r32,8; add.u64 %r28,%r28,8; setp.ne.u64 %r79,%r50,%r53; @ %r79 bra $L11; .loc 1 72 6 setp.ne.u32 %r80,%r45,0; @ %r80 bra $L4; .loc 1 76 13 mov.u64 call (%value_in),_gfortrani_xmallocarray78 7 ld.u64 %r57,[%r71]; .loc 1 79 11 ld.u64 %r54,[%frame]; .loc 1 82 9 setp.eq.u64 %r85,%r57,0; @ %r85 bra $L1; ld.u64 %r44,[%frame+240]; .loc 1 91 32 ld.u64 %r59,[%frame+120]; .loc 1 98 28 mul.lo.u64 %r48,%r59,%r54; mov.u64 %r47,%r70; setp.eq.u64 %r99,%r50,1; add.u64 %r103,%frame,120; .loc 1 95 20 mov.u64 %r104,0; $L10: .loc 1 85 17 ld.u8 %r86,[%r57]; st.u8 [%r47],%r86; .loc 1 87 11 add.u64 %r57,%r57,%r54; .loc 1 88 15 add.u64 %r44,%r44,1; .loc 1 91 13 setp.ne.u64 %r87,%r44,%r59; @ %r87 bra $L7; .loc 1 100 14 @ %r99 bra $L1; add.u64 %r60,%frame,248; add.u64 %r67,%frame,8; .loc 1 98 28 mov.u64 %r35,%r48; .loc 1 99 12 mov.u64 %r66,%r82; bra $L8; $L9: .loc 1 95 20 st.u64 [%r60],%r104; .loc 1 98 28 mul.lo.u64 %r35,%r38,%r37; .loc 1 99 12 add.u64 %r56,%r66,1; .loc 1 100 14 add.u64 %r60,%r60,8; add.u64 %r67,%r67,8; setp.eq.u64 %r91,%r65,%r66; @ %r91 bra $L1; mov.u64 %r66,%r56; $L8: .loc 1 107 23 ld.u64 %r92,[%r60]; add.u64 %r37,%r92,1; st.u64 [%r60],%r37; .loc 1 108 28 ld.u64 %r38,[%r67]; .loc 1 108 19 sub.u64 %r93,%r38,%r35; add.u64 %r57,%r57,%r93; .loc 1 91 32 shl.b64 %r95,%r66,3; add.u64 %r96,%r103,%r95; ld.u64 %r41,[%r96]; .loc 1 91 13 setp.eq.u64 %r97,%r37,%r41; @ %r97 bra $L9; .loc 1 95 20 mov.u64 %r44,0; $L7: add.u64 %r47,%r47,1; bra $L10; in_pack_i2.o/gfortrani_internal_pack_gfortrani_internal_pack_fortran/generated/in_pack_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarraygfortrani_internal_pack_gfortrani_internal_pack_216 .b8 %frame_ar[368u16 %r89; .reg .pred %r90; .reg .predu64 %r107; .reg .u64 %r108; mov.u64 %r73,%ar0; .loc 1 51 7 ld.s8 %r50,[%r73+28]; .loc 1 54 3 setp.gt.s64 %r74,%r50,0; @ %r74 bra $L2; $L4: .loc 1 73 18 ld.u64 %r72,[%r73]; bra $L1; $L2: add.u64 %r44,%frame,240; add.u64 %r69,%r73,40; mov.u64 %r35,%frame; add.u64 %r34,%frame,120; .loc 1 54 19 mov.u64 %r65,0; .loc 1 53 10 mov.u32 %r46,1; .loc 1 52 9 
mov.u64 %r53,1; .loc 1 56 16 mov.u64 %r75,%r65; bra $L6; $L11: mov.u64 %r65,%r54; $L6: st.u64 [%r44],%r75; .loc 1 57 19 ld.u64 %r23,[%r69]; .loc 1 57 17 st.u64 [%r35],%r23; .loc 1 58 19 ld.u64 %r77,[%r69+16]; add.u64 %r76,%r77,1; ld.u64 %r78,[%r69+8]; sub.u64 %r27,%r76,%r78; .loc 1 58 17 st.u64 [%r34],%r27; .loc 1 59 10 setp.le.s64 %r79,%r27,0; @ %r79 bra $L4; .loc 1 66 10 setp.eq.u64 %r80,%r23,%r53; .loc 1 67 16 selp.u32 %r46,%r46,0,%r80; .loc 1 69 13 mul.lo.u64 %r53,%r53,%r27; .loc 1 54 36 add.u64 %r54,%r65,1; .loc 1 54 3 add.u64 %r44,%r44,8; add.u64 %r69,%r69,24; add.u64 %r35,%r35,8; add.u64 %r34,%r34,8; setp.ne.u64 %r81,%r50,%r54; @ %r81 bra $L11; .loc 1 72 6 setp.ne.u32 %r82,%r46,0; @ %r82 bra $L4; .loc 1 76 13 mov.u64 %r84,call (%value_in),_gfortrani_xmallocarray78 7 ld.u64 %r58,[%r73]; .loc 1 79 11 ld.u64 %r55,[%frame]; .loc 1 82 9 setp.eq.u64 %r87,%r58,0; @ %r87 bra $L1; .loc 1 87 11 add.u64 %r33,%r55,%r55; ld.u64 %r45,[%frame+240]; .loc 1 91 32 ld.u64 %r59,[%frame+120]; .loc 1 98 28 mul.lo.u64 %r47,%r59,%r55; mov.u64 %r56,%r72; setp.eq.u64 %r103,%r50,1; .loc 1 99 12 mov.u64 %r106,1; add.u64 %r107,%frame,120; .loc 1 95 20 mov.u64 %r108,0; $L10: .loc 1 85 17 ld.u16 %r89,[%r58]; st.u16 [%r56],%r89; .loc 1 87 11 add.u64 %r58,%r58,%r33; .loc 1 88 15 add.u64 %r45,%r45,1; .loc 1 91 13 setp.ne.u64 %r90,%r45,%r59; @ %r90 bra $L7; .loc 1 100 14 @ %r103 bra $L1; add.u64 %r60,%frame,248; add.u64 %r68,%frame,8; .loc 1 98 28 mov.u64 %r37,%r47; .loc 1 99 12 mov.u64 %r67,%r106; bra $L8; $L9: .loc 1 95 20 st.u64 [%r60],%r108; .loc 1 98 28 mul.lo.u64 %r37,%r40,%r39; .loc 1 99 12 add.u64 %r57,%r67,1; .loc 1 100 14 add.u64 %r60,%r60,8; add.u64 %r68,%r68,8; setp.eq.u64 %r94,%r65,%r67; @ %r94 bra $L1; mov.u64 %r67,%r57; $L8: .loc 1 107 23 ld.u64 %r95,[%r60]; add.u64 %r39,%r95,1; st.u64 [%r60],%r39; .loc 1 108 28 ld.u64 %r40,[%r68]; .loc 1 108 19 sub.u64 %r96,%r40,%r37; add.u64 %r97,%r96,%r96; add.u64 %r58,%r58,%r97; .loc 1 91 32 shl.b64 %r99,%r67,3; add.u64 %r100,%r107,%r99; ld.u64 %r43,[%r100]; .loc 1 91 13 setp.eq.u64 %r101,%r39,%r43; @ %r101 bra $L9; .loc 1 95 20 mov.u64 %r45,0; $L7: add.u64 %r56,%r56,2; bra $L10; in_pack_i4.o/gfortrani_internal_pack_4gfortrani_internal_pack_4fortran/generated/in_pack_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarraygfortrani_internal_pack_4gfortrani_internal_pack_416 .b8 %frame_ar[368predpred %r99; .reg .predmov.u64 %r73,%ar0; .loc 1 51 7 ld.s8 %r50,[%r73+28]; .loc 1 54 3 setp.gt.s64 %r74,%r50,0; @ %r74 bra $L2; $L4: .loc 1 73 18 ld.u64 %r72,[%r73]; bra $L1; $L2: add.u64 %r44,%frame,240; add.u64 %r69,%r73,40; mov.u64 %r35,%frame; add.u64 %r34,%frame,120; .loc 1 54 19 mov.u64 %r65,0; .loc 1 53 10 mov.u32 %r46,1; .loc 1 52 9 mov.u64 %r53,1; .loc 1 56 16 mov.u64 %r75,%r65; bra $L6; $L11: mov.u64 %r65,%r54; $L6: st.u64 [%r44],%r75; .loc 1 57 19 ld.u64 %r23,[%r69]; .loc 1 57 17 st.u64 [%r35],%r23; .loc 1 58 19 ld.u64 %r77,[%r69+16]; add.u64 %r76,%r77,1; ld.u64 %r78,[%r69+8]; sub.u64 %r27,%r76,%r78; .loc 1 58 17 st.u64 [%r34],%r27; .loc 1 59 10 setp.le.s64 %r79,%r27,0; @ %r79 bra $L4; .loc 1 66 10 setp.eq.u64 %r80,%r23,%r53; .loc 1 67 16 selp.u32 %r46,%r46,0,%r80; .loc 1 69 13 mul.lo.u64 %r53,%r53,%r27; .loc 1 54 36 add.u64 %r54,%r65,1; .loc 1 54 3 add.u64 %r44,%r44,8; add.u64 %r69,%r69,24; add.u64 %r35,%r35,8; add.u64 %r34,%r34,8; setp.ne.u64 %r81,%r50,%r54; @ %r81 bra $L11; .loc 1 72 6 setp.ne.u32 %r82,%r46,0; @ %r82 bra $L4; .loc 1 76 13 mov.u64 %r8r84; call (%value_in),_gfortrani_xmallocarray78 7 ld.u64 %r58,[%r73]; .loc 1 79 11 ld.u64 
%r55,[%frame]; .loc 1 82 9 setp.eq.u64 %r87,%r58,0; @ %r87 bra $L1; .loc 1 87 11 shl.b64 %r33,%r55,2; ld.u64 %r45,[%frame+240]; .loc 1 91 32 ld.u64 %r59,[%frame+120]; .loc 1 98 28 mul.lo.u64 %r47,%r59,%r55; mov.u64 %r56,%r72; setp.eq.u64 %r101,%r50,1; .loc 1 99 12 mov.u64 %r104,1; add.u64 %r105,%frame,120; .loc 1 95 20 mov.u64 %r106,0; $L10: .loc 1 85 19 ld.u32 %r31,[%r58]; .loc 1 85 17 st.u32 [%r56],%r31; .loc 1 87 11 add.u64 %r58,%r58,%r33; .loc 1 88 15 add.u64 %r45,%r45,1; .loc 1 91 13 setp.ne.u64 %r88,%r45,%r59; @ %r88 bra $L7; .loc 1 100 14 @ %r101 bra $L1; add.u64 %r60,%frame,248; add.u64 %r68,%frame,8; .loc 1 98 28 mov.u64 %r37,%r47; .loc 1 99 12 mov.u64 %r67,%r104; bra $L8; $L9: .loc 1 95 20 st.u64 [%r60],%r106; .loc 1 98 28 mul.lo.u64 %r37,%r40,%r39; .loc 1 99 12 add.u64 %r57,%r67,1; .loc 1 100 14 add.u64 %r60,%r60,8; add.u64 %r68,%r68,8; setp.eq.u64 %r92,%r65,%r67; @ %r92 bra $L1; mov.u64 %r67,%r57; $L8: .loc 1 107 23 ld.u64 %r93,[%r60]; add.u64 %r39,%r93,1; st.u64 [%r60],%r39; .loc 1 108 28 ld.u64 %r40,[%r68]; .loc 1 108 19 sub.u64 %r94,%r40,%r37; shl.b64 %r95,%r94,2; add.u64 %r58,%r58,%r95; .loc 1 91 32 shl.b64 %r97,%r67,3; add.u64 %r98,%r105,%r97; ld.u64 %r43,[%r98]; .loc 1 91 13 setp.eq.u64 %r99,%r39,%r43; @ %r99 bra $L9; .loc 1 95 20 mov.u64 %r45,0; $L7: add.u64 %r56,%r56,4; bra $L10;gfortrani_internal_pack_8gfortrani_internal_pack_8fortran/generated/in_pack_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarraygfortrani_internal_pack_8gfortrani_internal_pack_816 .b8 %frame_ar[368predpred %r99; .reg .predmov.u64 %r73,%ar0; .loc 1 51 7 ld.s8 %r50,[%r73+28]; .loc 1 54 3 setp.gt.s64 %r74,%r50,0; @ %r74 bra $L2; $L4: .loc 1 73 18 ld.u64 %r72,[%r73]; bra $L1; $L2: add.u64 %r44,%frame,240; add.u64 %r69,%r73,40; mov.u64 %r35,%frame; add.u64 %r34,%frame,120; .loc 1 54 19 mov.u64 %r65,0; .loc 1 53 10 mov.u32 %r46,1; .loc 1 52 9 mov.u64 %r53,1; .loc 1 56 16 mov.u64 %r75,%r65; bra $L6; $L11: mov.u64 %r65,%r54; $L6: st.u64 [%r44],%r75; .loc 1 57 19 ld.u64 %r23,[%r69]; .loc 1 57 17 st.u64 [%r35],%r23; .loc 1 58 19 ld.u64 %r77,[%r69+16]; add.u64 %r76,%r77,1; ld.u64 %r78,[%r69+8]; sub.u64 %r27,%r76,%r78; .loc 1 58 17 st.u64 [%r34],%r27; .loc 1 59 10 setp.le.s64 %r79,%r27,0; @ %r79 bra $L4; .loc 1 66 10 setp.eq.u64 %r80,%r23,%r53; .loc 1 67 16 selp.u32 %r46,%r46,0,%r80; .loc 1 69 13 mul.lo.u64 %r53,%r53,%r27; .loc 1 54 36 add.u64 %r54,%r65,1; .loc 1 54 3 add.u64 %r44,%r44,8; add.u64 %r69,%r69,24; add.u64 %r35,%r35,8; add.u64 %r34,%r34,8; setp.ne.u64 %r81,%r50,%r54; @ %r81 bra $L11; .loc 1 72 6 setp.ne.u32 %r82,%r46,0; @ %r82 bra $L4; .loc 1 76 13 mov.u64 r84; call (%value_in),_gfortrani_xmallocarray78 7 ld.u64 %r58,[%r73]; .loc 1 79 11 ld.u64 %r55,[%frame]; .loc 1 82 9 setp.eq.u64 %r87,%r58,0; @ %r87 bra $L1; .loc 1 87 11 shl.b64 %r33,%r55,3; ld.u64 %r45,[%frame+240]; .loc 1 91 32 ld.u64 %r59,[%frame+120]; .loc 1 98 28 mul.lo.u64 %r47,%r59,%r55; mov.u64 %r56,%r72; setp.eq.u64 %r101,%r50,1; .loc 1 99 12 mov.u64 %r104,1; add.u64 %r105,%frame,120; .loc 1 95 20 mov.u64 %r106,0; $L10: .loc 1 85 19 ld.u64 %r31,[%r58]; .loc 1 85 17 st.u64 [%r56],%r31; .loc 1 87 11 add.u64 %r58,%r58,%r33; .loc 1 88 15 add.u64 %r45,%r45,1; .loc 1 91 13 setp.ne.u64 %r88,%r45,%r59; @ %r88 bra $L7; .loc 1 100 14 @ %r101 bra $L1; add.u64 %r60,%frame,248; add.u64 %r68,%frame,8; .loc 1 98 28 mov.u64 %r37,%r47; .loc 1 99 12 mov.u64 %r67,%r104; bra $L8; $L9: .loc 1 95 20 st.u64 [%r60],%r106; .loc 1 98 28 mul.lo.u64 %r37,%r40,%r39; .loc 1 99 12 add.u64 %r57,%r67,1; .loc 1 100 14 add.u64 
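// The in_pack_i1/i2/i4/i8 members in this region are the machine-generated bodies of _gfortrani_internal_pack_<N>: each checks whether the descriptor already describes contiguous storage (every stride equals the running product of extents so far); if so it returns the original data pointer, otherwise it obtains a contiguous buffer from _gfortrani_xmallocarray and gathers the elements with a rank-general odometer loop. A hedged C sketch of that pattern for 8-byte elements follows; desc_t and its fields are simplified stand-ins for the real GFC array descriptor, and plain malloc stands in for xmallocarray.

#include <stdint.h>
#include <stdlib.h>

typedef struct { int64_t lbound, ubound, stride; } dim_t;
typedef struct { int64_t *base; int rank; dim_t dim[7]; } desc_t;   /* simplified stand-in */

/* Sketch of internal_pack for 64-bit elements. */
static int64_t *pack_sketch (desc_t *d)
{
  int64_t count[7], extent[7], stride[7];
  int64_t ssize = 1;                  /* running product of extents = contiguous stride */
  int packed = 1;

  for (int n = 0; n < d->rank; n++)
    {
      count[n] = 0;
      stride[n] = d->dim[n].stride;
      extent[n] = d->dim[n].ubound + 1 - d->dim[n].lbound;
      if (extent[n] <= 0)
        return d->base;               /* empty array: nothing to pack */
      if (stride[n] != ssize)
        packed = 0;                   /* gap or permutation: a copy is needed */
      ssize *= extent[n];
    }
  if (packed)
    return d->base;                   /* already contiguous */

  int64_t *dest = malloc (ssize * sizeof *dest);   /* real code uses _gfortrani_xmallocarray */
  int64_t *src = d->base, *p = dest;

  while (src)
    {
      *p++ = *src;                    /* copy one element */
      src += stride[0];
      count[0]++;
      int n = 0;
      while (count[n] == extent[n])   /* odometer carry into the next dimension */
        {
          count[n] = 0;
          src -= stride[n] * extent[n];   /* rewind this dimension */
          if (++n == d->rank) { src = NULL; break; }
          count[n]++;
          src += stride[n];
        }
    }
  return dest;
}

// The i1/i2/i4/i16 and r4/r8/c4/c8 variants that follow differ only in the element width of the load/store pair and the pointer increment, exactly as the ld/st widths in the PTX show.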
%r60,%r60,8; add.u64 %r68,%r68,8; setp.eq.u64 %r92,%r65,%r67; @ %r92 bra $L1; mov.u64 %r67,%r57; $L8: .loc 1 107 23 ld.u64 %r93,[%r60]; add.u64 %r39,%r93,1; st.u64 [%r60],%r39; .loc 1 108 28 ld.u64 %r40,[%r68]; .loc 1 108 19 sub.u64 %r94,%r40,%r37; shl.b64 %r95,%r94,3; add.u64 %r58,%r58,%r95; .loc 1 91 32 shl.b64 %r97,%r67,3; add.u64 %r98,%r105,%r97; ld.u64 %r43,[%r98]; .loc 1 91 13 setp.eq.u64 %r99,%r39,%r43; @ %r99 bra $L9; .loc 1 95 20 mov.u64 %r45,0; $L7: add.u64 %r56,%r56,8; bra $L10;gfortrani_internal_pack_16gfortrani_internal_pack_16fortran/generated/in_pack_i16.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarraygfortrani_internal_pack_16gfortrani_internal_pack_1616 .b8 %frame_ar[368predpredmov.u64 %r73,%ar0; .loc 1 51 7 ld.s8 %r50,[%r73+28]; .loc 1 54 3 setp.gt.s64 %r74,%r50,0; @ %r74 bra $L2; $L4: .loc 1 73 18 ld.u64 %r72,[%r73]; bra $L1; $L2: add.u64 %r44,%frame,240; add.u64 %r69,%r73,40; mov.u64 %r35,%frame; add.u64 %r34,%frame,120; .loc 1 54 19 mov.u64 %r65,0; .loc 1 53 10 mov.u32 %r46,1; .loc 1 52 9 mov.u64 %r53,1; .loc 1 56 16 mov.u64 %r75,%r65; bra $L6; $L11: mov.u64 %r65,%r54; $L6: st.u64 [%r44],%r75; .loc 1 57 19 ld.u64 %r23,[%r69]; .loc 1 57 17 st.u64 [%r35],%r23; .loc 1 58 19 ld.u64 %r77,[%r69+16]; add.u64 %r76,%r77,1; ld.u64 %r78,[%r69+8]; sub.u64 %r27,%r76,%r78; .loc 1 58 17 st.u64 [%r34],%r27; .loc 1 59 10 setp.le.s64 %r79,%r27,0; @ %r79 bra $L4; .loc 1 66 10 setp.eq.u64 %r80,%r23,%r53; .loc 1 67 16 selp.u32 %r46,%r46,0,%r80; .loc 1 69 13 mul.lo.u64 %r53,%r53,%r27; .loc 1 54 36 add.u64 %r54,%r65,1; .loc 1 54 3 add.u64 %r44,%r44,8; add.u64 %r69,%r69,24; add.u64 %r35,%r35,8; add.u64 %r34,%r34,8; setp.ne.u64 %r81,%r50,%r54; @ %r81 bra $L11; .loc 1 72 6 setp.ne.u32 %r82,%r46,0; @ %r82 bra $L4; .loc 1 76 13r84; call (%value_in),_gfortrani_xmallocarray78 7 ld.u64 %r58,[%r73]; .loc 1 79 11 ld.u64 %r55,[%frame]; .loc 1 82 9 setp.eq.u64 %r87,%r58,0; @ %r87 bra $L1; .loc 1 87 11 shl.b64 %r33,%r55,4; ld.u64 %r45,[%frame+240]; .loc 1 91 32 ld.u64 %r59,[%frame+120]; .loc 1 98 28 mul.lo.u64 %r47,%r59,%r55; mov.u64 %r56,%r72; setp.eq.u64 %r105,%r50,1; .loc 1 99 12 mov.u64 %r108,1; add.u64 %r109,%frame,120; .loc 1 95 20 mov.u64 %r110,0; $L10: .loc 1 85 19 ld.u64 %r103,[%r58]; ld.u64 %r104,[%r58+8]; .loc 1 85 17 st.u64 [%r56],%r103; st.u64 [%r56+8],%r104; .loc 1 87 11 add.u64 %r58,%r58,%r33; .loc 1 88 15 add.u64 %r45,%r45,1; .loc 1 91 13 setp.ne.u64 %r90,%r45,%r59; @ %r90 bra $L7; .loc 1 100 14 @ %r105 bra $L1; add.u64 %r60,%frame,248; add.u64 %r68,%frame,8; .loc 1 98 28 mov.u64 %r37,%r47; .loc 1 99 12 mov.u64 %r67,%r108; bra $L8; $L9: .loc 1 95 20 st.u64 [%r60],%r110; .loc 1 98 28 mul.lo.u64 %r37,%r40,%r39; .loc 1 99 12 add.u64 %r57,%r67,1; .loc 1 100 14 add.u64 %r60,%r60,8; add.u64 %r68,%r68,8; setp.eq.u64 %r94,%r65,%r67; @ %r94 bra $L1; mov.u64 %r67,%r57; $L8: .loc 1 107 23 ld.u64 %r95,[%r60]; add.u64 %r39,%r95,1; st.u64 [%r60],%r39; .loc 1 108 28 ld.u64 %r40,[%r68]; .loc 1 108 19 sub.u64 %r96,%r40,%r37; shl.b64 %r97,%r96,4; add.u64 %r58,%r58,%r97; .loc 1 91 32 shl.b64 %r99,%r67,3; add.u64 %r100,%r109,%r99; ld.u64 %r43,[%r100]; .loc 1 91 13 setp.eq.u64 %r101,%r39,%r43; @ %r101 bra $L9; .loc 1 95 20 mov.u64 %r45,0; $L7: add.u64 %r56,%r56,16; bra $L10; in_pack_r4.o/gfortrani_internal_pack_r4gfortrani_internal_pack_r4fortran/generated/in_pack_r4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarraygfortrani_internal_pack_r4gfortrani_internal_pack_r416 .b8 %frame_ar[368predpred %r99; .reg .predmov.u64 %r73,%ar0; .loc 1 51 7 ld.s8 
%r50,[%r73+28]; .loc 1 54 3 setp.gt.s64 %r74,%r50,0; @ %r74 bra $L2; $L4: .loc 1 73 18 ld.u64 %r72,[%r73]; bra $L1; $L2: add.u64 %r44,%frame,240; add.u64 %r69,%r73,40; mov.u64 %r35,%frame; add.u64 %r34,%frame,120; .loc 1 54 19 mov.u64 %r65,0; .loc 1 53 10 mov.u32 %r46,1; .loc 1 52 9 mov.u64 %r53,1; .loc 1 56 16 mov.u64 %r75,%r65; bra $L6; $L11: mov.u64 %r65,%r54; $L6: st.u64 [%r44],%r75; .loc 1 57 19 ld.u64 %r23,[%r69]; .loc 1 57 17 st.u64 [%r35],%r23; .loc 1 58 19 ld.u64 %r77,[%r69+16]; add.u64 %r76,%r77,1; ld.u64 %r78,[%r69+8]; sub.u64 %r27,%r76,%r78; .loc 1 58 17 st.u64 [%r34],%r27; .loc 1 59 10 setp.le.s64 %r79,%r27,0; @ %r79 bra $L4; .loc 1 66 10 setp.eq.u64 %r80,%r23,%r53; .loc 1 67 16 selp.u32 %r46,%r46,0,%r80; .loc 1 69 13 mul.lo.u64 %r53,%r53,%r27; .loc 1 54 36 add.u64 %r54,%r65,1; .loc 1 54 3 add.u64 %r44,%r44,8; add.u64 %r69,%r69,24; add.u64 %r35,%r35,8; add.u64 %r34,%r34,8; setp.ne.u64 %r81,%r50,%r54; @ %r81 bra $L11; .loc 1 72 6 setp.ne.u32 %r82,%r46,0; @ %r82 bra $L4; .loc 1 76 13 mov.u64 %r8r84; call (%value_in),_gfortrani_xmallocarray78 7 ld.u64 %r58,[%r73]; .loc 1 79 11 ld.u64 %r55,[%frame]; .loc 1 82 9 setp.eq.u64 %r87,%r58,0; @ %r87 bra $L1; .loc 1 87 11 shl.b64 %r33,%r55,2; ld.u64 %r45,[%frame+240]; .loc 1 91 32 ld.u64 %r59,[%frame+120]; .loc 1 98 28 mul.lo.u64 %r47,%r59,%r55; mov.u64 %r56,%r72; setp.eq.u64 %r101,%r50,1; .loc 1 99 12 mov.u64 %r104,1; add.u64 %r105,%frame,120; .loc 1 95 20 mov.u64 %r106,0; $L10: .loc 1 85 19 ld.f32 %r31,[%r58]; .loc 1 85 17 st.f32 [%r56],%r31; .loc 1 87 11 add.u64 %r58,%r58,%r33; .loc 1 88 15 add.u64 %r45,%r45,1; .loc 1 91 13 setp.ne.u64 %r88,%r45,%r59; @ %r88 bra $L7; .loc 1 100 14 @ %r101 bra $L1; add.u64 %r60,%frame,248; add.u64 %r68,%frame,8; .loc 1 98 28 mov.u64 %r37,%r47; .loc 1 99 12 mov.u64 %r67,%r104; bra $L8; $L9: .loc 1 95 20 st.u64 [%r60],%r106; .loc 1 98 28 mul.lo.u64 %r37,%r40,%r39; .loc 1 99 12 add.u64 %r57,%r67,1; .loc 1 100 14 add.u64 %r60,%r60,8; add.u64 %r68,%r68,8; setp.eq.u64 %r92,%r65,%r67; @ %r92 bra $L1; mov.u64 %r67,%r57; $L8: .loc 1 107 23 ld.u64 %r93,[%r60]; add.u64 %r39,%r93,1; st.u64 [%r60],%r39; .loc 1 108 28 ld.u64 %r40,[%r68]; .loc 1 108 19 sub.u64 %r94,%r40,%r37; shl.b64 %r95,%r94,2; add.u64 %r58,%r58,%r95; .loc 1 91 32 shl.b64 %r97,%r67,3; add.u64 %r98,%r105,%r97; ld.u64 %r43,[%r98]; .loc 1 91 13 setp.eq.u64 %r99,%r39,%r43; @ %r99 bra $L9; .loc 1 95 20 mov.u64 %r45,0; $L7: add.u64 %r56,%r56,4; bra $L10;gfortrani_internal_pack_r8gfortrani_internal_pack_r8fortran/generated/in_pack_r8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarraygfortrani_internal_pack_r8gfortrani_internal_pack_r816 .b8 %frame_ar[368predpred %r99; .reg .predmov.u64 %r73,%ar0; .loc 1 51 7 ld.s8 %r50,[%r73+28]; .loc 1 54 3 setp.gt.s64 %r74,%r50,0; @ %r74 bra $L2; $L4: .loc 1 73 18 ld.u64 %r72,[%r73]; bra $L1; $L2: add.u64 %r44,%frame,240; add.u64 %r69,%r73,40; mov.u64 %r35,%frame; add.u64 %r34,%frame,120; .loc 1 54 19 mov.u64 %r65,0; .loc 1 53 10 mov.u32 %r46,1; .loc 1 52 9 mov.u64 %r53,1; .loc 1 56 16 mov.u64 %r75,%r65; bra $L6; $L11: mov.u64 %r65,%r54; $L6: st.u64 [%r44],%r75; .loc 1 57 19 ld.u64 %r23,[%r69]; .loc 1 57 17 st.u64 [%r35],%r23; .loc 1 58 19 ld.u64 %r77,[%r69+16]; add.u64 %r76,%r77,1; ld.u64 %r78,[%r69+8]; sub.u64 %r27,%r76,%r78; .loc 1 58 17 st.u64 [%r34],%r27; .loc 1 59 10 setp.le.s64 %r79,%r27,0; @ %r79 bra $L4; .loc 1 66 10 setp.eq.u64 %r80,%r23,%r53; .loc 1 67 16 selp.u32 %r46,%r46,0,%r80; .loc 1 69 13 mul.lo.u64 %r53,%r53,%r27; .loc 1 54 36 add.u64 %r54,%r65,1; .loc 1 54 3 add.u64 
%r44,%r44,8; add.u64 %r69,%r69,24; add.u64 %r35,%r35,8; add.u64 %r34,%r34,8; setp.ne.u64 %r81,%r50,%r54; @ %r81 bra $L11; .loc 1 72 6 setp.ne.u32 %r82,%r46,0; @ %r82 bra $L4; .loc 1 76 13 mov.u64 r84; call (%value_in),_gfortrani_xmallocarray78 7 ld.u64 %r58,[%r73]; .loc 1 79 11 ld.u64 %r55,[%frame]; .loc 1 82 9 setp.eq.u64 %r87,%r58,0; @ %r87 bra $L1; .loc 1 87 11 shl.b64 %r33,%r55,3; ld.u64 %r45,[%frame+240]; .loc 1 91 32 ld.u64 %r59,[%frame+120]; .loc 1 98 28 mul.lo.u64 %r47,%r59,%r55; mov.u64 %r56,%r72; setp.eq.u64 %r101,%r50,1; .loc 1 99 12 mov.u64 %r104,1; add.u64 %r105,%frame,120; .loc 1 95 20 mov.u64 %r106,0; $L10: .loc 1 85 19 ld.f64 %r31,[%r58]; .loc 1 85 17 st.f64 [%r56],%r31; .loc 1 87 11 add.u64 %r58,%r58,%r33; .loc 1 88 15 add.u64 %r45,%r45,1; .loc 1 91 13 setp.ne.u64 %r88,%r45,%r59; @ %r88 bra $L7; .loc 1 100 14 @ %r101 bra $L1; add.u64 %r60,%frame,248; add.u64 %r68,%frame,8; .loc 1 98 28 mov.u64 %r37,%r47; .loc 1 99 12 mov.u64 %r67,%r104; bra $L8; $L9: .loc 1 95 20 st.u64 [%r60],%r106; .loc 1 98 28 mul.lo.u64 %r37,%r40,%r39; .loc 1 99 12 add.u64 %r57,%r67,1; .loc 1 100 14 add.u64 %r60,%r60,8; add.u64 %r68,%r68,8; setp.eq.u64 %r92,%r65,%r67; @ %r92 bra $L1; mov.u64 %r67,%r57; $L8: .loc 1 107 23 ld.u64 %r93,[%r60]; add.u64 %r39,%r93,1; st.u64 [%r60],%r39; .loc 1 108 28 ld.u64 %r40,[%r68]; .loc 1 108 19 sub.u64 %r94,%r40,%r37; shl.b64 %r95,%r94,3; add.u64 %r58,%r58,%r95; .loc 1 91 32 shl.b64 %r97,%r67,3; add.u64 %r98,%r105,%r97; ld.u64 %r43,[%r98]; .loc 1 91 13 setp.eq.u64 %r99,%r39,%r43; @ %r99 bra $L9; .loc 1 95 20 mov.u64 %r45,0; $L7: add.u64 %r56,%r56,8; bra $L10;in_pack_r10in_pack_c4.o/gfortrani_internal_pack_c4gfortrani_internal_pack_c4fortran/generated/in_pack_c4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarraygfortrani_internal_pack_c4gfortrani_internal_pack_c416 .b8 %frame_ar[368.reg .u64 %r74; .reg .predu64 %r85; .reg .u64 %r86; .reg .pred %r88; .reg .pred %r89; .reg .predmov.u64 %r74,%ar0; .loc 1 51 7 ld.s8 %r51,[%r74+28]; .loc 1 54 3 setp.gt.s64 %r75,%r51,0; @ %r75 bra $L2; $L4: .loc 1 73 18 ld.u64 %r73,[%r74]; bra $L1; $L2: add.u64 %r45,%frame,240; add.u64 %r70,%r74,40; mov.u64 %r36,%frame; add.u64 %r35,%frame,120; .loc 1 54 19 mov.u64 %r66,0; .loc 1 53 10 mov.u32 %r47,1; .loc 1 52 9 mov.u64 %r54,1; .loc 1 56 16 mov.u64 %r76,%r66; bra $L6; $L11: mov.u64 %r66,%r55; $L6: st.u64 [%r45],%r76; .loc 1 57 19 ld.u64 %r23,[%r70]; .loc 1 57 17 st.u64 [%r36],%r23; .loc 1 58 19 ld.u64 %r78,[%r70+16]; add.u64 %r77,%r78,1; ld.u64 %r79,[%r70+8]; sub.u64 %r27,%r77,%r79; .loc 1 58 17 st.u64 [%r35],%r27; .loc 1 59 10 setp.le.s64 %r80,%r27,0; @ %r80 bra $L4; .loc 1 66 10 setp.eq.u64 %r81,%r23,%r54; .loc 1 67 16 selp.u32 %r47,%r47,0,%r81; .loc 1 69 13 mul.lo.u64 %r54,%r54,%r27; .loc 1 54 36 add.u64 %r55,%r66,1; .loc 1 54 3 add.u64 %r45,%r45,8; add.u64 %r70,%r70,24; add.u64 %r36,%r36,8; add.u64 %r35,%r35,8; setp.ne.u64 %r82,%r51,%r55; @ %r82 bra $L11; .loc 1 72 6 setp.ne.u32 %r83,%r47,0; @ %r83 bra $L4; .loc 1 76 13 mov.u64 _gfortrani_xmallocarray78 7 ld.u64 %r59,[%r74]; .loc 1 79 11 ld.u64 %r56,[%frame]; .loc 1 82 9 setp.eq.u64 %r88,%r59,0; @ %r88 bra $L1; .loc 1 87 11 shl.b64 %r34,%r56,3; ld.u64 %r46,[%frame+240]; .loc 1 91 32 ld.u64 %r60,[%frame+120]; .loc 1 98 28 mul.lo.u64 %r48,%r60,%r56; mov.u64 %r57,%r73; setp.eq.u64 %r102,%r51,1; .loc 1 99 12 mov.u64 %r105,1; add.u64 %r106,%frame,120; .loc 1 95 20 mov.u64 %r107,0; $L10: .loc 1 85 19 ld.f32 %r31,[%r59]; ld.f32 %r32,[%r59+4]; .loc 1 85 17 st.f32 [%r57],%r31; st.f32 [%r57+4],%r32; .loc 1 87 11 add.u64 
%r59,%r59,%r34; .loc 1 88 15 add.u64 %r46,%r46,1; .loc 1 91 13 setp.ne.u64 %r89,%r46,%r60; @ %r89 bra $L7; .loc 1 100 14 @ %r102 bra $L1; add.u64 %r61,%frame,248; add.u64 %r69,%frame,8; .loc 1 98 28 mov.u64 %r38,%r48; .loc 1 99 12 mov.u64 %r68,%r105; bra $L8; $L9: .loc 1 95 20 st.u64 [%r61],%r107; .loc 1 98 28 mul.lo.u64 %r38,%r41,%r40; .loc 1 99 12 add.u64 %r58,%r68,1; .loc 1 100 14 add.u64 %r61,%r61,8; add.u64 %r69,%r69,8; setp.eq.u64 %r93,%r66,%r68; @ %r93 bra $L1; mov.u64 %r68,%r58; $L8: .loc 1 107 23 ld.u64 %r94,[%r61]; add.u64 %r40,%r94,1; st.u64 [%r61],%r40; .loc 1 108 28 ld.u64 %r41,[%r69]; .loc 1 108 19 sub.u64 %r95,%r41,%r38; shl.b64 %r96,%r95,3; add.u64 %r59,%r59,%r96; .loc 1 91 32 shl.b64 %r98,%r68,3; add.u64 %r99,%r106,%r98; ld.u64 %r44,[%r99]; .loc 1 91 13 setp.eq.u64 %r100,%r40,%r44; @ %r100 bra $L9; .loc 1 95 20 mov.u64 %r46,0; $L7: add.u64 %r57,%r57,8; bra $L10; in_pack_c8.o/gfortrani_internal_pack_c8gfortrani_internal_pack_c8fortran/generated/in_pack_c8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarr_gfortrani_xmallocarraygfortrani_internal_pack_c8gfortrani_internal_pack_c816 .b8 %frame_ar[368.reg .u64 %r74; .reg .predu64 %r85; .reg .u64 %r86; .reg .pred %r88; .reg .pred %r89; .reg .predmov.u64 %r74,%ar0; .loc 1 51 7 ld.s8 %r51,[%r74+28]; .loc 1 54 3 setp.gt.s64 %r75,%r51,0; @ %r75 bra $L2; $L4: .loc 1 73 18 ld.u64 %r73,[%r74]; bra $L1; $L2: add.u64 %r45,%frame,240; add.u64 %r70,%r74,40; mov.u64 %r36,%frame; add.u64 %r35,%frame,120; .loc 1 54 19 mov.u64 %r66,0; .loc 1 53 10 mov.u32 %r47,1; .loc 1 52 9 mov.u64 %r54,1; .loc 1 56 16 mov.u64 %r76,%r66; bra $L6; $L11: mov.u64 %r66,%r55; $L6: st.u64 [%r45],%r76; .loc 1 57 19 ld.u64 %r23,[%r70]; .loc 1 57 17 st.u64 [%r36],%r23; .loc 1 58 19 ld.u64 %r78,[%r70+16]; add.u64 %r77,%r78,1; ld.u64 %r79,[%r70+8]; sub.u64 %r27,%r77,%r79; .loc 1 58 17 st.u64 [%r35],%r27; .loc 1 59 10 setp.le.s64 %r80,%r27,0; @ %r80 bra $L4; .loc 1 66 10 setp.eq.u64 %r81,%r23,%r54; .loc 1 67 16 selp.u32 %r47,%r47,0,%r81; .loc 1 69 13 mul.lo.u64 %r54,%r54,%r27; .loc 1 54 36 add.u64 %r55,%r66,1; .loc 1 54 3 add.u64 %r45,%r45,8; add.u64 %r70,%r70,24; add.u64 %r36,%r36,8; add.u64 %r35,%r35,8; setp.ne.u64 %r82,%r51,%r55; @ %r82 bra $L11; .loc 1 72 6 setp.ne.u32 %r83,%r47,0; @ %r83 bra $L4; .loc 1 76 13_gfortrani_xmallocarray78 7 ld.u64 %r59,[%r74]; .loc 1 79 11 ld.u64 %r56,[%frame]; .loc 1 82 9 setp.eq.u64 %r88,%r59,0; @ %r88 bra $L1; .loc 1 87 11 shl.b64 %r34,%r56,4; ld.u64 %r46,[%frame+240]; .loc 1 91 32 ld.u64 %r60,[%frame+120]; .loc 1 98 28 mul.lo.u64 %r48,%r60,%r56; mov.u64 %r57,%r73; setp.eq.u64 %r102,%r51,1; .loc 1 99 12 mov.u64 %r105,1; add.u64 %r106,%frame,120; .loc 1 95 20 mov.u64 %r107,0; $L10: .loc 1 85 19 ld.f64 %r31,[%r59]; ld.f64 %r32,[%r59+8]; .loc 1 85 17 st.f64 [%r57],%r31; st.f64 [%r57+8],%r32; .loc 1 87 11 add.u64 %r59,%r59,%r34; .loc 1 88 15 add.u64 %r46,%r46,1; .loc 1 91 13 setp.ne.u64 %r89,%r46,%r60; @ %r89 bra $L7; .loc 1 100 14 @ %r102 bra $L1; add.u64 %r61,%frame,248; add.u64 %r69,%frame,8; .loc 1 98 28 mov.u64 %r38,%r48; .loc 1 99 12 mov.u64 %r68,%r105; bra $L8; $L9: .loc 1 95 20 st.u64 [%r61],%r107; .loc 1 98 28 mul.lo.u64 %r38,%r41,%r40; .loc 1 99 12 add.u64 %r58,%r68,1; .loc 1 100 14 add.u64 %r61,%r61,8; add.u64 %r69,%r69,8; setp.eq.u64 %r93,%r66,%r68; @ %r93 bra $L1; mov.u64 %r68,%r58; $L8: .loc 1 107 23 ld.u64 %r94,[%r61]; add.u64 %r40,%r94,1; st.u64 [%r61],%r40; .loc 1 108 28 ld.u64 %r41,[%r69]; .loc 1 108 19 sub.u64 %r95,%r41,%r38; shl.b64 %r96,%r95,4; add.u64 %r59,%r59,%r96; .loc 1 91 32 shl.b64 %r98,%r68,3; add.u64 
%r99,%r106,%r98; ld.u64 %r44,[%r99]; .loc 1 91 13 setp.eq.u64 %r100,%r40,%r44; @ %r100 bra $L9; .loc 1 95 20 mov.u64 %r46,0; $L7: add.u64 %r57,%r57,16; bra $L10; in_pack_c10in_unpack_i1.o/ 1622802203gfortrani_internal_unpack_1 .visible .func _gfortrani_internal_unpack_1fortran/generated/in_unpack_i1.c"F: _gfortrani_internal_unpack_1 .visible .func _gfortrani_internal_unpack_1864pred.reg .u64mov.u64 %r69,%ar0; mov.u64 %r70,%ar1; .loc 1 43 8 ld.u64 %r55,[%r69]; .loc 1 44 11 set.u32.eq.u64 %r72,%r55,%r70; neg.s32 %r73,%r72; .loc 1 44 22 set.u32.eq.u64 %r75,%r70,0; neg.s32 %r76,%r75; .loc 1 44 19 cvt.u16.u32 %r78,%r73; cvt.u16.u32 %r79,%r76; or.b16 %r77,%r78,%r79; .loc 1 44 61; .loc 1 47 7 ld.s8 %r50,[%r69+28]; .loc 1 49 3 setp.le.s64 %r83,%r50,0; @ %r83 bra $L14; add.u64 %r48,%r69,40; mov.u64 %r58,0; .loc 1 49 19 mov.u64 %r62,%r58; .loc 1 48 9 mov.u64 %r43,1; add.u64 %r118,%frame,240; add.u64 %r116,%frame,120; .loc 1 51 16 mov.u64 %r86,%r58; bra $L5; $L16: mov.u64 %r62,%r56; $L5: add.u64 %r85,%r118,%r58; st.u64 [%r85],%r86; .loc 1 52 19 ld.u64 %r26,[%r48]; .loc 1 52 17 add.u64 %r87,%frame,%r58; st.u64 [%r87],%r26; .loc 1 53 19 ld.u64 %r89,[%r48+16]; add.u64 %r88,%r89,1; ld.u64 %r90,[%r48+8]; sub.u64 %r30,%r88,%r90; .loc 1 53 17 add.u64 %r92,%r116,%r58; st.u64 [%r92],%r30; .loc 1 54 10 setp.le.s64 %r93,%r30,0; @ %r93 bra $L1; .loc 1 57 10 setp.ne.u64 %r94,%r26,%r43; @ %r94 bra $L15; .loc 1 58 8 mul.lo.u64 %r43,%r43,%r30; bra $L4; $L15: .loc 1 60 8 mov.u64 %r43,0; $L4: .loc 1 49 36 add.u64 %r56,%r62,1; .loc 1 49 3 add.u64 %r48,%r48,24; add.u64 %r58,%r58,8; setp.ne.u64 %r95,%r50,%r56; @ %r95 bra $L16; .loc 1 63 6 setp.eq.u64 %r96,%r43,0; @ %r96 bra $L6; .loc 1 65 7 mov.u64 %r68,%r43; bra $L3; $L14: .loc 1 49 3 mov.u64 %r68,1; $L3: .loc 1 65 7103,[%value_in]; } .loc 1 66 7 bra $L1; $L6: .loc 1 69 11 ld.u64 %r52,[%frame]; .loc 1 71 9 setp.eq.u64 %r105,%r55,0; @ %r105 bra $L1; ld.u64 %r47,[%frame+240]; .loc 1 80 32 ld.u64 %r59,[%frame+120]; .loc 1 87 29 mul.lo.u64 %r44,%r59,%r52; mov.u64 %r45,%r70; setp.eq.u64 %r117,%r50,1; .loc 1 88 12 mov.u64 %r121,1; .loc 1 84 20 mov.u64 %r122,%r43; $L13: .loc 1 74 13 ld.u8 %r106,[%r45]; st.u8 [%r55],%r106; .loc 1 76 12 add.u64 %r55,%r55,%r52; .loc 1 77 15 add.u64 %r47,%r47,1; .loc 1 80 13 setp.ne.u64 %r107,%r47,%r59; @ %r107 bra $L8; .loc 1 89 14 @ ! 
%r117 bra $L22; bra $L1; $L12: .loc 1 84 20 st.u64 [%r49],%r122; .loc 1 87 29 mul.lo.u64 %r37,%r40,%r39; .loc 1 88 12 add.u64 %r54,%r63,1; .loc 1 89 14 add.u64 %r49,%r49,8; add.u64 %r65,%r65,8; add.u64 %r64,%r64,8; setp.ne.u64 %r110,%r62,%r63; @ %r110 bra $L17; bra $L1; $L22: add.u64 %r49,%frame,248; add.u64 %r65,%frame,8; add.u64 %r64,%frame,128; .loc 1 87 29 mov.u64 %r37,%r44; .loc 1 88 12 mov.u64 %r63,%r121; bra $L11; $L17: mov.u64 %r63,%r54; $L11: .loc 1 96 23 ld.u64 %r113,[%r49]; add.u64 %r39,%r113,1; st.u64 [%r49],%r39; .loc 1 97 29 ld.u64 %r40,[%r65]; .loc 1 97 20 sub.u64 %r114,%r40,%r37; add.u64 %r55,%r55,%r114; .loc 1 80 32 ld.u64 %r42,[%r64]; .loc 1 80 13 setp.eq.u64 %r115,%r39,%r42; @ %r115 bra $L12; .loc 1 84 20 mov.u64 %r47,%r43; $L8: add.u64 %r45,%r45,1; bra $L13; $L1: .loc 1 101 1 ret; } in_unpack_i2.o/ 1622802203gfortrani_internal_unpack_2 .visible .func _gfortrani_internal_unpack_fortran/generated/in_unpack_i2.c"F: _gfortrani_internal_unpack_2 .visible .func _gfortrani_internal_unpack_8predpred %r99predmov.u64 %r72,%ar0; mov.u64 %r73,%ar1; .loc 1 43 8 ld.u64 %r55,[%r72]; .loc 1 44 11 set.u32.eq.u64 %r75,%r55,%r73; neg.s32 %r76,%r75; .loc 1 44 22 set.u32.eq.u64 %r78,%r73,0; neg.s32 %r79,%r78; .loc 1 44 19 cvt.u16.u32 %r81,%r76; cvt.u16.u32 %r82,%r79; or.b16 %r80,%r81,%r82; .loc 1 44 6@ %r85 bra $L1; .loc 1 47 7 ld.s8 %r49,[%r72+28]; .loc 1 49 3 setp.le.s64 %r86,%r49,0; @ %r86 bra $L14; add.u64 %r35,%r72,40; mov.u64 %r47,0; .loc 1 49 19 mov.u64 %r63,%r47; .loc 1 48 9 mov.u64 %r43,1; add.u64 %r124,%frame,240; add.u64 %r122,%frame,120; .loc 1 51 16 mov.u64 %r89,%r47; bra $L5; $L16: mov.u64 %r63,%r56; $L5: add.u64 %r88,%r124,%r47; st.u64 [%r88],%r89; .loc 1 52 19 ld.u64 %r26,[%r35]; .loc 1 52 17 add.u64 %r90,%frame,%r47; st.u64 [%r90],%r26; .loc 1 53 19 ld.u64 %r92,[%r35+16]; add.u64 %r91,%r92,1; ld.u64 %r93,[%r35+8]; sub.u64 %r30,%r91,%r93; .loc 1 53 17 add.u64 %r95,%r122,%r47; st.u64 [%r95],%r30; .loc 1 54 10 setp.le.s64 %r96,%r30,0; @ %r96 bra $L1; .loc 1 57 10 setp.ne.u64 %r97,%r26,%r43; @ %r97 bra $L15; .loc 1 58 8 mul.lo.u64 %r43,%r43,%r30; bra $L4; $L15: .loc 1 60 8 mov.u64 %r43,0; $L4: .loc 1 49 36 add.u64 %r56,%r63,1; .loc 1 49 3 add.u64 %r35,%r35,24; add.u64 %r47,%r47,8; setp.ne.u64 %r98,%r49,%r56; @ %r98 bra $L16; .loc 1 63 6 setp.eq.u64 %r99,%r43,0; @ %r99 bra $L6; .loc 1 65 7 add.u64 %r71,%r43,%r43; bra $L3; $L14: .loc 1 49 3 mov.u64 %r71,2; $L3: .loc 1 65 7107,[%value_in]; } .loc 1 66 7 bra $L1; $L6: .loc 1 69 11 ld.u64 %r52,[%frame]; .loc 1 71 9 setp.eq.u64 %r109,%r55,0; @ %r109 bra $L1; .loc 1 76 12 add.u64 %r34,%r52,%r52; ld.u64 %r44,[%frame+240]; .loc 1 80 32 ld.u64 %r59,[%frame+120]; .loc 1 87 29 mul.lo.u64 %r31,%r59,%r52; mov.u64 %r53,%r73; setp.eq.u64 %r123,%r49,1; .loc 1 88 12 mov.u64 %r127,1; .loc 1 84 20 mov.u64 %r128,%r43; $L13: .loc 1 74 13 ld.u16 %r111,[%r53]; st.u16 [%r55],%r111; .loc 1 76 12 add.u64 %r55,%r55,%r34; .loc 1 77 15 add.u64 %r44,%r44,1; .loc 1 80 13 setp.ne.u64 %r112,%r44,%r59; @ %r112 bra $L8; .loc 1 89 14 @ ! 
%r123 bra $L22; bra $L1; $L12: .loc 1 84 20 st.u64 [%r48],%r128; .loc 1 87 29 mul.lo.u64 %r37,%r40,%r39; .loc 1 88 12 add.u64 %r54,%r64,1; .loc 1 89 14 add.u64 %r48,%r48,8; add.u64 %r67,%r67,8; add.u64 %r66,%r66,8; setp.ne.u64 %r115,%r63,%r64; @ %r115 bra $L17; bra $L1; $L22: add.u64 %r48,%frame,248; add.u64 %r67,%frame,8; add.u64 %r66,%frame,128; .loc 1 87 29 mov.u64 %r37,%r31; .loc 1 88 12 mov.u64 %r64,%r127; bra $L11; $L17: mov.u64 %r64,%r54; $L11: .loc 1 96 23 ld.u64 %r118,[%r48]; add.u64 %r39,%r118,1; st.u64 [%r48],%r39; .loc 1 97 29 ld.u64 %r40,[%r67]; .loc 1 97 20 sub.u64 %r119,%r40,%r37; add.u64 %r120,%r119,%r119; add.u64 %r55,%r55,%r120; .loc 1 80 32 ld.u64 %r42,[%r66]; .loc 1 80 13 setp.eq.u64 %r121,%r39,%r42; @ %r121 bra $L12; .loc 1 84 20 mov.u64 %r44,%r43; $L8: add.u64 %r53,%r53,2; bra $L13; $L1: .loc 1 101 1 ret; } in_unpack_i4.o/ 1622802203gfortrani_internal_unpack_4 .visible .func _gfortrani_internal_unpack_4fortran/generated/in_unpack_i4.c"F: _gfortrani_internal_unpack_4 .visible .func _gfortrani_internal_unpack_48616pred %r99predmov.u64 %r72,%ar0; mov.u64 %r73,%ar1; .loc 1 43 8 ld.u64 %r55,[%r72]; .loc 1 44 11 set.u32.eq.u64 %r75,%r55,%r73; neg.s32 %r76,%r75; .loc 1 44 22 set.u32.eq.u64 %r78,%r73,0; neg.s32 %r79,%r78; .loc 1 44 19 cvt.u16.u32 %r81,%r76; cvt.u16.u32 %r82,%r79; or.b16 %r80,%r81,%r82; .loc 1 44 6@ %r85 bra $L1; .loc 1 47 7 ld.s8 %r49,[%r72+28]; .loc 1 49 3 setp.le.s64 %r86,%r49,0; @ %r86 bra $L14; add.u64 %r35,%r72,40; mov.u64 %r47,0; .loc 1 49 19 mov.u64 %r63,%r47; .loc 1 48 9 mov.u64 %r43,1; add.u64 %r121,%frame,240; add.u64 %r119,%frame,120; .loc 1 51 16 mov.u64 %r89,%r47; bra $L5; $L16: mov.u64 %r63,%r56; $L5: add.u64 %r88,%r121,%r47; st.u64 [%r88],%r89; .loc 1 52 19 ld.u64 %r26,[%r35]; .loc 1 52 17 add.u64 %r90,%frame,%r47; st.u64 [%r90],%r26; .loc 1 53 19 ld.u64 %r92,[%r35+16]; add.u64 %r91,%r92,1; ld.u64 %r93,[%r35+8]; sub.u64 %r30,%r91,%r93; .loc 1 53 17 add.u64 %r95,%r119,%r47; st.u64 [%r95],%r30; .loc 1 54 10 setp.le.s64 %r96,%r30,0; @ %r96 bra $L1; .loc 1 57 10 setp.ne.u64 %r97,%r26,%r43; @ %r97 bra $L15; .loc 1 58 8 mul.lo.u64 %r43,%r43,%r30; bra $L4; $L15: .loc 1 60 8 mov.u64 %r43,0; $L4: .loc 1 49 36 add.u64 %r56,%r63,1; .loc 1 49 3 add.u64 %r35,%r35,24; add.u64 %r47,%r47,8; setp.ne.u64 %r98,%r49,%r56; @ %r98 bra $L16; .loc 1 63 6 setp.eq.u64 %r99,%r43,0; @ %r99 bra $L6; .loc 1 65 7 shl.b64 %r71,%r43,2; bra $L3; $L14: .loc 1 49 3 mov.u64 %r71,4; $L3: .loc 1 65 7106,[%value_in]; } .loc 1 66 7 bra $L1; $L6: .loc 1 69 11 ld.u64 %r52,[%frame]; .loc 1 71 9 setp.eq.u64 %r108,%r55,0; @ %r108 bra $L1; .loc 1 76 12 shl.b64 %r34,%r52,2; ld.u64 %r44,[%frame+240]; .loc 1 80 32 ld.u64 %r59,[%frame+120]; .loc 1 87 29 mul.lo.u64 %r31,%r59,%r52; mov.u64 %r53,%r73; setp.eq.u64 %r120,%r49,1; .loc 1 88 12 mov.u64 %r124,1; .loc 1 84 20 mov.u64 %r125,%r43; $L13: .loc 1 74 15 ld.u32 %r32,[%r53]; .loc 1 74 13 st.u32 [%r55],%r32; .loc 1 76 12 add.u64 %r55,%r55,%r34; .loc 1 77 15 add.u64 %r44,%r44,1; .loc 1 80 13 setp.ne.u64 %r109,%r44,%r59; @ %r109 bra $L8; .loc 1 89 14 @ ! 
%r120 bra $L22; bra $L1; $L12: .loc 1 84 20 st.u64 [%r48],%r125; .loc 1 87 29 mul.lo.u64 %r37,%r40,%r39; .loc 1 88 12 add.u64 %r54,%r64,1; .loc 1 89 14 add.u64 %r48,%r48,8; add.u64 %r67,%r67,8; add.u64 %r66,%r66,8; setp.ne.u64 %r112,%r63,%r64; @ %r112 bra $L17; bra $L1; $L22: add.u64 %r48,%frame,248; add.u64 %r67,%frame,8; add.u64 %r66,%frame,128; .loc 1 87 29 mov.u64 %r37,%r31; .loc 1 88 12 mov.u64 %r64,%r124; bra $L11; $L17: mov.u64 %r64,%r54; $L11: .loc 1 96 23 ld.u64 %r115,[%r48]; add.u64 %r39,%r115,1; st.u64 [%r48],%r39; .loc 1 97 29 ld.u64 %r40,[%r67]; .loc 1 97 20 sub.u64 %r116,%r40,%r37; shl.b64 %r117,%r116,2; add.u64 %r55,%r55,%r117; .loc 1 80 32 ld.u64 %r42,[%r66]; .loc 1 80 13 setp.eq.u64 %r118,%r39,%r42; @ %r118 bra $L12; .loc 1 84 20 mov.u64 %r44,%r43; $L8: add.u64 %r53,%r53,4; bra $L13; $L1: .loc 1 101 1 ret; } in_unpack_i8.o/ 1622802203gfortrani_internal_unpack_8 .visible .func _gfortrani_internal_unpack_8fortran/generated/in_unpack_i8.c"F: _gfortrani_internal_unpack_8 .visible .func _gfortrani_internal_unpack_8816pred %r99predmov.u64 %r72,%ar0; mov.u64 %r73,%ar1; .loc 1 43 8 ld.u64 %r55,[%r72]; .loc 1 44 11 set.u32.eq.u64 %r75,%r55,%r73; neg.s32 %r76,%r75; .loc 1 44 22 set.u32.eq.u64 %r78,%r73,0; neg.s32 %r79,%r78; .loc 1 44 19 cvt.u16.u32 %r81,%r76; cvt.u16.u32 %r82,%r79; or.b16 %r80,%r81,%r82; .loc 1 44 6@ %r85 bra $L1; .loc 1 47 7 ld.s8 %r49,[%r72+28]; .loc 1 49 3 setp.le.s64 %r86,%r49,0; @ %r86 bra $L14; add.u64 %r35,%r72,40; mov.u64 %r47,0; .loc 1 49 19 mov.u64 %r63,%r47; .loc 1 48 9 mov.u64 %r43,1; add.u64 %r121,%frame,240; add.u64 %r119,%frame,120; .loc 1 51 16 mov.u64 %r89,%r47; bra $L5; $L16: mov.u64 %r63,%r56; $L5: add.u64 %r88,%r121,%r47; st.u64 [%r88],%r89; .loc 1 52 19 ld.u64 %r26,[%r35]; .loc 1 52 17 add.u64 %r90,%frame,%r47; st.u64 [%r90],%r26; .loc 1 53 19 ld.u64 %r92,[%r35+16]; add.u64 %r91,%r92,1; ld.u64 %r93,[%r35+8]; sub.u64 %r30,%r91,%r93; .loc 1 53 17 add.u64 %r95,%r119,%r47; st.u64 [%r95],%r30; .loc 1 54 10 setp.le.s64 %r96,%r30,0; @ %r96 bra $L1; .loc 1 57 10 setp.ne.u64 %r97,%r26,%r43; @ %r97 bra $L15; .loc 1 58 8 mul.lo.u64 %r43,%r43,%r30; bra $L4; $L15: .loc 1 60 8 mov.u64 %r43,0; $L4: .loc 1 49 36 add.u64 %r56,%r63,1; .loc 1 49 3 add.u64 %r35,%r35,24; add.u64 %r47,%r47,8; setp.ne.u64 %r98,%r49,%r56; @ %r98 bra $L16; .loc 1 63 6 setp.eq.u64 %r99,%r43,0; @ %r99 bra $L6; .loc 1 65 7 shl.b64 %r71,%r43,3; bra $L3; $L14: .loc 1 49 3 mov.u64 %r71,8; $L3: .loc 1 65 7106,[%value_in]; } .loc 1 66 7 bra $L1; $L6: .loc 1 69 11 ld.u64 %r52,[%frame]; .loc 1 71 9 setp.eq.u64 %r108,%r55,0; @ %r108 bra $L1; .loc 1 76 12 shl.b64 %r34,%r52,3; ld.u64 %r44,[%frame+240]; .loc 1 80 32 ld.u64 %r59,[%frame+120]; .loc 1 87 29 mul.lo.u64 %r31,%r59,%r52; mov.u64 %r53,%r73; setp.eq.u64 %r120,%r49,1; .loc 1 88 12 mov.u64 %r124,1; .loc 1 84 20 mov.u64 %r125,%r43; $L13: .loc 1 74 15 ld.u64 %r32,[%r53]; .loc 1 74 13 st.u64 [%r55],%r32; .loc 1 76 12 add.u64 %r55,%r55,%r34; .loc 1 77 15 add.u64 %r44,%r44,1; .loc 1 80 13 setp.ne.u64 %r109,%r44,%r59; @ %r109 bra $L8; .loc 1 89 14 @ ! 
%r120 bra $L22; bra $L1; $L12: .loc 1 84 20 st.u64 [%r48],%r125; .loc 1 87 29 mul.lo.u64 %r37,%r40,%r39; .loc 1 88 12 add.u64 %r54,%r64,1; .loc 1 89 14 add.u64 %r48,%r48,8; add.u64 %r67,%r67,8; add.u64 %r66,%r66,8; setp.ne.u64 %r112,%r63,%r64; @ %r112 bra $L17; bra $L1; $L22: add.u64 %r48,%frame,248; add.u64 %r67,%frame,8; add.u64 %r66,%frame,128; .loc 1 87 29 mov.u64 %r37,%r31; .loc 1 88 12 mov.u64 %r64,%r124; bra $L11; $L17: mov.u64 %r64,%r54; $L11: .loc 1 96 23 ld.u64 %r115,[%r48]; add.u64 %r39,%r115,1; st.u64 [%r48],%r39; .loc 1 97 29 ld.u64 %r40,[%r67]; .loc 1 97 20 sub.u64 %r116,%r40,%r37; shl.b64 %r117,%r116,3; add.u64 %r55,%r55,%r117; .loc 1 80 32 ld.u64 %r42,[%r66]; .loc 1 80 13 setp.eq.u64 %r118,%r39,%r42; @ %r118 bra $L12; .loc 1 84 20 mov.u64 %r44,%r43; $L8: add.u64 %r53,%r53,8; bra $L13; $L1: .loc 1 101 1 ret; } in_unpack_i16.o/1622802203gfortrani_internal_unpack_16 .visible .func _gfortrani_internal_unpack_16fortran/generated/in_unpack_i16.c"F: _gfortrani_internal_unpack_16 .visible .func _gfortrani_internal_unpack_168predpred %r99; .reg .u64 %r106; .reg .predpredmov.u64 %r72,%ar0; mov.u64 %r73,%ar1; .loc 1 43 8 ld.u64 %r55,[%r72]; .loc 1 44 11 set.u32.eq.u64 %r75,%r55,%r73; neg.s32 %r76,%r75; .loc 1 44 22 set.u32.eq.u64 %r78,%r73,0; neg.s32 %r79,%r78; .loc 1 44 19 cvt.u16.u32 %r81,%r76; cvt.u16.u32 %r82,%r79; or.b16 %r80,%r81,%r82; .loc 1 44 6@ %r85 bra $L1; .loc 1 47 7 ld.s8 %r49,[%r72+28]; .loc 1 49 3 setp.le.s64 %r86,%r49,0; @ %r86 bra $L14; add.u64 %r35,%r72,40; mov.u64 %r47,0; .loc 1 49 19 mov.u64 %r63,%r47; .loc 1 48 9 mov.u64 %r43,1; add.u64 %r123,%frame,240; add.u64 %r124,%frame,120; .loc 1 51 16 mov.u64 %r89,%r47; bra $L5; $L16: mov.u64 %r63,%r56; $L5: add.u64 %r88,%r123,%r47; st.u64 [%r88],%r89; .loc 1 52 19 ld.u64 %r26,[%r35]; .loc 1 52 17 add.u64 %r90,%frame,%r47; st.u64 [%r90],%r26; .loc 1 53 19 ld.u64 %r92,[%r35+16]; add.u64 %r91,%r92,1; ld.u64 %r93,[%r35+8]; sub.u64 %r30,%r91,%r93; .loc 1 53 17 add.u64 %r95,%r124,%r47; st.u64 [%r95],%r30; .loc 1 54 10 setp.le.s64 %r96,%r30,0; @ %r96 bra $L1; .loc 1 57 10 setp.ne.u64 %r97,%r26,%r43; @ %r97 bra $L15; .loc 1 58 8 mul.lo.u64 %r43,%r43,%r30; bra $L4; $L15: .loc 1 60 8 mov.u64 %r43,0; $L4: .loc 1 49 36 add.u64 %r56,%r63,1; .loc 1 49 3 add.u64 %r35,%r35,24; add.u64 %r47,%r47,8; setp.ne.u64 %r98,%r49,%r56; @ %r98 bra $L16; .loc 1 63 6 setp.eq.u64 %r99,%r43,0; @ %r99 bra $L6; .loc 1 65 7 shl.b64 %r71,%r43,4; bra $L3; $L14: .loc 1 49 3 mov.u64 %r71,16; $L3: .loc 1 65 7106,[%value_in]; } .loc 1 66 7 bra $L1; $L6: .loc 1 69 11 ld.u64 %r52,[%frame]; .loc 1 71 9 setp.eq.u64 %r108,%r55,0; @ %r108 bra $L1; .loc 1 76 12 shl.b64 %r34,%r52,4; ld.u64 %r44,[%frame+240]; .loc 1 80 32 ld.u64 %r59,[%frame+120]; .loc 1 87 29 mul.lo.u64 %r31,%r59,%r52; mov.u64 %r53,%r73; setp.eq.u64 %r125,%r49,1; .loc 1 88 12 mov.u64 %r128,1; .loc 1 84 20 mov.u64 %r129,%r43; $L13: .loc 1 74 15 ld.u64 %r121,[%r53]; ld.u64 %r122,[%r53+8]; .loc 1 74 13 st.u64 [%r55],%r121; st.u64 [%r55+8],%r122; .loc 1 76 12 add.u64 %r55,%r55,%r34; .loc 1 77 15 add.u64 %r44,%r44,1; .loc 1 80 13 setp.ne.u64 %r111,%r44,%r59; @ %r111 bra $L8; .loc 1 89 14 @ ! 
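// The in_unpack_i*/r*/c* members in this region are the matching _gfortrani_internal_unpack_<N> bodies: they return immediately when the descriptor's data pointer already equals the packed buffer (or the buffer is NULL), and otherwise scatter the contiguous buffer back into the strided destination with the same odometer loop used by internal_pack. A hedged sketch of the scatter loop for 64-bit elements, using the same simplified desc_t as the pack sketch above:

#include <stdint.h>

typedef struct { int64_t lbound, ubound, stride; } dim_t;            /* as in pack_sketch */
typedef struct { int64_t *base; int rank; dim_t dim[7]; } desc_t;    /* simplified stand-in */

static void unpack_sketch (desc_t *d, const int64_t *src)
{
  if (src == NULL || src == d->base)
    return;                           /* nothing was packed: nothing to copy back */

  int64_t count[7], extent[7], stride[7];
  for (int n = 0; n < d->rank; n++)
    {
      count[n] = 0;
      stride[n] = d->dim[n].stride;
      extent[n] = d->dim[n].ubound + 1 - d->dim[n].lbound;
      if (extent[n] <= 0)
        return;                       /* empty array */
    }

  int64_t *dest = d->base;
  while (dest)
    {
      *dest = *src++;                 /* scatter one element back into the strided array */
      dest += stride[0];
      count[0]++;
      int n = 0;
      while (count[n] == extent[n])   /* odometer carry, as in pack_sketch */
        {
          count[n] = 0;
          dest -= stride[n] * extent[n];
          if (++n == d->rank) { dest = NULL; break; }
          count[n]++;
          dest += stride[n];
        }
    }
}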
%r125 bra $L22; bra $L1; $L12: .loc 1 84 20 st.u64 [%r48],%r129; .loc 1 87 29 mul.lo.u64 %r37,%r40,%r39; .loc 1 88 12 add.u64 %r54,%r64,1; .loc 1 89 14 add.u64 %r48,%r48,8; add.u64 %r67,%r67,8; add.u64 %r66,%r66,8; setp.ne.u64 %r114,%r63,%r64; @ %r114 bra $L17; bra $L1; $L22: add.u64 %r48,%frame,248; add.u64 %r67,%frame,8; add.u64 %r66,%frame,128; .loc 1 87 29 mov.u64 %r37,%r31; .loc 1 88 12 mov.u64 %r64,%r128; bra $L11; $L17: mov.u64 %r64,%r54; $L11: .loc 1 96 23 ld.u64 %r117,[%r48]; add.u64 %r39,%r117,1; st.u64 [%r48],%r39; .loc 1 97 29 ld.u64 %r40,[%r67]; .loc 1 97 20 sub.u64 %r118,%r40,%r37; shl.b64 %r119,%r118,4; add.u64 %r55,%r55,%r119; .loc 1 80 32 ld.u64 %r42,[%r66]; .loc 1 80 13 setp.eq.u64 %r120,%r39,%r42; @ %r120 bra $L12; .loc 1 84 20 mov.u64 %r44,%r43; $L8: add.u64 %r53,%r53,16; bra $L13; $L1: .loc 1 101 1 ret; } in_unpack_r4.o/ 1622802203gfortrani_internal_unpack_r4 .visible .func _gfortrani_internal_unpack_r4fortran/generated/in_unpack_r4.c"F: _gfortrani_internal_unpack_r4 .visible .func _gfortrani_internal_unpack_r48u64 %r31predpred %r99predmov.u64 %r72,%ar0; mov.u64 %r73,%ar1; .loc 1 43 8 ld.u64 %r55,[%r72]; .loc 1 44 11 set.u32.eq.u64 %r75,%r55,%r73; neg.s32 %r76,%r75; .loc 1 44 22 set.u32.eq.u64 %r78,%r73,0; neg.s32 %r79,%r78; .loc 1 44 19 cvt.u16.u32 %r81,%r76; cvt.u16.u32 %r82,%r79; or.b16 %r80,%r81,%r82; .loc 1 44 6@ %r85 bra $L1; .loc 1 47 7 ld.s8 %r49,[%r72+28]; .loc 1 49 3 setp.le.s64 %r86,%r49,0; @ %r86 bra $L14; add.u64 %r35,%r72,40; mov.u64 %r47,0; .loc 1 49 19 mov.u64 %r63,%r47; .loc 1 48 9 mov.u64 %r43,1; add.u64 %r121,%frame,240; add.u64 %r119,%frame,120; .loc 1 51 16 mov.u64 %r89,%r47; bra $L5; $L16: mov.u64 %r63,%r56; $L5: add.u64 %r88,%r121,%r47; st.u64 [%r88],%r89; .loc 1 52 19 ld.u64 %r26,[%r35]; .loc 1 52 17 add.u64 %r90,%frame,%r47; st.u64 [%r90],%r26; .loc 1 53 19 ld.u64 %r92,[%r35+16]; add.u64 %r91,%r92,1; ld.u64 %r93,[%r35+8]; sub.u64 %r30,%r91,%r93; .loc 1 53 17 add.u64 %r95,%r119,%r47; st.u64 [%r95],%r30; .loc 1 54 10 setp.le.s64 %r96,%r30,0; @ %r96 bra $L1; .loc 1 57 10 setp.ne.u64 %r97,%r26,%r43; @ %r97 bra $L15; .loc 1 58 8 mul.lo.u64 %r43,%r43,%r30; bra $L4; $L15: .loc 1 60 8 mov.u64 %r43,0; $L4: .loc 1 49 36 add.u64 %r56,%r63,1; .loc 1 49 3 add.u64 %r35,%r35,24; add.u64 %r47,%r47,8; setp.ne.u64 %r98,%r49,%r56; @ %r98 bra $L16; .loc 1 63 6 setp.eq.u64 %r99,%r43,0; @ %r99 bra $L6; .loc 1 65 7 shl.b64 %r71,%r43,2; bra $L3; $L14: .loc 1 49 3 mov.u64 %r71,4; $L3: .loc 1 65 7106,[%value_in]; } .loc 1 66 7 bra $L1; $L6: .loc 1 69 11 ld.u64 %r52,[%frame]; .loc 1 71 9 setp.eq.u64 %r108,%r55,0; @ %r108 bra $L1; .loc 1 76 12 shl.b64 %r34,%r52,2; ld.u64 %r44,[%frame+240]; .loc 1 80 32 ld.u64 %r59,[%frame+120]; .loc 1 87 29 mul.lo.u64 %r31,%r59,%r52; mov.u64 %r53,%r73; setp.eq.u64 %r120,%r49,1; .loc 1 88 12 mov.u64 %r124,1; .loc 1 84 20 mov.u64 %r125,%r43; $L13: .loc 1 74 15 ld.f32 %r32,[%r53]; .loc 1 74 13 st.f32 [%r55],%r32; .loc 1 76 12 add.u64 %r55,%r55,%r34; .loc 1 77 15 add.u64 %r44,%r44,1; .loc 1 80 13 setp.ne.u64 %r109,%r44,%r59; @ %r109 bra $L8; .loc 1 89 14 @ ! 
%r120 bra $L22; bra $L1; $L12: .loc 1 84 20 st.u64 [%r48],%r125; .loc 1 87 29 mul.lo.u64 %r37,%r40,%r39; .loc 1 88 12 add.u64 %r54,%r64,1; .loc 1 89 14 add.u64 %r48,%r48,8; add.u64 %r67,%r67,8; add.u64 %r66,%r66,8; setp.ne.u64 %r112,%r63,%r64; @ %r112 bra $L17; bra $L1; $L22: add.u64 %r48,%frame,248; add.u64 %r67,%frame,8; add.u64 %r66,%frame,128; .loc 1 87 29 mov.u64 %r37,%r31; .loc 1 88 12 mov.u64 %r64,%r124; bra $L11; $L17: mov.u64 %r64,%r54; $L11: .loc 1 96 23 ld.u64 %r115,[%r48]; add.u64 %r39,%r115,1; st.u64 [%r48],%r39; .loc 1 97 29 ld.u64 %r40,[%r67]; .loc 1 97 20 sub.u64 %r116,%r40,%r37; shl.b64 %r117,%r116,2; add.u64 %r55,%r55,%r117; .loc 1 80 32 ld.u64 %r42,[%r66]; .loc 1 80 13 setp.eq.u64 %r118,%r39,%r42; @ %r118 bra $L12; .loc 1 84 20 mov.u64 %r44,%r43; $L8: add.u64 %r53,%r53,4; bra $L13; $L1: .loc 1 101 1 ret; } in_unpack_r8.o/ 1622802203gfortrani_internal_unpack_r8 .visible .func _gfortrani_internal_unpack_r8fortran/generated/in_unpack_r8.c"F: _gfortrani_internal_unpack_r8 .visible .func _gfortrani_internal_unpack_r88u64 %r31predpred %r99predmov.u64 %r72,%ar0; mov.u64 %r73,%ar1; .loc 1 43 8 ld.u64 %r55,[%r72]; .loc 1 44 11 set.u32.eq.u64 %r75,%r55,%r73; neg.s32 %r76,%r75; .loc 1 44 22 set.u32.eq.u64 %r78,%r73,0; neg.s32 %r79,%r78; .loc 1 44 19 cvt.u16.u32 %r81,%r76; cvt.u16.u32 %r82,%r79; or.b16 %r80,%r81,%r82; .loc 1 44 6@ %r85 bra $L1; .loc 1 47 7 ld.s8 %r49,[%r72+28]; .loc 1 49 3 setp.le.s64 %r86,%r49,0; @ %r86 bra $L14; add.u64 %r35,%r72,40; mov.u64 %r47,0; .loc 1 49 19 mov.u64 %r63,%r47; .loc 1 48 9 mov.u64 %r43,1; add.u64 %r121,%frame,240; add.u64 %r119,%frame,120; .loc 1 51 16 mov.u64 %r89,%r47; bra $L5; $L16: mov.u64 %r63,%r56; $L5: add.u64 %r88,%r121,%r47; st.u64 [%r88],%r89; .loc 1 52 19 ld.u64 %r26,[%r35]; .loc 1 52 17 add.u64 %r90,%frame,%r47; st.u64 [%r90],%r26; .loc 1 53 19 ld.u64 %r92,[%r35+16]; add.u64 %r91,%r92,1; ld.u64 %r93,[%r35+8]; sub.u64 %r30,%r91,%r93; .loc 1 53 17 add.u64 %r95,%r119,%r47; st.u64 [%r95],%r30; .loc 1 54 10 setp.le.s64 %r96,%r30,0; @ %r96 bra $L1; .loc 1 57 10 setp.ne.u64 %r97,%r26,%r43; @ %r97 bra $L15; .loc 1 58 8 mul.lo.u64 %r43,%r43,%r30; bra $L4; $L15: .loc 1 60 8 mov.u64 %r43,0; $L4: .loc 1 49 36 add.u64 %r56,%r63,1; .loc 1 49 3 add.u64 %r35,%r35,24; add.u64 %r47,%r47,8; setp.ne.u64 %r98,%r49,%r56; @ %r98 bra $L16; .loc 1 63 6 setp.eq.u64 %r99,%r43,0; @ %r99 bra $L6; .loc 1 65 7 shl.b64 %r71,%r43,3; bra $L3; $L14: .loc 1 49 3 mov.u64 %r71,8; $L3: .loc 1 65 7106,[%value_in]; } .loc 1 66 7 bra $L1; $L6: .loc 1 69 11 ld.u64 %r52,[%frame]; .loc 1 71 9 setp.eq.u64 %r108,%r55,0; @ %r108 bra $L1; .loc 1 76 12 shl.b64 %r34,%r52,3; ld.u64 %r44,[%frame+240]; .loc 1 80 32 ld.u64 %r59,[%frame+120]; .loc 1 87 29 mul.lo.u64 %r31,%r59,%r52; mov.u64 %r53,%r73; setp.eq.u64 %r120,%r49,1; .loc 1 88 12 mov.u64 %r124,1; .loc 1 84 20 mov.u64 %r125,%r43; $L13: .loc 1 74 15 ld.f64 %r32,[%r53]; .loc 1 74 13 st.f64 [%r55],%r32; .loc 1 76 12 add.u64 %r55,%r55,%r34; .loc 1 77 15 add.u64 %r44,%r44,1; .loc 1 80 13 setp.ne.u64 %r109,%r44,%r59; @ %r109 bra $L8; .loc 1 89 14 @ ! 
%r120 bra $L22; bra $L1; $L12: .loc 1 84 20 st.u64 [%r48],%r125; .loc 1 87 29 mul.lo.u64 %r37,%r40,%r39; .loc 1 88 12 add.u64 %r54,%r64,1; .loc 1 89 14 add.u64 %r48,%r48,8; add.u64 %r67,%r67,8; add.u64 %r66,%r66,8; setp.ne.u64 %r112,%r63,%r64; @ %r112 bra $L17; bra $L1; $L22: add.u64 %r48,%frame,248; add.u64 %r67,%frame,8; add.u64 %r66,%frame,128; .loc 1 87 29 mov.u64 %r37,%r31; .loc 1 88 12 mov.u64 %r64,%r124; bra $L11; $L17: mov.u64 %r64,%r54; $L11: .loc 1 96 23 ld.u64 %r115,[%r48]; add.u64 %r39,%r115,1; st.u64 [%r48],%r39; .loc 1 97 29 ld.u64 %r40,[%r67]; .loc 1 97 20 sub.u64 %r116,%r40,%r37; shl.b64 %r117,%r116,3; add.u64 %r55,%r55,%r117; .loc 1 80 32 ld.u64 %r42,[%r66]; .loc 1 80 13 setp.eq.u64 %r118,%r39,%r42; @ %r118 bra $L12; .loc 1 84 20 mov.u64 %r44,%r43; $L8: add.u64 %r53,%r53,8; bra $L13; $L1: .loc 1 101 1 ret; } in_unpack_r10.o/1622802203in_unpack_c4.o/gfortrani_internal_unpack_c4 .visible .func _gfortrani_internal_unpack_c4fortran/generated/in_unpack_c4.c"F: _gfortrani_internal_unpack_c4 .visible .func _gfortrani_internal_unpack_c48u64 %r31; .reg .f32 %r32; .reg .f32 %r33; .reg .u6416predpred %r109; .reg .predu64 %r125; .reg .u64 %r126; mov.u64 %r73,%ar0; mov.u64 %r74,%ar1; .loc 1 43 8 ld.u64 %r56,[%r73]; .loc 1 44 11 set.u32.eq.u64 %r76,%r56,%r74; neg.s32 %r77,%r76; .loc 1 44 22 set.u32.eq.u64 %r79,%r74,0; neg.s32 %r80,%r79; .loc 1 44 19 cvt.u16.u32 %r82,%r77; cvt.u16.u32 %r83,%r80; or.b16 %r81,%r82,%r83; .loc 1 44 6 cvt.u32.u16 %r84,%r81; cvt.u16.u8 %r85,%r84; setp.ne.u16 %r86,%r85,0; @ %r86 bra $L1; .loc 1 47 7 ld.s8 %r50,[%r73+28]; .loc 1 49 3 setp.le.s64 %r87,%r50,0; @ %r87 bra $L14; add.u64 %r36,%r73,40; mov.u64 %r48,0; .loc 1 49 19 mov.u64 %r64,%r48; .loc 1 48 9 mov.u64 %r44,1; add.u64 %r122,%frame,240; add.u64 %r120,%frame,120; .loc 1 51 16 mov.u64 %r90,%r48; bra $L5; $L16: mov.u64 %r64,%r57; $L5: add.u64 %r89,%r122,%r48; st.u64 [%r89],%r90; .loc 1 52 19 ld.u64 %r26,[%r36]; .loc 1 52 17 add.u64 %r91,%frame,%r48; st.u64 [%r91],%r26; .loc 1 53 19 ld.u64 %r93,[%r36+16]; add.u64 %r92,%r93,1; ld.u64 %r94,[%r36+8]; sub.u64 %r30,%r92,%r94; .loc 1 53 17 add.u64 %r96,%r120,%r48; st.u64 [%r96],%r30; .loc 1 54 10 setp.le.s64 %r97,%r30,0; @ %r97 bra $L1; .loc 1 57 10 setp.ne.u64 %r98,%r26,%r44; @ %r98 bra $L15; .loc 1 58 8 mul.lo.u64 %r44,%r44,%r30; bra $L4; $L15: .loc 1 60 8 mov.u64 %r44,0; $L4: .loc 1 49 36 add.u64 %r57,%r64,1; .loc 1 49 3 add.u64 %r36,%r36,24; add.u64 %r48,%r48,8; setp.ne.u64 %r99,%r50,%r57; @ %r99 bra $L16; .loc 1 63 6 setp.eq.u64 %r100,%r44,0; @ %r100 bra $L6; .loc 1 65 7 shl.b64 %r72,%r44,3; bra $L3; $L14: .loc 1 49 3 mov.u64 %r72,8; $L3: .loc 1 65 7107,[%value_in]; } .loc 1 66 7 bra $L1; $L6: .loc 1 69 11 ld.u64 %r53,[%frame]; .loc 1 71 9 setp.eq.u64 %r109,%r56,0; @ %r109 bra $L1; .loc 1 76 12 shl.b64 %r35,%r53,3; ld.u64 %r45,[%frame+240]; .loc 1 80 32 ld.u64 %r60,[%frame+120]; .loc 1 87 29 mul.lo.u64 %r31,%r60,%r53; mov.u64 %r54,%r74; setp.eq.u64 %r121,%r50,1; .loc 1 88 12 mov.u64 %r125,1; .loc 1 84 20 mov.u64 %r126,%r44; $L13: .loc 1 74 15 ld.f32 %r32,[%r54]; ld.f32 %r33,[%r54+4]; .loc 1 74 13 st.f32 [%r56],%r32; st.f32 [%r56+4],%r33; .loc 1 76 12 add.u64 %r56,%r56,%r35; .loc 1 77 15 add.u64 %r45,%r45,1; .loc 1 80 13 setp.ne.u64 %r110,%r45,%r60; @ %r110 bra $L8; .loc 1 89 14 @ ! 
%r121 bra $L22; bra $L1; $L12: .loc 1 84 20 st.u64 [%r49],%r126; .loc 1 87 29 mul.lo.u64 %r38,%r41,%r40; .loc 1 88 12 add.u64 %r55,%r65,1; .loc 1 89 14 add.u64 %r49,%r49,8; add.u64 %r68,%r68,8; add.u64 %r67,%r67,8; setp.ne.u64 %r113,%r64,%r65; @ %r113 bra $L17; bra $L1; $L22: add.u64 %r49,%frame,248; add.u64 %r68,%frame,8; add.u64 %r67,%frame,128; .loc 1 87 29 mov.u64 %r38,%r31; .loc 1 88 12 mov.u64 %r65,%r125; bra $L11; $L17: mov.u64 %r65,%r55; $L11: .loc 1 96 23 ld.u64 %r116,[%r49]; add.u64 %r40,%r116,1; st.u64 [%r49],%r40; .loc 1 97 29 ld.u64 %r41,[%r68]; .loc 1 97 20 sub.u64 %r117,%r41,%r38; shl.b64 %r118,%r117,3; add.u64 %r56,%r56,%r118; .loc 1 80 32 ld.u64 %r43,[%r67]; .loc 1 80 13 setp.eq.u64 %r119,%r40,%r43; @ %r119 bra $L12; .loc 1 84 20 mov.u64 %r45,%r44; $L8: add.u64 %r54,%r54,8; bra $L13; $L1: .loc 1 101 1 ret; } in_unpack_c8.o/gfortrani_internal_unpack_c8 .visible .func _gfortrani_internal_unpack_c8fortran/generated/in_unpack_c8.c"F: _gfortrani_internal_unpack_c8 .visible .func _gfortrani_internal_unpack_c88u64 %r31; .reg .f64 %r32; .reg .f64 %r33; .reg .u6416predpred %r109; .reg .predu64 %r125; .reg .u64 %r126; mov.u64 %r73,%ar0; mov.u64 %r74,%ar1; .loc 1 43 8 ld.u64 %r56,[%r73]; .loc 1 44 11 set.u32.eq.u64 %r76,%r56,%r74; neg.s32 %r77,%r76; .loc 1 44 22 set.u32.eq.u64 %r79,%r74,0; neg.s32 %r80,%r79; .loc 1 44 19 cvt.u16.u32 %r82,%r77; cvt.u16.u32 %r83,%r80; or.b16 %r81,%r82,%r83; .loc 1 44 6 cvt.u32.u16 %r84,%r81; cvt.u16.u8 %r85,%r84; setp.ne.u16 %r86,%r85,0; @ %r86 bra $L1; .loc 1 47 7 ld.s8 %r50,[%r73+28]; .loc 1 49 3 setp.le.s64 %r87,%r50,0; @ %r87 bra $L14; add.u64 %r36,%r73,40; mov.u64 %r48,0; .loc 1 49 19 mov.u64 %r64,%r48; .loc 1 48 9 mov.u64 %r44,1; add.u64 %r122,%frame,240; add.u64 %r120,%frame,120; .loc 1 51 16 mov.u64 %r90,%r48; bra $L5; $L16: mov.u64 %r64,%r57; $L5: add.u64 %r89,%r122,%r48; st.u64 [%r89],%r90; .loc 1 52 19 ld.u64 %r26,[%r36]; .loc 1 52 17 add.u64 %r91,%frame,%r48; st.u64 [%r91],%r26; .loc 1 53 19 ld.u64 %r93,[%r36+16]; add.u64 %r92,%r93,1; ld.u64 %r94,[%r36+8]; sub.u64 %r30,%r92,%r94; .loc 1 53 17 add.u64 %r96,%r120,%r48; st.u64 [%r96],%r30; .loc 1 54 10 setp.le.s64 %r97,%r30,0; @ %r97 bra $L1; .loc 1 57 10 setp.ne.u64 %r98,%r26,%r44; @ %r98 bra $L15; .loc 1 58 8 mul.lo.u64 %r44,%r44,%r30; bra $L4; $L15: .loc 1 60 8 mov.u64 %r44,0; $L4: .loc 1 49 36 add.u64 %r57,%r64,1; .loc 1 49 3 add.u64 %r36,%r36,24; add.u64 %r48,%r48,8; setp.ne.u64 %r99,%r50,%r57; @ %r99 bra $L16; .loc 1 63 6 setp.eq.u64 %r100,%r44,0; @ %r100 bra $L6; .loc 1 65 7 shl.b64 %r72,%r44,4; bra $L3; $L14: .loc 1 49 3 mov.u64 %r72,16; $L3: .loc 1 65 7107,[%value_in]; } .loc 1 66 7 bra $L1; $L6: .loc 1 69 11 ld.u64 %r53,[%frame]; .loc 1 71 9 setp.eq.u64 %r109,%r56,0; @ %r109 bra $L1; .loc 1 76 12 shl.b64 %r35,%r53,4; ld.u64 %r45,[%frame+240]; .loc 1 80 32 ld.u64 %r60,[%frame+120]; .loc 1 87 29 mul.lo.u64 %r31,%r60,%r53; mov.u64 %r54,%r74; setp.eq.u64 %r121,%r50,1; .loc 1 88 12 mov.u64 %r125,1; .loc 1 84 20 mov.u64 %r126,%r44; $L13: .loc 1 74 15 ld.f64 %r32,[%r54]; ld.f64 %r33,[%r54+8]; .loc 1 74 13 st.f64 [%r56],%r32; st.f64 [%r56+8],%r33; .loc 1 76 12 add.u64 %r56,%r56,%r35; .loc 1 77 15 add.u64 %r45,%r45,1; .loc 1 80 13 setp.ne.u64 %r110,%r45,%r60; @ %r110 bra $L8; .loc 1 89 14 @ ! 
%r121 bra $L22; bra $L1; $L12: .loc 1 84 20 st.u64 [%r49],%r126; .loc 1 87 29 mul.lo.u64 %r38,%r41,%r40; .loc 1 88 12 add.u64 %r55,%r65,1; .loc 1 89 14 add.u64 %r49,%r49,8; add.u64 %r68,%r68,8; add.u64 %r67,%r67,8; setp.ne.u64 %r113,%r64,%r65; @ %r113 bra $L17; bra $L1; $L22: add.u64 %r49,%frame,248; add.u64 %r68,%frame,8; add.u64 %r67,%frame,128; .loc 1 87 29 mov.u64 %r38,%r31; .loc 1 88 12 mov.u64 %r65,%r125; bra $L11; $L17: mov.u64 %r65,%r55; $L11: .loc 1 96 23 ld.u64 %r116,[%r49]; add.u64 %r40,%r116,1; st.u64 [%r49],%r40; .loc 1 97 29 ld.u64 %r41,[%r68]; .loc 1 97 20 sub.u64 %r117,%r41,%r38; shl.b64 %r118,%r117,4; add.u64 %r56,%r56,%r118; .loc 1 80 32 ld.u64 %r43,[%r67]; .loc 1 80 13 setp.eq.u64 %r119,%r40,%r43; @ %r119 bra $L12; .loc 1 84 20 mov.u64 %r45,%r44; $L8: add.u64 %r54,%r54,16; bra $L13; $L1: .loc 1 101 1 ret; } in_unpack_c10.o/162280220pow_i4_i4.o/gfortran_pow_i4_i4 .visible .func (.param .u32 %value_out) _gfortran_pow_i4_; .file 1 "../../../libgfortran/generated/pow_i4_i4gfortran_pow_i4_gfortran_pow_i4_4.reg .pred10 setp.ge.s32 %r29,%r27,0; @ %r29 bra $L3; .loc 1 53 7 setp.eq.u32 %r30,%r26,1; @ %r30 bra $L7; .loc 1 55 7 setp.ne.u32 %r31,%r26,-1; @ %r31 bra $L8; .loc 1 56 16 and.b32 %r32,%r27,1; .loc 1 56 26 setp.eq.u32 %r33,%r32,0; selp.u32 %r25,1,-1,%r33; bra $L1; $L3: .loc 1 61 7 mov.u32 %r24,%r27; .loc 1 48 7 mov.u32 %r25,1; $L5: .loc 1 65 10 and.b32 %r34,%r24,1; .loc 1 65 7 setp.eq.u32 %r35,%r34,0; @ %r35 bra $L4; .loc 1 66 10 mul.lo.u32 %r25,%r25,%r26; $L4: .loc 1 67 6 shr.u32 %r24,%r24,1; .loc 1 68 7 setp.eq.u32 %r36,%r24,0; @ %r36 bra $L1; .loc 1 69 8 mul.lo.u32 %r26,%r26,%r26; .loc 1 65 7 bra $L5; $L6: .loc 1 56 26 mov.u32 %r25,1; bra $L1; $L7: mov.u32 %r25,%r26; bra $L1; $L8: .loc 1 57 28 mov.u32 %r25,0; $L1: .loc 1 75gfortran_pow_i8_i4 .visible .func (.param .u64 %value_out) _gfortran_pow_i8_); .file 1 "../../../libgfortran/generated/pow_i8_i4gfortran_pow_i8_gfortran_pow_i8_pred %r29; .reg .predmov.u64 %r26,%ar0; mov.u32 10 setp.ge.s32 %r29,%r27,0; @ %r29 bra $L3; .loc 1 53 7 setp.eq.u64 %r30,%r26,1; @ %r30 bra $L7; .loc 1 55 7 setp.ne.u64 %r31,%r26,-1; @ %r31 bra $L8; .loc 1 56 16 and.b32 %r32,%r27,1; .loc 1 56 26 setp.eq.u32 %r33,%r32,0; selp.u64 %r25,1,-1,%r33; bra $L1; $L3: .loc 1 61 7 mov.u32 %r24,%r27; .loc 1 48 7 mov.u64 %r25,1; $L5: .loc 1 65 10 and.b32 %r34,%r24,1; .loc 1 65 7 setp.eq.u32 %r35,%r34,0; @ %r35 bra $L4; .loc 1 66 10 mul.lo.u64 %r25,%r25,%r26; $L4: .loc 1 67 6 shr.u32 %r24,%r24,1; .loc 1 68 7 setp.eq.u32 %r36,%r24,0; @ %r36 bra $L1; .loc 1 69 8 mul.lo.u64 %r26,%r26,%r26; .loc 1 65 7 bra $L5; $L6: .loc 1 56 26 mov.u64 %r25,1; bra $L1; $L7: mov.u64 %r25,%r26; bra $L1; $L8: .loc 1 57 28 mov.u64 %r25,0; $L1: .loc 1 75libgfortran/generated/pow_i16_i4F: _gfortran_pow_i16_i4 .visible .func _gfortran_pow_i16_i4pred %r30; .reg .pred %r31; .reg .pred %r33; .reg .pred %r35; .reg .pred.loc 1 49 6 setp.eq.u32 %r30,%r29,0; @ %r30 bra $L10; .loc 1 51 10 setp.ge.s32 %r31,%r29,0; @ %r31 bra $L3; .loc 1 53 7 setp.ne.u64 %r33,%r75,1; @ %r33 bra $L4; setp.ne.u64 %r35,%r76,0; @ %r35 bra $L6; .loc 1 54 13 st.u64 [%r26],%r75; st.u64 [%r26+8],%r76; bra $L1; $L4: .loc 1 55 7 setp.ne.u64 %r39,%r75,-1; @ %r39 bra $L6; setp.ne.u64 %r41,%r76,-1; @ %r41 bra $L6; .loc 1 56 16 and.b32 %r42,%r29,1; .loc 1 56 26 setp.ne.u32 %r43,%r42,0; selp.u64 %r75,%r76,1,%r43; selp.u64 %r76,%r76,0,%r43; st.u64 [%r26],%r75; st.u64 [%r26+8],%r76; bra $L1; $L6: .loc 1 57 28 mov.u64 %r46,0; st.u64 [%r26],%r46; st.u64 [%r26+8],%r46; bra $L1; $L3: .loc 1 61 7 mov.u32 %r25,%r29; .loc 1 48 7 
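// The pow_i4_i4 / pow_i8_i4 / pow_i16_i4 members are _gfortran_pow_<kind>_i4: integer exponentiation by squaring, with the Fortran rules for negative exponents (base 1 gives 1, base -1 gives +/-1 depending on parity, any other base truncates to 0). The and/shr/mul loop is visible directly in the PTX above. A hedged C rendering of the i4 variant:

#include <stdint.h>

/* Sketch of _gfortran_pow_i4_i4: a**n for 32-bit integers. */
static int32_t pow_i4_i4_sketch (int32_t a, int32_t n)
{
  if (n == 0)
    return 1;
  if (n < 0)
    {
      /* Integer division truncates, so only |a| == 1 survives a negative power. */
      if (a == 1)
        return 1;
      if (a == -1)
        return (n & 1) ? -1 : 1;
      return 0;
    }
  uint32_t u = (uint32_t) n;
  int32_t acc = 1;
  for (;;)
    {
      if (u & 1)
        acc *= a;           /* multiply in the current exponent bit */
      u >>= 1;
      if (u == 0)
        break;
      a *= a;               /* square the base for the next bit */
    }
  return acc;
}

// The i8 and i16 variants are identical except for the width of the accumulator and base; the _i8 family further down uses a 64-bit exponent in the same loop.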
mov.u64 %r73,1; mov.u64 %r74,0; .loc 1 66 10 add.u64 %r81,%frame,16; add.u64 %r82,%frame,32; $L9: .loc 1 65 10 and.b32 %r48,%r25,1; .loc 1 658; .loc 1 66 10 st.u64 [%frame+16],%r73; st.u64 [%frame+24],%r74; st.u64 [%frame+32],%r75; st.u64 [%frame+40],%r76; {$L8: .loc 1 67 6 shr.u32 %r25,%r25,1; .loc 1 68 7 setp.eq.u32 %r60,%r25,0; @ %r60 bra $L2; .loc 1 69 8 st.u64 [%frame+16],%r75; st.u64 [%frame+24],%r76; st.u64 [%frame+32],%r75; st.u64 [%frame+40],%r76; { 7 bra $L9; $L10: .loc 1 48 7 mov.u64 %r73,1; mov.u64 %r74,0; $L2: .loc 1 74 10 st.u64 [%r26],%r73; st.u64 [%r26+8],%r74; $L1: .loc 1 75 1 ret; } pow_r16_i4.o/ gfortran_pow_c4_i4 .visible .func _gfortran_pow_c4_i4); .file 1 "../../../libgfortran/generated/pow_c4_i4gfortran_pow_c4_i4 .visible .func _gfortran_pow_c4_i4f32 %r52; .reg .f32 %r53; .reg .predpredmov.u64 %r46,%ar0; mov.f32 %r47,%ar1; mov.f32 %r48,%ar2; mov.u32 %r49,%ar3; .loc 1 49 6 setp.eq.u32 %r50,%r49,0; @ %r50 bra $L10; mov.f32 %r39,%r47; mov.f32 %r40,%r48; .loc 1 51 10 setp.ge.s32 %r51,%r49,0; @ %r51 bra $L3; .loc 1 54 8 neg.s32 %r26,%r49; .loc 1 55 6 abs.f32 %r52,%r39; abs.f32 %r53,%r40; setp.lt.f32 %r54,%r52,%r53; @ ! %r54 bra $L18; div.rn.f32 %r29,%r39,%r40; fma.rn.f32 %r30,%r29,%r39,%r40; fma.rn.f32 %r32,%r29,0f00000000,0fbf800000; add.f32 %r55,%r29,0f00000000; div.rn.f32 %r39,%r55,%r30; div.rn.f32 %r40,%r32,%r30; bra $L8; $L18: div.rn.f32 %r33,%r40,%r39; fma.rn.f32 %r34,%r40,%r33,%r39; fma.rn.f32 %r35,%r33,0f00000000,0f3f800000; div.rn.f32 %r39,%r35,%r34; mov.f32 %r57,0f00000000; sub.f32 %r56,%r57,%r33; div.rn.f32 %r40,%r56,%r34; $L8: .loc 1 41 1 mov.f32 %r38,0f00000000; mov.f32 %r37,0f3f800000; bra $L7; $L3: .loc 1 59 7 mov.u32 %r26,%r49; bra $L8; $L7: .loc 1 63 10 and.b32 %r58,%r26,1; .loc 1 63 79; .loc 1 64 10 mul.f32 %r42,%r38,%r39; mul.f32 %r60,%r38,%r40; neg.f32 %r61,%r60; fma.rn.f32 %r38,%r37,%r40,%r42; fma.rn.f32 %r37,%r37,%r39,%r61; $L9: .loc 1 65 6 shr.u32 %r26,%r26,1; .loc 1 66 7 setp.eq.u32 %r62,%r26,0; @ %r62 bra $L2; .loc 1 67 8 mul.f32 %r45,%r39,%r40; mul.f32 %r63,%r40,%r40; neg.f32 %r64,%r63; fma.rn.f32 %r39,%r39,%r39,%r64; add.f32 %r40,%r45,%r45; .loc 1 63 7 bra $L7; $L10: mov.f32 %r38,0f00000000; mov.f32 %r37,0f3f800000; $L2: .loc 1 72 10 st.f32 [%r46],%r37; st.f32 [%r46+4],%r38; .loc 1 73 1 ret; } pow_c8_i4.o/gfortran_pow_c8_i4 .visible .func _gfortran_pow_c8_i4); .file 1 "../../../libgfortran/generated/pow_c8_i4gfortran_pow_c8_i4 .visible .func _gfortran_pow_c8_i4u32.reg .f64 %r38; .reg .f64 %r39; .reg .f64 %r40; .reg .f64 %r42; .reg .f64 %r45; .reg .u64 %r46; .reg .f64 %r47; .reg .f64 %r48; .reg .u32f64 %r52; .reg .f64 %r53; .reg .predpred %r59; .reg .f64 %r60; .reg .f64 %r61; .reg .predmov.u64 %r46,%ar0; mov.f64 %r47,%ar1; mov.f64 %r48,%ar2; mov.u32 %r49,%ar3; .loc 1 49 6 setp.eq.u32 %r50,%r49,0; @ %r50 bra $L10; mov.f64 %r39,%r47; mov.f64 %r40,%r48; .loc 1 51 10 setp.ge.s32 %r51,%r49,0; @ %r51 bra $L3; .loc 1 54 8 neg.s32 %r26,%r49; .loc 1 55 6 abs.f64 %r52,%r39; abs.f64 %r53,%r40; setp.lt.f64 %r54,%r52,%r53; @ ! 
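// pow_c4_i4 and pow_c8_i4 apply the same square-and-multiply scheme to COMPLEX values; a negative exponent is handled by first replacing z with 1/z, which the PTX computes with the scale-by-the-larger-component (Smith-style) division visible in the abs/div/fma sequence above. A hedged C99 sketch of the double-precision variant, leaving the robust division to the compiler's complex arithmetic:

#include <complex.h>
#include <stdint.h>

/* Sketch of _gfortran_pow_c8_i4: z**n for double complex and 32-bit n. */
static double complex pow_c8_i4_sketch (double complex z, int32_t n)
{
  double complex acc = 1.0;
  if (n == 0)
    return acc;
  uint32_t u;
  if (n < 0)
    {
      z = 1.0 / z;                    /* real code spells out the Smith-style division */
      u = (uint32_t) (-(int64_t) n);  /* magnitude of the exponent, safe for INT32_MIN */
    }
  else
    u = (uint32_t) n;
  for (;;)
    {
      if (u & 1)
        acc *= z;           /* fold in the current exponent bit */
      u >>= 1;
      if (u == 0)
        break;
      z *= z;               /* square the base */
    }
  return acc;
}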
%r54 bra $L18; div.rn.f64 %r29,%r39,%r40; fma.rn.f64 %r30,%r29,%r39,%r40; fma.rn.f64 %r32,%r29,0d0000000000000000,0dbff0000000000000; add.f64 %r55,%r29,0d0000000000000000; div.rn.f64 %r39,%r55,%r30; div.rn.f64 %r40,%r32,%r30; bra $L8; $L18: div.rn.f64 %r33,%r40,%r39; fma.rn.f64 %r34,%r40,%r33,%r39; fma.rn.f64 %r35,%r33,0d0000000000000000,0d3ff0000000000000; div.rn.f64 %r39,%r35,%r34; mov.f64 %r57,0d0000000000000000; sub.f64 %r56,%r57,%r33; div.rn.f64 %r40,%r56,%r34; $L8: .loc 1 41 1 mov.f64 %r38,0d0000000000000000; mov.f64 %r37,0d3ff0000000000000; bra $L7; $L3: .loc 1 59 7 mov.u32 %r26,%r49; bra $L8; $L7: .loc 1 63 10 and.b32 %r58,%r26,1; .loc 1 63 79; .loc 1 64 10 mul.f64 %r42,%r38,%r39; mul.f64 %r60,%r38,%r40; neg.f64 %r61,%r60; fma.rn.f64 %r38,%r37,%r40,%r42; fma.rn.f64 %r37,%r37,%r39,%r61; $L9: .loc 1 65 6 shr.u32 %r26,%r26,1; .loc 1 66 7 setp.eq.u32 %r62,%r26,0; @ %r62 bra $L2; .loc 1 67 8 mul.f64 %r45,%r39,%r40; mul.f64 %r63,%r40,%r40; neg.f64 %r64,%r63; fma.rn.f64 %r39,%r39,%r39,%r64; add.f64 %r40,%r45,%r45; .loc 1 63 7 bra $L7; $L10: mov.f64 %r38,0d0000000000000000; mov.f64 %r37,0d3ff0000000000000; $L2: .loc 1 72 10 st.f64 [%r46],%r37; st.f64 [%r46+8],%r38; .loc 1 73 1 ret; } pow_c10_i4.o/ pow_i4_i8.o/gfortran_pow_i4_i8 .visible .func (.param .u32 %value_out) _gfortran_pow_i4_i.file 1 "../../../libgfortran/generated/pow_i4_i8gfortran_pow_i4_gfortran_pow_i4_ir24.reg .pred10 setp.ge.s64 %r29,%r27,0; @ %r29 bra $L3; .loc 1 53 7 setp.eq.u32 %r30,%r26,1; @ %r30 bra $L7; .loc 1 55 7 setp.ne.u32 %r31,%r26,-1; @ %r31 bra $L8; .loc 1 56 16 and.b64 %r32,%r27,1; .loc 1 56 26 setp.eq.u64 %r33,%r32,0; selp.u32 %r25,1,-1,%r33; bra $L1; $L3: .loc 1 61 7 mov.u64 %r24,%r27; .loc 1 48 7 mov.u32 %r25,1; $L5: .loc 1 65 10 and.b64 %r34,%r24,1; .loc 1 65.loc 1 66 10 mul.lo.u32 %r25,%r25,%r26; $L4: .loc 1 67 6 shr.u64 %r24,%r24,1; .loc 1 68 7 setp.eq.u64 %r36,%r24,0; @ %r36 bra $L1; .loc 1 69 8 mul.lo.u32 %r26,%r26,%r26; .loc 1 65 7 bra $L5; $L6: .loc 1 56 26 mov.u32 %r25,1; bra $L1; $L7: mov.u32 %r25,%r26; bra $L1; $L8: .loc 1 57 28 mov.u32 %r25,0; $L1: .loc 1 75gfortran_pow_i8_i8 .visible .func (.param .u64 %value_out) _gfortran_pow_i8_fortran/generated/pow_i8_i8gfortran_pow_i8_gfortran_pow_i8_4.reg .predpred10 setp.ge.s64 %r29,%r27,0; @ %r29 bra $L3; .loc 1 53 7 setp.eq.u64 %r30,%r26,1; @ %r30 bra $L7; .loc 1 55 7 setp.ne.u64 %r31,%r26,-1; @ %r31 bra $L8; .loc 1 56 16 and.b64 %r32,%r27,1; .loc 1 56 26 setp.eq.u64 %r33,%r32,0; selp.u64 %r25,1,-1,%r33; bra $L1; $L3: .loc 1 61 7 mov.u64 %r24,%r27; .loc 1 48 7 mov.u64 %r25,1; $L5: .loc 1 65 10 and.b64 %r34,%r24,1; .loc 1 65.loc 1 66 10 mul.lo.u64 %r25,%r25,%r26; $L4: .loc 1 67 6 shr.u64 %r24,%r24,1; .loc 1 68 7 setp.eq.u64 %r36,%r24,0; @ %r36 bra $L1; .loc 1 69 8 mul.lo.u64 %r26,%r26,%r26; .loc 1 65 7 bra $L5; $L6: .loc 1 56 26 mov.u64 %r25,1; bra $L1; $L7: mov.u64 %r25,%r26; bra $L1; $L8: .loc 1 57 28 mov.u64 %r25,0; $L1: .loc 1 75gfortran_pow_i16_i8 .visible .func _gfortran_pow_i16_fortran/generated/pow_i16_i8F: _gfortran_pow_i16_i8 .visible .func _gfortran_pow_i16_pred %r30; .reg .pred %r31; .reg .pred %r33; .reg .pred %r35; .reg .pred.loc 1 49 6 setp.eq.u64 %r30,%r29,0; @ %r30 bra $L10; .loc 1 51 10 setp.ge.s64 %r31,%r29,0; @ %r31 bra $L3; .loc 1 53 7 setp.ne.u64 %r33,%r75,1; @ %r33 bra $L4; setp.ne.u64 %r35,%r76,0; @ %r35 bra $L6; .loc 1 54 13 st.u64 [%r26],%r75; st.u64 [%r26+8],%r76; bra $L1; $L4: .loc 1 55 7 setp.ne.u64 %r39,%r75,-1; @ %r39 bra $L6; setp.ne.u64 %r41,%r76,-1; @ %r41 bra $L6; .loc 1 56 16 and.b64 %r42,%r29,1; .loc 1 56 26 setp.ne.u64 
%r43,%r42,0; selp.u64 %r75,%r76,1,%r43; selp.u64 %r76,%r76,0,%r43; st.u64 [%r26],%r75; st.u64 [%r26+8],%r76; bra $L1; $L6: .loc 1 57 28 mov.u64 %r46,0; st.u64 [%r26],%r46; st.u64 [%r26+8],%r46; bra $L1; $L3: .loc 1 61 7 mov.u64 %r25,%r29; .loc 1 48 7 mov.u64 %r73,1; mov.u64 %r74,0; .loc 1 66 10 add.u64 %r81,%frame,16; add.u64 %r82,%frame,32; $L9: .loc 1 65 10 and.b64 %r48,%r25,1; .loc 1 65 7 setp.eq.u64 %r49,%r48,0; @ %r49 bra $L8; .loc 1 66 10 st.u64 [%frame+16],%r73; st.u64 [%frame+24],%r74; st.u64 [%frame+32],%r75; st.u64 [%frame+40],%r76; {$L8: .loc 1 67 6 shr.u64 %r25,%r25,1; .loc 1 68 7 setp.eq.u64 %r60,%r25,0; @ %r60 bra $L2; .loc 1 69 8 st.u64 [%frame+16],%r75; st.u64 [%frame+24],%r76; st.u64 [%frame+32],%r75; st.u64 [%frame+40],%r76; { 7 bra $L9; $L10: .loc 1 48 7 mov.u64 %r73,1; mov.u64 %r74,%r29; $L2: .loc 1 74 10 st.u64 [%r26],%r73; st.u64 [%r26+8],%r74; $L1: .loc 1 75 1 ret; } pow_r4_i8.o/gfortran_pow_r4_i8 .visible .func (.param .f32 %value_out) _gfortran_pow_r4_fortran/generated/pow_r4_i8gfortran_pow_r4_i8 .visible .func (.param .f32 %value_out) _gfortran_pow_r4_49 6 setp.eq.u64 %r28,%r27,0; @ %r28 bra $L7; .loc 1 51 10 setp.ge.s64 %r29,%r27,0; @ %r29 bra $L3; .loc 1 54 8 neg.s64 %r24,%r27; .loc 1 55 6 mov.f32 %r30,0f3f800000; div.rn.f32 %r26,%r30,%r26; $L5: .loc 1 41 1 mov.f32 %r25,0f3f800000; bra $L4; $L3: .loc 1 59 7 mov.u64 %r24,%r27; bra $L5; $L4: .loc 1 63 10 and.b64 %r31,%r24,1; .loc 1 63 76; .loc 1 64 10 mul.f32 %r25,%r25,%r26; $L6: .loc 1 65 6 shr.u64 %r24,%r24,1; .loc 1 66 7 setp.eq.u64 %r33,%r24,0; @ %r33 bra $L1; .loc 1 67 8 mul.f32 %r26,%r26,%r26; .loc 1 63 7 bra $L4; $L7: .loc 1 48 7 mov.f32 %r25,0f3f800000; $L1: .loc 1 73 pow_r8_i8.o/gfortran_pow_r8_i8 .visible .func (.param .f64 %value_out) _gfortran_pow_r8_fortran/generated/pow_r8_i8gfortran_pow_r8_i8 .visible .func (.param .f64 %value_out) _gfortran_pow_r8_49 6 setp.eq.u64 %r28,%r27,0; @ %r28 bra $L7; .loc 1 51 10 setp.ge.s64 %r29,%r27,0; @ %r29 bra $L3; .loc 1 54 8 neg.s64 %r24,%r27; .loc 1 55 6 mov.f64 %r30,0d3ff0000000000000; div.rn.f64 %r26,%r30,%r26; $L5: .loc 1 41 1 mov.f64 %r25,0d3ff0000000000000; bra $L4; $L3: .loc 1 59 7 mov.u64 %r24,%r27; bra $L5; $L4: .loc 1 63 10 and.b64 %r31,%r24,1; .loc 1 63 76; .loc 1 64 10 mul.f64 %r25,%r25,%r26; $L6: .loc 1 65 6 shr.u64 %r24,%r24,1; .loc 1 66 7 setp.eq.u64 %r33,%r24,0; @ %r33 bra $L1; .loc 1 67 8 mul.f64 %r26,%r26,%r26; .loc 1 63 7 bra $L4; $L7: .loc 1 48 7 mov.f64 %r25,0d3ff0000000000000; $L1: .loc 1 73 pow_r10_i8.o/ pow_c4_i8.o/gfortran_pow_c4_i8 .visible .func _gfortran_pow_c4_i8); .file 1 "../../../libgfortran/generated/pow_c4_i8gfortran_pow_c4_i8 .visible .func _gfortran_pow_c4_i8u64f32 %r53; .reg .predpredmov.u64 %r46,%ar0; mov.f32 %r47,%ar1; mov.f32 %r48,%ar2; mov.u64 %r49,%ar3; .loc 1 49mov.f32 %r39,%r47; mov.f32 %r40,%r48; .loc 1 51 10 setp.ge.s64 %r51,%r49,0; @ %r51 bra $L3; .loc 1 54 8 neg.s64 %r26,%r49; .loc 1 55 6 abs.f32 %r52,%r39; abs.f32 %r53,%r40; setp.lt.f32 %r54,%r52,%r53; @ ! 
%r54 bra $L18; div.rn.f32 %r29,%r39,%r40; fma.rn.f32 %r30,%r29,%r39,%r40; fma.rn.f32 %r32,%r29,0f00000000,0fbf800000; add.f32 %r55,%r29,0f00000000; div.rn.f32 %r39,%r55,%r30; div.rn.f32 %r40,%r32,%r30; bra $L8; $L18: div.rn.f32 %r33,%r40,%r39; fma.rn.f32 %r34,%r40,%r33,%r39; fma.rn.f32 %r35,%r33,0f00000000,0f3f800000; div.rn.f32 %r39,%r35,%r34; mov.f32 %r57,0f00000000; sub.f32 %r56,%r57,%r33; div.rn.f32 %r40,%r56,%r34; $L8: .loc 1 41 1 mov.f32 %r38,0f00000000; mov.f32 %r37,0f3f800000; bra $L7; $L3: .loc 1 59 7 mov.u64 %r26,%r49; bra $L8; $L7: .loc 1 63 10 and.b64 %r58,%r26,1; .loc 1 63 7 setp.eq.u64 %r59,%r58,0; @ %r59 bra $L9; .loc 1 64 10 mul.f32 %r42,%r38,%r39; mul.f32 %r60,%r38,%r40; neg.f32 %r61,%r60; fma.rn.f32 %r38,%r37,%r40,%r42; fma.rn.f32 %r37,%r37,%r39,%r61; $L9: .loc 1 65 6 shr.u64 %r26,%r26,1; .loc 1 66 7 setp.eq.u64 %r62,%r26,0; @ %r62 bra $L2; .loc 1 67 8 mul.f32 %r45,%r39,%r40; mul.f32 %r63,%r40,%r40; neg.f32 %r64,%r63; fma.rn.f32 %r39,%r39,%r39,%r64; add.f32 %r40,%r45,%r45; .loc 1 63 7 bra $L7; $L10: mov.f32 %r38,0f00000000; mov.f32 %r37,0f3f800000; $L2: .loc 1 72 10 st.f32 [%r46],%r37; st.f32 [%r46+4],%r38; .loc 1 73 1 ret; } pow_c8_i8.o/gfortran_pow_c8_i8 .visible .func _gfortran_pow_c8_i8); .file 1 "../../../libgfortran/generated/pow_c8_i8gfortran_pow_c8_i8 .visible .func _gfortran_pow_c8_i8f64 %r53; .reg .predpred %r59; .reg .f64 %r60; .reg .f64 %r61; .reg .predmov.u64 %r46,%ar0; mov.f64 %r47,%ar1; mov.f64 %r48,%ar2; mov.u64 %r49,%ar3; .loc 1 49mov.f64 %r39,%r47; mov.f64 %r40,%r48; .loc 1 51 10 setp.ge.s64 %r51,%r49,0; @ %r51 bra $L3; .loc 1 54 8 neg.s64 %r26,%r49; .loc 1 55 6 abs.f64 %r52,%r39; abs.f64 %r53,%r40; setp.lt.f64 %r54,%r52,%r53; @ ! %r54 bra $L18; div.rn.f64 %r29,%r39,%r40; fma.rn.f64 %r30,%r29,%r39,%r40; fma.rn.f64 %r32,%r29,0d0000000000000000,0dbff0000000000000; add.f64 %r55,%r29,0d0000000000000000; div.rn.f64 %r39,%r55,%r30; div.rn.f64 %r40,%r32,%r30; bra $L8; $L18: div.rn.f64 %r33,%r40,%r39; fma.rn.f64 %r34,%r40,%r33,%r39; fma.rn.f64 %r35,%r33,0d0000000000000000,0d3ff0000000000000; div.rn.f64 %r39,%r35,%r34; mov.f64 %r57,0d0000000000000000; sub.f64 %r56,%r57,%r33; div.rn.f64 %r40,%r56,%r34; $L8: .loc 1 41 1 mov.f64 %r38,0d0000000000000000; mov.f64 %r37,0d3ff0000000000000; bra $L7; $L3: .loc 1 59 7 mov.u64 %r26,%r49; bra $L8; $L7: .loc 1 63 10 and.b64 %r58,%r26,1; .loc 1 63 7 setp.eq.u64 %r59,%r58,0; @ %r59 bra $L9; .loc 1 64 10 mul.f64 %r42,%r38,%r39; mul.f64 %r60,%r38,%r40; neg.f64 %r61,%r60; fma.rn.f64 %r38,%r37,%r40,%r42; fma.rn.f64 %r37,%r37,%r39,%r61; $L9: .loc 1 65 6 shr.u64 %r26,%r26,1; .loc 1 66 7 setp.eq.u64 %r62,%r26,0; @ %r62 bra $L2; .loc 1 67 8 mul.f64 %r45,%r39,%r40; mul.f64 %r63,%r40,%r40; neg.f64 %r64,%r63; fma.rn.f64 %r39,%r39,%r39,%r64; add.f64 %r40,%r45,%r45; .loc 1 63 7 bra $L7; $L10: mov.f64 %r38,0d0000000000000000; mov.f64 %r37,0d3ff0000000000000; $L2: .loc 1 72 10 st.f64 [%r46],%r37; st.f64 [%r46+8],%r38; .loc 1 73 1 ret; } pow_c10_i8.o/ pow_i4_i16.o/gfortran_pow_i4_i16gfortran_pow_i4_i16fortran/generated/pow_i4_i16gfortran_pow_i4_i16gfortran_pow_i4_i16 (predpred64.loc 1 49 6 or.b64 %r29,%r66,%r67; setp.eq.u64 %r31,%r29,0; @ %r31 bra $L11; .loc 1 51 10 setp.ge.s64 %r33,%r67,0; @ %r33 bra $L4; .loc 1 53 7 setp.eq.u32 %r34,%r26,1; @ %r34 bra $L12; .loc 1 55 7 setp.ne.u32 %r35,%r26,-1; @ %r35 bra $L13; .loc 1 56 16 and.b64 %r37,%r66,1; .loc 1 56 26 setp.eq.u64 %r43,%r37,0; selp.u32 %r25,1,-1,%r43; bra $L1; $L4: .loc 1 61 7 mov.u64 %r64,%r66; mov.u64 %r65,%r67; .loc 1 48 7 mov.u32 %r25,1; $L10: .loc 1 65 10 and.b64 %r45,%r64,1; .loc 
1 65 7 setp.eq.u64 %r51,%r45,0; @ %r51 bra $L7; .loc 1 66 10 mul.lo.u32 %r25,%r25,%r26; $L7: .loc 1 67 6 shl.b64 %r53,%r65,63; shr.u64 %r55,%r64,1; or.b64 %r64,%r55,%r53; shr.u64 %r65,%r65,1; .loc 1 68 7 or.b64 %r60,%r64,%r65; setp.eq.u64 %r62,%r60,0; @ %r62 bra $L1; .loc 1 69 8 mul.lo.u32 %r26,%r26,%r26; .loc 1 65 7 bra $L10; $L11: .loc 1 56 26 mov.u32 %r25,1; bra $L1; $L12: mov.u32 %r25,%r26; bra $L1; $L13: .loc 1 57 28 mov.u32 %r25,0; $L1: .loc 1 75gfortran_pow_i8_i16gfortran_pow_i8_i16fortran/generated/pow_i8_i16gfortran_pow_i8_i16gfortran_pow_i8_i16pred %r43; .reg .u64 %r45; .reg .pred64.loc 1 49 6 or.b64 %r29,%r66,%r67; setp.eq.u64 %r31,%r29,0; @ %r31 bra $L11; .loc 1 51 10 setp.ge.s64 %r33,%r67,0; @ %r33 bra $L4; .loc 1 53 7 setp.eq.u64 %r34,%r26,1; @ %r34 bra $L12; .loc 1 55 7 setp.ne.u64 %r35,%r26,-1; @ %r35 bra $L13; .loc 1 56 16 and.b64 %r37,%r66,1; .loc 1 56 26 setp.eq.u64 %r43,%r37,0; selp.u64 %r25,1,-1,%r43; bra $L1; $L4: .loc 1 61 7 mov.u64 %r64,%r66; mov.u64 %r65,%r67; .loc 1 48 7 mov.u64 %r25,1; $L10: .loc 1 65 10 and.b64 %r45,%r64,1; .loc 1 65 7 setp.eq.u64 %r51,%r45,0; @ %r51 bra $L7; .loc 1 66 10 mul.lo.u64 %r25,%r25,%r26; $L7: .loc 1 67 6 shl.b64 %r53,%r65,63; shr.u64 %r55,%r64,1; or.b64 %r64,%r55,%r53; shr.u64 %r65,%r65,1; .loc 1 68 7 or.b64 %r60,%r64,%r65; setp.eq.u64 %r62,%r60,0; @ %r62 bra $L1; .loc 1 69 8 mul.lo.u64 %r26,%r26,%r26; .loc 1 65 7 bra $L10; $L11: .loc 1 56 26 mov.u64 %r25,1; bra $L1; $L12: mov.u64 %r25,%r26; bra $L1; $L13: .loc 1 57 28 mov.u64 %r25,0; $L1: .loc 1 75pow_i16_i16.o/_gfortran_pow_i16_i16 .visible .func _gfortran_pow_i16_i16fortran/generated/pow_i16_i16F: _gfortran_pow_i16_i16 .visible .func _gfortran_pow_i16_i16pred %r33; .reg .predu64 %r47; .reg .pred %r53; .reg .u64 %r56; .reg .u64 %r59; .reg .pred %r65; .reg .u64 %r77; .reg .u64 %r79; .reg .u64 %r84; .reg .pred103,[%r27]; ld.u64 %r104105,[%r29]; ld.u64 %r106,[%r29+8]; .loc 1 49 6 or.b64 %r31,%r105,%r106; setp.eq.u64 %r33,%r31,0; @ %r33 bra $L14; .loc 1 51 10 setp.ge.s64 %r35,%r106,0; @ %r35 bra $L4; .loc 1 53 7 setp.ne.u64 %r37,%r103,1; @ %r37 bra $L6; setp.ne.u64 %r39,%r104,0; @ %r39 bra $L8; .loc 1 54 13 st.u64 [%r26],%r103; st.u64 [%r26+8],%r104; bra $L1; $L6: .loc 1 55 7 setp.ne.u64 %r43,%r103,-1; @ %r43 bra $L8; setp.ne.u64 %r45,%r104,-1; @ %r45 bra $L8; .loc 1 56 16 and.b64 %r47,%r105,1; .loc 1 56 26 setp.ne.u64 %r53,%r47,0; selp.u64 %r103,%r104,1,%r53; selp.u64 %r104,%r104,0,%r53; st.u64 [%r26],%r103; st.u64 [%r26+8],%r104; bra $L1; $L8: .loc 1 57 28 mov.u64 %r56,0; st.u64 [%r26],%r56; st.u64 [%r26+8],%r56; bra $L1; $L4: .loc 1 61 7 mov.u64 %r101,%r105; mov.u64 %r102,%r106; .loc 1 48 7 mov.u64 %r99,1; mov.u64 %r100,0; .loc 1 66 10 add.u64 %r117,%frame,16; add.u64 %r118,%frame,32; $L13: .loc 1 65 10 and.b64 %r59,%r101,1; .loc 1 65 7 setp.eq.u64 %r65,%r59,0; @ %r65 bra $L10; .loc 1 66 10 st.u64 [%frame+16],%r99; st.u64 [%frame+24],%r100; st.u64 [%frame+32],%r103; st.u64 [%frame+40],%r104; {11799,[%frame]; ld.u64 %r100,[%frame+8]; $L10: .loc 1 67 6 shl.b64 %r77,%r102,63; shr.u64 %r79,%r101,1; or.b64 %r101,%r79,%r77; shr.u64 %r102,%r102,1; .loc 1 68 7 or.b64 %r84,%r101,%r102; setp.eq.u64 %r86,%r84,0; @ %r86 bra $L2; .loc 1 69 8 st.u64 [%frame+16],%r103; st.u64 [%frame+24],%r104; st.u64 [%frame+32],%r103; st.u64 [%frame+40],%r104; {117]; ld.u64 %r104,[%frame+8]; .loc 1 65 7 bra $L13; $L14: .loc 1 48 7 mov.u64 %r99,1; mov.u64 %r100,%r31; $L2: .loc 1 74 10 st.u64 [%r26],%r99; st.u64 [%r26+8],%r100; $L1: .loc 1 75 1 ret; } pow_r4_i16.o/gfortran_pow_r4_i16 .visible .func (.param .f32 
%value_out) _gfortran_pow_r4_i16fortran/generated/pow_r4_i16gfortran_pow_r4_i16 .visible .func (.param .f32 %value_out) _gfortran_pow_r4_i16pred.loc 1 49 6 or.b64 %r29,%r69,%r70; setp.eq.u64 %r31,%r29,0; @ %r31 bra $L11; .loc 1 51 10 setp.ge.s64 %r33,%r70,0; @ %r33 bra $L4; .loc 1 54 8 set.u32.ne.u64 %r41,%r69,0; cvt.s64.s32 %r39,%r41; neg.s64 %r67,%r69; sub.u64 %r68,%r39,%r70; .loc 1 55 6 mov.f32 %r46,0f3f800000; div.rn.f32 %r26,%r46,%r26; $L7: .loc 1 41 1 mov.f32 %r25,0f3f800000; bra $L6; $L4: .loc 1 59 7 mov.u64 %r67,%r69; mov.u64 %r68,%r70; bra $L7; $L6: .loc 1 63 10 and.b64 %r48,%r67,1; .loc 1 63 7 setp.eq.u64 %r54,%r48,0; @ %r54 bra $L8; .loc 1 64 10 mul.f32 %r25,%r25,%r26; $L8: .loc 1 65 6 shl.b64 %r56,%r68,63; shr.u64 %r58,%r67,1; or.b64 %r67,%r58,%r56; shr.u64 %r68,%r68,1; .loc 1 66 7 or.b64 %r63,%r67,%r68; setp.eq.u64 %r65,%r63,0; @ %r65 bra $L1; .loc 1 67 8 mul.f32 %r26,%r26,%r26; .loc 1 63 7 bra $L6; $L11: .loc 1 48 7 mov.f32 %r25,0f3f800000; $L1: .loc 1 73ortran_pow_r8_i16 .visible .func (.param .f64 %value_out) _gfortran_pow_r8_i16fortran/generated/pow_r8_i16gfortran_pow_r8_i16 .visible .func (.param .f64 %value_out) _gfortran_pow_r8_i16pred.loc 1 49 6 or.b64 %r29,%r69,%r70; setp.eq.u64 %r31,%r29,0; @ %r31 bra $L11; .loc 1 51 10 setp.ge.s64 %r33,%r70,0; @ %r33 bra $L4; .loc 1 54 8 set.u32.ne.u64 %r41,%r69,0; cvt.s64.s32 %r39,%r41; neg.s64 %r67,%r69; sub.u64 %r68,%r39,%r70; .loc 1 55 6 mov.f64 %r46,0d3ff0000000000000; div.rn.f64 %r26,%r46,%r26; $L7: .loc 1 41 1 mov.f64 %r25,0d3ff0000000000000; bra $L6; $L4: .loc 1 59 7 mov.u64 %r67,%r69; mov.u64 %r68,%r70; bra $L7; $L6: .loc 1 63 10 and.b64 %r48,%r67,1; .loc 1 63 7 setp.eq.u64 %r54,%r48,0; @ %r54 bra $L8; .loc 1 64 10 mul.f64 %r25,%r25,%r26; $L8: .loc 1 65 6 shl.b64 %r56,%r68,63; shr.u64 %r58,%r67,1; or.b64 %r67,%r58,%r56; shr.u64 %r68,%r68,1; .loc 1 66 7 or.b64 %r63,%r67,%r68; setp.eq.u64 %r65,%r63,0; @ %r65 bra $L1; .loc 1 67 8 mul.f64 %r26,%r26,%r26; .loc 1 63 7 bra $L6; $L11: .loc 1 48 7 mov.f64 %r25,0d3ff0000000000000; $L1: .loc 1 73pow_r10_i16pow_c4_i16.o/gfortran_pow_c4_i16 .visible .func _gfortran_pow_c4_i16); .file 1 "../../../libgfortran/generated/pow_c4_i16gfortran_pow_c4_i16 .visible .func _gfortran_pow_c4_i16pred %r53; .reg .predpredpred100; mov.u64 %r46,%ar0; mov.f32 %r47,%ar1; mov.f32 %r48,%ar2; mov.u64 %r49,%ar3; ld.u64 %r99,[%r49]; ld.u64 %r100,[%r49+8]; .loc 1 49 6 or.b64 %r51,%r99,%r100; setp.eq.u64 %r53,%r51,0; @ %r53 bra $L14; mov.f32 %r39,%r47; mov.f32 %r40,%r48; .loc 1 51 10 setp.ge.s64 %r55,%r100,0; @ %r55 bra $L4; .loc 1 54 8 set.u32.ne.u64 %r63,%r99,0; cvt.s64.s32 %r61,%r63; neg.s64 %r97,%r99; sub.u64 %r98,%r61,%r100; .loc 1 55 6 abs.f32 %r68,%r39; abs.f32 %r69,%r40; setp.lt.f32 %r70,%r68,%r69; @ ! 
%r70 bra $L22; div.rn.f32 %r29,%r39,%r40; fma.rn.f32 %r30,%r29,%r39,%r40; fma.rn.f32 %r32,%r29,0f00000000,0fbf800000; add.f32 %r71,%r29,0f00000000; div.rn.f32 %r39,%r71,%r30; div.rn.f32 %r40,%r32,%r30; bra $L10; $L22: div.rn.f32 %r33,%r40,%r39; fma.rn.f32 %r34,%r40,%r33,%r39; fma.rn.f32 %r35,%r33,0f00000000,0f3f800000; div.rn.f32 %r39,%r35,%r34; mov.f32 %r73,0f00000000; sub.f32 %r72,%r73,%r33; div.rn.f32 %r40,%r72,%r34; $L10: .loc 1 41 1 mov.f32 %r38,0f00000000; mov.f32 %r37,0f3f800000; bra $L9; $L4: .loc 1 59 7 mov.u64 %r97,%r99; mov.u64 %r98,%r100; bra $L10; $L9: .loc 1 63 10 and.b64 %r75,%r97,1; .loc 1 63 7 setp.eq.u64 %r81,%r75,0; @ %r81 bra $L11; .loc 1 64 10 mul.f32 %r42,%r38,%r39; mul.f32 %r82,%r38,%r40; neg.f32 %r83,%r82; fma.rn.f32 %r38,%r37,%r40,%r42; fma.rn.f32 %r37,%r37,%r39,%r83; $L11: .loc 1 65 6 shl.b64 %r85,%r98,63; shr.u64 %r87,%r97,1; or.b64 %r97,%r87,%r85; shr.u64 %r98,%r98,1; .loc 1 66 7 or.b64 %r92,%r97,%r98; setp.eq.u64 %r94,%r92,0; @ %r94 bra $L2; .loc 1 67 8 mul.f32 %r45,%r39,%r40; mul.f32 %r95,%r40,%r40; neg.f32 %r96,%r95; fma.rn.f32 %r39,%r39,%r39,%r96; add.f32 %r40,%r45,%r45; .loc 1 63 7 bra $L9; $L14: mov.f32 %r38,0f00000000; mov.f32 %r37,0f3f800000; $L2: .loc 1 72 10 st.f32 [%r46],%r37; st.f32 [%r46+4],%r38; .loc 1 73 1 ret; } pow_c8_i16.o/gfortran_pow_c8_i16 .visible .func _gfortran_pow_c8_i16); .file 1 "../../../libgfortran/generated/pow_c8_i16gfortran_pow_c8_i16 .visible .func _gfortran_pow_c8_i16.reg .f64 %r38; .reg .f64 %r39; .reg .f64 %r40; .reg .f64 %r42; .reg .f64 %r45; .reg .u64 %r46; .reg .f64 %r47; .reg .f64 %r48; .reg .u64 %r49; .reg .u64 %r51; .reg .pred %r53; .reg .predpredpred100; mov.u64 %r46,%ar0; mov.f64 %r47,%ar1; mov.f64 %r48,%ar2; mov.u64 %r49,%ar3; ld.u64 %r99,[%r49]; ld.u64 %r100,[%r49+8]; .loc 1 49 6 or.b64 %r51,%r99,%r100; setp.eq.u64 %r53,%r51,0; @ %r53 bra $L14; mov.f64 %r39,%r47; mov.f64 %r40,%r48; .loc 1 51 10 setp.ge.s64 %r55,%r100,0; @ %r55 bra $L4; .loc 1 54 8 set.u32.ne.u64 %r63,%r99,0; cvt.s64.s32 %r61,%r63; neg.s64 %r97,%r99; sub.u64 %r98,%r61,%r100; .loc 1 55 6 abs.f64 %r68,%r39; abs.f64 %r69,%r40; setp.lt.f64 %r70,%r68,%r69; @ ! 
%r70 bra $L22; div.rn.f64 %r29,%r39,%r40; fma.rn.f64 %r30,%r29,%r39,%r40; fma.rn.f64 %r32,%r29,0d0000000000000000,0dbff0000000000000; add.f64 %r71,%r29,0d0000000000000000; div.rn.f64 %r39,%r71,%r30; div.rn.f64 %r40,%r32,%r30; bra $L10; $L22: div.rn.f64 %r33,%r40,%r39; fma.rn.f64 %r34,%r40,%r33,%r39; fma.rn.f64 %r35,%r33,0d0000000000000000,0d3ff0000000000000; div.rn.f64 %r39,%r35,%r34; mov.f64 %r73,0d0000000000000000; sub.f64 %r72,%r73,%r33; div.rn.f64 %r40,%r72,%r34; $L10: .loc 1 41 1 mov.f64 %r38,0d0000000000000000; mov.f64 %r37,0d3ff0000000000000; bra $L9; $L4: .loc 1 59 7 mov.u64 %r97,%r99; mov.u64 %r98,%r100; bra $L10; $L9: .loc 1 63 10 and.b64 %r75,%r97,1; .loc 1 63 7 setp.eq.u64 %r81,%r75,0; @ %r81 bra $L11; .loc 1 64 10 mul.f64 %r42,%r38,%r39; mul.f64 %r82,%r38,%r40; neg.f64 %r83,%r82; fma.rn.f64 %r38,%r37,%r40,%r42; fma.rn.f64 %r37,%r37,%r39,%r83; $L11: .loc 1 65 6 shl.b64 %r85,%r98,63; shr.u64 %r87,%r97,1; or.b64 %r97,%r87,%r85; shr.u64 %r98,%r98,1; .loc 1 66 7 or.b64 %r92,%r97,%r98; setp.eq.u64 %r94,%r92,0; @ %r94 bra $L2; .loc 1 67 8 mul.f64 %r45,%r39,%r40; mul.f64 %r95,%r40,%r40; neg.f64 %r96,%r95; fma.rn.f64 %r39,%r39,%r39,%r96; add.f64 %r40,%r45,%r45; .loc 1 63 7 bra $L9; $L14: mov.f64 %r38,0d0000000000000000; mov.f64 %r37,0d3ff0000000000000; $L2: .loc 1 72 10 st.f64 [%r46],%r37; st.f64 [%r46+8],%r38; .loc 1 pack_i1.o/_gfortrani_pack_i1 .visible .func _gfortrani_pack_fortran/generated/pack_i1.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_count_0_gfortrani_count_0_gfortrani_xmallocarr_gfortrani_xmallocarray110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,910,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,65,67,75,32,105,110,116,114,105,110,115,105,99,59,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortrani_pack_i1 .visible .func _gfortrani_pack_480predu64 %r146; .reg .u64predpredpredpredpredu16pred %r222; .reg .predpred %r243; .reg .predu64pred %r262; .reg .u64 %r265; .reg .u64 %r266; .reg .u64 %r267; mov.u64 %r124,%ar0; mov.u64 %r125,%ar1; mov.u64 %r126,%ar2; mov.u64 %r127,%ar3; .loc 1 97 9 ld.s8 %r22,[%r125+28]; .loc 1 97 7 cvt.u32.u32 %r128,%r22; cvt.s64.s8 %r80,%r128; .loc 1 99 8 ld.u64 %r98,[%r126]; .loc 1 104 15 ld.u64 %r23,[%r126+16]; .loc 1 104 13 cvt.u32.u64 %r81,%r23; .loc 1 106 53 add.u32 %r129,%r81,-4; and.b32 %r130,%r129,-5; set.u32.eq.u32 %r132,%r130,0; neg.s32 %r133,%r132; .loc 1 106 22 add.u32 %r134,%r81,-1; set.u32.le.u32 %r136,%r134,1; neg.s32 %r137,%r136; .loc 1 106 6 cvt.u16.u32 %r139,%r133; cvt.u16.u32 %r140,%r137; or.b16 %r138,%r139,%r140; cvt.u32.u16 %r141,%r138; cvt.u16.u8 %r142,%r141; setp.eq.u16 %r143,%r142,0; @ %r143 bra $L2; $L5: .loc 1 120 3 setp.gt.s64 %r144,%r80,0; @ %r144 bra $L3; mov.u32 %r78,0; bra $L4; $L2: .loc 1 108 7 setp.eq.u32 %r145,%r81,16; @ %r145 bra $L5; .loc 1 117 5 cvta.const.u64 %r146146_gfortran_runtime_error122 16 cvt.u32.u32 %r149,%r22; cvt.s64.s8 %r148,%r149; shl.b64 %r101,%r148,3; add.u64 %r150,%frame,120; mov.u32 %r154mov.u64 %r114,40; mov.u64 %r115,0; cvt.u32.u64 %r78,%r115; add.u64 %r259,%frame,360; add.u64 %r257,%frame,240; $L7: add.u64 %r107,%r125,%r114; .loc 1 123 19 ld.u64 %r159,[%r107+16]; add.u64 %r158,%r159,1; ld.u64 %r160,[%r107+8]; sub.u64 %r33,%r158,%r160; .loc 1 123 17 add.u64 %r161,%frame,%r115; st.u64 [%r161],%r33; .loc 1 124 10 setp.gt.s64 
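The integer power members above all follow the same pattern, so a single hedged sketch covers them. This is a minimal C rendering of the control flow that the .loc 48-75 references and the and/shr/mul sequences in the PTX suggest; the function name, the exact types, and the handling of 0 raised to a negative power are illustrative assumptions, not libgfortran's exact source.

#include <stdint.h>

/* Sketch of the exponentiation-by-squaring loop behind _gfortran_pow_i4_i4
   and its wider-kind siblings.  Illustrative only. */
static int32_t pow_i4_i4_sketch(int32_t base, int32_t n)
{
    if (n == 0)
        return 1;                       /* x**0 == 1 */
    if (n < 0) {
        if (base == 1)
            return 1;                   /* 1**n stays 1 */
        if (base == -1)
            return (n & 1) ? -1 : 1;    /* (-1)**n alternates sign */
        return 0;                       /* |base| > 1 truncates to 0; 0**n is not handled in this sketch */
    }
    int32_t pow = 1;
    for (;;) {
        if (n & 1)
            pow *= base;                /* fold in the current exponent bit */
        n >>= 1;
        if (n == 0)
            break;
        base *= base;                   /* square for the next bit */
    }
    return pow;
}

The wider kinds only change the arithmetic width: the i16 variants keep the 128-bit accumulator as two 64-bit halves (the paired st.u64 [%frame+16]/[%frame+24] stores above), and the complex variants replace the scalar multiply with an fma-based complex multiply.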
[Damaged binary payload, continued: the PACK members of the same nvptx libgfortran archive.]

The stream continues with the pack_* members, each compiled from a truncated .file directive reading ../../../libgfortran/generated/pack_<kind>.c and exporting one function:

  pack_i1.o   _gfortrani_pack_i1     pack_i8.o   _gfortrani_pack_i8
  pack_i2.o   _gfortrani_pack_i2     pack_i16.o  _gfortrani_pack_i16
  pack_i4.o   _gfortrani_pack_i4     pack_r4.o   _gfortrani_pack_r4 (header only; its body is cut off at the end of this span)

Every member declares the same externals (_gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_count_0, _gfortrani_xmallocarray) and two constant byte arrays whose decimal initializers appear to decode to the PACK runtime error messages ("...nny sized logical arr..." and "...correct extent in return value of PACK intrinsic; is %ld, should be %..."). The bodies differ only in element width (1, 2, 4, 8 or 16 bytes per copy): they compute per-dimension extents and strides from the array and mask descriptors, size the result either from the VECTOR extent or by calling _gfortrani_count_0, allocate it with _gfortrani_xmallocarray when the return descriptor is unallocated, copy the elements whose mask byte is nonzero, and finally top the result up from the optional VECTOR argument. A simplified C sketch of these semantics follows below.
%r22,[%r132+28]; .loc 1 97 7 cvt.u32.u32 %r135,%r22; cvt.s64.s8 %r91,%r135; .loc 1 99 8 ld.u64 %r103,[%r133]; .loc 1 104 15 ld.u64 %r23,[%r133+16]; .loc 1 104 13 cvt.u32.u64 %r92,%r23; .loc 1 106 53 add.u32 %r136,%r92,-4; and.b32 %r137,%r136,-5; set.u32.eq.u32 %r139,%r137,0; neg.s32 %r140,%r139; .loc 1 106 22 add.u32 %r141,%r92,-1; set.u32.le.u32 %r143,%r141,1; neg.s32 %r144,%r143; .loc 1 106 6 cvt.u16.u32 %r146,%r140; cvt.u16.u32 %r147,%r144; or.b16 %r145,%r146,%r147; cvt.u32.u16 %r148,%r145; cvt.u16.u8 %r149,%r148; setp.eq.u16 %r150,%r149,0; @ %r150 bra $L2; $L5: .loc 1 120 3 setp.gt.s64 %r151,%r91,0; @ %r151 bra $L3; mov.u32 %r87,0; bra $L4; $L2: .loc 1 108 7 setp.eq.u32 %r152,%r92,16; @ %r152 bra $L5; .loc 1 117 5 cvta.const.u64 %r11_gfortran_runtime_error122 16 cvt.u32.u32 %r156,%r22; cvt.s64.s8 %r155,%r156; shl.b64 %r114,%r155,3; add.u64 %r157,%frame,120; mov.u32 %r16111163,[%value_in]; } mov.u64 %r123,40; mov.u64 %r124,0; cvt.u32.u64 %r87,%r124; add.u64 %r268,%frame,360; add.u64 %r267,%frame,240; $L7: add.u64 %r117,%r132,%r123; .loc 1 123 19 ld.u64 %r166,[%r117+16]; add.u64 %r165,%r166,1; ld.u64 %r167,[%r117+8]; sub.u64 %r32,%r165,%r167; .loc 1 123 17 add.u64 %r168,%frame,%r124; st.u64 [%r168],%r32; .loc 1 124 10 setp.gt.s64 %r169,%r32,0; .loc 1 125 19 selp.u32 %r87,%r87,1,%r169; .loc 1 126 18 add.u64 %r171,%r268,%r124; ld.u64 %r172,[%r117]; st.u64 [%r171],%r172; .loc 1 127 18 add.u64 %r174,%r267,%r124; .loc 1 127 20 add.u64 %r175,%r133,%r123; ld.u64 %r177,[%r175]; mul.lo.u64 %r176,%r177,%r23; .loc 1 127 18 st.u64 [%r174],%r176; .loc 1 120 3 add.u64 %r124,%r124,8; add.u64 %r123,%r123,24; setp.ne.u64 %r178,%r114,%r124; @ %r178 bra $L7; $L4: .loc 1 129 14 ld.u64 %r38,[%frame+360]; .loc 1 129 6 setp.ne.u64 %r179,%r38,0; .loc 1 130 16 selp.u64 %r38,%r38,1,%r179; .loc 1 131 6 ld.u64 %r180,[%frame+240]; setp.ne.u64 %r181,%r180,0; @ %r181 bra $L9; .loc 1 132 16 cvt.s64.s32 %r182,%r92; st.u64 [%frame+240],%r182; $L9: .loc 1 134 6 setp.ne.u32 %r183,%r87,0; @ %r183 bra $L34; .loc 1 137 10 ld.u64 %r88,[%r132]; bra $L10; $L34: .loc 1 135 10 mov.u64 %r88,0; $L10: .loc 1 139 10 ld.u64 %r41,[%r131]; mov.u64 %r270,%r41; .loc 1 139 6 setp.eq.u64 %r271,%r41,0; @ %r271 bra $L11; .loc 1 139 33 cvta.global.u64 %r185,_gfortrani_compile_options; .loc 1 139 30 ld.u32 %r186,[%r185+36]; setp.eq.u32 %r187,%r186,0; @ ! %r187 bra $L11; setp.eq.u64 %r269,%r134,0; bra $L12; $L11: .loc 1 144 10 setp.eq.u64 %r269,%r134,0; @ %r269 bra $L13; .loc 1 148 12 ld.u64 %r190,[%r134+56]; add.u64 %r189,%r190,1; .loc 1 148 10 ld.u64 %r191,[%r134+48]; sub.u64 %r89,%r189,%r191; .loc 1 149 7 setp.lt.s64 %r192,%r89,0; @ ! 
%r192 bra $L15; bra $L14; $L13: .loc 1 158 12133; call (%value_in),_gfortrani_count_0,(%out_arg1); ld.param.u64 %r194,[%value_in]; } mov.u64 %r89,%r194; .loc 1 161 14 ld.u64 %r270,[%r131]; $L15: .loc 1 161 10 setp.ne.u64 %r195,%r270,0; @ %r195 bra $L16; .loc 1 164 4 st.u64 [%r131+48],%r270; add.u64 %r197,%r89,-1; st.u64 [%r131+56],%r197; mov.u64 %r198,1; st.u64 [%r131+40],%r198; .loc 1 166 16 st.u64 [%r131+8],%r270; .loc 1 169 2189201; call (%value_in),_gfortrani_xmallocarrayt.u64 [%r131],%r41; .loc 1 171 7 setp.ne.u64 %r204,%r89,0; @ %r204 bra $L12; bra $L1; $L16: .loc 1 179 17 ld.u64 %r206,[%r131+56]; add.u64 %r205,%r206,1; .loc 1 179 15 ld.u64 %r207,[%r131+48]; sub.u64 %r95,%r205,%r207; .loc 1 180 7 setp.ne.u64 %r208,%r95,%r89; @ %r208 bra $L17; .loc 1 192 8 mov.u64 %r41,%r270; setp.eq.u64 %r269,%r134,0; bra $L12; $L17: .loc 1 181 6 st.u64 [%stack+8],%r95; st.u64 [%stack],%r89;_gfortran_runtime_error12: .loc 1 187 12 ld.u64 %r82,[%r131+40]; .loc 1 188 6 setp.ne.u64 %r211,%r82,0; .loc 1 189 14 selp.u64 %r82,%r82,1,%r211; .loc 1 191 12 ld.u64 %r97,[%frame+240]; .loc 1 194 15 set.u32.ne.u64 %r213,%r103,0; neg.s32 %r214,%r213; .loc 1 194 10 set.u32.ne.u64 %r216,%r88,0; neg.s32 %r217,%r216; .loc 1 194 15 cvt.u16.u32 %r219,%r214; cvt.u16.u32 %r220,%r217; and.b16 %r218,%r219,%r220; .loc 1 194 9 cvt.u32.u16 %r221,%r218; cvt.u16.u8 %r222,%r221; setp.eq.u16 %r223,%r222,0; @ %r223 bra $L35; .loc 1 201 16 shl.b64 %r54,%r82,2; .loc 1 204 12 shl.b64 %r56,%r38,2; ld.u64 %r94,[%frame+120]; .loc 1 208 32 ld.u64 %r115,[%frame]; .loc 1 215 30 mul.lo.u64 %r28,%r115,%r38; .loc 1 216 30 mul.lo.u64 %r224,%r115,%r97; .loc 1 216 16 neg.s64 %r112,%r224; .loc 1 192 8 mov.u64 %r90,%r41; setp.le.s64 %r266,%r91,1; add.u64 %r274,%frame,360; add.u64 %r275,%frame,240; .loc 1 212 20 mov.u64 %r276,0; $L28: .loc 1 197 10 ld.s8 %r226,[%r103]; cvt.u16.u32 %r225,%r226; setp.eq.u16 %r227,%r225,0; @ %r227 bra $L20; .loc 1 200 12 ld.f32 %r52,[%r88]; .loc 1 200 10 st.f32 [%r90],%r52; .loc 1 201 16 add.u64 %r90,%r90,%r54; .loc 1 204 12 add.u64 %r88,%r88,%r56; .loc 1 205 12 add.u64 %r103,%r103,%r97; .loc 1 206 15 add.u64 %r94,%r94,1; .loc 1 208 13 setp.eq.u64 %r228,%r94,%r115; @ %r228 bra $L21; bra $L28; $L20: .loc 1 204 12 add.u64 %r88,%r88,%r56; .loc 1 205 12 add.u64 %r103,%r103,%r97; .loc 1 206 15 add.u64 %r94,%r94,1; .loc 1 208 13 setp.ne.u64 %r229,%r94,%r115; @ %r229 bra $L28; $L21: .loc 1 216 16 add.u64 %r104,%r103,%r112; .loc 1 218 14 @ ! 
%r266 bra $L52; bra $L19; $L27: .loc 1 212 20 st.u64 [%r83],%r276; .loc 1 215 30 mul.lo.u64 %r58,%r64,%r63; .loc 1 216 30 mul.lo.u64 %r232,%r66,%r63; .loc 1 216 16 sub.u64 %r104,%r103,%r232; .loc 1 217 12 add.u64 %r105,%r105,1; .loc 1 218 14 add.u64 %r83,%r83,8; add.u64 %r129,%r129,8; setp.ne.u64 %r233,%r91,%r105; @ %r233 bra $L26; bra $L19; $L52: add.u64 %r83,%frame,128; .loc 1 215 30 mov.u64 %r58,%r28; .loc 1 218 14 mov.u64 %r129,8; .loc 1 217 12 mov.u64 %r105,1; $L26: .loc 1 226 23 ld.u64 %r235,[%r83]; add.u64 %r63,%r235,1; st.u64 [%r83],%r63; .loc 1 227 30 add.u64 %r237,%r274,%r129; ld.u64 %r64,[%r237]; .loc 1 227 20 sub.u64 %r238,%r64,%r58; shl.b64 %r239,%r238,2; add.u64 %r88,%r88,%r239; .loc 1 228 30 add.u64 %r241,%r275,%r129; ld.u64 %r66,[%r241]; .loc 1 228 20 add.u64 %r103,%r104,%r66; .loc 1 208 32 add.u64 %r242,%frame,%r129; ld.u64 %r68,[%r242]; .loc 1 208 13 setp.eq.u64 %r243,%r63,%r68; @ %r243 bra $L27; .loc 1 212 20 mov.u64 %r94,0; bra $L28; $L35: .loc 1 192 8 mov.u64 %r90,%r41; $L19: .loc 1 234 6 @ %r269 bra $L1; .loc 1 236 11 ld.u64 %r246,[%r134+56]; add.u64 %r245,%r246,1; .loc 1 236 9 ld.u64 %r247,[%r134+48]; sub.u64 %r98,%r245,%r247; .loc 1 237 22 sub.u64 %r248,%r90,%r41; shr.s64 %r250,%r248,2; .loc 1 237 13 div.s64 %r99,%r250,%r82; .loc 1 238 10 setp.le.s64 %r251,%r98,%r99; @ %r251 bra $L1; .loc 1 240 20 ld.u64 %r84,[%r134+40]; .loc 1 241 14 setp.eq.u64 %r252,%r84,0; @ %r252 bra $L36; .loc 1 244 47 mul.lo.u64 %r130,%r99,%r84; bra $L30; $L36: mov.u64 %r130,%r99; .loc 1 242 22 mov.u64 %r84,1; $L30: .loc 1 244 36 shl.b64 %r253,%r130,2; .loc 1 244 16 ld.u64 %r254,[%r134]; add.u64 %r102,%r254,%r253; .loc 1 245 13 sub.u64 %r100,%r98,%r99; .loc 1 246 19 add.u64 %r101,%r100,-1; .loc 1 246 17 setp.eq.u64 %r255,%r100,0; @ %r255 bra $L1; .loc 1 249 20 shl.b64 %r79,%r82,2; .loc 1 250 20 shl.b64 %r81,%r84,2; $L31: .loc 1 248 16 ld.f32 %r77,[%r102]; .loc 1 248 14 st.f32 [%r90],%r77; .loc 1 249 20 add.u64 %r90,%r90,%r79; .loc 1 250 20 add.u64 %r102,%r102,%r81; .loc 1 246 19 add.u64 %r101,%r101,-1; .loc 1 246 17 setp.ne.u64 %r256,%r101,-1; @ %r256 bra $L31; bra $L1; $L33: .loc 1 164 4 st.u64 [%r131+48],%r270; mov.u64 %r258,-1; st.u64 [%r131+56],%r258; mov.u64 %r259,1; st.u64 [%r131+40],%r259; .loc 1 166 16 st.u64 [%r131+8],%r270; .loc 1 169 21262; call (%value_in),_gfortrani_xmallocarray263,[%value_in]; } .loc 1 169 19 st.u64 [%r131],%r263; bra $L1; $L14: .loc 1 161 10 @ %r271 bra $L33; .loc 1 151 14 mov.u64 %r134,0; .loc 1 152 15 mov.u64 %r89,%r134; bra $L16; $L1: .loc 1 254 1 ret; } pack_r8.o/_gfortrani_pack_r8 .visible .func _gfortrani_pack_r8fortran/generated/pack_r8.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_count_0_gfortrani_count_0_gfortrani_xmallocarr_gfortrani_xmallocarray110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,910,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,65,67,75,32,105,110,116,114,105,110,115,105,99,59,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortrani_pack_r8 .visible .func _gfortrani_pack_r8480pred %r150; .reg .predpredpredu64 %r206; .reg .u64 %r207; .reg .predpred %r229; .reg .u64 %r232; .reg .predpred %r251; .reg .predmov.u64 %r131,%ar0; mov.u64 %r132,%ar1; mov.u64 %r133,%ar2; mov.u64 %r134,%ar3; .loc 1 97 9 ld.s8 %r22,[%r132+28]; .loc 1 97 7 cvt.u32.u32 %r135,%r22; cvt.s64.s8 %r91,%r135; 
.loc 1 99 8 ld.u64 %r103,[%r133]; .loc 1 104 15 ld.u64 %r23,[%r133+16]; .loc 1 104 13 cvt.u32.u64 %r92,%r23; .loc 1 106 53 add.u32 %r136,%r92,-4; and.b32 %r137,%r136,-5; set.u32.eq.u32 %r139,%r137,0; neg.s32 %r140,%r139; .loc 1 106 22 add.u32 %r141,%r92,-1; set.u32.le.u32 %r143,%r141,1; neg.s32 %r144,%r143; .loc 1 106 6 cvt.u16.u32 %r146,%r140; cvt.u16.u32 %r147,%r144; or.b16 %r145,%r146,%r147; cvt.u32.u16 %r148,%r145; cvt.u16.u8 %r149,%r148; setp.eq.u16 %r150,%r149,0; @ %r150 bra $L2; $L5: .loc 1 120 3 setp.gt.s64 %r151,%r91,0; @ %r151 bra $L3; mov.u32 %r87,0; bra $L4; $L2: .loc 1 108 7 setp.eq.u32 %r152,%r92,16; @ %r152 bra $L5; .loc 1 117 5 cvta.const.u64 %r11_gfortran_runtime_error122 16 cvt.u32.u32 %r156,%r22; cvt.s64.s8 %r155,%r156; shl.b64 %r114,%r155,3; add.u64 %r157,%frame,120; mov.u32 %r16111163,[%value_in]; } mov.u64 %r123,40; mov.u64 %r124,0; cvt.u32.u64 %r87,%r124; add.u64 %r268,%frame,360; add.u64 %r267,%frame,240; $L7: add.u64 %r117,%r132,%r123; .loc 1 123 19 ld.u64 %r166,[%r117+16]; add.u64 %r165,%r166,1; ld.u64 %r167,[%r117+8]; sub.u64 %r32,%r165,%r167; .loc 1 123 17 add.u64 %r168,%frame,%r124; st.u64 [%r168],%r32; .loc 1 124 10 setp.gt.s64 %r169,%r32,0; .loc 1 125 19 selp.u32 %r87,%r87,1,%r169; .loc 1 126 18 add.u64 %r171,%r268,%r124; ld.u64 %r172,[%r117]; st.u64 [%r171],%r172; .loc 1 127 18 add.u64 %r174,%r267,%r124; .loc 1 127 20 add.u64 %r175,%r133,%r123; ld.u64 %r177,[%r175]; mul.lo.u64 %r176,%r177,%r23; .loc 1 127 18 st.u64 [%r174],%r176; .loc 1 120 3 add.u64 %r124,%r124,8; add.u64 %r123,%r123,24; setp.ne.u64 %r178,%r114,%r124; @ %r178 bra $L7; $L4: .loc 1 129 14 ld.u64 %r38,[%frame+360]; .loc 1 129 6 setp.ne.u64 %r179,%r38,0; .loc 1 130 16 selp.u64 %r38,%r38,1,%r179; .loc 1 131 6 ld.u64 %r180,[%frame+240]; setp.ne.u64 %r181,%r180,0; @ %r181 bra $L9; .loc 1 132 16 cvt.s64.s32 %r182,%r92; st.u64 [%frame+240],%r182; $L9: .loc 1 134 6 setp.ne.u32 %r183,%r87,0; @ %r183 bra $L34; .loc 1 137 10 ld.u64 %r88,[%r132]; bra $L10; $L34: .loc 1 135 10 mov.u64 %r88,0; $L10: .loc 1 139 10 ld.u64 %r41,[%r131]; mov.u64 %r270,%r41; .loc 1 139 6 setp.eq.u64 %r271,%r41,0; @ %r271 bra $L11; .loc 1 139 33 cvta.global.u64 %r185,_gfortrani_compile_options; .loc 1 139 30 ld.u32 %r186,[%r185+36]; setp.eq.u32 %r187,%r186,0; @ ! %r187 bra $L11; setp.eq.u64 %r269,%r134,0; bra $L12; $L11: .loc 1 144 10 setp.eq.u64 %r269,%r134,0; @ %r269 bra $L13; .loc 1 148 12 ld.u64 %r190,[%r134+56]; add.u64 %r189,%r190,1; .loc 1 148 10 ld.u64 %r191,[%r134+48]; sub.u64 %r89,%r189,%r191; .loc 1 149 7 setp.lt.s64 %r192,%r89,0; @ ! 
%r192 bra $L15; bra $L14; $L13: .loc 1 158 12133; call (%value_in),_gfortrani_count_0,(%out_arg1); ld.param.u64 %r194,[%value_in]; } mov.u64 %r89,%r194; .loc 1 161 14 ld.u64 %r270,[%r131]; $L15: .loc 1 161 10 setp.ne.u64 %r195,%r270,0; @ %r195 bra $L16; .loc 1 164 4 st.u64 [%r131+48],%r270; add.u64 %r197,%r89,-1; st.u64 [%r131+56],%r197; mov.u64 %r198,1; st.u64 [%r131+40],%r198; .loc 1 166 16 st.u64 [%r131+8],%r270; .loc 1 169 21 mov.u64 %r20189201; call (%value_in),_gfortrani_xmallocarrayt.u64 [%r131],%r41; .loc 1 171 7 setp.ne.u64 %r204,%r89,0; @ %r204 bra $L12; bra $L1; $L16: .loc 1 179 17 ld.u64 %r206,[%r131+56]; add.u64 %r205,%r206,1; .loc 1 179 15 ld.u64 %r207,[%r131+48]; sub.u64 %r95,%r205,%r207; .loc 1 180 7 setp.ne.u64 %r208,%r95,%r89; @ %r208 bra $L17; .loc 1 192 8 mov.u64 %r41,%r270; setp.eq.u64 %r269,%r134,0; bra $L12; $L17: .loc 1 181 6 st.u64 [%stack+8],%r95; st.u64 [%stack],%r89;_gfortran_runtime_error12: .loc 1 187 12 ld.u64 %r82,[%r131+40]; .loc 1 188 6 setp.ne.u64 %r211,%r82,0; .loc 1 189 14 selp.u64 %r82,%r82,1,%r211; .loc 1 191 12 ld.u64 %r97,[%frame+240]; .loc 1 194 15 set.u32.ne.u64 %r213,%r103,0; neg.s32 %r214,%r213; .loc 1 194 10 set.u32.ne.u64 %r216,%r88,0; neg.s32 %r217,%r216; .loc 1 194 15 cvt.u16.u32 %r219,%r214; cvt.u16.u32 %r220,%r217; and.b16 %r218,%r219,%r220; .loc 1 194 9 cvt.u32.u16 %r221,%r218; cvt.u16.u8 %r222,%r221; setp.eq.u16 %r223,%r222,0; @ %r223 bra $L35; .loc 1 201 16 shl.b64 %r54,%r82,3; .loc 1 204 12 shl.b64 %r56,%r38,3; ld.u64 %r94,[%frame+120]; .loc 1 208 32 ld.u64 %r115,[%frame]; .loc 1 215 30 mul.lo.u64 %r28,%r115,%r38; .loc 1 216 30 mul.lo.u64 %r224,%r115,%r97; .loc 1 216 16 neg.s64 %r112,%r224; .loc 1 192 8 mov.u64 %r90,%r41; setp.le.s64 %r266,%r91,1; add.u64 %r274,%frame,360; add.u64 %r275,%frame,240; .loc 1 212 20 mov.u64 %r276,0; $L28: .loc 1 197 10 ld.s8 %r226,[%r103]; cvt.u16.u32 %r225,%r226; setp.eq.u16 %r227,%r225,0; @ %r227 bra $L20; .loc 1 200 12 ld.f64 %r52,[%r88]; .loc 1 200 10 st.f64 [%r90],%r52; .loc 1 201 16 add.u64 %r90,%r90,%r54; .loc 1 204 12 add.u64 %r88,%r88,%r56; .loc 1 205 12 add.u64 %r103,%r103,%r97; .loc 1 206 15 add.u64 %r94,%r94,1; .loc 1 208 13 setp.eq.u64 %r228,%r94,%r115; @ %r228 bra $L21; bra $L28; $L20: .loc 1 204 12 add.u64 %r88,%r88,%r56; .loc 1 205 12 add.u64 %r103,%r103,%r97; .loc 1 206 15 add.u64 %r94,%r94,1; .loc 1 208 13 setp.ne.u64 %r229,%r94,%r115; @ %r229 bra $L28; $L21: .loc 1 216 16 add.u64 %r104,%r103,%r112; .loc 1 218 14 @ ! 
%r266 bra $L52; bra $L19; $L27: .loc 1 212 20 st.u64 [%r83],%r276; .loc 1 215 30 mul.lo.u64 %r58,%r64,%r63; .loc 1 216 30 mul.lo.u64 %r232,%r66,%r63; .loc 1 216 16 sub.u64 %r104,%r103,%r232; .loc 1 217 12 add.u64 %r105,%r105,1; .loc 1 218 14 add.u64 %r83,%r83,8; add.u64 %r129,%r129,8; setp.ne.u64 %r233,%r91,%r105; @ %r233 bra $L26; bra $L19; $L52: add.u64 %r83,%frame,128; .loc 1 215 30 mov.u64 %r58,%r28; .loc 1 218 14 mov.u64 %r129,8; .loc 1 217 12 mov.u64 %r105,1; $L26: .loc 1 226 23 ld.u64 %r235,[%r83]; add.u64 %r63,%r235,1; st.u64 [%r83],%r63; .loc 1 227 30 add.u64 %r237,%r274,%r129; ld.u64 %r64,[%r237]; .loc 1 227 20 sub.u64 %r238,%r64,%r58; shl.b64 %r239,%r238,3; add.u64 %r88,%r88,%r239; .loc 1 228 30 add.u64 %r241,%r275,%r129; ld.u64 %r66,[%r241]; .loc 1 228 20 add.u64 %r103,%r104,%r66; .loc 1 208 32 add.u64 %r242,%frame,%r129; ld.u64 %r68,[%r242]; .loc 1 208 13 setp.eq.u64 %r243,%r63,%r68; @ %r243 bra $L27; .loc 1 212 20 mov.u64 %r94,0; bra $L28; $L35: .loc 1 192 8 mov.u64 %r90,%r41; $L19: .loc 1 234 6 @ %r269 bra $L1; .loc 1 236 11 ld.u64 %r246,[%r134+56]; add.u64 %r245,%r246,1; .loc 1 236 9 ld.u64 %r247,[%r134+48]; sub.u64 %r98,%r245,%r247; .loc 1 237 22 sub.u64 %r248,%r90,%r41; shr.s64 %r250,%r248,3; .loc 1 237 13 div.s64 %r99,%r250,%r82; .loc 1 238 10 setp.le.s64 %r251,%r98,%r99; @ %r251 bra $L1; .loc 1 240 20 ld.u64 %r84,[%r134+40]; .loc 1 241 14 setp.eq.u64 %r252,%r84,0; @ %r252 bra $L36; .loc 1 244 47 mul.lo.u64 %r130,%r99,%r84; bra $L30; $L36: mov.u64 %r130,%r99; .loc 1 242 22 mov.u64 %r84,1; $L30: .loc 1 244 36 shl.b64 %r253,%r130,3; .loc 1 244 16 ld.u64 %r254,[%r134]; add.u64 %r102,%r254,%r253; .loc 1 245 13 sub.u64 %r100,%r98,%r99; .loc 1 246 19 add.u64 %r101,%r100,-1; .loc 1 246 17 setp.eq.u64 %r255,%r100,0; @ %r255 bra $L1; .loc 1 249 20 shl.b64 %r79,%r82,3; .loc 1 250 20 shl.b64 %r81,%r84,3; $L31: .loc 1 248 16 ld.f64 %r77,[%r102]; .loc 1 248 14 st.f64 [%r90],%r77; .loc 1 249 20 add.u64 %r90,%r90,%r79; .loc 1 250 20 add.u64 %r102,%r102,%r81; .loc 1 246 19 add.u64 %r101,%r101,-1; .loc 1 246 17 setp.ne.u64 %r256,%r101,-1; @ %r256 bra $L31; bra $L1; $L33: .loc 1 164 4 st.u64 [%r131+48],%r270; mov.u64 %r258,-1; st.u64 [%r131+56],%r258; mov.u64 %r259,1; st.u64 [%r131+40],%r259; .loc 1 166 16 st.u64 [%r131+8],%r270; .loc 1 169 21 mov.u64 %r2627262; call (%value_in),_gfortrani_xmallocarray263,[%value_in]; } .loc 1 169 19 st.u64 [%r131],%r263; bra $L1; $L14: .loc 1 161 10 @ %r271 bra $L33; .loc 1 151 14 mov.u64 %r134,0; .loc 1 152 15 mov.u64 %r89,%r134; bra $L16; $L1: .loc 1 254 1 ret; } pack_r10.o/ pack_c4.o/_gfortrani_pack_c4 .visible .func _gfortrani_pack_cfortran/generated/pack_c4.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_count_0_gfortrani_count_0_gfortrani_xmallocarr_gfortrani_xmallocarray110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,910,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,65,67,75,32,105,110,116,114,105,110,115,105,99,59,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortrani_pack_c4 .visible .func _gfortrani_pack_c480predpred %r180; .reg .predpredpredu64 %r199; .reg .u64pred %r213; .reg .u32 %r215; .reg .u32 %r216; .reg .u32 %r218; .reg .u32 %r219; .reg .u16u16 %r227; .reg .u32 %r228; .reg .pred %r229; .reg .predu64 %r241; .reg .u64 %r243; .reg .u64 %r244; .reg .predpred %r253; .reg 
.predpredmov.u64 %r133,%ar0; mov.u64 %r134,%ar1; mov.u64 %r135,%ar2; mov.u64 %r136,%ar3; .loc 1 97 9 ld.s8 %r22,[%r134+28]; .loc 1 97 7 cvt.u32.u32 %r137,%r22; cvt.s64.s8 %r93,%r137; .loc 1 99 8 ld.u64 %r105,[%r135]; .loc 1 104 15 ld.u64 %r23,[%r135+16]; .loc 1 104 13 cvt.u32.u64 %r94,%r23; .loc 1 106 53 add.u32 %r138,%r94,-4; and.b32 %r139,%r138,-5; set.u32.eq.u32 %r141,%r139,0; neg.s32 %r142,%r141; .loc 1 106 22 add.u32 %r143,%r94,-1; set.u32.le.u32 %r145,%r143,1; neg.s32 %r146,%r145; .loc 1 106 6 cvt.u16.u32 %r148,%r142; cvt.u16.u32 %r149,%r146; or.b16 %r147,%r148,%r149; cvt.u32.u16 %r150,%r147; cvt.u16.u8 %r151,%r150; setp.eq.u16 %r152,%r151,0; @ %r152 bra $L2; $L5: .loc 1 120 3 setp.gt.s64 %r153,%r93,0; @ %r153 bra $L3; mov.u32 %r89,0; bra $L4; $L2: .loc 1 108 7 setp.eq.u32 %r154,%r94,16; @ %r154 bra $L5; .loc 1 117 5 cvta.const.u64 %r11_gfortran_runtime_error122 16 cvt.u32.u32 %r158,%r22; cvt.s64.s8 %r157,%r158; shl.b64 %r116,%r157,3; add.u64 %r159,%frame,120; mov.u32 %r163116165,[%value_in]; } mov.u64 %r125,40; mov.u64 %r126,0; cvt.u32.u64 %r89,%r126; add.u64 %r270,%frame,360; add.u64 %r269,%frame,240; $L7: add.u64 %r119,%r134,%r125; .loc 1 123 19 ld.u64 %r168,[%r119+16]; add.u64 %r167,%r168,1; ld.u64 %r169,[%r119+8]; sub.u64 %r32,%r167,%r169; .loc 1 123 17 add.u64 %r170,%frame,%r126; st.u64 [%r170],%r32; .loc 1 124 10 setp.gt.s64 %r171,%r32,0; .loc 1 125 19 selp.u32 %r89,%r89,1,%r171; .loc 1 126 18 add.u64 %r173,%r270,%r126; ld.u64 %r174,[%r119]; st.u64 [%r173],%r174; .loc 1 127 18 add.u64 %r176,%r269,%r126; .loc 1 127 20 add.u64 %r177,%r135,%r125; ld.u64 %r179,[%r177]; mul.lo.u64 %r178,%r179,%r23; .loc 1 127 18 st.u64 [%r176],%r178; .loc 1 120 3 add.u64 %r126,%r126,8; add.u64 %r125,%r125,24; setp.ne.u64 %r180,%r116,%r126; @ %r180 bra $L7; $L4: .loc 1 129 14 ld.u64 %r38,[%frame+360]; .loc 1 129 6 setp.ne.u64 %r181,%r38,0; .loc 1 130 16 selp.u64 %r38,%r38,1,%r181; .loc 1 131 6 ld.u64 %r182,[%frame+240]; setp.ne.u64 %r183,%r182,0; @ %r183 bra $L9; .loc 1 132 16 cvt.s64.s32 %r184,%r94; st.u64 [%frame+240],%r184; $L9: .loc 1 134 6 setp.ne.u32 %r185,%r89,0; @ %r185 bra $L34; .loc 1 137 10 ld.u64 %r90,[%r134]; bra $L10; $L34: .loc 1 135 10 mov.u64 %r90,0; $L10: .loc 1 139 10 ld.u64 %r41,[%r133]; mov.u64 %r272,%r41; .loc 1 139 6 setp.eq.u64 %r273,%r41,0; @ %r273 bra $L11; .loc 1 139 33 cvta.global.u64 %r187,_gfortrani_compile_options; .loc 1 139 30 ld.u32 %r188,[%r187+36]; setp.eq.u32 %r189,%r188,0; @ ! %r189 bra $L11; setp.eq.u64 %r271,%r136,0; bra $L12; $L11: .loc 1 144 10 setp.eq.u64 %r271,%r136,0; @ %r271 bra $L13; .loc 1 148 12 ld.u64 %r192,[%r136+56]; add.u64 %r191,%r192,1; .loc 1 148 10 ld.u64 %r193,[%r136+48]; sub.u64 %r91,%r191,%r193; .loc 1 149 7 setp.lt.s64 %r194,%r91,0; @ ! 
%r194 bra $L15; bra $L14; $L13: .loc 1 158 12135; call (%value_in),_gfortrani_count_0,(%out_arg1); ld.param.u64 %r196,[%value_in]; } mov.u64 %r91,%r196; .loc 1 161 14 ld.u64 %r272,[%r133]; $L15: .loc 1 161 10 setp.ne.u64 %r197,%r272,0; @ %r197 bra $L16; .loc 1 164 4 st.u64 [%r133+48],%r272; add.u64 %r199,%r91,-1; st.u64 [%r133+56],%r199; mov.u64 %r200,1; st.u64 [%r133+40],%r200; .loc 1 166 16 st.u64 [%r133+8],%r272; .loc 1 169 21 mov.u64 %r20391203; call (%value_in),_gfortrani_xmallocarrayt.u64 [%r133],%r41; .loc 1 171 7 setp.ne.u64 %r206,%r91,0; @ %r206 bra $L12; bra $L1; $L16: .loc 1 179 17 ld.u64 %r208,[%r133+56]; add.u64 %r207,%r208,1; .loc 1 179 15 ld.u64 %r209,[%r133+48]; sub.u64 %r97,%r207,%r209; .loc 1 180 7 setp.ne.u64 %r210,%r97,%r91; @ %r210 bra $L17; .loc 1 192 8 mov.u64 %r41,%r272; setp.eq.u64 %r271,%r136,0; bra $L12; $L17: .loc 1 181 6 st.u64 [%stack+8],%r97; st.u64 [%stack],%r91;_gfortran_runtime_error12: .loc 1 187 12 ld.u64 %r84,[%r133+40]; .loc 1 188 6 setp.ne.u64 %r213,%r84,0; .loc 1 189 14 selp.u64 %r84,%r84,1,%r213; .loc 1 191 12 ld.u64 %r99,[%frame+240]; .loc 1 194 15 set.u32.ne.u64 %r215,%r105,0; neg.s32 %r216,%r215; .loc 1 194 10 set.u32.ne.u64 %r218,%r90,0; neg.s32 %r219,%r218; .loc 1 194 15 cvt.u16.u32 %r221,%r216; cvt.u16.u32 %r222,%r219; and.b16 %r220,%r221,%r222; .loc 1 194 9 cvt.u32.u16 %r223,%r220; cvt.u16.u8 %r224,%r223; setp.eq.u16 %r225,%r224,0; @ %r225 bra $L35; .loc 1 201 16 shl.b64 %r55,%r84,3; .loc 1 204 12 shl.b64 %r57,%r38,3; ld.u64 %r96,[%frame+120]; .loc 1 208 32 ld.u64 %r117,[%frame]; .loc 1 215 30 mul.lo.u64 %r28,%r117,%r38; .loc 1 216 30 mul.lo.u64 %r226,%r117,%r99; .loc 1 216 16 neg.s64 %r114,%r226; .loc 1 192 8 mov.u64 %r92,%r41; setp.le.s64 %r268,%r93,1; add.u64 %r276,%frame,360; add.u64 %r277,%frame,240; .loc 1 212 20 mov.u64 %r278,0; $L28: .loc 1 197 10 ld.s8 %r228,[%r105]; cvt.u16.u32 %r227,%r228; setp.eq.u16 %r229,%r227,0; @ %r229 bra $L20; .loc 1 200 12 ld.f32 %r52,[%r90]; ld.f32 %r53,[%r90+4]; .loc 1 200 10 st.f32 [%r92],%r52; st.f32 [%r92+4],%r53; .loc 1 201 16 add.u64 %r92,%r92,%r55; .loc 1 204 12 add.u64 %r90,%r90,%r57; .loc 1 205 12 add.u64 %r105,%r105,%r99; .loc 1 206 15 add.u64 %r96,%r96,1; .loc 1 208 13 setp.eq.u64 %r230,%r96,%r117; @ %r230 bra $L21; bra $L28; $L20: .loc 1 204 12 add.u64 %r90,%r90,%r57; .loc 1 205 12 add.u64 %r105,%r105,%r99; .loc 1 206 15 add.u64 %r96,%r96,1; .loc 1 208 13 setp.ne.u64 %r231,%r96,%r117; @ %r231 bra $L28; $L21: .loc 1 216 16 add.u64 %r106,%r105,%r114; .loc 1 218 14 @ ! 
%r268 bra $L52; bra $L19; $L27: .loc 1 212 20 st.u64 [%r85],%r278; .loc 1 215 30 mul.lo.u64 %r59,%r65,%r64; .loc 1 216 30 mul.lo.u64 %r234,%r67,%r64; .loc 1 216 16 sub.u64 %r106,%r105,%r234; .loc 1 217 12 add.u64 %r107,%r107,1; .loc 1 218 14 add.u64 %r85,%r85,8; add.u64 %r131,%r131,8; setp.ne.u64 %r235,%r93,%r107; @ %r235 bra $L26; bra $L19; $L52: add.u64 %r85,%frame,128; .loc 1 215 30 mov.u64 %r59,%r28; .loc 1 218 14 mov.u64 %r131,8; .loc 1 217 12 mov.u64 %r107,1; $L26: .loc 1 226 23 ld.u64 %r237,[%r85]; add.u64 %r64,%r237,1; st.u64 [%r85],%r64; .loc 1 227 30 add.u64 %r239,%r276,%r131; ld.u64 %r65,[%r239]; .loc 1 227 20 sub.u64 %r240,%r65,%r59; shl.b64 %r241,%r240,3; add.u64 %r90,%r90,%r241; .loc 1 228 30 add.u64 %r243,%r277,%r131; ld.u64 %r67,[%r243]; .loc 1 228 20 add.u64 %r105,%r106,%r67; .loc 1 208 32 add.u64 %r244,%frame,%r131; ld.u64 %r69,[%r244]; .loc 1 208 13 setp.eq.u64 %r245,%r64,%r69; @ %r245 bra $L27; .loc 1 212 20 mov.u64 %r96,0; bra $L28; $L35: .loc 1 192 8 mov.u64 %r92,%r41; $L19: .loc 1 234 6 @ %r271 bra $L1; .loc 1 236 11 ld.u64 %r248,[%r136+56]; add.u64 %r247,%r248,1; .loc 1 236 9 ld.u64 %r249,[%r136+48]; sub.u64 %r100,%r247,%r249; .loc 1 237 22 sub.u64 %r250,%r92,%r41; shr.s64 %r252,%r250,3; .loc 1 237 13 div.s64 %r101,%r252,%r84; .loc 1 238 10 setp.le.s64 %r253,%r100,%r101; @ %r253 bra $L1; .loc 1 240 20 ld.u64 %r86,[%r136+40]; .loc 1 241 14 setp.eq.u64 %r254,%r86,0; @ %r254 bra $L36; .loc 1 244 47 mul.lo.u64 %r132,%r101,%r86; bra $L30; $L36: mov.u64 %r132,%r101; .loc 1 242 22 mov.u64 %r86,1; $L30: .loc 1 244 36 shl.b64 %r255,%r132,3; .loc 1 244 16 ld.u64 %r256,[%r136]; add.u64 %r104,%r256,%r255; .loc 1 245 13 sub.u64 %r102,%r100,%r101; .loc 1 246 19 add.u64 %r103,%r102,-1; .loc 1 246 17 setp.eq.u64 %r257,%r102,0; @ %r257 bra $L1; .loc 1 249 20 shl.b64 %r81,%r84,3; .loc 1 250 20 shl.b64 %r83,%r86,3; $L31: .loc 1 248 16 ld.f32 %r78,[%r104]; ld.f32 %r79,[%r104+4]; .loc 1 248 14 st.f32 [%r92],%r78; st.f32 [%r92+4],%r79; .loc 1 249 20 add.u64 %r92,%r92,%r81; .loc 1 250 20 add.u64 %r104,%r104,%r83; .loc 1 246 19 add.u64 %r103,%r103,-1; .loc 1 246 17 setp.ne.u64 %r258,%r103,-1; @ %r258 bra $L31; bra $L1; $L33: .loc 1 164 4 st.u64 [%r133+48],%r272; mov.u64 %r260,-1; st.u64 [%r133+56],%r260; mov.u64 %r261,1; st.u64 [%r133+40],%r261; .loc 1 166 16 st.u64 [%r133+8],%r272; .loc 1 169 21 mov.u64 %r26427264; call (%value_in),_gfortrani_xmallocarray265,[%value_in]; } .loc 1 169 19 st.u64 [%r133],%r265; bra $L1; $L14: .loc 1 161 10 @ %r273 bra $L33; .loc 1 151 14 mov.u64 %r136,0; .loc 1 152 15 mov.u64 %r91,%r136; bra $L16; $L1: .loc 1 254 1 ret; } pack_c8.o/_gfortrani_pack_c8 .visible .func _gfortrani_pack_c8fortran/generated/pack_c8.c" // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_count_0_gfortrani_count_0_gfortrani_xmallocarr_gfortrani_xmallocarray110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,910,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,65,67,75,32,105,110,116,114,105,110,115,105,99,59,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37_gfortrani_pack_c8 .visible .func _gfortrani_pack_c8480predpred %r180; .reg .predpredpredu64 %r199; .reg .u64pred %r213; .reg .u32 %r215; .reg .u32 %r216; .reg .u32 %r218; .reg .u32 %r219; .reg .u16u16 %r227; .reg .u32 %r228; .reg .pred %r229; .reg .predu64 %r241; .reg .u64 %r243; .reg .u64 
%r244; .reg .predpred %r253; .reg .predpredmov.u64 %r133,%ar0; mov.u64 %r134,%ar1; mov.u64 %r135,%ar2; mov.u64 %r136,%ar3; .loc 1 97 9 ld.s8 %r22,[%r134+28]; .loc 1 97 7 cvt.u32.u32 %r137,%r22; cvt.s64.s8 %r93,%r137; .loc 1 99 8 ld.u64 %r105,[%r135]; .loc 1 104 15 ld.u64 %r23,[%r135+16]; .loc 1 104 13 cvt.u32.u64 %r94,%r23; .loc 1 106 53 add.u32 %r138,%r94,-4; and.b32 %r139,%r138,-5; set.u32.eq.u32 %r141,%r139,0; neg.s32 %r142,%r141; .loc 1 106 22 add.u32 %r143,%r94,-1; set.u32.le.u32 %r145,%r143,1; neg.s32 %r146,%r145; .loc 1 106 6 cvt.u16.u32 %r148,%r142; cvt.u16.u32 %r149,%r146; or.b16 %r147,%r148,%r149; cvt.u32.u16 %r150,%r147; cvt.u16.u8 %r151,%r150; setp.eq.u16 %r152,%r151,0; @ %r152 bra $L2; $L5: .loc 1 120 3 setp.gt.s64 %r153,%r93,0; @ %r153 bra $L3; mov.u32 %r89,0; bra $L4; $L2: .loc 1 108 7 setp.eq.u32 %r154,%r94,16; @ %r154 bra $L5; .loc 1 117 5 cvta.const.u64 %r11_gfortran_runtime_error122 16 cvt.u32.u32 %r158,%r22; cvt.s64.s8 %r157,%r158; shl.b64 %r116,%r157,3; add.u64 %r159,%frame,120; mov.u32 %r163116165,[%value_in]; } mov.u64 %r125,40; mov.u64 %r126,0; cvt.u32.u64 %r89,%r126; add.u64 %r270,%frame,360; add.u64 %r269,%frame,240; $L7: add.u64 %r119,%r134,%r125; .loc 1 123 19 ld.u64 %r168,[%r119+16]; add.u64 %r167,%r168,1; ld.u64 %r169,[%r119+8]; sub.u64 %r32,%r167,%r169; .loc 1 123 17 add.u64 %r170,%frame,%r126; st.u64 [%r170],%r32; .loc 1 124 10 setp.gt.s64 %r171,%r32,0; .loc 1 125 19 selp.u32 %r89,%r89,1,%r171; .loc 1 126 18 add.u64 %r173,%r270,%r126; ld.u64 %r174,[%r119]; st.u64 [%r173],%r174; .loc 1 127 18 add.u64 %r176,%r269,%r126; .loc 1 127 20 add.u64 %r177,%r135,%r125; ld.u64 %r179,[%r177]; mul.lo.u64 %r178,%r179,%r23; .loc 1 127 18 st.u64 [%r176],%r178; .loc 1 120 3 add.u64 %r126,%r126,8; add.u64 %r125,%r125,24; setp.ne.u64 %r180,%r116,%r126; @ %r180 bra $L7; $L4: .loc 1 129 14 ld.u64 %r38,[%frame+360]; .loc 1 129 6 setp.ne.u64 %r181,%r38,0; .loc 1 130 16 selp.u64 %r38,%r38,1,%r181; .loc 1 131 6 ld.u64 %r182,[%frame+240]; setp.ne.u64 %r183,%r182,0; @ %r183 bra $L9; .loc 1 132 16 cvt.s64.s32 %r184,%r94; st.u64 [%frame+240],%r184; $L9: .loc 1 134 6 setp.ne.u32 %r185,%r89,0; @ %r185 bra $L34; .loc 1 137 10 ld.u64 %r90,[%r134]; bra $L10; $L34: .loc 1 135 10 mov.u64 %r90,0; $L10: .loc 1 139 10 ld.u64 %r41,[%r133]; mov.u64 %r272,%r41; .loc 1 139 6 setp.eq.u64 %r273,%r41,0; @ %r273 bra $L11; .loc 1 139 33 cvta.global.u64 %r187,_gfortrani_compile_options; .loc 1 139 30 ld.u32 %r188,[%r187+36]; setp.eq.u32 %r189,%r188,0; @ ! %r189 bra $L11; setp.eq.u64 %r271,%r136,0; bra $L12; $L11: .loc 1 144 10 setp.eq.u64 %r271,%r136,0; @ %r271 bra $L13; .loc 1 148 12 ld.u64 %r192,[%r136+56]; add.u64 %r191,%r192,1; .loc 1 148 10 ld.u64 %r193,[%r136+48]; sub.u64 %r91,%r191,%r193; .loc 1 149 7 setp.lt.s64 %r194,%r91,0; @ ! 
%r194 bra $L15; bra $L14; $L13: .loc 1 158 12135; call (%value_in),_gfortrani_count_0,(%out_arg1); ld.param.u64 %r196,[%value_in]; } mov.u64 %r91,%r196; .loc 1 161 14 ld.u64 %r272,[%r133]; $L15: .loc 1 161 10 setp.ne.u64 %r197,%r272,0; @ %r197 bra $L16; .loc 1 164 4 st.u64 [%r133+48],%r272; add.u64 %r199,%r91,-1; st.u64 [%r133+56],%r199; mov.u64 %r200,1; st.u64 [%r133+40],%r200; .loc 1 166 16 st.u64 [%r133+8],%r272; .loc 1 169 21 mov.u64 %r203,1691203; call (%value_in),_gfortrani_xmallocarrayt.u64 [%r133],%r41; .loc 1 171 7 setp.ne.u64 %r206,%r91,0; @ %r206 bra $L12; bra $L1; $L16: .loc 1 179 17 ld.u64 %r208,[%r133+56]; add.u64 %r207,%r208,1; .loc 1 179 15 ld.u64 %r209,[%r133+48]; sub.u64 %r97,%r207,%r209; .loc 1 180 7 setp.ne.u64 %r210,%r97,%r91; @ %r210 bra $L17; .loc 1 192 8 mov.u64 %r41,%r272; setp.eq.u64 %r271,%r136,0; bra $L12; $L17: .loc 1 181 6 st.u64 [%stack+8],%r97; st.u64 [%stack],%r91;_gfortran_runtime_error12: .loc 1 187 12 ld.u64 %r84,[%r133+40]; .loc 1 188 6 setp.ne.u64 %r213,%r84,0; .loc 1 189 14 selp.u64 %r84,%r84,1,%r213; .loc 1 191 12 ld.u64 %r99,[%frame+240]; .loc 1 194 15 set.u32.ne.u64 %r215,%r105,0; neg.s32 %r216,%r215; .loc 1 194 10 set.u32.ne.u64 %r218,%r90,0; neg.s32 %r219,%r218; .loc 1 194 15 cvt.u16.u32 %r221,%r216; cvt.u16.u32 %r222,%r219; and.b16 %r220,%r221,%r222; .loc 1 194 9 cvt.u32.u16 %r223,%r220; cvt.u16.u8 %r224,%r223; setp.eq.u16 %r225,%r224,0; @ %r225 bra $L35; .loc 1 201 16 shl.b64 %r55,%r84,4; .loc 1 204 12 shl.b64 %r57,%r38,4; ld.u64 %r96,[%frame+120]; .loc 1 208 32 ld.u64 %r117,[%frame]; .loc 1 215 30 mul.lo.u64 %r28,%r117,%r38; .loc 1 216 30 mul.lo.u64 %r226,%r117,%r99; .loc 1 216 16 neg.s64 %r114,%r226; .loc 1 192 8 mov.u64 %r92,%r41; setp.le.s64 %r268,%r93,1; add.u64 %r276,%frame,360; add.u64 %r277,%frame,240; .loc 1 212 20 mov.u64 %r278,0; $L28: .loc 1 197 10 ld.s8 %r228,[%r105]; cvt.u16.u32 %r227,%r228; setp.eq.u16 %r229,%r227,0; @ %r229 bra $L20; .loc 1 200 12 ld.f64 %r52,[%r90]; ld.f64 %r53,[%r90+8]; .loc 1 200 10 st.f64 [%r92],%r52; st.f64 [%r92+8],%r53; .loc 1 201 16 add.u64 %r92,%r92,%r55; .loc 1 204 12 add.u64 %r90,%r90,%r57; .loc 1 205 12 add.u64 %r105,%r105,%r99; .loc 1 206 15 add.u64 %r96,%r96,1; .loc 1 208 13 setp.eq.u64 %r230,%r96,%r117; @ %r230 bra $L21; bra $L28; $L20: .loc 1 204 12 add.u64 %r90,%r90,%r57; .loc 1 205 12 add.u64 %r105,%r105,%r99; .loc 1 206 15 add.u64 %r96,%r96,1; .loc 1 208 13 setp.ne.u64 %r231,%r96,%r117; @ %r231 bra $L28; $L21: .loc 1 216 16 add.u64 %r106,%r105,%r114; .loc 1 218 14 @ ! 
%r268 bra $L52; bra $L19; $L27: .loc 1 212 20 st.u64 [%r85],%r278; .loc 1 215 30 mul.lo.u64 %r59,%r65,%r64; .loc 1 216 30 mul.lo.u64 %r234,%r67,%r64; .loc 1 216 16 sub.u64 %r106,%r105,%r234; .loc 1 217 12 add.u64 %r107,%r107,1; .loc 1 218 14 add.u64 %r85,%r85,8; add.u64 %r131,%r131,8; setp.ne.u64 %r235,%r93,%r107; @ %r235 bra $L26; bra $L19; $L52: add.u64 %r85,%frame,128; .loc 1 215 30 mov.u64 %r59,%r28; .loc 1 218 14 mov.u64 %r131,8; .loc 1 217 12 mov.u64 %r107,1; $L26: .loc 1 226 23 ld.u64 %r237,[%r85]; add.u64 %r64,%r237,1; st.u64 [%r85],%r64; .loc 1 227 30 add.u64 %r239,%r276,%r131; ld.u64 %r65,[%r239]; .loc 1 227 20 sub.u64 %r240,%r65,%r59; shl.b64 %r241,%r240,4; add.u64 %r90,%r90,%r241; .loc 1 228 30 add.u64 %r243,%r277,%r131; ld.u64 %r67,[%r243]; .loc 1 228 20 add.u64 %r105,%r106,%r67; .loc 1 208 32 add.u64 %r244,%frame,%r131; ld.u64 %r69,[%r244]; .loc 1 208 13 setp.eq.u64 %r245,%r64,%r69; @ %r245 bra $L27; .loc 1 212 20 mov.u64 %r96,0; bra $L28; $L35: .loc 1 192 8 mov.u64 %r92,%r41; $L19: .loc 1 234 6 @ %r271 bra $L1; .loc 1 236 11 ld.u64 %r248,[%r136+56]; add.u64 %r247,%r248,1; .loc 1 236 9 ld.u64 %r249,[%r136+48]; sub.u64 %r100,%r247,%r249; .loc 1 237 22 sub.u64 %r250,%r92,%r41; shr.s64 %r252,%r250,4; .loc 1 237 13 div.s64 %r101,%r252,%r84; .loc 1 238 10 setp.le.s64 %r253,%r100,%r101; @ %r253 bra $L1; .loc 1 240 20 ld.u64 %r86,[%r136+40]; .loc 1 241 14 setp.eq.u64 %r254,%r86,0; @ %r254 bra $L36; .loc 1 244 47 mul.lo.u64 %r132,%r101,%r86; bra $L30; $L36: mov.u64 %r132,%r101; .loc 1 242 22 mov.u64 %r86,1; $L30: .loc 1 244 36 shl.b64 %r255,%r132,4; .loc 1 244 16 ld.u64 %r256,[%r136]; add.u64 %r104,%r256,%r255; .loc 1 245 13 sub.u64 %r102,%r100,%r101; .loc 1 246 19 add.u64 %r103,%r102,-1; .loc 1 246 17 setp.eq.u64 %r257,%r102,0; @ %r257 bra $L1; .loc 1 249 20 shl.b64 %r81,%r84,4; .loc 1 250 20 shl.b64 %r83,%r86,4; $L31: .loc 1 248 16 ld.f64 %r78,[%r104]; ld.f64 %r79,[%r104+8]; .loc 1 248 14 st.f64 [%r92],%r78; st.f64 [%r92+8],%r79; .loc 1 249 20 add.u64 %r92,%r92,%r81; .loc 1 250 20 add.u64 %r104,%r104,%r83; .loc 1 246 19 add.u64 %r103,%r103,-1; .loc 1 246 17 setp.ne.u64 %r258,%r103,-1; @ %r258 bra $L31; bra $L1; $L33: .loc 1 164 4 st.u64 [%r133+48],%r272; mov.u64 %r260,-1; st.u64 [%r133+56],%r260; mov.u64 %r261,1; st.u64 [%r133+40],%r261; .loc 1 166 16 st.u64 [%r133+8],%r272; .loc 1 169 21 mov.u64 %r264,16call (%value_in),_gfortrani_xmallocarray265,[%value_in]; } .loc 1 169 19 st.u64 [%r133],%r265; bra $L1; $L14: .loc 1 161 10 @ %r273 bra $L33; .loc 1 151 14 mov.u64 %r136,0; .loc 1 152 15 mov.u64 %r91,%r136; bra $L16; $L1: .loc 1 254 1 ret; } pack_c10.o/ unpack_i1.o/_gfortrani_unpack0_i1 .visible .func _gfortrani_unpack0_fortran/generated/unpack_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_unpack1_i1 .visible .func _gfortrani_unpack1_i1gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortrani_unpack0_i1 .visible .func _gfortrani_unpack0_reg .u64 %stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[4809predpredpred %r202; .reg .u64 %r205; .reg .u32 %r209; .reg .u64 %r211; .reg .u64 %r213; .reg .u64u16 %r224; .reg .u16u64 %r235; .reg .pred %r236; .reg .u64 %r237; .reg .predpredpredpred %r253; .reg .u64 %r255; .reg .u64 %r257; .reg .u64 %r258; .reg .u64 %r260; .reg .u64 %r261; .reg .predmov.u64 %r130,%ar0; mov.u64 %r131,%ar1; mov.u64 %r132,%ar2; mov.u64 %r133,%ar3; .loc 1 46 23 ld.s8 %r84,[%r133]; .loc 1 62 8 ld.u64 %r95,[%r132]; .loc 1 67 15 ld.u64 
%r22,[%r132+16]; .loc 1 69 22 cvt.u32.u64 %r23,%r22; .loc 1 69 53 add.u32 %r134,%r23,-4; and.b32 %r135,%r134,-5; set.u32.eq.u32 %r137,%r135,0; neg.s32 %r138,%r137; .loc 1 69 22 add.u32 %r139,%r23,-1; set.u32.le.u32 %r141,%r139,1; neg.s32 %r142,%r141; .loc 1 69 6 cvt.u16.u32 %r144,%r138; cvt.u16.u32 %r145,%r142; or.b16 %r143,%r144,%r145;2; .loc 1 71 7 setp.eq.u32 %r150,%r23,16; @ %r150 bra $L2; .loc 1 80 5 cvta.const.u64 %r11_gfortran_runtime_errorr94,[%r130]; .loc 1 82 6 setp.ne.u64 %r153,%r94,0; @ %r153 bra $L3; .loc 1 86 11 ld.s8 %r80,[%r132+28]; .loc 1 88 7 setp.le.s64 %r155,%r80,0; @ %r155 bra $L4; .loc 1 90 13 shl.b64 %r157,%r80,3; add.u64 %r158,%frame,120; cvt.u32.u64 %r162,%r94157164,[%value_in]; } add.u64 %r117,%r130,40; add.u64 %r115,%r132,40; mov.u64 %r113,%frame; add.u64 %r112,%frame,360; add.u64 %r111,%frame,240; add.u64 %r167,%r80,%r80; add.u64 %r168,%r167,%r80; shl.b64 %r169,%r168,3; add.u64 %r98,%r169,%r117; .loc 1 60 9 cvt.u32.u64 %r38,%r94; .loc 1 87 10 mov.u64 %r88,1; .loc 1 91 4 mov.u64 %r171,%r94; $L5: st.u64 [%r117+8],%r171; ld.u64 %r173,[%r115+16]; add.u64 %r172,%r173,1; ld.u64 %r174,[%r115+8]; sub.u64 %r33,%r172,%r174; add.u64 %r175,%r33,-1; st.u64 [%r117+16],%r175; st.u64 [%r117],%r88; .loc 1 93 14 st.u64 [%r113],%r33; .loc 1 94 18 set.u32.le.s64 %r177,%r33,0; neg.s32 %r178,%r177; cvt.u16.u32 %r181,%r38; and.b16 %r180,%r181,1; cvt.u16.u32 %r184,%r178; or.b16 %r183,%r184,%r180; cvt.u32.u16 %r186,%r183; cvt.u32.u8 %r38,%r186; .loc 1 95 15 st.u64 [%r112],%r88; .loc 1 96 17 ld.u64 %r188,[%r115]; mul.lo.u64 %r187,%r188,%r22; .loc 1 96 15 st.u64 [%r111],%r187; .loc 1 97 7 mul.lo.u64 %r88,%r88,%r33; .loc 1 88 7 add.u64 %r117,%r117,24; add.u64 %r115,%r115,24; add.u64 %r113,%r113,8;8; setp.ne.u64 %r189,%r98,%r117; @ %r189 bra $L5; bra $L31; $L4: .loc 1 99 19 st.u64 [%r130+8],%r94; .loc 1 100 24 mov.u64 %r19219192; call (%value_in),_gfortrani_xmallocarray193,[%value_in]; } mov.u64 %r94,%r193; .loc 1 100 22 st.u64 [%r130],%r94; .loc 1 128 12 ld.u64 %r129,[%frame+360]; bra $L7; $L31: .loc 1 99 19 st.u64 [%r130+8],%r171; .loc 1 100 24 mov.u64 %r197r197; call (%value_in),_gfortrani_xmallocarray198,[%value_in]; } .loc 1 100 22 st.u64 [%r130],%r198; bra $L8; $L3: .loc 1 104 11 ld.s8 %r80,[%r130+28]; .loc 1 106 18 mov.u64 %r201,1; st.u64 [%frame+360],%r201; .loc 1 107 7 setp.le.s64 %r202,%r80,0; @ %r202 bra $L20; .loc 1 109 13 shl.b64 %r127,%r80,3; add.u64 %r205,%frame,120; mov.u32 %r209111,[%value_in]; } mov.u64 %r92,40; mov.u64 %r97,0; .loc 1 60 9 cvt.u32.u64 %r38,%r97; add.u64 %r263,%frame,360; add.u64 %r265,%frame,240; $L9: add.u64 %r82,%r130,%r92; .loc 1 110 16 ld.u64 %r214,[%r82+16]; add.u64 %r213,%r214,1; ld.u64 %r215,[%r82+8]; sub.u64 %r49,%r213,%r215; .loc 1 110 14 add.u64 %r216,%frame,%r97; st.u64 [%r216],%r49; .loc 1 111 18 set.u32.le.s64 %r218,%r49,0; neg.s32 %r219,%r218; cvt.u16.u32 %r222,%r38; and.b16 %r221,%r222,1; cvt.u16.u32 %r225,%r219; or.b16 %r224,%r225,%r221; cvt.u32.u16 %r227,%r224; cvt.u32.u8 %r38,%r227; .loc 1 112 15 add.u64 %r229,%r263,%r97; ld.u64 %r230,[%r82]; st.u64 [%r229],%r230; .loc 1 113 15 add.u64 %r232,%r265,%r97; .loc 1 113 17 add.u64 %r233,%r132,%r92; ld.u64 %r235,[%r233]; mul.lo.u64 %r234,%r235,%r22; .loc 1 113 15 st.u64 [%r232],%r234; .loc 1 107 7 add.u64 %r97,%r97,8; add.u64 %r92,%r92,24; setp.ne.u64 %r236,%r97,%r127; @ %r236 bra $L9; .loc 1 115 10 ld.u64 %r237,[%frame+360].loc 1 116 13 mov.u64 %r239,1; st.u64 [%frame+360],%r239; $L8: .loc 1 119 6 setp.ne.u32 %r240,%r38,0; @ %r240 bra $L1; .loc 1 128 12 ld.u64 %r129,[%frame+360]; .loc 1 130 8 
ld.u64 %r94,[%r130]; bra $L7; $L20: .loc 1 107 7 mov.u64 %r129,%r201; $L7: .loc 1 122 14 ld.u64 %r57,[%frame+240]; .loc 1 122 6 setp.ne.u64 %r241,%r57,0; .loc 1 123 16 selp.u64 %r57,%r57,1,%r241; .loc 1 125 12 ld.u64 %r75,[%r131+40]; .loc 1 126 6 setp.ne.u64 %r242,%r75,0; .loc 1 127 14 selp.u64 %r75,%r75,1,%r242; .loc 1 131 8 ld.u64 %r76,[%r131]; .loc 1 133 9 setp.eq.u64 %r243,%r94,0; @ %r243 bra $L1; ld.u64 %r27,[%frame+120]; .loc 1 151 32 ld.u64 %r96,[%frame]; .loc 1 158 30 mul.lo.u64 %r28,%r96,%r129; .loc 1 159 30 mul.lo.u64 %r244,%r96,%r57; .loc 1 159 16 neg.s64 %r99,%r244; shl.b64 %r118,%r80,3; setp.le.s64 %r264,%r80,1; add.u64 %r270,%frame,360; add.u64 %r271,%frame,240; .loc 1 155 20 mov.u64 %r272,0; $L19: .loc 1 135 10 ld.s8 %r246,[%r95]; cvt.u16.u32 %r245,%r246; setp.eq.u16 %r247,%r245,0; @ %r247 bra $L21; .loc 1 138 12 ld.s8 %r59,[%r76]; .loc 1 139 9 add.u64 %r76,%r76,%r75; bra $L13; $L21: .loc 1 144 10 mov.u32 %r59,%r84; $L13: cvt.u32.u32 %r248,%r59; st.u8 [%r94],%r248; .loc 1 147 12 add.u64 %r94,%r94,%r129; .loc 1 148 12 add.u64 %r95,%r95,%r57; .loc 1 149 15 add.u64 %r27,%r27,1; .loc 1 151 13 setp.ne.u64 %r249,%r27,%r96; @ %r249 bra $L19; .loc 1 159 16 add.u64 %r93,%r95,%r99; .loc 1 161 14 @ ! %r264 bra $L32; bra $L1; $L18: .loc 1 155 20 st.u64 [%r87],%r272; .loc 1 158 30 mul.lo.u64 %r63,%r70,%r69; .loc 1 159 30 mul.lo.u64 %r252,%r72,%r69; .loc 1 159 16 sub.u64 %r93,%r95,%r252; .loc 1 161 14 add.u64 %r87,%r87,8; add.u64 %r124,%r124,8; setp.ne.u64 %r253,%r118,%r124; @ %r253 bra $L17; bra $L1; $L32: add.u64 %r87,%frame,128; .loc 1 158 30 mov.u64 %r63,%r28; .loc 1 161 14 mov.u64 %r124,8; $L17: .loc 1 169 23 ld.u64 %r255,[%r87]; add.u64 %r69,%r255,1; st.u64 [%r87],%r69; .loc 1 170 30 add.u64 %r257,%r270,%r124; ld.u64 %r70,[%r257]; .loc 1 170 20 sub.u64 %r258,%r70,%r63; add.u64 %r94,%r94,%r258; .loc 1 171 30 add.u64 %r260,%r271,%r124; ld.u64 %r72,[%r260]; .loc 1 171 20 add.u64 %r95,%r93,%r72; .loc 1 151 32 add.u64 %r261,%frame,%r124; ld.u64 %r74,[%r261]; .loc 1 151 13 setp.eq.u64 %r262,%r69,%r74; @ %r262 bra $L18; .loc 1 155 20 mov.u64 %r27,0; bra $L19; $L_gfortrani_unpack1_i1 .visible .func _gfortrani_unpack1_reg .u64 %stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[608pred %r168; .reg .u64 %r171; .reg .u32 %r175; .reg .u64predu64 %r215; .reg .u64 %r218; .reg .pred %r219; .reg .u64 %r222; .reg .u32 %r226; .reg .u64 %r228; .reg .u64 %r230; .reg .u64 %r231; .reg .u64u32 %r236; .reg .u16u32predpredpredpred %r275; .reg .u64 %r277; .reg .u64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64predmov.u64 %r143,%ar0; mov.u64 %r144,%ar1; mov.u64 %r145,%ar2; mov.u64 %r146,%ar3; .loc 1 208 8 ld.u64 %r100,[%r145]; .loc 1 213 15 ld.u64 %r22,[%r145+16]; .loc 1 215 22 cvt.u32.u64 %r23,%r22; .loc 1 215 53 add.u32 %r147,%r23,-4; and.b32 %r148,%r147,-5; set.u32.eq.u32 %r150,%r148,0; neg.s32 %r151,%r150; .loc 1 215 22 add.u32 %r152,%r23,-1; set.u32.le.u32 %r154,%r152,1; neg.s32 %r155,%r154; .loc 1 215 6 cvt.u16.u3234; .loc 1 217 7 setp.eq.u32 %r163,%r23,16; @ %r163 bra $L34; .loc 1 226 5 cvta.const.u64 %r164164_gfortran_runtime_error34: .loc 1 228 10 ld.u64 %r98,[%r143]; .loc 1 228 6 setp.ne.u64 %r166,%r98,0; @ %r166 bra $L35; .loc 1 232 11 ld.s8 %r85,[%r145+28]; .loc 1 234 7 setp.le.s64 %r168,%r85,0; @ %r168 bra $L36; .loc 1 236 13 shl.b64 %r138,%r85,3; add.u64 %r171,%frame,120; cvt.u32.u64 %r175,%r98138177,[%value_in]; } mov.u64 %r128,40; mov.u64 %r129,%r98; .loc 1 206 9 cvt.u32.u64 %r39,%r129; .loc 1 233 10 mov.u64 %r91,1; add.u64 %r289,%frame,480; add.u64 %r287,%frame,360; add.u64 
%r290,%frame,240; .loc 1 237 4 mov.u64 %r179,%r129; $L37: add.u64 %r127,%r143,%r128; st.u64 [%r127+8],%r179; add.u64 %r126,%r145,%r128; ld.u64 %r181,[%r126+16]; add.u64 %r180,%r181,1; ld.u64 %r182,[%r126+8]; sub.u64 %r34,%r180,%r182; add.u64 %r183,%r34,-1; st.u64 [%r127+16],%r183; st.u64 [%r127],%r91; .loc 1 239 14 add.u64 %r184,%frame,%r129; st.u64 [%r184],%r34; .loc 1 240 18 set.u32.le.s64 %r186,%r34,0; neg.s32 %r187,%r186; cvt.u16.u32 %r190,%r39; and.b16 %r189,%r190,1; cvt.u16.u32 %r193,%r187; or.b16 %r192,%r193,%r189; cvt.u32.u16 %r195,%r192; cvt.u32.u8 %r39,%r195; .loc 1 241 15 add.u64 %r197,%r289,%r129; st.u64 [%r197],%r91; .loc 1 242 15 add.u64 %r199,%r287,%r129; .loc 1 242 17 add.u64 %r200,%r146,%r128; .loc 1 242 15 ld.u64 %r201,[%r200]; st.u64 [%r199],%r201; .loc 1 243 15 add.u64 %r203,%r290,%r129; .loc 1 243 17 ld.u64 %r205,[%r126]; mul.lo.u64 %r204,%r205,%r22; .loc 1 243 15 st.u64 [%r203],%r204; .loc 1 244 7 mul.lo.u64 %r91,%r91,%r34; .loc 1 234 7 add.u64 %r129,%r129,8; add.u64 %r128,%r128,24; setp.ne.u64 %r206,%r129,%r138; @ %r206 bra $L37; bra $L64; $L36: .loc 1 246 19 st.u64 [%r143+8],%r98; .loc 1 247 24 mov.u64 %r209209; call (%value_in),_gfortrani_xmallocarray210,[%value_in]; } mov.u64 %r98,%r210; .loc 1 247 22 st.u64 [%r143],%r98; .loc 1 278 12 ld.u64 %r142,[%frame+480]; bra $L39; $L64: .loc 1 246 19 st.u64 [%r143+8],%r179; .loc 1 247 24 mov.u64 %r21491214; call (%value_in),_gfortrani_xmallocarray215,[%value_in]; } .loc 1 247 22 st.u64 [%r143],%r215; bra $L40; $L35: .loc 1 251 11 ld.s8 %r85,[%r143+28]; .loc 1 253 18 mov.u64 %r218,1; st.u64 [%frame+480],%r218; .loc 1 254 7 setp.le.s64 %r219,%r85,0; @ %r219 bra $L54; .loc 1 256 13 shl.b64 %r140,%r85,3; add.u64 %r222,%frame,120; mov.u32 %r226add.u64 %r120,%r143,40; add.u64 %r118,%r146,40; add.u64 %r116,%r145,40; mov.u64 %r113,0; .loc 1 206 9 cvt.u32.u64 %r39,%r113; add.u64 %r289,%frame,480; add.u64 %r287,%frame,360; add.u64 %r290,%frame,240; $L41: .loc 1 257 16 ld.u64 %r231,[%r120+16]; add.u64 %r230,%r231,1; ld.u64 %r232,[%r120+8]; sub.u64 %r51,%r230,%r232; .loc 1 257 14 add.u64 %r233,%frame,%r113; st.u64 [%r233],%r51; .loc 1 258 18 set.u32.le.s64 %r235,%r51,0; neg.s32 %r236,%r235; cvt.u16.u32 %r239,%r39; and.b16 %r238,%r239,1; cvt.u16.u32 %r242,%r236; or.b16 %r241,%r242,%r238; cvt.u32.u16 %r244,%r241; cvt.u32.u8 %r39,%r244; .loc 1 259 15 add.u64 %r246,%r289,%r113; ld.u64 %r247,[%r120]; st.u64 [%r246],%r247; .loc 1 260 15 add.u64 %r249,%r287,%r113; ld.u64 %r250,[%r118]; st.u64 [%r249],%r250; .loc 1 261 15 add.u64 %r252,%r290,%r113; .loc 1 261 17 ld.u64 %r254,[%r116]; mul.lo.u64 %r253,%r254,%r22; .loc 1 261 15 st.u64 [%r252],%r253; .loc 1 254 7 add.u64 %r120,%r120,24; add.u64 %r118,%r118,24; add.u64 %r116,%r116,24; add.u64 %r113,%r113,8; setp.ne.u64 %r255,%r113,%r140; @ %r255 bra $L41; .loc 1 263 10 ld.u64 %r256,[%frame+480]; setp.ne.u64 %r257,%r256,0; @ %r257 bra $L40; .loc 1 264 13 mov.u64 %r258,1; st.u64 [%frame+480],%r258; $L40: .loc 1 267 6 setp.ne.u32 %r259,%r39,0; @ %r259 bra $L33; .loc 1 278 12 ld.u64 %r142,[%frame+480]; .loc 1 281 8 ld.u64 %r98,[%r143]; bra $L39; $L54: .loc 1 254 7 mov.u64 %r142,%r218; $L39: .loc 1 270 14 ld.u64 %r60,[%frame+360]; .loc 1 270 6 setp.ne.u64 %r260,%r60,0; .loc 1 271 16 selp.u64 %r60,%r60,1,%r260; .loc 1 272 14 ld.u64 %r61,[%frame+240]; .loc 1 272 6 setp.ne.u64 %r261,%r61,0; .loc 1 273 16 selp.u64 %r61,%r61,1,%r261; .loc 1 275 12 ld.u64 %r82,[%r144+40]; .loc 1 276 6 setp.ne.u64 %r262,%r82,0; .loc 1 277 14 selp.u64 %r82,%r82,1,%r262; .loc 1 282 8 ld.u64 %r99,[%r146]; .loc 1 283 8 ld.u64 
[Binary package payload, not human-readable: mis-extracted PTX assembly text from the nvptx libgfortran static library. The recoverable content of this span is the tail of the preceding archive member followed by the members unpack_i2.o, unpack_i4.o, unpack_i8.o, unpack_i16.o, unpack_r4.o and unpack_r8.o, each built from the corresponding generated unpack_*.c source and exporting the visible functions _gfortrani_unpack0_<kind> and _gfortrani_unpack1_<kind>, with external references to _gfortran_runtime_error and _gfortrani_xmallocarray.]
cvt.u32.u16 %r189,%r186; cvt.u32.u8 %r37,%r189; .loc 1 95 15 st.u64 [%r115],%r91; .loc 1 96 17 ld.u64 %r191,[%r118]; mul.lo.u64 %r190,%r191,%r22; .loc 1 96 15 st.u64 [%r114],%r190; .loc 1 97 7 mul.lo.u64 %r91,%r91,%r32; .loc 1 88 7 add.u64 %r120,%r120,24; add.u64 %r118,%r118,24; add.u64 %r116,%r116,8; add.u64 %r115,%r115,8; add.u64 %r114,%r114,8; setp.ne.u64 %r192,%r102,%r120; @ %r192 bra $L5; bra $L31; $L4: .loc 1 99 19 st.u64 [%r133+8],%r96; .loc 1 100 24 mov.u64 %r195,8; mov.u64 %r19419195; call (%value_in),_gfortrani_xmallocarray196,[%value_in]; } mov.u64 %r96,%r196; .loc 1 100 22 st.u64 [%r133],%r96; .loc 1 128 12 ld.u64 %r132,[%frame+360]; bra $L7; $L31: .loc 1 99 19 st.u64 [%r133+8],%r174; .loc 1 100 24 mov.u64 %r20091200; call (%value_in),_gfortrani_xmallocarray201,[%value_in]; } .loc 1 100 22 st.u64 [%r133],%r201; bra $L8; $L3: .loc 1 104 11 ld.s8 %r78,[%r133+28]; .loc 1 106 18 mov.u64 %r204,1; st.u64 [%frame+360],%r204; .loc 1 107 7 setp.le.s64 %r205,%r78,0; @ %r205 bra $L20; .loc 1 109 13 shl.b64 %r130,%r78,3; add.u64 %r208,%frame,120; mov.u32 %r21mov.u64 %r99,40; mov.u64 %r100,0; .loc 1 60 9 cvt.u32.u64 %r37,%r100; add.u64 %r266,%frame,360; add.u64 %r268,%frame,240; $L9: add.u64 %r82,%r133,%r99; .loc 1 110 16 ld.u64 %r217,[%r82+16]; add.u64 %r216,%r217,1; ld.u64 %r218,[%r82+8]; sub.u64 %r48,%r216,%r218; .loc 1 110 14 add.u64 %r219,%frame,%r100; st.u64 [%r219],%r48; .loc 1 111 18 set.u32.le.s64 %r221,%r48,0; neg.s32 %r222,%r221; cvt.u16.u32 %r225,%r37; and.b16 %r224,%r225,1; cvt.u16.u32 %r228,%r222; or.b16 %r227,%r228,%r224; cvt.u32.u16 %r230,%r227; cvt.u32.u8 %r37,%r230; .loc 1 112 15 add.u64 %r232,%r266,%r100; ld.u64 %r233,[%r82]; st.u64 [%r232],%r233; .loc 1 113 15 add.u64 %r235,%r268,%r100; .loc 1 113 17 add.u64 %r236,%r135,%r99; ld.u64 %r238,[%r236]; mul.lo.u64 %r237,%r238,%r22; .loc 1 113 15 st.u64 [%r235],%r237; .loc 1 107 7 add.u64 %r100,%r100,8; add.u64 %r99,%r99,24; setp.ne.u64 %r239,%r100,%r130; @ %r239 bra $L9; .loc 1 115 10 ld.u64 %r240,[%frame+360]; setp.ne.u64 %r241,%r240,0; @ %r241 bra $L8; .loc 1 116 13 mov.u64 %r242,1; st.u64 [%frame+360],%r242; $L8: .loc 1 119 6 setp.ne.u32 %r243,%r37,0; @ %r243 bra $L1; .loc 1 128 12 ld.u64 %r132,[%frame+360]; .loc 1 130 8 ld.u64 %r96,[%r133]; bra $L7; $L20: .loc 1 107 7 mov.u64 %r132,%r204; $L7: .loc 1 122 14 ld.u64 %r56,[%frame+240]; .loc 1 122 6 setp.ne.u64 %r244,%r56,0; .loc 1 123 16 selp.u64 %r56,%r56,1,%r244; .loc 1 125 12 ld.u64 %r75,[%r134+40]; .loc 1 126 6 setp.ne.u64 %r245,%r75,0; .loc 1 127 14 selp.u64 %r75,%r75,1,%r245; .loc 1 131 8 ld.u64 %r76,[%r134]; .loc 1 133 9 setp.eq.u64 %r246,%r96,0; @ %r246 bra $L1; .loc 1 139 9 shl.b64 %r60,%r75,3; .loc 1 147 12 shl.b64 %r62,%r132,3; ld.u64 %r94,[%frame+120]; .loc 1 151 32 ld.u64 %r98,[%frame]; .loc 1 158 30 mul.lo.u64 %r27,%r98,%r132; .loc 1 159 30 mul.lo.u64 %r247,%r98,%r56; .loc 1 159 16 neg.s64 %r101,%r247; shl.b64 %r121,%r78,3; setp.le.s64 %r267,%r78,1; add.u64 %r273,%frame,360; add.u64 %r274,%frame,240; .loc 1 155 20 mov.u64 %r275,0; $L19: .loc 1 135 10 ld.s8 %r249,[%r97]; cvt.u16.u32 %r248,%r249; setp.eq.u16 %r250,%r248,0; @ %r250 bra $L21; .loc 1 138 12 ld.f64 %r58,[%r76]; .loc 1 139 9 add.u64 %r76,%r76,%r60; bra $L13; $L21: .loc 1 144 10 mov.f64 %r58,%r86; $L13: st.f64 [%r96],%r58; .loc 1 147 12 add.u64 %r96,%r96,%r62; .loc 1 148 12 add.u64 %r97,%r97,%r56; .loc 1 149 15 add.u64 %r94,%r94,1; .loc 1 151 13 setp.ne.u64 %r251,%r94,%r98; @ %r251 bra $L19; .loc 1 159 16 add.u64 %r95,%r97,%r101; .loc 1 161 14 @ ! 
%r267 bra $L32; bra $L1; $L18: .loc 1 155 20 st.u64 [%r90],%r275; .loc 1 158 30 mul.lo.u64 %r64,%r70,%r69; .loc 1 159 30 mul.lo.u64 %r254,%r72,%r69; .loc 1 159 16 sub.u64 %r95,%r97,%r254; .loc 1 161 14 add.u64 %r90,%r90,8; add.u64 %r127,%r127,8; setp.ne.u64 %r255,%r121,%r127; @ %r255 bra $L17; bra $L1; $L32: add.u64 %r90,%frame,128; .loc 1 158 30 mov.u64 %r64,%r27; .loc 1 161 14 mov.u64 %r127,8; $L17: .loc 1 169 23 ld.u64 %r257,[%r90]; add.u64 %r69,%r257,1; st.u64 [%r90],%r69; .loc 1 170 30 add.u64 %r259,%r273,%r127; ld.u64 %r70,[%r259]; .loc 1 170 20 sub.u64 %r260,%r70,%r64; shl.b64 %r261,%r260,3; add.u64 %r96,%r96,%r261; .loc 1 171 30 add.u64 %r263,%r274,%r127; ld.u64 %r72,[%r263]; .loc 1 171 20 add.u64 %r97,%r95,%r72; .loc 1 151 32 add.u64 %r264,%frame,%r127; ld.u64 %r74,[%r264]; .loc 1 151 13 setp.eq.u64 %r265,%r69,%r74; @ %r265 bra $L18; .loc 1 155 20 mov.u64 %r94,0; bra $L19; $L_gfortrani_unpack1_r8 .visible .func _gfortrani_unpack1_r8reg .u64 %stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[6089932predpredpred %r213; .reg .u64u32u64 %r257; .reg .u64predu64 %r281; .reg .u64 %r282; .reg .pred %r283; .reg .u64u64 %r306; .reg .u64 %r307; .reg .u64 %r308; mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 208 8 ld.u64 %r106,[%r152]; .loc 1 213 15 ld.u64 %r22,[%r152+16]; .loc 1 215 22 cvt.u32.u64 %r23,%r22; .loc 1 215 53 add.u32 %r154,%r23,-4; and.b32 %r155,%r154,-5; set.u32.eq.u32 %r157,%r155,0; neg.s32 %r158,%r157; .loc 1 215 22 add.u32 %r159,%r23,-1; set.u32.le.u32 %r161,%r159,1; neg.s32 %r162,%r161; .loc 1 215 6 cvt.u16.u32 %r164,%r158; cvt.u16.u32 %r165,%r162; or%r168 bra $L34; .loc 1 217 7 setp.eq.u32 %r170,%r23,16; @ %r170 bra $L34; .loc 1 226 5 cvta.const.u64 %r171171_gfortran_runtime_error34: .loc 1 228 10 ld.u64 %r104,[%r150]; .loc 1 228 6 setp.ne.u64 %r173,%r104,0; @ %r173 bra $L35; .loc 1 232 11 ld.s8 %r92,[%r152+28]; .loc 1 234 7 setp.le.s64 %r175,%r92,0; @ %r175 bra $L36; .loc 1 236 13 shl.b64 %r145,%r92,3; add.u64 %r178,%frame,120; cvt.u32.u64 %r182,%r1041454,[%value_in]; } mov.u64 %r135,40; mov.u64 %r136,%r104; .loc 1 206 9 cvt.u32.u64 %r38,%r136; .loc 1 233 10 mov.u64 %r99,1; add.u64 %r297,%frame,480; add.u64 %r300,%frame,360; add.u64 %r299,%frame,240; .loc 1 237 4 mov.u64 %r186,%r136; $L37: add.u64 %r134,%r150,%r135; st.u64 [%r134+8],%r186; add.u64 %r133,%r152,%r135; ld.u64 %r188,[%r133+16]; add.u64 %r187,%r188,1; ld.u64 %r189,[%r133+8]; sub.u64 %r33,%r187,%r189; add.u64 %r190,%r33,-1; st.u64 [%r134+16],%r190; st.u64 [%r134],%r99; .loc 1 239 14 add.u64 %r191,%frame,%r136; st.u64 [%r191],%r33; .loc 1 240 18 set.u32.le.s64 %r193,%r33,0; neg.s32 %r194,%r193; cvt.u16.u32 %r197,%r38; and.b16 %r196,%r197,1; cvt.u16.u32 %r200,%r194; or.b16 %r199,%r200,%r196; cvt.u32.u16 %r202,%r199; cvt.u32.u8 %r38,%r202; .loc 1 241 15 add.u64 %r204,%r297,%r136; st.u64 [%r204],%r99; .loc 1 242 15 add.u64 %r206,%r300,%r136; .loc 1 242 17 add.u64 %r207,%r153,%r135; .loc 1 242 15 ld.u64 %r208,[%r207]; st.u64 [%r206],%r208; .loc 1 243 15 add.u64 %r210,%r299,%r136; .loc 1 243 17 ld.u64 %r212,[%r133]; mul.lo.u64 %r211,%r212,%r22; .loc 1 243 15 st.u64 [%r210],%r211; .loc 1 244 7 mul.lo.u64 %r99,%r99,%r33; .loc 1 234 7 add.u64 %r136,%r136,8; add.u64 %r135,%r135,24; setp.ne.u64 %r213,%r136,%r145; @ %r213 bra $L37; bra $L64; $L36: .loc 1 246 19 st.u64 [%r150+8],%r104; .loc 1 247 24 mov.u64 %r216,8; mov.u64 %r215216; call (%value_in),_gfortrani_xmallocarray217,[%value_in]; } mov.u64 %r104,%r217; .loc 1 247 22 st.u64 [%r150],%r104; .loc 1 278 12 ld.u64 
%r149,[%frame+480]; bra $L39; $L64: .loc 1 246 19 st.u64 [%r150+8],%r186; .loc 1 247 24 mov.u64 %r299221; call (%value_in),_gfortrani_xmallocarray222,[%value_in]; } .loc 1 247 22 st.u64 [%r150],%r222; bra $L40; $L35: .loc 1 251 11 ld.s8 %r92,[%r150+28]; .loc 1 253 18 mov.u64 %r225,1; st.u64 [%frame+480],%r225; .loc 1 254 7 setp.le.s64 %r226,%r92,0; @ %r226 bra $L54; .loc 1 256 13 shl.b64 %r147,%r92,3; add.u64 %r229,%frame,120; mov.u32 %r23314735,[%value_in]; } add.u64 %r127,%r150,40; add.u64 %r125,%r153,40; add.u64 %r123,%r152,40; mov.u64 %r121,0; .loc 1 206 9 cvt.u32.u64 %r38,%r121; add.u64 %r297,%frame,480; add.u64 %r300,%frame,360; add.u64 %r299,%frame,240; $L41: .loc 1 257 16 ld.u64 %r238,[%r127+16]; add.u64 %r237,%r238,1; ld.u64 %r239,[%r127+8]; sub.u64 %r50,%r237,%r239; .loc 1 257 14 add.u64 %r240,%frame,%r121; st.u64 [%r240],%r50; .loc 1 258 18 set.u32.le.s64 %r242,%r50,0; neg.s32 %r243,%r242; cvt.u16.u32 %r246,%r38; and.b16 %r245,%r246,1; cvt.u16.u32 %r249,%r243; or.b16 %r248,%r249,%r245; cvt.u32.u16 %r251,%r248; cvt.u32.u8 %r38,%r251; .loc 1 259 15 add.u64 %r253,%r297,%r121; ld.u64 %r254,[%r127]; st.u64 [%r253],%r254; .loc 1 260 15 add.u64 %r256,%r300,%r121; ld.u64 %r257,[%r125]; st.u64 [%r256],%r257; .loc 1 261 15 add.u64 %r259,%r299,%r121; .loc 1 261 17 ld.u64 %r261,[%r123]; mul.lo.u64 %r260,%r261,%r22; .loc 1 261 15 st.u64 [%r259],%r260; .loc 1 254 7 add.u64 %r127,%r127,24; add.u64 %r125,%r125,24; add.u64 %r123,%r123,24; add.u64 %r121,%r121,8; setp.ne.u64 %r262,%r121,%r147; @ %r262 bra $L41; .loc 1 263 10 ld.u64 %r263,[%frame+48040; .loc 1 264 13 mov.u64 %r265,1; st.u64 [%frame+480],%r265; $L40: .loc 1 267 6 setp.ne.u32 %r266,%r38,0; @ %r266 bra $L33; .loc 1 278 12 ld.u64 %r149,[%frame+480]; .loc 1 281 8 ld.u64 %r104,[%r150]; bra $L39; $L54: .loc 1 254 7 mov.u64 %r149,%r225; $L39: .loc 1 270 14 ld.u64 %r59,[%frame+360]; .loc 1 270 6 setp.ne.u64 %r267,%r59,0; .loc 1 271 16 selp.u64 %r59,%r59,1,%r267; .loc 1 272 14 ld.u64 %r60,[%frame+240]; .loc 1 272 6 setp.ne.u64 %r268,%r60,0; .loc 1 273 16 selp.u64 %r60,%r60,1,%r268; .loc 1 275 12 ld.u64 %r90,[%r151+40]; .loc 1 276 6 setp.ne.u64 %r269,%r90,0; .loc 1 277 14 selp.u64 %r90,%r90,1,%r269; .loc 1 282 8 ld.u64 %r105,[%r153]; .loc 1 283 8 ld.u64 %r91,[%r151]; .loc 1 285 9 setp.eq.u64 %r270,%r104,0; @ %r270 bra $L33; .loc 1 291 16 shl.b64 %r64,%r90,3; .loc 1 299 12 shl.b64 %r66,%r149,3; .loc 1 300 12 shl.b64 %r68,%r59,3; ld.u64 %r27,[%frame+120]; .loc 1 304 32 ld.u64 %r107,[%frame]; .loc 1 311 30 mul.lo.u64 %r28,%r107,%r149; .loc 1 312 16 mul.lo.u64 %r271,%r107,%r59; shl.b64 %r272,%r271,3; neg.s64 %r111,%r272; .loc 1 313 30 mul.lo.u64 %r273,%r107,%r60; .loc 1 313 16 neg.s64 %r112,%r273; shl.b64 %r137,%r92,3; setp.le.s64 %r298,%r92,1; add.u64 %r305,%frame,480; add.u64 %r306,%frame,360; add.u64 %r307,%frame,240; .loc 1 308 20 mov.u64 %r308,0; $L53: .loc 1 287 10 ld.s8 %r275,[%r106]; cvt.u16.u32 %r274,%r275; setp.eq.u16 %r276,%r274,0; @ %r276 bra $L46; .loc 1 290 12 ld.f64 %r62,[%r91]; .loc 1 291 16 add.u64 %r91,%r91,%r64; bra $L47; $L46: .loc 1 296 12 ld.f64 %r62,[%r105]; $L47: st.f64 [%r104],%r62; .loc 1 299 12 add.u64 %r104,%r104,%r66; .loc 1 300 12 add.u64 %r105,%r105,%r68; .loc 1 301 12 add.u64 %r106,%r106,%r60; .loc 1 302 15 add.u64 %r27,%r27,1; .loc 1 304 13 setp.ne.u64 %r277,%r27,%r107; @ %r277 bra $L53; .loc 1 312 16 add.u64 %r102,%r105,%r111; .loc 1 313 16 add.u64 %r103,%r106,%r112; .loc 1 315 14 @ ! 
%r298 bra $L65; bra $L33; $L52: .loc 1 308 20 st.u64 [%r98],%r308; .loc 1 311 30 mul.lo.u64 %r71,%r81,%r80; .loc 1 312 30 mul.lo.u64 %r280,%r83,%r80; .loc 1 312 16 shl.b64 %r281,%r280,3; sub.u64 %r102,%r105,%r281; .loc 1 313 30 mul.lo.u64 %r282,%r86,%r80; .loc 1 313 16 sub.u64 %r103,%r106,%r282; .loc 1 315 14 add.u64 %r98,%r98,8; add.u64 %r144,%r144,8; setp.ne.u64 %r283,%r137,%r144; @ %r283 bra $L51; bra $L33; $L65: add.u64 %r98,%frame,128; .loc 1 311 30 mov.u64 %r71,%r28; .loc 1 315 14 mov.u64 %r144,8; $L51: .loc 1 323 23 ld.u64 %r285,[%r98]; add.u64 %r80,%r285,1; st.u64 [%r98],%r80; .loc 1 324 30 add.u64 %r287,%r305,%r144; ld.u64 %r81,[%r287]; .loc 1 324 20 sub.u64 %r288,%r81,%r71; shl.b64 %r289,%r288,3; add.u64 %r104,%r104,%r289; .loc 1 325 30 add.u64 %r291,%r306,%r144; ld.u64 %r83,[%r291]; .loc 1 325 20 shl.b64 %r292,%r83,3; add.u64 %r105,%r102,%r292; .loc 1 326 30 add.u64 %r294,%r307,%r144; ld.u64 %r86,[%r294]; .loc 1 326 20 add.u64 %r106,%r103,%r86; .loc 1 304 32 add.u64 %r295,%frame,%r144; ld.u64 %r88,[%r295]; .loc 1 304 13 setp.eq.u64 %r296,%r80,%r88; @ %r296 bra $L52; .loc 1 308 20 mov.u64 %r27,0; bra $L53; $L33: .loc 1 330 1 ret; } unpack_r10.o/ unpack_c4.o/_gfortrani_unpack0_c4 .visible .func _gfortrani_unpack0_cfortran/generated/unpack_c4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_unpack1_c4 .visible .func _gfortrani_unpack1_c4gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortrani_unpack0_c4 .visible .func _gfortrani_unpack0_creg .u64 %stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[48032predpredu16predu32 %r251; .reg .pred %r252; .reg .predpred %r267; .reg .u64 %r268; .reg .u64mov.u64 %r135,%ar0; mov.u64 %r136,%ar1; mov.u64 %r137,%ar2; mov.u64 %r138,%ar3; .loc 1 46 23 ld.f32 %r87,[%r138]; ld.f32 %r88,[%r138+4]; .loc 1 62 8 ld.u64 %r99,[%r137]; .loc 1 67 15 ld.u64 %r22,[%r137+16]; .loc 1 69 22 cvt.u32.u64 %r23,%r22; .loc 1 69 53 add.u32 %r139,%r23,-4; and.b32 %r140,%r139,-5; set.u32.eq.u32 %r142,%r140,0; neg.s32 %r143,%r142; .loc 1 69 22 add.u32 %r144,%r23,-1; set.u32.le.u32 %r146,%r144,1; neg.s32 %r147,%r146; .loc 1 69 6 cvt.u16.u32 %r149,%r143; cvt.u16.u32 %r150,%r1472; .loc 1 71 7 setp.eq.u32 %r155,%r23,16; @ %r155 bra $L2; .loc 1 80 5_gfortran_runtime_errorr98,[%r135]; .loc 1 82 6 setp.ne.u64 %r158,%r98,0; @ %r158 bra $L3; .loc 1 86 11 ld.s8 %r79,[%r137+28]; .loc 1 88 7 setp.le.s64 %r160,%r79,0; @ %r160 bra $L4; .loc 1 90 13 shl.b64 %r162,%r79,3; add.u64 %r163,%frame,120; cvt.u32.u64 %r167,%r98162169,[%value_in]; } add.u64 %r122,%r135,40; add.u64 %r120,%r137,40; mov.u64 %r118,%frame; add.u64 %r117,%frame,360; add.u64 %r116,%frame,240; add.u64 %r172,%r79,%r79; add.u64 %r173,%r172,%r79; shl.b64 %r174,%r173,3; add.u64 %r104,%r174,%r122; .loc 1 60 9 cvt.u32.u64 %r37,%r98; .loc 1 87 10 mov.u64 %r93,1; .loc 1 91 4 mov.u64 %r176,%r98; $L5: st.u64 [%r122+8],%r176; ld.u64 %r178,[%r120+16]; add.u64 %r177,%r178,1; ld.u64 %r179,[%r120+8]; sub.u64 %r32,%r177,%r179; add.u64 %r180,%r32,-1; st.u64 [%r122+16],%r180; st.u64 [%r122],%r93; .loc 1 93 14 st.u64 [%r118],%r32; .loc 1 94 18 set.u32.le.s64 %r182,%r32,0; neg.s32 %r183,%r182; cvt.u16.u32 %r186,%r37; and.b16 %r185,%r186,1; cvt.u16.u32 %r189,%r183; or.b16 %r188,%r189,%r185; cvt.u32.u16 %r191,%r188; cvt.u32.u8 %r37,%r191; .loc 1 95 15 st.u64 [%r117],%r93; .loc 1 96 17 ld.u64 %r193,[%r120]; mul.lo.u64 %r192,%r193,%r22; .loc 1 96 15 st.u64 [%r116],%r192; .loc 1 97 7 mul.lo.u64 %r93,%r93,%r32; .loc 1 88 7 add.u64 
%r122,%r122,24; add.u64 %r120,%r120,24; add.u64 %r118,%r118,8; add.u64 %r117,%r117,8; add.u64 %r116,%r116,8; setp.ne.u64 %r194,%r104,%r122; @ %r194 bra $L5; bra $L31; $L4: .loc 1 99 19 st.u64 [%r135+8],%r98; .loc 1 100 24 mov.u64 %r197,8; mov.u64 %r196call (%value_in),_gfortrani_xmallocarray198,[%value_in]; } mov.u64 %r98,%r198; .loc 1 100 22 st.u64 [%r135],%r98; .loc 1 128 12 ld.u64 %r134,[%frame+360]; bra $L7; $L31: .loc 1 99 19 st.u64 [%r135+8],%r176; .loc 1 100 24 mov.u64 %r20(%value_in),_gfortrani_xmallocarray203,[%value_in]; } .loc 1 100 22 st.u64 [%r135],%r203; bra $L8; $L3: .loc 1 104 11 ld.s8 %r79,[%r135+28]; .loc 1 106 18 mov.u64 %r206,1; st.u64 [%frame+360],%r206; .loc 1 107 7 setp.le.s64 %r207,%r79,0; @ %r207 bra $L20; .loc 1 109 13 shl.b64 %r132,%r79,3; add.u64 %r210,%frame,120; mov.u32 %r21413216,[%value_in]; } mov.u64 %r101,40; mov.u64 %r102,0; .loc 1 60 9 cvt.u32.u64 %r37,%r102; add.u64 %r269,%frame,360; add.u64 %r268,%frame,240; $L9: add.u64 %r83,%r135,%r101; .loc 1 110 16 ld.u64 %r219,[%r83+16]; add.u64 %r218,%r219,1; ld.u64 %r220,[%r83+8]; sub.u64 %r48,%r218,%r220; .loc 1 110 14 add.u64 %r221,%frame,%r102; st.u64 [%r221],%r48; .loc 1 111 18 set.u32.le.s64 %r223,%r48,0; neg.s32 %r224,%r223; cvt.u16.u32 %r227,%r37; and.b16 %r226,%r227,1; cvt.u16.u32 %r230,%r224; or.b16 %r229,%r230,%r226; cvt.u32.u16 %r232,%r229; cvt.u32.u8 %r37,%r232; .loc 1 112 15 add.u64 %r234,%r269,%r102; ld.u64 %r235,[%r83]; st.u64 [%r234],%r235; .loc 1 113 15 add.u64 %r237,%r268,%r102; .loc 1 113 17 add.u64 %r238,%r137,%r101; ld.u64 %r240,[%r238]; mul.lo.u64 %r239,%r240,%r22; .loc 1 113 15 st.u64 [%r237],%r239; .loc 1 107 7 add.u64 %r102,%r102,8; add.u64 %r101,%r101,24; setp.ne.u64 %r241,%r102,%r132; @ %r241 bra $L9; .loc 1 115 10 ld.u64 %r242,[%frame+360]; setp.ne.u64 %r243,%r242,0; @ %r243 bra $L8; .loc 1 116 13 mov.u64 %r244,1; st.u64 [%frame+360],%r244; $L8: .loc 1 119 6 setp.ne.u32 %r245,%r37,0; @ %r245 bra $L1; .loc 1 128 12 ld.u64 %r134,[%frame+360]; .loc 1 130 8 ld.u64 %r98,[%r135]; bra $L7; $L20: .loc 1 107 7 mov.u64 %r134,%r206; $L7: .loc 1 122 14 ld.u64 %r56,[%frame+240]; .loc 1 122 6 setp.ne.u64 %r246,%r56,0; .loc 1 123 16 selp.u64 %r56,%r56,1,%r246; .loc 1 125 12 ld.u64 %r76,[%r136+40]; .loc 1 126 6 setp.ne.u64 %r247,%r76,0; .loc 1 127 14 selp.u64 %r76,%r76,1,%r247; .loc 1 131 8 ld.u64 %r77,[%r136]; .loc 1 133 9 setp.eq.u64 %r248,%r98,0; @ %r248 bra $L1; .loc 1 139 9 shl.b64 %r61,%r76,3; .loc 1 147 12 shl.b64 %r63,%r134,3; ld.u64 %r96,[%frame+120]; .loc 1 151 32 ld.u64 %r100,[%frame]; .loc 1 158 30 mul.lo.u64 %r27,%r100,%r134; .loc 1 159 30 mul.lo.u64 %r249,%r100,%r56; .loc 1 159 16 neg.s64 %r103,%r249; shl.b64 %r123,%r79,3; setp.le.s64 %r270,%r79,1; add.u64 %r275,%frame,360; add.u64 %r276,%frame,240; .loc 1 155 20 mov.u64 %r277,0; $L19: .loc 1 135 10 ld.s8 %r251,[%r99]; cvt.u16.u32 %r250,%r251; setp.eq.u16 %r252,%r250,0; @ %r252 bra $L21; .loc 1 138 12 ld.f32 %r58,[%r77]; ld.f32 %r59,[%r77+4]; .loc 1 139 9 add.u64 %r77,%r77,%r61; bra $L13; $L21: .loc 1 144 10 mov.f32 %r58,%r87; mov.f32 %r59,%r88; $L13: st.f32 [%r98],%r58; st.f32 [%r98+4],%r59; .loc 1 147 12 add.u64 %r98,%r98,%r63; .loc 1 148 12 add.u64 %r99,%r99,%r56; .loc 1 149 15 add.u64 %r96,%r96,1; .loc 1 151 13 setp.ne.u64 %r253,%r96,%r100; @ %r253 bra $L19; .loc 1 159 16 add.u64 %r97,%r99,%r103; .loc 1 161 14 @ ! 
%r270 bra $L32; bra $L1; $L18: .loc 1 155 20 st.u64 [%r92],%r277; .loc 1 158 30 mul.lo.u64 %r65,%r71,%r70; .loc 1 159 30 mul.lo.u64 %r256,%r73,%r70; .loc 1 159 16 sub.u64 %r97,%r99,%r256; .loc 1 161 14 add.u64 %r92,%r92,8; add.u64 %r129,%r129,8; setp.ne.u64 %r257,%r123,%r129; @ %r257 bra $L17; bra $L1; $L32: add.u64 %r92,%frame,128; .loc 1 158 30 mov.u64 %r65,%r27; .loc 1 161 14 mov.u64 %r129,8; $L17: .loc 1 169 23 ld.u64 %r259,[%r92]; add.u64 %r70,%r259,1; st.u64 [%r92],%r70; .loc 1 170 30 add.u64 %r261,%r275,%r129; ld.u64 %r71,[%r261]; .loc 1 170 20 sub.u64 %r262,%r71,%r65; shl.b64 %r263,%r262,3; add.u64 %r98,%r98,%r263; .loc 1 171 30 add.u64 %r265,%r276,%r129; ld.u64 %r73,[%r265]; .loc 1 171 20 add.u64 %r99,%r97,%r73; .loc 1 151 32 add.u64 %r266,%frame,%r129; ld.u64 %r75,[%r266]; .loc 1 151 13 setp.eq.u64 %r267,%r70,%r75; @ %r267 bra $L18; .loc 1 155 20 mov.u64 %r96,0; bra $L19; $L_gfortrani_unpack1_c4 .visible .func _gfortrani_unpack1_creg .u64 %stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[608predpred %r174; .reg .predu32u64pred %r269; .reg .predu64 %r306; .reg .u64mov.u64 %r151,%ar0; mov.u64 %r152,%ar1; mov.u64 %r153,%ar2; mov.u64 %r154,%ar3; .loc 1 208 8 ld.u64 %r107,[%r153]; .loc 1 213 15 ld.u64 %r22,[%r153+16]; .loc 1 215 22 cvt.u32.u64 %r23,%r22; .loc 1 215 53 add.u32 %r155,%r23,-4; and.b32 %r156,%r155,-5; set.u32.eq.u32 %r158,%r156,0; neg.s32 %r159,%r158; .loc 1 215 22 add.u32 %r160,%r23,-1; set.u32.le.u32 %r162,%r160,1; neg.s32 %r163,%r162; .loc 1 215 6 cvt.u16.u32 %r165,%r159; cvt.u16.u32 %r166,%r163; or.b16 %r164,%r165,%r166; cvt.u32.u16 %r167,%r164; cvt.u16.u8 %r168,%r167; setp.ne.u16 %r169,%r168,0; @ %r169 bra $L34; .loc 1 217 7 setp.eq.u32 %r171,%r23,16; @ %r171 bra $L34; .loc 1 226 5 cvta.const.u64 %r172172_gfortran_runtime_error34: .loc 1 228 10 ld.u64 %r105,[%r151]; .loc 1 228 6 setp.ne.u64 %r174,%r105,0; @ %r174 bra $L35; .loc 1 232 11 ld.s8 %r93,[%r153+28]; .loc 1 234 7 setp.le.s64 %r176,%r93,0; @ %r176 bra $L36; .loc 1 236 13 shl.b64 %r146,%r93,3; add.u64 %r179,%frame,120; cvt.u32.u64 %r183,%r1051465,[%value_in]; } mov.u64 %r136,40; mov.u64 %r137,%r105; .loc 1 206 9 cvt.u32.u64 %r38,%r137; .loc 1 233 10 mov.u64 %r100,1; add.u64 %r301,%frame,480; add.u64 %r300,%frame,360; add.u64 %r299,%frame,240; .loc 1 237 4 mov.u64 %r187,%r137; $L37: add.u64 %r135,%r151,%r136; st.u64 [%r135+8],%r187; add.u64 %r134,%r153,%r136; ld.u64 %r189,[%r134+16]; add.u64 %r188,%r189,1; ld.u64 %r190,[%r134+8]; sub.u64 %r33,%r188,%r190; add.u64 %r191,%r33,-1; st.u64 [%r135+16],%r191; st.u64 [%r135],%r100; .loc 1 239 14 add.u64 %r192,%frame,%r137; st.u64 [%r192],%r33; .loc 1 240 18 set.u32.le.s64 %r194,%r33,0; neg.s32 %r195,%r194; cvt.u16.u32 %r198,%r38; and.b16 %r197,%r198,1; cvt.u16.u32 %r201,%r195; or.b16 %r200,%r201,%r197; cvt.u32.u16 %r203,%r200; cvt.u32.u8 %r38,%r203; .loc 1 241 15 add.u64 %r205,%r301,%r137; st.u64 [%r205],%r100; .loc 1 242 15 add.u64 %r207,%r300,%r137; .loc 1 242 17 add.u64 %r208,%r154,%r136; .loc 1 242 15 ld.u64 %r209,[%r208]; st.u64 [%r207],%r209; .loc 1 243 15 add.u64 %r211,%r299,%r137; .loc 1 243 17 ld.u64 %r213,[%r134]; mul.lo.u64 %r212,%r213,%r22; .loc 1 243 15 st.u64 [%r211],%r212; .loc 1 244 7 mul.lo.u64 %r100,%r100,%r33; .loc 1 234 7 add.u64 %r137,%r137,8; add.u64 %r136,%r136,24; setp.ne.u64 %r214,%r137,%r146; @ %r214 bra $L37; bra $L64; $L36: .loc 1 246 19 st.u64 [%r151+8],%r105; .loc 1 247 24 mov.u64 %r217,8; mov.u64 %r216217; call (%value_in),_gfortrani_xmallocarray218,[%value_in]; } mov.u64 %r105,%r218; .loc 1 247 22 st.u64 [%r151],%r105; .loc 1 278 
12 ld.u64 %r150,[%frame+480]; bra $L39; $L64: .loc 1 246 19 st.u64 [%r151+8],%r187; .loc 1 247 24 mov.u64 %r2222; call (%value_in),_gfortrani_xmallocarray223,[%value_in]; } .loc 1 247 22 st.u64 [%r151],%r223; bra $L40; $L35: .loc 1 251 11 ld.s8 %r93,[%r151+28]; .loc 1 253 18 mov.u64 %r226,1; st.u64 [%frame+480],%r226; .loc 1 254 7 setp.le.s64 %r227,%r93,0; @ %r227 bra $L54; .loc 1 256 13 shl.b64 %r148,%r93,3; add.u64 %r230,%frame,120; mov.u32 %r23414836,[%value_in]; } add.u64 %r128,%r151,40; add.u64 %r126,%r154,40; add.u64 %r124,%r153,40; mov.u64 %r122,0; .loc 1 206 9 cvt.u32.u64 %r38,%r122; add.u64 %r301,%frame,480; add.u64 %r300,%frame,360; add.u64 %r299,%frame,240; $L41: .loc 1 257 16 ld.u64 %r239,[%r128+16]; add.u64 %r238,%r239,1; ld.u64 %r240,[%r128+8]; sub.u64 %r50,%r238,%r240; .loc 1 257 14 add.u64 %r241,%frame,%r122; st.u64 [%r241],%r50; .loc 1 258 18 set.u32.le.s64 %r243,%r50,0; neg.s32 %r244,%r243; cvt.u16.u32 %r247,%r38; and.b16 %r246,%r247,1; cvt.u16.u32 %r250,%r244; or.b16 %r249,%r250,%r246; cvt.u32.u16 %r252,%r249; cvt.u32.u8 %r38,%r252; .loc 1 259 15 add.u64 %r254,%r301,%r122; ld.u64 %r255,[%r128]; st.u64 [%r254],%r255; .loc 1 260 15 add.u64 %r257,%r300,%r122; ld.u64 %r258,[%r126]; st.u64 [%r257],%r258; .loc 1 261 15 add.u64 %r260,%r299,%r122; .loc 1 261 17 ld.u64 %r262,[%r124]; mul.lo.u64 %r261,%r262,%r22; .loc 1 261 15 st.u64 [%r260],%r261; .loc 1 254 7 add.u64 %r128,%r128,24; add.u64 %r126,%r126,24; add.u64 %r124,%r124,24; add.u64 %r122,%r122,8; setp.ne.u64 %r263,%r122,%r148; @ %r263 bra $L41; .loc 1 263 10 ld.u64 %r264,[%frame+480]; setp.ne.u64 %r265,%r264,0; @ %r265 bra $L40; .loc 1 264 13 mov.u64 %r266,1; st.u64 [%frame+480],%r266; $L40: .loc 1 267 6 setp.ne.u32 %r267,%r38,0; @ %r267 bra $L33; .loc 1 278 12 ld.u64 %r150,[%frame+480]; .loc 1 281 8 ld.u64 %r105,[%r151]; bra $L39; $L54: .loc 1 254 7 mov.u64 %r150,%r226; $L39: .loc 1 270 14 ld.u64 %r59,[%frame+360]; .loc 1 270 6 setp.ne.u64 %r268,%r59,0; .loc 1 271 16 selp.u64 %r59,%r59,1,%r268; .loc 1 272 14 ld.u64 %r60,[%frame+240]; .loc 1 272 6 setp.ne.u64 %r269,%r60,0; .loc 1 273 16 selp.u64 %r60,%r60,1,%r269; .loc 1 275 12 ld.u64 %r91,[%r152+40]; .loc 1 276 6 setp.ne.u64 %r270,%r91,0; .loc 1 277 14 selp.u64 %r91,%r91,1,%r270; .loc 1 282 8 ld.u64 %r106,[%r154]; .loc 1 283 8 ld.u64 %r92,[%r152]; .loc 1 285 9 setp.eq.u64 %r271,%r105,0; @ %r271 bra $L33; .loc 1 291 16 shl.b64 %r65,%r91,3; .loc 1 299 12 shl.b64 %r67,%r150,3; .loc 1 300 12 shl.b64 %r69,%r59,3; ld.u64 %r27,[%frame+120]; .loc 1 304 32 ld.u64 %r108,[%frame]; .loc 1 311 30 mul.lo.u64 %r28,%r108,%r150; .loc 1 312 16 mul.lo.u64 %r272,%r108,%r59; shl.b64 %r273,%r272,3; neg.s64 %r112,%r273; .loc 1 313 30 mul.lo.u64 %r274,%r108,%r60; .loc 1 313 16 neg.s64 %r113,%r274; shl.b64 %r138,%r93,3; setp.le.s64 %r298,%r93,1; add.u64 %r306,%frame,480; add.u64 %r307,%frame,360; add.u64 %r308,%frame,240; .loc 1 308 20 mov.u64 %r309,0; $L53: .loc 1 287 10 ld.s8 %r276,[%r107]; cvt.u16.u32 %r275,%r276; setp.eq.u16 %r277,%r275,0; @ %r277 bra $L46; .loc 1 290 12 ld.f32 %r62,[%r92]; ld.f32 %r63,[%r92+4]; .loc 1 291 16 add.u64 %r92,%r92,%r65; bra $L47; $L46: .loc 1 296 12 ld.f32 %r62,[%r106]; ld.f32 %r63,[%r106+4]; $L47: st.f32 [%r105],%r62; st.f32 [%r105+4],%r63; .loc 1 299 12 add.u64 %r105,%r105,%r67; .loc 1 300 12 add.u64 %r106,%r106,%r69; .loc 1 301 12 add.u64 %r107,%r107,%r60; .loc 1 302 15 add.u64 %r27,%r27,1; .loc 1 304 13 setp.ne.u64 %r278,%r27,%r108; @ %r278 bra $L53; .loc 1 312 16 add.u64 %r103,%r106,%r112; .loc 1 313 16 add.u64 %r104,%r107,%r113; .loc 1 315 14 @ ! 
%r298 bra $L65; bra $L33; $L52: .loc 1 308 20 st.u64 [%r99],%r309; .loc 1 311 30 mul.lo.u64 %r72,%r82,%r81; .loc 1 312 30 mul.lo.u64 %r281,%r84,%r81; .loc 1 312 16 shl.b64 %r282,%r281,3; sub.u64 %r103,%r106,%r282; .loc 1 313 30 mul.lo.u64 %r283,%r87,%r81; .loc 1 313 16 sub.u64 %r104,%r107,%r283; .loc 1 315 14 add.u64 %r99,%r99,8; add.u64 %r145,%r145,8; setp.ne.u64 %r284,%r138,%r145; @ %r284 bra $L51; bra $L33; $L65: add.u64 %r99,%frame,128; .loc 1 311 30 mov.u64 %r72,%r28; .loc 1 315 14 mov.u64 %r145,8; $L51: .loc 1 323 23 ld.u64 %r286,[%r99]; add.u64 %r81,%r286,1; st.u64 [%r99],%r81; .loc 1 324 30 add.u64 %r288,%r306,%r145; ld.u64 %r82,[%r288]; .loc 1 324 20 sub.u64 %r289,%r82,%r72; shl.b64 %r290,%r289,3; add.u64 %r105,%r105,%r290; .loc 1 325 30 add.u64 %r292,%r307,%r145; ld.u64 %r84,[%r292]; .loc 1 325 20 shl.b64 %r293,%r84,3; add.u64 %r106,%r103,%r293; .loc 1 326 30 add.u64 %r295,%r308,%r145; ld.u64 %r87,[%r295]; .loc 1 326 20 add.u64 %r107,%r104,%r87; .loc 1 304 32 add.u64 %r296,%frame,%r145; ld.u64 %r89,[%r296]; .loc 1 304 13 setp.eq.u64 %r297,%r81,%r89; @ %r297 bra $L52; .loc 1 308 20 mov.u64 %r27,0; bra $L53; $L33: .loc 1 330 1 ret; } unpack_c8.o/_gfortrani_unpack0_c8 .visible .func _gfortrani_unpack0_c8fortran/generated/unpack_c8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_unpack1_c8 .visible .func _gfortrani_unpack1_c8gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortrani_unpack0_c8 .visible .func _gfortrani_unpack0_c8reg .u64 %stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[48032predpredu16predu32 %r251; .reg .pred %r252; .reg .predpred %r267; .reg .u64 %r268; .reg .u64mov.u64 %r135,%ar0; mov.u64 %r136,%ar1; mov.u64 %r137,%ar2; mov.u64 %r138,%ar3; .loc 1 46 23 ld.f64 %r87,[%r138]; ld.f64 %r88,[%r138+8]; .loc 1 62 8 ld.u64 %r99,[%r137]; .loc 1 67 15 ld.u64 %r22,[%r137+16]; .loc 1 69 22 cvt.u32.u64 %r23,%r22; .loc 1 69 53 add.u32 %r139,%r23,-4; and.b32 %r140,%r139,-5; set.u32.eq.u32 %r142,%r140,0; neg.s32 %r143,%r142; .loc 1 69 22 add.u32 %r144,%r23,-1; set.u32.le.u32 %r146,%r144,1; neg.s32 %r147,%r146; .loc 1 69 6 cvt.u16.u32 %r149,%r143; cvt.u16.u32 %r150,%r1472; .loc 1 71 7 setp.eq.u32 %r155,%r23,16; @ %r155 bra $L2; .loc 1 80 5_gfortran_runtime_errorr98,[%r135]; .loc 1 82 6 setp.ne.u64 %r158,%r98,0; @ %r158 bra $L3; .loc 1 86 11 ld.s8 %r79,[%r137+28]; .loc 1 88 7 setp.le.s64 %r160,%r79,0; @ %r160 bra $L4; .loc 1 90 13 shl.b64 %r162,%r79,3; add.u64 %r163,%frame,120; cvt.u32.u64 %r167,%r98162169,[%value_in]; } add.u64 %r122,%r135,40; add.u64 %r120,%r137,40; mov.u64 %r118,%frame; add.u64 %r117,%frame,360; add.u64 %r116,%frame,240; add.u64 %r172,%r79,%r79; add.u64 %r173,%r172,%r79; shl.b64 %r174,%r173,3; add.u64 %r104,%r174,%r122; .loc 1 60 9 cvt.u32.u64 %r37,%r98; .loc 1 87 10 mov.u64 %r93,1; .loc 1 91 4 mov.u64 %r176,%r98; $L5: st.u64 [%r122+8],%r176; ld.u64 %r178,[%r120+16]; add.u64 %r177,%r178,1; ld.u64 %r179,[%r120+8]; sub.u64 %r32,%r177,%r179; add.u64 %r180,%r32,-1; st.u64 [%r122+16],%r180; st.u64 [%r122],%r93; .loc 1 93 14 st.u64 [%r118],%r32; .loc 1 94 18 set.u32.le.s64 %r182,%r32,0; neg.s32 %r183,%r182; cvt.u16.u32 %r186,%r37; and.b16 %r185,%r186,1; cvt.u16.u32 %r189,%r183; or.b16 %r188,%r189,%r185; cvt.u32.u16 %r191,%r188; cvt.u32.u8 %r37,%r191; .loc 1 95 15 st.u64 [%r117],%r93; .loc 1 96 17 ld.u64 %r193,[%r120]; mul.lo.u64 %r192,%r193,%r22; .loc 1 96 15 st.u64 [%r116],%r192; .loc 1 97 7 mul.lo.u64 %r93,%r93,%r32; .loc 1 88 7 add.u64 
%r122,%r122,24; add.u64 %r120,%r120,24; add.u64 %r118,%r118,8; add.u64 %r117,%r117,8; add.u64 %r116,%r116,8; setp.ne.u64 %r194,%r104,%r122; @ %r194 bra $L5; bra $L31; $L4: .loc 1 99 19 st.u64 [%r135+8],%r98; .loc 1 100 24 mov.u64 %r197,16; mov.u64 %r196call (%value_in),_gfortrani_xmallocarray198,[%value_in]; } mov.u64 %r98,%r198; .loc 1 100 22 st.u64 [%r135],%r98; .loc 1 128 12 ld.u64 %r134,[%frame+360]; bra $L7; $L31: .loc 1 99 19 st.u64 [%r135+8],%r176; .loc 1 100 24 mov.u64 %r202,16(%value_in),_gfortrani_xmallocarray203,[%value_in]; } .loc 1 100 22 st.u64 [%r135],%r203; bra $L8; $L3: .loc 1 104 11 ld.s8 %r79,[%r135+28]; .loc 1 106 18 mov.u64 %r206,1; st.u64 [%frame+360],%r206; .loc 1 107 7 setp.le.s64 %r207,%r79,0; @ %r207 bra $L20; .loc 1 109 13 shl.b64 %r132,%r79,3; add.u64 %r210,%frame,120; mov.u32 %r21413216,[%value_in]; } mov.u64 %r101,40; mov.u64 %r102,0; .loc 1 60 9 cvt.u32.u64 %r37,%r102; add.u64 %r269,%frame,360; add.u64 %r268,%frame,240; $L9: add.u64 %r83,%r135,%r101; .loc 1 110 16 ld.u64 %r219,[%r83+16]; add.u64 %r218,%r219,1; ld.u64 %r220,[%r83+8]; sub.u64 %r48,%r218,%r220; .loc 1 110 14 add.u64 %r221,%frame,%r102; st.u64 [%r221],%r48; .loc 1 111 18 set.u32.le.s64 %r223,%r48,0; neg.s32 %r224,%r223; cvt.u16.u32 %r227,%r37; and.b16 %r226,%r227,1; cvt.u16.u32 %r230,%r224; or.b16 %r229,%r230,%r226; cvt.u32.u16 %r232,%r229; cvt.u32.u8 %r37,%r232; .loc 1 112 15 add.u64 %r234,%r269,%r102; ld.u64 %r235,[%r83]; st.u64 [%r234],%r235; .loc 1 113 15 add.u64 %r237,%r268,%r102; .loc 1 113 17 add.u64 %r238,%r137,%r101; ld.u64 %r240,[%r238]; mul.lo.u64 %r239,%r240,%r22; .loc 1 113 15 st.u64 [%r237],%r239; .loc 1 107 7 add.u64 %r102,%r102,8; add.u64 %r101,%r101,24; setp.ne.u64 %r241,%r102,%r132; @ %r241 bra $L9; .loc 1 115 10 ld.u64 %r242,[%frame+360]; setp.ne.u64 %r243,%r242,0; @ %r243 bra $L8; .loc 1 116 13 mov.u64 %r244,1; st.u64 [%frame+360],%r244; $L8: .loc 1 119 6 setp.ne.u32 %r245,%r37,0; @ %r245 bra $L1; .loc 1 128 12 ld.u64 %r134,[%frame+360]; .loc 1 130 8 ld.u64 %r98,[%r135]; bra $L7; $L20: .loc 1 107 7 mov.u64 %r134,%r206; $L7: .loc 1 122 14 ld.u64 %r56,[%frame+240]; .loc 1 122 6 setp.ne.u64 %r246,%r56,0; .loc 1 123 16 selp.u64 %r56,%r56,1,%r246; .loc 1 125 12 ld.u64 %r76,[%r136+40]; .loc 1 126 6 setp.ne.u64 %r247,%r76,0; .loc 1 127 14 selp.u64 %r76,%r76,1,%r247; .loc 1 131 8 ld.u64 %r77,[%r136]; .loc 1 133 9 setp.eq.u64 %r248,%r98,0; @ %r248 bra $L1; .loc 1 139 9 shl.b64 %r61,%r76,4; .loc 1 147 12 shl.b64 %r63,%r134,4; ld.u64 %r96,[%frame+120]; .loc 1 151 32 ld.u64 %r100,[%frame]; .loc 1 158 30 mul.lo.u64 %r27,%r100,%r134; .loc 1 159 30 mul.lo.u64 %r249,%r100,%r56; .loc 1 159 16 neg.s64 %r103,%r249; shl.b64 %r123,%r79,3; setp.le.s64 %r270,%r79,1; add.u64 %r275,%frame,360; add.u64 %r276,%frame,240; .loc 1 155 20 mov.u64 %r277,0; $L19: .loc 1 135 10 ld.s8 %r251,[%r99]; cvt.u16.u32 %r250,%r251; setp.eq.u16 %r252,%r250,0; @ %r252 bra $L21; .loc 1 138 12 ld.f64 %r58,[%r77]; ld.f64 %r59,[%r77+8]; .loc 1 139 9 add.u64 %r77,%r77,%r61; bra $L13; $L21: .loc 1 144 10 mov.f64 %r58,%r87; mov.f64 %r59,%r88; $L13: st.f64 [%r98],%r58; st.f64 [%r98+8],%r59; .loc 1 147 12 add.u64 %r98,%r98,%r63; .loc 1 148 12 add.u64 %r99,%r99,%r56; .loc 1 149 15 add.u64 %r96,%r96,1; .loc 1 151 13 setp.ne.u64 %r253,%r96,%r100; @ %r253 bra $L19; .loc 1 159 16 add.u64 %r97,%r99,%r103; .loc 1 161 14 @ ! 
%r270 bra $L32; bra $L1; $L18: .loc 1 155 20 st.u64 [%r92],%r277; .loc 1 158 30 mul.lo.u64 %r65,%r71,%r70; .loc 1 159 30 mul.lo.u64 %r256,%r73,%r70; .loc 1 159 16 sub.u64 %r97,%r99,%r256; .loc 1 161 14 add.u64 %r92,%r92,8; add.u64 %r129,%r129,8; setp.ne.u64 %r257,%r123,%r129; @ %r257 bra $L17; bra $L1; $L32: add.u64 %r92,%frame,128; .loc 1 158 30 mov.u64 %r65,%r27; .loc 1 161 14 mov.u64 %r129,8; $L17: .loc 1 169 23 ld.u64 %r259,[%r92]; add.u64 %r70,%r259,1; st.u64 [%r92],%r70; .loc 1 170 30 add.u64 %r261,%r275,%r129; ld.u64 %r71,[%r261]; .loc 1 170 20 sub.u64 %r262,%r71,%r65; shl.b64 %r263,%r262,4; add.u64 %r98,%r98,%r263; .loc 1 171 30 add.u64 %r265,%r276,%r129; ld.u64 %r73,[%r265]; .loc 1 171 20 add.u64 %r99,%r97,%r73; .loc 1 151 32 add.u64 %r266,%frame,%r129; ld.u64 %r75,[%r266]; .loc 1 151 13 setp.eq.u64 %r267,%r70,%r75; @ %r267 bra $L18; .loc 1 155 20 mov.u64 %r96,0; bra $L19; $L_gfortrani_unpack1_c8 .visible .func _gfortrani_unpack1_c8reg .u64 %stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[608predpred %r174; .reg .predu32u64pred %r269; .reg .predu64 %r306; .reg .u64mov.u64 %r151,%ar0; mov.u64 %r152,%ar1; mov.u64 %r153,%ar2; mov.u64 %r154,%ar3; .loc 1 208 8 ld.u64 %r107,[%r153]; .loc 1 213 15 ld.u64 %r22,[%r153+16]; .loc 1 215 22 cvt.u32.u64 %r23,%r22; .loc 1 215 53 add.u32 %r155,%r23,-4; and.b32 %r156,%r155,-5; set.u32.eq.u32 %r158,%r156,0; neg.s32 %r159,%r158; .loc 1 215 22 add.u32 %r160,%r23,-1; set.u32.le.u32 %r162,%r160,1; neg.s32 %r163,%r162; .loc 1 215 6 cvt.u16.u32 %r165,%r159; cvt.u16.u32 %r166,%r163; or.b16 %r164,%r165,%r166; cvt.u32.u16 %r167,%r164; cvt.u16.u8 %r168,%r167; setp.ne.u16 %r169,%r168,0; @ %r169 bra $L34; .loc 1 217 7 setp.eq.u32 %r171,%r23,16; @ %r171 bra $L34; .loc 1 226 5 cvta.const.u64 %r172172_gfortran_runtime_error34: .loc 1 228 10 ld.u64 %r105,[%r151]; .loc 1 228 6 setp.ne.u64 %r174,%r105,0; @ %r174 bra $L35; .loc 1 232 11 ld.s8 %r93,[%r153+28]; .loc 1 234 7 setp.le.s64 %r176,%r93,0; @ %r176 bra $L36; .loc 1 236 13 shl.b64 %r146,%r93,3; add.u64 %r179,%frame,120; cvt.u32.u64 %r183,%r1051465,[%value_in]; } mov.u64 %r136,40; mov.u64 %r137,%r105; .loc 1 206 9 cvt.u32.u64 %r38,%r137; .loc 1 233 10 mov.u64 %r100,1; add.u64 %r301,%frame,480; add.u64 %r300,%frame,360; add.u64 %r299,%frame,240; .loc 1 237 4 mov.u64 %r187,%r137; $L37: add.u64 %r135,%r151,%r136; st.u64 [%r135+8],%r187; add.u64 %r134,%r153,%r136; ld.u64 %r189,[%r134+16]; add.u64 %r188,%r189,1; ld.u64 %r190,[%r134+8]; sub.u64 %r33,%r188,%r190; add.u64 %r191,%r33,-1; st.u64 [%r135+16],%r191; st.u64 [%r135],%r100; .loc 1 239 14 add.u64 %r192,%frame,%r137; st.u64 [%r192],%r33; .loc 1 240 18 set.u32.le.s64 %r194,%r33,0; neg.s32 %r195,%r194; cvt.u16.u32 %r198,%r38; and.b16 %r197,%r198,1; cvt.u16.u32 %r201,%r195; or.b16 %r200,%r201,%r197; cvt.u32.u16 %r203,%r200; cvt.u32.u8 %r38,%r203; .loc 1 241 15 add.u64 %r205,%r301,%r137; st.u64 [%r205],%r100; .loc 1 242 15 add.u64 %r207,%r300,%r137; .loc 1 242 17 add.u64 %r208,%r154,%r136; .loc 1 242 15 ld.u64 %r209,[%r208]; st.u64 [%r207],%r209; .loc 1 243 15 add.u64 %r211,%r299,%r137; .loc 1 243 17 ld.u64 %r213,[%r134]; mul.lo.u64 %r212,%r213,%r22; .loc 1 243 15 st.u64 [%r211],%r212; .loc 1 244 7 mul.lo.u64 %r100,%r100,%r33; .loc 1 234 7 add.u64 %r137,%r137,8; add.u64 %r136,%r136,24; setp.ne.u64 %r214,%r137,%r146; @ %r214 bra $L37; bra $L64; $L36: .loc 1 246 19 st.u64 [%r151+8],%r105; .loc 1 247 24 mov.u64 %r217,16; mov.u64 %r216217; call (%value_in),_gfortrani_xmallocarray218,[%value_in]; } mov.u64 %r105,%r218; .loc 1 247 22 st.u64 [%r151],%r105; .loc 1 
278 12 ld.u64 %r150,[%frame+480]; bra $L39; $L64: .loc 1 246 19 st.u64 [%r151+8],%r187; .loc 1 247 24 mov.u64 %r222,16222; call (%value_in),_gfortrani_xmallocarray223,[%value_in]; } .loc 1 247 22 st.u64 [%r151],%r223; bra $L40; $L35: .loc 1 251 11 ld.s8 %r93,[%r151+28]; .loc 1 253 18 mov.u64 %r226,1; st.u64 [%frame+480],%r226; .loc 1 254 7 setp.le.s64 %r227,%r93,0; @ %r227 bra $L54; .loc 1 256 13 shl.b64 %r148,%r93,3; add.u64 %r230,%frame,120; mov.u32 %r23414836,[%value_in]; } add.u64 %r128,%r151,40; add.u64 %r126,%r154,40; add.u64 %r124,%r153,40; mov.u64 %r122,0; .loc 1 206 9 cvt.u32.u64 %r38,%r122; add.u64 %r301,%frame,480; add.u64 %r300,%frame,360; add.u64 %r299,%frame,240; $L41: .loc 1 257 16 ld.u64 %r239,[%r128+16]; add.u64 %r238,%r239,1; ld.u64 %r240,[%r128+8]; sub.u64 %r50,%r238,%r240; .loc 1 257 14 add.u64 %r241,%frame,%r122; st.u64 [%r241],%r50; .loc 1 258 18 set.u32.le.s64 %r243,%r50,0; neg.s32 %r244,%r243; cvt.u16.u32 %r247,%r38; and.b16 %r246,%r247,1; cvt.u16.u32 %r250,%r244; or.b16 %r249,%r250,%r246; cvt.u32.u16 %r252,%r249; cvt.u32.u8 %r38,%r252; .loc 1 259 15 add.u64 %r254,%r301,%r122; ld.u64 %r255,[%r128]; st.u64 [%r254],%r255; .loc 1 260 15 add.u64 %r257,%r300,%r122; ld.u64 %r258,[%r126]; st.u64 [%r257],%r258; .loc 1 261 15 add.u64 %r260,%r299,%r122; .loc 1 261 17 ld.u64 %r262,[%r124]; mul.lo.u64 %r261,%r262,%r22; .loc 1 261 15 st.u64 [%r260],%r261; .loc 1 254 7 add.u64 %r128,%r128,24; add.u64 %r126,%r126,24; add.u64 %r124,%r124,24; add.u64 %r122,%r122,8; setp.ne.u64 %r263,%r122,%r148; @ %r263 bra $L41; .loc 1 263 10 ld.u64 %r264,[%frame+480]; setp.ne.u64 %r265,%r264,0; @ %r265 bra $L40; .loc 1 264 13 mov.u64 %r266,1; st.u64 [%frame+480],%r266; $L40: .loc 1 267 6 setp.ne.u32 %r267,%r38,0; @ %r267 bra $L33; .loc 1 278 12 ld.u64 %r150,[%frame+480]; .loc 1 281 8 ld.u64 %r105,[%r151]; bra $L39; $L54: .loc 1 254 7 mov.u64 %r150,%r226; $L39: .loc 1 270 14 ld.u64 %r59,[%frame+360]; .loc 1 270 6 setp.ne.u64 %r268,%r59,0; .loc 1 271 16 selp.u64 %r59,%r59,1,%r268; .loc 1 272 14 ld.u64 %r60,[%frame+240]; .loc 1 272 6 setp.ne.u64 %r269,%r60,0; .loc 1 273 16 selp.u64 %r60,%r60,1,%r269; .loc 1 275 12 ld.u64 %r91,[%r152+40]; .loc 1 276 6 setp.ne.u64 %r270,%r91,0; .loc 1 277 14 selp.u64 %r91,%r91,1,%r270; .loc 1 282 8 ld.u64 %r106,[%r154]; .loc 1 283 8 ld.u64 %r92,[%r152]; .loc 1 285 9 setp.eq.u64 %r271,%r105,0; @ %r271 bra $L33; .loc 1 291 16 shl.b64 %r65,%r91,4; .loc 1 299 12 shl.b64 %r67,%r150,4; .loc 1 300 12 shl.b64 %r69,%r59,4; ld.u64 %r27,[%frame+120]; .loc 1 304 32 ld.u64 %r108,[%frame]; .loc 1 311 30 mul.lo.u64 %r28,%r108,%r150; .loc 1 312 16 mul.lo.u64 %r272,%r108,%r59; shl.b64 %r273,%r272,4; neg.s64 %r112,%r273; .loc 1 313 30 mul.lo.u64 %r274,%r108,%r60; .loc 1 313 16 neg.s64 %r113,%r274; shl.b64 %r138,%r93,3; setp.le.s64 %r298,%r93,1; add.u64 %r306,%frame,480; add.u64 %r307,%frame,360; add.u64 %r308,%frame,240; .loc 1 308 20 mov.u64 %r309,0; $L53: .loc 1 287 10 ld.s8 %r276,[%r107]; cvt.u16.u32 %r275,%r276; setp.eq.u16 %r277,%r275,0; @ %r277 bra $L46; .loc 1 290 12 ld.f64 %r62,[%r92]; ld.f64 %r63,[%r92+8]; .loc 1 291 16 add.u64 %r92,%r92,%r65; bra $L47; $L46: .loc 1 296 12 ld.f64 %r62,[%r106]; ld.f64 %r63,[%r106+8]; $L47: st.f64 [%r105],%r62; st.f64 [%r105+8],%r63; .loc 1 299 12 add.u64 %r105,%r105,%r67; .loc 1 300 12 add.u64 %r106,%r106,%r69; .loc 1 301 12 add.u64 %r107,%r107,%r60; .loc 1 302 15 add.u64 %r27,%r27,1; .loc 1 304 13 setp.ne.u64 %r278,%r27,%r108; @ %r278 bra $L53; .loc 1 312 16 add.u64 %r103,%r106,%r112; .loc 1 313 16 add.u64 %r104,%r107,%r113; .loc 1 315 14 @ ! 
%r298 bra $L65; bra $L33; $L52: .loc 1 308 20 st.u64 [%r99],%r309; .loc 1 311 30 mul.lo.u64 %r72,%r82,%r81; .loc 1 312 30 mul.lo.u64 %r281,%r84,%r81; .loc 1 312 16 shl.b64 %r282,%r281,4; sub.u64 %r103,%r106,%r282; .loc 1 313 30 mul.lo.u64 %r283,%r87,%r81; .loc 1 313 16 sub.u64 %r104,%r107,%r283; .loc 1 315 14 add.u64 %r99,%r99,8; add.u64 %r145,%r145,8; setp.ne.u64 %r284,%r138,%r145; @ %r284 bra $L51; bra $L33; $L65: add.u64 %r99,%frame,128; .loc 1 311 30 mov.u64 %r72,%r28; .loc 1 315 14 mov.u64 %r145,8; $L51: .loc 1 323 23 ld.u64 %r286,[%r99]; add.u64 %r81,%r286,1; st.u64 [%r99],%r81; .loc 1 324 30 add.u64 %r288,%r306,%r145; ld.u64 %r82,[%r288]; .loc 1 324 20 sub.u64 %r289,%r82,%r72; shl.b64 %r290,%r289,4; add.u64 %r105,%r105,%r290; .loc 1 325 30 add.u64 %r292,%r307,%r145; ld.u64 %r84,[%r292]; .loc 1 325 20 shl.b64 %r293,%r84,4; add.u64 %r106,%r103,%r293; .loc 1 326 30 add.u64 %r295,%r308,%r145; ld.u64 %r87,[%r295]; .loc 1 326 20 add.u64 %r107,%r104,%r87; .loc 1 304 32 add.u64 %r296,%frame,%r145; ld.u64 %r89,[%r296]; .loc 1 304 13 setp.eq.u64 %r297,%r81,%r89; @ %r297 bra $L52; .loc 1 308 20 mov.u64 %r27,0; bra $L53; $L33: .loc 1 330 1 ret; } unpack_c10.o/ 235 273 311 350 389 428 467 fortran/generated/spread_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_spread_scalar_i1 .visible .func _gfortrani_spread_scalar_i1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray114,101,116,117,114,110,32,114,97,110,107,32,116,111,111,32,108,97,114,103,101,32,105,110,32,115,112,114,101,97,100,40,410,105,109,32,111,117,116,115,105,100,101,32,111,102,32,114,97,110,107,32,105,110,32114,97,110,107,32,109,105,115,109,97,116,99,104,32,105,110,32,115,112,114,101,97,100,40,473,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,80,82,69,65,68,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10105,110,99,111,114,114,101,99,116,32,100,101,115,116,105,110,97,116,105,111,110,32,114,97,110,107,32,105,110,32,115,112,114,101,97,100,40,41100,105,109,32,116,111,111,32,108,97,114,103,101,32,105,110,32,115,112,114,101,97,100,40,41_gfortrani_spread_i1 .visible .func _gfortrani_sprea480predpredpred %r184; .reg .u64 %r185; .reg .predpredpred %r198; .reg .predpred %r228; .reg .u64 %r238; .reg .u64 %r242; .reg .pred %r243; .reg .predpredpred %r287; .reg .pred %r288; .reg .u64 %r292; .reg .predmov.u64 %r122,%ar0; mov.u64 %r123,%ar1; mov.u64 %r124,%ar2; mov.u64 %r125,%ar3; .loc 1 57 11 ld.s8 %r22,[%r123+28]; .loc 1 57 9 cvt.u32.u32 %r126,%r22; cvt.s64.s8 %r82,%r126; .loc 1 59 9 add.u64 %r83,%r82,1; .loc 1 60 6 setp.le.s64 %r127,%r83,15; @ %r127 bra $L2; .loc 1 61_gfortran_runtime_errorsetp.ge.s64 %r130,%r83,%r124; @ %r130 bra $L3; .loc 1 64_gfortran_runtime_errorr99,[%r122]; .loc 1 68 6 setp.ne.u64 %r133,%r99,0; @ %r133 bra $L4; .loc 1 75 23 st.u8 [%r122+28],%r83; .loc 1 79 7 setp.le.s64 %r134,%r83,0; @ %r134 bra $L5; .loc 1 82 19 add.u64 %r117,%r124,-1; add.u64 %r102,%r122,40; .loc 1 77 11 mov.u64 %r71,%r99; .loc 1 79 14 mov.u64 %r89,%r71; .loc 1 78 10 mov.u64 %r111,1; .loc 1 40 14 mov.u64 %r67,%r71; .loc 1 90 19 mov.u64 %r322,%r71; .loc 1 84 21 add.u64 %r323,%r125,-1; bra $L8; $L37: mov.u64 %r111,%r68; $L8: .loc 1 82 7 setp.ne.u64 %r137,%r89,%r117; @ %r137 bra $L6; .loc 1 84 21 
mov.u64 %r29,%r323; .loc 1 86 11 mul.lo.u64 %r68,%r125,%r111; mov.u64 %r67,%r111; bra $L7; $L6: .loc 1 90 19 shl.b64 %r138,%r71,3; add.u64 %r139,%frame,%r138; st.u64 [%r139+120],%r322; .loc 1 91 22 add.u64 %r143,%r71,%r71; add.u64 %r144,%r143,%r71; shl.b64 %r145,%r144,3; add.u64 %r146,%r123,%r145; ld.u64 %r149,[%r146+56]; add.u64 %r148,%r149,1; ld.u64 %r156,[%r146+48]; sub.u64 %r27,%r148,%r156; .loc 1 91 20 st.u64 [%r139],%r27; .loc 1 92 21 ld.u64 %r168,[%r146+40]; st.u64 [%r139+240],%r168; .loc 1 93 21 st.u64 [%r139+360],%r111; .loc 1 95 25 add.u64 %r29,%r27,-1; .loc 1 96 11 mul.lo.u64 %r68,%r27,%r111; .loc 1 97 11 add.u64 %r71,%r71,1; $L7: .loc 1 99 4 st.u64 [%r102+8],%r322; st.u64 [%r102+16],%r29; st.u64 [%r102],%r111; .loc 1 79 31 add.u64 %r89,%r89,1; .loc 1 79 7 add.u64 %r102,%r102,24; setp.ne.u64 %r173,%r83,%r89; @ %r173 bra $L37; bra $L45; $L5: .loc 1 101 19 st.u64 [%r122+8],%r99; .loc 1 104 24 mov.u64 %r1761(%value_in),_gfortrani_xmallocarray177,[%value_in]; } mov.u64 %r99,%r177; .loc 1 104 22 st.u64 [%r122],%r99; .loc 1 181 12 ld.u64 %r50,[%frame+240]; .loc 1 40 14 mov.u64 %r67,0; bra $L10; $L45: .loc 1 101 19 st.u64 [%r122+8],%r322; .loc 1 104 24 mov.u64 %r18181; call (%value_in),_gfortrani_xmallocarray,(%out_arg1,%out_arg2); ld.param.u64 %r182,[%value_in]; } mov.u64 %r99,%r182; .loc 1 104 22 st.u64 [%r122],%r99; .loc 1 105 10 setp.le.s64 %r184,%r68,0; @ %r184 bra $L1; .loc 1 181 12 ld.u64 %r50,[%frame+240]; bra $L10; $L4: .loc 1 115 11 ld.s8 %r185,[%r122+28]; .loc 1 115 10 setp.eq.u64 %r186,%r185,%r83; @ %r186 bra $L12; .loc 1 116 2 cvta.const.u64 %r187187_gfortran_runtime_error12: .loc 1 118 11 cvta.global.u64 %r189,_gfortrani_compile_options; ld.u32 %r75,[%r189+36]; .loc 1 118 10 setp.ne.u32 %r190,%r75,0; @ %r190 bra $L13; .loc 1 156 4 setp.gt.s64 %r191,%r185,0; @ %r191 bra $L14; .loc 1 40 14 mov.u64 %r67,0; bra $L15; $L13: .loc 1 120 4 setp.le.s64 %r192,%r185,0; @ %r192 bra $L38; .loc 1 82 19 add.u64 %r116,%r124,-1; add.u64 %r73,%r122,40; .loc 1 112 18 mov.u32 %r75,0; .loc 1 114 11 mov.u64 %r72,0; .loc 1 120 11 mov.u64 %r86,%r72; .loc 1 40 14 mov.u64 %r67,%r72; .loc 1 137 16 mov.u64 %r324,%r72; $L20: .loc 1 124 21 ld.u64 %r196,[%r73+16]; add.u64 %r195,%r196,1; .loc 1 124 19 ld.u64 %r197,[%r73+8]; sub.u64 %r85,%r195,%r197; .loc 1 125 11 setp.ne.u64 %r198,%r86,%r116; @ %r198 bra $L16; .loc 1 127 12 ld.u64 %r67,[%r73]; .loc 1 129 8 setp.eq.u64 %r199,%r125,%r85; @ %r199 bra $L17; .loc 1 130 7 st.u64 [%stack+16],%r125; st.u64 [%stack+8],%r85; add.u64 %r201,%r86,1;_gfortran_runtime_error16: .loc 1 137 16 shl.b64 %r311,%r72,3; add.u64 %r204,%frame,%r311; add.u64 %r205,%r204,120; st.u64 [%r205],%r324; .loc 1 138 19 add.u64 %r309,%r72,%r72; add.u64 %r307,%r309,%r72; shl.b64 %r210,%r307,3; add.u64 %r211,%r123,%r210; ld.u64 %r214,[%r211+56]; add.u64 %r213,%r214,1; ld.u64 %r221,[%r211+48]; sub.u64 %r41,%r213,%r221; .loc 1 138 17 st.u64 [%r204],%r41; .loc 1 139 8 setp.eq.u64 %r224,%r41,%r85; @ %r224 bra $L18; .loc 1 140 7 st.u64 [%stack+16],%r41; st.u64 [%stack+8],%r85; add.u64 %r226,%r86,1;_gfortran_runtime_error18: .loc 1 146 8 setp.gt.s64 %r228,%r41,0; .loc 1 147 18 selp.u32 %r75,%r75,1,%r228; .loc 1 148 18 ld.u64 %r238,[%r211+40]; st.u64 [%r204+240],%r238; .loc 1 149 18 ld.u64 %r242,[%r73]; st.u64 [%r204+360],%r242; .loc 1 150 8 add.u64 %r72,%r72,1; $L17: .loc 1 120 28 add.u64 %r86,%r86,1; .loc 1 120 4 add.u64 %r73,%r73,24; setp.ne.u64 %r243,%r185,%r86; @ %r243 bra $L20; bra $L21; $L14: .loc 1 82 19 add.u64 %r113,%r124,-1; add.u64 %r96,%r122,40; .loc 1 114 11 mov.u64 %r74,0; .loc 1 
[Binary payload residue: nvptx PTX assembly dumps extracted from the libgfortran object files spread_i1.o, spread_i2.o, spread_i4.o, spread_i8.o, spread_i16.o, spread_r4.o and spread_r8.o (sources fortran/generated/spread_i1.c through spread_r8.c). Each dump defines the visible functions _gfortrani_spread_<kind> and _gfortrani_spread_scalar_<kind>, declares the externs _gfortrani_compile_options, _gfortran_runtime_error and _gfortrani_xmallocarray, and embeds $LC string constants whose comma-separated ASCII byte values decode to the runtime error messages "return rank too large in spread()", "dim outside of rank in spread()", "rank mismatch in spread()", "Incorrect extent in return value of SPREAD intrinsic in dimension %ld: is %ld, should be %ld", "incorrect destination rank in spread()" and "dim too large in spread()".]
%r95,[%frame+120]; .loc 1 200 32 ld.u64 %r30,[%frame]; .loc 1 207 30 mul.lo.u64 %r99,%r50,%r30; .loc 1 208 30 mul.lo.u64 %r311,%r98,%r30; .loc 1 208 16 shl.b64 %r312,%r311,3; neg.s64 %r116,%r312; cvt.u32.u32 %r314,%r22; cvt.s64.s8 %r313,%r314; shl.b64 %r117,%r313,3; setp.le.s64 %r325,%r89,1; setp.gt.s64 %r326,%r132,0; add.u64 %r328,%frame,240; add.u64 %r329,%frame,360; .loc 1 204 20 mov.u64 %r330,0; $L32: .loc 1 190 7 @ %r326 bra $L40; bra $L44; $L46: .loc 1 196 12 add.u64 %r101,%r101,%r55; .loc 1 197 12 add.u64 %r112,%r112,%r57; .loc 1 198 15 add.u64 %r95,%r95,1; .loc 1 200 13 setp.eq.u64 %r316,%r30,%r95; @ %r316 bra $L34; bra $L32; $L_gfortrani_spread_scalar_r8 .visible .func _gfortrani_spread_scalar_r8predmov.u64 %r39,%ar2; mov.u64 %r40,%ar3; .loc 1 236 6 ld.s8 %r42,[%r37+28]; cvt.u16.u32 %r41,%r42; setp.eq.u16 %r43,%r41,1; @ %r43 bra $L48; .loc 1 237 5 cvta.const.u64 %r44,$LC4; {stack; call _gfortran_runtime_error48: .loc 1 239 6 setp.le.s64 %r46,%r39,1; @ %r46 bra $L49; .loc 1 240 5 cvta.const.u64 %r47_gfortran_runtime_error49: .loc 1 242 10 ld.u64 %r23,[%r37]; .loc 1 246 7 add.u64 %r36,%r40,-1; .loc 1 242 6 setp.ne.u64 %r49,%r23,0; @ %r49 bra $L50; .loc 1 244 24 mov.u64 _gfortrani_xmallocarray244 22 st.u64 [%r37],%r23; .loc 1 245 19 mov.u64 %r54,0; st.u64 [%r37+8],%r54; .loc 1 246 7 st.u64 [%r37+48],%r54; st.u64 [%r37+56],%r36; mov.u64 %r56,1; st.u64 [%r37+40],%r56; mov.u64 %r30,%r56; bra $L51; $L50: .loc 1 251 9 ld.u64 %r30,[%r37+40]; .loc 1 250 26 ld.u64 %r58,[%r37+56]; ld.u64 %r60,[%r37+48]; .loc 1 250 55 sub.u64 %r61,%r58,%r60; .loc 1 251 7 div.s64 %r63,%r61,%r30; .loc 1 250 10 setp.ge.s64 %r64,%r63,%r36; @ %r64 bra $L51; .loc 1 252 2 cvta.const.u64 %r65,$LC5stack; call _gfortran_runtime_error1: .loc 1 258 3 setp.gt.s64 %r67,%r40,0; @ ! %r67 bra $L47; .loc 1 261 12 shl.b64 %r34,%r30,3; .loc 1 258 19 mov.u64 %r35,0; $L53: .loc 1 260 15 ld.f64 %r32,[%r38]; .loc 1 260 13 st.f64 [%r23],%r32; .loc 1 261 12 add.u64 %r23,%r23,%r34; .loc 1 258 40 add.u64 %r35,%r35,1; .loc 1 258 3 setp.ne.u64 %r68,%r40,%r35; @ %r68 bra $L53; $L47: .loc 1 263 1 ret; } spread_r10.o/ spread_c4.o/_gfortrani_spread_c4 .visible .func _gfortrani_spread_cfortran/generated/spread_c4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_spread_scalar_c4 .visible .func _gfortrani_spread_scalar_c4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray114,101,116,117,114,110,32,114,97,110,107,32,116,111,111,32,108,97,114,103,101,32,105,110,32,115,112,114,101,97,100,40,410,105,109,32,111,117,116,115,105,100,101,32,111,102,32,114,97,110,107,32,105,110,32114,97,110,107,32,109,105,115,109,97,116,99,104,32,105,110,32,115,112,114,101,97,100,40,473,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,80,82,69,65,68,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10105,110,99,111,114,114,101,99,116,32,100,101,115,116,105,110,97,116,105,111,110,32,114,97,110,107,32,105,110,32,115,112,114,101,97,100,40,41100,105,109,32,116,111,111,32,108,97,114,103,101,32,105,110,32,115,112,114,101,97,100,40,41_gfortrani_spread_c4 .visible .func _gfortrani_spread_c480pred %r142; .reg .predpredpred %r192; .reg .u64pred %r206; .reg .predpred %r251; .reg .predu64 %r265; .reg .u64 %r266; .reg .u64 %r273; .reg 
.pred %r276; .reg .u64 %r286; .reg .predpred %r293; .reg .pred %r294; .reg .predpredu64 %r331; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64 %r336; mov.u64 %r130,%ar0; mov.u64 %r131,%ar1; mov.u64 %r132,%ar2; mov.u64 %r133,%ar3; .loc 1 57 11 ld.s8 %r22,[%r131+28]; .loc 1 57 9 cvt.u32.u32 %r134,%r22; cvt.s64.s8 %r90,%r134; .loc 1 59 9 add.u64 %r91,%r90,1; .loc 1 60 6 setp.le.s64 %r135,%r91,15; @ %r135 bra $L2; .loc 1 61 5 cvta.const.u64 %r136136_gfortran_runtime_errorsetp.ge.s64 %r138,%r91,%r132; @ %r138 bra $L3; .loc 1 64_gfortran_runtime_errorr113,[%r130]; .loc 1 68 6 setp.ne.u64 %r141,%r113,0; @ %r141 bra $L4; .loc 1 75 23 st.u8 [%r130+28],%r91; .loc 1 79 7 setp.le.s64 %r142,%r91,0; @ %r142 bra $L5; .loc 1 82 19 add.u64 %r128,%r132,-1; add.u64 %r116,%r130,40; .loc 1 77 11 mov.u64 %r81,%r113; .loc 1 79 14 mov.u64 %r98,%r81; .loc 1 78 10 mov.u64 %r122,1; .loc 1 40 14 mov.u64 %r75,%r81; .loc 1 90 19 mov.u64 %r333,%r81; .loc 1 84 21 add.u64 %r334,%r133,-1; bra $L8; $L36: mov.u64 %r122,%r77; $L8: .loc 1 82 7 setp.ne.u64 %r145,%r98,%r128; @ %r145 bra $L6; .loc 1 84 21 mov.u64 %r29,%r334; .loc 1 86 11 mul.lo.u64 %r77,%r133,%r122; mov.u64 %r75,%r122; bra $L7; $L6: .loc 1 90 19 shl.b64 %r146,%r81,3; add.u64 %r147,%frame,%r146; st.u64 [%r147+120],%r333; .loc 1 91 22 add.u64 %r151,%r81,%r81; add.u64 %r152,%r151,%r81; shl.b64 %r153,%r152,3; add.u64 %r154,%r131,%r153; ld.u64 %r157,[%r154+56]; add.u64 %r156,%r157,1; ld.u64 %r164,[%r154+48]; sub.u64 %r27,%r156,%r164; .loc 1 91 20 st.u64 [%r147],%r27; .loc 1 92 21 ld.u64 %r176,[%r154+40]; st.u64 [%r147+240],%r176; .loc 1 93 21 st.u64 [%r147+360],%r122; .loc 1 95 25 add.u64 %r29,%r27,-1; .loc 1 96 11 mul.lo.u64 %r77,%r27,%r122; .loc 1 97 11 add.u64 %r81,%r81,1; $L7: .loc 1 99 4 st.u64 [%r116+8],%r333; st.u64 [%r116+16],%r29; st.u64 [%r116],%r122; .loc 1 79 31 add.u64 %r98,%r98,1; .loc 1 79 7 add.u64 %r116,%r116,24; setp.ne.u64 %r181,%r98,%r91; @ %r181 bra $L36; bra $L45; $L5: .loc 1 101 19 st.u64 [%r130+8],%r113; .loc 1 104 24 mov.u64 %r184,8; mov.u64 %r1818(%value_in),_gfortrani_xmallocarray185,[%value_in]; } mov.u64 %r113,%r185; .loc 1 104 22 st.u64 [%r130],%r113; .loc 1 181 12 ld.u64 %r50,[%frame+240]; .loc 1 40 14 mov.u64 %r75,0; bra $L10; $L45: .loc 1 101 19 st.u64 [%r130+8],%r333; .loc 1 104 24 mov.u64 %r1897189; call (%value_in),_gfortrani_xmallocarray190,[%value_in]; } mov.u64 %r113,%r190; .loc 1 104 22 st.u64 [%r130],%r113; .loc 1 105 10 setp.le.s64 %r192,%r77,0; @ %r192 bra $L1; .loc 1 181 12 ld.u64 %r50,[%frame+240]; bra $L10; $L4: .loc 1 115 11 ld.s8 %r193,[%r130+28]; .loc 1 115 10 setp.eq.u64 %r194,%r193,%r91; @ %r194 bra $L12; .loc 1 116 2 cvta.const.u64 %r1955_gfortran_runtime_error12: .loc 1 118 11 cvta.global.u64 %r197,_gfortrani_compile_options; ld.u32 %r86,[%r197+36]; .loc 1 118 10 setp.ne.u32 %r198,%r86,0; @ %r198 bra $L13; .loc 1 156 4 setp.gt.s64 %r199,%r193,0; @ %r199 bra $L14; .loc 1 40 14 mov.u64 %r75,0; bra $L15; $L13: .loc 1 120 4 setp.le.s64 %r200,%r193,0; @ %r200 bra $L37; .loc 1 82 19 add.u64 %r127,%r132,-1; add.u64 %r107,%r130,40; .loc 1 112 18 mov.u32 %r86,0; .loc 1 114 11 mov.u64 %r83,0; .loc 1 120 11 mov.u64 %r95,%r83; .loc 1 40 14 mov.u64 %r75,%r83; .loc 1 137 16 mov.u64 %r335,%r83; $L20: .loc 1 124 21 ld.u64 %r204,[%r107+16]; add.u64 %r203,%r204,1; .loc 1 124 19 ld.u64 %r205,[%r107+8]; sub.u64 %r94,%r203,%r205; .loc 1 125 11 setp.ne.u64 %r206,%r95,%r127; @ %r206 bra $L16; .loc 1 127 12 ld.u64 %r75,[%r107]; .loc 1 129 8 setp.eq.u64 %r207,%r133,%r94; @ %r207 bra $L17; .loc 1 130 7 st.u64 [%stack+16],%r133; 
st.u64 [%stack+8],%r94; add.u64 %r209,%r95,1;_gfortran_runtime_error16: .loc 1 137 16 shl.b64 %r323,%r83,3; add.u64 %r212,%frame,%r323; add.u64 %r213,%r212,120; st.u64 [%r213],%r335; .loc 1 138 19 add.u64 %r318,%r83,%r83; add.u64 %r320,%r318,%r83; shl.b64 %r218,%r320,3; add.u64 %r219,%r131,%r218; ld.u64 %r222,[%r219+56]; add.u64 %r221,%r222,1; ld.u64 %r229,[%r219+48]; sub.u64 %r41,%r221,%r229; .loc 1 138 17 st.u64 [%r212],%r41; .loc 1 139 8 setp.eq.u64 %r232,%r41,%r94; @ %r232 bra $L18; .loc 1 140 7 st.u64 [%stack+16],%r41; st.u64 [%stack+8],%r94; add.u64 %r234,%r95,1;_gfortran_runtime_error18: .loc 1 146 8 setp.gt.s64 %r236,%r41,0; .loc 1 147 18 selp.u32 %r86,%r86,1,%r236; .loc 1 148 18 ld.u64 %r246,[%r219+40]; st.u64 [%r212+240],%r246; .loc 1 149 18 ld.u64 %r250,[%r107]; st.u64 [%r212+360],%r250; .loc 1 150 8 add.u64 %r83,%r83,1; $L17: .loc 1 120 28 add.u64 %r95,%r95,1; .loc 1 120 4 add.u64 %r107,%r107,24; setp.ne.u64 %r251,%r193,%r95; @ %r251 bra $L20; bra $L21; $L14: .loc 1 82 19 add.u64 %r126,%r132,-1; add.u64 %r76,%r130,40; .loc 1 114 11 mov.u64 %r85,0; .loc 1 156 11 mov.u64 %r93,%r85; .loc 1 40 14 mov.u64 %r75,%r85; .loc 1 164 16 mov.u64 %r336,%r85; $L24: .loc 1 160 12 ld.u64 %r129,[%r76]; .loc 1 158 11 setp.eq.u64 %r254,%r93,%r126; @ %r254 bra $L38; .loc 1 164 16 shl.b64 %r324,%r85,3; add.u64 %r256,%frame,%r324; add.u64 %r257,%r256,120; st.u64 [%r257],%r336; .loc 1 165 19 add.u64 %r321,%r85,%r85; add.u64 %r322,%r321,%r85; shl.b64 %r262,%r322,3; add.u64 %r263,%r131,%r262; ld.u64 %r266,[%r263+56]; add.u64 %r265,%r266,1; ld.u64 %r273,[%r263+48]; sub.u64 %r48,%r265,%r273; .loc 1 165 17 st.u64 [%r256],%r48; .loc 1 166 8 setp.gt.s64 %r276,%r48,0; .loc 1 167 18 selp.u32 %r86,%r86,1,%r276; .loc 1 168 18 ld.u64 %r286,[%r263+40]; st.u64 [%r256+240],%r286; .loc 1 169 18 st.u64 [%r256+360],%r129; .loc 1 170 8 add.u64 %r85,%r85,1; bra $L22; $L38: .loc 1 160 12 mov.u64 %r75,%r129; $L22: .loc 1 156 28 add.u64 %r93,%r93,1; .loc 1 156 4 add.u64 %r76,%r76,24; setp.ne.u64 %r290,%r93,%r193; @ %r290 bra $L24; $L21: .loc 1 175 10 setp.ne.u32 %r291,%r86,0; @ ! 
%r291 bra $L15; bra $L1; $L37: .loc 1 40 14 mov.u64 %r75,0; $L15: .loc 1 178 18 ld.u64 %r50,[%frame+240]; .loc 1 178 10 setp.eq.u64 %r292,%r50,0; selp.u64 %r50,1,%r50,%r292; $L10: .loc 1 182 12 ld.u64 %r99,[%frame+360]; .loc 1 184 8 ld.u64 %r102,[%r131]; .loc 1 186 9 setp.ne.u64 %r293,%r102,0; @ %r293 bra $L25; bra $L1; $L40: .loc 1 190 7 mov.u64 %r103,%r113; .loc 1 190 14 mov.u64 %r104,0; $L26: .loc 1 192 12 ld.f32 %r51,[%r102]; ld.f32 %r52,[%r102+4]; .loc 1 192 10 st.f32 [%r103],%r51; st.f32 [%r103+4],%r52; .loc 1 193 16 add.u64 %r103,%r103,%r54; .loc 1 190 33 add.u64 %r104,%r104,1; .loc 1 190 7 setp.ne.u64 %r294,%r133,%r104; @ %r294 bra $L26; bra $L46; $L44: .loc 1 196 12 add.u64 %r102,%r102,%r56; .loc 1 197 12 add.u64 %r113,%r113,%r58; .loc 1 198 15 add.u64 %r96,%r96,1; .loc 1 200 13 setp.ne.u64 %r295,%r30,%r96; @ %r295 bra $L44; $L34: .loc 1 208 16 add.u64 %r101,%r113,%r117; .loc 1 210 14 @ %r326 bra $L1; add.u64 %r109,%frame,128; .loc 1 207 30 mov.u64 %r60,%r100; .loc 1 210 14 mov.u64 %r125,8; bra $L30; $L31: .loc 1 204 20 st.u64 [%r109],%r331; .loc 1 207 30 mul.lo.u64 %r60,%r68,%r67; .loc 1 208 30 mul.lo.u64 %r299,%r70,%r67; .loc 1 208 16 shl.b64 %r300,%r299,3; sub.u64 %r101,%r113,%r300; .loc 1 210 14 add.u64 %r109,%r109,8; add.u64 %r125,%r125,8; setp.eq.u64 %r301,%r118,%r125; @ %r301 bra $L1; $L30: .loc 1 218 23 ld.u64 %r302,[%r109]; add.u64 %r67,%r302,1; st.u64 [%r109],%r67; .loc 1 219 30 add.u64 %r304,%r329,%r125; ld.u64 %r68,[%r304]; .loc 1 219 20 sub.u64 %r305,%r68,%r60; shl.b64 %r306,%r305,3; add.u64 %r102,%r102,%r306; .loc 1 220 30 add.u64 %r308,%r330,%r125; ld.u64 %r70,[%r308]; .loc 1 220 20 shl.b64 %r309,%r70,3; add.u64 %r113,%r101,%r309; .loc 1 200 32 add.u64 %r310,%frame,%r125; ld.u64 %r73,[%r310]; .loc 1 200 13 setp.eq.u64 %r311,%r67,%r73; @ %r311 bra $L31; .loc 1 204 20 mov.u64 %r96,0; bra $L32; $L25: .loc 1 193 16 shl.b64 %r54,%r75,3; .loc 1 196 12 shl.b64 %r56,%r50,3; .loc 1 197 12 shl.b64 %r58,%r99,3; ld.u64 %r96,[%frame+120]; .loc 1 200 32 ld.u64 %r30,[%frame]; .loc 1 207 30 mul.lo.u64 %r100,%r50,%r30; .loc 1 208 30 mul.lo.u64 %r312,%r99,%r30; .loc 1 208 16 shl.b64 %r313,%r312,3; neg.s64 %r117,%r313; cvt.u32.u32 %r315,%r22; cvt.s64.s8 %r314,%r315; shl.b64 %r118,%r314,3; setp.le.s64 %r326,%r90,1; setp.gt.s64 %r327,%r133,0; add.u64 %r329,%frame,240; add.u64 %r330,%frame,360; .loc 1 204 20 mov.u64 %r331,0; $L32: .loc 1 190 7 @ %r327 bra $L40; bra $L44; $L46: .loc 1 196 12 add.u64 %r102,%r102,%r56; .loc 1 197 12 add.u64 %r113,%r113,%r58; .loc 1 198 15 add.u64 %r96,%r96,1; .loc 1 200 13 setp.eq.u64 %r317,%r30,%r96; @ %r317 bra $L34; bra $L32; $L_gfortrani_spread_scalar_c4 .visible .func _gfortrani_spread_scalar_cpred %r47; .reg .u64 %r48; .reg .pred.reg .u64 %r62; .reg .u64pred %r68; .reg .predmov.u64 %r40,%ar2; mov.u64 %r41,%ar3; .loc 1 236 6 ld.s8 %r43,[%r38+28]; cvt.u16.u32 %r42,%r43; setp.eq.u16 %r44,%r42,1; @ %r44 bra $L48; .loc 1 237 5 cvta.const.u64 %r45,$LC4stack; call _gfortran_runtime_error48: .loc 1 239 6 setp.le.s64 %r47,%r40,1; @ %r47 bra $L49; .loc 1 240 5 cvta.const.u64 %r48_gfortran_runtime_error49: .loc 1 242 10 ld.u64 %r23,[%r38]; .loc 1 246 7 add.u64 %r37,%r41,-1; .loc 1 242 6 setp.ne.u64 %r50,%r23,0; @ %r50 bra $L50; .loc 1 244 24 mov.u64 52; call (%value_in),_gfortrani_xmallocarray244 22 st.u64 [%r38],%r23; .loc 1 245 19 mov.u64 %r55,0; st.u64 [%r38+8],%r55; .loc 1 246 7 st.u64 [%r38+48],%r55; st.u64 [%r38+56],%r37; mov.u64 %r57,1; st.u64 [%r38+40],%r57; mov.u64 %r30,%r57; bra $L51; $L50: .loc 1 251 9 ld.u64 %r30,[%r38+40]; .loc 1 250 26 ld.u64 
%r59,[%r38+56]; ld.u64 %r61,[%r38+48]; .loc 1 250 55 sub.u64 %r62,%r59,%r61; .loc 1 251 7 div.s64 %r64,%r62,%r30; .loc 1 250 10 setp.ge.s64 %r65,%r64,%r37; @ %r65 bra $L51; .loc 1 252 2_gfortran_runtime_error1: .loc 1 258 3 setp.gt.s64 %r68,%r41,0; @ ! %r68 bra $L47; .loc 1 261 12 shl.b64 %r35,%r30,3; .loc 1 258 19 mov.u64 %r36,0; $L53: .loc 1 260 15 ld.f32 %r32,[%r39]; ld.f32 %r33,[%r39+4]; .loc 1 260 13 st.f32 [%r23],%r32; st.f32 [%r23+4],%r33; .loc 1 261 12 add.u64 %r23,%r23,%r35; .loc 1 258 40 add.u64 %r36,%r36,1; .loc 1 258 3 setp.ne.u64 %r69,%r41,%r36; @ %r69 bra $L53; $L47: .loc 1 263 1 ret; } spread_c8.o/fortran/generated/spread_c8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_spread_scalar_c8 .visible .func _gfortrani_spread_scalar_c8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray114,101,116,117,114,110,32,114,97,110,107,32,116,111,111,32,108,97,114,103,101,32,105,110,32,115,112,114,101,97,100,40,410,105,109,32,111,117,116,115,105,100,101,32,111,102,32,114,97,110,107,32,105,110,32114,97,110,107,32,109,105,115,109,97,116,99,104,32,105,110,32,115,112,114,101,97,100,40,473,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,83,80,82,69,65,68,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10105,110,99,111,114,114,101,99,116,32,100,101,115,116,105,110,97,116,105,111,110,32,114,97,110,107,32,105,110,32,115,112,114,101,97,100,40,41100,105,109,32,116,111,111,32,108,97,114,103,101,32,105,110,32,115,112,114,101,97,100,40,41_gfortrani_spread_c8 .visible .func _gfortrani_spread_c8480pred %r142; .reg .predpredpred %r192; .reg .u64pred %r206; .reg .predpred %r251; .reg .predu64 %r265; .reg .u64 %r266; .reg .u64 %r273; .reg .pred %r276; .reg .u64 %r286; .reg .predpred %r293; .reg .pred %r294; .reg .predpredu64 %r331; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64 %r336; mov.u64 %r130,%ar0; mov.u64 %r131,%ar1; mov.u64 %r132,%ar2; mov.u64 %r133,%ar3; .loc 1 57 11 ld.s8 %r22,[%r131+28]; .loc 1 57 9 cvt.u32.u32 %r134,%r22; cvt.s64.s8 %r90,%r134; .loc 1 59 9 add.u64 %r91,%r90,1; .loc 1 60 6 setp.le.s64 %r135,%r91,15; @ %r135 bra $L2; .loc 1 61 5 cvta.const.u64 %r136136_gfortran_runtime_errorsetp.ge.s64 %r138,%r91,%r132; @ %r138 bra $L3; .loc 1 64_gfortran_runtime_errorr113,[%r130]; .loc 1 68 6 setp.ne.u64 %r141,%r113,0; @ %r141 bra $L4; .loc 1 75 23 st.u8 [%r130+28],%r91; .loc 1 79 7 setp.le.s64 %r142,%r91,0; @ %r142 bra $L5; .loc 1 82 19 add.u64 %r128,%r132,-1; add.u64 %r116,%r130,40; .loc 1 77 11 mov.u64 %r81,%r113; .loc 1 79 14 mov.u64 %r98,%r81; .loc 1 78 10 mov.u64 %r122,1; .loc 1 40 14 mov.u64 %r75,%r81; .loc 1 90 19 mov.u64 %r333,%r81; .loc 1 84 21 add.u64 %r334,%r133,-1; bra $L8; $L36: mov.u64 %r122,%r77; $L8: .loc 1 82 7 setp.ne.u64 %r145,%r98,%r128; @ %r145 bra $L6; .loc 1 84 21 mov.u64 %r29,%r334; .loc 1 86 11 mul.lo.u64 %r77,%r133,%r122; mov.u64 %r75,%r122; bra $L7; $L6: .loc 1 90 19 shl.b64 %r146,%r81,3; add.u64 %r147,%frame,%r146; st.u64 [%r147+120],%r333; .loc 1 91 22 add.u64 %r151,%r81,%r81; add.u64 %r152,%r151,%r81; shl.b64 %r153,%r152,3; add.u64 %r154,%r131,%r153; ld.u64 %r157,[%r154+56]; add.u64 %r156,%r157,1; ld.u64 %r164,[%r154+48]; sub.u64 %r27,%r156,%r164; .loc 1 91 20 st.u64 [%r147],%r27; .loc 1 92 21 ld.u64 
%r176,[%r154+40]; st.u64 [%r147+240],%r176; .loc 1 93 21 st.u64 [%r147+360],%r122; .loc 1 95 25 add.u64 %r29,%r27,-1; .loc 1 96 11 mul.lo.u64 %r77,%r27,%r122; .loc 1 97 11 add.u64 %r81,%r81,1; $L7: .loc 1 99 4 st.u64 [%r116+8],%r333; st.u64 [%r116+16],%r29; st.u64 [%r116],%r122; .loc 1 79 31 add.u64 %r98,%r98,1; .loc 1 79 7 add.u64 %r116,%r116,24; setp.ne.u64 %r181,%r98,%r91; @ %r181 bra $L36; bra $L45; $L5: .loc 1 101 19 st.u64 [%r130+8],%r113; .loc 1 104 24 mov.u64 %r184,16; mov.u64 %r1818(%value_in),_gfortrani_xmallocarray185,[%value_in]; } mov.u64 %r113,%r185; .loc 1 104 22 st.u64 [%r130],%r113; .loc 1 181 12 ld.u64 %r50,[%frame+240]; .loc 1 40 14 mov.u64 %r75,0; bra $L10; $L45: .loc 1 101 19 st.u64 [%r130+8],%r333; .loc 1 104 24 mov.u64 %r189,167189; call (%value_in),_gfortrani_xmallocarray190,[%value_in]; } mov.u64 %r113,%r190; .loc 1 104 22 st.u64 [%r130],%r113; .loc 1 105 10 setp.le.s64 %r192,%r77,0; @ %r192 bra $L1; .loc 1 181 12 ld.u64 %r50,[%frame+240]; bra $L10; $L4: .loc 1 115 11 ld.s8 %r193,[%r130+28]; .loc 1 115 10 setp.eq.u64 %r194,%r193,%r91; @ %r194 bra $L12; .loc 1 116 2 cvta.const.u64 %r1955_gfortran_runtime_error12: .loc 1 118 11 cvta.global.u64 %r197,_gfortrani_compile_options; ld.u32 %r86,[%r197+36]; .loc 1 118 10 setp.ne.u32 %r198,%r86,0; @ %r198 bra $L13; .loc 1 156 4 setp.gt.s64 %r199,%r193,0; @ %r199 bra $L14; .loc 1 40 14 mov.u64 %r75,0; bra $L15; $L13: .loc 1 120 4 setp.le.s64 %r200,%r193,0; @ %r200 bra $L37; .loc 1 82 19 add.u64 %r127,%r132,-1; add.u64 %r107,%r130,40; .loc 1 112 18 mov.u32 %r86,0; .loc 1 114 11 mov.u64 %r83,0; .loc 1 120 11 mov.u64 %r95,%r83; .loc 1 40 14 mov.u64 %r75,%r83; .loc 1 137 16 mov.u64 %r335,%r83; $L20: .loc 1 124 21 ld.u64 %r204,[%r107+16]; add.u64 %r203,%r204,1; .loc 1 124 19 ld.u64 %r205,[%r107+8]; sub.u64 %r94,%r203,%r205; .loc 1 125 11 setp.ne.u64 %r206,%r95,%r127; @ %r206 bra $L16; .loc 1 127 12 ld.u64 %r75,[%r107]; .loc 1 129 8 setp.eq.u64 %r207,%r133,%r94; @ %r207 bra $L17; .loc 1 130 7 st.u64 [%stack+16],%r133; st.u64 [%stack+8],%r94; add.u64 %r209,%r95,1;_gfortran_runtime_error16: .loc 1 137 16 shl.b64 %r323,%r83,3; add.u64 %r212,%frame,%r323; add.u64 %r213,%r212,120; st.u64 [%r213],%r335; .loc 1 138 19 add.u64 %r318,%r83,%r83; add.u64 %r320,%r318,%r83; shl.b64 %r218,%r320,3; add.u64 %r219,%r131,%r218; ld.u64 %r222,[%r219+56]; add.u64 %r221,%r222,1; ld.u64 %r229,[%r219+48]; sub.u64 %r41,%r221,%r229; .loc 1 138 17 st.u64 [%r212],%r41; .loc 1 139 8 setp.eq.u64 %r232,%r41,%r94; @ %r232 bra $L18; .loc 1 140 7 st.u64 [%stack+16],%r41; st.u64 [%stack+8],%r94; add.u64 %r234,%r95,1;_gfortran_runtime_error18: .loc 1 146 8 setp.gt.s64 %r236,%r41,0; .loc 1 147 18 selp.u32 %r86,%r86,1,%r236; .loc 1 148 18 ld.u64 %r246,[%r219+40]; st.u64 [%r212+240],%r246; .loc 1 149 18 ld.u64 %r250,[%r107]; st.u64 [%r212+360],%r250; .loc 1 150 8 add.u64 %r83,%r83,1; $L17: .loc 1 120 28 add.u64 %r95,%r95,1; .loc 1 120 4 add.u64 %r107,%r107,24; setp.ne.u64 %r251,%r193,%r95; @ %r251 bra $L20; bra $L21; $L14: .loc 1 82 19 add.u64 %r126,%r132,-1; add.u64 %r76,%r130,40; .loc 1 114 11 mov.u64 %r85,0; .loc 1 156 11 mov.u64 %r93,%r85; .loc 1 40 14 mov.u64 %r75,%r85; .loc 1 164 16 mov.u64 %r336,%r85; $L24: .loc 1 160 12 ld.u64 %r129,[%r76]; .loc 1 158 11 setp.eq.u64 %r254,%r93,%r126; @ %r254 bra $L38; .loc 1 164 16 shl.b64 %r324,%r85,3; add.u64 %r256,%frame,%r324; add.u64 %r257,%r256,120; st.u64 [%r257],%r336; .loc 1 165 19 add.u64 %r321,%r85,%r85; add.u64 %r322,%r321,%r85; shl.b64 %r262,%r322,3; add.u64 %r263,%r131,%r262; ld.u64 %r266,[%r263+56]; add.u64 
%r265,%r266,1; ld.u64 %r273,[%r263+48]; sub.u64 %r48,%r265,%r273; .loc 1 165 17 st.u64 [%r256],%r48; .loc 1 166 8 setp.gt.s64 %r276,%r48,0; .loc 1 167 18 selp.u32 %r86,%r86,1,%r276; .loc 1 168 18 ld.u64 %r286,[%r263+40]; st.u64 [%r256+240],%r286; .loc 1 169 18 st.u64 [%r256+360],%r129; .loc 1 170 8 add.u64 %r85,%r85,1; bra $L22; $L38: .loc 1 160 12 mov.u64 %r75,%r129; $L22: .loc 1 156 28 add.u64 %r93,%r93,1; .loc 1 156 4 add.u64 %r76,%r76,24; setp.ne.u64 %r290,%r93,%r193; @ %r290 bra $L24; $L21: .loc 1 175 10 setp.ne.u32 %r291,%r86,0; @ ! %r291 bra $L15; bra $L1; $L37: .loc 1 40 14 mov.u64 %r75,0; $L15: .loc 1 178 18 ld.u64 %r50,[%frame+240]; .loc 1 178 10 setp.eq.u64 %r292,%r50,0; selp.u64 %r50,1,%r50,%r292; $L10: .loc 1 182 12 ld.u64 %r99,[%frame+360]; .loc 1 184 8 ld.u64 %r102,[%r131]; .loc 1 186 9 setp.ne.u64 %r293,%r102,0; @ %r293 bra $L25; bra $L1; $L40: .loc 1 190 7 mov.u64 %r103,%r113; .loc 1 190 14 mov.u64 %r104,0; $L26: .loc 1 192 12 ld.f64 %r51,[%r102]; ld.f64 %r52,[%r102+8]; .loc 1 192 10 st.f64 [%r103],%r51; st.f64 [%r103+8],%r52; .loc 1 193 16 add.u64 %r103,%r103,%r54; .loc 1 190 33 add.u64 %r104,%r104,1; .loc 1 190 7 setp.ne.u64 %r294,%r133,%r104; @ %r294 bra $L26; bra $L46; $L44: .loc 1 196 12 add.u64 %r102,%r102,%r56; .loc 1 197 12 add.u64 %r113,%r113,%r58; .loc 1 198 15 add.u64 %r96,%r96,1; .loc 1 200 13 setp.ne.u64 %r295,%r30,%r96; @ %r295 bra $L44; $L34: .loc 1 208 16 add.u64 %r101,%r113,%r117; .loc 1 210 14 @ %r326 bra $L1; add.u64 %r109,%frame,128; .loc 1 207 30 mov.u64 %r60,%r100; .loc 1 210 14 mov.u64 %r125,8; bra $L30; $L31: .loc 1 204 20 st.u64 [%r109],%r331; .loc 1 207 30 mul.lo.u64 %r60,%r68,%r67; .loc 1 208 30 mul.lo.u64 %r299,%r70,%r67; .loc 1 208 16 shl.b64 %r300,%r299,4; sub.u64 %r101,%r113,%r300; .loc 1 210 14 add.u64 %r109,%r109,8; add.u64 %r125,%r125,8; setp.eq.u64 %r301,%r118,%r125; @ %r301 bra $L1; $L30: .loc 1 218 23 ld.u64 %r302,[%r109]; add.u64 %r67,%r302,1; st.u64 [%r109],%r67; .loc 1 219 30 add.u64 %r304,%r329,%r125; ld.u64 %r68,[%r304]; .loc 1 219 20 sub.u64 %r305,%r68,%r60; shl.b64 %r306,%r305,4; add.u64 %r102,%r102,%r306; .loc 1 220 30 add.u64 %r308,%r330,%r125; ld.u64 %r70,[%r308]; .loc 1 220 20 shl.b64 %r309,%r70,4; add.u64 %r113,%r101,%r309; .loc 1 200 32 add.u64 %r310,%frame,%r125; ld.u64 %r73,[%r310]; .loc 1 200 13 setp.eq.u64 %r311,%r67,%r73; @ %r311 bra $L31; .loc 1 204 20 mov.u64 %r96,0; bra $L32; $L25: .loc 1 193 16 shl.b64 %r54,%r75,4; .loc 1 196 12 shl.b64 %r56,%r50,4; .loc 1 197 12 shl.b64 %r58,%r99,4; ld.u64 %r96,[%frame+120]; .loc 1 200 32 ld.u64 %r30,[%frame]; .loc 1 207 30 mul.lo.u64 %r100,%r50,%r30; .loc 1 208 30 mul.lo.u64 %r312,%r99,%r30; .loc 1 208 16 shl.b64 %r313,%r312,4; neg.s64 %r117,%r313; cvt.u32.u32 %r315,%r22; cvt.s64.s8 %r314,%r315; shl.b64 %r118,%r314,3; setp.le.s64 %r326,%r90,1; setp.gt.s64 %r327,%r133,0; add.u64 %r329,%frame,240; add.u64 %r330,%frame,360; .loc 1 204 20 mov.u64 %r331,0; $L32: .loc 1 190 7 @ %r327 bra $L40; bra $L44; $L46: .loc 1 196 12 add.u64 %r102,%r102,%r56; .loc 1 197 12 add.u64 %r113,%r113,%r58; .loc 1 198 15 add.u64 %r96,%r96,1; .loc 1 200 13 setp.eq.u64 %r317,%r30,%r96; @ %r317 bra $L34; bra $L32; $L_gfortrani_spread_scalar_c8 .visible .func _gfortrani_spread_scalar_c8pred %r47; .reg .u64 %r48; .reg .pred.reg .u64 %r62; .reg .u64pred %r68; .reg .predmov.u64 %r40,%ar2; mov.u64 %r41,%ar3; .loc 1 236 6 ld.s8 %r43,[%r38+28]; cvt.u16.u32 %r42,%r43; setp.eq.u16 %r44,%r42,1; @ %r44 bra $L48; .loc 1 237 5 cvta.const.u64 %r45,$LC4stack; call _gfortran_runtime_error48: .loc 1 239 6 setp.le.s64 
%r47,%r40,1; @ %r47 bra $L49; .loc 1 240 5 cvta.const.u64 %r48_gfortran_runtime_error49: .loc 1 242 10 ld.u64 %r23,[%r38]; .loc 1 246 7 add.u64 %r37,%r41,-1; .loc 1 242 6 setp.ne.u64 %r50,%r23,0; @ %r50 bra $L50; .loc 1 244 24call (%value_in),_gfortrani_xmallocarray244 22 st.u64 [%r38],%r23; .loc 1 245 19 mov.u64 %r55,0; st.u64 [%r38+8],%r55; .loc 1 246 7 st.u64 [%r38+48],%r55; st.u64 [%r38+56],%r37; mov.u64 %r57,1; st.u64 [%r38+40],%r57; mov.u64 %r30,%r57; bra $L51; $L50: .loc 1 251 9 ld.u64 %r30,[%r38+40]; .loc 1 250 26 ld.u64 %r59,[%r38+56]; ld.u64 %r61,[%r38+48]; .loc 1 250 55 sub.u64 %r62,%r59,%r61; .loc 1 251 7 div.s64 %r64,%r62,%r30; .loc 1 250 10 setp.ge.s64 %r65,%r64,%r37; @ %r65 bra $L51; .loc 1 252 2_gfortran_runtime_error1: .loc 1 258 3 setp.gt.s64 %r68,%r41,0; @ ! %r68 bra $L47; .loc 1 261 12 shl.b64 %r35,%r30,4; .loc 1 258 19 mov.u64 %r36,0; $L53: .loc 1 260 15 ld.f64 %r32,[%r39]; ld.f64 %r33,[%r39+8]; .loc 1 260 13 st.f64 [%r23],%r32; st.f64 [%r23+8],%r33; .loc 1 261 12 add.u64 %r23,%r23,%r35; .loc 1 258 40 add.u64 %r36,%r36,1; .loc 1 258 3 setp.ne.u64 %r69,%r41,%r36; @ %r69 bra $L53; $L47: .loc 1 263 1 ret; } spread_c10.o/ cshift0_i1.o/gfortrani_cshift0_i1 .visible .func _gfortrani_cshift0_i1.file 1 "../../../libgfortran/generated/cshift0_i1.c"F: _gfortrani_cshift0_i1 .visible .func _gfortrani_cshift0_i116 .b8 %frame_ar[4808predu64 %r177; .reg .u32predpred %r204; .reg .pred %r205; .reg .predu32 %r225; .reg .u32 %r226; .reg .u16 %r228; .reg .u16 %r229; .reg .u16predu64 %r328; .reg .u64 %r329; mov.u64 %r149,%ar0; mov.u64 %r150,%ar1; mov.u64 %r151,%ar2; mov.u32 %r152,%ar3; .loc 1 57 9 add.u32 %r98,%r152,-1; .loc 1 58 14 mov.u64 %r153,0; st.u64 [%frame+240],%r153; .loc 1 59 14 st.u64 [%frame+360],%r153; .loc 1 61 13 mov.u64 %r155,1; st.u64 [%frame],%r155; .loc 1 62 12 st.u64 [%frame+120],%r153; .loc 1 76 13 ld.s8 %r132,[%r150+28]; .loc 1 76 11 cvt.u32.u32 %r157,%r132; cvt.s64.s8 %r85,%r157; .loc 1 72 6 setp.le.s32 %r158,%r98,0; @ %r158 bra $L2; .loc 1 77 7 setp.le.s64 %r159,%r85,0; @ %r159 bra $L3; mov.u64 %r52,40; .loc 1 70 8 mov.u64 %r106,%r155; .loc 1 69 8 mov.u64 %r105,%r106; .loc 1 77 14 mov.u64 %r107,%r153; $L7: add.u64 %r112,%r149,%r52; .loc 1 81 7 ld.u64 %r160,[%r112]; setp.ne.u64 %r161,%r160,%r105; @ ! 
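The *_spread_scalar_* routines summarized above all compile down to the same short fill loop, differing only in element width. Below is a minimal C sketch of that loop, with hypothetical names and the descriptor handling and runtime_error checks of the real libgfortran code stripped away; it is shown for the REAL(8) case handled by _gfortrani_spread_scalar_r8.

#include <stddef.h>

/* Simplified sketch: store the scalar *source into the destination
   ncopies times, stepping by the destination stride in elements.
   The real routine first validates the result descriptor and may
   allocate it through xmallocarray.  */
static void
spread_scalar_sketch (double *dest, const double *source,
                      ptrdiff_t stride, size_t ncopies)
{
  for (size_t n = 0; n < ncopies; n++)
    {
      *dest = *source;   /* ld.f64 / st.f64 in the PTX above          */
      dest += stride;    /* add.u64 dest,dest,stride*8 (stride shl 3) */
    }
}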
[cshift0_i1.o: from "../../../libgfortran/generated/cshift0_i1.c"; defines .visible .func _gfortrani_cshift0_i1 with a 480-byte local frame. The routine locates the dimension being shifted, records the extents and strides of the remaining dimensions in frame-local arrays, normalizes the shift with rem.s64 (adding the extent back when the remainder is negative), and rotates each rank-1 section with two ld.u8/st.u8 copy loops over the 1-byte INTEGER(1) elements: the tail of the section is copied first, then the head. An odometer-style counter loop then advances the source and destination pointers across the remaining dimensions. The cshift0_i2.o member begins at the end of this span; it is summarized together with the later integer kinds below.]
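The two inner copy loops of _gfortrani_cshift0_i1 summarized above implement a circular left shift of each rank-1 section along the shifted dimension. Here is a hedged C sketch of that step, with hypothetical names, plain strides instead of descriptors, and the shift normalization that the PTX performs with rem.s64 included.

#include <stddef.h>

/* Simplified sketch of one rotated section: copy elements
   shift..len-1 of src to the front of dest, then elements
   0..shift-1 behind them.  doff/soff are strides in elements;
   the i1 routine moves 1-byte elements with ld.u8/st.u8.  */
static void
cshift0_section_sketch (signed char *dest, const signed char *src,
                        ptrdiff_t len, ptrdiff_t shift,
                        ptrdiff_t doff, ptrdiff_t soff)
{
  if (len == 0)
    return;
  shift = shift % len;            /* rem.s64 in the PTX           */
  if (shift < 0)
    shift += len;                 /* wrap negative shifts         */

  signed char *d = dest;
  const signed char *s = src + shift * soff;
  for (ptrdiff_t i = 0; i < len - shift; i++)   /* first copy loop  */
    {
      *d = *s;
      d += doff;
      s += soff;
    }
  s = src;
  d = dest + (len - shift) * doff;
  for (ptrdiff_t i = 0; i < shift; i++)         /* second copy loop */
    {
      *d = *s;
      d += doff;
      s += soff;
    }
}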
[cshift0_i2.o, cshift0_i4.o and cshift0_i8.o: from "generated/cshift0_i2.c", "generated/cshift0_i4.c" and "generated/cshift0_i8.c"; each defines the corresponding .visible .func _gfortrani_cshift0_iN with the same control flow and 480-byte frame as _gfortrani_cshift0_i1. The only substantive difference is the element width used in the address arithmetic and copy loops: the i2 variant copies with ld.u16/st.u16 and doubles the descriptor strides, the i4 variant copies with ld.u32/st.u32 and scales strides by 4 (shl.b64 ...,2), and the i8 member is cut off at the end of this excerpt before its copy loops. The complete i2 and i4 routines both return at source line 240 (.loc 1 240 1 ret).]
%r240,%r241,%r242; .loc 1 168 6 cvt.u32.u16 %r243,%r240; cvt.u16.u8 %r244,%r243; setp.eq.u16 %r245,%r244,0; @ %r245 bra $L18; .loc 1 170 28 setp.ne.u64 %r246,%r97,0; @ %r246 bra $L17; $L6: mov.u64 %r162,0; bra $L18; $L17: rem.s64 %r162,%r162,%r97; .loc 1 171 10 setp.ge.s64 %r249,%r162,0; @ %r249 bra $L18; .loc 1 172 8 add.u64 %r162,%r162,%r97; bra $L18; $L32: shl.b64 %r158,%r92,3; .loc 1 193 36 mul.lo.u64 %r62,%r162,%r158; .loc 1 198 13 shl.b64 %r65,%r54,3; .loc 1 183 11 shl.b64 %r132,%r162,3; .loc 1 184 23 sub.u64 %r143,%r97,%r162; .loc 1 184 11 shl.b64 %r133,%r143,3; .loc 1 210 12 shl.b64 %r69,%r159,3; .loc 1 211 12 shl.b64 %r71,%r95,3; .loc 1 181 35 set.u32.eq.u64 %r251,%r54,1; neg.s32 %r252,%r251; .loc 1 181 19 set.u32.eq.u64 %r254,%r92,1; neg.s32 %r255,%r254; .loc 1 181 24 cvt.u16.u32 %r257,%r252; cvt.u16.u32 %r258,%r255; and.b16 %r256,%r257,%r258; cvt.u32.u16 %r259,%r256; cvt.u32.u8 %r103,%r259; .loc 1 214 32 ld.u64 %r98,[%frame]; mul.lo.u64 %r156,%r65,%r143; .loc 1 221 30 mul.lo.u64 %r135,%r159,%r98; .loc 1 222 30 mul.lo.u64 %r260,%r95,%r98; .loc 1 222 16 shl.b64 %r261,%r260,3; neg.s64 %r153,%r261; shl.b64 %r262,%r96,3; add.u64 %r106,%r262,-8; setp.eq.u32 %r338,%r103,0; setp.gt.s64 %r337,%r96,2; .loc 1 195 4 setp.le.s64 %r340,%r143,0; .loc 1 201 4 setp.gt.s64 %r341,%r162,0; add.u64 %r342,%frame,360; add.u64 %r343,%frame,240; $L35: .loc 1 181 10 @ %r338 bra $L19; .loc 1 185 23 add.u64 %r264,%r137,%r132; .loc 1 185 433271,[%value_in]; } .loc 1 186 17 add.u64 %r273,%r136,%r133; .loc 1 186 427r11320,[%value_in]; } bra $L20; $L19: .loc 1 193 25 add.u64 %r130,%r137,%r62; .loc 1 195 4 @ %r340 bra $L39; mov.u64 %r129,%r136; .loc 1 195 11 mov.u64 %r131,0; $L22: .loc 1 197 16 ld.u64 %r63,[%r130]; .loc 1 197 14 st.u64 [%r129],%r63; .loc 1 198 13 add.u64 %r129,%r129,%r65; .loc 1 199 12 add.u64 %r130,%r130,%r158; .loc 1 195 34 add.u64 %r131,%r131,1; .loc 1 195 4 setp.ne.u64 %r283,%r131,%r143; @ %r283 bra $L22; .loc 1 198 13 add.u64 %r126,%r136,%r156; bra $L21; $L39: .loc 1 195 4 mov.u64 %r126,%r136; $L21: .loc 1 201 4 @ ! %r341 bra $L20; mov.u64 %r127,%r137; .loc 1 201 23 mov.u64 %r128,0; $L23: .loc 1 203 16 ld.u64 %r66,[%r127]; .loc 1 203 14 st.u64 [%r126],%r66; .loc 1 204 13 add.u64 %r126,%r126,%r65; .loc 1 205 12 add.u64 %r127,%r127,%r158; .loc 1 201 40 add.u64 %r128,%r128,1; .loc 1 201 4 setp.ne.u64 %r285,%r128,%r162; @ %r285 bra $L23; $L20: .loc 1 210 12 add.u64 %r136,%r136,%r69; .loc 1 211 12 add.u64 %r137,%r137,%r71; .loc 1 212 15 ld.u64 %r286,[%frame+120]; add.u64 %r73,%r286,1; .loc 1 214 13 setp.eq.u64 %r287,%r73,%r98; @ %r287 bra $L25; .loc 1 212 15 st.u64 [%frame+120],%r73; bra $L35; $L25: .loc 1 218 20 mov.u64 %r288,0; st.u64 [%frame+120],%r288; .loc 1 222 16 add.u64 %r134,%r137,%r153; .loc 1 224 14 @ %r337 bra $L27; bra $L1; $L31: .loc 1 218 20 st.u64 [%r123],%r288; .loc 1 221 30 mul.lo.u64 %r76,%r84,%r83; .loc 1 222 30 mul.lo.u64 %r291,%r86,%r83; .loc 1 222 16 shl.b64 %r292,%r291,3; sub.u64 %r134,%r137,%r292; .loc 1 224 14 add.u64 %r123,%r123,8; add.u64 %r148,%r148,8; setp.eq.u64 %r293,%r106,%r148; @ ! 
%r293 bra $L30; bra $L1; $L27: add.u64 %r123,%frame,128; .loc 1 221 30 mov.u64 %r76,%r135; .loc 1 224 14 mov.u64 %r148,8; $L30: .loc 1 232 23 ld.u64 %r295,[%r123]; add.u64 %r83,%r295,1; st.u64 [%r123],%r83; .loc 1 233 30 add.u64 %r297,%r342,%r148; ld.u64 %r84,[%r297]; .loc 1 233 20 sub.u64 %r298,%r84,%r76; shl.b64 %r299,%r298,3; add.u64 %r136,%r136,%r299; .loc 1 234 30 add.u64 %r301,%r343,%r148; ld.u64 %r86,[%r301]; .loc 1 234 20 shl.b64 %r302,%r86,3; add.u64 %r137,%r134,%r302; .loc 1 214 32 add.u64 %r303,%frame,%r148; ld.u64 %r90,[%r303]; .loc 1 214 13 setp.eq.u64 %r304,%r83,%r90; @ %r304 bra $L31; bra $L35; $L18: .loc 1 175 9 setp.ne.u64 %r305,%r136,0; @ %r305 bra $L32; bra $L1; $L3: .loc 1 114 18 mov.u64 %r306,1; st.u64 [%frame+240],%r306; .loc 1 115 18 st.u64 [%frame+360],%r306; .loc 1 118 13 cvt.s64.s32 %r308,%r108; add.u64 %r310,%r308,%r308; add.u64 %r311,%r310,%r308; shl.b64 %r312,%r311,3; add.u64 %r313,%r161,%r312; ld.u64 %r30,[%r313+40]; .loc 1 119 4 ld.u64 %r323,[%r313+56]; ld.u64 %r332,[%r313+48]; sub.u64 %r331,%r323,%r332; .loc 1 118 11 mad.lo.u64 %r97,%r331,%r30,%r30; .loc 1 120 13 mul.lo.u64 %r162,%r162,%r30; .loc 1 121 16 cvt.s64.s32 %r333,%r163; .loc 1 121 7 setp.gt.s64 %r334,%r96,%r333; @ %r334 bra $L33; mov.u64 %r95,%r306; mov.u64 %r159,%r95; bra $L34; $L1: .loc 1 240 1 ret; } cshift0_i16.o/gfortrani_cshift0_i16 .visible .func _gfortrani_cshift0_i16.file 1 "../../../libgfortran/generated/cshift0_i16.c"F: _gfortrani_cshift0_i16 .visible .func _gfortrani_cshift0_i1616 .b8 %frame_ar[48092; .reg .u6432u64 %r214; .reg .u64 %r215; .reg .pred %r216; .reg .pred %r217; .reg .predpred %r232; .reg .pred %r233; .reg .u64 %r235; .reg .u32 %r237; .reg .u32 %r238; .reg .u16 %r240; .reg .u16 %r241; .reg .u16 %r242; .reg .u32 %r243; .reg .u16pred %r249; .reg .u32 %r251; .reg .u32 %r252; .reg .u32 %r254; .reg .u32 %r255; .reg .u16 %r256; .reg .u16 %r257; .reg .u16 %r258; .reg .u32predpredu64 %r327; .reg .u64 %r335; .reg .u64 %r336; .reg .u64 %r337; .reg .pred %r338; .reg .u64 %r339; .reg .u64 %r340; .reg .u64 %r341; .reg .u64 %r342; .reg .u64 %r343; .reg .u64 %r344; .reg .pred %r345; .reg .pred %r346; .reg .pred %r348; .reg .pred %r349; .reg .u64 %r350; .reg .u64 %r351; .reg .u64 %r352; mov.u64 %r160,%ar0; mov.u64 %r161,%ar1; mov.u64 %r162,%ar2; mov.u32 %r163,%ar3; .loc 1 57 9 add.u32 %r108,%r163,-1; .loc 1 58 14 mov.u64 %r164,0; st.u64 [%frame+240],%r164; .loc 1 59 14 st.u64 [%frame+360],%r164; .loc 1 61 13 mov.u64 %r166,1; st.u64 [%frame],%r166; .loc 1 62 12 st.u64 [%frame+120],%r164; .loc 1 76 13 ld.s8 %r112,[%r161+28]; .loc 1 72 6 setp.le.s32 %r168,%r108,0; @ %r168 bra $L2; .loc 1 76 11 cvt.u32.u32 %r169,%r112; cvt.s64.s8 %r96,%r169; .loc 1 77 7 setp.le.s64 %r170,%r96,0; @ %r170 bra $L3; mov.u64 %r53,40; .loc 1 70 8 mov.u64 %r117,%r166; .loc 1 69 8 mov.u64 %r116,%r117; .loc 1 77 14 mov.u64 %r118,%r164; $L7: add.u64 %r125,%r160,%r53; .loc 1 81 7 ld.u64 %r171,[%r125]; setp.ne.u64 %r172,%r171,%r116; @ ! 
%r172 bra $L49; bra $L4; $L2: .loc 1 133 27 cvt.u32.u32 %r173,%r112; cvt.s64.s8 %r96,%r173; .loc 1 133 7 setp.gt.s64 %r174,%r96,0; @ %r174 bra $L4; .loc 1 164 8 ld.u64 %r136,[%r160]; .loc 1 165 8 ld.u64 %r137,[%r161]; mov.u64 %r97,%r164; mov.u64 %r159,%r166; .loc 1 155 13 mov.u64 %r95,%r159; .loc 1 165 8 mov.u64 %r92,%r159; mov.u64 %r54,%r159; bra $L6; $L49: add.u64 %r67,%r161,%r53; .loc 1 87 7 ld.u64 %r175,[%r67]; setp.ne.u64 %r176,%r175,%r117; @ %r176 bra $L4; .loc 1 92 12 ld.u64 %r178,[%r125+16]; ld.u64 %r180,[%r125+8]; sub.u64 %r179,%r178,%r180; .loc 1 92 9 mad.lo.u64 %r116,%r179,%r116,%r116; .loc 1 93 12 ld.u64 %r182,[%r67+16]; ld.u64 %r184,[%r67+8]; sub.u64 %r183,%r182,%r184; .loc 1 93 9 mad.lo.u64 %r117,%r183,%r117,%r117; .loc 1 77 30 add.u64 %r118,%r118,1; .loc 1 77 7 add.u64 %r53,%r53,24; setp.ne.u64 %r185,%r96,%r118; @ %r185 bra $L7; bra $L3; $L33: .loc 1 123 13 cvt.u32.u32 %r187,%r112; cvt.s64.s8 %r186,%r187; sub.u64 %r188,%r186,%r337; shl.b64 %r154,%r188,3; add.u64 %r189,%frame,120;1add.u64 %r198,%r337,%r337; add.u64 %r199,%r198,%r337; shl.b64 %r200,%r199,3; add.u64 %r44,%r200,40; add.u64 %r74,%r161,%r44; mov.u64 %r35,%frame; add.u64 %r101,%r160,%r44; mov.u64 %r49,0; add.u64 %r343,%frame,360; add.u64 %r344,%frame,240; $L8: .loc 1 124 16 ld.u64 %r202,[%r74+16]; add.u64 %r201,%r202,1; ld.u64 %r204,[%r74+8]; sub.u64 %r203,%r201,%r204; .loc 1 124 14 st.u64 [%r35],%r203; .loc 1 125 15 add.u64 %r206,%r343,%r49; ld.u64 %r207,[%r101]; st.u64 [%r206],%r207; .loc 1 126 15 add.u64 %r209,%r344,%r49; ld.u64 %r210,[%r74]; st.u64 [%r209],%r210; .loc 1 121 7 add.u64 %r74,%r74,24; add.u64 %r35,%r35,8; add.u64 %r101,%r101,24; add.u64 %r49,%r49,8; setp.ne.u64 %r211,%r49,%r154; @ %r211 bra $L8; .loc 1 162 12 ld.u64 %r159,[%frame+360]; .loc 1 163 12 ld.u64 %r95,[%frame+240]; $L34: .loc 1 129 41 sub.u32 %r212,%r112,%r108; .loc 1 129 11 cvt.s64.s32 %r96,%r212; .loc 1 117 15 mov.u64 %r92,1; .loc 1 116 15 mov.u64 %r54,%r92; bra $L9; $L4: .loc 1 135 12 cvt.s64.s32 %r151,%r108; add.u64 %r104,%r161,40; add.u64 %r100,%r160,40; mov.u64 %r99,0; mov.u64 %r97,%r99; mov.u64 %r121,%r99; mov.u64 %r92,1; mov.u64 %r54,%r92; .loc 1 147 17 mov.u64 %r352,%r99; $L14: .loc 1 143 14 ld.u64 %r214,[%r104+16]; add.u64 %r213,%r214,1; .loc 1 143 12 ld.u64 %r215,[%r104+8]; sub.u64 %r150,%r213,%r215; .loc 1 137 16 ld.u64 %r152,[%r100]; .loc 1 140 16 ld.u64 %r141,[%r104]; .loc 1 135 7 setp.ne.u64 %r216,%r151,%r121; @ %r216 bra $L10; .loc 1 138 11 setp.ne.u64 %r217,%r152,0; .loc 1 139 11 selp.u64 %r54,%r152,1,%r217; .loc 1 141 11 setp.ne.u64 %r218,%r141,0; .loc 1 142 11 selp.u64 %r92,%r141,1,%r218; .loc 1 143 12 mov.u64 %r97,%r150; bra $L13; $L10: .loc 1 147 17 shl.b64 %r219,%r99,3; add.u64 %r220,%frame,%r219; add.u64 %r221,%r220,120; st.u64 [%r221],%r352; .loc 1 148 18 st.u64 [%r220],%r150; .loc 1 149 19 st.u64 [%r220+360],%r152; .loc 1 150 19 st.u64 [%r220+240],%r141; .loc 1 151 9 add.u64 %r99,%r99,1; $L13: .loc 1 133 59 add.u64 %r121,%r121,1; .loc 1 133 7 add.u64 %r104,%r104,24; add.u64 %r100,%r100,24; setp.lt.s64 %r231,%r121,%r96; @ %r231 bra $L14; .loc 1 154 18 ld.u64 %r95,[%frame+240]; .loc 1 156 18 ld.u64 %r159,[%frame+360]; .loc 1 154 10 setp.ne.u64 %r232,%r95,0; .loc 1 155 13 selp.u64 %r95,%r95,1,%r232; .loc 1 156 10 setp.eq.u64 %r233,%r159,0; selp.u64 %r159,1,%r159,%r233; $L9: .loc 1 164 8 ld.u64 %r136,[%r160]; .loc 1 165 8 ld.u64 %r137,[%r161]; .loc 1 168 13 shr.u64 %r235,%r162,63; .loc 1 168 26 set.u32.ge.s64 %r237,%r162,%r97; neg.s32 %r238,%r237; .loc 1 168 17 cvt.u16.u64 %r241,%r235; cvt.u16.u32 %r242,%r238; or.b16 
%r240,%r241,%r242; .loc 1 168 6 cvt.u32.u16 %r243,%r240; cvt.u16.u8 %r244,%r243; setp.eq.u16 %r245,%r244,0; @ %r245 bra $L18; .loc 1 170 28 setp.ne.u64 %r246,%r97,0; @ %r246 bra $L17; $L6: mov.u64 %r162,0; bra $L18; $L17: rem.s64 %r162,%r162,%r97; .loc 1 171 10 setp.ge.s64 %r249,%r162,0; @ %r249 bra $L18; .loc 1 172 8 add.u64 %r162,%r162,%r97; bra $L18; $L32: shl.b64 %r158,%r92,4; .loc 1 193 37 mul.lo.u64 %r62,%r162,%r158; .loc 1 198 13 shl.b64 %r65,%r54,4; .loc 1 183 11 shl.b64 %r132,%r162,4; .loc 1 184 23 sub.u64 %r143,%r97,%r162; .loc 1 184 11 shl.b64 %r133,%r143,4; .loc 1 210 12 shl.b64 %r69,%r159,4; .loc 1 211 12 shl.b64 %r71,%r95,4; .loc 1 181 35 set.u32.eq.u64 %r251,%r54,1; neg.s32 %r252,%r251; .loc 1 181 19 set.u32.eq.u64 %r254,%r92,1; neg.s32 %r255,%r254; .loc 1 181 24 cvt.u16.u32 %r257,%r252; cvt.u16.u32 %r258,%r255; and.b16 %r256,%r257,%r258; cvt.u32.u16 %r259,%r256; cvt.u32.u8 %r103,%r259; .loc 1 214 32 ld.u64 %r98,[%frame]; mul.lo.u64 %r156,%r65,%r143; .loc 1 221 30 mul.lo.u64 %r135,%r159,%r98; .loc 1 222 30 mul.lo.u64 %r260,%r95,%r98; .loc 1 222 16 shl.b64 %r261,%r260,4; neg.s64 %r153,%r261; shl.b64 %r262,%r96,3; add.u64 %r106,%r262,-8; setp.eq.u32 %r346,%r103,0; setp.gt.s64 %r345,%r96,2; .loc 1 195 4 setp.le.s64 %r348,%r143,0; .loc 1 201 4 setp.gt.s64 %r349,%r162,0; add.u64 %r350,%frame,360; add.u64 %r351,%frame,240; $L35: .loc 1 181 10 @ %r346 bra $L19; .loc 1 185 23 add.u64 %r264,%r137,%r132; .loc 1 185 433271,[%value_in]; } .loc 1 186 17 add.u64 %r273,%r136,%r133; .loc 1 186 427r11320,[%value_in]; } bra $L20; $L19: .loc 1 193 26 add.u64 %r130,%r137,%r62; .loc 1 195 4 @ %r348 bra $L39; mov.u64 %r129,%r136; .loc 1 195 11 mov.u64 %r131,0; $L22: .loc 1 197 16 ld.u64 %r339,[%r130]; ld.u64 %r340,[%r130+8]; .loc 1 197 14 st.u64 [%r129],%r339; st.u64 [%r129+8],%r340; .loc 1 198 13 add.u64 %r129,%r129,%r65; .loc 1 199 12 add.u64 %r130,%r130,%r158; .loc 1 195 34 add.u64 %r131,%r131,1; .loc 1 195 4 setp.ne.u64 %r285,%r131,%r143; @ %r285 bra $L22; .loc 1 198 13 add.u64 %r126,%r136,%r156; bra $L21; $L39: .loc 1 195 4 mov.u64 %r126,%r136; $L21: .loc 1 201 4 @ ! %r349 bra $L20; mov.u64 %r127,%r137; .loc 1 201 23 mov.u64 %r128,0; $L23: .loc 1 203 16 ld.u64 %r341,[%r127]; ld.u64 %r342,[%r127+8]; .loc 1 203 14 st.u64 [%r126],%r341; st.u64 [%r126+8],%r342; .loc 1 204 13 add.u64 %r126,%r126,%r65; .loc 1 205 12 add.u64 %r127,%r127,%r158; .loc 1 201 40 add.u64 %r128,%r128,1; .loc 1 201 4 setp.ne.u64 %r289,%r128,%r162; @ %r289 bra $L23; $L20: .loc 1 210 12 add.u64 %r136,%r136,%r69; .loc 1 211 12 add.u64 %r137,%r137,%r71; .loc 1 212 15 ld.u64 %r290,[%frame+120]; add.u64 %r73,%r290,1; .loc 1 214 13 setp.eq.u64 %r291,%r73,%r98; @ %r291 bra $L25; .loc 1 212 15 st.u64 [%frame+120],%r73; bra $L35; $L25: .loc 1 218 20 mov.u64 %r292,0; st.u64 [%frame+120],%r292; .loc 1 222 16 add.u64 %r134,%r137,%r153; .loc 1 224 14 @ %r345 bra $L27; bra $L1; $L31: .loc 1 218 20 st.u64 [%r123],%r292; .loc 1 221 30 mul.lo.u64 %r76,%r84,%r83; .loc 1 222 30 mul.lo.u64 %r295,%r86,%r83; .loc 1 222 16 shl.b64 %r296,%r295,4; sub.u64 %r134,%r137,%r296; .loc 1 224 14 add.u64 %r123,%r123,8; add.u64 %r148,%r148,8; setp.eq.u64 %r297,%r106,%r148; @ ! 
%r297 bra $L30; bra $L1; $L27: add.u64 %r123,%frame,128; .loc 1 221 30 mov.u64 %r76,%r135; .loc 1 224 14 mov.u64 %r148,8; $L30: .loc 1 232 23 ld.u64 %r299,[%r123]; add.u64 %r83,%r299,1; st.u64 [%r123],%r83; .loc 1 233 30 add.u64 %r301,%r350,%r148; ld.u64 %r84,[%r301]; .loc 1 233 20 sub.u64 %r302,%r84,%r76; shl.b64 %r303,%r302,4; add.u64 %r136,%r136,%r303; .loc 1 234 30 add.u64 %r305,%r351,%r148; ld.u64 %r86,[%r305]; .loc 1 234 20 shl.b64 %r306,%r86,4; add.u64 %r137,%r134,%r306; .loc 1 214 32 add.u64 %r307,%frame,%r148; ld.u64 %r90,[%r307]; .loc 1 214 13 setp.eq.u64 %r308,%r83,%r90; @ %r308 bra $L31; bra $L35; $L18: .loc 1 175 9 setp.ne.u64 %r309,%r136,0; @ %r309 bra $L32; bra $L1; $L3: .loc 1 114 18 mov.u64 %r310,1; st.u64 [%frame+240],%r310; .loc 1 115 18 st.u64 [%frame+360],%r310; .loc 1 118 13 cvt.s64.s32 %r312,%r108; add.u64 %r314,%r312,%r312; add.u64 %r315,%r314,%r312; shl.b64 %r316,%r315,3; add.u64 %r317,%r161,%r316; ld.u64 %r30,[%r317+40]; .loc 1 119 4 ld.u64 %r327,[%r317+56]; ld.u64 %r336,[%r317+48]; sub.u64 %r335,%r327,%r336; .loc 1 118 11 mad.lo.u64 %r97,%r335,%r30,%r30; .loc 1 120 13 mul.lo.u64 %r162,%r162,%r30; .loc 1 121 16 cvt.s64.s32 %r337,%r163; .loc 1 121 7 setp.gt.s64 %r338,%r96,%r337; @ %r338 bra $L33; mov.u64 %r95,%r310; mov.u64 %r159,%r95; bra $L34; $Lgfortrani_cshift0_r4 .visible .func _gfortrani_cshift0_rfortran/generated/cshift0_r4.c"F: _gfortrani_cshift0_r4 .visible .func _gfortrani_cshift0_r416 .b8 %frame_ar[48092; .reg .u6432u64 %r214; .reg .u64 %r215; .reg .pred %r216; .reg .pred %r217; .reg .predpred %r232; .reg .pred %r233; .reg .u64 %r235; .reg .u32 %r237; .reg .u32 %r238; .reg .u16 %r240; .reg .u16 %r241; .reg .u16 %r242; .reg .u32 %r243; .reg .u16pred %r249; .reg .u32 %r251; .reg .u32 %r252; .reg .u32 %r254; .reg .u32 %r255; .reg .u16 %r256; .reg .u16 %r257; .reg .u16 %r258; .reg .u32predpredu64pred %r305; .reg .u64u64pred %r337; .reg .pred %r338; .reg .pred %r340; .reg .predmov.u64 %r160,%ar0; mov.u64 %r161,%ar1; mov.u64 %r162,%ar2; mov.u32 %r163,%ar3; .loc 1 57 9 add.u32 %r108,%r163,-1; .loc 1 58 14 mov.u64 %r164,0; st.u64 [%frame+240],%r164; .loc 1 59 14 st.u64 [%frame+360],%r164; .loc 1 61 13 mov.u64 %r166,1; st.u64 [%frame],%r166; .loc 1 62 12 st.u64 [%frame+120],%r164; .loc 1 76 13 ld.s8 %r112,[%r161+28]; .loc 1 72 6 setp.le.s32 %r168,%r108,0; @ %r168 bra $L2; .loc 1 76 11 cvt.u32.u32 %r169,%r112; cvt.s64.s8 %r96,%r169; .loc 1 77 7 setp.le.s64 %r170,%r96,0; @ %r170 bra $L3; mov.u64 %r53,40; .loc 1 70 8 mov.u64 %r117,%r166; .loc 1 69 8 mov.u64 %r116,%r117; .loc 1 77 14 mov.u64 %r118,%r164; $L7: add.u64 %r125,%r160,%r53; .loc 1 81 7 ld.u64 %r171,[%r125]; setp.ne.u64 %r172,%r171,%r116; @ ! 
%r172 bra $L49; bra $L4; $L2: .loc 1 133 27 cvt.u32.u32 %r173,%r112; cvt.s64.s8 %r96,%r173; .loc 1 133 7 setp.gt.s64 %r174,%r96,0; @ %r174 bra $L4; .loc 1 164 8 ld.u64 %r136,[%r160]; .loc 1 165 8 ld.u64 %r137,[%r161]; mov.u64 %r97,%r164; mov.u64 %r159,%r166; .loc 1 155 13 mov.u64 %r95,%r159; .loc 1 165 8 mov.u64 %r92,%r159; mov.u64 %r54,%r159; bra $L6; $L49: add.u64 %r67,%r161,%r53; .loc 1 87 7 ld.u64 %r175,[%r67]; setp.ne.u64 %r176,%r175,%r117; @ %r176 bra $L4; .loc 1 92 12 ld.u64 %r178,[%r125+16]; ld.u64 %r180,[%r125+8]; sub.u64 %r179,%r178,%r180; .loc 1 92 9 mad.lo.u64 %r116,%r179,%r116,%r116; .loc 1 93 12 ld.u64 %r182,[%r67+16]; ld.u64 %r184,[%r67+8]; sub.u64 %r183,%r182,%r184; .loc 1 93 9 mad.lo.u64 %r117,%r183,%r117,%r117; .loc 1 77 30 add.u64 %r118,%r118,1; .loc 1 77 7 add.u64 %r53,%r53,24; setp.ne.u64 %r185,%r96,%r118; @ %r185 bra $L7; bra $L3; $L33: .loc 1 123 13 cvt.u32.u32 %r187,%r112; cvt.s64.s8 %r186,%r187; sub.u64 %r188,%r186,%r333; shl.b64 %r154,%r188,3; add.u64 %r189,%frame,120;1add.u64 %r198,%r333,%r333; add.u64 %r199,%r198,%r333; shl.b64 %r200,%r199,3; add.u64 %r44,%r200,40; add.u64 %r74,%r161,%r44; mov.u64 %r35,%frame; add.u64 %r101,%r160,%r44; mov.u64 %r49,0; add.u64 %r335,%frame,360; add.u64 %r336,%frame,240; $L8: .loc 1 124 16 ld.u64 %r202,[%r74+16]; add.u64 %r201,%r202,1; ld.u64 %r204,[%r74+8]; sub.u64 %r203,%r201,%r204; .loc 1 124 14 st.u64 [%r35],%r203; .loc 1 125 15 add.u64 %r206,%r335,%r49; ld.u64 %r207,[%r101]; st.u64 [%r206],%r207; .loc 1 126 15 add.u64 %r209,%r336,%r49; ld.u64 %r210,[%r74]; st.u64 [%r209],%r210; .loc 1 121 7 add.u64 %r74,%r74,24; add.u64 %r35,%r35,8; add.u64 %r101,%r101,24; add.u64 %r49,%r49,8; setp.ne.u64 %r211,%r49,%r154; @ %r211 bra $L8; .loc 1 162 12 ld.u64 %r159,[%frame+360]; .loc 1 163 12 ld.u64 %r95,[%frame+240]; $L34: .loc 1 129 41 sub.u32 %r212,%r112,%r108; .loc 1 129 11 cvt.s64.s32 %r96,%r212; .loc 1 117 15 mov.u64 %r92,1; .loc 1 116 15 mov.u64 %r54,%r92; bra $L9; $L4: .loc 1 135 12 cvt.s64.s32 %r151,%r108; add.u64 %r104,%r161,40; add.u64 %r100,%r160,40; mov.u64 %r99,0; mov.u64 %r97,%r99; mov.u64 %r121,%r99; mov.u64 %r92,1; mov.u64 %r54,%r92; .loc 1 147 17 mov.u64 %r344,%r99; $L14: .loc 1 143 14 ld.u64 %r214,[%r104+16]; add.u64 %r213,%r214,1; .loc 1 143 12 ld.u64 %r215,[%r104+8]; sub.u64 %r150,%r213,%r215; .loc 1 137 16 ld.u64 %r152,[%r100]; .loc 1 140 16 ld.u64 %r141,[%r104]; .loc 1 135 7 setp.ne.u64 %r216,%r151,%r121; @ %r216 bra $L10; .loc 1 138 11 setp.ne.u64 %r217,%r152,0; .loc 1 139 11 selp.u64 %r54,%r152,1,%r217; .loc 1 141 11 setp.ne.u64 %r218,%r141,0; .loc 1 142 11 selp.u64 %r92,%r141,1,%r218; .loc 1 143 12 mov.u64 %r97,%r150; bra $L13; $L10: .loc 1 147 17 shl.b64 %r219,%r99,3; add.u64 %r220,%frame,%r219; add.u64 %r221,%r220,120; st.u64 [%r221],%r344; .loc 1 148 18 st.u64 [%r220],%r150; .loc 1 149 19 st.u64 [%r220+360],%r152; .loc 1 150 19 st.u64 [%r220+240],%r141; .loc 1 151 9 add.u64 %r99,%r99,1; $L13: .loc 1 133 59 add.u64 %r121,%r121,1; .loc 1 133 7 add.u64 %r104,%r104,24; add.u64 %r100,%r100,24; setp.lt.s64 %r231,%r121,%r96; @ %r231 bra $L14; .loc 1 154 18 ld.u64 %r95,[%frame+240]; .loc 1 156 18 ld.u64 %r159,[%frame+360]; .loc 1 154 10 setp.ne.u64 %r232,%r95,0; .loc 1 155 13 selp.u64 %r95,%r95,1,%r232; .loc 1 156 10 setp.eq.u64 %r233,%r159,0; selp.u64 %r159,1,%r159,%r233; $L9: .loc 1 164 8 ld.u64 %r136,[%r160]; .loc 1 165 8 ld.u64 %r137,[%r161]; .loc 1 168 13 shr.u64 %r235,%r162,63; .loc 1 168 26 set.u32.ge.s64 %r237,%r162,%r97; neg.s32 %r238,%r237; .loc 1 168 17 cvt.u16.u64 %r241,%r235; cvt.u16.u32 %r242,%r238; or.b16 
%r240,%r241,%r242; .loc 1 168 6 cvt.u32.u16 %r243,%r240; cvt.u16.u8 %r244,%r243; setp.eq.u16 %r245,%r244,0; @ %r245 bra $L18; .loc 1 170 28 setp.ne.u64 %r246,%r97,0; @ %r246 bra $L17; $L6: mov.u64 %r162,0; bra $L18; $L17: rem.s64 %r162,%r162,%r97; .loc 1 171 10 setp.ge.s64 %r249,%r162,0; @ %r249 bra $L18; .loc 1 172 8 add.u64 %r162,%r162,%r97; bra $L18; $L32: shl.b64 %r158,%r92,2; .loc 1 193 33 mul.lo.u64 %r62,%r162,%r158; .loc 1 198 13 shl.b64 %r65,%r54,2; .loc 1 183 11 shl.b64 %r132,%r162,2; .loc 1 184 23 sub.u64 %r143,%r97,%r162; .loc 1 184 11 shl.b64 %r133,%r143,2; .loc 1 210 12 shl.b64 %r69,%r159,2; .loc 1 211 12 shl.b64 %r71,%r95,2; .loc 1 181 35 set.u32.eq.u64 %r251,%r54,1; neg.s32 %r252,%r251; .loc 1 181 19 set.u32.eq.u64 %r254,%r92,1; neg.s32 %r255,%r254; .loc 1 181 24 cvt.u16.u32 %r257,%r252; cvt.u16.u32 %r258,%r255; and.b16 %r256,%r257,%r258; cvt.u32.u16 %r259,%r256; cvt.u32.u8 %r103,%r259; .loc 1 214 32 ld.u64 %r98,[%frame]; mul.lo.u64 %r156,%r65,%r143; .loc 1 221 30 mul.lo.u64 %r135,%r159,%r98; .loc 1 222 30 mul.lo.u64 %r260,%r95,%r98; .loc 1 222 16 shl.b64 %r261,%r260,2; neg.s64 %r153,%r261; shl.b64 %r262,%r96,3; add.u64 %r106,%r262,-8; setp.eq.u32 %r338,%r103,0; setp.gt.s64 %r337,%r96,2; .loc 1 195 4 setp.le.s64 %r340,%r143,0; .loc 1 201 4 setp.gt.s64 %r341,%r162,0; add.u64 %r342,%frame,360; add.u64 %r343,%frame,240; $L35: .loc 1 181 10 @ %r338 bra $L19; .loc 1 185 23 add.u64 %r264,%r137,%r132; .loc 1 185 433271,[%value_in]; } .loc 1 186 17 add.u64 %r273,%r136,%r133; .loc 1 186 427r11320,[%value_in]; } bra $L20; $L19: .loc 1 193 22 add.u64 %r130,%r137,%r62; .loc 1 195 4 @ %r340 bra $L39; mov.u64 %r129,%r136; .loc 1 195 11 mov.u64 %r131,0; $L22: .loc 1 197 16 ld.f32 %r63,[%r130]; .loc 1 197 14 st.f32 [%r129],%r63; .loc 1 198 13 add.u64 %r129,%r129,%r65; .loc 1 199 12 add.u64 %r130,%r130,%r158; .loc 1 195 34 add.u64 %r131,%r131,1; .loc 1 195 4 setp.ne.u64 %r283,%r131,%r143; @ %r283 bra $L22; .loc 1 198 13 add.u64 %r126,%r136,%r156; bra $L21; $L39: .loc 1 195 4 mov.u64 %r126,%r136; $L21: .loc 1 201 4 @ ! %r341 bra $L20; mov.u64 %r127,%r137; .loc 1 201 23 mov.u64 %r128,0; $L23: .loc 1 203 16 ld.f32 %r66,[%r127]; .loc 1 203 14 st.f32 [%r126],%r66; .loc 1 204 13 add.u64 %r126,%r126,%r65; .loc 1 205 12 add.u64 %r127,%r127,%r158; .loc 1 201 40 add.u64 %r128,%r128,1; .loc 1 201 4 setp.ne.u64 %r285,%r128,%r162; @ %r285 bra $L23; $L20: .loc 1 210 12 add.u64 %r136,%r136,%r69; .loc 1 211 12 add.u64 %r137,%r137,%r71; .loc 1 212 15 ld.u64 %r286,[%frame+120]; add.u64 %r73,%r286,1; .loc 1 214 13 setp.eq.u64 %r287,%r73,%r98; @ %r287 bra $L25; .loc 1 212 15 st.u64 [%frame+120],%r73; bra $L35; $L25: .loc 1 218 20 mov.u64 %r288,0; st.u64 [%frame+120],%r288; .loc 1 222 16 add.u64 %r134,%r137,%r153; .loc 1 224 14 @ %r337 bra $L27; bra $L1; $L31: .loc 1 218 20 st.u64 [%r123],%r288; .loc 1 221 30 mul.lo.u64 %r76,%r84,%r83; .loc 1 222 30 mul.lo.u64 %r291,%r86,%r83; .loc 1 222 16 shl.b64 %r292,%r291,2; sub.u64 %r134,%r137,%r292; .loc 1 224 14 add.u64 %r123,%r123,8; add.u64 %r148,%r148,8; setp.eq.u64 %r293,%r106,%r148; @ ! 
%r293 bra $L30; bra $L1; $L27: add.u64 %r123,%frame,128; .loc 1 221 30 mov.u64 %r76,%r135; .loc 1 224 14 mov.u64 %r148,8; $L30: .loc 1 232 23 ld.u64 %r295,[%r123]; add.u64 %r83,%r295,1; st.u64 [%r123],%r83; .loc 1 233 30 add.u64 %r297,%r342,%r148; ld.u64 %r84,[%r297]; .loc 1 233 20 sub.u64 %r298,%r84,%r76; shl.b64 %r299,%r298,2; add.u64 %r136,%r136,%r299; .loc 1 234 30 add.u64 %r301,%r343,%r148; ld.u64 %r86,[%r301]; .loc 1 234 20 shl.b64 %r302,%r86,2; add.u64 %r137,%r134,%r302; .loc 1 214 32 add.u64 %r303,%frame,%r148; ld.u64 %r90,[%r303]; .loc 1 214 13 setp.eq.u64 %r304,%r83,%r90; @ %r304 bra $L31; bra $L35; $L18: .loc 1 175 9 setp.ne.u64 %r305,%r136,0; @ %r305 bra $L32; bra $L1; $L3: .loc 1 114 18 mov.u64 %r306,1; st.u64 [%frame+240],%r306; .loc 1 115 18 st.u64 [%frame+360],%r306; .loc 1 118 13 cvt.s64.s32 %r308,%r108; add.u64 %r310,%r308,%r308; add.u64 %r311,%r310,%r308; shl.b64 %r312,%r311,3; add.u64 %r313,%r161,%r312; ld.u64 %r30,[%r313+40]; .loc 1 119 4 ld.u64 %r323,[%r313+56]; ld.u64 %r332,[%r313+48]; sub.u64 %r331,%r323,%r332; .loc 1 118 11 mad.lo.u64 %r97,%r331,%r30,%r30; .loc 1 120 13 mul.lo.u64 %r162,%r162,%r30; .loc 1 121 16 cvt.s64.s32 %r333,%r163; .loc 1 121 7 setp.gt.s64 %r334,%r96,%r333; @ %r334 bra $L33; mov.u64 %r95,%r306; mov.u64 %r159,%r95; bra $L34; $L1: .loc 1 240 1 ret; } cshift0_r8.o/gfortrani_cshift0_r8 .visible .func _gfortrani_cshift0_r8.file 1 "../../../libgfortran/generated/cshift0_r8.c"F: _gfortrani_cshift0_r8 .visible .func _gfortrani_cshift0_r816 .b8 %frame_ar[48092; .reg .u6432u64 %r214; .reg .u64 %r215; .reg .pred %r216; .reg .pred %r217; .reg .predpred %r232; .reg .pred %r233; .reg .u64 %r235; .reg .u32 %r237; .reg .u32 %r238; .reg .u16 %r240; .reg .u16 %r241; .reg .u16 %r242; .reg .u32 %r243; .reg .u16pred %r249; .reg .u32 %r251; .reg .u32 %r252; .reg .u32 %r254; .reg .u32 %r255; .reg .u16 %r256; .reg .u16 %r257; .reg .u16 %r258; .reg .u32predpredu64pred %r305; .reg .u64u64pred %r337; .reg .pred %r338; .reg .pred %r340; .reg .predmov.u64 %r160,%ar0; mov.u64 %r161,%ar1; mov.u64 %r162,%ar2; mov.u32 %r163,%ar3; .loc 1 57 9 add.u32 %r108,%r163,-1; .loc 1 58 14 mov.u64 %r164,0; st.u64 [%frame+240],%r164; .loc 1 59 14 st.u64 [%frame+360],%r164; .loc 1 61 13 mov.u64 %r166,1; st.u64 [%frame],%r166; .loc 1 62 12 st.u64 [%frame+120],%r164; .loc 1 76 13 ld.s8 %r112,[%r161+28]; .loc 1 72 6 setp.le.s32 %r168,%r108,0; @ %r168 bra $L2; .loc 1 76 11 cvt.u32.u32 %r169,%r112; cvt.s64.s8 %r96,%r169; .loc 1 77 7 setp.le.s64 %r170,%r96,0; @ %r170 bra $L3; mov.u64 %r53,40; .loc 1 70 8 mov.u64 %r117,%r166; .loc 1 69 8 mov.u64 %r116,%r117; .loc 1 77 14 mov.u64 %r118,%r164; $L7: add.u64 %r125,%r160,%r53; .loc 1 81 7 ld.u64 %r171,[%r125]; setp.ne.u64 %r172,%r171,%r116; @ ! 
%r172 bra $L49; bra $L4; $L2: .loc 1 133 27 cvt.u32.u32 %r173,%r112; cvt.s64.s8 %r96,%r173; .loc 1 133 7 setp.gt.s64 %r174,%r96,0; @ %r174 bra $L4; .loc 1 164 8 ld.u64 %r136,[%r160]; .loc 1 165 8 ld.u64 %r137,[%r161]; mov.u64 %r97,%r164; mov.u64 %r159,%r166; .loc 1 155 13 mov.u64 %r95,%r159; .loc 1 165 8 mov.u64 %r92,%r159; mov.u64 %r54,%r159; bra $L6; $L49: add.u64 %r67,%r161,%r53; .loc 1 87 7 ld.u64 %r175,[%r67]; setp.ne.u64 %r176,%r175,%r117; @ %r176 bra $L4; .loc 1 92 12 ld.u64 %r178,[%r125+16]; ld.u64 %r180,[%r125+8]; sub.u64 %r179,%r178,%r180; .loc 1 92 9 mad.lo.u64 %r116,%r179,%r116,%r116; .loc 1 93 12 ld.u64 %r182,[%r67+16]; ld.u64 %r184,[%r67+8]; sub.u64 %r183,%r182,%r184; .loc 1 93 9 mad.lo.u64 %r117,%r183,%r117,%r117; .loc 1 77 30 add.u64 %r118,%r118,1; .loc 1 77 7 add.u64 %r53,%r53,24; setp.ne.u64 %r185,%r96,%r118; @ %r185 bra $L7; bra $L3; $L33: .loc 1 123 13 cvt.u32.u32 %r187,%r112; cvt.s64.s8 %r186,%r187; sub.u64 %r188,%r186,%r333; shl.b64 %r154,%r188,3; add.u64 %r189,%frame,120;1add.u64 %r198,%r333,%r333; add.u64 %r199,%r198,%r333; shl.b64 %r200,%r199,3; add.u64 %r44,%r200,40; add.u64 %r74,%r161,%r44; mov.u64 %r35,%frame; add.u64 %r101,%r160,%r44; mov.u64 %r49,0; add.u64 %r335,%frame,360; add.u64 %r336,%frame,240; $L8: .loc 1 124 16 ld.u64 %r202,[%r74+16]; add.u64 %r201,%r202,1; ld.u64 %r204,[%r74+8]; sub.u64 %r203,%r201,%r204; .loc 1 124 14 st.u64 [%r35],%r203; .loc 1 125 15 add.u64 %r206,%r335,%r49; ld.u64 %r207,[%r101]; st.u64 [%r206],%r207; .loc 1 126 15 add.u64 %r209,%r336,%r49; ld.u64 %r210,[%r74]; st.u64 [%r209],%r210; .loc 1 121 7 add.u64 %r74,%r74,24; add.u64 %r35,%r35,8; add.u64 %r101,%r101,24; add.u64 %r49,%r49,8; setp.ne.u64 %r211,%r49,%r154; @ %r211 bra $L8; .loc 1 162 12 ld.u64 %r159,[%frame+360]; .loc 1 163 12 ld.u64 %r95,[%frame+240]; $L34: .loc 1 129 41 sub.u32 %r212,%r112,%r108; .loc 1 129 11 cvt.s64.s32 %r96,%r212; .loc 1 117 15 mov.u64 %r92,1; .loc 1 116 15 mov.u64 %r54,%r92; bra $L9; $L4: .loc 1 135 12 cvt.s64.s32 %r151,%r108; add.u64 %r104,%r161,40; add.u64 %r100,%r160,40; mov.u64 %r99,0; mov.u64 %r97,%r99; mov.u64 %r121,%r99; mov.u64 %r92,1; mov.u64 %r54,%r92; .loc 1 147 17 mov.u64 %r344,%r99; $L14: .loc 1 143 14 ld.u64 %r214,[%r104+16]; add.u64 %r213,%r214,1; .loc 1 143 12 ld.u64 %r215,[%r104+8]; sub.u64 %r150,%r213,%r215; .loc 1 137 16 ld.u64 %r152,[%r100]; .loc 1 140 16 ld.u64 %r141,[%r104]; .loc 1 135 7 setp.ne.u64 %r216,%r151,%r121; @ %r216 bra $L10; .loc 1 138 11 setp.ne.u64 %r217,%r152,0; .loc 1 139 11 selp.u64 %r54,%r152,1,%r217; .loc 1 141 11 setp.ne.u64 %r218,%r141,0; .loc 1 142 11 selp.u64 %r92,%r141,1,%r218; .loc 1 143 12 mov.u64 %r97,%r150; bra $L13; $L10: .loc 1 147 17 shl.b64 %r219,%r99,3; add.u64 %r220,%frame,%r219; add.u64 %r221,%r220,120; st.u64 [%r221],%r344; .loc 1 148 18 st.u64 [%r220],%r150; .loc 1 149 19 st.u64 [%r220+360],%r152; .loc 1 150 19 st.u64 [%r220+240],%r141; .loc 1 151 9 add.u64 %r99,%r99,1; $L13: .loc 1 133 59 add.u64 %r121,%r121,1; .loc 1 133 7 add.u64 %r104,%r104,24; add.u64 %r100,%r100,24; setp.lt.s64 %r231,%r121,%r96; @ %r231 bra $L14; .loc 1 154 18 ld.u64 %r95,[%frame+240]; .loc 1 156 18 ld.u64 %r159,[%frame+360]; .loc 1 154 10 setp.ne.u64 %r232,%r95,0; .loc 1 155 13 selp.u64 %r95,%r95,1,%r232; .loc 1 156 10 setp.eq.u64 %r233,%r159,0; selp.u64 %r159,1,%r159,%r233; $L9: .loc 1 164 8 ld.u64 %r136,[%r160]; .loc 1 165 8 ld.u64 %r137,[%r161]; .loc 1 168 13 shr.u64 %r235,%r162,63; .loc 1 168 26 set.u32.ge.s64 %r237,%r162,%r97; neg.s32 %r238,%r237; .loc 1 168 17 cvt.u16.u64 %r241,%r235; cvt.u16.u32 %r242,%r238; or.b16 
%r240,%r241,%r242; .loc 1 168 6 cvt.u32.u16 %r243,%r240; cvt.u16.u8 %r244,%r243; setp.eq.u16 %r245,%r244,0; @ %r245 bra $L18; .loc 1 170 28 setp.ne.u64 %r246,%r97,0; @ %r246 bra $L17; $L6: mov.u64 %r162,0; bra $L18; $L17: rem.s64 %r162,%r162,%r97; .loc 1 171 10 setp.ge.s64 %r249,%r162,0; @ %r249 bra $L18; .loc 1 172 8 add.u64 %r162,%r162,%r97; bra $L18; $L32: shl.b64 %r158,%r92,3; .loc 1 193 33 mul.lo.u64 %r62,%r162,%r158; .loc 1 198 13 shl.b64 %r65,%r54,3; .loc 1 183 11 shl.b64 %r132,%r162,3; .loc 1 184 23 sub.u64 %r143,%r97,%r162; .loc 1 184 11 shl.b64 %r133,%r143,3; .loc 1 210 12 shl.b64 %r69,%r159,3; .loc 1 211 12 shl.b64 %r71,%r95,3; .loc 1 181 35 set.u32.eq.u64 %r251,%r54,1; neg.s32 %r252,%r251; .loc 1 181 19 set.u32.eq.u64 %r254,%r92,1; neg.s32 %r255,%r254; .loc 1 181 24 cvt.u16.u32 %r257,%r252; cvt.u16.u32 %r258,%r255; and.b16 %r256,%r257,%r258; cvt.u32.u16 %r259,%r256; cvt.u32.u8 %r103,%r259; .loc 1 214 32 ld.u64 %r98,[%frame]; mul.lo.u64 %r156,%r65,%r143; .loc 1 221 30 mul.lo.u64 %r135,%r159,%r98; .loc 1 222 30 mul.lo.u64 %r260,%r95,%r98; .loc 1 222 16 shl.b64 %r261,%r260,3; neg.s64 %r153,%r261; shl.b64 %r262,%r96,3; add.u64 %r106,%r262,-8; setp.eq.u32 %r338,%r103,0; setp.gt.s64 %r337,%r96,2; .loc 1 195 4 setp.le.s64 %r340,%r143,0; .loc 1 201 4 setp.gt.s64 %r341,%r162,0; add.u64 %r342,%frame,360; add.u64 %r343,%frame,240; $L35: .loc 1 181 10 @ %r338 bra $L19; .loc 1 185 23 add.u64 %r264,%r137,%r132; .loc 1 185 433271,[%value_in]; } .loc 1 186 17 add.u64 %r273,%r136,%r133; .loc 1 186 427r11320,[%value_in]; } bra $L20; $L19: .loc 1 193 22 add.u64 %r130,%r137,%r62; .loc 1 195 4 @ %r340 bra $L39; mov.u64 %r129,%r136; .loc 1 195 11 mov.u64 %r131,0; $L22: .loc 1 197 16 ld.f64 %r63,[%r130]; .loc 1 197 14 st.f64 [%r129],%r63; .loc 1 198 13 add.u64 %r129,%r129,%r65; .loc 1 199 12 add.u64 %r130,%r130,%r158; .loc 1 195 34 add.u64 %r131,%r131,1; .loc 1 195 4 setp.ne.u64 %r283,%r131,%r143; @ %r283 bra $L22; .loc 1 198 13 add.u64 %r126,%r136,%r156; bra $L21; $L39: .loc 1 195 4 mov.u64 %r126,%r136; $L21: .loc 1 201 4 @ ! %r341 bra $L20; mov.u64 %r127,%r137; .loc 1 201 23 mov.u64 %r128,0; $L23: .loc 1 203 16 ld.f64 %r66,[%r127]; .loc 1 203 14 st.f64 [%r126],%r66; .loc 1 204 13 add.u64 %r126,%r126,%r65; .loc 1 205 12 add.u64 %r127,%r127,%r158; .loc 1 201 40 add.u64 %r128,%r128,1; .loc 1 201 4 setp.ne.u64 %r285,%r128,%r162; @ %r285 bra $L23; $L20: .loc 1 210 12 add.u64 %r136,%r136,%r69; .loc 1 211 12 add.u64 %r137,%r137,%r71; .loc 1 212 15 ld.u64 %r286,[%frame+120]; add.u64 %r73,%r286,1; .loc 1 214 13 setp.eq.u64 %r287,%r73,%r98; @ %r287 bra $L25; .loc 1 212 15 st.u64 [%frame+120],%r73; bra $L35; $L25: .loc 1 218 20 mov.u64 %r288,0; st.u64 [%frame+120],%r288; .loc 1 222 16 add.u64 %r134,%r137,%r153; .loc 1 224 14 @ %r337 bra $L27; bra $L1; $L31: .loc 1 218 20 st.u64 [%r123],%r288; .loc 1 221 30 mul.lo.u64 %r76,%r84,%r83; .loc 1 222 30 mul.lo.u64 %r291,%r86,%r83; .loc 1 222 16 shl.b64 %r292,%r291,3; sub.u64 %r134,%r137,%r292; .loc 1 224 14 add.u64 %r123,%r123,8; add.u64 %r148,%r148,8; setp.eq.u64 %r293,%r106,%r148; @ ! 
%r293 bra $L30; bra $L1; $L27: add.u64 %r123,%frame,128; .loc 1 221 30 mov.u64 %r76,%r135; .loc 1 224 14 mov.u64 %r148,8; $L30: .loc 1 232 23 ld.u64 %r295,[%r123]; add.u64 %r83,%r295,1; st.u64 [%r123],%r83; .loc 1 233 30 add.u64 %r297,%r342,%r148; ld.u64 %r84,[%r297]; .loc 1 233 20 sub.u64 %r298,%r84,%r76; shl.b64 %r299,%r298,3; add.u64 %r136,%r136,%r299; .loc 1 234 30 add.u64 %r301,%r343,%r148; ld.u64 %r86,[%r301]; .loc 1 234 20 shl.b64 %r302,%r86,3; add.u64 %r137,%r134,%r302; .loc 1 214 32 add.u64 %r303,%frame,%r148; ld.u64 %r90,[%r303]; .loc 1 214 13 setp.eq.u64 %r304,%r83,%r90; @ %r304 bra $L31; bra $L35; $L18: .loc 1 175 9 setp.ne.u64 %r305,%r136,0; @ %r305 bra $L32; bra $L1; $L3: .loc 1 114 18 mov.u64 %r306,1; st.u64 [%frame+240],%r306; .loc 1 115 18 st.u64 [%frame+360],%r306; .loc 1 118 13 cvt.s64.s32 %r308,%r108; add.u64 %r310,%r308,%r308; add.u64 %r311,%r310,%r308; shl.b64 %r312,%r311,3; add.u64 %r313,%r161,%r312; ld.u64 %r30,[%r313+40]; .loc 1 119 4 ld.u64 %r323,[%r313+56]; ld.u64 %r332,[%r313+48]; sub.u64 %r331,%r323,%r332; .loc 1 118 11 mad.lo.u64 %r97,%r331,%r30,%r30; .loc 1 120 13 mul.lo.u64 %r162,%r162,%r30; .loc 1 121 16 cvt.s64.s32 %r333,%r163; .loc 1 121 7 setp.gt.s64 %r334,%r96,%r333; @ %r334 bra $L33; mov.u64 %r95,%r306; mov.u64 %r159,%r95; bra $L34; $L1: .loc 1 240 1 ret; } cshift0_r10cshift0_c4.o/gfortrani_cshift0_c4 .visible .func _gfortrani_cshift0_cfortran/generated/cshift0_c4.c"F: _gfortrani_cshift0_c4 .visible .func _gfortrani_cshift0_16 .b8 %frame_ar[480pred %r213; .reg .u32predu64 %r223; .reg .predu32 %r240; .reg .u16predpredu64 %r312; .reg .u64u64 %r325; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .predmov.u64 %r162,%ar0; mov.u64 %r163,%ar1; mov.u64 %r164,%ar2; mov.u32 %r165,%ar3; .loc 1 57 9 add.u32 %r110,%r165,-1; .loc 1 58 14 mov.u64 %r166,0; st.u64 [%frame+240],%r166; .loc 1 59 14 st.u64 [%frame+360],%r166; .loc 1 61 13 mov.u64 %r168,1; st.u64 [%frame],%r168; .loc 1 62 12 st.u64 [%frame+120],%r166; .loc 1 76 13 ld.s8 %r114,[%r163+28]; .loc 1 72 6 setp.le.s32 %r170,%r110,0; @ %r170 bra $L2; .loc 1 76 11 cvt.u32.u32 %r171,%r114; cvt.s64.s8 %r98,%r171; .loc 1 77 7 setp.le.s64 %r172,%r98,0; @ %r172 bra $L3; mov.u64 %r53,40; .loc 1 70 8 mov.u64 %r119,%r168; .loc 1 69 8 mov.u64 %r118,%r119; .loc 1 77 14 mov.u64 %r120,%r166; $L7: add.u64 %r127,%r162,%r53; .loc 1 81 7 ld.u64 %r173,[%r127]; setp.ne.u64 %r174,%r173,%r118; @ ! 
%r174 bra $L49; bra $L4; $L2: .loc 1 133 27 cvt.u32.u32 %r175,%r114; cvt.s64.s8 %r98,%r175; .loc 1 133 7 setp.gt.s64 %r176,%r98,0; @ %r176 bra $L4; .loc 1 164 8 ld.u64 %r138,[%r162]; .loc 1 165 8 ld.u64 %r139,[%r163]; mov.u64 %r99,%r166; mov.u64 %r161,%r168; .loc 1 155 13 mov.u64 %r97,%r161; .loc 1 165 8 mov.u64 %r94,%r161; mov.u64 %r54,%r161; bra $L6; $L49: add.u64 %r69,%r163,%r53; .loc 1 87 7 ld.u64 %r177,[%r69]; setp.ne.u64 %r178,%r177,%r119; @ %r178 bra $L4; .loc 1 92 12 ld.u64 %r180,[%r127+16]; ld.u64 %r182,[%r127+8]; sub.u64 %r181,%r180,%r182; .loc 1 92 9 mad.lo.u64 %r118,%r181,%r118,%r118; .loc 1 93 12 ld.u64 %r184,[%r69+16]; ld.u64 %r186,[%r69+8]; sub.u64 %r185,%r184,%r186; .loc 1 93 9 mad.lo.u64 %r119,%r185,%r119,%r119; .loc 1 77 30 add.u64 %r120,%r120,1; .loc 1 77 7 add.u64 %r53,%r53,24; setp.ne.u64 %r187,%r98,%r120; @ %r187 bra $L7; bra $L3; $L33: .loc 1 123 13 cvt.u32.u32 %r189,%r114; cvt.s64.s8 %r188,%r189; sub.u64 %r190,%r188,%r335; shl.b64 %r156,%r190,3; add.u64 %r191,%frame,120; mov.u32 %r1add.u64 %r200,%r335,%r335; add.u64 %r201,%r200,%r335; shl.b64 %r202,%r201,3; add.u64 %r44,%r202,40; add.u64 %r76,%r163,%r44; mov.u64 %r35,%frame; add.u64 %r103,%r162,%r44; mov.u64 %r49,0; add.u64 %r337,%frame,360; add.u64 %r338,%frame,240; $L8: .loc 1 124 16 ld.u64 %r204,[%r76+16]; add.u64 %r203,%r204,1; ld.u64 %r206,[%r76+8]; sub.u64 %r205,%r203,%r206; .loc 1 124 14 st.u64 [%r35],%r205; .loc 1 125 15 add.u64 %r208,%r337,%r49; ld.u64 %r209,[%r103]; st.u64 [%r208],%r209; .loc 1 126 15 add.u64 %r211,%r338,%r49; ld.u64 %r212,[%r76]; st.u64 [%r211],%r212; .loc 1 121 7 add.u64 %r76,%r76,24; add.u64 %r35,%r35,8; add.u64 %r103,%r103,24; add.u64 %r49,%r49,8; setp.ne.u64 %r213,%r49,%r156; @ %r213 bra $L8; .loc 1 162 12 ld.u64 %r161,[%frame+360]; .loc 1 163 12 ld.u64 %r97,[%frame+240]; $L34: .loc 1 129 41 sub.u32 %r214,%r114,%r110; .loc 1 129 11 cvt.s64.s32 %r98,%r214; .loc 1 117 15 mov.u64 %r94,1; .loc 1 116 15 mov.u64 %r54,%r94; bra $L9; $L4: .loc 1 135 12 cvt.s64.s32 %r153,%r110; add.u64 %r106,%r163,40; add.u64 %r102,%r162,40; mov.u64 %r101,0; mov.u64 %r99,%r101; mov.u64 %r123,%r101; mov.u64 %r94,1; mov.u64 %r54,%r94; .loc 1 147 17 mov.u64 %r346,%r101; $L14: .loc 1 143 14 ld.u64 %r216,[%r106+16]; add.u64 %r215,%r216,1; .loc 1 143 12 ld.u64 %r217,[%r106+8]; sub.u64 %r152,%r215,%r217; .loc 1 137 16 ld.u64 %r154,[%r102]; .loc 1 140 16 ld.u64 %r143,[%r106]; .loc 1 135 7 setp.ne.u64 %r218,%r153,%r123; @ %r218 bra $L10; .loc 1 138 11 setp.ne.u64 %r219,%r154,0; .loc 1 139 11 selp.u64 %r54,%r154,1,%r219; .loc 1 141 11 setp.ne.u64 %r220,%r143,0; .loc 1 142 11 selp.u64 %r94,%r143,1,%r220; .loc 1 143 12 mov.u64 %r99,%r152; bra $L13; $L10: .loc 1 147 17 shl.b64 %r221,%r101,3; add.u64 %r222,%frame,%r221; add.u64 %r223,%r222,120; st.u64 [%r223],%r346; .loc 1 148 18 st.u64 [%r222],%r152; .loc 1 149 19 st.u64 [%r222+360],%r154; .loc 1 150 19 st.u64 [%r222+240],%r143; .loc 1 151 9 add.u64 %r101,%r101,1; $L13: .loc 1 133 59 add.u64 %r123,%r123,1; .loc 1 133 7 add.u64 %r106,%r106,24; add.u64 %r102,%r102,24; setp.lt.s64 %r233,%r123,%r98; @ %r233 bra $L14; .loc 1 154 18 ld.u64 %r97,[%frame+240]; .loc 1 156 18 ld.u64 %r161,[%frame+360]; .loc 1 154 10 setp.ne.u64 %r234,%r97,0; .loc 1 155 13 selp.u64 %r97,%r97,1,%r234; .loc 1 156 10 setp.eq.u64 %r235,%r161,0; selp.u64 %r161,1,%r161,%r235; $L9: .loc 1 164 8 ld.u64 %r138,[%r162]; .loc 1 165 8 ld.u64 %r139,[%r163]; .loc 1 168 13 shr.u64 %r237,%r164,63; .loc 1 168 26 set.u32.ge.s64 %r239,%r164,%r99; neg.s32 %r240,%r239; .loc 1 168 17 cvt.u16.u64 %r243,%r237; cvt.u16.u32 
%r244,%r240; or.b16 %r242,%r243,%r244; .loc 1 168 6 cvt.u32.u16 %r245,%r242; cvt.u16.u8 %r246,%r245; setp.eq.u16 %r247,%r246,0; @ %r247 bra $L18; .loc 1 170 28 setp.ne.u64 %r248,%r99,0; @ %r248 bra $L17; $L6: mov.u64 %r164,0; bra $L18; $L17: rem.s64 %r164,%r164,%r99; .loc 1 171 10 setp.ge.s64 %r251,%r164,0; @ %r251 bra $L18; .loc 1 172 8 add.u64 %r164,%r164,%r99; bra $L18; $L32: shl.b64 %r160,%r94,3; .loc 1 193 36 mul.lo.u64 %r62,%r164,%r160; .loc 1 198 13 shl.b64 %r66,%r54,3; .loc 1 183 11 shl.b64 %r134,%r164,3; .loc 1 184 23 sub.u64 %r145,%r99,%r164; .loc 1 184 11 shl.b64 %r135,%r145,3; .loc 1 210 12 shl.b64 %r71,%r161,3; .loc 1 211 12 shl.b64 %r73,%r97,3; .loc 1 181 35 set.u32.eq.u64 %r253,%r54,1; neg.s32 %r254,%r253; .loc 1 181 19 set.u32.eq.u64 %r256,%r94,1; neg.s32 %r257,%r256; .loc 1 181 24.loc 1 214 32 ld.u64 %r100,[%frame]; mul.lo.u64 %r158,%r66,%r145; .loc 1 221 30 mul.lo.u64 %r137,%r161,%r100; .loc 1 222 30 mul.lo.u64 %r262,%r97,%r100; .loc 1 222 16 shl.b64 %r263,%r262,3; neg.s64 %r155,%r263; shl.b64 %r264,%r98,3; add.u64 %r108,%r264,-8; setp.eq.u32 %r340,%r105,0; setp.gt.s64 %r339,%r98,2; .loc 1 195 4 setp.le.s64 %r342,%r145,0; .loc 1 201 4 setp.gt.s64 %r343,%r164,0; add.u64 %r344,%frame,360; add.u64 %r345,%frame,240; $L35: .loc 1 181 10 @ %r340 bra $L19; .loc 1 185 23 add.u64 %r266,%r139,%r134; .loc 1 185 413r26613273,[%value_in]; } .loc 1 186 17 add.u64 %r275,%r138,%r135; .loc 1 186 4342,[%value_in]; } bra $L20; $L19: .loc 1 193 25 add.u64 %r132,%r139,%r62; .loc 1 195 4 @ %r342 bra $L39; mov.u64 %r131,%r138; .loc 1 195 11 mov.u64 %r133,0; $L22: .loc 1 197 16 ld.f32 %r63,[%r132]; ld.f32 %r64,[%r132+4]; .loc 1 197 14 st.f32 [%r131],%r63; st.f32 [%r131+4],%r64; .loc 1 198 13 add.u64 %r131,%r131,%r66; .loc 1 199 12 add.u64 %r132,%r132,%r160; .loc 1 195 34 add.u64 %r133,%r133,1; .loc 1 195 4 setp.ne.u64 %r285,%r133,%r145; @ %r285 bra $L22; .loc 1 198 13 add.u64 %r128,%r138,%r158; bra $L21; $L39: .loc 1 195 4 mov.u64 %r128,%r138; $L21: .loc 1 201 4 @ ! %r343 bra $L20; mov.u64 %r129,%r139; .loc 1 201 23 mov.u64 %r130,0; $L23: .loc 1 203 16 ld.f32 %r67,[%r129]; ld.f32 %r68,[%r129+4]; .loc 1 203 14 st.f32 [%r128],%r67; st.f32 [%r128+4],%r68; .loc 1 204 13 add.u64 %r128,%r128,%r66; .loc 1 205 12 add.u64 %r129,%r129,%r160; .loc 1 201 40 add.u64 %r130,%r130,1; .loc 1 201 4 setp.ne.u64 %r287,%r130,%r164; @ %r287 bra $L23; $L20: .loc 1 210 12 add.u64 %r138,%r138,%r71; .loc 1 211 12 add.u64 %r139,%r139,%r73; .loc 1 212 15 ld.u64 %r288,[%frame+120]; add.u64 %r75,%r288,1; .loc 1 214 13 setp.eq.u64 %r289,%r75,%r100; @ %r289 bra $L25; .loc 1 212 15 st.u64 [%frame+120],%r75; bra $L35; $L25: .loc 1 218 20 mov.u64 %r290,0; st.u64 [%frame+120],%r290; .loc 1 222 16 add.u64 %r136,%r139,%r155; .loc 1 224 14 @ %r339 bra $L27; bra $L1; $L31: .loc 1 218 20 st.u64 [%r125],%r290; .loc 1 221 30 mul.lo.u64 %r78,%r86,%r85; .loc 1 222 30 mul.lo.u64 %r293,%r88,%r85; .loc 1 222 16 shl.b64 %r294,%r293,3; sub.u64 %r136,%r139,%r294; .loc 1 224 14 add.u64 %r125,%r125,8; add.u64 %r150,%r150,8; setp.eq.u64 %r295,%r108,%r150; @ ! 
%r295 bra $L30; bra $L1; $L27: add.u64 %r125,%frame,128; .loc 1 221 30 mov.u64 %r78,%r137; .loc 1 224 14 mov.u64 %r150,8; $L30: .loc 1 232 23 ld.u64 %r297,[%r125]; add.u64 %r85,%r297,1; st.u64 [%r125],%r85; .loc 1 233 30 add.u64 %r299,%r344,%r150; ld.u64 %r86,[%r299]; .loc 1 233 20 sub.u64 %r300,%r86,%r78; shl.b64 %r301,%r300,3; add.u64 %r138,%r138,%r301; .loc 1 234 30 add.u64 %r303,%r345,%r150; ld.u64 %r88,[%r303]; .loc 1 234 20 shl.b64 %r304,%r88,3; add.u64 %r139,%r136,%r304; .loc 1 214 32 add.u64 %r305,%frame,%r150; ld.u64 %r92,[%r305]; .loc 1 214 13 setp.eq.u64 %r306,%r85,%r92; @ %r306 bra $L31; bra $L35; $L18: .loc 1 175 9 setp.ne.u64 %r307,%r138,0; @ %r307 bra $L32; bra $L1; $L3: .loc 1 114 18 mov.u64 %r308,1; st.u64 [%frame+240],%r308; .loc 1 115 18 st.u64 [%frame+360],%r308; .loc 1 118 13 cvt.s64.s32 %r310,%r110; add.u64 %r312,%r310,%r310; add.u64 %r313,%r312,%r310; shl.b64 %r314,%r313,3; add.u64 %r315,%r163,%r314; ld.u64 %r30,[%r315+40]; .loc 1 119 4 ld.u64 %r325,[%r315+56]; ld.u64 %r334,[%r315+48]; sub.u64 %r333,%r325,%r334; .loc 1 118 11 mad.lo.u64 %r99,%r333,%r30,%r30; .loc 1 120 13 mul.lo.u64 %r164,%r164,%r30; .loc 1 121 16 cvt.s64.s32 %r335,%r165; .loc 1 121 7 setp.gt.s64 %r336,%r98,%r335; @ %r336 bra $L33; mov.u64 %r97,%r308; mov.u64 %r161,%r97; bra $L34; $Lgfortrani_cshift0_c8 .visible .func _gfortrani_cshift0_.file 1 "../../../libgfortran/generated/cshift0_c8.c"F: _gfortrani_cshift0_c8 .visible .func _gfortrani_cshift0_16 .b8 %frame_ar[480pred %r213; .reg .u32predu64 %r223; .reg .predu32 %r240; .reg .u16predpredu64 %r312; .reg .u64u64 %r325; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .predmov.u64 %r162,%ar0; mov.u64 %r163,%ar1; mov.u64 %r164,%ar2; mov.u32 %r165,%ar3; .loc 1 57 9 add.u32 %r110,%r165,-1; .loc 1 58 14 mov.u64 %r166,0; st.u64 [%frame+240],%r166; .loc 1 59 14 st.u64 [%frame+360],%r166; .loc 1 61 13 mov.u64 %r168,1; st.u64 [%frame],%r168; .loc 1 62 12 st.u64 [%frame+120],%r166; .loc 1 76 13 ld.s8 %r114,[%r163+28]; .loc 1 72 6 setp.le.s32 %r170,%r110,0; @ %r170 bra $L2; .loc 1 76 11 cvt.u32.u32 %r171,%r114; cvt.s64.s8 %r98,%r171; .loc 1 77 7 setp.le.s64 %r172,%r98,0; @ %r172 bra $L3; mov.u64 %r53,40; .loc 1 70 8 mov.u64 %r119,%r168; .loc 1 69 8 mov.u64 %r118,%r119; .loc 1 77 14 mov.u64 %r120,%r166; $L7: add.u64 %r127,%r162,%r53; .loc 1 81 7 ld.u64 %r173,[%r127]; setp.ne.u64 %r174,%r173,%r118; @ ! 
%r174 bra $L49; bra $L4; $L2: .loc 1 133 27 cvt.u32.u32 %r175,%r114; cvt.s64.s8 %r98,%r175; .loc 1 133 7 setp.gt.s64 %r176,%r98,0; @ %r176 bra $L4; .loc 1 164 8 ld.u64 %r138,[%r162]; .loc 1 165 8 ld.u64 %r139,[%r163]; mov.u64 %r99,%r166; mov.u64 %r161,%r168; .loc 1 155 13 mov.u64 %r97,%r161; .loc 1 165 8 mov.u64 %r94,%r161; mov.u64 %r54,%r161; bra $L6; $L49: add.u64 %r69,%r163,%r53; .loc 1 87 7 ld.u64 %r177,[%r69]; setp.ne.u64 %r178,%r177,%r119; @ %r178 bra $L4; .loc 1 92 12 ld.u64 %r180,[%r127+16]; ld.u64 %r182,[%r127+8]; sub.u64 %r181,%r180,%r182; .loc 1 92 9 mad.lo.u64 %r118,%r181,%r118,%r118; .loc 1 93 12 ld.u64 %r184,[%r69+16]; ld.u64 %r186,[%r69+8]; sub.u64 %r185,%r184,%r186; .loc 1 93 9 mad.lo.u64 %r119,%r185,%r119,%r119; .loc 1 77 30 add.u64 %r120,%r120,1; .loc 1 77 7 add.u64 %r53,%r53,24; setp.ne.u64 %r187,%r98,%r120; @ %r187 bra $L7; bra $L3; $L33: .loc 1 123 13 cvt.u32.u32 %r189,%r114; cvt.s64.s8 %r188,%r189; sub.u64 %r190,%r188,%r335; shl.b64 %r156,%r190,3; add.u64 %r191,%frame,120; mov.u32 %r1add.u64 %r200,%r335,%r335; add.u64 %r201,%r200,%r335; shl.b64 %r202,%r201,3; add.u64 %r44,%r202,40; add.u64 %r76,%r163,%r44; mov.u64 %r35,%frame; add.u64 %r103,%r162,%r44; mov.u64 %r49,0; add.u64 %r337,%frame,360; add.u64 %r338,%frame,240; $L8: .loc 1 124 16 ld.u64 %r204,[%r76+16]; add.u64 %r203,%r204,1; ld.u64 %r206,[%r76+8]; sub.u64 %r205,%r203,%r206; .loc 1 124 14 st.u64 [%r35],%r205; .loc 1 125 15 add.u64 %r208,%r337,%r49; ld.u64 %r209,[%r103]; st.u64 [%r208],%r209; .loc 1 126 15 add.u64 %r211,%r338,%r49; ld.u64 %r212,[%r76]; st.u64 [%r211],%r212; .loc 1 121 7 add.u64 %r76,%r76,24; add.u64 %r35,%r35,8; add.u64 %r103,%r103,24; add.u64 %r49,%r49,8; setp.ne.u64 %r213,%r49,%r156; @ %r213 bra $L8; .loc 1 162 12 ld.u64 %r161,[%frame+360]; .loc 1 163 12 ld.u64 %r97,[%frame+240]; $L34: .loc 1 129 41 sub.u32 %r214,%r114,%r110; .loc 1 129 11 cvt.s64.s32 %r98,%r214; .loc 1 117 15 mov.u64 %r94,1; .loc 1 116 15 mov.u64 %r54,%r94; bra $L9; $L4: .loc 1 135 12 cvt.s64.s32 %r153,%r110; add.u64 %r106,%r163,40; add.u64 %r102,%r162,40; mov.u64 %r101,0; mov.u64 %r99,%r101; mov.u64 %r123,%r101; mov.u64 %r94,1; mov.u64 %r54,%r94; .loc 1 147 17 mov.u64 %r346,%r101; $L14: .loc 1 143 14 ld.u64 %r216,[%r106+16]; add.u64 %r215,%r216,1; .loc 1 143 12 ld.u64 %r217,[%r106+8]; sub.u64 %r152,%r215,%r217; .loc 1 137 16 ld.u64 %r154,[%r102]; .loc 1 140 16 ld.u64 %r143,[%r106]; .loc 1 135 7 setp.ne.u64 %r218,%r153,%r123; @ %r218 bra $L10; .loc 1 138 11 setp.ne.u64 %r219,%r154,0; .loc 1 139 11 selp.u64 %r54,%r154,1,%r219; .loc 1 141 11 setp.ne.u64 %r220,%r143,0; .loc 1 142 11 selp.u64 %r94,%r143,1,%r220; .loc 1 143 12 mov.u64 %r99,%r152; bra $L13; $L10: .loc 1 147 17 shl.b64 %r221,%r101,3; add.u64 %r222,%frame,%r221; add.u64 %r223,%r222,120; st.u64 [%r223],%r346; .loc 1 148 18 st.u64 [%r222],%r152; .loc 1 149 19 st.u64 [%r222+360],%r154; .loc 1 150 19 st.u64 [%r222+240],%r143; .loc 1 151 9 add.u64 %r101,%r101,1; $L13: .loc 1 133 59 add.u64 %r123,%r123,1; .loc 1 133 7 add.u64 %r106,%r106,24; add.u64 %r102,%r102,24; setp.lt.s64 %r233,%r123,%r98; @ %r233 bra $L14; .loc 1 154 18 ld.u64 %r97,[%frame+240]; .loc 1 156 18 ld.u64 %r161,[%frame+360]; .loc 1 154 10 setp.ne.u64 %r234,%r97,0; .loc 1 155 13 selp.u64 %r97,%r97,1,%r234; .loc 1 156 10 setp.eq.u64 %r235,%r161,0; selp.u64 %r161,1,%r161,%r235; $L9: .loc 1 164 8 ld.u64 %r138,[%r162]; .loc 1 165 8 ld.u64 %r139,[%r163]; .loc 1 168 13 shr.u64 %r237,%r164,63; .loc 1 168 26 set.u32.ge.s64 %r239,%r164,%r99; neg.s32 %r240,%r239; .loc 1 168 17 cvt.u16.u64 %r243,%r237; cvt.u16.u32 
%r244,%r240; or.b16 %r242,%r243,%r244; .loc 1 168 6 cvt.u32.u16 %r245,%r242; cvt.u16.u8 %r246,%r245; setp.eq.u16 %r247,%r246,0; @ %r247 bra $L18; .loc 1 170 28 setp.ne.u64 %r248,%r99,0; @ %r248 bra $L17; $L6: mov.u64 %r164,0; bra $L18; $L17: rem.s64 %r164,%r164,%r99; .loc 1 171 10 setp.ge.s64 %r251,%r164,0; @ %r251 bra $L18; .loc 1 172 8 add.u64 %r164,%r164,%r99; bra $L18; $L32: shl.b64 %r160,%r94,4; .loc 1 193 36 mul.lo.u64 %r62,%r164,%r160; .loc 1 198 13 shl.b64 %r66,%r54,4; .loc 1 183 11 shl.b64 %r134,%r164,4; .loc 1 184 23 sub.u64 %r145,%r99,%r164; .loc 1 184 11 shl.b64 %r135,%r145,4; .loc 1 210 12 shl.b64 %r71,%r161,4; .loc 1 211 12 shl.b64 %r73,%r97,4; .loc 1 181 35 set.u32.eq.u64 %r253,%r54,1; neg.s32 %r254,%r253; .loc 1 181 19 set.u32.eq.u64 %r256,%r94,1; neg.s32 %r257,%r256; .loc 1 181 24.loc 1 214 32 ld.u64 %r100,[%frame]; mul.lo.u64 %r158,%r66,%r145; .loc 1 221 30 mul.lo.u64 %r137,%r161,%r100; .loc 1 222 30 mul.lo.u64 %r262,%r97,%r100; .loc 1 222 16 shl.b64 %r263,%r262,4; neg.s64 %r155,%r263; shl.b64 %r264,%r98,3; add.u64 %r108,%r264,-8; setp.eq.u32 %r340,%r105,0; setp.gt.s64 %r339,%r98,2; .loc 1 195 4 setp.le.s64 %r342,%r145,0; .loc 1 201 4 setp.gt.s64 %r343,%r164,0; add.u64 %r344,%frame,360; add.u64 %r345,%frame,240; $L35: .loc 1 181 10 @ %r340 bra $L19; .loc 1 185 23 add.u64 %r266,%r139,%r134; .loc 1 185 413r26613273,[%value_in]; } .loc 1 186 17 add.u64 %r275,%r138,%r135; .loc 1 186 4342,[%value_in]; } bra $L20; $L19: .loc 1 193 25 add.u64 %r132,%r139,%r62; .loc 1 195 4 @ %r342 bra $L39; mov.u64 %r131,%r138; .loc 1 195 11 mov.u64 %r133,0; $L22: .loc 1 197 16 ld.f64 %r63,[%r132]; ld.f64 %r64,[%r132+8]; .loc 1 197 14 st.f64 [%r131],%r63; st.f64 [%r131+8],%r64; .loc 1 198 13 add.u64 %r131,%r131,%r66; .loc 1 199 12 add.u64 %r132,%r132,%r160; .loc 1 195 34 add.u64 %r133,%r133,1; .loc 1 195 4 setp.ne.u64 %r285,%r133,%r145; @ %r285 bra $L22; .loc 1 198 13 add.u64 %r128,%r138,%r158; bra $L21; $L39: .loc 1 195 4 mov.u64 %r128,%r138; $L21: .loc 1 201 4 @ ! %r343 bra $L20; mov.u64 %r129,%r139; .loc 1 201 23 mov.u64 %r130,0; $L23: .loc 1 203 16 ld.f64 %r67,[%r129]; ld.f64 %r68,[%r129+8]; .loc 1 203 14 st.f64 [%r128],%r67; st.f64 [%r128+8],%r68; .loc 1 204 13 add.u64 %r128,%r128,%r66; .loc 1 205 12 add.u64 %r129,%r129,%r160; .loc 1 201 40 add.u64 %r130,%r130,1; .loc 1 201 4 setp.ne.u64 %r287,%r130,%r164; @ %r287 bra $L23; $L20: .loc 1 210 12 add.u64 %r138,%r138,%r71; .loc 1 211 12 add.u64 %r139,%r139,%r73; .loc 1 212 15 ld.u64 %r288,[%frame+120]; add.u64 %r75,%r288,1; .loc 1 214 13 setp.eq.u64 %r289,%r75,%r100; @ %r289 bra $L25; .loc 1 212 15 st.u64 [%frame+120],%r75; bra $L35; $L25: .loc 1 218 20 mov.u64 %r290,0; st.u64 [%frame+120],%r290; .loc 1 222 16 add.u64 %r136,%r139,%r155; .loc 1 224 14 @ %r339 bra $L27; bra $L1; $L31: .loc 1 218 20 st.u64 [%r125],%r290; .loc 1 221 30 mul.lo.u64 %r78,%r86,%r85; .loc 1 222 30 mul.lo.u64 %r293,%r88,%r85; .loc 1 222 16 shl.b64 %r294,%r293,4; sub.u64 %r136,%r139,%r294; .loc 1 224 14 add.u64 %r125,%r125,8; add.u64 %r150,%r150,8; setp.eq.u64 %r295,%r108,%r150; @ ! 
%r295 bra $L30; bra $L1; $L27: add.u64 %r125,%frame,128; .loc 1 221 30 mov.u64 %r78,%r137; .loc 1 224 14 mov.u64 %r150,8; $L30: .loc 1 232 23 ld.u64 %r297,[%r125]; add.u64 %r85,%r297,1; st.u64 [%r125],%r85; .loc 1 233 30 add.u64 %r299,%r344,%r150; ld.u64 %r86,[%r299]; .loc 1 233 20 sub.u64 %r300,%r86,%r78; shl.b64 %r301,%r300,4; add.u64 %r138,%r138,%r301; .loc 1 234 30 add.u64 %r303,%r345,%r150; ld.u64 %r88,[%r303]; .loc 1 234 20 shl.b64 %r304,%r88,4; add.u64 %r139,%r136,%r304; .loc 1 214 32 add.u64 %r305,%frame,%r150; ld.u64 %r92,[%r305]; .loc 1 214 13 setp.eq.u64 %r306,%r85,%r92; @ %r306 bra $L31; bra $L35; $L18: .loc 1 175 9 setp.ne.u64 %r307,%r138,0; @ %r307 bra $L32; bra $L1; $L3: .loc 1 114 18 mov.u64 %r308,1; st.u64 [%frame+240],%r308; .loc 1 115 18 st.u64 [%frame+360],%r308; .loc 1 118 13 cvt.s64.s32 %r310,%r110; add.u64 %r312,%r310,%r310; add.u64 %r313,%r312,%r310; shl.b64 %r314,%r313,3; add.u64 %r315,%r163,%r314; ld.u64 %r30,[%r315+40]; .loc 1 119 4 ld.u64 %r325,[%r315+56]; ld.u64 %r334,[%r315+48]; sub.u64 %r333,%r325,%r334; .loc 1 118 11 mad.lo.u64 %r99,%r333,%r30,%r30; .loc 1 120 13 mul.lo.u64 %r164,%r164,%r30; .loc 1 121 16 cvt.s64.s32 %r335,%r165; .loc 1 121 7 setp.gt.s64 %r336,%r98,%r335; @ %r336 bra $L33; mov.u64 %r97,%r308; mov.u64 %r161,%r97; bra $L34; $L1: .loc 1 240cshift1_4_i1.o/ 1622802209gfortrani_cshift1_4_i1 .visible .func _gfortrani_cshift1_4_fortran/generated/cshift1_4_i1.c"F: _gfortrani_cshift1_4_i1 .visible .func _gfortrani_cshift1_4_960predpredpredpred %r185; .reg .predu16 %r197; .reg .u16pred %r201; .reg .predpredpred %r229; .reg .predu64pred %r270; .reg .predmov.u64 %r136,%ar0; mov.u64 %r137,%ar1; mov.u64 %r138,%ar2; mov.u64 %r139,%ar3; .loc 1 68 6 setp.eq.u64 %r140,%r139,0; @ %r140 bra $L31; .loc 1 69 11 ld.u32 %r141,[%r139]; add.u32 %r88,%r141,-1; bra $L2; $L31: .loc 1 71 11 cvt.u32.u64 %r88,%r139; $L2: .loc 1 73 13 mov.u64 %r142,1; st.u64 [%frame+360],%r142; .loc 1 74 12 mov.u64 %r143,0; st.u64 [%frame+480],%r143; .loc 1 82 23 ld.s8 %r126,[%r137+28]; cvt.u32.u32 %r144,%r126; cvt.s64.s8 %r132,%r144; .loc 1 82 3 setp.le.s64 %r145,%r132,0; @ %r145 bra $L32; .loc 1 84 15 cvt.s64.s32 %r24,%r88; add.u64 %r74,%r137,40; add.u64 %r45,%r136,40; .loc 1 75 5 mov.u64 %r84,%r143; .loc 1 80 7 mov.u64 %r82,%r84; .loc 1 82 12 mov.u64 %r116,%r84; .loc 1 79 11 mov.u64 %r77,%r142; .loc 1 78 11 mov.u64 %r72,%r77; .loc 1 96 20 mov.u64 %r292,%r84; $L8: .loc 1 92 17 ld.u64 %r147,[%r74+16]; add.u64 %r146,%r147,1; .loc 1 92 15 ld.u64 %r148,[%r74+8]; sub.u64 %r128,%r146,%r148; .loc 1 86 19 ld.u64 %r71,[%r45]; .loc 1 89 19 ld.u64 %r76,[%r74]; .loc 1 84 10 setp.ne.u64 %r149,%r24,%r116; @ %r149 bra $L4; .loc 1 87 14 setp.ne.u64 %r150,%r71,0; .loc 1 90 14 setp.ne.u64 %r151,%r76,0; .loc 1 92 15 mov.u64 %r82,%r128; selp.u64 %r77,%r76,1,%r151; selp.u64 %r72,%r71,1,%r150; bra $L7; $L4: .loc 1 96 20 shl.b64 %r152,%r84,3; add.u64 %r153,%frame,%r152; add.u64 %r154,%r153,480; st.u64 [%r154],%r292; .loc 1 97 21 st.u64 [%r153+360],%r128; .loc 1 98 22 st.u64 [%r153+840],%r71; .loc 1 99 22 st.u64 [%r153+720],%r76; .loc 1 100 24 add.u64 %r166,%r84,%r84; add.u64 %r167,%r166,%r84; shl.b64 %r168,%r167,3; add.u64 %r169,%r138,%r168; ld.u64 %r23,[%r169+40]; .loc 1 100 22 st.u64 [%r153+600],%r23; .loc 1 101 26 mul.lo.u64 %r177,%r128,%r71; .loc 1 101 13 st.u64 [%r153+240],%r177; .loc 1 102 26 mul.lo.u64 %r181,%r76,%r128; .loc 1 102 13 st.u64 [%r153+120],%r181; .loc 1 103 26 mul.lo.u64 %r184,%r23,%r128; .loc 1 103 13 st.u64 [%r153],%r184; .loc 1 104 12 add.u64 %r84,%r84,1; $L7: .loc 1 82 55 add.u64 
%r116,%r116,1; .loc 1 82 3 add.u64 %r74,%r74,24; add.u64 %r45,%r45,24; setp.ne.u64 %r185,%r116,%r132; @ %r185 bra $L8; bra $L3; $L32: .loc 1 80 7 mov.u64 %r82,%r143; .loc 1 79 11 mov.u64 %r77,%r142; .loc 1 78 11 mov.u64 %r72,%r77; $L3: .loc 1 107 14 ld.u64 %r28,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r186,%r28,0; .loc 1 108 16 selp.u64 %r28,%r28,1,%r186; .loc 1 109 14 ld.u64 %r29,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r187,%r29,0; .loc 1 110 16 selp.u64 %r29,%r29,1,%r187; .loc 1 111 14 ld.u64 %r30,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r188,%r30,0; .loc 1 112 16 selp.u64 %r30,%r30,1,%r188; .loc 1 118 8 ld.u64 %r113,[%r136]; .loc 1 119 8 ld.u64 %r114,[%r137]; .loc 1 120 8 ld.u64 %r115,[%r138]; .loc 1 122 9 setp.eq.u64 %r189,%r113,0; @ %r189 bra $L1; .loc 1 164 12 shl.b64 %r48,%r30,2; .loc 1 138 35 set.u32.eq.u64 %r191,%r72,1; neg.s32 %r192,%r191; .loc 1 138 19 set.u32.eq.u64 %r194,%r77,1; neg.s32 %r195,%r194; .loc 1 138 24 cvt.u16.u32 %r197,%r192; cvt.u16.u32 %r198,%r195; and.b16 %r196,%r197,%r198; cvt.u32.u16 %r199,%r196; cvt.u32.u8 %r95,%r199; .loc 1 167 32 ld.u64 %r75,[%frame+360]; setp.ne.u32 %r278,%r95,0; setp.le.s64 %r274,%r132,2; .loc 1 129 12 cvt.u32.u64 %r282,%r82; cvt.u32.u32 %r283,%r126; cvt.s64.s8 %r284,%r283; add.u64 %r285,%r284,-1; add.u64 %r286,%frame,840; add.u64 %r287,%frame,720; add.u64 %r288,%frame,600; add.u64 %r289,%frame,360; .loc 1 172 24 add.u64 %r290,%frame,240; .loc 1 173 24 add.u64 %r291,%frame,120; $L30: .loc 1 125 10 ld.u32 %r90,[%r115]; .loc 1 128 10 setp.ge.s32 %r200,%r90,0; @ %r200 bra $L13; .loc 1 129 12 add.u32 %r90,%r90,%r282; .loc 1 130 11 cvt.s64.s32 %r91,%r90; .loc 1 130 10 setp.ge.s64 %r201,%r91,%r82; @ %r201 bra $L14; setp.ge.s32 %r202,%r90,0; @ %r202 bra $L15; $L14: .loc 1 132 13 rem.s64 %r91,%r91,%r82; .loc 1 133 7 setp.lt.s64 %r205,%r91,0; @ %r205 bra $L16; .loc 1 132 8 cvt.u32.u64 %r90,%r91; bra $L15; $L16: .loc 1 134 16 cvt.u32.u64 %r206,%r91; add.u32 %r90,%r282,%r206; .loc 1 136 22 cvt.s64.s32 %r91,%r90; $L15: .loc 1 147 24 sub.u64 %r129,%r82,%r91; .loc 1 138 10 @ %r278 bra $L17; .loc 1 147 4 setp.gt.s64 %r208,%r129,0; @ %r208 bra $L18; mov.u64 %r135,%r113; bra $L19; $L17: .loc 1 140 11 cvt.s64.s32 %r108,%r90; .loc 1 142 23 add.u64 %r209,%r114,%r108; .loc 1 142 411r20943 17 add.u64 %r218,%r113,%r129; .loc 1 143 4r108225,[%value_in]; } bra $L20; $L18: .loc 1 136 11 mad.lo.u64 %r106,%r91,%r77,%r114; .loc 1 157 12 mov.u64 %r105,%r113; .loc 1 147 11 mov.u64 %r107,0; $L21: .loc 1 149 14 ld.u8 %r228,[%r106]; st.u8 [%r105],%r228; .loc 1 150 13 add.u64 %r105,%r105,%r72; .loc 1 151 12 add.u64 %r106,%r106,%r77; .loc 1 147 31 add.u64 %r107,%r107,1; .loc 1 147 4 setp.ne.u64 %r229,%r107,%r129; @ %r229 bra $L21; .loc 1 150 13 mad.lo.u64 %r135,%r107,%r72,%r113; $L19: .loc 1 153 4 setp.gt.s64 %r231,%r91,0; @ ! 
%r231 bra $L20; .loc 1 157 12 mov.u64 %r103,%r114; .loc 1 153 23 mov.u64 %r104,0; $L23: .loc 1 155 14 ld.u8 %r232,[%r103]; st.u8 [%r135],%r232; .loc 1 156 13 add.u64 %r135,%r135,%r72; .loc 1 157 12 add.u64 %r103,%r103,%r77; .loc 1 153 37 add.u64 %r104,%r104,1; .loc 1 153 4 setp.ne.u64 %r233,%r91,%r104; @ %r233 bra $L23; $L20: .loc 1 162 12 add.u64 %r113,%r113,%r29; .loc 1 163 12 add.u64 %r114,%r114,%r28; .loc 1 164 12 add.u64 %r115,%r115,%r48; .loc 1 165 15 ld.u64 %r234,[%frame+480]; add.u64 %r50,%r234,1; .loc 1 167 13 setp.eq.u64 %r235,%r50,%r75; @ %r235 bra $L25; .loc 1 165 15 st.u64 [%frame+480],%r50; bra $L30; $L25: .loc 1 171 20 mov.u64 %r236,0; st.u64 [%frame+480],%r236; .loc 1 172 24 ld.u64 %r52,[%frame+240]; .loc 1 173 16 ld.u64 %r237,[%frame+120]; sub.u64 %r110,%r114,%r237; .loc 1 174 9 ld.u64 %r239,[%frame]; shl.b64 %r238,%r239,2; sub.u64 %r111,%r115,%r238; .loc 1 176 14 @ %r274 bra $L1; add.u64 %r130,%frame,488; mov.u64 %r120,8; .loc 1 175 12 mov.u64 %r112,1; bra $L28; $L29: .loc 1 171 20 st.u64 [%r130],%r236; .loc 1 172 24 add.u64 %r246,%r290,%r120; ld.u64 %r52,[%r246]; .loc 1 173 24 add.u64 %r248,%r291,%r120; .loc 1 173 16 ld.u64 %r249,[%r248]; sub.u64 %r110,%r114,%r249; .loc 1 174 17 add.u64 %r250,%frame,%r120; .loc 1 174 9 ld.u64 %r252,[%r250]; shl.b64 %r251,%r252,2; sub.u64 %r111,%r115,%r251; .loc 1 175 12 add.u64 %r112,%r112,1; .loc 1 176 14 add.u64 %r130,%r130,8; add.u64 %r120,%r120,8; setp.eq.u64 %r253,%r285,%r112; @ %r253 bra $L1; $L28: .loc 1 184 23 ld.u64 %r254,[%r130]; add.u64 %r62,%r254,1; st.u64 [%r130],%r62; .loc 1 185 30 add.u64 %r256,%r286,%r120; .loc 1 185 20 ld.u64 %r258,[%r256]; sub.u64 %r257,%r258,%r52; add.u64 %r113,%r113,%r257; .loc 1 186 30 add.u64 %r260,%r287,%r120; .loc 1 186 20 ld.u64 %r261,[%r260]; add.u64 %r114,%r110,%r261; .loc 1 187 23 add.u64 %r263,%r288,%r120; .loc 1 187 13 ld.u64 %r265,[%r263]; shl.b64 %r264,%r265,2; add.u64 %r115,%r111,%r264; .loc 1 167 32 add.u64 %r267,%r289,%r120; .loc 1 167 13 ld.u64 %r268,[%r267]; setp.eq.u64 %r269,%r62,%r268; @ %r269 bra $L29; bra $L30; $L13: .loc 1 130 11 cvt.s64.s32 %r91,%r90; .loc 1 130 10 setp.lt.s64 %r270,%r91,%r82; @ %r270 bra $L15; .loc 1 132 13 rem.s64 %r91,%r91,%r82; .loc 1 132 8 cvt.u32.u64 %r90,%r91; bra $L15; $L1: .loc 1 191 1 ret; } cshift1_4_i2.o/ 1622802209_gfortrani_cshift1_4_i2 .visible .func _gfortrani_cshift1_4_fortran/generated/cshift1_4_i2.c"F: _gfortrani_cshift1_4_i2 .visible .func _gfortrani_cshift1_4_9609532predpredpred %r215; .reg .predu64 %r244; .reg .u16 %r245; .reg .pred %r246; .reg .pred %r248; .reg .u16 %r249; .reg .pred %r250; .reg .u64 %r251; .reg .predu64 %r267; .reg .u6464pred %r295; .reg .pred64u64 %r313; mov.u64 %r145,%ar0; mov.u64 %r146,%ar1; mov.u64 %r147,%ar2; mov.u64 %r148,%ar3; .loc 1 68 6 setp.eq.u64 %r149,%r148,0; @ %r149 bra $L30; .loc 1 69 11 ld.u32 %r150,[%r148]; add.u32 %r94,%r150,-1; bra $L2; $L30: .loc 1 71 11 cvt.u32.u64 %r94,%r148; $L2: .loc 1 73 13 mov.u64 %r151,1; st.u64 [%frame+360],%r151; .loc 1 74 12 mov.u64 %r152,0; st.u64 [%frame+480],%r152; .loc 1 82 23 ld.s8 %r140,[%r146+28]; cvt.u32.u32 %r153,%r140; cvt.s64.s8 %r141,%r153; .loc 1 82 3 setp.le.s64 %r154,%r141,0; @ %r154 bra $L31; .loc 1 84 15 cvt.s64.s32 %r92,%r94; add.u64 %r93,%r146,40; add.u64 %r86,%r145,40; .loc 1 75 5 mov.u64 %r90,%r152; .loc 1 80 7 mov.u64 %r88,%r90; .loc 1 82 12 mov.u64 %r125,%r90; .loc 1 79 11 mov.u64 %r85,%r151; .loc 1 78 11 mov.u64 %r81,%r85; .loc 1 96 20 mov.u64 %r313,%r90; $L8: .loc 1 92 17 ld.u64 %r156,[%r93+16]; add.u64 %r155,%r156,1; .loc 1 92 15 ld.u64 
%r157,[%r93+8]; sub.u64 %r33,%r155,%r157; .loc 1 86 19 ld.u64 %r80,[%r86]; .loc 1 89 19 ld.u64 %r84,[%r93]; .loc 1 84 10 setp.ne.u64 %r158,%r92,%r125; @ %r158 bra $L4; .loc 1 87 14 setp.ne.u64 %r159,%r80,0; .loc 1 90 14 setp.ne.u64 %r160,%r84,0; .loc 1 92 15 mov.u64 %r88,%r33; selp.u64 %r85,%r84,1,%r160; selp.u64 %r81,%r80,1,%r159; bra $L7; $L4: .loc 1 96 20 shl.b64 %r161,%r90,3; add.u64 %r162,%frame,%r161; add.u64 %r163,%r162,480; st.u64 [%r163],%r313; .loc 1 97 21 st.u64 [%r162+360],%r33; .loc 1 98 22 st.u64 [%r162+840],%r80; .loc 1 99 22 st.u64 [%r162+720],%r84; .loc 1 100 24 add.u64 %r175,%r90,%r90; add.u64 %r176,%r175,%r90; shl.b64 %r177,%r176,3; add.u64 %r178,%r147,%r177; ld.u64 %r23,[%r178+40]; .loc 1 100 22 st.u64 [%r162+600],%r23; .loc 1 101 26 mul.lo.u64 %r186,%r33,%r80; .loc 1 101 13 st.u64 [%r162+240],%r186; .loc 1 102 26 mul.lo.u64 %r190,%r33,%r84; .loc 1 102 13 st.u64 [%r162+120],%r190; .loc 1 103 26 mul.lo.u64 %r193,%r23,%r33; .loc 1 103 13 st.u64 [%r162],%r193; .loc 1 104 12 add.u64 %r90,%r90,1; $L7: .loc 1 82 55 add.u64 %r125,%r125,1; .loc 1 82 3 add.u64 %r93,%r93,24; add.u64 %r86,%r86,24; setp.ne.u64 %r194,%r125,%r141; @ %r194 bra $L8; bra $L3; $L31: .loc 1 80 7 mov.u64 %r88,%r152; .loc 1 79 11 mov.u64 %r85,%r151; .loc 1 78 11 mov.u64 %r81,%r85; $L3: .loc 1 107 14 ld.u64 %r27,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r195,%r27,0; .loc 1 108 16 selp.u64 %r27,%r27,1,%r195; .loc 1 109 14 ld.u64 %r28,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r196,%r28,0; .loc 1 110 16 selp.u64 %r28,%r28,1,%r196; .loc 1 111 14 ld.u64 %r29,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r197,%r29,0; .loc 1 112 16 selp.u64 %r29,%r29,1,%r197; .loc 1 118 8 ld.u64 %r122,[%r145]; .loc 1 119 8 ld.u64 %r123,[%r146]; .loc 1 120 8 ld.u64 %r124,[%r147]; .loc 1 122 9 setp.eq.u64 %r198,%r122,0; @ %r198 bra $L1; .loc 1 150 13 add.u64 %r45,%r81,%r81; .loc 1 151 12 add.u64 %r47,%r85,%r85; .loc 1 162 12 add.u64 %r51,%r28,%r28; .loc 1 163 12 add.u64 %r53,%r27,%r27; .loc 1 164 12 shl.b64 %r55,%r29,2; .loc 1 138 19 set.u32.eq.u64 %r204,%r85,1; neg.s32 %r205,%r204; .loc 1 138 35 set.u32.eq.u64 %r207,%r81,1; neg.s32 %r208,%r207; .loc 1 138 24 cvt.u16.u32 %r210,%r205; cvt.u16.u32 %r211,%r208; and.b16 %r209,%r210,%r211; cvt.u32.u16 %r212,%r209; cvt.u32.u8 %r91,%r212; .loc 1 167 32 ld.u64 %r89,[%frame+360]; setp.ne.u32 %r295,%r91,0; setp.le.s64 %r297,%r141,2; .loc 1 129 12 cvt.u32.u64 %r303,%r88; cvt.u32.u32 %r304,%r140; cvt.s64.s8 %r305,%r304; add.u64 %r306,%r305,-1; add.u64 %r307,%frame,840; add.u64 %r308,%frame,720; add.u64 %r309,%frame,600; add.u64 %r310,%frame,360; .loc 1 172 24 add.u64 %r311,%frame,240; .loc 1 173 24 add.u64 %r312,%frame,120; $L29: .loc 1 125 10 ld.u32 %r95,[%r124]; .loc 1 128 10 setp.ge.s32 %r213,%r95,0; @ %r213 bra $L13; .loc 1 129 12 add.u32 %r95,%r95,%r303; .loc 1 130 11 cvt.s64.s32 %r132,%r95; .loc 1 130 10 setp.ge.s64 %r214,%r132,%r88; @ %r214 bra $L14; setp.ge.s32 %r215,%r95,0; @ %r215 bra $L15; $L14: .loc 1 132 13 rem.s64 %r132,%r132,%r88; .loc 1 133 7 setp.lt.s64 %r218,%r132,0; @ %r218 bra $L16; .loc 1 132 8 cvt.u32.u64 %r95,%r132; bra $L15; $L16: .loc 1 134 16 cvt.u32.u64 %r219,%r132; add.u32 %r95,%r303,%r219; .loc 1 136 22 cvt.s64.s32 %r132,%r95; $L15: .loc 1 147 24 sub.u64 %r138,%r88,%r132; .loc 1 138 10 @ %r295 bra $L17; .loc 1 147 4 setp.gt.s64 %r221,%r138,0; @ %r221 bra $L18; mov.u64 %r111,%r122; bra $L19; $L17: .loc 1 140 21 cvt.s64.s32 %r222,%r95; .loc 1 140 11 add.u64 %r223,%r222,%r222; .loc 1 141 11 add.u64 %r224,%r138,%r138; .loc 1 142 23 add.u64 %r225,%r123,%r223; .loc 1 142 
4224232,[%value_in]; } .loc 1 143 17 add.u64 %r234,%r122,%r224123bra $L20; $L18: .loc 1 136 22 mul.lo.u64 %r243,%r132,%r85; .loc 1 136 18 add.u64 %r244,%r243,%r243; .loc 1 136 11 add.u64 %r115,%r123,%r244; mov.u64 %r114,%r122; .loc 1 147 11 mov.u64 %r116,0; $L21: .loc 1 149 14 ld.u16 %r245,[%r115]; st.u16 [%r114],%r245; .loc 1 150 13 add.u64 %r114,%r114,%r45; .loc 1 151 12 add.u64 %r115,%r115,%r47; .loc 1 147 31 add.u64 %r116,%r116,1; .loc 1 147 4 setp.ne.u64 %r246,%r116,%r138; @ %r246 bra $L21; .loc 1 150 13 mad.lo.u64 %r111,%r116,%r45,%r122; $L19: .loc 1 153 4 setp.gt.s64 %r248,%r132,0; @ ! %r248 bra $L20; mov.u64 %r112,%r123; .loc 1 153 23 mov.u64 %r113,0; $L22: .loc 1 155 14 ld.u16 %r249,[%r112]; st.u16 [%r111],%r249; .loc 1 156 13 add.u64 %r111,%r111,%r45; .loc 1 157 12 add.u64 %r112,%r112,%r47; .loc 1 153 37 add.u64 %r113,%r113,1; .loc 1 153 4 setp.ne.u64 %r250,%r113,%r132; @ %r250 bra $L22; $L20: .loc 1 162 12 add.u64 %r122,%r122,%r51; .loc 1 163 12 add.u64 %r123,%r123,%r53; .loc 1 164 12 add.u64 %r124,%r124,%r55; .loc 1 165 15 ld.u64 %r251,[%frame+480]; add.u64 %r57,%r251,1; .loc 1 167 13 setp.eq.u64 %r252,%r57,%r89; @ %r252 bra $L24; .loc 1 165 15 st.u64 [%frame+480],%r57; bra $L29; $L24: .loc 1 171 20 mov.u64 %r253,0; st.u64 [%frame+480],%r253; .loc 1 172 24 ld.u64 %r59,[%frame+240]; .loc 1 173 16 ld.u64 %r254,[%frame+120]; add.u64 %r255,%r254,%r254; sub.u64 %r119,%r123,%r255; .loc 1 174 9 ld.u64 %r257,[%frame]; shl.b64 %r256,%r257,2; sub.u64 %r120,%r124,%r256; .loc 1 176 14 @ %r297 bra $L1; add.u64 %r142,%frame,488; mov.u64 %r135,8; .loc 1 175 12 mov.u64 %r121,1; bra $L27; $L28: .loc 1 171 20 st.u64 [%r142],%r253; .loc 1 172 24 add.u64 %r264,%r311,%r135; ld.u64 %r59,[%r264]; .loc 1 173 24 add.u64 %r266,%r312,%r135; .loc 1 173 16 ld.u64 %r267,[%r266]; add.u64 %r268,%r267,%r267; sub.u64 %r119,%r123,%r268; .loc 1 174 17 add.u64 %r269,%frame,%r135; .loc 1 174 9 ld.u64 %r271,[%r269]; shl.b64 %r270,%r271,2; sub.u64 %r120,%r124,%r270; .loc 1 175 12 add.u64 %r121,%r121,1; .loc 1 176 14 add.u64 %r142,%r142,8; add.u64 %r135,%r135,8; setp.eq.u64 %r272,%r306,%r121; @ %r272 bra $L1; $L27: .loc 1 184 23 ld.u64 %r273,[%r142]; add.u64 %r70,%r273,1; st.u64 [%r142],%r70; .loc 1 185 30 add.u64 %r275,%r307,%r135; .loc 1 185 20 ld.u64 %r277,[%r275]; sub.u64 %r276,%r277,%r59; add.u64 %r278,%r276,%r276; add.u64 %r122,%r122,%r278; .loc 1 186 30 add.u64 %r280,%r308,%r135; .loc 1 186 20 ld.u64 %r281,[%r280]; add.u64 %r282,%r281,%r281; add.u64 %r123,%r119,%r282; .loc 1 187 23 add.u64 %r284,%r309,%r135; .loc 1 187 13 ld.u64 %r286,[%r284]; shl.b64 %r285,%r286,2; add.u64 %r124,%r120,%r285; .loc 1 167 32 add.u64 %r288,%r310,%r135; .loc 1 167 13 ld.u64 %r289,[%r288]; setp.eq.u64 %r290,%r70,%r289; @ %r290 bra $L28; bra $L29; $L13: .loc 1 130 11 cvt.s64.s32 %r132,%r95; .loc 1 130 10 setp.lt.s64 %r291,%r132,%r88; @ %r291 bra $L15; .loc 1 132 13 rem.s64 %r132,%r132,%r88; .loc 1 132 8 cvt.u32.u64 %r95,%r132; bra $L15; $Lgfortrani_cshift1_4_i4 .visible .func _gfortrani_cshift1_4_fortran/generated/cshift1_4_i4.c"F: _gfortrani_cshift1_4_i4 .visible .func _gfortrani_cshift1_4_9609532predpredu16 %r206; .reg .u16 %r207; .reg .u32predpred %r239; .reg .predpredu64 %r249; .reg .u64 %r256; .reg .u64 %r258; .reg .u64u64 %r267; .reg .u64u64 %r278; .reg .u64pred %r287; .reg .pred %r290; .reg .u32 %r295; .reg .u32 %r296; .reg .u64u64 %r305; mov.u64 %r145,%ar0; mov.u64 %r146,%ar1; mov.u64 %r147,%ar2; mov.u64 %r148,%ar3; .loc 1 68 6 setp.eq.u64 %r149,%r148,0; @ %r149 bra $L30; .loc 1 69 11 ld.u32 %r150,[%r148]; add.u32 
%r94,%r150,-1; bra $L2; $L30: .loc 1 71 11 cvt.u32.u64 %r94,%r148; $L2: .loc 1 73 13 mov.u64 %r151,1; st.u64 [%frame+360],%r151; .loc 1 74 12 mov.u64 %r152,0; st.u64 [%frame+480],%r152; .loc 1 82 23 ld.s8 %r140,[%r146+28]; cvt.u32.u32 %r153,%r140; cvt.s64.s8 %r141,%r153; .loc 1 82 3 setp.le.s64 %r154,%r141,0; @ %r154 bra $L31; .loc 1 84 15 cvt.s64.s32 %r92,%r94; add.u64 %r93,%r146,40; add.u64 %r86,%r145,40; .loc 1 75 5 mov.u64 %r90,%r152; .loc 1 80 7 mov.u64 %r88,%r90; .loc 1 82 12 mov.u64 %r125,%r90; .loc 1 79 11 mov.u64 %r85,%r151; .loc 1 78 11 mov.u64 %r81,%r85; .loc 1 96 20 mov.u64 %r305,%r90; $L8: .loc 1 92 17 ld.u64 %r156,[%r93+16]; add.u64 %r155,%r156,1; .loc 1 92 15 ld.u64 %r157,[%r93+8]; sub.u64 %r33,%r155,%r157; .loc 1 86 19 ld.u64 %r80,[%r86]; .loc 1 89 19 ld.u64 %r84,[%r93]; .loc 1 84 10 setp.ne.u64 %r158,%r92,%r125; @ %r158 bra $L4; .loc 1 87 14 setp.ne.u64 %r159,%r80,0; .loc 1 90 14 setp.ne.u64 %r160,%r84,0; .loc 1 92 15 mov.u64 %r88,%r33; selp.u64 %r85,%r84,1,%r160; selp.u64 %r81,%r80,1,%r159; bra $L7; $L4: .loc 1 96 20 shl.b64 %r161,%r90,3; add.u64 %r162,%frame,%r161; add.u64 %r163,%r162,480; st.u64 [%r163],%r305; .loc 1 97 21 st.u64 [%r162+360],%r33; .loc 1 98 22 st.u64 [%r162+840],%r80; .loc 1 99 22 st.u64 [%r162+720],%r84; .loc 1 100 24 add.u64 %r175,%r90,%r90; add.u64 %r176,%r175,%r90; shl.b64 %r177,%r176,3; add.u64 %r178,%r147,%r177; ld.u64 %r23,[%r178+40]; .loc 1 100 22 st.u64 [%r162+600],%r23; .loc 1 101 26 mul.lo.u64 %r186,%r33,%r80; .loc 1 101 13 st.u64 [%r162+240],%r186; .loc 1 102 26 mul.lo.u64 %r190,%r33,%r84; .loc 1 102 13 st.u64 [%r162+120],%r190; .loc 1 103 26 mul.lo.u64 %r193,%r23,%r33; .loc 1 103 13 st.u64 [%r162],%r193; .loc 1 104 12 add.u64 %r90,%r90,1; $L7: .loc 1 82 55 add.u64 %r125,%r125,1; .loc 1 82 3 add.u64 %r93,%r93,24; add.u64 %r86,%r86,24; setp.ne.u64 %r194,%r125,%r141; @ %r194 bra $L8; bra $L3; $L31: .loc 1 80 7 mov.u64 %r88,%r152; .loc 1 79 11 mov.u64 %r85,%r151; .loc 1 78 11 mov.u64 %r81,%r85; $L3: .loc 1 107 14 ld.u64 %r27,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r195,%r27,0; .loc 1 108 16 selp.u64 %r27,%r27,1,%r195; .loc 1 109 14 ld.u64 %r28,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r196,%r28,0; .loc 1 110 16 selp.u64 %r28,%r28,1,%r196; .loc 1 111 14 ld.u64 %r29,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r197,%r29,0; .loc 1 112 16 selp.u64 %r29,%r29,1,%r197; .loc 1 118 8 ld.u64 %r122,[%r145]; .loc 1 119 8 ld.u64 %r123,[%r146]; .loc 1 120 8 ld.u64 %r124,[%r147]; .loc 1 122 9 setp.eq.u64 %r198,%r122,0; @ %r198 bra $L1; .loc 1 150 13 shl.b64 %r45,%r81,2; .loc 1 151 12 shl.b64 %r47,%r85,2; .loc 1 162 12 shl.b64 %r51,%r28,2; .loc 1 163 12 shl.b64 %r53,%r27,2; .loc 1 164 12 shl.b64 %r55,%r29,2; .loc 1 138 19 set.u32.eq.u64 %r200,%r85,1; neg.s32 %r201,%r200; .loc 1 138 35 set.u32.eq.u64 %r203,%r81,1; neg.s32 %r204,%r203; .loc 1 138 24 cvt.u16.u32 %r206,%r201; cvt.u16.u32 %r207,%r204; and.b16 %r205,%r206,%r207; cvt.u32.u16 %r208,%r205; cvt.u32.u8 %r91,%r208; .loc 1 167 32 ld.u64 %r89,[%frame+360]; setp.ne.u32 %r290,%r91,0; setp.le.s64 %r287,%r141,2; .loc 1 129 12 cvt.u32.u64 %r295,%r88; cvt.u32.u32 %r296,%r140; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; add.u64 %r299,%frame,840; add.u64 %r300,%frame,720; add.u64 %r301,%frame,600; add.u64 %r302,%frame,360; .loc 1 172 24 add.u64 %r303,%frame,240; .loc 1 173 24 add.u64 %r304,%frame,120; $L29: .loc 1 125 10 ld.u32 %r95,[%r124]; .loc 1 128 10 setp.ge.s32 %r209,%r95,0; @ %r209 bra $L13; .loc 1 129 12 add.u32 %r95,%r95,%r295; .loc 1 130 11 cvt.s64.s32 %r132,%r95; .loc 1 130 10 setp.ge.s64 %r210,%r132,%r88; 
@ %r210 bra $L14; setp.ge.s32 %r211,%r95,0; @ %r211 bra $L15; $L14: .loc 1 132 13 rem.s64 %r132,%r132,%r88; .loc 1 133 7 setp.lt.s64 %r214,%r132,0; @ %r214 bra $L16; .loc 1 132 8 cvt.u32.u64 %r95,%r132; bra $L15; $L16: .loc 1 134 16 cvt.u32.u64 %r215,%r132; add.u32 %r95,%r295,%r215; .loc 1 136 22 cvt.s64.s32 %r132,%r95; $L15: .loc 1 147 24 sub.u64 %r138,%r88,%r132; .loc 1 138 10 @ %r290 bra $L17; .loc 1 147 4 setp.gt.s64 %r217,%r138,0; @ %r217 bra $L18; mov.u64 %r111,%r122; bra $L19; $L17: .loc 1 140 21 cvt.s64.s32 %r218,%r95; .loc 1 140 11 shl.b64 %r117,%r218,2; .loc 1 141 11 shl.b64 %r118,%r138,2; .loc 1 142 23 add.u64 %r219,%r123,%r117; .loc 1 142 418226,[%value_in]; } .loc 1 143 17 add.u64 %r228,%r122,%r118; .loc 1 143 42r123117235,[%value_in]; } bra $L20; $L18: .loc 1 136 22 mul.lo.u64 %r237,%r132,%r85; .loc 1 136 18 shl.b64 %r238,%r237,2; .loc 1 136 11 add.u64 %r115,%r123,%r238; mov.u64 %r114,%r122; .loc 1 147 11 mov.u64 %r116,0; $L21: .loc 1 149 16 ld.u32 %r43,[%r115]; .loc 1 149 14 st.u32 [%r114],%r43; .loc 1 150 13 add.u64 %r114,%r114,%r45; .loc 1 151 12 add.u64 %r115,%r115,%r47; .loc 1 147 31 add.u64 %r116,%r116,1; .loc 1 147 4 setp.ne.u64 %r239,%r116,%r138; @ %r239 bra $L21; .loc 1 150 13 mad.lo.u64 %r111,%r116,%r45,%r122; $L19: .loc 1 153 4 setp.gt.s64 %r241,%r132,0; @ ! %r241 bra $L20; mov.u64 %r112,%r123; .loc 1 153 23 mov.u64 %r113,0; $L22: .loc 1 155 16 ld.u32 %r48,[%r112]; .loc 1 155 14 st.u32 [%r111],%r48; .loc 1 156 13 add.u64 %r111,%r111,%r45; .loc 1 157 12 add.u64 %r112,%r112,%r47; .loc 1 153 37 add.u64 %r113,%r113,1; .loc 1 153 4 setp.ne.u64 %r242,%r113,%r132; @ %r242 bra $L22; $L20: .loc 1 162 12 add.u64 %r122,%r122,%r51; .loc 1 163 12 add.u64 %r123,%r123,%r53; .loc 1 164 12 add.u64 %r124,%r124,%r55; .loc 1 165 15 ld.u64 %r243,[%frame+480]; add.u64 %r57,%r243,1; .loc 1 167 13 setp.eq.u64 %r244,%r57,%r89; @ %r244 bra $L24; .loc 1 165 15 st.u64 [%frame+480],%r57; bra $L29; $L24: .loc 1 171 20 mov.u64 %r245,0; st.u64 [%frame+480],%r245; .loc 1 172 24 ld.u64 %r59,[%frame+240]; .loc 1 173 16 ld.u64 %r247,[%frame+120]; shl.b64 %r246,%r247,2; sub.u64 %r119,%r123,%r246; .loc 1 174 9 ld.u64 %r249,[%frame]; shl.b64 %r248,%r249,2; sub.u64 %r120,%r124,%r248; .loc 1 176 14 @ %r287 bra $L1; add.u64 %r142,%frame,488; mov.u64 %r135,8; .loc 1 175 12 mov.u64 %r121,1; bra $L27; $L28: .loc 1 171 20 st.u64 [%r142],%r245; .loc 1 172 24 add.u64 %r256,%r303,%r135; ld.u64 %r59,[%r256]; .loc 1 173 24 add.u64 %r258,%r304,%r135; .loc 1 173 16 ld.u64 %r260,[%r258]; shl.b64 %r259,%r260,2; sub.u64 %r119,%r123,%r259; .loc 1 174 17 add.u64 %r261,%frame,%r135; .loc 1 174 9 ld.u64 %r263,[%r261]; shl.b64 %r262,%r263,2; sub.u64 %r120,%r124,%r262; .loc 1 175 12 add.u64 %r121,%r121,1; .loc 1 176 14 add.u64 %r142,%r142,8; add.u64 %r135,%r135,8; setp.eq.u64 %r264,%r298,%r121; @ %r264 bra $L1; $L27: .loc 1 184 23 ld.u64 %r265,[%r142]; add.u64 %r70,%r265,1; st.u64 [%r142],%r70; .loc 1 185 30 add.u64 %r267,%r299,%r135; .loc 1 185 20 ld.u64 %r269,[%r267]; sub.u64 %r268,%r269,%r59; shl.b64 %r270,%r268,2; add.u64 %r122,%r122,%r270; .loc 1 186 30 add.u64 %r272,%r300,%r135; .loc 1 186 20 ld.u64 %r274,[%r272]; shl.b64 %r273,%r274,2; add.u64 %r123,%r119,%r273; .loc 1 187 23 add.u64 %r276,%r301,%r135; .loc 1 187 13 ld.u64 %r278,[%r276]; shl.b64 %r277,%r278,2; add.u64 %r124,%r120,%r277; .loc 1 167 32 add.u64 %r280,%r302,%r135; .loc 1 167 13 ld.u64 %r281,[%r280]; setp.eq.u64 %r282,%r70,%r281; @ %r282 bra $L28; bra $L29; $L13: .loc 1 130 11 cvt.s64.s32 %r132,%r95; .loc 1 130 10 setp.lt.s64 %r283,%r132,%r88; @ %r283 bra 
$L15; .loc 1 132 13 rem.s64 %r132,%r132,%r88; .loc 1 132 8 cvt.u32.u64 %r95,%r132; bra $L15; $Lgfortrani_cshift1_4_i8 .visible .func _gfortrani_cshift1_4_fortran/generated/cshift1_4_i8.c"F: _gfortrani_cshift1_4_i8 .visible .func _gfortrani_cshift1_4_9609532predpredu16 %r206; .reg .u16 %r207; .reg .u32predpred %r239; .reg .predpredu64 %r249; .reg .u64 %r256; .reg .u64 %r258; .reg .u64u64 %r267; .reg .u64u64 %r278; .reg .u64pred %r287; .reg .pred %r290; .reg .u32 %r295; .reg .u32 %r296; .reg .u64u64 %r305; mov.u64 %r145,%ar0; mov.u64 %r146,%ar1; mov.u64 %r147,%ar2; mov.u64 %r148,%ar3; .loc 1 68 6 setp.eq.u64 %r149,%r148,0; @ %r149 bra $L30; .loc 1 69 11 ld.u32 %r150,[%r148]; add.u32 %r94,%r150,-1; bra $L2; $L30: .loc 1 71 11 cvt.u32.u64 %r94,%r148; $L2: .loc 1 73 13 mov.u64 %r151,1; st.u64 [%frame+360],%r151; .loc 1 74 12 mov.u64 %r152,0; st.u64 [%frame+480],%r152; .loc 1 82 23 ld.s8 %r140,[%r146+28]; cvt.u32.u32 %r153,%r140; cvt.s64.s8 %r141,%r153; .loc 1 82 3 setp.le.s64 %r154,%r141,0; @ %r154 bra $L31; .loc 1 84 15 cvt.s64.s32 %r92,%r94; add.u64 %r93,%r146,40; add.u64 %r86,%r145,40; .loc 1 75 5 mov.u64 %r90,%r152; .loc 1 80 7 mov.u64 %r88,%r90; .loc 1 82 12 mov.u64 %r125,%r90; .loc 1 79 11 mov.u64 %r85,%r151; .loc 1 78 11 mov.u64 %r81,%r85; .loc 1 96 20 mov.u64 %r305,%r90; $L8: .loc 1 92 17 ld.u64 %r156,[%r93+16]; add.u64 %r155,%r156,1; .loc 1 92 15 ld.u64 %r157,[%r93+8]; sub.u64 %r33,%r155,%r157; .loc 1 86 19 ld.u64 %r80,[%r86]; .loc 1 89 19 ld.u64 %r84,[%r93]; .loc 1 84 10 setp.ne.u64 %r158,%r92,%r125; @ %r158 bra $L4; .loc 1 87 14 setp.ne.u64 %r159,%r80,0; .loc 1 90 14 setp.ne.u64 %r160,%r84,0; .loc 1 92 15 mov.u64 %r88,%r33; selp.u64 %r85,%r84,1,%r160; selp.u64 %r81,%r80,1,%r159; bra $L7; $L4: .loc 1 96 20 shl.b64 %r161,%r90,3; add.u64 %r162,%frame,%r161; add.u64 %r163,%r162,480; st.u64 [%r163],%r305; .loc 1 97 21 st.u64 [%r162+360],%r33; .loc 1 98 22 st.u64 [%r162+840],%r80; .loc 1 99 22 st.u64 [%r162+720],%r84; .loc 1 100 24 add.u64 %r175,%r90,%r90; add.u64 %r176,%r175,%r90; shl.b64 %r177,%r176,3; add.u64 %r178,%r147,%r177; ld.u64 %r23,[%r178+40]; .loc 1 100 22 st.u64 [%r162+600],%r23; .loc 1 101 26 mul.lo.u64 %r186,%r33,%r80; .loc 1 101 13 st.u64 [%r162+240],%r186; .loc 1 102 26 mul.lo.u64 %r190,%r33,%r84; .loc 1 102 13 st.u64 [%r162+120],%r190; .loc 1 103 26 mul.lo.u64 %r193,%r23,%r33; .loc 1 103 13 st.u64 [%r162],%r193; .loc 1 104 12 add.u64 %r90,%r90,1; $L7: .loc 1 82 55 add.u64 %r125,%r125,1; .loc 1 82 3 add.u64 %r93,%r93,24; add.u64 %r86,%r86,24; setp.ne.u64 %r194,%r125,%r141; @ %r194 bra $L8; bra $L3; $L31: .loc 1 80 7 mov.u64 %r88,%r152; .loc 1 79 11 mov.u64 %r85,%r151; .loc 1 78 11 mov.u64 %r81,%r85; $L3: .loc 1 107 14 ld.u64 %r27,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r195,%r27,0; .loc 1 108 16 selp.u64 %r27,%r27,1,%r195; .loc 1 109 14 ld.u64 %r28,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r196,%r28,0; .loc 1 110 16 selp.u64 %r28,%r28,1,%r196; .loc 1 111 14 ld.u64 %r29,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r197,%r29,0; .loc 1 112 16 selp.u64 %r29,%r29,1,%r197; .loc 1 118 8 ld.u64 %r122,[%r145]; .loc 1 119 8 ld.u64 %r123,[%r146]; .loc 1 120 8 ld.u64 %r124,[%r147]; .loc 1 122 9 setp.eq.u64 %r198,%r122,0; @ %r198 bra $L1; .loc 1 150 13 shl.b64 %r45,%r81,3; .loc 1 151 12 shl.b64 %r47,%r85,3; .loc 1 162 12 shl.b64 %r51,%r28,3; .loc 1 163 12 shl.b64 %r53,%r27,3; .loc 1 164 12 shl.b64 %r55,%r29,2; .loc 1 138 19 set.u32.eq.u64 %r200,%r85,1; neg.s32 %r201,%r200; .loc 1 138 35 set.u32.eq.u64 %r203,%r81,1; neg.s32 %r204,%r203; .loc 1 138 24 cvt.u16.u32 %r206,%r201; cvt.u16.u32 
%r207,%r204; and.b16 %r205,%r206,%r207; cvt.u32.u16 %r208,%r205; cvt.u32.u8 %r91,%r208; .loc 1 167 32 ld.u64 %r89,[%frame+360]; setp.ne.u32 %r290,%r91,0; setp.le.s64 %r287,%r141,2; .loc 1 129 12 cvt.u32.u64 %r295,%r88; cvt.u32.u32 %r296,%r140; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; add.u64 %r299,%frame,840; add.u64 %r300,%frame,720; add.u64 %r301,%frame,600; add.u64 %r302,%frame,360; .loc 1 172 24 add.u64 %r303,%frame,240; .loc 1 173 24 add.u64 %r304,%frame,120; $L29: .loc 1 125 10 ld.u32 %r95,[%r124]; .loc 1 128 10 setp.ge.s32 %r209,%r95,0; @ %r209 bra $L13; .loc 1 129 12 add.u32 %r95,%r95,%r295; .loc 1 130 11 cvt.s64.s32 %r132,%r95; .loc 1 130 10 setp.ge.s64 %r210,%r132,%r88; @ %r210 bra $L14; setp.ge.s32 %r211,%r95,0; @ %r211 bra $L15; $L14: .loc 1 132 13 rem.s64 %r132,%r132,%r88; .loc 1 133 7 setp.lt.s64 %r214,%r132,0; @ %r214 bra $L16; .loc 1 132 8 cvt.u32.u64 %r95,%r132; bra $L15; $L16: .loc 1 134 16 cvt.u32.u64 %r215,%r132; add.u32 %r95,%r295,%r215; .loc 1 136 22 cvt.s64.s32 %r132,%r95; $L15: .loc 1 147 24 sub.u64 %r138,%r88,%r132; .loc 1 138 10 @ %r290 bra $L17; .loc 1 147 4 setp.gt.s64 %r217,%r138,0; @ %r217 bra $L18; mov.u64 %r111,%r122; bra $L19; $L17: .loc 1 140 21 cvt.s64.s32 %r218,%r95; .loc 1 140 11 shl.b64 %r117,%r218,3; .loc 1 141 11 shl.b64 %r118,%r138,3; .loc 1 142 23 add.u64 %r219,%r123,%r117; .loc 1 142 418226,[%value_in]; } .loc 1 143 17 add.u64 %r228,%r122,%r118; .loc 1 143 42r123117235,[%value_in]; } bra $L20; $L18: .loc 1 136 22 mul.lo.u64 %r237,%r132,%r85; .loc 1 136 18 shl.b64 %r238,%r237,3; .loc 1 136 11 add.u64 %r115,%r123,%r238; mov.u64 %r114,%r122; .loc 1 147 11 mov.u64 %r116,0; $L21: .loc 1 149 16 ld.u64 %r43,[%r115]; .loc 1 149 14 st.u64 [%r114],%r43; .loc 1 150 13 add.u64 %r114,%r114,%r45; .loc 1 151 12 add.u64 %r115,%r115,%r47; .loc 1 147 31 add.u64 %r116,%r116,1; .loc 1 147 4 setp.ne.u64 %r239,%r116,%r138; @ %r239 bra $L21; .loc 1 150 13 mad.lo.u64 %r111,%r116,%r45,%r122; $L19: .loc 1 153 4 setp.gt.s64 %r241,%r132,0; @ ! 
%r241 bra $L20; mov.u64 %r112,%r123; .loc 1 153 23 mov.u64 %r113,0; $L22: .loc 1 155 16 ld.u64 %r48,[%r112]; .loc 1 155 14 st.u64 [%r111],%r48; .loc 1 156 13 add.u64 %r111,%r111,%r45; .loc 1 157 12 add.u64 %r112,%r112,%r47; .loc 1 153 37 add.u64 %r113,%r113,1; .loc 1 153 4 setp.ne.u64 %r242,%r113,%r132; @ %r242 bra $L22; $L20: .loc 1 162 12 add.u64 %r122,%r122,%r51; .loc 1 163 12 add.u64 %r123,%r123,%r53; .loc 1 164 12 add.u64 %r124,%r124,%r55; .loc 1 165 15 ld.u64 %r243,[%frame+480]; add.u64 %r57,%r243,1; .loc 1 167 13 setp.eq.u64 %r244,%r57,%r89; @ %r244 bra $L24; .loc 1 165 15 st.u64 [%frame+480],%r57; bra $L29; $L24: .loc 1 171 20 mov.u64 %r245,0; st.u64 [%frame+480],%r245; .loc 1 172 24 ld.u64 %r59,[%frame+240]; .loc 1 173 16 ld.u64 %r247,[%frame+120]; shl.b64 %r246,%r247,3; sub.u64 %r119,%r123,%r246; .loc 1 174 9 ld.u64 %r249,[%frame]; shl.b64 %r248,%r249,2; sub.u64 %r120,%r124,%r248; .loc 1 176 14 @ %r287 bra $L1; add.u64 %r142,%frame,488; mov.u64 %r135,8; .loc 1 175 12 mov.u64 %r121,1; bra $L27; $L28: .loc 1 171 20 st.u64 [%r142],%r245; .loc 1 172 24 add.u64 %r256,%r303,%r135; ld.u64 %r59,[%r256]; .loc 1 173 24 add.u64 %r258,%r304,%r135; .loc 1 173 16 ld.u64 %r260,[%r258]; shl.b64 %r259,%r260,3; sub.u64 %r119,%r123,%r259; .loc 1 174 17 add.u64 %r261,%frame,%r135; .loc 1 174 9 ld.u64 %r263,[%r261]; shl.b64 %r262,%r263,2; sub.u64 %r120,%r124,%r262; .loc 1 175 12 add.u64 %r121,%r121,1; .loc 1 176 14 add.u64 %r142,%r142,8; add.u64 %r135,%r135,8; setp.eq.u64 %r264,%r298,%r121; @ %r264 bra $L1; $L27: .loc 1 184 23 ld.u64 %r265,[%r142]; add.u64 %r70,%r265,1; st.u64 [%r142],%r70; .loc 1 185 30 add.u64 %r267,%r299,%r135; .loc 1 185 20 ld.u64 %r269,[%r267]; sub.u64 %r268,%r269,%r59; shl.b64 %r270,%r268,3; add.u64 %r122,%r122,%r270; .loc 1 186 30 add.u64 %r272,%r300,%r135; .loc 1 186 20 ld.u64 %r274,[%r272]; shl.b64 %r273,%r274,3; add.u64 %r123,%r119,%r273; .loc 1 187 23 add.u64 %r276,%r301,%r135; .loc 1 187 13 ld.u64 %r278,[%r276]; shl.b64 %r277,%r278,2; add.u64 %r124,%r120,%r277; .loc 1 167 32 add.u64 %r280,%r302,%r135; .loc 1 167 13 ld.u64 %r281,[%r280]; setp.eq.u64 %r282,%r70,%r281; @ %r282 bra $L28; bra $L29; $L13: .loc 1 130 11 cvt.s64.s32 %r132,%r95; .loc 1 130 10 setp.lt.s64 %r283,%r132,%r88; @ %r283 bra $L15; .loc 1 132 13 rem.s64 %r132,%r132,%r88; .loc 1 132 8 cvt.u32.u64 %r95,%r132; bra $L15; $Lgfortrani_cshift1_4_i16 .visible .func _gfortrani_cshift1_4_i16fortran/generated/cshift1_4_i16.c"F: _gfortrani_cshift1_4_i16 .visible .func _gfortrani_cshift1_4_i169609532predpredu16 %r206; .reg .u16 %r207; .reg .u32predpred %r241; .reg .pred %r243; .reg .predu64 %r251; .reg .u64 %r252; .reg .u64u64u64predu64 %r312; .reg .u64 %r313; mov.u64 %r145,%ar0; mov.u64 %r146,%ar1; mov.u64 %r147,%ar2; mov.u64 %r148,%ar3; .loc 1 68 6 setp.eq.u64 %r149,%r148,0; @ %r149 bra $L30; .loc 1 69 11 ld.u32 %r150,[%r148]; add.u32 %r94,%r150,-1; bra $L2; $L30: .loc 1 71 11 cvt.u32.u64 %r94,%r148; $L2: .loc 1 73 13 mov.u64 %r151,1; st.u64 [%frame+360],%r151; .loc 1 74 12 mov.u64 %r152,0; st.u64 [%frame+480],%r152; .loc 1 82 23 ld.s8 %r140,[%r146+28]; cvt.u32.u32 %r153,%r140; cvt.s64.s8 %r141,%r153; .loc 1 82 3 setp.le.s64 %r154,%r141,0; @ %r154 bra $L31; .loc 1 84 15 cvt.s64.s32 %r92,%r94; add.u64 %r93,%r146,40; add.u64 %r86,%r145,40; .loc 1 75 5 mov.u64 %r90,%r152; .loc 1 80 7 mov.u64 %r88,%r90; .loc 1 82 12 mov.u64 %r125,%r90; .loc 1 79 11 mov.u64 %r85,%r151; .loc 1 78 11 mov.u64 %r81,%r85; .loc 1 96 20 mov.u64 %r313,%r90; $L8: .loc 1 92 17 ld.u64 %r156,[%r93+16]; add.u64 %r155,%r156,1; .loc 1 92 15 ld.u64 
%r157,[%r93+8]; sub.u64 %r33,%r155,%r157; .loc 1 86 19 ld.u64 %r80,[%r86]; .loc 1 89 19 ld.u64 %r84,[%r93]; .loc 1 84 10 setp.ne.u64 %r158,%r92,%r125; @ %r158 bra $L4; .loc 1 87 14 setp.ne.u64 %r159,%r80,0; .loc 1 90 14 setp.ne.u64 %r160,%r84,0; .loc 1 92 15 mov.u64 %r88,%r33; selp.u64 %r85,%r84,1,%r160; selp.u64 %r81,%r80,1,%r159; bra $L7; $L4: .loc 1 96 20 shl.b64 %r161,%r90,3; add.u64 %r162,%frame,%r161; add.u64 %r163,%r162,480; st.u64 [%r163],%r313; .loc 1 97 21 st.u64 [%r162+360],%r33; .loc 1 98 22 st.u64 [%r162+840],%r80; .loc 1 99 22 st.u64 [%r162+720],%r84; .loc 1 100 24 add.u64 %r175,%r90,%r90; add.u64 %r176,%r175,%r90; shl.b64 %r177,%r176,3; add.u64 %r178,%r147,%r177; ld.u64 %r23,[%r178+40]; .loc 1 100 22 st.u64 [%r162+600],%r23; .loc 1 101 26 mul.lo.u64 %r186,%r33,%r80; .loc 1 101 13 st.u64 [%r162+240],%r186; .loc 1 102 26 mul.lo.u64 %r190,%r33,%r84; .loc 1 102 13 st.u64 [%r162+120],%r190; .loc 1 103 26 mul.lo.u64 %r193,%r23,%r33; .loc 1 103 13 st.u64 [%r162],%r193; .loc 1 104 12 add.u64 %r90,%r90,1; $L7: .loc 1 82 55 add.u64 %r125,%r125,1; .loc 1 82 3 add.u64 %r93,%r93,24; add.u64 %r86,%r86,24; setp.ne.u64 %r194,%r125,%r141; @ %r194 bra $L8; bra $L3; $L31: .loc 1 80 7 mov.u64 %r88,%r152; .loc 1 79 11 mov.u64 %r85,%r151; .loc 1 78 11 mov.u64 %r81,%r85; $L3: .loc 1 107 14 ld.u64 %r27,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r195,%r27,0; .loc 1 108 16 selp.u64 %r27,%r27,1,%r195; .loc 1 109 14 ld.u64 %r28,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r196,%r28,0; .loc 1 110 16 selp.u64 %r28,%r28,1,%r196; .loc 1 111 14 ld.u64 %r29,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r197,%r29,0; .loc 1 112 16 selp.u64 %r29,%r29,1,%r197; .loc 1 118 8 ld.u64 %r122,[%r145]; .loc 1 119 8 ld.u64 %r123,[%r146]; .loc 1 120 8 ld.u64 %r124,[%r147]; .loc 1 122 9 setp.eq.u64 %r198,%r122,0; @ %r198 bra $L1; .loc 1 150 13 shl.b64 %r45,%r81,4; .loc 1 151 12 shl.b64 %r47,%r85,4; .loc 1 162 12 shl.b64 %r51,%r28,4; .loc 1 163 12 shl.b64 %r53,%r27,4; .loc 1 164 12 shl.b64 %r55,%r29,2; .loc 1 138 19 set.u32.eq.u64 %r200,%r85,1; neg.s32 %r201,%r200; .loc 1 138 35 set.u32.eq.u64 %r203,%r81,1; neg.s32 %r204,%r203; .loc 1 138 24 cvt.u16.u32 %r206,%r201; cvt.u16.u32 %r207,%r204; and.b16 %r205,%r206,%r207; cvt.u32.u16 %r208,%r205; cvt.u32.u8 %r91,%r208; .loc 1 167 32 ld.u64 %r89,[%frame+360]; setp.ne.u32 %r296,%r91,0; setp.le.s64 %r298,%r141,2; .loc 1 129 12 cvt.u32.u64 %r303,%r88; cvt.u32.u32 %r304,%r140; cvt.s64.s8 %r305,%r304; add.u64 %r306,%r305,-1; add.u64 %r307,%frame,840; add.u64 %r308,%frame,720; add.u64 %r309,%frame,600; add.u64 %r310,%frame,360; .loc 1 172 24 add.u64 %r311,%frame,240; .loc 1 173 24 add.u64 %r312,%frame,120; $L29: .loc 1 125 10 ld.u32 %r95,[%r124]; .loc 1 128 10 setp.ge.s32 %r209,%r95,0; @ %r209 bra $L13; .loc 1 129 12 add.u32 %r95,%r95,%r303; .loc 1 130 11 cvt.s64.s32 %r132,%r95; .loc 1 130 10 setp.ge.s64 %r210,%r132,%r88; @ %r210 bra $L14; setp.ge.s32 %r211,%r95,0; @ %r211 bra $L15; $L14: .loc 1 132 13 rem.s64 %r132,%r132,%r88; .loc 1 133 7 setp.lt.s64 %r214,%r132,0; @ %r214 bra $L16; .loc 1 132 8 cvt.u32.u64 %r95,%r132; bra $L15; $L16: .loc 1 134 16 cvt.u32.u64 %r215,%r132; add.u32 %r95,%r303,%r215; .loc 1 136 22 cvt.s64.s32 %r132,%r95; $L15: .loc 1 147 24 sub.u64 %r138,%r88,%r132; .loc 1 138 10 @ %r296 bra $L17; .loc 1 147 4 setp.gt.s64 %r217,%r138,0; @ %r217 bra $L18; mov.u64 %r111,%r122; bra $L19; $L17: .loc 1 140 21 cvt.s64.s32 %r218,%r95; .loc 1 140 11 shl.b64 %r117,%r218,4; .loc 1 141 11 shl.b64 %r118,%r138,4; .loc 1 142 23 add.u64 %r219,%r123,%r117; .loc 1 142 418226,[%value_in]; } .loc 1 
143 17 add.u64 %r228,%r122,%r118; .loc 1 143 42r123117235,[%value_in]; } bra $L20; $L18: .loc 1 136 22 mul.lo.u64 %r237,%r132,%r85; .loc 1 136 18 shl.b64 %r238,%r237,4; .loc 1 136 11 add.u64 %r115,%r123,%r238; mov.u64 %r114,%r122; .loc 1 147 11 mov.u64 %r116,0; $L21: .loc 1 149 16 ld.u64 %r290,[%r115]; ld.u64 %r291,[%r115+8]; .loc 1 149 14 st.u64 [%r114],%r290; st.u64 [%r114+8],%r291; .loc 1 150 13 add.u64 %r114,%r114,%r45; .loc 1 151 12 add.u64 %r115,%r115,%r47; .loc 1 147 31 add.u64 %r116,%r116,1; .loc 1 147 4 setp.ne.u64 %r241,%r116,%r138; @ %r241 bra $L21; .loc 1 150 13 mad.lo.u64 %r111,%r116,%r45,%r122; $L19: .loc 1 153 4 setp.gt.s64 %r243,%r132,0; @ ! %r243 bra $L20; mov.u64 %r112,%r123; .loc 1 153 23 mov.u64 %r113,0; $L22: .loc 1 155 16 ld.u64 %r292,[%r112]; ld.u64 %r293,[%r112+8]; .loc 1 155 14 st.u64 [%r111],%r292; st.u64 [%r111+8],%r293; .loc 1 156 13 add.u64 %r111,%r111,%r45; .loc 1 157 12 add.u64 %r112,%r112,%r47; .loc 1 153 37 add.u64 %r113,%r113,1; .loc 1 153 4 setp.ne.u64 %r246,%r113,%r132; @ %r246 bra $L22; $L20: .loc 1 162 12 add.u64 %r122,%r122,%r51; .loc 1 163 12 add.u64 %r123,%r123,%r53; .loc 1 164 12 add.u64 %r124,%r124,%r55; .loc 1 165 15 ld.u64 %r247,[%frame+480]; add.u64 %r57,%r247,1; .loc 1 167 13 setp.eq.u64 %r248,%r57,%r89; @ %r248 bra $L24; .loc 1 165 15 st.u64 [%frame+480],%r57; bra $L29; $L24: .loc 1 171 20 mov.u64 %r249,0; st.u64 [%frame+480],%r249; .loc 1 172 24 ld.u64 %r59,[%frame+240]; .loc 1 173 16 ld.u64 %r251,[%frame+120]; shl.b64 %r250,%r251,4; sub.u64 %r119,%r123,%r250; .loc 1 174 9 ld.u64 %r253,[%frame]; shl.b64 %r252,%r253,2; sub.u64 %r120,%r124,%r252; .loc 1 176 14 @ %r298 bra $L1; add.u64 %r142,%frame,488; mov.u64 %r135,8; .loc 1 175 12 mov.u64 %r121,1; bra $L27; $L28: .loc 1 171 20 st.u64 [%r142],%r249; .loc 1 172 24 add.u64 %r260,%r311,%r135; ld.u64 %r59,[%r260]; .loc 1 173 24 add.u64 %r262,%r312,%r135; .loc 1 173 16 ld.u64 %r264,[%r262]; shl.b64 %r263,%r264,4; sub.u64 %r119,%r123,%r263; .loc 1 174 17 add.u64 %r265,%frame,%r135; .loc 1 174 9 ld.u64 %r267,[%r265]; shl.b64 %r266,%r267,2; sub.u64 %r120,%r124,%r266; .loc 1 175 12 add.u64 %r121,%r121,1; .loc 1 176 14 add.u64 %r142,%r142,8; add.u64 %r135,%r135,8; setp.eq.u64 %r268,%r306,%r121; @ %r268 bra $L1; $L27: .loc 1 184 23 ld.u64 %r269,[%r142]; add.u64 %r70,%r269,1; st.u64 [%r142],%r70; .loc 1 185 30 add.u64 %r271,%r307,%r135; .loc 1 185 20 ld.u64 %r273,[%r271]; sub.u64 %r272,%r273,%r59; shl.b64 %r274,%r272,4; add.u64 %r122,%r122,%r274; .loc 1 186 30 add.u64 %r276,%r308,%r135; .loc 1 186 20 ld.u64 %r278,[%r276]; shl.b64 %r277,%r278,4; add.u64 %r123,%r119,%r277; .loc 1 187 23 add.u64 %r280,%r309,%r135; .loc 1 187 13 ld.u64 %r282,[%r280]; shl.b64 %r281,%r282,2; add.u64 %r124,%r120,%r281; .loc 1 167 32 add.u64 %r284,%r310,%r135; .loc 1 167 13 ld.u64 %r285,[%r284]; setp.eq.u64 %r286,%r70,%r285; @ %r286 bra $L28; bra $L29; $L13: .loc 1 130 11 cvt.s64.s32 %r132,%r95; .loc 1 130 10 setp.lt.s64 %r287,%r132,%r88; @ %r287 bra $L15; .loc 1 132 13 rem.s64 %r132,%r132,%r88; .loc 1 132 8 cvt.u32.u64 %r95,%r132; bra $L15; $L1: .loc 1 191 1 ret; } cshift1_4_r4.o/ 1622802209gfortrani_cshift1_4_r4 .visible .func _gfortrani_cshift1_4_rfortran/generated/cshift1_4_r4.c"F: _gfortrani_cshift1_4_r4 .visible .func _gfortrani_cshift1_4_r9609532predpredu16 %r206; .reg .u16 %r207; .reg .u32predpred %r239; .reg .predpredu64 %r249; .reg .u64 %r256; .reg .u64 %r258; .reg .u64u64 %r267; .reg .u64u64 %r278; .reg .u64pred %r287; .reg .pred %r290; .reg .u32 %r295; .reg .u32 %r296; .reg .u64u64 %r305; mov.u64 %r145,%ar0; mov.u64 
%r146,%ar1; mov.u64 %r147,%ar2; mov.u64 %r148,%ar3; .loc 1 68 6 setp.eq.u64 %r149,%r148,0; @ %r149 bra $L30; .loc 1 69 11 ld.u32 %r150,[%r148]; add.u32 %r94,%r150,-1; bra $L2; $L30: .loc 1 71 11 cvt.u32.u64 %r94,%r148; $L2: .loc 1 73 13 mov.u64 %r151,1; st.u64 [%frame+360],%r151; .loc 1 74 12 mov.u64 %r152,0; st.u64 [%frame+480],%r152; .loc 1 82 23 ld.s8 %r140,[%r146+28]; cvt.u32.u32 %r153,%r140; cvt.s64.s8 %r141,%r153; .loc 1 82 3 setp.le.s64 %r154,%r141,0; @ %r154 bra $L31; .loc 1 84 15 cvt.s64.s32 %r92,%r94; add.u64 %r93,%r146,40; add.u64 %r86,%r145,40; .loc 1 75 5 mov.u64 %r90,%r152; .loc 1 80 7 mov.u64 %r88,%r90; .loc 1 82 12 mov.u64 %r125,%r90; .loc 1 79 11 mov.u64 %r85,%r151; .loc 1 78 11 mov.u64 %r81,%r85; .loc 1 96 20 mov.u64 %r305,%r90; $L8: .loc 1 92 17 ld.u64 %r156,[%r93+16]; add.u64 %r155,%r156,1; .loc 1 92 15 ld.u64 %r157,[%r93+8]; sub.u64 %r33,%r155,%r157; .loc 1 86 19 ld.u64 %r80,[%r86]; .loc 1 89 19 ld.u64 %r84,[%r93]; .loc 1 84 10 setp.ne.u64 %r158,%r92,%r125; @ %r158 bra $L4; .loc 1 87 14 setp.ne.u64 %r159,%r80,0; .loc 1 90 14 setp.ne.u64 %r160,%r84,0; .loc 1 92 15 mov.u64 %r88,%r33; selp.u64 %r85,%r84,1,%r160; selp.u64 %r81,%r80,1,%r159; bra $L7; $L4: .loc 1 96 20 shl.b64 %r161,%r90,3; add.u64 %r162,%frame,%r161; add.u64 %r163,%r162,480; st.u64 [%r163],%r305; .loc 1 97 21 st.u64 [%r162+360],%r33; .loc 1 98 22 st.u64 [%r162+840],%r80; .loc 1 99 22 st.u64 [%r162+720],%r84; .loc 1 100 24 add.u64 %r175,%r90,%r90; add.u64 %r176,%r175,%r90; shl.b64 %r177,%r176,3; add.u64 %r178,%r147,%r177; ld.u64 %r23,[%r178+40]; .loc 1 100 22 st.u64 [%r162+600],%r23; .loc 1 101 26 mul.lo.u64 %r186,%r33,%r80; .loc 1 101 13 st.u64 [%r162+240],%r186; .loc 1 102 26 mul.lo.u64 %r190,%r33,%r84; .loc 1 102 13 st.u64 [%r162+120],%r190; .loc 1 103 26 mul.lo.u64 %r193,%r23,%r33; .loc 1 103 13 st.u64 [%r162],%r193; .loc 1 104 12 add.u64 %r90,%r90,1; $L7: .loc 1 82 55 add.u64 %r125,%r125,1; .loc 1 82 3 add.u64 %r93,%r93,24; add.u64 %r86,%r86,24; setp.ne.u64 %r194,%r125,%r141; @ %r194 bra $L8; bra $L3; $L31: .loc 1 80 7 mov.u64 %r88,%r152; .loc 1 79 11 mov.u64 %r85,%r151; .loc 1 78 11 mov.u64 %r81,%r85; $L3: .loc 1 107 14 ld.u64 %r27,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r195,%r27,0; .loc 1 108 16 selp.u64 %r27,%r27,1,%r195; .loc 1 109 14 ld.u64 %r28,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r196,%r28,0; .loc 1 110 16 selp.u64 %r28,%r28,1,%r196; .loc 1 111 14 ld.u64 %r29,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r197,%r29,0; .loc 1 112 16 selp.u64 %r29,%r29,1,%r197; .loc 1 118 8 ld.u64 %r122,[%r145]; .loc 1 119 8 ld.u64 %r123,[%r146]; .loc 1 120 8 ld.u64 %r124,[%r147]; .loc 1 122 9 setp.eq.u64 %r198,%r122,0; @ %r198 bra $L1; .loc 1 150 13 shl.b64 %r45,%r81,2; .loc 1 151 12 shl.b64 %r47,%r85,2; .loc 1 162 12 shl.b64 %r51,%r28,2; .loc 1 163 12 shl.b64 %r53,%r27,2; .loc 1 164 12 shl.b64 %r55,%r29,2; .loc 1 138 19 set.u32.eq.u64 %r200,%r85,1; neg.s32 %r201,%r200; .loc 1 138 35 set.u32.eq.u64 %r203,%r81,1; neg.s32 %r204,%r203; .loc 1 138 24 cvt.u16.u32 %r206,%r201; cvt.u16.u32 %r207,%r204; and.b16 %r205,%r206,%r207; cvt.u32.u16 %r208,%r205; cvt.u32.u8 %r91,%r208; .loc 1 167 32 ld.u64 %r89,[%frame+360]; setp.ne.u32 %r290,%r91,0; setp.le.s64 %r287,%r141,2; .loc 1 129 12 cvt.u32.u64 %r295,%r88; cvt.u32.u32 %r296,%r140; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; add.u64 %r299,%frame,840; add.u64 %r300,%frame,720; add.u64 %r301,%frame,600; add.u64 %r302,%frame,360; .loc 1 172 24 add.u64 %r303,%frame,240; .loc 1 173 24 add.u64 %r304,%frame,120; $L29: .loc 1 125 10 ld.u32 %r95,[%r124]; .loc 1 128 10 setp.ge.s32 
%r209,%r95,0; @ %r209 bra $L13; .loc 1 129 12 add.u32 %r95,%r95,%r295; .loc 1 130 11 cvt.s64.s32 %r132,%r95; .loc 1 130 10 setp.ge.s64 %r210,%r132,%r88; @ %r210 bra $L14; setp.ge.s32 %r211,%r95,0; @ %r211 bra $L15; $L14: .loc 1 132 13 rem.s64 %r132,%r132,%r88; .loc 1 133 7 setp.lt.s64 %r214,%r132,0; @ %r214 bra $L16; .loc 1 132 8 cvt.u32.u64 %r95,%r132; bra $L15; $L16: .loc 1 134 16 cvt.u32.u64 %r215,%r132; add.u32 %r95,%r295,%r215; .loc 1 136 22 cvt.s64.s32 %r132,%r95; $L15: .loc 1 147 24 sub.u64 %r138,%r88,%r132; .loc 1 138 10 @ %r290 bra $L17; .loc 1 147 4 setp.gt.s64 %r217,%r138,0; @ %r217 bra $L18; mov.u64 %r111,%r122; bra $L19; $L17: .loc 1 140 21 cvt.s64.s32 %r218,%r95; .loc 1 140 11 shl.b64 %r117,%r218,2; .loc 1 141 11 shl.b64 %r118,%r138,2; .loc 1 142 23 add.u64 %r219,%r123,%r117; .loc 1 142 418226,[%value_in]; } .loc 1 143 17 add.u64 %r228,%r122,%r118; .loc 1 143 42r123117235,[%value_in]; } bra $L20; $L18: .loc 1 136 22 mul.lo.u64 %r237,%r132,%r85; .loc 1 136 18 shl.b64 %r238,%r237,2; .loc 1 136 11 add.u64 %r115,%r123,%r238; mov.u64 %r114,%r122; .loc 1 147 11 mov.u64 %r116,0; $L21: .loc 1 149 16 ld.f32 %r43,[%r115]; .loc 1 149 14 st.f32 [%r114],%r43; .loc 1 150 13 add.u64 %r114,%r114,%r45; .loc 1 151 12 add.u64 %r115,%r115,%r47; .loc 1 147 31 add.u64 %r116,%r116,1; .loc 1 147 4 setp.ne.u64 %r239,%r116,%r138; @ %r239 bra $L21; .loc 1 150 13 mad.lo.u64 %r111,%r116,%r45,%r122; $L19: .loc 1 153 4 setp.gt.s64 %r241,%r132,0; @ ! %r241 bra $L20; mov.u64 %r112,%r123; .loc 1 153 23 mov.u64 %r113,0; $L22: .loc 1 155 16 ld.f32 %r48,[%r112]; .loc 1 155 14 st.f32 [%r111],%r48; .loc 1 156 13 add.u64 %r111,%r111,%r45; .loc 1 157 12 add.u64 %r112,%r112,%r47; .loc 1 153 37 add.u64 %r113,%r113,1; .loc 1 153 4 setp.ne.u64 %r242,%r113,%r132; @ %r242 bra $L22; $L20: .loc 1 162 12 add.u64 %r122,%r122,%r51; .loc 1 163 12 add.u64 %r123,%r123,%r53; .loc 1 164 12 add.u64 %r124,%r124,%r55; .loc 1 165 15 ld.u64 %r243,[%frame+480]; add.u64 %r57,%r243,1; .loc 1 167 13 setp.eq.u64 %r244,%r57,%r89; @ %r244 bra $L24; .loc 1 165 15 st.u64 [%frame+480],%r57; bra $L29; $L24: .loc 1 171 20 mov.u64 %r245,0; st.u64 [%frame+480],%r245; .loc 1 172 24 ld.u64 %r59,[%frame+240]; .loc 1 173 16 ld.u64 %r247,[%frame+120]; shl.b64 %r246,%r247,2; sub.u64 %r119,%r123,%r246; .loc 1 174 9 ld.u64 %r249,[%frame]; shl.b64 %r248,%r249,2; sub.u64 %r120,%r124,%r248; .loc 1 176 14 @ %r287 bra $L1; add.u64 %r142,%frame,488; mov.u64 %r135,8; .loc 1 175 12 mov.u64 %r121,1; bra $L27; $L28: .loc 1 171 20 st.u64 [%r142],%r245; .loc 1 172 24 add.u64 %r256,%r303,%r135; ld.u64 %r59,[%r256]; .loc 1 173 24 add.u64 %r258,%r304,%r135; .loc 1 173 16 ld.u64 %r260,[%r258]; shl.b64 %r259,%r260,2; sub.u64 %r119,%r123,%r259; .loc 1 174 17 add.u64 %r261,%frame,%r135; .loc 1 174 9 ld.u64 %r263,[%r261]; shl.b64 %r262,%r263,2; sub.u64 %r120,%r124,%r262; .loc 1 175 12 add.u64 %r121,%r121,1; .loc 1 176 14 add.u64 %r142,%r142,8; add.u64 %r135,%r135,8; setp.eq.u64 %r264,%r298,%r121; @ %r264 bra $L1; $L27: .loc 1 184 23 ld.u64 %r265,[%r142]; add.u64 %r70,%r265,1; st.u64 [%r142],%r70; .loc 1 185 30 add.u64 %r267,%r299,%r135; .loc 1 185 20 ld.u64 %r269,[%r267]; sub.u64 %r268,%r269,%r59; shl.b64 %r270,%r268,2; add.u64 %r122,%r122,%r270; .loc 1 186 30 add.u64 %r272,%r300,%r135; .loc 1 186 20 ld.u64 %r274,[%r272]; shl.b64 %r273,%r274,2; add.u64 %r123,%r119,%r273; .loc 1 187 23 add.u64 %r276,%r301,%r135; .loc 1 187 13 ld.u64 %r278,[%r276]; shl.b64 %r277,%r278,2; add.u64 %r124,%r120,%r277; .loc 1 167 32 add.u64 %r280,%r302,%r135; .loc 1 167 13 ld.u64 %r281,[%r280]; 
setp.eq.u64 %r282,%r70,%r281; @ %r282 bra $L28; bra $L29; $L13: .loc 1 130 11 cvt.s64.s32 %r132,%r95; .loc 1 130 10 setp.lt.s64 %r283,%r132,%r88; @ %r283 bra $L15; .loc 1 132 13 rem.s64 %r132,%r132,%r88; .loc 1 132 8 cvt.u32.u64 %r95,%r132; bra $L15; $Lgfortrani_cshift1_4_r8 .visible .func _gfortrani_cshift1_4_r8fortran/generated/cshift1_4_r8.c"F: _gfortrani_cshift1_4_r8 .visible .func _gfortrani_cshift1_4_r89609532predpredu16 %r206; .reg .u16 %r207; .reg .u32predpred %r239; .reg .predpredu64 %r249; .reg .u64 %r256; .reg .u64 %r258; .reg .u64u64 %r267; .reg .u64u64 %r278; .reg .u64pred %r287; .reg .pred %r290; .reg .u32 %r295; .reg .u32 %r296; .reg .u64u64 %r305; mov.u64 %r145,%ar0; mov.u64 %r146,%ar1; mov.u64 %r147,%ar2; mov.u64 %r148,%ar3; .loc 1 68 6 setp.eq.u64 %r149,%r148,0; @ %r149 bra $L30; .loc 1 69 11 ld.u32 %r150,[%r148]; add.u32 %r94,%r150,-1; bra $L2; $L30: .loc 1 71 11 cvt.u32.u64 %r94,%r148; $L2: .loc 1 73 13 mov.u64 %r151,1; st.u64 [%frame+360],%r151; .loc 1 74 12 mov.u64 %r152,0; st.u64 [%frame+480],%r152; .loc 1 82 23 ld.s8 %r140,[%r146+28]; cvt.u32.u32 %r153,%r140; cvt.s64.s8 %r141,%r153; .loc 1 82 3 setp.le.s64 %r154,%r141,0; @ %r154 bra $L31; .loc 1 84 15 cvt.s64.s32 %r92,%r94; add.u64 %r93,%r146,40; add.u64 %r86,%r145,40; .loc 1 75 5 mov.u64 %r90,%r152; .loc 1 80 7 mov.u64 %r88,%r90; .loc 1 82 12 mov.u64 %r125,%r90; .loc 1 79 11 mov.u64 %r85,%r151; .loc 1 78 11 mov.u64 %r81,%r85; .loc 1 96 20 mov.u64 %r305,%r90; $L8: .loc 1 92 17 ld.u64 %r156,[%r93+16]; add.u64 %r155,%r156,1; .loc 1 92 15 ld.u64 %r157,[%r93+8]; sub.u64 %r33,%r155,%r157; .loc 1 86 19 ld.u64 %r80,[%r86]; .loc 1 89 19 ld.u64 %r84,[%r93]; .loc 1 84 10 setp.ne.u64 %r158,%r92,%r125; @ %r158 bra $L4; .loc 1 87 14 setp.ne.u64 %r159,%r80,0; .loc 1 90 14 setp.ne.u64 %r160,%r84,0; .loc 1 92 15 mov.u64 %r88,%r33; selp.u64 %r85,%r84,1,%r160; selp.u64 %r81,%r80,1,%r159; bra $L7; $L4: .loc 1 96 20 shl.b64 %r161,%r90,3; add.u64 %r162,%frame,%r161; add.u64 %r163,%r162,480; st.u64 [%r163],%r305; .loc 1 97 21 st.u64 [%r162+360],%r33; .loc 1 98 22 st.u64 [%r162+840],%r80; .loc 1 99 22 st.u64 [%r162+720],%r84; .loc 1 100 24 add.u64 %r175,%r90,%r90; add.u64 %r176,%r175,%r90; shl.b64 %r177,%r176,3; add.u64 %r178,%r147,%r177; ld.u64 %r23,[%r178+40]; .loc 1 100 22 st.u64 [%r162+600],%r23; .loc 1 101 26 mul.lo.u64 %r186,%r33,%r80; .loc 1 101 13 st.u64 [%r162+240],%r186; .loc 1 102 26 mul.lo.u64 %r190,%r33,%r84; .loc 1 102 13 st.u64 [%r162+120],%r190; .loc 1 103 26 mul.lo.u64 %r193,%r23,%r33; .loc 1 103 13 st.u64 [%r162],%r193; .loc 1 104 12 add.u64 %r90,%r90,1; $L7: .loc 1 82 55 add.u64 %r125,%r125,1; .loc 1 82 3 add.u64 %r93,%r93,24; add.u64 %r86,%r86,24; setp.ne.u64 %r194,%r125,%r141; @ %r194 bra $L8; bra $L3; $L31: .loc 1 80 7 mov.u64 %r88,%r152; .loc 1 79 11 mov.u64 %r85,%r151; .loc 1 78 11 mov.u64 %r81,%r85; $L3: .loc 1 107 14 ld.u64 %r27,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r195,%r27,0; .loc 1 108 16 selp.u64 %r27,%r27,1,%r195; .loc 1 109 14 ld.u64 %r28,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r196,%r28,0; .loc 1 110 16 selp.u64 %r28,%r28,1,%r196; .loc 1 111 14 ld.u64 %r29,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r197,%r29,0; .loc 1 112 16 selp.u64 %r29,%r29,1,%r197; .loc 1 118 8 ld.u64 %r122,[%r145]; .loc 1 119 8 ld.u64 %r123,[%r146]; .loc 1 120 8 ld.u64 %r124,[%r147]; .loc 1 122 9 setp.eq.u64 %r198,%r122,0; @ %r198 bra $L1; .loc 1 150 13 shl.b64 %r45,%r81,3; .loc 1 151 12 shl.b64 %r47,%r85,3; .loc 1 162 12 shl.b64 %r51,%r28,3; .loc 1 163 12 shl.b64 %r53,%r27,3; .loc 1 164 12 shl.b64 %r55,%r29,2; .loc 1 138 19 
set.u32.eq.u64 %r200,%r85,1; neg.s32 %r201,%r200; .loc 1 138 35 set.u32.eq.u64 %r203,%r81,1; neg.s32 %r204,%r203; .loc 1 138 24 cvt.u16.u32 %r206,%r201; cvt.u16.u32 %r207,%r204; and.b16 %r205,%r206,%r207; cvt.u32.u16 %r208,%r205; cvt.u32.u8 %r91,%r208; .loc 1 167 32 ld.u64 %r89,[%frame+360]; setp.ne.u32 %r290,%r91,0; setp.le.s64 %r287,%r141,2; .loc 1 129 12 cvt.u32.u64 %r295,%r88; cvt.u32.u32 %r296,%r140; cvt.s64.s8 %r297,%r296; add.u64 %r298,%r297,-1; add.u64 %r299,%frame,840; add.u64 %r300,%frame,720; add.u64 %r301,%frame,600; add.u64 %r302,%frame,360; .loc 1 172 24 add.u64 %r303,%frame,240; .loc 1 173 24 add.u64 %r304,%frame,120; $L29: .loc 1 125 10 ld.u32 %r95,[%r124]; .loc 1 128 10 setp.ge.s32 %r209,%r95,0; @ %r209 bra $L13; .loc 1 129 12 add.u32 %r95,%r95,%r295; .loc 1 130 11 cvt.s64.s32 %r132,%r95; .loc 1 130 10 setp.ge.s64 %r210,%r132,%r88; @ %r210 bra $L14; setp.ge.s32 %r211,%r95,0; @ %r211 bra $L15; $L14: .loc 1 132 13 rem.s64 %r132,%r132,%r88; .loc 1 133 7 setp.lt.s64 %r214,%r132,0; @ %r214 bra $L16; .loc 1 132 8 cvt.u32.u64 %r95,%r132; bra $L15; $L16: .loc 1 134 16 cvt.u32.u64 %r215,%r132; add.u32 %r95,%r295,%r215; .loc 1 136 22 cvt.s64.s32 %r132,%r95; $L15: .loc 1 147 24 sub.u64 %r138,%r88,%r132; .loc 1 138 10 @ %r290 bra $L17; .loc 1 147 4 setp.gt.s64 %r217,%r138,0; @ %r217 bra $L18; mov.u64 %r111,%r122; bra $L19; $L17: .loc 1 140 21 cvt.s64.s32 %r218,%r95; .loc 1 140 11 shl.b64 %r117,%r218,3; .loc 1 141 11 shl.b64 %r118,%r138,3; .loc 1 142 23 add.u64 %r219,%r123,%r117; .loc 1 142 418226,[%value_in]; } .loc 1 143 17 add.u64 %r228,%r122,%r118; .loc 1 143 42r123117235,[%value_in]; } bra $L20; $L18: .loc 1 136 22 mul.lo.u64 %r237,%r132,%r85; .loc 1 136 18 shl.b64 %r238,%r237,3; .loc 1 136 11 add.u64 %r115,%r123,%r238; mov.u64 %r114,%r122; .loc 1 147 11 mov.u64 %r116,0; $L21: .loc 1 149 16 ld.f64 %r43,[%r115]; .loc 1 149 14 st.f64 [%r114],%r43; .loc 1 150 13 add.u64 %r114,%r114,%r45; .loc 1 151 12 add.u64 %r115,%r115,%r47; .loc 1 147 31 add.u64 %r116,%r116,1; .loc 1 147 4 setp.ne.u64 %r239,%r116,%r138; @ %r239 bra $L21; .loc 1 150 13 mad.lo.u64 %r111,%r116,%r45,%r122; $L19: .loc 1 153 4 setp.gt.s64 %r241,%r132,0; @ ! 
%r241 bra $L20; mov.u64 %r112,%r123; .loc 1 153 23 mov.u64 %r113,0; $L22: .loc 1 155 16 ld.f64 %r48,[%r112]; .loc 1 155 14 st.f64 [%r111],%r48; .loc 1 156 13 add.u64 %r111,%r111,%r45; .loc 1 157 12 add.u64 %r112,%r112,%r47; .loc 1 153 37 add.u64 %r113,%r113,1; .loc 1 153 4 setp.ne.u64 %r242,%r113,%r132; @ %r242 bra $L22; $L20: .loc 1 162 12 add.u64 %r122,%r122,%r51; .loc 1 163 12 add.u64 %r123,%r123,%r53; .loc 1 164 12 add.u64 %r124,%r124,%r55; .loc 1 165 15 ld.u64 %r243,[%frame+480]; add.u64 %r57,%r243,1; .loc 1 167 13 setp.eq.u64 %r244,%r57,%r89; @ %r244 bra $L24; .loc 1 165 15 st.u64 [%frame+480],%r57; bra $L29; $L24: .loc 1 171 20 mov.u64 %r245,0; st.u64 [%frame+480],%r245; .loc 1 172 24 ld.u64 %r59,[%frame+240]; .loc 1 173 16 ld.u64 %r247,[%frame+120]; shl.b64 %r246,%r247,3; sub.u64 %r119,%r123,%r246; .loc 1 174 9 ld.u64 %r249,[%frame]; shl.b64 %r248,%r249,2; sub.u64 %r120,%r124,%r248; .loc 1 176 14 @ %r287 bra $L1; add.u64 %r142,%frame,488; mov.u64 %r135,8; .loc 1 175 12 mov.u64 %r121,1; bra $L27; $L28: .loc 1 171 20 st.u64 [%r142],%r245; .loc 1 172 24 add.u64 %r256,%r303,%r135; ld.u64 %r59,[%r256]; .loc 1 173 24 add.u64 %r258,%r304,%r135; .loc 1 173 16 ld.u64 %r260,[%r258]; shl.b64 %r259,%r260,3; sub.u64 %r119,%r123,%r259; .loc 1 174 17 add.u64 %r261,%frame,%r135; .loc 1 174 9 ld.u64 %r263,[%r261]; shl.b64 %r262,%r263,2; sub.u64 %r120,%r124,%r262; .loc 1 175 12 add.u64 %r121,%r121,1; .loc 1 176 14 add.u64 %r142,%r142,8; add.u64 %r135,%r135,8; setp.eq.u64 %r264,%r298,%r121; @ %r264 bra $L1; $L27: .loc 1 184 23 ld.u64 %r265,[%r142]; add.u64 %r70,%r265,1; st.u64 [%r142],%r70; .loc 1 185 30 add.u64 %r267,%r299,%r135; .loc 1 185 20 ld.u64 %r269,[%r267]; sub.u64 %r268,%r269,%r59; shl.b64 %r270,%r268,3; add.u64 %r122,%r122,%r270; .loc 1 186 30 add.u64 %r272,%r300,%r135; .loc 1 186 20 ld.u64 %r274,[%r272]; shl.b64 %r273,%r274,3; add.u64 %r123,%r119,%r273; .loc 1 187 23 add.u64 %r276,%r301,%r135; .loc 1 187 13 ld.u64 %r278,[%r276]; shl.b64 %r277,%r278,2; add.u64 %r124,%r120,%r277; .loc 1 167 32 add.u64 %r280,%r302,%r135; .loc 1 167 13 ld.u64 %r281,[%r280]; setp.eq.u64 %r282,%r70,%r281; @ %r282 bra $L28; bra $L29; $L13: .loc 1 130 11 cvt.s64.s32 %r132,%r95; .loc 1 130 10 setp.lt.s64 %r283,%r132,%r88; @ %r283 bra $L15; .loc 1 132 13 rem.s64 %r132,%r132,%r88; .loc 1 132 8 cvt.u32.u64 %r95,%r132; bra $L15; $L1: .loc 1 191 1 ret; } cshift1_4_r10.o/1622802209cshift1_4_c4.o/ 1622802209_gfortrani_cshift1_4_c4 .visible .func _gfortrani_cshift1_4_cfortran/generated/cshift1_4_c4.c"F: _gfortrani_cshift1_4_c4 .visible .func _gfortrani_cshift1_4_c96097predpredpredpredu64 %r251; .reg .u64predu32 %r298; .reg .u64u64 %r303; .reg .u64 %r304; .reg .u64 %r305; .reg .u64 %r306; .reg .u64 %r307; mov.u64 %r147,%ar0; mov.u64 %r148,%ar1; mov.u64 %r149,%ar2; mov.u64 %r150,%ar3; .loc 1 68 6 setp.eq.u64 %r151,%r150,0; @ %r151 bra $L30; .loc 1 69 11 ld.u32 %r152,[%r150]; add.u32 %r96,%r152,-1; bra $L2; $L30: .loc 1 71 11 cvt.u32.u64 %r96,%r150; $L2: .loc 1 73 13 mov.u64 %r153,1; st.u64 [%frame+360],%r153; .loc 1 74 12 mov.u64 %r154,0; st.u64 [%frame+480],%r154; .loc 1 82 23 ld.s8 %r142,[%r148+28]; cvt.u32.u32 %r155,%r142; cvt.s64.s8 %r143,%r155; .loc 1 82 3 setp.le.s64 %r156,%r143,0; @ %r156 bra $L31; .loc 1 84 15 cvt.s64.s32 %r94,%r96; add.u64 %r95,%r148,40; add.u64 %r88,%r147,40; .loc 1 75 5 mov.u64 %r92,%r154; .loc 1 80 7 mov.u64 %r90,%r92; .loc 1 82 12 mov.u64 %r127,%r92; .loc 1 79 11 mov.u64 %r87,%r153; .loc 1 78 11 mov.u64 %r83,%r87; .loc 1 96 20 mov.u64 %r307,%r92; $L8: .loc 1 92 17 ld.u64 %r158,[%r95+16]; 
add.u64 %r157,%r158,1; .loc 1 92 15 ld.u64 %r159,[%r95+8]; sub.u64 %r33,%r157,%r159; .loc 1 86 19 ld.u64 %r82,[%r88]; .loc 1 89 19 ld.u64 %r86,[%r95]; .loc 1 84 10 setp.ne.u64 %r160,%r94,%r127; @ %r160 bra $L4; .loc 1 87 14 setp.ne.u64 %r161,%r82,0; .loc 1 90 14 setp.ne.u64 %r162,%r86,0; .loc 1 92 15 mov.u64 %r90,%r33; selp.u64 %r87,%r86,1,%r162; selp.u64 %r83,%r82,1,%r161; bra $L7; $L4: .loc 1 96 20 shl.b64 %r163,%r92,3; add.u64 %r164,%frame,%r163; add.u64 %r165,%r164,480; st.u64 [%r165],%r307; .loc 1 97 21 st.u64 [%r164+360],%r33; .loc 1 98 22 st.u64 [%r164+840],%r82; .loc 1 99 22 st.u64 [%r164+720],%r86; .loc 1 100 24 add.u64 %r177,%r92,%r92; add.u64 %r178,%r177,%r92; shl.b64 %r179,%r178,3; add.u64 %r180,%r149,%r179; ld.u64 %r23,[%r180+40]; .loc 1 100 22 st.u64 [%r164+600],%r23; .loc 1 101 26 mul.lo.u64 %r188,%r33,%r82; .loc 1 101 13 st.u64 [%r164+240],%r188; .loc 1 102 26 mul.lo.u64 %r192,%r33,%r86; .loc 1 102 13 st.u64 [%r164+120],%r192; .loc 1 103 26 mul.lo.u64 %r195,%r23,%r33; .loc 1 103 13 st.u64 [%r164],%r195; .loc 1 104 12 add.u64 %r92,%r92,1; $L7: .loc 1 82 55 add.u64 %r127,%r127,1; .loc 1 82 3 add.u64 %r95,%r95,24; add.u64 %r88,%r88,24; setp.ne.u64 %r196,%r127,%r143; @ %r196 bra $L8; bra $L3; $L31: .loc 1 80 7 mov.u64 %r90,%r154; .loc 1 79 11 mov.u64 %r87,%r153; .loc 1 78 11 mov.u64 %r83,%r87; $L3: .loc 1 107 14 ld.u64 %r27,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r197,%r27,0; .loc 1 108 16 selp.u64 %r27,%r27,1,%r197; .loc 1 109 14 ld.u64 %r28,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r198,%r28,0; .loc 1 110 16 selp.u64 %r28,%r28,1,%r198; .loc 1 111 14 ld.u64 %r29,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r199,%r29,0; .loc 1 112 16 selp.u64 %r29,%r29,1,%r199; .loc 1 118 8 ld.u64 %r124,[%r147]; .loc 1 119 8 ld.u64 %r125,[%r148]; .loc 1 120 8 ld.u64 %r126,[%r149]; .loc 1 122 9 setp.eq.u64 %r200,%r124,0; @ %r200 bra $L1; .loc 1 150 13 shl.b64 %r46,%r83,3; .loc 1 151 12 shl.b64 %r48,%r87,3; .loc 1 162 12 shl.b64 %r53,%r28,3; .loc 1 163 12 shl.b64 %r55,%r27,3; .loc 1 164 12 shl.b64 %r57,%r29,2; .loc 1 138 19 set.u32.eq.u64 %r202,%r87,1; neg.s32 %r203,%r202; .loc 1 138 35 set.u32.eq.u64 %r205,%r83,1; neg.s32 %r206,%r205; .loc 1 138 24 cvt.u16.u32 %r208,%r203; cvt.u16.u32 %r209,%r206; and.b16 %r207,%r208,%r209; cvt.u32.u16 %r210,%r207; cvt.u32.u8 %r93,%r210; .loc 1 167 32 ld.u64 %r91,[%frame+360]; setp.ne.u32 %r289,%r93,0; setp.le.s64 %r291,%r143,2; .loc 1 129 12 cvt.u32.u64 %r297,%r90; cvt.u32.u32 %r298,%r142; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; add.u64 %r301,%frame,840; add.u64 %r302,%frame,720; add.u64 %r303,%frame,600; add.u64 %r304,%frame,360; .loc 1 172 24 add.u64 %r305,%frame,240; .loc 1 173 24 add.u64 %r306,%frame,120; $L29: .loc 1 125 10 ld.u32 %r97,[%r126]; .loc 1 128 10 setp.ge.s32 %r211,%r97,0; @ %r211 bra $L13; .loc 1 129 12 add.u32 %r97,%r97,%r297; .loc 1 130 11 cvt.s64.s32 %r134,%r97; .loc 1 130 10 setp.ge.s64 %r212,%r134,%r90; @ %r212 bra $L14; setp.ge.s32 %r213,%r97,0; @ %r213 bra $L15; $L14: .loc 1 132 13 rem.s64 %r134,%r134,%r90; .loc 1 133 7 setp.lt.s64 %r216,%r134,0; @ %r216 bra $L16; .loc 1 132 8 cvt.u32.u64 %r97,%r134; bra $L15; $L16: .loc 1 134 16 cvt.u32.u64 %r217,%r134; add.u32 %r97,%r297,%r217; .loc 1 136 22 cvt.s64.s32 %r134,%r97; $L15: .loc 1 147 24 sub.u64 %r140,%r90,%r134; .loc 1 138 10 @ %r289 bra $L17; .loc 1 147 4 setp.gt.s64 %r219,%r140,0; @ %r219 bra $L18; mov.u64 %r113,%r124; bra $L19; $L17: .loc 1 140 21 cvt.s64.s32 %r220,%r97; .loc 1 140 11 shl.b64 %r119,%r220,3; .loc 1 141 11 shl.b64 %r120,%r140,3; .loc 1 142 23 add.u64 %r221,%r125,%r119; 
.loc 1 142 40228,[%value_in]; } .loc 1 143 17 add.u64 %r230,%r124,%r120r119237,[%value_in]; } bra $L20; $L18: .loc 1 136 22 mul.lo.u64 %r239,%r134,%r87; .loc 1 136 18 shl.b64 %r240,%r239,3; .loc 1 136 11 add.u64 %r117,%r125,%r240; mov.u64 %r116,%r124; .loc 1 147 11 mov.u64 %r118,0; $L21: .loc 1 149 16 ld.f32 %r43,[%r117]; ld.f32 %r44,[%r117+4]; .loc 1 149 14 st.f32 [%r116],%r43; st.f32 [%r116+4],%r44; .loc 1 150 13 add.u64 %r116,%r116,%r46; .loc 1 151 12 add.u64 %r117,%r117,%r48; .loc 1 147 31 add.u64 %r118,%r118,1; .loc 1 147 4 setp.ne.u64 %r241,%r118,%r140; @ %r241 bra $L21; .loc 1 150 13 mad.lo.u64 %r113,%r118,%r46,%r124; $L19: .loc 1 153 4 setp.gt.s64 %r243,%r134,0; @ ! %r243 bra $L20; mov.u64 %r114,%r125; .loc 1 153 23 mov.u64 %r115,0; $L22: .loc 1 155 16 ld.f32 %r49,[%r114]; ld.f32 %r50,[%r114+4]; .loc 1 155 14 st.f32 [%r113],%r49; st.f32 [%r113+4],%r50; .loc 1 156 13 add.u64 %r113,%r113,%r46; .loc 1 157 12 add.u64 %r114,%r114,%r48; .loc 1 153 37 add.u64 %r115,%r115,1; .loc 1 153 4 setp.ne.u64 %r244,%r115,%r134; @ %r244 bra $L22; $L20: .loc 1 162 12 add.u64 %r124,%r124,%r53; .loc 1 163 12 add.u64 %r125,%r125,%r55; .loc 1 164 12 add.u64 %r126,%r126,%r57; .loc 1 165 15 ld.u64 %r245,[%frame+480]; add.u64 %r59,%r245,1; .loc 1 167 13 setp.eq.u64 %r246,%r59,%r91; @ %r246 bra $L24; .loc 1 165 15 st.u64 [%frame+480],%r59; bra $L29; $L24: .loc 1 171 20 mov.u64 %r247,0; st.u64 [%frame+480],%r247; .loc 1 172 24 ld.u64 %r61,[%frame+240]; .loc 1 173 16 ld.u64 %r249,[%frame+120]; shl.b64 %r248,%r249,3; sub.u64 %r121,%r125,%r248; .loc 1 174 9 ld.u64 %r251,[%frame]; shl.b64 %r250,%r251,2; sub.u64 %r122,%r126,%r250; .loc 1 176 14 @ %r291 bra $L1; add.u64 %r144,%frame,488; mov.u64 %r137,8; .loc 1 175 12 mov.u64 %r123,1; bra $L27; $L28: .loc 1 171 20 st.u64 [%r144],%r247; .loc 1 172 24 add.u64 %r258,%r305,%r137; ld.u64 %r61,[%r258]; .loc 1 173 24 add.u64 %r260,%r306,%r137; .loc 1 173 16 ld.u64 %r262,[%r260]; shl.b64 %r261,%r262,3; sub.u64 %r121,%r125,%r261; .loc 1 174 17 add.u64 %r263,%frame,%r137; .loc 1 174 9 ld.u64 %r265,[%r263]; shl.b64 %r264,%r265,2; sub.u64 %r122,%r126,%r264; .loc 1 175 12 add.u64 %r123,%r123,1; .loc 1 176 14 add.u64 %r144,%r144,8; add.u64 %r137,%r137,8; setp.eq.u64 %r266,%r300,%r123; @ %r266 bra $L1; $L27: .loc 1 184 23 ld.u64 %r267,[%r144]; add.u64 %r72,%r267,1; st.u64 [%r144],%r72; .loc 1 185 30 add.u64 %r269,%r301,%r137; .loc 1 185 20 ld.u64 %r271,[%r269]; sub.u64 %r270,%r271,%r61; shl.b64 %r272,%r270,3; add.u64 %r124,%r124,%r272; .loc 1 186 30 add.u64 %r274,%r302,%r137; .loc 1 186 20 ld.u64 %r276,[%r274]; shl.b64 %r275,%r276,3; add.u64 %r125,%r121,%r275; .loc 1 187 23 add.u64 %r278,%r303,%r137; .loc 1 187 13 ld.u64 %r280,[%r278]; shl.b64 %r279,%r280,2; add.u64 %r126,%r122,%r279; .loc 1 167 32 add.u64 %r282,%r304,%r137; .loc 1 167 13 ld.u64 %r283,[%r282]; setp.eq.u64 %r284,%r72,%r283; @ %r284 bra $L28; bra $L29; $L13: .loc 1 130 11 cvt.s64.s32 %r134,%r97; .loc 1 130 10 setp.lt.s64 %r285,%r134,%r90; @ %r285 bra $L15; .loc 1 132 13 rem.s64 %r134,%r134,%r90; .loc 1 132 8 cvt.u32.u64 %r97,%r134; bra $L15; $L1: .loc 1 191 1 ret; } cshift1_4_c8.o/ 1622802209_gfortrani_cshift1_4_c8 .visible .func _gfortrani_cshift1_4_c8fortran/generated/cshift1_4_c8.c"F: _gfortrani_cshift1_4_c8 .visible .func _gfortrani_cshift1_4_c896097predpredpredpredu64 %r251; .reg .u64predu32 %r298; .reg .u64u64 %r303; .reg .u64 %r304; .reg .u64 %r305; .reg .u64 %r306; .reg .u64 %r307; mov.u64 %r147,%ar0; mov.u64 %r148,%ar1; mov.u64 %r149,%ar2; mov.u64 %r150,%ar3; .loc 1 68 6 setp.eq.u64 %r151,%r150,0; @ %r151 
bra $L30; .loc 1 69 11 ld.u32 %r152,[%r150]; add.u32 %r96,%r152,-1; bra $L2; $L30: .loc 1 71 11 cvt.u32.u64 %r96,%r150; $L2: .loc 1 73 13 mov.u64 %r153,1; st.u64 [%frame+360],%r153; .loc 1 74 12 mov.u64 %r154,0; st.u64 [%frame+480],%r154; .loc 1 82 23 ld.s8 %r142,[%r148+28]; cvt.u32.u32 %r155,%r142; cvt.s64.s8 %r143,%r155; .loc 1 82 3 setp.le.s64 %r156,%r143,0; @ %r156 bra $L31; .loc 1 84 15 cvt.s64.s32 %r94,%r96; add.u64 %r95,%r148,40; add.u64 %r88,%r147,40; .loc 1 75 5 mov.u64 %r92,%r154; .loc 1 80 7 mov.u64 %r90,%r92; .loc 1 82 12 mov.u64 %r127,%r92; .loc 1 79 11 mov.u64 %r87,%r153; .loc 1 78 11 mov.u64 %r83,%r87; .loc 1 96 20 mov.u64 %r307,%r92; $L8: .loc 1 92 17 ld.u64 %r158,[%r95+16]; add.u64 %r157,%r158,1; .loc 1 92 15 ld.u64 %r159,[%r95+8]; sub.u64 %r33,%r157,%r159; .loc 1 86 19 ld.u64 %r82,[%r88]; .loc 1 89 19 ld.u64 %r86,[%r95]; .loc 1 84 10 setp.ne.u64 %r160,%r94,%r127; @ %r160 bra $L4; .loc 1 87 14 setp.ne.u64 %r161,%r82,0; .loc 1 90 14 setp.ne.u64 %r162,%r86,0; .loc 1 92 15 mov.u64 %r90,%r33; selp.u64 %r87,%r86,1,%r162; selp.u64 %r83,%r82,1,%r161; bra $L7; $L4: .loc 1 96 20 shl.b64 %r163,%r92,3; add.u64 %r164,%frame,%r163; add.u64 %r165,%r164,480; st.u64 [%r165],%r307; .loc 1 97 21 st.u64 [%r164+360],%r33; .loc 1 98 22 st.u64 [%r164+840],%r82; .loc 1 99 22 st.u64 [%r164+720],%r86; .loc 1 100 24 add.u64 %r177,%r92,%r92; add.u64 %r178,%r177,%r92; shl.b64 %r179,%r178,3; add.u64 %r180,%r149,%r179; ld.u64 %r23,[%r180+40]; .loc 1 100 22 st.u64 [%r164+600],%r23; .loc 1 101 26 mul.lo.u64 %r188,%r33,%r82; .loc 1 101 13 st.u64 [%r164+240],%r188; .loc 1 102 26 mul.lo.u64 %r192,%r33,%r86; .loc 1 102 13 st.u64 [%r164+120],%r192; .loc 1 103 26 mul.lo.u64 %r195,%r23,%r33; .loc 1 103 13 st.u64 [%r164],%r195; .loc 1 104 12 add.u64 %r92,%r92,1; $L7: .loc 1 82 55 add.u64 %r127,%r127,1; .loc 1 82 3 add.u64 %r95,%r95,24; add.u64 %r88,%r88,24; setp.ne.u64 %r196,%r127,%r143; @ %r196 bra $L8; bra $L3; $L31: .loc 1 80 7 mov.u64 %r90,%r154; .loc 1 79 11 mov.u64 %r87,%r153; .loc 1 78 11 mov.u64 %r83,%r87; $L3: .loc 1 107 14 ld.u64 %r27,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r197,%r27,0; .loc 1 108 16 selp.u64 %r27,%r27,1,%r197; .loc 1 109 14 ld.u64 %r28,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r198,%r28,0; .loc 1 110 16 selp.u64 %r28,%r28,1,%r198; .loc 1 111 14 ld.u64 %r29,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r199,%r29,0; .loc 1 112 16 selp.u64 %r29,%r29,1,%r199; .loc 1 118 8 ld.u64 %r124,[%r147]; .loc 1 119 8 ld.u64 %r125,[%r148]; .loc 1 120 8 ld.u64 %r126,[%r149]; .loc 1 122 9 setp.eq.u64 %r200,%r124,0; @ %r200 bra $L1; .loc 1 150 13 shl.b64 %r46,%r83,4; .loc 1 151 12 shl.b64 %r48,%r87,4; .loc 1 162 12 shl.b64 %r53,%r28,4; .loc 1 163 12 shl.b64 %r55,%r27,4; .loc 1 164 12 shl.b64 %r57,%r29,2; .loc 1 138 19 set.u32.eq.u64 %r202,%r87,1; neg.s32 %r203,%r202; .loc 1 138 35 set.u32.eq.u64 %r205,%r83,1; neg.s32 %r206,%r205; .loc 1 138 24 cvt.u16.u32 %r208,%r203; cvt.u16.u32 %r209,%r206; and.b16 %r207,%r208,%r209; cvt.u32.u16 %r210,%r207; cvt.u32.u8 %r93,%r210; .loc 1 167 32 ld.u64 %r91,[%frame+360]; setp.ne.u32 %r289,%r93,0; setp.le.s64 %r291,%r143,2; .loc 1 129 12 cvt.u32.u64 %r297,%r90; cvt.u32.u32 %r298,%r142; cvt.s64.s8 %r299,%r298; add.u64 %r300,%r299,-1; add.u64 %r301,%frame,840; add.u64 %r302,%frame,720; add.u64 %r303,%frame,600; add.u64 %r304,%frame,360; .loc 1 172 24 add.u64 %r305,%frame,240; .loc 1 173 24 add.u64 %r306,%frame,120; $L29: .loc 1 125 10 ld.u32 %r97,[%r126]; .loc 1 128 10 setp.ge.s32 %r211,%r97,0; @ %r211 bra $L13; .loc 1 129 12 add.u32 %r97,%r97,%r297; .loc 1 130 11 cvt.s64.s32 
%r134,%r97; .loc 1 130 10 setp.ge.s64 %r212,%r134,%r90; @ %r212 bra $L14; setp.ge.s32 %r213,%r97,0; @ %r213 bra $L15; $L14: .loc 1 132 13 rem.s64 %r134,%r134,%r90; .loc 1 133 7 setp.lt.s64 %r216,%r134,0; @ %r216 bra $L16; .loc 1 132 8 cvt.u32.u64 %r97,%r134; bra $L15; $L16: .loc 1 134 16 cvt.u32.u64 %r217,%r134; add.u32 %r97,%r297,%r217; .loc 1 136 22 cvt.s64.s32 %r134,%r97; $L15: .loc 1 147 24 sub.u64 %r140,%r90,%r134; .loc 1 138 10 @ %r289 bra $L17; .loc 1 147 4 setp.gt.s64 %r219,%r140,0; @ %r219 bra $L18; mov.u64 %r113,%r124; bra $L19; $L17: .loc 1 140 21 cvt.s64.s32 %r220,%r97; .loc 1 140 11 shl.b64 %r119,%r220,4; .loc 1 141 11 shl.b64 %r120,%r140,4; .loc 1 142 23 add.u64 %r221,%r125,%r119; .loc 1 142 40228,[%value_in]; } .loc 1 143 17 add.u64 %r230,%r124,%r120r119237,[%value_in]; } bra $L20; $L18: .loc 1 136 22 mul.lo.u64 %r239,%r134,%r87; .loc 1 136 18 shl.b64 %r240,%r239,4; .loc 1 136 11 add.u64 %r117,%r125,%r240; mov.u64 %r116,%r124; .loc 1 147 11 mov.u64 %r118,0; $L21: .loc 1 149 16 ld.f64 %r43,[%r117]; ld.f64 %r44,[%r117+8]; .loc 1 149 14 st.f64 [%r116],%r43; st.f64 [%r116+8],%r44; .loc 1 150 13 add.u64 %r116,%r116,%r46; .loc 1 151 12 add.u64 %r117,%r117,%r48; .loc 1 147 31 add.u64 %r118,%r118,1; .loc 1 147 4 setp.ne.u64 %r241,%r118,%r140; @ %r241 bra $L21; .loc 1 150 13 mad.lo.u64 %r113,%r118,%r46,%r124; $L19: .loc 1 153 4 setp.gt.s64 %r243,%r134,0; @ ! %r243 bra $L20; mov.u64 %r114,%r125; .loc 1 153 23 mov.u64 %r115,0; $L22: .loc 1 155 16 ld.f64 %r49,[%r114]; ld.f64 %r50,[%r114+8]; .loc 1 155 14 st.f64 [%r113],%r49; st.f64 [%r113+8],%r50; .loc 1 156 13 add.u64 %r113,%r113,%r46; .loc 1 157 12 add.u64 %r114,%r114,%r48; .loc 1 153 37 add.u64 %r115,%r115,1; .loc 1 153 4 setp.ne.u64 %r244,%r115,%r134; @ %r244 bra $L22; $L20: .loc 1 162 12 add.u64 %r124,%r124,%r53; .loc 1 163 12 add.u64 %r125,%r125,%r55; .loc 1 164 12 add.u64 %r126,%r126,%r57; .loc 1 165 15 ld.u64 %r245,[%frame+480]; add.u64 %r59,%r245,1; .loc 1 167 13 setp.eq.u64 %r246,%r59,%r91; @ %r246 bra $L24; .loc 1 165 15 st.u64 [%frame+480],%r59; bra $L29; $L24: .loc 1 171 20 mov.u64 %r247,0; st.u64 [%frame+480],%r247; .loc 1 172 24 ld.u64 %r61,[%frame+240]; .loc 1 173 16 ld.u64 %r249,[%frame+120]; shl.b64 %r248,%r249,4; sub.u64 %r121,%r125,%r248; .loc 1 174 9 ld.u64 %r251,[%frame]; shl.b64 %r250,%r251,2; sub.u64 %r122,%r126,%r250; .loc 1 176 14 @ %r291 bra $L1; add.u64 %r144,%frame,488; mov.u64 %r137,8; .loc 1 175 12 mov.u64 %r123,1; bra $L27; $L28: .loc 1 171 20 st.u64 [%r144],%r247; .loc 1 172 24 add.u64 %r258,%r305,%r137; ld.u64 %r61,[%r258]; .loc 1 173 24 add.u64 %r260,%r306,%r137; .loc 1 173 16 ld.u64 %r262,[%r260]; shl.b64 %r261,%r262,4; sub.u64 %r121,%r125,%r261; .loc 1 174 17 add.u64 %r263,%frame,%r137; .loc 1 174 9 ld.u64 %r265,[%r263]; shl.b64 %r264,%r265,2; sub.u64 %r122,%r126,%r264; .loc 1 175 12 add.u64 %r123,%r123,1; .loc 1 176 14 add.u64 %r144,%r144,8; add.u64 %r137,%r137,8; setp.eq.u64 %r266,%r300,%r123; @ %r266 bra $L1; $L27: .loc 1 184 23 ld.u64 %r267,[%r144]; add.u64 %r72,%r267,1; st.u64 [%r144],%r72; .loc 1 185 30 add.u64 %r269,%r301,%r137; .loc 1 185 20 ld.u64 %r271,[%r269]; sub.u64 %r270,%r271,%r61; shl.b64 %r272,%r270,4; add.u64 %r124,%r124,%r272; .loc 1 186 30 add.u64 %r274,%r302,%r137; .loc 1 186 20 ld.u64 %r276,[%r274]; shl.b64 %r275,%r276,4; add.u64 %r125,%r121,%r275; .loc 1 187 23 add.u64 %r278,%r303,%r137; .loc 1 187 13 ld.u64 %r280,[%r278]; shl.b64 %r279,%r280,2; add.u64 %r126,%r122,%r279; .loc 1 167 32 add.u64 %r282,%r304,%r137; .loc 1 167 13 ld.u64 %r283,[%r282]; setp.eq.u64 %r284,%r72,%r283; 
@ %r284 bra $L28; bra $L29; $L13: .loc 1 130 11 cvt.s64.s32 %r134,%r97; .loc 1 130 10 setp.lt.s64 %r285,%r134,%r90; @ %r285 bra $L15; .loc 1 132 13 rem.s64 %r134,%r134,%r90; .loc 1 132 8 cvt.u32.u64 %r97,%r134; bra $L15; $L1: .loc 1 191 1 ret; } cshift1_4_c10.o/1622802209cshift1_8_i1.o/ 1622802209_gfortrani_cshift1_8_i1 .visible .func _gfortrani_cshift1_8_fortran/generated/cshift1_8_i1.c"F: _gfortrani_cshift1_8_i1 .visible .func _gfortrani_cshift1_8_9600predpredpredpred %r185; .reg .predu16 %r197; .reg .u16predu64u64 %r266; .reg .u64pred %r281; .reg .u32 %r288; .reg .u64 %r289; .reg .u64 %r290; .reg .u64u64 %r296; .reg .u64 %r297; mov.u64 %r135,%ar0; mov.u64 %r136,%ar1; mov.u64 %r137,%ar2; mov.u64 %r138,%ar3; .loc 1 68 629; .loc 1 69 21 ld.u32 %r141,[%r138]; add.u32 %r89,%r141,-1; bra $L2; $L29: .loc 1 71 11 cvt.u32.u64 %r89,%r138; $L2: .loc 1 73 13 mov.u64 %r142,1; st.u64 [%frame+360],%r142; .loc 1 74 12 mov.u64 %r143,0; st.u64 [%frame+480],%r143; .loc 1 82 23 ld.s8 %r132,[%r136+28]; cvt.u32.u32 %r144,%r132; cvt.s64.s8 %r133,%r144; .loc 1 82 3 setp.le.s64 %r145,%r133,0; @ %r145 bra $L30; .loc 1 84 15 cvt.s64.s32 %r88,%r89; add.u64 %r69,%r136,40; add.u64 %r42,%r135,40; .loc 1 75 5 mov.u64 %r84,%r143; .loc 1 80 7 mov.u64 %r82,%r84; .loc 1 82 12 mov.u64 %r117,%r84; .loc 1 79 11 mov.u64 %r76,%r142; .loc 1 78 11 mov.u64 %r72,%r76; .loc 1 96 20 mov.u64 %r297,%r84; $L8: .loc 1 92 17 ld.u64 %r147,[%r69+16]; add.u64 %r146,%r147,1; .loc 1 92 15 ld.u64 %r148,[%r69+8]; sub.u64 %r81,%r146,%r148; .loc 1 86 19 ld.u64 %r71,[%r42]; .loc 1 89 19 ld.u64 %r73,[%r69]; .loc 1 84 10 setp.ne.u64 %r149,%r88,%r117; @ %r149 bra $L4; .loc 1 87 14 setp.ne.u64 %r150,%r71,0; .loc 1 90 14 setp.ne.u64 %r151,%r73,0; .loc 1 92 15 mov.u64 %r82,%r81; selp.u64 %r76,%r73,1,%r151; selp.u64 %r72,%r71,1,%r150; bra $L7; $L4: .loc 1 96 20 shl.b64 %r152,%r84,3; add.u64 %r153,%frame,%r152; add.u64 %r154,%r153,480; st.u64 [%r154],%r297; .loc 1 97 21 st.u64 [%r153+360],%r81; .loc 1 98 22 st.u64 [%r153+840],%r71; .loc 1 99 22 st.u64 [%r153+720],%r73; .loc 1 100 24 add.u64 %r166,%r84,%r84; add.u64 %r167,%r166,%r84; shl.b64 %r168,%r167,3; add.u64 %r169,%r137,%r168; ld.u64 %r25,[%r169+40]; .loc 1 100 22 st.u64 [%r153+600],%r25; .loc 1 101 26 mul.lo.u64 %r177,%r71,%r81; .loc 1 101 13 st.u64 [%r153+240],%r177; .loc 1 102 26 mul.lo.u64 %r181,%r73,%r81; .loc 1 102 13 st.u64 [%r153+120],%r181; .loc 1 103 26 mul.lo.u64 %r184,%r25,%r81; .loc 1 103 13 st.u64 [%r153],%r184; .loc 1 104 12 add.u64 %r84,%r84,1; $L7: .loc 1 82 55 add.u64 %r117,%r117,1; .loc 1 82 3 add.u64 %r69,%r69,24; add.u64 %r42,%r42,24; setp.ne.u64 %r185,%r117,%r133; @ %r185 bra $L8; bra $L3; $L30: .loc 1 80 7 mov.u64 %r82,%r143; .loc 1 79 11 mov.u64 %r76,%r142; .loc 1 78 11 mov.u64 %r72,%r76; $L3: .loc 1 107 14 ld.u64 %r31,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r186,%r31,0; .loc 1 108 16 selp.u64 %r31,%r31,1,%r186; .loc 1 109 14 ld.u64 %r32,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r187,%r32,0; .loc 1 110 16 selp.u64 %r32,%r32,1,%r187; .loc 1 111 14 ld.u64 %r33,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r188,%r33,0; .loc 1 112 16 selp.u64 %r33,%r33,1,%r188; .loc 1 118 8 ld.u64 %r114,[%r135]; .loc 1 119 8 ld.u64 %r115,[%r136]; .loc 1 120 8 ld.u64 %r116,[%r137]; .loc 1 122 9 setp.eq.u64 %r189,%r114,0; @ %r189 bra $L1; .loc 1 164 12 shl.b64 %r48,%r33,3; .loc 1 138 35 set.u32.eq.u64 %r191,%r72,1; neg.s32 %r192,%r191; .loc 1 138 19 set.u32.eq.u64 %r194,%r76,1; neg.s32 %r195,%r194; .loc 1 138 24 cvt.u16.u32 %r197,%r192; cvt.u16.u32 %r198,%r195; and.b16 %r196,%r197,%r198; cvt.u32.u16 
%r199,%r196; cvt.u32.u8 %r119,%r199; .loc 1 167 32 ld.u64 %r38,[%frame+360]; setp.ne.u32 %r279,%r119,0; setp.le.s64 %r281,%r133,2; cvt.u32.u32 %r288,%r132; cvt.s64.s8 %r289,%r288; add.u64 %r290,%r289,-1; add.u64 %r291,%frame,840; add.u64 %r292,%frame,720; add.u64 %r293,%frame,600; add.u64 %r294,%frame,360; .loc 1 172 24 add.u64 %r295,%frame,240; .loc 1 173 24 add.u64 %r296,%frame,120; $L28: .loc 1 125 10 ld.u64 %r90,[%r116]; .loc 1 128 10 setp.ge.s64 %r200,%r90,0; @ %r200 bra $L13; .loc 1 129 12 add.u64 %r90,%r90,%r82; $L13: .loc 1 130 11 set.u32.ge.s64 %r202,%r90,%r82; neg.s32 %r203,%r202; shr.u64 %r205,%r90,63;4; .loc 1 132 8 rem.s64 %r90,%r90,%r82; .loc 1 133 7 setp.ge.s64 %r215,%r90,0; @ %r215 bra $L14; .loc 1 134 16 add.u64 %r90,%r90,%r82; $L14: .loc 1 147 24 sub.u64 %r75,%r82,%r90; .loc 1 138 10 @ %r279 bra $L15; .loc 1 147 4 setp.gt.s64 %r217,%r75,0; @ %r217 bra $L16; mov.u64 %r134,%r114; bra $L17; $L15: .loc 1 142 23 add.u64 %r218,%r115,%r90; .loc 1 142 41121225,[%value_in]; } .loc 1 143 17 add.u64 %r227,%r114,%r75; .loc 1 143 4234,[%value_in]; } bra $L18; $L16: .loc 1 136 11 mad.lo.u64 %r107,%r90,%r76,%r115; .loc 1 157 12 mov.u64 %r106,%r114; .loc 1 147 11 mov.u64 %r108,0; $L19: .loc 1 149 14 ld.u8 %r237,[%r107]; st.u8 [%r106],%r237; .loc 1 150 13 add.u64 %r106,%r106,%r72; .loc 1 151 12 add.u64 %r107,%r107,%r76; .loc 1 147 31 add.u64 %r108,%r108,1; .loc 1 147 4 setp.ne.u64 %r238,%r75,%r108; @ %r238 bra $L19; .loc 1 150 13 mad.lo.u64 %r134,%r75,%r72,%r114; $L17: .loc 1 153 4 setp.gt.s64 %r240,%r90,0; @ ! %r240 bra $L18; .loc 1 157 12 mov.u64 %r104,%r115; .loc 1 153 23 mov.u64 %r105,0; $L21: .loc 1 155 14 ld.u8 %r241,[%r104]; st.u8 [%r134],%r241; .loc 1 156 13 add.u64 %r134,%r134,%r72; .loc 1 157 12 add.u64 %r104,%r104,%r76; .loc 1 153 37 add.u64 %r105,%r105,1; .loc 1 153 4 setp.ne.u64 %r242,%r90,%r105; @ %r242 bra $L21; $L18: .loc 1 162 12 add.u64 %r114,%r114,%r32; .loc 1 163 12 add.u64 %r115,%r115,%r31; .loc 1 164 12 add.u64 %r116,%r116,%r48; .loc 1 165 15 ld.u64 %r243,[%frame+480]; add.u64 %r50,%r243,1; .loc 1 167 13 setp.eq.u64 %r244,%r38,%r50; @ %r244 bra $L23; .loc 1 165 15 st.u64 [%frame+480],%r50; bra $L28; $L23: .loc 1 171 20 mov.u64 %r245,0; st.u64 [%frame+480],%r245; .loc 1 172 24 ld.u64 %r52,[%frame+240]; .loc 1 173 16 ld.u64 %r246,[%frame+120]; sub.u64 %r111,%r115,%r246; .loc 1 174 9 ld.u64 %r248,[%frame]; shl.b64 %r247,%r248,3; sub.u64 %r112,%r116,%r247; .loc 1 176 14 @ %r281 bra $L1; add.u64 %r124,%frame,488; mov.u64 %r122,8; .loc 1 175 12 mov.u64 %r113,1; bra $L26; $L27: .loc 1 171 20 st.u64 [%r124],%r245; .loc 1 172 24 add.u64 %r255,%r295,%r122; ld.u64 %r52,[%r255]; .loc 1 173 24 add.u64 %r257,%r296,%r122; .loc 1 173 16 ld.u64 %r258,[%r257]; sub.u64 %r111,%r115,%r258; .loc 1 174 17 add.u64 %r259,%frame,%r122; .loc 1 174 9 ld.u64 %r261,[%r259]; shl.b64 %r260,%r261,3; sub.u64 %r112,%r116,%r260; .loc 1 175 12 add.u64 %r113,%r113,1; .loc 1 176 14 add.u64 %r124,%r124,8; add.u64 %r122,%r122,8; setp.eq.u64 %r262,%r290,%r113; @ %r262 bra $L1; $L26: .loc 1 184 23 ld.u64 %r263,[%r124]; add.u64 %r61,%r263,1; st.u64 [%r124],%r61; .loc 1 185 30 add.u64 %r265,%r291,%r122; .loc 1 185 20 ld.u64 %r267,[%r265]; sub.u64 %r266,%r267,%r52; add.u64 %r114,%r114,%r266; .loc 1 186 30 add.u64 %r269,%r292,%r122; .loc 1 186 20 ld.u64 %r270,[%r269]; add.u64 %r115,%r111,%r270; .loc 1 187 23 add.u64 %r272,%r293,%r122; .loc 1 187 13 ld.u64 %r274,[%r272]; shl.b64 %r273,%r274,3; add.u64 %r116,%r112,%r273; .loc 1 167 32 add.u64 %r276,%r294,%r122; .loc 1 167 13 ld.u64 %r277,[%r276]; setp.eq.u64 
%r278,%r61,%r277; @ %r278 bra $L27; bra $L28; $L1: .loc 1 191 1 ret; } cshift1_8_i2.o/ 1622802209_gfortrani_cshift1_8_i2 .visible .func _gfortrani_cshift1_8_fortran/generated/cshift1_8_i2.c"F: _gfortrani_cshift1_8_i2 .visible .func _gfortrani_cshift1_8_960pred32predpredu32 %r215; .reg .u32predu64 %r233; .reg .u64 %r240; .reg .u64pred %r260; .reg .u64 %r261; .reg .u6464u64u32u64 %r313; .reg .u64mov.u64 %r144,%ar0; mov.u64 %r145,%ar1; mov.u64 %r146,%ar2; mov.u64 %r147,%ar3; .loc 1 68 6 setp.eq.u64 %r148,%r147,0; @ %r148 bra $L28; .loc 1 69 21 ld.u32 %r150,[%r147]; add.u32 %r97,%r150,-1; bra $L2; $L28: .loc 1 71 11 cvt.u32.u64 %r97,%r147; $L2: .loc 1 73 13 mov.u64 %r151,1; st.u64 [%frame+360],%r151; .loc 1 74 12 mov.u64 %r152,0; st.u64 [%frame+480],%r152; .loc 1 82 23 ld.s8 %r142,[%r145+28]; cvt.u32.u32 %r153,%r142; cvt.s64.s8 %r143,%r153; .loc 1 82 3 setp.le.s64 %r154,%r143,0; @ %r154 bra $L29; .loc 1 84 15 cvt.s64.s32 %r35,%r97; add.u64 %r90,%r145,40; add.u64 %r83,%r144,40; .loc 1 75 5 mov.u64 %r94,%r152; .loc 1 80 7 mov.u64 %r92,%r94; .loc 1 82 12 mov.u64 %r127,%r94; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; .loc 1 96 20 mov.u64 %r317,%r94; $L8: .loc 1 92 17 ld.u64 %r156,[%r90+16]; add.u64 %r155,%r156,1; .loc 1 92 15 ld.u64 %r157,[%r90+8]; sub.u64 %r87,%r155,%r157; .loc 1 86 19 ld.u64 %r81,[%r83]; .loc 1 89 19 ld.u64 %r85,[%r90]; .loc 1 84 10 setp.ne.u64 %r158,%r35,%r127; @ %r158 bra $L4; .loc 1 87 14 setp.ne.u64 %r159,%r81,0; .loc 1 90 14 setp.ne.u64 %r160,%r85,0; .loc 1 92 15 mov.u64 %r92,%r87; selp.u64 %r86,%r85,1,%r160; selp.u64 %r82,%r81,1,%r159; bra $L7; $L4: .loc 1 96 20 shl.b64 %r161,%r94,3; add.u64 %r162,%frame,%r161; add.u64 %r163,%r162,480; st.u64 [%r163],%r317; .loc 1 97 21 st.u64 [%r162+360],%r87; .loc 1 98 22 st.u64 [%r162+840],%r81; .loc 1 99 22 st.u64 [%r162+720],%r85; .loc 1 100 24 add.u64 %r175,%r94,%r94; add.u64 %r176,%r175,%r94; shl.b64 %r177,%r176,3; add.u64 %r178,%r146,%r177; ld.u64 %r25,[%r178+40]; .loc 1 100 22 st.u64 [%r162+600],%r25; .loc 1 101 26 mul.lo.u64 %r186,%r81,%r87; .loc 1 101 13 st.u64 [%r162+240],%r186; .loc 1 102 26 mul.lo.u64 %r190,%r87,%r85; .loc 1 102 13 st.u64 [%r162+120],%r190; .loc 1 103 26 mul.lo.u64 %r193,%r25,%r87; .loc 1 103 13 st.u64 [%r162],%r193; .loc 1 104 12 add.u64 %r94,%r94,1; $L7: .loc 1 82 55 add.u64 %r127,%r127,1; .loc 1 82 3 add.u64 %r90,%r90,24; add.u64 %r83,%r83,24; setp.ne.u64 %r194,%r127,%r143; @ %r194 bra $L8; bra $L3; $L29: .loc 1 80 7 mov.u64 %r92,%r152; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; $L3: .loc 1 107 14 ld.u64 %r29,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r195,%r29,0; .loc 1 108 16 selp.u64 %r29,%r29,1,%r195; .loc 1 109 14 ld.u64 %r30,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r196,%r30,0; .loc 1 110 16 selp.u64 %r30,%r30,1,%r196; .loc 1 111 14 ld.u64 %r31,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r197,%r31,0; .loc 1 112 16 selp.u64 %r31,%r31,1,%r197; .loc 1 118 8 ld.u64 %r124,[%r144]; .loc 1 119 8 ld.u64 %r125,[%r145]; .loc 1 120 8 ld.u64 %r126,[%r146]; .loc 1 122 9 setp.eq.u64 %r198,%r124,0; @ %r198 bra $L1; .loc 1 150 13 add.u64 %r47,%r82,%r82; .loc 1 151 12 add.u64 %r49,%r86,%r86; .loc 1 162 12 add.u64 %r52,%r30,%r30; .loc 1 163 12 add.u64 %r54,%r29,%r29; .loc 1 164 12 shl.b64 %r56,%r31,3; .loc 1 138 35 set.u32.eq.u64 %r204,%r82,1; neg.s32 %r205,%r204; .loc 1 138 19 set.u32.eq.u64 %r207,%r86,1; neg.s32 %r208,%r207; .loc 1 138 24 cvt.u16.u32 %r210,%r205; cvt.u16.u32 %r211,%r208; and.b16 %r209,%r210,%r211; cvt.u32.u16 %r212,%r209; cvt.u32.u8 %r84,%r212; .loc 1 167 32 
ld.u64 %r103,[%frame+360]; setp.ne.u32 %r303,%r84,0; setp.le.s64 %r300,%r143,2; cvt.u32.u32 %r308,%r142; cvt.s64.s8 %r309,%r308; add.u64 %r310,%r309,-1; add.u64 %r311,%frame,840; add.u64 %r312,%frame,720; add.u64 %r313,%frame,600; add.u64 %r314,%frame,360; .loc 1 172 24 add.u64 %r315,%frame,240; .loc 1 173 24 add.u64 %r316,%frame,120; $L27: .loc 1 125 10 ld.u64 %r98,[%r126]; .loc 1 128 10 setp.ge.s64 %r213,%r98,0; @ %r213 bra $L13; .loc 1 129 12 add.u64 %r98,%r98,%r92; $L13: .loc 1 130 11 set.u32.ge.s64 %r215,%r98,%r92; neg.s32 %r216,%r215; shr.u64 %r218,%r98,63; cvt.u16.u32 %r221,%r216; cvt.u16.u64 %r222,%r218; or.b16 %r220,%r221,%r222; .loc 1 130 10 cvt.u32.u16 %r223,%r220; cvt.u16.u8 %r224,%r223; setp.eq.u16 %r225,%r224,0; @ %r225 bra $L14; .loc 1 132 8 rem.s64 %r98,%r98,%r92; .loc 1 133 7 setp.ge.s64 %r228,%r98,0; @ %r228 bra $L14; .loc 1 134 16 add.u64 %r98,%r98,%r92; $L14: .loc 1 147 24 sub.u64 %r99,%r92,%r98; .loc 1 138 10 @ %r303 bra $L15; .loc 1 147 4 setp.gt.s64 %r230,%r99,0; @ %r230 bra $L16; mov.u64 %r113,%r124; bra $L17; $L15: .loc 1 140 11 add.u64 %r231,%r98,%r98; .loc 1 141 11 add.u64 %r232,%r99,%r99; .loc 1 142 23 add.u64 %r233,%r125,%r231; .loc 1 142 41232240,[%value_in]; } .loc 1 143 17 add.u64 %r242,%r124,%r232; .loc 1 143 4r231249,[%value_in]; } bra $L18; $L16: .loc 1 136 22 mul.lo.u64 %r251,%r98,%r86; .loc 1 136 18 add.u64 %r252,%r251,%r251; .loc 1 136 11 add.u64 %r117,%r125,%r252; mov.u64 %r116,%r124; .loc 1 147 11 mov.u64 %r118,0; $L19: .loc 1 149 14 ld.u16 %r253,[%r117]; st.u16 [%r116],%r253; .loc 1 150 13 add.u64 %r116,%r116,%r47; .loc 1 151 12 add.u64 %r117,%r117,%r49; .loc 1 147 31 add.u64 %r118,%r118,1; .loc 1 147 4 setp.ne.u64 %r254,%r99,%r118; @ %r254 bra $L19; .loc 1 150 13 mad.lo.u64 %r113,%r99,%r47,%r124; $L17: .loc 1 153 4 setp.gt.s64 %r256,%r98,0; @ ! 
%r256 bra $L18; mov.u64 %r114,%r125; .loc 1 153 23 mov.u64 %r115,0; $L20: .loc 1 155 14 ld.u16 %r257,[%r114]; st.u16 [%r113],%r257; .loc 1 156 13 add.u64 %r113,%r113,%r47; .loc 1 157 12 add.u64 %r114,%r114,%r49; .loc 1 153 37 add.u64 %r115,%r115,1; .loc 1 153 4 setp.ne.u64 %r258,%r98,%r115; @ %r258 bra $L20; $L18: .loc 1 162 12 add.u64 %r124,%r124,%r52; .loc 1 163 12 add.u64 %r125,%r125,%r54; .loc 1 164 12 add.u64 %r126,%r126,%r56; .loc 1 165 15 ld.u64 %r259,[%frame+480]; add.u64 %r58,%r259,1; .loc 1 167 13 setp.eq.u64 %r260,%r58,%r103; @ %r260 bra $L22; .loc 1 165 15 st.u64 [%frame+480],%r58; bra $L27; $L22: .loc 1 171 20 mov.u64 %r261,0; st.u64 [%frame+480],%r261; .loc 1 172 24 ld.u64 %r60,[%frame+240]; .loc 1 173 16 ld.u64 %r262,[%frame+120]; add.u64 %r263,%r262,%r262; sub.u64 %r121,%r125,%r263; .loc 1 174 9 ld.u64 %r265,[%frame]; shl.b64 %r264,%r265,3; sub.u64 %r122,%r126,%r264; .loc 1 176 14 @ %r300 bra $L1; add.u64 %r134,%frame,488; mov.u64 %r132,8; .loc 1 175 12 mov.u64 %r123,1; bra $L25; $L26: .loc 1 171 20 st.u64 [%r134],%r261; .loc 1 172 24 add.u64 %r272,%r315,%r132; ld.u64 %r60,[%r272]; .loc 1 173 24 add.u64 %r274,%r316,%r132; .loc 1 173 16 ld.u64 %r275,[%r274]; add.u64 %r276,%r275,%r275; sub.u64 %r121,%r125,%r276; .loc 1 174 17 add.u64 %r277,%frame,%r132; .loc 1 174 9 ld.u64 %r279,[%r277]; shl.b64 %r278,%r279,3; sub.u64 %r122,%r126,%r278; .loc 1 175 12 add.u64 %r123,%r123,1; .loc 1 176 14 add.u64 %r134,%r134,8; add.u64 %r132,%r132,8; setp.eq.u64 %r280,%r310,%r123; @ %r280 bra $L1; $L25: .loc 1 184 23 ld.u64 %r281,[%r134]; add.u64 %r70,%r281,1; st.u64 [%r134],%r70; .loc 1 185 30 add.u64 %r283,%r311,%r132; .loc 1 185 20 ld.u64 %r285,[%r283]; sub.u64 %r284,%r285,%r60; add.u64 %r286,%r284,%r284; add.u64 %r124,%r124,%r286; .loc 1 186 30 add.u64 %r288,%r312,%r132; .loc 1 186 20 ld.u64 %r289,[%r288]; add.u64 %r290,%r289,%r289; add.u64 %r125,%r121,%r290; .loc 1 187 23 add.u64 %r292,%r313,%r132; .loc 1 187 13 ld.u64 %r294,[%r292]; shl.b64 %r293,%r294,3; add.u64 %r126,%r122,%r293; .loc 1 167 32 add.u64 %r296,%r314,%r132; .loc 1 167 13 ld.u64 %r297,[%r296]; setp.eq.u64 %r298,%r70,%r297; @ %r298 bra $L26; bra $L27; $Lgfortrani_cshift1_8_i4 .visible .func _gfortrani_cshift1_8_fortran/generated/cshift1_8_i4.c"F: _gfortrani_cshift1_8_i4 .visible .func _gfortrani_cshift1_8_960pred32predpredu16u64 %r214; .reg .u16 %r216; .reg .u16 %r217; .reg .u16 %r218; .reg .u32 %r219; .reg .u16 %r220; .reg .pred %r221; .reg .pred %r224; .reg .pred %r226; .reg .u64 %r227; .reg .u64pred %r247; .reg .pred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .predu64 %r267; .reg .u6464u64 %r305; .reg .u64 %r306; .reg .u64mov.u64 %r144,%ar0; mov.u64 %r145,%ar1; mov.u64 %r146,%ar2; mov.u64 %r147,%ar3; .loc 1 68 6 setp.eq.u64 %r148,%r147,0; @ %r148 bra $L28; .loc 1 69 21 ld.u32 %r150,[%r147]; add.u32 %r97,%r150,-1; bra $L2; $L28: .loc 1 71 11 cvt.u32.u64 %r97,%r147; $L2: .loc 1 73 13 mov.u64 %r151,1; st.u64 [%frame+360],%r151; .loc 1 74 12 mov.u64 %r152,0; st.u64 [%frame+480],%r152; .loc 1 82 23 ld.s8 %r142,[%r145+28]; cvt.u32.u32 %r153,%r142; cvt.s64.s8 %r143,%r153; .loc 1 82 3 setp.le.s64 %r154,%r143,0; @ %r154 bra $L29; .loc 1 84 15 cvt.s64.s32 %r35,%r97; add.u64 %r90,%r145,40; add.u64 %r83,%r144,40; .loc 1 75 5 mov.u64 %r94,%r152; .loc 1 80 7 mov.u64 %r92,%r94; .loc 1 82 12 mov.u64 %r127,%r94; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; .loc 1 96 20 mov.u64 %r309,%r94; $L8: .loc 1 92 17 ld.u64 %r156,[%r90+16]; add.u64 %r155,%r156,1; .loc 1 92 15 ld.u64 %r157,[%r90+8]; sub.u64 %r87,%r155,%r157; 
.loc 1 86 19 ld.u64 %r81,[%r83]; .loc 1 89 19 ld.u64 %r85,[%r90]; .loc 1 84 10 setp.ne.u64 %r158,%r35,%r127; @ %r158 bra $L4; .loc 1 87 14 setp.ne.u64 %r159,%r81,0; .loc 1 90 14 setp.ne.u64 %r160,%r85,0; .loc 1 92 15 mov.u64 %r92,%r87; selp.u64 %r86,%r85,1,%r160; selp.u64 %r82,%r81,1,%r159; bra $L7; $L4: .loc 1 96 20 shl.b64 %r161,%r94,3; add.u64 %r162,%frame,%r161; add.u64 %r163,%r162,480; st.u64 [%r163],%r309; .loc 1 97 21 st.u64 [%r162+360],%r87; .loc 1 98 22 st.u64 [%r162+840],%r81; .loc 1 99 22 st.u64 [%r162+720],%r85; .loc 1 100 24 add.u64 %r175,%r94,%r94; add.u64 %r176,%r175,%r94; shl.b64 %r177,%r176,3; add.u64 %r178,%r146,%r177; ld.u64 %r25,[%r178+40]; .loc 1 100 22 st.u64 [%r162+600],%r25; .loc 1 101 26 mul.lo.u64 %r186,%r81,%r87; .loc 1 101 13 st.u64 [%r162+240],%r186; .loc 1 102 26 mul.lo.u64 %r190,%r87,%r85; .loc 1 102 13 st.u64 [%r162+120],%r190; .loc 1 103 26 mul.lo.u64 %r193,%r25,%r87; .loc 1 103 13 st.u64 [%r162],%r193; .loc 1 104 12 add.u64 %r94,%r94,1; $L7: .loc 1 82 55 add.u64 %r127,%r127,1; .loc 1 82 3 add.u64 %r90,%r90,24; add.u64 %r83,%r83,24; setp.ne.u64 %r194,%r127,%r143; @ %r194 bra $L8; bra $L3; $L29: .loc 1 80 7 mov.u64 %r92,%r152; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; $L3: .loc 1 107 14 ld.u64 %r29,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r195,%r29,0; .loc 1 108 16 selp.u64 %r29,%r29,1,%r195; .loc 1 109 14 ld.u64 %r30,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r196,%r30,0; .loc 1 110 16 selp.u64 %r30,%r30,1,%r196; .loc 1 111 14 ld.u64 %r31,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r197,%r31,0; .loc 1 112 16 selp.u64 %r31,%r31,1,%r197; .loc 1 118 8 ld.u64 %r124,[%r144]; .loc 1 119 8 ld.u64 %r125,[%r145]; .loc 1 120 8 ld.u64 %r126,[%r146]; .loc 1 122 9 setp.eq.u64 %r198,%r124,0; @ %r198 bra $L1; .loc 1 150 13 shl.b64 %r47,%r82,2; .loc 1 151 12 shl.b64 %r49,%r86,2; .loc 1 162 12 shl.b64 %r52,%r30,2; .loc 1 163 12 shl.b64 %r54,%r29,2; .loc 1 164 12 shl.b64 %r56,%r31,3; .loc 1 138 35 set.u32.eq.u64 %r200,%r82,1; neg.s32 %r201,%r200; .loc 1 138 19 set.u32.eq.u64 %r203,%r86,1; neg.s32 %r204,%r203; .loc 1 138 24 cvt.u16.u32 %r206,%r201; cvt.u16.u32 %r207,%r204; and.b16 %r205,%r206,%r207; cvt.u32.u16 %r208,%r205; cvt.u32.u8 %r84,%r208; .loc 1 167 32 ld.u64 %r103,[%frame+360]; setp.ne.u32 %r294,%r84,0; setp.le.s64 %r292,%r143,2; cvt.u32.u32 %r300,%r142; cvt.s64.s8 %r301,%r300; add.u64 %r302,%r301,-1; add.u64 %r303,%frame,840; add.u64 %r304,%frame,720; add.u64 %r305,%frame,600; add.u64 %r306,%frame,360; .loc 1 172 24 add.u64 %r307,%frame,240; .loc 1 173 24 add.u64 %r308,%frame,120; $L27: .loc 1 125 10 ld.u64 %r98,[%r126]; .loc 1 128 10 setp.ge.s64 %r209,%r98,0; @ %r209 bra $L13; .loc 1 129 12 add.u64 %r98,%r98,%r92; $L13: .loc 1 130 11 set.u32.ge.s64 %r211,%r98,%r92; neg.s32 %r212,%r211; shr.u64 %r214,%r98,63; cvt.u16.u32 %r217,%r212; cvt.u16.u64 %r218,%r214; or.b16 %r216,%r217,%r218; .loc 1 130 10 cvt.u32.u16 %r219,%r216; cvt.u16.u8 %r220,%r219; setp.eq.u16 %r221,%r220,0; @ %r221 bra $L14; .loc 1 132 8 rem.s64 %r98,%r98,%r92; .loc 1 133 7 setp.ge.s64 %r224,%r98,0; @ %r224 bra $L14; .loc 1 134 16 add.u64 %r98,%r98,%r92; $L14: .loc 1 147 24 sub.u64 %r99,%r92,%r98; .loc 1 138 10 @ %r294 bra $L15; .loc 1 147 4 setp.gt.s64 %r226,%r99,0; @ %r226 bra $L16; mov.u64 %r113,%r124; bra $L17; $L15: .loc 1 140 11 shl.b64 %r119,%r98,2; .loc 1 141 11 shl.b64 %r120,%r99,2; .loc 1 142 23 add.u64 %r227,%r125,%r119; .loc 1 142 41270234,[%value_in]; } .loc 1 143 17 add.u64 %r236,%r124,%r120r119243,[%value_in]; } bra $L18; $L16: .loc 1 136 22 mul.lo.u64 %r245,%r98,%r86; 
.loc 1 136 18 shl.b64 %r246,%r245,2; .loc 1 136 11 add.u64 %r117,%r125,%r246; mov.u64 %r116,%r124; .loc 1 147 11 mov.u64 %r118,0; $L19: .loc 1 149 16 ld.u32 %r45,[%r117]; .loc 1 149 14 st.u32 [%r116],%r45; .loc 1 150 13 add.u64 %r116,%r116,%r47; .loc 1 151 12 add.u64 %r117,%r117,%r49; .loc 1 147 31 add.u64 %r118,%r118,1; .loc 1 147 4 setp.ne.u64 %r247,%r99,%r118; @ %r247 bra $L19; .loc 1 150 13 mad.lo.u64 %r113,%r99,%r47,%r124; $L17: .loc 1 153 4 setp.gt.s64 %r249,%r98,0; @ ! %r249 bra $L18; mov.u64 %r114,%r125; .loc 1 153 23 mov.u64 %r115,0; $L20: .loc 1 155 16 ld.u32 %r50,[%r114]; .loc 1 155 14 st.u32 [%r113],%r50; .loc 1 156 13 add.u64 %r113,%r113,%r47; .loc 1 157 12 add.u64 %r114,%r114,%r49; .loc 1 153 37 add.u64 %r115,%r115,1; .loc 1 153 4 setp.ne.u64 %r250,%r98,%r115; @ %r250 bra $L20; $L18: .loc 1 162 12 add.u64 %r124,%r124,%r52; .loc 1 163 12 add.u64 %r125,%r125,%r54; .loc 1 164 12 add.u64 %r126,%r126,%r56; .loc 1 165 15 ld.u64 %r251,[%frame+480]; add.u64 %r58,%r251,1; .loc 1 167 13 setp.eq.u64 %r252,%r58,%r103; @ %r252 bra $L22; .loc 1 165 15 st.u64 [%frame+480],%r58; bra $L27; $L22: .loc 1 171 20 mov.u64 %r253,0; st.u64 [%frame+480],%r253; .loc 1 172 24 ld.u64 %r60,[%frame+240]; .loc 1 173 16 ld.u64 %r255,[%frame+120]; shl.b64 %r254,%r255,2; sub.u64 %r121,%r125,%r254; .loc 1 174 9 ld.u64 %r257,[%frame]; shl.b64 %r256,%r257,3; sub.u64 %r122,%r126,%r256; .loc 1 176 14 @ %r292 bra $L1; add.u64 %r134,%frame,488; mov.u64 %r132,8; .loc 1 175 12 mov.u64 %r123,1; bra $L25; $L26: .loc 1 171 20 st.u64 [%r134],%r253; .loc 1 172 24 add.u64 %r264,%r307,%r132; ld.u64 %r60,[%r264]; .loc 1 173 24 add.u64 %r266,%r308,%r132; .loc 1 173 16 ld.u64 %r268,[%r266]; shl.b64 %r267,%r268,2; sub.u64 %r121,%r125,%r267; .loc 1 174 17 add.u64 %r269,%frame,%r132; .loc 1 174 9 ld.u64 %r271,[%r269]; shl.b64 %r270,%r271,3; sub.u64 %r122,%r126,%r270; .loc 1 175 12 add.u64 %r123,%r123,1; .loc 1 176 14 add.u64 %r134,%r134,8; add.u64 %r132,%r132,8; setp.eq.u64 %r272,%r302,%r123; @ %r272 bra $L1; $L25: .loc 1 184 23 ld.u64 %r273,[%r134]; add.u64 %r70,%r273,1; st.u64 [%r134],%r70; .loc 1 185 30 add.u64 %r275,%r303,%r132; .loc 1 185 20 ld.u64 %r277,[%r275]; sub.u64 %r276,%r277,%r60; shl.b64 %r278,%r276,2; add.u64 %r124,%r124,%r278; .loc 1 186 30 add.u64 %r280,%r304,%r132; .loc 1 186 20 ld.u64 %r282,[%r280]; shl.b64 %r281,%r282,2; add.u64 %r125,%r121,%r281; .loc 1 187 23 add.u64 %r284,%r305,%r132; .loc 1 187 13 ld.u64 %r286,[%r284]; shl.b64 %r285,%r286,3; add.u64 %r126,%r122,%r285; .loc 1 167 32 add.u64 %r288,%r306,%r132; .loc 1 167 13 ld.u64 %r289,[%r288]; setp.eq.u64 %r290,%r70,%r289; @ %r290 bra $L26; bra $L27; $Lgfortrani_cshift1_8_i8 .visible .func _gfortrani_cshift1_8_fortran/generated/cshift1_8_i8.c"F: _gfortrani_cshift1_8_i8 .visible .func _gfortrani_cshift1_8_960pred32predpredu16u64 %r214; .reg .u16 %r216; .reg .u16 %r217; .reg .u16 %r218; .reg .u32 %r219; .reg .u16 %r220; .reg .pred %r221; .reg .pred %r224; .reg .pred %r226; .reg .u64 %r227; .reg .u64pred %r247; .reg .pred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .predu64 %r267; .reg .u6464u64 %r305; .reg .u64 %r306; .reg .u64mov.u64 %r144,%ar0; mov.u64 %r145,%ar1; mov.u64 %r146,%ar2; mov.u64 %r147,%ar3; .loc 1 68 6 setp.eq.u64 %r148,%r147,0; @ %r148 bra $L28; .loc 1 69 21 ld.u32 %r150,[%r147]; add.u32 %r97,%r150,-1; bra $L2; $L28: .loc 1 71 11 cvt.u32.u64 %r97,%r147; $L2: .loc 1 73 13 mov.u64 %r151,1; st.u64 [%frame+360],%r151; .loc 1 74 12 mov.u64 %r152,0; st.u64 [%frame+480],%r152; .loc 1 82 23 ld.s8 %r142,[%r145+28]; cvt.u32.u32 %r153,%r142; 
cvt.s64.s8 %r143,%r153; .loc 1 82 3 setp.le.s64 %r154,%r143,0; @ %r154 bra $L29; .loc 1 84 15 cvt.s64.s32 %r35,%r97; add.u64 %r90,%r145,40; add.u64 %r83,%r144,40; .loc 1 75 5 mov.u64 %r94,%r152; .loc 1 80 7 mov.u64 %r92,%r94; .loc 1 82 12 mov.u64 %r127,%r94; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; .loc 1 96 20 mov.u64 %r309,%r94; $L8: .loc 1 92 17 ld.u64 %r156,[%r90+16]; add.u64 %r155,%r156,1; .loc 1 92 15 ld.u64 %r157,[%r90+8]; sub.u64 %r87,%r155,%r157; .loc 1 86 19 ld.u64 %r81,[%r83]; .loc 1 89 19 ld.u64 %r85,[%r90]; .loc 1 84 10 setp.ne.u64 %r158,%r35,%r127; @ %r158 bra $L4; .loc 1 87 14 setp.ne.u64 %r159,%r81,0; .loc 1 90 14 setp.ne.u64 %r160,%r85,0; .loc 1 92 15 mov.u64 %r92,%r87; selp.u64 %r86,%r85,1,%r160; selp.u64 %r82,%r81,1,%r159; bra $L7; $L4: .loc 1 96 20 shl.b64 %r161,%r94,3; add.u64 %r162,%frame,%r161; add.u64 %r163,%r162,480; st.u64 [%r163],%r309; .loc 1 97 21 st.u64 [%r162+360],%r87; .loc 1 98 22 st.u64 [%r162+840],%r81; .loc 1 99 22 st.u64 [%r162+720],%r85; .loc 1 100 24 add.u64 %r175,%r94,%r94; add.u64 %r176,%r175,%r94; shl.b64 %r177,%r176,3; add.u64 %r178,%r146,%r177; ld.u64 %r25,[%r178+40]; .loc 1 100 22 st.u64 [%r162+600],%r25; .loc 1 101 26 mul.lo.u64 %r186,%r81,%r87; .loc 1 101 13 st.u64 [%r162+240],%r186; .loc 1 102 26 mul.lo.u64 %r190,%r87,%r85; .loc 1 102 13 st.u64 [%r162+120],%r190; .loc 1 103 26 mul.lo.u64 %r193,%r25,%r87; .loc 1 103 13 st.u64 [%r162],%r193; .loc 1 104 12 add.u64 %r94,%r94,1; $L7: .loc 1 82 55 add.u64 %r127,%r127,1; .loc 1 82 3 add.u64 %r90,%r90,24; add.u64 %r83,%r83,24; setp.ne.u64 %r194,%r127,%r143; @ %r194 bra $L8; bra $L3; $L29: .loc 1 80 7 mov.u64 %r92,%r152; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; $L3: .loc 1 107 14 ld.u64 %r29,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r195,%r29,0; .loc 1 108 16 selp.u64 %r29,%r29,1,%r195; .loc 1 109 14 ld.u64 %r30,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r196,%r30,0; .loc 1 110 16 selp.u64 %r30,%r30,1,%r196; .loc 1 111 14 ld.u64 %r31,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r197,%r31,0; .loc 1 112 16 selp.u64 %r31,%r31,1,%r197; .loc 1 118 8 ld.u64 %r124,[%r144]; .loc 1 119 8 ld.u64 %r125,[%r145]; .loc 1 120 8 ld.u64 %r126,[%r146]; .loc 1 122 9 setp.eq.u64 %r198,%r124,0; @ %r198 bra $L1; .loc 1 150 13 shl.b64 %r47,%r82,3; .loc 1 151 12 shl.b64 %r49,%r86,3; .loc 1 162 12 shl.b64 %r52,%r30,3; .loc 1 163 12 shl.b64 %r54,%r29,3; .loc 1 164 12 shl.b64 %r56,%r31,3; .loc 1 138 35 set.u32.eq.u64 %r200,%r82,1; neg.s32 %r201,%r200; .loc 1 138 19 set.u32.eq.u64 %r203,%r86,1; neg.s32 %r204,%r203; .loc 1 138 24 cvt.u16.u32 %r206,%r201; cvt.u16.u32 %r207,%r204; and.b16 %r205,%r206,%r207; cvt.u32.u16 %r208,%r205; cvt.u32.u8 %r84,%r208; .loc 1 167 32 ld.u64 %r103,[%frame+360]; setp.ne.u32 %r294,%r84,0; setp.le.s64 %r292,%r143,2; cvt.u32.u32 %r300,%r142; cvt.s64.s8 %r301,%r300; add.u64 %r302,%r301,-1; add.u64 %r303,%frame,840; add.u64 %r304,%frame,720; add.u64 %r305,%frame,600; add.u64 %r306,%frame,360; .loc 1 172 24 add.u64 %r307,%frame,240; .loc 1 173 24 add.u64 %r308,%frame,120; $L27: .loc 1 125 10 ld.u64 %r98,[%r126]; .loc 1 128 10 setp.ge.s64 %r209,%r98,0; @ %r209 bra $L13; .loc 1 129 12 add.u64 %r98,%r98,%r92; $L13: .loc 1 130 11 set.u32.ge.s64 %r211,%r98,%r92; neg.s32 %r212,%r211; shr.u64 %r214,%r98,63; cvt.u16.u32 %r217,%r212; cvt.u16.u64 %r218,%r214; or.b16 %r216,%r217,%r218; .loc 1 130 10 cvt.u32.u16 %r219,%r216; cvt.u16.u8 %r220,%r219; setp.eq.u16 %r221,%r220,0; @ %r221 bra $L14; .loc 1 132 8 rem.s64 %r98,%r98,%r92; .loc 1 133 7 setp.ge.s64 %r224,%r98,0; @ %r224 bra 
$L14; .loc 1 134 16 add.u64 %r98,%r98,%r92; $L14: .loc 1 147 24 sub.u64 %r99,%r92,%r98; .loc 1 138 10 @ %r294 bra $L15; .loc 1 147 4 setp.gt.s64 %r226,%r99,0; @ %r226 bra $L16; mov.u64 %r113,%r124; bra $L17; $L15: .loc 1 140 11 shl.b64 %r119,%r98,3; .loc 1 141 11 shl.b64 %r120,%r99,3; .loc 1 142 23 add.u64 %r227,%r125,%r119; .loc 1 142 41270234,[%value_in]; } .loc 1 143 17 add.u64 %r236,%r124,%r120r119243,[%value_in]; } bra $L18; $L16: .loc 1 136 22 mul.lo.u64 %r245,%r98,%r86; .loc 1 136 18 shl.b64 %r246,%r245,3; .loc 1 136 11 add.u64 %r117,%r125,%r246; mov.u64 %r116,%r124; .loc 1 147 11 mov.u64 %r118,0; $L19: .loc 1 149 16 ld.u64 %r45,[%r117]; .loc 1 149 14 st.u64 [%r116],%r45; .loc 1 150 13 add.u64 %r116,%r116,%r47; .loc 1 151 12 add.u64 %r117,%r117,%r49; .loc 1 147 31 add.u64 %r118,%r118,1; .loc 1 147 4 setp.ne.u64 %r247,%r99,%r118; @ %r247 bra $L19; .loc 1 150 13 mad.lo.u64 %r113,%r99,%r47,%r124; $L17: .loc 1 153 4 setp.gt.s64 %r249,%r98,0; @ ! %r249 bra $L18; mov.u64 %r114,%r125; .loc 1 153 23 mov.u64 %r115,0; $L20: .loc 1 155 16 ld.u64 %r50,[%r114]; .loc 1 155 14 st.u64 [%r113],%r50; .loc 1 156 13 add.u64 %r113,%r113,%r47; .loc 1 157 12 add.u64 %r114,%r114,%r49; .loc 1 153 37 add.u64 %r115,%r115,1; .loc 1 153 4 setp.ne.u64 %r250,%r98,%r115; @ %r250 bra $L20; $L18: .loc 1 162 12 add.u64 %r124,%r124,%r52; .loc 1 163 12 add.u64 %r125,%r125,%r54; .loc 1 164 12 add.u64 %r126,%r126,%r56; .loc 1 165 15 ld.u64 %r251,[%frame+480]; add.u64 %r58,%r251,1; .loc 1 167 13 setp.eq.u64 %r252,%r58,%r103; @ %r252 bra $L22; .loc 1 165 15 st.u64 [%frame+480],%r58; bra $L27; $L22: .loc 1 171 20 mov.u64 %r253,0; st.u64 [%frame+480],%r253; .loc 1 172 24 ld.u64 %r60,[%frame+240]; .loc 1 173 16 ld.u64 %r255,[%frame+120]; shl.b64 %r254,%r255,3; sub.u64 %r121,%r125,%r254; .loc 1 174 9 ld.u64 %r257,[%frame]; shl.b64 %r256,%r257,3; sub.u64 %r122,%r126,%r256; .loc 1 176 14 @ %r292 bra $L1; add.u64 %r134,%frame,488; mov.u64 %r132,8; .loc 1 175 12 mov.u64 %r123,1; bra $L25; $L26: .loc 1 171 20 st.u64 [%r134],%r253; .loc 1 172 24 add.u64 %r264,%r307,%r132; ld.u64 %r60,[%r264]; .loc 1 173 24 add.u64 %r266,%r308,%r132; .loc 1 173 16 ld.u64 %r268,[%r266]; shl.b64 %r267,%r268,3; sub.u64 %r121,%r125,%r267; .loc 1 174 17 add.u64 %r269,%frame,%r132; .loc 1 174 9 ld.u64 %r271,[%r269]; shl.b64 %r270,%r271,3; sub.u64 %r122,%r126,%r270; .loc 1 175 12 add.u64 %r123,%r123,1; .loc 1 176 14 add.u64 %r134,%r134,8; add.u64 %r132,%r132,8; setp.eq.u64 %r272,%r302,%r123; @ %r272 bra $L1; $L25: .loc 1 184 23 ld.u64 %r273,[%r134]; add.u64 %r70,%r273,1; st.u64 [%r134],%r70; .loc 1 185 30 add.u64 %r275,%r303,%r132; .loc 1 185 20 ld.u64 %r277,[%r275]; sub.u64 %r276,%r277,%r60; shl.b64 %r278,%r276,3; add.u64 %r124,%r124,%r278; .loc 1 186 30 add.u64 %r280,%r304,%r132; .loc 1 186 20 ld.u64 %r282,[%r280]; shl.b64 %r281,%r282,3; add.u64 %r125,%r121,%r281; .loc 1 187 23 add.u64 %r284,%r305,%r132; .loc 1 187 13 ld.u64 %r286,[%r284]; shl.b64 %r285,%r286,3; add.u64 %r126,%r122,%r285; .loc 1 167 32 add.u64 %r288,%r306,%r132; .loc 1 167 13 ld.u64 %r289,[%r288]; setp.eq.u64 %r290,%r70,%r289; @ %r290 bra $L26; bra $L27; $L1: .loc 1 191 1 ret; } cshift1_8_i16.o/1622802209gfortrani_cshift1_8_i16 .visible .func _gfortrani_cshift1_8_i16fortran/generated/cshift1_8_i16.c"F: _gfortrani_cshift1_8_i16 .visible .func _gfortrani_cshift1_8_i16960pred32predpredu16u64 %r214; .reg .u16 %r216; .reg .u16 %r217; .reg .u16 %r218; .reg .u32 %r219; .reg .u16 %r220; .reg .pred %r221; .reg .pred %r224; .reg .pred %r226; .reg .u64 %r227; .reg .u64pred %r249; .reg .predpred 
%r276; .reg .u64 %r277; .reg .u64 %r279; .reg .u64 %r280; .reg .u64 %r281; .reg .u64 %r282; .reg .u64predpred %r300; .reg .pred %r303; .reg .u32u64 %r313; .reg .u64mov.u64 %r144,%ar0; mov.u64 %r145,%ar1; mov.u64 %r146,%ar2; mov.u64 %r147,%ar3; .loc 1 68 6 setp.eq.u64 %r148,%r147,0; @ %r148 bra $L28; .loc 1 69 21 ld.u32 %r150,[%r147]; add.u32 %r97,%r150,-1; bra $L2; $L28: .loc 1 71 11 cvt.u32.u64 %r97,%r147; $L2: .loc 1 73 13 mov.u64 %r151,1; st.u64 [%frame+360],%r151; .loc 1 74 12 mov.u64 %r152,0; st.u64 [%frame+480],%r152; .loc 1 82 23 ld.s8 %r142,[%r145+28]; cvt.u32.u32 %r153,%r142; cvt.s64.s8 %r143,%r153; .loc 1 82 3 setp.le.s64 %r154,%r143,0; @ %r154 bra $L29; .loc 1 84 15 cvt.s64.s32 %r35,%r97; add.u64 %r90,%r145,40; add.u64 %r83,%r144,40; .loc 1 75 5 mov.u64 %r94,%r152; .loc 1 80 7 mov.u64 %r92,%r94; .loc 1 82 12 mov.u64 %r127,%r94; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; .loc 1 96 20 mov.u64 %r317,%r94; $L8: .loc 1 92 17 ld.u64 %r156,[%r90+16]; add.u64 %r155,%r156,1; .loc 1 92 15 ld.u64 %r157,[%r90+8]; sub.u64 %r87,%r155,%r157; .loc 1 86 19 ld.u64 %r81,[%r83]; .loc 1 89 19 ld.u64 %r85,[%r90]; .loc 1 84 10 setp.ne.u64 %r158,%r35,%r127; @ %r158 bra $L4; .loc 1 87 14 setp.ne.u64 %r159,%r81,0; .loc 1 90 14 setp.ne.u64 %r160,%r85,0; .loc 1 92 15 mov.u64 %r92,%r87; selp.u64 %r86,%r85,1,%r160; selp.u64 %r82,%r81,1,%r159; bra $L7; $L4: .loc 1 96 20 shl.b64 %r161,%r94,3; add.u64 %r162,%frame,%r161; add.u64 %r163,%r162,480; st.u64 [%r163],%r317; .loc 1 97 21 st.u64 [%r162+360],%r87; .loc 1 98 22 st.u64 [%r162+840],%r81; .loc 1 99 22 st.u64 [%r162+720],%r85; .loc 1 100 24 add.u64 %r175,%r94,%r94; add.u64 %r176,%r175,%r94; shl.b64 %r177,%r176,3; add.u64 %r178,%r146,%r177; ld.u64 %r25,[%r178+40]; .loc 1 100 22 st.u64 [%r162+600],%r25; .loc 1 101 26 mul.lo.u64 %r186,%r81,%r87; .loc 1 101 13 st.u64 [%r162+240],%r186; .loc 1 102 26 mul.lo.u64 %r190,%r87,%r85; .loc 1 102 13 st.u64 [%r162+120],%r190; .loc 1 103 26 mul.lo.u64 %r193,%r25,%r87; .loc 1 103 13 st.u64 [%r162],%r193; .loc 1 104 12 add.u64 %r94,%r94,1; $L7: .loc 1 82 55 add.u64 %r127,%r127,1; .loc 1 82 3 add.u64 %r90,%r90,24; add.u64 %r83,%r83,24; setp.ne.u64 %r194,%r127,%r143; @ %r194 bra $L8; bra $L3; $L29: .loc 1 80 7 mov.u64 %r92,%r152; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; $L3: .loc 1 107 14 ld.u64 %r29,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r195,%r29,0; .loc 1 108 16 selp.u64 %r29,%r29,1,%r195; .loc 1 109 14 ld.u64 %r30,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r196,%r30,0; .loc 1 110 16 selp.u64 %r30,%r30,1,%r196; .loc 1 111 14 ld.u64 %r31,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r197,%r31,0; .loc 1 112 16 selp.u64 %r31,%r31,1,%r197; .loc 1 118 8 ld.u64 %r124,[%r144]; .loc 1 119 8 ld.u64 %r125,[%r145]; .loc 1 120 8 ld.u64 %r126,[%r146]; .loc 1 122 9 setp.eq.u64 %r198,%r124,0; @ %r198 bra $L1; .loc 1 150 13 shl.b64 %r47,%r82,4; .loc 1 151 12 shl.b64 %r49,%r86,4; .loc 1 162 12 shl.b64 %r52,%r30,4; .loc 1 163 12 shl.b64 %r54,%r29,4; .loc 1 164 12 shl.b64 %r56,%r31,3; .loc 1 138 35 set.u32.eq.u64 %r200,%r82,1; neg.s32 %r201,%r200; .loc 1 138 19 set.u32.eq.u64 %r203,%r86,1; neg.s32 %r204,%r203; .loc 1 138 24 cvt.u16.u32 %r206,%r201; cvt.u16.u32 %r207,%r204; and.b16 %r205,%r206,%r207; cvt.u32.u16 %r208,%r205; cvt.u32.u8 %r84,%r208; .loc 1 167 32 ld.u64 %r103,[%frame+360]; setp.ne.u32 %r303,%r84,0; setp.le.s64 %r300,%r143,2; cvt.u32.u32 %r308,%r142; cvt.s64.s8 %r309,%r308; add.u64 %r310,%r309,-1; add.u64 %r311,%frame,840; add.u64 %r312,%frame,720; add.u64 %r313,%frame,600; add.u64 
%r314,%frame,360; .loc 1 172 24 add.u64 %r315,%frame,240; .loc 1 173 24 add.u64 %r316,%frame,120; $L27: .loc 1 125 10 ld.u64 %r98,[%r126]; .loc 1 128 10 setp.ge.s64 %r209,%r98,0; @ %r209 bra $L13; .loc 1 129 12 add.u64 %r98,%r98,%r92; $L13: .loc 1 130 11 set.u32.ge.s64 %r211,%r98,%r92; neg.s32 %r212,%r211; shr.u64 %r214,%r98,63; cvt.u16.u32 %r217,%r212; cvt.u16.u64 %r218,%r214; or.b16 %r216,%r217,%r218; .loc 1 130 10 cvt.u32.u16 %r219,%r216; cvt.u16.u8 %r220,%r219; setp.eq.u16 %r221,%r220,0; @ %r221 bra $L14; .loc 1 132 8 rem.s64 %r98,%r98,%r92; .loc 1 133 7 setp.ge.s64 %r224,%r98,0; @ %r224 bra $L14; .loc 1 134 16 add.u64 %r98,%r98,%r92; $L14: .loc 1 147 24 sub.u64 %r99,%r92,%r98; .loc 1 138 10 @ %r303 bra $L15; .loc 1 147 4 setp.gt.s64 %r226,%r99,0; @ %r226 bra $L16; mov.u64 %r113,%r124; bra $L17; $L15: .loc 1 140 11 shl.b64 %r119,%r98,4; .loc 1 141 11 shl.b64 %r120,%r99,4; .loc 1 142 23 add.u64 %r227,%r125,%r119; .loc 1 142 41270234,[%value_in]; } .loc 1 143 17 add.u64 %r236,%r124,%r120r119243,[%value_in]; } bra $L18; $L16: .loc 1 136 22 mul.lo.u64 %r245,%r98,%r86; .loc 1 136 18 shl.b64 %r246,%r245,4; .loc 1 136 11 add.u64 %r117,%r125,%r246; mov.u64 %r116,%r124; .loc 1 147 11 mov.u64 %r118,0; $L19: .loc 1 149 16 ld.u64 %r295,[%r117]; ld.u64 %r296,[%r117+8]; .loc 1 149 14 st.u64 [%r116],%r295; st.u64 [%r116+8],%r296; .loc 1 150 13 add.u64 %r116,%r116,%r47; .loc 1 151 12 add.u64 %r117,%r117,%r49; .loc 1 147 31 add.u64 %r118,%r118,1; .loc 1 147 4 setp.ne.u64 %r249,%r99,%r118; @ %r249 bra $L19; .loc 1 150 13 mad.lo.u64 %r113,%r99,%r47,%r124; $L17: .loc 1 153 4 setp.gt.s64 %r251,%r98,0; @ ! %r251 bra $L18; mov.u64 %r114,%r125; .loc 1 153 23 mov.u64 %r115,0; $L20: .loc 1 155 16 ld.u64 %r297,[%r114]; ld.u64 %r298,[%r114+8]; .loc 1 155 14 st.u64 [%r113],%r297; st.u64 [%r113+8],%r298; .loc 1 156 13 add.u64 %r113,%r113,%r47; .loc 1 157 12 add.u64 %r114,%r114,%r49; .loc 1 153 37 add.u64 %r115,%r115,1; .loc 1 153 4 setp.ne.u64 %r254,%r98,%r115; @ %r254 bra $L20; $L18: .loc 1 162 12 add.u64 %r124,%r124,%r52; .loc 1 163 12 add.u64 %r125,%r125,%r54; .loc 1 164 12 add.u64 %r126,%r126,%r56; .loc 1 165 15 ld.u64 %r255,[%frame+480]; add.u64 %r58,%r255,1; .loc 1 167 13 setp.eq.u64 %r256,%r58,%r103; @ %r256 bra $L22; .loc 1 165 15 st.u64 [%frame+480],%r58; bra $L27; $L22: .loc 1 171 20 mov.u64 %r257,0; st.u64 [%frame+480],%r257; .loc 1 172 24 ld.u64 %r60,[%frame+240]; .loc 1 173 16 ld.u64 %r259,[%frame+120]; shl.b64 %r258,%r259,4; sub.u64 %r121,%r125,%r258; .loc 1 174 9 ld.u64 %r261,[%frame]; shl.b64 %r260,%r261,3; sub.u64 %r122,%r126,%r260; .loc 1 176 14 @ %r300 bra $L1; add.u64 %r134,%frame,488; mov.u64 %r132,8; .loc 1 175 12 mov.u64 %r123,1; bra $L25; $L26: .loc 1 171 20 st.u64 [%r134],%r257; .loc 1 172 24 add.u64 %r268,%r315,%r132; ld.u64 %r60,[%r268]; .loc 1 173 24 add.u64 %r270,%r316,%r132; .loc 1 173 16 ld.u64 %r272,[%r270]; shl.b64 %r271,%r272,4; sub.u64 %r121,%r125,%r271; .loc 1 174 17 add.u64 %r273,%frame,%r132; .loc 1 174 9 ld.u64 %r275,[%r273]; shl.b64 %r274,%r275,3; sub.u64 %r122,%r126,%r274; .loc 1 175 12 add.u64 %r123,%r123,1; .loc 1 176 14 add.u64 %r134,%r134,8; add.u64 %r132,%r132,8; setp.eq.u64 %r276,%r310,%r123; @ %r276 bra $L1; $L25: .loc 1 184 23 ld.u64 %r277,[%r134]; add.u64 %r70,%r277,1; st.u64 [%r134],%r70; .loc 1 185 30 add.u64 %r279,%r311,%r132; .loc 1 185 20 ld.u64 %r281,[%r279]; sub.u64 %r280,%r281,%r60; shl.b64 %r282,%r280,4; add.u64 %r124,%r124,%r282; .loc 1 186 30 add.u64 %r284,%r312,%r132; .loc 1 186 20 ld.u64 %r286,[%r284]; shl.b64 %r285,%r286,4; add.u64 %r125,%r121,%r285; 
.loc 1 187 23 add.u64 %r288,%r313,%r132; .loc 1 187 13 ld.u64 %r290,[%r288]; shl.b64 %r289,%r290,3; add.u64 %r126,%r122,%r289; .loc 1 167 32 add.u64 %r292,%r314,%r132; .loc 1 167 13 ld.u64 %r293,[%r292]; setp.eq.u64 %r294,%r70,%r293; @ %r294 bra $L26; bra $L27; $L1: .loc 1 191 1 ret; } cshift1_8_r4.o/ 1622802209gfortrani_cshift1_8_r4 .visible .func _gfortrani_cshift1_8_rfortran/generated/cshift1_8_r4.c"F: _gfortrani_cshift1_8_r4 .visible .func _gfortrani_cshift1_8_r960pred32predpredu16u64 %r214; .reg .u16 %r216; .reg .u16 %r217; .reg .u16 %r218; .reg .u32 %r219; .reg .u16 %r220; .reg .pred %r221; .reg .pred %r224; .reg .pred %r226; .reg .u64 %r227; .reg .u64pred %r247; .reg .pred %r249; .reg .pred %r250; .reg .u64 %r251; .reg .predu64 %r267; .reg .u6464u64 %r305; .reg .u64 %r306; .reg .u64mov.u64 %r144,%ar0; mov.u64 %r145,%ar1; mov.u64 %r146,%ar2; mov.u64 %r147,%ar3; .loc 1 68 6 setp.eq.u64 %r148,%r147,0; @ %r148 bra $L28; .loc 1 69 21 ld.u32 %r150,[%r147]; add.u32 %r97,%r150,-1; bra $L2; $L28: .loc 1 71 11 cvt.u32.u64 %r97,%r147; $L2: .loc 1 73 13 mov.u64 %r151,1; st.u64 [%frame+360],%r151; .loc 1 74 12 mov.u64 %r152,0; st.u64 [%frame+480],%r152; .loc 1 82 23 ld.s8 %r142,[%r145+28]; cvt.u32.u32 %r153,%r142; cvt.s64.s8 %r143,%r153; .loc 1 82 3 setp.le.s64 %r154,%r143,0; @ %r154 bra $L29; .loc 1 84 15 cvt.s64.s32 %r35,%r97; add.u64 %r90,%r145,40; add.u64 %r83,%r144,40; .loc 1 75 5 mov.u64 %r94,%r152; .loc 1 80 7 mov.u64 %r92,%r94; .loc 1 82 12 mov.u64 %r127,%r94; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; .loc 1 96 20 mov.u64 %r309,%r94; $L8: .loc 1 92 17 ld.u64 %r156,[%r90+16]; add.u64 %r155,%r156,1; .loc 1 92 15 ld.u64 %r157,[%r90+8]; sub.u64 %r87,%r155,%r157; .loc 1 86 19 ld.u64 %r81,[%r83]; .loc 1 89 19 ld.u64 %r85,[%r90]; .loc 1 84 10 setp.ne.u64 %r158,%r35,%r127; @ %r158 bra $L4; .loc 1 87 14 setp.ne.u64 %r159,%r81,0; .loc 1 90 14 setp.ne.u64 %r160,%r85,0; .loc 1 92 15 mov.u64 %r92,%r87; selp.u64 %r86,%r85,1,%r160; selp.u64 %r82,%r81,1,%r159; bra $L7; $L4: .loc 1 96 20 shl.b64 %r161,%r94,3; add.u64 %r162,%frame,%r161; add.u64 %r163,%r162,480; st.u64 [%r163],%r309; .loc 1 97 21 st.u64 [%r162+360],%r87; .loc 1 98 22 st.u64 [%r162+840],%r81; .loc 1 99 22 st.u64 [%r162+720],%r85; .loc 1 100 24 add.u64 %r175,%r94,%r94; add.u64 %r176,%r175,%r94; shl.b64 %r177,%r176,3; add.u64 %r178,%r146,%r177; ld.u64 %r25,[%r178+40]; .loc 1 100 22 st.u64 [%r162+600],%r25; .loc 1 101 26 mul.lo.u64 %r186,%r81,%r87; .loc 1 101 13 st.u64 [%r162+240],%r186; .loc 1 102 26 mul.lo.u64 %r190,%r87,%r85; .loc 1 102 13 st.u64 [%r162+120],%r190; .loc 1 103 26 mul.lo.u64 %r193,%r25,%r87; .loc 1 103 13 st.u64 [%r162],%r193; .loc 1 104 12 add.u64 %r94,%r94,1; $L7: .loc 1 82 55 add.u64 %r127,%r127,1; .loc 1 82 3 add.u64 %r90,%r90,24; add.u64 %r83,%r83,24; setp.ne.u64 %r194,%r127,%r143; @ %r194 bra $L8; bra $L3; $L29: .loc 1 80 7 mov.u64 %r92,%r152; .loc 1 79 11 mov.u64 %r86,%r151; .loc 1 78 11 mov.u64 %r82,%r86; $L3: .loc 1 107 14 ld.u64 %r29,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r195,%r29,0; .loc 1 108 16 selp.u64 %r29,%r29,1,%r195; .loc 1 109 14 ld.u64 %r30,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r196,%r30,0; .loc 1 110 16 selp.u64 %r30,%r30,1,%r196; .loc 1 111 14 ld.u64 %r31,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r197,%r31,0; .loc 1 112 16 selp.u64 %r31,%r31,1,%r197; .loc 1 118 8 ld.u64 %r124,[%r144]; .loc 1 119 8 ld.u64 %r125,[%r145]; .loc 1 120 8 ld.u64 %r126,[%r146]; .loc 1 122 9 setp.eq.u64 %r198,%r124,0; @ %r198 bra $L1; .loc 1 150 13 shl.b64 %r47,%r82,2; .loc 1 151 12 shl.b64 %r49,%r86,2; 
.loc 1 162 12 shl.b64 %r52,%r30,2; .loc 1 163 12 shl.b64 %r54,%r29,2; .loc 1 164 12 shl.b64 %r56,%r31,3; .loc 1 138 35 set.u32.eq.u64 %r200,%r82,1; neg.s32 %r201,%r200; .loc 1 138 19 set.u32.eq.u64 %r203,%r86,1; neg.s32 %r204,%r203; .loc 1 138 24 cvt.u16.u32 %r206,%r201; cvt.u16.u32 %r207,%r204; and.b16 %r205,%r206,%r207; cvt.u32.u16 %r208,%r205; cvt.u32.u8 %r84,%r208; .loc 1 167 32 ld.u64 %r103,[%frame+360]; setp.ne.u32 %r294,%r84,0; setp.le.s64 %r292,%r143,2; cvt.u32.u32 %r300,%r142; cvt.s64.s8 %r301,%r300; add.u64 %r302,%r301,-1; add.u64 %r303,%frame,840; add.u64 %r304,%frame,720; add.u64 %r305,%frame,600; add.u64 %r306,%frame,360; .loc 1 172 24 add.u64 %r307,%frame,240; .loc 1 173 24 add.u64 %r308,%frame,120; $L27: .loc 1 125 10 ld.u64 %r98,[%r126]; .loc 1 128 10 setp.ge.s64 %r209,%r98,0; @ %r209 bra $L13; .loc 1 129 12 add.u64 %r98,%r98,%r92; $L13: .loc 1 130 11 set.u32.ge.s64 %r211,%r98,%r92; neg.s32 %r212,%r211; shr.u64 %r214,%r98,63; cvt.u16.u32 %r217,%r212; cvt.u16.u64 %r218,%r214; or.b16 %r216,%r217,%r218; .loc 1 130 10 cvt.u32.u16 %r219,%r216; cvt.u16.u8 %r220,%r219; setp.eq.u16 %r221,%r220,0; @ %r221 bra $L14; .loc 1 132 8 rem.s64 %r98,%r98,%r92; .loc 1 133 7 setp.ge.s64 %r224,%r98,0; @ %r224 bra $L14; .loc 1 134 16 add.u64 %r98,%r98,%r92; $L14: .loc 1 147 24 sub.u64 %r99,%r92,%r98; .loc 1 138 10 @ %r294 bra $L15; .loc 1 147 4 setp.gt.s64 %r226,%r99,0; @ %r226 bra $L16; mov.u64 %r113,%r124; bra $L17; $L15: .loc 1 140 11 shl.b64 %r119,%r98,2; .loc 1 141 11 shl.b64 %r120,%r99,2; .loc 1 142 23 add.u64 %r227,%r125,%r119; .loc 1 142 41270234,[%value_in]; } .loc 1 143 17 add.u64 %r236,%r124,%r120r119243,[%value_in]; } bra $L18; $L16: .loc 1 136 22 mul.lo.u64 %r245,%r98,%r86; .loc 1 136 18 shl.b64 %r246,%r245,2; .loc 1 136 11 add.u64 %r117,%r125,%r246; mov.u64 %r116,%r124; .loc 1 147 11 mov.u64 %r118,0; $L19: .loc 1 149 16 ld.f32 %r45,[%r117]; .loc 1 149 14 st.f32 [%r116],%r45; .loc 1 150 13 add.u64 %r116,%r116,%r47; .loc 1 151 12 add.u64 %r117,%r117,%r49; .loc 1 147 31 add.u64 %r118,%r118,1; .loc 1 147 4 setp.ne.u64 %r247,%r99,%r118; @ %r247 bra $L19; .loc 1 150 13 mad.lo.u64 %r113,%r99,%r47,%r124; $L17: .loc 1 153 4 setp.gt.s64 %r249,%r98,0; @ ! 
[Binary payload, condensed: this span is part of an ar archive of nvptx PTX object files from libgfortran. The PTX bodies are interleaved with ar member headers and truncated register-declaration/call sequences and are not fully recoverable; the identifiable members and entry points in this span are listed below.]

  (tail of the preceding member's function, truncated at the start of this span)
  cshift1_8_r8.o   -- .visible .func _gfortrani_cshift1_8_r8   (fortran/generated/cshift1_8_r8.c)
  cshift1_8_r10.o  -- member header only; contents not recovered
  cshift1_8_c4.o   -- .visible .func _gfortrani_cshift1_8_c4   (fortran/generated/cshift1_8_c4.c)
  cshift1_8_c8.o   -- .visible .func _gfortrani_cshift1_8_c8   (fortran/generated/cshift1_8_c8.c)
  cshift1_8_c10.o  -- member header only; contents not recovered
  cshift1_16_i1.o  -- .visible .func _gfortrani_cshift1_16_i1  (fortran/generated/cshift1_16_i1.c; declares .extern .func __modti3)
  cshift1_16_i2.o  -- .visible .func _gfortrani_cshift1_16_i2  (fortran/generated/cshift1_16_i2.c; __modti3)
  cshift1_16_i4.o  -- .visible .func _gfortrani_cshift1_16_i4  (fortran/generated/cshift1_16_i4.c; __modti3)
  cshift1_16_i8.o  -- .visible .func _gfortrani_cshift1_16_i8  (fortran/generated/cshift1_16_i8.c; __modti3)
  cshift1_16_i16.o -- .visible .func _gfortrani_cshift1_16_i16 (fortran/generated/cshift1_16_i16.c; __modti3)
  cshift1_16_r4.o  -- .visible .func _gfortrani_cshift1_16_r4  (fortran/generated/cshift1_16_r4.c; truncated at the end of this span)

Each entry point is the PTX lowering of the corresponding generated C implementation of the Fortran CSHIFT intrinsic (one specialization per element type and shift kind); the surviving bodies share the same structure of two strided copy loops per array section.
%r234 bra $L15; $L39: setp.ge.s64 %r236,%r225,0; @ %r236 bra $L17; $L15: .loc 1 132 8 st.u64 [%frame+976],%r216; st.u64 [%frame+984],%r225; st.u64 [%frame+992],%r89; st.u64 [%frame+1000],%r39960]; ld.u64 %r398,[%frame+968]; .loc 1 133 7 setp.ge.s64 %r254,%r398,0; @ %r254 bra $L17; .loc 1 134 16 add.u64 %r256,%r397,%r89; set.u32.lt.u64 %r261,%r256,%r397; cvt.s64.s32 %r259,%r261; add.u64 %r262,%r398,%r396; mov.u64 %r397,%r256; sub.u64 %r398,%r262,%r259; $L17: .loc 1 147 24 sub.u64 %r420,%r89,%r397; set.u32.gt.u64 %r272,%r420,%r89; cvt.s64.s32 %r270,%r272; sub.u64 %r273,%r396,%r398; add.u64 %r276,%r273,%r270; .loc 1 138 10 @ %r418 bra $L20; .loc 1 147 4 setp.gt.s64 %r279,%r276,0; @ %r279 bra $L21; setp.ne.u64 %r281,%r276,0; @ %r281 bra $L40; setp.ne.u64 %r283,%r420,0; @ %r283 bra $L21; $L40: mov.u64 %r147,%r122; bra $L23; $L20: .loc 1 140 11 shl.b64 %r117,%r397,2; .loc 1 141 11 shl.b64 %r284,%r420,2; .loc 1 142 23 add.u64 %r286,%r123,%r117; .loc 1 142 4284293,[%value_in]; } .loc 1 143 17 add.u64 %r297,%r122,%r284; .loc 1 143 4117304,[%value_in]; } bra $L24; $L21: .loc 1 136 11 mad.lo.u64 %r115,%r397,%r133,%r123; mov.u64 %r114,%r122; .loc 1 147 11 mov.u64 %r116,0; $L25: .loc 1 149 16 ld.f32 %r44,[%r115]; .loc 1 149 14 st.f32 [%r114],%r44; .loc 1 150 13 add.u64 %r114,%r114,%r46; .loc 1 151 12 add.u64 %r115,%r115,%r133; .loc 1 147 31 add.u64 %r116,%r116,1; .loc 1 147 4 setp.ne.u64 %r307,%r420,%r116; @ %r307 bra $L25; .loc 1 150 13 mad.lo.u64 %r147,%r420,%r46,%r122; $L23: .loc 1 153 4 setp.gt.s64 %r324,%r398,0; @ %r324 bra $L26; setp.ne.u64 %r326,%r398,0; @ %r326 bra $L24; setp.ne.u64 %r328,%r397,0; @ ! %r328 bra $L24; $L26: mov.u64 %r112,%r123; .loc 1 153 23 mov.u64 %r113,0; $L28: .loc 1 155 16 ld.f32 %r47,[%r112]; .loc 1 155 14 st.f32 [%r147],%r47; .loc 1 156 13 add.u64 %r147,%r147,%r46; .loc 1 157 12 add.u64 %r112,%r112,%r133; .loc 1 153 37 add.u64 %r113,%r113,1; .loc 1 153 4 setp.ne.u64 %r329,%r397,%r113; @ %r329 bra $L28; $L24: .loc 1 162 12 add.u64 %r122,%r122,%r50; .loc 1 163 12 add.u64 %r123,%r123,%r52; .loc 1 164 12 add.u64 %r124,%r124,%r54; .loc 1 165 15 ld.u64 %r330,[%frame+480]; add.u64 %r56,%r330,1; .loc 1 167 13 setp.eq.u64 %r331,%r56,%r82; @ %r331 bra $L30; .loc 1 165 15 st.u64 [%frame+480],%r56; bra $L36; $L30: .loc 1 171 20 mov.u64 %r332,0; st.u64 [%frame+480],%r332; .loc 1 172 24 ld.u64 %r58,[%frame+240]; .loc 1 173 16 ld.u64 %r334,[%frame+120]; shl.b64 %r333,%r334,2; sub.u64 %r119,%r123,%r333; .loc 1 174 9 ld.u64 %r336,[%frame]; shl.b64 %r335,%r336,4; sub.u64 %r120,%r124,%r335; .loc 1 176 14 @ %r415 bra $L1; add.u64 %r136,%frame,488; mov.u64 %r128,8; .loc 1 175 12 mov.u64 %r121,1; bra $L33; $L34: .loc 1 171 20 st.u64 [%r136],%r332; .loc 1 172 24 add.u64 %r343,%r435,%r128; ld.u64 %r58,[%r343]; .loc 1 173 24 add.u64 %r345,%r436,%r128; .loc 1 173 16 ld.u64 %r347,[%r345]; shl.b64 %r346,%r347,2; sub.u64 %r119,%r123,%r346; .loc 1 174 17 add.u64 %r348,%frame,%r128; .loc 1 174 9 ld.u64 %r350,[%r348]; shl.b64 %r349,%r350,4; sub.u64 %r120,%r124,%r349; .loc 1 175 12 add.u64 %r121,%r121,1; .loc 1 176 14 add.u64 %r136,%r136,8; add.u64 %r128,%r128,8; setp.eq.u64 %r351,%r430,%r121; @ %r351 bra $L1; $L33: .loc 1 184 23 ld.u64 %r352,[%r136]; add.u64 %r68,%r352,1; st.u64 [%r136],%r68; .loc 1 185 30 add.u64 %r354,%r431,%r128; .loc 1 185 20 ld.u64 %r356,[%r354]; sub.u64 %r355,%r356,%r58; shl.b64 %r357,%r355,2; add.u64 %r122,%r122,%r357; .loc 1 186 30 add.u64 %r359,%r432,%r128; .loc 1 186 20 ld.u64 %r361,[%r359]; shl.b64 %r360,%r361,2; add.u64 %r123,%r119,%r360; .loc 1 187 23 add.u64 
%r363,%r433,%r128; .loc 1 187 13 ld.u64 %r365,[%r363]; shl.b64 %r364,%r365,4; add.u64 %r124,%r120,%r364; .loc 1 167 32 add.u64 %r367,%r434,%r128; .loc 1 167 13 ld.u64 %r368,[%r367]; setp.eq.u64 %r369,%r68,%r368; @ %r369 bra $L34; bra $L36; $L13: .loc 1 130 10 setp.gt.s64 %r372,%r396,%r398; @ %r372 bra $L17; setp.ne.u64 %r375,%r396,%r398; @ %r375 bra $L41; setp.gt.u64 %r378,%r89,%r397; @ %r378 bra $L17; $L41: .loc 1 132 8 st.u64 [%frame+976],%r397; st.u64 [%frame+984],%r398; st.u64 [%frame+992],%r89; st.u64 [%frame+1000],%r39960]; ld.u64 %r398,[%frame+968]; bra $L17; $L1: .loc 1 191 1 ret; } cshift1_16_r8.o/1622802210gfortrani_cshift1_16_r8 .visible .func _gfortrani_cshift1_16_r8fortran/generated/cshift1_16_r8.c"odti3 .extern .func __modti3F: _gfortrani_cshift1_16_r8 .visible .func _gfortrani_cshift1_16_r810095predu16 %r210; .reg .u32 %r211; .reg .predpred %r228; .reg .predpred %r254; .reg .u64 %r256; .reg .u64 %r259; .reg .u32 %r261; .reg .u64predpred %r326; .reg .pred %r328; .reg .predu64pred %r369; .reg .pred %r372; .reg .pred %r375; .reg .predmov.u64 %r148,%ar0; mov.u64 %r149,%ar1; mov.u64 %r150,%ar2; mov.u64 %r151,%ar3; .loc 1 68 6 setp.eq.u64 %r152,%r151,0; @ %r152 bra $L37; .loc 1 69 21 ld.u32 %r153,[%r151]; add.u32 %r95,%r153,-1; bra $L2; $L37: .loc 1 71 11 cvt.u32.u64 %r95,%r151; $L2: .loc 1 73 13 mov.u64 %r154,1; st.u64 [%frame+360],%r154; .loc 1 74 12 mov.u64 %r155,0; st.u64 [%frame+480],%r155; .loc 1 82 23 ld.s8 %r139,[%r149+28]; cvt.u32.u32 %r156,%r139; cvt.s64.s8 %r134,%r156; .loc 1 82 3 setp.le.s64 %r157,%r134,0; @ %r157 bra $L38; .loc 1 84 15 cvt.s64.s32 %r87,%r95; add.u64 %r81,%r149,40; add.u64 %r48,%r148,40; .loc 1 75 5 mov.u64 %r90,%r155; .loc 1 80 7 mov.u64 %r89,%r90; .loc 1 82 12 mov.u64 %r125,%r90; .loc 1 79 11 mov.u64 %r84,%r154; .loc 1 78 11 mov.u64 %r80,%r84; .loc 1 96 20 mov.u64 %r437,%r90; $L8: .loc 1 92 17 ld.u64 %r159,[%r81+16]; add.u64 %r158,%r159,1; .loc 1 92 15 ld.u64 %r160,[%r81+8]; sub.u64 %r141,%r158,%r160; .loc 1 86 19 ld.u64 %r79,[%r48]; .loc 1 89 19 ld.u64 %r83,[%r81]; .loc 1 84 10 setp.ne.u64 %r161,%r87,%r125; @ %r161 bra $L4; .loc 1 87 14 setp.ne.u64 %r162,%r79,0; .loc 1 90 14 setp.ne.u64 %r163,%r83,0; .loc 1 92 15 mov.u64 %r89,%r141; selp.u64 %r84,%r83,1,%r163; selp.u64 %r80,%r79,1,%r162; bra $L7; $L4: .loc 1 96 20 shl.b64 %r164,%r90,3; add.u64 %r165,%frame,%r164; add.u64 %r166,%r165,480; st.u64 [%r166],%r437; .loc 1 97 21 st.u64 [%r165+360],%r141; .loc 1 98 22 st.u64 [%r165+840],%r79; .loc 1 99 22 st.u64 [%r165+720],%r83; .loc 1 100 24 add.u64 %r178,%r90,%r90; add.u64 %r179,%r178,%r90; shl.b64 %r180,%r179,3; add.u64 %r181,%r150,%r180; ld.u64 %r25,[%r181+40]; .loc 1 100 22 st.u64 [%r165+600],%r25; .loc 1 101 26 mul.lo.u64 %r189,%r79,%r141; .loc 1 101 13 st.u64 [%r165+240],%r189; .loc 1 102 26 mul.lo.u64 %r193,%r83,%r141; .loc 1 102 13 st.u64 [%r165+120],%r193; .loc 1 103 26 mul.lo.u64 %r196,%r25,%r141; .loc 1 103 13 st.u64 [%r165],%r196; .loc 1 104 12 add.u64 %r90,%r90,1; $L7: .loc 1 82 55 add.u64 %r125,%r125,1; .loc 1 82 3 add.u64 %r81,%r81,24; add.u64 %r48,%r48,24; setp.ne.u64 %r197,%r125,%r134; @ %r197 bra $L8; bra $L3; $L38: .loc 1 80 7 mov.u64 %r89,%r155; .loc 1 79 11 mov.u64 %r84,%r154; .loc 1 78 11 mov.u64 %r80,%r84; $L3: .loc 1 107 14 ld.u64 %r29,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r198,%r29,0; .loc 1 108 16 selp.u64 %r29,%r29,1,%r198; .loc 1 109 14 ld.u64 %r30,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r199,%r30,0; .loc 1 110 16 selp.u64 %r30,%r30,1,%r199; .loc 1 111 14 ld.u64 %r31,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r200,%r31,0; .loc 1 
112 16 selp.u64 %r31,%r31,1,%r200; .loc 1 118 8 ld.u64 %r122,[%r148]; .loc 1 119 8 ld.u64 %r123,[%r149]; .loc 1 120 8 ld.u64 %r124,[%r150]; .loc 1 122 9 setp.eq.u64 %r201,%r122,0; @ %r201 bra $L1; shl.b64 %r133,%r84,3; .loc 1 150 13 shl.b64 %r46,%r80,3; .loc 1 162 12 shl.b64 %r50,%r30,3; .loc 1 163 12 shl.b64 %r52,%r29,3; .loc 1 164 12 shl.b64 %r54,%r31,4; .loc 1 138 35 set.u32.eq.u64 %r203,%r80,1; neg.s32 %r204,%r203; .loc 1 138 19 set.u32.eq.u64 %r206,%r84,1; neg.s32 %r207,%r206; .loc 1 138 24 cvt.u16.u32 %r209,%r204; cvt.u16.u32 %r210,%r207; and.b16 %r208,%r209,%r210; cvt.u32.u16 %r211,%r208; cvt.u32.u8 %r86,%r211; .loc 1 167 32 ld.u64 %r82,[%frame+360]; .loc 1 129 12 shr.s64 %r396,%r89,63; setp.ne.u32 %r418,%r86,0; setp.le.s64 %r415,%r134,2; .loc 1 132 8 add.u64 %r425,%frame,960; add.u64 %r426,%frame,976; add.u64 %r427,%frame,992; cvt.u32.u32 %r428,%r139; cvt.s64.s8 %r429,%r428; add.u64 %r430,%r429,-1; add.u64 %r431,%frame,840; add.u64 %r432,%frame,720; add.u64 %r433,%frame,600; add.u64 %r434,%frame,360; .loc 1 172 24 add.u64 %r435,%frame,240; .loc 1 173 24 add.u64 %r436,%frame,120; $L36: .loc 1 125 10 ld.u64 %r397,[%r124]; ld.u64 %r398,[%r124+8]; .loc 1 128 10 setp.ge.s64 %r214,%r398,0; @ %r214 bra $L13; .loc 1 129 12 add.u64 %r216,%r397,%r89; set.u32.lt.u64 %r221,%r216,%r397; cvt.s64.s32 %r219,%r221; add.u64 %r222,%r398,%r396; sub.u64 %r225,%r222,%r219; mov.u64 %r397,%r216; mov.u64 %r398,%r225; .loc 1 130 10 setp.gt.s64 %r228,%r396,%r225; @ %r228 bra $L39; setp.ne.u64 %r231,%r396,%r225; @ %r231 bra $L15; setp.gt.u64 %r234,%r89,%r216; @ ! %r234 bra $L15; $L39: setp.ge.s64 %r236,%r225,0; @ %r236 bra $L17; $L15: .loc 1 132 8 st.u64 [%frame+976],%r216; st.u64 [%frame+984],%r225; st.u64 [%frame+992],%r89; st.u64 [%frame+1000],%r39960]; ld.u64 %r398,[%frame+968]; .loc 1 133 7 setp.ge.s64 %r254,%r398,0; @ %r254 bra $L17; .loc 1 134 16 add.u64 %r256,%r397,%r89; set.u32.lt.u64 %r261,%r256,%r397; cvt.s64.s32 %r259,%r261; add.u64 %r262,%r398,%r396; mov.u64 %r397,%r256; sub.u64 %r398,%r262,%r259; $L17: .loc 1 147 24 sub.u64 %r420,%r89,%r397; set.u32.gt.u64 %r272,%r420,%r89; cvt.s64.s32 %r270,%r272; sub.u64 %r273,%r396,%r398; add.u64 %r276,%r273,%r270; .loc 1 138 10 @ %r418 bra $L20; .loc 1 147 4 setp.gt.s64 %r279,%r276,0; @ %r279 bra $L21; setp.ne.u64 %r281,%r276,0; @ %r281 bra $L40; setp.ne.u64 %r283,%r420,0; @ %r283 bra $L21; $L40: mov.u64 %r147,%r122; bra $L23; $L20: .loc 1 140 11 shl.b64 %r117,%r397,3; .loc 1 141 11 shl.b64 %r284,%r420,3; .loc 1 142 23 add.u64 %r286,%r123,%r117; .loc 1 142 4284293,[%value_in]; } .loc 1 143 17 add.u64 %r297,%r122,%r284; .loc 1 143 4117304,[%value_in]; } bra $L24; $L21: .loc 1 136 11 mad.lo.u64 %r115,%r397,%r133,%r123; mov.u64 %r114,%r122; .loc 1 147 11 mov.u64 %r116,0; $L25: .loc 1 149 16 ld.f64 %r44,[%r115]; .loc 1 149 14 st.f64 [%r114],%r44; .loc 1 150 13 add.u64 %r114,%r114,%r46; .loc 1 151 12 add.u64 %r115,%r115,%r133; .loc 1 147 31 add.u64 %r116,%r116,1; .loc 1 147 4 setp.ne.u64 %r307,%r420,%r116; @ %r307 bra $L25; .loc 1 150 13 mad.lo.u64 %r147,%r420,%r46,%r122; $L23: .loc 1 153 4 setp.gt.s64 %r324,%r398,0; @ %r324 bra $L26; setp.ne.u64 %r326,%r398,0; @ %r326 bra $L24; setp.ne.u64 %r328,%r397,0; @ ! 
%r328 bra $L24; $L26: mov.u64 %r112,%r123; .loc 1 153 23 mov.u64 %r113,0; $L28: .loc 1 155 16 ld.f64 %r47,[%r112]; .loc 1 155 14 st.f64 [%r147],%r47; .loc 1 156 13 add.u64 %r147,%r147,%r46; .loc 1 157 12 add.u64 %r112,%r112,%r133; .loc 1 153 37 add.u64 %r113,%r113,1; .loc 1 153 4 setp.ne.u64 %r329,%r397,%r113; @ %r329 bra $L28; $L24: .loc 1 162 12 add.u64 %r122,%r122,%r50; .loc 1 163 12 add.u64 %r123,%r123,%r52; .loc 1 164 12 add.u64 %r124,%r124,%r54; .loc 1 165 15 ld.u64 %r330,[%frame+480]; add.u64 %r56,%r330,1; .loc 1 167 13 setp.eq.u64 %r331,%r56,%r82; @ %r331 bra $L30; .loc 1 165 15 st.u64 [%frame+480],%r56; bra $L36; $L30: .loc 1 171 20 mov.u64 %r332,0; st.u64 [%frame+480],%r332; .loc 1 172 24 ld.u64 %r58,[%frame+240]; .loc 1 173 16 ld.u64 %r334,[%frame+120]; shl.b64 %r333,%r334,3; sub.u64 %r119,%r123,%r333; .loc 1 174 9 ld.u64 %r336,[%frame]; shl.b64 %r335,%r336,4; sub.u64 %r120,%r124,%r335; .loc 1 176 14 @ %r415 bra $L1; add.u64 %r136,%frame,488; mov.u64 %r128,8; .loc 1 175 12 mov.u64 %r121,1; bra $L33; $L34: .loc 1 171 20 st.u64 [%r136],%r332; .loc 1 172 24 add.u64 %r343,%r435,%r128; ld.u64 %r58,[%r343]; .loc 1 173 24 add.u64 %r345,%r436,%r128; .loc 1 173 16 ld.u64 %r347,[%r345]; shl.b64 %r346,%r347,3; sub.u64 %r119,%r123,%r346; .loc 1 174 17 add.u64 %r348,%frame,%r128; .loc 1 174 9 ld.u64 %r350,[%r348]; shl.b64 %r349,%r350,4; sub.u64 %r120,%r124,%r349; .loc 1 175 12 add.u64 %r121,%r121,1; .loc 1 176 14 add.u64 %r136,%r136,8; add.u64 %r128,%r128,8; setp.eq.u64 %r351,%r430,%r121; @ %r351 bra $L1; $L33: .loc 1 184 23 ld.u64 %r352,[%r136]; add.u64 %r68,%r352,1; st.u64 [%r136],%r68; .loc 1 185 30 add.u64 %r354,%r431,%r128; .loc 1 185 20 ld.u64 %r356,[%r354]; sub.u64 %r355,%r356,%r58; shl.b64 %r357,%r355,3; add.u64 %r122,%r122,%r357; .loc 1 186 30 add.u64 %r359,%r432,%r128; .loc 1 186 20 ld.u64 %r361,[%r359]; shl.b64 %r360,%r361,3; add.u64 %r123,%r119,%r360; .loc 1 187 23 add.u64 %r363,%r433,%r128; .loc 1 187 13 ld.u64 %r365,[%r363]; shl.b64 %r364,%r365,4; add.u64 %r124,%r120,%r364; .loc 1 167 32 add.u64 %r367,%r434,%r128; .loc 1 167 13 ld.u64 %r368,[%r367]; setp.eq.u64 %r369,%r68,%r368; @ %r369 bra $L34; bra $L36; $L13: .loc 1 130 10 setp.gt.s64 %r372,%r396,%r398; @ %r372 bra $L17; setp.ne.u64 %r375,%r396,%r398; @ %r375 bra $L41; setp.gt.u64 %r378,%r89,%r397; @ %r378 bra $L17; $L41: .loc 1 132 8 st.u64 [%frame+976],%r397; st.u64 [%frame+984],%r398; st.u64 [%frame+992],%r89; st.u64 [%frame+1000],%r39960]; ld.u64 %r398,[%frame+968]; bra $L17; $L1: .loc 1 191 1 ret; } /505 cshift1_16_c4.o/1622802211gfortrani_cshift1_16_c4 .visible .func _gfortrani_cshift1_16_cfortran/generated/cshift1_16_c4.c"odti3 .extern .func __modti3F: _gfortrani_cshift1_16_c4 .visible .func _gfortrani_cshift1_16_c10091; .reg .u64 %r92; .reg .u32 %r97u64predpredpredpred %r230; .reg .pred %r233; .reg .pred %r236; .reg .pred %r238; .reg .pred %r256; .reg .u64u64 %r264; .reg .u64 %r272; .reg .u32 %r274; .reg .u64 %r275; .reg .u64 %r278; .reg .predu64 %r299; .reg .u64 %r306; .reg .pred %r309; .reg .pred %r326; .reg .pred %r328; .reg .pred %r330; .reg .pred64u64pred %r374; .reg .pred %r377; .reg .pred %r380; .reg .u64pred.reg .u64 %r438; .reg .u64 %r439; mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 68 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L37; .loc 1 69 21 ld.u32 %r155,[%r153]; add.u32 %r97,%r155,-1; bra $L2; $L37: .loc 1 71 11 cvt.u32.u64 %r97,%r153; $L2: .loc 1 73 13 mov.u64 %r156,1; st.u64 [%frame+360],%r156; .loc 1 74 12 mov.u64 %r157,0; st.u64 [%frame+480],%r157; .loc 1 82 
23 ld.s8 %r141,[%r151+28]; cvt.u32.u32 %r158,%r141; cvt.s64.s8 %r136,%r158; .loc 1 82 3 setp.le.s64 %r159,%r136,0; @ %r159 bra $L38; .loc 1 84 15 cvt.s64.s32 %r89,%r97; add.u64 %r83,%r151,40; add.u64 %r50,%r150,40; .loc 1 75 5 mov.u64 %r92,%r157; .loc 1 80 7 mov.u64 %r91,%r92; .loc 1 82 12 mov.u64 %r127,%r92; .loc 1 79 11 mov.u64 %r86,%r156; .loc 1 78 11 mov.u64 %r82,%r86; .loc 1 96 20 mov.u64 %r439,%r92; $L8: .loc 1 92 17 ld.u64 %r161,[%r83+16]; add.u64 %r160,%r161,1; .loc 1 92 15 ld.u64 %r162,[%r83+8]; sub.u64 %r143,%r160,%r162; .loc 1 86 19 ld.u64 %r81,[%r50]; .loc 1 89 19 ld.u64 %r85,[%r83]; .loc 1 84 10 setp.ne.u64 %r163,%r89,%r127; @ %r163 bra $L4; .loc 1 87 14 setp.ne.u64 %r164,%r81,0; .loc 1 90 14 setp.ne.u64 %r165,%r85,0; .loc 1 92 15 mov.u64 %r91,%r143; selp.u64 %r86,%r85,1,%r165; selp.u64 %r82,%r81,1,%r164; bra $L7; $L4: .loc 1 96 20 shl.b64 %r166,%r92,3; add.u64 %r167,%frame,%r166; add.u64 %r168,%r167,480; st.u64 [%r168],%r439; .loc 1 97 21 st.u64 [%r167+360],%r143; .loc 1 98 22 st.u64 [%r167+840],%r81; .loc 1 99 22 st.u64 [%r167+720],%r85; .loc 1 100 24 add.u64 %r180,%r92,%r92; add.u64 %r181,%r180,%r92; shl.b64 %r182,%r181,3; add.u64 %r183,%r152,%r182; ld.u64 %r25,[%r183+40]; .loc 1 100 22 st.u64 [%r167+600],%r25; .loc 1 101 26 mul.lo.u64 %r191,%r81,%r143; .loc 1 101 13 st.u64 [%r167+240],%r191; .loc 1 102 26 mul.lo.u64 %r195,%r85,%r143; .loc 1 102 13 st.u64 [%r167+120],%r195; .loc 1 103 26 mul.lo.u64 %r198,%r25,%r143; .loc 1 103 13 st.u64 [%r167],%r198; .loc 1 104 12 add.u64 %r92,%r92,1; $L7: .loc 1 82 55 add.u64 %r127,%r127,1; .loc 1 82 3 add.u64 %r83,%r83,24; add.u64 %r50,%r50,24; setp.ne.u64 %r199,%r127,%r136; @ %r199 bra $L8; bra $L3; $L38: .loc 1 80 7 mov.u64 %r91,%r157; .loc 1 79 11 mov.u64 %r86,%r156; .loc 1 78 11 mov.u64 %r82,%r86; $L3: .loc 1 107 14 ld.u64 %r29,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r200,%r29,0; .loc 1 108 16 selp.u64 %r29,%r29,1,%r200; .loc 1 109 14 ld.u64 %r30,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r201,%r30,0; .loc 1 110 16 selp.u64 %r30,%r30,1,%r201; .loc 1 111 14 ld.u64 %r31,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r202,%r31,0; .loc 1 112 16 selp.u64 %r31,%r31,1,%r202; .loc 1 118 8 ld.u64 %r124,[%r150]; .loc 1 119 8 ld.u64 %r125,[%r151]; .loc 1 120 8 ld.u64 %r126,[%r152]; .loc 1 122 9 setp.eq.u64 %r203,%r124,0; @ %r203 bra $L1; shl.b64 %r135,%r86,3; .loc 1 150 13 shl.b64 %r47,%r82,3; .loc 1 162 12 shl.b64 %r52,%r30,3; .loc 1 163 12 shl.b64 %r54,%r29,3; .loc 1 164 12 shl.b64 %r56,%r31,4; .loc 1 138 35 set.u32.eq.u64 %r205,%r82,1; neg.s32 %r206,%r205; .loc 1 138 19 set.u32.eq.u64 %r208,%r86,1; neg.s32 %r209,%r208; .loc 1 138 24 cvt.u16.u32 %r211,%r206; cvt.u16.u32 %r212,%r209; and.b16 %r210,%r211,%r212; cvt.u32.u16 %r213,%r210; cvt.u32.u8 %r88,%r213; .loc 1 167 32 ld.u64 %r84,[%frame+360]; .loc 1 129 12 shr.s64 %r398,%r91,63; setp.ne.u32 %r420,%r88,0; setp.le.s64 %r417,%r136,2; .loc 1 132 8 add.u64 %r427,%frame,960; add.u64 %r428,%frame,976; add.u64 %r429,%frame,992; cvt.u32.u32 %r430,%r141; cvt.s64.s8 %r431,%r430; add.u64 %r432,%r431,-1; add.u64 %r433,%frame,840; add.u64 %r434,%frame,720; add.u64 %r435,%frame,600; add.u64 %r436,%frame,360; .loc 1 172 24 add.u64 %r437,%frame,240; .loc 1 173 24 add.u64 %r438,%frame,120; $L36: .loc 1 125 10 ld.u64 %r399,[%r126]; ld.u64 %r400,[%r126+8]; .loc 1 128 10 setp.ge.s64 %r216,%r400,0; @ %r216 bra $L13; .loc 1 129 12 add.u64 %r218,%r399,%r91; set.u32.lt.u64 %r223,%r218,%r399; cvt.s64.s32 %r221,%r223; add.u64 %r224,%r400,%r398; sub.u64 %r227,%r224,%r221; mov.u64 %r399,%r218; mov.u64 %r400,%r227; .loc 1 130 10 
setp.gt.s64 %r230,%r398,%r227; @ %r230 bra $L39; setp.ne.u64 %r233,%r398,%r227; @ %r233 bra $L15; setp.gt.u64 %r236,%r91,%r218; @ ! %r236 bra $L15; $L39: setp.ge.s64 %r238,%r227,0; @ %r238 bra $L17; $L15: .loc 1 132 8 st.u64 [%frame+976],%r218; st.u64 [%frame+984],%r227; st.u64 [%frame+992],%r91; st.u64 [%frame+1000],%r39800,[%frame+968]; .loc 1 133 7 setp.ge.s64 %r256,%r400,0; @ %r256 bra $L17; .loc 1 134 16 add.u64 %r258,%r399,%r91; set.u32.lt.u64 %r263,%r258,%r399; cvt.s64.s32 %r261,%r263; add.u64 %r264,%r400,%r398; mov.u64 %r399,%r258; sub.u64 %r400,%r264,%r261; $L17: .loc 1 147 24 sub.u64 %r422,%r91,%r399; set.u32.gt.u64 %r274,%r422,%r91; cvt.s64.s32 %r272,%r274; sub.u64 %r275,%r398,%r400; add.u64 %r278,%r275,%r272; .loc 1 138 10 @ %r420 bra $L20; .loc 1 147 4 setp.gt.s64 %r281,%r278,0; @ %r281 bra $L21; setp.ne.u64 %r283,%r278,0; @ %r283 bra $L40; setp.ne.u64 %r285,%r422,0; @ %r285 bra $L21; $L40: mov.u64 %r149,%r124; bra $L23; $L20: .loc 1 140 11 shl.b64 %r119,%r399,3; .loc 1 141 11 shl.b64 %r286,%r422,3; .loc 1 142 23 add.u64 %r288,%r125,%r119; .loc 1 142 41286295,[%value_in]; } .loc 1 143 17 add.u64 %r299,%r124,%r286; .loc 1 143 4r119306,[%value_in]; } bra $L24; $L21: .loc 1 136 11 mad.lo.u64 %r117,%r399,%r135,%r125; mov.u64 %r116,%r124; .loc 1 147 11 mov.u64 %r118,0; $L25: .loc 1 149 16 ld.f32 %r44,[%r117]; ld.f32 %r45,[%r117+4]; .loc 1 149 14 st.f32 [%r116],%r44; st.f32 [%r116+4],%r45; .loc 1 150 13 add.u64 %r116,%r116,%r47; .loc 1 151 12 add.u64 %r117,%r117,%r135; .loc 1 147 31 add.u64 %r118,%r118,1; .loc 1 147 4 setp.ne.u64 %r309,%r422,%r118; @ %r309 bra $L25; .loc 1 150 13 mad.lo.u64 %r149,%r422,%r47,%r124; $L23: .loc 1 153 4 setp.gt.s64 %r326,%r400,0; @ %r326 bra $L26; setp.ne.u64 %r328,%r400,0; @ %r328 bra $L24; setp.ne.u64 %r330,%r399,0; @ ! 
%r330 bra $L24; $L26: mov.u64 %r114,%r125; .loc 1 153 23 mov.u64 %r115,0; $L28: .loc 1 155 16 ld.f32 %r48,[%r114]; ld.f32 %r49,[%r114+4]; .loc 1 155 14 st.f32 [%r149],%r48; st.f32 [%r149+4],%r49; .loc 1 156 13 add.u64 %r149,%r149,%r47; .loc 1 157 12 add.u64 %r114,%r114,%r135; .loc 1 153 37 add.u64 %r115,%r115,1; .loc 1 153 4 setp.ne.u64 %r331,%r399,%r115; @ %r331 bra $L28; $L24: .loc 1 162 12 add.u64 %r124,%r124,%r52; .loc 1 163 12 add.u64 %r125,%r125,%r54; .loc 1 164 12 add.u64 %r126,%r126,%r56; .loc 1 165 15 ld.u64 %r332,[%frame+480]; add.u64 %r58,%r332,1; .loc 1 167 13 setp.eq.u64 %r333,%r58,%r84; @ %r333 bra $L30; .loc 1 165 15 st.u64 [%frame+480],%r58; bra $L36; $L30: .loc 1 171 20 mov.u64 %r334,0; st.u64 [%frame+480],%r334; .loc 1 172 24 ld.u64 %r60,[%frame+240]; .loc 1 173 16 ld.u64 %r336,[%frame+120]; shl.b64 %r335,%r336,3; sub.u64 %r121,%r125,%r335; .loc 1 174 9 ld.u64 %r338,[%frame]; shl.b64 %r337,%r338,4; sub.u64 %r122,%r126,%r337; .loc 1 176 14 @ %r417 bra $L1; add.u64 %r138,%frame,488; mov.u64 %r130,8; .loc 1 175 12 mov.u64 %r123,1; bra $L33; $L34: .loc 1 171 20 st.u64 [%r138],%r334; .loc 1 172 24 add.u64 %r345,%r437,%r130; ld.u64 %r60,[%r345]; .loc 1 173 24 add.u64 %r347,%r438,%r130; .loc 1 173 16 ld.u64 %r349,[%r347]; shl.b64 %r348,%r349,3; sub.u64 %r121,%r125,%r348; .loc 1 174 17 add.u64 %r350,%frame,%r130; .loc 1 174 9 ld.u64 %r352,[%r350]; shl.b64 %r351,%r352,4; sub.u64 %r122,%r126,%r351; .loc 1 175 12 add.u64 %r123,%r123,1; .loc 1 176 14 add.u64 %r138,%r138,8; add.u64 %r130,%r130,8; setp.eq.u64 %r353,%r432,%r123; @ %r353 bra $L1; $L33: .loc 1 184 23 ld.u64 %r354,[%r138]; add.u64 %r70,%r354,1; st.u64 [%r138],%r70; .loc 1 185 30 add.u64 %r356,%r433,%r130; .loc 1 185 20 ld.u64 %r358,[%r356]; sub.u64 %r357,%r358,%r60; shl.b64 %r359,%r357,3; add.u64 %r124,%r124,%r359; .loc 1 186 30 add.u64 %r361,%r434,%r130; .loc 1 186 20 ld.u64 %r363,[%r361]; shl.b64 %r362,%r363,3; add.u64 %r125,%r121,%r362; .loc 1 187 23 add.u64 %r365,%r435,%r130; .loc 1 187 13 ld.u64 %r367,[%r365]; shl.b64 %r366,%r367,4; add.u64 %r126,%r122,%r366; .loc 1 167 32 add.u64 %r369,%r436,%r130; .loc 1 167 13 ld.u64 %r370,[%r369]; setp.eq.u64 %r371,%r70,%r370; @ %r371 bra $L34; bra $L36; $L13: .loc 1 130 10 setp.gt.s64 %r374,%r398,%r400; @ %r374 bra $L17; setp.ne.u64 %r377,%r398,%r400; @ %r377 bra $L41; setp.gt.u64 %r380,%r91,%r399; @ %r380 bra $L17; $L41: .loc 1 132 8 st.u64 [%frame+976],%r399; st.u64 [%frame+984],%r400; st.u64 [%frame+992],%r91; st.u64 [%frame+1000],%r39800,[%frame+968]; bra $L17; $L1: .loc 1 191 1 ret; } cshift1_16_c8.o/1622802211gfortrani_cshift1_16_c8 .visible .func _gfortrani_cshift1_16_c8fortran/generated/cshift1_16_c8.c"odti3 .extern .func __modti3F: _gfortrani_cshift1_16_c8 .visible .func _gfortrani_cshift1_16_c810091; .reg .u64 %r92; .reg .u32 %r97u64predpredpredpred %r230; .reg .pred %r233; .reg .pred %r236; .reg .pred %r238; .reg .pred %r256; .reg .u64u64 %r264; .reg .u64 %r272; .reg .u32 %r274; .reg .u64 %r275; .reg .u64 %r278; .reg .predu64 %r299; .reg .u64 %r306; .reg .pred %r309; .reg .pred %r326; .reg .pred %r328; .reg .pred %r330; .reg .pred64u64pred %r374; .reg .pred %r377; .reg .pred %r380; .reg .u64pred.reg .u64 %r438; .reg .u64 %r439; mov.u64 %r150,%ar0; mov.u64 %r151,%ar1; mov.u64 %r152,%ar2; mov.u64 %r153,%ar3; .loc 1 68 6 setp.eq.u64 %r154,%r153,0; @ %r154 bra $L37; .loc 1 69 21 ld.u32 %r155,[%r153]; add.u32 %r97,%r155,-1; bra $L2; $L37: .loc 1 71 11 cvt.u32.u64 %r97,%r153; $L2: .loc 1 73 13 mov.u64 %r156,1; st.u64 [%frame+360],%r156; .loc 1 74 12 mov.u64 %r157,0; st.u64 
[%frame+480],%r157; .loc 1 82 23 ld.s8 %r141,[%r151+28]; cvt.u32.u32 %r158,%r141; cvt.s64.s8 %r136,%r158; .loc 1 82 3 setp.le.s64 %r159,%r136,0; @ %r159 bra $L38; .loc 1 84 15 cvt.s64.s32 %r89,%r97; add.u64 %r83,%r151,40; add.u64 %r50,%r150,40; .loc 1 75 5 mov.u64 %r92,%r157; .loc 1 80 7 mov.u64 %r91,%r92; .loc 1 82 12 mov.u64 %r127,%r92; .loc 1 79 11 mov.u64 %r86,%r156; .loc 1 78 11 mov.u64 %r82,%r86; .loc 1 96 20 mov.u64 %r439,%r92; $L8: .loc 1 92 17 ld.u64 %r161,[%r83+16]; add.u64 %r160,%r161,1; .loc 1 92 15 ld.u64 %r162,[%r83+8]; sub.u64 %r143,%r160,%r162; .loc 1 86 19 ld.u64 %r81,[%r50]; .loc 1 89 19 ld.u64 %r85,[%r83]; .loc 1 84 10 setp.ne.u64 %r163,%r89,%r127; @ %r163 bra $L4; .loc 1 87 14 setp.ne.u64 %r164,%r81,0; .loc 1 90 14 setp.ne.u64 %r165,%r85,0; .loc 1 92 15 mov.u64 %r91,%r143; selp.u64 %r86,%r85,1,%r165; selp.u64 %r82,%r81,1,%r164; bra $L7; $L4: .loc 1 96 20 shl.b64 %r166,%r92,3; add.u64 %r167,%frame,%r166; add.u64 %r168,%r167,480; st.u64 [%r168],%r439; .loc 1 97 21 st.u64 [%r167+360],%r143; .loc 1 98 22 st.u64 [%r167+840],%r81; .loc 1 99 22 st.u64 [%r167+720],%r85; .loc 1 100 24 add.u64 %r180,%r92,%r92; add.u64 %r181,%r180,%r92; shl.b64 %r182,%r181,3; add.u64 %r183,%r152,%r182; ld.u64 %r25,[%r183+40]; .loc 1 100 22 st.u64 [%r167+600],%r25; .loc 1 101 26 mul.lo.u64 %r191,%r81,%r143; .loc 1 101 13 st.u64 [%r167+240],%r191; .loc 1 102 26 mul.lo.u64 %r195,%r85,%r143; .loc 1 102 13 st.u64 [%r167+120],%r195; .loc 1 103 26 mul.lo.u64 %r198,%r25,%r143; .loc 1 103 13 st.u64 [%r167],%r198; .loc 1 104 12 add.u64 %r92,%r92,1; $L7: .loc 1 82 55 add.u64 %r127,%r127,1; .loc 1 82 3 add.u64 %r83,%r83,24; add.u64 %r50,%r50,24; setp.ne.u64 %r199,%r127,%r136; @ %r199 bra $L8; bra $L3; $L38: .loc 1 80 7 mov.u64 %r91,%r157; .loc 1 79 11 mov.u64 %r86,%r156; .loc 1 78 11 mov.u64 %r82,%r86; $L3: .loc 1 107 14 ld.u64 %r29,[%frame+720]; .loc 1 107 6 setp.ne.u64 %r200,%r29,0; .loc 1 108 16 selp.u64 %r29,%r29,1,%r200; .loc 1 109 14 ld.u64 %r30,[%frame+840]; .loc 1 109 6 setp.ne.u64 %r201,%r30,0; .loc 1 110 16 selp.u64 %r30,%r30,1,%r201; .loc 1 111 14 ld.u64 %r31,[%frame+600]; .loc 1 111 6 setp.ne.u64 %r202,%r31,0; .loc 1 112 16 selp.u64 %r31,%r31,1,%r202; .loc 1 118 8 ld.u64 %r124,[%r150]; .loc 1 119 8 ld.u64 %r125,[%r151]; .loc 1 120 8 ld.u64 %r126,[%r152]; .loc 1 122 9 setp.eq.u64 %r203,%r124,0; @ %r203 bra $L1; shl.b64 %r135,%r86,4; .loc 1 150 13 shl.b64 %r47,%r82,4; .loc 1 162 12 shl.b64 %r52,%r30,4; .loc 1 163 12 shl.b64 %r54,%r29,4; .loc 1 164 12 shl.b64 %r56,%r31,4; .loc 1 138 35 set.u32.eq.u64 %r205,%r82,1; neg.s32 %r206,%r205; .loc 1 138 19 set.u32.eq.u64 %r208,%r86,1; neg.s32 %r209,%r208; .loc 1 138 24 cvt.u16.u32 %r211,%r206; cvt.u16.u32 %r212,%r209; and.b16 %r210,%r211,%r212; cvt.u32.u16 %r213,%r210; cvt.u32.u8 %r88,%r213; .loc 1 167 32 ld.u64 %r84,[%frame+360]; .loc 1 129 12 shr.s64 %r398,%r91,63; setp.ne.u32 %r420,%r88,0; setp.le.s64 %r417,%r136,2; .loc 1 132 8 add.u64 %r427,%frame,960; add.u64 %r428,%frame,976; add.u64 %r429,%frame,992; cvt.u32.u32 %r430,%r141; cvt.s64.s8 %r431,%r430; add.u64 %r432,%r431,-1; add.u64 %r433,%frame,840; add.u64 %r434,%frame,720; add.u64 %r435,%frame,600; add.u64 %r436,%frame,360; .loc 1 172 24 add.u64 %r437,%frame,240; .loc 1 173 24 add.u64 %r438,%frame,120; $L36: .loc 1 125 10 ld.u64 %r399,[%r126]; ld.u64 %r400,[%r126+8]; .loc 1 128 10 setp.ge.s64 %r216,%r400,0; @ %r216 bra $L13; .loc 1 129 12 add.u64 %r218,%r399,%r91; set.u32.lt.u64 %r223,%r218,%r399; cvt.s64.s32 %r221,%r223; add.u64 %r224,%r400,%r398; sub.u64 %r227,%r224,%r221; mov.u64 %r399,%r218; 
mov.u64 %r400,%r227; .loc 1 130 10 setp.gt.s64 %r230,%r398,%r227; @ %r230 bra $L39; setp.ne.u64 %r233,%r398,%r227; @ %r233 bra $L15; setp.gt.u64 %r236,%r91,%r218; @ ! %r236 bra $L15; $L39: setp.ge.s64 %r238,%r227,0; @ %r238 bra $L17; $L15: .loc 1 132 8 st.u64 [%frame+976],%r218; st.u64 [%frame+984],%r227; st.u64 [%frame+992],%r91; st.u64 [%frame+1000],%r39800,[%frame+968]; .loc 1 133 7 setp.ge.s64 %r256,%r400,0; @ %r256 bra $L17; .loc 1 134 16 add.u64 %r258,%r399,%r91; set.u32.lt.u64 %r263,%r258,%r399; cvt.s64.s32 %r261,%r263; add.u64 %r264,%r400,%r398; mov.u64 %r399,%r258; sub.u64 %r400,%r264,%r261; $L17: .loc 1 147 24 sub.u64 %r422,%r91,%r399; set.u32.gt.u64 %r274,%r422,%r91; cvt.s64.s32 %r272,%r274; sub.u64 %r275,%r398,%r400; add.u64 %r278,%r275,%r272; .loc 1 138 10 @ %r420 bra $L20; .loc 1 147 4 setp.gt.s64 %r281,%r278,0; @ %r281 bra $L21; setp.ne.u64 %r283,%r278,0; @ %r283 bra $L40; setp.ne.u64 %r285,%r422,0; @ %r285 bra $L21; $L40: mov.u64 %r149,%r124; bra $L23; $L20: .loc 1 140 11 shl.b64 %r119,%r399,4; .loc 1 141 11 shl.b64 %r286,%r422,4; .loc 1 142 23 add.u64 %r288,%r125,%r119; .loc 1 142 41286295,[%value_in]; } .loc 1 143 17 add.u64 %r299,%r124,%r286; .loc 1 143 4r119306,[%value_in]; } bra $L24; $L21: .loc 1 136 11 mad.lo.u64 %r117,%r399,%r135,%r125; mov.u64 %r116,%r124; .loc 1 147 11 mov.u64 %r118,0; $L25: .loc 1 149 16 ld.f64 %r44,[%r117]; ld.f64 %r45,[%r117+8]; .loc 1 149 14 st.f64 [%r116],%r44; st.f64 [%r116+8],%r45; .loc 1 150 13 add.u64 %r116,%r116,%r47; .loc 1 151 12 add.u64 %r117,%r117,%r135; .loc 1 147 31 add.u64 %r118,%r118,1; .loc 1 147 4 setp.ne.u64 %r309,%r422,%r118; @ %r309 bra $L25; .loc 1 150 13 mad.lo.u64 %r149,%r422,%r47,%r124; $L23: .loc 1 153 4 setp.gt.s64 %r326,%r400,0; @ %r326 bra $L26; setp.ne.u64 %r328,%r400,0; @ %r328 bra $L24; setp.ne.u64 %r330,%r399,0; @ ! 
%r330 bra $L24; $L26: mov.u64 %r114,%r125; .loc 1 153 23 mov.u64 %r115,0; $L28: .loc 1 155 16 ld.f64 %r48,[%r114]; ld.f64 %r49,[%r114+8]; .loc 1 155 14 st.f64 [%r149],%r48; st.f64 [%r149+8],%r49; .loc 1 156 13 add.u64 %r149,%r149,%r47; .loc 1 157 12 add.u64 %r114,%r114,%r135; .loc 1 153 37 add.u64 %r115,%r115,1; .loc 1 153 4 setp.ne.u64 %r331,%r399,%r115; @ %r331 bra $L28; $L24: .loc 1 162 12 add.u64 %r124,%r124,%r52; .loc 1 163 12 add.u64 %r125,%r125,%r54; .loc 1 164 12 add.u64 %r126,%r126,%r56; .loc 1 165 15 ld.u64 %r332,[%frame+480]; add.u64 %r58,%r332,1; .loc 1 167 13 setp.eq.u64 %r333,%r58,%r84; @ %r333 bra $L30; .loc 1 165 15 st.u64 [%frame+480],%r58; bra $L36; $L30: .loc 1 171 20 mov.u64 %r334,0; st.u64 [%frame+480],%r334; .loc 1 172 24 ld.u64 %r60,[%frame+240]; .loc 1 173 16 ld.u64 %r336,[%frame+120]; shl.b64 %r335,%r336,4; sub.u64 %r121,%r125,%r335; .loc 1 174 9 ld.u64 %r338,[%frame]; shl.b64 %r337,%r338,4; sub.u64 %r122,%r126,%r337; .loc 1 176 14 @ %r417 bra $L1; add.u64 %r138,%frame,488; mov.u64 %r130,8; .loc 1 175 12 mov.u64 %r123,1; bra $L33; $L34: .loc 1 171 20 st.u64 [%r138],%r334; .loc 1 172 24 add.u64 %r345,%r437,%r130; ld.u64 %r60,[%r345]; .loc 1 173 24 add.u64 %r347,%r438,%r130; .loc 1 173 16 ld.u64 %r349,[%r347]; shl.b64 %r348,%r349,4; sub.u64 %r121,%r125,%r348; .loc 1 174 17 add.u64 %r350,%frame,%r130; .loc 1 174 9 ld.u64 %r352,[%r350]; shl.b64 %r351,%r352,4; sub.u64 %r122,%r126,%r351; .loc 1 175 12 add.u64 %r123,%r123,1; .loc 1 176 14 add.u64 %r138,%r138,8; add.u64 %r130,%r130,8; setp.eq.u64 %r353,%r432,%r123; @ %r353 bra $L1; $L33: .loc 1 184 23 ld.u64 %r354,[%r138]; add.u64 %r70,%r354,1; st.u64 [%r138],%r70; .loc 1 185 30 add.u64 %r356,%r433,%r130; .loc 1 185 20 ld.u64 %r358,[%r356]; sub.u64 %r357,%r358,%r60; shl.b64 %r359,%r357,4; add.u64 %r124,%r124,%r359; .loc 1 186 30 add.u64 %r361,%r434,%r130; .loc 1 186 20 ld.u64 %r363,[%r361]; shl.b64 %r362,%r363,4; add.u64 %r125,%r121,%r362; .loc 1 187 23 add.u64 %r365,%r435,%r130; .loc 1 187 13 ld.u64 %r367,[%r365]; shl.b64 %r366,%r367,4; add.u64 %r126,%r122,%r366; .loc 1 167 32 add.u64 %r369,%r436,%r130; .loc 1 167 13 ld.u64 %r370,[%r369]; setp.eq.u64 %r371,%r70,%r370; @ %r371 bra $L34; bra $L36; $L13: .loc 1 130 10 setp.gt.s64 %r374,%r398,%r400; @ %r374 bra $L17; setp.ne.u64 %r377,%r398,%r400; @ %r377 bra $L41; setp.gt.u64 %r380,%r91,%r399; @ %r380 bra $L17; $L41: .loc 1 132 8 st.u64 [%frame+976],%r399; st.u64 [%frame+984],%r400; st.u64 [%frame+992],%r91; st.u64 [%frame+1000],%r39800,[%frame+968]; bra $L17; $L1: .loc 1 191 1 ret; } /541 maxloc0_4_s1.o/ 1622802211_gfortran_maxloc0_4_s1 .visible .func _gfortran_maxloc0_4_s1); .file 1 "../../../libgfortran/generated/maxloc0_4_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_4_s1 .visible .func _gfortran_mmaxloc0_4_s1_gfortran_smaxloc0_4_s1 .visible .func _gfortran_smaxloc0_4_s1in_ar4); // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_returnmemcmpmem_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents7] = {77,65,88,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_4_s1 .visible .func 
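The cshift1_16_* members above implement the Fortran CSHIFT intrinsic (circular shift along one dimension) for the various element kinds. The C sketch below only illustrates the 1-D circular-shift semantics these routines provide; it is not the libgfortran implementation, and the name cshift1_demo is hypothetical.

/* Illustrative sketch of a 1-D circular shift (CSHIFT semantics).
   Copies n elements of `size` bytes from src into dst, rotated left
   by `shift` positions; dst and src must not overlap. */
#include <stddef.h>
#include <string.h>

static void cshift1_demo(void *dst, const void *src,
                         size_t n, size_t size, long shift)
{
    if (n == 0)
        return;
    /* Normalize the shift into the range [0, n). */
    long s = shift % (long) n;
    if (s < 0)
        s += (long) n;
    const char *in = src;
    char *out = dst;
    /* Elements s..n-1 move to the front, elements 0..s-1 to the back. */
    memcpy(out, in + (size_t) s * size, (n - (size_t) s) * size);
    memcpy(out + (n - (size_t) s) * size, in, (size_t) s * size);
}

For example, cshift1_demo(dst, src, 3, sizeof(int), 1) applied to {1, 2, 3} yields {2, 3, 1}, matching CSHIFT([1,2,3], 1).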
_gfortran_maxloc0_4_s1stack; mov.u64 %stack,08predpredpredpredpred %r175; .reg .u64 %r178; mov.u64 %r99,%ar0; mov.u64 %r100,%ar1; mov.u32 %r101,%ar2; mov.u64 %r102,%ar3; .loc 1 64 10 ld.s8 %r22,[%r100+28]; .loc 1 64 8 cvt.u32.u32 %r103,%r22; cvt.s64.s8 %r65,%r103; .loc 1 65 6 setp.gt.s64 %r104,%r65,0; @ %r104 bra $L2; .loc 1 66 5 cvta.const.u64 %r105,$LC0stack; call _gfortran_runtime_errorr23,[%r99]; .loc 1 68 6 setp.ne.u64 %r107,%r23,0; @ %r107 bra $L3; .loc 1 70 7 st.u64 [%r99+48],%r23; add.u64 %r109,%r65,-1; st.u64 [%r99+56],%r109; mov.u64 %r110,1; st.u64 [%r99+40],%r110; .loc 1 71 28 cvt.u32.u64 %r111,%r110; st.u8 [%r99+28],%r111; .loc 1 72 24 st.u64 [%r99+8],%r23; .loc 1 73 2965call (%value_in),_gfortrani_xmallocarray117,[%value_in]; } mov.u64 %r23,%r117; .loc 1 73 27 st.u64 [%r99],%r23; bra $L4; $L3: .loc 1 77 11 cvta.global.u64 %r119,_gfortrani_compile_options; .loc 1 77 10 ld.u32 %r120,[%r119+36]; setp.eq.u32 %r121,%r120,0; @ %r121 bra $L4; .loc 1 78r124; call _gfortrani_bounds_iforeach_return.loc 1 83 8 ld.u64 %r23,[%r99]; $L4: .loc 1 82 11 ld.u64 %r66,[%r99+40]; add.u64 %r28,%r100,40; mov.u64 %r70,0; .loc 1 84 10 mov.u64 %r86,%r70; add.u64 %r173,%frame,120; add.u64 %r172,%frame,240; .loc 1 88 16 mov.u64 %r135,%r70; bra $L7; $L19: mov.u64 %r86,%r72; $L7: .loc 1 86 18 add.u64 %r125,%frame,%r70; .loc 1 86 51 ld.u64 %r127,[%r28]; mul.lo.u64 %r126,%r127,%r102; .loc 1 86 18 st.u64 [%r125],%r126; .loc 1 87 19 ld.u64 %r129,[%r28+16]; add.u64 %r128,%r129,1; ld.u64 %r130,[%r28+8]; sub.u64 %r36,%r128,%r130; .loc 1 87 17 add.u64 %r132,%r173,%r70; st.u64 [%r132],%r36; .loc 1 88 16 add.u64 %r134,%r172,%r70; st.u64 [%r134],%r135; .loc 1 89 10 setp.gt.s64 %r136,%r36,0; @ %r136 bra $L5; shl.b64 %r90,%r66,2; mov.u64 %r27,%r23; .loc 1 92 11 mov.u64 %r73,0; .loc 1 93 24 cvt.u32.u64 %r137,%r73; $L6: st.u32 [%r27],%r137; .loc 1 92 27 add.u64 %r73,%r73,1; .loc 1 92 4 add.u64 %r27,%r27,%r90; setp.ne.u64 %r138,%r65,%r73; @ %r138 bra $L6; bra $L1; $L5: .loc 1 84 26 add.u64 %r72,%r86,1; .loc 1 84 3 add.u64 %r28,%r28,24; add.u64 %r70,%r70,8; setp.ne.u64 %r139,%r65,%r72; @ %r139 bra $L19; .loc 1 98 8 ld.u64 %r53,[%r100]; shl.b64 %r55,%r66,2; mov.u64 %r57,%r23; .loc 1 101 10 mov.u64 %r71,0; .loc 1 102 23 mov.u32 %r140,1; $L8: st.u32 [%r57],%r140; mov.u64 %r87,%r71; .loc 1 101 26 add.u64 %r71,%r71,1; .loc 1 101 3 add.u64 %r57,%r57,%r55; setp.ne.u64 %r141,%r86,%r87; @ %r141 bra $L8; .loc 1 108 9 setp.ne.u64 %r142,%r53,0; @ ! %r142 bra $L1; .loc 1 123 19 ld.u64 %r40,[%frame]; .loc 1 125 34 ld.u64 %r43,[%frame+120]; cvt.u32.u32 %r144,%r22; cvt.s64.s8 %r143,%r144; shl.b64 %r145,%r143,3; add.u64 %r61,%r172,%r145; .loc 1 106 11 mov.u64 %r59,0; add.u64 %r174,%frame,248; .loc 1 114 8 setp.eq.u32 %r175,%r101,0; .loc 1 131 13 mov.u64 %r178,%r59; $L18: .loc 1 125 21 ld.u64 %r91,[%frame+240]; mov.u64 %r42,%r91; mov.u64 %r52,%r53; $L15: .loc 1 114 8 setp.eq.u64 %r147,%r59,0; @ %r147 bra $L11; .loc 1 41 12r10151,[%value_in]; } .loc 1 114 8 @ %r175 bra $L12; .loc 1 114 24 setp.lt.s32 %r154,%r151,0; @ %r154 bra $L13; $L11: mov.u64 %r74,%r174; .loc 1 125 21 mov.u64 %r75,%r23; mov.u64 %r92,%r42; bra $L14; $L12: .loc 1 114 24 setp.gt.s32 %r156,%r151,0; @ ! 
%r156 bra $L13; bra $L11; $L14: .loc 1 119 38 cvt.u32.u64 %r158,%r92; add.u32 %r157,%r158,1; .loc 1 119 27 st.u32 [%r75],%r157; .loc 1 118 7 add.u64 %r75,%r75,%r55; setp.eq.u64 %r159,%r61,%r74; @ %r159 bra $L20; .loc 1 119 34 ld.u64 %r92,[%r74]; add.u64 %r74,%r74,8; bra $L14; $L20: mov.u64 %r59,%r52; $L13: .loc 1 123 9 add.u64 %r52,%r52,%r40; .loc 1 125 14 add.u64 %r42,%r42,1; .loc 1 125 7 st.u64 [%frame+240],%r42; setp.ne.u64 %r160,%r42,%r43; @ %r160 bra $L15; .loc 1 123 9 sub.u64 %r161,%r43,%r91; mad.lo.u64 %r53,%r161,%r40,%r53; mov.u64 %r81,%r174; mov.u64 %r51,%r43; mov.u64 %r48,%r40; .loc 1 126 9 mov.u64 %r58,0; $L17: .loc 1 131 13 st.u64 [%r81+-8],%r178; .loc 1 134 23 mul.lo.u64 %r45,%r48,%r51; mov.u64 %r94,%r58; .loc 1 135 5 add.u64 %r58,%r58,1; .loc 1 136 7 setp.eq.u64 %r165,%r94,%r86; @ %r165 bra $L1; .loc 1 144 16 ld.u64 %r166,[%r81]; add.u64 %r47,%r166,1; st.u64 [%r81],%r47; shl.b64 %r78,%r58,3; .loc 1 145 23 add.u64 %r167,%frame,%r78; ld.u64 %r48,[%r167]; .loc 1 145 13 sub.u64 %r168,%r48,%r45; add.u64 %r53,%r53,%r168; .loc 1 148 32 add.u64 %r170,%r173,%r78; ld.u64 %r51,[%r170]; .loc 1 148 7 add.u64 %r81,%r81,8; setp.eq.u64 %r171,%r47,%r51; @ %r171 bra $L17; bra $L18; $L1:_gfortran_mmaxloc0_4_s1 .visible .func _gfortran_mmaxloc0_4_s1in_ar4) {stack; mov.u64 %stack,0480u64u64u32 %r177; .reg .u16 %r178; .reg .pred %r179; .reg .predpredpred %r205; .reg .predpred %r223; .reg .u32predpredpred %r246; .reg .u64 %r249; mov.u64 %r128,%ar0; mov.u64 %r129,%ar1; mov.u64 %r130,%ar2; mov.u32 %r131,%ar3; mov.u64 %r132,%ar4; .loc 1 177 6 setp.ne.u64 %r133,%r130,0; @ %r133 bra $L321r12913132; call _gfortran_maxloc0_4_s1184 7 bra $L31; $L32: .loc 1 187 10 ld.s8 %r22,[%r129+28]; .loc 1 188 6 setp.gt.s32 %r138,%r22,0; @ %r138 bra $L34; .loc 1 189 5 cvta.const.u64 %r131stack; call _gfortran_runtime_error34: .loc 1 191 6 ld.u64 %r141,[%r128]; setp.ne.u64 %r142,%r141,0; @ %r142 bra $L35; .loc 1 193 7 st.u64 [%r128+48],%r141; add.u32 %r144,%r22,-1; cvt.s64.s32 %r145,%r144; st.u64 [%r128+56],%r145; mov.u64 %r146,1; st.u64 [%r128+40],%r146; .loc 1 194 28 cvt.u32.u64 %r147,%r146; st.u8 [%r128+28],%r147; .loc 1 195 24 st.u64 [%r128+8],%r141; .loc 1 196 29 cvt.u32.u32 %r152,%r22; cvt.s64.s8 %r151,%r152call (%value_in),_gfortrani_xmallocarray153,[%value_in]; } .loc 1 196 27 st.u64 [%r128],%r153; bra $L36; $L35: .loc 1 200 11 cvta.global.u64 %r155,_gfortrani_compile_options; .loc 1 200 10 ld.u32 %r156,[%r155+36]; setp.eq.u32 %r157,%r156,0r129160; call _gfortrani_bounds_iforeach_return.loc 1 205 4 cvta.const.u64 %r16311291630; call _gfortrani_bounds_equal_extents$L36: .loc 1 210 15 ld.u64 %r31,[%r130+16]; .loc 1 212 9 ld.u64 %r82,[%r130]; .loc 1 214 22 cvt.u32.u64 %r32,%r31; .loc 1 214 53 add.u32 %r165,%r32,-4; and.b32 %r166,%r165,-5; set.u32.eq.u32 %r168,%r166,0; neg.s32 %r169,%r168; .loc 1 214 22 add.u32 %r170,%r32,-1; set.u32.le.u32 %r172,%r170,1; neg.s32 %r173,%r172; .loc 1 214 6 cvt.u16.u32 %r175,%r169; cvt.u16.u32 %r176,%r173; or.b16 %r174,%r175,%r176; cvt.u32.u16 %r177,%r174; cvt.u16.u8 %r178,%r177; setp.ne.u16 %r179,%r178,0; @ %r179 bra $L37; .loc 1 216 7 setp.ne.u32 %r181,%r32,16; @ %r181 bra $L38; $L37: .loc 1 223 11 ld.u64 %r93,[%r128+40]; .loc 1 224 8 ld.u64 %r94,[%r128]; .loc 1 225 17 cvt.u32.u32 %r182,%r22; cvt.s64.s8 %r111,%r182; add.u64 %r89,%r129,40; add.u64 %r83,%r130,40; mov.u64 %r68,0; .loc 1 225 10 mov.u64 %r84,%r68; add.u64 %r245,%frame,120; add.u64 %r242,%frame,240; add.u64 %r243,%frame,360; .loc 1 230 16 mov.u64 %r199,%r68; bra $L39; $L38: .loc 1 221 5_gfortran_runtime_error39: .loc 1 227 
18 add.u64 %r186,%r245,%r68; .loc 1 227 51 ld.u64 %r188,[%r89]; mul.lo.u64 %r187,%r188,%r132; .loc 1 227 18 st.u64 [%r186],%r187; .loc 1 228 18 add.u64 %r189,%frame,%r68; .loc 1 228 20 ld.u64 %r191,[%r83]; mul.lo.u64 %r190,%r191,%r31; .loc 1 228 18 st.u64 [%r189],%r190; .loc 1 229 19 ld.u64 %r193,[%r89+16]; add.u64 %r192,%r193,1; ld.u64 %r194,[%r89+8]; sub.u64 %r51,%r192,%r194; .loc 1 229 17 add.u64 %r196,%r242,%r68; st.u64 [%r196],%r51; .loc 1 230 16 add.u64 %r198,%r243,%r68; st.u64 [%r198],%r199; .loc 1 231 10 setp.gt.s64 %r200,%r51,0; @ %r200 bra $L40; shl.b64 %r124,%r93,2; mov.u64 %r38,%r94; .loc 1 234 11 mov.u64 %r96,0; .loc 1 235 24 cvt.u32.u64 %r201,%r96; $L41: st.u32 [%r38],%r201; .loc 1 234 27 add.u64 %r96,%r96,1; .loc 1 234 4 add.u64 %r38,%r38,%r124; setp.ne.u64 %r202,%r96,%r111; @ %r202 bra $L41; bra $L31; $L40: .loc 1 225 26 add.u64 %r84,%r84,1; .loc 1 225 3 add.u64 %r89,%r89,24; add.u64 %r83,%r83,24; add.u64 %r68,%r68,8; setp.ne.u64 %r203,%r84,%r111; @ %r203 bra $L39; .loc 1 240 8 ld.u64 %r80,[%r129]; shl.b64 %r99,%r93,2; mov.u64 %r101,%r94; .loc 1 243 10 mov.u64 %r95,0; .loc 1 244 23 cvt.u32.u64 %r204,%r95; $L43: st.u32 [%r101],%r204; .loc 1 243 26 add.u64 %r95,%r95,1; .loc 1 243 3 add.u64 %r101,%r101,%r99; setp.ne.u64 %r205,%r95,%r111; @ %r205 bra $L43; .loc 1 251 9 setp.eq.u64 %r206,%r80,0; @ %r206 bra $L31; .loc 1 267 19 ld.u64 %r59,[%frame+120]; .loc 1 268 20 ld.u64 %r61,[%frame]; .loc 1 270 34 ld.u64 %r65,[%frame+240]; cvt.u32.u32 %r208,%r22; cvt.s64.s8 %r207,%r208; shl.b64 %r209,%r207,3; add.u64 %r103,%r243,%r209; .loc 1 249 10 mov.u64 %r86,0; add.u64 %r244,%frame,368; .loc 1 257 6 setp.eq.u32 %r246,%r131,0; .loc 1 276 13 mov.u64 %r249,%r86; $L50: .loc 1 270 21 ld.u64 %r125,[%frame+360]; mov.u64 %r63,%r125; mov.u64 %r81,%r82; mov.u64 %r79,%r80; $L48: .loc 1 257 6 ld.s8 %r212,[%r81]; cvt.u16.u32 %r211,%r212; setp.eq.u16 %r213,%r211,0; @ %r213 bra $L44; .loc 1 257 14 setp.eq.u64 %r214,%r86,0; @ %r214 bra $L45; .loc 1 4113218,[%value_in]; } .loc 1 257 6 @ %r246 bra $L46; .loc 1 258 25 setp.lt.s32 %r221,%r218,0; @ %r221 bra $L44; $L45: mov.u64 %r108,%r244; .loc 1 270 21 mov.u64 %r109,%r94; mov.u64 %r126,%r63; bra $L47; $L46: .loc 1 258 25 setp.gt.s32 %r223,%r218,0; @ ! 
%r223 bra $L44; bra $L45; $L47: .loc 1 263 38 cvt.u32.u64 %r225,%r126; add.u32 %r224,%r225,1; .loc 1 263 27 st.u32 [%r109],%r224; .loc 1 262 7 add.u64 %r109,%r109,%r99; setp.eq.u64 %r226,%r103,%r108; @ %r226 bra $L51; .loc 1 263 34 ld.u64 %r126,[%r108]; add.u64 %r108,%r108,8; bra $L47; $L51: mov.u64 %r86,%r79; $L44: .loc 1 267 9 add.u64 %r79,%r79,%r59; .loc 1 268 10 add.u64 %r81,%r81,%r61; .loc 1 270 14 add.u64 %r63,%r63,1; .loc 1 270 7 st.u64 [%frame+360],%r63; setp.ne.u64 %r227,%r63,%r65; @ %r227 bra $L48; sub.u64 %r119,%r65,%r125; .loc 1 267 9 mad.lo.u64 %r80,%r59,%r119,%r80; .loc 1 268 10 mad.lo.u64 %r82,%r61,%r119,%r82; mov.u64 %r116,%r244; mov.u64 %r75,%r61; mov.u64 %r78,%r65; mov.u64 %r73,%r59; mov.u64 %r115,8; .loc 1 271 9 mov.u64 %r85,0; $L49: .loc 1 276 13 st.u64 [%r116+-8],%r249; .loc 1 279 23 mul.lo.u64 %r67,%r73,%r78; .loc 1 280 24 mul.lo.u64 %r70,%r78,%r75; .loc 1 281 5 add.u64 %r85,%r85,1; .loc 1 282 7 setp.eq.u64 %r232,%r85,%r111; @ %r232 bra $L31; .loc 1 290 16 ld.u64 %r233,[%r116]; add.u64 %r72,%r233,1; st.u64 [%r116],%r72; .loc 1 291 23 add.u64 %r235,%r245,%r115; ld.u64 %r73,[%r235]; .loc 1 291 13 sub.u64 %r236,%r73,%r67; add.u64 %r80,%r80,%r236; .loc 1 292 24 add.u64 %r237,%frame,%r115; ld.u64 %r75,[%r237]; .loc 1 292 14 sub.u64 %r238,%r75,%r70; add.u64 %r82,%r82,%r238; .loc 1 295 32 add.u64 %r240,%r242,%r115; ld.u64 %r78,[%r240]; .loc 1 295 7 add.u64 %r116,%r116,8; add.u64 %r115,%r115,8; setp.eq.u64 %r241,%r72,%r78; @ %r241 bra $L49; bra $L50; $L31:_gfortran_smaxloc0_4_s1 .visible .func _gfortran_smaxloc0_4_s1in_ar4) {u32 %r41; .reg .predu32 %r55; .reg .u64u32ar2; mov.u32 %r38,%ar3; mov.u64 %r39,%ar4; .loc 1 317 6 setp.eq.u64 %r40,%r37,0; @ %r40 bra $L70; .loc 1 317 20 ld.u32 %r41,[%r37]; setp.eq.u32 %r42,%r41,0; @ %r42 bra $L71; $L70: .loc 1 320 7 {call _gfortran_maxloc0_4_s1324 7 bra $L69; $L71: .loc 1 327 8 ld.s8 %r30,[%r36+28]; .loc 1 329 6 setp.gt.s64 %r48,%r30,0; @ %r48 bra $L73; .loc 1 330 5 cvta.const.u64 %r49,$LC0; {stack; call _gfortran_runtime_error73: .loc 1 332 15 ld.u64 %r24,[%r35]; .loc 1 332 6 setp.ne.u64 %r51,%r24,0; @ %r51 bra $L74; .loc 1 334 7 st.u64 [%r35+48],%r24; add.u64 %r53,%r30,-1; st.u64 [%r35+56],%r53; mov.u64 %r54,1; st.u64 [%r35+40],%r54; .loc 1 335 28 cvt.u32.u64 %r55,%r54; st.u8 [%r35+28],%r55; .loc 1 336 24 st.u64 [%r35+8],%r24; .loc 1 337 29 mov.u64 %r58call (%value_in),_gfortrani_xmallocarray37 27 st.u64 [%r35],%r24; bra $L75; $L74: .loc 1 339 12 cvta.global.u64 %r63,_gfortrani_compile_options; .loc 1 339 11 ld.u32 %r64,[%r63+36]; setp.eq.u32 %r65,%r64,0; @ %r65 bra $L75; .loc 1 341 8 cvta.const.u64 %r68,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 346 8 ld.u64 %r24,[%r35]; $L75: ld.u64 %r69,[%r35+40]; shl.b64 %r34,%r69,2; mov.u64 %r29,%r24; .loc 1 347 10 mov.u64 %r32,0; .loc 1 348 23 cvt.u32.u64 %r70,%r32; $L76: st.u32 [%r29],%r70; .loc 1 347 24 add.u64 %r32,%r32,1; .loc 1 347 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r71,%r30,%r32; @ %r71 bra $L76; $L69: .loc 1 349 1 ret; } maxloc0_4_s4.o/ 1622802211_gfortran_maxloc0_4_s4 .visible .func _gfortran_maxloc0_4_s4); .file 1 "../../../libgfortran/generated/maxloc0_4_s4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc0_4_s4 .visible .func _gfortran_mmaxloc0_4_s_gfortran_smaxloc0_4_s4 .visible .func _gfortran_smaxloc0_4_sVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func 
_gfortrani_bounds_iforeach_return_gfortrani_memcmp_char4_gfortrani_memcmp_char4_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents7] = {77,65,88,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc0_4_s4 .visible .func _gfortran_maxloc0_4_s4stack; mov.u64 %stack,0897u64 %r120; .reg .u64 %r122; .reg .u32 %r123; .reg .predu64pred %r182; mov.u64 %r102,%ar0; mov.u64 %r103,%ar1; mov.u32 %r104,%ar2; mov.u64 %r105,%ar3; .loc 1 64 10 ld.s8 %r22,[%r103+28]; .loc 1 64 8 cvt.u32.u32 %r106,%r22; cvt.s64.s8 %r62,%r106; .loc 1 65 6 setp.gt.s64 %r107,%r62,0; @ %r107 bra $L2; .loc 1 66 5 cvta.const.u64 %r108,$LC0; {stack; call _gfortran_runtime_errorr23,[%r102]; .loc 1 68 6 setp.ne.u64 %r110,%r23,0; @ %r110 bra $L3; .loc 1 70 7 st.u64 [%r102+48],%r23; add.u64 %r112,%r62,-1; st.u64 [%r102+56],%r112; mov.u64 %r113,1; st.u64 [%r102+40],%r113; .loc 1 71 28 cvt.u32.u64 %r114,%r113; st.u8 [%r102+28],%r114; .loc 1 72 24 st.u64 [%r102+8],%r23; .loc 1 73 29117; call (%value_in),_gfortrani_xmallocarray120,[%value_in]; } mov.u64 %r23,%r120; .loc 1 73 27 st.u64 [%r102],%r23; bra $L4; $L3: .loc 1 77 11 cvta.global.u64 %r122,_gfortrani_compile_options; .loc 1 77 10 ld.u32 %r123,[%r122+36]; setp.eq.u32 %r124,%r123,0; @ %r124 bra $L4; .loc 1 78 2127; call _gfortrani_bounds_iforeach_return.loc 1 83 8 ld.u64 %r23,[%r102]; $L4: .loc 1 82 11 ld.u64 %r63,[%r102+40]; add.u64 %r27,%r103,40; mov.u64 %r68,0; .loc 1 84 10 mov.u64 %r83,%r68; add.u64 %r181,%frame,120; add.u64 %r179,%frame,240; .loc 1 88 16 mov.u64 %r138,%r68; bra $L7; $L20: mov.u64 %r83,%r70; $L7: .loc 1 86 18 add.u64 %r128,%frame,%r68; .loc 1 86 51 ld.u64 %r130,[%r27]; mul.lo.u64 %r129,%r130,%r105; .loc 1 86 18 st.u64 [%r128],%r129; .loc 1 87 19 ld.u64 %r132,[%r27+16]; add.u64 %r131,%r132,1; ld.u64 %r133,[%r27+8]; sub.u64 %r35,%r131,%r133; .loc 1 87 17 add.u64 %r135,%r181,%r68; st.u64 [%r135],%r35; .loc 1 88 16 add.u64 %r137,%r179,%r68; st.u64 [%r137],%r138; .loc 1 89 10 setp.gt.s64 %r139,%r35,0; @ %r139 bra $L5; shl.b64 %r85,%r63,2; mov.u64 %r87,%r23; .loc 1 92 11 mov.u64 %r71,0; .loc 1 93 24 cvt.u32.u64 %r140,%r71; $L6: st.u32 [%r87],%r140; .loc 1 92 27 add.u64 %r71,%r71,1; .loc 1 92 4 add.u64 %r87,%r87,%r85; setp.ne.u64 %r141,%r62,%r71; @ %r141 bra $L6; bra $L1; $L5: .loc 1 84 26 add.u64 %r70,%r83,1; .loc 1 84 3 add.u64 %r27,%r27,24; add.u64 %r68,%r68,8; setp.ne.u64 %r142,%r62,%r70; @ %r142 bra $L20; .loc 1 98 8 ld.u64 %r50,[%r103]; shl.b64 %r52,%r63,2; mov.u64 %r54,%r23; .loc 1 101 10 mov.u64 %r69,0; .loc 1 102 23 mov.u32 %r143,1; $L8: st.u32 [%r54],%r143; mov.u64 %r84,%r69; .loc 1 101 26 add.u64 %r69,%r69,1; .loc 1 101 3 add.u64 %r54,%r54,%r52; setp.ne.u64 %r144,%r83,%r84; @ %r144 bra $L8; .loc 1 108 9 setp.ne.u64 %r145,%r50,0; @ ! 
// maxloc0_4_s4.o (continued): PTX bodies of
//   _gfortran_maxloc0_4_s4   (maxloc0_4_s4.c lines 64-148)
//   _gfortran_mmaxloc0_4_s4  (maxloc0_4_s4.c lines 177-295)
//   _gfortran_smaxloc0_4_s4  (maxloc0_4_s4.c lines 317-349)
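The members above and below are the PTX builds of libgfortran's MAXLOC support routines for character arrays. As a rough orientation, the core job of a _gfortran_maxloc0_*_s* routine is to scan an array of fixed-length strings, compare elements with memcmp (or _gfortrani_memcmp_char4 for 4-byte characters), and record the 1-based position of the largest element, with 0 for a zero-sized array. The following is a minimal sketch under simplifying assumptions: the name maxloc0_str, the flat 1-D pointer interface, and the omission of descriptor, MASK and BACK handling are illustrative only, not libgfortran's actual interface.

/* Hypothetical, simplified sketch of a maxloc0-style search over a 1-D
   array of fixed-length strings.  The real libgfortran routines work on
   multi-dimensional array descriptors and also honour MASK and BACK. */
#include <stddef.h>
#include <string.h>

/* Returns the 1-based position of the lexicographically largest element,
   or 0 for a zero-sized array (Fortran MAXLOC convention). */
static size_t maxloc0_str(const char *base, size_t nelems, size_t len)
{
    size_t best = 0;             /* 0 means "no element seen yet" */
    const char *best_p = NULL;

    for (size_t i = 0; i < nelems; i++) {
        const char *p = base + i * len;
        /* First element, or strictly greater than the current maximum:
           remember its 1-based index. */
        if (best == 0 || memcmp(p, best_p, len) > 0) {
            best = i + 1;
            best_p = p;
        }
    }
    return best;
}

For example, over the three 5-byte strings "pear", "apple", "plum" this returns 3, matching MAXLOC's 1-based convention.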
maxloc0_8_s1.o/ 1622802211
.file 1 "../../../libgfortran/generated/maxloc0_8_s1.c"
// GLOBAL FUNCTION DECLs: _gfortran_maxloc0_8_s1, _gfortran_mmaxloc0_8_s1, _gfortran_smaxloc0_8_s1
// GLOBAL VAR DECL: _gfortrani_compile_options (.extern .global .align 8 .u64)
// extern .func decls: _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return, memcmp, _gfortrani_bounds_equal_extents
// const strings: "MAX...", "MASK argument", "Rank of array needs to be >...", "Funny sized logical a..."
// PTX bodies of
//   _gfortran_maxloc0_8_s1   (maxloc0_8_s1.c lines 64-148)
//   _gfortran_mmaxloc0_8_s1  (maxloc0_8_s1.c lines 177-295)
//   _gfortran_smaxloc0_8_s1  (maxloc0_8_s1.c lines 317-349)
maxloc0_8_s4.o/ 1622802211
.file 1 "../../../libgfortran/generated/maxloc0_8_s4.c"
// GLOBAL FUNCTION DECLs: _gfortran_maxloc0_8_s4, _gfortran_mmaxloc0_8_s4, _gfortran_smaxloc0_8_s4
// GLOBAL VAR DECL: _gfortrani_compile_options (.extern .global .align 8 .u64)
// extern .func decls: _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return, _gfortrani_memcmp_char4, _gfortrani_bounds_equal_extents
// const strings: "MAX...", "MASK argument", "Rank of array needs to be >...", "Funny sized logical a..."
// PTX bodies of
//   _gfortran_maxloc0_8_s4   (maxloc0_8_s4.c lines 64-148)
//   _gfortran_mmaxloc0_8_s4  (maxloc0_8_s4.c lines 177-295)
//   _gfortran_smaxloc0_8_s4  (maxloc0_8_s4.c lines 317-349)
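The _gfortran_mmaxloc0_* definitions in these members are the masked variants: they walk a logical MASK array in step with the data and only let elements whose mask entry is true compete for the maximum. A sketch under the same simplifying 1-D assumptions as above follows; the name mmaxloc0_str and the plain bool* mask are illustrative, not the libgfortran descriptor-based interface.

/* Hypothetical sketch of the masked (m*) variant over a flattened 1-D view:
   only elements with a true MASK entry participate in the search. */
#include <stdbool.h>
#include <stddef.h>
#include <string.h>

static size_t mmaxloc0_str(const char *base, const bool *mask,
                           size_t nelems, size_t len)
{
    size_t best = 0;
    const char *best_p = NULL;

    for (size_t i = 0; i < nelems; i++) {
        if (!mask[i])
            continue;                    /* skip masked-out elements */
        const char *p = base + i * len;
        if (best == 0 || memcmp(p, best_p, len) > 0) {
            best = i + 1;
            best_p = p;
        }
    }
    return best;                         /* 0 if every element was masked out */
}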
maxloc0_16_s1.o/1622802212
.file 1 "../../../libgfortran/generated/maxloc0_16_s1.c"
// GLOBAL FUNCTION DECLs: _gfortran_maxloc0_16_s1, _gfortran_mmaxloc0_16_s1, _gfortran_smaxloc0_16_s1
// GLOBAL VAR DECL: _gfortrani_compile_options (.extern .global .align 8 .u64)
// extern .func decls: _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return, memcmp, _gfortrani_bounds_equal_extents
// const strings: "MAX...", "MASK argument", "Rank of array needs to be >...", "Funny sized logical a..."
// PTX bodies of
//   _gfortran_maxloc0_16_s1   (maxloc0_16_s1.c lines 64-148)
//   _gfortran_mmaxloc0_16_s1  (maxloc0_16_s1.c lines 177-295)
//   _gfortran_smaxloc0_16_s1  (maxloc0_16_s1.c lines 317-349)
maxloc0_16_s4.o/1622802212
.file 1 "../../../libgfortran/generated/maxloc0_16_s4.c"
// GLOBAL FUNCTION DECLs: _gfortran_maxloc0_16_s4, _gfortran_mmaxloc0_16_s4, _gfortran_smaxloc0_16_s4
// GLOBAL VAR DECL: _gfortrani_compile_options (.extern .global .align 8 .u64)
// extern .func decls: _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return, _gfortrani_memcmp_char4, _gfortrani_bounds_equal_extents
// const strings: "MAX...", "MASK argument", "Rank of array needs to be >...", "Funny sized logical a..."
// PTX bodies of
//   _gfortran_maxloc0_16_s4   (maxloc0_16_s4.c lines 64-148)
//   _gfortran_mmaxloc0_16_s4  (maxloc0_16_s4.c lines 177-295)
//   _gfortran_smaxloc0_16_s4  (maxloc0_16_s4.c lines 317-349)
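The _gfortran_smaxloc0_* definitions are thin scalar-mask wrappers: when MASK is absent or .TRUE. they defer to the unmasked routine, and when MASK is .FALSE. they return an all-zero result. The minloc0_* members that follow are the same code with the comparison direction reversed (memcmp < 0 instead of > 0). A sketch of that dispatch, reusing the hypothetical maxloc0_str from the earlier sketch, follows; the interface is again illustrative only.

#include <stddef.h>

/* From the earlier illustrative sketch (not a libgfortran interface). */
size_t maxloc0_str(const char *base, size_t nelems, size_t len);

/* Scalar-mask wrapper: MASK absent or .TRUE. -> normal search,
   MASK .FALSE. -> the Fortran MAXLOC result is 0. */
static size_t smaxloc0_str(const char *base, size_t nelems, size_t len,
                           const int *mask /* NULL or pointer to scalar logical */)
{
    if (mask == NULL || *mask != 0)
        return maxloc0_str(base, nelems, len);
    return 0;
}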
.align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_returnmemcmpmem_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents7] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_4_s1 .visible .func _gfortran_minloc0_4_s1stack; mov.u64 %stack,08predpredpredpredpred %r175; .reg .u64 %r178; mov.u64 %r99,%ar0; mov.u64 %r100,%ar1; mov.u32 %r101,%ar2; mov.u64 %r102,%ar3; .loc 1 64 10 ld.s8 %r22,[%r100+28]; .loc 1 64 8 cvt.u32.u32 %r103,%r22; cvt.s64.s8 %r65,%r103; .loc 1 65 6 setp.gt.s64 %r104,%r65,0; @ %r104 bra $L2; .loc 1 66 5 cvta.const.u64 %r105,$LC0stack; call _gfortran_runtime_errorr23,[%r99]; .loc 1 68 6 setp.ne.u64 %r107,%r23,0; @ %r107 bra $L3; .loc 1 70 7 st.u64 [%r99+48],%r23; add.u64 %r109,%r65,-1; st.u64 [%r99+56],%r109; mov.u64 %r110,1; st.u64 [%r99+40],%r110; .loc 1 71 28 cvt.u32.u64 %r111,%r110; st.u8 [%r99+28],%r111; .loc 1 72 24 st.u64 [%r99+8],%r23; .loc 1 73 2965call (%value_in),_gfortrani_xmallocarray117,[%value_in]; } mov.u64 %r23,%r117; .loc 1 73 27 st.u64 [%r99],%r23; bra $L4; $L3: .loc 1 77 11 cvta.global.u64 %r119,_gfortrani_compile_options; .loc 1 77 10 ld.u32 %r120,[%r119+36]; setp.eq.u32 %r121,%r120,0; @ %r121 bra $L4; .loc 1 78r124; call _gfortrani_bounds_iforeach_return.loc 1 83 8 ld.u64 %r23,[%r99]; $L4: .loc 1 82 11 ld.u64 %r66,[%r99+40]; add.u64 %r28,%r100,40; mov.u64 %r70,0; .loc 1 84 10 mov.u64 %r86,%r70; add.u64 %r173,%frame,120; add.u64 %r172,%frame,240; .loc 1 88 16 mov.u64 %r135,%r70; bra $L7; $L19: mov.u64 %r86,%r72; $L7: .loc 1 86 18 add.u64 %r125,%frame,%r70; .loc 1 86 51 ld.u64 %r127,[%r28]; mul.lo.u64 %r126,%r127,%r102; .loc 1 86 18 st.u64 [%r125],%r126; .loc 1 87 19 ld.u64 %r129,[%r28+16]; add.u64 %r128,%r129,1; ld.u64 %r130,[%r28+8]; sub.u64 %r36,%r128,%r130; .loc 1 87 17 add.u64 %r132,%r173,%r70; st.u64 [%r132],%r36; .loc 1 88 16 add.u64 %r134,%r172,%r70; st.u64 [%r134],%r135; .loc 1 89 10 setp.gt.s64 %r136,%r36,0; @ %r136 bra $L5; shl.b64 %r90,%r66,2; mov.u64 %r27,%r23; .loc 1 92 11 mov.u64 %r73,0; .loc 1 93 24 cvt.u32.u64 %r137,%r73; $L6: st.u32 [%r27],%r137; .loc 1 92 27 add.u64 %r73,%r73,1; .loc 1 92 4 add.u64 %r27,%r27,%r90; setp.ne.u64 %r138,%r65,%r73; @ %r138 bra $L6; bra $L1; $L5: .loc 1 84 26 add.u64 %r72,%r86,1; .loc 1 84 3 add.u64 %r28,%r28,24; add.u64 %r70,%r70,8; setp.ne.u64 %r139,%r65,%r72; @ %r139 bra $L19; .loc 1 98 8 ld.u64 %r53,[%r100]; shl.b64 %r55,%r66,2; mov.u64 %r57,%r23; .loc 1 101 10 mov.u64 %r71,0; .loc 1 102 23 mov.u32 %r140,1; $L8: st.u32 [%r57],%r140; mov.u64 %r87,%r71; .loc 1 101 26 add.u64 %r71,%r71,1; .loc 1 101 3 add.u64 %r57,%r57,%r55; setp.ne.u64 %r141,%r86,%r87; @ %r141 bra $L8; .loc 1 108 9 setp.ne.u64 %r142,%r53,0; @ ! 
%r142 bra $L1; .loc 1 123 19 ld.u64 %r40,[%frame]; .loc 1 125 34 ld.u64 %r43,[%frame+120]; cvt.u32.u32 %r144,%r22; cvt.s64.s8 %r143,%r144; shl.b64 %r145,%r143,3; add.u64 %r61,%r172,%r145; .loc 1 106 11 mov.u64 %r59,0; add.u64 %r174,%frame,248; .loc 1 114 8 setp.eq.u32 %r175,%r101,0; .loc 1 131 13 mov.u64 %r178,%r59; $L18: .loc 1 125 21 ld.u64 %r91,[%frame+240]; mov.u64 %r42,%r91; mov.u64 %r52,%r53; $L15: .loc 1 114 8 setp.eq.u64 %r147,%r59,0; @ %r147 bra $L11; .loc 1 41 12r10151,[%value_in]; } .loc 1 114 8 @ %r175 bra $L12; .loc 1 114 24 setp.gt.s32 %r154,%r151,0; @ %r154 bra $L13; $L11: mov.u64 %r74,%r174; .loc 1 125 21 mov.u64 %r75,%r23; mov.u64 %r92,%r42; bra $L14; $L12: .loc 1 114 24 setp.lt.s32 %r156,%r151,0; @ ! %r156 bra $L13; bra $L11; $L14: .loc 1 119 38 cvt.u32.u64 %r158,%r92; add.u32 %r157,%r158,1; .loc 1 119 27 st.u32 [%r75],%r157; .loc 1 118 7 add.u64 %r75,%r75,%r55; setp.eq.u64 %r159,%r61,%r74; @ %r159 bra $L20; .loc 1 119 34 ld.u64 %r92,[%r74]; add.u64 %r74,%r74,8; bra $L14; $L20: mov.u64 %r59,%r52; $L13: .loc 1 123 9 add.u64 %r52,%r52,%r40; .loc 1 125 14 add.u64 %r42,%r42,1; .loc 1 125 7 st.u64 [%frame+240],%r42; setp.ne.u64 %r160,%r42,%r43; @ %r160 bra $L15; .loc 1 123 9 sub.u64 %r161,%r43,%r91; mad.lo.u64 %r53,%r161,%r40,%r53; mov.u64 %r81,%r174; mov.u64 %r51,%r43; mov.u64 %r48,%r40; .loc 1 126 9 mov.u64 %r58,0; $L17: .loc 1 131 13 st.u64 [%r81+-8],%r178; .loc 1 134 23 mul.lo.u64 %r45,%r48,%r51; mov.u64 %r94,%r58; .loc 1 135 5 add.u64 %r58,%r58,1; .loc 1 136 7 setp.eq.u64 %r165,%r94,%r86; @ %r165 bra $L1; .loc 1 144 16 ld.u64 %r166,[%r81]; add.u64 %r47,%r166,1; st.u64 [%r81],%r47; shl.b64 %r78,%r58,3; .loc 1 145 23 add.u64 %r167,%frame,%r78; ld.u64 %r48,[%r167]; .loc 1 145 13 sub.u64 %r168,%r48,%r45; add.u64 %r53,%r53,%r168; .loc 1 148 32 add.u64 %r170,%r173,%r78; ld.u64 %r51,[%r170]; .loc 1 148 7 add.u64 %r81,%r81,8; setp.eq.u64 %r171,%r47,%r51; @ %r171 bra $L17; bra $L18; $L1:_gfortran_mminloc0_4_s1 .visible .func _gfortran_mminloc0_4_s1in_ar4) {stack; mov.u64 %stack,0480u64u64u32 %r177; .reg .u16 %r178; .reg .pred %r179; .reg .predpredpred %r205; .reg .predpred %r223; .reg .u32predpredpred %r246; .reg .u64 %r249; mov.u64 %r128,%ar0; mov.u64 %r129,%ar1; mov.u64 %r130,%ar2; mov.u32 %r131,%ar3; mov.u64 %r132,%ar4; .loc 1 177 6 setp.ne.u64 %r133,%r130,0; @ %r133 bra $L321r12913132; call _gfortran_minloc0_4_s1184 7 bra $L31; $L32: .loc 1 187 10 ld.s8 %r22,[%r129+28]; .loc 1 188 6 setp.gt.s32 %r138,%r22,0; @ %r138 bra $L34; .loc 1 189 5 cvta.const.u64 %r131stack; call _gfortran_runtime_error34: .loc 1 191 6 ld.u64 %r141,[%r128]; setp.ne.u64 %r142,%r141,0; @ %r142 bra $L35; .loc 1 193 7 st.u64 [%r128+48],%r141; add.u32 %r144,%r22,-1; cvt.s64.s32 %r145,%r144; st.u64 [%r128+56],%r145; mov.u64 %r146,1; st.u64 [%r128+40],%r146; .loc 1 194 28 cvt.u32.u64 %r147,%r146; st.u8 [%r128+28],%r147; .loc 1 195 24 st.u64 [%r128+8],%r141; .loc 1 196 29 cvt.u32.u32 %r152,%r22; cvt.s64.s8 %r151,%r152call (%value_in),_gfortrani_xmallocarray153,[%value_in]; } .loc 1 196 27 st.u64 [%r128],%r153; bra $L36; $L35: .loc 1 200 11 cvta.global.u64 %r155,_gfortrani_compile_options; .loc 1 200 10 ld.u32 %r156,[%r155+36]; setp.eq.u32 %r157,%r156,0r129160; call _gfortrani_bounds_iforeach_return.loc 1 205 4 cvta.const.u64 %r16311291630; call _gfortrani_bounds_equal_extents$L36: .loc 1 210 15 ld.u64 %r31,[%r130+16]; .loc 1 212 9 ld.u64 %r82,[%r130]; .loc 1 214 22 cvt.u32.u64 %r32,%r31; .loc 1 214 53 add.u32 %r165,%r32,-4; and.b32 %r166,%r165,-5; set.u32.eq.u32 %r168,%r166,0; neg.s32 %r169,%r168; .loc 1 214 22 
add.u32 %r170,%r32,-1; set.u32.le.u32 %r172,%r170,1; neg.s32 %r173,%r172; .loc 1 214 6 cvt.u16.u32 %r175,%r169; cvt.u16.u32 %r176,%r173; or.b16 %r174,%r175,%r176; cvt.u32.u16 %r177,%r174; cvt.u16.u8 %r178,%r177; setp.ne.u16 %r179,%r178,0; @ %r179 bra $L37; .loc 1 216 7 setp.ne.u32 %r181,%r32,16; @ %r181 bra $L38; $L37: .loc 1 223 11 ld.u64 %r93,[%r128+40]; .loc 1 224 8 ld.u64 %r94,[%r128]; .loc 1 225 17 cvt.u32.u32 %r182,%r22; cvt.s64.s8 %r111,%r182; add.u64 %r89,%r129,40; add.u64 %r83,%r130,40; mov.u64 %r68,0; .loc 1 225 10 mov.u64 %r84,%r68; add.u64 %r245,%frame,120; add.u64 %r242,%frame,240; add.u64 %r243,%frame,360; .loc 1 230 16 mov.u64 %r199,%r68; bra $L39; $L38: .loc 1 221 5_gfortran_runtime_error39: .loc 1 227 18 add.u64 %r186,%r245,%r68; .loc 1 227 51 ld.u64 %r188,[%r89]; mul.lo.u64 %r187,%r188,%r132; .loc 1 227 18 st.u64 [%r186],%r187; .loc 1 228 18 add.u64 %r189,%frame,%r68; .loc 1 228 20 ld.u64 %r191,[%r83]; mul.lo.u64 %r190,%r191,%r31; .loc 1 228 18 st.u64 [%r189],%r190; .loc 1 229 19 ld.u64 %r193,[%r89+16]; add.u64 %r192,%r193,1; ld.u64 %r194,[%r89+8]; sub.u64 %r51,%r192,%r194; .loc 1 229 17 add.u64 %r196,%r242,%r68; st.u64 [%r196],%r51; .loc 1 230 16 add.u64 %r198,%r243,%r68; st.u64 [%r198],%r199; .loc 1 231 10 setp.gt.s64 %r200,%r51,0; @ %r200 bra $L40; shl.b64 %r124,%r93,2; mov.u64 %r38,%r94; .loc 1 234 11 mov.u64 %r96,0; .loc 1 235 24 cvt.u32.u64 %r201,%r96; $L41: st.u32 [%r38],%r201; .loc 1 234 27 add.u64 %r96,%r96,1; .loc 1 234 4 add.u64 %r38,%r38,%r124; setp.ne.u64 %r202,%r96,%r111; @ %r202 bra $L41; bra $L31; $L40: .loc 1 225 26 add.u64 %r84,%r84,1; .loc 1 225 3 add.u64 %r89,%r89,24; add.u64 %r83,%r83,24; add.u64 %r68,%r68,8; setp.ne.u64 %r203,%r84,%r111; @ %r203 bra $L39; .loc 1 240 8 ld.u64 %r80,[%r129]; shl.b64 %r99,%r93,2; mov.u64 %r101,%r94; .loc 1 243 10 mov.u64 %r95,0; .loc 1 244 23 cvt.u32.u64 %r204,%r95; $L43: st.u32 [%r101],%r204; .loc 1 243 26 add.u64 %r95,%r95,1; .loc 1 243 3 add.u64 %r101,%r101,%r99; setp.ne.u64 %r205,%r95,%r111; @ %r205 bra $L43; .loc 1 251 9 setp.eq.u64 %r206,%r80,0; @ %r206 bra $L31; .loc 1 267 19 ld.u64 %r59,[%frame+120]; .loc 1 268 20 ld.u64 %r61,[%frame]; .loc 1 270 34 ld.u64 %r65,[%frame+240]; cvt.u32.u32 %r208,%r22; cvt.s64.s8 %r207,%r208; shl.b64 %r209,%r207,3; add.u64 %r103,%r243,%r209; .loc 1 249 10 mov.u64 %r86,0; add.u64 %r244,%frame,368; .loc 1 257 6 setp.eq.u32 %r246,%r131,0; .loc 1 276 13 mov.u64 %r249,%r86; $L50: .loc 1 270 21 ld.u64 %r125,[%frame+360]; mov.u64 %r63,%r125; mov.u64 %r81,%r82; mov.u64 %r79,%r80; $L48: .loc 1 257 6 ld.s8 %r212,[%r81]; cvt.u16.u32 %r211,%r212; setp.eq.u16 %r213,%r211,0; @ %r213 bra $L44; .loc 1 257 14 setp.eq.u64 %r214,%r86,0; @ %r214 bra $L45; .loc 1 4113218,[%value_in]; } .loc 1 257 6 @ %r246 bra $L46; .loc 1 258 23 setp.gt.s32 %r221,%r218,0; @ %r221 bra $L44; $L45: mov.u64 %r108,%r244; .loc 1 270 21 mov.u64 %r109,%r94; mov.u64 %r126,%r63; bra $L47; $L46: .loc 1 258 23 setp.lt.s32 %r223,%r218,0; @ ! 
%r223 bra $L44; bra $L45; $L47: .loc 1 263 38 cvt.u32.u64 %r225,%r126; add.u32 %r224,%r225,1; .loc 1 263 27 st.u32 [%r109],%r224; .loc 1 262 7 add.u64 %r109,%r109,%r99; setp.eq.u64 %r226,%r103,%r108; @ %r226 bra $L51; .loc 1 263 34 ld.u64 %r126,[%r108]; add.u64 %r108,%r108,8; bra $L47; $L51: mov.u64 %r86,%r79; $L44: .loc 1 267 9 add.u64 %r79,%r79,%r59; .loc 1 268 10 add.u64 %r81,%r81,%r61; .loc 1 270 14 add.u64 %r63,%r63,1; .loc 1 270 7 st.u64 [%frame+360],%r63; setp.ne.u64 %r227,%r63,%r65; @ %r227 bra $L48; sub.u64 %r119,%r65,%r125; .loc 1 267 9 mad.lo.u64 %r80,%r59,%r119,%r80; .loc 1 268 10 mad.lo.u64 %r82,%r61,%r119,%r82; mov.u64 %r116,%r244; mov.u64 %r75,%r61; mov.u64 %r78,%r65; mov.u64 %r73,%r59; mov.u64 %r115,8; .loc 1 271 9 mov.u64 %r85,0; $L49: .loc 1 276 13 st.u64 [%r116+-8],%r249; .loc 1 279 23 mul.lo.u64 %r67,%r73,%r78; .loc 1 280 24 mul.lo.u64 %r70,%r78,%r75; .loc 1 281 5 add.u64 %r85,%r85,1; .loc 1 282 7 setp.eq.u64 %r232,%r85,%r111; @ %r232 bra $L31; .loc 1 290 16 ld.u64 %r233,[%r116]; add.u64 %r72,%r233,1; st.u64 [%r116],%r72; .loc 1 291 23 add.u64 %r235,%r245,%r115; ld.u64 %r73,[%r235]; .loc 1 291 13 sub.u64 %r236,%r73,%r67; add.u64 %r80,%r80,%r236; .loc 1 292 24 add.u64 %r237,%frame,%r115; ld.u64 %r75,[%r237]; .loc 1 292 14 sub.u64 %r238,%r75,%r70; add.u64 %r82,%r82,%r238; .loc 1 295 32 add.u64 %r240,%r242,%r115; ld.u64 %r78,[%r240]; .loc 1 295 7 add.u64 %r116,%r116,8; add.u64 %r115,%r115,8; setp.eq.u64 %r241,%r72,%r78; @ %r241 bra $L49; bra $L50; $L31:_gfortran_sminloc0_4_s1 .visible .func _gfortran_sminloc0_4_s1in_ar4) {u32 %r41; .reg .predu32 %r55; .reg .u64u32ar2; mov.u32 %r38,%ar3; mov.u64 %r39,%ar4; .loc 1 317 6 setp.eq.u64 %r40,%r37,0; @ %r40 bra $L70; .loc 1 317 20 ld.u32 %r41,[%r37]; setp.eq.u32 %r42,%r41,0; @ %r42 bra $L71; $L70: .loc 1 320 7 {call _gfortran_minloc0_4_s1324 7 bra $L69; $L71: .loc 1 327 8 ld.s8 %r30,[%r36+28]; .loc 1 329 6 setp.gt.s64 %r48,%r30,0; @ %r48 bra $L73; .loc 1 330 5 cvta.const.u64 %r49,$LC0; {stack; call _gfortran_runtime_error73: .loc 1 332 15 ld.u64 %r24,[%r35]; .loc 1 332 6 setp.ne.u64 %r51,%r24,0; @ %r51 bra $L74; .loc 1 334 7 st.u64 [%r35+48],%r24; add.u64 %r53,%r30,-1; st.u64 [%r35+56],%r53; mov.u64 %r54,1; st.u64 [%r35+40],%r54; .loc 1 335 28 cvt.u32.u64 %r55,%r54; st.u8 [%r35+28],%r55; .loc 1 336 24 st.u64 [%r35+8],%r24; .loc 1 337 29 mov.u64 %r58call (%value_in),_gfortrani_xmallocarray37 27 st.u64 [%r35],%r24; bra $L75; $L74: .loc 1 339 12 cvta.global.u64 %r63,_gfortrani_compile_options; .loc 1 339 11 ld.u32 %r64,[%r63+36]; setp.eq.u32 %r65,%r64,0; @ %r65 bra $L75; .loc 1 341 8 cvta.const.u64 %r68,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 346 8 ld.u64 %r24,[%r35]; $L75: ld.u64 %r69,[%r35+40]; shl.b64 %r34,%r69,2; mov.u64 %r29,%r24; .loc 1 347 10 mov.u64 %r32,0; .loc 1 348 23 cvt.u32.u64 %r70,%r32; $L76: st.u32 [%r29],%r70; .loc 1 347 24 add.u64 %r32,%r32,1; .loc 1 347 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r71,%r30,%r32; @ %r71 bra $L76; $L69: .loc 1 349 1 ret; } minloc0_4_s4.o/ 1622802212_gfortran_minloc0_4_s4 .visible .func _gfortran_minloc0_4_s4); .file 1 "../../../libgfortran/generated/minloc0_4_s4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_4_s4 .visible .func _gfortran_mminloc0_4_s_gfortran_sminloc0_4_s4 .visible .func _gfortran_sminloc0_4_sVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func 
_gfortrani_bounds_iforeach_return_gfortrani_memcmp_char4_gfortrani_memcmp_char4_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents7] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_4_s4 .visible .func _gfortran_minloc0_4_s4stack; mov.u64 %stack,0897u64 %r120; .reg .u64 %r122; .reg .u32 %r123; .reg .predu64pred %r182; mov.u64 %r102,%ar0; mov.u64 %r103,%ar1; mov.u32 %r104,%ar2; mov.u64 %r105,%ar3; .loc 1 64 10 ld.s8 %r22,[%r103+28]; .loc 1 64 8 cvt.u32.u32 %r106,%r22; cvt.s64.s8 %r62,%r106; .loc 1 65 6 setp.gt.s64 %r107,%r62,0; @ %r107 bra $L2; .loc 1 66 5 cvta.const.u64 %r108,$LC0; {stack; call _gfortran_runtime_errorr23,[%r102]; .loc 1 68 6 setp.ne.u64 %r110,%r23,0; @ %r110 bra $L3; .loc 1 70 7 st.u64 [%r102+48],%r23; add.u64 %r112,%r62,-1; st.u64 [%r102+56],%r112; mov.u64 %r113,1; st.u64 [%r102+40],%r113; .loc 1 71 28 cvt.u32.u64 %r114,%r113; st.u8 [%r102+28],%r114; .loc 1 72 24 st.u64 [%r102+8],%r23; .loc 1 73 29117; call (%value_in),_gfortrani_xmallocarray120,[%value_in]; } mov.u64 %r23,%r120; .loc 1 73 27 st.u64 [%r102],%r23; bra $L4; $L3: .loc 1 77 11 cvta.global.u64 %r122,_gfortrani_compile_options; .loc 1 77 10 ld.u32 %r123,[%r122+36]; setp.eq.u32 %r124,%r123,0; @ %r124 bra $L4; .loc 1 78 2127; call _gfortrani_bounds_iforeach_return.loc 1 83 8 ld.u64 %r23,[%r102]; $L4: .loc 1 82 11 ld.u64 %r63,[%r102+40]; add.u64 %r27,%r103,40; mov.u64 %r68,0; .loc 1 84 10 mov.u64 %r83,%r68; add.u64 %r181,%frame,120; add.u64 %r179,%frame,240; .loc 1 88 16 mov.u64 %r138,%r68; bra $L7; $L20: mov.u64 %r83,%r70; $L7: .loc 1 86 18 add.u64 %r128,%frame,%r68; .loc 1 86 51 ld.u64 %r130,[%r27]; mul.lo.u64 %r129,%r130,%r105; .loc 1 86 18 st.u64 [%r128],%r129; .loc 1 87 19 ld.u64 %r132,[%r27+16]; add.u64 %r131,%r132,1; ld.u64 %r133,[%r27+8]; sub.u64 %r35,%r131,%r133; .loc 1 87 17 add.u64 %r135,%r181,%r68; st.u64 [%r135],%r35; .loc 1 88 16 add.u64 %r137,%r179,%r68; st.u64 [%r137],%r138; .loc 1 89 10 setp.gt.s64 %r139,%r35,0; @ %r139 bra $L5; shl.b64 %r85,%r63,2; mov.u64 %r87,%r23; .loc 1 92 11 mov.u64 %r71,0; .loc 1 93 24 cvt.u32.u64 %r140,%r71; $L6: st.u32 [%r87],%r140; .loc 1 92 27 add.u64 %r71,%r71,1; .loc 1 92 4 add.u64 %r87,%r87,%r85; setp.ne.u64 %r141,%r62,%r71; @ %r141 bra $L6; bra $L1; $L5: .loc 1 84 26 add.u64 %r70,%r83,1; .loc 1 84 3 add.u64 %r27,%r27,24; add.u64 %r68,%r68,8; setp.ne.u64 %r142,%r62,%r70; @ %r142 bra $L20; .loc 1 98 8 ld.u64 %r50,[%r103]; shl.b64 %r52,%r63,2; mov.u64 %r54,%r23; .loc 1 101 10 mov.u64 %r69,0; .loc 1 102 23 mov.u32 %r143,1; $L8: st.u32 [%r54],%r143; mov.u64 %r84,%r69; .loc 1 101 26 add.u64 %r69,%r69,1; .loc 1 101 3 add.u64 %r54,%r54,%r52; setp.ne.u64 %r144,%r83,%r84; @ %r144 bra $L8; .loc 1 108 9 setp.ne.u64 %r145,%r50,0; @ ! 
%r145 bra $L1; .loc 1 123 19 ld.u64 %r92,[%frame]; .loc 1 123 9 shl.b64 %r94,%r92,2; .loc 1 125 34 ld.u64 %r95,[%frame+120]; cvt.u32.u32 %r147,%r22; cvt.s64.s8 %r146,%r147; shl.b64 %r148,%r146,3; add.u64 %r58,%r179,%r148; .loc 1 106 11 mov.u64 %r56,0; add.u64 %r180,%frame,248; .loc 1 114 8 setp.eq.u32 %r182,%r104,0; $L19: .loc 1 125 21 ld.u64 %r96,[%frame+240]; mov.u64 %r40,%r96; mov.u64 %r49,%r50; $L16: .loc 1 114 8 setp.ne.u64 %r150,%r56,0; @ %r150 bra $L11; $L14: mov.u64 %r74,%r180; .loc 1 125 21 mov.u64 %r75,%r23; mov.u64 %r97,%r40; bra $L12; $L11: .loc 1 114 8 @ %r182 bra $L13;105; call (%value_in),_gfortrani_memcmp_char4156,[%value_in]; } .loc 1 114 24 setp.le.s32 %r157,%r156,0; @ ! %r157 bra $L15; bra $L14105; call (%value_in),_gfortrani_memcmp_char4161,[%value_in]; } .loc 1 114 24 setp.lt.s32 %r162,%r161,0; @ ! %r162 bra $L15; bra $L14; $L12: .loc 1 119 38 cvt.u32.u64 %r164,%r97; add.u32 %r163,%r164,1; .loc 1 119 27 st.u32 [%r75],%r163; .loc 1 118 7 add.u64 %r75,%r75,%r52; setp.eq.u64 %r165,%r58,%r74; @ %r165 bra $L21; .loc 1 119 34 ld.u64 %r97,[%r74]; add.u64 %r74,%r74,8; bra $L12; $L21: .loc 1 117 14 mov.u64 %r56,%r49; $L15: .loc 1 123 9 add.u64 %r49,%r49,%r94; .loc 1 125 14 add.u64 %r40,%r40,1; .loc 1 125 7 st.u64 [%frame+240],%r40; setp.ne.u64 %r166,%r40,%r95; @ %r166 bra $L16; .loc 1 123 9 sub.u64 %r167,%r95,%r96; mad.lo.u64 %r50,%r167,%r94,%r50; mov.u64 %r81,%r180; mov.u64 %r48,%r95; mov.u64 %r45,%r92; .loc 1 126 9 mov.u64 %r55,0; .loc 1 131 13 mov.u64 %r170,%r55; $L18: st.u64 [%r81+-8],%r170; .loc 1 134 23 mul.lo.u64 %r42,%r45,%r48; mov.u64 %r36,%r55; .loc 1 135 5 add.u64 %r55,%r55,1; .loc 1 136 7 setp.eq.u64 %r171,%r36,%r83; @ %r171 bra $L1; .loc 1 144 16 ld.u64 %r172,[%r81]; add.u64 %r44,%r172,1; st.u64 [%r81],%r44; shl.b64 %r78,%r55,3; .loc 1 145 23 add.u64 %r173,%frame,%r78; ld.u64 %r45,[%r173]; .loc 1 145 13 sub.u64 %r174,%r45,%r42; shl.b64 %r175,%r174,2; add.u64 %r50,%r50,%r175; .loc 1 148 32 add.u64 %r177,%r181,%r78; ld.u64 %r48,[%r177]; .loc 1 148 7 add.u64 %r81,%r81,8; setp.eq.u64 %r178,%r44,%r48; @ %r178 bra $L18; bra $L19; $L1:_gfortran_mminloc0_4_s4 .visible .func _gfortran_mminloc0_4_s, .param .u64 %in_ar4) {stack; mov.u64 %stack,0480predpred.reg .u16 %r179; .reg .u32 %r180; .reg .u16 %r181; .reg .pred %r182; .reg .predpred %r205; .reg .pred %r206; .reg .u32 %r207; .reg .predu32 %r211; .reg .u64 %r212; .reg .u16 %r214; .reg .u32 %r215; .reg .predpred %r232; .reg .predu64 %r251; .reg .u64 %r252; .reg .pred %r253; mov.u64 %r131,%ar0; mov.u64 %r132,%ar1; mov.u64 %r133,%ar2; mov.u32 %r134,%ar3; mov.u64 %r135,%ar4; .loc 1 177 6 setp.ne.u64 %r136,%r133,0; @ %r136 bra $L301132134135; call _gfortran_minloc0_4_s4184 7 bra $L29; $L30: .loc 1 187 10 ld.s8 %r22,[%r132+28]; .loc 1 188 6 setp.gt.s32 %r141,%r22,0; @ %r141 bra $L32; .loc 1 189 5 cvta.const.u64 %r142,$LC0stack; call _gfortran_runtime_error32: .loc 1 191 6 ld.u64 %r144,[%r131]; setp.ne.u64 %r145,%r144,0; @ %r145 bra $L33; .loc 1 193 7 st.u64 [%r131+48],%r144; add.u32 %r147,%r22,-1; cvt.s64.s32 %r148,%r147; st.u64 [%r131+56],%r148; mov.u64 %r149,1; st.u64 [%r131+40],%r149; .loc 1 194 28 cvt.u32.u64 %r150,%r149; st.u8 [%r131+28],%r150; .loc 1 195 24 st.u64 [%r131+8],%r144; .loc 1 196 29 cvt.u32.u32 %r155,%r22; cvt.s64.s8 %r154,%r155call (%value_in),_gfortrani_xmallocarray156,[%value_in]; } .loc 1 196 27 st.u64 [%r131],%r156; bra $L34; $L33: .loc 1 200 11 cvta.global.u64 %r158,_gfortrani_compile_options; .loc 1 200 10 ld.u32 %r159,[%r158+36]; setp.eq.u32 %r160,%r159,0r1163; call _gfortrani_bounds_iforeach_return.loc 
1 205 4 cvta.const.u64 %r16613r11663; call _gfortrani_bounds_equal_extents$L34: .loc 1 210 15 ld.u64 %r31,[%r133+16]; .loc 1 212 9 ld.u64 %r76,[%r133]; .loc 1 214 22 cvt.u32.u64 %r32,%r31; .loc 1 214 53 add.u32 %r168,%r32,-4; and.b32 %r169,%r168,-5; set.u32.eq.u32 %r171,%r169,0; neg.s32 %r172,%r171; .loc 1 214 22 add.u32 %r173,%r32,-1; set.u32.le.u32 %r175,%r173,1; neg.s32 %r176,%r175; .loc 1 214 6 cvt.u16.u32 %r178,%r172; cvt.u16.u32 %r179,%r176; or.b16 %r177,%r178,%r179; cvt.u32.u16 %r180,%r177; cvt.u16.u8 %r181,%r180; setp.ne.u16 %r182,%r181,0; @ %r182 bra $L35; .loc 1 216 7 setp.ne.u32 %r184,%r32,16; @ %r184 bra $L36; $L35: .loc 1 223 11 ld.u64 %r87,[%r131+40]; .loc 1 224 8 ld.u64 %r88,[%r131]; .loc 1 225 17 cvt.u32.u32 %r185,%r22; cvt.s64.s8 %r107,%r185; add.u64 %r83,%r132,40; add.u64 %r77,%r133,40; mov.u64 %r61,0; .loc 1 225 10 mov.u64 %r78,%r61; add.u64 %r250,%frame,120; add.u64 %r251,%frame,240; add.u64 %r252,%frame,360; .loc 1 230 16 mov.u64 %r202,%r61; bra $L37; $L36: .loc 1 221 5_gfortran_runtime_error37: .loc 1 227 18 add.u64 %r189,%r250,%r61; .loc 1 227 51 ld.u64 %r191,[%r83]; mul.lo.u64 %r190,%r191,%r135; .loc 1 227 18 st.u64 [%r189],%r190; .loc 1 228 18 add.u64 %r192,%frame,%r61; .loc 1 228 20 ld.u64 %r194,[%r77]; mul.lo.u64 %r193,%r194,%r31; .loc 1 228 18 st.u64 [%r192],%r193; .loc 1 229 19 ld.u64 %r196,[%r83+16]; add.u64 %r195,%r196,1; ld.u64 %r197,[%r83+8]; sub.u64 %r50,%r195,%r197; .loc 1 229 17 add.u64 %r199,%r251,%r61; st.u64 [%r199],%r50; .loc 1 230 16 add.u64 %r201,%r252,%r61; st.u64 [%r201],%r202; .loc 1 231 10 setp.gt.s64 %r203,%r50,0; @ %r203 bra $L38; shl.b64 %r115,%r87,2; mov.u64 %r117,%r88; .loc 1 234 11 mov.u64 %r91,0; .loc 1 235 24 cvt.u32.u64 %r204,%r91; $L39: st.u32 [%r117],%r204; .loc 1 234 27 add.u64 %r91,%r91,1; .loc 1 234 4 add.u64 %r117,%r117,%r115; setp.ne.u64 %r205,%r91,%r107; @ %r205 bra $L39; bra $L29; $L38: .loc 1 225 26 add.u64 %r78,%r78,1; .loc 1 225 3 add.u64 %r83,%r83,24; add.u64 %r77,%r77,24; add.u64 %r61,%r61,8; setp.ne.u64 %r206,%r78,%r107; @ %r206 bra $L37; .loc 1 240 8 ld.u64 %r73,[%r132]; shl.b64 %r96,%r87,2; mov.u64 %r98,%r88; .loc 1 243 10 mov.u64 %r90,0; .loc 1 244 23 cvt.u32.u64 %r207,%r90; $L41: st.u32 [%r98],%r207; .loc 1 243 26 add.u64 %r90,%r90,1; .loc 1 243 3 add.u64 %r98,%r98,%r96; setp.ne.u64 %r208,%r90,%r107; @ %r208 bra $L41; .loc 1 251 9 setp.eq.u64 %r209,%r73,0; @ %r209 bra $L29; .loc 1 267 19 ld.u64 %r123,[%frame+120]; .loc 1 267 9 shl.b64 %r125,%r123,2; .loc 1 268 20 ld.u64 %r126,[%frame]; .loc 1 270 34 ld.u64 %r128,[%frame+240]; cvt.u32.u32 %r211,%r22; cvt.s64.s8 %r210,%r211; shl.b64 %r212,%r210,3; add.u64 %r100,%r252,%r212; .loc 1 249 10 mov.u64 %r80,0; add.u64 %r249,%frame,368; .loc 1 257 6 setp.eq.u32 %r253,%r134,0; $L49: .loc 1 270 21 ld.u64 %r129,[%frame+360]; mov.u64 %r57,%r129; mov.u64 %r75,%r76; mov.u64 %r72,%r73; $L47: .loc 1 257 6 ld.s8 %r215,[%r75]; cvt.u16.u32 %r214,%r215; setp.eq.u16 %r216,%r214,0; @ %r216 bra $L42; .loc 1 257 14 setp.ne.u64 %r217,%r80,0; @ %r217 bra $L43; $L46: mov.u64 %r105,%r249; .loc 1 270 21 mov.u64 %r106,%r88; mov.u64 %r130,%r57; bra $L44; $L43: .loc 1 257 6 @ %r253 bra $L45; .loc 1 43135; call (%value_in),_gfortrani_memcmp_char4223,[%value_in]; } .loc 1 258 23 setp.le.s32 %r224,%r223,0; @ ! %r224 bra $L42; bra $L46; $L45: .loc 1 43135; call (%value_in),_gfortrani_memcmp_char4228,[%value_in]; } .loc 1 258 23 setp.lt.s32 %r229,%r228,0; @ ! 
%r229 bra $L42; bra $L46; $L44: .loc 1 263 38 cvt.u32.u64 %r231,%r130; add.u32 %r230,%r231,1; .loc 1 263 27 st.u32 [%r106],%r230; .loc 1 262 7 add.u64 %r106,%r106,%r96; setp.eq.u64 %r232,%r100,%r105; @ %r232 bra $L50; .loc 1 263 34 ld.u64 %r130,[%r105]; add.u64 %r105,%r105,8; bra $L44; $L50: .loc 1 261 14 mov.u64 %r80,%r72; $L42: .loc 1 267 9 add.u64 %r72,%r72,%r125; .loc 1 268 10 add.u64 %r75,%r75,%r126; .loc 1 270 14 add.u64 %r57,%r57,1; .loc 1 270 7 st.u64 [%frame+360],%r57; setp.ne.u64 %r233,%r57,%r128; @ %r233 bra $L47; sub.u64 %r120,%r128,%r129; .loc 1 267 9 mad.lo.u64 %r73,%r120,%r125,%r73; .loc 1 268 10 mad.lo.u64 %r76,%r120,%r126,%r76; mov.u64 %r113,%r249; mov.u64 %r69,%r126; mov.u64 %r71,%r128; mov.u64 %r67,%r123; mov.u64 %r112,8; .loc 1 271 9 mov.u64 %r79,0; .loc 1 276 13 mov.u64 %r237,%r79; $L48: st.u64 [%r113+-8],%r237; .loc 1 279 23 mul.lo.u64 %r60,%r67,%r71; .loc 1 280 24 mul.lo.u64 %r63,%r71,%r69; .loc 1 281 5 add.u64 %r79,%r79,1; .loc 1 282 7 setp.eq.u64 %r238,%r79,%r107; @ %r238 bra $L29; .loc 1 290 16 ld.u64 %r239,[%r113]; add.u64 %r66,%r239,1; st.u64 [%r113],%r66; .loc 1 291 23 add.u64 %r241,%r250,%r112; ld.u64 %r67,[%r241]; .loc 1 291 13 sub.u64 %r242,%r67,%r60; shl.b64 %r243,%r242,2; add.u64 %r73,%r73,%r243; .loc 1 292 24 add.u64 %r244,%frame,%r112; ld.u64 %r69,[%r244]; .loc 1 292 14 sub.u64 %r245,%r69,%r63; add.u64 %r76,%r76,%r245; .loc 1 295 32 add.u64 %r247,%r251,%r112; ld.u64 %r71,[%r247]; .loc 1 295 7 add.u64 %r113,%r113,8; add.u64 %r112,%r112,8; setp.eq.u64 %r248,%r66,%r71; @ %r248 bra $L48; bra $L49; $L29:_gfortran_sminloc0_4_s4 .visible .func _gfortran_sminloc0_4_s, .param .u64 %in_ar4) {u32 %r41; .reg .predu32 %r55; .reg .u64u32ar2; mov.u32 %r38,%ar3; mov.u64 %r39,%ar4; .loc 1 317 6 setp.eq.u64 %r40,%r37,0; @ %r40 bra $L66; .loc 1 317 20 ld.u32 %r41,[%r37]; setp.eq.u32 %r42,%r41,0; @ %r42 bra $L67; $L66: .loc 1 320 7 {call _gfortran_minloc0_4_s4324 7 bra $L65; $L67: .loc 1 327 8 ld.s8 %r30,[%r36+28]; .loc 1 329 6 setp.gt.s64 %r48,%r30,0; @ %r48 bra $L69; .loc 1 330 5 cvta.const.u64 %r49,$LC0; {stack; call _gfortran_runtime_error69: .loc 1 332 15 ld.u64 %r24,[%r35]; .loc 1 332 6 setp.ne.u64 %r51,%r24,0; @ %r51 bra $L70; .loc 1 334 7 st.u64 [%r35+48],%r24; add.u64 %r53,%r30,-1; st.u64 [%r35+56],%r53; mov.u64 %r54,1; st.u64 [%r35+40],%r54; .loc 1 335 28 cvt.u32.u64 %r55,%r54; st.u8 [%r35+28],%r55; .loc 1 336 24 st.u64 [%r35+8],%r24; .loc 1 337 29 mov.u64 %r58call (%value_in),_gfortrani_xmallocarray37 27 st.u64 [%r35],%r24; bra $L71; $L70: .loc 1 339 12 cvta.global.u64 %r63,_gfortrani_compile_options; .loc 1 339 11 ld.u32 %r64,[%r63+36]; setp.eq.u32 %r65,%r64,0; @ %r65 bra $L71; .loc 1 341 8 cvta.const.u64 %r68,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 346 8 ld.u64 %r24,[%r35]; $L71: ld.u64 %r69,[%r35+40]; shl.b64 %r34,%r69,2; mov.u64 %r29,%r24; .loc 1 347 10 mov.u64 %r32,0; .loc 1 348 23 cvt.u32.u64 %r70,%r32; $L72: st.u32 [%r29],%r70; .loc 1 347 24 add.u64 %r32,%r32,1; .loc 1 347 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r71,%r30,%r32; @ %r71 bra $L72; $L65: .loc 1 349 1 ret; } minloc0_8_s1.o/ 1622802212_gfortran_minloc0_8_s1 .visible .func _gfortran_minloc0_8_s1); .file 1 "../../../libgfortran/generated/minloc0_8_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_8_s1 .visible .func _gfortran_mminloc0_8_s1_gfortran_sminloc0_8_s1 .visible .func _gfortran_sminloc0_8_s1in_ar4); // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func 
_gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_returnmemcmpmem_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents7] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_8_s1 .visible .func _gfortran_minloc0_8_s1stack; mov.u64 %stack,08u64u64u64 %r92; .reg .u6432u64u32 %r142; .reg .u64 %r143; .reg .predpred %r157; .reg .u64 %r158; .reg .predpredpred %r172; .reg .u64 %r175; mov.u64 %r97,%ar0; mov.u64 %r98,%ar1; mov.u32 %r99,%ar2; mov.u64 %r100,%ar3; .loc 1 64 10 ld.s8 %r22,[%r98+28]; .loc 1 64 8 cvt.u32.u32 %r101,%r22; cvt.s64.s8 %r64,%r101; .loc 1 65 6 setp.gt.s64 %r102,%r64,0; @ %r102 bra $L2; .loc 1 66 5 cvta.const.u64 %r103103_gfortran_runtime_errorr23,[%r97]; .loc 1 68 6 setp.ne.u64 %r105,%r23,0; @ %r105 bra $L3; .loc 1 70 7 st.u64 [%r97+48],%r23; add.u64 %r107,%r64,-1; st.u64 [%r97+56],%r107; mov.u64 %r108,1; st.u64 [%r97+40],%r108; .loc 1 71 28 cvt.u32.u64 %r109,%r108; st.u8 [%r97+28],%r109; .loc 1 72 24 st.u64 [%r97+8],%r23; .loc 1 73 29 mov.u64 %r11112; call (%value_in),_gfortrani_xmallocarray115,[%value_in]; } mov.u64 %r23,%r115; .loc 1 73 27 st.u64 [%r97],%r23; bra $L4; $L3: .loc 1 77 11 cvta.global.u64 %r117,_gfortrani_compile_options; .loc 1 77 10 ld.u32 %r118,[%r117+36]; setp.eq.u32 %r119,%r118,0; @ %r119 bra $L4; .loc 1 7892; call _gfortrani_bounds_iforeach_return.loc 1 83 8 ld.u64 %r23,[%r97]; $L4: .loc 1 82 11 ld.u64 %r65,[%r97+40]; add.u64 %r67,%r98,40; mov.u64 %r94,0; .loc 1 84 10 mov.u64 %r85,%r94; add.u64 %r170,%frame,120; add.u64 %r169,%frame,240; .loc 1 88 16 mov.u64 %r133,%r94; bra $L7; $L19: mov.u64 %r85,%r69; $L7: .loc 1 86 18 add.u64 %r123,%frame,%r94; .loc 1 86 51 ld.u64 %r125,[%r67]; mul.lo.u64 %r124,%r125,%r100; .loc 1 86 18 st.u64 [%r123],%r124; .loc 1 87 19 ld.u64 %r127,[%r67+16]; add.u64 %r126,%r127,1; ld.u64 %r128,[%r67+8]; sub.u64 %r37,%r126,%r128; .loc 1 87 17 add.u64 %r130,%r170,%r94; st.u64 [%r130],%r37; .loc 1 88 16 add.u64 %r132,%r169,%r94; st.u64 [%r132],%r133; .loc 1 89 10 setp.gt.s64 %r134,%r37,0; @ %r134 bra $L5; shl.b64 %r84,%r65,3; mov.u64 %r27,%r23; .loc 1 92 11 mov.u64 %r70,0; .loc 1 93 24 mov.u64 %r135,%r70; $L6: st.u64 [%r27],%r135; .loc 1 92 27 add.u64 %r70,%r70,1; .loc 1 92 4 add.u64 %r27,%r27,%r84; setp.ne.u64 %r136,%r64,%r70; @ %r136 bra $L6; bra $L1; $L5: .loc 1 84 26 add.u64 %r69,%r85,1; .loc 1 84 3 add.u64 %r67,%r67,24; add.u64 %r94,%r94,8; setp.ne.u64 %r137,%r64,%r69; @ %r137 bra $L19; .loc 1 98 8 ld.u64 %r53,[%r98]; shl.b64 %r29,%r65,3; mov.u64 %r50,%r23; .loc 1 101 10 mov.u64 %r68,0; .loc 1 102 23 mov.u64 %r138,1; $L8: st.u64 [%r50],%r138; mov.u64 %r86,%r68; .loc 1 101 26 add.u64 %r68,%r68,1; .loc 1 101 3 add.u64 %r50,%r50,%r29; setp.ne.u64 %r139,%r85,%r86; @ %r139 bra $L8; .loc 1 108 9 setp.ne.u64 %r140,%r53,0; @ ! 
%r140 bra $L1; .loc 1 123 19 ld.u64 %r40,[%frame]; .loc 1 125 34 ld.u64 %r43,[%frame+120]; cvt.u32.u32 %r142,%r22; cvt.s64.s8 %r141,%r142; shl.b64 %r143,%r141,3; add.u64 %r55,%r169,%r143; .loc 1 106 11 mov.u64 %r58,0; add.u64 %r171,%frame,248; .loc 1 114 8 setp.eq.u32 %r172,%r99,0; .loc 1 131 13 mov.u64 %r175,%r58; $L18: .loc 1 125 21 ld.u64 %r87,[%frame+240]; mov.u64 %r42,%r87; mov.u64 %r52,%r53; $L15: .loc 1 114 8 setp.eq.u64 %r145,%r58,0; @ %r145 bra $L11; .loc 1 41 12r100149,[%value_in]; } .loc 1 114 8 @ %r172 bra $L12; .loc 1 114 24 setp.gt.s32 %r152,%r149,0; @ %r152 bra $L13; $L11: mov.u64 %r63,%r171; .loc 1 125 21 mov.u64 %r71,%r23; mov.u64 %r88,%r42; bra $L14; $L12: .loc 1 114 24 setp.lt.s32 %r154,%r149,0; @ ! %r154 bra $L13; bra $L11; $L14: .loc 1 119 38 add.u64 %r155,%r88,1; .loc 1 119 27 st.u64 [%r71],%r155; .loc 1 118 7 add.u64 %r71,%r71,%r29; setp.eq.u64 %r156,%r55,%r63; @ %r156 bra $L20; .loc 1 119 34 ld.u64 %r88,[%r63]; add.u64 %r63,%r63,8; bra $L14; $L20: mov.u64 %r58,%r52; $L13: .loc 1 123 9 add.u64 %r52,%r52,%r40; .loc 1 125 14 add.u64 %r42,%r42,1; .loc 1 125 7 st.u64 [%frame+240],%r42; setp.ne.u64 %r157,%r42,%r43; @ %r157 bra $L15; .loc 1 123 9 sub.u64 %r158,%r43,%r87; mad.lo.u64 %r53,%r158,%r40,%r53; mov.u64 %r77,%r171; mov.u64 %r51,%r43; mov.u64 %r48,%r40; .loc 1 126 9 mov.u64 %r57,0; $L17: .loc 1 131 13 st.u64 [%r77+-8],%r175; .loc 1 134 23 mul.lo.u64 %r45,%r48,%r51; mov.u64 %r92,%r57; .loc 1 135 5 add.u64 %r57,%r57,1; .loc 1 136 7 setp.eq.u64 %r162,%r92,%r85; @ %r162 bra $L1; .loc 1 144 16 ld.u64 %r163,[%r77]; add.u64 %r47,%r163,1; st.u64 [%r77],%r47; shl.b64 %r74,%r57,3; .loc 1 145 23 add.u64 %r164,%frame,%r74; ld.u64 %r48,[%r164]; .loc 1 145 13 sub.u64 %r165,%r48,%r45; add.u64 %r53,%r53,%r165; .loc 1 148 32 add.u64 %r167,%r170,%r74; ld.u64 %r51,[%r167]; .loc 1 148 7 add.u64 %r77,%r77,8; setp.eq.u64 %r168,%r47,%r51; @ %r168 bra $L17; bra $L18; $L1:_gfortran_mminloc0_8_s1 .visible .func _gfortran_mminloc0_8_s1in_ar4) {stack; mov.u64 %stack,0480u64 %r130; .reg .pred %r131; .reg .predpredpred %r155; .reg .u64 %r158; .reg .u64 %r161; .reg .u32pred %r198; .reg .u64predpredpred %r229; .reg .u64pred %r238; .reg .u64pred %r243; .reg .u64 %r246; mov.u64 %r126,%ar0; mov.u64 %r127,%ar1; mov.u64 %r128,%ar2; mov.u32 %r129,%ar3; mov.u64 %r130,%ar4; .loc 1 177 6 setp.ne.u64 %r131,%r128,0; @ %r131 bra $L32; .loc 1 180 7130; call _gfortran_minloc0_8_s1184 7 bra $L31; $L32: .loc 1 187 10 ld.s8 %r22,[%r127+28]; .loc 1 188 6 setp.gt.s32 %r136,%r22,0; @ %r136 bra $L34; .loc 1 189 5 cvta.const.u64 %r131stack; call _gfortran_runtime_error34: .loc 1 191 6 ld.u64 %r139,[%r126]; setp.ne.u64 %r140,%r139,0; @ %r140 bra $L35; .loc 1 193 7 st.u64 [%r126+48],%r139; add.u32 %r142,%r22,-1; cvt.s64.s32 %r143,%r142; st.u64 [%r126+56],%r143; mov.u64 %r144,1; st.u64 [%r126+40],%r144; .loc 1 194 28 cvt.u32.u64 %r145,%r144; st.u8 [%r126+28],%r145; .loc 1 195 24 st.u64 [%r126+8],%r139; .loc 1 196 29 cvt.u32.u32 %r150,%r22; cvt.s64.s8 %r149,%r150; mov.u64 %r148call (%value_in),_gfortrani_xmallocarray151,[%value_in]; } .loc 1 196 27 st.u64 [%r126],%r151; bra $L36; $L35: .loc 1 200 11 cvta.global.u64 %r153,_gfortrani_compile_options; .loc 1 200 10 ld.u32 %r154,[%r153+36]; setp.eq.u32 %r155,%r154,0; @ %r155 bra $L36; .loc 1 203 4 cvta.const.u64 %r158,$LC1gfortrani_bounds_iforeach_return.loc 1 205 4 cvta.const.u64 %r1611r12716158; call _gfortrani_bounds_equal_extents$L36: .loc 1 210 15 ld.u64 %r31,[%r128+16]; .loc 1 212 9 ld.u64 %r79,[%r128]; .loc 1 214 22 cvt.u32.u64 %r32,%r31; .loc 1 214 53 add.u32 
%r163,%r32,-4; and.b32 %r164,%r163,-5; set.u32.eq.u32 %r166,%r164,0; neg.s32 %r167,%r166; .loc 1 214 22 add.u32 %r168,%r32,-1; set.u32.le.u32 %r170,%r168,1; neg.s32 %r171,%r170; .loc 1 214 6 cvt.u16.u32 %r173,%r167; cvt.u16.u32 %r174,%r171; or.b16 %r172,%r173,%r174; cvt.u32.u16 %r175,%r172; cvt.u16.u8 %r176,%r175; setp.ne.u16 %r177,%r176,0; @ %r177 bra $L37; .loc 1 216 7 setp.ne.u32 %r179,%r32,16; @ %r179 bra $L38; $L37: .loc 1 223 11 ld.u64 %r92,[%r126+40]; .loc 1 224 8 ld.u64 %r93,[%r126]; .loc 1 225 17 cvt.u32.u32 %r180,%r22; cvt.s64.s8 %r112,%r180; add.u64 %r80,%r127,40; add.u64 %r65,%r128,40; mov.u64 %r44,0; .loc 1 225 10 mov.u64 %r81,%r44; add.u64 %r242,%frame,120; add.u64 %r239,%frame,240; add.u64 %r240,%frame,360; .loc 1 230 16 mov.u64 %r197,%r44; bra $L39; $L38: .loc 1 221 5_gfortran_runtime_error39: .loc 1 227 18 add.u64 %r184,%r242,%r44; .loc 1 227 51 ld.u64 %r186,[%r80]; mul.lo.u64 %r185,%r186,%r130; .loc 1 227 18 st.u64 [%r184],%r185; .loc 1 228 18 add.u64 %r187,%frame,%r44; .loc 1 228 20 ld.u64 %r189,[%r65]; mul.lo.u64 %r188,%r189,%r31; .loc 1 228 18 st.u64 [%r187],%r188; .loc 1 229 19 ld.u64 %r191,[%r80+16]; add.u64 %r190,%r191,1; ld.u64 %r192,[%r80+8]; sub.u64 %r50,%r190,%r192; .loc 1 229 17 add.u64 %r194,%r239,%r44; st.u64 [%r194],%r50; .loc 1 230 16 add.u64 %r196,%r240,%r44; st.u64 [%r196],%r197; .loc 1 231 10 setp.gt.s64 %r198,%r50,0; @ %r198 bra $L40; shl.b64 %r122,%r92,3; mov.u64 %r37,%r93; .loc 1 234 11 mov.u64 %r95,0; .loc 1 235 24 mov.u64 %r199,%r95; $L41: st.u64 [%r37],%r199; .loc 1 234 27 add.u64 %r95,%r95,1; .loc 1 234 4 add.u64 %r37,%r37,%r122; setp.ne.u64 %r200,%r95,%r112; @ %r200 bra $L41; bra $L31; $L40: .loc 1 225 26 add.u64 %r81,%r81,1; .loc 1 225 3 add.u64 %r80,%r80,24; add.u64 %r65,%r65,24; add.u64 %r44,%r44,8; setp.ne.u64 %r201,%r81,%r112; @ %r201 bra $L39; .loc 1 240 8 ld.u64 %r77,[%r127]; shl.b64 %r87,%r92,3; mov.u64 %r99,%r93; .loc 1 243 10 mov.u64 %r94,0; .loc 1 244 23 mov.u64 %r202,%r94; $L43: st.u64 [%r99],%r202; .loc 1 243 26 add.u64 %r94,%r94,1; .loc 1 243 3 add.u64 %r99,%r99,%r87; setp.ne.u64 %r203,%r94,%r112; @ %r203 bra $L43; .loc 1 251 9 setp.eq.u64 %r204,%r77,0; @ %r204 bra $L31; .loc 1 267 19 ld.u64 %r57,[%frame+120]; .loc 1 268 20 ld.u64 %r59,[%frame]; .loc 1 270 34 ld.u64 %r62,[%frame+240]; cvt.u32.u32 %r206,%r22; cvt.s64.s8 %r205,%r206; shl.b64 %r207,%r205,3; add.u64 %r101,%r240,%r207; .loc 1 249 10 mov.u64 %r83,0; add.u64 %r241,%frame,368; .loc 1 257 6 setp.eq.u32 %r243,%r129,0; .loc 1 276 13 mov.u64 %r246,%r83; $L50: .loc 1 270 21 ld.u64 %r123,[%frame+360]; mov.u64 %r61,%r123; mov.u64 %r78,%r79; mov.u64 %r76,%r77; $L48: .loc 1 257 6 ld.s8 %r210,[%r78]; cvt.u16.u32 %r209,%r210; setp.eq.u16 %r211,%r209,0; @ %r211 bra $L44; .loc 1 257 14 setp.eq.u64 %r212,%r83,0; @ %r212 bra $L45;257 6 @ %r243 bra $L46; .loc 1 258 23 setp.gt.s32 %r219,%r216,0; @ %r219 bra $L44; $L45: mov.u64 %r106,%r241; .loc 1 270 21 mov.u64 %r107,%r93; mov.u64 %r124,%r61; bra $L47; $L46: .loc 1 258 23 setp.lt.s32 %r221,%r216,0; @ ! 
%r221 bra $L44; bra $L45; $L47: .loc 1 263 38 add.u64 %r222,%r124,1; .loc 1 263 27 st.u64 [%r107],%r222; .loc 1 262 7 add.u64 %r107,%r107,%r87; setp.eq.u64 %r223,%r101,%r106; @ %r223 bra $L51; .loc 1 263 34 ld.u64 %r124,[%r106]; add.u64 %r106,%r106,8; bra $L47; $L51: mov.u64 %r83,%r76; $L44: .loc 1 267 9 add.u64 %r76,%r76,%r57; .loc 1 268 10 add.u64 %r78,%r78,%r59; .loc 1 270 14 add.u64 %r61,%r61,1; .loc 1 270 7 st.u64 [%frame+360],%r61; setp.ne.u64 %r224,%r61,%r62; @ %r224 bra $L48; sub.u64 %r117,%r62,%r123; .loc 1 267 9 mad.lo.u64 %r77,%r57,%r117,%r77; .loc 1 268 10 mad.lo.u64 %r79,%r59,%r117,%r79; mov.u64 %r114,%r241; mov.u64 %r72,%r59; mov.u64 %r75,%r62; mov.u64 %r70,%r57; mov.u64 %r113,8; .loc 1 271 9 mov.u64 %r82,0; $L49: .loc 1 276 13 st.u64 [%r114+-8],%r246; .loc 1 279 23 mul.lo.u64 %r64,%r70,%r75; .loc 1 280 24 mul.lo.u64 %r67,%r75,%r72; .loc 1 281 5 add.u64 %r82,%r82,1; .loc 1 282 7 setp.eq.u64 %r229,%r82,%r112; @ %r229 bra $L31; .loc 1 290 16 ld.u64 %r230,[%r114]; add.u64 %r69,%r230,1; st.u64 [%r114],%r69; .loc 1 291 23 add.u64 %r232,%r242,%r113; ld.u64 %r70,[%r232]; .loc 1 291 13 sub.u64 %r233,%r70,%r64; add.u64 %r77,%r77,%r233; .loc 1 292 24 add.u64 %r234,%frame,%r113; ld.u64 %r72,[%r234]; .loc 1 292 14 sub.u64 %r235,%r72,%r67; add.u64 %r79,%r79,%r235; .loc 1 295 32 add.u64 %r237,%r239,%r113; ld.u64 %r75,[%r237]; .loc 1 295 7 add.u64 %r114,%r114,8; add.u64 %r113,%r113,8; setp.eq.u64 %r238,%r69,%r75; @ %r238 bra $L49; bra $L50; $L31:_gfortran_sminloc0_8_s1 .visible .func _gfortran_sminloc0_8_s1in_ar4) {u32 %r41; .reg .predu32 %r55; .reg .u64u32ar2; mov.u32 %r38,%ar3; mov.u64 %r39,%ar4; .loc 1 317 6 setp.eq.u64 %r40,%r37,0; @ %r40 bra $L70; .loc 1 317 20 ld.u32 %r41,[%r37]; setp.eq.u32 %r42,%r41,0; @ %r42 bra $L71; $L70: .loc 1 320 7 {call _gfortran_minloc0_8_s1324 7 bra $L69; $L71: .loc 1 327 8 ld.s8 %r30,[%r36+28]; .loc 1 329 6 setp.gt.s64 %r48,%r30,0; @ %r48 bra $L73; .loc 1 330 5 cvta.const.u64 %r49,$LC0; {stack; call _gfortran_runtime_error73: .loc 1 332 15 ld.u64 %r24,[%r35]; .loc 1 332 6 setp.ne.u64 %r51,%r24,0; @ %r51 bra $L74; .loc 1 334 7 st.u64 [%r35+48],%r24; add.u64 %r53,%r30,-1; st.u64 [%r35+56],%r53; mov.u64 %r54,1; st.u64 [%r35+40],%r54; .loc 1 335 28 cvt.u32.u64 %r55,%r54; st.u8 [%r35+28],%r55; .loc 1 336 24 st.u64 [%r35+8],%r24; .loc 1 337 29 mov.u64 call (%value_in),_gfortrani_xmallocarray37 27 st.u64 [%r35],%r24; bra $L75; $L74: .loc 1 339 12 cvta.global.u64 %r63,_gfortrani_compile_options; .loc 1 339 11 ld.u32 %r64,[%r63+36]; setp.eq.u32 %r65,%r64,0; @ %r65 bra $L75; .loc 1 341 8 cvta.const.u64 %r68,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 346 8 ld.u64 %r24,[%r35]; $L75: ld.u64 %r69,[%r35+40]; shl.b64 %r34,%r69,3; mov.u64 %r29,%r24; .loc 1 347 10 mov.u64 %r32,0; .loc 1 348 23 mov.u64 %r70,%r32; $L76: st.u64 [%r29],%r70; .loc 1 347 24 add.u64 %r32,%r32,1; .loc 1 347 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r71,%r30,%r32; @ %r71 bra $L76; $L69: .loc 1 349 1 ret; } minloc0_8_s4.o/ 1622802212_gfortran_minloc0_8_s4 .visible .func _gfortran_minloc0_8_s4); .file 1 "../../../libgfortran/generated/minloc0_8_s4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_8_s4 .visible .func _gfortran_mminloc0_8_s_gfortran_sminloc0_8_s4 .visible .func _gfortran_sminloc0_8_sVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func 
_gfortrani_bounds_iforeach_return_gfortrani_memcmp_char4_gfortrani_memcmp_char4_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents7] = {77,73,78,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_minloc0_8_s4 .visible .func _gfortran_minloc0_8_s4stack; mov.u64 %stack,0844predu64 %r146; .reg .pred %r148; .reg .u32 %r154; .reg .pred %r155; .reg .u32 %r159; .reg .predpredpredpredpred64 10 ld.s8 %r22,[%r101+28]; .loc 1 64 8 cvt.u32.u32 %r104,%r22; cvt.s64.s8 %r61,%r104; .loc 1 65 6 setp.gt.s64 %r105,%r61,0; @ %r105 bra $L2; .loc 1 66 5 cvta.const.u64 %r106,$LC0; {stack; call _gfortran_runtime_errorr23,[%r100]; .loc 1 68 6 setp.ne.u64 %r108,%r23,0; @ %r108 bra $L3; .loc 1 70 7 st.u64 [%r100+48],%r23; add.u64 %r110,%r61,-1; st.u64 [%r100+56],%r110; mov.u64 %r111,1; st.u64 [%r100+40],%r111; .loc 1 71 28 cvt.u32.u64 %r112,%r111; st.u8 [%r100+28],%r112; .loc 1 72 24 st.u64 [%r100+8],%r23; .loc 1 73 29 mov.u64 %r11561call (%value_in),_gfortrani_xmallocarray118,[%value_in]; } mov.u64 %r23,%r118; .loc 1 73 27 st.u64 [%r100],%r23; bra $L4; $L3: .loc 1 77 11 cvta.global.u64 %r120,_gfortrani_compile_options; .loc 1 77 10 ld.u32 %r121,[%r120+364; .loc 1 78r1015; call _gfortrani_bounds_iforeach_return.loc 1 83 8 ld.u64 %r23,[%r100]; $L4: .loc 1 82 11 ld.u64 %r62,[%r100+40]; add.u64 %r65,%r101,40; mov.u64 %r98,0; .loc 1 84 10 mov.u64 %r82,%r98; add.u64 %r178,%frame,120; add.u64 %r176,%frame,240; .loc 1 88 16 mov.u64 %r136,%r98; bra $L7; $L20: mov.u64 %r82,%r67; $L7: .loc 1 86 18 add.u64 %r126,%frame,%r98; .loc 1 86 51 ld.u64 %r128,[%r65]; mul.lo.u64 %r127,%r128,%r103; .loc 1 86 18 st.u64 [%r126],%r127; .loc 1 87 19 ld.u64 %r130,[%r65+16]; add.u64 %r129,%r130,1; ld.u64 %r131,[%r65+8]; sub.u64 %r36,%r129,%r131; .loc 1 87 17 add.u64 %r133,%r178,%r98; st.u64 [%r133],%r36; .loc 1 88 16 add.u64 %r135,%r176,%r98; st.u64 [%r135],%r136; .loc 1 89 10 setp.gt.s64 %r137,%r36,0; @ %r137 bra $L5; shl.b64 %r79,%r62,3; mov.u64 %r81,%r23; .loc 1 92 11 mov.u64 %r68,0; .loc 1 93 24 mov.u64 %r138,%r68; $L6: st.u64 [%r81],%r138; .loc 1 92 27 add.u64 %r68,%r68,1; .loc 1 92 4 add.u64 %r81,%r81,%r79; setp.ne.u64 %r139,%r61,%r68; @ %r139 bra $L6; bra $L1; $L5: .loc 1 84 26 add.u64 %r67,%r82,1; .loc 1 84 3 add.u64 %r65,%r65,24; add.u64 %r98,%r98,8; setp.ne.u64 %r140,%r61,%r67; @ %r140 bra $L20; .loc 1 98 8 ld.u64 %r50,[%r101]; shl.b64 %r28,%r62,3; mov.u64 %r47,%r23; .loc 1 101 10 mov.u64 %r66,0; .loc 1 102 23 mov.u64 %r141,1; $L8: st.u64 [%r47],%r141; mov.u64 %r83,%r66; .loc 1 101 26 add.u64 %r66,%r66,1; .loc 1 101 3 add.u64 %r47,%r47,%r28; setp.ne.u64 %r142,%r82,%r83; @ %r142 bra $L8; .loc 1 108 9 setp.ne.u64 %r143,%r50,0; @ ! %r143 bra $L1; .loc 1 123 19 ld.u64 %r88,[%frame]; .loc 1 123 9 shl.b64 %r90,%r88,2; .loc 1 125 34 ld.u64 %r91,[%frame+120]; cvt.u32.u32 %r145,%r22; cvt.s64.s8 %r144,%r145; shl.b64 %r146,%r144,3; add.u64 %r52,%r176,%r146; .loc 1 106 11 mov.u64 %r55,0; add.u64 %r177,%frame,248; .loc 1 114 8 setp.eq.u32 %r179,%r102,0; $L19: .loc 1 125 21 ld.u64 %r92,[%frame+240]; mov.u64 %r40,%r92; mov.u64 %r49,%r50; $L16: .loc 1 114 8 setp.ne.u64 %r148,%r55,0; @ %r148 bra $L11; $L14: mov.u64 %r60,%r177; .loc 1 125 21 mov.u64 %r71,%r23; mov.u64 %r93,%r40; bra $L12; $L11: .loc 1 114 8 @ %r179 bra $L13;5_gfortrani_memcmp_char4154,[%value_in]; } .loc 1 114 24 setp.le.s32 %r155,%r154,0; @ ! 
%r155 bra $L15; bra $L145_gfortrani_memcmp_char4.loc 1 114 24 setp.lt.s32 %r160,%r159,0; @ ! %r160 bra $L15; bra $L14; $L12: .loc 1 119 38 add.u64 %r161,%r93,1; .loc 1 119 27 st.u64 [%r71],%r161; .loc 1 118 7 add.u64 %r71,%r71,%r28; setp.eq.u64 %r162,%r52,%r60; @ %r162 bra $L21; .loc 1 119 34 ld.u64 %r93,[%r60]; add.u64 %r60,%r60,8; bra $L12; $L21: .loc 1 117 14 mov.u64 %r55,%r49; $L15: .loc 1 123 9 add.u64 %r49,%r49,%r90; .loc 1 125 14 add.u64 %r40,%r40,1; .loc 1 125 7 st.u64 [%frame+240],%r40; setp.ne.u64 %r163,%r40,%r91; @ %r163 bra $L16; .loc 1 123 9 sub.u64 %r164,%r91,%r92; mad.lo.u64 %r50,%r164,%r90,%r50; mov.u64 %r77,%r177; mov.u64 %r48,%r91; mov.u64 %r45,%r88; .loc 1 126 9 mov.u64 %r54,0; .loc 1 131 13 mov.u64 %r167,%r54; $L18: st.u64 [%r77+-8],%r167; .loc 1 134 23 mul.lo.u64 %r42,%r45,%r48; mov.u64 %r38,%r54; .loc 1 135 5 add.u64 %r54,%r54,1; .loc 1 136 7 setp.eq.u64 %r168,%r38,%r82; @ %r168 bra $L1; .loc 1 144 16 ld.u64 %r169,[%r77]; add.u64 %r44,%r169,1; st.u64 [%r77],%r44; shl.b64 %r74,%r54,3; .loc 1 145 23 add.u64 %r170,%frame,%r74; ld.u64 %r45,[%r170]; .loc 1 145 13 sub.u64 %r171,%r45,%r42; shl.b64 %r172,%r171,2; add.u64 %r50,%r50,%r172; .loc 1 148 32 add.u64 %r174,%r178,%r74; ld.u64 %r48,[%r174]; .loc 1 148 7 add.u64 %r77,%r77,8; setp.eq.u64 %r175,%r44,%r48; @ %r175 bra $L18; bra $L19; $L1:_gfortran_mminloc0_8_s4 .visible .func _gfortran_mminloc0_8_s, .param .u64 %in_ar4) {stack; mov.u64 %stack,048098pred %r134; .reg .predpredpredpred %r204; .reg .u64 %r205; .reg .predu32 %r209; .reg .u64pred %r222; .reg .u32u64 %r242; .reg .u64mov.u64 %r129,%ar0; mov.u64 %r130,%ar1; mov.u64 %r131,%ar2; mov.u32 %r132,%ar3; mov.u64 %r133,%ar4; .loc 1 177 6 setp.ne.u64 %r134,%r131,0; @ %r134 bra $L302133; call _gfortran_minloc0_8_s4184 7 bra $L29; $L30: .loc 1 187 10 ld.s8 %r22,[%r130+28]; .loc 1 188 6 setp.gt.s32 %r139,%r22,0; @ %r139 bra $L32; .loc 1 189 5 cvta.const.u64 %r140140_gfortran_runtime_error32: .loc 1 191 6 ld.u64 %r142,[%r129]; setp.ne.u64 %r143,%r142,0; @ %r143 bra $L33; .loc 1 193 7 st.u64 [%r129+48],%r142; add.u32 %r145,%r22,-1; cvt.s64.s32 %r146,%r145; st.u64 [%r129+56],%r146; mov.u64 %r147,1; st.u64 [%r129+40],%r147; .loc 1 194 28 cvt.u32.u64 %r148,%r147; st.u8 [%r129+28],%r148; .loc 1 195 24 st.u64 [%r129+8],%r142; .loc 1 196 29 cvt.u32.u32 %r153,%r22; cvt.s64.s8 %r152,%r153; mov.u64 %r1511151; call (%value_in),_gfortrani_xmallocarray154,[%value_in]; } .loc 1 196 27 st.u64 [%r129],%r154; bra $L34; $L33: .loc 1 200 11 cvta.global.u64 %r156,_gfortrani_compile_options; .loc 1 200 10 ld.u32 %r157,[%r156+36]; setp.eq.u32 %r158,%r157,0gfortrani_bounds_iforeach_return.loc 1 205 4 cvta.const.u64 %r1641; call _gfortrani_bounds_equal_extents$L34: .loc 1 210 15 ld.u64 %r31,[%r131+16]; .loc 1 212 9 ld.u64 %r73,[%r131]; .loc 1 214 22 cvt.u32.u64 %r32,%r31; .loc 1 214 53 add.u32 %r166,%r32,-4; and.b32 %r167,%r166,-5; set.u32.eq.u32 %r169,%r167,0; neg.s32 %r170,%r169; .loc 1 214 22 add.u32 %r171,%r32,-1; set.u32.le.u32 %r173,%r171,1; neg.s32 %r174,%r173; .loc 1 214 6 cvt.u16.u32 %r176,%r170; cvt.u16.u32 %r177,%r174; or35; .loc 1 216 7 setp.ne.u32 %r182,%r32,16; @ %r182 bra $L36; $L35: .loc 1 223 11 ld.u64 %r85,[%r129+40]; .loc 1 224 8 ld.u64 %r86,[%r129]; .loc 1 225 17 cvt.u32.u32 %r183,%r22; cvt.s64.s8 %r107,%r183; add.u64 %r74,%r130,40; add.u64 %r58,%r131,40; mov.u64 %r37,0; .loc 1 225 10 mov.u64 %r75,%r37; add.u64 %r247,%frame,120; add.u64 %r248,%frame,240; add.u64 %r249,%frame,360; .loc 1 230 16 mov.u64 %r200,%r37; bra $L37; $L36: .loc 1 221 5_gfortran_runtime_error37: .loc 1 227 18 
add.u64 %r187,%r247,%r37; .loc 1 227 51 ld.u64 %r189,[%r74]; mul.lo.u64 %r188,%r189,%r133; .loc 1 227 18 st.u64 [%r187],%r188; .loc 1 228 18 add.u64 %r190,%frame,%r37; .loc 1 228 20 ld.u64 %r192,[%r58]; mul.lo.u64 %r191,%r192,%r31; .loc 1 228 18 st.u64 [%r190],%r191; .loc 1 229 19 ld.u64 %r194,[%r74+16]; add.u64 %r193,%r194,1; ld.u64 %r195,[%r74+8]; sub.u64 %r49,%r193,%r195; .loc 1 229 17 add.u64 %r197,%r248,%r37; st.u64 [%r197],%r49; .loc 1 230 16 add.u64 %r199,%r249,%r37; st.u64 [%r199],%r200; .loc 1 231 10 setp.gt.s64 %r201,%r49,0; @ %r201 bra $L38; shl.b64 %r113,%r85,3; mov.u64 %r115,%r86; .loc 1 234 11 mov.u64 %r90,0; .loc 1 235 24 mov.u64 %r202,%r90; $L39: st.u64 [%r115],%r202; .loc 1 234 27 add.u64 %r90,%r90,1; .loc 1 234 4 add.u64 %r115,%r115,%r113; setp.ne.u64 %r203,%r90,%r107; @ %r203 bra $L39; bra $L29; $L38: .loc 1 225 26 add.u64 %r75,%r75,1; .loc 1 225 3 add.u64 %r74,%r74,24; add.u64 %r58,%r58,24; add.u64 %r37,%r37,8; setp.ne.u64 %r204,%r75,%r107; @ %r204 bra $L37; .loc 1 240 8 ld.u64 %r70,[%r130]; shl.b64 %r88,%r85,3; mov.u64 %r96,%r86; .loc 1 243 10 mov.u64 %r89,0; .loc 1 244 23 mov.u64 %r205,%r89; $L41: st.u64 [%r96],%r205; .loc 1 243 26 add.u64 %r89,%r89,1; .loc 1 243 3 add.u64 %r96,%r96,%r88; setp.ne.u64 %r206,%r89,%r107; @ %r206 bra $L41; .loc 1 251 9 setp.eq.u64 %r207,%r70,0; @ %r207 bra $L29; .loc 1 267 19 ld.u64 %r121,[%frame+120]; .loc 1 267 9 shl.b64 %r123,%r121,2; .loc 1 268 20 ld.u64 %r124,[%frame]; .loc 1 270 34 ld.u64 %r126,[%frame+240]; cvt.u32.u32 %r209,%r22; cvt.s64.s8 %r208,%r209; shl.b64 %r210,%r208,3; add.u64 %r98,%r249,%r210; .loc 1 249 10 mov.u64 %r77,0; add.u64 %r246,%frame,368; .loc 1 257 6 setp.eq.u32 %r250,%r132,0; $L49: .loc 1 270 21 ld.u64 %r127,[%frame+360]; mov.u64 %r55,%r127; mov.u64 %r72,%r73; mov.u64 %r69,%r70; $L47: .loc 1 257 6 ld.s8 %r213,[%r72]; cvt.u16.u32 %r212,%r213; setp.eq.u16 %r214,%r212,0; @ %r214 bra $L42; .loc 1 257 14 setp.ne.u64 %r215,%r77,0; @ %r215 bra $L43; $L46: mov.u64 %r103,%r246; .loc 1 270 21 mov.u64 %r104,%r86; mov.u64 %r128,%r55; bra $L44; $L43: .loc 1 257 6 @ %r250 bra $L45; .loc 1 43r133; call (%value_in),_gfortrani_memcmp_char4221,[%value_in]; } .loc 1 258 23 setp.le.s32 %r222,%r221,0; @ ! %r222 bra $L42; bra $L46; $L45: .loc 1 43r133; call (%value_in),_gfortrani_memcmp_char4226,[%value_in]; } .loc 1 258 23 setp.lt.s32 %r227,%r226,0; @ ! 
%r227 bra $L42; bra $L46; $L44: .loc 1 263 38 add.u64 %r228,%r128,1; .loc 1 263 27 st.u64 [%r104],%r228; .loc 1 262 7 add.u64 %r104,%r104,%r88; setp.eq.u64 %r229,%r98,%r103; @ %r229 bra $L50; .loc 1 263 34 ld.u64 %r128,[%r103]; add.u64 %r103,%r103,8; bra $L44; $L50: .loc 1 261 14 mov.u64 %r77,%r69; $L42: .loc 1 267 9 add.u64 %r69,%r69,%r123; .loc 1 268 10 add.u64 %r72,%r72,%r124; .loc 1 270 14 add.u64 %r55,%r55,1; .loc 1 270 7 st.u64 [%frame+360],%r55; setp.ne.u64 %r230,%r55,%r126; @ %r230 bra $L47; sub.u64 %r118,%r126,%r127; .loc 1 267 9 mad.lo.u64 %r70,%r118,%r123,%r70; .loc 1 268 10 mad.lo.u64 %r73,%r118,%r124,%r73; mov.u64 %r111,%r246; mov.u64 %r66,%r124; mov.u64 %r68,%r126; mov.u64 %r64,%r121; mov.u64 %r110,8; .loc 1 271 9 mov.u64 %r76,0; .loc 1 276 13 mov.u64 %r234,%r76; $L48: st.u64 [%r111+-8],%r234; .loc 1 279 23 mul.lo.u64 %r57,%r64,%r68; .loc 1 280 24 mul.lo.u64 %r60,%r68,%r66; .loc 1 281 5 add.u64 %r76,%r76,1; .loc 1 282 7 setp.eq.u64 %r235,%r76,%r107; @ %r235 bra $L29; .loc 1 290 16 ld.u64 %r236,[%r111]; add.u64 %r63,%r236,1; st.u64 [%r111],%r63; .loc 1 291 23 add.u64 %r238,%r247,%r110; ld.u64 %r64,[%r238]; .loc 1 291 13 sub.u64 %r239,%r64,%r57; shl.b64 %r240,%r239,2; add.u64 %r70,%r70,%r240; .loc 1 292 24 add.u64 %r241,%frame,%r110; ld.u64 %r66,[%r241]; .loc 1 292 14 sub.u64 %r242,%r66,%r60; add.u64 %r73,%r73,%r242; .loc 1 295 32 add.u64 %r244,%r248,%r110; ld.u64 %r68,[%r244]; .loc 1 295 7 add.u64 %r111,%r111,8; add.u64 %r110,%r110,8; setp.eq.u64 %r245,%r63,%r68; @ %r245 bra $L48; bra $L49; $L29:_gfortran_sminloc0_8_s4 .visible .func _gfortran_sminloc0_8_s, .param .u64 %in_ar4) {u32 %r41; .reg .predu32 %r55; .reg .u64u32ar2; mov.u32 %r38,%ar3; mov.u64 %r39,%ar4; .loc 1 317 6 setp.eq.u64 %r40,%r37,0; @ %r40 bra $L66; .loc 1 317 20 ld.u32 %r41,[%r37]; setp.eq.u32 %r42,%r41,0; @ %r42 bra $L67; $L66: .loc 1 320 7 {call _gfortran_minloc0_8_s4324 7 bra $L65; $L67: .loc 1 327 8 ld.s8 %r30,[%r36+28]; .loc 1 329 6 setp.gt.s64 %r48,%r30,0; @ %r48 bra $L69; .loc 1 330 5 cvta.const.u64 %r49,$LC0; {stack; call _gfortran_runtime_error69: .loc 1 332 15 ld.u64 %r24,[%r35]; .loc 1 332 6 setp.ne.u64 %r51,%r24,0; @ %r51 bra $L70; .loc 1 334 7 st.u64 [%r35+48],%r24; add.u64 %r53,%r30,-1; st.u64 [%r35+56],%r53; mov.u64 %r54,1; st.u64 [%r35+40],%r54; .loc 1 335 28 cvt.u32.u64 %r55,%r54; st.u8 [%r35+28],%r55; .loc 1 336 24 st.u64 [%r35+8],%r24; .loc 1 337 29 mov.u64 call (%value_in),_gfortrani_xmallocarray37 27 st.u64 [%r35],%r24; bra $L71; $L70: .loc 1 339 12 cvta.global.u64 %r63,_gfortrani_compile_options; .loc 1 339 11 ld.u32 %r64,[%r63+36]; setp.eq.u32 %r65,%r64,0; @ %r65 bra $L71; .loc 1 341 8 cvta.const.u64 %r68,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 346 8 ld.u64 %r24,[%r35]; $L71: ld.u64 %r69,[%r35+40]; shl.b64 %r34,%r69,3; mov.u64 %r29,%r24; .loc 1 347 10 mov.u64 %r32,0; .loc 1 348 23 mov.u64 %r70,%r32; $L72: st.u64 [%r29],%r70; .loc 1 347 24 add.u64 %r32,%r32,1; .loc 1 347 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r71,%r30,%r32; @ %r71 bra $L72; $L65: .loc 1 349 1 ret; } minloc0_16_s1.o/1622802212_gfortran_minloc0_16_s1 .visible .func _gfortran_minloc0_16_s1); .file 1 "../../../libgfortran/generated/minloc0_16_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc0_16_s1 .visible .func _gfortran_mminloc0_16_s1_gfortran_sminloc0_16_s1 .visible .func _gfortran_sminloc0_16_s1in_ar4); // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func 
// [PTX payload, minloc0_16_s1 object, continued] Remainder of the machine-generated PTX for this member.
// Recoverable content:
//   .extern .func declarations: _gfortrani_bounds_iforeach_return, memcmp, _gfortrani_bounds_equal_extents
//   .const byte-array string initializers decoding (where intact) to: "MIN..." (presumably "MINLOC"),
//   "MASK argument", "Rank of array needs to be ..." (truncated), "Funny sized logical a..." (truncated)
//   .visible .func bodies: _gfortran_minloc0_16_s1, _gfortran_mminloc0_16_s1, _gfortran_sminloc0_16_s1.
//   The bodies allocate their result through _gfortrani_xmallocarray, consult _gfortrani_compile_options
//   for optional bounds checking (_gfortrani_bounds_iforeach_return, _gfortrani_bounds_equal_extents),
//   compare elements with memcmp, and abort via _gfortran_runtime_error.
// [Archive member minloc0_16_s4.o/1622802212] .file 1 "../../../libgfortran/generated/minloc0_16_s4.c";
//   global declarations for _gfortran_minloc0_16_s4, _gfortran_mminloc0_16_s4, _gfortran_sminloc0_16_s4,
//   the extern variable _gfortrani_compile_options, and the extern functions _gfortran_runtime_error
//   and _gfortrani_xmallocarray.
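The string literals in these objects survive only as .const byte-array initializers such as the {77,65,83,75,...} run noted above. The bytes are plain ASCII codes terminated by 0, so they can be read back directly. The short C sketch below is illustrative only (it is not part of the package); the array contents are copied from the dump on the assumption that this particular initializer is intact.

#include <stdio.h>

int main(void)
{
    /* Byte initializer copied from the dump; assumed to be the complete
     * "MASK argument" constant, including its terminating NUL. */
    const unsigned char lc[] = {77,65,83,75,32,97,114,103,117,109,101,110,116,0};

    /* Each element is one ASCII code unit; print until the terminator. */
    for (int i = 0; lc[i] != 0; ++i)
        putchar(lc[i]);
    putchar('\n');                     /* prints: MASK argument */
    return 0;
}

Running the same loop over the other (untruncated) initializers recovers the remaining runtime messages.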
// [PTX payload, minloc0_16_s4 object, continued] .extern .func declarations for
//   _gfortrani_bounds_iforeach_return, _gfortrani_memcmp_char4 and _gfortrani_bounds_equal_extents;
//   constant strings matching the s1 variant ("MIN...", "MASK argument",
//   "Rank of array needs to be ...", "Funny sized logical a...");
//   .visible .func bodies: _gfortran_minloc0_16_s4, _gfortran_mminloc0_16_s4, _gfortran_sminloc0_16_s4.
//   These differ from the s1 bodies mainly in comparing elements through _gfortrani_memcmp_char4
//   instead of memcmp.
// [Archive member maxloc1_4_s1.o/ 1622802212] .file 1 "../../../libgfortran/generated/maxloc1_4_s1.c";
//   global declarations for _gfortran_maxloc1_4_s1, _gfortran_mmaxloc1_4_s1, _gfortran_smaxloc1_4_s1
//   and the extern variable _gfortrani_compile_options (declaration list continues below).
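For orientation, the routines in these members implement the Fortran MINLOC intrinsic for character arrays; the suffix appears to encode the result-index kind and the character kind (s1 = kind 1, s4 = kind 4), which is consistent with the s4 bodies calling _gfortrani_memcmp_char4 where the s1 bodies call memcmp. The C sketch below shows the shape of a minloc0-style full reduction; the function name, flat-array layout and calling convention are illustrative assumptions, not the libgfortran interface.

#include <stddef.h>
#include <stdio.h>
#include <string.h>

/* base: n elements, each len bytes; returns the 1-based position of the
 * lexicographically smallest element, or 0 for an empty array (the Fortran
 * convention for MINLOC of a zero-sized argument). */
static size_t minloc0_char(const char *base, size_t n, size_t len)
{
    if (n == 0)
        return 0;
    size_t best = 0;
    for (size_t i = 1; i < n; ++i)
        if (memcmp(base + i * len, base + best * len, len) < 0)
            best = i;
    return best + 1;                   /* Fortran indices are 1-based */
}

int main(void)
{
    /* three 3-character elements: "bbb", "aaa", "ccc" -> minimum at position 2 */
    const char data[] = "bbbaaaccc";
    printf("%zu\n", minloc0_char(data, 3, 3));
    return 0;
}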
// [PTX payload, maxloc1_4_s1 object, continued] Remaining .extern declarations
//   (_gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return,
//   memcmp, _gfortrani_bounds_equal_extents) and constant strings whose byte initializers
//   decode (where intact) to: "MAXLOC", "... incorrect in MAXLOC intrinsic: is %ld, should be
//   between 1 and %l...", "Funny sized logical arr...", "MASK argumen...",
//   "Incorrect extent in return value of MAXLOC intrinsic in dimension %ld: is %ld, should be %l...",
//   "return value", "rank of return array incorrect in MAXLOC intrinsic".
//   .visible .func bodies: _gfortran_maxloc1_4_s1, _gfortran_mmaxloc1_4_s1, _gfortran_smaxloc1_4_s1.
// [Archive member maxloc1_4_s4.o/ 1622802212] .file 1 "../../../libgfortran/generated/maxloc1_4_s4.c";
//   declarations for _gfortran_maxloc1_4_s4, _gfortran_mmaxloc1_4_s4, _gfortran_smaxloc1_4_s4,
//   _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray,
//   _gfortrani_bounds_ifunction_return, _gfortrani_memcmp_char4 and _gfortrani_bounds_equal_extents,
//   the same MAXLOC message strings, and the start of the generated body of _gfortran_maxloc1_4_s4.
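The maxloc1_* members differ from the minloc0_* ones in that they reduce along a single dimension (the DIM form of MAXLOC) and so return one index per remaining extent, which is what the extra messages about the "return value ... in dimension %ld" guard. A hedged C sketch of that shape of reduction for a 2-D, column-major array of fixed-length character elements follows; names and layout are again assumptions for illustration, not the generated code's interface.

#include <stddef.h>
#include <stdio.h>
#include <string.h>

/* Reduce along the row dimension: for each of the cols columns, store the
 * 1-based row of the largest element into result[]. Elements are len bytes,
 * stored column-major, compared as raw bytes with memcmp. */
static void maxloc1_char(const char *base, size_t rows, size_t cols,
                         size_t len, size_t *result)
{
    for (size_t c = 0; c < cols; ++c) {
        size_t best = 0;
        for (size_t r = 1; r < rows; ++r) {
            const char *cur  = base + (c * rows + r) * len;
            const char *prev = base + (c * rows + best) * len;
            if (memcmp(cur, prev, len) > 0)
                best = r;
        }
        result[c] = rows ? best + 1 : 0;   /* 1-based; 0 for an empty extent */
    }
}

int main(void)
{
    /* 2 rows x 2 cols of 2-char elements, column-major:
       column 1 = {"ab","cd"}, column 2 = {"zz","aa"} */
    const char data[] = "abcdzzaa";
    size_t where[2];
    maxloc1_char(data, 2, 2, 2, where);
    printf("%zu %zu\n", where[0], where[1]);   /* prints: 2 1 */
    return 0;
}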
// [PTX payload, maxloc1_4_s4 object, continued] Continuation of the generated body of
//   _gfortran_maxloc1_4_s4 (element comparisons via _gfortrani_memcmp_char4, bounds checks via
//   _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents, result allocation via
//   _gfortrani_xmallocarray) followed by the opening register declarations and argument moves of
//   _gfortran_mmaxloc1_4_s4; the member is truncated at this point.
cvt.u16.u32 %r377,%r23; add.u16 %r376,%r377,-1; cvt.u32.u16 %r378,%r376; st.u8 [%r235+28],%r378; .loc 1 325 10 setp.eq.u64 %r379,%r151,0; @ %r379 bra $L74; bra $L112; $L76: .loc 1 314 47 mul.lo.u64 %r131,%r80,%r131; $L73: .loc 1 316 4 st.u64 [%r180+8],%r380; ld.u64 %r80,[%r177]; add.u64 %r381,%r80,-1; st.u64 [%r180+16],%r381; st.u64 [%r180],%r131; .loc 1 309 7 add.u64 %r180,%r180,24; add.u64 %r177,%r177,8; setp.ne.u64 %r382,%r141,%r180; @ %r382 bra $L76; bra $L72; $L74: .loc 1 328 4 st.u64 [%r235+48],%r151; mov.u64 %r384,-1; st.u64 [%r235+56],%r384; mov.u64 %r385,1; st.u64 [%r235+40],%r385; .loc 1 329 4 bra $L52; $L112: .loc 1 332 24387; call (%value_in),_gfortrani_xmallocarray88,[%value_in]; } .loc 1 332 22 st.u64 [%r235],%r388; bra $L77; $L67: .loc 1 337 19 ld.s8 %r390,[%r235+28]; .loc 1 337 10 setp.eq.u64 %r391,%r390,%r144; @ %r391 bra $L78; .loc 1 338 2 cvta.const.u64 %r392,$LC5;stack; call _gfortran_runtime_error78: .loc 1 340 11 cvta.global.u64 %r394,_gfortrani_compile_options; .loc 1 340 10 ld.u32 %r395,[%r394+36]; setp.eq.u32 %r396,%r395,0; @ %r396 bra $L77; .loc 1 342 4 add.u64 %r471,%frame,360; cvta.const.u64 %r400,$LC2r471399400; call _gfortrani_bounds_ifunction_return344 4 cvta.const.u64 %r404,$LC3r2r404400; call _gfortrani_bounds_equal_extents$L77: .loc 1 349 3 setp.ne.u64 %r406,%r144,0; @ %r406 bra $L79; $L82: .loc 1 357 8 ld.u64 %r159,[%r235]; .loc 1 358 8 ld.u64 %r157,[%r236]; .loc 1 360 9 setp.ne.u64 %r407,%r157,0; @ %r407 bra $L80; bra $L52; $L79: add.u64 %r198,%frame,480; add.u64 %r197,%r235,40; add.u64 %r194,%frame,120; cvt.u32.u32 %r409,%r23; cvt.s64.s8 %r408,%r409; add.u64 %r184,%r408,-1; .loc 1 349 3 mov.u64 %r164,0; add.u64 %r471,%frame,360; .loc 1 351 16 mov.u64 %r410,%r164; $L81: st.u64 [%r198],%r410; .loc 1 352 18 ld.u64 %r411,[%r197]; st.u64 [%r194],%r411; .loc 1 353 17 shl.b64 %r413,%r164,3; add.u64 %r414,%r471,%r413; .loc 1 353 10 ld.u64 %r415,[%r414]; setp.le.s64 %r416,%r415,0; @ %r416 bra $L52; .loc 1 349 26 add.u64 %r164,%r164,1; .loc 1 349 3 add.u64 %r198,%r198,8; add.u64 %r197,%r197,24; add.u64 %r194,%r194,8; setp.ne.u64 %r417,%r164,%r184; @ %r417 bra $L81; bra $L82; $L80: .loc 1 283 12 mul.lo.u64 %r46,%r44,%r35; .loc 1 372 32 mul.lo.u64 %r418,%r42,%r240; shl.b64 %r96,%r418,2; .loc 1 396 22 ld.u64 %r215,[%frame+240]; .loc 1 396 12 shl.b64 %r217,%r215,2; .loc 1 397 23 ld.u64 %r218,[%frame]; .loc 1 398 22 ld.u64 %r220,[%frame+120]; .loc 1 398 12 shl.b64 %r223,%r220,2; setp.le.s64 %r469,%r144,1; .loc 1 384 13 setp.eq.u32 %r474,%r239,0; cvt.u32.u32 %r475,%r23; cvt.s64.s8 %r476,%r475; add.u64 %r477,%r476,-1; add.u64 %r478,%frame,240; add.u64 %r479,%frame,360; add.u64 %r480,%frame,120; $L95: .loc 1 349 3 mov.u64 %r163,%r158; mov.u64 %r153,%r157; .loc 1 372 9 mov.u64 %r152,0; $L86: .loc 1 375 7 ld.s8 %r224,[%r163]; .loc 1 375 6 setp.eq.u32 %r466,%r224,0; @ %r466 bra $L83; .loc 1 378 11 cvt.u32.u64 %r420,%r152; add.u32 %r133,%r420,1; .loc 1 382 6 setp.le.s64 %r421,%r145,%r152; @ %r421 bra $L84; add.u64 %r201,%r152,1; mov.u64 %r162,%r153; bra $L85; $L83: .loc 1 372 24 add.u64 %r152,%r152,1; .loc 1 372 32 add.u64 %r153,%r153,%r96; .loc 1 372 47 add.u64 %r163,%r163,%r46; .loc 1 372 2 setp.ne.u64 %r422,%r145,%r152; @ %r422 bra $L86; .loc 1 371 9 mov.u32 %r133,%r224; bra $L84; $L85: .loc 1 384 6 @ %r466 bra $L87; .loc 1 384 13 @ %r474 bra $L88; .loc 1 4216153240; call (%value_in),_gfortrani_memcmp_char4428,[%value_in]; } .loc 1 384 13 setp.lt.s32 %r429,%r428,0; @ %r429 bra $L87; $L89: .loc 1 388 14 cvt.u32.u64 %r133,%r201; mov.u64 %r153,%r162; bra $L87; $L88: .loc 1 
4216153240; call (%value_in),_gfortrani_memcmp_char4433,[%value_in]; } .loc 1 384 13 setp.gt.s32 %r434,%r433,0; @ %r434 bra $L89; $L87: .loc 1 382 31 add.u64 %r162,%r162,%r96; .loc 1 382 46 add.u64 %r163,%r163,%r46; .loc 1 382 6 add.u64 %r202,%r201,1; setp.eq.u64 %r435,%r145,%r201; @ %r435 bra $L84; .loc 1 384 7 ld.s8 %r224,[%r163]; mov.u64 %r201,%r202; setp.eq.u32 %r466,%r224,0; bra $L85; $L84: .loc 1 392 8 st.u32 [%r159],%r133; .loc 1 395 15 ld.u64 %r436,[%frame+480]; add.u64 %r100,%r436,1; st.u64 [%frame+480],%r100; .loc 1 396 12 add.u64 %r157,%r157,%r217; .loc 1 397 13 add.u64 %r158,%r158,%r218; .loc 1 398 12 add.u64 %r159,%r159,%r223; .loc 1 400 32 ld.u64 %r29,[%frame+360]; .loc 1 400 13 setp.ne.u64 %r437,%r29,%r100; @ %r437 bra $L95; .loc 1 404 13 mov.u64 %r438,0; st.u64 [%frame+480],%r438; .loc 1 407 23 mul.lo.u64 %r106,%r29,%r215; .loc 1 408 24 mul.lo.u64 %r439,%r29,%r218; .loc 1 408 10 sub.u64 %r154,%r158,%r439; .loc 1 409 23 mul.lo.u64 %r440,%r29,%r220; .loc 1 409 9 shl.b64 %r441,%r440,2; sub.u64 %r155,%r159,%r441; .loc 1 411 7 @ %r469 bra $L52; add.u64 %r212,%frame,488; mov.u64 %r211,8; .loc 1 410 5 mov.u64 %r156,1; bra $L92; $L93: .loc 1 404 13 st.u64 [%r212],%r438; .loc 1 407 23 mul.lo.u64 %r106,%r116,%r115; .loc 1 408 24 mul.lo.u64 %r447,%r118,%r115; .loc 1 408 10 sub.u64 %r154,%r158,%r447; .loc 1 409 23 mul.lo.u64 %r448,%r120,%r115; .loc 1 409 9 shl.b64 %r449,%r448,2; sub.u64 %r155,%r159,%r449; .loc 1 410 5 add.u64 %r156,%r156,1; .loc 1 411 7 add.u64 %r212,%r212,8; add.u64 %r211,%r211,8; setp.eq.u64 %r450,%r156,%r477; @ %r450 bra $L52; $L92: .loc 1 419 16 ld.u64 %r451,[%r212]; add.u64 %r115,%r451,1; st.u64 [%r212],%r115; .loc 1 420 23 add.u64 %r453,%r478,%r211; ld.u64 %r116,[%r453]; .loc 1 420 13 sub.u64 %r454,%r116,%r106; shl.b64 %r455,%r454,2; add.u64 %r157,%r157,%r455; .loc 1 421 24 add.u64 %r456,%frame,%r211; ld.u64 %r118,[%r456]; .loc 1 421 14 add.u64 %r158,%r154,%r118; .loc 1 422 23 add.u64 %r458,%r480,%r211; ld.u64 %r120,[%r458]; .loc 1 422 13 shl.b64 %r459,%r120,2; add.u64 %r159,%r155,%r459; .loc 1 400 32 add.u64 %r461,%r479,%r211; ld.u64 %r124,[%r461]; .loc 1 400 13 setp.eq.u64 %r462,%r115,%r124; @ %r462 bra $L93; bra $L95; $L111: .loc 1 305 6 ld.u64 %r463,[%r235]; setp.eq.u64 %r464,%r463,0; @ ! 
%r464 bra $L67; bra $L94; $L52: .loc 1 42_gfortran_smaxloc1_4_s4 .visible .func _gfortran_smaxloc1_4_slocal .align 16 .b8 %frame_ar[368u64 %r89; .reg .u643u64 %r125; .reg .u64predu16 %r180; .reg .predpred %r222; .reg .predu64 %r231; .reg .u16 %r233; .reg .u16 %r234; .reg .u32pred %r259; .reg .pred %r260; .reg .u64u64 %r267; .reg .u32pred %r303; .reg .u64 %r304; .reg .pred %r305; .reg .u64 %r306; .reg .u64 %r307; .reg .u32mov.u64 %r156,%ar0; mov.u64 %r157,%ar1; mov.u64 %r158,%ar2; mov.u64 %r159,%ar3; mov.u32 %r160,%ar4; mov.u64 %r161,%ar5; .loc 1 449 6 setp.eq.u64 %r162,%r159,0; @ %r162 bra $L114; .loc 1 449 20 ld.u32 %r163,[%r159]; setp.eq.u32 %r164,%r163,0; @ %r164 bra $L115; $L114: .loc 1 452 7r1571580161; call _gfortran_maxloc1_4_s4); } .loc 1 456 7 bra $L113; $L115: .loc 1 459 10 ld.u64 %r23,[%r158]; .loc 1 459 7 add.u64 %r95,%r23,-1; .loc 1 460 10 ld.s8 %r24,[%r157+28]; .loc 1 460 38 add.u32 %r26,%r24,-1; .loc 1 460 8 cvt.s64.s32 %r306,%r26; .loc 1 462 7 shr.u64 %r171,%r95,63; set.u32.gt.s64 %r173,%r95,%r306; neg.s32 %r174,%r173; cvt.u16.u64 %r177,%r171; cvt.u16.u32 %r178,%r174; or.b16 %r176,%r177,%r178; .loc 1 462 6 cvt.u32.u16 %r179,%r176; cvt.u16.u8 %r180,%r179; setp.ne.u16 %r181,%r180,0; @ %r181 bra $L117; .loc 1 469 3 setp.ne.u64 %r182,%r95,0; @ %r182 bra $L118; $L124: .loc 1 477 3 setp.lt.s64 %r183,%r95,%r306; @ %r183 bra $L119; bra $L172; $L117: .loc 1 464 7 cvt.u32.u32 %r186,%r24; cvt.s64.s8 %r185,%r186; st.u64 [%stack+8],%r1851841stack; call _gfortran_runtime_error118: add.u64 %r92,%r157,48; add.u64 %r114,%frame,120; add.u64 %r146,%r157,56; add.u64 %r188,%r157,24; add.u64 %r190,%r23,%r23; add.u64 %r191,%r190,%r23; shl.b64 %r192,%r191,3; add.u64 %r152,%r188,%r192; .loc 1 474 12 mov.u64 %r312,0; $L123: .loc 1 471 19 ld.u64 %r194,[%r146]; ld.u64 %r196,[%r92]; sub.u64 %r195,%r194,%r196; .loc 1 471 50 mad.lo.u64 %r38,%r195,%r161,%r161; .loc 1 473 10 setp.le.s64 %r197,%r38,0; @ %r197 bra $L121; .loc 1 471 17 st.u64 [%r114],%r38; bra $L122; $L121: .loc 1 474 12 st.u64 [%r114],%r312; $L122: .loc 1 469 3 add.u64 %r92,%r92,24; add.u64 %r114,%r114,8; add.u64 %r146,%r146,24; setp.ne.u64 %r199,%r92,%r152; @ %r199 bra $L123; bra $L124; $L172: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r200,%r107,0; @ ! 
%r200 bra $L126; bra $L125; $L119: add.u64 %r202,%r23,%r23; add.u64 %r203,%r202,%r23; shl.b64 %r204,%r203,3; add.u64 %r205,%r204,48; add.u64 %r83,%r157,%r205; add.u64 %r304,%frame,120; shl.b64 %r207,%r23,3; add.u64 %r208,%r207,-8; add.u64 %r144,%r304,%r208; add.u64 %r209,%r157,48; cvt.u32.u32 %r211,%r24; cvt.s64.s8 %r210,%r211; add.u64 %r213,%r210,%r210; add.u64 %r214,%r213,%r210; shl.b64 %r215,%r214,3; add.u64 %r72,%r209,%r215; .loc 1 483 12 mov.u64 %r311,0; $L129: .loc 1 480 2 ld.u64 %r217,[%r83+8]; ld.u64 %r219,[%r83]; sub.u64 %r218,%r217,%r219; .loc 1 480 37 mad.lo.u64 %r46,%r218,%r161,%r161; .loc 1 482 10 setp.le.s64 %r220,%r46,0; @ %r220 bra $L127; .loc 1 479 17 st.u64 [%r144],%r46; bra $L128; $L127: .loc 1 483 12 st.u64 [%r144],%r311; $L128: .loc 1 477 3 add.u64 %r83,%r83,24; add.u64 %r144,%r144,8; setp.ne.u64 %r222,%r72,%r83; @ %r222 bra $L129; bra $L173; $L125: .loc 1 490 7 setp.eq.u64 %r223,%r306,0; @ %r223 bra $L131; add.u64 %r304,%frame,120; $L151: add.u64 %r125,%r156,40; mov.u64 %r123,%r304; add.u64 %r224,%r156,16; cvt.u32.u32 %r226,%r24; cvt.s64.s8 %r225,%r226; add.u64 %r228,%r225,%r225; add.u64 %r229,%r228,%r225; shl.b64 %r230,%r229,3; add.u64 %r117,%r224,%r230; .loc 1 493 10 mov.u64 %r89,1; .loc 1 497 4 mov.u64 %r249,0; bra $L132; $L131: .loc 1 501 24 mov.u64 %r231,0; st.u64 [%r156+8],%r231; .loc 1 502 28 cvt.u16.u32 %r234,%r24; add.u16 %r233,%r234,-1; cvt.u32.u16 %r235,%r233; st.u8 [%r156+28],%r235; .loc 1 504 20 add.u32 %r236,%r24,-2; cvt.s64.s32 %r56,%r236; add.u64 %r238,%r56,%r56; add.u64 %r239,%r238,%r56; shl.b64 %r240,%r239,3; add.u64 %r241,%r156,%r240; .loc 1 504 67 shl.b64 %r243,%r56,3; add.u64 %r244,%frame,%r243; .loc 1 504 59 ld.u64 %r246,[%r241+40]; ld.u64 %r247,[%r244+120]; mul.lo.u64 %r103,%r246,%r247; .loc 1 506 10 setp.eq.u64 %r248,%r103,0; @ %r248 bra $L133; bra $L174; $L135: .loc 1 495 48 mul.lo.u64 %r89,%r52,%r89; $L132: .loc 1 497 4 st.u64 [%r125+8],%r249; ld.u64 %r52,[%r123]; add.u64 %r250,%r52,-1; st.u64 [%r125+16],%r250; st.u64 [%r125],%r89; .loc 1 490 7 add.u64 %r125,%r125,24; add.u64 %r123,%r123,8; setp.ne.u64 %r251,%r117,%r125; @ %r251 bra $L135; bra $L131; $L133: .loc 1 509 4 st.u64 [%r156+48],%r103; mov.u64 %r253,-1; st.u64 [%r156+56],%r253; mov.u64 %r254,1; st.u64 [%r156+40],%r254; .loc 1 510 4 bra $L113; $L174: .loc 1 513 24256; call (%value_in),_gfortrani_xmallocarray257,[%value_in]; } mov.u64 %r107,%r257; .loc 1 513 22 st.u64 [%r156],%r107; $L139: .loc 1 539 3 setp.ne.u64 %r259,%r306,0; @ %r259 bra $L136; bra $L137; $L126: .loc 1 517 19 ld.s8 %r61,[%r156+28]; .loc 1 517 10 setp.eq.u64 %r260,%r61,%r306; @ %r260 bra $L138; .loc 1 518 2 st.u64 [%stack+8],%r306; st.u64 [%stack],%r61;_gfortran_runtime_error138: .loc 1 523 11 cvta.global.u64 %r263,_gfortrani_compile_options; .loc 1 523 10 ld.u32 %r264,[%r263+36]; setp.eq.u32 %r265,%r264,0; @ %r265 bra $L139; .loc 1 525 4 setp.eq.u64 %r266,%r306,0; @ %r266 bra $L137; add.u64 %r115,%r156,48; add.u64 %r105,%frame,120; cvt.u32.u32 %r268,%r24; cvt.s64.s8 %r267,%r268; add.u64 %r86,%r267,-1; .loc 1 525 10 mov.u64 %r113,0; $L141: .loc 1 529 21 ld.u64 %r270,[%r115+8]; add.u64 %r269,%r270,1; .loc 1 529 19 ld.u64 %r271,[%r115]; sub.u64 %r100,%r269,%r271; .loc 1 530 18 ld.u64 %r70,[%r105]; .loc 1 525 25 add.u64 %r113,%r113,1; .loc 1 530 11 setp.eq.u64 %r272,%r70,%r100; @ %r272 bra $L140; .loc 1 531 3 st.u64 [%stack+16],%r70; st.u64 [%stack+8],%r100; st.u64 [%stack],%r113;_gfortran_runtime_error140: .loc 1 525 4 add.u64 %r115,%r115,24; add.u64 %r105,%r105,8; setp.ne.u64 %r275,%r86,%r113; @ %r275 bra $L141; 
$L136: .loc 1 541 16 cvt.u16.u32 %r277,%r24; setp.le.s16 %r278,%r277,1; @ %r278 bra $L142; shl.b64 %r276,%r306,3; bra $L143; $L142: mov.u64 %r276,8; $L143: add.u64 %r280,%frame,240; mov.u32 %r2842766,[%value_in]; } add.u64 %r130,%r156,40; mov.u64 %r128,%frame; mov.u64 %r109,0; $L144: .loc 1 542 18 ld.u64 %r288,[%r130]; st.u64 [%r128],%r288; .loc 1 539 26 add.u64 %r109,%r109,1; .loc 1 539 3 add.u64 %r130,%r130,24; add.u64 %r128,%r128,8; setp.gt.s64 %r289,%r306,%r109; @ %r289 bra $L144; $L137: ld.u64 %r88,[%frame+240]; .loc 1 551 22 ld.u64 %r75,[%frame]; .loc 1 551 12 shl.b64 %r76,%r75,2; .loc 1 553 32 ld.u64 %r65,[%frame+120]; .loc 1 560 23 mul.lo.u64 %r49,%r65,%r75; setp.le.s64 %r305,%r306,1; .loc 1 549 13 mov.u32 %r290,0; cvt.u32.u32 %r308,%r24; cvt.s64.s8 %r309,%r308; add.u64 %r310,%r309,-1; $L150: st.u32 [%r107],%r290; .loc 1 550 15 add.u64 %r88,%r88,1; .loc 1 551 12 add.u64 %r107,%r107,%r76; .loc 1 553 13 setp.ne.u64 %r291,%r65,%r88; @ %r291 bra $L150; .loc 1 562 7 @ ! %r305 bra $L175; bra $L113; $L149: .loc 1 557 13 st.u64 [%r108],%r307; .loc 1 560 23 mul.lo.u64 %r78,%r81,%r80; .loc 1 561 5 add.u64 %r106,%r106,1; .loc 1 562 7 add.u64 %r108,%r108,8; add.u64 %r138,%r138,8; add.u64 %r137,%r137,8; setp.ne.u64 %r294,%r106,%r310; @ %r294 bra $L148; bra $L113; $L175: add.u64 %r108,%frame,248; add.u64 %r138,%frame,8; add.u64 %r137,%frame,128; .loc 1 560 23 mov.u64 %r78,%r49; .loc 1 561 5 mov.u64 %r106,1; .loc 1 557 13 mov.u64 %r307,0; $L148: .loc 1 566 16 ld.u64 %r299,[%r108]; add.u64 %r80,%r299,1; st.u64 [%r108],%r80; .loc 1 567 23 ld.u64 %r81,[%r138]; .loc 1 567 13 sub.u64 %r300,%r81,%r78; shl.b64 %r301,%r300,2; add.u64 %r107,%r107,%r301; .loc 1 553 32 ld.u64 %r84,[%r137]; .loc 1 553 13 setp.eq.u64 %r302,%r80,%r84; @ %r302 bra $L149; .loc 1 557 13 mov.u64 %r88,0; bra $L150; $L173: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r303,%r107,0; @ ! 
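The comparison loop that dominates these routines (the PTX around .loc 1 170-181) reduces one column of the string array to the 1-based index of its maximum element. A minimal C sketch of that reduction follows, assuming a flattened view of a single column; the names maxloc_string_column and GFC_INTEGER_4 are illustrative stand-ins rather than the generated libgfortran code, and plain memcmp stands in for the kind-specific _gfortrani_memcmp_char4 comparison.

/* Minimal sketch, not the generated libgfortran source: reduce one column
   of a string array to the 1-based position of its maximum element.  */
#include <stdint.h>
#include <stddef.h>
#include <string.h>

typedef int32_t GFC_INTEGER_4;   /* assumed stand-in for the kind=4 result */

static GFC_INTEGER_4
maxloc_string_column (const char *base, size_t elem_bytes,
                      size_t string_bytes, size_t extent)
{
  const char *maxval = NULL;     /* no candidate seen yet */
  GFC_INTEGER_4 result = 0;      /* 0 is returned for a zero-sized column */

  for (size_t n = 0; n < extent; n++, base += elem_bytes)
    {
      /* The first element, or any strictly larger string, becomes the max. */
      if (maxval == NULL || memcmp (base, maxval, string_bytes) > 0)
        {
          maxval = base;
          result = (GFC_INTEGER_4) (n + 1);   /* Fortran indices are 1-based */
        }
    }
  return result;
}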
[Extraction-garbled PTX assembly, summarized: the tail of _gfortran_smaxloc1_4_s4, then the maxloc1_8_s1.o archive member. It declares _gfortran_maxloc1_8_s1, _gfortran_mmaxloc1_8_s1 and _gfortran_smaxloc1_8_s1, compiled from ../../../libgfortran/generated/maxloc1_8_s1.c, together with the external symbols _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents and memcmp. Its read-only string table carries the MAXLOC diagnostics; the decodable parts read approximately: "... incorrect in MAXLOC intrinsic: is %ld, should be between 1 and %ld", "Funny sized logical array", "rank of return array incorrect in MAXLOC intrinsic: is %ld, should be %ld", "MASK argument", "Incorrect extent in return value of MAXLOC intrinsic in dimension %ld: is %ld, should be %ld" and "return value". The body of _gfortran_maxloc1_8_s1 and the prologue of _gfortran_mmaxloc1_8_s1 follow; these kind=1 character variants use plain memcmp for the element comparison and store integer(kind=8) positions. As in the other members, register declarations and call argument lists are corrupted in the dump.]
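The guarded checks visible here (.loc 1 337-344: a load from offset 36 of _gfortrani_compile_options followed by conditional calls to _gfortrani_bounds_ifunction_return and _gfortrani_bounds_equal_extents) are the library's optional run-time validation of the result and MASK shapes. A hedged C sketch of that control flow is below; the struct layout, the field name bounds_check, and the checker parameters are assumptions for illustration, not the real libgfortran ABI.

/* Sketch only: the real _gfortrani_compile_options layout and the real
   checker prototypes differ; this just shows the control flow the PTX
   encodes (rank check first, extent checks only when bounds checking is on). */
#include <stdio.h>
#include <stdlib.h>

struct compile_options_sketch
{
  int pad[9];
  int bounds_check;              /* assumed meaning of the word at offset 36 */
};

static struct compile_options_sketch opts_sketch;   /* stand-in global */

static void
validate_maxloc_result (int result_rank, int expected_rank,
                        void (*check_return_extents) (void),
                        void (*check_mask_extents) (void))
{
  if (result_rank != expected_rank)
    {
      fprintf (stderr,
               "rank of return array incorrect in MAXLOC intrinsic:"
               " is %d, should be %d\n", result_rank, expected_rank);
      exit (2);                  /* the library calls _gfortran_runtime_error */
    }
  if (opts_sketch.bounds_check)
    {
      check_return_extents ();   /* stands in for _gfortrani_bounds_ifunction_return */
      check_mask_extents ();     /* stands in for _gfortrani_bounds_equal_extents */
    }
}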
[Extraction-garbled PTX assembly, summarized: the remainder of _gfortran_mmaxloc1_8_s1 and the scalar-mask wrapper _gfortran_smaxloc1_8_s1. The masked routine reads one signed byte per mask element (ld.s8) and only lets elements with a non-zero mask byte compete in the memcmp-based comparison, so a fully masked column leaves the stored position at 0. The scalar-mask wrapper tests the optional MASK argument at entry (.loc 1 449-456): if it is absent or true it forwards to _gfortran_maxloc1_8_s1, otherwise it sets up the result array and fills every position with zero. Register declarations and call argument lists are corrupted in the dump.]
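The "s" entry points are thin wrappers over the unmasked kernels. A minimal C sketch of the dispatch that the PTX at .loc 1 449-456 and 539-553 performs follows; the wrapper signature, types and helper names are simplified stand-ins, not the exported libgfortran prototypes.

/* Sketch of the scalar-mask wrapper: delegate when MASK is absent or true,
   otherwise zero-fill the result.  Names and types are illustrative only.  */
#include <stdint.h>
#include <stddef.h>

typedef int64_t GFC_INTEGER_8;   /* assumed stand-in for the kind=8 result */

void
smaxloc1_sketch (GFC_INTEGER_8 *result, size_t result_extent,
                 const void *array, const int *pdim, const int32_t *mask,
                 void (*unmasked_maxloc1) (GFC_INTEGER_8 *, const void *,
                                           const int *))
{
  if (mask == NULL || *mask != 0)
    {
      /* Normal case: behave exactly like the unmasked routine. */
      unmasked_maxloc1 (result, array, pdim);
      return;
    }

  /* MASK == .false.: every reduced position is 0 (no location found). */
  for (size_t i = 0; i < result_extent; i++)
    result[i] = 0;
}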
[Extraction-garbled PTX assembly, summarized: the tail of _gfortran_smaxloc1_8_s1, then the maxloc1_8_s4.o archive member, which declares _gfortran_maxloc1_8_s4, _gfortran_mmaxloc1_8_s4 and _gfortran_smaxloc1_8_s4 compiled from libgfortran/generated/maxloc1_8_s4.c, again referencing _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_memcmp_char4 and _gfortrani_bounds_equal_extents, and carrying the same MAXLOC diagnostic string table. The body of _gfortran_maxloc1_8_s4 and the start of _gfortran_mmaxloc1_8_s4 follow; these variants compare character(kind=4) elements with _gfortrani_memcmp_char4 and store integer(kind=8) positions. The dump is truncated mid-function here and, as above, register declarations and call argument lists are corrupted.]
@ %r305 bra $L60; bra $L111; $L58: .loc 1 280 5 cvta.const.u64 %r306306_gfortran_runtime_error9: add.u64 %r103,%r231,40; add.u64 %r213,%frame,240; add.u64 %r214,%r233,40; mov.u64 %r216,%frame; add.u64 %r217,%frame,360; add.u64 %r308,%r231,16; add.u64 %r310,%r22,%r22; add.u64 %r311,%r310,%r22; shl.b64 %r312,%r311,3; add.u64 %r226,%r308,%r312; .loc 1 292 12 mov.u64 %r476,0; $L64: .loc 1 287 51 ld.u64 %r314,[%r103]; mul.lo.u64 %r313,%r314,%r235; .loc 1 287 18 st.u64 [%r213],%r313; .loc 1 288 20 ld.u64 %r316,[%r214]; mul.lo.u64 %r315,%r316,%r35; .loc 1 288 18 st.u64 [%r216],%r315; .loc 1 289 19 ld.u64 %r318,[%r103+16]; add.u64 %r317,%r318,1; ld.u64 %r319,[%r103+8]; sub.u64 %r57,%r317,%r319; .loc 1 291 10 setp.lt.s64 %r320,%r57,0; @ %r320 bra $L62; .loc 1 289 17 st.u64 [%r217],%r57; bra $L63; $L62: .loc 1 292 12 st.u64 [%r217],%r476; $L63: .loc 1 285 3 add.u64 %r103,%r103,24; add.u64 %r213,%r213,8; add.u64 %r214,%r214,24; add.u64 %r216,%r216,8; add.u64 %r217,%r217,8; setp.ne.u64 %r322,%r103,%r226; @ %r322 bra $L64; bra $L65; $L111: .loc 1 305 6 ld.u64 %r323,[%r230]; setp.eq.u64 %r324,%r323,0; @ ! %r324 bra $L67; bra $L66; $L60: add.u64 %r326,%r22,%r22; add.u64 %r327,%r326,%r22; shl.b64 %r328,%r327,3; add.u64 %r58,%r328,40; add.u64 %r77,%r231,%r58; shl.b64 %r329,%r22,3; add.u64 %r148,%r329,-8; add.u64 %r460,%frame,240; add.u64 %r39,%r460,%r148; add.u64 %r173,%r233,%r58; add.u64 %r143,%frame,%r148; add.u64 %r465,%frame,360; add.u64 %r178,%r465,%r148; add.u64 %r332,%r231,40; cvt.u32.u32 %r334,%r23; cvt.s64.s8 %r333,%r334; add.u64 %r336,%r333,%r333; add.u64 %r337,%r336,%r333; shl.b64 %r338,%r337,3; add.u64 %r100,%r332,%r338; .loc 1 302 12 mov.u64 %r475,0; $L70: .loc 1 297 55 ld.u64 %r340,[%r77]; mul.lo.u64 %r339,%r340,%r235; .loc 1 297 18 st.u64 [%r39],%r339; .loc 1 298 20 ld.u64 %r342,[%r173]; mul.lo.u64 %r341,%r342,%r35; .loc 1 298 18 st.u64 [%r143],%r341; .loc 1 299 19 ld.u64 %r344,[%r77+16]; add.u64 %r343,%r344,1; ld.u64 %r345,[%r77+8]; sub.u64 %r75,%r343,%r345; .loc 1 301 10 setp.lt.s64 %r346,%r75,0; @ %r346 bra $L68; .loc 1 299 17 st.u64 [%r178],%r75; bra $L69; $L68: .loc 1 302 12 st.u64 [%r178],%r475; $L69: .loc 1 295 3 add.u64 %r77,%r77,24; add.u64 %r39,%r39,8; add.u64 %r173,%r173,24; add.u64 %r143,%r143,8; add.u64 %r178,%r178,8; setp.ne.u64 %r348,%r77,%r100; @ %r348 bra $L70; bra $L112; $L66: .loc 1 309 7 setp.eq.u64 %r349,%r150,0; @ %r349 bra $L72; add.u64 %r465,%frame,360; $L94: add.u64 %r130,%r230,40; mov.u64 %r128,%r465; cvt.u32.u32 %r351,%r23; cvt.s64.s8 %r350,%r351; add.u64 %r353,%r350,%r350; add.u64 %r354,%r353,%r350; shl.b64 %r355,%r354,3; add.u64 %r356,%r230,16; add.u64 %r92,%r355,%r356; .loc 1 312 10 mov.u64 %r135,1; .loc 1 316 4 mov.u64 %r375,0; bra $L73; $L72: .loc 1 320 20 add.u32 %r357,%r23,-2; cvt.s64.s32 %r83,%r357; add.u64 %r359,%r83,%r83; add.u64 %r360,%r359,%r83; shl.b64 %r361,%r360,3; add.u64 %r362,%r230,%r361; .loc 1 320 67 shl.b64 %r364,%r83,3; add.u64 %r365,%frame,%r364; .loc 1 320 59 ld.u64 %r367,[%r362+40]; ld.u64 %r368,[%r365+360]; mul.lo.u64 %r157,%r367,%r368; .loc 1 322 24 mov.u64 %r369,0; st.u64 [%r230+8],%r369; .loc 1 323 28 cvt.u16.u32 %r372,%r23; add.u16 %r371,%r372,-1; cvt.u32.u16 %r373,%r371; st.u8 [%r230+28],%r373; .loc 1 325 10 setp.eq.u64 %r374,%r157,0; @ %r374 bra $L74; bra $L113; $L76: .loc 1 314 47 mul.lo.u64 %r135,%r80,%r135; $L73: .loc 1 316 4 st.u64 [%r130+8],%r375; ld.u64 %r80,[%r128]; add.u64 %r376,%r80,-1; st.u64 [%r130+16],%r376; st.u64 [%r130],%r135; .loc 1 309 7 add.u64 %r130,%r130,24; add.u64 %r128,%r128,8; setp.ne.u64 %r377,%r92,%r130; @ 
%r377 bra $L76; bra $L72; $L74: .loc 1 328 4 st.u64 [%r230+48],%r157; mov.u64 %r379,-1; st.u64 [%r230+56],%r379; mov.u64 %r380,1; st.u64 [%r230+40],%r380; .loc 1 329 4 bra $L52; $L113: .loc 1 332 24 mov.u64 %r38382; call (%value_in),_gfortrani_xmallocarray83,[%value_in]; } .loc 1 332 22 st.u64 [%r230],%r383; bra $L77; $L67: .loc 1 337 19 ld.s8 %r385,[%r230+28]; .loc 1 337 10 setp.eq.u64 %r386,%r385,%r150; @ %r386 bra $L78; .loc 1 338 2 cvta.const.u64 %r387387_gfortran_runtime_error78: .loc 1 340 11 cvta.global.u64 %r389,_gfortrani_compile_options; .loc 1 340 10 ld.u32 %r390,[%r389+36]; setp.eq.u32 %r391,%r390,0; @ %r391 bra $L77; .loc 1 342 4 add.u64 %r465,%frame,360; cvta.const.u64 %r395,$LC2r394395; call _gfortrani_bounds_ifunction_return344 4 cvta.const.u64 %r399,$LC3r2399395; call _gfortrani_bounds_equal_extents$L77: .loc 1 349 3 setp.ne.u64 %r401,%r150,0; @ %r401 bra $L79; $L82: .loc 1 357 8 ld.u64 %r165,[%r230]; .loc 1 358 8 ld.u64 %r163,[%r231]; .loc 1 360 9 setp.ne.u64 %r402,%r163,0; @ %r402 bra $L80; bra $L52; $L79: add.u64 %r174,%frame,480; add.u64 %r166,%r230,40; add.u64 %r145,%frame,120; cvt.u32.u32 %r404,%r23; cvt.s64.s8 %r403,%r404; add.u64 %r131,%r403,-1; .loc 1 349 3 mov.u64 %r171,0; add.u64 %r465,%frame,360; .loc 1 351 16 mov.u64 %r405,%r171; $L81: st.u64 [%r174],%r405; .loc 1 352 18 ld.u64 %r406,[%r166]; st.u64 [%r145],%r406; .loc 1 353 17 shl.b64 %r408,%r171,3; add.u64 %r409,%r465,%r408; .loc 1 353 10 ld.u64 %r410,[%r409]; setp.le.s64 %r411,%r410,0; @ %r411 bra $L52; .loc 1 349 26 add.u64 %r171,%r171,1; .loc 1 349 3 add.u64 %r174,%r174,8; add.u64 %r166,%r166,24; add.u64 %r145,%r145,8; setp.ne.u64 %r412,%r131,%r171; @ %r412 bra $L81; bra $L82; $L80: .loc 1 283 12 mul.lo.u64 %r45,%r42,%r35; .loc 1 372 32 mul.lo.u64 %r413,%r40,%r235; shl.b64 %r95,%r413,2; .loc 1 396 22 ld.u64 %r197,[%frame+240]; .loc 1 396 12 shl.b64 %r200,%r197,2; .loc 1 397 23 ld.u64 %r201,[%frame]; .loc 1 398 22 ld.u64 %r203,[%frame+120]; .loc 1 398 12 shl.b64 %r205,%r203,3; setp.le.s64 %r463,%r150,1; .loc 1 384 13 setp.eq.u32 %r468,%r234,0; cvt.u32.u32 %r469,%r23; cvt.s64.s8 %r470,%r469; add.u64 %r471,%r470,-1; add.u64 %r472,%frame,240; add.u64 %r473,%frame,360; add.u64 %r474,%frame,120; $L95: .loc 1 349 3 mov.u64 %r170,%r164; mov.u64 %r159,%r163; .loc 1 372 9 mov.u64 %r139,0; $L86: .loc 1 375 7 ld.s8 %r206,[%r170]; mov.u64 %r190,%r139; .loc 1 378 11 add.u64 %r139,%r139,1; .loc 1 375 6 setp.eq.u32 %r459,%r206,0; @ %r459 bra $L83; .loc 1 382 6 setp.gt.s64 %r415,%r151,%r190; @ %r415 bra $L96; bra $L85; $L83: .loc 1 372 32 add.u64 %r159,%r159,%r95; .loc 1 372 47 add.u64 %r170,%r170,%r45; .loc 1 372 2 setp.ne.u64 %r416,%r151,%r139; @ %r416 bra $L86; .loc 1 371 9 mov.u64 %r139,0; bra $L85; $L96: .loc 1 382 6 mov.u64 %r169,%r159; $L84: .loc 1 388 14 add.u64 %r190,%r190,1; .loc 1 384 6 @ %r459 bra $L87; .loc 1 384 13 @ %r468 bra $L88; .loc 1 42call (%value_in),_gfortrani_memcmp_char4422,[%value_in]; } .loc 1 384 13 setp.lt.s32 %r423,%r422,0; @ %r423 bra $L87; $L89: .loc 1 382 6 mov.u64 %r159,%r169; .loc 1 388 14 mov.u64 %r139,%r190; bra $L87; $L88: .loc 1 42call (%value_in),_gfortrani_memcmp_char4427,[%value_in]; } .loc 1 384 13 setp.gt.s32 %r428,%r427,0; @ %r428 bra $L89; $L87: .loc 1 382 31 add.u64 %r169,%r169,%r95; .loc 1 382 46 add.u64 %r170,%r170,%r45; .loc 1 382 6 setp.eq.u64 %r429,%r151,%r190; @ %r429 bra $L85; .loc 1 384 7 ld.s8 %r206,[%r170]; setp.eq.u32 %r459,%r206,0; bra $L84; $L85: .loc 1 392 8 st.u64 [%r165],%r139; .loc 1 395 15 ld.u64 %r430,[%frame+480]; add.u64 %r98,%r430,1; st.u64 
[%frame+480],%r98; .loc 1 396 12 add.u64 %r163,%r163,%r200; .loc 1 397 13 add.u64 %r164,%r164,%r201; .loc 1 398 12 add.u64 %r165,%r165,%r205; .loc 1 400 32 ld.u64 %r29,[%frame+360]; .loc 1 400 13 setp.ne.u64 %r431,%r29,%r98; @ %r431 bra $L95; .loc 1 404 13 mov.u64 %r432,0; st.u64 [%frame+480],%r432; .loc 1 407 23 mul.lo.u64 %r107,%r29,%r197; .loc 1 408 24 mul.lo.u64 %r433,%r29,%r201; .loc 1 408 10 sub.u64 %r160,%r164,%r433; .loc 1 409 23 mul.lo.u64 %r434,%r29,%r203; .loc 1 409 9 shl.b64 %r435,%r434,3; sub.u64 %r161,%r165,%r435; .loc 1 411 7 @ %r463 bra $L52; add.u64 %r198,%frame,488; mov.u64 %r196,8; .loc 1 410 5 mov.u64 %r162,1; bra $L92; $L93: .loc 1 404 13 st.u64 [%r198],%r432; .loc 1 407 23 mul.lo.u64 %r107,%r119,%r118; .loc 1 408 24 mul.lo.u64 %r441,%r121,%r118; .loc 1 408 10 sub.u64 %r160,%r164,%r441; .loc 1 409 23 mul.lo.u64 %r442,%r123,%r118; .loc 1 409 9 shl.b64 %r443,%r442,3; sub.u64 %r161,%r165,%r443; .loc 1 410 5 add.u64 %r162,%r162,1; .loc 1 411 7 add.u64 %r198,%r198,8; add.u64 %r196,%r196,8; setp.eq.u64 %r444,%r162,%r471; @ %r444 bra $L52; $L92: .loc 1 419 16 ld.u64 %r445,[%r198]; add.u64 %r118,%r445,1; st.u64 [%r198],%r118; .loc 1 420 23 add.u64 %r447,%r472,%r196; ld.u64 %r119,[%r447]; .loc 1 420 13 sub.u64 %r448,%r119,%r107; shl.b64 %r449,%r448,2; add.u64 %r163,%r163,%r449; .loc 1 421 24 add.u64 %r450,%frame,%r196; ld.u64 %r121,[%r450]; .loc 1 421 14 add.u64 %r164,%r160,%r121; .loc 1 422 23 add.u64 %r452,%r474,%r196; ld.u64 %r123,[%r452]; .loc 1 422 13 shl.b64 %r453,%r123,3; add.u64 %r165,%r161,%r453; .loc 1 400 32 add.u64 %r455,%r473,%r196; ld.u64 %r127,[%r455]; .loc 1 400 13 setp.eq.u64 %r456,%r118,%r127; @ %r456 bra $L93; bra $L95; $L112: .loc 1 305 6 ld.u64 %r457,[%r230]; setp.eq.u64 %r458,%r457,0; @ ! %r458 bra $L67; bra $L94; $L52: .loc 1 42_gfortran_smaxloc1_8_s4 .visible .func _gfortran_smaxloc1_8_slocal .align 16 .b8 %frame_ar[368u64 %r89; .reg .u643u64 %r125; .reg .u64predu16 %r180; .reg .predpred %r222; .reg .predu64 %r231; .reg .u16 %r233; .reg .u16 %r234; .reg .u32pred %r259; .reg .pred %r260; .reg .u64u64 %r267; .reg .u32pred %r303; .reg .u64 %r304; .reg .pred %r305; .reg .u64 %r306; .reg .u32mov.u64 %r156,%ar0; mov.u64 %r157,%ar1; mov.u64 %r158,%ar2; mov.u64 %r159,%ar3; mov.u32 %r160,%ar4; mov.u64 %r161,%ar5; .loc 1 449 6 setp.eq.u64 %r162,%r159,0; @ %r162 bra $L115; .loc 1 449 20 ld.u32 %r163,[%r159]; setp.eq.u32 %r164,%r163,0; @ %r164 bra $L116; $L115: .loc 1 452 7r1571580161; call _gfortran_maxloc1_8_s4); } .loc 1 456 7 bra $L114; $L116: .loc 1 459 10 ld.u64 %r23,[%r158]; .loc 1 459 7 add.u64 %r95,%r23,-1; .loc 1 460 10 ld.s8 %r24,[%r157+28]; .loc 1 460 38 add.u32 %r26,%r24,-1; .loc 1 460 8 cvt.s64.s32 %r306,%r26; .loc 1 462 7 shr.u64 %r171,%r95,63; set.u32.gt.s64 %r173,%r95,%r306; neg.s32 %r174,%r173; cvt.u16.u64 %r177,%r171; cvt.u16.u32 %r178,%r174; or.b16 %r176,%r177,%r178; .loc 1 462 6 cvt.u32.u16 %r179,%r176; cvt.u16.u8 %r180,%r179; setp.ne.u16 %r181,%r180,0; @ %r181 bra $L118; .loc 1 469 3 setp.ne.u64 %r182,%r95,0; @ %r182 bra $L119; $L125: .loc 1 477 3 setp.lt.s64 %r183,%r95,%r306; @ %r183 bra $L120; bra $L173; $L118: .loc 1 464 7 cvt.u32.u32 %r186,%r24; cvt.s64.s8 %r185,%r186; st.u64 [%stack+8],%r1851841stack; call _gfortran_runtime_error119: add.u64 %r92,%r157,48; add.u64 %r114,%frame,120; add.u64 %r146,%r157,56; add.u64 %r188,%r157,24; add.u64 %r190,%r23,%r23; add.u64 %r191,%r190,%r23; shl.b64 %r192,%r191,3; add.u64 %r152,%r188,%r192; .loc 1 474 12 mov.u64 %r312,0; $L124: .loc 1 471 19 ld.u64 %r194,[%r146]; ld.u64 %r196,[%r92]; sub.u64 
%r195,%r194,%r196; .loc 1 471 50 mad.lo.u64 %r38,%r195,%r161,%r161; .loc 1 473 10 setp.le.s64 %r197,%r38,0; @ %r197 bra $L122; .loc 1 471 17 st.u64 [%r114],%r38; bra $L123; $L122: .loc 1 474 12 st.u64 [%r114],%r312; $L123: .loc 1 469 3 add.u64 %r92,%r92,24; add.u64 %r114,%r114,8; add.u64 %r146,%r146,24; setp.ne.u64 %r199,%r92,%r152; @ %r199 bra $L124; bra $L125; $L173: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r200,%r107,0; @ ! %r200 bra $L127; bra $L126; $L120: add.u64 %r202,%r23,%r23; add.u64 %r203,%r202,%r23; shl.b64 %r204,%r203,3; add.u64 %r205,%r204,48; add.u64 %r83,%r157,%r205; add.u64 %r304,%frame,120; shl.b64 %r207,%r23,3; add.u64 %r208,%r207,-8; add.u64 %r144,%r304,%r208; add.u64 %r209,%r157,48; cvt.u32.u32 %r211,%r24; cvt.s64.s8 %r210,%r211; add.u64 %r213,%r210,%r210; add.u64 %r214,%r213,%r210; shl.b64 %r215,%r214,3; add.u64 %r72,%r209,%r215; .loc 1 483 12 mov.u64 %r311,0; $L130: .loc 1 480 2 ld.u64 %r217,[%r83+8]; ld.u64 %r219,[%r83]; sub.u64 %r218,%r217,%r219; .loc 1 480 37 mad.lo.u64 %r46,%r218,%r161,%r161; .loc 1 482 10 setp.le.s64 %r220,%r46,0; @ %r220 bra $L128; .loc 1 479 17 st.u64 [%r144],%r46; bra $L129; $L128: .loc 1 483 12 st.u64 [%r144],%r311; $L129: .loc 1 477 3 add.u64 %r83,%r83,24; add.u64 %r144,%r144,8; setp.ne.u64 %r222,%r72,%r83; @ %r222 bra $L130; bra $L174; $L126: .loc 1 490 7 setp.eq.u64 %r223,%r306,0; @ %r223 bra $L132; add.u64 %r304,%frame,120; $L152: add.u64 %r125,%r156,40; mov.u64 %r123,%r304; add.u64 %r224,%r156,16; cvt.u32.u32 %r226,%r24; cvt.s64.s8 %r225,%r226; add.u64 %r228,%r225,%r225; add.u64 %r229,%r228,%r225; shl.b64 %r230,%r229,3; add.u64 %r117,%r224,%r230; .loc 1 493 10 mov.u64 %r89,1; .loc 1 497 4 mov.u64 %r249,0; bra $L133; $L132: .loc 1 501 24 mov.u64 %r231,0; st.u64 [%r156+8],%r231; .loc 1 502 28 cvt.u16.u32 %r234,%r24; add.u16 %r233,%r234,-1; cvt.u32.u16 %r235,%r233; st.u8 [%r156+28],%r235; .loc 1 504 20 add.u32 %r236,%r24,-2; cvt.s64.s32 %r56,%r236; add.u64 %r238,%r56,%r56; add.u64 %r239,%r238,%r56; shl.b64 %r240,%r239,3; add.u64 %r241,%r156,%r240; .loc 1 504 67 shl.b64 %r243,%r56,3; add.u64 %r244,%frame,%r243; .loc 1 504 59 ld.u64 %r246,[%r241+40]; ld.u64 %r247,[%r244+120]; mul.lo.u64 %r103,%r246,%r247; .loc 1 506 10 setp.eq.u64 %r248,%r103,0; @ %r248 bra $L134; bra $L175; $L136: .loc 1 495 48 mul.lo.u64 %r89,%r52,%r89; $L133: .loc 1 497 4 st.u64 [%r125+8],%r249; ld.u64 %r52,[%r123]; add.u64 %r250,%r52,-1; st.u64 [%r125+16],%r250; st.u64 [%r125],%r89; .loc 1 490 7 add.u64 %r125,%r125,24; add.u64 %r123,%r123,8; setp.ne.u64 %r251,%r117,%r125; @ %r251 bra $L136; bra $L132; $L134: .loc 1 509 4 st.u64 [%r156+48],%r103; mov.u64 %r253,-1; st.u64 [%r156+56],%r253; mov.u64 %r254,1; st.u64 [%r156+40],%r254; .loc 1 510 4 bra $L114; $L175: .loc 1 513 24 mov.u64 %r256(%value_in),_gfortrani_xmallocarray257,[%value_in]; } mov.u64 %r107,%r257; .loc 1 513 22 st.u64 [%r156],%r107; $L140: .loc 1 539 3 setp.ne.u64 %r259,%r306,0; @ %r259 bra $L137; bra $L138; $L127: .loc 1 517 19 ld.s8 %r61,[%r156+28]; .loc 1 517 10 setp.eq.u64 %r260,%r61,%r306; @ %r260 bra $L139; .loc 1 518 2 st.u64 [%stack+8],%r306; st.u64 [%stack],%r61;_gfortran_runtime_error139: .loc 1 523 11 cvta.global.u64 %r263,_gfortrani_compile_options; .loc 1 523 10 ld.u32 %r264,[%r263+36]; setp.eq.u32 %r265,%r264,0; @ %r265 bra $L140; .loc 1 525 4 setp.eq.u64 %r266,%r306,0; @ %r266 bra $L138; add.u64 %r115,%r156,48; add.u64 %r105,%frame,120; cvt.u32.u32 %r268,%r24; cvt.s64.s8 %r267,%r268; add.u64 %r86,%r267,-1; .loc 1 525 10 mov.u64 %r113,0; $L142: .loc 1 529 21 ld.u64 
%r270,[%r115+8]; add.u64 %r269,%r270,1; .loc 1 529 19 ld.u64 %r271,[%r115]; sub.u64 %r100,%r269,%r271; .loc 1 530 18 ld.u64 %r70,[%r105]; .loc 1 525 25 add.u64 %r113,%r113,1; .loc 1 530 11 setp.eq.u64 %r272,%r70,%r100; @ %r272 bra $L141; .loc 1 531 3 st.u64 [%stack+16],%r70; st.u64 [%stack+8],%r100; st.u64 [%stack],%r113;_gfortran_runtime_error141: .loc 1 525 4 add.u64 %r115,%r115,24; add.u64 %r105,%r105,8; setp.ne.u64 %r275,%r86,%r113; @ %r275 bra $L142; $L137: .loc 1 541 16 cvt.u16.u32 %r277,%r24; setp.le.s16 %r278,%r277,1; @ %r278 bra $L143; shl.b64 %r276,%r306,3; bra $L144; $L143: mov.u64 %r276,8; $L144: add.u64 %r280,%frame,240; mov.u32 %r2842766,[%value_in]; } add.u64 %r130,%r156,40; mov.u64 %r128,%frame; mov.u64 %r109,0; $L145: .loc 1 542 18 ld.u64 %r288,[%r130]; st.u64 [%r128],%r288; .loc 1 539 26 add.u64 %r109,%r109,1; .loc 1 539 3 add.u64 %r130,%r130,24; add.u64 %r128,%r128,8; setp.gt.s64 %r289,%r306,%r109; @ %r289 bra $L145; $L138: ld.u64 %r88,[%frame+240]; .loc 1 551 22 ld.u64 %r75,[%frame]; .loc 1 551 12 shl.b64 %r76,%r75,3; .loc 1 553 32 ld.u64 %r65,[%frame+120]; .loc 1 560 23 mul.lo.u64 %r49,%r65,%r75; setp.le.s64 %r305,%r306,1; .loc 1 549 13 mov.u64 %r290,0; cvt.u32.u32 %r308,%r24; cvt.s64.s8 %r309,%r308; add.u64 %r310,%r309,-1; $L151: st.u64 [%r107],%r290; .loc 1 550 15 add.u64 %r88,%r88,1; .loc 1 551 12 add.u64 %r107,%r107,%r76; .loc 1 553 13 setp.ne.u64 %r291,%r65,%r88; @ %r291 bra $L151; .loc 1 562 7 @ ! %r305 bra $L176; bra $L114; $L150: .loc 1 557 13 st.u64 [%r108],%r290; .loc 1 560 23 mul.lo.u64 %r78,%r81,%r80; .loc 1 561 5 add.u64 %r106,%r106,1; .loc 1 562 7 add.u64 %r108,%r108,8; add.u64 %r138,%r138,8; add.u64 %r137,%r137,8; setp.ne.u64 %r294,%r106,%r310; @ %r294 bra $L149; bra $L114; $L176: add.u64 %r108,%frame,248; add.u64 %r138,%frame,8; add.u64 %r137,%frame,128; .loc 1 560 23 mov.u64 %r78,%r49; .loc 1 561 5 mov.u64 %r106,1; $L149: .loc 1 566 16 ld.u64 %r299,[%r108]; add.u64 %r80,%r299,1; st.u64 [%r108],%r80; .loc 1 567 23 ld.u64 %r81,[%r138]; .loc 1 567 13 sub.u64 %r300,%r81,%r78; shl.b64 %r301,%r300,3; add.u64 %r107,%r107,%r301; .loc 1 553 32 ld.u64 %r84,[%r137]; .loc 1 553 13 setp.eq.u64 %r302,%r80,%r84; @ %r302 bra $L150; .loc 1 557 13 mov.u64 %r88,0; bra $L151; $L174: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r303,%r107,0; @ ! 
[Tail of _gfortran_smaxloc1_8_s4 (final ret), then archive member maxloc1_16_s1.o/1622802212: PTX compiled from ../../../libgfortran/generated/maxloc1_16_s1.c, declaring _gfortran_maxloc1_16_s1, _gfortran_mmaxloc1_16_s1 and _gfortran_smaxloc1_16_s1, with extern declarations of _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents and memcmp, plus ASCII-encoded constant strings for the MAXLOC runtime diagnostics, including "Incorrect extent in return value of MAXLOC intrinsic in dimension %ld: is %ld, should be %ld" and fragments of the "Funny sized logical array", MASK-argument and return-rank messages. The three function bodies follow in the same mangled, extraction-damaged form and are not reproduced.]
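All of the routines dumped in this payload implement the Fortran MAXLOC intrinsic reduced along one dimension for character arrays: the _8_/_16_ infix is the integer kind of the result, s1/s4 the character kind, and the m- and s-prefixed entry points are the masked and scalar-mask variants. As orientation only, the following hand-written C sketch shows the core comparison loop such a routine performs over a single column of the reduction; it is not libgfortran's generated code, and the names maxloc_string_column, elem_len and stride are invented for the illustration (the real kind=4 routines compare elements with _gfortrani_memcmp_char4 rather than memcmp).

#include <stddef.h>
#include <stdio.h>
#include <string.h>

/* Illustrative sketch: 1-based position of the lexicographically largest
 * CHARACTER(len=elem_len) element among n elements spaced `stride` bytes
 * apart; 0 is returned for an empty extent, as MAXLOC requires. */
static ptrdiff_t maxloc_string_column(const char *base, ptrdiff_t n,
                                      size_t elem_len, ptrdiff_t stride)
{
    ptrdiff_t pos = 0;          /* 0 means "no element seen yet" */
    const char *best = NULL;

    for (ptrdiff_t i = 0; i < n; i++) {
        const char *cur = base + i * stride;
        if (best == NULL || memcmp(cur, best, elem_len) > 0) {
            best = cur;
            pos = i + 1;        /* Fortran results are 1-based */
        }
    }
    return pos;
}

int main(void)
{
    /* Three contiguous 3-character elements: "abc", "zzz", "mno". */
    const char data[] = "abczzzmno";
    printf("%td\n", maxloc_string_column(data, 3, 3, 3)); /* prints 2 */
    return 0;
}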
[Tail of _gfortran_smaxloc1_16_s1 (final ret), then archive member maxloc1_16_s4.o/1622802213: PTX compiled from libgfortran/generated/maxloc1_16_s4.c, declaring _gfortran_maxloc1_16_s4, _gfortran_mmaxloc1_16_s4 and _gfortran_smaxloc1_16_s4, with the same extern declarations as the previous member (here using _gfortrani_memcmp_char4 for element comparison) and the same ASCII-encoded MAXLOC diagnostic strings. The function bodies continue in the same mangled, extraction-damaged form and are not reproduced.]
%r35,[%r244+16]; .loc 1 273 22 cvt.u32.u64 %r36,%r35; .loc 1 273 53 add.u32 %r286,%r36,-4; and.b32 %r287,%r286,-5; set.u32.eq.u32 %r289,%r287,0; neg.s32 %r290,%r289; .loc 1 273 22 add.u32 %r291,%r36,-1; set.u32.le.u32 %r293,%r291,1; neg.s32 %r294,%r293; .loc 1 273 6 cvt.u16.u32 %r296,%r290; cvt.u16.u32 %r297,%r294; or.loc 1 275 7 setp.ne.u32 %r302,%r36,16; @ %r302 bra $L58; $L57: .loc 1 282 11 shl.b64 %r306,%r541,3; add.u64 %r307,%r242,%r306; ld.u64 %r39,[%r307+40]; .loc 1 283 12 add.u64 %r313,%r244,%r306; ld.u64 %r41,[%r313+40]; .loc 1 285 3 setp.ne.u64 %r315,%r145,0; @ %r315 bra $L59; $L65: .loc 1 295 3 setp.lt.s64 %r316,%r145,%r146; @ %r316 bra $L60; bra $L112; $L58: .loc 1 280 5 cvta.const.u64 %r317317_gfortran_runtime_error9: add.u64 %r74,%r242,40; add.u64 %r224,%frame,240; add.u64 %r93,%r244,40; mov.u64 %r232,%frame; add.u64 %r195,%frame,360; add.u64 %r320,%r22,%r22; add.u64 %r321,%r320,%r22; shl.b64 %r322,%r321,3; add.u64 %r323,%r242,16; add.u64 %r236,%r322,%r323; .loc 1 292 12 mov.u64 %r558,0; $L64: .loc 1 287 51 ld.u64 %r325,[%r74]; mul.lo.u64 %r324,%r325,%r246; .loc 1 287 18 st.u64 [%r224],%r324; .loc 1 288 20 ld.u64 %r327,[%r93]; mul.lo.u64 %r326,%r327,%r35; .loc 1 288 18 st.u64 [%r232],%r326; .loc 1 289 19 ld.u64 %r329,[%r74+16]; add.u64 %r328,%r329,1; ld.u64 %r330,[%r74+8]; sub.u64 %r56,%r328,%r330; .loc 1 291 10 setp.lt.s64 %r331,%r56,0; @ %r331 bra $L62; .loc 1 289 17 st.u64 [%r195],%r56; bra $L63; $L62: .loc 1 292 12 st.u64 [%r195],%r558; $L63: .loc 1 285 3 add.u64 %r74,%r74,24; add.u64 %r224,%r224,8; add.u64 %r93,%r93,24; add.u64 %r232,%r232,8; add.u64 %r195,%r195,8; setp.ne.u64 %r333,%r74,%r236; @ %r333 bra $L64; bra $L65; $L112: .loc 1 305 6 ld.u64 %r334,[%r241]; setp.eq.u64 %r335,%r334,0; @ ! %r335 bra $L67; bra $L66; $L60: add.u64 %r337,%r22,%r22; add.u64 %r338,%r337,%r22; shl.b64 %r339,%r338,3; add.u64 %r129,%r339,40; add.u64 %r135,%r242,%r129; shl.b64 %r340,%r22,3; add.u64 %r125,%r340,-8; add.u64 %r545,%frame,240; add.u64 %r127,%r545,%r125; add.u64 %r122,%r244,%r129; add.u64 %r89,%frame,%r125; add.u64 %r544,%frame,360; add.u64 %r43,%r544,%r125; add.u64 %r343,%r242,40; cvt.u32.u32 %r345,%r23; cvt.s64.s8 %r344,%r345; add.u64 %r347,%r344,%r344; add.u64 %r348,%r347,%r344; shl.b64 %r349,%r348,3; add.u64 %r217,%r343,%r349; .loc 1 302 12 mov.u64 %r557,0; $L70: .loc 1 297 55 ld.u64 %r351,[%r135]; mul.lo.u64 %r350,%r351,%r246; .loc 1 297 18 st.u64 [%r127],%r350; .loc 1 298 20 ld.u64 %r353,[%r122]; mul.lo.u64 %r352,%r353,%r35; .loc 1 298 18 st.u64 [%r89],%r352; .loc 1 299 19 ld.u64 %r355,[%r135+16]; add.u64 %r354,%r355,1; ld.u64 %r356,[%r135+8]; sub.u64 %r72,%r354,%r356; .loc 1 301 10 setp.lt.s64 %r357,%r72,0; @ %r357 bra $L68; .loc 1 299 17 st.u64 [%r43],%r72; bra $L69; $L68: .loc 1 302 12 st.u64 [%r43],%r557; $L69: .loc 1 295 3 add.u64 %r135,%r135,24; add.u64 %r127,%r127,8; add.u64 %r122,%r122,24; add.u64 %r89,%r89,8; add.u64 %r43,%r43,8; setp.ne.u64 %r359,%r135,%r217; @ %r359 bra $L70; bra $L113; $L66: .loc 1 309 7 setp.eq.u64 %r360,%r146,0; @ %r360 bra $L72; add.u64 %r544,%frame,360; $L95: add.u64 %r174,%r241,40; mov.u64 %r161,%r544; cvt.u32.u32 %r362,%r23; cvt.s64.s8 %r361,%r362; add.u64 %r364,%r361,%r361; add.u64 %r365,%r364,%r361; shl.b64 %r366,%r365,3; add.u64 %r367,%r241,16; add.u64 %r137,%r366,%r367; .loc 1 312 10 mov.u64 %r133,1; .loc 1 316 4 mov.u64 %r386,0; bra $L73; $L72: .loc 1 320 20 add.u32 %r368,%r23,-2; cvt.s64.s32 %r80,%r368; add.u64 %r370,%r80,%r80; add.u64 %r371,%r370,%r80; shl.b64 %r372,%r371,3; add.u64 %r373,%r241,%r372; .loc 1 320 67 shl.b64 
%r375,%r80,3; add.u64 %r376,%frame,%r375; .loc 1 320 59 ld.u64 %r378,[%r373+40]; ld.u64 %r379,[%r376+360]; mul.lo.u64 %r153,%r378,%r379; .loc 1 322 24 mov.u64 %r380,0; st.u64 [%r241+8],%r380; .loc 1 323 28 cvt.u16.u32 %r383,%r23; add.u16 %r382,%r383,-1; cvt.u32.u16 %r384,%r382; st.u8 [%r241+28],%r384; .loc 1 325 10 setp.eq.u64 %r385,%r153,0; @ %r385 bra $L74; bra $L114; $L76: .loc 1 314 47 mul.lo.u64 %r133,%r77,%r133; $L73: .loc 1 316 4 st.u64 [%r174+8],%r386; ld.u64 %r77,[%r161]; add.u64 %r387,%r77,-1; st.u64 [%r174+16],%r387; st.u64 [%r174],%r133; .loc 1 309 7 add.u64 %r174,%r174,24; add.u64 %r161,%r161,8; setp.ne.u64 %r388,%r137,%r174; @ %r388 bra $L76; bra $L72; $L74: .loc 1 328 4 st.u64 [%r241+48],%r153; mov.u64 %r390,-1; st.u64 [%r241+56],%r390; mov.u64 %r391,1; st.u64 [%r241+40],%r391; .loc 1 329 4 bra $L52; $L114: .loc 1 332 24 mov.u64 %r393,16393; call (%value_in),_gfortrani_xmallocarray94,[%value_in]; } .loc 1 332 22 st.u64 [%r241],%r394; bra $L77; $L67: .loc 1 337 19 ld.s8 %r396,[%r241+28]; .loc 1 337 10 setp.eq.u64 %r397,%r396,%r146; @ %r397 bra $L78; .loc 1 338 2 cvta.const.u64 %r398,$LC5stack; call _gfortran_runtime_error78: .loc 1 340 11 cvta.global.u64 %r400,_gfortrani_compile_options; .loc 1 340 10 ld.u32 %r401,[%r400+36]; setp.eq.u32 %r402,%r401,0; @ %r402 bra $L77; .loc 1 342 4 add.u64 %r544,%frame,360; cvta.const.u64 %r406,$LC2r5440406; call _gfortrani_bounds_ifunction_return344 4 cvta.const.u64 %r410,$LC424210406; call _gfortrani_bounds_equal_extents$L77: .loc 1 349 3 setp.ne.u64 %r412,%r146,0; @ %r412 bra $L79; $L82: .loc 1 357 8 ld.u64 %r160,[%r241]; .loc 1 358 8 ld.u64 %r158,[%r242]; .loc 1 360 9 setp.ne.u64 %r413,%r158,0; @ %r413 bra $L80; bra $L52; $L79: add.u64 %r196,%frame,480; add.u64 %r193,%r241,40; add.u64 %r189,%frame,120; cvt.u32.u32 %r415,%r23; cvt.s64.s8 %r414,%r415; add.u64 %r177,%r414,-1; .loc 1 349 3 mov.u64 %r165,0; add.u64 %r544,%frame,360; .loc 1 351 16 mov.u64 %r416,%r165; $L81: st.u64 [%r196],%r416; .loc 1 352 18 ld.u64 %r417,[%r193]; st.u64 [%r189],%r417; .loc 1 353 17 shl.b64 %r419,%r165,3; add.u64 %r420,%r544,%r419; .loc 1 353 10 ld.u64 %r421,[%r420]; setp.le.s64 %r422,%r421,0; @ %r422 bra $L52; .loc 1 349 26 add.u64 %r165,%r165,1; .loc 1 349 3 add.u64 %r196,%r196,8; add.u64 %r193,%r193,24; add.u64 %r189,%r189,8; setp.ne.u64 %r423,%r165,%r177; @ %r423 bra $L81; bra $L82; $L80: .loc 1 283 12 mul.lo.u64 %r44,%r41,%r35; .loc 1 372 32 mul.lo.u64 %r424,%r39,%r246; shl.b64 %r92,%r424,2; .loc 1 396 22 ld.u64 %r219,[%frame+240]; .loc 1 396 12 shl.b64 %r222,%r219,2; .loc 1 397 23 ld.u64 %r223,[%frame]; .loc 1 398 22 ld.u64 %r226,[%frame+120]; .loc 1 398 12 shl.b64 %r228,%r226,4; setp.le.s64 %r543,%r146,1; add.u64 %r549,%r147,-1; .loc 1 384 13 setp.eq.u32 %r550,%r245,0; cvt.u32.u32 %r551,%r23; cvt.s64.s8 %r552,%r551; add.u64 %r553,%r552,-1; add.u64 %r554,%frame,240; add.u64 %r555,%frame,360; add.u64 %r556,%frame,120; $L96: .loc 1 349 3 mov.u64 %r164,%r159; mov.u64 %r154,%r158; .loc 1 372 9 mov.u64 %r198,0; $L86: .loc 1 375 7 ld.s8 %r230,[%r164]; mov.u64 %r235,%r198; add.u64 %r198,%r198,1; .loc 1 375 6 setp.eq.u32 %r546,%r230,0; @ %r546 bra $L83; .loc 1 378 11 mov.u64 %r520,%r198; shr.s64 %r521,%r520,63; .loc 1 382 6 setp.le.s64 %r427,%r147,%r235; @ %r427 bra $L84; add.u64 %r524,%r235,1; shr.s64 %r525,%r524,63; sub.u64 %r431,%r549,%r235; shr.s64 %r434,%r235,63; add.u64 %r437,%r235,2; set.u32.lt.u64 %r442,%r437,%r235; cvt.s64.s32 %r440,%r442; sub.u64 %r446,%r434,%r440; add.u64 %r448,%r431,%r437; set.u32.lt.u64 %r453,%r448,%r431; cvt.s64.s32 %r451,%r453; 
sub.u64 %r457,%r446,%r451; mov.u64 %r163,%r154; bra $L85; $L83: .loc 1 372 32 add.u64 %r154,%r154,%r92; .loc 1 372 47 add.u64 %r164,%r164,%r44; .loc 1 372 2 setp.ne.u64 %r458,%r147,%r198; @ %r458 bra $L86; .loc 1 371 9 mov.u64 %r520,0; mov.u64 %r521,%r520; bra $L84; $L85: .loc 1 384 6 @ %r546 bra $L87; .loc 1 384 13 @ %r550 bra $L88; .loc 1 4216r154246; call (%value_in),_gfortrani_memcmp_char4464,[%value_in]; } .loc 1 384 13 setp.lt.s32 %r465,%r464,0; @ %r465 bra $L87; $L89: .loc 1 388 14 mov.u64 %r520,%r524; mov.u64 %r521,%r525; mov.u64 %r154,%r163; bra $L87; $L88: .loc 1 4216r154246; call (%value_in),_gfortrani_memcmp_char4469,[%value_in]; } .loc 1 384 13 setp.gt.s32 %r470,%r469,0; @ %r470 bra $L89; $L87: .loc 1 382 31 add.u64 %r163,%r163,%r92; .loc 1 382 46 add.u64 %r164,%r164,%r44; .loc 1 382 6 add.u64 %r473,%r524,1; set.u32.lt.u64 %r478,%r473,%r524; cvt.s64.s32 %r476,%r478; mov.u64 %r524,%r473; sub.u64 %r525,%r525,%r476; setp.ne.u64 %r485,%r448,%r524; @ %r485 bra $L97; setp.ne.u64 %r488,%r457,%r525; @ ! %r488 bra $L84; $L97: .loc 1 384 7 ld.s8 %r230,[%r164]; setp.eq.u32 %r546,%r230,0; bra $L85; $L84: .loc 1 392 8 st.u64 [%r160],%r520; st.u64 [%r160+8],%r521; .loc 1 395 15 ld.u64 %r491,[%frame+480]; add.u64 %r95,%r491,1; st.u64 [%frame+480],%r95; .loc 1 396 12 add.u64 %r158,%r158,%r222; .loc 1 397 13 add.u64 %r159,%r159,%r223; .loc 1 398 12 add.u64 %r160,%r160,%r228; .loc 1 400 32 ld.u64 %r29,[%frame+360]; .loc 1 400 13 setp.ne.u64 %r492,%r29,%r95; @ %r492 bra $L96; .loc 1 404 13 mov.u64 %r493,0; st.u64 [%frame+480],%r493; .loc 1 407 23 mul.lo.u64 %r105,%r29,%r219; .loc 1 408 24 mul.lo.u64 %r494,%r29,%r223; .loc 1 408 10 sub.u64 %r155,%r159,%r494; .loc 1 409 23 mul.lo.u64 %r495,%r29,%r226; .loc 1 409 9 shl.b64 %r496,%r495,4; sub.u64 %r156,%r160,%r496; .loc 1 411 7 @ %r543 bra $L52; add.u64 %r215,%frame,488; mov.u64 %r214,8; .loc 1 410 5 mov.u64 %r157,1; bra $L93; $L94: .loc 1 404 13 st.u64 [%r215],%r493; .loc 1 407 23 mul.lo.u64 %r105,%r115,%r114; .loc 1 408 24 mul.lo.u64 %r502,%r117,%r114; .loc 1 408 10 sub.u64 %r155,%r159,%r502; .loc 1 409 23 mul.lo.u64 %r503,%r119,%r114; .loc 1 409 9 shl.b64 %r504,%r503,4; sub.u64 %r156,%r160,%r504; .loc 1 410 5 add.u64 %r157,%r157,1; .loc 1 411 7 add.u64 %r215,%r215,8; add.u64 %r214,%r214,8; setp.eq.u64 %r505,%r157,%r553; @ %r505 bra $L52; $L93: .loc 1 419 16 ld.u64 %r506,[%r215]; add.u64 %r114,%r506,1; st.u64 [%r215],%r114; .loc 1 420 23 add.u64 %r508,%r554,%r214; ld.u64 %r115,[%r508]; .loc 1 420 13 sub.u64 %r509,%r115,%r105; shl.b64 %r510,%r509,2; add.u64 %r158,%r158,%r510; .loc 1 421 24 add.u64 %r511,%frame,%r214; ld.u64 %r117,[%r511]; .loc 1 421 14 add.u64 %r159,%r155,%r117; .loc 1 422 23 add.u64 %r513,%r556,%r214; ld.u64 %r119,[%r513]; .loc 1 422 13 shl.b64 %r514,%r119,4; add.u64 %r160,%r156,%r514; .loc 1 400 32 add.u64 %r516,%r555,%r214; ld.u64 %r123,[%r516]; .loc 1 400 13 setp.eq.u64 %r517,%r114,%r123; @ %r517 bra $L94; bra $L96; $L113: .loc 1 305 6 ld.u64 %r518,[%r241]; setp.eq.u64 %r519,%r518,0; @ ! 
%r519 bra $L67; bra $L95; $L52: .loc 1 42_gfortran_smaxloc1_16_s4 .visible .func _gfortran_smaxloc1_16_slocal .align 16 .b8 %frame_ar[368u64 %r89; .reg .u643u64 %r125; .reg .u64predu16 %r180; .reg .predpred %r222; .reg .predu64 %r231; .reg .u16 %r233; .reg .u16 %r234; .reg .u32pred %r259; .reg .pred %r260; .reg .u64u64 %r267; .reg .u32predmov.u64 %r156,%ar0; mov.u64 %r157,%ar1; mov.u64 %r158,%ar2; mov.u64 %r159,%ar3; mov.u32 %r160,%ar4; mov.u64 %r161,%ar5; .loc 1 449 6 setp.eq.u64 %r162,%r159,0; @ %r162 bra $L116; .loc 1 449 20 ld.u32 %r163,[%r159]; setp.eq.u32 %r164,%r163,0; @ %r164 bra $L117; $L116: .loc 1 452 7r1571580161; call _gfortran_maxloc1_16_s4); } .loc 1 456 7 bra $L115; $L117: .loc 1 459 10 ld.u64 %r23,[%r158]; .loc 1 459 7 add.u64 %r95,%r23,-1; .loc 1 460 10 ld.s8 %r24,[%r157+28]; .loc 1 460 38 add.u32 %r26,%r24,-1; .loc 1 460 8 cvt.s64.s32 %r307,%r26; .loc 1 462 7 shr.u64 %r171,%r95,63; set.u32.gt.s64 %r173,%r95,%r307; neg.s32 %r174,%r173; cvt.u16.u64 %r177,%r171; cvt.u16.u32 %r178,%r174; or.b16 %r176,%r177,%r178; .loc 1 462 6 cvt.u32.u16 %r179,%r176; cvt.u16.u8 %r180,%r179; setp.ne.u16 %r181,%r180,0; @ %r181 bra $L119; .loc 1 469 3 setp.ne.u64 %r182,%r95,0; @ %r182 bra $L120; $L126: .loc 1 477 3 setp.lt.s64 %r183,%r95,%r307; @ %r183 bra $L121; bra $L174; $L119: .loc 1 464 7 cvt.u32.u32 %r186,%r24; cvt.s64.s8 %r185,%r186; st.u64 [%stack+8],%r1851841stack; call _gfortran_runtime_error120: add.u64 %r92,%r157,48; add.u64 %r114,%frame,120; add.u64 %r146,%r157,56; add.u64 %r188,%r157,24; add.u64 %r190,%r23,%r23; add.u64 %r191,%r190,%r23; shl.b64 %r192,%r191,3; add.u64 %r152,%r188,%r192; .loc 1 474 12 mov.u64 %r313,0; $L125: .loc 1 471 19 ld.u64 %r194,[%r146]; ld.u64 %r196,[%r92]; sub.u64 %r195,%r194,%r196; .loc 1 471 50 mad.lo.u64 %r38,%r195,%r161,%r161; .loc 1 473 10 setp.le.s64 %r197,%r38,0; @ %r197 bra $L123; .loc 1 471 17 st.u64 [%r114],%r38; bra $L124; $L123: .loc 1 474 12 st.u64 [%r114],%r313; $L124: .loc 1 469 3 add.u64 %r92,%r92,24; add.u64 %r114,%r114,8; add.u64 %r146,%r146,24; setp.ne.u64 %r199,%r92,%r152; @ %r199 bra $L125; bra $L126; $L174: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r200,%r107,0; @ ! 
%r200 bra $L128; bra $L127; $L121: add.u64 %r202,%r23,%r23; add.u64 %r203,%r202,%r23; shl.b64 %r204,%r203,3; add.u64 %r205,%r204,48; add.u64 %r83,%r157,%r205; add.u64 %r305,%frame,120; shl.b64 %r207,%r23,3; add.u64 %r208,%r207,-8; add.u64 %r144,%r305,%r208; add.u64 %r209,%r157,48; cvt.u32.u32 %r211,%r24; cvt.s64.s8 %r210,%r211; add.u64 %r213,%r210,%r210; add.u64 %r214,%r213,%r210; shl.b64 %r215,%r214,3; add.u64 %r72,%r209,%r215; .loc 1 483 12 mov.u64 %r312,0; $L131: .loc 1 480 2 ld.u64 %r217,[%r83+8]; ld.u64 %r219,[%r83]; sub.u64 %r218,%r217,%r219; .loc 1 480 37 mad.lo.u64 %r46,%r218,%r161,%r161; .loc 1 482 10 setp.le.s64 %r220,%r46,0; @ %r220 bra $L129; .loc 1 479 17 st.u64 [%r144],%r46; bra $L130; $L129: .loc 1 483 12 st.u64 [%r144],%r312; $L130: .loc 1 477 3 add.u64 %r83,%r83,24; add.u64 %r144,%r144,8; setp.ne.u64 %r222,%r72,%r83; @ %r222 bra $L131; bra $L175; $L127: .loc 1 490 7 setp.eq.u64 %r223,%r307,0; @ %r223 bra $L133; add.u64 %r305,%frame,120; $L153: add.u64 %r125,%r156,40; mov.u64 %r123,%r305; add.u64 %r224,%r156,16; cvt.u32.u32 %r226,%r24; cvt.s64.s8 %r225,%r226; add.u64 %r228,%r225,%r225; add.u64 %r229,%r228,%r225; shl.b64 %r230,%r229,3; add.u64 %r117,%r224,%r230; .loc 1 493 10 mov.u64 %r89,1; .loc 1 497 4 mov.u64 %r249,0; bra $L134; $L133: .loc 1 501 24 mov.u64 %r231,0; st.u64 [%r156+8],%r231; .loc 1 502 28 cvt.u16.u32 %r234,%r24; add.u16 %r233,%r234,-1; cvt.u32.u16 %r235,%r233; st.u8 [%r156+28],%r235; .loc 1 504 20 add.u32 %r236,%r24,-2; cvt.s64.s32 %r56,%r236; add.u64 %r238,%r56,%r56; add.u64 %r239,%r238,%r56; shl.b64 %r240,%r239,3; add.u64 %r241,%r156,%r240; .loc 1 504 67 shl.b64 %r243,%r56,3; add.u64 %r244,%frame,%r243; .loc 1 504 59 ld.u64 %r246,[%r241+40]; ld.u64 %r247,[%r244+120]; mul.lo.u64 %r103,%r246,%r247; .loc 1 506 10 setp.eq.u64 %r248,%r103,0; @ %r248 bra $L135; bra $L176; $L137: .loc 1 495 48 mul.lo.u64 %r89,%r52,%r89; $L134: .loc 1 497 4 st.u64 [%r125+8],%r249; ld.u64 %r52,[%r123]; add.u64 %r250,%r52,-1; st.u64 [%r125+16],%r250; st.u64 [%r125],%r89; .loc 1 490 7 add.u64 %r125,%r125,24; add.u64 %r123,%r123,8; setp.ne.u64 %r251,%r117,%r125; @ %r251 bra $L137; bra $L133; $L135: .loc 1 509 4 st.u64 [%r156+48],%r103; mov.u64 %r253,-1; st.u64 [%r156+56],%r253; mov.u64 %r254,1; st.u64 [%r156+40],%r254; .loc 1 510 4 bra $L115; $L176: .loc 1 513 24 mov.u64 %r256,16256; call (%value_in),_gfortrani_xmallocarray257,[%value_in]; } mov.u64 %r107,%r257; .loc 1 513 22 st.u64 [%r156],%r107; $L141: .loc 1 539 3 setp.ne.u64 %r259,%r307,0; @ %r259 bra $L138; bra $L139; $L128: .loc 1 517 19 ld.s8 %r61,[%r156+28]; .loc 1 517 10 setp.eq.u64 %r260,%r61,%r307; @ %r260 bra $L140; .loc 1 518 2 st.u64 [%stack+8],%r307; st.u64 [%stack],%r61;_gfortran_runtime_error140: .loc 1 523 11 cvta.global.u64 %r263,_gfortrani_compile_options; .loc 1 523 10 ld.u32 %r264,[%r263+36]; setp.eq.u32 %r265,%r264,0; @ %r265 bra $L141; .loc 1 525 4 setp.eq.u64 %r266,%r307,0; @ %r266 bra $L139; add.u64 %r115,%r156,48; add.u64 %r105,%frame,120; cvt.u32.u32 %r268,%r24; cvt.s64.s8 %r267,%r268; add.u64 %r86,%r267,-1; .loc 1 525 10 mov.u64 %r113,0; $L143: .loc 1 529 21 ld.u64 %r270,[%r115+8]; add.u64 %r269,%r270,1; .loc 1 529 19 ld.u64 %r271,[%r115]; sub.u64 %r100,%r269,%r271; .loc 1 530 18 ld.u64 %r70,[%r105]; .loc 1 525 25 add.u64 %r113,%r113,1; .loc 1 530 11 setp.eq.u64 %r272,%r70,%r100; @ %r272 bra $L142; .loc 1 531 3 st.u64 [%stack+16],%r70; st.u64 [%stack+8],%r100; st.u64 [%stack],%r113;_gfortran_runtime_error142: .loc 1 525 4 add.u64 %r115,%r115,24; add.u64 %r105,%r105,8; setp.ne.u64 %r275,%r86,%r113; @ %r275 
bra $L143; $L138: .loc 1 541 16 cvt.u16.u32 %r277,%r24; setp.le.s16 %r278,%r277,1; @ %r278 bra $L144; shl.b64 %r276,%r307,3; bra $L145; $L144: mov.u64 %r276,8; $L145: add.u64 %r280,%frame,240; mov.u32 %r2842766,[%value_in]; } add.u64 %r130,%r156,40; mov.u64 %r128,%frame; mov.u64 %r109,0; $L146: .loc 1 542 18 ld.u64 %r288,[%r130]; st.u64 [%r128],%r288; .loc 1 539 26 add.u64 %r109,%r109,1; .loc 1 539 3 add.u64 %r130,%r130,24; add.u64 %r128,%r128,8; setp.gt.s64 %r289,%r307,%r109; @ %r289 bra $L146; $L139: ld.u64 %r88,[%frame+240]; .loc 1 551 22 ld.u64 %r75,[%frame]; .loc 1 551 12 shl.b64 %r76,%r75,4; .loc 1 553 32 ld.u64 %r65,[%frame+120]; .loc 1 560 23 mul.lo.u64 %r49,%r65,%r75; setp.le.s64 %r306,%r307,1; .loc 1 549 13 mov.u64 %r290,0; cvt.u32.u32 %r309,%r24; cvt.s64.s8 %r310,%r309; add.u64 %r311,%r310,-1; $L152: st.u64 [%r107],%r290; st.u64 [%r107+8],%r290; .loc 1 550 15 add.u64 %r88,%r88,1; .loc 1 551 12 add.u64 %r107,%r107,%r76; .loc 1 553 13 setp.ne.u64 %r292,%r65,%r88; @ %r292 bra $L152; .loc 1 562 7 @ ! %r306 bra $L177; bra $L115; $L151: .loc 1 557 13 st.u64 [%r108],%r290; .loc 1 560 23 mul.lo.u64 %r78,%r81,%r80; .loc 1 561 5 add.u64 %r106,%r106,1; .loc 1 562 7 add.u64 %r108,%r108,8; add.u64 %r138,%r138,8; add.u64 %r137,%r137,8; setp.ne.u64 %r295,%r106,%r311; @ %r295 bra $L150; bra $L115; $L177: add.u64 %r108,%frame,248; add.u64 %r138,%frame,8; add.u64 %r137,%frame,128; .loc 1 560 23 mov.u64 %r78,%r49; .loc 1 561 5 mov.u64 %r106,1; $L150: .loc 1 566 16 ld.u64 %r300,[%r108]; add.u64 %r80,%r300,1; st.u64 [%r108],%r80; .loc 1 567 23 ld.u64 %r81,[%r138]; .loc 1 567 13 sub.u64 %r301,%r81,%r78; shl.b64 %r302,%r301,4; add.u64 %r107,%r107,%r302; .loc 1 553 32 ld.u64 %r84,[%r137]; .loc 1 553 13 setp.eq.u64 %r303,%r80,%r84; @ %r303 bra $L151; .loc 1 557 13 mov.u64 %r88,0; bra $L152; $L175: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r304,%r107,0; @ ! 
%r304 bra $L128; bra $L153; $L115: .loc 1 571 1 ret; } minloc1_4_s1.o/ 1622802212_gfortran_minloc1_4_s1 .visible .func _gfortran_minloc1_4_s1in_ar4); .file 1 "../../../libgfortran/generated/minloc1_4_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_4_s1 .visible .func _gfortran_mminloc1_4_s1_gfortran_sminloc1_4_s1 .visible .func _gfortran_sminloc1_4_s1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returmemcmpmem_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents7] = {77,73,78,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,914,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,117,101114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99_gfortran_minloc1_4_s1 .visible .func _gfortran_minloc1_4_s1in_ar4) {48099predpredpred %r238; .reg .u64predu64 %r279; .reg .u64 %r280; .reg .u64 %r281; .reg .u64u64 %r303; .reg .u32 %r304; .reg .pred %r305; .reg .u64 %r308; .reg .u64 %r309; .reg .pred %r311; .reg .u64 %r312; .reg .u64 %r313; .reg .u64u64predpredpredpred %r356; .reg .u64 %r357; .reg .predmov.u64 %r184,%ar0; mov.u64 %r185,%ar1; mov.u64 %r186,%ar2; mov.u32 %r187,%ar3; mov.u64 %r188,%ar4; .loc 1 70 10 ld.s8 %r22,[%r185+28]; .loc 1 70 38 add.u32 %r189,%r22,-1; .loc 1 70 8 cvt.s64.s32 %r117,%r189; .loc 1 71 10 ld.u64 %r25,[%r186]; .loc 1 71 7 add.u64 %r118,%r25,-1; .loc 1 73 7 shr.u64 %r191,%r118,63; set.u32.lt.s64 %r193,%r117,%r118; neg.s32 %r194,%r193; cvt.u16.u64 %r197,%r191; cvt.u16.u32 %r198,%r194; or.b16 %r196,%r197,%r198; .loc 1 73 6 cvt.u32.u16 %r199,%r196; cvt.u16.u8 %r200,%r199; setp.eq.u16 %r201,%r200,0; @ %r201 bra $L2; .loc 1 75 7 cvt.u32.u32 %r204,%r22; cvt.s64.s8 %r203,%r204; st.u64 [%stack+8],%r20302202_gfortran_runtime_errorr207,%r118,%r118; add.u64 %r208,%r207,%r118; shl.b64 %r209,%r208,3; add.u64 %r210,%r185,%r209; ld.u64 %r31,[%r210+56]; ld.u64 %r33,[%r210+48]; .loc 1 83 11 ld.u64 %r34,[%r210+40]; .loc 1 85 3 setp.ne.u64 %r224,%r118,0; @ %r224 bra $L3; $L9: .loc 1 93 3 setp.gt.s64 %r225,%r117,%r118; @ %r225 bra $L4; bra $L52; $L3: add.u64 %r29,%r185,40; add.u64 %r133,%frame,120; add.u64 %r182,%frame,240; add.u64 %r227,%r25,%r25; add.u64 %r228,%r227,%r25; shl.b64 %r229,%r228,3; add.u64 %r230,%r185,16; add.u64 %r160,%r229,%r230; .loc 1 91 12 mov.u64 %r373,0; $L8: .loc 1 87 51 ld.u64 
%r232,[%r29]; mul.lo.u64 %r231,%r232,%r188; .loc 1 87 18 st.u64 [%r133],%r231; .loc 1 88 19 ld.u64 %r234,[%r29+16]; add.u64 %r233,%r234,1; ld.u64 %r235,[%r29+8]; sub.u64 %r44,%r233,%r235; .loc 1 90 10 setp.lt.s64 %r236,%r44,0; @ %r236 bra $L6; .loc 1 88 17 st.u64 [%r182],%r44; bra $L7; $L6: .loc 1 91 12 st.u64 [%r182],%r373; $L7: .loc 1 85 3 add.u64 %r29,%r29,24; add.u64 %r133,%r133,8; add.u64 %r182,%r182,8; setp.ne.u64 %r238,%r29,%r160; @ %r238 bra $L8; bra $L9; $L52: .loc 1 102 6 ld.u64 %r239,[%r184]; setp.eq.u64 %r240,%r239,0; @ ! %r240 bra $L11; bra $L10; $L4: add.u64 %r242,%r25,%r25; add.u64 %r243,%r242,%r25; shl.b64 %r244,%r243,3; add.u64 %r245,%r244,40; add.u64 %r134,%r185,%r245; shl.b64 %r246,%r25,3; add.u64 %r108,%r246,-8; add.u64 %r362,%frame,120; add.u64 %r112,%r362,%r108; add.u64 %r361,%frame,240; add.u64 %r105,%r361,%r108; add.u64 %r249,%r185,40; cvt.u32.u32 %r251,%r22; cvt.s64.s8 %r250,%r251; add.u64 %r253,%r250,%r250; add.u64 %r254,%r253,%r250; shl.b64 %r255,%r254,3; add.u64 %r67,%r249,%r255; .loc 1 99 12 mov.u64 %r372,0; $L14: .loc 1 95 56 ld.u64 %r257,[%r134]; mul.lo.u64 %r256,%r257,%r188; .loc 1 95 18 st.u64 [%r112],%r256; .loc 1 96 19 ld.u64 %r259,[%r134+16]; add.u64 %r258,%r259,1; ld.u64 %r260,[%r134+8]; sub.u64 %r54,%r258,%r260; .loc 1 98 10 setp.lt.s64 %r261,%r54,0; @ %r261 bra $L12; .loc 1 96 17 st.u64 [%r105],%r54; bra $L13; $L12: .loc 1 99 12 st.u64 [%r105],%r372; $L13: .loc 1 93 3 add.u64 %r134,%r134,24; add.u64 %r112,%r112,8; add.u64 %r105,%r105,8; setp.ne.u64 %r263,%r67,%r134; @ %r263 bra $L14; bra $L53; $L10: .loc 1 106 7 setp.le.s64 %r264,%r117,0; @ %r264 bra $L16; add.u64 %r361,%frame,240; $L37: add.u64 %r150,%r184,40; mov.u64 %r148,%r361; cvt.u32.u32 %r266,%r22; cvt.s64.s8 %r265,%r266; add.u64 %r268,%r265,%r265; add.u64 %r269,%r268,%r265; shl.b64 %r270,%r269,3; add.u64 %r271,%r184,16; add.u64 %r135,%r270,%r271; .loc 1 109 10 mov.u64 %r107,1; .loc 1 113 4 mov.u64 %r294,0; bra $L17; $L16: .loc 1 117 24 mov.u64 %r272,0; st.u64 [%r184+8],%r272; .loc 1 118 28 cvt.u16.u32 %r275,%r22; add.u16 %r274,%r275,-1; cvt.u32.u16 %r276,%r274; st.u8 [%r184+28],%r276; .loc 1 120 20 add.u32 %r277,%r22,-2; cvt.s64.s32 %r65,%r277; add.u64 %r279,%r65,%r65; add.u64 %r280,%r279,%r65; shl.b64 %r281,%r280,3; add.u64 %r282,%r184,%r281; .loc 1 120 67 shl.b64 %r284,%r65,3; add.u64 %r285,%frame,%r284; .loc 1 120 59 ld.u64 %r287,[%r282+40]; ld.u64 %r288,[%r285+240]; mul.lo.u64 %r123,%r287,%r288; .loc 1 122 29290; call (%value_in),_gfortrani_xmallocarray291,[%value_in]; } .loc 1 122 27 st.u64 [%r184],%r291; .loc 1 123 10 setp.eq.u64 %r293,%r123,0; @ ! 
%r293 bra $L21; bra $L18; $L20: .loc 1 111 48 mul.lo.u64 %r107,%r61,%r107; $L17: .loc 1 113 4 st.u64 [%r150+8],%r294; ld.u64 %r61,[%r148]; add.u64 %r295,%r61,-1; st.u64 [%r150+16],%r295; st.u64 [%r150],%r107; .loc 1 106 7 add.u64 %r150,%r150,24; add.u64 %r148,%r148,8; setp.ne.u64 %r296,%r135,%r150; @ %r296 bra $L20; bra $L16; $L18: .loc 1 126 4 st.u64 [%r184+48],%r123; mov.u64 %r298,-1; st.u64 [%r184+56],%r298; mov.u64 %r299,1; st.u64 [%r184+40],%r299; .loc 1 127 4 bra $L1; $L11: .loc 1 133 19 ld.s8 %r72,[%r184+28]; .loc 1 133 10 setp.eq.u64 %r300,%r72,%r117; @ %r300 bra $L23; .loc 1 134 2 st.u64 [%stack+8],%r117; st.u64 [%stack],%r72;_gfortran_runtime_error23: .loc 1 139 11 cvta.global.u64 %r303,_gfortrani_compile_options; .loc 1 139 10 ld.u32 %r304,[%r303+36]; setp.eq.u32 %r305,%r304,0; @ %r305 bra $L21; .loc 1 140 2 add.u64 %r361,%frame,240; cvta.const.u64 %r309,$LC2r361308309; call _gfortrani_bounds_ifunction_return$L21: .loc 1 144 3 setp.gt.s64 %r311,%r117,0; @ %r311 bra $L24; $L28: .loc 1 80 9 add.u64 %r312,%r31,1; .loc 1 80 7 sub.u64 %r119,%r312,%r33; max.s64 %r121,%r119,0; .loc 1 83 44 mul.lo.u64 %r36,%r34,%r188; .loc 1 152 8 ld.u64 %r129,[%r185]; .loc 1 153 8 ld.u64 %r130,[%r184]; .loc 1 186 22 ld.u64 %r81,[%frame+120]; .loc 1 187 22 ld.u64 %r83,[%frame]; .loc 1 187 12 shl.b64 %r84,%r83,2; .loc 1 189 32 ld.u64 %r136,[%frame+240]; .loc 1 196 23 mul.lo.u64 %r313,%r81,%r136; .loc 1 196 9 neg.s64 %r144,%r313; .loc 1 197 23 mul.lo.u64 %r314,%r83,%r136; .loc 1 197 9 shl.b64 %r315,%r314,2; neg.s64 %r146,%r315; .loc 1 185 12 ld.u64 %r181,[%frame+360]; setp.gt.s64 %r359,%r119,0; setp.le.s64 %r360,%r117,1; .loc 1 164 9 mov.u64 %r365,0; .loc 1 165 9 cvt.u32.u64 %r366,%r365; .loc 1 173 6 setp.eq.u32 %r367,%r187,0; cvt.u32.u32 %r369,%r22; cvt.s64.s8 %r370,%r369; add.u64 %r371,%r370,-1; bra $L25; $L24: add.u64 %r169,%frame,360; add.u64 %r168,%r184,40; mov.u64 %r166,%frame; cvt.u32.u32 %r317,%r22; cvt.s64.s8 %r316,%r317; add.u64 %r152,%r316,-1; .loc 1 144 3 mov.u64 %r131,0; add.u64 %r361,%frame,240; .loc 1 146 16 mov.u64 %r318,%r131; $L27: st.u64 [%r169],%r318; .loc 1 147 18 ld.u64 %r319,[%r168]; st.u64 [%r166],%r319; .loc 1 148 17 shl.b64 %r321,%r131,3; add.u64 %r322,%r361,%r321; .loc 1 148 10 ld.u64 %r323,[%r322]; setp.le.s64 %r324,%r323,0; @ %r324 bra $L1; .loc 1 144 26 add.u64 %r131,%r131,1; .loc 1 144 3 add.u64 %r169,%r169,8; add.u64 %r168,%r168,24; add.u64 %r166,%r166,8; setp.ne.u64 %r325,%r131,%r152; @ %r325 bra $L27; bra $L28; $L25: .loc 1 166 5 @ %r359 bra $L38; .loc 1 167 10 mov.u32 %r327,0; st.u32 [%r130],%r327; bra $L30; $L38: mov.u64 %r125,%r129; .loc 1 164 9 mov.u64 %r111,%r365; .loc 1 165 9 mov.u32 %r109,%r366; .loc 1 170 13 mov.u64 %r124,0; $L29: .loc 1 173 6 setp.eq.u64 %r328,%r111,0; @ %r328 bra $L31; .loc 1 40125188332,[%value_in]; } .loc 1 173 6 @ %r367 bra $L32; .loc 1 173 22 setp.le.s32 %r335,%r332,0; @ ! 
%r335 bra $L33; bra $L31; $L32: setp.ge.s32 %r336,%r332,0; @ %r336 bra $L33; $L31: cvt.u32.u64 %r337,%r124; add.u32 %r109,%r337,1; .loc 1 170 13 mov.u64 %r111,%r125; $L33: .loc 1 170 28 add.u64 %r124,%r124,1; .loc 1 170 36 add.u64 %r125,%r125,%r36; .loc 1 170 6 setp.gt.s64 %r338,%r121,%r124; @ %r338 bra $L29; .loc 1 181 12 st.u32 [%r130],%r109; $L30: .loc 1 185 15 add.u64 %r181,%r181,1; .loc 1 186 12 add.u64 %r129,%r129,%r81; .loc 1 187 12 add.u64 %r130,%r130,%r84; .loc 1 189 13 setp.ne.u64 %r339,%r181,%r136; @ %r339 bra $L25; .loc 1 196 9 add.u64 %r126,%r129,%r144; .loc 1 197 9 add.u64 %r127,%r130,%r146; .loc 1 199 7 @ %r360 bra $L1; add.u64 %r137,%frame,368; mov.u64 %r155,8; .loc 1 198 5 mov.u64 %r128,1; add.u64 %r362,%frame,120; add.u64 %r361,%frame,240; bra $L35; $L36: .loc 1 193 13 st.u64 [%r137],%r365; .loc 1 196 23 mul.lo.u64 %r345,%r97,%r96; .loc 1 196 9 sub.u64 %r126,%r129,%r345; .loc 1 197 23 mul.lo.u64 %r346,%r99,%r96; .loc 1 197 9 shl.b64 %r347,%r346,2; sub.u64 %r127,%r130,%r347; .loc 1 198 5 add.u64 %r128,%r128,1; .loc 1 199 7 add.u64 %r137,%r137,8; add.u64 %r155,%r155,8; setp.eq.u64 %r348,%r128,%r371; @ %r348 bra $L1; $L35: .loc 1 207 16 ld.u64 %r349,[%r137]; add.u64 %r96,%r349,1; st.u64 [%r137],%r96; .loc 1 208 23 add.u64 %r351,%r362,%r155; ld.u64 %r97,[%r351]; .loc 1 208 13 add.u64 %r129,%r126,%r97; .loc 1 209 23 add.u64 %r352,%frame,%r155; ld.u64 %r99,[%r352]; .loc 1 209 13 shl.b64 %r353,%r99,2; add.u64 %r130,%r127,%r353; .loc 1 189 32 add.u64 %r355,%r361,%r155; ld.u64 %r102,[%r355]; .loc 1 189 13 setp.eq.u64 %r356,%r96,%r102; @ %r356 bra $L36; mov.u64 %r181,0; bra $L25; $L53: .loc 1 102 6 ld.u64 %r357,[%r184]; setp.eq.u64 %r358,%r357,0; @ ! %r358 bra $L11; bra $L37; $L1: .loc 1 21_gfortran_mminloc1_4_s1 .visible .func _gfortran_mminloc1_4_s1608u64 %r218; .reg .u64 %r227; .reg .u64 %r230; .reg .u64pred %r289; .reg .predpred %r304; .reg .pred %r305; .reg .u64u64 %r313; .reg .u64u64pred %r324; .reg .u64predu16 %r371; .reg .u16 %r372; .reg .u32 %r373; .reg .pred %r374; .reg .u64 %r375; .reg .u64 %r376; .reg .predu64u64 %r399; .reg .predu64 %r410; .reg .pred %r411; .reg .pred.reg .u32 %r417; .reg .predpred %r427; .reg .pred %r428; .reg .predpred %r461; .reg .predmov.u64 %r230,%ar0; mov.u64 %r231,%ar1; mov.u64 %r232,%ar2; mov.u64 %r233,%ar3; mov.u32 %r234,%ar4; mov.u64 %r235,%ar5; .loc 1 244 6 setp.ne.u64 %r236,%r233,0; @ %r236 bra $L2234235; call _gfortran_minloc1_4_s1); } .loc 1 251 7 bra $L54; $L55: .loc 1 254 10 ld.u64 %r22,[%r232]; .loc 1 254 7 add.u64 %r151,%r22,-1; .loc 1 255 10 ld.s8 %r23,[%r231+28]; .loc 1 255 38 add.u32 %r242,%r23,-1; .loc 1 255 8 cvt.s64.s32 %r152,%r242; .loc 1 258 7 shr.u64 %r244,%r151,63; set.u32.gt.s64 %r246,%r151,%r152; neg.s32 %r247,%r246; cvt.u16.u64 %r250,%r244; cvt.u16.u32 %r251,%r247; or.b16 %r249,%r250,%r251; .loc 1 258 6.loc 1 260 7 cvt.u32.u32 %r257,%r23; cvt.s64.s8 %r256,%r257; st.u64 [%stack+8],%r256; st.u64 [%stack],%r22; cvta.const.u64 %r22_gfortran_runtime_error7: .loc 1 265 9 add.u64 %r460,%r151,%r151; add.u64 %r459,%r460,%r151; shl.b64 %r262,%r459,3; add.u64 %r263,%r231,%r262; ld.u64 %r266,[%r263+56]; add.u64 %r265,%r266,1; .loc 1 265 7 ld.u64 %r273,[%r263+48]; sub.u64 %r153,%r265,%r273; .loc 1 266 6 setp.le.s64 %r274,%r153,0; @ %r274 bra $L54; .loc 1 269 9 ld.u64 %r164,[%r233]; .loc 1 271 15 ld.u64 %r34,[%r233+16]; .loc 1 273 22 cvt.u32.u64 %r35,%r34; .loc 1 273 53 add.u32 %r275,%r35,-4; and.b32 %r276,%r275,-5; set.u32.eq.u32 %r278,%r276,0; neg.s32 %r279,%r278; .loc 1 273 22 add.u32 %r280,%r35,-1; set.u32.le.u32 %r282,%r280,1; 
neg.s32 %r283,%r282; .loc 1 273 6 cvt.u16.u32 %r285,%r279; cvt.u16.u32 %r286,%r283; or.b16 %r284,%r285,%r286; cvt.u32.u16 %r287,%r284; cvt.u16.u8 %r288,%r287; setp.ne.u16 %r289,%r288,0; @ %r289 bra $L59; .loc 1 275 7 setp.ne.u32 %r291,%r35,16; @ %r291 bra $L60; $L59: .loc 1 282 11 shl.b64 %r295,%r459,3; add.u64 %r296,%r231,%r295; ld.u64 %r40,[%r296+40]; .loc 1 283 12 add.u64 %r302,%r233,%r295; ld.u64 %r43,[%r302+40]; .loc 1 285 3 setp.ne.u64 %r304,%r151,0; @ %r304 bra $L61; $L67: .loc 1 295 3 setp.lt.s64 %r305,%r151,%r152; @ %r305 bra $L62; bra $L113; $L60: .loc 1 280 5 cvta.const.u64 %r306306_gfortran_runtime_error61: add.u64 %r212,%r231,40; add.u64 %r214,%frame,240; add.u64 %r215,%r233,40; mov.u64 %r217,%frame; add.u64 %r218,%frame,360; add.u64 %r308,%r231,16; add.u64 %r310,%r22,%r22; add.u64 %r311,%r310,%r22; shl.b64 %r312,%r311,3; add.u64 %r227,%r308,%r312; .loc 1 292 12 mov.u64 %r472,0; $L66: .loc 1 287 51 ld.u64 %r314,[%r212]; mul.lo.u64 %r313,%r314,%r235; .loc 1 287 18 st.u64 [%r214],%r313; .loc 1 288 20 ld.u64 %r316,[%r215]; mul.lo.u64 %r315,%r316,%r34; .loc 1 288 18 st.u64 [%r217],%r315; .loc 1 289 19 ld.u64 %r318,[%r212+16]; add.u64 %r317,%r318,1; ld.u64 %r319,[%r212+8]; sub.u64 %r58,%r317,%r319; .loc 1 291 10 setp.lt.s64 %r320,%r58,0; @ %r320 bra $L64; .loc 1 289 17 st.u64 [%r218],%r58; bra $L65; $L64: .loc 1 292 12 st.u64 [%r218],%r472; $L65: .loc 1 285 3 add.u64 %r212,%r212,24; add.u64 %r214,%r214,8; add.u64 %r215,%r215,24; add.u64 %r217,%r217,8; add.u64 %r218,%r218,8; setp.ne.u64 %r322,%r212,%r227; @ %r322 bra $L66; bra $L67; $L113: .loc 1 305 6 ld.u64 %r323,[%r230]; setp.eq.u64 %r324,%r323,0; @ ! %r324 bra $L69; bra $L68; $L62: add.u64 %r326,%r22,%r22; add.u64 %r327,%r326,%r22; shl.b64 %r328,%r327,3; add.u64 %r67,%r328,40; add.u64 %r84,%r231,%r67; shl.b64 %r329,%r22,3; add.u64 %r171,%r329,-8; add.u64 %r457,%frame,240; add.u64 %r38,%r457,%r171; add.u64 %r173,%r233,%r67; add.u64 %r77,%frame,%r171; add.u64 %r455,%frame,360; add.u64 %r194,%r455,%r171; add.u64 %r332,%r231,40; cvt.u32.u32 %r334,%r23; cvt.s64.s8 %r333,%r334; add.u64 %r336,%r333,%r333; add.u64 %r337,%r336,%r333; shl.b64 %r338,%r337,3; add.u64 %r209,%r332,%r338; .loc 1 302 12 mov.u64 %r471,0; $L72: .loc 1 297 55 ld.u64 %r340,[%r84]; mul.lo.u64 %r339,%r340,%r235; .loc 1 297 18 st.u64 [%r38],%r339; .loc 1 298 20 ld.u64 %r342,[%r173]; mul.lo.u64 %r341,%r342,%r34; .loc 1 298 18 st.u64 [%r77],%r341; .loc 1 299 19 ld.u64 %r344,[%r84+16]; add.u64 %r343,%r344,1; ld.u64 %r345,[%r84+8]; sub.u64 %r74,%r343,%r345; .loc 1 301 10 setp.lt.s64 %r346,%r74,0; @ %r346 bra $L70; .loc 1 299 17 st.u64 [%r194],%r74; bra $L71; $L70: .loc 1 302 12 st.u64 [%r194],%r471; $L71: .loc 1 295 3 add.u64 %r84,%r84,24; add.u64 %r38,%r38,8; add.u64 %r173,%r173,24; add.u64 %r77,%r77,8; add.u64 %r194,%r194,8; setp.ne.u64 %r348,%r84,%r209; @ %r348 bra $L72; bra $L114; $L68: .loc 1 309 7 setp.eq.u64 %r349,%r152,0; @ %r349 bra $L74; add.u64 %r455,%frame,360; $L97: add.u64 %r133,%r230,40; mov.u64 %r130,%r455; cvt.u32.u32 %r351,%r23; cvt.s64.s8 %r350,%r351; add.u64 %r353,%r350,%r350; add.u64 %r354,%r353,%r350; shl.b64 %r355,%r354,3; add.u64 %r356,%r230,16; add.u64 %r92,%r355,%r356; .loc 1 312 10 mov.u64 %r135,1; .loc 1 316 4 mov.u64 %r375,0; bra $L75; $L74: .loc 1 320 20 add.u32 %r357,%r23,-2; cvt.s64.s32 %r82,%r357; add.u64 %r359,%r82,%r82; add.u64 %r360,%r359,%r82; shl.b64 %r361,%r360,3; add.u64 %r362,%r230,%r361; .loc 1 320 67 shl.b64 %r364,%r82,3; add.u64 %r365,%frame,%r364; .loc 1 320 59 ld.u64 %r367,[%r362+40]; ld.u64 %r368,[%r365+360]; mul.lo.u64 
%r158,%r367,%r368; .loc 1 322 24 mov.u64 %r369,0; st.u64 [%r230+8],%r369; .loc 1 323 28 cvt.u16.u32 %r372,%r23; add.u16 %r371,%r372,-1; cvt.u32.u16 %r373,%r371; st.u8 [%r230+28],%r373; .loc 1 325 10 setp.eq.u64 %r374,%r158,0; @ %r374 bra $L76; bra $L115; $L78: .loc 1 314 47 mul.lo.u64 %r135,%r79,%r135; $L75: .loc 1 316 4 st.u64 [%r133+8],%r375; ld.u64 %r79,[%r130]; add.u64 %r376,%r79,-1; st.u64 [%r133+16],%r376; st.u64 [%r133],%r135; .loc 1 309 7 add.u64 %r133,%r133,24; add.u64 %r130,%r130,8; setp.ne.u64 %r377,%r92,%r133; @ %r377 bra $L78; bra $L74; $L76: .loc 1 328 4 st.u64 [%r230+48],%r158; mov.u64 %r379,-1; st.u64 [%r230+56],%r379; mov.u64 %r380,1; st.u64 [%r230+40],%r380; .loc 1 329 4 bra $L54; $L115: .loc 1 332 24382; call (%value_in),_gfortrani_xmallocarray83,[%value_in]; } .loc 1 332 22 st.u64 [%r230],%r383; bra $L79; $L69: .loc 1 337 19 ld.s8 %r385,[%r230+28]; .loc 1 337 10 setp.eq.u64 %r386,%r385,%r152; @ %r386 bra $L80; .loc 1 338 2 cvta.const.u64 %r387387_gfortran_runtime_error80: .loc 1 340 11 cvta.global.u64 %r389,_gfortrani_compile_options; .loc 1 340 10 ld.u32 %r390,[%r389+36]; setp.eq.u32 %r391,%r390,0; @ %r391 bra $L79; .loc 1 342 4 add.u64 %r455,%frame,360; cvta.const.u64 %r395,$LC2r455394395; call _gfortrani_bounds_ifunction_return344 4 cvta.const.u64 %r399,$LC3r2399395; call _gfortrani_bounds_equal_extents$L79: .loc 1 349 3 setp.ne.u64 %r401,%r152,0; @ %r401 bra $L81; $L84: .loc 1 357 8 ld.u64 %r165,[%r230]; .loc 1 358 8 ld.u64 %r163,[%r231]; .loc 1 360 9 setp.ne.u64 %r402,%r163,0; @ %r402 bra $L82; bra $L54; $L81: add.u64 %r174,%frame,480; add.u64 %r149,%r230,40; add.u64 %r147,%frame,120; cvt.u32.u32 %r404,%r23; cvt.s64.s8 %r403,%r404; add.u64 %r134,%r403,-1; .loc 1 349 3 mov.u64 %r169,0; add.u64 %r455,%frame,360; .loc 1 351 16 mov.u64 %r405,%r169; $L83: st.u64 [%r174],%r405; .loc 1 352 18 ld.u64 %r406,[%r149]; st.u64 [%r147],%r406; .loc 1 353 17 shl.b64 %r408,%r169,3; add.u64 %r409,%r455,%r408; .loc 1 353 10 ld.u64 %r410,[%r409]; setp.le.s64 %r411,%r410,0; @ %r411 bra $L54; .loc 1 349 26 add.u64 %r169,%r169,1; .loc 1 349 3 add.u64 %r174,%r174,8; add.u64 %r149,%r149,24; add.u64 %r147,%r147,8; setp.ne.u64 %r412,%r134,%r169; @ %r412 bra $L83; bra $L84; $L82: .loc 1 282 44 mul.lo.u64 %r42,%r40,%r235; .loc 1 283 12 mul.lo.u64 %r46,%r43,%r34; .loc 1 396 22 ld.u64 %r99,[%frame+240]; .loc 1 397 23 ld.u64 %r101,[%frame]; .loc 1 398 22 ld.u64 %r103,[%frame+120]; .loc 1 398 12 shl.b64 %r104,%r103,2; .loc 1 400 32 ld.u64 %r29,[%frame+360]; .loc 1 407 23 mul.lo.u64 %r166,%r99,%r29; .loc 1 408 24 mul.lo.u64 %r413,%r101,%r29; .loc 1 408 10 neg.s64 %r177,%r413; .loc 1 409 23 mul.lo.u64 %r414,%r103,%r29; .loc 1 409 9 shl.b64 %r415,%r414,2; neg.s64 %r182,%r415; setp.le.s64 %r461,%r152,1; .loc 1 384 13 setp.eq.u32 %r464,%r234,0; cvt.u32.u32 %r465,%r23; cvt.s64.s8 %r466,%r465; add.u64 %r467,%r466,-1; add.u64 %r468,%frame,240; add.u64 %r469,%frame,360; add.u64 %r470,%frame,120; $L98: .loc 1 349 3 mov.u64 %r168,%r164; mov.u64 %r141,%r163; .loc 1 372 9 mov.u64 %r159,0; $L88: .loc 1 375 7 ld.s8 %r199,[%r168]; .loc 1 375 6 setp.eq.u32 %r456,%r199,0; @ %r456 bra $L85; .loc 1 378 11 cvt.u32.u64 %r417,%r159; add.u32 %r139,%r417,1; .loc 1 382 6 setp.le.s64 %r418,%r153,%r159; @ %r418 bra $L86; add.u64 %r180,%r159,1; mov.u64 %r167,%r141; bra $L87; $L85: .loc 1 372 24 add.u64 %r159,%r159,1; .loc 1 372 32 add.u64 %r141,%r141,%r42; .loc 1 372 47 add.u64 %r168,%r168,%r46; .loc 1 372 2 setp.ne.u64 %r419,%r153,%r159; @ %r419 bra $L88; .loc 1 371 9 mov.u32 %r139,%r199; bra $L86; $L87: .loc 1 384 6 @ %r456 
bra $L89; .loc 1 40(%value_in),memcmp,(%out_arg1,%out_arg2,%out_arg3); ld.param.u32 %r424,[%value_in]; } .loc 1 384 13 @ %r464 bra $L90; setp.le.s32 %r427,%r424,0; @ %r427 bra $L91; bra $L89; $L90: setp.ge.s32 %r428,%r424,0; @ %r428 bra $L89; $L91: .loc 1 388 14 cvt.u32.u64 %r139,%r180; mov.u64 %r141,%r167; $L89: .loc 1 382 31 add.u64 %r167,%r167,%r42; .loc 1 382 46 add.u64 %r168,%r168,%r46; .loc 1 382 6 add.u64 %r181,%r180,1; setp.eq.u64 %r429,%r153,%r180; @ %r429 bra $L86; .loc 1 384 7 ld.s8 %r199,[%r168]; mov.u64 %r180,%r181; setp.eq.u32 %r456,%r199,0; bra $L87; $L86: .loc 1 392 8 st.u32 [%r165],%r139; .loc 1 395 15 ld.u64 %r430,[%frame+480]; add.u64 %r97,%r430,1; .loc 1 396 12 add.u64 %r163,%r163,%r99; .loc 1 397 13 add.u64 %r164,%r164,%r101; .loc 1 398 12 add.u64 %r165,%r165,%r104; .loc 1 400 13 setp.eq.u64 %r431,%r29,%r97; @ %r431 bra $L92; .loc 1 395 15 st.u64 [%frame+480],%r97; bra $L98; $L92: .loc 1 404 13 mov.u64 %r432,0; st.u64 [%frame+480],%r432; .loc 1 408 10 add.u64 %r160,%r164,%r177; .loc 1 409 9 add.u64 %r161,%r165,%r182; .loc 1 411 7 @ %r461 bra $L54; add.u64 %r150,%frame,488; .loc 1 407 23 mov.u64 %r107,%r166; .loc 1 411 7 mov.u64 %r197,8; .loc 1 410 5 mov.u64 %r162,1; bra $L95; $L96: .loc 1 404 13 st.u64 [%r150],%r432; .loc 1 407 23 mul.lo.u64 %r107,%r120,%r119; .loc 1 408 24 mul.lo.u64 %r438,%r122,%r119; .loc 1 408 10 sub.u64 %r160,%r164,%r438; .loc 1 409 23 mul.lo.u64 %r439,%r124,%r119; .loc 1 409 9 shl.b64 %r440,%r439,2; sub.u64 %r161,%r165,%r440; .loc 1 410 5 add.u64 %r162,%r162,1; .loc 1 411 7 add.u64 %r150,%r150,8; add.u64 %r197,%r197,8; setp.eq.u64 %r441,%r162,%r467; @ %r441 bra $L54; $L95: .loc 1 419 16 ld.u64 %r442,[%r150]; add.u64 %r119,%r442,1; st.u64 [%r150],%r119; .loc 1 420 23 add.u64 %r444,%r468,%r197; ld.u64 %r120,[%r444]; .loc 1 420 13 sub.u64 %r445,%r120,%r107; add.u64 %r163,%r163,%r445; .loc 1 421 24 add.u64 %r446,%frame,%r197; ld.u64 %r122,[%r446]; .loc 1 421 14 add.u64 %r164,%r160,%r122; .loc 1 422 23 add.u64 %r448,%r470,%r197; ld.u64 %r124,[%r448]; .loc 1 422 13 shl.b64 %r449,%r124,2; add.u64 %r165,%r161,%r449; .loc 1 400 32 add.u64 %r451,%r469,%r197; ld.u64 %r128,[%r451]; .loc 1 400 13 setp.eq.u64 %r452,%r119,%r128; @ %r452 bra $L96; bra $L98; $L114: .loc 1 305 6 ld.u64 %r453,[%r230]; setp.eq.u64 %r454,%r453,0; @ ! 
%r454 bra $L69; bra $L97; $L54: .loc 1 42_gfortran_sminloc1_4_s1 .visible .func _gfortran_sminloc1_4_s1local .align 16 .b8 %frame_ar[368u64 %r89; .reg .u643u64 %r125; .reg .u64predu16 %r180; .reg .predpred %r222; .reg .predu64 %r231; .reg .u16 %r233; .reg .u16 %r234; .reg .u32pred %r259; .reg .pred %r260; .reg .u64u64 %r267; .reg .u32pred %r303; .reg .u64 %r304; .reg .pred %r305; .reg .u64 %r306; .reg .u64 %r307; .reg .u32mov.u64 %r156,%ar0; mov.u64 %r157,%ar1; mov.u64 %r158,%ar2; mov.u64 %r159,%ar3; mov.u32 %r160,%ar4; mov.u64 %r161,%ar5; .loc 1 449 6 setp.eq.u64 %r162,%r159,0; @ %r162 bra $L117; .loc 1 449 20 ld.u32 %r163,[%r159]; setp.eq.u32 %r164,%r163,0; @ %r164 bra $L118; $L117: .loc 1 452 7r1571580161; call _gfortran_minloc1_4_s1); } .loc 1 456 7 bra $L116; $L118: .loc 1 459 10 ld.u64 %r23,[%r158]; .loc 1 459 7 add.u64 %r95,%r23,-1; .loc 1 460 10 ld.s8 %r24,[%r157+28]; .loc 1 460 38 add.u32 %r26,%r24,-1; .loc 1 460 8 cvt.s64.s32 %r306,%r26; .loc 1 462 7 shr.u64 %r171,%r95,63; set.u32.gt.s64 %r173,%r95,%r306; neg.s32 %r174,%r173; cvt.u16.u64 %r177,%r171; cvt.u16.u32 %r178,%r174; or.b16 %r176,%r177,%r178; .loc 1 462 6 cvt.u32.u16 %r179,%r176; cvt.u16.u8 %r180,%r179; setp.ne.u16 %r181,%r180,0; @ %r181 bra $L120; .loc 1 469 3 setp.ne.u64 %r182,%r95,0; @ %r182 bra $L121; $L127: .loc 1 477 3 setp.lt.s64 %r183,%r95,%r306; @ %r183 bra $L122; bra $L175; $L120: .loc 1 464 7 cvt.u32.u32 %r186,%r24; cvt.s64.s8 %r185,%r186; st.u64 [%stack+8],%r1851841stack; call _gfortran_runtime_error121: add.u64 %r92,%r157,48; add.u64 %r114,%frame,120; add.u64 %r146,%r157,56; add.u64 %r188,%r157,24; add.u64 %r190,%r23,%r23; add.u64 %r191,%r190,%r23; shl.b64 %r192,%r191,3; add.u64 %r152,%r188,%r192; .loc 1 474 12 mov.u64 %r312,0; $L126: .loc 1 471 19 ld.u64 %r194,[%r146]; ld.u64 %r196,[%r92]; sub.u64 %r195,%r194,%r196; .loc 1 471 50 mad.lo.u64 %r38,%r195,%r161,%r161; .loc 1 473 10 setp.le.s64 %r197,%r38,0; @ %r197 bra $L124; .loc 1 471 17 st.u64 [%r114],%r38; bra $L125; $L124: .loc 1 474 12 st.u64 [%r114],%r312; $L125: .loc 1 469 3 add.u64 %r92,%r92,24; add.u64 %r114,%r114,8; add.u64 %r146,%r146,24; setp.ne.u64 %r199,%r92,%r152; @ %r199 bra $L126; bra $L127; $L175: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r200,%r107,0; @ ! 
%r200 bra $L129; bra $L128; $L122: add.u64 %r202,%r23,%r23; add.u64 %r203,%r202,%r23; shl.b64 %r204,%r203,3; add.u64 %r205,%r204,48; add.u64 %r83,%r157,%r205; add.u64 %r304,%frame,120; shl.b64 %r207,%r23,3; add.u64 %r208,%r207,-8; add.u64 %r144,%r304,%r208; add.u64 %r209,%r157,48; cvt.u32.u32 %r211,%r24; cvt.s64.s8 %r210,%r211; add.u64 %r213,%r210,%r210; add.u64 %r214,%r213,%r210; shl.b64 %r215,%r214,3; add.u64 %r72,%r209,%r215; .loc 1 483 12 mov.u64 %r311,0; $L132: .loc 1 480 2 ld.u64 %r217,[%r83+8]; ld.u64 %r219,[%r83]; sub.u64 %r218,%r217,%r219; .loc 1 480 37 mad.lo.u64 %r46,%r218,%r161,%r161; .loc 1 482 10 setp.le.s64 %r220,%r46,0; @ %r220 bra $L130; .loc 1 479 17 st.u64 [%r144],%r46; bra $L131; $L130: .loc 1 483 12 st.u64 [%r144],%r311; $L131: .loc 1 477 3 add.u64 %r83,%r83,24; add.u64 %r144,%r144,8; setp.ne.u64 %r222,%r72,%r83; @ %r222 bra $L132; bra $L176; $L128: .loc 1 490 7 setp.eq.u64 %r223,%r306,0; @ %r223 bra $L134; add.u64 %r304,%frame,120; $L154: add.u64 %r125,%r156,40; mov.u64 %r123,%r304; add.u64 %r224,%r156,16; cvt.u32.u32 %r226,%r24; cvt.s64.s8 %r225,%r226; add.u64 %r228,%r225,%r225; add.u64 %r229,%r228,%r225; shl.b64 %r230,%r229,3; add.u64 %r117,%r224,%r230; .loc 1 493 10 mov.u64 %r89,1; .loc 1 497 4 mov.u64 %r249,0; bra $L135; $L134: .loc 1 501 24 mov.u64 %r231,0; st.u64 [%r156+8],%r231; .loc 1 502 28 cvt.u16.u32 %r234,%r24; add.u16 %r233,%r234,-1; cvt.u32.u16 %r235,%r233; st.u8 [%r156+28],%r235; .loc 1 504 20 add.u32 %r236,%r24,-2; cvt.s64.s32 %r56,%r236; add.u64 %r238,%r56,%r56; add.u64 %r239,%r238,%r56; shl.b64 %r240,%r239,3; add.u64 %r241,%r156,%r240; .loc 1 504 67 shl.b64 %r243,%r56,3; add.u64 %r244,%frame,%r243; .loc 1 504 59 ld.u64 %r246,[%r241+40]; ld.u64 %r247,[%r244+120]; mul.lo.u64 %r103,%r246,%r247; .loc 1 506 10 setp.eq.u64 %r248,%r103,0; @ %r248 bra $L136; bra $L177; $L138: .loc 1 495 48 mul.lo.u64 %r89,%r52,%r89; $L135: .loc 1 497 4 st.u64 [%r125+8],%r249; ld.u64 %r52,[%r123]; add.u64 %r250,%r52,-1; st.u64 [%r125+16],%r250; st.u64 [%r125],%r89; .loc 1 490 7 add.u64 %r125,%r125,24; add.u64 %r123,%r123,8; setp.ne.u64 %r251,%r117,%r125; @ %r251 bra $L138; bra $L134; $L136: .loc 1 509 4 st.u64 [%r156+48],%r103; mov.u64 %r253,-1; st.u64 [%r156+56],%r253; mov.u64 %r254,1; st.u64 [%r156+40],%r254; .loc 1 510 4 bra $L116; $L177: .loc 1 513 24256; call (%value_in),_gfortrani_xmallocarray257,[%value_in]; } mov.u64 %r107,%r257; .loc 1 513 22 st.u64 [%r156],%r107; $L142: .loc 1 539 3 setp.ne.u64 %r259,%r306,0; @ %r259 bra $L139; bra $L140; $L129: .loc 1 517 19 ld.s8 %r61,[%r156+28]; .loc 1 517 10 setp.eq.u64 %r260,%r61,%r306; @ %r260 bra $L141; .loc 1 518 2 st.u64 [%stack+8],%r306; st.u64 [%stack],%r61;_gfortran_runtime_error141: .loc 1 523 11 cvta.global.u64 %r263,_gfortrani_compile_options; .loc 1 523 10 ld.u32 %r264,[%r263+36]; setp.eq.u32 %r265,%r264,0; @ %r265 bra $L142; .loc 1 525 4 setp.eq.u64 %r266,%r306,0; @ %r266 bra $L140; add.u64 %r115,%r156,48; add.u64 %r105,%frame,120; cvt.u32.u32 %r268,%r24; cvt.s64.s8 %r267,%r268; add.u64 %r86,%r267,-1; .loc 1 525 10 mov.u64 %r113,0; $L144: .loc 1 529 21 ld.u64 %r270,[%r115+8]; add.u64 %r269,%r270,1; .loc 1 529 19 ld.u64 %r271,[%r115]; sub.u64 %r100,%r269,%r271; .loc 1 530 18 ld.u64 %r70,[%r105]; .loc 1 525 25 add.u64 %r113,%r113,1; .loc 1 530 11 setp.eq.u64 %r272,%r70,%r100; @ %r272 bra $L143; .loc 1 531 3 st.u64 [%stack+16],%r70; st.u64 [%stack+8],%r100; st.u64 [%stack],%r113;_gfortran_runtime_error143: .loc 1 525 4 add.u64 %r115,%r115,24; add.u64 %r105,%r105,8; setp.ne.u64 %r275,%r86,%r113; @ %r275 bra $L144; 
$L139: .loc 1 541 16 cvt.u16.u32 %r277,%r24; setp.le.s16 %r278,%r277,1; @ %r278 bra $L145; shl.b64 %r276,%r306,3; bra $L146; $L145: mov.u64 %r276,8; $L146: add.u64 %r280,%frame,240; mov.u32 %r2842766,[%value_in]; } add.u64 %r130,%r156,40; mov.u64 %r128,%frame; mov.u64 %r109,0; $L147: .loc 1 542 18 ld.u64 %r288,[%r130]; st.u64 [%r128],%r288; .loc 1 539 26 add.u64 %r109,%r109,1; .loc 1 539 3 add.u64 %r130,%r130,24; add.u64 %r128,%r128,8; setp.gt.s64 %r289,%r306,%r109; @ %r289 bra $L147; $L140: ld.u64 %r88,[%frame+240]; .loc 1 551 22 ld.u64 %r75,[%frame]; .loc 1 551 12 shl.b64 %r76,%r75,2; .loc 1 553 32 ld.u64 %r65,[%frame+120]; .loc 1 560 23 mul.lo.u64 %r49,%r65,%r75; setp.le.s64 %r305,%r306,1; .loc 1 549 13 mov.u32 %r290,0; cvt.u32.u32 %r308,%r24; cvt.s64.s8 %r309,%r308; add.u64 %r310,%r309,-1; $L153: st.u32 [%r107],%r290; .loc 1 550 15 add.u64 %r88,%r88,1; .loc 1 551 12 add.u64 %r107,%r107,%r76; .loc 1 553 13 setp.ne.u64 %r291,%r65,%r88; @ %r291 bra $L153; .loc 1 562 7 @ ! %r305 bra $L178; bra $L116; $L152: .loc 1 557 13 st.u64 [%r108],%r307; .loc 1 560 23 mul.lo.u64 %r78,%r81,%r80; .loc 1 561 5 add.u64 %r106,%r106,1; .loc 1 562 7 add.u64 %r108,%r108,8; add.u64 %r138,%r138,8; add.u64 %r137,%r137,8; setp.ne.u64 %r294,%r106,%r310; @ %r294 bra $L151; bra $L116; $L178: add.u64 %r108,%frame,248; add.u64 %r138,%frame,8; add.u64 %r137,%frame,128; .loc 1 560 23 mov.u64 %r78,%r49; .loc 1 561 5 mov.u64 %r106,1; .loc 1 557 13 mov.u64 %r307,0; $L151: .loc 1 566 16 ld.u64 %r299,[%r108]; add.u64 %r80,%r299,1; st.u64 [%r108],%r80; .loc 1 567 23 ld.u64 %r81,[%r138]; .loc 1 567 13 sub.u64 %r300,%r81,%r78; shl.b64 %r301,%r300,2; add.u64 %r107,%r107,%r301; .loc 1 553 32 ld.u64 %r84,[%r137]; .loc 1 553 13 setp.eq.u64 %r302,%r80,%r84; @ %r302 bra $L152; .loc 1 557 13 mov.u64 %r88,0; bra $L153; $L176: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r303,%r107,0; @ ! 
[binary archive residue — closing "ret; }" of the preceding PTX function]

minloc1_4_s4.o/ 1622802212

[PTX object member, heavily garbled by binary extraction; the instruction stream is interleaved with non-text data and cannot be reconstructed faithfully. Recoverable content:
  - source file: libgfortran/generated/minloc1_4_s4.c
  - visible functions: _gfortran_minloc1_4_s4, _gfortran_mminloc1_4_s4 (masked variant), _gfortran_sminloc1_4_s4 (scalar-mask variant)
  - external symbols: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_memcmp_char4, _gfortrani_bounds_equal_extents
  - decimal-encoded string constants that decode to the MINLOC runtime error messages, e.g. "... incorrect in MINLOC intrinsic: is %ld, should be between 1 and %ld", "Funny sized logical array", "rank of return array incorrect in MINLOC intrinsic: is %ld, should be %ld", "MASK argument ...", "Incorrect extent in return value of MINLOC intrinsic in dimension %ld: is %ld, should be %ld"
  - the PTX bodies of the three functions: a bounds check on the reduced dimension (the "should be between 1 and %ld" message), result allocation via _gfortrani_xmallocarray, optional bounds verification via _gfortrani_bounds_ifunction_return / _gfortrani_bounds_equal_extents, and reduction loops that compare character(kind=4) elements through _gfortrani_memcmp_char4 and store 4-byte result indices]
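These three entry points implement the Fortran MINLOC intrinsic reduced along one dimension for character(kind=4) data with a 4-byte index result. Purely as orientation for reading the PTX summarized above, here is a minimal C sketch of the innermost, unmasked rank-1 reduction; it is not the libgfortran implementation (array descriptors, the DIM bounds check, result allocation and the kind=4 comparison via _gfortrani_memcmp_char4 are all omitted), and the helper names are hypothetical.

#include <stdio.h>
#include <string.h>

/* Hypothetical stand-in for the string comparison; the real runtime
 * compares 4-byte character units via _gfortrani_memcmp_char4.  Here
 * we simply compare fixed-length byte strings. */
static int str_cmp(const char *a, const char *b, size_t len)
{
    return memcmp(a, b, len);
}

/* Sketch of an unmasked MINLOC over a rank-1 array of fixed-length
 * strings: returns the 1-based index of the smallest element, or 0
 * for a zero-sized array (Fortran semantics). */
static int minloc_str(const char *base, size_t n, size_t len)
{
    if (n == 0)
        return 0;

    size_t best = 0;
    for (size_t i = 1; i < n; i++)
        if (str_cmp(base + i * len, base + best * len, len) < 0)
            best = i;

    return (int)(best + 1);          /* Fortran indices are 1-based */
}

int main(void)
{
    /* Four 3-character elements stored contiguously, as Fortran would. */
    const char data[] = "foo" "bar" "baz" "qux";
    printf("MINLOC = %d\n", minloc_str(data, 4, 3));   /* expected: 2 ("bar") */
    return 0;
}

The real routines additionally walk an n-dimensional descriptor and produce a rank n-1 result array; the sketch shows only the inner comparison loop.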
%r303 bra $L126; bra $L151; $L113: .loc 1 571 1 ret; } minloc1_8_s1.o/ 1622802213_gfortran_minloc1_8_s1 .visible .func _gfortran_minloc1_8_s1in_ar4); .file 1 "../../../libgfortran/generated/minloc1_8_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mminloc1_8_s1 .visible .func _gfortran_mminloc1_8_s1_gfortran_sminloc1_8_s1 .visible .func _gfortran_sminloc1_8_s1VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returmemcmpmem_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extents7] = {77,73,78,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,914,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,1077,65,83,75,32,97,114,103,117,109,101,11073,110,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99,32,105,110,32,100,105,109,101,110,115,105,111,110,32,37,108,100,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,117,101114,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,77,73,78,76,79,67,32,105,110,116,114,105,110,115,105,99_gfortran_minloc1_8_s1 .visible .func _gfortran_minloc1_8_s1in_ar4) {480u64 %r59u64pred %r235; .reg .predu16 %r273; .reg .u16u32pred %r331; .reg .predu64pred %r357; .reg .predu64 %r370; .reg .u64 %r371; mov.u64 %r183,%ar0; mov.u64 %r184,%ar1; mov.u64 %r185,%ar2; mov.u32 %r186,%ar3; mov.u64 %r187,%ar4; .loc 1 70 10 ld.s8 %r22,[%r184+28]; .loc 1 70 38 add.u32 %r188,%r22,-1; .loc 1 70 8 cvt.s64.s32 %r116,%r188; .loc 1 71 10 ld.u64 %r25,[%r185]; .loc 1 71 7 add.u64 %r117,%r25,-1; .loc 1 73 7 shr.u64 %r190,%r117,63; set.u32.lt.s64 %r192,%r116,%r117; neg.s32 %r193,%r192; cvt.u16.u64 %r196,%r190; cvt.u16.u32 %r197,%r193; or.b16 %r195,%r196,%r197; .loc 1 73 6 cvt.u32.u16 %r198,%r195; cvt.u16.u8 %r199,%r198; setp.eq.u16 %r200,%r199,0; @ %r200 bra $L2; .loc 1 75 7 cvt.u32.u32 %r203,%r22; cvt.s64.s8 %r202,%r203; st.u64 [%stack+8],%r20201201_gfortran_runtime_errorr206,%r117,%r117; add.u64 %r207,%r206,%r117; shl.b64 %r208,%r207,3; add.u64 %r209,%r184,%r208; ld.u64 %r31,[%r209+56]; ld.u64 %r33,[%r209+48]; .loc 1 83 11 ld.u64 %r34,[%r209+40]; .loc 1 85 3 setp.ne.u64 %r223,%r117,0; @ %r223 bra $L3; $L9: .loc 1 93 3 setp.gt.s64 %r224,%r116,%r117; @ %r224 bra $L4; bra $L49; $L3: add.u64 %r133,%r184,40; add.u64 %r177,%frame,120; add.u64 %r173,%frame,240; add.u64 %r225,%r184,16; add.u64 %r227,%r25,%r25; add.u64 %r228,%r227,%r25; shl.b64 %r229,%r228,3; add.u64 %r179,%r225,%r229; .loc 1 91 12 mov.u64 %r371,0; $L8: .loc 1 87 51 ld.u64 %r231,[%r133]; mul.lo.u64 %r230,%r231,%r187; .loc 1 87 18 st.u64 [%r177],%r230; .loc 1 88 19 ld.u64 %r233,[%r133+16]; add.u64 %r232,%r233,1; ld.u64 
%r234,[%r133+8]; sub.u64 %r44,%r232,%r234; .loc 1 90 10 setp.lt.s64 %r235,%r44,0; @ %r235 bra $L6; .loc 1 88 17 st.u64 [%r173],%r44; bra $L7; $L6: .loc 1 91 12 st.u64 [%r173],%r371; $L7: .loc 1 85 3 add.u64 %r133,%r133,24; add.u64 %r177,%r177,8; add.u64 %r173,%r173,8; setp.ne.u64 %r237,%r133,%r179; @ %r237 bra $L8; bra $L9; $L49: .loc 1 102 6 ld.u64 %r238,[%r183]; setp.eq.u64 %r239,%r238,0; @ ! %r239 bra $L11; bra $L10; $L4: add.u64 %r241,%r25,%r25; add.u64 %r242,%r241,%r25; shl.b64 %r243,%r242,3; add.u64 %r244,%r243,40; add.u64 %r111,%r184,%r244; shl.b64 %r245,%r25,3; add.u64 %r101,%r245,-8; add.u64 %r360,%frame,120; add.u64 %r103,%r360,%r101; add.u64 %r359,%frame,240; add.u64 %r98,%r359,%r101; add.u64 %r248,%r184,40; cvt.u32.u32 %r250,%r22; cvt.s64.s8 %r249,%r250; add.u64 %r252,%r249,%r249; add.u64 %r253,%r252,%r249; shl.b64 %r254,%r253,3; add.u64 %r29,%r248,%r254; .loc 1 99 12 mov.u64 %r370,0; $L14: .loc 1 95 56 ld.u64 %r256,[%r111]; mul.lo.u64 %r255,%r256,%r187; .loc 1 95 18 st.u64 [%r103],%r255; .loc 1 96 19 ld.u64 %r258,[%r111+16]; add.u64 %r257,%r258,1; ld.u64 %r259,[%r111+8]; sub.u64 %r53,%r257,%r259; .loc 1 98 10 setp.lt.s64 %r260,%r53,0; @ %r260 bra $L12; .loc 1 96 17 st.u64 [%r98],%r53; bra $L13; $L12: .loc 1 99 12 st.u64 [%r98],%r370; $L13: .loc 1 93 3 add.u64 %r111,%r111,24; add.u64 %r103,%r103,8; add.u64 %r98,%r98,8; setp.ne.u64 %r262,%r29,%r111; @ %r262 bra $L14; bra $L50; $L10: .loc 1 106 7 setp.le.s64 %r263,%r116,0; @ %r263 bra $L16; add.u64 %r359,%frame,240; $L36: add.u64 %r147,%r183,40; mov.u64 %r141,%r359; cvt.u32.u32 %r265,%r22; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r270,%r183,16; add.u64 %r112,%r269,%r270; .loc 1 109 10 mov.u64 %r104,1; .loc 1 113 4 mov.u64 %r293,0; bra $L17; $L16: .loc 1 117 24 mov.u64 %r271,0; st.u64 [%r183+8],%r271; .loc 1 118 28 cvt.u16.u32 %r274,%r22; add.u16 %r273,%r274,-1; cvt.u32.u16 %r275,%r273; st.u8 [%r183+28],%r275; .loc 1 120 20 add.u32 %r276,%r22,-2; cvt.s64.s32 %r63,%r276; add.u64 %r278,%r63,%r63; add.u64 %r279,%r278,%r63; shl.b64 %r280,%r279,3; add.u64 %r281,%r183,%r280; .loc 1 120 67 shl.b64 %r283,%r63,3; add.u64 %r284,%frame,%r283; .loc 1 120 59 ld.u64 %r286,[%r281+40]; ld.u64 %r287,[%r284+240]; mul.lo.u64 %r122,%r286,%r287; .loc 1 122 29 mov.u64 %r289189; call (%value_in),_gfortrani_xmallocarray290,[%value_in]; } .loc 1 122 27 st.u64 [%r183],%r290; .loc 1 123 10 setp.eq.u64 %r292,%r122,0; @ ! 
%r292 bra $L21; bra $L18; $L20: .loc 1 111 48 mul.lo.u64 %r104,%r59,%r104; $L17: .loc 1 113 4 st.u64 [%r147+8],%r293; ld.u64 %r59,[%r141]; add.u64 %r294,%r59,-1; st.u64 [%r147+16],%r294; st.u64 [%r147],%r104; .loc 1 106 7 add.u64 %r147,%r147,2420; bra $L16; $L18: .loc 1 126 4 st.u64 [%r183+48],%r122; mov.u64 %r297,-1; st.u64 [%r183+56],%r297; mov.u64 %r298,1; st.u64 [%r183+40],%r298; .loc 1 127 4 bra $L1; $L11: .loc 1 133 19 ld.s8 %r70,[%r183+28]; .loc 1 133 10 setp.eq.u64 %r299,%r70,%r116; @ %r299 bra $L23; .loc 1 134 2 st.u64 [%stack+8],%r116_gfortran_runtime_error23: .loc 1 139 11 cvta.global.u64 %r302,_gfortrani_compile_options; .loc 1 139 10 ld.u32 %r303,[%r302+36]; setp.eq.u32 %r304,%r303,0; @ %r304 bra $L21; .loc 1 140 2 add.u64 %r359,%frame,240; cvta.const.u64 %r308,$LC2r359307308; call _gfortrani_bounds_ifunction_return$L21: .loc 1 144 3 setp.gt.s64 %r310,%r116,0; @ %r310 bra $L24; $L28: .loc 1 80 9 add.u64 %r311,%r31,1; .loc 1 80 7 sub.u64 %r118,%r311,%r33; max.s64 %r120,%r118,0; .loc 1 83 44 mul.lo.u64 %r36,%r34,%r187; .loc 1 152 8 ld.u64 %r129,[%r184]; .loc 1 153 8 ld.u64 %r130,[%r183]; .loc 1 186 22 ld.u64 %r76,[%frame+120]; .loc 1 187 22 ld.u64 %r78,[%frame]; .loc 1 187 12 shl.b64 %r80,%r78,3; .loc 1 185 12 ld.u64 %r176,[%frame+360]; setp.gt.s64 %r357,%r118,0; setp.le.s64 %r358,%r116,1; .loc 1 173 6 setp.eq.u32 %r363,%r186,0; .loc 1 167 10 mov.u64 %r364,0; cvt.u32.u32 %r365,%r22; cvt.s64.s8 %r366,%r365; add.u64 %r367,%r366,-1; add.u64 %r368,%frame,120; add.u64 %r369,%frame,240; bra $L25; $L24: add.u64 %r165,%frame,360; add.u64 %r164,%r183,40; mov.u64 %r162,%frame; cvt.u32.u32 %r313,%r22; cvt.s64.s8 %r312,%r313; add.u64 %r150,%r312,-1; .loc 1 144 3 mov.u64 %r131,0; add.u64 %r359,%frame,240; .loc 1 146 16 mov.u64 %r314,%r131; $L27: st.u64 [%r165],%r314; .loc 1 147 18 ld.u64 %r315,[%r164]; st.u64 [%r162],%r315; .loc 1 148 17 shl.b64 %r317,%r131,3; add.u64 %r318,%r359,%r317; .loc 1 148 10 ld.u64 %r319,[%r318]; setp.le.s64 %r320,%r319,0; @ %r320 bra $L1; .loc 1 144 26 add.u64 %r131,%r131,1; .loc 1 144 3 add.u64 %r165,%r165,8; add.u64 %r164,%r164,24; add.u64 %r162,%r162,8; setp.ne.u64 %r321,%r131,%r150; @ %r321 bra $L27; bra $L28; $L25: .loc 1 166 5 @ %r357 bra $L37; .loc 1 167 10 st.u64 [%r130],%r364; bra $L30; $L37: mov.u64 %r125,%r129; .loc 1 164 9 mov.u64 %r107,0; .loc 1 165 9 mov.u64 %r105,%r107; .loc 1 170 13 mov.u64 %r158,%r107; $L29: .loc 1 177 14 add.u64 %r158,%r158,1; .loc 1 173 6 setp.eq.u64 %r324,%r107,0; @ %r324 bra $L38; .loc 1 40(%value_in),memcmp,(%out_arg1,%out_arg2,%out_arg3); ld.param.u32 %r328,[%value_in]; } .loc 1 173 6 @ %r363 bra $L32; .loc 1 173 22 setp.gt.s32 %r331,%r328,0; selp.u64 %r107,%r107,%r125,%r331; selp.u64 %r105,%r105,%r158,%r331; bra $L31; $L32: setp.ge.s32 %r332,%r328,0; selp.u64 %r107,%r107,%r125,%r332; selp.u64 %r105,%r105,%r158,%r332; bra $L31; $L38: mov.u64 %r107,%r125; .loc 1 177 14 mov.u64 %r105,%r158; $L31: .loc 1 170 36 add.u64 %r125,%r125,%r36; .loc 1 170 6 setp.gt.s64 %r333,%r120,%r158; @ %r333 bra $L29; .loc 1 181 12 st.u64 [%r130],%r105; $L30: .loc 1 185 15 add.u64 %r176,%r176,1; .loc 1 186 12 add.u64 %r129,%r129,%r76; .loc 1 187 12 add.u64 %r130,%r130,%r80; .loc 1 189 32 ld.u64 %r137,[%frame+240]; .loc 1 189 13 setp.ne.u64 %r334,%r176,%r137; @ %r334 bra $L25; .loc 1 196 23 mul.lo.u64 %r335,%r76,%r176; .loc 1 196 9 sub.u64 %r126,%r129,%r335; .loc 1 197 23 mul.lo.u64 %r336,%r78,%r176; .loc 1 197 9 shl.b64 %r337,%r336,3; sub.u64 %r127,%r130,%r337; .loc 1 199 7 @ %r358 bra $L1; add.u64 %r139,%frame,368; mov.u64 %r159,8; .loc 1 198 5 
mov.u64 %r128,1; bra $L34; $L35: .loc 1 193 13 st.u64 [%r139],%r364; .loc 1 196 23 mul.lo.u64 %r343,%r92,%r91; .loc 1 196 9 sub.u64 %r126,%r129,%r343; .loc 1 197 23 mul.lo.u64 %r344,%r94,%r91; .loc 1 197 9 shl.b64 %r345,%r344,3; sub.u64 %r127,%r130,%r345; .loc 1 198 5 add.u64 %r128,%r128,1; .loc 1 199 7 add.u64 %r139,%r139,8; add.u64 %r159,%r159,8; setp.eq.u64 %r346,%r128,%r367; @ %r346 bra $L1; $L34: .loc 1 207 16 ld.u64 %r347,[%r139]; add.u64 %r91,%r347,1; st.u64 [%r139],%r91; .loc 1 208 23 add.u64 %r349,%r368,%r159; ld.u64 %r92,[%r349]; .loc 1 208 13 add.u64 %r129,%r126,%r92; .loc 1 209 23 add.u64 %r350,%frame,%r159; ld.u64 %r94,[%r350]; .loc 1 209 13 shl.b64 %r351,%r94,3; add.u64 %r130,%r127,%r351; .loc 1 189 32 add.u64 %r353,%r369,%r159; ld.u64 %r97,[%r353]; .loc 1 189 13 setp.eq.u64 %r354,%r91,%r97; @ %r354 bra $L35; mov.u64 %r176,0; bra $L25; $L50: .loc 1 102 6 ld.u64 %r355,[%r183]; setp.eq.u64 %r356,%r355,0; @ ! %r356 bra $L11; bra $L36; $L1: .loc 1 21_gfortran_mminloc1_8_s1 .visible .func _gfortran_mminloc1_8_s1608u64 %r210; .reg .u64 %r211; .reg .u64 %r213; .reg .u64u64 %r229; .reg .u32 %r230; .reg .u64 %r231; .reg .pred %r232; .reg .u32 %r238; .reg .u64 %r240; .reg .u32 %r242; .reg .u32u64 %r258; .reg .u64u64u64u64predu64 %r357; .reg .u64 %r358; .reg .u64u32pred %r421; .reg .predpred %r447; .reg .u64 %r448; .reg .pred %r449; .reg .u64 %r450; .reg .u64 %r452; .reg .pred %r453; .reg .pred %r454; .reg .u64mov.u64 %r226,%ar0; mov.u64 %r227,%ar1; mov.u64 %r228,%ar2; mov.u64 %r229,%ar3; mov.u32 %r230,%ar4; mov.u64 %r231,%ar5; .loc 1 244 6 setp.ne.u64 %r232,%r229,0; @ %r232 bra $L228230231; call _gfortran_minloc1_8_s1); } .loc 1 251 7 bra $L51; $L52: .loc 1 254 10 ld.u64 %r22,[%r228]; .loc 1 254 7 add.u64 %r150,%r22,-1; .loc 1 255 10 ld.s8 %r23,[%r227+28]; .loc 1 255 38 add.u32 %r238,%r23,-1; .loc 1 255 8 cvt.s64.s32 %r151,%r238; .loc 1 258 7 shr.u64 %r240,%r150,63; set.u32.gt.s64 %r242,%r150,%r151; neg.s32 %r243,%r242; cvt.u16.u64 %r246,%r240; cvt.u16.u32 %r247,%r243; or.b16 %r245,%r246,%r247; .loc 1 258 660 7 cvt.u32.u32 %r253,%r23; cvt.s64.s8 %r252,%r253; st.u64 [%stack+8],%r252; st.u64 [%stack],%r22; cvta.const.u64 %r22_gfortran_runtime_error4: .loc 1 265 9 add.u64 %r455,%r150,%r150; add.u64 %r452,%r455,%r150; shl.b64 %r258,%r452,3; add.u64 %r259,%r227,%r258; ld.u64 %r262,[%r259+56]; add.u64 %r261,%r262,1; .loc 1 265 7 ld.u64 %r269,[%r259+48]; sub.u64 %r152,%r261,%r269; .loc 1 266 6 setp.le.s64 %r270,%r152,0; @ %r270 bra $L51; .loc 1 269 9 ld.u64 %r162,[%r229]; .loc 1 271 15 ld.u64 %r35,[%r229+16]; .loc 1 273 22 cvt.u32.u64 %r36,%r35; .loc 1 273 53 add.u32 %r271,%r36,-4; and.b32 %r272,%r271,-5; set.u32.eq.u32 %r274,%r272,0; neg.s32 %r275,%r274; .loc 1 273 22 add.u32 %r276,%r36,-1; set.u32.le.u32 %r278,%r276,1; neg.s32 %r279,%r278; .loc 1 273 6 cvt.u16.u32 %r281,%r275; cvt.u16.u32 %r282,%r279; or.loc 1 275 7 setp.ne.u32 %r287,%r36,16; @ %r287 bra $L57; $L56: .loc 1 282 11 shl.b64 %r291,%r452,3; add.u64 %r292,%r227,%r291; ld.u64 %r42,[%r292+40]; .loc 1 283 12 add.u64 %r298,%r229,%r291; ld.u64 %r45,[%r298+40]; .loc 1 285 3 setp.ne.u64 %r300,%r150,0; @ %r300 bra $L58; $L64: .loc 1 295 3 setp.lt.s64 %r301,%r150,%r151; @ %r301 bra $L59; bra $L109; $L57: .loc 1 280 5 cvta.const.u64 %r302302_gfortran_runtime_error8: add.u64 %r208,%r227,40; add.u64 %r210,%frame,240; add.u64 %r211,%r229,40; mov.u64 %r213,%frame; add.u64 %r214,%frame,360; add.u64 %r304,%r227,16; add.u64 %r306,%r22,%r22; add.u64 %r307,%r306,%r22; shl.b64 %r308,%r307,3; add.u64 %r223,%r304,%r308; .loc 1 292 12 mov.u64 %r468,0; 
$L63: .loc 1 287 51 ld.u64 %r310,[%r208]; mul.lo.u64 %r309,%r310,%r231; .loc 1 287 18 st.u64 [%r210],%r309; .loc 1 288 20 ld.u64 %r312,[%r211]; mul.lo.u64 %r311,%r312,%r35; .loc 1 288 18 st.u64 [%r213],%r311; .loc 1 289 19 ld.u64 %r314,[%r208+16]; add.u64 %r313,%r314,1; ld.u64 %r315,[%r208+8]; sub.u64 %r60,%r313,%r315; .loc 1 291 10 setp.lt.s64 %r316,%r60,0; @ %r316 bra $L61; .loc 1 289 17 st.u64 [%r214],%r60; bra $L62; $L61: .loc 1 292 12 st.u64 [%r214],%r468; $L62: .loc 1 285 3 add.u64 %r208,%r208,24; add.u64 %r210,%r210,8; add.u64 %r211,%r211,24; add.u64 %r213,%r213,8; add.u64 %r214,%r214,8; setp.ne.u64 %r318,%r208,%r223; @ %r318 bra $L63; bra $L64; $L109: .loc 1 305 6 ld.u64 %r319,[%r226]; setp.eq.u64 %r320,%r319,0; @ ! %r320 bra $L66; bra $L65; $L59: add.u64 %r322,%r22,%r22; add.u64 %r323,%r322,%r22; shl.b64 %r324,%r323,3; add.u64 %r40,%r324,40; add.u64 %r71,%r227,%r40; shl.b64 %r325,%r22,3; add.u64 %r171,%r325,-8; add.u64 %r450,%frame,240; add.u64 %r30,%r450,%r171; add.u64 %r173,%r229,%r40; add.u64 %r79,%frame,%r171; add.u64 %r456,%frame,360; add.u64 %r164,%r456,%r171; add.u64 %r328,%r227,40; cvt.u32.u32 %r330,%r23; cvt.s64.s8 %r329,%r330; add.u64 %r332,%r329,%r329; add.u64 %r333,%r332,%r329; shl.b64 %r334,%r333,3; add.u64 %r205,%r328,%r334; .loc 1 302 12 mov.u64 %r467,0; $L69: .loc 1 297 55 ld.u64 %r336,[%r71]; mul.lo.u64 %r335,%r336,%r231; .loc 1 297 18 st.u64 [%r30],%r335; .loc 1 298 20 ld.u64 %r338,[%r173]; mul.lo.u64 %r337,%r338,%r35; .loc 1 298 18 st.u64 [%r79],%r337; .loc 1 299 19 ld.u64 %r340,[%r71+16]; add.u64 %r339,%r340,1; ld.u64 %r341,[%r71+8]; sub.u64 %r76,%r339,%r341; .loc 1 301 10 setp.lt.s64 %r342,%r76,0; @ %r342 bra $L67; .loc 1 299 17 st.u64 [%r164],%r76; bra $L68; $L67: .loc 1 302 12 st.u64 [%r164],%r467; $L68: .loc 1 295 3 add.u64 %r71,%r71,24; add.u64 %r30,%r30,8; add.u64 %r173,%r173,24; add.u64 %r79,%r79,8; add.u64 %r164,%r164,8; setp.ne.u64 %r344,%r71,%r205; @ %r344 bra $L69; bra $L110; $L65: .loc 1 309 7 setp.eq.u64 %r345,%r151,0; @ %r345 bra $L71; add.u64 %r456,%frame,360; $L92: add.u64 %r130,%r226,40; mov.u64 %r128,%r456; cvt.u32.u32 %r347,%r23; cvt.s64.s8 %r346,%r347; add.u64 %r349,%r346,%r346; add.u64 %r350,%r349,%r346; shl.b64 %r351,%r350,3; add.u64 %r352,%r226,16; add.u64 %r78,%r351,%r352; .loc 1 312 10 mov.u64 %r137,1; .loc 1 316 4 mov.u64 %r371,0; bra $L72; $L71: .loc 1 320 20 add.u32 %r353,%r23,-2; cvt.s64.s32 %r84,%r353; add.u64 %r355,%r84,%r84; add.u64 %r356,%r355,%r84; shl.b64 %r357,%r356,3; add.u64 %r358,%r226,%r357; .loc 1 320 67 shl.b64 %r360,%r84,3; add.u64 %r361,%frame,%r360; .loc 1 320 59 ld.u64 %r363,[%r358+40]; ld.u64 %r364,[%r361+360]; mul.lo.u64 %r157,%r363,%r364; .loc 1 322 24 mov.u64 %r365,0; st.u64 [%r226+8],%r365; .loc 1 323 28 cvt.u16.u32 %r368,%r23; add.u16 %r367,%r368,-1; cvt.u32.u16 %r369,%r367; st.u8 [%r226+28],%r369; .loc 1 325 10 setp.eq.u64 %r370,%r157,0; @ %r370 bra $L73; bra $L111; $L75: .loc 1 314 47 mul.lo.u64 %r137,%r81,%r137; $L72: .loc 1 316 4 st.u64 [%r130+8],%r371; ld.u64 %r81,[%r128]; add.u64 %r372,%r81,-1; st.u64 [%r130+16],%r372; st.u64 [%r130],%r137; .loc 1 309 7 add.u64 %r130,%r130,24; add.u64 %r128,%r128,8; setp.ne.u64 %r373,%r78,%r130; @ %r373 bra $L75; bra $L71; $L73: .loc 1 328 4 st.u64 [%r226+48],%r157; mov.u64 %r375,-1; st.u64 [%r226+56],%r375; mov.u64 %r376,1; st.u64 [%r226+40],%r376; .loc 1 329 4 bra $L51; $L111: .loc 1 332 24 mov.u64 %r378378; call (%value_in),_gfortrani_xmallocarray79,[%value_in]; } .loc 1 332 22 st.u64 [%r226],%r379; bra $L76; $L66: .loc 1 337 19 ld.s8 %r381,[%r226+28]; .loc 1 337 10 
setp.eq.u64 %r382,%r381,%r151; @ %r382 bra $L77; .loc 1 338 2 cvta.const.u64 %r383383_gfortran_runtime_error77: .loc 1 340 11 cvta.global.u64 %r385,_gfortrani_compile_options; .loc 1 340 10 ld.u32 %r386,[%r385+36]; setp.eq.u32 %r387,%r386,0; @ %r387 bra $L76; .loc 1 342 4 add.u64 %r456,%frame,360; cvta.const.u64 %r391,$LC2r456390391; call _gfortrani_bounds_ifunction_return344 4 cvta.const.u64 %r395,$LC39391; call _gfortrani_bounds_equal_extents$L76: .loc 1 349 3 setp.ne.u64 %r397,%r151,0; @ %r397 bra $L78; $L81: .loc 1 357 8 ld.u64 %r163,[%r226]; .loc 1 358 8 ld.u64 %r161,[%r227]; .loc 1 360 9 setp.ne.u64 %r398,%r161,0; @ %r398 bra $L79; bra $L51; $L78: add.u64 %r174,%frame,480; add.u64 %r165,%r226,40; add.u64 %r148,%frame,120; cvt.u32.u32 %r400,%r23; cvt.s64.s8 %r399,%r400; add.u64 %r135,%r399,-1; .loc 1 349 3 mov.u64 %r169,0; add.u64 %r456,%frame,360; .loc 1 351 16 mov.u64 %r401,%r169; $L80: st.u64 [%r174],%r401; .loc 1 352 18 ld.u64 %r402,[%r165]; st.u64 [%r148],%r402; .loc 1 353 17 shl.b64 %r404,%r169,3; add.u64 %r405,%r456,%r404; .loc 1 353 10 ld.u64 %r406,[%r405]; setp.le.s64 %r407,%r406,0; @ %r407 bra $L51; .loc 1 349 26 add.u64 %r169,%r169,1; .loc 1 349 3 add.u64 %r174,%r174,8; add.u64 %r165,%r165,24; add.u64 %r148,%r148,8; setp.ne.u64 %r408,%r135,%r169; @ %r408 bra $L80; bra $L81; $L79: .loc 1 282 44 mul.lo.u64 %r44,%r42,%r231; .loc 1 283 12 mul.lo.u64 %r48,%r45,%r35; .loc 1 396 22 ld.u64 %r100,[%frame+240]; .loc 1 397 23 ld.u64 %r102,[%frame]; .loc 1 398 22 ld.u64 %r104,[%frame+120]; .loc 1 398 12 shl.b64 %r106,%r104,3; ld.u64 %r457,[%frame+480]; setp.le.s64 %r454,%r151,1; .loc 1 384 13 setp.eq.u32 %r460,%r230,0; cvt.u32.u32 %r461,%r23; cvt.s64.s8 %r462,%r461; add.u64 %r463,%r462,-1; add.u64 %r464,%frame,240; add.u64 %r465,%frame,360; add.u64 %r466,%frame,120; $L93: .loc 1 349 3 mov.u64 %r168,%r162; mov.u64 %r143,%r161; .loc 1 372 9 mov.u64 %r141,0; $L85: .loc 1 375 7 ld.s8 %r132,[%r168]; mov.u64 %r190,%r141; .loc 1 378 11 add.u64 %r141,%r141,1; .loc 1 375 6 setp.eq.u32 %r453,%r132,0; @ %r453 bra $L82; .loc 1 382 6 setp.gt.s64 %r410,%r152,%r190; @ %r410 bra $L94; bra $L84; $L82: .loc 1 372 32 add.u64 %r143,%r143,%r44; .loc 1 372 47 add.u64 %r168,%r168,%r48; .loc 1 372 2 setp.ne.u64 %r411,%r152,%r141; @ %r411 bra $L85; .loc 1 371 9 mov.u64 %r141,0; bra $L84; $L94: .loc 1 382 6 mov.u64 %r167,%r143; $L83: .loc 1 388 14 add.u64 %r190,%r190,1; .loc 1 384 6 @ %r453 bra $L86; .loc 1 40(%value_in),memcmp,(%out_arg1,%out_arg2,%out_arg3); ld.param.u32 %r416,[%value_in]; } .loc 1 384 13 @ %r460 bra $L87; setp.gt.s32 %r419,%r416,0; selp.u64 %r143,%r143,%r167,%r419; selp.u64 %r141,%r141,%r190,%r419; bra $L86; $L87: setp.ge.s32 %r420,%r416,0; selp.u64 %r143,%r143,%r167,%r420; selp.u64 %r141,%r141,%r190,%r420; $L86: .loc 1 382 31 add.u64 %r167,%r167,%r44; .loc 1 382 46 add.u64 %r168,%r168,%r48; .loc 1 382 6 setp.eq.u64 %r421,%r152,%r190; @ %r421 bra $L84; .loc 1 384 7 ld.s8 %r132,[%r168]; setp.eq.u32 %r453,%r132,0; bra $L83; $L84: .loc 1 392 8 st.u64 [%r163],%r141; .loc 1 395 15 add.u64 %r457,%r457,1; st.u64 [%frame+480],%r457; .loc 1 396 12 add.u64 %r161,%r161,%r100; .loc 1 397 13 add.u64 %r162,%r162,%r102; .loc 1 398 12 add.u64 %r163,%r163,%r106; .loc 1 400 32 ld.u64 %r29,[%frame+360]; .loc 1 400 13 setp.ne.u64 %r423,%r29,%r457; @ %r423 bra $L93; .loc 1 404 13 mov.u64 %r424,0; st.u64 [%frame+480],%r424; .loc 1 407 23 mul.lo.u64 %r109,%r457,%r100; .loc 1 408 24 mul.lo.u64 %r425,%r457,%r102; .loc 1 408 10 sub.u64 %r158,%r162,%r425; .loc 1 409 23 mul.lo.u64 %r426,%r457,%r104; .loc 1 409 9 
shl.b64 %r427,%r426,3; sub.u64 %r159,%r163,%r427; .loc 1 411 7 @ %r454 bra $L51; add.u64 %r197,%frame,488; mov.u64 %r196,8; .loc 1 410 5 mov.u64 %r160,1; bra $L90; $L91: .loc 1 404 13 st.u64 [%r197],%r424; .loc 1 407 23 mul.lo.u64 %r109,%r120,%r119; .loc 1 408 24 mul.lo.u64 %r433,%r122,%r119; .loc 1 408 10 sub.u64 %r158,%r162,%r433; .loc 1 409 23 mul.lo.u64 %r434,%r124,%r119; .loc 1 409 9 shl.b64 %r435,%r434,3; sub.u64 %r159,%r163,%r435; .loc 1 410 5 add.u64 %r160,%r160,1; .loc 1 411 7 add.u64 %r197,%r197,8; add.u64 %r196,%r196,8; setp.eq.u64 %r436,%r160,%r463; @ %r436 bra $L51; $L90: .loc 1 419 16 ld.u64 %r437,[%r197]; add.u64 %r119,%r437,1; st.u64 [%r197],%r119; .loc 1 420 23 add.u64 %r439,%r464,%r196; ld.u64 %r120,[%r439]; .loc 1 420 13 sub.u64 %r440,%r120,%r109; add.u64 %r161,%r161,%r440; .loc 1 421 24 add.u64 %r441,%frame,%r196; ld.u64 %r122,[%r441]; .loc 1 421 14 add.u64 %r162,%r158,%r122; .loc 1 422 23 add.u64 %r443,%r466,%r196; ld.u64 %r124,[%r443]; .loc 1 422 13 shl.b64 %r444,%r124,3; add.u64 %r163,%r159,%r444; .loc 1 400 32 add.u64 %r446,%r465,%r196; ld.u64 %r127,[%r446]; .loc 1 400 13 setp.eq.u64 %r447,%r119,%r127; @ %r447 bra $L91; ld.u64 %r457,[%frame+480]; bra $L93; $L110: .loc 1 305 6 ld.u64 %r448,[%r226]; setp.eq.u64 %r449,%r448,0; @ ! %r449 bra $L66; bra $L92; $L51: .loc 1 42_gfortran_sminloc1_8_s1 .visible .func _gfortran_sminloc1_8_s1local .align 16 .b8 %frame_ar[368u64 %r89; .reg .u643u64 %r125; .reg .u64predu16 %r180; .reg .predpred %r222; .reg .predu64 %r231; .reg .u16 %r233; .reg .u16 %r234; .reg .u32pred %r259; .reg .pred %r260; .reg .u64u64 %r267; .reg .u32pred %r303; .reg .u64 %r304; .reg .pred %r305; .reg .u64 %r306; .reg .u32mov.u64 %r156,%ar0; mov.u64 %r157,%ar1; mov.u64 %r158,%ar2; mov.u64 %r159,%ar3; mov.u32 %r160,%ar4; mov.u64 %r161,%ar5; .loc 1 449 6 setp.eq.u64 %r162,%r159,0; @ %r162 bra $L113; .loc 1 449 20 ld.u32 %r163,[%r159]; setp.eq.u32 %r164,%r163,0; @ %r164 bra $L114; $L113: .loc 1 452 7r1571580161; call _gfortran_minloc1_8_s1); } .loc 1 456 7 bra $L112; $L114: .loc 1 459 10 ld.u64 %r23,[%r158]; .loc 1 459 7 add.u64 %r95,%r23,-1; .loc 1 460 10 ld.s8 %r24,[%r157+28]; .loc 1 460 38 add.u32 %r26,%r24,-1; .loc 1 460 8 cvt.s64.s32 %r306,%r26; .loc 1 462 7 shr.u64 %r171,%r95,63; set.u32.gt.s64 %r173,%r95,%r306; neg.s32 %r174,%r173; cvt.u16.u64 %r177,%r171; cvt.u16.u32 %r178,%r174; or.b16 %r176,%r177,%r178; .loc 1 462 6 cvt.u32.u16 %r179,%r176; cvt.u16.u8 %r180,%r179; setp.ne.u16 %r181,%r180,0; @ %r181 bra $L116; .loc 1 469 3 setp.ne.u64 %r182,%r95,0; @ %r182 bra $L117; $L123: .loc 1 477 3 setp.lt.s64 %r183,%r95,%r306; @ %r183 bra $L118; bra $L171; $L116: .loc 1 464 7 cvt.u32.u32 %r186,%r24; cvt.s64.s8 %r185,%r186; st.u64 [%stack+8],%r1851841stack; call _gfortran_runtime_error117: add.u64 %r92,%r157,48; add.u64 %r114,%frame,120; add.u64 %r146,%r157,56; add.u64 %r188,%r157,24; add.u64 %r190,%r23,%r23; add.u64 %r191,%r190,%r23; shl.b64 %r192,%r191,3; add.u64 %r152,%r188,%r192; .loc 1 474 12 mov.u64 %r312,0; $L122: .loc 1 471 19 ld.u64 %r194,[%r146]; ld.u64 %r196,[%r92]; sub.u64 %r195,%r194,%r196; .loc 1 471 50 mad.lo.u64 %r38,%r195,%r161,%r161; .loc 1 473 10 setp.le.s64 %r197,%r38,0; @ %r197 bra $L120; .loc 1 471 17 st.u64 [%r114],%r38; bra $L121; $L120: .loc 1 474 12 st.u64 [%r114],%r312; $L121: .loc 1 469 3 add.u64 %r92,%r92,24; add.u64 %r114,%r114,8; add.u64 %r146,%r146,24; setp.ne.u64 %r199,%r92,%r152; @ %r199 bra $L122; bra $L123; $L171: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r200,%r107,0; @ ! 
%r200 bra $L125; bra $L124; $L118: add.u64 %r202,%r23,%r23; add.u64 %r203,%r202,%r23; shl.b64 %r204,%r203,3; add.u64 %r205,%r204,48; add.u64 %r83,%r157,%r205; add.u64 %r304,%frame,120; shl.b64 %r207,%r23,3; add.u64 %r208,%r207,-8; add.u64 %r144,%r304,%r208; add.u64 %r209,%r157,48; cvt.u32.u32 %r211,%r24; cvt.s64.s8 %r210,%r211; add.u64 %r213,%r210,%r210; add.u64 %r214,%r213,%r210; shl.b64 %r215,%r214,3; add.u64 %r72,%r209,%r215; .loc 1 483 12 mov.u64 %r311,0; $L128: .loc 1 480 2 ld.u64 %r217,[%r83+8]; ld.u64 %r219,[%r83]; sub.u64 %r218,%r217,%r219; .loc 1 480 37 mad.lo.u64 %r46,%r218,%r161,%r161; .loc 1 482 10 setp.le.s64 %r220,%r46,0; @ %r220 bra $L126; .loc 1 479 17 st.u64 [%r144],%r46; bra $L127; $L126: .loc 1 483 12 st.u64 [%r144],%r311; $L127: .loc 1 477 3 add.u64 %r83,%r83,24; add.u64 %r144,%r144,8; setp.ne.u64 %r222,%r72,%r83; @ %r222 bra $L128; bra $L172; $L124: .loc 1 490 7 setp.eq.u64 %r223,%r306,0; @ %r223 bra $L130; add.u64 %r304,%frame,120; $L150: add.u64 %r125,%r156,40; mov.u64 %r123,%r304; add.u64 %r224,%r156,16; cvt.u32.u32 %r226,%r24; cvt.s64.s8 %r225,%r226; add.u64 %r228,%r225,%r225; add.u64 %r229,%r228,%r225; shl.b64 %r230,%r229,3; add.u64 %r117,%r224,%r230; .loc 1 493 10 mov.u64 %r89,1; .loc 1 497 4 mov.u64 %r249,0; bra $L131; $L130: .loc 1 501 24 mov.u64 %r231,0; st.u64 [%r156+8],%r231; .loc 1 502 28 cvt.u16.u32 %r234,%r24; add.u16 %r233,%r234,-1; cvt.u32.u16 %r235,%r233; st.u8 [%r156+28],%r235; .loc 1 504 20 add.u32 %r236,%r24,-2; cvt.s64.s32 %r56,%r236; add.u64 %r238,%r56,%r56; add.u64 %r239,%r238,%r56; shl.b64 %r240,%r239,3; add.u64 %r241,%r156,%r240; .loc 1 504 67 shl.b64 %r243,%r56,3; add.u64 %r244,%frame,%r243; .loc 1 504 59 ld.u64 %r246,[%r241+40]; ld.u64 %r247,[%r244+120]; mul.lo.u64 %r103,%r246,%r247; .loc 1 506 10 setp.eq.u64 %r248,%r103,0; @ %r248 bra $L132; bra $L173; $L134: .loc 1 495 48 mul.lo.u64 %r89,%r52,%r89; $L131: .loc 1 497 4 st.u64 [%r125+8],%r249; ld.u64 %r52,[%r123]; add.u64 %r250,%r52,-1; st.u64 [%r125+16],%r250; st.u64 [%r125],%r89; .loc 1 490 7 add.u64 %r125,%r125,24; add.u64 %r123,%r123,8; setp.ne.u64 %r251,%r117,%r125; @ %r251 bra $L134; bra $L130; $L132: .loc 1 509 4 st.u64 [%r156+48],%r103; mov.u64 %r253,-1; st.u64 [%r156+56],%r253; mov.u64 %r254,1; st.u64 [%r156+40],%r254; .loc 1 510 4 bra $L112; $L173: .loc 1 513 24 mov.u64 %r256(%value_in),_gfortrani_xmallocarray257,[%value_in]; } mov.u64 %r107,%r257; .loc 1 513 22 st.u64 [%r156],%r107; $L138: .loc 1 539 3 setp.ne.u64 %r259,%r306,0; @ %r259 bra $L135; bra $L136; $L125: .loc 1 517 19 ld.s8 %r61,[%r156+28]; .loc 1 517 10 setp.eq.u64 %r260,%r61,%r306; @ %r260 bra $L137; .loc 1 518 2 st.u64 [%stack+8],%r306; st.u64 [%stack],%r61;_gfortran_runtime_error137: .loc 1 523 11 cvta.global.u64 %r263,_gfortrani_compile_options; .loc 1 523 10 ld.u32 %r264,[%r263+36]; setp.eq.u32 %r265,%r264,0; @ %r265 bra $L138; .loc 1 525 4 setp.eq.u64 %r266,%r306,0; @ %r266 bra $L136; add.u64 %r115,%r156,48; add.u64 %r105,%frame,120; cvt.u32.u32 %r268,%r24; cvt.s64.s8 %r267,%r268; add.u64 %r86,%r267,-1; .loc 1 525 10 mov.u64 %r113,0; $L140: .loc 1 529 21 ld.u64 %r270,[%r115+8]; add.u64 %r269,%r270,1; .loc 1 529 19 ld.u64 %r271,[%r115]; sub.u64 %r100,%r269,%r271; .loc 1 530 18 ld.u64 %r70,[%r105]; .loc 1 525 25 add.u64 %r113,%r113,1; .loc 1 530 11 setp.eq.u64 %r272,%r70,%r100; @ %r272 bra $L139; .loc 1 531 3 st.u64 [%stack+16],%r70; st.u64 [%stack+8],%r100; st.u64 [%stack],%r113;_gfortran_runtime_error139: .loc 1 525 4 add.u64 %r115,%r115,24; add.u64 %r105,%r105,8; setp.ne.u64 %r275,%r86,%r113; @ %r275 bra $L140; 
$L135: .loc 1 541 16 cvt.u16.u32 %r277,%r24; setp.le.s16 %r278,%r277,1; @ %r278 bra $L141; shl.b64 %r276,%r306,3; bra $L142; $L141: mov.u64 %r276,8; $L142: add.u64 %r280,%frame,240; mov.u32 %r2842766,[%value_in]; } add.u64 %r130,%r156,40; mov.u64 %r128,%frame; mov.u64 %r109,0; $L143: .loc 1 542 18 ld.u64 %r288,[%r130]; st.u64 [%r128],%r288; .loc 1 539 26 add.u64 %r109,%r109,1; .loc 1 539 3 add.u64 %r130,%r130,24; add.u64 %r128,%r128,8; setp.gt.s64 %r289,%r306,%r109; @ %r289 bra $L143; $L136: ld.u64 %r88,[%frame+240]; .loc 1 551 22 ld.u64 %r75,[%frame]; .loc 1 551 12 shl.b64 %r76,%r75,3; .loc 1 553 32 ld.u64 %r65,[%frame+120]; .loc 1 560 23 mul.lo.u64 %r49,%r65,%r75; setp.le.s64 %r305,%r306,1; .loc 1 549 13 mov.u64 %r290,0; cvt.u32.u32 %r308,%r24; cvt.s64.s8 %r309,%r308; add.u64 %r310,%r309,-1; $L149: st.u64 [%r107],%r290; .loc 1 550 15 add.u64 %r88,%r88,1; .loc 1 551 12 add.u64 %r107,%r107,%r76; .loc 1 553 13 setp.ne.u64 %r291,%r65,%r88; @ %r291 bra $L149; .loc 1 562 7 @ ! %r305 bra $L174; bra $L112; $L148: .loc 1 557 13 st.u64 [%r108],%r290; .loc 1 560 23 mul.lo.u64 %r78,%r81,%r80; .loc 1 561 5 add.u64 %r106,%r106,1; .loc 1 562 7 add.u64 %r108,%r108,8; add.u64 %r138,%r138,8; add.u64 %r137,%r137,8; setp.ne.u64 %r294,%r106,%r310; @ %r294 bra $L147; bra $L112; $L174: add.u64 %r108,%frame,248; add.u64 %r138,%frame,8; add.u64 %r137,%frame,128; .loc 1 560 23 mov.u64 %r78,%r49; .loc 1 561 5 mov.u64 %r106,1; $L147: .loc 1 566 16 ld.u64 %r299,[%r108]; add.u64 %r80,%r299,1; st.u64 [%r108],%r80; .loc 1 567 23 ld.u64 %r81,[%r138]; .loc 1 567 13 sub.u64 %r300,%r81,%r78; shl.b64 %r301,%r300,3; add.u64 %r107,%r107,%r301; .loc 1 553 32 ld.u64 %r84,[%r137]; .loc 1 553 13 setp.eq.u64 %r302,%r80,%r84; @ %r302 bra $L148; .loc 1 557 13 mov.u64 %r88,0; bra $L149; $L172: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r303,%r107,0; @ ! 
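The masked variant _gfortran_mminloc1_8_s1 adds a logical MASK array: only elements where the mask is true participate, and the stored index is an 8-byte integer; _gfortran_sminloc1_8_s1 covers the scalar-mask case, where a .false. mask yields a zero result. As a hedged sketch only, with hypothetical names and a plain char payload instead of the runtime's descriptor-based strings, the masked rank-1 reduction is conceptually:

#include <stdio.h>
#include <string.h>
#include <stdint.h>

/* Sketch of a masked MINLOC over a rank-1 array of fixed-length
 * strings, with a 64-bit result index as in the *_8_* variants.
 * Elements where mask[i] is zero are skipped; if nothing is
 * selected, the Fortran result is 0. */
static int64_t minloc_str_masked(const char *base, size_t n, size_t len,
                                 const unsigned char *mask)
{
    int64_t best = 0;                     /* 0 means "none selected yet" */
    for (size_t i = 0; i < n; i++) {
        if (!mask[i])
            continue;
        if (best == 0 ||
            memcmp(base + i * len, base + (size_t)(best - 1) * len, len) < 0)
            best = (int64_t)i + 1;        /* 1-based Fortran index */
    }
    return best;
}

int main(void)
{
    const char data[] = "foo" "bar" "baz" "qux";
    const unsigned char mask[] = {1, 0, 1, 1};        /* "bar" is masked out */
    printf("masked MINLOC = %lld\n",
           (long long)minloc_str_masked(data, 4, 3, mask));   /* expected: 3 */
    return 0;
}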
[binary archive residue — closing "ret; }" of _gfortran_sminloc1_8_s1]

minloc1_8_s4.o/ 1622802213

[PTX object member, garbled and truncated at the end of this extract part-way through _gfortran_minloc1_8_s4. Recoverable content:
  - source file: libgfortran/generated/minloc1_8_s4.c
  - visible functions: _gfortran_minloc1_8_s4, _gfortran_mminloc1_8_s4, _gfortran_sminloc1_8_s4
  - external symbols: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_memcmp_char4, _gfortrani_bounds_equal_extents
  - the same decimal-encoded MINLOC error-message constants
  - the beginning of the _gfortran_minloc1_8_s4 PTX body: character(kind=4) comparison via _gfortrani_memcmp_char4 with 8-byte result indices, following the same bounds-check / allocation / reduction-loop structure as the earlier members]
%r369 bra $L11; bra $L38; $L1: .loc 1 21_gfortran_mminloc1_8_s4 .visible .func _gfortran_mminloc1_8_s608pred %r289; .reg .predpred %r304; .reg .pred %r305; .reg .u64u64 %r313; .reg .u64u64pred %r324; .reg .u64predu16 %r371; .reg .u16 %r372; .reg .u32 %r373; .reg .pred %r374; .reg .u64 %r375; .reg .u64 %r376; .reg .predu64u64 %r399; .reg .predpred %r416; .reg .u32 %r422; .reg .pred %r423; .reg .u32 %r427; .reg .pred %r428; .reg .predpred.reg .u64u64u64 %r464; .reg .u64 %r465; .reg .pred.reg .u64 %r475; .reg .u64 %r476; mov.u64 %r230,%ar0; mov.u64 %r231,%ar1; mov.u64 %r232,%ar2; mov.u64 %r233,%ar3; mov.u32 %r234,%ar4; mov.u64 %r235,%ar5; .loc 1 244 6 setp.ne.u64 %r236,%r233,0; @ %r236 bra $L2234235; call _gfortran_minloc1_8_s4); } .loc 1 251 7 bra $L52; $L53: .loc 1 254 10 ld.u64 %r22,[%r232]; .loc 1 254 7 add.u64 %r149,%r22,-1; .loc 1 255 10 ld.s8 %r23,[%r231+28]; .loc 1 255 38 add.u32 %r242,%r23,-1; .loc 1 255 8 cvt.s64.s32 %r150,%r242; .loc 1 258 7 shr.u64 %r244,%r149,63; set.u32.gt.s64 %r246,%r149,%r150; neg.s32 %r247,%r246; cvt.u16.u64 %r250,%r244; cvt.u16.u32 %r251,%r247; or.b16 %r249,%r250,%r251; .loc 1 258 6.loc 1 260 7 cvt.u32.u32 %r257,%r23; cvt.s64.s8 %r256,%r257; st.u64 [%stack+8],%r256; st.u64 [%stack],%r22; cvta.const.u64 %r22_gfortran_runtime_error5: .loc 1 265 9 add.u64 %r464,%r149,%r149; add.u64 %r462,%r464,%r149; shl.b64 %r262,%r462,3; add.u64 %r263,%r231,%r262; ld.u64 %r266,[%r263+56]; add.u64 %r265,%r266,1; .loc 1 265 7 ld.u64 %r273,[%r263+48]; sub.u64 %r151,%r265,%r273; .loc 1 266 6 setp.le.s64 %r274,%r151,0; @ %r274 bra $L52; .loc 1 269 9 ld.u64 %r164,[%r233]; .loc 1 271 15 ld.u64 %r35,[%r233+16]; .loc 1 273 22 cvt.u32.u64 %r36,%r35; .loc 1 273 53 add.u32 %r275,%r36,-4; and.b32 %r276,%r275,-5; set.u32.eq.u32 %r278,%r276,0; neg.s32 %r279,%r278; .loc 1 273 22 add.u32 %r280,%r36,-1; set.u32.le.u32 %r282,%r280,1; neg.s32 %r283,%r282; .loc 1 273 6 cvt.u16.u32 %r285,%r279; cvt.u16.u32 %r286,%r283; or.b16 %r284,%r285,%r286; cvt.u32.u16 %r287,%r284; cvt.u16.u8 %r288,%r287; setp.ne.u16 %r289,%r288,0; @ %r289 bra $L57; .loc 1 275 7 setp.ne.u32 %r291,%r36,16; @ %r291 bra $L58; $L57: .loc 1 282 11 shl.b64 %r295,%r462,3; add.u64 %r296,%r231,%r295; ld.u64 %r40,[%r296+40]; .loc 1 283 12 add.u64 %r302,%r233,%r295; ld.u64 %r42,[%r302+40]; .loc 1 285 3 setp.ne.u64 %r304,%r149,0; @ %r304 bra $L59; $L65: .loc 1 295 3 setp.lt.s64 %r305,%r149,%r150; @ %r305 bra $L60; bra $L111; $L58: .loc 1 280 5 cvta.const.u64 %r306306_gfortran_runtime_error9: add.u64 %r103,%r231,40; add.u64 %r213,%frame,240; add.u64 %r214,%r233,40; mov.u64 %r216,%frame; add.u64 %r217,%frame,360; add.u64 %r308,%r231,16; add.u64 %r310,%r22,%r22; add.u64 %r311,%r310,%r22; shl.b64 %r312,%r311,3; add.u64 %r226,%r308,%r312; .loc 1 292 12 mov.u64 %r476,0; $L64: .loc 1 287 51 ld.u64 %r314,[%r103]; mul.lo.u64 %r313,%r314,%r235; .loc 1 287 18 st.u64 [%r213],%r313; .loc 1 288 20 ld.u64 %r316,[%r214]; mul.lo.u64 %r315,%r316,%r35; .loc 1 288 18 st.u64 [%r216],%r315; .loc 1 289 19 ld.u64 %r318,[%r103+16]; add.u64 %r317,%r318,1; ld.u64 %r319,[%r103+8]; sub.u64 %r57,%r317,%r319; .loc 1 291 10 setp.lt.s64 %r320,%r57,0; @ %r320 bra $L62; .loc 1 289 17 st.u64 [%r217],%r57; bra $L63; $L62: .loc 1 292 12 st.u64 [%r217],%r476; $L63: .loc 1 285 3 add.u64 %r103,%r103,24; add.u64 %r213,%r213,8; add.u64 %r214,%r214,24; add.u64 %r216,%r216,8; add.u64 %r217,%r217,8; setp.ne.u64 %r322,%r103,%r226; @ %r322 bra $L64; bra $L65; $L111: .loc 1 305 6 ld.u64 %r323,[%r230]; setp.eq.u64 %r324,%r323,0; @ ! 
%r324 bra $L67; bra $L66; $L60: add.u64 %r326,%r22,%r22; add.u64 %r327,%r326,%r22; shl.b64 %r328,%r327,3; add.u64 %r58,%r328,40; add.u64 %r77,%r231,%r58; shl.b64 %r329,%r22,3; add.u64 %r148,%r329,-8; add.u64 %r460,%frame,240; add.u64 %r39,%r460,%r148; add.u64 %r173,%r233,%r58; add.u64 %r143,%frame,%r148; add.u64 %r465,%frame,360; add.u64 %r178,%r465,%r148; add.u64 %r332,%r231,40; cvt.u32.u32 %r334,%r23; cvt.s64.s8 %r333,%r334; add.u64 %r336,%r333,%r333; add.u64 %r337,%r336,%r333; shl.b64 %r338,%r337,3; add.u64 %r100,%r332,%r338; .loc 1 302 12 mov.u64 %r475,0; $L70: .loc 1 297 55 ld.u64 %r340,[%r77]; mul.lo.u64 %r339,%r340,%r235; .loc 1 297 18 st.u64 [%r39],%r339; .loc 1 298 20 ld.u64 %r342,[%r173]; mul.lo.u64 %r341,%r342,%r35; .loc 1 298 18 st.u64 [%r143],%r341; .loc 1 299 19 ld.u64 %r344,[%r77+16]; add.u64 %r343,%r344,1; ld.u64 %r345,[%r77+8]; sub.u64 %r75,%r343,%r345; .loc 1 301 10 setp.lt.s64 %r346,%r75,0; @ %r346 bra $L68; .loc 1 299 17 st.u64 [%r178],%r75; bra $L69; $L68: .loc 1 302 12 st.u64 [%r178],%r475; $L69: .loc 1 295 3 add.u64 %r77,%r77,24; add.u64 %r39,%r39,8; add.u64 %r173,%r173,24; add.u64 %r143,%r143,8; add.u64 %r178,%r178,8; setp.ne.u64 %r348,%r77,%r100; @ %r348 bra $L70; bra $L112; $L66: .loc 1 309 7 setp.eq.u64 %r349,%r150,0; @ %r349 bra $L72; add.u64 %r465,%frame,360; $L94: add.u64 %r130,%r230,40; mov.u64 %r128,%r465; cvt.u32.u32 %r351,%r23; cvt.s64.s8 %r350,%r351; add.u64 %r353,%r350,%r350; add.u64 %r354,%r353,%r350; shl.b64 %r355,%r354,3; add.u64 %r356,%r230,16; add.u64 %r92,%r355,%r356; .loc 1 312 10 mov.u64 %r135,1; .loc 1 316 4 mov.u64 %r375,0; bra $L73; $L72: .loc 1 320 20 add.u32 %r357,%r23,-2; cvt.s64.s32 %r83,%r357; add.u64 %r359,%r83,%r83; add.u64 %r360,%r359,%r83; shl.b64 %r361,%r360,3; add.u64 %r362,%r230,%r361; .loc 1 320 67 shl.b64 %r364,%r83,3; add.u64 %r365,%frame,%r364; .loc 1 320 59 ld.u64 %r367,[%r362+40]; ld.u64 %r368,[%r365+360]; mul.lo.u64 %r157,%r367,%r368; .loc 1 322 24 mov.u64 %r369,0; st.u64 [%r230+8],%r369; .loc 1 323 28 cvt.u16.u32 %r372,%r23; add.u16 %r371,%r372,-1; cvt.u32.u16 %r373,%r371; st.u8 [%r230+28],%r373; .loc 1 325 10 setp.eq.u64 %r374,%r157,0; @ %r374 bra $L74; bra $L113; $L76: .loc 1 314 47 mul.lo.u64 %r135,%r80,%r135; $L73: .loc 1 316 4 st.u64 [%r130+8],%r375; ld.u64 %r80,[%r128]; add.u64 %r376,%r80,-1; st.u64 [%r130+16],%r376; st.u64 [%r130],%r135; .loc 1 309 7 add.u64 %r130,%r130,24; add.u64 %r128,%r128,8; setp.ne.u64 %r377,%r92,%r130; @ %r377 bra $L76; bra $L72; $L74: .loc 1 328 4 st.u64 [%r230+48],%r157; mov.u64 %r379,-1; st.u64 [%r230+56],%r379; mov.u64 %r380,1; st.u64 [%r230+40],%r380; .loc 1 329 4 bra $L52; $L113: .loc 1 332 24 mov.u64 %r38382; call (%value_in),_gfortrani_xmallocarray83,[%value_in]; } .loc 1 332 22 st.u64 [%r230],%r383; bra $L77; $L67: .loc 1 337 19 ld.s8 %r385,[%r230+28]; .loc 1 337 10 setp.eq.u64 %r386,%r385,%r150; @ %r386 bra $L78; .loc 1 338 2 cvta.const.u64 %r387387_gfortran_runtime_error78: .loc 1 340 11 cvta.global.u64 %r389,_gfortrani_compile_options; .loc 1 340 10 ld.u32 %r390,[%r389+36]; setp.eq.u32 %r391,%r390,0; @ %r391 bra $L77; .loc 1 342 4 add.u64 %r465,%frame,360; cvta.const.u64 %r395,$LC2r394395; call _gfortrani_bounds_ifunction_return344 4 cvta.const.u64 %r399,$LC3r2399395; call _gfortrani_bounds_equal_extents$L77: .loc 1 349 3 setp.ne.u64 %r401,%r150,0; @ %r401 bra $L79; $L82: .loc 1 357 8 ld.u64 %r165,[%r230]; .loc 1 358 8 ld.u64 %r163,[%r231]; .loc 1 360 9 setp.ne.u64 %r402,%r163,0; @ %r402 bra $L80; bra $L52; $L79: add.u64 %r174,%frame,480; add.u64 %r166,%r230,40; add.u64 
%r145,%frame,120; cvt.u32.u32 %r404,%r23; cvt.s64.s8 %r403,%r404; add.u64 %r131,%r403,-1; .loc 1 349 3 mov.u64 %r171,0; add.u64 %r465,%frame,360; .loc 1 351 16 mov.u64 %r405,%r171; $L81: st.u64 [%r174],%r405; .loc 1 352 18 ld.u64 %r406,[%r166]; st.u64 [%r145],%r406; .loc 1 353 17 shl.b64 %r408,%r171,3; add.u64 %r409,%r465,%r408; .loc 1 353 10 ld.u64 %r410,[%r409]; setp.le.s64 %r411,%r410,0; @ %r411 bra $L52; .loc 1 349 26 add.u64 %r171,%r171,1; .loc 1 349 3 add.u64 %r174,%r174,8; add.u64 %r166,%r166,24; add.u64 %r145,%r145,8; setp.ne.u64 %r412,%r131,%r171; @ %r412 bra $L81; bra $L82; $L80: .loc 1 283 12 mul.lo.u64 %r45,%r42,%r35; .loc 1 372 32 mul.lo.u64 %r413,%r40,%r235; shl.b64 %r95,%r413,2; .loc 1 396 22 ld.u64 %r197,[%frame+240]; .loc 1 396 12 shl.b64 %r200,%r197,2; .loc 1 397 23 ld.u64 %r201,[%frame]; .loc 1 398 22 ld.u64 %r203,[%frame+120]; .loc 1 398 12 shl.b64 %r205,%r203,3; setp.le.s64 %r463,%r150,1; .loc 1 384 13 setp.eq.u32 %r468,%r234,0; cvt.u32.u32 %r469,%r23; cvt.s64.s8 %r470,%r469; add.u64 %r471,%r470,-1; add.u64 %r472,%frame,240; add.u64 %r473,%frame,360; add.u64 %r474,%frame,120; $L95: .loc 1 349 3 mov.u64 %r170,%r164; mov.u64 %r159,%r163; .loc 1 372 9 mov.u64 %r139,0; $L86: .loc 1 375 7 ld.s8 %r206,[%r170]; mov.u64 %r190,%r139; .loc 1 378 11 add.u64 %r139,%r139,1; .loc 1 375 6 setp.eq.u32 %r459,%r206,0; @ %r459 bra $L83; .loc 1 382 6 setp.gt.s64 %r415,%r151,%r190; @ %r415 bra $L96; bra $L85; $L83: .loc 1 372 32 add.u64 %r159,%r159,%r95; .loc 1 372 47 add.u64 %r170,%r170,%r45; .loc 1 372 2 setp.ne.u64 %r416,%r151,%r139; @ %r416 bra $L86; .loc 1 371 9 mov.u64 %r139,0; bra $L85; $L96: .loc 1 382 6 mov.u64 %r169,%r159; $L84: .loc 1 388 14 add.u64 %r190,%r190,1; .loc 1 384 6 @ %r459 bra $L87; .loc 1 384 13 @ %r468 bra $L88; .loc 1 42call (%value_in),_gfortrani_memcmp_char4422,[%value_in]; } .loc 1 384 13 setp.gt.s32 %r423,%r422,0; @ %r423 bra $L87; $L89: .loc 1 382 6 mov.u64 %r159,%r169; .loc 1 388 14 mov.u64 %r139,%r190; bra $L87; $L88: .loc 1 42call (%value_in),_gfortrani_memcmp_char4427,[%value_in]; } .loc 1 384 13 setp.lt.s32 %r428,%r427,0; @ %r428 bra $L89; $L87: .loc 1 382 31 add.u64 %r169,%r169,%r95; .loc 1 382 46 add.u64 %r170,%r170,%r45; .loc 1 382 6 setp.eq.u64 %r429,%r151,%r190; @ %r429 bra $L85; .loc 1 384 7 ld.s8 %r206,[%r170]; setp.eq.u32 %r459,%r206,0; bra $L84; $L85: .loc 1 392 8 st.u64 [%r165],%r139; .loc 1 395 15 ld.u64 %r430,[%frame+480]; add.u64 %r98,%r430,1; st.u64 [%frame+480],%r98; .loc 1 396 12 add.u64 %r163,%r163,%r200; .loc 1 397 13 add.u64 %r164,%r164,%r201; .loc 1 398 12 add.u64 %r165,%r165,%r205; .loc 1 400 32 ld.u64 %r29,[%frame+360]; .loc 1 400 13 setp.ne.u64 %r431,%r29,%r98; @ %r431 bra $L95; .loc 1 404 13 mov.u64 %r432,0; st.u64 [%frame+480],%r432; .loc 1 407 23 mul.lo.u64 %r107,%r29,%r197; .loc 1 408 24 mul.lo.u64 %r433,%r29,%r201; .loc 1 408 10 sub.u64 %r160,%r164,%r433; .loc 1 409 23 mul.lo.u64 %r434,%r29,%r203; .loc 1 409 9 shl.b64 %r435,%r434,3; sub.u64 %r161,%r165,%r435; .loc 1 411 7 @ %r463 bra $L52; add.u64 %r198,%frame,488; mov.u64 %r196,8; .loc 1 410 5 mov.u64 %r162,1; bra $L92; $L93: .loc 1 404 13 st.u64 [%r198],%r432; .loc 1 407 23 mul.lo.u64 %r107,%r119,%r118; .loc 1 408 24 mul.lo.u64 %r441,%r121,%r118; .loc 1 408 10 sub.u64 %r160,%r164,%r441; .loc 1 409 23 mul.lo.u64 %r442,%r123,%r118; .loc 1 409 9 shl.b64 %r443,%r442,3; sub.u64 %r161,%r165,%r443; .loc 1 410 5 add.u64 %r162,%r162,1; .loc 1 411 7 add.u64 %r198,%r198,8; add.u64 %r196,%r196,8; setp.eq.u64 %r444,%r162,%r471; @ %r444 bra $L52; $L92: .loc 1 419 16 ld.u64 %r445,[%r198]; 
add.u64 %r118,%r445,1; st.u64 [%r198],%r118; .loc 1 420 23 add.u64 %r447,%r472,%r196; ld.u64 %r119,[%r447]; .loc 1 420 13 sub.u64 %r448,%r119,%r107; shl.b64 %r449,%r448,2; add.u64 %r163,%r163,%r449; .loc 1 421 24 add.u64 %r450,%frame,%r196; ld.u64 %r121,[%r450]; .loc 1 421 14 add.u64 %r164,%r160,%r121; .loc 1 422 23 add.u64 %r452,%r474,%r196; ld.u64 %r123,[%r452]; .loc 1 422 13 shl.b64 %r453,%r123,3; add.u64 %r165,%r161,%r453; .loc 1 400 32 add.u64 %r455,%r473,%r196; ld.u64 %r127,[%r455]; .loc 1 400 13 setp.eq.u64 %r456,%r118,%r127; @ %r456 bra $L93; bra $L95; $L112: .loc 1 305 6 ld.u64 %r457,[%r230]; setp.eq.u64 %r458,%r457,0; @ ! %r458 bra $L67; bra $L94; $L52: .loc 1 42_gfortran_sminloc1_8_s4 .visible .func _gfortran_sminloc1_8_slocal .align 16 .b8 %frame_ar[368u64 %r89; .reg .u643u64 %r125; .reg .u64predu16 %r180; .reg .predpred %r222; .reg .predu64 %r231; .reg .u16 %r233; .reg .u16 %r234; .reg .u32pred %r259; .reg .pred %r260; .reg .u64u64 %r267; .reg .u32pred %r303; .reg .u64 %r304; .reg .pred %r305; .reg .u64 %r306; .reg .u32mov.u64 %r156,%ar0; mov.u64 %r157,%ar1; mov.u64 %r158,%ar2; mov.u64 %r159,%ar3; mov.u32 %r160,%ar4; mov.u64 %r161,%ar5; .loc 1 449 6 setp.eq.u64 %r162,%r159,0; @ %r162 bra $L115; .loc 1 449 20 ld.u32 %r163,[%r159]; setp.eq.u32 %r164,%r163,0; @ %r164 bra $L116; $L115: .loc 1 452 7r1571580161; call _gfortran_minloc1_8_s4); } .loc 1 456 7 bra $L114; $L116: .loc 1 459 10 ld.u64 %r23,[%r158]; .loc 1 459 7 add.u64 %r95,%r23,-1; .loc 1 460 10 ld.s8 %r24,[%r157+28]; .loc 1 460 38 add.u32 %r26,%r24,-1; .loc 1 460 8 cvt.s64.s32 %r306,%r26; .loc 1 462 7 shr.u64 %r171,%r95,63; set.u32.gt.s64 %r173,%r95,%r306; neg.s32 %r174,%r173; cvt.u16.u64 %r177,%r171; cvt.u16.u32 %r178,%r174; or.b16 %r176,%r177,%r178; .loc 1 462 6 cvt.u32.u16 %r179,%r176; cvt.u16.u8 %r180,%r179; setp.ne.u16 %r181,%r180,0; @ %r181 bra $L118; .loc 1 469 3 setp.ne.u64 %r182,%r95,0; @ %r182 bra $L119; $L125: .loc 1 477 3 setp.lt.s64 %r183,%r95,%r306; @ %r183 bra $L120; bra $L173; $L118: .loc 1 464 7 cvt.u32.u32 %r186,%r24; cvt.s64.s8 %r185,%r186; st.u64 [%stack+8],%r1851841stack; call _gfortran_runtime_error119: add.u64 %r92,%r157,48; add.u64 %r114,%frame,120; add.u64 %r146,%r157,56; add.u64 %r188,%r157,24; add.u64 %r190,%r23,%r23; add.u64 %r191,%r190,%r23; shl.b64 %r192,%r191,3; add.u64 %r152,%r188,%r192; .loc 1 474 12 mov.u64 %r312,0; $L124: .loc 1 471 19 ld.u64 %r194,[%r146]; ld.u64 %r196,[%r92]; sub.u64 %r195,%r194,%r196; .loc 1 471 50 mad.lo.u64 %r38,%r195,%r161,%r161; .loc 1 473 10 setp.le.s64 %r197,%r38,0; @ %r197 bra $L122; .loc 1 471 17 st.u64 [%r114],%r38; bra $L123; $L122: .loc 1 474 12 st.u64 [%r114],%r312; $L123: .loc 1 469 3 add.u64 %r92,%r92,24; add.u64 %r114,%r114,8; add.u64 %r146,%r146,24; setp.ne.u64 %r199,%r92,%r152; @ %r199 bra $L124; bra $L125; $L173: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r200,%r107,0; @ ! 
%r200 bra $L127; bra $L126; $L120: add.u64 %r202,%r23,%r23; add.u64 %r203,%r202,%r23; shl.b64 %r204,%r203,3; add.u64 %r205,%r204,48; add.u64 %r83,%r157,%r205; add.u64 %r304,%frame,120; shl.b64 %r207,%r23,3; add.u64 %r208,%r207,-8; add.u64 %r144,%r304,%r208; add.u64 %r209,%r157,48; cvt.u32.u32 %r211,%r24; cvt.s64.s8 %r210,%r211; add.u64 %r213,%r210,%r210; add.u64 %r214,%r213,%r210; shl.b64 %r215,%r214,3; add.u64 %r72,%r209,%r215; .loc 1 483 12 mov.u64 %r311,0; $L130: .loc 1 480 2 ld.u64 %r217,[%r83+8]; ld.u64 %r219,[%r83]; sub.u64 %r218,%r217,%r219; .loc 1 480 37 mad.lo.u64 %r46,%r218,%r161,%r161; .loc 1 482 10 setp.le.s64 %r220,%r46,0; @ %r220 bra $L128; .loc 1 479 17 st.u64 [%r144],%r46; bra $L129; $L128: .loc 1 483 12 st.u64 [%r144],%r311; $L129: .loc 1 477 3 add.u64 %r83,%r83,24; add.u64 %r144,%r144,8; setp.ne.u64 %r222,%r72,%r83; @ %r222 bra $L130; bra $L174; $L126: .loc 1 490 7 setp.eq.u64 %r223,%r306,0; @ %r223 bra $L132; add.u64 %r304,%frame,120; $L152: add.u64 %r125,%r156,40; mov.u64 %r123,%r304; add.u64 %r224,%r156,16; cvt.u32.u32 %r226,%r24; cvt.s64.s8 %r225,%r226; add.u64 %r228,%r225,%r225; add.u64 %r229,%r228,%r225; shl.b64 %r230,%r229,3; add.u64 %r117,%r224,%r230; .loc 1 493 10 mov.u64 %r89,1; .loc 1 497 4 mov.u64 %r249,0; bra $L133; $L132: .loc 1 501 24 mov.u64 %r231,0; st.u64 [%r156+8],%r231; .loc 1 502 28 cvt.u16.u32 %r234,%r24; add.u16 %r233,%r234,-1; cvt.u32.u16 %r235,%r233; st.u8 [%r156+28],%r235; .loc 1 504 20 add.u32 %r236,%r24,-2; cvt.s64.s32 %r56,%r236; add.u64 %r238,%r56,%r56; add.u64 %r239,%r238,%r56; shl.b64 %r240,%r239,3; add.u64 %r241,%r156,%r240; .loc 1 504 67 shl.b64 %r243,%r56,3; add.u64 %r244,%frame,%r243; .loc 1 504 59 ld.u64 %r246,[%r241+40]; ld.u64 %r247,[%r244+120]; mul.lo.u64 %r103,%r246,%r247; .loc 1 506 10 setp.eq.u64 %r248,%r103,0; @ %r248 bra $L134; bra $L175; $L136: .loc 1 495 48 mul.lo.u64 %r89,%r52,%r89; $L133: .loc 1 497 4 st.u64 [%r125+8],%r249; ld.u64 %r52,[%r123]; add.u64 %r250,%r52,-1; st.u64 [%r125+16],%r250; st.u64 [%r125],%r89; .loc 1 490 7 add.u64 %r125,%r125,24; add.u64 %r123,%r123,8; setp.ne.u64 %r251,%r117,%r125; @ %r251 bra $L136; bra $L132; $L134: .loc 1 509 4 st.u64 [%r156+48],%r103; mov.u64 %r253,-1; st.u64 [%r156+56],%r253; mov.u64 %r254,1; st.u64 [%r156+40],%r254; .loc 1 510 4 bra $L114; $L175: .loc 1 513 24 mov.u64 %r256(%value_in),_gfortrani_xmallocarray257,[%value_in]; } mov.u64 %r107,%r257; .loc 1 513 22 st.u64 [%r156],%r107; $L140: .loc 1 539 3 setp.ne.u64 %r259,%r306,0; @ %r259 bra $L137; bra $L138; $L127: .loc 1 517 19 ld.s8 %r61,[%r156+28]; .loc 1 517 10 setp.eq.u64 %r260,%r61,%r306; @ %r260 bra $L139; .loc 1 518 2 st.u64 [%stack+8],%r306; st.u64 [%stack],%r61;_gfortran_runtime_error139: .loc 1 523 11 cvta.global.u64 %r263,_gfortrani_compile_options; .loc 1 523 10 ld.u32 %r264,[%r263+36]; setp.eq.u32 %r265,%r264,0; @ %r265 bra $L140; .loc 1 525 4 setp.eq.u64 %r266,%r306,0; @ %r266 bra $L138; add.u64 %r115,%r156,48; add.u64 %r105,%frame,120; cvt.u32.u32 %r268,%r24; cvt.s64.s8 %r267,%r268; add.u64 %r86,%r267,-1; .loc 1 525 10 mov.u64 %r113,0; $L142: .loc 1 529 21 ld.u64 %r270,[%r115+8]; add.u64 %r269,%r270,1; .loc 1 529 19 ld.u64 %r271,[%r115]; sub.u64 %r100,%r269,%r271; .loc 1 530 18 ld.u64 %r70,[%r105]; .loc 1 525 25 add.u64 %r113,%r113,1; .loc 1 530 11 setp.eq.u64 %r272,%r70,%r100; @ %r272 bra $L141; .loc 1 531 3 st.u64 [%stack+16],%r70; st.u64 [%stack+8],%r100; st.u64 [%stack],%r113;_gfortran_runtime_error141: .loc 1 525 4 add.u64 %r115,%r115,24; add.u64 %r105,%r105,8; setp.ne.u64 %r275,%r86,%r113; @ %r275 bra $L142; 
$L137: .loc 1 541 16 cvt.u16.u32 %r277,%r24; setp.le.s16 %r278,%r277,1; @ %r278 bra $L143; shl.b64 %r276,%r306,3; bra $L144; $L143: mov.u64 %r276,8; $L144: add.u64 %r280,%frame,240; mov.u32 %r2842766,[%value_in]; } add.u64 %r130,%r156,40; mov.u64 %r128,%frame; mov.u64 %r109,0; $L145: .loc 1 542 18 ld.u64 %r288,[%r130]; st.u64 [%r128],%r288; .loc 1 539 26 add.u64 %r109,%r109,1; .loc 1 539 3 add.u64 %r130,%r130,24; add.u64 %r128,%r128,8; setp.gt.s64 %r289,%r306,%r109; @ %r289 bra $L145; $L138: ld.u64 %r88,[%frame+240]; .loc 1 551 22 ld.u64 %r75,[%frame]; .loc 1 551 12 shl.b64 %r76,%r75,3; .loc 1 553 32 ld.u64 %r65,[%frame+120]; .loc 1 560 23 mul.lo.u64 %r49,%r65,%r75; setp.le.s64 %r305,%r306,1; .loc 1 549 13 mov.u64 %r290,0; cvt.u32.u32 %r308,%r24; cvt.s64.s8 %r309,%r308; add.u64 %r310,%r309,-1; $L151: st.u64 [%r107],%r290; .loc 1 550 15 add.u64 %r88,%r88,1; .loc 1 551 12 add.u64 %r107,%r107,%r76; .loc 1 553 13 setp.ne.u64 %r291,%r65,%r88; @ %r291 bra $L151; .loc 1 562 7 @ ! %r305 bra $L176; bra $L114; $L150: .loc 1 557 13 st.u64 [%r108],%r290; .loc 1 560 23 mul.lo.u64 %r78,%r81,%r80; .loc 1 561 5 add.u64 %r106,%r106,1; .loc 1 562 7 add.u64 %r108,%r108,8; add.u64 %r138,%r138,8; add.u64 %r137,%r137,8; setp.ne.u64 %r294,%r106,%r310; @ %r294 bra $L149; bra $L114; $L176: add.u64 %r108,%frame,248; add.u64 %r138,%frame,8; add.u64 %r137,%frame,128; .loc 1 560 23 mov.u64 %r78,%r49; .loc 1 561 5 mov.u64 %r106,1; $L149: .loc 1 566 16 ld.u64 %r299,[%r108]; add.u64 %r80,%r299,1; st.u64 [%r108],%r80; .loc 1 567 23 ld.u64 %r81,[%r138]; .loc 1 567 13 sub.u64 %r300,%r81,%r78; shl.b64 %r301,%r300,3; add.u64 %r107,%r107,%r301; .loc 1 553 32 ld.u64 %r84,[%r137]; .loc 1 553 13 setp.eq.u64 %r302,%r80,%r84; @ %r302 bra $L150; .loc 1 557 13 mov.u64 %r88,0; bra $L151; $L174: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r303,%r107,0; @ ! 
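The decimal byte lists noted above are the string data for these messages as the nvptx back end emits them: each number is one ASCII character code. As a quick illustration (not part of the package), a small C program can decode them; the array below is the first, fully intact constant from the dump, {77,73,78,76,79,67,0}, which reads back as "MINLOC".

/* Minimal sketch: decode one of the comma-separated decimal byte lists
 * embedded in the PTX dump above into a readable string.  The example
 * array is the 7-byte constant {77,73,78,76,79,67,0} that appears
 * verbatim in the object file; it decodes to "MINLOC".              */
#include <stdio.h>

int main(void)
{
    const unsigned char lc0[] = { 77, 73, 78, 76, 79, 67, 0 };
    for (size_t i = 0; i < sizeof lc0 && lc0[i] != 0; i++)
        putchar(lc0[i]);            /* prints: MINLOC */
    putchar('\n');
    return 0;
}

The longer messages decode the same way; in this extraction their trailing bytes are fused with the start of the next constant, which is why only the readable prefixes are listed in the summary above.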
$L114: .loc 1 571 1 ret; }   [closing return of _gfortran_sminloc1_8_s4, line 571 of minloc1_8_s4.c]

[ar member: minloc1_16_s1.o   timestamp 1622802213]
[PTX from "../../../libgfortran/generated/minloc1_16_s1.c".  As in the previous member, the declaration lists and
 instruction stream are garbled by extraction; the recoverable information:]

  Functions defined (.visible .func):
    _gfortran_minloc1_16_s1   -- reduction proper; the kind=16 result index is kept as a 128-bit value, visible in the
                                 paired 64-bit stores (st.u64 [%r130]; st.u64 [%r130+8]) and the 16-byte result stride
    _gfortran_mminloc1_16_s1  -- MASK-array variant; calls _gfortran_minloc1_16_s1 when the MASK descriptor is null
    _gfortran_sminloc1_16_s1  -- scalar-mask variant; calls _gfortran_minloc1_16_s1 when the scalar MASK is absent or true
  External symbols referenced:
    _gfortrani_compile_options (.extern .global .align 8 .u64); _gfortran_runtime_error, _gfortrani_xmallocarray,
    _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents (.extern .func); plain memcmp is called for the
    kind=1 character comparisons instead of _gfortrani_memcmp_char4.
  String constants: the same decoded MINLOC error messages as in minloc1_8_s4.o above.
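Every minloc1_* entry point summarized above performs the same reduction: walk one dimension of a character array, compare elements byte-wise (plain memcmp for kind=1 strings, _gfortrani_memcmp_char4 for kind=4), and record the 1-based position of the first minimum, or 0 for a zero-sized extent. The sketch below shows that loop for the simplest case, a contiguous two-dimensional array of fixed-length byte strings reduced along the first dimension; it is an illustration of the algorithm only, not the libgfortran source, and the function name and flat layout are assumptions made for the example.

/* Minimal sketch (not the libgfortran source): the reduction that the
 * _gfortran_minloc1_* entry points implement, shown for a contiguous
 * nrows x ncols array of fixed-length byte strings, reducing along the
 * row dimension.  result[j] receives the 1-based row index of the
 * smallest string in column j, or 0 when the extent is zero.        */
#include <string.h>
#include <stddef.h>

void minloc1_sketch(long *result, const char *base, size_t len,
                    long nrows, long ncols)
{
    for (long j = 0; j < ncols; j++) {
        long pos = 0;                      /* 0 means "no element yet" */
        const char *best = NULL;
        for (long i = 0; i < nrows; i++) {
            const char *p = base + (i + j * nrows) * len;
            if (best == NULL || memcmp(p, best, len) < 0) {
                best = p;
                pos = i + 1;               /* Fortran result is 1-based */
            }
        }
        result[j] = pos;
    }
}

The real routines additionally walk an arbitrary-rank array descriptor and allocate the result array through _gfortrani_xmallocarray, which is what most of the stride bookkeeping on %frame in the PTX bodies is doing.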
$L118: .loc 1 571 1 ret; }   [closing return of _gfortran_sminloc1_16_s1, line 571 of minloc1_16_s1.c]

[ar member: minloc1_16_s4.o   timestamp 1622802213]
[PTX from minloc1_16_s4.c ("...fortran/generated/minloc1_16_s4.c", path truncated in extraction).  Recoverable information:]

  Functions declared (.visible .func): _gfortran_minloc1_16_s4, _gfortran_mminloc1_16_s4, _gfortran_sminloc1_16_s4
  External symbols referenced: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray,
    _gfortrani_bounds_ifunction_return, _gfortrani_memcmp_char4, _gfortrani_bounds_equal_extents
  String constants: the same decoded MINLOC error messages as in the members above.

  The body of _gfortran_minloc1_16_s4 begins here, but the extraction breaks off mid-instruction at the end of this excerpt.
%r341 bra $L32; bra $L34; $L33: .loc 1 42196; call (%value_in),_gfortrani_memcmp_char4345,[%value_in]; } .loc 1 173 22 setp.lt.s32 %r346,%r345,0; @ %r346 bra $L34; $L32: .loc 1 170 36 add.u64 %r111,%r111,%r68; .loc 1 170 6 add.u64 %r349,%r392,1; set.u32.lt.u64 %r354,%r349,%r392; cvt.s64.s32 %r352,%r354; mov.u64 %r392,%r349; sub.u64 %r393,%r393,%r352; add.u64 %r359,%r392,-1; setp.gt.s64 %r361,%r108,%r359; @ %r361 bra $L29; .loc 1 181 12 st.u64 [%r116],%r390; st.u64 [%r116+8],%r391; $L30: .loc 1 185 15 add.u64 %r190,%r190,1; .loc 1 186 12 add.u64 %r115,%r115,%r186; .loc 1 187 12 add.u64 %r116,%r116,%r189; .loc 1 189 32 ld.u64 %r139,[%frame+240]; .loc 1 189 13 setp.ne.u64 %r364,%r190,%r139; @ %r364 bra $L25; .loc 1 196 23 mul.lo.u64 %r365,%r190,%r184; .loc 1 196 9 shl.b64 %r366,%r365,2; sub.u64 %r112,%r115,%r366; .loc 1 197 23 mul.lo.u64 %r367,%r190,%r187; .loc 1 197 9 shl.b64 %r368,%r367,4; sub.u64 %r113,%r116,%r368; .loc 1 199 7 @ %r401 bra $L1; add.u64 %r122,%frame,368; mov.u64 %r143,8; .loc 1 198 5 mov.u64 %r114,1; .loc 1 193 13 mov.u64 %r402,0; bra $L36; $L37: st.u64 [%r122],%r402; .loc 1 196 23 mul.lo.u64 %r374,%r80,%r79; .loc 1 196 9 shl.b64 %r375,%r374,2; sub.u64 %r112,%r115,%r375; .loc 1 197 23 mul.lo.u64 %r376,%r83,%r79; .loc 1 197 9 shl.b64 %r377,%r376,4; sub.u64 %r113,%r116,%r377; .loc 1 198 5 add.u64 %r114,%r114,1; .loc 1 199 7 add.u64 %r122,%r122,8; add.u64 %r143,%r143,8; setp.eq.u64 %r378,%r114,%r407; @ %r378 bra $L1; $L36: .loc 1 207 16 ld.u64 %r379,[%r122]; add.u64 %r79,%r379,1; st.u64 [%r122],%r79; .loc 1 208 23 add.u64 %r381,%r408,%r143; ld.u64 %r80,[%r381]; .loc 1 208 13 shl.b64 %r382,%r80,2; add.u64 %r115,%r112,%r382; .loc 1 209 23 add.u64 %r383,%frame,%r143; ld.u64 %r83,[%r383]; .loc 1 209 13 shl.b64 %r384,%r83,4; add.u64 %r116,%r113,%r384; .loc 1 189 32 add.u64 %r386,%r409,%r143; ld.u64 %r86,[%r386]; .loc 1 189 13 setp.eq.u64 %r387,%r79,%r86; @ %r387 bra $L37; mov.u64 %r190,0; bra $L25; $L51: .loc 1 102 6 ld.u64 %r388,[%r192]; setp.eq.u64 %r389,%r388,0; @ ! 
%r389 bra $L11; bra $L38; $L1: .loc 1 21_gfortran_mminloc1_16_s4 .visible .func _gfortran_mminloc1_16_s608pred %r265; .reg .u64 %r266; .reg .u64 %r267; .reg .u32 %r268; .reg .u64 %r273; .reg .u64 %r274; .reg .u64 %r276; .reg .u64 %r277; .reg .u64 %r284; .reg .pred %r285; .reg .u32 %r286; .reg .u32 %r287; .reg .u32predu64predu16 %r382; .reg .u16 %r383; .reg .u32 %r384; .reg .pred %r385; .reg .u64 %r386; .reg .u64u64 %r410; .reg .pred %r412; .reg .pred %r413; .reg .u64 %r414; .reg .u32pred %r423; .reg .u64 %r424; .reg .pred %r427; .reg .u64 %r431; .reg .u64 %r434; .reg .u64 %r437; .reg .u64 %r440; .reg .u32 %r442; .reg .u64 %r446; .reg .u64 %r448; .reg .u64 %r451; .reg .u32 %r453; .reg .u64 %r457; .reg .pred %r458; .reg .u32 %r464; .reg .pred %r465; .reg .u32 %r469; .reg .pred %r470; .reg .u64predu64 %r511; .reg .u64 %r513; .reg .u64 %r514; .reg .u64 %r516; .reg .pred %r517; .reg .u64 %r518; .reg .pred %r519; .reg .u64 %r520; .reg .u64 %r521; .reg .u64pred %r543; .reg .u64 %r544; .reg .u64 %r545; .reg .pred %r546; .reg .u64 %r549; .reg .pred %r550; .reg .u32 %r551; .reg .u64u64 %r556; .reg .u64 %r557; .reg .u64 %r558;.loc 1 244 6 setp.ne.u64 %r247,%r244,0; @ %r247 bra $L42243245246; call _gfortran_minloc1_16_s4); } .loc 1 251 7 bra $L52; $L53: .loc 1 254 10 ld.u64 %r22,[%r243]; .loc 1 254 7 add.u64 %r145,%r22,-1; .loc 1 255 10 ld.s8 %r23,[%r242+28]; .loc 1 255 38 add.u32 %r253,%r23,-1; .loc 1 255 8 cvt.s64.s32 %r146,%r253; .loc 1 258 7 shr.u64 %r255,%r145,63; set.u32.gt.s64 %r257,%r145,%r146; neg.s32 %r258,%r257; cvt.u16.u64 %r261,%r255; cvt.u16.u32 %r262,%r258; or.b16 %r260,%r261,%r262; .loc 1 258 6 cvt.u32.u16 %r263,%r260; cvt.u16.u8 %r264,%r263; setp.eq.u16 %r265,%r264,0; @ %r265 bra $L55; .loc 1 260 7 cvt.u32.u32 %r268,%r23; cvt.s64.s8 %r267,%r268; st.u64 [%stack+8],%r267; st.u64 [%stack],%r22; cvta.const.u64 %r266266_gfortran_runtime_error5: .loc 1 265 9 add.u64 %r542,%r145,%r145; add.u64 %r541,%r542,%r145; shl.b64 %r273,%r541,3; add.u64 %r274,%r242,%r273; ld.u64 %r277,[%r274+56]; add.u64 %r276,%r277,1; .loc 1 265 7 ld.u64 %r284,[%r274+48]; sub.u64 %r147,%r276,%r284; .loc 1 266 6 setp.le.s64 %r285,%r147,0; @ %r285 bra $L52; .loc 1 269 9 ld.u64 %r159,[%r244]; .loc 1 271 15 ld.u64 %r35,[%r244+16]; .loc 1 273 22 cvt.u32.u64 %r36,%r35; .loc 1 273 53 add.u32 %r286,%r36,-4; and.b32 %r287,%r286,-5; set.u32.eq.u32 %r289,%r287,0; neg.s32 %r290,%r289; .loc 1 273 22 add.u32 %r291,%r36,-1; set.u32.le.u32 %r293,%r291,1; neg.s32 %r294,%r293; .loc 1 273 6 cvt.u16.u32 %r296,%r290; cvt.u16.u32 %r297,%r294; or.loc 1 275 7 setp.ne.u32 %r302,%r36,16; @ %r302 bra $L58; $L57: .loc 1 282 11 shl.b64 %r306,%r541,3; add.u64 %r307,%r242,%r306; ld.u64 %r39,[%r307+40]; .loc 1 283 12 add.u64 %r313,%r244,%r306; ld.u64 %r41,[%r313+40]; .loc 1 285 3 setp.ne.u64 %r315,%r145,0; @ %r315 bra $L59; $L65: .loc 1 295 3 setp.lt.s64 %r316,%r145,%r146; @ %r316 bra $L60; bra $L112; $L58: .loc 1 280 5 cvta.const.u64 %r317317_gfortran_runtime_error9: add.u64 %r74,%r242,40; add.u64 %r224,%frame,240; add.u64 %r93,%r244,40; mov.u64 %r232,%frame; add.u64 %r195,%frame,360; add.u64 %r320,%r22,%r22; add.u64 %r321,%r320,%r22; shl.b64 %r322,%r321,3; add.u64 %r323,%r242,16; add.u64 %r236,%r322,%r323; .loc 1 292 12 mov.u64 %r558,0; $L64: .loc 1 287 51 ld.u64 %r325,[%r74]; mul.lo.u64 %r324,%r325,%r246; .loc 1 287 18 st.u64 [%r224],%r324; .loc 1 288 20 ld.u64 %r327,[%r93]; mul.lo.u64 %r326,%r327,%r35; .loc 1 288 18 st.u64 [%r232],%r326; .loc 1 289 19 ld.u64 %r329,[%r74+16]; add.u64 %r328,%r329,1; ld.u64 %r330,[%r74+8]; sub.u64 %r56,%r328,%r330; 
.loc 1 291 10 setp.lt.s64 %r331,%r56,0; @ %r331 bra $L62; .loc 1 289 17 st.u64 [%r195],%r56; bra $L63; $L62: .loc 1 292 12 st.u64 [%r195],%r558; $L63: .loc 1 285 3 add.u64 %r74,%r74,24; add.u64 %r224,%r224,8; add.u64 %r93,%r93,24; add.u64 %r232,%r232,8; add.u64 %r195,%r195,8; setp.ne.u64 %r333,%r74,%r236; @ %r333 bra $L64; bra $L65; $L112: .loc 1 305 6 ld.u64 %r334,[%r241]; setp.eq.u64 %r335,%r334,0; @ ! %r335 bra $L67; bra $L66; $L60: add.u64 %r337,%r22,%r22; add.u64 %r338,%r337,%r22; shl.b64 %r339,%r338,3; add.u64 %r129,%r339,40; add.u64 %r135,%r242,%r129; shl.b64 %r340,%r22,3; add.u64 %r125,%r340,-8; add.u64 %r545,%frame,240; add.u64 %r127,%r545,%r125; add.u64 %r122,%r244,%r129; add.u64 %r89,%frame,%r125; add.u64 %r544,%frame,360; add.u64 %r43,%r544,%r125; add.u64 %r343,%r242,40; cvt.u32.u32 %r345,%r23; cvt.s64.s8 %r344,%r345; add.u64 %r347,%r344,%r344; add.u64 %r348,%r347,%r344; shl.b64 %r349,%r348,3; add.u64 %r217,%r343,%r349; .loc 1 302 12 mov.u64 %r557,0; $L70: .loc 1 297 55 ld.u64 %r351,[%r135]; mul.lo.u64 %r350,%r351,%r246; .loc 1 297 18 st.u64 [%r127],%r350; .loc 1 298 20 ld.u64 %r353,[%r122]; mul.lo.u64 %r352,%r353,%r35; .loc 1 298 18 st.u64 [%r89],%r352; .loc 1 299 19 ld.u64 %r355,[%r135+16]; add.u64 %r354,%r355,1; ld.u64 %r356,[%r135+8]; sub.u64 %r72,%r354,%r356; .loc 1 301 10 setp.lt.s64 %r357,%r72,0; @ %r357 bra $L68; .loc 1 299 17 st.u64 [%r43],%r72; bra $L69; $L68: .loc 1 302 12 st.u64 [%r43],%r557; $L69: .loc 1 295 3 add.u64 %r135,%r135,24; add.u64 %r127,%r127,8; add.u64 %r122,%r122,24; add.u64 %r89,%r89,8; add.u64 %r43,%r43,8; setp.ne.u64 %r359,%r135,%r217; @ %r359 bra $L70; bra $L113; $L66: .loc 1 309 7 setp.eq.u64 %r360,%r146,0; @ %r360 bra $L72; add.u64 %r544,%frame,360; $L95: add.u64 %r174,%r241,40; mov.u64 %r161,%r544; cvt.u32.u32 %r362,%r23; cvt.s64.s8 %r361,%r362; add.u64 %r364,%r361,%r361; add.u64 %r365,%r364,%r361; shl.b64 %r366,%r365,3; add.u64 %r367,%r241,16; add.u64 %r137,%r366,%r367; .loc 1 312 10 mov.u64 %r133,1; .loc 1 316 4 mov.u64 %r386,0; bra $L73; $L72: .loc 1 320 20 add.u32 %r368,%r23,-2; cvt.s64.s32 %r80,%r368; add.u64 %r370,%r80,%r80; add.u64 %r371,%r370,%r80; shl.b64 %r372,%r371,3; add.u64 %r373,%r241,%r372; .loc 1 320 67 shl.b64 %r375,%r80,3; add.u64 %r376,%frame,%r375; .loc 1 320 59 ld.u64 %r378,[%r373+40]; ld.u64 %r379,[%r376+360]; mul.lo.u64 %r153,%r378,%r379; .loc 1 322 24 mov.u64 %r380,0; st.u64 [%r241+8],%r380; .loc 1 323 28 cvt.u16.u32 %r383,%r23; add.u16 %r382,%r383,-1; cvt.u32.u16 %r384,%r382; st.u8 [%r241+28],%r384; .loc 1 325 10 setp.eq.u64 %r385,%r153,0; @ %r385 bra $L74; bra $L114; $L76: .loc 1 314 47 mul.lo.u64 %r133,%r77,%r133; $L73: .loc 1 316 4 st.u64 [%r174+8],%r386; ld.u64 %r77,[%r161]; add.u64 %r387,%r77,-1; st.u64 [%r174+16],%r387; st.u64 [%r174],%r133; .loc 1 309 7 add.u64 %r174,%r174,24; add.u64 %r161,%r161,8; setp.ne.u64 %r388,%r137,%r174; @ %r388 bra $L76; bra $L72; $L74: .loc 1 328 4 st.u64 [%r241+48],%r153; mov.u64 %r390,-1; st.u64 [%r241+56],%r390; mov.u64 %r391,1; st.u64 [%r241+40],%r391; .loc 1 329 4 bra $L52; $L114: .loc 1 332 24 mov.u64 %r393,16393; call (%value_in),_gfortrani_xmallocarray94,[%value_in]; } .loc 1 332 22 st.u64 [%r241],%r394; bra $L77; $L67: .loc 1 337 19 ld.s8 %r396,[%r241+28]; .loc 1 337 10 setp.eq.u64 %r397,%r396,%r146; @ %r397 bra $L78; .loc 1 338 2 cvta.const.u64 %r398,$LC5stack; call _gfortran_runtime_error78: .loc 1 340 11 cvta.global.u64 %r400,_gfortrani_compile_options; .loc 1 340 10 ld.u32 %r401,[%r400+36]; setp.eq.u32 %r402,%r401,0; @ %r402 bra $L77; .loc 1 342 4 add.u64 %r544,%frame,360; 
cvta.const.u64 %r406,$LC2r5440406; call _gfortrani_bounds_ifunction_return344 4 cvta.const.u64 %r410,$LC424210406; call _gfortrani_bounds_equal_extents$L77: .loc 1 349 3 setp.ne.u64 %r412,%r146,0; @ %r412 bra $L79; $L82: .loc 1 357 8 ld.u64 %r160,[%r241]; .loc 1 358 8 ld.u64 %r158,[%r242]; .loc 1 360 9 setp.ne.u64 %r413,%r158,0; @ %r413 bra $L80; bra $L52; $L79: add.u64 %r196,%frame,480; add.u64 %r193,%r241,40; add.u64 %r189,%frame,120; cvt.u32.u32 %r415,%r23; cvt.s64.s8 %r414,%r415; add.u64 %r177,%r414,-1; .loc 1 349 3 mov.u64 %r165,0; add.u64 %r544,%frame,360; .loc 1 351 16 mov.u64 %r416,%r165; $L81: st.u64 [%r196],%r416; .loc 1 352 18 ld.u64 %r417,[%r193]; st.u64 [%r189],%r417; .loc 1 353 17 shl.b64 %r419,%r165,3; add.u64 %r420,%r544,%r419; .loc 1 353 10 ld.u64 %r421,[%r420]; setp.le.s64 %r422,%r421,0; @ %r422 bra $L52; .loc 1 349 26 add.u64 %r165,%r165,1; .loc 1 349 3 add.u64 %r196,%r196,8; add.u64 %r193,%r193,24; add.u64 %r189,%r189,8; setp.ne.u64 %r423,%r165,%r177; @ %r423 bra $L81; bra $L82; $L80: .loc 1 283 12 mul.lo.u64 %r44,%r41,%r35; .loc 1 372 32 mul.lo.u64 %r424,%r39,%r246; shl.b64 %r92,%r424,2; .loc 1 396 22 ld.u64 %r219,[%frame+240]; .loc 1 396 12 shl.b64 %r222,%r219,2; .loc 1 397 23 ld.u64 %r223,[%frame]; .loc 1 398 22 ld.u64 %r226,[%frame+120]; .loc 1 398 12 shl.b64 %r228,%r226,4; setp.le.s64 %r543,%r146,1; add.u64 %r549,%r147,-1; .loc 1 384 13 setp.eq.u32 %r550,%r245,0; cvt.u32.u32 %r551,%r23; cvt.s64.s8 %r552,%r551; add.u64 %r553,%r552,-1; add.u64 %r554,%frame,240; add.u64 %r555,%frame,360; add.u64 %r556,%frame,120; $L96: .loc 1 349 3 mov.u64 %r164,%r159; mov.u64 %r154,%r158; .loc 1 372 9 mov.u64 %r198,0; $L86: .loc 1 375 7 ld.s8 %r230,[%r164]; mov.u64 %r235,%r198; add.u64 %r198,%r198,1; .loc 1 375 6 setp.eq.u32 %r546,%r230,0; @ %r546 bra $L83; .loc 1 378 11 mov.u64 %r520,%r198; shr.s64 %r521,%r520,63; .loc 1 382 6 setp.le.s64 %r427,%r147,%r235; @ %r427 bra $L84; add.u64 %r524,%r235,1; shr.s64 %r525,%r524,63; sub.u64 %r431,%r549,%r235; shr.s64 %r434,%r235,63; add.u64 %r437,%r235,2; set.u32.lt.u64 %r442,%r437,%r235; cvt.s64.s32 %r440,%r442; sub.u64 %r446,%r434,%r440; add.u64 %r448,%r431,%r437; set.u32.lt.u64 %r453,%r448,%r431; cvt.s64.s32 %r451,%r453; sub.u64 %r457,%r446,%r451; mov.u64 %r163,%r154; bra $L85; $L83: .loc 1 372 32 add.u64 %r154,%r154,%r92; .loc 1 372 47 add.u64 %r164,%r164,%r44; .loc 1 372 2 setp.ne.u64 %r458,%r147,%r198; @ %r458 bra $L86; .loc 1 371 9 mov.u64 %r520,0; mov.u64 %r521,%r520; bra $L84; $L85: .loc 1 384 6 @ %r546 bra $L87; .loc 1 384 13 @ %r550 bra $L88; .loc 1 4216r154246; call (%value_in),_gfortrani_memcmp_char4464,[%value_in]; } .loc 1 384 13 setp.gt.s32 %r465,%r464,0; @ %r465 bra $L87; $L89: .loc 1 388 14 mov.u64 %r520,%r524; mov.u64 %r521,%r525; mov.u64 %r154,%r163; bra $L87; $L88: .loc 1 4216r154246; call (%value_in),_gfortrani_memcmp_char4469,[%value_in]; } .loc 1 384 13 setp.lt.s32 %r470,%r469,0; @ %r470 bra $L89; $L87: .loc 1 382 31 add.u64 %r163,%r163,%r92; .loc 1 382 46 add.u64 %r164,%r164,%r44; .loc 1 382 6 add.u64 %r473,%r524,1; set.u32.lt.u64 %r478,%r473,%r524; cvt.s64.s32 %r476,%r478; mov.u64 %r524,%r473; sub.u64 %r525,%r525,%r476; setp.ne.u64 %r485,%r448,%r524; @ %r485 bra $L97; setp.ne.u64 %r488,%r457,%r525; @ ! 
%r488 bra $L84; $L97: .loc 1 384 7 ld.s8 %r230,[%r164]; setp.eq.u32 %r546,%r230,0; bra $L85; $L84: .loc 1 392 8 st.u64 [%r160],%r520; st.u64 [%r160+8],%r521; .loc 1 395 15 ld.u64 %r491,[%frame+480]; add.u64 %r95,%r491,1; st.u64 [%frame+480],%r95; .loc 1 396 12 add.u64 %r158,%r158,%r222; .loc 1 397 13 add.u64 %r159,%r159,%r223; .loc 1 398 12 add.u64 %r160,%r160,%r228; .loc 1 400 32 ld.u64 %r29,[%frame+360]; .loc 1 400 13 setp.ne.u64 %r492,%r29,%r95; @ %r492 bra $L96; .loc 1 404 13 mov.u64 %r493,0; st.u64 [%frame+480],%r493; .loc 1 407 23 mul.lo.u64 %r105,%r29,%r219; .loc 1 408 24 mul.lo.u64 %r494,%r29,%r223; .loc 1 408 10 sub.u64 %r155,%r159,%r494; .loc 1 409 23 mul.lo.u64 %r495,%r29,%r226; .loc 1 409 9 shl.b64 %r496,%r495,4; sub.u64 %r156,%r160,%r496; .loc 1 411 7 @ %r543 bra $L52; add.u64 %r215,%frame,488; mov.u64 %r214,8; .loc 1 410 5 mov.u64 %r157,1; bra $L93; $L94: .loc 1 404 13 st.u64 [%r215],%r493; .loc 1 407 23 mul.lo.u64 %r105,%r115,%r114; .loc 1 408 24 mul.lo.u64 %r502,%r117,%r114; .loc 1 408 10 sub.u64 %r155,%r159,%r502; .loc 1 409 23 mul.lo.u64 %r503,%r119,%r114; .loc 1 409 9 shl.b64 %r504,%r503,4; sub.u64 %r156,%r160,%r504; .loc 1 410 5 add.u64 %r157,%r157,1; .loc 1 411 7 add.u64 %r215,%r215,8; add.u64 %r214,%r214,8; setp.eq.u64 %r505,%r157,%r553; @ %r505 bra $L52; $L93: .loc 1 419 16 ld.u64 %r506,[%r215]; add.u64 %r114,%r506,1; st.u64 [%r215],%r114; .loc 1 420 23 add.u64 %r508,%r554,%r214; ld.u64 %r115,[%r508]; .loc 1 420 13 sub.u64 %r509,%r115,%r105; shl.b64 %r510,%r509,2; add.u64 %r158,%r158,%r510; .loc 1 421 24 add.u64 %r511,%frame,%r214; ld.u64 %r117,[%r511]; .loc 1 421 14 add.u64 %r159,%r155,%r117; .loc 1 422 23 add.u64 %r513,%r556,%r214; ld.u64 %r119,[%r513]; .loc 1 422 13 shl.b64 %r514,%r119,4; add.u64 %r160,%r156,%r514; .loc 1 400 32 add.u64 %r516,%r555,%r214; ld.u64 %r123,[%r516]; .loc 1 400 13 setp.eq.u64 %r517,%r114,%r123; @ %r517 bra $L94; bra $L96; $L113: .loc 1 305 6 ld.u64 %r518,[%r241]; setp.eq.u64 %r519,%r518,0; @ ! 
%r519 bra $L67; bra $L95; $L52: .loc 1 42_gfortran_sminloc1_16_s4 .visible .func _gfortran_sminloc1_16_slocal .align 16 .b8 %frame_ar[368u64 %r89; .reg .u643u64 %r125; .reg .u64predu16 %r180; .reg .predpred %r222; .reg .predu64 %r231; .reg .u16 %r233; .reg .u16 %r234; .reg .u32pred %r259; .reg .pred %r260; .reg .u64u64 %r267; .reg .u32predmov.u64 %r156,%ar0; mov.u64 %r157,%ar1; mov.u64 %r158,%ar2; mov.u64 %r159,%ar3; mov.u32 %r160,%ar4; mov.u64 %r161,%ar5; .loc 1 449 6 setp.eq.u64 %r162,%r159,0; @ %r162 bra $L116; .loc 1 449 20 ld.u32 %r163,[%r159]; setp.eq.u32 %r164,%r163,0; @ %r164 bra $L117; $L116: .loc 1 452 7r1571580161; call _gfortran_minloc1_16_s4); } .loc 1 456 7 bra $L115; $L117: .loc 1 459 10 ld.u64 %r23,[%r158]; .loc 1 459 7 add.u64 %r95,%r23,-1; .loc 1 460 10 ld.s8 %r24,[%r157+28]; .loc 1 460 38 add.u32 %r26,%r24,-1; .loc 1 460 8 cvt.s64.s32 %r307,%r26; .loc 1 462 7 shr.u64 %r171,%r95,63; set.u32.gt.s64 %r173,%r95,%r307; neg.s32 %r174,%r173; cvt.u16.u64 %r177,%r171; cvt.u16.u32 %r178,%r174; or.b16 %r176,%r177,%r178; .loc 1 462 6 cvt.u32.u16 %r179,%r176; cvt.u16.u8 %r180,%r179; setp.ne.u16 %r181,%r180,0; @ %r181 bra $L119; .loc 1 469 3 setp.ne.u64 %r182,%r95,0; @ %r182 bra $L120; $L126: .loc 1 477 3 setp.lt.s64 %r183,%r95,%r307; @ %r183 bra $L121; bra $L174; $L119: .loc 1 464 7 cvt.u32.u32 %r186,%r24; cvt.s64.s8 %r185,%r186; st.u64 [%stack+8],%r1851841stack; call _gfortran_runtime_error120: add.u64 %r92,%r157,48; add.u64 %r114,%frame,120; add.u64 %r146,%r157,56; add.u64 %r188,%r157,24; add.u64 %r190,%r23,%r23; add.u64 %r191,%r190,%r23; shl.b64 %r192,%r191,3; add.u64 %r152,%r188,%r192; .loc 1 474 12 mov.u64 %r313,0; $L125: .loc 1 471 19 ld.u64 %r194,[%r146]; ld.u64 %r196,[%r92]; sub.u64 %r195,%r194,%r196; .loc 1 471 50 mad.lo.u64 %r38,%r195,%r161,%r161; .loc 1 473 10 setp.le.s64 %r197,%r38,0; @ %r197 bra $L123; .loc 1 471 17 st.u64 [%r114],%r38; bra $L124; $L123: .loc 1 474 12 st.u64 [%r114],%r313; $L124: .loc 1 469 3 add.u64 %r92,%r92,24; add.u64 %r114,%r114,8; add.u64 %r146,%r146,24; setp.ne.u64 %r199,%r92,%r152; @ %r199 bra $L125; bra $L126; $L174: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r200,%r107,0; @ ! 
%r200 bra $L128; bra $L127; $L121: add.u64 %r202,%r23,%r23; add.u64 %r203,%r202,%r23; shl.b64 %r204,%r203,3; add.u64 %r205,%r204,48; add.u64 %r83,%r157,%r205; add.u64 %r305,%frame,120; shl.b64 %r207,%r23,3; add.u64 %r208,%r207,-8; add.u64 %r144,%r305,%r208; add.u64 %r209,%r157,48; cvt.u32.u32 %r211,%r24; cvt.s64.s8 %r210,%r211; add.u64 %r213,%r210,%r210; add.u64 %r214,%r213,%r210; shl.b64 %r215,%r214,3; add.u64 %r72,%r209,%r215; .loc 1 483 12 mov.u64 %r312,0; $L131: .loc 1 480 2 ld.u64 %r217,[%r83+8]; ld.u64 %r219,[%r83]; sub.u64 %r218,%r217,%r219; .loc 1 480 37 mad.lo.u64 %r46,%r218,%r161,%r161; .loc 1 482 10 setp.le.s64 %r220,%r46,0; @ %r220 bra $L129; .loc 1 479 17 st.u64 [%r144],%r46; bra $L130; $L129: .loc 1 483 12 st.u64 [%r144],%r312; $L130: .loc 1 477 3 add.u64 %r83,%r83,24; add.u64 %r144,%r144,8; setp.ne.u64 %r222,%r72,%r83; @ %r222 bra $L131; bra $L175; $L127: .loc 1 490 7 setp.eq.u64 %r223,%r307,0; @ %r223 bra $L133; add.u64 %r305,%frame,120; $L153: add.u64 %r125,%r156,40; mov.u64 %r123,%r305; add.u64 %r224,%r156,16; cvt.u32.u32 %r226,%r24; cvt.s64.s8 %r225,%r226; add.u64 %r228,%r225,%r225; add.u64 %r229,%r228,%r225; shl.b64 %r230,%r229,3; add.u64 %r117,%r224,%r230; .loc 1 493 10 mov.u64 %r89,1; .loc 1 497 4 mov.u64 %r249,0; bra $L134; $L133: .loc 1 501 24 mov.u64 %r231,0; st.u64 [%r156+8],%r231; .loc 1 502 28 cvt.u16.u32 %r234,%r24; add.u16 %r233,%r234,-1; cvt.u32.u16 %r235,%r233; st.u8 [%r156+28],%r235; .loc 1 504 20 add.u32 %r236,%r24,-2; cvt.s64.s32 %r56,%r236; add.u64 %r238,%r56,%r56; add.u64 %r239,%r238,%r56; shl.b64 %r240,%r239,3; add.u64 %r241,%r156,%r240; .loc 1 504 67 shl.b64 %r243,%r56,3; add.u64 %r244,%frame,%r243; .loc 1 504 59 ld.u64 %r246,[%r241+40]; ld.u64 %r247,[%r244+120]; mul.lo.u64 %r103,%r246,%r247; .loc 1 506 10 setp.eq.u64 %r248,%r103,0; @ %r248 bra $L135; bra $L176; $L137: .loc 1 495 48 mul.lo.u64 %r89,%r52,%r89; $L134: .loc 1 497 4 st.u64 [%r125+8],%r249; ld.u64 %r52,[%r123]; add.u64 %r250,%r52,-1; st.u64 [%r125+16],%r250; st.u64 [%r125],%r89; .loc 1 490 7 add.u64 %r125,%r125,24; add.u64 %r123,%r123,8; setp.ne.u64 %r251,%r117,%r125; @ %r251 bra $L137; bra $L133; $L135: .loc 1 509 4 st.u64 [%r156+48],%r103; mov.u64 %r253,-1; st.u64 [%r156+56],%r253; mov.u64 %r254,1; st.u64 [%r156+40],%r254; .loc 1 510 4 bra $L115; $L176: .loc 1 513 24 mov.u64 %r256,16256; call (%value_in),_gfortrani_xmallocarray257,[%value_in]; } mov.u64 %r107,%r257; .loc 1 513 22 st.u64 [%r156],%r107; $L141: .loc 1 539 3 setp.ne.u64 %r259,%r307,0; @ %r259 bra $L138; bra $L139; $L128: .loc 1 517 19 ld.s8 %r61,[%r156+28]; .loc 1 517 10 setp.eq.u64 %r260,%r61,%r307; @ %r260 bra $L140; .loc 1 518 2 st.u64 [%stack+8],%r307; st.u64 [%stack],%r61;_gfortran_runtime_error140: .loc 1 523 11 cvta.global.u64 %r263,_gfortrani_compile_options; .loc 1 523 10 ld.u32 %r264,[%r263+36]; setp.eq.u32 %r265,%r264,0; @ %r265 bra $L141; .loc 1 525 4 setp.eq.u64 %r266,%r307,0; @ %r266 bra $L139; add.u64 %r115,%r156,48; add.u64 %r105,%frame,120; cvt.u32.u32 %r268,%r24; cvt.s64.s8 %r267,%r268; add.u64 %r86,%r267,-1; .loc 1 525 10 mov.u64 %r113,0; $L143: .loc 1 529 21 ld.u64 %r270,[%r115+8]; add.u64 %r269,%r270,1; .loc 1 529 19 ld.u64 %r271,[%r115]; sub.u64 %r100,%r269,%r271; .loc 1 530 18 ld.u64 %r70,[%r105]; .loc 1 525 25 add.u64 %r113,%r113,1; .loc 1 530 11 setp.eq.u64 %r272,%r70,%r100; @ %r272 bra $L142; .loc 1 531 3 st.u64 [%stack+16],%r70; st.u64 [%stack+8],%r100; st.u64 [%stack],%r113;_gfortran_runtime_error142: .loc 1 525 4 add.u64 %r115,%r115,24; add.u64 %r105,%r105,8; setp.ne.u64 %r275,%r86,%r113; @ %r275 
bra $L143; $L138: .loc 1 541 16 cvt.u16.u32 %r277,%r24; setp.le.s16 %r278,%r277,1; @ %r278 bra $L144; shl.b64 %r276,%r307,3; bra $L145; $L144: mov.u64 %r276,8; $L145: add.u64 %r280,%frame,240; mov.u32 %r2842766,[%value_in]; } add.u64 %r130,%r156,40; mov.u64 %r128,%frame; mov.u64 %r109,0; $L146: .loc 1 542 18 ld.u64 %r288,[%r130]; st.u64 [%r128],%r288; .loc 1 539 26 add.u64 %r109,%r109,1; .loc 1 539 3 add.u64 %r130,%r130,24; add.u64 %r128,%r128,8; setp.gt.s64 %r289,%r307,%r109; @ %r289 bra $L146; $L139: ld.u64 %r88,[%frame+240]; .loc 1 551 22 ld.u64 %r75,[%frame]; .loc 1 551 12 shl.b64 %r76,%r75,4; .loc 1 553 32 ld.u64 %r65,[%frame+120]; .loc 1 560 23 mul.lo.u64 %r49,%r65,%r75; setp.le.s64 %r306,%r307,1; .loc 1 549 13 mov.u64 %r290,0; cvt.u32.u32 %r309,%r24; cvt.s64.s8 %r310,%r309; add.u64 %r311,%r310,-1; $L152: st.u64 [%r107],%r290; st.u64 [%r107+8],%r290; .loc 1 550 15 add.u64 %r88,%r88,1; .loc 1 551 12 add.u64 %r107,%r107,%r76; .loc 1 553 13 setp.ne.u64 %r292,%r65,%r88; @ %r292 bra $L152; .loc 1 562 7 @ ! %r306 bra $L177; bra $L115; $L151: .loc 1 557 13 st.u64 [%r108],%r290; .loc 1 560 23 mul.lo.u64 %r78,%r81,%r80; .loc 1 561 5 add.u64 %r106,%r106,1; .loc 1 562 7 add.u64 %r108,%r108,8; add.u64 %r138,%r138,8; add.u64 %r137,%r137,8; setp.ne.u64 %r295,%r106,%r311; @ %r295 bra $L150; bra $L115; $L177: add.u64 %r108,%frame,248; add.u64 %r138,%frame,8; add.u64 %r137,%frame,128; .loc 1 560 23 mov.u64 %r78,%r49; .loc 1 561 5 mov.u64 %r106,1; $L150: .loc 1 566 16 ld.u64 %r300,[%r108]; add.u64 %r80,%r300,1; st.u64 [%r108],%r80; .loc 1 567 23 ld.u64 %r81,[%r138]; .loc 1 567 13 sub.u64 %r301,%r81,%r78; shl.b64 %r302,%r301,4; add.u64 %r107,%r107,%r302; .loc 1 553 32 ld.u64 %r84,[%r137]; .loc 1 553 13 setp.eq.u64 %r303,%r80,%r84; @ %r303 bra $L151; .loc 1 557 13 mov.u64 %r88,0; bra $L152; $L175: .loc 1 486 15 ld.u64 %r107,[%r156]; .loc 1 486 6 setp.eq.u64 %r304,%r107,0; @ ! 
%r304 bra $L128; bra $L153; $L115: .loc 1 571 1 ret; } maxloc2_4_s1.o/ 1622802213gfortran_maxloc2_4_s1gfortran_maxloc2_4_s1.file 1 "../../../libgfortran/generated/maxloc2_4_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc2_4_s1gfortran_mmaxloc2_4_s1); // BEGIN GLOBAL FUNCTION DECL: _gfortran_smaxloc2_4_s1gfortran_smaxloc2_4_s1); // BEGIN GLOBAL FUNCTION DECL: memcmpmem_gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc2_4_s1gfortran_maxloc2_4_s1value;u64 %r42; .reg .u64 %r43; .reg .pred %r44; .reg .u64 %r45; .reg .predmov.u64 %r38,%ar0; mov.u32 %r39,%ar1; mov.u64 %r40,%ar2; .loc 1 56 12 ld.u64 %r42,[%r38+56]; add.u64 %r41,%r42,1; .loc 1 56 10 ld.u64 %r43,[%r38+48]; sub.u64 %r30,%r41,%r43; .loc 1 57 6 setp.le.s64 %r44,%r30,0; @ %r44 bra $L7; .loc 1 60 44 ld.u64 %r45,[%r38+40]; mul.lo.u64 %r27,%r40,%r45; .loc 1 63 7 ld.u64 %r33,[%r38]; add.u64 %r36,%r30,1; mov.u64 %r29,%r33; .loc 1 65 9 mov.u64 %r34,1; mov.u64 %r28,%r34; .loc 1 67 10 setp.eq.u32 %r57,%r39,0; bra $L3; $L6:8(%value_in),memcmp,(%out_arg1,%out_arg2,%out_arg3); ld.param.u32 %r50,[%value_in]; } .loc 1 67 10 @ %r57 bra $L5; .loc 1 67 26 setp.lt.s32 %r53,%r50,0; selp.u64 %r29,%r29,%r33,%r53; selp.u64 %r28,%r28,%r34,%r53; bra $L3; $L5: setp.le.s32 %r54,%r50,0; selp.u64 %r29,%r29,%r33,%r54; selp.u64 %r28,%r28,%r34,%r54; bra $L3; $L8: .loc 1 73 11 mov.u64 %r29,%r33; mov.u64 %r28,%r34; $L3: add.u64 %r33,%r33,%r27; .loc 1 65 25 add.u64 %r34,%r34,1; .loc 1 65 3 setp.ne.u64 %r55,%r34,%r36; @ %r55 bra $L6; .loc 1 75 10 cvt.u32.u64 %r37,%r28; bra $L1; $L7: .loc 1 58 12 mov.u32 %r37,0; $L1: .loc 1 76_gfortran_mmaxloc2_4_s1gfortran_mmaxloc2_4_s1predu16 %r88; .reg .u32 %r89; .reg .pred %r90; .reg .pred %r91; .reg .pred.reg .pred %r98; .reg .u32pred.loc 1 98 12 ld.u64 %r64,[%r59+56]; add.u64 %r63,%r64,1; .loc 1 98 10 ld.u64 %r65,[%r59+48]; sub.u64 %r48,%r63,%r65; .loc 1 99 6 setp.le.s64 %r66,%r48,0; @ %r66 bra $L20; .loc 1 102 44 ld.u64 %r67,[%r59+40]; mul.lo.u64 %r27,%r62,%r67; .loc 1 104 15 ld.u64 %r28,[%r60+16]; .loc 1 105 9 ld.u64 %r51,[%r60]; .loc 1 107 22 cvt.u32.u64 %r29,%r28; .loc 1 107 53 add.u32 %r68,%r29,-4; and.b32 %r69,%r68,-5; set.u32.eq.u32 %r71,%r69,0; neg.s32 %r72,%r71; .loc 1 107 22 add.u32 %r73,%r29,-1; set.u32.le.u32 %r75,%r73,1; neg.s32 %r76,%r75; .loc 1 107 6 cvt.u16.u32 %r78,%r72; cvt.u16.u32 %r79,%r76; or.b16 %r77,%r78,%r79;109 7 setp.ne.u32 %r84,%r29,16; @ %r84 bra $L12; $L11: .loc 1 116 13 ld.u64 %r85,[%r60+40]; mul.lo.u64 %r37,%r28,%r85; .loc 1 119 9 mov.u64 %r52,0; bra $L13; $L12: .loc 1 114 5 cvta.const.u64 %r87,$LC0; mov.u64 %r86,r87; call _gfortrani_internal_error13: .loc 1 121 10 ld.s8 %r89,[%r51]; cvt.u16.u32 %r88,%r89; setp.ne.u16 %r90,%r88,0; @ %r90 bra $L14; .loc 1 123 13 add.u64 %r51,%r51,%r37; .loc 1 119 24 add.u64 %r52,%r52,1; .loc 1 119 3 setp.ne.u64 %r91,%r48,%r52; @ %r91 bra $L13; .loc 1 100 12 mov.u32 %r58,0; bra $L9; $L14: .loc 1 126 6 setp.eq.u64 %r92,%r48,%r52; @ %r92 bra $L21; .loc 1 129 7 add.u64 %r43,%r52,1; .loc 1 130 7 ld.u64 %r94,[%r59]; mad.lo.u64 %r44,%r27,%r52,%r94; .loc 1 133 3 setp.lt.s64 %r95,%r48,%r43; @ %r95 bra $L15; add.u64 %r34,%r48,1; mov.u64 %r55,%r43; .loc 1 130 7 mov.u64 %r54,%r44; .loc 1 135 18 setp.eq.u32 %r115,%r61,0; $L19: .loc 1 135 10 ld.s8 %r97,[%r51]; cvt.u16.u32 %r96,%r97; setp.eq.u16 %r98,%r96,0; @ %r98 bra $L16; .loc 1 37102,[%value_in]; } .loc 1 135 18 @ %r115 bra $L17; not.b32 %r106,%r102; shr.u32 %r107,%r106,31; cvt.u32.u32 %r108,%r107; cvt.u32.u8 %r53,%r108; bra $L18; 
$L17: set.u32.gt.s32 %r110,%r102,0; neg.s32 %r111,%r110; cvt.u32.u32 %r109,%r111; cvt.u32.u8 %r53,%r109; bra $L18; $L16: .loc 1 141 11 add.u64 %r54,%r54,%r27; .loc 1 142 13 add.u64 %r51,%r51,%r37; .loc 1 133 27 add.u64 %r55,%r55,1; .loc 1 133 3 setp.ne.u64 %r112,%r34,%r55; @ %r112 bra $L19; $L15: .loc 1 144 10 cvt.u32.u64 %r58,%r43; bra $L9; $L20: .loc 1 100 12 mov.u32 %r58,0; bra $L9; $L21: mov.u32 %r58,0; bra $L9; $L18: .loc 1 135 18 setp.eq.u32 %r114,%r53,0; selp.u64 %r44,%r44,%r54,%r114; selp.u64 %r43,%r43,%r55,%r114; bra $L16;gfortran_smaxloc2_4_s1gfortran_smaxloc2_4_s1predmov.u64 %r28,%ar3; .loc 1 155 6 setp.eq.u64 %r29,%r26,0; @ %r29 bra $L31; .loc 1 156 12 cvt.s64.s32 %r33,%r27; cvt.u32.u64 %r31,%r225(%value_in),_gfortran_maxloc2_4_s1mov.u32 %r24,%r34; bra $L29; $L31: .loc 1 158 12 cvt.u32gfortran_maxloc2_4_s4gfortran_maxloc2_4_s4.file 1 "../../../libgfortran/generated/maxloc2_4_s4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc2_4_s4gfortran_mmaxloc2_4_s4); // BEGIN GLOBAL FUNCTION DECL: _gfortran_smaxloc2_4_s4gfortran_smaxloc2_4_s4); // BEGIN GLOBAL FUNCTION DECL: _gfortrani_memcmp_char4_gfortrani_memcmp_char4_gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc2_4_s4gfortran_maxloc2_4_s4value;u64mov.u64 %r37,%ar0; mov.u32 %r38,%ar1; mov.u64 %r39,%ar2; .loc 1 56 12 ld.u64 %r41,[%r37+56]; add.u64 %r40,%r41,1; .loc 1 56 10 ld.u64 %r42,[%r37+48]; sub.u64 %r31,%r40,%r42; .loc 1 57 6 setp.le.s64 %r43,%r31,0; @ %r43 bra $L7; .loc 1 63 7 ld.u64 %r32,[%r37]; .loc 1 73 11 ld.u64 %r45,[%r37+40]; mul.lo.u64 %r44,%r39,%r45; shl.b64 %r27,%r44,2; .loc 1 63 7 mov.u64 %r29,%r32; .loc 1 65 9 mov.u64 %r33,1; mov.u64 %r28,%r33; .loc 1 67 10 setp.eq.u32 %r60,%r38,0; bra $L3; $L6:8; @ %r60 bra $L5; .loc 1 3932_gfortrani_memcmp_char4.loc 1 67 26 setp.lt.s32 %r52,%r51,0; .loc 1 73 11 selp.u64 %r29,%r29,%r32,%r52; selp.u64 %r28,%r28,%r33,%r52; bra $L3; $L5: .loc 1 3932_gfortrani_memcmp_char4.loc 1 67 26 setp.le.s32 %r57,%r56,0; .loc 1 73 11 selp.u64 %r29,%r29,%r32,%r57; selp.u64 %r28,%r28,%r33,%r57; bra $L3; $L8: mov.u64 %r29,%r32; mov.u64 %r28,%r33; $L3: add.u64 %r32,%r32,%r27; .loc 1 65 25 add.u64 %r33,%r33,1; .loc 1 65 3 setp.ge.s64 %r58,%r31,%r33; @ %r58 bra $L6; .loc 1 75 10 cvt.u32.u64 %r36,%r28; bra $L1; $L7: .loc 1 58 12 mov.u32 %r36,0; $L1: .loc 1 _gfortran_mmaxloc2_4_s4gfortran_mmaxloc2_4_s4predu16 %r88; .reg .u32 %r89; .reg .pred %r90; .reg .pred %r91; .reg .predpredpred %r116; .reg .pred.loc 1 98 12 ld.u64 %r64,[%r59+56]; add.u64 %r63,%r64,1; .loc 1 98 10 ld.u64 %r65,[%r59+48]; sub.u64 %r48,%r63,%r65; .loc 1 99 6 setp.le.s64 %r66,%r48,0; @ %r66 bra $L20; .loc 1 102 44 ld.u64 %r67,[%r59+40]; mul.lo.u64 %r27,%r62,%r67; .loc 1 104 15 ld.u64 %r28,[%r60+16]; .loc 1 105 9 ld.u64 %r51,[%r60]; .loc 1 107 22 cvt.u32.u64 %r29,%r28; .loc 1 107 53 add.u32 %r68,%r29,-4; and.b32 %r69,%r68,-5; set.u32.eq.u32 %r71,%r69,0; neg.s32 %r72,%r71; .loc 1 107 22 add.u32 %r73,%r29,-1; set.u32.le.u32 %r75,%r73,1; neg.s32 %r76,%r75; .loc 1 107 6 cvt.u16.u32 %r78,%r72; cvt.u16.u32 %r79,%r76; or.b16 %r77,%r78,%r79;109 7 setp.ne.u32 %r84,%r29,16; @ %r84 bra $L12; $L11: .loc 1 116 13 ld.u64 %r85,[%r60+40]; mul.lo.u64 %r34,%r28,%r85; .loc 1 119 9 mov.u64 %r52,0; bra $L13; $L12: .loc 1 114 5 cvta.const.u64 %r87,$LC0; mov.u64 %r86,r87; call _gfortrani_internal_error13: .loc 1 121 10 ld.s8 %r89,[%r51]; cvt.u16.u32 %r88,%r89; setp.ne.u16 %r90,%r88,0; @ %r90 bra $L14; .loc 1 123 13 add.u64 %r51,%r51,%r34; .loc 1 119 24 add.u64 
%r52,%r52,1; .loc 1 119 3 setp.ne.u64 %r91,%r48,%r52; @ %r91 bra $L13; .loc 1 100 12 mov.u32 %r58,0; bra $L9; $L14: .loc 1 126 6 setp.eq.u64 %r92,%r48,%r52; @ %r92 bra $L21; .loc 1 129 7 add.u64 %r42,%r52,1; .loc 1 130 30 mul.lo.u64 %r93,%r27,%r52; .loc 1 130 26 shl.b64 %r94,%r93,2; .loc 1 130 7 ld.u64 %r95,[%r59]; add.u64 %r44,%r95,%r94; .loc 1 133 3 setp.lt.s64 %r96,%r48,%r42; @ %r96 bra $L15; .loc 1 141 11 shl.b64 %r41,%r27,2; mov.u64 %r55,%r42; .loc 1 130 7 mov.u64 %r54,%r44; .loc 1 135 18 setp.eq.u32 %r119,%r61,0; $L19: .loc 1 135 10 ld.s8 %r98,[%r51]; cvt.u16.u32 %r97,%r98; setp.eq.u16 %r99,%r97,0; @ %r99 bra $L16; .loc 1 135 18 @ %r119 bra $L17; .loc 1 39_gfortrani_memcmp_char44,[%value_in]; } .loc 1 135 18 not.b32 %r106,%r104; shr.u32 %r107,%r106,31; cvt.u32.u32 %r108,%r107; cvt.u32.u8 %r53,%r108; bra $L18; $L17: .loc 1 39_gfortrani_memcmp_char4112,[%value_in]; } .loc 1 135 18 set.u32.gt.s32 %r114,%r112,0; neg.s32 %r115,%r114; cvt.u32.u32 %r113,%r115; cvt.u32.u8 %r53,%r113; bra $L18; $L16: .loc 1 141 11 add.u64 %r54,%r54,%r41; .loc 1 142 13 add.u64 %r51,%r51,%r34; .loc 1 133 27 add.u64 %r55,%r55,1; .loc 1 133 3 setp.ge.s64 %r116,%r48,%r55; @ %r116 bra $L19; $L15: .loc 1 144 10 cvt.u32.u64 %r58,%r42; bra $L9; $L20: .loc 1 100 12 mov.u32 %r58,0; bra $L9; $L21: mov.u32 %r58,0; bra $L9; $L18: .loc 1 135 18 setp.eq.u32 %r118,%r53,0; selp.u64 %r44,%r44,%r54,%r118; selp.u64 %r42,%r42,%r55,%r118; bra $L16;gfortran_smaxloc2_4_s4gfortran_smaxloc2_4_s4predmov.u64 %r28,%ar3; .loc 1 155 6 setp.eq.u64 %r29,%r26,0; @ %r29 bra $L31; .loc 1 156 12 cvt.s64.s32 %r33,%r27; cvt.u32.u64 %r31,%r225(%value_in),_gfortran_maxloc2_4_s4mov.u32 %r24,%r34; bra $L29; $L31: .loc 1 158 12 cvt.u32 maxloc2_8_s1.o/ 1622802213gfortran_maxloc2_8_s1gfortran_maxloc2_8_s1.file 1 "../../../libgfortran/generated/maxloc2_8_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc2_8_s1gfortran_mmaxloc2_8_s1); // BEGIN GLOBAL FUNCTION DECL: _gfortran_smaxloc2_8_s1gfortran_smaxloc2_8_s1); // BEGIN GLOBAL FUNCTION DECL: memcmpmem_gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc2_8_s1gfortran_maxloc2_8_s1value;mov.u64 %r37,%ar0; mov.u32 %r38,%ar1; mov.u64 %r39,%ar2; .loc 1 56 12 ld.u64 %r41,[%r37+56]; add.u64 %r40,%r41,1; .loc 1 56 10 ld.u64 %r42,[%r37+48]; sub.u64 %r29,%r40,%r42; .loc 1 57 6 setp.le.s64 %r43,%r29,0; @ %r43 bra $L7; .loc 1 60 44 ld.u64 %r44,[%r37+40]; mul.lo.u64 %r27,%r39,%r44; .loc 1 63 7 ld.u64 %r32,[%r37]; add.u64 %r35,%r29,1; mov.u64 %r28,%r32; .loc 1 65 9 mov.u64 %r33,1; mov.u64 %r36,%r33; .loc 1 67 10 setp.eq.u32 %r56,%r38,0; bra $L3; $L6: setp.eq.u64 %r45,%r28,0; @ %r45 bra $L8; .loc 1 3710 @ %r56 bra $L5; .loc 1 67 26 setp.lt.s32 %r52,%r49,0; selp.u64 %r28,%r28,%r32,%r52; selp.u64 %r36,%r36,%r33,%r52; bra $L3; $L5: setp.le.s32 %r53,%r49,0; selp.u64 %r28,%r28,%r32,%r53; selp.u64 %r36,%r36,%r33,%r53; bra $L3; $L8: .loc 1 73 11 mov.u64 %r28,%r32; mov.u64 %r36,%r33; $L3: add.u64 %r32,%r32,%r27; .loc 1 65 25 add.u64 %r33,%r33,1; .loc 1 65 3 setp.ne.u64 %r54,%r33,%r35; @ %r54 bra $L6; bra $L1; $L7: .loc 1 58 12 mov.u64 %r36,0; $L1: .loc 1 7_gfortran_mmaxloc2_8_s1gfortran_mmaxloc2_8_s1.reg .u32 %r75; .reg .u16 %r76; .reg .u16 %r77; .reg .u16 %r78; .reg .u32pred32pred %r91; .reg .u64 %r93; .reg .predpred %r97; .reg .u32 %r101; .reg .u32pred %r114; mov.u64 %r58,%ar0; mov.u64 %r59,%ar1; mov.u32 %r60,%ar2; mov.u64 %r61,%ar3; .loc 1 98 12 ld.u64 %r63,[%r58+56]; add.u64 %r62,%r63,1; .loc 1 98 10 ld.u64 %r64,[%r58+48]; sub.u64 
%r48,%r62,%r64; .loc 1 99 6 setp.le.s64 %r65,%r48,0; @ %r65 bra $L19; .loc 1 102 44 ld.u64 %r66,[%r58+40]; mul.lo.u64 %r27,%r61,%r66; .loc 1 104 15 ld.u64 %r28,[%r59+16]; .loc 1 105 9 ld.u64 %r51,[%r59]; .loc 1 107 22 cvt.u32.u64 %r29,%r28; .loc 1 107 53 add.u32 %r67,%r29,-4; and.b32 %r68,%r67,-5; set.u32.eq.u32 %r70,%r68,0; neg.s32 %r71,%r70; .loc 1 107 22 add.u32 %r72,%r29,-1; set.u32.le.u32 %r74,%r72,1; neg.s32 %r75,%r74; .loc 1 107 6 cvt.u16.u32 %r77,%r71; cvt.u16.u32 %r78,%r75; or.b16 %r76,%r77,%r78; cvt.u32.u16 %r79,%r76; cvt.u16.u8 %r80,%r79; setp.ne.u16 %r81,%r80,0; @ %r81 bra $L11; .loc 1 109 7 setp.ne.u32 %r83,%r29,16; @ %r83 bra $L12; $L11: .loc 1 116 13 ld.u64 %r84,[%r59+40]; mul.lo.u64 %r37,%r28,%r84; .loc 1 119 9 mov.u64 %r52,0; bra $L13; $L12: .loc 1 114 5 cvta.const.u64 %r86,$LC0; mov.u64 %r85,r86; call _gfortrani_internal_error13: .loc 1 121 10 ld.s8 %r88,[%r51]; cvt.u16.u32 %r87,%r88; setp.ne.u16 %r89,%r87,0; @ %r89 bra $L14; .loc 1 123 13 add.u64 %r51,%r51,%r37; .loc 1 119 24 add.u64 %r52,%r52,1; .loc 1 119 3 setp.ne.u64 %r90,%r48,%r52; @ %r90 bra $L13; .loc 1 100 12 mov.u64 %r57,0; bra $L9; $L14: .loc 1 126 6 setp.eq.u64 %r91,%r48,%r52; @ %r91 bra $L20; .loc 1 129 7 add.u64 %r57,%r52,1; .loc 1 130 7 ld.u64 %r93,[%r58]; mad.lo.u64 %r43,%r27,%r52,%r93; .loc 1 133 3 setp.lt.s64 %r94,%r48,%r57; @ %r94 bra $L9; add.u64 %r34,%r48,1; mov.u64 %r55,%r57; .loc 1 130 7 mov.u64 %r54,%r43; .loc 1 135 18 setp.eq.u32 %r114,%r60,0; $L18: .loc 1 135 10 ld.s8 %r96,[%r51]; cvt.u16.u32 %r95,%r96; setp.eq.u16 %r97,%r95,0; @ %r97 bra $L15; .loc 1 371,[%value_in]; } .loc 1 135 18 @ %r114 bra $L16; not.b32 %r105,%r101; shr.u32 %r106,%r105,31; cvt.u32.u32 %r107,%r106; cvt.u32.u8 %r53,%r107; bra $L17; $L16: set.u32.gt.s32 %r109,%r101,0; neg.s32 %r110,%r109; cvt.u32.u32 %r108,%r110; cvt.u32.u8 %r53,%r108; bra $L17; $L15: .loc 1 141 11 add.u64 %r54,%r54,%r27; .loc 1 142 13 add.u64 %r51,%r51,%r37; .loc 1 133 27 add.u64 %r55,%r55,1; .loc 1 133 3 setp.ne.u64 %r111,%r34,%r55; @ %r111 bra $L18; bra $L9; $L19: .loc 1 100 12 mov.u64 %r57,0; bra $L9; $L20: mov.u64 %r57,0; bra $L9; $L17: .loc 1 135 18 setp.eq.u32 %r113,%r53,0; selp.u64 %r43,%r43,%r54,%r113; selp.u64 %r57,%r57,%r55,%r113; bra $L15; $L9predmov.u64 %r28,%ar3; .loc 1 155 6 setp.eq.u64 %r29,%r26,0; @ %r29 bra $L30; .loc 1 156 12 cvt.s64.s32 %r33,%r27; cvt.u32.u64 %r31,%r2825(%value_in),_gfortran_maxloc2_8_s1mov.u64 %r24,%r34; bra $L28; $L30: .loc 1 158 12 mov.u64 %r24,%r26; $L28gfortran_maxloc2_8_s4gfortran_maxloc2_8_s4.file 1 "../../../libgfortran/generated/maxloc2_8_s4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc2_8_s4gfortran_mmaxloc2_8_s4); // BEGIN GLOBAL FUNCTION DECL: _gfortran_smaxloc2_8_s4gfortran_smaxloc2_8_s4); // BEGIN GLOBAL FUNCTION DECL: _gfortrani_memcmp_char4_gfortrani_memcmp_char4_gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc2_8_s4gfortran_maxloc2_8_s4value;predpred %r57; .reg .pred %r59; mov.u64 %r36,%ar0; mov.u32 %r37,%ar1; mov.u64 %r38,%ar2; .loc 1 56 12 ld.u64 %r40,[%r36+56]; add.u64 %r39,%r40,1; .loc 1 56 10 ld.u64 %r41,[%r36+48]; sub.u64 %r30,%r39,%r41; .loc 1 57 6 setp.le.s64 %r42,%r30,0; @ %r42 bra $L7; .loc 1 63 7 ld.u64 %r31,[%r36]; .loc 1 73 11 ld.u64 %r44,[%r36+40]; mul.lo.u64 %r43,%r38,%r44; shl.b64 %r27,%r43,2; .loc 1 63 7 mov.u64 %r29,%r31; .loc 1 65 9 mov.u64 %r32,1; mov.u64 %r35,%r32; .loc 1 67 10 setp.eq.u32 %r59,%r37,0; bra $L3; $L6: setp.eq.u64 %r45,%r29,0; @ %r45 bra $L8; @ %r59 bra $L5; .loc 1 39 
1229_gfortrani_memcmp_char4.loc 1 67 26 setp.lt.s32 %r51,%r50,0; .loc 1 73 11 selp.u64 %r29,%r29,%r31,%r51; selp.u64 %r35,%r35,%r32,%r51; bra $L3; $L5: .loc 1 39 1229_gfortrani_memcmp_char4.loc 1 67 26 setp.le.s32 %r56,%r55,0; .loc 1 73 11 selp.u64 %r29,%r29,%r31,%r56; selp.u64 %r35,%r35,%r32,%r56; bra $L3; $L8: mov.u64 %r29,%r31; mov.u64 %r35,%r32; $L3: add.u64 %r31,%r31,%r27; .loc 1 65 25 add.u64 %r32,%r32,1; .loc 1 65 3 setp.ge.s64 %r57,%r30,%r32; @ %r57 bra $L6; bra $L1; $L7: .loc 1 58 12 mov.u64 %r35,0; $L1: .loc 1 76gfortran_mmaxloc2_8_s4gfortran_mmaxloc2_8_s4.reg .u32 %r75; .reg .u16 %r76; .reg .u16 %r77; .reg .u16 %r78; .reg .u32predpred %r89; .reg .pred %r90; .reg .predpred %r117; .reg .pred %r118; mov.u64 %r58,%ar0; mov.u64 %r59,%ar1; mov.u32 %r60,%ar2; mov.u64 %r61,%ar3; .loc 1 98 12 ld.u64 %r63,[%r58+56]; add.u64 %r62,%r63,1; .loc 1 98 10 ld.u64 %r64,[%r58+48]; sub.u64 %r47,%r62,%r64; .loc 1 99 6 setp.le.s64 %r65,%r47,0; @ %r65 bra $L19; .loc 1 102 44 ld.u64 %r66,[%r58+40]; mul.lo.u64 %r27,%r61,%r66; .loc 1 104 15 ld.u64 %r28,[%r59+16]; .loc 1 105 9 ld.u64 %r50,[%r59]; .loc 1 107 22 cvt.u32.u64 %r29,%r28; .loc 1 107 53 add.u32 %r67,%r29,-4; and.b32 %r68,%r67,-5; set.u32.eq.u32 %r70,%r68,0; neg.s32 %r71,%r70; .loc 1 107 22 add.u32 %r72,%r29,-1; set.u32.le.u32 %r74,%r72,1; neg.s32 %r75,%r74; .loc 1 107 6 cvt.u16.u32 %r77,%r71; cvt.u16.u32 %r78,%r75; or.b16 %r76,%r77,%r78; cvt.u32.u16 %r79,%r76; cvt.u16.u8 %r80,%r79; setp.ne.u16 %r81,%r80,0; @ %r81 bra $L11; .loc 1 109 7 setp.ne.u32 %r83,%r29,16; @ %r83 bra $L12; $L11: .loc 1 116 13 ld.u64 %r84,[%r59+40]; mul.lo.u64 %r34,%r28,%r84; .loc 1 119 9 mov.u64 %r51,0; bra $L13; $L12: .loc 1 114 5 cvta.const.u64 %r86,$LC0; mov.u64 %r85,r86; call _gfortrani_internal_error13: .loc 1 121 10 ld.s8 %r88,[%r50]; cvt.u16.u32 %r87,%r88; setp.ne.u16 %r89,%r87,0; @ %r89 bra $L14; .loc 1 123 13 add.u64 %r50,%r50,%r34; .loc 1 119 24 add.u64 %r51,%r51,1; .loc 1 119 3 setp.ne.u64 %r90,%r47,%r51; @ %r90 bra $L13; .loc 1 100 12 mov.u64 %r57,0; bra $L9; $L14: .loc 1 126 6 setp.eq.u64 %r91,%r47,%r51; @ %r91 bra $L20; .loc 1 129 7 add.u64 %r57,%r51,1; .loc 1 130 30 mul.lo.u64 %r92,%r27,%r51; .loc 1 130 26 shl.b64 %r93,%r92,2; .loc 1 130 7 ld.u64 %r94,[%r58]; add.u64 %r43,%r94,%r93; .loc 1 133 3 setp.lt.s64 %r95,%r47,%r57; @ %r95 bra $L9; .loc 1 141 11 shl.b64 %r41,%r27,2; mov.u64 %r54,%r57; .loc 1 130 7 mov.u64 %r53,%r43; .loc 1 135 18 setp.eq.u32 %r118,%r60,0; $L18: .loc 1 135 10 ld.s8 %r97,[%r50]; cvt.u16.u32 %r96,%r97; setp.eq.u16 %r98,%r96,0; @ %r98 bra $L15; .loc 1 135 18 @ %r118 bra $L16; .loc 1 39call (%value_in),_gfortrani_memcmp_char43,[%value_in]; } .loc 1 135 18 not.b32 %r105,%r103; shr.u32 %r106,%r105,31; cvt.u32.u32 %r107,%r106; cvt.u32.u8 %r52,%r107; bra $L17; $L16: .loc 1 39call (%value_in),_gfortrani_memcmp_char4111,[%value_in]; } .loc 1 135 18 set.u32.gt.s32 %r113,%r111,0; neg.s32 %r114,%r113; cvt.u32.u32 %r112,%r114; cvt.u32.u8 %r52,%r112; bra $L17; $L15: .loc 1 141 11 add.u64 %r53,%r53,%r41; .loc 1 142 13 add.u64 %r50,%r50,%r34; .loc 1 133 27 add.u64 %r54,%r54,1; .loc 1 133 3 setp.ge.s64 %r115,%r47,%r54; @ %r115 bra $L18; bra $L9; $L19: .loc 1 100 12 mov.u64 %r57,0; bra $L9; $L20: mov.u64 %r57,0; bra $L9; $L17: .loc 1 135 18 setp.eq.u32 %r117,%r52,0; selp.u64 %r43,%r43,%r53,%r117; selp.u64 %r57,%r57,%r54,%r117; bra $L15; $L9predmov.u64 %r28,%ar3; .loc 1 155 6 setp.eq.u64 %r29,%r26,0; @ %r29 bra $L30; .loc 1 156 12 cvt.s64.s32 %r33,%r27; cvt.u32.u64 %r31,%r2825(%value_in),_gfortran_maxloc2_8_s4mov.u64 %r24,%r34; bra $L28; $L30: .loc 1 158 
12 mov.u64 %r24,%r26; $L28 maxloc2_16_s1.o/1622802213gfortran_maxloc2_16_s1 .visible .func _gfortran_maxloc2_16_s1); .file 1 "../../../libgfortran/generated/maxloc2_16_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc2_16_s1 .visible .func _gfortran_mmaxloc2_16_s1_gfortran_smaxloc2_16_s1 .visible .func _gfortran_smaxloc2_16_s1memcmpmem_gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc2_16_s1 .visible .func _gfortran_maxloc2_16_s1predmov.u32 %r40,%ar2; mov.u64 %r41,%ar3; .loc 1 56 12 ld.u64 %r43,[%r39+56]; add.u64 %r42,%r43,1; .loc 1 56 10 ld.u64 %r44,[%r39+48]; sub.u64 %r31,%r42,%r44; .loc 1 57 6 setp.gt.s64 %r45,%r31,0; @ %r45 bra $L2; .loc 1 58 12 mov.u64 %r46,0; st.u64 [%r38],%r46; st.u64 [%r38+8],%r46; bra $L1; $L2: .loc 1 60 44 ld.u64 %r48,[%r39+40]; mul.lo.u64 %r27,%r41,%r48; .loc 1 63 7 ld.u64 %r34,[%r39]; add.u64 %r37,%r31,1; mov.u64 %r30,%r34; .loc 1 65 9 mov.u64 %r35,1; mov.u64 %r29,%r35; .loc 1 67 10 setp.eq.u32 %r60,%r40,0; bra $L4; $L7: setp.eq.u64 %r49,%r30,0; @ %r49 bra $L8; .loc 1 37(%value_in),memcmp,(%out_arg1,%out_arg2,67 10 @ %r60 bra $L6; .loc 1 67 26 setp.lt.s32 %r56,%r53,0; selp.u64 %r30,%r30,%r34,%r56; selp.u64 %r29,%r29,%r35,%r56; bra $L4; $L6: setp.le.s32 %r57,%r53,0; selp.u64 %r30,%r30,%r34,%r57; selp.u64 %r29,%r29,%r35,%r57; bra $L4; $L8: .loc 1 73 11 mov.u64 %r30,%r34; mov.u64 %r29,%r35; $L4: add.u64 %r34,%r34,%r27; .loc 1 65 25 add.u64 %r35,%r35,1; .loc 1 65 3 setp.ne.u64 %r58,%r35,%r37; @ %r58 bra $L7; .loc 1 75 10 st.u64 [%r38],%r29; shr.s64 %r59,%r29,63; st.u64 [%r38+8],%r59;_gfortran_mmaxloc2_16_s1 .visible .func _gfortran_mmaxloc2_16_s1in_ar4) {u16 %r84; .reg .pred %r85; .reg .predu32.loc 1 98 12 ld.u64 %r65,[%r60+56]; add.u64 %r64,%r65,1; .loc 1 98 10 ld.u64 %r66,[%r60+48]; sub.u64 %r50,%r64,%r66; .loc 1 99 6 setp.gt.s64 %r67,%r50,0; @ %r67 bra $L10; $L16: .loc 1 100 12 mov.u64 %r68,0; st.u64 [%r59],%r68; st.u64 [%r59+8],%r68; bra $L9; $L10: .loc 1 102 44 ld.u64 %r70,[%r60+40]; mul.lo.u64 %r27,%r63,%r70; .loc 1 104 15 ld.u64 %r28,[%r61+16]; .loc 1 105 9 ld.u64 %r53,[%r61]; .loc 1 107 22 cvt.u32.u64 %r29,%r28; .loc 1 107 53 add.u32 %r71,%r29,-4; and.b32 %r72,%r71,-5;.loc 1 107 22 add.u32 %r76,%r29,-1; set.u32.le.u32 %r78,%r76,1; neg.s32 %r79,%r78; .loc 1 107 6 cvt.u16.u32 %r81,%r75; cvt.u16.u32 %r82,%r79; or.b16 %r80,%r81,%r82;@ %r85 bra $L12; .loc 1 109 7 setp.ne.u32 %r87,%r29,16; @ %r87 bra $L13; $L12: .loc 1 116 13 ld.u64 %r88,[%r61+40]; mul.lo.u64 %r37,%r28,%r88; .loc 1 119 9 mov.u64 %r54,0; bra $L14; $L13: .loc 1 114 5 cvta.const.u64 %r90,$LC0; mov.u64 %r89,0; {r90; call _gfortrani_internal_error14: .loc 1 121 10 ld.s8 %r92,[%r53]; cvt.u16.u32 %r91,%r92; setp.ne.u16 %r93,%r91,0; @ %r93 bra $L15; .loc 1 123 13 add.u64 %r53,%r53,%r37; .loc 1 119 24 add.u64 %r54,%r54,1; .loc 1 119 3 setp.ne.u64 %r94,%r50,%r54; @ %r94 bra $L14; bra $L16; $L15: .loc 1 126 6 setp.eq.u64 %r95,%r50,%r54; @ %r95 bra $L16; .loc 1 129 7 add.u64 %r44,%r54,1; .loc 1 130 7 ld.u64 %r97,[%r60]; mad.lo.u64 %r45,%r27,%r54,%r97; .loc 1 133 3 setp.lt.s64 %r98,%r50,%r44; @ %r98 bra $L17; add.u64 %r33,%r50,1; mov.u64 %r57,%r44; .loc 1 130 7 mov.u64 %r56,%r45; .loc 1 135 18 setp.eq.u32 %r118,%r62,0; $L21: .loc 1 135 10 ld.s8 %r100,[%r53]; cvt.u16.u32 %r99,%r100; setp.eq.u16 %r101,%r99,0; @ %r101 bra $L18; .loc 1 37105,[%value_in]; } .loc 1 135 18 @ %r118 bra $L19; not.b32 %r109,%r105; shr.u32 %r110,%r109,31; cvt.u32.u32 %r111,%r110; cvt.u32.u8 %r55,%r111; bra $L20; $L19: set.u32.gt.s32 
%r113,%r105,0; neg.s32 %r114,%r113; cvt.u32.u32 %r112,%r114; cvt.u32.u8 %r55,%r112; bra $L20; $L18: .loc 1 141 11 add.u64 %r56,%r56,%r27; .loc 1 142 13 add.u64 %r53,%r53,%r37; .loc 1 133 27 add.u64 %r57,%r57,1; .loc 1 133 3 setp.ne.u64 %r115,%r33,%r57; @ %r115 bra $L21; $L17: .loc 1 144 10 st.u64 [%r59],%r44; shr.s64 %r116,%r44,63; st.u64 [%r59+8],%r116; bra $L9; $L20: .loc 1 135 18 setp.eq.u32 %r117,%r55,0; selp.u64 %r45,%r45,%r56,%r117; selp.u64 %r44,%r44,%r57,%r117; bra $L18; $L9: .loc 1 1_gfortran_smaxloc2_16_s1 .visible .func _gfortran_smaxloc2_16_s1in_ar4) {mov.u64 %r29,%ar4; .loc 1 155 6 setp.eq.u64 %r30,%r27,0; @ %r30 bra $L31; .loc 1 156 12 cvt.s64.s32 %r35,%r28; cvt.u32.u64 %r33,%r29; {35; call _gfortran_maxloc2_16_s1ld.u64 %r38,[%frame]; ld.u64 %r39,[%frame+8]; bra $L30; $L31: .loc 1 158 12 mov.u64 %r38,%r27; mov.u64 %r39,%r38; $L30: st.u64 [%r25],%r38; st.u64 [%r25+8],%r39; .loc 1 159 1 ret; } maxloc2_16_s4.o/1622802213gfortran_maxloc2_16_s4 .visible .func _gfortran_maxloc2_16_s4); .file 1 "../../../libgfortran/generated/maxloc2_16_s4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mmaxloc2_16_s4 .visible .func _gfortran_mmaxloc2_16_s_gfortran_smaxloc2_16_s4 .visible .func _gfortran_smaxloc2_16_s_gfortrani_memcmp_char4_gfortrani_memcmp_char4_gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_maxloc2_16_s4 .visible .func _gfortran_maxloc2_16_s4pred.loc 1 56 12 ld.u64 %r42,[%r38+56]; add.u64 %r41,%r42,1; .loc 1 56 10 ld.u64 %r43,[%r38+48]; sub.u64 %r32,%r41,%r43; .loc 1 57 6 setp.gt.s64 %r44,%r32,0; @ %r44 bra $L2; .loc 1 58 12 mov.u64 %r45,0; st.u64 [%r37],%r45; st.u64 [%r37+8],%r45; bra $L1; $L2: .loc 1 63 7 ld.u64 %r33,[%r38]; .loc 1 73 11 ld.u64 %r48,[%r38+40]; mul.lo.u64 %r47,%r40,%r48; shl.b64 %r27,%r47,2; .loc 1 63 7 mov.u64 %r30,%r33; .loc 1 65 9 mov.u64 %r34,1; mov.u64 %r29,%r34; .loc 1 67 10 setp.eq.u32 %r63,%r39,0; bra $L4; $L7: setp.eq.u64 %r49,%r30,0; @ %r49 bra $L8; @ %r63 bra $L(%value_in),_gfortrani_memcmp_char467 26 setp.lt.s32 %r55,%r54,0; .loc 1 73 11 selp.u64 %r30,%r30,%r33,%r55; selp.u64 %r29,%r29,%r34,%r55; bra $L4; $L(%value_in),_gfortrani_memcmp_char4.loc 1 67 26 setp.le.s32 %r60,%r59,0; .loc 1 73 11 selp.u64 %r30,%r30,%r33,%r60; selp.u64 %r29,%r29,%r34,%r60; bra $L4; $L8: mov.u64 %r30,%r33; mov.u64 %r29,%r34; $L4: add.u64 %r33,%r33,%r27; .loc 1 65 25 add.u64 %r34,%r34,1; .loc 1 65 3 setp.ge.s64 %r61,%r32,%r34; @ %r61 bra $L7; .loc 1 75 10 st.u64 [%r37],%r29; shr.s64 %r62,%r29,63; st.u64 [%r37+8],%r62;_gfortran_mmaxloc2_16_s4 .visible .func _gfortran_mmaxloc2_16_s, .param .u64 %in_ar4) {predpredpred.loc 1 98 12 ld.u64 %r65,[%r60+56]; add.u64 %r64,%r65,1; .loc 1 98 10 ld.u64 %r66,[%r60+48]; sub.u64 %r49,%r64,%r66; .loc 1 99 6 setp.gt.s64 %r67,%r49,0; @ %r67 bra $L10; $L16: .loc 1 100 12 mov.u64 %r68,0; st.u64 [%r59],%r68; st.u64 [%r59+8],%r68; bra $L9; $L10: .loc 1 102 44 ld.u64 %r70,[%r60+40]; mul.lo.u64 %r27,%r63,%r70; .loc 1 104 15 ld.u64 %r28,[%r61+16]; .loc 1 105 9 ld.u64 %r52,[%r61]; .loc 1 107 22 cvt.u32.u64 %r29,%r28; .loc 1 107 53 add.u32 %r71,%r29,-4; and.b32 %r72,%r71,-5;.loc 1 107 22 add.u32 %r76,%r29,-1; set.u32.le.u32 %r78,%r76,1; neg.s32 %r79,%r78; .loc 1 107 6 cvt.u16.u32 %r81,%r75; cvt.u16.u32 %r82,%r79; or.b16 %r80,%r81,%r82;@ %r85 bra $L12; .loc 1 109 7 setp.ne.u32 %r87,%r29,16; @ %r87 bra $L13; $L12: .loc 1 116 13 ld.u64 %r88,[%r61+40]; mul.lo.u64 %r34,%r28,%r88; .loc 1 119 9 mov.u64 %r53,0; bra $L14; $L13: .loc 1 114 5 cvta.const.u64 %r90,$LC0; mov.u64 
[Extraction-damaged PTX object code: the tail of the preceding maxloc2_16_s4 member, followed by the string MINLOC archive members minloc2_4_s1.o, minloc2_4_s4.o, minloc2_8_s1.o, minloc2_8_s4.o, minloc2_16_s1.o and minloc2_16_s4.o (archive timestamp 1622802213). Each member compiles the matching ../../../libgfortran/generated/minloc2_*.c source and declares three .visible .func entry points: _gfortran_minloc2_*, the masked _gfortran_mminloc2_* and the scalar-mask wrapper _gfortran_sminloc2_*, plus .extern references to _gfortrani_internal_error and to memcmp (the _s1 members) or _gfortrani_memcmp_char4 (the _s4 members). The readable fragments show the expected shape: an extent/stride walk over the array descriptor, memcmp-based comparisons with selp instructions keeping the running minimum and its index, and an internal-error path whose message fragment reads "... sized logical array". The rest of the PTX in this stretch is too damaged to reproduce.]
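The minloc2_* fragments above reduce a packed array of fixed-length strings to the 1-based position of its smallest element, comparing candidates with memcmp (or _gfortrani_memcmp_char4 for 4-byte characters) and keeping the running winner with selp instructions. As a rough illustration only, not the libgfortran source, here is a standalone C sketch of that reduction for a contiguous array; the name minloc_strings and the flat base/n/len/back parameters are invented for the example, whereas the real entry points take a gfc array descriptor and honour its strides.

#include <stddef.h>
#include <stdio.h>
#include <string.h>

/* Illustrative sketch only: 1-based position of the lexicographically
 * smallest of n packed strings of len bytes each.  back mirrors MINLOC's
 * BACK= argument: when nonzero, ties resolve to the last occurrence.
 * Returns 0 for n == 0, as the empty reduction does.                    */
static size_t minloc_strings(const char *base, size_t n, size_t len, int back)
{
    size_t pos = 0;
    const char *best = NULL;

    for (size_t i = 0; i < n; i++) {
        const char *p = base + i * len;
        int cmp = best ? memcmp(p, best, len) : -1;   /* first element always wins */
        if (cmp < 0 || (back && cmp == 0)) {
            best = p;
            pos = i + 1;                              /* Fortran indices are 1-based */
        }
    }
    return pos;
}

int main(void)
{
    /* three 4-byte elements; "abcd" is the smallest and sits at position 2 */
    const char data[] = "zzzz" "abcd" "mmmm";
    printf("%zu\n", minloc_strings(data, 3, 4, 0));   /* prints 2 */
    return 0;
}

The _8 and _16 members differ only in the integer kind of the returned position; the 16-byte variants appear to store the low and high halves separately, which matches the paired st.u64 stores visible in the fragments.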
[Extraction-damaged PTX object code: the end of minloc2_16_s4.o, then the string MAXVAL members maxval0_s1.o and maxval0_s4.o (libgfortran/generated/maxval0_s1.c and maxval0_s4.c). Each declares _gfortran_maxval0_s*, the masked _gfortran_mmaxval0_s* and the scalar-mask wrapper _gfortran_smaxval0_s*, pulls in _gfortran_runtime_error together with memcmp (_s1) or _gfortrani_memcmp_char4 (_s4), and carries __func__ constants ("maxval0_s1", "maxval0_s4") and diagnostic strings, among them "Rank of array needs to be > 0". The readable fragments keep per-dimension stride, extent and counter arrays in a local frame, compare elements with memcmp, track a pointer to the running maximum string, and copy it into the result buffer at the end.]
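The maxval0_* members return the largest string itself rather than its position: the readable PTX keeps a pointer to the current maximum and copies it into the caller's buffer once the walk finishes. Below is a minimal standalone sketch of that idea, again with an invented name (maxval_strings) and a flat layout in place of the library's descriptor walk; the zero-fill used for the empty case here is an assumption, since the fragment does not show which fill value the library uses.

#include <stddef.h>
#include <stdio.h>
#include <string.h>

/* Illustrative sketch only: copy the lexicographically largest of n packed
 * strings of len bytes each into dest.  The fill value for n == 0 is an
 * assumption made for this example.                                       */
static void maxval_strings(char *dest, const char *base, size_t n, size_t len)
{
    if (n == 0) {
        memset(dest, 0, len);               /* assumed empty-reduction fill */
        return;
    }
    const char *best = base;                /* current maximum element */
    for (size_t i = 1; i < n; i++) {
        const char *p = base + i * len;
        if (memcmp(p, best, len) > 0)
            best = p;
    }
    memcpy(dest, best, len);                /* hand the winner back to the caller */
}

int main(void)
{
    char out[4] = {0};
    maxval_strings(out, "abc" "xyz" "def", 3, 3);
    printf("%.3s\n", out);                  /* prints xyz */
    return 0;
}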
[Extraction-damaged PTX object code: the matching string MINVAL members minval0_s1.o and minval0_s4.o (libgfortran/generated/minval0_s1.c and minval0_s4.c), declaring _gfortran_minval0_s*, _gfortran_mminval0_s* and _gfortran_sminval0_s*. The code mirrors the maxval0 members with the comparison direction reversed (setp.ge where maxval0 uses setp.le on the memcmp result); the readable fragments of the _s1 variant apparently fill the result with the byte value 255 on the zero-size path (mov.u32 ...,255).]
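Every family in this stretch also ships a masked variant (_gfortran_mmaxval0_s*, _gfortran_mminval0_s*, _gfortran_mminloc2_*, and so on). The visible fragments load one mask entry per element (ld.s8) and simply skip elements whose flag is zero, after first checking that the LOGICAL kind is one the code can step through and calling _gfortrani_internal_error otherwise. Here is a sketch of that skipping pattern, using MINLOC as the carrier; the name minloc_strings_masked and the plain char flag per element are inventions for the example, standing in for the kind-dependent mask stride the real code computes.

#include <stddef.h>
#include <stdio.h>
#include <string.h>

/* Illustrative sketch only: the masked form of the string MINLOC reduction.
 * mask holds one flag per element (a plain char here; the real code derives
 * a byte stride from the LOGICAL kind).  Masked-out elements are skipped;
 * if nothing is selected the result is 0.                                  */
static size_t minloc_strings_masked(const char *base, const char *mask,
                                    size_t n, size_t len)
{
    size_t pos = 0;
    const char *best = NULL;

    for (size_t i = 0; i < n; i++) {
        if (!mask[i])
            continue;                        /* element does not participate */
        const char *p = base + i * len;
        if (best == NULL || memcmp(p, best, len) < 0) {
            best = p;
            pos = i + 1;
        }
    }
    return pos;
}

int main(void)
{
    const char data[] = "bbb" "aaa" "ccc";
    const char mask[] = {1, 0, 1};           /* the smallest element is masked out */
    printf("%zu\n", minloc_strings_masked(data, mask, 3, 3));   /* prints 1 */
    return 0;
}

The scalar-mask wrappers (_gfortran_s*...) look like the degenerate case: the fragments appear to test a single mask value and either forward to the unmasked entry point or store the empty-reduction result directly.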
[Extraction-damaged PTX object code: the maxval1_s1.o member (libgfortran/generated/maxval1_s1.c), the DIM= form _gfortran_maxval1_s1 with its masked and scalar-mask companions _gfortran_mmaxval1_s1 and _gfortran_smaxval1_s1. It references _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents and memcmp, and carries diagnostic strings such as "Incorrect extent in return value of MAXVAL intrinsic in dimension %ld: is %ld, should be ..." plus further messages about the DIM argument and the rank of the return array. The readable fragments validate the DIM argument and the return-array shape, allocate the result through _gfortrani_xmallocarray when needed, and then run the same frame-based extent/stride/counter reduction loop as the rank-0 members; the fragment breaks off mid-function at the end of this part of the dump.]
ld.u64 %r361,[%r132]; add.u64 %r91,%r361,1; st.u64 [%r132],%r91; .loc 1 204 23 add.u64 %r363,%r379,%r150; ld.u64 %r92,[%r363]; .loc 1 204 13 add.u64 %r124,%r121,%r92; .loc 1 205 23 add.u64 %r364,%frame,%r150; ld.u64 %r94,[%r364]; .loc 1 205 13 add.u64 %r125,%r122,%r94; .loc 1 185 32 add.u64 %r366,%r380,%r150; ld.u64 %r97,[%r366]; .loc 1 185 13 setp.eq.u64 %r367,%r91,%r97; @ %r367 bra $L36; mov.u64 %r176,0; bra $L26; $L49: .loc 1 101 6 ld.u64 %r368,[%r179]; setp.eq.u64 %r369,%r368,0; @ ! %r369 bra $L12; bra $L37; $L_gfortran_mmaxval1_s1 .visible .func _gfortran_mmaxval1_s1608pred %r234; .reg .u64 %r235; .reg .u32 %r236; .reg .u64 %r237; .reg .u64 %r238; .reg .u32 %r239; .reg .u64 %r241; .reg .u32u32u64 %r313; .reg .u64u64 %r329; .reg .u32 %r330; .reg .u64 %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64 %r335; .reg .u64predpredu64 %r411; .reg .u64 %r412; .reg .pred %r413; .reg .pred %r414; .reg .u32 %r418; .reg .u64 %r420; .reg .pred %r422; .reg .pred %r423; .reg .predpredpred %r463; .reg .u64 %r464; .reg .predpredu64 %r480; .reg .u64 %r481; mov.u64 %r222,%ar0; mov.u64 %r223,%ar1; mov.u64 %r224,%ar2; mov.u64 %r225,%ar3; mov.u64 %r226,%ar4; mov.u64 %r227,%ar5; .loc 1 242 6 setp.ne.u64 %r228,%r226,0; @ %r228 bra $L223257; call _gfortran_maxval1_s1); } .loc 1 245 7 bra $L50; $L51: .loc 1 248 3 setp.eq.u64 %r234,%r223,%r227; @ %r234 bra $L53; cvta.const.u64 %r238,$LC0; cvta.const.u64 %r237,__func__$0; mov.u32 %r236,248;r2r2373853: .loc 1 250 10 ld.u64 %r22,[%r225]; .loc 1 250 7 add.u64 %r146,%r22,-1; .loc 1 251 10 ld.s8 %r23,[%r224+28]; .loc 1 251 38 add.u32 %r239,%r23,-1; .loc 1 251 8 cvt.s64.s32 %r147,%r239; .loc 1 253 7 shr.u64 %r241,%r146,63; set.u32.gt.s64 %r243,%r146,%r147; neg.s32 %r244,%r243; cvt.u16.u64 %r247,%r241; cvt.u16.u32 %r248,%r244; or.b16 %r246,%r247,%r248; .loc 1 253 6 cvt.u32.u16 %r249,%r246; cvt.u16.u8 %r250,%r249; setp.eq.u16 %r251,%r250,0; @ %r251 bra $L54; .loc 1 255 7 cvt.u32.u32 %r254,%r23; cvt.s64.s8 %r253,%r254; st.u64 [%stack+8],%r253; st.u64 [%stack],%r22; cvta.const.u64 %r22_gfortran_runtime_error4: .loc 1 260 9 add.u64 %r471,%r146,%r146; add.u64 %r468,%r471,%r146; shl.b64 %r259,%r468,3; add.u64 %r260,%r224,%r259; ld.u64 %r263,[%r260+56]; add.u64 %r262,%r263,1; .loc 1 260 7 ld.u64 %r270,[%r260+48]; sub.u64 %r148,%r262,%r270; .loc 1 261 6 setp.le.s64 %r271,%r148,0; @ %r271 bra $L50; .loc 1 264 9 ld.u64 %r159,[%r226]; .loc 1 266 15 ld.u64 %r35,[%r226+16]; .loc 1 268 22 cvt.u32.u64 %r36,%r35; .loc 1 268 53 add.u32 %r272,%r36,-4; and.b32 %r273,%r272,-5; set.u32.eq.u32 %r275,%r273,0; neg.s32 %r276,%r275; .loc 1 268 22 add.u32 %r277,%r36,-1; set.u32.le.u32 %r279,%r277,1; neg.s32 %r280,%r279; .loc 1 268 6 cvt.u16.u32 %r282,%r276; cvt.u16.u32 %r283,%r280; or.b16 %r281,%r282,%r283; cvt.u32.u16 %r284,%r281; cvt.u16.u8 %r285,%r28456; .loc 1 270 7 setp.ne.u32 %r288,%r36,16; @ %r288 bra $L57; $L56: .loc 1 277 11 shl.b64 %r292,%r468,3; add.u64 %r293,%r224,%r292; ld.u64 %r39,[%r293+40]; .loc 1 278 12 add.u64 %r299,%r226,%r292; ld.u64 %r42,[%r299+40]; .loc 1 280 3 setp.ne.u64 %r301,%r146,0; @ %r301 bra $L58; $L64: .loc 1 290 3 setp.lt.s64 %r302,%r146,%r147; @ %r302 bra $L59; bra $L109; $L57: .loc 1 275 5 cvta.const.u64 %r303,$LC303_gfortran_runtime_error8: add.u64 %r204,%r224,40; add.u64 %r206,%frame,240; add.u64 %r207,%r226,40; mov.u64 %r209,%frame; add.u64 %r210,%frame,360; add.u64 %r305,%r224,16; add.u64 %r307,%r22,%r22; add.u64 %r308,%r307,%r22; shl.b64 %r309,%r308,3; add.u64 %r219,%r305,%r309; .loc 1 287 12 mov.u64 %r481,0; $L63: .loc 1 282 51 ld.u64 %r311,[%r204]; 
mul.lo.u64 %r310,%r311,%r227; .loc 1 282 18 st.u64 [%r206],%r310; .loc 1 283 20 ld.u64 %r313,[%r207]; mul.lo.u64 %r312,%r313,%r35; .loc 1 283 18 st.u64 [%r209],%r312; .loc 1 284 19 ld.u64 %r315,[%r204+16]; add.u64 %r314,%r315,1; ld.u64 %r316,[%r204+8]; sub.u64 %r57,%r314,%r316; .loc 1 286 10 setp.lt.s64 %r317,%r57,0; @ %r317 bra $L61; .loc 1 284 17 st.u64 [%r210],%r57; bra $L62; $L61: .loc 1 287 12 st.u64 [%r210],%r481; $L62: .loc 1 280 3 add.u64 %r204,%r204,24; add.u64 %r206,%r206,8; add.u64 %r207,%r207,24; add.u64 %r209,%r209,8; add.u64 %r210,%r210,8; setp.ne.u64 %r319,%r204,%r219; @ %r319 bra $L63; bra $L64; $L109: .loc 1 300 6 ld.u64 %r320,[%r222]; setp.eq.u64 %r321,%r320,0; @ ! %r321 bra $L66; bra $L65; $L59: add.u64 %r323,%r22,%r22; add.u64 %r324,%r323,%r22; shl.b64 %r325,%r324,3; add.u64 %r125,%r325,40; add.u64 %r130,%r224,%r125; shl.b64 %r326,%r22,3; add.u64 %r75,%r326,-8; add.u64 %r466,%frame,240; add.u64 %r106,%r466,%r75; add.u64 %r68,%r226,%r125; add.u64 %r29,%frame,%r75; add.u64 %r472,%frame,360; add.u64 %r192,%r472,%r75; cvt.u32.u32 %r330,%r23; cvt.s64.s8 %r329,%r330; add.u64 %r332,%r329,%r329; add.u64 %r333,%r332,%r329; shl.b64 %r334,%r333,3; add.u64 %r335,%r224,40; add.u64 %r201,%r334,%r335; .loc 1 297 12 mov.u64 %r480,0; $L69: .loc 1 292 55 ld.u64 %r337,[%r130]; mul.lo.u64 %r336,%r337,%r227; .loc 1 292 18 st.u64 [%r106],%r336; .loc 1 293 20 ld.u64 %r339,[%r68]; mul.lo.u64 %r338,%r339,%r35; .loc 1 293 18 st.u64 [%r29],%r338; .loc 1 294 19 ld.u64 %r341,[%r130+16]; add.u64 %r340,%r341,1; ld.u64 %r342,[%r130+8]; sub.u64 %r72,%r340,%r342; .loc 1 296 10 setp.lt.s64 %r343,%r72,0; @ %r343 bra $L67; .loc 1 294 17 st.u64 [%r192],%r72; bra $L68; $L67: .loc 1 297 12 st.u64 [%r192],%r480; $L68: .loc 1 290 3 add.u64 %r130,%r130,24; add.u64 %r106,%r106,8; add.u64 %r68,%r68,24; add.u64 %r29,%r29,8; add.u64 %r192,%r192,8; setp.ne.u64 %r345,%r130,%r201; @ %r345 bra $L69; bra $L110; $L65: .loc 1 304 7 setp.eq.u64 %r346,%r147,0; @ %r346 bra $L71; add.u64 %r472,%frame,360; $L91: add.u64 %r141,%r222,40; mov.u64 %r139,%r472; cvt.u32.u32 %r348,%r23; cvt.s64.s8 %r347,%r348; add.u64 %r350,%r347,%r347; add.u64 %r351,%r350,%r347; shl.b64 %r352,%r351,3; add.u64 %r353,%r222,16; add.u64 %r131,%r352,%r353; .loc 1 307 10 mov.u64 %r137,1; .loc 1 311 4 mov.u64 %r373,0; bra $L72; $L71: .loc 1 315 20 add.u32 %r354,%r23,-2; cvt.s64.s32 %r81,%r354; add.u64 %r356,%r81,%r81; add.u64 %r357,%r356,%r81; shl.b64 %r358,%r357,3; add.u64 %r359,%r222,%r358; .loc 1 315 67 shl.b64 %r361,%r81,3; add.u64 %r362,%frame,%r361; .loc 1 315 59 ld.u64 %r365,[%r359+40]; ld.u64 %r366,[%r362+360]; mul.lo.u64 %r364,%r365,%r366; .loc 1 315 18 mul.lo.u64 %r153,%r364,%r227; .loc 1 318 24 mov.u64 %r367,0; st.u64 [%r222+8],%r367; .loc 1 319 28 cvt.u16.u32 %r370,%r23; add.u16 %r369,%r370,-1; cvt.u32.u16 %r371,%r369; st.u8 [%r222+28],%r371; .loc 1 321 10 setp.eq.u64 %r372,%r153,0; @ %r372 bra $L73; bra $L111; $L75: .loc 1 309 47 mul.lo.u64 %r137,%r78,%r137; $L72: .loc 1 311 4 st.u64 [%r141+8],%r373; ld.u64 %r78,[%r139]; add.u64 %r374,%r78,-1; st.u64 [%r141+16],%r374; st.u64 [%r141],%r137; .loc 1 304 7 add.u64 %r141,%r141,24; add.u64 %r139,%r139,8; setp.ne.u64 %r375,%r131,%r141; @ %r375 bra $L75; bra $L71; $L73: .loc 1 324 4 st.u64 [%r222+48],%r153; mov.u64 %r377,-1; st.u64 [%r222+56],%r377; mov.u64 %r378,1; st.u64 [%r222+40],%r378; .loc 1 325 4 bra $L50; $L111: .loc 1 328 24 mov.u64 %r38015(%value_in),_gfortrani_xmallocarray81,[%value_in]; } .loc 1 328 22 st.u64 [%r222],%r381; bra $L76; $L66: .loc 1 333 19 ld.s8 %r383,[%r222+28]; .loc 1 333 10 
setp.eq.u64 %r384,%r383,%r147; @ %r384 bra $L77; .loc 1 334 2 cvta.const.u64 %r385385_gfortran_runtime_error77: .loc 1 336 11 cvta.global.u64 %r387,_gfortrani_compile_options; .loc 1 336 10 ld.u32 %r388,[%r387+36]; setp.eq.u32 %r389,%r388,0; @ %r389 bra $L76; .loc 1 338 4 add.u64 %r472,%frame,360; cvta.const.u64 %r393,$LC4; cvta.const.u64 %r3922472392393; call _gfortrani_bounds_ifunction_return340 4 cvta.const.u64 %r397397393; call _gfortrani_bounds_equal_extents$L76: .loc 1 345 3 setp.ne.u64 %r399,%r147,0; @ %r399 bra $L78; $L81: .loc 1 353 8 ld.u64 %r160,[%r222]; .loc 1 354 8 ld.u64 %r158,[%r224]; .loc 1 356 9 setp.ne.u64 %r400,%r158,0; @ %r400 bra $L79; bra $L50; $L78: add.u64 %r171,%r222,40; cvt.u32.u32 %r402,%r23; cvt.s64.s8 %r401,%r402; add.u64 %r142,%r401,-1; .loc 1 345 3 mov.u64 %r169,0; mov.u64 %r164,%r169; add.u64 %r472,%frame,360; add.u64 %r469,%frame,480; add.u64 %r467,%frame,120; .loc 1 347 16 mov.u64 %r405,%r169; $L80: add.u64 %r404,%r469,%r169; st.u64 [%r404],%r405; .loc 1 348 18 add.u64 %r407,%r467,%r169; .loc 1 348 54 ld.u64 %r409,[%r171]; mul.lo.u64 %r408,%r409,%r227; .loc 1 348 18 st.u64 [%r407],%r408; .loc 1 349 17 add.u64 %r411,%r472,%r169; .loc 1 349 10 ld.u64 %r412,[%r411]; setp.le.s64 %r413,%r412,0; @ %r413 bra $L50; .loc 1 345 26 add.u64 %r164,%r164,1; .loc 1 345 3 add.u64 %r171,%r171,24; add.u64 %r169,%r169,8; setp.ne.u64 %r414,%r142,%r164; @ %r414 bra $L80; bra $L81; $L79: .loc 1 277 44 mul.lo.u64 %r41,%r39,%r227; .loc 1 278 12 mul.lo.u64 %r45,%r42,%r35; .loc 1 389 22 ld.u64 %r193,[%frame+240]; .loc 1 390 23 ld.u64 %r196,[%frame]; .loc 1 391 22 ld.u64 %r198,[%frame+120]; setp.le.s64 %r470,%r147,1; cvt.u32.u32 %r474,%r23; cvt.s64.s8 %r475,%r474; add.u64 %r476,%r475,-1; add.u64 %r477,%frame,240; add.u64 %r478,%frame,360; add.u64 %r479,%frame,120; $L92: .loc 1 366 2 mov.u32 %r4182420,[%value_in]; } mov.u64 %r65,%r159; mov.u64 %r154,%r158; .loc 1 368 9 mov.u64 %r162,0; $L83: .loc 1 371 7 ld.s8 %r200,[%r65]; .loc 1 371 6 setp.ne.u32 %r422,%r200,0; @ %r422 bra $L82; .loc 1 368 24 add.u64 %r162,%r162,1; .loc 1 368 32 add.u64 %r154,%r154,%r41; .loc 1 368 47 add.u64 %r65,%r65,%r45; .loc 1 368 2 setp.ne.u64 %r423,%r148,%r162; @ %r423 bra $L83; mov.u64 %r154,%r160; bra $L84; $L82: .loc 1 377 6 setp.le.s64 %r424,%r148,%r162; @ %r424 bra $L84; mov.u64 %r163,%r154; $L86: .loc 1 379 6 setp.eq.u32 %r425,%r200,0; @ %r425 bra $L85; .loc 1 3816r154429,[%value_in]; } .loc 1 379 13 setp.le.s32 %r431,%r429,0; selp.u64 %r154,%r154,%r163,%r431; $L85: .loc 1 377 23 add.u64 %r162,%r162,1; .loc 1 377 31 add.u64 %r163,%r163,%r41; .loc 1 377 46 add.u64 %r65,%r65,%r45; .loc 1 377 6 setp.eq.u64 %r432,%r148,%r162; @ %r432 bra $L84; .loc 1 379 7 ld.s8 %r200,[%r65]; bra $L86; $L84: .loc 1 385 2161547439,[%value_in]; } .loc 1 388 15 ld.u64 %r441,[%frame+480]; add.u64 %r100,%r441,1; st.u64 [%frame+480],%r100; .loc 1 389 12 add.u64 %r158,%r158,%r193; .loc 1 390 13 add.u64 %r159,%r159,%r196; .loc 1 391 12 add.u64 %r160,%r160,%r198; .loc 1 393 32 ld.u64 %r134,[%frame+360]; .loc 1 393 13 setp.ne.u64 %r442,%r100,%r134; @ %r442 bra $L92; .loc 1 397 13 mov.u64 %r443,0; st.u64 [%frame+480],%r443; .loc 1 400 23 mul.lo.u64 %r109,%r100,%r193; .loc 1 401 24 mul.lo.u64 %r444,%r100,%r196; .loc 1 401 10 sub.u64 %r155,%r159,%r444; .loc 1 402 23 mul.lo.u64 %r445,%r100,%r198; .loc 1 402 9 sub.u64 %r156,%r160,%r445; .loc 1 404 7 @ %r470 bra $L50; add.u64 %r191,%frame,488; mov.u64 %r189,8; .loc 1 403 5 mov.u64 %r157,1; bra $L89; $L90: .loc 1 397 13 st.u64 [%r191],%r443; .loc 1 400 23 mul.lo.u64 %r109,%r119,%r118; .loc 
1 401 24 mul.lo.u64 %r451,%r121,%r118; .loc 1 401 10 sub.u64 %r155,%r159,%r451; .loc 1 402 23 mul.lo.u64 %r452,%r123,%r118; .loc 1 402 9 sub.u64 %r156,%r160,%r452; .loc 1 403 5 add.u64 %r157,%r157,1; .loc 1 404 7 add.u64 %r191,%r191,8; add.u64 %r189,%r189,8; setp.eq.u64 %r453,%r157,%r476; @ %r453 bra $L50; $L89: .loc 1 412 16 ld.u64 %r454,[%r191]; add.u64 %r118,%r454,1; st.u64 [%r191],%r118; .loc 1 413 23 add.u64 %r456,%r477,%r189; ld.u64 %r119,[%r456]; .loc 1 413 13 sub.u64 %r457,%r119,%r109; add.u64 %r158,%r158,%r457; .loc 1 414 24 add.u64 %r458,%frame,%r189; ld.u64 %r121,[%r458]; .loc 1 414 14 add.u64 %r159,%r155,%r121; .loc 1 415 23 add.u64 %r460,%r479,%r189; ld.u64 %r123,[%r460]; .loc 1 415 13 add.u64 %r160,%r156,%r123; .loc 1 393 32 add.u64 %r462,%r478,%r189; ld.u64 %r126,[%r462]; .loc 1 393 13 setp.eq.u64 %r463,%r118,%r126; @ %r463 bra $L90; bra $L92; $L110: .loc 1 300 6 ld.u64 %r464,[%r222]; setp.eq.u64 %r465,%r464,0; @ ! %r465 bra $L66; bra $L91; $L50_gfortran_smaxval1_s1 .visible .func _gfortran_smaxval1_s1local .align 16 .b8 %frame_ar[3688pred %r159; .reg .u32 %r160; .reg .predu16 %r173; .reg .u16 %r174; .reg .u16 %r175; .reg .u32 %r176; .reg .u16 %r177; .reg .pred %r178; .reg .pred %r179; .reg .predpred %r193; .reg .pred %r195; .reg .predpredu16 %r229; .reg .u32pred %r271; .reg .u64u32 %r280; .reg .u64 %r282; .reg .u64 %r284; .reg .u64pred %r296; .reg .predu64u64 %r316; mov.u64 %r153,%ar0; mov.u64 %r154,%ar1; mov.u64 %r155,%ar2; mov.u64 %r156,%ar3; mov.u64 %r157,%ar4; mov.u64 %r158,%ar5; .loc 1 445 6 setp.eq.u64 %r159,%r157,0; @ %r159 bra $L113; .loc 1 445 20 ld.u32 %r160,[%r157]; setp.eq.u32 %r161,%r160,0; @ %r161 bra $L114; 15r1541515158; call _gfortran_maxval1_s1); } .loc 1 448 7 bra $L112; $L114: .loc 1 451 10 ld.u64 %r23,[%r156]; .loc 1 451 7 add.u64 %r88,%r23,-1; .loc 1 452 10 ld.s8 %r24,[%r155+28]; .loc 1 452 38 add.u32 %r26,%r24,-1; .loc 1 452 8 cvt.s64.s32 %r310,%r26; .loc 1 454 7 shr.u64 %r168,%r88,63; set.u32.gt.s64 %r170,%r88,%r310; neg.s32 %r171,%r170; cvt.u16.u64 %r174,%r168; cvt.u16.u32 %r175,%r171; or.b16 %r173,%r174,%r175; .loc 1 454 6 cvt.u32.u16 %r176,%r173; cvt.u16.u8 %r177,%r176; setp.ne.u16 %r178,%r177,0; @ %r178 bra $L116; .loc 1 461 3 setp.ne.u64 %r179,%r88,0; @ %r179 bra $L117; $L123: .loc 1 469 3 setp.lt.s64 %r180,%r88,%r310; @ %r180 bra $L118; bra $L169; $L116: .loc 1 456 7 cvt.u32.u32 %r183,%r24; cvt.s64.s8 %r182,%r183; st.u64 [%stack+8],%r182181181_gfortran_runtime_error117: add.u64 %r59,%r155,48; add.u64 %r137,%frame,120; add.u64 %r101,%r155,56; add.u64 %r185,%r155,24; add.u64 %r187,%r23,%r23; add.u64 %r188,%r187,%r23; shl.b64 %r189,%r188,3; add.u64 %r149,%r185,%r189; .loc 1 466 12 mov.u64 %r316,0; $L122: .loc 1 463 19 ld.u64 %r191,[%r101]; add.u64 %r190,%r191,1; ld.u64 %r192,[%r59]; sub.u64 %r36,%r190,%r192; .loc 1 465 10 setp.le.s64 %r193,%r36,0; @ %r193 bra $L120; .loc 1 463 17 st.u64 [%r137],%r36; bra $L121; $L120: .loc 1 466 12 st.u64 [%r137],%r316; $L121: .loc 1 461 3 add.u64 %r59,%r59,24; add.u64 %r137,%r137,8; add.u64 %r101,%r101,24; setp.ne.u64 %r195,%r59,%r149; @ %r195 bra $L122; bra $L123; $L169: .loc 1 478 15 ld.u64 %r96,[%r153]; .loc 1 478 6 setp.eq.u64 %r196,%r96,0; @ ! 
%r196 bra $L125; bra $L124; $L118: add.u64 %r198,%r23,%r23; add.u64 %r199,%r198,%r23; shl.b64 %r200,%r199,3; add.u64 %r201,%r200,48; add.u64 %r43,%r155,%r201; add.u64 %r309,%frame,120; shl.b64 %r203,%r23,3; add.u64 %r204,%r203,-8; add.u64 %r131,%r309,%r204; cvt.u32.u32 %r206,%r24; cvt.s64.s8 %r205,%r206; add.u64 %r208,%r205,%r205; add.u64 %r209,%r208,%r205; shl.b64 %r210,%r209,3; add.u64 %r211,%r155,48; add.u64 %r133,%r210,%r211; .loc 1 475 12 mov.u64 %r315,0; $L128: .loc 1 472 2 ld.u64 %r213,[%r43+8]; add.u64 %r212,%r213,1; ld.u64 %r214,[%r43]; sub.u64 %r41,%r212,%r214; .loc 1 474 10 setp.le.s64 %r215,%r41,0; @ %r215 bra $L126; .loc 1 471 17 st.u64 [%r131],%r41; bra $L127; $L126: .loc 1 475 12 st.u64 [%r131],%r315; $L127: .loc 1 469 3 add.u64 %r43,%r43,24; add.u64 %r131,%r131,8; setp.ne.u64 %r217,%r43,%r133; @ %r217 bra $L128; bra $L170; $L124: .loc 1 482 7 setp.eq.u64 %r218,%r310,0; @ %r218 bra $L130; add.u64 %r309,%frame,120; $L149: add.u64 %r111,%r153,40; mov.u64 %r108,%r309; cvt.u32.u32 %r220,%r24; cvt.s64.s8 %r219,%r220; add.u64 %r222,%r219,%r219; add.u64 %r223,%r222,%r219; shl.b64 %r224,%r223,3; add.u64 %r225,%r153,16; add.u64 %r87,%r224,%r225; .loc 1 485 10 mov.u64 %r84,1; .loc 1 489 4 mov.u64 %r245,0; bra $L131; $L130: .loc 1 493 24 mov.u64 %r226,0; st.u64 [%r153+8],%r226; .loc 1 494 28 cvt.u16.u32 %r229,%r24; add.u16 %r228,%r229,-1; cvt.u32.u16 %r230,%r228; st.u8 [%r153+28],%r230; .loc 1 496 20 add.u32 %r231,%r24,-2; cvt.s64.s32 %r50,%r231; add.u64 %r233,%r50,%r50; add.u64 %r234,%r233,%r50; shl.b64 %r235,%r234,3; add.u64 %r236,%r153,%r235; .loc 1 496 67 shl.b64 %r238,%r50,3; add.u64 %r239,%frame,%r238; .loc 1 496 59 ld.u64 %r242,[%r236+40]; ld.u64 %r243,[%r239+120]; mul.lo.u64 %r241,%r242,%r243; .loc 1 496 18 mul.lo.u64 %r94,%r241,%r158; .loc 1 499 10 setp.eq.u64 %r244,%r94,0; @ %r244 bra $L132; bra $L171; $L134: .loc 1 487 48 mul.lo.u64 %r84,%r46,%r84; $L131: .loc 1 489 4 st.u64 [%r111+8],%r245; ld.u64 %r46,[%r108]; add.u64 %r246,%r46,-1; st.u64 [%r111+16],%r246; st.u64 [%r111],%r84; .loc 1 482 7 add.u64 %r111,%r111,24; add.u64 %r108,%r108,8; setp.ne.u64 %r247,%r87,%r111; @ %r247 bra $L134; bra $L130; $L132: .loc 1 502 4 st.u64 [%r153+48],%r94; mov.u64 %r249,-1; st.u64 [%r153+56],%r249; mov.u64 %r250,1; st.u64 [%r153+40],%r250; .loc 1 503 4 bra $L112; $L171: .loc 1 506 24 mov.u64 %r2522; call (%value_in),_gfortrani_xmallocarray253,[%value_in]; } mov.u64 %r96,%r253; .loc 1 506 22 st.u64 [%r153],%r96; $L138: .loc 1 532 3 setp.ne.u64 %r255,%r310,0; @ %r255 bra $L135; bra $L136; $L125: .loc 1 510 19 ld.s8 %r56,[%r153+28]; .loc 1 510 10 setp.eq.u64 %r256,%r56,%r310; @ %r256 bra $L137; .loc 1 511 2 st.u64 [%stack+8],%r310_gfortran_runtime_error137: .loc 1 516 11 cvta.global.u64 %r259,_gfortrani_compile_options; .loc 1 516 10 ld.u32 %r260,[%r259+36]; setp.eq.u32 %r261,%r260,0; @ %r261 bra $L138; .loc 1 518 4 setp.eq.u64 %r262,%r310,0; @ %r262 bra $L136; add.u64 %r86,%r153,48; add.u64 %r83,%frame,120; cvt.u32.u32 %r264,%r24; cvt.s64.s8 %r263,%r264; add.u64 %r72,%r263,-1; .loc 1 518 10 mov.u64 %r104,0; $L140: .loc 1 522 21 ld.u64 %r266,[%r86+8]; add.u64 %r265,%r266,1; .loc 1 522 19 ld.u64 %r267,[%r86]; sub.u64 %r93,%r265,%r267; .loc 1 523 18 ld.u64 %r63,[%r83]; .loc 1 524 3 add.u64 %r104,%r104,1; .loc 1 523 11 setp.eq.u64 %r268,%r63,%r93; @ %r268 bra $L139; .loc 1 524 3 st.u64 [%stack+16],%r63; st.u64 [%stack+8],%r93; st.u64 [%stack],%r104; cvta.const.u64 %r269,$LCstack; call _gfortran_runtime_error139: .loc 1 518 4 add.u64 %r86,%r86,24; add.u64 %r83,%r83,8; setp.ne.u64 %r271,%r72,%r104; 
@ %r271 bra $L140; $L135: .loc 1 534 16 cvt.u16.u32 %r273,%r24; setp.le.s16 %r274,%r273,1; @ %r274 bra $L141; shl.b64 %r272,%r310,3; bra $L142; $L141: mov.u64 %r272,8; $L142: add.u64 %r276,%frame,240; mov.u32 %r2802722,[%value_in]; } add.u64 %r117,%r153,40; mov.u64 %r98,0; $L143: .loc 1 535 18 shl.b64 %r284,%r98,3; add.u64 %r285,%frame,%r284; .loc 1 535 54 ld.u64 %r287,[%r117]; mul.lo.u64 %r286,%r287,%r158; .loc 1 535 18 st.u64 [%r285],%r286; .loc 1 532 26 add.u64 %r98,%r98,1; .loc 1 532 3 add.u64 %r117,%r117,24; setp.gt.s64 %r288,%r310,%r98; @ %r288 bra $L143; $L136: .loc 1 546 32 ld.u64 %r132,[%frame+120]; .loc 1 544 22 ld.u64 %r136,[%frame]; .loc 1 543 12 ld.u64 %r138,[%frame+240]; .loc 1 553 23 mul.lo.u64 %r142,%r132,%r136; setp.le.s64 %r308,%r310,1; cvt.u32.u32 %r312,%r24; cvt.s64.s8 %r313,%r312; add.u64 %r314,%r313,-1; $L148: .loc 1 542 7 mov.u32 %r2543 15 add.u64 %r138,%r138,1; .loc 1 544 12 add.u64 %r96,%r96,%r136; .loc 1 546 13 setp.ne.u64 %r296,%r138,%r132; @ %r296 bra $L148; .loc 1 555 7 @ %r308 bra $L112; add.u64 %r97,%frame,248; add.u64 %r125,%frame,8; add.u64 %r124,%frame,128; .loc 1 553 23 mov.u64 %r74,%r142; .loc 1 554 5 mov.u64 %r95,1; .loc 1 550 13 mov.u64 %r311,0; bra $L146; $L147: st.u64 [%r97],%r311; .loc 1 553 23 mul.lo.u64 %r74,%r77,%r76; .loc 1 554 5 add.u64 %r95,%r95,1; .loc 1 555 7 add.u64 %r97,%r97,8; add.u64 %r125,%r125,8; add.u64 %r124,%r124,8; setp.eq.u64 %r303,%r95,%r314; @ %r303 bra $L112; $L146: .loc 1 559 16 ld.u64 %r304,[%r97]; add.u64 %r76,%r304,1; st.u64 [%r97],%r76; .loc 1 560 23 ld.u64 %r77,[%r125]; .loc 1 560 13 sub.u64 %r305,%r77,%r74; add.u64 %r96,%r96,%r305; .loc 1 546 32 ld.u64 %r80,[%r124]; .loc 1 546 13 setp.eq.u64 %r306,%r76,%r80; @ %r306 bra $L147; mov.u64 %r138,0; bra $L148; $L170: .loc 1 478 15 ld.u64 %r96,[%r153]; .loc 1 478 6 setp.eq.u64 %r307,%r96,0; @ ! 
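These routine families all implement the same basic reduction: scan fixed-length character strings along one dimension of an array and keep the lexicographically largest one, using memcmp for the kind=1 byte ordering. The following is a rough, purely illustrative C sketch of that inner comparison loop; the name, signature and calling convention are invented for clarity and are not the libgfortran API.

#include <stddef.h>
#include <string.h>

/* Illustrative only: return the index of the lexicographically largest
   fixed-length string in a strided slice of n strings, each len bytes long
   and stride bytes apart.  kind=1 strings compare byte-wise, which is
   exactly what memcmp provides. */
static size_t max_string_index(const char *base, size_t n,
                               size_t len, ptrdiff_t stride)
{
    size_t best = 0;
    for (size_t i = 1; i < n; i++) {
        const char *cur  = base + (ptrdiff_t)i * stride;
        const char *maxs = base + (ptrdiff_t)best * stride;
        if (memcmp(cur, maxs, len) > 0)   /* new maximum found */
            best = i;
    }
    return best;
}

The masked variants (_gfortran_mmaxval1_*) appear to consult a logical mask and skip elements where it is false, and the scalar-mask wrappers (_gfortran_smaxval1_*) either delegate to the unmasked routine or produce a default result, consistent with the call to _gfortran_maxval1_s1 visible in the _gfortran_smaxval1_s1 body above.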
[Unrecoverable PTX instruction stream: end of _gfortran_smaxval1_s1, then maxval1_s4.o, compiled from ../../../libgfortran/generated/maxval1_s4.c. It declares _gfortran_maxval1_s4, _gfortran_mmaxval1_s4 and _gfortran_smaxval1_s4, and references the same externs as the kind=1 object with _gfortrani_memcmp_char4 in place of memcmp. The embedded byte arrays decode to __func__ = "maxval1_s4" and the same MAXVAL diagnostics quoted above. The function bodies are too corrupted in this extraction to reproduce; the scalar-mask wrapper again contains a call to _gfortran_maxval1_s4.]
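The maxval1_s4 object differs mainly in the element type: kind=4 (UCS-4) strings are arrays of 32-bit code units, which is why its comparisons go through the _gfortrani_memcmp_char4 helper referenced in the declarations and why the surviving s4 bodies scale addresses by 4 (the shl.b64 ...,2 operations). A hedged sketch of what such a code-unit comparison amounts to, again purely illustrative and not the library's actual implementation:

#include <stddef.h>
#include <stdint.h>

/* Illustrative only: compare two kind=4 strings of n 32-bit code units,
   returning <0, 0 or >0 in the style of memcmp. */
static int compare_char4(const uint32_t *a, const uint32_t *b, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        if (a[i] != b[i])
            return a[i] < b[i] ? -1 : 1;
    }
    return 0;
}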
// minval1_s1.o -- PTX generated from ../../../libgfortran/generated/minval1_s1.c
// (MINVAL of a character(kind=1) array along a dimension).
// Defines: .visible .func _gfortran_minval1_s1  (unmasked),
//          .visible .func _gfortran_mminval1_s1 (array MASK= variant),
//          .visible .func _gfortran_sminval1_s1 (scalar-mask wrapper).
// References: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray,
//             _gfortrani_bounds_ifunction_return, _gfortrani_bounds_equal_extents, memcmp.
// Constant data holds __func__ ("minval1_s1") and the runtime error texts, among them
// "Incorrect extent in return value of MINVAL intrinsic in dimension %ld: is %ld, should be %ld",
// "Dim argument incorrect in MINVAL intrinsic: is %ld, should be between 1 and %ld",
// "rank of return array incorrect in MINVAL intrinsic: is %ld, should be %ld",
// and "Funny sized logical array".
// The three function bodies follow: argument and rank checks, allocation of the result array via
// _gfortrani_xmallocarray when it is unallocated, the optional bounds checks guarded by
// _gfortrani_compile_options, and nested loops that walk the source and result arrays by their
// strides, comparing candidate strings with memcmp.
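What these three entry points compute is easier to see in C than in PTX: MINVAL along one dimension keeps, for every position of the remaining dimensions, the lexicographically smallest string, with memcmp as the ordering. The following is a minimal sketch under simplifying assumptions (a contiguous 2-D array, reduction over the first dimension, made-up names such as minval_dim0); it is not the libgfortran source, which operates on descriptor-based arrays of arbitrary rank.

/* Minimal sketch (not the libgfortran source): the general shape of the reduction
 * these PTX routines perform, i.e. MINVAL of a character(kind=1) array along one
 * dimension with memcmp() as the ordering.  Layout and names are illustrative only. */
#include <stdio.h>
#include <string.h>

/* Reduce a 2-D array of fixed-length strings along dimension 0:
 * result[j] = minimum over i of a[i][j]. */
static void minval_dim0(const char *a, size_t n0, size_t n1, size_t len,
                        char *result)
{
    for (size_t j = 0; j < n1; j++) {
        const char *min = a + j * len;               /* element (0, j) */
        for (size_t i = 1; i < n0; i++) {
            const char *cur = a + (i * n1 + j) * len;
            if (memcmp(cur, min, len) < 0)           /* same ordering the PTX gets from memcmp */
                min = cur;
        }
        memcpy(result + j * len, min, len);
    }
}

int main(void)
{
    /* 2 x 3 array of 2-character strings, stored row-major for this sketch. */
    const char a[2][3][2] = { { "bb", "aa", "cc" }, { "ab", "ba", "ca" } };
    char out[3][2];
    minval_dim0(&a[0][0][0], 2, 3, 2, &out[0][0]);
    for (int j = 0; j < 3; j++)
        printf("%.2s\n", out[j]);                    /* prints ab, aa, ca */
    return 0;
}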
// minval1_s4.o -- PTX generated from ../../../libgfortran/generated/minval1_s4.c
// (the same MINVAL-along-a-dimension routines for character(kind=4) data).
// Defines: _gfortran_minval1_s4, _gfortran_mminval1_s4, _gfortran_sminval1_s4.
// References the same helpers as minval1_s1.o, except that string comparison goes through
// _gfortrani_memcmp_char4 instead of memcmp and the element strides are scaled by four bytes
// per character.  The constant data carries the same MINVAL error texts with __func__ set to
// "minval1_s4", and the three function bodies follow the same structure of bounds checks,
// result allocation and strided reduction loops.
// findloc0_i1.o -- PTX generated from ../../../libgfortran/generated/findloc0_i1.c
// (FINDLOC over a whole integer(kind=1) array, returning a rank-one vector of indices).
// Declares: _gfortran_findloc0_i1, _gfortran_mfindloc0_i1 (array MASK= variant),
//           _gfortran_sfindloc0_i1 (scalar-mask wrapper).
// References: _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray,
//             _gfortrani_bounds_iforeach_return, _gfortrani_bounds_equal_extents,
//             _gfortrani_internal_error.
// Constant data holds the FINDLOC error texts, among them "Rank of array needs to be > 0",
// the MASK argument message and "Funny sized logical array".
// The body of _gfortran_findloc0_i1 opens with the rank check, allocates the result vector
// through _gfortrani_xmallocarray when it is unallocated, zeroes it, and gathers the extents
// and strides of the source array before the element search.
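For the findloc0 family, a plain C sketch again shows the computed result more directly than the PTX: the routine returns the index tuple of the first element equal to the target value, following Fortran's column-major order, or all zeros when no element matches. The sketch below is illustrative only (fixed rank 2, a made-up name findloc0_2d); the real routine works on arbitrary-rank descriptors and honours the BACK argument by scanning from the end.

/* Minimal sketch (not the libgfortran source): what a findloc0-style routine computes
 * for an integer(kind=1) array.  Column-major order mirrors Fortran; the rest is a
 * simplified assumption for illustration. */
#include <stdint.h>
#include <stdio.h>

/* Search an n0 x n1 column-major array; write 1-based indices into idx[2],
 * or zeros if the value is absent. */
static void findloc0_2d(const int8_t *a, size_t n0, size_t n1,
                        int8_t value, size_t idx[2])
{
    for (size_t j = 0; j < n1; j++)
        for (size_t i = 0; i < n0; i++)
            if (a[j * n0 + i] == value) {   /* column-major: first index varies fastest */
                idx[0] = i + 1;             /* Fortran-style 1-based result */
                idx[1] = j + 1;
                return;
            }
    idx[0] = idx[1] = 0;
}

int main(void)
{
    const int8_t a[] = { 3, 1, 4, 1, 5, 9 };   /* 2 x 3 array, column-major */
    size_t idx[2];
    findloc0_2d(a, 2, 3, 5, idx);
    printf("%zu %zu\n", idx[0], idx[1]);       /* prints 1 3 */
    return 0;
}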
%r164 bra $L32; add.u64 %r207,%frame,248; .loc 1 134 11 cvt.u16.u32 %r188,%r118; mov.u32 %r212,1; .loc 1 149 17 mov.u64 %r215,0; bra $L8; $L32: .loc 1 91 42 add.u64 %r165,%r83,-1; .loc 1 91 12 add.u64 %r65,%r43,%r165; .loc 1 104 13 neg.s64 %r42,%r108; add.u64 %r207,%frame,248; .loc 1 97 11 cvt.u16.u32 %r168,%r118; mov.u32 %r208,1; .loc 1 112 17 mov.u64 %r211,0; $L14: ld.u64 %r38,[%frame+240]; mov.u64 %r105,%r38; mov.u32 %r57,%r160; $L12: .loc 1 97 11 ld.s8 %r167,[%r65]; cvt.u16.u32 %r166,%r167; setp.ne.u16 %r169,%r166,%r168; @ %r169 bra $L9; setp.eq.u32 %r170,%r57,0; selp.u64 %r105,%r38,%r105,%r170; add.u64 %r89,%frame,128; mov.u64 %r88,%r207; add.u64 %r82,%r205,%r91; $L11: .loc 1 100 37 sub.u64 %r174,%r104,%r105; .loc 1 100 25 st.u64 [%r97],%r174; .loc 1 99 5 add.u64 %r97,%r97,%r114; setp.eq.u64 %r175,%r82,%r89; @ %r175 bra $L1; .loc 1 100 33 ld.u64 %r104,[%r89]; .loc 1 100 44 ld.u64 %r105,[%r88]; add.u64 %r89,%r89,8; add.u64 %r88,%r88,8; bra $L11; $L9: .loc 1 104 13 add.u64 %r65,%r65,%r42; .loc 1 105 14 add.u64 %r105,%r105,1; mov.u32 %r57,%r208; .loc 1 105 6 setp.ne.u64 %r176,%r105,%r104; @ %r176 bra $L12; st.u64 [%frame+240],%r104; mov.u64 %r106,%r207; mov.u64 %r52,%r104; mov.u64 %r50,%r108; .loc 1 107 6 mov.u64 %r70,0; $L13: .loc 1 112 17 st.u64 [%r106+-8],%r211; .loc 1 115 39 mul.lo.u64 %r47,%r50,%r52; mov.u64 %r67,%r70; .loc 1 116 9 add.u64 %r70,%r70,1; .loc 1 117 11 setp.eq.u64 %r179,%r67,%r66; @ %r179 bra $L1; .loc 1 121 13 ld.u64 %r180,[%r106]; add.u64 %r49,%r180,1; st.u64 [%r106],%r49; shl.b64 %r101,%r70,3; .loc 1 122 24 add.u64 %r181,%frame,%r101; ld.u64 %r50,[%r181]; .loc 1 122 10 sub.u64 %r182,%r47,%r50; add.u64 %r65,%r65,%r182; .loc 1 124 33 add.u64 %r184,%r205,%r101; ld.u64 %r52,[%r184]; .loc 1 124 6 add.u64 %r106,%r106,8; setp.eq.u64 %r185,%r49,%r52; @ %r185 bra $L13; bra $L14; $L8: ld.u64 %r92,[%frame+240]; .loc 1 107 6 mov.u64 %r100,%r92; mov.u32 %r75,%r119; $L18: .loc 1 134 11 ld.s8 %r187,[%r43]; cvt.u16.u32 %r186,%r187; setp.ne.u16 %r189,%r186,%r188; @ %r189 bra $L15; setp.eq.u32 %r190,%r75,0; selp.u64 %r100,%r92,%r100,%r190; mov.u64 %r69,%r207; add.u64 %r95,%r206,%r91; $L17: .loc 1 137 36 add.u64 %r193,%r100,1; .loc 1 137 25 st.u64 [%r97],%r193; .loc 1 136 5 add.u64 %r97,%r97,%r114; setp.eq.u64 %r194,%r69,%r95; @ %r194 bra $L1; .loc 1 137 32 ld.u64 %r100,[%r69]; add.u64 %r69,%r69,8; bra $L17; $L15: .loc 1 141 13 add.u64 %r43,%r43,%r108; .loc 1 142 14 add.u64 %r100,%r100,1; mov.u32 %r75,%r212; .loc 1 142 6 setp.ne.u64 %r195,%r100,%r104; @ %r195 bra $L18; st.u64 [%frame+240],%r104; mov.u64 %r68,%r207; mov.u64 %r64,%r104; mov.u64 %r62,%r108; .loc 1 144 6 mov.u64 %r71,0; $L19: .loc 1 149 17 st.u64 [%r68+-8],%r215; .loc 1 152 39 mul.lo.u64 %r59,%r62,%r64; mov.u64 %r98,%r71; .loc 1 153 9 add.u64 %r71,%r71,1; .loc 1 154 11 setp.eq.u64 %r198,%r98,%r66; @ %r198 bra $L1; .loc 1 158 13 ld.u64 %r199,[%r68]; add.u64 %r61,%r199,1; st.u64 [%r68],%r61; shl.b64 %r54,%r71,3; .loc 1 159 24 add.u64 %r200,%frame,%r54; ld.u64 %r62,[%r200]; .loc 1 159 10 sub.u64 %r201,%r62,%r59; add.u64 %r43,%r43,%r201; .loc 1 161 33 add.u64 %r203,%r205,%r54; ld.u64 %r64,[%r203]; .loc 1 161 6 add.u64 %r68,%r68,8; setp.eq.u64 %r204,%r61,%r64; @ %r204 bra $L19; bra $L8; $L_gfortran_mfindloc0_i1 .visible .func _gfortran_mfindloc0_i1stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[480u32u32u16 %r197; .reg .u16 %r198; .reg .u16 %r199; .reg .u32 %r200; .reg .u16u64 %r217; .reg .u32.reg .u32 %r234; .reg .pred %r236; .reg .pred %r237; .reg .u64 %r240; .reg .predmov.u64 %r156,%ar0; mov.u64 %r157,%ar1; mov.u32 
%r158,%ar2; mov.u64 %r159,%ar3; mov.u32 %r160,%ar4; .loc 1 190 10 ld.s8 %r22,[%r157+28]; .loc 1 190 8 cvt.u32.u32 %r161,%r22; cvt.s64.s8 %r103,%r161; .loc 1 191 6 setp.gt.s64 %r162,%r103,0; @ %r162 bra $L34; .loc 1 192 5 cvta.const.u64 %r163163_gfortran_runtime_error34: .loc 1 194 6 ld.u64 %r165,[%r156]; setp.ne.u64 %r166,%r165,0; @ %r166 bra $L35; .loc 1 196 7 st.u64 [%r156+48],%r165; add.u64 %r168,%r103,-1; st.u64 [%r156+56],%r168; mov.u64 %r169,1; st.u64 [%r156+40],%r169; .loc 1 197 28 cvt.u32.u64 %r170,%r169; st.u8 [%r156+28],%r170; .loc 1 198 24 st.u64 [%r156+8],%r165; .loc 1 199 29 mov.u64 %r173(%value_in),_gfortrani_xmallocarray176,[%value_in]; } .loc 1 199 27 st.u64 [%r156],%r176; bra $L36; $L35: .loc 1 203 11 cvta.global.u64 %r178,_gfortrani_compile_options; .loc 1 203 10 ld.u32 %r179,[%r178+36]; setp.eq.u32 %r180,%r179,0; @ %r180 bra $L36; .loc 1 205 4r157183; call _gfortrani_bounds_iforeach_return.loc 1 207 4 cvta.const.u64 %r186183; call _gfortrani_bounds_equal_extents$L36: .loc 1 212 13 ld.u32 %r104,[%r159+16]; .loc 1 214 9 ld.u64 %r58,[%r159]; .loc 1 216 53 add.u32 %r188,%r104,-4; and.b32 %r189,%r188,-5; set.u32.eq.u32 %r191,%r189,0; neg.s32 %r192,%r191; .loc 1 216 22 add.u32 %r193,%r104,-1; set.u32.le.u32 %r195,%r193,1; neg.s32 %r196,%r195; .loc 1 216 6 cvt.u16.u32 %r198,%r192; cvt.u16.u32 %r199,%r196; or.b16 %r197,%r198,%r199; cvt.u32.u16 %r200,%r197; cvt.u16.u8 %r201,%r200; setp.ne.u16 %r202,%r201,0; @ %r202 bra $L37; .loc 1 218 7 setp.ne.u32 %r203,%r104,16; @ %r203 bra $L38; $L37: ld.u64 %r204,[%r156+40]; shl.b64 %r43,%r204,3; ld.u64 %r131,[%r156]; .loc 1 226 8 mov.u64 %r84,%r131; .loc 1 229 10 mov.u64 %r115,0; .loc 1 230 23 mov.u64 %r207,%r115; bra $L39; $L38: .loc 1 223 5 cvta.const.u64 %r206,$LC3; mov.u64 %r205,206; call _gfortrani_internal_error39: .loc 1 230 23 st.u64 [%r84],%r207; mov.u64 %r139,%r115; .loc 1 229 26 add.u64 %r115,%r139,1; .loc 1 229 3 add.u64 %r84,%r84,%r43; setp.ne.u64 %r208,%r103,%r115; @ %r208 bra $L39; .loc 1 236 20 ld.u64 %r36,[%r159+16]; add.u64 %r140,%r157,40; add.u64 %r282,%frame,120; mov.u64 %r96,%r282; add.u64 %r77,%r159,40; mov.u64 %r80,%frame; add.u64 %r79,%frame,240; mov.u64 %r75,%r79; .loc 1 232 6 mov.u64 %r113,1; .loc 1 233 10 mov.u64 %r130,0; bra $L41; $L54: mov.u64 %r130,%r114; $L41: .loc 1 235 18 ld.u64 %r209,[%r140]; st.u64 [%r96],%r209; .loc 1 236 20 ld.u64 %r211,[%r77]; mul.lo.u64 %r210,%r211,%r36; .loc 1 236 18 st.u64 [%r80],%r210; .loc 1 237 19 ld.u64 %r213,[%r140+16]; add.u64 %r212,%r213,1; ld.u64 %r214,[%r140+8]; sub.u64 %r42,%r212,%r214; .loc 1 237 17 st.u64 [%r75],%r42; .loc 1 238 10 mul.lo.u64 %r113,%r113,%r42; .loc 1 239 10 setp.le.s64 %r215,%r42,0; @ %r215 bra $L33; .loc 1 233 26 add.u64 %r114,%r130,1; .loc 1 233 3 add.u64 %r140,%r140,24; add.u64 %r96,%r96,8; add.u64 %r77,%r77,24; add.u64 %r80,%r80,8; add.u64 %r75,%r75,8; setp.ne.u64 %r216,%r139,%r130; @ %r216 bra $L54; .loc 1 244 16 cvt.u32.u32 %r218,%r22; cvt.s64.s8 %r217,%r218; shl.b64 %r138,%r217,3; add.u64 %r280,%frame,360; mov.u32 %r22313825,[%value_in]; } .loc 1 261 27 ld.u64 %r152,[%frame+120]; .loc 1 262 24 ld.u64 %r154,[%frame]; .loc 1 263 34 ld.u64 %r148,[%frame+240]; .loc 1 248 19 ld.u64 %r46,[%r157]; .loc 1 246 6 setp.eq.u32 %r227,%r160,0; @ ! 
%r227 bra $L72; add.u64 %r281,%frame,368; .loc 1 294 11 cvt.u16.u32 %r293,%r158; mov.u32 %r294,1; .loc 1 310 17 mov.u64 %r298,0; .loc 1 324 33 add.u64 %r299,%frame,240; bra $L42; $L72: .loc 1 248 42 add.u64 %r44,%r113,-1; .loc 1 248 12 add.u64 %r94,%r46,%r44; .loc 1 249 32 cvt.s64.s32 %r228,%r104; .loc 1 249 13 mad.lo.u64 %r95,%r228,%r44,%r58; .loc 1 261 13 neg.s64 %r55,%r152; .loc 1 262 14 neg.s64 %r57,%r154; add.u64 %r281,%frame,368; .loc 1 254 11 cvt.u16.u32 %r284,%r158; mov.u32 %r285,1; .loc 1 270 17 mov.u64 %r289,0; .loc 1 284 33 add.u64 %r290,%frame,240; $L48: ld.u64 %r128,[%frame+360]; mov.u64 %r149,%r128; mov.u32 %r74,%r223; $L46: .loc 1 254 11 ld.s8 %r231,[%r95]; cvt.u16.u32 %r230,%r231; setp.eq.u16 %r232,%r230,0; @ %r232 bra $L43; ld.s8 %r234,[%r94]; cvt.u16.u32 %r233,%r234; setp.ne.u16 %r236,%r233,%r284; @ %r236 bra $L43; setp.eq.u32 %r237,%r74,0; selp.u64 %r149,%r128,%r149,%r237; add.u64 %r136,%frame,248; mov.u64 %r135,%r281; add.u64 %r133,%r79,%r138; $L45: .loc 1 257 37 sub.u64 %r240,%r148,%r149; .loc 1 257 25 st.u64 [%r131],%r240; .loc 1 256 5 add.u64 %r131,%r131,%r43; setp.eq.u64 %r241,%r133,%r136; @ %r241 bra $L33; .loc 1 257 33 ld.u64 %r148,[%r136]; .loc 1 257 44 ld.u64 %r149,[%r135]; add.u64 %r136,%r136,8; add.u64 %r135,%r135,8; bra $L45; $L43: .loc 1 261 13 add.u64 %r94,%r94,%r55; .loc 1 262 14 add.u64 %r95,%r95,%r57; .loc 1 263 14 add.u64 %r149,%r149,1; mov.u32 %r74,%r285; .loc 1 263 6 setp.ne.u64 %r242,%r149,%r148; @ %r242 bra $L46; st.u64 [%frame+360],%r148; mov.u64 %r124,%r281; mov.u64 %r67,%r154; mov.u64 %r70,%r148; mov.u64 %r65,%r152; mov.u64 %r123,8; .loc 1 265 6 mov.u64 %r97,0; $L47: .loc 1 270 17 st.u64 [%r124+-8],%r289; .loc 1 273 39 mul.lo.u64 %r60,%r65,%r70; .loc 1 274 28 mul.lo.u64 %r62,%r70,%r67; mov.u64 %r102,%r97; .loc 1 275 9 add.u64 %r97,%r97,1; .loc 1 276 11 setp.eq.u64 %r245,%r102,%r139; @ %r245 bra $L33; .loc 1 280 13 ld.u64 %r246,[%r124]; add.u64 %r64,%r246,1; st.u64 [%r124],%r64; .loc 1 281 24 add.u64 %r248,%r282,%r123; ld.u64 %r65,[%r248]; .loc 1 281 10 sub.u64 %r249,%r60,%r65; add.u64 %r94,%r94,%r249; .loc 1 282 21 add.u64 %r250,%frame,%r123; ld.u64 %r67,[%r250]; .loc 1 282 11 sub.u64 %r251,%r67,%r62; add.u64 %r95,%r95,%r251; .loc 1 284 33 add.u64 %r253,%r290,%r123; ld.u64 %r70,[%r253]; .loc 1 284 6 add.u64 %r124,%r124,8; add.u64 %r123,%r123,8; setp.eq.u64 %r254,%r64,%r70; @ %r254 bra $L47; bra $L48; $L42: ld.u64 %r52,[%frame+360]; .loc 1 265 6 mov.u64 %r143,%r52; mov.u32 %r56,%r160; $L52: .loc 1 294 11 ld.s8 %r256,[%r58]; cvt.u16.u32 %r255,%r256; setp.eq.u16 %r257,%r255,0; @ %r257 bra $L49; ld.s8 %r259,[%r46]; cvt.u16.u32 %r258,%r259; setp.ne.u16 %r261,%r258,%r293; @ %r261 bra $L49; setp.eq.u32 %r262,%r56,0; selp.u64 %r143,%r52,%r143,%r262; mov.u64 %r118,%r281; add.u64 %r69,%r280,%r138; $L51: .loc 1 297 36 add.u64 %r265,%r143,1; .loc 1 297 25 st.u64 [%r131],%r265; .loc 1 296 5 add.u64 %r131,%r131,%r43; setp.eq.u64 %r266,%r69,%r118; @ %r266 bra $L33; .loc 1 297 32 ld.u64 %r143,[%r118]; add.u64 %r118,%r118,8; bra $L51; $L49: .loc 1 301 13 add.u64 %r46,%r46,%r152; .loc 1 302 14 add.u64 %r58,%r58,%r154; .loc 1 303 14 add.u64 %r143,%r143,1; mov.u32 %r56,%r294; .loc 1 303 6 setp.ne.u64 %r267,%r143,%r148; @ %r267 bra $L52; st.u64 [%frame+360],%r148; mov.u64 %r150,%r281; mov.u64 %r91,%r154; mov.u64 %r93,%r148; mov.u64 %r89,%r152; mov.u64 %r147,8; .loc 1 305 6 mov.u64 %r98,0; $L53: .loc 1 310 17 st.u64 [%r150+-8],%r298; .loc 1 313 39 mul.lo.u64 %r83,%r89,%r93; .loc 1 314 28 mul.lo.u64 %r86,%r93,%r91; mov.u64 %r116,%r98; .loc 1 315 9 add.u64 %r98,%r98,1; 
.loc 1 316 11 setp.eq.u64 %r270,%r116,%r139; @ %r270 bra $L33; .loc 1 320 13 ld.u64 %r271,[%r150]; add.u64 %r88,%r271,1; st.u64 [%r150],%r88; .loc 1 321 23 add.u64 %r273,%r282,%r147; ld.u64 %r89,[%r273]; .loc 1 321 10 sub.u64 %r274,%r89,%r83; add.u64 %r46,%r46,%r274; .loc 1 322 21 add.u64 %r275,%frame,%r147; ld.u64 %r91,[%r275]; .loc 1 322 11 sub.u64 %r276,%r91,%r86; add.u64 %r58,%r58,%r276; .loc 1 324 33 add.u64 %r278,%r299,%r147; ld.u64 %r93,[%r278]; .loc 1 324 6 add.u64 %r150,%r150,8; add.u64 %r147,%r147,8; setp.eq.u64 %r279,%r88,%r93; @ %r279 bra $L53; bra $L42; $L33:_gfortran_sfindloc0_i1 .visible .func _gfortran_sfindloc0_i1predar2; mov.u64 %r38,%ar3; mov.u32 %r39,%ar4; .loc 1 345 6 setp.eq.u64 %r40,%r38,0; @ %r40 bra $L74; .loc 1 345 20 ld.u32 %r41,[%r38]; setp.eq.u32 %r42,%r41,0; @ %r42 bra $L75; $L74: .loc 1 347 7 {call _gfortran_findloc0_348 7 bra $L73; $L75: .loc 1 351 8 ld.s8 %r30,[%r36+28]; .loc 1 353 6 setp.gt.s64 %r48,%r30,0; @ %r48 bra $L77; .loc 1 354 5 cvta.const.u64 %r50,$LC0; mov.u64 %r49,0; {_gfortrani_internal_error77: .loc 1 356 15 ld.u64 %r24,[%r35]; .loc 1 356 6 setp.ne.u64 %r51,%r24,0; @ %r51 bra $L78; .loc 1 358 7 st.u64 [%r35+48],%r24; add.u64 %r53,%r30,-1; st.u64 [%r35+56],%r53; mov.u64 %r54,1; st.u64 [%r35+40],%r54; .loc 1 359 28 cvt.u32.u64 %r55,%r54; st.u8 [%r35+28],%r55; .loc 1 360 24 st.u64 [%r35+8],%r24; .loc 1 361 29 mov.u64 call (%value_in),_gfortrani_xmallocarray61 27 st.u64 [%r35],%r24; bra $L79; $L78: .loc 1 363 12 cvta.global.u64 %r63,_gfortrani_compile_options; .loc 1 363 11 ld.u32 %r64,[%r63+36]; setp.eq.u32 %r65,%r64,0; @ %r65 bra $L79; .loc 1 365 8 cvta.const.u64 %r68,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 370 8 ld.u64 %r24,[%r35]; $L79: ld.u64 %r69,[%r35+40]; shl.b64 %r34,%r69,3; mov.u64 %r29,%r24; .loc 1 371 10 mov.u64 %r32,0; .loc 1 372 23 mov.u64 %r70,%r32; $L80: st.u64 [%r29],%r70; .loc 1 371 24 add.u64 %r32,%r32,1; .loc 1 371 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r71,%r30,%r32; @ %r71 bra $L80; $L73: .loc 1 373 1 ret; } findloc0_i2.o/_gfortran_findloc0_i2 .visible .func _gfortran_findloc0_i2); .file 1 "../../../libgfortran/generated/findloc0_i2.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc0_i2 .visible .func _gfortran_mfindloc0_i2_gfortran_sfindloc0_i2 .visible .func _gfortran_sfindloc0_i2); // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_internal_error .extern .func _gfortrani_internal_error8] = {70,73,78,68,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_findloc0_i2 .visible .func _gfortran_findloc0_i2stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[3688pred.reg .u64 %r140; .reg .u32 %r141; .reg .predpredu64 %r166; .reg .u64 %r168; .reg .predpred %r174; .reg .predpredpred %r191; .reg .u16 %r192; .reg .u16 %r193; .reg .pred64u32u64 %r221; mov.u64 %r120,%ar0; mov.u64 %r121,%ar1; mov.u32 %r122,%ar2; mov.u32 %r123,%ar3; .loc 1 51 10 ld.s8 %r22,[%r121+28]; .loc 1 51 8 cvt.u32.u32 %r124,%r22; cvt.s64.s8 %r80,%r124; .loc 1 52 6 setp.gt.s64 %r125,%r80,0; @ %r125 bra $L2; .loc 1 53 5 
cvta.const.u64 %r126,$LCstack; call _gfortran_runtime_errorr23,[%r120]; .loc 1 55 6 setp.ne.u64 %r128,%r23,0; @ %r128 bra $L3; .loc 1 57 7 st.u64 [%r120+48],%r23; add.u64 %r130,%r80,-1; st.u64 [%r120+56],%r130; mov.u64 %r131,1; st.u64 [%r120+40],%r131; .loc 1 58 28 cvt.u32.u64 %r132,%r131; st.u8 [%r120+28],%r132; .loc 1 59 24 st.u64 [%r120+8],%r23; .loc 1 60 29 mov.u64 %r135135; call (%value_in),_gfortrani_xmallocarray138,[%value_in]; } mov.u64 %r23,%r138; .loc 1 60 27 st.u64 [%r120],%r23; bra $L4; $L3: .loc 1 64 11 cvta.global.u64 %r140,_gfortrani_compile_options; .loc 1 64 10 ld.u32 %r141,[%r140+36]; setp.eq.u32 %r142,%r141,0; @ %r142 bra $L4; .loc 1 6gfortrani_bounds_iforeach_return.loc 1 70 8 ld.u64 %r23,[%r120]; $L4: ld.u64 %r146,[%r120+40]; shl.b64 %r118,%r146,3; mov.u64 %r102,%r23; .loc 1 69 11 mov.u64 %r116,%r102; .loc 1 73 10 mov.u64 %r89,0; .loc 1 74 23 mov.u64 %r147,%r89; $L5: st.u64 [%r116],%r147; mov.u64 %r103,%r89; .loc 1 73 26 add.u64 %r89,%r103,1; .loc 1 73 3 add.u64 %r116,%r116,%r118; setp.ne.u64 %r148,%r80,%r89; @ %r148 bra $L5; add.u64 %r39,%r121,40; mov.u64 %r74,%frame; .loc 1 76 6 mov.u64 %r87,1; .loc 1 77 10 mov.u64 %r101,0; add.u64 %r211,%frame,120; bra $L7; $L20: mov.u64 %r101,%r88; $L7: .loc 1 79 18 ld.u64 %r149,[%r39]; st.u64 [%r74],%r149; .loc 1 80 19 ld.u64 %r151,[%r39+16]; add.u64 %r150,%r151,1; ld.u64 %r152,[%r39+8]; sub.u64 %r33,%r150,%r152; .loc 1 80 17 shl.b64 %r154,%r101,3; add.u64 %r155,%r211,%r154; st.u64 [%r155],%r33; .loc 1 81 10 mul.lo.u64 %r87,%r87,%r33; .loc 1 82 10 setp.le.s64 %r156,%r33,0; @ %r156 bra $L1; .loc 1 77 26 add.u64 %r88,%r101,1; .loc 1 77 3 add.u64 %r39,%r39,24; add.u64 %r74,%r74,8; setp.ne.u64 %r157,%r103,%r101; @ %r157 bra $L20; .loc 1 87 16 cvt.u32.u32 %r159,%r22; cvt.s64.s8 %r158,%r159; shl.b64 %r95,%r158,3; add.u64 %r212,%frame,240; mov.u32 %r164166,[%value_in]; } .loc 1 104 23 ld.u64 %r113,[%frame]; .loc 1 104 13 add.u64 %r168,%r113,%r113; .loc 1 105 34 ld.u64 %r109,[%frame+120]; .loc 1 91 19 ld.u64 %r45,[%r121]; .loc 1 89 6 setp.eq.u32 %r169,%r123,0; @ ! 
%r169 bra $L32; add.u64 %r213,%frame,248; .loc 1 134 11 cvt.u16.u32 %r193,%r122; mov.u32 %r218,1; .loc 1 149 17 mov.u64 %r221,0; bra $L8; $L32: .loc 1 91 37 add.u64 %r170,%r87,-1; .loc 1 91 31 add.u64 %r171,%r170,%r170; .loc 1 91 12 add.u64 %r68,%r45,%r171; .loc 1 104 13 neg.s64 %r44,%r168; add.u64 %r213,%frame,248; .loc 1 97 11 cvt.u16.u32 %r173,%r122; mov.u32 %r214,1; .loc 1 112 17 mov.u64 %r217,0; $L14: ld.u64 %r38,[%frame+240]; mov.u64 %r110,%r38; mov.u32 %r98,%r164; $L12: .loc 1 97 11 ld.u16 %r172,[%r68]; setp.ne.u16 %r174,%r172,%r173; @ %r174 bra $L9; setp.eq.u32 %r175,%r98,0; selp.u64 %r110,%r38,%r110,%r175; add.u64 %r93,%frame,128; mov.u64 %r92,%r213; add.u64 %r86,%r211,%r95; $L11: .loc 1 100 37 sub.u64 %r179,%r109,%r110; .loc 1 100 25 st.u64 [%r102],%r179; .loc 1 99 5 add.u64 %r102,%r102,%r118; setp.eq.u64 %r180,%r86,%r93; @ %r180 bra $L1; .loc 1 100 33 ld.u64 %r109,[%r93]; .loc 1 100 44 ld.u64 %r110,[%r92]; add.u64 %r93,%r93,8; add.u64 %r92,%r92,8; bra $L11; $L9: .loc 1 104 13 add.u64 %r68,%r68,%r44; .loc 1 105 14 add.u64 %r110,%r110,1; mov.u32 %r98,%r214; .loc 1 105 6 setp.ne.u64 %r181,%r110,%r109; @ %r181 bra $L12; st.u64 [%frame+240],%r109; mov.u64 %r111,%r213; mov.u64 %r53,%r109; mov.u64 %r51,%r113; .loc 1 107 6 mov.u64 %r73,0; $L13: .loc 1 112 17 st.u64 [%r111+-8],%r217; .loc 1 115 27 mul.lo.u64 %r48,%r51,%r53; mov.u64 %r79,%r73; .loc 1 116 9 add.u64 %r73,%r73,1; .loc 1 117 11 setp.eq.u64 %r184,%r79,%r103; @ %r184 bra $L1; .loc 1 121 13 ld.u64 %r185,[%r111]; add.u64 %r50,%r185,1; st.u64 [%r111],%r50; shl.b64 %r106,%r73,3; .loc 1 122 20 add.u64 %r186,%frame,%r106; ld.u64 %r51,[%r186]; .loc 1 122 10 sub.u64 %r187,%r48,%r51; add.u64 %r188,%r187,%r187; add.u64 %r68,%r68,%r188; .loc 1 124 33 add.u64 %r190,%r211,%r106; ld.u64 %r53,[%r190]; .loc 1 124 6 add.u64 %r111,%r111,8; setp.eq.u64 %r191,%r50,%r53; @ %r191 bra $L13; bra $L14; $L8: ld.u64 %r96,[%frame+240]; .loc 1 107 6 mov.u64 %r105,%r96; mov.u32 %r59,%r123; $L18: .loc 1 134 11 ld.u16 %r192,[%r45]; setp.ne.u16 %r194,%r192,%r193; @ %r194 bra $L15; setp.eq.u32 %r195,%r59,0; selp.u64 %r105,%r96,%r105,%r195; mov.u64 %r72,%r213; add.u64 %r27,%r212,%r95; $L17: .loc 1 137 36 add.u64 %r198,%r105,1; .loc 1 137 25 st.u64 [%r102],%r198; .loc 1 136 5 add.u64 %r102,%r102,%r118; setp.eq.u64 %r199,%r27,%r72; @ %r199 bra $L1; .loc 1 137 32 ld.u64 %r105,[%r72]; add.u64 %r72,%r72,8; bra $L17; $L15: .loc 1 141 13 add.u64 %r45,%r45,%r168; .loc 1 142 14 add.u64 %r105,%r105,1; mov.u32 %r59,%r218; .loc 1 142 6 setp.ne.u64 %r200,%r105,%r109; @ %r200 bra $L18; st.u64 [%frame+240],%r109; mov.u64 %r71,%r213; mov.u64 %r67,%r109; mov.u64 %r65,%r113; .loc 1 144 6 mov.u64 %r75,0; $L19: .loc 1 149 17 st.u64 [%r71+-8],%r221; .loc 1 152 27 mul.lo.u64 %r62,%r65,%r67; mov.u64 %r78,%r75; .loc 1 153 9 add.u64 %r75,%r75,1; .loc 1 154 11 setp.eq.u64 %r203,%r78,%r103; @ %r203 bra $L1; .loc 1 158 13 ld.u64 %r204,[%r71]; add.u64 %r64,%r204,1; st.u64 [%r71],%r64; shl.b64 %r76,%r75,3; .loc 1 159 20 add.u64 %r205,%frame,%r76; ld.u64 %r65,[%r205]; .loc 1 159 10 sub.u64 %r206,%r65,%r62; add.u64 %r207,%r206,%r206; add.u64 %r45,%r45,%r207; .loc 1 161 33 add.u64 %r209,%r211,%r76; ld.u64 %r67,[%r209]; .loc 1 161 6 add.u64 %r71,%r71,8; setp.eq.u64 %r210,%r64,%r67; @ %r210 bra $L19; bra $L8; $L_gfortran_mfindloc0_i2 .visible .func _gfortran_mfindloc0_i2stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[480pred %r166; .reg .u64pred %r184; .reg .u64 %r187; .reg .u64 %r190; .reg .u32u64 %r213; .reg .u64pred %r238; .reg .u16 %r239; .reg .predpred %r246; .reg .predpred %r271; 
.reg .pred %r272; .reg .predmov.u64 %r160,%ar0; mov.u64 %r161,%ar1; mov.u32 %r162,%ar2; mov.u64 %r163,%ar3; mov.u32 %r164,%ar4; .loc 1 190 10 ld.s8 %r22,[%r161+28]; .loc 1 190 8 cvt.u32.u32 %r165,%r22; cvt.s64.s8 %r104,%r165; .loc 1 191 6 setp.gt.s64 %r166,%r104,0; @ %r166 bra $L34; .loc 1 192 5 cvta.const.u64 %r16167_gfortran_runtime_error34: .loc 1 194 6 ld.u64 %r169,[%r160; .loc 1 196 7 st.u64 [%r160+48],%r169; add.u64 %r172,%r104,-1; st.u64 [%r160+56],%r172; mov.u64 %r173,1; st.u64 [%r160+40],%r173; .loc 1 197 28 cvt.u32.u64 %r174,%r173; st.u8 [%r160+28],%r174; .loc 1 198 24 st.u64 [%r160+8],%r169; .loc 1 199 29 mov.u64 %r177call (%value_in),_gfortrani_xmallocarray,(%out_arg1,%out_arg2); ld.param.u64 %r180,[%value_in]; } .loc 1 199 27 st.u64 [%r160],%r180; bra $L36; $L35: .loc 1 203 11 cvta.global.u64 %r182,_gfortrani_compile_options; .loc 1 203 10 ld.u32 %r183,[%r182+36]; setp.eq.u32 %r184,%r183,0; @ %r184 bra $L36; .loc 1 205 4 cvta.const.u64 %r187,$LC1; {161187; call _gfortrani_bounds_iforeach_return.loc 1 207 4 cvta.const.u64 %r19016r161190187; call _gfortrani_bounds_equal_extents$L36: .loc 1 212 13 ld.u32 %r105,[%r163+16]; .loc 1 214 9 ld.u64 %r60,[%r163]; .loc 1 216 53 add.u32 %r192,%r105,-4; and.b32 %r193,%r192,-5; set.u32.eq.u32 %r195,%r193,0; neg.s32 %r196,%r195; .loc 1 216 22 add.u32 %r197,%r105,-1; set.u32.le.u32 %r199,%r197,1; neg.s32 %r200,%r199; .loc 1 216 6 cvt.u16.u32 %r202,%r196; cvt.u16.u32 %r203,%r200; or.b16 %r201,%r202,%r203;%r206 bra $L37; .loc 1 218 7 setp.ne.u32 %r207,%r105,16; @ %r207 bra $L38; $L37: ld.u64 %r208,[%r160+40]; shl.b64 %r98,%r208,3; ld.u64 %r133,[%r160]; .loc 1 226 8 mov.u64 %r84,%r133; .loc 1 229 10 mov.u64 %r118,0; .loc 1 230 23 mov.u64 %r211,%r118; bra $L39; $L38: .loc 1 223 5 cvta.const.u64 %r210,$LC3; mov.u64 %r209,210; call _gfortrani_internal_error39: .loc 1 230 23 st.u64 [%r84],%r211; mov.u64 %r134,%r118; .loc 1 229 26 add.u64 %r118,%r134,1; .loc 1 229 3 add.u64 %r84,%r84,%r98; setp.ne.u64 %r212,%r104,%r118; @ %r212 bra $L39; .loc 1 236 20 ld.u64 %r37,[%r163+16]; add.u64 %r143,%r161,40; add.u64 %r288,%frame,120; mov.u64 %r28,%r288; add.u64 %r80,%r163,40; mov.u64 %r82,%frame; add.u64 %r78,%frame,240; mov.u64 %r54,%r78; .loc 1 232 6 mov.u64 %r116,1; .loc 1 233 10 mov.u64 %r132,0; bra $L41; $L54: mov.u64 %r132,%r117; $L41: .loc 1 235 18 ld.u64 %r213,[%r143]; st.u64 [%r28],%r213; .loc 1 236 20 ld.u64 %r215,[%r80]; mul.lo.u64 %r214,%r215,%r37; .loc 1 236 18 st.u64 [%r82],%r214; .loc 1 237 19 ld.u64 %r217,[%r143+16]; add.u64 %r216,%r217,1; ld.u64 %r218,[%r143+8]; sub.u64 %r43,%r216,%r218; .loc 1 237 17 st.u64 [%r54],%r43; .loc 1 238 10 mul.lo.u64 %r116,%r116,%r43; .loc 1 239 10 setp.le.s64 %r219,%r43,0; @ %r219 bra $L33; .loc 1 233 26 add.u64 %r117,%r132,1; .loc 1 233 3 add.u64 %r143,%r143,24; add.u64 %r28,%r28,8; add.u64 %r80,%r80,24; add.u64 %r82,%r82,8; add.u64 %r54,%r54,8; setp.ne.u64 %r220,%r134,%r132; @ %r220 bra $L54; .loc 1 244 16 cvt.u32.u32 %r222,%r22; cvt.s64.s8 %r221,%r222; shl.b64 %r141,%r221,3; add.u64 %r286,%frame,360; mov.u32 %r227261 23 ld.u64 %r155,[%frame+120]; .loc 1 261 13 add.u64 %r231,%r155,%r155; .loc 1 262 24 ld.u64 %r158,[%frame]; .loc 1 263 34 ld.u64 %r151,[%frame+240]; .loc 1 248 19 ld.u64 %r47,[%r161]; .loc 1 246 6 setp.eq.u32 %r232,%r164,0; @ ! 
%r232 bra $L72; add.u64 %r287,%frame,368; .loc 1 294 11 cvt.u16.u32 %r299,%r162; mov.u32 %r300,1; .loc 1 310 17 mov.u64 %r304,0; .loc 1 324 33 add.u64 %r305,%frame,240; bra $L42; $L72: .loc 1 248 37 add.u64 %r44,%r116,-1; .loc 1 248 31 add.u64 %r233,%r44,%r44; .loc 1 248 12 add.u64 %r96,%r47,%r233; .loc 1 249 32 cvt.s64.s32 %r234,%r105; .loc 1 249 13 mad.lo.u64 %r97,%r234,%r44,%r60; .loc 1 261 13 neg.s64 %r58,%r231; .loc 1 262 14 neg.s64 %r59,%r158; add.u64 %r287,%frame,368; .loc 1 254 11 cvt.u16.u32 %r290,%r162; mov.u32 %r291,1; .loc 1 270 17 mov.u64 %r295,0; .loc 1 284 33 add.u64 %r296,%frame,240; $L48: ld.u64 %r130,[%frame+360]; mov.u64 %r152,%r130; mov.u32 %r33,%r227; $L46: .loc 1 254 11 ld.s8 %r237,[%r97]; cvt.u16.u32 %r236,%r237; setp.eq.u16 %r238,%r236,0; @ %r238 bra $L43; ld.u16 %r239,[%r96]; setp.ne.u16 %r241,%r239,%r290; @ %r241 bra $L43; setp.eq.u32 %r242,%r33,0; selp.u64 %r152,%r130,%r152,%r242; add.u64 %r139,%frame,248; mov.u64 %r138,%r287; add.u64 %r136,%r78,%r141; $L45: .loc 1 257 37 sub.u64 %r245,%r151,%r152; .loc 1 257 25 st.u64 [%r133],%r245; .loc 1 256 5 add.u64 %r133,%r133,%r98; setp.eq.u64 %r246,%r136,%r139; @ %r246 bra $L33; .loc 1 257 33 ld.u64 %r151,[%r139]; .loc 1 257 44 ld.u64 %r152,[%r138]; add.u64 %r139,%r139,8; add.u64 %r138,%r138,8; bra $L45; $L43: .loc 1 261 13 add.u64 %r96,%r96,%r58; .loc 1 262 14 add.u64 %r97,%r97,%r59; .loc 1 263 14 add.u64 %r152,%r152,1; mov.u32 %r33,%r291; .loc 1 263 6 setp.ne.u64 %r247,%r152,%r151; @ %r247 bra $L46; st.u64 [%frame+360],%r151; mov.u64 %r125,%r287; mov.u64 %r69,%r158; mov.u64 %r72,%r151; mov.u64 %r67,%r155; mov.u64 %r124,8; .loc 1 265 6 mov.u64 %r100,0; $L47: .loc 1 270 17 st.u64 [%r125+-8],%r295; .loc 1 273 27 mul.lo.u64 %r62,%r67,%r72; .loc 1 274 28 mul.lo.u64 %r64,%r72,%r69; mov.u64 %r142,%r100; .loc 1 275 9 add.u64 %r100,%r100,1; .loc 1 276 11 setp.eq.u64 %r250,%r142,%r134; @ %r250 bra $L33; .loc 1 280 13 ld.u64 %r251,[%r125]; add.u64 %r66,%r251,1; st.u64 [%r125],%r66; .loc 1 281 20 add.u64 %r253,%r288,%r124; ld.u64 %r67,[%r253]; .loc 1 281 10 sub.u64 %r254,%r62,%r67; add.u64 %r255,%r254,%r254; add.u64 %r96,%r96,%r255; .loc 1 282 21 add.u64 %r256,%frame,%r124; ld.u64 %r69,[%r256]; .loc 1 282 11 sub.u64 %r257,%r69,%r64; add.u64 %r97,%r97,%r257; .loc 1 284 33 add.u64 %r259,%r296,%r124; ld.u64 %r72,[%r259]; .loc 1 284 6 add.u64 %r125,%r125,8; add.u64 %r124,%r124,8; setp.eq.u64 %r260,%r66,%r72; @ %r260 bra $L47; bra $L48; $L42: ld.u64 %r55,[%frame+360]; .loc 1 265 6 mov.u64 %r146,%r55; mov.u32 %r126,%r164; $L52: .loc 1 294 11 ld.s8 %r262,[%r60]; cvt.u16.u32 %r261,%r262; setp.eq.u16 %r263,%r261,0; @ %r263 bra $L49; ld.u16 %r264,[%r47]; setp.ne.u16 %r266,%r264,%r299; @ %r266 bra $L49; setp.eq.u32 %r267,%r126,0; selp.u64 %r146,%r55,%r146,%r267; mov.u64 %r119,%r287; add.u64 %r77,%r286,%r141; $L51: .loc 1 297 36 add.u64 %r270,%r146,1; .loc 1 297 25 st.u64 [%r133],%r270; .loc 1 296 5 add.u64 %r133,%r133,%r98; setp.eq.u64 %r271,%r77,%r119; @ %r271 bra $L33; .loc 1 297 32 ld.u64 %r146,[%r119]; add.u64 %r119,%r119,8; bra $L51; $L49: .loc 1 301 13 add.u64 %r47,%r47,%r231; .loc 1 302 14 add.u64 %r60,%r60,%r158; .loc 1 303 14 add.u64 %r146,%r146,1; mov.u32 %r126,%r300; .loc 1 303 6 setp.ne.u64 %r272,%r146,%r151; @ %r272 bra $L52; st.u64 [%frame+360],%r151; mov.u64 %r153,%r287; mov.u64 %r93,%r158; mov.u64 %r95,%r151; mov.u64 %r91,%r155; mov.u64 %r150,8; .loc 1 305 6 mov.u64 %r101,0; $L53: .loc 1 310 17 st.u64 [%r153+-8],%r304; .loc 1 313 27 mul.lo.u64 %r86,%r91,%r95; .loc 1 314 28 mul.lo.u64 %r88,%r95,%r93; mov.u64 %r103,%r101; .loc 1 
315 9 add.u64 %r101,%r101,1; .loc 1 316 11 setp.eq.u64 %r275,%r103,%r134; @ %r275 bra $L33; .loc 1 320 13 ld.u64 %r276,[%r153]; add.u64 %r90,%r276,1; st.u64 [%r153],%r90; .loc 1 321 20 add.u64 %r278,%r288,%r150; ld.u64 %r91,[%r278]; .loc 1 321 10 sub.u64 %r279,%r91,%r86; add.u64 %r280,%r279,%r279; add.u64 %r47,%r47,%r280; .loc 1 322 21 add.u64 %r281,%frame,%r150; ld.u64 %r93,[%r281]; .loc 1 322 11 sub.u64 %r282,%r93,%r88; add.u64 %r60,%r60,%r282; .loc 1 324 33 add.u64 %r284,%r305,%r150; ld.u64 %r95,[%r284]; .loc 1 324 6 add.u64 %r153,%r153,8; add.u64 %r150,%r150,8; setp.eq.u64 %r285,%r90,%r95; @ %r285 bra $L53; bra $L42; $L33:_gfortran_sfindloc0_i2 .visible .func _gfortran_sfindloc0_i2predar2; mov.u64 %r38,%ar3; mov.u32 %r39,%ar4; .loc 1 345 6 setp.eq.u64 %r40,%r38,0; @ %r40 bra $L74; .loc 1 345 20 ld.u32 %r41,[%r38]; setp.eq.u32 %r42,%r41,0; @ %r42 bra $L75; $L74: .loc 1 347 7 {call _gfortran_findloc0_348 7 bra $L73; $L75: .loc 1 351 8 ld.s8 %r30,[%r36+28]; .loc 1 353 6 setp.gt.s64 %r48,%r30,0; @ %r48 bra $L77; .loc 1 354 5 cvta.const.u64 %r50,$LC0; mov.u64 %r49,0; {_gfortrani_internal_error77: .loc 1 356 15 ld.u64 %r24,[%r35]; .loc 1 356 6 setp.ne.u64 %r51,%r24,0; @ %r51 bra $L78; .loc 1 358 7 st.u64 [%r35+48],%r24; add.u64 %r53,%r30,-1; st.u64 [%r35+56],%r53; mov.u64 %r54,1; st.u64 [%r35+40],%r54; .loc 1 359 28 cvt.u32.u64 %r55,%r54; st.u8 [%r35+28],%r55; .loc 1 360 24 st.u64 [%r35+8],%r24; .loc 1 361 29 mov.u64 call (%value_in),_gfortrani_xmallocarray61 27 st.u64 [%r35],%r24; bra $L79; $L78: .loc 1 363 12 cvta.global.u64 %r63,_gfortrani_compile_options; .loc 1 363 11 ld.u32 %r64,[%r63+36]; setp.eq.u32 %r65,%r64,0; @ %r65 bra $L79; .loc 1 365 8 cvta.const.u64 %r68,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 370 8 ld.u64 %r24,[%r35]; $L79: ld.u64 %r69,[%r35+40]; shl.b64 %r34,%r69,3; mov.u64 %r29,%r24; .loc 1 371 10 mov.u64 %r32,0; .loc 1 372 23 mov.u64 %r70,%r32; $L80: st.u64 [%r29],%r70; .loc 1 371 24 add.u64 %r32,%r32,1; .loc 1 371 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r71,%r30,%r32; @ %r71 bra $L80; $L73: .loc 1 373 1 ret; } findloc0_i4.o/_gfortran_findloc0_i4 .visible .func _gfortran_findloc0_i4); .file 1 "../../../libgfortran/generated/findloc0_i4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc0_i4 .visible .func _gfortran_mfindloc0_i4_gfortran_sfindloc0_i4 .visible .func _gfortran_sfindloc0_i4); // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_internal_error .extern .func _gfortrani_internal_error8] = {70,73,78,68,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_findloc0_i4 .visible .func _gfortran_findloc0_i4stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[3688pred.reg .u64 %r140; .reg .u32 %r141; .reg .predpredu64predu64pred %r192; .reg .u64 %r195; .reg .predpredu32 %r211; .reg .u64 %r215; .reg .u32 %r216; .reg .u64 %r220; mov.u64 %r120,%ar0; mov.u64 %r121,%ar1; mov.u32 %r122,%ar2; mov.u32 %r123,%ar3; .loc 1 51 10 ld.s8 %r22,[%r121+28]; .loc 1 51 8 cvt.u32.u32 %r124,%r22; cvt.s64.s8 %r80,%r124; .loc 1 52 6 setp.gt.s64 %r125,%r80,0; 
@ %r125 bra $L2; .loc 1 53 5 cvta.const.u64 %r126,$LCstack; call _gfortran_runtime_errorr23,[%r120]; .loc 1 55 6 setp.ne.u64 %r128,%r23,0; @ %r128 bra $L3; .loc 1 57 7 st.u64 [%r120+48],%r23; add.u64 %r130,%r80,-1; st.u64 [%r120+56],%r130; mov.u64 %r131,1; st.u64 [%r120+40],%r131; .loc 1 58 28 cvt.u32.u64 %r132,%r131; st.u8 [%r120+28],%r132; .loc 1 59 24 st.u64 [%r120+8],%r23; .loc 1 60 29 mov.u64 %r135135; call (%value_in),_gfortrani_xmallocarray138,[%value_in]; } mov.u64 %r23,%r138; .loc 1 60 27 st.u64 [%r120],%r23; bra $L4; $L3: .loc 1 64 11 cvta.global.u64 %r140,_gfortrani_compile_options; .loc 1 64 10 ld.u32 %r141,[%r140+36]; setp.eq.u32 %r142,%r141,0; @ %r142 bra $L4; .loc 1 6gfortrani_bounds_iforeach_return.loc 1 70 8 ld.u64 %r23,[%r120]; $L4: ld.u64 %r146,[%r120+40]; shl.b64 %r118,%r146,3; mov.u64 %r102,%r23; .loc 1 69 11 mov.u64 %r116,%r102; .loc 1 73 10 mov.u64 %r89,0; .loc 1 74 23 mov.u64 %r147,%r89; $L5: st.u64 [%r116],%r147; mov.u64 %r103,%r89; .loc 1 73 26 add.u64 %r89,%r103,1; .loc 1 73 3 add.u64 %r116,%r116,%r118; setp.ne.u64 %r148,%r80,%r89; @ %r148 bra $L5; add.u64 %r39,%r121,40; mov.u64 %r74,%frame; .loc 1 76 6 mov.u64 %r87,1; .loc 1 77 10 mov.u64 %r101,0; add.u64 %r208,%frame,120; bra $L7; $L20: mov.u64 %r101,%r88; $L7: .loc 1 79 18 ld.u64 %r149,[%r39]; st.u64 [%r74],%r149; .loc 1 80 19 ld.u64 %r151,[%r39+16]; add.u64 %r150,%r151,1; ld.u64 %r152,[%r39+8]; sub.u64 %r33,%r150,%r152; .loc 1 80 17 shl.b64 %r154,%r101,3; add.u64 %r155,%r208,%r154; st.u64 [%r155],%r33; .loc 1 81 10 mul.lo.u64 %r87,%r87,%r33; .loc 1 82 10 setp.le.s64 %r156,%r33,0; @ %r156 bra $L1; .loc 1 77 26 add.u64 %r88,%r101,1; .loc 1 77 3 add.u64 %r39,%r39,24; add.u64 %r74,%r74,8; setp.ne.u64 %r157,%r103,%r101; @ %r157 bra $L20; .loc 1 87 16 cvt.u32.u32 %r159,%r22; cvt.s64.s8 %r158,%r159; shl.b64 %r95,%r158,3; add.u64 %r209,%frame,240; mov.u32 %r164%r113,[%frame]; .loc 1 104 13 shl.b64 %r115,%r113,2; .loc 1 105 34 ld.u64 %r109,[%frame+120]; .loc 1 91 19 ld.u64 %r45,[%r121]; .loc 1 89 6 setp.eq.u32 %r168,%r123,0; @ ! 
%r168 bra $L32; add.u64 %r210,%frame,248; mov.u32 %r216,1; .loc 1 149 17 mov.u64 %r220,0; bra $L8; $L32: .loc 1 91 37 add.u64 %r169,%r87,-1; .loc 1 91 31 shl.b64 %r170,%r169,2; .loc 1 91 12 add.u64 %r68,%r45,%r170; .loc 1 104 13 neg.s64 %r44,%r115; add.u64 %r210,%frame,248; mov.u32 %r211,1; .loc 1 112 17 mov.u64 %r215,0; $L14: ld.u64 %r38,[%frame+240]; mov.u64 %r110,%r38; mov.u32 %r98,%r164; $L12: .loc 1 97 11 ld.u32 %r171,[%r68]; setp.ne.u32 %r172,%r171,%r122; @ %r172 bra $L9; setp.eq.u32 %r173,%r98,0; selp.u64 %r110,%r38,%r110,%r173; add.u64 %r93,%frame,128; mov.u64 %r92,%r210; add.u64 %r86,%r208,%r95; $L11: .loc 1 100 37 sub.u64 %r177,%r109,%r110; .loc 1 100 25 st.u64 [%r102],%r177; .loc 1 99 5 add.u64 %r102,%r102,%r118; setp.eq.u64 %r178,%r86,%r93; @ %r178 bra $L1; .loc 1 100 33 ld.u64 %r109,[%r93]; .loc 1 100 44 ld.u64 %r110,[%r92]; add.u64 %r93,%r93,8; add.u64 %r92,%r92,8; bra $L11; $L9: .loc 1 104 13 add.u64 %r68,%r68,%r44; .loc 1 105 14 add.u64 %r110,%r110,1; mov.u32 %r98,%r211; .loc 1 105 6 setp.ne.u64 %r179,%r110,%r109; @ %r179 bra $L12; st.u64 [%frame+240],%r109; mov.u64 %r111,%r210; mov.u64 %r53,%r109; mov.u64 %r51,%r113; .loc 1 107 6 mov.u64 %r73,0; $L13: .loc 1 112 17 st.u64 [%r111+-8],%r215; .loc 1 115 27 mul.lo.u64 %r48,%r51,%r53; mov.u64 %r79,%r73; .loc 1 116 9 add.u64 %r73,%r73,1; .loc 1 117 11 setp.eq.u64 %r182,%r79,%r103; @ %r182 bra $L1; .loc 1 121 13 ld.u64 %r183,[%r111]; add.u64 %r50,%r183,1; st.u64 [%r111],%r50; shl.b64 %r106,%r73,3; .loc 1 122 20 add.u64 %r184,%frame,%r106; ld.u64 %r51,[%r184]; .loc 1 122 10 sub.u64 %r185,%r48,%r51; shl.b64 %r186,%r185,2; add.u64 %r68,%r68,%r186; .loc 1 124 33 add.u64 %r188,%r208,%r106; ld.u64 %r53,[%r188]; .loc 1 124 6 add.u64 %r111,%r111,8; setp.eq.u64 %r189,%r50,%r53; @ %r189 bra $L13; bra $L14; $L8: ld.u64 %r96,[%frame+240]; .loc 1 107 6 mov.u64 %r105,%r96; mov.u32 %r59,%r123; $L18: .loc 1 134 11 ld.u32 %r190,[%r45]; setp.ne.u32 %r191,%r190,%r122; @ %r191 bra $L15; setp.eq.u32 %r192,%r59,0; selp.u64 %r105,%r96,%r105,%r192; mov.u64 %r72,%r210; add.u64 %r27,%r209,%r95; $L17: .loc 1 137 36 add.u64 %r195,%r105,1; .loc 1 137 25 st.u64 [%r102],%r195; .loc 1 136 5 add.u64 %r102,%r102,%r118; setp.eq.u64 %r196,%r27,%r72; @ %r196 bra $L1; .loc 1 137 32 ld.u64 %r105,[%r72]; add.u64 %r72,%r72,8; bra $L17; $L15: .loc 1 141 13 add.u64 %r45,%r45,%r115; .loc 1 142 14 add.u64 %r105,%r105,1; mov.u32 %r59,%r216; .loc 1 142 6 setp.ne.u64 %r197,%r105,%r109; @ %r197 bra $L18; st.u64 [%frame+240],%r109; mov.u64 %r71,%r210; mov.u64 %r67,%r109; mov.u64 %r65,%r113; .loc 1 144 6 mov.u64 %r75,0; $L19: .loc 1 149 17 st.u64 [%r71+-8],%r220; .loc 1 152 27 mul.lo.u64 %r62,%r65,%r67; mov.u64 %r78,%r75; .loc 1 153 9 add.u64 %r75,%r75,1; .loc 1 154 11 setp.eq.u64 %r200,%r78,%r103; @ %r200 bra $L1; .loc 1 158 13 ld.u64 %r201,[%r71]; add.u64 %r64,%r201,1; st.u64 [%r71],%r64; shl.b64 %r76,%r75,3; .loc 1 159 20 add.u64 %r202,%frame,%r76; ld.u64 %r65,[%r202]; .loc 1 159 10 sub.u64 %r203,%r65,%r62; shl.b64 %r204,%r203,2; add.u64 %r45,%r45,%r204; .loc 1 161 33 add.u64 %r206,%r208,%r76; ld.u64 %r67,[%r206]; .loc 1 161 6 add.u64 %r71,%r71,8; setp.eq.u64 %r207,%r64,%r67; @ %r207 bra $L19; bra $L8; $L_gfortran_mfindloc0_i4 .visible .func _gfortran_mfindloc0_i4stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[480pred %r166; .reg .u64pred %r184; .reg .u64 %r187; .reg .u64 %r190; .reg .u32u64 %r213; .reg .u64u32 %r227; .reg .u64 %r229; .reg .pred %r231; .reg .u64 %r232; .reg .u64 %r233; .reg .u16u64 %r255; .reg .u64 %r257; .reg .pred %r258; .reg .u16 %r259; .reg .u32 
%r260; .reg .predmov.u64 %r160,%ar0; mov.u64 %r161,%ar1; mov.u32 %r162,%ar2; mov.u64 %r163,%ar3; mov.u32 %r164,%ar4; .loc 1 190 10 ld.s8 %r22,[%r161+28]; .loc 1 190 8 cvt.u32.u32 %r165,%r22; cvt.s64.s8 %r104,%r165; .loc 1 191 6 setp.gt.s64 %r166,%r104,0; @ %r166 bra $L34; .loc 1 192 5 cvta.const.u64 %r16167_gfortran_runtime_error34: .loc 1 194 6 ld.u64 %r169,[%r160; .loc 1 196 7 st.u64 [%r160+48],%r169; add.u64 %r172,%r104,-1; st.u64 [%r160+56],%r172; mov.u64 %r173,1; st.u64 [%r160+40],%r173; .loc 1 197 28 cvt.u32.u64 %r174,%r173; st.u8 [%r160+28],%r174; .loc 1 198 24 st.u64 [%r160+8],%r169; .loc 1 199 29 mov.u64 %r177call (%value_in),_gfortrani_xmallocarray,(%out_arg1,%out_arg2); ld.param.u64 %r180,[%value_in]; } .loc 1 199 27 st.u64 [%r160],%r180; bra $L36; $L35: .loc 1 203 11 cvta.global.u64 %r182,_gfortrani_compile_options; .loc 1 203 10 ld.u32 %r183,[%r182+36]; setp.eq.u32 %r184,%r183,0; @ %r184 bra $L36; .loc 1 205 4 cvta.const.u64 %r187,$LC1; {161187; call _gfortrani_bounds_iforeach_return.loc 1 207 4 cvta.const.u64 %r19016r161190187; call _gfortrani_bounds_equal_extents$L36: .loc 1 212 13 ld.u32 %r105,[%r163+16]; .loc 1 214 9 ld.u64 %r60,[%r163]; .loc 1 216 53 add.u32 %r192,%r105,-4; and.b32 %r193,%r192,-5; set.u32.eq.u32 %r195,%r193,0; neg.s32 %r196,%r195; .loc 1 216 22 add.u32 %r197,%r105,-1; set.u32.le.u32 %r199,%r197,1; neg.s32 %r200,%r199; .loc 1 216 6 cvt.u16.u32 %r202,%r196; cvt.u16.u32 %r203,%r200; or.b16 %r201,%r202,%r203;%r206 bra $L37; .loc 1 218 7 setp.ne.u32 %r207,%r105,16; @ %r207 bra $L38; $L37: ld.u64 %r208,[%r160+40]; shl.b64 %r98,%r208,3; ld.u64 %r133,[%r160]; .loc 1 226 8 mov.u64 %r84,%r133; .loc 1 229 10 mov.u64 %r118,0; .loc 1 230 23 mov.u64 %r211,%r118; bra $L39; $L38: .loc 1 223 5 cvta.const.u64 %r210,$LC3; mov.u64 %r209,210; call _gfortrani_internal_error39: .loc 1 230 23 st.u64 [%r84],%r211; mov.u64 %r134,%r118; .loc 1 229 26 add.u64 %r118,%r134,1; .loc 1 229 3 add.u64 %r84,%r84,%r98; setp.ne.u64 %r212,%r104,%r118; @ %r212 bra $L39; .loc 1 236 20 ld.u64 %r37,[%r163+16]; add.u64 %r143,%r161,40; add.u64 %r285,%frame,120; mov.u64 %r28,%r285; add.u64 %r80,%r163,40; mov.u64 %r82,%frame; add.u64 %r78,%frame,240; mov.u64 %r54,%r78; .loc 1 232 6 mov.u64 %r116,1; .loc 1 233 10 mov.u64 %r132,0; bra $L41; $L54: mov.u64 %r132,%r117; $L41: .loc 1 235 18 ld.u64 %r213,[%r143]; st.u64 [%r28],%r213; .loc 1 236 20 ld.u64 %r215,[%r80]; mul.lo.u64 %r214,%r215,%r37; .loc 1 236 18 st.u64 [%r82],%r214; .loc 1 237 19 ld.u64 %r217,[%r143+16]; add.u64 %r216,%r217,1; ld.u64 %r218,[%r143+8]; sub.u64 %r43,%r216,%r218; .loc 1 237 17 st.u64 [%r54],%r43; .loc 1 238 10 mul.lo.u64 %r116,%r116,%r43; .loc 1 239 10 setp.le.s64 %r219,%r43,0; @ %r219 bra $L33; .loc 1 233 26 add.u64 %r117,%r132,1; .loc 1 233 3 add.u64 %r143,%r143,24; add.u64 %r28,%r28,8; add.u64 %r80,%r80,24; add.u64 %r82,%r82,8; add.u64 %r54,%r54,8; setp.ne.u64 %r220,%r134,%r132; @ %r220 bra $L54; .loc 1 244 16 cvt.u32.u32 %r222,%r22; cvt.s64.s8 %r221,%r222; shl.b64 %r141,%r221,3; add.u64 %r283,%frame,360; mov.u32 %r227227261 23 ld.u64 %r155,[%frame+120]; .loc 1 261 13 shl.b64 %r157,%r155,2; .loc 1 262 24 ld.u64 %r158,[%frame]; .loc 1 263 34 ld.u64 %r151,[%frame+240]; .loc 1 248 19 ld.u64 %r47,[%r161]; .loc 1 246 6 setp.eq.u32 %r231,%r164,0; @ ! 
%r231 bra $L72; add.u64 %r284,%frame,368; mov.u32 %r293,1; .loc 1 324 33 add.u64 %r296,%frame,240; bra $L42; $L72: .loc 1 248 37 add.u64 %r44,%r116,-1; .loc 1 248 31 shl.b64 %r232,%r44,2; .loc 1 248 12 add.u64 %r96,%r47,%r232; .loc 1 249 32 cvt.s64.s32 %r233,%r105; .loc 1 249 13 mad.lo.u64 %r97,%r233,%r44,%r60; .loc 1 261 13 neg.s64 %r58,%r157; .loc 1 262 14 neg.s64 %r59,%r158; add.u64 %r284,%frame,368; mov.u32 %r287,1; .loc 1 284 33 add.u64 %r290,%frame,240; $L48: ld.u64 %r130,[%frame+360]; mov.u64 %r152,%r130; mov.u32 %r33,%r227; $L46: .loc 1 254 11 ld.s8 %r236,[%r97]; cvt.u16.u32 %r235,%r236; setp.eq.u16 %r237,%r235,0; @ %r237 bra $L43; ld.u32 %r238,[%r96]; setp.ne.u32 %r239,%r238,%r162; @ %r239 bra $L43; setp.eq.u32 %r240,%r33,0; selp.u64 %r152,%r130,%r152,%r240; add.u64 %r139,%frame,248; mov.u64 %r138,%r284; add.u64 %r136,%r78,%r141; $L45: .loc 1 257 37 sub.u64 %r243,%r151,%r152; .loc 1 257 25 st.u64 [%r133],%r243; .loc 1 256 5 add.u64 %r133,%r133,%r98; setp.eq.u64 %r244,%r136,%r139; @ %r244 bra $L33; .loc 1 257 33 ld.u64 %r151,[%r139]; .loc 1 257 44 ld.u64 %r152,[%r138]; add.u64 %r139,%r139,8; add.u64 %r138,%r138,8; bra $L45; $L43: .loc 1 261 13 add.u64 %r96,%r96,%r58; .loc 1 262 14 add.u64 %r97,%r97,%r59; .loc 1 263 14 add.u64 %r152,%r152,1; mov.u32 %r33,%r287; .loc 1 263 6 setp.ne.u64 %r245,%r152,%r151; @ %r245 bra $L46; st.u64 [%frame+360],%r151; mov.u64 %r125,%r284; mov.u64 %r69,%r158; mov.u64 %r72,%r151; mov.u64 %r67,%r155; mov.u64 %r124,8; .loc 1 265 6 mov.u64 %r100,0; .loc 1 270 17 mov.u64 %r247,%r100; $L47: st.u64 [%r125+-8],%r247; .loc 1 273 27 mul.lo.u64 %r62,%r67,%r72; .loc 1 274 28 mul.lo.u64 %r64,%r72,%r69; mov.u64 %r142,%r100; .loc 1 275 9 add.u64 %r100,%r100,1; .loc 1 276 11 setp.eq.u64 %r248,%r142,%r134; @ %r248 bra $L33; .loc 1 280 13 ld.u64 %r249,[%r125]; add.u64 %r66,%r249,1; st.u64 [%r125],%r66; .loc 1 281 20 add.u64 %r251,%r285,%r124; ld.u64 %r67,[%r251]; .loc 1 281 10 sub.u64 %r252,%r62,%r67; shl.b64 %r253,%r252,2; add.u64 %r96,%r96,%r253; .loc 1 282 21 add.u64 %r254,%frame,%r124; ld.u64 %r69,[%r254]; .loc 1 282 11 sub.u64 %r255,%r69,%r64; add.u64 %r97,%r97,%r255; .loc 1 284 33 add.u64 %r257,%r290,%r124; ld.u64 %r72,[%r257]; .loc 1 284 6 add.u64 %r125,%r125,8; add.u64 %r124,%r124,8; setp.eq.u64 %r258,%r66,%r72; @ %r258 bra $L47; bra $L48; $L42: ld.u64 %r55,[%frame+360]; .loc 1 265 6 mov.u64 %r146,%r55; mov.u32 %r126,%r164; $L52: .loc 1 294 11 ld.s8 %r260,[%r60]; cvt.u16.u32 %r259,%r260; setp.eq.u16 %r261,%r259,0; @ %r261 bra $L49; ld.u32 %r262,[%r47]; setp.ne.u32 %r263,%r262,%r162; @ %r263 bra $L49; setp.eq.u32 %r264,%r126,0; selp.u64 %r146,%r55,%r146,%r264; mov.u64 %r119,%r284; add.u64 %r77,%r283,%r141; $L51: .loc 1 297 36 add.u64 %r267,%r146,1; .loc 1 297 25 st.u64 [%r133],%r267; .loc 1 296 5 add.u64 %r133,%r133,%r98; setp.eq.u64 %r268,%r77,%r119; @ %r268 bra $L33; .loc 1 297 32 ld.u64 %r146,[%r119]; add.u64 %r119,%r119,8; bra $L51; $L49: .loc 1 301 13 add.u64 %r47,%r47,%r157; .loc 1 302 14 add.u64 %r60,%r60,%r158; .loc 1 303 14 add.u64 %r146,%r146,1; mov.u32 %r126,%r293; .loc 1 303 6 setp.ne.u64 %r269,%r146,%r151; @ %r269 bra $L52; st.u64 [%frame+360],%r151; mov.u64 %r153,%r284; mov.u64 %r93,%r158; mov.u64 %r95,%r151; mov.u64 %r91,%r155; mov.u64 %r150,8; .loc 1 305 6 mov.u64 %r101,0; .loc 1 310 17 mov.u64 %r271,%r101; $L53: st.u64 [%r153+-8],%r271; .loc 1 313 27 mul.lo.u64 %r86,%r91,%r95; .loc 1 314 28 mul.lo.u64 %r88,%r95,%r93; mov.u64 %r103,%r101; .loc 1 315 9 add.u64 %r101,%r101,1; .loc 1 316 11 setp.eq.u64 %r272,%r103,%r134; @ %r272 bra $L33; .loc 1 320 
13 ld.u64 %r273,[%r153]; add.u64 %r90,%r273,1; st.u64 [%r153],%r90; .loc 1 321 20 add.u64 %r275,%r285,%r150; ld.u64 %r91,[%r275]; .loc 1 321 10 sub.u64 %r276,%r91,%r86; shl.b64 %r277,%r276,2; add.u64 %r47,%r47,%r277; .loc 1 322 21 add.u64 %r278,%frame,%r150; ld.u64 %r93,[%r278]; .loc 1 322 11 sub.u64 %r279,%r93,%r88; add.u64 %r60,%r60,%r279; .loc 1 324 33 add.u64 %r281,%r296,%r150; ld.u64 %r95,[%r281]; .loc 1 324 6 add.u64 %r153,%r153,8; add.u64 %r150,%r150,8; setp.eq.u64 %r282,%r90,%r95; @ %r282 bra $L53; bra $L42; $L33:_gfortran_sfindloc0_i4 .visible .func _gfortran_sfindloc0_i4predar2; mov.u64 %r38,%ar3; mov.u32 %r39,%ar4; .loc 1 345 6 setp.eq.u64 %r40,%r38,0; @ %r40 bra $L74; .loc 1 345 20 ld.u32 %r41,[%r38]; setp.eq.u32 %r42,%r41,0; @ %r42 bra $L75; $L74: .loc 1 347 7 {call _gfortran_findloc0_348 7 bra $L73; $L75: .loc 1 351 8 ld.s8 %r30,[%r36+28]; .loc 1 353 6 setp.gt.s64 %r48,%r30,0; @ %r48 bra $L77; .loc 1 354 5 cvta.const.u64 %r50,$LC0; mov.u64 %r49,0; {_gfortrani_internal_error77: .loc 1 356 15 ld.u64 %r24,[%r35]; .loc 1 356 6 setp.ne.u64 %r51,%r24,0; @ %r51 bra $L78; .loc 1 358 7 st.u64 [%r35+48],%r24; add.u64 %r53,%r30,-1; st.u64 [%r35+56],%r53; mov.u64 %r54,1; st.u64 [%r35+40],%r54; .loc 1 359 28 cvt.u32.u64 %r55,%r54; st.u8 [%r35+28],%r55; .loc 1 360 24 st.u64 [%r35+8],%r24; .loc 1 361 29 mov.u64 call (%value_in),_gfortrani_xmallocarray61 27 st.u64 [%r35],%r24; bra $L79; $L78: .loc 1 363 12 cvta.global.u64 %r63,_gfortrani_compile_options; .loc 1 363 11 ld.u32 %r64,[%r63+36]; setp.eq.u32 %r65,%r64,0; @ %r65 bra $L79; .loc 1 365 8 cvta.const.u64 %r68,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 370 8 ld.u64 %r24,[%r35]; $L79: ld.u64 %r69,[%r35+40]; shl.b64 %r34,%r69,3; mov.u64 %r29,%r24; .loc 1 371 10 mov.u64 %r32,0; .loc 1 372 23 mov.u64 %r70,%r32; $L80: st.u64 [%r29],%r70; .loc 1 371 24 add.u64 %r32,%r32,1; .loc 1 371 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r71,%r30,%r32; @ %r71 bra $L80; $L73: .loc 1 373 1 ret; } findloc0_i8.o/_gfortran_findloc0_i8 .visible .func _gfortran_findloc0_i8.file 1 "../../../libgfortran/generated/findloc0_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc0_i8 .visible .func _gfortran_mfindloc0__gfortran_sfindloc0_i8 .visible .func _gfortran_sfindloc0_VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_iforeach_return .extern .func _gfortrani_bounds_iforeach_return_gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_internal_error .extern .func _gfortrani_internal_error8] = {70,73,78,68,777,65,83,75,32,97,114,103,117,109,101,110,116,0 }82,97,110,107,32,111,102,32,97,114,114,97,121,32,110,101,101,100,115,32,116,111,32,98,101,32,6270,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_findloc0_i8 .visible .func _gfortran_findloc0_i8reg .u64 %stack; mov.u64 %stack,088pred.reg .u64 %r140; .reg .u32 %r141; .reg .predpredu64 %r166; .reg .pred.reg .pred %r178; .reg .predu64pred %r192; .reg .u64 %r195; .reg .predpredu32 %r211; .reg .u64 %r215; .reg .u32 %r216; .reg .u64 %r220; mov.u64 %r120,%ar0; mov.u64 %r121,%ar1; mov.u64 %r122,%ar2; mov.u32 %r123,%ar3; .loc 1 51 10 ld.s8 %r22,[%r121+28]; .loc 1 51 8 cvt.u32.u32 %r124,%r22; cvt.s64.s8 %r80,%r124; .loc 1 52 6 setp.gt.s64 %r125,%r80,0; @ %r125 bra $L2; .loc 1 53 5 cvta.const.u64 %r126,$LCstack; call _gfortran_runtime_errorr23,[%r120]; .loc 1 55 6 
setp.ne.u64 %r128,%r23,0; @ %r128 bra $L3; .loc 1 57 7 st.u64 [%r120+48],%r23; add.u64 %r130,%r80,-1; st.u64 [%r120+56],%r130; mov.u64 %r131,1; st.u64 [%r120+40],%r131; .loc 1 58 28 cvt.u32.u64 %r132,%r131; st.u8 [%r120+28],%r132; .loc 1 59 24 st.u64 [%r120+8],%r23; .loc 1 60 29 mov.u64 %r135135; call (%value_in),_gfortrani_xmallocarray138,[%value_in]; } mov.u64 %r23,%r138; .loc 1 60 27 st.u64 [%r120],%r23; bra $L4; $L3: .loc 1 64 11 cvta.global.u64 %r140,_gfortrani_compile_options; .loc 1 64 10 ld.u32 %r141,[%r140+36]; setp.eq.u32 %r142,%r141,0; @ %r142 bra $L4; .loc 1 6gfortrani_bounds_iforeach_return.loc 1 70 8 ld.u64 %r23,[%r120]; $L4: ld.u64 %r146,[%r120+40]; shl.b64 %r118,%r146,3; mov.u64 %r102,%r23; .loc 1 69 11 mov.u64 %r116,%r102; .loc 1 73 10 mov.u64 %r89,0; .loc 1 74 23 mov.u64 %r147,%r89; $L5: st.u64 [%r116],%r147; mov.u64 %r103,%r89; .loc 1 73 26 add.u64 %r89,%r103,1; .loc 1 73 3 add.u64 %r116,%r116,%r118; setp.ne.u64 %r148,%r80,%r89; @ %r148 bra $L5; add.u64 %r39,%r121,40; mov.u64 %r74,%frame; .loc 1 76 6 mov.u64 %r87,1; .loc 1 77 10 mov.u64 %r101,0; add.u64 %r208,%frame,120; bra $L7; $L20: mov.u64 %r101,%r88; $L7: .loc 1 79 18 ld.u64 %r149,[%r39]; st.u64 [%r74],%r149; .loc 1 80 19 ld.u64 %r151,[%r39+16]; add.u64 %r150,%r151,1; ld.u64 %r152,[%r39+8]; sub.u64 %r33,%r150,%r152; .loc 1 80 17 shl.b64 %r154,%r101,3; add.u64 %r155,%r208,%r154; st.u64 [%r155],%r33; .loc 1 81 10 mul.lo.u64 %r87,%r87,%r33; .loc 1 82 10 setp.le.s64 %r156,%r33,0; @ %r156 bra $L1; .loc 1 77 26 add.u64 %r88,%r101,1; .loc 1 77 3 add.u64 %r39,%r39,24; add.u64 %r74,%r74,8; setp.ne.u64 %r157,%r103,%r101; @ %r157 bra $L20; .loc 1 87 16 cvt.u32.u32 %r159,%r22; cvt.s64.s8 %r158,%r159; shl.b64 %r95,%r158,3; add.u64 %r209,%frame,240; mov.u32 %r164%r113,[%frame]; .loc 1 104 13 shl.b64 %r115,%r113,3; .loc 1 105 34 ld.u64 %r109,[%frame+120]; .loc 1 91 19 ld.u64 %r45,[%r121]; .loc 1 89 6 setp.eq.u32 %r168,%r123,0; @ ! 
%r168 bra $L32; add.u64 %r210,%frame,248; mov.u32 %r216,1; .loc 1 149 17 mov.u64 %r220,0; bra $L8; $L32: .loc 1 91 37 add.u64 %r169,%r87,-1; .loc 1 91 31 shl.b64 %r170,%r169,3; .loc 1 91 12 add.u64 %r68,%r45,%r170; .loc 1 104 13 neg.s64 %r44,%r115; add.u64 %r210,%frame,248; mov.u32 %r211,1; .loc 1 112 17 mov.u64 %r215,0; $L14: ld.u64 %r38,[%frame+240]; mov.u64 %r110,%r38; mov.u32 %r98,%r164; $L12: .loc 1 97 11 ld.u64 %r171,[%r68]; setp.ne.u64 %r172,%r171,%r122; @ %r172 bra $L9; setp.eq.u32 %r173,%r98,0; selp.u64 %r110,%r38,%r110,%r173; add.u64 %r93,%frame,128; mov.u64 %r92,%r210; add.u64 %r86,%r208,%r95; $L11: .loc 1 100 37 sub.u64 %r177,%r109,%r110; .loc 1 100 25 st.u64 [%r102],%r177; .loc 1 99 5 add.u64 %r102,%r102,%r118; setp.eq.u64 %r178,%r86,%r93; @ %r178 bra $L1; .loc 1 100 33 ld.u64 %r109,[%r93]; .loc 1 100 44 ld.u64 %r110,[%r92]; add.u64 %r93,%r93,8; add.u64 %r92,%r92,8; bra $L11; $L9: .loc 1 104 13 add.u64 %r68,%r68,%r44; .loc 1 105 14 add.u64 %r110,%r110,1; mov.u32 %r98,%r211; .loc 1 105 6 setp.ne.u64 %r179,%r110,%r109; @ %r179 bra $L12; st.u64 [%frame+240],%r109; mov.u64 %r111,%r210; mov.u64 %r53,%r109; mov.u64 %r51,%r113; .loc 1 107 6 mov.u64 %r73,0; $L13: .loc 1 112 17 st.u64 [%r111+-8],%r215; .loc 1 115 27 mul.lo.u64 %r48,%r51,%r53; mov.u64 %r79,%r73; .loc 1 116 9 add.u64 %r73,%r73,1; .loc 1 117 11 setp.eq.u64 %r182,%r79,%r103; @ %r182 bra $L1; .loc 1 121 13 ld.u64 %r183,[%r111]; add.u64 %r50,%r183,1; st.u64 [%r111],%r50; shl.b64 %r106,%r73,3; .loc 1 122 20 add.u64 %r184,%frame,%r106; ld.u64 %r51,[%r184]; .loc 1 122 10 sub.u64 %r185,%r48,%r51; shl.b64 %r186,%r185,3; add.u64 %r68,%r68,%r186; .loc 1 124 33 add.u64 %r188,%r208,%r106; ld.u64 %r53,[%r188]; .loc 1 124 6 add.u64 %r111,%r111,8; setp.eq.u64 %r189,%r50,%r53; @ %r189 bra $L13; bra $L14; $L8: ld.u64 %r96,[%frame+240]; .loc 1 107 6 mov.u64 %r105,%r96; mov.u32 %r59,%r123; $L18: .loc 1 134 11 ld.u64 %r190,[%r45]; setp.ne.u64 %r191,%r190,%r122; @ %r191 bra $L15; setp.eq.u32 %r192,%r59,0; selp.u64 %r105,%r96,%r105,%r192; mov.u64 %r72,%r210; add.u64 %r27,%r209,%r95; $L17: .loc 1 137 36 add.u64 %r195,%r105,1; .loc 1 137 25 st.u64 [%r102],%r195; .loc 1 136 5 add.u64 %r102,%r102,%r118; setp.eq.u64 %r196,%r27,%r72; @ %r196 bra $L1; .loc 1 137 32 ld.u64 %r105,[%r72]; add.u64 %r72,%r72,8; bra $L17; $L15: .loc 1 141 13 add.u64 %r45,%r45,%r115; .loc 1 142 14 add.u64 %r105,%r105,1; mov.u32 %r59,%r216; .loc 1 142 6 setp.ne.u64 %r197,%r105,%r109; @ %r197 bra $L18; st.u64 [%frame+240],%r109; mov.u64 %r71,%r210; mov.u64 %r67,%r109; mov.u64 %r65,%r113; .loc 1 144 6 mov.u64 %r75,0; $L19: .loc 1 149 17 st.u64 [%r71+-8],%r220; .loc 1 152 27 mul.lo.u64 %r62,%r65,%r67; mov.u64 %r78,%r75; .loc 1 153 9 add.u64 %r75,%r75,1; .loc 1 154 11 setp.eq.u64 %r200,%r78,%r103; @ %r200 bra $L1; .loc 1 158 13 ld.u64 %r201,[%r71]; add.u64 %r64,%r201,1; st.u64 [%r71],%r64; shl.b64 %r76,%r75,3; .loc 1 159 20 add.u64 %r202,%frame,%r76; ld.u64 %r65,[%r202]; .loc 1 159 10 sub.u64 %r203,%r65,%r62; shl.b64 %r204,%r203,3; add.u64 %r45,%r45,%r204; .loc 1 161 33 add.u64 %r206,%r208,%r76; ld.u64 %r67,[%r206]; .loc 1 161 6 add.u64 %r71,%r71,8; setp.eq.u64 %r207,%r64,%r67; @ %r207 bra $L19; bra $L8; $L_gfortran_mfindloc0_i8 .visible .func _gfortran_mfindloc0_, .param .u32 %in_ar4) {reg .u64 %stack; mov.u64 %stack,0; .local .align 16 .b8 %frame_ar[480pred %r166; .reg .u64pred %r184; .reg .u64 %r187; .reg .u64 %r190; .reg .u32u64 %r213; .reg .u64u32 %r227; .reg .u64 %r229; .reg .pred %r231; .reg .u64 %r232; .reg .u64 %r233; .reg .u16u64 %r255; .reg .u64 %r257; .reg .pred 
[extraction note: the PTX payload in this region is damaged; only the inventory below is recoverable]
Tail of archive member findloc0_i8.o: the body of _gfortran_mfindloc0_i8 (arguments %ar0-%ar4, .loc references to source lines 190-324) and the body of _gfortran_sfindloc0_i8 (lines 345-373). The scalar-MASK wrapper tests its MASK pointer argument (%ar3): when MASK is absent or true it calls _gfortran_findloc0_i8, otherwise it validates the rank, allocates the result through _gfortrani_xmallocarray if needed, and zero-fills it.
Header of archive member findloc0_i16.o, built from ../../../libgfortran/generated/findloc0_i16.c: visible functions _gfortran_findloc0_i16, _gfortran_mfindloc0_i16 and _gfortran_sfindloc0_i16; extern references to _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return, _gfortrani_bounds_equal_extents and _gfortrani_internal_error; diagnostic strings whose recoverable fragments read "FIND...", "...ASK argument", "Rank of array needs to be > " and "Funny sized logical a...". The start of the _gfortran_findloc0_i16 body follows: descriptor setup, result allocation via _gfortrani_xmallocarray, and a value passed by pointer in %ar2 and loaded as two 64-bit words.
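These routines implement the FINDLOC intrinsic over a rank-sized result vector: scan the array in element order and, at the first match, store the 1-based coordinates of that element; when nothing matches the result stays all zeros. As a rough illustration of that search, here is a minimal, self-contained C sketch; the function name, the contiguous column-major layout and the simplified argument list are my own assumptions, not the libgfortran interface.

/* Sketch only: first-match FINDLOC over a contiguous column-major array. */
#include <stdio.h>
#include <stdint.h>

typedef int64_t index_type;

static void findloc0_contig_i8(index_type *result, int rank,
                               const index_type *extent,
                               const int64_t *array, int64_t value)
{
    index_type n = 1;
    for (int d = 0; d < rank; d++) {
        result[d] = 0;            /* FINDLOC returns zeros when nothing matches */
        n *= extent[d];
    }
    for (index_type flat = 0; flat < n; flat++) {
        if (array[flat] == value) {
            index_type rem = flat;
            for (int d = 0; d < rank; d++) {   /* flat index -> 1-based coords */
                result[d] = rem % extent[d] + 1;
                rem /= extent[d];
            }
            return;
        }
    }
}

int main(void)
{
    int64_t a[6] = {4, 7, 9, 7, 1, 2};          /* a 3x2 array, column-major */
    index_type extent[2] = {3, 2}, loc[2];
    findloc0_contig_i8(loc, 2, extent, a, 7);
    printf("%lld %lld\n", (long long)loc[0], (long long)loc[1]);  /* prints: 2 1 */
    return 0;
}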
Remainder of findloc0_i16.o: the _gfortran_findloc0_i16 scan loops (lines 51-161; two mirrored loops selected on the u32 %ar3 flag, each element compared against the two 64-bit words of the value), the _gfortran_mfindloc0_i16 body (lines 190-324, adding a MASK descriptor argument, an ld.s8 of the MASK byte before every comparison, and calls to _gfortrani_bounds_iforeach_return / _gfortrani_bounds_equal_extents when the bounds-check flag in _gfortrani_compile_options is set), and the _gfortran_sfindloc0_i16 wrapper (lines 345-373, the same dispatch as the i8 variant but spilling the 128-bit value into a local frame before the call). Then the header of archive member findloc0_r4.o, built from ../../../libgfortran/generated/findloc0_r4.c, with _gfortran_findloc0_r4, _gfortran_mfindloc0_r4 and _gfortran_sfindloc0_r4, the same extern symbols and diagnostic strings, and the start of the _gfortran_findloc0_r4 body, which receives its REAL(4) value in %ar2 as f32.
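The mfindloc0 bodies differ from the plain ones only in the extra MASK descriptor and the ld.s8 guard noted above; conceptually, a comparison is skipped whenever the corresponding MASK element is false. A hedged C sketch over a flat array (the helper name and the int8_t mask representation are illustrative assumptions, not the library layout):

/* Sketch only: masked FINDLOC over a flat array. */
#include <stdio.h>
#include <stdint.h>

typedef int64_t index_type;

static index_type mfindloc0_flat_i8(const int64_t *array, const int8_t *mask,
                                    index_type n, int64_t value)
{
    for (index_type i = 0; i < n; i++)
        if (mask[i] && array[i] == value)   /* MASK gates the comparison */
            return i + 1;                   /* 1-based position */
    return 0;                               /* 0 when no masked element matches */
}

int main(void)
{
    int64_t a[4] = {7, 7, 3, 7};
    int8_t  m[4] = {0, 1, 1, 1};
    printf("%lld\n", (long long)mfindloc0_flat_i8(a, m, 4, 7));  /* prints: 2 */
    return 0;
}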
findloc0_r4.o continued: the _gfortran_findloc0_r4 and _gfortran_mfindloc0_r4 scan loops branch away from the match on setp.neu.f32 (not-equal-or-unordered), and _gfortran_sfindloc0_r4 repeats the scalar-MASK dispatch. Then the header of archive member findloc0_r8.o, built from ../../../libgfortran/generated/findloc0_r8.c, with _gfortran_findloc0_r8, _gfortran_mfindloc0_r8 and _gfortran_sfindloc0_r8, the same extern symbols and diagnostic strings, and the start of the _gfortran_findloc0_r8 body, which receives its REAL(8) value in %ar2 as f64.
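For the REAL kinds the match test is an ordinary floating-point equality (the PTX branches away on not-equal-or-unordered), so a NaN element, or NaN as the searched value, never produces a hit. A plain C '==' behaves the same way; the sketch below only illustrates that consequence and is not the library code.

/* Sketch only: NaN never matches under ordinary floating-point equality. */
#include <math.h>
#include <stdio.h>

static long findloc0_flat_r8(const double *array, long n, double value)
{
    for (long i = 0; i < n; i++)
        if (array[i] == value)   /* false whenever either side is NaN */
            return i + 1;
    return 0;
}

int main(void)
{
    double a[4] = {1.0, NAN, 2.0, 2.0};
    printf("%ld %ld\n", findloc0_flat_r8(a, 4, 2.0),   /* 3: NaN at index 2 is skipped */
                        findloc0_flat_r8(a, 4, NAN));  /* 0: searching for NaN finds nothing */
    return 0;
}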
findloc0_r8.o continued: _gfortran_findloc0_r8 and _gfortran_mfindloc0_r8 use setp.neu.f64 for the element compare; _gfortran_sfindloc0_r8 is the scalar-MASK wrapper. The member name findloc0_r10.o appears next with no recoverable body, followed by the header of archive member findloc0_c4.o, built from ../../../libgfortran/generated/findloc0_c4.c, with _gfortran_findloc0_c4, _gfortran_mfindloc0_c4 and _gfortran_sfindloc0_c4, the same extern symbols and diagnostic strings, and the start of the _gfortran_findloc0_c4 body, which receives the COMPLEX(4) value as two f32 arguments (%ar2 and %ar3).
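Throughout these bodies the final integer argument corresponds to FINDLOC's BACK= option: of the two mirrored scan loops, one runs forward and the other starts at the last element and steps backwards, so with BACK=.true. the last matching element is the one reported. A self-contained C sketch of that selection (the flat-array signature is an assumption for illustration):

/* Sketch only: forward vs. backward scan selected by a BACK-style flag. */
#include <stdio.h>

static long findloc0_flat_back(const long *array, long n, long value, int back)
{
    if (back) {
        for (long i = n - 1; i >= 0; i--)   /* BACK=.true.: last match wins */
            if (array[i] == value)
                return i + 1;
    } else {
        for (long i = 0; i < n; i++)        /* default: first match wins */
            if (array[i] == value)
                return i + 1;
    }
    return 0;
}

int main(void)
{
    long a[5] = {5, 9, 7, 9, 1};
    printf("%ld %ld\n", findloc0_flat_back(a, 5, 9, 0),   /* 2 */
                        findloc0_flat_back(a, 5, 9, 1));  /* 4 */
    return 0;
}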
findloc0_c4.o continued: the _gfortran_findloc0_c4 scan loops compare each element by pairing set.u32.eq.f32 on the real word and on the imaginary word at offset +4, ANDing the two results (and.b16) before the match branch; _gfortran_mfindloc0_c4 adds the MASK descriptor and the guarding ld.s8 of the MASK byte, again with forward and backward scan loops selected on its final u32 argument. The extract breaks off inside the _gfortran_mfindloc0_c4 body.
selp.u64 %r155,%r35,%r155,%r299; mov.u64 %r138,%r320; add.u64 %r135,%r319,%r152; $L51: .loc 1 297 36 add.u64 %r302,%r155,1; .loc 1 297 25 st.u64 [%r126],%r302; .loc 1 296 5 add.u64 %r126,%r126,%r158; setp.eq.u64 %r303,%r135,%r138; @ %r303 bra $L33; .loc 1 297 32 ld.u64 %r155,[%r138]; add.u64 %r138,%r138,8; bra $L51; $L49: .loc 1 301 13 add.u64 %r49,%r49,%r167; .loc 1 302 14 add.u64 %r61,%r61,%r168; .loc 1 303 14 add.u64 %r155,%r155,1; mov.u32 %r33,%r328; .loc 1 303 6 setp.ne.u64 %r304,%r155,%r160; @ %r304 bra $L52; st.u64 [%frame+360],%r160; mov.u64 %r133,%r320; mov.u64 %r88,%r168; mov.u64 %r90,%r160; mov.u64 %r86,%r165; mov.u64 %r132,8; .loc 1 305 6 mov.u64 %r94,0; .loc 1 310 17 mov.u64 %r306,%r94; $L53: st.u64 [%r133+-8],%r306; .loc 1 313 27 mul.lo.u64 %r81,%r86,%r90; .loc 1 314 28 mul.lo.u64 %r83,%r90,%r88; mov.u64 %r95,%r94; .loc 1 315 9 add.u64 %r94,%r94,1; .loc 1 316 11 setp.eq.u64 %r307,%r95,%r127; @ %r307 bra $L33; .loc 1 320 13 ld.u64 %r308,[%r133]; add.u64 %r85,%r308,1; st.u64 [%r133],%r85; .loc 1 321 20 add.u64 %r310,%r318,%r132; ld.u64 %r86,[%r310]; .loc 1 321 10 sub.u64 %r311,%r86,%r81; shl.b64 %r312,%r311,3; add.u64 %r49,%r49,%r312; .loc 1 322 21 add.u64 %r313,%frame,%r132; ld.u64 %r88,[%r313]; .loc 1 322 11 sub.u64 %r314,%r88,%r83; add.u64 %r61,%r61,%r314; .loc 1 324 33 add.u64 %r316,%r331,%r132; ld.u64 %r90,[%r316]; .loc 1 324 6 add.u64 %r133,%r133,8; add.u64 %r132,%r132,8; setp.eq.u64 %r317,%r85,%r90; @ %r317 bra $L53; bra $L42; $L33:_gfortran_sfindloc0_c4 .visible .func _gfortran_sfindloc0_cu64 %r65; .reg .u32 %r66; .reg .pred %r67; .reg .u64 %r70; .reg .u64 %r71; .reg .u64 %r72; .reg .pred %r73; mov.u64 %r35,%ar0; mov.u64 %r36,%ar1; mov.f32 %r37,%ar2; mov.f32 %r38,%ar3; mov.u64 %r39,%ar4; mov.u32 %r40,%ar5; .loc 1 345 6 setp.eq.u64 %r41,%r39,0; @ %r41 bra $L80; .loc 1 345 20 ld.u32 %r42,[%r39]; setp.eq.u32 %r43,%r42,0; @ %r43 bra $L81; $L80: .loc 1 347 7 {.param .f32 %out_arg4; st.param.f32 [%out_arg4],%r3840; call _gfortran_findloc0_c4); } .loc 1 348 7 bra $L79; $L81: .loc 1 351 8 ld.s8 %r30,[%r36+28]; .loc 1 353 6 setp.gt.s64 %r50,%r30,0; @ %r50 bra $L83; .loc 1 354 5 cvta.const.u64 %r52,$LC0; mov.u64 %r51,r52; call _gfortrani_internal_error83: .loc 1 356 15 ld.u64 %r24,[%r35]; .loc 1 356 6 setp.ne.u64 %r53,%r24,0; @ %r53 bra $L84; .loc 1 358 7 st.u64 [%r35+48],%r24; add.u64 %r55,%r30,-1; st.u64 [%r35+56],%r55; mov.u64 %r56,1; st.u64 [%r35+40],%r56; .loc 1 359 28 cvt.u32.u64 %r57,%r56; st.u8 [%r35+28],%r57; .loc 1 360 24 st.u64 [%r35+8],%r24; .loc 1 361 29 mov.u64 _gfortrani_xmallocarray61 27 st.u64 [%r35],%r24; bra $L85; $L84: .loc 1 363 12 cvta.global.u64 %r65,_gfortrani_compile_options; .loc 1 363 11 ld.u32 %r66,[%r65+36]; setp.eq.u32 %r67,%r66,0; @ %r67 bra $L85; .loc 1 365 8 cvta.const.u64 %r70,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 370 8 ld.u64 %r24,[%r35]; $L85: ld.u64 %r71,[%r35+40]; shl.b64 %r34,%r71,3; mov.u64 %r29,%r24; .loc 1 371 10 mov.u64 %r32,0; .loc 1 372 23 mov.u64 %r72,%r32; $L86: st.u64 [%r29],%r72; .loc 1 371 24 add.u64 %r32,%r32,1; .loc 1 371 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r73,%r30,%r32; @ %r73 bra $L86; $L79: .loc 1 373 1 ret; } findloc0_c8.o/_gfortran_findloc0_c8 .visible .func _gfortran_findloc0_c8fortran/generated/findloc0_c8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc0_c8 .visible .func _gfortran_mfindloc0_c8_gfortran_sfindloc0_c8 .visible .func _gfortran_sfindloc0_c8VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func 
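The _gfortran_sfindloc0_* wrappers summarized above all follow the same scalar-MASK dispatch visible in the listing: if the mask argument is absent (null pointer) or true, they fall through to the unmasked routine; if it is present and false, FINDLOC returns all-zero indices, so they only zero the rank-sized result vector. A minimal C sketch of that pattern, using simplified placeholder types and names rather than the real libgfortran descriptors:

#include <stdint.h>
#include <stddef.h>

/* Illustrative stand-ins only; the real libgfortran descriptors and
   signatures are more involved than this rank-1 example. */
typedef struct {
    const double *data;   /* flattened array data             */
    int64_t       n;      /* number of elements (rank 1 here) */
} src_desc_t;

/* Unmasked FINDLOC over a rank-1 array: first 1-based index whose
   element equals value, or 0 if there is no match. */
static void findloc0_sketch(int64_t *ret, const src_desc_t *a, double value)
{
    ret[0] = 0;
    for (int64_t i = 0; i < a->n; i++)
        if (a->data[i] == value) { ret[0] = i + 1; break; }
}

/* Scalar-MASK wrapper, mirroring the dispatch the sfindloc0 PTX shows:
   mask absent or .TRUE. -> plain search; mask .FALSE. -> all-zero result. */
static void sfindloc0_sketch(int64_t *ret, const src_desc_t *a, double value,
                             const int32_t *mask)
{
    if (mask == NULL || *mask) {
        findloc0_sketch(ret, a, value);
        return;
    }
    ret[0] = 0;   /* FINDLOC with MASK=.FALSE. yields zero indices */
}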
[findloc0_c8.o, remainder — PTX assembly, damaged in extraction: extern declarations of _gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return, _gfortrani_bounds_equal_extents and _gfortrani_internal_error; string literals whose bytes decode to "MASK argument", "Rank of array needs to be > ...", "FINDLOC" and "Funny sized logical a..."; and the bodies of _gfortran_findloc0_c8 (unmasked COMPLEX(8) search, comparing real and imaginary parts as two f64 loads with a 16-byte element stride), _gfortran_mfindloc0_c8 (array-MASK variant) and _gfortran_sfindloc0_c8 (scalar-MASK wrapper deferring to _gfortran_findloc0_c8 or zero-filling the result). The archive member name findloc0_c10.o follows with no code visible in this extraction, and findloc0_s1.o then opens: declarations of _gfortran_findloc0_s1, _gfortran_mfindloc0_s1 and _gfortran_sfindloc0_s1 (.file "../../../libgfortran/generated/findloc0_s1.c") and extern declarations of _gfortrani_compile_options and _gfortran_runtime_error.]
[findloc0_s1.o, remainder — PTX assembly, damaged in extraction: further extern declarations (_gfortrani_xmallocarray, _gfortrani_bounds_iforeach_return, _gfortran_compare_string, _gfortrani_bounds_equal_extents, _gfortrani_internal_error); string literals decoding to "Rank of array needs to be > ...", "Funny sized logical arr...", "FINDLOC..." and "MASK argumen..."; and the bodies of _gfortran_findloc0_s1 (unmasked CHARACTER(kind=1) search, each candidate compared through _gfortran_compare_string), _gfortran_mfindloc0_s1 (array-MASK variant) and _gfortran_sfindloc0_s1 (scalar-MASK wrapper deferring to _gfortran_findloc0_s1). findloc0_s4.o then opens: declarations of _gfortran_findloc0_s4, _gfortran_mfindloc0_s4 and _gfortran_sfindloc0_s4 (.file "../../../libgfortran/generated/findloc0_s4.c"), extern declarations including _gfortran_compare_string_char4, the corresponding string literals, and the start of the _gfortran_findloc0_s4 body.]
[findloc0_s4.o, continuation — PTX assembly, damaged in extraction: the rest of _gfortran_findloc0_s4 (unmasked CHARACTER(kind=4) search; candidates compared via _gfortran_compare_string_char4, with element strides scaled by 4 bytes per character) and most of _gfortran_mfindloc0_s4 (array-MASK variant: mask-kind validation, bounds checks via _gfortrani_bounds_equal_extents / _gfortrani_bounds_iforeach_return when enabled, then the masked search loops for the BACK=.TRUE. and BACK=.FALSE. cases).]
%r285,%r148,%r35; @ %r285 bra $L29; .loc 1 323 13 ld.u64 %r286,[%r34]; add.u64 %r104,%r286,1; st.u64 [%r34],%r104; .loc 1 324 20 add.u64 %r288,%r297,%r159; ld.u64 %r105,[%r288]; .loc 1 324 10 sub.u64 %r289,%r105,%r99; mul.lo.u64 %r290,%r289,%r166; shl.b64 %r291,%r290,2; add.u64 %r50,%r50,%r291; .loc 1 325 21 add.u64 %r292,%frame,%r159; ld.u64 %r107,[%r292]; .loc 1 325 11 sub.u64 %r293,%r107,%r102; add.u64 %r71,%r71,%r293; .loc 1 327 33 add.u64 %r295,%r303,%r159; ld.u64 %r109,[%r295]; .loc 1 327 6 add.u64 %r34,%r34,8; add.u64 %r159,%r159,8; setp.eq.u64 %r296,%r104,%r109; @ %r296 bra $L47; bra $L38; $L29:_gfortran_sfindloc0_s4 .visible .func _gfortran_sfindloc0_spred %r44; .reg .predpredpredpredar2; mov.u64 %r38,%ar3; mov.u32 %r39,%ar4; mov.u64 %r40,%ar5; mov.u64 %r41,%ar6; .loc 1 350 6 setp.eq.u64 %r42,%r38,0; @ %r42 bra $L66; .loc 1 350 20 ld.u32 %r43,[%r38];67; $L66: .loc 1 352 7 {41; call _gfortran_findloc0_s4353 7 bra $L65; $L67: .loc 1 356 8 ld.s8 %r30,[%r36+28]; .loc 1 358 6 setp.gt.s64 %r52,%r30,0; @ %r52 bra $L69; .loc 1 359 5 cvta.const.u64 %r54,$LC0; mov.u64 %r53,r54; call _gfortrani_internal_error69: .loc 1 361 15 ld.u64 %r24,[%r35]; .loc 1 361 6 setp.ne.u64 %r55,%r24,0; @ %r55 bra $L70; .loc 1 363 7 st.u64 [%r35+48],%r24; add.u64 %r57,%r30,-1; st.u64 [%r35+56],%r57; mov.u64 %r58,1; st.u64 [%r35+40],%r58; .loc 1 364 28 cvt.u32.u64 %r59,%r58; st.u8 [%r35+28],%r59; .loc 1 365 24 st.u64 [%r35+8],%r24; .loc 1 366 29 mov.u64 call (%value_in),_gfortrani_xmallocarray27 st.u64 [%r35],%r24; bra $L71; $L70: .loc 1 368 12 cvta.global.u64 %r67,_gfortrani_compile_options; .loc 1 368 11 ld.u32 %r68,[%r67+36]; setp.eq.u32 %r69,%r68,0; @ %r69 bra $L71; .loc 1 370 8 cvta.const.u64 %r72,$LC1; {_gfortrani_bounds_iforeach_return.loc 1 375 8 ld.u64 %r24,[%r35]; $L71: ld.u64 %r73,[%r35+40]; shl.b64 %r34,%r73,3; mov.u64 %r29,%r24; .loc 1 376 10 mov.u64 %r32,0; .loc 1 377 23 mov.u64 %r74,%r32; $L72: st.u64 [%r29],%r74; .loc 1 376 24 add.u64 %r32,%r32,1; .loc 1 376 3 add.u64 %r29,%r29,%r34; setp.ne.u64 %r75,%r30,%r32; @ %r75 bra $L72; $L65: .loc 1 378 1 ret; } findloc1_i1.o/_gfortran_findloc1_i1 .visible .func _gfortran_findloc1_i1); .file 1 "../../../libgfortran/generated/findloc1_i1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc1_i1 .visible .func _gfortran_mfindloc1_i1_gfortran_sfindloc1_i1 .visible .func _gfortran_sfindloc1_i1); // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_internal_error .extern .func _gfortrani_internal_error8] = {70,73,78,68,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,914,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_findloc1_i1 .visible .func _gfortran_findloc1_i1480u649predu64u64 %r229; .reg .u64 %r230; .reg .predpred %r255; .reg .predu64 %r266; 
[_gfortran_findloc1_i1 (arguments: retarray, array, value, pdim, back): validates DIM against the array rank and reports violations through _gfortran_runtime_error (.loc 1 54-59); builds the extent and stride tables for the reduced-rank result (.loc 1 69-83); allocates the result through _gfortrani_xmallocarray when its data pointer is still null (.loc 1 101-110), otherwise checks the return rank and, when bounds checking is enabled in _gfortrani_compile_options, calls _gfortrani_bounds_ifunction_return (.loc 1 116-123); then runs the per-slice search: a backward scan over ld.s8 elements when BACK is set (.loc 1 147-150), or a forward scan otherwise (.loc 1 160-162), storing the 1-based position or 0 into the result and stepping the outer counters (.loc 1 169-190).]

[_gfortran_mfindloc1_i1 (same arguments plus a mask descriptor): identical shape and allocation handling, plus a check that the logical mask kind is 1, 2, 4, 8 or 16, otherwise _gfortrani_internal_error with the "Funny sized logical array" message (.loc 1 246-253); its backward and forward search loops test the mask element before comparing the array element (.loc 1 334-353).]

[_gfortran_sfindloc1_i1 (scalar mask, .loc 1 410 onward): forwards to _gfortran_findloc1_i1 when the mask pointer is null or the mask is true, and otherwise allocates the reduced-rank result and fills it with zeros.]
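The search itself, in the kind-1 functions above and in the kind-2 and kind-4 variants that follow, reduces to two symmetric loops: when BACK is set the PTX walks the slice from its last element toward the first (the .loc 1 148 loop), otherwise from the first toward the last (the .loc 1 160 loop), and records the 1-based position of the first match or zero. A hedged C sketch of that inner step; findloc_along_dim is an illustrative name, not a libgfortran entry point.

#include <stddef.h>
#include <stdint.h>

/* Sketch of the inner search the findloc1_* loops compile down to.
   "stride" is the element stride along the reduced dimension; the PTX walks
   the slice with the same stride, negated when searching backwards. */
static ptrdiff_t
findloc_along_dim (const int8_t *base, ptrdiff_t extent, ptrdiff_t stride,
                   int8_t value, int back)
{
  if (back)
    {
      /* Start at the last element and walk toward the first.  */
      const int8_t *p = base + (extent - 1) * stride;
      for (ptrdiff_t i = extent; i > 0; i--, p -= stride)
        if (*p == value)
          return i;               /* 1-based position, as FINDLOC requires */
    }
  else
    {
      const int8_t *p = base;
      for (ptrdiff_t i = 1; i <= extent; i++, p += stride)
        if (*p == value)
          return i;
    }
  return 0;                       /* 0 means "not found" */
}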
[End of findloc1_i1.o (_gfortran_sfindloc1_i1 returns at .loc 1 522), followed by the findloc1_i2.o member: .file 1 "../../../libgfortran/generated/findloc1_i2.c", the same three visible functions for kind 2 (_gfortran_findloc1_i2, _gfortran_mfindloc1_i2, _gfortran_sfindloc1_i2), the same extern symbols (_gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_internal_error) and the same string constants. The function bodies mirror the kind-1 versions; the differences are that array elements are loaded as 16-bit values (ld.u16) and the stride along the reduced dimension is doubled rather than used as-is in the address arithmetic.]
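The kind-1, kind-2 and kind-4 members differ only in how an element is loaded and how the dimension stride is scaled (ld.s8 with the stride as-is, ld.u16 with the stride doubled, ld.u32 with it quadrupled). libgfortran keeps these as separately generated C files per kind; purely as an illustration of what they have in common, the same scan can be written once over a byte stride and an element size. findloc_generic below is an invented name, not part of the library.

#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Illustration only: one search body parameterised by element size,
   equivalent in effect to the per-kind findloc1_i1/i2/i4 loops above. */
static ptrdiff_t
findloc_generic (const void *base, ptrdiff_t extent, ptrdiff_t stride_elems,
                 const void *value, size_t elem_size)
{
  const unsigned char *p = base;
  ptrdiff_t byte_stride = stride_elems * (ptrdiff_t) elem_size;
  for (ptrdiff_t i = 1; i <= extent; i++, p += byte_stride)
    if (memcmp (p, value, elem_size) == 0)
      return i;                   /* 1-based position of the first match */
  return 0;                       /* 0 means "not found" */
}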
[End of findloc1_i2.o, followed by the start of the findloc1_i4.o member: .file 1 "../../../libgfortran/generated/findloc1_i4.c", visible functions _gfortran_findloc1_i4, _gfortran_mfindloc1_i4 and _gfortran_sfindloc1_i4, with the same extern declarations and string constants as the previous members. The body of _gfortran_findloc1_i4 begins here with the same DIM validation, result allocation and search-loop structure, using 32-bit element loads (ld.u32) and a dimension stride scaled by four (shl.b64 by 2); the rest of the member continues below.]
%r334,%r333,2; sub.u64 %r129,%r132,%r334; .loc 1 179 23 mul.lo.u64 %r335,%r78,%r82; .loc 1 179 9 shl.b64 %r336,%r335,3; sub.u64 %r130,%r111,%r336; .loc 1 181 7 @ %r358 bra $L1; add.u64 %r66,%frame,368; mov.u64 %r159,8; .loc 1 180 5 mov.u64 %r131,1; .loc 1 177 13 mov.u64 %r362,0; bra $L34; $L35: st.u64 [%r66],%r362; .loc 1 178 23 mul.lo.u64 %r342,%r95,%r94; .loc 1 178 9 shl.b64 %r343,%r342,2; sub.u64 %r129,%r132,%r343; .loc 1 179 23 mul.lo.u64 %r344,%r98,%r94; .loc 1 179 9 shl.b64 %r345,%r344,3; sub.u64 %r130,%r111,%r345; .loc 1 180 5 add.u64 %r131,%r131,1; .loc 1 181 7 add.u64 %r66,%r66,8; add.u64 %r159,%r159,8; setp.eq.u64 %r346,%r131,%r367; @ %r346 bra $L1; $L34: .loc 1 188 16 ld.u64 %r347,[%r66]; add.u64 %r94,%r347,1; st.u64 [%r66],%r94; .loc 1 189 23 add.u64 %r349,%r368,%r159; ld.u64 %r95,[%r349]; .loc 1 189 13 shl.b64 %r350,%r95,2; add.u64 %r132,%r129,%r350; .loc 1 190 23 add.u64 %r351,%frame,%r159; ld.u64 %r98,[%r351]; .loc 1 190 13 shl.b64 %r352,%r98,3; add.u64 %r111,%r130,%r352; .loc 1 175 32 add.u64 %r354,%r369,%r159; ld.u64 %r102,[%r354]; .loc 1 175 13 setp.eq.u64 %r355,%r94,%r102; @ %r355 bra $L35; .loc 1 177 13 mov.u64 %r82,0; bra $L25; $L51: .loc 1 86 6 ld.u64 %r356,[%r185]; setp.eq.u64 %r357,%r356,0; @ ! %r357 bra $L11; bra $L36; $L_gfortran_mfindloc1_i4 .visible .func _gfortran_mfindloc1_i4608u32 %r209; .reg .u32 %r210; .reg .u64 %r212; .reg .u32 %r214; .reg .u32 %r215; .reg .u16 %r217; .reg .u16 %r218; .reg .u16 %r219; .reg .u32 %r220; .reg .u16 %r221; .reg .predu64 %r230; .reg .u64 %r231; .reg .u64 %r249; .reg .u32u32u64 %r281; .reg .u64 %r282; .reg .u64 %r283; .reg .u64predu64u16 %r331; .reg .u16predpredu64 %r375; .reg .u64pred %r379; .reg .pred %r381; .reg .u16 %r383; .reg .u32 %r384; .reg .pred %r385; .reg .u32pred %r393; .reg .pred %r394; .reg .predu64u64 %r430; .reg .u64 %r431; mov.u64 %r204,%ar0; mov.u64 %r205,%ar1; mov.u32 %r206,%ar2; mov.u64 %r207,%ar3; mov.u64 %r208,%ar4; mov.u32 %r209,%ar5; .loc 1 225 10 ld.s8 %r22,[%r205+28]; .loc 1 225 38 add.u32 %r210,%r22,-1; .loc 1 225 8 cvt.s64.s32 %r147,%r210; .loc 1 226 10 ld.u64 %r25,[%r207]; .loc 1 226 7 add.u64 %r148,%r25,-1; .loc 1 228 7 shr.u64 %r212,%r148,63; set.u32.lt.s64 %r214,%r147,%r148; neg.s32 %r215,%r214; cvt.u16.u64 %r218,%r212; cvt.u16.u32 %r219,%r215; or.b16 %r217,%r218,%r219; .loc 1 228 6 cvt.u32.u16 %r220,%r217; cvt.u16.u8 %r221,%r220; setp.eq.u16 %r222,%r221,0; @ %r222 bra $L53; .loc 1 230 7 cvt.u32.u32 %r225,%r22; cvt.s64.s8 %r224,%r225; st.u64 [%stack+8],%r224232stack; call _gfortran_runtime_error3: .loc 1 235 9 add.u64 %r228,%r148,%r148; add.u64 %r229,%r228,%r148; shl.b64 %r230,%r229,3; add.u64 %r231,%r205,%r230; ld.u64 %r31,[%r231+56]; ld.u64 %r33,[%r231+48]; .loc 1 239 9 ld.u64 %r151,[%r231+40]; .loc 1 240 12 add.u64 %r249,%r208,%r230; ld.u64 %r34,[%r249+40]; ld.u64 %r36,[%r208+16]; .loc 1 242 9 ld.u64 %r160,[%r208]; .loc 1 246 22 cvt.u32.u64 %r39,%r36; .loc 1 246 53 add.u32 %r251,%r39,-4; and.b32 %r252,%r251,-5; set.u32.eq.u32 %r254,%r252,0; neg.s32 %r255,%r254; .loc 1 246 22 add.u32 %r256,%r39,-1; set.u32.le.u32 %r258,%r256,1; neg.s32 %r259,%r258; .loc 1 246 6 cvt.u16.u32 %r261,%r255; cvt.u16.u32 %r262,%r259; or.b16 %r260,%r261,%r262; cvt.u32.u16 %r263,%r260; cvt.u16.u8 %r264,%r263; setp.eq.u16 %r265,%r264,0; @ %r265 bra $L54; $L58: .loc 1 255 3 setp.ne.u64 %r266,%r148,0; @ %r266 bra $L55; $L62: .loc 1 264 3 setp.gt.s64 %r267,%r147,%r148; @ %r267 bra $L56; bra $L113; $L54: .loc 1 248 7 setp.eq.u32 %r269,%r39,16; @ %r269 bra $L58; .loc 1 253 5 cvta.const.u64 %r271,$LC4; mov.u64 %r270,27271; call 
_gfortrani_internal_error5: add.u64 %r146,%r205,40; add.u64 %r167,%r208,40; add.u64 %r273,%r25,%r25; add.u64 %r274,%r273,%r25; shl.b64 %r275,%r274,3; add.u64 %r276,%r205,16; add.u64 %r101,%r275,%r276; .loc 1 255 3 mov.u64 %r200,0; add.u64 %r418,%frame,240; add.u64 %r419,%frame,120; add.u64 %r416,%frame,360; .loc 1 262 12 mov.u64 %r431,%r200; $L61: .loc 1 257 18 add.u64 %r278,%r418,%r200; ld.u64 %r279,[%r146]; st.u64 [%r278],%r279; .loc 1 258 18 add.u64 %r281,%r419,%r200; .loc 1 258 20 ld.u64 %r283,[%r167]; mul.lo.u64 %r282,%r283,%r36; .loc 1 258 18 st.u64 [%r281],%r282; .loc 1 259 19 ld.u64 %r285,[%r146+16]; add.u64 %r284,%r285,1; ld.u64 %r286,[%r146+8]; sub.u64 %r53,%r284,%r286; .loc 1 261 10 setp.lt.s64 %r287,%r53,0; @ %r287 bra $L59; .loc 1 259 17 add.u64 %r289,%r416,%r200; st.u64 [%r289],%r53; bra $L60; $L59: .loc 1 262 12 add.u64 %r291,%r416,%r200; st.u64 [%r291],%r431; $L60: .loc 1 255 3 add.u64 %r146,%r146,24; add.u64 %r167,%r167,24; add.u64 %r200,%r200,8; setp.ne.u64 %r293,%r101,%r146; @ %r293 bra $L61; bra $L62; $L113: .loc 1 274 6 ld.u64 %r294,[%r204]; setp.eq.u64 %r295,%r294,0; @ ! %r295 bra $L64; bra $L63; $L56: add.u64 %r297,%r25,%r25; add.u64 %r298,%r297,%r25; shl.b64 %r299,%r298,3; add.u64 %r143,%r299,40; add.u64 %r156,%r205,%r143; add.u64 %r141,%r208,%r143; shl.b64 %r300,%r25,3; add.u64 %r137,%r300,-8; cvt.u32.u32 %r302,%r22; cvt.s64.s8 %r301,%r302; shl.b64 %r303,%r301,3; add.u64 %r84,%r303,-8; add.u64 %r418,%frame,240; add.u64 %r419,%frame,120; add.u64 %r416,%frame,360; .loc 1 271 12 mov.u64 %r430,0; $L67: .loc 1 266 18 add.u64 %r305,%r418,%r137; ld.u64 %r306,[%r156]; st.u64 [%r305],%r306; .loc 1 267 18 add.u64 %r308,%r419,%r137; .loc 1 267 20 ld.u64 %r310,[%r141]; mul.lo.u64 %r309,%r310,%r36; .loc 1 267 18 st.u64 [%r308],%r309; .loc 1 268 19 ld.u64 %r312,[%r156+16]; add.u64 %r311,%r312,1; ld.u64 %r313,[%r156+8]; sub.u64 %r62,%r311,%r313; .loc 1 270 10 setp.lt.s64 %r314,%r62,0; @ %r314 bra $L65; .loc 1 268 17 add.u64 %r316,%r416,%r137; st.u64 [%r316],%r62; bra $L66; $L65: .loc 1 271 12 add.u64 %r318,%r416,%r137; st.u64 [%r318],%r430; $L66: .loc 1 264 3 add.u64 %r156,%r156,24; add.u64 %r141,%r141,24; add.u64 %r137,%r137,8; setp.ne.u64 %r320,%r84,%r137; @ %r320 bra $L67; bra $L114; $L63: .loc 1 278 7 setp.le.s64 %r321,%r147,0; @ %r321 bra $L69; add.u64 %r416,%frame,360; $L90: add.u64 %r179,%r204,40; mov.u64 %r176,%r416; add.u64 %r322,%r204,16; cvt.u32.u32 %r324,%r22; cvt.s64.s8 %r323,%r324; add.u64 %r326,%r323,%r323; add.u64 %r327,%r326,%r323; shl.b64 %r328,%r327,3; add.u64 %r169,%r322,%r328; .loc 1 281 10 mov.u64 %r140,1; .loc 1 285 4 mov.u64 %r351,0; bra $L70; $L69: .loc 1 289 24 mov.u64 %r329,0; st.u64 [%r204+8],%r329; .loc 1 290 28 cvt.u16.u32 %r332,%r22; add.u16 %r331,%r332,-1; cvt.u32.u16 %r333,%r331; st.u8 [%r204+28],%r333; .loc 1 292 20 add.u32 %r334,%r22,-2; cvt.s64.s32 %r72,%r334; add.u64 %r336,%r72,%r72; add.u64 %r337,%r336,%r72; shl.b64 %r338,%r337,3; add.u64 %r339,%r204,%r338; .loc 1 292 67 shl.b64 %r341,%r72,3; add.u64 %r342,%frame,%r341; .loc 1 292 59 ld.u64 %r344,[%r339+40]; ld.u64 %r345,[%r342+360]; mul.lo.u64 %r154,%r344,%r345; .loc 1 294 29 mov.u64 %r3471547; call (%value_in),_gfortrani_xmallocarray8,[%value_in]; } .loc 1 294 27 st.u64 [%r204],%r348; .loc 1 295 10 setp.eq.u64 %r350,%r154,0; @ ! 
%r350 bra $L74; bra $L71; $L73: .loc 1 283 48 mul.lo.u64 %r140,%r68,%r140; $L70: .loc 1 285 4 st.u64 [%r179+8],%r351; ld.u64 %r68,[%r176]; add.u64 %r352,%r68,-1; st.u64 [%r179+16],%r352; st.u64 [%r179],%r140; .loc 1 278 7 add.u64 %r179,%r179,24; add.u64 %r176,%r176,8; setp.ne.u64 %r353,%r169,%r179; @ %r353 bra $L73; bra $L69; $L71: .loc 1 298 4 st.u64 [%r204+48],%r154; mov.u64 %r355,-1; st.u64 [%r204+56],%r355; mov.u64 %r356,1; st.u64 [%r204+40],%r356; .loc 1 299 4 bra $L52; $L64: .loc 1 304 19 ld.s8 %r79,[%r204+28]; .loc 1 304 10 setp.eq.u64 %r357,%r79,%r147; @ %r357 bra $L76; .loc 1 305 2 st.u64 [%stack+8],%r147; st.u64 [%stack],%r79;_gfortran_runtime_error76: .loc 1 310 11 cvta.global.u64 %r360,_gfortrani_compile_options; .loc 1 310 10 ld.u32 %r361,[%r360+36]; setp.eq.u32 %r362,%r361,0; @ %r362 bra $L74; .loc 1 311 2 add.u64 %r416,%frame,360; cvta.const.u64 %r366,$LC2r416366; call _gfortrani_bounds_ifunction_return$L74: .loc 1 315 3 setp.gt.s64 %r368,%r147,0; @ %r368 bra $L77; $L80: .loc 1 235 9 add.u64 %r369,%r31,1; .loc 1 235 7 sub.u64 %r149,%r369,%r33; max.s64 %r65,%r149,0; .loc 1 240 12 mul.lo.u64 %r37,%r34,%r36; .loc 1 323 8 ld.u64 %r164,[%r204]; .loc 1 326 8 ld.u64 %r163,[%r205]; .loc 1 351 35 shl.b64 %r100,%r151,2; .loc 1 336 22 add.u64 %r64,%r65,-1; .loc 1 336 15 mul.lo.u64 %r89,%r64,%r100; .loc 1 337 29 mul.lo.u64 %r177,%r37,%r64; .loc 1 338 34 neg.s64 %r95,%r100; .loc 1 363 22 ld.u64 %r102,[%frame+240]; .loc 1 363 12 shl.b64 %r104,%r102,2; .loc 1 364 23 ld.u64 %r105,[%frame+120]; .loc 1 365 22 ld.u64 %r107,[%frame]; .loc 1 365 12 shl.b64 %r109,%r107,3; .loc 1 362 12 ld.u64 %r203,[%frame+480]; setp.ne.u32 %r417,%r209,0; add.u64 %r415,%frame,488; .loc 1 338 4 setp.le.s64 %r423,%r149,0; .loc 1 382 23 add.u64 %r424,%frame,240; .loc 1 367 32 add.u64 %r425,%frame,360; .loc 1 371 20 add.u64 %r426,%frame,120; bra $L78; $L77: add.u64 %r198,%frame,480; add.u64 %r197,%r204,40; mov.u64 %r195,%frame; cvt.u32.u32 %r371,%r22; cvt.s64.s8 %r370,%r371; add.u64 %r181,%r370,-1; .loc 1 315 3 mov.u64 %r165,0; add.u64 %r416,%frame,360; .loc 1 317 16 mov.u64 %r372,%r165; $L79: st.u64 [%r198],%r372; .loc 1 318 18 ld.u64 %r373,[%r197]; st.u64 [%r195],%r373; .loc 1 319 17 shl.b64 %r375,%r165,3; add.u64 %r376,%r416,%r375; .loc 1 319 10 ld.u64 %r377,[%r376]; setp.le.s64 %r378,%r377,0; @ %r378 bra $L52; .loc 1 315 26 add.u64 %r165,%r165,1; .loc 1 315 3 add.u64 %r198,%r198,8; add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; setp.ne.u64 %r379,%r165,%r181; @ %r379 bra $L79; bra $L80; $L78: .loc 1 334 10 @ %r417 bra $L81; .loc 1 351 4 setp.gt.s64 %r381,%r149,0; @ %r381 bra $L91; .loc 1 333 14 mov.u64 %r158,0; bra $L83; $L81: .loc 1 336 8 add.u64 %r29,%r163,%r89; .loc 1 337 9 add.u64 %r159,%r160,%r177; .loc 1 338 4 @ %r423 bra $L92; .loc 1 338 11 mov.u64 %r158,%r65; $L85: .loc 1 340 11 ld.s8 %r384,[%r159]; cvt.u16.u32 %r383,%r384; setp.eq.u16 %r385,%r383,0; @ %r385 bra $L84; .loc 1 340 18 ld.u32 %r386,[%r29]; setp.eq.u32 %r387,%r386,%r206; @ %r387 bra $L83; $L84: .loc 1 338 26 add.u64 %r158,%r158,-1; .loc 1 338 34 add.u64 %r29,%r29,%r95; .loc 1 338 53 sub.u64 %r159,%r159,%r37; .loc 1 338 4 setp.ne.u64 %r388,%r158,0; @ %r388 bra $L85; bra $L83; $L91: .loc 1 351 4 mov.u64 %r157,%r160; mov.u64 %r111,%r163; .loc 1 351 11 mov.u64 %r158,1; $L82: .loc 1 353 11 ld.s8 %r390,[%r157]; cvt.u16.u32 %r389,%r390; setp.eq.u16 %r391,%r389,0; @ %r391 bra $L86; .loc 1 353 18 ld.u32 %r392,[%r111]; setp.eq.u32 %r393,%r392,%r206; @ %r393 bra $L83; $L86: .loc 1 351 27 add.u64 %r158,%r158,1; .loc 1 351 35 add.u64 %r111,%r111,%r100; .loc 
1 351 54 add.u64 %r157,%r157,%r37; .loc 1 351 4 setp.ge.s64 %r394,%r65,%r158; @ %r394 bra $L82; .loc 1 333 14 mov.u64 %r158,0; bra $L83; $L92: mov.u64 %r158,0; $L83: .loc 1 360 13 st.u64 [%r164],%r158; .loc 1 362 15 add.u64 %r203,%r203,1; st.u64 [%frame+480],%r203; .loc 1 363 12 add.u64 %r163,%r163,%r104; .loc 1 364 13 add.u64 %r160,%r160,%r105; .loc 1 365 12 add.u64 %r164,%r164,%r109; .loc 1 367 32 ld.u64 %r130,[%frame+360]; .loc 1 367 13 setp.ne.u64 %r395,%r203,%r130; @ %r395 bra $L78; mov.u64 %r126,%r415; mov.u64 %r127,%r107; mov.u64 %r202,%r105; mov.u64 %r124,%r102; mov.u64 %r120,8; .loc 1 366 9 mov.u64 %r162,0; .loc 1 369 13 mov.u64 %r397,%r162; $L89: st.u64 [%r126+-8],%r397; .loc 1 370 23 mul.lo.u64 %r113,%r130,%r124; .loc 1 371 24 mul.lo.u64 %r398,%r130,%r202; .loc 1 371 10 sub.u64 %r160,%r160,%r398; .loc 1 372 23 mul.lo.u64 %r119,%r130,%r127; .loc 1 373 5 add.u64 %r162,%r162,1; .loc 1 374 7 setp.le.s64 %r399,%r147,%r162; @ %r399 bra $L52; .loc 1 381 16 ld.u64 %r400,[%r126]; add.u64 %r123,%r400,1; st.u64 [%r126],%r123; .loc 1 382 23 add.u64 %r402,%r424,%r120; ld.u64 %r124,[%r402]; .loc 1 382 13 sub.u64 %r403,%r124,%r113; shl.b64 %r404,%r403,2; add.u64 %r163,%r163,%r404; .loc 1 383 23 add.u64 %r405,%frame,%r120; ld.u64 %r127,[%r405]; .loc 1 383 13 sub.u64 %r406,%r127,%r119; shl.b64 %r407,%r406,3; add.u64 %r164,%r164,%r407; .loc 1 367 32 add.u64 %r409,%r425,%r120; ld.u64 %r130,[%r409]; .loc 1 367 13 add.u64 %r126,%r126,8; setp.eq.u64 %r410,%r123,%r130; @ %r410 bra $L88; .loc 1 362 12 ld.u64 %r203,[%frame+480]; bra $L78; $L88: .loc 1 371 20 add.u64 %r412,%r426,%r120; ld.u64 %r202,[%r412]; add.u64 %r120,%r120,8; bra $L89; $L114: .loc 1 274 6 ld.u64 %r413,[%r204]; setp.eq.u64 %r414,%r413,0; @ ! %r414 bra $L64; bra $L90; $L52_gfortran_sfindloc1_i4 .visible .func _gfortran_sfindloc1_i468u64 %r126; .reg .u64u32pred %r190; .reg .pred %r192; .reg .u64pred %r213; .reg .predu32predpredpred %r257; .reg .u64 %r260; .reg .u64 %r261; .reg .pred %r263; .reg .u64 %r264; .reg .u32 %r265; .reg .u64 %r266; .reg .u6464predmov.u64 %r149,%ar0; mov.u64 %r150,%ar1; mov.u32 %r151,%ar2; mov.u64 %r152,%ar3; mov.u64 %r153,%ar4; mov.u32 %r154,%ar5; .loc 1 410 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L116; .loc 1 410 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L117; $L116:call _gfortran_findloc1_i4); } .loc 1 413 7 bra $L115; $L117: .loc 1 416 10 ld.s8 %r23,[%r150+28]; .loc 1 416 38 add.u32 %r163,%r23,-1; .loc 1 416 8 cvt.s64.s32 %r90,%r163; .loc 1 417 10 ld.u64 %r26,[%r152]; .loc 1 417 7 add.u64 %r91,%r26,-1; .loc 1 419 7 shr.u64 %r165,%r91,63; set.u32.lt.s64 %r167,%r90,%r91; neg.s32 %r168,%r167; cvt.u16.u64 %r171,%r165; cvt.u16.u32 %r172,%r168; or.b16 %r170,%r171,%r172; .loc 1 419 6 cvt.u32.u16 %r173,%r170; cvt.u16.u8 %r174,%r173; setp.ne.u16 %r175,%r174,0; @ %r175 bra $L119; .loc 1 430 3 setp.ne.u64 %r176,%r91,0; @ %r176 bra $L120; $L126: .loc 1 438 3 setp.gt.s64 %r177,%r90,%r91; @ %r177 bra $L121; bra $L163; $L119: .loc 1 421 7 cvt.u32.u32 %r180,%r23; cvt.s64.s8 %r179,%r180; st.u64 [%stack+8],%r179_gfortran_runtime_error120: add.u64 %r136,%r150,48; add.u64 %r138,%frame,120; add.u64 %r139,%r150,56; add.u64 %r182,%r150,24; add.u64 %r184,%r26,%r26; add.u64 %r185,%r184,%r26; shl.b64 %r186,%r185,3; add.u64 %r145,%r182,%r186; .loc 1 435 12 mov.u64 %r302,0; $L125: .loc 1 432 19 ld.u64 %r188,[%r139]; add.u64 %r187,%r188,1; ld.u64 %r189,[%r136]; sub.u64 %r35,%r187,%r189; .loc 1 434 10 setp.le.s64 %r190,%r35,0; @ %r190 bra $L123; .loc 1 432 17 st.u64 [%r138],%r35; bra $L124; $L123: .loc 1 435 12 
st.u64 [%r138],%r302; $L124: .loc 1 430 3 add.u64 %r136,%r136,24; add.u64 %r138,%r138,8; add.u64 %r139,%r139,24; setp.ne.u64 %r192,%r136,%r145; @ %r192 bra $L125; bra $L126; $L163: .loc 1 448 6 ld.u64 %r193,[%r149]; setp.eq.u64 %r194,%r193,0; @ ! %r194 bra $L128; bra $L127; $L121: add.u64 %r196,%r26,%r26; add.u64 %r197,%r196,%r26; shl.b64 %r198,%r197,3; add.u64 %r199,%r198,48; add.u64 %r125,%r150,%r199; add.u64 %r294,%frame,120; shl.b64 %r201,%r26,3; add.u64 %r202,%r201,-8; add.u64 %r45,%r294,%r202; add.u64 %r203,%r150,48; cvt.u32.u32 %r205,%r23; cvt.s64.s8 %r204,%r205; add.u64 %r207,%r204,%r204; add.u64 %r208,%r207,%r204; shl.b64 %r209,%r208,3; add.u64 %r132,%r203,%r209; .loc 1 444 12 mov.u64 %r301,0; $L131: .loc 1 441 2 ld.u64 %r211,[%r125+8]; add.u64 %r210,%r211,1; ld.u64 %r212,[%r125]; sub.u64 %r40,%r210,%r212; .loc 1 443 10 setp.le.s64 %r213,%r40,0; @ %r213 bra $L129; .loc 1 440 17 st.u64 [%r45],%r40; bra $L130; $L129: .loc 1 444 12 st.u64 [%r45],%r301; $L130: .loc 1 438 3 add.u64 %r125,%r125,24; add.u64 %r45,%r45,8; setp.ne.u64 %r215,%r125,%r132; @ %r215 bra $L131; bra $L164; $L127: .loc 1 452 7 setp.le.s64 %r216,%r90,0; @ %r216 bra $L133; add.u64 %r294,%frame,120; $L149: add.u64 %r78,%r149,40; mov.u64 %r61,%r294; cvt.u32.u32 %r218,%r23; cvt.s64.s8 %r217,%r218; add.u64 %r220,%r217,%r217; add.u64 %r221,%r220,%r217; shl.b64 %r222,%r221,3; add.u64 %r223,%r149,16; add.u64 %r126,%r222,%r223; .loc 1 455 10 mov.u64 %r81,1; .loc 1 459 4 mov.u64 %r246,0; bra $L134; $L133: .loc 1 462 24 mov.u64 %r224,0; st.u64 [%r149+8],%r224; .loc 1 463 28 cvt.u16.u32 %r227,%r23; add.u16 %r226,%r227,-1; cvt.u32.u16 %r228,%r226; st.u8 [%r149+28],%r228; .loc 1 465 20 add.u32 %r229,%r23,-2; cvt.s64.s32 %r51,%r229; add.u64 %r231,%r51,%r51; add.u64 %r232,%r231,%r51; shl.b64 %r233,%r232,3; add.u64 %r234,%r149,%r233; .loc 1 465 67 shl.b64 %r236,%r51,3; add.u64 %r237,%frame,%r236; .loc 1 465 59 ld.u64 %r239,[%r234+40]; ld.u64 %r240,[%r237+120]; mul.lo.u64 %r94,%r239,%r240; .loc 1 467 29 mov.u64 %r24242; call (%value_in),_gfortrani_xmallocarray3,[%value_in]; } .loc 1 467 27 st.u64 [%r149],%r243; .loc 1 468 10 setp.eq.u64 %r245,%r94,0; @ ! 
%r245 bra $L138; bra $L135; $L137: .loc 1 457 48 mul.lo.u64 %r81,%r47,%r81; $L134: .loc 1 459 4 st.u64 [%r78+8],%r246; ld.u64 %r47,[%r61]; add.u64 %r247,%r47,-1; st.u64 [%r78+16],%r247; st.u64 [%r78],%r81; .loc 1 452 7 add.u64 %r78,%r78,24; add.u64 %r61,%r61,8; setp.ne.u64 %r248,%r78,%r126; @ %r248 bra $L137; bra $L133; $L135: .loc 1 471 4 st.u64 [%r149+48],%r94; mov.u64 %r250,-1; st.u64 [%r149+56],%r250; mov.u64 %r251,1; st.u64 [%r149+40],%r251; .loc 1 472 4 bra $L115; $L128: .loc 1 477 19 ld.s8 %r58,[%r149+28]; .loc 1 477 10 setp.eq.u64 %r252,%r58,%r90; @ %r252 bra $L139; .loc 1 478 2 st.u64 [%stack+8],%r90; st.u64 [%stack],%r58;_gfortran_runtime_error139: .loc 1 483 11 cvta.global.u64 %r255,_gfortrani_compile_options; .loc 1 483 10 ld.u32 %r256,[%r255+36]; setp.eq.u32 %r257,%r256,0; @ %r257 bra $L138; .loc 1 484 2 add.u64 %r294,%frame,120; cvta.const.u64 %r261,$LC2261; call _gfortrani_bounds_ifunction_return$L138: .loc 1 488 3 setp.gt.s64 %r263,%r90,0; @ %r263 bra $L140; $L143: .loc 1 495 8 ld.u64 %r97,[%r149]; ld.u64 %r42,[%frame+240]; .loc 1 503 22 ld.u64 %r64,[%frame]; .loc 1 503 12 shl.b64 %r66,%r64,3; setp.le.s64 %r295,%r90,1; cvt.u32.u32 %r297,%r23; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; add.u64 %r300,%frame,120; bra $L141; $L140: add.u64 %r114,%frame,240; add.u64 %r113,%r149,40; mov.u64 %r108,%frame; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r79,%r264,-1; .loc 1 488 3 mov.u64 %r98,0; add.u64 %r294,%frame,120; .loc 1 490 16 mov.u64 %r266,%r98; $L142: st.u64 [%r114],%r266; .loc 1 491 18 ld.u64 %r267,[%r113]; st.u64 [%r108],%r267; .loc 1 492 17 shl.b64 %r269,%r98,3; add.u64 %r270,%r294,%r269; .loc 1 492 10 ld.u64 %r271,[%r270]; setp.le.s64 %r272,%r271,0; @ %r272 bra $L115; .loc 1 488 26 add.u64 %r98,%r98,1; .loc 1 488 3 add.u64 %r114,%r114,8; add.u64 %r113,%r113,24; add.u64 %r108,%r108,8; setp.ne.u64 %r273,%r79,%r98; @ %r273 bra $L142; bra $L143; $L141: .loc 1 500 13 mov.u64 %r274,0; st.u64 [%r97],%r274; .loc 1 502 15 add.u64 %r42,%r42,1; .loc 1 503 12 add.u64 %r97,%r97,%r66; .loc 1 505 32 ld.u64 %r88,[%frame+120]; .loc 1 505 13 setp.ne.u64 %r275,%r42,%r88; @ %r275 bra $L141; .loc 1 508 23 mul.lo.u64 %r276,%r64,%r42; .loc 1 508 9 shl.b64 %r277,%r276,3; sub.u64 %r30,%r97,%r277; .loc 1 510 7 @ ! %r295 bra $L165; bra $L115; $L148: .loc 1 507 13 st.u64 [%r102],%r274; .loc 1 508 23 mul.lo.u64 %r280,%r73,%r72; .loc 1 508 9 shl.b64 %r281,%r280,3; sub.u64 %r30,%r97,%r281; .loc 1 509 5 add.u64 %r96,%r96,1; .loc 1 510 7 add.u64 %r102,%r102,8; setp.ne.u64 %r282,%r96,%r299; @ %r282 bra $L147; bra $L115; $L165: add.u64 %r102,%frame,248; .loc 1 509 5 mov.u64 %r96,1; $L147: .loc 1 517 16 ld.u64 %r286,[%r102]; add.u64 %r72,%r286,1; st.u64 [%r102],%r72; shl.b64 %r120,%r96,3; .loc 1 518 23 add.u64 %r287,%frame,%r120; ld.u64 %r73,[%r287]; .loc 1 518 13 shl.b64 %r288,%r73,3; add.u64 %r97,%r30,%r288; .loc 1 505 32 add.u64 %r290,%r300,%r120; ld.u64 %r77,[%r290]; .loc 1 505 13 setp.eq.u64 %r291,%r72,%r77; @ %r291 bra $L148; .loc 1 507 13 mov.u64 %r42,0; bra $L141; $L164: .loc 1 448 6 ld.u64 %r292,[%r149]; setp.eq.u64 %r293,%r292,0; @ ! 
%r293 bra $L128; bra $L149; $L115: .loc 1 522 1 ret; } findloc1_i8.o/_gfortran_findloc1_i8 .visible .func _gfortran_findloc1_fortran/generated/findloc1_i8.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc1_i8 .visible .func _gfortran_mfindloc1_i8GLOBAL FUNCTION DECL: _gfortran_sfindloc1_i8 .visible .func _gfortran_sfindloc1_i8GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_internal_error .extern .func _gfortrani_internal_error8] = {70,73,78,68,76,79,67,0 },105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,1070,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,914,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,10114,101,116,117,114,110,32,118,97,108,11_gfortran_findloc1_i8 .visible .func _gfortran_findloc1_, .param .u32 %in_ar4) {48098predpred %r238; .reg .u64pred %r260; .reg .pred %r262; .reg .pred %r263; .reg .u64 %r264; .reg .u32 %r265; .reg .u64u64 %r279; .reg .u64 %r280; .reg .u6464predpred %r330; .reg .pred %r331; .reg .pred %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u64u64 %r352; .reg .u64 %r354; .reg .predpred %r358; .reg .pred %r359; .reg .u64predmov.u64 %r185,%ar0; mov.u64 %r186,%ar1; mov.u64 %r187,%ar2; mov.u64 %r188,%ar3; mov.u32 %r189,%ar4; .loc 1 54 10 ld.s8 %r22,[%r186+28]; .loc 1 54 38 add.u32 %r190,%r22,-1; .loc 1 54 8 cvt.s64.s32 %r118,%r190; .loc 1 55 10 ld.u64 %r25,[%r188]; .loc 1 55 7 add.u64 %r119,%r25,-1; .loc 1 57 7 shr.u64 %r192,%r119,63; set.u32.lt.s64 %r194,%r118,%r119; neg.s32 %r195,%r194; cvt.u16.u64 %r198,%r192; cvt.u16.u32 %r199,%r195; or.b16 %r197,%r198,%r199; .loc 1 57 6 cvt.u32.u16 %r200,%r197; cvt.u16.u8 %r201,%r200; setp.eq.u16 %r202,%r201,0; @ %r202 bra $L2; .loc 1 59 7 cvt.u32.u32 %r205,%r22; cvt.s64.s8 %r204,%r205; st.u64 [%stack+8],%r20403203_gfortran_runtime_errorr208,%r119,%r119; add.u64 %r209,%r208,%r119; shl.b64 %r210,%r209,3; add.u64 %r211,%r186,%r210; ld.u64 %r31,[%r211+56]; ld.u64 %r33,[%r211+48]; .loc 1 67 9 ld.u64 %r122,[%r211+40]; .loc 1 69 3 setp.ne.u64 %r225,%r119,0; @ %r225 bra $L3; $L9: .loc 1 77 3 setp.gt.s64 %r226,%r118,%r119; @ %r226 bra $L4; bra $L50; $L3: add.u64 %r177,%r186,40; add.u64 %r67,%frame,120; add.u64 %r70,%frame,240; add.u64 %r228,%r25,%r25; add.u64 %r229,%r228,%r25; shl.b64 %r230,%r229,3; add.u64 %r231,%r186,16; add.u64 %r182,%r230,%r231; .loc 1 75 12 mov.u64 %r371,0; $L8: .loc 1 71 18 ld.u64 %r232,[%r177]; st.u64 [%r67],%r232; .loc 1 72 19 ld.u64 %r234,[%r177+16]; add.u64 %r233,%r234,1; ld.u64 %r235,[%r177+8]; sub.u64 %r38,%r233,%r235; .loc 1 74 10 setp.lt.s64 %r236,%r38,0; @ %r236 bra $L6; .loc 1 72 17 st.u64 [%r70],%r38; bra $L7; $L6: .loc 1 75 12 st.u64 [%r70],%r371; $L7: .loc 1 69 3 add.u64 %r177,%r177,24; add.u64 %r67,%r67,8; add.u64 %r70,%r70,8; setp.ne.u64 %r238,%r177,%r182; @ %r238 bra $L8; bra $L9; $L50: .loc 1 86 6 ld.u64 %r239,[%r185]; setp.eq.u64 %r240,%r239,0; @ ! 
%r240 bra $L11; bra $L10; $L4: add.u64 %r242,%r25,%r25; add.u64 %r243,%r242,%r25; shl.b64 %r244,%r243,3; add.u64 %r245,%r244,40; add.u64 %r115,%r186,%r245; shl.b64 %r246,%r25,3; add.u64 %r105,%r246,-8; add.u64 %r361,%frame,120; add.u64 %r108,%r361,%r105; add.u64 %r360,%frame,240; add.u64 %r62,%r360,%r105; add.u64 %r249,%r186,40; cvt.u32.u32 %r251,%r22; cvt.s64.s8 %r250,%r251; add.u64 %r253,%r250,%r250; add.u64 %r254,%r253,%r250; shl.b64 %r255,%r254,3; add.u64 %r180,%r249,%r255; .loc 1 83 12 mov.u64 %r370,0; $L14: .loc 1 79 18 ld.u64 %r256,[%r115]; st.u64 [%r108],%r256; .loc 1 80 19 ld.u64 %r258,[%r115+16]; add.u64 %r257,%r258,1; ld.u64 %r259,[%r115+8]; sub.u64 %r45,%r257,%r259; .loc 1 82 10 setp.lt.s64 %r260,%r45,0; @ %r260 bra $L12; .loc 1 80 17 st.u64 [%r62],%r45; bra $L13; $L12: .loc 1 83 12 st.u64 [%r62],%r370; $L13: .loc 1 77 3 add.u64 %r115,%r115,24; add.u64 %r108,%r108,8; add.u64 %r62,%r62,8; setp.ne.u64 %r262,%r115,%r180; @ %r262 bra $L14; bra $L51; $L10: .loc 1 90 7 setp.le.s64 %r263,%r118,0; @ %r263 bra $L16; add.u64 %r360,%frame,240; $L36: add.u64 %r153,%r185,40; mov.u64 %r149,%r360; cvt.u32.u32 %r265,%r22; cvt.s64.s8 %r264,%r265; add.u64 %r267,%r264,%r264; add.u64 %r268,%r267,%r264; shl.b64 %r269,%r268,3; add.u64 %r270,%r185,16; add.u64 %r116,%r269,%r270; .loc 1 93 10 mov.u64 %r112,1; .loc 1 97 4 mov.u64 %r293,0; bra $L17; $L16: .loc 1 101 24 mov.u64 %r271,0; st.u64 [%r185+8],%r271; .loc 1 102 28 cvt.u16.u32 %r274,%r22; add.u16 %r273,%r274,-1; cvt.u32.u16 %r275,%r273; st.u8 [%r185+28],%r275; .loc 1 104 20 add.u32 %r276,%r22,-2; cvt.s64.s32 %r54,%r276; add.u64 %r278,%r54,%r54; add.u64 %r279,%r278,%r54; shl.b64 %r280,%r279,3; add.u64 %r281,%r185,%r280; .loc 1 104 67 shl.b64 %r283,%r54,3; add.u64 %r284,%frame,%r283; .loc 1 104 59 ld.u64 %r286,[%r281+40]; ld.u64 %r287,[%r284+240]; mul.lo.u64 %r124,%r286,%r287; .loc 1 106 29 mov.u64 %r289189; call (%value_in),_gfortrani_xmallocarray290,[%value_in]; } .loc 1 106 27 st.u64 [%r185],%r290; .loc 1 107 10 setp.eq.u64 %r292,%r124,0; @ ! 
%r292 bra $L21; bra $L18; $L20: .loc 1 95 48 mul.lo.u64 %r112,%r50,%r112; $L17: .loc 1 97 4 st.u64 [%r153+8],%r293; ld.u64 %r50,[%r149]; add.u64 %r294,%r50,-1; st.u64 [%r153+16],%r294; st.u64 [%r153],%r112; .loc 1 90 7 add.u64 %r153,%r153,24; add.u64 %r149,%r149,8; setp.ne.u64 %r295,%r116,%r153; @ %r295 bra $L20; bra $L16; $L18: .loc 1 110 4 st.u64 [%r185+48],%r124; mov.u64 %r297,-1; st.u64 [%r185+56],%r297; mov.u64 %r298,1; st.u64 [%r185+40],%r298; .loc 1 111 4 bra $L1; $L11: .loc 1 116 19 ld.s8 %r60,[%r185+28]; .loc 1 116 10 setp.eq.u64 %r299,%r60,%r118; @ %r299 bra $L23; .loc 1 117 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r60;_gfortran_runtime_error23: .loc 1 122 11 cvta.global.u64 %r302,_gfortrani_compile_options; .loc 1 122 10 ld.u32 %r303,[%r302+36]; setp.eq.u32 %r304,%r303,0; @ %r304 bra $L21; .loc 1 123 2 add.u64 %r360,%frame,240; cvta.const.u64 %r308,$LC2r360307308; call _gfortrani_bounds_ifunction_return$L21: .loc 1 127 3 setp.gt.s64 %r310,%r118,0; @ %r310 bra $L24; $L27: .loc 1 64 9 add.u64 %r311,%r31,1; .loc 1 64 7 sub.u64 %r120,%r311,%r33; max.s64 %r43,%r120,0; .loc 1 135 8 ld.u64 %r111,[%r185]; .loc 1 138 8 ld.u64 %r132,[%r186]; .loc 1 160 35 shl.b64 %r74,%r122,3; .loc 1 147 22 add.u64 %r312,%r43,-1; .loc 1 147 15 mul.lo.u64 %r68,%r312,%r74; .loc 1 148 34 neg.s64 %r71,%r74; ld.u64 %r82,[%frame+360]; .loc 1 172 22 ld.u64 %r75,[%frame+120]; .loc 1 172 12 shl.b64 %r77,%r75,3; .loc 1 173 22 ld.u64 %r78,[%frame]; .loc 1 173 12 shl.b64 %r80,%r78,3; setp.ne.u32 %r359,%r189,0; setp.le.s64 %r358,%r118,1; .loc 1 148 4 setp.le.s64 %r363,%r120,0; .loc 1 160 4 setp.gt.s64 %r364,%r120,0; cvt.u32.u32 %r365,%r22; cvt.s64.s8 %r366,%r365; add.u64 %r367,%r366,-1; add.u64 %r368,%frame,120; add.u64 %r369,%frame,240; bra $L25; $L24: add.u64 %r170,%frame,360; add.u64 %r169,%r185,40; mov.u64 %r167,%frame; cvt.u32.u32 %r314,%r22; cvt.s64.s8 %r313,%r314; add.u64 %r154,%r313,-1; .loc 1 127 3 mov.u64 %r133,0; add.u64 %r360,%frame,240; .loc 1 129 16 mov.u64 %r315,%r133; $L26: st.u64 [%r170],%r315; .loc 1 130 18 ld.u64 %r316,[%r169]; st.u64 [%r167],%r316; .loc 1 131 17 shl.b64 %r318,%r133,3; add.u64 %r319,%r360,%r318; .loc 1 131 10 ld.u64 %r320,[%r319]; setp.le.s64 %r321,%r320,0; @ %r321 bra $L1; .loc 1 127 26 add.u64 %r133,%r133,1; .loc 1 127 3 add.u64 %r170,%r170,8; add.u64 %r169,%r169,24; add.u64 %r167,%r167,8; setp.ne.u64 %r322,%r133,%r154; @ %r322 bra $L26; bra $L27; $L25: .loc 1 145 10 @ %r359 bra $L28; .loc 1 160 4 @ %r364 bra $L37; .loc 1 144 14 mov.u64 %r126,0; bra $L30; $L28: .loc 1 147 8 add.u64 %r127,%r132,%r68; .loc 1 148 4 @ %r363 bra $L38; .loc 1 148 11 mov.u64 %r126,%r43; $L31: .loc 1 150 11 ld.u64 %r326,[%r127]; setp.eq.u64 %r327,%r326,%r187; @ %r327 bra $L30; .loc 1 148 26 add.u64 %r126,%r126,-1; .loc 1 148 34 add.u64 %r127,%r127,%r71; .loc 1 148 4 setp.ne.u64 %r328,%r126,0; @ %r328 bra $L31; bra $L30; $L37: .loc 1 160 4 mov.u64 %r125,%r132; .loc 1 160 11 mov.u64 %r126,1; $L29: .loc 1 162 11 ld.u64 %r329,[%r125]; setp.eq.u64 %r330,%r329,%r187; @ %r330 bra $L30; .loc 1 160 27 add.u64 %r126,%r126,1; .loc 1 160 35 add.u64 %r125,%r125,%r74; .loc 1 160 4 setp.ge.s64 %r331,%r43,%r126; @ %r331 bra $L29; .loc 1 144 14 mov.u64 %r126,0; bra $L30; $L38: mov.u64 %r126,0; $L30: .loc 1 169 13 st.u64 [%r111],%r126; .loc 1 171 15 add.u64 %r82,%r82,1; .loc 1 172 12 add.u64 %r132,%r132,%r77; .loc 1 173 12 add.u64 %r111,%r111,%r80; .loc 1 175 32 ld.u64 %r141,[%frame+240]; .loc 1 175 13 setp.ne.u64 %r332,%r82,%r141; @ %r332 bra $L25; .loc 1 178 23 mul.lo.u64 %r333,%r75,%r82; .loc 1 178 9 shl.b64 
%r334,%r333,3; sub.u64 %r129,%r132,%r334; .loc 1 179 23 mul.lo.u64 %r335,%r78,%r82; .loc 1 179 9 shl.b64 %r336,%r335,3; sub.u64 %r130,%r111,%r336; .loc 1 181 7 @ %r358 bra $L1; add.u64 %r66,%frame,368; mov.u64 %r159,8; .loc 1 180 5 mov.u64 %r131,1; .loc 1 177 13 mov.u64 %r362,0; bra $L34; $L35: st.u64 [%r66],%r362; .loc 1 178 23 mul.lo.u64 %r342,%r95,%r94; .loc 1 178 9 shl.b64 %r343,%r342,3; sub.u64 %r129,%r132,%r343; .loc 1 179 23 mul.lo.u64 %r344,%r98,%r94; .loc 1 179 9 shl.b64 %r345,%r344,3; sub.u64 %r130,%r111,%r345; .loc 1 180 5 add.u64 %r131,%r131,1; .loc 1 181 7 add.u64 %r66,%r66,8; add.u64 %r159,%r159,8; setp.eq.u64 %r346,%r131,%r367; @ %r346 bra $L1; $L34: .loc 1 188 16 ld.u64 %r347,[%r66]; add.u64 %r94,%r347,1; st.u64 [%r66],%r94; .loc 1 189 23 add.u64 %r349,%r368,%r159; ld.u64 %r95,[%r349]; .loc 1 189 13 shl.b64 %r350,%r95,3; add.u64 %r132,%r129,%r350; .loc 1 190 23 add.u64 %r351,%frame,%r159; ld.u64 %r98,[%r351]; .loc 1 190 13 shl.b64 %r352,%r98,3; add.u64 %r111,%r130,%r352; .loc 1 175 32 add.u64 %r354,%r369,%r159; ld.u64 %r102,[%r354]; .loc 1 175 13 setp.eq.u64 %r355,%r94,%r102; @ %r355 bra $L35; .loc 1 177 13 mov.u64 %r82,0; bra $L25; $L51: .loc 1 86 6 ld.u64 %r356,[%r185]; setp.eq.u64 %r357,%r356,0; @ ! %r357 bra $L11; bra $L36; $L_gfortran_mfindloc1_i8 .visible .func _gfortran_mfindloc1_i8 {608u32 %r209; .reg .u32 %r210; .reg .u64 %r212; .reg .u32 %r214; .reg .u32 %r215; .reg .u16 %r217; .reg .u16 %r218; .reg .u16 %r219; .reg .u32 %r220; .reg .u16 %r221; .reg .predu64 %r230; .reg .u64 %r231; .reg .u64 %r249; .reg .u32u32u64 %r281; .reg .u64 %r282; .reg .u64 %r283; .reg .u64predu64u16 %r331; .reg .u16predpredu64 %r375; .reg .u64pred %r379; .reg .pred %r381; .reg .u16pred %r388; .reg .u16 %r389; .reg .u32predu64u64 %r430; .reg .u64 %r431; mov.u64 %r204,%ar0; mov.u64 %r205,%ar1; mov.u64 %r206,%ar2; mov.u64 %r207,%ar3; mov.u64 %r208,%ar4; mov.u32 %r209,%ar5; .loc 1 225 10 ld.s8 %r22,[%r205+28]; .loc 1 225 38 add.u32 %r210,%r22,-1; .loc 1 225 8 cvt.s64.s32 %r147,%r210; .loc 1 226 10 ld.u64 %r25,[%r207]; .loc 1 226 7 add.u64 %r148,%r25,-1; .loc 1 228 7 shr.u64 %r212,%r148,63; set.u32.lt.s64 %r214,%r147,%r148; neg.s32 %r215,%r214; cvt.u16.u64 %r218,%r212; cvt.u16.u32 %r219,%r215; or.b16 %r217,%r218,%r219; .loc 1 228 6 cvt.u32.u16 %r220,%r217; cvt.u16.u8 %r221,%r220; setp.eq.u16 %r222,%r221,0; @ %r222 bra $L53; .loc 1 230 7 cvt.u32.u32 %r225,%r22; cvt.s64.s8 %r224,%r225; st.u64 [%stack+8],%r224232stack; call _gfortran_runtime_error3: .loc 1 235 9 add.u64 %r228,%r148,%r148; add.u64 %r229,%r228,%r148; shl.b64 %r230,%r229,3; add.u64 %r231,%r205,%r230; ld.u64 %r31,[%r231+56]; ld.u64 %r33,[%r231+48]; .loc 1 239 9 ld.u64 %r151,[%r231+40]; .loc 1 240 12 add.u64 %r249,%r208,%r230; ld.u64 %r34,[%r249+40]; ld.u64 %r36,[%r208+16]; .loc 1 242 9 ld.u64 %r160,[%r208]; .loc 1 246 22 cvt.u32.u64 %r39,%r36; .loc 1 246 53 add.u32 %r251,%r39,-4; and.b32 %r252,%r251,-5; set.u32.eq.u32 %r254,%r252,0; neg.s32 %r255,%r254; .loc 1 246 22 add.u32 %r256,%r39,-1; set.u32.le.u32 %r258,%r256,1; neg.s32 %r259,%r258; .loc 1 246 6 cvt.u16.u32 %r261,%r255; cvt.u16.u32 %r262,%r259; or.b16 %r260,%r261,%r262; cvt.u32.u16 %r263,%r260; cvt.u16.u8 %r264,%r263; setp.eq.u16 %r265,%r264,0; @ %r265 bra $L54; $L58: .loc 1 255 3 setp.ne.u64 %r266,%r148,0; @ %r266 bra $L55; $L62: .loc 1 264 3 setp.gt.s64 %r267,%r147,%r148; @ %r267 bra $L56; bra $L113; $L54: .loc 1 248 7 setp.eq.u32 %r269,%r39,16; @ %r269 bra $L58; .loc 1 253 5 cvta.const.u64 %r271,$LC4; mov.u64 %r270,27271; call _gfortrani_internal_error5: add.u64 
%r146,%r205,40; add.u64 %r167,%r208,40; add.u64 %r273,%r25,%r25; add.u64 %r274,%r273,%r25; shl.b64 %r275,%r274,3; add.u64 %r276,%r205,16; add.u64 %r101,%r275,%r276; .loc 1 255 3 mov.u64 %r200,0; add.u64 %r418,%frame,240; add.u64 %r419,%frame,120; add.u64 %r416,%frame,360; .loc 1 262 12 mov.u64 %r431,%r200; $L61: .loc 1 257 18 add.u64 %r278,%r418,%r200; ld.u64 %r279,[%r146]; st.u64 [%r278],%r279; .loc 1 258 18 add.u64 %r281,%r419,%r200; .loc 1 258 20 ld.u64 %r283,[%r167]; mul.lo.u64 %r282,%r283,%r36; .loc 1 258 18 st.u64 [%r281],%r282; .loc 1 259 19 ld.u64 %r285,[%r146+16]; add.u64 %r284,%r285,1; ld.u64 %r286,[%r146+8]; sub.u64 %r53,%r284,%r286; .loc 1 261 10 setp.lt.s64 %r287,%r53,0; @ %r287 bra $L59; .loc 1 259 17 add.u64 %r289,%r416,%r200; st.u64 [%r289],%r53; bra $L60; $L59: .loc 1 262 12 add.u64 %r291,%r416,%r200; st.u64 [%r291],%r431; $L60: .loc 1 255 3 add.u64 %r146,%r146,24; add.u64 %r167,%r167,24; add.u64 %r200,%r200,8; setp.ne.u64 %r293,%r101,%r146; @ %r293 bra $L61; bra $L62; $L113: .loc 1 274 6 ld.u64 %r294,[%r204]; setp.eq.u64 %r295,%r294,0; @ ! %r295 bra $L64; bra $L63; $L56: add.u64 %r297,%r25,%r25; add.u64 %r298,%r297,%r25; shl.b64 %r299,%r298,3; add.u64 %r143,%r299,40; add.u64 %r156,%r205,%r143; add.u64 %r141,%r208,%r143; shl.b64 %r300,%r25,3; add.u64 %r137,%r300,-8; cvt.u32.u32 %r302,%r22; cvt.s64.s8 %r301,%r302; shl.b64 %r303,%r301,3; add.u64 %r84,%r303,-8; add.u64 %r418,%frame,240; add.u64 %r419,%frame,120; add.u64 %r416,%frame,360; .loc 1 271 12 mov.u64 %r430,0; $L67: .loc 1 266 18 add.u64 %r305,%r418,%r137; ld.u64 %r306,[%r156]; st.u64 [%r305],%r306; .loc 1 267 18 add.u64 %r308,%r419,%r137; .loc 1 267 20 ld.u64 %r310,[%r141]; mul.lo.u64 %r309,%r310,%r36; .loc 1 267 18 st.u64 [%r308],%r309; .loc 1 268 19 ld.u64 %r312,[%r156+16]; add.u64 %r311,%r312,1; ld.u64 %r313,[%r156+8]; sub.u64 %r62,%r311,%r313; .loc 1 270 10 setp.lt.s64 %r314,%r62,0; @ %r314 bra $L65; .loc 1 268 17 add.u64 %r316,%r416,%r137; st.u64 [%r316],%r62; bra $L66; $L65: .loc 1 271 12 add.u64 %r318,%r416,%r137; st.u64 [%r318],%r430; $L66: .loc 1 264 3 add.u64 %r156,%r156,24; add.u64 %r141,%r141,24; add.u64 %r137,%r137,8; setp.ne.u64 %r320,%r84,%r137; @ %r320 bra $L67; bra $L114; $L63: .loc 1 278 7 setp.le.s64 %r321,%r147,0; @ %r321 bra $L69; add.u64 %r416,%frame,360; $L90: add.u64 %r179,%r204,40; mov.u64 %r176,%r416; add.u64 %r322,%r204,16; cvt.u32.u32 %r324,%r22; cvt.s64.s8 %r323,%r324; add.u64 %r326,%r323,%r323; add.u64 %r327,%r326,%r323; shl.b64 %r328,%r327,3; add.u64 %r169,%r322,%r328; .loc 1 281 10 mov.u64 %r140,1; .loc 1 285 4 mov.u64 %r351,0; bra $L70; $L69: .loc 1 289 24 mov.u64 %r329,0; st.u64 [%r204+8],%r329; .loc 1 290 28 cvt.u16.u32 %r332,%r22; add.u16 %r331,%r332,-1; cvt.u32.u16 %r333,%r331; st.u8 [%r204+28],%r333; .loc 1 292 20 add.u32 %r334,%r22,-2; cvt.s64.s32 %r72,%r334; add.u64 %r336,%r72,%r72; add.u64 %r337,%r336,%r72; shl.b64 %r338,%r337,3; add.u64 %r339,%r204,%r338; .loc 1 292 67 shl.b64 %r341,%r72,3; add.u64 %r342,%frame,%r341; .loc 1 292 59 ld.u64 %r344,[%r339+40]; ld.u64 %r345,[%r342+360]; mul.lo.u64 %r154,%r344,%r345; .loc 1 294 29 mov.u64 %r3471547; call (%value_in),_gfortrani_xmallocarray8,[%value_in]; } .loc 1 294 27 st.u64 [%r204],%r348; .loc 1 295 10 setp.eq.u64 %r350,%r154,0; @ ! 
%r350 bra $L74; bra $L71; $L73: .loc 1 283 48 mul.lo.u64 %r140,%r68,%r140; $L70: .loc 1 285 4 st.u64 [%r179+8],%r351; ld.u64 %r68,[%r176]; add.u64 %r352,%r68,-1; st.u64 [%r179+16],%r352; st.u64 [%r179],%r140; .loc 1 278 7 add.u64 %r179,%r179,24; add.u64 %r176,%r176,8; setp.ne.u64 %r353,%r169,%r179; @ %r353 bra $L73; bra $L69; $L71: .loc 1 298 4 st.u64 [%r204+48],%r154; mov.u64 %r355,-1; st.u64 [%r204+56],%r355; mov.u64 %r356,1; st.u64 [%r204+40],%r356; .loc 1 299 4 bra $L52; $L64: .loc 1 304 19 ld.s8 %r79,[%r204+28]; .loc 1 304 10 setp.eq.u64 %r357,%r79,%r147; @ %r357 bra $L76; .loc 1 305 2 st.u64 [%stack+8],%r147; st.u64 [%stack],%r79;_gfortran_runtime_error76: .loc 1 310 11 cvta.global.u64 %r360,_gfortrani_compile_options; .loc 1 310 10 ld.u32 %r361,[%r360+36]; setp.eq.u32 %r362,%r361,0; @ %r362 bra $L74; .loc 1 311 2 add.u64 %r416,%frame,360; cvta.const.u64 %r366,$LC2r416366; call _gfortrani_bounds_ifunction_return$L74: .loc 1 315 3 setp.gt.s64 %r368,%r147,0; @ %r368 bra $L77; $L80: .loc 1 235 9 add.u64 %r369,%r31,1; .loc 1 235 7 sub.u64 %r149,%r369,%r33; max.s64 %r65,%r149,0; .loc 1 240 12 mul.lo.u64 %r37,%r34,%r36; .loc 1 323 8 ld.u64 %r164,[%r204]; .loc 1 326 8 ld.u64 %r163,[%r205]; .loc 1 351 35 shl.b64 %r100,%r151,3; .loc 1 336 22 add.u64 %r64,%r65,-1; .loc 1 336 15 mul.lo.u64 %r89,%r64,%r100; .loc 1 337 29 mul.lo.u64 %r177,%r37,%r64; .loc 1 338 34 neg.s64 %r95,%r100; .loc 1 363 22 ld.u64 %r102,[%frame+240]; .loc 1 363 12 shl.b64 %r104,%r102,3; .loc 1 364 23 ld.u64 %r105,[%frame+120]; .loc 1 365 22 ld.u64 %r107,[%frame]; .loc 1 365 12 shl.b64 %r109,%r107,3; .loc 1 362 12 ld.u64 %r203,[%frame+480]; setp.ne.u32 %r417,%r209,0; add.u64 %r415,%frame,488; .loc 1 338 4 setp.le.s64 %r423,%r149,0; .loc 1 382 23 add.u64 %r424,%frame,240; .loc 1 367 32 add.u64 %r425,%frame,360; .loc 1 371 20 add.u64 %r426,%frame,120; bra $L78; $L77: add.u64 %r198,%frame,480; add.u64 %r197,%r204,40; mov.u64 %r195,%frame; cvt.u32.u32 %r371,%r22; cvt.s64.s8 %r370,%r371; add.u64 %r181,%r370,-1; .loc 1 315 3 mov.u64 %r165,0; add.u64 %r416,%frame,360; .loc 1 317 16 mov.u64 %r372,%r165; $L79: st.u64 [%r198],%r372; .loc 1 318 18 ld.u64 %r373,[%r197]; st.u64 [%r195],%r373; .loc 1 319 17 shl.b64 %r375,%r165,3; add.u64 %r376,%r416,%r375; .loc 1 319 10 ld.u64 %r377,[%r376]; setp.le.s64 %r378,%r377,0; @ %r378 bra $L52; .loc 1 315 26 add.u64 %r165,%r165,1; .loc 1 315 3 add.u64 %r198,%r198,8; add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; setp.ne.u64 %r379,%r165,%r181; @ %r379 bra $L79; bra $L80; $L78: .loc 1 334 10 @ %r417 bra $L81; .loc 1 351 4 setp.gt.s64 %r381,%r149,0; @ %r381 bra $L91; .loc 1 333 14 mov.u64 %r158,0; bra $L83; $L81: .loc 1 336 8 add.u64 %r29,%r163,%r89; .loc 1 337 9 add.u64 %r159,%r160,%r177; .loc 1 338 4 @ %r423 bra $L92; .loc 1 338 11 mov.u64 %r158,%r65; $L85: .loc 1 340 11 ld.s8 %r384,[%r159]; cvt.u16.u32 %r383,%r384; setp.eq.u16 %r385,%r383,0; @ %r385 bra $L84; .loc 1 340 18 ld.u64 %r386,[%r29]; setp.eq.u64 %r387,%r386,%r206; @ %r387 bra $L83; $L84: .loc 1 338 26 add.u64 %r158,%r158,-1; .loc 1 338 34 add.u64 %r29,%r29,%r95; .loc 1 338 53 sub.u64 %r159,%r159,%r37; .loc 1 338 4 setp.ne.u64 %r388,%r158,0; @ %r388 bra $L85; bra $L83; $L91: .loc 1 351 4 mov.u64 %r157,%r160; mov.u64 %r111,%r163; .loc 1 351 11 mov.u64 %r158,1; $L82: .loc 1 353 11 ld.s8 %r390,[%r157]; cvt.u16.u32 %r389,%r390; setp.eq.u16 %r391,%r389,0; @ %r391 bra $L86; .loc 1 353 18 ld.u64 %r392,[%r111]; setp.eq.u64 %r393,%r392,%r206; @ %r393 bra $L83; $L86: .loc 1 351 27 add.u64 %r158,%r158,1; .loc 1 351 35 add.u64 %r111,%r111,%r100; .loc 
1 351 54 add.u64 %r157,%r157,%r37; .loc 1 351 4 setp.ge.s64 %r394,%r65,%r158; @ %r394 bra $L82; .loc 1 333 14 mov.u64 %r158,0; bra $L83; $L92: mov.u64 %r158,0; $L83: .loc 1 360 13 st.u64 [%r164],%r158; .loc 1 362 15 add.u64 %r203,%r203,1; st.u64 [%frame+480],%r203; .loc 1 363 12 add.u64 %r163,%r163,%r104; .loc 1 364 13 add.u64 %r160,%r160,%r105; .loc 1 365 12 add.u64 %r164,%r164,%r109; .loc 1 367 32 ld.u64 %r130,[%frame+360]; .loc 1 367 13 setp.ne.u64 %r395,%r203,%r130; @ %r395 bra $L78; mov.u64 %r126,%r415; mov.u64 %r127,%r107; mov.u64 %r202,%r105; mov.u64 %r124,%r102; mov.u64 %r120,8; .loc 1 366 9 mov.u64 %r162,0; .loc 1 369 13 mov.u64 %r397,%r162; $L89: st.u64 [%r126+-8],%r397; .loc 1 370 23 mul.lo.u64 %r113,%r130,%r124; .loc 1 371 24 mul.lo.u64 %r398,%r130,%r202; .loc 1 371 10 sub.u64 %r160,%r160,%r398; .loc 1 372 23 mul.lo.u64 %r119,%r130,%r127; .loc 1 373 5 add.u64 %r162,%r162,1; .loc 1 374 7 setp.le.s64 %r399,%r147,%r162; @ %r399 bra $L52; .loc 1 381 16 ld.u64 %r400,[%r126]; add.u64 %r123,%r400,1; st.u64 [%r126],%r123; .loc 1 382 23 add.u64 %r402,%r424,%r120; ld.u64 %r124,[%r402]; .loc 1 382 13 sub.u64 %r403,%r124,%r113; shl.b64 %r404,%r403,3; add.u64 %r163,%r163,%r404; .loc 1 383 23 add.u64 %r405,%frame,%r120; ld.u64 %r127,[%r405]; .loc 1 383 13 sub.u64 %r406,%r127,%r119; shl.b64 %r407,%r406,3; add.u64 %r164,%r164,%r407; .loc 1 367 32 add.u64 %r409,%r425,%r120; ld.u64 %r130,[%r409]; .loc 1 367 13 add.u64 %r126,%r126,8; setp.eq.u64 %r410,%r123,%r130; @ %r410 bra $L88; .loc 1 362 12 ld.u64 %r203,[%frame+480]; bra $L78; $L88: .loc 1 371 20 add.u64 %r412,%r426,%r120; ld.u64 %r202,[%r412]; add.u64 %r120,%r120,8; bra $L89; $L114: .loc 1 274 6 ld.u64 %r413,[%r204]; setp.eq.u64 %r414,%r413,0; @ ! %r414 bra $L64; bra $L90; $L52_gfortran_sfindloc1_i8 .visible .func _gfortran_sfindloc1_i8 {68u64 %r126; .reg .u64u32pred %r190; .reg .pred %r192; .reg .u64pred %r213; .reg .predu32predpredpred %r257; .reg .u64 %r260; .reg .u64 %r261; .reg .pred %r263; .reg .u64 %r264; .reg .u32 %r265; .reg .u64 %r266; .reg .u6464predmov.u64 %r149,%ar0; mov.u64 %r150,%ar1; mov.u64 %r151,%ar2; mov.u64 %r152,%ar3; mov.u64 %r153,%ar4; mov.u32 %r154,%ar5; .loc 1 410 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L116; .loc 1 410 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L117; $L116:call _gfortran_findloc1_i8); } .loc 1 413 7 bra $L115; $L117: .loc 1 416 10 ld.s8 %r23,[%r150+28]; .loc 1 416 38 add.u32 %r163,%r23,-1; .loc 1 416 8 cvt.s64.s32 %r90,%r163; .loc 1 417 10 ld.u64 %r26,[%r152]; .loc 1 417 7 add.u64 %r91,%r26,-1; .loc 1 419 7 shr.u64 %r165,%r91,63; set.u32.lt.s64 %r167,%r90,%r91; neg.s32 %r168,%r167; cvt.u16.u64 %r171,%r165; cvt.u16.u32 %r172,%r168; or.b16 %r170,%r171,%r172; .loc 1 419 6 cvt.u32.u16 %r173,%r170; cvt.u16.u8 %r174,%r173; setp.ne.u16 %r175,%r174,0; @ %r175 bra $L119; .loc 1 430 3 setp.ne.u64 %r176,%r91,0; @ %r176 bra $L120; $L126: .loc 1 438 3 setp.gt.s64 %r177,%r90,%r91; @ %r177 bra $L121; bra $L163; $L119: .loc 1 421 7 cvt.u32.u32 %r180,%r23; cvt.s64.s8 %r179,%r180; st.u64 [%stack+8],%r179_gfortran_runtime_error120: add.u64 %r136,%r150,48; add.u64 %r138,%frame,120; add.u64 %r139,%r150,56; add.u64 %r182,%r150,24; add.u64 %r184,%r26,%r26; add.u64 %r185,%r184,%r26; shl.b64 %r186,%r185,3; add.u64 %r145,%r182,%r186; .loc 1 435 12 mov.u64 %r302,0; $L125: .loc 1 432 19 ld.u64 %r188,[%r139]; add.u64 %r187,%r188,1; ld.u64 %r189,[%r136]; sub.u64 %r35,%r187,%r189; .loc 1 434 10 setp.le.s64 %r190,%r35,0; @ %r190 bra $L123; .loc 1 432 17 st.u64 [%r138],%r35; bra $L124; $L123: .loc 1 435 12 
st.u64 [%r138],%r302; $L124: .loc 1 430 3 add.u64 %r136,%r136,24; add.u64 %r138,%r138,8; add.u64 %r139,%r139,24; setp.ne.u64 %r192,%r136,%r145; @ %r192 bra $L125; bra $L126; $L163: .loc 1 448 6 ld.u64 %r193,[%r149]; setp.eq.u64 %r194,%r193,0; @ ! %r194 bra $L128; bra $L127; $L121: add.u64 %r196,%r26,%r26; add.u64 %r197,%r196,%r26; shl.b64 %r198,%r197,3; add.u64 %r199,%r198,48; add.u64 %r125,%r150,%r199; add.u64 %r294,%frame,120; shl.b64 %r201,%r26,3; add.u64 %r202,%r201,-8; add.u64 %r45,%r294,%r202; add.u64 %r203,%r150,48; cvt.u32.u32 %r205,%r23; cvt.s64.s8 %r204,%r205; add.u64 %r207,%r204,%r204; add.u64 %r208,%r207,%r204; shl.b64 %r209,%r208,3; add.u64 %r132,%r203,%r209; .loc 1 444 12 mov.u64 %r301,0; $L131: .loc 1 441 2 ld.u64 %r211,[%r125+8]; add.u64 %r210,%r211,1; ld.u64 %r212,[%r125]; sub.u64 %r40,%r210,%r212; .loc 1 443 10 setp.le.s64 %r213,%r40,0; @ %r213 bra $L129; .loc 1 440 17 st.u64 [%r45],%r40; bra $L130; $L129: .loc 1 444 12 st.u64 [%r45],%r301; $L130: .loc 1 438 3 add.u64 %r125,%r125,24; add.u64 %r45,%r45,8; setp.ne.u64 %r215,%r125,%r132; @ %r215 bra $L131; bra $L164; $L127: .loc 1 452 7 setp.le.s64 %r216,%r90,0; @ %r216 bra $L133; add.u64 %r294,%frame,120; $L149: add.u64 %r78,%r149,40; mov.u64 %r61,%r294; cvt.u32.u32 %r218,%r23; cvt.s64.s8 %r217,%r218; add.u64 %r220,%r217,%r217; add.u64 %r221,%r220,%r217; shl.b64 %r222,%r221,3; add.u64 %r223,%r149,16; add.u64 %r126,%r222,%r223; .loc 1 455 10 mov.u64 %r81,1; .loc 1 459 4 mov.u64 %r246,0; bra $L134; $L133: .loc 1 462 24 mov.u64 %r224,0; st.u64 [%r149+8],%r224; .loc 1 463 28 cvt.u16.u32 %r227,%r23; add.u16 %r226,%r227,-1; cvt.u32.u16 %r228,%r226; st.u8 [%r149+28],%r228; .loc 1 465 20 add.u32 %r229,%r23,-2; cvt.s64.s32 %r51,%r229; add.u64 %r231,%r51,%r51; add.u64 %r232,%r231,%r51; shl.b64 %r233,%r232,3; add.u64 %r234,%r149,%r233; .loc 1 465 67 shl.b64 %r236,%r51,3; add.u64 %r237,%frame,%r236; .loc 1 465 59 ld.u64 %r239,[%r234+40]; ld.u64 %r240,[%r237+120]; mul.lo.u64 %r94,%r239,%r240; .loc 1 467 29 mov.u64 %r24242; call (%value_in),_gfortrani_xmallocarray3,[%value_in]; } .loc 1 467 27 st.u64 [%r149],%r243; .loc 1 468 10 setp.eq.u64 %r245,%r94,0; @ ! 
%r245 bra $L138; bra $L135; $L137: .loc 1 457 48 mul.lo.u64 %r81,%r47,%r81; $L134: .loc 1 459 4 st.u64 [%r78+8],%r246; ld.u64 %r47,[%r61]; add.u64 %r247,%r47,-1; st.u64 [%r78+16],%r247; st.u64 [%r78],%r81; .loc 1 452 7 add.u64 %r78,%r78,24; add.u64 %r61,%r61,8; setp.ne.u64 %r248,%r78,%r126; @ %r248 bra $L137; bra $L133; $L135: .loc 1 471 4 st.u64 [%r149+48],%r94; mov.u64 %r250,-1; st.u64 [%r149+56],%r250; mov.u64 %r251,1; st.u64 [%r149+40],%r251; .loc 1 472 4 bra $L115; $L128: .loc 1 477 19 ld.s8 %r58,[%r149+28]; .loc 1 477 10 setp.eq.u64 %r252,%r58,%r90; @ %r252 bra $L139; .loc 1 478 2 st.u64 [%stack+8],%r90; st.u64 [%stack],%r58;_gfortran_runtime_error139: .loc 1 483 11 cvta.global.u64 %r255,_gfortrani_compile_options; .loc 1 483 10 ld.u32 %r256,[%r255+36]; setp.eq.u32 %r257,%r256,0; @ %r257 bra $L138; .loc 1 484 2 add.u64 %r294,%frame,120; cvta.const.u64 %r261,$LC2261; call _gfortrani_bounds_ifunction_return$L138: .loc 1 488 3 setp.gt.s64 %r263,%r90,0; @ %r263 bra $L140; $L143: .loc 1 495 8 ld.u64 %r97,[%r149]; ld.u64 %r42,[%frame+240]; .loc 1 503 22 ld.u64 %r64,[%frame]; .loc 1 503 12 shl.b64 %r66,%r64,3; setp.le.s64 %r295,%r90,1; cvt.u32.u32 %r297,%r23; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; add.u64 %r300,%frame,120; bra $L141; $L140: add.u64 %r114,%frame,240; add.u64 %r113,%r149,40; mov.u64 %r108,%frame; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r79,%r264,-1; .loc 1 488 3 mov.u64 %r98,0; add.u64 %r294,%frame,120; .loc 1 490 16 mov.u64 %r266,%r98; $L142: st.u64 [%r114],%r266; .loc 1 491 18 ld.u64 %r267,[%r113]; st.u64 [%r108],%r267; .loc 1 492 17 shl.b64 %r269,%r98,3; add.u64 %r270,%r294,%r269; .loc 1 492 10 ld.u64 %r271,[%r270]; setp.le.s64 %r272,%r271,0; @ %r272 bra $L115; .loc 1 488 26 add.u64 %r98,%r98,1; .loc 1 488 3 add.u64 %r114,%r114,8; add.u64 %r113,%r113,24; add.u64 %r108,%r108,8; setp.ne.u64 %r273,%r79,%r98; @ %r273 bra $L142; bra $L143; $L141: .loc 1 500 13 mov.u64 %r274,0; st.u64 [%r97],%r274; .loc 1 502 15 add.u64 %r42,%r42,1; .loc 1 503 12 add.u64 %r97,%r97,%r66; .loc 1 505 32 ld.u64 %r88,[%frame+120]; .loc 1 505 13 setp.ne.u64 %r275,%r42,%r88; @ %r275 bra $L141; .loc 1 508 23 mul.lo.u64 %r276,%r64,%r42; .loc 1 508 9 shl.b64 %r277,%r276,3; sub.u64 %r30,%r97,%r277; .loc 1 510 7 @ ! %r295 bra $L165; bra $L115; $L148: .loc 1 507 13 st.u64 [%r102],%r274; .loc 1 508 23 mul.lo.u64 %r280,%r73,%r72; .loc 1 508 9 shl.b64 %r281,%r280,3; sub.u64 %r30,%r97,%r281; .loc 1 509 5 add.u64 %r96,%r96,1; .loc 1 510 7 add.u64 %r102,%r102,8; setp.ne.u64 %r282,%r96,%r299; @ %r282 bra $L147; bra $L115; $L165: add.u64 %r102,%frame,248; .loc 1 509 5 mov.u64 %r96,1; $L147: .loc 1 517 16 ld.u64 %r286,[%r102]; add.u64 %r72,%r286,1; st.u64 [%r102],%r72; shl.b64 %r120,%r96,3; .loc 1 518 23 add.u64 %r287,%frame,%r120; ld.u64 %r73,[%r287]; .loc 1 518 13 shl.b64 %r288,%r73,3; add.u64 %r97,%r30,%r288; .loc 1 505 32 add.u64 %r290,%r300,%r120; ld.u64 %r77,[%r290]; .loc 1 505 13 setp.eq.u64 %r291,%r72,%r77; @ %r291 bra $L148; .loc 1 507 13 mov.u64 %r42,0; bra $L141; $L164: .loc 1 448 6 ld.u64 %r292,[%r149]; setp.eq.u64 %r293,%r292,0; @ ! 
[Tail of the preceding findloc1 object listing: closing branches and "$L115: .loc 1 522 1 ret; }".]

findloc1_i16.o  (compiled from "fortran/generated/findloc1_i16.c")
// BEGIN GLOBAL FUNCTION DECL: _gfortran_findloc1_i16   .visible .func _gfortran_findloc1_i16 (..., .param .u32 %in_ar4);
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc1_i16  .visible .func _gfortran_mfindloc1_i16;
// BEGIN GLOBAL FUNCTION DECL: _gfortran_sfindloc1_i16  .visible .func _gfortran_sfindloc1_i16;
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options    .extern .global .align 8 .u64 _gfortrani_compile_options[6];
// External functions referenced: _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_internal_error.
// Constant strings (decoded from the embedded byte arrays; truncations left as found): "FINDLOC", "...incorrect in FINDLOC intrinsic: is %ld, should be between 1 and %l...", "Funny sized logical arr...", "...ank of return array incorrect in FINDLOC intrinsic: is %ld, should be %l...", "return val...".
[Extraction-damaged PTX bodies of _gfortran_findloc1_i16, _gfortran_mfindloc1_i16 and _gfortran_sfindloc1_i16 omitted: .reg declarations, ".loc 1 ..." debug references to source lines 54-522, and branch labels; each 128-bit element comparison is performed as two 64-bit loads and compares.]
[Tail of the findloc1_i16.o listing: "$L121: .loc 1 522 1 ret; }".]

findloc1_r4.o  (compiled from "fortran/generated/findloc1_r4.c")
// BEGIN GLOBAL FUNCTION DECL: _gfortran_findloc1_r4   .visible .func _gfortran_findloc1_r4;
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc1_r4  .visible .func _gfortran_mfindloc1_r4;
// BEGIN GLOBAL FUNCTION DECL: _gfortran_sfindloc1_r4  .visible .func _gfortran_sfindloc1_r4;
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options   .extern .global .align 8 .u64 _gfortrani_compile_options[6];
// External functions referenced: _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_internal_error.
// Constant strings: same FINDLOC error-message set as in findloc1_i16.o above.
[Extraction-damaged PTX bodies of _gfortran_findloc1_r4, _gfortran_mfindloc1_r4 and _gfortran_sfindloc1_r4 omitted; the value argument is carried as .f32 (mov.f32 %ar2, ld.f32, setp.eq.f32) and element strides use 4-byte shifts (shl.b64 ...,2).]
[Tail of the findloc1_r4.o listing: "$L115: .loc 1 522 1 ret; }".]

findloc1_r8.o  (compiled from "fortran/generated/findloc1_r8.c")
// BEGIN GLOBAL FUNCTION DECL: _gfortran_findloc1_r8   .visible .func _gfortran_findloc1_r8;
// BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc1_r8  .visible .func _gfortran_mfindloc1_r8;
// BEGIN GLOBAL FUNCTION DECL: _gfortran_sfindloc1_r8  .visible .func _gfortran_sfindloc1_r8;
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options   .extern .global .align 8 .u64 _gfortrani_compile_options[6];
// External functions referenced: _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return, _gfortrani_internal_error.
// Constant strings: same FINDLOC error-message set as in findloc1_i16.o above.
[Extraction-damaged PTX bodies omitted; the value argument is carried as .f64 and element strides use 8-byte shifts (shl.b64 ...,3). The listing breaks off inside _gfortran_mfindloc1_r8, around the ".loc 1 253 5" call to _gfortrani_internal_error.]
_gfortrani_internal_error5: add.u64 %r146,%r205,40; add.u64 %r167,%r208,40; add.u64 %r273,%r25,%r25; add.u64 %r274,%r273,%r25; shl.b64 %r275,%r274,3; add.u64 %r276,%r205,16; add.u64 %r101,%r275,%r276; .loc 1 255 3 mov.u64 %r200,0; add.u64 %r418,%frame,240; add.u64 %r419,%frame,120; add.u64 %r416,%frame,360; .loc 1 262 12 mov.u64 %r431,%r200; $L61: .loc 1 257 18 add.u64 %r278,%r418,%r200; ld.u64 %r279,[%r146]; st.u64 [%r278],%r279; .loc 1 258 18 add.u64 %r281,%r419,%r200; .loc 1 258 20 ld.u64 %r283,[%r167]; mul.lo.u64 %r282,%r283,%r36; .loc 1 258 18 st.u64 [%r281],%r282; .loc 1 259 19 ld.u64 %r285,[%r146+16]; add.u64 %r284,%r285,1; ld.u64 %r286,[%r146+8]; sub.u64 %r53,%r284,%r286; .loc 1 261 10 setp.lt.s64 %r287,%r53,0; @ %r287 bra $L59; .loc 1 259 17 add.u64 %r289,%r416,%r200; st.u64 [%r289],%r53; bra $L60; $L59: .loc 1 262 12 add.u64 %r291,%r416,%r200; st.u64 [%r291],%r431; $L60: .loc 1 255 3 add.u64 %r146,%r146,24; add.u64 %r167,%r167,24; add.u64 %r200,%r200,8; setp.ne.u64 %r293,%r101,%r146; @ %r293 bra $L61; bra $L62; $L113: .loc 1 274 6 ld.u64 %r294,[%r204]; setp.eq.u64 %r295,%r294,0; @ ! %r295 bra $L64; bra $L63; $L56: add.u64 %r297,%r25,%r25; add.u64 %r298,%r297,%r25; shl.b64 %r299,%r298,3; add.u64 %r143,%r299,40; add.u64 %r156,%r205,%r143; add.u64 %r141,%r208,%r143; shl.b64 %r300,%r25,3; add.u64 %r137,%r300,-8; cvt.u32.u32 %r302,%r22; cvt.s64.s8 %r301,%r302; shl.b64 %r303,%r301,3; add.u64 %r84,%r303,-8; add.u64 %r418,%frame,240; add.u64 %r419,%frame,120; add.u64 %r416,%frame,360; .loc 1 271 12 mov.u64 %r430,0; $L67: .loc 1 266 18 add.u64 %r305,%r418,%r137; ld.u64 %r306,[%r156]; st.u64 [%r305],%r306; .loc 1 267 18 add.u64 %r308,%r419,%r137; .loc 1 267 20 ld.u64 %r310,[%r141]; mul.lo.u64 %r309,%r310,%r36; .loc 1 267 18 st.u64 [%r308],%r309; .loc 1 268 19 ld.u64 %r312,[%r156+16]; add.u64 %r311,%r312,1; ld.u64 %r313,[%r156+8]; sub.u64 %r62,%r311,%r313; .loc 1 270 10 setp.lt.s64 %r314,%r62,0; @ %r314 bra $L65; .loc 1 268 17 add.u64 %r316,%r416,%r137; st.u64 [%r316],%r62; bra $L66; $L65: .loc 1 271 12 add.u64 %r318,%r416,%r137; st.u64 [%r318],%r430; $L66: .loc 1 264 3 add.u64 %r156,%r156,24; add.u64 %r141,%r141,24; add.u64 %r137,%r137,8; setp.ne.u64 %r320,%r84,%r137; @ %r320 bra $L67; bra $L114; $L63: .loc 1 278 7 setp.le.s64 %r321,%r147,0; @ %r321 bra $L69; add.u64 %r416,%frame,360; $L90: add.u64 %r179,%r204,40; mov.u64 %r176,%r416; add.u64 %r322,%r204,16; cvt.u32.u32 %r324,%r22; cvt.s64.s8 %r323,%r324; add.u64 %r326,%r323,%r323; add.u64 %r327,%r326,%r323; shl.b64 %r328,%r327,3; add.u64 %r169,%r322,%r328; .loc 1 281 10 mov.u64 %r140,1; .loc 1 285 4 mov.u64 %r351,0; bra $L70; $L69: .loc 1 289 24 mov.u64 %r329,0; st.u64 [%r204+8],%r329; .loc 1 290 28 cvt.u16.u32 %r332,%r22; add.u16 %r331,%r332,-1; cvt.u32.u16 %r333,%r331; st.u8 [%r204+28],%r333; .loc 1 292 20 add.u32 %r334,%r22,-2; cvt.s64.s32 %r72,%r334; add.u64 %r336,%r72,%r72; add.u64 %r337,%r336,%r72; shl.b64 %r338,%r337,3; add.u64 %r339,%r204,%r338; .loc 1 292 67 shl.b64 %r341,%r72,3; add.u64 %r342,%frame,%r341; .loc 1 292 59 ld.u64 %r344,[%r339+40]; ld.u64 %r345,[%r342+360]; mul.lo.u64 %r154,%r344,%r345; .loc 1 294 29 mov.u64 %r3471547; call (%value_in),_gfortrani_xmallocarray8,[%value_in]; } .loc 1 294 27 st.u64 [%r204],%r348; .loc 1 295 10 setp.eq.u64 %r350,%r154,0; @ ! 
%r350 bra $L74; bra $L71; $L73: .loc 1 283 48 mul.lo.u64 %r140,%r68,%r140; $L70: .loc 1 285 4 st.u64 [%r179+8],%r351; ld.u64 %r68,[%r176]; add.u64 %r352,%r68,-1; st.u64 [%r179+16],%r352; st.u64 [%r179],%r140; .loc 1 278 7 add.u64 %r179,%r179,24; add.u64 %r176,%r176,8; setp.ne.u64 %r353,%r169,%r179; @ %r353 bra $L73; bra $L69; $L71: .loc 1 298 4 st.u64 [%r204+48],%r154; mov.u64 %r355,-1; st.u64 [%r204+56],%r355; mov.u64 %r356,1; st.u64 [%r204+40],%r356; .loc 1 299 4 bra $L52; $L64: .loc 1 304 19 ld.s8 %r79,[%r204+28]; .loc 1 304 10 setp.eq.u64 %r357,%r79,%r147; @ %r357 bra $L76; .loc 1 305 2 st.u64 [%stack+8],%r147; st.u64 [%stack],%r79;_gfortran_runtime_error76: .loc 1 310 11 cvta.global.u64 %r360,_gfortrani_compile_options; .loc 1 310 10 ld.u32 %r361,[%r360+36]; setp.eq.u32 %r362,%r361,0; @ %r362 bra $L74; .loc 1 311 2 add.u64 %r416,%frame,360; cvta.const.u64 %r366,$LC2r416366; call _gfortrani_bounds_ifunction_return$L74: .loc 1 315 3 setp.gt.s64 %r368,%r147,0; @ %r368 bra $L77; $L80: .loc 1 235 9 add.u64 %r369,%r31,1; .loc 1 235 7 sub.u64 %r149,%r369,%r33; max.s64 %r65,%r149,0; .loc 1 240 12 mul.lo.u64 %r37,%r34,%r36; .loc 1 323 8 ld.u64 %r164,[%r204]; .loc 1 326 8 ld.u64 %r163,[%r205]; .loc 1 351 35 shl.b64 %r100,%r151,3; .loc 1 336 22 add.u64 %r64,%r65,-1; .loc 1 336 15 mul.lo.u64 %r89,%r64,%r100; .loc 1 337 29 mul.lo.u64 %r177,%r37,%r64; .loc 1 338 34 neg.s64 %r95,%r100; .loc 1 363 22 ld.u64 %r102,[%frame+240]; .loc 1 363 12 shl.b64 %r104,%r102,3; .loc 1 364 23 ld.u64 %r105,[%frame+120]; .loc 1 365 22 ld.u64 %r107,[%frame]; .loc 1 365 12 shl.b64 %r109,%r107,3; .loc 1 362 12 ld.u64 %r203,[%frame+480]; setp.ne.u32 %r417,%r209,0; add.u64 %r415,%frame,488; .loc 1 338 4 setp.le.s64 %r423,%r149,0; .loc 1 382 23 add.u64 %r424,%frame,240; .loc 1 367 32 add.u64 %r425,%frame,360; .loc 1 371 20 add.u64 %r426,%frame,120; bra $L78; $L77: add.u64 %r198,%frame,480; add.u64 %r197,%r204,40; mov.u64 %r195,%frame; cvt.u32.u32 %r371,%r22; cvt.s64.s8 %r370,%r371; add.u64 %r181,%r370,-1; .loc 1 315 3 mov.u64 %r165,0; add.u64 %r416,%frame,360; .loc 1 317 16 mov.u64 %r372,%r165; $L79: st.u64 [%r198],%r372; .loc 1 318 18 ld.u64 %r373,[%r197]; st.u64 [%r195],%r373; .loc 1 319 17 shl.b64 %r375,%r165,3; add.u64 %r376,%r416,%r375; .loc 1 319 10 ld.u64 %r377,[%r376]; setp.le.s64 %r378,%r377,0; @ %r378 bra $L52; .loc 1 315 26 add.u64 %r165,%r165,1; .loc 1 315 3 add.u64 %r198,%r198,8; add.u64 %r197,%r197,24; add.u64 %r195,%r195,8; setp.ne.u64 %r379,%r165,%r181; @ %r379 bra $L79; bra $L80; $L78: .loc 1 334 10 @ %r417 bra $L81; .loc 1 351 4 setp.gt.s64 %r381,%r149,0; @ %r381 bra $L91; .loc 1 333 14 mov.u64 %r158,0; bra $L83; $L81: .loc 1 336 8 add.u64 %r29,%r163,%r89; .loc 1 337 9 add.u64 %r159,%r160,%r177; .loc 1 338 4 @ %r423 bra $L92; .loc 1 338 11 mov.u64 %r158,%r65; $L85: .loc 1 340 11 ld.s8 %r384,[%r159]; cvt.u16.u32 %r383,%r384; setp.eq.u16 %r385,%r383,0; @ %r385 bra $L84; .loc 1 340 18 ld.f64 %r386,[%r29]; setp.eq.f64 %r387,%r386,%r206; @ %r387 bra $L83; $L84: .loc 1 338 26 add.u64 %r158,%r158,-1; .loc 1 338 34 add.u64 %r29,%r29,%r95; .loc 1 338 53 sub.u64 %r159,%r159,%r37; .loc 1 338 4 setp.ne.u64 %r388,%r158,0; @ %r388 bra $L85; bra $L83; $L91: .loc 1 351 4 mov.u64 %r157,%r160; mov.u64 %r111,%r163; .loc 1 351 11 mov.u64 %r158,1; $L82: .loc 1 353 11 ld.s8 %r390,[%r157]; cvt.u16.u32 %r389,%r390; setp.eq.u16 %r391,%r389,0; @ %r391 bra $L86; .loc 1 353 18 ld.f64 %r392,[%r111]; setp.eq.f64 %r393,%r392,%r206; @ %r393 bra $L83; $L86: .loc 1 351 27 add.u64 %r158,%r158,1; .loc 1 351 35 add.u64 %r111,%r111,%r100; .loc 
1 351 54 add.u64 %r157,%r157,%r37; .loc 1 351 4 setp.ge.s64 %r394,%r65,%r158; @ %r394 bra $L82; .loc 1 333 14 mov.u64 %r158,0; bra $L83; $L92: mov.u64 %r158,0; $L83: .loc 1 360 13 st.u64 [%r164],%r158; .loc 1 362 15 add.u64 %r203,%r203,1; st.u64 [%frame+480],%r203; .loc 1 363 12 add.u64 %r163,%r163,%r104; .loc 1 364 13 add.u64 %r160,%r160,%r105; .loc 1 365 12 add.u64 %r164,%r164,%r109; .loc 1 367 32 ld.u64 %r130,[%frame+360]; .loc 1 367 13 setp.ne.u64 %r395,%r203,%r130; @ %r395 bra $L78; mov.u64 %r126,%r415; mov.u64 %r127,%r107; mov.u64 %r202,%r105; mov.u64 %r124,%r102; mov.u64 %r120,8; .loc 1 366 9 mov.u64 %r162,0; .loc 1 369 13 mov.u64 %r397,%r162; $L89: st.u64 [%r126+-8],%r397; .loc 1 370 23 mul.lo.u64 %r113,%r130,%r124; .loc 1 371 24 mul.lo.u64 %r398,%r130,%r202; .loc 1 371 10 sub.u64 %r160,%r160,%r398; .loc 1 372 23 mul.lo.u64 %r119,%r130,%r127; .loc 1 373 5 add.u64 %r162,%r162,1; .loc 1 374 7 setp.le.s64 %r399,%r147,%r162; @ %r399 bra $L52; .loc 1 381 16 ld.u64 %r400,[%r126]; add.u64 %r123,%r400,1; st.u64 [%r126],%r123; .loc 1 382 23 add.u64 %r402,%r424,%r120; ld.u64 %r124,[%r402]; .loc 1 382 13 sub.u64 %r403,%r124,%r113; shl.b64 %r404,%r403,3; add.u64 %r163,%r163,%r404; .loc 1 383 23 add.u64 %r405,%frame,%r120; ld.u64 %r127,[%r405]; .loc 1 383 13 sub.u64 %r406,%r127,%r119; shl.b64 %r407,%r406,3; add.u64 %r164,%r164,%r407; .loc 1 367 32 add.u64 %r409,%r425,%r120; ld.u64 %r130,[%r409]; .loc 1 367 13 add.u64 %r126,%r126,8; setp.eq.u64 %r410,%r123,%r130; @ %r410 bra $L88; .loc 1 362 12 ld.u64 %r203,[%frame+480]; bra $L78; $L88: .loc 1 371 20 add.u64 %r412,%r426,%r120; ld.u64 %r202,[%r412]; add.u64 %r120,%r120,8; bra $L89; $L114: .loc 1 274 6 ld.u64 %r413,[%r204]; setp.eq.u64 %r414,%r413,0; @ ! %r414 bra $L64; bra $L90; $L5268u64 %r126; .reg .u64u32pred %r190; .reg .pred %r192; .reg .u64pred %r213; .reg .predu32predpredpred %r257; .reg .u64 %r260; .reg .u64 %r261; .reg .pred %r263; .reg .u64 %r264; .reg .u32 %r265; .reg .u64 %r266; .reg .u6464predmov.u64 %r149,%ar0; mov.u64 %r150,%ar1; mov.f64 %r151,%ar2; mov.u64 %r152,%ar3; mov.u64 %r153,%ar4; mov.u32 %r154,%ar5; .loc 1 410 6 setp.eq.u64 %r155,%r153,0; @ %r155 bra $L116; .loc 1 410 20 ld.u32 %r156,[%r153]; setp.eq.u32 %r157,%r156,0; @ %r157 bra $L117; $L116:call _gfortran_findloc1_); } .loc 1 413 7 bra $L115; $L117: .loc 1 416 10 ld.s8 %r23,[%r150+28]; .loc 1 416 38 add.u32 %r163,%r23,-1; .loc 1 416 8 cvt.s64.s32 %r90,%r163; .loc 1 417 10 ld.u64 %r26,[%r152]; .loc 1 417 7 add.u64 %r91,%r26,-1; .loc 1 419 7 shr.u64 %r165,%r91,63; set.u32.lt.s64 %r167,%r90,%r91; neg.s32 %r168,%r167; cvt.u16.u64 %r171,%r165; cvt.u16.u32 %r172,%r168; or.b16 %r170,%r171,%r172; .loc 1 419 6 cvt.u32.u16 %r173,%r170; cvt.u16.u8 %r174,%r173; setp.ne.u16 %r175,%r174,0; @ %r175 bra $L119; .loc 1 430 3 setp.ne.u64 %r176,%r91,0; @ %r176 bra $L120; $L126: .loc 1 438 3 setp.gt.s64 %r177,%r90,%r91; @ %r177 bra $L121; bra $L163; $L119: .loc 1 421 7 cvt.u32.u32 %r180,%r23; cvt.s64.s8 %r179,%r180; st.u64 [%stack+8],%r179_gfortran_runtime_error120: add.u64 %r136,%r150,48; add.u64 %r138,%frame,120; add.u64 %r139,%r150,56; add.u64 %r182,%r150,24; add.u64 %r184,%r26,%r26; add.u64 %r185,%r184,%r26; shl.b64 %r186,%r185,3; add.u64 %r145,%r182,%r186; .loc 1 435 12 mov.u64 %r302,0; $L125: .loc 1 432 19 ld.u64 %r188,[%r139]; add.u64 %r187,%r188,1; ld.u64 %r189,[%r136]; sub.u64 %r35,%r187,%r189; .loc 1 434 10 setp.le.s64 %r190,%r35,0; @ %r190 bra $L123; .loc 1 432 17 st.u64 [%r138],%r35; bra $L124; $L123: .loc 1 435 12 st.u64 [%r138],%r302; $L124: .loc 1 430 3 add.u64 
%r136,%r136,24; add.u64 %r138,%r138,8; add.u64 %r139,%r139,24; setp.ne.u64 %r192,%r136,%r145; @ %r192 bra $L125; bra $L126; $L163: .loc 1 448 6 ld.u64 %r193,[%r149]; setp.eq.u64 %r194,%r193,0; @ ! %r194 bra $L128; bra $L127; $L121: add.u64 %r196,%r26,%r26; add.u64 %r197,%r196,%r26; shl.b64 %r198,%r197,3; add.u64 %r199,%r198,48; add.u64 %r125,%r150,%r199; add.u64 %r294,%frame,120; shl.b64 %r201,%r26,3; add.u64 %r202,%r201,-8; add.u64 %r45,%r294,%r202; add.u64 %r203,%r150,48; cvt.u32.u32 %r205,%r23; cvt.s64.s8 %r204,%r205; add.u64 %r207,%r204,%r204; add.u64 %r208,%r207,%r204; shl.b64 %r209,%r208,3; add.u64 %r132,%r203,%r209; .loc 1 444 12 mov.u64 %r301,0; $L131: .loc 1 441 2 ld.u64 %r211,[%r125+8]; add.u64 %r210,%r211,1; ld.u64 %r212,[%r125]; sub.u64 %r40,%r210,%r212; .loc 1 443 10 setp.le.s64 %r213,%r40,0; @ %r213 bra $L129; .loc 1 440 17 st.u64 [%r45],%r40; bra $L130; $L129: .loc 1 444 12 st.u64 [%r45],%r301; $L130: .loc 1 438 3 add.u64 %r125,%r125,24; add.u64 %r45,%r45,8; setp.ne.u64 %r215,%r125,%r132; @ %r215 bra $L131; bra $L164; $L127: .loc 1 452 7 setp.le.s64 %r216,%r90,0; @ %r216 bra $L133; add.u64 %r294,%frame,120; $L149: add.u64 %r78,%r149,40; mov.u64 %r61,%r294; cvt.u32.u32 %r218,%r23; cvt.s64.s8 %r217,%r218; add.u64 %r220,%r217,%r217; add.u64 %r221,%r220,%r217; shl.b64 %r222,%r221,3; add.u64 %r223,%r149,16; add.u64 %r126,%r222,%r223; .loc 1 455 10 mov.u64 %r81,1; .loc 1 459 4 mov.u64 %r246,0; bra $L134; $L133: .loc 1 462 24 mov.u64 %r224,0; st.u64 [%r149+8],%r224; .loc 1 463 28 cvt.u16.u32 %r227,%r23; add.u16 %r226,%r227,-1; cvt.u32.u16 %r228,%r226; st.u8 [%r149+28],%r228; .loc 1 465 20 add.u32 %r229,%r23,-2; cvt.s64.s32 %r51,%r229; add.u64 %r231,%r51,%r51; add.u64 %r232,%r231,%r51; shl.b64 %r233,%r232,3; add.u64 %r234,%r149,%r233; .loc 1 465 67 shl.b64 %r236,%r51,3; add.u64 %r237,%frame,%r236; .loc 1 465 59 ld.u64 %r239,[%r234+40]; ld.u64 %r240,[%r237+120]; mul.lo.u64 %r94,%r239,%r240; .loc 1 467 29 mov.u64 %r24242; call (%value_in),_gfortrani_xmallocarray3,[%value_in]; } .loc 1 467 27 st.u64 [%r149],%r243; .loc 1 468 10 setp.eq.u64 %r245,%r94,0; @ ! 
%r245 bra $L138; bra $L135; $L137: .loc 1 457 48 mul.lo.u64 %r81,%r47,%r81; $L134: .loc 1 459 4 st.u64 [%r78+8],%r246; ld.u64 %r47,[%r61]; add.u64 %r247,%r47,-1; st.u64 [%r78+16],%r247; st.u64 [%r78],%r81; .loc 1 452 7 add.u64 %r78,%r78,24; add.u64 %r61,%r61,8; setp.ne.u64 %r248,%r78,%r126; @ %r248 bra $L137; bra $L133; $L135: .loc 1 471 4 st.u64 [%r149+48],%r94; mov.u64 %r250,-1; st.u64 [%r149+56],%r250; mov.u64 %r251,1; st.u64 [%r149+40],%r251; .loc 1 472 4 bra $L115; $L128: .loc 1 477 19 ld.s8 %r58,[%r149+28]; .loc 1 477 10 setp.eq.u64 %r252,%r58,%r90; @ %r252 bra $L139; .loc 1 478 2 st.u64 [%stack+8],%r90; st.u64 [%stack],%r58;_gfortran_runtime_error139: .loc 1 483 11 cvta.global.u64 %r255,_gfortrani_compile_options; .loc 1 483 10 ld.u32 %r256,[%r255+36]; setp.eq.u32 %r257,%r256,0; @ %r257 bra $L138; .loc 1 484 2 add.u64 %r294,%frame,120; cvta.const.u64 %r261,$LC2261; call _gfortrani_bounds_ifunction_return$L138: .loc 1 488 3 setp.gt.s64 %r263,%r90,0; @ %r263 bra $L140; $L143: .loc 1 495 8 ld.u64 %r97,[%r149]; ld.u64 %r42,[%frame+240]; .loc 1 503 22 ld.u64 %r64,[%frame]; .loc 1 503 12 shl.b64 %r66,%r64,3; setp.le.s64 %r295,%r90,1; cvt.u32.u32 %r297,%r23; cvt.s64.s8 %r298,%r297; add.u64 %r299,%r298,-1; add.u64 %r300,%frame,120; bra $L141; $L140: add.u64 %r114,%frame,240; add.u64 %r113,%r149,40; mov.u64 %r108,%frame; cvt.u32.u32 %r265,%r23; cvt.s64.s8 %r264,%r265; add.u64 %r79,%r264,-1; .loc 1 488 3 mov.u64 %r98,0; add.u64 %r294,%frame,120; .loc 1 490 16 mov.u64 %r266,%r98; $L142: st.u64 [%r114],%r266; .loc 1 491 18 ld.u64 %r267,[%r113]; st.u64 [%r108],%r267; .loc 1 492 17 shl.b64 %r269,%r98,3; add.u64 %r270,%r294,%r269; .loc 1 492 10 ld.u64 %r271,[%r270]; setp.le.s64 %r272,%r271,0; @ %r272 bra $L115; .loc 1 488 26 add.u64 %r98,%r98,1; .loc 1 488 3 add.u64 %r114,%r114,8; add.u64 %r113,%r113,24; add.u64 %r108,%r108,8; setp.ne.u64 %r273,%r79,%r98; @ %r273 bra $L142; bra $L143; $L141: .loc 1 500 13 mov.u64 %r274,0; st.u64 [%r97],%r274; .loc 1 502 15 add.u64 %r42,%r42,1; .loc 1 503 12 add.u64 %r97,%r97,%r66; .loc 1 505 32 ld.u64 %r88,[%frame+120]; .loc 1 505 13 setp.ne.u64 %r275,%r42,%r88; @ %r275 bra $L141; .loc 1 508 23 mul.lo.u64 %r276,%r64,%r42; .loc 1 508 9 shl.b64 %r277,%r276,3; sub.u64 %r30,%r97,%r277; .loc 1 510 7 @ ! %r295 bra $L165; bra $L115; $L148: .loc 1 507 13 st.u64 [%r102],%r274; .loc 1 508 23 mul.lo.u64 %r280,%r73,%r72; .loc 1 508 9 shl.b64 %r281,%r280,3; sub.u64 %r30,%r97,%r281; .loc 1 509 5 add.u64 %r96,%r96,1; .loc 1 510 7 add.u64 %r102,%r102,8; setp.ne.u64 %r282,%r96,%r299; @ %r282 bra $L147; bra $L115; $L165: add.u64 %r102,%frame,248; .loc 1 509 5 mov.u64 %r96,1; $L147: .loc 1 517 16 ld.u64 %r286,[%r102]; add.u64 %r72,%r286,1; st.u64 [%r102],%r72; shl.b64 %r120,%r96,3; .loc 1 518 23 add.u64 %r287,%frame,%r120; ld.u64 %r73,[%r287]; .loc 1 518 13 shl.b64 %r288,%r73,3; add.u64 %r97,%r30,%r288; .loc 1 505 32 add.u64 %r290,%r300,%r120; ld.u64 %r77,[%r290]; .loc 1 505 13 setp.eq.u64 %r291,%r72,%r77; @ %r291 bra $L148; .loc 1 507 13 mov.u64 %r42,0; bra $L141; $L164: .loc 1 448 6 ld.u64 %r292,[%r149]; setp.eq.u64 %r293,%r292,0; @ ! 
%r293 bra $L128; bra $L149; $L115: .loc 1 522 1 ret; } findloc1_r10.o/ 1622802215findloc1_c4.o/_gfortran_findloc1_c4 .visible .func _gfortran_findloc1_c.file 1 "../../../libgfortran/generated/findloc1_c4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc1_c4 .visible .func _gfortran_mfindloc1_c_gfortran_sfindloc1_c4 .visible .func _gfortran_sfindloc1_cVAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_ifunction_return .extern .func _gfortrani_bounds_ifunction_returgfortrani_internal_error .extern .func _gfortrani_internal_,105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,98,101,116,119,101,101,110,32,49,32,97,110,100,32,37,108,10114,101,116,117,114,110,32,118,97,108,117,10170,117,110,110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,914,97,110,107,32,111,102,32,114,101,116,117,114,110,32,97,114,114,97,121,32,105,110,99,111,114,114,101,99,116,32,105,110,32,70,73,78,68,76,79,67,32,105,110,116,114,105,110,115,105,99,58,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,108] = {70,73,78,68,76,79,67_gfortran_findloc1_c4 .visible .func _gfortran_findloc1_c48010416u64 %r222; .reg .pred %r236; .reg .predpred %r247; .reg .predu64 %r265; .reg .u64 %r266; .reg .u64u64u64 %r281; .reg .u64 %r282; .reg .u16pred %r315; .reg .u64u6464pred %r333; .reg .f32 %r338; .reg .u32 %r339; .reg .u32 %r340; .reg .f32 %r342; .reg .u32 %r343; .reg .u32pred %r365; .reg .pred %r366; .reg .predpredu64 %r405; .reg .u64 %r406; mov.u64 %r195,%ar0; mov.u64 %r196,%ar1; mov.f32 %r197,%ar2; mov.f32 %r198,%ar3; mov.u64 %r199,%ar4; mov.u32 %r200,%ar5; .loc 1 54 10 ld.s8 %r22,[%r196+28]; .loc 1 54 38 add.u32 %r201,%r22,-1; .loc 1 54 8 cvt.s64.s32 %r118,%r201; .loc 1 55 10 ld.u64 %r25,[%r199]; .loc 1 55 7 add.u64 %r119,%r25,-1; .loc 1 57 7 shr.u64 %r203,%r119,63; set.u32.lt.s64 %r205,%r118,%r119; neg.s32 %r206,%r205; cvt.u16.u64 %r209,%r203; cvt.u16.u32 %r210,%r206; or.b16 %r208,%r209,%r210; .loc 1 57 6 cvt.u32.u16 %r211,%r208; cvt.u16.u8 %r212,%r211; setp.eq.u16 %r213,%r212,0; @ %r213 bra $L2; .loc 1 59 7 cvt.u32.u32 %r216,%r22; cvt.s64.s8 %r215,%r216; st.u64 [%stack+8],%r21514214_gfortran_runtime_errorr219,%r119,%r119; add.u64 %r220,%r219,%r119; shl.b64 %r221,%r220,3; add.u64 %r222,%r196,%r221; ld.u64 %r30,[%r222+56]; ld.u64 %r32,[%r222+48]; .loc 1 67 9 ld.u64 %r122,[%r222+40]; .loc 1 69 3 setp.ne.u64 %r236,%r119,0; @ %r236 bra $L3; $L9: .loc 1 77 3 setp.gt.s64 %r237,%r118,%r119; @ %r237 bra $L4; bra $L50; $L3: add.u64 %r63,%r196,40; add.u64 %r48,%frame,120; add.u64 %r133,%frame,240; add.u64 %r238,%r196,16; add.u64 %r240,%r25,%r25; add.u64 %r241,%r240,%r25; shl.b64 %r242,%r241,3; add.u64 %r47,%r238,%r242; .loc 1 75 12 mov.u64 %r406,0; $L8: .loc 1 71 18 ld.u64 %r243,[%r63]; st.u64 [%r48],%r243; .loc 1 72 19 ld.u64 %r245,[%r63+16]; add.u64 %r244,%r245,1; ld.u64 %r246,[%r63+8]; sub.u64 %r37,%r244,%r246; .loc 1 74 10 setp.lt.s64 %r247,%r37,0; @ %r247 bra $L6; .loc 1 72 17 st.u64 [%r133],%r37; bra $L7; $L6: .loc 1 75 12 st.u64 [%r133],%r406; $L7: .loc 1 69 3 add.u64 %r63,%r63,24; add.u64 %r48,%r48,8; add.u64 %r133,%r133,8; setp.ne.u64 %r249,%r47,%r63; @ %r249 bra $L8; bra $L9; $L50: .loc 1 86 6 ld.u64 %r250,[%r195]; setp.eq.u64 %r251,%r250,0; @ ! 
%r251 bra $L11; bra $L10; $L4: add.u64 %r253,%r25,%r25; add.u64 %r254,%r253,%r25; shl.b64 %r255,%r254,3; add.u64 %r256,%r255,40; add.u64 %r161,%r196,%r256; shl.b64 %r257,%r25,3; add.u64 %r146,%r257,-8; add.u64 %r395,%frame,120; add.u64 %r151,%r395,%r146; add.u64 %r394,%frame,240; add.u64 %r143,%r394,%r146; add.u64 %r260,%r196,40; cvt.u32.u32 %r262,%r22; cvt.s64.s8 %r261,%r262; add.u64 %r264,%r261,%r261; add.u64 %r265,%r264,%r261; shl.b64 %r266,%r265,3; add.u64 %r104,%r260,%r266; .loc 1 83 12 mov.u64 %r405,0; $L14: .loc 1 79 18 ld.u64 %r267,[%r161]; st.u64 [%r151],%r267; .loc 1 80 19 ld.u64 %r269,[%r161+16]; add.u64 %r268,%r269,1; ld.u64 %r270,[%r161+8]; sub.u64 %r44,%r268,%r270; .loc 1 82 10 setp.lt.s64 %r271,%r44,0; @ %r271 bra $L12; .loc 1 80 17 st.u64 [%r143],%r44; bra $L13; $L12: .loc 1 83 12 st.u64 [%r143],%r405; $L13: .loc 1 77 3 add.u64 %r161,%r161,24; add.u64 %r151,%r151,8; add.u64 %r143,%r143,8; setp.ne.u64 %r273,%r104,%r161; @ %r273 bra $L14; bra $L51; $L10: .loc 1 90 7 setp.le.s64 %r274,%r118,0; @ %r274 bra $L16; add.u64 %r394,%frame,240; $L36: add.u64 %r170,%r195,40; mov.u64 %r168,%r394; add.u64 %r275,%r195,16; cvt.u32.u32 %r277,%r22; cvt.s64.s8 %r276,%r277; add.u64 %r279,%r276,%r276; add.u64 %r280,%r279,%r276; shl.b64 %r281,%r280,3; add.u64 %r162,%r275,%r281; .loc 1 93 10 mov.u64 %r110,1; .loc 1 97 4 mov.u64 %r304,0; bra $L17; $L16: .loc 1 101 24 mov.u64 %r282,0; st.u64 [%r195+8],%r282; .loc 1 102 28 cvt.u16.u32 %r285,%r22; add.u16 %r284,%r285,-1; cvt.u32.u16 %r286,%r284; st.u8 [%r195+28],%r286; .loc 1 104 20 add.u32 %r287,%r22,-2; cvt.s64.s32 %r55,%r287; add.u64 %r289,%r55,%r55; add.u64 %r290,%r289,%r55; shl.b64 %r291,%r290,3; add.u64 %r292,%r195,%r291; .loc 1 104 67 shl.b64 %r294,%r55,3; add.u64 %r295,%frame,%r294; .loc 1 104 59 ld.u64 %r297,[%r292+40]; ld.u64 %r298,[%r295+240]; mul.lo.u64 %r124,%r297,%r298; .loc 1 106 29 mov.u64 %r3001300; call (%value_in),_gfortrani_xmallocarray01,[%value_in]; } .loc 1 106 27 st.u64 [%r195],%r301; .loc 1 107 10 setp.eq.u64 %r303,%r124,0; @ ! 
%r303 bra $L21; bra $L18; $L20: .loc 1 95 48 mul.lo.u64 %r110,%r51,%r110; $L17: .loc 1 97 4 st.u64 [%r170+8],%r304; ld.u64 %r51,[%r168]; add.u64 %r305,%r51,-1; st.u64 [%r170+16],%r305; st.u64 [%r170],%r110; .loc 1 90 7 add.u64 %r170,%r170,24; add.u64 %r168,%r168,8; setp.ne.u64 %r306,%r162,%r170; @ %r306 bra $L20; bra $L16; $L18: .loc 1 110 4 st.u64 [%r195+48],%r124; mov.u64 %r308,-1; st.u64 [%r195+56],%r308; mov.u64 %r309,1; st.u64 [%r195+40],%r309; .loc 1 111 4 bra $L1; $L11: .loc 1 116 19 ld.s8 %r62,[%r195+28]; .loc 1 116 10 setp.eq.u64 %r310,%r62,%r118; @ %r310 bra $L23; .loc 1 117 2 st.u64 [%stack+8],%r118; st.u64 [%stack],%r62;_gfortran_runtime_error23: .loc 1 122 11 cvta.global.u64 %r313,_gfortrani_compile_options; .loc 1 122 10 ld.u32 %r314,[%r313+36]; setp.eq.u32 %r315,%r314,0; @ %r315 bra $L21; .loc 1 123 2 add.u64 %r394,%frame,240; cvta.const.u64 %r319,$LC2r394318319; call _gfortrani_bounds_ifunction_return$L21: .loc 1 127 3 setp.gt.s64 %r321,%r118,0; @ %r321 bra $L24; $L27: .loc 1 64 9 add.u64 %r322,%r30,1; .loc 1 64 7 sub.u64 %r120,%r322,%r32; max.s64 %r42,%r120,0; .loc 1 135 8 ld.u64 %r109,[%r195]; .loc 1 138 8 ld.u64 %r131,[%r196]; .loc 1 160 35 shl.b64 %r73,%r122,3; .loc 1 147 22 add.u64 %r323,%r42,-1; .loc 1 147 15 mul.lo.u64 %r69,%r323,%r73; .loc 1 148 34 neg.s64 %r70,%r73; ld.u64 %r71,[%frame+360]; .loc 1 172 22 ld.u64 %r74,[%frame+120]; .loc 1 172 12 shl.b64 %r76,%r74,3; .loc 1 173 22 ld.u64 %r77,[%frame]; .loc 1 173 12 shl.b64 %r79,%r77,3; setp.ne.u32 %r393,%r200,0; setp.le.s64 %r396,%r118,1; .loc 1 148 4 setp.le.s64 %r398,%r120,0; .loc 1 160 4 setp.gt.s64 %r399,%r120,0; cvt.u32.u32 %r400,%r22; cvt.s64.s8 %r401,%r400; add.u64 %r402,%r401,-1; add.u64 %r403,%frame,120; add.u64 %r404,%frame,240; bra $L25; $L24: add.u64 %r183,%frame,360; add.u64 %r182,%r195,40; mov.u64 %r180,%frame; cvt.u32.u32 %r325,%r22; cvt.s64.s8 %r324,%r325; add.u64 %r171,%r324,-1; .loc 1 127 3 mov.u64 %r132,0; add.u64 %r394,%frame,240; .loc 1 129 16 mov.u64 %r326,%r132; $L26: st.u64 [%r183],%r326; .loc 1 130 18 ld.u64 %r327,[%r182]; st.u64 [%r180],%r327; .loc 1 131 17 shl.b64 %r329,%r132,3; add.u64 %r330,%r394,%r329; .loc 1 131 10 ld.u64 %r331,[%r330]; setp.le.s64 %r332,%r331,0; @ %r332 bra $L1; .loc 1 127 26 add.u64 %r132,%r132,1; .loc 1 127 3 add.u64 %r183,%r183,8; add.u64 %r182,%r182,24; add.u64 %r180,%r180,8; setp.ne.u64 %r333,%r132,%r171; @ %r333 bra $L26; bra $L27; $L25: .loc 1 145 10 @ %r393 bra $L28; .loc 1 160 4 @ %r399 bra $L37; .loc 1 144 14 mov.u64 %r126,0; bra $L30; $L28: .loc 1 147 8 add.u64 %r127,%r131,%r69; .loc 1 148 4 @ %r398 bra $L38; .loc 1 148 11 mov.u64 %r126,%r42; $L31: .loc 1 150 11 ld.f32 %r338,[%r127+4]; set.u32.eq.f32 %r339,%r338,%r198; neg.s32 %r340,%r339; ld.f32 %r342,[%r127]; set.u32.eq.f32 %r343,%r342,%r197; neg.s32 %r344,%r343; cvt.u16.u32 %r346,%r340; cvt.u16.u32 %r347,%r344; and.b16 %r345,%r346,%r347; cvt.u32.u16 %r348,%r345; cvt.u16.u8 %r349,%r348; setp.ne.u16 %r350,%r349,0; @ %r350 bra $L30; .loc 1 148 26 add.u64 %r126,%r126,-1; .loc 1 148 34 add.u64 %r127,%r127,%r70; .loc 1 148 4 setp.ne.u64 %r351,%r126,0; @ %r351 bra $L31; bra $L30; $L37: .loc 1 160 4 mov.u64 %r125,%r131; .loc 1 160 11 mov.u64 %r126,1; $L29: .loc 1 162 11 ld.f32 %r353,[%r125+4]; set.u32.eq.f32 %r354,%r353,%r198; neg.s32 %r355,%r354; ld.f32 %r357,[%r125]; set.u32.eq.f32 %r358,%r357,%r197; neg.s32 %r359,%r358; cvt.u16.u32 %r361,%r355; cvt.u16.u32 %r362,%r359; and.b16 %r360,%r361,%r362; cvt.u32.u16 %r363,%r360; cvt.u16.u8 %r364,%r363; setp.ne.u16 %r365,%r364,0; @ %r365 bra $L30; .loc 1 160 27 add.u64 
%r126,%r126,1; .loc 1 160 35 add.u64 %r125,%r125,%r73; .loc 1 160 4 setp.ge.s64 %r366,%r42,%r126; @ %r366 bra $L29; .loc 1 144 14 mov.u64 %r126,0; bra $L30; $L38: mov.u64 %r126,0; $L30: .loc 1 169 13 st.u64 [%r109],%r126; .loc 1 171 15 add.u64 %r71,%r71,1; .loc 1 172 12 add.u64 %r131,%r131,%r76; .loc 1 173 12 add.u64 %r109,%r109,%r79; .loc 1 175 32 ld.u64 %r138,[%frame+240]; .loc 1 175 13 setp.ne.u64 %r367,%r71,%r138; @ %r367 bra $L25; .loc 1 178 23 mul.lo.u64 %r368,%r74,%r71; .loc 1 178 9 shl.b64 %r369,%r368,3; sub.u64 %r128,%r131,%r369; .loc 1 179 23 mul.lo.u64 %r370,%r77,%r71; .loc 1 179 9 shl.b64 %r371,%r370,3; sub.u64 %r129,%r109,%r371; .loc 1 181 7 @ %r396 bra $L1; add.u64 %r68,%frame,368; mov.u64 %r155,8; .loc 1 180 5 mov.u64 %r130,1; .loc 1 177 13 mov.u64 %r397,0; bra $L34; $L35: st.u64 [%r68],%r397; .loc 1 178 23 mul.lo.u64 %r377,%r92,%r91; .loc 1 178 9 shl.b64 %r378,%r377,3; sub.u64 %r128,%r131,%r378; .loc 1 179 23 mul.lo.u64 %r379,%r95,%r91; .loc 1 179 9 shl.b64 %r380,%r379,3; sub.u64 %r129,%r109,%r380; .loc 1 180 5 add.u64 %r130,%r130,1; .loc 1 181 7 add.u64 %r68,%r68,8; add.u64 %r155,%r155,8; setp.eq.u64 %r381,%r130,%r402; @ %r381 bra $L1; $L34: .loc 1 188 16 ld.u64 %r382,[%r68]; add.u64 %r91,%r382,1; st.u64 [%r68],%r91; .loc 1 189 23 add.u64 %r384,%r403,%r155; ld.u64 %r92,[%r384]; .loc 1 189 13 shl.b64 %r385,%r92,3; add.u64 %r131,%r128,%r385; .loc 1 190 23 add.u64 %r386,%frame,%r155; ld.u64 %r95,[%r386]; .loc 1 190 13 shl.b64 %r387,%r95,3; add.u64 %r109,%r129,%r387; .loc 1 175 32 add.u64 %r389,%r404,%r155; ld.u64 %r98,[%r389]; .loc 1 175 13 setp.eq.u64 %r390,%r91,%r98; @ %r390 bra $L35; .loc 1 177 13 mov.u64 %r71,0; bra $L25; $L51: .loc 1 86 6 ld.u64 %r391,[%r195]; setp.eq.u64 %r392,%r391,0; @ ! %r392 bra $L11; bra $L36; $L608predu32 %r265; .reg .u32 %r266; .reg .u32 %r267; .reg .u32 %r269; .reg .u32 %r270; .reg .u16 %r271; .reg .u16 %r272; .reg .u16 %r273; .reg .u32predu64 %r282; .reg .u64 %r283; .reg .u64u64predu64pred %r331; .reg .pred %r332; .reg .u64 %r333; .reg .u64 %r334; .reg .u32u16 %r342; .reg .u16predpredu32pred %r389; .reg .pred %r390; .reg .predpred %r411; .reg .u16 %r412; .reg .u32 %r413; .reg .pred %r414; .reg .f32pred %r428; .reg .predu64 %r433; .reg .pred.reg .u64pred %r445; .reg .u64 %r447; .reg .u64 %r448; .reg .predpredmov.f32 %r216,%ar2; mov.f32 %r217,%ar3; mov.u64 %r218,%ar4; mov.u64 %r219,%ar5; mov.u32 %r220,%ar6; .loc 1 225 10 ld.s8 %r22,[%r215+28]; .loc 1 225 38 add.u32 %r221,%r22,-1; .loc 1 225 8 cvt.s64.s32 %r143,%r221; .loc 1 226 10 ld.u64 %r25,[%r218]; .loc 1 226 7 add.u64 %r144,%r25,-1; .loc 1 228 7 shr.u64 %r223,%r144,63; set.u32.lt.s64 %r225,%r143,%r144; neg.s32 %r226,%r225; cvt.u16.u64 %r229,%r223; cvt.u16.u32 %r230,%r226; or.b16 %r228,%r229,%r230; .loc 1 228 6 cvt.u32.u16 %r231,%r228; cvt.u16.u8 %r232,%r231; setp.eq.u16 %r233,%r232,0; @ %r233 bra $L53; .loc 1 230 7 cvt.u32.u32 %r236,%r22; cvt.s64.s8 %r235,%r236; st.u64 [%stack+8],%r235_gfortran_runtime_error3: .loc 1 235 9 add.u64 %r239,%r144,%r144; add.u64 %r240,%r239,%r144; shl.b64 %r241,%r240,3; add.u64 %r242,%r215,%r241; ld.u64 %r31,[%r242+56]; ld.u64 %r33,[%r242+48]; .loc 1 239 9 ld.u64 %r147,[%r242+40]; .loc 1 240 12 add.u64 %r260,%r219,%r241; ld.u64 %r34,[%r260+40]; ld.u64 %r36,[%r219+16]; .loc 1 242 9 ld.u64 %r156,[%r219]; .loc 1 246 22 cvt.u32.u64 %r39,%r36; .loc 1 246 53 add.u32 %r262,%r39,-4; and.b32 %r263,%r262,-5; set.u32.eq.u32 %r265,%r263,0; neg.s32 %r266,%r265; .loc 1 246 22 add.u32 %r267,%r39,-1; set.u32.le.u32 %r269,%r267,1; neg.s32 %r270,%r269; .loc 1 246 6 cvt.u16.u32 
%r272,%r266; cvt.u16.u32 %r273,%r270; or.b16 %r271,%r272,%r273; cvt.u32.u16 %r274,%r271; cvt.u16.u8 %r275,%r274; setp.eq.u16 %r276,%r275,0; @ %r276 bra $L54; $L58: .loc 1 255 3 setp.ne.u64 %r277,%r144,0; @ %r277 bra $L55; $L62: .loc 1 264 3 setp.gt.s64 %r278,%r143,%r144; @ %r278 bra $L56; bra $L113; $L54: .loc 1 248 7 setp.eq.u32 %r280,%r39,16; @ %r280 bra $L58; .loc 1 253 5 cvta.const.u64 %r282,$LC4; mov.u64 %r281,call _gfortrani_internal_error5: add.u64 %r125,%r215,40; add.u64 %r84,%r219,40; add.u64 %r283,%r215,16; add.u64 %r285,%r25,%r25; add.u64 %r286,%r285,%r25; shl.b64 %r287,%r286,3; add.u64 %r65,%r283,%r287; .loc 1 255 3 mov.u64 %r142,0; add.u64 %r453,%frame,240; add.u64 %r450,%frame,120; add.u64 %r452,%frame,360; .loc 1 262 12 mov.u64 %r466,%r142; $L61: .loc 1 257 18 add.u64 %r289,%r453,%r142; ld.u64 %r290,[%r125]; st.u64 [%r289],%r290; .loc 1 258 18 add.u64 %r292,%r450,%r142; .loc 1 258 20 ld.u64 %r294,[%r84]; mul.lo.u64 %r293,%r294,%r36; .loc 1 258 18 st.u64 [%r292],%r293; .loc 1 259 19 ld.u64 %r296,[%r125+16]; add.u64 %r295,%r296,1; ld.u64 %r297,[%r125+8]; sub.u64 %r53,%r295,%r297; .loc 1 261 10 setp.lt.s64 %r298,%r53,0; @ %r298 bra $L59; .loc 1 259 17 add.u64 %r300,%r452,%r142; st.u64 [%r300],%r53; bra $L60; $L59: .loc 1 262 12 add.u64 %r302,%r452,%r142; st.u64 [%r302],%r466; $L60: .loc 1 255 3 add.u64 %r125,%r125,24; add.u64 %r84,%r84,24; add.u64 %r142,%r142,8; setp.ne.u64 %r304,%r65,%r125; @ %r304 bra $L61; bra $L62; $L113: .loc 1 274 6 ld.u64 %r305,[%r214]; setp.eq.u64 %r306,%r305,0; @ ! %r306 bra $L64; bra $L63; $L56: add.u64 %r308,%r25,%r25; add.u64 %r309,%r308,%r25; shl.b64 %r310,%r309,3; add.u64 %r174,%r310,40; add.u64 %r179,%r215,%r174; add.u64 %r172,%r219,%r174; shl.b64 %r311,%r25,3; add.u64 %r152,%r311,-8; cvt.u32.u32 %r313,%r22; cvt.s64.s8 %r312,%r313; shl.b64 %r314,%r312,3; add.u64 %r129,%r314,-8; add.u64 %r453,%frame,240; add.u64 %r450,%frame,120; add.u64 %r452,%frame,360; .loc 1 271 12 mov.u64 %r465,0; $L67: .loc 1 266 18 add.u64 %r316,%r453,%r152; ld.u64 %r317,[%r179]; st.u64 [%r316],%r317; .loc 1 267 18 add.u64 %r319,%r450,%r152; .loc 1 267 20 ld.u64 %r321,[%r172]; mul.lo.u64 %r320,%r321,%r36; .loc 1 267 18 st.u64 [%r319],%r320; .loc 1 268 19 ld.u64 %r323,[%r179+16]; add.u64 %r322,%r323,1; ld.u64 %r324,[%r179+8]; sub.u64 %r62,%r322,%r324; .loc 1 270 10 setp.lt.s64 %r325,%r62,0; @ %r325 bra $L65; .loc 1 268 17 add.u64 %r327,%r452,%r152; st.u64 [%r327],%r62; bra $L66; $L65: .loc 1 271 12 add.u64 %r329,%r452,%r152; st.u64 [%r329],%r465; $L66: .loc 1 264 3 add.u64 %r179,%r179,24; add.u64 %r172,%r172,24; add.u64 %r152,%r152,8; setp.ne.u64 %r331,%r129,%r152; @ %r331 bra $L67; bra $L114; $L63: .loc 1 278 7 setp.le.s64 %r332,%r143,0; @ %r332 bra $L69; add.u64 %r452,%frame,360; $L90: add.u64 %r193,%r214,40; mov.u64 %r191,%r452; add.u64 %r333,%r214,16; cvt.u32.u32 %r335,%r22; cvt.s64.s8 %r334,%r335; add.u64 %r337,%r334,%r334; add.u64 %r338,%r337,%r334; shl.b64 %r339,%r338,3; add.u64 %r185,%r333,%r339; .loc 1 281 10 mov.u64 %r134,1; .loc 1 285 4 mov.u64 %r362,0; bra $L70; $L69: .loc 1 289 24 mov.u64 %r340,0; st.u64 [%r214+8],%r340; .loc 1 290 28 cvt.u16.u32 %r343,%r22; add.u16 %r342,%r343,-1; cvt.u32.u16 %r344,%r342; st.u8 [%r214+28],%r344; .loc 1 292 20 add.u32 %r345,%r22,-2; cvt.s64.s32 %r73,%r345; add.u64 %r347,%r73,%r73; add.u64 %r348,%r347,%r73; shl.b64 %r349,%r348,3; add.u64 %r350,%r214,%r349; .loc 1 292 67 shl.b64 %r352,%r73,3; add.u64 %r353,%frame,%r352; .loc 1 292 59 ld.u64 %r355,[%r350+40]; ld.u64 %r356,[%r353+360]; mul.lo.u64 %r150,%r355,%r356; .loc 1 294 29 
mov.u64 %r358call (%value_in),_gfortrani_xmallocarray59,[%value_in]; } .loc 1 294 27 st.u64 [%r214],%r359; .loc 1 295 10 setp.eq.u64 %r361,%r150,0; @ ! %r361 bra $L74; bra $L71; $L73: .loc 1 283 48 mul.lo.u64 %r134,%r69,%r134; $L70: .loc 1 285 4 st.u64 [%r193+8],%r362; ld.u64 %r69,[%r191]; add.u64 %r363,%r69,-1; st.u64 [%r193+16],%r363; st.u64 [%r193],%r134; .loc 1 278 7 add.u64 %r193,%r193,24; add.u64 %r191,%r191,8; setp.ne.u64 %r364,%r185,%r193; @ %r364 bra $L73; bra $L69; $L71: .loc 1 298 4 st.u64 [%r214+48],%r150; mov.u64 %r366,-1; st.u64 [%r214+56],%r366; mov.u64 %r367,1; st.u64 [%r214+40],%r367; .loc 1 299 4 bra $L52; $L64: .loc 1 304 19 ld.s8 %r80,[%r214+28]; .loc 1 304 10 setp.eq.u64 %r368,%r80,%r143; @ %r368 bra $L76; .loc 1 305 2 st.u64 [%stack+8],%r143; st.u64 [%stack],%r80;_gfortran_runtime_error76: .loc 1 310 11 cvta.global.u64 %r371,_gfortrani_compile_options; .loc 1 310 10 ld.u32 %r372,[%r371+36]; setp.eq.u32 %r373,%r372,0; @ %r373 bra $L74; .loc 1 311 2 add.u64 %r452,%frame,360; cvta.const.u64 %r377,$LC2376377; call _gfortrani_bounds_ifunction_return$L74: .loc 1 315 3 setp.gt.s64 %r379,%r143,0; @ %r379 bra $L77; $L80: .loc 1 235 9 add.u64 %r380,%r31,1; .loc 1 235 7 sub.u64 %r145,%r380,%r33; max.s64 %r66,%r145,0; .loc 1 240 12 mul.lo.u64 %r37,%r34,%r36; .loc 1 323 8 ld.u64 %r160,[%r214]; .loc 1 326 8 ld.u64 %r159,[%r215]; .loc 1 351 35 shl.b64 %r95,%r147,3; .loc 1 336 22 add.u64 %r64,%r66,-1; .loc 1 336 15 mul.lo.u64 %r87,%r64,%r95; .loc 1 337 29 mul.lo.u64 %r44,%r37,%r64; .loc 1 338 34 neg.s64 %r91,%r95; .loc 1 363 22 ld.u64 %r96,[%frame+240]; .loc 1 363 12 shl.b64 %r98,%r96,3; .loc 1 364 23 ld.u64 %r99,[%frame+120]; .loc 1 365 22 ld.u64 %r101,[%frame]; .loc 1 365 12 shl.b64 %r103,%r101,3; .loc 1 362 12 ld.u64 %r213,[%frame+480]; setp.ne.u32 %r454,%r220,0; add.u64 %r451,%frame,488; .loc 1 338 4 setp.le.s64 %r458,%r145,0; .loc 1 382 23 add.u64 %r459,%frame,240; .loc 1 367 32 add.u64 %r460,%frame,360; .loc 1 371 20 add.u64 %r461,%frame,120; bra $L78; $L77: add.u64 %r206,%frame,480; add.u64 %r205,%r214,40; mov.u64 %r203,%frame; cvt.u32.u32 %r382,%r22; cvt.s64.s8 %r381,%r382; add.u64 %r194,%r381,-1; .loc 1 315 3 mov.u64 %r161,0; add.u64 %r452,%frame,360; .loc 1 317 16 mov.u64 %r383,%r161; $L79: st.u64 [%r206],%r383; .loc 1 318 18 ld.u64 %r384,[%r205]; st.u64 [%r203],%r384; .loc 1 319 17 shl.b64 %r386,%r161,3; add.u64 %r387,%r452,%r386; .loc 1 319 10 ld.u64 %r388,[%r387]; setp.le.s64 %r389,%r388,0; @ %r389 bra $L52; .loc 1 315 26 add.u64 %r161,%r161,1; .loc 1 315 3 add.u64 %r206,%r206,8; add.u64 %r205,%r205,24; add.u64 %r203,%r203,8; setp.ne.u64 %r390,%r161,%r194; @ %r390 bra $L79; bra $L80; $L78: .loc 1 334 10 @ %r454 bra $L81; .loc 1 351 4 setp.gt.s64 %r392,%r145,0; @ %r392 bra $L91; .loc 1 333 14 mov.u64 %r154,0; bra $L83; $L81: .loc 1 336 8 add.u64 %r29,%r159,%r87; .loc 1 337 9 add.u64 %r155,%r156,%r44; .loc 1 338 4 @ %r458 bra $L92; .loc 1 338 11 mov.u64 %r154,%r66; $L85: .loc 1 340 11 ld.s8 %r395,[%r155]; cvt.u16.u32 %r394,%r395; setp.eq.u16 %r396,%r394,0; @ %r396 bra $L84; .loc 1 340 18 ld.f32 %r398,[%r29+4]; set.u32.eq.f32 %r399,%r398,%r217; neg.s32 %r400,%r399; ld.f32 %r402,[%r29]; set.u32.eq.f32 %r403,%r402,%r216; neg.s32 %r404,%r403; cvt.u16.u32 %r406,%r400; cvt.u16.u32 %r407,%r404; and.b16 %r405,%r406,%r407; cvt.u32.u16 %r408,%r405; cvt.u16.u8 %r409,%r408; setp.ne.u16 %r410,%r409,0; @ %r410 bra $L83; $L84: .loc 1 338 26 add.u64 %r154,%r154,-1; .loc 1 338 34 add.u64 %r29,%r29,%r91; .loc 1 338 53 sub.u64 %r155,%r155,%r37; .loc 1 338 4 setp.ne.u64 %r411,%r154,0; @ %r411 
bra $L85; bra $L83; $L91: .loc 1 351 4 mov.u64 %r153,%r156; mov.u64 %r104,%r159; .loc 1 351 11 mov.u64 %r154,1; $L82: .loc 1 353 11 ld.s8 %r413,[%r153]; cvt.u16.u32 %r412,%r413; setp.eq.u16 %r414,%r412,0; @ %r414 bra $L86; .loc 1 353 18 ld.f32 %r416,[%r104]; set.u32.eq.f32 %r417,%r416,%r216; neg.s32 %r418,%r417; ld.f32 %r420,[%r104+4]; set.u32.eq.f32 %r421,%r420,%r217; neg.s32 %r422,%r421; cvt.u16.u32 %r424,%r418; cvt.u16.u32 %r425,%r422; and.b16 %r423,%r424,%r425; cvt.u32.u16 %r426,%r423; cvt.u16.u8 %r427,%r426; setp.ne.u16 %r428,%r427,0; @ %r428 bra $L83; $L86: .loc 1 351 27 add.u64 %r154,%r154,1; .loc 1 351 35 add.u64 %r104,%r104,%r95; .loc 1 351 54 add.u64 %r153,%r153,%r37; .loc 1 351 4 setp.ge.s64 %r429,%r66,%r154; @ %r429 bra $L82; .loc 1 333 14 mov.u64 %r154,0; bra $L83; $L92: mov.u64 %r154,0; $L83: .loc 1 360 13 st.u64 [%r160],%r154; .loc 1 362 15 add.u64 %r213,%r213,1; st.u64 [%frame+480],%r213; .loc 1 363 12 add.u64 %r159,%r159,%r98; .loc 1 364 13 add.u64 %r156,%r156,%r99; .loc 1 365 12 add.u64 %r160,%r160,%r103; .loc 1 367 32 ld.u64 %r121,[%frame+360]; .loc 1 367 13 setp.ne.u64 %r430,%r213,%r121; @ %r430 bra $L78; mov.u64 %r118,%r451; mov.u64 %r119,%r101; mov.u64 %r211,%r99; mov.u64 %r116,%r96; mov.u64 %r112,8; .loc 1 366 9 mov.u64 %r158,0; .loc 1 369 13 mov.u64 %r432,%r158; $L89: st.u64 [%r118+-8],%r432; .loc 1 370 23 mul.lo.u64 %r106,%r121,%r116; .loc 1 371 24 mul.lo.u64 %r433,%r121,%r211; .loc 1 371 10 sub.u64 %r156,%r156,%r433; .loc 1 372 23 mul.lo.u64 %r111,%r121,%r119; .loc 1 373 5 add.u64 %r158,%r158,1; .loc 1 374 7 setp.le.s64 %r434,%r143,%r158; @ %r434 bra $L52; .loc 1 381 16 ld.u64 %r435,[%r118]; add.u64 %r115,%r435,1; st.u64 [%r118],%r115; .loc 1 382 23 add.u64 %r437,%r459,%r112; ld.u64 %r116,[%r437]; .loc 1 382 13 sub.u64 %r438,%r116,%r106; shl.b64 %r439,%r438,3; add.u64 %r159,%r159,%r439; .loc 1 383 23 add.u64 %r440,%frame,%r112; ld.u64 %r119,[%r440]; .loc 1 383 13 sub.u64 %r441,%r119,%r111; shl.b64 %r442,%r441,3; add.u64 %r160,%r160,%r442; .loc 1 367 32 add.u64 %r444,%r460,%r112; ld.u64 %r121,[%r444]; .loc 1 367 13 add.u64 %r118,%r118,8; setp.eq.u64 %r445,%r115,%r121; @ %r445 bra $L88; .loc 1 362 12 ld.u64 %r213,[%frame+480]; bra $L78; $L88: .loc 1 371 20 add.u64 %r447,%r461,%r112; ld.u64 %r211,[%r447]; add.u64 %r112,%r112,8; bra $L89; $L114: .loc 1 274 6 ld.u64 %r448,[%r214]; setp.eq.u64 %r449,%r448,0; @ ! 
%r449 bra $L64; bra $L90; $L5268u64 %r126; .reg .u64u64 %r154; .reg .u32 %r155; .reg .predu16 %r173; .reg .u16 %r174; .reg .u32 %r175; .reg .u16 %r176; .reg .pred %r177; .reg .pred %r178; .reg .predpredpredu16 %r229; .reg .u32predu64predpredu64 %r303; .reg .u64 %r304; mov.u64 %r149,%ar0; mov.u64 %r150,%ar1; mov.f32 %r151,%ar2; mov.f32 %r152,%ar3; mov.u64 %r153,%ar4; mov.u64 %r154,%ar5; mov.u32 %r155,%ar6; .loc 1 410 6 setp.eq.u64 %r156,%r154,0; @ %r156 bra $L116; .loc 1 410 20 ld.u32 %r157,[%r154]; setp.eq.u32 %r158,%r157,0; @ %r158 bra $L117; $L116:.param .f32 %out_arg4; st.param.f32 [%out_arg4],%r152153155; call _gfortran_findloc1_c4413 7 bra $L115; $L117: .loc 1 416 10 ld.s8 %r23,[%r150+28]; .loc 1 416 38 add.u32 %r165,%r23,-1; .loc 1 416 8 cvt.s64.s32 %r90,%r165; .loc 1 417 10 ld.u64 %r26,[%r153]; .loc 1 417 7 add.u64 %r91,%r26,-1; .loc 1 419 7 shr.u64 %r167,%r91,63; set.u32.lt.s64 %r169,%r90,%r91; neg.s32 %r170,%r169; cvt.u16.u64 %r173,%r167; cvt.u16.u32 %r174,%r170; or.b16 %r172,%r173,%r174; .loc 1 419 6 cvt.u32.u16 %r175,%r172; cvt.u16.u8 %r176,%r175; setp.ne.u16 %r177,%r176,0; @ %r177 bra $L119; .loc 1 430 3 setp.ne.u64 %r178,%r91,0; @ %r178 bra $L120; $L126: .loc 1 438 3 setp.gt.s64 %r179,%r90,%r91; @ %r179 bra $L121; bra $L163; $L119: .loc 1 421 7 cvt.u32.u32 %r182,%r23; cvt.s64.s8 %r181,%r182; st.u64 [%stack+8],%r18180180_gfortran_runtime_error120: add.u64 %r136,%r150,48; add.u64 %r138,%frame,120; add.u64 %r139,%r150,56; add.u64 %r184,%r150,24; add.u64 %r186,%r26,%r26; add.u64 %r187,%r186,%r26; shl.b64 %r188,%r187,3; add.u64 %r145,%r184,%r188; .loc 1 435 12 mov.u64 %r304,0; $L125: .loc 1 432 19 ld.u64 %r190,[%r139]; add.u64 %r189,%r190,1; ld.u64 %r191,[%r136]; sub.u64 %r35,%r189,%r191; .loc 1 434 10 setp.le.s64 %r192,%r35,0; @ %r192 bra $L123; .loc 1 432 17 st.u64 [%r138],%r35; bra $L124; $L123: .loc 1 435 12 st.u64 [%r138],%r304; $L124: .loc 1 430 3 add.u64 %r136,%r136,24; add.u64 %r138,%r138,8; add.u64 %r139,%r139,24; setp.ne.u64 %r194,%r136,%r145; @ %r194 bra $L125; bra $L126; $L163: .loc 1 448 6 ld.u64 %r195,[%r149]; setp.eq.u64 %r196,%r195,0; @ ! 
%r196 bra $L128; bra $L127; $L121: add.u64 %r198,%r26,%r26; add.u64 %r199,%r198,%r26; shl.b64 %r200,%r199,3; add.u64 %r201,%r200,48; add.u64 %r125,%r150,%r201; add.u64 %r296,%frame,120; shl.b64 %r203,%r26,3; add.u64 %r204,%r203,-8; add.u64 %r45,%r296,%r204; add.u64 %r205,%r150,48; cvt.u32.u32 %r207,%r23; cvt.s64.s8 %r206,%r207; add.u64 %r209,%r206,%r206; add.u64 %r210,%r209,%r206; shl.b64 %r211,%r210,3; add.u64 %r132,%r205,%r211; .loc 1 444 12 mov.u64 %r303,0; $L131: .loc 1 441 2 ld.u64 %r213,[%r125+8]; add.u64 %r212,%r213,1; ld.u64 %r214,[%r125]; sub.u64 %r40,%r212,%r214; .loc 1 443 10 setp.le.s64 %r215,%r40,0; @ %r215 bra $L129; .loc 1 440 17 st.u64 [%r45],%r40; bra $L130; $L129: .loc 1 444 12 st.u64 [%r45],%r303; $L130: .loc 1 438 3 add.u64 %r125,%r125,24; add.u64 %r45,%r45,8; setp.ne.u64 %r217,%r125,%r132; @ %r217 bra $L131; bra $L164; $L127: .loc 1 452 7 setp.le.s64 %r218,%r90,0; @ %r218 bra $L133; add.u64 %r296,%frame,120; $L149: add.u64 %r78,%r149,40; mov.u64 %r61,%r296; cvt.u32.u32 %r220,%r23; cvt.s64.s8 %r219,%r220; add.u64 %r222,%r219,%r219; add.u64 %r223,%r222,%r219; shl.b64 %r224,%r223,3; add.u64 %r225,%r149,16; add.u64 %r126,%r224,%r225; .loc 1 455 10 mov.u64 %r81,1; .loc 1 459 4 mov.u64 %r248,0; bra $L134; $L133: .loc 1 462 24 mov.u64 %r226,0; st.u64 [%r149+8],%r226; .loc 1 463 28 cvt.u16.u32 %r229,%r23; add.u16 %r228,%r229,-1; cvt.u32.u16 %r230,%r228; st.u8 [%r149+28],%r230; .loc 1 465 20 add.u32 %r231,%r23,-2; cvt.s64.s32 %r51,%r231; add.u64 %r233,%r51,%r51; add.u64 %r234,%r233,%r51; shl.b64 %r235,%r234,3; add.u64 %r236,%r149,%r235; .loc 1 465 67 shl.b64 %r238,%r51,3; add.u64 %r239,%frame,%r238; .loc 1 465 59 ld.u64 %r241,[%r236+40]; ld.u64 %r242,[%r239+120]; mul.lo.u64 %r94,%r241,%r242; .loc 1 467 29 mov.u64 %r244244; call (%value_in),_gfortrani_xmallocarray5,[%value_in]; } .loc 1 467 27 st.u64 [%r149],%r245; .loc 1 468 10 setp.eq.u64 %r247,%r94,0; @ ! 
%r247 bra $L138; bra $L135; $L137: .loc 1 457 48 mul.lo.u64 %r81,%r47,%r81; $L134: .loc 1 459 4 st.u64 [%r78+8],%r248; ld.u64 %r47,[%r61]; add.u64 %r249,%r47,-1; st.u64 [%r78+16],%r249; st.u64 [%r78],%r81; .loc 1 452 7 add.u64 %r78,%r78,24; add.u64 %r61,%r61,8; setp.ne.u64 %r250,%r78,%r126; @ %r250 bra $L137; bra $L133; $L135: .loc 1 471 4 st.u64 [%r149+48],%r94; mov.u64 %r252,-1; st.u64 [%r149+56],%r252; mov.u64 %r253,1; st.u64 [%r149+40],%r253; .loc 1 472 4 bra $L115; $L128: .loc 1 477 19 ld.s8 %r58,[%r149+28]; .loc 1 477 10 setp.eq.u64 %r254,%r58,%r90; @ %r254 bra $L139; .loc 1 478 2 st.u64 [%stack+8],%r90; st.u64 [%stack],%r58;_gfortran_runtime_error139: .loc 1 483 11 cvta.global.u64 %r257,_gfortrani_compile_options; .loc 1 483 10 ld.u32 %r258,[%r257+36]; setp.eq.u32 %r259,%r258,0; @ %r259 bra $L138; .loc 1 484 2 add.u64 %r296,%frame,120; cvta.const.u64 %r263,$LC2r2962623; call _gfortrani_bounds_ifunction_return$L138: .loc 1 488 3 setp.gt.s64 %r265,%r90,0; @ %r265 bra $L140; $L143: .loc 1 495 8 ld.u64 %r97,[%r149]; ld.u64 %r42,[%frame+240]; .loc 1 503 22 ld.u64 %r64,[%frame]; .loc 1 503 12 shl.b64 %r66,%r64,3; setp.le.s64 %r297,%r90,1; cvt.u32.u32 %r299,%r23; cvt.s64.s8 %r300,%r299; add.u64 %r301,%r300,-1; add.u64 %r302,%frame,120; bra $L141; $L140: add.u64 %r114,%frame,240; add.u64 %r113,%r149,40; mov.u64 %r108,%frame; cvt.u32.u32 %r267,%r23; cvt.s64.s8 %r266,%r267; add.u64 %r79,%r266,-1; .loc 1 488 3 mov.u64 %r98,0; add.u64 %r296,%frame,120; .loc 1 490 16 mov.u64 %r268,%r98; $L142: st.u64 [%r114],%r268; .loc 1 491 18 ld.u64 %r269,[%r113]; st.u64 [%r108],%r269; .loc 1 492 17 shl.b64 %r271,%r98,3; add.u64 %r272,%r296,%r271; .loc 1 492 10 ld.u64 %r273,[%r272]; setp.le.s64 %r274,%r273,0; @ %r274 bra $L115; .loc 1 488 26 add.u64 %r98,%r98,1; .loc 1 488 3 add.u64 %r114,%r114,8; add.u64 %r113,%r113,24; add.u64 %r108,%r108,8; setp.ne.u64 %r275,%r79,%r98; @ %r275 bra $L142; bra $L143; $L141: .loc 1 500 13 mov.u64 %r276,0; st.u64 [%r97],%r276; .loc 1 502 15 add.u64 %r42,%r42,1; .loc 1 503 12 add.u64 %r97,%r97,%r66; .loc 1 505 32 ld.u64 %r88,[%frame+120]; .loc 1 505 13 setp.ne.u64 %r277,%r42,%r88; @ %r277 bra $L141; .loc 1 508 23 mul.lo.u64 %r278,%r64,%r42; .loc 1 508 9 shl.b64 %r279,%r278,3; sub.u64 %r30,%r97,%r279; .loc 1 510 7 @ ! %r297 bra $L165; bra $L115; $L148: .loc 1 507 13 st.u64 [%r102],%r276; .loc 1 508 23 mul.lo.u64 %r282,%r73,%r72; .loc 1 508 9 shl.b64 %r283,%r282,3; sub.u64 %r30,%r97,%r283; .loc 1 509 5 add.u64 %r96,%r96,1; .loc 1 510 7 add.u64 %r102,%r102,8; setp.ne.u64 %r284,%r96,%r301; @ %r284 bra $L147; bra $L115; $L165: add.u64 %r102,%frame,248; .loc 1 509 5 mov.u64 %r96,1; $L147: .loc 1 517 16 ld.u64 %r288,[%r102]; add.u64 %r72,%r288,1; st.u64 [%r102],%r72; shl.b64 %r120,%r96,3; .loc 1 518 23 add.u64 %r289,%frame,%r120; ld.u64 %r73,[%r289]; .loc 1 518 13 shl.b64 %r290,%r73,3; add.u64 %r97,%r30,%r290; .loc 1 505 32 add.u64 %r292,%r302,%r120; ld.u64 %r77,[%r292]; .loc 1 505 13 setp.eq.u64 %r293,%r72,%r77; @ %r293 bra $L148; .loc 1 507 13 mov.u64 %r42,0; bra $L141; $L164: .loc 1 448 6 ld.u64 %r294,[%r149]; setp.eq.u64 %r295,%r294,0; @ ! 
%r295 bra $L128; bra $L149; $L115: .loc 1 522 1 ret; }
[findloc1_c8.o: PTX assembly generated from "../../../libgfortran/generated/findloc1_c8.c" for the nvptx target. The member defines the visible functions _gfortran_findloc1_c8, _gfortran_mfindloc1_c8 (masked variant) and _gfortran_sfindloc1_c8 (scalar-mask wrapper), and declares the external symbols _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray, _gfortrani_bounds_ifunction_return and _gfortrani_internal_error. The embedded byte arrays decode to the usual FINDLOC run-time messages, among them "... incorrect in FINDLOC intrinsic: is %ld, should be between 1 and %l...", "rank of return array incorrect in FINDLOC intrinsic: is %ld, should be %l...", "Funny sized logical arra...", "return valu..." and "FINDLOC". The instruction stream itself is raw, partially truncated compiler output and is not reproduced here; a simplified sketch of the reduction these entry points perform follows below.]
[findloc1_c10.o/ 1622802215 (archive member header; the member's contents are not reproduced at this point in the extract), followed by findloc1_s1.o: PTX assembly generated from "../../../libgfortran/generated/findloc1_s1.c". This member defines _gfortran_findloc1_s1, _gfortran_mfindloc1_s1 and _gfortran_sfindloc1_s1 for character(kind=1) data and, in addition to the runtime symbols listed above, calls _gfortran_compare_string for the element comparisons; the same FINDLOC error-message constants are embedded. The raw instruction stream is omitted here as unreadable binary extraction.]
[findloc1_s4.o: PTX assembly generated from "../../../libgfortran/generated/findloc1_s4.c". The member declares _gfortran_findloc1_s4, _gfortran_mfindloc1_s4 and _gfortran_sfindloc1_s4 for character(kind=4) data and references _gfortran_compare_string_char4 alongside the runtime symbols noted above; the embedded error strings are the same FINDLOC messages. The raw, partially garbled instruction stream continues below.]
%r141,[%frame+240]; .loc 1 177 13 setp.ne.u64 %r346,%r174,%r141; @ %r346 bra $L25; .loc 1 180 23 mul.lo.u64 %r347,%r174,%r167; .loc 1 180 9 mul.lo.u64 %r348,%r347,%r193; shl.b64 %r349,%r348,2; sub.u64 %r127,%r130,%r349; .loc 1 181 23 mul.lo.u64 %r350,%r174,%r170; .loc 1 181 9 shl.b64 %r351,%r350,3; sub.u64 %r128,%r131,%r351; .loc 1 183 7 @ %r376 bra $L1; add.u64 %r65,%frame,368; mov.u64 %r68,8; .loc 1 182 5 mov.u64 %r129,1; .loc 1 179 13 mov.u64 %r377,0; bra $L35; $L36: st.u64 [%r65],%r377; .loc 1 180 23 mul.lo.u64 %r357,%r94,%r93; .loc 1 180 9 mul.lo.u64 %r358,%r357,%r182; sub.u64 %r127,%r130,%r358; .loc 1 181 23 mul.lo.u64 %r359,%r97,%r93; .loc 1 181 9 shl.b64 %r360,%r359,3; sub.u64 %r128,%r131,%r360; .loc 1 182 5 add.u64 %r129,%r129,1; .loc 1 183 7 add.u64 %r65,%r65,8; add.u64 %r68,%r68,8; setp.eq.u64 %r361,%r129,%r382; @ %r361 bra $L1; $L35: .loc 1 190 16 ld.u64 %r362,[%r65]; add.u64 %r93,%r362,1; st.u64 [%r65],%r93; .loc 1 191 23 add.u64 %r364,%r383,%r68; ld.u64 %r94,[%r364]; .loc 1 191 13 mad.lo.u64 %r130,%r94,%r182,%r127; .loc 1 192 23 add.u64 %r366,%frame,%r68; ld.u64 %r97,[%r366]; .loc 1 192 13 shl.b64 %r367,%r97,3; add.u64 %r131,%r128,%r367; .loc 1 177 32 add.u64 %r369,%r384,%r68; ld.u64 %r101,[%r369]; .loc 1 177 13 setp.eq.u64 %r370,%r93,%r101; @ %r370 bra $L36; mov.u64 %r174,0; bra $L25; $L57: .loc 1 88 6 ld.u64 %r371,[%r188]; setp.eq.u64 %r372,%r371,0; @ ! %r372 bra $L11; bra $L37; $L1: .loc 1 19_gfortran_mfindloc1_s4 .visible .func _gfortran_mfindloc1_s4608u64predu64pred %r270; .reg .predu64 %r279; .reg .u64 %r280; .reg .u64 %r282; .reg .u64 %r283; .reg .u64 %r285; .reg .u64pred %r297; .reg .u64 %r298; .reg .predu64 %r305; .reg .u32 %r306; .reg .u64u64predu64predpredpredpred %r397; .reg .u16 %r398; .reg .u32 %r399; .reg .pred %r400; .reg .pred %r401; .reg .u32 %r406; .reg .predpred %r430; .reg .u64predmov.u64 %r206,%ar0; mov.u64 %r207,%ar1; mov.u64 %r208,%ar2; mov.u64 %r209,%ar3; mov.u64 %r210,%ar4; mov.u32 %r211,%ar5; mov.u64 %r212,%ar6; mov.u64 %r213,%ar7; .loc 1 227 10 ld.s8 %r22,[%r207+28]; .loc 1 227 38 add.u32 %r214,%r22,-1; .loc 1 227 8 cvt.s64.s32 %r138,%r214; .loc 1 228 10 ld.u64 %r25,[%r209]; .loc 1 228 7 add.u64 %r139,%r25,-1; .loc 1 230 7 shr.u64 %r216,%r139,63; set.u32.lt.s64 %r218,%r138,%r139; neg.s32 %r219,%r218; cvt.u16.u64 %r222,%r216; cvt.u16.u32 %r223,%r219; or.b16 %r221,%r222,%r223; .loc 1 230 6 cvt.u32.u16 %r224,%r221; cvt.u16.u8 %r225,%r224; setp.eq.u16 %r226,%r225,0; @ %r226 bra $L59; .loc 1 232 7 cvt.u32.u32 %r229,%r22; cvt.s64.s8 %r228,%r229; st.u64 [%stack+8],%r22827_gfortran_runtime_error9: .loc 1 237 9 add.u64 %r232,%r139,%r139; add.u64 %r233,%r232,%r139; shl.b64 %r234,%r233,3; add.u64 %r235,%r207,%r234; ld.u64 %r31,[%r235+56]; ld.u64 %r33,[%r235+48]; .loc 1 241 9 ld.u64 %r142,[%r235+40]; .loc 1 242 12 add.u64 %r253,%r210,%r234; ld.u64 %r34,[%r253+40]; ld.u64 %r36,[%r210+16]; .loc 1 244 9 ld.u64 %r151,[%r210]; .loc 1 248 22 cvt.u32.u64 %r38,%r36; .loc 1 248 53 add.u32 %r255,%r38,-4; and.b32 %r256,%r255,-5; set.u32.eq.u32 %r258,%r256,0; neg.s32 %r259,%r258; .loc 1 248 22 add.u32 %r260,%r38,-1; set.u32.le.u32 %r262,%r260,1; neg.s32 %r263,%r262; .loc 1 248 6 cvt.u16.u32 %r265,%r259; cvt.u16.u32 %r266,%r263; or$L64: .loc 1 257 3 setp.ne.u64 %r270,%r139,0; @ %r270 bra $L61; $L68: .loc 1 266 3 setp.gt.s64 %r271,%r138,%r139; @ %r271 bra $L62; bra $L118; $L60: .loc 1 250 7 setp.eq.u32 %r273,%r38,16; @ %r273 bra $L64; .loc 1 255 5 cvta.const.u64 %r275,$LC4; mov.u64 %r274,27275; call _gfortrani_internal_error61: add.u64 %r119,%r207,40; add.u64 %r79,%r210,40; 
add.u64 %r276,%r207,16; add.u64 %r278,%r25,%r25; add.u64 %r279,%r278,%r25; shl.b64 %r280,%r279,3; add.u64 %r63,%r276,%r280; .loc 1 257 3 mov.u64 %r156,0; add.u64 %r431,%frame,240; add.u64 %r432,%frame,120; add.u64 %r429,%frame,360; .loc 1 264 12 mov.u64 %r442,%r156; $L67: .loc 1 259 18 add.u64 %r282,%r431,%r156; ld.u64 %r283,[%r119]; st.u64 [%r282],%r283; .loc 1 260 18 add.u64 %r285,%r432,%r156; .loc 1 260 20 ld.u64 %r287,[%r79]; mul.lo.u64 %r286,%r287,%r36; .loc 1 260 18 st.u64 [%r285],%r286; .loc 1 261 19 ld.u64 %r289,[%r119+16]; add.u64 %r288,%r289,1; ld.u64 %r290,[%r119+8]; sub.u64 %r51,%r288,%r290; .loc 1 263 10 setp.lt.s64 %r291,%r51,0; @ %r291 bra $L65; .loc 1 261 17 add.u64 %r293,%r429,%r156; st.u64 [%r293],%r51; bra $L66; $L65: .loc 1 264 12 add.u64 %r295,%r429,%r156; st.u64 [%r295],%r442; $L66: .loc 1 257 3 add.u64 %r119,%r119,24; add.u64 %r79,%r79,24; add.u64 %r156,%r156,8; setp.ne.u64 %r297,%r63,%r119; @ %r297 bra $L67; bra $L68; $L118: .loc 1 276 6 ld.u64 %r298,[%r206]; setp.eq.u64 %r299,%r298,0; @ ! %r299 bra $L70; bra $L69; $L62: add.u64 %r301,%r25,%r25; add.u64 %r302,%r301,%r25; shl.b64 %r303,%r302,3; add.u64 %r162,%r303,40; add.u64 %r167,%r207,%r162; add.u64 %r160,%r210,%r162; shl.b64 %r304,%r25,3; add.u64 %r133,%r304,-8; cvt.u32.u32 %r306,%r22; cvt.s64.s8 %r305,%r306; shl.b64 %r307,%r305,3; add.u64 %r123,%r307,-8; add.u64 %r431,%frame,240; add.u64 %r432,%frame,120; add.u64 %r429,%frame,360; .loc 1 273 12 mov.u64 %r441,0; $L73: .loc 1 268 18 add.u64 %r309,%r431,%r133; ld.u64 %r310,[%r167]; st.u64 [%r309],%r310; .loc 1 269 18 add.u64 %r312,%r432,%r133; .loc 1 269 20 ld.u64 %r314,[%r160]; mul.lo.u64 %r313,%r314,%r36; .loc 1 269 18 st.u64 [%r312],%r313; .loc 1 270 19 ld.u64 %r316,[%r167+16]; add.u64 %r315,%r316,1; ld.u64 %r317,[%r167+8]; sub.u64 %r60,%r315,%r317; .loc 1 272 10 setp.lt.s64 %r318,%r60,0; @ %r318 bra $L71; .loc 1 270 17 add.u64 %r320,%r429,%r133; st.u64 [%r320],%r60; bra $L72; $L71: .loc 1 273 12 add.u64 %r322,%r429,%r133; st.u64 [%r322],%r441; $L72: .loc 1 266 3 add.u64 %r167,%r167,24; add.u64 %r160,%r160,24; add.u64 %r133,%r133,8; setp.ne.u64 %r324,%r123,%r133; @ %r324 bra $L73; bra $L119; $L69: .loc 1 280 7 setp.le.s64 %r325,%r138,0; @ %r325 bra $L75; add.u64 %r429,%frame,360; $L99: add.u64 %r178,%r206,40; mov.u64 %r176,%r429; add.u64 %r326,%r206,16; cvt.u32.u32 %r328,%r22; cvt.s64.s8 %r327,%r328; add.u64 %r330,%r327,%r327; add.u64 %r331,%r330,%r327; shl.b64 %r332,%r331,3; add.u64 %r168,%r326,%r332; .loc 1 283 10 mov.u64 %r128,1; .loc 1 287 4 mov.u64 %r355,0; bra $L76; $L75: .loc 1 291 24 mov.u64 %r333,0; st.u64 [%r206+8],%r333; .loc 1 292 28 cvt.u16.u32 %r336,%r22; add.u16 %r335,%r336,-1; cvt.u32.u16 %r337,%r335; st.u8 [%r206+28],%r337; .loc 1 294 20 add.u32 %r338,%r22,-2; cvt.s64.s32 %r71,%r338; add.u64 %r340,%r71,%r71; add.u64 %r341,%r340,%r71; shl.b64 %r342,%r341,3; add.u64 %r343,%r206,%r342; .loc 1 294 67 shl.b64 %r345,%r71,3; add.u64 %r346,%frame,%r345; .loc 1 294 59 ld.u64 %r348,[%r343+40]; ld.u64 %r349,[%r346+360]; mul.lo.u64 %r145,%r348,%r349; .loc 1 296 29 mov.u64 %r351call (%value_in),_gfortrani_xmallocarray52,[%value_in]; } .loc 1 296 27 st.u64 [%r206],%r352; .loc 1 297 10 setp.eq.u64 %r354,%r145,0; @ ! 
%r354 bra $L80; bra $L77; $L79: .loc 1 285 48 mul.lo.u64 %r128,%r67,%r128; $L76: .loc 1 287 4 st.u64 [%r178+8],%r355; ld.u64 %r67,[%r176]; add.u64 %r356,%r67,-1; st.u64 [%r178+16],%r356; st.u64 [%r178],%r128; .loc 1 280 7 add.u64 %r178,%r178,24; add.u64 %r176,%r176,8; setp.ne.u64 %r357,%r168,%r178; @ %r357 bra $L79; bra $L75; $L77: .loc 1 300 4 st.u64 [%r206+48],%r145; mov.u64 %r359,-1; st.u64 [%r206+56],%r359; mov.u64 %r360,1; st.u64 [%r206+40],%r360; .loc 1 301 4 bra $L58; $L70: .loc 1 306 19 ld.s8 %r77,[%r206+28]; .loc 1 306 10 setp.eq.u64 %r361,%r77,%r138; @ %r361 bra $L82; .loc 1 307 2 st.u64 [%stack+8],%r138; st.u64 [%stack],%r77;_gfortran_runtime_error82: .loc 1 312 11 cvta.global.u64 %r364,_gfortrani_compile_options; .loc 1 312 10 ld.u32 %r365,[%r364+36]; setp.eq.u32 %r366,%r365,0; @ %r366 bra $L80; .loc 1 313 2 add.u64 %r429,%frame,360; cvta.const.u64 %r370,$LC2; cvta.const.u64 %r369,$LC3; {369370; call _gfortrani_bounds_ifunction_return$L80: .loc 1 317 3 setp.gt.s64 %r372,%r138,0; @ %r372 bra $L83; $L86: .loc 1 237 9 add.u64 %r373,%r31,1; .loc 1 237 7 sub.u64 %r140,%r373,%r33; max.s64 %r64,%r140,0; .loc 1 242 12 mul.lo.u64 %r37,%r34,%r36; .loc 1 325 8 ld.u64 %r155,[%r206]; .loc 1 328 8 ld.u64 %r154,[%r207]; shl.b64 %r82,%r212,2; .loc 1 353 35 mul.lo.u64 %r93,%r142,%r82; .loc 1 338 22 add.u64 %r62,%r64,-1; .loc 1 338 15 mul.lo.u64 %r84,%r62,%r93; .loc 1 339 29 mul.lo.u64 %r172,%r37,%r62; .loc 1 340 34 neg.s64 %r88,%r93; .loc 1 365 22 ld.u64 %r194,[%frame+240]; .loc 1 365 12 mul.lo.u64 %r374,%r194,%r212; shl.b64 %r196,%r374,2; .loc 1 366 23 ld.u64 %r197,[%frame+120]; .loc 1 367 22 ld.u64 %r199,[%frame]; .loc 1 367 12 shl.b64 %r201,%r199,3; .loc 1 364 12 ld.u64 %r205,[%frame+480]; setp.ne.u32 %r430,%r211,0; add.u64 %r428,%frame,488; .loc 1 340 4 setp.le.s64 %r436,%r140,0; .loc 1 353 4 setp.gt.s64 %r437,%r140,0; .loc 1 384 23 add.u64 %r438,%frame,240; .loc 1 369 32 add.u64 %r439,%frame,360; .loc 1 373 20 add.u64 %r440,%frame,120; bra $L84; $L83: add.u64 %r175,%frame,480; add.u64 %r137,%r206,40; mov.u64 %r188,%frame; cvt.u32.u32 %r376,%r22; cvt.s64.s8 %r375,%r376; add.u64 %r179,%r375,-1; .loc 1 317 3 mov.u64 %r29,0; add.u64 %r429,%frame,360; .loc 1 319 16 mov.u64 %r377,%r29; $L85: st.u64 [%r175],%r377; .loc 1 320 18 ld.u64 %r378,[%r137]; st.u64 [%r188],%r378; .loc 1 321 17 shl.b64 %r380,%r29,3; add.u64 %r381,%r429,%r380; .loc 1 321 10 ld.u64 %r382,[%r381]; setp.le.s64 %r383,%r382,0; @ %r383 bra $L58; .loc 1 317 26 add.u64 %r29,%r29,1; .loc 1 317 3 add.u64 %r175,%r175,8; add.u64 %r137,%r137,24; add.u64 %r188,%r188,8; setp.ne.u64 %r384,%r29,%r179; @ %r384 bra $L85; bra $L86; $L84: .loc 1 336 10 @ %r430 bra $L87; .loc 1 353 4 @ %r437 bra $L100; $L90: .loc 1 335 14 mov.u64 %r96,0; bra $L89; $L87: .loc 1 338 8 add.u64 %r148,%r154,%r84; .loc 1 339 9 add.u64 %r149,%r151,%r172; .loc 1 340 4 @ %r436 bra $L90; .loc 1 340 11 mov.u64 %r96,%r64; $L92: .loc 1 342 11 ld.s8 %r389,[%r149]; cvt.u16.u32 %r388,%r389; setp.ne.u16 %r390,%r388,0; @ %r390 bra $L91; $L93: .loc 1 340 26 add.u64 %r96,%r96,-1; .loc 1 340 34 add.u64 %r148,%r148,%r88; .loc 1 340 61 sub.u64 %r149,%r149,%r37; .loc 1 340 4 setp.ne.u64 %r391,%r96,0; @ %r391 bra $L92; bra $L902121308; call (%value_in),_gfortran_compare_string_char4ld.param.u32 %r396,[%value_in]; } .loc 1 342 18 setp.eq.u32 %r397,%r396,0; @ ! 
%r397 bra $L93; bra $L89; $L100: .loc 1 353 4 mov.u64 %r147,%r151; mov.u64 %r146,%r154; .loc 1 353 11 mov.u64 %r96,1; $L88: .loc 1 355 11 ld.s8 %r399,[%r147]; cvt.u16.u32 %r398,%r399; setp.ne.u16 %r400,%r398,0; @ %r400 bra $L94; $L95: .loc 1 353 27 add.u64 %r96,%r96,1; .loc 1 353 35 add.u64 %r146,%r146,%r93; .loc 1 353 62 add.u64 %r147,%r147,%r37; .loc 1 353 4 setp.ge.s64 %r401,%r64,%r96; @ %r401 bra $L88; bra $L9021621308; call (%value_in),_gfortran_compare_string_char4ld.param.u32 %r406,[%value_in]; } .loc 1 355 18 setp.ne.u32 %r407,%r406,0; @ %r407 bra $L95; $L89: .loc 1 362 13 st.u64 [%r155],%r96; .loc 1 364 15 add.u64 %r205,%r205,1; st.u64 [%frame+480],%r205; .loc 1 365 12 add.u64 %r154,%r154,%r196; .loc 1 366 13 add.u64 %r151,%r151,%r197; .loc 1 367 12 add.u64 %r155,%r155,%r201; .loc 1 369 32 ld.u64 %r116,[%frame+360]; .loc 1 369 13 setp.ne.u64 %r408,%r205,%r116; @ %r408 bra $L84; mov.u64 %r83,%r428; mov.u64 %r113,%r199; mov.u64 %r203,%r197; mov.u64 %r110,%r194; mov.u64 %r42,8; .loc 1 368 9 mov.u64 %r153,0; .loc 1 371 13 mov.u64 %r410,%r153; $L98: st.u64 [%r83+-8],%r410; .loc 1 372 23 mul.lo.u64 %r98,%r116,%r110; .loc 1 373 24 mul.lo.u64 %r411,%r116,%r203; .loc 1 373 10 sub.u64 %r151,%r151,%r411; .loc 1 374 23 mul.lo.u64 %r105,%r116,%r113; .loc 1 375 5 add.u64 %r153,%r153,1; .loc 1 376 7 setp.le.s64 %r412,%r138,%r153; @ %r412 bra $L58; .loc 1 383 16 ld.u64 %r413,[%r83]; add.u64 %r109,%r413,1; st.u64 [%r83],%r109; .loc 1 384 23 add.u64 %r415,%r438,%r42; ld.u64 %r110,[%r415]; .loc 1 384 13 sub.u64 %r416,%r110,%r98; mad.lo.u64 %r154,%r416,%r82,%r154; .loc 1 385 23 add.u64 %r418,%frame,%r42; ld.u64 %r113,[%r418]; .loc 1 385 13 sub.u64 %r419,%r113,%r105; shl.b64 %r420,%r419,3; add.u64 %r155,%r155,%r420; .loc 1 369 32 add.u64 %r422,%r439,%r42; ld.u64 %r116,[%r422]; .loc 1 369 13 add.u64 %r83,%r83,8; setp.eq.u64 %r423,%r109,%r116; @ %r423 bra $L97; .loc 1 364 12 ld.u64 %r205,[%frame+480]; bra $L84; $L97: .loc 1 373 20 add.u64 %r425,%r440,%r42; ld.u64 %r203,[%r425]; add.u64 %r42,%r42,8; bra $L98; $L119: .loc 1 276 6 ld.u64 %r426,[%r206]; setp.eq.u64 %r427,%r426,0; @ ! 
%r427 bra $L70; bra $L99; $L58:_gfortran_sfindloc1_s4 .visible .func _gfortran_sfindloc1_s468u64 %r126; .reg .u64pred %r157; .reg .u32 %r158; .reg .predu32 %r177; .reg .u16 %r178; .reg .pred %r179; .reg .pred %r180; .reg .predpredpredu64predpredpredu64 %r305; .reg .u64 %r306; mov.u64 %r149,%ar0; mov.u64 %r150,%ar1; mov.u64 %r151,%ar2; mov.u64 %r152,%ar3; mov.u64 %r153,%ar4; mov.u32 %r154,%ar5; mov.u64 %r155,%ar6; mov.u64 %r156,%ar7; .loc 1 412 6 setp.eq.u64 %r157,%r153,0; @ %r157 bra $L121; .loc 1 412 20 ld.u32 %r158,[%r153]; setp.eq.u32 %r159,%r158,0; @ %r159 bra $L122; $L121:.loc 1 415 7 bra $L120; $L122: .loc 1 418 10 ld.s8 %r23,[%r150+28]; .loc 1 418 38 add.u32 %r167,%r23,-1; .loc 1 418 8 cvt.s64.s32 %r90,%r167; .loc 1 419 10 ld.u64 %r26,[%r152]; .loc 1 419 7 add.u64 %r91,%r26,-1; .loc 1 421 7 shr.u64 %r169,%r91,63; set.u32.lt.s64 %r171,%r90,%r91; neg.s32 %r172,%r171; cvt.u16.u64 %r175,%r169; cvt.u16.u32 %r176,%r172; or.b16 %r174,%r175,%r176; .loc 1 421 6 cvt.u32.u16 %r177,%r174; cvt.u16.u8 %r178,%r177; setp.ne.u16 %r179,%r178,0; @ %r179 bra $L124; .loc 1 432 3 setp.ne.u64 %r180,%r91,0; @ %r180 bra $L125; $L131: .loc 1 440 3 setp.gt.s64 %r181,%r90,%r91; @ %r181 bra $L126; bra $L168; $L124: .loc 1 423 7 cvt.u32.u32 %r184,%r23; cvt.s64.s8 %r183,%r184; st.u64 [%stack+8],%r183182182_gfortran_runtime_error125: add.u64 %r136,%r150,48; add.u64 %r138,%frame,120; add.u64 %r139,%r150,56; add.u64 %r186,%r150,24; add.u64 %r188,%r26,%r26; add.u64 %r189,%r188,%r26; shl.b64 %r190,%r189,3; add.u64 %r145,%r186,%r190; .loc 1 437 12 mov.u64 %r306,0; $L130: .loc 1 434 19 ld.u64 %r192,[%r139]; add.u64 %r191,%r192,1; ld.u64 %r193,[%r136]; sub.u64 %r35,%r191,%r193; .loc 1 436 10 setp.le.s64 %r194,%r35,0; @ %r194 bra $L128; .loc 1 434 17 st.u64 [%r138],%r35; bra $L129; $L128: .loc 1 437 12 st.u64 [%r138],%r306; $L129: .loc 1 432 3 add.u64 %r136,%r136,24; add.u64 %r138,%r138,8; add.u64 %r139,%r139,24; setp.ne.u64 %r196,%r136,%r145; @ %r196 bra $L130; bra $L131; $L168: .loc 1 450 6 ld.u64 %r197,[%r149]; setp.eq.u64 %r198,%r197,0; @ ! 
%r198 bra $L133; bra $L132; $L126: add.u64 %r200,%r26,%r26; add.u64 %r201,%r200,%r26; shl.b64 %r202,%r201,3; add.u64 %r203,%r202,48; add.u64 %r125,%r150,%r203; add.u64 %r298,%frame,120; shl.b64 %r205,%r26,3; add.u64 %r206,%r205,-8; add.u64 %r45,%r298,%r206; add.u64 %r207,%r150,48; cvt.u32.u32 %r209,%r23; cvt.s64.s8 %r208,%r209; add.u64 %r211,%r208,%r208; add.u64 %r212,%r211,%r208; shl.b64 %r213,%r212,3; add.u64 %r132,%r207,%r213; .loc 1 446 12 mov.u64 %r305,0; $L136: .loc 1 443 2 ld.u64 %r215,[%r125+8]; add.u64 %r214,%r215,1; ld.u64 %r216,[%r125]; sub.u64 %r40,%r214,%r216; .loc 1 445 10 setp.le.s64 %r217,%r40,0; @ %r217 bra $L134; .loc 1 442 17 st.u64 [%r45],%r40; bra $L135; $L134: .loc 1 446 12 st.u64 [%r45],%r305; $L135: .loc 1 440 3 add.u64 %r125,%r125,24; add.u64 %r45,%r45,8; setp.ne.u64 %r219,%r125,%r132; @ %r219 bra $L136; bra $L169; $L132: .loc 1 454 7 setp.le.s64 %r220,%r90,0; @ %r220 bra $L138; add.u64 %r298,%frame,120; $L154: add.u64 %r78,%r149,40; mov.u64 %r61,%r298; cvt.u32.u32 %r222,%r23; cvt.s64.s8 %r221,%r222; add.u64 %r224,%r221,%r221; add.u64 %r225,%r224,%r221; shl.b64 %r226,%r225,3; add.u64 %r227,%r149,16; add.u64 %r126,%r226,%r227; .loc 1 457 10 mov.u64 %r81,1; .loc 1 461 4 mov.u64 %r250,0; bra $L139; $L138: .loc 1 464 24 mov.u64 %r228,0; st.u64 [%r149+8],%r228; .loc 1 465 28 cvt.u16.u32 %r231,%r23; add.u16 %r230,%r231,-1; cvt.u32.u16 %r232,%r230; st.u8 [%r149+28],%r232; .loc 1 467 20 add.u32 %r233,%r23,-2; cvt.s64.s32 %r51,%r233; add.u64 %r235,%r51,%r51; add.u64 %r236,%r235,%r51; shl.b64 %r237,%r236,3; add.u64 %r238,%r149,%r237; .loc 1 467 67 shl.b64 %r240,%r51,3; add.u64 %r241,%frame,%r240; .loc 1 467 59 ld.u64 %r243,[%r238+40]; ld.u64 %r244,[%r241+120]; mul.lo.u64 %r94,%r243,%r244; .loc 1 469 29 mov.u64 %r246246; call (%value_in),_gfortrani_xmallocarray7,[%value_in]; } .loc 1 469 27 st.u64 [%r149],%r247; .loc 1 470 10 setp.eq.u64 %r249,%r94,0; @ ! 
%r249 bra $L143; bra $L140; $L142: .loc 1 459 48 mul.lo.u64 %r81,%r47,%r81; $L139: .loc 1 461 4 st.u64 [%r78+8],%r250; ld.u64 %r47,[%r61]; add.u64 %r251,%r47,-1; st.u64 [%r78+16],%r251; st.u64 [%r78],%r81; .loc 1 454 7 add.u64 %r78,%r78,24; add.u64 %r61,%r61,8; setp.ne.u64 %r252,%r78,%r126; @ %r252 bra $L142; bra $L138; $L140: .loc 1 473 4 st.u64 [%r149+48],%r94; mov.u64 %r254,-1; st.u64 [%r149+56],%r254; mov.u64 %r255,1; st.u64 [%r149+40],%r255; .loc 1 474 4 bra $L120; $L133: .loc 1 479 19 ld.s8 %r58,[%r149+28]; .loc 1 479 10 setp.eq.u64 %r256,%r58,%r90; @ %r256 bra $L144; .loc 1 480 2 st.u64 [%stack+8],%r90; st.u64 [%stack],%r58;_gfortran_runtime_error144: .loc 1 485 11 cvta.global.u64 %r259,_gfortrani_compile_options; .loc 1 485 10 ld.u32 %r260,[%r259+36]; setp.eq.u32 %r261,%r260,0; @ %r261 bra $L143; .loc 1 486 2 add.u64 %r298,%frame,120; cvta.const.u64 %r265,$LC2r292645; call _gfortrani_bounds_ifunction_return$L143: .loc 1 490 3 setp.gt.s64 %r267,%r90,0; @ %r267 bra $L145; $L148: .loc 1 497 8 ld.u64 %r97,[%r149]; ld.u64 %r42,[%frame+240]; .loc 1 505 22 ld.u64 %r64,[%frame]; .loc 1 505 12 shl.b64 %r66,%r64,3; setp.le.s64 %r299,%r90,1; cvt.u32.u32 %r301,%r23; cvt.s64.s8 %r302,%r301; add.u64 %r303,%r302,-1; add.u64 %r304,%frame,120; bra $L146; $L145: add.u64 %r114,%frame,240; add.u64 %r113,%r149,40; mov.u64 %r108,%frame; cvt.u32.u32 %r269,%r23; cvt.s64.s8 %r268,%r269; add.u64 %r79,%r268,-1; .loc 1 490 3 mov.u64 %r98,0; add.u64 %r298,%frame,120; .loc 1 492 16 mov.u64 %r270,%r98; $L147: st.u64 [%r114],%r270; .loc 1 493 18 ld.u64 %r271,[%r113]; st.u64 [%r108],%r271; .loc 1 494 17 shl.b64 %r273,%r98,3; add.u64 %r274,%r298,%r273; .loc 1 494 10 ld.u64 %r275,[%r274]; setp.le.s64 %r276,%r275,0; @ %r276 bra $L120; .loc 1 490 26 add.u64 %r98,%r98,1; .loc 1 490 3 add.u64 %r114,%r114,8; add.u64 %r113,%r113,24; add.u64 %r108,%r108,8; setp.ne.u64 %r277,%r79,%r98; @ %r277 bra $L147; bra $L148; $L146: .loc 1 502 13 mov.u64 %r278,0; st.u64 [%r97],%r278; .loc 1 504 15 add.u64 %r42,%r42,1; .loc 1 505 12 add.u64 %r97,%r97,%r66; .loc 1 507 32 ld.u64 %r88,[%frame+120]; .loc 1 507 13 setp.ne.u64 %r279,%r42,%r88; @ %r279 bra $L146; .loc 1 510 23 mul.lo.u64 %r280,%r64,%r42; .loc 1 510 9 shl.b64 %r281,%r280,3; sub.u64 %r30,%r97,%r281; .loc 1 512 7 @ ! %r299 bra $L170; bra $L120; $L153: .loc 1 509 13 st.u64 [%r102],%r278; .loc 1 510 23 mul.lo.u64 %r284,%r73,%r72; .loc 1 510 9 shl.b64 %r285,%r284,3; sub.u64 %r30,%r97,%r285; .loc 1 511 5 add.u64 %r96,%r96,1; .loc 1 512 7 add.u64 %r102,%r102,8; setp.ne.u64 %r286,%r96,%r303; @ %r286 bra $L152; bra $L120; $L170: add.u64 %r102,%frame,248; .loc 1 511 5 mov.u64 %r96,1; $L152: .loc 1 519 16 ld.u64 %r290,[%r102]; add.u64 %r72,%r290,1; st.u64 [%r102],%r72; shl.b64 %r120,%r96,3; .loc 1 520 23 add.u64 %r291,%frame,%r120; ld.u64 %r73,[%r291]; .loc 1 520 13 shl.b64 %r292,%r73,3; add.u64 %r97,%r30,%r292; .loc 1 507 32 add.u64 %r294,%r304,%r120; ld.u64 %r77,[%r294]; .loc 1 507 13 setp.eq.u64 %r295,%r72,%r77; @ %r295 bra $L153; .loc 1 509 13 mov.u64 %r42,0; bra $L146; $L169: .loc 1 450 6 ld.u64 %r296,[%r149]; setp.eq.u64 %r297,%r296,0; @ ! 
%r297 bra $L133; bra $L154; $L120: .loc 1 524 1 ret; } findloc2_s1.o/gfortran_findloc2_s1gfortran_findloc2_s1); .file 1 "../../../libgfortran/generated/findloc2_s1.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc2_s1gfortran_mfindloc2_s1_gfortran_sfindloc2_s1gfortran_sfindloc2_s1_gfortran_compare_gfortran_compare); // BEGIN GLOBAL FUNCTION DECL: _gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_findloc2_s1gfortran_findloc2_s1predu32 %r63; .reg .pred %r64; .reg .pred44 12 ld.u64 %r45,[%r39+56]; add.u64 %r44,%r45,1; .loc 1 44 10 ld.u64 %r46,[%r39+48]; sub.u64 %r38,%r44,%r46; .loc 1 45 6 setp.gt.s64 %r47,%r38,0; @ %r47 bra $L2; $L6: .loc 1 46 12 mov.u64 %r38,0; bra $L1; $L2: .loc 1 48 44 ld.u64 %r48,[%r39+40]; mul.lo.u64 %r27,%r42,%r48; .loc 1 51 18 ld.u64 %r35,[%r39]; .loc 1 49 6 setp.eq.u32 %r49,%r41,0; @ %r49 bra $L7; .loc 1 51 40 add.u64 %r50,%r38,-1; .loc 1 51 11 mad.lo.u64 %r37,%r50,%r27,%r35; $L5: .loc 1 54(%value_in),_gfortran_compare_stringld.param.u32 %r56,[%value_in]; } .loc 1 54 7 setp.eq.u32 %r57,%r56,0; @ %r57 bra $L1; .loc 1 56 8 sub.u64 %r37,%r37,%r27; .loc 1 52 33 add.u64 %r38,%r38,-1; .loc 1 52 7 setp.ne.u64 %r58,%r38,-1; @ %r58 bra $L5; bra $L6; $L7: .loc 1 62 14 mov.u64 %r36,1; $L4: .loc 1 64(%value_in),_gfortran_compare_string64 7 setp.eq.u32 %r64,%r63,0; @ %r64 bra $L8; .loc 1 66 8 add.u64 %r35,%r35,%r27; .loc 1 62 33 add.u64 %r36,%r36,1; .loc 1 62 7 setp.ge.s64 %r65,%r38,%r36; @ %r65 bra $L4; bra $L6; $L8: mov.u64 %r38,%r36; $L1: .loc 1 70gfortran_mfindloc2_s1gfortran_mfindloc2_s1predpredpredpred.loc 1 92 12 ld.u64 %r66,[%r59+56]; add.u64 %r65,%r66,1; .loc 1 92 10 ld.u64 %r67,[%r59+48]; sub.u64 %r58,%r65,%r67; .loc 1 93 6 setp.gt.s64 %r68,%r58,0; @ %r68 bra $L15; $L23: .loc 1 94 12 mov.u64 %r58,0; bra $L14; $L15: .loc 1 96 15 ld.u64 %r25,[%r61+16]; .loc 1 97 9 ld.u64 %r53,[%r61]; .loc 1 99 22 cvt.u32.u64 %r26,%r25; .loc 1 99 53 add.u32 %r69,%r26,-4; and.b32 %r70,%r69,-5;.loc 1 99 22 add.u32 %r74,%r26,-1; set.u32.le.u32 %r76,%r74,1; neg.s32 %r77,%r76; .loc 1 99 6 cvt.u16.u32 %r79,%r73; cvt.u16.u32 %r80,%r77; or.b16 %r78,%r79,%r80;17; .loc 1 101 7 setp.ne.u32 %r85,%r26,16; @ %r85 bra $L18; $L17: .loc 1 108 44 ld.u64 %r86,[%r59+40]; mul.lo.u64 %r31,%r63,%r86; .loc 1 109 13 ld.u64 %r87,[%r61+40]; mul.lo.u64 %r34,%r25,%r87; .loc 1 113 18 ld.u64 %r52,[%r59]; .loc 1 111 6 setp.ne.u32 %r88,%r62,0; @ %r88 bra $L19; .loc 1 126 14 mov.u64 %r54,1; bra $L20; $L18: .loc 1 106 5 cvta.const.u64 %r90,$LC0; mov.u64 %r89,0; {r90; call _gfortrani_internal_error19: .loc 1 113 40 add.u64 %r35,%r58,-1; .loc 1 113 11 mad.lo.u64 %r55,%r31,%r35,%r52; .loc 1 114 13 mad.lo.u64 %r56,%r34,%r35,%r53; $L22: .loc 1 117 7 ld.s8 %r94,[%r56]; cvt.u16.u32 %r93,%r94; setp.ne.u16 %r95,%r93,0; @ %r95 bra $L21; $L24: .loc 1 119 8 sub.u64 %r55,%r55,%r31; .loc 1 120 10 sub.u64 %r56,%r56,%r34; .loc 1 115 33 add.u64 %r58,%r58,-1; .loc 1 115 7 setp.ne.u64 %r96,%r58,-1; @ %r96 bra $L22; bra $L23; $L21: .loc 1 117 1call (%value_in),_gfortran_compare_string117 15 setp.eq.u32 %r102,%r101,0; @ ! 
%r102 bra $L24; bra $L14; $L20: .loc 1 128 7 ld.s8 %r104,[%r53]; cvt.u16.u32 %r103,%r104; setp.ne.u16 %r105,%r103,0; @ %r105 bra $L25; $L26: .loc 1 130 8 add.u64 %r52,%r52,%r31; .loc 1 131 10 add.u64 %r53,%r53,%r34; .loc 1 126 33 add.u64 %r54,%r54,1; .loc 1 126 7 setp.ge.s64 %r106,%r58,%r54; @ %r106 bra $L20; bra $L23; $L25: .loc 1 128 1call (%value_in),_gfortran_compare_string28 15 setp.ne.u32 %r112,%r111,0; @ %r112 bra $L26; mov.u64 %r58,%r54; $L14gfortran_sfindloc2_s1gfortran_sfindloc2_s1mov.u64 %r28,%ar4; mov.u64 %r29,%ar5; .loc 1 148 6 setp.eq.u64 %r30,%r26,0; @ %r30 bra $L33; .loc 1 148 20 ld.u32 %r31,[%r26];$L33: .loc 1 150 call (%value_in),_gfortran_findloc2_38,[%value_in]; } mov.u64 %r23,%r38; bra $L32; $L35: .loc 1 152 10 mov.u64 %r23,0; $L3 findloc2_s4.o/gfortran_findloc2_s4gfortran_findloc2_s4); .file 1 "../../../libgfortran/generated/findloc2_s4.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_mfindloc2_s4gfortran_mfindloc2_s_gfortran_sfindloc2_s4gfortran_sfindloc2_s_gfortran_compare_string_char4_gfortran_compare_string_char4gfortrani_internal_error .extern .func _gfortrani_internal_error21,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97_gfortran_findloc2_s4gfortran_findloc2_s4pred3244 12 ld.u64 %r47,[%r41+56]; add.u64 %r46,%r47,1; .loc 1 44 10 ld.u64 %r48,[%r41+48]; sub.u64 %r40,%r46,%r48; .loc 1 45 6 setp.gt.s64 %r49,%r40,0; @ %r49 bra $L2; $L6: .loc 1 46 12 mov.u64 %r40,0; bra $L1; $L2: .loc 1 48 44 ld.u64 %r50,[%r41+40]; mul.lo.u64 %r27,%r44,%r50; .loc 1 56 8 shl.b64 %r35,%r27,2; .loc 1 51 18 ld.u64 %r37,[%r41]; .loc 1 49 6 setp.eq.u32 %r51,%r43,0; @ %r51 bra $L7; .loc 1 51 40 add.u64 %r52,%r40,-1; .loc 1 51 45 mul.lo.u64 %r53,%r52,%r27; .loc 1 51 30 shl.b64 %r54,%r53,2; .loc 1 51 11 add.u64 %r39,%r37,%r54; .loc 1 56 8 neg.s64 %r33,%r35; $L5: .loc 1 54call (%value_in),_gfortran_compare_string_char4ld.param.u32 %r59,[%value_in]; } .loc 1 54 7 setp.eq.u32 %r60,%r59,0; @ %r60 bra $L1; .loc 1 56 8 add.u64 %r39,%r39,%r33; .loc 1 52 33 add.u64 %r40,%r40,-1; .loc 1 52 7 setp.ne.u64 %r61,%r40,-1; @ %r61 bra $L5; bra $L6; $L7: .loc 1 62 14 mov.u64 %r38,1; $L4: .loc 1 64call (%value_in),_gfortran_compare_string_char4ld.param.u32 %r66,[%value_in]; } .loc 1 64 7 setp.eq.u32 %r67,%r66,0; @ %r67 bra $L8; .loc 1 66 8 add.u64 %r37,%r37,%r35; .loc 1 62 33 add.u64 %r38,%r38,1; .loc 1 62 7 setp.ge.s64 %r68,%r40,%r38; @ %r68 bra $L4; bra $L6; $L8: mov.u64 %r40,%r38; $L1: .loc 1 70gfortran_mfindloc2_s4gfortran_mfindloc2_spredpredu32 %r114; .reg .pred %r115; mov.u64 %r61,%ar0; mov.u64 %r62,%ar1; mov.u64 %r63,%ar2; mov.u32 %r64,%ar3; mov.u64 %r65,%ar4; mov.u64 %r66,%ar5; .loc 1 92 12 ld.u64 %r68,[%r61+56]; add.u64 %r67,%r68,1; .loc 1 92 10 ld.u64 %r69,[%r61+48]; sub.u64 %r60,%r67,%r69; .loc 1 93 6 setp.gt.s64 %r70,%r60,0; @ %r70 bra $L15; $L23: .loc 1 94 12 mov.u64 %r60,0; bra $L14; $L15: .loc 1 96 15 ld.u64 %r25,[%r63+16]; .loc 1 97 9 ld.u64 %r54,[%r63]; .loc 1 99 22 cvt.u32.u64 %r26,%r25; .loc 1 99 53 add.u32 %r71,%r26,-4; and.b32 %r72,%r71,-5;.loc 1 99 22 add.u32 %r76,%r26,-1; set.u32.le.u32 %r78,%r76,1; neg.s32 %r79,%r78; .loc 1 99 6 cvt.u16.u32 %r81,%r75; cvt.u16.u32 %r82,%r79; or.b16 %r80,%r81,%r82;@ %r85 bra $L17; .loc 1 101 7 setp.ne.u32 %r87,%r26,16; @ %r87 bra $L18; $L17: .loc 1 108 44 ld.u64 %r88,[%r61+40]; mul.lo.u64 %r31,%r65,%r88; .loc 1 109 13 ld.u64 %r89,[%r63+40]; mul.lo.u64 %r34,%r25,%r89; .loc 1 119 8 shl.b64 %r59,%r31,2; .loc 1 113 18 ld.u64 %r53,[%r61]; .loc 1 111 6 setp.ne.u32 %r90,%r64,0; @ %r90 bra $L19; .loc 1 126 14 mov.u64 %r55,1; bra $L20; $L18: .loc 1 106 5 
cvta.const.u64 %r92,$LC0; mov.u64 %r91,0; {_gfortrani_internal_error19: .loc 1 113 40 add.u64 %r35,%r60,-1; .loc 1 113 45 mul.lo.u64 %r93,%r31,%r35; .loc 1 113 30 shl.b64 %r94,%r93,2; .loc 1 113 11 add.u64 %r56,%r53,%r94; .loc 1 114 13 mad.lo.u64 %r57,%r34,%r35,%r54; .loc 1 119 8 neg.s64 %r43,%r59; $L22: .loc 1 117 7 ld.s8 %r97,[%r57]; cvt.u16.u32 %r96,%r97; setp.ne.u16 %r98,%r96,0; @ %r98 bra $L21; $L24: .loc 1 119 8 add.u64 %r56,%r56,%r43; .loc 1 120 10 sub.u64 %r57,%r57,%r34; .loc 1 115 33 add.u64 %r60,%r60,-1; .loc 1 115 7 setp.ne.u64 %r99,%r60,-1; @ %r99 bra $L22; bra $L23; $L21: .loc 1 117 1call (%value_in),_gfortran_compare_string_char4ld.param.u32 %r104,[%value_in]; } .loc 1 117 15 setp.eq.u32 %r105,%r104,0; @ ! %r105 bra $L24; bra $L14; $L20: .loc 1 128 7 ld.s8 %r107,[%r54]; cvt.u16.u32 %r106,%r107; setp.ne.u16 %r108,%r106,0; @ %r108 bra $L25; $L26: .loc 1 130 8 add.u64 %r53,%r53,%r59; .loc 1 131 10 add.u64 %r54,%r54,%r34; .loc 1 126 33 add.u64 %r55,%r55,1; .loc 1 126 7 setp.ge.s64 %r109,%r60,%r55; @ %r109 bra $L20; bra $L23; $L25: .loc 1 128 1call (%value_in),_gfortran_compare_string_char4ld.param.u32 %r114,[%value_in]; } .loc 1 128 15 setp.ne.u32 %r115,%r114,0; @ %r115 bra $L26; mov.u64 %r60,%r55; $L14gfortran_sfindloc2_s4gfortran_sfindloc2_smov.u64 %r28,%ar4; mov.u64 %r29,%ar5; .loc 1 148 6 setp.eq.u64 %r30,%r26,0; @ %r30 bra $L33; .loc 1 148 20 ld.u32 %r31,[%r26];$L33: .loc 1 150 call (%value_in),_gfortran_findloc2_38,[%value_in]; } mov.u64 %r23,%r38; bra $L32; $L35: .loc 1 152 10 mov.u64 %r23,0; $L3577 gfortran_cfi_desc_to_gfc_desc .visible .func _gfortran_cfi_desc_to_gfc_descfortran/runtime/ISO_Fortran_binding.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_gfc_desc_to_cfi_desc .visible .func _gfortran_gfc_desc_to_cfi_desc; // BEGIN GLOBAL FUNCTION DECL: CFICFI_addres); // BEGIN GLOBAL FUNCTION DECL: CFI_allocatCFI_allocateCFI_deallocatCFI_deallocateCFI_establishCFI_establishCFI_is_contiguouCFI_is_contiguousCFI_secCFI_sectionCFI_select_pCFI_select_partCFI_setpointerCFI_setpointerimpure_ptr .extern .global .align 8 .u64 _impure_pt_gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options[6]fwritefwrite67,70,73,95,97,100,100,114,101,115,115,58,32,98,97,115,101,32,97,100,100,114,101,115,115,32,111,102,32,67,32,68,101,115,99,114,105,112,116,111,114,32,109,117,115,116,32,110,111,116,32,98,101,32,78,85,76,76,46,32] = 
// .const $LC string tables (byte-array literals): runtime diagnostic messages for the ISO_Fortran_binding entry points — CFI_address (NULL descriptor, subscript out of bounds), CFI_allocate (NULL descriptor, object not pointer/allocatable, base address, missing bounds, allocation failure), CFI_deallocate (NULL descriptor, base address already NULL, object not pointer/allocatable), CFI_establish (NULL descriptor, rank range, base address vs. attribute, missing extents), CFI_is_contiguous (NULL descriptor/base address, non-array), CFI_section (NULL source/result, bounds, strides, rank, element-length and type checks), CFI_select_part (NULL source/result, rank, assumed-size source, displacement, element length), and CFI_setpointer (NULL result, element-length, rank, and type mismatches).
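For orientation, the CFI_* entry points compiled in this object implement the Fortran 2018 C-descriptor interface declared in ISO_Fortran_binding.h. The following is a minimal host-side sketch of that interface, assuming the ISO_Fortran_binding.h header shipped with gfortran; the example program itself is hypothetical and not taken from the package.

/* Sketch: describe an existing C array with a CFI descriptor and index it. */
#include <stdio.h>
#include <ISO_Fortran_binding.h>

int main(void)
{
    double data[4] = {1.0, 2.0, 3.0, 4.0};
    CFI_CDESC_T(1) desc;                 /* storage for a rank-1 descriptor */
    CFI_index_t extents[1] = {4};

    /* Describe the existing array as a nonallocatable rank-1 object. */
    int rc = CFI_establish((CFI_cdesc_t *)&desc, data, CFI_attribute_other,
                           CFI_type_double, sizeof(double), 1, extents);
    if (rc != CFI_SUCCESS)
        return 1;

    /* CFI_address applies the per-dimension strides kept in the descriptor. */
    CFI_index_t sub[1] = {desc.dim[0].lower_bound + 2};
    double *third = CFI_address((CFI_cdesc_t *)&desc, sub);
    printf("third element = %g, contiguous = %d\n",
           *third, CFI_is_contiguous((CFI_cdesc_t *)&desc));
    return 0;
}

CFI_establish fills in the element length, rank, and per-dimension extent/stride fields much as _gfortran_gfc_desc_to_cfi_desc does when converting a gfortran array descriptor for a BIND(C) call.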
{67,70,73,95,115,101,99,116,105,111,110,58,32,85,112,112,101,114,32,98,111,117,110,100,115,32,109,117,115,116,32,98,101,32,119,105,116,104,105,110,32,116,104,101,32,98,111,117,110,100,115,32,111,102,32,116,104,101,32,102,111,114,116,114,97,110,32,97,114,114,97,121,32,40,115,111,117,114,99,101,45,62,100,105,109,91,37,100,93,46,108,111,119,101,114,95,98,111,117,110,100,32,60,61,32,117,112,112,101,114,95,98,111,117,110,100,115,91,37,100,93,32,60,61,32,115,111,117,114,99,101,45,62,100,105,109,91,37,100,93,46,108,111,119,101,114,95,98,111,117,110,100,32,43,32,115,111,117,114,99,101,45,62,100,105,109,91,37,100,93,46,101,120,116,101,110,116,32,45,32,49,44,32,37,100,32,33,60,61,32,37,100,32,33,60,61,32,37,100,41,469] = {67,70,73,95,115,101,99,116,105,111,110,58,32,73,102,32,116,104,101,32,117,112,112,101,114,32,98,111,117,110,100,32,105,115,32,115,109,97,108,108,101,114,32,116,104,97,110,32,116,104,101,32,108,111,119,101,114,32,98,111,117,110,100,32,102,111,114,32,97,32,103,105,118,101,110,32,100,105,109,101,110,115,105,111,110,32,40,117,112,112,101,114,91,37,100,93,32,60,32,108,111,119,101,114,91,37,100,93,44,32,37,100,32,60,32,37,100,41,44,32,116,104,101,110,32,104,101,32,115,116,114,105,100,101,32,102,111,114,32,115,97,105,100,32,100,105,109,101,110,115,105,111,110,116,32,109,117,115,116,32,98,101,32,110,101,103,97,116,105,118,101,32,40,115,116,114,105,100,101,91,37,100,93,32,60,32,48,44,32,37,100,32,60,32,48,41,46,_gfortran_cfi_desc_to_gfc_desc .visible .func _gfortran_cfi_desc_to_gfc_descpredpred101; mov.u64 %r56,%ar0; mov.u64 %r57,%ar1; .loc 1 41 16 ld.u64 %r46,[%r57]; .loc 1 43 6 setp.ne.u64 %r58,%r46,0; @ ! %r58 bra $L1; .loc 1 46 27 ld.u64 %r59,[%r46]; st.u64 [%r56],%r59; .loc 1 47 44 ld.s16 %r23,[%r46+22]; .loc 1 47 29 cvt.u32.u32 %r60,%r23; cvt.s32.s8 %r24,%r60; .loc 1 51 6 cvt.u16.u32 %r61,%r24; setp.ne.u16 %r62,%r61,6; @ %r62 bra $L4; .loc 1 52 29 mov.u32 %r63,5; st.u8 [%r56+29],%r63; bra $L5; $L4: .loc 1 53 11 setp.eq.u16 %r65,%r61,5; @ %r65 bra $L6; .loc 1 47 27 st.u8 [%r56+29],%r60; bra $L5; $L6: .loc 1 54 29 mov.u32 %r67,6; st.u8 [%r56+29],%r67; $L5: .loc 1 56 9 ld.s8 %r27,[%r46+20]; .loc 1 56 49 ld.u64 %r52,[%r46+8]; .loc 1 63 23 ld.u32 %r53,[%r46+16]; .loc 1 66 24 ld.s8 %r69,[%r46+21]; cvt.u16.u32 %r68,%r69; .loc 1 56 6 setp.eq.u32 %r70,%r27,0; @ %r70 bra $L7; .loc 1 56 28 ld.u64 %r49,[%r46+40]; .loc 1 56 16 setp.eq.u64 %r71,%r49,%r52; @ %r71 bra $L8; .loc 1 58 11 ld.s8 %r73,[%r56+29]; cvt.u16.u32 %r72,%r73; setp.eq.u16 %r74,%r72,5; @ %r74 bra $L16; .loc 1 48 61 shr.s32 %r75,%r23,8; .loc 1 59 29 cvt.s64.s32 %r32,%r75; bra $L9; $L15: .loc 1 70 32 rem.u64 %r77,%r49,%r52; .loc 1 70 10 setp.eq.u64 %r78,%r77,0; .loc 1 71 10 selp.u64 %r101,%r52,%r49,%r78; st.u64 [%r56+32],%r101; .loc 1 76 13 mov.u64 %r79,0; st.u64 [%r56+8],%r79; .loc 1 77 3 cvt.u16.u32 %r80,%r27; setp.le.s16 %r81,%r80,0; @ %r81 bra $L1; add.u32 %r82,%r27,-1; cvt.u64.u32 %r83,%r82; add.u64 %r85,%r83,%r83; add.u64 %r86,%r85,%r83; shl.b64 %r87,%r86,3; mov.u64 %r48,%r79; mov.u64 %r45,%r48; $L14: add.u64 %r47,%r46,%r48; .loc 1 79 58 ld.u64 %r36,[%r47+24]; add.u64 %r33,%r56,%r48; .loc 1 79 35 st.u64 [%r33+48],%r36; .loc 1 81 7 ld.u64 %r89,[%r47+32]; add.u64 %r88,%r36,%r89; .loc 1 81 31 add.u64 %r90,%r88,-1; .loc 1 80 35 st.u64 [%r33+56],%r90; .loc 1 82 63 div.u64 %r42,%r49,%r52; .loc 1 82 35 st.u64 [%r33+40],%r42; .loc 1 83 48 mul.lo.u64 %r91,%r36,%r42; .loc 1 83 17 sub.u64 %r45,%r45,%r91; .loc 1 77 3 setp.eq.u64 %r92,%r87,%r48; @ %r92 bra $L13; .loc 1 82 59 ld.u64 %r49,[%r47+64]; add.u64 %r48,%r48,24; bra $L14; $L13: 
st.u64 [%r56+8],%r45; bra $L1; $L16: .loc 1 61 29 mov.u64 %r32,%r52; $L9: st.u64 [%r56+16],%r32; .loc 1 63 20 st.u32 [%r56+24],%r53; .loc 1 64 27 cvt.u32.u32 %r93,%r27; st.u8 [%r56+28],%r93; .loc 1 66 22 st.u16 [%r56+30],%r68; bra $L15; $L7: .loc 1 57 29 st.u64 [%r56+16],%r52; .loc 1 63 20 st.u32 [%r56+24],%r53; .loc 1 64 27 cvt.u32.u32 %r95,%r27; st.u8 [%r56+28],%r95; .loc 1 66 22 st.u16 [%r56+30],%r68; .loc 1 76 13 mov.u64 %r97,0; st.u64 [%r56+8],%r97; bra $L1; $L8: .loc 1 57 29 st.u64 [%r56+16],%r49; .loc 1 63 20 st.u32 [%r56+24],%r53; .loc 1 64 27 cvt.u32.u32 %r98,%r27; st.u8 [%r56+28],%r98; .loc 1 66 22 st.u16 [%r56+30],%r68; bra $L15; $L1: .loc 1 8_gfortran_gfc_desc_to_cfi_desc .visible .func _gfortran_gfc_desc_to_cfi_descpred %r62; .reg .predpredpred %r74; .reg .u16pred %r83; .reg .u16 %r85; .reg .u64 %r88; mov.u64 %r50,%ar0; mov.u64 %r51,%ar1; .loc 1 99 7 ld.u64 %r22,[%r50]; .loc 1 99 6 setp.ne.u64 %r52,%r22,0; @ %r52 bra $L18; .loc 1 100 9 mov.u64 %r53,384mov.u64 %r22,%r54; $L18: .loc 1 105 18 ld.u64 %r23,[%r51]; .loc 1 105 16 st.u64 [%r22],%r23; .loc 1 106 17 ld.u64 %r24,[%r51+16]; .loc 1 106 15 st.u64 [%r22+8],%r24; .loc 1 107 14 ld.u32 %r56,[%r51+24]; st.u32 [%r22+16],%r56; .loc 1 108 25 ld.s8 %r26,[%r51+28]; .loc 1 108 11 cvt.u32.u32 %r57,%r26; st.u8 [%r22+20],%r57; .loc 1 109 18 ld.u8 %r59,[%r51+30]; cvt.s32.s8 %r28,%r59; .loc 1 109 16 st.u8 [%r22+21],%r59; .loc 1 111 7 ld.s8 %r29,[%r51+29]; .loc 1 111 6 cvt.u16.u32 %r61,%r29; setp.eq.u16 %r62,%r61,6; @ %r62 bra $L31; .loc 1 113 11 setp.eq.u16 %r64,%r61,5; @ ! %r64 bra $L35; bra $L20; $L31: mov.u32 %r44,5; $L19: .loc 1 120 30 cvt.u32.u64 %r67,%r24; shl.b32 %r66,%r67,8; .loc 1 120 3 cvt.u16.u32 %r69,%r66; cvt.u16.u32 %r70,%r44; add.u16 %r68,%r69,%r70; .loc 1 119 13 st.u16 [%r22+22],%r68; $L30: .loc 1 122 6 setp.ne.u64 %r71,%r23,0; @ %r71 bra $L22; $L25: .loc 1 142 6 ld.u64 %r72,[%r50]; setp.eq.u64 %r73,%r72,0; @ ! 
%r73 bra $L17; bra $L23; $L22: .loc 1 124 5 setp.le.s32 %r74,%r26,0; @ %r74 bra $L25; .loc 1 139 62 ld.u64 %r38,[%r51+32]; .loc 1 132 35 add.u32 %r42,%r26,-1; mov.u64 %r43,0; .loc 1 124 12 cvt.u32.u64 %r46,%r43; .loc 1 126 5 cvt.u16.u32 %r75,%r28; setp.eq.u16 %r76,%r75,2; .loc 1 135 21 mov.u64 %r88,-1; $L29: add.u64 %r40,%r51,%r43; .loc 1 127 41 ld.u64 %r48,[%r40+48]; selp.u64 %r49,0,%r48,%r76; add.u64 %r30,%r22,%r43; st.u64 [%r30+24],%r49; .loc 1 134 9 ld.u64 %r47,[%r40+56]; .loc 1 132 5 setp.ne.u32 %r77,%r42,%r46; @ %r77 bra $L27; .loc 1 133 6 setp.ne.u64 %r78,%r48,1; @ %r78 bra $L27; .loc 1 134 6 setp.eq.u64 %r79,%r47,0; @ %r79 bra $L33; $L27: .loc 1 138 9 sub.u64 %r80,%r47,%r48; .loc 1 138 52 add.u64 %r36,%r80,1; bra $L28; $L33: .loc 1 135 21 mov.u64 %r36,%r88; $L28: st.u64 [%r30+32],%r36; .loc 1 139 17 ld.u64 %r82,[%r40+40]; mul.lo.u64 %r81,%r82,%r38; .loc 1 139 15 st.u64 [%r30+40],%r81; .loc 1 124 47 add.u32 %r46,%r46,1; .loc 1 124 5 add.u64 %r43,%r43,24; setp.ne.u32 %r83,%r26,%r46; @ %r83 bra $L29; bra $L25; $L23: .loc 1 143 12 st.u64 [%r50],%r22; .loc 1 144 1 bra $L17; $L35: .loc 1 119 29 cvt.u32.u16 %r44,%r61; bra $L19; $L20: mov.u16 %r85,6; st.u16 [%r22+22],%r85; bra $L30; $L17CFICFI_addressu32pred %r107; mov.u64 %r56,%ar0; mov.u64 %r57,%ar1; .loc 1 149 9 ld.u64 %r55,[%r56]; .loc 1 151 7 cvta.global.u64 %r58,_gfortrani_compile_options; ld.u32 %r22,[%r58+36]; .loc 1 161 10 set.u32.eq.u64 %r60,%r55,0; neg.s32 %r61,%r60; .loc 1 151 6 set.u32.ne.u32 %r63,%r22,0; neg.s32 %r64,%r63; .loc 1 161 10 cvt.u16.u32 %r66,%r61; cvt.u16.u32 %r67,%r64; and.b16 %r65,%r66,%r67;163 13 ld.global.u64 %r75,[_impure_ptr]; .loc 1 163 4 ld.u64 %r76,[%r75+24]; mov.u64 %r73,60; mov.u64 %r72,1; cvta.const.u64 %r71,$LC165 11 mov.u64 %r55,0; bra $L36; $L37: .loc 1 170 9 ld.s8 %r52,[%r56+20]; .loc 1 178 7 cvt.u16.u32 %r78,%r52; setp.le.s16 %r79,%r78,0; @ %r79 bra $L36; mov.u64 %r53,%r57; add.u64 %r49,%r56,24; .loc 1 178 14 mov.u32 %r51,0; setp.eq.u32 %r107,%r22,0; $L42: .loc 1 180 32 ld.u64 %r30,[%r53]; .loc 1 180 48 ld.u64 %r31,[%r49]; .loc 1 180 16 sub.u64 %r50,%r30,%r31; .loc 1 181 7 @ %r107 bra $L39; .loc 1 182 23 ld.u64 %r34,[%r49+8]; .loc 1 182 8 set.u32.ne.u64 %r82,%r34,-1; neg.s32 %r83,%r82; .loc 1 182 37 set.u32.le.s64 %r85,%r34,%r50; neg.s32 %r86,%r85; cvt.u16.u32 %r88,%r83; cvt.u16.u32 %r89,%r86; and.b16 %r87,%r88,%r89; cvt.u32.u16 %r90,%r87; cvt.u16.u8 %r91,%r90; setp.eq.u16 %r92,%r91,0; @ %r92 bra $L40; $L41: .loc 1 185 17 ld.global.u64 %r95,[_impure_ptr]; .loc 1 185 8 ld.u64 %r96,[%r95+24]; st.u32 [%stack+20],%r34; .loc 1 189 34 cvt.u32.u64 %r98,%r34; cvt.u32.u64 %r99,%r31; sub.u32 %r97,%r98,%r99; .loc 1 185 8 st.u32 [%stack+16],%r97; st.u32 [%stack+12],%r31; st.u32 [%stack+8],%r30; st.u32 [%stack+4],%r51; st.u32 [%stack],%r51; cvta.const.u64 %r94,$LC96100,[%value_in]; } .loc 1 191 22 mov.u64 %r55,0; bra $L36; $L40: .loc 1 183 5 setp.lt.s64 %r102,%r50,0; @ %r102 bra $L41; $L39: .loc 1 194 28 ld.u64 %r104,[%r49+16]; .loc 1 194 14 mad.lo.u64 %r55,%r50,%r104,%r55; .loc 1 178 34 add.u32 %r51,%r51,1; .loc 1 178 7 add.u64 %r53,%r53,8; add.u64 %r49,%r49,24; setp.ne.u32 %r105,%r51,%r52; @ %r105 bra $L42; $L36CFI_allocatCFI_allocatepredu64u16predpred %r143; .reg .u64 %r146; .reg .predmov.u64 %r68,%ar0; mov.u64 %r69,%ar1; mov.u64 %r70,%ar2; mov.u64 %r71,%ar3; .loc 1 206 7 cvta.global.u64 %r72,_gfortrani_compile_options; ld.u32 %r22,[%r72+36]; .loc 1 206 6 setp.eq.u32 %r73,%r22,0; @ %r73 bra $L46; .loc 1 209 10 setp.ne.u64 %r74,%r68,0; @ %r74 bra $L47; .loc 1 211 13 ld.global.u64 %r79,[_impure_ptr]; .loc 1 
211 4 ld.u64 %r80,[%r79+24]; mov.u64 %r77,36; mov.u64 %r76,1; cvta.const.u64 %r75,$LCld.param.u64 %r81,[%value_in]; } .loc 1 212 11 mov.u32 %r67,10; bra $L45; $L47: .loc 1 216 10 ld.s8 %r83,[%r68+21]; cvt.u16.u32 %r82,%r83; setp.ne.u16 %r84,%r82,2; @ %r84 bra $L49; .loc 1 218 13 ld.global.u64 %r89,[_impure_ptr]; .loc 1 218 4 ld.u64 %r90,[%r89+24]; mov.u64 %r87,88; mov.u64 %r86,1; cvta.const.u64 %r85,$LC385220 11 mov.u32 %r67,7; bra $L45; $L49: .loc 1 224 10 ld.u64 %r92,[%r68]; setp.eq.u64 %r93,%r92,0; @ %r93 bra $L46; .loc 1 226 13 ld.global.u64 %r98,[_impure_ptr]; .loc 1 226 4 ld.u64 %r99,[%r98+24]; mov.u64 %r96,57; mov.u64 %r95,1; cvta.const.u64 %r94,$LC428 11 mov.u32 %r67,3; bra $L45; $L46: .loc 1 234 67 ld.u16 %r102,[%r68+22]; and.b16 %r101,%r102,-5; set.u32.eq.u16 %r104,%r101,257; neg.s32 %r105,%r104; .loc 1 234 33 set.u32.eq.u16 %r108,%r102,1029; neg.s32 %r109,%r108; .loc 1 234 67 cvt.u16.u32 %r111,%r105; cvt.u16.u32 %r112,%r109; or.b16 %r110,%r111,%r112; cvt.u32.u16 %r113,%r110; cvt.u16.u8 %r114,%r113; setp.eq.u16 %r115,%r114,0; @ %r115 bra $L50; .loc 1 236 18 st.u64 [%r68+8],%r71; $L50: .loc 1 243 9 ld.s8 %r35,[%r68+20]; .loc 1 243 6 cvt.u16.u32 %r116,%r35; setp.gt.s16 %r117,%r116,0; @ %r117 bra $L51; .loc 1 258 24 ld.u64 %r66,[%r68+8]; .loc 1 239 10 mov.u64 %r59,1; bra $L52; $L51: .loc 1 245 10 setp.ne.u32 %r118,%r22,0; @ %r118 bra $L53; $L55: .loc 1 258 24 ld.u64 %r66,[%r68+8]; mov.u64 %r23,%r69; add.u64 %r62,%r68,8; mov.u64 %r39,%r70; mov.u32 %r60,0; mov.u64 %r59,1; bra $L54; $L53: .loc 1 246 21 set.u32.eq.u64 %r120,%r69,0; neg.s32 %r121,%r120; .loc 1 246 45 set.u32.eq.u64 %r123,%r7246 29 cvt.u16.u32 %r126,%r121; cvt.u16.u32 %r127,%r124; or.b16 %r125,%r126,%r127; .loc 1 246 4 cvt.u32.u16 %r128,%r125; cvt.u16.u8 %r129,%r12855; .loc 1 248 13 ld.global.u64 %r133,[_impure_ptr]; .loc 1 248 4 ld.u64 %r134,[%r133+24]; st.u32 [%stack],%r35; cvta.const.u64 %r132,$LC115,[%value_in]; } .loc 1 250 11 mov.u32 %r67,8; bra $L45; $L54: .loc 1 255 41 ld.u64 %r42,[%r23]; .loc 1 255 27 st.u64 [%r62+16],%r42; .loc 1 256 40 ld.u64 %r138,[%r39]; sub.u64 %r137,%r138,%r42; .loc 1 256 65 add.u64 %r45,%r137,1; .loc 1 256 22 st.u64 [%r62+24],%r45; .loc 1 257 7 setp.ne.u32 %r139,%r60,0; @ %r139 bra $L56; .loc 1 258 20 ld.u64 %r140,[%r68+8]; st.u64 [%r68+40],%r140; bra $L57; $L56: .loc 1 260 35 ld.u64 %r142,[%r62]; mul.lo.u64 %r141,%r142,%r66; .loc 1 260 20 st.u64 [%r62+32],%r141; $L57: .loc 1 261 12 mul.lo.u64 %r59,%r59,%r45; .loc 1 253 38 add.u32 %r60,%r60,1; .loc 1 253 7 add.u64 %r23,%r23,8; add.u64 %r62,%r62,24; add.u64 %r39,%r39,8; setp.ne.u32 %r143,%r60,%r35; @ %r143 bra $L54; $L52: .loc 1 265 19call (%value_in),c146,[%value_in]; } .loc 1 265 17 st.u64 [%r68],%r146; .loc 1 266 6 setp.ne.u64 %r148,%r146,0; @ %r148 bra $L58; .loc 1 268 16 ld.global.u64 %r153,[_impure_ptr]; .loc 1 268 7 ld.u64 %r154,[%r153+24]; mov.u64 %r151,44; mov.u64 %r150,1; cvta.const.u64 %r149,$LC615154; call (%value_in),fwriteld.param.u64 %r155,[%value_in]; } .loc 1 269 14 mov.u32 %r67,11; bra $L45; $L58: .loc 1 272 10 mov.u32 %r67,0; $L45: .loc 1 273CFI_deallocatCFI_deallocateu64 %r42; .reg .u64 %r43; .reg .pred16mov.u64 %r32,%ar0; .loc 1 279 7 cvta.global.u64 %r33,_gfortrani_compile_options; .loc 1 279 6 ld.u32 %r34,[%r33+36]; setp.eq.u32 %r35,%r34,0; @ %r35 bra $L73; .loc 1 282 10 setp.ne.u64 %r36,%r32,0; @ %r36 bra $L74; .loc 1 284 13 ld.global.u64 %r41,[_impure_ptr]; .loc 1 284 4 ld.u64 %r42,[%r41+24]; mov.u64 %r39,38; mov.u64 %r38,1; cvta.const.u64 %r37,$LCld.param.u64 %r43,[%value_in]; } .loc 1 285 11 mov.u32 %r31,10; bra 
$L72; $L74: .loc 1 289 13 ld.u64 %r23,[%r32]; .loc 1 289 10 setp.ne.u64 %r44,%r23,0; @ %r44 bra $L76; .loc 1 291 13 ld.global.u64 %r49,[_impure_ptr]; .loc 1 291 4 ld.u64 %r50,[%r49+24]; mov.u64 %r47,46; mov.u64 %r46,1; cvta.const.u64 %r45,$LC8;ld.param.u64 %r51,[%value_in]; } .loc 1 292 11 mov.u32 %r31,2; bra $L72; $L76: .loc 1 296 10 ld.s8 %r53,[%r32+21]; cvt.u16.u32 %r52,%r53; setp.ne.u16 %r54,%r52,2; @ %r54 bra $L77; .loc 1 298 13 ld.global.u64 %r59,[_impure_ptr]; .loc 1 298 4 ld.u64 %r60,[%r59+24]; mov.u64 %r57,76; mov.u64 %r56,1; cvta.const.u64 %r55,$LC955ld.param.u64 %r61,[%value_in]; } .loc 1 300 11 mov.u32 %r31,7; bra $L72; $L73: .loc 1 289 13 ld.u64 %r23,[%r32]; $L.loc 1 306 17 mov.u64 %r63,0; st.u64 [%r32],%r63; .loc 1 308 10 cvt.u32.u64 %r31,%r63; $L72: .loc 1 309CFI_establishCFI_establish32u16 %r107; .reg .predpredpred %r153; .reg .u32 %r155; .reg .u32316 7 cvta.global.u64 %r76,_gfortrani_compile_options; ld.u32 %r22,[%r76+36]; .loc 1 316 6 setp.eq.u32 %r77,%r22,0; @ ! %r77 bra $L103; set.u32.ne.u64 %r195,%r70,0; bra $L79; $L103: .loc 1 319 10 setp.ne.u64 %r78,%r69,0; @ %r78 bra $L80; .loc 1 321 13 ld.global.u64 %r83,[_impure_ptr]; .loc 1 321 4 ld.u64 %r84,[%r83+24]; mov.u64 %r81,37; mov.u64 %r80,1; cvta.const.u64 %r79,$LC1ld.param.u64 %r85,[%value_in]; } .loc 1 322 11 mov.u32 %r68,10; bra $L78; $L80: .loc 1 326 10 cvt.u32.u32 %r87,%r74; cvt.u16.u8 %r86,%r87; setp.le.u16 %r88,%r86,15; @ %r88 bra $L82; .loc 1 328 13 ld.global.u64 %r91,[_impure_ptr]; .loc 1 328 4 ld.u64 %r92,[%r91+24]; st.u32 [%stack+4],%r74; mov.u32 %r93,15; st.u32 [%stack],%r93; cvta.const.u64 %r90,$LC194,[%value_in]; } .loc 1 330 11 mov.u32 %r68,5; bra $L78; $L82: .loc 1 335 21 cvt.u16.u32 %r97,%r71; set.u32.eq.u16 %r98,%r97,1; neg.s32 %r99,%r98; .loc 1 335 63 set.u32.ne.u64 %r195,%r70,0; neg.s32 %r102,%r195; .loc 1 335 50 cvt.u16.u32 %r104,%r99; cvt.u16.u32 %r105,%r102; and.b16 %r103,%r104,%r105; .loc 1 335 10 cvt.u32.u16 %r106,%r103; cvt.u16.u8 %r107,%r106; setp.eq.u16 %r108,%r107,0; @ %r108 bra $L79; .loc 1 337 13 ld.global.u64 %r111,[_impure_ptr]; .loc 1 337 4 ld.u64 %r112,[%r111+24]; mov.u32 %r113,1; st.u32 [%stack],%r113; cvta.const.u64 %r110,$LC1211110114,[%value_in]; } .loc 1 341 11 mov.u32 %r68,7; bra $L78; $L79: .loc 1 335 63 neg.s32 %r118,%r195; cvt.u32.u32 %r116,%r118; cvt.u32.u8 %r37,%r116; .loc 1 345 17 st.u64 [%r69],%r70; .loc 1 347 6 cvt.u16.u32 %r119,%r72; setp.eq.u16 %r120,%r119,1029; @ %r120 bra $L83; .loc 1 348 12 and.b16 %r121,%r119,-5; set.u32.eq.u16 %r124,%r121,257; neg.s32 %r125,%r124; .loc 1 348 44 set.u32.eq.u16 %r128,%r119,6; neg.s32 %r129,%r128; .loc 1 348 63 cvt.u16.u32 %r131,%r125; cvt.u16.u32 %r132,%r129; or.b16 %r130,%r131,%r132; set.u32.eq.u16 %r135,%r119,-1; neg.s32 %r136,%r135; cvt.u16.u32 %r140,%r136; or.b16 %r138,%r130,%r140; cvt.u32.u16 %r141,%r138; cvt.u16.u8 %r142,%r141; setp.eq.u16 %r143,%r142,0; @ %r143 bra $L84; $L83: .loc 1 350 18 st.u64 [%r69+8],%r73; bra $L85; $L84: and.b16 %r144,%r119,255; .loc 1 354 31 cvt.s64.s16 %r146,%r119; .loc 1 354 14 cvt.s64.s16 %r148,%r144; .loc 1 357 37 sub.u64 %r150,%r146,%r148; .loc 1 357 14 shr.u64 %r55,%r150,8; .loc 1 359 10 setp.ne.u64 %r151,%r55,10; .loc 1 361 19 selp.u64 %r55,%r55,64,%r151; .loc 1 364 10 setp.ne.u16 %r153,%r144,4; @ %r153 bra $L87; .loc 1 366 19 add.u64 %r55,%r55,%r55; $L87: .loc 1 368 20 st.u64 [%r69+8],%r55; $L85: .loc 1 371 15 mov.u32 %r155,1; st.u32 [%r69+16],%r155; .loc 1 372 12 cvt.u32.u32 %r156,%r74; st.u8 [%r69+20],%r156; .loc 1 373 17 cvt.u32.u32 %r157,%r71; st.u8 [%r69+21],%r157; .loc 1 374 12 
cvt.u16.u32 %r158,%r72; st.u16 [%r69+22],%r158; .loc 1 378 12 cvt.u16.u32 %r160,%r74; set.u32.gt.s16 %r161,%r160,0; neg.s32 %r162,%r161; .loc 1 378 16 cvt.u16.u32 %r164,%r162; cvt.u16.u32 %r165,%r37; and.b16 %r163,%r164,%r165; .loc 1 378 6%r168 bra $L88; $L93: .loc 1 399 10 mov.u32 %r68,0; bra $L78; $L88: .loc 1 380 10 set.u32.ne.u32 %r170,%r22,0; neg.s32 %r171,%r170; .loc 1 380 51 set.u32.eq.u64 %r173,%r75,0; neg.s32 %r174,%r173; cvt.u1689; .loc 1 382 13 ld.global.u64 %r183,[_impure_ptr]; .loc 1 382 4 ld.u64 %r184,[%r183+24]; st.u32 [%stack],%r74; cvta.const.u64 %r182,$LC131182185,[%value_in]; } .loc 1 385 11 mov.u32 %r68,8; bra $L78; $L89: add.u64 %r36,%r69,24; mov.u64 %r39,%r75; .loc 1 380 51 mov.u32 %r60,0; .loc 1 390 27 mov.u64 %r187,0; $L92: st.u64 [%r36],%r187; .loc 1 391 31 ld.u64 %r46,[%r39]; .loc 1 391 22 st.u64 [%r36+8],%r46; .loc 1 392 7 setp.ne.u32 %r188,%r60,0; @ %r188 bra $L90; .loc 1 393 20 ld.u64 %r189,[%r69+8]; st.u64 [%r69+40],%r189; bra $L91; $L90: .loc 1 395 49 ld.u64 %r191,[%r39+-8]; ld.u64 %r192,[%r69+8]; mul.lo.u64 %r190,%r191,%r192; .loc 1 395 20 st.u64 [%r36+16],%r190; $L91: .loc 1 388 34 add.u32 %r60,%r60,1; .loc 1 388 7 add.u64 %r36,%r36,24; add.u64 %r39,%r39,8; setp.lt.s32 %r193,%r60,%r74; @ %r193 bra $L92; bra $L93; $L78:GLOBAL FUNCTION DEF: CFI_is_contiguouCFI_is_contiguouspredmov.u64 %r46,%ar0; .loc 1 405 7 cvta.global.u64 %r47,_gfortrani_compile_options; .loc 1 405 6 ld.u32 %r48,[%r47+36];105; .loc 1 408 10 setp.ne.u64 %r50,%r46,0; @ %r50 bra $L106; .loc 1 410 13 ld.global.u64 %r55,[_impure_ptr]; .loc 1 410 4 ld.u64 %r56,[%r55+24]; mov.u64 %r53,41; mov.u64 %r52,1; cvta.const.u64 %r51,$LC1451ld.param.u64 %r57,[%value_in]; } .loc 1 411 11 cvt.u32.u64 %r45,%r46; bra $L104; $L106: .loc 1 415 10 ld.u64 %r58,[%r46]; setp.ne.u64 %r59,%r58,0; @ %r59 bra $L108; .loc 1 417 13 ld.global.u64 %r64,[_impure_ptr]; .loc 1 417 4 ld.u64 %r65,[%r64+24]; mov.u64 %r62,65; mov.u64 %r61,1; cvta.const.u64 %r60,$LC15ld.param.u64 %r66,[%value_in]; } .loc 1 419 11 cvt.u32.u64 %r45,%r58; bra $L104; $L108: .loc 1 423 13 ld.s8 %r23,[%r46+20]; .loc 1 423 10 setp.ne.u32 %r67,%r23,0; @ %r67 bra $L109; .loc 1 425 13 ld.global.u64 %r70,[_impure_ptr]; .loc 1 425 4 ld.u64 %r71,[%r70+24]; st.u32 [%stack],%r23; cvta.const.u64 %r69,$LC16,[%value_in]; } .loc 1 427 11 mov.u32 %r45,%r23; bra $L104; $L105: .loc 1 423 13 ld.s8 %r23,[%r46+20]; $L109: .loc 1 432 6 cvt.u16.u32 %r75,%r23; setp.le.s16 %r76,%r75,0; @ %r76 bra $L114; .loc 1 432 40 add.u16 %r77,%r75,-1; cvt.u32.u16 %r80,%r77; .loc 1 432 44 cvt.s64.s8 %r81,%r80; add.u64 %r83,%r81,%r81; add.u64 %r84,%r83,%r81; shl.b64 %r85,%r84,3; add.u64 %r86,%r46,%r85; .loc 1 432 20 ld.u64 %r88,[%r86+32]; setp.eq.u64 %r89,%r88,-1; @ ! %r89 bra $L118; bra $L115; $L113: .loc 1 439 10 setp.ne.u32 %r90,%r45,0; @ %r90 bra $L111; .loc 1 439 18 ld.u64 %r91,[%r46+40]; ld.u64 %r92,[%r46+8]; setp.eq.u64 %r93,%r91,%r92; @ ! 
%r93 bra $L104; bra $L112; $L111: .loc 1 442 29 ld.u64 %r95,[%r32+8]; ld.u64 %r96,[%r32]; mul.lo.u64 %r94,%r95,%r96; .loc 1 442 9 ld.u64 %r97,[%r32+32]; setp.ne.u64 %r98,%r97,%r94; @ %r98 bra $L116; $L112: .loc 1 437 34 add.u32 %r45,%r45,1; .loc 1 437 3 add.u64 %r32,%r32,24; setp.gt.s32 %r99,%r23,%r45; @ %r99 bra $L113; .loc 1 433 12 mov.u32 %r45,1; bra $L104; $L114: mov.u32 %r45,1; bra $L104; $L115: mov.u32 %r45,1; bra $L104; $L116: .loc 1 446 14 mov.u32 %r45,0; bra $L104; $L118: add.u64 %r32,%r46,8; .loc 1 437 25 mov.u32 %r45,0; bra $L113; $L104CFI_secCFI_sectionlocal .align 16 .b8 %frame_ar[368pred %r210; .reg .u64predpredpredu64 %r285; .reg .u64u64 %r305; .reg .u64 %r306; .reg .u64pred %r312; .reg .u64u64 %r317; .reg .u64 %r318; .reg .u64 %r319; .reg .u64u64 %r374; .reg .u64 %r375; .reg .u32 %r376; .reg .pred %r378; .reg .pred %r379; .reg .u64 %r381; .reg .u32 %r382; .reg .predu64predu32.reg .pred %r440; .reg .pred.reg .u64 %r482; .reg .pred %r483; .reg .u64 %r488; .reg .pred %r491; .reg .u64 %r494; .reg .u32 %r495; .reg .pred %r496; .reg .pred %r497; .reg .pred %r498; .reg .u64 %r499; .reg .pred %r500; .reg .u64 %r501; .reg .u32 %r502; mov.u64 %r203,%ar0; mov.u64 %r204,%ar1; mov.u64 %r205,%ar2; mov.u64 %r206,%ar3; mov.u64 %r207,%ar4; .loc 1 465 7 cvta.global.u64 %r208,_gfortrani_compile_options; ld.u32 %r22,[%r208+36]; .loc 1 465 6 setp.eq.u32 %r500,%r22,0; @ %r500 bra $L120; .loc 1 468 10 setp.ne.u64 %r210,%r204,0; @ %r210 bra $L121; .loc 1 470 13 ld.global.u64 %r215,[_impure_ptr]; .loc 1 470 4 ld.u64 %r216,[%r215+24]; mov.u64 %r213,39; mov.u64 %r212,1; cvta.const.u64 %r211,$LC121316; call (%value_in),fwriteld.param.u64 %r217,[%value_in]; } .loc 1 471 11 mov.u32 %r202,10; bra $L119; $L121: .loc 1 474 10 setp.ne.u64 %r218,%r203,0; @ %r218 bra $L123; .loc 1 476 13 ld.global.u64 %r223,[_impure_ptr]; .loc 1 476 4 ld.u64 %r224,[%r223+24]; mov.u64 %r221,38; mov.u64 %r220,1; cvta.const.u64 %r219,$LC1822224; call (%value_in),fwriteld.param.u64 %r225,[%value_in]; } .loc 1 477 11 mov.u32 %r202,10; bra $L119; $L123: .loc 1 481 10 ld.u64 %r226,[%r204]; setp.ne.u64 %r227,%r226,0; @ %r227 bra $L124; .loc 1 483 13 ld.global.u64 %r232,[_impure_ptr]; .loc 1 483 4 ld.u64 %r233,[%r232+24]; mov.u64 %r230,54; mov.u64 %r229,1; cvta.const.u64 %r228,$LC1923033; call (%value_in),fwriteld.param.u64 %r234,[%value_in]; } .loc 1 485 11 mov.u32 %r202,2; bra $L119; $L124: .loc 1 489 10 ld.s8 %r236,[%r203+21]; cvt.u16.u32 %r235,%r236; setp.ne.u16 %r237,%r235,1; @ %r237 bra $L125; .loc 1 491 13 ld.global.u64 %r242,[_impure_ptr]; .loc 1 491 4 ld.u64 %r243,[%r242+24]; mov.u64 %r240,60; mov.u64 %r239,1; cvta.const.u64 %r238,$LC223r224043; call (%value_in),fwriteld.param.u64 %r244,[%value_in]; } .loc 1 493 11 mov.u32 %r202,7; bra $L119; $L125: .loc 1 498 17 ld.s8 %r35,[%r204+20]; .loc 1 498 10 cvt.u16.u32 %r245,%r35; setp.gt.s16 %r246,%r245,0; @ %r246 bra $L126; .loc 1 500 13 ld.global.u64 %r249,[_impure_ptr]; .loc 1 500 4 ld.u64 %r250,[%r249+24]; st.u32 [%stack],%r35; cvta.const.u64 %r248,$LC22248251,[%value_in]; } .loc 1 502 11 mov.u32 %r202,5; bra $L119; $L126: .loc 1 506 17 ld.u64 %r40,[%r203+8]; .loc 1 506 37 ld.u64 %r41,[%r204+8]; .loc 1 506 10 setp.eq.u64 %r253,%r40,%r41; @ %r253 bra $L127; .loc 1 508 13 ld.global.u64 %r256,[_impure_ptr]; .loc 1 508 4 ld.u64 %r257,[%r256+24]; st.u32 [%stack+4],%r40; st.u32 [%stack],%r41; cvta.const.u64 %r255,$LC512 11 mov.u32 %r202,4; bra $L119; $L127: .loc 1 516 17 ld.s16 %r47,[%r203+22]; .loc 1 516 33 ld.s16 %r48,[%r204+22]; .loc 1 516 10 cvt.u16.u32 %r260,%r47; cvt.u16.u32 
%r261,%r48; setp.eq.u16 %r262,%r260,%r261; @ %r262 bra $L128; .loc 1 518 13 ld.global.u64 %r265,[_impure_ptr]; .loc 1 518 4 ld.u64 %r266,[%r265+24]; st.u32 [%stack+4],%r47; st.u32 [%stack],%r48; cvta.const.u64 %r264,$LCmov.u32 %r202,6; bra $L119; $L166: mov.u64 %r200,%r207; .loc 1 457 1 mov.u32 %r158,0; mov.u32 %r137,%r158; $L130: .loc 1 529 10 ld.u64 %r269,[%r200]; setp.ne.u64 %r270,%r269,0; @ %r270 bra $L129; .loc 1 530 12 add.u32 %r137,%r137,1; $L129: .loc 1 527 38 add.u32 %r158,%r158,1; .loc 1 527 3 add.u64 %r200,%r200,8; setp.gt.s32 %r271,%r140,%r158; @ %r271 bra $L130; .loc 1 535 6 @ %r500 bra $L131; .loc 1 536 16 ld.s8 %r58,[%r203+20]; .loc 1 536 39 sub.u32 %r273,%r140,%r137; .loc 1 536 7 setp.eq.u32 %r274,%r58,%r273; @ %r274 bra $L131; .loc 1 538 16 ld.global.u64 %r277,[_impure_ptr]; .loc 1 538 7 ld.u64 %r278,[%r277+24]; st.u32 [%stack+8],%r137; st.u32 [%stack+4],%r140; st.u32 [%stack],%r58; cvta.const.u64 %r276,$LC27r276279,[%value_in]; } .loc 1 542 14 mov.u32 %r202,5; bra $L119; $L201: .loc 1 546 6 setp.eq.u64 %r497,%r205,0; @ %r497 bra $L134; $L169: .loc 1 553 7 @ %r496 bra $L133; bra $L134; $L168: add.u64 %r106,%r204,24; add.u64 %r501,%frame,240; mov.u64 %r123,%r501; add.u32 %r495,%r140,-1; cvt.u64.u32 %r285,%r495; shl.b64 %r286,%r285,3; add.u64 %r287,%frame,248; add.u64 %r199,%r286,%r287; $L135: .loc 1 549 11 ld.u64 %r288,[%r106]; st.u64 [%r123],%r288; .loc 1 548 7 add.u64 %r106,%r106,24; add.u64 %r123,%r123,8; setp.ne.u64 %r289,%r123,%r199; @ %r289 bra $L135; bra $L136; $L133: .loc 1 554 11 add.u64 %r501,%frame,240; cvt.s64.s32 %r291,%r140; shl.b64 %r292,%r291bra $L136; $L164: .loc 1 560 10 @ %r500 bra $L137; .loc 1 561 32 add.u32 %r302,%r140,-1; .loc 1 561 36 cvt.s64.s32 %r303,%r302; add.u64 %r305,%r303,%r303; add.u64 %r306,%r305,%r303; shl.b64 %r307,%r306,3; add.u64 %r308,%r204,%r307; .loc 1 561 4 ld.u64 %r310,[%r308+32]; setp.eq.u64 %r311,%r310,-1; @ %r311 bra $L138; $L137: .loc 1 568 7 setp.gt.s32 %r312,%r140,0; @ %r312 bra $L139; add.u64 %r501,%frame,240; bra $L163; $L138: .loc 1 563 13 ld.global.u64 %r317,[_impure_ptr]; .loc 1 563 4 ld.u64 %r318,[%r317+24]; mov.u64 %r315,79; mov.u64 %r314,1; cvta.const.u64 %r313,$LC251318; call (%value_in),fwriteld.param.u64 %r319,[%value_in]; } .loc 1 565 11 mov.u32 %r202,8; bra $L119; $L139: add.u64 %r181,%r204,24; add.u64 %r494,%frame,120; mov.u64 %r163,%r494; add.u64 %r499,%r204,32; mov.u64 %r176,%r499; add.u32 %r495,%r140,-1; cvt.u64.u32 %r321,%r495; shl.b64 %r322,%r321,3; add.u64 %r323,%frame,128; add.u64 %r188,%r322,%r323; $L141: .loc 1 569 40 ld.u64 %r325,[%r181]; ld.u64 %r326,[%r176]; add.u64 %r324,%r325,%r326; .loc 1 569 64 add.u64 %r327,%r324,-1; .loc 1 569 11 st.u64 [%r163],%r327; .loc 1 568 7 add.u64 %r181,%r181,24; add.u64 %r163,%r163,8; add.u64 %r176,%r176,24; setp.ne.u64 %r328,%r163,%r188; @ %r328 bra $L141; bra $L198; $L200: .loc 1 574 11 add.u64 %r494,%frame,120; setp.le.s32 %r331,%r140,0; @ %r331 bra $L143; add.u32 %r495,%r140,-1; cvt.u64.u32 %r333,%r495; add.u64 %r334,%r333,1; shl.b64 %r330,%r334,3; bra $L144; $L143: mov.u64 %r330,8; $L1449206330341,[%value_in]; } .loc 1 578 6 setp.eq.u64 %r343,%r207,0; @ ! 
%r343 bra $L199; add.u32 %r495,%r140,-1; add.u64 %r499,%r204,32; bra $L145; $L199: .loc 1 585 7 setp.gt.s32 %r344,%r140,0; @ %r344 bra $L146; bra $L163; $L145: mov.u64 %r36,%frame; .loc 1 457 1 mov.u32 %r153,0; .loc 1 581 12 mov.u64 %r345,1; $L147: st.u64 [%r36],%r345; .loc 1 580 42 add.u32 %r153,%r153,1; .loc 1 580 7 add.u64 %r36,%r36,8; setp.lt.s32 %r346,%r153,%r140; @ %r346 bra $L147; bra $L148; $L146: .loc 1 589 7 set.u32.ne.u32 %r348,%r22,0; neg.s32 %r349,%r348; cvt.u32.u32 %r347,%r349; mov.u64 %r23,0; .loc 1 585 16 cvt.u32.u64 %r152,%r23; .loc 1 590 8 cvt.u16.u8 %r357,%r347; $L150: .loc 1 587 23 add.u64 %r350,%r207,%r23; ld.u64 %r77,[%r350]; .loc 1 587 14 add.u64 %r351,%frame,%r23; st.u64 [%r351],%r77; .loc 1 590 8 set.u32.eq.u64 %r353,%r77,149; .loc 1 590 34 add.u64 %r362,%r501,%r23; .loc 1 590 46 add.u64 %r364,%r494,%r23; .loc 1 590 26 ld.u64 %r365,[%r362]; ld.u64 %r366,[%r364]; setp.eq.u64 %r367,%r365,%r366; @ %r367 bra $L149; .loc 1 592 17 ld.global.u64 %r370,[_impure_ptr]; .loc 1 592 8 ld.u64 %r371,[%r370+24]; .loc 1 595 58 add.u64 %r372,%r206,%r23; .loc 1 592 8 ld.u64 %r373,[%r372]; st.u32 [%stack+16],%r373; st.u32 [%stack+12],%r152; .loc 1 595 33 add.u64 %r374,%r205,%r23; .loc 1 592 8 ld.u64 %r375,[%r374]; st.u32 [%stack+8],%r375; st.u32 [%stack+4],%r152; st.u32 [%stack],%r152; cvta.const.u64 %r369,$LCmov.u32 %r202,12; bra $L119; $L149: .loc 1 585 42 add.u32 %r152,%r152,1; .loc 1 585 7 add.u64 %r23,%r23,8; setp.ne.u32 %r378,%r152,%r140; @ %r378 bra $L150; add.u32 %r495,%r140,-1; add.u64 %r499,%r204,32; bra $L148; $L159: .loc 1 605 8 setp.ne.u32 %r379,%r157,%r495; @ %r379 bra $L170; ld.u64 %r381,[%r147]; set.u32.eq.u64 %r382,%r381,-1; neg.s32 %r144,%r382; bra $L151; $L170: mov.u32 %r144,%r502; $L151: .loc 1 606 10 @ %r500 bra $L152; .loc 1 607 4 @ %r497 bra $L153; .loc 1 608 13 ld.u64 %r90,[%r149]; .loc 1 608 33 ld.u64 %r91,[%r147+-8]; .loc 1 608 4 setp.ge.s64 %r386,%r90,%r91; @ %r386 bra $L154; .loc 1 605 26 cvt.s64.s32 %r387,%r157; add.u64 %r389,%r387,%r387; add.u64 %r390,%r389,%r387; shl.b64 %r391,%r390,3; add.u64 %r392,%r204,%r391; ld.u64 %r190,[%r392+32]; bra $L155; $L154: .loc 1 608 46 setp.ne.u32 %r394,%r144,0; @ %r394 bra $L153; .loc 1 610 25 ld.u64 %r190,[%r147]; .loc 1 610 9 add.u64 %r395,%r91,%r190; .loc 1 609 23 setp.lt.s64 %r396,%r90,%r395; @ %r396 bra $L153; $L155: .loc 1 612 13 ld.global.u64 %r399,[_impure_ptr]; .loc 1 612 4 ld.u64 %r400,[%r399+24]; .loc 1 618 5 cvt.u32.u64 %r402,%r91; cvt.u32.u64 %r403,%r190; add.u32 %r401,%r402,%r403; .loc 1 618 29 add.u32 %r404,%r401,-1; .loc 1 612 4 st.u32 [%stack+24],%r404; st.u32 [%stack+20],%r90; st.u32 [%stack+16],%r91; st.u32 [%stack+12],%r157; st.u32 [%stack+8],%r157; st.u32 [%stack+4],%r157; st.u32 [%stack],%r157; cvta.const.u64 %r398,$LC619 11 mov.u32 %r202,12; bra $L119; $L153: .loc 1 624 13 add.u64 %r408,%r494,%r142; ld.u64 %r192,[%r408]; .loc 1 623 4 @ %r498 bra $L156; .loc 1 624 33 ld.u64 %r103,[%r147+-8]; .loc 1 624 4 setp.le.s64 %r410,%r103,%r192; @ %r410 bra $L157; .loc 1 605 26 cvt.s64.s32 %r411,%r157; add.u64 %r413,%r411,%r411; add.u64 %r414,%r413,%r411; shl.b64 %r415,%r414,3; add.u64 %r416,%r204,%r415; ld.u64 %r189,[%r416+32]; bra $L158; $L157: .loc 1 625 8 setp.ne.u32 %r418,%r144,0; @ %r418 bra $L156; .loc 1 627 21 ld.u64 %r189,[%r147]; .loc 1 627 5 add.u64 %r419,%r103,%r189; .loc 1 626 5 setp.gt.s64 %r420,%r419,%r192; @ %r420 bra $L156; $L158: .loc 1 629 13 ld.global.u64 %r423,[_impure_ptr]; .loc 1 629 4 ld.u64 %r424,[%r423+24]; .loc 1 635 5 cvt.u32.u64 %r426,%r103; cvt.u32.u64 %r427,%r189; add.u32 
%r425,%r426,%r427; .loc 1 635 29 add.u32 %r428,%r425,-1; .loc 1 629 4 st.u32 [%stack+24],%r428; st.u32 [%stack+20],%r192; st.u32 [%stack+16],%r103; st.u32 [%stack+12],%r157; st.u32 [%stack+8],%r157; st.u32 [%stack+4],%r157; st.u32 [%stack],%r157; cvta.const.u64 %r422,$LC2429,[%value_in]; } .loc 1 636 11 mov.u32 %r202,12; bra $L119; $L156: .loc 1 640 23 ld.u64 %r115,[%r149]; .loc 1 640 4 setp.le.s64 %r431,%r115,%r192; @ %r431 bra $L152; .loc 1 640 36 add.u64 %r432,%frame,%r142; ld.u64 %r116,[%r432]; .loc 1 640 27 setp.lt.s64 %r433,%r116,0; @ %r433 bra $L152; .loc 1 642 20 ld.global.u64 %r436,[_impure_ptr]; .loc 1 642 11 ld.u64 %r437,[%r436+24]; st.u32 [%stack+20],%r116; st.u32 [%stack+16],%r157; st.u32 [%stack+12],%r115; st.u32 [%stack+8],%r192; st.u32 [%stack+4],%r157; st.u32 [%stack],%r157; cvta.const.u64 %r435,$LC48,[%value_in]; } .loc 1 647 11 mov.u32 %r202,9; bra $L119; $L152: .loc 1 602 38 add.u32 %r157,%r157,1; .loc 1 602 3 add.u64 %r149,%r149,8; add.u64 %r147,%r147,24; add.u64 %r142,%r142,8; setp.lt.s32 %r440,%r157,%r140; @ %r440 bra $L159; mov.u64 %r161,%frame; add.u64 %r151,%r204,24; add.u64 %r66,%frame,120; mov.u32 %r155,0; mov.u32 %r143,%r155; $L162: .loc 1 656 17 ld.u64 %r122,[%r161]; .loc 1 660 20 ld.u64 %r191,[%r76]; .loc 1 656 10 setp.ne.u64 %r441,%r122,0; @ %r441 bra $L160; .loc 1 658 7 add.u32 %r143,%r143,1; .loc 1 660 24 ld.u64 %r443,[%r151]; sub.u64 %r442,%r191,%r443; .loc 1 660 13 st.u64 [%r76],%r442; .loc 1 661 4 bra $L161; $L160: .loc 1 663 11 sub.u32 %r154,%r155,%r143; .loc 1 664 36 cvt.s64.s32 %r444,%r154; add.u64 %r446,%r444,%r444; add.u64 %r447,%r446,%r444; shl.b64 %r448,%r447,3; add.u64 %r449,%r203,%r448; st.u64 [%r449+24],%r191; .loc 1 665 47 ld.u64 %r459,[%r66]; sub.u64 %r458,%r459,%r191; .loc 1 665 58 div.s64 %r461,%r458,%r122; .loc 1 665 35 add.u64 %r462,%r461,1; .loc 1 665 31 st.u64 [%r449+32],%r462; .loc 1 666 39 ld.u64 %r471,[%r151+16]; mul.lo.u64 %r470,%r471,%r122; .loc 1 666 27 st.u64 [%r449+40],%r470; .loc 1 668 18 shl.b64 %r473,%r444,3; add.u64 %r474,%frame,%r473; add.u64 %r475,%r474,240; .loc 1 668 31 ld.u64 %r481,[%r475]; ld.u64 %r482,[%r151]; sub.u64 %r480,%r481,%r482; .loc 1 668 18 st.u64 [%r475],%r480; $L161: .loc 1 654 38 add.u32 %r155,%r155,1; .loc 1 654 3 add.u64 %r161,%r161,8; add.u64 %r76,%r76,8; add.u64 %r151,%r151,24; add.u64 %r66,%r66,8; setp.lt.s32 %r483,%r155,%r140; @ %r483 bra $L162; bra $L163; $L148: mov.u64 %r76,%r501; mov.u64 %r147,%r499; .loc 1 561 32 mov.u64 %r149,%r501; mov.u64 %r142,0; cvt.u32.u64 %r157,%r142; .loc 1 605 8 cvt.u32.u64 %r502,%r142; bra $L159; $L163: .loc 1 672 23call (%value_in),CFI_addres488,[%value_in]; } .loc 1 672 21 st.u64 [%r203],%r488; .loc 1 674 10 mov.u32 %r202,0; bra $L119; $L136: .loc 1 558 6 setp.eq.u64 %r498,%r206,0; @ ! 
%r498 bra $L200; bra $L164; $L128: .loc 1 527 29 mov.u32 %r140,%r35; bra $L166; $L120: ld.s8 %r140,[%r204+20]; .loc 1 527 3 setp.gt.s32 %r496,%r140,0; @ %r496 bra $L166; bra $L201; $L198: .loc 1 578 6 setp.eq.u64 %r491,%r207,0; @ %r491 bra $L182; add.u64 %r501,%frame,240; bra $L146; $L182: add.u64 %r501,%frame,240; bra $L145; $L134: .loc 1 558 6 setp.eq.u64 %r498,%r206,0; @ %r498 bra $L164; add.u64 %r501,%frame,240; bra $L163; $L131: .loc 1 546 6 setp.eq.u64 %r497,%r205,0; @ %r497 bra $L168; setp.gt.s32 %r496,%r140,0; bra $L169; $L119: .loc 1 675 1 mov.u32 %value,%r202CFI_select_pCFI_select_partreg .u32 %r34; .reg .u32 %r40; .reg .u64 %r45; .reg .u64 %r53; .reg .u6432predpredu32u32pred %r188; .reg .u64 %r189; .reg .u16 %r192; .reg .u16u32 %r214; .reg .u32 %r215; .reg .u16 %r216; .reg .u16 %r217; .reg .u16 %r218; .reg .u32 %r219; .reg .u16u64.loc 1 681 7 cvta.global.u64 %r92,_gfortrani_compile_options; .loc 1 681 6 ld.u32 %r93,[%r92+36]; setp.eq.u32 %r94,%r93,0; @ %r94 bra $L203; .loc 1 684 10 setp.ne.u64 %r95,%r89,0; @ %r95 bra $L204; .loc 1 686 13 ld.global.u64 %r100,[_impure_ptr]; .loc 1 686 4 ld.u64 %r101,[%r100+24]; mov.u64 %r98,42; mov.u64 %r97,1; cvta.const.u64 %r96,$LC3101; call (%value_in),fwriteld.param.u64 %r102,[%value_in]; } .loc 1 687 11 mov.u32 %r87,10; bra $L202; $L204: .loc 1 690 10 setp.ne.u64 %r103,%r88,0; @ %r103 bra $L206; .loc 1 692 13 ld.global.u64 %r108,[_impure_ptr]; .loc 1 692 4 ld.u64 %r109,[%r108+24]; mov.u64 %r106,42; mov.u64 %r105,1; cvta.const.u64 %r104,$LC31call (%value_in),fwrite693 11 mov.u32 %r87,10; bra $L202; $L206: .loc 1 698 10 ld.s8 %r112,[%r88+21]; cvt.u16.u32 %r111,%r112; setp.ne.u16 %r113,%r111,1; @ %r113 bra $L207; .loc 1 700 13 ld.global.u64 %r116,[_impure_ptr]; .loc 1 700 4 ld.u64 %r117,[%r116+24]; mov.u32 %r118,1; st.u32 [%stack],%r118; cvta.const.u64 %r115,$LC32703 11 mov.u32 %r87,7; bra $L202; $L207: .loc 1 707 10 ld.u64 %r223,[%r89]; setp.ne.u64 %r122,%r223,0; @ %r122 bra $L208; .loc 1 709 13 ld.global.u64 %r127,[_impure_ptr]; .loc 1 709 4 ld.u64 %r128,[%r127+24]; mov.u64 %r125,58; mov.u64 %r124,1; cvta.const.u64 %r123,$LC331128; call (%value_in),fwriteld.param.u64 %r129,[%value_in]; } .loc 1 711 11 mov.u32 %r87,2; bra $L202; $L208: .loc 1 715 17 ld.s8 %r34,[%r89+20]; .loc 1 715 33 ld.s8 %r222,[%r88+20]; .loc 1 715 10 cvt.u16.u32 %r130,%r34; cvt.u16.u32 %r131,%r222; setp.eq.u16 %r132,%r130,%r131; @ %r132 bra $L209; .loc 1 717 13 ld.global.u64 %r135,[_impure_ptr]; .loc 1 717 4 ld.u64 %r136,[%r135+24]; st.u32 [%stack+4],%r222; st.u32 [%stack],%r34; cvta.const.u64 %r134,$LC34mov.u32 %r87,5; bra $L202; $L209: .loc 1 724 10 setp.le.s16 %r140,%r130,0; @ %r140 bra $L210; .loc 1 724 56 add.u16 %r141,%r130,-1; cvt.u32.u16 %r143,%r141; cvt.s32.s8 %r40,%r143; .loc 1 724 60 cvt.s64.s8 %r144,%r143; add.u64 %r146,%r144,%r144; add.u64 %r147,%r146,%r144; shl.b64 %r148,%r147,3; add.u64 %r149,%r89,%r148; .loc 1 724 28 ld.u64 %r151,[%r149+32]; setp.ne.u64 %r152,%r151,-1; @ %r152 bra $L210; .loc 1 726 13 ld.global.u64 %r155,[_impure_ptr]; .loc 1 726 4 ld.u64 %r156,[%r155+24]; st.u32 [%stack],%r40; cvta.const.u64 %r154,$LC31154157,[%value_in]; } .loc 1 729 11 mov.u32 %r87,10; bra $L202; $L217: .loc 1 742 32 ld.u64 %r45,[%r89+8]; .loc 1 742 43 add.u64 %r159,%r45,-1; .loc 1 742 10 setp.ge.u64 %r160,%r159,%r90; @ %r160 bra $L211; .loc 1 744 13 ld.global.u64 %r163,[_impure_ptr]; .loc 1 744 4 ld.u64 %r164,[%r163+24]; .loc 1 747 29 cvt.u32.u64 %r166,%r45; add.u32 %r165,%r166,-1; .loc 1 744 4 st.u32 [%stack+4],%r165; st.u32 [%stack],%r90; cvta.const.u64 
%r162,$LC36164748 11 mov.u32 %r87,12; bra $L202; $L211: .loc 1 753 32 ld.u64 %r53,[%r88+8]; .loc 1 753 24 add.u64 %r169,%r53,%r90; .loc 1 753 10 setp.ge.u64 %r170,%r45,%r169; @ %r170 bra $L212; .loc 1 755 13 ld.global.u64 %r173,[_impure_ptr]; .loc 1 755 4 ld.u64 %r174,[%r173+24]; st.u32 [%stack+12],%r45; .loc 1 760 25 cvt.u32.u64 %r176,%r90; cvt.u32.u64 %r177,%r53; add.u32 %r175,%r176,%r177; .loc 1 755 4 st.u32 [%stack+8],%r175; st.u32 [%stack+4],%r53; st.u32 [%stack],%r90; cvta.const.u64 %r172,$LC3717172178,[%value_in]; } .loc 1 762 11 mov.u32 %r87,12; bra $L202; $L212: .loc 1 766 6 cvt.u16.u32 %r180,%r222; setp.le.s16 %r181,%r180,0; @ %r181 bra $L213; add.u32 %r182,%r222,-1; cvt.u64.u32 %r183,%r182; add.u64 %r185,%r183,%r183; add.u64 %r186,%r185,%r183; shl.b64 %r187,%r186,3; add.u64 %r86,%r187,48; mov.u64 %r76,24; $L214: add.u64 %r73,%r89,%r76; .loc 1 770 47 ld.u64 %r65,[%r73]; add.u64 %r75,%r88,%r76; .loc 1 770 31 st.u64 [%r75],%r65; .loc 1 771 42 ld.u64 %r66,[%r73+8]; .loc 1 771 26 st.u64 [%r75+8],%r66; .loc 1 772 38 ld.u64 %r67,[%r73+16]; .loc 1 772 22 st.u64 [%r75+16],%r67; .loc 1 768 7 add.u64 %r76,%r76,24; setp.ne.u64 %r188,%r76,%r86; @ %r188 bra $L214; $L213: .loc 1 776 50 add.u64 %r189,%r223,%r90; .loc 1 776 21 st.u64 [%r88],%r189; .loc 1 777 10 mov.u32 %r87,0; bra $L202; $L203: .loc 1 734 75 ld.u16 %r193,[%r88+22]; and.b16 %r192,%r193,-5; set.u32.eq.u16 %r195,%r192,257; neg.s32 %r196,%r195; .loc 1 734 37 set.u32.eq.u16 %r199,%r193,1029; neg.s32 %r200,%r199; .loc 1 734 75 cvt.u16.u32 %r202,%r196; cvt.u16.u32 %r203,%r200; or.b16 %r201,%r202,%r203;%r206 bra $L215; ld.u64 %r223,[%r89]; ld.s8 %r222,[%r88+20]; bra $L212; $L210: ld.u16 %r208,[%r88+22]; and.b16 %r207,%r208,-5; set.u32.eq.u16 %r210,%r207,257; neg.s32 %r211,%r210; .loc 1 734 37 set.u32.eq.u16 %r214,%r208,1029; neg.s32 %r215,%r214; .loc 1 734 75 cvt.u16.u32 %r217,%r211; cvt.u16.u32 %r218,%r215; or.b16 %r216,%r217,%r218; cvt.u32.u16 %r219,%r216; cvt.u16.u8 %r220,%r219; setp.ne.u16 %r221,%r220,0; @ %r221 bra $L216; bra $L217; $L215: .loc 1 736 22 st.u64 [%r88+8],%r91; ld.u64 %r223,[%r89]; ld.s8 %r222,[%r88+20]; bra $L212; $L216: st.u64 [%r88+8],%r91; bra $L217; $L202: .loc 1 778CFI_setpointerCFI_setpointer.reg .u64 %r67; .reg .u32 %r68; .reg .predpredpredpredpred %r128; mov.u64 %r64,%ar0; mov.u64 %r65,%ar1; mov.u64 %r66,%ar2; .loc 1 785 7 cvta.global.u64 %r67,_gfortrani_compile_options; .loc 1 785 6 ld.u32 %r68,[%r67+36]; setp.eq.u32 %r69,%r68,0; @ %r69 bra $L220; .loc 1 787 10 setp.ne.u64 %r70,%r64,0; @ %r70 bra $L221; .loc 1 789 13 ld.global.u64 %r75,[_impure_ptr]; .loc 1 789 4 ld.u64 %r76,[%r75+24]; mov.u64 %r73,32; mov.u64 %r72,1; cvta.const.u64 %r71,$LC3871790 11 mov.u32 %r63,10; bra $L219; $L221: .loc 1 793 10 ld.s8 %r79,[%r64+21]; cvt.u16.u32 %r78,%r79; setp.eq.u16 %r80,%r78,0; @ %r80 bra $L223; .loc 1 795 14 ld.global.u64 %r85,[_impure_ptr]; .loc 1 795 5 ld.u64 %r86,[%r85+24]; mov.u64 %r83,85; mov.u64 %r82,1; cvta.const.u64 %r81,$LC3981ld.param.u64 %r87,[%value_in]; } .loc 1 797 12 mov.u32 %r63,7; bra $L219; $L232: .loc 1 805 25 mov.u64 %r88,0; st.u64 [%r64],%r88; .loc 1 806 24 mov.u32 %r89,1; st.u32 [%r64+16],%r89; .loc 1 864 10 cvt.u32.u64 %r63,%r88; bra $L219; $L241: .loc 1 814 14 ld.u64 %r29,[%r64+8]; .loc 1 814 34 ld.u64 %r30,[%r65+8]; .loc 1 814 7 setp.eq.u64 %r90,%r29,%r30; @ %r90 bra $L224; .loc 1 816 17 ld.global.u64 %r93,[_impure_ptr]; .loc 1 816 8 ld.u64 %r94,[%r93+24]; st.u32 [%stack+4],%r30; st.u32 [%stack],%r29; cvta.const.u64 %r92,$LC495,[%value_in]; } .loc 1 820 15 mov.u32 %r63,4; bra $L219; $L224: 
.loc 1 823 14 ld.s8 %r35,[%r64+20]; .loc 1 823 30 ld.s8 %r50,[%r65+20]; .loc 1 823 7 cvt.u16.u32 %r97,%r35; cvt.u16.u32 %r98,%r50; setp.eq.u16 %r99,%r97,%r98; @ %r99 bra $L225; .loc 1 825 17 ld.global.u64 %r102,[_impure_ptr]; .loc 1 825 8 ld.u64 %r103,[%r102+24]; st.u32 [%stack+4],%r50; st.u32 [%stack],%r35; cvta.const.u64 %r101,$LC410r10104,[%value_in]; } .loc 1 828 15 mov.u32 %r63,5; bra $L219; $L225: .loc 1 831 14 ld.s16 %r40,[%r64+22]; .loc 1 831 30 ld.s16 %r41,[%r65+22]; .loc 1 831 7 cvt.u16.u32 %r106,%r40; cvt.u16.u32 %r107,%r41; setp.eq.u16 %r108,%r106,%r107; @ %r108 bra $L226; .loc 1 833 17 ld.global.u64 %r111,[_impure_ptr]; .loc 1 833 8 ld.u64 %r112,[%r111+24]; st.u32 [%stack+4],%r41; st.u32 [%stack],%r40; cvta.const.u64 %r110,$LC4211110113,[%value_in]; } .loc 1 836 15 mov.u32 %r63,6; bra $L219; $L226: .loc 1 842 17 ld.u64 %r46,[%r65]; st.u64 [%r64],%r46; .loc 1 849 31 ld.u32 %r47,[%r65+16]; .loc 1 849 23 st.u32 [%r64+16],%r47; .loc 1 852 7 cvt.u16.u32 %r115,%r50; setp.gt.s16 %r116,%r115,0; @ %r116 bra $L227; $L231: .loc 1 864 10 mov.u32 %r63,0; bra $L219; $L227: mov.u64 %r52,%r66; add.u32 %r117,%r50,-1; cvt.u64.u32 %r118,%r117; shl.b64 %r119,%r118,3; add.u64 %r120,%r52,8; add.u64 %r62,%r119,%r120; .loc 1 852 7 mov.u64 %r51,24; setp.eq.u64 %r128,%r52,0; $L230: .loc 1 854 7 @ %r128 bra $L228; .loc 1 855 47 ld.u64 %r23,[%r52]; add.u64 %r127,%r65,%r51; bra $L229; $L228: .loc 1 857 49 add.u64 %r127,%r65,%r51; ld.u64 %r23,[%r127]; $L229: add.u64 %r53,%r64,%r51; st.u64 [%r53],%r23; .loc 1 859 42 ld.u64 %r48,[%r127+8]; .loc 1 859 26 st.u64 [%r53+8],%r48; .loc 1 860 38 ld.u64 %r49,[%r127+16]; .loc 1 860 22 st.u64 [%r53+16],%r49; .loc 1 852 7 add.u64 %r52,%r52,8; add.u64 %r51,%r51,24; setp.ne.u64 %r123,%r52,%r62; @ %r123 bra $L230; bra $L231; $L220: .loc 1 803 6 setp.eq.u64 %r125,%r65,0; @ %r125 bra $L232; .loc 1 823 30 ld.s8 %r50,[%r65+20]; bra $L226; $L223: .loc 1 803 6 setp.eq.u64 %r126,%r65,0; @ ! 
%r126 bra $L241; bra $L232; $L219gfortrani_size_from_real_kindfortran/io/size_from_kind.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_size_from_complex_kindgfortrani_size_from_complex_kindF: _gfortrani_size_from_real_kindgfortrani_size_from_real_kind (r22pred %r25; mov.u32 %r23,%ar0; .loc 1 34 3 setp.eq.u32 %r24,%r23,4; @ %r24 bra $L3; setp.eq.u32 %r25,%r23,8; @ %r25 bra $L4; .loc 1 53 14 cvt.s64.s32 %r22,%r23; bra $L1; $L3: .loc 1 38 14 mov.u64 %r22,4; bra $L1; $L4: .loc 1 34 3 mov.u64 %r22,8; $L1: .loc 1 55gfortrani_size_from_complex_kindgfortrani_size_from_complex_kind (r23pred %r26; .reg .u32 %r27; mov.u32 %r24,%ar0; .loc 1 61 3 setp.eq.u32 %r25,%r24,4; @ %r25 bra $L7; setp.eq.u32 %r26,%r24,8; @ %r26 bra $L8; .loc 1 80 16 add.u32 %r27,%r24,%r24; cvt.s64.s32 %r23,%r27; bra $L5; $L7: .loc 1 65 14 mov.u64 %r23,8; bra $L5; $L8: .loc 1 61 3 mov.u64 %r23,16; $L5: .loc 1 8gfortran_associatedgfortran_associatedfortran/intrinsics/associatedgfortran_associatedgfortran_associatedu64u32 %r60; .reg .u16predpred %r94; .reg .pred %r95; mov.u64 %r51,%ar0; mov.u64 %r52,%ar1; .loc 1 36 7 ld.u64 %r22,[%r51]; .loc 1 36 6 setp.eq.u64 %r53,%r22,0; @ %r53 bra $L4; .loc 1 38 6 ld.u64 %r54,[%r52]; setp.ne.u64 %r55,%r22,%r54; @ %r55 bra $L5; .loc 1 40 6 ld.u64 %r56,[%r51+16]; ld.u64 %r57,[%r52+16]; setp.ne.u64 %r58,%r56,%r57; @ %r58 bra $L6; .loc 1 42 6 ld.s8 %r60,[%r51+29]; cvt.u16.u32 %r59,%r60; ld.s8 %r62,[%r52+29]; cvt.u16.u32 %r61,%r62; setp.ne.u16 %r63,%r59,%r61; @ %r63 bra $L7; .loc 1 45 10 ld.s8 %r28,[%r51+28]; .loc 1 46 3 cvt.u16.u32 %r64,%r28; setp.le.s16 %r65,%r64,0; @ %r65 bra $L8; add.u32 %r66,%r28,-1; cvt.u64.u32 %r67,%r66; add.u64 %r69,%r67,%r67; add.u64 %r70,%r69,%r67; shl.b64 %r71,%r70,3; add.u64 %r49,%r71,64; mov.u64 %r44,40; $L3: add.u64 %r43,%r51,%r44; .loc 1 49 16 ld.u64 %r73,[%r43+16]; add.u64 %r72,%r73,1; .loc 1 49 14 ld.u64 %r74,[%r43+8]; sub.u64 %r39,%r72,%r74; add.u64 %r38,%r52,%r44; .loc 1 51 21 ld.u64 %r76,[%r38+16]; add.u64 %r75,%r76,1; ld.u64 %r78,[%r38+8]; sub.u64 %r77,%r75,%r78; .loc 1 51 10 setp.ne.u64 %r79,%r77,%r39; @ %r79 bra $L9; .loc 1 53 10 ld.u64 %r81,[%r43]; ld.u64 %r82,[%r38]; set.u32.ne.u64 %r83,%r81,%r82; neg.s32 %r84,%r83; .loc 1 53 79 set.u32.ne.u64 %r86,%r77,10; .loc 1 55 10 setp.le.s64 %r94,%r77,0; @ %r94 bra $L11; .loc 1 46 3 add.u64 %r44,%r44,24; setp.ne.u64 %r95,%r44,%r49; @ %r95 bra $L3; .loc 1 59 10 mov.u32 %r50,1; bra $L1; $L4: .loc 1 37 12 cvt.u32.u64 %r50,%r22; bra $L1; $L5: mov.u32 %r50,0; bra $L1; $L6: mov.u32 %r50,0; bra $L1; $L7: mov.u32 %r50,0; bra $L1; $L8: .loc 1 59 10 mov.u32 %r50,1; bra $L1; $L9: .loc 1 37 12 mov.u32 %r50,0; bra $L1; $L10: mov.u32 %r50,0; bra $L1; $L11: mov.u32 %r50,0; $L1: .loc 1 60 abort.o/ gfortran_abort .visible .func _gfortran_abort; .file 1 "../../../libgfortran/intrinsics/abort.c" // BEGIN GLOBAL FUNCTION DECL: _gfortrani_sys_abort .extern .func _gfortrani_sys_gfortran_abort .visible .func _gfortran_abort { .loc 1 32 3 { call _gfortrani_sys_} args.o/ gfortran_getarg_i4 .visible .func _gfortran_getarg_fortran/intrinsics/arggfortran_getarg_i8 .visible .func _gfortran_getarg_i8_gfortran_iargc; // BEGIN GLOBAL FUNCTION DECL: _gfortran_get_command_argument_i4 .visible .func _gfortran_get_command_argument_i4_gfortran_get_command_argument_i8 .visible .func _gfortran_get_command_argument_i8gfortrani_get_args .extern .func _gfortrani_get_arg_gfortran_runtime_error .extern .func 
_gfortran_runtime_error110,103,32,97,114,103,117,109,101,110,116,32,116,111,32,103,101,116,95,99,111,109,109,97,110,100,95,97,114,103,117,109,101,110_gfortran_getarg_i4 .visible .func _gfortran_getarg_mov.u64 %r39,%ar2; .loc 1 42 3 add.u64 %r42,%frame,8; {frame; call _gfortrani_get_args,(%out_arg1,%out_arg2); } .loc 1 44 15 set.u32.eq.u64 %r44,%r39,0; neg.s32 %r45,%r44; .loc 1 44 22 set.u32.eq.u64 %r47,%r38,0or.b16 %r49,%r50,%r51; .loc 1 44 6! %r54 bra $L1; .loc 1 47 3 mov.u32 %r565658,[%value_in]; } .loc 1 49 8 ld.u32 %r25,[%r37]; .loc 1 49 6 ld.u32 %r60,[%frame+8]; set.u32.lt.s32 %r61,%r25,%r60; neg.s32 %r62,%r61; .loc 1 49 27 not.b32 %r64,%r25; shr.u32 %r65,%r64,31; cvt.u16.u32 %r68,%r62; cvt.u16.u32 %r69,%r65; and.b16 %r67,%r68,%r69; cvt.u32.u16 %r70,%r67; cvt.u16.u8 %r71,%r70; setp.eq.u16 %r72,%r71,0; @ %r72 bra $L1; .loc 1 51 46 cvt.s64.s32 %r73,%r25; .loc 1 51 33 ld.u64 %r74,[%frame]; shl.b64 %r75,%r73,3; add.u64 %r76,%r74,%r75; ld.u64 %r32,[%r76]32.loc 1 54 7 min.u64 %r80,%r39,%r78_gfortran_getarg_i8 .visible .func _gfortran_getarg_mov.u64 %r40,%ar2; .loc 1 68 40 ld.u64 %r22,[%r38]; .loc 1 42 3 add.u64 %r43,%frame,8; {frame; call _gfortrani_get_args,(%out_arg1,%out_arg2); } .loc 1 44 15 set.u32.eq.u64 %r45,%r40,0; neg.s32 %r46,%r45; .loc 1 44 22 set.u32.eq.u64 %r48,%r39,0or.b16 %r50,%r51,%r52; .loc 1 44 6! %r55 bra $L7; .loc 1 68 24 cvt.u32.u64 %r23,%r22; .loc 1 47 3 mov.u32 %r573949 6 ld.u32 %r61,[%frame+8]; set.u32.lt.s32 %r62,%r23,%r61; neg.s32 %r63,%r62; .loc 1 49 27 not.b32 %r65,%r23; shr.u32 %r66,%r65,31; cvt.u16.u32 %r69,%r63; cvt.u16.u32 %r70,%r66; and.b16 %r68,%r69,%r70;7; .loc 1 51 46 cvt.s64.s32 %r74,%r23; .loc 1 51 33 ld.u64 %r75,[%frame]; shl.b64 %r76,%r74,3; add.u64 %r77,%r75,%r76; ld.u64 %r34,[%r77]34.loc 1 54 7 min.u64 %r81,%r40,%r79$L7: .loc 1 7_gfortran_iargcgfortran_iargc { .reg .u32 %valueloc 1 86 3 add.u64 %r26,%frame,8frame; call _gfortrani_get_args,(%out_arg1,%out_arg2); } .loc 1 88 16 ld.u32 %r27,[%frame+8]; add.u32 %value,%r27,-1; .loc 1 89_gfortran_get_command_argument_i4 .visible .func _gfortran_get_command_argument_i4 {stack; mov.u64 %stack,0predpred_gfortran_runtime_error15: .loc 1 129 6 or.b64 %r49,%r42,%r43;6; .loc 1 129 39 setp.ne.u64 %r51,%r44,0; @ %r51 bra $L17; bra $L14; $L16: .loc 1 132 3 add.u64 %r54,%frame,8_gfortrani_get_args,(%out_arg1,%out_arg2); } .loc 1 134 7 ld.u32 %r39,[%r41]; .loc 1 134 6 setp.lt.s32 %r55,%r39,0; @ %r55 bra $L26; .loc 1 134 19 ld.u32 %r56,[%frame+8]; setp.ge.s32 %r57,%r39,%r56; @ %r57 bra $L27; $L25: .loc 1 137 26 cvt.s64.s32 %r59,%r39; .loc 1 137 14 ld.u64 %r60,[%frame]; shl.b64 %r61,%r59,3; add.u64 %r62,%r60,%r61; ld.u64 %r58,[%r62]58mov.u64 %r37,%r63; .loc 1 121 13 mov.u32 %r36,0; bra $L18; $L26: .loc 1 122 20 mov.u64 %r37,0; .loc 1 135 15 mov.u32 %r36,42; bra $L18; $L27: .loc 1 122 20 mov.u64 %r37,0; .loc 1 135 15 mov.u32 %r36,42; $L18: .loc 1 139 6 setp.eq.u64 %r65,%r42,0; @ %r65 bra $L19; .loc 1 141 10 setp.eq.u64 %r66,%r45,0; @ %r66 bra $L28; .loc 1 144 2 mov.u32 %r64147 6 setp.eq.u32 %r71,%r36,42; @ %r71 bra $L19; .loc 1 149 10 setp.le.u64 %r72,%r37,%r45; .loc 1 150 13 selp.u32 %r36,%r36,-1,%r72; .loc 1 152 7 min.u64 %r73,%r37,%r45; .loc 1 152 27 ld.s32 %r74,[%r41]; .loc 1 152 7 ld.u64 %r75,[%frame]; shl.b64 %r76,%r74,3; add.u64 %r77,%r75,%r76; ld.u64 %r78,[%r77];bra $L19; $L28: .loc 1 142 12 mov.u32 %r36,42; $L19: .loc 1 155 6 setp.eq.u64 %r87,%r43,0; @ %r87 bra $L21; .loc 1 156 13 st.u32 [%r43],%r37; $L21: .loc 1 158 6 setp.eq.u64 %r88,%r44,0; @ ! 
%r88 bra $L23; bra $L14; $L30: .loc 1 135 15 mov.u32 %r36,42; bra $L23; $L29: mov.u32 %r36,42; $L23: .loc 1 159 13 st.u32 [%r44],%r36; bra $L14; $L17: .loc 1 132 3 add.u64 %r91,%frame,8; {frame; call _gfortrani_get_args,(%out_arg1,%out_arg2); } .loc 1 134 7 ld.u32 %r39,[%r41]; .loc 1 134 6 setp.lt.s32 %r92,%r39,0; @ %r92 bra $L29; .loc 1 134 19 ld.u32 %r93,[%frame+8]; setp.le.s32 %r94,%r93,%r39; @ ! %r94 bra $L25; bra $L30; $L14_gfortran_get_command_argument_i8 .visible .func _gfortran_get_command_argument_i8 {179 13 ld.u64 %r33,[%r28]; st.u32 [%frame+8],%r33; .loc 1 180 3 add.u64 %r39,%frame,4; add.u64 %r40,%frame,8; {framecall _gfortran_get_command_argument_i4); }182 13 ld.s32 %r42,[%frame+4]; st.u64 [%r30],%r42; $L44: .loc 1 183 6 setp.eq.u64 %r43,%r31,0; @ %r43 bra $L43; .loc 1 184 13 ld.s32 %r44,[%frame]; st.u64 [%r31],%r44; $L43_gfortran_get_command_i4 .visible .func _gfortran_get_commanpredpred %r94; .reg .u64 %r98predor.b64 %r56,%r52,%r53; setp.ne.u64 %r57,%r56,0; @ %r57 bra $L53; .loc 1 203 41 setp.ne.u64 %r58,%r54,0; @ %r58 bra $L54; bra $L52; $L53: .loc 1 206 3 add.u64 %r61,%frame,8frame; call _gfortrani_get_args,(%out_arg1,%out_arg2); } .loc 1 208 6 setp.eq.u64 %r62,%r52,0; @ %r62 bra $L66; .loc 1 211 10 setp.eq.u64 %r63,%r55,0; @ %r63 bra $L67; .loc 1 214 2 mov.u32 %r65,3.loc 1 199 7 mov.u32 %r39,0; bra $L55; $L70: cvt.u32.u64 %r39,%r56; $L62: .loc 1 221 19 set.u32.ne.u64 %r69,%r52,0; neg.s32 %r70,%r69; cvt.u32.u32 %r68,%r70; mov.u64 %r38,0; mov.u64 %r40,%r38; .loc 1 221 27 cvt.u16.u8 %r82,%r68; .loc 1 226 30 cvt.u32.u64 %r121,%r55; $L60: .loc 1 219 16 ld.u64 %r71,[%frame]; shl.b64 %r72,%r38,3; add.u64 %r73,%r71,%r72; ld.u64 %r24,[%r73]24.loc 1 230 38 add.u64 %r51,%r75,%r40; .loc 1 221 40 set.u32.eq.u32 %r78,%r39,0; neg.s32 %r79,%r78; .loc 1 221 27 cvt.u16.u32 %r81,%r79; and.b16 %r80,%r81,%r82; .loc 1 221@ %r85 bra $L56; .loc 1 224 16 cvt.u32.u64 %r86,%r75; cvt.s64.s32 %r27,%r86; add.u64 %r87,%r27,%r40; .loc 1 224 7 setp.le.u64 %r88,%r87,%r55; @ %r88 bra $L57; .loc 1 226 30 cvt.u32.u64 %r91,%r40; sub.u32 %r89,%r121,%r91; .loc 1 233 4 cvt.s64.s32 %r27,%r89; .loc 1 227 18 mov.u32 %r39,-1; bra $L58; $L57: .loc 1 230 23 add.u32 %r92,%r119,-1; .loc 1 230 12 cvt.u32.u64 %r93,%r38; setp.eq.u32 %r94,%r92,%r93; @ %r94 bra $L68; .loc 1 230 27 set.u32.eq.u64 %r39,%r55,%r51; bra $L58; $L68: mov.u32 %r39,0; $L58: .loc 1 233 12 add.u64 %r98,%r52,%r40; .loc 1 233 4105,[%value_in]; } ld.u32 %r119,[%frame+8]; $L56: .loc 1 238 21 add.u32 %r107,%r119,-1; .loc 1 238 10 cvt.u32.u64 %r108,%r38; setp.eq.u32 %r109,%r107,%r108; @ %r109 bra $L69; .loc 1 239 9 add.u64 %r40,%r51,1; bra $L59; $L69: .loc 1 237 15 mov.u64 %r40,%r51; $L59: .loc 1 217 3 add.u64 %r38,%r38,1; cvt.u32.u64 %r110,%r38; setp.gt.s32 %r111,%r119,%r110; @ %r111 bra $L60; bra $L61; $L66: .loc 1 199 7 cvt.u32.u64 %r39,%r52; bra $L55; $L67: .loc 1 212 12 mov.u32 %r39,42; $L55: .loc 1 217 17 ld.u32 %r119,[%frame+8]; .loc 1 217 3 setp.gt.s32 %r112,%r119,0; @ %r112 bra $L62; mov.u64 %r40,0; $L61: .loc 1 242 6 setp.eq.u64 %r113,%r53,0; @ %r113 bra $L63; .loc 1 243 13 st.u32 [%r53],%r40; $L63: .loc 1 245 6 setp.eq.u64 %r114,%r54,0; @ %r114 bra $L52; $L65: .loc 1 246 13 st.u32 [%r54],%r39; bra $L52; $L54: .loc 1 206 3 add.u64 %r117,%frame,8frame; call _gfortrani_get_args,(%out_arg1,%out_arg2); } .loc 1 217 17 ld.u32 %r119,[%frame+8]; .loc 1 217 3 setp.gt.s32 %r118,%r119,0; @ %r118 bra $L70; .loc 1 199 7 cvt.u32.u64 %r39,%r56; bra $L65; $L52:_gfortran_get_command_i8 .visible .func _gfortran_get_comman.loc 1 264 3 add.u64 %r34,%frame,429; call 
_gfortran_get_command_265 6 setp.eq.u64 %r35,%r27,0; @ %r35 bra $L82; .loc 1 266 13 ld.s32 %r36,[%frame+4]; st.u64 [%r27],%r36; $L82: .loc 1 267 6 setp.eq.u64 %r37,%r28,0; @ %r37 bra $L81; .loc 1 268 13 ld.s32 %r38,[%frame]; st.u64 [%r28],%r38; $L81: .loc 1 269 1 ret; } cshift0.o/ BEGIN FUNCTION DECL: cshift0 .func cshift0in_ar4); .file 1 "../../../libgfortran/intrinsics/cshift0.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_cshift0_1 .visible .func _gfortran_cshift0_1gfortran_cshift0_1_char .visible .func _gfortran_cshift0_1_char_gfortran_cshift0_1_char4 .visible .func _gfortran_cshift0_1_char4_gfortran_cshift0_2 .visible .func _gfortran_cshift0_2gfortran_cshift0_2_char .visible .func _gfortran_cshift0_2_char_gfortran_cshift0_2_char4 .visible .func _gfortran_cshift0_2_char4_gfortran_cshift0_4 .visible .func _gfortran_cshift0_4gfortran_cshift0_4_char .visible .func _gfortran_cshift0_4_char_gfortran_cshift0_4_char4 .visible .func _gfortran_cshift0_4_char4_gfortran_cshift0_8 .visible .func _gfortran_cshift0_8gfortran_cshift0_8_char .visible .func _gfortran_cshift0_8_char_gfortran_cshift0_8_char4 .visible .func _gfortran_cshift0_8_char4_gfortran_cshift0_16 .visible .func _gfortran_cshift0_16gfortran_cshift0_16_char .visible .func _gfortran_cshift0_16_char_gfortran_cshift0_16_char4 .visible .func _gfortran_cshift0_16_char4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsgfortrani_cshift0_i1 .extern .func _gfortrani_cshift0_i1_gfortrani_cshift0_i2 .extern .func _gfortrani_cshift0_i2_gfortrani_cshift0_i4 .extern .func _gfortrani_cshift0_i4_gfortrani_cshift0_i8 .extern .func _gfortrani_cshift0_i8_gfortrani_cshift0_i16 .extern .func _gfortrani_cshift0_i16_gfortrani_cshift0_r4 .extern .func _gfortrani_cshift0_r4_gfortrani_cshift0_r8 .extern .func _gfortrani_cshift0_r8_gfortrani_cshift0_c4 .extern .func _gfortrani_cshift0__gfortrani_cshift0_c8 .extern .func _gfortrani_cshift0_114,101,116,117,114,110,32,118,97,108,117,101,0 }03,117,109,101,110,116,32,39,68,73,77,39,32,105,115,32,111,117,116,32,111,102,32,114,97,110,103,101,32,105,110,32,99,97,108,108,32,116,111,32,39,67,83,72,73,70,84,397] = {67,83,72,73,70,84,0 }; // BEGIN FUNCTION DEF: cshift0 .func cshift0in_ar4) {stack; mov.u64 %stack,0480u64 %r191; .reg .u64predpredpred %r215; .reg .pred %r216; .reg .pred %r217; .reg .pred %r218; .reg .predpredpred %r265; .reg .predpred %r305; .reg .u64 %r307; .reg .predu64 %r325; .reg .u64pred %r345; .reg .pred %r346; .reg .predu64 %r377; .reg .u64 %r386; .reg .pred %r388; .reg .u64 %r396; .reg .pred.reg .predu32u64 %r427; .reg .pred %r429; .reg .pred %r431; .reg .pred %r433; .reg .pred.reg .u64 %r439; .reg .u64 %r440; mov.u64 %r165,%ar0; mov.u64 %r166,%ar1; mov.u64 %r167,%ar2; mov.u32 %r168,%ar3; mov.u64 %r169,%ar4; .loc 1 54 6 setp.le.s32 %r170,%r168,0; @ %r170 bra $L2; .loc 1 54 28 ld.s8 %r171,[%r166+28]; .loc 1 54 17 setp.ge.s32 %r172,%r171,%r168; @ %r172 bra $L3; $L2: .loc 1 55 5 cvta.const.u64 %r173173_gfortran_runtime_error166; call (%value_in),_gfortran_size0,(59 6 ld.u64 %r177,[%r165]; setp.ne.u64 %r178,%r177,0; @ %r178 bra $L4; .loc 1 63 19 st.u64 [%r165+8],%r177; .loc 1 64 7 ld.u64 %r180,[%r166+16]; st.u64 [%r165+16],%r180; ld.u64 %r181,[%r166+24]; st.u64 [%r165+24],%r181; .loc 1 65 23 ld.s8 %r121,[%r166+28]; .loc 1 65 7 
cvt.u16.u32 %r182,%r121; setp.gt.s16 %r183,%r182,0; @ %r183 bra $L5; $L8: .loc 1 81 241(%value_in),_gfortrani_xmallocarray186,[%value_in]; } .loc 1 81 22 st.u64 [%r165],%r186; bra $L6; $L7: .loc 1 69 16 ld.u64 %r189,[%r68+8]; ld.u64 %r191,[%r68]; .loc 1 74 17 mad.lo.u64 %r93,%r87,%r93,%r93; add.u64 %r68,%r68,24; add.u64 %r58,%r58,24; .loc 1 69 14 sub.u64 %r87,%r189,%r191; $L55: add.u64 %r31,%r165,%r58; .loc 1 77 4 st.u64 [%r31+48],%r193; st.u64 [%r31+56],%r87; st.u64 [%r31+40],%r93; .loc 1 65 7 setp.ne.u64 %r194,%r58,%r427; @ %r194 bra $L7; bra $L8; $L4: .loc 1 83 12 cvta.global.u64 %r195,_gfortrani_compile_options; .loc 1 83 11 ld.u32 %r196,[%r195+36]; setp.eq.u32 %r197,%r196,0; @ %r197 bra $L6; .loc 1 85 7 cvta.const.u64 %r201,$LC1; cvta.const.u64 %r20020001; call _gfortrani_bounds_equal_extents$L6: .loc 1 89 6 setp.eq.u64 %r202,%r176,0; @ %r202 bra $L1; .loc 1 92 15 ld.s8 %r203,[%r166+29]; shl.b32 %r204,%r203,4; cvt.s64.s32 %r205,%r204; ld.u64 %r207,[%r166+16]; shl.b64 %r206,%r207,7; or.b64 %r47,%r205,%r206; .loc 1 94 3 setp.eq.u64 %r208,%r47,1040; @ %r208 bra $L10; setp.gt.u64 %r209,%r47,1040; @ %r209 bra $L11; setp.eq.u64 %r210,%r47,288; @ %r210 bra $L12; setp.gt.u64 %r211,%r47,288; @ %r211 bra $L13; setp.eq.u64 %r212,%r47,160; @ %r212 bra $L14; setp.eq.u64 %r213,%r47,272; @ %r213 bra $L12; setp.eq.u64 %r214,%r47,144; @ %r214 bra $L14; bra $L15; $L13: setp.eq.u64 %r215,%r47,544; @ %r215 bra $L16; setp.eq.u64 %r216,%r47,560; @ %r216 bra $L17; setp.eq.u64 %r217,%r47,528; @ %r217 bra $L16; bra $L15; $L11: setp.eq.u64 %r218,%r47,1088; @ %r218 bra $L18; setp.gt.u64 %r219,%r47,1088; @ %r219 bra $L19; setp.eq.u64 %r220,%r47,1056; @ %r220 bra $L10; setp.eq.u64 %r221,%r47,1072; @ %r221 bra $L20; bra $L15; $L19: setp.eq.u64 %r222,%r47,2080; @ %r222 bra $L21; setp.eq.u64 %r223,%r47,2112; @ %r223 bra $L22; setp.eq.u64 %r224,%r47,2064; @ %r224 bra $L21; bra $L15; $L14: .loc 1 988; call _gfortrani_cshift0_99 7 bra $L1; $L12:8; call _gfortrani_cshift0_104 7 bra $L1; $L16:8; call _gfortrani_cshift0_109 7 bra $L1; $L10:8; call _gfortrani_cshift0_114 7 bra $L1; $L21:8; call _gfortrani_cshift0_i16121 7 bra $L1; $L17:8; call _gfortrani_cshift0_r47 bra $L1; $L20:8; call _gfortrani_cshift0_r8130 7 bra $L1; $L18:8; call _gfortrani_cshift0_c47 bra $L1; $L28; call _gfortrani_cshift0_c8160 7 bra $L1; $L15: .loc 1 188 3 setp.eq.u64 %r261,%r169,4; @ %r261 bra $L24; setp.gt.s64 %r262,%r169,4; @ %r262 bra $L25; setp.eq.u64 %r263,%r169,1; @ %r263 bra $L26; setp.eq.u64 %r264,%r169,2; @ %r264 bra $L27; bra $L28; $L25: setp.eq.u64 %r265,%r169,8; @ %r265 bra $L29; setp.eq.u64 %r266,%r169,16; @ %r266 bra $L30; bra $L28; $L26:8; call _gfortrani_cshift0_327 8 ld.u64 %r114,[%r165]; .loc 1 328 8 ld.u64 %r115,[%r166]; .loc 1 196 7 bra $L31; $L27: .loc 1 199 11 ld.u64 %r114,[%r165]; .loc 1 199 46 ld.u64 %r115,[%r166]; .loc 1 199 43 or.b64 %r271,%r114,%r115; cvt.u16.u64 %r274,%r271; and.b16 %r273,%r274,1; cvt.u32.u16 %r275,%r273; cvt.u16.u8 %r276,%r275; setp.ne.u16 %r277,%r276,0; @ %r277 bra $L318; call _gfortrani_cshift0_205 4 bra $L1; $L24: .loc 1 209 11 ld.u64 %r114,[%r165]; .loc 1 199 46 ld.u64 %r115,[%r166]; .loc 1 209 43 or.b64 %r282,%r114,%r115; and.b64 %r283,%r282,3; setp.ne.u64 %r284,%r283,0; @ %r284 bra $L318; call _gfortrani_cshift0_215 4 bra $L1; $L29: .loc 1 219 11 ld.u64 %r114,[%r165]; .loc 1 199 46 ld.u64 %r115,[%r166]; .loc 1 219 43 or.b64 %r289,%r114,%r115; and.b64 %r290,%r289,7; setp.eq.u64 %r291,%r290,0; @ %r291 bra $L32; .loc 1 228 8 and.b64 %r293,%r289,3; setp.ne.u64 %r294,%r293,0; @ %r294 bra $L318; call 
_gfortrani_cshift0_c4233 4 bra $L1; $L32:8; call _gfortrani_cshift0_239 4 bra $L1; $L30: .loc 1 244 11 ld.u64 %r114,[%r165]; .loc 1 199 46 ld.u64 %r115,[%r166]; .loc 1 245 4 or.b64 %r303,%r114,%r115; and.b64 %r304,%r303,15; setp.eq.u64 %r305,%r304,0; @ %r305 bra $L33; .loc 1 254 8 and.b64 %r307,%r303,7; setp.ne.u64 %r308,%r307,0; @ %r308 bra $L318; call _gfortrani_cshift0_c84 bra $L1; $L33:8; call _gfortrani_cshift0_i16265 4 bra $L1; $L28: .loc 1 327 8 ld.u64 %r114,[%r165]; .loc 1 328 8 ld.u64 %r115,[%r166]; $L31: .loc 1 286 9 add.u32 %r102,%r168,-1; .loc 1 287 14 mov.u64 %r317,0; st.u64 [%frame+240],%r317; .loc 1 288 14 st.u64 [%frame+360],%r317; .loc 1 290 13 mov.u64 %r319,1; st.u64 [%frame],%r319; .loc 1 291 12 st.u64 [%frame+120],%r317; .loc 1 298 23 ld.s8 %r134,[%r166+28]; cvt.u32.u32 %r321,%r134; cvt.s64.s8 %r133,%r321; .loc 1 298 3 setp.le.s64 %r322,%r133,0; @ %r322 bra $L57; .loc 1 300 15 cvt.s64.s32 %r126,%r102; .loc 1 302 21 ld.u64 %r153,[%r165+16]; .loc 1 305 21 ld.u64 %r157,[%r166+16]; mov.u64 %r86,%r169; mov.u64 %r83,%r169; mov.u64 %r94,40; .loc 1 292 5 mov.u64 %r91,%r317; .loc 1 296 7 mov.u64 %r90,%r91; .loc 1 298 12 mov.u64 %r116,%r91; .loc 1 312 20 mov.u64 %r440,%r91; $L39: add.u64 %r84,%r166,%r94; .loc 1 308 17 ld.u64 %r324,[%r84+16]; add.u64 %r323,%r324,1; .loc 1 308 15 ld.u64 %r325,[%r84+8]; sub.u64 %r150,%r323,%r325; .loc 1 302 21 add.u64 %r326,%r165,%r94; ld.u64 %r327,[%r326]; mul.lo.u64 %r82,%r153,%r327; .loc 1 305 21 ld.u64 %r328,[%r84]; mul.lo.u64 %r85,%r157,%r328; .loc 1 300 10 setp.ne.u64 %r329,%r126,%r116; @ %r329 bra $L35; .loc 1 303 14 setp.ne.u64 %r330,%r82,0; .loc 1 306 14 setp.ne.u64 %r331,%r85,0; .loc 1 308 15 mov.u64 %r90,%r150; selp.u64 %r86,%r85,%r169,%r331; selp.u64 %r83,%r82,%r169,%r330; bra $L38; $L35: .loc 1 312 20 shl.b64 %r332,%r91,3; add.u64 %r333,%frame,%r332; add.u64 %r334,%r333,120; st.u64 [%r334],%r440; .loc 1 313 21 st.u64 [%r333],%r150; .loc 1 314 22 st.u64 [%r333+360],%r82; .loc 1 315 22 st.u64 [%r333+240],%r85; .loc 1 316 12 add.u64 %r91,%r91,1; $L38: .loc 1 298 55 add.u64 %r116,%r116,1; .loc 1 298 3 add.u64 %r94,%r94,24; setp.ne.u64 %r344,%r116,%r133; @ %r344 bra $L39; .loc 1 319 14 ld.u64 %r61,[%frame+240]; .loc 1 321 14 ld.u64 %r39,[%frame+360]; .loc 1 319 6 setp.ne.u64 %r345,%r61,0; selp.u64 %r61,%r61,%r169,%r345; .loc 1 321 6 setp.eq.u64 %r346,%r39,0; selp.u64 %r39,%r169,%r39,%r346; .loc 1 330 24 setp.eq.u64 %r347,%r90,0; @ %r347 bra $L59; rem.s64 %r105,%r167,%r90; .loc 1 331 6 setp.ge.s64 %r348,%r105,0; @ %r348 bra $L34; .loc 1 332 11 add.u64 %r105,%r105,%r90; bra $L34; $L54: .loc 1 352 34 mul.lo.u64 %r131,%r86,%r105; .loc 1 342 24 mul.lo.u64 %r123,%r169,%r105; .loc 1 343 23 sub.u64 %r140,%r90,%r105; .loc 1 343 32 mul.lo.u64 %r34,%r169,%r140; .loc 1 340 19 set.u32.eq.u64 %r350,%r86,%r169; neg.s32 %r351,%r350; .loc 1 340 38 set.u32.eq.u64 %r353,%r83,%r169; neg.s32 %r354,%r353; .loc 1 340 27.loc 1 373 32 ld.u64 %r144,[%frame]; mul.lo.u64 %r138,%r140,%r83; .loc 1 380 30 mul.lo.u64 %r33,%r144,%r39; .loc 1 381 30 mul.lo.u64 %r359,%r144,%r61; .loc 1 381 16 neg.s64 %r135,%r359; setp.eq.u32 %r431,%r141,0; setp.gt.s64 %r429,%r133,2; .loc 1 354 4 setp.le.s64 %r433,%r140,0; .loc 1 360 4 setp.gt.s64 %r434,%r105,0; cvt.u32.u32 %r435,%r134; cvt.s64.s8 %r436,%r435; add.u64 %r437,%r436,-1; add.u64 %r438,%frame,360; add.u64 %r439,%frame,240; $L56: .loc 1 340 10 @ %r431 bra $L42; .loc 1 344 23 add.u64 %r361,%r115,%r123; .loc 1 344 41161368,[%value_in]; } .loc 1 345 17 add.u64 %r370,%r114,%r34; .loc 1 345 475bra $L43; $L42: .loc 1 352 16 add.u64 
%r110,%r115,%r131; .loc 1 354 4 @ %r433 bra $L60; mov.u64 %r109,%r114; .loc 1 354 11 mov.u64 %r111,0; $L45:696,[%value_in]; } .loc 1 357 13 add.u64 %r109,%r109,%r83; .loc 1 358 12 add.u64 %r110,%r110,%r86; .loc 1 354 34 add.u64 %r111,%r111,1; .loc 1 354 4 setp.ne.u64 %r388,%r111,%r140; @ %r388 bra $L45; .loc 1 357 13 add.u64 %r106,%r114,%r138; bra $L44; $L60: .loc 1 354 4 mov.u64 %r106,%r114; $L44: .loc 1 360 4 @ ! %r434 bra $L43; mov.u64 %r107,%r115; .loc 1 360 23 mov.u64 %r108,0; $L46: .loc 1 362 8169396,[%value_in]; } .loc 1 363 13 add.u64 %r106,%r106,%r83; .loc 1 364 12 add.u64 %r107,%r107,%r86; .loc 1 360 40 add.u64 %r108,%r108,1; .loc 1 360 4 setp.ne.u64 %r398,%r108,%r105; @ %r398 bra $L46; $L43: .loc 1 369 12 add.u64 %r114,%r114,%r39; .loc 1 370 12 add.u64 %r115,%r115,%r61; .loc 1 371 15 ld.u64 %r399,[%frame+120]; add.u64 %r67,%r399,1; .loc 1 373 13 setp.eq.u64 %r400,%r67,%r144; @ %r400 bra $L48; .loc 1 371 15 st.u64 [%frame+120],%r67; bra $L56; $L48: .loc 1 377 20 mov.u64 %r401,0; st.u64 [%frame+120],%r401; .loc 1 381 16 add.u64 %r112,%r115,%r135; .loc 1 383 14 @ %r429 bra $L50; bra $L1; $L53: .loc 1 377 20 st.u64 [%r132],%r401; .loc 1 380 30 mul.lo.u64 %r70,%r76,%r75; .loc 1 381 30 mul.lo.u64 %r404,%r78,%r75; .loc 1 381 16 sub.u64 %r112,%r115,%r404; .loc 1 382 12 add.u64 %r113,%r113,1; .loc 1 383 14 add.u64 %r132,%r132,8; add.u64 %r129,%r129,8; setp.eq.u64 %r405,%r437,%r113; @ ! %r405 bra $L52; bra $L1; $L50: add.u64 %r132,%frame,128; .loc 1 380 30 mov.u64 %r70,%r33; .loc 1 383 14 mov.u64 %r129,8; .loc 1 382 12 mov.u64 %r113,1; $L52: .loc 1 391 23 ld.u64 %r409,[%r132]; add.u64 %r75,%r409,1; st.u64 [%r132],%r75; .loc 1 392 30 add.u64 %r411,%r438,%r129; ld.u64 %r76,[%r411]; .loc 1 392 20 sub.u64 %r412,%r76,%r70; add.u64 %r114,%r114,%r412; .loc 1 393 30 add.u64 %r414,%r439,%r129; ld.u64 %r78,[%r414]; .loc 1 393 20 add.u64 %r115,%r112,%r78; .loc 1 373 32 add.u64 %r415,%frame,%r129; ld.u64 %r81,[%r415]; .loc 1 373 13 setp.eq.u64 %r416,%r75,%r81; @ %r416 bra $L53; bra $L56; $L57: .loc 1 298 3 mov.u64 %r86,%r169; mov.u64 %r83,%r169; mov.u64 %r39,%r169; mov.u64 %r61,%r169; .loc 1 296 7 mov.u64 %r90,%r317; .loc 1 330 24 mov.u64 %r105,%r90; bra $L34; $L59: mov.u64 %r105,%r90; $L34: .loc 1 334 9 setp.ne.u64 %r417,%r114,0; @ %r417 bra $L54; bra $L1; $L5: .loc 1 69 16 ld.u64 %r419,[%r166+56]; ld.u64 %r421,[%r166+48]; .loc 1 69 14 sub.u64 %r87,%r419,%r421; add.u64 %r68,%r166,72; add.u32 %r422,%r121,-1; cvt.u64.u32 %r423,%r422; add.u64 %r425,%r423,%r423; add.u64 %r426,%r425,%r423; shl.b64 %r427,%r426,3; mov.u64 %r58,%r177; .loc 1 72 17 mov.u64 %r93,1; .loc 1 77 4 mov.u64 %r193,%r58; bra $L55; $L_gfortran_cshift0_1 .visible .func _gfortran_cshift0_1ld.u64 %r23,[%r29+16103; ld.s8 %r27,[%r31]; bra $L102; $L103: mov.u32 %r27,1; $L102: ld.s8 %r38,[%r30]call cshift0); }_gfortran_cshift0_1_char .visible .func _gfortran_cshift0_1_charpredmov.u64 %r29,%ar2; mov.u64 %r30,%ar3; mov.u64 %r31,%ar4; mov.u32 %r32,%ar5; cvt.s64.s32 %r22,%r32; setp.eq.u64 %r33,%r31,0; @ %r33 bra $L106; ld.s8 %r26,[%r31]; bra $L105; $L106: mov.u32 %r26,1; $L105: ld.s87call cshift0); }_gfortran_cshift0_1_char4 .visible .func _gfortran_cshift0_1_char4predmov.u64 %r33,%ar4; mov.u32 %r34,%ar5; cvt.s64.s32 %r35,%r34; shl.b64 %r24,%r35,2; setp.eq.u64 %r36,%r33,0; @ %r36 bra $L109; ld.s8 %r28,[%r33]; bra $L108; $L109: mov.u32 %r28,1; $L108: ld.s8 %r42,[%r32]call cshift0); }_gfortran_cshift0_2 .visible .func _gfortran_cshift0_2ld.u64 %r23,[%r29+162; ld.s16 %r27,[%r31]; bra $L111; $L112: mov.u32 %r27,1; $L111: ld.s16 %r38,[%r30]call 
cshift0); }_gfortran_cshift0_2_char .visible .func _gfortran_cshift0_2_charpredmov.u64 %r29,%ar2; mov.u64 %r30,%ar3; mov.u64 %r31,%ar4; mov.u32 %r32,%ar5; cvt.s64.s32 %r22,%r32; setp.eq.u64 %r33,%r31,0; @ %r33 bra $L115; ld.s16 %r26,[%r31]; bra $L114; $L115: mov.u32 %r26,1; $L114:7call cshift0); }_gfortran_cshift0_2_char4 .visible .func _gfortran_cshift0_2_char4predmov.u64 %r33,%ar4; mov.u32 %r34,%ar5; cvt.s64.s32 %r35,%r34; shl.b64 %r24,%r35,2; setp.eq.u64 %r36,%r33,0; @ %r36 bra $L118; ld.s16 %r28,[%r33]; bra $L117; $L118: mov.u32 %r28,1; $L117: ld.s16 %r42,[%r32]call cshift0); }_gfortran_cshift0_4 .visible .func _gfortran_cshift0_predmov.u64 %r29,%ar2; mov.u64 %r30,%ar3; ld.u64 %r23,[%r28+16]; setp.eq.u64 %r31,%r30,0; @ %r31 bra $L121; ld.u32 %r26,[%r30]; bra $L120; $L121: mov.u32 %r26,1call cshift0); }_gfortran_cshift0_4_char .visible .func _gfortran_cshift0_4_charpredmov.u64 %r29,%ar3; mov.u64 %r30,%ar4; mov.u32 %r31,%ar5; cvt.s64.s32 %r22,%r31; setp.eq.u64 %r32,%r30,0; @ %r32 bra $L124; ld.u32 %r25,[%r30]; bra $L123; $L124: mov.u32 %r25,1; $L123: ld.s32 %r38,[%r29]call cshift0); }_gfortran_cshift0_4_char4 .visible .func _gfortran_cshift0_4_char4pred %r35; .reg .u64 %r41mov.u32 %r33,%ar5; cvt.s64.s32 %r34,%r33; shl.b64 %r24,%r34,2; setp.eq.u64 %r35,%r32,0; @ %r35 bra $L127; ld.u32 %r27,[%r32]; bra $L126; $L127: mov.u32 %r27,1; $L126: ld.s32 %r41,[%r31]call cshift0); }_gfortran_cshift0_8 .visible .func _gfortran_cshift0_8predmov.u64 %r29,%ar2; mov.u64 %r30,%ar3; ld.u64 %r23,[%r28+16]; setp.eq.u64 %r31,%r30,0; @ %r31 bra $L130; ld.u32 %r26,[%r30]; bra $L129; $L130: mov.u32 %r26,1call cshift0); }_gfortran_cshift0_8_char .visible .func _gfortran_cshift0_8_charpredmov.u64 %r29,%ar3; mov.u64 %r30,%ar4; mov.u32 %r31,%ar5; cvt.s64.s32 %r22,%r31; setp.eq.u64 %r32,%r30,0; @ %r32 bra $L133; ld.u32 %r25,[%r30]; bra $L132; $L133: mov.u32 %r25,1; $L132: ld.u64 %r36,[%r29]call cshift0); }_gfortran_cshift0_8_char4 .visible .func _gfortran_cshift0_8_char4pred %r35; .reg .u64 %r39mov.u32 %r33,%ar5; cvt.s64.s32 %r34,%r33; shl.b64 %r24,%r34,2; setp.eq.u64 %r35,%r32,0; @ %r35 bra $L136; ld.u32 %r27,[%r32]; bra $L135; $L136: mov.u32 %r27,1; $L135: ld.u64 %r39,[%r31]call cshift0); }_gfortran_cshift0_16 .visible .func _gfortran_cshift0_16ld.u64 %r23,[%r29+16139; ld.u32 %r27,[%r31]; bra $L138; $L139: mov.u32 %r27,1; $L138: ld.u64 %r36,[%r30]call cshift0); }_gfortran_cshift0_16_char .visible .func _gfortran_cshift0_16_charpred %r33; .reg .u64 %r37; mov.u64 %r27,%ar0; mov.u64 %r29,%ar2; mov.u64 %r30,%ar3; mov.u64 %r31,%ar4; mov.u32 %r32,%ar5; cvt.s64.s32 %r22,%r32; setp.eq.u64 %r33,%r31,0; @ %r33 bra $L142; ld.u32 %r26,[%r31]; bra $L141; $L142: mov.u32 %r26,1; $L141:7call cshift0); }_gfortran_cshift0_16_char4 .visible .func _gfortran_cshift0_16_char4pred %r36; .reg .u64 %r40; mov.u64 %r29,%ar0; mov.u64 %r31,%ar2; mov.u64 %r32,%ar3; mov.u64 %r33,%ar4; mov.u32 %r34,%ar5; cvt.s64.s32 %r35,%r34; shl.b64 %r24,%r35,2; setp.eq.u64 %r36,%r33,0; @ %r36 bra $L145; ld.u32 %r28,[%r33]; bra $L144; $L145: mov.u32 %r28,1; $L144: ld.u64 %r40,[%r32]call cshift0); }BEGIN FUNCTION DECL: eoshift0 .func eoshift0); .file 1 "../../../libgfortran/intrinsics/eoshift0.c" // BEGIN GLOBAL FUNCTION DECL: _gfortran_eoshift0_1 .visible .func _gfortran_eoshift0_1_gfortran_eoshift0_1_char .visible .func _gfortran_eoshift0_1_char_gfortran_eoshift0_2 .visible .func _gfortran_eoshift0_2_gfortran_eoshift0_2_char .visible .func _gfortran_eoshift0_2_char_gfortran_eoshift0_4 .visible .func _gfortran_eoshift0_4_gfortran_eoshift0_4_char .visible .func 
_gfortran_eoshift0_4_char_gfortran_eoshift0_8 .visible .func _gfortran_eoshift0_8_gfortran_eoshift0_8_char .visible .func _gfortran_eoshift0_8_char_gfortran_eoshift0_16 .visible .func _gfortran_eoshift0_16_gfortran_eoshift0_16_char .visible .func _gfortran_eoshift0_16_char_gfortran_eoshift0_16_char4 .visible .func _gfortran_eoshift0_16_char4VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_size0_gfortran_size0_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortrani_bounds_equal_extents .extern .func _gfortrani_bounds_equal_extentsVAR DEF: space$0 .const .align 4 .u32 space$0[1] = {32 }; // BEGIN VAR DEF: space$1 .const .align 4 .u32 space$1[1] = {32 }; // BEGIN VAR DEF: space$2 .const .align 4 .u32 space$2[1] = {32 }; // BEGIN VAR DEF: space$3 .const .align 4 .u32 space$3[1] = {32 }; // BEGIN VAR DEF: space$4 .const .align 4 .u32 space$4[1] = {32 }8] = {69,79,83,72,73,70,8414,101,116,117,114,110,32,118,97,108,117,1012] = {2] = {32,0 }; // BEGIN FUNCTION DEF: eoshift0 .func eoshift0480predpred %r259; .reg .u32u32 %r266; .reg .u64u64u64 %r280; .reg .u64 %r281; .reg .u64 %r282; .reg .u64u32u64 %r301; .reg .u64predpredpred %r375; .reg .u32u64 %r391; .reg .predpred %r445; .reg .pred %r446; .reg .pred %r447; .reg .u64 %r448; .reg .pred %r449; .reg .pred %r450; .reg .u64 %r452; .reg .predu64 %r460; .reg .u64 %r461; .reg .u64 %r462; mov.u64 %r203,%ar0; mov.u64 %r204,%ar1; mov.u64 %r205,%ar2; mov.u64 %r206,%ar3; mov.u32 %r207,%ar4; mov.u64 %r208,%ar5; mov.u64 %r209,%ar6; mov.u64 %r21004; call (%value_in),_gfortran_size0,(%out_arg1); ld.param.u64 %r212,[%value_in]; } .loc 1 64 6 ld.u64 %r213,[%r203].loc 1 68 19 st.u64 [%r203+8],%r213; .loc 1 69 7 ld.u64 %r216,[%r204+16]; st.u64 [%r203+16],%r216; ld.u64 %r217,[%r204+24]; st.u64 [%r203+24],%r217; .loc 1 70 23 ld.s8 %r167,[%r204+28]; .loc 1 70 7 cvt.u16.u32 %r218,%r167; setp.gt.s16 %r219,%r218,0; @ %r219 bra $L3; $L6: .loc 1 87 2421208; call (%value_in),_gfortrani_xmallocarray222,[%value_in]; } .loc 1 87 22 st.u64 [%r203],%r222; bra $L4; $L5: .loc 1 74 16 ld.u64 %r225,[%r37+8]; ld.u64 %r227,[%r37]; .loc 1 79 17 mad.lo.u64 %r117,%r124,%r117,%r117; add.u64 %r37,%r37,24; add.u64 %r126,%r126,24; .loc 1 74 14 sub.u64 %r124,%r225,%r227; $L52: add.u64 %r68,%r203,%r126; .loc 1 82 4 st.u64 [%r68+48],%r229; st.u64 [%r68+56],%r124; st.u64 [%r68+40],%r117; .loc 1 70 7 setp.ne.u64 %r230,%r444,%r126; @ %r230 bra $L5; bra $L6; $L2: .loc 1 89 12 cvta.global.u64 %r231,_gfortrani_compile_options; .loc 1 89 11 ld.u32 %r232,[%r231+36]; setp.eq.u32 %r233,%r232,0; @ %r233 bra $L4; .loc 1 91 7 cvta.const.u64 %r237,$LC0;r20423637; call _gfortrani_bounds_equal_extents$L4: .loc 1 95 6 setp.ne.u64 %r238,%r212,0; @ ! %r238 bra $L1; .loc 1 98 9 add.u32 %r132,%r207,-1; .loc 1 100 13 mov.u64 %r239,1; st.u64 [%frame],%r239; .loc 1 101 12 mov.u64 %r240,0; st.u64 [%frame+120],%r240; .loc 1 102 14 mov.u64 %r241,-1; st.u64 [%frame+240],%r241; .loc 1 103 14 st.u64 [%frame+360],%r241; .loc 1 112 13 ld.s8 %r202,[%r204+28]; .loc 1 112 11 cvt.u32.u32 %r243,%r202; cvt.s64.s8 %r105,%r243; .loc 1 105 6 setp.le.s32 %r244,%r132,0; @ %r244 bra $L8; .loc 1 113 7 setp.le.s64 %r245,%r105,0; @ %r245 bra $L9; mov.u64 %r142,40; .loc 1 110 12 mov.u64 %r136,%r239; .loc 1 109 12 mov.u64 %r135,%r136; .loc 1 113 14 mov.u64 %r137,%r240; $L13: add.u64 %r187,%r203,%r142; .loc 1 117 7 ld.u64 %r246,[%r187]; setp.ne.u64 %r247,%r246,%r135; @ ! 
[omitted: mangled PTX assembly: body of the eoshift0 worker (shift normalization, per-dimension extent/stride loops, boundary fill) followed by the typed entry points _gfortran_eoshift0_1 through _gfortran_eoshift0_8_char4, each of which marshals its arguments, defaults the optional dimension pointer to 1, and calls eoshift0.]
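The eoshift0 worker and its wrappers correspond to the Fortran EOSHIFT intrinsic: an end-off shift that discards the elements shifted out and fills the vacated positions with a boundary value (the space$N constants above are the blank fill used for character arrays). Below is a minimal one-dimensional sketch of the semantics only, with assumed names (eoshift1) and an arbitrary element type; it is not the library's multi-dimensional implementation.

    #include <stddef.h>

    /* Hypothetical 1-D illustration of EOSHIFT semantics: shift by
       `shift` positions, filling vacated slots with `boundary`. */
    static void eoshift1(double *dst, const double *src, ptrdiff_t n,
                         ptrdiff_t shift, double boundary)
    {
        for (ptrdiff_t i = 0; i < n; i++) {
            ptrdiff_t j = i + shift;
            dst[i] = (j >= 0 && j < n) ? src[j] : boundary;
        }
    }

Under this definition EOSHIFT([1,2,3,4,5], 2, 0) gives [3,4,5,0,0].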
[omitted: mangled PTX assembly: remaining eoshift0 entry points (_gfortran_eoshift0_16, _16_char, _16_char4).]
[omitted: mangled PTX assembly for eoshift2.o (libgfortran/intrinsics/eoshift2.c): a second EOSHIFT code path with the same extern references (_gfortrani_compile_options, _gfortran_size0, _gfortrani_xmallocarray, _gfortrani_bounds_equal_extents) and blank-fill constants; the internal eoshift2 worker followed by the visible entry points _gfortran_eoshift2_1 through _gfortran_eoshift2_4_char4.]
[omitted: mangled PTX assembly: remaining eoshift2 entry points (_gfortran_eoshift2_8 through _gfortran_eoshift2_16_char4).]
[omitted: mangled PTX assembly for erfc_scaled.o: _gfortran_erfc_scaled_r4 and _gfortran_erfc_scaled_r8 (single- and double-precision ERFC_SCALED), built on the constant coefficient tables c$7, c$18, d$6 and d$17, range-dependent rational approximations, and extern calls to expf/exp.]
[omitted: mangled PTX assembly for extends_type_of.o: _gfortran_is_extension_of, which compares type hashes while walking a parent-type chain; fnum.o: _gfortran_fnum_i4/_i8, forwarding to _gfortrani_unit_to_fd; ierrno.o: _gfortran_ierrno_i4/_i8, returning the value stored at __errno(); ishftc.o: declarations of _gfortran_ishftc4, _gfortran_ishftc8 and _gfortran_ishftc16.]
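The two erfc_scaled entry points compute the scaled complementary error function, ERFC_SCALED(x) = exp(x*x) * erfc(x). The sketch below states only that definition; it is not the library's method, and the assumed name erfc_scaled_naive is illustrative. The PTX uses the piecewise rational approximations held in the coefficient tables precisely because the naive product overflows for moderate x.

    #include <math.h>

    /* Definition-only sketch of ERFC_SCALED.  Usable for small |x|;
       exp(x*x) overflows where the real implementation switches to
       rational approximations. */
    static double erfc_scaled_naive(double x)
    {
        return exp(x * x) * erfc(x);
    }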
[omitted: mangled PTX assembly for ishftc.o: bodies of _gfortran_ishftc4, _gfortran_ishftc8 and _gfortran_ishftc16, circular shifts of the low `size` bits of a 32-, 64- or 128-bit integer (the 128-bit variant synthesized from paired 64-bit operations).]
[omitted: mangled PTX assembly for is_contiguous.o: _gfortran_is_contiguous0, which walks the array descriptor and checks each dimension's stride against the running product of the extents; mvbits.o: declarations of _gfortran_mvbits_i1/_i2/_i4/_i8/_i16 and the bodies for kinds 1 and 2, breaking off at the start of kind 4.]
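The _gfortran_ishftc4 body is readable enough to recover the ISHFTC idea: circularly shift the rightmost `size` bits of the argument and leave the higher bits untouched. The following is a hedged 32-bit reconstruction in C; the name ishftc4_sketch and the 0 < size <= 32, |shift| <= size preconditions are assumptions, not statements about the library's argument checking.

    #include <stdint.h>

    /* Sketch of ISHFTC for 32-bit integers: rotate the low `size` bits
       of i left by `shift` (negative shift rotates right); bits above
       `size` are preserved.  Assumes 0 < size <= 32, |shift| <= size. */
    static int32_t ishftc4_sketch(int32_t i, int32_t shift, int32_t size)
    {
        if (shift < 0)
            shift += size;                   /* right rotation */
        if (shift == 0 || shift == size)
            return i;                        /* nothing to rotate */
        uint32_t mask  = (size == 32) ? ~0u : ~(~0u << size);
        uint32_t field = (uint32_t)i & mask;
        uint32_t rot   = ((field << shift) | (field >> (size - shift))) & mask;
        return (int32_t)(rot | ((uint32_t)i & ~mask));
    }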
[omitted: mangled PTX assembly for mvbits.o: remainder of _gfortran_mvbits_i4 plus _gfortran_mvbits_i8 and _gfortran_mvbits_i16; each builds a LEN-bit mask, extracts the field at FROMPOS from FROM and merges it into TO at TOPOS, with the 128-bit variant again composed from paired 64-bit operations.]
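The mvbits bodies all follow the same pattern, visible even through the mangling: mask, extract, merge. A hedged reconstruction for kind 4 follows; the helper name mvbits_i4_sketch and the use of plain C integer types are assumptions.

    #include <stdint.h>

    /* Sketch of MVBITS for 32-bit integers: copy `len` bits starting at
       bit `frompos` of `from` into *to starting at bit `topos`, leaving
       the other bits of *to unchanged.  Assumes 0 <= len <= 32 and that
       both bit fields fit within 32 bits. */
    static void mvbits_i4_sketch(int32_t from, int frompos, int len,
                                 int32_t *to, int topos)
    {
        uint32_t mask  = (len == 32) ? ~0u : ((1u << len) - 1u);
        uint32_t field = ((uint32_t)from >> frompos) & mask;
        *to = (int32_t)(((uint32_t)*to & ~(mask << topos))
                        | (field << topos));
    }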
move_alloc.o/gfortran_move_alloc .visible .func _gfortran_move_allocfortran/intrinsics/move__gfortran_move_alloc_c .visible .func _gfortran_move_alloc_c, F: _gfortran_move_alloc .visible .func _gfortran_move_allocmov.u64 %r38,%ar0; mov.u64 %r39,%ar1; .loc 1 37 3 ld.u64 %r40,[%r39];.loc 1 39 19 ld.s8 %r30,[%r38+28]; .loc 1 39 3 cvt.u16.u32 %r41,%r30; setp.le.s16 %r42,%r41,0; @ %r42 bra $L2; add.u32 %r43,%r30,-1; cvt.u64.u32 %r44,%r43; add.u64 %r46,%r44,%r44; add.u64 %r47,%r46,%r44; shl.b64 %r48,%r47,3; add.u64 %r37,%r48,64; mov.u64 %r27,40; .loc 1 44 7 mov.u64 %r50,0; $L3: add.u64 %r31,%r38,%r27; .loc 1 41 7 ld.u64 %r23,[%r31+8]; add.u64 %r32,%r39,%r27; st.u64 [%r32+8],%r23; ld.u64 %r24,[%r31+16]; st.u64 [%r32+16],%r24; ld.u64 %r25,[%r31]; st.u64 [%r32],%r25; .loc 1 44 7 ld.u64 %r49,[%r31+8]; st.u64 [%r31+16],%r49; st.u64 [%r31],%r50; .loc 1 39 3 add.u64 %r27,%r27,24; setp.ne.u64 %r51,%r27,%r37; @ %r51 bra $L3; $L2: .loc 1 48 20 ld.u64 %r28,[%r38+8]; .loc 1 48 14 st.u64 [%r39+8],%r28; .loc 1 49 3 ld.u64 %r52,[%r38+16]; st.u64 [%r39+16],%r52; ld.u64 %r53,[%r38+24]; st.u64 [%r39+24],%r53; .loc 1 50 23 ld.u64 %r29,[%r38]; .loc 1 50 17 st.u64 [%r39],%r29; .loc 1 51 19 mov.u64 %r54,0; st.u64 [%r38],%r54; .loc 1 5_gfortran_move_alloc_c .visible .func _gfortran_move_alloc_c_gfortran_move_alloc,(%out_arg1,%out_arg2); } .loc 1 65 1 ret; } pack_generic.o/ 1622802216BEGIN FUNCTION DECL: pack_internal .func pack_internal.file 1 "../../../libgfortran/intrinsics/pack_generic.c" // BEGIN FUNCTION DECL: pack_s_internal .func pack_s_internal_gfortran_pack .visible .func _gfortran_packgfortran_pack_char .visible .func _gfortran_pack_char_gfortran_pack_char4 .visible .func _gfortran_pack_char4_gfortran_pack_s .visible .func _gfortran_pack_sgfortran_pack_s_char .visible .func _gfortran_pack_s_char_gfortran_pack_s_char4 .visible .func _gfortran_pack_s_char4); // BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_option_gfortran_runtime_error .extern .func _gfortran_runtime__gfortrani_count_0_gfortrani_count_0_gfortrani_xmallocarr_gfortrani_xmallocarray_gfortrani_pack_i1 .extern .func _gfortrani_pack_i1gfortrani_pack_i2 .extern .func _gfortrani_pack_i2gfortrani_pack_i4 .extern .func _gfortrani_pack_i4gfortrani_pack_i8 .extern .func _gfortrani_pack_i8gfortrani_pack_i16 .extern .func _gfortrani_pack_i16gfortrani_pack_r4 .extern .func _gfortrani_pack_r4gfortrani_pack_r8 .extern .func _gfortrani_pack_r8gfortrani_pack_c4 .extern .func _gfortrani_pack_c4gfortrani_pack_c8 .extern .func _gfortrani_pack_c810,99,111,114,114,101,99,116,32,101,120,116,101,110,116,32,105,110,32,114,101,116,117,114,110,32,118,97,108,117,101,32,111,102,32,80,65,67,75,32,105,110,116,114,105,110,115,105,99,59,32,105,115,32,37,108,100,44,32,115,104,111,117,108,100,32,98,101,32,37,108,100,0 }110,121,32,115,105,122,101,100,32,108,111,103,105,99,97,108,32,97,114,114,97,121,0 }; // BEGIN FUNCTION DEF: pack_internal .func pack_internal {480pred %r155; .reg .pred %r156; .reg .pred %r157; .reg .predpredu64 %r198; .reg .predpredpredpred %r258; .reg .u64u64 %r284; .reg .u64 %r285; mov.u64 %r135,%ar0; mov.u64 %r136,%ar1; mov.u64 %r137,%ar2; mov.u64 %r138,%ar3; mov.u64 %r139,%ar4; .loc 1 94 9 ld.s8 %r22,[%r136+28]; .loc 1 94 7 cvt.u32.u32 %r140,%r22; cvt.s64.s8 %r93,%r140; .loc 1 96 8 ld.u64 %r107,[%r136]; .loc 1 97 8 ld.u64 %r45,[%r137]; .loc 1 102 15 ld.u64 %r23,[%r137+16]; .loc 1 102 13 cvt.u32.u64 %r94,%r23; .loc 1 104 53 add.u32 %r141,%r94,-4; and.b32 %r142,%r141,-5; set.u32.eq.u32 %r144,%r142,0; neg.s32 
pack_generic.o  (source: ../../../libgfortran/intrinsics/pack_generic.c, member timestamp 1622802216)
  Internal: pack_internal, pack_s_internal.
  Visible:  _gfortran_pack, _gfortran_pack_char, _gfortran_pack_char4,
            _gfortran_pack_s, _gfortran_pack_s_char, _gfortran_pack_s_char4.
  Extern:   _gfortrani_compile_options, _gfortran_runtime_error,
            _gfortrani_count_0, _gfortrani_xmallocarray,
            _gfortrani_pack_{i1,i2,i4,i8,i16,r4,r8,c4,c8}.
  Strings:  "...correct extent in return value of PACK intrinsic; is %ld,
            should be %ld" and "...ny sized logical array".
  pack_internal rejects mask element sizes other than 1, 2, 4, 8 or 16 bytes,
  builds per-dimension count and stride tables in a local frame, allocates the
  result through _gfortrani_xmallocarray when the return descriptor is empty
  (sizing it from VECTOR, or from _gfortrani_count_0 over the mask when VECTOR
  is absent), raises _gfortran_runtime_error on an extent mismatch, copies
  every element whose mask entry is set, and finally pads the result from
  VECTOR.  pack_s_internal handles a scalar mask: when it is true the whole
  source is copied through, otherwise the result is taken entirely from
  VECTOR.  _gfortran_pack dispatches on the element size and type code taken
  from the descriptor, routing suitably aligned integer, real and complex
  cases to _gfortrani_pack_{i1,...,c8} and everything else to pack_internal;
  the _char, _char4, _s, _s_char and _s_char4 entry points are thin wrappers
  that scale the character length (by 4 for the char4 variants) before calling
  the internal routines.
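For readers unfamiliar with the intrinsic, a minimal C model of what PACK computes for contiguous rank-1 data is sketched below. The name pack1 and its parameters are invented for the example; the real pack_internal generalizes this to any rank, element size and stride, working through array descriptors.

#include <stddef.h>
#include <stdbool.h>

/* out must have room for max(number of true mask entries, vlen) elements. */
static size_t pack1(const double *array, const bool *mask, size_t n,
                    const double *vector, size_t vlen, double *out)
{
    size_t k = 0;
    for (size_t i = 0; i < n; i++)      /* keep elements where the mask is true */
        if (mask[i])
            out[k++] = array[i];
    if (vector)                         /* pad from VECTOR when it is longer    */
        for (; k < vlen; k++)
            out[k] = vector[k];
    return k;                           /* number of elements stored            */
}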
selected_char_kind  (source: ...fortran/intrinsics/selected_char_kind.c; member name truncated)
  Visible: _gfortran_selected_char_kind.  Extern: _gfortrani_fstrlen, strncasecmp.
  The body measures the NAME argument with _gfortrani_fstrlen and compares it
  case-insensitively (strncasecmp) against three constants, of which "asci..."
  and "...o_10646" are recoverable here: a matching 5-character name selects
  kind 1, a matching 7-character name selects kind 1, a matching 9-character
  name selects kind 4, and anything else returns -1.

size.o  (source: ...fortran/intrinsics/siz..., truncated)
  Visible: _gfortran_size0, _gfortran_size1.
  size0 walks every dimension triplet of the descriptor and multiplies
  max(ubound - lbound + 1, 0) into the result; size1 returns the same clamped
  extent for the single requested dimension.
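The size0 computation is simple enough to restate in C. The sketch below uses the same toy descriptor as the MOVE_ALLOC sketch earlier and is, again, only an illustration of the arithmetic, not the library's code.

#include <stddef.h>

struct dim  { ptrdiff_t stride, lbound, ubound; };          /* toy descriptor */
struct desc { void *base; size_t elem_len; int rank; struct dim dim[15]; };

/* SIZE(a): product of the clamped extents of every dimension. */
static ptrdiff_t size0_sketch(const struct desc *a)
{
    ptrdiff_t n = 1;
    for (int d = 0; d < a->rank; d++) {
        ptrdiff_t extent = a->dim[d].ubound - a->dim[d].lbound + 1;
        n *= extent > 0 ? extent : 0;   /* an empty dimension contributes 0 */
    }
    return n;
}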
spread_generic  (source: ...fortran/intrinsics/spread_generic.c; member name truncated)
  Internal: spread_internal, spread_internal_scalar$isra$0.
  Visible:  _gfortran_spread, _gfortran_spread_char, _gfortran_spread_char4,
            _gfortran_spread_scalar, _gfortran_spread_char_scalar,
            _gfortran_spread_char4_scalar.
  Extern:   _gfortrani_compile_options, _gfortran_runtime_error,
            _gfortrani_xmallocarray,
            _gfortrani_spread_{i1,i2,i4,i8,i16,r4,r8,c4,c8},
            _gfortrani_spread_scalar_{i1,i2,i4,i8,i16,r4,r8,c4,c8}.
  Strings:  "return rank too large in ...", "rank mismatch in spread()",
            "Incorrect extent in return value of SPREAD intrinsic in
            dimension %ld: is %ld, should be %l...",
            "incorrect destination rank in spread()",
            "dim outside of rank in spread()",
            "return array missing descriptor in spread()",
            "dim too large in spread()".
  spread_internal checks that the result rank (source rank plus one) stays
  within the 15-dimension limit and that DIM is in range, builds the result
  descriptor and allocates it with _gfortrani_xmallocarray when the caller
  passed an unallocated return array, cross-checks every extent against the
  source (reporting mismatches through _gfortran_runtime_error), and then
  copies each source element NCOPIES times along the new dimension.
  spread_internal_scalar$isra$0 fills an NCOPIES-element result from a single
  scalar value after validating the destination descriptor.  _gfortran_spread
  dispatches on element size and type code in the same way as _gfortran_pack,
  handing aligned cases to the _gfortrani_spread_* helpers (or the
  _gfortrani_spread_scalar_* helpers for the scalar entry points) and falling
  back to the internal routines; the *_char and *_char4 entry points scale the
  character length before forwarding.
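As with PACK, a small C model may help. The function spread1 below shows the effect of SPREAD for a contiguous rank-1 source producing a rank-2 result; all names are invented for the example, and spread_internal itself works on descriptors of any rank and element size.

#include <stddef.h>

/* SPREAD(src, dim, ncopies) for a rank-1 source, writing a rank-2 result
   into a contiguous buffer of n * ncopies elements. */
static void spread1(const double *src, size_t n, size_t ncopies,
                    int dim /* 1 or 2 */, double *out)
{
    for (size_t i = 0; i < n; i++)
        for (size_t c = 0; c < ncopies; c++)
            if (dim == 1)
                out[i * ncopies + c] = src[i];  /* copies of src[i] sit side by side  */
            else
                out[c * n + i] = src[i];        /* each copy is a full replica of src */
}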
_gfortran_string_trim_char4gfortran_string_len_trim_char4gfortran_string_len_trim_char); // BEGIN GLOBAL FUNCTION DECL: _gfortran_string_index_char4gfortran_string_index_char_gfortran_adjustl_char4 .visible .func _gfortran_adjustl_char4_gfortran_adjustr_char4 .visible .func _gfortran_adjustr_char4_gfortran_string_scan_char4gfortran_string_scan_char_gfortran_string_verify_char4gfortran_string_verify_char_gfortran_string_minmax_char4 .visible .func _gfortran_string_minmax_ch); // BEGIN GLOBAL FUNCTION DECL: memcmpmem_gfortrani_xmallocarr_gfortrani_xmallocarray; // BEGIN GLOBAL FUNCTION DECL: _gfortran_runtime_error .extern .func _gfortran_runtime_VAR DEF: zero_length_string .global .align 1 .u8 zero_length_string[1]; // BEGIN VAR DEF: zero_length_string_char4 .global .align 4 .u32 zero_length_string_char4[1],105,114,115,116,32,97,114,103,117,109,101,110,116,32,111,102,32,39,37,115,39,32,105,110,116,114,105,110,115,105,99,32,115,104,111,117,108,100,32,98,101,32,112,114,101,115,101,110,116,04] = {77,65,884] = {77,73,7883,101,99,111,110,100,32,97,114,103,117,109,101,110,116,32,111,102,32,39,37,115,39,32,105,110,116,114,105,110,115,105,99,32,115,104,111,117,108,100,32,98,101,32,112,114,101,115,101,110,116,0 }; // BEGIN FUNCTION DEF: _gfortran_string_len_trim$pa_gfortran_string_len_trim$pa16u64u64 %r59; .reg .u16u32 %r66; .reg .pred %r67; mov.u64 %r39,%ar0; mov.u64 %r40,%ar1; .loc 1 193 10 add.u64 %r38,%r39,-1; .loc 1 198 6 setp.gt.u64 %r41,%r38,7; @ %r41 bra $L2; $L13: .loc 1 240 11 add.u64 %r42,%r40,%r38; .loc 1 240 9 ld.s8 %r44,[%r42]; cvt.u16.u32 %r43,%r44; setp.eq.u16 %r45,%r43,32; @ %r45 bra $L3; .loc 1 246 12 add.u64 %r36,%r38,1; bra $L4; $L3: .loc 1 242 10 setp.eq.u64 %r46,%r38,0; @ ! %r46 bra $L14; bra $L1; $L2: .loc 1 210 12 add.u64 %r47,%r40,%r39; .loc 1 206 16 and.b64 %r24,%r47,7; .loc 1 211 9 sub.u64 %r25,%r38,%r24; .loc 1 212 7 setp.eq.u64 %r48,%r24,0; @ %r48 bra $L7; $L9: .loc 1 213 7 add.u64 %r49,%r40,%r38; .loc 1 213 5 ld.s8 %r51,[%r49]; cvt.u16.u32 %r50,%r51; setp.eq.u16 %r52,%r50,32; @ %r52 bra $L8; .loc 1 214 24 add.u64 %r38,%r38,1; bra $L1; $L8: .loc 1 212 7 add.u64 %r38,%r38,-1; setp.ne.u64 %r53,%r25,%r38; @ %r53 bra $L9; bra $L7; $L12: .loc 1 230 6 add.u64 %r29,%r25,-8; .loc 1 231 8 add.u64 %r54,%r40,%r29; .loc 1 231 7 ld.u64 %r55,[%r54+1]; setp.ne.u64 %r56,%r55,2314885530818453536; @ %r56 bra $L10; .loc 1 230 6 mov.u64 %r25,%r29; bra $L11; $L7: and.b64 %r38,%r25,7; $L11: .loc 1 228 13 setp.ne.u64 %r57,%r25,%r38; @ %r57 bra $L12; bra $L13; $L15: .loc 1 242 10 setp.eq.u64 %r58,%r38,0; @ ! 
%r58 bra $L14; bra $L1; $L16: .loc 1 240 9 mov.u64 %r38,%r25; $L14: mov.u64 %r36,%r38; .loc 1 244 7 add.u64 %r38,%r38,-1; .loc 1 240 11 add.u64 %r59,%r40,%r38; .loc 1 240 9 ld.s8 %r61,[%r59]; cvt.u16.u32 %r60,%r61; setp.eq.u16 %r62,%r60,32; @ %r62 bra $L15; $L4: .loc 1 246 12 mov.u64 %r38,%r36; bra $L1; $L10: .loc 1 240 11 add.u64 %r64,%r40,%r25; .loc 1 240 9 ld.s8 %r66,[%r64]; cvt.u16.u32 %r65,%r66; setp.eq.u16 %r67,%r65,32; @ %r67 bra $L16; .loc 1 246 12 add.u64 %r36,%r25,1; bra $L4;gfortrani_memcmp_char4gfortrani_memcmp_char4pred %r29; .reg .pred %r30; .reg .pred %r31; .reg .pred.loc 2 60 13 add.u64 %r24,%r28,-1; .loc 2 60 9 setp.eq.u64 %r29,%r28,0; @ %r29 bra $L31; $L30: .loc 2 62 11 ld.u32 %r22,[%r26]; .loc 2 62 18 ld.u32 %r23,[%r27]; .loc 2 62 10 setp.eq.u32 %r30,%r22,%r23; @ %r30 bra $L29; .loc 2 63 24 setp.lt.u32 %r31,%r22,%r23; selp.u32 %r25,-1,1,%r31; bra $L27; $L29: .loc 2 64 10 add.u64 %r26,%r26,4; .loc 2 65 10 add.u64 %r27,%r27,4; .loc 2 60 13 add.u64 %r24,%r24,-1; .loc 2 60 9 setp.ne.u64 %r32,%r24,-1; @ %r32 bra $L30; .loc 2 67 10 mov.u32 %r25,0; bra $L27; $L31: cvt.u32.u64 %r25,%r28; $L27: .loc 2 68_gfortran_compare_gfortran_compare_stringu16 %r48; .reg .predmov.u64 %r31,%ar1; mov.u64 %r32,%ar2; mov.u64 %r33,%ar3; .loc 1 91 6 or.b64 %r34,%r31,%r33;40; .loc 1 93 6 setp.eq.u64 %r36,%r31,0; @ %r36 bra $L41; .loc 1 95 6 setp.eq.u64 %r37,%r33,0; @ %r37 bra $L42; .loc 1 98 9 min.u64 %r42,%r32,%r30;(%value_in),memcmp,(%out_arg1,mov.u32 %r29,%r43; .loc 1 99 6 setp.ne.u32 %r45,%r29,0; @ %r45 bra $L34; .loc 1 102 6 setp.eq.u64 %r46,%r32,%r30; @ %r46 bra $L34; .loc 1 105 6 setp.le.u64 %r47,%r32,%r30; @ %r47 bra $L36; .loc 1 107 11 sub.u64 %r24,%r32,%r30; .loc 1 108 9 add.u64 %r27,%r33,%r30; .loc 1 109 11 mov.u32 %r26,-1; bra $L37; $L36: .loc 1 113 11 sub.u64 %r24,%r30,%r32; .loc 1 114 9 add.u64 %r27,%r31,%r32; .loc 1 115 11 mov.u32 %r26,1; bra $L37; $L39: .loc 1 120 10 ld.u8 %r48,[%r27]; setp.eq.u16 %r49,%r48,32; @ %r49 bra $L38; .loc 1 122 14 setp.gt.u16 %r51,%r48,32; @ %r51 bra $L43; .loc 1 125 20 neg.s32 %r29,%r26; bra $L34; $L38: .loc 1 127 8 add.u64 %r27,%r27,1; .loc 1 118 9 setp.ne.u64 %r52,%r22,%r27; @ %r52 bra $L39; bra $L34; $L37: add.u64 %r22,%r27,%r24; bra $L39; $L40: .loc 1 92 12 cvt.u32.u64 %r29,%r34; bra $L34; $L41: .loc 1 94 12 mov.u32 %r29,-1; bra $L34; $L42: .loc 1 96 12 mov.u32 %r29,1; bra $L34; $L43: .loc 1 123 20 mov.u32 %r29,%r26; $L34_gfortran_concat_string .visible .func _gfortran_conc, .param .u64 %in_ar5) {mov.u64 %r31,%ar5; .loc 1 142 6 setp.lt.u64 %r32,%r28,%r26; @ %r32 bra $L46; .loc 1 144 7145 7 bra $L45; $L46: .loc 1 147 3148 8 add.u64 %r24,%r27,%r28; .loc 1 149 11 sub.u64 %r25,%r26,%r28; .loc 1 151 6 setp.gt.u64 %r49,%r25,%r30; @ %r49 bra $L48; .loc 1 153 7.loc 1 154 7 bra $L45; $L48: .loc 1 157 3$L45_gfortran_string_trim .visible .func _gfortran_string_trsetp.ne.u64 %r29,%r27,0; @ %r29 bra $L50; .loc 1 168 8 st.u64 [%r25],%r27; bra $L51; $L50:r28; call (%value_in),_gfortran_string_len_trim$part$0st.u64 [%r25],%r33; .loc 1 170 652; $L51: .loc 1 171 11 cvta.global.u64 %r35,zero_length_string; st.u64 [%r26],%r35; bra $L49; $L52: .loc 1 175 15 mov.u64 r37; call (%value_in),_gfortrani_xmallocarray.loc 1 175 13 st.u64 [%r26],%r38; .loc 1 178 7 ld.u64 %r42,[%r25]$L49_gfortran_string_len_trgfortran_string_len_tr88 6 setp.eq.u64 %r25,%r23,0; @ %r25 bra $L56;gfortran_string_len_trim$part$0bra $L54; $L56: .loc 1 189 12 mov.u64 %r22,%r23; $L54gfortran_string_indexgfortran_string_index16predar2; mov.u64 %r38,%ar3; mov.u32 %r39,%ar4; .loc 1 259 6 setp.ne.u64 
%r40,%r37,0; @ %r40 bra $L58; .loc 1 260 30 setp.eq.u32 %r41,%r39,0; @ %r41 bra $L68; add.u64 %r34,%r35,1; bra $L57; $L58: .loc 1 262 6 setp.gt.u64 %r42,%r37,%r35; @ %r42 bra $L69; .loc 1 265 6 setp.ne.u32 %r43,%r39,0; @ %r43 bra $L60; .loc 1 267 12 add.u64 %r44,%r35,1; sub.u64 %r29,%r44,%r37; .loc 1 269 13 mov.u64 %r31,1; .loc 1 268 13 mov.u64 %r28,0; bra $L61; $L60: .loc 1 274 13 sub.u64 %r28,%r35,%r37; .loc 1 275 13 mov.u64 %r31,-1; .loc 1 273 12 mov.u64 %r29,%r31; bra $L61; $L67: add.u64 %r32,%r36,%r28; $L66: .loc 1 280 14 mov.u64 %r30,0; $L63: .loc 1 282 18 add.u64 %r45,%r32,%r30; .loc 1 282 37 add.u64 %r46,%r38,%r30; .loc 1 282 14 ld.s8 %r48,[%r45]; cvt.u16.u32 %r47,%r48; ld.s8 %r50,[%r46]; cvt.u16.u32 %r49,%r50; setp.ne.u16 %r51,%r47,%r49; @ %r51 bra $L62; .loc 1 280 31 add.u64 %r30,%r30,1; .loc 1 280 7 setp.ne.u64 %r52,%r37,%r30; @ %r52 bra $L63; bra $L64; $L62: .loc 1 285 10 setp.ne.u64 %r53,%r37,%r30; @ %r53 bra $L65; $L64: .loc 1 286 23 add.u64 %r34,%r28,1; bra $L57; $L65: .loc 1 278 30 add.u64 %r28,%r28,%r31; .loc 1 278 3 add.u64 %r32,%r32,%r31; setp.ne.u64 %r54,%r28,%r29; @ %r54 bra $L66; .loc 1 263 12 mov.u64 %r34,0; bra $L57; $L61: .loc 1 278 3 setp.ne.u64 %r55,%r28,%r29; @ %r55 bra $L67; .loc 1 263 12 mov.u64 %r34,0; bra $L57; $L68: .loc 1 260 30 mov.u64 %r34,1; bra $L57; $L69: .loc 1 263 12 mov.u64 %r34,0; $L57: .loc 1 289_gfortran_adjustlmov.u64 %r31,%ar2; .loc 1 301 9 setp.eq.u64 %r32,%r30,0; @ %r32 bra $L72; mov.u64 %r28,%r31; .loc 1 300 5 mov.u64 %r27,0; bra $L74; $L76: .loc 1 302 6 add.u64 %r27,%r27,1; .loc 1 301 9 add.u64 %r28,%r28,1; setp.eq.u64 %r33,%r30,%r27; @ ! %r33 bra $L74; mov.u64 %r55,0; bra $L75; $L74: .loc 1 301 18 ld.s8 %r35,[%r28]; cvt.u16.u32 %r34,%r35; setp.eq.u16 %r36,%r34,32; @ %r36 bra $L76; bra $L84; $L75: .loc 1 307 13 add.u64 %r43,%r29,%r55; .loc 1 307 5 mov.u32 %r4044,[%value_in]; } .loc 1 308 1 bra $L72; $L84: .loc 1 305 33 sub.u64 %r55,%r30,%r27; .loc 1 305 5.loc 1 306 6 setp.ne.u64 %r54,%r27,0; @ %r54 bra $L75; $L72:_gfortran_adjustr .visible .func _gfortran_adjustr.loc 1 319 9 mov.u64 %r26,%r29; bra $L87; $L90: .loc 1 320 6 mov.u64 %r26,%r22; $L87: .loc 1 319 9 setp.eq.u64 %r31,%r26,0; @ %r31 bra $L89; .loc 1 319 22 add.u64 %r22,%r26,-1; add.u64 %r32,%r30,%r22; .loc 1 319 16 ld.s8 %r34,[%r32]; cvt.u16.u32 %r33,%r34; setp.eq.u16 %r35,%r33,32; @ %r35 bra $L90; .loc 1 323 5 sub.u64 %r27,%r29,%r26; bra $L86; $L89: mov.u64 %r27,%r29; $L86: .loc 1 322 6 setp.le.u64 %r36,%r29,%r26; @ %r36 bra $L88; .loc 1 323 5 mov.u32 %r340,[%value_in]; } $L88: .loc 1 324 11 add.u64 %r41,%r28,%r27; .loc 1 324 3_gfortran_string_scangfortran_string_scan16336 12 set.u32.eq.u64 %r43,%r37,0; neg.s32 %r44,%r43; .loc 1 336 27 set.u32.eq.u64 %r46,%r39,0; neg.s32 %r47,%r46; .loc 1 336 17 cvt.u16.u32 %r49,%r44; cvt.u16.u32 %r50,%r47; or.b16 %r48,%r49,%r50; .loc 1 336 6 cvt.u32.u16 %r51,%r48; cvt.u16.u8 %r52,%r51; setp.ne.u16 %r53,%r52,0; @ %r53 bra $L98; .loc 1 339 6 setp.ne.u32 %r54,%r41,0; @ %r54 bra $L93; .loc 1 352 14 mov.u64 %r29,0; bra $L94; $L95: .loc 1 345 29 add.u64 %r55,%r40,%r32; .loc 1 345 11 ld.s8 %r57,[%r55]; cvt.u16.u32 %r56,%r57; setp.eq.u16 %r59,%r56,%r58; @ %r59 bra $L99; .loc 1 343 29 add.u64 %r32,%r32,1; .loc 1 343 4 setp.gt.u64 %r60,%r39,%r32; @ %r60 bra $L95; .loc 1 341 7 setp.eq.u64 %r61,%r36,0; @ %r61 bra $L91; mov.u64 %r37,%r36; $L93: .loc 1 345 15 add.u64 %r36,%r37,-1; add.u64 %r62,%r38,%r36; ld.s8 %r25,[%r62]; .loc 1 343 11 mov.u64 %r32,0; .loc 1 345 11 cvt.u16.u32 %r58,%r25; bra $L95; $L97: .loc 1 356 25 add.u64 %r63,%r40,%r30; .loc 1 356 11 
[PTX assembly payload, flattened by extraction; the instruction stream cannot be restored verbatim. Recoverable content: libgfortran character intrinsics compiled for nvptx, namely _gfortran_string_verify and _gfortran_string_minmax for the default character kind, followed by the UCS-4 (char4) variants _gfortran_compare_string_char4, _gfortran_concat_string_char4, _gfortran_string_trim_char4, _gfortran_string_len_trim_char4, _gfortran_string_index_char4, _gfortran_adjustl_char4, _gfortran_adjustr_char4, _gfortran_string_scan_char4 and _gfortran_string_verify_char4. The comparison, trim and adjust routines all treat the blank character (32) as padding.]
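A minimal C sketch of the blank-padding comparison that the char4 comparison code above implements; the function and parameter names are illustrative, not libgfortran's own.

#include <stddef.h>

/* Compare two UCS-4 strings with Fortran blank-padding semantics:
   the common prefix is compared directly, then the longer operand's
   tail is compared against blanks, as in the PTX above. */
int compare_padded(const unsigned *s1, size_t len1,
                   const unsigned *s2, size_t len2)
{
    size_t n = len1 < len2 ? len1 : len2;
    for (size_t i = 0; i < n; i++)
        if (s1[i] != s2[i])
            return s1[i] < s2[i] ? -1 : 1;
    if (len1 == len2)
        return 0;
    const unsigned *tail = (len1 > len2 ? s1 : s2) + n;
    size_t rest = (len1 > len2 ? len1 : len2) - n;
    int sign = len1 > len2 ? 1 : -1;   /* which operand is longer */
    for (size_t i = 0; i < rest; i++)
        if (tail[i] != 32u)
            return tail[i] > 32u ? sign : -sign;
    return 0;
}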
[PTX assembly payload, flattened by extraction. Recoverable content: the tail of _gfortran_string_minmax_char4; the RAND/IRAND/SRAND intrinsics from ../../../libgfortran/intrinsics/rand.c (_gfortran_srand, _gfortran_irand, _gfortran_rand), built on the Park-Miller multiplicative congruential generator with seed = seed * 16807 mod 2147483647, default seed 123459876, and the REAL result scaled by 2**-31; and the opening of ../../../libgfortran/intrinsics/random.c, with declarations for _gfortran_arandom_r4/_r8 and _gfortran_random_seed_i4/_i8, the master_state, rand_state$1, JUMP$0 and xor_keys tables, the RANDOM_SEED error strings, and the first part of init_rand_state.]
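The constants above (multiplier 16807, modulus 2147483647, default seed 123459876, scaling by 2**-31) identify the classic Park-Miller generator behind RAND/IRAND/SRAND. A hedged C sketch with illustrative names, not the literal libgfortran source:

#include <stdint.h>

static uint64_t rand_seed = 123459876u;   /* initial value chosen for
                                             illustration; the dump truncates
                                             the real initializer */

/* _gfortran_srand analogue: a zero argument selects the default seed. */
void srand_sketch(int32_t s)
{
    rand_seed = s ? (uint64_t)s : 123459876u;
}

/* _gfortran_irand analogue: one Park-Miller minimal-standard step. */
uint32_t irand_sketch(void)
{
    rand_seed = (rand_seed * 16807u) % 2147483647u;
    return (uint32_t)rand_seed;
}

/* _gfortran_rand analogue: subtract 1, drop the low bits and scale by
   2**-31, as the PTX above does before cvt.rn.f32.u32. */
float rand_sketch(void)
{
    uint32_t r = (irand_sketch() - 1u) & ~0x1ffu;
    return (float)r * (1.0f / 2147483648.0f);
}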
[PTX assembly payload, flattened by extraction. Recoverable content: the RANDOM_NUMBER/RANDOM_SEED implementation from ../../../libgfortran/intrinsics/random.c. It contains the rest of init_rand_state (per-state seeding through a splitmix64-style mixer using the constants 0x9E3779B97F4A7C15, 0xBF58476D1CE4E5B9 and 0x94D049BB133111EB, then a jump driven by the JUMP$0 polynomial), _gfortran_random_r4 and _gfortran_random_r8 (a four-word 64-bit xorshift/rotate state with shift 17 and rotate 45, output scrambled as rotl(s*5,7)*9), the array fills _gfortran_arandom_r4 and _gfortran_arandom_r8, and _gfortran_random_seed_i4 / _gfortran_random_seed_i8, which get and put the seed through the xor_keys mask. The span ends as the RESHAPE declarations from reshape_generic.c begin.]
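Those constants match a splitmix64 seeder feeding a xoshiro256**-style generator. A hedged C sketch of that structure, assuming the standard published xoshiro256** step; this is an inference from the constants visible in the dump, not a claim about the exact libgfortran code.

#include <stdint.h>

static uint64_t rotl(uint64_t x, int k) { return (x << k) | (x >> (64 - k)); }

/* splitmix64-style mixer: the three constants appear in the PTX above. */
static uint64_t splitmix64(uint64_t *s)
{
    uint64_t z = (*s += 0x9E3779B97F4A7C15ull);
    z = (z ^ (z >> 30)) * 0xBF58476D1CE4E5B9ull;
    z = (z ^ (z >> 27)) * 0x94D049BB133111EBull;
    return z ^ (z >> 31);
}

static uint64_t state[4];

static void seed_state(uint64_t seed)     /* init_rand_state analogue */
{
    for (int i = 0; i < 4; i++)
        state[i] = splitmix64(&seed);
}

static uint64_t next(void)                /* xoshiro256** step */
{
    const uint64_t result = rotl(state[1] * 5, 7) * 9;
    const uint64_t t = state[1] << 17;
    state[2] ^= state[0];
    state[3] ^= state[1];
    state[1] ^= state[2];
    state[0] ^= state[3];
    state[2] ^= t;
    state[3] = rotl(state[3], 45);
    return result;
}

/* REAL(4) conversion as in _gfortran_random_r4: keep the top 24 bits
   of the high word and scale by 2**-32. */
float random_r4_sketch(void)
{
    uint32_t hi = (uint32_t)(next() >> 32) & ~0xffu;
    return (float)hi * (1.0f / 4294967296.0f);
}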
[PTX assembly payload, flattened by extraction. Recoverable content: the RESHAPE intrinsic from ../../../libgfortran/intrinsics/reshape_generic.c, with extern declarations for _gfortrani_compile_options, _gfortran_runtime_error, _gfortrani_xmallocarray and _gfortrani_reshape_packed, runtime error strings (incorrect SOURCE size, bad or out-of-range ORDER value, incorrect return extent, incorrect return rank, SHAPE and TARGET non-conformance), the worker reshape_internal, and the entry points _gfortran_reshape, _gfortran_reshape_char and _gfortran_reshape_char4; then _gfortrani_reshape_packed from ../../../libgfortran/intrinsics/reshape_packed.c; and, from selected_int_kind.f90, the int_infos$0 table and the first part of _gfortran_selected_int_kind.]
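The copy loop of _gfortrani_reshape_packed is still legible above: the packed SOURCE bytes are copied first, then PAD is repeated until the result is full. A minimal C sketch under that reading; the parameter names are illustrative.

#include <stddef.h>
#include <string.h>

/* Fill a packed RESHAPE result: SOURCE first, then repeated PAD.
   The caller must supply a non-empty PAD whenever SOURCE is shorter
   than the result. */
void reshape_packed_sketch(char *ret, size_t rsize,
                           const char *src, size_t ssize,
                           const char *pad, size_t psize)
{
    size_t n = ssize < rsize ? ssize : rsize;
    memcpy(ret, src, n);
    ret += n;
    size_t rest = rsize - n;
    while (rest > 0) {
        n = psize < rest ? psize : rest;
        memcpy(ret, pad, n);
        ret += n;
        rest -= n;
    }
}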
[PTX assembly payload, flattened by extraction. Recoverable content: the rest of _gfortran_selected_int_kind (a linear search of the int_infos$0 table that returns -1 when no kind is wide enough); _gfortran_selected_real_kind2008 and the _gfortran_selected_real_kind wrapper, which test the requested precision, range and radix against the REAL(4)/REAL(8) limits (6 and 15 digits, decimal exponent ranges 37 and 307, radix 2); the declarations for the degree trigonometric functions from ../../../libgfortran/intrinsics/trigd.inc (_gfortran_sind_r4/_r8, _gfortran_cosd_r4/_r8, _gfortran_tand_r4/_r8, with extern references to fmodf, sinf, cosf, tanf, fmod, sin, cos and tan); the tiny$0 and tiny$1 constants; and the start of _gfortran_sind_r4.]
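The int_infos$0 entries decode to (kind, decimal range) pairs (1,2), (2,4), (4,9), (8,18), (16,38), and the loop above returns the first kind whose range covers the request. A hedged C sketch of that lookup; the table values come from the dump, the names are illustrative.

/* SELECTED_INT_KIND: smallest integer kind whose decimal range >= r,
   or -1 if none qualifies. */
static const struct { int kind; int range; } int_infos[] = {
    { 1, 2 }, { 2, 4 }, { 4, 9 }, { 8, 18 }, { 16, 38 }
};

int selected_int_kind_sketch(int r)
{
    for (unsigned i = 0; i < sizeof int_infos / sizeof int_infos[0]; i++)
        if (r <= int_infos[i].range)
            return int_infos[i].kind;
    return -1;
}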
[PTX assembly payload, flattened by extraction; the section breaks off mid-stream here. Recoverable content: the body of _gfortran_sind_r4, in which NaN and infinite arguments return x - x, small arguments (|x| < 0.03125) are multiplied by a two-term split of pi/180, and everything else is reduced with fmodf(|x|, 360) so that exact multiples of 30 degrees yield exact constants (0, +-0.5, +-sqrt(3)/2, +-1) before the remainder is folded and passed to sinf or cosf; followed by _gfortran_cosd_r4 with the same structure and the opening of _gfortran_tand_r4.]
%r47 bra $L107; .loc 1 406 4 mul.f32 %r24,%r43,0fb6395dad; fma.rn.f32 %r42,%r43,0f3c8f0000,%r24; .loc 1 408 11 bra $L85; $L107: .loc 1 396 7 mov.f32 %r48,0f3f800000; copysign.f32 %r42,%r43,%r48; .loc 1 421 7 mov.f32 %r50,0f43b40000; { .param .f32 %value_in; .param .f32 %out_arg1; st.param.f32 [%out_arg1],%r22(%value_in),fmod7 11 cvt.rzi.s32.f32 %r38,%r51; cvt.rn.f32.s32 %r52,%r38; sub.f32 %r53,%r52,%r51; .loc 1 427 10 setp.neu.f32 %r54,%r53,0f00000000; @ %r54 bra $L90; mul.lo.u32 %r57,%r38,-1527099483; .loc 1 427 48 setp.gt.u32 %r58,%r57,95443717; @ %r58 bra $L90; rem.s32 %r60,%r38,180; .loc 1 429 7 setp.ne.u32 %r65,%r60,0; @ %r65 bra $L91; .loc 1 476 7 mul.f32 %r42,%r42,0f00000000; bra $L85; $L91: rem.s32 %r67,%r38,90; .loc 1 437 12 setp.ne.u32 %r72,%r67,0; @ %r72 bra $L93; .loc 1 438 6 setp.ne.u32 %r73,%r38,90; selp.f32 %r41,0fff800000,0f7f800000,%r73; .loc 1 476 7 mul.f32 %r42,%r42,%r41; bra $L85; $L93: .loc 1 443 31 set.u32.ne.u32 %r75,%r38,45; neg.s32 %r76,%r75; .loc 1 443 59 set.u32.ne.u32 %r78,%r38,225; neg.s32 %r79,%r78; .loc 1 443 36 cvt.u16.u32 %r81,%r76; cvt.u16.u32 %r82,%r79; and.b16 %r80,%r81,%r82; .loc 1 443 11@ %r85 bra $L85; neg.f32 %r42,%r42; bra $L85; $L90: .loc 1 451 8 mov.f32 %r86,0f43340000; sub.f32 %r33,%r51,%r86; .loc 1 451 7 setp.le.f32 %r87,%r33,0f00000000; @ ! %r87 bra $L108; .loc 1 453 12 mov.f32 %r89,0f42b40000; sub.f32 %r88,%r51,%r89; .loc 1 453 11 setp.gt.f32 %r90,%r88,0f00000000; @ ! %r90 bra $L109; .loc 1 455 5 sub.f32 %r33,%r86,%r51; .loc 1 456 5 neg.f32 %r42,%r42; bra $L97; $L108: .loc 1 461 12 mov.f32 %r93,0f43870000; sub.f32 %r92,%r51,%r93; .loc 1 461 11 setp.le.f32 %r94,%r92,0f00000000; @ %r94 bra $L97; .loc 1 467 5 mov.f32 %r95,0f43b40000; sub.f32 %r33,%r95,%r51; .loc 1 468 5 neg.f32 %r42,%r42; bra $L97; $L109: .loc 1 421 7 mov.f32 %r33,%r51; $L97: .loc 1 472 4 mul.f32 %r36,%r33,0fb6395dad; fma.rn.f32 %r39,%r33,0f3c8f0000,%r36; .loc 1 473 4 { .param .f32 %value_in; .param .f32 %out_arg1; st.param.f32 [%out_arg1],%r39; call (%value_in),tanf,(%out_arg1); ld.param.f32 %r97,[%value_in]; } .loc 1 476 7 mul.f32 %r42,%r42,%r97; bra $L85; $L86: .loc 1 483 5 sub.f32 %r42,%r43,%r43; $L85gfortran_sind_r8 .visible .func (.param .f64 %value_out) _gfortran_sind_r8 (predpredpred %r60; .reg .u32u32 %r86; .reg .pred %r91; .reg .predpred %r97; .reg .f64 %r98; .reg .f64 %r99; .reg .pred %r100; .reg .f64 %r102; .reg .f64 %r103; .reg .pred %r104; .reg .f64 %r106; .reg .predf64 %r120; mov.f64 %r49,%ar0; .loc 1 95 7 abs.f64 %r22,%r49; .loc 1 95 6 setp.gtu.f64 %r50,%r22,0d7fefffffffffffff; @ %r50 bra $L111; .loc 1 110 11 mov.f64 %r52,0d3ec0000000000000; sub.f64 %r51,%r22,%r52; .loc 1 110 10 setp.lt.f64 %r53,%r51,0d0000000000000000; @ ! 
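The three single-precision entry points above share one shape: reduce the degree argument modulo 360 with fmodf, return exact constants for the angles whose sine, cosine or tangent is exactly representable, and only convert to radians and call the C math library for the remaining angles. The C sketch below is a simplified, hypothetical reconstruction of that flow for the sine case; it is not the libgfortran trigd.inc source, and it omits the tiny-argument fast path and the quadrant folding that the compiled code performs.

/* Hypothetical sketch of the degree-sine reduction compiled above. */
#include <float.h>
#include <math.h>

static const double PI_OVER_180 = 0.017453292519943295;

static float deg_sinf (float x)
{
  float a = fabsf (x);
  if (!(a <= FLT_MAX))            /* NaN or Inf: produce a NaN, as the PTX does */
    return x - x;

  float s = copysignf (1.0f, x);  /* sin(-x) = -sin(x) */
  a = fmodf (a, 360.0f);          /* reduce to [0, 360) */
  int n = (int) a;

  if ((float) n == a && n % 30 == 0)   /* exact multiple of 30 degrees */
    switch (n)
      {
      case 0: case 180:   return s * 0.0f;
      case 90:            return s * 1.0f;
      case 270:           return s * -1.0f;
      case 30: case 150:  return s * 0.5f;
      case 210: case 330: return s * -0.5f;
      case 60: case 120:  return s * 0.86602540f;   /* sqrt(3)/2 */
      default:            return s * -0.86602540f;  /* 240, 300 */
      }

  /* General case: convert the reduced angle to radians and use sinf. */
  return s * sinf (a * (float) PI_OVER_180);
}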
// BEGIN GLOBAL FUNCTION DEF: _gfortran_sind_r8 .visible .func (.param .f64 %value_out) _gfortran_sind_r8 (.param .f64 %in_ar0)
//   double-precision counterpart of _gfortran_sind_r4: NaN/Inf check against 0d7fefffffffffffff, reduction
//   with fmod(|x|, 0d4076800000000000 /* 360.0 */), exact results 0, +/-0.5 (0d3fe0000000000000),
//   +/-sqrt(3)/2 (0d3febb67ae8584caa) and +/-1 for multiples of 30 degrees, and sin/cos on the radian value
//   obtained with the constant pair 0d3f91df4600000000 / 0d3e444a53a722b83b.
// BEGIN GLOBAL FUNCTION DEF: _gfortran_cosd_r8 .visible .func (.param .f64 %value_out) _gfortran_cosd_r8 (.param .f64 %in_ar0)
//   double-precision counterpart of _gfortran_cosd_r4; the near-zero fast path returns 1.0 or 1.0 - tiny$0,
//   and the reduced angle goes through the same exact-constant cases and the sin/cos general path.
// BEGIN GLOBAL FUNCTION DEF: _gfortran_tand_r8 .visible .func (.param .f64 %value_out) _gfortran_tand_r8 (.param .f64 %in_ar0)
//   double-precision counterpart of _gfortran_tand_r4: multiples of 180 degrees give +/-0, odd multiples of
//   90 give +/-Inf (0d7ff0000000000000 / 0dfff0000000000000), odd multiples of 45 give +/-1, and everything
//   else is folded and passed to tan on the radian value.
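In both precisions the degree value that survives reduction is converted to radians with a two-term constant: a leading approximation of pi/180 plus a small correction, combined through a fused multiply-add so the correction is applied before the final rounding. The sketch below only illustrates that conversion; the constants are the bit patterns visible in the double-precision listing (0d3f91df4600000000 and 0d3e444a53a722b83b), and the function name is invented for the example.

#include <math.h>

/* pi/180 split into a leading term plus a correction, so HI + LO carries more
   bits of pi/180 than either constant alone. */
static const double DEG2RAD_HI = 0x1.1df46p-6;           /* ~0.0174532830715 */
static const double DEG2RAD_LO = 0x1.44a53a722b83bp-27;  /* ~9.45e-9 correction */

static double deg_to_rad (double d)
{
  /* fma folds in the correction term without rounding d*HI first. */
  return fma (d, DEG2RAD_HI, d * DEG2RAD_LO);
}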
// BEGIN FUNCTION DECL: unpack_bounds .func unpack_bounds;
// BEGIN FUNCTION DECL: unpack_internal .func unpack_internal;
// (source file: "fortran/intrinsics/unpack_generic.c")
// BEGIN GLOBAL FUNCTION DECL: _gfortran_unpack1 .visible .func _gfortran_unpack1;
// BEGIN GLOBAL FUNCTION DECL: _gfortran_unpack1_char .visible .func _gfortran_unpack1_char;
// BEGIN GLOBAL FUNCTION DECL: _gfortran_unpack1_char4 .visible .func _gfortran_unpack1_char4;
// BEGIN GLOBAL FUNCTION DECL: _gfortran_unpack0 .visible .func _gfortran_unpack0;
// BEGIN GLOBAL FUNCTION DECL: _gfortran_unpack0_char .visible .func _gfortran_unpack0_char;
// BEGIN GLOBAL FUNCTION DECL: _gfortran_unpack0_char4 .visible .func _gfortran_unpack0_char4;
// BEGIN GLOBAL VAR DECL: _gfortrani_compile_options .extern .global .align 8 .u64 _gfortrani_compile_options;
// BEGIN GLOBAL FUNCTION DECL (all .extern .func): _gfortran_size0, _gfortrani_count_0, _gfortran_runtime_error,
//   _gfortrani_bounds_equal_extents, _gfortrani_xmallocarray,
//   _gfortrani_unpack1_{i1,i2,i4,i8,i16,r4,r8,c4,c8}, _gfortrani_unpack0_{i1,i2,i4,i8,i16,r4,r8,c4,c8};
// string constants (stored as byte arrays): __func__$0 "...pack_internal", "UNPACK",
//   "...correct size of return value in UNPACK intrinsic: should be at least %ld, is %ld",
//   "Funny sized logical array", "dim > ...", the source-path fragment "...tran/intrinsics/unpack...",
//   "FIELD", "return value";
// BEGIN FUNCTION DEF: unpack_bounds .func unpack_bounds
//   compares _gfortran_size0 of the vector with _gfortrani_count_0 of the mask and raises
//   _gfortran_runtime_error with the message above when the vector is too small, then checks FIELD and the
//   return value against the mask with _gfortrani_bounds_equal_extents.
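unpack_bounds is the run-time check that the _gfortran_unpack* entry points invoke when bounds checking is enabled. Below is a hedged C sketch of the checks visible above; the descriptor type and the helper prototypes are simplified stand-ins for the libgfortran internals, and the argument order of bounds_equal_extents is illustrative.

#include <stddef.h>

typedef struct descriptor descriptor;        /* opaque array descriptor */

/* Simplified stand-ins for the libgfortran helpers called in the listing. */
extern ptrdiff_t size0 (const descriptor *);          /* total element count        */
extern ptrdiff_t count_0 (const descriptor *);        /* number of .true. mask bits */
extern void runtime_error (const char *, ...);
extern void bounds_equal_extents (descriptor *, descriptor *,
                                  const char *, const char *);

static void
unpack_bounds_sketch (descriptor *ret, descriptor *vector,
                      descriptor *mask, descriptor *field)
{
  ptrdiff_t need = count_0 (mask);           /* elements consumed from VECTOR */
  if (size0 (vector) < need)
    runtime_error ("Incorrect size of return value in UNPACK intrinsic: "
                   "should be at least %ld, is %ld",
                   (long) need, (long) size0 (vector));

  bounds_equal_extents (field, mask, "FIELD", "UNPACK");
  bounds_equal_extents (ret, mask, "return value", "UNPACK");
}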
// BEGIN FUNCTION DEF: unpack_internal .func unpack_internal
//   arguments: return, vector, mask and field descriptors plus the element size; the mask element size is
//   read from the mask descriptor and must be 1, 2, 4, 8 or 16, otherwise _gfortran_runtime_error is raised
//   with "Funny sized logical array"; when the return array has no data pointer yet, its extents and strides
//   are derived from the mask descriptor and storage is obtained with _gfortrani_xmallocarray, otherwise the
//   existing extents are used (empty extents make the routine return early); the main loop then walks the
//   mask in array-element order with a per-dimension counter, copying one element from VECTOR (and advancing
//   the vector pointer) where the mask byte is non-zero and one element from FIELD where it is zero, stepping
//   the return, field and mask pointers by their respective strides.
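Stripped of the descriptor bookkeeping, the element loop of unpack_internal does what the short rank-1 sketch below does: take the next VECTOR element wherever the mask is true and the corresponding FIELD element wherever it is false. The function and parameter names are invented for the example; the compiled code performs the same walk with per-dimension strides and counters.

#include <stddef.h>
#include <string.h>

/* Rank-1 sketch of the UNPACK copy loop (elem_size bytes per element). */
static void
unpack1_sketch (char *ret, const char *vector, const signed char *mask,
                const char *field, size_t n, size_t elem_size)
{
  size_t v = 0;                               /* next unused VECTOR element */
  for (size_t i = 0; i < n; i++)
    {
      if (mask[i])
        memcpy (ret + i * elem_size, vector + v++ * elem_size, elem_size);
      else
        memcpy (ret + i * elem_size, field + i * elem_size, elem_size);
    }
}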
// BEGIN GLOBAL FUNCTION DEF: _gfortran_unpack1 .visible .func _gfortran_unpack1
//   calls unpack_bounds first when bounds checking is enabled in _gfortrani_compile_options; the element type
//   and element size of VECTOR are then folded into a single key (size << 7 | type << 4) and switched on
//   (144, 160, 272, 288, 528, 544, 560, 1040, 1056, 1072, 1088, 2064, 2080, 2112 in the listing) to reach one
//   of the specialised _gfortrani_unpack1_{i1,i2,i4,i8,i16,r4,r8,c4,c8} routines; element sizes 2, 4, 8 and 16
//   whose data pointers are suitably aligned also take the matching integer routine, and everything else goes
//   through unpack_internal.
// BEGIN GLOBAL FUNCTION DEF: _gfortran_unpack1_char .visible .func _gfortran_unpack1_char
//   character variant: runs unpack_bounds when enabled and forwards to unpack_internal with the character
//   length as element size.
// BEGIN GLOBAL FUNCTION DEF: _gfortran_unpack1_char4 .visible .func _gfortran_unpack1_char4
//   as above with the length scaled by 4 for CHARACTER(KIND=4).
// BEGIN GLOBAL FUNCTION DEF: _gfortran_unpack0 .visible .func _gfortran_unpack0
//   scalar-FIELD variant: same bounds check and type/size dispatch to _gfortrani_unpack0_{i1,...,c8}; on the
//   generic path the scalar FIELD pointer is wrapped in a zero-stride descriptor built on the stack before
//   calling unpack_internal.
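The switch constants above all come from folding the descriptor's element type and element size into one integer, which is what lets integer and logical arrays of the same width share one specialised routine. The sketch below reproduces that key with a hypothetical macro and enum; the numeric type codes (integer = 1, logical = 2, real = 3, complex = 4) are inferred from the dispatch targets in the listing, and the returned strings merely name which routine would be picked.

/* Hypothetical reconstruction of the type/size dispatch key used above. */
enum sketch_type { T_INTEGER = 1, T_LOGICAL = 2, T_REAL = 3, T_COMPLEX = 4 };

#define TYPE_SIZE_KEY(type, size) (((size) << 7) | ((type) << 4))

static const char *
unpack1_dispatch_sketch (int type, int elem_size)
{
  switch (TYPE_SIZE_KEY (type, elem_size))
    {
    case TYPE_SIZE_KEY (T_INTEGER, 1):
    case TYPE_SIZE_KEY (T_LOGICAL, 1):   return "_gfortrani_unpack1_i1";  /* keys 144, 160  */
    case TYPE_SIZE_KEY (T_INTEGER, 4):
    case TYPE_SIZE_KEY (T_LOGICAL, 4):   return "_gfortrani_unpack1_i4";  /* keys 528, 544  */
    case TYPE_SIZE_KEY (T_REAL, 4):      return "_gfortrani_unpack1_r4";  /* key 560        */
    case TYPE_SIZE_KEY (T_REAL, 8):      return "_gfortrani_unpack1_r8";  /* key 1072       */
    case TYPE_SIZE_KEY (T_COMPLEX, 8):   return "_gfortrani_unpack1_c4";  /* key 1088       */
    case TYPE_SIZE_KEY (T_COMPLEX, 16):  return "_gfortrani_unpack1_c8";  /* key 2112       */
    default:                             return "unpack_internal";        /* generic path   */
    }
}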
// BEGIN GLOBAL FUNCTION DEF: _gfortran_unpack0_char .visible .func _gfortran_unpack0_char
//   character variant of _gfortran_unpack0: optional unpack_bounds check, zero-stride descriptor around the
//   scalar FIELD, then unpack_internal with the character length as element size.
// BEGIN GLOBAL FUNCTION DEF: _gfortran_unpack0_char4 .visible .func _gfortran_unpack0_char4
//   as above with the length scaled by 4.
// (source file: "fortran/runtime/in_pack_generic.c")
// BEGIN GLOBAL FUNCTION DECL: _gfortran_internal_pack .visible .func _gfortran_internal_pack;
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_internal_pack_{1,2,4,8,16,r4,r8,c4,c8} (all .extern .func);
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_xmallocarray .extern .func _gfortrani_xmallocarray;
// BEGIN GLOBAL FUNCTION DEF: _gfortran_internal_pack .visible .func _gfortran_internal_pack
//   returns NULL immediately when the descriptor's data pointer is NULL; otherwise the (size << 7 | type << 4)
//   key dispatches to _gfortrani_internal_pack_{1,2,4,8,16,r4,r8,c4,c8}, with an alignment-checked integer
//   fallback for element sizes 2, 4, 8 and 16; on the fully generic path the routine tests, dimension by
//   dimension, whether the stride equals the product of the preceding extents and returns the original base
//   pointer if the array is already contiguous; otherwise it allocates a contiguous buffer with
//   _gfortrani_xmallocarray and copies the elements with a counter ("odometer") walk over the extents,
//   returning the new buffer.
// (source file: "fortran/runtime/in_unpack_generic.c")
// BEGIN GLOBAL FUNCTION DECL: _gfortran_internal_unpack .visible .func _gfortran_internal_unpack;
// BEGIN GLOBAL FUNCTION DECL: _gfortrani_internal_unpack_{1,2,4,8,16,r4,r8,c4,c8} (all .extern .func);
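The contiguity test and the packing copy on the generic path are easier to read in C than in the compiled loop above. The sketch below is a simplified stand-in: the descriptor is reduced to one stride/extent pair per dimension, malloc stands in for _gfortrani_xmallocarray, and the rank is assumed to be at most 16.

#include <stddef.h>
#include <stdlib.h>
#include <string.h>

struct dim { ptrdiff_t stride, extent; };     /* strides and extents in elements */

/* Sketch of the generic path of internal_pack: reuse the storage when it is
   already contiguous, otherwise copy into a freshly allocated buffer. */
static void *
internal_pack_sketch (char *base, const struct dim *dims, int rank,
                      size_t elem_size)
{
  ptrdiff_t expected = 1, total = 1;
  int contiguous = 1;
  for (int d = 0; d < rank; d++)
    {
      if (dims[d].stride != expected)
        contiguous = 0;
      expected *= dims[d].extent;
      total *= dims[d].extent;
    }
  if (contiguous)
    return base;                              /* caller keeps using the original storage */

  char *dest = malloc ((size_t) total * elem_size);   /* stands in for xmallocarray */
  if (dest == NULL)
    return NULL;

  ptrdiff_t counter[16] = { 0 };              /* odometer over the extents */
  for (ptrdiff_t i = 0; i < total; i++)
    {
      ptrdiff_t off = 0;
      for (int d = 0; d < rank; d++)
        off += counter[d] * dims[d].stride;
      memcpy (dest + i * elem_size, base + off * elem_size, elem_size);

      for (int d = 0; d < rank; d++)          /* advance the odometer */
        if (++counter[d] < dims[d].extent)
          break;
        else
          counter[d] = 0;
    }
  return dest;
}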
// BEGIN GLOBAL FUNCTION DEF: _gfortran_internal_unpack .visible .func _gfortran_internal_unpack
//   returns immediately when the source pointer is NULL or already equal to the descriptor's data pointer;
//   otherwise the same type/size dispatch selects one of _gfortrani_internal_unpack_{1,2,4,8,16,r4,r8,c4,c8}
//   (with the aligned integer fallback for sizes 2, 4, 8 and 16); on the generic path the extents are
//   recomputed, a contiguous destination is filled with a single whole-array copy, and a strided destination
//   is filled element by element with the odometer walk, scattering the contiguous source back into place.
// _abs_c4.o/
// BEGIN GLOBAL FUNCTION DECL: cabsf .extern .func (.param .f32 %value_out) cabsf;
// BEGIN GLOBAL FUNCTION DEF: _gfortran_specific__abs_c4 .visible .func (.param .f32 %value_out) _gfortran_specific__abs_c4
//   loads the COMPLEX(4) argument through its pointer and returns cabsf of it. (source file: "fortran/generated/_abs_c4.F90")
// _abs_c8.o/
// BEGIN GLOBAL FUNCTION DECL: cabs .extern .func (.param .f64 %value_out) cabs;
// BEGIN GLOBAL FUNCTION DEF: _gfortran_specific__abs_c8 .visible .func (.param .f64 %value_out) _gfortran_specific__abs_c8
//   returns cabs of the COMPLEX(8) argument. (source file: "fortran/generated/_abs_c8.F90")
// _abs_c10.o/
// _abs_i4.o/
// BEGIN GLOBAL FUNCTION DEF: _gfortran_specific__abs_i4: abs.s32 of the dereferenced argument. (source file: "fortran/generated/_abs_i4.F90")
// BEGIN GLOBAL FUNCTION DEF: _gfortran_specific__abs_i8: abs.s64 of the dereferenced argument. (source file: "fortran/generated/_abs_i8.F90")
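The archive members that follow are the specific-intrinsic wrappers generated from fortran/generated/_*.F90: each receives its argument by reference, as Fortran specifics do, and forwards to either a single PTX instruction or the matching C math routine. A representative trio, sketched in C with invented names:

#include <complex.h>
#include <math.h>

/* By-reference wrappers in the style of the _gfortran_specific__* entry points. */
static float  abs_r4_sketch (const float *x)           { return fabsf (*x); }
static float  abs_c4_sketch (const float _Complex *z)  { return cabsf (*z); }
static double exp_r8_sketch (const double *x)          { return exp (*x);  }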
// BEGIN GLOBAL FUNCTION DEF: _gfortran_specific__abs_i16 .visible .func _gfortran_specific__abs_i16
//   128-bit absolute value computed from the two 64-bit halves of the argument and stored through the result
//   pointer. (source file: "fortran/generated/_abs_i16.F90")
// _abs_r4.o/  _gfortran_specific__abs_r4: abs.f32 of the dereferenced argument. ("fortran/generated/_abs_r4.F90")
// _abs_r8.o/  _gfortran_specific__abs_r8: abs.f64. ("fortran/generated/_abs_r8.F90")
// _gfortran_specific__aimag_c4: returns the imaginary part of the COMPLEX(4) argument. ("fortran/generated/_aimag_c4.F90")
// _aimag_c8.o/  _gfortran_specific__aimag_c8: returns the imaginary part of the COMPLEX(8) argument. ("fortran/generated/_aimag_c8.F90")
// _gfortran_specific__exp_r4: calls expf (.extern). ("fortran/generated/_exp_r4.F90")
// _exp_r8.o/  _gfortran_specific__exp_r8: calls exp. ("fortran/generated/_exp_r8.F90")
// _exp_r10.o/  _exp_c4.o/  _gfortran_specific__exp_c4: calls cexpf and stores the complex result through the result pointer. ("fortran/generated/_exp_c4.F90")
// _gfortran_specific__exp_c8: calls cexp. ("fortran/generated/_exp_c8.F90")  _exp_c10.o/
// _gfortran_specific__log_r4: calls logf. ("fortran/generated/_log_r4.F90")
// g_r8.o/  _gfortran_specific__log_r8: calls log. ("fortran/generated/_log_r8.F90")
// _log_r10.o/  _log_c4.o/  _gfortran_specific__log_c4: calls clogf. ("fortran/generated/_log_c4.F90")
// _gfortran_specific__log_c8: calls clog. ("fortran/generated/_log_c8.F90")  _log_c10.o/
// _log10_r4.o/  _gfortran_specific__log10_r4: calls log10f. ("fortran/generated/_log10_r4.F90")
// g10_r8.o/  _gfortran_specific__log10_r8: calls log10. ("fortran/generated/_log10_r8.F90")
// _log10_r10.o/  _sqrt_r4.o/  _gfortran_specific__sqrt_r4: sqrt.rn.f32 of the dereferenced argument. ("fortran/generated/_sqrt_r4.F90")
// _gfortran_specific__sqrt_r8: sqrt.rn.f64. ("fortran/generated/_sqrt_r8.F90")
// _sqrt_r10.o/  _sqrt_c4.o/  _gfortran_specific__sqrt_c4: calls csqrtf. ("fortran/generated/_sqrt_c4.F90")
// _sqrt_c8.o/  _gfortran_specific__sqrt_c8: calls csqrt. ("fortran/generated/_sqrt_c8.F90")
%r31,[%r27+8]; ld.f64 %r29,[%r27];.param .f64 %out_arg3; st.param.f64 [%out_arg3],%r31; call csqrt43 12 st.f64 [%r26],%r24; st.f64 [%r26+8],%r25; ret; } _sqrt_c10.o/ gfortran_specific__asin_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__asin_r4fortran/generated/_asin_r4.F90" // BEGIN GLOBAL FUNCTION DECL: asinf .extern .func (.param .f32 %value_out) aF: _gfortran_specific__asin_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__asin_r4mov.u64 %r24,%ar0; .loc 1 42 44asinf,(%out_arg1); ld.param.f32 %r26,[%value_in]; } .loc 1 43 12 _asin_r8.o/ gfortran_specific__asin_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__asin_r8fortran/generated/_asin_r8.F90" // BEGIN GLOBAL FUNCTION DECL: asin .extern .func (.param .f64 %value_out) asinF: _gfortran_specific__asin_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__asin_r8mov.u64 %r24,%ar0; .loc 1 42 44asin,(%out_arg1)_asin_r10.o/ _asinh_r4.o/gfortran_specific__asinh_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__asinh_r4fortran/generated/_asinh_r4.F90" // BEGIN GLOBAL FUNCTION DECL: asinhf .extern .func (.param .f32 %value_out) asinhfF: _gfortran_specific__asinh_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__asinh_r4mov.u64 %r24,%ar0; .loc 1 42 46asinhf,(%out_arg1); ld.param.f32 %r26,[%value_in]; } .loc 1 43 12 _asinh_r8.o/gfortran_specific__asinh_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__asinh_r8fortran/generated/_asinh_r8.F90" // BEGIN GLOBAL FUNCTION DECL: asinh .extern .func (.param .f64 %value_out) asinhF: _gfortran_specific__asinh_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__asinh_r8mov.u64 %r24,%ar0; .loc 1 42 46asinh,(%out_arg1)_asinh_r10.o/ gfortran_specific__acos_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__acos_r4fortran/generated/_acos_r4.F90" // BEGIN GLOBAL FUNCTION DECL: acosf .extern .func (.param .f32 %value_out) acosfF: _gfortran_specific__acos_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__acos_r4mov.u64 %r24,%ar0; .loc 1 42 44acosf,(%out_arg1); ld.param.f32 %r26,[%value_in]; } .loc 1 43 12 _acos_r8.o/ gfortran_specific__acos_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__acos_r8fortran/generated/_acos_r8.F90" // BEGIN GLOBAL FUNCTION DECL: acos .extern .func (.param .f64 %value_out) acosF: _gfortran_specific__acos_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__acos_r8mov.u64 %r24,%ar0; .loc 1 42 44acos,(%out_arg1)_acos_r10.o/ _acosh_r4.o/gfortran_specific__acosh_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__acosh_r4fortran/generated/_acosh_r4.F90" // BEGIN GLOBAL FUNCTION DECL: acoshf .extern .func (.param .f32 %value_out) acoshfF: _gfortran_specific__acosh_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__acosh_r4mov.u64 %r24,%ar0; .loc 1 42 46acoshf,(%out_arg1); ld.param.f32 %r26,[%value_in]; } .loc 1 43 12 _acosh_r8.o/gfortran_specific__acosh_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__acosh_r8fortran/generated/_acosh_r8.F90" // BEGIN GLOBAL FUNCTION DECL: acosh .extern .func (.param .f64 %value_out) acoshF: _gfortran_specific__acosh_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__acosh_r8mov.u64 %r24,%ar0; .loc 1 42 46acosh,(%out_arg1)_acosh_r10.o/ gfortran_specific__atan_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__atan_r4fortran/generated/_atan_r4.F90" // BEGIN GLOBAL FUNCTION DECL: atanf .extern .func (.param .f32 %value_out) atanfF: 
_gfortran_specific__atan_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__atan_r4mov.u64 %r24,%ar0; .loc 1 42 44atanf,(%out_arg1); ld.param.f32 %r26,[%value_in]; } .loc 1 43 12 _atan_r8.o/ gfortran_specific__atan_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__atan_r8fortran/generated/_atan_r8.F90" // BEGIN GLOBAL FUNCTION DECL: atan .extern .func (.param .f64 %value_out) atanF: _gfortran_specific__atan_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__atan_r8mov.u64 %r24,%ar0; .loc 1 42 44atan,(%out_arg1)_atan_r10.o/ _atanh_r4.o/gfortran_specific__atanh_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__atanh_r4fortran/generated/_atanh_r4.F90" // BEGIN GLOBAL FUNCTION DECL: atanhf .extern .func (.param .f32 %value_out) atanhfF: _gfortran_specific__atanh_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__atanh_r4mov.u64 %r24,%ar0; .loc 1 42 46atanhf,(%out_arg1); ld.param.f32 %r26,[%value_in]; } .loc 1 43 12 _atanh_r8.o/gfortran_specific__atanh_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__atanh_r8fortran/generated/_atanh_r8.F90" // BEGIN GLOBAL FUNCTION DECL: atanh .extern .func (.param .f64 %value_out) aF: _gfortran_specific__atanh_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__atanh_r8mov.u64 %r24,%ar0; .loc 1 42 46atanh,(%out_arg1)_atanh_r10.o/ gfortran_specific__sin_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__sin_r4fortran/generated/_sin_r4.F90" // BEGIN GLOBAL FUNCTION DECL: sinf .extern .func (.param .f32 %value_out) F: _gfortran_specific__sin_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__sin_r4mov.u64 %r24,%ar0; .loc 1 42 42sinf,(%out_arg1); ld.param.f32 %r26,[%value_in]; } .loc 1 43 12 _sin_r8.o/ gfortran_specific__sin_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__sin_r8fortran/generated/_sin_r8.F90" // BEGIN GLOBAL FUNCTION DECL: sin .extern .func (.param .f64 %value_out) sinF: _gfortran_specific__sin_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__sin_r8mov.u64 %r24,%ar0; .loc 1 42 42sin,(%out_arg1)_sin_r10.o/ _sin_c4.o/ gfortran_specific__sin_c4 .visible .func _gfortran_specific__sin_c4fortran/generated/_sin_c4.F90" // BEGIN GLOBAL FUNCTION DECL: csinf .extern .func csinfortran_specific__sin_c4 .visible .func _gfortran_specific__sin_c48f32 %r29; .reg .f32 %r31; mov.u64 %r26,%ar0; mov.u64 %r27,%ar1; .loc 1 42 42 ld.f32 %r31,[%r27+4]; ld.f32 %r29,[%r27];.param .f32 %out_arg3; st.param.f32 [%out_arg3],%r31; call csinf43 12 st.f32 [%r26],%r24; st.f32 [%r26+4],%r25gfortran_specific__sin_c8 .visible .func _gfortran_specific__sin_c8fortran/generated/_sin_c8.F90" // BEGIN GLOBAL FUNCTION DECL: csin .extern .func csF: _gfortran_specific__sin_c8 .visible .func _gfortran_specific__sin_c842 42 ld.f64 %r31,[%r27+8]; ld.f64 %r29,[%r27];.param .f64 %out_arg3; st.param.f64 [%out_arg3],%r31; call cs43 12 st.f64 [%r26],%r24; st.f64 [%r26+8],%r25; ret; } _sin_c10.o/ gfortran_specific__cos_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__cos_r4fortran/generated/_cos_r4.F90" // BEGIN GLOBAL FUNCTION DECL: cosf .extern .func (.param .f32 %value_out) cosfF: _gfortran_specific__cos_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__cos_r4mov.u64 %r24,%ar0; .loc 1 42 42cosf,(%out_arg1); ld.param.f32 %r26,[%value_in]; } .loc 1 43 12 _cos_r8.o/ gfortran_specific__cos_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__cos_r8fortran/generated/_cos_r8.F90" // BEGIN GLOBAL FUNCTION DECL: cos .extern .func 
(.param .f64 %value_out) cosF: _gfortran_specific__cos_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__cos_r8mov.u64 %r24,%ar0; .loc 1 42 42cos,(%out_arg1)_cos_r10.o/ _cos_c4.o/ gfortran_specific__cos_c4 .visible .func _gfortran_specific__cos_c4fortran/generated/_cos_c4.F90" // BEGIN GLOBAL FUNCTION DECL: ccosf .extern .func ccosfortran_specific__cos_c4 .visible .func _gfortran_specific__cos_c48f32 %r29; .reg .f32 %r31; mov.u64 %r26,%ar0; mov.u64 %r27,%ar1; .loc 1 42 42 ld.f32 %r31,[%r27+4]; ld.f32 %r29,[%r27];.param .f32 %out_arg3; st.param.f32 [%out_arg3],%r31; call ccosf43 12 st.f32 [%r26],%r24; st.f32 [%r26+4],%r25gfortran_specific__cos_c8 .visible .func _gfortran_specific__cos_c8fortran/generated/_cos_c8.F90" // BEGIN GLOBAL FUNCTION DECL: ccos .extern .func ccosortran_specific__cos_c8 .visible .func _gfortran_specific__cos_c842 42 ld.f64 %r31,[%r27+8]; ld.f64 %r29,[%r27];.param .f64 %out_arg3; st.param.f64 [%out_arg3],%r31; call ccos43 12 st.f64 [%r26],%r24; st.f64 [%r26+8],%r25; ret; } _cos_c10.o/ gfortran_specific__tan_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__tan_r4fortran/generated/_tan_r4.F90" // BEGIN GLOBAL FUNCTION DECL: tanf .extern .func (.param .f32 %value_out) tanfF: _gfortran_specific__tan_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__tan_r4mov.u64 %r24,%ar0; .loc 1 42 42tanf,(%out_arg1); ld.param.f32 %r26,[%value_in]; } .loc 1 43 12 _tan_r8.o/ gfortran_specific__tan_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__tan_r8fortran/generated/_tan_r8.F90" // BEGIN GLOBAL FUNCTION DECL: tan .extern .func (.param .f64 %value_out) tanF: _gfortran_specific__tan_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__tan_r8mov.u64 %r24,%ar0; .loc 1 42 42tan,(%out_arg1)_tan_r10.o/ gfortran_specific__sinh_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__sinh_r4fortran/generated/_sinh_r4.F90" // BEGIN GLOBAL FUNCTION DECL: sinhf .extern .func (.param .f32 %value_out) sinhfF: _gfortran_specific__sinh_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__sinh_r4mov.u64 %r24,%ar0; .loc 1 42 44sinhf,(%out_arg1); ld.param.f32 %r26,[%value_in]; } .loc 1 43 12 _sinh_r8.o/ gfortran_specific__sinh_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__sinh_r8fortran/generated/_sinh_r8.F90" // BEGIN GLOBAL FUNCTION DECL: sinh .extern .func (.param .f64 %value_out) sinhF: _gfortran_specific__sinh_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__sinh_r8mov.u64 %r24,%ar0; .loc 1 42 44sinh,(%out_arg1)_sinh_r10.o/ gfortran_specific__cosh_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__cosh_r4fortran/generated/_cosh_r4.F90" // BEGIN GLOBAL FUNCTION DECL: coshf .extern .func (.param .f32 %value_out) coshfF: _gfortran_specific__cosh_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__cosh_r4mov.u64 %r24,%ar0; .loc 1 42 44coshf,(%out_arg1); ld.param.f32 %r26,[%value_in]; } .loc 1 43 12 _cosh_r8.o/ gfortran_specific__cosh_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__cosh_r8fortran/generated/_cosh_r8.F90" // BEGIN GLOBAL FUNCTION DECL: cosh .extern .func (.param .f64 %value_out) coshF: _gfortran_specific__cosh_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__cosh_r8mov.u64 %r24,%ar0; .loc 1 42 44cosh,(%out_arg1)_cosh_r10.o/ gfortran_specific__tanh_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__tanh_r4fortran/generated/_tanh_r4.F90" // BEGIN GLOBAL FUNCTION DECL: tanhf .extern .func (.param .f32 %value_out) tanhfF: 
_gfortran_specific__tanh_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__tanh_r4mov.u64 %r24,%ar0; .loc 1 42 44tanhf,(%out_arg1); ld.param.f32 %r26,[%value_in]; } .loc 1 43 12 _tanh_r8.o/ gfortran_specific__tanh_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__tanh_r8fortran/generated/_tanh_r8.F90" // BEGIN GLOBAL FUNCTION DECL: tanh .extern .func (.param .f64 %value_out) F: _gfortran_specific__tanh_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__tanh_r8mov.u64 %r24,%ar0; .loc 1 42 44tanh,(%out_arg1)_tanh_r10.o/ _conjg_c4.o/gfortran_specific__conjg_4 .visible .func _gfortran_specific__conjg_4fortran/generated/_conjg_c4.F90gfortran_specific__conjg_4 .visible .func _gfortran_specific__conjg_445 ld.f32 %r27,[%r26+4]; neg.f32 %r24,%r27; .loc 1 43 12 ld.f32 %r28,[%r26]; st.f32 [%r25],%r28; st.f32 [%r25+4],%r24; ret; } _conjg_c8.o/gfortran_specific__conjg_8 .visible .func _gfortran_specific__conjg_8fortran/generated/_conjg_c8.F90gfortran_specific__conjg_8 .visible .func _gfortran_specific__conjg_845 ld.f64 %r27,[%r26+8]; neg.f64 %r24,%r27; .loc 1 43 12 ld.f64 %r28,[%r26]; st.f64 [%r25],%r28; st.f64 [%r25+8],%r24; ret; } _conjg_c10.o/ _aint_r4gfortran_specific__aint_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__aint_r4fortran/generated/_aint_r4.F90gfortran_specific__aint_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__aint_r4mov.u64 %r24,%ar0; .loc 1 42 44 ld.f32 %r25,[%r24]; cvt.rzi.f32.f32 %value,%r25; .loc 1 43 ortran_specific__aint_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__aint_r8fortran/generated/_aint_r8.F90gfortran_specific__aint_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__aint_r8mov.u64 %r24,%ar0; .loc 1 42 44 ld.f64 %r25,[%r24]; cvt.rzi.f64.f64 %value,%r25; .loc 1 43 12_aint_r10.o/ _anint_r4.o/gfortran_specific__anint_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__anint_r4fortran/generated/_anint_r4.F90" // BEGIN GLOBAL FUNCTION DECL: roundf .extern .func (.param .f32 %value_out) roundfF: _gfortran_specific__anint_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__anint_r4mov.u64 %r24,%ar0; .loc 1 42 46roundf,(%out_arg1); ld.param.f32 %r26,[%value_in]; } .loc 1 43 12 _anint_r8.o/gfortran_specific__anint_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__anint_r8fortran/generated/_anint_r8.F90" // BEGIN GLOBAL FUNCTION DECL: round .extern .func (.param .f64 %value_out) roundF: _gfortran_specific__anint_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__anint_r8mov.u64 %r24,%ar0; .loc 1 42 46round,(%out_arg1)_anint_r10.o/ gfortran_specific__sign_gfortran_specific__sign_fortran/generated/_sign_i4.F90gfortran_specific__sign_gfortran_specific__sign_i441 46 ld.u32 %r25,[%r28]; ld.u32 %r31,[%r29]; xor.b32 %r30,%r25,%r31; shr.s32 %r26,%r30,31; add.u32 %r33,gfortran_specific__sign_gfortran_specific__sign_fortran/generated/_sign_i8.F90gfortran_specific__sign_gfortran_specific__sign_41 46 ld.u64 %r25,[%r28]; ld.u64 %r31,[%r29]; xor.b64 %r30,%r25,%r31; shr.s64 %r26,%r30,63; add.u64 %r33,%r25,%r26; xor.b64 %value,%r33,%r26; .loc 1 42 12// BEGIN GLOBAL FUNCTION DECL: _gfortran_specific__sign_i16 .visible .func _gfortran_specific__sign_i16fortran/generated/_sign_i16.F90gfortran_specific__sign_i16 .visible .func _gfortran_specific__sign_i16u64.loc 1 41 47 ld.u64 %r59,[%r29]; ld.u64 %r60,[%r29+8]; ld.u64 %r36,[%r30+8]; xor.b64 %r35,%r36,%r60; shr.s64 %r38,%r35,63; add.u64 %r43,%r59,%r38; set.u32.lt.u64 %r48,%r43,%r59; cvt.s64.s32 
%r46,%r48; add.u64 %r49,%r60,%r38; sub.u64 %r52,%r49,%r46; xor.b64 %r53,%r43,%r38; st.u64 [%r28],%r53; xor.b64 %r56,%r52,%r38; st.u64 [%r28+8],%r56; .loc 1 42 12gfortran_specific__sign_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__sign_r4fortran/generated/_sign_r4.F90gfortran_specific__sign_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__sign_r446 ld.f32 %r27,[%r25]; ld.f32 %r28,[%r26]; copysign.f32 %value,%r28,%r27; .loc 1 42 ortran_specific__sign_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__sign_r8fortran/generated/_sign_r8.F90gfortran_specific__sign_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__sign_r846 ld.f64 %r27,[%r25]; ld.f64 %r28,[%r26]; copysign.f64 %value,%r28,%r27; .loc 1 42 12_sign_r10.o/ _dim_i4.o/ gfortran_specific__dim_gfortran_specific__dim_fortran/generated/_dim_i4.F90gfortran_specific__dim_gfortran_specific__dim_i441 44 ld.u32 %r29,[%r26]; ld.u32 %r30,[%r27]; sub.u32 %r28,%r29,%r30; max.s32 %value,%r28,gfortran_specific__dim_gfortran_specific__dim_fortran/generated/_dim_i8.F90gfortran_specific__dim_gfortran_specific__dim_41 44 ld.u64 %r29,[%r26]; ld.u64 %r30,[%r27]; sub.u64 %r28,%r29,%r30; max.s64 %value,%r28,0; .loc 1 42 12m_i16.o/ gfortran_specific__dim_i16 .visible .func _gfortran_specific__dim_i16fortran/generated/_dim_i16.F90gfortran_specific__dim_i16 .visible .func _gfortran_specific__dim_i16pred.loc 1 41 45 ld.u64 %r47,[%r27]; ld.u64 %r48,[%r27+8]; ld.u64 %r49,[%r28]; ld.u64 %r50,[%r28+8];2; add.u64 %r54,%r38,%r35; setp.ge.s64 %r44,%r54,0; @ %r44 bra $L2; mov.u64 %r53,0; mov.u64 %r54,%r53; $L2: .loc 1 42 12 st.u64 [%r26],%r53; st.u64 [%r26+8],%r54; ret; } _dim_r4.o/gfortran_specific__dim_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__dim_r4fortran/generated/_dim_r4.F90gfortran_specific__dim_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__dim_r4pred44 ld.f32 %r27,[%r25]; ld.f32 %r28,[%r26]; sub.f32 %r24,%r27,%r28; setp.le.f32 %r29,%r24,0f00000000; selp.f32 %value,0f00000000,%r24,%r29; .loc 1 42 _dim_r8.o/ gfortran_specific__dim_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__dim_r8fortran/generated/_dim_r8.F90gfortran_specific__dim_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__dim_r8pred44 ld.f64 %r27,[%r25]; ld.f64 %r28,[%r26]; sub.f64 %r24,%r27,%r28; setp.le.f64 %r29,%r24,0d0000000000000000; selp.f64 %value,0d0000000000000000,%r24,%r29; .loc 1 42 12ortran_specific__atan2_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__atan2_r4fortran/generated/_atan2_r4.F90" // BEGIN GLOBAL FUNCTION DECL: atan2f .extern .func (.param .f32 %value_out) atan2fortran_specific__atan2_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__atan2_r4f32 %r28; .reg .f32 %r29; mov.u64 %r25,%ar0; mov.u64 %r26,%ar1; .loc 1 41 48 ld.f32 %r28,[%r26]; ld.f32 %r27,[%r25]; { .param .f32 %value_in; .param .f32 %out_arg1; st.param.f32 [%out_arg1],%r2712ortran_specific__atan2_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__atan2_r8fortran/generated/_atan2_r8.F90" // BEGIN GLOBAL FUNCTION DECL: atan2 .extern .func (.param .f64 %value_out) atan2ortran_specific__atan2_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__atan2_r8f64 %r28; .reg .f64 %r29; mov.u64 %r25,%ar0; mov.u64 %r26,%ar1; .loc 1 41 48 ld.f64 %r28,[%r26]; ld.f64 %r27,[%r25]; { .param .f64 %value_in; .param .f64 %out_arg1; st.param.f64 [%out_arg1],%r27(%value_in),atan2.loc 1 42 
12ortran_specific__mod_gfortran_specific__mod_fortran/generated/_mod_i4.F90gfortran_specific__mod_gfortran_specific__mod_i4remgfortran_specific__mod_gfortran_specific__mod_fortran/generated/_mod_i8.F90gfortran_specific__mod_gfortran_specific__mod_44 ld.u64 %r28,[%r25]; ld.u64 %r29,[%r26]; rem.s64 %value,%r28,%r29; .loc 1 42 12gfortran_specific__mod_i16 .visible .func _gfortran_specific__mod_i16fortran/generated/_mod_i16.F90"odti3 .extern .func __modti3F: _gfortran_specific__mod_i16 .visible .func _gfortran_specific__mod_i1641 45 ld.u64 %r46,[%r26]; ld.u64 %r47,[%r26+8]; add.u64 %r37,%frame,16; add.u64 %r41,%frame,32; st.u64 [%frame+16],%r46; st.u64 [%frame+24],%r47; ld.u64 %r39,[%r27]; st.u64 [%frame+32],%r39; ld.u64 %r40,[%r27+8]; st.u64 [%frame+40],%r40; {42 12 st.u64 [%r25],%r48; st.u64 [%r25+8],%r49; ret; } _mod_r4.o/ gfortran_specific__mod_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__mod_r4fortran/generated/_mod_r4.F90" // BEGIN GLOBAL FUNCTION DECL: fmodf .extern .func (.param .f32 %value_out) fmodfortran_specific__mod_r4 .visible .func (.param .f32 %value_out) _gfortran_specific__mod_r4f32 %r28; .reg .f32 %r29; mov.u64 %r25,%ar0; mov.u64 %r26,%ar1; .loc 1 41 44 ld.f32 %r28,[%r26]; ld.f32 %r27,[%r25]; { .param .f32 %value_in; .param .f32 %out_arg1; st.param.f32 [%out_arg1],%r27(%value_in),fmod12 _mod_r8.o/ gfortran_specific__mod_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__mod_r8fortran/generated/_mod_r8.F90" // BEGIN GLOBAL FUNCTION DECL: fmod .extern .func (.param .f64 %value_out) fmodortran_specific__mod_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__mod_r8f64 %r28; .reg .f64 %r29; mov.u64 %r25,%ar0; mov.u64 %r26,%ar1; .loc 1 41 44 ld.f64 %r28,[%r26]; ld.f64 %r27,[%r25]; { .param .f64 %value_in; .param .f64 %out_arg1; st.param.f64 [%out_arg1],%r27(%value_in),fmo_mod_r10.o/ 836 _gfortran_specific__nint_4_4gfortran_specific__nint_4_4fortran/generated/misc_specifics.F90" // BEGIN GLOBAL FUNCTION DECL: _gfortran_specific__nint_4_8gfortran_specific__nint_4_8_gfortran_specific__nint_8_4gfortran_specific__nint_8_4_gfortran_specific__nint_8_8gfortran_specific__nint_8_8_gfortran_specific__nint_16_4 .visible .func _gfortran_specific__nint_16_); // BEGIN GLOBAL FUNCTION DECL: _gfortran_specific__nint_16_8 .visible .func _gfortran_specific__nint_16_8; // BEGIN GLOBAL FUNCTION DECL: _gfortran_specific__char_1_i4 .visible .func _gfortran_specific__char_1_i4_gfortran_specific__char_1_i8 .visible .func _gfortran_specific__char_1_i8_gfortran_specific__char_1_i16 .visible .func _gfortran_specific__char_1_i16_gfortran_specific__len_1_gfortran_specific__len_1_); // BEGIN GLOBAL FUNCTION DECL: _gfortran_specific__len_1_gfortran_specific__len_1_; // BEGIN GLOBAL FUNCTION DECL: _gfortran_specific__len_1_i16 .visible .func _gfortran_specific__len_1_i16_gfortran_specific__index_1_gfortran_specific__index_1_i4gfortran_specific__index_1_gfortran_specific__index_1_i8gfortran_specific__index_1_i16 .visible .func _gfortran_specific__index_1_i16lroundflroundflroundlround_gfortran_string_index_gfortran_string_indexF: _gfortran_specific__nint_4_4gfortran_specific__nint_4_4mov.u64 %r24,%ar0; .loc 1 37 45 ld.f32 %r26,[%r24]26; call (%value_in),lroundf,(38 12 cvt.u32_gfortran_specific__nint_4_8gfortran_specific__nint_4_8mov.u64 %r24,%ar0; .loc 1 45 45 ld.f64 %r26,[%r24]26; call (%value_in),lround,(46 12 cvt.u32_gfortran_specific__nint_8_4gfortran_specific__nint_8_4u64 %r25; .reg .f32 %r27; .reg .u64 %r28; mov.u64 %r25,%ar0; .loc 1 69 45 ld.f32 %r27,[%r25]; call 
(%value_in),lroundf,(%out_arg1); ld.param.u64 %r28,[%value_in]; } cvt.u32.u64 %r23,%r28; cvt.s64.s32 %value,%r23; .loc 1 70 12gfortran_specific__nint_8_8gfortran_specific__nint_8_8u64 %r25; .reg .f64 %r27; .reg .u64 %r28; mov.u64 %r25,%ar0; .loc 1 77 45 ld.f64 %r27,[%r25]; call (%value_in),lround,(%out_arg1); ld.param.u64 %r28,[%value_in]; } cvt.u32.u64 %r23,%r28; cvt.s64.s32 %value,%r23; .loc 1 78 12gfortran_specific__nint_16_4 .visible .func _gfortran_specific__nint_16_4101 46 ld.f32 %r28,[%r26]28; call (%value_in),lroundf,(%out_arg1); ld.param.u64 %r29,[%value_in]; } cvt.u32.u64 %r23,%r29; cvt.s64.s32 %r31,%r23; st.u64 [%r25],%r31; shr.s64 %r32,%r31,63; st.u64 [%r25+8],%r32; .loc 1 102 12_gfortran_specific__nint_16_8 .visible .func _gfortran_specific__nint_16_8109 46 ld.f64 %r28,[%r26]28; call (%value_in),lround,(%out_arg1); ld.param.u64 %r29,[%value_in]; } cvt.u32.u64 %r23,%r29; cvt.s64.s32 %r31,%r23; st.u64 [%r25],%r31; shr.s64 %r32,%r31,63; st.u64 [%r25+8],%r32; .loc 1 110 12_gfortran_specific__char_1_i4 .visible .func _gfortran_specific__char_1_i4mov.u64 %r24,%ar0; mov.u64 %r26,%ar2; .loc 1 135 54 ld.u32 %r27,[%r26]; st.u8 [%r24],%r27; .loc 1 136 12_gfortran_specific__char_1_i8 .visible .func _gfortran_specific__char_1_i8mov.u64 %r24,%ar0; mov.u64 %r26,%ar2; .loc 1 143 54 ld.u64 %r27,[%r26]; st.u8 [%r24],%r27; .loc 1 144 12_gfortran_specific__char_1_i16 .visible .func _gfortran_specific__char_1_i16mov.u64 %r24,%ar0; mov.u64 %r26,%ar2; .loc 1 151 55 ld.u64 %r27,[%r26]; st.u8 [%r24],%r27; .loc 1 152 12_gfortran_specific__len_1_gfortran_specific__len_1_i4mov.u64 %r24,%ar1; .loc 1 162 12 cvt.u32_gfortran_specific__len_1_gfortran_specific__len_1_mov.u64 %r25,%ar1; .loc 1 169 44 cvt.u32.u64 %r27,%r25; cvt.s64.s32 %value,%r27; .loc 1 170 12gfortran_specific__len_1_i16 .visible .func _gfortran_specific__len_1_i16u64 %r29; mov.u64 %r24,%ar0; mov.u64 %r26,%ar2; .loc 1 177 45 cvt.u32.u64 %r28,%r26; cvt.s64.s32 %r27,%r28; st.u64 [%r24],%r27; shr.s64 %r29,%r27,63; st.u64 [%r24+8],%r29; .loc 1 178 12_gfortran_specific__index_1_gfortran_specific__index_1__gfortran_string_index,[%value_in]; } .loc 1 188 12 cvt.u32_gfortran_specific__index_1_gfortran_specific__index_1_56 mov.u32 %r33call (%value_in),_gfortran_string_index34,[%value_in]; } cvt.u32.u64 %r36,%r34; cvt.s64.s32 %value,%r36; .loc 1 196 12gfortran_specific__index_1_i16 .visible .func _gfortran_specific__index_1_i16 {mov.u64 %r29,%ar4; .loc 1 203 57call (%value_in),_gfortran_string_index35,[%value_in]; } cvt.u32.u64 %r37,%r35; cvt.s64.s32 %r36,%r37; st.u64 [%r25],%r36; shr.s64 %r38,%r36,63; st.u64 [%r25+8],%r38; .loc 1 204 12 ret; } dprod_r8.o/ ortran_specific__dprod_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__dprod_r8fortran/intrinsics/dprod_r8.f90gfortran_specific__dprod_r8 .visible .func (.param .f64 %value_out) _gfortran_specific__dprod_r81 48 ld.f32 %r31,[%r27]; cvt.f64.f32 %r30,%r31; ld.f32 %r33,[%r28]; cvt.f64.f32 %r32,%r33; mul.f64 %value,%r30,%r32; .loc 1 32 12f2c_specifics.o/1622802223gfortran_f2c_specific__abs_r4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__abs_r4fortran/intrinsics/f2c_specifics.F90" // BEGIN GLOBAL FUNCTION DECL: _gfortran_f2c_specific__abs_c4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__abs_c4_gfortran_f2c_specific__aimag_c4 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__aimag_c4_gfortran_f2c_specific__aimag_c8 .visible .func (.param .f64 %value_out) _gfortran_f2c_specific__aimag_c8_gfortran_f2c_specific__exp_r4 .visible .func (.param 
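For orientation, the C sketch below mirrors the calling pattern visible in these wrappers for one representative case. It is an illustration only: the real sources are the generated Fortran files named above, and the function name used here is invented for the sketch, not the exported symbol.

#include <math.h>

/* Sketch of what the PTX for a wrapper such as _gfortran_specific__sin_r4
   effectively does: the argument arrives by reference (a single pointer in
   %ar0), is loaded as a 32-bit float, and the result of the libm call is
   returned by value.  Name below is illustrative only. */
float specific_sin_r4_sketch(const float *parm)
{
    return sinf(*parm);
}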
[Extraction-damaged binary payload, continued: f2c_specifics.o from fortran/intrinsics/f2c_specifics.F90. It declares the extern libm entry points (cabsf, expf, cexpf/cexp, logf, clogf/clog, log10f, csqrtf/csqrt, asinf, acosf, atanf, asinhf, acoshf, atanhf, sinf, csinf/csin, cosf, ccosf/ccos, tanf, sinhf, coshf, tanhf, roundf, atan2f, fmodf) and defines the _gfortran_f2c_specific__* family — abs, aimag, exp, log, log10, sqrt, asin, acos, atan, asinh, acosh, atanh, sin, cos, tan, sinh, cosh, tanh, aint, anint, sign, dim, atan2, mod and conjg for the REAL(4)/COMPLEX(4) kinds, plus the c8 complex variants. Each takes its argument by reference, forwards to the corresponding extern routine or PTX instruction, and widens single-precision results to .f64 as the f2c calling convention requires. The span ends with the start of random_init.o (_gfortran_random_init from libgfortran/intrinsics/random_init.f90), including its global seed state (seed$0, once$1) and error-message string constants stored as byte arrays.]
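The sketch below illustrates the one detail that distinguishes the f2c-convention variants summarized above from the plain specific wrappers: the single-precision result is returned as a double (the cvt.f64.f32 visible in the PTX). The function name is again an invented placeholder, not the exported symbol.

#include <math.h>

/* Sketch of an f2c-convention wrapper such as _gfortran_f2c_specific__sin_r4:
   the REAL(4) argument is still passed by reference, but the float result is
   widened to double before being returned, matching the f2c ABI. */
double f2c_specific_sin_r4_sketch(const float *parm)
{
    return (double) sinf(*parm);
}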
[Extraction-damaged binary payload, continued: the body of _gfortran_random_init, which guards on the once$1 flag, sizes and allocates the seed$0 descriptor, reports failures through _gfortran_runtime_error_at / _gfortran_os_error_at, and calls _gfortran_random_seed_i4; then the libbacktrace support objects: backtrace_alloc, backtrace_free, backtrace_vector_grow, backtrace_vector_finish and backtrace_vector_release (heap-based vector helpers that report errors via __errno and the caller's error callback), backtrace.o with the internal unwind callback and backtrace_full built on _Unwind_Backtrace / _Unwind_GetIPInfo / backtrace_pcinfo, and the start of dwarf.o with declarations for its internal helpers (units_search, function_addrs_search, unit_addrs_search, line_search, abbrev_compare, report_inlined_functions, lookup_abbrev, add_function_range, add_unit_addr, free_abbrevs, unit_addrs_compare, line_compare, function_addrs_compare, add_line, update_pcrange, read_uleb128, read_sleb128, advance, read_byte, read_uint16/24/32/64, read_address, resolve_addr_index, add_ranges_from_rnglists, resolve_string, read_attribute, read_line_header_format_entries, read_referenced_name, find_address_ranges, read_function_entry, dwarf_lookup_pc, dwarf_fileline) and externs (backtrace_dwarf_add, bsearch, strcmp, backtrace_qsort, backtrace_vector_grow/finish/release, backtrace_alloc, backtrace_free).]
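The backtrace_vector_grow body summarized above encodes a simple growth policy that can be read off the PTX (first allocation is 32x the request, small buffers double, buffers past 4 KiB grow by 4 KiB, and the result is never smaller than what was asked for). The C sketch below restates that policy; the struct layout and field names are assumptions made for illustration, not the library's exact source.

#include <stddef.h>
#include <stdlib.h>

struct vec_sketch {
    void  *base;   /* start of the buffer            */
    size_t size;   /* bytes currently in use         */
    size_t alc;    /* spare bytes beyond 'size'      */
};

/* Reserve 'size' more bytes and return a pointer to the new space,
   or NULL if the reallocation fails. */
static void *vec_grow_sketch(struct vec_sketch *v, size_t size)
{
    if (size > v->alc) {
        size_t alc;

        if (v->size == 0)
            alc = 32 * size;          /* first allocation             */
        else if (v->size >= 4096)
            alc = v->size + 4096;     /* large vectors grow linearly  */
        else
            alc = 2 * v->size;        /* small vectors double         */
        if (alc < v->size + size)
            alc = v->size + size;     /* never less than requested    */

        void *base = realloc(v->base, alc);
        if (base == NULL)
            return NULL;              /* the real code reports errno via a callback */
        v->base = base;
        v->alc = alc - v->size;
    }

    void *ret = (char *) v->base + v->size;
    v->size += size;
    v->alc -= size;
    return ret;
}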
backtrace_qsortbacktrace_vector_finishbacktrace_vector_finish105,110,118,97,108,105,100,32,97,98,98,114,101,118,105,97,116,105,111,110,32,99,111,10015,32,105,110,32,37,115,32,97,116,32,37,1068,87,65,82,70,32,117,110,100,101,114,102,108,111,176,69,66,49,50,56,32,111,118,101,114,102,108,111,119,115,32,117,105,110,116,54,52,95,1115,105,103,110,101,100,32,76,69,66,49,50,56,32,111,118,101,114,102,108,111,119,115,32,117,105,110,116,54,5226] = {117,110,114,101,99,111,103,110,105,122,101,100,32,97,100,100,114,101,115,115,32,3] = {68,87,95,70,79,82,77,95,97,100,100,114,120,32,118,97,108,117,101,32,111,117,116,32,111,102,32,114,97,110,10346,100,101,98,117,103,95,97,100,100,114,0 }114,110,103,108,105,115,116,115,32,111,102,102,115,101,116,32,111,117,116,32,111,102,32,114,97,110,103,101114,110,103,108,105,115,116,115,32,105,110,100,101,120,32,111,102,102,115,101,116,32,111,117,116,32,111,102,32,114,97,110,103,101117,110,114,101,99,111,103,110,105,122,101,100,32,68,87,95,82,76,69,32,118,97,108,117,10146,100,101,98,117,103,95,114,110,103,108,105,115,116,1114,97,110,103,101,115,32,111,102,102,115,101,116,32,111,117,116,32,111,102,32,114,97,110,103,10146,100,101,98,117,103,95,114,97,110,103,101,168,87,95,70,79,82,77,95,115,116,114,120,32,118,97,108,117,101,32,111,117,116,32,111,102,32,114,97,110,103,10146,100,101,98,117,103,95,115,116,114,95,111,102,102,115,101,116,168,87,95,70,79,82,77,95,115,116,114,120,32,111,102,102,115,101,116,32,111,117,116,32,111,102,32,114,97,110,103,101,0 }; // BEGIN VAR DEF: $LC17 .const .align 1 .u8 $LC17[26] = {68,87,95,70,79,82,77,95,115,116,114,112,32,111,117,116,32,111,102,32,114,97,110,103,10131] = {68,87,95,70,79,82,77,95,108,105,110,101,95,115,116,114,112,32,111,117,116,32,111,102,32,114,97,110,103,101,0 }; // BEGIN VAR DEF: $LC19 .const .align 1 .u8 $LC19[43] = {68,87,95,70,79,82,77,95,105,110,100,105,114,101,99,116,32,116,111,32,68,87,95,70,79,82,77,95,105,109,112,108,105,99,105,116,95,99,111,110,115,116105,110,118,97,108,105,100,32,100,105,114,101,99,116,111,114,121,32,105,110,100,101,120,32,105,110,32,108,105,110,101,32,110,117,109,98,101,114,32,112,114,111,103,114,97,109,0 }; // BEGIN VAR DEF: $LC34 .const .align 1 .u8 $LC34[43] = {105,110,118,97,108,105,100,32,102,105,108,101,32,110,117,109,98,101,114,32,105,110,32,108,105,110,101,32,110,117,109,98,101,114,32,112,114,111,103,114,97,109,0 }; // BEGIN VAR DEF: $LC35 .const .align 1 .u8 $LC35[27] = {117,110,114,101,99,111,103,110,105,122,101,100,32,68,87,65,82,70,32,118,101,114,115,105,111,110,0 }; // BEGIN VAR DEF: $LC36 .const .align 1 .u8 $LC36[27] = {97,98,98,114,101,118,32,111,102,102,115,101,116,32,111,117,116,32,111,102,32,114,97,110,103,101,0 }; // BEGIN VAR DEF: $LC37 .const .align 1 .u8 $LC37[14] = {46,100,101,98,117,103,95,97,98,98,114,101,118,0 }; // BEGIN VAR DEF: $LC30 .const .align 1 .u8 $LC30[12] = {46,100,101,98,117,103,95,108,10509,98,101,114,32,118,101,114,115,105,111,110,0 }110,111,110,45,122,101,114,111,32,115,101,103,109,101,110,116,95,115,101,108,101,99,116,111,114,930] = 
{68,87,95,70,79,82,77,95,115,116,114,112,95,115,117,112,32,111,117,116,32,111,102,32,114,97,110,103,101117,110,114,101,99,111,103,110,105,122,101,100,32,68,87,65,82,70,32,102,111,114,109115,105,110,103,32,102,105,108,101,32,110,97,109,101,32,105,110,32,108,105,110,101,32,110,117,109,98,101,114,32,112,114,111,103,114,97,109,32,104,101,97,100,101,114105,110,118,97,108,105,100,32,100,105,114,101,99,116,111,114,121,32,105,110,100,101,120,32,105,110,32,108,105,110,101,32,110,117,109,98,101,114,32,112,114,111,103,114,97,109,32,104,101,97,100,101,11497,98,115,116,114,97,99,116,32,111,114,105,103,105,110,32,111,114,32,115,112,101,99116,32,111,102,32,114,97,110,103,10146,100,101,98,117,103,95,105,110,102,1105,110,118,97,108,105,100,32,97,98,115,116,114,97,99,116,32,111,114,105,103,105,110,32,111,114,32,115,112,101,99,105,102,105,99105,110,118,97,108,105,100,32,102,105,108,101,32,110,117,109,98,101,114,32,105,110,32,68,87,95,65,84,95,99,97,108,108,95,102,105,108,101,32,97,116,116,114,105,98,117,116,101117,110,105,116,32,108,105,110,101,32,111,102,102,115,101,116,32,111,117,116,32,111,102,32,114,97,110,103,101,0 }; // BEGIN FUNCTION DEF: units_search .func (.param .u32 %value_out) units_searchpred1125 22 ld.u64 %r24,[%r29]; .loc 1 1128 10 ld.u64 %r25,[%r28]; .loc 1 1129 6 ld.u64 %r30,[%r24+24]; setp.gt.u64 %r31,%r30,%r25; @ %r31 bra $L3; .loc 1 1131 11 ld.u64 %r33,[%r24+32]; set.u32.le.u64 %r34,%r33,%r25; neg.s32 %r27,%r34; bra $L1; $L3: .loc 1 1130 12 mov.u32 %r27,-1; $L1:FUNCTION DEF: function_addrs_search .func (.param .u32 %value_out) function_addrs_searchpred24,[%r27]; .loc 1 1179 6 ld.u64 %r29,[%r28]; setp.gt.u64 %r30,%r29,%r24; @ %r30 bra $L6; .loc 1 1181 11 ld.u64 %r32,[%r28+8]; set.u32.le.u64 %r33,%r32,%r24; neg.s32 %r26,%r33; bra $L4; $L6: .loc 1 1180 12 mov.u32 %r26,-1; $L4: .loc 1 1185 1 mov.u32 %value,%r26;unit_addrs_search .func (.param .u32 %value_out) unit_addrs_searchpred24,[%r27]; .loc 1 1264 6 ld.u64 %r29,[%r28]; setp.gt.u64 %r30,%r29,%r24; @ %r30 bra $L9; .loc 1 1266 11 ld.u64 %r32,[%r28+8]; set.u32.le.u64 %r33,%r32,%r24; neg.s32 %r26,%r33; bra $L7; $L9: .loc 1 1265 12 mov.u32 %r26,-1; $L7: .loc 1 1270 1 mov.u32 %value,%r26;line_searchline_searchpred24,[%r27]; .loc 1 1308 6 ld.u64 %r29,[%r28]; setp.gt.u64 %r30,%r29,%r24; @ %r30 bra $L12; .loc 1 1310 11 ld.u64 %r32,[%r28+24]; set.u32.le.u64 %r33,%r32,%r24; neg.s32 %r26,%r33; bra $L10; $L12: .loc 1 1309 12 mov.u32 %r26,-1; $L10: .loc 1 1314 1 mov.u32 %value,%r26;abbrev_compare .func (.param .u32 %value_out) abbrev_comparepred.loc 1 1325 20 ld.u64 %r23,[%r27]; .loc 1 1325 6 setp.lt.u64 %r28,%r22,%r23; @ %r28 bra $L15; .loc 1 1327 11 set.u32.gt.u64 %r30,%r22,%r23; neg.s32 %r25,%r30; bra $L13; $L15: .loc 1 1326 12 mov.u32 %r25,-1; $L13: .loc 1 1336 1 mov.u32 %value,%r25;report_inlined_functions .func (.param .u32 %value_out) report_inlined_functionspredpredpred %r98; mov.u64 %r53,%ar0; st.u64 [%frame],%r53; mov.u64 %r54,%ar1; mov.u64 %r55,%ar2; mov.u64 %r56,%ar3; mov.u64 %r57,%ar4; mov.u64 %r58,%ar5; .loc 1 3513 15 ld.u64 %r22,[%r54+32]; .loc 1 3513 6 setp.ne.u64 %r59,%r22,0; @ %r59 bra $L17; $L19: .loc 1 3514 12 mov.u32 %r52,0; bra $L16; $L17: .loc 1 3517 7 ld.u64 %r65,[%r54+24]; mov.u64 %r64,function_addrs_search; mov.u64 %r63framecall (%value_in),bsearch66,[%value_in]; } mov.u64 %r40,%r66; .loc 1 3521 6 setp.eq.u64 %r67,%r66,0; @ %r67 bra $L19; .loc 1 3524 46 ld.u64 %r46,[%r54+24]; .loc 1 3525 14 ld.u64 %r45,[%r54+32]; .loc 1 3535 9 ld.u64 %r28,[%frame]; .loc 1 3524 36 sub.u64 %r68,%r66,%r46; shr.s64 %r70,%r68,3; .loc 1 3524 64 
// FUNCTION DEFs (PTX bodies truncated in extraction):
//   report_inlined_functions (cont.): bsearch over the function's address ranges for
//     the pc, a linear refinement loop, a recursive call for nested inlines, then the
//     callback is invoked (via an indirect .callprototype) with the caller file/line;
//   lookup_abbrev: direct index into the abbrev array when the code is in range,
//     otherwise bsearch with abbrev_compare; "invalid abbreviation code" on a miss;
//   add_function_range, add_unit_addr: append a [low, high) range to a
//     backtrace_vector, coalescing with the previous entry when the ranges abut and
//     refer to the same function/unit (see the sketch below);
//   free_abbrevs: backtrace_free of each abbrev's attribute array and of the abbrev
//     array itself, then the counts are zeroed;
//   unit_addrs_compare, line_compare, function_addrs_compare: qsort comparators
//     ordering by low address (or pc), then high address/line, then a tie-breaker
//     (a per-unit field, the insertion index, or strcmp of the function name);
//   add_line$isra$0: append a pc/filename/lineno entry unless it duplicates the
//     previous one;
//   update_pcrange$isra$0: records DW_AT_low_pc (17), DW_AT_high_pc (18) and
//     DW_AT_ranges (85) attribute values into a pcrange structure;
//   read_uleb128 (begins here, continues below).
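add_function_range and add_unit_addr both grow a vector of address ranges, but first try to merge the new range into the previous entry when the two abut and describe the same object. A rough C sketch of that coalescing append, assuming a simple realloc-backed vector in place of the library's backtrace_vector:

#include <stdint.h>
#include <stdlib.h>

struct addr_range { uint64_t low, high; void *payload; };
struct range_vec  { struct addr_range *base; size_t count, alloc; };

/* Returns 1 on success, 0 on allocation failure. */
static int
add_range (struct range_vec *vec, uint64_t low, uint64_t high, void *payload)
{
  if (vec->count > 0)
    {
      struct addr_range *last = &vec->base[vec->count - 1];
      /* New range abuts (or continues) the previous one and carries the
         same payload: just extend the previous entry. */
      if ((last->high == low || last->high + 1 == low)
          && last->payload == payload)
        {
          if (last->high < high)
            last->high = high;
          return 1;
        }
    }
  if (vec->count == vec->alloc)
    {
      size_t n = vec->alloc ? vec->alloc * 2 : 16;
      struct addr_range *p = realloc (vec->base, n * sizeof *p);
      if (p == NULL)
        return 0;
      vec->base = p;
      vec->alloc = n;
    }
  vec->base[vec->count].low = low;
  vec->base[vec->count].high = high;
  vec->base[vec->count].payload = payload;
  vec->count++;
  return 1;
}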
// FUNCTION DEFs:
//   read_uleb128, read_sleb128: decode unsigned/signed LEB128 from the dwarf_buf,
//     reporting "DWARF underflow" on a short buffer and the LEB128-overflow
//     diagnostics (once) when the shift passes 63 (see the sketch below);
//   advance: bounds-checked pointer/size advance over the buffer;
//   read_byte: single-byte read with the same underflow handling;
//   read_uint16 (begins here, continues below).
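read_uleb128 and read_sleb128 implement the standard DWARF LEB128 decoding visible in the PTX: seven payload bits per byte, continuation in bit 7, and sign bit 6 of the last byte for the signed form. A self-contained C sketch with the buffer and error plumbing stripped out:

#include <stdint.h>

static uint64_t
uleb128 (const unsigned char **pp)
{
  const unsigned char *p = *pp;
  uint64_t ret = 0;
  unsigned shift = 0;
  unsigned char b;

  do
    {
      b = *p++;
      if (shift < 64)
        ret |= (uint64_t) (b & 0x7f) << shift;
      /* else: the real reader reports "LEB128 overflows uint64_t" once. */
      shift += 7;
    }
  while (b & 0x80);
  *pp = p;
  return ret;
}

static int64_t
sleb128 (const unsigned char **pp)
{
  const unsigned char *p = *pp;
  uint64_t ret = 0;
  unsigned shift = 0;
  unsigned char b;

  do
    {
      b = *p++;
      if (shift < 64)
        ret |= (uint64_t) (b & 0x7f) << shift;
      shift += 7;
    }
  while (b & 0x80);
  /* Sign-extend if the final byte had its sign bit (0x40) set. */
  if ((b & 0x40) != 0 && shift < 64)
    ret |= ((uint64_t) -1) << shift;
  *pp = p;
  return (int64_t) ret;
}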
// FUNCTION DEFs:
//   read_uint16 (cont.), read_uint24, read_uint32, read_uint64: fixed-width reads;
//     each has a big-endian and a little-endian byte-assembly branch selected by the
//     buffer's endianness flag (see the sketch below);
//   read_address (begins here, continues below): dispatches on the unit's address
//     size (1, 2, 4 or 8 bytes).
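The read_uint16/24/32/64 helpers each assemble bytes according to a per-buffer endianness flag, which is why every one of them has two byte-assembly branches in the PTX. A generic C sketch of that shape; the underflow check is only noted in a comment, and the struct name is illustrative:

#include <stdint.h>
#include <stddef.h>

struct dbuf { const unsigned char *p; size_t left; int is_bigendian; };

static uint64_t
read_uint (struct dbuf *b, unsigned nbytes)   /* nbytes in 1..8 */
{
  uint64_t v = 0;
  unsigned i;

  /* Real reader: if (b->left < nbytes) report "DWARF underflow" and bail out. */
  if (b->is_bigendian)
    for (i = 0; i < nbytes; i++)
      v = (v << 8) | b->p[i];
  else
    for (i = 0; i < nbytes; i++)
      v |= (uint64_t) b->p[i] << (8 * i);
  b->p += nbytes;
  b->left -= nbytes;
  return v;
}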
// FUNCTION DEFs:
//   read_address (cont.): sizes 1/2/4/8 map to read_byte/read_uint16/read_uint32/
//     read_uint64; any other size reports "unrecognized address size";
//   resolve_addr_index$isra$0: scales a DW_FORM_addrx index by the address size,
//     bounds-checks it against .debug_addr and reads the address there
//     (see the sketch below);
//   add_ranges_from_rnglists$isra$0 (begins here, continues below): walks a
//     .debug_rnglists contribution for one DIE.
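resolve_addr_index turns a DW_FORM_addrx index into an address by scaling it by the address size, bounds-checking against .debug_addr, and reading at that offset. A C sketch under those assumptions; the section struct and names are illustrative:

#include <stdint.h>
#include <stddef.h>

struct section { const unsigned char *data; size_t size; };

/* Returns 1 on success, 0 on an out-of-range index
   ("DW_FORM_addrx value out of range" in the real reader). */
static int
resolve_addr_index (const struct section *debug_addr, uint64_t addr_base,
                    unsigned addrsize, int is_bigendian,
                    uint64_t index, uint64_t *addr)
{
  uint64_t offset = addr_base + index * addrsize;

  if (offset + addrsize > debug_addr->size)
    return 0;

  const unsigned char *p = debug_addr->data + offset;
  uint64_t v = 0;
  for (unsigned i = 0; i < addrsize; i++)
    v = is_bigendian ? (v << 8) | p[i] : v | ((uint64_t) p[i] << (8 * i));
  *addr = v;
  return 1;
}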
// FUNCTION DEF (cont.): add_ranges_from_rnglists$isra$0: reads the rnglists offset
//   (32- or 64-bit per the DWARF64 flag), bounds-checks it against .debug_rnglists
//   ("rnglists offset out of range"), then loops over DW_RLE_* entries:
//   base_addressx/startx_endx/startx_length go through resolve_addr_index$isra$0,
//   offset_pair/base_address/start_end/start_length are read directly, each
//   resulting [low, high) range is passed to the add-range callback, and an unknown
//   opcode reports "unrecognized DW_RLE value".
// FUNCTION DEF: add_ranges (begins here, continues below).
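The rnglists walker dispatches on the DW_RLE_* opcode that starts each range list entry. The opcode values and their operand shapes below come from the DWARF 5 specification and match the switch visible in the PTX; they are listed as a C enum for reference only.

/* DWARF 5 range list entry opcodes (.debug_rnglists). */
enum dw_rle {
  DW_RLE_end_of_list   = 0x00,  /* no operands: terminates the list           */
  DW_RLE_base_addressx = 0x01,  /* ULEB address index: new base address        */
  DW_RLE_startx_endx   = 0x02,  /* two ULEB address indices: start, end        */
  DW_RLE_startx_length = 0x03,  /* ULEB address index + ULEB length            */
  DW_RLE_offset_pair   = 0x04,  /* two ULEB offsets, relative to base address  */
  DW_RLE_base_address  = 0x05,  /* one address: new base address               */
  DW_RLE_start_end     = 0x06,  /* two addresses: start, end                   */
  DW_RLE_start_length  = 0x07   /* one address + ULEB length                   */
};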
// FUNCTION DEFs:
//   add_ranges (cont.): when the DIE carries explicit low/high pc attributes the
//     single range is added directly (resolving addrx indices as needed and adding
//     the base when high_pc is an offset); otherwise, for DWARF version <= 4, the
//     .debug_ranges list is walked inline (address pairs, with the all-ones value
//     for the unit's address size selecting a new base address; diagnostics
//     "ranges offset out of range" / ".debug_ranges"), and for DWARF 5 the work is
//     handed to add_ranges_from_rnglists$isra$0;
//   resolve_string: DW_FORM_string/strp values are returned directly, while
//     DW_FORM_strx values index .debug_str_offsets and then .debug_str
//     (see the sketch below);
//   read_attribute (begins here, continues below).
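The DW_FORM_strx path in resolve_string performs a double lookup: the attribute value indexes .debug_str_offsets, and the entry found there is an offset into .debug_str. A C sketch of that lookup; the section struct and names are illustrative:

#include <stdint.h>
#include <stddef.h>

struct strsec { const unsigned char *data; size_t size; };

/* Returns 1 on success, 0 on an out-of-range index or offset
   ("DW_FORM_strx value/offset out of range" in the real reader). */
static int
resolve_strx (const struct strsec *str_offsets, const struct strsec *str,
              uint64_t str_offsets_base, int is_dwarf64, int is_bigendian,
              uint64_t index, const char **out)
{
  unsigned entry_size = is_dwarf64 ? 8 : 4;
  uint64_t pos = str_offsets_base + index * entry_size;

  if (pos + entry_size > str_offsets->size)
    return 0;

  uint64_t off = 0;
  const unsigned char *p = str_offsets->data + pos;
  for (unsigned i = 0; i < entry_size; i++)
    off = is_bigendian ? (off << 8) | p[i] : off | ((uint64_t) p[i] << (8 * i));

  if (off >= str->size)
    return 0;

  *out = (const char *) str->data + off;
  return 1;
}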
[%r53],%r91; .loc 1 1078 9 mov.u32 %r45,1; $L273: .loc 1 1084FUNCTION DEF: read_attribute .func (.param .u32 %value_out) read_attributelocal .align 16 .b8 %frame_ar[208u64 %r138; .reg .u32pred %r147; .reg .predpredpred %r154; .reg .pred %r155; .reg .pred %r156; .reg .predpredpredpred %r171; .reg .pred %r172; .reg .predpred %r177; .reg .pred %r178; .reg .pred %r179; .reg .predpred %r186; .reg .predpredpred %r194; .reg .u32 %r195; .reg .predu64 %r202; .reg .u32 %r203; .reg .u32 %r205; .reg .u64u32 %r221; .reg .u64 %r223; .reg .u16u32 %r229; .reg .u64 %r231; .reg .u32 %r232; .reg .u64 %r234; .reg .u32u32u32 %r268; .reg .u64 %r272; .reg .u32u64u6432u32 %r331; .reg .u64 %r332; .reg .u64 %r333; .reg .u32 %r334; .reg .u64 %r336; .reg .u32 %r337; .reg .pred %r338; .reg .u64 %r341; .reg .predu32 %r359; .reg .u32 %r361; .reg .u64 %r362; .reg .u3232pred %r404; .reg .pred %r405; .reg .pred %r406; .reg .predpred %r424; .reg .predu32 %r450; .reg .u64 %r452; .reg .u32u32 %r460; .reg .pred %r461; .reg .u32 %r462; .reg .u32 %r463; .reg .pred %r464; .reg .u64 %r466; .reg .u32 %r468; .reg .predmov.u32 %r136,%ar0; mov.u64 %r137,%ar1; mov.u64 %r138,%ar2; mov.u32 %r139,%ar3; mov.u32 %r140,%ar4; mov.u32 %r141,%ar5; mov.u64 %r142,%ar6; mov.u64 %r143,%ar7; mov.u64 %r144,%ar8; $L348: .loc 1 763 3 mov.u64 %r145,0; st.u64 [%r144],%r145; st.u64 [%r144+8],%r145; .loc 1 765 3 setp.eq.u32 %r147,%r136,21; @ %r147 bra $L285; setp.gt.u32 %r148,%r136,21; @ %r148 bra $L286; setp.gt.u32 %r149,%r136,12; @ %r149 bra $L287; setp.gt.u32 %r150,%r136,10; @ %r150 bra $L288; setp.eq.u32 %r151,%r136,6; @ %r151 bra $L289; setp.gt.u32 %r152,%r136,6; @ %r152 bra $L290; setp.eq.u32 %r153,%r136,4; @ %r153 bra $L291; setp.gt.u32 %r154,%r136,4; @ %r154 bra $L292; setp.eq.u32 %r155,%r136,1; @ %r155 bra $L293; setp.eq.u32 %r156,%r136,3; @ %r156 bra $L294; bra $L295; $L292: setp.eq.u32 %r157,%r136,5; @ %r157 bra $L296; bra $L295; $L290: setp.eq.u32 %r158,%r136,9; @ %r158 bra $L297; setp.eq.u32 %r159,%r136,10; @ %r159 bra $L298; setp.eq.u32 %r160,%r136,7; @ %r160 bra $L299; setp.eq.u32 %r161,%r136,8; @ %r161 bra $L300; bra $L295; $L287: setp.eq.u32 %r162,%r136,17; @ %r162 bra $L301; setp.gt.u32 %r163,%r136,17; @ %r163 bra $L302; setp.eq.u32 %r164,%r136,15; @ %r164 bra $L303; setp.gt.u32 %r165,%r136,15; @ %r165 bra $L304; setp.eq.u32 %r166,%r136,13; @ %r166 bra $L305; setp.eq.u32 %r167,%r136,14; @ %r167 bra $L306; bra $L295; $L304: setp.eq.u32 %r168,%r136,16; @ %r168 bra $L307; bra $L295; $L302: setp.eq.u32 %r169,%r136,19; @ %r169 bra $L308; setp.eq.u32 %r170,%r136,20; @ %r170 bra $L309; setp.eq.u32 %r171,%r136,18; @ %r171 bra $L310; bra $L295; $L286: setp.eq.u32 %r172,%r136,32; @ %r172 bra $L311; setp.gt.u32 %r173,%r136,32; @ %r173 bra $L312; setp.eq.u32 %r174,%r136,26; @ %r174 bra $L313; setp.gt.u32 %r175,%r136,26; @ %r175 bra $L314; setp.eq.u32 %r176,%r136,24; @ %r176 bra $L315; setp.gt.u32 %r177,%r136,24; @ %r177 bra $L316; setp.eq.u32 %r178,%r136,22; @ %r178 bra $L317; setp.eq.u32 %r179,%r136,23; @ %r179 bra $L318; bra $L295; $L316: setp.eq.u32 %r180,%r136,25; @ %r180 bra $L319; bra $L295; $L314: setp.eq.u32 %r181,%r136,30; @ %r181 bra $L320; setp.gt.u32 %r182,%r136,30; @ %r182 bra $L321; setp.eq.u32 %r183,%r136,28; @ %r183 bra $L322; setp.eq.u32 %r184,%r136,29; @ %r184 bra $L323; setp.eq.u32 %r185,%r136,27; @ %r185 bra $L324; bra $L295; $L321: setp.eq.u32 %r186,%r136,31; @ %r186 bra $L325; bra $L295; $L312: setp.eq.u32 %r187,%r136,36; @ %r187 bra $L326; setp.gt.u32 %r188,%r136,36; @ %r188 bra $L327; setp.eq.u32 %r189,%r136,34; @ %r189 bra 
$L328; setp.eq.u32 %r190,%r136,35; @ %r190 bra $L329; setp.eq.u32 %r191,%r136,33; @ %r191 bra $L330; bra $L295; $L327: setp.gt.u32 %r192,%r136,7938; @ %r192 bra $L331; setp.gt.u32 %r193,%r136,7936; @ %r193 bra $L328; setp.le.u32 %r194,%r136,40; @ %r194 bra $L332; add.u32 %r195,%r136,-41; setp.gt.u32 %r196,%r195,3; @ ! %r196 bra $L383; bra $L295; $L331: setp.eq.u32 %r197,%r136,7968; @ %r197 bra $L334; setp.eq.u32 %r198,%r136,7969; @ %r198 bra $L323; bra $L295; $L293: .loc 1 768 21 st.u32 [%r144],%r136; .loc 1 769 2113141; call (%value_in),read_addres202,[%value_in]; } .loc 1 769 19 st.u64 [%r144+8],%r202; .loc 1 770 14 mov.u32 %r135,%r136; bra $L284; $L294: .loc 1 772 21 mov.u32 %r203,13; st.u32 [%r144],%r203138; call (%value_in),read_uint16,(%out_arg1); ld.param.u32 %r205,[%value_in]; } .loc 1 773 14 cvt.u16.u32 %r210,%r205; cvt.u64.u16 %r209,%r2113r209; call (%value_in),advance135,%r211; bra $L284; $L291: .loc 1 775 21 mov.u32 %r212,13; st.u32 [%r144],%r212138; call (%value_in),read_uint32,(.loc 1 776 14 cvt.u64.u32 %r217,%r21413r217; call (%value_in),advance135,%r218; bra $L284; $L296: .loc 1 778 21 mov.u32 %r219,3; st.u32 [%r144],%r219;138; call (%value_in),read_uint16,(%out_arg1); ld.param.u32 %r221,[%value_in]; } cvt.u16.u32 %r224,%r221; cvt.u64.u16 %r223,%r224; st.u64 [%r144+8],%r223; .loc 1 780 14 mov.u32 %r135,1; bra $L284; $L289: .loc 1 782 21 mov.u32 %r225,3; st.u32 [%r144],%r225;138; call (%value_in),read_uint32,(%out_arg1); ld.param.u32 %r227,[%value_in]; } cvt.u64.u32 %r228,%r227; st.u64 [%r144+8],%r228; .loc 1 784 14 mov.u32 %r135,1; bra $L284; $L299: .loc 1 786 21 mov.u32 %r229,3; st.u32 [%r144],%r229; .loc 1 787 21138; call (%value_in),read_uint64,(%out_arg1); ld.param.u64 %r231,[%value_in]; } .loc 1 787 19 st.u64 [%r144+8],%r231; .loc 1 788 14 mov.u32 %r135,1; bra $L284; $L320: .loc 1 790 21 mov.u32 %r232,13; st.u32 [%r144],%r232; .loc 1 791 14 mov.u64 %r234,16138advance135,%r235; bra $L284; $L300: .loc 1 793 21 mov.u32 %r236,5; st.u32 [%r144],%r236; .loc 1 458 15 ld.u64 %r76,[%r138+16]; .loc 1 459 16 ld.u64 %r77,[%r138+24];strnle239,[%value_in]; } .loc 1 464 10 add.u64 %r79,%r239,1; .loc 1 428 6 setp.lt.u64 %r241,%r77,%r79; @ %r241 bra $L336; .loc 1 448 12 add.u64 %r242,%r76,%r79; st.u64 [%r138+16],%r242; .loc 1 449 13 sub.u64 %r243,%r77,%r79; st.u64 [%r138+24],%r243; mov.u32 %r135,1; bra $L337; $L336: .loc 1 431 11 ld.u32 %r135,[%r138+56]; .loc 1 431 6 setp.ne.u32 %r244,%r135,0; @ %r244 bra $L370; .loc 1 418 38 ld.u64 %r249,[%r138+8]; sub.u64 %r248,%r76,%r249; .loc 1 417 3 st.u32 [%stack+16],%r248; ld.u64 %r250,[%r138]; st.u64 [%stack+8],%r250; cvta.const.u64 %r251,$LC2; st.u64 [%stack],%r251; cvta.const.u64 %r247,$LC1; mov2462stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r252,[%value_in]; } .loc 1 419 3 ld.u64 %r257,[%r138+48]; ld.u64 %r258,[%r138+40]; { $LCT27frame135; call %r258,$LCT27; } .loc 1 434 31 mov.u32 %r259,1; st.u32 [%r138+56],%r259; .loc 1 467 12 mov.u64 %r76,0; bra $L337; $L370: mov.u32 %r135,0; mov.u64 %r76,0; $L337: .loc 1 794 21 st.u64 [%r144+8],%r76; .loc 1 795 40 bra $L284; $L297: .loc 1 797 21 mov.u32 %r260,13; st.u32 [%r144],%r260; .loc 1 79read_uleb128,(%out_arg1); ld.param.u64 %r262,[%value_in]; }13r262; call (%value_in),advance135,%r265; bra $L284; $L298: .loc 1 800 21 mov.u32 %r266,13; st.u32 [%r144],%r266138; call (%value_in),read_byte,(%out_arg1); ld.param.u32 %r268,[%value_in]; } .loc 1 801 14 cvt.u32.u32 %r273,%r268; cvt.u64.u8 %r272,%r27313r272; call (%value_in),advance135,%r274; bra $L284; $L288: .loc 1 803 21 mov.u32 %r275,3; 
st.u32 [%r144],%r275;138; call (%value_in),read_byte,(%out_arg1); ld.param.u32 %r277,[%value_in]; } cvt.u32.u32 %r280,%r277; cvt.u64.u8 %r279,%r280; st.u64 [%r144+8],%r279; .loc 1 805 14 mov.u32 %r135,1; bra $L284; $L305: .loc 1 811 21 mov.u32 %r281,4; st.u32 [%r144],%r281; .loc 1 812 21138; call (%value_in),read_sleb128,(%out_arg1); ld.param.u64 %r283,[%value_in]; } .loc 1 812 19 st.u64 [%r144+8],%r283; .loc 1 813 14 mov.u32 %r135,1; bra $L284; $L306: .loc 1 572 6 setp.eq.u32 %r284,%r139,0; @ %r284 bra $L338; .loc 1 573 12138; call (%value_in),read_uint64,(%out_arg1); ld.param.u64 %r286,[%value_in]; } mov.u64 %r88,%r286; bra $L339; $L3138; call (%value_in),read_uint32,(%out_arg1); ld.param.u32 %r288,[%value_in]; } cvt.u64.u32 %r88,%r288; $L339: .loc 1 819 5 ld.u64 %r289,[%r142+104]; setp.gt.u64 %r290,%r289,%r88; @ %r290 bra $L340; .loc 1 418 38 ld.u64 %r295,[%r138+16]; ld.u64 %r296,[%r138+8]; sub.u64 %r294,%r295,%r296; .loc 1 417 3 st.u32 [%stack+16],%r294; ld.u64 %r297,[%r138]; st.u64 [%stack+8],%r297; cvta.const.u64 %r298,$LC17; st.u64 [%stack],%r298; cvta.const.u64 %r293,$LC1; mov292293stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r299,[%value_in]; } .loc 1 419 3 ld.u64 %r304,[%r138+48]; ld.u64 %r305,[%r138+40]; mov.u32 %r303,0; { $LCT28frame3; call %r305,$LCT28; } .loc 1 822 13 mov.u32 %r135,%r303; .loc 1 420 1 bra $L284; $L340: .loc 1 824 16 mov.u32 %r306,5; st.u32 [%r144],%r306; .loc 1 826 51 ld.u64 %r308,[%r142+32]; add.u64 %r307,%r308,%r88; .loc 1 825 16 st.u64 [%r144+8],%r307; .loc 1 827 9 mov.u32 %r135,1; bra $L284; $L325: .loc 1 572 6 setp.eq.u32 %r309,%r139,0; @ %r309 bra $L341; .loc 1 573 12138; call (%value_in),read_uint64,(%out_arg1); ld.param.u64 %r311,[%value_in]; } mov.u64 %r97,%r311; bra $L342; $L341138; call (%value_in),read_uint32,(%out_arg1); ld.param.u32 %r313,[%value_in]; } cvt.u64.u32 %r97,%r313; $L342: .loc 1 834 5 ld.u64 %r314,[%r142+128]; setp.gt.u64 %r315,%r314,%r97; @ %r315 bra $L343; .loc 1 418 38 ld.u64 %r320,[%r138+16]; ld.u64 %r321,[%r138+8]; sub.u64 %r319,%r320,%r321; .loc 1 417 3 st.u32 [%stack+16],%r319; ld.u64 %r322,[%r138]; st.u64 [%stack+8],%r322; cvta.const.u64 %r323,$LC18; st.u64 [%stack],%r323; cvta.const.u64 %r318,$LC1; mov17318stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r324,[%value_in]; } .loc 1 419 3 ld.u64 %r329,[%r138+48]; ld.u64 %r330,[%r138+40]; mov.u32 %r328,0; { $LCT2929%r330,$LCT29; } .loc 1 837 13 mov.u32 %r135,%r328; .loc 1 420 1 bra $L284; $L343: .loc 1 839 16 mov.u32 %r331,5; st.u32 [%r144],%r331; .loc 1 841 56 ld.u64 %r333,[%r142+56]; add.u64 %r332,%r333,%r97; .loc 1 840 16 st.u64 [%r144+8],%r332; .loc 1 842 9 mov.u32 %r135,1; bra $L284; $L303: .loc 1 845 21 mov.u32 %r334,3; st.u32 [%r144],%r334; .loc 1 846 21138; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r336,[%value_in]; } .loc 1 846 19 st.u64 [%r144+8],%r336; .loc 1 847 14 mov.u32 %r135,1; bra $L284; $L307: .loc 1 849 21 mov.u32 %r337,8; st.u32 [%r144],%r337; .loc 1 850 10 setp.ne.u32 %r338,%r140,2; @ %r338 bra $L344; .loc 1 851 1613141; call (%value_in),read_addres41,[%value_in]; } .loc 1 851 14 st.u64 [%r144+8],%r341; .loc 1 854 14 mov.u32 %r135,1; bra $L284; $L344: .loc 1 572 6 setp.eq.u32 %r342,%r139,0; @ %r342 bra $L345; .loc 1 573 12138; call (%value_in),read_uint64,(%out_arg1); ld.param.u64 %r344,[%value_in]; } mov.u64 %r106,%r344; bra $L346; $L345138; call (%value_in),read_uint32,(%out_arg1); ld.param.u32 %r346,[%value_in]; } cvt.u64.u32 %r106,%r346; $L346: .loc 1 853 14 st.u64 [%r144+8],%r106; .loc 1 854 14 mov.u32 %r135,1; bra $L284; 
$L301: .loc 1 856 21 mov.u32 %r347,7; st.u32 [%r144],%r347;138; call (%value_in),read_byte,(%out_arg1); ld.param.u32 %r349,[%value_in]; } cvt.u32.u32 %r352,%r349; cvt.u64.u8 %r351,%r352; st.u64 [%r144+8],%r351; .loc 1 858 14 mov.u32 %r135,1; bra $L284; $L310: .loc 1 860 21 mov.u32 %r353,7; st.u32 [%r144],%r353;138; call (%value_in),read_uint16,(%out_arg1); ld.param.u32 %r355,[%value_in]; } cvt.u16.u32 %r358,%r355; cvt.u64.u16 %r357,%r358; st.u64 [%r144+8],%r357; .loc 1 862 14 mov.u32 %r135,1; bra $L284; $L308: .loc 1 864 21 mov.u32 %r359,7; st.u32 [%r144],%r359;138; call (%value_in),read_uint32,(%out_arg1); ld.param.u32 %r361,[%value_in]; } cvt.u64.u32 %r362,%r361; st.u64 [%r144+8],%r362; .loc 1 866 14 mov.u32 %r135,1; bra $L284; $L309: .loc 1 868 21 mov.u32 %r363,7; st.u32 [%r144],%r363; .loc 1 869 21138; call (%value_in),read_uint64,(%out_arg1); ld.param.u64 %r365,[%value_in]; } .loc 1 869 19 st.u64 [%r144+8],%r365; .loc 1 870 14 mov.u32 %r135,1; bra $L284; $L285: .loc 1 872 21 mov.u32 %r366,7; st.u32 [%r144],%r366; .loc 1 873 21138; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r368,[%value_in]; } .loc 1 873 19 st.u64 [%r144+8],%r368; .loc 1 874 14 mov.u32 %r135,1; bra $L284; $L317: .loc 1 879 9138; call (%value_in),read_uleb128,(setp.ne.u64 %r371,%r370,33; @ %r371 bra $L347; .loc 1 418 38 ld.u64 %r376,[%r138+16]; ld.u64 %r377,[%r138+8]; sub.u64 %r375,%r376,%r377; .loc 1 417 3 st.u32 [%stack+16],%r375; ld.u64 %r378,[%r138]; st.u64 [%stack+8],%r378; cvta.const.u64 %r379,$LC19; st.u64 [%stack],%r379; cvta.const.u64 %r374,$LC1; mov73374stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r380,[%value_in]; } .loc 1 419 3 ld.u64 %r385,[%r138+48]; ld.u64 %r386,[%r138+40]; mov.u32 %r384,0; { $LCT385%r386,$LCT30; } .loc 1 884 13 mov.u32 %r135,%r384; .loc 1 420 1 bra $L284; $L347: .loc 1 886 9 cvt.u32.u64 %r136,%r370; mov.u64 %r137,0; bra $L348; $L318: .loc 1 891 21 mov.u32 %r387,10; st.u32 [%r144],%r387; .loc 1 572 6 setp.eq.u32 %r388,%r139,0; @ %r388 bra $L349; .loc 1 573 12138; call (%value_in),read_uint64,(%out_arg1); ld.param.u64 %r390,[%value_in]; } mov.u64 %r115,%r390; bra $L350; $L349138; call (%value_in),read_uint32,(%out_arg1); ld.param.u32 %r392,[%value_in]; } cvt.u64.u32 %r115,%r392; $L350: .loc 1 892 19 st.u64 [%r144+8],%r115; .loc 1 893 14 mov.u32 %r135,1; bra $L284; $L315: .loc 1 895 21 mov.u32 %r393,14; st.u32 [%r144],%r393; .loc 1 896read_uleb128,(%out_arg1); ld.param.u64 %r395,[%value_in]; }13r395; call (%value_in),advance98,[%value_in]; } mov.u32 %r135,%r398; bra $L284; $L319: .loc 1 898 21 mov.u32 %r399,3; st.u32 [%r144],%r399; .loc 1 899 19 mov.u64 %r400,1; st.u64 [%r144+8],%r400; .loc 1 900 14 cvt.u32.u64 %r135,%r400; bra $L284; $L311: .loc 1 902 21 mov.u32 %r401,11; st.u32 [%r144],%r401; .loc 1 903 21138; call (%value_in),read_uint64,(%out_arg1); ld.param.u64 %r403,[%value_in]; } .loc 1 903 19 st.u64 [%r144+8],%r403; .loc 1 904 14 mov.u32 %r135,1; bra $L284; $L332: setp.eq.u32 %r404,%r136,38; @ %r404 bra $L351; setp.gt.u32 %r405,%r136,38; @ %r405 bra $L352; setp.eq.u32 %r406,%r136,37; @ %r406 bra $L353; mov.u32 %r135,0; bra $L284; $L352: setp.eq.u32 %r407,%r136,39; @ ! 
%r407 bra $L384; bra $L354; $L313: .loc 1 9138; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r409,[%value_in]; } mov.u64 %r72,%r409; .loc 1 914 6 bra $L356; $L353138; call (%value_in),read_byte,(%out_arg1); ld.param.u32 %r411,[%value_in]; } .loc 1 916 13 cvt.u32.u32 %r413,%r411; cvt.u64.u8 %r72,%r413; .loc 1 917 6 bra $L356; $L351138; call (%value_in),read_uint16,(%out_arg1); ld.param.u32 %r415,[%value_in]; } .loc 1 919 13 cvt.u16.u32 %r417,%r415; cvt.u64.u16 %r72,%r417; .loc 1 920 6 call (%value_in),read_uint24,(%out_arg1); ld.param.u32 %r419,[%value_in]; } .loc 1 922 13 cvt.u64.u32 %r72,%r419; .loc 1 923 6 bra $L356; $L384138; call (%value_in),read_uint32,(%out_arg1); ld.param.u32 %r421,[%value_in]; } .loc 1 925 13 cvt.u64.u32 %r72,%r421; $L356: .loc 1 931 16 mov.u32 %r422,6; st.u32 [%r144],%r422; .loc 1 932 14 st.u64 [%r144+8],%r72; .loc 1 933 9 mov.u32 %r135,1; bra $L284; $L383: setp.eq.u32 %r423,%r136,42; @ %r423 bra $L357; setp.gt.u32 %r424,%r136,42; @ %r424 bra $L358; setp.eq.u32 %r425,%r136,41; @ %r425 bra $L359; mov.u32 %r135,0; bra $L284; $L358: setp.eq.u32 %r426,%r136,43; @ ! %r426 bra $L385; bra $L360; $L324: .loc 1 9438; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r428,[%value_in]; } mov.u64 %r73,%r428; .loc 1 944 6 bra $L362; $L359138; call (%value_in),read_byte,(%out_arg1); ld.param.u32 %r430,[%value_in]; } .loc 1 946 13 cvt.u32.u32 %r432,%r430; cvt.u64.u8 %r73,%r432; .loc 1 947 6 bra $L362; $L35138; call (%value_in),read_uint16,(%out_arg1); ld.param.u32 %r434,[%value_in]; } .loc 1 949 13 cvt.u16.u32 %r436,%r434; cvt.u64.u16 %r73,%r436; .loc 1 950 6 call (%value_in),read_uint24,(952 13 cvt.u64.u32 %r73,%r438; .loc 1 953 6 bra $L362; $L385138; call (%value_in),read_uint32,(%out_arg1); ld.param.u32 %r440,[%value_in]; } .loc 1 955 13 cvt.u64.u32 %r73,%r440; $L362: .loc 1 961 16 mov.u32 %r441,2; st.u32 [%r144],%r441; .loc 1 962 14 st.u64 [%r144+8],%r73; .loc 1 963 9 mov.u32 %r135,1; bra $L284; $L322: .loc 1 966 21 mov.u32 %r442,10; st.u32 [%r144],%r44138; call (%value_in),read_uint32,(%out_arg1); ld.param.u32 %r444,[%value_in]; } cvt.u64.u32 %r445,%r444; st.u64 [%r144+8],%r445; .loc 1 968 14 mov.u32 %r135,1; bra $L284; $L326: .loc 1 970 21 mov.u32 %r446,10; st.u32 [%r144],%r446; .loc 1 971 21138; call (%value_in),read_uint64,(%out_arg1); ld.param.u64 %r448,[%value_in]; } .loc 1 971 19 st.u64 [%r144+8],%r448; .loc 1 972 14 mov.u32 %r135,1; bra $L284; $L330: .loc 1 974 21 mov.u32 %r449,3; st.u32 [%r144],%r449; .loc 1 975 19 st.u64 [%r144+8],%r137; .loc 1 976 14 mov.u32 %r135,1; bra $L284; $L328: .loc 1 980 21 mov.u32 %r450,10; st.u32 [%r144],%r450; .loc 1 981 21138; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r452,[%value_in]; } .loc 1 981 19 st.u64 [%r144+8],%r452; .loc 1 982 14 mov.u32 %r135,1; bra $L284; $L329: .loc 1 984 21 mov.u32 %r453,12; st.u32 [%r144],%r453; .loc 1 985 21138; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r455,[%value_in]; } .loc 1 985 19 st.u64 [%r144+8],%r455; .loc 1 986 14 mov.u32 %r135,1; bra $L284; $L334: .loc 1 572 6 setp.eq.u32 %r456,%r139,0; @ %r456 bra $L363; .loc 1 573 12138; call (%value_in),read_uint64,(%out_arg1); ld.param.u64 %r458,[%value_in]; } mov.u64 %r117,%r458; bra $L364; $L363138; call (%value_in),read_uint32,(%out_arg1); ld.param.u32 %r460,[%value_in]; } cvt.u64.u32 %r117,%r460; $L364: .loc 1 996 19 st.u64 [%r144+8],%r117; .loc 1 997 10 setp.ne.u64 %r461,%r143,0; @ %r461 bra $L365; $L368: .loc 1 999 18 mov.u32 %r462,0; st.u32 [%r144],%r462; .loc 1 1000 11 mov.u32 %r135,1; bra $L284; 
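The cases that converge on $L356 and $L362 above each fetch a fixed-width operand (read_byte, read_uint16, read_uint24 or read_uint32), zero-extend it to 64 bits, and store a tag/value pair. A C sketch of such fixed-width reads, assuming little-endian data purely for illustration (the byte order actually used depends on the file being read) and a hypothetical pointer cursor:

/* Read n bytes (1 <= n <= 8) as an unsigned little-endian integer and
   advance the cursor; a sketch only, not the package's helper. */
static unsigned long long sketch_read_uint_le (const unsigned char **pp, int n)
{
  unsigned long long val = 0;
  for (int i = 0; i < n; i++)
    val |= (unsigned long long) *(*pp)++ << (8 * i);
  return val;
}

/* The 1-, 2-, 3- and 4-byte forms dispatched above would correspond to
   sketch_read_uint_le (&p, 1), (&p, 2), (&p, 3) and (&p, 4). */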
$L365: .loc 1 1002 21 mov.u32 %r463,9; st.u32 [%r144],%r463; .loc 1 1003 14 mov.u32 %r135,1; bra $L284; $L323: .loc 1 572 6 setp.eq.u32 %r464,%r139,0; @ %r464 bra $L366; .loc 1 573 12138; call (%value_in),read_uint64,(%out_arg1); ld.param.u64 %r466,[%value_in]; } mov.u64 %r119,%r466; bra $L367; $L366138; call (%value_in),read_uint32,(%out_arg1); ld.param.u32 %r468,[%value_in]; } cvt.u64.u32 %r119,%r468; $L367: .loc 1 1009 5 setp.eq.u64 %r469,%r143,0; @ %r469 bra $L368; .loc 1 1014 5 ld.u64 %r470,[%r143+160]; setp.gt.u64 %r471,%r470,%r119; @ %r471 bra $L369; .loc 1 418 38 ld.u64 %r476,[%r138+16]; ld.u64 %r477,[%r138+8]; sub.u64 %r475,%r476,%r477; .loc 1 417 3 st.u32 [%stack+16],%r475; ld.u64 %r478,[%r138]; st.u64 [%stack+8],%r478; cvta.const.u64 %r479,$LC20; st.u64 [%stack],%r479; cvta.const.u64 %r474,$LC1; mov47374stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r480,[%value_in]; } .loc 1 419 3 ld.u64 %r485,[%r138+48]; ld.u64 %r486,[%r138+40]; mov.u32 %r484,0; { $LCT31485%r486,$LCT31; } .loc 1 1017 13 mov.u32 %r135,%r484; .loc 1 420 1 bra $L284; $L369: .loc 1 1019 16 mov.u32 %r487,5; st.u32 [%r144],%r487; .loc 1 1021 59 ld.u64 %r489,[%r143+88]; add.u64 %r488,%r489,%r119; .loc 1 1020 16 st.u64 [%r144+8],%r488; .loc 1 1022 9 mov.u32 %r135,1; bra $L284; $L295: .loc 1 418 38 ld.u64 %r494,[%r138+16]; ld.u64 %r495,[%r138+8]; sub.u64 %r493,%r494,%r495; .loc 1 417 3 st.u32 [%stack+16],%r493; ld.u64 %r496,[%r138]; st.u64 [%stack+8],%r496; cvta.const.u64 %r497,$LC21; st.u64 [%stack],%r497; cvta.const.u64 %r492,$LC1; movstack; call (%value_in),sn,%out_arg4); ld.param.u32 %r498,[%value_in]; } .loc 1 419 3 ld.u64 %r503,[%r138+48]; ld.u64 %r504,[%r138+40]; mov.u32 %r502,0; { $LCT3250ame2; call %r504,$LCT32; } .loc 1 1026 14 mov.u32 %r135,%r502; $L284: .loc 1 1028 1 mov.u32 %value,%r135;read_line_header_format_entrieslocal .align 16 .b8 %frame_ar[224.reg .u64 %r45; .reg .u32 %r52; .reg .u64 %r60; .reg .u64predu64 %r216; .reg .pred %r217; .reg .pred %r218; .reg .pred %r219; .reg .u64 %r221; .reg .u64 %r224; .reg .u64predpred %r278; .reg .u64 %r280; .reg .u64 %r281; .reg .u64 %r282; mov.u64 %r101,%ar0; mov.u64 %r102,%ar1; mov.u64 %r103,%ar2; mov.u64 %r104,%ar3; mov.u64 %r105,%ar4; mov.u64 %r106,%ar5; mov.u64 %r107,%ar6; .loc 1 2541 1104; call (%value_in),read_byte,(%out_arg1); ld.param.u32 %r109,[%value_in]; } .loc 1 2541 17 cvt.u32.u32 %r111,%r109; cvt.u64.u8 %r41,%r111; .loc 1 2542 6 setp.eq.u64 %r112,%r41,0; @ %r112 bra $L407; .loc 1 2547 4 ld.u64 %r117,[%r104+48]; ld.u64 %r118,[%r104+40]; shl.b64 %r119,%r41118117; call (%value_in),backtrace_allocld.param.u64 %r120,[%value_in]; } mov.u64 %r38,%r120; .loc 1 2552 10 setp.eq.u64 %r122,%r38,0; @ %r122 bra $L408; mov.u64 %r98,%r38; .loc 1 2555 14 mov.u64 %r42,0; $L389: .loc 1 2557 2104; call (%value_in),read_uleb128,(.loc 1 2557 22 st.u32 [%r98],%r124;104; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r126,[%value_in]; } .loc 1 2558 22 st.u32 [%r98+4],%r126; .loc 1 2555 39 add.u64 %r42,%r42,1; .loc 1 2555 7 add.u64 %r98,%r98,8; setp.ne.u64 %r127,%r41,%r42; @ %r127 bra $L389; bra $L387; $L407: .loc 1 2543 13 mov.u64 %r38,%r41; $L387: .loc 1 2562 17104; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r129,[%value_in]; } .loc 1 2563 6390; .loc 1 2565 15 st.u64 [%r106],%r129; .loc 1 2566 15 st.u64 [%r107],%r129; .loc 1 2567 11 mov.u32 %r100,1; .loc 1 2568 7 bra $L391; $L390: .loc 1 2572 5 shl.b64 %r32,%r129,3; ld.u64 %r137,[%r104+48]; ld.u64 %r138,[%r104+40]138137; call (%value_in),backtrace_allocld.param.u64 %r139,[%value_in]; } .loc 1 
2574 6 setp.eq.u64 %r141,%r139,0; @ %r141 bra $L409; mov.u64 %r40,%r139; .loc 1 2579 10 mov.u64 %r45,0; setp.ne.u64 %r278,%r41,0; .loc 1 2458 34 add.u64 %r280,%r102,56; add.u64 %r281,%frame,200; .loc 1 2464 9 add.u64 %r282,%frame,216; $L406: .loc 1 2452 8 mov.u64 %r142,0; st.u64 [%frame+216],%r142; .loc 1 2453 3 @ %r278 bra $L392; $L402: .loc 1 418 38 ld.u64 %r148,[%r104+16]; ld.u64 %r149,[%r104+8]; sub.u64 %r147,%r148,%r149; .loc 1 417 3 st.u32 [%stack+16],%r147; ld.u64 %r150,[%r104]; st.u64 [%stack+8],%r150; cvta.const.u64 %r151,$LC22; st.u64 [%stack],%r151; cvta.const.u64 %r146,$LC1; movframe146stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r152,[%value_in]; } .loc 1 419 3 ld.u64 %r157,[%r104+48]; ld.u64 %r158,[%r104+40]; mov.u32 %r156,0; { $LCT33frame156; call %r158,$LCT33; } bra $L393; $L392: mov.u64 %r99,%r38; .loc 1 2451 7 mov.u64 %r80,%r142; .loc 1 2453 10 mov.u64 %r65,%r80; $L400: .loc 1 2457 12 ld.u64 %r169,[%r102+8]; ld.u32 %r170,[%r105+4]; ld.u32 %r171,[%r103+40]; ld.u32 %r172,[%r103+44]; ld.u32 %r173,[%r99+4]; mov.u64 %r16017r16010280; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r169; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r281; call (%value_in),read_attribute,%out_arg8,%out_arg9); ld.param.u32 %r174,[%value_in]; } .loc 1 2457 10 setp.eq.u32 %r175,%r174,0; @ %r175 bra $L393; .loc 1 2461 25 ld.u32 %r52,[%r99]; .loc 1 2461 7 setp.eq.u32 %r176,%r52,1; @ %r176 bra $L395; setp.eq.u32 %r177,%r52,2; @ %r177 bra $L396; bra $L397; $L395: .loc 1 2464 9 ld.u64 %r187,[%r104+48]; ld.u64 %r188,[%r104+40]; ld.u64 %r190,[%r103+64]; ld.u32 %r191,[%r102+200]; ld.u32 %r192,[%r103+4419190281.param .u64 %out_arg8; st.param.u64 [%out_arg8],%r282; call (%value_in),resolve_string,%out_arg8); ld.param.u32 %r193,[%value_in]; } .loc 1 2464 7 setp.eq.u32 %r194,%r193,0; @ ! 
%r194 bra $L397; bra $L393; $L396: .loc 1 2471 7 ld.u32 %r195,[%frame+200]; setp.ne.u32 %r196,%r195,3; @ %r196 bra $L397; .loc 1 2473 17 ld.u64 %r60,[%frame+208]; .loc 1 2473 11 ld.u64 %r197,[%r105+40]; setp.lt.u64 %r198,%r60,%r197; @ %r198 bra $L398; .loc 1 418 38 ld.u64 %r203,[%r104+16]; ld.u64 %r204,[%r104+8]; sub.u64 %r202,%r203,%r204; .loc 1 417 3 st.u32 [%stack+16],%r202; ld.u64 %r205,[%r104]; st.u64 [%stack+8],%r205; cvta.const.u64 %r206,$LC23; st.u64 [%stack],%r206; cvta.const.u64 %r201,$LC1; mov20020stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r207,[%value_in]; } .loc 1 419 3 ld.u64 %r212,[%r104+48]; ld.u64 %r213,[%r104+40]; cvt.u32.u64 %r211,%r160; { $LCT3421frame211; call %r21,$LCT34; } .loc 1 420 1 bra $L393; $L398: .loc 1 2480 12 ld.u64 %r214,[%r105+48]; shl.b64 %r215,%r60,3; add.u64 %r216,%r214,%r215; ld.u64 %r80,[%r216]; $L397: .loc 1 2453 35 add.u64 %r65,%r65,1; .loc 1 2453 3 add.u64 %r99,%r99,8; setp.ne.u64 %r217,%r41,%r65; @ %r217 bra $L400; .loc 1 2489 12 ld.u64 %r66,[%frame+216]; .loc 1 2489 6 setp.eq.u64 %r218,%r66,0; @ %r218 bra $L402; .loc 1 2496 6 setp.ne.u64 %r219,%r80,0; @ %r219 bra $L403; .loc 1 2497 13 st.u64 [%r40],%r66; bra $L404; $L403: .loc 1 2504 1780221,[%value_in]; } .loc 1 2505 1224,[%value_in]; } .loc 1 2506 20 ld.u64 %r230,[%r104+48]; ld.u64 %r231,[%r104+40]; .loc 1 2506 52 add.u64 %r232,%r221,%r224; .loc 1 2506 20 add.u64 %r233,%r2323230; call (%value_in),backtrace_allocld.param.u64 %r234,[%value_in]; } .loc 1 2508 10 setp.eq.u64 %r236,%r234,0; @ %r236 bra $L393; .loc 1 2510 71243,[%value_in]; } .loc 1 2514 18 add.u64 %r245,%r234,%r221; mov.u32 %r246,47; st.u8 [%r245],%r246; .loc 1 2515 27 add.u64 %r247,%r221,1; add.u64 %r248,%r234,%r247; .loc 1 2515 7 add.u64 %r249,%r224,1; ld.u64 %r250,[%frame+216]249257,[%value_in]; } .loc 1 2516 15 st.u64 [%r40],%r234; bra $L404; $L393: .loc 1 2584 4 ld.u64 %r264,[%r104+48]; ld.u64 %r265,[%r104+40]; {r15264; call backtrace_free); } .loc 1 2587 8 mov.u32 %r100,0; .loc 1 2588 4 bra $L391; $L404: .loc 1 2579 33 add.u64 %r45,%r45,1; .loc 1 2579 3 add.u64 %r40,%r40,8; setp.ne.u64 %r266,%r129,%r45; @ %r266 bra $L406; .loc 1 2592 11 st.u64 [%r106],%r129; .loc 1 2593 11 st.u64 [%r107],%r139; .loc 1 2595 7 mov.u32 %r100,1; bra $L391; $L409: .loc 1 2576 11 cvt.u32.u64 %r100,%r139; $L391: .loc 1 2598 6 setp.eq.u64 %r267,%r38,0; @ %r267 bra $L386; .loc 1 2599 5 ld.u64 %r273,[%r104+48]; ld.u64 %r274,[%r104+40]; shl.b64 %r275,%r41,3; {27743; call backtrace_free); } bra $L386; $L408: .loc 1 2553 9 cvt.u32.u64 %r100,%r120; $L386: .loc 1 2604 1 mov.u32 %value,%r100;read_referenced_name .func (.param .u64 %value_out) read_referenced_namelocal .align 16 .b8 %frame_ar[288predpredpredpredpredpredpred %r176; .reg .u64 %r182; .reg .predu64 %r190; mov.u64 %r71,%ar0; mov.u64 %r72,%ar1; mov.u64 %r73,%ar2; mov.u64 %r74,%ar3; mov.u64 %r75,%ar4; .loc 1 3054 17 ld.u64 %r22,[%r72+16]; .loc 1 3054 6 setp.gt.u64 %r76,%r22,%r73; @ %r76 bra $L429; .loc 1 3055 17 sub.u64 %r23,%r73,%r22; .loc 1 3055 43 ld.u64 %r24,[%r72+8]; .loc 1 3055 7 setp.lt.u64 %r77,%r23,%r24; @ %r77 bra $L430; $L429: .loc 1 3057 7 mov.u32 %r80,0; cvta.const.u64 %r79,$LC24; { $LCT3535; } .loc 1 3060 14 mov.u64 %r70,0; bra $L428; $L430: .loc 1 3065 17 cvta.const.u64 %r81,$LC25; st.u64 [%frame+200],%r81; .loc 1 3066 18 ld.u64 %r82,[%r71+56]; st.u64 [%frame+208],%r82; .loc 1 3067 31 ld.u64 %r84,[%r72]; add.u64 %r83,%r84,%r23; .loc 1 3067 16 st.u64 [%frame+216],%r83; .loc 1 3068 36 add.u64 %r85,%r22,%r24; sub.u64 %r86,%r85,%r73; .loc 1 3068 17 st.u64 [%frame+224],%r86; .loc 
1 3069 25 ld.u32 %r87,[%r71+200]; st.u32 [%frame+232],%r87; .loc 1 3070 27 st.u64 [%frame+240],%r74; .loc 1 3071 17 st.u64 [%frame+248],%r75; .loc 1 3072 31 mov.u32 %r88,0; st.u32 [%frame+256],%r88; .loc 1 3074 10 add.u64 %r187,%frame,7; call (%value_in),read_uleb128,(75 6 setp.ne.u64 %r92,%r91,0; @ %r92 bra $L432; .loc 1 418 38 ld.u64 %r97,[%frame+216]; ld.u64 %r98,[%frame+208]; sub.u64 %r96,%r97,%r98; .loc 1 417 3 st.u32 [%stack+16],%r96; ld.u64 %r99,[%frame+200]; st.u64 [%stack+8],%r99; cvta.const.u64 %r100,$LC26; st.u64 [%stack],%r100; cvta.const.u64 %r95,$LC1; mov.u64 %r94,20stack; call (%value_in),snld.u64 %r106,[%frame+248]; ld.u64 %r107,[%frame+240]; { $LCT36framecall %r107,(%out_arg1,%out_arg2,%out_arg3),$LCT36; } .loc 1 3078 14 mov.u64 %r70,%r91; .loc 1 420 1 bra $L428; $L432: .loc 1 3081 12 add.u64 %r112,%r72,1111call (%value_in),lookup_abbrevld.param.u64 %r113,[%value_in]; } .loc 1 3082 6 setp.eq.u64 %r114,%r113,0; @ %r114 bra $L445; .loc 1 3085 7 mov.u64 %r115,0; st.u64 [%frame+272],%r115; .loc 1 3086 3 ld.u64 %r116,[%r113+16]; setp.eq.u64 %r117,%r116,0; @ %r117 bra $L446; .loc 1 3092 7 add.u64 %r58,%r71,56; .loc 1 3086 10 mov.u64 %r57,%r115; .loc 1 3116 11 add.u64 %r189,%frame,264; .loc 1 3103 9 add.u64 %r190,%frame,272; $L444: shl.b64 %r69,%r57,4; .loc 1 3090 64 ld.u64 %r118,[%r113+24]; add.u64 %r36,%r118,%r69; .loc 1 3090 12 ld.u64 %r128,[%r71+8]; ld.u32 %r129,[%r72+48]; ld.u32 %r130,[%r72+40]; ld.u32 %r131,[%r72+44]; ld.u64 %r133,[%r36+8]; ld.u32 %r134,[%r36+4]1118758; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r128; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%frame; call (%value_in),read_attribute,%out_arg8,%out_arg9); ld.param.u32 %r135,[%value_in]; } .loc 1 3090 10 setp.eq.u32 %r136,%r135,0; @ %r136 bra $L434; .loc 1 3095 28 ld.u64 %r137,[%r113+24]; add.u64 %r42,%r137,%r69; .loc 1 3095 31 ld.u32 %r43,[%r42]; .loc 1 3095 7 setp.eq.u32 %r138,%r43,71; @ %r138 bra $L435; setp.gt.u32 %r139,%r43,71; @ %r139 bra $L436; setp.eq.u32 %r140,%r43,3; @ %r140 bra $L437; bra $L438; $L436: setp.eq.u32 %r141,%r43,110; @ %r141 bra $L439; setp.eq.u32 %r142,%r43,8199; @ %r142 bra $L439; bra $L438; $L437: .loc 1 3101 7 ld.u64 %r143,[%frame+272]; setp.eq.u64 %r144,%r143,0; @ ! %r144 bra $L438; .loc 1 3103 9 ld.u64 %r154,[%r72+64]; ld.u32 %r155,[%r71+200]; ld.u32 %r1561515154frame7475; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r190; call (%value_in),resolve_string,%out_arg8); ld.param.u32 %r157,[%value_in]; } .loc 1 3103 7 setp.eq.u32 %r158,%r157,0; @ ! %r158 bra $L438; bra $L434; $L439: .loc 1 3115 8 mov.u64 %r159,0; st.u64 [%frame+264],%r159; .loc 1 3116 11 ld.u64 %r169,[%r72+64]; ld.u32 %r170,[%r71+200]; ld.u32 %r171171170frame7475; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r189; call (%value_in),resolve_string,%out_arg8); ld.param.u32 %r172,[%value_in]; } .loc 1 3116 9 setp.eq.u32 %r173,%r172,0; @ %r173 bra $L447; .loc 1 3120 12 ld.u64 %r70,[%frame+264]; .loc 1 3120 9 setp.ne.u64 %r174,%r70,0; @ ! 
%r174 bra $L438; bra $L428; $L447: .loc 1 3119 15 mov.u64 %r70,%r159; bra $L428; $L435: .loc 1 3000 6 ld.u32 %r175,[%r42+4]; setp.eq.u32 %r176,%r175,32; @ %r176 bra $L43871framecall (%value_in),read_referenced_name_from_attr$part$0.loc 1 3133 9 setp.eq.u64 %r183,%r182,0; @ %r183 bra $L438; .loc 1 3134 12 st.u64 [%frame+272],%r182; $L438: .loc 1 3086 38 add.u64 %r57,%r57,1; .loc 1 3086 3 ld.u64 %r184,[%r113+16]; setp.gt.u64 %r185,%r184,%r57; @ %r185 bra $L444; .loc 1 3143 10 ld.u64 %r70,[%frame+272]; bra $L428; $L434: .loc 1 3093 9 mov.u64 %r70,0; bra $L428; $L446: .loc 1 3086 3 mov.u64 %r70,%r116; .loc 1 3143 10 bra $L428; $L445: .loc 1 3083 12 mov.u64 %r70,%r113; $L428: .loc 1 3144FUNCTION DEF: read_referenced_name_from_attr$part$0 .func (.param .u64 %value_out) read_referenced_name_from_attr$part$0predpredpred3003 10 ld.u32 %r22,[%r44]; .loc 1 3003 6 setp.ne.u32 %r47,%r22,8; @ %r47 bra $L468; .loc 1 3006 4 ld.u64 %r24,[%r42+48]; ld.u64 %r25,[%r42+40]; ld.u64 %r48,[%r44+8]; st.u64 [%frame],%r48; .loc 1 1143 7 mov.u64 %r53,units_search; mov.u64 call (%value_in),bsearch54,[%value_in]; } mov.u64 %r41,%r54; .loc 1 1144 27 setp.ne.u64 %r55,%r41,0; @ ! %r55 bra $L467; ld.u64 %r38,[%r41]; .loc 1 3008 10 setp.ne.u64 %r56,%r38,0; @ %r56 bra $L471; $L473: .loc 1 3009 9 mov.u64 %r41,0; bra $L467; $L471: .loc 1 3011 16 ld.u64 %r63,[%r44+8]; ld.u64 %r64,[%r38+24]; sub.u64 %r62,%r6call (%value_in),read_referenced_name65,[%value_in]; } mov.u64 %r41,%r65; bra $L467; $L468: .loc 1 3016 7 and.b32 %r66,%r22,-5; .loc 1 3015 6 setp.ne.u32 %r67,%r66,3; @ %r67 bra $L472; .loc 1 3017 12 ld.u64 %r73,[%r44+8];call (%value_in),read_referenced_name74,[%value_in]; } mov.u64 %r41,%r74; bra $L467; $L472: .loc 1 3019 6 setp.ne.u32 %r75,%r22,9; @ %r75 bra $L473; .loc 1 3022 43 ld.u64 %r31,[%r42+8]; .loc 1 3022 4 ld.u64 %r32,[%r31+48]; ld.u64 %r33,[%r31+40]; ld.u64 %r76,[%r44+8]; st.u64 [%frame],%r76; .loc 1 1143 7 mov.u64 %r81,units_search; mov.u64 call (%value_in),bsearch82,[%value_in]; } mov.u64 %r41,%r82; .loc 1 1144 27 setp.ne.u64 %r83,%r41,0; @ ! %r83 bra $L467; ld.u64 %r39,[%r41]; .loc 1 3024 10 setp.eq.u64 %r84,%r39,0; @ %r84 bra $L473; .loc 1 3027 16 ld.u64 %r91,[%r44+8]; ld.u64 %r92,[%r39+24]; sub.u64 %r90,%r91,%r92; .loc 1 3028 14 ld.u64 %r93,[%r42+8]call (%value_in),read_referenced_name94,[%value_in]; } mov.u64 %r41,%r94; $L467: .loc 1 30FUNCTION DEF: find_address_ranges .func (.param .u32 %value_out) find_address_ranges, .param .u64 %in_ar10local .align 16 .b8 %frame_ar[1164u16predpred %r155; .reg .pred %r156; .reg .predpredpred %r164; .reg .u32 %r169; .reg .pred.reg .u16 %r179; .reg .u32 %r180; .reg .u16 %r181; .reg .predpredpred %r210; .reg .u64predu64 %r248; .reg .u32 %r253; .reg .pred %r254; .reg .u32predmov.u64 %r84,%ar0; mov.u64 %r85,%ar1; mov.u64 %r86,%ar2; mov.u64 %r87,%ar3; mov.u32 %r88,%ar4; mov.u64 %r89,%ar5; mov.u64 %r90,%ar6; mov.u64 %r91,%ar7; mov.u64 %r92,%ar8; mov.u64 %r93,%ar9; mov.u64 %r94,%ar10; .loc 1 1942 9 ld.u64 %r95,[%r86+24]; setp.ne.u64 %r96,%r95,0; @ ! 
%r96 bra $L506; .loc 1 1957 16 add.u64 %r282,%r92,112; .loc 1 1961 10 setp.eq.u64 %r283,%r94,0; .loc 1 1965 7 add.u64 %r284,%frame,88; .loc 1 1967 7 add.u64 %r285,%frame,72; add.u64 %r286,%frame,56; .loc 1 2034 9 add.u64 %r287,%r92,88; .loc 1 2041 9 add.u64 %r288,%r92,96; bra $L479; $L506: .loc 1 2073 10 mov.u32 %r83,1; bra $L478; $L479:86; call (%value_in),read_uleb128,(%out_arg1); ld.param.u64 %r98,[%value_in]; } .loc 1 1954 10 setp.eq.u64 %r99,%r98,0; @ %r99 bra $L481; .loc 1 1957 1628call (%value_in),lookup_abbrev1958 10 setp.ne.u64 %r106,%r105,0; @ %r106 bra $L482; $L502: .loc 1 1959 9 mov.u32 %r83,0; bra $L478; $L482: .loc 1 1961 10 @ %r283 bra $L484; .loc 1 1962 20 ld.u32 %r23,[%r105+8]; .loc 1 1962 12 st.u32 [%r94],%r23; $L484: .loc 1 1964 7 mov.u64 %r108,0; st.u64 [%frame],%r108; st.u64 [%frame+8],%r108; st.u64 [%frame+16],%r108; st.u64 [%frame+24],%r108; st.u64 [%frame+32],%r108; st.u64 [%frame+40],%r108; st.u64 [%frame+48],%r108; .loc 1 1965 7 st.u64 [%r284],%r108; st.u64 [%r284+8],%r108; .loc 1 1967 7 st.u64 [%r285],%r108; st.u64 [%r285+8],%r108; .loc 1 1969 7 ld.u64 %r121,[%r105+16]; setp.ne.u64 %r122,%r121,0; @ %r122 bra $L507; $L501: .loc 1 2047 17 ld.u32 %r61,[%r105+8]; .loc 1 2048 4 set.u32.eq.u32 %r124,%r61,17; neg.s32 %r125,%r124; set.u32.eq.u32 %r127,%r61,46; neg.s32 %r128,%r127; cvt.u16.u32 %r130,%r125; cvt.u16.u32 %r131,%r128; or.b16 %r129,%r130,%r131; .loc 1 2047 10 cvt.u32.u16 %r132,%r129; cvt.u16.u8 %r133,%r132; setp.ne.u16 %r134,%r133,0; @ ! %r134 bra $L487; bra $L486; $L507: .loc 1 1969 14 mov.u64 %r81,%r108; .loc 1 1968 25 cvt.u32.u64 %r76,%r81; .loc 1 1966 21 cvt.u32.u64 %r75,%r81; $L485: shl.b64 %r77,%r81,4; .loc 1 1973 61 ld.u64 %r135,[%r105+24]; add.u64 %r28,%r135,%r77; .loc 1 1973 9 ld.u32 %r146,[%r92+48]; ld.u32 %r147,[%r92+40]; ld.u32 %r148,[%r92+44]; ld.u64 %r149,[%r28+8]; ld.u32 %r150,[%r28+4]187; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r89; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r286; call (%value_in),read_attribute,%out_arg8,%out_arg9); ld.param.u32 %r151,[%value_in]; } mov.u32 %r83,%r151; .loc 1 1973 7 setp.ne.u32 %r152,%r151,0; @ ! 
%r152 bra $L478; .loc 1 1978 28 ld.u64 %r153,[%r105+24]; add.u64 %r154,%r153,%r77; ld.u32 %r34,[%r154]; .loc 1 1978 4 setp.eq.u32 %r155,%r34,85; @ %r155 bra $L489; setp.gt.u32 %r156,%r34,85; @ %r156 bra $L490; setp.eq.u32 %r157,%r34,16; @ %r157 bra $L491; setp.gt.u32 %r158,%r34,16; @ %r158 bra $L492; setp.eq.u32 %r159,%r34,3; @ %r159 bra $L493; bra $L494; $L492: setp.le.u32 %r160,%r34,18; @ %r160 bra $L489; setp.eq.u32 %r161,%r34,27; @ %r161 bra $L495; bra $L494; $L490: setp.eq.u32 %r162,%r34,115; @ %r162 bra $L496; setp.eq.u32 %r163,%r34,116; @ %r163 bra $L497; setp.eq.u32 %r164,%r34,114; @ %r164 bra $L498; bra $L494; $L489: .loc 1 1981 8286frame; call update_pcrange$isra$0.loc 1 1982 8 bra $L494; $L491: .loc 1 1985 11 ld.u32 %r169,[%r105+8]; setp.ne.u32 %r170,%r169,17; @ %r170 bra $L494; .loc 1 1986 12 ld.u32 %r80,[%frame+56]; .loc 1 1987 9 set.u32.eq.u32 %r172,%r80,3; neg.s32 %r173,%r172; set.u32.eq.u32 %r175,%r80,10; neg.s32 %r176,%r175; cvt.u16.u32 %r178,%r173; cvt.u16.u32 %r179,%r176; or.b16 %r177,%r178,%r179; .loc 1 1986 5 cvt.u32.u16 %r180,%r177; cvt.u16.u8 %r181,%r180; setp.eq.u16 %r182,%r181,0; @ %r182 bra $L494; .loc 1 1988 14 ld.u64 %r183,[%frame+64]; st.u64 [%r92+56],%r183; bra $L494; $L493: .loc 1 1992 11 ld.u32 %r184,[%r105+8]; setp.ne.u32 %r185,%r184,17; @ %r185 bra $L494; .loc 1 1994 14 ld.u64 %r186,[%frame+56]; st.u64 [%frame+88],%r186; ld.u64 %r187,[%frame+64]; st.u64 [%frame+96],%r187; .loc 1 1995 19 mov.u32 %r75,1; bra $L494; $L495: .loc 1 2000 11 ld.u32 %r188,[%r105+8]; setp.ne.u32 %r189,%r188,17; @ %r189 bra $L494; .loc 1 2002 18 ld.u64 %r190,[%frame+56]; st.u64 [%frame+72],%r190; ld.u64 %r191,[%frame+64]; st.u64 [%frame+80],%r191; .loc 1 2003 23 mov.u32 %r76,1; bra $L494; $L498: .loc 1 2008 11 ld.u32 %r192,[%r105+8]; setp.ne.u32 %r193,%r192,17; @ %r193 bra $L494; .loc 1 2009 5 ld.u32 %r194,[%frame+56]; setp.ne.u32 %r195,%r194,10; @ %r195 bra $L494; .loc 1 2010 23 ld.u64 %r196,[%frame+64]; st.u64 [%r92+64],%r196; bra $L494; $L496: .loc 1 2014 11 ld.u32 %r197,[%r105+8]; setp.ne.u32 %r198,%r197,17; @ %r198 bra $L494; .loc 1 2015 5 ld.u32 %r199,[%frame+56]; setp.ne.u32 %r200,%r199,10; @ %r200 bra $L494; .loc 1 2016 16 ld.u64 %r201,[%frame+64]; st.u64 [%r92+72],%r201; bra $L494; $L497: .loc 1 2020 11 ld.u32 %r202,[%r105+8]; setp.ne.u32 %r203,%r202,17; @ %r203 bra $L494; .loc 1 2021 5 ld.u32 %r204,[%frame+56]; setp.ne.u32 %r205,%r204,10; @ %r205 bra $L494; .loc 1 2022 20 ld.u64 %r206,[%frame+64]; st.u64 [%r92+80],%r206; $L494: .loc 1 1969 42 add.u64 %r81,%r81,1; .loc 1 1969 7 ld.u64 %r207,[%r105+16]; setp.gt.u64 %r208,%r207,%r81; @ %r208 bra $L485; .loc 1 2032 10 setp.ne.u32 %r209,%r75,0; @ %r209 bra $L499; $L503: .loc 1 2039 10 setp.ne.u32 %r210,%r76,0; @ ! %r210 bra $L501; bra $L500; $L499: .loc 1 2034 9 ld.u64 %r221,[%r92+64]; ld.u32 %r222222849091; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r287; call (%value_in),resolve_string,%out_arg8); ld.param.u32 %r223,[%value_in]; } .loc 1 2034 7 setp.eq.u32 %r224,%r223,0; @ ! %r224 bra $L503; bra $L502; $L500: .loc 1 2041 9 ld.u64 %r235,[%r92+64]; ld.u32 %r236232859091; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r288; call (%value_in),resolve_string,%out_arg8); ld.param.u32 %r237,[%value_in]; } .loc 1 2041 7 setp.eq.u32 %r238,%r237,0; @ ! %r238 bra $L501; bra $L502; $L487: .loc 1 2064 10 ld.u32 %r239,[%r105+12]; setp.ne.u32 %r240,%r239,0; @ ! 
%r240 bra $L505; bra $L504; $L486: .loc 1 2050 9 mov.u64 %r248,add_unit_addr; ld.u64 %r246,[%frame];frame; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r248; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r92; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r90; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r91; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r93; call (%value_in),add_ranges,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12); ld.param.u32 %r253,[%value_in]; } .loc 1 2050 7 setp.eq.u32 %r254,%r253,0; @ %r254 bra $L502; .loc 1 2058 7 ld.u32 %r255,[%r105+8]; setp.ne.u32 %r256,%r255,17; @ %r256 bra $L487; .loc 1 2059 8 ld.u32 %r257,[%frame+48]; setp.ne.u32 %r258,%r257,0; @ %r258 bra $L481; .loc 1 2060 5 ld.u32 %r259,[%frame+8]; setp.eq.u32 %r260,%r259,0; @ %r260 bra $L487; .loc 1 2060 28 ld.u32 %r261,[%frame+24]; setp.ne.u32 %r262,%r261,0; @ ! %r262 bra $L487; bra $L481; $L505: .loc 1 1942 9 ld.u64 %r263,[%r86+24479; bra $L506; $L504: .loc 1 2066 9 mov.u64 %r275,0;89; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r91; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r92; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r93; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r275; call (%value_in),find_address_ranges,%out_arg8,%out_arg9,%out_arg10,%out_arg11); ld.param.u32 %r276,[%value_in]; } .loc 1 2066 7 setp.eq.u32 %r277,%r276,0; @ ! %r277 bra $L505; bra $L502; $L481: .loc 1 1955 9 mov.u32 %r83,1; $L478: .loc 1 2074 1 mov.u32 %value,%r83;read_function_entry .func (.param .u32 %value_out) read_function_ent280predu16 %r151; .reg .u16 %r152; .reg .u32 %r153; .reg .u16 %r154; .reg .predpred %r163; .reg .u32 %r167; .reg .u64 %r169; .reg .u64 %r172; .reg .u64 %r179; .reg .predu32predpred %r230; .reg .predpred %r236; .reg .predpredpred.reg .u32u32 %r293; .reg .pred %r294; .reg .u64 %r295; .reg .u64pred %r309; .reg .predu32 %r361; .reg .u64 %r362; .reg .u32pred %r372; .reg .u32 %r373; .reg .predpred %r395; .reg .u32 %r400; .reg .predu64 %r408; .reg .pred %r409; .reg .predmov.u64 %r122,%ar0; mov.u64 %r123,%ar1; mov.u64 %r124,%ar2; mov.u64 %r125,%ar3; st.u64 [%frame+272],%r125; mov.u64 %r126,%ar4; mov.u64 %r127,%ar5; mov.u64 %r128,%ar6; mov.u64 %r129,%ar7; mov.u64 %r130,%ar8; mov.u64 %r131,%ar9; .loc 1 3197 9 ld.u64 %r132,[%r126+24]; setp.ne.u64 %r133,%r132,0; @ ! 
%r133 bra $L582; .loc 1 3212 16 add.u64 %r429,%r124,112; .loc 1 3367 32 add.u64 %r430,%r123,56; add.u64 %r431,%frame,256; .loc 1 3257 10 add.u64 %r432,%frame,272; .loc 1 3236 7 add.u64 %r433,%frame,200; bra $L538; $L582: .loc 1 3425 10 mov.u32 %r121,1; bra $L537; $L538:read_uleb128,(209 10 setp.eq.u64 %r136,%r135,0; @ %r136 bra $L586; .loc 1 3212 16129; call (%value_in),lookup_abbrevld.param.u64 %r142,[%value_in]; } .loc 1 3213 10 setp.ne.u64 %r143,%r142,0; @ %r143 bra $L541; $L544: .loc 1 3214 9 mov.u32 %r121,0; bra $L537; $L541: .loc 1 3216 28 ld.u32 %r23,[%r142+8]; .loc 1 3217 8 set.u32.eq.u32 %r145,%r23,46; neg.s32 %r146,%r145; set.u32.eq.u32 %r148,%r23,3; neg.s32 %r149,%r148; cvt.u16.u32 %r151,%r146; cvt.u16.u32 %r152,%r149; or.b16 %r150,%r151,%r152; .loc 1 3218 8 cvt.u32.u16 %r153,%r150; cvt.u16.u8 %r154,%r153; setp.ne.u16 %r155,%r154,0; @ %r155 bra $L542; setp.ne.u32 %r156,%r23,29; @ %r156 bra $L543; .loc 1 3221 6 mov.u64 %r118,%r131; $L583: .loc 1 3229 9 mov.u64 %r158115129; call (%value_in),backtrace_allocld.param.u64 %r161,[%value_in]; } mov.u64 %r100,%r161; .loc 1 3231 7 setp.eq.u64 %r163,%r100,0; @ %r163 bra $L544; .loc 1 3233 4 mov.u32 %r167158169,[%value_in]; } .loc 1 3236 7 mov.u64 %r172,0; st.u64 [%r433],%r172; st.u64 [%r433+8],%r172; st.u64 [%r433+16],%r172; st.u64 [%r433+24],%r172; st.u64 [%r433+32],%r172; st.u64 [%r433+40],%r172; st.u64 [%r433+48],%r172; .loc 1 3238 7 ld.u64 %r179,[%r142+16]; setp.eq.u64 %r180,%r179,0; @ %r180 bra $L545; mov.u32 %r117,1; bra $L585; $L588: .loc 1 3223 6 mov.u64 %r118,%r130; .loc 1 3218 8 cvt.u32.u64 %r117,%r416; .loc 1 3225 16 mov.u64 %r100,%r416; $L585: .loc 1 3237 25 mov.u32 %r93,0; .loc 1 3238 14 mov.u64 %r103,0; .loc 1 3265 7 setp.ne.u32 %r427,%r117,0; $L569: shl.b64 %r97,%r103,4; .loc 1 3242 61 ld.u64 %r181,[%r142+24]; add.u64 %r32,%r181,%r97; .loc 1 3242 9 ld.u64 %r192,[%r123+8]; ld.u32 %r193,[%r124+48]; ld.u32 %r194,[%r124+40]; ld.u32 %r195,[%r124+44]; ld.u64 %r196,[%r32+8]; ld.u32 %r197,[%r32+4]6430; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r192; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r431; call (%value_in),read_attribute,%out_arg8,%out_arg9); ld.param.u32 %r198,[%value_in]; } .loc 1 3242 7 setp.eq.u32 %r199,%r198,0; @ %r199 bra $L564; .loc 1 3250 7 ld.u32 %r200,[%r142+8]; setp.eq.u32 %r201,%r200,17; @ %r201 bra $L547; $L550: .loc 1 3265 7 @ %r427 bra $L548; .loc 1 3350 6 bra $L549; $L547: .loc 1 3251 27 ld.u64 %r203,[%r142+24]; add.u64 %r204,%r203,%r97; ld.u32 %r49,[%r204]; .loc 1 3251 8 setp.ne.u32 %r205,%r49,17; @ %r205 bra $L550; .loc 1 3253 15 ld.u32 %r40,[%frame+256]; .loc 1 3253 11 setp.ne.u32 %r206,%r40,1; @ %r206 bra $L551; .loc 1 3254 8 ld.u64 %r207,[%frame+264]; st.u64 [%frame+272],%r207; bra $L552; $L551: .loc 1 3255 16 setp.ne.u32 %r208,%r40,2; @ %r208 bra $L552; .loc 1 3257 10 ld.u64 %r219,[%frame+264]; ld.u32 %r220,[%r123+200]; ld.u32 %r221,[%r124+48]; ld.u64 %r222,[%r124+72]; ld.u64 %r223,[%r123+168]; ld.u64 %r224,[%r123+962232128; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r129; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r432; call (%value_in),resolve_addr_index$isra$0,%out_arg8,%out_arg9); ld.param.u32 %r225,[%value_in]; } .loc 1 3257 8 setp.eq.u32 %r226,%r225,0; @ ! 
%r226 bra $L550; bra $L564; $L548: .loc 1 3267 29 ld.u64 %r227,[%r142+24]; add.u64 %r48,%r227,%r97; .loc 1 3267 32 ld.u32 %r49,[%r48]; .loc 1 3267 8 setp.eq.u32 %r228,%r49,88; @ %r228 bra $L553; setp.gt.u32 %r229,%r49,88; @ %r229 bra $L554; setp.eq.u32 %r230,%r49,49; @ %r230 bra $L555; setp.gt.u32 %r231,%r49,49; @ %r231 bra $L556; setp.eq.u32 %r232,%r49,3; @ %r232 bra $L557; add.u32 %r233,%r49,-17; setp.gt.u32 %r234,%r233,1; @ %r234 bra $L549; bra $L558; $L556: setp.eq.u32 %r235,%r49,71; @ %r235 bra $L559; setp.eq.u32 %r236,%r49,85; @ %r236 bra $L558; bra $L549; $L554: setp.eq.u32 %r237,%r49,110; @ %r237 bra $L560; setp.eq.u32 %r238,%r49,8199; @ %r238 bra $L560; setp.eq.u32 %r239,%r49,89; @ %r239 bra $L561; bra $L549; $L553: .loc 1 3270 8 ld.u32 %r240,[%frame+256]; setp.ne.u32 %r241,%r240,3; @ %r241 bra $L549; .loc 1 3272 18 ld.u64 %r51,[%frame+264]; .loc 1 3272 12 setp.ne.u64 %r242,%r51,0; @ %r242 bra $L562; .loc 1 3273 30 cvta.const.u64 %r243,$LC27; st.u64 [%r100+8],%r243; bra $L549; $L562: .loc 1 3276 21 add.u64 %r244,%r51,-1; .loc 1 3276 9 ld.u64 %r245,[%r127+56]; setp.lt.u64 %r246,%r244,%r245; @ %r246 bra $L563; .loc 1 418 38 ld.u64 %r251,[%r126+16]; ld.u64 %r252,[%r126+8]; sub.u64 %r250,%r251,%r252; .loc 1 417 3 st.u32 [%stack+16],%r250; ld.u64 %r253,[%r126]; st.u64 [%stack+8],%r253; cvta.const.u64 %r254,$LC28; st.u64 [%stack],%r254; cvta.const.u64 %r249,$LC1; mov24249stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r255,[%value_in]; } .loc 1 419 3 ld.u64 %r260,[%r126+48]; ld.u64 %r261,[%r126+40]; mov.u32 %r259,0; { $LCT3726frame259; call %r261,$LCT37; } .loc 1 420 1 bra $L564; $L563: .loc 1 3284 23 ld.u64 %r262,[%r127+64]; shl.b64 %r263,%r51,3; add.u64 %r264,%r262,%r263; ld.u64 %r58,[%r264+-8]; .loc 1 3283 32 st.u64 [%r100+8],%r58; bra $L549; $L561: .loc 1 3290 8 ld.u32 %r266,[%frame+256]; setp.ne.u32 %r267,%r266,3; @ %r267 bra $L549; .loc 1 3291 31 ld.u64 %r268,[%frame+264]; st.u32 [%r100+16],%r268; bra $L549; $L584: .loc 1 3000 6 ld.u32 %r269,[%r48+4]; setp.ne.u32 %r270,%r269,32; @ %r270 bra $L565; $L566: .loc 1 3308 24 mov.u32 %r93,0; bra $L549; $L565:112431289; call (%value_in),read_referenced_name_from_attr$part$0.loc 1 3307 10 setp.eq.u64 %r278,%r277,0; @ %r278 bra $L566; .loc 1 3308 24 st.u64 [%r100],%r277; mov.u32 %r93,0; bra $L549; $L557: .loc 1 3314 8 ld.u64 %r279,[%r100]; setp.ne.u64 %r280,%r279,0; @ %r280 bra $L549; .loc 1 3316 10 ld.u64 %r290,[%r124+64]; ld.u32 %r291,[%r123+200]; ld.u32 %r292,[%r124+44292903.param .u64 %out_arg8; st.param.u64 [%out_arg8],%r100; call (%value_in),resolve_string,%out_arg8); ld.param.u32 %r293,[%value_in]; } .loc 1 3316 8 setp.eq.u32 %r294,%r293,0; @ ! %r294 bra $L549; bra $L564; $L560: .loc 1 3329 9 mov.u64 %r295,0; st.u64 [%frame],%r295; .loc 1 3330 12 ld.u64 %r305,[%r124+64]; ld.u32 %r306,[%r123+200]; ld.u32 %r307,[%r124+44063053.param .u64 %out_arg8; st.param.u64 [%out_arg8],%frame; call (%value_in),resolve_string,%out_arg8); ld.param.u32 %r308,[%value_in]; } .loc 1 3330 10 setp.ne.u32 %r309,%r308,0; @ ! 
%r309 bra $L564; .loc 1 3335 13 ld.u64 %r72,[%frame]; .loc 1 3335 10 setp.eq.u64 %r310,%r72,0; @ %r310 bra $L549; .loc 1 3337 19 st.u64 [%r100],%r72; .loc 1 3338 22 mov.u32 %r93,%r117; .loc 1 3341 5 bra $L549; $L558: .loc 1 3344 5433; call update_pcrange$isra$0$L549: .loc 1 3238 42 add.u64 %r103,%r103,1; .loc 1 3238 7 ld.u64 %r316,[%r142+16]; setp.gt.u64 %r317,%r316,%r103; @ %r317 bra $L569; bra $L649; $L564: .loc 1 3246 13 mov.u32 %r121,0; bra $L537; $L649: .loc 1 3355 10 setp.eq.u32 %r318,%r117,0; @ %r318 bra $L571; $L545: .loc 1 3355 23 ld.u64 %r319,[%r100]; setp.ne.u64 %r320,%r319,0; @ %r320 bra $L572; .loc 1 3357 4 mov.u64 %r323,4122r31289; call backtrace_free); } .loc 1 3382 10 ld.u32 %r326,[%r142+12]; setp.eq.u32 %r327,%r326,0; @ %r327 bra $L581; $L577: .loc 1 3386 13 ld.u64 %r338,[%frame+2711233126.param .u64 %out_arg8; st.param.u64 [%out_arg8],%r129; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r130; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r131; call (%value_in),read_function_ent,%out_arg8,%out_arg9,%out_arg10); ld.param.u32 %r339,[%value_in]; } .loc 1 3386 11 setp.eq.u32 %r340,%r339,0; @ ! %r340 bra $L581; bra $L544; $L572: .loc 1 3364 7 ld.u32 %r341,[%frame+248]; setp.ne.u32 %r342,%r341,0; @ %r342 bra $L574; .loc 1 3365 8 ld.u32 %r343,[%frame+208]; setp.eq.u32 %r344,%r343,0; @ %r344 bra $L575; .loc 1 3365 31 ld.u32 %r345,[%frame+224]; setp.eq.u32 %r346,%r345,0; @ %r346 bra $L575; $L574: .loc 1 3367 13 ld.u64 %r360,[%frame+272]; ld.u32 %r361,[%r123+200]; ld.u64 %r362,[%r123+16]; mov.u64 %r354,add_function_range364433; .param .u64 %out_arg8; st.param.u64 [%out_arg8],%r354; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r100; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%r128; .param .u64 %out_arg11; st.param.u64 [%out_arg11],%r129; .param .u64 %out_arg12; st.param.u64 [%out_arg12],%r118; call (%value_in),add_ranges,%out_arg8,%out_arg9,%out_arg10,%out_arg11,%out_arg12); ld.param.u32 %r364,[%value_in]; } .loc 1 3367 11 setp.eq.u32 %r365,%r364,0; @ ! 
%r365 bra $L650; bra $L544; $L575: .loc 1 3376 8 mov.u64 %r368,4122r3681289; call backtrace_free); } .loc 1 3382 10 ld.u32 %r371,[%r142+12]; setp.ne.u32 %r372,%r371,0; @ %r372 bra $L577; bra $L581; $L650: ld.u32 %r373,[%r142+12]; setp.ne.u32 %r374,%r373,0; @ %r374 bra $L578; bra $L581; $L571: ld.u32 %r375,[%r142+12]; setp.ne.u32 %r376,%r375,0; @ %r376 bra $L577; bra $L581; $L578: .loc 1 3398 8 mov.u64 %r377,0; st.u64 [%frame],%r377; st.u64 [%frame+8],%r377; st.u64 [%frame+16],%r377; st.u64 [%frame+24],%r377; .loc 1 3400 13 ld.u64 %r391,[%frame+271123396.param .u64 %out_arg8; st.param.u64 [%out_arg8],%r129; .param .u64 %out_arg9; st.param.u64 [%out_arg9],%r130; .param .u64 %out_arg10; st.param.u64 [%out_arg10],%frame; call (%value_in),read_function_ent,%out_arg8,%out_arg9,%out_arg10); ld.param.u32 %r392,[%value_in]; } .loc 1 3400 11 setp.eq.u32 %r393,%r392,0; @ %r393 bra $L579; .loc 1 3405 11 ld.u64 %r394,[%frame+24]; setp.eq.u64 %r395,%r394,0; @ %r395 bra $L581; .loc 1 341129; call (%value_in),backtrace_vector_releaseld.param.u32 %r400,[%value_in]; } .loc 1 3409 8 setp.eq.u32 %r401,%r400,0; @ %r401 bra $L579; .loc 1 3413 12 ld.u64 %r101,[%frame]; .loc 1 3414 5 ld.u64 %r406,[%frame+24]; mov.u64 %r405,function_addrs_compare; mov.u64 %r404,24; {r40604405; call backtrace_qso); } .loc 1 3418 30 st.u64 [%r100+24],%r101; .loc 1 3419 36 ld.u64 %r407,[%frame+24]; st.u64 [%r100+32],%r407; bra $L581; $L579: .loc 1 3403 10 mov.u32 %r121,0; bra $L537; $L581: .loc 1 3197 9 ld.u64 %r408,[%r126+24]; setp.ne.u64 %r409,%r408,0; @ %r409 bra $L538; bra $L582; $L586: .loc 1 3210 9 mov.u32 %r121,1; bra $L537; $L542: .loc 1 3220 10 setp.ne.u32 %r411,%r23,29; .loc 1 3221 6 selp.u64 %r118,%r130,%r131,%r411; bra $L583; $L552: .loc 1 3265 7 @ %r427 bra $L558; bra $L549; $L559: .loc 1 3298 8 setp.ne.u32 %r413,%r93,0; @ %r413 bra $L549; bra $L584; $L555: setp.ne.u32 %r414,%r93,0; @ %r414 bra $L549; bra $L584; $L543: .loc 1 3236 7 mov.u64 %r416,0; st.u64 [%r433],%r416; st.u64 [%r433+8],%r416; st.u64 [%r433+16],%r416; st.u64 [%r433+24],%r416; st.u64 [%r433+32],%r416; st.u64 [%r433+40],%r416; st.u64 [%r433+48],%r416; .loc 1 3238 7 ld.u64 %r423,[%r142+16]; setp.ne.u64 %r424,%r423,0; @ %r424 bra $L588; bra $L571; $L537: .loc 1 3426 1 mov.u32 %value,%r121;dwarf_lookup_pcdwarf_lookup_pclocal .align 16 .b8 %frame_ar[44096u64u64 %r416; .reg .u64 %r424; .reg .u64 %r426; .reg .u64 %r429; .reg .u64 %r463; .reg .u64u32 %r513; .reg .u64 %r514; .reg .u64 %r515; .reg .u64 %r516; .reg .u64 %r517; .reg .u64 %r518; .reg .u64 %r519; .reg .u64 %r520; .reg .u32u64 %r532; .reg .u64 %r534; .reg .u64 %r536; .reg .pred %r537; .reg .u32 %r538; .reg .u64 %r539; .reg .u64pred %r546; .reg .u64 %r547; .reg .pred %r548; .reg .pred %r549; .reg .pred %r551; .reg .u64 %r552; .reg .pred %r553; .reg .u64 %r554; .reg .pred %r555; .reg .u64 %r556; .reg .pred %r557; .reg .pred %r559; .reg .pred %r560; .reg .pred %r576; .reg .u64u32 %r583; .reg .u32 %r584; .reg .u32 %r587; .reg .pred %r588; .reg .u64 %r591; .reg .u32 %r594; .reg .u32 %r596; .reg .predu64pred %r623; .reg .u64 %r625; .reg .u64 %r626; .reg .u64 %r627; .reg .u64 %r628; .reg .u64 %r629; .reg .u64 %r630; .reg .u64 %r631; .reg .u32 %r632; .reg .u32 %r636; .reg .u64 %r637; .reg .u64 %r638; .reg .pred %r639; .reg .u64 %r642; .reg .u32 %r645; .reg .u64 %r648; .reg .u64 %r652; .reg .pred %r654; .reg .u64 %r655; .reg .u64 %r656; .reg .u64 %r657; .reg .u32 %r660; .reg .u32 %r662; .reg .pred %r663; .reg .u32 %r664; .reg .pred %r665; .reg .u64 %r667; .reg .u64 %r668; .reg .u64 %r669; .reg .u64 %r670; .reg 
.u64 %r671; .reg .u64 %r672; .reg .u64 %r673; .reg .u32 %r674; .reg .u64 %r679; .reg .u64 %r680; .reg .u32 %r681; .reg .u32 %r682; .reg .u32 %r685; .reg .u32 %r689; .reg .pred %r690; .reg .pred %r691; .reg .u64 %r693; .reg .u64 %r694; .reg .u64 %r695; .reg .u64 %r696; .reg .u64 %r697; .reg .u64 %r698; .reg .u32 %r699; .reg .u64 %r704; .reg .u64 %r705; .reg .u32 %r706; .reg .u64 %r707; .reg .u64 %r708; .reg .u16 %r710; .reg .u16 %r711; .reg .u16 %r713; .reg .u32 %r715; .reg .u32 %r718; .reg .u32 %r722; .reg .u32 %r724; .reg .pred %r725; .reg .u32 %r726; .reg .pred %r727; .reg .u64 %r729; .reg .u64 %r730; .reg .u64 %r731; .reg .u64 %r732; .reg .u64 %r733; .reg .u64 %r734; .reg .u32 %r735; .reg .u64 %r740; .reg .u64 %r741; .reg .u64 %r742; .reg .u32 %r743; .reg .pred %r744; .reg .u64 %r745; .reg .pred %r746; .reg .u64 %r750; .reg .u64 %r751; .reg .u64 %r753; .reg .pred %r754; .reg .u16 %r755; .reg .pred %r756; .reg .u64 %r757; .reg .pred %r758; .reg .u16 %r759; .reg .pred %r760; .reg .u64 %r765; .reg .u64 %r766; .reg .u64 %r767; .reg .u64 %r768; .reg .pred %r770; .reg .u32 %r771; .reg .pred %r772; .reg .u64 %r773; .reg .u64 %r776; .reg .pred %r778; .reg .u64 %r780; .reg .u64 %r781; .reg .u64 %r782; .reg .u64 %r783; .reg .u64 %r784; .reg .u64 %r785; .reg .u32 %r786; .reg .u64 %r791; .reg .u64 %r792; .reg .u32 %r793; .reg .u64 %r794; .reg .u64 %r795; .reg .u64 %r796; .reg .u64 %r797; .reg .u64 %r798; .reg .pred %r799; .reg .u16 %r800; .reg .pred %r801; .reg .pred %r802; .reg .u32 %r803; .reg .pred %r804; .reg .u64 %r806; .reg .u64 %r807; .reg .u64 %r808; .reg .u64 %r809; .reg .u64 %r810; .reg .u64 %r811; .reg .u32 %r812; .reg .u64 %r817; .reg .u64 %r818; .reg .u64 %r819; .reg .u64 %r820; .reg .pred %r821; .reg .u64 %r826; .reg .u64 %r827; .reg .u64 %r828; .reg .u64 %r829; .reg .u64 %r830; .reg .pred %r832; .reg .u64 %r835; .reg .u64 %r836; .reg .u64 %r838; .reg .u16 %r839; .reg .u32 %r840; .reg .pred %r841; .reg .u16 %r843; .reg .u32 %r844; .reg .pred %r845; .reg .u16 %r846; .reg .u32 %r847; .reg .pred %r848; .reg .u16 %r850; .reg .u32 %r851; .reg .pred %r852; .reg .u16 %r853; .reg .u32 %r854; .reg .pred %r855; .reg .u16 %r857; .reg .u32 %r858; .reg .pred %r859; .reg .pred %r860; .reg .u16 %r861; .reg .pred %r862; .reg .u16 %r863; .reg .pred %r864; .reg .u32 %r865; .reg .pred %r866; .reg .u32 %r867; .reg .pred %r868; .reg .u64 %r871; .reg .pred %r873; .reg .u64 %r875; .reg .u64 %r876; .reg .u64 %r877; .reg .u64 %r878; .reg .u64 %r879; .reg .u64 %r880; .reg .u32 %r881; .reg .u64 %r886; .reg .u64 %r887; .reg .u64 %r888; .reg .u64 %r890; .reg .u64 %r893; .reg .u16 %r894; .reg .u32 %r895; .reg .pred %r896; .reg .pred %r897; .reg .pred %r898; .reg .u64 %r899; .reg .u64 %r900; .reg .u64 %r901; .reg .u64 %r902; .reg .pred %r903; .reg .u64 %r904; .reg .u64 %r905; .reg .u64 %r906; .reg .u64 %r909; .reg .u64 %r910; .reg .u64 %r911; .reg .u64 %r912; .reg .u64 %r913; .reg .u64 %r914; .reg .u64 %r915; .reg .u32 %r916; .reg .u64 %r921; .reg .u64 %r922; .reg .u64 %r924; .reg .u64 %r927; .reg .u64 %r933; .reg .u64 %r934; .reg .u64 %r935; .reg .u64 %r936; .reg .u64 %r937; .reg .pred %r939; .reg .u64 %r946; .reg .u64 %r948; .reg .u32 %r949; .reg .u64 %r950; .reg .u64 %r951; .reg .u64 %r952; .reg .u64 %r959; .reg .u64 %r961; .reg .u64 %r962; .reg .u64 %r965; .reg .u64 %r968; .reg .u16 %r969; .reg .pred %r970; .reg .u64 %r979; .reg .u64 %r981; .reg .u32 %r984; .reg .pred %r985; .reg .u64 %r994; .reg .u64 %r996; .reg .u32 %r999; .reg .pred %r1000; .reg .u64 %r1001; .reg .pred %r1002; .reg .u64 %r1003; .reg .pred 
%r1004; .reg .u64 %r1005; .reg .pred %r1006; .reg .u32 %r1008; .reg .u32 %r1009; .reg .u32 %r1010; .reg .u32 %r1012; .reg .u64 %r1013; .reg .u32 %r1015; .reg .u32 %r1016; .reg .u32 %r1017; .reg .u64 %r1027; .reg .u64 %r1028; .reg .pred %r1029; .reg .u64 %r1032; .reg .pred %r1033; .reg .u32 %r1034; .reg .pred %r1035; .reg .u64 %r1037; .reg .u64 %r1038; .reg .u64 %r1039; .reg .u64 %r1040; .reg .u64 %r1041; .reg .u64 %r1042; .reg .u32 %r1043; .reg .u64 %r1048; .reg .u64 %r1049; .reg .u32 %r1050; .reg .u16 %r1051; .reg .pred %r1052; .reg .pred %r1054; .reg .pred %r1056; .reg .pred %r1058; .reg .pred %r1060; .reg .u32 %r1063; .reg .u64 %r1065; .reg .u64 %r1068; .reg .pred %r1070; .reg .u32 %r1071; .reg .pred %r1072; .reg .u64 %r1074; .reg .u64 %r1075; .reg .u64 %r1076; .reg .u64 %r1077; .reg .u64 %r1078; .reg .u64 %r1079; .reg .u32 %r1080; .reg .u64 %r1085; .reg .u64 %r1086; .reg .u32 %r1087; .reg .u64 %r1088; .reg .u64 %r1089; .reg .u64 %r1092; .reg .u64 %r1095; .reg .u64 %r1098; .reg .u16 %r1099; .reg .u32 %r1100; .reg .pred %r1101; .reg .pred %r1102; .reg .u32 %r1103; .reg .pred %r1104; .reg .u32 %r1105; .reg .u64 %r1106; .reg .pred %r1107; .reg .u64 %r1108; .reg .u64 %r1109; .reg .u64 %r1110; .reg .u64 %r1112; .reg .u64 %r1113; .reg .u64 %r1114; .reg .u64 %r1115; .reg .u64 %r1116; .reg .u64 %r1117; .reg .u64 %r1118; .reg .u32 %r1119; .reg .u32 %r1123; .reg .u64 %r1124; .reg .u64 %r1125; .reg .u64 %r1127; .reg .u64 %r1130; .reg .u64 %r1136; .reg .u64 %r1137; .reg .u64 %r1138; .reg .u64 %r1139; .reg .u64 %r1140; .reg .pred %r1142; .reg .u64 %r1149; .reg .u64 %r1151; .reg .u32 %r1152; .reg .u64 %r1153; .reg .u64 %r1154; .reg .u64 %r1155; .reg .u64 %r1162; .reg .u64 %r1166; .reg .pred %r1167; .reg .u32 %r1168; .reg .pred %r1169; .reg .u64 %r1171; .reg .u64 %r1172; .reg .u64 %r1173; .reg .u64 %r1174; .reg .u64 %r1175; .reg .u64 %r1176; .reg .u64 %r1177; .reg .u32 %r1178; .reg .u64 %r1183; .reg .u64 %r1184; .reg .u32 %r1185; .reg .u64 %r1186; .reg .u64 %r1187; .reg .u16 %r1188; .reg .pred %r1189; .reg .pred %r1191; .reg .pred %r1193; .reg .pred %r1195; .reg .pred %r1197; .reg .pred %r1199; .reg .pred %r1201; .reg .pred %r1203; .reg .pred %r1205; .reg .pred %r1207; .reg .pred %r1209; .reg .pred %r1211; .reg .pred %r1213; .reg .u64 %r1223; .reg .u64 %r1224; .reg .u64 %r1227; .reg .u64 %r1228; .reg .u64 %r1229; .reg .u64 %r1230; .reg .u64 %r1232; .reg .u64 %r1234; .reg .u64 %r1237; .reg .u32 %r1238; .reg .u64 %r1241; .reg .pred %r1242; .reg .u64 %r1243; .reg .u64 %r1244; .reg .pred %r1245; .reg .u64 %r1247; .reg .u64 %r1248; .reg .u64 %r1249; .reg .u64 %r1250; .reg .u64 %r1251; .reg .u64 %r1252; .reg .u64 %r1253; .reg .u32 %r1254; .reg .u32 %r1258; .reg .u64 %r1259; .reg .u64 %r1260; .reg .u64 %r1261; .reg .u64 %r1262; .reg .u64 %r1263; .reg .u64 %r1267; .reg .u32 %r1268; .reg .u32 %r1269; .reg .u32 %r1271; .reg .u32 %r1272; .reg .u32 %r1273; .reg .u32 %r1274; .reg .u32 %r1276; .reg .u64 %r1277; .reg .u32 %r1280; .reg .u64 %r1282; .reg .u16 %r1283; .reg .u64 %r1286; .reg .u32 %r1287; .reg .u64 %r1288; .reg .u64 %r1289; .reg .u64 %r1290; .reg .pred %r1291; .reg .u64 %r1294; .reg .pred %r1295; .reg .pred %r1296; .reg .u64 %r1297; .reg .pred %r1298; .reg .u64 %r1300; .reg .u64 %r1305; .reg .pred %r1306; .reg .u64 %r1307; .reg .u64 %r1308; .reg .u32 %r1315; .reg .pred %r1316; .reg .u64 %r1320; .reg .u64 %r1321; .reg .u32 %r1322; .reg .pred %r1323; .reg .u64 %r1324; .reg .u64 %r1325; .reg .u64 %r1326; .reg .u64 %r1327; .reg .u32 %r1333; .reg .pred %r1334; .reg .u64 %r1340; .reg .u64 %r1341; .reg .u64 
%r1347; .reg .u64 %r1348; .reg .u64 %r1349; .reg .u64 %r1355; .reg .u64 %r1356; .reg .u64 %r1357; .reg .u32 %r1358; .reg .u32 %r1359; .reg .u64 %r1363; .reg .u32 %r1371; .reg .pred %r1372; .reg .pred %r1373; .reg .pred %r1374; .reg .pred %r1375; .reg .u32 %r1381; .reg .pred %r1382; .reg .u64 %r1387; .reg .pred %r1388; .reg .u64 %r1392; .reg .u64 %r1393; .reg .pred %r1394; .reg .u64 %r1400; .reg .u64 %r1401; .reg .u64 %r1407; .reg .u64 %r1408; .reg .u64 %r1409; .reg .u32 %r1410; .reg .pred %r1411; .reg .pred %r1412; .reg .pred %r1413; .reg .u64 %r1421; .reg .u32 %r1422; .reg .u32 %r1426; .reg .u64 %r1427; .reg .u64 %r1428; .reg .u32 %r1429; .reg .u64 %r1433; .reg .u64 %r1434; .reg .u64 %r1435; .reg .u64 %r1436; .reg .u64 %r1438; .reg .pred %r1439; .reg .pred %r1440; .reg .pred %r1441; .reg .u16 %r1442; .reg .u32 %r1443; .reg .pred %r1444; .reg .pred %r1445; .reg .u64 %r1447; .reg .u64 %r1450; .reg .u64 %r1456; .reg .u64 %r1457; .reg .u64 %r1458; .reg .pred %r1460; .reg .u64 %r1467; .reg .u64 %r1469; .reg .u32 %r1470; .reg .u64 %r1471; .reg .u64 %r1472; .reg .u64 %r1473; .reg .u64 %r1480; .reg .u32 %r1485; .reg .u64 %r1486; .reg .u64 %r1487; .reg .u32 %r1488; .reg .pred %r1489; .reg .u32 %r1495; .reg .u64 %r1496; .reg .u64 %r1497; .reg .u32 %r1498; .reg .u64 %r1503; .reg .u64 %r1504; .reg .u64 %r1506; .reg .pred %r1507; .reg .u64 %r1508; .reg .u64 %r1510; .reg .u64 %r1512; .reg .pred %r1513; .reg .u32 %r1519; .reg .u64 %r1520; .reg .u64 %r1521; .reg .u32 %r1522; .reg .u64 %r1523; .reg .u64 %r1525; .reg .u64 %r1527; .reg .pred %r1528; .reg .u64 %r1529; .reg .pred %r1530; .reg .u64 %r1531; .reg .pred %r1532; .reg .u64 %r1533; .reg .u32 %r1534; .reg .u64 %r1541; .reg .u32 %r1542; .reg .pred %r1543; .reg .u64 %r1546; .reg .u64 %r1548; .reg .u32 %r1549; .reg .u64 %r1550; .reg .u32 %r1551; .reg .u32 %r1553; .reg .pred %r1554; .reg .u64 %r1555; .reg .u64 %r1556; .reg .u64 %r1557; .reg .u64 %r1558; .reg .u64 %r1559; .reg .pred %r1560; .reg .u64 %r1561; .reg .u64 %r1562; .reg .u64 %r1563; .reg .u64 %r1564; .reg .u64 %r1565; .reg .u16 %r1566; mov.u64 %r514,%ar0; mov.u64 %r515,%ar1; mov.u64 %r516,%ar2; st.u64 [%frame+432],%r516; mov.u64 %r517,%ar3; mov.u64 %r518,%ar4; mov.u64 %r519,%ar5; mov.u64 %r520,%ar6; .loc 1 3575 10 mov.u32 %r521,1; st.u32 [%r520],%r521; .loc 1 3578 17 ld.u64 %r22,[%r515+32]; .loc 1 3580 5 setp.eq.u64 %r522,%r22,0; @ %r522 bra $L764; .loc 1 3580 7 ld.u64 %r528,[%r515+24]; add.u64 %r1564,%frame,432; mov.u64 %r527,unit_addrs_search;56525527; call (%value_in),bsearch530,0; @ %r531 bra $L764; .loc 1 3593 33 ld.u64 %r493,[%r515+24]; .loc 1 3593 53 ld.u64 %r182,[%r515+32]; .loc 1 3593 26 sub.u64 %r532,%r530,%r493; shr.s64 %r534,%r532,3; .loc 1 3593 42 mad.lo.u64 %r536,%r534,-6148914691236517205,1; .loc 1 3593 9 setp.le.u64 %r537,%r182,%r536; @ %r537 bra $L653; .loc 1 3594 9 ld.u64 %r29,[%frame+432]; add.u64 %r495,%r530,24; add.u64 %r491,%r530,32; bra $L654; $L764: .loc 1 3585 14 mov.u32 %r538,0; st.u32 [%r520],%r538; .loc 1 3586 14 mov.u32 %r513,%r538; bra $L651; $L656: .loc 1 3593 9 add.u64 %r495,%r495,24; add.u64 %r491,%r491,24; .loc 1 3593 26 sub.u64 %r539,%r486,%r493; shr.s64 %r541,%r539,3; .loc 1 3593 42 mad.lo.u64 %r543,%r541,-6148914691236517205,1; .loc 1 3593 9 setp.ge.u64 %r544,%r543,%r182; @ %r544 bra $L771; $L654: add.u64 %r92,%r495,-24; mov.u64 %r486,%r495; .loc 1 3594 3 ld.u64 %r545,[%r495]; setp.gt.u64 %r546,%r545,%r29; @ %r546 bra $L653; .loc 1 3595 3 ld.u64 %r547,[%r491]; setp.lt.u64 %r548,%r29,%r547; @ %r548 bra $L656; bra $L653; $L771: mov.u64 %r92,%r486; $L653: .loc 
1 3604 5 ld.u64 %r93,[%r92+16]; .loc 1 3605 12 ld.u64 %r34,[%r93+128]; .loc 1 3626 12 ld.u32 %r272,[%r514+8]; .loc 1 3610 9 setp.gt.u64 %r549,%r92,%r493; @ %r549 bra $L814; setp.eq.u32 %r1560,%r272,0; bra $L658; $L814: setp.eq.u32 %r1560,%r272,0; bra $L657; $L660: .loc 1 3614 10 @ %r1560 bra $L659; .loc 1 3615 26 ld.u64 %r33,[%r93+128]; membar.sys; .loc 1 3615 10 mov.u64 %r34,%r33; .loc 1 3626 12 ld.u32 %r272,[%r514+8]; setp.eq.u32 %r1560,%r272,0; $L659: .loc 1 3617 10 setp.ne.u64 %r551,%r34,-1; @ %r551 bra $L658; .loc 1 3620 7 add.u64 %r92,%r92,-24; .loc 1 3622 9 ld.u64 %r93,[%r92+16]; .loc 1 3623 16 ld.u64 %r34,[%r93+128]; .loc 1 3610 9 ld.u64 %r552,[%r515+24]; setp.ge.u64 %r553,%r552,%r92; @ %r553 bra $L658; $L657: .loc 1 3611 9 ld.u64 %r37,[%frame+432]; .loc 1 3611 3 ld.u64 %r554,[%r92+-24]; setp.gt.u64 %r555,%r554,%r37; @ %r555 bra $L658; .loc 1 3612 3 ld.u64 %r556,[%r92+-16]; setp.lt.u64 %r557,%r37,%r556; @ %r557 bra $L660; $L658: .loc 1 3626 6 @ %r1560 bra $L661; .loc 1 3627 13 ld.u64 %r40,[%r93+128]; membar.sys; mov.u64 %r34,%r40; $L661: .loc 1 3630 6 setp.eq.u64 %r559,%r34,0; @ %r559 bra $L662; .loc 1 3685 6 setp.eq.u64 %r560,%r34,-1; @ ! %r560 bra $L664; bra $L663; $L662: .loc 1 3641 11 ld.u64 %r41,[%r92+16]; .loc 1 2908 3 add.u64 %r1563,%frame,400; st.u64 [%r1563],%r34; st.u64 [%r1563+8],%r34; st.u64 [%r1563+16],%r34; .loc 1 2909 13 st.u64 [%frame+424],%r34; .loc 1 2911 3 add.u64 %r1556,%frame,200; st.u64 [%r1556],%r34; st.u64 [%r1556+8],%r34; st.u64 [%r1556+16],%r34; st.u64 [%r1556+24],%r34; st.u64 [%r1556+32],%r34; st.u64 [%r1556+40],%r34; st.u64 [%r1556+48],%r34; st.u64 [%r1556+56],%r34; st.u64 [%r1556+64],%r34; .loc 1 2914 10 ld.u64 %r101,[%r41+56]; .loc 1 2914 59 ld.u64 %r102,[%r515+136]; .loc 1 2913 6 setp.lt.u64 %r576,%r101,%r102; @ %r576 bra $L665; .loc 1 2916 7 cvt.u32.u64 %r579,%r34; cvta.const.u64 %r578,$LC29; { $LCT38%r518,$LCT38; } .loc 1 2917 7 bra $L666; $L665: .loc 1 2920 17 cvta.const.u64 %r580,$LC30; st.u64 [%frame+336],%r580; .loc 1 2921 46 ld.u64 %r103,[%r515+64]; .loc 1 2921 18 st.u64 [%frame+344],%r103; .loc 1 2922 57 add.u64 %r581,%r103,%r101; .loc 1 2922 16 st.u64 [%frame+352],%r581; .loc 1 2923 58 sub.u64 %r582,%r102,%r101; .loc 1 2923 17 st.u64 [%frame+360],%r582; .loc 1 2924 25 ld.u32 %r583,[%r515+200]; st.u32 [%frame+368],%r583; .loc 1 2925 27 st.u64 [%frame+376],%r518; .loc 1 2926 17 st.u64 [%frame+384],%r519; .loc 1 2927 31 cvt.u32.u64 %r584,%r34; st.u32 [%frame+392],%r584; .loc 1 717 9 add.u64 %r1561,%frame,3361561; call (%value_in),read_uint32,(%out_arg1); ld.param.u32 %r587,[%value_in]; } .loc 1 717 7 cvt.u64.u32 %r145,%r587; .loc 1 718 6 setp.ne.u64 %r588,%r145,4294967295; @ %r588 bra $L772; .loc 1 720 1361; call (%value_in),read_uint64,(%out_arg1); ld.param.u64 %r591,[%value_in]; } mov.u64 %r145,%r591; .loc 1 721 19 mov.u32 %r509,1; bra $L667; $L772: .loc 1 724 17 cvt.u32.u64 %r509,%r34; $L667: .loc 1 2930 17 st.u64 [%frame+360],%r145; .loc 1 2616 11561; call (%value_in),read_uint16,(%out_arg1); ld.param.u32 %r594,[%value_in]; } .loc 1 2616 16 st.u32 [%frame+200],%r594; .loc 1 2617 24 add.u32 %r596,%r594,-2; .loc 1 2617 6 setp.le.u32 %r597,%r596,3; @ %r597 bra $L668; .loc 1 418 38 ld.u64 %r602,[%frame+352]; ld.u64 %r603,[%frame+344]; sub.u64 %r601,%r602,%r603; .loc 1 417 3 st.u32 [%stack+16],%r601; ld.u64 %r604,[%frame+336]; st.u64 [%stack+8],%r604; cvta.const.u64 %r605,$LC31; st.u64 [%stack],%r605; cvta.const.u64 %r600,$LC1; mov599600stack; call (%value_in),sn,%out_arg4); ld.param.u32 %r606,[%value_in]; } .loc 1 419 3 ld.u64 
[nvptx PTX object payload, continued: libbacktrace dwarf.o (dwarf_lookup_pc, dwarf_fileline, backtrace_dwarf_add), fileline.o (fileline_initialize, backtrace_pcinfo, backtrace_syminfo), posix.o (backtrace_open, backtrace_close), print.o (print_callback, error_callback, backtrace_print), read.o (backtrace_get_view, backtrace_release_view), simple.o (simple_unwind, backtrace_simple), sort.o (backtrace_qsort).]
%r98 bra $L30; mov.u64 %r97,%r49; bra $L8; $L30: add.u64 %r97,%r55,%r43; $L9: .loc 1 54 9 ld.s8 %r38,[%r42]; .loc 1 55 10 ld.u8 %r72,[%r43]; st.u8 [%r42],%r72; .loc 1 56 10 cvt.u32.u32 %r73,%r38; st.u8 [%r43],%r73; .loc 1 50 31 add.u64 %r42,%r42,1; .loc 1 50 36 add.u64 %r43,%r43,1; .loc 1 50 3 setp.ne.u64 %r75,%r43,%r97; @ %r75 bra $L9; $L8: .loc 1 79 27 add.u64 %r31,%r31,1; .loc 1 79 3 mov.u64 %r49,%r97; setp.lt.u64 %r76,%r31,%r54; @ %r76 bra $L5; .loc 1 89 6 setp.ne.u64 %r77,%r28,0; @ %r77 bra $L10; $L13: .loc 1 98 13 add.u64 %r78,%r54,-1; sub.u64 %r32,%r78,%r28; .loc 1 97 25 mad.lo.u64 %r80,%r28,%r55,%r55; .loc 1 97 12 add.u64 %r50,%r53,%r80; .loc 1 94 9 add.u64 %r81,%r28,%r28; .loc 1 94 6 setp.lt.u64 %r82,%r81,%r54; @ %r82 bra $L11; bra $L31; $L10: .loc 1 90 5 mad.lo.u64 %r48,%r28,%r55,%r53; .loc 1 50 3 @ %r98 bra $L13; add.u64 %r52,%r48,%r55; mov.u64 %r47,%r53; $L14: .loc 1 54 9 ld.s8 %r44,[%r47]; .loc 1 55 10 ld.u8 %r85,[%r48]; st.u8 [%r47],%r85; .loc 1 56 10 cvt.u32.u32 %r86,%r44; st.u8 [%r48],%r86; .loc 1 50 31 add.u64 %r47,%r47,1; .loc 1 50 36 add.u64 %r48,%r48,1; .loc 1 50 3 setp.ne.u64 %r87,%r48,%r52; @ %r87 bra $L14; bra $L13; $L11: .loc 1 9); } .loc 1 97 12 mov.u64 %r53,%r50; .loc 1 98 13 mov.u64 %r54,%r32; .loc 1 99 7 bra $L15; $L31: .loc 1 103 7 {); } .loc 1 106 7 mov.u64 %r54,%r28; bra $L15; $L1: .loc 1 108 1 ret; } state.o/ backtrace_create_statebacktrace_create_statebacktrace/statbacktrace_backtrace_allF: backtrace_create_statebacktrace_create_statepred.loc 1 61 3 add.u64 %r27,%frame,8; mov.u64 %r29,0; st.u64 [%r27],%r29; st.u64 [%r27+8],%r29; st.u64 [%r27+16],%r29; st.u64 [%r27+24],%r29; st.u64 [%r27+32],%r29; st.u64 [%r27+40],%r29; st.u64 [%r27+48],%r29; st.u64 [%r27+56],%r29; .loc 1 62 23 st.u64 [%frame],%r23; .loc 1 63 23 st.u32 [%frame+8],%r24; .loc 1 66 5 mov.u64 %r38,call (%value_in),backtrace_alloc67 6 setp.eq.u64 %r43,%r41,0; @ %r43 bra $L1; .loc 1 69$L1: .loc 1 7 unknown.o/ FUNCTION DECL: unknown_fileline .func (.param .u32 %value_out) unknown_fileline.file 1 "../../../libbacktrace/unknowbacktrace_initialbacktrace_initial); // BEGIN FUNCTION DEF: unknown_fileline .func (.param .u32 %value_out) unknown_fileline3.loc 1 49 10 mov.u64 %r32,0; cvt.u32.u64 %r31,%r32; { .param .u32 %value_in; $LCT0: .callprototype ( .param .u32 %value_out),.param .u64 %in_ar2,%r25),$LCT050backtrace_initialbacktrace_initialmov.u64 %r23,%ar0; mov.u64 %r28,%ar5; .loc 1 62 24 mov.u64 %r29,0; st.u64 [%r23+32],%r29; .loc 1 63 16 mov.u64 %r30,unknown_fileline; st.u64 [%r28],%r30; .loc 1 64 10 mov.u32 %value,1; .loc 1 6 fortran.la# libgfortranfortran -lmfortranfortran.spec# # This spec file is read by gfortran when linking. # It is used to specify the libraries we need to link in, in the right # order. 
# %rename lib liborig *lib: -lm %(libgcc) %(liborig)u32 %r164; .reg .u64 %r165; .reg .predpred %r180; .reg .predpredpredu32 %r212; .reg .u32 %r216; .reg .pred %r218; .reg .u64 %r219; .reg .u32 %r220; .reg .pred %r222; .reg .pred %r223; .reg .u64 %r224; .reg .pred %r225; .reg .pred %r226; mov.u64 %r99,%ar0; mov.u64 %r100,%ar1; mov.u64 %r101,%ar2; mov.u64 %r102,%ar3; mov.u64 %r103,%ar4; st.u64 [%frame],%r103; mov.u64 %r104,%ar5; mov.u64 %r105,%ar6; mov.u64 %r106,%ar7; mov.u64 %r107100; neg.s32 %r128,%r127; .loc 1 46 24 cvt.u16.u32 %r130,%r125; cvt.u16.u32 %r131,%r128; and.b16 %r129,%r130,%r131; .loc 1 47 14 cvt.u32.u16 %r132,%r129; cvt.u16.u8 %r133,%r132; setp.ne.u16 %r134,%r133,0; @ %r134 bra $L196; .loc 1 46 49 shr.u64 %r136,%r101100; neg.s32 %r139,%r138; .loc 1 46 53 cvt.u16.u64 %r142,%r136; cvt.u16.u32 %r143,%r139; and.b16 %r141,%r142,%r143; .loc 1 46 40 cvt.u32.u16 %r144,%r141; cvt.u16.u8 %r145,%r144; setp.ne.u16 %r146,%r145,0; selp.u64 %r100,%r99,%r100,%r146; bra $L167; $L196: mov.u64 %r100,%r99; $L167: .loc 1 46 11 st.u64 [%r224+16],%r100; .loc 1 48 12 st.u64 [%r224+24],%r101; .loc 1 49 1210010010 @ %r225 bra $L179; .loc 1 447 15 ld.u64 %r191,[%r55+24]; .loc 1 448 4 ld.u64 %r192,[%r191+88]; {loop_ullu16predpredpredpred %r180; .reg .predpredpredu32 %r212; .reg .u32 %r216; .reg .pred %r218; .reg .u64 %r219; .reg .u32 %r220; .reg .predmov.u32 %r101,%ar0; mov.u64 %r102,%ar1; mov.u64 %r103,%ar2; mov.u64 %r104,%ar3; mov.u64 %r105,%ar4; mov.u64 %r106,%ar5; st.u64 [%frame],%r106; mov.u64 %r107,%ar6; mov.u64 %r108,%ar7; mov.u64 %r109,%ar8; mov.u64 %r110,%ar10 @ %r229 bra $L181; .loc 1 449 15 ld.u64 %r191,[%r54+24]; .loc 1 450 4 ld.u64 %r192,[%r110010010010010 ordered.o/ predld.shared.u64 %r57,[nvptx_thrs]; mad.lo.u64 %r48,%r55,104,%r57; .loc 2 108 21 ld.u64 %r42,[%r48+16]; .loc 2 113 6 setp.eq.u64 %r58,%r42,0; @ %r58 bra $L21; .loc 2 113 27 ld.u32 %r83,[%r42]; .loc 2 113 20 setp.eq.u32 %r59,%r83,1; @ %r59 bra $L21; .loc 2 109 27 ld.u64 %r43,[%r48+24]; .loc 2 117 21 mov.u32 %r60,-1; st.u32 [%r43+44],%r60; .loc 2 120 9 ld.u32 %r23,[%r43+40]; .loc 2 124 26 ld.u64 %r53,[%r42+80]; .loc 2 120 6 setp.ne.u32 %r61,%r23,1; @ %r61 bra $L23; .loc 2 124 51 ld.u32 %r62,[%r48+40]; .loc 2 124 7 shl.b64 %r63,%r62,3; add.u64 %r64,%r53,%r63; .loc 3 63 10 ld.u64 %r65,[%r64]; atom.add.u32 %r66,[%r65],.u32 %r68,[%r43+48]; add.u32 %r39,%r23,%r68; .loc 2 134 10 setp.gt.u32 %r69,%r83,%r39; @ %r69 bra $L25; .loc 2 135 8 sub.u32 %r39,%r39,%r83; $L25: .loc 2 136 44 ld.u32 %r32,[%r48+40]; .loc 2 136 27 cvt.u64.u32 %r70,%r39; .loc 2 136 35 shl.b64 %r71,%r70,2; add.u64 %r72,%r51,%r71; st.u32 [%r72],%r32; ld.u32 %r83,[%r42]; $L24: .loc 2 139 9 ld.u32 %r73,[%r43+48];mov.u32 %r30,%ar1; .loc 1 37 6 and.b32 %r31,%r30,1;9+4pred %r64; mov.u64 %r42,%ar0; mov.u32 %r43,%ar1%r51,[%r42];ld.u32 %r60,[%r42];5656565656; $L52: add.u64 %r74,%r171,4; st.u64 [%frame+24],%r74; .loc 1 101 6 setp.gt.u64 %r333,%r56,%r5656; .loc 1 115 7 min.u64 %r342,%r343,%r69; .loc 1 115 22 add.u64 %r344,%r190,%r56835656; .loc 1 96 16 sub.u64 %r362,%r191,%r5656; $L57: add.u64 %r91,%r174,4; st.u64 [%frame+24],%r91; .loc 1 101 6 setp.gt.u64 %r372,%r56,%r91100,%r47,%r170; st.u64 [%frame+24],%r100100141141,1; st.u64 [%frame+24],%r121; .loc 1 101 6 setp.gt.u64 %r474,%r60,%r121; @ %r474 bra $L26; mov.u64 %r60; .loc 1 96 16 sub.u64 %r482,%r191,%r60; $L72: add.u64 %r126,%r146,2; st.u64 [%frame+24],%r126; .loc 1 101 6 setp.gt.u64 %r491,%r60,%r126; @ %r491 bra $L26; mov.u64 %r60; .loc 1 115 7 min.u64 %r560,%r561,%r113; .loc 1 115 22 add.u64 %r562,%r190,%r60; $L80: add.u64 
%r155,%r113,%r127; st.u64 [%frame+24],%r155; .loc 1 118 6 setp.gt.u64 %r564,%r6060; $L82: add.u64 %r167,%r441,%r163; st.u64 [%frame+24],%r167; .loc 1 101 6 setp.gt.u64 %r590,%rpredpredar2; .loc 2 631 15 ld.u32 %r33,[%r36]; .loc 2 632 3 ld.global.u32 %r3$L199: .loc 2 657 7 ld.u64 %r81,[%r29];acc_on_device_h_acc_on_device_hacc_onacc_onF: acc_on_device_h_acc_on_device_h_mov.u64 %r27,%ar0; st.u64 [%frame+16],%r27; ld.u64 %r28,[%frame+16]; ld.u32 %r22,[%r28]; mov.u32 %r29,%r22acc_on_device,(mov.u32 %r23,%r30; set.u32.ne.u32 %r24,%r23,0; neg.s32 %r24,%r24; st.u32 [%frame],%r24; ld.u32 %r25,[%frame]; mov.u32 %r26,%r25;__tls_get_addrlibc.so.6ld-linux-x86-64.so.2EL@HEHEE<(HHPT LdIH@$HHLRLLILsILMfANA.optLELPHMHUHL( H]1DHH}蒆HUDHHHhY\H`HpHHPHDXHHfDHpH}HMHUH}} \E1DuH]H`tHHfH8VLDHXH9uE AFDxIHEH]HHPL|X1IIEH=zH=&H=0H=utCHMH9HAX1f.HHXHx9u@HuH}LpIXM9SH]LuDxDmL}E1E9sYDH}HHPHHH=H=H=H=AE9r=/&HEE1E1E9DHE@E9DHHPH<&AWH HHJHfHXHzA9AwD9 DHH}D9HPH _DHHPHtHAHHuHx'AIE9D;DH}HHPH H9uE9DH HJH<at I|$HMHuH>VDHFX1HHXHxV$9uOLmOH=}!HuH;HVX1@HHXH;B9uuHMH9thHAX1HHXHxtG9uH=LH=MH=uAH=5HAHI9EHjIU=+/G1褦@M1H9u H=l7AoEAoEFAoE F AoE0F0AoE@F@IEPHFP @H=M@HzHaMAIHHH}RH}H=HOHXHHGeHB G1HJHfDH=xE9HQHG11.DHpHHDHpH+hH;EHhHP HEH@HEH}Heظ[A\A]A^A_]fHzHAHEHBAHxHMI?IGHuWHuHB GHBHDH=TH=!HVHIEHH96Mt I|$H;H!HHFHHHtPGTGqfDHFKHɸN fE9`DH}HHPH HEHEEE=I?AoAoGBAoG B AoG0B0AoG@B@IGPHBPZHl(I{?A1 XX9`L]I-LHMHxG11dE.p?HHGHB H}AUHFDH HFiHLHe1[A\A]A^A_]1 yL]1II3I}1 L]II3HE HE 1H|L]IMt(AI?IGHE9OHUDDm[HGGHDHHJ HEHEPGGmHVIMG=$/1 1I~L$H9<D1L1A@AWHGDH$PD$H$L$}A[]D1N]HD$9$vW ]HI3fDL@IID8 (It$ LR]J< IHھ HuHѿ$HL$H<^ H=(.$H.'1f.HH$H$@HǾG1E1~!Ly$H9+f11HA AW+GLDH$PD$H$L$EAYAZ D1I9$tHc$9vKHL$@L$I$HiH9I E1UH<$H$$=GHDŽ$@D$pH$HD$h$D$tH$HD$x褼HH$@1HRGA|HH$A}1E1ZƄ$HI.Iĉ$YD$pMcHO$Pt$pD$L$HL$ $H$I$L%._AXK<tAHA9v'H$@g(HHH薿uƄ$Eu$uLH.H$@HtD$H|$H=h.`GHDŽ$LHthH$mGHE1*H$JJH|$HHt$L$LL$(LHHD$ Hw?IL9HD$bML$mfDI]M}I}=uLHL$ L)HHHt0>1@?$2T?3h?4?8@d9D@D;@T;@d;@t;@;@;@; A; A;4A;HA;\A;pAD>D$>,D4>@DD>TDT>hDd>|Dt>D>D>D>D>D>D>E>E>0E?DE?XE$?lE4?ED?ET?Ed?Et?E?E?E? F? F?4FHHH II II4I$IHI^OtdVDuV|Y}$Y0Z$DZtl`gD$g$hthjjdlA$B$CtCC0CDC\CpD4DdDDLLLOOP$QTQdTDU X\YtYdZ[t\X]^$^4^^^^<_Pt_p____d`acT$ch4c|cdd0dDe`$e|efgHhpjDjjj8kdTkkl4lo$o,o@oTolorTsdss0sDDt|dtttuu TvDv\wxDy$ԀTāDDXTxdt4 D T l T   ! 
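For readers unfamiliar with GCC spec files, the following is a minimal sketch of how the %rename/override pattern in libgfortran.spec composes. The built-in *lib body shown in the comments is purely illustrative (the real built-in spec differs per target), and the final expansion is an assumption about ordering, not something taken from this package:

# Illustrative only: suppose the compiler's built-in spec were
#   *lib: %{pthread:-lpthread} -lc
# Reading libgfortran.spec first saves that definition under a new name...
%rename lib liborig
# ...then redefines *lib so that -lm and the libgcc spec are emitted
# before the saved original is expanded again via %(liborig):
*lib: -lm %(libgcc) %(liborig)
# Effective link-library expansion under the assumed built-in spec:
#   -lm <libgcc libraries> %{pthread:-lpthread} -lc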
U)H=1L1/HIL4 4HLH$Hƺ H=H11H1H襠HKIHL/.HH¾/辤H$Hƺ H=궞11LHIL4c4HLsH$Hƺ> H=11H;H$HHKIHL//HHH$Hƺ H=;11LHL4H贄4HLLAL&H{Aʫ`趫$?$1H$H诖tMn[L蜖tG @H$DHT$P1L4HAIN{D9l$(IFu(1F1H$HHH$1ҾqŻH$BHJLdLp0HT$PH$@H1yH$1ҾHŻH$BHJHlL$H$@LHD$`H|$`THL$`H5Hp}H$H=N11HH$IHھ<HI,HLH$LhL|$@LcG-Dt$E9tCt$(A&HAǃtHHH!H9u A9`T襢D|$cD;|$'Ƅ$Lc/HxAL$4AID$MD$HD$XIDD&EA9H$Dt$HDt$HtAH9L$(w|$(*ZD$(1tܺ`賡1HD$X$LhI]H\$LH$@L$9H|$PHD$G6AL$4HHD$`HH`LIH1LHH$HDŽ$@vH$@LH$Ht$H=Ъ11oHT$XHž1)6HHhht f}uHEHƺL!H=11LHLHD$^Hl$HHNHLHt$H=:11HT$XHpHHD$HLH=11LLL$L$?H$HILԕHLTH=11LQLHL4.x4HH>HLHt$`H=j11 1LH$Ld$L$HT$PH$@H1HDŽ$@frhL$κ?Hk0HIL$(HH8IL$0HH@ID$0HIT$0F`HPHj0,1L1V`jɚ`e赚1L1`r苚^h11Ҹ`PDt$`?2`>L1fA}IEuf8tHLD$(HT$8LHƿ/vHt$0H1H$1ɺHĻIXL趄I萅H\$0H$HHD$@t$(H$耺H$T IcIEALHD$(|$I~HH߃;l$uIHt$(LG]HD$(f8x4IcHD$aH|$t\=ҝtGH$@PH1DŽ$@HDŽ$HDŽ$PaHDŽ$XǜЩƃ,|$=KҝtH$@H1DŽ$@HDŽ$HDŽ$PiHDŽ$XǜJЩ뗺7蟗H11E1HD$pHE1LH |Lt$PH$諕E1DŽ$L$HD$Hz$H@ HDHxެIL$1MLd$8"HDRMH9l$HDH=1L4Ht$@H7LHH >*IV(9l$MHP8IV0HP@IF0H0Iv0!$L$$9D$PL$H$BprH\$IF HxpIH|$ue|$Hu<``L$`8HD$ H|$xHD$xD$HH|$ D$w\``Lt$0D$H^`˔(跔MM81y@ H|$8TxItLL9l$IEp 0HuHHH fyDG,LH$@MMH`&;֘HD$HD$`HHD$hE1H}H|$HPHHtzt[HpQ=1ǝtEH$` 1DŽ$`HDŽ$hHDŽ$p"HDŽ$xrĩHxHHLHE1HHt11HPHHtztHP=Ɲ %1HHHI4HIHvH}f}H(f8(H$`H$0H$`H$0HHHD$ oD$8HHt$ 1D$8HL$H$`LLT$(H@DHp4ADDL$0/ LT$(DL$0H$$AL# H|$ H$1qD$8@HD$hHLLPUATAWL$LD$xHL$pHHD$HH Ht%=H|$PHD$H@HH'x'H|$[H£tH HCHt @D$.HL$Lt$ H$`HDHL$HD$0HD$LL(LL$0M A|$ID$HtH4HH9$t =Ý A$wdA$tY=ÝtBOHH1DŽ$`HDŽ$hDŽ$p"HDŽ$x111LOMMtS=WÝ)LHHIt׋e A$A|$MMuLHL$`MtULHHt8DEuxt&QwtHH@Md$MuAEQM}@IE@L$pHD$H$hL$`fDMA?LHHIHDŽ$03xLMt)@LHLHHH$MMuM$LHHLGH$0H$`HLzL$`EHx(>HH.5L$`MAEQI}@HD$L$pHH$`H$hu(fDH$`HxHH$`?uG@tD_`Eu袜HxIf?t耧LH{HHHXHt PH`Ht @HpHt @HH0DH$0HD$ E1D;D%FEAf.L4HH1DŽ$`HDŽ$hDŽ$pr"HDŽ$x+fDH% fDAG 0aIGH 0Mf87C@9LΡHLL$`LHqL$`HD$HD$H9D$0H|$XEDL$8Lt$HHL$XLD$`HT$PLAEDE謟LLL|$ DHLzMHD$IF1Lum,LxXINXHAAHAAH1HuH8A2F_HH!H9HI#~XDHHPHHAHHH!AF_D8DFLl$HLIH!t H9Lt$Lh IL1HHIHADA)HD$EEEHD$H$`McHHMLHD$IcELHHH?=LH,-H$`LxL$`~It$H|$HH1DŽ$`HDŽ$hDŽ$p"HDŽ$xحLHH$0D$H H $H91I|$H4$H@CI~HD$P肓HD$X1HT$նHT$ELH=HH!H HH1HuHH8H81`HHtRHH9IHpHH9v HHH!ADGEHLH!1LD$8HT$0HD$MHT$0LD$8I=HI=H!I HL1HLH8H811LD$8LH1HuLH8HT$0H8LD$811|f.H蘙H$`1eHDŽ$`HDŽ$hHDŽ$p5#HDŽ$xiHD$Hx0tB_HH1DŽ$`HDŽ$hDŽ$p8#HDŽ$xH1DŽ$`HDŽ$hDŽ$p9#HDŽ$xTcH$`DH1DŽ$`HHDŽ$hDŽ$p"HDŽ$x}HH!H9HADGHHEgfHLH!H9HLL!H D$H=H!H HH!H9HLH1HuI8H8D1uAAHL$H9vHHHHH1H%E8@A,A@@tE8DAFLT$  $AA[A\LT$P!)e@BLT$pjjAALLT$ $AA^A_L$jjHAAH$ $A[]jjAHt$0AH$/ $A\A]jjAHAH$ $_AXAA@jjAHt$0AH$ $AYAZL$jjAALH $XZCHD$L$jjAALHpx> $AY^L$jjAALH $A^A_-HD$L$jjAALHpP $AXZз7 cз6 bP!bQ"bjjAAHH$ $Y^vjjAHt$0AH$ $_AXAHBtk&=TbHD$ptJcDH|$@صHD$@HD$HD$PE HD$XH 1P`HD$pvfDLLkF?fD#nfD(1E1Y^FI$IDŽ$(11^FE1I$HDHT$@H {6oH=HD$PD$X )D$@HD$`H|$@1I_Qa@H=[_HHsHHD$腼HIHT$šf.AWAVAUATUSH$H|$H$H$H$L$$$L$DŽ$DŽ$DŽ$ $HFDŽ$HD$PHHt @$H\$HHɉL$0Ht$X\HD$HD$ tHD$XHHD$ Dt$0EH|$ !Є*HD$HDh$HD$EH$PD$(HD$`H$H$H\$D$(H|$`HH,H7H$PHLd$HS0Hz0HB(HDŽ$HDŽ$HDŽ$H$xHDŽ$H9H$pHDŽ$H$p;H$pH$xLjH9trA}?H$H9H$puH$HHBH$E1HH$pH$xHDŽ$HDŽ$H9ufH[HEQumL}@M=M(< <MmMAE<uA]IE1DHA9sAEH fH% HD$ !D$0HD$ H$H$Hĸ[]A\A]A^A_fA}IUHrL9L u,AUHHc1< Hՠ4ID{ H$x_fDHc1AL4Š4IHMI>Hfu Lw H f=t wLHUHj0HB(HDŽ$HDŽ$HDŽ$H$xHDŽ$H9H$pHDŽ${HELuHHHT$HT$A<-p@ Iv(Ht<v I~8t H6H}HFHH DH$Ht#H$HPH$HHDŽ$H$HEH4$H$HEH0HuH$pH$xHjH90}H$H9H$pHELuL LA<<;$HHBH$1HH$pH$xHDŽ$HDŽ$H9DDH@D9tf}GHUHL$HrH91Hu+UHHc1< Hՠ4HDH$x<hP \<IV(vHt I~8t HH|$h H$H7 Mp1H|$HH<$HhH^΢IH$pH$ DŽ$pD$HHD$8Z$AHD$HELHHHDH2H<$IFHD$IFHD$0H@HxHXLkIEHxvH@H<$HHT$HHT$HH;P>H|$8LL$pMHl$HImXf}IH|$IuX誁Iu0H=HILD1L$&Ht$0HHD$h$HT$1HLHD$XL$HD$X1LLHLL$hs$LLMmMUHl$HH9$H|$8H$DŽ$p‹$$ u@2$@%=H$$ H@@H%@8b@v Au @tpAAD uAAAADǁȹ=G%@8G$ %H$ Љ$ $ Ј$$ _)$ H<$1moH|$  HD$ HHH$` H$pD$PHt$8H<$1҉$*腝0="HILHHrsH|$HLzHH HHT$H|$ 
{)L|$xIݵE1E1LH1H藬H|$P1keHILp1RHH$fL|$L(LVHLH$pCHɢH$PH$IDŽ$PHHL$`D$H4$LLALA蟒IHJH9uHJHt$HHH$p$HHD$8~)$HcHmL|$Ll$xFLLH$HT$P1MGFIHT$P1MGI$H$0LIƄ$0$0H$H|$PHt$XHHPHHR4H$p$x$|cHPHH$HR4H$P$XHL$8$\HT$`蕾Ht$8E1HDŽ$HH$pT诚Z:#HD$Ll$pD$1HpIHLD9HT$X7I@H$ LIhLD$LxHXH$IGHx A1HHHHHpHbHHIbLD$IH5bH$pIH@8H$PH$PH$pʴH $\ IHLG1-HMHc1LH1XHCL|$xLt$ HcMgLMILa1ILLF$ILt$ tb$pHcD$0HcH9t$ I9}8HD$ LLD$(H9HBHD$ oaLD$(HLZ1XIH$Hk1:H<$INHŢHH$pH$DŽ$pHHD$8D$H|$HLAHAH詎HHD$`HEHHH9HL$(L4$HT$`HHL$(LHD$HLH@XHEXAV$H$DŽ$I$H$1D$$ L4EA@IHHSXH1LuH82C_AuDL!L9TH{XDL!HH"HIIDK_M!AA8AFI1HHXH$HL$XHHX@_L!HL$PAƈ$HSXAH1LuH82C_1HL$XHH1LuH8H8Ll$P1ȨH{XL!L9UHL$XHL!L9AIL#$HH=M M!LH HH$L!L9HIH1LuHD$XH$H8H81Ȩ`HD$PHLHH1IH$pH$pHH9H@HGC_L!EDI<F@A9AENE9}A9D~<FAA@Ht$PDHH#$L!H=H L!HHt$XH HH$H1LuHHH8H81DDD9ADYHH#$H=H L$PL!HHL$XH HH$H1LuHH8H81بDDDAH$HLL!E$AH#CXH HCXH8A $DC_H$9C$aH$ЗAMHL$8L$pMHD$HHDŽ$pH$xH$H$p|H<$.HpHHxHXH$pHHD$8H$pHC$HD@HH|pHmHuH|$ ~j XևH$HL(HD$HHcI]HtHHH!H9,Hp[HIUM1LHھ]HhLHIն HlѶH$PLHHDŽ$PxEQIKHE@H$PH$pH$H$pHE@H$x9@HDŽ$hH$pHDŽ$xD$P\DC_LkXM!AEZk_MI!@ XgHHc< XFDC_AEHL$XH8@tP`IEDHl$PyHt$XH8@@tA@`IEDA1HH=L!H\$P&1HH=L!It5HL$8Ht$PL1IH=M!HHDŽ$pHH@HD$(_ILI9IMHt$ YILHt$ XIMHھG1DHcT$0$L)AH)1HHD$@HI\H6HEHD$HHAHYHyHHqAԗHD$Hc1Hc f fw1DL9bH\$@$HD$\$LlHD$LHD$@D$MH$ H$pȾ1DŽ$pHDŽ$xXDŽ$ HDŽ$+ X肃H$ H$pDŽ$pHDŽ$xXDŽ$ HDŽ$ϻ X&։F7@B@2$ @1f X跂 X裂 X菂 X{ Xg XSH$ H$pȾDŽ$pHDŽ$xXDŽ$ HDŽ$蠺 X XずH$ H$pDŽ$pHDŽ$xXDŽ$ HDŽ$0 X臁It$H~tH$H$H$p1o$L)$pH$,~ X)|IE5H0\$E@$Ot$0Ll$pMX迀@D|$DHø H$xH$xHDŽ$HDŽ$HDŽ$C4?H$p )H|$`HD$`HD$`D|$DPC4?p)t&HDVHT@HtHHH9uELE1I1H@L$AL$4L$pD$DŽ$HMH$6A}EED)A9vtA}C47Z>H$p1HH4HtWL$ T$>EL$ IH$pT$%AU AEAEEAEt%1AHLƒA9ILuEu@Ll$H$p1EL.ѷIHD$H1L4Ll$HH7HT$HH|$1Ll$`Hu?IDDl$DA9A-ED$D$$Dd$$ED$ D$$E1HD$1EfDHt$HtN41HHt$HtdH=lp11҉ ID|$ D$HT$`LDP@ 8LHLHT5\H|$HiHD$`A.LdAE9jD$$9D$DAD$Dt$AIMH HIH9uD@I1D$$9D$DADAHD$(HH;k)9D$8vb{2D$84(A>HH4HD$(H8H_+EHt$(HX PD$8H|$`th1Ll$0Ld$(H(I$HD)H9HLPrpHLtBD9sA1L4HLDH|$4I $H9H|$`QrqHDuHt y~GH$pHt xTHĨ[]A\A]A^A_DHDŽ$pT$?f 1HD$hHD$hHD$`DG蛫D$81D$?@D |$?PxHSHHt)uH|fHtH2H1HHH9uh;H|$`EE1HD$(HD$8A1bfu0ELHÀHH}A L0LcHC HC(CIEE11HC?HULHH}(;=Y{HH@ HH1ƅDE1fLdI|$HtLMD$Apt!Wt0f9Tu9rIPHD9AHHt;XrEDHI]HW Dc%D9E|$HD{JDD%OE>A.H0Ht:GAEI(I;.HH fc G@%E1AƅHbHHDž0DžH4HHH=HD` H}JrHBH|fDHH9QHHq1HtNA9t.t7IOH}H1EHEE HE;H1E1bDD,H}1HEHEEHEHH@ HE{0DD{S>I}IH4H7DghEHIEADcE|$%D CAIEEC1HDAH43 EIEHtH H HHH9uD`HEH}H1EHEEHE H}1HEHEEHEx HH@ HX1LLAL^HRS1HIFHIF HxE1Lu"fDHID$DH@ NHDžpHE0J Ιf.HHHxLrIvHJHBH@HHH<ʋHKDPD9VL I9$D HD;Lt\HpH">HpHuE1DH0DBH0DAƒ9H\uqf.1 1E11{=EHEEHExH}H1&=T[EHEEHExH9tHHH@H.HD`DT?HHY?I~1AL;9wv;DA9ƒ?ILHM‰IM THu1A9HHTHs^ۺ0h ̙0 ˙0u˙1}HpܧH& HEHEHEEE{k>IH HHELHHpDeHpI>HxI>Iăo,u1OHAD$,1D]E1L LP HxDeDHHpHEHH HD+M"A9BGM4HAI~IЍPD)D苍)IL9ED$AA ucEHHH9w)։HHHLMuDeEuD]H% EtDetD]H% HBIFEtDetD]H}AE11Ҿ Mb)LHMIT$E1I>INIF HHtuD9`voDHLtIVHRHt;u%uI~Ht yKGAIN1Ht9Qv։TH}$HxHBH9HHIFH@ l{AWAVAUATAUSHH8=mHD$D$ HD$ OF9ssI\L+HLPnH{t HL=tH|$ 1HD$ HD$(D$0:HD$8AF9rD$fD$MlM}(ImMtIGHPA<$HEHttHtmAUHptaH~(tZ1'fDHtH;XsCA9]Htv6H~(t/I}u؀~tHL(;HEHuD$D$A;F5HX[]A\A]A^A_fDH|$ AE11Ҿ Ht$ HH|$ j DHBHD$H@Hx*AH$IG HHu PH;Vx1H$HL$HT$@$H@HL`HDHxL$͵Iw HT$L$HTHHzH$?IG H`;XTHu HE HA@6O]ffDH$H$LDT$L$L$DT$IC$H$$LcH$ADHALH$L扄$L$DT$JICH[L$ $$H$ALDHALHL牄$@ZDT$L$Ld$PIKHL1DT$L$*+L$L$LIH$LL,vDT$iA{k^$mI{DT$DL$L\$qL\$LIG1ISVDT$IDL$L\$$lIqQf8HHfHB@43A4f@?tE1HD[]A\A]A^A_fDWHA 0LuHOH fcu1 LgHvHoPfA<$uf}uHMD}LuI4fEDfL$DT$<E_Ll$ 1JfDI<HI| HH9uDD|$8A9s|$@?)J| HHJ| H$AHDŽ$DŽ$D$H$<HD$HFH$T$VHHt$@LL@4LD$t$@HAMt$H|$`AVAv4bSAVAv4HgQ|$pHt$`H|$@SWH$(L$mHD$LLLT$LH$$L։$$$H$ HDH}H|$ D$8Ll$ L$H$LLR$MH$H$ELHTRHƉ$a2L$Ht$@LLLT$C$;$LT$1H̀H99uH$(kUQ<H|$@HnQH$(<L$lD=IHDŽ$EDŽ$HDŽ$DŽ$Ef=HE8u=H$`E1H\$(H|$H 0HuHPHIt' 1H[]A\A]A^A_x 
HhHHpPf><H$HxF=HI#8H) A>g$D$B H$fo$H$$)$$$IT$ HDŽ$HDŽ$H$HT$$HDŽ$yHHT$H$HDŽ$H$HH$$H$LIt$ I~ % fDH@HHhfDHL$$DŽ$HDŽ$HDŽ$Vx|$H\$(fxHp1HHH9wH$$HHL$H$ HL$LHL$H ׶H$It$ H$fo$HH$$)$$ֶH$fo$H$$)$$DH$jjAAH$PV($(A^A_}L$LHL@ֶH$IV It$ fo$HH$$)$$ֶL$H$jjALLH#$AYAZjIN jIt$ HA$_AXcH\$(aH$|$M1BB=IUHHfL$HH$H$L$LnH$LE0ff.@֮CfDH HRSHH HQH߾([鞮Cf.@TUS}1I1HH@@ 1MffCH{CkHtL˄D+H[]A\fD1D+H[]A\@d$ fA<$8tLL@I9Hf;8Ld$HD$Ld$t"DAt$K\H[ fstfHuILE1HQ>wDaH8{HHt>wDeLd$MA;\$rA|$yAD$L7A%E1AwHHDbAmpMtvLD$I~PHLH$HILD$I~0HLHHIxLD$I~HHLHHIUMvMuIELh @IExP>P$`I}GO L4It0LD$LHLHI_HIII}uLo_D$Hx HLHHI,MMMi@I}LwfA>&mIFI~f8LHt$,HD$,I~PfPfPf'f LD$HHLHHI|MNMi @IELpMWLD$I~HLHHI7MvMu&fDHII~Lt$Hx8HLHMHIIEHx@MA A`fA>bI~LT$L$HLGGL$LT$IAMfLLL\$L$ML$L\$AAEOO;f.LLT$LT$L\$LL\$LT$T$)GLLT$LT$L\$~LL\$LT$T$dAAOBOI;fDMMA'AA<$zfA:MIzG4f%fu GL$L$H¿FLIRD$@HD$PMHD$HMAAAA @<$AAMM5fA;LLLT$L$JL$LT$L޹LGHL$LT$M1\fA;MI{G4f%fu GCL$? L$H¿GLfILLL\$L$HL$L\$AA6OeOY7<$tTfA}MAM+LLT$L$LzL$LT$fA>MAMH LH¿GILI}G4f%fuGtBLT$L$o HLFLT$IAL$MPHLT$L$. HLGAtuAfA;MI{ LH¿GIABAmfA}LT$L$I}(MAtCAfA:M-Izm LH¿FIMMLLT$L\$LT$7FLT$L\$T$H LH¿F,II~G4f%fu GLT$L$Ld$ LT$L$jjLT$LD$H^9_L$LT$?@)эPHD HHHD {IBI+CA@D$8HD$ W@Ld$ D)HHHD$ ?LL\$L$L$L\$fHLT$L$HLF@E$ttHEH0E ƒuHx71L(9]$v@HUH4‹E ƒuHyκWUPu f.v% fDH% = tRCt4HHSXHtg1HtOH[fHH޿[1 8@fuxBEg%f.HSX1HT$HWHT$1HHCXH[ff.@L$(AD$AA LLH A)AE AED)؉D$E$D`EAfDH{Ht$/tFADd$D}$DL9rE)DHT$HHHHuMLDIߋT$ |$(AHt$ 1请8HHt$ p1薷8aHt$Ji1耻8USHHHHt'HHSgHHH[]1ɉfDHt,T$ <>uHtHT$ H먺0Eg>1H1Lff.H(i)i@HDHi1 H11ҾH(ff.@ATSHcGHw L)sHcHLIJHu~HH~L 1I H HL9uKDHHHJDRHE1E1 XZHe[A\]DHHH $HsH)HH $i@DPKIIIOHHcILISHII~%J 1IHIHH9u~KTHHHHHHAQE1E1ɺ7HHY^He[A\]ÐuFHHcAP>D`K)HHHOHyA?HA)AB4HcLIRLII~N$1ҐI I HL9uE~&AKIHILHHHH9u߃?tHHKDHAQXWX&gff.L]A\A]A^fDACE1HfN N IL9uؾHH 4D[]A\A]A^IC[]A\A]A^ff.AUATIUSLHMLHHt$HX pHh!tH|$z"H! fA<$u)f;u"HD$fD$ffC f=u-HtHHB7LsMtA>fA<$u'HtLHMt$Mt A>@f;u9HtHHLsMtA<6<ffA<$u'HtLHMt$Mt A> f;u"Ht HHtLsMt A>H_ fA<$u6HtLH 6M|$MtA<@<MDf;u)HtHH5HSHt :@fA<$u/HtLH5ID$Ht8ff;Ltf;HtHHHCH8H 0HuHPHf{Gf~GfXuYHLpHtfA>u LHLEHL$Ht$L$HLL$L$wA@H蠫AfA<$HtLHMt$MA>}AV 0HuIFHfH[HIVHUf:[HHT$HHHT$A~ U3MvPfA>uHL$LHL$HLEHLHL$+HL$u LIHf9H$Ht$L$HL$HLL$L$H$AefH|$Ʀ~H|$脨l!?A7H=!H{THCAE$I](IEAUHtHH2HSH+:"J 0HuHBHf]HHJHOf9r[HT$ HHL$HIHL$HT$ [z 1LzPfA?u LHLEL?HL$Ht$L$HLL$L$L$"ARf!>H=O H>HD$Ht$HAEXAE$LMe(AIEfDLt$Læt fA>! 
>AtsH=HAA$ID$AE$Me(IEAU3f;%DfA<$Ht LHtMt$Mt A>0E1f;5HtHHHSH: J 0HuHBHfHHHJHRf9HT$ HHL$HIHL$HT$ z /LrPfA>uLD$LLD$HLELLLD$rLD$u LMIfA8uYHL$Ht$HL$L$LH$LHT$djf;HHHSH:HHT$ {f.fA<$f;HtHHZ7HSH:HHT$/ =(HT$_L=,^=Tz yHH{f9u HT$ HHL$HL$HT$ HHELfA>uHL$LHL$HLEI9 1HLHL$BHL$LHZ2fA>HL$Ht$L$ATHLL$H$L$A@H|$^Lt$Lɠt fA>R8!8xH=H }AA""8"1+V@HxH|$ƞLt$L1AAf SfSHD$f8HD$P6 HFHcЀ FHcЀFHcЀFHcЀFHcЀFHpF!6D$>H=HHD$L$H$Ld$HAE4AE$HDŽ$IEIELDŽ$4L$Ƅ$DŽ$H$H$1LLd$`HIH|$jIUH$H$HH$LL$5A;fA<$lfDf;Hf;qAW 0HuIGHFfR$ŠmAV 0HuIFHf<HINHCf9HHHL$HIHL$LA~ &INPf9HHHL$ HHD$HL$ LA~ )GHMvXtfA>u LHLEAkf-$`nH$L`$bH$SH|$`H6|$x;H|$`/!53AH=HkHD$Ht$HAEcAE$LMe(I]0IEIG@J 0HuHBHf{t f~1HLzHBfA?BLHIBfA>LHu%M~MA?AW 0HuIGHf]HMwHfA>u LHtIA $MPfA?u LHLELO_HL$Ht$L$HLL$L$L$2AfH 0HuHPHf{t f~)HLxHAfA?uALHIdAfA>uLHtIVHt :;fHL cf;HHH$HSH:J 0HuHBHf{ݲf~ӲfX[HLzHtfA?u LHLEHL$Ht$L$HLL$L$AL)oT/&N>$S%A~ HMfA?u LHLEMfA>u LHLEL fA>HtLHIFH8H 0HuHPHfXaHLpHtfA>u LHLEM9^1LL&LL(HL$Ht$L$HLL$uAHDAV 0HuIFHf<HINHBf9WHHL$HIHL$\A~ 8!INPf9WHHL$ HHD$HL$ WA~ JHMvXtfA>u LHLEAkf-K$oL)/l}.&0tx>$r%A~ HMfA?u LHLEMfA>u LHLEL NL91LHfA>$HtLHIFH891LHWLHt&HL$Ht$L$HLL$ rA\HD$H@f8 YL=*^k0=(TXU=$A~ IHHD$(f8uHHHHDL$HL$MfA>LHHD$ /f; HHLsMA>LP=$A~ MHL|$(G7fA?uLHIDHD$(MfA>u LHLEHD$HL$(H9=[Ht$H|$(1ZL;t$ ZHt$ 1LkSHt$ L$>HD$HL$L$Ht$A$HLL$H$H$HD$ H$HD$(H$|A@L )t0&8E$t%A~ HM3fA?u LHLEMfA>u LHLELvM91LL@XLL]#EHL$Ht$L$HLL$mAL()5&5Q$%A uHMgfA>u LHLEMfA?u LHLEL91LHXߝLHu"HL$Ht$L$HLL$L$L$>pAHIWHpf:HHT$HIHT$A ILcHL$Ht$L$HLL$H$L$FAHIWHof:ΆHHT$ HHD$HT$ A .MPfA?MuLHIf;uHHtHCHt 8ُL97C7HD$HL$L$Ht$HLL$L$H$VA9HIWHnf:хHHT$HIHT$A PMPfA?MuLIHMDfA>Vf;Lo>fA<$ޢHt$LHٿHLT$aHI !&AH=LT$HHD$Ht$HAETAE$LMu(MU0IE)'@HMHSmfA?NLHINf;uHHtHCHt 8HHhWGHL$Ht$L$HLL$H$SAHIOHmf9EHHL$HHHL$XA MPfA?MuHT$LHIHT$>f:HHT$HHT$LzMrA?hAW 0Hf{f~fU5HIWHf:kSHHT$HHHT$=SfA?fA>L91LHHL$=ڝHL$H$Ht$L$HL$AGHLL$L$L$qAIDHIWHkf:HHT$HIHT$zA ILLu LMIfA?HtLHMMA?AW 0Hf{f~fUYHIWH$f:HHT$HHHT$fA?fA>L9Y1LHHL$a؝HL$H$Ht$L$HL$AFHLL$L$L$oAmHtLHIWH:J 0HVflCfmfnHzHHH|$ 2f?uHT$(HHT$(tHD$ z HRPf:uHHT$(HT$(HHEHH|$t.LOt"H;\$ Q<Ht$ 1Hם,;H|$L_H;\$ [Ht$ 1H֝Ht$ HH|$nH|$褈\!AWH=H[HL$ HD$H$IUH\$AEXAE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$跫H$1HIHnHt$IE(HL舫IGf8HD$ LhH\HHD$ f8,hM fA?}L#fA?}LH|$t.LAt"H;\$ Ht$ 1Hԝ]LH|$LQH;\$ SHt$ 1HԝHt$ HrH|$؄`H|$薆N!AIH=HHL$ HD$H$IUH\$AEXAE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$詩H$1HGHHt$IE(HLzxDIGf8HD$ _fHLHHD$ f8?fM fA?{LfA?{LH|$t.L1t"H;\$ Ht$ 1HҝgJH|$LAH;\$ Ht$ 1HҝwHt$ HbH|$ȂPH|$膄>!A9H=H@HL$ HD$H$IUH\$AE~AE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$虧H$1HDHHt$IE(HLjhDIGf8HD$ rdHH|$|,H|$~!AEH=H.HD$IUH$H\$AE~AE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$詡H$1H?HPHt$IE(HLzxDIGf8HD$  nH Hd}HD$ f8nM fA?]xL fA?AxL;d$ xH|$ 1L ˝H|$ L(LHH|$V~LnH|${\H|$|J!AuH=HHD$IUH$H\$AEXAE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$ٟH$1H7=HNHt$IE(HL誟DIGf8HD$ lHLH|HD$ f8~lM fA?vLfA?vL;d$ 3vH|$ 1L=ɝH|$ LX LxH|$L&H|$4yH|$zz!@AH=HHD$IUH$H\$AEXAE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$ H$1Hg;H/Ht$IE(HLڝDHtLHIWH:wJ 0HuHBHfl#M9a1LLHL$(LD$ ǝLD$ HL$(LLHL$  HL$ H*rH|$8`LPH|$v>H|$x,!AWH=HlHD$IUH$H\$AE~AE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$軛H$1H9HHt$IE(HL茛IGf8HD$ ShH,HyHD$ f83hM fA?\rLfA?@rL;d$ uH|$ 1LŝH|$ L8LXH|$fL~H|$ulH|$vZ! 
AH=HHD$IUH$H\$AE~AE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$H$1HG7HHt$IE(HL躙z I% fD sAA""8"1 L@fA<$tfDf;Mf;8fA<$|fDfA<$fDAV 0HuIFHf<HINHf9!HHL$HIHL$$A~ INPf9 HHL$ HHD$HL$ $A~  HMvXtfA>u LHLEAkf-k$@qHtLHIWHA:8J 0HflM91LLHL$(LD$ ULD$ HL$(cLLHL$ cHL$ FH~6H|$<$L|H|$:qH|$r!F AH= HHD$IUH$H\$AEXAE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$H$1Hm3Hv7Ht$IE(HLIGf8HD$ "RHLH|lHD$ f8RM fA?gLfA?gL;d$ jH|$ 1LuH|$ LxLhH|$nVLFH|$lo4H|$*q"!xAH=>HHD$IUH$H\$AEXAE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$AH$1H1HHt$IE(HLDIGf8HD$ /PH|$HTiHD$ f8PM fA?fLCfA?eL;d$ hH|$ 1L襽H|$ LLH|$LvH|$mdH|$ZoR!A H=nHHD$IUH$H\$AE~AE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$qH$1H/HHt$IE(HLB@DIGf8HD$ NHTH:gHD$ f8NM fA?%dLsfA? dL;d$ fH|$ 1LջH|$ LLH|$LH|$kH|$m!A=H=Hk2HD$IUH$H\$AE~AE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$衐H$1H-H2Ht$IE(HLrpDIGf8HD$ LHHfHD$ f8LM fA?DbLKfA?(bL;d$ dH|$ 1LH|$ L L@H|$L>ֿH|$iĿH|$k!AmH=HHD$IUH$H\$AEXAE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$юH$1H/,H Ht$IE(HL袎頽DfA<$}鉽fDAW 0HuIGHfLf f[ f\,HMwHtfA>u LHLEL9f1LH1LHN޾!AH=RHϛHD$AEHAE$IEIvH9tHjHt$Mu(HMu0L臍酼f," *!+A~ HMԇfA?u LHLEMfA>u LHLELfA>HtLHIFHr8iH 0HuHPHfXHHLpHtfA>u LHLEM91LL蜶 LLHL$Ht$L$HLL$pFAʼ/@HD$H@f8 nw 5AA" "8"1,Ae*D@++A~  HMfA?u LHLEMfA>u LHLEL ѻL91LHkfA>HtLH$IFH8~H 0HuHPHfX]HLpHtfA>u LHLEL91LHڴ"LHHL$Ht$L$HLL$CAߺmf.fH>HIWHA'f:HWHHT$HIHT$UXA pMPfA?MuLIHMDLLLT$JLT$u LMIfA>f;HCSHsL|$ILD$nLHHD$ Ff;HHLsMA>Lʡ=T}A~ MHL|$(fA?uLHIDHD$(MfA>u LHLEHD$HL$(H9Ht$H|$(1"oL;t$ THt$ 1LHt$ LӼHD$HL$L$Ht$ATLAdHLpH#fA>RLHIQfA?LHtIGHg8^H 0HHPH߷fD,H*P+A~ HM]fA?u LHLEMfA>u LHLEL M9O1LLoLLtHL$Ht$L$HLL$u LHLEL9b1LHm-LH!A-H=؇HFHD$Ht$HAEHAE$LMu(Mu0IE܂ڱ=u =pyL=mr"j=n|iz HLtfA>u LHLE,N*T+gA \HMfA>u LHLEMfA?u LHLEL9Ŷ1LH?LH\HL$Ht$L$HLL$L$L$%;A®-=Q=A~ 2HMyfA?u LHLEMfA>u LHLELfA>ׯHtLHįIFH8H 0HuHPHfXHLpHtfA>u LHLEM9+1LL᨝RLL?HL$Ht$L$HLL$8At'(A~ HMpfA?u LHLEMfA>u LHLELfA>HtLH~IFHq8hH 0HuHPHfXGHLpHtfA>u LHLEM91LL蛧 LLHL$Ht$L$HLL$o7Aɭ.A~  HD$A~ IE1A~  HD$A~ IE1ȼfA>HAV6 HtLHcЀ t@HcЀt4HcЀt(HcЀtHcЀtHcЀ< G`?#M\M锾fA?MM鏾fA?8MCE1z ILzP鸰f;HT$E11LHWY'.(0A~ %HM}fA?u LHLEMfA>u LHLEL֫L9ڰ1LHp}fA>HtLH)IFH891LHߤ'LHHL$Ht$L$HLL$3Ar-1=QĪA~ HMfA?u LHLEMfA>u LHLELjL9@1LHfA>@HtLHIFH 89# 1LHsLHHL$Ht$L$HLL$%2AxE1ظA~ 4IFPHD$A~ I-E1B=$z HHۆf9u HT$ HHL$HL$HT$ HHELfA>uHL$LHL$HLEI9xZ1HLHL$bHL$LHzfA>HL$Ht$L$A$HLL$H$L$;AϦ@E1pIMHL$zI HL$8͇ xAA""8"1 ߂QE1z I.LrP%A~ "3HE1V=9'y:=F^|Q=:`z HLfA>LHHD$f;HHHSH:HHT$ ﹡=:HT$ '=Tdz HHDD$ HD$ fA>=&mIvEmIѯI@CA~ IfDA~ HL$"A~ IDA~ HL$"EhY8=&eJ=$.Rz HLfA>uHT$LHT$HLEfAL91LH艜QLH>HD$HL$L$Ht$A$HLL$L$H$J5AfDC-0;=QA~  HM)fA?u LHLEMfA>u LHLEL#M9C1LL腛LLHL$Ht$L$HLL$(AZD'3(>A~ 3HMfA?u LHLEMfA>u LHLELLM91LL讚ƠLLHL$Ht$L$HLL$'AAfDIFPHD$IFPHD$2I髤E1lLt$IFf8 .H|$HIfDz &IgfDA~ NHMifA?u LHLEMfA>u LHLELfA>HtLHIFHӟ8ʟH 0HuHPHfXHLpHtfA>u LHLEM91LLnLL[HL$Ht$L$HLL$(A+鐝DA~ HMhfA?u LHLEMfA>u LHLEL־ǞfA>HtLHIFH8H 0HuHPHfXqHLpHtfA>u LHLEM9"1LLŗ6LL#HL$Ht$L$HLL$'AXDA~ ޝHMdfA?u LHLEMfA>u LHLEL螽fA>HtLHpIFHc8ZH 0HuHPHfX9HLpHtfA>u LHLEM91LL荖LLHL$Ht$L$HLL$a&A 1HT$qJE11LHٜLH6gA~ gHMBffA?u LHLEMfA>u LHLEL'fA> HtLHIFH8H 0HuHPHfX›HLpHtfA>u LHLEM9ް1LLLL3tHL$Ht$L$HLL$$AD驙fDA~ .HMdfA?u LHLEMfA>u LHLELߚfA>ӚHtLHIFH8H 0HuHPHfXHLpHtfA>u LHLEM91LLݓNLL;HL$Ht$L$HLL$#A pDA~ HMcfA?u LHLEMfA>u LHLEL趹fA>HtLHIFH{8rH 0HuHPHfXQHLpHtfA>u LHLEM9Y1LL襒LLHL$Ht$L$HLL$y"AӘ8DA~ HM~fA?u LHLEMfA>u LHLEL~ofA>cHtLHPIFHC8:H 0HuHPHfXHLpHtfA>u LHLEM97 1LLmޗLL˗HL$Ht$L$HLL$A!ADz ΜHHdf9u HT$ HHL$HL$HT$ HHELfA>uHL$LHL$HLEI9oS1HLHL$藐HL$JLH7fA>,HL$Ht$L$A,HLL$H$L$M)Az ֛HHPXf9u HT$ HHL$HL$HT$ HHELfA>uHL$LHL$HLEI9cL1HLHL$蟏HL$RLH?fA>4HL$Ht$L$A(HLL$H$L$U(A f-58=QԕA ɕHMfA>u LHLEMfA?u LHLEL91LH謎lLHYHL$Ht$L$HLL$L$L$A/@'d5(A HMbfA>u LHLEMfA?u LHLEL961LH֍LHHL$Ht$L$HLL$L$L$ACYfDI]HL$銨fA>MM,A~ oHL$dDE1A IMWPA~ IHHD$]f8uHHHHDL$HL$If:IuHHT$ HIL9w1HLHL$(ԋHL$(LH=XHHt 
~H|$t.fA<$Tf;uHZHED$HD$!ԑ4H=HHL$HD$H$AEAE$IULd$IEHAH$HDŽ$H$LHH$HD$ DŽ$HƄ$DŽ$H$`H$1LHoIE(Mu0HLLAS`QfDA~ MIHHD$Zf8uHHHHDL$HL$MfA>͖LHHD$ f;ݎHHܕLsMϕA>ŕL =(A~ MHL|$(#fA?uLHIDHD$(MfA>u LHLEHD$Ht$(H9#Ht$H|$(1JL;t$ Ht$ 1L(Ht$ LCHD$HL$L$Ht$A($A錍fA XMwPHT$A~ It@A~ UIHHD$>Yf8uHHHHDL$HL$If:IuHHT$ HIL9\1HLHL$(6HL$(LHN=THꁈHt H|$t.fA<$f;uHHED$HD$!0БH=HzHL$HD$H$AEAE$IULd$IEHAH$HDŽ$H$LHH$HD$ DŽ$HƄ$DŽ$H$[H$1LGHъIE(Mu0HLLA[鳊A~ IHHD$_Vf8uHHHHDL$HL$MfA>tLHHD$ kf;EHHDLsM7A>-Lu=, A~ MHL|$(=fA?uLHIDHD$(MfA>u LHLEHD$Ht$(H9gHt$H|$(1貄L;t$ Ht$ 1L萄xHt$ LcHD$HL$L$Ht$A,# AA~ IHHD$Tf8uHHHHDL$HL$If:IuHHT$ HIL91HLHL$(ւHL$(LH=OH}Ht H|$t.fA<$Qf;uH\HED$HD$!ˑ6H=HHL$HD$H$AEAE$IULd$IEHAH$HDŽ$H$LHH$HD$ DŽ$HƄ$DŽ$H$WH$1LHqIE(Mu0HLLAUWSJ 0HuHBHFf($ŰrHHJHLf9u!HT$ HHL$HHL$HT$ tHz ;LzPfA?uHL$LHL$HLEH$Ht$L$HL$AnHLL$H$AyHHJHf9u!HT$ HHL$HHL$HT$ tHz LzPfA?uHL$LHL$HLEH$Ht$L$HL$AmHLL$H$AD̈́HHJHȘf9u!HT$ HHL$HHL$HT$ tHz LzPfA?uHL$LHL$HLEH$Ht$L$HL$AlHLL$H$*A!HHJHf9u!HT$ HHL$HHL$HT$ tHz 7LzPfA?uHL$LHL$HLEH$Ht$L$HL$AkHLL$H$~AuHLzH-fA?uHT$LHHT$tIǃz HRPf:uHHT$HT$HHELHT$0HT$vHL$Ht$L$H$LHLHt$HLH$H$|AHzHHH|$f?uHT$ HHT$ tHD$Ht$HLL$L$H$D{A'HHrHf>QHT$ HHt$HHHt$HT$ $H$AGHL$Ht$HL$L$LL$A3HLzHJfA?6HT$LHHT$Jz MIHRPf:uHHT$HT$HHEHLHT$!HT$u HLIf:Ht$H$AFHL$Ht$HL$L$LL$A3}A~ ~HMMIfA?u LHLEMfA>u LHLEL~F~L9r1LHwfA>~HtLHIFH}8}91LHOw}LHl}HL$Ht$L$HLL$AT}{A~ =}HMTfA?u LHLEMfA>u LHLEL&|L91LHvAfA>|HtLHAIFH|8|H 0HuHPHfXz|HLpHtfA>u LHLEL9ʓ1LHu?|LH,|HL$Ht$L$HLL$A{zA~ {HMRfA?u LHLEMfA>u LHLELΛ{L9w1LH0uvfA>l{HtLHIFHL{8C{9c1LHtzLH輷zHL$Ht$L$HLL$QAz2yA~ zHMFfA?u LHLEMfA>u LHLELv>zL9v1LHsrvfA>zHtLHIFHy8yH 0HuHPHfXyHLpHtfA>u LHLEL91LHGsyLHd|yHL$Ht$L$HLL$ALywA~ 5yHM%EfA?u LHLEMfA>u LHLELxL9u1LHr.ufA>xHtLH9IFHx8x91LHq7xLH $xHL$Ht$L$HLL$AwvA~ wHM`fA?u LHLEMfA>u LHLELƗwL9r1LH(qerfA>dwHtLHIFHDw8;wH 0HuHPHfXwHLpHtfA>u LHLEL91LHpvLH贳vHL$Ht$L$HLL$IAv*uA~ vHMGMfA?u LHLEMfA>u LHLELn6vL9p1LHojfA> vHtLHIFHu8u9!1LH?ouLH\tuHL$Ht$L$HLL$ADusHwf;MߒMŒA MPI@A IE1飑I'{E1A vMwPf9uH\A Jf;MwPiHT$靐A IE1鯏MvPH3wHD$A~ MHvDH3H|$ HHD$ 鲞H5H|$ HHD$ ׯHjH|$ HHD$ HH|$ HkHD$ aHH|$ HHD$ z IyHH|$ HėHD$ 麗H{H|$ HHD$ 駙H3H|$ HHD$ 锛A~ rHMAgfA?u LHLEMfA>u LHLELrM9_1LLglrLL脯lrHL$Ht$L$HLL$Au LHLEL>qM9l1LLkqLL轮qHL$Ht$L$HLL$Auq3p=8&>M#z zHLfA>vLHHD$f;zHHvHSHz:zHHT$ >HT$ =$zzz HHDD$ HD$ fA>z=<7zIvEm_yI~fA~ nHMpfA?u LHLEMfA>u LHLELmM9Z1LLgmLL蝪mHL$Ht$L$HLL$AUmlA~ EmHMofA?u LHLEMfA>u LHLEL^lM9Ur1LLflLLݩlHL$Ht$L$HLL$AlSk=wITS={K=7vz vHHf:<HHT$HIHT$f;uHHqHSHu:uHHT$ =GL9֭1HLHL$oeHL$2uLH臨uHL$Ht$L$ADHLH$L$L$,tAi@BFN}EC.BDtz tHL2fA>PLHHD$(f;ktHH=pHSHPt:GtHHT$ }DHT$ =(!tz HHDD$ HD$ fA>s=0sIvEmsI~fA~ gHMxTfA?u LHLEMfA>u LHLEL辇VgM9T1LL a8gLL=%gHL$Ht$L$HLL$dAfeA~ fHMGSfA?u LHLEMfA>u LHLELfM9[1LL``xfLL}efHL$Ht$L$HLL$A5fd=r;D1=s1=toz oHLtfA>u LHLEL91LH_qoLHƢ^oHt$H$LL$L$L;1oAu LHLEL~eM9~Y1LL^dLLdHL$Ht$L$HLL$$AdscfA>fM2HBH|$ H@HD$ 6LH fA>L迂H fA>L蝂Ht|fA>[LHt^fA>3LaHt@fA>LCHt"fA>L%H6cfA>LMHT$yHTH|$ HHD$ HH|$ HHD$ 鳗I˨HH|$ HrHD$ hHsH HD$f8H|$KH HD$f8H|$#HHD$f8H|$HhHD$f8SH|$ӀH@HD$f8XH|$諀HHD$f8'H|$胀{nA I飧A bHMUefA>u LHLEMfA?u LHLEL9^N1LH{[;bLH蘞(bHL$Ht$L$HLL$L$L$aAa_A aHMdfA>u LHLEMfA?u LHLEL9M1LHZ{aLH؝haHL$Ht$L$HLL$L$L$A(a>_A aHMcfA>u LHLEMfA?u LHLEL9L1LHY`LH`HL$Ht$L$HLL$L$L$Ah`~^A X`HMbfA>u LHLEMfA?u LHLEL9b1LH;Y_LHX_HL$Ht$L$HLL$L$L$!A_]A _HMafA>u LHLEMfA?u LHLEL9)b1LH{X;_LH蘛(_HL$Ht$L$HLL$L$L$aA^\A HIVyLHLELHLEǝLHLELHLELHLE$LHLE鋆LHLEjLHLEIA IUzIBzA HT$ZyHT$EyH{f;MayxyA XI~}A EH{IX}f:M{Q^I%xA ]HMOcfA>u LHLEMfA?u LHLEL9Y1LHVB]LH蟙/]HL$Ht$L$HLL$L$L$hA\[A OHT$5A \HMafA>u LHLEMfA?u LHLEL9V1LHUm\LHʘZ\HL$Ht$L$HLL$L$L$A\0ZAD$]鄙AD$CMPIvAD$_醗MWPIxAD$r陕M fA?LIZݘM fA?L$Z項M fA?ؖLZÖM fA?;LڒBZ&H 0HuHPHf{t f~CxHLxHtfA?u LHLEHLxHL$Ht$L$HLL$L$ySAwXIE1KzM fA?L~Y邂M 
fA?bLYYMM fA?EL̑tY0M fA?L觑OY雓LHLE"AD$AD$LHLE鯍LHLE鮇LHLEm='z bHL&fA>LHHD$f;EbHH^HSH*b:!bHHT$ k='HT$ w=,az HHDD$ HD$ fA>a=aIvEmfaI~AD$9`AD$h鏂=Gz  _HH`f:MHHT$HIHT$f;^HHZHSH^:^HHT$h=7L91HLHL$jNHL$-^LH肑^HL$Ht$L$ADHLL$L$H$']ARM fA?ȂLWS鳂H 0H@f{t f~pHLxHtfA?u LHLEM9eH1LLMoHD$HL$L$Ht$HLL$L$H$1LAoRM fA?*L艋RM fA?LdR鐃IEH\$H$D$HDŽ$H$d"H$1H¿HIE1AQM fA?L越Q=([z [HLȴfA>uHT$LHT$HLEfAL91LHzKB[LH藎/[HD$HL$L$Ht$A(HLL$L$H$;ZAO=%Zz ZHLfA>sLHHD$f;ZHHbVHSHuZ:lZHHT$ c=$HT$ K=%DZz :ZHLtfA>u LHLE=G ZH|$t.fA<$lf;uH HED$HD$!}UH=DuH,HL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$;H$1L虼H#NIE(HLLA  N=TXz XHLfA>uHT$LHT$HLEfAL91LHgH/XLH脋XHD$HL$L$Ht$ATHLL$L$H$(WALAD$|驥?Wz WHLpfA>sLHHD$Kf;nWHH@SHSHSW:JWHHT$ `?HT$ a=($Wz HHDD$ HD$ fA>V=VIvEm VI~踽3cTz YTHHyf:fHHT$HIHT$8f;THHOHSHS:SHHT$W]={L91HLHL$CHL$|SLHцiSHL$Ht$L$AkHLH$L$L$vۃ&SA1HAD$龳=oSz RHLtfA>u LHLEL9Ӈ1LHBRLHRHt$H$LL$L$zRAG=lgRz ]RHLtfA>u LHLE=;cPz YPHLfA>ʄLHHD$քf;PHHKHSHP:OHHT$ ^Y=;HT$ =TOz HHDD$ HD$ fA>hO= [OIvEm踳@OI~gAD$MNAD$鲬.Lz LHH6}f:#}HHT$HIHT$|f;LHHwHHSHL:LHHT$U=vL9w1HLHL$Fu LHLE=5~>Lz pGHLfA><LHHD$f;6GHHCHSHG:GHHT$ uP=&HT$ (=(Fz HHDD$ HD$ fA>F=rFIvEmϪWFI~~HLzH'fA?3}LHH}f:];HHT$HHT$pHBH8;8/;H 0HuHPHfX;HLxHLLz CHLtfA>u LHLEL9eO1LH3TCLHvACHt$H$LL$L$/CA8M z BHLvNfA>`NLHHD$zf;BHH>HSHB:BHHT$ K=HT$ 8y=$rBz HHDD$ HD$ fA>B=4AIvEmWAI~HSHtH7MnM*M:MjM&MMOxf;\4IHD$ |f;@4MŢf;)4IHD$ MKf;4M鄫f;3M A AE1qFMֶz >HLtfA>u LHLEL9{1LH.d>LHqQ>Ht$H$LL$L$? $>A/3z >HLtfA>u LHLEL9K1LH .=LH)q=Ht$H$LL$L$ =A2MxMM隹X M@AA""8"1j&@z *=HLZfA>uHT$LHT$HLEfAL91LH,xu LHLEL9@f1LH+;LHo;Ht$H$LL$L$;A0z o;HLYfA>YLHHD$Xf;5;HH7HSH;:;HHT$ tD@HT$ V=,:z HHDD$ HD$ fA>:=s:IvEmОX:I~HF-f;;-IHD$ Ff;-IHD$ gvM鉂z 7HLŧfA>LHHD$f;7HH}3HSH7:7HHT$ @=9HT$ =T_7z HHDD$ HD$ fA>6=!6IvEmD6I~)z 4HLtfA>u LHLEMpM z 2HLfA>LHHD$f;F2HH.HSH+2:"2HHT$ ;CHT$ П=,1z HHDD$ HD$ fA>1=1IvEmi1I~萘YW$z :/HLfA>LHHD$f;/HH*HSH.:.HHT$ ?8AHT$ =$.z HHDD$ HD$ fA>K.=x>.IvEm蛒#.I~J!L,iG $dAA""8"1sYcLeyM5Mwz +HLtfA>u LHLEL9j1LHR+LH^?+Ht$H$LL$L$-+A z +HHgf:jHHT$HIL9g1HLHL$bHL$%*LHz]*HL$Ht$L$AkHLL$L$H$)Az )HLmifA>WiLHHD$hf;)HHU%HSHh):_)HHT$ 2EHT$ Eg=,9)z HHDD$ HD$ fA>(=(IvEm(I~͏z w&HL hfA>gLHHD$hf;=&HH"HSH"&:&HHT$ |/BHT$ e=(%z HHDD$ HD$ fA>%={%IvEm؉`%I~臌PNMYz %#HHdf:ydHHT$HIL9c1HLHL$HL$J"LHU7"HL$Ht$L$AcHLL$L$H$D!Az !HH_f:~_HHT$HIL9K1HLHL$DHL$!LH\T HL$Ht$L$AgHLL$L$H$ AMz  HH'^f:^HHT$HIHT$f;R HH$HSH7 :. 
HHT$).: aAA""8"18Laz yHH[f:[HHT$HIHT$[f;8HH HSH:HHT$w'=wL9X1HLHL$ HL$LHPHL$Ht$L$AgHLH$L$L$薥FAQMܬM,AD$֠Cs閲M-C黱CX~{~CM髫HPHCv--AD$r2rfA?LMpAD$ /AD$[~M9Ba1LLHL$(LD$ LD$ HL$(LLHL$ OHL$ yH!1iH|$/2WL2GH|$ݻ5H|$蛽#!TANH=6HM`HD$IUH$H\$AEXAE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$H$1H~H_Ht$IE(HLC9H|$ں9H|$蘼9!SAKH=5HIHL$ HD$H$IUH\$AEXAE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$H$1H }HbHHt$IE(HL|zM9O1LLHL$(LD$ LD$ HL$(LLHL$ LHL$ H5.}H|$C/kL/[H|$IH|$诺7!QAbH=3H#NHD$IUH$H\$AE~AE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$H$1H${HMHt$IE(HL M9!]1LLHL$(LD$ LD$ HL$( LLHL$ +JHL$  HF, H|$T-| L-l H|$Z H|$H !PAsH=1H,\HD$IUH$H\$AEXAE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$H$1H5yH|Ht$IE(HL LHHHL$Ht$L$HLL$ƓA^ MAD$v陦AD$MmjHT$r'MeAG頷HzHHH|$ if?uHT$(HHT$(tHD$ z AHRPf:uHHT$(HT$(HHEH* H|$'+t.L+t"H;\$ kHt$ 1H}ijH|$+C L*3 H;\$ DHt$ 1H9 Ht$ HTG H|$b H|$  !nMAH=4/HYCHL$ HD$H$IUH\$AEXAE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$3H$1HvHBHt$IE(HLfkHzHHH|$  gf?uHT$(HHT$(tHD$ z ?HRPf:uHHT$(HT$(HHEH(H|$!)t.L)t"H;\$ k@Ht$ 1HwF?H|$)=L(-H;\$ WHt$ 1H3 Ht$ HNEH|$\H|$!hKA H=.-HCWHL$ HD$H$IUH\$AE~AE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$-H$1HtHVHt$IE(HLf;Lt$ C+H|$@+H|$+!LJA H=,H_xHL$ HD$H$IUH\$AE~AE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$H$1HosHwHt$IE(HLC-H|$9-H|$,!EIA H= +HUHL$ HD$H$IUH\$AEXAE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$ H$1HhrH UHt$IE(HLC7AD$.QMpC)LHAHL$Ht$L$HLL$辌AXVC4LHA,HL$Ht$L$HLL$jAC؈LH8A HL$Ht$L$HLL$AC܉HL$Ht$L$HLL$׋Aof;Iu\CMXLH@uzLHo@HL$Ht$L$HLL$MACkM9}1LLHL$(LD$ kLD$ HL$(yLLHL$ y?HL$ \H!LH|$R#:L"*H|$PH|$!\EAH="'H|HD$IUH$H\$AEXAE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$%H$1HnH{Ht$IE(HLM9t1LLHL$(LD$ |LD$ HL$(LLHL$ =HL$ mH]H|$c!KL ;H|$a)H|$!mCAH=3%HsHD$IUH$H\$AE~AE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$6H$1HlH5Ht$IE(HLM9t51LLHL$(LD$ LD$ HL$(LLHL$ ;HL$ HxH|$~fLVH|$|DH|$:2!AAH=N#H4HD$IUH$H\$AE~AE$HDŽ$IEID$H$HHDŽ$[Ƅ$H$DŽ$L$QH$1HjH3Ht$IE(HL" HBH-5M?MsLH>: SHL$Ht$L$HLL$ARCRM(MhChhCLoM2MfM際MΚC͝Ct'H|$茦y'H|$Jg'!?AH=^!H{HL$ HD$H$IUH\$AE~AE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$]H$1HhHxzHt$IE(HL.,Lt$ LHd8~HL$Ht$L$HLL$BA|~CX~LH8pOHL$Ht$L$HLL$A@OCOf;Lt$ BeLt$ BAD$MDM錞AD$XXAD$AD$DDE1[  AAA""8"1e1LT$@LH7e>HL$Ht$L$HLL$ށA5>vC>C1ITIMԜM If;Lt$ Vi.Lt$ XL9 1LHHL$HL$0H$Ht$L$HL$AFHLL$L$L$!AYHzHHH|$ W f?uHT$(HHT$(tHD$ z u-HRPf:uHHT$(HT$(HHEHtH|$t.L&t"H;\$ 4Ht$ 1H H|$L6H;\$  Ht$ 1HlHt$ H3WH|$轠EH|${3!9A.H=H HL$ HD$H$IUH\$AE~AE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$H$1HbH; Ht$IE(HL_]HT$E1HBHIGHEIGH]HMH fA?q LHHL f:{HHT$HHT$cLzMVA?LAW 0HuIGHfU*HIWH}f:jHHT$HHHT$*fA?fA>L9G1LHHL$2HL$H$Ht$L$HL$AGHLL$L$L$}A>Cq5qLH0uHL$Ht$L$HLL${Au^CuMPA )Hz )I,tHA f)MPHIsz H)HJPItHHRHf:HHT$HHHT$f9HHL$HHL$HQH:J 0HuHBH$H$AFHL$Ht$HL$L$LL$ANz 'I)ChI z s'HJPI*f:yHVHCf:^HSLpz w!1pz 'IoIoHRPpz &HJPIoHHRHf:HHT$HHHT$f9HHL$HHL$HQH:J 0HuHBH$H$AGHL$Ht$HL$L$LL$ŀAtz H%I)Hf:JHz %HlHlz %HJPI f:H5IC&E1z vLzPf>Hlf9IlHt$HLL$L$LL$ H$ALL$ jjMjAGz뗃z v!f;LJPjjz w1if;uE1tjHRPhE1~jE1z AhLzP8hE1z gLzPvgE1z fLzPfE1z eLzPef;%i2iH 0HuHPHf{t f~ iHLxHtfA?u LHLEHLahHD$HL$L$Ht$HLL$H$Ah L*cHD$cC6kZkCAb)cHbfA?.LLfA?LA !MPH  r(ACA""8"1"K(E13 AGA""8"1"Coz w1C(HRPA !H! H M_A  MPH Lt$驱f;eLt$魱LfA?LAXXHeXC AD$O9OC鴰L (XHD$XA 4 HHA  MPHfA?0L Ht$ H&H|$H|$谕!,AcH=HHL$ HD$H$IUH\$AEXAE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$øH$1H!VHt(Ht$IE(HL蘸CE1 ,A&A""8"1E1CE1;Cb==CmmC,lOlC+HHD$.EvEmUDIvEmS%BI=vEmP?IM*( oAA"?"8"1vEm$M.u LHLE=H|$t.fA<$f;uHHED$HD$!>H=HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$H$1LZFHIE(HLLĄLHD$f;Lt$O AIA""8"1Lt$f;;Lt$ HHD$L I@vEmQ<4ICٙ `A= A"?"8"19 A A"?"8"1b Ax A"?"8"1}f;II魀f;I鳀H@~HD$U~AF69Cl $AA""8"1 JsAp A"?"8"1#sE1)E1! 
AHA""8"1CbE1E12 AA""8"1<`AD$AF6AF6AF6AF6E1YE1Q tAqA""8"1MAD$Ht$ HLH|$ZsH|$u!f AH=,HHL$ HD$H$IUH\$AEXAE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$+H$1H5Ht(Ht$IE(HLCE1K ,AA""8"1U IA.A""8"1!LT$ AA""8"1麆E1>z wC1 E1) A'A""8"1ǶHRPE1E1@ AA""8"1JC 4 A~A""8"1E1] A`A""8"1sCL9HD$Lf;IIlf;Irf;IIf;Iz HLtfA>u LHLE=VH|$t.fA<$f;uHRHED$HD$!,H=HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$脔H$1L1HlIE(HLLATRLWHD$z HLtfA>u LHLE=$H|$t.fA<$f;uHHED$HD$!ZH=!HQHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$H$1Lv0HIE(HLLALHD$f;ImImf;Im rAA""8"1rLt$if;>Lt$if;)Lt$i AA A""8"1LkzHD$~zHSHyLHyHt$H$LL$L$薗\yALt$?{f;^Lt$C{f;ILt$ { AA""8"1{CxTC#xFxLHBHt$H$LL$L$ȖlA鸿CE1Ƀz HJPw1Ƀz |HJPs IAA""8"1"C1Ƀz 8HJP/E1" бA\A""8"1驱AD$ܰ@6^@6E@6,@6@6Z@6AAD$3C5qXqHHD$龇HHD$ 3z ;HLtfA>u LHLE=HH|$t.fA<$f;uH HED$HD$!~H=EHHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$u LHLE=ЃH|$t.fA<$f;uHHED$HD$!lH=HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$ċH$1L")HIE(HLLA蔋钺LHD$ AEA""8"1f;"Lt$}Lt$}  AA""8"1 ABHQQHD$fQHHD$ Pz HLtfA>u LHLE=eH|$t.fA<$f;uHaHED$HD$!;H=߆HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$蓉H$1L&H{IE(HLLAcaLfHD$ކ A]A""8"1f;I8Cf鉄L,HD$2,A++H+f;IVICf;{IIA%RIf;QII镠f;6I雠HFHD$鱙HHD$ &z HLtfA>u LHLE=H|$t.fA<$f;uHHED$HD$!8H=܆HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$H$1LT$H޵IE(HLLAƆĵLHD$f;Lt$ۆ AYA""8"1Lt$鲖f;5Lt$鶖f; IYIFz HLtfA>u LHLE=}ٿH|$t.fA<$f;uHHED$HD$!IH=ۆHLHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$H$1Le"HIE(HLLAׄճLHD$f;I魛I際f;~I頛A0Lt$f;RLt$f;=Lt$ن AUA""8"1CoNNgن 6AA""8"1q3ن 4AA""8"1= AD$@H HD$ HHD$ D z >HLtfA>u LHLE=KzH|$t.fA<$f;uH HED$HD$!H=H؆HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$?H$1LH'IE(HLLA LHD$f;ѻLt$ >׆ AA""8"1HE1鞰AFHt$H|$(zHD$HL$L$Ht$AT$EA@Ht$ HFz>E1Hֆ AA""8"1RAD$L`HD$`ME1mՆ AA""8"1鉨C 1Ն ǣAA""8"1頣AD$ӢE1ݮAՆ CA|A""8"1Kz HLtfA>u LHLE=v_H|$t.fA<$f;uH[HED$HD$!5H=ԆHHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$~H$1LHuIE(HLLA]~[L`HD$HHD$_HHT$ |=:HT$ ( =;vEm*_IvEm'S\ILt$7f;WLt$7f;BLt$t7͆ &AA""8"1߁z HHDD$ HD$ fA>=nIvEmqI~au_HdHD$ʆ AA"?"8"1܁HHD$ Ef;ޮLt$DTvEmVI~فz w11z w1HRPHRP Ht$ HH|$KH|$M! 
AoH=ƆHHL$ HD$H$IUH\$AE~AE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$pH$1H-Ht(Ht$IE(HLp颟CE1鋟ņ ,AA""8"1ׁņ KdAtA"?"8"1ׁ$dņ hA A"?"8"1ׁhhHt$ H/H|$=JpH|$K^!IAH=ņHHL$ HD$H$IUH\$AE~AE$IEHAHH$HH$HDŽ$DŽ$[H$Ƅ$DŽ$oH$1Hl Ht(Ht$IE(HLnChE1ʝ.Ć ,A?A""8"18ցC_Ht$H$LL$L$gt^AWAIJE1@CL6h_HD${_L^HD$qHSHHSH^鮣LH2Q^1Ƀz HJPAD$ /AD$† _ACA""8"1ԁ^Lt$Lf;&Lt$LHHD$mKvEm KI?vEmB[HIvEmEIvEmCI1Ƀz  HJPӶ I AnA""8"1ȁ" f;Lt$C DA|A"?"8"1ȁDHHD$ Dz ĚHLtfA>u LHLE=WH|$t.fA<$f;uHHED$HD$!ԐmH=εHHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$_H$1L#HIE(HLLA_铎LHD$ٴ AMA""8"1Ɓf;#Lt$ALt$A eAyA""8"1ƁpeAwwE1׍f;IwALL AA""8"1#ƁaE1yE1qHt$ H鉳Ht$ HlE15 AA""8"1ŁmL%xHD$8xf;ŗIkwCMp% -xAA""8"1/ŁxLNHD$af;QI锕I違f;6I釕A H黬H騬AD$y =iAA"?"8"1āiAD$0SAFh鏥Ht$H|$(AHD$HL$L$Ht$A$$ AD@H=FHD$GHHD$ FC_YY^ pAA"?"8"1hÁp* AEA""8"14Á݋AD$QHHD$HHD$ %HWoHD$pHHD$ }of;I=CiiLHC|`Ht$H$LL$L$`O`A鹉C(`H]HD$]HHD$ l]z aHLtfA>u LHLE=nQ4H|$t.fA<$f;uH0HED$HD$!͐ H=kHHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$bYH$1LHJIE(HLLA2Y0L5HD$v AA""8"1f;II' aAA"?"8"11~a 4jAA"?"8"1 jHHD$ `z -HLtfA>u LHLE=:OH|$t.fA<$f;uHHED$HD$!pː֍H=7HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$.WH$1LHIE(HLLAVLHD$B AQA""8"1Lf;Lt$<^Lt$'^f;mLt$+^f;XLt$f;CLt$Lt$f;$Lt$oXLt$ZXf;Lt$^XCc yAA""8"1mRAF>eHt$H|$(HD$HL$L$Ht$A,$A?@1Ƀz HJP鰂E1z eAA""8"1脼>AD$q鲂5 ?AA"?"8"1??C=W AA""8"1^C߯AD$RuC  CrAd::1Ƀz gHJP^1Ƀz mHJPdAF6ZAF6DAD$AD$ #AA""8"1C HoH_HD$]_HHD$ ^z HLtfA>u LHLE=+JH|$t.fA<$f;uHHED$HD$!aƐLjH=(HHL$HD$H$IUAEAE$IEHAL$Lt$H$HH$HDŽ$DŽ$FLH$Ƅ$DŽ$RH$1L}HIE(HLLAQLHD$3 AA""8"1=f;}Lt$s\Lt$^\f;^Lt$b\HlYHD$ZC` Ht$ Hz邬 ٤AA""8"1莸鲤C3ZE1E1E1) ؄A}A""8"13鱄ff.tcIE(LptcIE(LEtmfwIHEHPfw5K4B4ff%f9v BuCuH}bu.fDE1HhD[]A\A]A^A_I8HufH}7~HELPA fJABu-‡u L!I¿!L$ >AtH=L$HAD$cI\$HEAD$$ID$(HmHuL9LLT$w&Ht$LT$H4$LY HMI4$LHl$8H,$HD$D$ GHt$LHL$(HD$0D$4HD$@1LH gHHHpH9tHHD$%HT$IT$0LLLntfDHD$H$)f.Hl$8H,$LI$LLT$(HD$D$ ~HD$0D$4HD$1HLPfHHLT$q A4A"a"8"1/L$AWAVIAUATIUSLIILHXLxtHt$0LHl$0H\$8=DfHݣtLLDvH辥H9=1HHf}u&MtHAHDHUHt ::@L $t\HX[]A\A]A^A_EHLLhDHX[]A\A]A^A_@f;DLL$(LLHiu11HX[]A\A]A^A_@}ALL$dCz vkHRPHz vcHJPHt$Vf=X=HXHt$ z HBҁfDAUATIUSHMLHXAHt$H $fm_ Ifp f=fn f=u*HtHHHSHt: DH$f8u9HtH$HH$LxMtA?fDf;u)Ht HHtLsMtA>oH;$*H$1HXĔ`A$fqf=fk5Ht$0HL}PE1HXD[]A\A]A^A_fkwLfL2uH$H@f8tH$f8ffDIPf:1HHT$HT$H[ f:I IL$ f9'HHL$HIHL$UAff=u%Ht HHtHCHt 8\*@L91LH”u!f;H&/HH8HSH:-Zf.HtLHIT$Ht :"fC! A H=r~HHCAE$I](IEAUH$L$<>tMIރ<>tLsM9tLLSLLZ;Ld$0HL8A[! AIH=~HmH$DT$AE<AE$Me(I]8IEH$IE0Ht$HLDT$M|$fA?W LOHfA?M8 IT$ f:L HHT$HT$HHEHq\f;RH$q=HLt$0H\$8t#H $Ht$LD$0HL'AHtLHM|$MA?AW 0HuIGHfkfqHIWH*f:+HHT$HIHT$+A  MPfA?u LHLELL̔u LMIfA?"HtLHMMA?AO 0HMfcIHHH|$Wf?J^HZA HD$4 MPfA?u LHLEH|$L˔u LL|$HD$H|$rYLoIf;?H$o*HD$L|$0HD$8IT$f: HHT$HHT$f:I IL$ f9 HHL$HL$IHLDLL ˔u LMIfA?HtLHrIWHe:\J 0HuHBHuHHT$HT$HHEHT$Ht$L.ʔu HD$L|$ILqLnf;H$mqHD$L|$8HD$0HH\$@pMH $Ht$LD$0HL"{A)?H$f8\f;IPf:a HHT$tHT$7 Hf:I9 IL$ f9W HHL$HIHL$Af[f=u%Ht HHtHCHt 8$@L91LH:f;7Hv(HH HSH:%fMxfA?LH H+fA?MIL$ f9HHL$HHD$HL$HD$f-f=u'Ht HHtHCHt8"fDL91LH Vf;H&HH HSHc:-$fIPf:HHT$HT$ H f:IIL$ f9HHL$HL$IHLDLLƔu LMIfA?u'Ht LHtIGHt8vfDAff=u)Ht HHtHSHt:L91LH躸f.IPf:IHHT$HT$H f:I!IL$ f9?HHL$HIHL$5Aff=u%Ht HHtHCHt 8t@L91LH躷f;H#HH0HSH:!RfJ 0HuHBHf<_HHJHf9JHT$HHL$HIHL$HT$Zz nHJPf9Iu%HHzXt f?uM91LL裶A<>tMvA$L<>tIT$I9t5HLHT$HT$`LHuMD!AH=t~H!H$AE<LAE$Me(DT$IENIE(H$M}0IE8AW 0HuIGHf<HIWHqf:HHT$HIHT$A IWPf:HHT$HHD$HT$tA lHMXtfA?u LHLEM931LLt\AL<>tINA$L<>tIT$H9 HHHT$ HL$HL$HT$ LL[! 
AiH=~H0H$LAE<AE$Me(DT$IEEMIE(HD$HHt$I]0LIE8DT$fAV 0HKf<yHM~H fA?LHHA~ v&I~Pf?~ Ht$A~ Ht$ % HGLHIIT$ Mf:HHT$!HT$f:fHHHT$HIHT$ IL$ If9 HHL$\HL$f9I@HHHT$HIHT$$IL$ If9!HHL$HL$ OHHHT$HIHT$IL$ If9HHL$HL$f9I@H?LHI.IL$ Mf9+HHL$6HL$ HHHT$HIHT$IL$ If9HHL$HL$ f9I|@HHHT$HIHT$IL$ If9HHL$tHL$f9I@HCf8uH$f8!>AH=~H~ $AA" "8"1yD$Vf;e;!A&H=~HH$H$H@AE$Iu(IEAUIFH@J 0HuHBHkfl$HfM|$ IfA?LqIT$ Mf:HHT$HT$ufDM|$ IfA?LHT$WHT$If=fDI@ f8HD$) H|$ Rjf.Mx fA?) LHT$HT$ .fDMx fA?9 LHT$HT$ fDMx fA? LHT$cHT$ %~ D$A,A" "8"1+yDT$UHLrHfA>-HT$LHIHT$W-z LrPfA>3-LHHD$ -fA?uLHtIGHt 8D/Ht$f>f=u!Ht HHtHCHt 8;L9a 1LH̬-f;HEHHBHSH%:"Dd@HLrHfA>+HT$LHIHT$+z LrPfA>p+LHHD$G+fA?uLHtIGHt 85Ht$f>f=u!Ht HHtHCHt 8f9L9v 1LH謫/,f;HmHHH"HSH:GD@HLrHfA>M*HT$LHIHT$ *z LrPfA>)LHHD$)fA?uLHtIGHt 8+/Ht$f>f=u!Ht HHtHCHt 8;L91LH茪*f;HFHHHSH:!A$@HHJH3f9H)HT$HHL$HIHL$HT$)z _HRPf:IuHHT$HT$IHLDLLu LMIAf`f=L91LHCc)HLrHfA>(HT$LHIHT$(z [LrPfA>(LHHD$c(fA?uLHtIGHt 8g/Ht$f>|f=u!Ht HHtHCHt 8G5L91LH_(f;\H@;HHHSH:[:G~ $AQA" Xf.A~  A~  LHMvXLb"!ApH=~HH$AE<LAE$Me(DT$IEL@IE(H$Mu0IE8A~ qE1fA>oHt$LHHt$LETfDAD$<AD$!JAH=~HH$AE<LAE$Me(DT$IE?I]0IE(M}8[DCOH$f8RHD$H $LD$0Ht$AmHLLt$0HD$8H$HD$@ A fC/H$f8 @CH$f8H$H $LD$0Ht$AlHLLt$0L|$8HD$@vAdfDCH$f8"H$H $LD$0Ht$AkHLLt$0L|$8HD$@AfDC7H$f8:H$H $LD$0Ht$AnHLLt$0L|$8HD$@AfDf9HL$f9I#Df=NnfDE1z Iz LzPo|DA = A IGPIHD$E1fDfA>I@fA>I@fA>I0@fA>HL$) EID$ f8HD$fA?MKCM|$ fA?f:wI!M|$ fA?.f:SI}M|$ fA?f:/IHك<>tHKAL<>tIWH9HHHT$HL$ԘHL$HT$ nHك<>tHKAL<>tIVH9HHHT$HL$wԘHL$HT$HHZԘHك<>tHKAL<>tIVH9HHHT$HL$ԘHL$HT$HHӘHك<>tHKAM<>tM~L9HLHL$ӘHL$[HLӘCHك<>tHKAL<>tIVH9EHHHT$HL$.ӘHL$HT$2HHӘ 1 E1HIWH f: HHT$HIHT$r A MPfA?u LHLELQ2f:AI;fA?HD$Mf:AIf:AIhIIbHT$z /ILz IfDH 0HuHPHf{t f~LxHHL|$tfA?uLHT$IHLDHt$f>f=u!Ht HHtHCHt 828L9|1LHD f=Ht$f>A +HT$A I5DH 0HuHPHf{t f~iLXHHL\$tfA;uLHT$IHLDfA>!f=LHL\$L\$H$f8H$H $LD$0Ht$ApHLL\$0Lt$8HD$@sAhaH 0HuHPHf{t f~5H@HHHD$HD$tf8uHHT$HHEHT$fA?f=u!Ht HHtHCHt 80H;\$RHt$1Hʜf=fA?DH 0HuHPHf{t f~mH@HHHD$HD$tf8uHHT$HHEHT$fA?f=u!Ht HHtHCHt 81H;\$Ht$1Hpf=fA?DH 0HuHPHf{t f~H@HHHD$HD$tf8uHHT$HHEHT$fA?Kf=u!Ht HHtHCHt 8.1H;\$Ht$1H*f=hfA?DA~ [L-DIHD$E1LE1H$H $LD$0Ht$ApHLL|$0Lt$8HD$@Av`~ H$AA" "8"1yD$J 0HuHBHf{t f~MHHJHtf9uHHL$HL$HHEI9w1HLHL$ܙHL$LHH$f8H$H $LD$0Ht$ApHLL|$0Lt$8HD$@dAR@H 0HuHPHf{t f~iHHHHtf9uHHL$HL$HHEL91HLHL$HL$LHH$f8H$H $LD$0Ht$AnHLLt$0L|$8HD$@AnH 0HuHPHf{t f~HHHHtf9uHHL$HL$HHEL91HLHL$HL$3LH0 H$f8H$H $LD$0Ht$AlHLLt$0L|$8HD$@AH 0HuHPHf{t f~YHLxHtfA?u LHLEM9A1LLFLLc H$f8HD$H $LD$0Ht$AmHLLt$0HD$8H$HD$@AfDHHȘ H 0HuHPHf{t f~9HHHHtf9uHHL$HL$HHEL91HLHL$HHL$LH`H$f8H$H $LD$0Ht$AkHLLt$0L|$8HD$@AJ 0HuHBHfF"fGs}J 0HuHBHfFxfGJJ 0HuHBHfF3fGJ 0HuHBHfFfGA GMPIzA ;MPIgHHƘ~ D$AA" "8"1yDT$D~ XD$AA" "8"1yDT$'f.IIA {IDA cIjDHSH`z AHBPHD$fA>MM@z >HBPHD$fA>MMyfDHSH~z HBPHD$fA>MqMW@HSH`>z HBPHD$fA>M^MD@z >LzPI1~ hD$A8A" "8"17yDT$7@HSHͼLzPfA?uLD$LLD$HLEM91LLLD$LD$EHHJHfAfA8uH$f87,HLBH fA8u!HT$LLD$HLD$HT$tIM91LLHL$LD$觐LD$HL$mHLHL$LD$FLD$HL$u HLIL9 A fA8uH$f8|(HLBHRfA8u!HT$LLD$HLD$HT$tIM91LLHL$LD$0LD$HL$HLHL$LD$ϛLD$HL$u HLIL9,A"fA8uH$f8&$z QIHD$.HLBH@fA8u!HT$LLD$HLD$HT$tIM91LLHL$LD$融LD$HL$dHLHL$LD$=LD$HL$u HLIL9AfA8uH$f8&IE14IE1-CBH$f8=XCH$f8HD$H $LD$0Ht$AmHLL|$0HD$8H$HD$@ARCnH$f8qHD$H $LD$0Ht$AkHLL|$0HD$8H$HD$@誹A+C4H$f87HD$H $LD$0Ht$AnHLL|$0HD$8H$HD$@CA1CH$f8HD$H $LD$0Ht$AlHLL|$0HD$8H$HD$@ܸA=CHtHHHSH:J 0HxBHD$L9<Ht$1LHL$ LD$ꉔLD$HL$ Ht$LHL$HL$f9H$f8HD$L|$8HHL$HHD$8H$HD$PjGHL$Ht$LD$@LAnHX{A_AX<1C0H$f85HD$H $LD$0Ht$AnHLL|$8HD$0H$HD$@ܶA1HHL$ E1HL$WI11HLLD$ HL$衈HL$LD$ gCH$f88HD$0H$HD$@ AcA 9A HtHHHSH:J 0H(HD$L9(Ht$1LHL$ LD$LD$HL$ Ht$LHL$+HL$f9H$f8HD$L|$8HHL$HHD$8H$HD$PjGHL$Ht$LD$@AlHL荜{AAYA[heAG~1HHL$蘓E1HL$I11HLLD$HL$,HL$LD$AG1HHL$3E1HL$I11HLLD$HL$DžHL$LD$PHtHHHSH:J 
0H1HD$L9'Ht$1LHL$ LD$ʄLD$HL$ ,Ht$LHL$HL$ f9H$f8HD$L|$8HHL$HHD$8H$HD$PjGHL$Ht$LD$@AkHL8{AY^1HHL$UE1HL$I11HLLD$HL$郔HL$LD$鯾~ TD$A A" "8"1yDT$#CzH$f8}HD$H $LD$0Ht$AmHLL|$0HD$8H$HD$@*A7HtHHHSH:J 0H#H$f8L|$0H$HL|$HL$HHD$PL|$@jGHL$Ht$LD$@HAmL辗{AEXZAKC_H$f8dHD$H $LD$0Ht$AkHLL|$8HD$0H$HD$@;A)AfA?Lt$$;Lt$ z hMMz NIIfA?Lt$Lt$IE1z  MYMGHD$HD$IDfA?Lt$Lt$}z MHM6fA?Lt$Lt$z qMMz WIHBPIHD$LH3靺LH!LH$LHLH)A@ f9uH$f8A@rf9uH$f8HD$Ht$HgA@f9uH$f8x锹LHHt$H`Ht$H4vHD$A@$fA?uH$f8z "IHJP1Ƀz EHJPuLHIDHD$HD$9u!HT$(HHL$ HHL$ HT$(tHz LrPfA>uHL$ LHL$ HLEHD$H95Ht$1HHL$ N|HL$ &Ht$HdfA>H$f8HD$Lt$@HHD$8HD$ HD$@H$HD$PjGHL$Ht$LD$@LAmHő{A_AX鞷H 0HuHPHf{t f~LpHHLt$tfA>uLHIDHD$HD$9u!HT$(HHL$ HHL$ HT$(tHz LrPfA>uHL$ LHL$ HLEHD$H9"Ht$1HHL$ 'zHL$ ?Ht$H=*fA>H$f8 HD$Lt$@HHD$8HD$ HD$@H$HD$PjGHL$Ht$LD$@AkHL螏{AAYA[vz IHJPz DIHJPJ1Ƀz HJPH 0HuHPHf{t f~rLpHHLt$tfA>uLHIDHD$HD$9u!HT$(HHL$ HHL$ HT$(tHz hLrPfA>uHL$ LHL$ HLEHD$H9$Ht$1HHL$ wHL$ Ht$HԺfA>H$f8HD$Lt$@HHD$8HD$ HD$@H$HD$PjGHL$Ht$LD$@AnHL5{AEA^Xv H 0HuHPHf{t f~gH@HHHD$tf8uHHHHDt$Ht$AHt$H蘹H$f8HD$H $LD$0Ht$ApHLLt$8HD$0H$HD$@AwH 0HuHPHf{t f~LpHHLt$tfA>uLHIDHD$HD$9u!HT$(HHL$ HHL$ HT$(tHz $LrPfA>uHL$ LHL$ HLEHD$H9Ht$1HHL$ ztHL$ rHt$H萷]fA>RH$f8@HD$Lt$@HHD$8HD$ HD$@H$HD$PjGHL$Ht$LD$@HAlL{AEXZʯ1Ƀz HJP LLe錮z ITIBLLHL$貶HL$Mz ɿII LHLD$sLD$H 0HuHPHf{t f~HLpHtfA>u LHLEM9%1LLrWLLDH$f82HD$H $LD$0Ht$AnHLL|$0HD$8H$HD$@\AJH 0HuHPHf{t f~sHLpHtfA>u LHLEM91LLr8LL%H$f8HD$H $LD$0Ht$AlHLL|$0HD$8H$HD$@芟AxLHLD$誴LD$ELHLD$腴LD$l z IqI_IGH H 0HuHPHf{t f~HLpHtfA>u LHLEM931LLpLLݳH$f8HD$H $LD$0Ht$AmHLL|$0HD$8H$HD$@HAk6J 0HuHBHf{t f~HHJHtf9uHHL$HL$HHEI91HLHL$oHL$zLHgH$f8UH$H $LD$0Ht$ApHLL|$0Lt$8HD$@hAVLLHL$舲HL$#H$H $LD$0Ht$ApHLL|$0Lt$8HD$@AکLHLD$LD$ 鵩H 0HuHPHf{t f~ HLpHtfA>u LHLEM9 1LL{nLL蘱H$f8HD$H $LD$0Ht$AkHLL|$0HD$8H$HD$@AfLLHL$#HL$龨z :IIJ 0HfFpfGHHJHCf9u!HT$ HHL$HHL$HT$ tHz ùLrPfA>uHL$LHL$HLEI91HLHL$ mHL$NHSHJ 0HH$f8,H$H $LD$0Ht$ApHLL\$0L\$Lt$8HD$@AL\$ڧH 0HH$f8hHD$H $LD$0Ht$AlHLL|$8HD$0H$HD$@A"H 0HH$f8MHD$H $LD$0Ht$AnHLL|$8HD$0H$HD$@AH 0H\H$f82HD$H $LD$0Ht$AkHLL|$8HD$0H$HD$@+AJ 0HufFfGHHJHff9u!HT$ HHL$HHL$HT$ tHz *LrPfA>uHL$LHL$HLEI991HLHL$hHL$RJJ 0HfFpfGHHJHf9u!HT$ HHL$HHL$HT$ tHz iLrPfA>uHL$LHL$HLEI91HLHL$gHL$w鑢鉢HSH駢Lt$0L|$8HLD$HHD$PLjFHL$AlHt$HLD$@_}{A_AX='Lt$0L|$8HLD$HHD$PAnjFHL$HHt$LLD$@}{AY^ءA MP.H 0Hmf{t f~HHHHtf9uHHL$HL$HHEI9U 1HLHL$fHL$\LH藩IH$f87HD$H $LD$0Ht$AmHLL|$0HD$8H$HD$@ALt$0L|$8HLD$HHD$PHjFHL$AkHt$LLD$@{{AEXZ钠HD$Lt$0HL|$HAmHLHD$@H$HD$PjGHL$Ht$LD$@m{{AAYA[J4Lt$0L|$8HHL$HHD$PAkjGHL$HHt$LLD$@{{AA^A_HD$Lt$0HLD$HAmHLHD$@H$HD$PjFHL$Ht$LD$@z{AA^A_酟Lt$0L|$8HHL$HHD$PAljGHL$HHt$LLD$@mz{AY^L6HSHTLt$0L|$8HHL$HHD$PHjGHL$AnHt$LLD$@ z{AEXZԞA OJ 0HfFfGHHJHf9u!HT$ HHL$HHL$HT$ tHz LrPfA>uHL$LHL$HLEI91HLHL$@cHL$ HSH!HBH5AFr1Ƀz HJPE1z xLrPoHLrHCfA>/HT$LHHT$txz IHJPf9uHHL$HL$HHEHLHL$oHL$u HLIL9tE1HLHL$6bHL$uz w逮LH?uAufA>uH$f8tHD$L|$0HLt$HAlHLHD$@H$HD$PjFHL$Ht$LD$@w{AA^A_xbLz vHHJPLH萤u4AGfA>uH$f8t! 
1LbnuiLE1HD$L|$0HLt$HAlHLHD$@H$HD$PjGHL$Ht$LD$@v{AAYA[镛1f@HD$鷡H$f88HD$L|$0HL|$HD$HH$L|$@HD$PjFz  AGbHD$1Ƀz HJPHBH]@%HBHAFAG'@PCBi@1bHD$H9Ht$1HHL$ ]HL$ ؚHt$HÚHD$f8H$f8HD$L|$8HHD$8HD$ HD$HH$HD$PjFz ܩ@_HD$1Ƀz HJP@^AGCkH$f8pHD$H $LD$0Ht$AnHLHD$0HD$HD$8H$HD$@A%鮘J 0HuHBHf{t f~HLrHtfA>u LHLEHD$L9PHt$1L_\Ht$LzH$f8nHD$H $LD$0Ht$AnHLHD$0HD$HD$8H$HD$@A#ΗHLrH)fA>HT$ LHHT$ z  IHJPf9uHHL$ HL$ HHEHLHL$ hHL$ u HLIHD$H9Ht$1HHL$ =[HL$ UHt$HS@fA>5H$f8#HD$Lt$@HHD$8HD$ HD$@H$HD$PjFz !$@cL1Ƀz HJPHt$H踝wH$f8eHD$H $LD$0Ht$AkHLHD$0HD$HD$8H$HD$@A @HLrH)fA> HT$ LHHT$ z :IHJPf9uHHL$ HL$ HHEHLHL$ fHL$ u HLIHD$H9Ht$1HHL$ lYHL$ Ht$H肜fA>H$f8HD$Lt$@HHD$8HD$ HD$@H$HD$PjFz !SL@[1Ƀz HJPHt$H E1z LrPH$f8ZHD$H $LD$0Ht$ApHLLt$8HD$0H$HD$@覅A锓HPHAF@EvHD$H9qHt$1HHL$ VHL$ 6Ht$H!HD$f8H$f8HD$L|$8HHD$8HD$ HD$HH$HD$PjFz 鲢E1z LrPHLrHRfA>>HT$LHHHT$tvz bLrPfA>uHL$LHL$HLEHLHL$ cHL$u LIHM9tS1LLHL$UHL$upz LLLHL$衘HL$u@AFuf9uH$f8tHD$L|$0HHL$HAnHLHD$@H$HD$PjFHL$Ht$LD$@j{AAYA[Ր鿏Lz vKLLrPLHu鎏AGqfA>uH$f8t'f1La1ɄLE1HD$L|$0HLt$HLAnHHD$@H$HD$PjGHL$Ht$LD$@"j{A_AXHPH&HBHHPH@HBHAC,m@N@*1Ƀz xHJPoHBH HD$)@HPHHD$H9 Ht$1HHL$ RHL$ |Ht$H諕gHD$f8XH$f8FHD$L|$8HHD$8HD$ HD$HH$HD$PjFz xHLrHQfA>HT$LHHHT$tvz >LrPfA>uHL$LHL$HLEHLHL$^HL$u LIHM9t[1LLHL$}QHL$uLz ȝLLsLLHL$uHL$uAFuf9uH$f8tHD$L|$0HHL$HLAkHHD$@H$HD$PjFHL$Ht$LD$@f{A_AX锋z vLLrPHBH1L]1ɄLE1LH蜓u@AGfA>uH$f8tHD$L|$0HLt$HAkHLHD$@H$HD$PjGHL$Ht$LD$@e{AY^ҋ鼊HLrHPfA>HT$LHHHT$tvz LrPfA>uHL$LHL$HLEHLHL$\HL$u LIHM9t[1LLHL$GOHL$uz LLsLLHL$?HL$uމAFuf9uH$f8t黉HD$L|$0HHL$HAmHLHD$@H$HD$PjFHL$Ht$LD$@d{AY^u_z vLLrPHBH 1L[1ɄLE1LHgu AGHfA>uH$f8tHD$L|$0HLt$HHAmLHD$@H$HD$PjGHL$Ht$LD$@c{AEXZ醈E1z LrPE1z LrPE1z $LrP@W1Ƀz XHJPOAGAFDgE1bCJ 0HuHBHf{t f~HLrHtfA>u LHLEHD$L9NHt$1LLuHt$LÏ`H$f8NHD$H $LD$0Ht$AkHLHD$0HD$HD$8H$HD$@)zAE1z LrPHLrHfA>HT$ LHHT$ z >IHJPf9uHHL$ HL$ HHEHLHL$ XHL$ u HLIHD$H92Ht$1HHL$ pKHL$ HHt$H膎3fA>(H$f8HD$Lt$@HHD$8HD$ HD$@H$HD$PjFz !WHLrH)fA> HT$ LHHT$ z IHJPf9uHHL$ HL$ HHEHLHL$ WHL$ u HLIHD$H9Ht$1HHL$ NJHL$ FHt$Hd1fA>&H$f8HD$Lt$@HHD$8HD$ HD$@H$HD$PjFz !5L@[1Ƀz HJPHt$HɌ5H$f8#HD$H $LD$0Ht$AlHLHD$0HD$HD$8H$HD$@/wACJ 0HuHBHf{t f~~HLrHtfA>u LHLEL;t$Ht$1LH?Ht$L*H$f8HD$H $LD$0Ht$AlHLHD$0HD$HD$8H$HD$@FvA41Ƀz [HJPRHD$@tHBH|AF-THt$HH$f8HD$H $LD$0Ht$AmHLHD$0HD$HD$8H$HD$@uAZsC4J 0HuHBHf{t f~HLrHtfA>u LHLEHD$L9Ht$1LGHt$L3H$f8HD$H $LD$0Ht$AmHLHD$0HD$HD$8H$HD$@tAL釂@-^@Y1Ƀz HJP@Lp@UIATUASFHHrf4HEHHfS4A4ff%f9v AH}H}FHELpA fAFAE!߮H=|HLmHmL9utHL~DʓHLKzHUIHGD䱓H;XtHHھ~DʓLHD[]A\A]A^Ic魱DC[1]A\A]A^2.L aIfH9訷~fD1uf;uA$f+f[f<HCLs L{(fl$bfmfnHHH@ H9H$+1HHHL$W܏HL$<>tHIUH<>tHEH@hH9AhfH<$mUL}EL5L-%L퍴C!1%H=zH[eHUHD[%ALH~DA2DL谋@Lp0!$ H=zHV9EH;HHL8H>DhID$f8 L2HIfDf}<HELu L}(fl fmlfnHpHx H4$qLteLutYH;$9H4$1H)ڏt=Hڃ<>tHSH4$H<>tHFH@hH9Bh9LLPH;$BH4$1Hُ<>`HSH4$<>`HFH@hH9BhL贉uxLxulE!!"H=zHgH $D[HQ>LHXD>fkfDf<fBuEH )t(&t#$>%ufHCxff}Bf;Bu=H7 )t",&t'`(uHCxHEBfw$D}E f{f~Ef{{f~qf;]EFfN$м@Lȉ Cm! 
![H=zHHD)@HCf8]aLpLx L"IHT$@LDHl$@Lt$HL|$P+)HDHULu f{'f~ 'fUusLz fA?uhfA>uaL9HJ'1LHH $֏H $t;HL$@L|$HFLt$PL|$XHT$@LDVH>fHHt!f{of~efXHCi Hx1[]A\A]A^A_H6HHL8Ht>DhLhL؇t fA<$`&Cu!cH=zHL2HLXD ;)@H蘧L腴LXA$f WMf8fA<$:AT$6 H'HcЀ 'HcЀ'HcЀ'HcЀ'HcЀ'H{'!UH=yH[HL4D9LH HLDIq!HLDIc!LH4D9@HCLs L{(fl7fm'fnEHHH@ H9H$ #1HHHL$ӏHL$<>tHIUH<>tHEH@hH9AhH<$LLL脃LDC!H=QyH fHUHD[|8LHXDi8@Hx)t# &tb$t %@HExH}0Lu8L9$1LHWҏt8Hڃ<>tHSA<>7IFH@hH9Bh$fA>XMvL901LHяiHڃ<>tHSA<>eIFH@hH9Bh1HL$@HLDLt$@?wHZ HT$@HDHl$@LH7;DLuAFf($Őf>tHSMH<>tHEH@hH9BhO!H=yH-.M;ftLL~D2ILfMHLHDfDH)t#&t$t %@HExH9Lu0L}81HH̏Hڃ<>Lu L}Lp L91LHˏHڃ<>tHSAL<>tIFH@hH9BhIG@!;H=yHQIVLDX/0LILcDY9@H9LuS1HHVʏFHڃ<>tHSMH<>tHEH@hH9Bh!H=VyH-*LLbMHLHDf!6H=yHM#HLXD+/KfDfA<$ID$f8 dHpHx H4$_LtkLct_H;$'H4$1HɏtCHڃ<>tHSH4$H<>tHFH@hH9Bh'fLL8H;$1H4$1Hȏ<>OHSH4$<>wOHFH@hH9BhlLx\L\zLE!H=iyHVH $D[HQ-LH~D-$ %v* +^gfH|$@H|$XH|$@H|$ H|$8H|$ !H=yHgtIHLcD@ H9H$1HHHL$ƏHL$<>tHIUH<>tHEH@hH9AhH<$LLoLvwL?xgC!H=LyHcHUHD[w+LH~Dd+HHH@ H9H$#1HHHL$ŏHL$<>tHIUH<>tHEH@hH9AhH<$LgLLguL'woC!kH=4yHcHUHD[_*LHXDL*lHCHT$@LDHD$@Hl$H2H>1@LuL} L@vHT$@LDH\$@Lt$HL|$PIHLuf{f~HHT$@LDLt$@H\$HHeDLuL} f{f~L91LHÏ#HT$@LDLt$@L|$HL|$PH0fDL}fA?f;5ш}HE HLLHH$~GHIAGC!s H=L9HJ 1LHH $WH $HL$@L|$HGLt$PL|$XDH{0Ls8fA>XMvL9 1HLA<>tMvE<>2'HEH@hI9FhnHL$@HLDHl$@ wHTHf.H{0Ls8L91HLUA<>tMvE<>&HEH@hI9FhHL$@HLDHl$@dwHa IVnH\$PHT$@HD$HfIF IVmH\$PHT$@HD$H{IFM~ H$f{Ff~<L91LHgH$HT$@LDL|$HL|$PHD$@4H6fDIVIN f{f~fULz fA?f9L9LB1LHLD$H $kH $LD$HL$PLD$@GL|$HL|$X{@M~IF LH$q|H$HT$@LDH\$@L|$HHD$P HKM~f{nf~dHHT$@LDL|$@H\$HSH_y AA"8"1t!H=yHFHD$ Ht$@LHD$@D$8D$HD$XMvL9AFDy SAA"8"1t,@HCxH{0Ls8L9AF@fkvHHH@ H9H$ 1HHHL$GHL$<>tHIUH<>tHEH@hH9AhH<$L%LLlLnC:!(H=yH%IHUHD[!LHXD!DHpHx H4$_LotkLt_H;$H4$1HǻtCHڃ<>tHSH4$H<>tHFH@hH9BhfLLxH;$H$H4$1HHX<>}BHSH4$<>lBHFH@hH9BhLLk L mE!PH=yHIH $D[HQC LHXD0 PHpHx H4$LtcLtWH;$H4$1H?t;Hڃ<>tHSH4$H<>tHFH@hH9Bh{L(LhH;$"H4$1Hȹ<>@HSH4$<>:AHFH@hH9BhLiLk|E1!H=yHHgH $D[HQLH~DfknHHH@ H9H$y1HHHL$׸?HL$<>tHIUH<>tHEH@hH9AhH<$LLLhLmjC!H=zyHGHUHD[LH~DDHCf8XH@5HHH@ H9H$1HHHL$詷HL$<>tHIUH<>tHEH@hH9AhH<$LLoLgwL?igC!H=LyHJHUHD[wLHXDdHHH@ H9H$1HHHL$葶HL$<>tHIUH<>tHEH@hH9AhH<$LgLLgfL'hoC!kH=4yHSHUHD[_LH~DLl''(f-=Q-=Q'(fEkESEKH{I91L鰏0HT$@LDLt$@L|$HL|$PH~@E=E$E1E!E}C7~'(2-q=QfCoHSLDH\$@HD$PHl$XHT$HHT$@ZDHHBf8U^Lx fA?OfA>DL9HHCHBf8ULx fA?fA>L9HHC@ID$f8 H=u / =p`32=m'^!=nHExH9Lu071HHήt9<>7[HSE<>[HEH@hH9Bh`7f;BAH?=n.HCx =c{}L{0!H=yHLIVMLDHILnD1xDCMH{I9D 1LҭH$HT$@LDL|$HL|$PHD$@HctC+HL$@HLDLt$@zvH3NHCHH$9H$HT$@LDL|$@HD$HHXAG'4fA>AV6 HtLHcЀ t@HcЀt4HcЀt(HcЀtHcЀtHcЀ GE =$HCxLs0L{8L911HLæA<>tMvE<>0HEH@hI9FhIfA?>Hl$@H\$H$L|$PC:Jy AA "8"1tHBf8UoLx fA?`f9VL9L@Cf.HBf8U"Lx fA?f9 L9L@C3DEb=9 =F3*=:cHExf;BHL}0=:H3=THCxLs0HC8fA>H$=q}IvEm I~n!*yH=yH9IV1M>DoHEL$HPDHH6HPMDHFIL:D1nEh =&!_ =$#HExLu0L}8fA>u f;BBfA?}L90B1LH譣_<>BHSA<>AIFH@hH9Bh&HT$@$LDLt$@Hl$HL|$PwHf.wy AA"8"1tHCxLs0L{8L9@1HLТA<>tMvE<>+HEH@hI9FhVfA?KHl$@H\$H,L|$PfDHCxLs0L{8L91HL<A<>tMvE<>p+HEH@hI9FhfA?Hl$@H\$H(L|$P*f.y AˋA"8"1!tv@AGHCsElC3LQxL`ShE!H=myH$.H $D[HQLHXDCLQ>LR.E{!iH=yH?H $D[HQ LH~DDCLPNLNR>E!H=[yHO3H $D[HQLH~DrDy AA"8"1t@C9LOvLQfE3!!H=yH}IH $D[HQLHXDDOy AA"8"1Yt@=w%={=7cHExUf;BKHLu0E=G4HCx&L{0M931LLRA<>tMA<>*IFH@hI9GhHT$@DLDH\$@Lt$HHl$PvHf=r =s=tsHExeH9Lu0'1HH葝C<>(HSE<>(HEH@hH9Bh Ht$@LHl$@Lt$HwHBCDHExf;BHL}0DQ-=(HCx{Ls0HC8fA>H$d=i}WIvEmC HExlf;BbHL}0\>.=$BHCx4Ls0HC8fA>H$=`h}IvEmI~2!H=iyH.IV1MD}eHL$HPDHHHPMDHFIL>D1+e[fDHH[LOL觾H fA<$7L脾H fA<$6LaHfA<$6L>HfA<$6LHgfA<$ &LHDfA<$%LսCCL萼H fA>FLnH fA>FLLHt|fA>\FL.Ht^fA>4FLHt@fA>HFLHt"fA> FLԻHfA>|&L費pCU颽CCOC=oHExH9Lu0o1HH㒏<>3%HSE<>%HEH@hH9Bh\Ht$@LHl$@Lt$HPwHG7?.HEx 
f;BHL}0?7=(HCxLs0HC8fA>H$=_}IvEm7I~!TۃH=yH/7IV1MD1\HoL$HPDHH`ސHPMDHFIސIL?D1[HExf;BHLu0HCx=^}L{0!kڃH=4yHIVMLDHݐILD12[bEE;.0HEx"f;BHLu0=vHCxL{0M9Q$1LLA<>tMA<>-IFH@hI9GhHT$@cLDH\$@Lt$HHl$PvHyi=(^HExPLu0L}8fA>u f;B#fA?,L9D1LH\<>HSA<>IFH@hH9BhտHT$@(LDLt$@Hl$HL|$PvH飿=;HExf;BHL}0z=;=T^HCxPLs0HC8fA>H$9=|[},IvEmI~N!׃ H=yHIV1M@DXH׷L$HPDHHڐHPMDHFڐIL;D1GXwE=lRHExDH9Lu01HHpt9<>"HSE<>"HEH@hH9Bh^f;BH=lнHCx½=Z}L{0!mփH=6yH IVMLDHِILlD14WdE LLHEx,f;B"HL}0CD0=,HCxLs0HC8fA>H$ݼ= Y}мIvEmCI~!`ՃH=)yH/IV1MD=VH{L$HPDHHlؐHPMDHFUؐILCD1UHExf;BHL}0A/=$ػHCxʻLs0HC8fA>H$=W}IvEmI~z!6ԃH=yH/IV1MDUHQL$HPDHHBאHPMDHF+אILAD1THExغf;BκHL}0ȻB9=(HCxLs0HC8fA>H$=V}|IvEmaI~P! Ӄ[H=մyH\8IV1MDSH'L$HPDHH֐HPMDHF֐ILBD1SǫLLHExf;BHLu0莺3HCxqL{0M9(1LL蝈OA<>tMA<>IFH@hI9GhHT$@kLDHl$@Lt$HH\$P vHHExٸf;BϸHLu0ɹ.HCxL{0M9'1LL؇A<>tMA<><IFH@hI9GhTHT$@cLDHl$@Lt$HH\$PGvH2"HExH9Lu01HH@t9<>^'HSE<>@'HEH@hH9Bhf;BH豸=mHCx=S}L{0!=ЃH=yH,IVMLDHnӐILmD1QLEHExLf;BBHLu0tMA<>IFH@hI9GhǵHT$@gLDHl$@Lt$HH\$PvH镵HExf;B}HL}0w=9@=T[HCxMLs0HC8fA>H$6=yQ})IvEmI~K!̓H=yHIV1M=DNHԭL$HPDHHАHPMDHFАIL9D1DNtHEx[f;BQHL}0KE=,1HCx#Ls0HC8fA>H$ =OP}IvEmrI~!ӳ!̃ެH=XyHIV1MDlMHL$HPDHHϐHPMDHFϐILED1MJHEx1f;B'HL}0!=&/=(HCxLs0HC8fA>H$=#O}ӲIvEmFI~!c˃H=,yHQ.IV1MD@LH~L$HPDHHoΐHPMDHFXΐIL&D1KLBL$HExf;BHL}0@=,˱HCxLs0HC8fA>H$=M}IvEm ~I~m!)ʃxH=yH IV1MDKHDL$HPDHH5͐HPMDHF͐IL@D1JHEx˰H9Lu01HH<>&HSE<>HEH@hH9BhpHt$@LHl$@Lt$HdvH[KHEx=Lu0L}8fA>u f;B&fA?L9 1LHI<>; HSA<>- IFH@hH9Bh¯HT$@,LDLt$@Hl$HL|$PvH鐯HExH9Lu01HH~`<>kHSE<>(HEH@hH9Bh'Ht$@LHl$@Lt$HvHHExH9Lu0=1HH ~Ү<>?&HSE<>p(HEH@hH9BhHt$@LHl$@Lt$HvHtHExfH9Lu0U1HH}D<>_HSE<>i$HEH@hH9Bh Ht$@LHl$@Lt$HvH3ݭHExϭf;BŭHLu0迮={HCxL{0M9#1LL|~A<>tMA<>~IFH@hI9GhHHT$@kLDH\$@Lt$HHl$P;vH&HExf;BHLu0HCx۬=I}L{0ʬ!ŃեH=OyHIVMLDHȐILD1MF}HExdH9Lu01HH{B<>9"HSE<>_"HEH@hH9Bh Ht$@LHl$@Lt$HvHHEx֫H9Lu01HH{t9<>"HSE<>!HEH@hH9BhVf;BuHs=FbHCxT=G}L{0C!ÃNH=ȥyHA&IVMLDH0ǐILFD1DL鸫HExժf;B˪HL}0ū='=$HCxLs0HC8fA>H$=F}yIvEm^I~M! 
ÃXH=ҤyH&IV1MDCH$L$HPDHHƐHPMDHFŐIL=D1CěH|HDH$鄰-y 6AAq"8"17tHH$鏽H$Hףy A=A"8"1tH$CHt$@LHl$@Lt$HvH!ky عAA"8"1ut鱹C(y RAA"8"12t+y &AФA"8"1tC`Ht$@LHl$@Lt$HvHCCVHt$@LHl$@Lt$HvH֙{CGCVC)rCXHt$@LHl$@Lt$HvHz}HHHC9HCy A:A"8"1óty AA"8"1菳tqQy A?A"8"1[tHCxLs0HC8fA>H$k=B}^IvEmCI~2!=H=yHqIV1MD?H L$HPDHHHPMDHFIL%D1y?驗+y AA"8"15tHJH,y rACA?"8"1tKHCx=[A}Ls0!ýH=yHIWMLDFIL9D1>麖HL,y AgA?"8"16t[CQy jAA"8"1tCAD$6ĞAD$6驞L.y UASA"8"1衰t.cy ζAƌA"8"1mt駶/y AA"8"19ty A)A?"8"1tHCx,=o?}Ls0!׻&H=yHIWMLDFIL@D1<ΔPy ԘAA"8"1Zt魘HCx=>}Ls0p!,{H=yHu>IWMLDFaIL:D1;'L1y tAkA"8"1诮tHC]y AҍA"8"1gtڨ)y ADA"8"13tAF6y XAA"8"1t1Ly BAA"8"1蹭t{y AA?"8"1腭tHCx=<}Ls0!WH= yHIWMLDF舼IL;D1:NHCx5=x<}Ls0$!ฃ/H=yHIWMLDFILDD19בYy AٙA?"8"1ctLiy  AÛA"8"1'tHCxN=;}Ls0=!HH=™yHujIWMLDF.ILED18vy AݘA"8"1耫tBy tAA"8"1Ptfy AA"8"1t鼪HHCx;=~:}Ls0*!涃5H=yHu>IWMLDFIL>D17Hd[y tAA"8"1it.y AxA"8"18tÙLy AlA"8"1t}y bAA?"8"1ȩt;y KAA"8"1蔩t$Vy -ANA"8"1`tAFH胝BTbDv=&#HCx=8}HC0H$IvEm߳I~Y!ǴH=yHIV1MD5HHPMDHHԷL$HPDHF輷IL&D1R5邍AFSy AA?"8"1tH0Hy -AA?"8"1躧tHCx=$7}HC0H$IvEm?޳zI~i!\H=%yHAIV1MD94HwHPMDHHiL$HPDHFQILDD13?HCx=86}HC0H$IvEmSݳI~}!pH=9yHIV1MDM3HHPMDHH}L$HPDHFeIL?D12+HCx=U5}HC0H$IvEmpܳI~߳!荱ܑH=VyHIV1MDj2HHPMDHH蚴L$HPDHF肴ILBD12Hʒy AA?"8"1Ԥty jAA?"8"1蠤tCby APA?"8"1ltLLHCsAFHh=:(=;:=9!HCx=v3}HC0H$IvEmڳI~@ݳ!讯H=wyHIV1M=D0HɏHPMDHH軲L$HPDHF裲IL9D190iAFtLqӐy iAA?"8"1ݢtBHCx=G2}HC0H$IvEmbٳI~ܳ!ΎH=HyHUIV1M@D\/HHPMDHH茱L$HPDHFtIL;D1 /:HCx=d1}HC0H$IvEmسI~.۳!蜭H=eyHIV1M>Dy.HHPMDHH詰L$HPDHF葰IL:D1'.Wَy A2A?"8"1twy MAeA?"8"1诠t&HL~HHFAD$6@AD$6%AD$6 AD$6HCx=/}Ls0y!5H=yHu>IWMLDFjIL'D1-0Lty tAA"8"1踟t}y VAȎA"8"1臟t/Iy  AvA"8"1Sty APA"8"1tǜLHCƽŒy A%A"8"1̞ty AA"8"1蘞tAFH軒>0>AR=% HCx=-}HC0H$IvEmԳ̼I~׳!NH=ȋyHIV1MD*HHPMDHH L$HPDHFIL%D1*麂AFKAF0vHH y QAA?"8"1t*HCx=,}HC0H$IvEmӳI~Jֳt!踨H=yHAIV1MD)HӈHPMDHHūL$HPDHF譫ILAD1C)s=׺HCxɺ=+}HC0H$IvEmҳI~^ճ!̧H=yHIV1MD(HHPMDHH٪L$HPDHFIL=D1W(釀HCx=*}HC0H$ѹIvEmѳI~{Գ!馃8H=yHIV1MD'HHPMDHHL$HPDHFީIL>D1t'&y AFA?"8"10ty jAA?"8"1tCy AyA?"8"1șty nAA?"8"1蔙tGHCx=(}Ls0!fH=/yHIWMLDF藨ILCD1-&]~߆y AA?"8"1tHCx=S(}Ls0!軤 H=yHujIWMLDFILAD1%}8y AYA"8"1Bty tAA"8"1tfԅy AUA?"8"1ޗtHCx=H'}Ls0!谣H=yyHIWMLDFᦐIL?D1w$|)y AA?"8"13t_y AA"8"1ty UA-A"8"1˖t.HdH}y MAўA?"8"1臖t&HHAFTHHH芊CHE&='=HCx/=%}HC0H$IvEm̳I~`ϳ!ΡH=yHuzIV1MD"HHPMDHHߤL$HPDHFǤIL'D1]"zHy yAA?"8"1tRHCx={$}HC0H$IvEm˳I~Eγ!賠H=|yHIV1MD!H΀HPMDHHL$HPDHF訣ILED1>!ny@6HCx(=#}HC0H$IvEmʳI~Yͳ!ǟH=yHqIV1MD HHPMDHHԢL$HPDHF輢IL@D1R xy vAA"8"1tOHHHCx=h"}HC0H$IvEmɳI~2̳!蠞~H=iyHIV1MD}H~HPMDHH譡L$HPDHF蕡ILCD1+[wy nA A?"8"1tGy MA<A?"8"1賑t&L>my A%A"8"1wt9y BAA"8"1Cty AoA?"8"1t[HHŤ~y AA?"8"1ːt{HCx=5 }Ls0!蝜|H=f~yHIIWMLDFΟIL&D1du~y AA"8"1 te}y AoA?"8"1tHCx=V}Ls0!辛 |H=}yHIWMLDFIL=D1t7}y AA"8"1At鄘}y A՝A"8"1 t|y UAGA"8"1َt.AF6uAF6_AF6IAF63AF6eAF6O_|y AA?"8"1itpHCx=}Ls0!;zH=|yHu6IWMLDFpILBD16s{y tAA"8"1ƍt@LHh|$ H4$flfn,~fpLjLz EAUAf{ f~ fmf{f~L9 1LH'7UH<>HtHMAuL<>tIUHrhH9qh fDffkLzLj EAAMf{ f~ ff{f~}L9 1LHg UH<>HtHMA7L<>tIWHrhH9qhL 9 H`c HCf8H$f8@EftMA$L<>tID$H@hI9Gh f;<L{Lk Ls(M9a1LLTt`AL<>tIWA $L<>tID$H@hH9Bhu*AD$fDE1HhL[]A\A]A^A_LL襭tQEuAGuAEuAFu!tHSMH<>tHEH@hH9Bh A$fkfquID$f8cAD$Ht$@LwLl$@LEC!<H=xH"IILfDC !lH=xHH"AD$IPLH4$LH¿8IzHI%>D$ AFf.AD$EAEAF!JH=xH2"MIL@I|$ Ml$_yf}nH_^HbNIE@@LȝAHEUHMH|$@A@4HL$@T$H%D$LaA)\"AD$CE!bH=+xH#H=K|IcrIUIL\$ ^H$H;PtH~(IHH$|$ cPIcAD$AEAAF6C,!H=sxH IMLfH$|$ <IfDL}LLm Lu(L艩HtHMA7L<>tIWHrhH9qh(LzLj EAAMf{f~fQf{ f~ L91LH UH<>HtHMA7L<>tIWHrhH9qhLzLj EAAMf{f~ff{f~L9p1LHcoUH<>HtHMA7L<>tIWHrhH9qh8%f*f{MG@f~6L9 1LHLD$۩LD$. 
[binary payload residue: x86_64 machine code and data sections; recoverable embedded strings include compiler identification "(SUSE Linux) 10.3.0", -fcf-protection=full/none/branch/return option strings, OpenACC offload diagnostics ("Function is OpenACC serial offload", "oacc kernels", "oacc serial", "execute_oacc_device_lower"), vectorizer messages ("Reanalyzing as a main loop with vector mode %s", "Failed to analyze main loop with vector mode %s"), and libstdc++ runtime messages ("std::bad_alloc", "pure virtual method called", "basic_string::append")]